PhD

PhD student: Alan Balendran

Title: Conceptualizing and assessing the robustness of healthcare algorithms

Supervisor: Raphaël Porcher

Doctoral school: ED 393 Epidemiology and Biomedical Information Sciences, Université Paris Cité

Thesis topic:

The use of artificial intelligence (AI) and machine learning (ML) algorithms in healthcare for diagnostic and decision-support tools requires special attention. The concept of ethical, trustworthy, or responsible AI emphasizes the development of solutions that consider various intrinsic aspects of AI, such as generalizability, interpretability, fairness, reproducibility, and robustness. The last of these, robustness, concerns how a model's performance holds up under perturbations.
Studies have shown that models are generally vulnerable to small perturbations, sometimes imperceptible to humans. One of the key challenges in implementing AI in clinical practice is to develop algorithms that are resilient to the most likely perturbations encountered in healthcare and to define appropriate evaluation methods.
To understand the influence of perturbations on model behavior, it is necessary to establish tests to assess the robustness of a model. However, a model can be perturbed in many ways (input errors, domain shift, adversarial attacks, etc.), and perturbations can arise at various stages of an algorithm's life cycle (data collection, training, validation, deployment).
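
As a concrete illustration, the sketch below shows one simple form such a test can take: perturbing a model's inputs with Gaussian noise of increasing magnitude and measuring the resulting loss of discrimination (AUC). This is a minimal, hypothetical example using a synthetic dataset and a scikit-learn logistic regression as stand-ins; it is not code from the project itself.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a clinical tabular dataset.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

rng = np.random.default_rng(0)
baseline_auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])

# Probe robustness to input errors: add Gaussian noise of increasing
# magnitude to the test inputs and track the degradation in AUC.
for sigma in [0.0, 0.1, 0.5, 1.0]:
    X_noisy = X_test + rng.normal(scale=sigma, size=X_test.shape)
    auc = roc_auc_score(y_test, model.predict_proba(X_noisy)[:, 1])
    print(f"sigma={sigma:.1f}  AUC={auc:.3f}  drop={baseline_auc - auc:+.3f}")
```

A fuller evaluation would vary the perturbation type (missing values, label noise, distribution shift, adversarial examples) and the life-cycle stage at which it is applied, which is precisely the space of tests this project aims to map and prioritize.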
This PhD project on the robustness of ML algorithms in healthcare is divided into three successive stages. The first stage aims to identify the various existing concepts and aspects grouped under the term “robustness” of an ML algorithm, as well as the metrics and methods used to evaluate or quantify them. The second stage aims to group and prioritize the identified concepts of robustness according to use-case characteristics (frequency of occurrence, severity, ease of remediation). Finally, the last stage involves constructing a pipeline to assess the robustness of ML algorithms in healthcare.
