Dr. Sudip Gupta: Algorithmic Bias and Causal Fairness in Healthcare: Evidence from the MIMIC Database

Dr. Gupta discussed causal fairness principles to mitigate bias and promote equity in healthcare using MIMIC-III data. […]
Dr. Mackey discussed a project funded by the Robert Wood Johnson Foundation in partnership with The Native Biodata Consortium to develop a blockchain-based governance system for managing Indigenous genomic data. […]
Dr. Jiang explored how a groundbreaking discovery was used to generate 30 metadata-based features through machine learning for the automatic detection of protected health information (PHI) fields in structured Electronic Health Record (EHR) data. […]
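As a rough sketch of the metadata-based feature idea the summary alludes to (the feature definitions, toy table, and classifier below are illustrative assumptions, not Dr. Jiang's actual pipeline), one might score each column of a structured EHR table and train a small classifier to flag likely PHI fields:

```python
# Sketch: classify EHR columns as PHI vs. non-PHI from metadata-style features.
# Feature choices, column names, and labels are illustrative assumptions only.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

def column_features(name: str, values: pd.Series) -> dict:
    """Derive simple metadata features from a column's name and values."""
    as_str = values.astype(str)
    return {
        "uniqueness": values.nunique() / max(len(values), 1),   # near 1.0 for identifiers
        "mean_length": as_str.str.len().mean(),                 # free text vs. short codes
        "digit_fraction": as_str.str.count(r"\d").sum() / max(as_str.str.len().sum(), 1),
        "name_hints_phi": int(any(k in name.lower() for k in ("name", "ssn", "phone", "address", "dob"))),
    }

# Toy table standing in for a structured EHR extract.
ehr = pd.DataFrame({
    "patient_name": ["Ann Lee", "Bo Chan", "Cy Diaz"],
    "phone": ["555-0101", "555-0102", "555-0103"],
    "heart_rate": [72, 88, 65],
    "ward": ["ICU", "ICU", "ER"],
})
labels = {"patient_name": 1, "phone": 1, "heart_rate": 0, "ward": 0}  # 1 = PHI

X = pd.DataFrame([column_features(c, ehr[c]) for c in ehr.columns])
y = [labels[c] for c in ehr.columns]

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(dict(zip(ehr.columns, clf.predict(X))))  # which columns look like PHI
```

The point of working at the column level is that metadata features generalize across tables whose values differ but whose structure (identifiers, phone-like strings, free text) is similar.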
Dr. Salimi offered insights relevant to data management, machine learning, and responsible data science, emphasizing the importance of handling selection bias in algorithmic decision-making. […]
Read More… from Dr. Babak Salimi: Certifying Fair Predictive Models in the Face of Selection Bias
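For readers unfamiliar with the issue, the synthetic sketch below (not Dr. Salimi's certification method) shows how a selectively observed sample can distort a simple fairness estimate such as the demographic-parity gap:

```python
# Sketch: a fairness metric estimated on a selectively observed sample can differ
# from its value on the full population. Synthetic data, illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
group = rng.integers(0, 2, n)                    # protected attribute
score = rng.normal(loc=0.2 * group, size=n)      # model score, slightly higher for group 1
decision = (score > 0).astype(int)

def positive_rate_gap(g, d):
    """Demographic-parity gap: P(decision=1 | group=1) - P(decision=1 | group=0)."""
    return d[g == 1].mean() - d[g == 0].mean()

# Assumed selection mechanism: positive decisions in group 0 are under-observed.
observed = rng.random(n) < np.where((group == 0) & (decision == 1), 0.3, 1.0)

print("gap on full population :", round(positive_rate_gap(group, decision), 3))
print("gap on observed sample :", round(positive_rate_gap(group[observed], decision[observed]), 3))
```

Because the missingness depends on both the group and the decision, the gap measured on the observed records overstates the disparity that exists in the full population.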
Dr. Malin drew upon examples from large-scale data-driven projects, including the electronic medical record (EMR) and biorepository at Vanderbilt University Medical Center, the NIH's eMERGE consortium, and the All of Us Research Program, which aims to build a comprehensive database of EMRs, genome sequences, and mHealth records from one million Americans. […]
Dr. Mathews provided an overview of the recent NASEM consensus study report on the use of race, ethnicity, and other population descriptors in genomics research, including the committee's recommendations. […]
American Indians experience elevated rates of health conditions like diabetes, chronic kidney disease, and cardiovascular disease, as well as greater exposure to environmental hazards. […]
Dr. Goldenberg discussed the evolving ethical, legal, and social implications (ELSI) associated with biorepositories and biobanking. […]
Dr. Ustun discussed how machine learning models personalized with sensitive features such as sex, age group, and HIV status can perform better for the population on average yet worse for specific groups, potentially causing harm. […]
Read More… from Dr. Berk Ustun: Towards Personalization without Harm
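A minimal synthetic illustration of the group-level check this talk motivates (the data, model, and per-group fits are assumptions for illustration, not Dr. Ustun's method) is to compare a pooled model against per-group models, scoring accuracy separately within each group:

```python
# Sketch: does personalizing on a sensitive attribute help every group,
# or only the population on average? Synthetic data, illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 4000
group = rng.integers(0, 2, n)                     # sensitive attribute, e.g. two age groups
x = rng.normal(size=(n, 3))
# The signal in x[:, 1] points in opposite directions for the two groups.
logit = x[:, 0] + (2 * group - 1) * x[:, 1]
y = (logit + rng.normal(scale=0.5, size=n) > 0).astype(int)

half = n // 2
x_tr, x_te = x[:half], x[half:]
g_tr, g_te = group[:half], group[half:]
y_tr, y_te = y[:half], y[half:]

# Generic model: one fit on everyone, ignoring the sensitive attribute.
generic = LogisticRegression().fit(x_tr, y_tr)
# Personalized models: a separate fit for each value of the sensitive attribute.
personalized = {g: LogisticRegression().fit(x_tr[g_tr == g], y_tr[g_tr == g]) for g in (0, 1)}

for g in (0, 1):
    m = g_te == g
    acc_generic = (generic.predict(x_te[m]) == y_te[m]).mean()
    acc_personal = (personalized[g].predict(x_te[m]) == y_te[m]).mean()
    print(f"group {g}: generic={acc_generic:.2f}  personalized={acc_personal:.2f}")
# The check: personalization should not make any single group worse off.
```

Reporting accuracy per group, rather than a single pooled number, is what surfaces the cases where an on-average improvement hides a loss for one group.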