
Dataset Information


A Joint Fairness Model with Applications to Risk Predictions for Under-represented Populations


ABSTRACT: Under-representation of certain populations, based on gender, race/ethnicity, or age, in data collection for predictive modeling may yield less-accurate predictions for the under-represented groups. Recently, this issue of fairness in prediction has attracted significant attention, as data-driven models are increasingly used to perform crucial decision-making tasks. Methods for achieving fairness in the machine learning literature typically build a single prediction model subject to fairness criteria, in a manner that encourages fair prediction performance for all groups. These approaches have two major limitations: (i) fairness is often achieved by compromising accuracy for some groups; (ii) the underlying relationship between the dependent and independent variables may not be the same across groups. We propose a Joint Fairness Model (JFM) approach for binary outcomes that estimates group-specific classifiers using a joint modeling objective function that incorporates fairness criteria for prediction. We introduce an accelerated smoothing proximal gradient algorithm to solve the convex objective function, and we present the key asymptotic properties of the JFM parameter estimates. Through extensive simulations, we examine the efficacy of the JFM approach in achieving both prediction performance and parity, in comparison with the single fairness, group-separate, and group-ignorant models. Finally, we demonstrate the utility of the JFM method in the motivating example: obtaining fair risk predictions for under-represented older patients diagnosed with coronavirus disease 2019 (COVID-19).
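To make the idea of a joint modeling objective with fairness criteria concrete, the following is a rough illustrative sketch, not the authors' implementation: two group-specific logistic losses are combined with a coefficient-similarity penalty and a penalty on the gap between group losses. The function names, penalty forms, and tuning parameters (`lam_sim`, `lam_fair`) are assumptions for illustration only; the paper's actual objective and its accelerated smoothing proximal gradient solver differ.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def joint_fairness_loss(betas, Xs, ys, lam_sim=1.0, lam_fair=1.0):
    """Illustrative joint objective for two groups (not the paper's exact form):
    per-group logistic losses
    + lam_sim  * ||beta_1 - beta_2||^2   (shrinks group coefficients together)
    + lam_fair * (L_1 - L_2)^2           (penalizes unequal group performance)."""
    losses = []
    eps = 1e-12  # numerical guard for log
    for beta, X, y in zip(betas, Xs, ys):
        p = sigmoid(X @ beta)
        losses.append(-np.mean(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps)))
    sim = np.sum((betas[0] - betas[1]) ** 2)
    fair = (losses[0] - losses[1]) ** 2
    return losses[0] + losses[1] + lam_sim * sim + lam_fair * fair

# Synthetic data: group 2 is deliberately under-represented (40 vs 200 samples).
rng = np.random.default_rng(0)
n1, n2, d = 200, 40, 3
X1, X2 = rng.normal(size=(n1, d)), rng.normal(size=(n2, d))
true_b = np.array([1.0, -1.0, 0.5])
y1 = (sigmoid(X1 @ true_b) > rng.uniform(size=n1)).astype(float)
y2 = (sigmoid(X2 @ true_b) > rng.uniform(size=n2)).astype(float)

# At beta = 0 both groups predict p = 0.5, so each logistic loss is log(2),
# the similarity and fairness penalties vanish, and the total is 2*log(2).
betas = [np.zeros(d), np.zeros(d)]
loss = joint_fairness_loss(betas, [X1, X2], [y1, y2])
```

In the paper's framing, the joint objective is convex and is minimized over all group coefficients simultaneously; this sketch only evaluates it, to show how the group losses and the fairness terms enter a single function.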

SUBMITTER: Do H 

PROVIDER: S-EPMC8132236 | biostudies-literature | 2021 May

REPOSITORIES: biostudies-literature


Publications

A Joint Fairness Model with Applications to Risk Predictions for Under-represented Populations.

Hyungrok Do, Shinjini Nandi, Preston Putzel, Padhraic Smyth, Judy Zhong

ArXiv, 10 May 2021


Similar Datasets

| S-EPMC9363518 | biostudies-literature
| S-EPMC9677487 | biostudies-literature
| S-EPMC10690022 | biostudies-literature
| S-EPMC8801802 | biostudies-literature
| S-EPMC7173710 | biostudies-literature
| S-EPMC11469344 | biostudies-literature
| S-EPMC8604241 | biostudies-literature
| S-EPMC6240798 | biostudies-literature
| S-EPMC3581485 | biostudies-literature
| S-EPMC10186088 | biostudies-literature