Oasis Labs has announced a partnership with Meta and the launch of a platform to assess fairness in Meta’s products while protecting people’s privacy.
As Meta’s technology partner, Oasis Labs built a platform that uses Secure Multi-Party Computation (SMPC) to safeguard information as Meta asks Instagram users to take a survey in which they can voluntarily share their race or ethnicity.
The project will advance fairness measurement in AI models, with the potential to benefit individuals across the globe and society as a whole.
This first-of-its-kind platform is an important step toward determining whether an AI model is fair and enabling appropriate mitigation.
How the platform will assess fairness in AI models
Meta’s Responsible AI, Instagram Equity, and Civil Rights teams are introducing an off-platform survey to people who use Instagram. Users will be asked to share their race and/or ethnicity on a voluntary basis.
The data, collected by a third-party survey provider, will be secret-shared with third-party facilitators so that neither the facilitators nor Meta can learn an individual user’s survey responses.
The facilitators then compute the measurement using encrypted prediction data from AI models, cryptographically shared by Meta; the combined, de-identified results from each facilitator are reconstituted by Meta into aggregate fairness measurement results.
The cryptographic techniques used by the platform enable Meta to measure bias and fairness while giving individuals who contribute sensitive demographic data strong privacy protection.
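To give an intuition for how survey responses can be aggregated without any single party seeing them, here is a minimal sketch of additive secret sharing, one common building block of SMPC. This is purely illustrative and assumes a simplified two-facilitator setup with a counting query; it is not the actual protocol or code used by the platform, whose details have not been published here.

```python
import secrets

PRIME = 2**61 - 1  # field modulus; an illustrative, not prescribed, choice

def share(value, n=2):
    """Split `value` into n additive shares mod PRIME.
    Any n-1 shares together reveal nothing about the value."""
    shares = [secrets.randbelow(PRIME) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    """Recombine shares into the original value."""
    return sum(shares) % PRIME

# Hypothetical survey responses: 1 if a respondent selected a given
# category, else 0. Each response is split between two facilitators.
responses = [1, 0, 1, 1, 0]
facilitator_a, facilitator_b = [], []
for r in responses:
    a, b = share(r)
    facilitator_a.append(a)
    facilitator_b.append(b)

# Each facilitator sums its own shares locally. Summation is linear,
# so it commutes with additive sharing: no facilitator ever sees a
# real response, yet the partial sums combine into the true count.
partial_a = sum(facilitator_a) % PRIME
partial_b = sum(facilitator_b) % PRIME

# Only the combined partial sums reveal the aggregate.
total = reconstruct([partial_a, partial_b])
print(total)  # → 3
```

In the same spirit, the platform computes fairness measurements over shared survey data and model predictions so that only aggregate, de-identified results are ever reconstituted.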
To read more about the platform, its objectives, and the launch, please visit here.
If you would like to be kept up to date on developments at Oasis Labs, please join: