Measuring Social Biases of Crowd Workers using Counterfactual Queries


Workshop paper


Bhavya Ghai, Q. Vera Liao, Yunfeng Zhang, Klaus Mueller

Cite

APA
Ghai, B., Liao, Q. V., Zhang, Y., & Mueller, K. (n.d.). Measuring Social Biases of Crowd Workers using Counterfactual Queries.


Chicago/Turabian
Ghai, Bhavya, Q. Vera Liao, Yunfeng Zhang, and Klaus Mueller. Measuring Social Biases of Crowd Workers Using Counterfactual Queries, n.d.


MLA
Ghai, Bhavya, et al. Measuring Social Biases of Crowd Workers Using Counterfactual Queries.


BibTeX

@techreport{bhavya-a,
  title = {Measuring Social Biases of Crowd Workers using Counterfactual Queries},
  author = {Ghai, Bhavya and Liao, Q. Vera and Zhang, Yunfeng and Mueller, Klaus}
}

Abstract
Social biases based on gender, race, and other attributes have been shown to pollute the machine learning (ML) pipeline, predominantly via biased training datasets. Crowdsourcing, a popular and cost-effective way to gather labeled training datasets, is not immune to the inherent social biases of crowd workers. To ensure that such social biases are not passed on to the curated datasets, it is important to know how biased each crowd worker is. In this work, we propose a new method based on counterfactual fairness to quantify the degree of inherent social bias in each crowd worker. This extra information can be leveraged together with individual worker responses to curate a less biased dataset.
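
The full method is not detailed on this page, but a minimal sketch of the idea is given below, assuming the counterfactual queries present a worker with the same item twice: once in its original form and once with only a protected attribute (e.g., the mentioned gender) swapped. The function name worker_bias_score and the scoring rule (mean label disagreement across counterfactual pairs) are illustrative assumptions, not the paper's actual formulation.

import numpy as np

def worker_bias_score(labels_original, labels_counterfactual):
    # Hypothetical per-worker bias score: mean absolute difference between
    # a worker's labels on the original items and on counterfactual versions
    # of the same items. A score of 0 means the labels never changed when
    # only the protected attribute changed; larger values suggest stronger
    # inherent bias.
    orig = np.asarray(labels_original, dtype=float)
    cf = np.asarray(labels_counterfactual, dtype=float)
    assert orig.shape == cf.shape, "each item needs its counterfactual pair"
    return float(np.mean(np.abs(orig - cf)))

# Example: a worker labels five items for toxicity (1 = toxic, 0 = not toxic),
# then labels counterfactual versions of the same five items.
original       = [1, 0, 0, 1, 0]
counterfactual = [1, 1, 0, 1, 1]  # two labels flip after the attribute swap
print(worker_bias_score(original, counterfactual))  # prints 0.4

Such a score could then be used to down-weight or filter the responses of highly biased workers when aggregating labels, in line with the abstract's suggestion of combining bias estimates with individual worker responses.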

PDF
