Design Bias in Technology and AI

Race

Facial Recognition Software

Gender

Microsoft's Kinect video game hardware failing to detect players with darker skin tones

car crash test dummies being modeled on the male body, so vehicle safety is not tested against female body proportions

Defining design bias: when a product is designed and built in such a way that its results unintentionally reflect a bias

HP's face-tracking webcams failing to detect Black faces

US

The UK

CCTV shown to introduce bias into what is reported to the police

Project Green Light in Detroit surveilling predominantly Black and Latino neighborhoods

The city of San Francisco banning the use of facial recognition software by city agencies due to its biases

Finding a solution: including people of color, women, and other marginalized groups in the design and testing of technology and AI so that design bias is minimized

VR shown to be biased against women, who are more prone to motion sickness when playing

automatic soap dispensers failing to detect Black hands at the Atlanta Marriott Hotel

some cities trying to cope with design bias or find solutions to it

the rise of webcam-based technology such as Zoom as a result of Covid-19

clearly not exclusively an American problem

a potentially deadly risk involved in design bias

ICE running driver's license photos through facial recognition software to identify undocumented immigrants

wrongful deportation as a result of design bias

Medicine

women not being included in, or being underrepresented in, medical studies

smartphones and video game controllers not fitting well in women's hands

potentially life-threatening effects on women's health, as their symptoms may be misinterpreted