Design Bias in Technology and AI
Race
Microsoft's Kinect video game sensor failing to work for darker-skinned players
HP's face-tracking webcams failing to detect Black faces
the rise of webcam-dependent technology like Zoom during Covid-19 making this failure more consequential
automatic soap dispensers not working on Black hands at the Atlanta Marriott Hotel
Facial Recognition Software
US
Project Greenlight surveilling predominantly Black and Latino neighborhoods
The city of San Francisco banning facial recognition software due to its biases
some cities attempting to address or limit design bias
ICE running driver's license photos through facial recognition software to identify undocumented immigrants
wrongful deportation as a result of design bias
The UK
CCTV shown to contain bias in its reporting to the police
design bias is clearly not exclusively an American problem
Gender
car crash test dummies not being sized to the female body, so cars are not safety-tested for women
a potentially deadly consequence of design bias
VR shown to disadvantage women, who are more prone to motion sickness when using headsets
medicine
women being excluded from, or underrepresented in, medical studies
potentially life-threatening effects on their health, as their symptoms may be misinterpreted
smartphones and video game controllers designed around men's hands, fitting poorly in women's
Defining design bias: when a product is designed and built in such a way that its results unintentionally reflect a bias
Finding a solution: including people of color, women, and other marginalized groups in the design and testing of technology and AI, so that design bias is minimized