Facial-recognition bias
I realize now that one big example of computing bias (specifically, bias in algorithms) is facial-recognition "AI". Apparently, when these systems are trained, they are often given mostly photos of light-skinned men. Odd? Sure, but really, it isn't the algorithm's fault. We are the ones who feed the algorithm its inputs, and it's just trying to do what looks right based on what it's given.
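One way researchers surface this kind of bias is to measure a model's error rate separately for each demographic group instead of reporting one overall accuracy number. Here is a minimal sketch of that idea; the group names and counts are made up for illustration, not real audit data:

```python
from collections import Counter

def error_rate_by_group(results):
    """Given (group, was_correct) pairs, return each group's error rate."""
    totals, errors = Counter(), Counter()
    for group, was_correct in results:
        totals[group] += 1
        if not was_correct:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical audit: 100 test faces per group, with invented outcomes.
results = (
    [("lighter-skinned men", True)] * 99 +
    [("lighter-skinned men", False)] * 1 +
    [("darker-skinned women", True)] * 65 +
    [("darker-skinned women", False)] * 35
)

for group, rate in error_rate_by_group(results).items():
    print(f"{group}: {rate:.0%} error rate")
```

A single overall accuracy (here 82%) would hide the gap; breaking errors out by group is what makes the skew visible.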
And this seems small, but the bias can snowball, almost like a butterfly effect. When an algorithm like this is used to match faces against criminal databases, its mistakes become high-stakes: people have been wrongly arrested, mainly due to misidentification.
Some hiring software carries this same dangerous bias. It may try to weigh your chances of getting a job based on your facial behavior, and the scary part is that the result can be skewed depending on your race or gender.