General Discussion
Facial recognition bias frustrates Black asylum applicants to US, advocates say
https://www.theguardian.com/us-news/2023/feb/08/us-immigration-cbp-one-app-facial-recognition-bias
Migrants from Africa and Haiti reportedly cannot get the app to accept their photos, which is now required to apply for asylum
Melissa del Bosque in Tucson
Wed 8 Feb 2023 06.00 EST
The US government's new mobile app for migrants to apply for asylum at the US-Mexico border is blocking many Black people from being able to file their claims because of facial recognition bias in the tech, immigration advocates say.
Non-profits that assist Black asylum seekers are finding that the app, CBP One, is failing to register many people with darker skin tones, effectively barring them from their right to request entry into the US.
People who have made their way to the south-west border from Haiti and African countries, in particular, are falling victim to apparent algorithm bias in the technology that the app relies on.
Racial bias in face recognition technology has long been a problem. The technology is increasingly used by law enforcement and government agencies, alongside other biometrics such as fingerprints and iris scans, yet a 2020 report by Harvard University called it the least accurate identifier of the three, with error rates higher than 30% for darker-skinned women.
(Systemic racism baked into the "tech will save us" system.)
brush (53,781 posts)

Programmers know very well how to produce software that recognizes all facial tones.
CRT is spot on in spelling out the many instances of systemic racism in many spheres of US culture, law, employment, housing, policing and IMMIGRATION.
Hugh_Lebowski (33,643 posts)

Against database photos taken over a long period, using older tech, lower resolutions, etc.
Pretty sure that's the inherent problem in play here. And perhaps user error ... it would not be at all surprising to me that darker skin tones require significantly more light in order to capture the details necessary, for example.
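To put some rough numbers on the lighting point, here's a toy sketch. This is entirely my own illustration, not anything from CBP One or a real face matcher, and every value in it (reflectance, contrast, noise) is invented: with a fixed amount of camera sensor noise, the same facial feature on a lower-reflectance face produces a smaller signal-to-noise ratio, and the only way to recover it is more light.

```python
# Toy model, NOT a real face-recognition pipeline. All numeric values
# (reflectance, light, contrast, noise) are made up for illustration.

def feature_snr(reflectance, light=100.0, contrast=0.10, noise=4.0):
    """Signal-to-noise ratio of one facial feature.

    The feature appears as a contrast step whose amplitude scales with
    how much light the skin reflects back to the sensor; the sensor's
    noise floor is fixed regardless of the subject.
    """
    return (contrast * reflectance * light) / noise

# Same lighting, same camera noise, different skin reflectance:
print(feature_snr(0.60))  # higher-reflectance subject
print(feature_snr(0.25))  # lower-reflectance subject: smaller SNR

# Matching the first subject's SNR takes proportionally more light:
print(feature_snr(0.25, light=240.0))
```

Whether that's what is actually happening with this app is a guess on my part; the point is only that the physics pushes toward needing more light for darker subjects, not that anyone deliberately coded in bias.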
And I kinda doubt it's a giant racist conspiracy, personally
brush (53,781 posts)

Seriously? Are most of them not white, American men? I could also mention the hard time that women have in that world.
Come on, man. Why would racism, sexism and bias skip over the tech/programming field?
Hugh_Lebowski (33,643 posts)

But I am a white American programmer. In our shop we've had Indians, SE Asians, Blacks, Women, etc. And no, most programmers worldwide are not white American men.
And just for starters regarding the programming piece ... we don't know from whom the government may've licensed the FR software, what that company's make-up is, racially/culturally, etc. Also, FR software of the sort they'd be using likely doesn't rely on skin tones for anything, because it's too easy to alter skin tone.
In the end, I try not to jump to conclusions that conveniently confirm my biases (i.e. systemic racism exists) without any actual hard evidence. Stupidity/incompetence and/or actual, real technical limitations are frankly more likely than "a conspiracy to screw over certain segments of the population" in my mind.
I could be wrong, what you suggest is possible, but my guess is ... racism isn't the reason for this problem.
You're free to suppose differently, of course.
WhiskeyGrinder (22,351 posts)

brush (53,781 posts)

we all know it's Spanish. Do not tell me that is not intentional.