
Facial recognition error prompts police to arrest Asian man for burglary 100 miles away
https://www.theguardian.com/technology/2026/feb/25/facial-recognition-error-prompts-police-to-arrest-asian-man-for-burglary-100-miles-away?CMP=Share_iOSApp_Other
by 457655676
7 comments
Welcome to the future, where software can just get things wrong all the time, and it’ll fuck up your life permanently with no one to blame.
This is maybe the 5th time I’ve heard about AI misidentifying non-white people.
It’s genuinely terrifying that a computer can send the police in your direction
If you read the article, it’s pretty clear they don’t look the same. Thames Valley claim a visual ID was made first, too. So all brown people look alike?
Surely if you are arrested and subsequently released without charge, your details should not be retained by the police.
This will happen a lot more, and there will be no accountability, because they will just say it was a computer error and avoid all responsibility. It’s one of the many reasons I’m massively against AI facial recognition and consider it massive government overreach which is totally unnecessary.
Serious question: what are the legal implications of resisting arrest when you have never committed a crime in your life and your arrest was ordered by some Palantir robot?
Police need to be held personally liable for the consequences of using this kind of technology. Heads should roll whenever this happens.
You’d think they’d have the basic common sense to check the guy’s identity and see that he has no criminal record and lives a hundred miles away before arresting him.
Given those circumstances, it’s pretty likely to be a false positive.