Facial recognition run by private companies in shops confuses me - accuracy improves as more data is fed into these systems, but private systems only get tiny amounts of data and don't have access to things like custody photos, or even the identities of the people they label 'thief'.
Edit: and the data they do get isn't going to be of particularly high quality if it's just relying on images fed to it from shop CCTV - it's not like they can share data from one customer to another.
Edit 2: as pointed out by a commenter, this was human error. The system identified him correctly, but his data should never have been put on there.
Is there some way of finding out which shops use it? Or are they required to display some kind of notice? So that those of us who are lucky enough to have the luxury of being able to shop somewhere else (not everyone, I know) can choose not to go there?
Defiant-Yellow-2375 on
To be fair to the machines, he looks like every white British bloke over 55.
techbear72 on
> “We will continue to work closely with the Information Commissioner’s office to ensure regulations remain effective.”
Remain? Doesn’t seem like they’re working very well.
shortymcsteve on
Facial recognition in Home Bargains is madness. It also doesn’t seem to work very well, since the two related stories at the bottom are also about this happening in 2024 and 2025 to other people in the same store.
egg1st on
It's not bad technology, it's a bad process. The tech will highlight likely matches to known criminals. The mistake is that their process assumes the tech is right. The process should first inform the customer that the system has raised a flag, state that it may be a mistake, and ask them to confirm their identity.
bvimo on
Look at the head, the placement of the eyes. Quite obvious to any phrenologist this man is a criminal. Well done technology for identifying the latent tendency.
Hollywood-is-DOA on
We are slowly walking into a world full of facial recognition that will eventually be linked to your social credit score.
mashed666 on
Probably Hanwha cameras? I've noticed they're quite popular in retail environments.
stray_r on
Does this count as defamation? A couple of expensive-to-defend lawsuits might be the pressure required to increase specificity.
ii-_- on
Non-story. I'm sorry this guy got falsely labelled, but it's so, so worth it. Catching actual criminals massively outweighs the occasional mis-label. We benefit from an intensely competitive supermarket scene (suppliers don't!), so less theft should result in lower prices for us.
This is the same facial recognition company that [wrongly identifies people at Sainsbury's](https://www.lbc.co.uk/article/sainsburys-apologises-facial-recognition-london-news-5HjdRdG_2/).