5 comments

  1. KeyTea8394 on

    While I’m sure there are technical reasons why the system is “biased”, I pulled this out of the article:

    > Testing at the National Physical Laboratory found the software correctly identified Asian suspects 98 per cent of the time, falling to 91 per cent for white suspects and 87 per cent for Black suspects.

    So out of 100 people it picks out, it gets 13 wrong for Black suspects and 9 wrong for white suspects. Key word: suspects. I’m sure after getting stopped by the police and identifying yourself, they’d realise it wasn’t you and let you go. While inconvenient, if it’s catching 87 or 91 actual criminals it’s got to be worth it.
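    For anyone checking that arithmetic, here is a minimal sketch. It assumes, as the commenter does, that the NPL percentages can be read as per-suspect identification accuracy; the three rates come from the quote above, everything else is illustrative:

    ```python
    # Misidentification counts implied by the quoted NPL accuracy figures,
    # per 100 flagged suspects. Treating the percentages as per-suspect
    # accuracy is the commenter's reading, not necessarily the study's metric.
    rates = {"Asian": 0.98, "white": 0.91, "Black": 0.87}

    for group, accuracy in rates.items():
        wrong = round(100 * (1 - accuracy))  # errors per 100 flagged suspects
        print(f"{group}: {wrong} of 100 misidentified")
    # Asian: 2 of 100, white: 9 of 100, Black: 13 of 100
    ```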

  2. wkavinsky on

    I’m totally shocked at this.

    Totally.

    It’s also yet another fucking reason why the drive toward AI and surveillance is fucking bad: these systems are **inherently** biased by the data they are trained on, and they aren’t good at picking up new and emerging trends.
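    A toy sketch of the training-data point (entirely synthetic data with illustrative group sizes, nothing to do with the NPL system itself): a classifier fit mostly on one group tends to score worse on the under-represented one.

    ```python
    # Toy demonstration: under-representation in training data produces a
    # per-group accuracy gap. Synthetic 2-D data; not the NPL system.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    def make_group(n_per_class, shift):
        """Two classes of 2-D points; `shift` relocates the whole group."""
        X = np.vstack([rng.normal(shift, 1.0, (n_per_class, 2)),
                       rng.normal(shift + 1.5, 1.0, (n_per_class, 2))])
        y = np.array([0] * n_per_class + [1] * n_per_class)
        return X, y

    # The majority group dominates training; the minority barely appears.
    Xa, ya = make_group(1000, shift=0.0)
    Xb, yb = make_group(25, shift=3.0)
    model = LogisticRegression().fit(np.vstack([Xa, Xb]),
                                     np.concatenate([ya, yb]))

    for name, shift in [("well-represented", 0.0), ("under-represented", 3.0)]:
        X, y = make_group(2000, shift)
        print(f"{name}: accuracy {(model.predict(X) == y).mean():.2f}")
    # The single decision boundary sits where the majority's classes split,
    # so the shifted minority group is classified far less accurately.
    ```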

  3. SexDrugsAndPopcorn on

    Nothing like changing the narrative from a privacy issue to a race issue

  4. RandonEnglishMun on

    AI is only as biased as the person feeding it information.
