AI tools should not be allowed to create “undressed” images, say Britons

https://yougov.co.uk/technology/articles/53828-ai-tools-should-not-be-allowed-to-make-undressed-images-say-britons

by SpottedDicknCustard

22 comments

  1. SpottedDicknCustard on

    > A new YouGov survey shows that the British public overwhelmingly believe AI companies should not be allowed to generate such imagery. Fully 96% of Britons say that firms should not be allowed to generate ‘undressed’ images of children (only 1% say they should), with 87% saying the same regarding such images of adults (5% think this is ok).

    Good to see such near universal support.

    The 1% should probably have their hard drives investigated.

  2. VivianOfTheOblivion on

    I wonder why Nigel wants to go against near-unanimity here, and suggest that legislating against the child porn robot is an affront to free speech? Any Reformers want to chime in here? You’re all about protecting women and children, innit?

  3. PsychologySpecific16 on

    Most generative AI can, though, as I understand it.

    We already have convictions for actual AI CP on other platforms so I don’t think it’s as simple as writing some new code.

  4. RecentTwo544 on

    The issue here is a lot of that 96% will be lying. You’re not exactly going to say on a survey “yeah, I love committing serious sexual offences.”

  5. HammerSpanner on

    But but but….what about my free speech? /s

    It fu*ken boggles my mind – before Trump and Elon came along and convinced all the dickheads that it’s okay to be cruel and abusive (not to mention racist, transphobic “alpha twats”), this wouldn’t even be a debate, it would be banned. No one would question it.

  6. LeoLH1994 on

    Agreed. There is a difference between that and legitimate art which often aims to get the subject’s consent first.

  7. Cletus_Banjo on

    Sounds like a technical solution to a social problem. Isn’t the issue the fact that so many people WANT to make undressed images, rather than whether software allows them to do this?

  8. RedBerryyy on

    There’s plenty of space to argue about rules around model weights, but Musk just straight up making an easily accessible online non-consensual undressing tool is something else. It’s not a problem that needs a delicate technical touch; he’s not even trying. Blows my mind.

  9. SignalButterscotch73 on

    No freely accessible AI tool should be able to make pornographic images. I wouldn’t ban the ability completely as porn is still legal to make.

    No AI tool should be able to make child porn. Illegal is illegal.

    Fuck you Elon Musk.

  10. VampyrByte on

    The genie is somewhat out of the bottle with the technology as a whole, although it will still be a good thing to regulate large providers and have them take some accountability.

    However, all of this and more is possible with local models that don’t involve a big corporation to hold to account. We should probably base our laws and regulations around the harassment and harm, rather than the images themselves. In the near future these images and videos may well be indistinguishable from the real thing.

  11. BalianofReddit on

    I’ll go one step further.

    AI should be completely banned from all production of visual or auditory media. Full stop, no research, none.

  12. tezmo666 on

    This is a good first step to regulation but let’s be honest, AI should not be allowed to use the likeness of anyone, period. Most models are still relatively discernible, but we’re on the cusp of having the most sophisticated disinformation/scamming tool imaginable at the fingertips of anyone.

    Problem is, actually coming down hard on these start-ups now and restricting use means their USP is basically gone. And this bubble is too big to fail now, so it either pops on its own or, as is the way with our reactionary government, we wait until there’s an epidemic of deepfake scammers ripping everyone’s nan off before something is done.

  13. fanglord on

    It’s kind of a weird ethical thing in general: any nude/lewd image or video will be generated from real people who will almost certainly not have consented to it, even if the end product is not recognisable as the original person.

    Maybe porn will save the creative industry, because I think it’s a fair enough argument that if you can’t do it without consent for adult material, then the same should apply to any media.

  14. Remarkable-Ad155 on

    Goes without saying this is a no for minors. 

    For adults if some enterprising person wants to make money by creating content *of themselves or others from whom they have permission* where you can do this, fine by me. Porn already exists. 

    The key concept here (which seems to be missing from the debate) is consent. It’s clearly not correct to allow Grok to do this to adults without consent (or children at all). 

    Does go to show the dangers of obsessively sharing pictures of yourself and family too though, really hope this gives at least a few people pause for thought. 

  15. One_Anteater_9234 on

    It isn’t. It is a re-imagining of what someone who looks like you would look like naked. It isn’t real.

  16. huzzah-1 on

    When The Government wants censorship, it is NEVER for your benefit. I am against giving them yet another lever of control because they will only use it to impose more authoritarian policies. Everything they do is with ill intent.

  17. TheSpaceFace on

    I think there is too much focus on the tools used to generate stuff like this. Technically you can use Photoshop or similar software to create similar images, and you can also download AI models trained outside the United Kingdom which will continue to do this, so banning the ability for these AI models to generate content like this is tricky and perhaps impossible to enforce.

    For example, it would be silly to ban all pencils because someone can draw something bad with them. The focus should always be on criminalizing the conduct, such as the distribution and harassment, rather than forcing the technology to be limited. If you make the tool ‘dumb’ enough to never produce anything ‘undressed,’ you also make it too dumb to understand anatomy, medical science, or classical art.

    I do think that we should discourage people from generating stuff like this, but we shouldn’t approach it by forcing legislation on the tools themselves; in my opinion, we should focus on enforcing laws around how companies can use AI. The problem is rarely with the tools themselves but with the companies who are using them in an abusive way for profit. Such as X/Twitter.

  18. Protect_the_citizen on

    They should not be used to create art/music either, but here we are…

  19. TheMarksmanHedgehog on

    Since image generation is just image recognition backwards, it should be relatively easy to scan and automatically block or discard offending images, especially for larger companies who implement these “tools.”

    Locally run/open source tools could be more of a problem, but I could see it as entirely reasonable to ask that they implement image recognition and automatic deletion of anything sufficiently “sus” as a safety measure; obviously anyone savvy enough could remove that functionality, but it might serve as a deterrent.
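
    A minimal sketch of what such a post-generation filter could look like, assuming a Hugging Face image-classification model that exposes an “nsfw”-style label; the specific model name and threshold below are illustrative assumptions, not a recommendation:

    ```python
    # Sketch of a post-generation safety filter: run each generated image
    # through an image classifier and refuse to return anything it flags.
    from PIL import Image
    from transformers import pipeline

    # Any image-classification model with an "nsfw"-style label would do;
    # this one is a publicly available example on the Hugging Face hub.
    _classifier = pipeline("image-classification", model="Falconsai/nsfw_image_detection")

    BLOCK_THRESHOLD = 0.5  # illustrative; lower means stricter filtering


    def is_blocked(image: Image.Image) -> bool:
        """Return True if the classifier thinks the image should be discarded."""
        for result in _classifier(image):
            if result["label"].lower() == "nsfw" and result["score"] >= BLOCK_THRESHOLD:
                return True
        return False


    def filter_generated(image: Image.Image) -> Image.Image | None:
        """Drop flagged images instead of returning them to the caller."""
        return None if is_blocked(image) else image
    ```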

  20. ancapailldorcha on

    It’s funny how the “protect our women and children” people are either completely silent on this or are offering mealy-mouthed defences of literally the world’s richest man.

    It boggles the mind how we’ve allowed these techc*nts to exploit people for so long.

  21. ChalmersMcNeill on

    So does anyone think there are any prominent politicians or right-wing activists in the 1%?

  22. OkSignificance5380 on

    Cat’s out of the bag.

    It is possible to run local models.
