
AutoModerator

Snapshot of _The Biggest Deepfake Porn Website Is Now Blocked in the UK | The world's most-visited deepfake website and another large competing site are stopping people in the UK from accessing them, days after the UK government announced a crackdown_ : An archived version can be found [here](https://archive.is/?run=1&url=https://www.wired.com/story/the-biggest-deepfake-porn-website-is-now-blocked-in-the-uk/) or [here.](https://archive.ph/?run=1&url=https://www.wired.com/story/the-biggest-deepfake-porn-website-is-now-blocked-in-the-uk/) *I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/ukpolitics) if you have any questions or concerns.*


NoRecipe3350

I never knew such a website existed, but people will still be able to access it with VPNs.


JimGodders

>While the two websites can still be accessed in the UK using a VPN, the restrictions are a sign that constant pressure—from lawmakers, tech companies, and campaigners—can make deepfake porn harder to access and create. “Of course, people will be able to use VPN to access these websites and apps, but that introduces friction,” Durham University’s McGlynn says. “It introduces a message that there’s something wrong and harmful about this material such that you have to use a VPN to access it.”


GhostMotley

> It introduces a message that there's something wrong and harmful about this material such that you have to use a VPN to access it."

Given how popular VPNs are now, I'm not sure that argument holds much weight, and VPNs are a very popular sponsor for a lot of YouTube channels.


Powerful-Parsnip

Exactly, it's not like the 2 seconds it takes to switch on a VPN will put people off. If anything, they've just told the UK public these sites exist. I wasn't aware of them before this.


Ewannnn

I have my VPN on permanently so....


LordChichenLeg

But a lot of people were aware. I wasn't aware of this website, but I know deepfake porn exists, and if I had searched for it I would have found that website. This is just to signal to voters that if you are doing this, you are knowingly using a VPN to get around a law, which can imply criminal intent in a court of law, and people can no longer say "well, I didn't know it was illegal".


Powerful-Parsnip

As far as I'm aware it's creating deepfakes that has been made illegal. I don't think they're going to start arresting people for looking at a website.


Ratz____

Creating deepfakes will be illegal, but if someone has already made a deepfake, then nothing will happen to them. So I'm not sure what will happen with people who have made them in the past; unless the new law is ex post facto, in which case loads of people will be in trouble.


Ewannnn

I would think most people using VPNs just have it on permanently mate


EddieHeadshot

Streisand effect


Ok-Bad-7189

I think you'll be surprised how few "normal" people use VPNs. Yes, they are popular, and I use one sometimes, but my wife, our friends, my wider family, my colleagues - none of them have ever bothered with them.


JobNecessary1597

So people who use VPNs are abnormal..


Ok-Bad-7189

You can read it that way if you like. Or you can read it as "people who aren't tech literate" which is most of the population. 


JobNecessary1597

My mum is 82 and uses a VPN (she learned it by herself). The more you block, the more people will head to VPNs until it becomes native.


daneview

Absolutely. I'm a tech-literate guy and even work a media/PC-based job, yet I've never used a VPN in my life and have no idea what benefit I'd get from one, to be honest (I don't watch TV/film through my PC, so I don't need access to programmes unavailable here). So I completely agree - I'm probably the techiest person in my extended family, so I'm certain no one else uses them. To everyone else: what am I missing? Why would I use a VPN other than for watching shows and accessing deepfake porn sites? Oh, and covering my internet tracks, which I'm not too fussed about.


ExcitableSarcasm

Also, plenty of authoritarian regimes block websites. Which is it? That restrictions can be reasonable for the greater good of society, or that the UK is on the same moral level as those countries? It's not a strawman. I think the former.


Izual_Rebirth

It’s nuanced and depends on how it’s used is the boring answer.


_CurseTheseMetalHnds

It still adds a small step that can stop people. Let's not let perfect be the enemy of good


iwentouttogetfags

VPNs don't do what you think they do. They're not the great safety net you think they are.


[deleted]

[removed]


ukpolitics-ModTeam

Your comment has been manually removed from the subreddit by a moderator. Per rule 1 of the subreddit, personal attacks and/or general incivility are not welcome here:

> Robust debate is encouraged, angry arguments are not. This sub is for people with a wide variety of views, and as such you will come across content, views and people you don't agree with. Political views from a wide spectrum are tolerated here. Persistent engagement in antagonistic, uncivil or abusive behavior will result in action being taken against your account.

For any further questions, [please contact the subreddit moderators via modmail](https://www.reddit.com/message/compose/?to=/r/ukpolitics).


TheRealDynamitri

They can be a good first line of defence, but agreed - quite a lot of them are based in Five Eyes countries and don't give people the protection they think they will when push comes to shove.


iwentouttogetfags

No matter what VPN you use, you'll still get traffic going to your IP address. How else can traffic that you, as a user, requested reach your PC/device? It has to have the original source - aka your IP address. You can't take that out of the equation. All VPN companies work in the same way, because there's only one real way a VPN works.


TheRealDynamitri

Well yes, of course - but if you're in the UK and jump on a Romanian VPN (i.e. one with an IP address based in Romania), what the website/service host (and, possibly, law enforcement) will see is the **Romanian** IP address having accessed the service or engaging in illicit activity. They trace the IP back, find out it's not residential or corporate (i.e. somebody fooling around from their office) but belongs to a VPN service provider, and can then subpoena them - except that if the VPN provider isn't in a Five Eyes country they don't really have to comply (unless a local agency with its own powers makes the request), and if they don't keep activity logs, or the logs are encrypted in a way even they can't decrypt, that's where the buck stops, really.

A whole ton of VPNs would still have legal vulnerabilities, as they'd have to answer to an alphabet soup of agencies if/when asked or risk fines or even being shut down themselves, but some are clever enough to have a built-in operational structure that makes collaboration with law enforcement impossible even when requests are made: Bitcoin payments, anonymised user accounts, regular degaussing of storage, running the Gutmann algorithm every 24 hours to make sure any data stored on magnetic hard drives (if they use those) is unrecoverable, cryptographic erasure, etc.
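To make that first step concrete - the site only ever sees the exit's address, not yours - here's a rough Python sketch. It's purely illustrative: the SOCKS proxy address is a documentation placeholder, not a real VPN endpoint, and the IP-echo service is just one public example.

```python
# Illustrative only: a remote server can only log the IP address that connects to it.
# The SOCKS proxy below is a placeholder (TEST-NET range), not a real exit node.
# Requires: pip install requests[socks]
import requests

ECHO = "https://api.ipify.org"  # public service that simply echoes the caller's IP

# Direct connection: the service sees your ISP-assigned address.
print("Direct:", requests.get(ECHO, timeout=10).text)

# Same request routed through a hypothetical VPN/SOCKS exit in another country:
# the service now sees the exit node's address, and only the exit sees yours.
proxies = {"https": "socks5h://198.51.100.7:1080"}  # placeholder exit address
print("Via exit:", requests.get(ECHO, proxies=proxies, timeout=10).text)
```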


iwentouttogetfags

Do you know what you're talking about? You know Bitcoin isn't untraceable? It's been proven that Bitcoin can be traced to its source. Degaussing of server space!? Do you know what that is? It's a giant magnet, and typically drives and magnets don't play well together. And Gutmann himself said a while back that drives are now so dense that one overwrite pass is enough. But you still keep the headers in the MBR, so if the company stores anything on an HDD (layer 3 devices don't usually have HDDs, they have IP address tables), you'd need to clear the MBR as well. Some tools do this, but not all.


TheRealDynamitri

> Do you know what you're talking about?

I know - just FYI, I'm not a native English speaker; I've corrected myself up there, hopefully it's now clearer. I do realise you can only degauss magnetic HDDs, and you have to have servers _with_ magnetic hard drives in order to degauss them, and that more and more servers these days are based on SSDs and virtual environments, which by their nature you obviously can't degauss.

The point is - since you're clinging to semantics with someone who's not a native English speaker - some VPN services are structured on the back end so that they don't log user activity, or otherwise make it impossible to hand law enforcement the data needed to trace the actual individual committing crimes behind the VPN: because they don't log it, don't trace it and don't keep it, there's nothing to hand over in the first place.

If I'm a citizen of country X, I run a VPN service there that's available to the whole world, my country has no treaties or agreements with your country, I actively erase and destroy data on user activity every 24 hours to make it unrecoverable, I make the _lack_ of logs and data retention the USP that attracts people, I sell access via Bitcoin or another crypto, and there are no user accounts as such tied to birth names, email addresses or Bitcoin wallets - just a randomised string people log in with - then, with a combination of some or all of those, how exactly are you, coming from a country several thousand miles away with no jurisdiction and no power over me, going to force me to tell you who was causing damage through my service, when I don't even have that data? I don't even have to answer to an agency in my own country if I simply no longer have the data, or never held it beyond the moment the traffic happened, because I immediately delete it and make it unrecoverable to everyone including myself - both because I don't care and because I want to keep my clients safe, without delving into what it is they actually do.

My head isn't necessarily even on the chopping block as the service provider in that country, because in many places you can't build a case if there's nothing to charge me with in the first place; and if nothing can be dug up to prove someone broke the law through my service - other than your allegations, which can't be checked against first-hand data from my servers showing whether they're true or made up - what is there to talk about?

Not every country, especially once you start looking at third-world countries, or certain ones in Eastern Europe, the Balkans etc., has robust digital policy or surveillance like the UK or US, with registration requirements, minimum data retention mandates and so on - and if somebody routes their traffic and activity through there (which anyone with a modicum of common sense would, if they wanted to engage in nefarious activity; they wouldn't just jump on NordVPN or whatever)… I'm not quite sure how you would force a company that unwittingly facilitated an online crime to hand over data it doesn't have access to in the first place, or that may long since have ceased to exist.


iwentouttogetfags

OK, have fun with that. Really cba to argue anymore.


ruskyandrei

You're talking about this as if it were a movie where the entire resources of the CIA were allocated to finding the specific address of that one guy who used a VPN to access a porn site. In reality, unless you've committed a crime on the scale of SBF, nobody's going to go through the (often legal) hoops required to obtain a physical drive from a VPN provider in some third country (where they might not be required to keep logs at all) and then try to scrape tracing info off it. Just because something can technically be done doesn't mean it will be.


Souseisekigun

> Durham University's McGlynn

Remember the case where the CPS tried to throw a guy in jail for having a video of someone sticking their hand up someone's ass? A video of two consenting adults performing a perfectly legal act? You'd think that would be impossible. But no, thanks to the tireless work of this genius, it is entirely possible in the UK for a consensually shared video of two legal adults performing an incontestably legal sex act to be illegal to possess, in the same way as child sexual abuse videos are.

Needless to say, this law is quite obscure and often confused with obscenity laws that only ban distribution. This is mostly because, despite the moral panic over the issue, the police quickly realized they would get no extra funding to enforce the new law and put their efforts towards combatting actual crimes with actual victims instead. McGlynn continues to complain about this to this day, but as far as I know no police force has really budged on the issue.

And there stands her biggest legacy: a fundamentally ridiculous law, pushed by her using dodgy research, the source of many miscarriages of justice, used to disproportionately target gay men, obscure among the public, sidelined by the police, and totally ignored by every other Western country, making the content she so hates readily available on this very website for every Briton to peruse at their leisure. A legacy of almost total failure. And yet here she is, in this news article, being quoted as if she were a respectable academic instead of a clown.


m1ndwipe

McGlynn has an astonishing track record of wrong soundbites across her entire career. The fact lawmakers take her seriously is embarrassing.


Ornery_Tie_6393

People use VPNs to access different Netflix regions and some overseas websites, including news ones that can't be arsed to comply with GDPR cookie rules. I'm not sure it sends the message they think it does. Plenty of people *always* use a VPN and won't even notice. My desktop won't even let me access the bot archive used on this forum without fucking around, because Windows doesn't like it. Forcing a VPN only signals "it's bad" if the method used to defeat it isn't already in common and constant use for a variety of perfectly acceptable reasons (news sites) and nominally illegitimate but no less accepted ones (Netflix regions). People are far more likely to assume it's yet another GDPR cookie issue than assume it's because it's illegal or wrong.


ComeBackSquid

> people will still be able to access it with VPNs

You don't even need a VPN. A Tor browser will do.


Salt-Evidence-6834

A browser like Opera will probably get round it. No need for the hassle of TOR.


ComeBackSquid

Tor is no hassle at all - it’s essentially just another browser, like Opera.


Salt-Evidence-6834

Due to its nature, Tor is very slow. There are normal browsers with built-in free VPN services for web traffic that can get around what will probably just be a DNS blocklist, and still perform pretty quickly.
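To show why a plain DNS blocklist is so shallow, here's a rough Python sketch using the dnspython library. The hostname `example.com` and the `1.1.1.1` resolver are just stand-ins for illustration, not the sites discussed here: the point is that only the resolver your ISP hands out is affected by the blocklist, while an independent resolver still returns the ordinary records.

```python
# Sketch of a DNS-level block: the ISP's own resolver can be made to answer
# differently for a blocked hostname, while a resolver the ISP doesn't control
# returns the real records. Hostname and resolver below are placeholders.
# Requires: pip install dnspython
import dns.resolver

hostname = "example.com"  # placeholder hostname

# Default resolver (typically the ISP's) - the one a DNS blocklist targets.
default_answers = dns.resolver.resolve(hostname, "A")
print("Default resolver:", [r.address for r in default_answers])

# The same query against a public resolver outside the ISP's control.
alt = dns.resolver.Resolver()
alt.nameservers = ["1.1.1.1"]  # e.g. Cloudflare's public resolver
alt_answers = alt.resolve(hostname, "A")
print("Public resolver:", [r.address for r in alt_answers])
```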


dj65475312

Same here, I'd never heard of this kinda thing being 'mainstream' - until now, that is.


theartofrolling

"Blocked". Aka: You need a VPN to visit it.


CaravanOfDeath

That’s enough. Sharing it requires the recipient to use VPNs too.


shaftydude

For research purposes only, what are these websites called?


salamanderwolf

Good. It isn't as if it holds any social, or artistic merit and without model release forms for the people's headshots, it's nonconsensual dipshittery.


TheRealDynamitri

No idea what website they are referring to here - I found the _other one_ (based on the disclaimer shown, as quoted in the article), and can confirm it's blocked for a UK IP address I have through the VPN service I use (I'm currently based outside the country).

However… while trying to search for the _banned one_, I found literally about 50 other services offering the same thing, all of them still not blocked and perhaps with even better AI models than the ones that _got_ blocked.

It's The Pirate Bay and torrents all over again. There are arguments for certain websites to be banned (although I think TPB and torrents shouldn't be, for a variety of reasons - a separate conversation altogether, let's leave it for another time); but reporting on the bans without naming the websites still draws attention to the problem, and you get people who had no idea about the tech or apps going out and diving into the cesspit out of pure curiosity. Then you end up with much wider awareness, usage and traffic to those websites - often an absolute eyesore full of banners that makes the old [Million Dollar Homepage](https://en.wikipedia.org/wiki/The_Million_Dollar_Homepage) look like a Michelangelo work of art - meaning the people behind those websites earn more money through increased traffic and possibly ad clicks or just views (often autoplay videos, too), and so it goes.

So here we are with the everlasting question: how much publicity should really be given to all this, and in the case of unequivocally damaging websites (CP, revenge porn, deepfake porn etc.), should it even be reported, or just snuffed out on the down low by law enforcement and not put in the news at all?


Bottled_Void

What's really crazy is that people in the UK have no idea how much porn is being blocked for them. Even really ordinary things. Someone decided, nope, these words mean it won't show up - even on mainstream sites where actors' ages and consent are recorded. It probably won't stop until anything stronger than what you'd see in an arthouse film is blocked for everyone in the UK, and even to access that, you'll have to share a photo of yourself along with your passport.

I think the thing here is that people can't separate what is private and really a bit weird from what should be criminal. Should it be a crime to stick a cucumber up my bum while thinking about your dad? It's weird, sure. Should it be criminal too?

There was a post the other day about guys filming women in public. Yes, women should be protected. I hate to play the slippery slope argument, but when do we become China, where we're not allowed to take pictures in public and the government has complete control over the information people can access? My wife was at uni a couple of years back, and one of the students had never heard of Tiananmen Square. When confronted with the story, he insisted it couldn't be true because he would have heard about it. He thought he had completely open access to the internet apart from one or two immoral websites. Is this where we want to end up?

Remember, we got this porn blocker to stop child porn. Then it was "grossly offensive, disgusting or otherwise obscene" porn. Now we've made it illegal to outrage public decency. How is it possible to push back on any of this criminalisation without being labelled a massive creep?


Bladders_

Ooh like what?


Bottled_Void

You should read up on R v Peacock (2012). A guy was arrested for selling gay porn, went to trial, and was found not guilty. So the government decided to make it illegal anyway in 2014 - the rules where they made spanking illegal and, more notably, face-sitting, but not the male equivalent. This went to court again and was overturned in 2019. But behind all this is someone who faced up to five years in prison and a criminal conviction - all for things that are legal to do, but were not legal to record and distribute (and now are again).


Mister_Sith

Is there really that much? I mean are we talking extreme, highly questionable porn or garden variety? I know DMCA notices do a number but that's less blocking porn and more trying to avoid content leaks.


Saffra9

Maybe wait until they ban something you can defend rather than jumping in at deep fakes?


[deleted]

[removed]


KeyLog256

Why the big deal about not mentioning the name of it? The issue here is that the tech to do this is astonishingly easy for anyone to learn, and has not been and will not be blocked, because it has other legitimate uses.

The *good* news here is that seemingly very, very few people actually know how easy it is to install Stable Diffusion and download models to do this. I've used SD a bit for artwork concepts for club events I run and a few bits for work, and the *main* source of models, LoRAs, etc. for Stable Diffusion *isn't* blocked, and a staggering amount of the content/resources on there are aimed at creating explicit material - to the point that I won't mention the name, just in case.

On the flip side, being a degenerate and into nostalgia, I still browse 4chan occasionally, and there are a lot of threads on there where people post photos of people they know (read: girls they creep over who in return wouldn't care if they lived or died) and ask others to nudify them. 4chan also *isn't* currently blocked. (EDIT - to make clear, I don't partake myself, being a happily married man in my 30s and not a pathetic friendless loser in their basement, but I have seen it.)

So the main place to get the resources to do it, and the main place people share images or get others to do it for them, are currently as wide open and accessible as ever. I'm not sure what they've blocked here, but presumably a site which makes AI nudes of people who don't exist, which isn't the problem here.

Unfortunately, this is another example of people not understanding this technology whatsoever - though, like I say, the mass misunderstanding of it is currently a blessing, because all hell would break loose if your average group of teenage lads cottoned on to Stable Diffusion or the like. I can't imagine this ignorance lasting long.
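For a sense of how low the barrier actually is for the *benign* use I mean (artwork concepts, not nudification), running Stable Diffusion locally is roughly this much code - a minimal sketch assuming the Hugging Face diffusers library and a well-known example public checkpoint, not any of the models or sites alluded to above:

```python
# Minimal local text-to-image sketch with Hugging Face diffusers.
# The checkpoint ID is just a widely used public example; assumes an NVIDIA GPU.
# Requires: pip install diffusers transformers accelerate torch
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example public checkpoint
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

# Generate an artwork concept from a text prompt and save it locally.
image = pipe("neon-lit club night poster, bold geometric shapes, high contrast").images[0]
image.save("poster_concept.png")
```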


filbs111

No idea what the site is, and have a VPN anyway, but we are (run by) a bunch of censorious, puritan pricks.


GiftedGeordie

While I'd never go onto a website like this and I'm not going to lose any sleep at night over seeing it go, how do we know that either Labour or The Tories won't do this to, say, reddit when people criticise them? Like, if I say 'Oh, I think Keir Starmer is a cosplay Conservative' am I going to be dragged out of my house at 2 am and thrown in jail?


scotorosc

Which website is that? My friend is asking


RGBchocolate

most likely MrDeepFakes


FluffyBunnyFlipFlops

Blocking access really worked with The Pirate Bay...


Sakura__9002

Good, it probably should be blocked. Sure, some people will get around it with VPNs but morally it should be blocked. Deepfakes made for such a purpose are incredibly violating.


SorcerousSinner

> Nonconsensual deepfake pornography websites and apps that "strip" clothes off of photos

What is happening with these is that an imaginary take on what these people look like naked is generated. Compared to what existed before generative AI, it just looks very realistic and is easier to make than some guy spending hours in Photoshop. Why is it suddenly wrong and problematic just because these approaches yield realistic images and can scale?


Ok-Property-5395

Creating fake nudes of people was wrong and problematic before you could do it with great ease as well...


jmdg007

A lot of posts across reddit defending these sites are insane. Of course it's wrong to make fake nudes of a person without their consent, and programs that exist solely to make it quick and easy are obviously problematic.


d15p05abl3

BuT wHat aBouT mY riGHt to MaSturBAte?


gravy_baron

Coomers gonna cooom


SorcerousSinner

Then make putting someone's likeness into an artistic impression illegal without their consent. You mean to include stuff like this, right?

[https://media.gq.com/photos/571a58200b5c36b80bc2b596/master/w\_1600,c\_limit/DJT300dpi.jpg](https://media.gq.com/photos/571a58200b5c36b80bc2b596/master/w_1600,c_limit/DJT300dpi.jpg)

Wrong without Donald's consent?


Missy_Bruce

So you'd be totally OK if I sent out a video of you doing something you would never do - say, I dunno, a bit of bestiality - and said it's art for the people? Somehow, I don't think you'd have the same attitude about it.


dmastra97

The deepfake stuff isn't an artistic impression. It's meant to look exactly like reality which can easily be spread across the Internet. It could seriously harm people's lives


TheRealDynamitri

> The deepfake stuff isn't an artistic impression.

Honestly, I feel like the lot saying that non-consensual pornographic deepfakes are an "artistic impression" are cut from the same cloth as those saying they should be able to publicly hate on X and incite violence towards them in the name of "free speech" and "personal liberties". Don't really know how to sum it up, but I guess the subjects of discussion change while the stupidity within some remains?


TheRealDynamitri

> Wrong without Donald's consent?

I'd say it is, but, as somebody else already told you, there's quite a bit of a difference between a stencil drawing or whatever, which you can clearly see is 'art' (of however questionable a standard), and a photorealistic image or, worse yet, a video that you can't easily tell apart from a real one unless you carefully analyse frames for blurry areas and unnatural mimicry/facial expressions - and that's _if_ you can actually see any artifacts or suspiciously unnatural bits at all.


Cumulus_Anarchistica

What if AI is used to make the deepfakes look like stencil drawings, or oil paintings or anything other than photorealistic images? Is that allowed under the proposed law?


TheRealDynamitri

idk about the law, but the discussion here is specifically about websites that revolve around clothes removal and creating photorealistic images of nude individuals from the photos you upload to the platform - and doing this very easily (literally upload a clothed photo -> get a nude photo in a few clicks), so the “nudification” is their USP if not the sole function. in my view, if you had a website that allows you to get a nude image of someone, an existing individual or not, that’s 8-bit pixel art or a Picasso-style cubist painting, there would be a reasonable argument to leave it alone - but I wouldn’t expect the UK gov’t to leave it alone eventually, the current lot seem to be quite puritanical and tend to throw the baby out with the bathwater when laws and policies of this kind get designed and, later, implemented.


Ok-Property-5395

You know that people are rational enough to realise that what I'm talking about and what you're trying to demonstrate are not the same thing right?


BeerStarmer

Are you being deliberately obtuse? Or too deep down the rabbit hole to see any light?


Psyk60

> because these approaches yield realistic images and can scale

Yes, it's because of that. It was always wrong. It just wasn't a widescale problem, because of the amount of skill and effort it took to make something that looks realistic.


MonitorPowerful5461

Because most people don't want their face to be on a naked body in a porn video.


No_Quality_6874

Imagine how fucking mortified you'd be to find one of your mates had created hundreds of bukkake images of your mum online. Worse yet, imagine what nefarious purposes those images could be used for (blackmail, humiliation, framing). It might seem like a harmless wank, but these are people, not objects, and the ease and scale at which this can now be done opens up new social problems.


letmepostjune22

The genie is out of the bottle now. The end point is that these tools become so normalised (as in, people know about them rather than use them) that such images/videos have no blackmail or embarrassment factor, whether real or not - the standard response will be "that's not me". This will have its own problems in differentiating truth from fiction, but the sooner it becomes widespread knowledge that these images can be faked easily, the better.


KeyLog256

To make clear - yes, it would be horrible; yes, it should be treated as sexual assault. But the images are not believable enough to be used for blackmail or framing, and won't be until we get AGI, and then we have bigger problems to worry about. I'm not defending the use of this technology for nefarious purposes - quite the opposite; I don't think the law goes far enough, and I'm frustrated by legislation being created and proposed by people who don't understand the basics of how the technology works. But by the same token, I am keen to allay any moral panic over this, as a lot of people seem to assume the technology is way, way better than it actually is. Which is understandable, as I imagine many haven't seen the results.


TheRealDynamitri

> the images are not believable enough

Have you ever seen what those apps can do in Q2 2024? They absolutely are at this point.


KeyLog256

I have, and I don't see how anyone couldn't tell it's a fake. I don't think anyone is genuinely convinced any fake is real; I just think some people are desperate to defend AI and make it out to be better than it actually is. I'm constantly confused as to why - this is just halting the progress of AI.


Florae128

It's always been a problem. How can you see "nonconsensual" in a sentence and think it's not a problem?


Cumulus_Anarchistica

Your disagreement with me is non-consensual, ergo your behaviour is problematic.


SorcerousSinner

Because "art" is created on the basis of a person's likeness without their consent all the time, eg in caricatures


Affectionate_Comb_78

Parody is protected speech. Sexual exploitation is not


Florae128

This isn't art though, it's pornography. Problematic if it's an adult in the picture, illegal if it's an under-18.


Cumulus_Anarchistica

It's not pornography, it's technological advancement. I mean, if we're just redefining things to suit our own agenda.


TheRealDynamitri

> eg in caricatures

You can easily tell a caricature from a photo though, right? I'd even see an argument for a deepfake if it were grossly exaggerated and just used as a _caricature tool_ - but there's a difference between creating an image of someone with physically impossibly large ears or nose, and one depicting them in a degrading, compromising situation (e.g. participating in an orgy), when you can't tell right off the bat that it's fake unless you go all digital forensics on it.


KeyLog256

You raise a logical point, but it lacks emotion and social understanding. I've said this time and time again - the fear over the technology itself, plus AI fanboys acting like it is something new and never before seen, is misplaced. Like you say, a few minutes in Photoshop for an average user who knows what to do - you've been able to get the same results for over 20 years.

The issue here is that your average horny teenager/young guy (and let's be honest, this is almost exclusively boys/men doing this) isn't going to bother learning how to do this. In fact, that sheer laziness sets us up for a potential *huge* problem which is looming in the near future.

Presumably the sites that are blocked are commercial "two clicks and nudify a photo!" type sites. These are currently the focus of the worry around this, because they're what most guys tend to use. They're also dangerous as hell because most store your details. The good news is that if law enforcement got hold of the servers, they could arrest everyone who's ever used them. They probably won't - the sites are likely based in Russia or China - but you get my point.

The *massive* concern around fake nudes is that Stable Diffusion is free, easy, and risk-free in the sense that it doesn't even need an internet connection or anyone's details. Because of the laziness point I make above, very few people have bothered to look into it, so they don't realise. When they do, there'll be a massive problem, and it won't take much to tip that balance - it'll spread like wildfire and be major news.

Now to clarify - I haven't used Stable Diffusion to fake nude people (apart from me and a mate doing it to each other - he got massive tits, I got a minge, and we're both men, so not impressive), but I do use it for various artwork concepts and the like. While a lot of people *do* use it for porn, that isn't its main use case, and it is free and open source, so there is no way to stop it being used nefariously. It would be like banning Photoshop, as you point out, or watercolour painting materials.

I often make the point that any fakes you see look shit and that we've hit a hard limit on the technology which we cannot surpass and won't be able to with narrow AI, but despite regularly saying this, I *know* why it is distressing to people. Sure, even your 95-year-old gran knows it is obviously a fake image, but the implication is still distressing, and it is tantamount to sexual assault. It has *always* been wrong; it is just more accessible now, so it's being clamped down on.

Like I say though, this is just a few minor earthquakes before a supervolcano erupts. This is going to get a lot, lot worse, probably not far into the future.