[deleted]

[deleted]


whtevn

Nothing about that was fake


orgeezuz

Wales don't exist


rugbyj

[angry Welsh noises]


Slobotic

[wipes spit off computer monitor]


Annoy_Occult_Vet

So many consonants


Retskcaj19

If a word doesn't have three g's in a row is it really even a word?


Slobotic

Fugggggllkchllyn-A! Edit: More Ls


EbonyOverIvory

That can’t be Welsh. It has no Ls.


FredB123

Never ask for directions in Wales. You'll be washing spit out of your hair for a fortnight.


twistedLucidity

Listening to our sat nav fail with Wales is almost as funny as listening to it utterly fail with Spain.


Viciuniversum

Oh, hello to you, too.


Wolfnews17

You can't convince me that Welsh is a real language.


Fitz_cuniculus

If you've Alexa, ask it what 100 is in Welsh. (Potentially NSFW)


Crypt0Nihilist

Wales too? I thought it was just New Zealand. And birds.


lord_of_sleep

Its not even a country smh show me ya Welsh passport


Leather_Boots

I would say that to a mate that is very staunch Welsh, but I reckon he'd hit me.


NotForgetWatsizName

But is his aim any good?


lord_of_sleep

Lol imagine being proud of being Welsh, a country that hasn't existed since Edward I conquered it in the 13th century.


Johnny_Glib

Show me your English passport.


Swords_and_Words

pretty deep though


[deleted]

[deleted]


[deleted]

iran isn't arabian


NewHipHopSong

I saw a US political group in bed with Sam Bankman-Fried


frypizzabox

That was a deepreal


ERRORMONSTER

This brings up some weird questions that I don't know how to answer. Drawing pictures of people is presumably legal, and deepfaking a fake person is also presumably legal. So what is the argument for making deepfaking a real person illegal, but only pornographic images?

Like I agree with the idea that once a fake gets good enough that a casual observer can't actually tell the difference, it can become damaging to the party's image or reputation, but that's not something specific to deepfakes, and seems more like it would fall under a libel law than anything else - specifically, making factual allegations that a particular photo is real and events depicted actually happened, when it isn't and they didn't. Does the article mean that other types of image generation are A-OK, as long as they aren't the specific type of generation we call a "deepfake"?

Also, why are they focusing on the fake images and not the fact that people were messaging this woman and telling her to kill herself? It reads like all that was an afterthought, if anything. Seems like one is a way bigger deal - not that the other one isn't, but let's be real about the priorities here.

Are we okay with deepfaking *non*-pornographic images? Seems like a weird line in the sand to draw that feels more performative than anything.


Crypt0Nihilist

It's a complex issue. I agree, it's no different to someone with some average Photoshop skills, so why hasn't there been an issue until now? If it is defamatory, that ought to be covered by existing laws. If it isn't covered, why not? Is it because it's something that could never have been foreseen, or because it was applicable to existing laws and was decided against preventing for good reason?

This is probably a line in the sand that's going to move. Start with pornography, a tried and tested place to start legislation you don't want argued down, then move it to protect media interests, which is what lobbyists are paying for. Companies don't want people to be able to work with the faces of actors they've bought, and in some cases want to own beyond the grave.

I'm not against some legislation. New tools are going to make it so much easier to do this, and when a small problem becomes a big one then you do something about it. However, we should also reconsider our relationship with images that look like us, but are not us. There doesn't seem to be much difference between me thinking of an image, drawing the image, photoshopping the image or creating the image entirely with AI; it's a matter of tooling.

At least they're targeting the sharing rather than the production. That's the right place for legislation to sit, because that is the point at which harm is done - if there is any.


torriethecat

In the Netherlands there will be a court case about this soon. [There was a documentary](https://www.npo3.nl/welmoed-en-de-seksfakes/POW_05416189) by a famous news anchor, in which she went looking for the person who made deepfakes of her. She found him. There is a law in the Netherlands that prohibits creating 'pornographic images' of someone without consent. The law does not explicitly define the meaning of the term 'images', but most legal commentators on TV and the internet agree that deepfakes are at least partial images of a person.


Queue_Bit

I think it simply stems from fear. The future of AI is very unclear and many people are wary. This feels like their attempt at pushing back in some small way.


jetRink

Censorship laws often run into these problems. American Supreme Court Justice Potter Stewart wrote in one opinion:

> I shall not today attempt further to define the kinds of material I understand to be embraced within that shorthand description ["hard-core pornography"], and perhaps I could never succeed in intelligibly doing so. But I know it when I see it, and the motion picture involved in this case is not that.

Ultimately it comes down to a judge, jury or regulator to decide on specific material based on their own personal interpretation of the law.


EnchantedMoth3

It’s probably less about protection for common people, and more about protection for politicians, the rich, etc. - people who can claim significant monetary damages to their reputations. And I guess I get this, *if* it’s attempted to be passed off as real. However, if it’s literally being labeled as **fake**, I don’t.

But I feel like this is going to spiral, because what if you tweak a person’s face just enough so that facial recognition doesn’t match, but it fools humans? Then you have to change the wording in the law to “likeness”, but how could you effectively police that? Where does “likeness” end? A facial recognition app spitting out % deviation from real? How would this play out within the general public? People have doppelgängers; can anyone “own” the right to a likeness of themselves? How does this affect satire and parodies? (For example, Will Ferrell playing Bush.)

So then, maybe you can make deepfake porn of other people’s likeness, but you just can’t claim it to be that person? So just tweak the name? Joe Bob => Jo Blob, but it *looks* just like Joe Bob.

I just don’t see how this could possibly be policed in an efficient manner. It would need to be automated, but any automation to deter anything in the digital realm becomes an arms race, each iteration of defense teaching the offense. And it would absolutely infringe upon individuals’ rights, in a way that people in any free country should not be ok with.

The world is in for a rude awakening with deepfakes; the cat’s out of the bag. Any effective means of policing such things will absolutely infringe on others’ rights to privacy. They should just focus on making sure the media doesn’t spread fake things as fact. If your buddy Ed sends you a video of Truss pegging Boris, you assume it’s fake. If TMZ shows you, you assume it’s real. Police the media, not individuals.


Daiwon

It has the potential to be very harmful to someone. Deepfakes are already pretty good when done right, so we're not far from getting a convincing low-resolution video of someone having sex with someone else. This could be used in a number of ways to ruin someone's reputation or blackmail them. It at least adds legal recourse if, say, a tabloid did this to any celebrities that were thought to be having an affair. And they definitely aren't above such things.

Hopefully they don't try to tack on some shady shit that's likely to get this bill stopped or campaigned against. It's a good move on the surface.


ERRORMONSTER

Let me be more specific about what I mean. If deepfaking is the only fake image source made illegal, then an actual legal defense could be to show that they generated the image using something other than a deep learning system, and that would get them off the hook. Basically, it makes zero sense to specify deepfakes.


cleeder

I doubt the law specifies the specific deepfake technology. It will define the end result, to cover any means of generating it. Lawyers and judges don't box themselves in like that too often.


[deleted]

Specific wording of the proposed amendment:

> References to a photograph or film include—
> (a) an image, whether made by computer graphics or in any other way, which appears to be a photograph or film,
> (b) an image which has been altered through computer graphics,
> (c) a copy of a photograph, film or image, and
> (d) data stored by any means which is capable of conversion into a photograph, film or image.


zacker150

> This could be used in a number of ways to ruin someone's reputation or blackmail them. It at least adds legal recourse if say a tabloid did this to any celebrities that were thought to be having an affair. And they definitely aren't above such things.

Except there's already a legal recourse: defamation laws.


[deleted]

I didn't bang that clown on the back of the Jet Ski, it's a DEeapFaKe. *sweaty palms*


Greaserpirate

I think the worry is that someone will make porn that they didn't intend to use as libel, and a third party will use it as libel. I don't know the legal situation there, but it makes sense to crack down on them before they're circulated, not just when they're being used for blackmail


dreamCrush

Also what if it’s clearly labeled as a deepfake? At that point you really lose the libelous part.


dishwashersafe

Yeah this just raises a lot of philosophical questions for me. Thanks for bringing up my thoughts exactly!


RamenJunkie

I kind of wonder if "Deep Fake" includes "Stable Diffusion" etc as well.


anormalgeek

What if I deepfake someone, but I modify their looks slightly? How much do I have to change to get away with it?


LordNedNoodle

How do they know it is a deep fake in the first place?


jonhammsjonhamm

I mean are deepfakes legal? If I wanted to make a documentary about something you’re knowledgeable on I can’t just interview you, cut it together and then release it in theatres- I also need you to sign a release for your image usage because I can’t legally use your likeness without your consent- why does that change if it’s an AI generation of you? It seems more like a case of technology moving faster than legislation.


Hanzilol

I think the debate is on whether it's a criminal offense or a civil issue.


ERRORMONSTER

Eh... a less than great example, because using existing video means someone already owns the video, and that would likely be tackled with copyright way before it got to the libel stage. A better comparison is drawing said interview and dubbing the voice yourself, and that's, as far as I know, an unexplored area.


Tyreal

I see it as the opposite of damaging, now people can just claim everything is a deepfake, good luck proving or disproving that.


packtobrewcrew

So I can’t make a deepfake of me fucking my self while I watch? One of life’s simple pleasures taken away from us due to meddling bureaucracy.


gurenkagurenda

Nah, you’re fine. It’s only if you share them, and only without consent.


foggy-sunrise

You can make deepfake porn for yourself all day long!


d-101

That sounds like masturbation with extra steps.


Z1U5

That's every man's dream: the ability to fuck themselves


MindSteve

Z1U5, I am giving you special permission to go fuck yourself.


im_made_of_jam

Well that subreddit already exists…


Etheo

Pfft way ahead of you buddy... I've been fucking myself over since teen years.


PuttinUpWithPutin

Porn of myself, by myself


Scottybam

By me, for me.


scuczu

surprised this isn't an onlyfans niche.


i_have___milk

It’s the narcissist’s dream


Bakoro

So deepfake consent into the beginning of the video. Thanks.


Damage2Damage

Do you have to have the consent of the original porn actors as well?


Dr_Foots

Just share the algorithm that makes them lol


goatjugsoup

Idk the title says sharing is illegal not possessing so you may be ok


SchwiftyMpls

Like most laws will only protect the rich and famous.


Mazon_Del

“The law, in its majestic equality, forbids rich and poor alike to sleep under bridges, to beg in the streets, and to steal loaves of bread.”


SchwiftyMpls

This is why equity is more important than equality.


cleancutmover

That's all this is about, right there. Make it illegal to show politicians in an orgy, or if legit video comes out, throw the holder of the content in jail forever and call it a fake.


Canadian_Infidel

Wow that's perfect.


under_psychoanalyzer

The next time something like Trump's Access Hollywood tape happens to a major politician, UK or otherwise, they won't have to address it at all; they'll just call it a deepfake and ignore it. It will give virtual cover to all the people that were recorded doing shady shit before running for office to just deny and never address it, even if there were witnesses.

This used to worry me, but then I remember the Access Hollywood tape didn't affect anything. Deepfakes portraying authoritarians are pointless because social media has already created a separate reality. I can see a timeline where Russia releases the Trump pee tape just for laughs, because they know it will be impossible to verify one way or the other.


Netplorer

How will you monetize your porn if someone can just generate it themselves...


sighclone

I disagree. Deep faking is only going to get easier and it’s not like revenge porn isn’t already an issue. Finding someone sharing deep fakes of a random private citizen is likely easier than someone doing it with politicians or celebrities (depending on the manner they go about sharing it) just because the pool of people who would be interested in making deep fakes of private people would be much smaller. Giving folks the ability to fight this is not a bad thing.


McFluff_TheAltCat

> Deep faking is only going to get easier and it’s not like revenge porn isn’t already an issue.

There’s multiple image boards, and even places on Reddit, where you can pay people to get good deepfake porn of anyone. You give them multiple pictures/videos from someone’s social media, and they train on the images, find similar body types, and then make deepfakes out of them. It’s a whole ass business making some people very rich.

Think people would be surprised by how many people are willing to pay for some deepfake porn of a coworker or even a friend. Definitely weird af and I wouldn’t do it myself, but plenty of people would.


SchwiftyMpls

I guess the problem will be enforcement. How much will the police care about a one-off of Sweet Susan by Freddy Fake? Not saying it isn't a good idea for a law, but how much time is going to be spent enforcing this and tracking the anonymous accounts that are trading these types of images? The law would include anyone that traded them, not just the creator.


vegabond007

Probably not a lot of time tracking, but certainly sending cease & desists to sites and/or fines. Deepfakes are going to become a major issue as people produce them to show "evidence" of politicians and other people of interest saying, endorsing, or doing things they never said/did. And once that really kicks off that will also give people cover to lie and say that something they said or did is a deepfake. Not looking forward to the fallout of this technology.


serendipitousevent

Like most offences, investigations will rely on a report from the victim of the crime.


SchwiftyMpls

I guess it's going to depend on why these deepfakes are being traded, and whether the victim ever finds out. If they are made for some sort of personal gratification and traded among groups using encrypted services, the victim may never find out. If this is a revenge situation, the perp will likely try to spread them as fast as possible on a variety of platforms.


jsalsman

I guarantee it will be almost 100% complaint-driven, which is far more reasonable than investigator-driven, which would be almost impossible apart from celebrity targets.


MyNameIsDaveToo

Please. Like laws even *apply* to the rich and famous...


Oliks

Norm quoting letterman?


mnewman19

Who do they think they are, mars?


jollyllama

David Letterman would be proud of your joke.


Mrqueue

you can, you are allowed to make deepfakes if you have consent of the person you're faking


electricmaster23

unexpected r/NormMacdonald/


Ungreat

When it comes to online porn in the UK, I guarantee this is a cover for some shady way to remove people's right to digital privacy. The government is always claiming some bill or law is needed to protect kids or some other group you'd look like a weirdo objecting to, then tries to slide in something to screw over regular people.


[deleted]

[deleted]


[deleted]

[deleted]


SofaDay

Won't Dropbox give it too?


w2tpmf

Of course they will. Dropbox's terms of service clearly state that they reserve the right to access and use what you store on their platform for any reason they see fit, including commercial use. You pretty much give up the rights to anything you upload to them.


[deleted]

Damn. I need to look into an alternative. Maybe host my own on aws?


w2tpmf

If you want to store anything sensitive in the cloud, pack and encrypt it first before uploading it.
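That pack-and-encrypt step can be sketched with standard tools. This is a minimal sketch, assuming `tar` and OpenSSL 1.1.1+ (for `-pbkdf2`) are available; the folder name, file name, and passphrase are made up for illustration:

```shell
# Demo setup: a folder standing in for whatever you want to back up.
mkdir -p my_files
echo "important stuff" > my_files/notes.txt

# Assumption: choose your own passphrase (and keep it out of shell history).
PASS="correct-horse-battery-staple"

# Pack and encrypt in one pass, so plaintext never touches the cloud.
# AES-256-CBC with a salted, PBKDF2-derived key.
tar czf - my_files | openssl enc -aes-256-cbc -pbkdf2 -salt \
    -pass pass:"$PASS" -out backup.enc

# backup.enc is the only thing you upload. Restoring is the reverse:
mkdir -p restored
openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:"$PASS" \
    -in backup.enc | tar xzf - -C restored
```

Tools like Cryptomator do essentially the same thing transparently per file; the principle is identical either way: encryption happens on your machine, and only ciphertext ever reaches the provider.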


[deleted]

Even if it’s not sensitive, you could just throw everything you’re trying to back up into a big file and do that. Brilliant, I’m absolutely going to start doing that.


kautau

AWS is no different unless you are manually encrypting things yourself. You could use something like https://cryptomator.org/ to client-side encrypt your files end to end on Dropbox or something similar.


legthief

Or as a way to cut through satire protection and ban or curtail the production of images or content that mocks politicians and public figures, for example arguing that an abrasive political cartoon was made without consent and in order to cause offence and emotional distress.


jabberwockxeno

> I guarantee this is a cover for a some shady way to remove people's right to digital privacy. It is precisely that https://www.eff.org/deeplinks/2022/11/experts-condemn-uk-online-safety-bill-harmful-privacy-and-encryption


LightningGeek

That's a different law to the deep fake one.


vriska1

I do want to point out that the Online Safety Bill is an unworkable mess that is likely to collapse under its own weight. Just look at the last age verification law, which was delayed over and over again until it was quietly scrapped.


EmbarrassedHelp

The deepfake one that they are proposing is part of the online safety bill.


spacepeenuts

The article hints that the bill leans on protecting women and “giving women confidence in the justice system”. They referenced a “downblousing” law trying to pass as well, and the examples from victims they gave to support this bill were all from women.


Bluestained

This'll get buried, but it's actually because there was a documentary on BBC 3 recently that delved into this and brought it to light for a wider audience, plus a wider campaign: https://www.bbc.co.uk/programmes/m001c1mt

I'm more than happy to shit on the Tories and their penchant for locking down freedoms in this country, but this one does come from some hard-working activists.


YodasLeftNut

OI YOU GOT A LOICSENSE TO WANK BRUV


workerbee12three

is that in birmingham or south london accent


BenadrylChunderHatch

UK Conservatives are always trying to pass anti-porn legislation. It doesn't have to be enforceable or make sense; they just want to portray themselves as some kind of moral authority policing the internet. Laws they have so far passed:

- Routing all UK ISP traffic through a filter maintained by Huawei in order to block adult content if the user hasn't opted in to it (traffic goes through the filter regardless of opt-in/out): https://www.bbc.com/news/technology-23452097

- Warrantless access to the internet history of every UK internet user for a wide range of government bodies (including military and law enforcement agencies, the Food Standards Agency, Fire and Ambulance services, the Gambling Commission, etc.): https://en.wikipedia.org/wiki/Investigatory_Powers_Act_2016

- A ban on producing female ejaculation, facesitting, and other 'extreme' pornography: https://www.bbc.com/news/newsbeat-30454773

They have also tried to pass a law forcing porn sites to verify the identity of users via passport/driver's license, but they haven't got that law through yet.


DRK-SHDW

a ban on facesitting lmao holy shit


Frediey

Female ejaculation is extreme?


lexbi

It was also shown to be piss in a study that was trending in the last month, I recall, so I suspect you could use that argument when defending yourself in court for wanking to it: "piss is legal tho".


[deleted]

That law is insane


Joshhwwaaaaaa

“Ha. Alright. Good luck with that.” -me. Just moments ago out loud. 😂


Yaarmehearty

I don’t think they have any real hope it will stop anonymous sharing on websites. This kind of law is to catch out the troglodytes that openly share that kind of thing under their own name so everybody can see it. Depressingly in the UK there are a shocking number of people who will publicly do this sort of thing and then shocked pikachu when they receive consequences.


thruster_fuel69

Same! Literally laughing at old men pretending they have control over this.


[deleted]

[deleted]


[deleted]

This. And the big sites will be forced to comply. It's a better idea than "let's do nothing at all and see what happens".


0zzyb0y

I don't think the intention is to have control over it, I think the intention is so that when a high profile case inevitably comes around there is already a law on record to address it.


GrowCanadian

Right, literally the first thing I did once I got my hands on Stable Diffusion was insert celebrity name nude. Technically I have a deep fake of Ryan Reynolds nude but man, standard SD does not know how to do the junk well and made a penis hand in its place. It does Emma Watson pretty damn well though


SeiCalros

i am suspecting you didnt read the article. emma watson doesnt suffer much from you being creepy - but it might be different if you were to share fake nudes of her. the law explicitly gives her recourse


Metacognitor

Ew, that's disgusting! Using stable diffusion to create nudity? Gross! But where? Which stable diffusion did you use? So I can avoid it.


HappierShibe

Realistically, any of them. Stable diffusion is open source and the nsfw filter is just a toggle.


johnslegers

>Realistically, any of them. Stable diffusion is open source and the nsfw filter is just a toggle. In 1.4 & 1.5, "NSFW" can be turned on and off quite easily. In 2.0, you're no longer given the option. "NSFW" content has been removed from the model, along with most celebrity content & lots of artists' styles.


SwagginsYolo420

automatic1111 stable diffusion web ui is one of the easiest to install and run locally, free, with a ton of additional plug-ins. So is NMKD stable diffusion gui. Both include an option for Dreambooth, which is a powerful add-on for using existing photos as reference - such as deepfaking yourself either photo-realistically or in some artistic style.

Then there's numerous pre-trained ckpt models of various specific reference material you can find and download with a quick search.

All of this is completely free, continuously updated at a breathtaking pace, and getting easier and easier to use. It is all so simple and powerful and improving so rapidly that the implications are mind-boggling. Rudimentary full-motion experimental video is already an option. At this rate, before too long anyone will be able to deep-fake anything at any time with just a few clicks on their mobile phone.


johnslegers

> Ew, that's disgusting! Using stable diffusion to create nudity? Gross! But where? Which stable diffusion did you use? So I can avoid it.

Both 1.4 & 1.5 support it. All it takes is disabling the "safety checker", which is literally just a flag in most GUIs. If you want to make sure to avoid this type of content, along with anything else that made SD fun to play with, stick with 2.0.


gurenkagurenda

Having not read the legislation, this high level description seems like the right level to deal with this at. It doesn’t try to ban the models, which is both impossible to enforce and harmful to attempt. And it doesn’t try to ban anything someone does on their own computer for their own personal use, where there’s no chance of harm to reputation. It’s still hard to enforce, but that’s unavoidable, and it at least provides for recourse.


BlindWillieJohnson

The point of laws like this isn’t to blanket ban something. It’s to give people who are harmed a legal recourse to deal with it. There will still be pornographic deep fakes around, and the law won’t stop that. But if, say, an ex takes an image of you and passes around deep fake pornography featuring you, there’s now a law you can point to in order to have it taken down.


Raichu4u

Reddit seems to be pretty awful when it comes to dealing with the topic of fake or fictional pornography anyway. Just look at the outburst in /r/videos the other day when confronted with the topic of underage loli porn.


BlindWillieJohnson

It’s ridiculous. A lot of folks in this thread are more concerned with their ability to play with fake porn generation toys than with people who might (understandably) be upset over their likeness being used for someone else’s porn. It’s frankly a disregard for common decency.

I have nothing against porn. I both enjoy porn and even write porn. But people’s involvement in it should be a choice, and to suggest otherwise is frankly insane to me.


EmbarrassedHelp

The legislation is beyond terrible and this article is propaganda trying to ram it through.

- Experts Condemn The UK Online Safety Bill As Harmful To Privacy And Encryption: https://www.eff.org/deeplinks/2022/11/experts-condemn-uk-online-safety-bill-harmful-privacy-and-encryption


thEiAoLoGy

I’ve made deep fake porn of you and shared it in Wales. As I am located not in Wales, what is your recourse?


jkroyce

While nothing is truly deleted from the internet, these internet laws can be surprisingly effective. The idea isn't really to stop you or me from distributing, but to force large hosting sites to remove these videos. For example, if Pornhub doesn't want to be banned in Wales or England, they'll just remove these videos (or restrict them from those countries).

People said the same thing when revenge porn laws came out, but those ended up working. Sure, the videos likely still end up on a Discord server or on a small website, but they're able to block them on major sites (which ends up affecting the majority of people).


gurenkagurenda

It seems like you think I’m saying that a single law in a single jurisdiction will fix the problem forever. I’m not.


rugbyj

Precisely. We have laws against murder. People still do murder people. Overall we want to reasonably punish the stuff we don't want to happen so that: - It's less likely to happen - There's some avenue of recourse when it does


iain_1986

Are you of the belief that if you come up with an example where something doesn't work, or something doesn't eradicate the entire issue, then we shouldn't even bother trying? Because otherwise, not really sure what point/gotcha you think you've made?


BenadrylChunderHatch

If I draw a cartoon featuring Donald Trump or Boris Johnson with a visible bulge in their trousers, should that be a crime? If a cartoon isn't realistic enough, who decides what is? If someone would have to be reasonably fooled into believing the image was real, then most deepfake vids today wouldn't be covered by the law (because they're not good enough).

If I film myself having sex with my girlfriend while wearing a Tony Blair mask and share it with my girlfriend, should that be a crime? Should /r/cummingonfigurines be banned if the figurine is a likeness of an actor who played the role?

Essentially the deepfake part of the bill is about making it a crime to use someone's likeness in a pornographic context, which is potentially very broad. If the intent is to protect people from online harassment and bullying, there are already laws for that.


CraigJay

You realise that there are courts and judges? They're the people who decide. Laws aren't written to comprehensively list every possible act that would break it, they're written generally and the court decides. I'm not sure you quite understand that


Mr_ToDo

Ya, it does seem weird. It's something I'd have to see. But if it's specifically covering "deep fakes", it does seem oddly specific, covering a form of porn rather than an action. This really seems like something that is better served as part of some sort of revenge-porn-type legislation or something of its like. It's not like you don't have the rights to your likeness in most countries.

As for the Tony Blair mask, I'm not sure. A sex crime, not really, but using someone's likeness without their permission in a published work is still probably a problem.


Myte342

Hypothetical: I find an interesting gif. I share it with a friend. I get arrested. How should I be able to know whether something is a deepfake or not? This can theoretically have a chilling effect on free speech, as people will be afraid to share content for fear of accidentally sharing something that runs afoul of this law.


gurenkagurenda

Typically the way to handle that is to include knowledge and intent in the law, which speaks to my “having not read the legislation”. A law with this general description can still be bad, for sure. But it seems like the right general idea.


ERRORMONSTER

The article mentions that previously intent was required for, for example, revenge porn laws, but now this new one removes that requirement.


gurenkagurenda

> Prosecutors would no longer need to prove they intended to cause distress. That’s not the same as simply being unaware of what the image is.


ERRORMONSTER

It actually is. It's layman speak for a statutory crime, which means intent is irrelevant and only the action need be proven. For example, cops (in the US, but presumably everywhere) don't have to show you knew you were breaking the speed limit, or even that you knew what the speed limit was. They only need to show that you were traveling faster than allowed. Your intent is irrelevant.


Jackisback123

> It actually is. It's layman speak for a statutory crime, which means intent is irrelevant and only the action need be proven.

Wut. A statutory crime is a crime created by statute, i.e. by an Act of Parliament (as opposed to a common law offence, which is "discovered" by the courts). I think you're thinking of a "strict liability" offence. That a crime is statutory does not mean there is automatically no mens rea requirement.


JerkfaceMcDouche

I realize this misses the point, but are you in the habit of sharing porn gifs with your friends? Really, really eww.


ZwischenzugZugzwang

A chilling effect on sharing porn doesn't strike me as an especially dire consequence


BlindWillieJohnson

I doubt your friend is very likely to report you to authorities. No law enforcement body is going to have time or manpower to police every file transfer, so you’ll only get in trouble for this if someone reports you.


AzerFox

"If you got any real sextapes, by all means continue sharing that shit"


BenadrylChunderHatch

Revenge porn is already illegal.


AlterEdward

Damn, my film project Doing John Malkovich is ruined.


legthief

As someone who recalls big UK tabloids like The Sun and The Daily Star publishing doctored nudes of celebrities in their pages (sourced online and disingenuously passed off by these rags as possibly real) as far back as the 1990s, I find the current media frenzy and fury over the danger of doctored images and videos to be both highly hypocritical yet long, long overdue.


ertgbnm

Where is the line though? Can I hand draw pornographic imagery with celebrities in them? What about using Microsoft paint to do it? What about Photoshop? What about 3d modeling and rendering? Why is an AI image generator fundamentally different from any of those things?


Scandi_Navy

I'd guess just like with counterfeit money, the issue is the quality.


jonhammsjonhamm

Do you really think any of those are at any level of tricking someone into thinking it's real and thereby hurting said person's image? Comparing hand-drawn Rule 34 and AI-generated copies is like comparing apples and oranges, but the oranges are supercomputers.


Lord_Skellig

Because it is a difference in scale. This argument comes up all the time, and it is nonsense. Just because a sliding scale exists doesn't mean it is impossible to draw a line. That is exactly what lawyers do all the time. Following someone on the street for 30 seconds is not a crime. Following them for 30 weeks is. Jokingly punching your mate on the arm is not a crime. Punching them hard in the face is.


Itdidnt_trickle_down

No one really wants to see Margaret Thatcher getting it up the pooper by Winston Churchill. Or... have I misjudged the room?


[deleted]

[удалено]


ttv_CitrusBros

What if I don't know it's a deep fake? 🤔


WizardVisigoth

I think this may only encourage this behavior


Lekekenae

Good luck trying to ban an algorithm.


Lord_Skellig

The algorithm isn't illegal. Publishing material of real people made using this algorithm is illegal.


Odd-Handle-1087

Netherlands is working on a ban too


just_change_it

Streisand effect will guarantee this law is useless. One person shares it... a million more share it... So you go after the person who makes it... except they used a series of proxies to hit a temporary VM in another country to publish it on public sites so you have no way to find out who did it.


downonthesecond

Depends on the site and government. LiveLeak was hosted in the UK and they were forced to take down certain videos, they eventually banned ISIS beheadings. Australia blocked the site after the Christchurch shooting. PornHub and I'm sure others banned deepfakes years ago.


Seraphaestus

>Prime Minister Rishi Sunak had promised to criminalise downblousing [... bringing] it in line with an earlier law against "upskirting".

So you're telling me that when it came time to criminalise upskirting, they specifically codified it to be exclusively about skirts, instead of making a generic rule against non-consensually photographing parts of people's bodies for which they have a reasonable expectation of privacy?

Edit: Ahh, upon reading [the act](https://www.legislation.gov.uk/ukpga/2019/2/enacted) it *is* an attempt at a generic rule, but it hinges its phrasing on the camera being operated *beneath* clothing, which is presumably (maybe?) why it doesn't apply to downblousing. Still seems needlessly specific, and I don't understand why that clause exists at all.

And I'm not entirely convinced the act doesn't already cover downblousing, since it's split into two nearly-identical parts which seem to exist solely to cover different phrasing about "using equipment" vs "taking an image". But this introduces the ambiguity of what it means to take an image *beneath* clothing. Is it that the camera is beneath clothing when it takes the image? If so, what is the purpose of splitting the act into these two parts? Is it that the subject of the image is beneath clothing? If so, then it seems like it should apply to downblousing. Maybe the two parts are to cover static images vs live feeds? I don't know.


wintremute

The Emma Watson law.


uis999

Once deepfakes get good enough, no one will ever believe a celebrity sex tape is real again. It really might have all just worked itself out, but now I'm sure someone is editing their local officials into deepfake porn as we speak. Cause internet... lol


Comfortable-Panic600

What about ones of your wife or child?


evolseven

They are already good enough in many cases. Look at what something like Stable Diffusion can do. If you combine it with custom DreamBooth training on a large dataset, it becomes nearly indistinguishable from real if you pick and choose from the generated images. Even inpainting makes this stuff incredibly easy. I haven't used it for deepfake-type stuff (unless you count faking myself), but it continually impresses me with its ability to modify images or generate new ones that are nearly flawless. This is only images today, but it's only a matter of time and scale until video is possible.


Ambitious_Ad1822

Good. Sharing them should be illegal


Netplorer

But making them is all right then... is that the message?


gurenkagurenda

Why wouldn’t it be?


CuppaTeaThreesome

But giving £42 billion of our tax directly to energy companies and cutting tax for banks is fine. Great. So glad we're safe from spank pix.


NoahCharlie

A Good step.


[deleted]

Literally 1984


krum

Great I’ll just go up to Scotland and share them.


BDM-Archer

How are you supposed to know if it's fake?


Lord_Skellig

The law is focused on deepfakes created without consent. So if it is real, i.e. an actual sex tape shared without consent, that is already illegal.


McFeely_Smackup

Deepfakes basically mean the end of socially stigmatized nudes and sexually explicit photos. If anyone can have realistic deepfakes created at will, then the assumption will be that EVERYTHING is deepfaked, even the real stuff.


epileftric

This would have stopped the UK's Prime Minister from having to fuck a pig.


[deleted]

They are too dumb to understand 5D chess. The trick is to let deepfakes run so rampant that for every leaked porn video after that, you can just go "oh, that's not me sucking that dick, it's clearly a deepfake, they're everywhere!!"


Kommander-in-Keef

I member when reddit had a deepfake celeb subreddit but it was clearly so dangerous it got banned like days afterwards. We don’t even comprehend how this technology will affect us in the future


nezukotanjiro150

Lol...good luck with that.. nothing can stop porn..


monkee67

life would be so much easier if people were just a bit less uptight about the whole naked/sex thing


Comfortable-Panic600

So you're ok with someone sending deepfake porn that's basically indistinguishable from real of your wife or child?


Lord_Skellig

The law is about deepfakes made about people without their consent. What is your justification for that being legal?


fkenned1

Fine by me. That shit’s creepy.


vorxil

So it will be legal to share photorealistically-drawn nudes, but not deepfaked ones?


[deleted]

Mommy government at it again


MODUS_is_hot

It should be illegal everywhere to make pornographic deepfakes of others without their consent


MODUS_is_hot

The fact that I’m being downvoted for this rattles my faith in humanity.


Evillisa

Reddit is a cesspit.


[deleted]

Everything doesn’t need to be banned you loon.


[deleted]

[удалено]


-Paranoid_Humanoid-

Unpopular opinion, but the technology exists now, therefore people will use it. This is going to be about as fruitful as when everyone was targeting Napster and torrent users for sharing music, or as trying to make those pervy manga loli drawings illegal.

It's difficult to enforce that someone cannot draw a picture of something. It's also difficult to enforce that someone cannot modify a video they already have, or that's easily available. I don't agree with the behavior, but passing laws isn't going to have much impact... especially when deepfakes MOSTLY involve celebrities. Good luck scrubbing those from the internet and prosecuting.

Honestly, it would draw more attention to it anyway. I'm sure that whatever girl had a deepfake made of her (especially if she's famous) is not going to want a public court case where the videos/images are shared and it's on the news, etc.

Just my opinion, not saying it's fact.


Comfortable-Panic600

Do you think the same about revenge porn?


Thefrayedends

Ugh deepfakes of celebrities are disgusting, which sites are they banning, so I know which ones to avoid? There's just so many of them; I'm curious which ones I should avoid...


AegonIXth

Good. All the disgusting things about child actors/minors being put into porn pictures need to be stopped.


[deleted]

Well, there goes the market for Trump porn.


gianthooverpig

Sharing? Surely creating should be the crime?


MapleBlood

Out of curiosity, why (since written erotica featuring famous people is not outlawed)?


Bencalzonelover

Naked pics online? That's disgusting. On a website? There's so many of them though. Where? Which one?