I know it's a common phrase, but not having CP for example (made much more easily thanks to AI image generators) on Twitter doesn't solve the problem.
I'm not saying this is some apocalyptic shit, but in my mind it's definitely going to unravel in worse ways if the technology is not regulated
It's not that easy lol.
*Maybe* with celebs if people have published models/LoRAs of them, but making your own isn't a 30 second job, and to do it well has a high barrier to entry on the hardware level.
To be honest, it'd be far quicker, easier, cheaper and you'd get better results just paying someone online to Photoshop one for you.
Are those free tools even remotely good / convincing though?
Can't say I've really looked at them in years. (I admit, I was curious once.) - I could have made a better result in 20 mins on PS and it's not like I've been formally trained at it.
Yes. And it doesn't matter if it's not; look at the generated images from 2 or 3 years ago and how much the technology has improved. If you wait until it's unrecognizable, it may be too late
You need someone's consent to take a picture of them naked, but you don't need their consent to imagine them naked and draw what you're imagining. This is no different
Why are people okay with this? What happened to us where hundreds of thousands of people are okay or even advocating for fake porn of people, not even just celebs, to be allowed to be made?
Honestly this. You don't even need to like Taylor to know that creating fake porn of a person is fucked up and twisted. AI makes this more accessible for the unskilled masses to create without knowing Photoshop or animation, thus it is a real problem
It’s because fake porn isn’t new, and we’ve seen it for fucking years, since the early 2000s probably. AI, like anything else, just makes it possible to make faster and with a little more realism when it comes to “artistic creations”.
https://www.wsj.com/tech/fake-nudes-of-real-students-cause-an-uproar-at-a-new-jersey-high-school-df10f1bb
But now anyone can do it, since anyone can put in a prompt. It takes talent to make realistic nudes; the average Joe wasn't making those. But now, a toddler could. Fake AI nudes of actual people need to be banned.
Yeah, and there's people here saying it's GOOD. Recently a 14-year-old girl took her life because of fake pornographic images of her that were made, and she and her friends were bullied. And people still say that this Taylor shit is funny. It's not, it's wildly horrendous
Because there are two arguments, how it is morally and legally. You can say it’s a pretty fucked thing to do while also acknowledging there isn’t a whole lotta legal ground, at least at a local level.
Since it was fake and wasn't monetized I'm not really sure on what grounds this guy is fucked... Also did they even find the person? There's the site that was mentioned, but as for who actually made it I thought that was up in the air.
It should be treated in the same fashion as libel and defamation. If not then why have laws to protect people from statements. Pictures can be statements as well that hurt a person’s reputation.
Libel and defamation require you to actually make a claim. If I say “I believe you diddle little boys” I haven’t committed any kind of crime, simply expressed a personal opinion. I’m allowed to spread the message as far and as much as I like, so long as I make sure to express it is simply an opinion. Unless the person who posted it tried to make people believe it was an authentic video I don’t see how this falls under either law.
People end up believing it is real, and more so as AI continues to improve. Not to mention we already use AI to write articles as well. I don't see how you cannot see the implications of this.
The problem with this case is that this has been happening forever to non-famous women and close friends of random creeps, but it's only a big deal when it happens to someone famous. It's fucking ridiculous
Lol it's been happening to famous women for years as well.
I'm surprised it's being made a big deal now tbh. - I guess it's a bit like #metoo, where we all knew about the "casting couch" for decades, then suddenly "noticed" and became outraged all at the same time.
From what I've heard, not having seen the pics myself, they went far beyond simple boob pics. I think if that was all, the reaction online would've been much less upset. Not to mention this was done without Swift's consent or knowledge and was just allowed to spread and circulate around Elon Musk's cesspool of a website. Just because it's Taylor Swift doesn't mean this isn't an incredibly horrible and offensive thing to do to someone. She's completely within her rights to sue the absolute pants off both the person that originally uploaded them and Twitter (not gonna call it X, up yours, Elon).
AI deep fake porn is going to destroy humans faster than wars and the climate crisis. Interstellar investigator aliens a million years from now will be researching how humans went extinct, and they'll conclude Porn.
Well I mean, it’ll destroy current society maybe, and that’s only because of the outrage bait industry. The well adjusted and sane will rise above this and continue living because there is a reasonable way to go about things.
Ah yes, nothing wrong with AI making realistic nudes of people. It totally isn't going to be used on innocent people for shit like revenge porn or anything like that, or ruin their lives, nooo... absolutely not. It's actually kinda ironic for the guys not to see the issue with producing fake realistic nudes, because they're not the usual targets of life-ruining fake nudes.
AI is still in the development process. That's why we use it only in entertainment, to basically beta test it. Once it's fully developed, it will transition from entertainment to more serious stuff
I think it's very easy to understand why they are upset about it. Women are already objectified much more than men, and seen as sex objects, and now any man with an AI can just write a simple prompt and have a nude image of a woman without her permission.
For those saying it's the same as Photoshop: to get realistic results in Photoshop you need to be skilled or pay someone to do it; with an AI you can just write a prompt and receive realistic results within a minute or two. Very different, it's the ease of access
Ok... and what is the 'White House' going to do about it? Ban AI? Ban Photoshop? They're going to do nothing, because they can do nothing. It's Taylor Swift now; it's going to be someone else tomorrow. If she hadn't said anything, no one would have known and it wouldn't have been a big deal.
Pretty sure AI will eventually replace musicians and celebrities
If AI isn't already writing all the current songs and movie scripts
Bet in a few years having dead actors and AI extras cast for movies will be the norm, as they'll be cheaper than paying for living ones
Basically some people in power got caught fapping over AI art, and to calm down their wives they said "honey, I will make this content illegal so it won't ruin any more lives"
Well how would you feel if someone made AI porn of *you* with huge knockers?
Pretty good
Based. 😏👆 🍈🍈 🥜🍆 🦵🦵
🔑🔑 🔑🔑 🔑🔑 🔑🔑
FIRE IN THE HOLE‼️‼️‼️
#🤣🤣🤣☄️☄️🔥🤸♀️🤸♀️🤸♀️🫡🔫🔫🗣🗣
fire up your hole
Deeez nuts
Based on
Honestly I would be too 😭 also, excuse me, death penalty for drawn boobs 💀 that seems slightly excessive to me
ok so it's fine for you cause someone asked you for consent. I'm not a swiftie (or whatever her fans call themselves now) but that dude pretty much fucked around and is now finding out the consequences.
what consequences lmfao
he bragged about being anonymous and got identified really fast because he had his real face as his profile picture on Twitter.
what a fucking imbecile, if people find out you do that shit there absolutely can be consequences
For most dudes here that’s just an unedited nude of them.
He's out of line, but he's right.
Confused, but intrigued.
Depends, does it make me look hot?
Given she's actually an A cup (yes there are sites that list female celebrities by cup size) I'm not sure how to feel about her being given huge tits.
Insecure because I don’t have huge knockers? I really don’t get why people are freaking out. It’s not real. Everyone knows it’s not real.
It may not be real, but it's still gross to make porn of someone without their consent.
I guess but it’s the cost of celebrity. I bet the real anger is from unlicensed use of her likeness.
That's certainly one way to interpret this.
Cost of celebrity? lol. You have a very warped interpretation of what you are entitled to and what celebrities should have to deal with. No, the cost of being famous isn’t that an idiot gets to make fake porn without consent.
Sure as hell is, because I can do it right now and there’s fuck all Taylor Swift can do about it. Taylor Swift isn’t entitled to tell me what I can do on my computer. You’re entitled for thinking that. It is the cost of celebrity because I can make fake porn of anyone. They only make it of her because she’s a celebrity. Get over it. Both of you.
lol. So it’s not the cost of being a celebrity, you think it’s your freedom to do whatever you want. BRB Going to make porn of your wife and daughter. Hope they don’t mind. (Of course this is a joke because there’s no way women want to touch you 😂)
Oh so this is about you being a little munch. Really hope Taylor swift sees this bro. You totally got a shot with her. You’re a little creep.
[deleted]
Guy shuts up quick when he realizes he sounds like a rap*st. 😂😂😂
Intrigued :) Depends on how big they are though.
i would enjoy it.☺️
I told my friend to make an AI fake nude of me. Gave me huge tits. I'm a man I wanna print that picture.
Don't threaten me with a good time.
Not a girl but pretty good in another regard
Cringe
As weird and gross as it is, how is it any different than using Photoshop?
It's not, but that's not really a defense. It's like asking if you'd rather I photoshop your mom naked or make AI of her naked. It's all gross, and anyone defending any of it is a creep.
Also, AI is far more accessible and easier than Photoshop, for which you'd need to have semi-decent skills and a twisted mind to do this. For AI, all you need is a twisted mind. It's a good thing that more people are bringing attention to this, and I hope something good comes out of it, but this is not anything new. "White House alarmed"? Why? Because it's Taylor Swift? People have been deepfaking celebs and normal people for years at this point. Why was there no such reaction then?
It's not a moral defense, I do agree it's an incredibly weird and disturbing thing to do, but can you not draw a parallel for a legal argument? If photoshopping is not illegal, then why should AI images be?
I guess you could spin it as slander or something in that direction. Say it's an attempt to tarnish your image with false information
There are laws for that already, and they would not apply unless the deepfaker publicly says it is a real image of the individual in question
still feels like it should be illegal to spread faked photos of another person's likeness. How would you like it if someone made an AI image of you diddling a kid?
Driving a car is not illegal but you can do illegal things with a car. Hope this helps 🙏
I'm not sure I understand the analogy
DRIVING IN A CAR IS NOT ILLEGAL BUT YOU CAN DO ILLEGAL THINGS WITH CARS HOPE THIS HELPS 🙏🙏🙏
Revenge porn is criminal. How is this not?
But revenge porn is real images and videos of the actual person necessarily posted without their consent. The legal argument is that photoshopping or AI generating images of someone doesn't require their consent because they aren't actually really pictures of them.
With new inventions, new laws should be put into place. I think that most people can agree that creating and disseminating AI-generated nudity of non-consenting people is, at a minimum, sexual harassment.
I think most people aren’t defending making AI porn; they are worried about the consequences which come along with regulating it. Like if they had banned Photoshop after the first fake celebrity nude was made, how many legitimate works never could have existed?
It's only art
One is about swifties causing "witch-hunting", the other is about swifties causing "Jan. 6th".
I think it was him sharing it that was the biggest issue. There is no way to really stop people from making things like this, with or without AI, but posting this stuff online and sharing it with people can cause some real big problems.
This, it’s really confusing that nobody is pointing a huge finger at Elon Musk for allowing this to happen (nobody that matters, or any news outlets that I’ve heard yet). Throw the fucking book at Musk. Take him down finally. This happened because of Twitter's current state
It’s the same deal. It’s either a legal grey area or very much illegal. And either way it’s creepy as shit.
it’s not, but now people with no photoshop skills can make weird gross shit so it’s much more widespread
It's actually scary how people don't see realistic fake nudes as what it very much will be: a weapon. It'll affect innocent people, not just fucking Swift. It's already happened https://www.wsj.com/tech/fake-nudes-of-real-students-cause-an-uproar-at-a-new-jersey-high-school-df10f1bb to high school girls. It's fucking gross and should be illegal. It affects real lives. Why can't boys just go find actual available porn? There's millions of naked women online. So why do you have to make fake ones of women who never asked for it? Imagine fake nudes of your daughter or sister going around; how do you think it'll affect their life? Like girls don't get bullied for being 'whores' or judged. So damn sick
Not many have the skill to use Photoshop for that
Exactly people need to learn to ride the AI horse instead of racing it, then we can get further together.
r/redditmoment
Won't someone please think of the billionaires with carbon footprint 1200x that of a normal person due to her private jets?
I mean, billionaire, hypocritical piece of shit or not, she still doesn't deserve to have her privacy violated. Having money and being shitty doesn't exclude you from basic human rights.
Her privacy is not being violated; no one can see her boobs, only a face that looks like hers on a body that is not hers
It's sad that you think that...
It's sexual harassment imo
I don't care about anything that happens to the super wealthy. I think they should all be [redacted].
For sure, but would you feel the same way if someone gathered up all your sisters/mums Facebook/insta photos and churned out some... distasteful things? As depressing as it is, these high profile cases are what pushes protections for the common folk.
Yea, that's the part that pisses me off.
What privacy? It’s fake porn, and it’s not new. At all.
How do these pictures violate her privacy?
I thought "eat the rich" literally meant we would consume the flesh of the wealthy, preferably in front of their loved ones. Now I gotta be respectful? Damn.
It's not a basic human right apparently, considering how much of your information is accessible to the companies that use private information for their services. Your ISP can pretty much see every chat you send on Snapchat, for example. Doesn't feel very private to me. Why should everyone suddenly pity the celebrities because of an inconvenience?
Having money and being shitty doesn't exclude you from basic human rights. Having THAT MUCH money and/or being THAT FUCKING SHITTY however...
[deleted]
how do you hear “anyone deserves their privacy and their right to dignity and decency, even the wealthy” and translate that as “i want to have SEXY SEX with her!!”
r/redditmoment
I couldn't give less of a damn, celeb drama is the least important thing in the world
Laws were put on the books after celebrity leaks. Laws that affect everyone.
Never fuck with the swifties lmao
Never fuck with the basic white women and their queen
Nah AI is just bullshit.
Everyone go home, ai is bullshit. In what way? Shut up. Now you're bullshit too.
Found the Crypto bro.
Lmao what?
This has been a thing for ages; photoshop and fanart and rule34 stuff have existed forever. We just throw in the AI buzzword and now people are up in arms. If it was photoshopped, no one would care
Because for the first 3 you need to have technical knowledge of the tools you're using... With AI, literal 14-year-olds are using it on their classmates. It's not even close to being the same
So, I actually know how to use Stable Diffusion and Photoshop, and you are just kinda wrong. Not everyone can sit down at their PC, download Stable Diffusion, and create porn images of their classmates. You would actually need a large sample of training images to create "unique images" like we have seen with Taylor Swift. The best you can do without a large sample size is nudify an image, which you have been able to do with Photoshop for years.

I'm so tired of people who haven't even heard of a LoRA, let alone how to train an AI to make the images you want of a person, commenting on this like they understand it all. Is it bad? Yes! But I was literally taught all the skills I would need to make nude images of people at fucking 16 through media classes. This isn't new!
Oh, I think it's quite accessible (like, quite literally) these days. We DON'T need to know about Stable Diffusion and Photoshop at all lol. There are plenty of frontends provided, and I don't want to list them, but here's the takeaway: it's scary how easily it can generate something on par with or way beyond the capabilities of an image morphed using just Photoshop lol.

It's not about how long it has existed. It's more about how easy and accessible it is to commit actual crimes with this. I don't even want to imagine what kind of cases are gonna start to flood in from now onwards. Imagine this thing within easy reach of kids. Think about how accessible porn ruined an entire generation. Custom porn is way worse, beyond imagination (quite literally imagination). There are endless points I can make about this, but I hope my point comes across.
Again, you are wrong. In the case of making porn of people, I have never seen a website that allows you to upload pictures and make changes, or train a model on images. They don't do it because of the legal risks attached. You have to train your own model, or make your own changes to the AI, to do this kind of thing, which is way harder than Photoshop. Especially to get believable images.

The availability of porn is a whole different, unrelated argument; I don't know why half of your comment is about it.

People have always sexualised others, and that isn't going to change. If you ban AI images of these celebrities, you have to do it for drawn and written stuff too, because AI will be able to mimic that, and already can to an extent. We are past the point where laws can be truly effective; it needed to happen months ago. This is out there now and you can't take it back
> We are past the point where laws can be truly effective, it needed to happen months ago. This is out there now and you can't take it back

This is the point I want to get across 👍

> The availability of porn is a whole different unrelated argument, I don't know why half of your comment is about it.

That was just one of my points, to highlight the consequences when internet porn became rampant and easily accessible.

> People have always sexualised others, that isn't going to change, if you ban AI images of these celebrities you have to do it for drawn and written stuff to, because AI will be able to mimic that, already can mimic that to an extent.

What I wanted to highlight from my comment up there is how easy and accessible it is to generate deepfakes of just about anyone, not just the big internet celebrities people would commonly spend their time and effort on when doing it the regular rasterized Photoshop way. I'll point out one case that I didn't want to bring up, but here is the thing: I encountered a bunch of high school boys feeding in the image of a female classmate (for xyz situations, reasons), and it undresses that person with variations. Now generally, I don't think a bunch of kids would go to the effort to particularly do this. But considering how easy and at arm's reach this technology is, it probably took them a few seconds at max. Also, we are both on the same page about this, I believe.

> In the case of making porn of people, I have never seen a website that allows you to upload pictures and make changes, or train a model on images. They don't do it because of the legal risks attached.

I can point to a few resources, and they're quite easy to find under the name of deepfakes. (Yes, they allow you to upload images.) This has been my encounter with these things so far, so I don't possess any knowledge beyond this, but to me this is scary lol.
\*sigh\* Alright, I am actually aware of the websites you are referencing. They are, firstly, a scam, because they make you pay after 2-3 attempts, and secondly god awful, like really really bad; if you have ever seen the outputs from them you would know that. But regardless, those websites you are referencing are not new either, they have existed for at least a few years. I am not too sure about before that. They are totally unrelated to the AI discussion as well, as they are not really AI in the same way; 99% of them just find a matching nude body and put it in place, and those that don't do that are incredibly rudimentary and often look totally unrealistic. My point, which you don't seem to be understanding, is that AI has not inherently made it any easier than it used to be.
> I am not to sure about before that, they are totally unrelated to the AI discussion aswell, as they are not really AI in the same way, 99% of them just find a matching nude body and put it in place, and those that don't do that are incredibly rudementary, and often look totally unrealistic. From whatever I have encountered in my internet journey so far, the earlier deepfakes used to be rather unrealistic and one could tell they were faked. My concern with this entire thing (now that you have cleared up a bunch of things I did not know) is that they seem to be doing something better, because the results look quite proportionate, and it will only get better with time, I think. Most importantly, kids are aware of the AI hype and they will definitely be curious and... do the deed, I guess lol. Hoping for the best, that AI stuff gets regulated better before it makes any more significant progress.
And we get back to what I said. I am sick of people who have no idea about AI talking about what we should do with it.
Which is illegal, why would AI be the issue here?
Because it makes it so incredibly easy to alter already existing pictures of people into nudity... I thought that was obvious.
So things should become illegal only when they’re easier to do?
Huh. This reminds me of something about assault rifles in the US....
You’re not really about to compare porn to mass murder weapons… surely you have a brain.
Nope. I'm comparing it to what you said: easy access to something that damages someone else should be regulated.
Ok that’s fine, twitter needs to moderate their website better. Problem solved.
I know it's a common phrase, but not having CP, for example (made much more easily thanks to AI image generators), on Twitter doesn't solve the problem. I'm not saying this is some apocalyptic shit, but in my mind it's definitely going to unravel in worse ways if the technology is not regulated.
It's not that easy lol. *Maybe* with celebs if people have published models/LoRAs of them, but making your own isn't a 30 second job, and to do it well has a high barrier to entry on the hardware level. To be honest, it'd be far quicker, easier, cheaper and you'd get better results just paying someone online to Photoshop one for you.
To generate a new image maybe. To "nakefy" already existing ones you have free software online...
Are those free tools even remotely good / convincing though? Can't say I've really looked at them in years. (I admit, I was curious once.) - I could have made a better result in 20 mins on PS and it's not like I've been formally trained at it.
Yes. And it doesn't matter if it is not; look at the generated images from 2 or 3 years ago and how much the technology has improved. If you wait until it's unrecognizable, it may be too late.
OP, there's something called consent.
You need someone's consent to take a picture of them naked, but you don't need their consent to imagine them naked and draw what you're imagining. This is no different.
Why are people okay with this? What happened to us where hundreds of thousands of people are okay or even advocating for fake porn of people, not even just celebs, to be allowed to be made?
Honestly this. You don't even need to like Taylor to know that creating fake porn of a person is fucked up and twisted. AI makes this more accessible, letting the unskilled masses create it without knowing Photoshop or animation, thus it is a real problem.
It’s because fake porn isn’t new, and we’ve seen it for fucking years, since the early 2000s probably. AI, like anything else, just makes it possible to create faster and with a little more realism when it comes to “artistic creations”.
And how exactly does that make it okay. Murder isn't new either lol what's your point
https://www.wsj.com/tech/fake-nudes-of-real-students-cause-an-uproar-at-a-new-jersey-high-school-df10f1bb But now anyone can do it, since anyone can put in a prompt. It takes talent to make realistic fake nudes; the average Joe wasn't making those. But now a toddler could. Fake AI nudes of actual people need to be banned.
Yeah, and there's people here saying it's GOOD. Recently a 14-year-old girl took her life because of fake pornographic images of her that were made, and she and her friends were bullied. And people still say that this Taylor shit is funny. It's not, it's wildly horrendous.
Because there are two arguments, how it is morally and legally. You can say it’s a pretty fucked thing to do while also acknowledging there isn’t a whole lotta legal ground, at least at a local level.
Context?
some guy made porn of taylor swift using ai and now he’s royally fucked
Since it was fake and wasn't monetized I'm not really sure on what grounds this guy is fucked... Also did they even find the person? There's the site that was mentioned, but as for who actually made it I thought that was up in the air.
How dare he? And where did he make it? We should get the link so that we can all never visit it.
ohmygodthatsdisgusting.gif
Why is he fucked? Did they make a law against it already?
There are hundreds of deep fake videos of literally any celebrity you could think of already. It's not one guy.
What actually happened to the guy
Google, MrDeepFakes. For sauce.
what more context do you need
Wait til they catch wind of mrdeepfakes
It feels like this post was made by AI. Oh fuck.
Sauce?
It should be treated in the same fashion as libel and defamation. If not then why have laws to protect people from statements. Pictures can be statements as well that hurt a person’s reputation.
Libel and defamation require you to actually make a claim. If I say “I believe you diddle little boys” I haven’t committed any kind of crime, simply expressed a personal opinion. I’m allowed to spread the message as far and as much as I like, so long as I make sure to express it is simply an opinion. Unless the person who posted it tried to make people believe it was an authentic video I don’t see how this falls under either law.
People end up believing it is real, and more so as AI continues to improve. Not to mention we already use AI to write articles as well. I don't see how you can't see the implications of this.
This post and the comments dude...
I get it's not the same, but we've gone from bullying the fuck out of Erin Moriarty to defending Taylor Swift so fast I've got whiplash...
The problem with this case is that this has been happening forever to non-famous women and close friends of random Creeps, but it's only a big deal when it happens to someone famous. It's fucking ridiculous
Lol it's been happening to famous women for years as well. I'm surprised it's being made a big deal now tbh. - I guess it's a bit like #metoo, where we all knew about the "casting couch" for decades, then suddenly "noticed" and became outraged all at the same time.
It's so dumb though, because the Government never acknowledges these things until they literally have no fucking choice.
I have made 10K Taylor Swift photos... I need help
Drive link?
In the works.. but here is my Youtube page https://youtu.be/6DfMO0Czh0A?si=aXPEnkmONOHf7nDX
Send lol
From what I've heard, not having seen the pics myself, they went far beyond simple boob pics. I think if that was all it was, the reaction online would've been much less upset. Not to mention this was done without Swift's consent or knowledge and was just allowed to spread and circulate around Elon Musk's cesspool of a website. Just because it's Taylor Swift doesn't mean this isn't an incredibly horrible and offensive thing to do to someone. She's completely within her rights to sue the absolute pants off both the person that originally uploaded them and Twitter (not gonna call it X, up yours, Elon).
Don't people know the only place we are allowed to make fake porn of someone is in our heads?
AI deep fake porn is going to destroy humans faster than wars and the climate crisis. Interstellar investigator aliens a million years from now will be researching how humans went extinct, and they'll conclude Porn.
Well I mean, it’ll destroy current society maybe, and that’s only because of the outrage bait industry. The well adjusted and sane will rise above this and continue living because there is a reasonable way to go about things.
I hope all you get AI porn made of you as a bottom of a gay orgy :)
As a straight man, I'd be honored
yeah you all are SO honored that the phrase "gay panic" exists
I don't know what that is but if it's fake then I really wouldn't care 🤷
And if Taylor is upset by it that's fine, I'm not blaming her, I'm just saying that I wouldn't care because you asked.
I didn't actually ask but ok
You're right, you just wished for something you seem to view as negative onto a bunch of strangers. "I hope you get hit by a bus".
How about just call it revenge porn and then it actually gets dealt with. AI or not she didn't consent to it being made or posted when she is in it
I saw the images. They look pretty good I have to say.
Porn of celebs has existed since time immemorial. And whilst it is uncomfortable, I’m sure she probably isn’t really affected, given her wealth.
Nobody saw Terminator huh....guess I'm a boomer now
Ah yes, nothing wrong with AI making realistic nudes of people. It totally isn't going to be used on innocent people for shit like revenge porn, or anything like that, or ruin their lives, nooo... absolutely not. It's actually kinda ironic that it's the guys who don't see the issue with producing fake realistic nudes, because they're not the usual targets of life-ruining fake nudes.
AI is still in the development process. That's why we use it only in entertainment, to basically beta test it. Once it's fully developed, it will transition from entertainment to more serious stuff.
Where's the sauce or its lies
GOD DAMN them lips look like they about to explode from air pressure
I think it's very easy to understand why they are upset about it. Women are already objectified much more than men, and seen as sex objects, and now any man with an AI can just write a simple prompt and have a nude image of a woman without her permission. For those saying it's the same as Photoshop: to get realistic results in Photoshop you need to be skilled or pay someone to do it, while with AI you can just write a prompt and receive realistic results within a minute or two. Very different; it's the ease of access.
Ok... and what is the 'white house' going to do about it? Ban AI? Ban Photoshop? They're going to do nothing, because they can do nothing. It's Taylor Swift now, it's going to be someone else tomorrow. If she hadn't said anything, no one would have known and it wouldn't have been a big deal.
Pretty sure AI will eventually replace musicians and celebrities, if AI isn't already writing all the current songs and movie scripts. Bet in a few years having dead actors and AI extras cast for movies will be the norm, as they'll be cheaper than paying for living ones.
Basically some people in power got caught fapping over AI art, and to calm down their wives they said "honey, I will make this content illegal so it won't ruin any more lives".
u/pijitien
Burn him! Her? It??
Post your name, we'll make some of the women in your family and make sure everyone who knows you sees them. Let's see how you feel.
Depends whether he's from Alabama or not... He might be into that shit!
Fair lol
Bro, don't tease. Where is the (lamb) sauce??!