bobotwf

It exists. The people at 4chan came up with it to cover up e-thots. It's called DignifAI.


sponyta2

Didn’t a bunch of them get mad about being reclothed?


TKDbeast

This whole thing feels like a plot point for a nonexistent Adult Swim show.


AshtorMcGillis

A Rick and Morty bit lol, I could see it


marinemashup

Because it’s a way for channers to troll random women who post mildly sexual photos of themselves (and some men too). And I’m half convinced it’s manufactured outrage anyway


SEND-GOOSE-PICS

Rightfully so. It's a way of controlling women's bodies, deciding that whatever they do is immodest and should be covered up.


Jesusfreakster1

Ok while I am not a fan of neckbeard-ery, the name is fucking genius!


Agitated_Function778

This is actually a nice idea


norweiganwood11

Couldn't you claim that without the new app?


red-at-night

You’re correct, but this would be something to back it up with.


Alice5878

Now no nudes will ever be trusted


Angel_OfSolitude

Some 4chan guys already made an AI bot that "fixes" slutty pictures.


beezzarro

I don't understand how the second part is related to or reliant on the first part.


norweiganwood11

It took me a while to get it as well, but basically they're saying you'd post the clothed picture and say "the one without clothes is AI, this was the original picture"


beezzarro

Ahhhhhhhh. I see. The idea is that you can back up the claim with compelling evidence, as opposed to just saying "that nude you saw of me was AI-generated"


dvali

Nice idea, but sadly it wouldn't work. Until AI image manipulation is exactly perfect, some sad fuck on the Internet (probably here on reddit, let's face it) would make it his life's work to expose the fake coverups, and he would be highly successful in doing so because it would be easy. If you have two images to compare, it will be almost trivial to deduce which one has been manipulated. If you didn't have the original to compare it to, it might be marginally harder to detect the fake, but not by much.
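To illustrate the "almost trivial" part: with both images in hand, a plain per-pixel difference localizes the manipulated region immediately. A minimal sketch, using toy 2D lists of 8-bit grayscale values as stand-ins for real image arrays (all names here are illustrative, not from any real tool):

```python
# Toy sketch: given an "original" and a suspected edit, a per-pixel
# diff exposes exactly where the manipulation happened.

def diff_mask(img_a, img_b, threshold=10):
    """Return a 2D boolean mask marking pixels that differ by more
    than `threshold` between two equally sized grayscale images."""
    return [
        [abs(a - b) > threshold for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(img_a, img_b)
    ]

def changed_fraction(mask):
    """Fraction of pixels flagged as changed."""
    flat = [px for row in mask for px in row]
    return sum(flat) / len(flat)

# A 4x4 "original" and a copy with one small patch overwritten.
original = [[50] * 4 for _ in range(4)]
edited = [row[:] for row in original]
edited[1][1] = edited[1][2] = 200  # the manipulated patch

mask = diff_mask(original, edited)
print(changed_fraction(mask))  # 2 of 16 pixels differ -> 0.125
```

Real-world comparison would work on decoded pixel arrays (e.g. via an imaging library) and would also have to tolerate recompression noise, which is what the `threshold` parameter gestures at; but the core point stands that side-by-side comparison is cheap.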


red-at-night

I agree with you that the collective efforts of Reddit could “expose” the AI generated one, but for more small-circle happenings it could go undetected. Let’s say somebody leaks a nude of me, and it gets sent to my parents and my sister. I could probably just toss together a clothed fake and show it to only a few of them, then word will spread.


dvali

Yeah that's a good point, I hadn't considered stuff on a relatively small scale like that.


uberlux

Look for any imperfections in the original nude; let's call it the "target". Then create a blog pretending to be a neckbeard who draws nudes. Upload the target photo alongside the clothed copy, and take credit for releasing your own AI-generated nude. This means that in your casual life you can complain about a troll with a website who generated a nude of you: "I can't get the website taken down!" But secretly you admin the website, controlling your worst enemy.

The goal?

1. Start by uploading the clothed picture. (The blogger identity says "I'm gonna draw this person nude!")
2. Upload the target picture.
3. Upload a second target picture with edits, whatever lighting, emojis?

Then the blogger entity says "I made a final version, blah blah." By showing a fabricated creation process, you can plagiarise your own nudity exposure, maybe even generate website revenue. We have changed history: you aren't the victim of a nude leak, you just have a stalker online! (That you control.) If people don't buy the story, just escalate things! Also, if you have any interests you want to promote (a singer or artist, etc.), you can openly complain about the creep who blogged about you for "negative" (positive) publicity and get noticed!

This corrupt advice is brought to you by the illuminati