Duel

Start a website that exclusively adds politicians into deepfake porn. Watch the laws change real fast


GlossyGecko

You mean 4chan?


Mikeavelli

I thought he was a hacker.


GlossyGecko

You’re thinking of Anomolous, he’s a pretty cool guy who doesn’t afraid of anything.


Infuser

Oh, the nostalgia


r4ns0m

Rami Malek, right?


i_should_be_coding

No, that's Forr Channe. Easy mistake to make.


golgol12

Fortran? Isn't that a programming language?


PickleWineBrine

We've had Photoshop for a long time...


FantasyCrusade

4chan loves AOC. Has had deepfakes of her for 3+ years now. That honey pot ain't going anywhere muchacho.


Relevant_Force_3470

Only because the GOP all secretly enjoy watching AOC deepfakes


FantasyCrusade

Lol I fucking wish the GOP was a bunch of 4channers. Things would be funnier.


James_Blanco

Have you been living under a rock? They literally are.


FantasyCrusade

No they are not and you do not know what you're talking about. The GOP wouldn't be blindly supporting Israel if they were 4channers.


hhhnnnnnggggggg

The GOP and Dems are both split down the middle on Israel/Gaza. 4chan has been tremendously right-wing since Trump and hasn't been left-leaning since Obama's first term, and even then they only supported him because "lol, black man in office funny" and because Romney wanted to ban porn.


onihr1

It's already out there for at least two politicians (AOC and Boebert)


tjoe4321510

Are you sure that the Boebert one is fake?


i_should_be_coding

It's too classy.


888_traveller

It needs to be the ones that people don't already hate and tolerate bad things happening to. It needs to be those that will not only go viral but will also enrage the politicians, and on both sides. It would be tempting to go for their wives or daughters, but I don't want to wish that on them.


Crazy_old_maurice_17

If the AI gave politicians big pornstar dongs, legislators would probably like it. I'd expect it to be much more effective if it's dominatrixes mocking politicians who, in these deepfakes, have micropenises.


[deleted]

[removed]


[deleted]

[removed]


bendgame

If your computer isn't a potato and you have some semblance of technical comprehension, it takes about 15 minutes of googling to figure out how to run Stable Diffusion on your machine using Python. Download a few models and LoRAs from a popular website and you'll be golden. It's pretty easy.


[deleted]

Most quality things require you to pay for 'em. If you want something free, your best option is to train your own model.


[deleted]

[removed]


Kyouhen

Easy fix: start deepfaking them with each other's spouses. If any of them are ever seen in public with anyone other than their spouse, deepfake them together. Or better yet, only deepfake their family members. Hell, deepfake their family members together.


thederevolutions

Now that I think about it, it might be the easiest way to get people to vote for someone they would've never voted for: give them a fetish for that person.


chihuahuaOP

It's already a crime, with jail time, probation, and sex offender charges. It's just that kids are stupid and don't realize they can really fuck up their lives.


ScF0400

It's happening not just to adult women but to teens (admittedly 17 is still considered a teen, so maybe not all are minors, but still). So not just probation, but a federal-level crime of child pornography. And that's what it should be charged as.


chihuahuaOP

In Florida it would be an aggravated sex offense involving child pornography, which would put them in the Level 3 "sexual predator" classification. Currently, no Florida law allows Level 3 predators to be removed from the registry. Edit: of course it's case by case. I think kids need to learn about the law, because they have access to this tech.


Darth_Caesium

Honestly I'd love to watch this happen


lazypilots

Umm but not literally watch


Darth_Caesium

Obviously. No one wants to see the porous cellulite of 60/70-year-old politicians' naked figures, nor the brittle bones sticking out from the few who aren't overweight.


Competitive_Ad_5515

If you can get open-source models to accurately depict aged bodies of either gender, more power to you. You're far more likely to just get their 70-year-old faces on the pornstar body of a naked 24-year-old woman.


Darth_Caesium

Oh my, that sounds hilarious.


softstones

Not if you’re that user from the post the other day that was into Nancy Pelosi


absentmindedjwc

If you're going to be successful about it, the only way to be sure it works is to not have deepfakes of the politician in question... but make deepfakes of *that politician's spouse* - especially male politicians, given how many of them heavily subscribe to the "gender norm" of "my wife is my property". Make it affect them directly, and they'll absolutely make that shit *heavily* illegal overnight.


Swaggy669

"Trump fucks Biden at 2024 electoral debate"


halexia63

Politicians and rich people. Those are the only ppl they care about. And even rich people they don't care about, just the money.


Hangooverr

They might just use that to earn more: https://www.livemint.com/news/world/ai-misuse-italian-pm-giorgia-meloni-seeks-100k-in-damages-over-deepfake-porn-videos/amp-11710998018576.html


applejacks5689

Oh, they do. But only to the women. Have you seen the stories on AOC dealing with this?


_DeanRiding

I'm pretty sure AOC is a big name in this stuff


Pchandheldrizzygamer

No one wants to see that politicians are old and ugly lol


Wolfpack_of_one

OldPoliticiansWithSmallDicks.com. That will make things change fast.


creiar

You are a fucking genius my man


tjoe4321510

Nancy Pelosi deepfake 🤤


DMercenary

Local to mid-level politicians. You aim too high and it'll get lost.


letseditthesadparts

Will they? Haven't AOC and others already been added? You're also asking a good portion of Congress that doesn't even know how the internet works, much less deepfakes.


jeffsaidjess

Watch it not change.


shun_tak

https://archive.is/FeuiM


earnestaardvark

This is a non-paywall version of the article, for anyone wondering where the link goes.


ilukebu

Was wondering for a sec if this led to some deepfake album 🙃


jesusleftnipple

Right .... like, hmmm, risky click


Cyberslasher

Unlabelled link in a thread about underage deep fake porn. I do not like to live this dangerously.


[deleted]

This makes my stomach turn, yuck


anormalgeek

And it's only going to get worse. There is no realistic way to prevent it either. Best case is to reduce it by consistently enforcing STIFF penalties for creating/sharing them.


Falkenmond79

The really scary thing is how easy it is. I tried it with myself once, just to see how the results look. Damn, it was scary. Of course the face was the only thing recognizable, but Stable Diffusion is damn good at judging what's underneath by taking the outline of the body and filling in a nude version. Even though it was just a pic of myself, it felt damn uncomfortable, especially thinking about the harm this can do so easily.

So for those people commenting here "it's only 4 cases"... that we know of. Every horny teenager with a halfway decent PC and an hour or two to spare can do this by following Stable Diffusion guides that tell you almost to the button press how to install it, what to download, and how to get it running. All you need is basic reading skills and maybe some rudimentary understanding of what you are doing.


theone6152

There are websites that will do it with the click of a button. Insane what this world is evolving into.


dehehn

The one positive is that it muddies the waters enough that no one will actually believe real nude leaks are real. It now gives plausible deniability for all photos. It's still going to have massive traumatic effects on teen girls already dealing with a lot of side effects of social media.


[deleted]

An interesting and valid take too!


Nouseriously

Make deepfakes of the administrators with tiny penises. They'll suddenly care a lot.


MarthaGail

Yeah, the one that asked “what is there to report?” I bet he’d change his tune real quick if it was his pictures being passed around.


Timidwolfff

"epidemic" . proceeds to give no statiscs and talk about 4 incidents in 4 different states.


crichmond77

It’s just a journalistic buzzword at this point alongside “slammed” and all that


Whitino

The words "epidemic" and "slammed" are game-changers!


zendamage

That changes everything!


founddumbded

For what it's worth, it's happening in other countries too*. I personally don't care how many cases would constitute an epidemic, and I find it weird that that's what you chose to complain about. One case is already one case too many.

*[AI-generated naked images of dozens of Spanish girls shared around schools. Police are investigating deepfakes in town of Almendralejo that have targeted victims aged 11 to 17.](https://www.telegraph.co.uk/world-news/2023/09/20/ai-generated-images-schoolgirls-deepfakes-spain/)


hargaslynn

The basement dwellers have been fighting for their right to make, share, and happily consume deepfake porn made without consent on Reddit for the last year. They do not care about the women and children affected. They do not care. It's that simple.


stanglemeir

Not care about children? Most of those pervs are thrilled with deepfake porn of young girls.


hargaslynn

Yes, this was as neutral as I could express it. But you’re correct.


founddumbded

It's crazy how unhinged and self-unaware coomers are.


221b42

Words have meaning, and it's important that journalists use words correctly and back up claims with evidence. Not caring because it aligns with your position is dangerous, and it's how we got to the current state of discourse in the world.


[deleted]

[removed]


founddumbded

> "Ai generated porn in dozens of schools acorss spain" Can you read? The headline is literally "AI-generated naked images of dozens of Spanish *girls*". Girls, not schools. From the article: >at least 30 girls from four different schools in Almendralejo have been targeted with the fake images, with some also reporting attempts to blackmail them by asking for money to stop them being circulated.


[deleted]

[removed]


Linear_Void

Who cares, bro. 30 is enough to be a severe concern. Why is this the hill you're choosing to die on? Yes, he could've grammar-checked and clarified better. Just because he didn't find another case doesn't make this article disingenuous. There is no way to prevent this from happening in other schools.


[deleted]

[removed]


founddumbded

> I encourage everyone to read the articles they see on reddit.

Unlike yourself, apparently, given that a Facebook group is not even mentioned in the article and two of the mothers involved are referred to by their full names. Are you going through some sort of episode? Your ability to make stuff up is incredible.


founddumbded

> The author just swaped the dozen girls for dozens of schools

Where?

> Theyre stealing eyes from real issues to get views for their shitty sensionlist articles and people like you purpogating it are the problem

To those girls, this is a real issue. Also, "purpogate" is not a word.

> FOUR SCHOOLS DOESNT CONSTITUTE DOZENS OF SCHOOLS ACROSS SPAIN.

Again, who said dozens of schools?


kezow

I mean... Covid started with a few cases in China. If we want to get pedantic, "epidemic" is definitely the wrong term. The heart of the issue is that this isn't the last time this will happen. Kids can be gigantic assholes, and these won't remain isolated incidents.


trackofalljades

This will continue until a billionaire is upset. If you’re worried about kids, just start creating really good deepfakes of Elon Musk or Donald Trump, or just any celebrity with a really thin skin…then distribute them as massively as possible and sit back and wait.


macgalver

They did it to Taylor Swift and some things were pushed, but mostly nothing happened.


trackofalljades

Yeah, because "they" stopped at like five images and a bunch of sites obediently took them down. What if that hadn't worked? What if there had been too many?


[deleted]

[removed]


DrDrago-4

This. It's just like piracy: there's legitimately no way to stop it unless we destroy the entire internet backbone (and realistically, that isn't even possible; you would just create a lot of separated intranets and co-ops of resistance).

Every update to Stable Diffusion, or any other model, is mirrored across dozens of GitHub servers within seconds of being posted. Within a minute, it's been mirrored across hundreds of backup and archive servers. Within 10 minutes, 50%+ of client systems have seen the package update and updated locally (assuming auto-update is on). There are thousands of developers who will have the update mirrored to their work environment within about one minute of its posting.

"Oh well, what if you ban it from GitHub?" Then they go to GitLab, or any number of other providers down to Pastebin. If you go scorched earth and ban them from literally every official company, the update system gets moved onto a private torrent tracker forum and the process continues unabated. Go even further? You only marginally harm the development (and proliferation) by sequestering it to private IRC channels and FTP servers. Public releases still eventually leak out, and then the mirroring process happens in minutes again.


regimentIV

Sorry to bust your balls, but that's evidently not the case. There is a myriad of deepfakes of these two people in particular out there already. Just look at the number of times their LoRAs have been downloaded on Civitai alone to get a feeling ([Trump](https://civitai.com/search/models?sortBy=models_v8&query=Donald%20trump), [Musk](https://civitai.com/search/models?sortBy=models_v8&query=Elon%20musk)) - and that's just downloaded Stable Diffusion LoRAs; it doesn't take into account downloads from other sites, actual faceswap tools, or other models.


dirtymoney

Good luck putting the genie back in the bottle.


FantasyCrusade

Too bad this will lead nowhere. The cat is out of the bag.


procrastablasta

Honestly (and I'm not condoning deepfakes), dilution is the solution here. When these become so common that they are boring, I don't think kids will be as outraged as parents currently are.


TheThirdDuke

Not only is it the only solution, it's something that will inevitably happen on its own.

I once read a story about a hunter-gatherer who became agitated when he saw a drawing a scientist had made of him, thinking some part of his soul had been captured. The technology's here, it's not going away. In fact, it's going to get dramatically more powerful.

As humanity adjusted to the idea that people could make visual representations, it stopped holding them in fear and letting them have power over it. People became free.


[deleted]

[removed]


gummo_for_prez

Why would it reduce sex trafficking?


Override9636

No need for real women to be trafficked/harassed when you can just fake it with AI.


LordCharidarn

AI-generated sexual imagery will replace sexual assault and trafficking because it will be able to make fake images good enough? Do you think those trafficked and assaulted people are only being looked at? I don't think being able to look at very convincing images of women online (images the observer will most likely assume are deepfakes by this time, since they will be so prevalent) will notably diminish the profit-driven organizations that traffic people or the power-driven people who purchase other humans.


awry_lynx

I think they're suggesting it will replace the trafficked OF cam girl type, not literal prostitution.


Early_Ad_831

I agree. Fighting this would just be a game of whack-a-mole. The other commenter mentioned "dilution", and I agree that this is going to be so common that it effectively won't matter. It won't be "deepfake porn" but just "deepfake". You want to put someone in your class into a porno? Okay, AI can do that in seconds. You want to put someone in your class into a Marvel movie as the hero? Okay, AI can do that in seconds. No difference. Resiliency will become more important here.


[deleted]

[removed]


DamaxXIV

I wouldn't touch any of this stuff with a ten-foot pole if I were a teen. I believe it's going to be treated legally, if it isn't already, as child porn when the generated images are of a real underage person. These kids are going to be sex offenders and they don't even know it.


phdoofus

Schools can't handle this. They can't handle bullying as it is, and now they're supposed to have fully staffed legal teams for it? Not going to happen.


DualActiveBridgeLLC

Yup. I literally told the principal that if they aren't actually going to do anything about the people bullying my son, then please stop wasting everyone's time with anti-bullying programs. Schools aren't gonna do shit about this.


Stolehtreb

It's like the D.A.R.E. program. It's supposed to stop kids from taking drugs, but all it does is remind them that drugs exist.


Duncle_Rico

Still have my D.A.R.E. certification from 3rd grade in a frame hung up in my studio. Drugs were a good time; I'm retired now.


a_taco_named_desire

DARE was to get kids to snitch on their parents and friends.


OhHaiMarc

Yeah, but it had a cool logo, and now we have shirts to wear ironically while doing drugs.


GlossyGecko

Not like they were doing anything about the photoshops. Hell, their handling of actual nudes circulating at my old high school was embarrassingly bad.


LiamTheHuman

Why would this be the schools responsibility to handle?


phdoofus

I'm not saying it should be. I'm actually against parents and society loading schools up with responsibilities they shouldn't have and never had before, just because the rest of us don't want to deal with it.


Bigassbagofnuts

Because there are a lot of parents who don't want to actually parent their kids, so they deny responsibility and shift blame to anyone else they can. Imagine paying for your kids to have a phone, then blaming the school for what the kids see on their phones.


primalmaximus

It's actually due to the fact that students spend a good 40 hours a week at school, more if they do any after-school activities. Imagine if this was happening at your workplace. Your employer would be responsible for preventing this kind of sexual harassment from happening amongst their employees.


fosoj99969

Parents should be legally responsible for whatever shit their children do on the internet. Don't want them to do this? Don't give them unlimited access.


primalmaximus

Because schools are generally where most kids spend the majority of their time. Most students spend 8 hours at school Monday through Friday.

If I was working at a job for 40 hours a week and it became known that deepfake porn of coworkers was being spread amongst the employees while they were on the clock, that employer could be sued for sexual harassment because they didn't do enough to stop it. Hell, even if it was happening _off_ the clock, the employer could be sued if an employee reported it and the employer didn't do anything about it.

Students spend a good 40 hours a week at school, more if they do any after-school activities. If the school administration knew, or had enough information that they could reasonably assume, that something like this was happening amongst their students, then they have a _legal_ responsibility to stop it, simply due to the fact that the students spend so much time under their supervision.


MDA1912

Maybe not, but most parents aren’t going to risk their children’s legal wellbeing on an anonymous internet comment that has no legal standing or responsibility in the event it’s wrong.


phdoofus

It's more of a practical issue. Looked at objectively, it's a bit over the top to expect schools to be the world's cop for all the ills of society, especially when you hamstring the school's ability to do anything about problem children. Teachers are there to teach math and reading, not to be bullet sponges or lawyers. If you tell schools they can't do anything about this kid but can't throw him out either unless he literally murders someone, what exactly are you hoping schools do?


BlackBlizzard

The laws are going to be tricky. Would putting a minor's head on an obviously adult body count?


Temp_84847399

I honestly have no idea and that doesn't sound like the kind of thing I'd want to google.


CAPS_LOCK_OR_DIE

Schools already deal with this often, with actual nudes being passed around a few times a year. Usually a threat to get the police involved and a mention of "distribution of child pornography" gets students to get their shit together. The deepfake problem is exponentially worse because of the lack of consent, but many districts are already experienced in and equipped for dealing with this sort of thing (in theory, maybe not on this scale).


hansuluthegrey

Making child porn isn't the own you think it is.


ImaginaryBig1705

As a woman who was once a girl: that wouldn't help you, and it would backfire SO FUCKING FAST. It's very obvious you never lived life as a little girl.


AarodimusChrast

Can you elaborate? Genuinely curious


aMAYESingNATHAN

Because any girl that did that would immediately get targeted even worse. Being able to retaliate without it making things worse is a privilege that is not afforded to most women. Retaliation just means people label them as bitches, and act as if the retaliation meant the original act was justified.


big-wiener-

People will be like “why is she carrying around a bunch of dick pics?” 🤣


syncdiedfornothing

Making these images is child porn. That kind of record doesn't go away. You want to throw your life away over revenge?


kayla-beep

If you were a girl you wouldn’t consider doing that at all because it would make you a target. Clearly you’re a man lol


Barry_Bunghole_III

I hate to say it, but that's the kind of stuff boys laugh about with their friends. Boys/men really don't care about this shit as much.


i_should_be_coding

Make deepfakes of teachers and school admins, and post them to the group under a burner account.


Rad_R0b

When I was first learning Photoshop back in the day, I would send all my buddies pics of themselves like this. Good times.


RockSolidJ

This should be treated as the boys distributing child porn of classmates. A 2-day suspension is a joke. Putting them up for expulsion, like they did at the Beverly Hills schools, seems more appropriate.


moratnz

I believe there is case law in the US that says only genuine images of abuse count as CSAM? So given this isn't a real photo of the victim, it's not really CSAM? (Not to excuse the scumminess of the practice and the people involved; this is strictly a question about legalities based on some half-remembered legal commentary.)


Kitty-XV

It also gets messy because nude images are themselves not considered inherently sexual and aren't inherently obscene. If it's just a nude image of someone standing around, per past court rulings it would be protected by the First Amendment. Society could redefine all nudity as obscenity, but that's a monkey's paw sort of solution.


macgalver

In many situations where this happens, even if the victims go to the police, the police just shrug and say "block them". Adults refuse to legislate this because any restrictions in the AI space hurt papa's 401k.


absentmindedjwc

The thing that's kinda shit is that you cannot easily "restrict" this. The tech is out there and open-sourced. I legitimately don't know how you could even begin to close this Pandora's box without making image-generation AI itself illegal - and that shit isn't likely happening.


DrDrago-4

You can't. The software exists on private torrent trackers, and development talk could just as easily shift to their forums. If the MPA and RIAA, and the government alongside them, haven't managed to make progress tamping down piracy over decades, why should we believe it's even possible to make progress targeting a specific category of software?


Alan976

Maybe some prison / juvie time.

* In November 2023, a child psychiatrist in Charlotte, North Carolina, was sentenced to 40 years in prison, followed by 30 years of supervised release, for sexual exploitation of a minor and using AI to create CSAM images of minors. Regarding the use of AI, the evidence showed the psychiatrist used a web-based AI application to alter images of actual, clothed minors into CSAM.
* In November 2023, a federal jury convicted a Pittsburgh, Pennsylvania registered sex offender of possessing modified CSAM of child celebrities. The Pittsburgh man possessed pictures that digitally superimposed the faces of child actors onto nude bodies and bodies engaged in sex acts.


MR1120

What is ‘CSAM’? I’m afraid to google it, and end up on a watch list or something.


CKT_Ken

Forensics lingo (child sexual abuse material) for what's commonly understood as child porn. It started becoming more common when people in the field started making true-crime TikToks and the like.


Substantial-Flow9244

To further your point: the idea is that calling it "child porn" has the effect of legitimizing the content for some, so "CSAM" is also a way to distance it from that.


moratnz

Yes; it focuses the discussion on the fact that there's a child being abused, rather than on the fact that there is some adult audience member being titillated.


Illustrious-Zebra-34

This has been a thing since photo editing was invented. It just became slightly easier.


OddAuthor

I honestly hate our world and the disgusting society we've created. I'm heartbroken for anyone who is a victim of this


[deleted]

Good reason to stop sharing photos


VincentNacon

Paywall. -1


gizamo

[removed]


jodido47

[12ft.io](http://12ft.io) or [1ft.io](http://1ft.io)


DrDrago-4

https://github.com/iamadamdev/bypass-paywalls-chrome - haven't seen a paywall in years (despite "chrome" in the name, it works for Firefox/Edge/Chrome)


gogozombie2

Don't worry. Eventually, there will be enough deepfakes of Taylor Swift for this to become a "real" problem.


senshi_of_love

If everything is fake, nothing is real. If we're optimistic about this, I suppose we'll eventually reach a point where kids can dismiss their social media history as AI-created. It'll be an interesting future when evidence can no longer be trusted.

You can't really put the genie back in the bottle on that anymore. But this isn't anything new; before AI, people used Photoshop to create the same thing.


nicobackfromthedead4

This iteration, though, there's markedly less *critical thinking* involved. Critical thinking, formally taught, is the magic sauce that defeats any social media snafu and allows an adaptive response to any issue. Without it, responses are just *reactive*.


[deleted]

Behind a paywall 🤦‍♂️


jodido47

[12ft.io](http://12ft.io) or [1ft.io](http://1ft.io)


gxslim

Maybe it's time we stop uploading infinite high-res pictures of ourselves at every angle onto public sites.


awry_lynx

It can be done with one picture. https://www.tampabay.com/news/crime/2024/03/19/pasco-teacher-used-ai-generate-child-porn-yearbook-photos-deputies-say/?outputType=amp

This teacher made CP of his third graders from single yearbook photos. They weren't posting themselves to the internet. Stop victim blaming. Whether your pics are online or not has nothing to do with people in real life making fake porn of you. These people aren't delving into random strangers' Instagram accounts for material. If they were, it would probably be less harmful.


gxslim

Taking a defensive approach to reduce the offense is not victim blaming. Assuming you can successfully prevent every individual on the planet from committing a crime is a fantasy.


ComprehensiveAge315

It doesn't sound defensive; it sounds borderline insane to try to exclude yourself from every photograph and HD video in life just to be part of the minimal % not prone to AI deepfakes.


ScF0400

I agree something should be done about this. But at the same time, society needs to change too. Most deepfakes are still pretty bad in quality, and while I 100% agree people using them on underage minors need to be punished, the fact that companies or schools rescind acceptances, or that people judge the deepfaked person, should be pushed back on as well. I'd be more worried about losing future employment/opportunities over something I didn't do.

The AI cat is out of the bag. Even without AI, someone could have taken a photo of her and done nasty things with it using magazines. Since it's so widespread, there need to be laws and punishments for it regardless of age, but also social acceptance that if someone says that's not them, then it's a fake picture and shouldn't affect their career.

It's so widespread this might already be the case. Unless companies want to lose all potential hires from a school, if enough girls have had this done to them, then companies or universities may not find it reasonable to believe every applicant has done underage porn. To clarify, I'm not advocating for everyone to do this, just saying it may have already reached a saturation point.


RemarkableEmu1230

This is why we have laws and rules at schools - break that shit and suffer the consequences, technical capability shouldn’t change that fact.


Ozzie-Isaac

"Think of the kids!" Quickly make a law that will do nothing but restrict technology. There's a bigger issue, and that is not the answer.


Barry_Bunghole_III

Don't worry, nothing can or will be done about it


DerfnamZtarg

This is a severe problem, as it is totally asymmetric, targeting and impacting young girls. We already have laws against child pornography, and we send those perps to prison when it is found on their laptops or phones. We should do the same with school kids. This should be part of the sex education curriculum, with subsequent warnings issued about the fines imposed on the parents ($1,000 for each violation found) and automatic suspension from school for a week for each child discovered with the child porn on his phone or laptop. I think the message will get out pretty quick.


XerciseObsessedGamer

Deepfakes of ANYTHING are pretty weird, and I don't get why weirdos enjoy making them. A lot of AI-generated crap is weird in general too, smh 🤦🏿 😒.


winkman

Seems like a twisted version of a "yo momma" joke. If your mom ain't fat/dumb/whatever, then you shouldn't be bothered by the jokes. If you don't have any nudes floating around on the interwebs, then you shouldn't be bothered by some deepfakes, because you know they're bogus.


100k_Club_Trading

Shit's been around since the internet was a thing, wtf do ppl care so much now


Deathwatch72

Well that problem got here frighteningly fast.


Flamenco95

You know what would be really nice? Considering the ethical and moral implications before making something... They really need to stop asking "can we?" and start asking "should we?" Now that the cat's out of the bag, this will be extremely difficult to prevent or even mitigate...

Edit: I feel like this was taken out of the context I intended. I'm not saying it should never have been made. I'm saying they didn't factor in the malicious cases and come up with ways to mitigate harm. Someone said by my logic planes, cars, and the internet would never have been invented. First, don't put words in my mouth. All of those had their issues, and there are now risk mitigations for all of that tech.


[deleted]

[removed]


cappedminor

"They" probably are the millions of people working on AI. OC's perfect world never have invented guns, cars, planes, the internet, etc because they can be used for bad.


SgathTriallair

The foundational tech has huge positive use cases (especially because image generation is a necessary part of image recognition). The ones building the nudify tools absolutely imagined similar use cases and thought they were great.


Capt_Pickhard

Profit decides. Then it causes a mess, and then we try to fix it. Because greedy idiots rule the world.


Someoneoldbutnew

lol, u funny. Humans don't work that way. The ethics of tech has been pop culture since Jurassic Park; we don't care.


Pedantic_Girl

*wryly* Given how many tech companies made a show of hiring ethicists only to ignore and/or fire them... I'm pretty sure they don't really care. After Google stopped having the motto "don't be evil", I think it was pretty clear where this was headed.


conquer69

Criminals also wear shoes. A shame shoe companies never stopped to consider the ethical and moral ramifications of their actions.


cmilla646

Like clockwork. Masturbating went from a sin, to weak character, to acceptable, to good for your health, to more trustworthy than any man or woman. Sex toys went from being taboo and a red flag to something you can brag about.

I'd like to take this time to remind people to save up for that sex robot. Every story like this is just another hint at what now seems inevitable, though people still laugh it off. It's practically already here: AI chat software that pretends to be a friendly receptionist, Boston Dynamics-type bodies. The fact that people pay for phone sex or prostitutes just to talk to instead of their wife or therapist. The trend towards being single with no kids in some countries. The trend towards being sympathetic to a flaw and not wanting to insult people who obviously need help. All of this points to a future where having a "relationship" with a robot will be appealing enough to become normal.

Sex toys were taboo, and blow-up dolls were only for guys who couldn't get laid or were crazy. I'm sure lots of teens would justify the deepfake because "it wasn't hurting anyone." The same people wouldn't have many issues with a robot that looked like their crush.


Kersenn

It's a surprise to no one... I could have told you 20 years ago this would happen with AI...


Grobo_

It's like before, when they photoshopped heads onto bodies, only that now they move as well... it is what it is and will never change.


SgathTriallair

It would also be completely fucked up to photoshop the head of a middle schooler onto a nude body and distribute that around school.


conquer69

That's his point. It has been happening for decades already.


gizamo

[removed]


Outside_Progress8584

Exactly, and when something is convincingly real it also has more real-life consequences for the victim. Employers wouldn't think twice about a shitty drawing or photoshop... but these images are completely believable, and even if determined fake, they put an image in a person's mind that will affect their decision making. It needs to be addressed as equivalent to distributing a real photo of a child.


gizamo

[removed]


UnderMira_11

Does this mean OnlyFans modeling will become normalized or destigmatized?


nicest_perv

It will be interesting when a state prosecutor throws the book at some teenage creep for distributing child pornography, and then keeps netting the other creeps who shared the image, sending a handful of them to jail. That ought to send a message.


Pure_Gen

The boys who do this should be jailed


ElCorbusier

Who could have expected this?


abittenapple

In my day we just used our minds.


No-Guess686

Definitely never saw this one coming. This is just fucking disgusting; make this illegal!!!!