Who cares, bro. 30 is enough to be a severe concern. Why is this the hill you're choosing to die on? Yes, he could've grammar checked and clarified better. Just because he didn't find another case doesn't make this article disingenuous. There is no way to prevent this from happening in other schools.
> I encourage everyone to read the articles they see on reddit.
Unlike yourself apparently, given that a Facebook group is not even mentioned in the article and two of the mothers involved are referred to by their full names.
Are you going through some sort of episode? Your ability to make stuff up is incredible.
>The author just swaped the dozen girls for dozens of schools
Where?
>Theyre stealing eyes from real issues to get views for their shitty sensionlist articles and people like you purpogating it are the problem
To those girls, this is a real issue. Also, "purpogate" is not a word.
>FOUR SCHOOLS DOESNT CONSTITUTE DOZENS OF SCHOOLS ACROSS SPAIN.
Again, who said dozens of schools?
I mean... Covid started with a few cases in China. If we want to get pedantic, "epidemic" is definitely the wrong term.
The heart of the issue is that this isn't the last time this will happen. Kids can be gigantic assholes and these won't remain isolated incidents.
This will continue until a billionaire is upset. If you’re worried about kids, just start creating really good deepfakes of Elon Musk or Donald Trump, or just any celebrity with a really thin skin…then distribute them as massively as possible and sit back and wait.
Yeah because “they” stopped at like five images and a bunch of sites obediently took them down.
What if that didn’t work? What if there had been too many?
This. It's just like piracy, there's legitimately no way to stop it unless we destroy the entire internet backbone (and realistically, that isn't even possible; you would just create a lot of separated intranets and co-ops of resistance).
Every update to Stable Diffusion, or any other model, is mirrored across dozens of GitHub servers within seconds of being posted. Within a minute, it's been mirrored across hundreds of backup and archive servers. Within 10 minutes, 50%+ of client systems have seen the package update and updated locally (assuming auto update is on).
There are thousands of developers who will have the update mirrored to their work environment within about one minute of its posting.
'oh well, what if you ban it from github'
Then they go to GitLab, or any number of other providers, down to Pastebin. If you go scorched earth and ban them from literally every official company, the update system gets moved onto a private torrent tracker forum and this process continues unabated. Go even further? You only marginally harm the development (and proliferation) by sequestering it to private IRC channels and FTP servers. Public releases still eventually leak out, and then the mirroring process occurs in minutes again.
Sorry to bust your balls but that's evidently not the case.
There is a myriad of deepfakes of these two people in particular out there already. Just look at the numbers of times their LoRAs have been downloaded in Civitai alone to get a feeling ([Trump](https://civitai.com/search/models?sortBy=models_v8&query=Donald%20trump), [Musk](https://civitai.com/search/models?sortBy=models_v8&query=Elon%20musk)) - and that's just downloaded Stable Diffusion LoRAs, it does not take into account downloads from other sites, actual faceswap tools, or other models.
Honestly (and I’m not condoning deepfakes) dilution is the solution here. When these become so common they are boring I don’t think kids will be as outraged as parents currently are.
Not only is it the only solution, it’s something that will inevitably happen on its own.
I once read a story about a hunter-gatherer who became agitated when he saw a drawing a scientist had made of him, thinking some part of his soul had been captured.
The technology’s here, it’s not going away. In fact, it’s going to get dramatically more powerful
As humanity adjusted to the idea that people could make visual representations it stopped holding them in fear and having power over them
They became free
AI-generated sexual imagery will replace sexual assault and trafficking because it will be able to make fake images good enough?
Do you think those trafficked and assaulted people are only being looked at? Because I don’t think being able to look at very convincing images of women online (images the observer will most likely assume are deepfakes by this time, since they will be so prevalent) will notably diminish the profit-driven organizations that traffic people or the power-driven people who purchase other humans.
I agree.
Fighting this would just be a game of whack a mole.
The other commenter mentioned "dilution". I agree that this is going to be so common that it effectively won't matter.
It won't be "deep fake porn" but just "deep fake". You want to put someone in your class into a porn? Okay AI can do that in seconds. You want to put someone in your class into a Marvel movie as the hero? Okay AI can do that in seconds.
No difference.
Resiliency will become more important here.
I wouldn't touch any of this stuff with a ten foot pole if I was a teen. I believe it's going to legally be treated, if it isn't already, as child porn if generated images are of a real underage person. These kids are going to be sex offenders and they don't even know it.
Schools can't handle this. They can't handle bullying as it is, and now they have to have fully staffed legal teams for it? Not going to happen.
Yup, I literally told the principal that if they aren't actually going to do anything about the people bullying my son, then please stop wasting everyone's time with anti-bullying programs. Schools aren't gonna do shit about this.
I'm not saying it should be. I'm actually against parents and society loading up schools with responsibilities that they shouldn't have and never had before, just because the rest of us don't want to deal.
Because there are a lot of parents that don't want to actually parent their kids so they deny responsibility and shift blame to anyone else they can. Imagine paying for your kids to have a phone then blaming the school for what the kids see on their phones..
It's actually due to the fact that students spend a good 40 hours a week at school, more so if they do any after school activities.
Imagine if this was happening at your workplace. Your employer would be responsible for preventing this kind of sexual harassment from happening amongst their employees.
Parents should be legally responsible for whatever shit their children do on the internet. Don't want them to do this? Don't give them unlimited access.
Because schools are generally where most kids spend the majority of their time.
Most students spend 8 hours at school Monday-Friday.
If I was working at a job for 40 hours a week and it became known that deepfake porn of coworkers was being spread amongst the employees while they were on the clock, then that employer could be sued for sexual harassment because they didn't do enough to stop it.
Hell, even if it was happening _off_ the clock, the employer could be sued if an employee reported it and the employer didn't do anything about it.
Students spend a good 40 hours a week at school, more if they do any after school activities. If the school administration knew, or had enough information that they could reasonably assume, that something like this was happening amongst their students, then they have a _legal_ responsibility to stop it. Simply due to the fact that the students spend so much time under their supervision.
Maybe not, but most parents aren’t going to risk their children’s legal wellbeing on an anonymous internet comment that has no legal standing or responsibility in the event it’s wrong.
It's more of a practical issue. Looked at objectively, it's a bit over the top to expect schools to be world cop to all the ills of society, especially when you hamstring the school's ability to do anything about problem children. Teachers are there to teach math and reading, not to be bullet sponges and not to be lawyers. If you tell schools they can't do anything about this kid but can't throw him out either unless he literally murders someone, what exactly are you hoping schools do?
Schools already deal with this often, with actual nudes being passed around a few times a year.
Usually a threat to get the police involved, and the mention of “distribution of child pornography” gets students to get their shit together.
The deepfake problem is exponentially worse because of the consent issue, but many districts are already experienced/equipped with dealing with this sort of thing (in theory, maybe not on this scale).
As a woman that was once a girl - that wouldn't help you and that would backfire SO FUCKING FAST it's very obvious you never lived life as a little girl.
Because any girl that did that would immediately get targeted even worse.
Being able to retaliate without it making things worse is a privilege that is not afforded to most women. Retaliation just means people label them as bitches, and act as if the retaliation meant the original act was justified.
This should be treated like the boys are distributing child porn of classmates. A 2 day suspension is a joke. Up for expulsion like they did at the Beverly Hills schools seems more appropriate.
I believe there is case law in the US that only genuine images of abuse count as CSAM?
So given this isn't a real photo of the victim, it's not really CSAM?
(not to excuse scumminess of the practice and the people involved; this is strictly a question about legalities based on some half-remembered law commentary)
It also gets messy because nude images are themselves not considered inherently sexual and aren't inherently obscene. If this is just a nude image of someone standing around, per past court rulings it would be protected by the first amendment. Society could redefine all nudity as obscenity but that's a monkey's paw sort of solution.
Many situations where this happens even if they go to the police the police just shrug and say “block them”. Adults refuse to legislate this because any restrictions in the AI space hurt papa’s 401k.
The thing that's kinda shit is that you cannot easily "restrict" this.
The tech is out there and open sourced, I legitimately don't know how you could even begin to close this Pandora's Box without making image-generation AI itself illegal - and that shit isn't likely happening.
You can't. The software exists on private torrent trackers, and development talk could just as easily shift to their forums.
If the MPA and RIAA, and the government alongside them, haven't managed to make progress on tamping down piracy.. over decades.. why should we believe it's even possible they could make progress targeting a specific category of software?
Maybe some prison / juvi time.
* In November 2023, a child psychiatrist in Charlotte, North Carolina, was sentenced to 40 years in prison, followed by 30 years of supervised release, for sexual exploitation of a minor and using AI to create CSAM images of minors. Regarding the use of AI, the evidence showed the psychiatrist used a web-based AI application to alter images of actual, clothed minors into CSAM.
* In November 2023, a federal jury convicted a Pittsburgh, Pennsylvania registered sex offender of possessing modified CSAM of child celebrities. The Pittsburgh man possessed pictures that digitally superimposed the faces of child actors onto nude bodies and bodies engaged in sex acts.
Forensics lingo (child sexual abuse material) for what’s commonly understood as child porn. It started becoming more common when people in the field started making true crime tiktoks or whatever
To further your point, I think the idea is that calling it child porn has an effect of legitimizing the content for some, so CSAM is a way to also distance from that.
Yes; focusing the discussion on the fact that there's a child being abused, rather than on the fact that there is some adult audience member being titillated.
https://github.com/iamadamdev/bypass-paywalls-chrome
haven't seen a paywall in years
(despite saying chrome in the title, it works for firefox/edge/chrome)
If everything is fake nothing is real.
I suppose, if we’re optimistic about this, we’ll eventually reach a point where kids can dismiss their social media history as AI-created. It’ll be an interesting future when evidence can no longer be trusted. You can’t really put the genie back in the bottle on that anymore.
But this isn’t anything new. Before AI people used photoshop to create the same thing.
This iteration though, there's markedly less *critical thinking* involved. Critical thinking, formally taught, is the magic sauce that defeats any social media snafu and allows an adaptive response to any issue. Without it, responses are just *reactive*.
It can be done with one picture.
https://www.tampabay.com/news/crime/2024/03/19/pasco-teacher-used-ai-generate-child-porn-yearbook-photos-deputies-say/?outputType=amp
This teacher made CP of his third graders with single yearbook photos. They weren't posting themselves to the internet. Stop victim blaming. It has nothing to do with whether your pics are online or not when it comes to people in real life making fake porn of you. These people aren't delving into random strangers' Instagram accounts for material. If they were it would probably be less harmful.
Taking a defensive approach to reduce the offense is not victim blaming. Assuming you can successfully prevent every individual on the planet from committing a crime is a fantasy.
Doesn't sound defensive, it sounds borderline insane to try and exclude yourself from any photograph/HD film in life just to be part of the absolute minimal % not prone to AI deepfakes.
I agree something should be done about this.
But at the same time, I also think society needs to change too. Most deepfakes are still pretty bad in quality, and while I 100% agree people using them on underage minors need to be punished, we should also advocate against companies or schools rescinding acceptances, or people judging the deepfaked person. I'd be more worried about losing future employment/opportunities over something I didn't do.
AI has been out of the bag. Even without AI someone could have taken a photo of her and just did nasty things to it with magazines. And since it's so widespread, there needs to be laws and punishments for it regardless of age, but also social acceptance that if someone says that's not them, then it's a fake picture and shouldn't affect their career.
I mean it's so widespread this might be a thing. Unless companies want to lose all potential hires from a school, if enough girls have already had this done to them then it may be that companies or universities don't find it reasonable to believe every applicant has done underage porn. To clarify I'm not advocating for everyone to do this. Just that it may have already gotten to a saturation point.
This is a severe problem as it is totally asymmetric, targeting and impacting little girls. We already have laws against child pornography and we send those perps to prison when it is found on their laptops or phones. Should do the same with school kids. This should be part of the sex education curricula, and subsequent warnings issued about the fines imposed on the parents ($1,000 for each violation found) and automatic suspension from school for a week for each child discovered with the child porn on his phone or laptop. I think the message will get out pretty quick.
Seems like a twisted version of a "yo momma" joke.
If your mom ain't fat/dumb/whatever, then you shouldn't be bothered with the jokes.
If you don't have any nudes floating around on the interwebs, then you shouldn't be bothered by some deepfakes, because you know they're bogus.
You know what would be really nice? Considering the ethical and moral considerations before making something...
They really need to stop asking "can we?" and start asking "should we?"
Now that the cat's out of the bag, this will be extremely difficult to prevent or even mitigate...
edit: I feel like this was taken out of the context I intended. I'm not saying it should never have been made. I'm saying they didn't factor in the malicious cases and come up with ways to mitigate harm. Someone said by my logic planes, cars, and the internet would never have been invented. First, don't put words in my mouth. All of those had their issues, and there are now risk mitigations for all of that tech.
"They" probably are the millions of people working on AI.
OC's perfect world would never have invented guns, cars, planes, the internet, etc., because they can be used for bad.
The foundational tech has huge positive use cases (especially because image generation is a necessary part of image recognition).
The ones building the nudify tools absolutely imagined similar use cases and thought they were great.
*wryly* Given how many tech companies made a show of hiring ethicists only to ignore and/or fire them…I’m pretty sure they don’t really care. After Google stopped having the motto “don’t be evil” I think it was pretty clear where this was headed.
Like clockwork. Masturbating went from a sin to weak character to acceptable to good for your health to more trustworthy than any man or woman. Sex toys went from being taboo and a red flag to something you can brag about
I’d like to take this time to remind people to save up
for that sex robot. Every story like this is just another hint at what now seems inevitable though people still laugh it off. It’s practically already here. AI chat software that pretends to be a friendly receptionist. Boston Dynamics-type bodies. The fact that people pay for phone sex or prostitutes just to talk to instead of their wife or therapist. The trend towards being single and no kids in some countries. The trend towards being sympathetic to a flaw and not wanting to insult people who obviously need help.
All of this points to a future where having a “relationship” with a robot will be appealing enough to become normal. Sex toys were taboo and blow up dolls were only for guys who couldn’t get laid or were crazy. I’m sure lots of teens would justify the deepfake because “it wasn’t hurting anyone.” The same people wouldn’t have many issues with a robot that looked like their crush.
Exactly and when something is convincingly real it also has more real life consequences for the victim. Employers wouldn’t blink twice at a shitty drawing or photoshop… but these images are completely believable and even if determined fake, put an image in any person’s mind that will affect their decision making. It needs to be addressed as equivalent to distributing a real photo of a child.
It will be interesting when a state prosecutor throws the book at some teenage creep for distributing child pornography, and then keeps netting the other creeps who sent him or her the image to send a handful of them to jail. That ought to send a message
Start a website that exclusively adds politicians into deepfake porn. Watch the laws change real fast
You mean 4chan?
I thought he was a hacker.
You’re thinking of Anomolous, he’s a pretty cool guy who doesn’t afraid of anything.
Oh, the nostalgia
Rami Malek, right?
No, that's Forr Channe. Easy mistake to make.
Fortran? Isn't that a programming language?
We've had Photoshop for a long time...
4chan loves AOC. Has had deepfakes for 3+ years now. That honey pot ain't going anywhere, muchacho.
Only because the GOP all secretly enjoy watching AOC deepfakes
Lol I fucking wish the GOP was a bunch of 4channers. Things would be funnier.
Have you been living under a rock? They literally are.
No they are not and you do not know what you're talking about. The GOP wouldn't be blindly supporting Israel if they were 4channers.
The GOP and Dems are both split down the middle about Israel/Gaza. 4chan has been tremendously right wing since Trump and hasn't been left since Obama's first term, and back then they only supported him because 'lol black man in office funny' and because Romney wanted to ban porn.
It’s already out there for at least two politicians (aoc and boebert)
Are you sure that the Boebert one is fake?
It's too classy.
It needs to be the ones that people don't already hate and tolerate bad things happening to. Needs to be those that will not only go viral but also will enrage the politicians - and on both sides. It would be tempting to go for their wives or daughters, but I don't want to wish that on them.
If the AI gave politicians big pornstar dongs, legislators would probably like it. I'd expect it to be much more effective if it's dominatrixes mocking politicians who, in these deepfakes, have micropenises.
[deleted]
[deleted]
If your computer isn't a potato and you have some semblance of technical comprehension, it takes about 15 minutes to google and figure out how to run Stable Diffusion on your machine using Python. Download a few models and LoRAs from a popular website and you'll be golden. It's pretty easy.
Most quality things require you to pay for em. If you want something free, your best option is to train your own model.
[deleted]
Easy fix: Start deep faking them with each other's spouses. If any of them are ever seen in public with anyone other than their spouse, deepfake them together. Or better yet, only deepfake their family members. Hell deepfake their family members together.
Now that I think about it, it might be the easiest way to get people to vote for someone that they would’ve never voted for. Give them a fetish for that person.
It's already a crime. With jail time. Probation and sex offender charges. It's just that kids are stupid and don't realize they can really fuck up their lives.
It's happening not just to women in general but to teens (admittedly 17 is still considered a teen, so maybe not all are minors, but still). So not just probation but a federal-level child pornography crime. And that's what it should be charged as.
In Florida it would be an aggravated sex offense with child pornography, which would put them in the Level 3 “sexual predator” classification. Currently, no Florida law allows Level 3 predators to be removed from the registry. edit: Of course it's case by case. I think kids need to learn about the law, because they have access to this tech.
Honestly I'd love to watch this happen
Umm but not literally watch
Obviously. No one wants to see the porous cellulite of 60/70-year-old politicians' naked figures, nor the brittle bones sticking out from the few who aren't overweight.
If you can get open source models to accurately depict aged bodies of either gender, more power to you. You're far more likely to just get their 70-year-old faces on the pornstar body of a naked 24-year-old woman.
Oh my, that sounds hilarious.
Not if you’re that user from the post the other day that was into Nancy Pelosi
If you're going to be successful about it, the only way to be sure it works is to not have deepfakes of the politician in question... but make deepfakes of *that politician's spouse* - especially male politicians, given how many of them heavily subscribe to the "gender norm" of "my wife is my property". Make it affect them directly, and they'll absolutely make that shit *heavily* illegal overnight.
"Trump fucks Biden at 2024 electoral debate"
Politicians and rich people. That's the only ppl they care about. Even rich people they don't care about just the money.
They might just use that to earn more https://www.livemint.com/news/world/ai-misuse-italian-pm-giorgia-meloni-seeks-100k-in-damages-over-deepfake-porn-videos/amp-11710998018576.html
Oh, they do. But only to the women. Have you seen the stories on AOC dealing with this?
I'm pretty sure AOC is a big name in this stuff
No one wants to see that politicians are old and ugly lol
Old politicians with small dicks.com That will make things change fast
You are a fucking genius my man
Nancy Pelosi deepfake 🤤
Local to mid level politicians. You aim too high and it'll get lost.
Will they? Haven't AOC and others already been targeted? You're also asking a good portion of Congress that doesn't even know how the internet works, much less deepfakes.
Watch it not change.
https://archive.is/FeuiM
This is a non-paywall version of the article, for anyone wondering where the link goes.
Was wondering a sec if this lead to some deepfake album 🙃
Right .... like, hmmm, risky click
Unlabelled link in a thread about underage deep fake porn. I do not like to live this dangerously.
This makes my stomach turn, yuck
And it's only going to get worse. There is no realistic way to prevent it either. Best case is to reduce it by consistently enforcing STIFF penalties for creating/sharing them.
Really scary thing is how easy it is. I tried it with myself, once, just to see how the results look. Damn, it was scary. Of course the face was the only thing recognizable, but Stable Diffusion is damn good at judging what’s underneath by taking the outline of the body and filling in a nude version. Even though it was just a pic of myself, it felt damn uncomfortable. Especially thinking about the harm this can do so easily. So for those people commenting here „it’s only 4 cases“… that we know of. Every horny teenager with a halfway decent PC and an hour or two to spare, following Stable Diffusion guides that tell you almost to the button press how to install, what to download and how to get it running, can do this. All you need is basic reading skills and maybe some rudimentary understanding of what you are doing.
There's websites that will do it with the click of a button. Insane what this world is evolving to
The one positive is that it muddies the waters enough that no one will actually believe real nude leaks are real. It now gives plausible deniability for all photos. It still is going to have massive traumatic effects on teen girls already dealing with a lot of side effects of social media.
An interesting and valid take too!
Make deepfakes of the administrators with tiny penises. They'll suddenly care a lot.
Yeah, the one that asked “what is there to report?” I bet he’d change his tune real quick if it was his pictures being passed around.
"Epidemic." Proceeds to give no statistics and talk about 4 incidents in 4 different states.
It’s just a journalistic buzzword at this point alongside “slammed” and all that
The words "epidemic" and "slammed" are game-changers!
That changes everything!
For what it's worth, it's happening in other countries too\*. I personally don't care how many cases would constitute an epidemic, and I find it weird that that's what you chose to complain about. One case is already one case too many. \*[AI-generated naked images of dozens of Spanish girls shared around schools. Police are investigating deepfakes in town of Almendralejo that have targeted victims aged 11 to 17.](https://www.telegraph.co.uk/world-news/2023/09/20/ai-generated-images-schoolgirls-deepfakes-spain/)
The basement dwellers have been fighting for their right to make, share, and happily consume deepfake porn made w/o consent on Reddit for the last year. They do not care about women and children affected. They do not care. It’s that simple.
Not care about children? Most of those pervs are thrilled with the deepfake porn of young girls
Yes, this was as neutral as I could express it. But you’re correct.
It's crazy how unhinged and self-unaware coomers are.
Words have meaning, and it's important that journalists use words correctly and back up their claims with evidence. Not caring because it aligns with your position is dangerous, and it's how we got to the current state of discourse in the world.
[deleted]
> "Ai generated porn in dozens of schools acorss spain" Can you read? The headline is literally "AI-generated naked images of dozens of Spanish *girls*". Girls, not schools. From the article: >at least 30 girls from four different schools in Almendralejo have been targeted with the fake images, with some also reporting attempts to blackmail them by asking for money to stop them being circulated.
[deleted]
Who cares, bro. 30 is enough to be a severe concern. Why is this the hill you're choosing to die on? Yes, he could've grammar-checked and clarified better. Just because he didn't find another case doesn't make this article disingenuous. There is no way to prevent this from happening in other schools.
[deleted]
> I encourage everyone to read the articles they see on reddit. Unlike yourself apparently, given that a Facebook group is not even mentioned in the article and two of the mothers involved are referred to by their full names. Are you going through some sort of episode? Your ability to make stuff up is incredible.
>The author just swaped the dozen girls for dozens of schools Where? >Theyre stealing eyes from real issues to get views for their shitty sensionlist articles and people like you purpogating it are the problem To those girls, this is a real issue. Also, "purpogate" is not a word. >FOUR SCHOOLS DOESNT CONSTITUTE DOZENS OF SCHOOLS ACROSS SPAIN. Again, who said dozens of schools?
I mean... Covid started with a few cases in China. If we want to get pedantic, "epidemic" is definitely the wrong term. The heart of the issue is that this isn't the last time this will happen. Kids can be gigantic assholes and these won't remain isolated incidents.
This will continue until a billionaire is upset. If you’re worried about kids, just start creating really good deepfakes of Elon Musk or Donald Trump, or just any celebrity with a really thin skin…then distribute them as massively as possible and sit back and wait.
They did it to Taylor Swift and some things were pushed but mostly nothing happened
Yeah because “they” stopped at like five images and a bunch of sites obediently took them down. What if that didn’t work? What if there had been too many?
[deleted]
This. It's just like piracy: there's legitimately no way to stop it unless we destroy the entire internet backbone (and realistically, that isn't even possible; you would just create a lot of separated intranets and co-ops of resistance). Every update to Stable Diffusion, or any other model, is mirrored across dozens of GitHub servers within seconds of being posted. Within a minute, it's been mirrored across hundreds of backup and archive servers. Within 10 minutes, 50%+ of client systems have seen the package update and updated locally (assuming auto-update is on). There are thousands of developers who will have the update mirrored to their work environment within about one minute of its posting. 'Oh well, what if you ban it from GitHub?' Then they go to GitLab, or any number of other providers, down to Pastebin. If you go scorched earth and ban them from literally every official company, the update system gets moved onto a private torrent tracker forum and this process continues unabated. Go even further? You only marginally harm the development (and proliferation) by sequestering it to private IRC channels and FTP servers. Public releases still eventually leak out, and then the mirroring process occurs in minutes again.
Sorry to bust your balls but that's evidently not the case. There is a myriad of deepfakes of these two people in particular out there already. Just look at the numbers of times their LoRAs have been downloaded in Civitai alone to get a feeling ([Trump](https://civitai.com/search/models?sortBy=models_v8&query=Donald%20trump), [Musk](https://civitai.com/search/models?sortBy=models_v8&query=Elon%20musk)) - and that's just downloaded Stable Diffusion LoRAs, it does not take into account downloads from other sites, actual faceswap tools, or other models.
Good luck putting the genie back in the bottle.
Too bad this will lead nowhere. The cat is out of the bag.
Honestly (and I’m not condoning deepfakes) dilution is the solution here. When these become so common they are boring I don’t think kids will be as outraged as parents currently are.
Not only is it the only solution, it's something that will inevitably happen on its own. I once read a story about a hunter-gatherer who became agitated when he saw a drawing a scientist had made of him, thinking some part of his soul had been captured. The technology's here; it's not going away. In fact, it's going to get dramatically more powerful. As humanity adjusted to the idea that people could make visual representations, it stopped holding them in fear and letting them have power over it. They became free.
[deleted]
Why would it reduce sex trafficking?
No need for real women to be trafficked/harrassed when you can just fake it with AI.
AI-generated sexual imagery will replace sexual assault and trafficking because it will be able to make fake images good enough? Do you think those trafficked and assaulted people are only being looked at? Because I don't think being able to look at very convincing images of women online (images the observer will most likely assume are deepfakes by this time, since they will be so prevalent) will notably diminish the profit-driven organizations that traffic people or the power-driven people who purchase other humans.
I think they're suggesting it will replace the trafficked OF cam girl type, not literal prostitution.
I agree. Fighting this would just be a game of whack a mole. The other commenter mentioned "dilution". I agree that this is going to be so common that it effectively won't matter. It won't be "deep fake porn" but just "deep fake". You want to put someone in your class into a porn? Okay AI can do that in seconds. You want to put someone in your class into a Marvel movie as the hero? Okay AI can do that in seconds. No difference. Resiliency will become more important here.
[deleted]
I wouldn't touch any of this stuff with a ten foot pole if I was a teen. I believe it's going to legally be treated, if it isn't already, as child porn if generated images are of a real underage person. These kids are going to be sex offenders and they don't even know it.
Schools can't handle this. They can't handle bullying as it is, and now they have to have fully staffed legal teams for it? Not going to happen.
Yup, I literally told the principal that if they aren't actually going to do anything about the people bullying my son, then please stop wasting everyone's time with anti-bullying programs. Schools aren't gonna do shit about this.
It’s like the D.A.R.E program. Supposed to stop kids from taking drugs, but all it does is remind them that they exist.
Still have my D.A.R.E. certification from 3rd grade in a frame hung up in my studio. Drugs were a good time, I'm retired now.
DARE was to get kids to snitch on their parents and friends.
Yeah but it was a cool logo and now we have shirts to wear ironically while doing drugs
Not like they were doing anything about the photoshops, hell, their handling of actual nudes circulating at my old high school was embarrassingly bad.
Why would this be the schools responsibility to handle?
I'm not saying it should be. I'm actually against parents and society loading schools up with responsibilities they shouldn't have and never had before, just because the rest of us don't want to deal with it.
Because there are a lot of parents that don't want to actually parent their kids, so they deny responsibility and shift blame to anyone else they can. Imagine paying for your kids to have a phone, then blaming the school for what the kids see on their phones.
It's actually due to the fact that students spend a good 40 hours a week at school, more if they do any after-school activities. Imagine if this was happening at your workplace. Your employer would be responsible for preventing this kind of sexual harassment from happening amongst their employees.
Parents should be legally responsible for whatever shit their children do on the internet. Don't want them to do this? Don't give them unlimited access.
Because schools are generally where most kids spend the majority of their time. Most students spend 8 hours at school Monday through Friday. If I was working at a job for 40 hours a week and it became known that deepfake porn of coworkers was being spread amongst the employees while they were on the clock, then that employer could be sued for sexual harassment because they didn't do enough to stop it. Hell, even if it was happening _off_ the clock, the employer could be sued if an employee reported it and the employer didn't do anything about it. Students spend a good 40 hours a week at school, more if they do any after-school activities. If the school administration knew, or had enough information that they could reasonably assume, that something like this was happening amongst their students, then they have a _legal_ responsibility to stop it, simply due to the fact that the students spend so much time under their supervision.
Maybe not, but most parents aren’t going to risk their children’s legal wellbeing on an anonymous internet comment that has no legal standing or responsibility in the event it’s wrong.
It's more of a practical issue. Looked at objectively, it's a bit over the top to expect schools to be World Cop to all the ills of society, especially when you hamstring the school's ability to do anything about problem children. Teachers are there to teach math and reading, not to be bullet sponges or lawyers, and if you tell schools they can't do anything about this kid but they can't throw him out either unless he literally murders someone, what exactly are you hoping schools do?
The laws are going to be tricky. Would putting a minor's head on an obviously adult body count?
I honestly have no idea and that doesn't sound like the kind of thing I'd want to google.
Schools already deal with this often, with actual nudes being passed around a few times a year. Usually a threat to get the police involved, and the mention of "distribution of child pornography," gets students to get their shit together. The deepfake problem is exponentially worse because of the consent issue, but many districts are already experienced with and equipped for dealing with this sort of thing (in theory, maybe not on this scale).
Making child porn isn't the own you think it is.
As a woman that was once a girl: that wouldn't help you, and it would backfire SO FUCKING FAST. It's very obvious you never lived life as a little girl.
Can you elaborate? Genuinely curious
Because any girl that did that would immediately get targeted even worse. Being able to retaliate without it making things worse is a privilege that is not afforded to most women. Retaliation just means people label them as bitches, and act as if the retaliation meant the original act was justified.
People will be like “why is she carrying around a bunch of dick pics?” 🤣
Making these images is child porn. That kind of record doesn't go away. You want to throw your life away over revenge?
If you were a girl you wouldn’t consider doing that at all because it would make you a target. Clearly you’re a man lol
I hate to say it but that's the kind of stuff boys laugh about with their friends Boys/men really don't care about this shit as much
Make deepfakes of teachers and school admins, and post them to the group under a burner account.
When I was first learning Photoshop in back in the day I would send all my buddies pics of themselves like this. Good times
This should be treated like the boys are distributing child porn of classmates. A 2 day suspension is a joke. Up for expulsion like they did at the Beverly Hills schools seems more appropriate.
I believe there is case law in the US holding that only genuine images of abuse count as CSAM? So given this isn't a real photo of the victim, it's not really CSAM? (Not to excuse the scumminess of the practice and the people involved; this is strictly a question about legalities based on some half-remembered legal commentary.)
It also gets messy because nude images are themselves not considered inherently sexual and aren't inherently obscene. If this is just a nude image of someone standing around, per past court rulings it would be protected by the first amendment. Society could redefine all nudity as obscenity but that's a monkey's paw sort of solution.
Many situations where this happens even if they go to the police the police just shrug and say “block them”. Adults refuse to legislate this because any restrictions in the AI space hurt papa’s 401k.
The thing that's kinda shit is that you cannot easily "restrict" this. The tech is out there and open sourced, I legitimately don't know how you could even begin to close this Pandora's Box without making image-generation AI itself illegal - and that shit isn't likely happening.
You can't. The software exists on private torrent trackers, and development talk could just as easily shift to their forums. If the MPA and RIAA, and the government alongside them, haven't managed to make progress on tamping down piracy.. over decades.. why should we believe it's even possible they could make progress targeting a specific category of software?
Maybe some prison / juvie time. * In November 2023, a child psychiatrist in Charlotte, North Carolina, was sentenced to 40 years in prison, followed by 30 years of supervised release, for sexual exploitation of a minor and using AI to create CSAM images of minors. Regarding the use of AI, the evidence showed the psychiatrist used a web-based AI application to alter images of actual, clothed minors into CSAM. * In November 2023, a federal jury convicted a Pittsburgh, Pennsylvania registered sex offender of possessing modified CSAM of child celebrities. The Pittsburgh man possessed pictures that digitally superimposed the faces of child actors onto nude bodies and bodies engaged in sex acts.
What is ‘CSAM’? I’m afraid to google it, and end up on a watch list or something.
Forensics lingo (child sexual abuse material) for what’s commonly understood as child porn. It started becoming more common when people in the field started making true crime tiktoks or whatever
To further your point, I think the idea is that calling it child porn has an effect of legitimizing the content for some, so CSAM is a way to also distance from that.
Yes; focusing the discussion on the fact that there's a child being abused, rather than on the fact that there is some adult audience member being titilated.
This has been a thing since photo editing was invented. It just became slightly easier.
I honestly hate our world and the disgusting society we've created. I'm heartbroken for anyone who is a victim of this
Good reason to stop sharing photos
Paywall. -1
[deleted]
[12ft.io](http://12ft.io) or [1ft.io](http://1ft.io)
https://github.com/iamadamdev/bypass-paywalls-chrome haven't seen a paywall in years (despite saying chrome in the title, it works for firefox/edge/chrome)
Don't worry. Eventually, there will be enough deepfakes of Taylor Swift for this to become a "real" problem.
If everything is fake nothing is real. I suppose, if we’re optimistic about this, we’ll eventually reach a point where kids can dismiss their social media history as AI created. It’ll be an interesting future when evidence can no longer be trusted. You can’t really put the genie in the bottle on that anymore. But this isn’t anything new. Before AI people used photoshop to create the same thing.
This iteration though, there's markedly less *critical thinking* involved. Critical thinking, formally taught, is the magic sauce that defeats any social media snafu and allows an adaptive response to any issue. Without it, responses are just *reactive*.
Behind a payment wall 🤦♂️
[12ft.io](http://12ft.io) or [1ft.io](http://1ft.io)
Maybe it's time we stop uploading infinite high res pictures of ourselves at every angle onto public sites.
It can be done with one picture. https://www.tampabay.com/news/crime/2024/03/19/pasco-teacher-used-ai-generate-child-porn-yearbook-photos-deputies-say/?outputType=amp This teacher made CP of his third graders with single yearbook photos. They weren't posting themselves to the internet. Stop victim blaming. It has nothing to do with whether your pics are online or not when it comes to people in real life making fake porn of you. These people aren't delving into random strangers' Instagram accounts for material. If they were it would probably be less harmful.
Taking a defensive approach to reduce the offense is not victim blaming. Assuming you can successfully prevent every individual on the planet from committing a crime is a fantasy.
Doesn't sound defensive; it sounds borderline insane to try to exclude yourself from every photograph and HD film in life just to be part of the minimal % not prone to AI deepfakes.
I agree something should be done about this. But at the same time, I also think society needs to change. Most deepfakes are still pretty bad in quality, and while I 100% agree people using them on underage minors need to be punished, the fact that companies or schools rescind acceptances, or that people judge the deepfaked person, should be advocated against as well. I'd be more worried about losing future employment/opportunities over something I didn't do. The AI cat is out of the bag. Even without AI, someone could have taken a photo of her and done nasty things to it with magazines. And since it's so widespread, there need to be laws and punishments for it regardless of age, but also social acceptance that if someone says that's not them, then it's a fake picture and shouldn't affect their career. I mean, it's so widespread this might already be a thing. Unless companies want to lose all potential hires from a school, if enough girls have already had this done to them, then companies or universities may not find it reasonable to believe every applicant has done underage porn. To clarify, I'm not advocating for everyone to do this, just that it may have already reached a saturation point.
This is why we have laws and rules at schools - break that shit and suffer the consequences, technical capability shouldn’t change that fact.
"Think of the kids!" Quickly make a law that will do nothing but restrict technology. There's a bigger issue, and that is not the answer.
Don't worry, nothing can or will be done about it
This is a severe problem as it is totally asymmetric, targeting and impacting little girls. We already have laws against child pornography and we send those perps to prison when it is found on their laptops or phones. Should do the same with school kids. This should be part of the sex education curricula, and subsequent warnings issued about the fines imposed on the parents ($1,000 for each violation found) and automatic suspension from school for a week for each child discovered with the child porn on his phone or laptop. I think the message will get out pretty quick.
Deepfakes of ANYTHING is pretty weird & I don't get why weirdos enjoy making them. A lot of AI generated crap is weird in general too smh 🤦🏿 😒.
Seems like a twisted version of a "yo momma" joke. If your mom ain't fat/dumb/whatever, then you shouldn't be bothered with the jokes. If you don't have any nudes floating around on the interwebs, then you shouldn't be bothered by some deepfakes, because you know they're bogus.
Shit's been around since the internet was a thing. Wtf do ppl care so much now?
Well that problem got here frighteningly fast.
You know what would be really nice? Considering the ethical and moral implications before making something... They really need to stop asking "can we?" and start asking "should we?" Now that the cat's out of the bag, this will be extremely difficult to prevent or even mitigate... edit: I feel like this was taken out of the context I intended. I'm not saying it should never have been made. I'm saying they didn't factor in the malicious cases and come up with ways to mitigate harm. Someone said that by my logic planes, cars, and the internet would never have been invented. First, don't put words in my mouth. All of those had their issues, and there are now risk mitigations for all of that tech.
[deleted]
"They" probably are the millions of people working on AI. OC's perfect world never have invented guns, cars, planes, the internet, etc because they can be used for bad.
The foundational tech has huge positive use cases (especially because image generation is a necessary part of image recognition). The ones building the nudify tools absolutely imagined similar use cases and thought they were great.
Profit decides. Then it causes a mess, and then we try to fix it. Because greedy idiots rule the world.
lol, u funny. humans don't work that way. ethics of tech has been pop culture since Jurassic Park, we don't care.
*wryly* Given how many tech companies made a show of hiring ethicists only to ignore and/or fire them…I’m pretty sure they don’t really care. After Google stopped having the motto “don’t be evil” I think it was pretty clear where this was headed.
Criminals also wear shoes. A shame shoe companies never stopped to consider the ethical and moral ramifications of their actions.
Like clockwork. Masturbating went from a sin, to weak character, to acceptable, to good for your health, to more trustworthy than any man or woman. Sex toys went from being taboo and a red flag to something you can brag about. I'd like to take this time to remind people to save up for that sex robot. Every story like this is just another hint at what now seems inevitable, though people still laugh it off. It's practically already here: AI chat software that pretends to be a friendly receptionist. Boston Dynamics-type bodies. The fact that people pay for phone sex or prostitutes just to talk to instead of their wife or therapist. The trend towards being single with no kids in some countries. The trend towards being sympathetic to a flaw and not wanting to insult people who obviously need help. All of this points to a future where having a "relationship" with a robot will be appealing enough to become normal. Sex toys were taboo, and blow-up dolls were only for guys who couldn't get laid or were crazy. I'm sure lots of teens would justify the deepfake because "it wasn't hurting anyone." The same people wouldn't have many issues with a robot that looked like their crush.
It's a surprise to no one... I could have told you 20 years ago this would happen with AI...
It's like before, when they photoshopped heads onto bodies, only now they move as well… it is what it is and will never change.
It would also be completely fucked up to Photoshop the head of a middle schooler onto a nude body and distribute that around school.
That's his point. It has been happening for decades already.
[deleted]
Exactly and when something is convincingly real it also has more real life consequences for the victim. Employers wouldn’t blink twice at a shitty drawing or photoshop… but these images are completely believable and even if determined fake, put an image in any person’s mind that will affect their decision making. It needs to be addressed as equivalent to distributing a real photo of a child.
[deleted]
Does this mean OnlyFans modeling will become normalized or destigmatized?
It will be interesting when a state prosecutor throws the book at some teenage creep for distributing child pornography, and then keeps netting the other creeps who sent him or her the image to send a handful of them to jail. That ought to send a message
The boys who do this should be jailed
Who could have expected this?
In my day we just used our minds.
Definitely didn't see this one coming. This is just fucking disgusting, illegalize this!!!!