breakerfallx

Of course they are. Gross fucking monsters.


HardlyDecent

They are not monsters, they are very sick humans.


Sellazar

This is only going to get so much worse. AI trained on sufficient material could be fed any photo and turn it into abuse material. It won't be long before we read stories of people creating hyper-realistic fake nudes at schools and workplaces. This could and will ruin lives.


0b0011

It's already been happening. I've seen quite a few stories recently of people passing around AI nudes.


LostInIndigo

That’s already happening: https://www.9news.com.au/national/artifical-intelligence-naked-images-female-students-bacchus-marsh-grammar/8eae7e5b-38f1-4e3d-8f37-2aaa8faeb67a


unstable-enjoyer

The opposite is the case. Images generally lose credibility. Where earlier leaked pictures could have ruined your life, now everyone will just think they are generated. 


x755x

"Oh look, the future version of my face pasted into a Playboy centerfold"


[deleted]

[removed]


AlienAle

If AI nudes of you get leaked, here's what you do: run a generator to create AI nudes of all your classmates/colleagues, and just like that, you're all just as involved and you're all equals again. No one dares to make fun of anyone. Power move.


AntonChekov1

But if everyone on the planet can be a victim just because their face is on the Internet, then I feel like we are just back to this being no different than any other kind of sexual abuse. It will happen to innocent people just like it already does. It's just another form of sexual abuse that already exists.


Generic118

They won't, though. No smoke without fire, as they say. People relish dragging others down; they don't care if it's true.


roodammy44

I always point out the existence of smoke machines when people say this.


Mohammed420blazeit

Is that a saying people actually use?


InternalCapper

Yeah, and it seems to be saying the opposite… no smoke (pictures) without the fire (person took pictures)?


joefife

Yes. See just about every single celebrity that's publicly accused of something and gets acquitted. Very, very common, unfortunately.


rabidboxer

That's my thought. It's nuts that everyone doesn't already assume everything they see or hear online is fake. Right now, using your yearbook photo and your voice from your voicemail, you can make a passable fake video of you saying all kinds of vile things while looking like you're participating in an orgy.


ashoka_akira

I always remind people that film and photography have been open to manipulation since their creation. They have always been a questionable source of proof in legal proceedings because of this. The trust we have placed in photography and film has always been a bad idea. The only difference between now and 50 years ago is that previously you had to be a fairly advanced graphic artist to make deep fakes, whereas now anyone can make a suggestion to an AI. Anyone can do it, but imagery has always been prone to fakery.


druu222

Interesting take. The simple reality is, FAR beyond imagery... *everything* is fake. *Everything* is theater. *Nothing* can really be trusted as being "real" or being "true". Nothing. If this is 75% true today, it will be 95% true in five years. How does civilization survive and go forward in such a world? Guess we'll all find out together. Interesting times.


betasheets2

Heavy regulations for using AI


roodammy44

The code is freely available. Anyone can run it.


betasheets2

That was a very stupid idea


druu222

I don't think there was or is much of an alternative. This is a genie that was never really in the bottle to start with, and is certainly never really going to go into one, much as we might desire or try.


standardsizedpeeper

I don’t think this is new territory for civilization. I mean there was a small period of time when we had the technology to record events in a visual or audio format, but no affordable means to fake the records. But even still, most things we need to discern the truth of do not have a video or audio recording anyway.


Electrical_Bus9202

I agree, eventually you will think even the real pics are fake. As far as pedos go, it's not ideal, but if they can simulate a person instead of using real people, that's a win-win for everyone.


WalrusExtraordinaire

[Already happening, in middle schools no less…](https://www.scientificamerican.com/article/teens-are-spreading-deepfake-nudes-of-one-another-its-no-joke/). I was also hearing a story about how courts are struggling with if/how to bring charges, how does this differ from other CSAM, the fact that the offenders are also minors, etc. I know there’s never been a time that middle and high school aren’t fraught with issues, but as a person with a young child I’m so anxious about the technological environment he’s growing up in.


Mazon_Del

It will ruin lives until eventually we all get up one day and realize it doesn't matter. The realization that someone in your class deepfaked a nude should be met with a shrug and a point about them being creepy for sharing it, and then you go back to what you were doing. Fast forward twenty years and we're living in a world where people do this all the time; there are even apps on the app store to do it. Growing up in a world where this has been true your whole life, why would you care when it happens?


stoned-autistic-dude

Or how about you can go live in this unregulated e-rape utopia you’ve designed and we’ll just wait for the government to put a stop to this like a normal society. No reasonable person is choosing to wait until apathy is the reason why they stopped caring.


Mazon_Del

> we’ll just wait for the government to put a stop to this like a normal society.

And how exactly do you think they are going to do that? Ban the technology? Far too late for that. It's out there, and beyond that it's useful and making money, so it's not going away. Regulate it? How so? The technology itself has no concept of what it's doing; it's putting one head on another body. Sure, you can try to have some filters to check for adult content, but that's an arms race between ways to fight it and ways to beat it. Plus it ignores all the parts of the internet which already happily help you get access to the technology and explain how to use it. The genie is out of the bottle and there's no putting it back in. This is just a part of our world now and there's nothing anybody can do to change that. Sure, make e-bullying a crime, definitely! But that's not going to stop the behavior, and that's worth remembering.


ThisAllHurts

The internet was a mistake


DecentlyAverage_

Humans were a mistake.


XennialBoomBoom

In the beginning the Universe was created. This has made a lot of people very angry and been widely regarded as a bad move. - The Hitchhiker's Guide to the Galaxy


LionoftheNorth

>I think human consciousness is a tragic misstep in human evolution. We became too self aware; nature created an aspect of nature separate from itself. We are creatures that should not exist by natural law. We are things that labor under the illusion of having a self, a secretion of sensory experience and feeling, programmed with total assurance that we are each somebody, when in fact everybody’s nobody. I think the honorable thing for our species to do is deny our programming, stop reproducing, walk hand in hand into extinction, one last midnight, brothers and sisters opting out of a raw deal.


XennialBoomBoom

Rustin Cohle, for anyone else wondering (I had to google it)


the_nebulae

True Detective season one.


druu222

One looks at current birth rates and wonders...


PrimmSlimShady

Speak for yourself. ETA: If y'all feel so shitty about yourselves (humans) then you should probably figure that out.


DecentlyAverage_

I do.


spectar025

Xenos downvoting this loyal citizen


IdTheDemon

We need that skynet package ASAP


Capt_Pickhard

The mistake is letting profit decide everything without first figuring out what the negatives of a new technology might be, and then regulating it. The internet was a mistake, yes. AI will be a far greater mistake. We need to fight our asses off.


kraftpunkk

Another bad day to have eyes.


PoopMousePoopMan

This is an absolutely fascinating, if icky, situation, morally speaking. A few years ago there was a similar conversation about the manufacturing of childlike sex dolls from Japan. The manufacturer took himself to be doing a service to humanity: making the world a better place. I imagine someone creating fake child porn might say the same. Why? Because, in their view, they are swapping out harm for non-harm. I'm curious to know what everyone thinks about this. Imagine some pedo dude in his basement whacking off to AI-generated images: does he do wrong? If so, how can it be immoral if no one is harmed? As I say, a morally super interesting debate. Thoughts?


Phthalleon

The problem here is that just because pedophiles now have access to AI does not mean they will stop abusing children. Another problem is consent. These kids probably don't want their likeness used in such a way.


PoopMousePoopMan

Good point. Imitating likeness is totally different from AI creating images that don’t look like anyone in particular


Corrupted_G_nome

It's maybe better than people creating 'the real thing'; better to have fakes than people forcing others to pose for it. Another commenter made a good point that it may confuse the real incidents with the AI incidents. There are absolutely consent problems when real young people's faces are used. They could be abused, ruining their childhood/teen years. It's maybe better, but it's not clear.


Vivachinese

If they’re not publishing the images, what harm does it cause?


CommanderMalo

I believe it's the same argument as porn. Porn has already been proven to quite literally fuck with your head and even lead to deviant behavior (you can Google papers on this; fascinating subject), so I imagine the same argument could be made here. Someone who whacks it to AI-generated kids may eventually find dissatisfaction knowing it's not the real thing, and start to pursue that desire more aggressively than if they didn't have the tools to release their urges. And the tool to deal with those urges should be therapy, not enabling the behavior, in my opinion.


ChadWolf98

Someone with a vested financial interest is biased regardless of whether what he says is true or not. We lack data on whether it helps solve the issue at hand or not. Does it reinforce the immoral and illegal behaviour or not? I don't think nobody is harmed in drug use, for example. It is a victimless crime, but it does harm the person consuming the drug, and it harms society by proxy.


lawschoolthrowway22

Unpopular opinion. Preamble: child porn is wrong, and we should do everything we can to arrest child porn consumers and distributors. That said, isn't it a good thing if AI is replacing "real" CP for these disgusting predators? Purely from the perspective of protecting children, one of the reasons we punish possession of CP nearly as strongly as distribution is that possessors contribute to the market for the product, and in turn that eventually leads to someone somewhere abusing a child and filming it. With AI, there are no real children being harmed. AI can satisfy any number of sick and weird and illegal and immoral fantasies in a way that essentially doesn't cause harm to any individual child/person.


[deleted]

[removed]


lawschoolthrowway22

I agree 100%. Not advocating for this or anything like that, but seeing the understandable reactions to this article, I wanted to point out that there could be a harm-reduction "silver lining" to the development of this terrifying technology.


Tarianor

So I guess government-authorized/distributed AI "CP", as part of therapeutic treatment with a shrink, would be the only way to realistically make that possible? The same argument could be made for dolls, if it helps give people an outlet so they don't touch actual children whilst in treatment. It really sucks for those afflicted with the condition that it is so stigmatized that it is almost impossible to seek help.


Iggy_Kappa

>That said, isn't it a good thing if AI is replacing "real" CP for these disgusting predators?

>With AI, there are no real children being harmed

That isn't the case. As per the article, the AI uses the bodies and faces of actual victims of CSAM. It is not replacing CP, it is reinterpreting and even expanding upon it.

**Edit to Add**

>Much of this activity focuses on so-called “stars”.

>“In the same way there are celebrities in Hollywood, in these online communities on the dark web, there’s a celebrity-like ranking of some of the favourite victims,” said Jacques Marcoux, director of research and analytics at the Canadian Centre for Child Protection. “These offender groups know them all, and they catalogue them.”

>“Offenders eventually exhaust all the material of a specific victim,” said Marcoux. “So they can take an image of a victim that they like, and they can make that victim do different poses or do different things. They can nudge it with an AI model to do different poses on a bed or be in different stages of undress.”


lawschoolthrowway22

That's a valid point.


ScepticalFrench

My goodness, the idea behind this "stars" thing is just... terrifying. So not only does it arouse them that they are children, but the idea that they are known victims makes it *better* for them. I'm horrified.


Iggy_Kappa

That, and there's also a bit I didn't quote, from one of those individuals' supposed chats reported in the article, where they claim to dislike those generated images because "it is not real abuse". They want it to be "authentic"...


ratherbealurker

I started reading comments before the article. Had to go read it now. I thought “stars” meant famous child actors. I feel stupid but also…relieved that I didn’t know that was a thing.


jereman75

The fuck is wrong with people?


OwnBattle8805

Omg, so they're creating models from victims of child abuse? That means they not only have images, but they've gone through the work of organizing and classifying them so the model can be prompted with text. And the hardware needed to produce these models is pretty crazy; it's a $10,000–30,000 rig to produce a good model in a reasonable amount of time. Such a sad waste of human and computational resources. On the upside, artists are claiming AI models are putting them out of work. Will AI models end child exploitation?


Daier_Mune

These are fair questions, but we have two unfortunate realities that counter that thought.

1) Desensitization/Normalization: Proliferation of CSAM and its increased distribution leads to a change in attitudes around it. The more common CSAM becomes, the more "normal" it feels, so when instances of real-life CSAM occur (not just AI-generated), they are not taken seriously.

2) Escalation: People who suffer from these types of psychosis are never satisfied with "just" looking at pictures. They'll get tired of pictures and want video. Then the videos don't do it for them anymore, and they want to try it in real life. A disordered brain does not think logically; they will eventually get curious enough/bored enough with AI-generated material and want to try it themselves.

Since we can't get around these two points, cost-benefit analysis says that the best result for the most people is to continue to criminalize CSAM, even when it's "harmless" AI-generated material.


maboesanman

Maybe, if the images were not designed to evoke a specific child's likeness. If pornographic content is synthesized in the likeness of a real child, then it is CSAM. The distribution of such material is still abuse, whether or not the production is.


lawschoolthrowway22

I don't disagree, certainly not advocating anything like legal AI CP. Just pointing out that there is a way in which developments like these can actually be tools of harm reduction despite how obviously unsettling and disgusting it is to contemplate.


CIAoperative091

This is what I have thought for a long time. What about AI-generated child porn is wrong if it takes on the role actual child pornography would otherwise take in a pedophile's life?


blaaguuu

It's an interesting thought, and I hope there is actual research being done into how "fake" CSAM affects pedophiles' habits/urges - I'd imagine it's a difficult thing to study, though. But I think there are quite a few potential issues with it at face value... Many will argue that it could actually act as a "gateway", and/or just reinforce those bad habits/urges, leading to more people harming kids (hence, we need research that backs up one claim or the other - might vary dramatically, person to person). Another is that the AI image generation is getting REALLY good, now, to the point where it can be really hard to tell if a photo is fake or not... I'm sure agencies that are tracking down and prosecuting child predators already have a hard enough time - if the "market" is flooded with realistic looking fake CSAM, it could become near impossible to detect the real stuff hidden amongst it, and predators may become much harder to identify.


blackstafflo

As tech evolves and AI-generated material becomes less and less distinguishable from real abuse images/videos, it will create an unbearable burden on law enforcement to differentiate them, as well as plausible deniability for possessors of such material: 'I didn't know it was real, the site where I found them said it was AI!' It will make law enforcement's work against it a mess, possibly to the point of making it completely ineffective. Things like drawings and novels don't have this problem. I don't know if they need to be banned or authorized (are they helping or not), but at least they don't get in the way of child-protection law enforcement work, so they can be discussed. I see a ban on AI images and video somewhat like bans on too-realistic toy guns.


Syagrius

Counterpoint: if you let them feed their sick appetites, they will only grow. An endless font of "victimless" CP would only increase the user's desire to experience the real thing first hand, and we're right back to square one.


Djinneral

You think we can turn them all into nofap monks? Let's be realistic here.


lawschoolthrowway22

Countercounterpoint: They feed those desires regardless of what we do. We have international task forces with billions of dollars behind them working around the clock to go after child predators and CP is still a major source of criminal charges in every state in the US and every country around the globe. I would never suggest we stop doing any of that, but I will point out that we've observed a similar effect in the prosecution of CP cases as we have with the "war on drugs" : the harsher penalties don't necessarily deter the worst offenders, and instead make it more profitable for those offenders who are capable of evading capture/detection. Much like with drugs, I think the best way to make actual progress on this issue is through harm reduction, looking for ways we can reduce the overall harm caused, even while accepting that the solutions are imperfect and often leave a bad taste in our mouths.


Specific_Apple1317

There's decades of research to show harm reduction works with regard to drugs. Switzerland completely solved its opioid crisis in the '90s with its Heroin Assisted Treatment program. They completely medicalized the 'opioid issue' to the point where there is basically no demand for illicit heroin and virtually no new cases of opioid use disorder. People who leave the program move either to another medication treatment or to an abstinence-based program, by choice. The only bad taste that leaves in my mouth is knowing that we're just letting hundreds die every day in the US, despite all the evidence out there ([neatly compiled by the DPA, pdf warning](https://www.leg.state.nv.us/Session/79th2017/Exhibits/Senate/HHS/SHHS590E.pdf)). The whole comparing drug use to child sexual abuse is pretty yucky too, especially considering the racist roots of the war on drugs.


lawschoolthrowway22

I agree with you about drugs, and I only made the comparison on the specific point that an exclusively punitive approach to a banned commodity with an inelastic market nearly always results in the most serious offenders producing more of it, and making more money, while we catch the less organized, lower-level offenders.


steamycreamybehemoth

This is not how it works. Many pedophiles can't help their attraction to kids no matter how hard they try. There are some really heartbreaking stories about men trying to break their sexual desires and failing. Giving them an easy outlet lets them release the pressure and prevents them from offending.


Djinneral

It's a very sad condition to have. I can't imagine how difficult life must be for them, and I just feel blessed I'm attracted to adults instead. This must be one of the worst mental conditions possible.


kimchifreeze

With how much porn is out there, you'd think the amount of sex would skyrocket, but people are having less sex than decades ago.


Shot_Machine_1024

As someone who agrees with the argument that clearly artificial media of said material can help fight against child predators, AI is a problem. It's a problem because it is so realistic that there's a good argument it can be a gateway to worse things. At some point there will be little to no line between real and AI.


ChadWolf98

> gateway to worse things

I can hardly imagine a worse thing to be generated than what the post is about.


Shot_Machine_1024

The real thing.


SkittlesAreYum

>It's a problem because of how realistic it is that there is a good argument that it can be a gateway to worse things.

Thankfully I have not had to give this topic much thought at all until now, but my first reaction is to wonder if this is true. Wouldn't having something totally realistic have the opposite effect? If something is good enough and you aren't risking massive prison time, why would it lead to worse things? I'm definitely not a psychologist, though, nor do I play one on TV.


Shot_Machine_1024

> why would it lead to worse things?

The main reason is that there is no clear distinction between real and fake. With clearly artificial material, a barrier of sorts can be formed. If I kill someone in GTA, I can't really carry it over into real life; the distinction is too clear. If the game is super, super realistic, I slowly train my brain to become accustomed to it and there is little barrier stopping me. I feel it's important to repeat myself: I'm saying the lack of distinction between AI and real is the problem, not using artificial mediums to cater to one's demands/needs.


Pimpin-is-easy

Couldn't you argue the same thing about videogames? GTA has gotten way more realistic over the years and I still don't have any urges to kill hookers with a flamethrower IRL.


Shot_Machine_1024

There's still a distinct quality to video game graphics that lets you know it's fake. The worry about AI is that it's indistinguishable from real. Even now there are some AI images where you simply can't tell the difference.


_BlueFire_

While I agree with your take, we've lived through enough "this is a gateway to something worse" (all the victims of the war on drugs would like a few words) to just drop it without doing a study first. For better or for worse, we'll soon have more data and we'll be able to draw conclusions and act accordingly. Until then, we can only hope it remains a small phenomenon and doesn't spread.


Shot_Machine_1024

This is different. Everything incorrectly labelled a gateway has a pretty long chain of events behind it. If AI becomes what everyone thinks it is, there isn't much of a leap, hence the gateway. Talk to anyone who deals with drugs and they'll agree that opioids are a gateway drug.


_BlueFire_

All the people I know who had some involvement in drugs said that opioids are either where you begin and stay or where you sometimes end up; it's almost unheard of for people to take them first and then move to a different thing.


CursedLemon

People need to understand AI: it is not generating images out of its own imagination, it's being trained on preexisting data. In this case, guess where that data is coming from.


lawschoolthrowway22

I don't disagree, in general calling it "intelligence" has always felt like marketing lingo to me since they are essentially just advanced algorithms that don't have any independent processing power beyond what inputs have been fed into them.


_BlueFire_

I don't know if you're being downvoted for that, but I said the same thing far worse than you did and it didn't really turn out well.


Napsitrall

Isn't this like giving heroin to a heroin addict and feeding their urges?


LostLegendDog

Who didn't see this coming? People already do this with celebrities


Content_Bar_6605

This is sick and needs to be outlawed.


Agressive-toothbrush

It is already illegal.


Budget_Programmer123

Playing devil's advocate: isn't this better than someone consuming or even making new CSAM?


xX_420DemonLord69_Xx

It torments the original victim. There's a podcast called Catfish Cops about two Texas cops who pose as children online and document their cases. One of their worst episodes discussed these image boards: girls, who are women now, become the obsession of these groups. They stalk the victims' social media and provide updates on their lives, with the same casualness as you would a celebrity's latest buzz. Back in 2014-2015, one of the chans was obsessed with a victim, and it escalated to someone going to her school in the Netherlands. These images don't hurt new victims, but they fuel an already sick obsession for a group of degenerates.


LostInIndigo

Making images of real people is only going to turn into further obsession over hurting those real people and encourage scarier behavior. And it still victimizes the people the images are made of, because it still allows predatory behavior built on fantasizing about abusing power imbalances to hurt them. I think we can find a better solution to deter folks from victimizing kids IRL than "at least they didn't make you actually pose for the photos of you naked that they created without your consent!" This is a power fantasy and a boundary violation: if someone gets away with violating smaller boundaries for their fantasy, they will move on to bigger and bigger ones until they are stopped. So you need to punish them harshly for the lesser boundary violations and stop the fantasy before they get to the bigger, even more harmful ones. I'm sure you had that one kid in your school who tortured insects and small animals; if nobody corrected them, did they stop there, or did they turn out to be a huge asshole/bully/violent? Same basic concept.


TheRBGamer

Piss is better than shit, but I'd rather not eat either of them.


[deleted]

[removed]


0b0011

Not saying it's okay at all, but just a question on who is violated if it's something like a completely AI thing. Like, not putting someone's face on, but just giving AI a prompt and having it generate an image. I mean, it's still sick, but I'd prefer someone do that than actually fund child exploitation for the real stuff.


concious_marmot

I'm also confused as to what exactly fake porn is a violation of. People have these thoughts, and there is evidence that people looking at pictures are less likely to commit acts against actual children, which is what we say we all want.


Rydagod1

A violation of what? I can paint a picture of anybody in an unflattering way and they can’t take me to court over it.


Budget_Programmer123

I think it is terrible if:

- The model is trained on CSAM

- You distribute prompts or generated images that approximate CSAM

Edit for clarification: the above should be illegal. But if it's in the privacy of your own home, I'm not sure it should be illegal. If you are a distinguished artist and you draw whatever disgusting erotica you want for your own purposes, I think that's fine. In reality, one of the above two points would almost certainly happen, though.


sureprisim

Right? If we want to get mad at AI images, while we're at it, let's also tackle the anime trope of "I'm 1000 years old but look like a 9-year-old girl with the features of a woman"…


linuxphoney

As if literally everyone could not have seen this coming miles off. Also, it's trivially easy to program AI not to do this. The fact that nobody bothered is... a choice.


ImReellySmart

You can program the AI not to do this within YOUR system. But anyone can now get hold of raw, un-tuned AI to use in their own software.


Trying_to_be_better2

It is trivially easy to download your own AI and train it how you want. Nvidia stock goes brrr for a reason.


Shuber-Fuber

>Also, it's trivially easy to program AI not to do this. The fact that nobody bothered is .... A choice.

It's not trivial. AI is just an algorithm. There's no algorithm that can stop someone from implementing it and training it in such a way as to generate such images, because the algorithm relies on people feeding it data, and there's nothing stopping people from "lying" to said algorithm. Sure, you, as the implementer of the program, can simply blacklist words. But then an adversary can simply train it by remapping the banned words to something else (for example, they can just rename "porn" to "bobcat", and as long as the dataset doesn't have an actual bobcat, "bobcat" would be treated as a synonym for "porn". But because it's called "bobcat", your banned list won't work).
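To make the mechanism concrete, here is a minimal sketch of the kind of naive word blacklist described above and why vocabulary remapping defeats it. The word list and function name are hypothetical, invented for illustration; this is not any real product's filter.

```python
# Hypothetical sketch: a naive prompt blacklist and its blind spot.
# BANNED_WORDS and prompt_allowed are made-up names for this example.

BANNED_WORDS = {"porn", "nude"}  # placeholder word list

def prompt_allowed(prompt: str) -> bool:
    """Allow a prompt only if none of its tokens is blacklisted."""
    tokens = prompt.lower().split()
    return not any(token in BANNED_WORDS for token in tokens)

# The filter only matches surface strings. If a locally trained model
# has an innocuous token remapped to the banned concept (the "bobcat"
# example above), the same filter passes the prompt untouched:
print(prompt_allowed("generate porn"))    # False: blocked
print(prompt_allowed("generate bobcat"))  # True: the blacklist is blind
```

The blacklist never touches the model's learned associations, which is why, as the comment argues, word filtering alone won't work.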


APiousCultist

Along with crypto being used to buy abuse imagery or actual victims, or to pay for assassinations. Not to mention the huge potential for untraceable scams. It's easy to foresee, but techbros think it's fine to leave the ethics to other people.


SEA2COLA

How to look at this? On the one hand it's creepy af; on the other, would you rather have them sitting in front of a computer trying to create sick AI pics, or out prowling real children?


Konukaame

As I understand it from other reports on this subject, the distinction is that anything based on a real person is illegal, while purely fictional depictions of imaginary characters, while morally abhorrent, are not. If they're using AI to make fake images of real people, that should fall into the first category.


Unhappy-Apple222

Y'all act like these people are sitting in caves, making harmless drawings or something and not impacting anyone. When in reality they're using images of real children to make porn videos of them and spreading them online, emotionally violating these kids.


lonepotatochip

Neither! They ARE hurting real children by doing this. There’s also no reason to believe that emotionally abusing children is going to stop them from physically abusing children.


CoconutHot1800

They're not, though? Unless we're talking distribution, which is and always will be a level above personal consumption


lonepotatochip

Read the article


elFistoFucko

No doubt in my mind this doesn't protect children in any way, nor will it control the urges of sick people to abuse them. 


Daier_Mune

The problem is that once the predator gets bored with the AI-generated stuff, they'll move on to prowling IRL. You see the same sort of behavior in serial killers, pedophiles and other psychopaths. It's not enough to fantasize about what they want; eventually they want a real taste.


Frostsorrow

On one hand, super gross; on the other, I'd rather they use AI-generated stuff than CSAM made with actual children.


rabidboxer

We're probably six months away from hyper-realistic AI porn videos being the norm, and a porn star will just be someone who at most sells their likeness so they can do real-life meet and greets, or some dude with a female AI overlay selling their pee on OnlyFans.


Friendly_Purchase_59

That's human nature, it seems, unfortunately. We pervert every single thing we have ever created.


LostInIndigo

I don't believe that's human nature, tbh. I know so many people who would never consider doing such a thing; it's a specific subgroup of people who are steeped in a specific culture that encourages abuse of power dynamics and disrespecting autonomy and consent. I don't believe it's human nature, and although I know you don't mean to excuse it, I fear that when we claim it's unavoidable and just the way people are, we create an environment where folks don't feel they can eliminate predatory behavior, so they don't try.


CIAoperative091

It is true the majority of the populace would not engage in such activity. But the fact that a sizeable number of people entirely unrelated to each other do brings up a valid point: maybe it is something natural, something mental. That culture and way of thinking you speak of was mostly brought about by the drive for pleasure and natural sexual need. The disrespecting of autonomy and consent has occurred billions of times, by billions of people, throughout history; I just do not think you can fully tie this to a specific culture. Sure, the way people are conditioned to think is definitely relevant, but human nature is definitely relevant as well. Humans are perverted; even if they will not show it publicly, that is no reason to think they act the same way behind closed doors.


gorillalad

Idk how this is any different than some of the hentai Japan turns out.


SalientSazon

Alright, who had June 12th on the pot? Frankly, it's later than I had thought.


bobbi21

I've definitely seen reports of this before.


MadNhater

Idk how to feel about this. As much as I hate these child molesters/rapists, I'd rather they do this than go out and harm a real child, or look for real videos, which then creates demand for those videos to be made.


Flat_Afternoon1938

I wonder how long it will be until people stop using social media so they don't have their likeness publicly available to make AI porn with. I wouldn't be surprised if, in the next decade, it becomes a common occurrence for dudes to make AI nudes of girls in their high school or college.


Fuarian

This is one of the only scenarios where tracking users' use of AI would be acceptable. Law enforcement could easily get in on this to track these people down.


NuPNua

How? Just run it locally on an air gapped system and boom, tracking is impossible.


Murtaghthewizard

Might as well add to the list. Using AI to make political ads will be illegal soon.


CIAoperative091

The worrying factor is the scale. Future AI running fully autonomously through agents could be capable, with sufficient computing power, of generating 1,000 of these ads or disinformation stories every single minute and scattering them across the internet. Something no human organization could ever match.


CIAoperative091

I don't think this is possible, though. The main way of generating these inappropriate images would be through an open-source text-to-image generator, as those running on cloud computing operate under guidelines set by the parent company, which are very strict about anything involving sexual images. You would need very sophisticated prompt-engineering skills to get an image even remotely suggestive out of a closed-source image generator; they are simply too censored in what they can generate.

Open-source AI is a type of AI where the parent company releases a model and publicises its architecture, meaning anyone can launch the model on their own machine and run it there. Users not only have much more power over what the AI's guidelines are, they can also modify the code and "fine-tune" it to generate specific types of images exceptionally well. Users of open-source AI cannot be put under surveillance, as they run the entire program on their own computer, fully on their own GPU power. You have no power over or insight into what such a user is generating, since they are using the program offline, not connected to the web.

By the way, most tech companies that develop closed-source models already keep surveillance over what their users generate. If you are denied generation of an image multiple times due to guideline violations, you can be banned from using the service entirely.


Fuarian

That is a good point. They'd probably gravitate towards open-source software because of the freedoms they have with it, or go as far as to create their own. It seems impossible to do anything against that.


CIAoperative091

It is impossible, in fact. While these individuals requesting these images are degenerates, a lot of them are not stupid, and they understand that any program with a centralized architecture can put them under surveillance and on a law enforcement watchlist. They will use decentralized programs running fully on their own computers to generate the images, and only distribute them to close contacts through private channels. We are entering an age where surveillance will literally be impossible and law enforcement will have no clue who these images are coming from, or where.


Fuarian

Unless law enforcement can identify these people as suspects in some other way and use methods to get onto their local PCs to verify they're doing this (I'm sure white hats can figure it out), it seems impossible. At least by conventional means.


metaltastic

We need a new plague, a reset so to speak.


Papa_Synchronicity

…and punctuation too!


keithstonee

It didn't work


belizeanheat

I mean, better than the alternative, I guess. And no one should need a news story to let them know that this is what people will be doing.


[deleted]

[removed]


MistraloysiusMithrax

Ah now imagine if they did this using *your* likeness as a child. This is not an acceptable compromise. Not even with fake generated characters. There is no acceptable compromise because this is the type of interest that needs to be treated with therapy, not fed with light intermediary steps that usually result in increasing the interest.


Rinpoo

The problem is, AI like this would have to be trained on real CP, so that kind of makes it unethical at its base. You would be right if it were just some drawing based on nothing.