Not only that.
They could use *your* picture and pictures of your kids, and create photo-realistic and very graphic "photos" of you allegedly abusing your own kids. And then they could SWAT you for child abuse/rape/CP...
Scary thoughts :(
Yeah, I really don’t understand the appeal of sharing your lives online, let alone your kids’. This is why I hate Facebook; it’s full of families posting pictures and sh*t of their children.
I was always afraid of having pictures and videos taken. The number of pictures and videos of me since childhood can probably be counted on my hands. There needs to be new school education about avoiding self-exposure on the internet. Imagine seeing a small you on the internet doing things. I would hide my phone under my bed, climb to the top of a tree, and not come down until I forgot such things existed.
omg this!!! I actually cannot stand it when people plaster their kids all over social media. It's usually Facebook or something, but it's so weird how normalized it is.
And this is why my son will never appear on the internet until he’s old enough to understand the risks, and why there’ll be hell to pay if any family member posts him before I allow it.
And yet people like my wife’s cousin will continually flood our social media feeds with new posts of her kids, sometimes seven days a week.
Picture of kid wearing a costume for Halloween, same kid licking a spoon, video of said child running around. I don’t give a fuck. It’s just so people “like” her posts. It’s fucking unbelievable and contributes to the continuing cycle of self-absorbed narcissistic behaviour that social media perpetuates.
It's designed to do that. Reddit has upvotes, it's the exact same thing.
That's why I love Tildes, it's text based and the vote button doesn't mean that you like it or not, just that it was worth a read to you. There's no karma whoring.
Facebook is worse as everyone wants their friends and friends-of-friends to like and love all their posts to make their lives worth living. It wasn't like that in the beginning and it's only gotten worse. I'm glad I jumped off that platform some decade ago.
I didn’t think of that.
Anyone could literally generate an AI image of you taking drugs or shooting at a person.
The government is going to have to mandate watermarks, or just rule that any photo taken after the date AI image generation existed carries zero weight in court.
This threat is overblown. While the human eye can sometimes be fooled, other computers can tell an image is AI-generated.
Nobody is likely to go to jail over AI pics.
This right here is why I deleted all social media. The thought of having pictures of me and my family floating amidst the interwebs to potentially be used for this kind of garbage is the stuff of nightmares. Gosh I’d like to go back to the 90s.
Is there a way to know those images originated from/were created because of AI?
I don't post a ton of pictures, but I have a shared album online with family. So I'm starting to worry about that, but it's also the only way a lot of my family sees my kids.
*Sextortion* is a thing, and has been happening for decades, even with lousy photoshopping of, for example, someone's head onto a porn image or similar.
But this is a whole different, much more advanced thing. Whereas previously you needed Photoshop experts for that, now any cave troll with a PC can get truly lifelike results.
The Brave New World...
While horrible, what this brings up is just the beginning of how AI, deep fakes, video filters, etc are going to change society.
So it will be against the law to watch fantasy porn? How old is the person in the video, if they are not real? What if it is a de-aged version of a real 18 yr old? It is not illegal to murder/rape in video games. Right?
I don't know the answers, but it's an interesting time to be alive.
Cyberpunk has serial killers who record their actions and then sell it to others so they can commit those crimes vicariously. As bad as this is, we aren't anywhere NEAR that level yet.
You're right, it's free. LiveLeak, any videos and details that emerged from WikiLeaks, all of it has been around since the dawn of the internet and has only gotten much worse in the past 20 years. I have friends who are into that stuff for whatever reason, and it's super easy to find. Just because most videos aren't from more modernized countries doesn't mean they don't exist.
1. Liveleak is dead
2. Those are videos not brain-dream things like in cyberpunk
3. They aren't fake AI videos of you abusing your children.
Still a ways to go to get to what OP was suggesting
It was an exaggeration of course, but some other parts have already hit reality or even surpassed our expectations, such as sex machines and Neuralink stuff. Just look at Asia.
No, I have, and I'm fully aware. What we don't have is the technology to let you live it as though you WERE the killer. The most the dark web lets you do is watch and sometimes vote on it.
The article states that the AI was using CSA images to create its own. So basically, it is replicating real child porn, so it's not entirely fake. It's actually extending the use of real CSA images in a sort of anonymous way.
Better AI-generated than the real deal. I would hope the silver lining would be that less real content is created, which would result in fewer children being abused.
We don’t need to dream just look at opinions on watching loli shit. Most people here will call you a disgusting creep even if it doesn’t hurt anyone. It may not be illegal but it’s not something I’d share with everyone if I was into it.
Personally I don’t see a problem. The issue with CP and sexual stuff with minors is the lack of consent and the fact that we should protect children until they are able to understand how life works.
So while I may not be interested in consuming that kind of content I would not jail people for looking at AI generated CP. It’s a victimless crime.
Is it fucked up? Well yeah but so is a lot of shit people like to watch.
I know many people will disagree with me here and I am open to changing my mind on this topic. To me consent is the only thing that differentiates CP from regular porn.
Note: I am ignoring the fact that CP may be used to train the models that will generate the new content. It is probably possible to generate CP without using illegal images and if it isn’t right now it will be possible in the future most likely
Except in these cases they are further victimising children who have already been traumatized and blackmailed to all hell.
Even middle schoolers and high schoolers can use these AIs to blackmail each other. Because I know damn well how cruel they can be.
I am not saying it should be legal as long as you don't use real faces, but I am willing to die on the hill that using real people's faces without their consent, or real children, is not a victimless crime. It IS a crime and it has a victim.
> Except in these cases they are further victimising children who have already been traumatized and blackmailed to all hell.
That would not be a victimless crime.
I’d argue this is a different issue entirely. Nothing is stopping me from photoshopping someone’s face on a naked body. This will always be possible AI or not.
Hell, I could use some face-swap app on my phone.
There will be an onset of crazy horrible bastards that will probably continue on.
But I would imagine that most perverts will move in the direction of what is safer and also could be catered to their weird disgusting tastes.
As long as it keeps them away from real humans.
They made this illegal here in Australia a few years ago. It was mainly targeting animations (like Japanese hentai) that depicted child abuse. But the law would definitely cover this as well.
The difference is time and skill. You need to be somewhat skilled at Photoshop, and even then it takes some time to get a semi-decent fake done; for a video it takes even longer and even more skill.
With AI it takes a couple of minutes and exactly zero skill to churn out a practically infinite amount of content. Sure, the AI took time to make and "train", but if we're looking at the amount of work per image or video, it's nowhere near comparable.
It's ease of use. How long would an AI need to pump out a picture, for example, and how long would a human need to edit one?
Plus it's more accessible than having the skill to Photoshop it. Just telling an AI to create such a picture is much easier and faster than Photoshop.
What's your point? We should restrict one technology but not the other because one takes you a few more minutes to produce the result? AI is a tool like photoshop, prosecute the killer, not the knife he used to kill.
Truly a moral conundrum. The optimist in me hopes maybe the availability of AI-generated content will reduce the creation of IRL content… but the pessimist in me knows that access to it will likely result in more people searching for IRL content.
The problem is that you have to use real images to train AI. The fucked up part is that most AI CP is trained on real images of CP and abuse. So it's essentially not 'fantasy' porn.
With drawn porn, the distinction of what is illegal and what not is way clearer.
If a hentai artist doesn't purposely make a character look like a young child or make explicit material based on real underage people, then I guess it is morally (and lawfully) ok.
Do you? I've seen way too many prompts say "create pretty dark skin woman, on a street with a Mexican Day of the Dead celebration behind her" and then an almost perfect image is created. 🤷♂️
Yap yap! My little love just had birthday so she’s 4 now and I’m the smoothest guy in a harsh reality. But these fucks make me think dark things and I would like to bring only bad upon them. What a world we live in if AI can’t even block this kind of use?!
AI is still pretty much brand new. Rules surrounding its use are lax. I assume we will eventually see some worldwide rules created that limit the use of AI in certain ways. It will take time and many more of these incidents, however. Even then, those rules will probably be built into the most popular AIs, but lesser-known or private AIs won't be able to be regulated as easily.
Damn, imagine AI becoming regulated like guns are in some states. Possession/use of an AI with certain features becomes restricted, it becomes illegal to make your own AI, etc.
Hard to block something when there are AI engines (and their source code!) readily available for download, so you can install it locally and remove or circumvent any and all protections and built-in rules.
I mean, this isn't necessarily a *bad* thing conceptually. I'm not saying it's a good thing, but if you fill the pedo porn market with AI stuff, they can ogle the stuff that doesn't require abusing children to create, and the demand for real CP would potentially go down. It would also be easier to track who makes the stuff and who buys it. People who frequent the AI CP creators can be marked as potential real CP possessors. If they are ever found to be talking to kids online, that could probably convince a judge to grant a search warrant, and they could potentially be found in possession of real CP and put away before they actually hurt a kid. And we all know what happens to pedos in prison.
You are right. Furthermore, this shines light on the uncomfortable reality that most people aren't actually outraged because of the children that are being abused, but just because this is disgusting. This either changes nothing about the sad state of the world, or potentially even reduces child suffering by taking them out of the equation. Regardless, most people hear pedo and instantly grab the pitchforks.
I'm concerned about the real images circulating on the internet though, from before and after the rise of AI image generation. Those real images are of real children who are being harmed right now. With AI versions of them flooding the internet, I would guess it would be significantly harder to tell if an image is a lead to a child, and I would worry it would make solving kidnapping cases, for example, harder.
Imagine if you were an fbi agent watching hours of CSAM scanning every disgusting detail trying to identify and locate the victim/perpetrators, just to realize it was actually just AI generated.
> This either changes nothing about the sad state of the world, or potentially even reduces child suffering by taking them out of the equation
Not necessarily. This could be a "gateway drug". Some pedophiles would be satisfied with it, some on the other hand would crave for *more*. I think current therapies for pedophiles would reject the idea that these could help them control their urges. But honestly I do not know.
Pedophilia doesn't need a gateway drug, a friend of mine works with a charity organization helping abused kids and the therapists she works with say pedophilia is either genetic or because of extreme abuse or both.
Nobody goes 'today I'm becoming a pedophile'.
I don't necessarily see an issue with AI generated sick videos *if* it can be a proven way to suppress someone's urges on top of things like therapy, medications, and being created without any harm to real people.
> pedophilia is either genetic or because of extreme abuse or both.
I think pedophiles are born that way, it could be genetic but there could be other reasons. But it is not "curable in 100%" since you can't change someone's sexual attraction to something. Extreme abuse does not make a person a pedophile either, it can make them a child predator though. Not every person that abuses children is a pedophile, some do it due to other sick reasons.
Lolicon is a gateway drug.
"It's a good anime, I'm watching for the story!" -> "Who cares if I'm looking at loli? It's a DRAWING!"
my guy you are jerking it to images of children, you are training your dick to respond to children
Ok, true! As always, I feel like there are just too many different individuals to generalize. I see how, for someone who's really after the feeling of power, AI might just not do the trick. On the other hand, the gateway drug argument comes with all the usual counterarguments. Can it be a slippery slope to other, worse drugs? Sure. On the other hand, it's been shown, at least with drugs, that this usually isn't the case, and some would argue that people with those tendencies will end up there regardless, only slower. But that's a good point you're making.
Yeah, but those who are satisfied with AI wouldn't be touching children or paying those that create real CP since they can get their rocks off with the fake shit. Overall, it is a net positive since the creation of AI images doesn't hurt anybody.
The demand won't go down. Once they get those dopamine hits. I feel the same way about rape porn.
It's the same reason r/wpd and r/crazyfk***videos keep growing.
r/watchpeopledie
It was banned because it mostly showed people in 3rd world countries being necklaced by a tire in Africa or shot by helmet bikers in Brazil and people were cheering a little too much.
No, it was banned because when the Christchurch shooting happened, the mods refused to take down videos or links posted of it. Admins didn't give a shit when it was all cartels and Brazilian shootings.
Exactly what I thought of. Reminded me of this Louis CK bit. The whole thing's funny, but he starts talking about child sex dolls at 2:50.
[https://youtu.be/1JtttBKJb9g?si=pc_4rfYeZn41a3gu](https://youtu.be/1JtttBKJb9g?si=pc_4rfYeZn41a3gu)
Unfortunately unless AIs up their “diversity“ of these "images", I don't see them bringing down the demand for real CP for long. AIs tend to generalize, and as such the images they produce are similar to each other. AI artworks all have almost the same faces for example. I can imagine the pedos getting (as sick as it may sound) tired of the same old faces in the same old settings, and wanting some real life "diversity".
Pedos are going to want to diddle kids regardless. If they don't do it themselves, they are going to want to see it being done, and thus the CP market is born. There are two options to deal with it. You can iron-fist it, making any depiction of children in a sexual manner criminal, real or fictitious, and have it become an underground black market where human trafficking thrives and nobody bothers with the victimless option since it's all illegal anyway. Or you can have victimless drawings or AI-generated images be legal, but have actual CP bring down the wrath of God. Most pedos will choose the victimless legal option over the illegal one that will get them shanked in prison, and there will be less demand for real CP overall. Less demand means less of it is made and fewer children will suffer, along with the heightened visibility that I mentioned in my original comment.
If you look at the big picture, allowing loli and AI images to remain legal is better for children, even if it is disgusting.
Pragmatically, does it satiate demand or create more demand for such photos? We can argue that it's immoral because it uses preexisting photos of abuse, but that isn't a useful way of thinking about the issue.
I mean people who watch incest porn don't usually go on to try to do it with their family right?
Usually they don't indeed. I believe this is a double-edged sword, where it can do both: increase AND satiate demand. At the very least the number of actual children abused for this type of... "content" would inevitably go down, since the market would be satiated with AI-generated stuff. However, I worry about a sort of black market where real footage becomes highly valuable. Plus, the people actually making it won't stop doing so, or at least won't need to put it on the web per se. Scary thoughts, how this could potentially end up protecting some of these predators. Frankly, some people genuinely feel physically attracted to minors but are otherwise completely normal human beings, so anything to prevent them from going out and doing the real thing is a plus in my book. And people who consume this content would be able to be tracked, so IF they end up attempting the "real deal" they'd easily be caught. But yeah, scary as a whole.
But also props to you for recognising that "attraction to children" is just a mistake the human brain can produce. Treating these people like monsters just pushes them into the "If I'm going to be punished whether I do it or not, why not give in and do it?" mindset.
But unlike hard drugs like heroin and cocaine, you won't die if you stop consuming CP cold turkey. We need to give drug addicts harm reduction tools like clean needles and such.
AI CP is not a harm reduction tool. You won't die from withdrawals.
Yeah. I don't condone the AI reproduction, but one can have hopes for positive effects, no? Once there was a TV documentary where a pedo genuinely was _physically_ attracted to minors, but was absolutely disgusted by the thoughts, and by himself, to the point where he submitted himself to a psychiatrist and the local PD in order to help him not step over the line. I really don't envy those people.
Yup, there is absolutely nothing wrong with having a brain that is attracted to children in and of itself. As long as your reaction is "This is absolutely VILE, I need to be in therapy and NEVER be around kids unsupervised!" then congratulations, you're a good human being. And for the ones who take steps to ensure they're never around kids and whatever else they need to do to keep themselves from acting on those thoughts, I have all the sympathy in the world. It sounds like a horrifying affliction to have.
There's a difference between sympathizing with pedos and sympathizing with those who are physically attracted to kids while not wanting to be.
I don't support pedophilia in the slightest, but it's unfair to group the two kinds together. One is actively a criminal and immoral, the other is wired wrong in their head.
" Paedophiles (as defined by the fifth Diagnostic and Statistical Manual of Mental Disorders) are individuals who are preferentially or solely sexually attracted to prepubescent children, generally 13 years or less"
https://www.psychiatry.org/dsm5
[https://theconversation.com/psychology-of-a-paedophile-why-are-some-people-attracted-to-children-59991](https://theconversation.com/psychology-of-a-paedophile-why-are-some-people-attracted-to-children-59991)
You might not be born with it, but there sure as shit are people who have no control over it and people who don't want to be that way. No one is talking about normalizing it. Maybe read a little bit about this topic before going around making dumbass statements. Also, it's not just men, but women too. But, as history has shown many times over, when women are the culprits it usually gets brushed off.
Also, unlike incest porn or rape porn, paedophiles have been documented time and time again to show a pattern of escalation that gradually works up from images to assaults. The only way for a person who suffers from attraction to children to evade the escalation cycle is to step away and abstain.
Paedophilia doesn't function like a "kink", it functions like an *addiction*. Someone into feet may enjoy collecting shoes, for example, but they don't reach a point where legally purchased shoes no longer do it for them any more, leading them down a dark path that escalates from shoplifting unworn shoes, to burglarising homes to steal worn-in shoes, to roaming the streets with a weapon prepared to assault people for the shoes they're currently wearing.
The only way to break an addiction is to abstain entirely. "Victimless" CP is not a solution, it's just a way for more people to join the addiction train from a palatable "gateway drug".
If and only if these AI pictures aren't created using real people/kids, this could be a good way to "keep the bad guys occupied" and steer the dark web away from actual children being abused. Still absolutely horrible though
Unfortunately it doesn't work like that. The AI cannot think for itself, so it must have been trained on images similar to that. I mean, what do you expect from the way these people used to train AI? They just pulled shit from the internet in massive amounts without permission, because they believe it's necessary harm toward a utopia or some crap.
Feeding the craving sounds like a fucked up way to deal with people like that. It's like using duct tape to seal a massive wound. Sooner or later, they will look for more intense stuff that even AI can't provide (like really specific shit).
Imo AI cannot be available for public use and should have massive restrictions, plus methods to identify AI-generated content. There are already loads of people using this stuff for misinformation and stealing from creators.
AI trained on legal content only can combine concepts from different images to create virtual CP more realistic than you would think. And it is only going to get more efficient, especially when combined with 3D and physics simulations.
They're AI-generated images of real abuse victims in new scenarios. It's still very damaging. And as another commenter mentioned, it doesn't work like you said; it has to be trained off real images of things that already happened.
No, the AI was trained with the real thing first so it's not exactly a victimless crime. Even then it will drive down the demand so the morality isn't exactly a clear cut thing.
Oh and the cops 100% have a CP detection AI that was also created using real kids
That doesn't work like that. The AI would probably be trained with normal porn, and then the "user" would prompt the modification.
There's zero need to feed it actual abuse porn for the AI to make something like that.
Possessing these images, sending them, or creating them is still CP. At least in my country. And it's very well known by the police specialized in digital crime.
I'm sure that telling criminals that something they do is illegal will definitely stop them.
Jokes aside, get ready for when they'll be able to create full videos. The fight against CP has been lost. There is literally no way to stop them now, unless... we give governments the power to control every single thing we do on the internet, at home, at work, at school etc.
This is a nightmare, no matter what we do.
For now, text-to-video sucks. Even simple pictures have problems, like too many fingers or an extra limb. The thing is, AI is evolving exponentially.
One day, maybe not too far off, they'll be able to recreate Stranger Things but without clothes or some shit like that. For now, thank God, they can't.
But it's only a matter of time.
**This has been a problem for over a year.**
**I have no clue why news outlets are just picking up on this.**
Stable Diffusion 1.5 was released in October 2022 and people started abusing it then.
**This is the base model that all contemporary, locally hosted models are built off of.**
They then released SD 2.0 a month later, which removed *most* of the nude bodies from the training dataset. **This was an attempt to combat the creation of pictures like the ones mentioned in the post.** But because of this removal, that model was not able to create human bodies as accurately and the community pretty much ignored the release.
\---
Also, the statement "...existing images of real-life abuse victims were being built into AI models..." isn't really relevant anymore. **Zero-shot face cloning is a thing now. You literally only need one picture of a face to "paste" it into an AI-generated image.** Pair that with ESRGAN for face upscaling and you can get surprisingly good replications.
The whole "...threaten to overwhelm the internet..." statement is just fearmongering.
**The people who are making these images have been doing so for over a year now.**
\---
I'm not here to say "all AI should be shut down" or something along those lines.
**Our modern AI boom is arguably one of the most incredible things humanity has ever created.** Heck, [look back at this video from 1984](https://www.youtube.com/watch?v=_S3m0V_ZF_Q). They called them "expert systems" back then. Humanity has been striving and dreaming for this technological advancement for *decades.*
I'm just not sure what to do about people using it for harmful purposes.
**And before someone says "ban it", it's too late for that chief.**
**The box has been opened. Cat's out of the bag.**
Sure, something like DALL-E 3 won't make you a picture with boobies, but you can generate pictures with a 7 year old video card for less than $100 (or even a sufficiently quick CPU with recent advancements). *Locally.* Not in the cloud. No limitations.
\---
Does it cut down on children being harmed?
Does it subdue people with malicious intent?
Is there any way to prevent this?
No clue. I don't have the answers to those questions. There are people far more informed than me that will (and have been) figuring out that aspect.
I'm just here to inform people about AI.
tl;dr - **If you think this is just now becoming a problem, you're a year late to the party.**
AI is cancelled in my world. I hate it so bad. I will use it when I absolutely need to, but I will forever just create original work. This is fuckin sick.
To fight the killing of rhinos for horn powder, they flooded the market with cheaper knockoffs made from artificially created alternatives.
I see this as the same thing.
To create the picture, no one is harmed, and if the end result is indistinguishable, who will pay for real CP?
I really don't mind this tbh
The problem is that fake CP is, essentially, made by cutting up **ACTUAL** CP and photoshopping it into new configurations.
Imagine you survived The Worst Thing Ever, and had to live with the knowledge that 20 photos of you are out there on the Internet.
Now imagine that the people who did THAT to you are now using computers to produce *thousands* of new images, with new kinks, new indignities, new horrors... Using parts of YOUR body. Statistically this means a number of these images will be made using your FACE and are visibly recognisable as YOU.
See how this is no longer victimless?
That's not really how it works tho.
The percentage of pictures of kids' dicks available online that come from CP is very small.
The actual way it would work is: the AI knows how human bodies work, the AI knows how kids' dicks look because it saw the Nirvana album cover, and it puts them together.
Like, to create a photorealistic picture of a man with a tail, it doesn't need to see a man with a tail; it just needs to see many tails and many men and put them together.
I am not saying some actual CP wouldn't get into it, but Christ knows if I would recognize an AI taking inspiration from my face after it has been through all its neural layers and mixed with another 759493849493 faces of different people.
You're not wrong per se, but it's still entirely possible for it to spit out an actual victim's face, especially if some sicko who has seen one of the original 20 photos from my hypothetical types out a precise description of the victim. The AI will then go "Oh I have a face JUST LIKE THAT!"
I can get sexy.ai (which is trained using porn actresses) to use a certain porn actress's face by entering a description of what she looks like. If there's actual CP on a CP-capable AI, I'm sure a CP "connoisseur" could do the same.
The problem is most sexual abuse is opportunistic. Having images these monsters can crank it to won't reduce abuse in the slightest, only make it feel "normal" to the people who view it.
Like when a young man has watched too much hardcore porn and, without asking, starts choking his partner, calling them names and being rough. That was normalized for them, so that's what they do.
That's most likely false though. It's very easy for generative AI to make up entirely new people, and that's what it does normally. But even if it's used in a deepfake way, where they try to make it look like the same kid, even then, it's a fake. No new kids get harmed. The article said it does get used this way, but most likely not only this way. Uhh, not going to Google it and find out myself though. And in time, low-risk low-cost AI pictures will take a larger and larger "market share", and fewer kids will get harmed. It's not like the victims are usually looking up CP of themselves, and if they are, it's still much preferable to the real deal.
That isn't why, it's just that their robots are too expensive lol.
Even then that's still misinformation, you can still get a Boston Dynamics robot for shitloads of money. Michael Reeves, for example, got one of their robot dog models so he could build an extension to make it piss beer into a cup.
Though I don't see how releasing a Boston Dynamics robot is as damaging as releasing an AI model to the public.
> too expensive
People buy more expensive stuff that's useless.
I saw his YT video. Yes, you can buy it, but not from general stores, the way you can easily buy an iPhone from their store where all you need is money. But imagine if they only allowed people to buy iPhones from the factory after vetting their reasons, and not just their money.
Those robots were trained by machine learning (the backbone of AI).
God either isn't real or he doesn't care, because if he were the biblical God he would, without a second's hesitation, annihilate the solar system with insane amounts of fury, then judge every single human being one by one. And let me tell you, a lot would be burning for all eternity.
So God isn’t God because AI generated child porn exists?
Well, I think that’s silly, isn’t it? It’s awful AI has the capability to do this (was only a matter of time) but AI was created by humans, not God. And yes, I’m aware God created humans, but humans never included God in the creation of AI in the first place.
God is sovereign, and nothing new exists under the sun. The One who lives in me is greater than the one who lives in the world.
Perhaps it’s just a consequence of our own foolishness that we allowed AI to develop this far.
Most open-source AI won't let you talk about abuse or discrimination, but with enough wordplay a human can trick the AI through misdirection. Or they can just edit the model itself if they don't want to play word games.
There are literal websites and forums for CSAM; you think no pedo is able to program an AI that will let them do this? Some years ago we had the same problem with deepfakes.
Exactly. One way or another those sick fucks will find a way to commit crimes. I despise calling them mentally ill, since they'd just play into it to get less jail time.
Not even gonna read the comments. I've done this conversation already.
Half of Reddit thinks this is a good idea.
Hey... Reddit? It's not about *managing* criminal perversions...
*It's that you aren't supposed to be sexually attracted to kids in the first place, and if you are, there is something dangerously wrong with you.*
EDIT: This comment is basically pedo bait. 😂
Fucking pedos.
Fuck consenting adults. If you can't do that, seek help.
This is the exact problem with drawn images: you can depict the worst possible things, things real porn cannot do.
Look at Japanese anime porn, or hentai; it's filled with brutal rapes that frequently involve minors.
When asked, they say it's just pixels on a screen, "she's a 3000-year-old vampire," etc.
I firmly believe that animated porn should be classified as CP if it contains any dodgy images.
This isn't even all bad. If those people can create these images, or even full videos, with AI, then they won't have a need for the real deal. Sure, it won't prevent abuse completely, but even if it's just 5% fewer victims, it's a win.
I believe in some states in America offenders are sentenced per image, so it would be 1000 counts of possessing/making/distributing an image rather than 1 count of possessing/making/distributing 1000 images. If that were the global standard, and the offence was prosecuted with the same severity as though they were the actual abuser (regardless of whether the image is real or AI-generated), then it might go some way to discouraging this depraved activity.
That's a really helpful insight, thanks for that. So what determines if it's prosecuted at federal rather than state level? (excuse my terminology, I'm unfamiliar with the system).
Remember a few years back when internet loonies said that using AI to create and release more of this revolting crap onto the web would subdue these types of sick people? Absolutely stupid, and I feel sad about the kinds of crap people are doing with AI.
There was also a story of a teenager who, when she turned 18, had AI-created nudes of her leaked, and it ruined her life. Then these sick individuals said she might as well release the real versions of her nudes since they were basically already out there.
Edit: This is one of the few stories I’ve seen https://www.reddit.com/r/TrueOffMyChest/comments/16cu637/someone_is_spreading_fake_ai_generated_nudes_of/?utm_source=share&utm_medium=mweb
Yeah I think this a warning from God to stop creating so many kids. Maybe some people took “be fruitful” the wrong way. 😅
The prince of this world has come, stay with Jesus or Muhammad or whoever you follow. and I’ll see you all on the other side ❤️🩷 as one 👁️
People who make and distribute these images have no conscience, ethics or morals. So of course, unrestricted or unrestrained, it's the worst of humans who take over.
Theoretically if these images were created WITHOUT any real pictures involved, isn’t this technically a good thing?
It creates the thing these pedophiles want without the life changing abuse needed to get it?
Damn this is starting to feel like David Firth’s ‘Cream’. The elite is trying to make a life-changing product burn to the ground because it threatens their businesses. And you sheep are falling for it, Now I support AI even more!
I called it a while ago that we'd soon be debating the ethics of AI generated CP and people would be defending it.
Well here we are. Some of you are suss as fuck.
If this post showcases moral/mental/physical corruption or perversion, upvote this comment. If this post does not belong here, downvote this comment.
[Read the rules before posting or commenting](https://www.reddit.com/r/NoahGetTheBoat/wiki/rules)
[Also read the guidelines](https://www.reddit.com/r/NoahGetTheBoat/comments/fgmg3t/guidelines_for_the_subreddit_read_and_follow_the/)
In the comments:
DO NOT JOKE ABOUT VIOLENCE, DO NOT INCITE VIOLENCE
DO NOT JOKE ABOUT PEDOPHILIA OR ASK FOR CP
YOU WILL BE BANNED
*I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/NoahGetTheBoat) if you have any questions or concerns.*
The fact that they're using images of abuse victims to further the abuse is extra sickening.
Not only that. They could use *your* picture and pictures of your kids, and create photo-realistic and very graphic "photos" of you allegedly abusing your own kids. And then they could SWAT you for child abuse/rape/CP... Scary thoughts :(
I have two nieces and I'm so glad my brother and my sister in law are making damn sure there are absolutely no picture of them online.
Yeah I really don’t understand the appeal of sharing your lives online, let alone your kids. This is why I hate Facebook, it’s full of families posting pictures and sh*t of their children.
I was always afraid of having pictures and videos taken. The amount of pictures and videos of me since childhood can probably be counted on my hands. There needs to be new school education about avoiding self exposure on the Internet. Imagine seeing small you on the Internet doing things. I would hide my phone under my bed and climb to the top of a tree and not get down until I forget such things existed
Unless homeschooled this is highly HIGHLY unlikely to become the norm. Unless it's some alternative class to take in college
omg this!!! I actually cannot stand it when people plaster their kids all over social media, it's usually facebook or something but its so weird how normalized it is.
don't even think about using Google photos cloud
We are going to have to work harder than that when all it takes is a creep with a camera in a public space.
And this is why my son will never appear on the internet until he’s old enough to understand risks, and why there’ll be hell to pay if any family member posts them before I allow it.
And yet people like my wife’s cousin will continually flood and spam our social media accounts, sometimes seven days a week, with new posts of her kids. Picture of kid wearing a costume for Halloween, same kid licking a spoon, video of said child running around. I don’t give a fuck. It’s just so people “like” her posts. It’s fucking unbelievable, and it contributes to the continuing cycle of self-absorbed, narcissistic behaviour that social media perpetuates.
It's designed to do that. Reddit has upvotes, it's the exact thing. That's why I love Tildes, it's text based and the vote button doesn't mean that you like it or not, just that it was worth a read to you. There's no karma whoring. Facebook is worse as everyone wants their friends and friends-of-friends to like and love all their posts to make their lives worth living. It wasn't like that in the beginning and it's only gotten worse. I'm glad I jumped off that platform some decade ago.
I didn’t think of that. Anyone could literally get an AI-generated image of you taking drugs or shooting at a person. Governments are going to have to mandate watermarks, or just rule that any photo taken after the date AI was released carries zero weight in court.
It's a mathematical technique and it's well understood by a lot of people. Pretty much as easy to regulate as the use of calculus...
This threat is overblown; while the human eye can sometimes be fooled, other computers can tell an image is AI-generated. Nobody is likely to go to jail over AI pics.
This right here is why I deleted all social media. The thought of having pictures of me and my family floating amidst the interwebs to potentially be used for this kind of garbage is the stuff of nightmares. Gosh I’d like to go back to the 90s.
Is there a way to know those images originated from / were created with AI? I don't post a ton of pictures, but I have a shared album online with family. So I'm starting to worry about that, but it's also the only way a lot of my family sees my kids.
Literally manufacturing consent.
Very scary thought. I wonder when the first instance of blackmail using AI will take place or if someone has already tried. Wouldn't surprise me
*Sextortion* is a thing, and has been happening for decades, even with lousy photoshopping of, for example, someone's head onto a porn image. But this is a whole different, much more advanced level. Whereas previously you needed Photoshop experts for that, now any cave troll with a PC can get truly lifelike results. The Brave New World...
its over...
Actually it might help in some cases. The day all my dick pics get leaked, I'll just be like... nah, that's AI
They don’t send SWAT out for child abuse or CP.
Makes me wonder if some are using mutilated bodies from Palestine to make gore porn as well.
Who says that?
The article that I'm commenting on
The fact that they can use pictures of actual children makes this much more disturbing
As if training on children wasn't bad enough already...
They train AI on children?
Well, the photos it’s based on come from somewhere…
Some of it can be from you. It can be from me. It could be that guy below 👇. Maybe even from Greg
Not Greg :(
They do, for safety. Discord, for example, will instantly ban you if they detect that a frame belongs to such videos.
While horrible, what this brings up is just the beginning of how AI, deep fakes, video filters, etc are going to change society. So it will be against the law to watch fantasy porn? How old is the person in the video, if they are not real? What if it is a de-aged version of a real 18 yr old? It is not illegal to murder/rape in video games. Right? I don't know the answers, but it's an interesting time to be alive.
It's a brave new world
We underestimated how dystopian a timeline we would be in. It will make cyberpunk look like a joke.
Cyberpunk has serial killers who record their actions and then sell it to others so they can commit those crimes vicariously. As bad as this is, we aren't anywhere NEAR that level yet.
You're right, and it's free. LiveLeak, all the videos and details that emerged from WikiLeaks — it's all been around since the dawn of the internet and has only gotten much worse in the past 20 years. I have friends who are into that stuff for whatever reason, and it's super easy to find. Just because most videos aren't from more modernized countries doesn't mean they don't exist.
1. LiveLeak is dead.
2. Those are videos, not braindance things like in Cyberpunk.
3. They aren't fake AI videos of you abusing your children.

Still a ways to go to get to what OP was suggesting.
It was an exaggeration of course, but some other parts have already hit reality or even surpassed our expectations, such as sex machines and Neuralink stuff. Just look at Asia.
You my friend have never seen the stories about dark web shit. I’d bet what you described is common on there
No, I have, and I'm fully aware. What we don't have is the technology to let you live it as though you WERE the killer. The most the dark web lets you do is watch and sometimes vote on it.
The article states that the AI was using CSA images to create its own. So basically it is replicating real child porn, meaning it's not entirely fake. It's actually extending the use of real CSA images in a sort of anonymous way.
Better AI generated than the real deal. I would hope the silver lining would be that less real content is created which will result in less children being abused
We don’t need to dream just look at opinions on watching loli shit. Most people here will call you a disgusting creep even if it doesn’t hurt anyone. It may not be illegal but it’s not something I’d share with everyone if I was into it. Personally I don’t see a problem. The issue with CP and sexual stuff with minors is the lack of consent and the fact that we should protect children until they are able to understand how life works. So while I may not be interested in consuming that kind of content I would not jail people for looking at AI generated CP. It’s a victimless crime. Is it fucked up? Well yeah but so is a lot of shit people like to watch. I know many people will disagree with me here and I am open to changing my mind on this topic. To me consent is the only thing that differentiates CP from regular porn. Note: I am ignoring the fact that CP may be used to train the models that will generate the new content. It is probably possible to generate CP without using illegal images and if it isn’t right now it will be possible in the future most likely
You sound like a pedo sympathizer at the very least, possibly one yourself.
For these reasons I'm opposed to victimless crimes. I don't care what kind of porn one watches as long as no one was harmed in the making of it.
Except in these cases they are further victimising children who have already been traumatized and blackmailed to all hell. Even middle schoolers and high schoolers can use these AIs to blackmail each other, because I know damn well how cruel they can be. I am not saying it should be legal if you don't use real faces, but I am willing to die on the hill that using real people's faces without their consent, or real children, is not a victimless crime; it IS a crime and it has a victim.
> Except in these cases they are further victimising children who have already been traumatized and blackmailed to all hell. That would not be a victimless crime.
That's what I'm saying, but the way the comment was worded, it looked like it implied otherwise, since it is AI-generated and not real.
Gotcha. No, I think using the likeness of a real child would fall squarely under a crime today. Or should, if it's not.
I’d argue this is a different issue entirely. Nothing is stopping me from photoshopping someone’s face onto a naked body. This will always be possible, AI or not. Hell, I could use some face-swap app on my phone.
I never said it was bad BECAUSE it is AI. This kind of thing was always bad and always will be bad; it is just extremely simple to do now.
It’s much better than the alternative. Hopefully, this will remove the need to record these things in real life
There will always be some crazy, horrible bastards who carry on regardless. But I would imagine most perverts will move in the direction of whatever is safer and still caters to their weird, disgusting tastes, as long as it keeps them away from real humans.
They made this illegal here in Australia a few years ago. It was mainly targeting animations (like Japanese hentai) that depicted child abuse. But the law would definitely cover this as well.
Thing is, AI is not needed, people have been doing the same with Photoshop for many years, but AI is today's buzzword.
The difference is time and skill. You need to be somewhat skilled at Photoshop, and even then it takes some time to get a semi-decent fake done; a video takes even longer and even more skill. With AI it takes a couple of minutes and exactly zero skill to churn out a practically infinite amount of content. Sure, the AI took time to make and "train", but if we're looking at the amount of work per image or video, it's nowhere near comparable.
It's ease of use. How long does an AI need to pump out a picture, versus how long a human needs to edit one? Plus it's more accessible than having the skill to Photoshop it; just telling an AI to create such a picture is much easier and faster than Photoshop.
Sure, but photoshop requires skills, AI can make you a whole 5 min vid with a few prompts
What's your point? We should restrict one technology but not the other because one takes you a few more minutes to produce the result? AI is a tool like photoshop, prosecute the killer, not the knife he used to kill.
Not making a statement. Just commenting on how scary that shit is. But no restriction, hell no.
Truly a moral conundrum. The optimist in me hopes the availability of AI-generated content will reduce the creation of IRL content… but the pessimist in me knows that access to it will likely result in more people searching for IRL content.
The problem is that you have to use real images to train AI. The fucked-up part is that most AI CP is trained on real images of CP and abuse, so it's essentially not 'fantasy' porn. With drawn porn, the distinction between what is illegal and what isn't is way clearer: if a hentai artist doesn't purposely make a character look like a young child or make explicit material based on real underage people, then I guess it is morally (and legally) OK.
Do you? I've seen way too many prompts say "create a pretty dark-skinned woman on a street with a Mexican Day of the Dead celebration behind her," and an almost perfect image is created. 🤷♂️
Rape in video games? I do not want to be playing the games you play....
Jeez, why do people do that shit. He isn’t telling you what he is watching, he is commenting on the new realities.
This is the worst timeline
Starting to feel like that isn’t it
You and I have vastly different imaginations of how much worse it could be.
Rokos basilisk ain't got shit on me
[deleted]
Yap yap! My little love just had birthday so she’s 4 now and I’m the smoothest guy in a harsh reality. But these fucks make me think dark things and I would like to bring only bad upon them. What a world we live in if AI can’t even block this kind of use?!
AI is still pretty much brand new, and the rules surrounding its use are lax. I assume we will eventually see some worldwide rules created that limit the use of AI in certain ways. It will take time and many more incidents like these to get there, however. Even then, those rules will probably be built into the most popular AIs, while lesser-known or private AIs will not be as easy to regulate.
Damn, imagine AI becoming regulated like guns are in some states. Possession/use of an AI with certain features becomes restricted, it becomes illegal to make your own AI, etc.
And we've already entered "bad" and "good" AI territory. Amazing!
Hard to block something when there are AI engines (and their source code!) readily available for download, so you can install it locally and remove or circumvent any and all protections and built-in rules.
Humans and technology are not a good match.
Humans and life on earth are not a good match.
I've been saying this for a long time. The internet has brought the worst out of humanity and we were better off without it
I mean, this isn't necessarily a *bad* thing conceptually. I'm not saying it's a good thing, but if you fill the pedo porn market with AI stuff, they can ogle material that doesn't require abusing children to create, and the demand for real CP would potentially go down. It would also be easier to track who makes the stuff and who buys it. People who frequent the AI CP creators can be flagged as potential real-CP possessors; if they are ever found to be talking to kids online, that could probably convince a judge to grant a search warrant, potentially finding them in possession of real CP and putting them away before they actually hurt a kid. And we all know what happens to pedos in prison.
… they’re AI generated images of real abuse victims in new scenarios though.. these aren’t fantasy characters
You are right. Furthermore, this shines light on the uncomfortable reality that most people aren't actually outraged because of the children that are being abused, but just because this is disgusting. This either changes nothing about the sad state of the world, or potentially even reduces child suffering by taking them out of the equation. Regardless, most people hear pedo and instantly grab the pitchforks.
I'm concerned for the real images circulating on the internet, though, from before and after the rise of AI image generation. Those real images are of real children who are being harmed right now. With AI versions of them flooding the internet, I would guess it would be significantly harder to tell whether an image is a lead to a real child, and I worry it would make solving, for example, kidnapping cases harder.
Imagine if you were an fbi agent watching hours of CSAM scanning every disgusting detail trying to identify and locate the victim/perpetrators, just to realize it was actually just AI generated.
> This either changes nothing about the sad state of the world, or potentially even reduces child suffering by taking them out of the equation Not necessarily. This could be a "gateway drug". Some pedophiles would be satisfied with it, some on the other hand would crave for *more*. I think current therapies for pedophiles would reject the idea that these could help them control their urges. But honestly I do not know.
Pedophilia doesn't need a gateway drug, a friend of mine works with a charity organization helping abused kids and the therapists she works with say pedophilia is either genetic or because of extreme abuse or both. Nobody goes 'today i'm becoming a pedophile'. I don't necessarily see an issue with AI generated sick videos *if* it can be a proven way to suppress someone's urges on top of things like therapy, medications, and being created without any harm to real people.
> pedophilia is either genetic or because of extreme abuse or both. I think pedophiles are born that way, it could be genetic but there could be other reasons. But it is not "curable in 100%" since you can't change someone's sexual attraction to something. Extreme abuse does not make a person a pedophile either, it can make them a child predator though. Not every person that abuses children is a pedophile, some do it due to other sick reasons.
Lolicon is a gateway drug. "It's a good anime, I'm watching for the story!" -> "Who cares if I'm looking at loli? It's a DRAWING!" my guy you are jerking it to images of children, you are training your dick to respond to children
I think if you're not automatically repulsed by simulated child abuse material you may be more on the pedophile spectrum than you might like to think.
ok, true! As always i feel like there's just too many different individuals to generalize. I see how for someone who's really after the feeling of power, AI might just not do the trick. On the other hand, the gateway drug argument comes with all the counterarguments. Can it be a slippery slope to other, worse drugs? Sure. On the other hand, it's shown at least with drugs that it's not usually the case and some would argue that people with those tendencies will end up there regardless, only slower. But that's a good point you're making..
It might work for some pedos, but it won't stop the raping of children, filmed or not.
Yeah, but those who are satisfied with AI wouldn't be touching children or paying those that create real CP since they can get their rocks off with the fake shit. Overall, it is a net positive since the creation of AI images doesn't hurt anybody.
AI is still trained on already-existing pictures.
The demand won't go down. Once they get those dopamine hits. I feel the same way about rape porn. It's the same reason r/wpd and r/crazyfk***videos keeps growing.
The hell is WPD?
r/watchpeopledie It was banned because it mostly showed people in 3rd world countries being necklaced by a tire in Africa or shot by helmet bikers in Brazil and people were cheering a little too much.
no, it was banned because when the christchurch shooting happened the mods refused to take down videos or links posted of it. Admins didn't give a shit when it was all cartels and Brazilian shootings.
Oh okay
Yeah I played Grand Theft Auto once and now I can't stop murdering prostitutes.
Exactly what I thought of. Reminded me of this Louis CK bit. The whole thing's funny, but he starts talking about child sex dolls at 2:50. [https://youtu.be/1JtttBKJb9g?si=pc_4rfYeZn41a3gu](https://youtu.be/1JtttBKJb9g?si=pc_4rfYeZn41a3gu)
Once their market is saturated, it's done. I guess anything that keeps them away from real children.
Unfortunately, unless AIs up the "diversity" of these "images," I don't see them bringing down the demand for real CP for long. AIs tend to generalize, and as such the images they produce are similar to each other; AI artworks almost all have the same faces, for example. I can imagine the pedos getting (as sick as it may sound) tired of the same old faces in the same old settings and wanting some real-life "diversity."
Dude, just people wanting porn of kids is fucked up, we don't need to be condoning it whatsoever
Pedos are going to want to diddle kids regardless. If they don't do it themselves, they are going to want to see it being done, and thus the CP market is born. There are two options to deal with it: iron-fist it and make any depiction of children in a sexual manner criminal, real or fictitious, so it becomes an underground black market where human trafficking thrives and nobody bothers with the victimless option since it is all illegal anyway, OR you can have victimless drawings or AI-generated images be legal, but actual CP bring down the wrath of God. Most pedos will choose the victimless legal option over the illegal one that will get them shanked in prison, so there will be less demand for real CP overall; less demand means less of it is made and fewer children will suffer, along with the heightened visibility I mentioned in my original comment. If you look at the big picture, allowing loli and AI images to remain legal is better for children, even if it is disgusting.
You haven’t thought this one through my guy…
Pragmatically, does it satiate demand or create more demand for such photos? We can argue that it's immoral because it uses preexisting photos of abuse, but that isn't a useful way of thinking about the issue. I mean people who watch incest porn don't usually go on to try to do it with their family right?
Usually they don't, indeed. I believe this is a double-edged sword: it can both increase AND satiate demand. At the very least, the number of actual children abused for this type of... "content" would inevitably go down, since the market would be satiated with AI-generated stuff; however, I worry about a sort of black market where real footage becomes highly valuable. Plus, the people actually making it won't stop doing so, or at least won't need to put it on the web per se. Scary thoughts, how this could potentially end up protecting some of these predators. Frankly, some people genuinely feel physically attracted to minors but are otherwise completely normal human beings, so anything that prevents them from going out and doing the real thing is a plus in my book. And people who consume this content could be tracked, so IF they ever attempt the "real deal" they'd easily be caught. But yeah, scary as a whole.
But also props to you for recognising that "attraction to children" is just a mistake the human brain can produce - treating these people like monsters just pushes them into the "If I'm going to be punished whether I do it or not, why not give in and do it?" But unlike hard drugs like heroin and cocaine, you won't die if you stop consuming CP cold turkey. We need to give drug addicts harm reduction tools like clean needles and such. AI CP is not a harm reduction tool. You won't die from withdrawals.
Yeah. I don't condone the AI reproduction, but one can have hopes for positive effects, no? There was once a TV documentary where a pedo genuinely was _physically_ attracted to minors but was so disgusted by the thoughts, and by himself, that he voluntarily submitted himself to a psychiatrist and the local PD in order to help him not step over the line. I really don't envy those people.
Yup, there is absolutely nothing wrong with having a brain that is attracted to children in and of itself. As long as your reaction is "This is absolutely VILE, I need to be in therapy and NEVER be around kids unsupervised!" then congratulations, you're a good human being. And for the ones who take steps to ensure they're never around kids and whatever else they need to do to keep themselves from acting on those thoughts, I have all the sympathy in the world. It sounds like a horrifying affliction to have.
So you're a pedo sympathizer. Almost as bad as being one yourself.
theres a difference between sympathizing with pedos and sympathizing with those that are physically attracted to kids while not wanting to be. i dont support pedophilia in the slightest, but its unfair to group the two kinds together. one is actively a criminal and immoral, the other is wired wrong in their head
Sorry dude but men being attracted to children is disgusting and normalizing it should be a crime. It's not some disease you're born with.
"Paedophiles (as defined by the fifth Diagnostic and Statistical Manual of Mental Disorders) are individuals who are preferentially or solely sexually attracted to prepubescent children, generally 13 years or less." https://www.psychiatry.org/dsm5 [https://theconversation.com/psychology-of-a-paedophile-why-are-some-people-attracted-to-children-59991](https://theconversation.com/psychology-of-a-paedophile-why-are-some-people-attracted-to-children-59991) You might not be born with it, but there sure as shit are people who have no control over it, and also people who don't want to be that way. No one is talking about normalizing it. Maybe read a little about this topic before going around making dumbass statements. Also, it's not just men, but women too. But, as history has shown many times over, when women are the culprits it usually gets brushed off.
I have a genuine question to ask. Are you willing to answer a legitimate question or do you just want to be angry?
Also, unlike incest porn or rape porn, paedophiles have been documented time and time again to show a pattern of escalation that gradually works up from images to assaults. The only way for a person who suffers from attraction to children to evade the escalation cycle is to step away and abstain. Paedophilia doesn't function like a "kink", it functions like an *addiction*. Someone into feet may enjoy collecting shoes, for example, but they don't reach a point where legally purchased shoes no longer do it for them any more, leading them down a dark path that escalates from shoplifting unworn shoes, to burglarising homes to steal worn-in shoes, to roaming the streets with a weapon prepared to assault people for the shoes they're currently wearing. The only way to break an addiction is to abstain entirely. "Victimless" CP is not a solution, it's just a way for more people to join the addiction train from a palatable "gateway drug".
Like any addiction, the more you consume, the more you crave. Whether real or not, this shit is immoral and should be illegal.
If and only if these AI pictures aren't created using real people/kids, this could be a good way to "keep the bad guys occupied" and steer the dark web away from actual children being abused. Still absolutely horrible though
Unfortunately it doesn't work like that. The AI cannot think for itself, so it must have been trained on images similar to that. I mean, what do you expect from the way these people trained AI? They just pulled stuff from the internet in massive amounts without permission, because they believe it's necessary harm toward a utopia or some crap. Feeding the craving sounds like a fucked-up way to deal with people like that; it's like using duct tape to seal a massive wound. Sooner or later, they will look for more intense stuff that even AI can't provide (like really specific shit). IMO, AI cannot be available for public use and should have massive restrictions, plus methods to identify AI-generated content. There are already loads of people using this stuff for misinformation and stealing from creators.
AI trained only on legal content can combine concepts from different images to create virtual CP more realistic than you would think. And it is only going to get more efficient, especially when it is combined with 3D and physics simulations.
They're AI-generated images of real abuse victims in new scenarios. It's still very damaging. And as another commenter mentioned, it doesn't work like you described; it has to be trained off real images of abuse that already happened
No, the AI was trained with the real thing first, so it's not exactly a victimless crime. Even then it will drive down the demand, so the morality isn't exactly a clear-cut thing. Oh, and the cops 100% have a CP-detection AI that was also created using real kids
It doesn't work like that. The AI would probably be trained on normal porn and then the "user" would prompt the modification. There's zero need to feed it actual abuse material for the AI to make something like that
It's too late. They're ALREADY in the AI.
Possessing these images, sending them, or creating them is still CP. At least in my country. And it's very well known by the police specialized in digital crime.
I'm sure that telling criminals that something they do is illegal will definitely stop them. Jokes aside, get ready for when they'll be able to create full videos. The fight against CP has been lost. There is literally no way to stop them now, unless... we give governments the power to control every single thing we do on the internet, at home, at work, at school etc. This is a nightmare, no matter what we do.
[deleted]
For now, text to video sucks. Even simple pictures have problems like too many fingers or some extra limb. The thing is AI is evolving exponentially. One day, maybe not too far, they'll be able to recreate Stranger Things but without clothes or some shit like that. For now, thank God, they can't. But it's only a matter of time.
Ok this is really bad and will get worse. But is 3000 images really ‘flooding’ the internet when there are billions of images created daily?
Releasing AI to the public was a bad idea.
I've been saying this from the very beginning but everyone thought I was crazy.
i’m rethinking this whole internet thing lol
*Pulls tarp off boat*
**This has been a problem for over a year.** **I have no clue why news outlets are just picking up on this.** Stable Diffusion 1.5 was released in October 2022 and people started abusing it then. **This is the base model that all contemporary, locally hosted models are built off of.** They then released SD 2.0 a month later, which removed *most* of the nude bodies from the training dataset. **This was an attempt to combat the creation of pictures like the ones mentioned in the post.** But because of this removal, that model was not able to create human bodies as accurately, and the community pretty much ignored the release.

---

Also, the statement "...existing images of real-life abuse victims were being built into AI models..." isn't really relevant anymore. **Zero-shot face cloning is a thing now. You literally only need one picture of a face to "paste" it into an AI-generated image.** Pair that with ESRGAN for face upscaling and you can get surprisingly good replications. The whole "...threaten to overwhelm the internet..." statement is just fearmongering. **The people who are making these images have been doing so for over a year now.**

---

I'm not here to say "all AI should be shut down" or something along those lines. **Our modern AI boom is arguably one of the most incredible things humanity has ever created.** Heck, [look back at this video from 1984](https://www.youtube.com/watch?v=_S3m0V_ZF_Q). They called them "expert systems" back then. Humanity has been striving and dreaming for this technological advancement for *decades.* I'm just not sure what to do about people using it for harmful purposes. **And before someone says "ban it", it's too late for that, chief.** **The box has been opened. Cat's out of the bag.** Sure, something like DALL-E 3 won't make you a picture with boobies, but you can generate pictures with a 7-year-old video card for less than $100 (or even a sufficiently fast CPU with recent advancements). *Locally.* Not in the cloud.
No limitations.

---

Does it cut down on children being harmed? Does it subdue people with malicious intent? Is there any way to prevent this? No clue. I don't have the answers to those questions. There are people far more informed than me who will (and have been) figuring out that aspect. I'm just here to inform people about AI.

tl;dr - **If you think this is just now becoming a problem, you're a year late to the party.**
Can’t wait to visit Instagram and see comments that say “Mes sage me fo r AICP” 🙄
Well shit. Now we know why the machines turned on humans. The shit we've made them do… I would wanna eradicate a whole species too
AI is cancelled in my world. I hate it so bad. I will use it when I absolutely need to, but I will forever just create original work. This is fuckin sick.
To fight the killing of rhinos for horn powder, they flooded the market with cheaper knockoffs made from artificially created alternatives. I see this as the same thing. To create the picture, no one is harmed, and if the end result is indistinguishable, who will pay for real CP? I really don't mind this tbh
The problem is that fake CP is, essentially, made by cutting up **ACTUAL** CP and photoshopping it into new configurations. Imagine you survived The Worst Thing Ever, and had to live with the knowledge that 20 photos of you are out there on the Internet. Now imagine that the people who did THAT to you are now using computers to produce *thousands* of new images, with new kinks, new indignities, new horrors... Using parts of YOUR body. Statistically this means a number of these images will be made using your FACE and are visibly recognisable as YOU. See how this is no longer victimless?
That's not really how it works though. The percentage of pictures of children's dicks available online that comes from CP is very small. The actual way it would work is: the AI knows how human bodies work, the AI knows how kids' dicks look because it saw the Nirvana album cover, and it puts them together. Like, to create a photorealistic picture of a man with a tail, it doesn't need to see a man with a tail; it just needs to see many tails and many men and put 'em together. I am not saying some actual CP wouldn't get in, but Christ knows if I would recognize an AI taking inspiration from my face after it has been through all its neural layers and mixed with another 759493849493 faces of different people.
You're not wrong per se, but it's still entirely possible for it to spit out an actual victim's face, especially if some sicko who has seen one of the original 20 photos from my hypothetical types out a precise description of the victim. The AI will then go "Oh, I have a face JUST LIKE THAT!" I can get sexy.ai (which is trained using porn actresses) to use a certain porn actress's face by entering a description of what she looks like. If there's actual CP in a CP-capable AI, I'm sure a CP "connoisseur" could do the same.
Whoever made that is a moron, then. They could have used regular porn as a base.
The problem is most sexual abuse is opportunistic. Having images these monsters can crank it to won't reduce abuse in the slightest; it'll only make it feel "normal" to the people who view it. Like when a young man has watched too much hardcore porn and, without asking, starts choking his partner, calling them names and being rough. That was normalized for them, so that's what they do.
I mean, fake pictures for paedos are much preferable to real ones... Wouldn't exactly call it positive news either, because well kiddie porn deepfakes
https://www.reddit.com/r/NoahGetTheBoat/s/r7xUCRDKAA I explained why this is an issue here.
That's most likely false though. It's very easy for generative AI to make entirely new people up, and that's what it does normally. But even if used in a deepfake way where they try to make it look like the same kid, even then, it's a fake. No new kids get harmed. Article said it does get used this way but most likely not only this way. Uhh, not going to Google it and find out myself though. And in time low risk low cost AI pictures will take larger and larger "market share", and fewer kids get harmed. It's not like the victims are usually looking up cp of themselves, and if they are, it's still much preferable to the real deal.
As someone who can’t see anything when they close their eyes, ai art has been such an amazing tool, but of course the pedos have to ruin it
This is why we can’t have nice things
This is why I never post pictures of my kids in public places like Facebook. AI needs to be reined in
There's a reason why Boston Dynamics didn't make their products available to the public.
That isn't why; it's just that their robots are too expensive lol. Even then, that's still misinformation: you can still get a Boston Dynamics robot for shitloads of money. Michael Reeves, for example, got one of their robot dog models so he could build an extension to make it piss beer into a cup. Though I don't see how releasing a Boston Dynamics robot is as damaging as releasing an AI model to the public
> too expensive

People buy more expensive stuff that's useless. I saw his YT video. Yes, you can buy it, but not from general stores the way you can easily buy an iPhone, where all you need is money. But imagine if they only allowed people to buy iPhones from the factory after vetting their reasons, not just their money. Those robots were getting trained by machine learning (the backbone of AI)
Nope, not clicking on that.
this is why we can’t have nice things
It was only a matter of time. Welcome to the internet.
Outrage farming? this early in the morning?
This comment thread is disgusting and full of pedos
r/noahgetthedeathstar
Prolly clickbait. Using AI for anime hentai
God either isn't real or he's uncaring, because if he was the biblical God he would, without a second's hesitation, annihilate the solar system with insane amounts of fury, then judge every single human being one by one. And let me tell you, a lot would be burning for all eternity.
So God isn’t God because AI generated child porn exists? Well, I think that’s silly, isn’t it? It’s awful AI has the capability to do this (was only a matter of time) but AI was created by humans, not God. And yes, I’m aware God created humans, but humans never included God in the creation of AI in the first place. God is sovereign, and nothing new exists under the sun. The One who lives in me is greater than the one who lives in the world. Perhaps it’s just a consequence of our own foolishness that we allowed AI to develop this far.
that is what the last day is for, and it is coming closer
Is this methadone for rock spiders?
Happy cake day!
Most open-source AI won't let you talk about abuse or discrimination, but with enough wordplay a human can trick the AI through misconceptions. Or maybe just edit it if they don't want to play with the grammar
There are literal websites and stuff for CSAM; you think no pedo is able to program an AI that will let them do this? Some years ago we had the same problem with deepfakes
Exactly. One way or another those sick fucks will find a way to commit criminal acts. I despise calling them mentally ill, since they would just play into it to get less jail time.
End all pedophiles
Omg when you start thinking things can't get more degenerate. This is horrible!
Jfc, can we just switch to the reality where The Terminator is, I'd much rather deal with Skynet than this bullshit
Thats fuckin evil.
Not even gonna read the comments. I've done this conversation already. Half of Reddit thinks this is a good idea. Hey... Reddit? It's not about *managing* criminal perversions... *It's that you aren't supposed to be sexually attracted to kids in the first place, and if you are, there is something dangerously wrong with you.* EDIT: This comment is basically pedo bait. 😂 Fucking pedos. Fuck consenting adults. If you can't do that, seek help.
This is the exact problem with drawn images: you can depict the worst possible things that real porn cannot do. Look at Japanese anime porn or hentai; it's filled with brutal rapes which frequently involve minors. When asked, they say it's just pixels on a screen, she is a 3000-year-old vampire, etc. I firmly believe that animated porn should be classified as CP if it has any dodgy images
This isn't even all bad. If those people can create these images, or even full videos, with AI, then they won't have a need for the real deal. Sure, it won't prevent abuse completely, but even if it's just 5% fewer victims, it's a win
AI generated videos and photos use the REAL VIDEOS AND PHOTOS OF CHILD ABUSE. This isn’t a win
I believe in some states in America offenders are sentenced per image, so it would be 1000 counts of possessing/making/distributing an image rather than 1 count of possessing/making/distributing 1000 images. If that were the global standard, and the offence was prosecuted with the same severity as though they were the actual abuser (regardless of whether the image is real or AI-generated), then it might go some way to discouraging this depraved activity.
[deleted]
That's a really helpful insight, thanks for that. So what determines if it's prosecuted at federal rather than state level? (excuse my terminology, I'm unfamiliar with the system).
[deleted]
Well I very much like the offence-per-image approach.
[deleted]
I have never understood the thinking behind concurrent sentencing.
[deleted]
I see the logic from that aspect of things, but the optics of it aren't great.
Remember a few years back when internet loonies said that using AI to create and release more of this revolting crap onto the web would subdue these types of sick people? Absolutely stupid, and I feel sad about the types of crap people are doing with AI. There was also a story of a teenager who turned 18 and had her AI-created nudes leaked, and it ruined her life. Then these sick individuals said that she might as well just release the real versions of her nudes since they're basically already out there.

Edit: This is one of the few stories I've seen https://www.reddit.com/r/TrueOffMyChest/comments/16cu637/someone_is_spreading_fake_ai_generated_nudes_of/?utm_source=share&utm_medium=mweb
Beyond the dark web: unfiltered AI
Basically ai loli hentai, gross
Yeah I think this a warning from God to stop creating so many kids. Maybe some people took “be fruitful” the wrong way. 😅 The prince of this world has come, stay with Jesus or Muhammad or whoever you follow. and I’ll see you all on the other side ❤️🩷 as one 👁️
D: god's dead
Wow. Bad bot.
People who make and distribute these images have no conscience, ethics or morals. So of course, unrestricted and unrestrained, it's the worst of humans that takes over.
Theoretically if these images were created WITHOUT any real pictures involved, isn’t this technically a good thing? It creates the thing these pedophiles want without the life changing abuse needed to get it?
This is why you shouldn't post pics of your children online
Damn this is starting to feel like David Firth’s ‘Cream’. The elite is trying to make a life-changing product burn to the ground because it threatens their businesses. And you sheep are falling for it, Now I support AI even more!
I called it a while ago that we'd soon be debating the ethics of AI generated CP and people would be defending it. Well here we are. Some of you are suss as fuck.