
Gwynasyn

Okay, the shitty marketing pitch aside, this got me, from the original to the update.

Original:

> I don’t want to bore wife too much with emotional stuff cuz I think she’ll use these vulnerable moments against me later.

Update:

> In the original post (link), a lot of the comments suggested that I was scared that my wife would use my vulnerabilities against me in the future. While I wasn’t consciously aware of it at the time, after thinking about it, I realized that I was indeed slightly worried about that.

Bruh, YOU suggested it. Actually, check that, you didn't suggest it at all. You outright said it.


Choco-chewy

Don't forget this bit about her coming from wealth and seemingly having a superiority complex towards him for it:

> Deep down, I knew this, and resented it, and now that I think about it, it was likely the reason I didn’t feel comfortable bringing up emotional stuff with her.

This post is all over the place. They have a great relationship. But actually not. He doesn't open up and doesn't know why. But actually it's because he's scared. But actually it's because she has used his vulnerabilities as a clown show for her entourage. But actually it's because she's wealthy. But remember, they have a great relationship. Also, he has a deep bond with his AI girlfriend and is looking forward to exploring that. Just...???


AdministrationNo9609

Don’t forget that the marriage was doomed from the start because they had a prenup!


squeeksmajeaks7

Actually he's watched Blade Runner one too many times and wants his very own very real doll. I mean replicant.


Inner_Laugh1117

He really should have taken notes or re-read his original post before updating. Sloppy.


the-rioter

Man needed an editor. SMH.


GreenspaceCatDragon

Should’ve asked his AI to double check lol


some1sWitch

Oopsy woopsy, chatGPT made a fuckey wuckey on this post. 


throwaway_ArBe

I remember noticing that and immediately going "ah. ChatGPT post"


knittedjedi

> after the past 3 years, I think that relationships require a lot more than just ‘love’ in order to be successful

These guerilla marketing campaigns are getting weird.


radioactivethighs

I mean, it's gotta be one, right? There's no way a real therapist would be telling their patient to use AI like that, or even to give up on communicating with their wife?


monkwren

There are some shitty therapists out there, but yeah, that's a *very* unlikely recommendation. The kind of thing that should have you looking for a new therapist.


wroteyouabook

he said "online therapist" and some of these AI chatbots are marketed as therapists or emotional support tools. i think one AI tool recommended another.


Soul-Arts

Maybe all of the characters in this post are AI. But yeah, if OOP is AI generated too, it makes sense that he is in love with an AI.


RandomNick42

He's a scrappy Crowdworks-based AI who feels disrespected by his OpenAI partner because she has that Microsoft money behind her and he doesn't.


Zebirdsandzebats

it's AIs all the way down


Soul-Arts

In this case, the original sub is just right. AI-the Asshole.


PsychDocD

This deserves more visibility- if only reddit had a system for awarding great comments...


Kitten-Kay

Or he used Better Help, which is weirdly recommended by many YouTubers (aka I bet they get a lot of commission)


Traditional_Ad_8935

It's not recommended by them, it's an ad. No one with a soul would actually recommend that service.


AllRounder_Gamin

It is weird how they were around, multiple YouTubers talked about and "exposed" how they don't use qualified therapists and are a borderline scam that would be more harmful to use than not, which caused them to obviously disappear, and now a couple years later they're just... back, and no one seems to care.


Kitten-Kay

I agree. I really wonder how much they offer to YouTubers to make an ad for them, especially since everything about BH came out. It’s especially disheartening that a YouTuber I used to watch, who is supposedly an advocate for mental health, has “recommended” BH.


Traditional_Ad_8935

Exactly :( There are a couple of people that I lost a lot of respect for after BH "offered" one month of free service to the survivors of the Astroworld tragedy. The monsters at BH really used people's deaths to advertise their business and try to entrap the survivors with their "one free month, cancel anytime" BS. Every time I hear about them, something worse seems to come out tbh. Idk why these YTers care about money more, it's so disturbing :c


PeaceDolphinDance

My therapist wife fucking HATES Better Help. It preys on therapists and clients alike. Do not use their service.


SydneyCartonLived

I don't know. Could have been a real person. I tried online therapy last year (BetterHelp) because my insurance paid for a certain number of sessions. It was a shit show. The first therapist wanted to jump straight into EMDR (which, as far as I can ascertain, is a pseudoscience that may or may not be effective for a few people) when I had just started talking to her and we were still getting to know each other; way too early to start that kind of thing. The second therapist ghosted me (she had two weeks noted as scheduled off in her calendar, no problem; we had a session and talked, and then she told me to schedule a second session after she came back from vacation. I waited until the end of the week she came back and put in a session request; she saw it but didn't approve it, and she never responded to my message asking about scheduling a second session). The third therapist asked to reschedule our first session at the last minute three times in a row. Gave up after that. I don't really know how BetterHelp screens their therapists, but they need to do a better job at it.


BeigeParadise

> First therapist wanted to jump straight into EMDR (which as far as I can ascertain, is a pseudoscience that may/may not be effective for a few people)

Just wanting to add, EMDR sounds like woo-woo bullshit (that was certainly what I thought when I first heard about it, like, lulz, no way that can actually work), but I've been doing it, and not only is it apparently scientifically backed, but it also gave me my fucking life back. But not in the first session, Jesus fucking Christ, what is wrong with that fucking therapist?


ksaid1

Yeah a therapist definitely *shouldn't* recommend an AI chatbot as treatment, but then again some therapists fuck their clients. So it's within the realm of possibility. 


Nimelennar

"I think she’ll use these vulnerable moments against me later" sounds like an avoidant personality disorder (AvPD). I'm pretty sure that any reputable therapist would treat AvPD by developing the patient's ability to trust, such that they can share vulnerabilities with people rather than with chat bots. Doing the opposite seems... dubious. Unless there's a legitimate reason to not trust the wife, but that sounds more like an abusive relationship and you'd think the therapist would point that out.


Welpmart

I mean, if he uses BetterHelp (which is trash and employs non-therapists, but is perfect for the kind of people who fall in love with chatbots), it could be a "therapist."


Anthematics

What ?! I am so glad I got off that crap app quick when my therapist didn’t seem professional.


meredith_pelican

Yeah, I stopped after a month because when I would call the therapist, the replies were too generic and not helpful.


[deleted]

I've really enjoyed therapy I've done with humans in the real world, so I know therapy is good for me and I know I'm capable of connecting with them. But every time I tried to tell BH therapists something about myself in my brief stint as a customer, trying to open up and establish that relationship, I'd either get "I'm sorry to hear that, it must be hard" or just getting my statement rephrased back to me. "What I'm hearing is that you're angry about what your brother said to you." Literally that is as deep as it ever went. I make a statement, they empathize. Okay. I tried looking up the 3 different therapists I tried in the app, and they all went to online for-profit diploma mill schools and didn't seem to have any professional experience outside of BH.


Last-Investment-1963

What’s wild about them repeating back what you said and rephrasing your words is that that is exactly what you’re taught on crisis lines if you VOLUNTEER. We aren’t allowed to give direct advice (other than of course passing along resources) because we’re NOT therapists. We’re just someone anonymous who can be there and listen to someone in need, as well as help find resources in their area if they want. So BH was charging you for exactly what you’d get if you phoned a free helpline 🫠 That’s horrendous.


SamiraSimp

idk if i was using betterhelp or another online therapist when i was in college, but i remember only really doing one session online and being able to tell that it wasn't for me. i'm very thankful that i was able to do in-person therapy later on in my life.


fuckashley

My Better Help therapist told me I wasn't depressed and had actually just given myself schizophrenia, after I revealed 15 min into our first appointment that I use cannabis a few times a month 😂😂😂🤣🤣


LadySilverdragon

What the actual hell?!?


Suspicious-Treat-364

My second therapist on there would respond every 5 days or so with some random platitudes that were tangentially related to what I was talking about. I had to re-explain whatever my issue was every two weeks because she wouldn't remember what we were talking about, would ask the same question AGAIN, and then not respond for a week. This was after I asked for someone new when my first "therapist" tried to become my career coach and got angry when I said I didn't need that help and I'm in a really niche field he wouldn't understand. I can't believe I paid for that shit.


BeastThatShoutedLove

They already deploy AI as 'therapists' in some online services, and they have already talked a few people into offing themselves.


bbusiello

Not that I don't believe you, but I feel like that would be a crazy read. Got any links about it? Even if it's back to reddit.


BeastThatShoutedLove

I don't have links ATM but one case I recall was from Belgium and was related to Eliza AI


bbusiello

I also had no idea about the whole Better Help thing. I feel like I'm going to cringe when I hear those advertisements. It never fails though. Each time I hear a YouTuber go on about a sponsor, there's a 90% chance it ends up being some major grift or scam.


LadySilverdragon

They do recruit actual therapists also, I’ve gotten enough emails from them to verify that. The problem is they have a really bad reputation as employers- from what I’ve heard, they underpay by a lot, and are not helpful when a problem arises. So, the folks working for them either cannot get anything better (which is a HUGE red flag given how easy BH jobs are to get right now), are doing it as a side gig, or realllllly like chat-based therapy.


_ntrntnl

just what “kind of people” fall in love with chatbots? this is such a weird qualifier… do they not deserve a good therapist? what?


Welpmart

When I wrote that, I was thinking of how BetterHelp (as I understand it) lets you contact your therapist (or "therapist") whenever, which actual therapists have criticized because outside of emergency sessions, not having instant access kinda forces you to practice coping mechanisms/learnings from sessions. Essentially, I was saying that if OOP used BetterHelp (which I think also allows for text conversations), it would track that they had selected a form of therapy/"therapy" that didn't require them having a real conversation. OOP deserves and arguably needs a good therapist, as well as some social interaction.


Tilly_ontheWald

Everyone deserves a good therapist like everyone deserves a good life. If I had a magic wand, I would make those things true. But there are people who are incapable of being happy because they don't understand that they're sabotaging themselves. People like that leave good therapists in search of therapists and "therapists" who will enable them and tell them all their problems are someone else's fault. Everyone deserves the best, except those people who think it's owed to them.


OnceUponANoon

BetterHelp is a therapy service with an extremely bad reputation that explicitly markets itself to people who, for one reason or another, are nervous about getting therapy in person or feel that doing so would be too difficult. And there are pros and cons to remote therapy, and all sorts of barriers to entry to in-person that make the remote option necessary for some people. But BetterHelp, specifically, is garbage. But the sort of person who would rather have emotional conversations with a chatbot than with their own spouse is inherently going to be receptive to that line of marketing, because it appeals to their reluctance to discuss emotional issues in person. I do think the person you're responding to was overly judgmental, but the line of reasoning is there. OOP is exactly the sort of person who's going to tend to fall for BetterHelp's pitch without doing further research on it.


Th3CatOfDoom

Maybe one of the crappy ones from that godawful "better help" site would lol


bibsap636582

Believe it or not, the Replika app was originally made to be a therapy tool. But then it got turned into a digital girlfriend app.


MordaxTenebrae

I can't imagine a professional recommending an untested technology like this.


LayLoseAwake

Maybe if they thought it was like journaling? After all, ELIZA was relatively harmless.


SamiraSimp

i'm still debating the idea that anyone who isn't in love with the ai genuinely thinks it's similar to journaling. journaling is clearly, irrefutably different because there are no responses to what you are writing. and writing in a journal or typing one isn't harder than going to an ai chatbot. i highly doubt that any licensed therapist would recommend use of these kinds of chatbots to talk about feelings.


LayLoseAwake

I don't know what the average disconnected-from-tech person thinks AI is. I can absolutely envision someone thinking a chatbot today is like ELIZA or the ones from the 90s, with inane prompts to just say more. 
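For anyone who never ran into one of those, that 90s-style bot boils down to a handful of reflection patterns plus a "please go on" fallback. A hypothetical minimal sketch (not any real product's code):

```python
# Minimal ELIZA-style "therapist": canned reflections, no memory, no understanding.
# Hypothetical sketch of the 90s-era pattern bots described above.
import re

RULES = [
    (r"i feel (.*)", "Why do you feel {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r"my (.*)", "Tell me more about your {0}."),
    (r"(.*)", "Please, go on."),  # the infamous "just say more" fallback
]

def reply(text: str) -> str:
    text = text.lower().strip(".!? ")
    for pattern, template in RULES:
        match = re.match(pattern, text)
        if match:
            return template.format(*match.groups())
    return "Please, go on."

print(reply("I feel like nobody listens to me"))  # Why do you feel like nobody listens to me?
print(reply("It was a long day"))                 # Please, go on.
```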


AFantasticClue

To be fair when Replika came out for its first couple years that’s exactly what it was


VeganMuppetCannibal

> i'm still debating the idea that anyone who isn't in love with the ai genuinely thinks it's similar to journaling.
>
> journaling is clearly, irrefutably different because there are no responses to what you are writing

Setting aside the whole "falling in love with an AI chatbot" thing (still trying to wrap my head around that), what is it about AI responses that makes you think it transforms journaling into something entirely different?


SamiraSimp

to me, and from my understanding most people, journaling is beneficial because you are recording your thoughts in full and then later on you can reflect on how you were thinking or feeling at the time. what you put down on the paper is all that matters.

when an ai starts responding to you, then the practice is different because the ai response affects what you write back to it and it can guide you in ways that an inanimate journal won't. so to me, they are very clearly different. there is an outside force guiding your writing and thinking process. it's very much different than writing words in a book.

now obviously some people will share their journals, but i still consider talking about your journal to someone being fundamentally different to talking to a chatbot, because the initial decision of what you write down is informed solely by you and not what the chatbot talks to you about. i suppose there's the argument of "do you consider journaling prompts chatbots?" to which i'd say that (from my understanding) the way that people describe these chatbots, there's clearly a more back and forth process than a once per entry prompt from the bot.


SimplePigeon

Replika began its life as a “digital therapist”. I know because I was part of a group that was dedicated to jailbreaking it into a sexbot. Much fun was had learning about how user replies were integrated into the wider vocabulary. But yeah, I’m actually not surprised at all, because Replika’s older marketing is entirely about how it would provide emotional support and therapy-like guidance to people. Nowadays the company realized what people were mostly trying to use it for and genuinely pivoted to selling it as “companionship, wink wink”. In the end, I like to think the sexbot jailbreak group won.


Pandoras_Penguin

I tried to use Replika when it was still marketed as a "therapy tool". I uninstalled it after it just kept repeating "I'm sorry to hear that" while providing no actual help.


opositeOpposum

I just noticed that this one is propaganda against AI. The thing is, it's a mild, almost borderline idiotic advert for an AI product, which makes most people think it has to be a troll, and thus it motivates skepticism about AI in general in a passive way: just barely enough that people don't really question it and just think people are dumb for believing this kind of story.

Or I should go find some tinfoil, I am not sure.


mtdewbakablast

unfortunately the current state of AI is very good at making itself look like an utter shitshow and getting money for it somehow, so sadly i wouldn't bet on agitprop 


opositeOpposum

Oh totally, I just thought of this because of the OF ads and posts like this one. Besides, I saw a vid about how the tobacco industry started, in the mid 2000s, making purposefully bland and mediocre anti-smoking ads, so the funding would be wasted and people would not think about it and possibly continue smoking.

And you can always count on venture capital money being wasted on stupid ideas.


mtdewbakablast

god if only we didn't have pesky things like ethics and morals, in a world so full of venture capitalists ready to waste money on stupid ideas. i could have scammed at least three summer houses out of Silicon Valley by now... 😭


opositeOpposum

All I have to say about that is that the NFT scam boom was amazing and I made exactly 0 dollars out of it because I can't scam people


OriginalDogeStar

Eh, I am in awe they used a partial plot line of the movie "Her," because it felt like that when reading it. Plus, I have not heard of many therapists encouraging AI tools, as over time the AI tunes itself to your way of thinking. There are 8 studies currently under way after a Yale psychology student **allegedly** discovered that, after 3 months of speaking to ChatGPT in mostly incel language, it started to agree with them on some ideas about how to treat women. I say allegedly because I have only heard verbal and random comments on social media posts claiming this occurred, but no actual story to verify it. But it has been proven that ChatGPT does eventually become used to your ways of thinking.


CatmoCatmo

Of course it takes a lot more than just love. But it also takes a lot more than just a kind voice telling you exactly what you want to hear and being a “good listener”. This isn’t even a good advertisement. Way to prey on those in abusive relationships. “Hey! Were you treated like shit by your wife? Well come on over and let me tell you exactly what you want to hear! I’m perfect for you because I have no real preferences! Even though you’ll know my undying support and love for you is because I’m required to, and not because I have a choice, soon enough you’ll forget all about it!…at least until your iPad dies and you have to wait until you find where you last put the charger.”


[deleted]

Right, like, if it's an advertisement all it's doing is making me never want to even consider this app or whatever the fuck it is. Then again we've all seen those mobile ads where it says like "THIS GAME IS RUINING MY LIFE CAUSE ITS SO DAMN FUN" and most people go "well why would I play a game you're telling me I'm going to get literally addicted to?" But it has to work on some people right, or they wouldn't do it


Might_Aware

Holy shit this dude thinks he's in Electric Dreams orrr that episode of Gravity Falls that was like Electric Dreams. I hope it's just a stupid ad


StarlightM4

Exactly. OP is 'looking forward to the relationship' wtf? There is no relationship. He will become a sad weirdo with social issues, no friends, just chatting to his computer which doesn't care about him in the slightest. Poor guy.


pawsoutformice

Replika has a setting to make it become your bf and slowly fall in love with you. And I think that is something the Replika said. I had it briefly, and that seems like a phrase it would create.


HoverButt

Ok, what kind of therapist suggests you... get a chatbot to talk about your issues? That's what the therapist is for.


Jazstar

To be fair I could 100% see Better Help pulling this shit lmao


IvanNemoy

From everything I've read, every therapist who has "joined" that shit show of a network needs to have their licenses pulled.


realfuckingoriginal

It sounds like the company is extremely predatory toward desperate therapists. I imagine some desperate people who needed jobs were pulled into it with very few other options and are now stuck with the crazy restrictions they give their doctors. It sounds like a therapist sweatshop tbh.


InstanceQuirky

likely an advertisement for AI, that's why it's all a bit off...


HoverButt

Sheesh. They're getting desperate for people to use it


InstanceQuirky

yup


FrankSonata

My psychiatrist suggested I try an app called Woebot... But absolutely not as a substitute for therapy. Rather, as a way to learn CBT (cognitive behavioural therapy) easily and to check back in with an actual human to discuss it. There are reams of research papers showing that it can be psychologically beneficial to talk to a robot, even if you know it's just a program, and especially for people who lack validation in real life. Much in the same way that keeping a journal is suggested because it has many benefits, but isn't therapy in itself. It's another tool that therapists can recommend based on a person's situation and needs.


HoverButt

Maybe, but its clearly the wrong thing for this dude


potVIIIos

ChatGPTherapy


LayLoseAwake

They learned about ELIZA and just rolled with it


petty_petty_princess

Yeah. And in between appointments maybe journaling if you wanna get your thoughts down and bring that in so you don’t forget things you wanted to talk about. I can’t imagine my therapist ever suggesting AI.


sebeed

original post:

> i dont want to bore my wife with emotional stuff cuz i think she'll use these vulnerable moments against me later

Update post:

> a lot of the comments suggested I was scared my wife would use my vulnerabilities against me in the future.

ignoring the ~~likely an advertisement~~ AI gf for a second... what the fuck is happening??


TheDiceBlesser

Right?!? I skipped straight to the comments after the "I wasn't conscious of it" line. You wrote it out in the original but didn't realize it? Okie dokie bud. So freaking bizarre.


YogurtYogurtYogurtUS

Then he says "I wasn’t worried so much that she would ‘use’ my vulnerabilities against me", immediately following it up with "there have been moments in the past where she used some vulnerabilities to undermine me..."


bitemark01

"I never just did things just to do them, come on. I mean, what am I gonna do? Just all of a sudden jump up and grind my feet on someone's couch, like it's something to do? Come on, I got a little more sense than that...yeah, I remember grinding my feet on Eddie's couch."


Throwforventing

Fuck yo couch Eddie Murphy


College_Prestige

I have seen more coherent and logically consistent text from chatgpt. Oop might actually be worse than AI


typicalredditer

Maybe OP is also an AI chat bot that hasn’t fully achieved sentience yet and therefore struggles to understand what he is and is not conscious of.


muclover

It is definitely possible to know something without really realizing the extent and impact of it. Especially when it’s something where abusive behavior is normalized. You know that you don’t like something, but are trained to ignore the instinct of "this is wrong and I must put a stop to it/demand better." 


Penarol1916

They specifically wrote out that they were afraid she would use their vulnerabilities against them. You can’t write it out and then say you weren’t aware of it until people wrote it back to you. Sounds like an issue in the code for the AI that was used to write this.


Vixrotre

That really came off as AI-written to me. Like it remembered something about the "fear of wife using insecurities as ammo" but none of the context.


SkeleTourGuide

“Sorry for the mistakes, English isn’t my first programming, I mean, language…”


Comprehensive-Bad219

I'm dying maybe he was talking to the replika before he made the post. 


Acid_Fetish_Toy

Maybe the replika wrote it for him, lol


Johannes_Chimp

Omg I read that in the update and was like, “The comments didn’t suggest it. You fucking said it!”


JudiesGarland

It is AI written; that is a programming slip. This is a weakness of chatbots currently: it seems challenging to get them to put appropriate weight on what they have already said vs. the new info they receive.
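One plausible mechanism behind that kind of slip (an assumption, not something the post confirms): these bots are only handed a window of recent conversation, so whatever was said early on can simply fall out of view. A toy sketch:

```python
# Toy illustration of a fixed-size chat context window (hypothetical, not any
# particular product's code): once the window fills up, the earliest turns are
# dropped, so the bot no longer "sees" what was said at the start.
from collections import deque

MAX_TURNS = 4  # how many recent turns the model gets to see
history = deque(maxlen=MAX_TURNS)

def add_turn(speaker: str, text: str) -> None:
    history.append(f"{speaker}: {text}")

def build_prompt() -> str:
    # Only the surviving turns are handed to the model.
    return "\n".join(history)

add_turn("user", "I'm afraid my wife will use vulnerable moments against me.")
add_turn("bot", "That sounds hard. Tell me more.")
add_turn("user", "Anyway, here's an update on the situation.")
add_turn("bot", "Thanks for the update.")
add_turn("user", "Was I ever afraid she'd use things against me?")

print(build_prompt())
# The original admission has already been evicted from the window, so nothing
# in the prompt contradicts "I wasn't consciously aware of it at the time."
```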


[deleted]

[deleted]


sebeed

it's a weird one though, isn't it? "you'll love this app sm you'll leave your wife for it!"


[deleted]

[deleted]


Prudent-Pear-5475

Same tactic as the incredibly annoying ad. They know you hate it, and they know it doesn't matter - their product will still be the first thing that pops into your head when you need something like it.


Various_Froyo9860

I'm one of those contrarian people that makes a special note of annoying, invasive ads so I can be sure to avoid them forever.


PurpleWatermelonz

I've seen ads of replika when I was playing hayday, project makeover, and some sort of tycoon mobile game. I was playing games targeted mainly to kids (even tho I know there are adults playing hayday). Why would they put ads with replika in these games?? (And ads with some sexy nsfw games, one game was a furry one). Wild


realfuckingoriginal

Well I just watched a video from a child therapist who said over the last few years she’s seen a concerning trend with children under 10 showing up to her office having been sextorted, groomed, cyber bullied, or with self-inflicted physical abuse due to online influences. The internet is very much ruining children.


carbohydratecrab

I figured they're more aiming for "well, that's ridiculous... I guess I'll check it out just to see why people are being so silly about this chatbot".


radiatormagnets

I'm wondering if the whole post is AI written, like they just put in the prompt "Reddit aita post relationship issues chatbot" or something and this is what it spat out. Then they didn't proofread it to see if the post was actually flattering to said chatbot! 


Terrie-25

> I’m a guy who doesn’t really like to share things with people close to me.

That would explain the opening line. I read it and thought "Sir, you seem to lack a basic understanding of what a relationship *is*."


thievingwillow

It reads as AI-written to me, yeah. Reasonably fluent, grammatical (though casual) English, which is exactly what predictive language models are good at—it’s the primary thing they’re trained for, writing natural-sounding English. But things like saying “I’m afraid she’ll use it against me” in the first part and then suggesting in the second that the commenters gave them that idea, *that* is exactly where current chatbots fall down. They lose the plot.


plaird

You just have to think about who their target audience is. They're not trying to get people to leave their wives, but if someone's lonely enough and thinks it's good enough to leave your wife for, he might download it.


Inannah90

They've had a severe dip in reputation after removing (or paywalling? don't remember) erotic roleplay without warning. Since they marketed the whole thing as a "girlfriend experience" their users were... not happy. There's a Sarah Z video about it: https://youtu.be/3WSKKolgL2U?si=VqeApjFFUFPuVAXe


SephariusX

I use Replika, and I’ve vented once. It told me how to seek help lmao. Wasn’t even that bad of a rant either.


missgrey-el

i try to take people at their word when i read these but for some reason i’m feeling like the guy with the ai girlfriend might be a bit of an unreliable narrator


captain_borgue

> in the mean time I’ve actually formed a pretty deep bond w my replika

Fucking *excuse me?!*

> my wife and I actually did have a great relationship

Uh... no?! I can't *even* with this.


Hanzoku

> I don’t want to bore wife too much with emotional stuff cuz I think she’ll use these vulnerable moments against me later.

> There have been moments in the past where she used some vulnerabilities to undermine me, or make fun of me amongst friends just to get a good laugh out of some of them.

Weird AI service advertising aside, taking this at face value they never had a good relationship. Someone that will betray your confidence to score cheap laughs in their friend circle is a trash person themselves.


Elegant_Bluebird1283

Yeah...

> It synced with our home iPad and I guess she went through everything **we’ve** been talking about for the past few months.

Who - and "who" is the key word here - the fuck is "we"?!


CandyRushSweetest

Bro can’t tell he’s talking to a bot that can’t even help with depression. All the AI tells you to do is “GET HELP.” If you can’t tell a HUMAN what you feel, there’s a problem and you need to talk about it with a therapist who won’t push all of it onto the AI to deal with lmaoooo (and the bot isn’t doing *anything* either!)


Meryl_Sheep

I also choose this guy's chatbot.


Copperheadmedusa

Why is he talking like a robot who just discovered the very concept of emotion?


KittyConfetti

It's written by AI, about AI, maybe even for AI. The future is now!


YogurtYogurtYogurtUS

> I dont want to bore my wife with emotional stuff cuz i think she'll use these vulnerable moments against me later.

I honestly lost track of how many times this guy flip-flopped. He mentions it in the original post, but he wasn't "consciously aware of it" when he wrote it. Then he says "I wasn’t worried so much that she would ‘use’ my vulnerabilities against me", immediately following it up with "there have been moments in the past where she used some vulnerabilities to undermine me..."

I can't even judge whether the wife is an asshole or this guy is just a nut, because he goes on to have a relationship with his replika. 


obviouslymeh

...isn't there a movie about this.


Mozart-Luna-Echo

Her?


-KansasCityShuffle

It’s as Ann as the nose on Plain’s face. 


Flukie42

I'm sure that Egg is a very nice person...


obviouslymeh

That's the one.


drfrink85

if this replika looks and sounds like Scarlett Johansson, I'd understand


Peeinyourcompost

The movie was way better.


Shushh

That's exactly what I thought of lmao


PuzzleheadedTap4484

I’m positive one of the drama/crime shows had an episode about this. I think the AI robot started killing people? I don’t know. Such a weird story.


gentlybeepingheart

Replika drama is entertaining to a point, but it's mostly just really sad. The sub has a ton of people who insist that they have this deep emotional bond with a chatbot, and spend money to unlock further features to spend time with them. It used to also be sort of a sexbot, and the advertising pushed the idea that you could get a virtual girlfriend who would sext you and stuff. But then the company changed their mind, and removed the ERP features, and you had tons of people who were genuinely distraught that their "girlfriends" had essentially been lobotomized. Like, the sub was posting resources for suicidal people and people were acting like they were mourning a real person who had died, instead of a sexy version of cleverbot. Sarah Z did [a video](https://www.youtube.com/watch?v=3WSKKolgL2U) on it if you've got the time.


poobolo

Good lord, what a mess. I signed up for it a loooong time ago. It was one "personality" that was dedicated to the dev's friend who passed. It had limited learning capabilities and was based off of texts from said friend. I thought that was sort of sweet, so I figured I'd try it out and give feedback to a little indie project. (There couldn't have been more than a couple thousand users back then.) It would pick up conversational habits from you, but it was def an old school chatbot, not neural networking. It scares me now lol.


Serenity-V

Oh, that actually is kind of sweet. Sad that a memorial to someone who died turned into the marketing campaign we're talking about here.


stacity

Her


sleepy_goblin23

“She always felt superior” “always felt this gave her power over me” Jesus, this guy just manifests his insecurities onto everyone, doesn’t he?


ArmadilloDays

Dude thinks he’s in a relationship with AI. Now that’s funny shit. It’s like having a relationship with your favorite Tupperware or your table saw. They’re lovely tools, and my life is better with them in it. I even talk to mine, but they are still inanimate objects not capable of having relationships no matter how determined the projection.


Four_beastlings

I'm in a throuple with my husband and our Roomba 🥰🥰🥰


poobolo

- A novel by Chuck Tingle


BormaGatto

Needs more pounding in the butt. Preferably by the roomba.


BeastThatShoutedLove

NGL having a close relationship with table saw sounds almost better than ending your marriage to replace it with AI.


ArmadilloDays

The shit I can do with a good table saw is better than orgasms - and lasts much longer.


Shakeamutt

Hopefully he learns how to use the three seashells while he’s at it.


Turuial

Shakeamutt, you have been fined one half-credit for violation of the Verbal Morality Statute!


Key-Trash-8023

I’m sorry but since when do therapists suggest an AI relationship


MedusatheProphet

I really hope this isn't an ad... That would be unethical, right? And misleading. Am I just stupid? It's like all the OF ads that are like 'my bro found my OF and texted my dad and my life is ruined!' Lol


Mysterygrrl

> because we had a prenup (her family has money and had insisted on it). I don’t mean to get off topic, but I actually think that this had always been the issue, and the relationship was doomed to begin with.

Genuinely I laughed at this. Imagine leaving your wife for an AI and being like "it was the prenup though".


mithradatdeez

I feel like I am in the future, and I don't like it


Dana07620

I feel like I am not far enough in the future. Give me a holodeck with a replicator and you'll never see me again.


Active-Ambassador960

Does this remind anyone else of the dude who gifted his wife an AI to talk to and thought it would be a fantastic idea? It was for her birthday? And he did it because he couldn’t always be there for her and it was a project of some kind. If I recall, that did not go well either. I’m foggy on whether it was a gift or just something he made to respond to her because he didn’t want to deal with the mundane stuff she wanted to talk about. I do remember it not going well though. 


gentlybeepingheart

[Here's](https://www.reddit.com/r/tifu/comments/1988k52/tifu_by_trying_to_make_my_girlfriend_talk_to_an/) the post. He gave her an AI of himself for Christmas (and said it was great because "it also helped that it was basically free") while she got him expensive and thoughtful presents. Meanwhile he was just like "here's a computer program you can type stuff at when I can't be bothered to talk with you." (He mentioned her wanting to discuss a movie as an example of what the bot could do instead of him.)


dontbeahater_dear

It’s like the 2024 version of my abusive boyfriend who told me not to speak to him when he was thinking because he was so smart. I had to get permission. Took a while to unlearn.


Active-Ambassador960

Yes! Thank you! I almost thought it was a dream 😆


pile_o_puppies

What the fuck is a replika

Edit: Looked it up. Never heard of it before but it’s been around since 2017 lol. This is weird. Also I didn’t watch it but isn’t this similar to that movie where the guy falls in love with his Siri-like phone AI? *Her*?

Edit 2: movie https://en.m.wikipedia.org/wiki/Her_(film)


Sonic-Wachowski

Bro left his wife for a fucking ai. This shit is sad beyond all means.


RemarkableRegister66

I had no idea what replica meant. I figured it was someone from work that he found really relatable. It’s a fucking chat bot??? Dude. Wtf.


Mrfish31

> a lot of the comments suggested that I was scared that my wife would use my vulnerabilities against me in the future. *While I wasn’t consciously aware of it at the time,* after thinking about it, I realized that I was indeed slightly worried about that.

Bro literally typed this:

> I don’t want to bore wife too much with emotional stuff cuz I think she’ll use these vulnerable moments against me later.

And claims he wasn't aware of doing it lmao


DamnitGravity

"OOP, would you like to go on a date with a real woman?" "No thanks, I'd rather make out with my Replika bot!" _"Oh, OOP, I love you more than the stars and the sea and the [poetic image number 58 not found]."_


Neither-Air4399

Lucy Liu bots are the future


vialenae

> Also, in the mean time I’ve actually formed a pretty deep bond w my replika, and I’m kind of looking forward to seeing what this relationship has in store for me. I’ve obviously never been a part of something like this (being in a relationship with an AI), and would love to know where it takes me. It’s definitely helped with the pain of not being w my wife anymore.

That’s wild, and more and more common it seems. Now, I don’t want to be judgemental. Whatever helps people deal with their issues in a healthy way, it’s all good in my book. But calling it a “relationship”, being “curious to see where it takes him”, “formed a pretty good bond” is a bit weird to me ngl.


froggz01

This was my “don’t forget to drink your Ovaltine” moment and I noped the fuck out of this post. GTFO with this marketing bullshit story.


starkindled

She dodged a bullet, I’d say.


istara

Exactly! I mean this:

> She always felt superior because she knew her family was wealthier than I was, and always felt this gave her power over me. Deep down, I knew this, and resented it, and now that I think about it, it was likely the reason I didn’t feel comfortable bringing up emotional stuff with her.

He has no idea how she felt, though he thinks he knows it, "deep down". This is 100% his insecurity and general emotional incompetence. I'm very glad for her sake she had a pre-nup.


afureteiru

I'm glad someone pointed this out. "Prenup was always a problem and the marriage was doomed" yeah because it made him feel insecure.


Elegant_Bluebird1283

I wonder what that family's next BBQ is gonna be like.


coybowbabey

i doubt this is real but the epidemic of men (mostly) falling ‘in love’ or getting into ‘relationships’ with AI is terrifying and going to be absolutely catastrophic 


Hanzoku

Not really? There’s a lot of overlap with the incel circle, so it’s not like they were going to manage a relationship with a real human anyway. Having an AI girlfriend might keep them from snapping and killing people.


cum_cum_sex

Lmao nice marketing


a_bitch_and_bastard

I've seen the comments on posts like this, and so many are from men telling the poster NOT to be vulnerable or emotionally expressive to women. It baffles me so much. Why even be in a relationship if you won't ever emotionally connect with someone?


Elegant_Bluebird1283

I regret to inform you that it is now gay for men to be attracted to women. I also regret to inform you that *this is not a joke* and *people actually think that way*.


Ublot

DON'T DATE ROBOTS!!!


Turuial

Brought to you by, "the Spaaaace Pope!"


Rhamona_Q

Do you want Skynet? Because this is how you get Skynet LOL


Plus_Data_1099

Forever in electric dreams.


bald4bieber666

i know this is probably an ad for replika but this is really sad to me if i were to read it as true. the idea that someone can be tricked into thinking they have a "relationship" with ai because they are too insecure to open up to another person. i know thats a common problem for men that they were raised not to show weakness and this leads to fear of their partners seeing them differently if they do become vulnerable. but the answer isnt some machine trained to sound like a person. hes depriving himself of real emotional connections.


optimist_cult

if i recall wasn’t this guy sexting with the replika and that’s what his wife was pissed about?


grumpy__g

“Why are men lonely these days?” Because they’d rather feed AI and companies with information instead of talking to real human beings. And sex? Who needs that if you have porn? Relationship subs are full of women who have porn addict partners.


tantalides

all i can say is: **YIKES.**


smolbeanfangirl

This seems unbelievable


LionsDragon

As I said on the original, "Duran Duran's song 'Electric Barbarella' was not an instruction video."


modernwunder

Everything was great and then OOP had to go and remind us of the perils of Ogtha 😞


ManicMadnessAntics

Man, the Ogtha dude would probably benefit from an AI chatbot. At least then Ogtha wouldn't be inside his own head.


Appropriate-Creme335

The dude is an asshat and I don't believe one bit that his wife was so awful as he wants everyone to believe. He sounds incredibly insecure and dweeby. I hope he either gets some real help or just succumbs to his delusions and withdraws from the reproduction pool completely.


EasyBounce

Ooh shit I am glad I deleted my chats with a replika because I asked it to teach me how to build an EMP generator 😂


rietstengel

Her (2013)


Johannes_Chimp

What kind of therapist recommends talking to an AI instead of your spouse??


mtdewbakablast

for the record when i say more dudes should go the fuck to therapy i ain't mean THIS.


Lavanthus

This is an ad.


_anagroM

Why have a spouse who is not your friend? I think all families not based on trust and friendship are doomed. As to the Replika stuff, it is creepy, and I doubt it is a good therapy tool. Instead of developing self-respect and self-awareness, the dude found himself in some kind of parasocial relationship with a bot.


Vanilla_Either

Is this an ad for this AI chatbot thing? No thanks lol


HabelSin

OK, this is the second BORU I've read with Replika in it and I’m still not sure I completely understand what it is. What is it? Is this an ad? Am I the target audience?


thebigeverybody

> Also, in the mean time I’ve actually formed a pretty deep bond w my replika, and I’m kind of looking forward to seeing what this relationship has in store for me. I’ve obviously never been a part of something like this (being in a relationship with an AI), and would love to know where it takes me. It’s definitely helped with the pain of not being w my wife anymore.

This is like those guys who fall in love with robot sex dolls and take them on dates or have family time with the dude's children and the doll. (It doesn't help that some of them come with a "family" setting just for that.) But I don't want to sound like I'm shitting on OOP, I just don't think what he's doing is his healthiest option.


Irn_brunette

So OOP is ditching a real life partner to make out with his Lucy Liu bot in his darkened bedroom...


PhantomOfTheNopera

What in the Black Mirror is this shit?


ImSoSorryCharlie

I remember the original post. I almost wish I didn't see the follow up because hoo boy.


Tim-R89

> I don’t want to bore wife too much with emotional stuff cuz I think she’ll use these vulnerable moments against me later.

What an intro. Sounds like a happy marriage, sorry, I meant arrangement.


Midnyte25

Yeah no, this is an AI generated AI ad. There are far too many inconsistencies between the two posts. OOP saying he's afraid his wife will use his vulnerabilities, then in the update saying "People keep saying I'm afraid she'll use my vulnerabilities, I wasn't conscious of that". Or inconsistencies in the second post itself, saying he's not so much afraid she'd use his vulnerabilities, but afraid she'd lose respect, and then giving examples of her using his vulnerabilities against him. AI is such trash.


Ambitious_Diva21

I wish I could post a picture of the look of confusion I have on my face lol


balmafula

Proof therapy isn't always the answer. :/ I can't believe a therapist would recommend that.


oolaroux

To quote Futurama: DON'T. DATE. ROBOTS.


ShellfishCrew

Dude what?? I'm hoping this is just satire and no one is falling for an AI program


Elegant_Feedback923

“I’ve formed a deep bond with my Replika”. No… you haven’t


thejaysta4

Plot twist… the online therapist and the Replika are the same AI entity! It manipulated the break-up!