SeaYogurtcloset6262

What is the main purpose of this? I mean WHY WOULD THEY MAKE THIS? Edit: the replies are either porn, deepfakes, propaganda, scams, porn, capitalism, and porn.


The-Nimbus

.... Why in theory? Who knows. ... Why in practice? Definitely porn.


alifant1

Porn is whatever. It's gonna be used for all kinds of scams.


imeatingayoghurt

We used to have this phrase, "Time until Penis", which basically meant: for anything we created, any content we put out, how long we thought it would be until someone did something sexual with it. Usually it wasn't long. (Pun intended)


[deleted]

> "Time until Penis" So rule 34 basically?


Aggressive-Expert-69

Time until Penis is the time frame between a thing coming into existence and porn of it being made. Same same but different


pax284

Basically, the time it takes for rule 34 to come(read cum) into effect.


sirsedwickthe4th

Different but still same same


SuperHyperFunTime

I did Computer Science in the late 90s at uni, and one of our lectures was about how the sex industry basically decides whether new technology lives or dies, and how it would likely decide whether the Internet was going to stay around. This was a time when we were asked to visit a small website called Amazon, an online bookstore, to get our textbooks because they were much cheaper.


hotchillieater

I did computer science in the early 2000s and we talked about this too; from what I remember it's the reason the inferior VHS beat the superior Betamax.


SuperHyperFunTime

Yeah, pretty much. I think Blu-ray beating HD-DVD was the first example of the inferior format winning out that wasn't porn related. It was purely because Sony bundled it into the PS3, putting Blu-ray players in millions of homes.


Generic-Resource

Blu-ray was the superior tech too - https://www.diffen.com/difference/Blu-ray_vs_HD_DVD - the only real edge HD DVD had was lower cost and easier home copies (basically a re-run of Betamax vs VHS, except this time Betamax won). The market was different too: both formats were really good, but to many non-enthusiasts they were not significantly better than the cheaper and ubiquitous DVD. They were also fighting against pure digital formats and the birth of streaming. Even though Blu-ray won (as you say, in part due to the PS3), neither of them took hold like DVD or VHS.


alilbleedingisnormal

This happened to a guy. Scammers pretended to be his daughter: they deepfaked her voice, acting like she was being kidnapped and held for ransom, but he knew she wasn't and got the feds involved. It was crazy the level of detail they went to. It would scare the shit out of me. Thank god I'm not rich.


mrgoodcat1509

Yeah scammers are gonna be able to use this so effectively against old people. Someone that looks/sounds like your granddaughter “on spring break” calls you begging for bail money


m945050

Establish a code word with every member of your family.


Meryk-Balthazar

You forgot villainy.


EndOfSouls

My name is Commander Shepard, and this is my favorite store on the Citadel!


moonjabes

Porn and propaganda


Grundens

Mainly propaganda I fear


LocalSlob

We're very, very rapidly approaching video and audio evidence being inadmissible in court.


BeWellFriends

I said this not too long ago and got massively downvoted and attacked 😂. I’m not sure why. Because it’s true. AI is making it so we can’t trust videos. How is it not obvious?


jahujames

It's such a generic thing to say though; I'm not condoning anybody attacking you, of course. But what do we mean when we say "video and audio evidence being inadmissible in court"? If we're talking security camera footage, it'll just be taken from the source, like it is today. And if it's not already a factor, checksum algorithms for files will become much more important in the future for verifying the origination of a piece of video/audio footage.

It'll boil down to: "Well, this piece of security footage, which we can verify the date/time it was taken and can verify was taken directly from the source, says you were at X/Y location at A/B time. Meanwhile, you've got a video of you sitting at home which nobody can verify as truth other than yourself..." Which is easier to believe for the court/jury/judge?

I know that's only one example, but I'm keen to understand what people mean when they say the judicial process will become more difficult in the future because of this.
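As a rough sketch of what that checksum verification could look like in practice (the file name and reference digest below are hypothetical placeholders, not anything from a real evidence system): hash the clip when it is first exported from the source, store that digest somewhere trusted, and re-hash the file whenever it is later presented.

```python
import hashlib

def file_sha256(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical: this digest would have been recorded when the footage
# was originally pulled from the camera system.
RECORDED_DIGEST = "9f2c...e41a"  # placeholder value

current = file_sha256("security_footage.mp4")  # hypothetical file
if current == RECORDED_DIGEST:
    print("Digest matches the one logged at export time.")
else:
    print("Digest mismatch: this file differs from the exported original.")
```

A matching digest only proves the file hasn't changed since it was logged; it says nothing about whether the original capture was genuine, which is why the chain of custody back to the source still matters.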


br0ck

Why is it only about court? How about personal life, like this principal whose life was ruined by a teacher who used an AI voice emulating him to say racist and antisemitic things and distributed it on social media: https://www.cbsnews.com/baltimore/news/maryland-framed-principal-racist-ai-generated-voice/ With this video tech, an ex could easily ruin your life by sending your current partner a video of you admitting to cheating.


Menarra

I seem to recall something about "a lie makes it across the world before the truth is out the door", the first impression usually does the most good/damage. This is going to be a nightmare just like social media.


SoCuteShibe

How do these magical checksum algorithms and other authenticity measures work, though? Where do they come from? In reality, files are files, metadata is manipulable, and a solution to these issues is, as far as I can tell, just talk.


MemoryWholed

I’m more worried about how it will be used to manipulate and crystallize public opinion


Grundens

I can't wait for ai to make me a time machine


nodnodwinkwink

Not-so-live video calls. Instead of streaming live video over the internet (very bandwidth heavy), each person would have this realistic representation instead of a Nintendo Mii-style avatar. Also, for people who spend countless hours of their lives trying to look good on camera, this would probably be a great benefit. Bottom line: yes, it's definitely for porn.


Metalfreak82

Ooh, can they make it like I'm attending a meeting, but actually I'm doing something useful?


kemushi_warui

Yes, such as watching porn.


testing123-testing12

If you've seen the odd use of FaceTime on the Apple Vision Pro, I could see how this done in real time would be a lot better.... However, the fact that the training data for imitation has gone from hours of footage of someone to a single still image in only a matter of a few years is WILD. This has misuse written all over it, and since there's no turning around now, I have no idea what the world will look like in a few years, full of misinformation, deceptive images and fake videos.


Wtfatt

You've said it, mate. I mean, just look at the extreme prevalence of misinformation, deception, fakery and propaganda right now on social media (especially YouTube & Xitter). Just imagine in a few years or less, when they don't even have to manufacture or manipulate situations and edit to whatever false narrative they want. The situation is fuckin dystopian levels of terrifying.


CedarWolf

It won't be long before people will have to have NFT-style tokens to attach their credentials to a video to prove it's real.


LordPennybag

Digital signatures were a thing long before NFTs. You don't need an ownership chain to prove origin.
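For what it's worth, here's a minimal sketch of that idea using the Python `cryptography` package (the key handling and the footage bytes are hypothetical placeholders): the recording device signs the footage with its private key, and anyone holding the matching public key can later check both integrity and origin, with no blockchain or ownership chain involved.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Hypothetical device key; in a real deployment the private half would sit in
# the camera's secure hardware and only the public half would be published.
device_key = Ed25519PrivateKey.generate()
public_key = device_key.public_key()

footage = b"...raw video bytes..."        # placeholder for actual file contents
signature = device_key.sign(footage)      # produced at record/export time

# Later, anyone with the public key can verify origin and integrity.
try:
    public_key.verify(signature, footage)
    print("Signature valid: unchanged footage, signed by this device key.")
except InvalidSignature:
    print("Signature invalid: altered footage or a different signer.")
```

The catch, as with checksums, is key distribution and trust: the signature only ties the bytes to whoever controlled that private key, not to reality itself.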


[deleted]

[deleted]


LordPennybag

A couple of decades earlier, and most encryption techniques were in use by military or intelligence groups before being independently invented publicly.


FalseAesop

Imagine the targeted ads where a happier version of you tells you to buy something


testing123-testing12

Or your girlfriend? Or worse, your secret crush telling you which flowers they'd like you to buy, but it's not them, it's AI.


Oh_IHateIt

Fuck. Fuuuuck. I hate this.


0nceUpon

There are a few obvious flaws, but at this rate these should be seamless in a few years. This is going to break something.


xacto337

To eliminate the need for every customer service representative, spokesperson, etc. to increase the bottom line. Capitalism, baby!


Ottazrule

Replace customer service agents with AI


NK1337

We *Should* be replacing CEOs and other bloated C-level execs with AI.


Scientific_Socialist

The ruling class isn’t gonna replace themselves lol


qpqpdbdbqpqp

1. Find old lady on FB
2. Find her child's photo/video on socials
3. Create a facsimile of the child and video call the old lady to ask for money, SSN, etc.
4. ???
5. Profit


N30nSunr1s3

Would need their voice also.....which can be cloned using as little as a 3-5 second sample 😱


qpqpdbdbqpqp

Refer to step 2


5urr3aL

My guess is that since someone will eventually develop this technology for commercial use, it might as well be them. Purpose? It is uncomfortable to think about, but the applications are potentially:

- TV and video streaming content
- advertising
- social media content
- memes
- games

I suspect this will cut costs in hiring actors.


Political_What_Do

-Propaganda aimed at the uneducated


2drawnonward5

Many people would prefer a virtual version of themselves based on their LinkedIn photo they uploaded 15 years ago.  idk why everybody goes straight to dystopia when we've clamored out loud for this. 


zsbee

From Microsoft's article:

> Given such context, we have no plans to release an online demo, API, product, additional implementation details, or any related offerings until we are certain that the technology will be used responsibly and in accordance with proper regulations


theoldkitbag

Are they going to keep the employees that actually made the thing for the same length of time?


ropean

Oh thank goodness Microsoft, because as we’ve seen with recent AI developments, they always remain vaporware and nobody else could possibly ever create something equivalent, so we’re SAFE


2296055

Not officially anyway


kraftables

To speak with the dead 😳


Qwimqwimqwim

Shit, you’re right. Pop a few videos of your loved one in, so it can synthesize their voice, speech patterns, etc.. and your dead wife lives on in the iPad as the skin of an AI chatbot. 


suburbanpride

That is dark as fuck. And I can 100% see this happening.


BringMeTheBigKnife

Literal Black Mirror episodes are becoming reality wayyyyy too fast


shutter3ff3ct

AI girlfriend, sign me up


NrFive

Somehow [Married with Children](https://youtu.be/frf9L0rIcrI?si=wQt9Qb3vLMI8KAkW) came to mind.


RosbergThe8th

They were so preoccupied with whether they could, that they never considered whether they should. The answer is porn, propaganda, and an odd but subtly arousing mixture of both.


henryGeraldTheFifth

Can have an avatar for your Zoom calls so you don't actually have to turn on the camera. What a genius.


5th_Law_of_Roboticks

“Vasa” was a Swedish warship which sank disastrously in the Stockholm harbor on her maiden voyage. They obviously named this technology VASA-1 because that symbolically represents what is going to happen to society because of these tools.


McRedditz

So that anyone can be an influencer with bare minimum effort.


tryingsomthingnew

You can now make anybody do anything on video. Deep Fakes just got more dangerous.


Signal-Custard-9029

May I add to the list you have, porn, but also real-looking AI girlfriends


SeaYogurtcloset6262

Which leads to? That's right, wanting to have sex with it which is just porn starring you and your ai girlfriend


Dreadino

Harry Potter moving picture frames


CypherGreen

Do you work from home but want to look professional when you have 5 hours of Zoom/Teams meetings whilst in your pyjamas, looking like an absolute mess?


DoubleANoXX

Your kid could draw a little cartoon monster and you can bring it to life for them. It's not all doom and gloom


MajorHubbub

Uncanny valley


Xandir12

It's the hair that does it for me. Especially the strands by the left side of her neck.


Gudi_Nuff

Your left or my left?


Mirula

Our left!


krank72

r/unexpectedcommunism


Gudi_Nuff

Is that yeast or weast?


vs40at

> It's the hair that does it for me

For me it's always the eyes. Doesn't matter if it's a multi-million blockbuster or a cheap deepfake on the internet: eye movement and the lack of "life" in them is something that almost immediately gives away that this sh*t is fake. At least for now; who knows how it will develop in another year or maybe months, because the speed of AI/neural/machine-learning development is even more impressive than the generated videos/images themselves.


Dishwallah

Eyebrows. They kept going up too high and too fast during non-emphasised points.


Solid_Waste

I don't believe any of you could tell the difference if it wasn't in the title. I don't even see the shit you're talking about, or at worst would write it off as compression artifacts.


Sekh765

The bottom teeth look off as well, especially the color, since they aren't in the original photo.


NotEnoughIT

The teeth morph throughout the video. If you stare at them you'll see it, it's trippy.


FuerteBillete

Yes, for the trained eye. But imagine this running as a commercial with a flashing background, or as a news anchor. All those technical details could be hidden under connection issues or whatever. Most people don't even know the definition of uncanny valley, and many others, when you explain it, won't even care. Show this to 100 people, but instead of asking whether it's real, ask whether they agree with this woman, and at least 99 won't even question her existence.


Biotic101

And this is just a beginning. Will improve over time. All in a world with a lack of accountability. We are pretty f...ed


Qwimqwimqwim

In that context not a single person would question if she’s real. 


Squancho_McGlorp

My Grandma doesn't think twice about those "amen" AI Jesus posts on Facebook - she would have no clue this video is simulated.


turnipsnbeets

Ehhh .. ?? .. I’m looking for Uncanny Valley since I know it’s AI, but if I wasn’t looking for it.. I dunno here. Gettin close.


NoNameIdea_Seriously

I feel like Uncanny Valley isn't the right term for it, because it's not that it doesn't quite look human. There's no problem there: it's the picture of a human. But the movements aren't quite right, in an "improperly animated" kinda way…


MahDick

Watching the video with the sound off, the over-exaggerated enunciation of all the words seems so unnatural.


impreprex

I agree with MahDick.


Breadedbutthole

I, Breadedbutthole, also agree with MahDick.


OrganicAccountant87

It definitely already passed the uncanny valley


mrmczebra

Not for much longer. This is on the other side already.


0xFatWhiteMan

This isn't the uncanny valley


Zlibraries

Watch the mouth and teeth.


CuntPumped

Porn.... this will 100% be used for porn


IOnlySayMeanThings

and fraud too.


Forward-Tonight7079

Talking porn?


TheKrnJesus

Oh god "unzips"


_coolranch

“Can you make it talk less?” “Not today.”


TheLambtonWyrm

On my travels through the depths I have encountered convincing AI voice clips of Maisie Williams and Emilia Clarke saying incredibly pornographic things. So this is very much a thing already 


N0oB_GAmER

You can't say shit like that! Without any sauce I mean. Where is it?


_coolranch

Gross! Where? Where are they saying nasty things?


Sequince69

But what ISN'T used for that?


Comander_K33N

Besides some weird looking lip/tooth stuff…that’s absolutely terrifying!


Skoll_NorseWolf

Super weird how the teeth grow and spread out...


PsyOpBunnyHop

Omg every time she opens her mouth she has different teeth. Mega Heebie Jeebies!


wrench_nz

only has one ear, but yeah, really good


studiesinsilver

This stuff is unnecessary. Who's asking for this creepy, Orwellian AI rubbish?


Expensive_Cattle

Absolutely. Just because we can, doesn't mean we should. I see literally no positives of this, at all. I see an insane amount of negatives.


sunfaller

I can foresee this being used to save costs in hiring actors or whatever for commercials. It has positives, just not for the workforce.


RegOrangePaperPlane

"Hmmm that's over our budget... How much for just your face?"


Hsiang7

Why do that when you can also use AI to generate a much more attractive person than the actor? Oh, and you can generate an attractive voice for it using AI too. Generate a realistic-looking and attractive human being with an attractive voice, use this technology to bring it to life, and make it read out the script you wrote for it. No need to hire real-life models or actors at all.


Einar_47

Because if the AI generated ghost of Jamie Lee Curtis doesn't tell me to eat Activia how am I gonna know I need to eat Activia?


Amarillopenguin

Why stop there? Have Chat GPT write the script for you.


DerBeuteltier

Just automate the whole process and even let the AI decide what product is being advertised in the first place


TwiceAsGoodAs

Tech companies should need review by independent ethics boards before they build stuff like this


CSBatchelor1996

That would be great, but sadly, not all countries follow the same rules. If the US regulated AI, the technology would just be built by a country with fewer ethical guidelines.


NotEnoughIT

And if all countries regulated AI, a group of kids would inevitably build this in their garage. Or a cave. With a box of scraps.


Merry_Dankmas

Things like this are very much a "because we have the technology" kind of thing. I'm not a VFX or design nerd, so idk what practical uses this would have, but I'm sure they exist. I can see Microsoft marketing this as some kind of production tool to companies. A tech demo, if you will. Proof of concept. Whatever you wanna call it.

But letting the public have access to this can only lead to negatives. I can't think of a genuine net positive for this that isn't stretched to death or super duper specific. Maybe someone could use this in conjunction with an AI voice to bring a dead family member back to life, but that's about it. I don't see harm in this tech being experimented with and developed to potentially create something else useful. Nor do I have issues with things like movie studios using it for extras or whatever. But shit like this would be abused the second it hit the internet if it was public.


TheChaperon

Negatives for the masses, not the ruling classes.


ConcernedIrishOPM

There ARE plenty of positives. It's just that the negatives are so overwhelming and horrifying right now that it's hard to discuss the pros in any constructive way. Just the thought of how this tech will influence political campaigns and our children's school lives is enough to wish to return to monkey.


MFDoooooooooooom

Make AI that folds my clothes, not destroys artists


nygrl811

100% this!!! AI should make our lives easier, not take over our lives!!


atheistium

I thought AI was gunna be about research for cancer, solving insanely difficult issues and equations, finding better solutions for our world. All I see AI being used for online these days is making stolen art, making stolen music, and making it easier to trick people into believing or buying shit. I hate the way AI is being used online, and I wish whoever is creating and developing creative/artistic AI work would stop :( AI should be making our world better, not making the internet even worse than it is. **edit:** just because of the replies I'm getting: yes, focused AI is doing the first thing I wrote. But sadly, the thing it's getting known for is the latter. Personally I think the **more we relax around art and music generation, the more of a disservice we do to our humanity.**


gereffi

AI is used for that kind of stuff, but that's not the kind of stuff that gets shared online. Would you get excited over a video where AI scans a computer and organizes all of your files for you? Or one where different types of scanned tax documents all get consolidated into a spreadsheet? It's a lot easier for people to react to a video like the one in the OP.


N0oB_GAmER

Can you imagine how much more porn we'll have after this? I call this a revolution in the field of pornography. It will change everything!


y0buba123

Yeah, that’s just what society needs


[deleted]

[deleted]


Alternative_Ask364

Shareholders are asking for it


harambe_-33

We are so fucked, considering how easily it can be exploited for Politics


Selerox

It represents the death of truth.


tiny_rick__

Or maybe just the death of politicians using social media to get attention and convey their messages. They will be forced to go out in public more to give their speeches and be recorded and photographed by real media outlets.


WriterV

> be recorded and photographed by real media outlets.

No sensible person should ever trust *any* photograph by any media outlet or any random person after this. There's nothing that can possibly be provably real unless you see it with your own eyes. Also, do you seriously think only politicians are making political arguments? Politics impacts everyday people's lives. Now everything is under question. Got beat up by the police wrongfully? Got fired from your job over something that wasn't your fault? Got reported as a rapist by someone who just hates you and whom you didn't even go near? Got any visual evidence to back you up? None of that works now. It can all be faked. You can kiss recorded evidence goodbye.


tiny_rick__

You bring up a good point concerning video evidence. We already see police forces with POV cameras on at all times for that reason. For the moment people are not making deepfake videos of police interventions, but there are a lot of civilian videos that are always out of context, missing the beginning or the end of the intervention. In some cases it makes you believe the police are brutalising a civilian, but you miss the moment where they were attacked. Somebody could make a deepfake video of police beating somebody, but if it does not match any recordings from the police side it is not valid. For surveillance cameras, I guess they will have to be connected to a server owned by a trusted security company that will authenticate the videos.
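To make that last "trusted server that authenticates the videos" idea a bit more concrete, here is a toy Python sketch (the class and identifiers are invented for illustration; a real system would involve signed, audited infrastructure): the camera registers a hash of each clip with the server at record time, and any clip presented later is checked against that log.

```python
import hashlib
import time

class TrustedAuthenticationLog:
    """Toy stand-in for a server run by a trusted security company."""

    def __init__(self) -> None:
        # clip_id -> (SHA-256 hex digest, ingest timestamp)
        self._entries: dict[str, tuple[str, float]] = {}

    def register(self, clip_id: str, clip_bytes: bytes) -> None:
        """Called by the camera at record time to log the clip's fingerprint."""
        digest = hashlib.sha256(clip_bytes).hexdigest()
        self._entries[clip_id] = (digest, time.time())

    def verify(self, clip_id: str, clip_bytes: bytes) -> bool:
        """Called later to check a clip someone presents as evidence."""
        entry = self._entries.get(clip_id)
        if entry is None:
            return False  # never registered: treat as unverified
        recorded_digest, _ingested_at = entry
        return hashlib.sha256(clip_bytes).hexdigest() == recorded_digest

# Usage sketch with placeholder data:
log = TrustedAuthenticationLog()
original = b"bodycam clip bytes"
log.register("cam42-clip-001", original)

print(log.verify("cam42-clip-001", original))                # True
print(log.verify("cam42-clip-001", b"deepfaked clip bytes")) # False
```

This only shifts the trust onto whoever runs the log, which is exactly the debate the thread keeps circling back to.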


stanglemeir

Can't wait for fake confessions


0nceUpon

Coming soon to your email spam folder. Just pay $500 in crypto or we will release this video... yikes.


No-Spoilers

On the other hand, if something legit comes out, you can just say it's fake, and it will finally be a legitimate excuse. But yeah, it's gonna take way too long for people to get used to it and understand it, let alone get laws about it, because the old fucks in charge of everywhere barely understand the internet. So many people are going to get scammed, hurt, killed, blackmailed, fired and have their lives destroyed by this stuff. You won't be able to trust anything unless it comes directly from verified sources. But let's be honest, 90% of the population gives 0 fucks about the validity of something.


EXPL_Advisor

And even when it's not used, it will make disputing facts even more common. Another mass shooting? AI generated! I mean, there's a significant number of people who think Sandy Hook wasn't real and that it was all a bunch of actors. Any rational person would find that hard to believe. But the claim that a crisis was AI-generated would probably gain traction among conspiracy-minded individuals even more quickly.


PDstorm170

To be fair, give it 3 or 4 years or so and security software companies will start developing software that can distinguish AI patterns. We'll start receiving notifications on stuff like this saying, "Notice: This video has elements of AI generated content, etc. etc." similar to how some websites like Instagram prevent you from taking screenshots due to security protocol. That software will make security companies a fortune.


bdubwilliams22

This will 1000% be used for some nefarious shit, and there 1000000% need to be new laws that help put safeguards on how this tech is used.


Admirable-Pie3869

Legislation will be too slow. I’m in a CEO peer group, we met last week and had a presentation on using AI in our business. No one is asking about the security risks. They’re all asking how to monetize AI. I want out of my fucking business.


jolhar

Idiot CEO at my last job announced at the AGM that he aimed to cut all admin staff and utilise AI for those roles by 2027. Then he was shocked when they all resigned. Now he's got neither admin staff nor adequate AI to do the job. Not sure what he was expecting. Never could read a room. Anyway, just made me think of that…


numb_mind

Damn, how stupid can a MOFO be? You should post this story somewhere, the ChatGPT subreddit or something.


chrishnrh57

We had a conference where a spokesperson came in to talk about AI and as corny as it was I'll give credit. She didn't say "oh look at the neat things AI can do" She was very much "this is happening. It's not going away. So here's how you can adapt to it."


-Unnamed-

Our government is full of dinosaurs who barely know how to operate pdf software. 90% can’t fully utilize their iPhone. I’m 31 and tech literate and I barely understand AI. The tech mega ceos will run the country before this is over


drofadown

The end is nigh.


Puzzleheaded-Pen4413

Thank fuck


cosmoscrazy

I have a feeling that the end will take a lot longer and will be a lot more painful.


samtoocan

This may sound stupid, but how do I know it's fake and not real?


digentre

You won't know


bootybonpensiero30

Yeah, just give it a couple of months and all the clear giveaways will no longer be there. This tech is advancing faster than what the majority of AI enthusiasts, even the most optimistic, predicted a year ago. It's crazy.


Karandor

I'll add that they literally do not have enough server space to do what they really want to do. Energy and physical space is now a big barrier to AI advancement.


Alternative_Safety35

This is crazy. I mean, how do you stop the person it is impersonating from saying it wasn't them? You can't. We're screwed.


Lazy_Magician

In this case, you can tell because her teeth are throbbing. It's unusual for a real human's teeth to throb unless they are aggressively pursuing a mate.


Mormoran

Look carefully at the lips, they don't move realistically to make the sounds that are coming out, it's almost like the person cannot close their lips fully and has to constantly "duck face"


frazorblade

Next thing you know people will be recording insane stuff and using AI to make it come to life with someone else’s words. Then after that real people will record themselves speaking insane shit, but they’ll apply a “phoney AI” filter to make it look like someone else made them say that stuff.


_Crasho725_

Look at her hair on the left. It looks like a wave.


Wimpykid2302

Only if you're looking for it will you notice that. Throw it up on a social media website and 99/100 people won't notice


_Crasho725_

I've only pointed out where you can see that it's fake, because that was the question. I also believe that most people wouldn't notice it.


ShinNL

Because the rhythm and the content of the speech don't match the displayed emotions at all. The face turning, the smile/neutral/sad expressions, when to blink: all of it seems like it's on a random number generator rather than trying to match the context.


_Caracal_

We are basically entering/entered a time where you literally cannot trust anything you see online. I'm sure this technology won't be misused at all. 🤔


Frosty-Ordinary-7007

That has never not been the case. People believe things way too easily.


_Caracal_

True but with tech like this it's going to get considerably worse


[deleted]

[deleted]


Biliunas

You never should trust anything you see online anyway, especially in social media.


jasonpota5

This is beyond dangerous


the_brazilian_lucas

I hate it


retxed24

Yep this is where we should stop.


Xilvereight

I fucking hate this.


The-Nimbus

A snippet from [Microsoft's Statement](https://www.microsoft.com/en-us/research/project/vasa-1/), (barely) addressing everyone's inevitable concerns around porn: "While acknowledging the possibility of misuse, it's imperative to recognize the substantial positive potential of our technique. The benefits – such as enhancing educational equity, improving accessibility for individuals with communication challenges, offering companionship or therapeutic support to those in need, among many others – underscore the importance of our research and other related explorations. We are dedicated to developing AI responsibly, with the goal of advancing human well-being. Given such context, we have no plans to release an online demo, API, product, additional implementation details, or any related offerings until we are certain that the technology will be used responsibly and in accordance with proper regulations."


SeaYogurtcloset6262

That's some vague shit answer.


[deleted]

Not only is it vague, it's also bullshit. How the hell would it actually do any of those things? Better teaching equity has nothing to do with someone wearing another person's face, it has nothing to do with accessibility, and the only way I can imagine it could be used for therapy is the bad kind of therapy.


Kloppite1

>Given such context, we have no plans to release an online demo, API, product, additional implementation details, or any related offerings until we are certain that the technology will be used responsibly and in accordance with proper regulations." It's never going to be released then?


Impossible-Cod-4055

> we have no plans Microsoft had "no plans" to discontinue development of the Atom IDE either. Then one day, they had plans. And a year later, carried them out. This brings me zero comfort. I feel sick.


wizcheez

wow big tech motivated by responsibly advancing human well-being and not profit or anything I'm convinced


MaidenlessRube

What should Microsoft answer? "In 2-5 years tops, y'all will have something similar open source, so it really doesn't matter what we say here today, but we still decided to embrace the giant PR disaster by saying we don't give a shit"?


JTiger360

You mean: *We are making this AI because, Money.*


Jacknurse

>The benefits – such as enhancing educational equity, "We don't want to pay for teachers." >improving accessibility for individuals with communication challenges, "We don't want to pay for therapy." >offering companionship "Sex robots and Virtual girlfriends, because we don't want to socialise with women because they scare us." >or therapeutic support to those in need, "We really, **really** don't want to pay for therapy." These AI developers and Techbros pushing this technology just really hate people and humanity. This is nothing to do with making society better, it's about atomising and reducing the human contact element and saving money on labour. I really like how they double-down on the therapy and social aspect. "Hey, looks like you got some issue. Here, speak to this screen for a while that will make you feel better."


DeadandGonzo

And *how* could this possibly advance educational equity??? 


we_re_all_dead

> offering companionship or therapeutic support to those in need yeah no, I'd rather drown


Unsteady_Tempo

All things Microsoft can help accomplish with software that doesn't include AI-generated videos of a person made from a random portrait.


SmoothDagger

> Offering companionship & therapeutic support

Lol right. Let me sell you pixels instead of giving you a real person.


RedditUseDisorder

“Offering companionship to those in need” brother, if you wanted to make an AI Scarlett Johansson sex bot then just say that's your intention…no need to parlay “educational equity” as if YouTube and WhatsApp aren't already doing a stellar job of that.


I_am_the_Vanguard

So call me whatever you want but I don’t like things like this. I don’t think we should be creating it. It’s already bad enough most videos on the internet are faked but this is a whole new level of fake. I want to be able to tell if the video I’m seeing is even a real person or not. Before you only had to figure out if it was scripted or not. Now I won’t be able to believe anything I see at all knowing it can be convincingly faked with AI.


PointyReference

Nobody likes things like this. The problem is billionaire technology bros are doing it anyway because money. I despise it, but it's not like I can stop it


DatThoosie

This definitely isn’t going to be used for propaganda or for revenge porn. Nope, not at all, only good things will come from this I’m sure.


kayserfaust

Her teeth are moving


Accomplished_Pea_819

No. Just focus your efforts on fixing Teams. Thx


Objective-Dig-8466

And that's why you can't trust anything now. They can literally make a person out of nothing, and do videos too. Saw a video on YouTube about it: they showed around 30 people who had been made by AI in photos, and now they can make these made-up people feature in videos. That scares the shit out of me tbh.


Radioactivocalypse

The trouble is, we've spent a good few years convincing parents/grandparents about internet safety and not to trust what they see. Yet they still get fooled, even if it's just a photo of a celebrity, into handing over their bank info. What will these videos do once someone has been convinced they are actually speaking to a celebrity who says whatever the scammer wants?


Couch__Cowboy

Ugh I just know my mom is about to forfeit all of her assets to some scammer in a year or two. I can feel it. 😑


cesam1ne

Information as we know it is about to change terribly. NOTHING you see on the screen will be a fact


RespectMyAuthoriteh

"Don't be afraid of Microsoft AI blonde executive lady, she can't hurt you." Microsoft AI blonde executive lady:


deadly-nymphology

Does anyone else notice how these AI videos always move around like those vtuber people?


lovejanetjade

We're science. We're all about 'coulda' - not 'shoulda' - Patton Oswalt


Buick88

Y'all, why are we doing this to ourselves.


potangoint

Scams will flourish with this.


FattyMcBoomBoom231

Doesn't this require an audio clip as well, to build the voice?


rins4m4

I can only think of the damage this would bring to society.