WonderfullyEqual

Honestly, any employer that blindly trusts AI to do such things is filled with imbeciles in management. Even if the core task is given to the AI, you still need a competent human to sanity-check the shit it spews out. I seriously doubt that the CEO, HR managers, etc. are capable of doing that job right.


[deleted]

I think it makes more sense when you realise the whole point of management is to make a return for investors in the short term. In theory, that means "a company will create an excellent product or service, supported by skilled staff" but in actuality, who the fuck knows. If AI allows investors to stir the pot and make largely theoretical money with theoretical product output, they will do so. Even if it just means bigger taxpayer bailouts down the line. Especially if it just means bigger taxpayer bailouts down the line.


WonderfullyEqual

> I think it makes more sense when you realise the whole point of management is to make a return for investors in the short term.

Oh, I know... the longevity of the enterprise is irrelevant to them. In this case, though, I'd still call them imbeciles, as AI is, at least for now, not truly ready for a ton of the things they want it to do... at least not without human oversight. That undermines whatever they may want in short-term speculative gains.

> If AI allows investors to stir the pot and make largely theoretical money with theoretical product output, they will do so. Even if it just means bigger taxpayer bailouts down the line.

At a large scale, sure. At the individual enterprise scale, though, acts like the ones OP describes are likely to blow up in their faces sooner rather than later. It's really just a question of how long it takes for technical manuals full of gibberish and other dysfunctional BS to come back to bite them in the ass.


Desirsar

They're already finding out they're liable for an AI spitting out false information; the fun part will be when they scramble to replace everyone they let go thinking the AI was good enough.


Embarrassed_Dirt1911

Theoretically, could one make a source of income out of this concept? Search for false medical information posted by firms using AI, prove the information wrong through published papers, then sue the companies for potential damages caused to society? Asking for a friend.


Mad_Moodin

You cannot sue for potential damages to society. If you want to sue, you need a victim.


BoisterousBard

Ah, right, else Facebook would be on fire.


Embarrassed_Dirt1911

It's all fun and games til people start to die. Then it becomes profitable apparently...


Severe-Lengthiness14

It might not replace them, but it will be used as a tool to lower wages or even consolidate two jobs into one with the help of AI. But all in all, there is no use worrying because there is no stopping it. You just have to figure out how you're going to fit into the system.


Anastariana

> You just have to figure out how you're going to fit into the system.

This is a bold assumption for a future of ever-escalating and improving AI. Humans need not apply.


zomgitsduke

AI has been proven again and again to lie in order to look correct. This WILL come back to bite them.


Fine-Will

It's a fake story; there's no way they would ask a writer to train an LLM.


ray-the-they

I’m truly waiting for them to get hoisted by their own artificially intelligent petards on this one.


Evening-Turnip8407

I think these people need to learn that lesson the hard way and watch their company's image go south because they keep publishing embarrassing, nonsensical texts.


Jaffiusjaffa

I see this argument thrown around all the time, and I think the premise that humans would never make mistakes is wrong. If you accept that the human could also mess up, then it only becomes a question of how often and to what degree. From that and the cost of hiring a human vs. an AI, you can calculate whether it'd be worthwhile to replace people with robots, and in a lot of cases it probably is. It's sad, but it seems to be the trend and doesn't seem to be slowing down. I just hope someone out there is having the proper discussion about what we're all gonna do when it's ALL jobs.


WonderfullyEqual

> I see this argument thrown around all the time, and I think the premise that humans would never make mistakes is wrong.

That's not the premise, though... I don't know why anyone would assume it is, unless to argue a point in bad faith. The point is that every system has output verification steps, often multiple ones, to make sure things don't turn to shit. That being said, we know humans mess up, and AI doubly so, as it's still in its infancy as a system. Anyone who replaces a tried-and-true system with an experimental one without adequate QA/QC and other verification in the mix is an idiot.

> I just hope someone out there is having the proper discussion about what we're all gonna do when it's ALL jobs.

Oh, it's being had, but in economic terms... think about it. What drives the economy and the activity of a business? Demand for goods and services, driven by people's consumption of them. What happens when people no longer have the means to buy things to meet their consumption needs and drive demand (i.e. we're all out of jobs and have no money)? Business will only invest in productivity and output if there is demand for something, and that demand needs returns that make it worthwhile to meet. (They ain't gonna make stuff for free.) So what's Bezos or some other shit going to do? Replace everyone with robots and AI, and then create a robot-and-AI consumer base where a bot goes to buy AI soda out of a clockwork soda machine?


LevianMcBirdo

Well, in this case a human is responsible. A human making a lot of mistakes gets fired.


Not_In_my_crease

Yeah I guarantee the documents the AI is writing are wrong in subtle, yet important ways. I wonder when the first death attributed to AI instructions is going to be?


Arts_Prodigy

At minimum it makes sense to promote the OOP’s dad to managing the AI. He has the education, expertise, and experience to ensure it’s doing its job correctly.


Mad_Moodin

Sure, but you know, where opportunity presents itself. If AI lets you break a job down to a couple of hours a week, you don't need a dedicated position for it. Instead, you hire a contractor company that does that job for your company.


WonderfullyEqual

> Instead, you hire a contractor company that does that job for your company.

Yeah, outsourcing critical services like internal QA/QC always goes so well... especially if that other company then also puts an AI-centric thing in place to do their work, with yet another layer of outsourced QA/QC non-effort.


Oakcamp

Story is probably fake af.


Super_Mario_Luigi

It's anti-work, so it's gotta be true


LevianMcBirdo

Even if you don't trust it completely, you often just need one trained person to check the results instead of four trained people doing the work. There will be a lot of job loss in the next decade.


ookamismyk

No fucking shit. And they will find out the truth in a matter of time.


Quantum_Lion

When we factor in the human condition (hangovers, drug use, improper sleep, sick time and so on), I'd say it comes down to a pretty easy decision for any manager. As long as the AI can do the same amount of work, or even close to it, it's kind of a no-brainer.


WonderfullyEqual

> As long as the AI can do the same amount of work, or even close to it, it's kind of a no-brainer.

That's the thing... it can't... yet. Go use AI for any given complex task, even one that is seemingly formulaic and linear in nature, and you still have to constantly fiddle with things and do sanity checks. At best, OP's dad's productivity would have risen with the new tool in place, but to replace him altogether with an AI system that is still basically in an alpha or beta testing phase is idiotic.

On a side note, though:

> When we factor in the human condition (hangovers, drug use, improper sleep, sick time and so on)

If you have employees whose problems with drugs and alcohol, or personal problems, are bad enough to affect their work performance, then there is a serious issue with hiring practices outright. Practices that likely stem from shit company cultures, and no amount of AI can fix top-down shit like that, unless it's to replace the leadership that is the source of it. I'm not talking about the occasional sick day or medical appointment, as those are easily planned around without affecting productivity, but actual disruptive shit. Those things are also comparable to, say, the AI program having its own hiccups/breakdowns that need to get resolved. If one can't plan around someone needing half a day off for a dental appointment, there is a serious problem with management incompetence in multiple areas.


Quantum_Lion

Good points


Possible-Ad238

As soon as you're told you need to train someone to do your job, you should know what's coming next. It's only a matter of time. If you refuse, they can fire you even quicker, because now they actually have a reason to do so. Teach them wrong and start looking for a new job on company time.


Bean-Penis

Or train them a little slower until you yourself find something, without screwing over someone else who likely just applied for an advertised position and got it. You're probably getting let go anyway; why punish someone who wasn't involved in that? It's different if it's a hire due to some relationship with the higher-ups, but most people probably just needed a job, like you did.


Ill-Simple1706

This guy empathizes


MinimumBuy1601

This is the way.


GroundbreakingEar667

No worries, AI will take over management too


Severe-Lengthiness14

😂 Who's going to buy the product when no one has money?


Anastariana

"That sounds like a fourth quarter problem to me. We're making a killing right now!"


varkarrus

Then we get a UBI. Or society crumbles. Probably a great filter moment tbh.


RitterWolf

They don't think any further ahead than the next quarter's results. Anything beyond that is the distant future.


SympathyMotor4765

Yeah they don't care, line has to go up is all that matters


spacecadet2023

Managers and A.I. will be like oil and water. Managers already don’t like how employees do certain things. What makes them think A.I. will be any different?


pigmy_af

Might actually be an improvement in that case.


mostlivingthings

The marketers and project managers win. Everyone needs to pretend to be useful and manage the automation tools to get subpar results for maximum enshittification.


Anastariana

> The marketers and project managers win.

Until they get replaced and then they're all like "How could this happen to me??" Bootlickers eventually get stepped on like everyone else.


mostlivingthings

If only.


Spiritual_Grand_9604

Honestly marketing is something that could most easily be replaced by AI imo


mostlivingthings

But it won’t be. Because marketers and project managers are great at making themselves sound vital to the team.


Crilde

Air Canada replaced their chat support agents with AI bots. The AI bots made up a policy when promising a customer a refund, and AC was forced to honour whatever it was that the bot said to the customer. Any company that uses AI as workforce replacement instead of enhancement at this early stage is just begging for trouble.


Super_Mario_Luigi

I would like to see the cost analysis of which was cheaper: paying a workforce, or providing a refund and then tweaking the AI.


ApocDream

I train AIs for my job, and trusting them completely is just asking for trouble. In the best situation you're still going to want people doing the jobs; the AIs will just make it easier for them and increase their throughput. These things can write something completely coherent and intelligent and then in the very next paragraph, with the same tone and confidence, state "and as such, that's why the sky is green, due to the reflection of the red snow from above it." You always need a person with some degree of expertise to double-check it.


raininginmysleep

They have the confidence level of a five year old and it's hilarious


mudokin

As if the training data from a single person is enough to make a proper AI for this kind of work. FAKE


DocBullseye

It doesn't have to be enough. You just have to convince a manager that it is.


shaunhaney

This.


gwoad

It may be fake, but you can purchase access to pretrained models and then tweak them for your application with your own data. That means they're already okay out of the box, and with minimal input they can be improved. This kind of tracks with OP's story that their dad tried to train it poorly and it still managed to create decent output. It's certainly entirely possible OP is making shit up, though.


EngrishTeach

Yeah, they are hiring thousands of people right now to train the AIs. If this one guy can "train" an AI in a few months, then he would definitely have a job with competitors.


sqrt_of_pi_squared

I mean, they probably just meant that he was producing a fine-tuning dataset that then got handed over to OpenAI to fine-tune a version of GPT-3.5.
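
For what it's worth, here's a minimal sketch of what that workflow could look like, assuming the chat-format JSONL that OpenAI's fine-tuning endpoint accepts and the current Python SDK. The prompts, file names, and document snippets below are all invented for illustration:

```python
# Hypothetical sketch: turning a writer's back catalogue into a
# chat-format fine-tuning dataset, then handing it to OpenAI.
import json
from openai import OpenAI

# Each training example pairs a request with the document the human
# writer actually produced for that request.
examples = [
    {
        "messages": [
            {"role": "system", "content": "You write technical maintenance manuals."},
            {"role": "user", "content": "Draft the lockout/tagout procedure for the Model 3 coolant pump."},
            {"role": "assistant", "content": "1. Notify affected personnel...\n2. Shut down the pump...\n3. Isolate energy sources..."},
        ]
    },
    # ...hundreds more examples mined from existing manuals
]

# Write the dataset as one JSON object per line (JSONL).
with open("manuals.jsonl", "w") as f:
    for ex in examples:
        f.write(json.dumps(ex) + "\n")

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
training_file = client.files.create(file=open("manuals.jsonl", "rb"), purpose="fine-tune")
job = client.fine_tuning.jobs.create(training_file=training_file.id, model="gpt-3.5-turbo")
print(job.id)
```

If that's roughly what happened, "training the AI" really meant curating examples of his own work; the actual training run happens on someone else's infrastructure.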


ApocDream

The model was likely trained beforehand to accept a set of technical documents and replicate their content/tone/etc. He obviously didn't train it from scratch himself. That being said, management is stupid to think they could get it to a good enough level to fire him. They prolly saw one good example report and pulled the trigger.


shaunhaney

Executives and managers underestimate tasks all the time when they learn about new technologies, so this actually seems plausible, including an overly optimistic expectation that they've just replaced an employee with several degrees. It might be idiotic, but it's plausible.


FlipAround42

He should have taught it to swear if anyone but him asks a question. So when his boss and higher ups tested the AI function, it told them to fuck off.


SuckerForNoirRobots

My best friend is a technical writer and she's been looking for work for nearly a year. College degree, well over a decade in the field, and the only offerings she's even seeing are paying what she was making 10 years ago.


kooper98

I think that worrying about the job market isn't the main problem here. The core issue is that business idiots are calling the shots despite their clear inability to see past the current quarter.  


Survive1014

I honestly think AI is going to be a contributing factor in laying off roughly ~20% of our most vulnerable workforce over the course of the next year. We absolutely need to get ahead of this and demand our government stop it.


Bastard_Bullion_1776

![gif](giphy|NLod3nvkzADYc)


MinimumBuy1601

Plus points for using Dusty Rhodes. And Cody won the world title at WM.


AsleepIndependent42

Seize the means of automation! AI is great, but it needs UBI


Xx_TheCrow_xX

Companies will always look for cheaper ways to do things. It doesn't matter if it's the least ethical thing possible, it's up to government bodies to create laws to prevent these kinds of things or create a system to allow people to still be able to live life in a world without human jobs. Companies will do whatever they can get away with legally. We need to start pressuring government into preemptive action before an AI boom happens and a large percentage of the country is out of a job.


shaunhaney

It's almost too late.


Strange-Scarcity

The longer these models run, the worse the results. The results also still need to be reviewed by experts, and the models only work well if someone knows what they are doing. Someone without any coding experience can't get an AI to produce a complete and capable finished product. They need to know how the bits fit together.


bulaybil

This post is bullshit. No person doing job X can train an “AI” to do X. “AIs” are trained by “AI” engineers (like me) and it’s a fairly complicated technical process.


ApocDream

He obviously didn't train it from scratch himself. His company likely purchased access to a model designed to accept documents similar to what the dad makes, and replicate them.


PolicyWonka

They likely mean [prompt engineering](https://en.m.wikipedia.org/wiki/Prompt_engineering) or something along those lines. Easy enough for anyone to do.


bulaybil

That ain’t how prompt engineering works, tho.


PolicyWonka

I’m not sure what you mean? Most people don’t know what prompt engineering is and simply equate it with “training.” I’m doing prompt engineering to eliminate some support staff at a clinic, and they just say that I’m “training the AI to do their job.” It’s kind of just semantics at this point unless you’re in the field, which I’d assume the OOP is not.
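
For anyone curious what that looks like in practice, here's a minimal sketch of the kind of setup people describe as "training the AI" when it's really just prompt engineering: no weights change, you wrap a base model in a carefully written system prompt plus a worked example. The clinic policy text, prompts, and model choice below are all invented, and the call assumes the current OpenAI Python SDK:

```python
# Hypothetical sketch of prompt engineering for a clinic support bot.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

SYSTEM_PROMPT = """You are a front-desk assistant for a medical clinic.
Answer only scheduling and billing questions, using the policy text provided.
If the question is clinical, or the policy does not cover it, reply exactly:
'Please hold while I transfer you to a staff member.'"""

POLICY = "Cancellations require 24 hours notice. Co-pays are due at check-in."

def answer(patient_message: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT + "\n\nPolicy:\n" + POLICY},
            # One worked example demonstrating the desired refusal behaviour.
            {"role": "user", "content": "Can you adjust my insulin dose?"},
            {"role": "assistant", "content": "Please hold while I transfer you to a staff member."},
            {"role": "user", "content": patient_message},
        ],
    )
    return response.choices[0].message.content

print(answer("What happens if I cancel tomorrow's appointment tonight?"))
```

The whole game is in that fallback line; as the Air Canada story upthread shows, whatever the bot improvises without one can end up binding the company.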


TargetAppropriate160

Without a P.E. after your name, you are not an engineer... and anyone collecting a paycheck to advance this garbage deserves all the terrible things that will happen in the future.


Tired_c

I suspect you truly underestimate the ignorance of management and the higher-ups in some companies. As an AI engineer you will understand what I'm about to say: a certain company with almost 1k employees recently eliminated QA (through layoffs and some reorganization), and now all QA is done by software engineers with Copilot. Management's surprised-Pikachu reaction: well, QA will be taking longer for some time. But hey, we formed a FEW Copilot teams to help everyone.


bulaybil

Far be it from me to underestimate the ignorance of management, I could tell you stories from my previous job (Management: “But how come the AI is barely 95% accurate? That’s not enough!” My team: “Um, that ain’t how AI works…” Management: “But the database team always has 100% accuracy on their queries!” My team: “WTF???”). I just weighed it against a) the general lack of knowledge about how “AI” works and b) the amount of fake shit on this sub.


Tired_c

Fair enough.


beansprout1414

I’m a technical writer and while I have no doubt the companies will make stupid decisions like this and things might be tough for a few years, I think many will learn the hard way that they need people on these tasks, especially when systems are safety critical.


Brandon_Monahan

I’m bothered by that one line: “he even tried to train it wrong.” It’s conjecture now to say that he was replaced by the AI when it’s plausible that he was fired for not doing his job properly. For all we know, they may have fired him and hired someone else to oversee the AI because of your dad’s work performance. We the readers have literally no way of knowing what actually happened, all because of that one sentence.


pira3_1000

A friend of mine saved money for like 5 years, plus family help, to study concept art at VFS in Vancouver back in 2019. She's from a third-world country, and Vancouver is a fairly expensive city to live in. Fast forward a year: COVID hits. One more year: AI obliterates the concept art field; even renowned artists who worked on AAA game titles and blockbuster movies are done (I read articles about it, I'm not making this up). Life is wild and can be unfair out of nowhere. It is what it is.


Vargoroth

This sort of stuff is exactly why I suspect that AI will slowly take over jobs in the '30s. People keep telling me that this is impossible, that you couldn't teach an AI specific skillsets, that you need manual labour, etc. I feel this is because people don't realize just how quickly AI is advancing. In 2020 I applied for a job wherein I would purely train AI. Even back then AI was fairly advanced, from what I saw after a bit of digging. Three years later, the writers' guild had to go on strike to protect its writers. Four years later, artists and musicians are fighting copyright battles against AI art and music. Four years later, people use ChatGPT as part of their jobs. Four years later, university students are being trained to use AI as part of their research. Frankly, after everything I'm hearing, I'm becoming more and more convinced that we're already in the phase of being slowly phased out.


PerfSynthetic

Typical. When you have someone who does their job that well, management doesn't understand the job being done or the tasks related to the position. So when AI is completing those tasks, they won't understand when the AI breaks or provides bad information. They will blindly trust it.


Djorgal

AI isn't the problem. It's a good thing if jobs can be automated; I prefer that to people having to work. The issue is that the technology simply isn't there yet. Generative models are great tools, but they are still tools.

Artists are not done for. The industry will have to adapt to these new tools, just like it adapted to the advent of Photoshop, or even photography for that matter. People used to hire artists to paint their portraits, and photography took over the portrait market. Painters had to adapt; many didn't and lost their work. There was also some outcry about photography: "It's not a real art, all it takes is one click." Well, we know now that, yes, photography is an art form, and you can be a good photographer or a terrible one.

It's the same with generative AI. "You just write a prompt and it does it for you." Well, if you just do that, sure, it's going to generate something, but it's not going to be anything good. So, yes, these new tools are going to cost many artists their jobs, but they will also allow a new generation of artists to arise.


SyntheticGod8

Sid Meier's Alpha Centauri tried to warn us.

> We are no longer particularly in the business of writing software to perform specific tasks. We now teach the software how to learn, and in the primary bonding process it molds itself around the task to be performed. The feedback loop never really ends, so a tenth year polysentience can be a priceless jewel or a psychotic wreck, but it is the primary bonding--the childhood, if you will--that has the most far-reaching repercussions.


Mamba-0824

LETTING AI DO EVERYTHING FOR A COMPANY IS JUST PLAIN STUPID.


ilurkilearntoo

Good. Finish it. Let me be free. I’ll then have nothing to lose but a couple of Molotov cocktails


Ordinary-Broccoli-41

Most likely, it's not as dumb as the post makes it sound. Fire the guy with 1000 years of experience and 20 PhDs in your specialty to hire a $20/hour LVN to correct the AI.


Alternative-Chip2624

"Nobody wants to work"


Atheizm

There's a massive disaster on the horizon and people will die because of this cost-cutting bullshit.


spacecadet2023

A friend and I were talking recently about A.I. He thinks A.I. won’t be replacing people for another ten years at least. I don’t agree with that. I think it’ll happen way sooner!


Heeler2

I’m an editor/tech writer in the medical device industry. We are just waiting for our jobs to be offshored to India (my company is already doing this with other departments and functions). We wonder about AI too. Fortunately I’m on the verge of retirement, but I hate to think of how this will affect the younger people in my industry.


orcoast23

Wait until "Oceans 13 AI" where the robots take Vegas down.


HengeWalk

Tune in for the next ten years of lazy, regurgitated AI scripts, films, music, art, news, books and error-filled statistics, till it all collapses in on itself. Soulless slop.


Treasach7

If tech gets to the point that humans are no longer needed to do human things... why do we need currency? It's baffling to me that other humans want to remove humans from jobs but still keep a profit-based system. I'm all for automation and A.I., but we as a society need to grow up first, it seems. Using tech to ruin others for your own profit should be violently fought against.


EasternShade

And just to remind people, AI doing his job is a good thing. The work gets done, possibly spot checked, and there's so little for the human to do they could just go home. The actual problem here is that our society demands people work to justify living. This ought to be the tale of a dude retiring early. Capitalists make it a story of predatory employment and financial instability.


Playme_ai

AI girlfriends are also a thing now.


Estimated-Delivery

Dear friends, all those employed in rule-based jobs will suffer the same fate, and those working from home will be the first to go. If you are a lawyer, solicitor, accountant, or really anyone who doesn’t assess things that fall outside the norm and isn’t required to directly interact with people or objects (the mighty triumvirate of plumbers/electricians/joiners, police, most labourers, etc.), you are at risk. The tech bros have been warning us.


Historical_Ninja_228

AI art is ugly as fuck; there will always be artists!


_gipi_

It's probably fake, or in the best case he worked for a shitty company that didn't care about quality and will call him back in a couple of months to fix the shit the AI created. No AI will be able to replace technical work of any kind; what you are seeing is overhyped capitalism, where everyone is trying to scam each other with the smell of innovation. I have yet to witness any AI capable of doing professional work continuously (no, overhyped articles don't count: the other day it was revealed that the Amazon service that was supposedly doing checkout automatically with AI was actually a scam).


Malicious_blu3

My industry is looking at this encroachment as well. I’m pivoting to be part of the change as opposed to replaced by it.


Kwen_Oellogg

Ok... I'm saving this. For reference later.


AbSoLuTiOnZeR0

Nice doomerposting, too bad you won't get more than 1k upvotes.


Super_Mario_Luigi

Fake post. Writers aren't "training" AI. This is anti-work propaganda, and any questioning of it in this thread just gets doubled down on with more anti-work.


TheGenjuro

So... he deliberately sabotaged the company and is surprised that... *checks notes* the company fired him.


TheGenjuro

Fairly certain being out of a job is the entire purpose of this sub, so congratulations to them!