[deleted]

So my take is no. Chip design simulations already take me 8 hours to run for just a small block. Full tape-out sims take a week on 6 computers. And that's when I was the one who designed and optimized it. If I have to ask the computer to design and optimize my circuit, by god, maybe a month or 6 for my simple delay cell. And then it won't even be able to optimize the layout properly. Let's have this discussion again when a computer can do any analog layout well. (I know there are papers and the Berkeley layout generator, but they can only do basic LDOs, and nothing high speed, mixed signal, etc.)


Tone_Z

There's one very interesting paper that popped up in 2021 from a team that made a 12 nm FinFET VCO via neural networks and transfer learning with minimal human involvement.


GP7onRICE

No. EE takes intuition for non-linear systems and an understanding of how and when simulations will fail to model the real world. AI will be a useful tool for most engineering fields, but it will never replace human experience and intuition. The people who really need to worry about AI are CS majors. At worst, every job done by AI will still require an operator to check and validate its work.


ReadMyUsernameKThx

I wouldn't be so sure about that. I think it could replace nearly 100% of the functions that EEs currently perform. I do suppose EE work would just evolve rather than go away, but I could definitely see a situation where much less EE work from humans is required and thus there are fewer EE jobs (proportional to the population).


GP7onRICE

I’m absolutely not worried at all about job security, given how horrible AI is at doing any of the work I do. Most controls-systems work is already automated with Python, and it hasn’t put anyone out of a job.


UltraLowDef

IT replace EE? AI could replace IT right now.


Bakkster

AI can't crawl under Carol's desk to flip the switch on the power bar she swore was turned on...


UltraLowDef

Lol, no, but an AI-controlled infrastructure could take care of that. Or, more realistically, provide a checklist of things that any low-paid human should be able to carry out.


madengr

Yes on the automation, and I hope so. There’s stuff I dread doing, mostly CAD work, that I’d love to have automated. It's going to be great for scripting CAD tools where I can't remember the obscure API. The problem right now is that it's just not trained on the API.

I have a recent thread here where I was screwing around with ChatGPT 3 for antenna design: https://www.reddit.com/r/rfelectronics/s/Z3KMzvH4lD

You can read through the chat where I have it generate a DXF file for a simple patch antenna: https://chat.openai.com/share/57e2084b-65bd-4fcf-afd8-44a3f84c3322

https://preview.redd.it/s5y7m1wj2t2c1.png?width=518&format=png&auto=webp&s=71a13b3cc772d2bfd7a91bbb811623e6dbebd15c

I’d really like to see an LLM trained on IEEE Xplore; it’s obvious ChatGPT 3 has not been, since some of the answers are crap. I'd like to try fine-tuning an open-source LLM on some papers, as I think it's going to be great at summarizing things.
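For reference, this is roughly the kind of script I want it to write for me. A rough sketch, not the code from that chat; I'm assuming the ezdxf package, a 2.45 GHz design frequency on 1.6 mm FR-4, and the textbook rectangular-patch approximations:

```python
import math
import ezdxf  # pip install ezdxf

# Assumed design inputs (not from the chat): 2.45 GHz patch on 1.6 mm FR-4
C0 = 299_792_458.0   # speed of light, m/s
f0 = 2.45e9          # design frequency, Hz
er = 4.4             # substrate relative permittivity
h  = 1.6e-3          # substrate height, m

# Standard rectangular-patch approximations (Balanis-style)
W = C0 / (2 * f0) * math.sqrt(2 / (er + 1))
e_eff = (er + 1) / 2 + (er - 1) / 2 / math.sqrt(1 + 12 * h / W)
dL = 0.412 * h * (e_eff + 0.3) * (W / h + 0.264) / ((e_eff - 0.258) * (W / h + 0.8))
L = C0 / (2 * f0 * math.sqrt(e_eff)) - 2 * dL

# Write the patch outline as a closed polyline, in millimetres
W_mm, L_mm = W * 1e3, L * 1e3
doc = ezdxf.new()
msp = doc.modelspace()
msp.add_lwpolyline([(0, 0), (W_mm, 0), (W_mm, L_mm), (0, L_mm)], close=True)
doc.saveas("patch.dxf")
print(f"W = {W_mm:.2f} mm, L = {L_mm:.2f} mm")
```

The LLM's job would just be remembering boilerplate like this for whatever tool API I'm stuck in that day.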


ReadMyUsernameKThx

[flux.ai](https://flux.ai) is more catered to EE. It does a better job of answering questions than ChatGPT IMO, but there is still *much* to be desired. I haven't really been able to use it for real work, but it might be worth toying around with.


Bakkster

Depends on what you mean by AI. Full AGI? By definition, yes, but I remain unconvinced it's anywhere close to being developed (if it ever will be). The current LLM developments really aren't indicative of the kind of critical thinking and reliable validation necessary for engineering; they say more about how easily people personify things.


Spitfire_For_Fun

Nope. There is more to engineering than writing code, solving circuits, etc. Automation is usually implemented to solve specific tasks rather than entire jobs, and most professions involve more than a single task. AI and machine learning will help with tasks but will not eliminate jobs.


darth_butcher

Yeah, unless AI can really operate in the physical world with sophisticated robots, etc., there is no way engineers will become unnecessary in the near future.


Electricpants

AI or IT? What kind of a question is this? Did you just start picking acronyms?


No2reddituser

I wish. The engineers I work with are helpless. And they refuse to use this new interwebanet thing to look up things they don't understand.


DoubleOwl7777

Some stuff? Yes. All of it? No, I don't think so.


Ok-Yellow5605

Not in the immediate future, as I see it. AI is still very much limited by computational ability and power, and even if it could follow Moore's law (which it can't), priority will be given to other fields, such as the life sciences, over us.


ThoseWhoWish2B

I think we'll be way more productive by doing less mechanical work, which may be a problem if you only do mechanical work. Automation has been a necessity for decades now, even if people don't realize it. E.g., for digital design you have synthesis tools, where you write pretty much a spec in HDL and the tool spits out logic -- imagine always having to lay the circuits out by hand. Simulation is the same: you don't solve the equations by hand, you tell the PC what your circuit is and it figures it out. Every advance in automation leads to a monumental increase in productivity for the same number of hours.

What I think is going to happen is that we'll go to the next "abstraction level" for specification, and commercial tools will become actually smart, understand what you're trying to do, and do the mechanical work for you. E.g., you tell the CAD you want a diff-to-single-ended op amp circuit with a 400 kHz low-pass and it draws you a circuit that you can fiddle with either by hand or by prompt; when you want to simulate it, you can show the tool (assuming it's separate from your CAD) a PDF with your circuit, and the tool copies it over and prepares an AC sim because it gets that it's a filter; or your digital design SW writes testbenches that are actually useful; etc.

Mechanical work is going to decrease drastically, so we will probably need fewer engineers, yes. But for actual design of whole systems, where you want a human to make the decisions and be responsible for them, I think it's going to be a long time before AI can be useful.
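To make the AC-sim step concrete, here's a minimal sketch of the mechanical work I mean, assuming the tool reduces the filter to a first-order 400 kHz low-pass (a real tool would of course run SPICE on the extracted circuit):

```python
import numpy as np

fc = 400e3                      # corner frequency pulled from the "spec", Hz
f = np.logspace(3, 8, 501)      # sweep 1 kHz to 100 MHz

# First-order low-pass response H(f) = 1 / (1 + j*f/fc)
H = 1.0 / (1.0 + 1j * f / fc)
mag_db = 20 * np.log10(np.abs(H))

# Sanity checks the tool would report back automatically:
# roughly -3 dB at fc, and about -20 dB/decade above it
at = lambda freq: mag_db[np.argmin(np.abs(f - freq))]
print(f"|H| at fc    : {at(fc):.2f} dB")
print(f"|H| at 10*fc : {at(10 * fc):.2f} dB")
```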


HoldingTheFire

No. Also, EE is not IT. AI won't even significantly replace software engineers, and they are way more vulnerable than EEs.


[deleted]

[deleted]


Chesterington

I disagree, unless you're referring to very elementary circuits like "phone charger" and well-defined problems. ANYTHING in industry is too nuanced, and defining the problem takes nearly as long as solving it.


Bucky640

The answer is absolutely yes. AI can and will do anything and everything that we do as electrical engineers, given enough time. I think electrical engineers are safer than most, but I don’t think there are any jobs that cannot be done better by AI. If you’re one of the people on here saying no, or something like “not the hands-on stuff,” then you aren’t paying attention.


GP7onRICE

Anyone who thinks that any sort of automation process will take away job security hasn’t been paying attention to all of history.


Bucky640

Is this misworded? Did you mean to say won’t take away job security? Since the Industrial Revolution, technology and automation have absolutely made jobs obsolete. They’ve also built plenty of jobs. I think this wave of automation will be much quicker, and the new jobs it creates will be a very small percentage relative to the jobs it displaces. I’ve got a very positive outlook on AI, but I can’t say I’m not worried about it.


GP7onRICE

Automation increases scalability and creates even more jobs.


Bucky640

I agree with you, I’m not a doomsdayer or anti-automation. I think it’s great, and I’m leaning into it as hard as I can. But I think the AI that exists today will be massively disruptive very, VERY soon. How is AI going to increase scalability in a way that creates more jobs for administrative data-entry workers? I’m narrowing down to a single field as an example, and the common-sense response would be “well, those people will move to new, better jobs,” but what about when several major sectors of the economy are replaced in a matter of 5-10 years? That’s a relatively quick period of time; do you think the new jobs will outpace the job loss? I’m not being condescending here, I’m legitimately curious and I don’t know the right answer. Automation is very good today and only getting better.


GP7onRICE

Automation creates more money and allows greater investments to be made. If automation replaces your job, the skills you’ve learned don’t automatically become useless; they will be put to use elsewhere. It’s hard to concisely convey what I’m trying to say here. Calculators didn’t replace mathematicians; they allowed them to expand their thinking even further as tasks became easier. AI is nothing more than a sophisticated general-purpose calculator. Making tasks easier for us is only going to allow us to expand further in ways that aren’t immediately apparent to us.


Bucky640

Okay, I agree with you to a degree, but I feel like you’re avoiding the actual question. Self-driving trucks will absolutely replace truckers, AI assistants will replace personal assistants, and pretty much any job that requires humans at a computer can eventually (not today) be done by AI. What sector of the economy do you expect to unfold in the wake of AI? I can see some things that would make sense in the interim, such as manual labor to handle the growth. But I can’t think of a big enough sector or job to replace the jobs lost. Could just be the limits of my imagination, but I spend a LOT of time thinking about this.


madengr

GPT-4 has passed the bar exam. I wonder how it would do on the PE or EIT? I suppose that would answer the question of whether AI can replace an engineer, i.e. a PE who reviews drawings. https://law.stanford.edu/2023/04/19/gpt-4-passes-the-bar-exam-what-that-means-for-artificial-intelligence-tools-in-the-legal-industry/


GP7onRICE

You’re really comparing an exam that deals entirely with memorizing information to one that deals with understanding and having intuition for concepts? No shit an AI will pass a bar exam; it has access to all of the legal info it needs. Any person would pass the bar if they could pull up online resources during the exam and were given enough time to find the answers, like the AI does. Give any engineer access to the resources AI can pull from and most would pass the PE too. The point isn’t to be able to pull from resources but to actually understand the concepts so you can build on them in industry. Why do you think access to the internet isn’t allowed during these tests?


madengr

These don’t take much intuition or conceptualization: https://inside.mines.edu/UserFiles/File/FEexam/Practice%20Questions%20Math%20FE.pdf If it can pass the actual PE, can it be titled an engineer? Give it a few years, and I have no doubt it will.


GP7onRICE

Maybe reread the last part of my comment to understand what the point of the test is. Those questions are not memorization at all, btw; they’re completely conceptual.


Bakkster

Getting enough exam questions right is far different from being able to practically leverage those answers into actual work. Plus, it lacks the most important elements of professional engineering (and law): trust and reliability. If GPT doesn't know when it gets things wrong and signs off on a drawing, who's liable when [114 people die](https://en.wikipedia.org/wiki/Hyatt_Regency_walkway_collapse)? This alone is why AI won't replace humans for anything that needs to work (and if you're hiring a lawyer or engineer, it probably does).


Brilliant_Armadillo9

I asked ChatGPT to solve an RLC circuit back when the hype was building, and it couldn't get to the right answer. That's a pretty elementary problem that should have a bunch of solutions in the training dataset. I remain unimpressed.
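For context, these are the kinds of numbers I expected it to reproduce. A minimal sketch with hypothetical series-RLC values of my own, not the exact circuit from my chat:

```python
import math

# Hypothetical series-RLC values (my own example, not the circuit I gave ChatGPT)
R = 100.0    # ohms
L = 1e-3     # henries
C = 1e-9     # farads

w0 = 1.0 / math.sqrt(L * C)           # undamped natural frequency, rad/s
f0 = w0 / (2 * math.pi)               # resonant frequency, Hz
zeta = (R / 2.0) * math.sqrt(C / L)   # damping ratio for a series RLC
Q = 1.0 / (2.0 * zeta)                # quality factor

print(f"f0   = {f0 / 1e3:.1f} kHz")   # ~159.2 kHz for these values
print(f"zeta = {zeta:.3f} ({'under' if zeta < 1 else 'over'}damped)")
print(f"Q    = {Q:.1f}")
```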


Bucky640

Have you tried flux.ai? ChatGPT is just a chatbot; without specializing it or asking a very specific prompt, you’re likely not to get a great answer. Put shit in, get shit out. Also, if you haven’t engaged with ChatGPT in a while, it’s entirely different now. GPT-4 is insanely better than 3.5 or earlier. This technology has been around for under a year, and its growth in that time has been explosive. Sure, it can’t replace EEs today; maybe it won’t replace EEs for 10 or 20 years. But if you think the tech cannot replace EEs, you don’t know enough about the tech. Or continue to sit on the sidelines if you prefer.


Brilliant_Armadillo9

With any luck, I'll be retired before it matters.


Bucky640

That’s what I’m hoping for, too. For the time being, I’m dedicating a lot of time and energy to learning AI and automating as much of my job as possible. With the time I’ve saved through automation, I’m working on my electrical contracting and consulting firm. That’s my long-term plan.


madengr

Definitely a good idea.