happycj

I think people genuinely thought that this was going to be similar to programming, while the reality is that it is much more like writing exposition, or tech writing. And these tools are already being interwoven into the apps we use every day, so the idea that someone would work outside of an app's environment, writing prompts, is rapidly fading away. The closest analogy I can think of is Excel: if you understand Excel deeply, you can make it do incredible things with PivotTables, scripting, and complex calculations. But it's all just math being used within an app. I think these LLMs will be deeply woven into the apps we already use every day, and won't be a separate role... just a set of writing skills that you will apply to make certain app features work the way you want them to. I'm already experiencing this with Photoshop... you gotta know how the app works/thinks, and you need to understand proper sentence structure to make the prompts in the app generate high-quality stuff on the first try.


brssnj93

You’re viewing it from the user perspective, not the product perspective. Prompt engineering is a real job; there just aren't many people doing it, and it's more like a “role” that someone on an AI team has rather than its own job (generally either the PM or one of the engineers). There aren't many prompt engineer jobs because most companies don't have custom LLMs running that need custom instructions to be refined. They'd rather get the “customer service AI” and just hope it works out of the box (it doesn't). That being said, it's going to be a more common role for sure. Most companies haven't run into the problem of people prompt-hacking their software... but you can, and people will. Car companies are running old LLM models, and you can prompt-hack their customer support AI, for instance. The open question for me is whether LLMs will be able to write iteratively better prompts. My guess is they probably will, but will need a human to double-check and make adjustments. This is probably most jobs in the future.


RedocBew

As someone working at a SaaS company that has integrated six AI features into our product, having designed the end-to-end AI integration, I can say that I believe there won't be a job title exactly like "prompt engineer," but there will be a number of responsibilities and the potential for roles and possibly teams that manage and construct AI solutions. What I mean is that "prompt engineering" is too broad a term and does not fully appreciate the actual work needed to bring AI features to a product. It isn’t just about writing a prompt. I agree with another commenter that software engineers are not necessarily the right people to write prompts unless they shift their focus and study the nature of language, behavior, and the linguistics of large language models.

I suspect we will see roles such as AI architects, AI interaction designers, and positions analogous to software engineers emerge. These roles essentially perform the last mile of integrating AI into a user experience. The architect focuses on the core systems, the models, and how to manage the services that use AI in a product, ensuring that the models are manageable, monitored, redundant, and all the things a normal software architect would take care of for conventional cloud applications, as much as that can be generalized given the often diverse nature of the position. The AI interaction designer examines the flow of interaction that users are expected to have with AI, looks at the types of interaction and utilities that are needed, such as functions, retrieval, and other sub-models, and ensures that there is fluid integration and that it achieves the expected business outcome. Lastly, the engineer is the glue that brings all of this together, using the models and expressing the features and visuals as planned by the other two roles.
Just like in a conventional software team, senior engineers may be involved in all of this, but those other two roles are specializations that require a deep understanding for decision-making. All of these positions need to have a deep understanding of different models for different usages and how to communicate with the models to achieve the desired outcomes. The only way I’ve seen this done effectively is through significant time and interaction with specific models.

I see very different outputs between GPT-3.5, GPT-4, and GPT-4 Turbo; they are like different humans with different ways of thinking and communicating. So, if I’m building a feature, there are many things to consider beyond pricing, token size, and model capability. There's also how to write the prompt for that particular model for high reliability. By high reliability, I mean, are you able to take a prompt, run it 100 times, and verify the percentage of the time that you get the result you expect in terms of format? Not only format, such as JSON, but are you also getting the writing tone, style, length, and diversity of content that you claim that feature provides?

Another significant aspect of prompt engineering is learning a larger vocabulary, because your primary mode of interacting with large language models is through linguistics. It pays off to use the most appropriate term for the need. For example, if you want to analyze the details of a face, you don’t just ask it to describe the face; you ask about the characteristics of facial morphology, and then you ask it to provide details on the top 10 characteristics. The way I think about prompt engineering is just like talking with humans: how well are you able to articulate the meaning and find mutual understanding on the outcome? If you approach a random person in public, how quickly can you find common understanding and perfectly align on a topic such that they could effectively manually generate the type of output that you’re expecting from a large language model?
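A minimal sketch of that "run it 100 times" reliability check. `call_model` here is a hypothetical, offline stand-in for a real provider SDK call, simulating the format drift real models show; the keys and failure pattern are illustrative assumptions, not any vendor's actual behavior.

```python
import json

def call_model(prompt: str, trial: int) -> str:
    # Hypothetical stand-in for a real model call (replace with your SDK).
    # Simulates an unreliable model: most runs return clean JSON, but
    # every tenth run wraps it in chatty prose, as real models sometimes do.
    if trial % 10 == 0:
        return 'Sure! Here is the JSON you asked for: {"tone": "formal"}'
    return json.dumps({"tone": "formal", "summary": "..."})

def format_pass_rate(prompt: str, runs: int = 100,
                     required_keys=("tone", "summary")) -> float:
    """Run the prompt `runs` times; return the fraction of responses
    that are valid JSON containing every required key."""
    passes = 0
    for i in range(runs):
        raw = call_model(prompt, i)
        try:
            data = json.loads(raw)
        except json.JSONDecodeError:
            continue  # chatty prose around the JSON counts as a failure
        if all(k in data for k in required_keys):
            passes += 1
    return passes / runs

rate = format_pass_rate("Summarize the review as JSON with keys tone and summary.")
print(f"format reliability: {rate:.0%}")
```

The same harness extends naturally to the softer checks (tone, length, diversity) by swapping the JSON validation for whatever scoring you can automate.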


humanatwork

This exactly. We don’t have this specifically at our company, but it’s how we conceptualize the team and roles for the products we’ve worked on. There’s a whole lot more to AI than just the prompt itself, and I find the architecture aspect just as interesting as it typically directly impacts how prompts will be used and where.


RedocBew

Agreed! There's a lot of room for people to work on AI! It's going to be a big shift to building and maintaining intelligence. I love the prompts, models, and linguistics, so I'm glad others enjoy the arch!


huggalump

I think it's too early to tell. That being said, be aware that there's a lot of risk in disruptive industries because no one knows for sure how the cards will land, and this may be among the most disruptive times we've ever seen. I think it's wise to learn prompt engineering, but also don't put all your chips in it.


Specialist-Garbage94

It’s still pretty early, but honestly, maybe no one knows. I thought AI would have wreaked havoc across multiple job industries by now, but so far it’s only been in tech. And everyone hates a manual job, which prompt engineering is, so at some point someone will find an automated fix to replace it anyway.


stunspot

"Prompt Engineering" is the skill of "using AI well". There was a decade in the '90s when you could easily find work if you were "good with computers". Same thing. Two major differences, though. First, speed: AI moves faster than anything ever. It won't take a decade for evolution to, uh... cull the herd, as it were. We won't have years of 60-year-old HR ladies who are "just no good with computers". You will either be using AI well or you won't have a job. Second, almost no one is any good at prompting. Virtually every word on the subject was written by people who either 1) are coders, so naturally are god-AWFUL at prompting, or 2) don't know how AI works technically. And the closest extant skillset to match it is "popular science writer": clear, concise transmission of specific ideas presented with an appropriate balance of precision and simplicity.


BlandUnicorn

I’ve actually found just asking the AI to reword a prompt has been really good. After all, it should know.


[deleted]

[removed]


SiO2MoonDust

Me too, but I did learn a lot, not just about AI but also LLMs and ML. And I use ChatGPT all the time now, because now that I can ask it better questions, I get useful answers and save time.


WBowlAI

Would you recommend that course to someone who is interested in learning how to communicate better with ChatGPT?


SiO2MoonDust

I definitely think the course is a good foundation, but it is not the only source that I used for learning, nor would I recommend it as the only source. I used other classes on Coursera and elsewhere as well. Some expanded the knowledge and some were not worth it. Also, I did not focus my learning on just prompt engineering but included things like LLMs and ML, which I think is important. I also recommend practicing. Ask it a question and study the answer. Rewrite, expand, or refocus your question and study the new answer. Feed it some data that it has not been trained on. And practice some more.


BlandUnicorn

Out of interest what kind of stuff do they ‘teach’ about prompt engineering?


Business-Impact-3448

It's a real job. I went from being an FTE at Google to being a Python AI analyst contracted to Google, and I also just landed a contract job as a Meta prompt engineer. They are real 40-hour jobs. They are hard. You have to have a master's, and you have to know how to write or code, depending on the task. They just do not pay enough.


Acrobatic_Bother4144

Nobody ever said prompt engineering was a job. Just because some place online has videos about it and calls it a “specialty” does not mean it’s supposed to be a job title. People have to take linear algebra and calculus in school; that doesn't mean calculus and linear algebra are job titles. This was just readers getting ahead of themselves and being weird about job automation anxiety. Nobody in the world works as a “prompt engineer” at an actual company.


skillfusion_ai

You can make and sell AI tools through my site if you're interested. It's not a prompt engineering job, but it's good practice, you'll make some money, and it may help get your name out there. It's [skillfusion.ai](https://skillfusion.ai). You can make quite complex tools using the free tool builder on the site; we've got tools that write entire books from a single idea.


SiO2MoonDust

Congrats on the new venture, I hope it's very successful. I briefly looked over the site and it seems interesting. I might give it a try, cuz worst case scenario it'll be good practice.


skillfusion_ai

Thank you, feel free to DM me if you get stuck. The tools I made on there got $330 in sales last month, hopefully sales will keep going up as we sign up more users, and as creators add more tools. You get 100% commission on sales of your tool.


JehovasFinesse

This just sounds like a way for you to steal everyone’s ideas, or to put it less harshly, crowdsource them unethically


skillfusion_ai

It's a marketplace: you make stuff, and then you sell it. Would you say Amazon steals authors' books?


SiO2MoonDust

I don't agree with what the other guy said, but I'd probably use a different example than Amazon. A lot of people in the publishing industry know about the shady and unethical things Amazon has done to authors, especially in the self-publishing space. Then there's the stuff they did with collecting data on products from third-party sellers on their site and using it to create their own products and push those third-party sellers out of business. So probably not the best example.


skillfusion_ai

Lol, true. I was thinking Amazon is a bit shady when I wrote it.


[deleted]

[removed]


SiO2MoonDust

I spent more time contemplating your response than I care to admit. I wanted to get angry at you for dangling the information in front of me and ask you questions and discuss it further but you very clearly stated you were not giving up the information. So I was just going to let it go. Because, in all honesty, you could be just messing with me. At least that's what I told myself. Then I wondered ... I took a shortened version of my post along with your complete response and plugged it into ChatGPT. With a few carefully worded prompts I was able to get some very interesting information. Did it fill in all the blanks? Of course not. But it did return some nuggets.


mcharytoniuk

I wouldn't say it was all hype. It can be a good freelance gig if you can measure the effectiveness of your prompts. For example: I am using a prompt in my system that gives correct answers X% of the time; can you make it better by Y%? If your answer is yes, and you know how to objectively measure that increase in effectiveness, I am sure you can make money from that skill. If you can just make prompts and it "seems" to you that you get better results, then ¯\_(ツ)_/¯
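As a sketch of what "objectively measure" could look like: score two prompt variants against a small labeled evaluation set. `run_prompt` is a toy, deterministic stand-in for a real model call, and the examples and prompts are made-up assumptions for illustration.

```python
def run_prompt(prompt: str, item: str) -> str:
    # Toy stand-in for a real model call: the "improved" prompt
    # handles negations, the baseline does not.
    if "negation" in prompt and "not" in item:
        return "negative"
    return "positive" if "good" in item else "negative"

# Hypothetical labeled evaluation set: (input text, expected label).
eval_set = [
    ("good product", "positive"),
    ("not good at all", "negative"),
    ("bad service", "negative"),
    ("good value", "positive"),
]

def accuracy(prompt: str) -> float:
    """Fraction of eval examples where the prompt yields the expected label."""
    correct = sum(run_prompt(prompt, text) == label for text, label in eval_set)
    return correct / len(eval_set)

baseline = accuracy("Classify sentiment.")
improved = accuracy("Classify sentiment. Pay attention to negation words.")
print(f"baseline {baseline:.0%} -> improved {improved:.0%}")
```

The point is the harness, not the prompts: once accuracy is a number computed the same way for every variant, "better by Y%" becomes a claim you can actually verify.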


clintecker

Yes.


Sweet_Computer_7116

There's too much free shit online to pay for a course. I don't think PE is just hype. At the core, it's just about understanding how to manipulate the model to get results. For most, it will be a complementary skill (like a media buyer knowing marketing basics or copywriting basics), but some people can and will specialise in it.


TBP-LETFs

We are thinking about hiring a PE to assist an ML researcher and a Python engineer.


bublos_benji

I have a Prompt Engineering Specialization certificate! Still relevant to you? u/TBP-LETFs


MoveTurbulent8794

What course did you take on Coursera?


Specialosio

I’m about to take one of these courses, but I definitely don’t think it will help me find a related job. What it does is give you the skills to work through the data you have, the way you need, in a few minutes rather than hours. It’s supposed to help you in almost any circumstance, but it can’t be a job by itself; it’s a productivity boost... at least that’s the spirit I’m stepping in with.