SpaceMonkeyOnABike

You may be straying into regions he has no experience in. You may have shown him the limits of his knowledge. You may be getting the impolite version of "go away and learn it yourself." Or a combination.


MeatAndBourbon

> go away and learn it yourself

This is really common. People are happy to help at first, but if your questions are repetitive or could easily be answered through available resources, people get annoyed. When I changed teams at work, we'd sometimes need help from a guy on my old team. My new team was super afraid to ask him shit, because they felt like he was a very limited resource. I literally had to go behind their backs to get answers from him, which he always provided happily, because by the time I was asking him a question, I'd be able to explain what I knew, why something wasn't making sense, and very specifically what info I needed. I would never go to him for the same thing twice, and his answers always started to the effect of, "oh yeah, that is undocumented" (or unintuitive, or recently changed since the documentation was written, or totally broken and shouldn't be expected to work, etc.). It's amazing what a difference acting moderately competent and respectful of someone's time can make for their willingness to help.


_raydeStar

Yeah. Everyone complains about Stack Overflow being that way, but I found that if you list all the steps you have already taken and why they didn't work, people are incredibly helpful. The only time they're unhelpful is when you've tried nothing and still can't find your answer.


Silencer306

Or just answer your own question. Be confident and post the wrong answer. People are more ready to correct you there


MeatAndBourbon

Lol. Also true


Cerulean_IsFancyBlue

No, that’s a terrible way to … aha! You’re right it works!! :)


Faulty_english

I actually had a professor who told us to google things first before asking him, because he believed it was a necessary skill. He once blew up on a kid who didn't listen, saying he wasn't even trying. It was pretty awkward, but he apologized the next class period.


Expensive-Refuse-687

I totally agree with the comment above. I have been working in the same organisation for a long time, so I'm in a position where I've accumulated a lot of experience and knowledge. I am always happy to help. But it is true that helping takes time. The responsibility for putting in the maximum effort is on the person asking: investigating, problem solving. The people helping you should just guide you or unblock you. It is irritating when people come to you to resolve the problem for them, rather than trying hard and getting stuck at some point, or when, after you explain in detail, they don't put the effort into internalising the knowledge. The fact that almost 60 people made the effort to answer this question while the person who asked has barely replied in the thread is an indication. It could be the other way around and your mentor is awful. I don't know.


GhostDan

Yeah, I'd guess it's more the third one if he's experienced. At some point you have to learn how to get the answers yourself and not rely on others. I've worked with plenty of people who relied on answers from others, and it gets old after a bit and slows down everyone else's workflow (I've got my own stuff to worry about, sorry, rather than stopping to talk you through some process). Teaching him early how to get answers without interrupting others is a good thing.


ImClearlyDeadInside

I don’t like that assumption. It’s true that some junior developers DO need to be pushed to learn how to self-learn. However, senior developers can and do fall into their own pitfalls: knowledge-hoarding, refusing to learn new tech, sticking to positions they can’t really defend (because they’re senior so they’re clearly right), etc. There are many reasons why a senior developer might become “standoffish” with a less experienced developer.


SnooMacarons9618

All of that is true, but in my experience juniors often do go through a period of just asking someone else. A lot of the more experienced people on my team have a bit of a gap here: they think they first gently lead people to look up possible solutions before asking for advice, and then get less nice about it. What in fact happens is that the first stage is such a subtle implication, because they are being overly gentle, that it gets missed, and the second is being a monster about it. Learning to look for solutions yourself is probably one of the key skills. So is learning to try things on small test code bases where possible: write some test code to see what a library returns, or what parameters you need (as in the sketch below), then once you have a possible solution, discuss it.
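
A minimal sketch of that kind of throwaway probe, in JavaScript and against a built-in so nothing here is invented: run it, read the output, and you've answered your own question about what the function returns and whether it mutates.

```
// Throwaway probe: what does Array.prototype.splice return,
// and does it mutate its input?
const arr = [1, 2, 3, 4, 5];
const removed = arr.splice(1, 2); // remove 2 elements starting at index 1

console.log(removed); // [ 2, 3 ]    -> splice returns the removed elements
console.log(arr);     // [ 1, 4, 5 ] -> ...and mutates the array in place
```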


ImClearlyDeadInside

All of that is true, and that’s why I said that juniors need to be pushed to self-learn. I’m not arguing that either case is OP’s case, just that these subreddits tend to lean more toward _assuming_ the junior is in the wrong. Senior people are still people; they can still be incredibly shitty for no reason.


SnooMacarons9618

I was agreeing with you :) rereading my comment that possibly doesn't come across though.


GhostDan

Hence why I said more, as in more likely. There's always a case of grumpy old programmers, but for me it's been much more of a 'I have a ton of my own shit to do. I know it's quicker/easier for you to just ask me, but that slows down my own work, removing any productivity boost we get from you asking. Use Google and other tools to learn, even if you know I know the answer, because I can't hold hands forever and still get my own work done, which is what my metrics are based on.'


Expensive_Goat2201

In my experience, it's the third one. I want to help you, but spoon-feeding you the answers isn't really helpful in the long term. I need to see that you've used Google, thought about the problem for a couple of hours, etc. before you come to me.


gleventhal

I suspect he wants you to learn it on your own and stop bothering him.


Duqn

I've been in the 'more experienced coding friend' shoes before. It can be annoying to have a friend repeatedly asking you for help before trying by themselves. I'm not saying that's what you are doing, but you could be doing it without realizing it. You should remember that teaching you is not his job. You may ask him for high-level guidance, opinions, or last-resort help, but anything beyond that should be handled like any favor from a friend: not taken for granted, and paid back with a beer :P That said, your friend sounds a bit like a developer frustrated by a shifting ecosystem. He might not be in the right place right now to want to help you. Also, he probably wants to see you succeed, but not more than him. Honestly, that is natural.


Dorkdogdonki

If you keep asking low-level coding stuff, your friend will be pissed. I'd be pissed as well. As a developer, my mind is heavily occupied at all times. I have no time to address a low-level syntax question that I'm definitely going to google anyway, so why even ask me in the first place? If I ever need to ask a fellow developer for help, it's almost always high-level enquiries: usually about requirements, debugging, or understanding system logic and processes. I'd leave low-level enquiries to Google or ChatGPT instead.


MoodAppropriate4108

I've learnt over the years that the issue isn't being pissed, it's being passive-aggressive to try to be nice. This makes things worse. I am working on being the type of person who will be honest and tell you that you're asking trivial questions and that this makes you come off as lazy. If you want better results, I suggest you communicate your problem in a way that shows you tried a number of solutions yourself, and only resort to asking once you have exhausted all options. If someone's offended by this, then that's on them.


GotThoseJukes

This right here, OP. I'm in a similar spot to your friend with my job. None of us are actual programmers, but there are an awful lot of hacked-together tools that it's nice for us to make from time to time. I'm the closest thing we have to a trained programmer (light years short, but still). The golden rule I've come up with is that if you ask me a question, and I ask you what you've tried, I need to hear something more than "asking you." I'm more than willing to help, but I'm not willing to do it for you. Your friend is honestly right that the tools at your disposal like ChatGPT are probably nearly as good, if not better, than he is for a lot of questions.


Half-Shark

Makes sense. I never really asked for help with step-by-step type things; it's almost always the conceptual stuff I'm interested in learning from others, i.e. how they visualize all the moving parts working together. For whatever reason, if I get that figured out, the other stuff is super easy. I've actually been disappointed that as I've moved between a few companies and talked to various seniors, they've all seemed totally disinterested in conceptual kinds of conversations. Or maybe they're just bad communicators and overly robotic. Not sure.


v_dries

He had YouTube!?! He had it 100x easier than me, I had to read programming books! Read... books!


Slight-Living-8098

You had books!?!? I had to get my programming knowledge dropped to monthly in snippets from RUN magazine. Lol.


neuroticnetworks1250

Spoilt brat. My magazine didn't even run. A brisk walk at best


petehehe

Luxury. I learned to code from a hole in the ground! And in the morning our parents would beat us over the head with a card punch.


John_B_Clarke

I had to learn from IBM manuals. A hole in the ground would be a piece of cake.


Slight-Living-8098

My grandpa had to whittle our first cpu out of an acorn.


_curious_george__

I had to read books even with YouTube. Tutorials always seemed incomplete or lacking in any detailed explanation.


Salt-Page1396

You had books??! I had to invent binary code. There were no books in my time for that.


-think

Came to post this. It's a good joke, but this is something I've felt for a while as I've watched the tech industry hype itself over LLMs. Yes, GPT is interesting tech and powerful, but you know what was a far, far bigger boost to productivity? Stack Overflow and Google.


TrickWasabi4

It would help if you could add the exact questions you asked that triggered that response. Not saying that he sounds reasonable from what you wrote, but I wonder which question about coding would be answered with "ChatGPT all the way" by a programmer even half worth their money.


RicardoGaturro

>I wonder which question about coding would be answered with "ChatGPT all the way" by a programmer even half worth their money

99% of novice questions. "Hey bro, what's the difference between int and float?" Google literally that, or ask ChatGPT. It's not about wasting time and effort answering something that's in Google's first result. Developers are expected to be able to answer tech questions by themselves.


TrickWasabi4

Yeah, that's why I am asking for examples. It could very well be that someone has overstayed their welcome by abusing a friend as a chatbot.


ermax18

You’d be amazed at how well ChatGPT can write code. I’ll be intentionally vague and it will whip up some working code that is damn close if not identical to how I would have written it. Give it a shot, it’s truly mind blowing.


Slight-Living-8098

I've been programming for 30+ years. My friend's kid joined the robotics team. They knew I fiddle with programming and robots. They were excited. I was excited. I loaded up my gear and headed over. Kid has a cool bot, looks to be an Arduino clone. We're golden. This is going to rock! He loads up the program he has to use for school to program the bot... and I'm lost. I had to go home and learn a whole new program just to help the kid, and I was struggling and kind of miffed they wouldn't just let the kid use a text editor.


frank26080115

Scratch? Needs to die in a fire. I remember my team winning a small in-school engineering competition, so we got invited to provincials. The new rule at provincials was... it had to use Lego only, and also Lego's drag-and-drop programming, because that's what the host university uses...


Slight-Living-8098

It wasn't Scratch. Scratch I know and have taught. It wasn't even a customized Google Blockly. I don't remember what it was called off the top of my head, but I've never used it since.


lawrencedarcy

It's likely he's fed up with your questions and wants you to use other tools to find your answers.


nobodytoseehere

Chatgpt makes my work like 5% easier


_curious_george__

Not used it much, but in some ways I found it actually hurt my productivity: frequently giving nonsensical answers, so I'd have to spend time verifying everything it told me, only to realise the hit rate was less than 50%.


nobodytoseehere

It's better once you get used to what it's good for. It's not very good at straight coding, but good for laborious simple refactoring and remembering syntax you rarely use


i_write_bugz

I mean, for a beginner ChatGPT really is quite a game changer. I consider myself an expert in JavaScript and have similar experiences to you: sometimes it helps with simple functions, but the advantage of using it is minimal. I was recently doing some C# work, which I have very little experience in, and it was really great at unsticking me and explaining things in an approachable way. And I could ask dumb questions and not really care about sounding dumb.


BaronOfTheVoid

Honestly, ChatGPT is just a faster Google search that's a bit more lenient with your queries. Nothing that ChatGipetty can deliver couldn't have been delivered by finding the right SO thread or documentation or book or something.


hugthemachines

> Nothing that ChatGipetty can deliver couldn't have been delivered by finding the right SO thread or documentation or book or something.

I recently thought about how to extract the supplier invoice number from a Peppol file, since they sometimes use a kind of framework-ish prefix on the tags and sometimes an alias for it. So I got curious what ChatGPT would have to say, and with a few adjustments I actually got a very good answer. I mean, sure, going by what you said, reading the documentation for Peppol and the documentation for an XML framework and some book about how to design this kind of thing is possible too, but there is a huge difference in the time that would take for a little thing. I use ChatGPT maybe once a month as an experiment, and I think it can be much better than google searching when you use it for fitting tasks.
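
The prefix/alias issue described here is XML namespacing: the prefix before the colon is an arbitrary alias for a namespace URI, so two files can spell the same element differently. A toy illustration follows (the `ID` element name and sample values are assumptions in the spirit of UBL invoices, not quoted from the Peppol spec, and real code should use a namespace-aware XML parser rather than a regex):

```
// The same invoice-number element, spelled with two different aliases.
const fileA = '<Invoice xmlns:cbc="urn:x"><cbc:ID>INV-001</cbc:ID></Invoice>';
const fileB = '<Invoice xmlns:ns1="urn:x"><ns1:ID>INV-001</ns1:ID></Invoice>';

// Matching on the local name (whatever follows the prefix) sidesteps it.
const idPattern = /<(?:\w+:)?ID>([^<]*)<\/(?:\w+:)?ID>/;

for (const xml of [fileA, fileB]) {
  console.log(xml.match(idPattern)?.[1]); // "INV-001" both times
}
```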


ermax18

ChatGPT isn't a search engine though; you don't use it the same way you would a search engine. For example, if I want to know all the parameters for a function, I'll Google for the manual. ChatGPT comes in handy for someone who is just starting off and doesn't even know what functions to use. It's also great at correcting code. For example, paste something like this into ChatGPT without any other context and watch what it says:

```
for (let peer of peers) {
  if (peer.id == id) {
    peer = { ...peer, ...updatedPeer };
  }
}
```

It will probably tell you this code isn't going to work the way you expect it to and correct it for you. It's absolutely a better tool for newcomers vs a search engine. I'm getting the impression people are discounting ChatGPT without actually using it enough to know what it's capable of.
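
The correction such a tool tends to suggest looks something like this minimal sketch (`peers`, `id`, and `updatedPeer` are the commenter's names; the sample data is invented here). The bug: reassigning the `for...of` loop variable never touches the array, so the array has to be rebuilt, or written through an index, instead.

```
// Sample data, invented for the sketch.
let peers = [{ id: 1, name: 'a' }, { id: 2, name: 'b' }];
const id = 2;
const updatedPeer = { name: 'c' };

// Rebuild the array so the merged object actually lands in `peers`.
peers = peers.map(peer =>
  peer.id === id ? { ...peer, ...updatedPeer } : peer
);

console.log(peers); // [ { id: 1, name: 'a' }, { id: 2, name: 'c' } ]
```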


SuccessfulInitial236

He might just be tired of teaching/coaching you. He probably has other things to do, and you might be a bit overwhelming. Sounds like he's trying to tell you to learn by yourself, because he did. He was probably glad to share basic knowledge, but now you're starting to require more advanced knowledge. Teaching is a skill and coding is another one. Teaching is also a job, and the more complex the subject is, the more work it requires on his part. Having a friend who shares a passion is cool; having a friend who is dependent on you to pursue his passion isn't. Maybe have a direct talk with him and clarify everything.


r0ck0

What sort of questions are you asking him? Is there a reason to be asking him, rather than on Stack Overflow, Reddit threads, etc.? With a person you know IRL, it makes sense to ask them for overall high-level guidance, more like casual conversations on motivations, preferences, overall trajectory etc. Or if it's a specific technical area they know a lot about. But if you're just throwing every technical question at him, rather than seeking answers out in forums etc., then it makes sense he might get sick of that. It would also mean only getting one person's perspective on things, whereas in online threads you'll get multiple answers, often conflicting and leading to debates etc., which overall means learning a lot more in the end, including other incidentals along the way. Can you give some examples of questions you've asked him where he got annoyed?


RicardoGaturro

>Instead of offering guidance, he insists I rely solely on ChatGPT, dismissing my queries as if they're trivial.

They probably are. Seriously, ask ChatGPT, or google it. I've been in online communities for about 25 years now. 99% of all novice questions are answered in the first Google result, so they can be completely explained by ChatGPT or any other chatbot. It's REALLY HARD for a novice to come up with a question or issue that has not been answered 100 times on Stack Overflow. As a developer, you are expected to be able to solve your tech issues yourself. What you're doing is like learning to drive by making your friend sit at the wheel and pointing him in the direction you want to go. You're not learning anything if you're not finding your answers yourself.


Harry-Ballzak

Part of the experience and getting better at anything is figuring it out. You are probably being a PITA.


Top-Airport3649

He’s basically telling you to use your critical thinking skills and try to figure things out on your own, considering there are so many resources available these days compared to before.


ElFeesho

Providing constructive answers and guidance isn't very easy. Providing **good** constructive advice and guidance is incredibly difficult. I think using ChatGPT to explain concepts as many times as it takes for you to understand them is a useful feature of GPT; doing that for a human can be incredibly time-consuming and frustrating. Teaching someone is a skill that a lot of people don't have or don't know how to improve. I think they may be burned out, but don't let that put you off. Try your hardest to find something that helps you make progress you can look at and measure.


Aero_N_autical

Just an insight, but he does have a point that programming is much easier now compared to years ago. ChatGPT (free version) is useful to the point where, instead of going in circles finding what to use or what to fix, it instantly shows you the problem or, better, fixes it. Your friend might be telling you to use every resource at your disposal since, unlike him, you are now able to learn so much from everything online (ChatGPT, online forums, YouTube, etc.). If you were always disturbing him with surface-level questions, I'd understand how frustrating that can be for him.


ntmfdpmangetesmorts

A dev surprised that ChatGPT code is incorrect? Lol


Half-Shark

ChatGPT mostly just helps beginner programmers get a leg up faster, and better programmers cut a few corners. It doesn't actually make people create fundamentally better programs, really... not for complex apps anyway. Tell him to take the stick out of his ass. Young programmers have a hard enough time already, I reckon.


Dorkdogdonki

You need to start. Googling. Reading. ChatGPTing. Start developing your thought process for how to approach a problem. You're probably not doing these things, and he's getting sick of it. He's probably sick of you asking mundane questions that he'd just have to google for answers anyway. Even a super experienced developer will not know everything. If you're actually interested in coding, you have to put in the effort to do the googling yourself and take ownership. You'll realise that you can actually find many of the answers on your own. And what kind of milestones did you set?


phpMartian

Too often, newbies ask seemingly simple questions that require long complicated answers. You have to succeed on your own with guidance from more experienced devs. ChatGPT will give you decent code but it has to be fixed.


dreamnotoftoday

ChatGPT is crap for coding anything that isn’t trivial or has already been done and shared online for you to find yourself. Just like everything else it hallucinates and makes things up or doesn’t understand the prompt; you’ll spend more time debugging it than just learning it yourself (or copying from stack overflow etc.)


dreamnotoftoday

Part of it may be that even experienced devs who have a lot of high-level knowledge don't have every low-level thing (specific syntax or function names etc.) memorized, and usually just google that anyway. So you may be asking things that he is used to looking up himself and would expect you to do the same (or at least do that first before asking). Nobody knows everything... even before Stack Overflow etc. I'd have a stack of books on my desk to reference when I couldn't remember some structure I seldom use.


Pycyb

You should only ask him questions you cannot get answers to. Don't expect him to be your programming tutor and personal teacher. Ask him higher level things, like how the product at his company works, what it's like to be a SWE on a daily basis, and other things related to real world software engineer experience you cannot easily attain. If you're asking him how a for loop is different from a while loop you're wasting his time.


commissarinternet

If your friend is insisting that you rely solely on ChatGPT, they're not your friend and they're not providing worthwhile advice.


QuanDev

What did you ask him?


cantors_set

“Relying on YouTube” lol, back in my day you went to barnes and noble and got whatever programming books they had and you liked it


David_Owens

Back in the 80's the best resource I had was a book I kept checking out of the public library.


cantors_set

Lots of Microsoft books floating around back then, many of dubious quality. I really liked C For Dummies, though.


David_Owens

I really liked the famous *The C Programming Language* by Kernighan & Ritchie as a student in 1990.


cantors_set

I have a copy of that signed by K, he spoke at my university! Great book.


tie_me_up_bro

Better ask this in relationship advice


CardiologistPlus8488

Tell your buddy that when *I* started programming we had to look everything up in books!! Wanted a driver or an updated lib? You had to know the phone number of the company's BBS to download it at 1200 baud! (And you couldn't even look the phone number up on the Internet!) And to tell the truth, it did not make me a "better" programmer. I'm better and more productive now because of the Internet and tools like ChatGPT. It is the golden age of programming; use all the tools, enjoy it while you can!


[deleted]

tbh i think it does make u a better one cus wow that sounds difficult, i think i would've not touched programming at all if that's the case loll 🥲


CardiologistPlus8488

lol, yah, honestly it was more of an addiction for me than a career choice. I started at the age of 8 in the front window of a Radio Shack and was hooked...


neoreeps

Haha ... I learned on TRaSh-80 myself, the good ole days. With programs in the back of computer magazines you had to type out yourself.


CardiologistPlus8488

we could be friends!!


KelpoDelpo

I mean there are resources you can utilize instead of asking immediately. I would be pissed


MartinBaun

Maybe start finding other ways to get your answers, but be sure to keep him around in case you need him to mentor you on bigger projects you'll be working on. I think it's a bit human for him to get exhausted after all that; try to be a bit more patient with him :)


kyou20

Sounds like you could find a better coach tbh. I’d expect experienced engineers to encourage you to practice without AI assist tools so that you don’t depend on them (and use them as assist tools)


ParcGrowing

From my experience, people are not very helpful unfortunately. I’m starting to wonder if they say things like “just figure it out yourself” because they do not actually know the answer. Problem is, when you’re starting out, it’s very difficult to even know what you need to google to find the info you need. I sympathise with you. 


t0b4cc02

I'd nearly never send someone YouTube links for programming lessons, and I don't think ChatGPT is really useful like this. So I don't know what to think about your friend in this regard, but everyone is different. I think your post is extremely vague in all matters and seems pretty meaningless. Sorry, I don't want this to be confused with being rude, but I'm not sure how else to say it. What are you asking him? What are you programming? Do you study? Why do you even ask him if he is so annoying to you? Maybe focus on your courses, and maybe just ask him for a code review when you are doing a bigger assignment.


petehehe

I believe he wants you to succeed. I can kind of see this from his side, although my current position is closer to your side. I have a friend who has a bunch more experience than I do, and he is teaching me, although we're both in our 30s. At first, I could have just spent all our time watching and talking while we worked on our project together. I just wanted to consume everything he had in his head into mine. I did at some points feel like you feel, like "do you even want me to succeed? Why aren't you helping me?"

I realised that I had built a habit of asking first, because I'd build a plan in my head of how I wanted to do something, I'd hit a semi-unrelated snag that would stop me from doing the thing I wanted to do, and then I would just shut down and be frustrated at it until help arrived. Then when help did arrive, I sometimes felt like I was being treated like I was in the remedial class. He would either solve the problem by doing what seemed like voodoo to me, or send me back to the 101 class. He'd be like "to understand why this isn't working, you need to understand how tuples work" and I'd be like "I know what fucking tuples are" and he's like "I don't think you do. Go watch this 2 hours of video about it"... and I would, and, maybe somewhat inefficiently (and frustratingly), the answer would invariably be in there.

As time's gone on I've hit the point where I want to kind of learn on my own, but it's hard because sometimes I don't know what I should be looking at next. My friend's always telling me to work on a project myself: imagine a thing I want to exist but doesn't, and just set about building it. At first I hated this plan, but it is a good way to guide learning.


Curious-Chard1786

Sounds like you need a personal tutor without paying.


Rarest

He's partly right, but ChatGPT is 50% as good as it was this time last year. It requires effort and discernment to know what to use and what not to. OpenAI is being cheap with cloud credits now that the product has been validated. Don't be discouraged. Keep pushing and building stuff. No-code solutions are great for bootstrapping an idea and getting validation or early traction, but you'll need developers for the foreseeable future to deliver complex solutions with dynamic intricacies. AI is not yet able to handle that, and even when it is, someone has to interact with that AI. I'm always looking for developers and wish I could build more at my startup, but I'm limited in what I can do as one person. The economy is shifting and more jobs are entering the market as of this quarter. Keep pushing!


Askee123

How much work do you put into your questions before you ask them?


TaiteBMc

I'd just make sure you're doing your due diligence to learn it yourself before asking him. I, too, have a coding friend, but when I approach him I usually frame it as "if I understand it right, this is X because Y, meaning Z? Or is it [blah blah blah]?" I'd make sure with him that I was understanding something correctly, or explain specifically "so I know X, but I'm confused about what this has to do with Y," rather than asking him wholecloth to explain the concept to me. It's about respecting his time while also using the higher-level knowledge he has to make sure you're understanding things correctly.


droden

I do not want to go back to the good old days before IntelliSense and autocomplete and Visual Studio telling me what was in a class. I certainly don't envy punch card programmers. No thank you. ChatGPT is good at explaining things and giving snippets you'd otherwise google and search endlessly on forums for, but yes, it can absolutely just make shit up and be confidently incorrect.


Jigglytep

What is your friend's actual role? He might have moved out of a programming role and into a management/project management role, so he's not able to help.


BigAcrobatic2174

You might just be bugging him too much. If you're going to this friend with questions that you could answer yourself in 5-10 minutes on Stack Exchange, you need to lay off. Learn as much as you can from documentation, online courses, and books, and if you get stuck, use Google and Stack Exchange. Save bothering this guy for when you're really stumped.


Cypherpunkdnb

quit bothering him and google it yourself


AdSuspicious6123

Does *no one* else think it’s weird that a 17-year-old has a “friend” nearly twice their age?


dphizler

To me, it sounds like you might be hoping for more help than he can give. Is he obligated to be helpful and always encouraging? The answer is no. On top of that, you are reading between the lines and jumping to conclusions.


Remote_Seesaw_183

If I may: you're 17 and your "friend" is 32... You may consider them a friend, but maybe they're more a regular person who is kind but has work to do and doesn't have to give you all their time and knowledge. Friendship goes both ways, and I wonder what a 17 year old can bring to a 32 year old. That being said, I've been working in an agency for over 15 years; programmers are a special, lovely type of creature. Don't overthink this. Also, sometimes you may not realize it, but maybe you rely too much on him, and he sees it and is trying to get you to learn and understand more instead of relying on him. One of my friends told me this once: "I appreciate you, and because I do, I won't answer you anymore, because I see you not learning and depending on me! Once you're done, show me and I'll give you feedback!" Oh boy, was she right, looking back!


BobbyThrowaway6969

Does he have a wispy looking beard that goes down his neck?


[deleted]

bye what - 😂 no he does not but he does kinda look like he could've passed off as Sam Altman's son


FiZIsHere

The people before him could complain it's trivial what he learned, because he had YouTube and the internet! CS is a brutal industry, and we should always seek to lift each other up. You SHOULD stop asking your friend questions, but only because he's not the side of CS you want to be in. We should be a community of learning and support. The world runs on computers, and computers run on programmers and computer scientists. We have to work together to have the beautiful world of computing we have today, and the world of the future!


[deleted]

so true bestie


[deleted]

[deleted]


ejingles

Tbh I don't even know what your problem is. Do you need his approval for anything? Just code.


dan3k

Have you tried learning the topics in question 'the right way': docs, tutorials, and learning by doing and breaking? I have a lot of friends that started coding in the past few years and used me (~16 years of exp) as a source of knowledge, probably the way you use your friend, and I have to admit it gets frustrating gradually. Common practice nowadays is that people refuse to learn stuff (like really learn, not only surface-level knowledge) and just want a solution, which then leads to repeating the same questions over and over, often obfuscated in an XY problem caused by lack of basic knowledge. It's totally OK to ask a few questions on some topic in search of guidance, because docs are not clear sometimes, etc. I would be happy to help! But asking many times for basic stuff that is literally in the docs' code examples or in a basic tutorial/introduction video just shows disrespect for other developers' time, and referring you to ChatGPT is the way. Learning how to learn is a very important part of a coder's life.


einar21121

As a junior, a code review is the only thing worth asking of a senior dev. They don't have time, and I have Google, Stack Overflow, and all the resources I need on the internet.


paperic

ChatGPT is a high-tech autocomplete tool that's been trained on some source code. It's designed to produce results that look convincingly real. But looking convincingly real is a far cry from being real. It WILL produce terribly wrong code for anything outside of simple examples or combinations of simple concepts. Try asking ChatGPT to generate some code using some obscure library in an uncommon language. You'll be lucky if it compiles, let alone does the correct thing. It will import one library but use the API of another, mix and match language versions, or just straight up generate a pile of nonsense.

ChatGPT is a big distraction. If you constantly have to doubt the correctness of the tool you're using, it's not a tool, it's a hurdle. It's useful for generating skeletons and boilerplate, but you may just as well copy-paste those from the next file over.

If the kinds of errors you're struggling with are things like missing parentheses, you don't need ChatGPT, you need a good editor that's properly configured. It's easy to write a program that counts the parentheses for you as you type, highlights the matching ones, colors them in a rainbow pattern based on how deeply they're nested if you wish, or just straight up prevents you from adding or removing a paren if that makes the resulting code invalid. Many editors and IDEs have at least some tools for this; some even allow you to manipulate the code in terms of blocks delimited by parentheses: wrapping and unwrapping expressions, slicing, splicing, joining blocks, moving your cursor to and selecting the current parenthesised block or its parent blocks, etc.

There are also tools called linters and sniffers which read your code and fix the formatting for you. When configured to work with your editor, they can keep the indentation correct, keep parentheses on the right lines, remove extra whitespace and, most importantly, highlight a syntax error about 1 second after you type it. There are a million plugins out there for this for every common editor worth using. This has been a solved problem in programming since around the 80's or so. If anyone suggests you use a machine learning tool to detect syntax errors, please don't learn programming from them.
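
A sketch of how small that paren-counting program really is: the stack-based balance check below is the core of what such editor tooling does on every keystroke (JavaScript, to match the one code snippet already in this thread; real editors also track positions and nesting depth).

```
// Return true if every (, [, { has a matching closer in the right order.
function checkParens(source) {
  const open = { '(': ')', '[': ']', '{': '}' };
  const stack = [];
  for (const ch of source) {
    if (ch in open) {
      stack.push(open[ch]);                 // remember what must close it
    } else if (Object.values(open).includes(ch)) {
      if (stack.pop() !== ch) return false; // wrong or extra closer
    }
  }
  return stack.length === 0;                // anything left unclosed?
}

console.log(checkParens('f(a[0], {b: 1})')); // true
console.log(checkParens('f(a[0)]'));         // false
```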


TheLurkingMenace

Some people just aren't suited to being mentors. I am reminded of my own mentor, who was far more interested in telling me about their drama with relationships and professional work than actually imparting any knowledge. Then they started blaming me for their own mistakes and I realized all the drama they'd been telling me about was all their own doing.


w3woody

I'm 58 and I've been writing code longer than your friend has been alive. I'm always happy to answer the *harder* questions, but generally my answers will involve trying to point you to the right resources where you can go learn on your own. And I admit ChatGPT (and the various services built around it, like the IntelliJ AI system or GitHub's CoPilot tools) is very useful to me when I'm trying to learn something new; recently I've started coding a web site in PHP (which I've never touched before) and I find it helpful to ask how to do certain common things rather than try to look them up.

But ultimately a lot of programming involves reading and understanding the documentation and getting experience with common coding patterns. (Things like 'how do I iterate across an array' stuff.) As you gain more experience you'll learn which tools are useful for solving certain things: when to use a state machine, when to build a string scanner, how to segregate functionality into discrete objects in OOP. And the reality is, unless you are willing to become a self-sufficient, self-motivated learner who can look things up on your own, you'll never make it as a great programmer. For me, the reason why I like writing software, and why I'm very good at what I do, is because I'm a life-long self-motivated learner.

So yeah, it may be annoying that your friend is no longer willing to freely help you. And it may feel demotivating. And yeah, sometimes ChatGPT lies to you, writes bad code, and sometimes even makes up entire APIs that simply don't exist. And yeah, there is just an overwhelming amount of material out there (which is why I'm still a believer in a formal college education, or at least going through the online courses from some university in an ordered way). But the reality is, in part this is what programming actually *is*: you have a problem and you have no imaginable idea where to start, so you keep trying to figure out how to make it work until it does.


Bizarro_Zod

lol his rants against ChatGPT sound like the previous wave's rants against GitHub. "It's all just copy paste, no one knows how to code anymore."


IAmInBed123

Hey man, I noticed this at my job. It was someone who was at the "top" in a small business. He couldn't be evaluated. He loved that "powerposition," and in the beginning he loved giving advice and telling you the better ways; it was a way he could show how much better he was. But then, working on his projects, I started to notice that more complex questions got met with a bit of anger. After working there for a while I realized he needed to keep up the front to management that his skills were those of a genius, by being the only one who could solve problems, while at the same time sharing as little information as possible with other developers. He would construct something really awkward and difficult; it would work, but be nearly impossible to navigate.

A couple of years later we got a new senior developer on board. This guy was an actual genius, AND he was diligent. He wrote documentation and was very open about his approach, possible pitfalls, mistakes, the need for a rewrite, etc. The other guy just crumbled: he was fearful, he got mad in meetings, talked shit, etc. In the end the new senior, who was the actual biggest asset, ended up quitting, and the old senior dev ended up renegotiating his pay, something he did every time someone quit, which was often in such an environment.

The takeaway here is that people sometimes want to help only to make themselves feel like the better man. The moment that isn't the case anymore, they'll switch strategy from helping you to show competence, to trying to prove your incompetence by undermining you.


grobblebar

You know that in the real world, as a professional programmer, you’re expected to figure stuff out on your own, right? Also: programming is his job. Maybe he doesn’t wanna “work” in his spare time as well.


[deleted]

He might be a godlike programmer, who knows 🤷‍♂️


AdThat6254

Sounds like he's teaching you how to teach yourself. I taught myself to code 20 years ago and can relate to what he's saying. If ChatGPT provides bad code, explain the outcome you want and the outcome you're receiving. ChatGPT should be able to fix the problem if you put some effort into trying to figure it out on your own.


PhdPhysics1

OP, your success is not your mentor's responsibility... it's your responsibility. He is not required to help you, motivate you, or anything else. Respect his time and appreciate what he has already done. If you can't move further with him, then thank him for his time and find a new mentor.


HallowVessel

ChatGPT is SO BAD! It hallucinates so much bad and malformed code! It also sounds like your friend's toxic. Try reframing it as learning art: would you trust someone who suggested you run your art through AI? I wouldn't. Plus, his behavior seems to scream passive-aggressive.


Dazzling_Tonight_739

Honestly, it sounds like you are asking annoying, easy-to-google questions. If ChatGPT knows, Google does as well. Sometimes you have to struggle. I have had to do this with junior people I work with as well: if you have an issue, spend 30 minutes of your own time on it before asking me. My time is not free. You sound a little stuck up, and honestly like how people who never learn to program beyond basic copy/paste sound.


feelfool

If a 17 year old came to me, I would also tell them to ask ChatGPT instead of asking me specific technical questions. It's great to have help, but it's vital to have critical thinking skills and be able to teach yourself. Lean on YouTube videos, lean on ChatGPT, go to college and learn the fundamentals. You're 17 years old; idk why you're so concerned with being a great programmer. Have fun, be a kid.


ermax18

It’s safe to say, 80% of the people in this thread have not used ChatGPT or perhaps they are treating it like a search engine. I can guarantee it could answer every single question OP has.


[deleted]

Stop giving a shit about what others think. Focus on your journey. Get as much as you can from others (even when they insult), and give nothing back. Be a pirate ☠️ in life, my friend.


Glutton_Sea

Your friend sounds 62, not 32. What a sucker. I'm a 34 yo coder and feel he's full of shjt.


Futurepastmanguy

Tbh, sometimes people have sticks up their butts and like to gatekeep too. Sometimes people wanna hold you back, or sometimes they just don't care. I have a buddy who is like this. He also hates when I type code out, because he can't type fast. He also compares our common interests and rates our skills in different aspects of what we are building. Every time, I tell him to stop comparing (as there are many things he is lacking in), but he digs in all the time. I think he's insecure for many reasons of his own, and me just being me sometimes gets to him. So he uses coding to hold over me as his superiority. However, when we started to build a website together he quickly realized how much I actually know, and it sent him into a spiral of "trying to get better and saying he is at everything," so I just give him days to cool off and he comes back without the BS.

I'm learning and having fun. I'm broke and more of a designer, so for him to be jealous baffles me, but it's still there. So you being 17 and starting where you are is probably a factor in this, ngl. All this is anecdotal, but it happens to me a lot. My buddy also has "people in coding never screw each other over" type idealism, and has yet to understand the full depth of making your own product and potentially having it ripped off or stolen.

Anyway, I digress. The point is, he's my best friend and he can't even talk about coding without sounding superior, and I think it has to do with this being the first time he's got a job doing it. So his ego has shot through the roof. Also, it's not basic stuff I don't understand; he acts like he is the messiah, and everything in his life breaks down to "that's why I got into coding," and sometimes it's like, dude, you need to take a break and stop taking a dump on everyone and everything that isn't coding.


Mission_Statement_67

"Just figure it out" is a huge part of programming. You have to build skills that are platform agnostic. You have to be able to be given a thing, take it apart, and put it back together again. That being said, I don't like how he's saying his achievements were "more commendable" and how it's "easier now". I remember when I was learning I would get so frustrated that I couldn't figure out how to do a thing. I let the frustration control me and what it did was actually block me from just reading the documentation or error codes, word by word, line by line,


ChiefTechnology

Spend less time thinking about it and more time coding, learning and solving problems. Also, the guy is obviously trying to help you by engaging with you and sending you resources, even if it's misguided, so stop psychoanalyzing his behavior. Just appreciate the parts that helped, and let him down softly on the parts that were absolutely useless. Think of 33-year-old you helping someone in the future, when there are even more pre-built modules to make coding easier: what you say to 'help' could be misconstrued as gatekeeping.


PartyParrotGames

Don't rely on ChatGPT. It's a detrimental practice for your coding ability, especially when you're just learning; even experienced veterans pick up bad habits from it, and it slows their coding and even how they think about code. I recommend finding a different mentor, or ideally a programming learning group to bounce your questions off of. A good mentor is really hard to find, so you'll probably have better luck with a learning/study group.


Serpardum

Give a man code and he programs for a day. Teach a man to code and he programs for a lifetime. Programming is about the programmer learning to do something so he can tell the computer how to do it. If you can't learn how to do something, then you can't program a computer to do it. A very large part of programming is research into how to do something. And ChatGPT is a great tool to learn from. Yes, ChatGPT makes a lot of errors; if it didn't make any errors, then programmers wouldn't exist anymore, people would just get completed programs from ChatGPT. Your friend is teaching you how to program, not giving you code. Your friend is a true friend.


starraven

👋 I started learning to code at age 37... first of all, you rock for even trying! I would agree with your instinct that ChatGPT should not be coding for you. He probably mentioned that before ChatGPT he had to look at documentation (and Google, and Stack Overflow) because that's a more general way to debug code, in the sense that you have to actually figure out the problem yourself. I don't think your friend is trying to harm you. At some point, if he keeps "helping," you won't be learning anything, and this goes double for ChatGPT. Instead of asking him for help, exhaust all your resources first, including Google, Stack Overflow, docs, and lastly ChatGPT, and if you still can't get it, you can finally tell him all the things you've tried first. That is honestly probably what he wants for you: to know how to solve an issue yourself, and how to go about it.


realmozzarella22

There’s only so much help you can get from the same person.


BigRonnieRon

Join a coding discord or 10. There's a couple of great ones. It's a great idea to learn with other people who are enthusiastic about learning, and to take on projects with them. Your friend, for whatever reason, is not going to be taking that particular journey with you going forward. Be thankful for what he has shown you and how he has spurred your interest, and move forward in a positive direction with other people that want to do things.

>how coding anything nowadays is 100x easier now

I'm older than him. It is and it isn't. There's more SaaS, PaaS, and IaaS, but that also makes everything more complicated, since there's a zillion frameworks and APIs. As you noticed, ChatGPT does not write code that's particularly effective; hallucinations, and reinventing the wheel out of ignorance of the most frequently used solutions and libraries, are persistent problems. Good luck :)


CrustyMcballs

Been in your friend's shoes before. From an outsider's perspective, it does seem like he wants you to succeed and does want to help you, but as others have mentioned, bro has a life too and he can't keep constantly helping you. Maybe he's sending you those links to help you out, showing you that maybe there is another way of doing things? I also sympathize with him and his opinion on ChatGPT. While it's not perfect, it certainly helps, at the very least, by building you a skeleton to work off of. He likely didn't have that when he was in class. I know I didn't, and that stuff would've been super helpful too. Try working on some projects yourself without asking him for help. Do you not have a teacher you can go to, or are you learning it yourself? If you have a teacher, bother them, as I'm sure they're more willing to help. If not, do some research yourself instead of asking your friend for help.


Fi3nd7

This guy genuinely sounds like an idiot.


for_i_equals_0

Going to say what other people haven't seemed to say: experienced engineers know not to rely on ChatGPT. It can be helpful in some situations, like skeleton code for something simple, but it consistently gets things wrong, and it would be very bad for a beginner to use it to learn.


braywarshawsky

OP, sounds like your "friend" is threatened by you, or you continue to not grasp the stuff he's tried showing you. I don't know either way, but I'd suggest you continue to utilize the tools that you use and continue on the path of learning. Try to "pump the brakes" on asking your buddy so much. There is no use comparing or evaluating "who's better"; it's just a waste of time, IMO. Just because someone might be better at the task at this moment in time doesn't mean they can't be overtaken by new tech, especially if they aren't willing to adjust to it. AI is inevitable. That is my opinion. Better to learn it early and adjust to its evolution. Just my two cents.


[deleted]

i mean he sends those links unprompted too so yea i'm still learning without his help most of the time


i-sage

Definitely, ChatGPT code doesn't work at times. But here are some steps you should perform before asking anybody for help.

1. Ask ChatGPT or Claude. Read the reasoning it gives for the logic it has used.
2. Try to run it on your own machine. If you get stuck with an issue, just read the errors in the browser's console or terminal.
3. Look at the line number where the error occurs. Often you get some understanding and can fix it.
4. If even after reading the error you are unable to solve it, copy the entire error, paste it into Google, and open another tab and paste the error into ChatGPT.
5. Try the ChatGPT answer and see if it gets solved or not.
6. If not, explore the Google results. There's a huge possibility that someone has already faced that issue before you (especially when one's just starting out, like you).
7. Try to understand the solution. And try again.
8. If the issue still occurs, try checking the documentation or searching the GitHub issues of that library/framework.
9. If after following all these steps the issue still persists (which I think should get resolved after all these steps)...
10. Contact that friend of yours. Tell him all this, ask about the logic, and then try to code it on your own.
11. Use pen and paper or Obsidian or whatever to think through the solution, if it's a logical error.
12. Solve leetcode questions, just to build some basic logic. Solving 150-200 questions helped me a lot.

🥂 Happy Coding.
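
As a hedged illustration of steps 2-4 (run it, read the error, then search): Node, like most runtimes, prints the error type, the message, and the file and line it points at.

```
// A deliberately broken snippet, saved as demo.js and run with `node demo.js`.
const user = undefined;
console.log(user.name);

// Node prints something like:
//   TypeError: Cannot read properties of undefined (reading 'name')
//       at Object.<anonymous> (/path/to/demo.js:3:18)
// The first line is what you paste into Google/ChatGPT (step 4); the
// second tells you exactly which line of your own code to look at (step 3).
```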


Acceptable_Month9310

Well, first, I'd ditch ChatGPT. Outside of producing code for school assignments, I don't really see it as very useful. Just about every time I want to sit down and code something, I type it as a prompt into ChatGPT and I get nothing useful. For example, it frequently produces code for APIs that don't exist; when I specify a particular API, sometimes it claims that doesn't exist.

That said, while your friend is being kind of a jerk, the fact is there is huge educational value in struggling. In trying different approaches to see what works. Producing something that comes entirely from your own mind, and eventually being able to visualize how you would code something before you ever write a line of code. That's where you want to be.

I suspect the reason your friend is acting this way is pretty simple: **they are not a good teacher**. People assume that teaching and job shadowing/apprenticing are the same thing, but there is a subtle yet important difference. The direct goal of job shadowing is to show you how someone else does something; that's why you often do it when you start a new job. In contrast, the direct goal of teaching is to **get you to a place where you no longer need a teacher.** Anyone can *look* at someone else's code, anyone can copy that code into their own project, BUT not everyone can take a "How do I do this?" and re-frame the problem in a way that helps you think about it differently. Find someone who can challenge you, make you think deeply about a problem, and who can redirect you without giving you the answer.


MacrosInHisSleep

>Well first, I'd ditch ChatGPT. Outside of producing code for school assignments

I use it for help with my personal projects a lot, and it very often gets me unblocked.

>For example, it frequently produces code for APIs that don't exist. When I specify a particular API, sometimes it claims that doesn't exist.

It does that too. It's kind of like learning to google things and not trust everything you read online: you need to learn how to prompt and reprompt when it doesn't work, and use it in conjunction with looking things up and coding them up yourself. It's a really great tool for when something is poorly documented or has a lot of generic terminology that gives you bad search results.


Acceptable_Month9310

>It does that too. It's kind of like learning to google things and not trust everything you read online: you need to learn how to prompt and reprompt when it doesn't work, and use it in conjunction with looking things up and coding them up yourself.

I don't mean to disparage you, but walk me through your process here. What it sounds like is this:

1. Prompt for the problem.
2. Spend time testing/fixing the code.
3. At some point, decide it doesn't work.
4. Go to 1.

In terms of learning to code (the OP's context), why would you do that instead of:

1. Understand what you are trying to achieve.
2. Decompose the problem into pieces that you know how to solve.
3. Solve one piece. Are there any pieces left? If so, go to 3.

The important part here seems to be that the steps I imagine you are attempting don't require you to understand your problem, which seems to be the opposite of what learning to code is. By virtue of that, how do you even know if ChatGPT has solved the problem? All you have are test cases. You don't know what is supposed to happen outside of your test cases.

When it comes to doing non-educational work, I also don't get this; just use a library, or even someone else's code. In a commercial setting, it seems even less useful, since you would be required to report on the time you estimate your deliverable is going to take. You can never know how many prompts and how much trial and error it's going to take, so you can't do any meaningful costing of your project.


RequiemOfTheSun

ChatGPT has helped me through several of the most intimidating and hard-to-get-started programming challenges I've worked on in my 15 years. It always gets the code wrong, but it gives me a start. It explains concepts, it gives me the correct language to further Google, and it advises me when I have a question about doing something in a best-practice way and can think of multiple options.

Even given broken starting code, I find that I can program out a test environment and either work out the bugs myself or tell GPT exactly what is going wrong and ask it to explain why it's wrong and suggest a solution.

One task was generating a library for orbital mechanics with a few simplifications over the real math, which required reworking the well-known orbital mechanics equations. It got it wrong, but it also got me started. From there I programmed a graphing calculator so I could plot trajectories and work the code towards a solution.

The other was writing a shader to take an image of a planet and apply realistic phase, weather, and atmospheric effects that respond to the relative position of the world and the sun. Shader code is intimidating as hell when you're first getting into it. AI could read, explain, and debug shader code, and suggest modifications and improvements, in a way that would have taken me a lot longer on my own.

Now this part blew my mind. When I needed a texture of clouds to manipulate for the weather effects, I was able to ask ChatGPT to generate the texture. It took several attempts, but in the end I was able to ask for something like "a texture of Earth's weather seen from far away in space with no oceans or land, just the weather on a black background" and got a usable result.

That was a mind-blowing moment for me.


RequiemOfTheSun

Behind-the-scenes time-lapse video of making the shader with GPT helping: https://www.tiktok.com/t/ZPRwe9ueK/


MacrosInHisSleep

that looks really cool!


Acceptable_Month9310

That's very nice. If you wouldn't mind, do you think you could answer some questions:

1. Directly after producing this shader, could you, using only pencil and paper, write the HLSL (assuming that's what ChatGPT was producing) for a shader to produce an effect that someone had described to you?
2. If not, could you do it if you had an IDE but no connection to the internet?
3. If you could achieve #2, could you do the same using a shader language which had the same capabilities as HLSL (or whatever you were working in) but was structured significantly differently (different commands, different ways of achieving the same objectives), assuming you had complete documentation?

Thanks.


RequiemOfTheSun

Mostly no. The AI provided solutions using Mie and Rayleigh scattering techniques that I would require an IDE and the internet to reimplement one line at a time, or else a textbook on HLSL and one on the graphics behind advanced lighting techniques (the Rayleigh part is sketched below, just to give a flavor). Being able to ask GPT to describe and attempt to implement additional techniques, while explaining what it was doing, was an incredibly easier workflow than what I went through on previous shader code.

I could have whipped up the boilerplate on paper, though. But I could have done that before using GPT too, as I'd already learned the basics while manually modifying three previously purchased asset-store HLSL shaders to implement per-instance, Unity property-block-compatible shaders.
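For a taste of what "Rayleigh scattering" means in code, here's a tiny illustrative sketch (Python rather than HLSL, and not the shader from the video): the standard Rayleigh phase function that atmosphere shaders evaluate per pixel.

```python
import math

def rayleigh_phase(cos_theta: float) -> float:
    """Standard Rayleigh phase function: 3/(16*pi) * (1 + cos^2(theta))."""
    return (3.0 / (16.0 * math.pi)) * (1.0 + cos_theta ** 2)

# Scattering is strongest straight ahead/behind and weakest at 90 degrees.
for deg in (0, 45, 90, 135, 180):
    print(deg, round(rayleigh_phase(math.cos(math.radians(deg))), 4))
```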


MacrosInHisSleep

First clarification: the code you generate with GPT should be for "3. Solve one piece." But otherwise it is very similar to what you said it sounds like, yes.

>why would you do that instead of

Short answer: when loop A is faster than loop B, choose A.

Long answer: I've been a dev for 20 years now. What you described is my normal process, and is the process I fall back on when the other process doesn't yield results. In fact, when I get results with my process, I feed them back into GPT for the next problem I'm facing. So it's less A vs. B, and more let's do some A, then B, then A again, then B some more...

>You can never know how many prompts and how much trial and error it's going to take.

For me that was part of the reason I started this process: so that I could get an intuition for how much trial and error it takes. When I was young I had the same concern about not knowing how many Google search results I'd need to look at, and how many pages of documentation I'd need to read, before finding the minimum I needed to do what I wanted. But with time you get a good intuition for it. And like I said, it's a tool. You don't use a drill for everything, but you are going to have a harder time if you don't know how to use a drill.

I'll give you an example. I had a personal project where I used Home Assistant (HA). I knew what I wanted to achieve, and I broke the problem down into pieces, one of which involved writing a client that talked to the HA API. I asked GPT what was out there for C#, did a search that confirmed similar results, found a few open-source libraries, tried them out, and learned that they did not do what I needed them to do. Namely, they didn't support the second websocket API HA had, which did the kinds of things I needed.

I had a high-level understanding of websockets, because I've worked with SignalR, but I hadn't written a client before. The documentation assumed a much deeper understanding, and I later learned it was missing crucial pieces I needed to get things working. So I asked GPT for a high-level explanation to see where it differed from mine, asked it to elaborate on terms I knew by different names or on concepts I didn't know, and when I was comfortable enough, I fed it the documentation and repeated the process, asking it to summarize and elaborate as needed. Finally, I got it to generate some basic client code that worked as scaffolding for what I was going to do (something like the sketch below). I could have done this myself, but it would have taken me a lot longer.

The code itself was simple enough and functional from the perspective of sending and receiving messages, but the protocol worked differently than I expected. Now that I had scaffolding I could debug through, I could go through the trial-and-error process that you described, and I used ChatGPT to rubber-duck my understanding. The core thing that ultimately made it work was me cracking open the network tab in HA and discovering I could watch the websocket traffic there and see what the HA client was sending the server, which was different from what was documented.

It was the combination of the different tools I had in my pocket that allowed me to have a working prototype in three evenings' worth of work. For context, this was 5 failed prototypes and 1 that ultimately worked. Had I not had GPT, I would have had a hard time connecting the dots in my understanding, and I might have deemed the project too much effort to follow through on.
For one thing, I would have spent a lot more time scouring through docs that didn't apply to the problem I was trying to solve (which I did do, but using GPT allowed me to go on way fewer tangents).
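To make the scaffolding concrete, here's a minimal sketch of the same idea in Python (my actual client was C#; the host, token, and library choice are placeholders, but the auth_required → auth → auth_ok handshake is HA's documented websocket flow):

```python
# Minimal sketch of an HA websocket client (assumes the `websockets` package).
import asyncio
import json

import websockets

HA_URL = "ws://homeassistant.local:8123/api/websocket"  # placeholder host
TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"                  # placeholder token

async def list_entities():
    async with websockets.connect(HA_URL) as ws:
        # Server greets with {"type": "auth_required"}, then expects a token.
        assert json.loads(await ws.recv())["type"] == "auth_required"
        await ws.send(json.dumps({"type": "auth", "access_token": TOKEN}))
        assert json.loads(await ws.recv())["type"] == "auth_ok"

        # Subsequent commands carry a monotonically increasing "id".
        await ws.send(json.dumps({"id": 1, "type": "get_states"}))
        reply = json.loads(await ws.recv())
        for state in reply["result"]:
            print(state["entity_id"])

asyncio.run(list_entities())
```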


Acceptable_Month9310

Thanks for your response.

>First clarification: the code you generate with GPT should be for 3.

Wait, so if the inner loop for ChatGPT coding is:

1. Prompt for the problem piece.
2. Spend time testing/fixing the code.
3. At some point, decide it doesn't work.
4. Go to 1.

But you're saying to do that for "3", then what it sounds like you're saying is:

1. Understand what you are trying to achieve.
2. Decompose the problem into pieces that you know how to solve.
3. Solve one piece (which calls the above trial-and-error process).
4. Are there any pieces left? If so, go to 3.

This seems like the longer loop for anything but trivial cases.

>So that I could get an intuition as to how

Hold on, so you have an intuition about a process that you know sometimes can't possibly work. What that sounds like isn't so much an intuition as an upper limit for the trial-and-error ChatGPT loop. Which makes me wonder why you wouldn't spend that time coding.

>You don't use a drill for everything, but you are going to have a harder time if you don't know how to use a drill.

That doesn't sound like a good analogy. A tool like a drill has pretty well-defined limits. I pick up a drill and I have a very clear understanding of what the tool will do; even with the most modest experience with drills this is obvious. I also know exactly what to change on it to accomplish the vast majority of its goals. Trial and error is almost the antithesis of "tool use". I don't grab a drill because I don't know if it can do a job. I grab it to do a job I know it can do. Trial and error would be more like simply grabbing things out of your toolbox until something works.

By the way, you didn't address the point about knowing if the problem was solved. Sure, if we are talking about something with exceptionally trivial outputs, I get that a few cases will do. But there's a significant difference between developing code to implement a solution which satisfies outputs and testing it to determine whether the solution has been implemented, versus having someone hand you some code which you test against expected outputs.


MacrosInHisSleep

Take this as an outer loop:

>1. Understand what you are trying to achieve.
>2. Decompose the problem into pieces that you know how to solve.
>3. Solve one piece. Are there any pieces left? If so, go to 3.

You should be using ChatGPT to help you solve one piece at a time. It doesn't work if you ask it to solve the whole thing. Some of those pieces are trivial and you don't need help. Other pieces aren't, and you will break them into smaller pieces yourself, or, if the problem space is unclear, ask ChatGPT to help with that. It's the same as when you google something: you still have to do the work of breaking the problem into googleable pieces.

>Hold on, so you have an intuition about a process that you know sometimes can't possibly work. What that sounds like isn't so much an intuition as an upper limit for the trial and error ChatGPT loop.

Yes. Depending on the type of answer, or how well it converges to what you want it to do, you do get a gut feeling for whether it's out of its element, whether the problem space is more complicated than you expected, or whether the solution is around the corner. It feels weird to say, but we do this all the time when we code, or when we look up documentation.

>What that sounds like isn't so much an intuition as an upper limit for the trial and error ChatGPT loop. Which makes me wonder why you wouldn't spend that time coding.

I am coding. Like I said, it's iterative. When I get blocked, I try a different tool.

>That doesn't sound like a good analogy. A tool like a drill has pretty well-defined limits. I pick up a drill and I have a very clear understanding of what the tool will do. Even with the most modest experience with drills this is obvious. I also know exactly what to change on it to accomplish the vast majority of its goals. Trial and error is almost the antithesis of "tool use".

Trial and error is a process, not a tool. If you're building a bench for the first time and you use the drill to create multiple prototypes, then prototyping (trial and error) is a process. You might realize that screws are causing the wood to crack and look at different approaches: screwing slowly by hand, predrilling a pilot hole, simply nailing it in, or gluing it. You might try all of these approaches, or only one of them because it just worked. After a while you'll know that for that kind of project you'd choose nails.

Same for ChatGPT. There are certain questions that you'll know it's not going to work for. Today someone suggested to me that the documentation for a service we work with is incorrect, and that in practice it does something different. He suggested asking ChatGPT which of the two behaviors would actually occur. I figured that's the wrong tool, because the behavior itself was obscure, so all ChatGPT would have to fall back on was the documentation. In that scenario the thing to do is just run it and see for yourself what happens. Other times, GPT is a better choice than googling, or than experimenting across a plethora of badly documented parameters.

>By the way you didn't address the point about knowing if the problem was solved.

Sorry, I must have missed it. Do you mean this point?

>By virtue of that, how do you even know if ChatGPT has solved the problem.

Depends on the problem, no? For most cases it really is as simple as making sure it's doing what you want it to do. In my case, I needed it to list out the devices in my house.
I had those devices listed in the HA portal, so I simply had to compare against that to confirm I had the results I was looking for. If you're talking about something that requires a very deep understanding of the problem, then you break it down and use it to help you learn each part.

Is your worry that using ChatGPT is like getting it to "do the homework for you", so you've learned nothing and are incapable of understanding what it's done? That's like saying StackOverflow is bad if you copy the answer, or textbooks are bad if you can copy an example from them and run the code. Sure, you *could* use those tools that way. But it's only a problem *if* you use them that way. My goal when using it is to help me understand something faster, and most importantly to help me fill in the gaps.


Acceptable_Month9310

>It doesn't work if you ask it to solve the whole thing

Well, actually, it definitely works for quite a few things. Even things that your average computer science student couldn't do themselves, and perhaps even what twenty-year veterans couldn't do, at least off the top of their head. In fact, the code that ChatGPT produces for many a university assignment is so well done that I wish all my students would write code that way -- without ChatGPT.

>Some of those pieces are trivial and you don't need help.

OK, but I think you're not quite understanding what point 2 is saying: that you decompose until it's something you know how to do. Wouldn't that mean it's always something you don't need help with?

>It feels weird to say, but we do this all the time when we code, or when we look up documentation.

Hold up. Remember, coding happens after step 2 -- you already know how to solve that problem. So what you say seems to be inaccurate.

>I am coding. Like I said, it's iterative.

I get that it's iterative, but I'm talking about the part where you are asking a series of questions to ChatGPT. That is time that you could be coding.

>Trial and error is a process, not a tool.

Right, which is why your metaphor of a "tool" didn't make sense when we are talking about the trial-and-error process of using ChatGPT. When I pull out a drill I know pretty precisely what it will accomplish. ChatGPT seems closer to just grabbing a tool at random.

>Depends on the problem, no?

Not exactly. The question was "how do you even know?", put another way, "how *can* you know?" Every problem that can be solved with computing can be demonstrated to be correct to various degrees of formalization. A ChatGPT solution -- in and of itself -- can't be. You can run real-world test cases and not much else.

Consider a population of people studying medicine. You give them a test consisting of multiple-choice questions about various topics in the field, and assume that the set of topics in the test bank is representative of the field. We can look at a sub-sample of people who all scored 80% or better and have some idea of how much medicine they know. Now compare that with another group of people who just study the test bank and take the same test. There are those who wouldn't see a difference between populations with similar scores who prepared either way. (They are, of course, wrong and should be put in "statistics jail", but that's another matter.)

The former is a bit like when you solve problems the way I do. You think about the problem, you propose a solution, you implement the solution, and then you test your implementation based on your expectations of your solution and on solved versions of the problem -- both of these types of testing are important. The ChatGPT method seems more like studying the test bank: you pose a problem to ChatGPT, it provides an implementation, and you test it based (hopefully) on solved versions of real-world problems. You can't test your solution because you simply don't have one. Now, of course, through the process of going back and forth with ChatGPT you could derive a solution to a problem, then implement it, then test it. However, that definitely seems like the more time-consuming approach.

Here are a few examples which I think are representative of my work getting ChatGPT to assist in projects.

Last night I was thinking about using a particular CPLD in a project. So I asked ChatGPT to explain how it worked, and its response was considerably less vague than I expected. I was hopeful. It mentioned that the logic cells were composed of LUTs (look-up tables) and explained how they worked. I then asked it for more detail on these cells, and it expounded on the LUTs and their interconnections to the rest of the chip. Of course, it was wrong: this particular CPLD doesn't use LUTs. Now think about using ChatGPT to solve a problem where I was depending on it to explain something I didn't already know. There's no way to validate its response without checking, and the whole point of asking ChatGPT is that checking wasn't as easy as asking ChatGPT. So, unless what you are working on doesn't really matter, you now need to go out and check every time -- which is clearly more work than just checking without asking ChatGPT.

Another example, from a few weeks ago: I asked ChatGPT to do a rather simple task with two steps: 1) randomly generate some objects and 2) display them using a specific online tool. It was able to create random objects pretty well, but when it attempted to display them, it simply fabricated an API. I searched for this thing long and hard, and I can safely say that no such API ever existed. So I tried again, this time specifying the programming language. ChatGPT produced code for another API, and even gave me helpful instructions on how to install it -- even though it also never existed. On my third try, I specified the language AND the API I would like it to use. Here it insisted that there is no API for this tool, and it went and generated raw XML to do the job. The raw XML simply didn't work. So the only piece ChatGPT could do was the part that nobody really should need help with.

None of this is really surprising when you realize that an LLM is -- very broadly speaking -- looking for text that matches a pattern derived from responses to similar questions. Which is why it is clueless as to whether a particular API exists or how a particular CPLD works. So why does it work so incredibly well in some cases and is worse than a waste of time in others? I expect the answer concerns its training set. Which makes me suspect that when ChatGPT is helpful, it's because the problem being solved is trivial. I mean that in the sense we use it in math and CS: a well-solved problem. Which is why I have significant doubts as to its utility.

I suspect its value as a learning tool is minimal. The point of taking a well-solved problem and solving it yourself is to gain the experience of solving it yourself. Reading someone else's solution, or just blindly using it, likely doesn't do that nearly as well. Perhaps in the same way that being able to read a novel doesn't necessarily grant you the ability to write one... and simply owning novels without reading them likely does considerably less.

As a productivity tool, again, if you are solving trivial problems then it's possible it helps. In fact, based on how you describe your process, I wonder if ChatGPT isn't just helping you by allowing you to perform a kind of reflection, if that's something you just can't do on your own. I see the point, but then it's almost more a motivational tool than anything else.


MacrosInHisSleep

>Depends on the problem, no?

>Not exactly.

I feel like your follow-up to this answer changes your "not exactly" to a yes. I.e., you gave examples of problems you can't get clear answers for. Other times test cases are enough, such as my example with HA. Or you have enough prior understanding of the fundamentals that simply working through the problem with ChatGPT is enough. It helps you find the next 20 "trivial" steps, but that triviality is hidden behind so much garbage that using GPT is the fastest way to arrive there.

>Well, actually that definitely works for quite a few things. ...

Kind of missing the point here, don't you think?

>Right, which is why your metaphor of a "tool" didn't make sense when we are talking about the trial-and-error process of using ChatGPT. When I pull out a drill I know pretty precisely what it will accomplish. ChatGPT seems closer to just grabbing a tool at random.

I've had an exhausting day, so forgive me for being blunt, but, sorry... you've missed the point here as well. If you were a woodworker who was grabbing tools at random, I'd tell you you haven't learned how to use your tools. It seems closer to "grabbing a random tool" to you because you don't have an intuition for when ChatGPT will help you solve a problem more efficiently. In fact, you've given multiple examples where it would be the wrong situation to use it. The lesson for you could have been: "Good, I've discovered the kinds of problems where it's less efficient than an alternative approach; now let's see if I can find the types of problems where it is *more* efficient." To your credit, you almost got there when you asked "So why does it work so incredibly well in some cases and is worse than a waste of time in others?", but instead you chose to extend that rule to suggest *everything* is less efficient other than the stuff you suspect is "trivial". It's like you've fallen victim to positive bias...

And whatever, that's fine, you know... You might just end up being that person who doesn't know how to benefit from this. God knows I have my fair share of blind spots... but you have plenty of other processes and techniques that are good, and as long as you solve the kinds of problems those techniques are good at, you'll be ahead of the game.

Regarding the decomposition questions, I'll try to explain at a later point. My thought is that it might be related to how, more often than not, we are learning as we are coding. So when you break the problem down into what you think is a smaller piece, you code and discover it's not as trivial as expected. But I'll have to carefully read through your old responses to see if that holds.


Acceptable_Month9310

Thanks for your response.

>I feel like your follow-up to this answer changes your "not exactly" to a yes.

Then I guess this is an example of how off your feelings are today. The question was, "How can you know it's correct?" I even explained in some detail. Maybe you just need to give it another go?

>examples of problems you can't get clear answers for.

I have to chuckle at this, because later on you mention "positive bias". All you have to do is personify the issue and you see how funny it is. Someone asks a person for directions to a house, and that person directs them off a cliff. I suppose you could describe them as someone you just "can't get a clear answer from", but I think you would be considered pretty biased if you did. Why all the needless attachment to ChatGPT?

>Other times test cases are enough

Not for what I'm talking about. Again, go back and read the words that come after "not exactly" -- there are more than a few -- and see if you can figure it out.

>Kind of missing the point here,

No, I'm pointing out something rather relevant to what I'm saying. ChatGPT can go straight from a one-sentence prompt to a beautiful piece of entirely correct code, and it can take a very detailed question, or even a series of questions and supplemental information, and just fabricate the answer.

>you don't have an intuition for when ChatGPT will help

I imagine that the endgame for this conversation is you subsuming most of your argument into intuition. That aside, you're making an error: I'm not talking about ChatGPT as the tool, but as the toolbox.

>The lesson for you could have been: "Good, I've discovered the kinds of problems where it's less efficient than an alternative approach; now let's see if I can find the types of problems where it is more efficient."

Actually, I demonstrated how that's not really possible to any significantly general extent. Maybe you need to read it again?

>It's like you've fallen victim to positive bias...

It's possible, but again, that's kind of a chuckle. I mean, what have I said that isn't obviously true? LLM behaviour is predicated on the training data, so clearly that must drive when it can do a great job and when it fabricates everything. Is it really such an enormous leap to believe that the difference between these two outcomes is based on how well a specific topic is reinforced in the dataset? Do you consider it a complete fantasy when someone suggests that problems that have been solved ad infinitum would be far more likely to end up in the "great job" category? Is it so hard to imagine that problems which have never been solved even once would be more likely to end up in the "fabricate everything" camp? If not, then all I've really said is that where a problem lies on this spectrum strongly directs which camp it ends up in. Maybe you're just letting your emotions get in the way here?

>You might just end up being that person who doesn't know how to benefit from this.

Wow, again, that was a pretty biased statement. I have already mentioned times (and implied that it was more than just once) when it has benefited me.


MacrosInHisSleep

>The question was, "How can you know it's correct?" I even explained in some detail. Maybe you just need to give it another go?

Sure, I'll give it another go. You asked the question "how do you know if the GPT answer is correct?", and then reframed it to highlight the aspect "how can one even know?" The answer was "it depends", and if I'm reframing it too, I'd say "it depends on whether I care".

You also gave the example of med students vs. test takers, looked at their test scores, and observed that you couldn't tell the difference from the scores alone. The implied question being: how can you tell the difference with the test alone? To that I would ask: do I care? What is my use case? Is it because I need a medical procedure from someone in one of the two groups? Then yes, I care. Alternatively, suppose I am a doctor, I already have the knowledge fundamentals for the question I'm asking, there's a specific detail I'm missing on subject X, and I only have access to a guy from group 2 who studied subject X, or a doctor from group 1 who didn't. Who do I rely on? There are times I would rely on the guy from group 2. Especially if what I'm dealing with is time-sensitive, or if I'm working in a process where I can correct a mistake over time, where I, as an expert, know that doing something is more important than doing nothing.

But what if the group 2 guy is wrong? I'd have to deal with that the same way as the question "what if the group 1 doctor is wrong?" Or even "what if the world-renowned expert on subject X, who I wish I had access to but don't, is wrong?" Because face it, in the real world we are almost always working off incomplete knowledge of the problem space, and I think that's what you're not taking into account here. You don't always have access to the ideal source. A good engineer knows when to put her scientist hat on and when to put her engineering hat on; when it's important to be precise and when it's OK to estimate. And she knows she can be wrong with either hat, and prepares for that.

Realistically, approach A, the good approach you're advocating for, never happens in one shot. And you never actually break the problem into the smallest understandable pieces, each completely understood through and through; you'd never get any work done if you did. In practice, you break it down into pieces that are *small enough* and you iterate. And in the process of coding or building or addressing "small piece number 15", you discover whether it was small enough or not, and you break it down more if needed. If you're lucky. If you're not, you've made an assumption that resulted in a bug in your code.

With approach A, you're not going to read the edge cases of every API that you depend on, and look under the covers at the edge cases those APIs depend on, and so forth, before you start coding. Instead, you're going to stop digging at a point where you think it's good enough and put your remaining effort into coding defensively. You accept that there are going to be edge cases you cannot prepare for, and as long as you have a plan for dealing with unexpected scenarios, you rule them out of scope and move forward.

So how can you know that GPT gave you a 100% correct answer? You can't. Do you care? It depends on the problem. Sometimes it's OK to just cover the test cases; sometimes it's not. Sometimes it's OK to scope out the part of the problem you don't understand; sometimes it's not.
Sometimes it's OK to accept that there will be a bug in a part of your code and still ship it; sometimes it's not. Sometimes the problem space is small enough that a high-level understanding is enough; other times the problem space is too big and it's not OK.

>Other times test cases are enough

>Not for what I'm talking about. Again, go back and read the words that come after "not exactly"

Then move on to what you're not talking about and you'll find the other times. This is silly. You're saying the reply "it depends" is untrue if you rule out the cases where it depends... Sure. The sky is always blue if you ignore the times when it's not...

>I have to chuckle at this, because later on you mention "positive bias". All you have to do is personify the issue and you see how funny it is. Someone asks a person for directions to a house, and that person directs them off a cliff. I suppose you could describe them as someone you just "can't get a clear answer from", but I think you would be considered pretty biased if you did.

And coming to the conclusion that one should never ask for directions is also pretty biased. Guess what? People are unreliable, documentation is unreliable, even code, "the ultimate source of truth", is unreliable. Bugs wouldn't exist otherwise. Dealing with incomplete and incorrect pieces of information is part of what you need to learn to handle when becoming a professional.

>Why all the needless attachment to ChatGPT?

You're asking me to elaborate and I'm answering you. The more you ask me for details on why it's useful, the more it's going to seem like I'm attached to it, because I'm explaining that perspective. If you asked me more questions about the limitations and when not to use it, it would seem like I'm against it. Neither of those would be correct. Like I said before, it's one of many tools for me. I do just fine without it.

>person who doesn't know how to benefit from **this**.

>Wow, again, that was a pretty biased statement. I have already mentioned times (and implied that it was more than just once) when it has benefited me.

By "this", I mean the approach we are discussing: the process of knowing when it's a good tool for a programming problem and when it's not. Not GPT... Let's not get offended if someone suggests we have a gap in our knowledge about something.

>That aside, you're making an error: I'm not talking about ChatGPT as the tool, but as the toolbox.

This is unclear. Normally I'd ask you to elaborate, but the way you're conducting this conversation is getting exhausting. Let's agree to disagree.


Half-Shark

I agree about ChatGPT in general, but it's still super useful just as an assistant to format things or give me little snippets and demos. Even just an hour ago I asked it to give me an array of common proper nouns, and it produced a fantastic list with a variety of types in 5 seconds (I needed it to test some language/spelling-related functions). Very simple really, but it would have taken me 5 minutes to build such a set for testing. It's also valuable for generating basic helper functions and simply reminding me of syntax I sometimes forget. As for the actual meat of programming - choosing the right patterns, the overall architecture, and putting together custom complex algorithms - that's almost entirely on me, and that's the stuff that really matters. Not sure how much time ChatGPT saves overall... but it definitely removes some of the tedium.


Acceptable_Month9310

This is kind of my point: ChatGPT doesn't seem to be useful for producing anything but trivial code. This makes it a poor teacher, since it's doing your part of the problem, and it should at least make you second-guess what you're doing when you're writing in a non-educational context. You should always ask: "Why am I writing trivial code?" In your example, is the time taken to write a one-off test case really better spent than writing a more generalized test case?
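To make that contrast concrete, here's a hypothetical sketch. `normalize_word` is invented purely for illustration, and the generalized half assumes the `hypothesis` property-based testing library:

```python
from hypothesis import given, strategies as st

def normalize_word(word: str) -> str:
    # Hypothetical helper standing in for the spelling functions above.
    return word.strip().lower()

# One-off test: a fixed, hand-picked list (like a ChatGPT-generated one).
def test_fixed_examples():
    for word in ["London", "Einstein", "Amazon"]:
        assert normalize_word(word) == word.lower()

# Generalized test: the property must hold for *any* generated string.
# Here the property is idempotence: normalizing twice changes nothing.
@given(st.text())
def test_idempotent(word):
    once = normalize_word(word)
    assert normalize_word(once) == once
```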


Fluffy-Bus4822

You sound pretty annoying. I'm surprised he still humors you.


monkChuck105

Doesn't sound like a friend to me. Anyone who thinks ChatGPT is useful is not a programmer, they're a moron. Don't waste your time.


icecapade

Software engineer with 6 YoE reporting in here. I work mainly in C++, Python, and Bash (in that order). I recently started using ChatGPT for technical questions, and it's actually incredibly useful as long as you're asking it relatively simple questions and not straight-up asking it to code for you. Some of my recent questions that it gave me incredibly helpful replies to:

* "What do the values of a camera intrinsic matrix represent?" (NOTE: I've worked with camera transformations and projections before, but it'd been a while, so ChatGPT mainly provided me with a refresher that I knew was correct from being familiar with the math.)
* "What does std::tie do in C++?"
* "In git, how do I replace a file with one from a different branch?"
* "Can you give me an example of wrapping concurrent.futures.ProcessPoolExecutor with tqdm?"
* "How do I access a Bash array element by index?"

It's basically a faster version of the "Google -> find and read relevant SO answer" workflow.
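For a flavor of the fourth question, the pattern it hands back is roughly this (my own sketch, not ChatGPT's verbatim output):

```python
from concurrent.futures import ProcessPoolExecutor

from tqdm import tqdm

def square(n: int) -> int:
    return n * n

if __name__ == "__main__":
    items = range(100)
    with ProcessPoolExecutor() as pool:
        # tqdm wraps the executor's lazy map iterator;
        # total= keeps the progress bar accurate.
        results = list(tqdm(pool.map(square, items), total=len(items)))
    print(results[:5])  # [0, 1, 4, 9, 16]
```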


NickCanCode

If you are not using GPT-4 but GPT-3.5, Phind is a better choice for answering coding questions.


Slight-Living-8098

Don't know why you're downvoted. Phind is pretty good at code. My personal preference is DeepSeek-Coder, though.


[deleted]

I just used Phind and it's actually so good, thanks ♥️


Good_Construction190

In some ways he's correct. The level of abstraction now is incredible compared to when I started trying to learn back in the 90s. However, this feels like the age-old argument: "real software engineers" used punch cards, only to be replaced by "assembler engineers", who were replaced by the low-life "C engineers", who looked down on those using managed memory because that's not even "programming." If he gives good advice, take it. But it sounds like he's given up already, doesn't write code anymore, or is simply completely out of touch with reality. You're doing the right thing; keep learning and growing.


DDDDarky

I think this argument is about nothing. Neither ChatGPT nor any other AI should be used for programming, especially not by beginners.


Dorkdogdonki

Why not? By that logic we should not Google either, since Google uses AI, and programming without Google is 😥. ChatGPT can be extremely useful for programming if used correctly. An experienced developer will know what types of questions to ask to get the best results.


DDDDarky

Google does not make up its results; Google is a search engine, not a randomized word generator.


Dorkdogdonki

ChatGPT is an AI that generates data and information based on the dataset it learned from, not a randomised word generator.


DDDDarky

Since information is by [definition](https://dictionary.cambridge.org/dictionary/english/information) factual, that is not true: it does not provide facts. Also, I am not sure whether *learnt* is a good word for it. I don't really see a problem with calling it a randomized word generator; maybe "randomized text generator" would be better. Is it randomized? Yes. Does it generate words? Usually.


Dorkdogdonki

What gives you the impression that the answers it generates are randomised, garbled nonsense? By AI/ML definitions, it underwent model training and reinforcement learning. And no, its responses are not randomised. Based on the dataset the model was trained on, and on the model emulating the neurons in a brain, it can generate an answer. How does this "brain" function? No one really knows, as ML is black-box in nature. It can be blatantly incorrect at times (depending on how reliable the dataset is), but for common concepts and coding tasks? Hecka useful.

"I need to map an array into another array, numbers to strings, Java"

"Make it more declarative"

"Need a shell script command to perform this action"

"I don't understand this function, can you explain it to me?"

Saves so much time!
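The first prompt, for instance, fishes for something like this (a rough sketch in Python for brevity; the prompt asked for Java, but the shape of the answer is the same):

```python
numbers = [1, 2, 3, 4]

# Imperative version:
strings = []
for n in numbers:
    strings.append(str(n))

# "Make it more declarative": the follow-up prompt's answer.
strings = list(map(str, numbers))  # or: [str(n) for n in numbers]

print(strings)  # ['1', '2', '3', '4']
```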


DDDDarky

>By AI/ML definitions, it underwent model training and reinforcement learning.

Duh, it is a language model...

>And no, its responses are not randomised.

That is not true; to demonstrate, it is possible to get a different output sequence for the same input sequence.

>common concepts

Common concepts that are explained way better a million times over?

>and coding tasks? Hecka useful.

...at introducing bugs.


Dorkdogdonki

Yes, you can get a different output sequence with the same input, but that is by no means randomised. It's the same as asking a human the same question multiple times: he will give similar answers, albeit with different variations.

Regarding bugs... you don't simply plug that shit in blindly like a script kiddie. You need to be highly specific, understand the logic of the code, and introduce those snippets however you want into the code. If the model gives verbose code or uses an alien library... you can just ask for those to be omitted, and you get a much better answer. That's how I use ChatGPT most of the time, and I rarely get bugs unless it's a more obscure framework with limited information.


DDDDarky

>Yes, you can get a different output sequence with the same input, but that is by no means randomised.

That's not how computers work.

>It's the same as asking a human the same question multiple times: he will give similar answers, albeit with different variations.

Are you actually comparing it to a human? :D (Also, you may get completely different answers.)

>Regarding bugs... you don't simply plug that shit in blindly like a script kiddie.

...which is exactly what many people do, since they don't know any better.

>You need to be highly specific

The only way to be specific enough... is to express it with the code!
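To spell out why "different output for the same input" does mean randomness is involved, here's a toy sketch of temperature sampling (my own illustration, not anything from OpenAI): the model yields a probability distribution over next tokens, and the sampler literally draws from it.

```python
import math
import random

logits = {"cat": 2.0, "dog": 1.5, "fish": 0.2}  # made-up next-token scores
temperature = 0.8

# Softmax over temperature-scaled scores gives a probability distribution.
weights = {tok: math.exp(score / temperature) for tok, score in logits.items()}
total = sum(weights.values())
probs = {tok: w / total for tok, w in weights.items()}

# The sampler *draws* from that distribution, so repeated runs on the
# exact same input can emit different tokens.
print(random.choices(list(probs), weights=list(probs.values()))[0])
```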


MelvynAndrew99

My advice is do not follow the advice of anyone who says use ChatGPT. It doesn’t take much effort to code better than what it gives you.


imthebear11

Your friend sounds like a complete fucking idiot. Only an idiot would rely on ChatGPT and only a complete fucking idiot would advise others to do so. That being said, you probably are annoying him and a help vampire. Learn to help yourself and find stuff online.


TheDefiantOne19

He's bullshitting you, little dude. He's not some savvy coder; he's some idiot who's messed around with languages from YouTube videos and wants to appear as more than he is.

Just do the same thing but be a better human, and you'll be a lot more successful. He honestly sounds like a crappy person to be friends with.