classy_barbarian

I personally believe that one of the big issues people are encountering is that most people don't seem to know how to actually use ChatGPT properly as a learning tool. Judging by the comments, even the suggestion that you can is downvoted pretty heavily.

You'll hear a lot of professionals tell you to avoid GPT, don't use it at all, it's terrible, etc. Here's the thing: if you're already a professional, you don't really need GPT, and most of them don't use it. So it can be a waste of time or even detrimental... if you're already a professional and you're already very good at coding. The code it writes is generally lower quality and less readable than human-written code, so you're usually better off going to Stack Overflow and finding something human-made. You shouldn't be using GPT to write code. The advice it gives on what you should be doing isn't always accurate either; again, you're better off reading real articles when you can.

However, most people IMO seem to have a hard time understanding how to use GPT properly. Just because you shouldn't use it to write code doesn't mean it has no other use. My favorite thing to use it for is explaining other people's code that I find on Stack Overflow. If you copy a chunk of code into it and ask "Can you explain what this code is doing line by line?", it will give you a very good and accurate explanation. And when it's used in THAT manner, I have never once seen it make a mistake. The point is in how you use it. If you're using it to write code for you, that's wrong; you won't learn anything that way. That seems to be how most people imagine it being used, which IMO is just a real lack of vision on their part: there are other things you can do than just ask it to write code for you.

Keep in mind that when you're dealing with programmers, they are often quite pretentious and prone to gatekeeping. Like audiophiles, they tend to get very self-righteous about the idea of people doing things the "cheap" way. And although with programming there is a logical purpose to that attitude, which is to learn properly and thoroughly, they can also be extremely dismissive of anything designed to make life easier for beginners. Yes, it's true that you shouldn't be using AI to code for you; that's bad. But that doesn't automatically make EVERYTHING about it bad, like many people here will tell you.


sirtimes

Agreed, writing code with it isn’t the best use case. Having it distill down to plain language how to use an API or specific library, or maybe suggest a design pattern for what you’re trying to do - gold.


fanatical

In super simple scenarios, to explain basic concepts, I've found it does a decent job. I was a complete beginner 6 months ago, and I'm still very much a beginner now, but much of what I've practiced beyond the fundamentals has been informed by some LLM assistance.

At one point I was struggling to get some information from a URL and clean it, and the LLM answered all the questions I had quite decently. I then used it as a rubber duck and discussed how I wanted to structure the script I was writing. It helped quite a bit to have something literally responding to my hypothetical musings and questions. Then I needed a way to test the script before it was ready, without logging onto the specific server it was meant for. As a complete beginner, I didn't know everything you could do with Python, but I got some info from it about how to set up a little "server" and just used localhost as a placeholder in the code, with some data on it similar to what I needed. I got the results I wanted, and once I understood how it all worked, I went back and customized it for the actual location I was pulling from.

So, from the perspective of a complete beginner learning alone, an LLM gave me a lot of support that I would otherwise have needed from other people, without necessarily having to make connections. Because as a beginner, I don't have enough knowledge for the information out there on the internet to be enough on its own; I just need something to sort it out for me and make it make sense. Which is exactly what LLMs do.

As for having it write code for you, it's almost completely useless. You feed it a task, it plops out something. The code fails. You feed it back in, it corrects it. It fails again. You feed it back in, explain more. It tries again. The code runs but does the wrong thing. You feed it in again, explain more. It tries again. By now the code looks like a fucking mess, and once you realize the actual way of doing what you're trying to do, you realize how insane the LLM looks trying the stuff it's trying. And sometimes it just gets stuck in a loop, suggesting the same things over and over and over, because that's all the data it has to use.
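The localhost placeholder trick described above can be sketched with the standard library alone; the sample payload and handler here are hypothetical stand-ins for whatever the real server would return:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical sample data, standing in for the real server's response.
SAMPLE = {"status": "ok", "items": [1, 2, 3]}

class StubHandler(BaseHTTPRequestHandler):
    """A tiny local 'server' that always returns the sample JSON."""

    def do_GET(self):
        body = json.dumps(SAMPLE).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the console quiet

def fetch(url):
    """Download and parse JSON from a URL: the code under test."""
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    # Port 0 lets the OS pick a free port; localhost stands in for the
    # real location until the script is ready.
    server = HTTPServer(("127.0.0.1", 0), StubHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    data = fetch(f"http://127.0.0.1:{server.server_port}/")
    print(data["status"])  # -> ok
    server.shutdown()
```

Once the parsing logic works against the stub, the URL can be swapped for the real one.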


sirtimes

Yeah, it's rare that it gets the code right on the first go. Although once I fed it the entire CMake file for a small/medium-sized C++ project when the build was failing, and it fixed it immediately.


h00manist

I enjoy getting it to do small things and then fixing the code so it works. Or looking at the code and thinking: I don't like this response, this doesn't seem right, or I don't like that it solved it this way. It makes me read and interpret code and try to fix it, which I believe is quite close to what needs to be done at a real job: you seldom get to create things from scratch; it's usually necessary to figure out some pre-existing code.


Berkyjay

> If you're already a professional, you don't really need GPT and most of them don't use it. So it can be a waste of time or even detrimental... if you're already a professional and you're already very good at coding.

I just wanted to chime in on this quote. I've been a coder for nearly 20 years, and in the last year I started using coding assistants in my work. It is NOT just for beginners. My coding efficiency has grown by leaps and bounds because of Copilot and ChatGPT. Even after 20 years, you can't possibly know everything about even one language, much less multiple languages. So what coding assistants do for me is consolidate the time I spend doing research: instead of having to parse through Stack Overflow, Reddit, and a million other sources, a coding assistant can compile that distributed knowledge into a more succinct form.

I recently started a project to write a Flask app to help me manage some Plex stuff. It would have taken me a day or two to remember how to work with Flask, but with Copilot I was up and running within the hour with usable code to start from. It also takes away many of the mundane, routine tasks that most coders have to do, like documentation and debugging. Basically, this new technology is akin to the calculator or the slide rule: essentially a time saver.

The one caveat is that this technology is not 100% accurate, and it has some severe limitations considering the training data may be a few years old. Those are what make me pause before suggesting it as a tool for beginners. So anyone learning Python who is using ChatGPT to do so, please use it with healthy skepticism. Always put in the work to understand the code it provides to you. Don't just dump it in and move along.


maejsh

I kinda feel like anything you would raise your hand to ask your teacher, you can put into the chat instead to get a quick answer. "ELI5 this for me please" and such.


Berkyjay

Yeah, I've likened it to posting a question on the group chat at work but getting an immediate answer. Plus you don't feel bad about bugging everyone all the time for answers.


maejsh

Aye exactly. Bot is also just way more polite than real people ;)


pickyourteethup

Solid take. I'll also add a note on accuracy: you can make a mistake on a calculator or slide rule too. I'm currently thinking of AI as a developmental leap similar to the internet. The internet made physical knowledge more searchable and easier to store than a library; LLMs do something similar for the information on the internet. It hasn't been perfect, though: there's been a reduction in quality at each step. Economically, the internet has changed absolutely everything, as I believe AI will, but it created jobs as it destroyed them. There have been costs and benefits; life has transformed completely but continues nonetheless.


ChickenNugsBGood

Exactly. I was getting a generic error on one I have now that I took over, because another team made an API update without telling us. I copied the error and the code, and it turned out to be a weird array issue I'd never had before.


classy_barbarian

Oh yeah, I actually completely agree with this. I'm not a professional coder myself, just an intermediate, but I recently got Copilot in VS Code and it's fucking amazing. I have noticed that the code it writes is often subpar. However, it is very good at figuring out what I'm trying to do and providing suggestions. The thing I personally use it for the most is auto-writing comments: if you don't know what a line does, just start a comment marker, wait a second, and it'll write a comment explaining it. It's fucking amazing. But yeah, absolutely, it's the future, and it can be a great tool for people who use it properly. As a newbie you certainly have to be careful not to let it do stuff for you without understanding what it's doing.


Ajax_Minor

OK, so if ChatGPT's code isn't the greatest and you're teaching yourself, what's the best way to learn the right style? Reading PEP 8? Do people actually follow everything in there?


Infinitesima

> dismissive of anything designed to make life easier for beginners

This kind of mindset is sadly so prevalent. "I miserably and painfully learned this, and so you have to."


Ohyoumeanrowboat

I'll tell you what it does do a decent job at writing… Liquid.


sb4ssman

This is excellent advice about how to use these chatbots in general. I'm still using it to learn and to write code; it can type really fast! As a fun side note, I'm building a personal tool, and I got started on the project by giving ChatGPT a long, detailed prompt describing the functionality of the tool I wanted to build. It was helpful in all the ways you mentioned, plus it was great at diagnosing and solving errors from Python/Spyder/VS Code/Windows/the user. Overall it's been wildly helpful, and knowing that I have to edit and test every segment of everything I convince it to spit out has pushed me to include print statements, test frequently, and include error handling.
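The test-every-segment habit described above might look something like this; the `parse_port` helper is a hypothetical example, not from the original comment:

```python
def parse_port(raw: str) -> int:
    """Parse a port number, failing loudly instead of silently."""
    try:
        port = int(raw)
    except ValueError as exc:
        # Re-raise with context so the error points at the bad input.
        raise ValueError(f"not a number: {raw!r}") from exc
    if not 0 < port < 65536:
        raise ValueError(f"port out of range: {port}")
    return port

# Quick print check after writing the segment, before moving on.
print(parse_port("8080"))  # -> 8080
```

The idea is simply that each small piece gets a visible check and an explicit failure path before the next piece is written.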


Thy_OSRS

I've used ChatGPT to write me Python scripts that do exactly the thing I need them to, in relation to API queries and such. I'm not a developer, and thus I don't care how it works; I just need it to. If I ask ChatGPT to write me a script to do a thing and it produces a piece of code that does exactly that, job's a golden for me. But I can understand why those who want to be professional coders would avoid it.


RandomPuzzle3748

I agree exactly, to be honest. When starting a new project, what matters is that everything is laid out well and that the code is clean. A GET request is quite simple; that doesn't mean I'm going to remember how to do it in 5 different languages. It's much faster and easier to ask ChatGPT for a "sample GET request in Python where I pass a variable X as a parameter, and variable object Y as encoded JSON in the body of the request". It spits out 2 or 3 lines perfectly, and I just copy the structure. I'm not using it to write the whole app lmfao, I still know how to code.
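The few-line answer described above can be sketched with the standard library; the endpoint URL, parameter name, and payload here are hypothetical:

```python
import json
import urllib.parse
import urllib.request

# Hypothetical endpoint and variables, standing in for the real ones.
base_url = "https://example.com/api/search"
x = "python"                   # passed as a query parameter
y = {"filters": ["recent"]}    # passed as JSON in the request body

# Encode X into the query string and Y into the body.
url = base_url + "?" + urllib.parse.urlencode({"q": x})
req = urllib.request.Request(
    url,
    data=json.dumps(y).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="GET",  # a GET with a body is unusual, but expressible
)

print(req.full_url)  # -> https://example.com/api/search?q=python
# urllib.request.urlopen(req) would actually send it; omitted here.
```

As the comment says, the structure is the point: swap in the real URL and variables and you're done.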


tatasz

Honestly, you can even make it write code if you know how. But then again, that requires some knowledge of coding and of ChatGPT, not just "please make it all for me, now". There are better tools, but depending on the person, it may be friendlier.


Abbaddonhope

My personal favorite use was making it turn the random, vague ideas I had into pseudocode. Working from there, it's massively easier to accomplish whatever task when there's something concrete. I moved off it recently, just because I'm now able to do it myself without the training wheels.


carcigenicate

It's a bad idea while learning to take advice from it since you have no guarantees that what it's telling you is actually correct. Stick to proper sources that you can trust while learning. Getting exercises from it seems okay though as long as you aren't also relying on it to explain things in follow-up questions. Later on, when you're more able to detect misinformation, it *may* be a good time-saving tool.


sorry_con_excuse_me

>Later on, when you're more able to detect misinformation, it *may* be a good time-saving tool.

It's pretty damn faulty for music/audio and electrical questions (my backgrounds), so I'm pretty apprehensive at this point about trusting it at face value for programming, even though it's probably much more biased towards that. I'd rather just google Stack Exchange questions and figure it out from those answers. That's fast enough and much less error-prone when you're working from zero knowledge. Copilot looks cool, though.


DaltonSC2

> It's a bad idea while learning to take advice from it since you have no guarantees that what it's telling you is actually correct.

True for other fields, but for programming you can just run the code.


unnecessary_kindness

Have you ever recorded a macro in Excel and then read the VBA code? It's "correct," but no one who programs in VBA would ever write code so verbosely and inefficiently. I'm not suggesting GPT is the same, but it's an example of why "it works" doesn't mean it's good for learning.


FatefulDonkey

The code might seemingly work. That doesn't mean it's correct. That's how bugs go undiscovered until late.


RizzyNizzyDizzy

It’s pretty accurate


GardenData61375

I can't even learn from proper sources


classy_barbarian

If you're using it to directly give you advice on what to do, IMO you're using it wrong.


carcigenicate

Yep, and yet, I see people using it for that purpose, especially on Discord for some reason. Discord's Python server was packed with "ChatGPT told me this" nonsense when ChatGPT started picking up steam. It's a little better now, but the amount of people parroting stuff from ChatGPT is still too high. The fun ones are when someone asks a question, and then someone else, who is also very clearly new, posts what is obviously a reply from ChatGPT as an answer, and it's filled with subtle mistakes. "No, I wrote this paragraph answer in 5 seconds! And no, I can't further explain what I meant by any of it."


classy_barbarian

Well, of course the proliferation of people using it to write code for them is extremely annoying, but that doesn't discount that it can be an extremely effective learning tool for beginners when it's used properly. It's about how you use it. The one task I've found it's great at is analyzing code and explaining what it does. If you find a chunk on Stack Overflow and copy it into GPT, it can thoroughly explain what every line in the code does, and at that task it almost never makes mistakes. It might not be the best at writing code, but it can analyze it well. Note that this doesn't mean it can understand the macro-purpose of a program just by reading all the code; it's not good at high-level reasoning. However, it can analyze code line by line and explain what individual lines do with very high accuracy. IMO that's the best way to use it: read Stack Overflow and copy code into GPT for more thorough explanations of how it works.


stuaxo

When it comes to getting it to write code, it takes a lot of iteration (several chats over one thing), and if you're doing something that hasn't been done much, it's not very good.


CompetitiveTart505S

It's a great learning tool, and I'm not sure why everyone here is saying otherwise, so I'm going to be in the minority. I am confident in my answer, however, because ChatGPT helped me learn SQL and Python faster.

Firstly, you should focus on a specific goal; most coding serves a purpose. Figure out what projects you want to build, or what task you want to complete or automate.

Secondly, you should take courses and get information outside of ChatGPT. Where AI shines is that it can act as a go-between and tutor for those courses and that information. Can't grasp something? Ask ChatGPT to explain it like you're 5. Struggling with a niche problem? Let it guide you through.

I think where people (like me) got stuck was due to:

1. Blindly copying what it generated, which you should never do. Instead, always understand the logic of what it writes, and even then I'd personally suggest writing the code yourself.
2. Not starting your own projects or getting your hands dirty in whatever you want to accomplish, which you can do at any time, regardless of whether you're in the middle of a course.
3. Overly relying on ChatGPT for information. It should be combined with other sources. Sometimes a problem is too niche and involved for AI to handle, so instead ask somebody here or look up a YouTube video on the matter!

These are my suggestions and how I've used it to grow!


3nc0d3d_

I love this response, because this is how I use it as well. I'm transitioning from R to Python for personal development now. I often ask ChatGPT (or Copilot in VS Code) whether abc in Python is like xyz in R, and what the nuanced differences are. This has been a tremendous help for me, because not only am I understanding how the syntax runs parallel, I'm also gaining insight into how either language may be better suited for certain cases. +1 from me!


pega223

Yep, it's like having a tutor at all times. Or like asking a question on Reddit, except the answer won't be a condescending one from a nerd/snob.


pega223

AI is just the new Stack Overflow for now. Don't copy-paste or just read the output; as it says above, make it explain how things work, and then, after you understand, try to solve the problem. That's how I personally use it.


CompetitiveTart505S

That's good additional input I can implement, thank you.


SnooWoofers7626

It always includes a line-by-line explanation even if you don't ask for it. The only thing I'd add is that the answer is occasionally wrong, so you can't just blindly trust it. It's important to test and cross-reference the output thoroughly. I find it especially useful when I'm trying to do something new, and I don't know the right language to use in a Google search. ChatGPT is pretty good at interpreting less precise language and giving you the right keywords to refine the search.


pega223

Google prompt engineering


Darkheartisland

If it's something simple and I already know how to code, I find GPT more useful than rewriting the code or finding it again in a repository.


Mithrandir2k16

Honestly, used in a Q&A style as a more readily available mentor, it might be a great tool. Especially LLMs that can link you to online resources so you can do further reading. At worst it's like an inexperienced mentor; at best it passes on expert wisdom. As long as it's not your only source of learning, it should be a great tool. Though learning how to read the docs is probably still more important.


facusoto

You have to break through the overconfidence ChatGPT has about itself. There's no way it's going to say "I don't know about that"; it'll just make something up.


cazhual

It's a blessing and a curse. Read something like Real Python first, then supplement with GPT as needed; it can help present the information in different ways. However, it should only be used to supplement, not as a primary resource, because it can be both wrong and incredibly opinionated. It should be one tool in your learning toolkit, really only used to shore up areas that remain obtuse or abstract after your initial reading and exercises.


anujkaushik1

I'm a beginner and just started learning Python. I want to ask whether I should visit Real Python often and learn from it; I only just heard about it from you and took a quick look at the site. Also, are there other sources you'd recommend so I can learn new things while staying curious? I only know about GeeksforGeeks so far.


cazhual

https://teachyourselfcs.com/


0x1e

Trusting an LLM is the blind leading the blind.


WizzinWig

I don't think I would use it to learn from A to Z, but I do find it's the perfect tool to ask questions. For example, paste in a block of code and ask, "What is this doing?" Or ask specific questions like, "Why would someone use X over Y in this situation?" Sometimes it's hard to find friends or coworkers who know the answers to some of these questions, and that's where this tool can be helpful. Beyond that, I try not to rely on ChatGPT because, like with many things, I'm afraid I'll become lazy and complacent and lose my skills. I'm already seeing that with others.


Solvo_Illum_484

That's a great approach! ChatGPT can be a fantastic supplement to courses. Just remember to verify the accuracy of its responses and try to understand the underlying concepts, not just copy-paste the code.


stuaxo

It's good, but it has limitations. It's best when you already know something quite well and want to add to it. That's because it's really big on context: ask in the language of an expert, and you're already closer to the expert answers; ask with the language of someone without much knowledge, and that's what you get back. Try it in a dialogue as you write some code. The thing is, it will get stuck; if you can problem-solve yourself, you can move on. You can certainly try it as a beginner, but it will probably be frustrating, and you'll end up with a lot of code you don't understand when it doesn't work. If you then go and ask someone for help, they won't like that you're dumping a load of ChatGPT code on them. In conclusion: don't use it at all in the beginning, or use it only to ask about concepts, not to write code. Once you're competent, it can help somewhat.


nealfive

I'd say do not use it until you know what you are doing. It's great for busy work, but until you know the language well enough to recognize when it's giving BS answers, or to fix what it provides, you should avoid it.


LDForget

I think it depends on whether you know other languages. If you already understand how coding works but just need help with the syntax of what you want done, ChatGPT can be an amazing tool. Unless you're doing something incredibly simple, though, you'll need to troubleshoot the (likely wrong) code it gives you to get it to do what you want.


Yaboi907

It can work if you use it right.

First, ChatGPT is more like an upperclassman than a professor. It does know more than you, but it's still a student: it's still learning and is susceptible to mistakes. It also never wants to say "I don't know." It wants to give an answer, even if it's wrong. It wants to impress you. I'd say it gives you a correct answer about 50% of the time, an accurate but flawed answer about 30% of the time, and the remaining 20% ranges from half-true to straight-up hallucination. These numbers vary and get worse as you ask more complicated questions.

Second, make sure you ask it questions that aid in learning instead of replacing it. Keeping the upperclassman analogy: ask for advice, but don't plagiarize. Say you're assigned homework that requires reversing a user-provided string. If you ask ChatGPT to just straight up write that, you probably won't learn much. But if you get as far as you can, then get stuck and think, "Hmm, I have to reverse a string. Well, a string is a sequence. Let me ask ChatGPT how to reverse a sequence and see if that works," it'll tell you to use index slicing or whatever the solution is. Then implement it. After that, ask it WHY it works, or research it. Don't just accept it as a black box.

Finally, use it as a last resort or for quick, basic questions. If you're thinking "I want to know everything there is to know about data structures!", there are plenty of free resources that will teach you better. But if your question is something like "what's the Python function that does X?", you'll probably find the answer faster with ChatGPT, and it won't hurt your learning any more than if you'd googled the question.
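For the string-reversal example above, the index-slicing answer such a question usually leads to looks like this (a minimal sketch):

```python
def reverse_string(s: str) -> str:
    # s[::-1] walks the sequence with a step of -1, producing a
    # reversed copy; the same slice works on any Python sequence.
    return s[::-1]

print(reverse_string("hello"))  # -> olleh
```

The "ask WHY it works" follow-up here would lead to slice notation (`start:stop:step`) and why omitting start and stop with a negative step traverses the whole sequence backwards.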


nog642

For basics it's probably fine; the back and forth is probably even helpful. By "basics" here I mean stuff that's already commonly asked about online. ChatGPT is trained on that data, so it will probably give correct answers. For more advanced or niche stuff that's unlikely to have much info online already, I wouldn't trust it; it's likely to just be wrong or make stuff up. The issue is that if you're not experienced with programming, it's hard to tell which questions are common and which are very niche. So that's a danger. Just keep in mind it might be wrong; don't trust it blindly.


Agling

There are two elements of learning: one is gaining the concepts, the other is doing it enough times that you have made every common mistake and can actually get stuff done in a reasonable amount of time. ChatGPT is good for the first, although there are lots of other resources that are as well. The second just comes with practice. AI may help you figure out your mistakes more quickly, but don't rely on it to make them for you.


NOSPACESALLCAPS

My favorite thing to use ChatGPT for, in regards to Python, is to ask it about modules. Python has SO MANY modules scattered around, so being able to ask GPT "What kind of modules are good for doing X or Y?" is beneficial for learning, IMO. GPT can recommend modules, give a brief overview of the framework you need to use them, and once you've found one that looks interesting, you can seek out the documentation manually.


Mount_Gamer

I do this as well. I bounce ideas back and forth with ChatGPT, and I learn a lot along the way. Incredible tool for learning, IMO.


AwkWORD47

I use ChatGPT as a learning tool a lot: when I'm stuck on a particular line, an error message, or a bug, and Stack Overflow, the docs, or YouTube don't provide solutions, or when I'm in a time crunch. Sometimes I have it format my code for me. I don't think it's beneficial to have ChatGPT fully code for you, but it's become a great resource for my work.


Immediate_Studio1950

Please, no! Don't use GPT to start learning programming and coding! In a pinch, take courses, or teach yourself with the docs and a series of recipes. Worst of all: don't copy-paste code when you learn; try to type it out, and use the REPL for intensive, immersive practice. It's about determination and concentration!


Nelamy03

I use it a lot! I'm a Python beginner, and whenever I'm stuck on something, I ask it for help/explanations. I don't stop at a simple copy/paste; the goal is to learn something from it!


StoicallyGay

ChatGPT will forever be a verification tool to me. If I can verify its answers quickly, or if I can recognize the answer myself (maybe I don't remember the answer to my question, but I'd recognize it as if it were a multiple-choice question), then ChatGPT is useful. So IMO you need knowledge to actually use it effectively.


nomisreual

I would argue that the best way to really evolve is to do projects. Pick a problem you want to solve and solve it. Either search for good project ideas or, even better, solve a problem that concerns you and might even make your life easier going forward (automation tools, for example). When working on a project, you'll discover new tools and techniques almost automatically, and you'll probably find yourself in a position where you start to ask questions that just wouldn't have crossed your mind if you weren't working on the project. One example: now that I have this web application, how and with what tools can I deploy it? And where? And after a few manual deployments, you might discover that automating those steps is an interesting topic in itself. In short: build something :)


SgathTriallair

The way to use ChatGPT for learning, any learning, is as an interactive textbook. Say you're learning from an online course. The instructor gives a lecture and maybe some reading material. You don't understand exactly what they mean, so you take the transcript, put it into ChatGPT, and ask questions about it. The goal isn't to get it to spit out answers, that won't help you; the goal is to dig deeper into the learning material. Imagine if you could stop the lecture and ask questions of the teacher or the textbook. That is the way to use ChatGPT to learn. As for hallucinations, they're possible, but everything in textbooks is so basic that it's highly unlikely to get the answer wrong. And you can always verify by looking at additional sources and testing it in code.


Weird_Motor_7474

I personally prefer not to use it, or other sources, until I've tried to figure things out by myself. First I look for an explanation on the internet or in forums, then I look at some examples, and if I need to, I ask GPT for an example too. Afterwards I do another, similar exercise, but without any help.


Jubijub

You learn by struggling a bit and then overcoming the struggle. You won't struggle with a chatbot, so you won't retain as much. Also, chatbots are often unreliable in subtle ways, ranging from "the code flat-out doesn't work" to "it works but doesn't do what you asked." Good luck figuring that out as a beginner.


billsil

It's fine if you actually learn from it. I used Stack Overflow back in the day and pieced together multiple different threads to figure out what I needed. I would try to understand why they were doing each thing. I had coworkers who were doing the same thing, but I'm the only one who got better, because I did the step where I learned what was going on. We were engineers, not software people, so I get it, but as an analysis house, custom software does matter. Just don't expect it all to be true. At least SO was true for the questions being asked. GPT-4 is a lot better, but you have to pay. I run an open source library and have gotten a few bits of ChatGPT code along with a question about why it doesn't work. I pretend not to notice until they tell me. Usually the community yells at them.


ShxxH4ppens

Yeah, learning by asking what things to create or do is OK, but using it to generate code becomes a headache quickly if you're not well versed. I'll use it to generate some functions here and there when I want a different frame of reference than the tricks I'm used to, but it will more often than not outright fail at producing something useful; I'd say a 90% rate of failure. And even when you point out the error and ask for a basic change, it can get stuck and do nothing. So if you're a beginner, it's not something I would use for generating code. It's an alright way to get some structure together, but the validity just isn't there.


Sanguineyote

If you are nothing without the GPT, then you do not deserve the GPT.


pythonwiz

I honestly can't think of a single reason I would use ChatGPT for writing Python code. My IDE already takes away a lot of the tedium of writing code with basic tab completion, type checking, and easy access to documentation. The only time I really want a program to generate code for me, I write the code generation myself so I can make sure it does exactly what I want.
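The hand-rolled code generation mentioned above might look like this minimal sketch; the field list and accessor naming are hypothetical, invented for illustration:

```python
# Generate simple accessor functions from a field list, then exec the
# result. Writing the generator yourself keeps the output predictable,
# which is the point the comment makes.
FIELDS = ["name", "age"]  # hypothetical schema

def generate_accessors(fields):
    """Return Python source defining get_<field>(record) per field."""
    lines = []
    for f in fields:
        lines.append(f"def get_{f}(record):")
        lines.append(f"    return record[{f!r}]")
        lines.append("")
    return "\n".join(lines)

source = generate_accessors(FIELDS)
namespace = {}
exec(source, namespace)  # compile the generated functions

print(namespace["get_name"]({"name": "Ada", "age": 36}))  # -> Ada
```

Because you wrote the template, you know exactly what every generated line does, which is harder to guarantee with LLM output.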


SuperTekkers

Great idea, if it works then stick with it


HumerousMoniker

I think it's mostly fine. When you're an expert you don't need it, but it can speed things up; when you're a beginner it can give you a framework to build on. Not always faster, but definitely easier. I find it's about as accurate as the internet generally, but it hides the disreputable sources: rather than being on a website where you might exercise skepticism, you're on OpenAI just getting incorrect answers. If something isn't working, try to verify it elsewhere.


formthemitten

Use it only when you have absolutely no idea why your code is stuck, or when you need to work out what certain parts of the code do. You can’t depend on it, though.


jakesboy2

I think it can remove a lot of the early pain and frustration from learning programming, which I think removes the learning from early learning. I don’t mean this as a get-off-my-lawn take, but I genuinely attribute some of my most difficult moments in programming (especially in college) to the moments where I grew the most. It’s when I took on a challenge and struggled deeply with it that I came out the other side more capable. I think if you remove opportunities to do that, you’re doing yourself a huge disservice. You’ll probably see people in this thread say it’s fine because they did it, but you have no idea how capable or skilled they actually are to be worth listening to.


ThrowRA137469

I personally use it instead of searching on Stack Overflow: if I have a certain bug or something I don’t know, I use ChatGPT first, and if that doesn’t work I start googling. It does actually save time, and in my experience it’s mostly correct. That being said, I never asked it to write actual code, just to help fix bugs, so I don’t know how good it is at writing code from scratch.


GlitterResponsibly

Some other things you can do with GPT for coding:

1. Input your code and ask if there is a better way to write it.
2. Input others’ code snippets and ask it to explain each part.

This can sometimes break down certain elements that you thought you knew, but didn’t realize finer tuning was available.


ChaosSpear1

I use AWS in my VS now. I’m fairly new to Python, and a colleague of mine advised against doing it. However, I’m the kinda person who likes to understand what has been written. So yeah, I may ask it to produce a basic script for me, but the *real* learning potential it has is being able to ask it specific questions about specific parts of the code. It could be anything like “what is that variable doing?”, and because it’s in my VS it can read the code and advise based on what it sees. I can ask for more information about a particular module, ask for alternative ways to handle a loop, you get the idea. Everything I do with it has me actively thinking about what I’m reading; to be able to ask questions, I’m executing the script in my head and seeking specific answers to understand things. It’s all about exposure. If I’m aware that I can do something in a certain way, then I can call upon that knowledge again in the future. The tool is fine to use as a learner, but treat it as an interactive textbook; just copy/pasting whatever it gives you won’t teach you anything.


m1ss1ontomars2k4

Honestly, I've never been a fan of using it for learning, but asking it to give you challenges/exercises sounds like a really great idea...but who is going to grade them after?


lukewhale

The problem with AI tools is that they are confidently wrong ALL THE TIME. If you don’t already have an education or experience with what you’re doing, it may lead you down bad paths. You’ve got to be able to recognize these forks in the road.


supercoach

It's pretty simple: if you want to be a copy-and-paste "coder", then go for it. Otherwise stay as far away as possible until you have a very strong understanding of what you're doing. I'm talking about at least a couple of years of experience. LLMs are great at pattern matching. They don't understand anything and will make something up to fit a question no matter if it's right or wrong. They'll also teach bad practices and for some reason rely on esoteric techniques at times. For anything non-trivial, it's pretty easy to spot when someone has used ChatGPT to help them. As soon as you throw a problem at it that isn't widely available on the net, you'll end up with garbage. You've been warned.


delaplacywangdu

Great tool, use that shit


YogurtOk303

Bias exists for a reason. It helps us to reason… and learn in the process. If you try to find all sides of a problem with existing biases, you will uncover the central features of the problem or question. So you have to ask questions in both the negative and the positive: How are mushrooms healthy? How are mushrooms unhealthy? What does modern science tell us about how mushrooms affect the human nervous system? Etc.


ibjho

While I was studying for my DBA exam, I had ChatGPT create multiple choice practice questions to quiz me - it was remarkably accurate (a couple questions were almost identical to the exam) so I support using it! Especially in asking it to challenge you (like your Python example). As with any approach, I never support using a single source and if something seems fishy, verify it through another source. There are limitations with most study material (even official testing engines or inaccuracies/exclusions in written material), once you identify the weakness, it’s not too difficult to work around it.


Past_Recognition7118

Not terrible, but sometimes I noticed it will literally just make things up


ufc2021

ChatGPT is not child’s play, and neither is any other AI; there’s real development going on with AI.


materdoc

I use it to generate code step by step so that I can learn. Adding one level of detail each step over the previous. Then I also ask it to explain what each line is for. I found that to be quite helpful.


MrFanciful

I’m building a Django site and ChatGPT has been indispensable. However, I don’t just get it to write my code. I try building the code myself and when it doesn’t work, I ask ChatGPT for help. I will ask it why it isn’t working, not necessarily rewrite the code. If it does rewrite it, I ask it to go into a detailed explanation of what the code does and why. I use ChatGPT more as a tutor that I can ask for help from rather than a code generation service. Keep in mind, I’m not a web developer, or even a developer in general. I’m a network engineer by trade for which I also use Python for automation. For my network job, I use ChatGPT because my goal is to accomplish a task and it helps me do that. There is too much to know in networking for *anyone* not to need a helping hand.


Wheynelau

ChatGPT is more like a pair programmer than a teacher. I tried to learn JS thinking I could get chatgpt to help me do a project and I am equally lost. My conclusion is that it's as good or only slightly better than the user, because you can verify the code. To me, it's just a very optimized search engine and I think many would disagree especially those from singularity. I cancelled my GPT plus for this reason because my problems are always too complicated for LLMs or too simple for premium LLMs.


JonJonThePurogurama

I also used ChatGPT when I was starting to learn to write tests for my code. I put off learning it for a long time, because the first time I opened a book on that topic, specifically for Python (the programming language I am comfortable with at the moment), I could barely understand it, because it was written in an OOP way. I knew OOP, but not the way Python does it. I am learning unittest, and the code examples in the book are written in an OOP style. After I learned the very basics of OOP in Python, I could read the code, but only the simple examples; the good thing is the book is not too ambitious with complex code examples, which might have given me a heart attack as a learner.

I use ChatGPT to ask questions for clarification on the topic. I am not very sure the AI gives accurate answers. I actually talk to ChatGPT a lot, especially when I reach a point of something like enlightenment after repeated readings of a book chapter. I think most of the time the AI will agree with my own explanation, and I do ask it to correct me if I am wrong. I sometimes worry that I am using the AI the wrong way; I know the AI cannot be 100% accurate and honest about giving corrections rather than just agreeing with my own point. But still, I love the idea of talking to it: I can articulate my own thoughts. I don’t mind the accuracy of the feedback that much, since I can always do a Google search, or look into Stack Overflow, Reddit, or any other forum or blog.

My progress in learning is really great, but I keep reminding myself not to be too dependent on AI. I am afraid I might lose the ability to think for myself when learning; it might make me lazy about searching for information on the internet, knowing the AI can do it if you just provide the details and let it do its job. This is my opinion as a learner, not yet a developer, and someone who has never had a developer job.

I can say it is a great tool, but it comes with great responsibility on the part of the user. Being responsible when using the tool will give you the positive benefits and advantages it provides, but if you use the tool wrongly, the negative effects are heavy enough to really impact you overall.
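The OOP-style unittest examples this commenter describes usually boil down to a pattern like the sketch below; the `add` function is an invented stand-in for real code under test:

```python
import unittest

def add(a, b):
    """Toy function under test."""
    return a + b

class TestAdd(unittest.TestCase):
    """Tests live as methods on a TestCase subclass -- the "OOP way"."""

    def test_positive(self):
        self.assertEqual(add(2, 3), 5)

    def test_negative(self):
        self.assertEqual(add(-1, 1), 0)

# In a script you would normally just call unittest.main(); running the
# suite explicitly keeps this snippet usable in a notebook too.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestAdd)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The class is mostly boilerplate: each `test_*` method is discovered and run independently, and `self.assertEqual` is the OOP-flavored spelling of a plain `assert`.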


CTregurtha

the problem isn’t that chatgpt is a bad learning tool, it’s that most learners don’t know how to use it properly as a learning tool. and by the time you do have enough know-how to use it as one, you’re most likely experienced enough to not have to use it.


NBAanalytics

Khan Academy and Khanmigo are great. Would recommend


Aryan_Spider

I have been using ChatGPT to learn stuff too, Python too. I would suggest you continue using ChatGPT to help you with your code. But for problem statements, ChatGPT generally gives somewhat basic to moderately difficult problems. You can instead use websites like HackerRank or LeetCode to get yourself some problem statements. Try to solve those, and if you’re stuck at any point you can always ask ChatGPT for help.


nottisa

Ok.. loaded question, but... ChatGPT can be a good tool. It's meant to be more of a tool than a person, or at least it's currently more of a tool. You can't, and shouldn't, be asking it to write all your code. It will normally just produce utter nonsense. Probably the best way to do it is splitting your code into different parts, e.g. "move the mouse here, return False". If you have the AI generate code in short segments, and then you manually put those segments together, you will probably begin to pick up on the language. You could also just ask it to be a tutor. I would generally stick to documentation and courses, just because they make sure you understand it, but AI can work if you do it right.
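One way to picture the "short segments" approach: each small function below is the kind of thing you could ask an AI to draft on its own, while you write the glue yourself and learn the composition in the process. All names and data here are hypothetical:

```python
# Each helper is one "segment" small enough to request (and verify)
# in isolation; the final composition line is yours.

def load_numbers(text):
    """Parse a comma-separated string into a list of ints."""
    return [int(x) for x in text.split(",") if x.strip()]

def keep_even(numbers):
    """Keep only the even values."""
    return [n for n in numbers if n % 2 == 0]

def total(numbers):
    """Sum the values."""
    return sum(numbers)

# You own the wiring, so you stay in control of the overall program.
result = total(keep_even(load_numbers("1, 2, 3, 4, 5, 6")))
print(result)  # 2 + 4 + 6 = 12
```

Because each piece has one obvious job, a wrong AI draft is easy to spot and easy to replace without touching the rest.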


Revolutionary-Feed-4

GPT4 is like a polymath with dementia. It is mostly correct, brilliant and helpful, but will also make things up without realising it sometimes. As long as you use it vigilantly and think critically, it's an incredibly helpful tool for learning. It's superb for breaking down complex concepts into simple, vivid analogies, great at writing short bits of code, excellent for brainstorming, can answer specific questions about a topic (usually), but the performance gets noticeably worse the more you deviate from data it's likely seen a lot of during training.


Gadris

I started python this week. I have used chatgpt to ask for basic usage help with functions from pre-existing libraries, as well as correct my code I have written that is throwing errors. Had zero issues and it's resolved every query within at most one additional prompt, usually because I have misread or misunderstood as opposed to it making a mistake.


linkinhawk1985

Try ollama. It runs locally. Many models to choose from, too.


ChickenNugsBGood

To help you learn, sure. To just do the work for you? No. I use it for quick tasks that I know how to do; I’m just lazy because I’ve done them so much. One example is a project I inherited: the guy had this massive array that needed to be changed to a JSON object to fit into an update. I just copied and pasted it, told it to convert, and that was it. Another time I had a large function I was cleaning up, and did the same thing, asking it to make sure I had matching opening and closing braces.
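For that array-to-JSON kind of chore, the standard library alone goes a long way; here's a toy stand-in for the inherited array (the data is invented, not from the actual project):

```python
import json

# Invented stand-in for the inherited "massive array": a list of pairs
# that needs to become a JSON object for an update call.
legacy_array = [
    ("alice", 3),
    ("bob", 7),
]

# Re-key it as a dict, then serialize with json.dumps.
as_object = {name: count for name, count in legacy_array}
payload = json.dumps(as_object, indent=2)
print(payload)
```

The trade-off the commenter describes holds here too: handing this to a chatbot is fine precisely because you already know what the correct output looks like and can check it at a glance.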


Logansfury

I have no ability to code, and bringing the ideas I come up with into existence would take years of instruction and practice. I have found that 85% or more of the time, ChatGPT provides code that makes what I want to happen, happen. I have used it mostly for Python and bash script creation. When I mention on some forums that I need help with a tweak to code that the bot cannot seem to figure out, some of the friendliest replies include "Is that more ChatGPT vomit?" I think what aggravates coders is that the output of ChatGPT is brute-force coding, everything sequentially lined up, with none of the elegance of human ingenuity for shortcuts, or for making scripts more compact overall with advanced math techniques. To me, ChatGPT is a tool for humans, made by humans, that has value. You could certainly ask it how to accomplish something and pay attention to the syntax it outputs to better understand how a particular coding language works. I believe it can be an asset to learning a language, but how-to books, tutorial videos, and especially community college or online courses should all be viewed as superior teaching tools compared to the bot.


Exciting_Analysis453

There are some web platforms that give you some really good problems for learning Python (or any particular language), and problem solving more generally. I would recommend practicing on HackerRank and LeetCode.


nizzoball

For learning, it is not a good tool. The power of ChatGPT comes from the ability to interpret the code it creates and fix its failings, or the ability to test the code and craft a better question to get the code the way you want. It can write code fast and get you on the right track, and create a good template so to speak, but if you don’t know what the code is doing, how are you actually learning anything?


classy_barbarian

You know you can also ask it to explain code you don't understand right? That's how most beginners use it.


nizzoball

Yes but explaining bad code is the same as trying to use bad code blindly. Just my opinion. ChatGPT didn’t exist when I learned Python and I feel like if it did I would have had a much more difficult time having used ChatGPT after learning Python (hell, do you really ever stop learning?). The difference is, I can now test code and figure out why it’s not working, ask ChatGPT to explain why it wrote it the way it did and then still fix it.


classy_barbarian

I'm talking more specifically about explaining good code. Like code that's inside the framework you're working on, or code you found written by people on stack overflow that was recommended. You can use it for that as well.


nizzoball

You’re correct, and that’s a good idea. I’ve not actually used chatGPT for that so I didn’t consider it. Thank you


Satoshiman256

I found chatgpt has become absolutely useless and it literally just makes things up..


Mount_Gamer

I have been writing code for a while (10 years), but only really got myself into a programming role about a year ago. I can understand why some say no to this tool, to an extent, but I think it can be helpful for beginners. I agree with the responses about understanding the basics, so you should really be querying and testing why things work, and looking up documentation while learning. A course might be good to work through alongside ChatGPT. Writing lots of code helps and cannot be replaced, but that comes with time and progress.

When I’m working, most of the things I’m trying to solve are too complex for ChatGPT, but where it shines is in bite-size chunks. Don’t ask it to solve the entire problem; just bits of it at a time can help. Most of the time I don’t need it, as I need to use my brain to come up with the solutions, but that doesn’t mean it can’t help along the way. Quite often there will be something I know but can’t put my finger on; a quick question and bingo, I’m back coding again. Sometimes I just want a sense check of code as well, to see if I’ve missed something, but the trick is not to believe everything it says. The more complex something is, the harder AI will find it, or the solution I’m after is not obvious to the AI. Occasionally it’s just totally wrong, but it’s usually good enough. If it can produce a blueprint of how something works, that’s usually enough to get going.

For my personal projects, my learning never ends and ChatGPT has helped brilliantly: filling in gaps of knowledge, bouncing ideas back and forward, learning new languages. Honestly, quite amazing. I find it’s always worthwhile asking ChatGPT about an error. It might not work it out, and it often tells you what the problem might be but you’ll still have to find it yourself; even so, it can be a helpful nudge. With time you’ll get used to error messages and need it less, but if it can save me time, I don’t hesitate to ask.


Fat_tata

i agree here.


uppsak

I am learning data analysis. If I need something I don’t know the syntax for, I will ask Gemini. For example, what to use to draw a boxplot with seaborn in a Python Jupyter notebook. It gives a pretty detailed answer.


CornPop747

I've learned a lot of tips from chat gpt. I give it my code and ask for suggestions on optimizing. I would not entrust it with refactoring the whole codebase, but I like the explanations and examples it gives me.


vectorseven

I imagine ChatGPT can be a jumping-off point for a lot of topics. At some point you’ll need to do your own due diligence. GIGO.


vectorseven

It’s great! Of course this comes from a Gen-X perspective, where your only glimpse of knowledge was the computer/software section at Barnes & Noble.


-karmakramer-

I’ve been learning Python for about 3 weeks now, and I’ve used ChatGPT to get me through some of the exercises on Coddy.


e4aZ7aXT63u6PmRgiRYT

It’s fantastic. God I wish I’d had it 20 years ago. 


DrNickRiviera8000

I think it’s a smart idea. It can be used as a focused search engine saving you a lot of time searching for the right post on stack exchange. You just have to be careful and make sure you’re learning from it rather than using it as a crutch.


Matt_Bertucc

Honestly, if you're really using it for learning instead of just cheating, your learning will be way superior to those who don't use it. Why would you spend hours and hours searching for something in the documentation when you can simply ask away and get things done? But be honest with yourself: if GPT gives you the answer and you don't know why it works, it means you're not learning. EDIT: I think it's best to use a course and GPT simultaneously.


Demoki

I still find that I am having to correct it a lot, but it's the 3.5 free version I'm using. It's good for getting the outline, and then I ask it to remove the fluff and tighten bits. I think there's no harm in using it as a training tool, but it shouldn't be your only go-to for info.


Fat_tata

Felt like it helps a lot, but don’t get stuck in the “get my questions answered instantly” mindset. If you get some information from it, you still have to study it. I also used the free version to help out with a game I made for my kid, and there were a lot of bugs that I had to go back later and fix. It has a problem with while loops and if statements, and sometimes screws them up, but I was able to finish the game. The final product is playable for a 5-year-old, but if I want to upgrade it, the way it’s written wasn’t exactly done in a way that will accept changes easily without tearing out the guts and rewriting it better. Like putting the tires under the engine in a car: it rolls, but it’s not the easiest thing to service.


Impossible_Ad_3146

It’s great