Submitted by Dear-Vehicle-3215 t3_126ft3c in MachineLearning
Smallpaul t1_jea4whk wrote
Reply to comment by cc-test in [D] What do you think about all this hype for ChatGPT? by Dear-Vehicle-3215
>You get a zero cost tutor that may or may not be correct about something objective, and as a student you are supposed to trust that?
No. I did not say to trust that.
Also: if you think real teachers never make mistakes, you're mistaken yourself. My kids' textbooks are full of errata. Even Donald Knuth issues corrections for his books (rarely).
>I also pay, well my company does, to access GPT-4 and it's still not that close to being a reliable tutor. I wouldn't tell my juniors to ask ChatGPT about issues they are having instead of asking me or another of the seniors or lead engineer.
Then you are asking them to waste time.
I am the "junior" on a particular language, and I wasted a bunch of time on a problem because I didn't want to bug the more experienced person every time I ran into an issue.
This actually happened twice in one day.
The first time, I wasted 30 minutes trying to interpret an extremely obscure error message, then asked my colleague, then kicked myself because I had run into the same problem six months ago.
Then I asked GPT-4, and it gave me six possible causes, which included the one I had seen before. Had I asked GPT-4 first, I would have saved myself 30 minutes and spared my colleague an interruption.
The second time, I asked GPT-4 directly. It gave me five possible causes, and by process of elimination I immediately knew which one it was. That saved me from struggling on my own before interrupting someone else.
You are teaching your juniors to be helpless instead of teaching them how to use tools appropriately.
> Code working is not equivocal to the code being written correctly or well. If you're the kind of engineer that just think "oh well it works at least, that's good enough" then you're the kind of engineer who will be replaced by AI tooling in the near future.
One of the ways you can use this tool is to ask it how to make the code more reliable, easier to read, etc.
If you use the tool appropriately, it can help with that too.
cc-test t1_jea9ejf wrote
>Then you are asking them to waste time.
Having inexperienced staff gain more knowledge about languages and tooling in the context of the codebases they work in isn't a waste of time.
Sure, for example, I'm not going to explain every function in each library or package we use; I'll point juniors towards the documentation. Equally, I'm not going to say "hey, ask ChatGPT instead of just looking at the docs," mainly because ChatGPT's knowledge is out of date and the junior would likely get outdated information.
>The first time, I wasted 30 minutes trying to interpret an extremely obscure error message, then asked my colleague, then kicked myself because I had run into the same problem six months ago.
So you weren't learning a new language or codebase; you were working with something you already knew. I don't care if anyone, regardless of seniority, uses GPT or any other LLM (or any type of model, for that matter) to solve problems. You were able to filter out the incorrect or less-than-ideal outputs and arrive at the solution that best suited the problem.
How are you supposed to do that when you have no foundation to work with?
I do care about people new to a subject using it to learn, because of the false positives the likes of ChatGPT can spew out.
Telling a junior to use ChatGPT to learn something new is just lazy mentoring, and I'd take it as a red flag to find any other senior or lead doing that.