Submitted by Suspicious-Spend-415 t3_10sb0sy in MachineLearning
No
This "it's taking our jerbs" hysteria reminds me of the fear of immigration.
Most clients don’t even know what software they need. Software engineering isn’t entirely about writing code. You have to solve the client’s problem, which the client sometimes can’t clearly articulate. Perhaps a time will come when programmers write large projects assisted by an AI in a very short time.
It will automate some parts of CS jobs, but it’s not going to replace all of them, because CS jobs aren’t just about writing the code. There’s much more to them, and the hardest part is problem analysis.
History has shown what happens at technological breaking points. Yes, you may no longer want to earn a living as a horse-carriage driver; however, there is the opportunity to become a car chauffeur.
I think your premise is wrong, it’s not about replacement, it’s about evolution.
It’s not about ‘threatening’ jobs, but improving certain aspects of it.
No. In the last layoffs, most of the positions cut were in HR. There is still huge demand for technical people; many countries have specific programs („Blabla 2030“), each of which needs 100 thousand tech people for its agenda. Let’s talk about this again in 10 years.
Actually, there are specialized LLMs for code, like Codex, so I don’t know. I think this kind of stuff will make the mundane tasks a lot faster to perform. That’s about it for now, but who knows what the future holds?
One year ago I tried information extraction from invoices with GPT-3 and it worked very well. Our team has been working on this project for years, collected data, built labelling tools, trained models, etc ... and now this AI does it without any specific training. We shivered fearing for our future.
Now that I've started using GPT-3 myself, let me tell you - it's not as easy as it looks in the playground. If you use GPT-3 you need to think about prompt design, demonstrations, prompt evaluation, data pre-processing and post-processing (is the extracted text actually present in the source?), using justifications, CoT, or self-consistency. In the end I have so much work I don't know what to do first.
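The post-processing step mentioned above (checking that the extracted text is actually present in the source) can be sketched roughly like this. This is a minimal illustration, not the commenter's actual pipeline; the field names and invoice text are hypothetical:

```python
def grounded(extraction: dict, source: str) -> dict:
    """For each extracted field, check whether its value appears
    verbatim (ignoring case and extra whitespace) in the source text.

    A value that can't be found in the source may be a hallucination
    and should be flagged for human review rather than trusted.
    """
    normalized_source = " ".join(source.split()).lower()
    return {
        field: " ".join(str(value).split()).lower() in normalized_source
        for field, value in extraction.items()
    }

# Hypothetical extraction returned by the model for a hypothetical invoice:
fields = {"invoice_number": "INV-1042", "total": "199.00", "vendor": "Acme Corp"}
doc = "Invoice INV-1042 from Acme Corp. Amount due: 199.00 EUR."
flags = grounded(fields, doc)  # {"invoice_number": True, "total": True, "vendor": True}
```

A real pipeline would normalize more aggressively (dates, currency formats, OCR artifacts), but even a literal-substring check like this catches many fabricated values.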
AI will assume a number of tasks and open up other tasks around it so the total amount of work will remain the same - which is as much as people can handle. Software is a weird field - it has been cannibalising itself for decades and decades and yet developers are growing in numbers and compensation. That is a testament to our infinite desire for more.
> It’s not about ‘threatening’ jobs, but improving certain aspects of it.
Jobs don't just exist by themselves, it's the people who demand products and services causing jobs to exist. In other words, they are a function of human needs and desires.
The question is - can automation satiate all our desires? I don't think so. We will invent new jobs and tasks because we will desire things automation can't provide yet. In a contest between human entitlement and AI advancement I think entitlement will always win - we will think everything we have is just basic stuff and want something more. If you asked people from 300 years ago what they think about our lifestyles they would think we already reached singularity, but we know we haven't because we feel already entitled to what we have.
>Since ChatGPT, many articles have been popping up about how AI will replace software engineers and developers. (Maybe not in the near future, but eventually)
Most of them are clickbait.
No, for a couple of reasons, the most important being that software is almost never developed in isolation. You need to interact with other libraries, other engineers, and clients. Software engineering is not about writing the most exotic tree data structure in the least amount of time. I can look that up on Stack Overflow, and I'd argue that (currently) this is faster than writing a whole prompt about it.
Certainly, one hundred per cent agree, if I understand you correctly.
Don't know about human entitlement, but from a simple time/energy-limitation perspective:
I'm sure time and energy is some of the reasons.
CKtalon t1_j70ht51 wrote
Basically, job scopes will change due to the boost in efficiency.
The mediocre of any field will potentially be kicked out or priced out by AI.
More domain experts will be needed to vet the AI output and guide the improvement of AI (using RLHF) for probably decades to come. Generalists will likely be replaced by AI with time.