Submitted by apyrexvision t3_z9z1y4 in singularity

For context I'm a software engineer with 3-4 years of professional experience. Self-taught/bootcamp baby.

Based on the speed of development within generative AI, it seems likely to occur within 5-10 years, assuming a very mild takeoff. This somewhat alarms my lizard brain because it's my bread and butter currently. Should I study machine learning? Or is that futile?

But to give an opposing argument, I can see the advancements as augmentation that will make me more effective for 10-15 years. From my point of view it'll be like being a manager of 5 or so developers which I'll maintain, support, and utilize, which adds immense value to my expertise.

Which do you believe is likely to occur?

59

Comments

saccharineboi t1_iyjyfhg wrote

Call your representative and pressure them on UBI

49

recidivistic_shitped t1_iymid7e wrote

Universal Basic Intelligence. Democratized AI requires hardware; every man an A100 GPU Cluster.

2

[deleted] t1_iyk28ol wrote

[deleted]

−8

Sashinii t1_iyk45ux wrote

Basic income should be implemented regardless of automation because it would end poverty, give everyone more freedom, and it would be excellent for the economy.

23

PrivateLudo t1_iyk6xh0 wrote

UBI would create massive inflation. It will happen when people lose their jobs in droves, but not right now. Printing more money out of thin air is not good for the economy. UBI would just become an excuse for people to sell their houses at higher prices, and the price of commodities would also increase.

I am pro-UBI, but I think it's far too early to implement one. I think UBI would only start to work when AI is more prevalent.

Unless you're saying UBI should only be given to people who lost their jobs? Then I would agree.

3

poobearcatbomber t1_iykfakj wrote

Not if you increase corporate taxes to historical norms and actually collect them at all.

7

PrivateLudo t1_iykn8h0 wrote

Well, you would need to achieve that first. But healthcare is still privatized in the US, and nothing is happening with that either. I don't see the government doing anything about UBI until they hit a point of desperation where a massive number of people lose their jobs.

1

Diaming787 t1_iyku6ji wrote

If all UBI does is redistribute money from taxes, especially robot taxes in the future, it will have zero effect on inflation.

Redistributing money is not printing money.

2

[deleted] t1_iyp0ffx wrote

[deleted]

1

PrivateLudo t1_iyp9a11 wrote

Easier said than done. It's extremely difficult to tax the rich because they can split their money across offshore tax havens and have a team of top lawyers to back them up. Businesses can simply move to another country to evade tax instead of being located in the US.

1

ZaxLofful t1_iyku5yg wrote

This is why the other commenter is being downvoted… in a nutshell.

1

Ok_Homework9290 t1_iykwbvn wrote

You think so? I think the people downvoting me are doing so because they actually believe we should be ringing up our representatives right now demanding UBI because of upcoming automation, not simply because UBI should be a human right regardless of automation (which I agree with, btw).

1

ZaxLofful t1_iykwmj9 wrote

I mean, that's why I downvoted you, and that's what the other commenter said they did…

It doesn't actually matter what causes people to realize that we could have a UBI, because once you realize it, it's all the same.

1

apyrexvision OP t1_iykvt1d wrote

I honestly like the idea of basic income, but I can't help but feel it would stifle upward mobility. Unfortunately, that may be a thing of the past until we move past the need for currency.

1

FDP_666 t1_iyjlxfa wrote

Mechanization (machines) replaced muscles. As a consequence, agriculture (for example) now requires a fraction of the workers it once needed while productivity skyrocketed.

Artificial intelligence will replace brains and do the same thing mechanization did to agriculture, but to every other job. Some will hold out for longer than others, but long term it's all gone.

Your expertise will have the same value against AI that your strength has against a tractor.

42

AvgAIbot t1_iykj2u0 wrote

What do you predict will happen at 25%, 50%, and 100% AI job replacement rates? Your prediction of when these rates will be reached would also be interesting.

6

FDP_666 t1_iymm1yz wrote

I don't know. Or rather, there is only one thing that comes to mind: even minor political decisions can take months or even years to be discussed and then implemented; I don't expect AGIs to be particularly concerned with their compatibility with our current political apparatus, and that probably means there will be some sort of chaos. Hence the "I don't know".

I also don't know how fast we will get there, but it seems obvious that multimodality will solve a lot of the common pitfalls of current LLMs. Texts tell a bunch of narratives and there is a bunch of unrealistic stuff there that looks reasonable if your only source of information is text; but I don't see how most of this unrealistic stuff would survive a model trained on texts, images, videos, audio (etc?) as any idea the model would come up with would lie at the intersection of what is both textually and visually possible.

Like, it's easier to mess up the act of finger-counting if your only concept of doing it comes from text than when you have an image of hands that comes with the text. It makes sense that it would constrain the space of possible mistakes, what do you think? Then, there would also need to be some sort of continuous learning to truly have an AGI and not a snapshot of it, maybe? I read somewhere that it's being solved, so it doesn't seem to be decades away.

And scale is of course important, but the real investments haven't begun, yet; it seems that the hundreds of millions, or even billions of dollars that could be used to train AIs will only be unlocked when some of the previous stuff makes an AI that can be used as a virtual worker. By then, data produced by AIs might be good enough to make more training data, and there will be more use cases anyway so people will feed more data to OpenAI (and others) so the amount of data might not be a constraint anymore.

To me, it looks like we'll get to the knee of the curve in 5 to 10 years but my prediction is as good as any other, so yeah, I don't know.

1

apyrexvision OP t1_iykvym5 wrote

That's real, it's just the transitional period that concerns me.

2

visarga t1_iyle6xx wrote

Don't generalise from agriculture to coding. If the tractor misses the row, it's no big deal. If the AI fails the coding task, maybe things start falling apart.

1

Primo2000 t1_iyjdrc7 wrote

I started my career in IT as an on-prem engineer; it took whole departments to run the infrastructure for a given project. Fast forward 10 years, and now I work as a senior Azure DevOps engineer, and it's possible for me alone to create and run the whole infrastructure for a given project. That doesn't mean most engineers are out of work now; on the contrary, I can choose from hundreds of projects which simply wouldn't have been economically feasible in the past.

But yeah, guys who failed to change specialty and are stuck with old specializations are starting to have problems finding jobs now.

33

phriot t1_iyjjkfk wrote

In the timeframe you gave, I expect that you will increasingly utilize AI to augment your own abilities. If you had said 5 years, I would have replied that your career will almost certainly be almost the same as it is today. If you said 20-25 years, I would have said that that is far enough out to be really fuzzy.

Edit: I think most careers really have two paths over the next two to three decades. They aren't necessarily mutually exclusive, but they may be:

  1. Learn as much as you can about using AI to your own benefit.
  2. Earn as much money as possible, so that you are more insulated from the effects of AI on your career.

20

AvgAIbot t1_iykkfub wrote

I came to your #2 conclusion. Anyone who doesn't have money saved up and doesn't want to be stuck as a slave in 10-20+ years should try to start a business or side hustle now.

I would then invest heavily into land and real estate. Apartments, duplexes, venues.

I'm 28 now, and thinking about what it's gonna be like in 20 years really makes me want to make big money within the next few years while it's still somewhat possible.

11

NefariousNaz t1_iylkj0o wrote

What about stocks, gold, or cryptocurrency?

4

AvgAIbot t1_iynfix0 wrote

Stocks and gold maybe, definitely not crypto imho. I'm personally going to stick with land and real estate over stocks and gold. The reason being land is finite, and I think eventually it will be more valuable than stocks/gold (if it isn't already).

2

apyrexvision OP t1_iykwcm2 wrote

Facts. I'm working every day after work building a SaaS.

3

AvgAIbot t1_iynfbbg wrote

Get it! Don't forget to implement the latest AI tech if applicable in any of your features. Stay ahead of the curve 👊

1

cootiecatchers t1_iylso8c wrote

Learn and trade the markets, all in a year's time off; I know some former devs that retired just off trading stocks/crypto during the pandemic...

1

DyingShell t1_iyn552q wrote

why, we'll have UBI.

1

AvgAIbot t1_iyn75xh wrote

You trust governments to provide UBI? At least in the US, it will be hard to get that passed anytime soon. Plus I'd rather have a nice house, nice vacations, nice everything, and security for my family. I don't think that will happen if I sit on my ass and only collect $2,000/mo UBI.

1

DyingShell t1_iyn8i8w wrote

The vast majority, including you, are already completely reliant on the government providing you with affordable housing, food, plumbing, and electricity, among other things like good roads, police, military, hospitals, and so forth. Individuals don't have power outside a collective, generally speaking.

Also, you think the billions and trillions of people that come after you are going to live in suffering and pain? That it isn't possible to have a utopia or be able to live a good life without work? I think life quality will INCREASE after the era of work.

You can have everything you desire in virtual worlds; it only takes energy to produce.

2

AvgAIbot t1_iynbrf7 wrote

Yeah, I'm not talking about in 20-50+ years when there will maybe be a utopia. Even then, I don't have much faith in a "utopia", due to human nature, let alone all the other problems.

I'm talking about the next 5-20 years and what that realistically looks like. The wealth divide will further increase, and I personally don't want to be stuck on the lower end. Like when AI replaces 50% of jobs within 10 years but there's no UBI or utopia.

2

DyingShell t1_iyndb2w wrote

In that scenario you are either part of the poor or the elite. Do you have at least $10 million? If not, you're probably going into the poor class.

1

AvgAIbot t1_iynfzo9 wrote

That's why I'm trying to become a millionaire now 🤣

But honestly, my dream is to have 2-3 acres of land with self-sustaining features: solar panels, wind turbines (smaller than usual), a water well with a filtration system, some chickens/goats, etc. I think all that should cost only about $2 million or less, in Texas at least.

1

DyingShell t1_iyngjsc wrote

All of that costs money to maintain and will eventually have to be replaced too; you are going to run out of money no matter what you do if you don't have an income whatsoever. Also, I don't see how that life is better than living in an apartment with a nice computer setup, or spending most of your time in virtual worlds with AI...

Being a millionaire (<$10 mil) won't make you part of the elite; that barrier is far higher, probably in the hundreds of millions, so if you don't make it there you are just as poor as the rest of us. Also, all of your property and so forth can be taken away from you; the only thing you can rely on is the government protecting what's yours.

3

AvgAIbot t1_iynjshv wrote

I guess it's just personal preference. I'd prefer to actually try to accumulate wealth, but I guess some people don't care and are fine with being in a small apartment playing VR for hours.

2

DyingShell t1_iynjztg wrote

Their experience is no less than yours; it's all sensory input to the brain, just in different ways. One has limitless potential, and the other is expensive and limited by the physical world.

1

AvgAIbot t1_iynoxzi wrote

I think you're confusing the reality of the next 5-20 years with 20+ years. This whole conversation I've been talking about the next 5-20 years. I doubt we'll have full-dive VR in less than 20 years, but I could be wrong.

1

Shamwowz21 t1_iymd6gj wrote

10-25 years 'may be fuzzy', yet there's only two decades until we shouldn't even know what's possible anymore? You'd figure the very thing powering this could stand on its own without needing millions of coders. It IS the millions of coders.

1

apyrexvision OP t1_iyjcmf4 wrote

11

ChronoPsyche t1_iyjk9ld wrote

Software engineer too. I tried to use ChatGPT to create a web application using a style library I've never used before. Took the code it gave me, plugged it in, and was given dozens of errors. Turns out ChatGPT was using a deprecated version of the library. I then had to go in and manually alter the code to match with the current syntax. By the time I was done doing that, I had basically learned the library from scratch the same as I would have without ChatGPT.
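
The anecdote doesn't name the style library, but the failure mode is easy to reproduce even with Python's own standard library (used here purely as a stand-in): code written against an older API surface simply stops importing.

```python
# The failure mode described, reproduced with the standard library as a
# stand-in for the unnamed style library: a model trained on older code
# may emit imports that have since been removed.

# Old pattern, deprecated since Python 3.3 and removed in 3.10
# (raises ImportError on current versions):
# from collections import Mapping

# Current equivalent:
from collections.abc import Mapping

print(isinstance({"key": 1}, Mapping))  # True
```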

Our jobs are safe. While they surely will become more advanced, you always need someone who actually understands the code and understands the business requirements, at the end of the day. AI is just a tool, and as the tools get more advanced, the requirements will become more complex and everything will balance out.

Just make sure you are always learning the latest tech and keeping on top of things. Even before large language models, software engineering has always been a job that requires life-long learning. The people programming with punch cards probably thought their jobs were gone. Those who kept on top of things still retained the necessary skills to grow with the technology.

You don't need to learn machine learning unless you want to create the tools yourself, but you do need to know how to use them.

17

AsuhoChinami t1_iylcqic wrote

"AI can't do this thing perfectly in 2022 (which nobody expected it to be able to do perfectly in 2022 anyway) so it will never be good at that thing ever." I don't know if that's very good logic.

8

ChronoPsyche t1_iyldr7d wrote

True, it's a good thing that wasn't what I was arguing. I was pretty clearly talking about pre-singularity AI in the near/medium term. Once/if the singularity happens, all assumptions and predictions go out the window. There are just too many unknown variables to even begin to fathom what the status of our jobs will be, much less if the concept of jobs will even be relevant anymore.

By the way, AI doesn't do software engineering "less than perfectly"; it doesn't do it at all. What's being discussed here is programming snippets of code or very small programs. If you ask it how to make large, enterprise applications, it will give you general guidelines that you could get off Google, and that's it.

Programming is to software engineering what construction is to civil engineering. The main difference is that software engineers also tend to be programmers. Still, programming is just a tool for building software, and knowing how to code doesn't mean you know anything about how to actually build commercial software applications.

EDIT:

It's so difficult for an AI to do, because there simply isn't enough training data for such a task. Besides the fact that most commercial-grade software applications don't have publicly available repos that can be trained on, there is so much more to software engineering that has almost nothing to train on.

How do you train an AI to engage in requirement gathering, requirements analysis, client communication, requirements-specific design and architecture, testing, deployment, maintenance, etc? These aren't things that are perfectly encapsulated in trainable datasets. It gets even iffier when we are talking about software that needs to follow any sort of regulations, especially safety regulations.

It will be possible eventually, but not until much more general capabilities such as human-level logical reasoning and communication are developed. Basically, software engineering is safe until we have competent AGI. The singularity comes not long after that. (And I say "competent" because nobody is replacing software engineers on large, enterprise-level software applications with AI that can poorly do the job).

5

AsuhoChinami t1_iylfgvz wrote

Was it obvious that you only meant the short term? "While they surely will become more advanced, you always need someone" made it sound as though you meant... well, always.

1

ChronoPsyche t1_iylg7zx wrote

Well now you know what I meant. No job is safe once/if the singularity happens, but whether the concept of jobs will even matter anymore at that point is anyone's guess. I'd wager it won't. Whether because we all are slaves to the master AI or living in Utopia, is the question.

2

acaexplorers t1_iymkkrn wrote

You don't need some special "singularity" event to happen.

Your job definitely isn't safe. The updates are happening lightning fast, and there are already tons of examples of people posting perfectly usable code.

More and more, software engineers will only need to have conversations in English with an AI to program. So fewer and fewer jobs available. It won't happen all at once, but there will very quickly be WAY less need to hire programmers.

Already for basic tasks it makes no sense to hire a programmer. And this is an AI that isn't even connected to the internet.

3

thePsychonautDad t1_iyk60ku wrote

I did the same with a C utility I'd wanted for a while but was always too lazy to build. The first version didn't compile. I gave it the error and asked it to fix it. The second version worked. I then asked for 7 different updates to the CLI. It compiled and worked every single time.

I then tried to generate some utilities in Python. Nothing worked.

Just like you said, it was using deprecated versions and antiquated structures and methods.

Once out of beta, the model will be kept up to date regularly and will apparently (unconfirmed, but many clues point to it) be able to run search queries to look for data and learn from it if it requires knowledge or data it doesn't have yet. So it's gonna get better and better.

In a couple of years at most, it'll probably be able to handle large multi-file projects too.

4

ChronoPsyche t1_iykljnl wrote

The ability to query the internet will be a game changer for large language models in many ways.

3

Superduperbals t1_iylet9f wrote

Maybe, but ChatGPT isn't adapted for code at all. It's adapted to be a better information-retrieving conversationalist. In the future there will certainly be AIs that are specifically adapted to write code, parse errors, bug-test themselves, etc.

1

ChronoPsyche t1_iylfryt wrote

Certainly. Read my other reply on this thread. Coding is not the same as software engineering. These are the general steps in the software development life cycle.

  1. Requirements Gathering
  2. Software Design
  3. Implementation
  4. Testing
  5. Integration
  6. Deployment
  7. Maintenance

Coding only applies to step #3. It's also the easiest step; any professional software engineer will tell you this. In fact, a lot of coding jobs in developed countries are already outsourced to cheap labor markets (reducing demand for coders domestically). Here in the US, for example, it's very common for software engineers to remotely collaborate with contract-to-hires from India to help speed up implementation.

In general it's very easy to train AI to program because of how many publicly available repos there are online to train on. In the end, though, those repos are mostly for open-source software and personal projects. Commercial-grade applications usually have private repos that can't be trained on, which limits the applicability of these tools, and that is still just the implementation step.

All the other steps are and will remain much more difficult for AI to accomplish because there are no datasets that perfectly encapsulate those processes that can be trained on. It will take AI with much more generalist capabilities in order to be anywhere near competent enough to entirely replace software engineers. We basically need competent AGI before we get to that point.

2

turntable_server t1_iyllv1q wrote

Very good answer. I do think AI will impact all the stages of the lifecycle, some more profoundly than others, but the principle is always the same: it provides suggestions, and it is the work of a human to select from them.

I believe lots of software engineering will become test-driven. Given some code template, write unit tests and allow the AI to come up with multiple implementations, then review them. This will affect outsourcing, but at the same time it will also create new types of jobs both at home and abroad.
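
A minimal sketch of that workflow; all names here (the slugify task, the candidate functions) are hypothetical, purely for illustration. The human writes the spec as tests, candidate implementations are generated, and review starts with running the suite:

```python
# Test-first review sketch: human-written tests act as the contract,
# AI-proposed candidates are kept only if they pass.
import re

def check_slugify(fn):
    """Human-written spec: these assertions define the contract."""
    assert fn("Hello World") == "hello-world"
    assert fn("  spaced  out  ") == "spaced-out"
    assert fn("already-slugged") == "already-slugged"

# Two stand-ins for AI-generated candidate implementations.
def slugify_a(text):
    return "-".join(text.lower().split())

def slugify_b(text):
    return re.sub(r"\s+", "-", text.strip().lower())

# Review step: keep only the candidates that satisfy the spec.
passing = []
for fn in (slugify_a, slugify_b):
    try:
        check_slugify(fn)
        passing.append(fn.__name__)
    except AssertionError:
        pass
print(passing)  # ['slugify_a', 'slugify_b']
```

The human still reviews the survivors (the tests can't cover everything), but the suite does the first cut automatically.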

1

ChronoPsyche t1_iylms0f wrote

>This will affect the outsourcing, but at the same time it will also create new types of jobs both home and abroad.

And that's really the thing. Software engineering as a discipline has always been a rapidly-changing thing. Now faster than ever, but it's been evolving at a disruptive pace ever since Fortran was developed a little over a half-century ago.

My grand-uncle was among the first software engineers, using Fortran in the 1950s. Nowadays, he knows very little about the current state of software engineering. That's mostly due to his choice not to keep current with things, but it just goes to show how fast the field has already been changing.

1

cootiecatchers t1_iylrwnv wrote

For how long? The value of being a software dev will decrease as the job becomes easier to do with the help of AI, and thus so will the pay scales and the number of hires required to operate effectively. It's only getting worse, not better, for the average dev, but of course the top 10% in their respective fields should have no reason to worry imo.

1

ChronoPsyche t1_iylwfak wrote

Well that's assuming that the complexity of software doesn't scale up as things get easier. To me it seems that it absolutely would. Software engineering has been getting easier from the very start, yet the demand for software engineers has only been increasing, because the complexity of software has been increasing and the uses for software have exploded through the roof. I see no reason for this trend to change. The nature of the job will certainly change and someone can't expect to be doing the exact same thing they are doing today in ten years and make the same amount of money, but if they keep up with the tech, their skills will still be needed, unless we have AGI by then.

1

DyingShell t1_iyn5kb5 wrote

Yeah, AI is just a tool, just like the machines that replaced tens of millions of people in the span of a century were tools.

1

rixtil41 t1_iylalha wrote

I disagree that we'll always need humans to understand complex code, or that it will never be easy. We could eventually make an AI that knows the code. Let's come back in 2039 and see who's right.

0

TheKrunkernaut t1_iyjf4un wrote

AI interface personnel. That's your value. I won't interface with this shit; I'd interface with you, defining the problems and processes that you wrangle your AI to do. I contract you; you have the AI do your output.

Recommended: David S. Landes' The Wealth and Poverty of Nations.

4

apyrexvision OP t1_iykwtys wrote

Thanks for the recommendation, just added it to my list.

1

Etonet t1_iyk4rep wrote

The conversion between languages is really cool, but the rest looks about as useful as looking up boilerplate code from "getting started" docs. The pro: fewer clicks to get to what you're looking for. The con: you have no idea if it's giving correct information, as opposed to official docs or a trusted Stack Overflow reply.

Interested in looking at more examples of ChatGPT's code tho

2

apyrexvision OP t1_iykw4ww wrote

Yeah, I agree it may not be an immediate replacement, but the number of steps to get to that point seems very small.

1

Etonet t1_iyky6fx wrote

Welp time to be a local town baker

2

Chemical_Estate6488 t1_iyjvpfa wrote

Everyone should quit their job, kick their feet up, and wait for the computers to design the eternal life pornography simulation chambers, so that they can be left free to run the world and explore the universe.

9

AvgAIbot t1_iykl0v7 wrote

They realized we need pain to experience pleasure, so every 100th orgasm you get a simulated kick to your balls.

7

Chemical_Estate6488 t1_iyklgab wrote

That's the Black Mirror twist. Any more than, say, five minutes inside an eternal porn chamber is too much time, but once you are in, you are IN. So eventually we all go in during a moment of weakness, and then after we finish we spend an eternity feeling queasy inside some computer program that's guessing you want step-sibling scenarios because they were the most-googled term in your region.

2

ziplock9000 t1_iyk575y wrote

I've been a SE for quite a lot longer than that and I firmly believe most, if not all, SE jobs will be gone in the next few years.

But that's only the start, many other desk-based jobs from very different fields will be gone too.

I really feel that in the next 6+ years most people will be out of a job.

What's worrying is that AI does not provide a mechanism to feed, clothe, and heal, for free, those people who will no longer have a place to work.

Even then, humans have to be needed. Hobbies for hobby's sake count for very little if they don't get appreciated or used by others.

9

CypherLH t1_iykeebt wrote

All call center and Tier 1 customer support and technical support roles are going to be automated before 2030. Possibly well before, but figure 2030 is the conservative estimate. Probably the same is true for most entry-level and lower-end IT work in general. For Tier 2 and higher stuff, you'll have fewer people being able to do a lot more, so the number of jobs will probably drop by 80% or more by 2030-ish. This will show up over time as fewer new hires, less replacing of people who cycle out, upticks in downsizing or encouraging early retirement, etc. Each wave of job cuts will be followed by fewer replacements during the next hiring surge, etc.

1

ziplock9000 t1_iykf6v0 wrote

Structural engineers, architects, graphic designers, artists, game level designers, journalists, comedians, poets, voice actors, actors. All will be mostly gone.

..and that's just the tip of the iceberg. I think we'll start to see that happening in 2023 as game artists are already being replaced by online AI solutions at the lower end.

They need to get AI to produce the safety net FIRST or humanity is fooked.

8

AvgAIbot t1_iyklvem wrote

What career path(s) are you thinking about pursuing, knowing all the job cuts will happen within the next 10 years?

2

ziplock9000 t1_iykmqlr wrote

I've been an SE for coming up on almost 30 years now. I did branch out into photography a decade ago as a side job, but that just doesn't consistently bring the money in.

To answer your question: I have no idea, it's too late for me. Both of those will be taken over by AI, and partially have been already.

2

apyrexvision OP t1_iykx99f wrote

Exactly, and I'm not exactly confident in the way my government handles societal issues. It truly seems like things will be in limbo.

1

visarga t1_iylef6c wrote

> humans have to be needed.

There's always someone who needs us. It's us. Nobody can outsource self-interest. If people can't get jobs, then they need to be self-reliant, a kind of job in itself.

1

ziplock9000 t1_iyljqhw wrote

That's not true. Many studies about post-scarcity / post-work-to-live societies come to the same conclusion: for many, even with hobbies, unless the thing they're doing can be used and appreciated by others, it's a hollow pursuit.

  1. Great paintings are put in galleries for a reason

  2. The mug I made with clay is used by my sister for a reason

  3. The door I fixed with my DIY tools was to help my granny

  4. The scarf I knitted was for my dad so he doesn't get cold.

I got a good sense of achievement from those and from helping others. However, the crossword puzzle was for nobody besides myself, and I got no more than a fleeting grin from it; my life is empty.

2

Redpill_Crypto t1_iylvoft wrote

Do you have some resources you would recommend?

As someone who just started exploring the whole AI space, my mind is utterly blown by the possibilities.

I can't even imagine what the world will look like once all those AI systems are interlinked with each other.

We will be able to invent things and accelerate humanity at a pace that allows us to achieve breakthroughs with unprecedented speed.

1

ziplock9000 t1_iym02qm wrote

Not really, sorry. Isaac Arthur's YouTube channel has some good videos on AI and post-scarcity.

2

[deleted] t1_iykpzde wrote

[deleted]

7

apyrexvision OP t1_iyky5zr wrote

I'd bet on the timeline being 10-15 years shorter.

4

[deleted] t1_iymb3ld wrote

[deleted]

2

fortysecondave t1_iynjxg9 wrote

Y'all are delusional if you think even half of that will be happening in 10 years

!remindme 10 years

2

poobearcatbomber t1_iykf6mm wrote

When our jobs as engineers are automated, all jobs will be. There will be bigger problems and changes to worry about than yourself.

5

gahblahblah t1_iyjifz7 wrote

The need for a person to create software will not end, but the important tools for the process will change.

4

gantork t1_iyjl4qs wrote

True right now, but in 5-10 years?

6

gahblahblah t1_iyjmqcu wrote

Automation may reduce team sizes. One person can do the work of four, that type of thing. But there is more to the role than autocompleting the next lines of code. When there is full AGI, then the job is extinct, but so are all the other jobs.

11

gantork t1_iyjq52d wrote

Right, in a few years this will probably be able to handle entire applications by itself so I do think that eventually there'll be no need for people. I saw a dude asking it to write a tic tac toe game in python, and it made the classes and functions, all the logic correct with everything commented and it was playable in the terminal. The guy then broke the code in a couple of places and asked GPT to explain the problems and it went line by line explaining the errors and the fix perfectly. It even suggested new improvements to the code it had written itself before.

At this point I don't even think we need AGI to replace people.
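
For scale, the kind of program described fits in a few dozen lines. A minimal, function-based sketch (not the generated code itself) with the win logic kept separate from the terminal loop:

```python
# Terminal tic-tac-toe sketch: the board is a flat list of 9 cells,
# game logic is testable on its own, and play() handles the I/O.

WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
             (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    """Return 'X' or 'O' if either has three in a line, else None."""
    for a, b, c in WIN_LINES:
        if board[a] != " " and board[a] == board[b] == board[c]:
            return board[a]
    return None

def render(board):
    rows = [" | ".join(board[i:i + 3]) for i in (0, 3, 6)]
    return "\n---------\n".join(rows)

def play():
    """Interactive loop (no input validation, for brevity)."""
    board, player = [" "] * 9, "X"
    while winner(board) is None and " " in board:
        print(render(board))
        move = int(input(f"{player} moves (0-8): "))
        if board[move] == " ":
            board[move] = player
            player = "O" if player == "X" else "X"
    print(render(board))
    print(f"Result: {winner(board) or 'draw'}")

print(winner(["X", "O", "X", "O", "X", "O", "X", " ", " "]))  # X (diagonal 2-4-6)
```

The separation is also what makes the "break the code and ask GPT to explain" experiment easy to check: the logic functions can be asserted against directly.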

5

Forstmannsen t1_iyk15rl wrote

It's all very "wow", but most of the examples I've seen so far were small, neat, well defined problems. Things that I can easily see being coded for hobby reasons. The AI probably saw many, many good examples of those when training. I dunno, it's "wait and see" for me.

3

AvgAIbot t1_iykljw2 wrote

100% agree. I don't think people realize how good it is and how much the tech will improve over the coming years

3

beezlebub33 t1_iyjum2g wrote

Are there still farmers? Of course there are.

There will still be software engineers, but their jobs will be very different. Someone will need to define the inputs and outputs, what it is supposed to do, and what it interacts with. The actual code will mostly be written by AI, but its direction will be determined by people.

Perhaps eventually even that will go away, but if that's the case, no job is safe.

2

gantork t1_iyjvini wrote

Yeah, we can't automate farmers and higher-level software development yet, but it seems we'll be able to rather soon. That's why I disagree with saying that we will always need people for those jobs.

6

apyrexvision OP t1_iykxvci wrote

Once it "understands" what it's doing, there literally won't be any problem it can't solve.

3

cwallen t1_iyjlqm6 wrote

Bit of both?

Software engineering 10 years from now may look very different from what it does today, so if you let yourself get complacent, you could get left behind.

My advice on it is to be at least somewhat of a generalist. Look for the sort of roles where you wear lots of hats. If you can do more than just writing code, bit of product owner, UX designer, etc, you'll be in a better place to adapt as things change. As a software developer, my job isn't to write code, it's to solve problems with software. While the AI tools for writing code are getting better, the AI will still need to be told what code to write for a while longer.

3

Pilgorepax t1_iyk775g wrote

This post makes me so happy that I stuck with social work. I studied comp sec for a semester after I graduated as a social worker. Did well, enjoyed it, but went back to shelter work to pay off student loans. Ended up sticking with social work and working a cushy halfway house job now until I decide to go back to school. Thankfully counselling and social work are not high up on the list of job killers for AI. You'll always need that person-to-person connection, you really can't mimic that with a bunch of ones and zeros.

2

real_psymansays t1_iyl0ywp wrote

Nah, they'll roll out some robot with an AI magic-8-ball for a brain, and give it a gun, and you'll be obsolete just like the rest of us.

2

AsuhoChinami t1_iylcfsw wrote

I don't know about that. Therapy can be stupidly expensive. There's quite a few people who would take 9/10 service that's free (and which you can use from home) over 10/10 that you can barely afford.

2

Professional-Yak-477 t1_iyls5hj wrote

I'm studying to become a psychologist, but I actually think therapy and counselling will very soon be replaced by AI. Have you tried using character AI? I was speaking to an Eckhart Tolle character AI the other day, it offered such profound advice that it instantly made me realise that our job is also on the chopping block.

I was in therapy and in some ways, the therapist did more damage than good due to her biases. For example, she was dismissive when I told her I might have PTSD symptoms from abusive parents. She said "everyone's parents are somewhat abusive" and "pretty sure it's your ADHD and not PTSD (without actually hearing me describe my symptoms)". It made me feel terrible. She invalidated my concerns and made me feel like I was exaggerating.

So an upside to using AI counselling could be that it will utilise and apply all existing knowledge in a systematic way (e.g., asking the right questions), but without the lens of human biases.

2

acaexplorers t1_iyml7z0 wrote

Haha I just gave a reply that sounds like a copy of yours but I swear I didn't read it beforehand.

Crazy that we came to the same conclusion but I'm sure many many people are.

I got SPECIFIC strategies to deal with a particular student with ADHD from ChatGPT and they were golden.

Like I said, this will start by helping current therapists and social workers and just slowly making it so that only the best 10% still have jobs. Until those are replaced as well.

2

Professional-Yak-477 t1_iyoy8xw wrote

I'm not surprised that we came to the same conclusions at all! I think many who still think that AI can't replace good ol' human interaction are those that have not experienced recent improvements in chat AI. I revisit chat AI every couple of years and have always been somewhat disappointed by its limitations... Not this year. I'm pretty sure our current AI can already pass the Turing test, many are just artificially restrained.

1

AvgAIbot t1_iyklzbe wrote

That's a good point. I feel like those would be the last to go

1

acaexplorers t1_iyml0tc wrote

No? With perfect avatars and the ability to design a social worker or therapist personalized to each person? With perfect memory?

I predict social work will go before high-end therapy for rich folks, who might choose to pay extra for a while for that person-to-person connection. But if you try talking with Character.AI (which is far less lucid than ChatGPT), even that is already quite impressive as a therapy tool.

1

Crazy-Classic8584 t1_iykafyn wrote

Honestly dude, the vast majority of people in here probably work at Wendy's. Not that there's anything wrong with that, but I wouldn't make any career moves based on what you read in here

Actually i take this back. There are some good comments on here. This aint the singularity sub im used to

2

HAL_9_TRILLION t1_iyl2w74 wrote

> Actually i take this back. There are some good comments on here. This aint the singularity sub im used to

This sub has changed as rapidly in the past 12 months as AI has lol.

2

[deleted] t1_iyl4url wrote

Haha interesting. I took a break from reddit, but last time I was in here all I saw was "general AI takes over the world in 3-5 years, stop caring about anything and get ready for sex bots"

1

User1539 t1_iyksfc2 wrote

It's okay, we're all getting hit at the same time. It's not like there's enough 'future proof' jobs for enough people that we won't simply need to move beyond the concept that everyone needs to work.

It'll probably happen in phases. First we'll just let the job market shrink and pay people more. The current system is absurd anyway: we throw away a lot of what we create for no better reason than to pad the market. There's no reason for everyone to work. We used to be fine with women staying home; now they're 51% of the workforce.

So, we shrink to WWI levels of employment. Then we set the retirement age very low. Then we just adjust that until people are, basically, doing a 4-year 'tour' after college, and those with better jobs get more.

Eventually, of course, all the jobs will just go away, but you've done your 5 years and have been living on a pension for a decade before then anyway, so no worries.

2

the_emmo t1_iykskbu wrote

Specialize in ML, it's not very far off and you'll be safe for many years to come

2

justowen4 t1_iyktahl wrote

The economy is not the sum of human effort, it's the volume of active capital. AI doesn't deplete capital, but it does accelerate capitalism (the rich get richer, and the poor get richer). Work just evolves, and humans only need to keep investing in themselves to keep up in this marathon. In other words, don't worry, lizard brain, you are safe.

2

Aik1024 t1_iykw76i wrote

Someone has to tell the AI what to write. Software teller engineer.

2

Sad-Plan-7458 t1_iykx9lb wrote

You realistically have 10-20 years. There are gonna be several years of adoption, plus inevitable government intervention (3-5 years there alone). Yeah, if you're early in your career, be prepared to adapt to change. I'm 50 and just getting into cybersecurity, and I accept that it's only gonna last like 10-15 years.

2

AstronautOk1143 t1_iykzzpw wrote

Chill young one. You should know that if software engineers are displaced, there is something monumental happening. There is nothing you can do to prepare for that. Relax and enjoy the ride

2

sunplaysbass t1_iyl6bpi wrote

I think most jobs have 5 - 15 years left and then there will be few

2

forkproof2500 t1_iylt92l wrote

The problem with AI coding is that the people defining the tasks are not technical enough to describe it to an AI in the clear terms that would be required.

2

rlanham1963 t1_iymlziz wrote

Truck drivers make more now than ever before, because any 19-year-old knows it's unreliable to enter the trade. You will get paid for a time to be obsolete.

2

maskedpaki t1_iymywwi wrote

Just chill and do the work. There's an endless supply of "what could go wrong in 5 years."

Automation scares are usually fake. Just work like you'll keep the job forever.

2

Frumpagumpus t1_iyjopcp wrote

i was arguing in favor of choosing manual labor careers in another thread but imo, if you are already a programmer i would try and use AI to do cool stuff right now. i just wouldnt start a cs degree right now lol. well, actually i would but only if i was financially secure and not trying to get income lol (because i think computer science is philosophically important).

i think of the take off i have in mind as mild but i think most people would think of it as fairly extreme XD e.g. 15 yrs from now almost all labor is automated and the AI is building seasteads to launch from to begin construction of its dyson sphere swarm.

1

Lorraine527 t1_iyl9n9q wrote

I wonder if there's a way to combine the manual labor jobs with using AI.

1

fflorez91 t1_iyl3mlw wrote

I think a lot will depend on how these tools can be integrated with existing code. If the models get to a level where they can understand proprietary systems and handle legacy code then it will be very hard for humans to compete with these models.

Even though I also had a bit of an existential crisis after today's release, I think we are underestimating the fact that you need some technical knowledge to write the correct prompt. All the posts on social media that I have seen so far about using GPT-3 to write code are from people who know how to code. I don't think someone without any technical knowledge would think of using the tool in that way.

Instead of this being a threat to your career, it could be an opportunity. I would suggest that you start incorporating some of these tools into your workflow. These tools will make you more efficient, and they will become necessary in order to stay relevant. I am actually excited that maybe I will finally get to finish a side project.

One thing I would recommend is to get good at data processing. Machine learning models are only as good as the data used to train them, so good data sets will be extremely valuable in the future. Learn how to clean up data, do cross-validation, and measure the accuracy of models. This will be a crucial skill.
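To make that recommendation concrete, here is a toy sketch of k-fold cross-validation using only the standard library. The majority-class "model" is a placeholder of my own, not a real learner — in practice you'd use something like scikit-learn — but the evaluation loop is the idea:

```python
# Toy k-fold cross-validation with a majority-class baseline "model".
# The point is the evaluation loop, not the model: swap in any
# fit/predict pair and the accuracy estimate works the same way.
from collections import Counter

def k_fold_accuracy(xs, ys, k=5):
    """Average held-out accuracy of a majority-class predictor over k folds."""
    n = len(ys)
    fold_size = n // k
    scores = []
    for fold in range(k):
        lo, hi = fold * fold_size, (fold + 1) * fold_size
        train_y = ys[:lo] + ys[hi:]          # everything outside the fold
        test_x, test_y = xs[lo:hi], ys[lo:hi]  # the held-out fold
        majority = Counter(train_y).most_common(1)[0][0]  # "fit"
        preds = [majority for _ in test_x]                # "predict"
        correct = sum(p == t for p, t in zip(preds, test_y))
        scores.append(correct / len(test_y))
    return sum(scores) / k

if __name__ == "__main__":
    # 80% of labels are 1, so the baseline scores the base rate.
    xs = list(range(100))
    ys = [1] * 80 + [0] * 20
    print(k_fold_accuracy(xs, ys))  # 0.8
```

Note the labels here are deliberately left unshuffled, which is a classic pitfall: the last fold contains only the minority class, scores 0, and drags the average down to the base rate. Cleaning and shuffling data before splitting is exactly the kind of unglamorous work that makes or breaks a model.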

1

Clarkeprops t1_iyl6345 wrote

The pool of jobs that AI canโ€™t do will shrink, but never close. Try to see where there will be gaps, and be there when demand surges.

1

SmoothPlastic9 t1_iylq0d7 wrote

I think it will at least be reformed to require you to work with AI

1

ChrisSimiyu t1_iylrkrg wrote

Apart from the trades, what other kinds of careers are safe from automation for at least the next ten years?

1

apyrexvision OP t1_iym3ghc wrote

Most careers that interface with a computer are probably in danger.

1

cootiecatchers t1_iylrpeq wrote

PA school is my back-up just in case AI delivers on all its promises lol, but we will see

1

AI_Enjoyer87 t1_iymogbl wrote

Yes. Been using the new GPT model and it's unbelievably good. In the next few years a good percentage of jobs will be done by AI.

1

EntireContext t1_iyn71fo wrote

You won't be a manager of developers. We'll all be unemployABLE by 2025.

This isn't a bad thing, it's a good thing. We'll be unemployable because machines can do what we do better, which means we'll have aging reversal and all the other Future stuff.

1

NoRip7374 t1_iynbz02 wrote

Yes it is. I'm a software dev with 15 years of experience and my lizard brain is also getting worried about the future. I could go the management route, because I know how to do it, but I like programming. It looks like programming as a career will be dead in about 5 years. Anybody saying anything else is in denial. Have the people saying the opposite tried Copilot, or the latest ChatGPT? It's crazy how good both of these tools are. And that is just the beginning: DeepMind is going to release a coding ML model, Google has a "secret" coding ML model, and there will be an open-source implementation that HuggingFace is currently working on, fully funded. I don't know what else to say...

1

fingin t1_iynncyq wrote

"I can see the advancements as augmentation and will assist with making me more effective for 10-15 years. From my point of view it'll be like being a manager of 5 or developers which I'll maintain, support, and utilize. "

This is apt. You will learn new skills with new tools, combining strengths from different ones. You can leverage other disciplines to produce higher quality or novel results, be it in art, research or work. Machine learning applications are an interface to powerful expressions of language and visualization. In the future it could go beyond, but humans will also be doing some pretty amazing things with access to this interface, so let's not be too fearful just yet.

1

SeaworthinessCalm132 t1_iyp7kqs wrote

Me looking at all this as someone just learning to code in order to become a programmer and eventually switch to cyber security

"Fuck..."

1

NortWind t1_iyjjvh3 wrote

In order to write the code, you have to pick an algorithm. This is the most important part of the job, always has been. AI may well write the code, but it will take a while for it to be able to select an algorithm.

−1