Lawjarp2

Lawjarp2 t1_je3cu0i wrote

Depends on how early in the transition you lose your job and how long the transition is.

I think the transition could be anywhere between 5 and 20 years, depending on how efficient and proactive a government is.

White-collar workers, except those at the very top of each field, will lose their jobs first. Or at least start to make less.

Given this, you should have 5-20x your yearly expenses saved. Wealth in currency or assets may not be stable either. So really your only hope is that the government gets its shit together.

2

Lawjarp2 t1_jd6n0af wrote

Nothing in the fast scenario is ever good. This sub thinks everyone will adapt quickly, but the vast majority will not, because they have not been exposed to this concept for years and the movies only show them the most negative scenarios.

Your slow scenarios are too slow. I would call 30 years slow. This is also the best timeline, wherein we get to prepare better.

So I think the scenario best for us is slow, adaptive and aligned, with slow being 30 years. But I think what will actually happen is fast, adaptive and unaligned.

3

Lawjarp2 t1_jcb2j0i wrote

It scores at the 5th percentile on Codeforces. It can barely solve medium-hard questions on LeetCode.

Most software development doesn't require being good at anything mentioned above. But these benchmarks do indicate one's ability to make the leaps of logic required to solve something like AGI. GPT-4 is not ready for that yet.

14

Lawjarp2 t1_jb66hv1 wrote

The things that can slow it down are already in motion, but they can only push it back so far.

(1) A recession causing a funding drain on the companies trying to build AI. A recession is already here.

(2) A war or other critical event causing interest rates to go high, leading to defaults in startups and even established companies. Interest rates will go all the way to 6% this year.

(3) Hardware/cost limits being hit. Better hardware will of course be available soon, but it's harder now to scale just by pumping in money. Training costs are already reaching hundreds of millions of dollars; more is only possible with government backing or high returns on these AI models.

(4) Isolation of a large country like China from chip manufacturing and procurement for AI.

Other things that could happen:

(*) GPT-4 being a bust and thereby eroding confidence.

(*) OpenAI and other companies failing to monetize.

(*) Scaling may have reached its limits. Newer architectures take time.

But even with all this, it can only be slowed down by 5-10 years. We will still likely have AGI in the 2030s.

31

Lawjarp2 t1_jacugjc wrote

Context won't even matter. No single person wrote all those millions of lines of code, and no single person needs to know all of it. Just the functionality of each module and how to use it is enough context for others to use it and build their own modules.

Essentially, a 32k or even 8k context would itself be enough. But ChatGPT as it is now is not robust.
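A minimal sketch of the idea above: instead of feeding a model millions of lines of source, build a compact context from each module's public interface only. This uses Python's `inspect` module and demonstrates on the stdlib `json` package; the exact summary format is just an illustration.

```python
import inspect
import json


def interface_summary(module):
    """Collect just the names and signatures of a module's public functions.

    This is the kind of compact "how to use it" context that fits easily
    in an 8k-32k window, unlike the module's full source.
    """
    summary = {}
    for name, obj in inspect.getmembers(module, inspect.isfunction):
        if not name.startswith("_"):
            summary[name] = str(inspect.signature(obj))
    return summary


# The summary exposes dumps/loads signatures without any implementation code.
ctx = interface_summary(json)
print(ctx["dumps"])
```

The same trick scales across a codebase: summarize every module this way and the model sees the API surface, not the implementation.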

20

Lawjarp2 t1_ja7sbla wrote

Did some digging. They used 800,000 cells; it never got to superhuman levels; and human cells performed better than mouse cells.

Things to take away: 800k cells is not equivalent to 800k parameters. 800k cells might equal anywhere between 800M and 8B parameters (1,000-10,000 synapses per neuron). If a bloody 800M-parameter model can't learn to play ping pong like a superhuman, it's not really that great. However, it probably does converge sooner, given that the way neurons learn in the brain isn't like a feed-forward network.
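The back-of-the-envelope math above can be written out explicitly. The only assumption is the one stated in the comment: each neuron carries roughly 1,000-10,000 synapses, and one synapse is counted as roughly one trainable parameter.

```python
def equivalent_parameters(cells, synapses_per_neuron):
    """Rough parameter-count equivalent: one synapse ~ one parameter."""
    return cells * synapses_per_neuron


cells = 800_000
low = equivalent_parameters(cells, 1_000)    # lower bound on synapses/neuron
high = equivalent_parameters(cells, 10_000)  # upper bound on synapses/neuron

print(f"{low:,} to {high:,} parameters")  # 800,000,000 to 8,000,000,000
```

So the dish of cells sits somewhere in the range of a sizeable modern language model, which is what makes its failure to reach superhuman play notable.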

The most curious thing here is that apparently neurons work by trying to reduce entropy. It looks like all of life is trying to do the same at many levels.

16

Lawjarp2 t1_ja4y8qi wrote

I understand your feeling, I feel the same way too. I honestly don't know what to do either. Right now here's what I think makes sense.

It is likely software development will be one of the first things to get disrupted, along with art. You don't have to replace everyone, just increase productivity greatly. This is unfortunate, as those who lose their jobs will need to fend for themselves until UBI.

(1) If financial security is an issue, you must stay at your job till you get replaced, learn to use AI as a tool, and prolong the inevitable replacement as long as possible.

(2) Assess how good GPT-4 will be at coding. Given it will be heavily trained on code and OpenAI is offering Foundry to companies, it should be the best model for a while.

(2a) If GPT-4 is poor at programming, continue on till the next big model.

(2b) If GPT-4 or any other new model is reasonably good at it, do the following based on how well off you are:

If you have greater than 10X yearly expenses saved, quit and enjoy life.

If you have greater than 5X yearly expenses saved, quit and enjoy life at reasonable expense, with some odd jobs along the way.

If you have less than 5X but greater than 3X saved, continue working till you get replaced, but chill a bit.

Less than 3X saved? Then we don't really have a choice, now do we.
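The thresholds above boil down to a simple decision ladder over your savings-to-expenses multiple. A sketch, with the function name and wording of each outcome being my own illustrative choices:

```python
def advice(savings, yearly_expenses):
    """Map savings (as a multiple of yearly expenses) to the plan above."""
    multiple = savings / yearly_expenses
    if multiple > 10:
        return "quit and enjoy life"
    if multiple > 5:
        return "quit, live frugally, take odd jobs"
    if multiple > 3:
        return "keep working until replaced, but chill"
    return "keep working; no real choice"


print(advice(240_000, 20_000))  # 12x saved -> quit and enjoy life
```

Note the ladder checks the highest threshold first, so each band is exclusive of the ones above it.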

1

Lawjarp2 t1_ja4b2v6 wrote

Air.

I'm surprised someone is dumb enough to not understand that things lose value when there is lots of supply, and usefulness is irrelevant if you can't corner a market.

Hoarders hoard to reduce supply. They can't do so in a world that can create ever more stuff, if hoarding even makes sense in a world with crazy supply.

4

Lawjarp2 t1_ja32wjj wrote

Nope. We have a crazy amount of resources. We lack the energy to extract and separate them. We literally sit on the biggest rock of minerals in the solar system.

People don't need unlimited amounts of anything; they of course want it. But if something were unlimited and cheap, most people wouldn't use that much. It's like the corporate policy of unlimited leave: if you give people infinite leave, they actually take less than the sanctioned amount.

People underestimate how much land there is. Even if we want to limit our total area we can always give everyone more vertical space.

8

Lawjarp2 t1_ja31joi wrote

Money won't matter. It is very unlikely that society will tolerate some people having an obscene amount of land while others don't, when individual value addition is zero. AI is about to make everybody equal, equally pointless, and unlike the failed communist attempts at the same, forced equality now is not gonna be an economic disaster.

−4