Down_The_Rabbithole t1_j49lf0z wrote

>remember people used to think GNU was the operating system of the future because it was open source.

That actually came true though. Almost all servers, supercomputers, embedded systems and mobile systems like smartphones run a Linux-derived system. Essentially the only place where Linux doesn't dominate is the desktop PC, which is honestly extremely niche in the grand scheme of computing in 2023.

You can safely say that GNU/Linux is the main operating system of humanity in 2023 and be technically correct.

For example, you probably wrote your comment on a smartphone running a Linux-derived OS. You sent that message to a cell tower running Linux. The Reddit servers received the comment running Linux. And I'm reading it back on my Linux phone.

42

Down_The_Rabbithole t1_j1qonpd wrote

AGI can't come from transformer models. They simply don't scale up well enough. The T in GPT stands for Transformer.

Hence GPT-4 isn't going to be the beginning of AGI. Anyone familiar with machine learning can tell you this. Nobody is claiming that this will lead to AGI either.

This subreddit really, really needs to work on misinformation and clearing up some of these misunderstandings.

2

Down_The_Rabbithole t1_iyuyvec wrote

Reply to comment by Shelfrock77 in bit of a call back ;) by GeneralZain

This isn't dependent on Moore's law or AI but is actually limited by technologies outside that exponential growth, like battery capacity, our understanding of the human brain and anti-inflammatory drug development.

It's possible we'll have reached an ASI-guided singularity, fusion power generation and space habitats while still not having access to FDVR, because of a physical limit in something like anti-inflammatory drugs or the material connection to the human brain.

6

Down_The_Rabbithole t1_ixzg548 wrote

Every serious software developer knows our jobs will be gone within 5-10 years' time. Most of my smarter colleagues are already teaching themselves people skills or going back to college part time to learn things like psychology, because most STEM jobs will be gone very soon and the humanities might be the only jobs around in a decade.

−1

Down_The_Rabbithole t1_ivsks9k wrote

Yeah, no. This doesn't work if you actually understand the math involved. Like another commenter said, it would reinforce local maxima, which means it would work well in very specific and isolated cases but wouldn't generalize well.

Training data generation is the largest problem current AI models face, and it's going to make the entire industry stagnate over the next couple of years as we slowly run out of data to train bigger models with. Synthetic data training, however, is very unlikely to be a viable solution.
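To make that "reinforcing local maxima" point concrete, here's a minimal toy sketch (my own illustration, nothing from the article): fit a trivial model, sample "synthetic" training data from it, refit, and repeat. No new information about the original distribution ever enters the loop, and with a finite sample the fit typically degenerates over the generations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generation 0: a small "real" dataset from a standard normal distribution.
data = rng.normal(loc=0.0, scale=1.0, size=50)

for generation in range(201):
    # "Train": fit a trivial model (a Gaussian) to the current training set.
    mu, sigma = data.mean(), data.std()
    if generation % 25 == 0:
        print(f"generation {generation:3d}: mu={mu:+.3f} sigma={sigma:.3f}")
    # "Synthetic data": the next training set is sampled from the current fit,
    # so nothing new about the original distribution ever enters the loop.
    data = rng.normal(loc=mu, scale=sigma, size=50)
```

On most seeds the fitted sigma drifts toward zero over the generations, because each fit can only re-encode what the previous model already captured. Real generative models are far more complex, but this feedback loop is the usual argument for why purely synthetic training data is suspect.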

If anything, we'll probably need jobs dedicated to organic data generation by actual humans to train AI better in the future.

2

Down_The_Rabbithole t1_iv9ggzh wrote

Smaller chips are faster because everything is physically closer together, so signals have less distance to travel.

Smaller chips are cheaper to produce because you can make more of them on a wafer at the same time.

Smaller chips consume less power and thus increase battery life on smartphones/laptops.

Smaller chips produce less heat, so they can either be clocked higher for more speed, or laptops/smartphones can be made smaller/thinner because they need less cooling.

But most often the chips don't actually shrink; manufacturers just use the new production technology to put more stuff in a chip of similar size.
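As a rough back-of-the-envelope illustration of the cost point (all numbers here are made-up assumptions, not real foundry figures): halving the die area roughly doubles the candidate dies per wafer, and yield per die also improves because each die is a smaller target for defects.

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Standard approximation for candidate dies on a round wafer, with edge loss."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def die_yield(die_area_mm2: float, defect_density_per_mm2: float) -> float:
    """Simple Poisson yield model: probability a die has zero killer defects."""
    return math.exp(-die_area_mm2 * defect_density_per_mm2)

# Hypothetical "old node" vs. "shrunk" die on a 300 mm wafer,
# with an assumed defect density of 0.001 defects per mm^2.
for area in (200.0, 100.0):
    candidates = dies_per_wafer(300.0, area)
    good = candidates * die_yield(area, 0.001)
    print(f"{area:5.0f} mm^2 die: {candidates:4d} candidates, ~{good:.0f} good dies")
```

With these assumed numbers the smaller die yields more than twice as many good chips per wafer, which is why cost per chip can fall even when wafer prices rise on newer nodes.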

1

Down_The_Rabbithole t1_iv9g8vz wrote

Quantum tunneling has been a problem since 32nm. The solution is to have hardware do the calculation multiple times to make sure a bit didn't get flipped; the result that comes up most often is assumed to be the correct one.
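A minimal software sketch of that majority-voting idea (my own illustration; real CPUs do this in hardware with ECC and redundant logic, not in Python): run the same unreliable computation several times and keep the most common answer.

```python
import random
from collections import Counter
from typing import Callable, TypeVar

T = TypeVar("T")

def majority_vote(compute: Callable[[], T], runs: int = 5) -> T:
    """Run an unreliable computation several times and keep the most common answer.

    Assumes upsets are rare and independent, so the correct result wins the
    vote with high probability; an odd number of runs avoids ties.
    """
    tally = Counter(compute() for _ in range(runs))
    return tally.most_common(1)[0][0]

# Hypothetical flaky adder that occasionally flips one low bit of its result.
def flaky_add(a: int, b: int, flip_prob: float = 0.05) -> int:
    total = a + b
    if random.random() < flip_prob:
        total ^= 1 << random.randrange(8)  # simulate a single bit flip
    return total

print(majority_vote(lambda: flaky_add(40, 2)))  # almost always prints 42
```

The extra runs, or in hardware the extra transistors, are pure overhead, which is the redundancy cost described below.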

Jim Keller has an entire talk about how to manage quantum tunneling bit flips statistically.

Sadly it means more and more of the actual silicon is spent on redundancy like this instead of on normal computing.

We can clearly see this in how a CPU from 2008 (the i7-920) and a CPU from 2022 (the i7-13900K) differ by almost 100x in transistor count, yet the 13900K is "only" 5-10x faster.

10

Down_The_Rabbithole t1_iv9fyae wrote

Intel stopped doing that this generation, as they got frustrated by consumers thinking they were behind in transistor density. So they have now renamed their 7nm to 5nm, and will rename their 5nm to 2.1nm to be more in line with TSMC's fake names.

Samsung is the worst of all. Their "4nm" is equivalent to GlobalFoundries 12nm, Intel 14nm and TSMC 10nm.

4

Down_The_Rabbithole t1_iv50d89 wrote

No, that would be pseudoscience. By definition things are always beholden to the laws of physics. Else they wouldn't be the laws of physics.

This is not a religion based on hope. It's a science based on observation and mathematical constructs.

That Arthur C. Clarke quote is just that, a quote. It isn't actually factual or something you should adhere to; it's just a funny remark.

Concepts like entropy, however, are actual, real science, and the laws of physics are things everything in the universe by definition adheres to. These things are not the same.

5

Down_The_Rabbithole t1_iv4vmid wrote

Entropy makes most of these assertions mathematically impossible. Kurzweil needs to stop sniffing his own farts and selling fairy tales to people, and just stick to the singularity.

Computers aren't magic. AI aren't gods. These things are still beholden to the laws of physics.

−1

Down_The_Rabbithole t1_iuvbb2o wrote

AI will discriminate based on its training data. Since its training data comes from human history, it will discriminate exactly the way history has discriminated.

AI is going to favor old white men for job positions. AI law is going to favor young white women. AI policing is going to favor young black men.

At least in the US.

We don't have enough modern, unbiased data to train an AI that won't repeat this pattern.

1

Down_The_Rabbithole t1_iudaaxk wrote

That's nothing compared to the 1980s and 1990s though. The collapse of the second world power, and of an entire ideology, had insane effects, and the 1990s brought the rise of the internet.

The World Trade Center attacks and the Middle Eastern wars are side notes in history, as they didn't really have as big an impact. It's just that so few important events happened in those decades that they look relatively big, but in the grand scheme of things they were minor, almost irrelevant events.

Smartphones are just a continuation of computer and internet technology; not a real innovation so much as a marketing term. The same goes for social media and streaming, which were all just continuations of the 1990s World Wide Web and the computerization of society.

The 2000s and 2010s were relatively quiet. I'm sure they will be termed "the silent decades" or something in the future due to how little of note happened in them. Maybe they will be tainted by the (now naive) assumption that globalism and trade would make the world more peaceful on its own over time, now that the world is again splitting into two separate economies: the liberal democracies in the West and the authoritarian systems in the East.

0

Down_The_Rabbithole t1_iud2x4i wrote

Honestly the 2020s have already had way more events than a normal decade: a global pandemic, the first large European war since WW2, and an AI content-generation revolution.

In the history books of the future, the 2000s and 2010s will be 2-3 pages long while the 2020s will have 10-20 pages dedicated to them. "Pages" in quotes, since I doubt there will be actual physical books, of course.

3

Down_The_Rabbithole t1_iu0pczy wrote

Not how it works. There is an explosion in job openings right now because of the transition to a digital economy. Digital goods are effectively unlimited, so fewer consumers doesn't mean a programmer or artist gets to produce less. The worker needs to produce the same amount; the end product just gets distributed digitally to fewer people at the end, which doesn't have as much impact.

−1

Down_The_Rabbithole t1_iu07pw6 wrote

There are more job openings in the world than workers to fill those roles, so no, that's not the case. Even if a job provided everything somebody could want, there would still simply be too few people to fill those roles.

The true solution is to automate those jobs away, but we don't have the technology yet, and children take ~18 years before they can enter the workforce.

So we're going to experience a labor shortage crisis one way or the other. There's no real solution going forward, just a warning that things are about to get worse.

Unless you're working class/lower middle class. Then things are going to be great, as employers have to compete for your labor, which will result in higher compensation and better working conditions.

−1

Down_The_Rabbithole t1_itpcjv9 wrote

I agree. Save up and expect to be out of a job during a transitional period.

I'm a programmer with a CS degree. I expect all facets of CS to be automated away over the next 5-10 years. It's not a safe career at all.

I also don't think investing in AI companies is a sure bet; not because I don't think AI will dominate, but because the primary beneficiaries of AI technologies will not actually be the AI companies. It will be the entities able to generate the most value once AI technology is in their hands. That isn't the software companies that provide the AI technology; it's the value-generating sectors that are mostly bottlenecked by human labor constraints.

In fact, I can see AI businesses going out of business the more they succeed at building competent AI, because the technology would inherently get commodified over time, which is the worst position a company can find itself in.

Remember that the companies that built the first railroads almost all went bankrupt; it was the ticket sellers that profited the most. I suspect the same will be true for AI, since the dynamics are the same. The companies building the AI will almost all go bankrupt, because the capital investment to build the AI is fixed, while the rewards of AI inherently benefit not the builder of the AI but the user of the AI.

That said, here's what you should do:

  • 1: Lower your dependence on external producers as much as possible. Generate your own electricity, own your own place, maybe even grow your own food, and get a 3D printer to print your own replacement parts.

  • 2: Get as much savings as possible. Make sure it's a properly diversified portfolio: paper cash, digital savings at a bank, government bonds, corporate bonds, stocks in every sector, precious metals, maybe even some crypto.

  • 3: Ensure that your job has as many physical components as possible. Physical jobs are harder to automate, as they need a physical capital investment in most cases. I'm a programmer myself, so I'm probably one of the first to go, but with my degree and pedigree I could switch to computer engineering quite easily, which would involve hardware tinkering and make me harder to replace.

Drivers, miners, janitors, construction workers, etc. are the ones who will be automated away last. Digital intellectual workers who sit in front of a computer all day manipulating data in some way or another are going to be the first to get automated. This means all programmers, lawyers, digital artists, system admins, data entry clerks, office workers and everyone else using a keyboard and mouse to generate income are going to go the way of the dodo in the next 5-10 years.

3

Down_The_Rabbithole t1_itpbtuv wrote

What I meant by saying "physical navigation and manipulation of physical environments is the hardest problem to solve for" is that it won't be part of the first big round of automation, which is what we're now slowly starting to see happen, so investing in those areas is premature. It's kind of like investing in "palm-top" smartphones before the internet had sufficient penetration. You should invest in that area after this initial automation of digital/intellectual workers to achieve the highest financial return. Programmers, lawyers and pharmaceutical researchers will be replaced long before self-driving is solved.

I also disagree that there is a large barrier to entry for strong AI. There is a large initial investment, but once that investment has been made, in terms of R&D, the algorithms needed and the hardware configuration it runs on, it's trivial to build. Look at current transformer models and how quickly newcomers are able to make state-of-the-art (SOTA) products like Stable Diffusion. This shows that the AI makers won't be the winners here; they are the enablers. It's not like the steam engine builders were the big winners of the industrial revolution; it was the steam engine users that generated the most value. I expect the same to happen with AI because the barriers to entry are, indeed, not that high.

You make a great point about existing businesses having "sticky" practices with margins priced in, which is ripe for disruption. But in that case the best businesses to invest in don't exist yet, as they will be startups, which wasn't the question at hand; the question was which companies you should invest in now.

Due to the assumptions listed above, I think the worst financial mistake is to invest in the companies that are actually building the AI systems, as they are pouring R&D into something that, once completed, has an extremely low barrier to entry and would render their own core business obsolete. That's why I think it's unwise to invest in something like Alphabet.

I also think it's counterproductive to invest in companies like Tesla that have too long-term a vision and won't benefit from the short-term automation of intellectual and digital workers. They are investing in what is probably the 2nd or 3rd wave of automation, after intellectual workers are already gone: navigation and physical manipulation. I would hold off on investing in Tesla and use your capital to benefit from current capabilities, meaning large transformer-model AI that can disrupt intellectual work. The return horizon on self-driving cars and robot workers is just too long to be worth the investment in 2022.

2