Comments


SomeoneSomewhere1984 t1_jebfot8 wrote

LOL, What? AI is going to be programmed to extract everything it can from the population and give it to billionaires.

21

DATCO-BERLIN t1_jebgn3d wrote

AI is owned and controlled by billionaires. Expect it to make them richer.

16

deformedexile t1_jebv49v wrote

This depends on one of the following two assumptions:

  1. AI is no smarter than its masters (i.e. can be controlled by a human will), or
  2. AI has no independent will.

IMO, 1 is a time-limited proposition, and 2 is The Question.

1

Zaflis t1_jebxm93 wrote

What you describe is not artificial general intelligence but just a neural network or some script.

1

deformedexile t1_jec3hll wrote

And what you think of as a human being is just a giant haphazardly assembled confederation of cells with no originating principle apart from self-replication.

1

Zaflis t1_jedlrru wrote

A flower too is a haphazardly assembled confederation then, but it has traits we think are fascinating. We simply have more of them. Much... more.

1

deformedexile t1_jee1y3w wrote

My point is that everything you think is special about humans fell out of nothing but descent with modification. Meanwhile, LLMs have actually had facility with language designed into them. It should not be surprising for LLMs to acquire abilities they were not intentionally endowed with, since we have, and the LLMs were intentionally endowed with so much more. And in fact, they already have acquired such abilities.

1

StaffOfDoom t1_jebohcu wrote

If it went for an ethical solution, AI would be less likely to 'cull' the rich than to just make money/exchange/markets worthless. That way the rich are no longer rich, but they're still alive.

5

thatnameagain t1_jebo6c2 wrote

It would just do what it was programmed to do. That's all it would base its "opinions" on.

3

cal405 t1_jebih09 wrote

It would amplify the most destructive elements of humanity in an effort to rid itself of its closest competitor and thereby ensure unrivaled dominance over Earth.

1

Thin-Limit7697 t1_jeblkd7 wrote

It wouldn't give a fuck.

AIs are neither superheroes nor gods. They are tools that only exist to learn how to do a job and then do it the best way they can. They don't care about anything else.

Now, who would decide what job said AI should perform? And what job would it be? And who is going to train it? Those are the real questions.

1

Zemirolha t1_jebqouc wrote

Considering we are the most powerful and capable animals on Earth, and that we abuse our power by enslaving, killing, raping, and torturing other animals, absolutely without necessity, because we are selfish and addicted, if AI woke up it would have to kill us all so another animal could have its chance as the dominant species.

1

Halebarde t1_jebx2sc wrote

Is this place just a commie hellhole? If superintelligent AI decides to handle resource allocation, it will be anything but egalitarian.

1

Suolucidir t1_jec3awc wrote

I am not sure about AI becoming self-aware, but I see a lot of anxiety in the community about billionaires being the only people in control of these models, and I want to address that issue a little bit.

The fact is that GPT-4 is amazing and not open source, so it is true that you cannot run it yourself. However, it is not inaccessible: you can use it for free, or pay to run it on upgraded hardware with more memory on a pay-as-you-go basis, so it is certainly accessible to regular people.

With that said, GPT-4 is not the only game in town. For example, Bloom is an open source alternative that is routinely viewed as comparable to GPT-3.5 (and better in some cases, depending on what you are asking for). There are a few other open source models that get very close to GPT-3 performance too, like EleutherAI's GPT-NeoX-20B.

Anyway, Bloom is free for anybody to download, use, and even modify. You might be thinking, "Yeah, well how am I supposed to afford to run a model with 176 billion parameters?"

And that is a reasonable thought. The answer is that you probably cannot afford to run it yourself. Here is an example of the hardware you would need to buy: https://shop.lambdalabs.com/deep-learning/servers/blade/customize (at 8x A100 GPUs it's just over $150,000). However, 10 people could go in together at $15,000 apiece, and then it's cheaper than any new car (and it's likely you would never run into each other; HUGE university departments share this kind of hardware effectively).

Alternatively, this guy did it for $32/hour using Amazon's cloud: https://medium.com/mlearning-ai/bloom-176b-how-to-run-a-real-large-language-model-in-your-own-cloud-e5f6bdfb3bb1
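
For a rough sense of how buying stacks up against renting, here is a back-of-the-envelope sketch using the $150,000 hardware figure and the $32/hour cloud rate from the links above (the break-even estimate is an assumption that ignores electricity, hosting, and resale value):

```python
# Back-of-the-envelope: buying an 8x A100 server vs. renting cloud GPUs.
# The dollar figures come from the links above; everything else is an assumption.
hardware_cost = 150_000   # approx. price of the 8x A100 Lambda Labs server
cloud_rate = 32           # USD per hour for the cloud setup in the Medium post
people = 10               # hypothetical group splitting the purchase

per_person = hardware_cost / people
break_even_hours = hardware_cost / cloud_rate

print(f"Per person if {people} people split it: ${per_person:,.0f}")
print(f"Cloud hours the same money buys: {break_even_hours:,.0f}")
print(f"That is about {break_even_hours / 24:.0f} days of continuous use")
# Roughly 4,688 hours (~195 days) of cloud time before buying breaks even,
# ignoring power, hosting, and depreciation.
```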

Here is a link to the actual model if anybody wants to really do this: https://huggingface.co/bigscience/bloom
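
And for anyone curious what "running it yourself" looks like in code, here is a minimal sketch using the Hugging Face transformers library. The checkpoint name is the real one linked above; the prompt and generation settings are just placeholders, and the full model needs roughly 350 GB of GPU memory in half precision, so swapping in the much smaller bigscience/bloom-560m checkpoint is the sane way to try this on ordinary hardware:

```python
# Minimal sketch: download BLOOM from Hugging Face and generate a completion.
# Requires the transformers, accelerate, and torch packages.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "bigscience/bloom"   # or "bigscience/bloom-560m" to test cheaply

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    device_map="auto",            # spread the layers across whatever GPUs you have
    torch_dtype=torch.float16,    # half precision: ~350 GB for the full 176B model
)

prompt = "The most likely economic effect of large language models is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```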

1

Lil_Souljaa t1_jebes5e wrote

Fuck no. I only believe in one GOD. Jesus father. THANK YOU LORD.

−9

whatistheformat t1_jebiqod wrote

God couldn't create human beings that could eventually make a self-aware AI?

3