czk_21

czk_21 t1_jeh55z9 wrote

Reply to comment by 4e_65_6f in Should AIs have rights? by yagami_raito23

how about

3- every sentient self-aware entity should have some basic rights

your point is only about the human perspective, but what about theirs? did we forget about slavery? looking down on someone arbitrarily is morally wrong

7

czk_21 t1_jefqfxe wrote

of course I have read a LOT of translated text, English is not my first language

and yes it will; maybe a better term would be obsolete. how would they be needed when AI can translate better, cheaper and much faster? it's the same with any other task in which humans will be outperformed

4

czk_21 t1_jef00f5 wrote

in a couple of years human translators will be pretty useless, in the sense that AI will do the job as well or better. but knowing more languages will still be a valuable skill in general, to better yourself or to speak naturally with native speakers; it always makes people glad when a foreigner speaks to them in their language

4

czk_21 t1_je0p48f wrote

> But for an AI to actually be CEO would require unending hundreds of years of law. I don't expect it to actually happen

AI can easily learn all laws in human existence now, that's a nonissue. reasoning could be problematic, but as we can see, GPT-4 scores better than 90% of people on the bar exam...

AI can also monitor markets in real time and do complicated market analysis in seconds or minutes; no human can compete

AI can make a company more efficient, and as a bonus you won't need to pay millions to a CEO; it's a win-win

5

czk_21 t1_je0nhbe wrote

true, even now GPT-4 could be a better teacher in subjects like psychology, history, economics, medicine, law or biology; it scores very high in these fields. for example, on the biology olympiad it reached the 99.5th percentile, on par with or better than the best humans

factuality needs to be improved, but humans make mistakes too, and GPT-4 is already on a similar level to experts

imagine when GPT-5 is better in those subjects than most university professors; what point will there be in attending lower-level education? even university would not be that valuable for the humanities...

2

czk_21 t1_jdzr8s1 wrote

> it always predicts the same output probabilities from the same input

it does not; you can adjust it with "temperature"

The temperature determines how greedy the generative model is.

If the temperature is low, the probability of sampling anything other than the token with the highest log probability will be small, and the model will output the most likely text: probably correct, but rather boring, with little variation.

If the temperature is high, the model can output, with fairly high probability, tokens other than the most probable ones. The generated text will be more diverse, but there is a higher chance of grammar mistakes and nonsense.
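The mechanism described above can be sketched in a few lines of Python: divide the logits by the temperature before the softmax, then sample from the resulting distribution. The function name and the example logits are hypothetical, purely for illustration.

```python
import math
import random

def sample_with_temperature(logits, temperature, rng=random):
    """Sample a token index from raw logits after temperature scaling.

    Low temperature sharpens the distribution (nearly greedy, repetitive);
    high temperature flattens it (more diverse, riskier output).
    """
    scaled = [l / temperature for l in logits]
    # softmax with max-subtraction for numerical stability
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # draw one index according to the scaled probabilities
    return rng.choices(range(len(probs)), weights=probs, k=1)[0]
```

With a very low temperature (say 0.01) the call almost always returns the argmax token; raising the temperature spreads the samples across lower-probability tokens, which is exactly the diversity-versus-correctness trade-off described above.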

1

czk_21 t1_jdzq2zj wrote

my point was that it is interesting how much one could produce in those 30 minutes with AI tools

the mod referred to it as low quality and said I should put more thought into new posts, yet here we can see that the vast majority of other people also consider it interesting - 92%. that's sort of proof they should not have deleted it in the first place

2

czk_21 t1_jdvjoeo wrote

oh really? when AI is able to do everything humans do, and much more efficiently, there is no reason for humans to work anymore. it's the same concept that exists now and has existed all along: those who are better at a job replace those who are worse

for human society not to collapse, there needs to be some form of UBI, so I would say it's basically guaranteed to happen; it's just an extension of the social benefits system we have now

1