2Punx2Furious

2Punx2Furious t1_jecz2b2 wrote

Reply to comment by WonderFactory in GPT characters in games by YearZero

I think you could get around the latency issue by having the generated dialogue arrive in the form of letters that you receive in-game, which would feel a lot more natural than a slow conversation. Or have some cutscenes between the prompt and the answers. As for the price, it should probably be an optional setting, and the cost could be offset by a subscription or ads. As much as I hate them, in this case it would be difficult to do otherwise, unless you plan to foot your users' bill forever.
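To make the letters idea concrete, here is a minimal sketch (with a hypothetical `generate_letter()` standing in for whatever text-generation API the game would actually call) of queuing dialogue requests in the background and delivering the results later as in-game mail, so the game loop never blocks on generation latency:

```python
# Sketch of the "letters" idea: dialogue is generated in the background
# and only delivered once it is ready, so gameplay never waits on the model.
# NOTE: generate_letter() is a hypothetical stand-in, not a real API.

import queue
import threading
import time


def generate_letter(npc_name: str, prompt: str) -> str:
    """Hypothetical slow LLM call; replace with a real API request."""
    time.sleep(2.0)  # simulate multi-second generation latency
    return f"Dear adventurer,\n...a reply from {npc_name} about: {prompt}"


class LetterBox:
    """Queues generation requests and delivers finished letters later."""

    def __init__(self) -> None:
        self._requests: queue.Queue[tuple[str, str]] = queue.Queue()
        self._delivered: queue.Queue[str] = queue.Queue()
        threading.Thread(target=self._worker, daemon=True).start()

    def request(self, npc_name: str, prompt: str) -> None:
        # Called when the player talks to an NPC; returns immediately.
        self._requests.put((npc_name, prompt))

    def collect(self) -> list[str]:
        # Called each frame, or when the player opens their mailbox.
        letters = []
        while not self._delivered.empty():
            letters.append(self._delivered.get())
        return letters

    def _worker(self) -> None:
        while True:
            npc, prompt = self._requests.get()
            self._delivered.put(generate_letter(npc, prompt))


if __name__ == "__main__":
    box = LetterBox()
    box.request("the blacksmith", "Can you reforge my father's sword?")
    # The game keeps running; the letter shows up a couple of seconds later.
    for _ in range(5):
        for letter in box.collect():
            print("New letter received:\n", letter)
        time.sleep(1.0)
```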

1

2Punx2Furious t1_j3nzumw wrote

> We're a long way from a robotic farmhand being able to perform those skills, certainly not for a price comparable to a farm laborer.

If we get AGI, we automatically get that as well, by definition. The tasks you listed are all hard problems today, yes, but an AGI would be able to handle them, no problem.

The real questions are: will AGI ever be achieved, and if so, when?

I think the answer to the first one is simple; the second, not so much.

The short answer to the first question is: most likely yes, unless we go extinct first. We know that general intelligence is possible, so I see no reason why it shouldn't be possible to replicate it artificially, and even improve on it. Several very wealthy companies are actively working on it, and the incentive to achieve it is huge.

As for the when, it's impossible to know until it happens, and even then, some people will argue about it for a while. I have my predictions, but there are lots of disagreeing opinions.

I don't know how anyone even remotely interested in the field could be sure it will never happen.

As for my prediction/opinion, I actually give it a decent chance of happening in the next 10-20 years, with the probability increasing every year until the 2040s. I would be very surprised if it doesn't happen by then, but of course, there is no way to tell.

0

2Punx2Furious t1_j34zjvb wrote

It does have some perception. Just because it doesn't have the full sensory capability that (most) humans have doesn't mean it has none. Its only input is text, but that is still a form of perception.

Also, for "sentience" only "self-perception" is really necessary, by definition, which yes, it looks like it doesn't have that. But I don't really care about sentience, "awareness" or "consciousness". I only care about intelligence and sapience, which it seems to have to some degree.

3

2Punx2Furious t1_j1qmwkg wrote

> Then there will still be a reason to watch human comics, and entertainers, we just won't be overwhelmed by the large scale division that this level of Ai could create.

Until the AI decides that we are no longer allowed to do that, because it goes against the values we gave it. That's one of the reasons alignment is so hard: even if you think there are no downsides at first, subtle issues can become harmful when they are taken to extremes.

3