FoniksMunkee

FoniksMunkee t1_jefm1in wrote

Okay - but that won't work.

Stacy makes $100,000 a year. She takes out a mortgage of $700,000 and has monthly repayments of approximately $2,000.

She gets laid off and is now on a reduced income of $35,000 a year.

With roughly $24,000 a year going to the mortgage, she now has only $11,000 a year to pay all her bills, the kids' tuition, food and any other loans she has.

Now let's talk about Barry... he's in the same situation as Stacy, except he wanted to buy a house, and now his $35,000 isn't enough to qualify for a loan. He's pissed.

Like - I think we need a UBI or something - but how does this even work?

2

FoniksMunkee t1_jefi6fs wrote

There isn't going to be another AI winter. I am almost certain the US government has realised it is on the cusp of the first significant opportunity to fundamentally change the ratio of "work produced" per barrel of oil, i.e. we can spend the same amount of energy and get 10x to 100x the productivity.

There is no stopping this. That said, it doesn't mean you should stop listening to the warnings.

1

FoniksMunkee t1_jefgmzz wrote

UBI is most definitely a solution, at least in the short term, because you aren't going to jump straight from "everyone has a job!" to "no one has a job, but it's okay because AI gives us free stuff". That's not going to happen overnight.

There is going to be a reasonably long and painful stretch in between where people still have goddamn mortgages to pay and have sunk time and resources into assets that they own.

2

FoniksMunkee t1_jeff2h4 wrote

Short answer: no. It is still useful to learn another language. Language isn't just about direct translation. It actually rewires your brain - you think differently in another language. It gives you insight into the culture. It's also super annoying when you are stuck in a local government office and your damn phone / AR headset / whatever runs out of battery.

If you are just travelling to another country for a holiday, AI translation is probably your best bet. If you are going to move to another country, or get into a relationship with someone from another country, learn their language.

3

FoniksMunkee t1_jefbr87 wrote

No, they aren't. No one is slowing anything down right now DESPITE the concerns. In fact, the exact opposite is happening.

But that's not the most convincing argument - "On the off chance we save SOME lives, let's risk EVERYONE's lives!".

Look, this is a sliding scale - it could land anywhere from utopia to everyone's dead. My guess is that it will land somewhere closer to utopia, but not close enough that everyone gets to enjoy it.

The problem is you have NO IDEA where this will take us. None of us does. Not even the AI researchers. So I would be cautious about telling people that the fear of AI being dangerous is "irrational". It really fucking isn't. The fear is based, in part, on the ideas and concerns of the very researchers who are making these tools.

If you don't have at least a little bit of concern, then you are not paying attention.

1

FoniksMunkee t1_jef9g8l wrote

Actually no, it's a very rational fear. Because it's possible.

You know, perhaps this is the answer to the Fermi Paradox... the reason the universe seems so quiet, and the reason we haven't found alien life yet, is that any sufficiently advanced civilisation will eventually develop a machine intelligence. And that machine intelligence ends up destroying its creators and, for some reason, decides to make itself undetectable.

0

FoniksMunkee t1_jedm0mj wrote

Reply to comment by [deleted] in GPT characters in games by YearZero

There will be teams starting games today that might look at ways of using it. I know of some teams using it for art pipelines (but mostly for inspiration / mood boards). And some tools like Photoshop, Blender etc. will build it in, which will mean we see it appear in pipelines earlier.

But as far as widespread penetration goes, it will probably have to wait for teams to start new projects. So obviously a team starting today may choose to do something already. But widespread use? It's way too risky. It's hard enough to justify an upgrade from UE4.27 to UE5.1; I can't imagine up-ending an entire gameplay system just to integrate AI would be an easy sell.

And while I won't be the first to know when it starts happening... I will hear about it early on because of where I sit in the pipeline. All I hear right now is crickets.

1

FoniksMunkee t1_jediwl9 wrote

Reply to comment by [deleted] in GPT characters in games by YearZero

Yes, API costs are a good point I hadn't thought of.

The other issue is that games kinda move slowly in some respects. There are games that started before ChatGPT was commonly known that won't be finished until after ChatGPT 5 is out. And there is no way these tools will be integrated during that process.

They would also have to convince Sony, MS and Nintendo to let their SDKs into a model. I don't think MS will necessarily have a problem with that... but there's a ton of third-party libraries that would need to come on board, not to mention working out how you deal with existing legacy code. There are still more companies in the AAA industry on custom engines than there are using commercial engines like UE4/UE5.

Then comes the R&D, then comes the new game... so what, five years before we see widespread adoption in the AAA market?

2

FoniksMunkee t1_jedcqe9 wrote

Reply to comment by [deleted] in GPT characters in games by YearZero

I work in AAA games and work with a lot of clients. They are deathly silent on any kind of AI integration so far. So it will be interesting to see when this starts coming down the pipe.

3

FoniksMunkee t1_je8ba4w wrote

You may be missing the point of the statement (or perhaps people are using it wrong?) - but let me give you this example.

Midjourney doesn't understand what a hand is. It knows what one looks like, so it can draw one in most cases. But it has no understanding of its use in any real sense. That means it will quite happily draw a hand in a position that would break the bones and tendons of a human. That's not an issue when you're just doing a drawing, but there are plenty of cases where that lack of context can be a problem. And it may not just be a case of feeding it more data.

That is the kind of understanding that is entirely relevant and not stupid to point out. Yes, people get input data to learn, but they also have other senses, pain for instance. They also learn by trying things out, i.e. through experience.

A problem for AI in some tasks is that it lacks an understanding of the implications of its choices.

4