Submitted by Verificus t3_zqwc0j in singularity
sumane12 t1_j10lqtt wrote
Ooh this is going to be fun...
-
On your point about GPU prices, I think the opposite. Crypto massively inflated GPU prices, and since crypto is going through a bit of a winter right now, I think Nvidia and AMD have radically overproduced. That glut of supply should keep prices down for a few years. There are other reasons too.
-
Conversational NPCs. ChatGPT has shown us what's possible. Take an LLM, give it specific information about your game, and you have a conversational, interactive NPC.
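As a minimal sketch of that idea, assuming today's OpenAI Python client; the model name, NPC persona, and lore text are placeholders invented for illustration, not anything from this thread:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Game-specific lore the model must stay grounded in (invented example).
LORE = (
    "You are Maren, the blacksmith of Oakhollow. The mines closed after "
    "last winter's cave-in, and you distrust the new magistrate. "
    "Stay in character; never mention being an AI."
)

history = [{"role": "system", "content": LORE}]

def npc_say(player_line: str) -> str:
    """Send the player's line plus prior turns, return the NPC's reply."""
    history.append({"role": "user", "content": player_line})
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; any chat-capable model works
        messages=history,
    )
    reply = resp.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

print(npc_say("What happened to the mines?"))
```

The system prompt pins the model to your game's lore, and the running history keeps the conversation coherent across turns.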
-
Higher-fidelity graphics. As you mentioned, Unreal Engine is taking the graphical capabilities of gaming to a different level. Nanite is a game changer and will make real-life-quality rendering possible. One path there is building animated objects such as characters out of point clouds, then using the point-cloud data to create an animated mesh; that would allow infinitely complex character models through Nanite without an increase in hardware requirements. Currently, Nanite only works for static meshes.
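To be clear, this isn't Nanite's actual pipeline; but as a rough sketch of the offline scan-to-mesh step, here's how you might turn a point cloud into a triangle mesh with Open3D (file names and parameters are illustrative):

```python
import open3d as o3d

# Load a scanned character as a raw point cloud.
pcd = o3d.io.read_point_cloud("character_scan.ply")

# Poisson surface reconstruction needs oriented normals.
pcd.estimate_normals()

# Fit a watertight triangle mesh to the points; higher depth = more detail.
mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
    pcd, depth=9
)

o3d.io.write_triangle_mesh("character_mesh.obj", mesh)
```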
-
Brain-computer interfaces. I think Neuralink and other invasive-BCI companies are 10-20 years away from a consumer-ready product (although I hope I'm wrong). However, I see non-invasive BCIs coming right down in price and gaining the fidelity to no longer need a controller.
-
Point 4 will allow VR to be more widely used. Currently VR is a bit of a workout, which is great if that's what you're looking for, but ultimately, if you want to lose yourself in a good story, being able to forget you're in VR because controls come through thought will enable a much more immersive experience. Increased user adoption means increased investment: better haptics, better screens, etc.
-
More indie games. A 3D DALL-E will be released soon, so I think within 12 months we will have high-fidelity 3D meshes created by AI; hopefully a decent method of retopology will also be available, letting people create great assets in a fraction of the time. Developers will also use AI assistants to come up with side quests and backstory tied to the main story (see the sketch below), freeing up time for more important things like combat mechanics, etc. AI assistants will also write the majority of the code. This massive productivity boost will allow individuals to create highly complex games that would usually require a AAA studio.
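As a hedged sketch of the "AI assistant writes side quests" idea - the prompt, model name, and JSON schema are all invented for illustration:

```python
import json
from openai import OpenAI

client = OpenAI()

MAIN_PLOT = "The kingdom's rivers are drying up; a cult hoards the water."

prompt = (
    f"Main story: {MAIN_PLOT}\n"
    "Write three side quests that foreshadow the main story. Respond with "
    'a JSON object of the form {"quests": [{"title": ..., "giver": ..., '
    '"objective": ..., "reward": ...}]}.'
)

resp = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
    response_format={"type": "json_object"},  # ask for parseable JSON
)

quests = json.loads(resp.choices[0].message.content)["quests"]
for q in quests:
    print(q["title"], "-", q["objective"])
```

Structured output like this could be dropped straight into a quest database, which is where the real time savings would come from.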
-
Sex. Yeah, it's obviously going to happen. Someone will create underwear with haptic feedback; add higher-fidelity graphics in VR and highly engaging NPCs that closely mimic human appearance and interaction, and boom, you have a very believable dating sim.
-
Extrapolate this with improvements in each area of the technology and you have an experience that very closely mimics the holodeck from Star Trek, ultimately culminating in whole-brain emulation and full-dive VR. Then, after you have experienced everything worth experiencing, you go back in with a blank memory, set all the parameters to random, and turn on permadeath.....
..... Oh shit!!!!!
King_pineapple23 t1_j11yyxc wrote
I can't wait for realistic NPCs. It's all I've dreamed about since my first gaming experience.
SoulGuardian55 t1_j13dxer wrote
I can imagine how tabletop RPG players will react to this. There are ongoing attempts to create next-generation virtual tabletop simulators; combine those with this, and players and GMs will be able to create whatever worlds they want instead of just writing them down on paper.
monsieurpooh t1_j14o4lb wrote
You pretty much just described what AI Roguelite is trying to do.
SoulGuardian55 t1_j157fcf wrote
AI Roguelite is just the beginning.
Verificus OP t1_j10n2fz wrote
For point 1, I am referring to the fact that moving to 3 nm or 2 nm production processes is going to push wafer prices to ungodly levels. For future generations' performance jumps, that might mean performance per dollar at the 4090 level comes down significantly, but enthusiast-tier prices will go higher and higher, with predictions saying that in 5 years Nvidia's best GPU might cost $3k-4k.
great_waldini t1_j12h27y wrote
We have no reason to think GPUs will get substantially more expensive than they currently are. Prices were artificially inflated the last couple of years by the crypto frenzy - let's hope proof of work dies soon.
As for 2-3 nm silicon, that's extremely unlikely to ever happen. We can actually already get lithography down to those scales (not at mass-manufacturing scale, obviously), but the problem we run into there is quantum tunneling: electrons start to spontaneously jump the gap and effectively short-circuit, which makes the processor unreliable. Think ones flipping to zeros and zeros flipping to ones when they shouldn't be. Needless to say, that makes for big problems at the metal level.
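To make the "flipping ones to zeros" point concrete, here's a toy illustration; it's not a physics simulation of tunneling, just a demonstration that a single stray bit flip can wildly corrupt a value:

```python
import struct

def flip_bit(value: float, bit: int) -> float:
    """Flip one bit in the IEEE-754 representation of a 64-bit float."""
    (as_int,) = struct.unpack("<Q", struct.pack("<d", value))
    as_int ^= 1 << bit  # the stray "tunneled" bit
    (corrupted,) = struct.unpack("<d", struct.pack("<Q", as_int))
    return corrupted

# Flipping the top exponent bit turns 1.0 into infinity.
print(flip_bit(1.0, 62))  # -> inf
```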
AMD has demonstrated one alternative for keeping us true to Moore's law: expanding the breadth of parallel processing with more threads. There will be other architectural innovations as well, and surely more on the manufacturing side too.
Consumers will always pay a premium to be on the cutting edge of high performance, but if Moore's law has held true this long, I'm not worried about costs decoupling anytime soon from the patterns they've obeyed so far. It's certainly a "Lindy" type of situation.
If GPUs reach $3-5k, it'll be because of inflation, not for any reason fundamental to the technology itself.
Verificus OP t1_j136zto wrote
I think you're missing what I'm referring to. 2 nm and 3 nm are real upcoming GPU production processes, though obviously it's not literally 2 nm or 3 nm - that's just how it's marketed. That doesn't mean costs aren't going off the charts; they will.
great_waldini t1_j141pcw wrote
I see - maybe I'm not up on the latest marketing BS for GPUs, haha. At any rate, I'm curious what you think will drive skyrocketing prices. Am I missing a hidden variable pushing manufacturing costs up, or where do you picture that coming from?
Verificus OP t1_j142fs2 wrote
Basically this graph another user posted: https://i.postimg.cc/25ZV1Fjp/image.png
sumane12 t1_j10pzie wrote
Perhaps. I personally believe the opposite: if 2 or 3 nm is too expensive, we will go 3D. But I guess time will tell. And again, hopefully it won't matter too much if we can get higher-fidelity graphics out of the same cards.
Clarkeprops t1_j13b87d wrote
Trickle-down economics is a real thing when it comes to tech. What's a 10-year-old $3,000 video card worth now? Fifty bucks? Let the bleeding edge pay for the R&D. They'll pay top dollar, and I'll get it at a discount in 2-3 years. Tech has ALWAYS come down in price. Eventually, absolutely anyone can afford it.
Quealdlor t1_j14duil wrote
The GTX 1060 6GB was even better for gaming than the original $999 Titan from 3 years earlier.
Clarkeprops t1_j14l4v2 wrote
And that card is barely 8 years old.
A 10 year old card is $75 on eBay. I was close.
Quealdlor t1_j14rzh9 wrote
10 years ago top cards were going for $499-549.
Clarkeprops t1_j1c50o2 wrote
The GTX 690 was $1,000 in 2012, and in today's dollars that's $1,300.
It's $130 now on eBay. That's 10% of its original, inflation-adjusted cost.
My point stands.
Quealdlor t1_j14swnz wrote
And btw, the OG Titan became available on February 21st, 2013. I remember because I have a memory for this kind of information. That was 9.83 years ago - almost 10 years, not 8. You can buy the OG Titan used for $140 on eBay - over 7x cheaper.
Clarkeprops t1_j1c5phh wrote
The GTX 690 was $1,300 10 years ago (adjusted for inflation).
It's now $130. That's 10% of its original cost.
Quealdlor t1_j1cqqut wrote
I don't count dual-GPU cards, because they were problematic and have no modern counterparts. The RTX 4090 is the GTX 580's current counterpart.
Quealdlor t1_j1e3ans wrote
Look at this https://youtu.be/7gFxAlGjwms?t=984 to see how poorly games performed in 2012 on multi-GPU configurations. 16 teraflops theoretically, but in practice it could perform close to an Xbox One or PlayStation 4, which cost only $399 not long after that.
Quealdlor t1_j135sfq wrote
Here's the chart you are looking for: https://i.postimg.cc/25ZV1Fjp/image.png
Layer_4_Solutions t1_j125vhj wrote
Regarding prices: AI might be eating up all the new chips for years to come, jacking prices up.
sumane12 t1_j12ewoh wrote
Yeah, that's very true. Although I've heard something about letting AI companies like OpenAI use your GPU while you're not using it, effectively building up tokens that you can either sell back to them or use for your own AI requirements - kind of like selling solar energy back to the power company. If that's true, and companies can take advantage of the cloud this way, it should actually reduce costs even more. I don't know; there are too many variables at this point.
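Purely as a hypothetical sketch of that net-metering-for-compute idea - no such program is confirmed anywhere in this thread, and every name below is invented:

```python
from dataclasses import dataclass

@dataclass
class ComputeAccount:
    credits: float = 0.0  # tokens earned by lending idle GPU hours

    def lend_gpu(self, hours: float, rate_per_hour: float = 1.0) -> None:
        """Accrue credits while the GPU crunches someone else's jobs."""
        self.credits += hours * rate_per_hour

    def run_own_job(self, cost: float) -> bool:
        """Spend credits on your own AI workload, if you have enough."""
        if cost > self.credits:
            return False
        self.credits -= cost
        return True

acct = ComputeAccount()
acct.lend_gpu(hours=8)        # overnight idle time
print(acct.run_own_job(5.0))  # True: credits cover your own inference
print(acct.credits)           # 3.0 credits left to sell back
```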
Clarkeprops t1_j142uxv wrote
Organizations like SETI have been doing distributed volunteer computing (SETI@home) for decades, so it's a great idea that works.
mocha_sweetheart t1_j15fvdf wrote
Any companies already doing this?
zascar t1_j1300id wrote
Great post