Submitted by redhatGizmo t3_115a172 in technology
Comments
cambeiu t1_j90ie0t wrote
I think we will see the triple-A gaming industry eventually adjust system requirements/performance to conform with the new realities of the market. No point making games for the 4060 as the recommended GPU when most of the market cannot afford that. Also, AI-based upscaling might help alleviate things on the hardware requirement front (rough numbers at the end of this comment).
In many ways I see parallels between the current component price crisis and the 1970s fuel crisis. As a result of the oil shock back then, big muscle cars gave way to smaller, more fuel efficient vehicles. The focus moved from raw power to efficiency and cost. We saw the rise of the compact and sub-compact cars, which did not exist in the US back in the 50s and 60s.
I think the computer market will go through the same process now. iGPUs/APUs could become the baseline for gaming moving forward.
I for one refuse to pay $500+ for a discrete GPU and I don't think I am in the minority.
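A rough sketch of what upscaling buys, assuming typical ~2/3 and 1/2 per-axis render-scale factors (the usual "Quality"/"Performance" style modes; exact factors vary by vendor and mode, so treat the numbers as illustrative):

```python
# Back-of-the-envelope: pixels actually rendered before AI upscaling.
# The per-axis scale factors below are assumptions, not exact vendor specs.
def internal_pixels(out_w: int, out_h: int, scale: float) -> int:
    """Pixels rendered internally before upscaling to (out_w, out_h)."""
    return int(out_w * scale) * int(out_h * scale)

native = 3840 * 2160  # 4K output: 8,294,400 pixels per frame
for name, scale in [("quality (~2/3)", 2 / 3), ("performance (1/2)", 1 / 2)]:
    px = internal_pixels(3840, 2160, scale)
    print(f"{name}: {px:,} px rendered ({px / native:.0%} of native 4K)")
```

Rendering well under half the pixels and reconstructing the rest is exactly the kind of thing that could let mid-range cards punch above their weight.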
DRARCOX t1_j90ips1 wrote
I guess where I'm confused is... So what? My 2080 graphics card in my Lenovo desktop from 2019 runs everything I've ever thrown at it.
This sounds to me like a problem for people who buy a new smartphone every year, not for anyone with enough sense not to "need" a card or other hardware that came out less than 6 months ago.
Maybe I'm just missing something.
redbrick5 t1_j90mk2k wrote
I think it's the opposite. With manufacturers constantly pushing the edge higher and higher, the result is the mid market gets discounted lower and lower. You don't neeeeed to buy the top-end GPUs when building a new rig. My 2080 serves me just fine right now.
Iron_on_reddit t1_j90mqxx wrote
Exactly! This is more about the hype/marketing strangling the weak minded.
Also, PC gaming is much, much more than AAA games. In the last 10-15 years, there were only a few triple-A titles that caught my attention, while I fully enjoyed many indie games, and those require much more modest hardware.
PC gaming will be fine. The 34th installment of the same, regurgitated, boring concept covered in beautiful, but otherwise meaningless graphics? Maybe not so much, but they won't be missed anyway.
Blacksbren t1_j90nl48 wrote
I do agree the lower cards are more than fine. A 1060 will play almost every game out there. Some of the market is for people who want to have the best; some is for work. I have the 4090 not because I need it to play a game, but because I want it and can run everything on ultra. The game looks nicer. But that being said, even if it did not look nice I would still play said game, because I enjoy the game.
Gargenville t1_j90p38s wrote
Maybe this is an age thing, but as an old timer I do think it's kind of wild we have multiple people in this thread talking about their crappy old 2080. Like, my brother in Christ, that's a $700 GPU; it's insane for that to be the 'barely hanging in there' card.
hurfery t1_j90ruwf wrote
Yes, lol. The window has been moved. By a lot. The mining craze, pandemic, supply crisis, scalping, Nvidia taking maximum advantage of it all... all this has normalized paying more than $500 for a GPU. People now go up to $2,000+.
redbrick5 t1_j90t49q wrote
I think for decades we were accustomed to buying the latest, maxed-out components when buying/building a new computer, primarily because there was a huge improvement over the tech from 2 years prior. That's no longer the situation, and our lust for the best can't be rationalized against the cost.
Overall-Business-624 t1_j90teo7 wrote
I think they will just push everyone to game streaming from the cloud. Maybe when Valve gets bored with the Steam Deck.
myne t1_j90tg16 wrote
I'm fairly sure they look at the steam stats when they make a game and make sure it'll run ok on at least the top 70% of the in-use market.
Sure, you have to turn stuff off/down, but it still should play ok.
It's a volume business. They need sales. They need to target the audience that exists.
I'll note the 1060 is finally the #2 card on Steam.
A 6 year old mid range card is still way up there in the market. It'll be hard for game designers to not target it.
Overall-Business-624 t1_j90tnoo wrote
It is a problem when games like Hogwarts Legacy barely hold 1080p 60 fps on an RTX 4090: https://www.youtube.com/watch?v=zofJ5yFvajA
cambeiu t1_j90ug38 wrote
Eventually, but we are still many years away from that. Maybe once 5G becomes ubiquitous globally and datacenters become more geographically distributed, this will be the case. But until then, most games will need to run locally.
uparm t1_j90wdfj wrote
LOL, my 2016 graphics card is still 2/3 the price it was in 2016, accounting for inflation, according to Newegg. What a fucking failure of the "free market"
Spanks79 t1_j90wtu7 wrote
Time to actually start writing more effective code and efficient hardware use.
stu54 t1_j90zc6u wrote
A 1060 SHOULD play almost every game out there. The games that don't are mostly open-world RPGs like we had in 2014, just with the low graphics settings options omitted.
The power consumption is what offends me most. Like, by now you should be able to play games on 200 watts, but nope, 500+ watts or GTFO.
stu54 t1_j90zo5u wrote
The 2080 is a 215 watt gpu. Games should run on 60 watt gpus like they used to.
Call-Me-Robby t1_j90zq1j wrote
Your comment is supposed to be ironic, but yes. Nvidia and AMD will soon stop fabricating expensive GPUs if they don't bring in enough money because no one can buy them. They're companies, not cartoon villains who make expensive products just to fuck with the poors.
stu54 t1_j910sts wrote
The tech is still hugely better over 2 generations, but the power consumption keeps rising. Somehow a 120 watt last gen GPU became entry level.
stu54 t1_j911jcu wrote
Games sell hardware. An Nvidia partnership game is made to make the most common hardware obsolete, so new hardware can be sold.
The games industry has split into three: AAA games, free-to-play microtransaction games, and indie games. The industry wants to kill indie games because they don't generate any shareholder returns. They will obliterate Steam with frivolous lawsuits and giveaways, and then the creativity that big corporations can't compete with will go away.
Spot-CSG t1_j913u09 wrote
The ray tracing is broken, that's why. I'm playing at 120fps in 1440p Ultra settings with RT off. 5800X/3070 Ti.
Spot-CSG t1_j914fls wrote
A 1060 should struggle with new titles. The card is almost 7 years old... sure, it can run things at 1080p low settings at 30fps, but that's not as playable as it used to be for me.
slowslownotbad t1_j917woz wrote
If Apple can’t turn this plus VR into a successful foray into gaming, then they really don’t want it.
Nevermorre t1_j919k61 wrote
Last week I finally chose to upgrade my AMD RX 580 to an RX 6750 XT. I often consider my computer to be somewhat on the good side of mid-tier; that's hard to judge, but it's the baseline I go for.
When looking into a new GPU, my main goal was to find a card that would probably run better than my current CPU and RAM allow, so the GPU would be bottlenecked by my other hardware. However, I didn't feel the need to get top-of-the-line, nor could I afford it, so I set a budget of around $700.
I decided not to use Amazon as my seller and went to Newegg. I narrowed it down to the 6750 XT 12GB for $470 - with a $70-off coupon code at checkout to make it $400 - or the 6800 16GB for $600.
I was really stuck between the two because, after a lot of comparing and benchmark videos, the 6800 was of course the better performer, but not by a large margin. It was really the debate between the 12 and 16 gigs of VRAM that I questioned. I did glance through Nvidia cards and saw even the high-end GPUs were 12 gigs, so I was confused, because 16GB should be a no-brainer. I posted on r/PcBuild for advice - I appreciate the helpful responses - and was starting to lean towards the 6750 when I found a JayzTwoCents video, specifically about GPU VRAM. It was very insightful, and I'm glad I went with the cheaper option.
I'm proud to say I've built this PC and did my best to choose the best parts for my budget, but damn are specs complicated and not overly intuitive for me. Choosing an affordable GPU for what you really need without overspending is probably the most challenging part, with 5 slightly different models for the same gen card and... ugh, it still makes my head hurt. Luckily my next upgrade will just be to get a couple more RAM sticks for 32 gigs. When I ran an in-game stress test of Hogwarts Legacy by flying around the map and through the Forbidden Forest, the new GPU handled it fairly well: VRAM was sitting between 7-10 gigs while my actual RAM was consistently maxing around 15 of my 16 gigs. I'm not even sure what that fully tells me, but I know games are becoming more demanding anyway, and RAM is for the most part cheap, so I am fine upgrading to 32 even if I don't really need it.
stu54 t1_j91aok4 wrote
I guess we have to wait and see what a 120 watt GPU can do this generation.
Nevermorre t1_j91b7ut wrote
That's why I upgraded to my current GPU. However, to be fair, I do think my RX 580 was starting to show its age, and I knew it was not going to handle a lot of what would be coming out fairly soon. I could still play on mid settings mostly fine, but this game specifically had some graphics bugs - mostly WILD water textures getting stretched all over the place, making vision impossible. Also the area around where I customize my wand - I'm guessing it was the "hitbox", even though it was not an active item. Anyway, after poking around online, I found others with the same issues, and we all had the same card (I think I saw a 570, but close enough).
I'm not nearly as much of a gamer as I used to be; hell, I dropped gaming for almost two years and only really got back in when Spider-Man and Days Gone dropped on Steam. Not long after I finished Days Gone, Hogwarts Legacy was just a couple months on the horizon. Not sure what I'll pick up next, but I like to keep my system ready all the same. Also, I planned for upgrading my main components when I built my PC, one piece at a time every few years. I'm not entirely sure where my Ryzen 5 3600 sits currently; from what I understand it's still a humble but competitive piece that I hope to get a few more years out of.
firedrakes t1_j91e8np wrote
Lol, that ship sailed back in the 360 era.
Current consumer hardware can't push physics, HD assets, and native resolution. It's all upscaling tech, be it internal game-engine stuff, FSR, or DLSS.
doneandtired2014 t1_j91frgl wrote
The gist of the article is that NVIDIA and AMD are focusing on the halo tier at the exclusion of all else and pricing their cards in an effort to maintain cryptoboom margins in the face of crypto collapsing like a neutron star (for the third time).
If you need a sub-$900 card today, your options are: 1) pay $100-$200 over MSRP for RTX 30 stock, 2) try to snag RDNA2 products before the remaining stock pool evaporates, or 3) hope most of your games use DX12 or Vulkan on ARC.
Adding to that, performance gains have basically flatlined at the sub $400 price point. $380 in 2016 got you a GTX 1070. $300 in 2023 gets you...OCed GTX 1070 performance from AMD and NVIDIA. Intel shines here when a game gets along well with their drivers. When a game doesn't, you get performance on par with a GTX 1050 Ti.
$200 gets you cards that aren't even as performant as the $200 options from 7 years ago.
OnionBagMan t1_j91i4pn wrote
I’m running a 1080ti and can play whatever I want.
This article is a bit off, imo. Any computer can run almost any game these days. Sure, you can go overboard with your spending, but it's completely unnecessary. An i5-2500K and a 970 will probably play 99% of future 2023 releases just fine.
doktaphill t1_j91ljsu wrote
Is every article a vitriolic argument now? Are we blaming people for not spending their money correctly?
sudoku7 t1_j91mvuo wrote
Ya, I think we're likely to see AAA PC gaming give more ground to consoles in that light. The consoles themselves are cheaper and easier to optimize for.
beef-o-lipso t1_j91n3qb wrote
Agreed. I finally upgraded because a few games I wanted to play wouldn't run on my GTX 970. But I note sims like X-Plane 12 and, I think, MSFS have the 970 as recommended. So it varies.
I have yet to see a game requiring an RTX 30-series or newer.
Due-Resident-4588 t1_j91n8yy wrote
I got into the PC hardware scene in late 2016. Back then (even though it was only 7 years ago, lol) things were a lot more affordable. The 10 series was just coming out and was fairly priced for the big performance leap it was. Technology has advanced, and I understand that things naturally get more expensive to research and develop, but the top-end GPU now costs $1,600, while the best of that time, the 1080 Ti, cost $700. Things have gotten way out of whack. Several of my older PC-building buddies have put a new rig on hold and are using a PS5 or Xbox, because they're affordable for good performance and a comparable PC is something like $1k USD. What sucks about all of this is that things will only get worse... not better.
medievalmachine t1_j91nhfi wrote
Lots of companies put themselves out of business for short term profit. But that’s not the case here.
Casual and entry level PC gamers have moved to laptops and used gear from crypto and one of those is temporary. Hell, Intel is entering the market from the bottom!
But also the top end of the market expanded as people found a new way to show off. Just like cars and TVs and houses etc. There’s no issue here. Different, not worse.
[deleted] t1_j91o4b1 wrote
Using ultra-enthusiast hardware as an excuse not to make anything run well is strangling PC gaming.
That said, I haven't bought a PC game from those studios in a decade, so carry on. I'll be over here playing fifty roguelites and loving it :P
AnimZero t1_j91oa9d wrote
Can't wait for late 2024/early 2025 when the next gen of cards comes out and is somehow even more expensive. Oh, maybe they'll use even more wattage and take up even more space in my case, too! Can't wait for that.
Agreeable-Meat1 t1_j91rbhk wrote
Meanwhile I bought an MSI laptop in 2018 on Black Friday and other than the case falling apart, there have been 0 issues.
jaakers87 t1_j91snvz wrote
That's not a hardware problem. It doesn't matter what GPU you have with that game; it still stutters. So why pay $1500 when you'll have the same experience as with a $500 card?
Florida_man2022 t1_j91vrro wrote
"Runs everything" on what settings? That's a caveat.
Opposite_of_a_Cynic t1_j91xjsf wrote
KSP 2 has a 2060 as minimum and a 3080 as recommended.
RecipeNo101 t1_j91xnyn wrote
They're already struggling to move hardware. CPU and GPU sales have plummeted. Steam seems far too entrenched to be ousted when people are tired of multiple launchers and all the others are dogshit. I collect every free Epic game, I have hundreds, and I've never given them a cent.
NuTeacher t1_j91y5o5 wrote
I've also noticed that the window for acceptable in-game specs has shifted too. It used to be that the target was 1080p and 60 fps, which even the lowest-end card can do now. But now people want to be able to play at 1440p 120+ fps or even at 4K 60 fps. The 1060 can generally get 50-60 fps at 1080p. We've seen a huge leap in power these last three generations, and card prices reflect that.
Inconceivable-2020 t1_j91z6px wrote
Game Developers publishing games that only play correctly on obscenely expensive hardware is strangling PC gaming.
Dangerous_Employee47 t1_j91zb0o wrote
Twas always thus. It was why I only played consoles until Civ V came out and I had to have a PC to play it.
stu54 t1_j91ze3s wrote
Epic Games freebies will win in the end. Distributing digital content isn't very expensive, and it undercuts Steam. New indie developers cannot compete with free games. Your time and hard drive space belong to Epic.
kenriko t1_j9214pl wrote
I remember when people bitched and moaned about Steam when HL2 was released. It was evil DRM back then. Ha!
yhtomitn64 t1_j921d9m wrote
I maxed out at paying 200 bucks for a used 1070! I did also spend 3k on a fatbike and new frame/fork for my other mtb so I guess shifting priorities. Zwift looks sweet on 1070 though!
kslusherplantman t1_j923l8p wrote
And yet the compact VW Beetle did exist… are you saying the US didn’t make compact cars? They def were being made at the time and were being sold in the US
RecipeNo101 t1_j923nfd wrote
Yup, I remember being so annoyed with Counter-Strike 1.6 on Steam. Waiting to download HL2. The platform was a laggy mess. To their credit, they've come a long way since then, and it seems clear enough to me that other launchers don't have the desire or ability to match even a fraction of Steam's features.
eosh t1_j924ago wrote
Yeap, I have zero issues with ray tracing off. All other settings on high with my 1080.
Head-Ad4770 t1_j9256o4 wrote
What the hell??? Why??? I understand the original KSP is now over 10 years old and not being updated anymore, but why such demanding system requirements for the sequel?
Head-Ad4770 t1_j925kxx wrote
Is it just a crappy excuse to force us to upgrade our hardware, considering A LOT has changed over 10+ years? Corporate greed? Both at the same time?
charlsey2309 t1_j925q2a wrote
Eh, I have Epic on my PC to collect the free games... but I buy games on Steam.
amoralhedgehog t1_j926zam wrote
I'm not at all confident in that prediction. Steam's current projections suggest Epic's impact on sales units stabilised last year, with sales expected to recover and surpass pre-Epic figures over the next 5 years. The platform has significant stickiness for millennial gamers, who now have dominant consumer power alongside decade-old Steam libraries.
Regarding indie games, the number of indie titles released roughly doubles every 5 years, meanwhile AAA titles have declined over the long term with the exception of a post-covid release backlog. As for "indie devs can't compete with free games"... at least 20% of the most popular indie games on Steam are freemiums...
Suffice to say, there are many features of the market that point toward the resilience of Steam against loss-leading AAA competitors.
mvw2 t1_j92dp8e wrote
It's the same model for cars and houses. It's a model that works great when the market volume is limited. You have limited sales, and you maximize profits to that limited number by focusing almost exclusively at the high end. It's a model that's good for business but forces customers to buy nothing, buy old, or buy into only the high end at exorbitant prices. ALL the in-between stuff is gone.
For video card makers, there's a unique hurdle. They are pushing hard into edge-case tech, pushing as far as they can with the physics of the materials. Outside of architectural redesigns and good software optimization, this is really just a game of producing at the edge of science without massive scrap losses. Worse yet, all the costs are tied into the process of it all, so cards from low end to high end barely differ in cost. It hardly costs them anything less to produce a bottom-tier video card, but they'll make a lot less from it. Plus they have to be competitive with other brands and older generations of themselves. The push for performance and the push for higher cost go hand in hand, making sure the new stuff remains just barely the better value.
For raw cost, crypto miners and scalpers have played a big part of scarcity, unfortunately. And they are driving real costs well above MSRP. This in turn makes older cards more valuable and pushes up the entire market space as a whole. Everything is expensive just for the sake of being expensive. A few people simply exist to make a pile of money from it. And for these dumb reasons a 10 year old card can be sold for as much as it was bought new a decade ago.
Now our saving grace as PC gamers is that games have long become very flexible in hardware needs. The transition to Steam and the high analytics it brought to manufacturers showed quite starkly how ancient people's hardware had become. Game developers are forced to temper their designs to work on much, much older hardware. This has provided a customer base with a LOT of breathing room for holding onto very old tech or only performing very minor upgrades to retain the ability to play brand new games.
This used to not be the case at all. In the early days games forced hardware development, to the point where every new game was nearly unplayable on anything but the absolute newest hardware. You HAD to upgrade your entire computer every 2 to 4 years just to play anything new. You couldn't simply set graphics to low and magically make an older PC play it. The tech and software advanced way too fast. You could literally spend $3k on a brand new PC, and there was a chance that it could not play the game that was coming out 6 months later. The technology race was that aggressive.
This ground to a halt in the late 90s and early 2000s when this model was no longer viable. It shifted the burden of flexibility on game developers and forced them to slow down and accommodate. Today, you can play current year titles with a 15 year old PC. Not long ago that was unfathomable.
However, what this has also created is this weird vacuum in the hardware world. There actually isn't much driving next generations. You do have commercial users and professionals wanting faster task speeds for simulations, rendering, etc. But outside of that, PC gamers haven't really had drivers beyond high-res monitors. There isn't much actually pushing graphics cards these days. Many developers are also bound by the heavy console market, which is all stuck in one time period in history, and all these new titles have to work on those too. But for PC folks, the push is pretty much solely monitor resolution, stepping from 1080p to 4K and now into 8K. The need for that isn't even worthwhile. It's just there. And it's about the only thing pushing video cards forward. Sure, there's ray tracing too, but that's a singular use case. You still have developers catering to decade-old tech, so the game itself can never really push the boundaries like it used to. And things like 4K, 8K, multi-monitor, etc. are just throughput equations. So the game in today's graphics card world is just that: throughput of pixel volume.
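To put rough numbers on "throughput of pixel volume" - raw pixel rates only, ignoring per-pixel shading cost, so purely illustrative:

```python
# Raw pixel throughput (resolution x refresh rate), relative to 1080p60.
targets = {
    "1080p60":  (1920, 1080, 60),
    "1440p144": (2560, 1440, 144),
    "4K120":    (3840, 2160, 120),
    "8K60":     (7680, 4320, 60),
}
base = 1920 * 1080 * 60  # ~124 million pixels per second
for name, (w, h, hz) in targets.items():
    px_per_sec = w * h * hz
    print(f"{name}: {px_per_sec / 1e6:,.0f} Mpx/s ({px_per_sec / base:.1f}x 1080p60)")
```

8K60 is a 16x jump in raw pixels over 1080p60 before a single shader gets any fancier, which is why resolution alone can keep selling cards.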
thenrepent t1_j92e36w wrote
> For raw cost, crypto miners and scalpers have played a big part of scarcity, unfortunately. And they are driving real costs well above MSRP. This in turn makes older cards more valuable and pushes up the entire market space as a whole. Everything is expensive just for the sake of being expensive. A few people simply exist to make a pile of money from it. And for these dumb reasons a 10 year old card can be sold for as much as it was bought new a decade ago.
The demand for GPUs for crypto mining should be fairly low nowadays - it was primarily Ethereum being mined with GPUs, and Ethereum switched to proof of stake (no mining). So that demand has fallen away entirely.
hurfery t1_j92e4q8 wrote
4k 120 fps is where it's at. 😎
Well yes, but tech progress isn't really progress if you pay 3x more for 3x the performance. New tech is supposed to give more performance per dollar compared to the old one.
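To spell that out with made-up numbers (the prices and performance figures below are hypothetical, purely to illustrate the perf-per-dollar point):

```python
# Hypothetical generational comparison: 3x the price for 3x the performance
# leaves performance-per-dollar exactly flat, i.e. no real progress.
cards = {
    "old gen": {"price_usd": 500, "relative_perf": 1.0},
    "new gen": {"price_usd": 1500, "relative_perf": 3.0},
}
for name, c in cards.items():
    perf_per_dollar = c["relative_perf"] / c["price_usd"]
    print(f"{name}: {perf_per_dollar * 1000:.1f} perf per $1000")
```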
Wadae28 t1_j92fate wrote
I’d say the lack of games is strangling all of gaming. I own a PC and a PS4. I can’t think of any titles that are generation defining technology wise. Don’t get me wrong, games like GOW Ragnarok are amazing but I was more impressed with the story telling and performances than the visuals.
Dark_Destroyer t1_j92hcnf wrote
It's not ultra-enthusiasts that are strangling PC gaming. It is that NVIDIA and AMD expected miners to continue to buy high end cards to mine crypto.
Now that mining is a thing of the past, both companies are sitting on a pile of chips while at the same time, used GPUs from miners are flooding the market.
They should have seen the writing on the wall when the warning that GPU crypto mining would end was given a year before it actually did.
What will most likely happen in the next year or two is that cards will come down in price, but most likely not to where they were before the pandemic.
DST2287 t1_j92iyn8 wrote
Hell, most games don't play correctly - so many shitty PC ports lately.
pain_in_the_dupa t1_j92kr7i wrote
Oh hell. I hope you’re wrong, because I don’t want to be playing the game equivalent of a K-car with an 85mph mandatory max speedometer.
SixthLegionVI t1_j92ybs4 wrote
Is it just me or were there more options for cards 15 to 20 years ago? I remember playing 3D games in the late 90s/early 2000s with inadequate hardware, and the games weren't perfect but definitely playable.
MikaLovesYuu t1_j931czu wrote
I used to like buying ultra-enthusiast hardware because it meant you could set your games to max, but nowadays developers don't bother optimizing for the majority and develop for the next generation instead, so you need high-end hardware to get good performance. The problem isn't hardware; it is lazy developers. There are plenty of good examples, but too many bad ones.
GarbageTheClown t1_j932zlp wrote
Yeah, because we all know that the KSP developers also design and sell high end computer hardware.
GarbageTheClown t1_j933dik wrote
It doesn't work, there are too many requirements for it to work better than just downloading the games.
da90 t1_j933pnx wrote
Example: See what’s currently happening with Kerbal Space Program 2
Strange_guy_9546 t1_j933zmz wrote
my brother in Christ indie games exist
Toytles t1_j93bm26 wrote
I cri everytime
sobanz t1_j93d1n9 wrote
just do what I do and buy high end hardware to play pixelart indie games and 10 year+ old emulated games
WillDeletOneDay t1_j93kk4j wrote
Nvidia and AMD have a duopoly. If Intel fails to break into the market, your options will be to pay the prices they charge, go used, or not buy a GPU at all.
Rivale t1_j93kygy wrote
The developers who can actually write performant code probably aren't working for gaming companies. Games have to sell for a certain amount, and companies aren't willing to bite into their margin even further by hiring very skilled developers.
bigtallsob t1_j93lme1 wrote
That's been a thing for a long time. Crysis came out 16 years ago.
AadamAtomic t1_j93m6sx wrote
Honestly, games have been held back by graphics for a long time, but memory has become much more efficient.
It's likely that we won't need graphics cards stronger than the 4000 series for a very long time, as memory and RAM are now scaling with graphics performance along with AI.
GPUs are likely to start getting smaller instead of more powerful.
TheConboy22 t1_j93vtcy wrote
Denuvo is a big problem as well. User resources shouldn’t be used for that type of garbage.
deltib t1_j93zl5j wrote
Sadly creating game engines exclusively for one type of game is no longer an option for most developers, and an engine created to cast the widest net possible can't really be ideally optimized for specific cases.
Malf1532 t1_j940z38 wrote
Yes. I completely agree.
But I think the real problem is YouTube hardware reviewers, especially Linus Tech Tips, probably the biggest tech channel currently. I say this because they are constantly focused on top-tier products instead of mid-tier, which is where the bulk of PC gamers live.
I know their mission is to generate views to make money but hyping primarily top tier products will keep causing manufacturers to focus production/supply on those product lines.
They also have a pulpit to say 'make more mid-tier cards and leave the elite to the elite' but don't use it. I got a kick out of one of their WAN Shows where they described themselves as not being influencers - that they just review products purely on their merits. How are you non-biased when you choose which products to review?
Another thing that annoys the fuck out of me about their channel is that they went out of their way to get the ARC team to come to them, recognizing the importance of a third party in the GPU market to keep AMD and NVIDIA honest. So they know the importance of a third party for competition, even going so far as to use some Intel cards in a month-long challenge. But they were picky about the feedback regarding things like VR compatibility and streaming issues, which affect how many gamers? The DX9 stuff was addressed and is still being worked on, because it's just drivers. The overlay nonsense... it's software in its infancy, Luke. Grow the fuck up. Then, as a dick smack in the face, they said they couldn't get their NVIDIA cards back in fast enough. Must be nice to be able to give feedback from an ivory tower of gear.
Bottom line, I feel like YouTube channels like LTT are influencers and choose not to actively advocate for mid-range gamers.
B_U_A_Billie_Ryder t1_j94612b wrote
I really just peeked at the advertising for the first time today.
First thing I thought was gee, remember when KSP taxed the hell out of computers just because of all the parts?
Now more parts AND particles? It's gonna look great at 5 fps for sure.
Original-Cow-2984 t1_j948vum wrote
Kind of limits the market, I would think.
[deleted] t1_j949p51 wrote
[removed]
Notorious_Junk t1_j94a23h wrote
Did you even read the article?
doktaphill t1_j94enaw wrote
You just proved me right
venk t1_j94opbn wrote
To me it’s kind of what happened to low cost digital cameras once good cameras started appearing on phones. The only cameras that people buy these days are high end / prosumer/professional cameras.
Consoles are pretty much mini PCs these days and you get a much better gaming experience with a $500 console than a $500 PC (or even putting in a $500 GPU in a PC a lot of times).
The true beauty of PC gaming, where it just crushes consoles, can only really be seen with higher-end gaming PCs and high-end monitors.
SteakandTrach t1_j94ww76 wrote
Me: bought a 3080ti (not during the dark times! I got it below MSRP, just so we’re clear) but i’m going to fire up vampire survivors for the umpteenth run. I played myself, lol.
Unhappy-Stranger-336 t1_j94xc9u wrote
Or rather, the game engine budget went into the artistic side instead.
SteakandTrach t1_j94xshs wrote
To be honest though, my 980 Ti in my kid's hand-me-down PC can still play almost everything just fine. I plugged it into a 4K TV and, with a mix of high and mid settings, played through Shadow of the Tomb Raider with pretty darn good image quality, running 50-60 fps. I was kind of surprised how well that old card held up. I have a 3080 Ti in my main rig, and sure, it does better - 4K, all ultra settings, still high fps - but it's not immensely better, you know? The overall experience is roughly the same. I think I'll be getting away from feeling I need to be on the latest and greatest hardware to have a good experience going forward.
Ok_Marionberry_9932 t1_j95auwc wrote
What a stupid article. Just play with what you have; there are tons of games out there that don't require the latest hardware.
Roger_005 t1_j95d1k9 wrote
Really? Smaller instead of more powerful? Did you see the 4090?
AadamAtomic t1_j95dynr wrote
Now imagine it being small enough to put 4 of them on a single PCIe board.
SLI and dual GPUs are already being phased out.
gamaknightgaming t1_j95mqj8 wrote
There are other things to consider. For example, I got Cities: Skylines free on Epic, but the bulk of the mods for it are on the Steam Workshop. Sure, it's possible to get Steam mods to work on Epic, but it's a pain in the ass, and I'm probably just going to buy the game when it's on sale on Steam.
ericnakagawa t1_j95q9ad wrote
Have a friendly link to those roguelikes, by chance?
ketaminekid t1_j95qsxj wrote
It’s generally accepted that the Japanese makers made the concept popular in the US
trx1150 t1_j96lkql wrote
“Thing of the past”
The market is cyclical and we shouldn’t rule out crypto skyrocketing again in the next 5 to 10 years
[deleted] t1_j96lylm wrote
OnionBagMan t1_j96mhwr wrote
I need a list of unplayable games, because I'm doing fine with a cheap-ass 10-series card and can't fathom what people are attempting to do that they can't.
OnionBagMan t1_j96mnsy wrote
Yeah no one needs a 4090 to play minecraft.
SixthLegionVI t1_j96my2z wrote
Which model of 10 series?
thenrepent t1_j96uv8i wrote
Crypto mining was mostly for Ethereum, which has moved to proof of stake now (no mining).
So crypto mining with GPUs should mostly be a thing of the past indeed.
bofpisrebof t1_j96w4oc wrote
it sure doesn't hurt though
mailslot t1_j980s4r wrote
Running at 120fps means that every single thing you do has to fit with the rest of the game and run within a budget of roughly 8 milliseconds per frame (see the sketch below). Everything is performant by necessity... but only up to a point. You can spend years optimizing away every single bit of waste to squeeze as much as you can out of the hardware, but at the end of the day, you have to cut your losses. Does another year of optimizing justify supporting older hardware that won't even support the full experience?
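For reference, the per-frame budget is just the reciprocal of the target framerate - a rough sketch (real engines also reserve headroom for OS and driver overhead, so the usable budget is smaller):

```python
# Per-frame time budget at common framerate targets. Everything the game
# does each frame (simulation, physics, draw submission) must fit inside it.
for fps in (30, 60, 120, 144, 240):
    print(f"{fps:>3} fps -> {1000 / fps:6.2f} ms per frame")
```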
trx1150 t1_j981j5e wrote
Ah, that's right, thanks for the reminder. And BTC mining isn't as GPU-based, right?
mrturret t1_j984qtu wrote
BTC is mined with ASICs now.
mrturret t1_j985acb wrote
I mean, I'm still rocking my 1080 that I got back in 2016, even after I upgraded my motherboard, CPU, and RAM. The 1080 still runs new games great.
no-name-here t1_j99hc9l wrote
What games "only play correctly on obscenely expensive hardware"? According to the top Google result, the most demanding game is Cyberpunk 2077. It doesn't require a 40xx, 30xx, or 20xx, or even a 10xx series card - it requires a GTX 780, and recommends a 1060. Even the recommended card series was released the better part of a decade ago.
And there are other comments here that they can run pretty much everything fine on a pre-10xx series card: https://www.reddit.com/r/technology/comments/115a172/ultraenthusiast_hardware_is_strangling_pc_gaming/j94xshs/?utm_source=share&utm_medium=ios_app&utm_name=iossmf&context=3
9-11GaveMe5G t1_j90hzqw wrote
The boundless wisdom of unimpeded capitalism will solve this