Submitted by Sorin61 t3_11c7r3x in technology
EyeLikeTheStonk t1_ja29tfm wrote
Now all you need is a graphics card that can push 1000 fps in games without costing $10,000.
People are playing Hogwarts Legacy with top of the line graphics cards and registering below 60 fps.
cesium-sandwich t1_ja2c73b wrote
...and a CPU+GPU that can generate a full frame of AAA graphics in under 1 msec. Good luck with that. Same thing with Apple's "retina" displays: yeah, they're nice to look at, but it's REALLY hard to feed a high-res image at any decent framerate.
Doubling the frame dimensions means quadrupling the pixel count in the GPU framebuffer = 4x more horsepower to feed it.
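For a rough sense of the scale involved, a back-of-the-envelope sketch (assuming 4 bytes per pixel and no compression; the numbers are illustrative only):

```python
# Back-of-the-envelope framebuffer math: doubling width and height quadruples
# the pixel count, and pushing those pixels 1000 times a second multiplies the
# link bandwidth again on top of that.
BYTES_PER_PIXEL = 4  # assuming 8-bit RGBA, uncompressed

def frame_megabytes(width, height):
    return width * height * BYTES_PER_PIXEL / 1e6

def raw_gbits_per_second(width, height, fps):
    return width * height * BYTES_PER_PIXEL * fps * 8 / 1e9

for w, h in [(1920, 1080), (3840, 2160)]:
    print(f"{w}x{h}: {frame_megabytes(w, h):.1f} MB per frame, "
          f"{raw_gbits_per_second(w, h, 1000):.0f} Gbit/s raw at 1000 fps")
```

Stream compression on modern display links exists partly because raw figures like these outgrow the cable.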
quettil t1_ja2w81y wrote
Is it not possible to do some sort of real-time interpolation between frames?
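Frame interpolation is roughly what TV motion smoothing and GPU frame-generation features do, though they rely on motion vectors; a deliberately naive version that just blends two adjacent frames might look like this (a minimal NumPy sketch, not how any shipping interpolator works):

```python
import numpy as np

def blend_frames(prev_frame, next_frame, t=0.5):
    """Naive 'interpolation': a weighted blend of two already-rendered frames.

    Real interpolators warp pixels along motion vectors instead; a plain
    blend just ghosts anything that moves.
    """
    mixed = (1.0 - t) * prev_frame.astype(np.float32) + t * next_frame.astype(np.float32)
    return mixed.astype(prev_frame.dtype)

# Two fake 1080p greyscale frames -> one synthetic in-between frame.
a = np.zeros((1080, 1920), dtype=np.uint8)
b = np.full((1080, 1920), 255, dtype=np.uint8)
mid = blend_frames(a, b)   # every pixel ends up around 127
```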
theinvolvement t1_ja2lmab wrote
What do you think about fitting some logic between the pixels at the cost of pixel density?
I was thinking it could handle some primitive draw operations, like vector graphics and flood fill.
Instead of trying to drive every pixel, you could send tiles of texture with relatively low resolution, and use vector graphics to handle masking of edges.
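A very loose CPU-side sketch of that tile-plus-edge idea (everything here is hypothetical: a coarse tile is upscaled and then trimmed by an analytic edge, so the silhouette stays sharp regardless of the tile's resolution):

```python
import numpy as np

# Hypothetical sketch of the tile-plus-vector-edge idea: a coarse texture tile
# is upscaled on the fly, then trimmed by an analytic edge (here a half-plane
# a*x + b*y <= c), so the silhouette stays sharp no matter how coarse the tile.
def trim_tile(tile, scale, a, b, c):
    hi = np.kron(tile, np.ones((scale, scale), dtype=tile.dtype))  # nearest-neighbour upscale
    ys, xs = np.mgrid[0:hi.shape[0], 0:hi.shape[1]]                # per-pixel coordinates
    mask = (a * xs + b * ys) <= c                                  # evaluate the vector edge
    return np.where(mask, hi, 0)                                   # keep pixels inside the edge

coarse = np.random.randint(0, 256, size=(8, 8), dtype=np.uint8)    # an 8x8 texture tile
out = trim_tile(coarse, scale=16, a=1.0, b=1.0, c=128.0)           # 128x128 with a crisp diagonal cut
```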
asdaaaaaaaa t1_ja2pr26 wrote
I'd imagine the extra steps between "generate graphics" and "display" add a considerable amount of latency. From my understanding we're already at the point where having the CPU physically close to related chips (memory is one, IIRC) makes a difference. Could be wrong, but from my understanding the last thing you want to do is throw a bunch of intermediate hardware/steps into the process if you can avoid it.
cesium-sandwich t1_ja2ps0i wrote
There are some economies of scale involved, especially for high-density displays.
The GPU does a lot of the heavy lifting..
But even simple-ish games often take multiple milliseconds of CPU time to simulate one frame, and that work doesn't transfer to the GPU, so doubling the framerate means halving the physics+gameplay+CPU budget, since you have half as much time to do it.
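The budget arithmetic is unforgiving; a quick sketch (ignoring GPU and driver overhead entirely):

```python
# Per-frame time budget at various target framerates: everything the CPU does
# per frame (physics, gameplay, draw-call submission) has to fit in this window.
for fps in (30, 60, 120, 240, 1000):
    print(f"{fps:>5} fps -> {1000.0 / fps:6.2f} ms per frame")
```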
rumbletummy t1_ja35lae wrote
You mean like CAD?
theinvolvement t1_ja4bihn wrote
I'm not sure. What I'm thinking of is a GPU that can output tiles of image data, plus an outline that trims the image to a sharply defined shape.
So the monitor would receive an array of images tiled together, and instructions to trim the edges before displaying them on screen.
It's kind of a pipe dream I've had since hearing about vector graphics video codecs last decade, and microLEDs a few years ago.
ElasticFluffyMagnet t1_ja2pjo1 wrote
It's mostly because of Denuvo. I can manage 100-110 fps at 1440p, and I'm running it on a 2080 Ti (all settings on high).
cyniclawl t1_ja3ymk3 wrote
How? I'm running high on a 3080 and none of my hardware is being taxed at all
ElasticFluffyMagnet t1_ja473ma wrote
Did you mean me or the guy above me..
cyniclawl t1_ja47h2a wrote
I meant you, curious what I'm doing wrong lol. I'm getting 60 in cinematics but as low as 10fps in some parts
ElasticFluffyMagnet t1_ja49s9f wrote
Well, I think it's important to have your drivers up to date and to check that you meet all the other requirements. I've seen people with 1660s run it very well at 1080p, and people with 4090s running it insanely well at 4K with 60+ fps. But then there are people with 1080s, 4070s, and everything in between having problems.
The thing is, I'm running the game and it's using 20 GB of RAM, not the 16 they say should be enough. Don't know how much influence that has on performance though.
If you have low fps, check links like this to see if any of them help. In this case though, I believe Denuvo really doesn't help either. There's a video of Hogwarts with and without Denuvo (sort of without) and the difference is quite noticeable.
If none of the things in the earlier videos help, the only thing you can do is contact the makers of the game, or file a ticket. Seems to me though that it's mostly luck of the draw whether your system runs it smoothly, if you of course meet all the requirements. But seeing that my RAM usage is way higher, I wonder if the other requirements are wrong too.
Edit: added some stuff and fixed typos
458_Wicked_Pyre t1_ja4be38 wrote
Unreal Engine; it's a single-core CPU bottleneck.
Exci_ t1_ja2q6g0 wrote
A 4090 at 1000fps is basically free tinnitus.
alice_damespiel t1_ja2hvdv wrote
1 kHz would be beneficial in literally every use case but modern 3D games.
CatalyticDragon t1_ja2m91r wrote
A 7900xtx already pushes 500-600 fps in Valorant (and other similar games). People are getting 300+ on a 3060ti.
One more generation of GPUs and some popular eSports games will hit 1,000 fps.
So it makes sense to work on displays now.
evicous t1_ja3r9ri wrote
I don’t know why you’re getting downvoted. We all joke about 1khz being unacceptably stuttery for esports but… we’re well on our way to doing that on the high end. Frankly given the CPU bottleneck on a 4090/7900XTX at 1080p we might actually already be there with GPUs, or we’re very close.
A kHz refresh 1080p display will be very usable with adaptive refresh already, honestly.
CryptographerOdd299 t1_ja3vq0f wrote
Aren't CRTs capable of insanely high refresh rates?
deep_anal t1_ja3k85q wrote
Classic r/technology take. Cool new technology being discussed, "yea but what about all this other bullshit that makes this trash and we shouldn't be excited or talk about it in any way shape or form."
ElementNumber6 t1_ja5mljj wrote
This is nothing unique to any particular subreddit. The entire world is salty about the state of GPUs right now, for one reason or another.
sameguyontheweb t1_ja34z4d wrote
*with RTX enabled
rumbletummy t1_ja35dtc wrote
Maybe eye tracking and foveated rendering could take advantage?
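A toy estimate of why foveated rendering could help at these refresh rates (the fovea radius and periphery downscale below are made-up numbers, not from any real system):

```python
import numpy as np

# Toy estimate of the foveated-rendering saving: full resolution only inside a
# circle around the gaze point, quarter resolution everywhere else (ignoring
# the small overlap between the two regions).
def foveated_pixel_count(width, height, fovea_radius, periphery_scale=4):
    fovea = np.pi * fovea_radius ** 2                  # full-res pixels near the gaze point
    periphery = (width * height) / periphery_scale ** 2
    return fovea + periphery

full = 3840 * 2160
fov = foveated_pixel_count(3840, 2160, fovea_radius=256)
print(f"full: {full / 1e6:.1f} MP, foveated: {fov / 1e6:.2f} MP "
      f"({full / fov:.1f}x fewer pixels to shade)")
```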
Fickle_Ball_1553 t1_ja3s0t1 wrote
Blame Denuvo.
IHadTacosYesterday t1_ja3upav wrote
In other words... "Calls on NVDA!"
azibuga t1_ja40u79 wrote
Right? Exactly my thoughts when I saw this
king0pa1n t1_ja4eb07 wrote
This would be incredible for old FPS games like Quake
[deleted] t1_ja4mx8q wrote
[deleted]
gliffy t1_ja4x5o7 wrote
Idk, I have a 6900 XT and I'm getting a solid 75 fps on ultra.
TDYDave2 t1_ja2aflk wrote
Does that mean Hogwarts Legacy is the new Crysis?
tnnrk t1_ja2f08t wrote
It’s just poorly optimized
TDYDave2 t1_ja2f6sq wrote
Then I will be optimistic about it being better optimized in the future.
Certain_Push_2347 t1_ja381rv wrote
Just get it now. There's nothing wrong with it unless your computer doesn't meet the minimum requirements, which are high compared to the majority of previous games, but it runs well.
chaivpush t1_ja40sma wrote
I have well beyond the min requirements and my frames still tank into the 40s and 50s while moving in Hogwarts and Hogsmeade. Definitely wait until they drop a performance patch.
[deleted] t1_ja56ok8 wrote
[removed]
Certain_Push_2347 t1_ja41ux6 wrote
Are you running on the minimum settings lol. So many people don't even have their PC set up properly, and that's why they experience issues in games. They have Windows fighting Nvidia software fighting some other third-party software fighting the game. Or a quad-core processor with multithreading disabled. Etc.
chaivpush t1_ja421jr wrote
Trust me when I tell you that I know how to optimize my PC, dipshit.
Certain_Push_2347 t1_ja428ee wrote
Lmao, so not minimum. Got it. Makes sense.
BobbyBorn2L8 t1_ja5ukc8 wrote
It's well known these days that 'AAA' devs aren't optimising properly for PC. Don't defend the practice when people are clearly having issues.
Certain_Push_2347 t1_ja6msd0 wrote
Lol that's just not true.
BobbyBorn2L8 t1_ja6z2t1 wrote
How so? There are many games today that should not be performing as badly as they do on PC; they aren't nearly good-looking enough to justify it.
Name any decent-looking AAA game and I guarantee it was plagued with poor performance at launch and for many months after. Hogwarts Legacy is the most recent to come to mind.
Dead Space Remake had stuttering and framerate drop issues. Elden Ring performed awfully on release and still has noticeable issues today.
Certain_Push_2347 t1_ja7ohno wrote
I've already explained this. It's not like some of us have an exclusive patch that makes the game run at 60fps no problem. It's a properly working PC with the required hardware.
BobbyBorn2L8 t1_ja806iz wrote
Yeah, some hardware just isn't properly optimised for by the devs, so while you may get 60 fps, someone with different hardware from yours (not necessarily better or worse) will have a different experience because the devs haven't properly optimised for their setup.
How do you not get this?
Certain_Push_2347 t1_ja80v1y wrote
I get it. The problem is you understanding.
BobbyBorn2L8 t1_ja8hr3o wrote
hahahahahah brilliant, you have no clue hence your clueless statement
>Just get it now. There's nothing wrong with it unless your computer doesn't meet minimum requirements. Which are high compared to majority of previous games but it runs well.
You clearly have no clue what you are talking about
https://www.pcgamer.com/hogwarts-legacy-february-patch-notes/
peace out
Certain_Push_2347 t1_ja9u0m3 wrote
I'm guessing you didn't read your own article? People who can't run dx12 properly are having shading issues lmao.
BobbyBorn2L8 t1_jad7zi1 wrote
>Since launch, Hogwarts Legacy PC players have reported stuttering and crashing while playing the game, and it seems to be caused by how it loads shaders. Unreal Engine games using DirectX12 have a tendency to chug when shaders load in for the first time, and it doesn't matter how good of a gaming rig you have. Final Fantasy 7 Remake was a particularly egregious example of shader sin, and for some, Hogwarts Legacy is just as bad
What dream are you living in? This is the definition of piss-poor optimisation. Stop simping for companies; they 100% should be criticized for this.
Certain_Push_2347 t1_jad9myj wrote
Lmao I don't think you understand how computers work. The game is fine.
BobbyBorn2L8 t1_jada3j1 wrote
Clearly not, when there are widespread performance issues caused by shoddy loading of shaders. You don't understand how software works; your own source admits they messed up the loading of shaders.
You are truly a delusional fanboy. This is why companies can get away with releasing buggy games nowadays.
Certain_Push_2347 t1_jadap3c wrote
I'm not sure what you're even talking about now. I didn't give a source for anything and you're literally repeating what I've said. Perhaps do some reading on computer performance and maybe you'll fix your problem.
BobbyBorn2L8 t1_jadbtty wrote
Sorry, I forgot I linked the source; brain is fried, and I still understand how computers work better than you do.
And I don't have the problem because I don't own the game, but plenty of people who do own it are getting performance hits where they shouldn't because of poor optimisation. The article I provided confirms this, but you still sit here and argue that it's somehow a consumer problem.
The game shouldn't be performing this badly for people.
Certain_Push_2347 t1_jae5thp wrote
Lmaooo never even played it but understand it's not hardware issues somehow.
BobbyBorn2L8 t1_jae6zur wrote
Are you dense? You do realise other people's experiences are literally out there, right? Are they all just lying? Do performance issues never happen because of developer/management incompetence or tight deadlines? You've got an article that is literally based on people's experiences and tells you why the software optimisation is having issues; hell, the article is about a fix for performance issues that didn't even fix them.
If there wasn't an issue, why did the developers have to release a fix? Explain that one. Are the developers lying too?
Certain_Push_2347 t1_jaeuagv wrote
Hopefully you learn something from this. It's okay to not understand but you shouldn't be so aggressive. Makes you look bad. No one judged you before.
atchijov t1_ja2dsqi wrote
10k Hz refresh rate is not the same as 10k fps. Analog movies were shot at 32 fps… and no one complained about “smoothness”.
So anything above 32 is mostly to fool our brain for some “beneficial” purpose.
Ordinary_Fun_8379 t1_ja2gwkn wrote
Movies are shot at 24fps and are noticeably stuttery during action and fast pans. The “feel” of 24fps is so intertwined with what audiences expect a movie to look like that high frame rate films like The Hobbit look wrong to most people.
asdaaaaaaaa t1_ja2pz6k wrote
Agreed, reminds me of the "soap opera effect": soap operas used to use higher-FPS video (I forget the exact amount), which led people to view a smoother video experience as "low quality", because that was the type of show that used it. Don't know the technical specifications of it, just that even I had to adjust what I viewed as "high quality" when more studios started doing the same.
PmMeYourBestComment t1_ja2h03a wrote
Ever seen a fast pan at 30 fps? You'll see stuttering. The human eye can easily see the difference above 60 fps.