Comments

MorgrainX OP t1_isb2351 wrote

lmao

They will probably rebrand it as a 4070 (since it is in fact a 70-class card, maybe a Ti) and reduce the price, since the community feedback was overwhelmingly bad.

177

kenman345 t1_isb2ops wrote

They basically named them the same so the plain 4080 carried less prestige and people would want the 4090 more. The 4080 with 12GB should’ve been the mobile version, if it existed at all. The shared name never made sense. Glad they’re cancelling it

41

ColinM9991 t1_isehh82 wrote

This exactly.

There probably never was a "4080 12GB". It's likely it was always a 4070, and Nvidia did this because they're scumbags and wanted to push people towards the 4090. It's all very coincidental that they made this decision only after the launch, once buyers had had two days to get theirs.

Edit:

Looks like AIBs actually have produced 4080 12GB cards, as they're now deconstructing them, and I doubt they'd get in on a conspiracy with Nvidia. So Nvidia really does just like fucking everyone over.

3

G1ntok1_Sakata t1_isd8ibp wrote

The "4080 12GB" will perform at 52% - 58% of the full-die card (the 4090 Ti). That's XX60 Ti territory, not XX70 territory. Don't let Nvidia fool you with the "whoops, it's actually a 4070" bs.

10
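For context, the raw shader counts Nvidia announced roughly back this up. A quick sketch using the announced CUDA-core specs (the full-AD102 figure is the die's physical maximum, which no shipping card used at the time; raw core ratio is only a ballpark since clocks and memory shift real performance):

```python
# Sanity check of the die-tier claim using the CUDA-core counts Nvidia announced.
# Raw shader ratio is only a ballpark: clocks and memory shift real performance.
cores = {
    "4080 12GB (AD104)": 7680,
    "4080 16GB (AD103)": 9728,
    "4090 (cut-down AD102)": 16384,
    "full AD102": 18432,
}

full = cores["full AD102"]
for name, n in cores.items():
    print(f"{name}: {n / full:.0%} of the full die's shader count")
```

By raw shader count the 12GB card sits near 42% of the full die; the 52-58% performance estimate presumably folds in clock speed.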

Asuka-02- t1_isb85t9 wrote

The chips have already been manufactured, binned, and shipped to board partners and Nvidia's own manufacturing for reference cards.

So they aren't cancelling anything. Just putting an arguably more appropriate sticker on the box.

153

Ditchdigger456 t1_isb9sox wrote

Bingo. This is just more marketing talk. They'll probably just re-label them as 4070ti

65

G1ntok1_Sakata t1_isd8t9o wrote

Which, funnily enough, will still perform at 52% - 58% of the full-die card, the 4090 Ti. It'll be an XX60 Ti-class card marketed as an XX70 (Ti) card, and everyone will clap for Nvidia cuz whoop whoop, they're the hero. We live in sad times, eh?

6

swisstraeng t1_isdtyd5 wrote

Is it the same chip though?

Oh it's not.

The AD102 chip is used on the 4080 Ti, 4090, 4090 Ti and RTX 6000 (the Quadro-class card)
AD103 is used only on the 4080 16GB? Wtf.
AD104 is used on everything from the 4060 up to the 4080 12GB, wtf.
AD106 is used on the 4050 and that's it for now.
AD107 is not yet used, but that'd be for something like a 4030 or 4010...

I am guessing they will want at some point to make some RTX 4080ti with the AD103 chip perhaps? Or maybe they will make quadro cards with them..? Or they will bring back the "super" editions...

The hell is wrong with Nvidia this generation...

2
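The die-to-SKU pairing described above (partly rumor at the time, especially below AD103) amounts to a simple lookup; a hypothetical sketch grouping the announced cards by die:

```python
from collections import defaultdict

# Die assignments as described above; the AD104/AD106 pairings were rumors at the time.
die_for_sku = {
    "RTX 4090": "AD102",
    "RTX 6000 Ada": "AD102",
    "RTX 4080 16GB": "AD103",
    "RTX 4080 12GB": "AD104",  # the cancelled card this thread is about
    "RTX 4050": "AD106",
}

# Invert the mapping to see which SKUs share a die.
skus_by_die = defaultdict(list)
for sku, die in die_for_sku.items():
    skus_by_die[die].append(sku)

for die in sorted(skus_by_die):
    print(die, "->", ", ".join(skus_by_die[die]))
```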

jl_theprofessor t1_isbjke9 wrote

I'm just going to sit on my 3090 TI until the 5000 series because this generation is a mess.

29

oNOCo t1_isby4z3 wrote

I’ll sit on my 2070 super until i feel like it’s actually necessary to upgrade

18

M-Rich t1_iscx3op wrote

I remember how the media and press weren't really convinced the 2070 Super was a good deal. But then the whole supply-chain thing and COVID happened and boom, best-timed purchase ever imo. Super happy with it, will ride it at least one more gen

4

oNOCo t1_isdl337 wrote

Yeah, I had to send my 2070 in and it took Gigabyte 7 months to get me a new one. They apologized by giving me a 2070 Super. It's been serving me really well :)

1

StrudelStrike t1_isccvs9 wrote

My 1080 still runs everything I’m even a little bit interested in playing on my PC just fine. But I don’t play multiplayer games so I don’t have to pick stuff up while there are still people playing it.

2

jimababwe t1_isch0oh wrote

1060 here. Haven’t bought a game made this decade yet.

5

pillowbanter t1_isdh1ed wrote

960… but honestly it’s time

2

Russ-T-Axe t1_isdlr98 wrote

970 and it’s time but it still works great for everything I play. With the price of cards I picked up a ps5 for my son to play newer titles like elden ring etc. I would love to build a new pc but the price of components is crazy.

2

Cocacola612 t1_isbkrej wrote

Yeah, I’m more likely to make the jump to the new 7000-series Ryzen 9 and DDR5 RAM than get a 4090 this year or next. My 3090 Ti will last a while with the current top games

14

jl_theprofessor t1_isbpllo wrote

Yeah I mean I'm sitting here playing F.T.L. I think I'll get by.

9

spike4379 t1_ise1g6t wrote

But don't you wonder what it would be like to play it at 9000fps?

1

jl_theprofessor t1_ise6cp0 wrote

Bud,

9000 FPS LIGHT SPEED

JUST START PLAYING THE TRUMPET SONG FROM HEEERRRIDDDAATTTARRYYYYY

1

kenman345 t1_iscmz5w wrote

I’m still happy with my GTX 1080 for now. Definitely showing its age, but I don’t have the time for the latest games anyway. Might wait until next generation and hope for something more power efficient at the performance level I want, and a price to match

3

imafraidofmuricans t1_isch3r9 wrote

Yes? There shouldn't be any possible release that should make you swap out a year-old, super-expensive card. That'd be fucking stupid of you to do no matter what Nvidia releases.

5

alc4pwned t1_isc83sr wrote

Is it? If you were the kind of person who paid for a 3090 ti, the 4090 seems like a way better value than that card was.

3

HiCanIPetYourCat t1_iscfnqd wrote

I’m getting a 4090 whenever I can find one easily. I don’t care about any of the politics or complaints I’ve seen; I just want ray tracing maxed in 4k at 60 fps, and the card does that, so I’m sold. The price isn’t that crazy, especially considering I’ll get $700 back out of my 3080.

The card is a waste of time for anyone not playing in 4k but for people like me playing on a giant OLED from the couch this is a huge deal and totally worth it.

4

pufpuf89 t1_iscj1zn wrote

Unless your giant OLED is 80" and you're sitting about 2.5 metres from it, you don't really need 4k.

2
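Whether 4k detail is actually resolvable at a given size and distance comes down to trigonometry: pixels per degree (PPD) of visual angle, with ~60 PPD commonly cited as an approximation of normal acuity. A rough sketch (the threshold and the two viewing setups are illustrative assumptions):

```python
import math

def pixels_per_degree(diagonal_in, distance_in, horiz_px=3840):
    """Approximate horizontal pixels per degree for a 16:9 panel."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # screen width from diagonal
    # Visual angle subtended by the screen width, in degrees.
    angle = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return horiz_px / angle

# 48" 4k panel from ~3 feet: right around the acuity limit, so 4k detail is visible.
print(round(pixels_per_degree(48, 36)))
# 80" 4k panel from ~2.5 m (~98"): well above 60 PPD, so extra resolution matters less.
print(round(pixels_per_degree(80, 98)))
```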

alc4pwned t1_isck61q wrote

I game on a 48" OLED that I sit about 3 feet away from, 4k is absolutely something you want in that scenario

1

HiCanIPetYourCat t1_iscjbm4 wrote

It’s bigger than 80 and I sit about that distance from it. Trust me, the difference between 4k and 1440p on a screen that big and clear is absolutely massive.

0

G1ntok1_Sakata t1_isd9ayg wrote

Lol, it'll just be 5090-24, 5090-12, 5090-10, 5090-16 (but slower than the 12/10), 5090-8, 5090-6, and 5090-4.

2

ChrisFromIT t1_isbz4k4 wrote

The 5000 series might also be a bit of a mess, since it will likely be the first generation of Nvidia GPUs to use MCM (multi-chip modules). So it will have a lot of growing pains.

1

TheLurkingMenace t1_iscsr99 wrote

The PSU required to drive the 4090 is already ridiculous. The 5000 series cards are going to need their own PSUs.

2
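For scale: Nvidia's stated total graphics power for the 4090 is 450 W, with an 850 W PSU recommended. A rough system budget (all non-GPU figures here are illustrative assumptions, not spec):

```python
# Ballpark power budget around a 450 W RTX 4090.
# CPU and rest-of-system figures are illustrative assumptions, not measurements.
gpu_tgp = 450        # W, Nvidia's stated total graphics power for the 4090
spike_factor = 2.0   # high-end cards can transiently spike to roughly 2x TGP
cpu = 250            # W, a high-end desktop CPU under load (assumption)
rest = 75            # W, motherboard, RAM, drives, fans (assumption)

sustained = gpu_tgp + cpu + rest
peak = int(gpu_tgp * spike_factor) + cpu + rest
print(f"Sustained draw: ~{sustained} W; transient peaks: up to ~{peak} W")
```

The transient-spike headroom, not sustained draw, is the main reason the recommended PSU wattage is so far above the card's rated power.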

FightOnForUsc t1_isd57av wrote

MCM could make it take less power (potentially)

1

TheLurkingMenace t1_isdhcxv wrote

Maybe. Or maybe they will just make the GPUs that much more powerful.

1

FightOnForUsc t1_isdinxp wrote

Ok but then you buy a lower tier one that has lower power and the same performance

1

LaidToR3st t1_isce22q wrote

You woulda been paying a grand to go from 144 frames per second to 144 frames per second

1

I_R0M_I t1_isbtozp wrote

I was pretty excited a while back, hearing the 4080 was going to be faster than 3090Ti etc....

Then more details dropped about the 4080 not using the 4090 chip etc.

Then EVGA pulling out....

Sticking with my 3080 FTW Ultra this time round.

28

RSomnambulist t1_isckr37 wrote

Jesus, you were really going to jump from the 3080 to the 4080? I mean, the RT numbers look great, but why?

16

imdyingfasterthanyou t1_iscoxe4 wrote

I own a 3090 and not upgrading.

But actual 4K gaming at a high framerate is not attainable with the 3090 with everything on ultra...

So if someone wants to make up a reason to upgrade I'm sure they can find it

7

I_R0M_I t1_isec1y7 wrote

Well I was interested.... Not a sure thing.

Who doesn't love a new gpu? 😉

1

RSomnambulist t1_isey2oj wrote

I sure do, but it doesn't seem worth it unless you're big into RT or hfr 4k. However, fuck nvidia after this 4080 (4070) debacle.

1

I_R0M_I t1_isfi3yj wrote

I only play 1440, will only use RT on single player games.

Tbh, I'm just impulsive and like buying things 😂

1

cyclopeon t1_isdgggv wrote

I was contemplating the same jump actually. I'm building my daughter's first PC so I was going to give her my 3080 and buy a 4000 series. Didn't have to, but figured why not...

Then Nvidia introduced the "4080s" 🤣

0

Juub1990 t1_isebsn9 wrote

And then you heard it was $1200, up $500 from its predecessor.

1

I_R0M_I t1_isec4rl wrote

The 3080 was £800 for me at launch.

Not sure what the 4080 will be, more I'm sure.

1

FormulaTacoma t1_isbfvp7 wrote

This generation seems so unnecessary. People have only just become able to walk in and get a 30-series card whenever they want. And is there really anything taxing the 30 series yet?

19

CosmicCreeperz t1_isbi62a wrote

Technology marches on… should they just tell all of their engineers to take a year off? ;)

14

FormulaTacoma t1_isbie5r wrote

I dunno, but if you don’t let the developers catch up with using the horsepower, then what’s the point?

Like if Ti cards were coming out this year and 40 series next year would anyone actually care?

4

CosmicCreeperz t1_isbowod wrote

Probably not. But the reverse is also true. People will buy whatever is the fastest for their price range at the time. I’d be pretty happy to get a 4080 for the same price as a 3080 a few months ago, and not have to upgrade for an extra couple years…

That said I don’t even really want a 4K monitor, let alone have one already. And I have no interest in 200Hz gaming, 120Hz is more than enough for me. I think the truly functional real time ray tracing and high end VR support is the only reason I’d consider it now. But it’s irrelevant for me since I just overpaid for an (MSRP, still) 3080 earlier this year. Oh well.

Honestly though I do get your point - IMO the big problem isn’t necessarily that there is no use for faster cards, it’s that the market is so saturated with SKUs no one can figure out what the fuck they should get any more… I feel like a lot of “last gen” cards are either going to get very cheap and/or lose OEMs a bunch of money…

6

RaiShado t1_isbilxz wrote

Raytraced games can be taxing. Look at Cyberpunk

6

FormulaTacoma t1_isbitmr wrote

Yah, but like 1 game? Plus it seems like the vast majority are playing esports games that still run on my RX 580. I’m ready to upgrade and it’s so overwhelming now lol

2

homelessdreamer t1_isbl73k wrote

Do you remember when Crysis came out? Building a computer that could run that game became such a cultural phenomenon within the enthusiast community that it is still used as a trope today. 1 game is all it takes to keep things rolling.

3

Numarx t1_isbp5fj wrote

It was much more of a meme than an actual bar that people were competing for. No one said "Can it run Crysis at max settings?" It was pretty much "Can it run Crysis?" My shitty ass computer could run Crysis just fine. Hell, a lot of youtubers who run benchmarks still use CS:GO as a benchmark.

2

imdyingfasterthanyou t1_iscp58p wrote

CS:GO is a great benchmark because there is a lot of data on it and it is largely CPU-bound so it works great to test CPUs or CPU/GPU scaling.

1

FormulaTacoma t1_isblgsm wrote

Yah but do you hear anyone referring to cyberpunk in that way?

0

Fugueknight t1_isc9cyd wrote

VR is also a big performance suck. Most native VR games run pretty efficiently, but if you want to play sim games with a VR mode you pretty much need a 3080+ to get decent power for a 4k headset (and 4k is the "minimum" for sim games - obviously people make do with less, but it's tough to read dials/labels/etc. below 4k).

2

oNOCo t1_isby80g wrote

That’s been most technology for the last few generations haha

0

FormulaTacoma t1_isbyjhh wrote

Maybe it’s because the 30 series took so long to become generally available. At least with like an iPhone it’s fully on the shelf within a month or two of release

1

M0dusPwnens t1_isbkfsc wrote

They want more scarcity so people who have been waiting will buy the 4090 instead of continuing to wait for the cheaper cards. The name change is just an excuse.

Decent chance they engineered the seemingly nonsensical product naming so they'd have this excuse. There was no good explanation for why they named things that way, but it makes sense if they wanted to keep this escape hatch open for themselves.

I can only imagine how pissed off their partners must be with these last-second changes, especially after they were already so hard to work with that they drove EVGA away.

I have never seen as many people I know looking to upgrade as I have this generation, and they were all waiting on the 4000s. Nearly all of them have only ever owned Nvidia cards. And now not a single one of them plans to buy Nvidia. They're all waiting for AMD. Crazy how badly Nvidia is tanking its reputation.

12

acidrain69 t1_isbypfq wrote

I’ll stick with my 1080. That was a huge purchase when I got it, and now they expect 3x as much for a card? FU Nvidia, I hope Intel and AMD take you down a few pegs.

10

invisatrooper t1_isbow6p wrote

I’ve got a 1080 Ti and it’s spot on for almost everything. Yeah, I don’t get 120fps at 4K, but who actually cares.

6

GeoffDeGeoff t1_isbtxpr wrote

Soon to be rebadged as a 4070…

5

SgtThund3r t1_isbuczc wrote

NVIDIA makes surprise announcement about upcoming RTX 4070!

3

LockCL t1_isbp1xq wrote

Yay, more Linus videos about this.

2

Mclarenrob2 t1_isby1sa wrote

Think I'll stick with my 1050ti

2

LeeHarvey81 t1_isckxwh wrote

It was supposed to be the 4070, but they've still got a lot of 30xx cards to sell

2

DrawTheLine87 t1_iscy9xp wrote

I’m guessing this is partly because of the backlash, but I’m more inclined to believe it’s because they got wind of something from AMD….

2

pengy99 t1_isdhk4i wrote

This is more likely. They found out AMD had something that would destroy it and didn't want the "4080" getting wrecked in benchmarks.

2

bleaucheaunx t1_isddq15 wrote

1080ti still going strong. Too many other money priorities in my life to warrant giving nVidia any more money.

2

JohnnyGFX t1_isbsbbl wrote

I'll stick with my Asus ROG Strix 3080 OC. My current monitor is a 4k 55" curved LED that only really does 60hz (it has a 120hz upscale 'feature' but can't accept an actual 120hz signal). So, for right now the bottleneck for me is my monitor. I'm thinking I'll upgrade to a 4k true 120hz OLED monitor before I upgrade my video card.

1

nailbunny2000 t1_ise16ix wrote

It's crazy to me the amount of inconvenience this must cause their AIB partners. I mean, it's launching in a month. Marketing, packaging, and documentation would all have been done, and there are likely tens of thousands of cards already on container ships at this point that will need to either be recalled or repackaged. What a shit show.

1

dustofdeath t1_ise4anx wrote

Someone with a brain in marketing who actually follows the media saw the kind of damage this was doing, and the lawsuits it would have led to after launch (misleading product name, different hardware).

A new name is fine, regardless of price/perf. Market will decide if it will be a flop or not. But at least it's clear what you are buying.

1

oNOCo t1_isbycjd wrote

And cyberpunk will still run like shit and have issues rendering its world

−4

Scoobz1961 t1_iscooc0 wrote

Does the game run badly? I am using the cheapest 1060 6GB I was able to buy many years ago, and high settings at 1080p average 48fps. I am actually really surprised how well it runs on my shitty old card.

2

killaho69 t1_isdb8vl wrote

No, this guy is just stuck in a year-old circlejerk. I had CP2077 running on a 3080 and an i5-9600K with 16GB RAM AT LAUNCH, 4k on max settings with DLSS, getting 50-75ish FPS.

4

oNOCo t1_isdlgfy wrote

Yeah, I was on high at 1440p. But halfway through the game my worlds/levels started loading with missing objects/assets. In the part where you wake up towards the end in the ripper's clinic, the walls were invisible, the floors were invisible, etc; lots was missing. Same with the very, very end... the whole building except a few columns was missing until I walked out on the roof. I could see through the world. No amount of saving, loading, or graphics adjustment would fix it

−1

Blastoxic999 t1_isc16o2 wrote

Cancel culture saves the day (again, probably).

−5