Comments


Disastrous-Spell-135 t1_itnkrya wrote

What if we watercool the whole power supply system? Connect it directly to your home's water radiators. Faster than a puny boiler too. Problem solved.

40

CryoAurora t1_itnmyi0 wrote

Welcome to the world of crypto-mining-type setups just to play games.

They burn through connectors in mining.

Now desktop units by themselves are pulling as much as multiple GPUs used to.

20

MRV1V4N t1_itnqmln wrote

So much pawaaar!!! The world wasn't ready for this monstrosity.

14

Mastasmoker t1_ito6j69 wrote

Lol this makes no sense to me. Why do nvidia cards take 4x8 (32 pins) and reduce it to 16? Half!

7

fordfan919 t1_itomgyb wrote

So, will there be a better version of these adapters from Nvidia or third parties at some point, or do we literally have to play with fire?

7

tnupmap t1_itop00m wrote

unscheduled obsolescence

4

JangoDarkSaber t1_itorgff wrote

It was always the natural conclusion of things. Either the architecture gets significantly more efficient or the cards get significantly larger/ more power hungry.

Architecture improvements alone were never enough to quell the thirst for faster processing.

11

EatTheShroomz t1_itpdqvv wrote

Ah, it’s always fun being an early adopter, ain’t it? This is why I like being just a little bit behind the most cutting edge.

2

LewAshby309 t1_itpko2t wrote

I don't even know what was wrong with the old pins.

Sure, you would need 4 of them, but that leads to the question of why the power draw is so high in the first place.

The 4090 at 70% of the power limit performs in games on average only 7% below the max power limit. They released a GPU that at stock is already past the efficiency limit; graphs clearly show that. That's usually the OC range. If you raise the wattage from stock you gain maybe 2%. Once you go past the efficiency sweet spot you hit a wall where you need exponentially more power for slightly more performance.

It's a waste of energy that hands you just a little performance, with the risk of burning the power port and cable.

The only reason to do this is that the last few percent might give you the edge over the competition. And we know that if AMD beats them, they come up with new GPUs anyway.

It's stupid. For casual users the stock setting should sit right at the end of the efficiency range, while going past it should be a thing for people who are into OC. There are tons of people who don't touch a thing and run it stock. They won't, and often don't even know they could, get way better efficiency by simply lowering a slider a little bit.

The positive for the consumer is that the coolers are made for the high TDP, meaning if you lower it the GPU runs really cool, comparable to undervolting 30-series GPUs.

2
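A rough back-of-the-envelope sketch of the trade-off described above, using the commenter's approximate figures (a 450 W stock power limit for the 4090 and ~7% average performance loss at a 70% power cap); these are illustrative numbers, not benchmark data:

```python
# Perf-per-watt comparison of stock vs. a 70% power cap, using the rough
# figures from the comment above (not measurements).

STOCK_LIMIT_W = 450.0        # RTX 4090 stock power limit
capped_fraction = 0.70       # power slider set to 70%
perf_loss = 0.07             # ~7% average performance loss reported at that cap

stock_perf_per_watt = 1.0 / STOCK_LIMIT_W
capped_perf_per_watt = (1.0 - perf_loss) / (STOCK_LIMIT_W * capped_fraction)

# Relative efficiency gain from capping the card
gain = capped_perf_per_watt / stock_perf_per_watt - 1.0
print(f"Efficiency gain at 70% power limit: {gain:.0%}")  # roughly one third
```

Giving up ~7% performance for ~30% less power is exactly the "past the efficiency sweet spot" wall the comment describes.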

Tehnomaag t1_itptu3e wrote

I'd expect consumer protection agencies to get involved and tell the industry consortium responsible for these power sockets that it is not OK at all for things to catch fire.

Meaning that after an investigation, regulators might withdraw the CE marking and its US equivalent from any product with one of these sockets on it, making it illegal to sell or even import any of these into the US/EU.

Fire hazard is a pretty damn serious issue.

As a short-term fix, NVIDIA/AMD etc. would probably issue a driver/firmware update that limits the released cards to roughly 400 W max. The connectors themselves might have to wait a few years, until the next hardware gen, to be improved somehow.

Too few pins, too many amps, and almost no case in use has enough room above the card to leave 35 mm of space from the connector before you bend the cable. The 4000-series cards are already pretty thicc; then you plug in a connector that is itself some 10-15 mm long, then you need 40 mm more plus whatever the bend takes, since these cables can't easily be bent at a sharp 90-degree angle.

2

Tehnomaag t1_itpuarz wrote

The point was the data pins, so that new-standard PSUs can communicate to the card how much power they can supply. Then on top of that, someone figured, would it not be better if there were only a single cable?

Some optimistic engineer optimized it to the bone, and it was decided that yep, it can handle 600 W alright, provided the stars align just right and the user does not bend the cable anywhere within visual range of the card.

3

jakejm79 t1_itqj24g wrote

Only 6 pins in an 8-pin (or 6+2) plug carry power. Also, those 8-pin (or 6+2) cables came in a daisy-chain/pigtail variety, meaning each cable had two GPU plugs, each capable of 150 W, so 300 W total over 6 wires from the PSU.

We now have a single-plug cable (no pigtail/daisy chain) that has 12 current-carrying wires and supports 600 W. It's no different than before: still 50 W per wire from the PSU (or more specifically, 100 W per pair).

Think of the new 12-pin connector like two dual-pigtail 8-pin cables from before; they just moved the sense wires to the additional 4 and consolidated the plug. The number of current-carrying wires from the PSU hasn't changed.

They've effectively halved the number of connections because they've removed the ability to have a pigtail/daisy-chained connector and separated out the sense wires.

2
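The per-wire arithmetic in the comment above can be sketched as follows (assuming the standard 12 V rail; wattages are the cable ratings quoted in the comment):

```python
# Per-wire load: old dual-pigtail 8-pin PSU cable vs. new 12-pin cable.
RAIL_V = 12.0  # GPU power is delivered on the 12 V rail

# Old style: one PSU cable, two 8-pin (6+2) GPU plugs at 150 W each,
# but only 6 current-carrying wires back to the PSU.
old_watts = 2 * 150
old_wires = 6
old_w_per_wire = old_watts / old_wires       # 50.0 W per wire
old_a_per_wire = old_w_per_wire / RAIL_V     # ~4.17 A per wire

# New style: single plug, 600 W over 12 current-carrying wires.
new_watts = 600
new_wires = 12
new_w_per_wire = new_watts / new_wires       # 50.0 W per wire
new_a_per_wire = new_w_per_wire / RAIL_V     # ~4.17 A per wire

print(f"old: {old_w_per_wire:.0f} W/wire, {old_a_per_wire:.2f} A/wire")
print(f"new: {new_w_per_wire:.0f} W/wire, {new_a_per_wire:.2f} A/wire")
```

Per wire from the PSU, the load is identical; what changed is how many wires terminate in a single plug at the GPU end.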

imakesawdust t1_itr553t wrote

We're going to start seeing big Anderson power connectors on GPUs, aren't we?

1

jakejm79 t1_itxynut wrote

For this generation, 4x 8-pin might not be a big deal, but in the future you could be looking at 6x 8-pin or more. There are also additional benefits when the 12+4-pin is paired with an ATX 3.0 PSU.

You have to remember that with an 8-pin connector, it's really just 6 pins doing the power delivery.

They could have done something like they did with the dual 8-pin cables: basically make a pigtail 12-pin with each connector at the GPU doing 300 W for 600 W total. But you'd still have 600 W through the single 12-pin at the PSU end, plus they were trying to reduce the number/size of connections on the GPU.

1
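The scaling argument above can be illustrated with a quick count of how many classic 8-pin plugs various hypothetical TDPs would need (assuming the usual 150 W rating per 8-pin plug and the 75 W the PCIe slot itself supplies; the TDP values are made up for illustration):

```python
import math

PCIE_8PIN_W = 150   # rated power per 8-pin PCIe auxiliary plug
SLOT_W = 75         # the PCIe slot itself supplies up to 75 W

def eight_pin_plugs_needed(tdp_w: float) -> int:
    """Plugs needed to cover a card's TDP once slot power is accounted for."""
    return math.ceil(max(tdp_w - SLOT_W, 0) / PCIE_8PIN_W)

# Hypothetical card TDPs, from mid-range to a future monster
for tdp in (300, 450, 600, 900):
    print(f"{tdp} W card -> {eight_pin_plugs_needed(tdp)}x 8-pin")
```

A 600 W card already needs four 8-pin plugs, and a hypothetical 900 W card would need six, which is why consolidating to a single connector was attractive despite the teething problems.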