orincoro t1_j08c6qx wrote
Reply to comment by swisstraeng in Japan to Manufacture 2nm Chips With a Little Help From IBM by Avieshek
Yeah, I think I read that too: the obvious next step is to just build the wafers in a 3D architecture, but it's super complicated to fabricate.
IlIIlllIIlllllI t1_j08pupi wrote
Heat is a bigger problem.
Hodr t1_j0bvehy wrote
Heat is more of a materials issue. Once they hit the wall, they can move to GaAs or other semiconductors.
The only reason we still use silicon is the existing infrastructure and the relative abundance of the element.
swisstraeng t1_j09a5ge wrote
Yeah, and the main issue is that when you add layers on top of layers, the surface gets less and less flat. At some point you're off by a whole layer, so you have to run long, expensive planarization steps (chemical-mechanical polishing) to flatten the thing again.
Cooling is partially an issue, but that's also because CPU/GPU manufacturers push their chips to the limit to make them look better in benchmarks. They end up selling stuff like the RTX 4090, which is clocked way too high and can draw 600W when it could deliver about 90% of the performance at 300W. But hey, they're not the ones paying the power bill.
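To put rough numbers on that, here's a toy model (my own sketch, not from any benchmark): dynamic power scales roughly with V²f, and voltage has to climb with clock near the top of the curve, so power grows roughly cubically with frequency. Real cards get even steeper near the limit, which is why "90% of the performance at half the power" is plausible.

```python
# Toy clock/power model: assumes power ~ frequency^3 near the top of the
# voltage/frequency curve (dynamic power ~ V^2 * f, with V tracking f).
# The cubic exponent is an illustrative assumption, not measured data.

def relative_performance(power_ratio: float, exponent: float = 3.0) -> float:
    """Approximate relative clock (roughly performance) at a given power ratio."""
    return power_ratio ** (1.0 / exponent)

for watts in (600, 450, 300):
    print(f"{watts}W -> ~{relative_performance(watts / 600):.0%} of max performance")
```

Even under this simple model, dropping from 600W to 450W only costs about 9% performance.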
orincoro t1_j0c7tzb wrote
I wonder how much electricity globally is consumed by needlessly overclocked GPUs.
swisstraeng t1_j0ei32s wrote
Surprisingly not much, if we only look at industry-grade hardware. Consumers? Yeah, a lot is wasted there.
All the server and industrial stuff is actually not too bad. For example, the chip used in the RTX 4090 is also used in a Quadro-class workstation card.
It's the AD102 chip, also used in the RTX 6000 Ada GPU, which has only a 300W TDP, compared to the RTX 4090's 450W, sometimes pushed to 600W. Or worse, 800W in the rumored RTX 4090 Ti.
We're talking about the same chip and a 300W versus 800W difference.
Anyone running an RTX 4090 Ti like that would be dumping an extra 500W for a little more computing power.
But hey, a kWh costs about €0.25 in the EU, depending on where you live. That means an RTX 4090 Ti costs an extra euro for every 8 hours of use, money that could be saved by downclocking the card.
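Quick sanity check of that arithmetic (a minimal sketch; the €0.25/kWh price is just the rough EU figure from above, not a real quote):

```python
# Energy cost of the extra draw: same AD102 chip at 800W vs 300W.
EXTRA_DRAW_KW = 0.5        # 500W of extra power, expressed in kW
PRICE_EUR_PER_KWH = 0.25   # assumed rough EU electricity price
HOURS = 8

cost_eur = EXTRA_DRAW_KW * HOURS * PRICE_EUR_PER_KWH
print(f"~EUR {cost_eur:.2f} wasted per {HOURS}h of use")  # ~EUR 1.00
```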
SneakyCrouton t1_j09z32d wrote
See, that's just a marketing name for it. It's actually just a 2D transistor, but they draw a little D on there for legality purposes.
TheseusPankration t1_j0aci3b wrote
Wafers are already 3D. It's just that the dozen or so metal layers are all used to route signals and power; only the critical bottom layers, where the transistors themselves sit, are patterned at the latest node's finest pitch. https://en.wikichip.org/wiki/File:intel_interconnect_10_nm.jpg