AylaDoesntLikeYou OP t1_iuqg6ux wrote
"Artificial-intelligence systems are increasingly limited by the hardware used to implement them. Now comes a new superconducting photonic circuit that mimics the links between brain cells—burning just 0.3 percent of the energy of its human counterparts while operating some 30,000 times as fast."
Deformero t1_iuqiyrj wrote
Yeah, it might be great if it's ever commercialized.
Shelfrock77 t1_iuqkhu0 wrote
Have some faith in the engineers in those chip factories, god damn, just be impressed at how quickly it's going. I can't imagine being a cutting-edge engineer and hearing people say "bUt iTz N0t cOmMeRCiaLiZ3d yEt". I often say this to people: it's only 2022 and AI is hatching quicker and quicker. Who knows, maybe AI designed this chip and the human engineers took credit for it.
neo101b t1_iuqucn7 wrote
My brother thinks none of this will happen in our lifetime. I keep telling him we'll have an RTX 4080 compressed into the size of a cell phone once we have materials that run faster and cooler than the current tech.
I give it less than 10 years.
ninjasaid13 t1_ius4iqo wrote
The physical limit of computing is:
> At 20 °C (room temperature, or 293.15 K), the Landauer limit represents an energy of approximately 0.0175 eV, or 2.805 zJ. Theoretically, room-temperature computer memory operating at the Landauer limit could be changed at a rate of one billion bits per second (1 Gbit/s) with energy being converted to heat in the memory media at the rate of only 2.805 trillionths of a watt (that is, at a rate of only 2.805 pJ/s).
Which means the theoretical limit is trillions of times less power than a lightbulb to flip bits at gigabit-per-second rates.
Maybe zettabit/s computing on a lightbulb's worth of power.
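For anyone who wants to check the arithmetic, here's a minimal Python sketch reproducing those numbers (the 10 W "lightbulb" is my own assumption):

```python
from math import log

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 293.15           # room temperature, K

# Landauer limit: minimum energy to erase one bit, k_B * T * ln(2)
E_bit = k_B * T * log(2)
print(f"Landauer limit: {E_bit:.3e} J ({E_bit / 1.602176634e-19:.4f} eV)")
# -> ~2.805e-21 J, ~0.0175 eV

# Power for a memory flipping one billion bits per second at the limit
print(f"Power at 1 Gbit/s: {E_bit * 1e9:.3e} W")  # -> ~2.805e-12 W (picowatts)

# Bit rate a 10 W lightbulb's worth of power could sustain at the limit
print(f"Bit rate on 10 W: {10.0 / E_bit:.2e} bit/s")  # -> ~3.6e21, zettabit range
```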
blueSGL t1_ius7e2x wrote
Started reading the comment thinking you were explaining why it's not going to be possible; finished it happy that we are nowhere near the theoretical limits and that there is still a massive amount of ground to cover.
Cuissonbake t1_ius77ju wrote
It needs to be smaller than that, and portable energy sources are still the main bottleneck in reducing the size of computers. The best we've got is carrying a suitcase-sized battery that can charge from solar panels. Idk how we can make energy sources smaller than that. I hate being dependent on the electrical grid.
And if you want to travel, airlines limit the battery size you can bring on planes for safety reasons. So there's still a lot to figure out in terms of energy sources for portable devices of that caliber.
[deleted] t1_iuudv9n wrote
[deleted]
neo101b t1_iuuh8ck wrote
It is. Could you imagine a Mega Drive or an Amiga computer the size of your phone? Because a phone can already emulate any of their games.
I can't wait for the future.
Deformero t1_iuqkt1n wrote
Sure, you are right. I'm just saying this isn't the first time there's been a revolutionary new chip technology promising something extraordinary. There have been a dozen since 2016 and I haven't seen any of them implemented in practice.
capsicum_fondler t1_iuqlnmi wrote
Sure, but since 2016 it's been about better versions of ordinary chips for ordinary computation.
So far we've been using chips more or less optimized for ordinary computation to do AI computation.
Now we're finding ways to optimize for AI instead. There's ample ground for innovation.
Shelfrock77 t1_iuqly7z wrote
Right, exactly.
Deformero t1_iuqnbn8 wrote
[deleted] t1_iuqycck wrote
In what way does this prove them wrong?
z0rm t1_iuqocw1 wrote
Because 2016 was six years ago, and taking something from discovery to full-scale commercialization takes decades. Graphene, for example, was only discovered in 2004, and it's only now in the early stages of commercialization. So... just wait.
Baron_Samedi_ t1_ius3th3 wrote
It takes longer than you might imagine to go from research breakthrough to industrial production: 10-15 years, sometimes.
duffmanhb t1_iuqnpuv wrote
If it has a use, it will be commercialized. The problem is that these lab results often show a proof of concept that "can maybe" do something great. The moment a technology shows a clear advantage over what exists, it gets miniaturized and fabricated. More often, though, it has one unique strength that isn't enough to outweigh what we already have.
Black_RL t1_iur6ld1 wrote
New tech always blows my mind like this: compared to the older one, it's better and consumes fewer resources.
Happens all the time!