WithoutReason1729 t1_j8c97b1 wrote
Reply to comment by turnip_burrito in This is Revolutionary?! Amazon's 738 Million(!!!) parameter's model outpreforms humans on sience, vision, language and much more tasks. by Ok_Criticism_1414
GPT-2 XL is 1.5 billion parameters. Unless they added some very computationally expensive change to this new model that's unrelated to the parameter count, this could definitely run on consumer hardware. Very very cool!
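To make the "could run on consumer hardware" claim concrete, here's a rough sketch of the memory needed just to hold the weights at different precisions. This is weights-only arithmetic (it ignores activations, KV cache, and framework overhead), and the byte-per-parameter figures are the usual assumptions for fp32/fp16/int8:

```python
# Rough weights-only memory estimate; ignores activations and runtime overhead.
def weight_memory_gib(num_params: int, bytes_per_param: int) -> float:
    """GiB needed to store the model weights at the given precision."""
    return num_params * bytes_per_param / (1024 ** 3)

for name, params in [("738M model", 738_000_000), ("GPT-2 XL (1.5B)", 1_500_000_000)]:
    for precision, nbytes in [("fp32", 4), ("fp16", 2), ("int8", 1)]:
        print(f"{name} @ {precision}: {weight_memory_gib(params, nbytes):.2f} GiB")
```

Even at full fp32, a 738M-parameter model is under 3 GiB of weights, which fits comfortably on mid-range consumer GPUs.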
Red-HawkEye t1_j8cd8mp wrote
God damn, Amazon has entered the game.
Just when you think you'd seen it all, an atomic bomb like this one gets announced.
This is equivalent to the villain Black Frieza coming back in Dragon Ball and one-shotting the main characters.
Amazon's model one-shots ChatGPT and Google's LaMDA out of nowhere.
grimorg80 t1_j8ctetu wrote
I want to see it in action out in the open, though.
DarkCeldori t1_j8cz4sg wrote
While on the topic of consumer hardware, Ryzen AI (XDNA) seems promising, as it'll be able to easily make use of main system memory, which will soon be able to reach 256GB. That can fit very large models, and inference is usually far less computationally intensive than training.
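As a rough sanity check on "256GB can fit very large models": the sketch below estimates the largest parameter count whose weights fit in a given amount of RAM. The 20% overhead reserve is an illustrative assumption, and this again counts weights only:

```python
# Rough estimate of the largest model (by parameter count) whose weights
# fit in a given amount of RAM; the overhead fraction is an assumption.
def max_params(ram_gib: float, bytes_per_param: int, overhead_frac: float = 0.2) -> float:
    """Largest parameter count fitting in ram_gib, reserving overhead_frac for the OS/runtime."""
    usable_bytes = ram_gib * (1024 ** 3) * (1 - overhead_frac)
    return usable_bytes / bytes_per_param

print(f"~{max_params(256, 2) / 1e9:.0f}B params at fp16 in 256 GiB of RAM")
```

By this estimate, 256GB of system memory holds roughly a 100B-parameter model at fp16, well beyond what consumer GPU VRAM can fit today.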
gangstasadvocate t1_j8d43oc wrote
Apparently the husband of one of my professors has a home-grade computer system with 2TB of RAM. I tried searching it up, and that much only seems available in server-type builds, but yeah.
DarkCeldori t1_j8dk62j wrote
I think some Threadripper Pro workstations can reach up to 2TB of RAM. It will be very good once Threadrippers come with Ryzen XDNA AI built in, as that can directly use main memory for AI tasks.
Tiamatium t1_j8i6ien wrote
Yeah, 2TB of RAM is doable with server/workstation hardware. Think Threadripper or Xeon for the CPU.