space_spider t1_iqum8oo wrote
Reply to comment by Nmanga90 in Large Language Models Can Self-improve by Dr_Singularity
This is close to Nvidia's Megatron-Turing NLG parameter count (530B): https://developer.nvidia.com/blog/using-deepspeed-and-megatron-to-train-megatron-turing-nlg-530b-the-worlds-largest-and-most-powerful-generative-language-model/
It's also the same as PaLM (540B): https://ai.googleblog.com/2022/04/pathways-language-model-palm-scaling-to.html?m=1
This approach (chain-of-thought prompting) has been discussed for at least a few months, so I think this could be a legit paper from Nvidia or Google.
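For anyone unfamiliar, chain-of-thought prompting just means including worked reasoning steps in the few-shot examples before asking the real question, so the model imitates the step-by-step style. A minimal sketch below; the example problems and helper name are purely illustrative, not taken from the paper:

    # Illustrative chain-of-thought prompt construction (made-up examples).
    FEW_SHOT = """Q: A farmer has 3 pens with 4 sheep each. How many sheep in total?
    A: Each pen holds 4 sheep and there are 3 pens, so 3 * 4 = 12. The answer is 12.

    Q: Tom reads 20 pages a day. How many pages does he read in a week?
    A: A week has 7 days, so 20 * 7 = 140. The answer is 140."""

    def build_cot_prompt(question: str) -> str:
        """Prepend worked reasoning examples so the model answers step by step."""
        return f"{FEW_SHOT}\n\nQ: {question}\nA:"

    print(build_cot_prompt("A train travels 60 km/h for 2.5 hours. How far does it go?"))

The self-improvement idea in the paper builds on this by sampling such reasoned answers and fine-tuning on the ones that reach consistent conclusions.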
space_spider t1_iy3kumf wrote
Reply to comment by tuvok86 in Why is VR and AR developing so slowly? by Neurogence
This is the real answer, OP. Almost everyone I know who has a VR/AR headset doesn't use it anymore. It was novel, but anecdotally I can only use one for about an hour before my head hurts. Video games aren't that fun in it, and industry applications are unreliable because the software is usually buggy and over budget.
I got to play with the Magic Leap, HoloLens, Oculus Quest 2, and other headsets when they came out, and the progress is cool, but they're just not worth the investment to anyone except as a curiosity for those with excess disposable income.