kallikalev
kallikalev t1_j1pkid7 wrote
Reply to comment by basafish in Is there any real upper limit of technology? by basafish
And the fact that they’re just toys now means they’ll never be put into use? The newest big stuff, like image generation, is less than a year old; things take time. Not long ago generative art was nothing but a pipe dream, then all the outputs were messes of scribbles that vaguely resembled the prompts, and now they’re mind-blowing. Give it a few more years of refinement and business interest, and you’re going to see image generators and chatbots become commonplace.
As a first example of widespread deployment, the popular graphic design tool Canva has added a text-to-image tab directly in its website editor, letting people create stock photos, logos, backgrounds, etc. on the fly. And on the “toy” side of things, Midjourney launched about six months ago and already has millions of users paying $10-$30 every month for image generation. Most use it as a toy, but some are making album or book covers, character art for roleplaying games, sketches and inspiration for their own drawings, and so on. Just because something is a toy doesn’t mean it won’t have any impact.
kallikalev t1_iyvwn8e wrote
Reply to comment by BlingyStratios in StableDiffusion can generate an image on Apple Silicon Macs in under 18 seconds, thanks to new optimizations in macOS 13.1 by Avieshek
A few months ago Stable Diffusion wasn’t running on the GPU on Macs, so it was CPU-only.
kallikalev t1_jdhj0tf wrote
Reply to comment by Mercurionio in ChatGPT Gets Its “Wolfram Superpowers”! by Just-A-Lucky-Guy
We’re talking about direct computations. Someone who can recite a massive number of digits of pi has them memorized; they aren’t computing them via an infinite series in the moment.
The point being made is that it’s much more efficient, in both time and energy, to have the actual computation done by a dedicated, optimized program that only takes a few CPU instructions, rather than trying to approximate it with the giant neural network that is an LLM. And this is similar for humans: our brains burn far more energy multiplying large numbers in our heads than a CPU does in the few nanoseconds the same multiplication takes.
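To make the delegation idea concrete, here’s a minimal sketch (not the actual Wolfram plugin, whose interface isn’t described here) of an orchestrator that routes recognized math queries to cheap native calls instead of letting the model generate the answer token by token. The `handle_query` function and its tiny command format are hypothetical, purely for illustration:

```python
import math

def handle_query(query: str):
    """Toy router: send recognized computations to native code,
    return None to signal 'fall back to the LLM' for everything else."""
    tools = {
        "pi": lambda: math.pi,           # a constant lookup, not an infinite series
        "multiply": lambda a, b: a * b,  # a few CPU instructions, exact result
    }
    op, *args = query.split()
    if op in tools:
        return tools[op](*[float(x) for x in args])
    return None  # not a computation we handle; defer to the language model

print(handle_query("multiply 123456 789012"))
print(handle_query("pi"))
```

The division of labor is the whole point: the neural network decides *what* to compute, and a dedicated program does the computing, exactly and for a vanishingly small energy cost.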