ActuatorMaterial2846 t1_j9ydjnf wrote

Is this to do with advancements in file compression? I heard Emad Mostaque talk about this regarding stable diffusion.

KarmaStrikesThrice t1_j9zvqll wrote

No, I meant it more generally. Neural networks don't contain any super complicated math or equations that are difficult to solve; they are networks of simple cells whose inputs are the outputs of the previous layer of cells, and whose outputs feed into the next layer. A popular example of a cell is the perceptron, which computes a simple linear equation y = Ax + b. The main problem is the size of the network, which can be billions or even trillions of cells in the case of ChatGPT. But not all cells are always used; based on the input, only some cells are active (the same way our brain does not activate the cells that learned math when we are asked, for example, what the capital of New York state is).
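The perceptron cell described above can be sketched in a few lines of Python (a toy illustration with made-up weights, not any real framework's code):

```python
import numpy as np

def perceptron(x, A, b):
    """One cell: weighted sum of inputs plus bias, then a step activation."""
    y = np.dot(A, x) + b          # the linear part: y = Ax + b
    return 1.0 if y > 0 else 0.0  # step activation: the cell "fires" or not

# A layer is just many such cells; each cell's output feeds the next layer.
x = np.array([0.5, -1.0, 2.0])   # outputs of the previous layer (hypothetical)
A = np.array([0.8, 0.1, 0.3])    # learned weights for this cell (hypothetical)
b = -0.2                         # learned bias (hypothetical)
print(perceptron(x, A, b))       # → 1.0 (0.4 - 0.1 + 0.6 - 0.2 = 0.7 > 0)
```

Stacking many layers of these cells, with a nonlinear activation between them, is all a "deep" network is.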

So the most computationally difficult part is learning, plus having enough memory to hold the whole network in fast memory: the AI doesn't know what you are about to ask it, so the whole network needs to be ready. But once we ask a specific question, like "are cats carnivores?", 99.99...% of cells remain inactive, and only those storing information about biology, mammals, cats, food, meat, diets, carnivores, etc. are engaged and produce the answer. So extracting the output from given inputs is much simpler and could be done by personal computers (if our computers had many terabytes/petabytes of RAM and storage, which they don't).

The advanced compression algorithms reduce the memory required to store the network, but they don't really improve performance aside from some minor cache optimizations.
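One common form of this kind of compression is weight quantization; here is a hedged sketch of the basic int8 idea (not Stable Diffusion's actual method, just the general technique): store float32 weights as int8 plus one scale factor, cutting memory roughly 4x at the cost of a small, bounded rounding error.

```python
import numpy as np

# One million float32 weights (random stand-ins for a trained layer).
w = np.random.default_rng(1).normal(size=1_000_000).astype(np.float32)

scale = np.abs(w).max() / 127.0              # map the weight range onto int8
w_q = np.round(w / scale).astype(np.int8)    # compressed representation
w_restored = w_q.astype(np.float32) * scale  # decompressed at inference time

print(w.nbytes // w_q.nbytes)                       # → 4 (4x smaller in memory)
print(bool(np.abs(w - w_restored).max() < scale))   # → True (rounding error bounded)
```

The network does the same amount of arithmetic either way, which matches the point above: less memory, roughly the same compute.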
