pkuba208
pkuba208 t1_jcy7nxg wrote
Reply to comment by 1stuserhere in [Research] Alpaca 7B language model running on my Pixel 7 by simpleuserhere
Should be faster than 1 word per second. Judging by the fact that modern PCs run it at about 5 words per second and a Raspberry Pi 4B runs it at about 1 word per second, a Pixel 7 should land somewhere around the 2.5 words per second mark
pkuba208 t1_jcy717u wrote
Reply to comment by Art10001 in [Research] Alpaca 7B language model running on my Pixel 7 by simpleuserhere
I know, but Android itself uses 3-4 GB of RAM. I run it myself, so I know that the smallest model currently uses 6-7 GB of RAM even with 4-bit quantization
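The 6-7 GB figure lines up with some back-of-envelope arithmetic. A minimal sketch of the weight-storage math, assuming 7 billion parameters at 4 bits each plus an assumed (illustrative, not measured) overhead for runtime buffers and context:

```python
# Rough memory estimate for a 7B-parameter model quantized to 4 bits.
# The overhead figure below is an assumption for illustration, not an
# exact llama.cpp/alpaca.cpp measurement.

def model_weight_gb(n_params: float, bits_per_param: float) -> float:
    """Approximate weight storage in GB (1 GB = 2**30 bytes)."""
    return n_params * bits_per_param / 8 / 2**30

weights = model_weight_gb(7e9, 4)   # ~3.26 GB for the weights alone
overhead = 2.5                      # assumed: runtime buffers, context cache, etc.
total = weights + overhead

print(f"weights: {weights:.2f} GB, total: ~{total:.2f} GB")
```

So the quantized weights alone are around 3.3 GB; add runtime overhead and the OS's own 3-4 GB and you quickly need an 8 GB device.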
pkuba208 t1_jcx3d9i wrote
Reply to comment by Art10001 in [Research] Alpaca 7B language model running on my Pixel 7 by simpleuserhere
Well... I run this model on a Raspberry Pi 4B, but you will need AT LEAST 8 GB of RAM
pkuba208 t1_jcvmhhm wrote
Reply to comment by 1stuserhere in [Research] Alpaca 7B language model running on my Pixel 7 by simpleuserhere
Depends on the hardware
pkuba208 t1_jcy83gf wrote
Reply to comment by Art10001 in [Research] Alpaca 7B language model running on my Pixel 7 by simpleuserhere
I use swap too. For now, it can only run on flagships though. You need at least 8 GB of RAM, because running it on, say, 3 GB of free RAM (with another 3 GB used by the system) plus 3-5 GB of swap may not even be possible, and if it is, it will be very slow and prone to crashing
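For reference, on a Raspberry Pi 4B the swap setup mentioned above can be done with a plain swap file. A minimal sketch, assuming root access and an illustrative 4 GB size (on Android, by contrast, adding swap generally requires a rooted device):

```shell
# Create and enable a 4 GB swap file on a Raspberry Pi (size is illustrative)
sudo fallocate -l 4G /swapfile   # reserve the file
sudo chmod 600 /swapfile         # restrict access, required by swapon
sudo mkswap /swapfile            # format it as swap space
sudo swapon /swapfile            # activate it
free -h                          # verify the new swap shows up
```

Keep in mind that swap on an SD card is far slower than RAM, which is a big part of why swap-heavy setups end up slow and crash-prone.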