Submitted by simpleuserhere t3_11usq7o in MachineLearning
pkuba208 t1_jcvmhhm wrote
Reply to comment by 1stuserhere in [Research] Alpaca 7B language model running on my Pixel 7 by simpleuserhere
Depends on the hardware
Art10001 t1_jcwg5zg wrote
You can really see how phones beat 10-year-old computers, as shown by their Geekbench 5 scores.
pkuba208 t1_jcx3d9i wrote
Well... I run this model on a Raspberry Pi 4B, but you will need AT LEAST 8 GB of RAM.
Art10001 t1_jcy2sb5 wrote
Raspberry Pi 4 is far slower than modern phones.
Also, somebody else was saying it probably actually uses 4-6 GB.
pkuba208 t1_jcy717u wrote
I know, but Android uses 3-4 GB of RAM itself. I run it myself, so I know it uses 6-7 GB of RAM on the smallest model currently, with 4-bit quantization.
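The numbers above line up with a back-of-the-envelope estimate: a 7B-parameter model at 4 bits per weight needs roughly 3.5 GB just for the weights, before any runtime overhead. A minimal sketch (the overhead figure is an illustrative assumption, not a measurement):

```python
# Rough RAM estimate for a quantized language model.
# overhead_gb (KV cache, buffers, runtime) is an assumed figure for illustration.

def model_ram_gb(n_params: float, bits_per_weight: float, overhead_gb: float = 0.0) -> float:
    """Weight memory in GB plus a rough allowance for runtime overhead."""
    weight_bytes = n_params * bits_per_weight / 8
    return weight_bytes / 1e9 + overhead_gb

weights_only = model_ram_gb(7e9, 4)        # ~3.5 GB for the weights alone
with_runtime = model_ram_gb(7e9, 4, 1.0)   # ~4.5 GB with an assumed ~1 GB overhead
```

Add the 3-4 GB the OS itself uses and you arrive at the 8 GB flagship requirement discussed in this thread.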
Art10001 t1_jcy7rqs wrote
Yes, that's why it was tried on a Pixel 7, which has 8 GB of RAM and maybe even swap.
pkuba208 t1_jcy83gf wrote
I use swap too. For now it can only run on flagships, though. You have to have at least 8 GB of RAM, because running it on, say, 3 GB of RAM (with another 3 GB used by the system) plus 3-5 GB of swap may not even be possible, and if it is, it will be very slow and prone to crashing.
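On a stock Linux box like the Raspberry Pi, adding the swap mentioned above is a few standard commands. A sketch using a plain swap file (the path and 4 GB size are example choices, not requirements):

```shell
# Create and enable a 4 GB swap file with standard Linux tools.
# /swapfile is an example path; pick a size that fits your SD card or disk.
sudo fallocate -l 4G /swapfile
sudo chmod 600 /swapfile   # swap files must not be world-readable
sudo mkswap /swapfile      # format it as swap space
sudo swapon /swapfile      # activate it for this boot
free -h                    # verify the new swap shows up
```

Note that swapping model weights to an SD card is slow, which is part of why low-RAM devices crawl or crash.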
1stuserhere t1_jcxyj1o wrote
Pixel 6 or 7 (or other modern phones from the last 2-3 years)
pkuba208 t1_jcy7nxg wrote
It should be faster than 1 word per second. Judging by the fact that modern PCs run it at 5 words per second and a Raspberry Pi 4B runs it at 1 word per second, it should land somewhere around the 2.5 words per second mark.
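The ~2.5 words/s guess is an interpolation between the two data points given in the thread. One simple way to place a device between a fast and a slow reference is the geometric mean, which here lands close to that estimate (purely illustrative arithmetic, not a benchmark):

```python
# Interpolating phone throughput between the thread's two reference points:
# a desktop PC at ~5 words/s and a Raspberry Pi 4B at ~1 word/s.
pc_wps = 5.0
pi_wps = 1.0

# Geometric mean as a crude midpoint on a multiplicative performance scale.
phone_estimate = (pc_wps * pi_wps) ** 0.5  # ~2.24 words/s, near the ~2.5 guess
```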