duffmanhb t1_ja2nzfa wrote
Reply to comment by Z1BattleBoy21 in Meta unveils a new large language model that can run on a single GPU by AylaDoesntLikeYou
Siri was exclusively cloud-based for the longest time. They only brought basic functions over to local hardware.
Z1BattleBoy21 t1_ja2qcli wrote
I did some research and you're right. I based my claim on some Reddit threads saying that Apple wouldn't bother with LLMs as long as they couldn't be processed on local hardware due to privacy. I retract the "required" part of my post, but I still believe they wouldn't go for it due to [1] [2](https://www.theverge.com/2021/6/7/22522993/apple-siri-on-device-speech-recognition-no-internet-wwdc)