We’ll see. To date there’s no locally runnable generative LLM that comes close to the gold standard, GPT-4. Even coming close to GPT-3.5-turbo counts as impressive.
We only recently got on-device Siri, and it still isn’t always on-device if I understand correctly. So the same level of privacy that applies to in-the-cloud Siri could apply here.
My on-device Siri that lives in my Apple Watch Series 4 is definitely processing everything locally now. She got dumber than I am.