- cross-posted to:
- [email protected]
- [email protected]
- [email protected]
Apple wants AI to run directly on its hardware instead of in the cloud::iPhone maker wants to catch up to its rivals when it comes to AI.
Just because certain requests don’t work offline doesn’t mean Google isn’t actually running models locally for many other requests.
My Pixel isn’t new enough to run Nano. What are some examples of offline processing not working?
I wouldn’t be surprised if the handoff between Pro and Nano were intermingled for certain requests: some work done in the cloud and some done locally for speed. But if the internet is off, they kill the request entirely because half of the required platform isn’t available.
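A minimal sketch of the kind of hybrid pipeline being speculated about here. Every name in it (`HybridAssistant`, `runLocally`, `runInCloud`, and so on) is hypothetical and does not correspond to any real Google or Android API:

```kotlin
// Hypothetical illustration of a hybrid on-device/cloud request pipeline.
// None of these types or functions map to real Google or Android APIs.

sealed class StageResult {
    data class Ok(val text: String) : StageResult()
    object Unavailable : StageResult()
}

class HybridAssistant(
    private val networkAvailable: () -> Boolean,
    private val runLocally: (String) -> String,   // small on-device model (assumed)
    private val runInCloud: (String) -> String    // larger server-side model (assumed)
) {
    // Some requests need both halves of the pipeline; if the cloud half is
    // unreachable, the whole request is rejected up front, even though the
    // local half could have run on its own.
    fun handle(request: String, needsCloudStage: Boolean): StageResult {
        if (needsCloudStage && !networkAvailable()) return StageResult.Unavailable

        val draft = runLocally(request)                  // fast, private first pass
        val final = if (needsCloudStage) runInCloud(draft) else draft
        return StageResult.Ok(final)
    }
}
```

On a design like this, turning off the internet would kill even requests whose heavy lifting could have stayed on-device, which would match the behaviour described above.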
Yeah? It does.
What a thought-provoking reply.
I dunno what you expect me to say. It’s not complicated.
You’re really going to say that Google isn’t doing anything locally with Tensor? That’s just silly.
No, that is not what I said.