this post was submitted on 15 Sep 2025
13 points (84.2% liked)
Apple
you are viewing a single comment's thread
view the rest of the comments
Er, you can choose to use GPT online or limit it to local models, and they’re currently testing alternative models from other vendors.
What OpenAI allowed was for Apple to run their own instance of GPT that doesn’t call home to OpenAI.
That said, for chatbots I still only use them in a browser; any “AI” integrated on my devices has to be local-only, with me having control over the selected model.
Thing is, the “local” iOS models are really bad because (last I checked) the iPhone just doesn’t have enough RAM to hold genuinely useful ones, and Apple’s models aren’t even great for their size.
Standard practice in literally every other service is to use the OpenAI API format, which means the backend can be anything: Google, OpenRouter, Hugging Face, Cerebras, your laptop, or a homelab.
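To make that concrete, here’s a minimal stdlib-only sketch of why provider-swapping is trivial with the OpenAI-compatible wire format: the request body and path are identical everywhere, so only the base URL (and key) change. The specific base URLs below (OpenRouter’s hosted endpoint, a local Ollama server) are illustrative assumptions, not the only options.

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, model: str, messages):
    """Build an OpenAI-style /chat/completions request for any backend."""
    body = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

msgs = [{"role": "user", "content": "hi"}]

# Same payload, two different hosts: a hosted aggregator vs. a local
# server exposing the compatible endpoint (model names are placeholders).
hosted = build_chat_request("https://openrouter.ai/api/v1", "sk-...", "gpt-4o", msgs)
local = build_chat_request("http://localhost:11434/v1", "none", "llama3", msgs)

# urllib.request.urlopen(local) would actually send it; note that the
# only difference between the two requests is the host portion of the URL.
```

The point is that any client speaking this format can be repointed at a homelab or laptop just by changing one string, which is exactly the flexibility the comment above is describing.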