Apple's week, April 25: Apple's intelligence

There's admittedly not much to discuss this week, but two major stories caught my eye. Apple seems to be shifting its thinking around artificial intelligence, and someone got bitten by Apple's increased device security being a little too good.
There's no need to dive into the continued uncertainty around tariffs and exemptions. The world is on fire, and we don't need to belabor that point here.
Apple Intelligence training
Apple revealed that iOS 18.5 will introduce a new feature aimed at helping train Apple Intelligence using on-device user data without sacrificing privacy. It's all opt-in and keeps the user's data securely on their iPhone, while helping refine Apple Intelligence's generated responses to be more accurate and human-like.

That is all well and good, and I'm sure many reading this will choose to opt out. However, developers stumbled onto a strange notification while submitting bugs on the beta. It seems developers have no choice but to submit their diagnostic data to train Apple Intelligence.
I expect there is an option to opt out somewhere, just not in the dialog. This is a beta, so we'll see how that shakes out. Apple can always clarify things and make it more obvious why the prompt exists and whether there are options.
These changes are occurring at the same time as an internal reorganization of John Giannandrea's AI/ML teams. Siri was previously moved out from under him to Rockwell's team, and now we've learned that the robotics team is moving to hardware under John Ternus.
There are two ways this could go. Either Cook is truly trying to ensure Giannandrea's team isn't spread too thin by moving the extra cruft out from under him, or he's gearing up to dismantle the entire AI/ML division.
If the first is true, then Giannandrea's team will work on Apple Intelligence as an ecosystem-wide effort, helping develop models for every part of the ecosystem. That's plausible, but it also seems like how Apple has approached AI so far, and I'm not sure that's the best way to handle developing the technology.
Sure, there should be a central AI core with a team in charge of it, perhaps Giannandrea's, but I don't think his team should be developing every model for every use case. That again runs into the problem of spreading too thin.
So, there's a chance the second possibility is true: Apple strips the AI/ML organization down to a minimum, or to nothing at all. That would leave each department in charge of developing its own models on an as-needed basis.
I believe this is the better solution because it means more tools, shipped more often, and more specialized for their purpose. As we've heard with Apple's push to get AI into Health, we could see other parts of the company invest in models that help users by drawing on each department's specific expertise.
Approaching AI more specifically and less generally will benefit users. And that will put Apple further ahead of the competition, considering it is the only company offering on-device, secure, and private AI that doesn't train on user data.
Advanced Data Protection
A terrible situation unfolded after a woman's iPhone was stolen by a thief who had learned her passcode. Using only the device passcode, the thief was able to turn on Advanced Data Protection, then change the Apple Account password and set a recovery key.

This resulted in a total loss of data; the only thing Apple could do in this situation was delete the account and its contents. The woman is suing Apple for $5 million in losses, but I don't believe the suit will succeed.
These kinds of situations shouldn't happen.
Apple did make it nearly impossible for thieves to make these kinds of changes to an Apple Account, but users have to enable the feature, called Stolen Device Protection. It ensures that someone who knows the device passcode can't change the Apple Account password without an hour-long delay and biometric authentication before and after it.
The person in this unfortunate situation didn't have the feature enabled; otherwise, she could at least have regained control of her Apple Account. Sure, the thief could have enabled Advanced Data Protection regardless of the Stolen Device Protection setting, but she would still have been able to log in from another device and shut it off.
I think Apple should move the toggle for Advanced Data Protection under the Stolen Device Protection umbrella. That way, thieves couldn't turn the feature on, even though the rightful user could still later turn it off. It just doesn't make sense to have such an important feature protected only by the device passcode.
An odd week
I've been dealing with a weird knee injury that has left me unable to do strenuous tasks like working out or walking very far. So, when Apple announced a 10-year anniversary award in Apple Fitness for closing your rings, I was worried I wouldn't be able to get it done.
Luckily, after adjusting my minimums in the Fitness app, I was able to close my rings even without being able to move much. Too bad I'm not close enough to a physical Apple Store to get the enamel pin.
I've been working hard to do a different workout every day since January, so it has been annoying to sit them out for the last couple of weeks. I'm excited to get back to it soon.
