Apple's week July 25: AI grift

It's been a while since I've done my weekly check-in here. I wrote on July 4th, then it was my birthday, then we traveled to Virginia Beach to visit friends, then we stayed in a cabin away from internet access. It's been a busy month.
I managed to keep doing the AppleInsider Podcast in between everything, so those who follow me here and there didn't miss much. I even managed to appear on HomeKit Insider after a long hiatus.
Apple Music had a birthday, more rumors poured in about the next Apple Vision Pro, and I published a long rant on Apple's place in artificial intelligence.
So, yeah, even though I've been away from the blog, I've been busy with the usual work stuff. Apple hasn't slowed down either. The first public betas for iOS 26 and the rest have arrived.
A lot and a little has happened, which is the usual for the summer. Let's dive into some of the stories from this week.
Apple Intelligence has a future
I don't know who needs to hear this, but Apple Intelligence isn't a chatbot, it isn't competing with ChatGPT, and it isn't trying to replace humans. Instead, it's a large language model that's been broken up into different agents for text, images, and notifications.

There's this idea that Apple is behind in artificial intelligence, but a lot of the arguments start and stop with that statement. The people saying these things have an idea of what Apple Intelligence should do, but never really consider what it was built for.
I have no doubt that Apple Intelligence could one day end up as a chatbot and an agent for vibe coding if Apple chose to go that route, but today, it isn't that. That isn't Apple's strategy and never has been. The goal Apple has set, and has so far achieved, is to ensure users have access to LLMs without beating them over the head with them. The idea is that users should be able to access AI without even realizing that AI is doing the job.
There's a long road ahead in the plans Apple has set forth, but the ultimate result sounds really promising. It seems Apple Intelligence will serve as an on-device, private, and secure model for everyone, while Apple also works with other companies to bring their models to Private Cloud Compute.
The concept is interesting, and I don't believe anyone else in the space is doing this. It could mean Apple will have the only AI ecosystem that's private, secure, ethical, and green, even when third parties are involved. Sure, Samsung might bring multiple agents to the Galaxy line, but they'll just be sucking in data and running on third-party servers.
Apple Vision Pro with M5
It seems Apple might be upgrading the Apple Vision Pro with an M5 processor this fall. If that does happen, it fits my previous estimate that Apple would take at least 18 months, if not two years, before updating the base hardware.

It's not ideal that Apple isn't going to do more than a spec bump, but it makes sense. I wonder if we'll see an R2 chip with better tracking and room analysis, but that hasn't been rumored yet.
Depending on what's announced and what my work requires, I'll decide whether to get the next Apple Vision Pro. Whatever it is, I'll be surprised if Apple calls it a second generation. A new design that's lighter and faster is due in 2027 or so, and that'll open the door to all-day wearability.
It's interesting seeing people react to the platform. Sure, it isn't selling in massive volumes, but Apple hasn't abandoned visionOS. It is the future of computing, and I'm certain we'll see spatial computing in more products in the next decade as Apple figures out what happens after iPhone.
A full redesign of iOS
There's been a lot of commentary around iOS 26 and Liquid Glass. I'm personally excited by the redesign, but I'm aware that there are a lot of folks out there who hate it. The accessibility side of this isn't ideal. That much I can acknowledge, but I don't think it's a failure of the new design.

Apple needs to provide accessibility settings that let users navigate the new and complex design introduced with Liquid Glass. There are a lot of areas where, when everything aligns just wrong, tab bars and other elements become illegible and unidentifiable. It's not great.
However, I'm not calling for anyone to get fired. Instead, I trust that Apple is listening to feedback and refining the system. Over time, I expect this new interface will stand out as a unique and opinionated design structure, and one that isn't easily replicated by competitors.
Since the public beta dropped, people I know who aren't tech nerds, who couldn't care less about Apple and its business, have commented on how cool they think the new Liquid Glass material is. That's anecdotal, but it shows not everyone is doom and gloom here.
I don't know why so many people I follow expect Apple to have everything perfect before the beta period is even over. iOS 26 is buggy, but no more so than other betas. I don't understand the hate; at least Apple is trying to differentiate in a world full of flat design.
Apple's next CEO
It's incredible that there are still people out there who dislike Tim Cook. Jeff Williams may be on his way out, but there's not really a brain drain at Apple. Whoever replaces Cook by 2030 will be a fine choice, vetted to the extreme. It's not something I'm concerned about today.
In the meantime, Apple will continue to establish itself as a powerful technology company that excels in user health, privacy, and security. We'll see the iPhone and the visionOS platform evolve over time, too. Apple is in no danger of faltering in the near term, or in the next decade, for that matter.
I'm excited for Apple Vision Pro, Apple's involvement in AI, and what's next for Liquid Glass in iOS. Whatever happens, it'll still be the company I root for among the rest that just don't care. It's always odd seeing everyone hate on Apple when it's still miles ahead of everyone else in the department of giving a shit.
It's time we move beyond the Apple distortion field.
