iOS 26 Is Basically Android’s Greatest Hits From Five Years Ago


It took real restraint not to laugh out loud when Apple introduced several “new” features for iOS 26, many of which have been mainstays of Android phones for years. It reminded me of those promos NBC used to run showcasing upcoming reruns of its sitcoms with the line, “If you haven’t seen it, it’s new to you.” Apple’s always had some of that in its presentations, but the WWDC 2025 keynote was particularly comical.

Apple Intelligence is back, and it’s oozing its way through the iOS 26 user experience. If you need a good laugh, here are the top features Apple added that have been on your favorite Android phone for years.


iOS 26 will help cut down on distractions

I can finally know who is calling


Call screening is a familiar feature to many Android users, especially those who enjoy Pixel smartphones. It will screen out spam calls by answering for you and determining what the person (or computer) on the other end wants. You can read their response and choose whether to take the call. It’s like having a personal assistant to filter out nonsense spam, but it also gives you peace of mind that if someone important calls from an unknown number, you won’t miss it.

It’s a valuable feature, and it debuted back on the Google Pixel 4 in 2019. It’s taken Apple six years to develop an equivalent, a long time even by Cupertino standards. Thankfully, it only took Apple five years to add Hold Assist, which lets your phone wait on hold for you and alerts you when you’re prompted to take action or an agent returns. Pixel devices have offered the same capability, Hold for Me, since the Pixel 5. Apple has been behind on AI from the start, but seeing just how far behind is astonishing, especially considering the impressive raw power of the company’s A-series chipsets.


Remember a restaurant you went to or when you last went to the movies

iOS 26 will keep track of your movements


Apple introduced Visited Places during its keynote event, a feature that will keep track of locations you’ve been to for reference. If you forget a restaurant or store you’ve shopped at, your iPhone with iOS 26 will have a log for you to check. I don’t love the idea of my iPhone tracking my every movement, but it’s going to happen anyway, so I might as well get some utility from it. I don’t know how Apple Intelligence will factor in, but if you’ve used an Android in the last decade, your phone has kept track of where you’ve been.

Google introduced Location History with Google Maps in 2012, and it’s been part of Android phones for as long as I can remember. My Samsung Galaxy S3 tracked my location and made note of the places I’d been, so I was hardly impressed when Apple announced the equivalent would arrive with iOS 26 in 2025. I’m accustomed to Apple taking longer to implement capabilities that Android phones have had for years, but thirteen years is excessive, even for Apple. It’s odd to highlight such a gap, but it’s further evidence that iOS has been slipping for years, and once you take away its shield of reliability, there’s little else left.


Your iPhone will be able to see your environment

Search for items in your surroundings through your camera


Visual Intelligence in iOS 26 enables you to interact with your surroundings. You can ask Apple Intelligence to search for an item on your screen, or you can ask it to research something you’re looking at with your camera. If you’re unsure about an object in front of you, or you’d like to save information for later, you can pull up your camera and have Visual Intelligence identify and explain what it sees. If you see a poster for an upcoming concert, Visual Intelligence can parse the written information and add a calendar event. There are practical applications for Apple Intelligence interacting with what we see, but it’s nothing we haven’t experienced before on Android.

Google Lens has been available on Pixel devices since 2017, and it’s now widely available on Android (and iOS). I recognize Visual Intelligence has advantages over Google Lens, but at best, it’s a hybrid between Lens and Project Astra, providing context and information about your surroundings. There are even plenty of areas where I’d prefer a device with Project Astra, since Google’s AI model is a more active participant, bringing me information it thinks I might need instead of waiting for queries.


It’s never been so clear

I still believe Apple retains hardware advantages over Android smartphones, but it’s hard to argue that iOS isn’t significantly behind Android on software. Google is pushing innovation harder, and its AI roadmap is clear, even if I don’t see the value just yet. I’ve been using Macs since the 1990s, so it’s painful to see a company I grew up with in such bad shape. Steve Jobs used to claim the iPhone was five years ahead of any other device. With iOS 26 introducing many features that Android has had for years, it appears Google has flipped that around, especially when I can get software features on a midrange Pixel that I can’t on a flagship iPhone.