iOS 13 brings several major changes to the iPhone, iPod touch, and iPad. Yes, the iPad now runs iPadOS, but for right now that’s just a thin name on top of a very big shared stack.
Here are some of the biggest changes in iOS 13 and iPadOS, how they work, and what they could mean for you and your workflows.
Dark Mode
Dark Mode is going to be on every list from every list-maker. How can it not be? Anytime any platform announces it, it gets the loudest cheers this side of emoji.
But, just like macOS last year with its subtle, sampled tints and accents, Apple is doing Dark Mode smart.
First, they’ve established semantic, dynamic colors. So, rather than hard-coding colors (for example, making a background RGB 0 0 0 black and label text RGB 255 255 255 white, which would break the instant you switch modes from light to dark), you simply refer to them as, in the example case, systemGroupedBackground and label, respectively. And a blue icon isn’t RGB 10 132 255 anymore but systemBlue.
That way, when you switch from light to dark or back, the interface isn’t stuck on the same color but picks the appropriate color each time.
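In UIKit terms, that looks something like this. A minimal sketch using the semantic colors iOS 13 actually ships; the cell class itself is just for illustration:

```swift
import UIKit

final class ProfileCell: UITableViewCell {
    override init(style: UITableViewCell.CellStyle, reuseIdentifier: String?) {
        super.init(style: style, reuseIdentifier: reuseIdentifier)
        // Semantic colors resolve to the right value for the current
        // light or dark appearance; no hard-coded RGB needed.
        backgroundColor = .systemGroupedBackground
        textLabel?.textColor = .label
        imageView?.tintColor = .systemBlue
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}
```

Because the colors are dynamic, the cell re-renders correctly when the system switches appearance; there is no mode-change code to write.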
If you’re familiar with web design, think of it as something closer to CSS variables. And what’s cool is that this sets up apps to work not just across a potentially wider
range of themes but across a wider range of platforms. For example, Catalyst apps on the Mac will get the same light and dark mode
support right out of the box. Or, rather, the checkbox. There’s also a well-thought-out hierarchy,
from absolute white or black backgrounds to increasingly darker shades for light themes and lighter shades for dark themes,
so the visual hierarchy is always clear. In other words, you can tell how many cards are stacked on top of each other by the shade of each card.
Same with text. The primary text is pure white on black, the inverse of backgrounds, and as you move towards secondary and tertiary text, the color moves across the spectrum of gray.
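That text and background hierarchy is exposed as semantic colors too. A quick sketch of the stock tiers:

```swift
import UIKit

// The built-in tiers step through the gray spectrum automatically,
// in both light and dark appearance.
let title = UILabel()
title.textColor = .label              // primary: strongest contrast

let subtitle = UILabel()
subtitle.textColor = .secondaryLabel  // one step toward gray

let footnote = UILabel()
footnote.textColor = .tertiaryLabel   // weaker still; .quaternaryLabel is weakest

// Backgrounds have matching tiers for the "stacked cards" effect:
let card = UIView()
card.backgroundColor = .secondarySystemBackground  // one layer up from .systemBackground
```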
Same with controls as well. Dark mode doesn’t just invert light mode.
A white button doesn’t become a black one. A white button with light gray states becomes a medium gray button with dark gray states.
It can even apply to images, so, for example, a header with a daytime skyline in light mode can switch to a nighttime skyline in dark mode.
It’s all set up so that everything just looks right, in context, maintains legibility, and gives you a visual cue to how important it is, no matter which state you’re in.
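Those appearance-aware images are usually set up in the asset catalog, but you can also build one in code with UIImageAsset. A sketch, assuming "skyline-day" and "skyline-night" images exist in your bundle (both names are made up for illustration):

```swift
import UIKit

// Register a light and a dark variant under one dynamic image.
// "skyline-day" / "skyline-night" are hypothetical asset names.
let asset = UIImageAsset()
if let day = UIImage(named: "skyline-day") {
    asset.register(day, with: UITraitCollection(userInterfaceStyle: .light))
}
if let night = UIImage(named: "skyline-night") {
    asset.register(night, with: UITraitCollection(userInterfaceStyle: .dark))
}

// Resolving against the current traits picks the matching variant,
// and the variants swap as the appearance changes.
let header = UIImageView(image: asset.image(with: UITraitCollection(userInterfaceStyle: .light)))
```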
Dark Mode even gets its own materials. In other words, the blur effects that help separate content from controls, and the vibrancy that makes them seem real and alive.
I still really, really wish shadows would come back to iOS but, absent that, Apple’s done a good job defining spaces and relationships in a dark mode world.
And, for people like me, who find dark mode oppressive and straining during the day but blessedly subdued and calming at night, the automatic transition switch is absolutely aces.
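If an app does need to react to that automatic switch beyond colors (say, to redraw custom-rendered content), the hook is traitCollectionDidChange. A minimal sketch:

```swift
import UIKit

final class ChartView: UIView {
    override func traitCollectionDidChange(_ previousTraitCollection: UITraitCollection?) {
        super.traitCollectionDidChange(previousTraitCollection)
        // Only redraw when the light/dark appearance actually changed,
        // not for every trait change (size class, content size, etc.).
        if traitCollection.hasDifferentColorAppearance(comparedTo: previousTraitCollection) {
            setNeedsDisplay()
        }
    }
}
```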
Sign in with Apple
It’s especially important on iOS, though, not just because of the sheer number of iPhone users but because it fits the iPhone model, which demands speed and convenience, so damn perfectly.
Here’s how it works. Download an app, like a new game, and if it offers sign-in with Google or Facebook,
it has to offer Sign in with Apple as well. Yeah, that’s totally Apple being a bully, but in totally the best-for-users way.
If the game doesn’t care about your data and just wants you in and playing as fast as possible, it can do just that. Tap and go. If it does want your data, like your name and email, Sign in with Apple will give it your verified Apple ID name and, if you’re ok with it, your verified Apple ID email address as well.
If you’re not OK with it, Sign in with Apple will create a burner address for you, random and anonymized, that you can reply to as and when needed, but also revoke at any time, just for that app.
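On the developer side, the request looks roughly like this with the AuthenticationServices framework. A sketch, not a full flow; the presentation plumbing is elided:

```swift
import AuthenticationServices

final class SignInCoordinator: NSObject, ASAuthorizationControllerDelegate {
    func startSignIn() {
        let provider = ASAuthorizationAppleIDProvider()
        let request = provider.createRequest()
        // Only ask for what you actually need; the user can still
        // choose to hide their email behind a relay address.
        request.requestedScopes = [.fullName, .email]

        let controller = ASAuthorizationController(authorizationRequests: [request])
        controller.delegate = self
        controller.performRequests()
    }

    func authorizationController(controller: ASAuthorizationController,
                                 didCompleteWithAuthorization authorization: ASAuthorization) {
        guard let credential = authorization.credential as? ASAuthorizationAppleIDCredential else { return }
        // credential.email may be a private relay address the user can revoke.
        print(credential.user, credential.email ?? "email hidden")
    }
}
```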
And Apple never sees or retains any of these emails. This also means companies like Facebook, which try to build shadow profiles on us by connecting everything around our email addresses, are out of luck.
Sign in with Apple will even check Keychain to make sure you don’t already have an account, for example, if you downloaded Fortnite and didn’t realize it was from Epic, but already had an Epic account.
And if you do have an existing account, it’ll just prompt you to log in rather than creating a duplicate account, and that way you don’t lose any in-game currency or benefits or whatever attached to your existing account.
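That existing-account check can be wired up by asking for saved password credentials alongside the Apple ID request. A sketch, with the response handling elided:

```swift
import AuthenticationServices

// Ask for an Apple ID authorization AND any saved keychain
// password credential for this app's associated domains.
// If a saved account exists, the system prompts a sign-in to it
// instead of silently creating a duplicate.
func requestExistingOrNewAccount(delegate: ASAuthorizationControllerDelegate) {
    let requests: [ASAuthorizationRequest] = [
        ASAuthorizationAppleIDProvider().createRequest(),
        ASAuthorizationPasswordProvider().createRequest()
    ]
    let controller = ASAuthorizationController(authorizationRequests: requests)
    controller.delegate = delegate
    controller.performRequests()
}
```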
It’s super clever. It’ll also, on the device, check several signals including how long you’ve been using the App Store, for example, and then flip a trusted bit between true and false and send that on to the developer.
If it’s true, it means Apple trusts you’re a real person and the developer can give you the red carpet treatment and doesn’t have to make you jump through hoops to validate yourself before you start using the app.
If it’s not true, they can then go ahead with validation, just in case you’re a bot or a farmed account or whatever.
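That trusted bit surfaces to developers as the credential’s real user indicator. A sketch of how an app might branch on it:

```swift
import AuthenticationServices

// realUserStatus is Apple's on-device "is this a real person?" signal.
func onboard(with credential: ASAuthorizationAppleIDCredential) {
    switch credential.realUserStatus {
    case .likelyReal:
        // Apple trusts this is a real person: skip the extra hoops.
        print("Red carpet: straight into the app")
    case .unknown:
        // No determination: fall back to your usual bot checks.
        print("Run standard validation (CAPTCHA, email confirm, etc.)")
    case .unsupported:
        // The platform can't make the call at all.
        print("Run standard validation")
    @unknown default:
        print("Run standard validation")
    }
}
```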
For us it means fewer passwords to remember and, because it uses Apple’s two-factor and Face ID or Touch ID authentication, better security as well as privacy, and almost transparent convenience.
And the developers I spoke to at the event seemed to love it too because it gives them all the advantages of a single sign-on system without them having to make a deal with the data exploitation devils and serve up their users to get it.
Siri & Machine Learning
John Giannandrea leaving his job as head of search and AI at Google in order to work on ethical machine learning at Apple,
and subsequently becoming a senior vice president with his own AI org, is probably going to go down in history as the biggest hire since Johny Srouji began heading up silicon, and we’ve all seen how that’s been working out.
It’s going to start slowly but it’s also going to snowball. This year, there’s a whole bunch of Siri and Siri-adjacent stuff coming in iOS 13,
including full-on Voice Control, which I talked about in my iPadOS video — link in the description — and new, conversational, automation-integrated Siri shortcuts, which I’ll talk about in a video later this week.
Siri’s even getting a new voice. And yeah, it’s hella ironic that to sound more human Siri has to become more synthetic, but that’s AI.
When Siri gets a request for local data, like contacts, rather than having to anonymize and tokenize that data to preserve privacy while operating on it in the cloud,
it just bounces the request back to your local device so neither you nor Apple ever has to worry about where or how your personal information is being stored or used. Which is phenomenal.
Siri is also getting a couple of new SiriKit intents. Maps, for one, so you can use Siri with Waze or Google Maps. Audio, for another, so, as long as developers implement it, you’ll be able to use Siri with everything from Overcast to Audible, Pandora to Spotify.
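For the audio side, adopting it means handling the new media intents. A minimal sketch of an intents-extension handler; the actual playback hand-off would live in your app:

```swift
import Intents

// Handler for "Hey Siri, play X in MyPodcastApp"-style requests.
final class PlayMediaHandler: NSObject, INPlayMediaIntentHandling {
    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        guard let item = intent.mediaItems?.first else {
            completion(INPlayMediaIntentResponse(code: .failure, userActivity: nil))
            return
        }
        print("Would start playback of: \(item.title ?? "unknown")")
        // .handleInApp hands playback off to the main app in the background,
        // so audio keeps going without launching the UI.
        completion(INPlayMediaIntentResponse(code: .handleInApp, userActivity: nil))
    }
}
```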
It’s a bummer it doesn’t work with video yet, because I’d love the same for Netflix or Nebula, but add this to the watchOS announcements around SwiftUI native apps and streaming audio, and Spotify has almost nothing left on its victimy little list to complain about. Revenue sharing aside (which, yeah, is a huge aside), in terms of implementation it’s all on them now.
What Apple’s doing here is pretty clever as well. The biggest hurdle to Sirikit for media has always been… the media. Siri has to be able to tell what content is available so it can cleanly separate the request from what’s being requested, especially in multiple languages.
Some other assistants lock you into specific grammar patterns, which can be awkward. For Apple Music, Apple just brute-forces the catalog, which is beyond laborious.
Thanks to that, whatever overlaps with Apple’s catalog will just work. In other words, where the content is the same, everyone gets that for free. But no catalog overlaps entirely.
So, for content that isn’t in Apple’s catalog, SiriKit is going to pull and front-load your most frequently and recently played content. That cuts down the overhead significantly in most cases.
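Apps can help that recognition along by donating their own terms through the shared vocabulary API. A sketch; the playlist names are made up, and a real app would pull its own most-played items:

```swift
import Intents

// Donate the user's most-played titles so Siri can recognize them
// when separating the request from what's being requested.
// These strings are hypothetical examples.
let topPlaylists = NSOrderedSet(array: [
    "Friday Workout Mix",
    "Late Night Synthwave"
])
INVocabulary.shared().setVocabularyStrings(topPlaylists, of: .mediaPlaylistTitle)
```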
There’s a bunch of other super-cool, Siri-adjacent stuff coming as well, including Message Announce for AirPods, multi-user for HomePod, and I think some stuff that hasn’t even made it into any of the announcements yet. I’ll get a video up on all of that as soon as possible as well.
Maps
Apple launched its own Maps back with iOS 6 and, because it had no first-party data and no real idea how Maps was supposed to work,
the result was a poorly integrated, poorly sanitized, poorly cleansed mish-mash of different services, resulting in… bad results. And also missing features like transit and street view.
Over the years, though, Apple worked to improve and expand them, adding back transit, and for the last little while, driving, flying, and hiking their way across the U.S. and other countries to make what they claim will eventually be the best Maps in the world.
The U.S. should be done by the end of this year, Canada and some other countries by the end of next, and then onward to everywhere they can safely, legally map.
They’re crowd-sourcing more than ever,
but maintaining privacy by refusing to track your beginning and end points and throwing away random segments in between.
And with iOS 13, they’re introducing a street-level view called Look Around. It’s the visual information you want with the video-game-like performance you never knew you wanted.
And it’s got data layers so that you can get more information on whatever you see if and when you want to.
There are no AR maps yet, which still seems strange to me given Apple’s investment in both those things separately.
There are Favorites and Collections, so you can easily get to frequent destinations or places you’ve bookmarked for a trip. And there’s MapKit and MapKit JS,
along with a cool new Snapshot feature, so all of this is also available in apps and on the web.
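The web Snapshots are new, but native MapKit has long had an equivalent: a one-shot static render of a region. A minimal MKMapSnapshotter sketch; the coordinates are Apple Park, purely as an example:

```swift
import MapKit

// Render a static image of a map region without a live MKMapView.
let options = MKMapSnapshotter.Options()
options.region = MKCoordinateRegion(
    center: CLLocationCoordinate2D(latitude: 37.3349, longitude: -122.0090),
    latitudinalMeters: 1_000,
    longitudinalMeters: 1_000
)
options.size = CGSize(width: 600, height: 400)

let snapshotter = MKMapSnapshotter(options: options)
snapshotter.start { snapshot, error in
    if let image = snapshot?.image {
        print("Got a \(image.size.width)x\(image.size.height) map image")
    } else if let error = error {
        print("Snapshot failed: \(error)")
    }
}
```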
And, thanks to Voice Control, you can even use them like something straight out of Blade Runner now. Grid. Zoom. Enhance! (We just need to keep filing all the radars for Enhance.)
There’s also a great new privacy feature where, instead of always or only when using, you can now grant apps permission to use your location only once.
That’s it. That’s all. And if they keep using it, by API hook or Bluetooth crook, Apple shows you what they’re tracking so you can decide whether it’s necessary and useful or just grossly creepy, and shut them down or delete them, whatever you like.
I’m loving it so much.
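From the app side, nothing special is required to support Allow Once; the usual when-in-use request covers it, and the grant simply doesn’t persist across launches. A sketch:

```swift
import CoreLocation

final class LocationController: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    func requestLocation() {
        // With iOS 13, the user can answer this prompt with "Allow Once":
        // the app gets when-in-use authorization for this session only,
        // and the prompt reappears on the next launch.
        manager.requestWhenInUseAuthorization()
        manager.requestLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        print("Got location:", locations.last ?? "none")
    }

    func locationManager(_ manager: CLLocationManager, didFailWithError error: Error) {
        print("Location failed:", error)
    }
}
```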
Camera & Photos
Apple is almost certainly saving all the big camera updates for the iPhone 11 event and announcement but we did get a few new features and capabilities I wanted to touch on here and now.
First, there’s a new high key mono Portrait Lighting effect that gives you a black and white image on a white background instead of black.
It’s only coming to 2018 iPhones because it hits the A12 Neural Engine hard. It’s also not in the beta just yet, but when it gets added you’ll be able to adjust its intensity, like you can bokeh with Depth Control,
and Apple has modeled each Portrait Lighting effect specifically for pulling the light back or pushing it in close.
If you take a series of Live Photos, holding down on the first one now plays all of them seamlessly, one after the other, like a single long video.
Saliency was also the new geeky WWDC word of the week. And no, it has nothing to do with salt. It’s a fancy way to say relevancy and, from audio to voice to images to everything,
Apple’s machine learning is working to figure out what the most important elements are in any context and heat-map them for you.
So, for example, making sure thumbnails contain faces or figures so you can more easily tell what’s important to you at a glance.
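The same capability is exposed to developers through the Vision framework. A sketch of an attention-based saliency request:

```swift
import Vision
import UIKit

// Ask Vision where the "important" parts of an image are,
// returning normalized bounding boxes around the hot spots.
func findSalientRegions(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }
    let request = VNGenerateAttentionBasedSaliencyImageRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    do {
        try handler.perform([request])
        if let observation = request.results?.first as? VNSaliencyImageObservation {
            for object in observation.salientObjects ?? [] {
                print("Salient region:", object.boundingBox)
            }
        }
    } catch {
        print("Saliency request failed:", error)
    }
}
```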
Then, there’s something we’ve been waiting a decade for. No, not smart orientation lock that still lets landscape photos and video go landscape. We’re still waiting on that. But video editing.
Video recording came to the iPhone with the iPhone 3GS and iPhone OS 3 in 2009 and now, ten years later, so has video editing.
Now, not only can you easily rotate a video without round-tripping to iMovie, you can apply almost every kind of editing tool available to photos.
It’s great for when you just want to fix the thing you have and not, you know, create something entirely new.
iOS 13 is currently in developer beta. It’ll come to public beta in early July and ship sometime this September.