Yesterday was the 10th anniversary of the announcement of the original iPad. Tom Warren has our look back story here: Apple’s iPad changed the tablet game 10 years ago today. Tom’s piece investigates how Steve Jobs originally positioned the first iPad as a new kind of device that sits in between a phone and a laptop.
Ever since, there has been an omnipresent and often unfair expectation that the iPad would eventually supplant the MacBook. It’s an expectation that Apple itself has encouraged, from time to time. But the iPad is different from a laptop by design; its strengths and limitations encourage (and demand) different behavior.
The Processor video series, a complement to this newsletter, is in some ways an ongoing meditation on the question Apple famously asked in one iPad commercial: What’s a computer? I’m still obsessed with this question — or more specifically with questions about how computers are changing and how they’re changing us.
Sometimes these questions end up getting answered by looking at the unique ways the iPad structures its multitasking user interface. I’ve written about this numerous times and made a few videos about it. Here’s a video from last June where I looked at the user interface metaphors in iPadOS (the grammar section about three minutes in is the heart of it):
You are reading Processor, a newsletter about computers by Dieter Bohn. Dieter writes about consumer tech, software, and the most important news of the day from The Verge. This newsletter delivers about four times a week, at least a couple of which include longer essays. You can subscribe to Processor and learn more about it here. Processor is also a YouTube series with the same goal: providing smart and surprising analysis with a bit of humor. Subscribe to all of The Verge’s great videos here!
Here’s what I wrote back then, if you prefer to read instead of watch, specifically about the new three-finger gestures but it applies to lots of “unintuitive” parts of iPadOS as compared to more “intuitive” desktop interfaces:
I don’t think any user interface — whether it’s a computer or a bicycle — is the sort of thing that humans just innately understand. Nearly everything we do requires training and learning. The difference between an intuitive interface and an unintuitive one is how that learning happens.
With intuitive interfaces, you don’t notice that the learning is happening. One skill flows naturally into the next, more complex skill on a relatively easy learning curve. Take the classic desktop interface: if you step back and look, it’s actually deeply weird! It only feels normal because it’s been around for 35 years. However, it is intuitive: you learn left click, then discover right click, then see keyboard shortcuts listed. Each skill leads somewhat naturally to the next, and there are little hints that these extra tools exist all over the interface, inviting you to try them out whenever you want.
I think the way the iPad handles windows and files and multitasking is not intuitive, by my particular definition of the word. I think the root of the conceptual confusion is that the user interface mixes both spatial and temporal metaphors — I’ve made a video and written about that, too. (Jump to 5:30 here.)
I (obviously) think the iPad’s interface is fascinating in its own right. I could (and have) talk about it for hours, but rather than tuck in yet again, I want to talk about the effects of all those user experiences on us. Because if I’m honest, explaining the nuances of how it works sometimes keeps me from fully expressing why I think it’s so fascinating.
Here’s just one example. The video conferencing software we use, Zoom, isn’t allowed to keep the camera on the iPad active when it’s not the frontmost app. There are explainable reasons for this. Perhaps it’s just a result of iPadOS’ legacy plumbing, which started as a single-tasking operating system on the iPhone and has had multitasking elements bolted on piece by piece. Perhaps it’s because Apple believes that from a privacy and security perspective, a camera should never be active unless it’s in the frontmost app. Perhaps it’s both of those things and more.
If you have sat in half as many video conferences as I have, you know that you and your colleagues have developed some unwritten rules of etiquette. Sometimes (often), it’s accepted that if you’re not directly affected by the current conversation, it’s okay to split your attention between the call and something else — say email or Slack. But with an iPad, splitting your attention literally makes your face disappear from the Brady Bunch grid.
So it changes your behavior. Maybe you leave the camera off more often so people can’t tell you’re multitasking. Maybe you switch away from the conversation less often and make a real effort to be present. Maybe you pull your phone out and do stuff on your phone — literally multitasking with your body because the iPad won’t let you do it with its operating system.
If you’d only ever used an iPad, you might just think that’s how computers work. It would, in some sense, limit your imagination of what’s possible on a computer. Often when I complain about iPad limitations, it gets misinterpreted as a desire to have it work just like other computers do. That’s not it — I worry that it is subtly narrowing our sense of what computers can do without our even noticing it.
If the iPad were just that, a limited computer, I wouldn’t give it a second thought. However! In addition to being limiting, it’s also incredibly liberating. It is great to not have to worry about all the things you usually have to worry about with traditional operating systems like macOS or Windows. It’s freeing to have a device that’s fast, does so much so effortlessly, and doesn’t feel like it’s only designed to sit on top of a desk or a lap.
After ten years, you’d think we’d know exactly what the iPad is and what it can do, but we don’t. I think it’s the tensions between the limiting and liberating parts of the iPad — both of which still feel new, even now — that make it worth paying attention to.
(Speaking of things worth paying attention to — later today I’ll be taking another look at a different vision for the future of computing interfaces: web apps in the Edge browser on the Surface Pro X. I’ve written a lot about the iPad, but it’s still just one answer among many. Please keep an eye out for it on YouTube and the site.)
The latest iPad has received a significant price cut at Amazon and Best Buy, with the 128GB Wi-Fi model seeing the largest amount skimmed off the top. A sizable $100 discount brings the total to $329.99 (available at Amazon and Best Buy). The 32GB Wi-Fi iPad is $80 off, and costs just $249.99 (Amazon, Best Buy). At this point, it’s cheaper to buy this newer model of the iPad than it is to go looking for the previous generation. When has that ever happened?
If Apple won’t allow a third party to maintain an archive of its history, I hope it’s doing something to retain and maintain these videos itself. I also hope that it finds a path towards making these videos public. Not to get all “Late Capitalism” on you, but Apple the Corporation is an important part of our recent history. The company’s latent distaste for celebrating the past threatens to limit the scope of historians in the future.
If you haven’t yet, I highly recommend you seek out and watch the documentary General Magic. As a record of a place and time in tech, it’s essential viewing. It sets up so much of our current world. As with the examples above, there’s clearly an attempt to establish General Magic (and Fadell) as important to history — but in this case, it’s well-deserved.
On Friday and over the weekend I had a weird sense of nostalgia: it used to be that a new social network of note would launch every few months and there’d be a rush to secure user names. It’s been a minute since that happened, but it definitely happened with Byte, the successor to Vine.
Now to see if it has staying power or not.
Preorders are backed up to late February as of this writing. Since we don’t know how many are being made, we can’t really use that backlog as an indication of …anything really. I will say that this release is starting to remind me of a Hollywood movie — the kind where the studio puts a lot of ads behind something but doesn’t share it with movie reviewers ahead of time.
We’re creeping up on the Mobile World Congress convention in Barcelona. Expect the pace of phone leaks to pick up!
Even when you think there’s a clear-cut case of AI being good for the world, it turns out that there’s nuance to worry about, as James Vincent explains:
“There’s this idea in society that finding more cancers is always better, but it’s not always true,” Adewole Adamson, a dermatologist and assistant professor at Dell Medical School, tells The Verge. “The goal is finding more cancers that are actually going to kill people.” But the problem is “there’s no gold standard for what constitutes cancer.”
By criticizing Amazon in public, employees risk being fired — a threat received by workers who spoke out on the issue earlier this month. But those involved in this mass action hope that by coordinating their criticism, they’ll avoid such punishment.
YouTube is trying to corner the market by bringing in swaths of people via big e-sports leagues instead of relying on a few handfuls of popular streamers. Using professional leagues to drive viewership growth isn’t a new concept; YouTube is just enacting the same strategy traditional broadcasters have used in fights over rights to mainstream sports for decades.