By Leif Johnson
I’ve been using iPads for so long that my hands automatically expect to do some things when I see one, even when I’m using one as an external display for my MacBook with Apple’s new Sidecar feature. But Apple only indulges this muscle memory so far.
When I’m “running” macOS on my 12.9-inch iPad Pro with Sidecar, I can reach over to the iPad’s display and use my fingers to scroll through websites in Safari or documents in Pages. I can even use some iPadOS multi-touch gestures, and Sidecar performs these fluidly when the Mac and iPad are on the same network. If I’m just scrolling, it’s wonderfully convenient.
But if I use my finger to try to click a link on that same page? Nothing. I have to waste time either dragging my mouse pointer over to the display or picking up my Apple Pencil. This makes no sense. The technology is clearly there. And what if I tap a gigantic icon on the dock or a file on the desktop? Again, nothing.
I can’t even hold my finger down on a file or link to pull up a “right-click” menu—but I can do exactly that with a long press of the Pencil. I’m at the point where I think having no touchscreen support in Sidecar at all would be better than this unsatisfying, unintuitive teasing.
Sidecar’s design comes off as deliberate half-assing. (In a more cynical mood, I’d say it’s a ploy to get you to buy the Apple Pencil.) But at least it’s somewhat consistent. For years, Apple has argued that touchscreens clash with the Mac experience or, as Jony Ive told CNET in 2016, that they’re a feature that “wasn’t particularly useful.” And yes, I think most reasonable people agree it’d be silly to support touchscreens on something as large as the iMac Pro.
With Sidecar, though, Apple seems determined to make us think multi-touch support wouldn’t work well with something as small as a MacBook. Meh, I say. If anything, it shows how well it would work. Apple all but markets the iPad Pro as a full-on laptop these days, and the largest 12.9-inch model has roughly the same display size as the smallest contemporary MacBooks. A 12.9-inch iPad Pro running Sidecar thus provides a reasonable glimpse into how well a touchscreen MacBook would work. And that’s a relatively small screen. On a large laptop like the new 16-inch MacBook Pro, the presumed “problems” with touch input would be even less problematic.
We don’t even have to guess at how we might interact with Sidecar or touchscreen MacBooks if Apple unlocked the full range of touch gestures: the third-party Luna Display service already lets you interact with all of macOS with your fingers when you’re using an iPad as a secondary display. Just as if you were using an Apple Pencil with Sidecar, you can press links and apps and they’ll open. You can hold down on an app or file and pull up a right-click menu. You can even select whole blocks of text with a finger swipe (which means you’ll need to use two fingers if you want to scroll normally).
Until Sidecar came along, Luna Display was one of your best options for using the iPad as a secondary Mac monitor. If you want touch support, it still is.
Such features make Luna Display more satisfying to use than Sidecar, though it’s certainly not as elegant. Unlike Sidecar, you need to plug a dongle into your MacBook before it works, and even on our office’s powerful Wi-Fi network, I found the screen transitions nowhere near as fluid. With Sidecar, the iPad registers every movement so smoothly that you’d think it was jacked directly into the Mac.
As impressive as Luna Display is, here’s what scrolling quickly through an article ends up looking like. You don’t get that with Sidecar. (The red dots are my fingers activating the scroll function.)
Importantly, though, Luna Display proves that using fingers to interact with a macOS interface on laptop-sized screens isn’t the hassle Apple has been making it out to be. I can use it for all the lightweight tasks I’d expect to handle on a touchscreen laptop, whether that’s opening links, opening apps, selecting text, or simply dropping my cursor in the right spot. If Sidecar let me do these same things, I’d love it more than I already do.
I don’t think Apple grasps this simple point. It’s overthinking how people would use touchscreen laptops. Apple seems to assume users would want to rely on nothing but touch on their MacBooks, but when I see colleagues and visitors using touchscreen Windows laptops in meetings, they’re not using them for complicated tasks like clone-stamping textures in Photoshop. They’re usually not diving deep into menus, and they’re certainly not trying to recreate one of Monet’s haystacks. Instead, they’re usually standing over their laptops and quickly swiping to different parts of a page or opening files or links, saving a few seconds over what a mouse or trackpad would have taken. It’s sure a heck of a lot more convenient than the Touch Bar, which has been Apple’s only concession to touch-based interaction on MacBooks to date. And these aren’t part-time tablets or rigs aimed at creating works of art. The model I see most often is a Dell Latitude 7480, which is basically an everyday Windows “business” laptop apart from its touchscreen.
A shot from my 12.9-inch iPad Pro running Sidecar. Apparently Apple thinks pressing the Touch Bar for Safari tabs (inside the red oval) would be easier than just pressing the tabs in Safari? No.
Apple, though, has long argued that bringing touchscreen support to Macs would require some kind of big overhaul of macOS. In a 2016 interview with Wired about the Touch Bar making the Mac a “part-time touch experience,” Apple marketing chief Phil Schiller said the idea of a touchscreen Mac was “lowest common denominator thinking” because you can’t optimize the design of features like the menu bars of macOS for both mice and fingers.
“We think of the whole platform,” Schiller said. “If we were to do Multi-Touch on the screen of the notebook, that wouldn’t be enough — then the desktop wouldn’t work that way.”
With the Touch Bar itself, though, Apple proves that you needn’t expect an identical Mac experience regardless of device. The controversial strip appears on every current-generation MacBook Pro model, sure, but it doesn’t appear on the MacBook Air or on the Magic Keyboards used for iMacs, and it sure as heck doesn’t appear on the new silver-and-black keyboard designed for the 2019 Mac Pro.
I don’t think anyone’s complaining that the Mac Pro doesn’t work the same way as a MacBook Pro because it lacks a Touch Bar. The Touch Bar is a convenience for simple tasks on easily transportable machines like MacBooks, much as direct touch interaction with a display would be. And unlike the Touch Bar, multi-touch interaction wouldn’t even require us to look away from our screens. Indeed, the ever-shifting Touch Bar defeats the entire point of touch-typing.
And Sidecar itself demonstrates the superiority of direct touchscreen interaction over the Touch Bar. Whenever you pull up a Mac app in the Sidecar window running on an iPad, you’ll get the Touch Bar options you’d expect on a MacBook along the bottom or top of the screen. This makes it easy to see that the Touch Bar buttons for actions like bolding and italicizing in an app like Microsoft Word aren’t significantly larger than the same buttons in the actual Word document above them, so I see little reason why you couldn’t just press them in the document itself.
Here’s the macOS version of Word running in Sidecar. You can see the Touch Bar options at the top. They’re bigger and a little easier to touch, but they’re not THAT much bigger.
The existence of Sidecar also complicates the argument that touchscreens aren’t ergonomic for Macs, which Apple’s software engineering chief Craig Federighi made to Wired not long after last year’s WWDC.
“We really feel that the ergonomics of using a Mac are that your hands are rested on a surface, and that lifting your arm up to poke a screen is a pretty fatiguing thing to do.”
Yes, maybe this makes sense in the context of a large-screened device like an iMac. But on a MacBook—a device that’s meant to be portable, much like an iPad? That’s silly. Think of it this way—“lifting up your arm to poke a screen” represents the entirety of the iPad experience when you’re using Apple’s Smart Keyboard or any other keyboard case. And I haven’t seen Apple stop selling Smart Keyboards on that account.
(This is also a good spot to address the idea that Apple would never design a MacBook that allows you to grub up the pretty screen with your fingerprints: Doesn’t the entire iPad and iPhone experience involve that? At any rate, my MacBook’s screen seems prone to getting dirty even when I don’t touch it.)
In the same interview, Federighi said he regards all the touchscreen laptops out there as “experiments.”
“I don’t think we’ve looked at any of the other guys to date and said, how fast can we get there?” he said.
But what is the Touch Bar, if not an experiment? And if it is, it’s a failed one. Apple’s boldest ideas tend to be trendsetters despite initial protests—consider smartphone notches, smartphones without headphone jacks, and USB-C laptops—but no other laptop maker has made a serious effort to bring a Touch Bar-like feature to its own devices.
Touchscreen laptops, though, are becoming more and more common. They’re becoming the norm even in popular traditional laptops like the Dell XPS 13. They may be experiments, but they’re experiments that other companies and users have found successful and desirable, quite unlike the Touch Bar. At this point, Apple’s resistance makes it look like the odd man out, which is a sad fate for a company that’s usually credited with making us fall in love with touchscreens in the first place.
Touchscreens have become so common in Windows laptops like the Dell XPS 13 that the inclusion of one barely makes news.
Also, Apple doesn’t need to include multi-touch support on every MacBook, and if it’s really so concerned about precision, it could limit support to larger notebooks like the MacBook Pro. It could even charge an extra $300 for the feature, much as Apple was fond of doing for the Touch Bar before it became standard. I’d be willing to bet, though, that more people would end up using a multi-touch MacBook display than the Touch Bar.
Apple’s approach to Sidecar feels like a relic of an older Apple: an Apple that refused to let you use a mouse with an iPad because the tablet wasn’t strictly designed to work that way. Lately Apple isn’t so stubborn. We can now use mice with iPads, after all, although only as an Accessibility feature. We don’t have to buy specifically designed “MFi” controllers to play games on an iPhone: ordinary PlayStation 4 and Xbox One controllers will do. And most remarkably of all, this year Apple stopped trying to persuade the world to embrace its “butterfly” keyboards and instead equipped its new 16-inch MacBook Pro with the scissor-switch keys of old.
After so many years of complaints and a lack of interest, maybe in the years to come Apple will ditch the Touch Bar altogether and just let us interact with our screens directly. As Sidecar and Luna Display show, that leap needn’t be as great as Apple is making it out to be.
And if nothing else, Apple? C’mon, let us click on links in Sidecar with our fingers.
This story, “Apple really doesn’t want us thinking about touchscreen MacBooks—and Sidecar proves it” was originally published by
Copyright © 2019 IDG Communications, Inc.