Friday, October 30, 2020

iOS 13 delivered on several of our top requests, but there's still plenty of room for improvement.

iOS 14 Wishlist: 10 ways Apple can take the iPhone to the next level


Staff Writer, Macworld

Last year, after the debut of iOS 12, I put together a list of features I hoped to see in iOS 13. And Apple listened!

Okay, it’s not likely that anybody at Apple read my article and altered a single plan based on it, but the company did deliver several of the things I’d been asking for.

There’s still a lot left on the table, though. There are so many more features and significant changes, easy to identify if not easy to develop, that would make iPhones more useful.

I want about a million things for iOS 14 (multiple timers, for example) but many are small tweaks not worth getting worked up about. Here’s a list of the ten biggest and most far-reaching features I hope to see in iOS 14.

Note: Now that iOS and iPadOS have technically split, this list doesn’t include iPad-specific features—that’s another list! Also, there are features I’d love to see that require new iPhone hardware, which aren’t included here. These are features I’d like to see for all iPhone models that could run iOS 14.

After a surprisingly trouble-free iOS 12 release, iOS 13 has been full of problems. The release schedule was staggered and disjointed, and Apple’s still squashing significant bugs.

There are reports that Apple has once again altered its development process to improve reliability, and it can’t come soon enough. The buzzwords for iOS 14 should be “stability” and “performance.”

More than any new feature, making sure that the first release of iOS 14 is fast, fluid, and trouble-free for the hundreds of millions of devices upon which it will run should be priority number one.

And don’t promise features at WWDC only to have them come weeks or months after release (and in a shoddy state, at that). It’s probably too much to expect all the major iOS features to release at once, but be honest about the staggered release. Let us know which features are coming in a future iOS point-release update so we can manage expectations (and so Apple’s developers aren’t rushing to meet an unrealistic ship date).

Apple improves Siri every year. In iOS 13, it gave Siri a smoother and more natural-sounding voice. It also added support for music, podcast, and other audio apps to the SiriKit framework. Both are nice (especially that second one) but not nearly what we have in mind when we wish every year, fingers-crossed, for a dramatically upgraded Siri.

Last year, I wrote this about Siri in describing my hopes for iOS 13:

Siri still lags way behind Google Assistant and Alexa in its ability to answer general questions and gracefully perform actions with third-party hardware and services. There are so many obvious shortfalls; you can do a Spotlight search for a flight number and get detailed flight info, but ask Siri and you just get a web search.

Siri needs better voice recognition, faster response times, and more “fun” activities like trivia and games. It needs to give more accurate answers to a much broader set of questions.

All of that is still true. Siri still needs more domains for things like Shopping, and both Siri and HomeKit need better support for more smart home gadgets and types (why can’t I arm my alarm with Siri?).

Just another example from the /r/SiriFails subreddit, whose very existence should embarrass Apple daily.

More than anything else, I want Apple execs to get on stage at WWDC with a giant Siri 2.0 logo behind them and talk about the “all new Siri” that takes everything the company has learned over the last nine years and builds a whole new digital assistant for the next decade. One that is smarter, faster, works offline (it’s shocking how often Siri does not!), better understands both your words and your intent, and is more proactive about doing things on your behalf if you want it to.

I would also like Apple to make users choose a male or female Siri voice during phone setup (or when they first upgrade to iOS 14), without defaulting to the female voice.

When Google demonstrated its new Recorder app on the Pixel 4, we couldn’t help but feel jealous. The phone was doing exceptionally accurate speech-to-text transcription, live, and entirely on-device. It was even smart enough to put in periods between sentences.

Of course, there’s no magic hardware in the Pixel 4 to enable this feat. It’s just software, and it’s even coming to older Pixel phones.

Google’s live transcription is impressive, but there’s no reason Apple couldn’t do this on most modern iPhones.

Apple’s dictation feature (tap the microphone on the keyboard) is a handy way to input text almost anywhere, but it’s slow and inaccurate enough that most people don’t bother. To Apple’s credit, it does work without a network connection. But it can’t keep up with a normal talking pace, and it doesn’t do a very good job of making a sentence (with proper punctuation) out of your string of words.

Apple should step up its game here. Make the dictation dramatically faster and more accurate (especially with accents), and use some intelligence to improve word choices. If I say, “you have a funny accent” and the dictation thinks I said, “you have a runny axe meant,” it should recognize that its interpretation produces a nonsense phrase and that other similar-sounding words produce a reasonable sentence.
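What I’m describing is essentially hypothesis rescoring: the recognizer keeps several similar-sounding candidate transcriptions and a language model picks the most plausible one. Here’s a toy Python sketch of the idea; the word frequencies and candidates are made up purely for illustration, and a real system would use a far richer language model combined with acoustic scores:

```python
# Toy rescoring: pick the candidate transcription whose words are
# most plausible according to a tiny, made-up word-frequency model.
WORD_FREQ = {
    "you": 1000, "have": 900, "a": 1200,
    "funny": 80, "accent": 40,       # common, sensible words
    "runny": 5, "axe": 3, "meant": 60,  # rare in this combination
}

def plausibility(sentence: str) -> float:
    # Product of per-word frequencies; unknown words get a tiny score.
    score = 1.0
    for word in sentence.lower().split():
        score *= WORD_FREQ.get(word, 0.1)
    return score

def rescore(candidates: list) -> str:
    # A real recognizer would weigh this against acoustic confidence.
    return max(candidates, key=plausibility)

best = rescore(["you have a runny axe meant", "you have a funny accent"])
print(best)  # -> you have a funny accent
```

The nonsense phrase loses because its rare words drag the score down, which is exactly the kind of sanity check dictation should be running before committing to a transcription.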

The beefed-up speech-to-text engine should be used all over iOS, from Siri to voice mail transcriptions (which are awful) to the Voice Memos app.

With the iPhone 11, Apple made a few sensible improvements to the Camera app interface. The smooth-scrolling zoom wheel, for example, is a delightful experience that makes controlling zoom level easier than pinch-to-zoom.

Then, of course, there’s Night Mode, where several seconds of exposure are combined to produce stunning shots in dark environments.

iOS 13.2 added video frame rate and resolution adjustment in the Camera app. But why only for iPhone 11?

And in iOS 13.2, Apple added the ability to change video resolution and frame rate right in the Camera app instead of jumping into Settings—but only on the iPhone 11.

There’s really no need for any of these improvements to be restricted to Apple’s latest phone. Night Mode may not be possible on the oldest iPhone hardware, but it’s certainly something an iPhone XS and XR can pull off. And the interface changes have no business being restricted to just the new phones.

I’d love to see iOS 14 bring these and other camera improvements to older phones. Unify the interface and restrict features only where they are technically impossible on older hardware. And while I wouldn’t want Apple to make the Camera app interface too busy, I think a “Pro” mode that lets users adjust color temperature, shutter speed, and ISO would be welcome. Put it right in line with the Pano, Time-lapse, Portrait, and other modes, and give photography nerds as much manual control as possible.

The iPhone’s home screen got a bit of an overhaul back in iOS 7, but hasn’t really changed a lot since then. We can now long-press on app icons to get context menus, but the home screen is still a big grid of icons that you can’t even freely move around, only reorder.

The grid of icons probably won’t go away, and that’s not necessarily a bad thing. But there are plenty of ways to change things without completely upending the paradigm.

I’d love Apple to introduce a dynamic icon API so that, for example, a weather app could change its icon to match the forecast, or an email icon could show how many unread messages you have. At the very least, Apple should let developers define separate app icons for Light and Dark mode.

If you drag down on the home screen, you enter Spotlight search. Before you begin searching, Siri suggests a row of app icons based on your common use at the current time of day and location. Maybe this should (optionally) reside on our home screens, now that iPhones are so much taller?

And even if Apple is not going to let us hide apps in an app drawer à la Android, it can at least let us position app icons and folders wherever we want. Currently, you can reorder them, but they always fill the screen from the upper left—there’s no way to leave a blank space.

Notifications have improved in iOS, but they’re still a bit of a mess. When you dig into the Settings app to change how apps notify you, you’re bombarded with options. Where do you get alerts? What kind of banner style? Do you want sounds? Previews? How about the little red dot on app icons?

All these options, which most users never touch, and still no way to differentiate critical notifications from all the rest.

The average user doesn’t ever go here, and suffers default settings for all their apps. Those who do change things have too many options to choose from, and they’re the wrong kind of options. I’d like to see Apple reduce the choices for the many ways in which notifications are displayed, and instead work on a systemic means of separating notifications into two groups: critical alerts that require immediate action and casual stuff that can wait.

What the notification system on the iPhone really needs is to recognize that notifications are regularly abused by developers and are a major reason why we all use our phones too much. Apple should take the same systemic approach to reducing notification impact in iOS 14 as it did with location tracking in iOS 13. Make notifications silent and non-interrupting by default (no banners, they just appear in the notification shade). Let apps define specific “high priority” notifications, and have to request that users enable them, while telling users exactly what will produce them.

For example, Twitter’s notifications for likes, retweets, and follows would silently deliver to Notification Center, but the app would prompt you to allow direct messages to be “high priority” where they would produce alerts and sounds. The Ring app would have silent notifications for detected motion, but could request allowing high priority notifications for when your doorbell is rung or your alarm triggered.
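The policy I’m proposing boils down to a simple rule: everything is silent unless the user has explicitly approved that app’s specific notification category as high priority. A hypothetical Python sketch of that model (all names invented for illustration; this is not any real iOS API):

```python
from dataclasses import dataclass, field

@dataclass
class NotificationPolicy:
    # (app, category) pairs the user has explicitly approved as
    # "high priority" -- everything else stays silent by default.
    approved: set = field(default_factory=set)

    def approve(self, app: str, category: str) -> None:
        # Called only after the user grants the app's request,
        # having been told exactly what triggers this category.
        self.approved.add((app, category))

    def delivery(self, app: str, category: str) -> str:
        if (app, category) in self.approved:
            return "alert"   # banner + sound, may interrupt
        return "silent"      # goes straight to Notification Center

policy = NotificationPolicy()
policy.approve("Ring", "doorbell")
print(policy.delivery("Ring", "motion"))    # -> silent
print(policy.delivery("Ring", "doorbell"))  # -> alert
```

The key design choice is the default: an app gets no interruptions for free, and has to earn each high-priority category one explicit user grant at a time.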

And let’s get rid of notification badges entirely. The little red dot on app icons is a useless vestige of an old mobile world. It is high visual impact but low information density—all it can do is provide a number, which could mean anything depending on the app. Worse, it doesn’t tell you if any actions are needed or what they should be. Its existence owes more to inertia than to actually improving our phone experience.

The Apple Watch has an always-on display. OLED Android phones have had always-on displays for years. There’s no reason why the OLED iPhone models can’t have them as well.

The always-on “sleep” screen for the iPhone should be similar to the lock screen, with a few adjustments. It should display time, date, and battery life on a black background (to save battery life), but no notifications. Perhaps new notifications could briefly appear and go away, but we need fewer reasons to pick up our phone, not more.

An always-on display and complications: if they’re good enough for Android, and good enough for the Apple Watch, they’re good enough for the iPhone.

To that end, I’d love to see Apple take the idea of complications from the Apple Watch and add them to the lock screen and always-on sleep screen. Perhaps four of them, flanking the clock, with standardized formats. Developers could build complications for their apps, defining what they display, and users could choose which four they want to see.

This would be a great way to get simple information without picking up our phones and diving into apps. Some of what we need we could get without even unlocking our phones. It would make the iPhone even more useful, even when at rest, while providing an important feature to promote digital health and wellbeing.

I know recording phone calls is a bit of a tricky legal issue, but it’s just so useful that I want Apple to at least attempt to tackle it. The disclosure that one needs to make to legally record a phone call varies from country to country and state to state, but it seems that the most stringent requirement is for both parties to be informed that a recording is happening, and that requirement can usually be met by playing a regular audible tone (like a “beep” every 15 seconds).

Ideally, when you enable call recording you would simply hear “call recording enabled” followed by regular unobtrusive beeps, neither of which would actually be captured as part of the recording. Recordings could go into Voice Memos, where they would be designated with a phone icon and could be auto-transcribed (see improved dictation, above).

Apple bought Shazam in 2018 but hasn’t done a whole lot with it besides remove ads. Technically, you’re using Shazam technology when you ask Siri to identify a song, but Apple could go a lot further.

First, build Shazam right into Apple Music. Make an easy-to-hit “identify this song” button both in the iPhone/iPad and Apple Watch interface. Keep a custom default playlist for identified songs that makes a note of the date, time, and location.

Copyright © 2020 IDG Communications, Inc.
