iOS said, “Ok clarity, let’s not be friends anymore.”
The iOS lockscreen had been pretty straightforward since its inception in 2007, but it is no longer as obvious as it used to be. There are a few reasons why.
Before we go any further, here’s what this post is about:
- This is not an attempt to bash the design gods at Apple, since I am unaware of the constraints and trade-offs the team worked under. In other words, we are analyzing the design, not the designer or the design team.
- I found the lockscreen experience confusing and I wanted to learn why.
- It is a good way to understand how, as a product gains more features and goals, designing the experience becomes harder, and the interface loses its obviousness.
The new lockscreen
This is not even about the whole ‘Press to Unlock’ mechanism (a lot has been written about it) but about the way you interact with and navigate through recent notifications, widgets and other items on the lockscreen.
Death by a thousand swipes
Have a look at this —
The iOS 10 lockscreen shows little round-cornered cells which represent your most recent unattended notifications.
- Swipe right on a notification to unlock iPhone and view the app which triggered that notification (similar to past versions of iOS).
- Swipe left on a notification to view options for the notification (‘View’ and ‘Clear’).
- Swipe right in the area outside the cells to scroll to widgets.
- Swipe left in the area outside the cells to scroll to camera.
- Swipe right when in camera to switch to video 🙁
All this bulk turns the simple task of accessing lockscreen widgets into a rough experience. Here is a not-so-uncommon scenario –
If the user’s goal is to reach her lockscreen widgets and her screen is filled with notifications, she cannot simply swipe wherever she likes; she has to find an area outside the notification cells to swipe in.
Lesson: It is not a good idea to place elements with horizontal swipe actions inside a horizontally scrolling view.
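The ambiguity above can be sketched as a toy dispatch table. This is a purely illustrative Python model, not Apple’s implementation: the outcome of a swipe depends on both its direction and on whether it starts on a notification cell.

```python
# Toy model of iOS 10 lockscreen swipe handling (illustrative only,
# not Apple's implementation). The same physical gesture resolves to
# four different actions depending on where it begins.

def dispatch_swipe(direction: str, on_notification_cell: bool) -> str:
    """Return the action a horizontal lockscreen swipe resolves to."""
    if on_notification_cell:
        # The cell captures the gesture before the scrolling view sees it.
        return "unlock and open app" if direction == "right" else "show View/Clear"
    # Only swipes that start outside the cells scroll between pages.
    return "scroll to widgets" if direction == "right" else "scroll to camera"

# A user aiming for widgets must hunt for empty space to swipe in:
print(dispatch_swipe("right", on_notification_cell=True))   # unlock and open app
print(dispatch_swipe("right", on_notification_cell=False))  # scroll to widgets
```

The more notifications on screen, the less empty space there is, and the more often the “wrong” branch of this table fires.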
Getting to Camera on the lock screen — and back
As illustrated earlier, one can swipe left from the lock screen to quickly access the camera. It is an easy way to get there (arguably easier than pulling up the camera icon in iOS 9 and earlier). The pagination dots at the bottom of the lockscreen signify that you can happily scroll between views. But it is not as simple as it seems. Consider this scenario –
Our user happily scrolls all the way to the camera after viewing her widgets, only to realize that she doesn’t want to snap a picture right now but wants to get back to the lock screen. How does she get back? Not by swiping back, but by pressing the physical home button: an interaction otherwise reserved for the new unlocking mechanism, and one that behaves differently in the camera view.
Lesson: I will just leave two of Nielsen’s heuristics here –
Even better than good error messages is a careful design which prevents a problem from occurring in the first place…
Users should not have to wonder whether different words, situations, or actions mean the same thing.
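The asymmetry can be made concrete with a toy state machine (again, an illustrative Python sketch, not Apple’s code): swiping left reaches the camera, but swiping right from the camera switches capture modes instead of going back, and only the home button returns.

```python
# Toy state machine for iOS 10 lockscreen navigation (illustrative only).
# Swiping into the camera is easy, but the reverse swipe does not exist:
# from the camera, a right swipe changes capture mode, and only the home
# button leads back to the lock screen.

TRANSITIONS = {
    ("lockscreen", "swipe_right"): "widgets",
    ("lockscreen", "swipe_left"):  "camera",
    ("camera",     "swipe_right"): "video mode",   # not back!
    ("camera",     "press_home"):  "lockscreen",   # the only way back
}

def navigate(state: str, gesture: str) -> str:
    """Follow a transition if one exists; otherwise stay put."""
    return TRANSITIONS.get((state, gesture), state)

print(navigate("camera", "swipe_right"))  # video mode
print(navigate("camera", "press_home"))   # lockscreen
```

A consistent design would make each swipe invertible; here the inverse of “swipe left into camera” is a button press, which the pagination dots never hint at.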
Where do widgets live? — On the first floor and second floor
The iOS lock screen has two primary layers: the lock screen itself and, on top of it, the notification center, much like in the rest of iOS.
There are two ways to access widgets on the lockscreen: you can swipe right from the lock screen, or pull down the notification center (by swiping down from the top edge) and then swipe right again.
It feels like greeting someone sitting on the first floor only to find them in the exact same place on the second floor: funny and weird. However, that is not the most confusing part of this convoluted interface. Here is a hypothetical, not-so-uncommon scenario –
Our user is looking at the lock screen widgets. She is unsure whether to go back to the lock screen by swiping left, by pressing the pagination dots below, or by swiping up from the bottom edge. There is a pause… but she eventually figures it out.
The screenshots below show us why this is confusing –
The similarity of the two screens, which sit on different levels of the interface’s Z-index, slows the user down and makes her stop and think about how to exit the widget view.
Put simply, navigating through the different layers of the lock screen feels like a maze. And one more thing: you cannot move directly from one widget view to the other. That makes sense from the system’s standpoint, but it is not straightforward for the user to understand. This is design-centered design, not user-centered design.
- Redundant use of information on different levels in the Z-index (Notifications and widgets on both levels) leads to a convoluted concept model.
- Lack of visible system status and signifiers could lead to user error.
Press to Unlock vs Slide to Unlock
There are merits and demerits to both models, but the latter is arguably clearer. There is nothing wrong in trying to shape the user’s mental model towards press-to-unlock, but by sticking to the ‘Slide to view more’ action for notifications, the system has one leg stuck in the past.
As far as unlocking the iPhone itself goes, press-to-unlock requires precision and repeated, verbose, multi-step feedback from the system (like ‘Press home to unlock’ and ‘Press home to open’), which not only slows the user down the first few times but also keeps them from becoming efficient with repeated use.
While the whole mechanism feels like a game, none of the good things from game design carry over: no well-designed tutorial or onboarding, and no help for the user in learning shortcuts and advanced tricks to become more efficient over time.
Apple’s iOS has always stood for obviousness, but that obviousness hasn’t scaled as new features have been added. Perhaps what is required is a mature design language (like Material Design) with clear direction for the user: one that completely reimagines the concept model and brings back clarity.