Exploring how new hardware trends will shape the way we design UIs
I’m a self-taught UX designer and most of my work has been focused on desktop apps. My most recent project let users annotate their appointments with a digital pen.
While designing the UX for this product I had to think about how such a relatively new input method (the digital pen) changed the way users interacted with the app. What I found is that, when designing for new hardware, what matters most is not only the input method but also how the content is displayed on the screen.
A simple example of how new hardware paradigms eventually form new software design paradigms is the introduction of smartphones. Because of the switch from precise input methods (mice and keyboards) to intuitive ones (touch screens), and from large horizontal screens to small vertical ones, many changes had to be made to adapt UIs to this new hardware category:
- Gestures became central in almost every app (previously the only widely-used gesture was drag ’n’ drop);
- Interactive objects became bigger to be touched with more ease;
- The most important elements of a page became more visible in order to be easily seen on smaller screens (this also benefited desktop users, since before the introduction of smartphones many pages were hard to navigate).
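The second point can be made concrete with a little arithmetic. On Android, for instance, sizes are specified in density-independent pixels (dp), where 1dp equals 1px at the baseline density of 160dpi, and Material Design recommends touch targets of at least 48dp (roughly 9mm of physical screen). A quick sketch of the conversion (the density buckets are standard Android values; the rest is illustrative):

```python
def dp_to_px(dp: float, dpi: float) -> float:
    """Convert density-independent pixels to physical pixels.

    On Android, 1dp equals 1px at the baseline density of 160dpi,
    so the conversion scales linearly with the screen's dpi.
    """
    return dp * dpi / 160


# Material Design's recommended minimum touch target is 48dp,
# which stays roughly the same physical size on every screen.
MIN_TARGET_DP = 48

for name, dpi in [("mdpi", 160), ("xhdpi", 320), ("xxhdpi", 480)]:
    px = dp_to_px(MIN_TARGET_DP, dpi)
    print(f"{name} ({dpi}dpi): {MIN_TARGET_DP}dp -> {px:.0f}px")
```

The point is that a touch-friendly target is defined by physical size, not pixels: the same 48dp button needs three times as many pixels on a 480dpi screen as on a 160dpi one.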
So, as you can see, this revolution left an evident legacy in almost all the apps we use today. I believe we are at the beginning of a similar (although probably not as huge) revolution, driven by ultra-wide mobile screens, non-rectangular and curved displays, and notches. Notches have no real benefit, but they will inevitably change the mobile software design landscape, so we’ll tackle them in this article anyway.
Moving elements away from the corners
Up until now, many of the most important interactive elements of a UI were conveniently placed at the corners of the screen in order to make them more reachable for desktop users and more visible for mobile users. Clear examples of this trend are the send button in messaging apps and the hamburger menu.
While this paradigm worked flawlessly on a rectangular screen, such as the ones found on almost every smartphone released before 2017, it isn’t optimal on displays with rounded corners, such as the ones found on the iPhone X, the Galaxy S9 and the LG V30.
Because of this new hardware paradigm we need to rethink the way we position the most relevant elements of an interface, and there appear to be two solutions:
- Start using new interface components that replace the need to place buttons at the corners of the screen, such as Google’s Floating Action Button;
- Replace buttons with gestures (a valid solution in only a few cases, but, done right, it can vastly improve the user experience).
While these alternatives are easy to implement, not many apps feature them yet. I expect them to become more widespread as we move into 2018 and more phones (even low-end ones) start featuring displays with rounded corners.
Moving interactive elements away from the top of the screen
The other big change in terms of hardware design last year was the 18:9 aspect ratio (or 2:1, if you want to show a bit of respect to math) becoming more common among flagship mobile devices.
Taller aspect ratios mean that the section of the screen your thumb can reach without excessive stretching, while holding the phone in one hand, is much smaller compared to the total size of the screen. On a 16:9 phablet (from 5.7" to 6") this section is around the bottom half of the screen; on a recent flagship (say, with a 6" 18:9 screen) it is around the bottom third, and even less on devices with taller aspect ratios such as Samsung’s 18.5:9 and Apple’s 19.5:9.
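The geometry behind this is easy to sketch: at a fixed diagonal, a taller aspect ratio trades width for height, so a thumb whose comfortable reach is a fixed physical distance covers a smaller share of the screen. A quick illustration (the 6" diagonal and the ratios come from the trends above; the exact reachable area of course varies with hand size and grip):

```python
import math


def screen_dims(diagonal_in: float, ratio: float) -> tuple[float, float]:
    """Width and height (in inches) of a rectangular screen with the
    given diagonal and height:width aspect ratio, via Pythagoras."""
    width = diagonal_in / math.sqrt(1 + ratio ** 2)
    return width, width * ratio


# Same 6" diagonal, increasingly tall aspect ratios: the screen
# gets taller and narrower, so a fixed physical thumb reach
# covers a smaller fraction of its height.
for name, ratio in [("16:9", 16 / 9), ("18:9", 2.0), ("19.5:9", 19.5 / 9)]:
    width, height = screen_dims(6.0, ratio)
    print(f'{name}: {width:.2f}" wide x {height:.2f}" tall')
```

Running this shows a 6" 19.5:9 panel is both taller and markedly narrower than a 6" 16:9 one, which is why equal marketing diagonals hide very different one-handed ergonomics.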
Furthermore, many manufacturers are adding a notch to the top of their devices’ screens in order to reclaim display space around the earpiece, the camera and the other sensors.
UI designers should take these trends into account when placing the most important interactive elements of an interface, since a component at the top of the screen could be hard to reach or, in a full-screen app, could interfere with the notch. To avoid these problems, many apps are starting to feature bottom-placed interactive components (such as search bars and tabs).
A good example of this design trend is Google’s Bottom Navigation, a Material Design component intended to replace tabs in certain cases and make them more accessible on taller aspect ratios.
Giving more relevance to the edges
As many manufacturers are reducing not only the top and bottom bezels but also the left and right ones, and some are even introducing phones with curved edges (such as the Samsung Galaxy S and Note series and the Nokia 8 Sirocco), it becomes easier for developers to create great animations starting from the edges.
This gives developers the opportunity to add edge gestures to their apps, because the animations they trigger now look more natural than ever before.
A great example of how gestures can improve the user experience is their implementation in the iPhone X variant of iOS 11: Apple did such a great job that most of us don’t miss the old home button, even though it had been part of the iOS experience for a decade.
As we have seen, the design of an app is deeply influenced by the hardware it is running on and we are at the beginning of a series of new paradigms concerning mobile phone screens.
As a UI designer, I am excited to witness all of these updates to current designs, and I really hope that by the end of the year the first foldable mobile devices will be unveiled (such as the much-rumored Galaxy X and Surface Andromeda), further challenging the current conventions about how a phone should look.