Considering UX when screens disappear

We’re about to eclipse physical displays. What will this mean for product design?

Berty Bhuruth
July 31, 2022
5 min read

Today’s displays are great. They’re thin, they’re light and they have amazing picture quality. Because of this, we want them bigger, and we also want to bring them with us. Our phones are growing, to the point where we can fold them in half to get them in our pockets.

It’s crazy to think that it’s been over 15 years since the first iPhone shipped and the smartphone market was effectively born. Sure, devices existed before then, but Apple really paved the way.

As handheld screens have increased in size, so too has our collective desire to be able to work, play, and consume digital content, anywhere, anytime.

But let’s take a moment to consider the UX implications of this road we’re on.

Bigger displays mean bigger devices to handle. And if we still want to use them on the go, it would likely mean supporting them with both hands. Bigger displays also need bigger batteries to keep them alive. So, either battery technology improves so that batteries don’t need to grow, or our display devices get heavier in order to accommodate a larger battery.

But carrying around ever-bigger screens is not a user-friendly solution. It goes without saying that all this mobile work and play is not great from an ergonomic perspective. I’d bet you’re reading this with your chin tucked into your chest, looking down at a screen. Faces buried in screens are the norm, though that’s going to change.

Over the years, mobile devices shrank, but they’ve since reverted to an ever-larger form. One could postulate that the next big disruption in mobile computing will be the disappearance of ‘handheld’ devices entirely, along with the rectangular screens they come with.

What we need is a technology that’s portable and lightweight, without deviating from the trajectory of bigger and bigger screens.

Where to from here?

To replace our rectangular screens, we may opt for smaller near-eye displays (NEDs), more commonly referred to as head-mounted displays (HMDs).

With the disappearance of screens will come the expansion of screen-less digital realities.

Of course, this isn’t an epiphany. Back in 2012, Oculus released their first dev kit for the Oculus Rift. It gave those interested in virtual reality (VR) their first taste of a consumer VR experience (I still have mine). The years that followed sparked a wave of consumer VR, augmented reality (AR) and mixed reality (MR) devices from the likes of Google, Microsoft, HTC, Samsung, and Facebook/Meta. Since then, I think the buzz around VR has died down because the majority of us have come to realise we don’t want heavy devices strapped to our faces.

However, AR and MR have stuck around; I’d argue they’ve subtly become a learnt interaction we’re now comfortable with. During the pandemic, we got used to scanning QR codes that reveal an interactive hyperlink, whether to sign into venues or to order food at a pub or restaurant. We’ve also become accustomed to applying funny filters to our faces in Instagram and Snapchat, which is essentially mixed reality.

Without the typical constraints of screens, you won’t need to cradle a phone or be limited by your desktop setup. The dream of AR and MR opens the opportunity to arrange your digital content in front of you, in your peripheries, at any size or shape, all while keeping it private to you. But before this can work and be widely adopted, some serious UX thinking needs to go into this space, to make sure we architect a solution that solves day-to-day realities, not just big-picture dreams.

How will product designers tackle this next UX green field?

As a digital product designer in 2022, my work day is filled with user flows and Figma prototypes. I apply the double diamond process. I help my team plan roadmaps and sequence project plans. And I do all of this while working under an assumption most people in the industry wouldn’t think to question: that the solution will involve a desktop or handheld mobile device. But once we move beyond these physical constraints, it’s an assumption we designers will need to constantly challenge.

Prior to becoming a digital product designer at UntilNow, I spent most of my career at Canon, researching new ways customers could interact with Canon’s camera, printer, and projector technology. The kind of design we worked on was human-computer interaction (HCI). As digital realities increasingly blur with the real world, my advice is to apply the valuable lessons from the HCI space.

What is HCI?

HCI is a research field that explores new ways people interact with the technology around them. The aim is to consider all human factors when designing an interface or user experience, from how we might actively use our five senses to how we might passively interface with the built environment.

The display-centric UX of today was driven by the maturation of HCI research that kicked off 40 years ago. The 80s saw the genesis of the personal computer, the 90s the start of the internet, the late 00s and early 10s mobile phones and mobile computing, and more recently, everything has moved to the cloud. For decades, the dominant modality we’ve needed to interface with has been the 2D display, and over that time it’s done a very good job of providing us a gateway into the digital sphere.

If we want to keep going down this path of digital consumption, a shift in focus is needed.

We’ll need to reset our thinking by starting from first principles and leading with an HCI mindset. In other words, we’ll need to consider both the digital and the physical experience.

Key considerations for designing beyond screens

My assumption is that Apple will be the first to take the leap back into consumer head-mounted AR & MR devices (based on what they’ve recently been patenting).

As a product designer, I can’t help but think about all the unique HCI problems spawned by the new situations AR & MR create. Here are a few:

  • I’m watching AR or MR content with my smart glasses and people need to urgently get my attention. They can’t see where my AR content is positioned; they might be standing right behind it. How might the system help in this situation?
  • I’m in an environment where there are multiple triggers that want to activate their own AR content for me to see. Let’s assume the different triggers are legitimate and valuable content, and not just ads. How might the system mediate my experience so I’m not overwhelmed and I’m in control of the content I see, and when?
  • What is the best way to interact with AR & MR? If I have virtual content floating in front of me, do I use a mouse, a trackpad, my voice? Do I have a little controller in my hand? Or do I reach out and touch it? Given that the interaction should be able to be applied in a public setting, the challenge becomes: How might we let users type, point, and select AR content in a way that aligns with cultural norms?

The take-away

Whether addressing a display-less future or another UX challenge, product designers must go back and challenge fundamental assumptions. AR & MR is but one area ripe for disrupting the interaction status quo, one that will require a massive reassessment of how we approach users’ problems. Designers will need to build digital experiences with physical human factors (spatial, sensory) in mind.

But these design considerations aren’t limited to a future of AR & MR. Great organisations should be thinking about this right now. Whether you’re working with or inside a neo-bank or a Web3 startup, all problems and solutions occur in a physical time and place. What is the time? What is the place? What resources do users have available, and what don’t they? In my experience, once you know the context, those fundamental assumptions become clear and open to radical reassessment. Take Uber, for example: its products are completely digital, but for them to succeed, Uber recognises that all parties in its ecosystem operate in a specific time and place. By designing with this in mind, it was able to deliver a service that today sets the standard for the rideshare and food-delivery industries.

And that’s all to say that turning back to HCI might illuminate the path forward.
