
User Experience Design Beyond the Screen

Every morning I take the elevator up 12 floors to the office where I work. Sometimes, to kill a few seconds, I’ll pull out my phone and check Facebook or email to see the latest. On that elevator ride, I never use Netflix, Pandora, Google Maps, or Uber. It may sound obvious, but there’s not much need for Netflix when you only have one minute. It’s also not the environment Netflix was designed for. Now, if you’re stuck in an elevator for a couple of hours, that might be a different story, because Netflix is designed primarily for someone with enough time to watch a show.
User experience design doesn’t start at the edge of the screen; it begins with understanding where and how the user will interact with the device, so that your product is as convenient as possible in that environment. This means that during the design phase, we should consider the four main aspects of the user’s environment: time, sight, sound, and touch.
Time
One thing to consider when designing an application is how much time the user will have while using it. Continuing with our Netflix example, the user is performing a leisure activity and probably has time to browse around and look for shows. Conversely, someone using an exercise app may want to log reps quickly or switch between exercises without taking much time. During the activity, they may only be able to handle quick interactions like starting and stopping a timer; afterward, with more time, they can create a more detailed log of the activity. Time can drive which actions are available on a screen, which views the user sees, which features are included, and more. It can also shape the overall experience: an app that forces a user through several steps when they’re in a rush will not be as convenient as one that doesn’t.
Sight
A little while ago, we worked with a client whose target market was interior designers. Initial research from an online survey revealed that very few designers visited our client’s site on their mobile phones. Even though responsive design has become an industry standard, this still made us shake our heads and blink. During follow-up interviews we found the answer was simple: they were using our client’s website primarily while at work, where large monitors let them see everything bigger, zoom in, and compare products in windows side by side. They simply didn’t have the environmental constraint of using the website from their phone, and it wasn’t in their best interest to do so. Insights like this drive decisions around interface, features, and device priorities.
Sound
Recently, I was walking by a real estate office near my home and noticed a kiosk outside the window. Being a curious person who likes to play around with things, I started punching buttons, and suddenly loud music started playing. I looked around and noticed other people glancing over to see what was going on. After a few minutes of unsuccessfully searching for a volume switch, it finally died down on its own. Afraid of causing another loud disruption to my neighbors, I left earlier than I would have had the music not played. The same concern applies to auto-playing videos or sound on websites: the user may be in an environment where automatically playing sound would disrupt others and discourage them from visiting the site. As a result, the product may be used less often than it could be because of environmental constraints.
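As a concrete illustration, here is a minimal sketch of environment-friendly playback in the browser, assuming a page with a video element and an unmute button (the element IDs are hypothetical). Media starts muted, and sound is only enabled after an explicit user gesture:

```typescript
// Minimal sketch: start media muted so it never disrupts the user's
// surroundings; enable sound only after a deliberate user action.
// The element IDs below are hypothetical examples.
const video = document.querySelector<HTMLVideoElement>("#promo-video");
const unmuteButton = document.querySelector<HTMLButtonElement>("#unmute");

if (video) {
  video.muted = true; // most browsers block unmuted autoplay anyway
  video.play().catch(() => {
    // Autoplay was blocked entirely; the user can press play themselves.
  });

  unmuteButton?.addEventListener("click", () => {
    video.muted = false; // sound only on an explicit gesture
  });
}
```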
Touch
The medical industry also presents unique and challenging environments for its users. Doctors are incredibly busy and face many environmental constraints. How often can a doctor interact with an app throughout the day? Is hygiene a factor? Can a doctor use a phone and then go see a patient, or do they need to wash their hands first? If a doctor needs to view a notification on the go but can’t touch the device, voice commands could let them interact with it hands-free.
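As a rough sketch of what hands-free interaction could look like, the snippet below uses the browser’s Web Speech API (support varies; Chrome exposes it under a webkit prefix). The spoken phrases and the actions they trigger are hypothetical examples, not a prescribed design:

```typescript
// Rough sketch: map spoken phrases to app actions so the device
// can be used without touching it. Phrases and actions are
// hypothetical; a real medical app would need far more care.
const SpeechRecognitionImpl =
  (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;

if (SpeechRecognitionImpl) {
  const recognition = new SpeechRecognitionImpl();
  recognition.continuous = true; // keep listening across commands

  recognition.onresult = (event: any) => {
    const latest = event.results[event.results.length - 1];
    const phrase = latest[0].transcript.trim().toLowerCase();

    if (phrase.includes("dismiss")) {
      // e.g. hide the current notification
    } else if (phrase.includes("read notification")) {
      // e.g. read the notification aloud via speechSynthesis
    }
  };

  recognition.start();
}
```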
While we want to understand how to make an app that empowers users to become more efficient and more capable, I believe we first have to understand the realistic scenarios and environments users are going to be in, so that our designs are convenient for those environments.


Heather Baird
