Thursday, September 18, 2014

Solving a problem before defining it is a bad idea

A big part of design is solving problems.

A big part of solving problems is defining the problem. A big part of defining the problem is identifying the individual steps in the event sequence where the problem occurs.

Solving a problem properly not only resolves it, but can also expose other potential issues in that area, or even in the system as a whole, resulting in better design.

Don't do it backwards.

Finding a user problem to fit an existing solution is problematic. The actual manifestation of the problem, in relation to user interactions, becomes abstracted and very difficult to spot.

In other words, you might know the user problem or need, but how it occurs as a part of a natural user activity remains unclear.

If you observe the problem where and when it occurs, you can spot the root cause and find a proper technology or solution for it. Without understanding the root cause, you might be treating a symptom instead of the problem itself.

Let's use an example: a button-based navigation model on a smartphone. It's design done backwards (or an intentional screw-up). It's also easy to illustrate and familiar to everyone.

Back and Home (and a possible third) buttons are the pinnacle of modern smartphone interface development, so surely they can withstand some criticism. Below is an illustration of a hand holding a generic smartphone with button-based navigation, represented by three circles.

First, observe the thumb. Look at its position.

That’s what a relaxed thumb position looks like. Now, grab your own phone and, with one hand, scroll through a list of contacts. Then flick between some images.

What location did your thumb return to each time after relaxing?

Can you spot a pattern in how your thumb backtracks to the same location after each interaction?

Almost all content interactions are somehow connected to this thumb position. From that position, it's very comfortable to draw a small circle, or to flick up, down, left or right.

It’s the place to be for easy content interactions.

Then whose idea was it to place the most commonly used actions related to content interactions (Home, Back...) as far away from it as physically possible?

Reaching those controls from your relaxed thumb location requires both high thumb mobility and accuracy.

Or to require using your new "modern" smartphone with two hands. It doesn't help matters that access to your notifications is at the opposite end of the device.

"What, you're left-handed? Oh, we didn't think it mattered (or that you existed). Placing the Back button in the bottom left was the right thing to do."

And this design accidentally worked when it was validated on much smaller touch screens over ten years ago. Even then, though, it didn't work that well for left-handed folks.

What happened?

A solution existed before the problem was defined as a part of a natural user activity.

All previous navigation schemes (let's all give a big hand to desktop interfaces) had a back button, so one had to be introduced here as well. Coupled with a Home button and whatnot. The user's thumb location, at the specific moment the need arose to go back, was irrelevant.

A back button could not simply be in the middle of the screen.

Everyone knew that much.

The technology was driving the user experience. A solution (button) existed before the problem definition.

Small display sizes at that time helped to mask the issue. Everybody was thrilled to be able to directly poke at a screen to do stuff.

It was magical.

Today, over ten years later, everybody is still thrilled to be able to directly poke the screen to do stuff. A generation of users exists who have never experienced a phone you can use with one hand. They have no idea what they're missing out on. For them it's normal.

For our hands it's a disgrace. As it is for the development of mobile user interfaces in general.

Mobile devices are intended to be used in a mobile context, where our hands naturally interact with the environment as well. For that reason, a mobile device interface should be unobtrusive and empowering.

However, if two hands are required to navigate the interface by default, it’s clearly anything but.

If you wish to get your other hand back, so that you don't have to drop everything else whenever you need to use your "smartphone", you need to fight for it.

Manufacturers will not just waltz up to your doorstep to hand it to you.

Do you want an alternative mobile interface to exist? Good, you're not alone.

Thanks for reading and see you in the next post. In the meantime, agree or disagree, debate or shout. Bring it on and spread the word.


  1. It's great that someone is finally paying explicit attention to what our hands are supposed to do with a phone. I've always felt that the basic idea of the iPhone is best suited for some sort of disembodied beings.
    I have to add, though, that Sailfish OS's homescreen has four important buttons at the bottom of the area marked red in your drawing - the part of this area that is the least comfortable to reach with your thumb (so one might mark that part in a brighter red)! Granted, at least these buttons are not *even lower* than that (below the screen), but you have to concede that it's not like Sailfish is completely sparing us the trouble ;-)
    One of my biggest gripes about the touchscreen-only epidemic of the past years (which, amongst other things, has led many people to believe that anything with a touchscreen is a smartphone) is the complete disappearance of buttons. Ironically, kinda my last hope for this industry, Jolla, has gone to some lengths in their design to achieve just that.
    Maybe some time you could write some lines about regarding physical buttons as enemies.
    I have had a bit of a shaky hand all my life, and it's excruciating how carefully you need to aim at some parts of the interface to make them react (and not the other tiny little *virtual button* directly attached to it). This has always made me hate typing on a touchscreen.
    The gesture-based nature of Sailfish is making up for these pains in a considerable way; being able to start a gesture at any point on the screen makes for so much more convenience!
    But what on earth have physical buttons done to smartphone designers to make them hate them so much? The perfect combination of hardware and virtual controls could be found. Just consider that the location of a physical button, in staying the same throughout the device's lifespan, also makes pressing it a matter of muscle memory, which plays such an important role in the overall design of Sailfish OS.
    If Jolla aims to revolutionize industry trends in some way, a more flexible, multifaceted approach to hardware design needs to be taken, rather than always having just the front of a smooth rectangle at your disposal for manipulation...

    1. Thank you Ben for writing about both your experiences and thoughts. Good to hear Sailfish OS has made a difference for you. You're not alone.

      We're trying hard to reduce the trouble that comes with the touch-screen-only paradigm :-) And regarding those four buttons in the "red zone", we have changes planned for the Home screen that will improve the situation. It's good to have others reporting these issues. I can't comment on the schedule, though.

      I think the most difficult design aspect of a physical button comes from its dependency on individual hand characteristics. To be useful, it should be close to your thumb location, in the "green zone". This would increase the screen bezel or make the button protrude somehow, and that requires more customized phone hardware, not to mention a sturdy button. And it would need to be on each side of the device to cover both portrait and landscape orientation; that is, if it were placed ergonomically to serve both one- and two-handed use. The increase in bezel would also push the screen itself away from your thumb. It's a tricky situation :-)

      In short, physical buttons add cost, reduce fault tolerance and add mechanical constraints. They always require an opening through the phone body, allowing moisture and dirt to enter the internal parts. They also reduce the overall frame rigidity; the iPhone 6+ wouldn't have bent that easily without the volume key ports. I wouldn't be surprised if physical keys completely disappeared from phones within a couple of years, as phones become more aware of our physical usage patterns (vibration, acceleration and other sensor data).

      A good design can greatly improve how a certain feature works. A great design removes the need to use that feature. Most of the time I'm just a broken record asking "why, why, why" to understand why we do some things. And most of the time, we have a feature that exists because someone in the past just made it work better instead of asking why that feature was needed in the first place.

      I'm not a huge fan of virtual keyboards either, but they get the job done for short messages. Instead, I like to call people, because I can mind my surroundings while talking. Can't do that while texting.