Thursday, September 25, 2014

Harmonizing touch screen gestures

When you're using a touch screen device, you expect that it's meant to be touched.

When you move content up or down on that screen, you're expecting stuff to move accordingly. It has worked that way in mobile touch interfa... STOP!

Stay absolutely still. DO NOT try to move things horizontally. Slowly backtrack your flick and listen.

Flicking the screen sideways seems to be the thing to do these days. Nobody really knows why, but everybody does it. Why wouldn't they? More is better, so more flicking must equal more value. The more flicks you can master, the more you can do with them. The more fingers you add to the fun, the better.

A solid plan no doubt.

It seems like the smartphone industry, with plenty of apps to go around, is steering itself towards maximizing the variety of horizontal gesture use. No holds barred.

To prove my point, I'll list some things that can happen when a user flicks horizontally from the screen center. These are not in any particular order, and I'll stick with just applications to narrow things down. No app names are mentioned to keep things civil.

Let's just flick it.

  1. In the first app, by flicking left to right, you open a menu of some sort with additional controls or content categories. Just like pressing the menu button does from the app header.
  2. Another app does the same but using a different direction, from right to left. Surprise!
  3. The third one has a different behavior and a much longer flick is required. Son, you need to be more specific. Thumb extension surgery is a good idea if one-handed use is your thing.
  4. A fourth app requires a specific speed for the flick. It didn't understand what your kind-of-a-flick was trying to accomplish. Not just any flick qualifies. Go on, try again.
  5. The fifth subject has only been reading user comments for the past two years, and hasn't yet implemented the required horizontal flick support. Please press a button on the top right corner. Or left, wait, it was at the bottom somewhere. Google it.
  6. The sixth contender is insecure, but polite. It asks you to define what you would prefer that particular flick to do. Neither developers, designers nor managers could pick one of all the possible features the monster of an app offers. They're hoping you'll solve the problem for them. You open the flick manager. While browsing through all those options to select one, you forget the purpose of the app.
  7. Our last example application is the most advanced of them all. By flicking on top of an individual content item, and altering the flick direction, speed and blood pressure; you can delete, manage, reply, call back, link, fold protein and travel through time. In every multiverse. Times Pi.
Oh yes, modern apps have gotten really powerful.

But that potential doesn't directly transfer to the user. These different solutions compete over who has the most innovative application interface. Application design is focused on doing one very specific task, instead of efficiently completing a sequence of tasks across multiple apps.

All eyes are on measuring an individual application's performance, instead of the performance of the overall system. When reviewing system performance, it's important to include the user as the operator of that system. If each app functions differently, it comes at the expense of the overall system performance, because the user has to adapt to different ways of doing similar tasks.

Users are looking for smarter and leaner experiences. Doing more with less. Making that happen with application-centered thinking is difficult. Because each app is an island.

Fully independent, self-sufficient, self-governed and self-centered.

And every time you jump between those apps, you have to remember what island you're on this time. Even if they all, most of the time, are doing the same thing with a similar gesture.

How did it get so fragmented?

What was missing from application development tools was a default behavior and function for horizontal flicks. Without those, everyone invented their own.

Sailfish OS was designed from the ground up to allow direct control of the interface through touch gestures, without the need for additional buttons. This results in a much faster and more efficient interface, by bringing structure to how individual apps use touch gestures.

If you want to create an app, you save time in both design and implementation, because the default behavior and functionality for those flicks is already built into the way the OS handles application pages.

Now, let's take a closer look at them. To help remember and relate easier to Sailfish OS gestures, I'll give them names: "Symbolic swipes" and "Functional flicks". Don't worry, these are not official terms.

Symbolic swipes

Start from the display bezel and slide your finger over any screen edge to perform a symbolic swipe. Symbolic swipes control the operating system the same way a Home or power button does on other devices, symbolizing the function of those buttons.

In the first picture, swiping from either side takes you to the Home screen. In the middle, swiping from the bottom edge shows the Events view with all your notifications. The last image illustrates a swipe from the top edge, which ends your current activity by closing the application you're in. As mentioned in my previous post, moving notification access to the bottom edge greatly helps one-handed use.
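The edge-to-action mapping described above could be sketched as a simple lookup. To be clear, this is just an illustration: the edge names and action labels are my own assumptions, not Sailfish OS APIs.

```python
# Hypothetical sketch of the symbolic swipe mapping described above.
# Edge names and action labels are illustrative only, not OS internals.
SYMBOLIC_SWIPES = {
    "left": "go to Home screen",
    "right": "go to Home screen",
    "bottom": "show Events view (notifications)",
    "top": "close current application",
}

def symbolic_swipe(edge: str) -> str:
    """Return the system action a swipe from the given edge performs."""
    return SYMBOLIC_SWIPES.get(edge, "no action")
```

Note how both side edges map to the same action, which is part of what makes the scheme easy to learn with either hand.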

Functional flicks

Starting from the screen center, move what you see on the screen to either the left or the right, depending on the function you wish to use. Functional flicks control the most important functions inside Sailfish applications, like going back to the previous page.

In the first picture, flicking to the right takes you to the previous application page, replacing the common back button. In the middle, flicking to the left opens a page related to the current one (not shown in the video), replacing a common menu button. Close it by going back (a right flick). A dialog page uses right and left functional flicks for canceling or accepting a common yes/no confirmation from an app.

Flicking up or down moves the content vertically, as it does on many other devices.

However, controls that directly relate to the current page content (create a new message, search, etc.) can be accessed using that same movement direction. Imagine the content as an extension of your hand, like a rope. When you pull the content down, a pulley menu starts to open. As you keep revealing more menu options while pulling down, they become highlighted, one at a time. Releasing will select and perform the highlighted action. The name comes from an apparatus that's used to lift heavy loads.

Because you just used the content to access that menu, it doesn't matter what size your hand is, or whether it's a tiny phone or a tablet. You're not trying to reach and tap an icon or a button. Only the distance you pull down matters, and you feel a small vibration when a new option is highlighted. Move the content back up to hide the menu again.
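The pulley menu mechanics above can be sketched as a few lines of logic: pull distance maps to a highlighted option, and releasing performs it. This is a minimal sketch under my own assumptions; the item names and the 80-pixel step are invented for illustration, not Sailfish OS values.

```python
# Hypothetical sketch of pulley menu selection: pull distance decides
# which option is highlighted, releasing performs it. The step size and
# menu items are illustrative assumptions, not Sailfish OS internals.

ITEM_HEIGHT = 80  # pixels of pull needed to reveal each menu option

def highlighted_item(pull_distance, items):
    """Return the option the current pull distance highlights,
    or None if the menu hasn't been pulled open far enough."""
    if pull_distance < ITEM_HEIGHT:
        return None
    index = min(int(pull_distance // ITEM_HEIGHT) - 1, len(items) - 1)
    return items[index]

def on_release(pull_distance, items):
    """Releasing the finger selects and performs the highlighted action."""
    return highlighted_item(pull_distance, items)

menu = ["New message", "Search", "Settings"]
```

Only the pull distance matters here, which is exactly why hand size and screen size drop out of the equation.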

Most of you have already performed these gestures in many existing apps, so it's hardly a first encounter.

Sailfish OS has simply harmonized and promoted common touch gestures.

If you think it's complicated, you most likely haven't tried it. Because it's not. It just takes a few days for your hands to wake up from the button-based smartphone hibernation.

And the difference between a symbolic swipe and a functional flick is really where you start it: from the screen edge, or from its center. An easy way to remember it is: swipe what you feel (the device edge) and flick what you see or know (on the screen).
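That rule of thumb can be sketched as a tiny classifier: everything hinges on the starting point of the touch. The 20-pixel edge margin here is purely an assumption for illustration; the real threshold in Sailfish OS may well differ.

```python
# Illustrative sketch of the "where you start" rule. The edge margin
# is an assumed value, not how Sailfish OS actually detects gestures.

EDGE_MARGIN = 20  # pixels from the bezel counted as a "felt" edge

def classify_gesture(start_x, start_y, width, height):
    """Swipe what you feel (device edge), flick what you see (screen)."""
    on_edge = (
        start_x < EDGE_MARGIN or start_x > width - EDGE_MARGIN
        or start_y < EDGE_MARGIN or start_y > height - EDGE_MARGIN
    )
    return "symbolic swipe" if on_edge else "functional flick"
```

The same finger movement means different things depending only on where it begins, which is what keeps the two gesture families from colliding.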

On almost all button-based smartphone operating systems, when you go to a sub-page, that page animates in from the right. The movement clearly communicates that it came from that direction. So what's the best way to undo that movement?

Yep, move it back to where it came from. I'll do another post at some point about interface animations and transitions.

Harmonizing and promoting gestures, as done in Sailfish OS, doesn't just make moving and working inside applications faster and more ergonomic; it also brings them much closer to how we're used to interacting with the physical world.

Nothing happens until you let go and stop affecting an object. If you lift a coffee mug from one table to another, the moment you release your grip has a big impact on whether it's a disaster or a graceful landing.

As long as you keep your finger touching the screen, you're in control. You can test what a gesture does, because you can see what is happening as you move your finger. If it was the wrong one, simply reverse the gesture back to your starting point without releasing.

With a button-based navigation, you can only press and release a button. When you do, you get to enjoy the show from the passenger seat. You touch it, you buy it.

No peeking.

Thanks for reading and see you in the next post. In the meantime, agree or disagree, debate or shout. Bring it on and spread the word.


  1. Man! You're on fire!
    Looking forward to hearing how these gestures could fit bigger screens and also apply to multitouch :)
    Keep up the amazing work!

    1. Thanks for the kind words, I'll add those to the list I'll be writing about in the future.

  2. All nice, but Jolla made a small mistake when implementing swipes. It assigned two different functions to a single swipe motion (from the top): close app (which was later made an option) and lock. Those two should have been different (i.e. top always locks, and a swipe from the left closes). Anyway.

    Nice blog. Drop by to discuss design suggestions and other design-related stuff. There are ideas, and the community would like to hear what you liked and where you disagree.

    1. I agree, the top swipe should have a more consistent behavior. It would've also made closing apps easier. You're spot on about that mistake. I don't know when those can be fixed, but we're on it.

      Thank you for the support. I'll make time to check out the situation in TJC. Apologies for the lack of participation.

    2. It'd be nice to have you on TJC. People have made icons, suggested changes to visual aspects of the OS, and much more UI/UX-related stuff. All in an effort to make a more polished OS.


    3. I've wanted for a long time to get community design activities somehow more organized. Both Jolla and community-created designs would be in a single place, people would have material to make more and discuss them. Developers would also find things more easily.

      Now, everything is a bit scattered. Maybe TJC should be the tracking system. I need to think about it a bit more.

      Thanks for pushing me to be more active :)

  3. I don't really mind that the top swipe closes an app, as closing can be seen as returning to the Home screen.

    Pulley menu operation can be slightly confusing, though, as you need to be at a specific place in the content (the top); the pulley menu can't be operated while the content can still be scrolled. Or is it an implementation thing? (For example, Tweetian shows the "problem".)

    1. All the current functionality could still exist, even if the top edge weren't used for it.

      Closing is an important part of how you deal with tasks (apps). Sometimes you just quickly use something, like a flashlight, and swipe to close it because you're not going to need it running on Home. Not all apps/tasks are equal in that sense. Some are more useful when left running, while others just get in the way. It's important that the OS allows this distinction to be made.

      True, that's the weak point of the pulley menu, if the need arises to use its commands when you're not at the top (reading something).

      The quick scroll feature gets you to the top of the content, but it's not perfect. In the majority of cases (starting an app, returning from a sub-page) you're at the top of the content when you need those functions, although naturally not every time.

      This favors a certain way of usage over another, and I'm sure it won't fit everybody.

      Tweetian solved the issue by placing another pulley menu to the toolbar, which is always visible. When there's a will, there's a way :)