Tuesday, October 21, 2014

Replacing widgets with minimized applications

One of the most distinctive aspects of Sailfish OS is active application covers.

They're essentially small windows on your Home screen, each representing a minimized application (one not currently in a full-screen state). The same function is found in any desktop OS in the form of a task bar or application dock, which shows all the applications running in the background.

As you know, there's no separate task bar or switcher in Sailfish OS, only one Home screen. These covers show relevant, legible information and allow the user to perform important actions directly from the Home screen.

On first encounter, they come across a bit like widgets. And it's no wonder, since they actually pull double duty: primarily, they're used to maximize an application to full screen, but they also let the user interact with common features without needing to enter the app.

Compared to their Android home screen counterparts, for example, the similarities run shallow. Widgets on Android don't live in the same place where active applications are shown, so there's no real connection between the two. And even though they also allow using common features, doing so with buttons is problematic. I made a simulated example below of what Sailfish OS would look like with separate tappable areas.

Accurately hitting a precise physical location on the display is easy in a desktop environment, the birthplace of widgets; after all, mousing on a stable surface is a breeze. But when you're walking or otherwise active, the overall movement of your body easily transfers all the way to your hands, making it harder to tap the correct button. Because this interaction method uses our fingers to emulate a mouse cursor, it's by definition an inferior approach for mobile interfaces, which are not used in stable environments.

Sailfish OS solves this by not using taps for cover actions at all. Instead, the user flicks the cover horizontally to perform an action. This makes it irrelevant where on the cover you tap or flick: both touch events can use the entire cover area as a target, which makes a big difference in the accuracy required to perform an action.

Since it's the flick direction (left or right) that defines the action (play/pause and skip song in a media player, for example), the number of actions is naturally limited to two. And since tapping and flicking left or right each have a different effect on how, and in which order, your hand muscles behave, it's easier for our bodies to associate them with different things. Again, the design both supports and takes advantage of our unique capabilities.
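For developers, this behaviour maps onto the Sailfish Silica QML API: a cover declares at most two cover actions, one per flick direction, alongside whatever legible status it wants to show. A minimal sketch of a media player cover, assuming a hypothetical app-side `player` object for the handlers:

```qml
import QtQuick 2.0
import Sailfish.Silica 1.0

CoverBackground {
    // Legible status shown while the app is minimized on the Home screen
    Label {
        anchors.centerIn: parent
        text: "Now playing"
    }

    // At most two actions; the system binds them to the horizontal flicks
    CoverActionList {
        CoverAction {
            iconSource: "image://theme/icon-cover-pause"
            onTriggered: player.pause()   // 'player' is an assumed app object
        }
        CoverAction {
            iconSource: "image://theme/icon-cover-next"
            onTriggered: player.next()    // hypothetical skip-song handler
        }
    }
}
```

Note that the developer never positions tappable buttons: the whole cover is the target, and the platform resolves which action a flick triggers.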

Whether you use all the potential of these gestures is up to you. Each of us takes the tools we use to a different level of efficiency: some push them all the way to their limits, while others are comfortable with casual use. Both are equally valid.

However, there shouldn't be separate locations for these two ways of use. In Android, the need for widgets comes from poor multitasking performance as well as system complexity. It's nicer to have multiple home screens filled with widgets where the user can perform frequent tasks; but that just adds more complexity by fixing the wrong problem.

By solving how minimized applications support different degrees of user focus and reduce the need to enter an application, the need for a separate widget location is removed. This makes the OS simpler and leaner, and greatly increases task handling speed, since various user needs can be fulfilled in the same location.

Responding to the user's required level of efficiency and control.

Thanks for reading and see you in the next post. In the meantime, agree or disagree, debate or shout. Bring it on and spread the word.


  1. You mention tapping while moving etc., and you expose one aspect of SFOS that needs to improve, IMO: closing apps while they are minimized tiles. Long-tapping and having the X show up on the tiles brings you exactly to what you describe as bad: inaccurate tapping. So wouldn't it be better if you could close the app by flicking the tile vertically?

    1. Yes, it would, and I agree 100%. I've mentioned this issue internally, so let's see what comes out of it. As you know, these are my personal opinions as a designer who knows SFOS intimately; however, that doesn't automatically mean they are treated as an internal guideline. There's always a deeper discussion involved :)

      I do hope this can be fixed, though. What you proposed would improve it a lot.

    2. How long does it usually take to fix a design issue? We know more or less how the coding team works (and we interact with them in TJC), but the design team's modus operandi, the roadmap, and the people are pretty much unknown (apart from you, obviously).

      BTW, what I proposed above has a potential issue (while the general idea of having a flick instead of a tap isn't wrong): gesture and flick orientation consistency. IMO a gesture should only do one thing (which, for example, isn't the case with the top edge one), and a flick should probably have the same action as the gesture. I.e., if a gesture from outside, left to right, closes an app, the flick that closes the app should be in the same direction. Anyway, enough of my ideas. :P

    3. Mmh, the vertical flick could collide with the scrolling of the window. How about flicking left->right->left, without lifting your finger (scratching the cover out)?

    4. Anon, the time it takes to fix a design issue really depends on a lot of things. For technical people, the way of working is a bit flatter than for design. It used to be less so, and I could comment more freely in TJC about things :/

      I'll try helping with my blog :)

      I'll think about the point you made with the gesture direction and its effect. Also check the reply below.

      Martin, yes, it cannot be a vertical flick. One way to solve it would be to look at other windowing systems that have close and minimize buttons in the same corner. Using the same logic, the gesture you use for getting back to the Home screen would close the app if taken to the opposite side of the screen (and maybe give a way to avoid accidental closings with a remorse timer or similar).

      This would allow you to always lock the display from the top edge, and to leave the device as it is (music/browser/maps/shopping list). The next time you unlock the device, it would resume the same task you left it on.

      Just some thoughts :)

      Thanks for good comments!

  2. I like all the swipes of Sailfish, but I don't understand why there isn't an option to unlock the screen with a slide up from the bottom edge (the opposite of locking).

  3. Hi,

    The bottom edge swipe is reserved for accessing the Events view, so that it's always the same regardless of when and where you perform it, in the same way the notification drawer is always opened from the top edge on other devices.

    It wouldn't be very nice if the bottom edge behaved differently on the lock screen than in other places, and then there would need to be another way to access the Events view.

    I hope that sheds some light on the logic. If not, leave me a reply :)

    Thanks for stopping by to comment.

  4. This comment has been removed by the author.

  5. Yes, I understand, and I like your coherent design.

    I noticed that there are some problems with font rendering; some think it's the resolution, but I don't.
    Are there no patches like on Linux (FreeType...)?

    Thanks for your answer!

    1. Sorry for the late reply. I agree that the font rendering could be improved, and there are occasional glitches. However, I'm not aware of any easy wins in terms of patches. Maybe I'm not looking deep enough.

      Thanks for pointing out the issue. I'll dig around.

      Have a nice week!