They're essentially small windows on your Home screen, each representing a minimized application (one not currently in full-screen state). The same function is found in any desktop OS in the form of a taskbar or an application dock, which shows the applications running in the background.
As you know, there's no separate taskbar or app switcher in Sailfish OS; there is only one Home screen. These covers show relevant, legible information and allow the user to perform important actions directly from the Home screen.
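To make that concrete, here's a minimal sketch of a cover in QML using the Sailfish Silica components. The `unreadCount` property and its label text are illustrative placeholders, not part of any real app; in practice you'd bind them to your application's own state.

```qml
import QtQuick 2.0
import Sailfish.Silica 1.0

// A minimal cover: a live, legible summary of application state,
// visible directly on the Home screen while the app is minimized.
CoverBackground {
    // Illustrative placeholder; a real application would bind
    // this to its own model (unread messages, downloads, etc.).
    property int unreadCount: 3

    Label {
        anchors.centerIn: parent
        text: unreadCount + " unread"
    }
}
```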
On first encounter, they come across a bit like widgets. And it's no wonder, since they pull double duty: primarily, they're used to maximize an application to full screen, but they also let the user interact with common features without entering the app.
Compared to their Android home screen counterparts, though, the similarities run shallow. Widgets on Android don't live in the same place where active applications are shown, so there's no real connection between the two. And even though widgets also expose common features, doing so with buttons is problematic. Below, I made a simulated example of what Sailfish OS would look like with separate tappable areas.
Accurately clicking a specific physical location on the display is easy in a desktop environment, the birthplace of widgets. After all, mousing on a stable surface is a breeze. But when you're walking or otherwise active, the movement of your body easily transfers all the way to your hands, making it harder to tap the correct button. And because this interaction method merely uses our fingers to emulate a mouse cursor, it's by definition an inferior approach for mobile interfaces, which aren't used in stable environments.
Sailfish OS solves this by not using taps for cover actions at all. Instead, the user flicks the cover horizontally to perform an action. This makes it irrelevant where on the cover you tap or flick: both tap and flick touch events can use the entire cover area as a target. That makes a big difference in the accuracy required to perform an action.
Since it's the flick direction (left or right) that defines the action (play/pause and skip song in a media player, for example), the number of actions is naturally limited to a maximum of two. And since tapping and flicking left or right each affect how, and in which order, your hand muscles behave, it's easier for our bodies to associate them with different things. Again, the design both supports and takes advantage of our unique capabilities.
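In Silica terms, those two flicks map onto a `CoverActionList` containing at most two `CoverAction` items, one per direction. The sketch below assumes a hypothetical `player` object with `togglePause()` and `nextSong()` methods; the cover icon names follow the standard Sailfish theme naming.

```qml
import QtQuick 2.0
import Sailfish.Silica 1.0

CoverBackground {
    Label {
        anchors.centerIn: parent
        // "player" is a hypothetical media object, assumed for
        // illustration; it is not defined in this sketch.
        text: player.currentSong
    }

    // Each CoverAction is triggered by flicking the cover
    // horizontally; with two actions, each flick direction
    // triggers one of them.
    CoverActionList {
        CoverAction {
            iconSource: "image://theme/icon-cover-pause"
            onTriggered: player.togglePause()  // hypothetical API
        }
        CoverAction {
            iconSource: "image://theme/icon-cover-next-song"
            onTriggered: player.nextSong()     // hypothetical API
        }
    }
}
```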
Whether or not you use all the potential of these gestures is up to you. Each of us takes the tools we use to a different level of efficiency. Some push them all the way to their limits, while others are comfortable with casual use. Both are equally valid.
However, there shouldn't be separate locations for these two modes of use. On Android, the need for widgets stems from poor multitasking performance as well as system complexity: it's nicer to have multiple home screens filled with widgets where the user can perform frequent tasks. But that just adds more complexity by fixing the wrong problem.
By solving how minimized applications can support different degrees of user focus and reduce the need to enter an application, the need for a separate widget location disappears. This both makes the OS simpler and leaner and greatly increases task-handling speed, since various user needs can be fulfilled in the same place.
Responding to the level of efficiency and control each user requires.
Thanks for reading and see you in the next post. In the meantime, agree or disagree, debate or shout. Bring it on and spread the word.