Two proposals for one-handed navigation on Windows 10 mobile devices


Windows Phone, before the unification of the Windows platforms, was based around the 'Hub and Spoke' model of navigation. All forward navigation was done in the UI: you navigated further 'down' in an app by tapping content in the page (e.g. someone's name, which opens their profile) and returned by tapping the hardware Back button. There was also the Windows key, which would jump you out of the app to the Start screen. Hub and Spoke meant there was no need for dedicated navigation UI on every page (such as a persistent menu bar at the top to jump to a particular page); the content itself activated forward navigation. This was emphasised in the design guidelines as 'Content over Chrome', which made for very clean UI and a simple conceptual model of navigation.

This was a major differentiator of Windows Phone apps, setting them clearly apart from the navigation models of Android and iOS.

The current state of mobile navigation

This worked well for the most part, although conceptual difficulties arose as more apps supported deep linking through 'Secondary Tiles'. Here the app is entered somewhere other than the root hub screen, meaning there is no path on the navigation stack back to it (i.e. hitting the Back button will not take you all the way to the app's root hub screen, from which you would access other parts of the app on other spokes).

In addition, the popularity of Android and iOS meant that the navigation paradigms of those platforms were becoming the de facto standard. Broadly speaking, people expected navigation elements at the top of the screen, and expected to be able to jump directly to other parts of the app via a persistent menu. This was established with the original iPhone, whose small 3.5 inch screen kept the navigation easily reachable. As phone screens grew bigger, however, navigation in the top left became problematic for one-handed use since it was out of easy reach - yet by then the 'navigation at the top' standard was long established in users' minds.

Moving to Windows 10

Windows RT and Metro on the Desktop (neither of which has a hardware Back button, nor is typically held and operated one-handed) both adopted the principle of navigation at the top and actions at the bottom. In addition, in the mobile space, Windows 10 aims to let Android and iOS developers easily bring their apps, along with their existing navigation models, to Windows 10 devices, which has created pressure to move from the previous Hub and Spoke model on Windows Phone towards the de facto standard of navigation at the top. The Windows 10 design guidelines are no longer so prescriptive about the navigation model and allow direct navigation between pages at the same level in the hierarchy.

Microsoft has brought the SplitView XAML element to the Windows 10 UAP as a counterpart to Android's Navigation Drawer; the difference is that a SplitView does not have to be used for navigation. Moreover, navigation may be performed in an entirely different way in an app – a good example of this is the Edge browser, which navigates primarily through an address bar (where you perform searches or enter URLs) and the Back button. The overall trend, though, is away from Hub and Spoke in Windows 10 apps.
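As a rough illustration of the pattern under discussion, here is a minimal sketch of a SplitView hosting navigation in its pane. The element and property names (DisplayMode, IsPaneOpen, Pane) come from the Windows 10 XAML SplitView control; the names NavPane and ContentFrame and the menu entries are invented for the example.

```xml
<!-- Minimal sketch: a SplitView whose pane holds navigation.
     NavPane, ContentFrame and the menu labels are illustrative. -->
<SplitView x:Name="NavPane"
           DisplayMode="CompactOverlay"
           IsPaneOpen="False">
  <SplitView.Pane>
    <StackPanel>
      <RadioButton Content="Home" />
      <RadioButton Content="Profile" />
      <RadioButton Content="Settings" />
    </StackPanel>
  </SplitView.Pane>
  <!-- Main page content goes here -->
  <Frame x:Name="ContentFrame" />
</SplitView>
```

A hamburger button in the page would typically flip IsPaneOpen. Note that nothing in the control itself ties the pane to navigation - the pane could equally hold settings or secondary actions, which is exactly the flexibility (and ambiguity) described above.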

This has led to a huge outcry from fans of Windows Phone, since Hub and Spoke has clear benefits: chiefly a clean UI free from navigation chrome, and, in the Windows Phone implementation, the fact that all navigation could be done from the on-screen content or the hardware buttons at the bottom.

This leads to the current problem, which is also faced by Android and iOS – how to reconcile one-handed use with the new Windows de facto standard of navigation at the top and actions at the bottom?

Proposal A - Long pressing the Windows button

The design guidelines should recommend long pressing the Windows button to act as a shortcut to activate/focus the primary navigation UI in an app.

This has many benefits: it requires little to no modification to existing UI, it fits current Windows 10 app design centred around the SplitView, and it removes the need to reach to the top of the screen to activate navigation elements.


The shortcut is invisible, so it is only likely to be used by people who know about it - i.e. 'power users'. However it should be emphasised that, as with keyboard shortcuts, there should always be a way to activate the navigation within the UI, e.g. through a hamburger icon or a visible address bar. This way, knowing the shortcut is never essential.

Highlighting navigation

The shortcut would be contextual to each app. Usually for an app with a SplitView control, it would open the control to show navigation. However it can be adapted to show more appropriate or custom navigation UI.

Example 1: A game has a unique custom compass widget to move around. Activating the long-press shortcut pulses the compass.

Example 2: Activating the long-press shortcut places the focus in the address bar in the Edge browser.
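If the proposed shortcut existed, an app might opt in with something like the following. To be clear, the NavigationRequested event is entirely hypothetical - no such system API exists today - and NavPane is an invented SplitView name; the point is only that each app routes the same system signal to its own navigation UI.

```csharp
// Hypothetical API: imagine the system raised an event when the
// Windows button is long-pressed while this app is in the foreground.
SystemNavigationShortcut.NavigationRequested += (sender, e) =>
{
    // Each app decides what 'highlight navigation' means here:
    // Edge would focus its address bar, a game might pulse its
    // compass widget; most apps would simply open their SplitView.
    NavPane.IsPaneOpen = true;
};
```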


The Windows key is a good candidate to activate navigation UI since it has always been associated with navigation (it returns the user to the Start screen) and its long-press function has not yet been assigned to a particular feature (unlike the long press of the Back button). The shortcut is also not without precedent: Apple implemented 'Reachability' for the iPhone 6, which arguably does something similar – double tapping the Home button shifts the screen down within reach of the thumb. However this does not highlight navigation UI, it merely brings it within reach. Double tapping is unlikely to be appropriate on Windows Phone, since most of the hardware buttons are capacitive and so give no tactile press feedback - and double tap is not a common UI metaphor on the platform.

Proposal B - Swiping from the left edge

The SplitView element should be activated by swiping from the left edge of the screen.

One problem with proposal A is that the SplitView does not necessarily contain navigation UI; it may, for example, contain lesser-used actions. If people come to expect long-pressing the Windows button to open the SplitView (which it would in most cases), they may be surprised when it instead activates some other piece of navigation UI. A swipe from the screen edge could be used to consistently activate the SplitView. This gesture is less abstract than long-pressing the Windows button, so users are less likely to misunderstand what it does conceptually.
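Pending any framework default, an app could approximate this behaviour today in its own code. A sketch, assuming a UWP page containing a SplitView named NavPane, with manipulation events enabled on the page; the 40-pixel edge threshold and all names are illustrative:

```csharp
// Illustrative sketch: open the SplitView pane when a drag
// gesture begins at the left screen edge.
private const double EdgeWidth = 40; // px; arbitrary threshold

private void Page_ManipulationStarted(object sender,
    ManipulationStartedRoutedEventArgs e)
{
    // Only treat gestures starting at the left edge as a pane swipe.
    if (e.Position.X <= EdgeWidth)
    {
        NavPane.IsPaneOpen = true; // NavPane is the page's SplitView
    }
}
```

Building this into the SplitView control itself, rather than each app, is what would deliver the consistency argued for below.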

Consistent implementation

The second benefit is that this could be implemented by default (though overridable) whenever a SplitView control is used. This would ensure consistency of implementation, whereas proposal A is only design guidance. Proposal A could also be implemented by default whenever a SplitView is present, but we would not want to encourage developers to keep the default when more appropriate navigation UI exists.


Thirdly, no hardware button function is used up, leaving it free for other functionality. This is also important when we consider that hardware button behaviour can be defined by a developer, which could lead to malicious behaviour that appears to originate 'at the system level'.


For proposal A:

  • Plays well with Android Navigation Drawer-style navigation
  • Minimal implementation overhead
  • Minimal/zero impact on other platforms such as Desktop and tablet
  • Adaptive to app context, e.g. activates the address bar in Edge, the SplitView in other apps
  • Functions as a hint as to how to navigate using non-standard UI

For proposal B:

  • Plays well with Android Navigation Drawer-style navigation
  • Minimal implementation overhead
  • Minimal/zero impact on other platforms such as Desktop and tablet
  • No hardware button function is used up
  • More intuitive gesture than proposal A
  • Implemented consistently by default

So what are your thoughts – Proposal A, proposal B or some as yet undefined proposal C?
