How Google is taking control of Gesture Navigation in Android
After experimenting with button-based gesture controls in Android 9 Pie, Google went back to the drawing board to improve the fluidity and one-handed use of Android’s gesture navigation. With Android 10, Google arrived at a solution visually similar to iOS’s: a gesture bar that can be swiped up to go home or swiped left or right to switch between apps. Since the gesture bar is much thinner than the dedicated space for the previous three-button navigation scheme, Android 10’s gestures give apps more space to show content at the bottom of the screen. To deal with the lack of a dedicated back button, Google added an inward swipe from the left or right edges of the screen to trigger the back action. Google’s new and improved gestures are a step in the right direction, though some still believe that third-party alternatives are superior.
Even if there’s still room to improve (and there definitely is), Google is pushing its Android partners to adopt these new navigation gestures because the company doesn’t want to burden app developers with having to accommodate multiple different gesture navigation schemes. Android device makers like OnePlus, Samsung, Xiaomi, Huawei, OPPO, Vivo, and ASUS are just some of the companies with their own takes on gesture navigation. These companies have already invested a lot of development effort into building their own gestures, so Google isn’t forcing them to completely abandon their work.
Instead, Google has rewritten its ruleset for Android and Google app compatibility, forcing OEMs to sideline their own gestures in favor of Google’s, while also restricting the functionality of OEM gestures.
Android 10 Gesture Compatibility Requirements
After every major Android platform release, Google updates the Android Compatibility Definition Document (CDD) to outline the new requirements that all devices must meet in order to be considered compatible with the latest version of Android. This is one of the prerequisites to obtaining an Android license, which is necessary to use the Android branding in marketing. It’s also a prerequisite to obtain approval to distribute Google Mobile Services, the suite of Google apps, services, and libraries pre-installed on most Android devices sold internationally.
In the CDD for Android 10, Google has updated section 2.2.3 on the software requirements for handheld devices (AKA smartphones) with the below wording. These statements inform OEMs about Google’s expectations for how large the trigger area should be for navigation gestures.
Google recommends that the gesture recognition area for the home action be within 32dp (dp stands for density-independent pixel) of the bottom of the screen, but it isn’t making this a requirement, so OEMs can still offer floating gesture controls such as EMUI’s floating navigation dock.
If an OEM offers a swipe-in gesture from either the left or right edges of the screen, then Google requires that the trigger area be less than 40dp from the edge (ideally 24dp in width). Note that this allows OEMs to offer different sensitivity options for side gestures so long as the trigger area doesn’t exceed 40dp. In fact, Google offers exactly this in its own Android 10 release. By default, the inset for the back gesture is 24dp on the Pixel, but it can be lowered to 18dp or raised to 32dp or 40dp.
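To make these numbers concrete, here’s a minimal sketch in plain Java of how a dp-based edge inset translates to pixels and decides whether a touch falls in the back-gesture trigger area. The class and method names are illustrative (this is not framework code), and the density value is an assumed example.

```java
// Sketch of the back-gesture trigger-area check described above.
// These names are illustrative, not Android framework APIs.
public class BackGestureArea {

    // 1dp equals `density` physical pixels, where density = screen dpi / 160.
    static int dpToPx(int dp, float density) {
        return (int) (dp * density);
    }

    // True if a touch at x (in px) falls within the edge inset on either
    // side of a screen that is screenWidthPx wide.
    static boolean isInBackGestureArea(int x, int screenWidthPx,
                                       int insetDp, float density) {
        int insetPx = dpToPx(insetDp, density);
        return x < insetPx || x >= screenWidthPx - insetPx;
    }

    public static void main(String[] args) {
        float density = 2.75f; // assumed ~440dpi display
        int width = 1080;      // assumed screen width in px
        int inset = 24;        // Android 10's default back-gesture inset in dp

        System.out.println(isInBackGestureArea(30, width, inset, density));  // true (near left edge)
        System.out.println(isInBackGestureArea(540, width, inset, density)); // false (center of screen)
    }
}
```

At 2.75x density, the default 24dp inset works out to 66px per edge, which shows why a user-selectable 40dp setting (110px here) makes the gesture noticeably easier to trigger.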
In a later section of the CDD, specifically section 7.2.3 covering Navigation Keys, Google provides detailed requirements for how gestures for the back, home, and recent apps actions should operate. Most of the requirements focus on making sure that the system behavior is consistent for app developers, but there are a few notable statements that might affect the user experience.
While Google doesn’t mandate that a swipe up from the bottom edge trigger the home action or a swipe up and hold trigger the recent apps overview, Google does require that swipe gestures from the sides trigger the back action. Notably, this would mean that the customizable gestures provided by Samsung’s One Hand Operation+ wouldn’t be allowed, though since One Hand Operation+ isn’t installed out-of-the-box, it might get a pass.
If an OEM provides a floating system panel that’s triggered via a side swipe gesture, then the OEM must place the trigger area in the top 1/3 of the left or right edge and must not allow the panel’s trigger area to span more than 1/3 of the length of the screen edge. The OEM may, however, allow the user to move the trigger area below the top 1/3 of the edges. This language was likely added to accommodate Samsung’s Edge Panel feature.
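These two constraints can be expressed as a simple validator. The following is a hypothetical sketch under my own reading of the rule (the handle is described by its top offset and height along the edge, in pixels; none of these names come from the CDD):

```java
// Hypothetical check of the edge-panel handle rules described above:
// the default trigger area must sit within the top third of the edge,
// and must not span more than a third of the edge's length.
public class EdgePanelRules {

    static boolean isValidDefaultHandle(int topPx, int heightPx, int edgeLengthPx) {
        int third = edgeLengthPx / 3;
        boolean inTopThird = topPx + heightPx <= third; // entirely in the top 1/3
        boolean sizeOk = heightPx <= third;             // no larger than 1/3 of the edge
        return inTopThird && sizeOk;
    }

    public static void main(String[] args) {
        int edge = 2280; // assumed screen height in px
        System.out.println(isValidDefaultHandle(100, 300, edge)); // true
        System.out.println(isValidDefaultHandle(900, 300, edge)); // false: extends below the top third
    }
}
```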
The Android 10 Compatibility Definition Document doesn’t place that many restrictions on what OEMs can do with gestures, but as I mentioned before, abiding by the CDD is just one of the prerequisites to obtain an Android license and approval to distribute GMS. Google has a separate document that it distributes privately to all of its licensed Android partners; this document enumerates the technical requirements that companies have to follow to be allowed to distribute GMS, and it has additional stipulations pertaining to gesture navigation in Android 10. We obtained a copy of this document, titled GMS Requirements v7, dated September 3rd, 2019.
Gesture Navigation Requirements for GMS Approval
The Google Assistant is an incredibly important service for Google, so Google bundles it as part of the Google App and requires all Android partners to distribute it as part of the suite of GMS apps for “Regular” (non-Android Go) devices. However, the requirements don’t end there. Since Android 5.1, Google has mandated that a long-press of the home button trigger the Assist action, which by default invokes Google Assistant since Google also mandates that the Google app be the default handler for the Assist action. There’s no longer a dedicated home button in Android 10, though, so Google has set new requirements on how to trigger the Assistant with a gesture.
To trigger the Google Assistant with Google’s gesture navigation, you have to swipe diagonally from the bottom left or right corner. Google requires that this gesture be present on all devices running Android 10, regardless of whether Google’s gestures are the default navigation controls out-of-the-box. If an OEM implements its own gesture navigation controls, then it can implement its own trigger to launch the Assistant app, but the exact implementation will be subject to review from Google. Some OEMs like OnePlus and Xiaomi let you trigger the Assistant by long-pressing the power button, for example.
The GMS Requirements document defines three approved navigation modes:
- (A) Classic three-button navigation controls. These can be on-screen or hardware buttons, but they must have some distance between them. The three buttons trigger the home, back, and recent apps actions.
- (B) Android 9 Pie’s two-button navigation controls. These can’t be hardware buttons, and the two buttons must still have some distance between them. The back and home buttons trigger the back and home actions respectively, while the recent apps button has been merged with the home button such that swiping up on the home button opens the recent apps overview.
- (C) Android 10’s new gestural navigation.
All devices launching with Android 10 must implement A and C, though it’s up to the OEM to decide which one is made the out-of-the-box default. B is no longer supported and cannot be allowed as a user-selectable option.
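On a device, apps commonly find out which of these modes is active by probing Android’s internal `config_navBarInteractionMode` integer resource (a non-SDK detail, typically looked up via `Resources.getIdentifier("config_navBarInteractionMode", "integer", "android")`). A minimal sketch of that mapping in plain Java, with my own class and method names:

```java
// Mapping of Android's internal config_navBarInteractionMode values
// (a non-SDK implementation detail, not a stable API):
// 0 = three-button, 1 = two-button, 2 = fully gestural.
public class NavMode {
    public static final int THREE_BUTTON = 0;
    public static final int TWO_BUTTON = 1;
    public static final int GESTURAL = 2;

    static String describe(int mode) {
        switch (mode) {
            case THREE_BUTTON: return "three-button";
            case TWO_BUTTON:   return "two-button";
            case GESTURAL:     return "gestural";
            default:           return "unknown/OEM-specific";
        }
    }

    public static void main(String[] args) {
        System.out.println(describe(2)); // gestural
    }
}
```

Under the GMS rules above, a device launching with Android 10 should report either 0 or 2 by default, since mode B (value 1) can no longer be offered to users.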
So where does that leave alternative navigation controls from OEMs? Google says that while Android partners may offer their own navigation controls, those alternatives cannot be presented to the user during setup, nor can they be advertised to the user through notifications or on-screen pop-ups. While A and C are required to be shown at the top level of navigation settings, any alternative navigation options have to be placed one entry deeper in Settings.
This effectively means that alternative, arguably better gestures will only be found by power users who dig through settings or read articles online about their device. We noted in our OnePlus 7T review that OnePlus doesn’t offer its OxygenOS full-screen gestures on that device, and that’s likely going to be the case with other devices launching with Android 10 down the line, since there’s little point in offering an alternative gesture scheme. The likely reason that the OnePlus 7 and OnePlus 7 Pro still have the old OxygenOS gestures is that Google strongly recommends that OEMs don’t remove existing navigation options when upgrading devices to Android 10.
Lastly, Google strongly recommends that OEMs don’t switch the user to a different navigation mode when setting a third-party launcher as the default. Ironically, this is exactly what happens when you try to set a third-party launcher as default in Android 10 for the Google Pixel. Google has promised that they’ll roll out a fix to make Android 10’s gestures compatible with third-party launchers, so it’s likely that they added this particular statement so users won’t blame third-party launchers for gesture incompatibility. Do as I say, not as I do.
In summary, Google has finally taken steps to unify gesture navigation in Android, and they’re using the CDD and GMS approval process to make OEMs play along. That’s not a bad thing, though, as fragmentation in navigation controls is problematic for app developers. Google has clearly put a lot of thought and research into the usability of the new gestures. Since Google knows that not everyone will be happy with their gestures, however, they’re still giving OEMs some leeway by allowing them to make their own gestures, so long as those gestures follow certain rules.
In future versions of Android, Google may entirely disallow alternative navigation modes. OnePlus may already see the writing on the wall, which would explain why it no longer provides its old gestures on the OnePlus 7T, though we’ll have to wait for more devices to launch with Android 10 to see if this is a one-off or a new industry trend.