A partial archive of discourse.wicg.io as of Saturday February 24, 2024.

Ability to detect device’s native Navigation Gesture mode


Android 10 has a gesture-based navigation mode that uses the edges of the display, including both side edges.

This interferes with side-navigation push menus (off-canvas menus) and similar UI/UX patterns used on the web.

It’d be great if browsers could let us know which edges are reserved by the device OS.

Android 10 also has an option to revert back to using software buttons, thus freeing up the edges.

When the edges are free, I’d like to give users the full-featured push menus; otherwise, I’d offer them another option.

I believe this is important information, especially for highly interactive applications (e.g. games).


I’m a fan. Maybe it could be exposed through the Navigator object?
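To make the idea concrete, here is a purely hypothetical sketch of what that could look like. No such property exists today; the name `navigator.gestureEdges` and the edge strings are made up for illustration, with feature detection so current browsers fall back gracefully:

```javascript
// HYPOTHETICAL API — navigator.gestureEdges does not exist in any browser.
// Suppose it were an array of OS-reserved edges, e.g. ["left", "right"].
function getReservedEdges() {
  // Feature-detect so browsers without the property fall back gracefully.
  if (typeof navigator !== "undefined" && Array.isArray(navigator.gestureEdges)) {
    return navigator.gestureEdges;
  }
  // Unknown environment: assume no edges are reserved by the OS.
  return [];
}

function chooseMenuStyle() {
  const reserved = getReservedEdges();
  // If either side edge is claimed by OS gestures, avoid a swipe-in push
  // menu and use a button-toggled menu instead.
  return reserved.includes("left") || reserved.includes("right")
    ? "button-toggled"
    : "swipe-push";
}
```

With something like this, a page could pick the swipe-in push menu only when the side edges are actually available to it.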