As smartphones have grown larger and more powerful, they’ve become increasingly difficult to use with one hand. The average screen size has ballooned from 3.5 inches to over 6 inches, yet our thumbs haven’t grown to match. This creates a fundamental usability problem: the very devices designed to be mobile and convenient now require two hands for basic navigation.
The current solutions—reachability modes, one-handed keyboards, and floating buttons—feel like band-aids on a broken paradigm. They acknowledge the problem but don’t fundamentally rethink how we interact with large screens.
Monophone reimagines mobile navigation from first principles, treating the thumb’s natural arc as the primary design constraint. Instead of forcing users to reach across vast screen real estate, the interface comes to them.
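A rough way to picture that constraint: treat the thumb as pivoting from the corner where the hand grips the device, and call a screen point reachable only if it falls inside the band the thumb can sweep without stretching or curling. The sketch below is purely illustrative; the pivot position and radii are assumed numbers, not measurements from this project.

```ts
// Illustrative model of a right-handed thumb reach zone.
// Pivot point and radii are assumed values, not project data.

interface Point {
  x: number; // px from the left edge of the screen
  y: number; // px from the top edge of the screen
}

interface ReachModel {
  pivot: Point;      // where the thumb's base sits while gripping
  minRadius: number; // too close: the thumb has to curl uncomfortably
  maxRadius: number; // too far: the thumb has to stretch or the grip must shift
}

// Rough defaults for a ~6-inch screen held in the right hand.
const rightHandDefault: ReachModel = {
  pivot: { x: 360, y: 780 },
  minRadius: 120,
  maxRadius: 420,
};

// A target is "comfortable" if it lies within the annular band
// swept by the thumb around its pivot.
function isComfortable(target: Point, model: ReachModel = rightHandDefault): boolean {
  const dx = target.x - model.pivot.x;
  const dy = target.y - model.pivot.y;
  const distance = Math.hypot(dx, dy);
  return distance >= model.minRadius && distance <= model.maxRadius;
}

// A button in the top-left corner is out of reach, so the interface
// should bring it down into the band instead of making the user stretch.
console.log(isComfortable({ x: 24, y: 40 }));   // false
console.log(isComfortable({ x: 180, y: 600 })); // true
```

Anything outside that band becomes a candidate for being pulled toward the thumb rather than forcing a grip shift.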
The concept introduces several innovative interaction patterns:
The demo is built as a React web application that showcases the core interaction concepts. It uses:
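Whatever the exact stack, the core layout idea can be sketched as a small React component: instead of arranging navigation targets in a fixed grid, fan them out along an arc centered on the thumb's pivot so every target sits at the same comfortable distance. The component below is a hypothetical TypeScript/React sketch, not the demo's actual code; the pivot, radius, and sweep angles are placeholder values.

```tsx
import React from "react";

interface ArcNavProps {
  items: { id: string; label: string; onSelect: () => void }[];
  pivot?: { x: number; y: number }; // thumb pivot, px from the top-left corner
  radius?: number;                  // distance of targets from the pivot
}

// Lays navigation targets along an arc around the thumb's pivot,
// so every target sits at the same comfortable reach distance.
// All defaults are placeholder values for illustration only.
export function ArcNav({
  items,
  pivot = { x: 360, y: 780 },
  radius = 260,
}: ArcNavProps) {
  // Sweep from pointing left (180°) up to pointing straight up (270°),
  // which roughly matches a right thumb's natural arc.
  const startAngle = Math.PI;
  const endAngle = 1.5 * Math.PI;
  const step =
    items.length > 1 ? (endAngle - startAngle) / (items.length - 1) : 0;

  return (
    <nav>
      {items.map((item, i) => {
        const angle = startAngle + i * step;
        const x = pivot.x + radius * Math.cos(angle);
        const y = pivot.y + radius * Math.sin(angle);
        return (
          <button
            key={item.id}
            onClick={item.onSelect}
            style={{
              position: "fixed",
              left: x,
              top: y,
              transform: "translate(-50%, -50%)",
            }}
          >
            {item.label}
          </button>
        );
      })}
    </nav>
  );
}
```

With this layout, adding or removing destinations simply re-spaces them along the same arc rather than pushing anything out of reach.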
The project began with extensive research into hand anthropometry and grip patterns. Studying how people naturally hold their phones—on trains, while walking, in bed—revealed clear patterns in which reach zones feel comfortable and which do not.
Prototyping started with paper mockups to quickly test gesture ideas, then moved to interactive prototypes to refine the timing and feel of animations. The goal was to make every interaction feel as natural as using a physical tool.
While this concept demo focuses on navigation, the principles could extend to:
The Monophone concept isn’t just about making phones easier to use—it’s about designing technology that adapts to human limitations rather than forcing humans to adapt to technology.