Instead of the traditional approach, Apple proposes that tracking subsystems and onboard sensors within a mobile device be used to translate a user's physical motion into a panoramic navigation UI. In the examples that follow, data from accelerometers, cameras, gyroscopes, and other sensors is used to move a user through virtual street-level panoramic space.
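To illustrate the general idea (this is a minimal sketch, not the patent's implementation), the Swift snippet below shows how fused gyroscope and accelerometer data from Core Motion could be mapped onto the viewing direction of a street-level panorama. `PanoramaMotionController` and `PanoramaView` are hypothetical names invented for this example; only the Core Motion calls are real API.

```swift
import CoreMotion
import UIKit

/// Placeholder for whatever actually renders the 360° imagery (SceneKit, Metal, etc.).
/// Hypothetical type used only for this sketch.
final class PanoramaView: UIView {
    var heading: Double = 0 { didSet { setNeedsDisplay() } }  // rotation about vertical axis
    var pitch: Double = 0   { didSet { setNeedsDisplay() } }  // tilt up/down
}

/// Drives a panorama's viewing direction from device motion.
final class PanoramaMotionController {
    private let motionManager = CMMotionManager()
    private let panoramaView: PanoramaView

    init(panoramaView: PanoramaView) {
        self.panoramaView = panoramaView
    }

    func startTracking() {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 60.0  // ~60 Hz updates

        // Core Motion fuses the accelerometer and gyroscope into an attitude
        // estimate; here yaw and pitch are mapped directly to the panorama.
        motionManager.startDeviceMotionUpdates(using: .xArbitraryCorrectedZVertical,
                                               to: .main) { [weak self] motion, _ in
            guard let self = self, let attitude = motion?.attitude else { return }
            self.panoramaView.heading = attitude.yaw
            self.panoramaView.pitch = attitude.pitch
        }
    }

    func stopTracking() {
        motionManager.stopDeviceMotionUpdates()
    }
}
```

In practice, a camera-based or GPS-based tracking subsystem could also feed translational movement (walking forward, stepping sideways) into the same view, which is the kind of multi-sensor fusion the filing describes at a high level.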