Our Future iPhones Could Alter Touch Controls Based On User Movement

January 15, 2014

Our iPhones are getting smarter each and every year, and now a new patent explains how future iDevices could alter the touch controls presented to users based on their levels of movement.

Though the patent was originally filed back in 2007, during the early days of the iPhone, it was granted by the U.S. Patent and Trademark Office only recently (and was brought to our attention by AppleInsider). Called “Variable device graphical user interface,” the patent describes a system whereby the graphical user interface (GUI) presented by an iDevice is determined, in part, by its user’s level of activity.

In terms of real-world application, this means that when an iPhone is being used as a music player during a workout (while power walking or jogging, for example), the volume controls would automatically enlarge, making it easier for users to interact with the touch controls while on the move.
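The patent doesn’t spell out an implementation, but a rough approximation is possible on current hardware. Here’s a minimal sketch, assuming a hypothetical volumeSlider view and an arbitrary 1.2 g movement threshold (neither comes from the patent):

```swift
import UIKit
import CoreMotion

// A minimal sketch (not Apple's patented method) of enlarging a volume
// control when sustained device motion is detected. The `volumeSlider`
// view and the 1.2 g threshold are illustrative assumptions.
final class MotionAwareVolumeController: UIViewController {
    private let motionManager = CMMotionManager()
    private let volumeSlider = UISlider()

    override func viewDidLoad() {
        super.viewDidLoad()
        volumeSlider.frame = CGRect(x: 20, y: 200, width: 280, height: 40)
        view.addSubview(volumeSlider)

        guard motionManager.isAccelerometerAvailable else { return }
        motionManager.accelerometerUpdateInterval = 0.2
        motionManager.startAccelerometerUpdates(to: OperationQueue.main) { [weak self] data, _ in
            guard let self = self, let a = data?.acceleration else { return }
            // Overall acceleration magnitude in g; roughly 1.0 when the device is still.
            let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
            let isMoving = magnitude > 1.2   // crude "user is on the move" guess
            UIView.animate(withDuration: 0.3) {
                // Enlarge the control while moving so it is easier to hit.
                self.volumeSlider.transform = isMoving
                    ? CGAffineTransform(scaleX: 1.5, y: 1.5)
                    : .identity
            }
        }
    }
}
```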

Most interesting, perhaps, is that the patent appears to be largely software-based: the hardware it cites, including the accelerometer and gyroscope, is already present in our current-generation iDevices, so in theory this feature could be implemented in a subsequent iOS update.

The patent goes one step further, however, and also aims to figure out the kind of activity the iDevice user is partaking in. AppleInsider explains:

Changes in metrics such as device acceleration and orientation can be used to interpret the motion of a device. More importantly, patterns can be detected and matched to a database of predetermined criteria. For example, if the device senses bobbing, it can be determined that the user is walking. An oscillating motion may denote running, while minute bounces could signal that a user is riding in a car.

As such, the resultant changes to an iDevice’s GUI would reflect the kind of movements its user is making.
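On today’s hardware, Core Motion already exposes this kind of coarse activity classification through CMMotionActivityManager, which distinguishes walking, running, and riding in a vehicle much as the patent describes. Here’s a minimal sketch, assuming the app simply logs each detected state (how the GUI reacts is up to the developer):

```swift
import CoreMotion

// A sketch of coarse activity classification using Core Motion's
// CMMotionActivityManager, which mirrors the walking / running / driving
// distinctions described in the patent.
let activityManager = CMMotionActivityManager()

if CMMotionActivityManager.isActivityAvailable() {
    activityManager.startActivityUpdates(to: OperationQueue.main) { activity in
        guard let activity = activity else { return }
        if activity.walking {
            print("User appears to be walking")      // e.g. enlarge key controls
        } else if activity.running {
            print("User appears to be running")      // e.g. switch to a simplified GUI
        } else if activity.automotive {
            print("User appears to be riding in a car")
        } else if activity.stationary {
            print("User appears to be stationary")   // restore the default layout
        }
    }
}
```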

Apple’s patent also describes how an iDevice’s user interface would be adjusted in order to compensate for the angle at which its owner is holding the product. In particular, a “fisheye” effect could be implemented in order to bring certain areas of the user interface into focus.
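Again, the patent leaves the details open. One way the angle-reading half could work today is by sampling the device’s attitude via Core Motion; the sketch below does that and hands the result to a hypothetical applyFisheye(around:) helper, which is an assumption rather than a real UIKit API:

```swift
import UIKit
import CoreMotion

// A rough sketch of the angle-compensation idea: read the device's attitude
// and shift a "focal point" for a fisheye-style emphasis effect.
// `applyFisheye(around:)` is an assumed placeholder, not a real UIKit API.
final class TiltAwareView: UIView {
    private let motionManager = CMMotionManager()

    func startTrackingTilt() {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 30.0
        motionManager.startDeviceMotionUpdates(to: OperationQueue.main) { [weak self] motion, _ in
            guard let self = self, let attitude = motion?.attitude else { return }
            // Map pitch and roll (in radians) to an offset from the view's center.
            let offset = CGPoint(x: CGFloat(attitude.roll) * 100,
                                 y: CGFloat(attitude.pitch) * 100)
            let focalPoint = CGPoint(x: self.bounds.midX + offset.x,
                                     y: self.bounds.midY + offset.y)
            self.applyFisheye(around: focalPoint)
        }
    }

    private func applyFisheye(around point: CGPoint) {
        // Placeholder: a real implementation might magnify subviews near `point`
        // and shrink those farther away.
    }
}
```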

The last aspect of the patent, according to the publication, involves iDevices “learning” how users touch their smartphones and tablets. “This data can be stored and later retrieved to predict where the user will touch during a particular pattern of motion. The GUI can be remapped based on these predictions,” AppleInsider notes.
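The patent doesn’t describe how such predictions would be computed, but the general idea can be illustrated with a simple sketch: record where taps land relative to a control’s center for each motion state, then use the average offset to nudge the control’s hit area when that state recurs. The MotionState type and the averaging policy below are our assumptions, not Apple’s method:

```swift
import CoreGraphics

// An illustrative sketch of the "learning" idea: record tap offsets per
// motion state, then use the average offset to remap a control's hit target.
enum MotionState: Hashable { case stationary, walking, running, driving }

struct TouchOffsetLearner {
    private var samples: [MotionState: [CGVector]] = [:]

    mutating func record(touch: CGPoint, controlCenter: CGPoint, during state: MotionState) {
        let offset = CGVector(dx: touch.x - controlCenter.x,
                              dy: touch.y - controlCenter.y)
        samples[state, default: []].append(offset)
    }

    /// Average offset observed in a given state; use it to shift the control's hit area.
    func predictedOffset(for state: MotionState) -> CGVector {
        guard let offsets = samples[state], !offsets.isEmpty else { return .zero }
        let sum = offsets.reduce(CGVector.zero) {
            CGVector(dx: $0.dx + $1.dx, dy: $0.dy + $1.dy)
        }
        return CGVector(dx: sum.dx / CGFloat(offsets.count),
                        dy: sum.dy / CGFloat(offsets.count))
    }
}
```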

Some of Apple’s most recent patent applications concern the use of Liquidmetal, curved touch displays, and an iWallet for iOS. Though many Apple patents never reach our iDevices, the inclusion of the M7 coprocessor in the iPhone 5s suggests that the Cupertino, Calif.-based company is certainly interested in monitoring user activity.

We’ll keep you updated with further information as we receive it.

In the meantime, see: Patent Firm VirnetX Holding Corporation To Go After Apple’s iPhone 5s, iPad Air, Add An iOS 7-Style Lock Screen To Your Mac Using This OS X Screensaver, and Episode 2 Of Telltale Games’ The Wolf Among Us Will Be Among Us In Early February.