Date: Thursday, April 16th, 2009, 08:56
Category: iPhone, Patents, Software
Apple may be looking into creating a version of its iPhone with a front-facing camera, as well as a software interface that can adjust itself for more precise interaction when the user carrying the phone is in motion.
While the front-facing camera idea hints at the inevitable adoption of video conferencing capabilities by the iPhone in the coming years, the adaptive software interface concept could become a reality much sooner. It would improve a user's accuracy in making touch selections by increasing the size of user interface elements on the touch-screen when it's determined that the user is operating the device while jogging or participating in some other kind of motion-based activity.
According to AppleInsider, Apple has filed a patent that proposes an updated version of its iPhone OS software that can detect when the device is in motion and then compare the detected degree of motion to one or more predetermined “signatures of motion.” The iPhone software could then adjust itself by enlarging selection areas on the screen to a degree suitable for the current motion of the device and user.
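The mechanism described in the filing can be illustrated with a short sketch. The signature names, thresholds, and scale factors below are all assumptions for illustration, not values from Apple's patent; the idea is simply to match a detected motion level against predetermined "signatures of motion" and enlarge touch targets accordingly.

```python
# Hypothetical sketch (not Apple's actual implementation): compare a
# measured motion magnitude against predetermined "signatures of motion"
# and scale list-row touch targets to suit.

# Assumed signatures: (label, minimum motion magnitude, row scale factor)
MOTION_SIGNATURES = [
    ("running", 2.0, 1.8),
    ("walking", 0.8, 1.4),
    ("stationary", 0.0, 1.0),
]

BASE_ROW_HEIGHT = 44  # points; the standard iPhone table-row height


def classify_motion(magnitude):
    """Return the first signature whose threshold the magnitude meets."""
    for label, threshold, scale in MOTION_SIGNATURES:
        if magnitude >= threshold:
            return label, scale
    return "stationary", 1.0


def adjusted_row_height(magnitude):
    """Enlarge the row height in proportion to the matched signature."""
    _, scale = classify_motion(magnitude)
    return round(BASE_ROW_HEIGHT * scale)
```

A stationary device would keep the standard 44-point rows, while a device matching the assumed "running" signature would grow each row to roughly 79 points, giving the user a larger target area for each contact.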
“For example, if the user wishes to view the contact information for ‘John Adams,’ the user touches the display over the area of the row for the contact ‘John Adams,’” Apple says. “While the device is moving, the motion of the device can be detected. The device can change the size of the rows of the contacts in the contact list application to give the user a larger target area for each contact. For example, the height of a row can be increased. This gives the user a larger touch area with which to select a contact. In some implementations, the height of the toolbar can be increased as well.”
The 16-page patent filing, submitted in November 2007, also suggests that interface elements, such as an array of home screen icons, could shift their position on the screen based on predictions of where the user is likely to touch. Oddly enough, the need for such adjustments isn't entirely clear from Apple's description.
“The shift moves the target touch areas of the display objects to a different position. In some implementations, the new position is a predetermined distance from the original position,” the company says. “In some other implementations, the new position is determined by the device based on a prediction of where the user will touch the touch-sensitive display if the user wanted to select the user interface element while the device is in motion.”
The filing is credited to Apple employee John Louch.