1. Field of the Invention
The present invention relates generally to user interfaces for humans to interact with electronic devices, particularly those electronic devices that are mobile.
2. Description of the Prior Art
A user interface facilitates the interaction between an electronic device such as a computer and a user by enhancing the user's ability to utilize application programs running on the device. The traditional interface between a human user and a typical personal computer is implemented with graphical displays and is generally referred to as a graphical user interface (GUI). Input to the computer or particular application program is accomplished by a user interacting with graphical information presented on the computer screen using a keyboard and/or mouse, trackball or other similar input device. Such graphical information can be in the form of displayed icons or can simply be displayed text in the form of menus, dialog boxes, folder contents and hierarchies, etc.
Some systems also utilize touch screen implementations of a graphical user interface whereby the user touches a designated area of a screen to effect the desired input. Some touch screen user interfaces, for example the one implemented in the iPhone mobile device made by Apple Inc. of Cupertino, Calif., use what is known as “finger-gestures.” Below is an exemplary list of such finger gestures and the associated commands they cause to be executed:
Tap: selects or activates the item touched.
Double tap: zooms in on, or back out of, the displayed content.
Drag: scrolls or moves the displayed content in the direction of the finger's movement.
Flick: scrolls the displayed content quickly in the direction of the finger's movement.
Pinch: two fingers moving toward each other zooms out of the displayed content, while two fingers spreading apart zooms in.
Referring now to
In a similar fashion, mobile device 101 may also have a traditional touch screen graphical user interface where, rather than using slider bar 107 with sliding element 109 to effect moving between images 103, 104, 105, 106, 107 and 108, the user simply touches the displayed images themselves and, again using the drag command, cycles through them.
There are, however, many applications where the user interfaces discussed above are impractical or inefficient. Having to use a separate input device such as a mouse to interact with a GUI becomes inconvenient when that means carrying both a mobile device and a mouse device, and further requires the use of two hands, one to hold the mobile device and one to operate the mouse device. This latter limitation likewise exists in the case of traditional touch screen user interfaces deployed on mobile devices. These limitations of the prior art are overcome by providing a motion driven user interface for mobile devices as described herein.
In one example is a mobile device user interface method comprising: detecting motion of the mobile device using one or more sensors located within the mobile device; confirming by a processor of the mobile device that the detected motion of the mobile device exceeds a preset threshold; determining by the mobile device processor that the confirmed detected motion of the mobile device matches a defined type of motion; and executing by the mobile device processor a user interface input command associated with the defined type of motion.
In a further example of the mobile device user interface method, the user interface input command associated with the defined type of motion varies depending upon the context in which the mobile device user interface is operating when the step of detecting motion of the mobile device occurs.
In another example is a non-transitory computer readable medium containing programming code executable by a processor, the programming code configured to perform a mobile device user interface method, the method comprising: detecting motion of the mobile device using one or more sensors located within the mobile device; confirming by a processor of the mobile device that the detected motion of the mobile device exceeds a preset threshold; determining by the mobile device processor that the confirmed detected motion of the mobile device matches a defined type of motion; and executing by the mobile device processor a user interface input command associated with the defined type of motion.
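Purely as an illustrative sketch, and not as the claimed implementation, the four steps of this method might be organized as follows in Python, where the threshold value and the helper routines for matching motions and dispatching commands are hypothetical:

    import math

    MOTION_THRESHOLD = 0.5  # hypothetical preset threshold; units depend on the sensors used

    def process_motion(delta, defined_motions, execute_command):
        """delta: (dx, dy, dz) net motion computed from the device's sensors.
        defined_motions: mapping of motion name to a predicate over delta.
        execute_command: callable that carries out a user interface input command."""
        # Steps 1-2: motion has been detected; confirm it exceeds the preset threshold.
        magnitude = math.sqrt(sum(c * c for c in delta))
        if magnitude <= MOTION_THRESHOLD:
            return None  # incidental movement below the threshold is ignored
        # Step 3: determine whether the motion matches a defined type of motion.
        for name, matches in defined_motions.items():
            if matches(delta):
                # Step 4: execute the associated user interface input command.
                execute_command(name)
                return name
        return None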
In various embodiments are provided methods and systems for a motion driven context sensitive user interface for mobile devices. In one embodiment the method provides a user with the option to cause execution of certain user interface input commands by physically moving the mobile device in space. This provides a user with the convenience of interacting with the mobile device using embedded sensors in the mobile device. By gathering and processing data from multiple sensors within the mobile device, certain commands can be executed that in the past required a traditional user interface such as a graphical user interface.
As portable electronic devices become more compact and the number of functions performed by a given device increases, it has become a significant advantage to use these portable devices for functions other than the ones they were originally designed for. Some mobile devices, like the iPhone, whose main function may be considered to be primarily that of a phone, can also be used for gaming as they provide enough computing power, incorporate a touch screen interface and have embedded sensors like global positioning system (GPS), camera, compass, gyroscope and accelerometer. Such devices are ripe for a shift in the way user interfaces are implemented. Taking advantage of these embedded sensors, a motion driven user interface is disclosed herein. Thus, by moving the device in space, certain commands can be executed as will be described.
As has been discussed, menu driven GUIs are tedious and often require the use of both hands (e.g., one hand to hold the device and the other to control an external input device or touch the screen). However, using the approach described herein, certain motions are natural and can be easily performed with the one hand that is holding the mobile device, thus giving the user the freedom to do other tasks with the spare hand while still meaningfully interacting with the mobile device.
The present disclosure describes a motion driven user interface as a method and a system that uses the output of multiple sensors available in a mobile device to capture the motion and then perform a command/task associated with that particular motion.
Various sensors available on mobile devices are briefly discussed below:
Accelerometer: measures linear acceleration of the device along one or more axes.
Gyroscope: measures angular velocity, i.e., the rate of rotation of the device.
Compass (magnetometer): measures the orientation of the device relative to magnetic north.
Global positioning system (GPS): determines the geographic position of the device.
Camera: captures still images or video of the device's surroundings.
This application discloses methods and systems that use one or more of the above-listed embedded sensors in a mobile device to implement a motion driven user interface to further enhance the user experience.
Referring now to
In various embodiments the operations and processing described are handled by a processor of the mobile device running software stored in memory of the mobile device.
While any such defined type of motion may be used, in one embodiment these motions are typically either a linear motion, an angular motion or a composite motion (it is to be understood that a composite motion is a combination of more than one linear motion, a combination of more than one angular motion or a combination of at least one linear motion and at least one angular motion) as will be explained.
In this way the mobile device, using its sensors, can measure and calculate a range of motions that can then be translated to commands that are context-specific, i.e., the command is different depending upon the operating context of the mobile device. As such, in some embodiments, the user interface input command associated with the defined type of motion is dependent upon what context the mobile device is operating in at the time motion of the mobile device is detected in step 201 as will be explained.
To be clear, an operating context is the current user interface operational mode or state of the mobile device. In various examples the operating context differs when the mobile device is displaying to the user one application's GUI versus a different application's GUI, when the mobile device is displaying to the user an application's GUI while running one part or function of the application versus that same application's GUI while running a different part or function, or when the mobile device is displaying to the user an application GUI versus an operating system function GUI. It is to be understood that operating context can vary by part or function of an application, by part or function of one application versus part or function of a different application, or by part or function of an operating system, and can also vary depending upon particular hardware components of the mobile device (e.g., what sensors are included in the mobile device, what other user interface input devices are included in or coupled to the mobile device, etc.).
In various embodiments, the association between a defined type of motion and its user interface input command is mapped in a table, or may use a database or a file for such mapping. The mapping may vary from one context to another. For example, when playing an AR game, game-related motions may be applicable. Alternatively, the mapping may change as the user progresses from one level of the game to another. Exemplary defined types of motions and associated user interface input commands are shown in the table below, which also shows examples of different commands which may be executed dependent upon the operating context of the mobile device when the device motion occurs.
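While the exemplary table itself is not reproduced here, a hedged sketch of such a context-sensitive mapping, using nested dictionaries keyed first by operating context and then by defined type of motion (all context, motion, and command names below are hypothetical), might look as follows:

    # Hypothetical context -> motion -> command associations; a table,
    # database, or file could hold the same mapping.
    MOTION_COMMANDS = {
        "photo_album": {
            "lateral_right": "next_photo",
            "lateral_left": "previous_photo",
        },
        "video_player": {
            "lateral_right": "play",
            "lateral_left": "pause",
            "quick_jab_right": "fast_forward",
            "quick_jab_left": "rewind",
        },
        "racing_game": {
            "rotate_clockwise_z": "turn_right",
            "rotate_counterclockwise_z": "turn_left",
        },
    }

    def lookup_command(context, motion):
        # The same defined type of motion yields a different command
        # depending on the operating context.
        return MOTION_COMMANDS.get(context, {}).get(motion)

Under this sketch, lookup_command("video_player", "lateral_right") returns "play", while the same lateral motion in the "photo_album" context returns "next_photo".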
Referring now to
Mobile device 301 can be moved laterally (left to right or right to left in the figure) along x-axis 305 as indicated by movement arrow 311 in the figure. Mobile device 301 can likewise be moved longitudinally (up or down in the figure) along y-axis 307 as indicated by movement arrow 315 in the figure. Mobile device 301 can also be moved in or out of the figure along z-axis 309 as indicated by the movement arrow 319 in the figure. These are the linear motions of mobile device 301.
Mobile device 301 can be moved in a clockwise or counterclockwise fashion (rotated) about x-axis 305 as indicated by a pair of rotation arrows 313 in the figure. Mobile device 301 can likewise be moved in a clockwise or counterclockwise fashion (rotated) about y-axis 307 as indicated by a pair of rotation arrows 317 in the figure. Mobile device 301 can also be moved in a clockwise or counterclockwise fashion (rotated) about z-axis 309 as indicated by a pair of rotation arrows 319 in the figure. These are the angular motions of mobile device 301.
Mobile device 301 can also be moved via a combination of the linear motions, the angular motions or both, as previously stated. These are the composite motions of mobile device 301.
It is to be understood that although the intersection of the three axes, namely x-axis 305, y-axis 307 and z-axis 309, commonly referred to as an origin, is shown as being located some distance from mobile device 301 in the figure, this was merely done for visual clarity in the figure and therefore need not be the case in any given situation. Thus, the origin may be located at any point in space relative to mobile device 301, including touching or even within mobile device 301. Therefore, any discussion herein regarding mobile device movement with respect to the three axes (whether linear, angular or composite) is likewise understood to cover any placement of the origin and the three axes relative to mobile device 301. For example, discussion herein of angular motion of mobile device 301 about y-axis 307 can mean that mobile device 301 starts from a position some distance along x-axis 305 and therefore all of mobile device 301 is moving around y-axis 307 (in which case all of mobile device 301 is moving through space). It can instead mean that a left edge of mobile device 301 starts from a position no distance along x-axis 305 (in which case the left edge of mobile device 301 is coincident with y-axis 307) and therefore the rest of mobile device 301 is moving around y-axis 307 while the left edge of mobile device 301 stays stationary (in which case mobile device 301 is pivoting about its left edge). Or it can mean that some part of mobile device 301 starts from a position some negative distance along x-axis 305 and therefore the rest of mobile device 301 on either side of that part is moving around y-axis 307 while that part stays stationary (in which case mobile device 301 is essentially stationary while rotating in space).
Referring now to
This is accomplished because a user's movement of mobile device 301 is detected by sensors within the mobile device, such as a gyroscope and an accelerometer, which sensors inform the mobile device when the user starts and stops the motion. When the mobile device detects that a motion has started, it can then continue to track changes in the x, y, or z coordinates of the mobile device's position until it detects that the motion has stopped. If the mobile device calculates a net change with respect to an initial stationary position in coordinates along x-axis 305, e.g., from a smaller value to a larger value, and it does not measure any appreciable changes in coordinates along either of y-axis 307 or z-axis 309 (for example, a preset threshold whereby the changes in magnitude along y-axis 307 and z-axis 309 must each be less than 10% of the magnitude of the net change along x-axis 305), then the mobile device can conclude that the user performed an intentional left-to-right lateral motion with the device. Of course, the preset threshold can be definable and may vary from one instance to another depending on implementation and operating context.
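A minimal sketch of this check, assuming the sensor outputs have already been reduced to net displacement components along the three axes (the 10% cross-axis threshold follows the example above):

    def classify_lateral_motion(dx, dy, dz, cross_axis_ratio=0.10):
        """Return 'left_to_right', 'right_to_left', or None for a motion
        whose net change is essentially confined to the x-axis."""
        # Movement along the other two axes must stay below the preset
        # fraction of the magnitude of the net x-axis change.
        limit = cross_axis_ratio * abs(dx)
        if abs(dy) >= limit or abs(dz) >= limit:
            return None  # appreciable off-axis movement: not a lateral motion
        if dx > 0:
            return "left_to_right"
        if dx < 0:
            return "right_to_left"
        return None

The same comparison, applied instead to the y-axis or z-axis component, yields the up-down and in-out classifications described below.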
Of course, if the operating context were different, for example if a video was paused on display screen 303 of mobile device 301 when the lateral movement occurred, some other associated user interface input command would execute, for example, to play the video. Likewise, if mobile device 301 is then moved back laterally to the left then an associated user interface input command of pausing the video would be executed.
In a further embodiment, the speed with which the mobile device is moved, again as sensed via sensors within the mobile device, can also be used to determine a defined type of motion. For example, a quick lateral movement to the right can be a defined type of motion such that if the mobile device is so moved when playing a video on the display screen this can cause a fast-forward user interface input command to be executed. Likewise a quick lateral movement to the left can be a defined type of motion such that if the mobile device is so moved when playing a video on the display screen this can cause a rewind user interface input command to be executed.
Numerous other examples are possible, including moving a mobile device laterally towards the left with respect to an initial stationary position to cause execution of a user interface input command of panning left in a virtual world or rolling back a radial view, whereas a quick sideways or lateral ‘jab’ to the left can cause execution of a user interface input command to undo a previous action or rewind a video as has been explained. Likewise, moving the device towards the right with respect to an initial stationary position can cause execution of a user interface input command to pan right in a virtual world, roll forward in a radial view, or play a video, whereas a quick sideways or lateral ‘jab’ to the right might execute a user interface input command to fast forward the video or repeat or redo a previous action. Similarly, a ‘sideways shake’ motion where the user moves the mobile device laterally to the left and laterally to the right repeatedly might execute, depending on the context, a user interface input command to scramble, reorganize, or sort when a list view or other view which contains individually selectable elements is displayed on a display screen of the mobile device.
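Distinguishing a slow pan from a quick ‘jab’ can be sketched by also considering how long the motion took; the speed threshold below is a hypothetical value:

    def classify_motion_speed(net_displacement, duration_seconds, jab_speed=1.5):
        """Label a motion 'quick' (a jab) or 'slow' (a pan) based on its
        average speed; jab_speed is a hypothetical threshold."""
        if duration_seconds <= 0:
            return None
        average_speed = abs(net_displacement) / duration_seconds
        return "quick" if average_speed >= jab_speed else "slow"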
Referring now to
Again, this is accomplished because a user's movement of mobile device 301 is detected by sensors within the mobile device, which sensors inform the mobile device when the user starts and stops the motion. In order to determine and calculate the up-down motions, the mobile device compares the coordinate values along the y-axis with respect to an initial stationary position, while the coordinate values along the x-axis and z-axis remain relatively unchanged, i.e., change less than some preset threshold. If the net difference between the initial position and a final position (i.e., the difference between coordinate values along the y-axis) is positive then an upwards motion is indicated, whereas a net negative change indicates a downwards motion.
Thus, in one embodiment, a user moving the mobile device upwards with respect to the initial stationary position can cause execution of a user interface input command of panning the camera up in a virtual world or moving to a higher level in a folder hierarchy, whereas a quick ‘jab’ up motion can cause execution of a user interface input command to make the user's displayed avatar jump in that virtual world or move to the top of a folder hierarchy, again depending upon implementation and operating context.
Likewise, the user moving the mobile device downwards with respect to the initial stationary position can cause execution of a user interface input command of panning the camera down in a virtual world or moving to a lower level in a folder hierarchy, whereas a quick ‘jab’ down motion can cause execution of a user interface input command to make the user's displayed avatar duck or slide in that virtual world or move to the bottom of a folder hierarchy.
Another possible example based on detected speed of a motion is a quick ‘up-down shake’ where the user quickly moves the mobile device up and down repeatedly, which can cause execution of a user interface input command, depending on implementation and operating context, of charging a weapon in a game, making the user's displayed avatar flip through the air after a jump, or performing a vertical sort similar to how the quick sideways shake corresponds to a horizontal sort.
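A shake of either orientation can be sketched as counting direction reversals within a short sampling window, assuming a stream of displacement deltas along the relevant axis:

    def is_shake(deltas, min_reversals=4):
        """deltas: successive displacement changes along one axis.
        A shake is indicated by several sign reversals in quick succession;
        min_reversals is a hypothetical threshold."""
        reversals = 0
        previous_sign = 0
        for delta in deltas:
            sign = (delta > 0) - (delta < 0)  # -1, 0, or +1
            if sign != 0 and previous_sign != 0 and sign != previous_sign:
                reversals += 1
            if sign != 0:
                previous_sign = sign
        return reversals >= min_reversals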
Referring now to
Again, this is accomplished because a user's movement of mobile device 301 is detected by sensors within the mobile device, such as a gyroscope and an accelerometer, which sensors inform the mobile device when the user starts and stops the motion. In order to determine and calculate the in-out motions, the mobile device compares coordinate values along the z-axis with respect to an initial position, while the coordinate values along the x-axis and y-axis remain relatively unchanged, i.e., change less than some preset threshold. If the net difference between the initial position and a final position (i.e., the difference between coordinate values along the z-axis) is positive then an outwards or forward motion is indicated, whereas a net negative change indicates an inwards or backward motion.
Thus, in one embodiment, a user moving the mobile device away from the user with respect to the initial stationary position can cause execution of a user interface input command to zoom out of a displayed scene. This zoom-out command can likewise occur in the operating context of a displayed AR image received from the output of a video capture device (e.g., a camera) of the mobile device where, by the user moving the mobile device away from themselves, the AR image can be zoomed out as shown in the figure.
Referring now to
Again, this is accomplished because a user's movement of mobile device 301 is detected by sensors within the mobile device, which sensors inform the mobile device of the motion or movement of the mobile device. Using these sensors the mobile device can determine, for example, that the right side of the device is rotating clockwise about the y-axis while the left side of the device has remained in a relatively constant position, i.e., has moved less than some preset threshold. In this example, referring now to
Further, as has been explained, a user's movement of mobile device 301 can cause execution of a different user interface input command depending upon which operating context mobile device 301 is operating in when mobile device 301 detects that it has been moved by the user. For example, in the example shown with reference to
In one embodiment a preset threshold is defined to eliminate a possible margin of error (for example 10 angular degrees) within which the orientation of the left hand side of the mobile device could vary. Since the change in the right side of the device increased about the positive z-axis (and somewhat toward the negative x-axis) outside of the margin of error, and the changes in the left side of the device were within the margin of error, the mobile device can then positively conclude that the user did rotate the right side of the device inward or forward.
Specifically, if the mobile device received linear position coordinates of the mobile device along the x-axis, y-axis and z-axis through the accelerometer, gyroscope, and/or other sensors from the sides of the device, then it would first transform these coordinates to a spherical coordinate system. As an example:
Let:
r=sqrt(x^2+y^2+z^2)
theta=acos(z/r)
phi=atan2(x,y)
Then:
Again, this is accomplished because a user's movement of mobile device 301 is detected by sensors within the mobile device, which sensors measure the acceleration and deceleration of the device and inform the mobile device when the user starts and stops the motion. When the mobile device detects that a motion has started, it can then continue to track changes in the phi, theta, or r coordinates of the device's spherical position until it detects that the motion has stopped. If the mobile device calculates a net change with respect to the initial stationary position in the value of right-side phi coordinates, e.g., from a negative value to a positive value, and it does not measure any appreciable changes in the coordinates of theta and r (for example, changes in magnitude less than 10% of the magnitude of the change in phi), then the system can conclude that the user performed an intentional rotation of the device about its left edge.
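The transformation and the rotation check might be sketched as follows, assuming position samples for the right edge of the device and hypothetical 10% margins:

    import math

    def to_spherical(x, y, z):
        # Transform Cartesian coordinates to the spherical form given above.
        r = math.sqrt(x * x + y * y + z * z)
        theta = math.acos(z / r) if r else 0.0
        phi = math.atan2(x, y)  # argument order follows the convention above
        return phi, theta, r

    def right_edge_rotation(start, end, ratio=0.10):
        """start, end: (x, y, z) positions of the device's right edge at the
        beginning and end of the motion. Conclude a rotation about the left
        edge when phi changes appreciably while theta and r stay nearly constant."""
        phi0, theta0, r0 = to_spherical(*start)
        phi1, theta1, r1 = to_spherical(*end)
        d_phi = phi1 - phi0
        # theta and r must remain relatively unchanged.
        if abs(theta1 - theta0) >= ratio * abs(d_phi):
            return None
        if r0 and abs(r1 - r0) / r0 >= ratio:
            return None
        if d_phi > 0:
            return "forward"   # e.g., the right side rotated inward or forward
        if d_phi < 0:
            return "backward"
        return None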
Thus, in one embodiment, a user moving the right edge of the mobile device while the left edge stays relatively fixed with respect to the initial stationary position can cause execution of a user interface input command to cycle forward through a photo album, open or close a door in a displayed game, etc., depending upon implementation and operating context.
Likewise, moving the device from the left edge while the right edge stays relatively fixed with respect to the initial stationary position can cause execution of a user interface input command to cycle backward through a photo album, open or close a door in a displayed game, etc., depending upon implementation and operating context.
Another possible example based on detected speed of motion is when the user quickly and repeatedly moves the mobile device back and forth along one edge while the other edge stays relatively stationary, which movement can cause execution of a user interface input command to simulate a vacuum or a fan-blowing effect where various adjustable displayed elements are ‘blown’ or ‘sucked’ to the waving side of the screen.
Referring now to
Thus, in one embodiment, the user moving the mobile device in a clockwise direction about the z-axis as the center of rotation with respect to the initial stationary position can cause execution of a user interface input command to make a right turn, as when driving in the video game example. Similarly, a quick clockwise ‘toss’ angular motion about the z-axis can cause execution of a user interface input command to ‘send-to-back’ a top most view, as when shuffling cards, etc. Likewise, a user moving the device in a counterclockwise angular motion about the z-axis as the center of rotation with respect to the initial stationary position can cause execution of a user interface input command to make a left turn, as when driving in the video game example. Similarly, a quick counterclockwise ‘toss’ angular motion about the z-axis can cause execution of a user interface input command to ‘send-to-front’ a bottom most view, as when shuffling cards, etc. Still further, a repeated clockwise and counterclockwise alternating rotation or angular motion about the z-axis can cause execution of a user interface input command to stabilize a drifting car in a racing game or to unlock a locked file provided the correct sequence of precise ‘twists’ or rotations are applied, as with a combination lock. Again, it is to be understood that each of these defined types of motions and their associated user interface input commands are dependent upon implementation and operating context.
Another example (not shown) of angular motion of a mobile device is a user's rotation or angular motion of the mobile device about the x-axis. The user tipping the top edge of the mobile device away from the user (with either the bottom edge remaining stationary or moving towards the user) in a rotational or angular direction about the x-axis with respect to an initial stationary position of the device can cause execution of a range of user interface input commands depending on implementation and operating context.
Thus, in one embodiment, a user tipping the top edge of the mobile device backwards (moving the top of the device away from the user with the x-axis as the center of rotation with respect to the initial stationary position) can cause execution of a user interface input command to accelerate forward, such as when pressing the gas pedal of a vehicle (e.g., race car 903 of
Likewise, a user tipping the bottom edge of the mobile device backwards (moving the bottom of the device away from the user with the x-axis as the center of rotation with respect to the initial stationary position) can cause execution of a user interface input command to press the brake pedal of a vehicle (e.g., race car 903 of
Further, in one embodiment, the user moving the mobile device in a rotational or angular motion repeatedly back and forth about the x-axis as the center of rotation with respect to the initial stationary position can cause execution of a user interface input command to wind a coiled spring or reel in a virtual fishing game when a user has just caught a fish with an overhead ‘throw’ motion of the mobile device.
As previously explained, some defined types of motions are composite motions. For example, a repeated full circular motion can be used to cause execution of an “erase” user interface input command in an application that offers drawing, sketching and/or painting features. In another example, the same full circular motion can also be used to cause execution in a game of a user interface input command to polish, shine, and buff the paint of a car. It is to be understood that in these circular motion examples there is no rotation of the mobile device and therefore there is no axis of rotation. Instead, the device is being translated along a circular path where x=r cos(t) and y=r sin(t), (x,y) is the current position at time t, and r is within a defined range.
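One hedged way to sketch the detection of such a circular translation is to verify that successive (x, y) position samples stay at a roughly constant radius from their common center; the tolerance below is hypothetical:

    import math

    def is_circular_path(points, tolerance=0.15):
        """points: list of (x, y) positions sampled during the motion.
        True when all samples lie near one circle, consistent with
        x = r*cos(t), y = r*sin(t) about an estimated center."""
        if len(points) < 4:
            return False
        cx = sum(x for x, _ in points) / len(points)
        cy = sum(y for _, y in points) / len(points)
        radii = [math.hypot(x - cx, y - cy) for x, y in points]
        mean_r = sum(radii) / len(radii)
        if mean_r == 0:
            return False
        return all(abs(r - mean_r) / mean_r <= tolerance for r in radii)

Note that the estimated center is only reliable when the samples cover most of the circle, which suits the repeated full circular motions described above.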
In yet another example, a user moving the mobile device “up, right, down and left” (that is, a linear motion up along the y-axis, followed by a linear motion to the right along the x-axis, followed by a linear motion down along the y-axis, followed by a linear motion to the left along the x-axis) can cause execution of a user interface input command to add a border or frame around a displayed image or to crop or re-size a displayed image.
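Recognizing such a composite motion can be sketched as matching the ordered sequence of previously classified primitive motions against a defined pattern (the pattern and command names below are hypothetical):

    # Hypothetical composite gesture patterns built from primitive motion labels.
    COMPOSITE_MOTIONS = {
        ("up", "right", "down", "left"): "add_border_or_crop",
    }

    def match_composite(recent_motions):
        """recent_motions: primitive motion labels in the order detected,
        e.g. ['up', 'right', 'down', 'left']."""
        return COMPOSITE_MOTIONS.get(tuple(recent_motions))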
Similarly, some composite motions combine a linear motion with an angular motion. For example, a user moving a mobile device in circles (i.e., a rotational or angular motion about the z-axis) while moving the device away from the user (i.e., a linear motion along the z-axis) can cause execution of a user interface input command to tunnel or bore a hole when the mobile device is in the operating context of a treasure hunt game. Another example is a user moving the mobile device in a circular motion (i.e., a rotational or angular motion about the z-axis) while moving the device down (i.e., a linear motion along the y-axis) with respect to the initial position to cause execution of a user interface input command of creating turbulence in an AR space (e.g., to simulate a tornado in an AR image when playing a game) when the mobile device is in the operating context of running an AR game.
It is to be understood that the examples given are for illustrative purposes only (for example the diagrams show the mobile device in a landscape orientation, but the methods are applicable in a portrait orientation as well) and may be extended to other implementations and embodiments with a different set of sensors, defined types of motions, conventions and techniques. While a number of embodiments are described, there is no intent to limit the disclosure to the embodiment(s) disclosed herein. On the contrary, the intent is to cover all alternatives, modifications, and equivalents apparent to those familiar with the art.
Likewise, it is to be understood that although the term game has been used as an example, the techniques and approach described herein are equally applicable to any other piece of software code, application program, operating system or operating context. There is no intent to limit the disclosure to game applications or player applications, and the terms player and user are considered synonymous, as are the terms game and software application.
It is to be further understood that the mobile device described herein can be any mobile device with a user interface such as a phone, smartphone (such as the iPhone from Apple, Inc. or a phone running the Android OS from Google, Inc. of Mountain View, Calif.), personal digital assistant (PDA), media device (such as the iPod or iPod Touch from Apple, Inc.), electronic tablet (such as an iPad from Apple, Inc.), electronic reader device (such as the Kindle from Amazon.com, Inc. of Seattle, Wash.), hand held game console, embedded devices such as electronic toys, etc., that have a processor, memory and display screen.
Further, while a number of the examples are described as a game running on a mobile device, it is to be understood that the game itself, along with the ancillary functions such as sensor operations, device communications, user input and device display generation, etc., can all be implemented in software stored in a computer readable storage medium for access as needed to run such software on the appropriate processing hardware of the mobile device.
In the foregoing specification, the invention is described with reference to specific embodiments thereof, but those skilled in the art will recognize that the invention is not limited thereto. Various features and aspects of the above-described invention may be used individually or jointly. Further, the invention can be utilized in any number of environments and applications beyond those described herein without departing from the broader spirit and scope of the specification. The specification and drawings are, accordingly, to be regarded as illustrative rather than restrictive. It will be recognized that the terms “comprising,” “including,” and “having,” as used herein, are specifically intended to be read as open-ended terms of art.
This application claims the benefit of U.S. Provisional Patent Application No. 61/401,149 filed on Aug. 9, 2010 and entitled “Motion Driven User Interface,” which is incorporated herein by reference in its entirety.