Modern hand-held devices use an accelerometer to detect a change in orientation of the device between a landscape orientation and a portrait orientation, and to adjust a graphical user interface (GUI) within a display to switch between the two orientations. Some hand-held devices include a tilt-scroll feature wherein the GUI will slide horizontally or vertically in the plane of the display to depict a different orthogonal view in response to a tilt of the device.
Accordingly, various embodiments for a tiltable user interface are described below in the Detailed Description. For example, one embodiment comprises adjusting a graphical user interface in response to a tilt of a device. In this way, a graphical user interface may display a parallax effect in response to the device tilt.
This Summary is provided to introduce concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
Computing device 110 includes an accelerometer 105 to detect a tilt of the display 180. For example, the accelerometer 105 may detect a rotation 106 or a translation 107 of the computing device 110 and provide an input 108 indicating a tilt of the display 180 to an orientation module 140 in the computer program 130. Other inputs may include a shake input, a roll input, or other combinations of inputs. In some embodiments, the orientation module 140 may receive an input depicting a rotation 106 or translation 107 from other position detection hardware, such as a gyroscope, a position sensing system, a global positioning system (GPS) receiver, etc. Computing device 110 also includes a user interface module 160 in communication with the display 180 and the orientation module 140 and configured to provide a tilted view 162 in response to a detected tilt.
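By way of illustration only, the flow from accelerometer input 108 through the orientation module 140 and user interface module 160 to the display 180 might be sketched as below. The class and attribute names are hypothetical stand-ins, not an actual API; the reference numerals identify components in the figures.

```python
from dataclasses import dataclass


@dataclass
class TiltInput:
    """Input indicating a tilt: rotation (radians) about the display's X and Y axes."""
    rot_x: float
    rot_y: float


class OrientationModule:
    """Stand-in for orientation module 140: interprets raw sensor readings."""

    def interpret(self, raw: TiltInput) -> TiltInput:
        # A real device would filter and scale accelerometer/gyroscope data here;
        # this sketch simply passes the reading through.
        return raw


class UserInterfaceModule:
    """Stand-in for user interface module 160: produces a tilted view for the display."""

    def tilted_view(self, tilt: TiltInput) -> str:
        return f"view tilted by ({tilt.rot_x:+.2f}, {tilt.rot_y:+.2f}) rad"


# Usage: an accelerometer reading flows through the two modules to the display.
orientation = OrientationModule()
ui = UserInterfaceModule()
print(ui.tilted_view(orientation.interpret(TiltInput(rot_x=0.10, rot_y=-0.05))))
```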
The computing device 110 may detect a tilt having a component of rotation around at least one of the X-axis or the Y-axis. The orientation module 140 may process the rotation 106 and determine whether the user intended to tilt the graphical user interface 185. In this way, if a user intends to rotate the device around the X-axis or the Y-axis of the display but instead rotates the device around an axis that is not the X-axis, the Y-axis, or the Z-axis, the orientation module 140 may determine that the user intended to tilt the graphical user interface 185 according to the detected rotational component.
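A minimal sketch of this determination, assuming the orientation module simply keeps the X and Y rotational components of an off-axis rotation when either exceeds a small noise floor (the threshold value and function name are illustrative assumptions):

```python
def intended_tilt(rot_x, rot_y, rot_z, min_component=0.02):
    """Return the (X, Y) rotational components treated as the intended tilt.

    An off-axis rotation still yields a tilt: the Z component is ignored for
    tilt purposes, and the X/Y components are kept when either exceeds the
    assumed noise floor min_component (radians).
    """
    if max(abs(rot_x), abs(rot_y)) < min_component:
        return None  # no meaningful tilt intent detected
    return rot_x, rot_y


# A rotation mostly about a diagonal axis still produces an X/Y tilt.
print(intended_tilt(rot_x=0.12, rot_y=0.08, rot_z=0.30))
```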
In one example, the display 180 may show a first view in the graphical user interface 185 and the orientation module 140 may receive an input 108 from the accelerometer indicating a tilt to the computing device 110. Then, the orientation module 140 may calculate an amount of tilt 142 to be applied to the first view shown in the graphical user interface. Then, the user interface module 160 may generate a tilted view 162 including a portion of at least one graphical element that was not displayed in the first view, wherein the display 180 is configured to display the tilted view in the graphical user interface 185.
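As a non-limiting sketch, the following assumes that a tilt about the Y-axis pans a one-dimensional viewport linearly, so an element lying just outside the first view (such as icon 350) has a portion revealed in the tilted view. The geometry, element names, and pan factor are illustrative assumptions:

```python
def visible_elements(elements, viewport_width, pan_per_radian, tilt_y):
    """Return the elements that intersect the viewport after a tilt.

    Tilting about the Y-axis pans the viewport horizontally by an assumed
    linear factor, so content outside the first view can become visible.
    """
    offset = pan_per_radian * tilt_y
    view_left, view_right = offset, offset + viewport_width
    return [name for name, left, right in elements
            if left < view_right and right > view_left]


elements = [("first element", 40, 200), ("icon 350", 330, 360)]
print(visible_elements(elements, viewport_width=320, pan_per_radian=120, tilt_y=0.0))   # first view
print(visible_elements(elements, viewport_width=320, pan_per_radian=120, tilt_y=0.25))  # tilted view
```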
In some embodiments, a tilted view includes an icon 350 that is not displayed in the first view. For example, a status icon such as a battery icon, a wireless connection icon, etc. may be viewable by tilting a device but not viewable in a first view. This allows icons that are infrequently utilized, or that have a changing status, to remain accessible yet hidden in the first view.
In some embodiments, one or more icons may move or be displayed in a different fashion from other icons or display elements. In an embodiment, status icons may move into view at a different speed than other display elements in response to a tilt or other input. As an example, a status icon may slide into view more quickly than other elements. In another example, an icon may remain displayed on the display screen longer than other elements, even after a user returns the device to a neutral state, and then may move off screen. In yet another example, an icon may optionally not be subjected to a parallax/perspective shift and may instead be displayed with an X-axis movement only, with no change in Z-depth, subject to a different set of physical rules, etc.
In some embodiments, an icon or display element may be brought into view by one motion and then adopt a different set of physical rules governing its motion. As an example, in response to a shake input, a display element may either respond to a tilt in the same fashion as the other display elements, or it may no longer respond to tilt or other inputs for a set period of time, until it is removed from the display screen, until a different input is received, etc.
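The two preceding examples might be modeled as per-element motion rules, as in the illustrative sketch below; the slide-speed multiplier, linger time, and freeze duration are assumed values, not taken from this disclosure:

```python
import time


class DisplayElement:
    """Per-element motion rules: slide-in speed, linger time after returning to
    a neutral state, and a freeze window (e.g. after a shake input) during
    which the element ignores tilt."""

    def __init__(self, name, slide_speed=1.0, linger_seconds=0.0):
        self.name = name
        self.slide_speed = slide_speed        # multiplier on tilt-driven motion
        self.linger_seconds = linger_seconds  # time on screen after neutral
        self._frozen_until = 0.0
        self._neutral_since = None

    def on_shake(self, freeze_seconds=1.5):
        # Brought into view by a shake, then temporarily unresponsive to tilt.
        self._frozen_until = time.monotonic() + freeze_seconds

    def on_return_to_neutral(self):
        self._neutral_since = time.monotonic()

    def offset(self, tilt_y, pan_per_radian=120.0):
        if time.monotonic() < self._frozen_until:
            return 0.0  # ignores tilt while frozen
        return self.slide_speed * pan_per_radian * tilt_y

    def still_on_screen(self):
        if self._neutral_since is None:
            return True
        return (time.monotonic() - self._neutral_since) < self.linger_seconds


# A battery icon that slides in twice as fast as other elements and lingers
# for two seconds after the device returns to a neutral state.
battery = DisplayElement("battery icon", slide_speed=2.0, linger_seconds=2.0)
print(battery.offset(tilt_y=0.2))
```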
Some embodiments may treat different layers or portions of display elements in different fashions. For example, one embodiment may include a flat foreground layer including a layer or layers designated to be excluded from a perspective shift when the device is tilted. In this example, a foreground layer may not shift while the other layers below it shift in response to a tilt or other input. In this way, a user interface may be tailored to have a natural feel, to specifically highlight certain icons, to allow programmatically different effects for different icons or design elements, etc.
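One possible realization of such layer treatment is sketched below, where each layer carries a depth factor and an exclusion flag; the layer names and factors are illustrative assumptions:

```python
def layer_offsets(layers, tilt_y, pan_per_radian=120.0):
    """Compute a horizontal shift per layer, leaving layers flagged as a flat
    foreground excluded from the perspective shift."""
    offsets = {}
    for name, depth, excluded in layers:
        offsets[name] = 0.0 if excluded else depth * pan_per_radian * tilt_y
    return offsets


layers = [
    ("foreground layer", 0.0, True),   # excluded from the perspective shift
    ("icon layer",       0.5, False),
    ("wallpaper",        1.0, False),
]
print(layer_offsets(layers, tilt_y=0.2))
```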
In some embodiments, the graphical user interface 185 depicts a 3-dimensional environment including a Z-axis orthogonal to the display 180, wherein the user interface module is configured to depict a parallax effect between a first element with a first Z component and a second element with a second Z component as the graphical user interface 185 changes between the first view and the tilted view. This enhances the perception of depth in the graphical user interface 185. A 3-dimensional environment may include a rotation about the X-axis 144, a rotation about the Y-axis, a rotation about the Z-axis 146, or a translation 147 through any of the three dimensions.
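The parallax effect between elements at different Z components can be illustrated with a simple perspective projection, as in the sketch below; the viewer distance, element coordinates, and tilt angle are assumed values:

```python
import math


def rotate_about_y(x, z, angle):
    """Rotate a point in the X-Z plane about the Y-axis (the tilt component)."""
    return (x * math.cos(angle) + z * math.sin(angle),
            -x * math.sin(angle) + z * math.cos(angle))


def project_x(x, z, viewer_distance=500.0):
    """Project the X coordinate onto the display plane (z = 0), assuming the
    viewer sits on the +Z axis at viewer_distance."""
    return x * viewer_distance / (viewer_distance - z)


def screen_x(x, z, tilt_y):
    xr, zr = rotate_about_y(x, z, tilt_y)
    return project_x(xr, zr)


# Two elements with the same X coordinate but different Z components shift by
# different amounts under the same tilt -- a parallax effect.
for label, z in (("first element (near)", -50.0), ("second element (far)", -300.0)):
    shift = screen_x(100.0, z, tilt_y=0.15) - screen_x(100.0, z, tilt_y=0.0)
    print(label, "shifts by", round(shift, 1))
```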
In an example including a 3-dimensional environment,
The tiltable graphical user interface depicted in
In some embodiments, the tilted view may have a rotation point with an adjustable Z component. For example, the tilt of the graphical user interface 185 may be about a pivot point, and the pivot point may be at the Z component of the viewer, of an element in the graphical user interface, of the display, etc. An adjustable rotation point allows the look and feel of the graphical user interface to be adjusted. For example, by providing a tilt with a rotation point whose Z component matches that of the display 180, the user perspective may orbit about that rotation point. By adjusting the rotation point to have a Z component similar to the user's perspective, the graphical user interface 185 will pivot with respect to the user.
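The effect of an adjustable rotation-point Z component may be sketched as a rotation about a pivot on the Z-axis; the pivot positions and coordinates below are illustrative assumptions:

```python
import math


def rotate_about_pivot(point, pivot_z, angle):
    """Rotate an (x, z) point about a Y-axis pivot located at (0, pivot_z).

    Placing pivot_z at the display plane (z = 0) makes the scene orbit the
    display; placing it near the viewer's Z makes the interface pivot with
    respect to the viewer.
    """
    x, z = point
    dz = z - pivot_z
    xr = x * math.cos(angle) + dz * math.sin(angle)
    zr = -x * math.sin(angle) + dz * math.cos(angle)
    return xr, zr + pivot_z


element = (100.0, -50.0)
print(rotate_about_pivot(element, pivot_z=0.0, angle=0.15))    # pivot at the display
print(rotate_about_pivot(element, pivot_z=500.0, angle=0.15))  # pivot near the viewer
```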
User interface module 160 may also use the effects 150 to provide a depth of field 154, such as a focus depth, wherein the user interface module may adjust the focus depth in the graphical user interface 185 in response to a tilt. For example, first element 220 may be out of focus in
In some embodiments, the user interface module 160 may be further configured to adjust the focus depth in the graphical user interface 185 in response to a selection of an element in the graphical user interface 185. For example, in
In some embodiments, the user interface module 160 may provide other effects 156, off-screen effects 152, etc. In one example, the first view may be displayed if the tilt is below a threshold rotation. This prevents a slight rotation from being interpreted as an input, so the device continues to display the first view while the rotation remains below the threshold. In another example, the graphical user interface may revert to the first view after a period with no additional tilts of the device.
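A threshold rotation combined with a revert-after-timeout behavior might be sketched as follows (the threshold and timeout values are assumptions):

```python
import time


class TiltController:
    """Ignore tilts below a threshold rotation and revert to the first view
    after a period with no additional tilts."""

    def __init__(self, threshold_radians=0.05, revert_seconds=3.0):
        self.threshold = threshold_radians
        self.revert_seconds = revert_seconds
        self._last_tilt_time = None

    def view_for(self, tilt_radians):
        now = time.monotonic()
        if abs(tilt_radians) < self.threshold:
            # Below the threshold rotation: show the first view, unless a
            # recent larger tilt has not yet timed out.
            if (self._last_tilt_time is None
                    or now - self._last_tilt_time > self.revert_seconds):
                return "first view"
            return "tilted view"
        self._last_tilt_time = now
        return "tilted view"


controller = TiltController()
print(controller.view_for(0.01))  # slight rotation: first view is kept
print(controller.view_for(0.20))  # larger tilt: tilted view is shown
```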
Continuing with the Figures,
Method 400 then comprises receiving an input indicating a tilt of the device, the tilt including a component of rotation around at least one of the X-axis or the Y-axis of the display, as indicated in block 420. Such inputs may be, but are not limited to, rotation or translation inputs detected by an accelerometer, or inputs from other position detection hardware, such as a gyroscope, a position sensing system, a global positioning system (GPS) receiver, etc. In some embodiments, a tilt may be detected having a component of rotation around at least one of the X-axis or the Y-axis. In this way, if a user intends to rotate the device around the X-axis or the Y-axis of a display but rotates the device around an axis that is not the X-axis, the Y-axis, or the Z-axis, the tilt may still be interpreted according to the detected rotational component.
Next, method 400 comprises applying the tilt to the first view to generate a tilted view in response to the input as indicated at 430.
Method 400 then comprises displaying the tilted view in the graphical user interface, the tilted view including a portion of at least one graphical element that was not displayed in the first view, as indicated in block 440.
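The flow of blocks 420-440 can be summarized in a short sketch in which the input source, tilt application, and display are hypothetical stand-in callables:

```python
def method_400(first_view, read_tilt_input, apply_tilt, display):
    """Sketch of blocks 420-440: receive an input indicating a tilt, apply the
    tilt to the first view to generate a tilted view, and display it."""
    tilt = read_tilt_input()                    # block 420
    tilted_view = apply_tilt(first_view, tilt)  # block 430
    display(tilted_view)                        # block 440


# Usage with trivial stand-ins: the tilted view reveals an additional element.
method_400(
    first_view={"elements": ["first element"]},
    read_tilt_input=lambda: {"rot_x": 0.0, "rot_y": 0.2},
    apply_tilt=lambda view, tilt: {"tilt": tilt,
                                   "elements": view["elements"] + ["icon"]},
    display=print,
)
```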
In some embodiments, the tilted view may further comprise an icon that is not displayed in the first view. For example, a status icon such as a battery icon, a wireless connection icon, etc. may be viewable by tilting a device but not viewable in a first view. This allows icons that are infrequently utilized, or that have a changing status, to remain accessible yet hidden in the first view.
In some embodiments, the graphical user interface may depict a 3-dimensional environment including a Z-axis orthogonal to the display, wherein the method 400 further comprises depicting parallax between a first element with a first Z component and a second element with a second Z component as the graphical user interface changes between the first view and the tilted view.
Additionally, the tilted view may have a rotation point with an adjustable Z component. For example, the tilt of the graphical user interface may be about a pivot point, and the pivot point may be at the Z component of the viewer, of an element in the graphical user interface, of a display, etc. An adjustable rotation point allows the look and feel of the graphical user interface to be adjusted. For example, by providing a tilt with a rotation point whose Z component matches that of a display, a user perspective may orbit about that rotation point. By adjusting the rotation point to have a Z component similar to the user's perspective, a graphical user interface will pivot with respect to the user's perspective.
In some embodiments, the 3-dimensional environment may include a focus depth, wherein the method 400 further comprises adjusting the focus depth in the graphical user interface in response to the tilt. For example, method 400 may adjust a focus depth in the graphical user interface in response to a selection of an element in the graphical user interface, in response to a tilt of the device, etc.
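One illustrative way to model an adjustable focus depth is to blur each element in proportion to its Z distance from the focus depth, as in the sketch below; the constants and element depths are assumptions:

```python
def blur_amounts(elements, focus_depth, blur_per_unit=0.01):
    """Blur each element in proportion to its Z distance from the focus depth.

    The focus depth may be moved in response to a tilt or to selecting an
    element; this mapping and its constants are illustrative.
    """
    return {name: abs(z - focus_depth) * blur_per_unit for name, z in elements}


elements = [("first element", -50.0), ("second element", -300.0)]
print(blur_amounts(elements, focus_depth=-50.0))   # first element in focus
print(blur_amounts(elements, focus_depth=-300.0))  # focus pulled to the second
```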
Some embodiments may provide other effects. For example, method 400 may display the first view if a tilt is below a threshold rotation. This prevents a slight rotation from being interpreted as an input, so the device continues to display the first view while the rotation remains below the threshold. In another example, method 400 may further comprise displaying the first view after a period with no additional tilts of the device. In another example, a first tilt may be applied to the first view if the display is in a portrait orientation and a second tilt may be applied to the first view if the display is in a landscape orientation.
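An orientation-dependent tilt mapping might look like the following sketch; the landscape remapping convention shown is an assumption, not taken from this disclosure:

```python
def tilt_for_orientation(rot_x, rot_y, orientation):
    """Apply a first tilt mapping in portrait and a second in landscape.

    In landscape the device's physical X and Y axes are swapped relative to
    the displayed view, so the rotational components are remapped here under
    an assumed convention.
    """
    if orientation == "portrait":
        return rot_x, rot_y
    if orientation == "landscape":
        return rot_y, -rot_x
    raise ValueError(f"unknown orientation: {orientation!r}")


print(tilt_for_orientation(0.10, 0.05, "portrait"))
print(tilt_for_orientation(0.10, 0.05, "landscape"))
```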
It will be appreciated that the embodiments described herein may be implemented, for example, via computer-executable instructions or code, such as programs, stored on a computer-readable medium and executed by a computing device. Generally, programs include routines, objects, components, data structures, and the like that perform particular tasks or implement particular abstract data types. As used herein, the term “program” may connote a single program or multiple programs acting in concert, and may be used to denote applications, services, or any other type or class of program. Likewise, the terms “computer” and “computing device” as used herein include any device that electronically executes one or more programs, including, but not limited to, media players and any other suitable devices such as personal computers, laptop computers, hand-held devices, cellular phones, microprocessor-based programmable consumer electronics and/or other suitable computing devices that may utilize a tiltable graphical user interface.
It will further be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated may be performed in the sequence illustrated, in other sequences, in parallel, or in some cases omitted. Likewise, the order of any of the above-described processes is not necessarily required to achieve the features and/or results of the embodiments described herein, but is provided for ease of illustration and description.
The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.
This application is a continuation of U.S. patent application Ser. No. 14/148,622, filed Jan. 6, 2014, which is a continuation of U.S. patent application Ser. No. 12/276,153, filed on Nov. 21, 2008, now U.S. Pat. No. 8,645,871, and titled “TILTABLE USER INTERFACE,” the entire disclosure of each of which is incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
5426732 | Boies et al. | Jun 1995 | A |
5745197 | Leung et al. | Apr 1998 | A |
6229542 | Miller | May 2001 | B1 |
6466198 | Feinstein | Oct 2002 | B1 |
6690358 | Kaplan | Feb 2004 | B2 |
6798429 | Bradski | Sep 2004 | B2 |
7038662 | Noguera | May 2006 | B2 |
7248269 | Card et al. | Jul 2007 | B2 |
7289102 | Hinckley et al. | Oct 2007 | B2 |
7301528 | Marvit et al. | Nov 2007 | B2 |
7564469 | Cohen | Jul 2009 | B2 |
7631277 | Nie et al. | Dec 2009 | B1 |
20040145613 | Stavely et al. | Jul 2004 | A1 |
20050210417 | Marvit | Sep 2005 | A1 |
20060010699 | Tamura | Jan 2006 | A1 |
20060094480 | Tanaka | May 2006 | A1 |
20060178212 | Penzias | Aug 2006 | A1 |
20060187204 | Yi et al. | Aug 2006 | A1 |
20070107015 | Kazama et al. | May 2007 | A1 |
20070113207 | Gritton | May 2007 | A1 |
20070192722 | Kokubo | Aug 2007 | A1 |
20080042973 | Zhao et al. | Feb 2008 | A1 |
20080062001 | Hsu et al. | Mar 2008 | A1 |
20090002391 | Williamson et al. | Jan 2009 | A1 |
20090201270 | Pikkujamsa et al. | Aug 2009 | A1 |
20090265627 | Kim | Oct 2009 | A1 |
20090307634 | Strandell | Dec 2009 | A1 |
20090322676 | Kerr et al. | Dec 2009 | A1 |
20100042954 | Rosenblatt | Feb 2010 | A1 |
20100171691 | Cook | Jul 2010 | A1 |
Number | Date | Country |
---|---|---|
1525286 | Sep 2004 | CN |
1825265 | Aug 2006 | CN |
1667471 | Jun 2006 | EP |
1752737 | Feb 2007 | EP |
1903425 | Mar 2008 | EP |
10177449 | Jun 1998 | JP |
11065806 | Mar 1999 | JP |
2000311174 | Nov 2000 | JP |
2002502999 | Jan 2002 | JP |
2002149616 | May 2002 | JP |
2003223095 | Aug 2003 | JP |
2005003463 | Jan 2005 | JP |
2007515859 | Jun 2007 | JP |
2008210019 | Sep 2008 | JP |
200414002 | Aug 2004 | TW |
127735 | Apr 2001 | WO |
Entry |
---|
“Communication Pursuant to Rules 70(2) and 70a(2) EPC Issued in Patent Application No. 09827953.2”, dated Aug. 30, 2016, 1 Page. |
“Final Office Action Issued in U.S. Appl. No. 14/148,622”, dated Feb. 28, 2017, 5 Pages. |
“Non-Final Office Action Issued in U.S. Appl. No. 14/148,622”, dated Oct. 26, 2016, 9 Pages. |
“Notice of Allowance Issued in U.S. Appl. No. 14/148,622”, dated May 19, 2017, 8 Pages. |
“Office Action Issued in Chinese Patent Application No. 200980147062.2”, dated Apr. 5, 2016, 9 Pages. |
“Office Action Issued in Chinese Patent Application No. 200980147062.2”, dated Sep. 30, 2016, 15 Pages. |
“Second Office Action Issued in Chinese Patent Application No. 200980147062.2”, dated Aug. 8, 2013, 11 Pages. |
“Tilt-and-Scroll, Technology for Smartphones and Other Handheld Devices”, Retrieved From <<http://www.rotoview.com/>>, 4 Pages. |
“Search Report Issued in European Patent Application No. 09827953.2”, dated Aug. 12, 2016, 9 Pages. |
“Notice of Allowance Issued in Korean Patent Application No. 10-2011-7010928”, dated Feb. 22, 2016, 6 Pages. |
“Office Action Issued in Korean Patent Application No. 10-2011-7010928”, dated Oct. 21, 2015, 4 Pages. |
“Final Office Action Issued in U.S. Appl. No. 12/276,153”, dated Sep. 7, 2011, 9 Pages. |
“Non-Final Office Action Issued in U.S. Appl. No. 12/276,153”, dated Mar. 4, 2011, 9 Pages. |
“Notice of Allowance Issued in U.S. Appl. No. 12/276,153”, dated Oct. 2, 2013, 8 Pages. |
“Office Action Issued in Chinese Patent Application No. 200980147062.2”, dated Feb. 18, 2014, 9 Pages. |
“Notice of Allowance Issued in Japanese Patent Application No. 2011-537477”, dated May 2, 2014, 4 Pages. |
“Office Action Issued in Japanese Patent Application No. 2011-537477”, dated Dec. 26, 2013, 4 Pages. |
“Notice of Allowance Issued in Taiwan Patent Application No. 98139164”, dated May 21, 2015, 4 Pages. |
“Office Action Issued in Taiwan Patent Application No. 98139164”, dated Oct. 17, 2014, 8 Pages. |
Crossan, et al., “Variability in Wrist-Tilt Accelerometer Based Gesture Interfaces”, In Proceedings of the International Conference on Mobile Human-Computer Interaction, Sep. 13, 2004, 12 Pages. |
Hinckley, et al., “Sensing Techniques for Mobile Interaction”, In Proceedings of the 13th Annual ACM symposium on User Interface Software and Technology, Nov. 1, 2000, pp. 91-100. |
Kelsey, Michelle A., “Expand Your Gaming Space with Sensors”, Retrieved From <<http://www.sensorsmag.com/embedded/expand-your-gaming-space-sensors>>, May 24, 2007, 5 Pages. |
“International Search Report and Written Opinion Issued in PCT Patent Application No. PCT/US2009/061745”, dated Apr. 30, 2010, 11 Pages. |
Rekimoto, Jun, “Tilting Operations for Small Screen Interfaces”, In Proceedings of the 9th Annual ACM Symposium on User Interface Software and Technology, Nov. 6, 1996, pp. 167-168. |
“Office Action Issued in European Patent Application No. 09827953.2”, dated May 2, 2018, 8 Pages. |
Number | Date | Country | |
---|---|---|---|
20180004390 A1 | Jan 2018 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 14148622 | Jan 2014 | US |
Child | 15706419 | US | |
Parent | 12276153 | Nov 2008 | US |
Child | 14148622 | US |