The present disclosure generally relates to systems and methods for eye-controlled user interfaces, and more particularly, to multi-step activation processes of buttons based on user gaze fixation within a display such as an augmented reality (“AR”) or virtual reality (“VR”) display.
The growth of AR/VR technologies across a large and diverse set of markets is well understood by one of skill in the art. Markets such as gaming, media, search, and information management implement a variety of different AR/VR products to allow an individual to interact within a virtual environment. These AR/VR products provide an individual with a rich and dynamic platform in which the user can retrieve information, view media content, navigate virtual scenes, and interact with other types of content in a manner unique to the AR/VR environment. It is important that these AR/VR products maintain a user-friendly experience throughout their use and provide a user the ability to interact with virtual content in a simple and consistently accurate manner.
AR/VR technologies are oftentimes constrained by the way an individual can interact with the virtual environment. Some AR/VR products implement an eye-controlled user interface where a user interacts with virtual content using eye movement. This interface may rely exclusively on eye-controlled interactions or may use a combination of eye-controlled interactions with other types of user control such as hand-gestures, hand controllers, head movement or other types of movement that is translated into the virtual environment. Eye-controlled user interfaces, especially those that rely exclusively on eye movement, are limited to a small subset of ways in which a user can interact with virtual content and present unique challenges in providing a robust and reliable interface. One such issue with eye-controlled user interfaces is ensuring consistently accurate activation of buttons based on user eye movement within the display.
The figures and the following description relate to various embodiments by way of illustration. It is noted that from the following discussion, alternative embodiments of the structures and methods disclosed herein will be readily recognized as viable by one of skill in the art. It is further noted that any headings used herein are only for organizational purposes and shall not be used to limit the scope of the description or the claims. All documents cited are incorporated by reference herein in their entirety.
Reference in the specification to “one or more embodiments,” “preferred embodiment,” “an embodiment,” “embodiments,” or the like means that a particular feature, structure, characteristic, or function described in connection with the embodiment is included in at least one embodiment of the disclosure and may be in more than one embodiment. Also, the appearances of the above-noted phrases in various places in the specification are not necessarily all referring to the same embodiment or embodiments.
One skilled in the art shall recognize that: (1) certain steps may optionally be performed; (2) steps may not be limited to the specific order set forth herein; (3) certain steps may be performed in different orders; and (4) certain steps may be done concurrently.
For purposes of this disclosure, the term “button” is defined as a symbol within a display that can be activated by a user's gaze, examples of which include (but are not limited to) icons, shapes, text, glyphs, links, menus and menu components, pointers, etc. The term “confirmation element” is defined as a symbol within the display that appears in response to a user gaze fixating at a button. The confirmation element is associated with the button and provides confirmation of a user's intent to activate the button. The term “gaze” is defined as a user looking at a location within the display. The term “fixate” is defined as a user gaze remaining at a location for a period of time. The period of time during which a user fixates at a location may be measured in relation to a threshold. This threshold may be static or vary based on factors such as user history, user activity, machine learning processes, saccade prediction or other factors that adjust this threshold. The time may be measured as an incremental counter or a countdown timer.
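The dwell-time measurement described above can be sketched as follows. This is an illustrative sketch only, not an implementation from the disclosure; the class name, the fixed threshold, and the drift radius are all assumptions (the disclosure contemplates thresholds that vary with user history, saccade prediction, and other factors).

```python
import time

class FixationDetector:
    """Hypothetical sketch: measure how long a user gaze dwells at a
    location and compare that period against a threshold. A static
    threshold is assumed here for simplicity."""

    def __init__(self, threshold_s=0.5, radius_px=30):
        self.threshold_s = threshold_s   # dwell time required to count as a fixation
        self.radius_px = radius_px       # how far the gaze may drift and still "dwell"
        self._anchor = None              # (x, y) where the current dwell began
        self._start = None               # timestamp when the current dwell began

    def update(self, x, y, now=None):
        """Feed one gaze sample; return True once a fixation is detected."""
        now = time.monotonic() if now is None else now
        if self._anchor is None or not self._within(x, y):
            # Gaze moved outside the dwell radius: restart the dwell period.
            self._anchor, self._start = (x, y), now
            return False
        return (now - self._start) >= self.threshold_s

    def _within(self, x, y):
        ax, ay = self._anchor
        return (x - ax) ** 2 + (y - ay) ** 2 <= self.radius_px ** 2
```

An equivalent countdown-timer formulation would decrement a counter while the gaze stays within the radius and reset it on movement; the incremental form above is shown only because it is the simpler of the two variants named in the definition.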
Embodiments of the present invention implement an eye-controlled user interface in which buttons are activated using a multi-step activation process. Eye-controlled user interfaces track a user's eye movement and map the movement within a virtual environment. In one example, AR/VR goggles have an inward-facing camera that monitors user eye movement as he/she is viewing a display. The goggles correlate this eye movement to the virtual content being viewed to allow interaction between the user and the display. In another example, contact lenses are worn by a user and track the movement of the user's eyes. This eye movement is correlated to a display projected onto the user's eye from a projector(s) embedded in the contact lens, examples of which may be found in U.S. patent application Ser. No. 16/940,152, filed on Jul. 27, 2020, entitled “Eye-Tracking User Interface for Virtual Tool Control,” listing inventors Haine et al.; U.S. patent application Ser. No. 16/005,379, filed on Jun. 11, 2018, entitled “Contact lens gaze tracking architectures,” listing inventors Mirjalili et al.; and U.S. patent application Ser. No. 16/200,039, filed on Nov. 26, 2018, entitled “Eye-mounted Displays Including Embedded Solenoids,” listing inventors Mirjalili et al., which patent documents are incorporated by reference herein in their entirety and for all purposes.
Eye-controlled user interfaces (either exclusively or partially controlled by eye movement) present unique challenges in defining how buttons are activated within a display. One such challenge is defining the user's intent within the display when he/she looks at a button. The system should be able to differentiate between when a user looked at a button to activate it and when the user viewed the button without intending to activate it.
Accurate activation of buttons within the field of view 110 is challenging if the system relies on a user looking at a button to activate it. Correlating a user's intent to activate a button versus a user perusing content in the display is essential to a user interface that relies on eye movement as a primary means for user input. As a user views the display, his/her eyes may look at one or more of the buttons within the display for a short period of time without intending to activate a button. For example, a user's eyes may simply traverse 130 across the field of view 110 to view content displayed within the virtual environment. This eye movement is mapped into the display and may cross any number of buttons 117, 118. If these buttons are activated, the display may become unmanageable as buttons are unintentionally activated. Various embodiments of the invention address this problem by defining button activation processes that more accurately predict the intent of a user to activate a particular button.
The confirmation element 220 is generated within the display in a manner to visually show an association with the button 215 to the user. In a first example, the confirmation element 220 is a triangle placed at an edge of the button. In a second example, the confirmation element 220 is a spike coupled to the button. In a third example, the confirmation element 220 is a virtual slider of the button. The shape, location, shade and/or color of the confirmation element may vary based on design implementations. The three confirmation elements are described in more detail later in this description.
The user fixates a second gaze 250 at the confirmation element to activate the button. This second gaze 250 may be at a location within the confirmation element 220, at an edge or point of the confirmation element 220 or within a predefined distance external to the confirmation element 220. The length of time of the second gaze fixation 250 may be equal to or different from the first gaze 210. After performing the second gaze fixation 250 at the confirmation element 220, the corresponding button is activated. This multi-step activation process of a button reduces unintended activation of buttons based on user eye movement over prior art systems.
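The hit test for the second gaze, including the "predefined distance external to the confirmation element" case, can be sketched as a rectangle test with an expansion margin. The function name, rectangle convention, and margin value are illustrative assumptions, not part of the disclosure.

```python
def gaze_hits_element(gx, gy, rect, margin=0):
    """Hypothetical sketch: return True if a gaze point (gx, gy) falls
    within an element's bounding rectangle, optionally expanded by a
    predefined external margin.

    rect is (left, top, width, height) in display coordinates.
    """
    left, top, w, h = rect
    return (left - margin <= gx <= left + w + margin and
            top - margin <= gy <= top + h + margin)
```

With margin set to zero the test covers gazes within the element or at its edge; a positive margin admits gazes within the predefined distance outside it.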
A first characteristic is a confirmation element location 320 relative to the button 305. In one example, the confirmation element 350 is positioned at a border of the button 305. If the confirmation element 350 appears after a user gaze fixates at the button 305, the appearance of the confirmation element 350 within the visual area at which the user is gazing will indicate an association with the button 305. Accordingly, one skilled in the art will recognize that a confirmation element 350 may be positioned within a button, at an edge or border of a button, or within a distance proximate to the button such that a user will see the confirmation element when it is initiated.
A second characteristic is a user-guided confirmation element 330. In another example, the confirmation element 360 is visually coupled to the button 305 such that the user will recognize the association between button and confirmation element. This visual coupling may be implemented a number of different ways within a display such as a line 325 connecting the button 305 to the confirmation element 360. The coupled confirmation element 360 may be positioned at a variety of locations external to the button 305.
Other characteristics such as related visual shapes, border highlights, shading, etc. 340 may also be used to visually associate a confirmation element with a button. In yet another example, a slider confirmation element 370 is visually displayed proximate to the button 305. This slider 370 may be shaded or have a different color. It may also have a similar shape to the button or appear to extend the button once activated. Other confirmation element characteristics may also be used to visually associate a button with a confirmation element.
This reduced button footprint allows the display to show more buttons in an organized and uncluttered manner. It also allows buttons to be positioned closer to each other within the display. Exemplary implementations of this enhanced border button are provided below.
This concept of an enhanced button is shown across the other buttons in the menu. A second gaze fixation at a second button 1016 results in a border 1041 and confirmation element 1042 being displayed. The same process occurs at a third button 1017 and a fourth button 1018. Using these enhanced buttons allows the menu 1010 to have buttons located closer together and displayed in an organized manner. The buttons 1015-1019 may be activated by performing a subsequent gaze fixation at the confirmation element.
A user gaze fixates at the CANCEL button 1130 resulting in a border 1135 and confirmation element 1136 being displayed. In this instance, the enhanced button within the border extends beyond the virtual response block 1110 and is closer to/overlaps with the BEGIN button 1120. A multi-step activation of the CANCEL button 1130 occurs when the user fixates a subsequent gaze at the confirmation element 1136. A similar multi-step activation process occurs with the BEGIN button 1120.
For illustrative purposes, a version of the virtual response block 1110 is provided in which both a BEGIN button 1160 and a CANCEL button 1170 are shown with borders. If borders are shown for both buttons, the buttons and their borders overlap with each other and with the virtual response block 1110. One skilled in the art will recognize the more efficient use of space within a display that results when enhanced buttons are used. A variety of different implementations of enhanced buttons may be used in which only a first portion of the button is visible and a second portion of the button (e.g., a border) appears when a user gaze fixation occurs.
As previously described, the confirmation element may be provided in a variety of locations and shapes. A second example of a confirmation element is shown where a glyph, such as an X, 1240 is provided at a corner of a dismissal block 1230. Embodiments of the invention may implement different enhancements to the confirmation element wherein it appears/disappears, changes size, changes shape or color, etc. to confirm a user's intent to activate or dismiss a button.
The confirmation element may be further enhanced by having a glyph change its size, shape, color or shade as part of the activation process. As shown, a block 1350 may have a dismissal glyph 1355 at a corner. As a user gaze fixates at the dismissal glyph 1355, it may display a border 1358 and confirmation element 1360, and increase its size 1370 to further enhance the next step in the dismissal process.
The button transitions to a third state 1430 after a user fixates a gaze at the confirmation element for a period greater than the time threshold T2. In this third state 1430, the button is enabled and confirmed resulting in an activation of the button, activation of content associated with the button and/or other response/command 1440 executed by activating the button. The button may also visually change to acknowledge confirmation.
If the button in the second state 1420 fails to transition to the third state 1430 within a predefined period of time, the button transitions to the initial state 1410.
It will be appreciated to those skilled in the art that the preceding examples and embodiments are exemplary and not limiting to the scope of the present disclosure. It is intended that all permutations, enhancements, equivalents, combinations, and improvements thereto that are apparent to those skilled in the art upon a reading of the specification and a study of the drawings are included within the true spirit and scope of the present disclosure. It shall also be noted that elements of any claims may be arranged differently including having multiple dependencies, configurations, and combinations.
Number | Name | Date | Kind |
---|---|---|---|
5844544 | Kahn | Dec 1998 | A |
8430310 | Ho | Apr 2013 | B1 |
8520309 | Sprague | Aug 2013 | B2 |
8764185 | Biederman | Jul 2014 | B1 |
8786675 | Deering | Jul 2014 | B2 |
8798332 | Otis | Aug 2014 | B2 |
8827445 | Wiser | Sep 2014 | B1 |
8870370 | Otis | Oct 2014 | B1 |
8874182 | Etzkorn | Oct 2014 | B2 |
8890946 | Publicover | Nov 2014 | B2 |
8911087 | Publicover | Dec 2014 | B2 |
8960898 | Etzkorn | Feb 2015 | B1 |
8964298 | Haddick | Feb 2015 | B2 |
8971978 | Ho | Mar 2015 | B2 |
8989834 | Ho | Mar 2015 | B2 |
9028068 | Chang | May 2015 | B2 |
9047512 | Otis | Jun 2015 | B2 |
9052533 | Pugh | Jun 2015 | B2 |
9153074 | Zhou | Oct 2015 | B2 |
9170646 | Toner | Oct 2015 | B2 |
9196094 | Ur | Nov 2015 | B2 |
9215293 | Miller | Dec 2015 | B2 |
9298002 | Border | Mar 2016 | B2 |
9298020 | Etzkorn | Mar 2016 | B1 |
9341843 | Border | May 2016 | B2 |
9390326 | Publicover | Jul 2016 | B2 |
9405365 | Publicover | Aug 2016 | B2 |
9600069 | Publicover | Mar 2017 | B2 |
9870060 | Marggraff | Jan 2018 | B2 |
10025379 | Drake | Jul 2018 | B2 |
10178367 | Zhou | Jan 2019 | B2 |
10345621 | Franklin | Jul 2019 | B2 |
10353463 | Shtukater | Jul 2019 | B2 |
10901505 | Haine | Jan 2021 | B1 |
11416072 | Singh | Aug 2022 | B1 |
20040155907 | Yamaguchi | Aug 2004 | A1 |
20090066722 | Kriger | Mar 2009 | A1 |
20100231504 | Bloem | Sep 2010 | A1 |
20110221659 | King, III | Sep 2011 | A1 |
20130145304 | DeLuca | Jun 2013 | A1 |
20140063054 | Osterhout | Mar 2014 | A1 |
20140098226 | Pletcher | Apr 2014 | A1 |
20140168056 | Swaminathan | Jun 2014 | A1 |
20140198128 | Hong | Jul 2014 | A1 |
20140347265 | Aimone | Nov 2014 | A1 |
20140354539 | Skogo | Dec 2014 | A1 |
20150143234 | Norris, III | May 2015 | A1 |
20150192992 | Di Censo | Jul 2015 | A1 |
20150205106 | Norden | Jul 2015 | A1 |
20150212576 | Ambrus | Jul 2015 | A1 |
20150235439 | Schowengerdt | Aug 2015 | A1 |
20150235440 | Schowengerdt | Aug 2015 | A1 |
20150235444 | Schowengerdt | Aug 2015 | A1 |
20150235446 | Schowengerdt | Aug 2015 | A1 |
20150235457 | Schowengerdt | Aug 2015 | A1 |
20150235468 | Schowengerdt | Aug 2015 | A1 |
20150235471 | Schowengerdt | Aug 2015 | A1 |
20150241698 | Schowengerdt | Aug 2015 | A1 |
20150243090 | Schowengerdt | Aug 2015 | A1 |
20150338915 | Publicover | Nov 2015 | A1 |
20150339857 | O'Connor | Nov 2015 | A1 |
20150348550 | Zhang | Dec 2015 | A1 |
20150362749 | Biederman | Dec 2015 | A1 |
20150362753 | Pletcher | Dec 2015 | A1 |
20160011419 | Gao | Jan 2016 | A1 |
20160018650 | Haddick | Jan 2016 | A1 |
20160018651 | Haddick | Jan 2016 | A1 |
20160018652 | Haddick | Jan 2016 | A1 |
20160018653 | Haddick | Jan 2016 | A1 |
20160025981 | Burns | Jan 2016 | A1 |
20160091737 | Kim | Mar 2016 | A1 |
20160133201 | Border | May 2016 | A1 |
20160195924 | Weber | Jul 2016 | A1 |
20160253831 | Schwarz | Sep 2016 | A1 |
20160274660 | Publicover | Sep 2016 | A1 |
20160283595 | Folkens | Sep 2016 | A1 |
20170019661 | Deering | Jan 2017 | A1 |
20170023793 | Shtukater | Jan 2017 | A1 |
20170115742 | | Apr 2017 | A1 |
20170123492 | Marggraff | May 2017 | A1 |
20170131764 | Bognar | May 2017 | A1 |
20170177078 | Henderek | Jun 2017 | A1 |
20170270636 | Shtukater | Sep 2017 | A1 |
20170285742 | Marggraff | Oct 2017 | A1 |
20170371184 | Shtukater | Dec 2017 | A1 |
20180120568 | Miller | May 2018 | A1 |
20180149884 | Miller | May 2018 | A1 |
20180180980 | Ouderkirk | Jun 2018 | A1 |
20180275753 | Publicover | Sep 2018 | A1 |
20180335835 | Lemoff | Nov 2018 | A1 |
20180348969 | Kawamura | Dec 2018 | A1 |
20190056785 | Suk | Feb 2019 | A1 |
20190235624 | Goldberg | Aug 2019 | A1 |
20190250408 | Lafon | Aug 2019 | A1 |
20190250432 | Kim | Aug 2019 | A1 |
20190377428 | Mirjalili | Dec 2019 | A1 |
20190390976 | Anderson | Dec 2019 | A1 |
20210208674 | Haine | Jul 2021 | A1 |
20220121344 | Pastrana Vicente | Apr 2022 | A1 |
Number | Date | Country |
---|---|---|
107092346 | Aug 2017 | CN |
2016195201 | Dec 2016 | WO |
2018109570 | Jun 2018 | WO |
Entry |
---|
Christiansen et al., editors. Motion Sensors Explainer. W3C Working Group Note, Aug. 30, 2017. retrieved from [https://www.w3.org/TR/motion-sensors/] on [Oct. 21, 2021]. (Year: 2017). |
CN107092346A English translation (Year: 2017). |
International Search Report and Written Opinion in PCT/US2020/056376, dated Jan. 12, 2021, 10 pages. |
WO2016195201A1 English translation (Year: 2016). |
International Search Report and Written Opinion in PCT/US2020/048091, dated Dec. 16, 2022, 10 pages. |