Gaze detection relates to the monitoring or tracking of eye movements to detect a person's gaze point. Various types of gaze detection systems and methods are known. For example, products sold by Tobii Technology AB operate by directing near infrared illumination towards a user's eye and detecting reflection of the infrared illumination from the user's eye using an image sensor. Such a gaze detection system is described in U.S. Pat. No. 7,572,008. Other alternative gaze detection systems are also known, such as those disclosed in U.S. Pat. Nos. 6,873,314 and 5,471,542.
A gaze detection system can be employed as a user input mechanism for a computing device, using gaze detection to generate control commands. Eye control can be applied as a sole interaction technique or combined with other control commands input via keyboard, mouse, physical buttons and/or voice. It is now feasible to add gaze detection technology to many mobile computing devices, smart phones and tablet computers, and personal computers. Most standard-type web cameras and cameras integrated into mobile computing devices have a resolution of a few million pixels, which provides sufficient optical quality for eye-tracking purposes. Most mobile computing devices and personal computers also have sufficient processing power and memory resources for executing gaze detection software.
The following systems and methods provide solutions for automatic scrolling of displayed content in response to gaze detection. Content is displayed in a window rendered on a display screen. Gaze detection components may be used to detect that a user is gazing at the displayed content and to determine a gaze point relative to the display screen. In some cases, images of at least one facial feature of the user may be captured, such as at least one of a nose, a mouth, a distance between two eyes, a head pose and a chin, and at least one facial feature may be used in determining the gaze point.
At least one applicable scroll zone relative to the display screen and a scroll action associated with each applicable scroll zone may be determined. An interface may be provided to allow the user to define at least one scroll point and the scroll point may be used to define the applicable scroll zone(s). In response to determining that the gaze point is within a first applicable scroll zone, an associated first scroll action is initiated. In some instances, statistical analysis may be applied to gaze data patterns to determine that the gaze point is within the first applicable scroll zone. In some examples, a first scroll point may be displayed when the gaze point is determined to be within the first scroll zone and may be hidden when the gaze point is determined to be outside of the first scroll zone or when the user is determined to be reading the content.
The first applicable scroll zone may be defined as an area at the bottom of the display screen and the associated first scroll action may be defined as a scroll down action. In response to determining that the gaze point is within a second applicable scroll zone, which may be defined as an area at the top of the display screen, an associated second scroll action, such as a scroll up action, may be initiated. In other cases, a scroll zone may be defined as an area away from the display screen.
The first scroll action may cause the content to scroll within the window until at least one of: expiration of a defined period, determining that a portion of the content scrolls past a defined position within the window, determining that the gaze point is outside of the first scroll zone, and detecting an indicator that the user begins reading the content. In some examples, in response to termination of the first scroll action, a subsequent scroll action may not be initiated until after expiration of a defined time period. In other examples, in response to termination of the first scroll action the size of a scroll zone may be temporarily decreased or moved.
Additional features, advantages, and embodiments may be set forth in or apparent from consideration of the following detailed description, drawings, and claims. Moreover, it is to be understood that both the foregoing summary and the following detailed description are provided by way of example only and intended to provide further explanation without limiting the scope of the claimed subject matter.
Many aspects of the present disclosure can be better understood with reference to the following diagrams. The drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating certain features of the disclosure. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
It is to be understood that the subject matter disclosed and claimed herein is not limited to the particular methodology, protocols, etc. described herein, as the skilled artisan will recognize that these may vary in different embodiments. The embodiments disclosed herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments and examples that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and computing techniques may be omitted so as to not unnecessarily obscure the described embodiments. The examples used herein are intended merely to facilitate an understanding of ways in which the subject matter disclosed and claimed herein may be practiced and to further enable those of skill in the art to practice various embodiments.
Disclosed are various embodiments of systems and associated devices and methods for implementing automatic scrolling of content rendered on a display screen based on gaze detection. Gaze detection is also sometimes referred to as eye-tracking. As will be appreciated, gaze detection systems include hardware and software components for detecting eye movements, generating data representing such eye movements, and processing such data to determine a gaze point relative to a display screen or other object. By way of example, a gaze point can be expressed in terms of coordinates in a coordinate system.
Certain embodiments of the present invention are described herein with respect to camera-based gaze detection systems, but it should be understood that the invention is also applicable to any available or later-developed gaze detection systems. For example, embodiments of the invention may rely on gaze detection systems that employ infrared-sensitive image sensors and collimated infrared sources to determine gaze points. Other embodiments may rely additionally or alternatively on face or body position tracking devices or other systems that enable at least directional input into a computing device that can be used to control the device. Embodiments of the present invention have particular application in mobile computing devices, such as mobile phones, smart phones, tablet computers, e-readers, personal digital assistants, personal gaming devices, media players and other handheld or laptop computer devices. In other embodiments, the invention may be used with other computing devices, including desktop computers, mainframe computers, personal computers, set top boxes, game consoles, and the like.
In the embodiment shown, the camera 112 is integrated with the computing device 101. In other embodiments, the camera 112 may be a peripheral or add-on device that is attached to or used in proximity to the computing device 101. A camera 112 may be configured for capturing still images and/or video. Images or video captured by the camera 112 may be used for gaze detection, as will be described. In some embodiments, other gaze detection components may be connected to and/or integrated with the computing device 101 via appropriate system interface components 106.
A number of program modules may be stored in the system memory 104 and/or any other computer-readable media associated with the computing device 101. The program modules may include, among others, an operating system 117, various application program modules 119 and a gaze detection program module 123. In general, and for purposes of the present discussion, an application program module 119 includes computer-executable code for rendering images, text and other content 125 within a window 124 or other portion of the display screen 110 and for receiving and responding to user input commands (e.g., supplied via a gaze detection system, touch screen, camera, keyboard, control button 114, microphone 113, etc.) to manipulate such displayed content 125. Non-limiting examples of application program modules include browser applications, email applications, messaging applications, calendar applications, e-reader applications, word processing applications, presentation applications, etc.
A gaze detection program module 123 may include computer-executable code for detecting gaze points, saccades and/or other indicators of the user reading rather than gazing (e.g. eye fixation or dwelling on or around a constant point on the display) and other eye tracking data and for calculating positions of gaze points relative to the display screen 110. A gaze detection program module 123 may further include computer-executable code for controlling and receiving signals from a camera 112 or the components of other gaze detection systems. In other words, the gaze detection program module 123 may control the activation/deactivation and any configurable parameters of the camera 112 and may receive signals from the camera 112 representing images or video captured or detected by the camera 112. The gaze detection program module 123 may process such signals so as to determine reflection of light on the cornea or other portion of an eye, pupil location and orientation, pupil size or other metric for determining a location on a screen that is being viewed by an eye and use such information to determine the coordinates of a gaze point 130.
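By way of non-limiting illustration only, the following minimal Python sketch shows one conventional way (a pupil-center/corneal-reflection offset mapped through a simple calibrated linear model) that such detected eye features could be converted to gaze point coordinates. The field names and calibration constants are illustrative assumptions and are not taken from the specification.

```python
# Minimal, hypothetical sketch (not the patented method) of mapping a
# pupil-center/corneal-glint offset to screen coordinates with a calibrated
# linear model. Real gaze pipelines are considerably more involved.
from dataclasses import dataclass

@dataclass
class EyeFeatures:
    pupil_x: float   # pupil center in image pixels
    pupil_y: float
    glint_x: float   # corneal reflection ("glint") center in image pixels
    glint_y: float

def gaze_point(features: EyeFeatures, calibration: dict) -> tuple[float, float]:
    """Map the pupil-glint offset vector to display-screen coordinates.

    `calibration` holds per-user gain/offset terms, e.g. produced by a
    calibration routine; the keys are placeholder names.
    """
    dx = features.pupil_x - features.glint_x
    dy = features.pupil_y - features.glint_y
    screen_x = calibration["ax"] * dx + calibration["bx"]
    screen_y = calibration["ay"] * dy + calibration["by"]
    return screen_x, screen_y

# Example: offsets scaled and shifted by illustrative calibration constants
cal = {"ax": 40.0, "bx": 960.0, "ay": 45.0, "by": 540.0}
print(gaze_point(EyeFeatures(312.0, 205.0, 318.0, 210.0), cal))  # (720.0, 315.0)
```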
In some cases, camera-based gaze detection systems may rely on facial recognition processing to detect facial features such as nose, mouth, distance between the two eyes, head pose, chin, etc. Combinations of these facial features may be used to determine the gaze point 130. For instance, in embodiments where vertical scrolling (a scroll up action and/or a scroll down action) is to be done based on face images from the camera 112, the detection of the gaze point 130 may rely solely on the detected eyelid position(s). In other words, when the user gazes at the lower portion of the display screen 110, the eye will be detected as being more closed, whereas when the user gazes at the top of the display screen 110, the eye will be detected as being more open.
Eyelid position detection is good for determining changes in gaze points in a vertical direction, but not as effective for determining changes in gaze points in a horizontal direction. For better determining changes in gaze points in a horizontal direction, images of the head pose may be used instead. In such cases, gaze points may be determined to be within scroll zones only when the user's face is determined to be oriented in the general direction of the display screen 110. As a general rule, whenever a user looks at an object more than 7 degrees off from his direct forward line of sight, he will immediately turn his head in the direction of that object. Thus, a head pose indicating more than 7 degrees off to a side from the display screen 110 is an indication that the user is unlikely to be looking at the displayed content 125.
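The 7-degree head-pose observation can be expressed as a simple gate on gaze samples, as in the illustrative sketch below; the threshold, sign convention and function names are assumptions rather than material from the specification.

```python
# Illustrative head-pose gate: treat a gaze sample as usable for horizontal
# scroll zones only while the detected head yaw stays within roughly 7 degrees
# of facing the display.
HEAD_YAW_LIMIT_DEG = 7.0

def face_oriented_toward_screen(head_yaw_deg: float,
                                limit_deg: float = HEAD_YAW_LIMIT_DEG) -> bool:
    """Return True if the head pose suggests the user may be looking at the display."""
    return abs(head_yaw_deg) <= limit_deg

print(face_oriented_toward_screen(3.5))   # True  -> gaze samples considered
print(face_oriented_toward_screen(12.0))  # False -> user likely looking away
```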
As used herein, the term “gaze point” is intended to represent an area or region relative to the display screen 110 to which the user's gaze is directed. Depending on the sensitivity and accuracy of the gaze detection components, which may be dictated by camera resolution, processing power, available memory, and the like, a gaze point 130 may occupy a smaller (more sensitive/accurate) or larger (less sensitive/accurate) area relative to the display screen 110. Calibration of the gaze detection components may also play a role in the accuracy and sensitivity of gaze point calculations. Accuracy or sensitivity may dictate the relationship between an actual gaze point and a projected gaze point. The actual gaze point is the point relative to a display at which the user is actually looking, and the projected gaze point is the point relative to a display that the gaze detection program module 123 determines as the gaze point. One advantage of the present invention is that it functions even if the relationship between the actual and projected gaze points is not direct.
In some embodiments, the actual gaze may be calibrated with the projected gaze point by using touch data, input via a touch screen, to assist with calibration. For example, the gaze detection program module 123 or another process executed on the computing device 101 may be configured for prompting the user to look at and touch the same point(s) on the display screen. The detected gaze point will represent the projected gaze point and the detected touch point will represent the actual gaze point. Alternatively, such a calibration process may be performed in the background without prompting the user or interrupting the user's normal interaction with the computing device 101. For example, as the user normally operates the computing device 101 he/she will be pressing buttons, hyperlinks, and other portions of the content 125, display screen 110 and/or computing device 101 having known positions. The user will normally also be looking at the buttons, hyperlinks, etc. at the same time. Thus, gaze detection program module 123 or other process may recognize the touch point as the actual gaze point and then correct any discrepancies between the actual gaze point and the projected gaze point. Such a background calibration process can be helpful in order to slowly improve calibration as the user interacts with the computing device over time.
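A hedged sketch of this background-calibration idea follows: each (projected gaze point, touch point) pair nudges a running offset correction. The class structure and the incremental averaging are illustrative assumptions, not the specification's method.

```python
# Sketch: treat each touch on a known control as the actual gaze point, pair
# it with the projected gaze point, and maintain a running offset correction.
class BackgroundCalibrator:
    def __init__(self):
        self.offset_x = 0.0
        self.offset_y = 0.0
        self.samples = 0

    def observe(self, projected: tuple[float, float], touched: tuple[float, float]) -> None:
        """Fold one (projected gaze, touch point) pair into the running offset."""
        self.samples += 1
        # incremental mean of the discrepancy between actual and projected points
        self.offset_x += ((touched[0] - projected[0]) - self.offset_x) / self.samples
        self.offset_y += ((touched[1] - projected[1]) - self.offset_y) / self.samples

    def correct(self, projected: tuple[float, float]) -> tuple[float, float]:
        """Apply the learned offset to a newly projected gaze point."""
        return projected[0] + self.offset_x, projected[1] + self.offset_y

cal = BackgroundCalibrator()
cal.observe(projected=(480.0, 300.0), touched=(500.0, 290.0))
print(cal.correct((640.0, 360.0)))  # (660.0, 350.0)
```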
In some embodiments, one or more light sources may be added around, or in proximity to the display screen 110 to provide more illumination to an eye, so as to enhance the sensitivity and accuracy of the gaze detection program module 123. An example of using light sources to improve the sensitivity of an eye tracking system is shown in U.S. Pat. No. 8,339,446. Further, in some embodiments, illumination found in the user's own environment, so-called ambient illumination, may be used to enhance the sensitivity and accuracy of the gaze detection program module 123. Additionally the light source(s) will cause reflections in the eyes of the user that may be used as one of the features when determining the gaze point 130.
In some embodiments the computing device 101 may include a digital signal processing (DSP) unit 105 for performing some or all of the functionality ascribed to the gaze detection program module 123. As is known in the art, a DSP unit 105 may be configured to perform many types of calculations including filtering, data sampling, triangulation and other calculations with respect to data signals received from an input device such as a camera 112 or other sensor. The DSP unit 105 may include a series of scanning imagers, digital filters, and comparators implemented in software. The DSP unit 105 may therefore be programmed for calculating gaze points 130 relative to the display screen 110, as described herein. A DSP unit 105 may be implemented in hardware and/or software. Those skilled in the art will recognize that one or more graphics processing units (GPUs) may be used in addition to or as an alternative to a DSP unit 105.
In some embodiments, the operating system 117 of a computing device may not provide native support for interpreting gaze detection data into input commands. Therefore, in such cases, the gaze detection program module 123 may be configured to initiate a scroll action by interacting with an application program module 119 to pass the application program module 119 a scroll command emulating one that is recognized by the application program module 119 (e.g., a mouse wheel scroll, a mouse click and drag, touch and swipe actions or gestures, activation of page up/page down buttons, or other commands).
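This command-emulation approach might be sketched as follows, assuming a hypothetical platform hook (`send_wheel_event`) that injects mouse-wheel input; both the hook and the sign convention are assumptions, not a real operating-system API.

```python
# Sketch: when the OS offers no native gaze input, translate a gaze-triggered
# scroll action into an input event the application already understands
# (here, a number of mouse-wheel "clicks").
from typing import Callable

def emulate_scroll(action: str,
                   send_wheel_event: Callable[[int], None],
                   wheel_clicks: int = 3) -> None:
    """Translate a gaze scroll action into an emulated mouse-wheel event.

    Sign conventions vary by platform; here negative clicks are assumed to
    scroll the view down (content moves up) and positive clicks to scroll up.
    """
    if action == "scroll_down":
        send_wheel_event(-wheel_clicks)
    elif action == "scroll_up":
        send_wheel_event(wheel_clicks)
    else:
        raise ValueError(f"unsupported scroll action: {action!r}")

# Example with a stand-in event sink that merely logs the emulated command
emulate_scroll("scroll_down", send_wheel_event=lambda clicks: print("wheel", clicks))
```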
The gaze detection program module 123 and/or DSP unit 105 and/or one or more GPUs in combination with the camera 112 is referred to generally herein as a gaze detection system. As mentioned, other types of gaze detection systems may be connected to and/or integrated with the computing device 101. The processor 102, which may be controlled by the operating system 117, can be configured to execute the computer-executable instructions of the various program modules, including the gaze detection program module 123, an application program module 119 and the operating system 117. The methods of the present invention may be embodied in such computer-executable instructions. Furthermore, the images or other information displayed by an application program module 119 and data processed by the gaze detection system may be stored in one or more data files 121, which may be stored on any computer readable medium associated with the computing device 101.
In some embodiments, the gaze detection program module 123 may be configured for determining one or more scroll zones relative to the display screen 110 or relative to a window 124 or other portion of the display screen 110 that displays content 125. Scroll zones may also or alternatively be defined in locations away from the display screen 110, e.g., below or to the side of the display screen 110. In
A virtual divider may be of any suitable geometry (e.g., a line, a rectangle, polygon, point, etc.) and may be defined by coordinates relative to the window 124 and/or the display screen 110. In some embodiments, the gaze detection program module 123 sets the coordinates of the virtual divider 204 to be approximately in the middle of the window 124 (i.e., approximately equally between the top and bottom of the window 124), thus defining an upper scroll zone (“Zone 1”) and a lower scroll zone (“Zone 2”) of the window 124. It should be appreciated that the upper Zone 1 and lower Zone 2 do not need to be of approximately equal area; any relative areas will suffice as long as they result in usable scroll zones. Preferably, the geometry of the virtual divider 204 and/or the locations of the scroll zones will result in a “dead zone” in the middle of the window 124 or display screen 110 to ensure that measurement errors and data noise do not cause opposite scroll actions to be initiated in succession while the gaze point 130 remains in the same position. As another alternative, a hysteresis may be implemented to attempt to avoid the same problem.
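A minimal sketch of the divider and dead-zone logic follows, assuming screen coordinates in which y increases downward; the dead-zone proportion is an illustrative assumption.

```python
# Sketch: classify a vertical gaze coordinate into the upper zone, the lower
# zone, or a central dead zone so that noisy samples near the divider do not
# trigger opposite scroll actions in quick succession.
def classify_zone(gaze_y: float, window_height: float,
                  dead_zone_fraction: float = 0.2) -> str:
    """Return 'zone1' (upper), 'zone2' (lower) or 'dead' for a gaze y-coordinate."""
    half_dead = dead_zone_fraction * window_height / 2.0
    divider = window_height / 2.0            # virtual divider near the middle
    if gaze_y < divider - half_dead:
        return "zone1"                        # upper zone -> scroll up
    if gaze_y > divider + half_dead:
        return "zone2"                        # lower zone -> scroll down
    return "dead"                             # no scroll action

for y in (50, 380, 700):
    print(y, classify_zone(y, window_height=800))
```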
The gaze detection program module 123 may further be configured for associating certain scroll actions with the defined zones of the window 124. By way of example, a “scroll up” action may be associated with the upper Zone 1 of the exemplary window 124 and a “scroll down” action may be associated with the lower Zone 2 of the exemplary window 124. A scroll action may be defined in terms of scrolling (i.e., moving the content 125 relative to the window 124) for a determined period of time (e.g., a number of seconds or fractions thereof) or for a predetermined distance, until a gaze point 130 is detected as being moved within or outside of a defined scroll zone of the window 124, only when it is determined that the user is not reading or his/her eyes are otherwise determined to be still for a determined period of time, or until a particular portion of the content 125 passes a certain position within the window 124 (which, for example may be defined as a percentage of the size of the window 124). In some embodiments the size of the scroll zone may dictate the scroll distance or scroll time. For instance, a gaze point detected in a relatively smaller scroll zone may translate to a relatively smaller scroll distance or a relatively shorter scroll time. Conversely, a gaze point detected in a relatively larger scroll zone may translate to a relatively larger scroll distance or a relatively longer scroll time.
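The zone-size-to-scroll-amount relationship mentioned above might be sketched as a simple linear scaling; the scaling constant and cap are illustrative assumptions.

```python
# Sketch: a gaze point landing in a smaller scroll zone yields a shorter
# scroll, a larger zone yields a longer scroll, up to a cap.
def scroll_distance_for_zone(zone_height_px: float,
                             pixels_per_zone_pixel: float = 2.0,
                             max_distance_px: float = 800.0) -> float:
    """Map the size of the scroll zone the gaze fell in to a scroll distance."""
    return min(zone_height_px * pixels_per_zone_pixel, max_distance_px)

print(scroll_distance_for_zone(60))    # small zone  -> 120.0 px scroll
print(scroll_distance_for_zone(300))   # large zone  -> 600.0 px scroll
```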
In some embodiments, it is preferable to configure the gaze detection program module 123 such that scroll actions may be initiated even when the user is determined to be reading. For instance, when the user is looking at a map, the scrolling (or “panning”) must be initiated more quickly than when the user is reading text. Thus, the dwell time before triggering a scroll action when reading text may be longer than for reviewing maps and other graphical content, and the scroll zones may be chosen differently in each case. For example, the scroll zone(s) may have to be made larger in the case of a map or other graphical content in order to make the gaze detection system sufficiently responsive, while scroll zone(s) for a text document may be smaller because a scroll action is typically not required until the user is reading text very close (e.g., 5 lines) to the bottom or top of the window 124.
In cases where a scroll action is initiated in response to detecting a gaze point 130 within a defined scroll zone for a determined period of time and the scroll action is terminated when the gaze point is determined to have moved outside of that scroll zone, a time-out mechanism may be implemented, such that automatic scrolling is not initiated again (even if a gaze point is subsequently determined to be in a defined scroll zone) until after expiration of a defined time period, which may be a number of seconds or fractions thereof. In addition or in the alternative, after a first scroll action is terminated, the size of one or more scroll zones may be at least temporarily decreased, such that a more deliberate gaze point is required to initiate a subsequent scrolling action. In some embodiments, the gaze detection program module 123 may be configured to allow a user to alter (e.g. by way of a control panel or other interface) the sensitivity of a scroll action (e.g., the size or position of the scroll zone, and/or how fast and/or how far the content 125 scrolls) and/or the period of time the gaze point must remain in a scroll zone to initiate a scroll action and/or actions to take following termination of a scroll action.
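One possible sketch of the post-scroll time-out and temporary zone shrinking follows; the cooldown, shrink factor and durations are illustrative assumptions.

```python
# Sketch: after a scroll action terminates, suppress further automatic
# scrolling for a cooldown period and temporarily shrink the triggering zone
# so a more deliberate gaze is needed.
class ScrollGovernor:
    def __init__(self, cooldown_s: float = 1.5,
                 shrink_factor: float = 0.5, shrink_duration_s: float = 3.0):
        self.cooldown_s = cooldown_s
        self.shrink_factor = shrink_factor
        self.shrink_duration_s = shrink_duration_s
        self._last_scroll_end = float("-inf")

    def on_scroll_terminated(self, now: float) -> None:
        self._last_scroll_end = now

    def scrolling_allowed(self, now: float) -> bool:
        # suppress new scroll actions until the cooldown has expired
        return (now - self._last_scroll_end) >= self.cooldown_s

    def effective_zone_height(self, base_height: float, now: float) -> float:
        # temporarily shrink the zone right after a scroll action terminates
        if (now - self._last_scroll_end) < self.shrink_duration_s:
            return base_height * self.shrink_factor
        return base_height

gov = ScrollGovernor()
gov.on_scroll_terminated(now=100.0)
print(gov.scrolling_allowed(now=100.5))              # False: still cooling down
print(gov.effective_zone_height(120.0, now=101.0))   # 60.0: zone temporarily shrunk
print(gov.effective_zone_height(120.0, now=110.0))   # 120.0: back to normal
```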
For instance, a “scroll up” action may be defined in terms of moving the content 125 downward relative to the window 124 until the portion of the content 125 that was previously at or near the top of lower Zone 2 (prior to scrolling) reaches the bottom or near the bottom of the window 124 or otherwise reaches a predefined position within the window 124. Similarly, a “scroll down” action may be defined in terms of moving the content 125 upward relative to the window 124 until the portion of the content 125 that was previously at or near the bottom of upper Zone 1 (prior to scrolling) reaches the top or near the top of the window 124 or otherwise reaches a predefined position within the window 124. Continuing a scroll action until a portion of the content 125 reaches a predefined position within the window 124 may be advantageous for a gaze detection system of lower robustness, i.e., a gaze detection system that fairly often loses data and may require a fairly long sampling time to determine gaze points with a reasonable degree of certainty. In embodiments where scroll actions are initiated in connection with a user reading content 125, this is easy to achieve because as the person reads content close to the bottom of the window 124 he is likely to keep his gaze in that area for several seconds, thus allowing for tens or maybe even hundreds of gaze samples before determining to initiate a scroll action.
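For the position-based rule just described, the content shift required by a single scroll-down action might be computed as in this small sketch; the zone geometry is an illustrative assumption.

```python
# Sketch: for a scroll-down action, shift the content upward so that the line
# previously at the bottom of upper Zone 1 lands at the top of the window.
def next_content_offset(current_offset: float, zone1_bottom_y: float) -> float:
    """New vertical content offset (in pixels) after one scroll-down action.

    zone1_bottom_y is the bottom of upper Zone 1, measured from the window top.
    """
    return current_offset + zone1_bottom_y

# With an upper zone ending 300 px below the top of the window, one
# scroll-down action moves the content up by 300 px.
print(next_content_offset(current_offset=1200.0, zone1_bottom_y=300.0))  # 1500.0
```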
It will be appreciated that scrolling actions may be defined in various ways and that different definitions may be used in connection with different application program modules 119, different window 124 sizes and/or different types of computing devices 101 (e.g., based on screen size, image or text resolution, etc.).
The gaze detection program module 123 may also be configured for determining whether a detected gaze point 130 is within one of the defined zones of the window 124 and, in response, which scrolling action to implement. With reference to the example of
In some embodiments, the gaze detection program module 123 may be configured such that scrolling based on gaze detection (referred to herein as “scroll mode”) is always enabled. In other embodiments, the gaze detection program module 123 may be configured such that scroll mode may be enabled/disabled in response to user commands. Exemplary user commands for enabling/disabling scroll mode may include activation of a button 114 or other actuator, input of a voice command via a microphone 113, interaction with a touch screen or keyboard, shaking or moving the computing device 101 in a particular fashion (if suitable motion detector components are included), a specified gesture, specified eye movements, or gazing at a defined region on or away from the display screen 110, etc. In certain embodiments, scrolling actions are initiated in response to gaze detection only if an associated user input command (e.g., a voice command) is simultaneously or contemporaneously detected.
In some embodiments, the gaze detection program module 123 may be configured to differentiate between a user gazing (e.g., for purposes of triggering a scroll action) and a user reading displayed content 125. For example, known techniques include detecting and evaluating saccades and whether an eye fixates or dwells on or around a constant point on the display. This information may be used to determine indicators of reading as distinguished from a more fixed gaze. In some embodiments, the gaze detection program module 123 may be configured to use gaze data patterns (e.g., the frequency with which gaze points appear in certain positions) to determine with greater accuracy, based on statistical analysis, when an actual gaze point is within a defined scroll zone. This approach is particularly useful in connection with relatively small scroll zones, which may be due to relatively small window 124 and/or display screen 110 sizes.
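One plausible sketch of the statistical treatment of gaze data patterns: vote over a sliding window of recent samples and declare the gaze to be within a small scroll zone only when enough of them agree. The window length and threshold are assumptions.

```python
# Sketch: rather than trusting a single noisy sample, count how often recent
# gaze samples fall inside a (possibly small) scroll zone and treat the zone
# as "gazed at" only when that fraction clears a threshold.
from collections import deque

class ZoneVote:
    def __init__(self, window: int = 30, required_fraction: float = 0.7):
        self.samples = deque(maxlen=window)   # recent booleans: sample in zone?
        self.required_fraction = required_fraction

    def add(self, in_zone: bool) -> None:
        self.samples.append(in_zone)

    def gaze_in_zone(self) -> bool:
        if not self.samples:
            return False
        return sum(self.samples) / len(self.samples) >= self.required_fraction

vote = ZoneVote(window=10, required_fraction=0.7)
for hit in [True, True, False, True, True, True, True, False, True, True]:
    vote.add(hit)
print(vote.gaze_in_zone())   # True: 8 of the 10 recent samples fell in the zone
```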
In some embodiments, the gaze detection program module 123 may further be configured for determining whether a detected gaze point 130 remains within a particular scroll zone (or within a defined position relative to a configurable scroll point, as discussed with reference to
As shown in
In the case of horizontally scrollable content, defined scroll actions may include a “scroll right” action and/or a “scroll left” action. In particular, a scroll right action may be defined in terms of moving the content 125 to the left relative to the window 124 and a scroll left action may be defined in terms of moving the content 125 to the right relative to the window 124. As noted above, a scroll action may be defined in terms of scrolling for a determined period of time, until a gaze point 130 is detected as being moved within or outside of a defined zone of the window 124, or until a particular portion of the content 125 passes a certain position within the window 124. In the example of
In some embodiments, more than two scroll zones may be defined for automatic scrolling based on gaze detection. For example, as shown in
Again, a dead zone is preferably positioned between each of the adjacent scroll zones and/or a hysteresis is implemented to account for the effects of measurement errors and data noise. In some embodiments, adjacent scroll zones may be overlapping. For instance, all scroll zones may be rectangular such that zone 1 is the upper quarter of the screen, zone 2 the right quarter, zone 3 the lower quarter and zone 4 the left quarter. This configuration will give an overlap (essentially indicating scrolling in two directions), but it will also leave the central portion (half of the screen width and half of the screen height) as a dead zone allowing for quite large measurement errors in the determined gaze position without inducing erratic scrolling.
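The overlapping quarter-zone layout described above can be sketched as a classification of the gaze point into a set of zones, with the empty set corresponding to the central dead zone.

```python
# Sketch: a gaze point may belong to the top, right, bottom and/or left
# quarter of the screen simultaneously (implying scrolling in two directions),
# while the central half-by-half region belongs to none of them.
def quarter_zones(x: float, y: float, width: float, height: float) -> set[str]:
    zones = set()
    if y < height / 4:
        zones.add("up")       # top quarter
    if y > 3 * height / 4:
        zones.add("down")     # bottom quarter
    if x < width / 4:
        zones.add("left")     # left quarter
    if x > 3 * width / 4:
        zones.add("right")    # right quarter
    return zones              # empty set = central dead zone

print(quarter_zones(100, 100, 1600, 900))   # {'up', 'left'} -> scroll diagonally
print(quarter_zones(800, 450, 1600, 900))   # set() -> dead zone, no scrolling
```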
In some embodiments the gaze detection program module 123 may be configured to allow a user to define one or more “scroll points” for determining custom scroll zones used for automatic scrolling based on gaze detection. As shown by way of example in
In the example of
Those skilled in the art will recognize that one or more scroll points may alternatively or additionally be defined so as to create vertical scroll zones for automatic horizontal scrolling based on gaze detection. As shown by way of example in
In embodiments where scroll points 602, 604 are defined as geometric shapes, the gaze detection program module 123 may be configured to determine when a gaze point 130 encompasses or is otherwise within a defined position relative to a scroll point 602, 604. In other words, the scroll zone associated with such a scroll point 602, 604 may be defined in terms of an area surrounding the scroll point 602, 604 and the size of that area may be varied depending at least in part on the sensitivity and accuracy of the gaze detection system. For example, if a gaze point is determined to be within a configurable number of inches, centimeters, millimeters or other distance in one or more direction (x,y) from a scroll point 602, 604 (e.g. in the direction of scroll relative to the scroll point or in all directions from the scroll point, etc.), the gaze detection program module 123 may be configured to initiate a scroll action associated with that particular scroll point 602, 604.
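A minimal sketch of the scroll-point proximity test, assuming a Euclidean distance tolerance (a per-axis tolerance in x and/or y could equally be used, as the passage notes).

```python
# Sketch: a scroll action tied to a user-defined scroll point fires when the
# gaze point comes within a configurable distance of that point.
import math

def gaze_near_scroll_point(gaze: tuple[float, float],
                           scroll_point: tuple[float, float],
                           tolerance_px: float = 50.0) -> bool:
    return math.dist(gaze, scroll_point) <= tolerance_px

print(gaze_near_scroll_point((410.0, 760.0), (400.0, 780.0)))  # True: within tolerance
print(gaze_near_scroll_point((100.0, 100.0), (400.0, 780.0)))  # False: far away
```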
Next in step 704, a determination may be made as to whether scroll mode is enabled. In some embodiments, scroll mode may always be enabled. In other embodiments, the user may enable/disable scroll mode using any suitable commands, which may themselves be defined and/or configured by the user. If scroll mode is not enabled, the method loops between step 704 and step 706 to await an input command for enabling scroll mode. When it is determined at step 704 that scroll mode is enabled, the method advances to step 708 to detect or determine a gaze point resulting from the user viewing the displayed content 125. At step 710, the gaze point is determined to be within an applicable scroll zone or within a defined position relative to an applicable scroll point.
In step 714 a determination may optionally be made as to whether the gaze point remains within the applicable scroll zone beyond the expiration of a threshold time period. If the gaze point is determined not to remain within the applicable scroll zone beyond expiration of the threshold time period, it may be assumed that the user does not intend to initiate a scroll action and, in that case, the method loops back to step 708 to await detection or determination of the next gaze point.
The determination of whether the gaze point remains within the applicable scroll zone beyond a threshold time period may involve intelligent filtering. For instance, intelligent filtering may involve filtering out data samples that were not usable for determining a projected gaze position. Additionally, the intelligent filtering may involve filtering out a certain percentage of the gaze data samples that were not usable for determining a projected gaze position due to measurement errors. Preferably, the gaze detection system should require that the last sample or a very recent sample of gaze data shows that the user is in fact gazing within the applicable scroll zone as part of this intelligent filtering.
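One plausible reading of this intelligent filtering, expressed as a sketch: discard unusable samples, tolerate a bounded fraction of out-of-zone samples as measurement error, and require the most recent usable sample to lie within the zone. All thresholds and the sample format are assumptions.

```python
# Sketch of a dwell check with simple filtering of recent gaze samples.
def dwell_confirmed(samples: list[dict],
                    min_samples: int = 20,
                    max_out_of_zone_fraction: float = 0.2) -> bool:
    """samples: recent gaze samples, each {'usable': bool, 'in_zone': bool}."""
    usable = [s for s in samples if s["usable"]]          # drop unusable samples
    if len(usable) < min_samples:
        return False
    out_of_zone = sum(1 for s in usable if not s["in_zone"])
    if out_of_zone / len(usable) > max_out_of_zone_fraction:
        return False
    return usable[-1]["in_zone"]                          # last usable sample must be in zone

recent = [{"usable": True, "in_zone": True}] * 22 + [{"usable": False, "in_zone": False},
                                                     {"usable": True, "in_zone": True}]
print(dwell_confirmed(recent))   # True
```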
If the determination of step 714 is performed and the gaze point is determined to remain within the applicable scroll zone beyond expiration of the threshold time period, the method advances to step 716 where the applicable scroll action is initiated. Following step 716, the method ends at step 718.
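The overall flow of steps 704 through 718 can be summarized in a compact loop; every callable below is a hypothetical stand-in for the components described in the text, not a definitive implementation.

```python
# Sketch of the flow in steps 704-718: wait for scroll mode, read gaze points,
# check zone membership and dwell, then fire the zone's scroll action.
def run_scroll_loop(scroll_mode_enabled, next_gaze_point, zone_for,
                    dwell_satisfied, initiate_scroll, max_iterations=1000):
    for _ in range(max_iterations):
        if not scroll_mode_enabled():          # steps 704/706: await scroll mode
            continue
        gaze = next_gaze_point()               # step 708: detect/determine gaze point
        zone = zone_for(gaze)                  # step 710: applicable scroll zone?
        if zone is None:
            continue
        if not dwell_satisfied(zone):          # optional step 714: dwell threshold
            continue
        initiate_scroll(zone)                  # step 716: initiate the scroll action
        break                                  # step 718: end (one pass of the method)

# Toy run with stand-in hooks
run_scroll_loop(scroll_mode_enabled=lambda: True,
                next_gaze_point=lambda: (500, 780),
                zone_for=lambda g: "down" if g[1] > 600 else None,
                dwell_satisfied=lambda z: True,
                initiate_scroll=lambda z: print("scroll", z))
```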
Although the automatic scrolling in response to gaze detection methods and other various methods and systems described herein may be embodied in software or code executed by general purpose hardware as discussed above, as an alternative the same may also be embodied in dedicated hardware or a combination of software/general purpose hardware and dedicated hardware. If embodied in dedicated hardware, each can be implemented as a circuit or state machine that employs any one of or a combination of a number of technologies. These technologies may include, but are not limited to, discrete logic circuits having logic gates for implementing various logic functions upon an application of one or more data signals, application specific integrated circuits having appropriate logic gates, or other components, etc. Such technologies are generally well known by those skilled in the art and, consequently, are not described in detail herein.
The flowchart of
Although the flowchart of
Any logic or application described herein, including the gaze detection program module 123, application program module 119 and other processes and modules running on a client device 120, that comprises software or code can be embodied in any non-transitory computer-readable medium for use by or in connection with an instruction execution system such as, for example, a processor in a computer system or other system. In this sense, the logic may comprise, for example, statements including instructions and declarations that can be fetched from the computer-readable medium and executed by the instruction execution system. In the context of the present disclosure, a “computer-readable medium” can be any medium that can contain, store, or maintain the logic or application described herein for use by or in connection with the instruction execution system. The computer-readable medium can comprise any one of many physical media such as, for example, magnetic, optical, or semiconductor media. More specific examples of a suitable computer-readable medium would include, but are not limited to, magnetic tapes, magnetic floppy diskettes, magnetic hard drives, memory cards, solid-state drives, USB flash drives, or optical discs. Also, the computer-readable medium may be a random access memory (RAM) including, for example, static random access memory (SRAM) and dynamic random access memory (DRAM), or magnetic random access memory (MRAM). In addition, the computer-readable medium may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or other type of memory device.
It should be emphasized that the above-described embodiments of the present disclosure are merely possible examples of implementations set forth for a clear understanding of the principles of the disclosure. Many variations and modifications may be made to the above-described embodiment(s) without departing substantially from the spirit and principles of the disclosure. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.
This application is a continuation of U.S. patent application Ser. No. 13/802,240, filed Mar. 13, 2013, the entire disclosure of which is incorporated by reference herein for all purposes.
Number | Name | Date | Kind |
---|---|---|---|
5390281 | Luciw et al. | Feb 1995 | A |
5471542 | Ragland | Nov 1995 | A |
5731805 | Tognazzini et al. | Mar 1998 | A |
5850211 | Tognazzini | Dec 1998 | A |
6011555 | Eckhoff et al. | Jan 2000 | A |
6021403 | Horvitz et al. | Feb 2000 | A |
6067565 | Horvitz | May 2000 | A |
6085226 | Horvitz | Jul 2000 | A |
6204828 | Amir et al. | Mar 2001 | B1 |
6233570 | Horvitz et al. | May 2001 | B1 |
6260035 | Horvitz et al. | Jul 2001 | B1 |
6262730 | Horvitz et al. | Jul 2001 | B1 |
6351273 | Lemelson et al. | Feb 2002 | B1 |
6393584 | McLaren et al. | May 2002 | B1 |
6421064 | Lemelson et al. | Jul 2002 | B1 |
6578962 | Amir et al. | Jun 2003 | B1 |
6603491 | Lemelson et al. | Aug 2003 | B2 |
6734845 | Nielsen et al. | May 2004 | B1 |
6873314 | Campbell | Mar 2005 | B1 |
6882354 | Nielsen | Apr 2005 | B1 |
6886137 | Peck et al. | Apr 2005 | B2 |
7013258 | Su et al. | Mar 2006 | B1 |
7091928 | Rajasingham | Aug 2006 | B2 |
7113170 | Lauper et al. | Sep 2006 | B2 |
7346195 | Lauper et al. | Mar 2008 | B2 |
7486302 | Shoemaker | Feb 2009 | B2 |
7572008 | Elvesjo et al. | Aug 2009 | B2 |
7614011 | Karidis et al. | Nov 2009 | B2 |
7630254 | Lutze | Dec 2009 | B2 |
7630524 | Lauper et al. | Dec 2009 | B2 |
7634528 | Horvitz et al. | Dec 2009 | B2 |
7792391 | Arima | Sep 2010 | B2 |
8094122 | Molander et al. | Jan 2012 | B2 |
8226574 | Whillock et al. | Jul 2012 | B2 |
8235529 | Raffle et al. | Aug 2012 | B1 |
8339446 | Blixt et al. | Dec 2012 | B2 |
8407263 | Elad et al. | Mar 2013 | B2 |
8564533 | Yuan | Oct 2013 | B2 |
8620913 | Hough et al. | Dec 2013 | B2 |
8643680 | Baldwin et al. | Feb 2014 | B2 |
8756516 | Singh et al. | Jun 2014 | B2 |
8762846 | Kellerman et al. | Jun 2014 | B2 |
8786953 | Wheeler et al. | Jul 2014 | B2 |
8963806 | Starner et al. | Feb 2015 | B1 |
9377863 | Bychkov et al. | Jun 2016 | B2 |
9400553 | Kerr et al. | Jul 2016 | B2 |
9423870 | Teller et al. | Aug 2016 | B2 |
9423871 | Sukumar | Aug 2016 | B2 |
9454225 | Bychkov et al. | Sep 2016 | B2 |
9478143 | Bowen | Oct 2016 | B1 |
9480397 | Larsen | Nov 2016 | B2 |
9619020 | George-Svahn et al. | Apr 2017 | B2 |
9864498 | Olsson et al. | Jan 2018 | B2 |
20020032696 | Takiguchi et al. | Mar 2002 | A1 |
20020105482 | Lemelson et al. | Aug 2002 | A1 |
20020180799 | Peck et al. | Dec 2002 | A1 |
20030020755 | Lemelson et al. | Jan 2003 | A1 |
20030052903 | Weast | Mar 2003 | A1 |
20030067446 | Ono | Apr 2003 | A1 |
20030098954 | Amir et al. | May 2003 | A1 |
20040175020 | Bradski et al. | Sep 2004 | A1 |
20040199663 | Horvitz et al. | Oct 2004 | A1 |
20050030322 | Gardos | Feb 2005 | A1 |
20050047629 | Farrell et al. | Mar 2005 | A1 |
20050197763 | Robbins et al. | Sep 2005 | A1 |
20050221268 | Chaar et al. | Oct 2005 | A1 |
20060066567 | Scharenbroch et al. | Mar 2006 | A1 |
20060087502 | Karidis et al. | Apr 2006 | A1 |
20060174213 | Kato | Aug 2006 | A1 |
20060192775 | Nicholson et al. | Aug 2006 | A1 |
20060256133 | Rosenberg | Nov 2006 | A1 |
20070078552 | Rosenberg | Apr 2007 | A1 |
20070122064 | Arima | May 2007 | A1 |
20070132663 | Iba et al. | Jun 2007 | A1 |
20070164990 | Bjorklund et al. | Jul 2007 | A1 |
20070219732 | Creus et al. | Sep 2007 | A1 |
20070279591 | Wezowski et al. | Dec 2007 | A1 |
20080074389 | Beale | Mar 2008 | A1 |
20080114614 | Mahesh et al. | May 2008 | A1 |
20080130950 | Miklos et al. | Jun 2008 | A1 |
20080148149 | Singh et al. | Jun 2008 | A1 |
20080270474 | Flake et al. | Oct 2008 | A1 |
20080281915 | Elad et al. | Nov 2008 | A1 |
20080320418 | Huang et al. | Dec 2008 | A1 |
20090146950 | Maringelli | Jun 2009 | A1 |
20090245600 | Hoffman et al. | Oct 2009 | A1 |
20090273562 | Baliga et al. | Nov 2009 | A1 |
20090315827 | Elvesjö et al. | Dec 2009 | A1 |
20100079508 | Hodge et al. | Apr 2010 | A1 |
20100125816 | Bezos | May 2010 | A1 |
20100182232 | Zamoyski | Jul 2010 | A1 |
20100225668 | Tatke et al. | Sep 2010 | A1 |
20100245093 | Kobetski et al. | Sep 2010 | A1 |
20100295774 | Hennessey | Nov 2010 | A1 |
20110045810 | Issa et al. | Feb 2011 | A1 |
20110115703 | Iba et al. | May 2011 | A1 |
20110115883 | Kellerman et al. | May 2011 | A1 |
20110119361 | Issa et al. | May 2011 | A1 |
20110175932 | Yu et al. | Jul 2011 | A1 |
20110270123 | Reiner | Nov 2011 | A1 |
20120011170 | Elad et al. | Jan 2012 | A1 |
20120050273 | Yoo et al. | Mar 2012 | A1 |
20120105486 | Lankford et al. | May 2012 | A1 |
20120256967 | Baldwin et al. | Oct 2012 | A1 |
20120272179 | Stafford | Oct 2012 | A1 |
20130044055 | Karmarkar et al. | Feb 2013 | A1 |
20130106674 | Wheeler | May 2013 | A1 |
20130132867 | Morris et al. | May 2013 | A1 |
20130135196 | Park et al. | May 2013 | A1 |
20130169560 | Cederlund et al. | Jul 2013 | A1 |
20130176208 | Tanaka et al. | Jul 2013 | A1 |
20130176250 | Lee et al. | Jul 2013 | A1 |
20130229368 | Harada | Sep 2013 | A1 |
20130267317 | Aoki et al. | Oct 2013 | A1 |
20130275883 | Bharshankar et al. | Oct 2013 | A1 |
20130283208 | Bychkov et al. | Oct 2013 | A1 |
20130300637 | Smits et al. | Nov 2013 | A1 |
20130300654 | Seki | Nov 2013 | A1 |
20130321400 | Van Os et al. | Dec 2013 | A1 |
20140002352 | Jacob et al. | Jan 2014 | A1 |
20140019136 | Tanaka | Jan 2014 | A1 |
20140026098 | Gilman | Jan 2014 | A1 |
20140104197 | Khosravy et al. | Apr 2014 | A1 |
20140126782 | Takai et al. | May 2014 | A1 |
20140168054 | Yang et al. | Jun 2014 | A1 |
20140184550 | Hennessey et al. | Jul 2014 | A1 |
20140191948 | Kim et al. | Jul 2014 | A1 |
20140195918 | Friedlander | Jul 2014 | A1 |
20140211995 | Model | Jul 2014 | A1 |
20140247208 | Henderek et al. | Sep 2014 | A1 |
20140247210 | Henderek et al. | Sep 2014 | A1 |
20140247215 | George-Svahn et al. | Sep 2014 | A1 |
20140247232 | George-Svahn et al. | Sep 2014 | A1 |
20140268054 | Olsson et al. | Sep 2014 | A1 |
20140334666 | Lankford et al. | Nov 2014 | A1 |
20150009238 | Kudalkar | Jan 2015 | A1 |
20150138079 | Lannsjö | May 2015 | A1 |
20150138244 | George-Svahn et al. | May 2015 | A1 |
20150143293 | George-Svahn et al. | May 2015 | A1 |
20150149956 | Kempinski et al. | May 2015 | A1 |
20160116980 | George-Svahn et al. | Apr 2016 | A1 |
20170083088 | Lannsjö et al. | Mar 2017 | A1 |
20170177078 | Henderek et al. | Jun 2017 | A1 |
20170177079 | George-Svahn et al. | Jun 2017 | A1 |
Number | Date | Country |
---|---|---|
105339866 | Feb 2016 | CN |
0903662 | Mar 1999 | EP |
0816980 | Jan 2001 | EP |
1646026 | Apr 2006 | EP |
1812881 | Aug 2007 | EP |
1832753 | Sep 2007 | EP |
1970649 | Sep 2008 | EP |
2048326 | Apr 2009 | EP |
2075430 | Jul 2010 | EP |
2613224 | Jul 2013 | EP |
2752733 | Jul 2014 | EP |
2762997 | Aug 2016 | EP |
3088997 | Nov 2016 | EP |
2490864 | Nov 2012 | GB |
2016-0005013 | Jan 2016 | KR |
9202880 | Feb 1992 | WO |
9803907 | Jan 1998 | WO |
2006045843 | May 2006 | WO |
2010051037 | May 2010 | WO |
2010127714 | Nov 2010 | WO |
2010132991 | Nov 2010 | WO |
2010141403 | Dec 2010 | WO |
2012145180 | Oct 2012 | WO |
2012138744 | Nov 2012 | WO |
2013033842 | Mar 2013 | WO |
2013144807 | Oct 2013 | WO |
2013168171 | Nov 2013 | WO |
2013168173 | Nov 2013 | WO |
2014134623 | Sep 2014 | WO |
Entry |
---|
U.S. Appl. No. 14/195,755, filed Mar. 3, 2014 Final Rejection dated Jun. 1, 2018, all pages. |
U.S. Appl. No. 14/547,087, filed Nov. 18, 2014 Final Rejection dated Apr. 16, 2018, all pages. |
U.S. Appl. No. 14/547,089, filed Nov. 18, 2014 Non-Final Rejection dated May 11, 2018, all pages. |
U.S. Appl. No. 14/600,896, filed Jan. 20, 2015 Final Rejection dated Jun. 22, 2018, all pages. |
U.S. Appl. No. 15/367,453, filed Dec. 2, 2016 Non-Final Rejection dated Jul. 17, 2018, all pages. |
U.S. Appl. No. 15/449,058, filed Mar. 3, 2017 Non-Final Rejection dated May 11, 2018, all pages. |
“Adjacent”, Dictionary.com, http://dictionary.reference.com/browse/adjacent, Nov. 18, 2011, 1 page. |
U.S. Appl. No. 15/446,843, “Final Office Action”, dated Feb. 26, 2018, 16 pages. |
European Patent Application No. 14716455.2, “Office Action”, dated Sep. 30, 2016, 5 pages. |
European Patent Application No. 14716455.2, “Office Action”, dated Feb. 20, 2018, 6 pages. |
European Patent Application No. 16174545.0, “Extended European Search Report”, dated Oct. 4, 2016, 7 pages. |
European Patent Application No. 16174545.0, “Office Action”, dated Jul. 1, 2016, 1 page. |
European Patent Application No. 97304371.4, “Extended European Search Report”, Sep. 9, 1998, 5 pages. |
Korean Patent Application No. 10-2015-7027213, “Office Action”, dated Oct. 7, 2015, 3 pages. |
Kumar et al., “Gaze-enhanced Scrolling Techniques”, ACM, UIST'07, Oct. 7-10, 2007, 4 pages. |
PCT/US1997/010856, “International Search Report and Written Opinion”, dated Jul. 16, 1998, 4 pages. |
PCT/US2014/020024, “International Preliminary Report on Patentability”, dated Sep. 11, 2015, 11 pages. |
PCT/US2014/020024, “International Search Report and Written Opinion”, dated Jul. 29, 2014, 15 pages. |
Number | Date | Country | |
---|---|---|---|
20180181272 A1 | Jun 2018 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13802240 | Mar 2013 | US |
Child | 15851292 | US |