Embodiments disclosed herein are generally directed to systems and methods for interacting with devices based on a detected gaze.
As electronic devices become faster and have larger displays with increased resolution, the power needed to operate these electronic devices also increases. In order to conserve power, many devices include settings that will turn the displays off when not in use.
However, when a user wants to interact with the device, the user often first needs to press a button, or otherwise manually interact with the device to activate the display. For example, a desktop computer often requires the user to press one or more buttons, or manipulate an interaction device such as a mouse to activate the display. On mobile devices, the user may have to press a button or tap the display. Further, on devices that include security challenges, such as lockscreens, the user often also has to perform an unlock function, whether it is a simple swipe on a mobile device or the entry of a complex password.
Accordingly, what is needed is a method for interacting with a device that provides an improved and smoother interaction experience, and makes it easier for a user to interact with the device.
Consistent with some embodiments, there is provided a system for interacting with a device based on a detected gaze. The system includes one or more sensors configured to detect a triggering event, and an optical detection sensor module configured to detect a gaze of a user, the optical detection sensor module being activated when the one or more sensors detect a triggering event. The system also includes a display component, the display component being activated when the optical detection sensor module detects a gaze of the user, wherein the display component is deactivated after a predetermined amount of time when the optical detection sensor module no longer detects a gaze.
Consistent with some embodiments, there is further provided a method for device interaction. The method includes steps of detecting a triggering event, activating gaze detection when a triggering event is detected, determining, based on information from an optical detection sensor, whether a gaze is detected using the activated gaze detection, activating a display of the device when a gaze is detected, and deactivating the gaze detection after a predetermined amount of time when a gaze is not detected. The method may also be embodied in a computer-readable medium.
Consistent with some embodiments, there is further provided a method for displaying content on a device. The method includes steps of determining that the content is available for display by the device, activating a display component of the device and an optical detection sensor of the device when content is determined to be available for display, determining, by one or more processors of the device in communication with the optical detection sensor, whether a gaze is detected, displaying the content when a gaze is detected and keeping the display component active as long as the gaze is detected, and deactivating the optical detection sensor and the display component when a gaze is not detected.
Consistent with some embodiments, there is further provided a system for device interaction. The system includes means for detecting a triggering event and means for detecting a gaze, the means for detecting a gaze being activated when a triggering event is detected. The system also includes means for determining whether a gaze is detected, means for activating a display of the device when a gaze is detected, and means for deactivating the gaze detection after a predetermined amount of time if a gaze is not detected.
In the drawings, elements having the same designation have the same or similar functions.
In the following description, specific details are set forth describing certain embodiments. It will be apparent, however, to one skilled in the art that the disclosed embodiments may be practiced without some or all of these specific details. The specific embodiments presented are meant to be illustrative, but not limiting. One skilled in the art may realize other material that, although not specifically described herein, is within the scope and spirit of this disclosure.
Processing device 100 may include network interface component 102 configured for communication with a network. Consistent with some embodiments, network interface component 102 may be configured to interface with a coaxial cable, a fiber optic cable, a digital subscriber line (DSL) modem, a public switched telephone network (PSTN) modem, an Ethernet device, and/or various other types of wired network communication devices. Network interface component 102 may also include one or more wireless transceivers, wherein each wireless transceiver may include an antenna that is separable or integral and is capable of transmitting and receiving information according to a different wireless networking protocol, such as Wi-Fi™, 3G, 4G, HSDPA, LTE, RF, or NFC. Consistent with some embodiments, processing device 100 includes a system bus 104 for interconnecting various components within processing device 100 and communicating information between the various components. In some embodiments, the bus 104 is implemented in a System on Chip (SoC) and connects various elements or components on the chip and/or cores of one or more processors. Components may include a processing component 106, which may be one or more processors, central processing units (CPUs), image signal processors (ISPs), micro-controllers, or digital signal processors (DSPs), a system memory component 108, which may correspond to random access memory (RAM), an internal memory component 110, which may correspond to read only memory (ROM), and an external or static memory 112, which may correspond to optical, magnetic, or solid-state memories. Consistent with some embodiments, processing device 100 may also include a display component 114 for displaying information to a user. Display component 114 may be a liquid crystal display (LCD) screen, an organic light emitting diode (OLED) screen (including active matrix AMOLED screens), an LED screen, a plasma display, or a cathode ray tube (CRT) display. Display component 114 may be integrated with processing device 100, or may be separate from processing device 100 and coupled to processing device 100. Display component 114 may have an associated ambient light sensor (not shown) that may be configured to control a brightness of display component 114 based on a detected ambient brightness. Processing device 100 may also include an input and navigation control component 116, allowing a user to input information and navigate along display component 114. Input and navigation control component 116 may include, for example, a keyboard or key pad, whether physical or virtual, a mouse, a trackball, or other such device, or a capacitive sensor-based touch screen. Processing device 100 may include more or fewer components than shown in
Processing device 100 may also include an accelerometer 118 for detecting acceleration of processing device 100. According to some embodiments, acceleration detected by accelerometer 118 may be indicative of movement or motion of processing device 100. For example, a detected acceleration may be indicative of a user 120 picking up or moving processing device 100. According to some embodiments, accelerometer 118 detects an acceleration of device 100 and compares the detected acceleration to a known acceleration that is indicative of motion according to instructions stored in any of memories 108, 110, and 112. In some embodiments, accelerometer 118 may detect an acceleration of device 100 and send information regarding the detected acceleration to processing component 106 for processing the acceleration information according to instructions stored in any of memories 108-112. Based on the provided information, processing component 106 may determine if the device is or has been in motion.
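As a concrete illustration of the comparison described above, the following minimal sketch treats a deviation of the measured acceleration magnitude from gravity as motion; the helper signature and the threshold value are assumptions for illustration and are not taken from the disclosure.

```python
import math

GRAVITY_MS2 = 9.81          # expected magnitude when the device is at rest
MOTION_THRESHOLD_MS2 = 1.5  # illustrative deviation treated as motion

def is_in_motion(ax, ay, az, threshold=MOTION_THRESHOLD_MS2):
    """Compare a detected acceleration (m/s^2 per axis) to the at-rest value.

    A magnitude that deviates from gravity by more than the threshold is
    treated as motion indicative of the device being picked up or moved.
    """
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    return abs(magnitude - GRAVITY_MS2) > threshold

print(is_in_motion(0.0, 0.1, 9.8))   # False: essentially at rest
print(is_in_motion(3.0, 0.5, 12.0))  # True: a jolt consistent with a pickup
```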
Processing device 100 may also include an optical detection sensor module 122. Optical detection sensor module 122 may be any sensor module capable of detecting a face and/or a gaze of a person, such as user 120. Moreover, optical detection sensor module 122 may include one or more processors and at least one memory, the one or more processors being configured to execute instructions stored in the at least one memory to capture optical information and analyze the captured optical information to detect a face and/or gaze of a person such as user 120. In some embodiments, optical detection sensor module 122 captures optical information and sends the information to processing component 106 which may analyze the information to detect a face and/or gaze based on instructions stored in any of optical detection sensor module 122 and memories 108-112.
According to some embodiments, optical detection sensor module 122 may include or be in communication with a camera, such as a built-in front facing camera of processing device 100, or a camera built into a coupled display or a coupled peripheral camera device. The camera may be a visible light camera or a depth-sensing camera, such as the Microsoft® Xbox™ Kinect™ camera. In some embodiments, optical detection sensor module 122 may also include or be in communication with an infrared or other light-based eye tracker that senses light reflected from an eye or particular features within an eye of user 120 to track the eye of user 120. According to some embodiments, optical detection sensor module 122 may be configured to cause a sensor to capture frames of images that are processed within module 122 or, in some embodiments, by processing component 106 for detecting a face in the frames of images and, if a face is detected, detecting a gaze from the face. A face and a gaze may be detected using one or more face and gaze detection algorithms stored in module 122 or any of memories 108, 110, and 112. For example, a gaze may be detected by determining eye direction of a detected face, such as a face of user 120. According to some embodiments, optical detection sensor module 122 may be configured to detect a face and then a gaze of user 120 for use in determining whether to activate or deactivate display component 114, pass a security check of device 100, and/or display an alert or message on display component 114. In some embodiments, optical detection sensor module 122 may be configured to detect a gaze based on user 120 gazing at a predetermined object or area, such as processing device 100 or display component 114. Thus, detecting a gaze as described herein may require user 120 to face processing device 100 or display component 114. In some embodiments, user 120 may only be required to have their eyes looking toward processing device 100 or display component 114 to detect a gaze.
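The two-stage face-then-gaze check described above can be approximated with off-the-shelf detectors. The OpenCV Haar-cascade sketch below is only an illustration under that assumption (the disclosure does not name OpenCV or any particular algorithm), and it treats "a frontal face with both eyes visible" as a coarse proxy for a gaze directed toward display component 114.

```python
import cv2

# Stock OpenCV Haar cascades for frontal faces and eyes.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def gaze_detected(frame_bgr):
    """Return True if a frontal face with at least two visible eyes is found.

    A face must be detected first; only then is the eye region examined,
    mirroring the face-then-gaze ordering described above.
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        face_roi = gray[y:y + h, x:x + w]
        eyes = eye_cascade.detectMultiScale(face_roi, scaleFactor=1.1, minNeighbors=5)
        if len(eyes) >= 2:
            return True
    return False

# Example: test a single frame from a front-facing camera, if one is present.
capture = cv2.VideoCapture(0)
ok, frame = capture.read()
if ok:
    print("gaze detected" if gaze_detected(frame) else "no gaze detected")
capture.release()
```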
Processing device 100 may also include a proximity sensor 124. Proximity sensor 124 may be configured to detect when an object comes in proximity with device 100. According to some embodiments, the object may be user 120. Moreover, proximity sensor 124 may be configured to detect when an object comes within about 20-50 cm of device 100. Consistent with some embodiments, proximity sensor 124 may be one or more ultrasonic proximity sensors. Proximity sensor 124 may also include one or more heat sensors, such as an infrared heat sensor that detects the heat produced by an object, such as user 120, when the object is in proximity of the device. Proximity sensor 124 may also include one or more electric field proximity sensors that may detect the presence of a conducting or partially conducting object, such as user 120, as the object enters an electric field created by electrodes on or otherwise associated with the sensor. Proximity sensor 124 may also include an ambient light sensor, such as described above with respect to display component 114, that may detect a presence of an object, such as user 120, by the object causing the ambient light to decrease as a result of occlusion of light by the object. According to some embodiments, proximity sensor 124 may be configured to act as a trigger for activating optical detection sensor module 122 and/or a sensor associated with module 122, for example a camera, and initiating face and gaze detection algorithms. By only activating optical detection sensor module 122 and/or its associated sensor and initiating face and gaze detection algorithms when an object, such as user 120, is in proximity of device 100, the device is able to conserve power when a face is not in proximity of device 100.
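A minimal sketch of the proximity-gated activation described above, assuming hypothetical callables for reading the proximity sensor and waking the optical detection sensor module; the 30 cm threshold is one illustrative value within the roughly 20-50 cm range mentioned.

```python
import time

PROXIMITY_THRESHOLD_CM = 30  # illustrative value within the 20-50 cm range above

def proximity_gate(read_distance_cm, activate_optical_module, poll_interval_s=0.2):
    """Poll a low-power proximity sensor and wake the optical module only
    when an object comes within the threshold distance.

    `read_distance_cm` and `activate_optical_module` are hypothetical
    callables standing in for proximity sensor 124 and optical detection
    sensor module 122; keeping the camera and face/gaze algorithms off
    until this gate fires is what conserves power.
    """
    while True:
        distance = read_distance_cm()
        if distance is not None and distance <= PROXIMITY_THRESHOLD_CM:
            activate_optical_module()
            return
        time.sleep(poll_interval_s)

# Example with simulated readings: the gate fires on the third sample.
samples = iter([120, 80, 25])
proximity_gate(lambda: next(samples), lambda: print("optical module activated"))
```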
Although the components of processing device 100 are shown as being integral with processing device 100, the components are not so limited and may be separate from and external to processing device 100, and coupled to processing device 100 and system bus 104 via a wired or wireless coupling.
Consistent with some embodiments, device 100 may use accelerometer 118, optical detection sensor module 122, and proximity sensor 124, individually or in various combinations, to activate display component 114 when user 120 gazes at display component 114 as detected by optical detection sensor module 122, and keep display component 114 activated as long as user 120 is gazing at display component 114. Further, to conserve power, device 100 may use accelerometer 118, optical detection sensor module 122, and proximity sensor 124, individually or in various combinations, to deactivate display component 114 when user 120 is not gazing at display component 114, and only activate optical detection sensor module 122 and/or its associated sensor and/or gaze detection in response to a triggering event. Consistent with some embodiments, a triggering event may include at least one of a movement of device 100 detected by accelerometer 118 and a proximity of an object, such as user 120, detected by proximity sensor 124.
In some embodiments, a triggering event may be a combination of any of an acceleration detected by accelerometer 118, a proximity of user 120 detected by proximity sensor 124, or an alert generated by processing component 106. For example, processing component 106 may generate an alert causing processing device 100 to vibrate, generate an audible sound, or generate a flashing light, or other indicator that may capture the attention of user 120 to let user 120 know about the alert. User 120 may then pick up device 100 and bring device 100 in close proximity to view the alert. Accelerometer 118 may detect the motion of picking up the device 100, and proximity sensor 124 may detect the proximity of user 120. In some embodiments, the triggering event comprises an alert generated by processing component 106 and a user being within a threshold proximity of the device. The combination of any or all of these actions may be a triggering event that causes processing component 106 to perform a subsequent action.
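The paragraph above allows a triggering event to be any one of these signals or a required combination of them; the sketch below captures both variants, with the field names and policy flag being assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class SensorState:
    """Snapshot of the signals that may contribute to a triggering event."""
    motion_detected: bool    # from accelerometer 118
    user_in_proximity: bool  # from proximity sensor 124
    alert_pending: bool      # generated by processing component 106

def is_triggering_event(state: SensorState,
                        require_alert_with_proximity: bool = False) -> bool:
    """Decide whether a triggering event has occurred.

    In one illustrative policy, any of motion, proximity, or an alert is a
    trigger; in another (require_alert_with_proximity=True), an alert must
    coincide with the user being within threshold proximity, as in the
    embodiment described above.
    """
    if require_alert_with_proximity:
        return state.alert_pending and state.user_in_proximity
    return state.motion_detected or state.user_in_proximity or state.alert_pending

print(is_triggering_event(SensorState(False, True, False)))                 # True
print(is_triggering_event(SensorState(False, False, True),
                          require_alert_with_proximity=True))               # False
```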
After a triggering event has been detected, gaze detection is activated (204). In some embodiments, activating gaze detection comprises activating or initiating a gaze detection module, function, and/or process, for example as executed by optical detection sensor module 122 and/or processing component 106, or a combination thereof. According to some embodiments, activating gaze detection includes activating optical detection sensor module 122 to capture and process, within the module 122 and/or by processing component 106, optical images for detecting a gaze. Moreover, activating gaze detection may include activating face detection and then, if a face is detected, activating gaze detection. In some embodiments, activating gaze detection may include increasing a duty cycle or frame rate of an optical detection sensor associated with module 122 such that optical detection sensor captures more images than before gaze detection was activated. Optical detection sensor module 122, using functionality included within module 122 and/or in combination with processing component 106 and instructions stored in any of memories 108, 110, and 112, may determine if a gaze is detected (206). A gaze may be detected using any of a number of gaze detection algorithms, and may include facial detection performed according to known facial detection algorithms. For example, if a face is detected, optical detection sensor module 122 may determine if the face is gazing in the direction of display component 114 by analyzing information captured by an optical detection sensor to determine a direction of the eyes on the detected face. Optical detection sensor module 122 may also determine if the face is gazing in the direction of display component 114 by analyzing other features of the detected face, such as corners of the eyes or mouth, eyebrows on the detected face, ears on the detected face, or other facial features that may be used to determine a direction of a gaze.
If a gaze is not detected, gaze detection may be deactivated (208) until another triggering event is detected (202). In some embodiments, deactivating gaze detection may include powering down optical detection sensor module 122 and/or an associated sensor to conserve power. In some embodiments, deactivating gaze detection may include lowering a duty cycle or frame rate of the optical detection sensor so that fewer images are captured than when gaze detection is activated. In some embodiments, deactivating gaze detection may include the optical detection sensor continuing to capture images without optical detection sensor module 122 processing the captured images. In such embodiments, the captured images may be stored in a buffer, a memory internal to optical detection sensor module 122, or any of memories 108-112.
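One way to read the duty-cycle and buffering variants above is as a small controller that raises the capture rate while gaze detection is active and either lowers it or buffers unprocessed frames when it is deactivated; the class and its constants below are illustrative assumptions, not an API of the disclosure.

```python
from collections import deque

class OpticalSensorController:
    """Illustrative frame-rate management for activating and deactivating
    gaze detection; the sensor interface is hypothetical."""

    ACTIVE_FPS = 15  # illustrative higher duty cycle while gaze detection runs
    IDLE_FPS = 1     # illustrative reduced duty cycle when deactivated

    def __init__(self, buffer_size=32):
        self.frame_rate = self.IDLE_FPS
        self.processing_enabled = False
        self.frame_buffer = deque(maxlen=buffer_size)

    def activate_gaze_detection(self):
        """Capture more images than before gaze detection was activated."""
        self.frame_rate = self.ACTIVE_FPS
        self.processing_enabled = True

    def deactivate_gaze_detection(self, keep_capturing=False):
        """Stop processing; either drop the duty cycle or keep capturing
        unprocessed frames, per the variants described above."""
        self.processing_enabled = False
        if not keep_capturing:
            self.frame_rate = self.IDLE_FPS

    def on_frame(self, frame):
        if self.processing_enabled:
            return frame                 # hand off to the gaze-detection algorithm
        self.frame_buffer.append(frame)  # store captured-but-unprocessed frames
        return None
```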
If a gaze is detected, display component 114 may be activated and/or content may be displayed on display component 114 for a predetermined amount of time (210), allowing user 120 to view information on display component 114. Activating display component 114 and displaying content may include passing a keylock or security challenge. In some embodiments, activating display component 114 may cause processing component 106 to generate a keylock or security challenge that may be displayed by display component 114. A keylock may be a slider or other motion challenge that requires users to take an action to remove the keylock, such as sliding the slider in an indicated direction. A security challenge may be a required password, personal identification number (PIN), pattern challenge, or biometric challenge presented to a user for security purposes. In some embodiments, at least one of a keylock and a security challenge may be required by an operating system of device 100 after a predetermined time of inactivity or after user 120 manually locks device 100 via a lock switch or command to secure device 100. Activating display component 114 via a detected gaze may bypass the keylock or security challenge. For example, if gaze detection detects a gaze, the detection of the gaze may be sufficient to activate display component 114 and bypass a keylock, but not sufficient to bypass a security challenge. However, in some embodiments, if gaze detection includes facial detection, optical detection sensor module 122 alone, or in combination with processing component 106, may be configured to detect a face and determine if the detected face matches a known face, such as a face of user 120 or other authorized user, that may be stored in an internal memory of optical detection sensor module 122 or any of memories 108-112. If a detected face matches a face of user 120, a security challenge may be bypassed when display component 114 is activated.
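The bypass behavior described above can be summarized as a small decision: a detected gaze is enough to wake the display and clear a keylock, while the security challenge is bypassed only when the detected face matches an enrolled face. The sketch below assumes the face-match result is supplied by the caller (e.g., by a face-recognition comparison against stored faces, whose implementation is not specified here).

```python
def unlock_action(gaze_detected: bool, face_matches_enrolled: bool) -> str:
    """Decide what a detected gaze is allowed to bypass.

    Mirrors the behavior described above: a gaze alone clears the keylock,
    while the security challenge is bypassed only for a recognized face.
    """
    if not gaze_detected:
        return "display stays off"
    if face_matches_enrolled:
        return "activate display, bypass keylock and security challenge"
    return "activate display, bypass keylock, present security challenge"

print(unlock_action(True, False))  # keylock bypassed; PIN/password still required
print(unlock_action(True, True))   # recognized face; challenge bypassed as well
```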
After the predetermined amount of time has passed, optical detection sensor module 122 may again determine if a gaze is detected (206), and display component 114 may remain active and/or display content if a gaze is still detected (210). If a gaze is no longer detected after the predetermined amount of time, the display may deactivate and/or the content may no longer be displayed, and gaze detection by optical detection sensor module 122 may be deactivated (208) until another triggering event is detected (202). In some embodiments, user 120 may be able to deactivate display component 114 and deactivate gaze detection prior to or instead of the predetermined amount of time passing. For example, user 120 may manually turn off display component 114 and/or gaze detection by pressing a button, whether physical or virtual and displayed on display component 114, or changing a setting of device 100. In some embodiments, optical detection sensor module 122 may be further configured to detect a gesture, wherein a detected gesture may instruct display component 114 and/or gaze detection to be deactivated. For example, a swipe over the display component 114 or occluding an optical sensor of the device 100 may be used in some embodiments to dismiss an alert and/or prevent display component 114 from activating. In some embodiments, an ambient light sensor, such as described previously, may be used to deactivate display component 114 and/or gaze detection. For example, an ambient light sensor may detect that there is little or no ambient light, which may be indicative of device 100 being placed in a drawer, a bag, or a pocket. The ambient light sensor may send this information to processing component 106 which may make a determination to deactivate display component 114 and/or gaze detection to conserve power.
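Putting steps 202-210 together, one possible rendering of the overall flow is the polling loop below; the three callables stand in for the triggering-event sensors, the optical detection sensor module, and the display component, and a real device would drive this with interrupts or events rather than sleeps.

```python
import time

def gaze_wake_loop(wait_for_trigger, detect_gaze, set_display,
                   recheck_interval_s=5.0):
    """Illustrative control loop for the flow described above.

    `wait_for_trigger`, `detect_gaze`, and `set_display` are hypothetical
    callables. The display stays active only while a gaze keeps being
    detected at each re-check, and gaze detection is shut down again once
    the gaze is lost, until the next triggering event.
    """
    while True:
        wait_for_trigger()                  # 202: block until a triggering event
        while detect_gaze():                # 204/206: gaze detection is active
            set_display(True)               # 210: display content while gazed at
            time.sleep(recheck_interval_s)  # re-check after the predetermined time
        set_display(False)                  # 208: deactivate display and gaze detection
```

The loop only makes the ordering of the steps explicit; the timeouts, manual overrides, and keylock handling discussed above would layer on top of it.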
In some embodiments, a user 120 may be able to further interact with the content displayed on the activated display component 114 through gaze detection and/or eye tracking implemented by optical detection sensor module 122 alone or in combination with processing component 106. For example, user 120 may be able to move his/her eyes from one side to the other in a “swipe” motion which, when detected by optical detection sensor module 122, may be interpreted as a swipe to content displayed on display component 114, dismissing a displayed alert, moving to a next message, etc. Detecting that user 120 is gazing at displayed content for a predetermined period of time may be interpreted as selecting the content. Detecting that user 120 is blinking may be interpreted as user 120 closing the displayed content. Detecting that user 120 is looking at the content for a predetermined period of time in combination with a detected head nod may be interpreted as a selection of the displayed content such that user 120 is capable of moving the displayed content on a screen of display component 114, such as to move the content into a file folder, delete the content, etc. In some embodiments, detecting a gaze to activate display component 114 and display content on display component 114 may be combined with known gaze and head or facial gesture detection algorithms to translate gestures made by a head or face of user 120 to interact with the displayed content in additional ways, for example after the display component 114 has been activated and content is being displayed.
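As a rough sketch of how such detected eye and head gestures might be translated into actions on the displayed content, the mapping below uses hypothetical event names; the actions mirror the examples given above.

```python
# Hypothetical gesture-event names; the actions mirror the interactions above.
GAZE_GESTURE_ACTIONS = {
    "eye_swipe_left":  "dismiss alert / move to next message",
    "eye_swipe_right": "dismiss alert / move to next message",
    "dwell":           "select the gazed-at content",
    "blink":           "close the displayed content",
    "dwell_plus_nod":  "grab content for moving (e.g., into a folder) or deletion",
}

def handle_gaze_gesture(event: str) -> str:
    """Translate a detected eye/head gesture into a UI action, defaulting
    to no action for gestures the device does not recognize."""
    return GAZE_GESTURE_ACTIONS.get(event, "no action")

print(handle_gaze_gesture("dwell"))  # select the gazed-at content
```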
Consistent with some embodiments, process 200 illustrated in
After proximity detection is initiated, proximity detection sensor 124 may continue to monitor for objects in the proximity of proximity detection sensor 124. When proximity detection sensor 124 detects an object in proximity, optical detection sensor module 122 may be activated (306). In some embodiments, proximity detection sensor 124 may consume less power than the optical detection sensor module 122 and/or an optical sensor associated therewith. Thus, in some embodiments, monitoring for nearby objects using the proximity detection sensor 124 may save power compared to monitoring for nearby objects using the optical detection sensor module 122. Information captured by optical detection sensor module 122 may be analyzed to determine if a face is detected (308). In some embodiments, the information captured by optical detection sensor module 122 may be analyzed by executing a process or function, such as a facial detection algorithm, within optical detection sensor module 122. The information captured by optical detection sensor module 122 may also be sent to processing component 106 for analysis according to a facial detection algorithm stored in any of memories 108-112. Moreover, information captured by optical detection sensor module may be analyzed by a combination of processing component 106 and optical detection sensor module 122 according to a facial detection algorithm. In some embodiments, the facial detection algorithm may be any known facial detection algorithm. If a face is not detected, optical detection sensor module 122 and/or an associated sensor may enter a reduced power mode (310) until another object is detected in proximity of proximity detection sensor 124 (304). In some embodiments, a reduced power mode for optical detection sensor module 122 may include powering down an associated sensor to conserve power. In some embodiments, a reduced power mode may include lowering a duty cycle or frame rate of the associated sensor so that fewer images are captured than when optical detection sensor module 122 is activated. In some embodiments, a reduced power mode may include an optical sensor continuing to capture images without optical detection sensor module 122 processing the captured images. In such embodiments, the captured images may be stored in a buffer, a memory internal to optical detection sensor module 122, or any of memories 108-112.
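The staged pipeline above (proximity monitoring at 304, face detection at 308, gaze detection at 314, reduced power at 310) can be viewed as a small state machine in which each stage runs only the cheapest sensing needed for its decision; the transition function below is an illustrative sketch of that structure, not the disclosed implementation.

```python
from enum import Enum, auto

class Stage(Enum):
    PROXIMITY = auto()      # 304: low-power proximity monitoring
    FACE = auto()           # 308: optical module active, face detection
    GAZE = auto()           # 314: gaze detection on the detected face
    REDUCED_POWER = auto()  # 310: optical module powered down / low duty cycle

def next_stage(stage, object_near=False, face_found=False, gaze_found=False):
    """One illustrative transition function for the staged pipeline above.

    The camera and gaze processing stay off until the earlier, lower-power
    checks succeed; losing the face or the gaze falls back to cheaper stages.
    """
    if stage is Stage.PROXIMITY:
        return Stage.FACE if object_near else Stage.PROXIMITY
    if stage is Stage.FACE:
        return Stage.GAZE if face_found else Stage.REDUCED_POWER
    if stage is Stage.GAZE:
        # Gaze present: display activated (316) while the gaze persists;
        # gaze lost: keep looking as long as a face is in view (back to 308).
        return Stage.GAZE if gaze_found else Stage.FACE
    # REDUCED_POWER: wait for another object to come into proximity (304).
    return Stage.FACE if object_near else Stage.REDUCED_POWER

print(next_stage(Stage.PROXIMITY, object_near=True))  # Stage.FACE
```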
If a face is detected at 308, gaze detection may be initiated (312). In some embodiments, initiating gaze detection comprises activating or initiating a gaze detection module, function, and/or process, for example as executed by optical detection sensor module 122 and/or processing component 106, or a combination thereof. According to some embodiments, initiating gaze detection includes initiating optical detection sensor module 122 to capture and process, within module 122 and/or by processing component 106, optical images for detecting a gaze. In some embodiments, initiating gaze detection may include increasing a duty cycle or frame rate of an optical sensor associated with optical detection sensor module 122 such that optical detection sensor module 122 captures more images than before gaze detection was activated. Optical detection sensor module 122, using functionality included within module 122 and/or in combination with processing component 106 and instructions stored in any of memories 108, 110, and 112, may determine if a gaze is detected (314). A gaze may be detected using any number of gaze detection algorithms, for example as described above, which may include analyzing information captured by optical detection sensor module 122 to determine a direction of the eyes on the detected face. A gaze may also be detected by analyzing other features of a detected face, such as corners of the eyes or mouth, eyebrows on the detected face, ears on the detected face, or other facial features that may be used to determine a direction of a gaze and to determine if a detected face is gazing in a direction of display component 114. If a gaze is not detected, optical detection sensor module 122 continues to look for a gaze (308, 312, 314) as long as a face is still in the field of view of optical detection sensor module 122. In some embodiments, however, if a gaze is not detected after a predetermined period of time, optical detection sensor module 122 may enter a reduced power mode (310) and gaze detection may be deactivated. However, if a gaze is detected, display component 114 may be activated (316), allowing user 120 to view content displayed on display component 114. Activating display component 114 and displaying content may include passing a keylock or security challenge, such as described above with respect to process 200 illustrated in
Consistent with some embodiments, process 300 illustrated in
In some embodiments, a triggering event may comprise a combination of any of an acceleration detected by accelerometer 118, a proximity of user 120 detected by proximity sensor 124, a touch, or an alert generated by processing component 106. For example, processing component 106 may generate an alert causing processing device 100 to vibrate, generate an audible sound, or generate a flashing light, or other indicator that may capture the attention of user 120 to let user 120 know about the alert. User 120 may then pick up device 100 and bring device 100 in close proximity to view the alert. Accelerometer 118 may detect the motion of picking up the device 100, and proximity sensor 124 may detect the proximity of user 120. The combination of any or all of these actions may be a triggering event that causes processing component 106 to perform a subsequent action.
In response to detecting a triggering event, display component 114 may be activated (404). In some embodiments, an operating system running on device 100 may automatically activate display component 114 when a triggering event is detected, for example, to provide an indication of the triggering event and/or provide user 120 with a preview of the triggering event, such as an alert. If no action is taken after a predetermined time, the operating system may deactivate display component 114 to conserve power. Proximity sensor 124 may also be activated in response to detecting a triggering event (406). In some embodiments, proximity sensor 124 may be configured to be active when display component 114 is active and inactive, or in a reduced power mode, when display component 114 is not active. The configuration of proximity sensor 124 may be an operating system configuration.
In response to detecting a triggering event, optical detection sensor module 122 may also be activated (408). Optical detection sensor module 122, using functionality within module 122 and/or in combination with processing component 106 and instructions stored in any of memories 108, 110, and 112, may determine if a gaze is detected (410). A gaze may be detected using any of a number of gaze detection algorithms, and may include facial detection performed according to known facial detection algorithms. For example, if a face is detected, optical detection sensor module 122 may determine if the face is gazing in the direction of display component 114 by analyzing information captured by optical detection sensor module 122 to determine a direction of the eyes on the detected face. Optical detection sensor module 122 may also determine if the face is gazing in the direction of display component 114 by analyzing other features of the detected face, such as corners of the eyes or mouth, eyebrows on the detected face, ears on the detected face, or other facial features that may be used to determine a direction of a gaze.
If a gaze is not detected, optical detection sensor module 122 may enter a reduced power mode (412). In some embodiments, a reduced power mode for optical detection sensor module 122 may include powering down an optical sensor to conserve power. In some embodiments, a reduced power mode may include lowering a duty cycle or frame rate of an optical detection sensor so that fewer images are captured than when optical detection sensor module 122 is activated. In some embodiments, a reduced power mode may include an optical sensor continuing to capture images without optical detection sensor module 122 processing the captured images. In such embodiments, the captured images may be stored in a buffer, a memory internal to optical detection sensor module 122, or any of memories 108-112.
If a gaze is not detected, proximity sensor 124 may also be deactivated (414). Deactivation of proximity sensor 124 may include entering a low power mode that may be similar to a low power mode of optical detection sensor module 122. In some embodiments, deactivation of proximity sensor 124 may include powering down proximity sensor 124 so that it is not consuming power. If a gaze is not detected, display component 114 may also be deactivated (416) until another triggering event is detected (402). Deactivating display component 114 may include deactivating a backlight of display component 114 so that content rendered for display by display component 114 is not visible to user 120. Deactivating display component 114 may also include deactivating rendering and processing capabilities associated with display component 114 such that content is not rendered for display by display component 114 while deactivated. Although display component 114 may be automatically deactivated in 416, in some embodiments, user 120 may be able to deactivate display component 114 by, for example, pressing a button, whether physical or virtual and displayed on display component 114, or changing a setting of device 100. In some embodiments, optical detection sensor module 122 may be further configured to detect a gesture, wherein a detected gesture may deactivate display component 114. Moreover, display component 114 may be deactivated in response to a detection of no or a very small amount of ambient light by an ambient light sensor, such as described above.
If a gaze is detected by optical detection sensor module 122, a keylock or security challenge may be passed (418). In some embodiments, a detected gaze may be sufficient to bypass a keylock, but not sufficient to bypass a security challenge such as a password or PIN. According to some embodiments, optical detection sensor module 122 alone, or in combination with processing component 106, may be configured to detect a face and determine if the detected face matches a known face, such as a face of user 120, that may be stored in an internal memory of optical detection sensor module 122 or any of memories 108-112. If a detected face matches a face of user 120, a security challenge may be bypassed when display component 114 is activated. In some embodiments, a type of trigger may determine whether the security challenge is bypassed. For example, an alert of a newsfeed being updated may cause the security challenge to be bypassed, but reception of a personal email may still cause a security challenge to be presented. In some embodiments, authorized users are authorized for all content on the device. In other embodiments, each authorized user may have a set of permissions that determine if a security challenge will be presented upon recognition of the authorized user and/or if the user will be permitted access when a given type of triggering event occurs. In some embodiments, authenticating a user may comprise matching a color of the user's eyes to known eye colors and/or validating biometric data of the user, for example a shake of the user's eyes. Display component 114 may display content (420) after a user has passed security. The displayed content may be the triggering alert described above, such as a received e-mail, calendar alert, message, and the like. The content may be displayed on display component 114 for a predetermined time, after which optical detection sensor module 122 may determine if user 120 is still gazing at optical detection sensor module 122 (422). If a gaze is not detected, optical detection sensor module 122 may enter a reduced power mode (412), proximity sensor 124 may be deactivated (414), and display component 114 may be deactivated (416), such as described above, until another triggering event is detected (402) or user 120 manually activates any of these components. If a gaze is detected, display component 114 may remain activated (424) as long as a gaze is detected or until user 120 manually deactivates display component 114, and proximity sensor 124 may be deactivated to save power (426). Moreover, a user 120 may be able to further interact with the content displayed on the activated display component 114 through gaze detection and/or eye tracking implemented by optical detection sensor module 122 alone or in combination with processing component 106, such as described above with respect to process 200 in
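The trigger-type and per-user permission checks described above amount to a small policy lookup; the tables, trigger names, and user names below are assumptions for illustration only.

```python
# Illustrative policy tables; the trigger types and permission sets are assumptions.
SENSITIVE_TRIGGERS = {"personal_email"}  # e.g., still present a challenge for these
USER_PERMISSIONS = {
    "owner": {"newsfeed_update", "calendar_alert", "personal_email"},
    "guest": {"newsfeed_update"},
}

def security_challenge_required(recognized_user, trigger_type):
    """Return True if a security challenge should still be presented.

    Mirrors the policy described above: an unrecognized face is always
    challenged, a recognized user may still be challenged for sensitive
    trigger types, and each authorized user's permissions limit which
    trigger types may be viewed without a challenge.
    """
    if recognized_user is None:
        return True                         # unrecognized face: always challenge
    if trigger_type in SENSITIVE_TRIGGERS:
        return True                         # e.g., a personal email still prompts
    allowed = USER_PERMISSIONS.get(recognized_user, set())
    return trigger_type not in allowed

print(security_challenge_required("guest", "calendar_alert"))   # True: not permitted
print(security_challenge_required("owner", "newsfeed_update"))  # False: bypassed
```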
Consistent with some embodiments, process 400 may provide power savings by activating display component 114 for displaying content when user 120 is determined to be actively gazing at device 100. Moreover, process 400 may provide further power savings by activating proximity sensor 124 and optical detection sensor module 122 in response to a triggering event. Furthermore, process 400 may also provide for enhanced interaction with device 100 by user 120, allowing user 120 to interact with device 100 to pass a keylock or security challenge and view content displayed on display component 114 by gazing at optical detection sensor module 122 of device 100.
Software, in accordance with the present disclosure, such as program code and/or data, may be stored on one or more machine readable mediums, including non-transitory machine readable media, such as any of memories 108, 110, and 112 in device 100. It is also contemplated that software identified herein may be implemented using one or more general purpose or specific purpose computers or application specific integrated circuits (ASICs) and/or computer systems, networked and/or otherwise. Where applicable, the ordering of various steps described herein may be changed, combined into composite steps, and/or separated into sub-steps to provide features described herein.
Consequently, embodiments as described herein may provide user convenience, by allowing a user to activate and unlock a device by looking at an optical detection sensor integrated into or coupled to the device. Moreover, embodiments as described herein may also provide power savings, by activating gaze detection only in response to one or more triggering events and then only activating the display of the device when a gaze is detected. The examples provided above are exemplary only and are not intended to be limiting. One skilled in the art may readily devise other systems consistent with the disclosed embodiments which are intended to be within the scope of this disclosure. As such, the application is limited only by the following claims.