An operator of an electronic system may engage with a visual presentation system to interact with the electronic system. The visual presentation system may include various cues, such as graphical user interface (GUI) elements alerting the operator to a status of the electronic system. The GUI elements may be text or any sort of indication of information. The operator may interact with the visual presentation system in situations where a touch-capable device is provided.
In certain cases, an electronic system may be equipped with multiple visual presentation systems. Accordingly, the multiple visual presentation systems may be associated with different locations or displays capable of presenting information.
For example, if the operator is situated in a vehicle (i.e., a driver or passenger of the vehicle), the operator may have multiple visual presentation systems to engage with. For example, the vehicle may have a visual presentation system embedded in the cockpit of the dashboard, embedded in a heads-up display (HUD), or may provide indicia via mirrors or other translucent surfaces. Thus, the visual presentation system may indicate information in various locations.
Recently, human interface techniques known as gaze tracking or head tracking have been implemented. The gaze and head tracking allow an electronic system to detect where the operator's attention is directed. The gaze and head tracking monitor the operator's head or eyes via an image or video capturing device and, accordingly, translate the movement and location into commands to control the electronic system. Essentially, the head and the eyes become pointing devices employed to operate various controls and commands. As interfaces become more sophisticated, this allows an operator to engage an electronic system or visual presentation system independently of the operator's hands. In certain situations, for example driving a vehicle, because the operator's hands stay on a steering wheel, the operator may experience a safer and more convenient driving environment.
A system and method for adjusting a display based on a detected orientation is disclosed herein. The system includes an orientation detector to detect an orientation of a viewer associated with the display, the display including at least a first display and a second display; an information input module to receive information to output on either the first display or the second display; and a display selector to select either the first display or the second display to output the information based on the detected orientation.
The detailed description refers to the following drawings, in which like numerals refer to like items, and in which:
The invention is described more fully hereinafter with reference to the accompanying drawings, in which exemplary embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these exemplary embodiments are provided so that this disclosure is thorough, and will fully convey the scope of the invention to those skilled in the art. It will be understood that for the purposes of this disclosure, “at least one of each” will be interpreted to mean any combination of the enumerated elements following the respective language, including combinations of multiples of the enumerated elements. For example, “at least one of X, Y, and Z” will be construed to mean X only, Y only, Z only, or any combination of two or more of the items X, Y, and Z (e.g., XYZ, XZ, YZ, X). Throughout the drawings and the detailed description, unless otherwise described, the same drawing reference numerals are understood to refer to the same elements, features, and structures. The relative size and depiction of these elements may be exaggerated for clarity, illustration, and convenience.
Providing information to an operator of an electronic system allows the operator to engage in the electronic system in a more robust and dynamic way. Based on the information presented, for example text or graphical, the operator may make guided decisions on how to engage with the electronic system, or the environment in general.
For example, if the electronic system is embedded or incorporated with a vehicle, visual information may convey information associated with the electronic system to a driver or passenger. Accordingly, the driver or passenger may modify the operation of the vehicle based on the indication provided by the display.
The vehicle display may provide safety information or guidance information. Thus, the vehicle display may alert the vehicle's operator of a hazardous road condition, an instruction to proceed, or certain other information associated with the vehicular operation.
In certain cases, there may be multiple displays installed in a location. For example, relying on the vehicular context, the following locations may be implemented for a display: a heads-up display (HUD), a cockpit display, or displays located on or integrated with various mirrors and electronics associated with the vehicle.
In these situations, a vehicle operator's head and/or eye gaze direction may be oriented in a first direction at a first display, and information may be displayed on a second display, in a direction in which the vehicle's operator is not oriented. Accordingly, the vehicle's operator may miss the information associated with the second display due to gazing in a different direction.
Disclosed herein are systems and methods for adjusting a display based on a detected orientation. According to the aspects disclosed herein, in situations where multiple displays are implemented along with an electronic system, information may be adjusted accordingly. Thus, the operator associated with the electronic system may be alerted to critical or important information. Even in situations where the information is not critical, an implementer of the systems disclosed herein may assign a priority associated with information, and accordingly, the information with the highest priority may be displayed to a vehicle's operator.
The aspects disclosed herein employ either gaze tracking or head tracking to determine an orientation of an operator's attention. Accordingly, the gaze tracking and head tracking determine which direction the operator's attention is directed to, and adjust the displays so that the information with a higher priority is directed towards the display being gazed at.
The storage device 108 is a non-transitory computer-readable storage medium such as a hard drive, compact disk read-only memory (CD-ROM), DVD, or a solid-state memory device. The memory 106 holds instructions and data used by the processor 102. The pointing device 114 is a mouse, track ball, or other type of pointing device, and is used in combination with the keyboard 110 to input data into the computer 100. The pointing device 114 may also be a gaming system controller, or any type of device used to control the gaming system. For example, the pointing device 114 may be connected to a video or image capturing device that employs biometric scanning to detect a specific user. The specific user may employ motion or gestures to command the pointing device 114 to control various aspects of the computer 100.
The graphics adapter 112 displays images and other information on the display 118. The network adapter 116 couples the computer system 100 to one or more computer networks.
The computer 100 is adapted to execute computer program modules for providing functionality described herein. As used herein, the term “module” refers to computer program logic used to provide the specified functionality. Thus, a module can be implemented in hardware, firmware, and/or software. In one embodiment, program modules are stored on the storage device 108, loaded into the memory 106, and executed by the processor 102.
The types of computers used by the entities and processes disclosed herein can vary depending upon the embodiment and the processing power required by the entity. The computer 100 may be a mobile device, tablet, smartphone or any sort of computing element with the above-listed elements. For example, a data storage device, such as a hard disk, solid state memory or storage device, might be stored in a distributed database system comprising multiple blade servers working together to provide the functionality described herein. The computers can lack some of the components described above, such as keyboards 110, graphics adapters 112, and displays 118.
The computer 100 may act as a server (not shown) for the content sharing service disclosed herein. The computer 100 may be clustered with other computers 100 to create the server.
Referring to
Also shown in
As shown in
The gaze tracking device 290 captures an image/video associated with the electronic system 260's operator, and processes the image/video to locate the operator's eyes. Based on the image/video of the eyes, the gaze tracking device 290 may ascertain a direction associated with the eyes' attention.
The head tracking device 295 works similarly to the gaze tracking device 290, but employs an image/video of the operator's head. Based on the angle of the head detected, a direction of attention of the operator may be obtained.
The orientation detector 210 receives an indication from either the gaze tracking device 290 or the head tracking device 295 on the direction or orientation of the electronic system 260's operator. The orientation detector 210 may be configured to receive information associated with the electronic system 260's operator at a predefined interval. Accordingly, when the electronic system 260's operator moves their head from side to side or to various locations, the orientation detector 210 may ascertain which direction the operator is oriented towards.
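The polling behavior of the orientation detector 210 may be sketched conceptually as follows. This is an illustrative sketch only: the `Orientation` fields and the `source` callable (standing in for the gaze tracking device 290 or head tracking device 295) are assumed interfaces, not part of the disclosure.

```python
from dataclasses import dataclass


@dataclass
class Orientation:
    yaw_deg: float    # side-to-side angle of the head/gaze
    pitch_deg: float  # up-down angle of the head/gaze


class OrientationDetector:
    """Keeps the most recent operator orientation, refreshed at a
    predefined interval from a tracking source (an assumed callable
    standing in for the gaze or head tracking device)."""

    def __init__(self, source, interval_s=0.1):
        self.source = source        # returns the latest Orientation
        self.interval_s = interval_s
        self.current = None

    def poll(self):
        # In a real system this would be driven by a timer firing
        # every `interval_s` seconds; here it is called directly.
        self.current = self.source()
        return self.current
```

A caller would invoke `poll()` repeatedly and read `current` to learn which direction the operator is oriented towards at any moment.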
In another example, the orientation detector 210 may determine the viewer's distance from the display being gazed at. For example, in real time or at predetermined intervals, the orientation detector 210 may track the physical distance between the viewer and the display.
The information input module 220 obtains information from the electronic system 260 to display on one of the displays, such as display 270 or display 280. The information input module 220 may cross-reference a persistent store, and employ a lookup table to ascertain whether the information to be displayed is of a priority high enough to display according to the aspects disclosed herein. The lookup table 206 may record whether certain information is to be displayed at a higher priority than other information. The priority associated with each information type may be predefined by an implementer of system 200.
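The cross-referencing of a priority lookup table described above may be sketched as follows. The information categories, priority values, and threshold are hypothetical placeholders chosen for illustration; an implementer would predefine their own, as the description notes.

```python
# Illustrative priority lookup table, standing in for the persistent
# store consulted by the information input module 220. All categories
# and values below are assumptions, not part of the disclosure.
PRIORITY_TABLE = {
    "collision_warning": 10,
    "navigation_turn": 7,
    "fuel_low": 5,
    "radio_station": 1,
}

# Only information at or above this priority is routed to the
# display being gazed at (threshold value is illustrative).
DISPLAY_THRESHOLD = 5


def should_redirect(info_type):
    """Look up the information type and decide whether it is of a
    priority high enough to present on the gazed-at display.
    Unknown types default to the lowest priority."""
    return PRIORITY_TABLE.get(info_type, 0) >= DISPLAY_THRESHOLD
```

Under this sketch, a collision warning would be redirected while the current radio station would not, matching the vehicular example given later in the description.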
The information may also be augmented with information associated with modifications based on distance. Accordingly, different renderings or amount of information may be presented to a viewer based on the distance from the display. An example of an implementation of system 200 with regards to this example is shown below in
The display selector 230 correlates the nearest available display to the operator's attention (based on the orientation detector 210), and records the display associated with the operator's attention. Accordingly, if the operator is oriented at or near a certain display, the display selector 230 may record that display as the selected display. As the orientation detector 210 is updated at predetermined intervals, the nearest display in which an operator's attention is directed at may be updated accordingly. Referring to
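The nearest-display correlation performed by the display selector 230 may be sketched as a simple angular comparison. The display names and their angular positions relative to the operator's straight-ahead line of sight are assumptions for illustration.

```python
# Hypothetical display layout: each display's yaw angle (degrees)
# relative to the operator's straight-ahead direction. Names and
# angles are illustrative assumptions.
DISPLAY_ANGLES = {
    "hud": 0.0,
    "cockpit": -30.0,
    "mirror_display": 45.0,
}


def select_display(gaze_yaw_deg):
    """Select the display nearest the detected orientation by
    minimizing the angular difference between the gaze direction
    and each display's position."""
    return min(DISPLAY_ANGLES,
               key=lambda name: abs(DISPLAY_ANGLES[name] - gaze_yaw_deg))
```

Re-running this selection whenever the orientation detector updates would keep the recorded "selected display" current as the operator's attention moves.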
Additionally or alternatively, the display selector 230 may operate with a specific portion of a singular display (such as a top portion or a bottom portion of display 270, for example). Accordingly, the display selector 230 may select a portion of a single display, instead of one of multiple displays, based on the detected orientation.
The display driver 240 determines whether the information being rendered is to be displayed via the selected display (for example, display 270 or display 280). The display driver 240 may select all, some, or none of the information on the selected display.
In another implementation of system 200, the display driver 240 may render a different amount of information based on the detected distance from the display being oriented at. For example, if the viewer of the display 270 moves closer or farther away, an image may be rendered according to the change in distance. In this implementation of system 200, a singular display may be implemented, and the display selector 230 may be omitted. In another example, this implementation may be combined with the example described above.
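The distance-dependent rendering described in this implementation may be sketched as follows. The distance bands, detail levels, and scaling factor are illustrative assumptions; a real system would tune them for the particular display.

```python
def detail_level(distance_m):
    """Map the viewer's detected distance to an amount of
    information to render: closer viewers see more detail.
    The band boundaries below are illustrative."""
    if distance_m < 0.5:
        return "full"     # close: full text and detail
    if distance_m < 1.5:
        return "summary"  # mid-range: abbreviated text
    return "icon"         # far: a single large indicator


def font_scale(distance_m, base_pt=12.0):
    """Grow text size with distance so the rendered information
    stays legible as the viewer moves away."""
    return base_pt * max(1.0, distance_m)
```

As the orientation detector reports a change in distance, the display driver would re-render using the new detail level and text size.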
The information to be displayed according to the aspects disclosed herein may be predefined with a priority. Accordingly, information over a predetermined threshold may be communicated to a selected display accordingly.
For example, according to the aspects disclosed herein, if the system 200 is implemented in a vehicle, certain information may be deemed important enough to be transmitted to a display at which the driver is gazing or oriented. A safety information item, such as a detected foreign object near the vehicle, may be deemed important, and thus, transmitted to be displayed on a selected display. Conversely, information not deemed important enough (for example, the current radio station) may not be transmitted to the selected display.
In operation 310, a detected orientation of an operator associated with an implementation of method 300 is made. As explained above, the detected orientation may be accomplished via numerous techniques, such as through gaze tracking or head tracking. Further, the detected orientation may determine how far the viewer of the display is from a viewing surface.
In operation 315, if no change in detected orientation is made, a predetermined time interval may be set so as to iteratively perform operation 310. Operation 315 is optionally added to operation 310, and may occur in parallel with the operations disclosed herein.
In operation 320, information to be transmitted onto one of the displays associated with method 300 is received. The information may include a priority or other augmented information to ascertain the information's criticality or priority of display. For example, if method 300 is implemented in a vehicle, information pertaining to safety and guidance may be set at a higher priority than information pertaining to an entertainment system.
In operation 330, a display is selected at which an operator associated with method 300 is directing attention towards. This selection may be performed with the information ascertained in operation 310.
In operation 340, the information received in operation 320 is analyzed to determine if the priority is above a predetermined threshold, and thus, displayed via the selected display in operation 330.
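Operations 310 through 340 can be sketched end to end as follows. The display layout, priority threshold, and the shape of the `info` record are all assumed interfaces for illustration, not part of the disclosure.

```python
# Hypothetical display positions (yaw degrees relative to the
# operator's straight-ahead direction) and priority threshold.
DISPLAYS = {"hud": 0.0, "cockpit": -30.0}
THRESHOLD = 5


def route(info, gaze_yaw_deg):
    """Return the display the information should appear on, or
    None if its priority falls below the threshold.

    - gaze_yaw_deg stands in for the orientation detected in
      operation 310.
    - info is a dict with an integer 'priority', as received in
      operation 320.
    """
    # Operation 330: select the display nearest the detected orientation.
    selected = min(DISPLAYS,
                   key=lambda name: abs(DISPLAYS[name] - gaze_yaw_deg))
    # Operation 340: only information above the threshold is shown
    # on the selected display.
    return selected if info.get("priority", 0) >= THRESHOLD else None
```

This sketch routes a high-priority item to whichever display the operator is oriented towards, and suppresses low-priority items entirely.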
In another implementation, the information may be rendered differently based on the detected distance from a viewing surface. For example, if the viewer is closer to the viewing surface, a larger range of information may be displayed.
(a)-(c) illustrate an example implementation of system 200 and method 300. In
The various displays in
As shown in
As shown in
In
(a) and (b) illustrate an example implementation of system 200 and method 300. In
In
In
Thus, based on the aspects disclosed herein, employing an orientation detection technique, operators of multiple-display systems are provided a robust technique to interact with a system. Accordingly, a safer and more efficient way of engaging with a system may be realized.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention cover the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.