The present invention relates to mobile devices. In particular, the present invention relates to orienting the display of mobile devices.
Mobile devices can be held in a variety of ways and oriented in multiple directions. The display of a mobile device is oriented based on the orientation of the mobile device. Currently, automatic display orientation is based on an accelerometer reading and on the assumption that the user is looking at the display in the natural orientation of a body subject to gravity.
Certain exemplary embodiments are described in the following detailed description and in reference to the drawings, in which:
Embodiments disclosed herein provide techniques for automatically orienting a device display. Current techniques for automatically orienting a device display rely on an accelerometer reading. However, this technique is only effective when the device is in a vertical position. When the device is in a horizontal position, gravity does not produce a change in the accelerometer reading that can trigger an appropriate change in device display orientation. However, by employing detection devices such as a compass and a gyrometer in addition to the accelerometer, changes in device orientation can be detected, regardless of whether the device is in a vertical position or a horizontal position. A camera can be used to detect user face contour and/or eye position in response to a detection of a change in device orientation. By employing a camera in addition to the detection devices, changes in device orientation, or a lack of changes in device orientation, can be detected, even if the user is in a horizontal position, or changes position from vertical to horizontal.
The computing device 100 can also include a graphics processing unit (GPU) 108. As shown, the CPU 102 can be coupled through the bus 106 to the GPU 108. The GPU 108 can be configured to perform any number of graphics operations within the computing device 100. For example, the GPU 108 may be configured to render or manipulate graphics images, graphics frames, videos, or the like, to be displayed to a user of the computing device 100. In some embodiments, the GPU 108 includes a number of graphics engines, wherein each graphics engine is configured to perform specific graphics tasks, or to execute specific types of workloads.
The memory device 104 can include random access memory (RAM), read only memory (ROM), flash memory, or any other suitable memory systems. For example, the memory device 104 may include dynamic random access memory (DRAM). The CPU 102 can be linked through the bus 106 to a display interface 110 configured to connect the computing device 100 to a display device 112. The display device 112 can include a display screen that is a built-in component of the computing device 100. The display device 112 can also include a computer monitor, television, or projector, among others, that is externally connected to the computing device 100.
The CPU 102 can also be connected through the bus 106 to an input/output (I/O) device interface 114 configured to connect the computing device 100 to one or more I/O devices 116. The I/O devices 116 can include, for example, a keyboard and a pointing device, wherein the pointing device may include a touchpad or a touchscreen, among others. The I/O devices 116 can be built-in components of the computing device 100, or can be devices that are externally connected to the computing device 100.
A network interface card (NIC) 118 can be adapted to connect the computing device 100 through the system bus 106 to a network (not depicted). The network (not depicted) may be a wide area network (WAN), local area network (LAN), or the Internet, among others. In an example, the computing device 100 can connect to a network via a wired connection or a wireless connection.
The computing device 100 also includes a storage device 120. The storage device 120 is a physical memory such as a hard drive, an optical drive, a thumbdrive, a secure digital (SD) card, a microSD card, an array of drives, or any combinations thereof, among others. The storage device 120 can also include remote storage drives. The storage device 120 includes any number of applications 122 that are configured to run on the computing device 100.
The computing device 100 includes an inertial measurement unit (IMU) 124. The IMU 124 measures the movement of the computing device 100. In particular, the IMU 124 measures the pitch, roll, and yaw of the computing device 100. Pitch, roll, and yaw are measured in reference to the typical vertical position as the starting position. The typical vertical position can refer to the position at which the device has a pitch, yaw, and roll of zero. In embodiments, the typical vertical position of a cell phone can refer to the position at which the cell phone has an ear speaker at the topmost portion of the device and a microphone at the bottommost portion of the device. Furthermore, in embodiments, the typical vertical position of a tablet device can refer to the position at which the tablet device has a camera lens at the topmost portion of the device, and user controls at the bottommost portion of the device. When the data collected by the IMU 124 indicates a change in computing device orientation, such as a change exceeding a predefined threshold, a corresponding change in display orientation is triggered, or the camera is activated to verify the actual orientation of the device relative to the user.
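For purposes of illustration only, the following Python sketch shows how pitch, roll, and yaw readings could be compared against the typical vertical starting position and checked against a predefined threshold before a display rotation or camera check is triggered. The variable names, the 45-degree threshold, and the dictionary-based readings are assumptions for this sketch; the IMU 124 is not limited to any particular interface.

    # Illustrative sketch only; names and the 45-degree threshold are assumptions,
    # not an actual IMU 124 interface.
    REFERENCE = {"pitch": 0.0, "roll": 0.0, "yaw": 0.0}  # typical vertical position
    THRESHOLD_DEG = 45.0                                 # predefined threshold

    def wrapped_delta(a, b):
        """Smallest signed angular difference between two angles, in degrees."""
        return (a - b + 180.0) % 360.0 - 180.0

    def orientation_changed(current, reference=REFERENCE, threshold=THRESHOLD_DEG):
        """Return True when pitch, roll, or yaw has moved past the threshold."""
        return any(abs(wrapped_delta(current[axis], reference[axis])) > threshold
                   for axis in ("pitch", "roll", "yaw"))

    # Example: the device has been laid flat (pitch near 90 degrees).
    sample = {"pitch": 88.0, "roll": 1.5, "yaw": 10.0}
    if orientation_changed(sample):
        # Either rotate the display directly or wake the camera to verify the
        # orientation of the device relative to the user, as described above.
        print("orientation change detected")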
The block diagram of
With a 6DOF algorithm, the pitch, roll, and relative yaw of the device are measured and used to compute a change in orientation of the device. In other words, translation along three perpendicular axes combined with rotation about those three axes is measured and used to compute a change in orientation of the device. Specifically, pitch refers to the forward and backward inclination of the device. Yaw refers to the lateral edges of the device rotating to the left and right. Roll refers to the top and bottom edges of the device rotating to the left and the right. With a 9DOF algorithm, the measurements from a three degree of freedom (3DOF) compass, a 3DOF accelerometer, and a 3DOF gyroscope are used to calculate the change in orientation of the device with respect to the magnetic north of the Earth.
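As a non-limiting sketch of the 9DOF case, pitch and roll can be estimated from the 3DOF accelerometer reading and yaw (heading) from the tilt-compensated 3DOF compass reading. The exact sign conventions depend on how the sensor axes are mounted, so the formulas below assume one common convention, and the variable names are illustrative only.

    import math

    def pitch_roll_from_accel(ax, ay, az):
        # Pitch and roll (radians) from gravity as sensed by the accelerometer,
        # assuming an x-forward, y-right, z-down axis convention.
        roll = math.atan2(ay, az)
        pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
        return pitch, roll

    def heading_from_compass(mx, my, mz, pitch, roll):
        # Tilt-compensated heading (yaw, radians) relative to magnetic north.
        xh = (mx * math.cos(pitch)
              + my * math.sin(roll) * math.sin(pitch)
              + mz * math.cos(roll) * math.sin(pitch))
        yh = my * math.cos(roll) - mz * math.sin(roll)
        return math.atan2(yh, xh)

    # Example: device lying flat (gravity along +z), compass field along +y.
    pitch, roll = pitch_roll_from_accel(0.0, 0.0, 9.81)
    yaw = heading_from_compass(0.0, 30.0, 0.0, pitch, roll)
    print(math.degrees(yaw))  # roughly 90 degrees in this convention

A 3DOF gyroscope reading can additionally be integrated over time to track yaw between compass samples; that step is omitted here for brevity.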
When the IMU 202 detects a rotation of a computing device about an axis greater than a set amount, a camera 210 is triggered. In an example, the set amount can be 20 degrees, 45 degrees, 90 degrees, or any other determined angle. In an example, the set amount can be determined and programmed by a manufacturer, a user, or others. The camera can be turned on at startup when the device is in the startup position, i.e., the typical vertical position. While the display is on, the position of the user's eyes and face contour is captured and the display is oriented based on that position. The camera can then be turned off until a movement of the device is detected. In this manner, the device power is conserved. Although the IMU 124 is described as including a compass, an accelerometer, and a gyrometer, any device that captures direction, rotation, and acceleration can be used to detect the rotation and movement of the computing device.
The camera 210 is positioned in the housing of the computing device on the face of the computing device such that the camera 210 faces the user during use of the computing device. In an example, the camera is a dedicated camera. In another example, the camera 210 is present on the computing device in addition to at least one other camera present on the computing device. In a further example, the camera is a functioning camera used to take pictures as well as to detect a user's face and/or eye position. The camera can be a still-shot camera, a video camera, a combination still-shot and video camera, an infrared camera, a three-dimensional camera, or any other type of camera. In embodiments, the camera can be duty cycled such that the camera is powered off until the device detects movement beyond a determined threshold. For example, the camera can be turned on at device startup when the device is at the starting position, i.e., the typical vertical position, and capture the user face and/or eye position. The camera can then be turned off until a change in device orientation is detected by the IMU. By duty cycling the camera, power is saved because the camera is not supplied with constant power and is powered off when not in use.
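A minimal duty-cycling sketch is shown below. The camera and IMU objects and their method names are purely illustrative stand-ins chosen for this sketch, not a real camera driver API.

    class CameraDutyCycler:
        """Illustrative only: power the camera on at startup to capture the
        user's face/eye position, then keep it off until the IMU reports a
        rotation past the set threshold."""

        def __init__(self, camera, imu, threshold_deg=45.0):
            self.camera = camera          # hypothetical camera object
            self.imu = imu                # hypothetical IMU object
            self.threshold_deg = threshold_deg

        def capture_once(self):
            self.camera.power_on()
            position = self.camera.capture_face_and_eyes()
            self.camera.power_off()       # conserve power while the device is idle
            return position

        def on_startup(self):
            # Device assumed to be in the typical vertical starting position.
            return self.capture_once()

        def on_imu_rotation(self, rotation_deg):
            # Wake the camera only when rotation about some axis exceeds the threshold.
            if max(abs(r) for r in rotation_deg) > self.threshold_deg:
                return self.capture_once()
            return None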
The data collected by the camera 210 is analyzed at face/eyes detection 212. The face/eyes detection 212 analyzes the camera data to determine the position of the user's face and/or eyes in relation to the computing device. The information collected by the IMU 202 and the position of the user's face and/or eyes are transferred to a display rotation decision unit 214. The display rotation decision unit 214 uses the information to determine if the device display should be rotated in accordance with the device orientation to maintain the alignment of the display orientation and the user's eyes. If the display rotation decision unit 214 determines the device display is to be reoriented, the display driver 216 initiates the display reorientation.
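One way to picture the decision made by the display rotation decision unit 214 is sketched below. The inputs (a device rotation angle derived from the IMU 202 and an eye-axis angle derived from the face/eyes detection 212) and the snapping to 90-degree steps are illustrative assumptions, not a required implementation.

    def decide_display_rotation(device_rotation_deg, eye_axis_angle_deg):
        """Return the display rotation (0, 90, 180, or 270 degrees) that keeps
        the display orientation aligned with the user's eyes. Sketch only."""
        relative = (device_rotation_deg - eye_axis_angle_deg) % 360.0
        return int(round(relative / 90.0)) % 4 * 90

    # Example: the device was rotated about 85 degrees while the user's eyes
    # stayed level, so a 90-degree rotation is handed to the display driver 216.
    print(decide_display_rotation(85.0, 0.0))  # -> 90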
In another example, the camera remains on at all times. When the camera is on at all times, the camera tracks the face and/or eyes of the user on a continuing basis. The display orientation is maintained based on the face/eye position.
In a further example, the computing device does not include a camera. The orientation of the device is determined based on data collected by the IMU 202. The orientation of the display is determined based on the device orientation. When the device is vertical, i.e., in a portrait position, the current accelerometer-based approach is used. When the device is placed on a horizontal surface, the current orientation is considered to be “preserved”. If yaw changes occur while the device is on the horizontal surface, the display is rotated in order to maintain the original orientation.
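A sketch of this camera-less, horizontal-surface behavior follows. The angle bookkeeping and function names are assumptions used only to illustrate how the preserved orientation could be maintained from the IMU yaw data alone.

    def preserved_display_rotation(yaw_at_placement_deg, current_yaw_deg,
                                   rotation_at_placement_deg=0):
        """Counter-rotate the display by the yaw change measured since the
        device was laid flat, so the original orientation is preserved.
        Illustrative sketch only."""
        yaw_change = (current_yaw_deg - yaw_at_placement_deg) % 360.0
        counter = (rotation_at_placement_deg - yaw_change) % 360.0
        # Snap to the nearest 90-degree display orientation.
        return int(round(counter / 90.0)) % 4 * 90

    # Example: the device was flat in portrait (rotation 0) and is then spun
    # 90 degrees clockwise on the table; the display rotates by 270 (i.e. -90)
    # degrees so the content keeps its original orientation for the user.
    print(preserved_display_rotation(0.0, 90.0))  # -> 270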
The block diagram of
When the device 504 is placed on the table 502 in a portrait position 506, the display is oriented such that up is oriented as indicated by arrow 508. When the device is rotated as indicated by arrow 510 to a landscape position 512, the change in orientation is detected by the compass and the gyrometer of the IMU, but no change is detected by the accelerometer. Due to the detection of the orientation change, the display is rotated such that up within the display is indicated by arrow 514. The change in display rotation can be triggered when a predetermined amount of change in device orientation is detected by the IMU. For example, when the device is rotated at least 90 degrees, the display orientation is changed in accordance with the change in device orientation, but if the device is rotated less than 90 degrees, the orientation of the display is unchanged. The predetermined amount of change can be set by a user or a manufacturer. The predetermined amount of change can be set to any number, such as 30 degrees, 45 degrees, 90 degrees, or any other suitable amount of change.
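The predetermined-change rule can be illustrated as follows; this is again only a sketch, and the 90-degree default and the names used are assumptions. The display orientation follows the device only once the rotation reaches the predetermined amount.

    PREDETERMINED_CHANGE_DEG = 90.0   # could equally be set to 30, 45, or another value

    def display_rotation_for(yaw_change_deg, threshold_deg=PREDETERMINED_CHANGE_DEG):
        """Rotate the display to the nearest 90-degree step only when the device
        rotation reaches the predetermined amount; otherwise leave it unchanged."""
        change = abs(yaw_change_deg) % 360.0
        if change < threshold_deg:
            return 0                  # below the threshold: orientation unchanged
        return int(round(change / 90.0)) % 4 * 90

    print(display_rotation_for(95.0))  # -> 90: portrait-to-landscape rotation applied
    print(display_rotation_for(40.0))  # -> 0: rotation too small, display unchanged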
Program instructions may be used to cause a general-purpose or special-purpose processing system that is programmed with the instructions to perform the operations described herein. These operations include, but are not limited to, the process flows described in
A device is disclosed herein. The device includes logic, at least partially including hardware logic, to receive data from an inertial measurement unit (IMU). The device also includes logic to analyze IMU data for changes in computing device orientation. The device further includes logic to orient a device display in accordance with the computing device orientation.
The device may include logic to receive a user eye position in relation to the device display. The user eye position can be detected by a camera. The camera is triggered by detection of the changes in computing device orientation by the IMU. The camera is triggered when the IMU detects changes in device orientation greater than a predetermined amount. The IMU can include a gyrometer, a compass, and an accelerometer. The device may employ a six degree of freedom (6DOF) algorithm or a nine degree of freedom (9DOF) algorithm. The device may include logic to detect pitch, roll, and yaw of the device.
A system for changing display orientation is described herein. The system includes an inertial measurement unit (IMU) to collect device orientation data. The system also includes a decision engine to analyze the device orientation data to determine changes in device orientation. The system further includes a driver to initiate a change in device display orientation in accordance with a change in device orientation.
The system may include a camera to detect user eye position. The camera detects user eye position when a change in device orientation is detected by the IMU. The orientation of the display is changed based on the user eye position. The IMU may include an accelerometer, a gyrometer, and a compass. The system can employ a six degree of freedom (6DOF) algorithm or a nine degree of freedom (9DOF) algorithm. The IMU detects pitch, roll, and yaw of the device. The system detects changes in device orientation when the device is in a horizontal position.
A computing device is described herein. The computing device includes a display and an inertial measurement unit (IMU) to detect changes in device orientation. The computing device also includes a driver to change an orientation of the display when a change in orientation of the computing device is detected by the IMU.
The computing device may include a camera to detect user eye position. The camera detects user eye position when a change in device orientation is detected by the IMU. The driver changes the orientation of the display based on user eye position. The IMU may include an accelerometer, a gyrometer, and a compass. The computing device can employ a six degree of freedom (6DOF) algorithm or a nine degree of freedom (9DOF) algorithm. The IMU detects pitch, roll, and yaw of the device.
A tangible, non-transitory, computer-readable medium is disclosed herein. The tangible, non-transitory, computer-readable medium includes code to direct a processor to receive data from an inertial measurement unit (IMU). The tangible, non-transitory, computer-readable medium also includes code to direct a processor to analyze IMU data for changes in computing device orientation. The tangible, non-transitory, computer-readable medium further includes code to direct a processor to orient a device display in accordance with the computing device orientation.
In the foregoing description and claims, the terms “coupled” and “connected,” along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other. Rather, in particular embodiments, “connected” may be used to indicate that two or more elements are in direct physical or electrical contact with each other. “Coupled” may mean that two or more elements are in direct physical or electrical contact. However, “coupled” may also mean that two or more elements are not in direct contact with each other, but yet still cooperate or interact with each other.
Some embodiments may be implemented in one or a combination of hardware, firmware, and software. Some embodiments may also be implemented as instructions stored on a machine-readable medium, which may be read and executed by a mobile platform to perform the operations described herein. A machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine, e.g., a computer. For example, a machine-readable medium may include read only memory (ROM); random access memory (RAM); magnetic disk storage media; optical storage media; flash memory devices, among others.
An embodiment is an implementation or example. Reference in the specification to “an embodiment,” “one embodiment,” “some embodiments,” “various embodiments,” or “other embodiments” means that a particular feature, structure, or characteristic described in connection with the embodiments is included in at least some embodiments, but not necessarily all embodiments, of the inventions. The various appearances of “an embodiment,” “one embodiment,” or “some embodiments” are not necessarily all referring to the same embodiments. Elements or aspects from an embodiment can be combined with elements or aspects of another embodiment.
Not all components, features, structures, characteristics, etc. described and illustrated herein need be included in a particular embodiment or embodiments. If the specification states a component, feature, structure, or characteristic “may”, “might”, “can” or “could” be included, for example, that particular component, feature, structure, or characteristic is not required to be included. If the specification or claim refers to “a” or “an” element, that does not mean there is only one of the element. If the specification or claims refer to “an additional” element, that does not preclude there being more than one of the additional element.
It is to be noted that, although some embodiments have been described in reference to particular implementations, other implementations are possible according to some embodiments. Additionally, the arrangement and/or order of circuit elements or other features illustrated in the drawings and/or described herein need not be arranged in the particular way illustrated and described. Many other arrangements are possible according to some embodiments.
In each system shown in a figure, the elements in some cases may each have a same reference number or a different reference number to suggest that the elements represented could be different and/or similar. However, an element may be flexible enough to have different implementations and work with some or all of the systems shown or described herein. The various elements shown in the figures may be the same or different. Which one is referred to as a first element and which is called a second element is arbitrary.
In the preceding description, various aspects of the disclosed subject matter have been described. For purposes of explanation, specific numbers, systems and configurations were set forth in order to provide a thorough understanding of the subject matter. However, it is apparent to one skilled in the art having the benefit of this disclosure that the subject matter may be practiced without the specific details. In other instances, well-known features, components, or modules were omitted, simplified, combined, or split in order not to obscure the disclosed subject matter.
While the disclosed subject matter has been described with reference to illustrative embodiments, this description is not intended to be construed in a limiting sense. Various modifications of the illustrative embodiments, as well as other embodiments of the subject matter, which are apparent to persons skilled in the art to which the disclosed subject matter pertains are deemed to lie within the scope of the disclosed subject matter.
While the present techniques may be susceptible to various modifications and alternative forms, the exemplary embodiments discussed above have been shown only by way of example. It is to be understood that the technique is not intended to be limited to the particular examples disclosed herein. Indeed, the present techniques include all alternatives, modifications, and equivalents falling within the true spirit and scope of the appended claims.