Many electronic devices have the ability to display media. The orientation of the media may match the orientation of the display screen. Some electronic devices have a display screen that is able to change the way an image is displayed on the display screen based on a physical orientation of the device. For example, a tablet computer may display media in a portrait aspect ratio or a landscape aspect ratio. Other electronic devices may provide a user with an option to rotate media displayed on the display device by fixed amounts, such as increments of ninety degrees.
In one example, a method is provided for rotating a display orientation of media displayed on a display device. The method may include receiving information corresponding to a field of view of a camera of a wearable computing device. The field of view of the camera may include a display device. The method may also include identifying an orientation of the display device based on the information corresponding to the field of view of the camera. The method may further include identifying a reference orientation that includes an orientation of the wearable computing device. The method may additionally include determining, by the wearable computing device, a rotation of a display orientation of media displayed on the display device. The rotation may be based on a comparison of the orientation of the display device with the reference orientation. The method may also include providing information indicative of the rotation of the display orientation to the display device.
In another example, a non-transitory computer-readable memory having stored thereon instructions executable by a computing device to perform functions is provided. The functions may include receiving information corresponding to a field of view of a camera of a wearable computing device. The field of view of the camera may include a display device. The functions may also include identifying an orientation of the display device based on the information corresponding to the field of view of the camera. The functions may further include identifying a reference orientation that includes an orientation of the wearable computing device. The functions may additionally include determining, by the wearable computing device, a rotation of a display orientation of media displayed on the display device. The rotation may be based on a comparison of the orientation of the display device with the reference orientation. The functions may also include providing information indicative of the rotation of the display orientation to the display device.
In another example, a wearable computing device is provided. The wearable computing device may include a camera having a field of view and a processor. The processor may be configured to receive information corresponding to the field of view of the camera that includes a display device. The processor may also be configured to identify an orientation of the display device based on the information corresponding to the field of view of the camera. The processor may further be configured to identify a reference orientation that includes an orientation of the wearable computing device. The processor may additionally be configured to determine a rotation of a display orientation of media displayed on the display device. The rotation may be based on a comparison of the orientation of the display device with the reference orientation. The processor may also be configured to provide information indicative of the rotation to the display device.
The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the figures and the following detailed description.
In the following detailed description, reference is made to the accompanying figures, which form a part hereof. In the figures, similar symbols typically identify similar components, unless context dictates otherwise. The illustrative embodiments described in the detailed description, figures, and claims are not meant to be limiting. Other embodiments may be utilized, and other changes may be made, without departing from the spirit or scope of the subject matter presented herein. It will be readily understood that the aspects of the present disclosure, as generally described herein, and illustrated in the figures, can be arranged, substituted, combined, separated, and designed in a wide variety of different configurations, all of which are contemplated herein.
1. Overview
Disclosed herein are example methods and systems for determining a rotation of a display orientation of media displayed on a display device by a wearable computing device. An example method may include receiving information corresponding to a field of view of a camera of a wearable computing device. The field of view of the camera may include a display device. The example method may also include the wearable computing device identifying an orientation of the display device based on the information corresponding to the field of view and identifying a reference orientation that includes an orientation of the wearable computing device. In some examples, the reference orientation may include an orientation of a head-mounted display of the wearable computing device.
The example method may further include determining, by the wearable computing device, a rotation of a display orientation of media displayed on the display device. The rotation may be based on a comparison of the orientation of the display device with the reference orientation. In one example, the rotation may align the display orientation with the reference orientation such that an axis of the display orientation is parallel to an axis of the reference orientation. The method may also include providing information indicative of the rotation to the display device.
2. Example System and Device Architecture
Each of the frame elements 104, 106, and 108 and the extending side-arms 114, 116 may be formed of a solid structure of plastic and/or metal, or may be formed of a hollow structure of similar material so as to allow wiring and component interconnects to be internally routed through the system 100. Other materials may be possible as well.
One or more of each of the lens elements 110, 112 may be formed of any material that can suitably display a projected image or graphic. Each of the lens elements 110, 112 may also be sufficiently transparent to allow a user to see through the lens element. Combining these two features of the lens elements may facilitate an augmented reality or heads-up display where the projected image or graphic is superimposed over a real-world view as perceived by the user through the lens elements 110, 112.
The extending side-arms 114, 116 may each be projections that extend away from the lens-frames 104, 106, respectively, and may be positioned behind a user's ears to secure the system 100 to the user. The extending side-arms 114, 116 may further secure the system 100 to the user by extending around a rear portion of the user's head. Additionally or alternatively, for example, the system 100 may connect to or be affixed within a head-mounted helmet structure. Other possibilities exist as well.
The system 100 may also include an on-board computing system 118, a video camera 120, a sensor 122, and a finger-operable touch pad 124. The on-board computing system 118 is shown to be positioned on the extending side-arm 114 of the system 100; however, the on-board computing system 118 may be provided on other parts of the system 100 or may be positioned remote from the system 100 (e.g., the on-board computing system 118 could be connected by wires or wirelessly connected to the system 100). The on-board computing system 118 may include a processor and memory, for example. The on-board computing system 118 may be configured to receive and analyze data from the video camera 120, the sensor 122, and the finger-operable touch pad 124 (and possibly from other sensory devices, user-interfaces, or both) and generate images for output by the lens elements 110 and 112. The on-board computing system 118 may additionally include a speaker or a microphone for user input (not shown). An example computing system is further described below in connection with
The video camera 120 is shown positioned on the extending side-arm 114 of the system 100; however, the video camera 120 may be provided on other parts of the system 100. The video camera 120 may be configured to capture images at various resolutions or at different frame rates. Video cameras with a small form-factor, such as those used in cell phones or webcams, for example, may be incorporated into an example embodiment of the system 100.
Further, although
The sensor 122 is shown on the extending side-arm 116 of the system 100; however, the sensor 122 may be positioned on other parts of the system 100. The sensor 122 may include one or more of a gyroscope or an accelerometer, for example. Other sensing devices may be included within, or in addition to, the sensor 122 or other sensing functions may be performed by the sensor 122.
The finger-operable touch pad 124 is shown on the extending side-arm 114 of the system 100. However, the finger-operable touch pad 124 may be positioned on other parts of the system 100. Also, more than one finger-operable touch pad may be present on the system 100. The finger-operable touch pad 124 may be used by a user to input commands. The finger-operable touch pad 124 may sense at least one of a position and a movement of a finger via capacitive sensing, resistance sensing, or a surface acoustic wave process, among other possibilities. The finger-operable touch pad 124 may be capable of sensing finger movement in a direction parallel or planar to the pad surface, in a direction normal to the pad surface, or both, and may also be capable of sensing a level of pressure applied to the pad surface. The finger-operable touch pad 124 may be formed of one or more translucent or transparent insulating layers and one or more translucent or transparent conducting layers. Edges of the finger-operable touch pad 124 may be formed to have a raised, indented, or roughened surface, so as to provide tactile feedback to a user when the user's finger reaches the edge, or other area, of the finger-operable touch pad 124. If more than one finger-operable touch pad is present, each finger-operable touch pad may be operated independently, and may provide a different function.
The detector 126 may also include various lenses, optics, or other components to alter the focus and/or direction of the detector 126. Although the detector 126 is shown coupled to an inside surface of the frame element 104, one or more components may be coupled to the frame elements 104, 106, and 108 and/or the extending side-arms 114, 116 in place of and/or in addition to the detector 126 as well.
As shown in
The lens elements 110, 112 may act as a combiner in a light projection system and may include a coating that reflects the light projected onto them from the projectors 128, 132. In some embodiments, a reflective coating may be omitted (e.g., when the projectors 128, 132 are scanning laser devices).
In alternative embodiments, other types of display elements may also be used. For example, the lens elements 110, 112 themselves may include: a transparent or semi-transparent matrix display, such as an electroluminescent display or a liquid crystal display, one or more waveguides for delivering an image to the user's eyes, or other optical elements capable of delivering an in-focus near-to-eye image to the user. A corresponding display driver may be disposed within the frame elements 104, 106 for driving such a matrix display. Alternatively or additionally, a laser or light emitting diode (LED) source and scanning system could be used to draw a raster display directly onto the retina of one or more of the user's eyes. Other possibilities exist as well.
As shown in
The system 220 may include a single lens element 230 that may be coupled to one of the side-arms 223 or the center frame support 224. The lens element 230 may include a display such as the display described with reference to
Thus, the device 310 may include a display system 312 comprising a processor 314 and a display 316. The display 316 may be, for example, an optical see-through display, an optical see-around display, or a video see-through display. The processor 314 may receive data from the remote device 330, and configure the data for display on the display 316. The processor 314 may be any type of processor, such as a micro-processor or a digital signal processor, for example.
The device 310 may further include on-board data storage, such as memory 318 coupled to the processor 314. The memory 318 may store software that can be accessed and executed by the processor 314, for example.
The remote device 330 may be any type of computing device or transmitter including a laptop computer, a mobile telephone, or tablet computing device, etc., that is configured to transmit data to the device 310. Additionally, the remote device 330 may be an additional heads-up display system, such as the systems 100, 200, or 220 described with reference to
In
As described above in connection with
Computing system 400 may include at least one processor 402 and system memory 404. In an example embodiment, computing system 400 may include a system bus 406 that communicatively connects processor 402 and system memory 404, as well as other components of computing system 400. Depending on the desired configuration, processor 402 can be any type of processor including, but not limited to, a microprocessor (μP), a microcontroller (μC), a digital signal processor (DSP), or any combination thereof. Furthermore, system memory 404 can be of any type of memory now known or later developed including, but not limited to, volatile memory (such as RAM), non-volatile memory (such as ROM, flash memory, etc.), or any combination thereof.
An example computing system 400 may include various other components as well. For example, computing system 400 includes an A/V processing unit 408 for controlling graphical display 410 and speaker 412 (via A/V port 414), one or more communication interfaces 416 for connecting to other computing devices 418, and a power supply 420. Graphical display 410 may be arranged to provide a visual depiction of various input regions provided by user-interface module 422. For example, user-interface module 422 may be configured to provide a user-interface, and graphical display 410 may be configured to provide a visual depiction of the user-interface. User-interface module 422 may be further configured to receive data from and transmit data to (or be otherwise compatible with) one or more user-interface devices 428.
Furthermore, computing system 400 may also include one or more data storage devices 424, which can be removable storage devices, non-removable storage devices, or a combination thereof. Examples of removable storage devices and non-removable storage devices include magnetic disk devices such as flexible disk drives and hard-disk drives (HDD), optical disk drives such as compact disk (CD) drives or digital versatile disk (DVD) drives, solid state drives (SSD), and/or any other storage device now known or later developed. Computer storage media can include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. For example, computer storage media may take the form of RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium now known or later developed that can be used to store the desired information and which can be accessed by computing system 400.
According to an example embodiment, computing system 400 may include program instructions 426 that are stored in system memory 404 (and/or possibly in another data-storage medium) and executable by processor 402 to facilitate the various functions described herein including, but not limited to, those functions described with respect to
3. Example Determination of a Rotation of Media Displayed on Display Device
In addition, for the method 500 and other processes and methods disclosed herein, the flowchart shows functionality and operation of one possible implementation of present embodiments. In this regard, each block may represent a module, a segment, or a portion of program code, which includes one or more instructions executable by a processor for implementing specific logical functions or steps in the process. The program code may be stored on any type of computer-readable medium, for example, such as a storage device including a disk or hard drive. The computer-readable medium may include non-transitory computer-readable media, for example, such as computer-readable media that store data for short periods of time, like register memory, processor cache, or Random Access Memory (RAM). The computer-readable medium may also include non-transitory media, such as secondary or persistent long-term storage, like read-only memory (ROM), optical or magnetic discs, compact-disc read-only memory (CD-ROM), or the like. The computer-readable medium may also include any other volatile or non-volatile storage systems. The computer-readable medium may be considered a computer-readable storage medium, for example, or a tangible storage device.
In addition, for the method 500 and other processes and methods disclosed herein, each block of
At block 502, the method 500 includes identifying an orientation of a display device in a field of view of a wearable computing device. The wearable computing device may include a head-mounted display, such as the systems 100, 200, and 220 depicted in
The orientation of the display device may include a first axis that is perpendicular to a second axis. In one example, a user wearing the head-mounted display may also use a display device, such as a television, a tablet computer, a notebook or laptop computer, an e-reader, a digital media player, or a similar electronic device capable of displaying media. When the user uses the display device, the user may position the display device such that the user can see the media displayed on the display device. Since the user is wearing the head-mounted display, the field of view of the camera may include the display device. The wearable computing device may employ an object recognition technique to identify an orientation of the display device from the information corresponding to the field of view of the camera.
In another example, the wearable computing device may send an instruction to the display device that includes an instruction for displaying a fiducial on the display device. The fiducial may include a character unique to the communication link and may be either perceptible or imperceptible to human vision. The wearable computing device may identify the fiducial in the information corresponding to the field of view of the camera and determine the orientation of the display device based on a location of the fiducial in the information corresponding to the field of view of the camera. Alternatively, the instruction may include an instruction for displaying a fiducial in each corner of the display device. The wearable computing device may determine the orientation of the display device based on a location of each fiducial in the information corresponding to the field of view of the camera.
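The corner-fiducial variant above can be sketched in code. The sketch below is illustrative only and assumes hypothetical corner labels and camera-frame pixel coordinates (with the y-axis pointing down, as is common for image data); the disclosed method does not prescribe any particular representation.

```python
def coarse_display_rotation(corner_fiducials):
    """Estimate the cardinal rotation of a display from the observed
    camera-frame positions of fiducials drawn in its corners.

    `corner_fiducials` maps a corner name (illustrative assumption) to an
    (x, y) pixel position in the camera frame, y increasing downward.
    Returns 0, 90, 180, or 270 degrees.
    """
    # The vector from the display's own top-left fiducial to its top-right
    # fiducial points along the display's horizontal axis as the camera
    # sees it; its dominant direction gives the coarse rotation.
    tl = corner_fiducials["top_left"]
    tr = corner_fiducials["top_right"]
    dx, dy = tr[0] - tl[0], tr[1] - tl[1]
    if abs(dx) >= abs(dy):
        return 0 if dx > 0 else 180
    return 90 if dy > 0 else 270
```

For an upright display the top edge runs left-to-right in the camera frame, so the function returns 0; if the display is turned a quarter turn, the same edge runs vertically and the function returns 90 or 270.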
In another aspect of this example, the instruction may include an instruction for displaying a watermark on the display device. The watermark may be identifiable by the wearable computing device and imperceptible to human vision. The wearable computing device may identify the watermark in the information corresponding to the field of view of the camera and may determine an orientation of the watermark. The wearable computing device may identify the orientation of the display device based on the orientation of the watermark.
In yet another example, the wearable computing device may identify text displayed on the display device from the information corresponding to the field of view of the camera. The wearable computing device may determine an orientation of the text and, based on the orientation of the text, determine the orientation of the display device.
The wearable computing device may receive information corresponding to the view 600 and identify an orientation 618 of the tablet computer 602, which is shown for illustrative purposes. The orientation 618 of the tablet computer 602 may include a horizontal axis 620 and a vertical axis 622. In one example, the wearable computing device may employ a text recognition technique to identify the time indication 608. The wearable computing device may identify the orientation 618 of the tablet computer 602 based on the orientation of the time indication 608.
In another example, the wearable computing device may receive an indication of a location of a fiducial displayed on the display 604 of the tablet computer 602. The fiducial may include one or more of the signal strength indication 606, the power level indication 610, and the application icons 612, 614, and 616. The wearable computing device may identify the fiducial in the information corresponding to the view 600 and identify the orientation 618 of the tablet computer 602 by comparing the location of the fiducial received from the tablet computer to the location of the fiducial in the information corresponding to the view 600.
In the example depicted in view 630, a user of the wearable computing device and the tablet computer 632 holds the tablet computer 632 at an angle. The wearable computing device may determine that the fiducials 636, 638, 640, and 642 form a trapezoid 644, which is shown for illustrative purposes. The wearable computing device may identify the orientation 646 of the tablet computer 632 by aligning a horizontal axis 648 of the orientation 646 of the tablet computer 632 with the base of the trapezoid 644, which is a line connecting the fiducials 638 and 640. The orientation 646 of the tablet computer 632 may include a vertical axis 650 that is perpendicular to the horizontal axis 648.
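Aligning a horizontal axis with the base of the observed trapezoid amounts to measuring the angle of the line through the two base fiducials. A minimal sketch, assuming camera-frame pixel coordinates for the two base fiducials (the function name and coordinate convention are illustrative, not part of the disclosure):

```python
import math

def display_axis_angle(base_left, base_right):
    """Angle, in degrees, between the display's horizontal axis and the
    camera frame's horizontal axis, estimated from the two fiducials
    forming the base of the observed trapezoid.

    `base_left` and `base_right` are (x, y) pixel positions.
    """
    dx = base_right[0] - base_left[0]
    dy = base_right[1] - base_left[1]
    # atan2 keeps the sign of the tilt, distinguishing clockwise from
    # counter-clockwise rotations of the display.
    return math.degrees(math.atan2(dy, dx))
```

A level base yields an angle near zero; a base tilted with equal rise and run yields an angle near forty-five degrees.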
Returning to
Returning to
In another example, a wearable computing device may not include an IMU or a similar sensor configured to determine an orientation of a head-mounted display. In this example, the wearable computing device may include data storage, such as the system memory 404 depicted in
At block 506, the method 500 includes determining a rotation of a display orientation of media displayed on the display device. The display orientation may include a first axis and a second axis upon which the media is displayed. Applying the rotation to the display orientation may result in aligning one of the first axis and the second axis of the display orientation with a reference axis of a reference orientation. In one example, the wearable computing device may base the rotation on a comparison of an orientation of the display device with a reference orientation. In this example, the wearable computing device may make the comparison by determining an angle between a horizontal axis of the orientation of the display device and a horizontal axis of the reference orientation. In another example, the wearable computing device may make the comparison by determining an angle between a different axis of the orientation of the display device and a different axis of the reference orientation.
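The comparison at block 506 can be sketched as the signed angle between two axis direction vectors. The representation below (2-D unit-free direction vectors for each horizontal axis) is an illustrative assumption; the disclosure does not fix a particular encoding of orientations.

```python
import math

def rotation_between_axes(display_axis, reference_axis):
    """Signed rotation, in degrees, that would align the display
    orientation's horizontal axis with the reference orientation's
    horizontal axis. Each axis is a 2-D direction vector (dx, dy)."""
    angle_display = math.atan2(display_axis[1], display_axis[0])
    angle_reference = math.atan2(reference_axis[1], reference_axis[0])
    degrees = math.degrees(angle_reference - angle_display)
    # Normalize into [-180, 180) so the smaller of the two possible
    # rotations is reported.
    return (degrees + 180.0) % 360.0 - 180.0
```

For example, a display whose horizontal axis points along (1, 0) compared against a reference axis of (0, 1) yields a ninety-degree rotation, while the reverse comparison yields negative ninety degrees.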
Returning to
For example, consider a situation in which a user is watching media on a tablet computer while wearing a head-mounted display of the wearable computing device. The wearable computing device may receive a first signal from the sensor indicating that the user is wearing the head-mounted display, and the wearable computing device may determine a rotation of a display orientation of the media as described herein. The user may subsequently take the head-mounted display off and set the head-mounted display on a surface such that the field of view of a camera mounted to the head-mounted display includes the tablet computer. The wearable computing device may receive a second signal from the sensor indicating that the user is not wearing the wearable computing device. In this case, the wearable computing device may not determine a rotation of the display orientation.
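The wear-state gating described above reduces to a simple conditional: compute a rotation only while the sensor reports that the head-mounted display is worn. A minimal sketch, with hypothetical names for the sensor reading and the rotation routine:

```python
def maybe_determine_rotation(is_worn, compute_rotation):
    """Gate rotation determination on the wear-state sensor.

    `is_worn` is True when the sensor indicates the head-mounted display
    is being worn; `compute_rotation` is any callable that performs the
    comparison of orientations (both names are illustrative).
    Returns the rotation, or None when the device is not worn.
    """
    if not is_worn:
        # HMD has been set down: do not rotate the display orientation,
        # even though the display device may still be in the camera view.
        return None
    return compute_rotation()
```

In the scenario above, the first sensor signal would allow a rotation to be computed, and the second signal would suppress it.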
At block 508, the method 500 includes providing information indicative of a rotation of a display orientation to a display device. In one example, a wearable computing device may communicate with the display device via a wired or wireless communication link. The wearable computing device may send information indicative of the rotation to the display device via the communication link.
The information indicative of the rotation may include additional information for displaying the media on the display device. In one example, the information indicative of the rotation may include an indication of an aspect ratio of the media displayed on the display device. In this example, the display device may display the media in one of a first aspect ratio and a second aspect ratio, such as a portrait aspect ratio and a landscape aspect ratio. The wearable computing device may base the indication of the aspect ratio on the rotation of the display orientation. For instance, if the rotation is less than or equal to a threshold angle, the information indicative of the rotation may include an indication that the display device should display the media using the first aspect ratio. If the rotation is greater than the threshold angle, the information indicative of the rotation may include an indication that the display device should display the media using the second aspect ratio.
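The threshold comparison above can be sketched directly. The forty-five-degree threshold and the mapping of the first aspect ratio to portrait are illustrative assumptions; the disclosure leaves both unspecified.

```python
def choose_aspect_ratio(rotation_deg, threshold_deg=45.0):
    """Pick an aspect-ratio indication from the rotation magnitude.

    Rotations at or below the threshold select the first aspect ratio
    (portrait, in this sketch); larger rotations select the second
    (landscape). Threshold and labels are illustrative assumptions.
    """
    if abs(rotation_deg) <= threshold_deg:
        return "portrait"   # first aspect ratio
    return "landscape"      # second aspect ratio
```

A small rotation (the user tilting the display slightly) leaves the aspect ratio unchanged, while a rotation past the threshold, such as turning the display a quarter turn, switches it.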
Returning to
It should be understood that arrangements described herein are for purposes of example only. As such, those skilled in the art will appreciate that other arrangements and other elements (e.g., machines, interfaces, functions, orders, groupings of functions, etc.) can be used instead, and some elements may be omitted altogether according to the desired result. Further, many of the elements described are functional entities that may be implemented as discrete or distributed components or in conjunction with other components, in any suitable combination and location.
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope being indicated by the following claims, along with the full scope of equivalents to which such claims are entitled. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only, and is not intended to be limiting.