1. Field of the Invention
The present invention relates to three-dimensional display technology.
2. Background Art
Images may be generated for display in various forms. For instance, television (TV) is a widely used telecommunication medium for transmitting and displaying images in monochromatic (“black and white”) or color form. Conventionally, images are provided in analog form and are displayed by display devices in two dimensions. More recently, images are being provided in digital form for display in two dimensions on display devices having improved resolution (e.g., “high definition” or “HD”). Even more recently, images capable of being displayed in three dimensions are being generated.
Conventional displays may use a variety of techniques to achieve three-dimensional image viewing functionality. For example, various types of glasses have been developed that may be worn by users to view three-dimensional images displayed by a conventional display. Examples of such glasses include glasses that utilize color filters or polarized filters. In each case, the lenses of the glasses pass two-dimensional images of differing perspective to the user's left and right eyes. The images are combined in the visual center of the brain of the user to be perceived as a three-dimensional image. In another example, synchronized left eye, right eye LCD (liquid crystal display) shutter glasses may be used with conventional two-dimensional displays to create a three-dimensional viewing illusion. In still another example, LCD display glasses are being used to display three-dimensional images to a user. The lenses of the LCD display glasses include corresponding displays that provide images of differing perspective to the user's eyes, to be perceived by the user as three-dimensional.
Difficulties exist in providing audio associated with delivered three-dimensional or two-dimensional content, such as content provided to shutter glasses. Such audio is desired to be provided in a manner that does not disturb other persons who may not be viewing the same three-dimensional or two-dimensional content.
Methods, systems, and apparatuses are described for display systems and wearable devices that enable audio associated with three-dimensional or two-dimensional image content to be heard by viewers without interfering with other persons substantially as shown in and/or described herein in connection with at least one of the figures, as set forth more completely in the claims.
The accompanying drawings, which are incorporated herein and form a part of the specification, illustrate the present invention and, together with the description, further serve to explain the principles of the invention and to enable a person skilled in the pertinent art to make and use the invention.
The present invention will now be described with reference to the accompanying drawings. In the drawings, like reference numbers indicate identical or functionally similar elements. Additionally, the left-most digit(s) of a reference number identifies the drawing in which the reference number first appears.
The present specification discloses one or more embodiments that incorporate the features of the invention. The disclosed embodiment(s) merely exemplify the invention. The scope of the invention is not limited to the disclosed embodiment(s). The invention is defined by the claims appended hereto.
References in the specification to “one embodiment,” “an embodiment,” “an example embodiment,” etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
Furthermore, it should be understood that spatial descriptions (e.g., “above,” “below,” “up,” “left,” “right,” “down,” “top,” “bottom,” “vertical,” “horizontal,” etc.) used herein are for purposes of illustration only, and that practical implementations of the structures described herein can be spatially arranged in any orientation or manner.
There is a huge industry push to support three-dimensional images being displayed by a digital television (DTV) or by other types of display devices to viewers. One technique for providing three-dimensional images involves shuttering of the right and left lenses of glasses (e.g., goggles) worn by a viewer (or “wearer”), the shuttering being performed in sync with the display of images on a display. In such shutter glasses, a left image is displayed on the screen that is coordinated with a blackout on the right lens of the glasses (so that the left image is seen only by the left eye of the viewer), followed by a right image being displayed on the screen that is coordinated with a blackout on the left lens of the glasses (so that the right image is seen only by the right eye of the viewer).
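For purposes of illustration only, the following Python sketch models the shuttering behavior described above for a single viewer. The frame labels, data structures, and printed output are hypothetical examples introduced here for explanation and are not part of any particular embodiment.

```python
# Illustrative model of shutter-glasses synchronization for one viewer.
# Each displayed frame alternates between a left-eye and a right-eye image;
# the lens for the opposite eye is blacked out ("closed") during that frame.

FRAME_SEQUENCE = ["LEFT", "RIGHT"]  # repeating display order on the screen

def lens_states(frame_label):
    """Return (left_lens_open, right_lens_open) for the currently displayed frame."""
    if frame_label == "LEFT":
        return (True, False)   # left eye sees the left image; right lens blacked out
    return (False, True)       # right eye sees the right image; left lens blacked out

if __name__ == "__main__":
    for frame_index in range(4):
        label = FRAME_SEQUENCE[frame_index % len(FRAME_SEQUENCE)]
        left_open, right_open = lens_states(label)
        print(f"frame {frame_index}: display={label:5s} "
              f"left_lens={'open' if left_open else 'closed'} "
              f"right_lens={'open' if right_open else 'closed'}")
```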
Difficulties exist in enabling a viewer to listen to audio associated with such three-dimensional content being viewed without disturbing other persons that are nearby. Furthermore, a display may be configured to provide three-dimensional or two-dimensional views to multiple viewers, such that each viewer is enabled to see their own selected three-dimensional content without seeing the content viewed by other viewers. However, in such case, difficulties exist in enabling each viewer to hear the audio associated with their own three-dimensional view without hearing audio associated with the other three-dimensional views.
In embodiments, as described herein, glasses are provided that deliver three-dimensional content to a wearer and/or enable the wearer to have a personal listening experience. For instance, if the user is watching television using the glasses, the user does not disturb other persons who may be viewing other content, may be performing other activities nearby, may be sleeping, etc.
For instance, FIG. 1 shows an example environment that includes a display system 102, remote controls 104a and 104b, viewers 106a and 106b, and first and second glasses 112a and 112b worn by viewers 106a and 106b, respectively.
Display system 102 is a system configured to display images. For example, display system 102 may include a display device, such as a television display, a computer monitor, a smart phone display, etc., and may include one or more devices configured to provide media content to the display device, such as a computer, a cable box or set top box, a game console, a digital video disc (DVD) player, a home theater receiver, etc. In an embodiment, the display device and a media content receiver and/or player may be integrated in a single device or may be separate devices. A display device of display system 102 displays light 110 that includes three-dimensional/two-dimensional images associated with three-dimensional/two-dimensional content selected by viewers 106a and 106b for viewing. For example, viewer 106a may use remote control 104a to select the first content for viewing, and viewer 106b may use remote control 104b to select the second content for viewing. In an embodiment, remote control 104a and remote control 104b may be integrated into a single device or may be two separate devices. As shown in FIG. 1, viewer 106a wears first glasses 112a and viewer 106b wears second glasses 112b.
Viewer 106a is delivered a corresponding view 108a by display system 102, and viewer 106b is delivered a corresponding view 108b by display system 102. Views 108a and 108b may each be a three-dimensional view. In embodiments, view 108a may be delivered to viewer 106a, but not be visible to viewer 106b, and view 108b may be delivered to viewer 106b, but not be visible to viewer 106a. In embodiments, views 108a and 108b may be different or the same.
First and second glasses 112a and 112b are shutter glasses that enable personal viewing, 3D viewing, or both personal and 3D viewing. As such, each of first and second glasses 112a and 112b filters the images displayed by display system 102 so that each of viewers 106a and 106b is delivered a particular three-dimensional or two-dimensional view associated with the three-dimensional or two-dimensional content that the viewer selected.
For instance, display system 102 may emit light 110 that includes first and second images associated with the first three-dimensional content selected by viewer 106a and third and fourth images associated with the second three-dimensional content selected by viewer 106b. The first image is a left eye image and the second image is a right eye image associated with the first three-dimensional content, and the third image is a left eye image and the fourth image is a right eye image associated with the second three-dimensional content. The first-fourth images may be sequentially displayed by display system 102 in a repeating fashion, with each repeated display of each of the first-fourth images providing a next image in four sequences of images. First and second glasses 112a and 112b operate to filter the first-fourth images displayed by display system 102 so that viewers 106a and 106b are enabled to view the corresponding three-dimensional content they desire to view. For example, the left and right lenses of first glasses 112a block or pass light 110 in synchronization with the first and second images, respectively, and the left and right lenses of second glasses 112b block or pass light 110 in synchronization with the third and fourth images. In this manner, first viewer 106a alternately sees the first image with his/her left eye and the second image with his/her right eye, and second viewer 106b alternately sees the third image with his/her left eye and the fourth image with his/her right eye. The first and second images are combined in the visual center of the brain of viewer 106a to be perceived as a first three-dimensional image, and the third and fourth images are combined in the visual center of the brain of viewer 106b to be perceived as a second three-dimensional image.
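As a non-limiting illustration, the following Python sketch models the four-phase sequence described above and the lens state of each pair of glasses during each phase. The phase names, the mapping table, and the glasses identifiers are hypothetical labels used only for this sketch.

```python
# Illustrative four-phase frame sequence for two pairs of shutter glasses.
# Phases correspond to the first left, first right, second left, and second
# right images; each pair of glasses opens exactly one lens during a phase of
# "its" content and keeps both lenses closed during the other viewer's phases.

PHASES = ["first_left", "first_right", "second_left", "second_right"]

# Hypothetical mapping: which (glasses, eye) is allowed to see each phase.
OPEN_LENS = {
    "first_left":   ("glasses_112a", "left"),
    "first_right":  ("glasses_112a", "right"),
    "second_left":  ("glasses_112b", "left"),
    "second_right": ("glasses_112b", "right"),
}

def shutter_state(glasses_id, phase):
    """Return (left_open, right_open) for one pair of glasses during one phase."""
    owner, eye = OPEN_LENS[phase]
    if owner != glasses_id:
        return (False, False)          # other viewer's image: block both lenses
    return (eye == "left", eye == "right")

if __name__ == "__main__":
    for phase in PHASES:
        a = shutter_state("glasses_112a", phase)
        b = shutter_state("glasses_112b", phase)
        print(f"{phase:12s}  112a(left,right)={a}  112b(left,right)={b}")
```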
In another embodiment, display system 102 may provide two-dimensional views to viewers 106a and 106b. For instance, display system 102 may emit light 110 that includes a first image associated with the first content selected by viewer 106a and a second image associated with the second content selected by viewer 106b. The first and second images may be sequentially displayed by display system 102 in a repeating fashion, with each repeated display of each of the first and second images providing a next image in two sequences of images. First and second glasses 112a and 112b operate to filter the first and second images displayed by display system 102 so that viewers 106a and 106b are enabled to view the corresponding content they desire to view. For example, the left and right lenses of first glasses 112a block or pass light 110 in synchronization with the first image, and the left and right lenses of second glasses 112b block or pass light 110 in synchronization with the second image. In this manner, viewer 106a views a sequence of the first images to be provided a first two-dimensional view, and viewer 106b views a sequence of the second images to be provided a second two-dimensional view.
In another embodiment, one of first and second glasses 112a and 112b provides one of viewers 106a and 106b with a two-dimensional view, while the other of first and second glasses 112a and 112b provides the other of viewers 106a and 106b with a three-dimensional view.
Furthermore, first and second glasses 112a and 112b each include one or more earphones that enable viewers 106a and 106b to hear the audio associated with their corresponding content being viewed. If the content selected to be presented by first and second glasses 112a and 112b is the same, first and second glasses 112a and 112b provide the same audio content to viewers 106a and 106b associated with the content selected to be viewed. If the content selected to be viewed by first and second glasses 112a and 112b is not the same, first and second glasses 112a and 112b respectively provide the corresponding audio content associated with the particular content selected to be viewed. In this manner, viewers 106a and 106b hear their own respective audio, and are not disturbed by the other viewer's audio.
Glasses 112a and 112b may be configured in various ways. For instance, FIG. 2 shows glasses 200, which are an example of glasses 112a and 112b, according to an embodiment.
Note that the configuration of glasses 200 shown in FIG. 2 is provided for purposes of illustration, and that glasses 200 may be configured in other ways in further embodiments.
Still further, transmitter 302 transmits an audio content signal 308. Receivers 304a and 304b of glasses 112a and 112b receive audio content signal 308. When glasses 112a and 112b display the same content to their wearers, audio content signal 308 includes the same audio content for both receivers 304a and 304b. Glasses 112a and 112b extract the audio content from audio content signal 308, and use their respective earphones to play the extracted audio content to their wearers. When glasses 112a and 112b display different content to their wearers, audio content signal 308 includes first and second audio content associated with the first and second content, respectively. Glasses 112a extracts the first audio content from audio content signal 308, and uses its earphones to play the extracted first audio content to its wearer. Glasses 112b extracts the second audio content from audio content signal 308, and uses its earphones to play the extracted second audio content to its wearer.
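For purposes of illustration only, the following Python sketch models one possible way per-glasses audio could be separated out of a shared audio content signal. The tagged-chunk representation of audio content signal 308 is an assumption made for this sketch; the actual multiplexing format is not specified here.

```python
# Illustrative extraction of per-glasses audio from a shared audio content signal.
# The signal is modeled as a list of tagged chunks; the tag values and chunk
# structure are hypothetical and stand in for whatever multiplexing scheme an
# actual audio content signal might use.

def extract_audio(signal_chunks, content_id):
    """Collect the audio payloads in the signal that belong to one content stream."""
    return [chunk["payload"] for chunk in signal_chunks if chunk["content"] == content_id]

if __name__ == "__main__":
    audio_content_signal = [
        {"content": "first",  "payload": b"\x01\x02"},
        {"content": "second", "payload": b"\x03\x04"},
        {"content": "first",  "payload": b"\x05\x06"},
    ]
    # Glasses presenting the first content keep only the first content's audio,
    # and glasses presenting the second content keep only the second content's audio.
    print(extract_audio(audio_content_signal, "first"))   # played by the first glasses
    print(extract_audio(audio_content_signal, "second"))  # played by the second glasses
```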
Transmitter 302 and receivers 304a and 304b may communicate according to any suitable communication protocol, such as an 802.11 WLAN (wireless local area network) communication protocol (“WiFi”), an 802.15.4 WPAN (wireless personal area network) protocol (e.g., “ZigBee”), or other communication protocol. In an embodiment, transmitter 302 and receivers 304a and 304b communicate according to the Bluetooth™ communication protocol. For instance, FIG. 4 shows an embodiment in which display system 102 includes a Bluetooth™ transmitter 402 and glasses 112a and 112b include Bluetooth™ receivers 404a and 404b, respectively.
In an embodiment where glasses 112a and 112b display the same content to their wearers, Bluetooth™ transmitter 402 transmits an audio content signal 408 that includes the associated audio content across communication channel 406, which is received by Bluetooth™ receivers 404a and 404b. The audio content of audio content signal 408 may be transmitted as native Advanced Audio Distribution Profile (A2DP) data, as SBC (subband codec) encoded data that is inserted into HID (human interface device) packets, or may be transmitted in other form. Glasses 112a and 112b extract the audio content from audio content signal 408, and use their respective earphones to play the extracted audio content to their wearers. Furthermore, in an embodiment, Bluetooth™ transmitter 402 may optionally transmit frame sync signal 306 across communication channel 406 established between Bluetooth™ transmitter 402 and Bluetooth™ receivers 404a and 404b. Frame sync signal 306, or a synchronization signal similar to frame sync signal 306, transmitted across communication channel 406 is received by Bluetooth™ receivers 404a and 404b. As described above, glasses 112a and 112b may use frame sync signal 306 to synchronize the blocking and passing of light 110 by their right and left shuttering lenses to enable the images displayed by display system 102 to be perceived as three-dimensional. In an embodiment, communication channel 406 may be a Bluetooth™ HID link.
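As a non-limiting illustration, the following Python sketch shows the general idea of segmenting an encoded audio stream into small fixed-size packets for transport. The header layout, field sizes, and payload size are hypothetical placeholders introduced for this sketch and do not represent the actual Bluetooth™ HID, SBC, or A2DP formats.

```python
# Illustrative packing of encoded audio into small report-style packets.
# The header fields below (stream id, sequence number, payload length) are
# hypothetical; they only show segmenting an encoded audio stream for transport.

import struct

MAX_PAYLOAD = 60  # hypothetical payload size per packet, in bytes

def pack_audio(encoded_audio: bytes, stream_id: int):
    """Split encoded audio into packets of (stream_id, sequence, length, payload)."""
    packets = []
    for seq, offset in enumerate(range(0, len(encoded_audio), MAX_PAYLOAD)):
        payload = encoded_audio[offset:offset + MAX_PAYLOAD]
        header = struct.pack("<BHB", stream_id, seq, len(payload))
        packets.append(header + payload)
    return packets

if __name__ == "__main__":
    fake_encoded_frames = bytes(range(150))        # stand-in for encoded audio data
    for pkt in pack_audio(fake_encoded_frames, stream_id=1):
        stream_id, seq, length = struct.unpack("<BHB", pkt[:4])
        print(f"stream={stream_id} seq={seq} bytes={length}")
```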
In a current Bluetooth™ piconet implementation, up to seven glasses 112 may be present that are separately linked with display system 102 (there may be eight total members of the piconet including seven glasses 112 and display system 102), but in other embodiments, additional glasses 112 may be present.
In another embodiment, separate communication channels may be established between Bluetooth™ transmitter 402 and Bluetooth™ receiver 404a and between Bluetooth™ transmitter 402 and Bluetooth™ receiver 404b. Such an embodiment may be used when glasses 112a and 112b may display different content to their wearers, and thus, different audio content is transmitted to glasses 112a and 112b by display system 102 over the separate communication channels.
For example, FIG. 5 shows an embodiment in which Bluetooth™ transmitter 402 communicates with Bluetooth™ receiver 404a over first and second communication channels 502 and 504, and communicates with Bluetooth™ receiver 404b over third and fourth communication channels 506 and 508.
Bluetooth™ transmitter 402 transmits a first synchronization signal across communication channel 502 that includes the shutter glass information for the wearer of first glasses 112a, and transmits a first audio content signal across communication channel 504 that includes the audio content associated with the content being viewed by the wearer of first glasses 112a. The audio content may be transmitted as native A2DP data, as SBC (subband codec) encoded data that is inserted into HID (human interface device) packets, or may be transmitted in other form. The first audio content signal and the first synchronization signal are received by Bluetooth™ receiver 404a. Glasses 112a may use a frame sync signal included in the first synchronization signal to synchronize the blocking and passing of light 110 by the right and left shuttering lenses of glasses 112a to enable the corresponding images displayed by display system 102 to be perceived as three-dimensional or two-dimensional by the wearer of glasses 112a. Furthermore, glasses 112a extract the audio content from the first audio content signal, and play the extracted audio content to the wearer of glasses 112a.
Likewise, Bluetooth™ transmitter 402 transmits a second audio content signal across communication channel 508 that includes the audio content associated with the content being viewed by the wearer of second glasses 112b, and transmits a second synchronization signal across communication channel 506 that is associated with the content being viewed by the wearer of second glasses 112b. The second audio content signal and the second synchronization signal are received by Bluetooth™ receiver 404b. Glasses 112b may use a frame sync signal included in the second synchronization signal to synchronize the blocking and passing of light 110 by the right and left shuttering lenses of glasses 112b to enable the corresponding images displayed by display system 102 to be perceived as three-dimensional or two-dimensional by the wearer of glasses 112b. Furthermore, glasses 112b extract the audio content from the second audio content signal, and play the extracted audio content to the wearer of glasses 112b.
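For purposes of illustration only, the following Python sketch shows one possible transmitter-side bookkeeping structure that associates each pair of glasses with its own synchronization channel and audio channel. The dictionary and the send callback are hypothetical stand-ins for this sketch, not an actual Bluetooth™ API; the channel numbers simply mirror reference numerals 502-508 for readability.

```python
# Illustrative transmitter-side routing table mapping each pair of glasses to
# its own synchronization channel and audio channel.

CHANNEL_MAP = {
    "glasses_112a": {"sync_channel": 502, "audio_channel": 504},
    "glasses_112b": {"sync_channel": 506, "audio_channel": 508},
}

def route(glasses_id, frame_sync_message, audio_payload, send):
    """Send the sync message and audio payload on the channels assigned to one pair of glasses."""
    channels = CHANNEL_MAP[glasses_id]
    send(channels["sync_channel"], frame_sync_message)
    send(channels["audio_channel"], audio_payload)

if __name__ == "__main__":
    route("glasses_112a", b"SYNC-A", b"AUDIO-A",
          send=lambda ch, data: print(f"channel {ch}: {data!r}"))
    route("glasses_112b", b"SYNC-B", b"AUDIO-B",
          send=lambda ch, data: print(f"channel {ch}: {data!r}"))
```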
Thus, through the use of wearable devices such as glasses 112, three-dimensional content may be provided to one or more users who view a display device of a display system. For instance, FIG. 6 shows a flowchart 600 of a method in a wearable device for providing a three-dimensional view and associated audio to a wearer, according to an example embodiment.
Flowchart 600 begins with step 602. In step 602, a wearable device joins a device network as a slave device. For example, glasses 112a may join a Bluetooth™ piconet in which display system 102 operates as the master device, such that glasses 112a operate as a slave device of the piconet.
In step 604, a frame sync signal and audio content are received from a display system, the audio content being associated with three-dimensional image content delivered to the slave device as alternating left and right images displayed by the display system. For example, as described above, receiver 404a of glasses 112a may receive a frame sync signal over first communication channel 502, and may receive audio content over second communication channel 504. The audio content received over second communication channel 504 is associated with three-dimensional image content displayed by display system 102 as alternating left and right images.
In step 606, audio based on the received audio content is played using at least one earphone. For example, as described above, one or both earphones of glasses 112a may play audio based on the audio content received in second communication channel 504.
In step 608, a left eye shuttering lens and a right eye shuttering lens are shuttered in synchronism with the alternating left and right images according to the frame sync signal to enable a wearer of the slave device to perceive the alternating left and right images as a three-dimensional image. For example, as described above, a left eye shuttering lens and a right eye shuttering lens of glasses 112a may be shuttered in synchronism with the alternating left and right images displayed by display system 102. The frame sync signal provided over first communication channel 502 causes the left eye shuttering lens to be open and the right eye shuttering lens to be closed when the left image is displayed by display system 102, and causes the left eye shuttering lens to be closed and the right eye shuttering lens to be open when the right image is displayed by display system 102.
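As a non-limiting illustration, the following Python sketch organizes steps 602-608 of flowchart 600 as they might be arranged in software on a wearable device. The message format and the join/play/shutter placeholders are assumptions made for this sketch rather than required interfaces.

```python
# Illustrative organization of the steps of flowchart 600 on a wearable device.
# The message tuples and the placeholder functions stand in for actual radio,
# audio, and lens-driver hardware interfaces.

def set_lenses(current_image):
    # Step 608: open only the lens matching the currently displayed image.
    return {"left_open": current_image == "left", "right_open": current_image == "right"}

def run_wearable_device(join_network, messages):
    join_network()                                        # step 602: join network as a slave
    for frame_sync, audio_chunk in messages:              # step 604: receive sync + audio
        print("playing", len(audio_chunk), "audio bytes") # step 606: play on earphone(s)
        print("lens state:", set_lenses(frame_sync))      # step 608: shutter the lenses

if __name__ == "__main__":
    demo_messages = [("left", b"\x00" * 128), ("right", b"\x01" * 128)]
    run_wearable_device(join_network=lambda: print("joined piconet as slave"),
                        messages=demo_messages)
```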
Thus, a first user who wears glasses 112a may be delivered three-dimensional content by display system 102. As shown in FIG. 7, flowchart 700 provides a method for delivering second three-dimensional content and associated second audio to a wearer of a second wearable device, according to an example embodiment.
Flowchart 700 begins with step 702. In step 702, a second wearable device joins the device network as a second slave device. For example, similarly to glasses 112a, glasses 112b may join the Bluetooth™ piconet in which display system 102 operates as the master device, such that glasses 112b operate as a second slave device of the piconet.
In step 704, a second frame sync signal and second audio content are received from a display system, the second audio content being associated with second three-dimensional image content delivered to the second slave device as second alternating left and right images displayed by the display system. For example, as described above, receiver 404b of glasses 112b may receive the second frame sync signal over third communication channel 506, and may receive the second audio content over fourth communication channel 508. The second audio content received over fourth communication channel 508 is associated with second three-dimensional image content displayed by display system 102 as a second set of alternating left and right images.
In step 706, second audio based on the received second audio content is played using at least one earphone of the second wearable device. For example, as described above, one or both earphones of glasses 112b may play second audio based on the second audio content received in fourth communication channel 508.
In step 708, a second left eye shuttering lens and a second right eye shuttering lens are shuttered in synchronism with the second alternating left and right images according to the second frame sync signal to enable a second wearer of the second slave device to perceive the second alternating left and right images as a second three-dimensional image. For example, as described above, a left eye shuttering lens and a right eye shuttering lens of glasses 112b may be shuttered in synchronism with the second alternating left and right images displayed by display system 102. The second frame sync signal provided over third communication channel 506 causes the left eye shuttering lens to be open and the right eye shuttering lens to be closed when the second left image is displayed by display system 102, and causes the left eye shuttering lens to be closed and the right eye shuttering lens to be open when the second right image is displayed by display system 102.
In this manner, two users can simultaneously be delivered independent three-dimensional content by display system 102. Display system 102 displays a pair of right and left images for each user that are interleaved with each other, each pair of right and left images corresponding to a three-dimensional image. Furthermore, display system 102 streams audio content for each user that is associated with the corresponding three-dimensional image viewed by the user. Further users may additionally be delivered independent three-dimensional content by display system 102 in a similar manner (e.g., as described in flowchart 700 for the second user). The number of users that are enabled to be delivered independent three-dimensional content by display system 102 may be limited by a number of slave devices that can join the device network, if a limit exists.
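As a worked example with hypothetical numbers (a 240 Hz panel is assumed here and is not required by any embodiment), the following short Python snippet shows how the effective refresh rate of each interleaved image stream relates to the panel rate when two viewers each receive a left/right image pair.

```python
# Worked example with hypothetical numbers: at a 240 frames-per-second panel
# rate, two viewers each receiving a left/right image pair yields four
# interleaved image streams, each refreshed at 60 Hz.

panel_rate_hz = 240          # hypothetical panel refresh rate
viewers = 2
streams = viewers * 2        # one left and one right image stream per viewer
per_stream_rate = panel_rate_hz / streams
print(f"{streams} interleaved streams at {per_stream_rate:.0f} Hz each")
```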
Glasses 112a and 112b and display system 102 may be configured in various ways to perform their respective functions. For instance, FIG. 8 shows a block diagram of display system 102 communicating with glasses 860, which are an example of glasses 112a and 112b, according to an embodiment. In the embodiment of FIG. 8, display system 102 includes a display panel 802, a frame rate controller 804, an interface module 806, and a communication module 808, and glasses 860 include a communication module 822, a decoder 820, a drive circuit 816, a left shuttering lens 824, a right shuttering lens 826, and one or more earphones.
Frame rate controller 804 receives image content signal 832. Frame rate controller 804 generates a frame sync indicator 836 and image content 834 based on image content signal 832. Frame rate controller 804 provides image content 834 to display panel 802 in a manner such that display panel 802 displays a sequence of images associated with one or more three-dimensional views provided to viewers. For example, if a single three-dimensional view, formed by alternating display of a left image and a right image, is being provided to a single pair of glasses 860, frame rate controller 804 provides a sequence of the alternating left and right images to display panel 802 for display in a manner that is synchronized with frame sync indicator 836. Frame sync indicator 836 indicates a timing of the alternate display of the left and right images. If a pair of three-dimensional views, formed by alternating display of a first left image, a first right image, a second left image, and a second right image, is being provided to first and second glasses 860 (e.g., glasses 112a and 112b), frame rate controller 804 provides a sequence of the alternating first left, first right, second left, and second right images to display panel 802 for display, and indicates the timing of the sequential display of the first left image, the first right image, the second left image, and the second right image in frame sync indicator 836. Additional three-dimensional views may be handled in a similar manner.
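For purposes of illustration only, the following Python sketch models the interleaved sequencing described above for one or more three-dimensional views. The slot labels and image placeholders are hypothetical; they stand in for the timing indications of frame sync indicator 836 and for the actual frame data.

```python
# Illustrative sequencing of interleaved left/right images for one or more
# three-dimensional views, as might be performed by a frame rate controller.

from itertools import cycle

def frame_sequence(views):
    """views: list of (left_image, right_image) pairs, one per three-dimensional view."""
    slots = []
    for view_index, (left, right) in enumerate(views):
        slots.append((f"view{view_index}_left", left))
        slots.append((f"view{view_index}_right", right))
    # Repeat the interleaved slot order indefinitely; the slot label plays the
    # role of a frame sync indication telling glasses which lens (if any) to open.
    return cycle(slots)

if __name__ == "__main__":
    seq = frame_sequence([("L1", "R1"), ("L2", "R2")])   # two three-dimensional views
    for _ in range(8):
        sync_indication, image = next(seq)
        print(sync_indication, image)
```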
Note that display panel 802 may be implemented in any suitable manner, including as a liquid crystal display (LCD) panel, a plasma display, etc.
Furthermore, note that frame rate controller 804 can provide two (or more) two-dimensional images for display by display panel 802 to two (or more) viewers. If a pair of different views, formed by alternating display of a first image and a second image, is being provided to first and second glasses 860 (e.g., glasses 112a and 112b), frame rate controller 804 provides a sequence of the alternating first and second images to display panel 802 for display, and frame sync indicator 836 indicates the timing of the sequential display of the first and second images. Additional views may be handled in a similar manner. Still further, frame rate controller 804 can enable display panel 802 to display one or more two-dimensional images interleaved with one or more three-dimensional images, in a similar manner. Frame rate controller 804 may be implemented in hardware, software, firmware, or any combination thereof, including in analog circuitry, digital logic, software/firmware executing in one or more processors, and/or other form.
Frame sync indicator 836 may be used to enable the passing and blocking of light by the shuttering lenses of wearable devices to be synchronized with the corresponding images displayed by display panel 802. As shown in FIG. 8, communication module 808 receives frame sync indicator 836 from frame rate controller 804, and may capture a clock value corresponding to each timing indication of frame sync indicator 836.
Communication module 808 also receives audio content 854 from interface module 806. Communication module 808 transmits the captured clock value and/or other indication of frame sync indicator 836 in a first communication signal 838, and the audio content of audio content 854 that is associated with the image content delivered to glasses 860 in a second communication signal 856 (e.g., similarly to first and second communication channels 502 and 504 of FIG. 5).
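As a non-limiting illustration, the following Python sketch shows one possible way a clock value could be captured when a frame sync indication occurs and packaged into a message for transmission to the glasses. The clock source and the message fields are assumptions made for this sketch.

```python
# Illustrative capture of a clock value at a frame sync indication, so that the
# timing can be communicated to the glasses along with the audio content.

import time

def build_sync_message(frame_label):
    captured_clock = time.monotonic_ns()        # capture a clock value at the sync event
    return {"clock": captured_clock, "frame": frame_label}

if __name__ == "__main__":
    # One message per frame sync indication; the glasses can use the captured
    # clock values to align lens shuttering with the displayed images.
    print(build_sync_message("left"))
    print(build_sync_message("right"))
```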
As shown in FIG. 8, communication module 822 of glasses 860 receives first communication signal 838 and second communication signal 856. Based on the captured clock value and/or other indication of frame sync indicator 836 received in first communication signal 838, communication module 822 generates a frame sync signal that is provided to drive circuit 816.
Drive circuit 816 drives left and right shuttering lenses 824 and 826 according to the frame sync signal to block or pass light received from display panel 802 timed with the corresponding left and right images desired to be delivered to the wearer of glasses 860. Left and right shuttering lenses 824 and 826 may be any type of shuttering lenses, including liquid crystal shutters, etc. As shown in FIG. 8, drive circuit 816 outputs drive signals to left and right shuttering lenses 824 and 826 to control the blocking and passing of light.
Furthermore, communication module 822 outputs a received audio content signal 840 that includes the audio content received in second communication signal 856. Decoder 820 receives audio content signal 840. Decoder 820 (e.g., an audio codec) may be present to decode the received audio content (e.g., according to an audio standard) to generate decoded audio data, and may convert the decoded audio data from digital to analog form (e.g., converted to an analog audio signal using a digital-to-analog converter). As shown in FIG. 8, the resulting audio signal is provided to one or more earphones of glasses 860 to be played to the wearer.
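For purposes of illustration only, the following Python sketch outlines the receive-side audio path of decoding received audio content and converting it toward an analog output. The trivial byte-per-sample "decoder" and the scaling step are placeholders for this sketch, not an actual audio codec or digital-to-analog converter.

```python
# Illustrative receive-side audio path: extract encoded audio, decode it to PCM
# samples, and convert the samples to an analog-style output level.

def decode_to_pcm(encoded_bytes):
    # Placeholder decode: treat each byte as one unsigned 8-bit sample.
    return list(encoded_bytes)

def to_analog(pcm_samples, full_scale_volts=1.0):
    # Placeholder digital-to-analog step: scale 8-bit samples to a voltage.
    return [full_scale_volts * s / 255.0 for s in pcm_samples]

if __name__ == "__main__":
    received_audio_content = bytes([0, 64, 128, 255])
    pcm = decode_to_pcm(received_audio_content)
    print(to_analog(pcm))   # values driven toward the earphone(s)
```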
Note that the embodiment of glasses 860 shown in FIG. 8 is provided for purposes of illustration, and that glasses 860 may be configured in other ways in further embodiments.
In an embodiment, display panel 802 may include an array of pixels (e.g., an array of LCD pixels, LED pixels, etc.). FIG. 9 shows an example timing diagram relating the images displayed by the array of pixels of display panel 802 to the synchronized operation of glasses 860. A first row 902, a third row 906, and a fourth row 908 of FIG. 9 illustrate example timing relationships, over successive time intervals, between the right and left image data with which display panel 802 is illuminated and the corresponding shuttering performed by glasses 860. The sequence illustrated in FIG. 9 may be repeated to deliver successive three-dimensional images to the wearer of glasses 860.
As described above, if additional three-dimensional views are delivered to wearers of additional glasses 860, display panel 802 may be illuminated with corresponding right and left image data for the additional glasses 860 in an interleaved sequence with the right and left image data for the first glasses 860, and the shuttering of each additional glasses 860 may be synchronized with its own image data in a corresponding manner.
Embodiments provide advantages. For instance, two Bluetooth™ enabled functions may be combined in a single device: the frame sync signal functionality and the audio content functionality may be included together in a pair of 3D-enabled glasses. This enables a more user-friendly experience in that separate glasses (that enable video) and a separate headset (that provides audio) are not needed. Instead, a combination device can be worn by the user that both enables video and provides audio. Furthermore, the number of members that may be included in a piconet is preserved. Because a Bluetooth™ piconet can support eight endpoints, reducing a 3D glasses-plus-headset combination (two Bluetooth™ piconet member devices) to a single Bluetooth™ piconet member device is advantageous, as it preserves space in the Bluetooth™ piconet for further members (e.g., for a remote control function, for other 3D glasses/headset endpoints, and/or for other potential Bluetooth™ endpoints).
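As a worked example, the following short Python snippet shows the piconet-membership arithmetic behind this advantage, assuming one master slot for the display system and seven slave slots for wearable devices.

```python
# Worked example: with eight piconet members total (one master plus up to seven
# slaves), combining glasses and headset into one device increases the number
# of viewers that can be served compared with using separate devices.

total_members = 8
slave_slots = total_members - 1                      # display system occupies the master slot
viewers_with_separate_devices = slave_slots // 2     # glasses + headset each take a slot
viewers_with_combined_devices = slave_slots          # one combined device per viewer
print(viewers_with_separate_devices, "viewers with separate glasses and headset")  # 3
print(viewers_with_combined_devices, "viewers with combined glasses/headset")      # 7
```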
While various embodiments of the present invention have been described above, it should be understood that they have been presented by way of example only, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made therein without departing from the spirit and scope of the invention. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.
This application claims the benefit of U.S. Provisional Application No. 61/360,070, filed on Jun. 30, 2010, which is incorporated by reference herein in its entirety.