The disclosure relates generally to media display systems, and more particularly to systems that display multiple video streams on a single monitor, where each video stream is rendered for a different viewer.
Video and television systems are a mature technology. Recently, manufacturers have begun to expand the functionality of such systems in order to provide viewers with a richer video experience. Recent advances include high definition and digital video display. More recently, advances in rendering speed and frame rate have allowed virtual three dimensional (3D) rendering of video in conjunction with corresponding eye wear or viewing devices. 3D technology has existed for some time in cinema in a passively viewed form, using either different colored filters for each eye or different polarizations for each eye, where the viewed video is a composite showing the video for both eyes at the same time. Because each eye views the composite through a different filter, each eye perceives the video differently, allowing for the appearance of depth in the two dimensional display. There are problems with passive filtering, however. Color filtering distorts the coloring of the video content. Polarization filtering can provide better suppression and does not distort the color of the video, but it requires the viewer to maintain head alignment; as the viewer tilts their head away from horizontal alignment with the video, the suppression effect of the polarizing filters loses effectiveness.
More recently, in order to provide the appearance of depth, and leveraging the increased frame rendering or refresh rate of video monitors, active 3D viewing has been employed in which each eye is alternately shuttered and frames for each eye are rendered alternately in correspondence with the shuttering. The rendering and shuttering are performed at a rate that is not perceptible to the viewer, taking advantage of the natural persistence of vision when the human eye is exposed to an image.
However, even present active shuttering systems render only one video stream. Anyone viewing the video, with or without the active shuttering eye wear, can see only the video that has been selected for rendering. In some situations, however, it may be desirable for different people to see different video content displayed on the same monitor.
An embodiment includes a method for displaying multiple video streams on a single display where each video stream is intended to be seen by a different viewer. The method includes rendering frames of a first video stream in a first alternating time division of a video frame sequence on a video display monitor, and rendering frames of a second video stream in a second alternating time division of the video frame sequence on the video display monitor. The method further includes synchronizing a first actively shuttered viewing device with the first alternating time division of the video frame sequence such that a user of the first actively shuttered viewing device can only see the first video stream and not the second video stream, and synchronizing a second actively shuttered viewing device with the second alternating time division of the video frame sequence such that a user of the second actively shuttered viewing device can only see the second video stream and not the first video stream.
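For illustration only, the following is a minimal sketch (in Python, not part of the disclosed embodiments) of the alternating rendering and synchronization just described; the function and variable names are hypothetical.

```python
from typing import Iterable, Iterator, Tuple

def interleave_streams(first_stream: Iterable[str],
                       second_stream: Iterable[str]) -> Iterator[Tuple[str, int]]:
    """Yield (frame, sync_state) pairs in alternating time divisions.

    sync_state is 0 while a frame of the first video stream is rendered and
    1 while a frame of the second video stream is rendered. An actively
    shuttered viewing device synchronized to one state opens its lenses
    only during that time division, so its user sees only that stream.
    """
    for frame_one, frame_two in zip(first_stream, second_stream):
        yield frame_one, 0   # first alternating time division
        yield frame_two, 1   # second alternating time division

if __name__ == "__main__":
    ones = [f"ONE frame {i}" for i in range(3)]
    twos = [f"TWO frame {i}" for i in range(3)]
    for frame, sync in interleave_streams(ones, twos):
        viewer = "first" if sync == 0 else "second"
        print(f"render {frame!r}; sync={sync} -> {viewer} viewing device open")
```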
Another embodiment includes a video display system that includes a monitor operable to graphically render video frames, and a rendering engine operable to render frames of a first video stream and a second video stream on the monitor in alternating fashion, and provide a sync signal which changes state in correspondence with the alternating. The sync signal is provided to a first actively shuttered viewing device and a second actively shuttered viewing device, and the first actively shuttered viewing device uses the sync signal to allow a first viewer using the first actively shuttered viewing device to see the first video stream but not the second video stream. The second actively shuttered viewing device also uses the sync signal to allow a second viewer using the second actively shuttered viewing device to see the second video stream but not the first video stream.
Another embodiment includes a method for displaying video for two players of an interactive video game. The method includes generating a first video stream of a first game view for a first player using a first actively shuttered viewing device, and generating a second video stream of a second game view for a second player using a second actively shuttered viewing device. The method further includes alternately rendering frames of the first and second video streams on a monitor viewed by the first and second players through the first and second actively shuttered viewing devices, respectively. A sync signal is generated that alternately changes state in correspondence with rendering the frames and is transmitted to each of the first and second actively shuttered viewing devices. The first and second actively shuttered viewing devices utilize the sync signal to control shuttering such that the first player can see the first video stream but not the second video stream, and the second player can see the second video stream but not the first video stream.
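As a rough sketch only (the game console and its interfaces are not specified in this disclosure), the two-player variant can be pictured as a game loop that renders one view per player each tick and presents the two frames in alternating time divisions; all names below are hypothetical placeholders.

```python
def run_two_player_game(render_view, present_frame, advance, state, ticks=3):
    """Render a per-player game view each tick and present the two frames in
    alternating time divisions, toggling the sync state accordingly."""
    for _ in range(ticks):
        present_frame(render_view(state, player=1), sync_state=0)  # first player's shutters open
        present_frame(render_view(state, player=2), sync_state=1)  # second player's shutters open
        state = advance(state)

if __name__ == "__main__":
    run_two_player_game(
        render_view=lambda s, player: f"tick {s}: view for player {player}",
        present_frame=lambda frame, sync_state: print(f"sync={sync_state}: {frame}"),
        advance=lambda s: s + 1,
        state=0,
    )
```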
Another embodiment provides a method for presenting multiple video streams on a monitor to be viewed exclusive of each other by different viewers. The method includes rendering a first video stream on the monitor using a first optical effect such that a first viewer using a first viewing device having left and right lenses both optically configured in correspondence with the first optical effect can see the first video stream. While rendering the first video stream on the monitor, the method further includes rendering a second video stream on the monitor using a second optical effect such that a second viewer using a second viewing device having left and right lenses both optically configured in correspondence with the second optical effect can see the second video stream. The first and second optical effects are complementary such that the second video stream is substantially suppressed optically by the first viewing device, and the first video stream is substantially suppressed optically by the second viewing device.
In a further embodiment there is a video presentation system for displaying multiple exclusively viewable video streams on a single monitor. The system can include a first optical effect processor that applies a first optical effect to a first video stream. The first optical effect is such that a first viewer using a first viewing device having left and right lenses both optically configured in correspondence with the first optical effect can see the first video stream. The system also includes a second optical effect processor that applies a second optical effect to a second video stream. The second optical effect is such that a second viewer using a second viewing device having left and right lenses both optically configured in correspondence with the second optical effect can see the second video stream. The two video streams are combined by a stream combiner that combines the first and second video streams, including the first and second optical effects, into a composite signal. A rendering engine visually renders the composite signal on the monitor, including the first and second optical effects. The first and second optical effects are complementary such that the second video stream is substantially suppressed optically by the first viewing device, and the first video stream is substantially suppressed optically by the second viewing device.
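The following is a simplified sketch of this pipeline, assuming the optical effects can be represented as tags attached to each frame; the class and function names are illustrative assumptions, not an actual implementation of the described processors, combiner, or rendering engine.

```python
from dataclasses import dataclass
from typing import Iterable, Iterator, Tuple

@dataclass
class OpticalEffectProcessor:
    """Tags each frame of a stream with an optical effect (e.g. a polarization)."""
    effect: str

    def apply(self, stream: Iterable[str]) -> Iterator[Tuple[str, str]]:
        for frame in stream:
            yield frame, self.effect

def combine(stream_one, stream_two):
    """Stream combiner: merge the two effect-tagged streams into a composite
    sequence that a rendering engine would draw on the monitor."""
    for pair in zip(stream_one, stream_two):
        yield pair   # the composite carries both optical effects

if __name__ == "__main__":
    first = OpticalEffectProcessor("0-degree polarization").apply(["A0", "A1"])
    second = OpticalEffectProcessor("90-degree polarization").apply(["B0", "B1"])
    for composite in combine(first, second):
        print("render composite:", composite)
```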
There are shown in the drawings, embodiments which are presently preferred, it being understood, however, that the invention is not limited to the precise arrangements and instrumentalities shown, nor that the presently preferred embodiments limit the scope of the subject matter disclosed herein to only those embodiments.
While the specification concludes with claims defining features that are regarded as novel, it is believed that the claims will be better understood from a consideration of the description in conjunction with the drawings. As required, detailed embodiments are disclosed herein; however, it is to be understood that the disclosed embodiments are merely exemplary and can be embodied in various forms. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the claims in virtually any appropriately detailed structure. Further, the terms and phrases used herein are not intended to be limiting but rather to provide an understandable description. Embodiments can include hardware implementations, software implementations, and implementations using both software and hardware.
Embodiments described herein allow different viewers to see different video streams or signals displayed on the same monitor in such a way that each viewer sees only his/her respective video stream, and is able to see it on the entire display area of the monitor instead of over a portion of the display area. Each viewer uses an actively shuttered viewing device to see his/her respective video rendering. Frames for each video stream are alternately rendered so that a frame from the first video stream is rendered, then a frame for the second video stream is rendered, alternating continuously. In conjunction with the alternate rendering, each actively shuttered viewing device comprises active lenses that can be optically opened and closed in conjunction with the alternate frame rendering so that the user sees only frames of their respective video stream but not frames of the other video stream.
Referring to the drawing, a frame 106 of a first video stream 102 is presented on a video display monitor 104. As used in the present disclosure, the term “frame” refers to a visually renderable discrete portion or unit of video information, or the visually perceivable rendering of that portion or unit. Frames can be sequentially rendered at discrete time intervals to produce a time-dependent view of visually perceptible content. The term “video stream” as used herein refers to the sequence of frames that can be rendered to produce the time-dependent view of the visually perceptible content. The frame 106, upon being rendered on the monitor 104, shows the word “ONE,” used here merely as an example. As used here, the term “monitor” refers to the device or apparatus that includes the physical surface on which video information is visually rendered so that it can be seen; it can include circuitry and other elements for rendering the video information graphically, and can further include audio components for producing acoustic signals in conjunction with the rendered video information. A first actively shuttered viewing device 112, which is synchronized with the first video stream 102, has both lenses open, and a user or viewer using the first actively shuttered viewing device 112 can see the image on the monitor through both lenses, as indicated by the appearance of the word “ONE” in each lens. A second actively shuttered viewing device 114, which is synchronized with a second video stream 110, has both lenses shut, indicated here by the darkened lenses, so that the viewer using the second actively shuttered viewing device 114 cannot see the image of the first video stream. As used herein, the term “actively shuttered viewing device” refers to a device having a lens, lenses, or viewing portals through which a person looks, wherein the lens or lenses are optically opened or closed. When a lens is “open,” the lens does not obstruct the person's view when looking through the lens. When the lens is “closed,” the person cannot see through the lens. The monitor 104 can be any device capable of rendering frames of the video streams, and can include a television, computer monitor, or any other similar device capable of rendering video information.
The second video stream 110, in the present example, shows the word “TWO.” The first actively shuttered viewing device 112, since it is synchronized with the first video stream 102, has its lenses closed (opaque) such that the viewer using it cannot see the monitor during the time that the frame of the second video stream 110 is rendered. The second actively shuttered viewing device 114, however, being synchronized with the second video stream 110, has its lenses open such that a viewer using that device can see the word “TWO” on the monitor, as indicated by the appearance of the word “TWO” in each lens of the second actively shuttered viewing device 114.
The actively shuttered viewing devices 112, 114 can use liquid crystal lenses that use a pane of liquid crystal material having transparent conductor layers on opposing sides of the lens for alternately activating and de-activating the liquid crystal material to cause the liquid crystal material to appear transparent or opaque. Each viewing device 112, 114 is synchronized to its corresponding video stream such that the viewer using the device 112, 114 can only see frames of the video stream with which the device 112, 114 is synchronized, and cannot see the frames of the other video stream. As a result, the viewers using devices 112, 114 see different video streams on the entire display of the monitor 104.
A frame sequence 115 illustrates a method for operating the system 100. Generally, there is a continuing sequence of frames of each video stream alternately being rendered into viewable form on the monitor 104. Each frame is rendered in one of two alternating time divisions 118, 120 of a frame cycle 116. Each alternating time division 118, 120 occupies substantially half of each frame cycle 116. Each successive frame of a given video stream can change from the previous frame so that motion, movement, and other visual dynamics can be perceived by a viewer. A video stream, as used here, is defined as a received signal containing video information or data that is displayable. The signal can be received from any of a variety of signal sources, including local sources such as a video media player or a video game system, as well as remote sources such as video content hosted on the Internet or video signals received from broadcast or other commercial sources such as community antenna television (CATV).
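As a simple illustration (the 60 Hz frame cycle below is an assumed value, not drawn from the disclosure), the relationship between elapsed time, the frame cycle 116, and the two alternating time divisions 118, 120 might be modeled as follows.

```python
FRAME_CYCLE_S = 1.0 / 60.0   # assumed frame cycle duration, for illustration only

def active_time_division(t_seconds: float) -> int:
    """Return 0 during the first time division (118) and 1 during the
    second time division (120); each occupies half of the frame cycle."""
    phase = (t_seconds % FRAME_CYCLE_S) / FRAME_CYCLE_S
    return 0 if phase < 0.5 else 1

if __name__ == "__main__":
    for t in (0.000, 0.004, 0.009, 0.013, 0.017):
        division = active_time_division(t)
        print(f"t={t:.3f}s -> time division {division}: stream {division + 1} frame rendered")
```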
It is contemplated that the system can also present an apparent stereoscopic (3D) view to each viewer by further dividing each viewer's portion of the frame cycle into left and right sub-frames, as described below.
In this example, for each viewing device 112, 114, each eye lens is alternately opened and closed once each frame cycle 214. A series of configurations 206, 208, 210, and 212 illustrate the process. In the first configuration 206, the left lens of the first viewing device 112 is open, while the right lens is closed. Accordingly, a user would see a left sub-frame image, as denoted by the “L1” in the left lens. Both lenses of the second viewing device 114 are closed at this time. In the next configuration 208, which occurs in a sub-frame time subsequent to configuration 206, the second viewing device 114 has its left lens open and its right lens closed, while both lenses of the first viewing device 112 are closed. Accordingly, a user of the second viewing device 114 would see an image with their left eye, as indicated by the “L2” in the left lens.
Configurations 210 and 212 then follow, opening the right lens for each viewing device 112, 114 in sequence. In configuration 210, the user of the first viewing device 112 sees the right sub-frame “R1,” while both lenses of the second viewing device 114 are closed. In configuration 212, the second viewing device 114 has its right lens open and its left lens closed so that the user sees the “R2” with the user's right eye. The configuration pattern can then repeat indefinitely, allowing each viewer to see a different video stream using the entire display of the monitor 104 in apparent stereoscopic presentation, without seeing the other video stream.
A method embodiment is illustrated in the frame sequence diagram 215. Over a given frame cycle 214, there is a left frame 216 and a right frame 218. Although shown here in a particular order, different orderings can be used, as will be appreciated by those skilled in the art. Each left frame 216 and right frame 218 is split into sub-frames 220, one for each of the different video streams. Each frame cycle presents a left image and a right image for each of the two viewing devices 112, 114. Each viewing device, in synchronization, controls its left and right lenses such that its user sees the left and right sub-frames with only the corresponding eye; when no sub-frame intended for a given viewing device is displayed, both of that device's lenses are closed.
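The four-phase sub-frame sequence just described can be summarized as a simple table, sketched below; the identifiers are illustrative only.

```python
# Each entry: (sub-frame rendered, open lens of device 112, open lens of device 114).
# None means both lenses of that viewing device are closed for the sub-frame.
SUBFRAME_SEQUENCE = [
    ("L1", "left",  None),     # configuration 206: first device, left eye
    ("L2", None,    "left"),   # configuration 208: second device, left eye
    ("R1", "right", None),     # configuration 210: first device, right eye
    ("R2", None,    "right"),  # configuration 212: second device, right eye
]

def shutter_states():
    """Yield the open lens (if any) of each viewing device for every
    sub-frame of one frame cycle 214."""
    for subframe, open_112, open_114 in SUBFRAME_SEQUENCE:
        yield subframe, {"device 112": open_112, "device 114": open_114}

if __name__ == "__main__":
    for subframe, states in shutter_states():
        print(subframe, states)
```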
A common means for controlling the optical state of liquid crystal is to use a square wave generator 302, which generates a square wave signal. The square wave signal is applied to, for example, the backplane 308 of each lens assembly. The square wave signal is also fed to a pair of switches 310, 312, which have complementary switch states, meaning that when one is closed the other is open, and vice versa. The switch states can be controlled by a sync control block 314. The sync control block 314 ensures that the switches 310, 312 operate in complementary states responsive to a sync signal 301. The square wave signal is applied to both the front plane 306 and the backplane 308, and the switches 310, 312 switch in or out an inverter 316, which inverts (or delays by a half cycle) the square wave signal so that the signal applied to the front plane 306 is inverted from that applied to the backplane 308. This creates a voltage differential that causes the liquid crystal material 304 to become opaque. The sync signal can be provided by a media controller and is timed in correspondence with the rendering of frames intended for the user of an actively shuttered viewing device incorporating or using such a circuit.
The sync signal 301 is timed so that when a rendered frame is to be seen by the user of the actively shuttered viewing device, switch 312 is open and switch 310 is closed, causing the front plane 306 and backplane 308 to have no substantial voltage differential between them, which causes the liquid crystal material to be transparent. When the rendered frame is not to be seen by the user of the actively shuttered viewing device, switch 310 is open and switch 312 is closed, causing the square wave signal applied to the front plane 306 to be inverted from that applied to the backplane 308, resulting in a voltage differential between the front plane 306 and the backplane 308, which causes the liquid crystal material to be opaque. These states are illustrated as a backplane signal 320 and a front plane signal 318. When the switches are at switch state “0” 322, switch 312 is open and switch 310 is closed, so the front plane signal 318 and backplane signal 320 are the same. At switch state “1” 324, responsive to the sync signal 301 changing state, switch 310 is open and switch 312 is closed, causing the square wave signal to pass through inverter 316 so that the front plane signal 318 is inverted relative to backplane signal 320.
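A small software model of this drive scheme (a sketch only, not circuit-accurate) helps illustrate the behavior: the backplane always receives the square wave, and the sync state determines whether the front plane receives the same wave (no differential, transparent) or the inverted wave (differential, opaque).

```python
def square_wave(n_samples: int, period: int = 4):
    """A unit-amplitude square wave sampled at integer time steps."""
    return [1 if (t % period) < period // 2 else 0 for t in range(n_samples)]

def lens_state(sync_state: int, backplane: list) -> str:
    # sync_state 0: switch 310 closed, front plane equals backplane -> transparent
    # sync_state 1: switch 312 closed, front plane is the inverted wave -> opaque
    frontplane = backplane if sync_state == 0 else [1 - v for v in backplane]
    differential = any(f != b for f, b in zip(frontplane, backplane))
    return "opaque" if differential else "transparent"

if __name__ == "__main__":
    bp = square_wave(8)
    for sync in (0, 1):
        print(f"sync state {sync}: lens is {lens_state(sync, bp)}")
```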
To close both lenses, sync signal 409 is asserted while mask signal 412 remains de-asserted, resulting in an inverted square wave signal being fed to both front planes 402. To close only the left lens, both the sync signal 409 and the mask signal 412 are asserted, causing an inverted square wave to be applied to the front plane 402 of the left lens, while the square wave signal fed to the front plane 402 of the right lens is delayed 360 degrees so that it remains in phase with the square wave signal fed to the backplanes 406, resulting in the right lens being open (transparent). To close only the right lens, the sync signal 409 is de-asserted and the mask signal 412 is asserted, causing the square wave signal applied to the front plane 402 of the right lens to be 180 degrees out of phase with the signal fed to the backplanes 406 and to the front plane 402 of the left lens.
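The sync/mask combinations described above amount to a small truth table, sketched here; the case in which both signals are de-asserted (both lenses open) is an assumption added for completeness, not stated explicitly above.

```python
def lens_states(sync_asserted: bool, mask_asserted: bool):
    """Return ('left', 'right') lens states as 'open' or 'closed'."""
    if sync_asserted and not mask_asserted:
        return "closed", "closed"   # both front planes driven out of phase with the backplanes
    if sync_asserted and mask_asserted:
        return "closed", "open"     # only the left front plane is inverted
    if not sync_asserted and mask_asserted:
        return "open", "closed"     # only the right front plane is inverted
    return "open", "open"           # assumed: no inversion, both lenses transparent

if __name__ == "__main__":
    for sync in (False, True):
        for mask in (False, True):
            left, right = lens_states(sync, mask)
            print(f"sync={sync}, mask={mask} -> left {left}, right {right}")
```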
Signals 512 corresponding to the first video stream source 502 can be transmitted to the first actively shuttered viewing device 112, and can include a sync signal, a mask signal, and audio signals that can be played over speakers 516 associated with the first actively shuttered viewing device 112. The mask signal is only required when the video content is active stereoscopic; otherwise, no mask signal needs to be present.
Likewise, signals 514 can be transmitted to the second actively shuttered viewing device 114 and include a sync and mask signal that are appropriately phase shifted relative to the sync and mask signal of signals 512, as well as audio signals to be played over speakers 518 associated with the second actively shuttered viewing device 114. The signals 512, 514 can be transmitted through wires or by wireless means. The sync and mask signals are changed by the rendering engine 508 in correspondence with the rendering of each frame.
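A rough sketch of the two signal bundles 512 and 514 (with hypothetical field names, and with the phase shift reduced to simple inversion for illustration) might look like the following.

```python
from dataclasses import dataclass

@dataclass
class DeviceSignals:
    sync: bool    # selects which time division opens this device's shutters
    mask: bool    # only required for active stereoscopic content
    audio: str    # audio associated with this viewer's video stream

def signal_bundles(first_sync: bool):
    """Signals 512 for the first device and signals 514 for the second;
    the second device's sync is phase shifted (here, simply inverted) relative
    to the first so the two devices never open in the same time division."""
    signals_512 = DeviceSignals(sync=first_sync, mask=False, audio="stream one audio")
    signals_514 = DeviceSignals(sync=not first_sync, mask=False, audio="stream two audio")
    return signals_512, signals_514

if __name__ == "__main__":
    for state in (False, True):
        print(signal_bundles(state))
```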
A game system such as a game console 602 executes a game program stored on machine readable storage media, such as a disk, drive, or memory cartridge, as is known. The game console 602 generates one or more video streams 610 portraying game play for an interactive video game. Each video stream 610 is transmitted to the monitor system 604 so that it can be rendered on a display of the monitor system 604. Furthermore, the game console 602 can be used for multi-player game play and can produce a separate video stream for each player. A first player can use the first actively shuttered viewing device 112, while a second player can use the second actively shuttered viewing device 114. Accordingly, the game console 602 generates first and second video streams 610 which, when rendered by the monitor system 604, portray game play for the first and second players and provide each player with a full view of their respective game play. Frames of the two video streams 610 are alternately rendered using methods substantially similar to those described above.
To facilitate synchronization of each of the viewing devices 112, 114, each viewing device 112, 114 can be associated with a game controller, such as first game controller 606 and second game controller 608. The association can be made via a link 612, 614, respectively, which can be a wired or wireless link. The links 612, 614 can simply indicate on which phase of the sync signal the respective viewing device 112, 114 is to trigger shuttering. For example, the monitor system 604 can include a frame multiplexer and rendering engine similar to those described above.
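The controller-to-device association can be pictured as assigning a sync phase to each viewing device, sketched below with hypothetical names; the mapping of controller number to phase is an assumption made for illustration.

```python
from dataclasses import dataclass

@dataclass
class ShutteredViewingDevice:
    name: str
    sync_phase: int = 0   # which phase of the shared sync signal opens the shutters

    def is_open(self, sync_state: int) -> bool:
        return sync_state == self.sync_phase

def associate_with_controller(device: ShutteredViewingDevice, controller_number: int) -> None:
    """Over link 612 or 614, tell the device which sync phase to trigger on."""
    device.sync_phase = 0 if controller_number == 1 else 1

if __name__ == "__main__":
    device_112 = ShutteredViewingDevice("viewing device 112")
    device_114 = ShutteredViewingDevice("viewing device 114")
    associate_with_controller(device_112, controller_number=1)
    associate_with_controller(device_114, controller_number=2)
    for sync_state in (0, 1):
        print(f"sync={sync_state}: 112 open={device_112.is_open(sync_state)}, "
              f"114 open={device_114.is_open(sync_state)}")
```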
Accordingly, when the viewing device 908 is moved to a different position, as in second view 910 where the viewing device 908 has moved to the left as indicated by arrow 912, the camera 704 detects the change of position by tracking movement of the optical markers 909 within its field of view. The first view 900 may be adjusted as the viewing device 908 is moved to provide the appearance of a virtual three dimensional view. Accordingly, the monitor 702 is treated as a window into virtual space 901, and that window is moved to the right relative to the virtual space 901 as indicated by arrow 916. Furthermore, first object 906 is moved farther to the right than second object 907 since first object 906 is in the foreground. Additionally, the perspective of first object 906 changes to provide a partial side view or isometric view of first object 906; the effect is similar to a slight rotation of the first object 906, as indicated by arrow 914. Third object 911, which was not viewable in first view 900, becomes partially viewable in second view 910 because of the change of perspective.
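A one-dimensional sketch of this head-tracked adjustment follows; the gain value and function name are assumptions for illustration and are not part of the disclosure.

```python
def adjust_virtual_window(marker_x: float, previous_marker_x: float,
                          window_x: float, parallax_gain: float = 1.0) -> float:
    """Shift the virtual 'window' opposite to the tracked head movement.

    When the optical markers move left (negative dx), the window into the
    virtual space 901 shifts right, so foreground objects translate farther
    than background objects when the view is re-rendered.
    """
    head_dx = marker_x - previous_marker_x
    return window_x - parallax_gain * head_dx

if __name__ == "__main__":
    # The viewing device's markers move 0.2 units to the left (arrow 912):
    # the window into the virtual space moves to the right (arrow 916).
    print(adjust_virtual_window(marker_x=-0.2, previous_marker_x=0.0, window_x=0.0))
```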
By using active shuttering techniques such as those described above, such a perspective-adjusted view can be rendered for each viewer independently, so that each viewer sees a view corresponding to his or her own position.
It is contemplated that when the optical effect used is polarization, the monitor may alternate between showing a frame of the first video stream using a first polarization and showing a frame of the second video stream using a second polarization. The effect is similar to active shuttering, but active shuttering is not required because the polarized viewing devices passively block out the frames of the other video stream. Switching from one polarization to an orthogonal polarization when switching from rendering a frame of the first video stream to rendering a frame of the second video stream means that the monitor only has to display in one polarization orientation at a time. In such an embodiment, the combiner 1110 can be a multiplexer that alternates between frames of the two video streams.
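In this polarization-alternating variant, the combiner 1110 acting as a multiplexer can be sketched as follows; the polarization tags are illustrative assumptions.

```python
def polarization_multiplex(first_stream, second_stream,
                           first_pol="0 degrees", second_pol="90 degrees"):
    """Interleave frames of the two streams, tagging each with one of two
    orthogonal polarizations so the monitor shows only one orientation at a time."""
    for frame_one, frame_two in zip(first_stream, second_stream):
        yield frame_one, first_pol
        yield frame_two, second_pol

if __name__ == "__main__":
    for frame, pol in polarization_multiplex(["A0", "A1"], ["B0", "B1"]):
        print(f"render {frame} with {pol} polarization")
```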
In another embodiment, it is contemplated that the monitor 1202 can be a projection screen where each of the video streams is projected using a different projector onto the same screen and where each projector uses a different polarization that is orthogonal to the other projector. In such an embodiment, the combiner 1110 can be an optical combiner, which combines the two projections into one projection. Alternatively, the screen can act as the combiner 1110 when both video streams, and their respective optical effects, are projected onto the screen, in which case the rendering engine 1112 is not applicable and the combiner 1110 and monitor 1114 are essentially the same entity.
This description can be embodied in other forms without departing from the spirit or essential attributes thereof. Accordingly, reference should be made to the following claims, rather than to the foregoing specification, as indicating the scope of the invention.