Recently, more and more movies have become available not only in a 2-dimensional (referred to hereinafter as “2D”) format, but also in a 3-dimensional (referred to hereinafter as “3D”) format. The demand for 3D content is not limited to cinema, but is also prevalent in the home viewing market. As a result, many Liquid Crystal Displays (referred to hereinafter as “LCDs”) available today are capable of displaying 3D images. Hologram and holography technology have been applied to flat-screen LCDs, as well as projection displays, in order to achieve the in-demand 3D effect. The demand for 3D content is also increasing in gaming applications. In the coming years, more and more games available on major game consoles, such as Nintendo, Xbox, and PlayStation, will be available in 3D. With the progress of sensing technology, some of the gaming consoles available today, such as the Kinect for Xbox 360, no longer require end users to use an input device. Instead, a camera sensing system is positioned in front of a display to detect the user's movement, and the movement is then interpreted and provided as input to the game application.
One potential challenge with a camera imaging system for 3D gaming applications may arise when the user moves into close proximity to the display. For example, in 2D gaming applications, most users typically remain in a relatively stationary position at least three or four feet from the screen. However, with 3D gaming applications, a user may experience images “popping out” from the screen and may respond to actions or events occurring “in” and “out” of the screen. With a 3D projection display, a user may therefore move to a position close to the screen. This may be challenging for sensing systems using a camera, because camera sensing systems typically require a user to be positioned at least a predetermined minimum distance from the camera.
Illustrative embodiments by way of examples, not by way of limitation, are illustrated in the drawings. Throughout the description and drawings, similar reference numbers may be used to identify similar elements.
The sensors 120 are connected to control circuits or a controller configured to detect movement of a user using correlation, spatial filtering or any other similar detection methods. Generally, in order to detect movement, the sensors 120 are positioned in a periodic manner, for example in an array form, as shown in
The sensors 120 may be configured to detect movement of a user, typically within approximately 5 feet of the holographic display system 100. The detection may be done by sensing the ambient light reflected from one or more external objects positioned in close proximity to the holographic display system 100. The ambient light may enter the holographic display system 100 through the holographic overlay 150 and may be coupled into the light guide 142. However, besides ambient light, the light emitted from the light source 130 may also fall onto the sensors 120, creating undesired crosstalk. One way to reduce crosstalk is to use sensors 120 that are sensitive only to a limited range of wavelengths outside the visible spectrum. In one embodiment, the sensors 120 may be sensitive primarily to near infrared light (750 nm to 950 nm), but remain relatively insensitive to visible light. For example, the output of the sensors 120 corresponding to light with a wavelength of 850 nm may be 100 times larger than the output corresponding to visible light with a wavelength of 550 nm. As the light emitted from the light source 130 is typically visible light, the crosstalk created by the light source 130 may be reduced significantly. As another method of reducing the sensitivity of the sensors 120 to visible light, a filter 125 may be applied to the sensors 120 to substantially block the visible light.
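By way of illustration only, the following sketch (in Python) estimates how little of such a near-infrared-selective sensor's output is crosstalk from the visible light source. Only the roughly 100:1 responsivity ratio is taken from the description above; the irradiance values are hypothetical assumptions.

    # Illustrative only: the 100:1 responsivity ratio comes from the description
    # above; the irradiance values below are hypothetical.
    def sensor_output(nir_irradiance, visible_irradiance,
                      nir_responsivity=1.0, visible_responsivity=0.01):
        """Output of a sensor that favors ~850 nm light over ~550 nm light."""
        return (nir_irradiance * nir_responsivity
                + visible_irradiance * visible_responsivity)

    # Even if the visible light from the light source is ten times stronger than
    # the reflected near infrared ambient light, it contributes only ~9% of the
    # total sensor output.
    signal = sensor_output(nir_irradiance=1.0, visible_irradiance=0.0)
    crosstalk = sensor_output(nir_irradiance=0.0, visible_irradiance=10.0)
    print(f"crosstalk fraction = {crosstalk / (signal + crosstalk):.2f}")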
In yet another embodiment, the crosstalk may be reduced by synchronizing the sensors 120 and the light source 130 such that light emission and light detection are carried out at different times. For example, the sensors 120 may be connected to a switched-capacitor circuit (not shown) so that light sensing is done during a first time interval, while the light source 130 may be configured to emit light during a second time interval that does not overlap with the first time interval. Because the light source 130 and the sensors 120 are synchronized, the sensors 120 may be configured to avoid sensing light emitted by the light source 130. The above crosstalk reduction techniques may be used independently or in any combination.
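As a non-limiting sketch of this time-multiplexing, the following Python code alternates an emission interval with a non-overlapping sensing interval. The LightSource and SensorArray classes and the interval durations are hypothetical stand-ins for the actual driver and readout circuits and are not part of the described embodiments.

    import time

    class LightSource:
        """Hypothetical stand-in for the driver of light source 130."""
        def on(self):
            print("light source: emitting")
        def off(self):
            print("light source: off")

    class SensorArray:
        """Hypothetical stand-in for the sensors 120 and their readout circuit."""
        def sample(self):
            print("sensors: sampling ambient light")
            return []  # placeholder for an array of photodiode readings

    def run_frames(light, sensors, emit_ms=12, sense_ms=4, frames=3):
        """Alternate a light-emission interval with a non-overlapping sensing interval."""
        for _ in range(frames):
            light.on()                    # emission interval: light source emits
            time.sleep(emit_ms / 1000)
            light.off()
            readings = sensors.sample()   # sensing interval: sensors detect ambient light only
            time.sleep(sense_ms / 1000)

    run_frames(LightSource(), SensorArray())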
The holographic overlay 150 may be a hologram, a holographic optical element, or a combination of the two. The holographic overlay 150 may be positioned such that it has a first appearance when the display screen 140 is viewed at a first angle, and a second appearance when the display screen 140 is viewed at a second angle. The holographic overlay 150, in combination with an LCD, is capable of producing a 3D image. The holographic overlay 150 may be incorporated directly into the stack structure of the display screen 140.
The holographic display system 100 may form a portion of a gaming system (not shown). A user of the gaming system may be located in close proximity to the holographic display system 100. For example, the user may be less than one foot from the holographic display system 100 while interacting with and responding to the gaming application. The plurality of sensors 120 may be distributed across a wide area of the holographic display system 100 to enable movement detection in such close proximity. In one embodiment, the sensors 120 may be arranged in an array form, and each sensor 120 may be positioned 3 cm away from a neighboring sensor 120.
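For purposes of illustration, the following sketch lays out such an array on a regular grid with a 3 cm pitch. The 60 cm × 34 cm display dimensions are assumed for the example only and are not taken from the embodiments described above.

    def sensor_grid(width_cm, height_cm, pitch_cm=3.0):
        """Return (x, y) centers, in cm, of sensors tiled across the display face."""
        xs = [i * pitch_cm for i in range(int(width_cm // pitch_cm) + 1)]
        ys = [j * pitch_cm for j in range(int(height_cm // pitch_cm) + 1)]
        return [(x, y) for y in ys for x in xs]

    # Assumed 60 cm x 34 cm display face (roughly a 27-inch panel): 21 x 12 = 252
    # sensors, so a hand less than one foot away still covers several sensors.
    grid = sensor_grid(60, 34)
    print(len(grid), "sensors")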
The holographic overlay 250 may be made from glass or may comprise a layer (not shown) made from an acrylic polymer material. The sensors 220 may be formed directly on the holographic overlay 250, as shown in
Typically, the photodiode pockets 321 are small and do not substantially obstruct optical transmission. For example, the size of the photodiode pockets 321 may be approximately 20 um×20 um×50 um. The depth of the photodiode pockets 321 may affect the sensitivity of the sensors 320. A silicon dioxide layer 351 may be formed above the photodiode pockets 321. The silicon dioxide layer 351 may define a dome shape to form a lens 324 for collimating light into the photodiode pockets 321. The photodiode pockets 321 and the lens 324 may be microscopic in dimension and invisible to the unaided human eye.
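As a rough illustration of why the pockets do not noticeably obstruct transmission, the following sketch combines the 20 um × 20 um pocket footprint with a 3 cm sensor pitch; the pitch is an assumption carried over from an earlier embodiment for the sake of the estimate.

    # Illustrative estimate: fraction of the display area occluded by the
    # 20 um x 20 um photodiode pocket openings, assuming one pocket per 3 cm
    # grid cell (the pitch is an assumption, not part of this embodiment).
    pocket_side_um = 20.0
    pitch_cm = 3.0

    pocket_area_m2 = (pocket_side_um * 1e-6) ** 2
    cell_area_m2 = (pitch_cm * 1e-2) ** 2

    fill_factor = pocket_area_m2 / cell_area_m2
    print(f"fraction of display area occluded = {fill_factor:.1e}")  # about 4.4e-07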
The sensors 420 may be positioned on the substrate 410. The substrate 410 may overlay at least a portion of the holographic screen 450. However, a substantial portion of the substrate 410 may define a hollow 480 allowing the modulated light beam 490 from the image projector 470 to pass through, so that images can be viewed on the holographic screen 450. In the embodiment shown in
Generally, the light source 430 may be configured to emit light, which may pass through collimators (not shown) and beam splitters (not shown) before reaching the LCD modulator panel 440. The LCD modulator panel 440 may be controlled by a display driver (not shown). The display driver (not shown) is configured to modulate the light beam 490 in accordance with the image that is being projected. The LCD modulator panel 440 may comprise M×N pixels. The modulated light may then pass through one or more mirrors (not shown) and one or more lenses (not shown) before exiting the image projector 470 as an image beam 490. The image beam 490 may then be incident on the holographic screen 450 through the hollow 480 of the substrate 410.
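Conceptually, the modulation performed by the display driver may be thought of as scaling the light from the light source 430 by a per-pixel transmittance taken from the image frame. The following minimal sketch illustrates this idea; the modulate function and the 2 × 3 frame are hypothetical and greatly simplified relative to an actual M×N panel driver.

    def modulate(light_intensity, frame):
        """Scale a uniform light intensity by per-pixel transmittances in [0, 1].

        frame is an M x N list of lists representing the image being projected;
        the return value stands in for the image beam.
        """
        return [[light_intensity * t for t in row] for row in frame]

    # Hypothetical 2 x 3 frame modulating a unit-intensity light beam.
    frame = [[0.0, 0.5, 1.0],
             [1.0, 0.5, 0.0]]
    print(modulate(1.0, frame))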
The sensors 520 in the holographic projection display system 500, as shown in the embodiment in
The sensors 520 may be configured to detect light from the surrounding environment by synchronizing the sensors 520 and the light source 530 of the image projector 570, as discussed in the previous embodiment. Thus, in a first time interval, when the light source 530 is configured to emit light, the sensors 520 may be configured to detect light from the light source 530. However, in a second time interval that does not overlap with the first time interval, the sensors 520 may be configured to detect light from the ambient environment in order to detect movement of a user.
In step 625, the light source of the holographic display and the sensors may be synchronized. This may be done through an image processor that is connected to the LCD panel in a flat-screen display or to the LCD modulator panel in a projection display. The image processor may be configured to send a synchronization signal to both the light source driver and the sensors. In step 630, movement of a user is detected with the sensors. For example, in a first time interval in which images are displayed, the light source may be configured to emit light. In a second time interval in which the light source is not turned on, the sensors may be configured to detect ambient light. Finally, in step 640, the movement of the user is computed by a controller carrying out a predetermined algorithm, such as correlation or spatial filtering.
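By way of example only, the following sketch shows one simple correlation-based computation of the kind step 640 may employ: two consecutive readings from a row of sensors are correlated at several lags, and the lag with the highest correlation is reported as the estimated lateral movement. The best_lag function and the sample readings are illustrative assumptions rather than the exact algorithm of the embodiments.

    def best_lag(previous, current, max_lag=3):
        """Return the sensor-index shift that best aligns current with previous."""
        best, best_score = 0, float("-inf")
        for lag in range(-max_lag, max_lag + 1):
            score = 0.0
            for i, value in enumerate(previous):
                j = i + lag
                if 0 <= j < len(current):
                    score += value * current[j]
            if score > best_score:
                best, best_score = lag, score
        return best

    # A bright reflection (e.g., a user's hand) moving two sensor positions to
    # the right between two consecutive sensing intervals.
    prev_frame = [0, 0, 5, 9, 5, 0, 0, 0, 0]
    curr_frame = [0, 0, 0, 0, 5, 9, 5, 0, 0]
    print("estimated shift:", best_lag(prev_frame, curr_frame), "sensor pitches")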
Although specific embodiments of the invention have been described and illustrated hereinabove, the invention should not be limited to the specific forms or arrangements of parts so described and illustrated. For example, the light source die described above may be an LED die or some other light source, as known or later developed, without departing from the spirit of the invention. The scope of the invention is to be defined by the claims appended hereto and their equivalents. Similarly, manufacturing embodiments and the steps thereof may be altered, combined, reordered, or otherwise modified, as is known in the art, to produce the results illustrated.