The present invention relates generally to projectors. More specifically, examples of the present invention are related to a projection system that projects visible and non-visible images.
Interest in three dimensional (3D) cameras is increasing as the popularity of 3D applications continues to grow in areas such as imaging, movies, games, computers, user interfaces, and the like. A typical passive way to create 3D images is to use multiple cameras to capture stereo or multiple images. Using the stereo images, objects in the images can be triangulated to create the 3D image. Another way to create 3D images is to use a time-of-flight camera. By measuring the time of flight of every point of an object in the image, a 3D image of the object can be created.
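For example, in a time-of-flight camera the depth of each point follows directly from the round-trip travel time of the emitted light. A minimal sketch of this relationship, with an illustrative timing value, is:

```python
# Sketch of time-of-flight depth recovery: depth is half the distance
# that light travels during the measured round trip.

C = 299_792_458.0  # speed of light, in meters per second

def tof_depth(round_trip_seconds: float) -> float:
    return C * round_trip_seconds / 2.0

# An illustrative 10 ns round trip corresponds to roughly 1.5 m of depth.
print(tof_depth(10e-9))
```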
Currently, for gesture recognition utilized in, for example, gaming and user interfaces, an infrared (IR) light pattern is typically projected onto an object. An IR camera detects the IR light pattern that is projected onto the object, and a computer then analyzes the detected IR light pattern on the object to determine the 3D position of the object. Typically, the structured light that is projected in 3D imaging applications includes a grating pattern. The grating pattern is projected onto a surface of the object. A profile of the surface at a position, e.g., a protrusion or a recess relative to a flat surface, can be calculated from the deformation of the projected grating at that position. The deformation typically relates to the shift or offset of the projected grating from its image if the surface were flat without a profile. Currently, gesture recognition systems may be utilized for a variety of applications, such as controlling a computer or a gaming system, as well as many other uses. Typically, a light source for gesture recognition is mounted on or near a display, such as for example on a television or a computer screen, to project the structured light at a user that is interacting with the television or computer screen.
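The profile calculation described above can be sketched with a simple projector-camera triangulation model. In the sketch below, the baseline, focal length, and reference depth are illustrative assumptions rather than values from this description; for small deviations, the observed shift of the grating is approximately proportional to the height of the surface point above the flat reference plane:

```python
# Sketch of recovering surface height from the observed shift of a
# projected grating. Geometry values are assumed for illustration.

def height_from_shift(shift_px: float, baseline_m: float,
                      focal_px: float, reference_depth_m: float) -> float:
    # For small heights h, the grating shift d observed by the camera
    # satisfies d ~ focal * baseline * h / reference_depth**2,
    # so h can be recovered by inverting that relation.
    return shift_px * reference_depth_m ** 2 / (focal_px * baseline_m)

# An illustrative 4 px shift, 75 mm baseline, 600 px focal length, and
# 1 m reference plane suggest a protrusion of roughly 0.09 m.
print(height_from_shift(4.0, 0.075, 600.0, 1.0))
```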
Non-limiting and non-exhaustive embodiments of the present invention are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified.
Corresponding reference characters indicate corresponding components throughout the several views of the drawings. Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of various embodiments of the present invention. Also, common but well-understood elements that are useful or necessary in a commercially feasible embodiment are often not depicted in order to facilitate a less obstructed view of these various embodiments of the present invention.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, to one having ordinary skill in the art that the specific details need not be employed to practice the present invention. In other instances, well-known materials or methods have not been described in detail in order to avoid obscuring the present invention.
Reference throughout this specification to “one embodiment”, “an embodiment”, “one example” or “an example” means that a particular feature, structure or characteristic described in connection with the embodiment or example is included in at least one embodiment of the present invention. Thus, appearances of the phrases “in one embodiment”, “in an embodiment”, “one example” or “an example” in various places throughout this specification are not necessarily all referring to the same embodiment or example. Furthermore, the particular features, structures or characteristics may be combined in any suitable combinations and/or subcombinations in one or more embodiments or examples. Particular features, structures or characteristics may be included in an integrated circuit, an electronic circuit, a combinational logic circuit, or other suitable components that provide the described functionality. In addition, it is appreciated that the figures provided herewith are for explanation purposes to persons ordinarily skilled in the art and that the drawings are not necessarily drawn to scale.
Examples in accordance with the teachings of the present invention describe a combined visible and non-visible projection system, which in one example combines an RGB visual projection and IR illumination system into a single projection system. Instead of using a separate RGB projection display with a separate IR pattern generator, examples in accordance with the teachings of the present invention combine an RGB projection display and an IR projection display, which can display on the same screen a still or video RGB image and an IR pattern that is independent of the RGB image. The RGB image is visible to a human, but the IR image is non-visible to a human, and can be used for gesture recognition when detected by an IR camera and analyzed by a computer. The combined visible and non-visible projection system enables both visual RGB color projection displays and IR based applications, such as for example gesture recognition and 3D mapping or printing, in accordance with the teachings of the present invention.
Accordingly, an example combined visible and non-visible projection system in accordance with the teachings of the present invention makes it possible to project on the same screen a visible image as well as dynamically changing, distinct IR patterns at different times based on conditions, rather than being limited to the projection of a static or fixed single IR pattern. Thus, an example combined visible and non-visible projection system features the ability to adjust the projected IR pattern dynamically to account for wavelength shifts, leading to better resolution, better accuracy, and simpler depth computations in accordance with the teachings of the present invention.
To illustrate, FIG. 2 shows an example projection display system 200 that projects both a visible image and a non-visible IR pattern in accordance with the teachings of the present invention.
As shown in the example depicted in FIG. 2, projection display system 200 projects a visible color image 208 onto a screen 220, which can be seen by a human viewer 206, along with a non-visible IR structured light pattern 214.
In the illustrated example, projection display system 200 also includes an IR camera 210 that can sense and detect the IR structured light pattern 214, which is not visible to human viewer 206. In one example, the detected IR structured light pattern 214 is provided to a computer 212 for processing and analysis. Since the IR structured light pattern 214 is invisible to the human viewer 206, the IR structured light pattern 214 may be independent of the visible color image 208 and can therefore be dynamically updated as needed in accordance with the teachings of the present invention. If the structured light pattern 214 is projected onto an object in front of the screen 220, such as for example a human hand 207, computer 212 can analyze the structured light pattern 214 as distorted by hand 207 and detected by IR camera 210, and then compute a 3D image of the object based on the distortions of the structured light pattern 214 caused by human hand 207, which can be used for gesture recognition or other purposes in accordance with the teachings of the present invention.
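One way computer 212 could carry out such an analysis, sketched below purely for illustration (the window size, search range, and input format are assumptions, not details from this description), is to estimate the local shift of the detected pattern against a stored reference image of the undistorted pattern; each local shift can then be converted to depth as outlined earlier:

```python
import numpy as np

# Sketch: estimate the local horizontal shift of the detected IR pattern
# relative to a reference image of the undistorted pattern. Both inputs
# are assumed to be float grayscale arrays of the same shape; the window
# size and search range are illustrative.

def shift_map(detected: np.ndarray, reference: np.ndarray,
              win: int = 16, max_shift: int = 8) -> np.ndarray:
    h, w = detected.shape
    shifts = np.zeros((h // win, w // win))
    for i in range(h // win):
        for j in range(w // win):
            patch = detected[i*win:(i+1)*win, j*win:(j+1)*win]
            best_s, best_err = 0, np.inf
            for s in range(-max_shift, max_shift + 1):
                x0 = j * win + s
                if x0 < 0 or x0 + win > w:
                    continue
                ref = reference[i*win:(i+1)*win, x0:x0 + win]
                err = np.sum((patch - ref) ** 2)  # sum of squared differences
                if err < best_err:
                    best_s, best_err = s, err
            shifts[i, j] = best_s
    return shifts
```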
In one example, IR structured light pattern 214 can also be changed dynamically, independent of the visible color image 208. For instance, in one example, the computer 212 can change the pitch and/or the orientation of the grating of structured light pattern 214 in response to a variety of conditions, such as the detected IR structured light pattern 214 as distorted, for example, by human hand 207, or in response to a particular scene of the color image 208, or any other condition, while the same projection of color image 208 is used. In another example, structured light pattern 214 may contain a non-uniform grating or any specific pattern generated by computer 212 in accordance with the teachings of the present invention.
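Generating a grating whose pitch and orientation are runtime parameters is straightforward; the following sketch (with an illustrative resolution and parameter values, not ones specified in this description) shows one way such a pattern could be synthesized on demand:

```python
import numpy as np

# Sketch: synthesize a binary grating whose pitch (in pixels) and
# orientation (in degrees) can be changed on the fly. Resolution and
# parameter values are illustrative.

def grating(width: int, height: int, pitch_px: float,
            angle_deg: float) -> np.ndarray:
    theta = np.radians(angle_deg)
    y, x = np.mgrid[0:height, 0:width]
    u = x * np.cos(theta) + y * np.sin(theta)  # distance along grating normal
    return ((u // (pitch_px / 2.0)) % 2).astype(np.uint8) * 255

# Switching to a finer or rotated pattern is just a parameter change:
frame_a = grating(1280, 720, pitch_px=12.0, angle_deg=0.0)
frame_b = grating(1280, 720, pitch_px=8.0, angle_deg=45.0)
```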
As shown in the example depicted in FIG. 3, the R laser beam from R laser 302A is reflected by a mirror 308, and combined with the G laser beam from G laser 302B by a dichroic mirror 310. The combined R and G laser beams are further combined with the B laser beam from B laser 302C by a dichroic mirror 312. In the example, the combined R, G, and B laser beams are combined with the IR laser beam from IR laser 302D by a dichroic mirror 314.
As shown in the example depicted in FIG. 4, another example projection display system includes R, G, and B light sources as well as an IR light source, which produce R, G, B, and IR light beams, respectively.
As shown in the depicted example, the R light beam is reflected by a mirror 404, and combined with the G light beam by a dichroic mirror 406. The combined R and G light beams are further combined with the B light beam by a dichroic mirror 408. The combined R, G, and B light beams are combined with the IR light beam by a dichroic mirror 410.
As shown in the example depicted in FIG. 4, the combined R, G, B, and IR light beams are directed to illuminate a digital micromirror device (DMD) 416, and the resulting images are projected onto a screen 420.
In one example, DMD 416 forms R, G, B, and IR images sequentially, which are synchronized with computer 212 switching the R, G, B, and IR beams. In this manner, a color image is generated on screen 420 overlapping with an invisible IR structured light pattern. A human viewer 206 will see the visible color image, and an IR camera 210 will detect the invisible IR structured light pattern, and therefore detect hand 207 in front of screen 420 in accordance with the teachings of the present invention.
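That sequential operation might be coordinated along the lines of the sketch below; the `light_sources` and `dmd` objects and their methods are hypothetical placeholders, since no control interface is specified in this description, and the field rate is likewise an assumption:

```python
import itertools
import time

# Sketch of field-sequential operation: cycle through the R, G, B, and
# IR subframes, enabling one light source while the matching image is
# loaded on the DMD. All objects and methods here are hypothetical.

FIELD_SECONDS = 1.0 / 240.0  # assumed: four subframes per 60 Hz frame

def run_field_sequential(light_sources, dmd, subframes):
    for channel in itertools.cycle(["R", "G", "B", "IR"]):
        dmd.load_image(subframes[channel])   # hypothetical driver call
        light_sources[channel].enable()      # hypothetical driver call
        time.sleep(FIELD_SECONDS)
        light_sources[channel].disable()     # hypothetical driver call
```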
In another example, it is appreciated that four separate DMDs 416 may be used, in which each respective DMD is used to generate an image of a different wavelength. In that example, each DMD is illuminated with an expanded beam, and the four images generated by the four DMDs are combined using dichroic mirrors, similar to the example discussed above, in accordance with the teachings of the present invention.
In one example, the light beams from R light source 502A, G light source 502B, B light source 502C, and IR laser 502D are collimated by lenses 504A, 504B, 504C, and 504D, respectively. The collimated R light beam is reflected by a mirror 506, and combined with the collimated G light beam by a dichroic mirror 508. The combined R and G light beams are further combined with the collimated B light beam by a dichroic mirror 510. The combined R, G, and B light beams are then combined with the collimated IR laser beam by a dichroic mirror 512.
As shown in the example depicted in FIG. 5A, the combined R, G, B, and IR light beams are directed to illuminate an LCOS display panel 516, and the resulting images are projected onto a screen 520.
In one example, LCOS display panel 516 forms the R, G, B, and IR images sequentially, which are synchronized with computer 212 switching the R, G, B, and IR beams. In this manner, a visible color image is generated on screen 520 overlapping with an invisible IR structured light pattern. A human viewer 206 will see the visible color image, and an IR camera 210 will detect the invisible IR structured light pattern, and therefore detect hand 207 in front of screen 520 in accordance with the teachings of the present invention.
Continuing with the example depicted in FIG. 5B, another example LCOS projection display system 501 is arranged similarly, with the combined R, G, B, and IR light beams directed to illuminate LCOS display panel 516 and the resulting images projected onto screen 520.
In one example, LCOS display panel 516 forms the R, G, B, and IR images sequentially, which are synchronized with computer 212 switching the R, G, B, and IR beams. In this manner, a visible color image is generated on screen 520 overlapping with an invisible IR structured light pattern. A human viewer 206 will see the visible color image, and an IR camera 210 will detect the invisible IR structured light pattern, and therefore detect hand 207 in front of screen 520 in accordance with the teachings of the present invention.
As shown in the depicted example, a red light beam from R light source 602A is collimated by a lens 604A. The collimated beam is reflected by a PBS 606A to illuminate R-LCOS 608A. A red image is formed after the light beam reflected by R-LCOS 608A passes through PBS 606A. Similarly, a green light beam from G light source 602B is collimated by a lens 604B. The collimated beam is reflected by a PBS 606B to illuminate G-LCOS 608B. A green image is formed after the light beam reflected by G-LCOS 608B passes through PBS 606B. A blue light beam from B light source 602C is collimated by a lens 604C. The collimated beam is reflected by a PBS 606C to illuminate B-LCOS 608C. A blue image is formed after the light beam reflected by B-LCOS 608C passes through PBS 606C. The red, green, and blue images may be combined by a dichroic combiner cube (X-cube) 610. An IR light beam from IR light source 602D is collimated by a lens 604D. The collimated beam is reflected by a PBS 606D to illuminate IR-LCOS 608D. An IR image is formed after the light beam reflected by IR-LCOS 608D passes through PBS 606D. The IR image is combined with the combination of red, green, and blue images provided by dichroic combiner cube 610 using a dichroic mirror 612.
As shown in the example, the combined red, green, blue, and IR images may be reflected by an optional mirror 614. A projection lens 616 projects the combined red, green, blue, and IR images on a screen 620. In this manner, a visible color image is generated on screen 620 overlapping with an invisible IR structured light pattern. A viewer 206 will see the visible color image, and an IR camera 210 will detect the invisible IR structured light pattern, and therefore detect hand 207 in front of screen 620 in accordance with the teachings of the present invention.
It is appreciated that in other examples, the example LCOS projection display systems 500, 501, and 600 discussed above in FIGS. 5A, 5B, and 6 may be varied in accordance with the teachings of the present invention.
It is further appreciated that in other examples, the IR structured light pattern discussed above may be replaced with a uniform IR pattern. Thus, an object, e.g., hand 207 in front of the screen, may be under uniform IR illumination in, for example, any of the projection display systems discussed above in accordance with the teachings of the present invention.
The above description of illustrated examples of the present invention, including what is described in the Abstract, is not intended to be exhaustive or to limit the invention to the precise forms disclosed. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes, various equivalent modifications are possible without departing from the broader spirit and scope of the present invention.
These modifications can be made to examples of the invention in light of the above detailed description. The terms used in the following claims should not be construed to limit the invention to the specific embodiments disclosed in the specification and the claims. Rather, the scope is to be determined entirely by the following claims, which are to be construed in accordance with established doctrines of claim interpretation. The present specification and figures are accordingly to be regarded as illustrative rather than restrictive.