Projectors are conventionally designed to project a visible image in which the projected light has frequencies within the visible spectrum. For instance, a conventional projector projects an image composed of multiple pixels, each emitted by a distinct pixel unit. Each pixel unit includes multiple light-emitting diodes (LEDs); for instance, a pixel unit might include a red LED, a green LED, and a blue LED. The emitted light passes through optics such that the visible image is focused on a surface at a distance from the projector. Projectors are not conventionally used to project images outside of the visible spectrum.
Embodiments described herein relate to a projector that projects a visible image as well as a non-visible image. The non-visible image might be used for any purpose, but an example is to provide depth information regarding physical item(s) interacting with the projected visible image.
The projector includes multiple projecting units (e.g., one for each pixel to be displayed). Each projecting unit includes light-emitting elements configured to emit light in the visible spectrum. Some or all of those projecting units might also include an emitting element for emitting light in the non-visible spectrum so as to collectively emit a non-visible image. Optics may be positioned to project the visible image and the non-visible image. The optics might include a portion that directs a reflected portion of the non-visible image (and perhaps also a reflected portion of the visible image) to a camera for capture of the reflected image. A depth sensing module detects depth of surfaces within the scope of the non-visible image using the reflected portion of that non-visible image.
This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
In order to describe the manner in which the above-recited and other advantages and features can be obtained, a more particular description of various embodiments will be rendered by reference to the appended drawings. Understanding that these drawings depict only sample embodiments and are not therefore to be considered to be limiting of the scope of the invention, the embodiments will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
The principles described herein relate to a projection system, or a projector, that projects a visible image as well as a non-visible image. The non-visible image might be used for any purpose, but an example is to provide depth information regarding physical item(s) interacting with the projected visible image.
The projection system includes multiple projecting units (e.g., one for each pixel to be displayed). Each projecting unit includes light-emitting elements configured to emit light in the visible spectrum. Some or all of those projecting units might also include an emitting element for emitting light in the non-visible spectrum so as to collectively emit a non-visible image. Optics may be positioned to project the visible image and the non-visible image. The optics might include a portion that directs a reflected portion of the non-visible image (and perhaps also a reflected portion of the visible image) to a camera for capture of the reflected image. A depth sensing module detects depth of surfaces within the scope of the non-visible image using the reflected portion of that non-visible image.
Each projecting unit has multiple light-emitting elements that are configured to emit light in the visible spectrum, that is, at electromagnetic wavelengths visible to the human eye. For instance, projecting unit 101A is illustrated as including light-emitting elements 102A in the form of two light-emitting elements 102Aa and 102Ab, although the ellipses 102Ac represent that the projecting unit 101A may include other numbers of light-emitting elements 102A that are also capable of emitting light in the visible spectrum.
The same is true of the other projecting units in the projecting units 101. For instance, projecting unit 101B is illustrated as including light-emitting elements 102B in the form of two light-emitting elements 102Ba and 102Bb, although the ellipses 102Bc represent that there may be other numbers of light-emitting elements 102B within the projecting unit 101B that are also capable of emitting light in the visible spectrum. Furthermore, projecting unit 101C is illustrated as including light-emitting elements 102C in the form of two light-emitting elements 102Ca and 102Cb, although the ellipses 102Cc represent that there may be other numbers of light-emitting elements 102C within the projecting unit 101C that are also capable of emitting light in the visible spectrum. The same may be true of the other projecting units 101 that are not illustrated and which are represented by the ellipses 101D.
In one embodiment, the light-emitting elements 102 within each of the projecting units 101 constitute a red light-emitting element, a green light-emitting element, and a blue light-emitting element. For instance, light-emitting element 102Aa of the projecting unit 101A might be a red Light-Emitting Diode (LED), the light-emitting element 102Ab of the projecting unit 101A might be a green LED, and another light-emitting element (represented by ellipses 102Ac) of the projecting unit 101A might be a blue LED.
Some or all of the projecting units 101 further comprise emitting elements 103 that emit light outside of the visible spectrum. The group of projecting units 101 that are capable of doing so is sometimes referred to herein as a “collection” of the projecting units 101. The collection could be all of the projecting units 101, or just a subset of the projecting units 101. The use of the term “collection” should not be construed as implying that such projecting units are physically grouped together, as the collection may be distributed in any manner amongst the total number of projecting units.
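By way of illustration only, the structure just described can be summarized in a short software sketch. The Python below is not part of any embodiment; the names EmittingElement, ProjectingUnit, and build_units, as well as the particular wavelengths, are hypothetical and are used only to show the relationship between the projecting units, their visible light-emitting elements, the optional non-visible emitting element, and the “collection.”

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class EmittingElement:
    """One light-emitting element (e.g., an LED) and its nominal wavelength in nanometers."""
    name: str
    wavelength_nm: float

@dataclass
class ProjectingUnit:
    """One projecting unit (one pixel): two or more visible light-emitting elements,
    plus an optional emitting element that emits outside of the visible spectrum."""
    visible_elements: List[EmittingElement]
    non_visible_element: Optional[EmittingElement] = None

def build_units(count: int, with_ir: bool = True) -> List[ProjectingUnit]:
    """Build a set of projecting units, each with red, green, and blue elements and,
    optionally, an infra-red element (wavelengths here are illustrative only)."""
    return [
        ProjectingUnit(
            visible_elements=[
                EmittingElement("red", 625.0),
                EmittingElement("green", 525.0),
                EmittingElement("blue", 465.0),
            ],
            non_visible_element=EmittingElement("infra-red", 850.0) if with_ir else None,
        )
        for _ in range(count)
    ]

units = build_units(640 * 480)
# The "collection" is simply the subset of units having a non-visible emitting element;
# it may be all of the units or any subset, distributed in any manner.
collection = [u for u in units if u.non_visible_element is not None]
```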
The light-emitting elements 102 thus collectively emit a visible image 151, represented abstractly as an arrow. Likewise, the collection of emitting elements 103 emits a non-visible image 152, represented by another arrow.
The method 200 includes emitting a portion of a visible image from each projecting unit to thereby generate the visible image (act 201), and emitting a portion of a non-visible image from each projecting unit in the collection to thereby generate the non-visible image (act 202).
The projection system 100 further includes optics 110 positioned to project the visible image 151 emitted by the projecting units 101, and also to project the non-visible image 152 emitted by the collection of projecting units 101 (i.e., the set of the projecting units 101 that includes an emitting element 103). The method 200 thus further includes projecting the visible image and the non-visible image through optics (act 203). The projected form of the visible image 151 is represented by projected visible image 151′. The projected form of the non-visible image 152 is represented by projected non-visible image 152′. The visible image 151 and the non-visible image 152 are projected into a field of projection 140. For instance, the field of projection 140 might encompass a wall, a table-top, a flat surface, or a complex surface, and might include one or more mobile objects (such as a hand or game pieces) positioned within the field of projection 140.
A reflected portion 151″ of the projected visible image 151′ is received back into the optics 110. Likewise, a reflected portion 152″ of the projected non-visible image 152′ is received back into the optics 110. Accordingly, the method 200 further includes the optics 110 receiving a reflected portion of the visible image 151 and the non-visible image 152 (act 204). In one embodiment, the surface onto which the projected visible image 151′ and the projected non-visible image 152′ are projected may be the same surface on which the projection system 100 sits.
A portion 151′″ of the reflected portion 151″ of the projected visible image 151′ is redirected to a camera 120 by a portion 111 of the optics 110, whereupon the camera 120 captures the portion 151′″ of the projected visible image 151′. Likewise, a portion 152′″ of the reflected portion 152″ of the projected non-visible image 152′ is redirected to the camera 120 by the portion 111 of the optics 110, whereupon the camera 120 captures the portion 152′″ of the reflected portion 152″ of the projected non-visible image 152′. Accordingly, the method 200 includes redirecting at least a portion of the received visible and non-visible images to a camera (act 205), and capturing the received visible and non-visible images (act 206). In some embodiments, the same camera 120 captures both the reflected visible image and the reflected non-visible image, though separate cameras might instead capture each image.
A depth sensing module 130 detects depth information associated with surfaces within the field of projection 140 by using the captured image information regarding the portion 151′″ of the reflected portion 151″ of the projected visible image 151′ and regarding the portion 152′″ of the reflected portion 152″ of the projected non-visible image 152′. Accordingly, the method 200 includes deriving depth information regarding one or more objects within the field of projection 140 of the visible image 151 using the captured portion 152′″ of the reflected portion 152″ of the non-visible image 152 (act 207).
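The flow of acts 201 through 207 may be easier to follow as a schematic simulation. The Python below is a hypothetical sketch only: the function names, the 480×640 resolution, the dot pattern, and the constant depth map are all assumptions made for illustration, and the acts are in fact performed by the projecting units 101, the optics 110, the camera 120, and the depth sensing module 130 rather than by software functions.

```python
import numpy as np

ROWS, COLS = 480, 640  # hypothetical resolution of the projecting units

def emit_visible(rows=ROWS, cols=COLS):
    """Act 201: each projecting unit emits its portion of the visible image (an RGB frame)."""
    return np.zeros((rows, cols, 3), dtype=np.float32)

def emit_non_visible(rows=ROWS, cols=COLS):
    """Act 202: the collection of projecting units emits the non-visible (e.g., infra-red) image."""
    pattern = np.zeros((rows, cols), dtype=np.float32)
    pattern[::8, ::8] = 1.0  # a simple repeating dot pattern, purely illustrative
    return pattern

def project(image):
    """Act 203: project an image through the optics (modeled here as a pass-through)."""
    return image

def reflect(projected, reflectance=0.5):
    """Act 204: a portion of the projected image reflects back into the optics."""
    return reflectance * projected

def redirect_and_capture(reflected_visible, reflected_non_visible):
    """Acts 205-206: a portion of the optics redirects the reflections to the camera,
    which captures them (here, the captured frames are simply returned)."""
    return reflected_visible, reflected_non_visible

def derive_depth(captured_non_visible):
    """Act 207: the depth sensing module derives depth from the captured non-visible
    reflection. A real implementation would use structured light or time of flight;
    this placeholder just returns a constant depth map."""
    return np.full(captured_non_visible.shape, 1.0, dtype=np.float32)

visible = project(emit_visible())          # acts 201, 203
non_visible = project(emit_non_visible())  # acts 202, 203
captured_vis, captured_ir = redirect_and_capture(reflect(visible), reflect(non_visible))
depth_map = derive_depth(captured_ir)      # act 207
```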
In addition, an infra-red LED 301D for that pixel emits infra-red light, which reflects off of mirror 302C to be collimated with the visible optical signals for that pixel. The infra-red light likewise reflects off of mirror 303, is focused using focus optics 304, is redirected by the Digital Micromirror Device (DMD) 305, passes through the one-way mirror 306, and is projected by the projection lens 307 as a pixel of the non-visible image 312. In one embodiment, this pixel of the non-visible image 312 is overlaid on the same pixel of the visible image 311.
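The pixel-for-pixel overlay of the non-visible image 312 on the visible image 311 amounts to co-registering an infra-red channel with the RGB channels on a common pixel grid. The sketch below merely illustrates that idea; the compose_frame function, the four-channel array, and the resolution are assumptions, as the overlay in the embodiment is performed optically rather than in software.

```python
import numpy as np

def compose_frame(rgb: np.ndarray, ir_pattern: np.ndarray) -> np.ndarray:
    """Stack a visible RGB frame and an infra-red pattern of the same resolution into a
    single four-channel frame, so that each IR pixel is co-registered with the
    corresponding visible pixel (illustrative only)."""
    if rgb.shape[:2] != ir_pattern.shape:
        raise ValueError("visible and non-visible images must share the same pixel grid")
    return np.dstack([rgb, ir_pattern])

rgb = np.zeros((480, 640, 3), dtype=np.float32)
ir = np.zeros((480, 640), dtype=np.float32)
ir[::8, ::8] = 1.0                  # illustrative dot pattern
frame = compose_frame(rgb, ir)      # shape (480, 640, 4): R, G, B, IR
```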
A portion of the visible image 311 and a portion of the non-visible image 312 are reflected back through the projection lens 307, and a portion of each reflected image is then redirected by the one-way mirror 306 towards the camera 308.
The projection system and the camera 308 thus both use the same optics, and thus the assembly may be made quite small. In fact, the projection system 100 might be incorporated within a single computing system that may itself be quite small, such as a laptop, a smartphone, or an accessory to a laptop or smartphone.
Computing systems are now increasingly taking a wide variety of forms. Computing systems may, for example, be handheld devices, appliances, laptop computers, desktop computers, mainframes, distributed computing systems, or even devices that have not conventionally been considered a computing system. In this description and in the claims, the term “computing system” is defined broadly as including any device or system (or combination thereof) that includes at least one physical and tangible processor, and a physical and tangible memory capable of having thereon computer-executable instructions that may be executed by the processor. The memory may take any form and may depend on the nature and form of the computing system. A computing system may be distributed over a network environment and may include multiple constituent computing systems.
As used herein, the term “executable module” or “executable component” can refer to software objects, routines, or methods that may be executed on the computing system. The different components, modules, engines, and services described herein may be implemented as objects or processes that execute on the computing system (e.g., as separate threads). Such executable modules may be managed code in the case of being executed in a managed environment in which type safety is enforced, and in which processes are allocated their own distinct memory objects. Such executable modules may also be unmanaged code in the case of executable modules being authored in native code such as C or C++.
In the description that follows, embodiments are described with reference to acts that are performed by one or more computing systems. If such acts are implemented in software, one or more processors of the associated computing system that performs the act direct the operation of the computing system in response to having executed computer-executable instructions. For example, such computer-executable instructions may be embodied on one or more computer-readable media that form a computer program product. An example of such an operation involves the manipulation of data. The computer-executable instructions (and the manipulated data) may be stored in the computer-readable media 404 of the computing system 400. Computing system 400 may also contain communication channels 408 that allow the computing system 400 to communicate with other processors over, for example, network 410.
Embodiments described herein may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below. Embodiments described herein also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are physical storage media. Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: computer storage media and transmission media.
Computer storage media includes RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other tangible storage medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
A “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired and wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
Further, upon reaching various computer system components, program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to computer storage media (or vice versa). For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface controller (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer storage media at a computer system. Thus, it should be understood that computer storage media can be included in computer system components that also (or even primarily) utilize transmission media.
Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
Those skilled in the art will appreciate that the invention may be practiced in network computing environments with many types of computer system configurations, including, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, pagers, routers, switches, and the like. The invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
As an example, the depth sensing module 130 may be created and/or operated by the computing system 400 in response to the computing system 400 accessing a computer program product having one or more computer-readable media 404 having thereon computer-executable instructions that are structured such that, when executed by one or more processors of the computing system 400, the computing system 400 creates and/or operates the depth sensing module 130.
The depth sensing module 130 might allow the computing system 400 to infer information regarding the surface onto which the projection system projects in the absence of objects placed within the field of projection, while likewise detecting objects, and characteristics of objects, placed within the field of projection. Thus, the depth information might affect the state of the computing system 400, thereby affecting the visible image. In one embodiment, the non-visible image is a pattern that is perhaps repeated (although it may also be a non-repeating pattern) and that allows depth information to be derived based on reflections of that pattern. Alternatively, the depth information may be obtained from the non-visible image via phase-based or other time-of-flight methods, or any other method for determining depth information from non-visible images.
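For completeness, the two depth-derivation approaches mentioned above correspond to well-known relations — triangulation from the disparity of a projected pattern, and phase-based time of flight — neither of which is specific to these embodiments. The sketch below simply evaluates those relations; the function names and the example parameter values (focal length, baseline, modulation frequency) are hypothetical.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Structured-light / triangulation relation Z = f * B / d, where d is the disparity
    (in pixels) between where a pattern feature was emitted and where the camera observes
    its reflection. Textbook relation, not taken from the embodiments."""
    d = np.asarray(disparity_px, dtype=np.float64)
    with np.errstate(divide="ignore"):
        return np.where(d > 0, focal_length_px * baseline_m / d, np.inf)

def depth_from_phase(phase_shift_rad, modulation_freq_hz):
    """Phase-based time-of-flight relation Z = c * delta_phi / (4 * pi * f_mod),
    valid within the unambiguous range c / (2 * f_mod)."""
    return C * np.asarray(phase_shift_rad, dtype=np.float64) / (4.0 * np.pi * modulation_freq_hz)

# Example: a pattern feature observed 25 px from its expected position, with a
# 600 px focal length and a 75 mm emitter-to-camera baseline:
print(depth_from_disparity(25.0, 600.0, 0.075))  # -> 1.8 m
# Example: a 30 MHz modulated infra-red signal returning with a pi/2 phase shift:
print(depth_from_phase(np.pi / 2, 30e6))         # -> ~1.25 m
```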
Physical embodiments of the projection system 100 will now be described, although the diversity within these two physical embodiments should convey that the projection system described herein is not limited to any particular physical implementation.
The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scopes.