Flat- and depth-imaging digital cameras are available as discrete systems and as device components, and some devices incorporate both flat- and depth-imaging functionality. The combination of flat and depth imaging is typically achieved using separate, discrete sensor arrays.
One aspect of this disclosure is directed to a camera. The camera includes a sensor array including a plurality of individually addressable sensor elements, each of the plurality of sensor elements responsive to incident light over a broad wavelength band. Covering the sensor array is at least one light valve switchable electronically between closed and open states. The light valve is configured to, in the closed state, block light of a stopband and transmit light outside the stopband, and, in the open state, transmit the light of the stopband. An electronic controller of the camera is configured to switch the light valve from the closed to the open state and, synchronously with switching the light valve, address the sensor elements of the sensor array.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve the disadvantages identified in this disclosure.
Images acquired concurrently from different sensor arrays may exhibit parallax, which is objectionable if the images are to be registered to each other. Beam-splitting optics may be used to align, in effect, the two sensor arrays on the same optical axis, but this approach requires tight manufacturing tolerances, adds complexity, and may reduce the signal-to-noise ratio for both flat and depth imaging by dividing the available image intensity between the two arrays.
Attempts to acquire flat and depth images using the same sensor array may be complicated by the different wavelength bands used by the respective imaging processes. Flat imaging typically uses broadband visible light as the illumination source, while depth imaging typically uses narrow-band infrared (IR) light. In one approach, a specialized array of filter elements is arranged in registry with the sensor elements of the imaging sensor. The filter array includes a repeated tiling of subarrays having visible-transmissive, IR-blocking elements as well as IR-transmissive, visible-blocking elements. A disadvantage of this approach is that both visible and IR images are acquired on less than the full area of the sensor array, which decreases the resolution and the signal-to-noise ratio of both images.
Disclosed herein is a combination flat- and depth-imaging camera that overcomes the issues noted above. The camera may sequentially acquire a depth image under active, narrow-band IR illumination and a color or monochrome image under ambient illumination. Both images are acquired on the same sensor array, which is operated in a time-multiplexed manner. An electronically switchable light valve covering the sensor array is used to block visible light during the IR depth acquisition. After the depth image is acquired, the light valve is opened, the active illumination used for depth imaging is switched off, and the visible flat image is acquired. Preferably, both images are acquired within the same video or image frame, effectively providing concurrent flat- and depth-image acquisition at the full frame rate. Advantageously, the entire sensor array may be used to acquire the IR image, while non-useful visible wavelengths are rejected. During monochrome or color-image acquisition, the entire sensor array is made responsive to visible light.
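As a rough illustration of the time-multiplexed operation described above, the following sketch shows one possible frame loop. The valve, ir_emitter, and sensor objects and their methods are hypothetical placeholders, not part of this disclosure.

```python
def run_video(valve, ir_emitter, sensor, n_frames):
    """Sketch: acquire one depth image and one flat image per frame period,
    both on the same sensor array, by multiplexing the light valve and the
    active IR illumination. All driver objects here are hypothetical.
    """
    frames = []
    for _ in range(n_frames):
        valve.close()                   # closed state: visible stopband blocked
        ir_emitter.on()                 # active narrow-band IR illumination
        depth = sensor.read_depth()     # phase-resolving addressing mode
        ir_emitter.off()
        valve.open()                    # open state: visible light admitted
        flat = sensor.read_intensity()  # integrating addressing mode
        frames.append((depth, flat))
    return frames
```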
Aspects of this and other implementations will now be described by example, and with reference to the drawing figures listed above. Components, process steps, and other elements that may be substantially the same in one or more of the figures are identified coordinately and are described with minimal repetition. It will be noted, however, that elements identified coordinately may also differ to some degree. It will be further noted that the figures are schematic and not necessarily drawn to scale. Rather, the various drawing scales, aspect ratios, and numbers of components shown in the figures may be purposely distorted to make certain features or relationships easier to see. In this disclosure, the term ‘visible’ is applied to the portion of the electromagnetic spectrum from about 400 to about 700 nanometers (nm). Any longer wavelength is referred to as ‘infrared’, including so-called near-infrared wavelengths of about 850 nm, for example.
Due to the broad wavelength response of sensor elements 30, camera 12 may include one or more passive filters 34 arranged parallel to the sensor array and configured to limit the wavelength response of the sensor array. The passive filters reduce noise by excluding photons of wavelengths not intended to be imaged. Typically, an IR-imaging camera may include a visible bandstop filter. Conversely, a visible-imaging camera may include IR and ultraviolet (UV) cutoff filters. In implementations in which both visible and IR response is desired, the one or more passive filters 34 may include a visible and narrow-band IR bandpass filter.
Configured for visible as well as IR imaging, camera 12 may also include a color filter array (CFA) 36 of color filter elements 38. The color filter elements are arranged in registry with sensor elements 30 of sensor array 28. An example CFA may present a Bayer pattern—i.e., a repeated tiling of 2×2 subarrays having two green-transmissive elements 38G, one blue-transmissive element 38B, and one red-transmissive element 38R in each subarray. In this implementation, the integrated response from sensor array 28 may be converted into a full-color image using a de-mosaicing algorithm. In implementations in which both visible and IR response is required at each sensor element, all of the color filter elements may be highly transmissive in the IR band of interest. Being transmissive to both visible and IR light, however, neither CFA 36 nor passive filters 34 will exclude visible light as a noise source in IR imaging. Nor will they exclude IR light as a noise source in visible imaging. For this purpose, in implementations in which both visible and IR imaging are provided, an electronically switchable light valve 40 is included.
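For reference, a minimal sketch of the first step of de-mosaicing an RGGB Bayer tiling is given below; the tiling order and the use of NumPy are assumptions for illustration, and a full de-mosaicing algorithm would go on to interpolate the missing samples in each plane.

```python
import numpy as np

def split_bayer_rggb(raw):
    """Separate an RGGB Bayer mosaic into sparse red, green, and blue planes.

    `raw` is a 2-D array of integrated sensor responses. The RGGB ordering
    is an illustrative assumption; camera 12 may use a different tiling.
    """
    r = np.zeros_like(raw, dtype=float)
    g = np.zeros_like(raw, dtype=float)
    b = np.zeros_like(raw, dtype=float)
    r[0::2, 0::2] = raw[0::2, 0::2]  # one red sample per 2x2 subarray
    g[0::2, 1::2] = raw[0::2, 1::2]  # two green samples per subarray
    g[1::2, 0::2] = raw[1::2, 0::2]
    b[1::2, 1::2] = raw[1::2, 1::2]  # one blue sample per subarray
    return r, g, b
```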
In
Electronic controller 42 of
Camera 12 of
The term ‘modulate’ as applied to IR emitter 44 may include activating or deactivating the IR emitter, as described above, and, in some implementations, periodically varying the intensity of the IR emission at a high frequency (e.g., 100 MHz). Likewise, the term ‘address’ as applied to sensor array elements 30 may have a somewhat different meaning depending on the imaging mode described. For flat-imaging—both visible and IR—addressing the sensor elements may include integrating the intensity of light received at each sensor element 30 and associating the integrated intensity with the portion of the image corresponding to that element. For depth imaging, the sensor elements may be addressed differently. Here, addressing the sensor elements may include resolving a phase offset from each sensor element relative to the periodic modulation of the IR emitter (as described further below). The phase offset, optionally converted into the depth domain, may be associated with the portion of the image corresponding to the sensor element addressed. In some implementations, a series of IR acquisitions in rapid succession may be used to obtain the phase offset. In combination depth- and flat-imaging applications, both of the above addressing modes may be used in an alternating (i.e., multiplexed) manner synchronously timed with corresponding opening and closing of the light valve 40.
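One common way to resolve the phase offset at a sensor element is a four-capture estimate taken at equally spaced delays relative to the emitter modulation. The sketch below assumes that scheme purely for illustration; the disclosure does not mandate a particular number of captures.

```python
import math

def phase_offset_from_captures(q0, q90, q180, q270):
    """Estimate the phase offset (radians) at one sensor element from four
    integrations taken at 0, 90, 180, and 270 degrees relative to the
    IR-emitter modulation (an assumed four-capture scheme).
    """
    return math.atan2(q90 - q270, q0 - q180) % (2.0 * math.pi)
```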
The phase-discriminating time-of-flight (ToF) approach described above is one of several depth-imaging technologies lying within the metes and bounds of this disclosure. In general, a depth-imaging camera may be configured to acquire one or more depth maps of a scene or subject. The term ‘depth map’ refers to an array of pixels registered to corresponding regions (Xi, Yi) of an imaged scene, with a depth value Zi indicating, for each pixel, the depth of the corresponding region. ‘Depth’ is defined as a coordinate parallel to the optical axis of the camera, which increases with increasing distance from the camera. Operationally, some depth-imaging cameras may be configured to acquire 2D image data, from which a depth map is obtained via downstream processing. The term ‘depth video’ refers herein to a time-resolved sequence of depth maps.
The configuration of a depth-imaging camera may differ from one implementation to the next. In one example, brightness or color data from two, stereoscopically oriented sensor arrays in a depth-imaging camera may be co-registered and used to construct a depth map. More generally, depth coordinates into a scene may be obtained using one or more flat-imaging cameras, with optical-tomography based co-registration of imaged features. Hyperspectral (e.g., visible+IR and/or UV) flat imaging may be used with this approach, for improved feature discrimination. In other examples, an illumination source associated with a depth-imaging camera may be configured to project onto the subject a structured illumination pattern comprising numerous discrete features—e.g., lines or dots. A sensor array in the depth-imaging camera may be configured to image the structured illumination reflected back from the subject. Based on the spacings between adjacent features in the various regions of the imaged subject, a depth map of the subject may be constructed. In ToF implementations, the illumination source—an IR emitter—may project pulsed or otherwise modulated IR illumination towards the subject. The sensor array of the depth-imaging camera may be configured to detect the phase offset between the illumination reflected back from the subject and the modulated emission. In some implementations, the phase offset of each sensor element may be converted into a pixel-resolved time-of-flight of the pulsed illumination, from the illumination source to the subject and then to the array. ToF data may then be converted into depth.
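Under the modulated-ToF model sketched above, the resolved phase offset converts to depth roughly as follows; the 100 MHz modulation frequency is only an example, and phase wrapping beyond the unambiguous range is ignored.

```python
import math

C = 2.998e8  # speed of light in m/s

def depth_from_phase(phase_rad, modulation_hz=100e6):
    """Convert a resolved phase offset into one-way depth:
    depth = c * phase / (4 * pi * f_mod). Valid only within the unambiguous
    range c / (2 * f_mod); phase unwrapping is not handled in this sketch.
    """
    return C * phase_rad / (4.0 * math.pi * modulation_hz)
```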
Continuing in
In other examples where a plurality of light valves 40 are stacked in series (e.g., parallel to sensor array 28), at least some of the stopbands may be relatively narrow bands in the visible. For instance, the camera may include three stacked light valves with stopbands centered in the red, green, and blue, respectively. In one implementation, the three light valves may be switched in concert between closed and open states, mimicking the functionality of the broadband visible light valve described above. In other implementations, however, the three light valves may be multiplexed in sequence to image the subject in independent red, green, and blue channels. Accordingly, cameras equipped with plural light valves having visible stopbands may omit CFA 36. In implementations in which IR imaging is also desired, all of the light valves may be switched into the closed state over an additional, IR-imaging time window.
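A possible sequencing of the three stacked valves is sketched below. The dictionary of valve drivers and the sensor interface are hypothetical; the point is only that, for each color channel, the valve whose stopband matches that channel is opened while the other two remain closed.

```python
def acquire_color_channels(valves, sensor):
    """Sketch of multiplexed red/green/blue imaging with three stacked light
    valves whose stopbands are centered in the red, green, and blue.

    `valves` is a hypothetical dict of valve drivers keyed by channel name.
    """
    channels = {}
    for channel in ('red', 'green', 'blue'):
        for name, valve in valves.items():
            if name == channel:
                valve.open()   # transmit this valve's stopband
            else:
                valve.close()  # block the other two stopbands
        channels[channel] = sensor.read_intensity()
    return channels
```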
An advantage of the above implementation over cameras relying on CFA 36 for color discrimination is that the whole of sensor array 28 may be used for each channel. This provides increased resolution and, potentially, increased signal and better color reproduction. In such implementations, however, the latency of each of the light valves must be short enough to acquire the IR image and the three or more color-channel images within the desired frame rate.
As shown in
In still other implementations, a stack of one or more light valves 40 with different stopbands may be used to control saturation in flat monochrome imaging done under variable lighting conditions. Controller 42 may sense the ambient light level and determine how much of the broadband spectrum to include in the flat image to be acquired. Under very dim lighting, all of the light valves may be maintained in the open state. Under brighter lighting, one or more of the light valves may be switched to the closed state, to avoid saturation of the sensor elements and preserve image quality. Under the brightest conditions, all light valves may be closed, so that only a narrow wavelength band—e.g., narrow-band IR—gets through to the sensor array. This may be the same band used for IR-based depth imaging if the camera is a combination flat monochrome and depth-imaging camera. In this application, the stopbands of the various light valves may be overlapping or non-overlapping.
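The saturation-control idea can be summarized in a few lines; the threshold values, units, and function names below are illustrative assumptions rather than parameters disclosed here.

```python
def set_valves_for_ambient(ambient_level, valves, thresholds):
    """Close progressively more light valves as the ambient level rises,
    leaving all valves open under dim lighting and all closed under the
    brightest conditions. `thresholds` is an ascending list, one per valve.
    """
    n_closed = sum(ambient_level > t for t in thresholds)
    for i, valve in enumerate(valves):
        if i < n_closed:
            valve.close()  # shed light in this valve's stopband
        else:
            valve.open()
    return n_closed
```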
Light valve 40 may be configured differently in the different implementations of camera 12.
In some implementations, the transmission plane of downstream fixed polarizer 50D is perpendicular to that of upstream fixed polarizer 50U. When electrodes 52A/52B are unbiased, LC layer 48 adopts a twisted nematic structure. The thickness of the LC layer is such that when linearly polarized light passes through the LC layer, its polarization rotates by 90°. In this configuration, where the polarizers on either side of the LC layer are crossed, light rotated by the LC layer will be transmitted. When the electrodes are biased, however, the twisted structure is disrupted. In that state, the LC layer does not rotate the polarization of the incident light, and thus the light is blocked. In this manner, light valve 40A enables electronic control of light transmission.
In an alternative implementation, the transmission planes of upstream fixed polarizer 50U and downstream fixed polarizer 50D are arranged parallel to each other. In this case the blocking and transmission states are reversed—viz., light of the stopband is transmitted when electrodes 52A/52B are biased, and blocked otherwise. Accordingly, the fixed-polarizer orientation may be set based on the anticipated duty cycle of the light valve, to conserve power: if the light valve is expected to be open more often than closed, the orientation may be set so that the valve is open by default, with no energy provided to the electrodes; if the light valve is expected to be closed more often than open, the orientation may be set so that the valve is closed by default.
In summary, the function of upstream fixed polarizer 50U is to select a polarization state for the incident light, the function of LC layer 48 is to rotate the polarization state of incident light, and the function of downstream fixed polarizer 50D is to block or transmit the light based on its final polarization state. If the fixed polarizers are limited to the specified stopband (i.e., exhibiting a high contrast ratio in that band and unit contrast ratio elsewhere), then their polarizing function will be confined to the stopband. Furthermore, if the fixed polarizers are configured to provide high transmission outside the stopband, then they will be essentially transparent at those wavelengths. A light valve configured in this manner has significant advantages, such as low cost, low-power operation, low driving voltage, and fast response time.
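An idealized, lossless model of the resulting transmission behavior, with the crossed-polarizer configuration of light valve 40A, might look like the sketch below; real contrast ratios, insertion losses, and spectral edges are ignored.

```python
def valve_transmission(wavelength_nm, biased, stopband=(400.0, 700.0)):
    """Toy model of light valve 40A with crossed fixed polarizers.

    Inside the stopband: the unbiased (twisted) LC layer rotates the
    polarization by 90 degrees, so light passes the crossed polarizers;
    biasing the electrodes removes the rotation, so light is blocked.
    Outside the stopband the polarizers are treated as fully transparent.
    """
    lo, hi = stopband
    if lo <= wavelength_nm <= hi:
        return 0.0 if biased else 1.0
    return 1.0
```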
In some instances, the molecules of LC layer 48 may act as scattering centers for incident light, causing some degree of ghosting and flare in acquired images. This phenomenon is typically more significant at shorter wavelengths. In order to reduce the effects of scattering, light valve 40A may be positioned as close to sensor array 28 as possible—e.g., coupled directly to sensor array 28 or to microlens array 32 of the sensor array. For example,
The switching time for LC layer 48 may be 5 milliseconds or less. Accordingly, two or more image acquisitions may be completed within a single video or image frame, effecting quasi-simultaneous imaging in two or more wavelength bands. It may be possible to achieve even shorter switching times using appropriate LC materials, which would be advantageous, for example, in reducing motion blur between corresponding visible and IR images.
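A quick timing check, assuming a 30 Hz frame rate and the 5 ms switching figure above, suggests how much of the frame period remains for the two exposures; both numbers are illustrative.

```python
def exposure_budget_ms(frame_rate_hz=30.0, n_switches=2, switch_ms=5.0):
    """Return the time (ms) left for exposures in one frame after the light
    valve's switching transitions. E.g., at 30 Hz with two 5 ms transitions,
    roughly 23 ms remain to be shared by the depth and flat exposures.
    """
    return 1000.0 / frame_rate_hz - n_switches * switch_ms
```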
No aspect of the foregoing configurations should be interpreted in a limiting sense, for numerous alternative configurations are envisaged as well. For example, while the above examples stress the value of selectively blocking ambient visible light for combined visible and IR imaging, the stopband may be an IR band in alternative implementations. There, the light outside the stopband may include visible light. Other example light valves may use a mechanical structure that incorporates separate passive optical filters—e.g., red, green, blue, and IR. Using this approach, the light valve may switch to the desired optical filter by moving or rotating the mechanical structure to bring the appropriate filter in front of sensor array 28. Envisaged filter-switching modalities include voice-coil motors and piezoelectric actuators, for example. In this solution, the switching modalities may be specially configured to meet the size, speed and reliability requirements of the camera.
At 62 of method 60, an IR emitter 44 of the camera is activated. At 64 the light valve is switched to the closed state. At 66 sensor array 28 is addressed so as to resolve depth to an imaged locus at each sensor element 30 of the sensor array while the light valve is in the closed state. In phase-sensitive ToF implementations, resolving depth to an imaged locus may include resolving a phase offset from each sensor element relative to IR-emitter modulation. In some implementations, each sensor element may be addressed several times to acquire a single phase capture. At 68 the IR emitter is deactivated. At 70 the light valve is switched from the closed to the open state. At 72 the sensor array is addressed so as to integrate light intensity received at each element of the sensor array when the light valve is in the open state. In color-imaging implementations, integrating light intensity received at each element of the sensor array may include integrating light intensity in three or more color channels. In monochromatic visible-imaging implementations, light intensity may be integrated in a single channel.
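Steps 62 through 72 of method 60 might be sequenced as in the sketch below. The driver objects and the four-capture phase estimate at step 66 are illustrative assumptions; the method only requires that depth be resolved while the light valve is closed.

```python
def method_60_base(valve, ir_emitter, sensor):
    """Sketch of steps 62-72 of method 60 with hypothetical driver objects."""
    ir_emitter.activate()                  # 62: activate IR emitter 44
    valve.close()                          # 64: switch light valve to closed state
    captures = [sensor.read_phase_capture(k * 90) for k in range(4)]  # 66
    depth_map = sensor.resolve_depth(captures)                        # 66
    ir_emitter.deactivate()                # 68: deactivate the IR emitter
    valve.open()                           # 70: switch light valve to open state
    flat_image = sensor.read_intensity()   # 72: integrate light intensity
    return depth_map, flat_image
```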
Method 60 may include additional, optional steps intended to correct the visible image for ambient IR irradiation which passes through the one or more filters and light valves of camera 12. At 74, accordingly, IR emitter 44 is deactivated. At 76 the light valve is switched to the closed state. At 78 the sensor array is addressed so as to integrate light intensity received at each element of the sensor array, to assemble a reference image while the light valve is in the closed state. At 80 the reference image is saved. At 84, subsequent light intensities integrated when the light valve is in the open state are subtractively corrected based on the reference image.
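The optional correction at 74 through 84 amounts to subtracting an ambient-IR reference image from subsequent open-valve acquisitions, as in this sketch; clipping at zero is an added assumption to keep the corrected intensities non-negative.

```python
import numpy as np

def correct_ambient_ir(flat_image, ir_reference):
    """Subtract the closed-valve, emitter-off reference image (ambient IR
    leakage) from a subsequent open-valve acquisition, per steps 74-84.
    """
    corrected = np.asarray(flat_image, dtype=float) - np.asarray(ir_reference, dtype=float)
    return np.clip(corrected, 0.0, None)
```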
One aspect of this disclosure is directed to a camera comprising a sensor array, at least one light valve, and an electronic controller. The sensor array includes a plurality of individually addressable sensor elements, each of the plurality of sensor elements responsive to incident light over a broad wavelength band. Optically covering the sensor array is at least one light valve switchable electronically between a closed state and an open state, the light valve configured to, in the closed state, block light of a stopband and transmit light outside the stopband, and, in the open state, transmit the light of the stopband. The electronic controller is configured to switch the light valve from the closed state to the open state and, synchronously with switching the light valve, address the sensor elements of the sensor array.
In some implementations, the stopband is a visible band. In some implementations, the light outside the stopband includes infrared light. In some implementations, the camera further comprises an infrared emitter, and the electronic controller is further configured to, synchronously with switching the light valve and addressing the sensor elements, energize the infrared emitter. In some implementations, addressing the sensor elements includes, in only one of the closed and open state of the light valve, resolving a phase offset from each of the plurality of sensor elements relative to infrared-emitter modulation. In some implementations, the controller is configured to maintain the light valve in the closed state for a repeatable closed duration and in the open state for a repeatable open duration, and to vary the closed and open durations. In some implementations, the camera further comprises one or more passive filters configured to limit response of the sensor array. In some implementations, the camera further comprises one or more passive filters configured to limit response of the sensor array, the one or more passive filters including an array of color filter elements arranged in registry with the sensor elements of the sensor array. In some implementations, the light valve includes an electrostatically polarizable liquid-crystal layer and a fixed polarizer, wherein at least one of the liquid-crystal layer and the fixed polarizer exhibit wavelength-dependent transmissivity. In some implementations, the fixed polarizer provides a higher contrast ratio in the stopband than outside the stopband. In some implementations, the fixed polarizer is a first fixed polarizer, the light valve includes a second fixed polarizer, and the second fixed polarizer has a transmission plane perpendicular to that of the first fixed polarizer. In some implementations, the light valve includes an electronically switchable distributed Bragg reflector. In some implementations, the light valve is one of a plurality of stacked light valves, the electronic controller is configured to switch each of the light valves from the closed to the open state and synchronously address the sensor elements of the sensor array, and the stopband is different for each of the light valves. In some implementations, the stopband of a first of the plurality of light valves is a visible band, and the stopband of a second of the plurality of light valves is an infrared band. In some implementations, the light valve is coupled directly to the sensor array or to a microlens array portion of the sensor array.
Another aspect of this disclosure is directed to a combination depth and flat-image camera comprising a sensor array, at least one light valve, an infrared emitter, and an electronic controller. The sensor array includes a plurality of individually addressable sensor elements, each of the plurality of sensor elements responsive to incident light over visible and infrared bands. Optically covering the sensor array is at least one light valve switchable electronically between a closed state and an open state, the light valve configured to, in the closed state, block light of the visible band and transmit light outside the visible band, and, in the open state, transmit the light of the visible band. The electronic controller is configured to: activate the infrared emitter, switch the light valve to the closed state, resolve depth to an imaged locus at each element of the sensor array by addressing the sensor array while the light valve is in the closed state, deactivate the infrared emitter, switch the light valve from the closed to the open state, and integrate light intensity received at each of the plurality of sensor elements of the sensor array when the light valve is in the open state.
In some implementations, resolving depth to an imaged locus includes resolving a phase offset from each sensor element relative to infrared-emitter modulation. In some implementations, integrating light intensity received at each of the plurality of sensor elements of the sensor array includes integrating light intensity in three or more color channels. In some implementations, the controller is further configured to: deactivate the infrared emitter, switch the light valve to the closed state, integrate light intensity received at each element of the sensor array to assemble a reference image while the light valve is in the closed state, and subtractively correct, based on the reference image, subsequent light intensities integrated when the light valve is in the open state.
Another aspect of this disclosure is directed to a combination depth and flat-image camera comprising a sensor array, at least one light valve, one or more passive filters, an infrared emitter, and an electronic controller. The sensor array includes a plurality of individually addressable sensor elements, each of the plurality of sensor elements responsive to incident light over visible and infrared bands. Optically covering the sensor array is a light valve switchable electronically between a closed state and an open state, the light valve configured to, in the closed state, block light of the visible band and transmit light outside the visible band, and, in the open state, transmit the light of the visible band. The one or more passive filters include an array of color filter elements arranged in registry with the sensor elements of the sensor array. The electronic controller is configured to: activate the infrared emitter, switch the light valve to the closed state, resolve a phase offset from each sensor element relative to infrared-emitter modulation by addressing the sensor array while the light valve is in the closed state, deactivate the infrared emitter, switch the light valve from the closed to the open state, and integrate light intensity received at each of the plurality of sensor elements of the sensor array in three or more color channels when the light valve is in the open state.
It will be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated and/or described may be performed in the sequence illustrated and/or described, in other sequences, in parallel, or omitted. Likewise, the order of the above-described processes may be changed.
The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.