In active depth sensing, such as used by active rangefinders or active stereo systems, a projector projects patterns of light such as dots or lines to illuminate a region being sensed. The projected patterns are then captured by a camera/sensor (two or more in stereo systems), with the image (or images) processed to compute a depth map or the like.
For example, in stereo systems, stereo cameras capture two images from different viewpoints. One way to perform depth estimation with a stereo pair of images is then to find correspondences between the images, e.g., to correlate each projected and sensed dot in one image with a counterpart dot in the other image via patch matching. For example, a dense depth map at the original (native) camera resolution may be obtained by area matching (e.g., with a window of size 5×5). Once the projected patterns within the images are matched, disparities between one or more features of the correlated dots (e.g., including their intensities) may be used to estimate a depth to that particular dot pair.
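By way of illustration only, such area matching may be sketched as brute-force block matching: for each pixel in one image, a 5×5 patch is compared against candidate patches along the same scanline of the other image, and the offset with the lowest cost is taken as the disparity. The following minimal sketch assumes rectified grayscale images; the window size, search range, and sum-of-absolute-differences cost are illustrative choices, not necessarily those of any particular system.

```python
import numpy as np

def sad_disparity(left, right, window=5, max_disp=64):
    """Brute-force block matching: for each pixel of the left image,
    compare a window x window patch against candidates along the same
    scanline of the right image, keeping the lowest-cost offset."""
    half = window // 2
    h, w = left.shape
    disparity = np.zeros((h, w), dtype=np.float32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            best_cost, best_d = np.inf, 0
            for d in range(min(max_disp, x - half) + 1):
                candidate = right[y - half:y + half + 1,
                                  x - d - half:x - d + half + 1]
                cost = np.abs(patch - candidate).sum()  # sum of absolute differences
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disparity[y, x] = best_d
    return disparity
```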
However, the resolution of the depth map is limited by the camera resolution.
This Summary is provided to introduce a selection of representative concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used in any way that would limit the scope of the claimed subject matter.
Briefly, one or more of various aspects of the subject matter described herein are directed towards moving a projector or projector component to project a light pattern into a scene. The moving causes the light pattern to move over time relative to the scene. Images of the scene captured at different times are used for depth sensing.
One or more aspects are directed towards computing a depth map having a higher resolution than a native sensor resolution. A moving pattern of light is projected into a scene, and a sensor set comprising one or more sensors captures a plurality of images at different times. By processing the images, computed depth data is obtained for sub-pixels based upon which sub-pixel locations were in a path corresponding to the moving pattern of light. Depth data for any sub-pixels for which computed depth data is not obtained is estimated. A depth map comprising depth data for each sub-pixel is output.
In one or more aspects, a projector is configured to project a light pattern towards a scene, and a sensor set comprising one or more sensors senses light from the light pattern that is reflected from the scene. A motion mechanism coupled to the projector moves the light pattern over time. An image processing subsystem processes images captured over time, in which the light pattern has moved within the images, to compute a depth map.
Other advantages may become apparent from the following detailed description when taken in conjunction with the drawings.
The present invention is illustrated by way of example and not limitation in the accompanying figures, in which like reference numerals indicate similar elements and in which:
Various aspects of the technology described herein are generally directed towards increasing the resolution of a depth map beyond the native resolution of the system by using a moving (e.g., slightly vibrating) pattern projector, such as one moved by coupling the projector to a small piezoelectric motor. By tracking the path of features in the images over a series of frames and associating patterns across cameras (or from projector to camera), higher resolution depth information can be achieved.
The technology may be used in the context of an active stereo system, such as where a pair of IR cameras are used in conjunction with an IR projector to provide texture for matching, and hence depth estimation. The technology may be based upon an accurate two-dimensional (2D) feature detector that is repeatable (e.g., including a peak detector for a dot pattern). By vibrating the pattern projector, the 2D features cover a number of sub-pixel positions, which are used for reconstructing an upsampled version of the native resolution of the system.
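As one illustration, such a repeatable 2D feature detector might be approximated by a local-maximum test followed by sub-pixel refinement. In the sketch below, the neighborhood size, intensity threshold, and parabolic refinement are hypothetical choices, not the detector actually employed:

```python
import numpy as np
from scipy.ndimage import maximum_filter

def detect_dot_peaks(ir_image, neighborhood=3, threshold=50):
    """Treat a pixel as a dot peak if it is the brightest pixel in its
    neighborhood and sufficiently bright overall; returns (row, col) pairs."""
    local_max = maximum_filter(ir_image, size=neighborhood)
    peaks = (ir_image == local_max) & (ir_image > threshold)
    return np.argwhere(peaks)

def refine_subpixel(ir_image, y, x):
    """Refine an integer peak location by fitting a parabola through
    three samples along each axis (a common sub-pixel heuristic)."""
    f = ir_image.astype(np.float64)
    dy = 0.5 * (f[y - 1, x] - f[y + 1, x]) / (f[y - 1, x] - 2 * f[y, x] + f[y + 1, x])
    dx = 0.5 * (f[y, x - 1] - f[y, x + 1]) / (f[y, x - 1] - 2 * f[y, x] + f[y, x + 1])
    return y + dy, x + dx
```

Sub-pixel peak locations of this kind are what allow the vibrating pattern to sample positions between the native pixel grid.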
It should be understood that any of the examples herein are non-limiting. As such, the present invention is not limited to any particular embodiments, aspects, concepts, structures, functionalities or examples described herein. Rather, any of the embodiments, aspects, concepts, structures, functionalities or examples described herein are non-limiting, and the present invention may be used in various ways that provide benefits and advantages in active depth sensing and image processing in general.
In
In one or more embodiments, the projector 106 may be configured to project dots to a region that is slightly larger than the scene 222 that the cameras 102 and 103 capture because, as described herein, the dots are not stationary, whereby some dots not projected into the scene at one time may be projected into the scene at another time. More particularly, the projector is mechanically coupled to a motion mechanism 114, such as a small motor that vibrates the projector 106, causing the dot patterns to move in a path over time. In
In one alternative implementation, rather than shake the projector 106 or a component part thereof (e.g., a diffractive optical element that diffracts infrared laser light into the dot pattern), the projector 106 may project into a mirror system that is vibrated or otherwise moved. In this way, a mirror or the like, which may be much lighter and/or more accessible and therefore more amenable to vibrating than a projector or a subcomponent thereof, may be used. Note that any such mirroring system that may be used, whether one mirror or more, is considered a component/part of the projector, even if only optically coupled to the projector and not physically coupled. Thus, as used herein, moving/vibrating the “projector” or “projector component” is the same as moving a mirror (or multiple mirrors) that reflectively projects the light pattern.
Note that the placement of the projector 106 may be outside the cameras (e.g.,
Returning to
The images captured by the cameras 102 and 103 are provided to an image processing system or subsystem 118. In some implementations, the image processing system 118 and the image capturing system or subsystem 104, or parts thereof, may be combined into a single device. For example, a home entertainment device may include all of the components shown in
The image processing system or subsystem 118 includes a processor 120 and a memory 122 containing one or more image processing algorithms 124. A dense depth map 126 at the original camera resolution can be obtained through area-based matching, while a semi-dense depth map 128 can be extracted by matching features (such as dots and/or lines).
Described herein is an algorithm (
As is known, such as described in U.S. published patent application no. 20130100256, hereby incorporated by reference, different dots or other projected elements have different features when captured, including intensity (brightness), depending on the distance from the projector to the reflective surfaces and/or the distance from the camera to the reflective surfaces. As is also known, the dots in different images taken at the same time (e.g., with genlocked stereo cameras) may be correlated with one another, such as by matching small (e.g., RGB) patches between RGB images of the same scene captured at the same instant.
Thus, with captured images, known algorithms can determine individual depth-related features (depth maps) for each image, as well as a disparity map that maintains the differences between those images. The disparity map may be processed into a depth map based upon the disparity of certain features (e.g., intensity).
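For a rectified stereo pair, disparity converts to depth via the standard triangulation relationship Z = f·B/d, where f is the focal length in pixels, B the camera baseline, and d the disparity. A minimal sketch, with hypothetical calibration values:

```python
import numpy as np

def disparity_to_depth(disparity, focal_px=600.0, baseline_m=0.075):
    """Stereo triangulation Z = f * B / d; non-positive disparity means
    no valid correspondence and is mapped to depth 0."""
    d = np.asarray(disparity, dtype=np.float32)
    with np.errstate(divide="ignore"):
        depth = focal_px * baseline_m / d
    depth[d <= 0] = 0.0
    return depth
```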
As can be seen in
Note that
As shown in
Because of the movement over time, as captured in a sequence (set) of image frames, far more sub-pixels are illuminated (over the set of frames) by any given projected dot than without movement of that dot. Note that the paths in
As mentioned above, it may be beneficial to intentionally project a dot pattern that is slightly larger than the scene (as exemplified in
In general, the movement is arbitrary/unpredictable for a given implementation. Notwithstanding, the amount of movement and the vibration frequency may be controlled (or calibrated in a fixed system) relative to the frame rate of capturing the images and/or the dot density so that the benefits of the moving dots are obtained and made closer to optimal, e.g., so that the feature path traverses as many sub-pixels as possible and the frame rate is such that all or most of the traversal is captured.
As can be readily appreciated, the super-resolution of depth for a stationary scene (or one with slow movement relative to the camera frame rate and vibration frequency) can be determined by combining the depth/disparity maps computed from a set of images taken over time. While some depth data may still be missing (e.g., the entire pixel 555 was not touched by a dot center, nor are other pixels and sub-pixels represented in
Combining the images to effectively increase dot density is represented in
In any event, it is seen that seven sub-pixels are in the feature path (or have depth data) as captured in the example frame 662, and nine sub-pixels are in the feature path as captured in the other, e.g., later, frame 664 (some additional sub-pixels having moved in from outside the grid 664).
Combining the grids 662 and 664 results in the merged grid 668, in which sixteen sub-pixels were illuminated over time. As can be readily appreciated, the two frames 662 and 664 illustrate the principle; in practice, however, more than two images are typically combined, up to any practical number depending on how many frames can be captured before a super-resolved depth map is desired. Thus, coverage/merged dot density may be increased significantly.
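The combining step may be sketched as a scatter-and-average operation over a grid at some multiple of the native resolution. The sketch below assumes each frame contributes (row, col, depth) samples already expressed in sub-pixel units, and resolves repeated hits by averaging, which is only one of several reasonable conflict policies:

```python
import numpy as np

def merge_subpixel_depths(frames, native_h, native_w, scale=4):
    """Scatter per-frame (row, col, depth) samples into one grid at
    scale x the native resolution; repeated hits are averaged."""
    H, W = native_h * scale, native_w * scale
    depth_sum = np.zeros((H, W))
    hits = np.zeros((H, W), dtype=np.int32)
    for samples in frames:          # one iterable of samples per frame
        for r, c, z in samples:     # r, c already in sub-pixel units
            ri, ci = int(round(r)), int(round(c))
            if 0 <= ri < H and 0 <= ci < W:
                depth_sum[ri, ci] += z
                hits[ri, ci] += 1
    merged = np.where(hits > 0, depth_sum / np.maximum(hits, 1), 0.0)
    return merged, hits > 0         # depth grid plus validity mask
```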
For a given set of image frames over time, wherever the feature path passed through a higher resolution sub-pixel, the depth is directly computed for that sub-pixel. For each sub-pixel that is not in the path, depth information from the stereo capture at the original resolution and/or depth information from nearby super-resolved sub-pixels may be combined to estimate a depth for that sub-pixel. A straightforward way this may be accomplished is via push/pull interpolation, e.g., based on only the super-resolved depth information, a technique well known in other areas of pixel processing. Note, however, that far less missing information needs to be estimated (e.g., interpolated) than without the moving pattern.
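A minimal push/pull sketch, assuming a sparse depth grid and validity mask such as those returned above: valid samples are repeatedly averaged down to coarser levels ("push") until the grid is dense, then the coarse values are propagated back up ("pull") to fill only the holes:

```python
import numpy as np

def push_pull_fill(depth, valid):
    """Fill holes in a sparse depth grid by pyramid push/pull
    interpolation (a simplified sketch of the classic technique)."""
    if valid.all() or min(depth.shape) == 1:
        return depth
    # Pad to even dimensions so 2x2 pooling is well defined.
    ph, pw = depth.shape[0] % 2, depth.shape[1] % 2
    d = np.pad(depth, ((0, ph), (0, pw)))
    v = np.pad(valid, ((0, ph), (0, pw)))
    # Push: average the valid samples of each 2x2 block into one cell.
    s = d.reshape(d.shape[0] // 2, 2, d.shape[1] // 2, 2)
    m = v.reshape(v.shape[0] // 2, 2, v.shape[1] // 2, 2)
    counts = m.sum(axis=(1, 3))
    coarse = np.where(counts > 0,
                      (s * m).sum(axis=(1, 3)) / np.maximum(counts, 1), 0.0)
    coarse = push_pull_fill(coarse, counts > 0)   # recurse until dense
    # Pull: upsample (nearest neighbor) and keep the original samples.
    up = coarse.repeat(2, axis=0).repeat(2, axis=1)
    up = up[:depth.shape[0], :depth.shape[1]]
    return np.where(valid, depth, up)
```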
The above description is applicable to stationary scenes (or scenes with slow depth changes relative to the frame rate and feature path). For a scene with moving objects, tracking may be performed to transfer information across different time frames, e.g., using known techniques such as deformable ICP (iterative closest point) solutions. Note that the projector and cameras may themselves be moved to change depths instead of, or in addition to, movement in the scene, e.g., the image capturing system or subsystem may be attached to a robot, mounted to a camera dolly, and so forth.
Step 704 captures the image or images. For brevity, stereo images are used hereafter in the description of
Step 706 evaluates whether some criterion is met that triggers a super-resolution computation. For example, some number of frames (corresponding to an amount of time at a given frame rate) may trigger the super-resolution computation. In general, the feature path will be completed at some time and start repeating over the same sub-pixels, although not necessarily exactly; this completion time may be used to determine a trigger time. Alternatively, some amount of image processing may be done on existing images to determine whether the feature path has hit a suitably high percentage of sub-pixels, i.e., whether coverage is sufficient. If this alternative is used, time or number of frames may be used as a secondary trigger in case the path is such that it takes too long to reach sufficient coverage due to too much repetition.
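A sketch of such trigger logic, with hypothetical thresholds, combining the coverage test with the secondary frame-count trigger described above:

```python
def should_trigger(hit_mask, frames_seen, coverage_target=0.60, max_frames=30):
    """Trigger the super-resolution computation once the feature path
    has touched enough sub-pixels, or fall back to a frame-count limit."""
    coverage = hit_mask.mean()   # fraction of sub-pixels hit so far
    return coverage >= coverage_target or frames_seen >= max_frames
```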
Further, such image pre-processing may be used to vary the projector or projector movement in some way. For example, if the pattern is not providing desired coverage, the pattern density may be increased (if not fixed), or the vibration frequency, duty cycle and/or amplitude may be changed, and so forth. Note that two or more motors (e.g., piezoelectric motors) may be used to change the shapes of vibration-induced feature paths as well (as exemplified by Lissajous figures).
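For instance, two orthogonal vibrations at different frequencies trace a Lissajous figure; below is a sketch of the per-frame pattern offsets such a drive might produce (frequencies, amplitudes, and phase hypothetical):

```python
import numpy as np

def lissajous_offsets(n_frames=64, fx=3.0, fy=2.0, ax=1.5, ay=1.5,
                      phase=np.pi / 2):
    """(dx, dy) pattern offsets per frame, in sub-pixel units, for two
    orthogonal vibrations at different frequencies (a Lissajous figure)."""
    t = np.linspace(0.0, 2.0 * np.pi, n_frames, endpoint=False)
    return ax * np.sin(fx * t + phase), ay * np.sin(fy * t)
```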
Returning to step 706, if not triggered, a conventional depth map may be estimated and output via step 710, e.g., at the native resolution based on the last images captured and processed. The process then returns to step 704 to capture the next set of images. Note that as described above, it is alternatively feasible to return to step 702 if a change to the projector/movement is possible and appropriate, such as because the feature paths are not attaining sufficient sub-pixel coverage.
If at step 706 the super-resolution computation is triggered, step 712 combines the images to obtain feature data for as many sub-pixels as are reached by the feature path. Part of the sub-pixel estimation and combination of images/depths may be ongoing as each new image set is captured, rather than only when triggered (e.g., at least part of step 712 may be performed before step 706), whereby step 712 may only need to process and combine the latest image set/depth data to complete the super-resolved depth map. Note that in the event of a conflict where a sub-pixel has been reached more than once (e.g., the sub-pixel 557 in
At step 714, any missing sub-pixels are estimated, e.g., via interpolation. The super-high resolution depth map is output at step 716, and the process repeats.
Note that while the examples herein are generally directed towards a stereo camera system, a single camera may similarly benefit from the technology described herein. Indeed, depth maps obtained via a single camera and projected light are well known; vibrating the projector likewise provides for more pixels or sub-pixels being illuminated, providing for more accurate native resolution depth maps or super-resolution depth maps. Time-of-flight depth sensing may also benefit from having more parts of an object being illuminated by vibrating the light source.
Example Operating Environment
It can be readily appreciated that the above-described implementation and its alternatives may be implemented on any suitable computing device, including a gaming system, personal computer, tablet, DVR, set-top box, smartphone and/or the like. Combinations of such devices are also feasible when multiple such devices are linked together. For purposes of description, a gaming (including media) system is described as one exemplary operating environment hereinafter.
The CPU 802, the memory controller 803, and various memory devices are interconnected via one or more buses (not shown). The details of the bus that is used in this implementation are not particularly relevant to understanding the subject matter of interest being discussed herein. However, it will be understood that such a bus may include one or more of serial and parallel buses, a memory bus, a peripheral bus, and a processor or local bus, using any of a variety of bus architectures. By way of example, such architectures can include an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnects (PCI) bus also known as a Mezzanine bus.
In one implementation, the CPU 802, the memory controller 803, the ROM 804, and the RAM 806 are integrated onto a common module 814. In this implementation, the ROM 804 is configured as a flash ROM that is connected to the memory controller 803 via a Peripheral Component Interconnect (PCI) bus or the like and a ROM bus or the like (neither of which are shown). The RAM 806 may be configured as multiple Double Data Rate Synchronous Dynamic RAM (DDR SDRAM) modules that are independently controlled by the memory controller 803 via separate buses (not shown). The hard disk drive 808 and the portable media drive 809 are shown connected to the memory controller 803 via the PCI bus and an AT Attachment (ATA) bus 816. However, in other implementations, dedicated data bus structures of different types can also be applied in the alternative.
A three-dimensional graphics processing unit 820 and a video encoder 822 form a video processing pipeline for high speed and high resolution (e.g., High Definition) graphics processing. Data are carried from the graphics processing unit 820 to the video encoder 822 via a digital video bus (not shown). An audio processing unit 824 and an audio codec (coder/decoder) 826 form a corresponding audio processing pipeline for multi-channel audio processing of various digital audio formats. Audio data are carried between the audio processing unit 824 and the audio codec 826 via a communication link (not shown). The video and audio processing pipelines output data to an A/V (audio/video) port 828 for transmission to a television or other display/speakers. In the illustrated implementation, the video and audio processing components 820, 822, 824, 826 and 828 are mounted on the module 814.
In the example implementation depicted in
Memory units (MUs) 850(1) and 850(2) are illustrated as being connectable to MU ports “A” 852(1) and “B” 852(2), respectively. Each MU 850 offers additional storage on which games, game parameters, and other data may be stored. In some implementations, the other data can include one or more of a digital game component, an executable gaming application, an instruction set for expanding a gaming application, and a media file. When inserted into the console 801, each MU 850 can be accessed by the memory controller 803.
A system power supply module 854 provides power to the components of the gaming system 800. A fan 856 cools the circuitry within the console 801.
An application 860 comprising machine instructions is typically stored on the hard disk drive 808. When the console 801 is powered on, various portions of the application 860 are loaded into the RAM 806, and/or the caches 810 and 812, for execution on the CPU 802. In general, the application 860 can include one or more program modules for performing various display functions, such as controlling dialog screens for presentation on a display (e.g., high definition monitor), controlling transactions based on user inputs and controlling data transmission and reception between the console 801 and externally connected devices.
The gaming system 800 may be operated as a standalone system by connecting the system to a high definition monitor, a television, a video projector, or other display device. In this standalone mode, the gaming system 800 enables one or more players to play games or enjoy digital media, e.g., by watching movies or listening to music. However, with the integration of broadband connectivity made available through the network interface 832, the gaming system 800 may further be operated as a participating component in a larger network gaming community or system.
While the invention is susceptible to various modifications and alternative constructions, certain illustrated embodiments thereof are shown in the drawings and have been described above in detail. It should be understood, however, that there is no intention to limit the invention to the specific forms disclosed, but on the contrary, the intention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention.
The present application is a continuation of U.S. Non-provisional patent application Ser. No. 13/924,485, filed Jun. 21, 2013, which claims priority to U.S. provisional patent application Ser. No. 61/812,232, filed Apr. 15, 2013.
Number | Name | Date | Kind |
---|---|---|---|
3938102 | Morrin et al. | Feb 1976 | A |
5351152 | Kuo et al. | Sep 1994 | A |
5471326 | Hall et al. | Nov 1995 | A |
5586200 | Devaney et al. | Dec 1996 | A |
5739906 | Evans et al. | Apr 1998 | A |
6105139 | Dey et al. | Aug 2000 | A |
6751344 | Grumbine | Jun 2004 | B1 |
7315383 | Abdollahi | Jan 2008 | B1 |
7512262 | Criminisi et al. | Mar 2009 | B2 |
7565003 | Ashizaki et al. | Jul 2009 | B2 |
7634395 | Flandrin et al. | Dec 2009 | B2 |
8077034 | Borlez et al. | Dec 2011 | B2 |
8331654 | Abraham et al. | Dec 2012 | B2 |
8442940 | Faletti et al. | May 2013 | B1 |
8787656 | Park et al. | Jul 2014 | B2 |
8818077 | Hwang | Aug 2014 | B2 |
9508003 | Eguro et al. | Nov 2016 | B2 |
9697424 | Kirk et al. | Jul 2017 | B2 |
9760770 | Eguro et al. | Sep 2017 | B2 |
9928420 | Kirk et al. | Mar 2018 | B2 |
9959465 | Georgiou et al. | May 2018 | B2 |
20020136444 | Brown et al. | Sep 2002 | A1 |
20030043270 | Rafey et al. | Mar 2003 | A1 |
20030048459 | Gooch | Mar 2003 | A1 |
20030081833 | Tilton | May 2003 | A1 |
20040105580 | Hager et al. | Jun 2004 | A1 |
20040125222 | Bradski et al. | Jul 2004 | A1 |
20040201586 | Marschner et al. | Oct 2004 | A1 |
20050058362 | Kita | Mar 2005 | A1 |
20050234527 | Slatkine | Oct 2005 | A1 |
20050257748 | Kriesel et al. | Nov 2005 | A1 |
20050279172 | Schreier et al. | Dec 2005 | A1 |
20060176306 | Nagaraj et al. | Aug 2006 | A1 |
20060210146 | Gu | Sep 2006 | A1 |
20060238714 | Fox et al. | Oct 2006 | A1 |
20060291020 | Knox et al. | Dec 2006 | A1 |
20070009150 | Suwa et al. | Jan 2007 | A1 |
20070145273 | Chang | Jun 2007 | A1 |
20070146512 | Suzuki et al. | Jun 2007 | A1 |
20070183657 | Kidono et al. | Aug 2007 | A1 |
20070253310 | Ikenaka | Nov 2007 | A1 |
20070263903 | St. Hilaire et al. | Nov 2007 | A1 |
20080118143 | Gordon et al. | May 2008 | A1 |
20080130015 | Lu | Jun 2008 | A1 |
20080165357 | Stern et al. | Jul 2008 | A1 |
20080187711 | Alam et al. | Aug 2008 | A1 |
20080205748 | Lee et al. | Aug 2008 | A1 |
20080218612 | Border et al. | Sep 2008 | A1 |
20080278570 | Gharib et al. | Nov 2008 | A1 |
20080283729 | Hosaka | Nov 2008 | A1 |
20090021750 | Korner et al. | Jan 2009 | A1 |
20090080048 | Tsao | Mar 2009 | A1 |
20090096783 | Shpunt et al. | Apr 2009 | A1 |
20090217213 | Meserve | Aug 2009 | A1 |
20090217214 | Meserve | Aug 2009 | A1 |
20090231425 | Zalewski | Sep 2009 | A1 |
20090273679 | Gere et al. | Nov 2009 | A1 |
20100042964 | Meserve | Feb 2010 | A1 |
20100046004 | Lee et al. | Feb 2010 | A1 |
20100074532 | Gordon et al. | Mar 2010 | A1 |
20100177164 | Zalevsky et al. | Jul 2010 | A1 |
20100202725 | Popovich et al. | Aug 2010 | A1 |
20100277571 | Xu et al. | Nov 2010 | A1 |
20100289885 | Lu et al. | Nov 2010 | A1 |
20110025827 | Shpunt et al. | Feb 2011 | A1 |
20110063427 | Fengler et al. | Mar 2011 | A1 |
20110078189 | Bonchi et al. | Mar 2011 | A1 |
20110080487 | Venkataraman et al. | Apr 2011 | A1 |
20110091096 | Morris et al. | Apr 2011 | A1 |
20110103711 | Su et al. | May 2011 | A1 |
20110149031 | Um et al. | Jun 2011 | A1 |
20110222757 | Yeatman et al. | Sep 2011 | A1 |
20110228097 | Motta | Sep 2011 | A1 |
20110310220 | Mceldowney | Dec 2011 | A1 |
20120002045 | Tony et al. | Jan 2012 | A1 |
20120025080 | Liu et al. | Feb 2012 | A1 |
20120038986 | Pesach | Feb 2012 | A1 |
20120056982 | Katz | Mar 2012 | A1 |
20120087572 | Dedeoglu et al. | Apr 2012 | A1 |
20120120494 | Takayama | May 2012 | A1 |
20120154397 | Chernikov et al. | Jun 2012 | A1 |
20120155747 | Hwang | Jun 2012 | A1 |
20120242829 | Shin et al. | Sep 2012 | A1 |
20120253201 | Reinhold | Oct 2012 | A1 |
20120281087 | Kruse | Nov 2012 | A1 |
20120294510 | Zhang et al. | Nov 2012 | A1 |
20120307075 | Margalit | Dec 2012 | A1 |
20130002814 | Park et al. | Jan 2013 | A1 |
20130003069 | Umeda et al. | Jan 2013 | A1 |
20130051657 | Ostermann et al. | Feb 2013 | A1 |
20130083062 | Geisner et al. | Apr 2013 | A1 |
20130095302 | Pettis et al. | Apr 2013 | A1 |
20130100256 | Kirk et al. | Apr 2013 | A1 |
20130100282 | Siercks | Apr 2013 | A1 |
20130141545 | Macchia et al. | Jun 2013 | A1 |
20130141611 | Hirai et al. | Jun 2013 | A1 |
20130215235 | Russell | Aug 2013 | A1 |
20130229396 | Huebner | Sep 2013 | A1 |
20130265623 | Sugiyama et al. | Oct 2013 | A1 |
20130278631 | Border et al. | Oct 2013 | A1 |
20130287291 | Cho | Oct 2013 | A1 |
20130335531 | Lee et al. | Dec 2013 | A1 |
20140055560 | Fu et al. | Feb 2014 | A1 |
20140098342 | Webb | Apr 2014 | A1 |
20140104387 | Klusza et al. | Apr 2014 | A1 |
20140112573 | Francis et al. | Apr 2014 | A1 |
20140120319 | Joseph | May 2014 | A1 |
20140132501 | Choi et al. | May 2014 | A1 |
20140132728 | Verano et al. | May 2014 | A1 |
20140139717 | Short | May 2014 | A1 |
20140168380 | Heidemann et al. | Jun 2014 | A1 |
20140180639 | Cheatham et al. | Jun 2014 | A1 |
20140184584 | Reif et al. | Jul 2014 | A1 |
20140206443 | Sharp et al. | Jul 2014 | A1 |
20140225985 | Klusza et al. | Aug 2014 | A1 |
20140225988 | Poropat | Aug 2014 | A1 |
20140241612 | Rhemann et al. | Aug 2014 | A1 |
20140293011 | Lohry et al. | Oct 2014 | A1 |
20140307047 | Kirk et al. | Oct 2014 | A1 |
20140307055 | Kang et al. | Oct 2014 | A1 |
20140307058 | Kirk et al. | Oct 2014 | A1 |
20140307098 | Kang et al. | Oct 2014 | A1 |
20140307307 | Georgiou et al. | Oct 2014 | A1 |
20140307952 | Sweeney et al. | Oct 2014 | A1 |
20140307953 | Kirk et al. | Oct 2014 | A1 |
20140309764 | Socha-Leialoha et al. | Oct 2014 | A1 |
20140310496 | Eguro et al. | Oct 2014 | A1 |
20140320605 | Johnson | Oct 2014 | A1 |
20140354803 | Chida | Dec 2014 | A1 |
20150078672 | Eguro et al. | Mar 2015 | A1 |
20150316368 | Moench et al. | Nov 2015 | A1 |
20170116757 | Shpunt et al. | Apr 2017 | A1 |
20180218210 | Georgiou et al. | Aug 2018 | A1 |
20180260623 | Kang et al. | Sep 2018 | A1 |
Number | Date | Country |
---|---|---|
1244008 | Feb 2000 | CN |
1414412 | Apr 2003 | CN |
1414420 | Apr 2003 | CN |
1445724 | Oct 2003 | CN |
1541483 | Oct 2004 | CN |
1669051 | Sep 2005 | CN |
1735789 | Feb 2006 | CN |
101061367 | Oct 2007 | CN |
101124514 | Feb 2008 | CN |
101309429 | Nov 2008 | CN |
101443809 | May 2009 | CN |
101501442 | Aug 2009 | CN |
101509764 | Aug 2009 | CN |
101711354 | May 2010 | CN |
101878409 | Nov 2010 | CN |
102027434 | Apr 2011 | CN |
102036599 | Apr 2011 | CN |
102231037 | Nov 2011 | CN |
102362150 | Feb 2012 | CN |
102385237 | Mar 2012 | CN |
102572485 | Jul 2012 | CN |
102638692 | Aug 2012 | CN |
102760234 | Oct 2012 | CN |
102803894 | Nov 2012 | CN |
102831380 | Dec 2012 | CN |
102867328 | Jan 2013 | CN |
103308517 | Sep 2013 | CN |
0085210 | Aug 1983 | EP |
2295932 | Mar 2011 | EP |
2400261 | Dec 2011 | EP |
2481459 | Dec 2011 | GB |
2000341721 | Dec 2000 | JP |
2003058911 | Feb 2003 | JP |
2004135209 | Apr 2004 | JP |
2005341470 | Dec 2005 | JP |
2006229802 | Aug 2006 | JP |
2009014501 | Jan 2009 | JP |
2010011223 | Jan 2010 | JP |
2010504522 | Feb 2010 | JP |
2010145186 | Jul 2010 | JP |
2011514232 | May 2011 | JP |
2013544449 | Dec 2013 | JP |
20110046222 | May 2011 | KR |
20110132260 | Dec 2011 | KR |
101137646 | Apr 2012 | KR |
2237284 | Sep 2004 | RU |
2006016303 | Feb 2006 | WO |
2007132399 | Nov 2007 | WO |
2009046268 | Apr 2009 | WO |
2012137434 | Oct 2012 | WO |
Entry |
---|
“Non Final Office Action Issued In U.S. Appl. No. 15/912,555”, dated Apr. 18, 2019, 16 Pages. |
“Notice On Reexamination Issued In Chinese Patent Application No. 201480021519.6”, dated Mar. 29, 2019, 8 Pages. |
“First Office Action and Search Report Issued in Chinese Patent Application No. 201480021460.0”, dated Mar. 28, 2017, 17 Pages. |
“Second Office Action Issued in Chinese Patent Application No. 201480021460.0”, dated Dec. 11, 2017, 6 Pages. |
“Third Office Action Issued in Chinese Patent Application No. 201480021460.0”, dated Jul. 2, 2018, 7 Pages. |
“Office Action Issued In Chinese Patent Application No. 201480021487.X”, dated Jun. 20, 2018, 7 Pages. |
“Office Action Issued in Chinese Patent Application No. 201480021493.5”, dated Dec. 19, 2017, 12 Pages. |
“Second Office Action Issued In Chinese Patent Application No. 201480021493.5”, dated Aug. 15, 2018, 10 Pages. |
“Office Action Issued In Chinese Patent Application No. 201480021519.6”, dated Sep. 4, 2018, 10 Pages. |
“First Office Action and Search Report Issued in Chinese Patent Application No. 201480021519.6”, dated Aug. 30, 2016, 18 Pages. |
“Fourth Office Action Issued in Chinese Patent Application No. 201480021519.6”, dated Mar. 8, 2018, 15 Pages. |
“Office Action Issued in Chinese Patent Application No. 201480021519.6”, dated Sep. 19, 2017, 14 Pages. |
“Second Office Action and Search Report Issued in Chinese Patent Application No. 201480021519.6”, dated Mar. 30, 2017, 17 Pages. |
“First Office Action and Search Report Issued in Chinese Patent Application No. 201480021528.5”, dated Dec. 11, 2017, 14 Pages. |
“Second Office Action Issued In Chinese Patent Application No. 201480021528.5”, dated Aug. 2, 2018, 8 Pages. |
“First Office Action and Search Report Issued in Chinese Patent Application No. 201480021958.7”, dated Oct. 18, 2017, 19 Pages. |
“Second Office Action Issued In Chinese Patent Application No. 201480021958.7”, dated Jun. 13, 2018, 16 Pages. |
“Office Action Issued In Russian Patent Application No. 2015143654”, dated Jan. 31, 2018, 8 Pages. |
“Office Action Issued in Japanese Patent Application No. 2016-508993”, dated Feb. 6, 2018, 5 Pages. |
“Office Action Issued in Japanese Patent Application No. 2016-508993”, dated Aug. 7, 2018, 5 Pages. |
Anderson, et al., “The IBM System/360 Model 91: Machine Philosophy And Instruction-Handling”, Published in IBM Journal of Research and Development, vol. 11, Issue 1, Jan. 1, 1967, pp. 8-24. |
Chan, et al., “Regular Stereo Matching Improvement System Based On Kinect-Supporting Mechanism”, In Open Journal of Applied Sciences, vol. 3, Issue 1, Mar. 30, 2013, 5 Pages. |
Chihoub, et al., “A Band Processing Imaging Library For A Tricore-Based Digital Still Camera”, In Research Article of Real-Time Imaging, vol. 7, Issue 4, Aug. 1, 2001, pp. 327-337. |
Gao, et al., “Stereo Matching Algorithm Based On Illumination Normal Similarity and Adaptive Support Weight”, In Optical Engineering, vol. 52, Issue 2, Feb. 2013, 8 Pages. |
Gu, et al., “Trinocular Disparity Estimation With Adaptive Support Weight And Disparity Calibration”, In Optical Engineering, vol. 47, Issue 3, Mar. 1, 2008, 5 Pages. |
Hariyama, et al., “Optimal Periodical Memory Allocation For Logic-In-Memory Image Processors”, In Proceedings Of the Emerging VLSI Technologies and Architectures (ISVLSI'06), Mar. 2, 2006, pp. 193-198. |
Hosni, et al., “Near Real-Time Stereo With Adaptive Support Weight Approaches”, In Proceedings of International Symposium 3D Data Processing, Visualization and Transmission, May 17, 2010, 8 Pages. |
Kanade, et al., “Development Of A Video-Rate Stereo Machine”, In Proceedings Of IEEE/RSJ International Conference on Intelligent Robots and Systems. Human Robot Interaction and Cooperative Robots, Aug. 5, 1995, pp. 95-100. |
Kuon, et al., “FPGA Architecture: Survey And Challenges”, In Foundations and Trends in Electronic Design Automation vol. 2, Issue 2, Apr. 18, 2008, pp. 135-253. |
Langmann, et al., “Depth Camera Technology Comparison And Performance Evaluation”, In Proceedings of the 1st International Conference on Pattern Recognition Applications and Methods, Feb. 2012, 7 Pages. |
McIlroy, et al., “Kinectrack: Agile 6-DoF Tracking Using A Projected Dot Pattern”, Published In IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Nov. 5, 2012, 7 Pages. |
“Office Action Issued in Mexican Patent Application No. MX/a/2015/014577”, dated Nov. 10, 2017, 3 Pages. |
“International Preliminary Report on Patentability Issued in PCT Application No. PCT/US2014/033909”, dated Jun. 29, 2015, 17 Pages. |
“International Search Report and Written Opinion Issued in PCT Application No. PCT/US2014/033909”, dated Jul. 25, 2014, 16 Pages. |
“Second Written Opinion Issued in PCT Application No. PCT/US2014/033909”, dated Mar. 25, 2015, 14 Pages. |
“International Preliminary Report on Patentability Issued in PCT Application No. PCT/US2014/033910”, dated Jul. 24, 2015, 8 Pages. |
“International Search Report and Written Opinion Issued in PCT Application No. PCT/US2014/033910”, dated Jul. 25, 2014, 9 Pages. |
“Second Written Opinion Issued In PCT Patent Application No. PCT/US2014/033910”, dated Dec. 18, 2014, 5 Pages. |
“International Preliminary Report on Patentability Issued in PCT Application No. PCT/US2014/033915”, dated Apr. 7, 2015, 10 Pages. |
“International Search Report and Written Opinion Issued in PCT Application No. PCT/US2014/033915”, dated Jul. 16, 2014, 11 Pages. |
“Second Written Opinion Issued in PCT Application No. PCT/US2014/033915”, dated Jan. 8, 2015, 9 Pages. |
“International Preliminary Report on Patentability Issued in PCT Patent Application No. PCT/US2014/033916”, dated Jul. 13, 2015, 7 Pages. |
“International Search Report and Written Opinion Issued in PCT Application No. PCT/US2014/033916”, dated Jul. 18, 2014, 8 Pages. |
“Second Written Opinion Issued in PCT Application No. PCT/US2014/033916”, dated Mar. 27, 2015, 6 Pages. |
“International Preliminary Report on Patentability Issued in PCT Patent Application No. PCT/US2014/033917”, dated Jul. 20, 2015, 8 Pages. |
“International Search Report and Written Opinion Issued in PCT Application No. PCT/US2014/033917”, dated Jul. 18, 2014, 10 Pages. |
“International Preliminary Report on Patentability Issued in PCT Patent Application No. PCT/US2014/033919”, dated Jul. 13, 2015, 7 Pages. |
“International Search Report & Written Opinion Issued In PCT Application No. PCT/US2014/033919”, dated Jul. 17, 2014, 8 Pages. |
“Second Written Opinion Issued In PCT Application No. PCT/US2014/033919”, dated Mar. 27, 2015, 6 Pages. |
“International Search Report Issued In PCT Patent Application No. PCT/US2014/033996”, dated Jul. 31, 2014, 3 Pages. |
Yamaguchi, et al., “Interleaved Pixel Lookup For Embedded Computer Vision”, In Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, Jun. 23, 2008, pp. 1-8. |
Yoon, et al., “Locally Adaptive Support-Weight Approach For Visual Correspondence Search”, In Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Jun. 20, 2005, 8 Pages. |
“Third Office Action Issued in Chinese Patent Application No. 201480021958.7”, dated Dec. 3, 2018, 10 Pages. |
“Fourth Office Action Issued in Chinese Patent Application No. 201480021460.0”, dated Jan. 24, 2019, 7 Pages. |
“Third Office Action Issued in Chinese Patent Application No. 201480021493.5”, dated Feb. 26, 2019, 10 Pages. |
“Third Office Action Issued in Chinese Patent Application No. 201480021528.5”, dated Feb. 11, 2019, 9 Pages. |
“Office Action Issued in European Patent Application No. 14724942.9”, dated Oct. 21, 2019, 8 Pages. |
“Office Action Issued in European Patent Application No. 14725312.4”, dated Oct. 9, 2019, 5 Pages. |
“Office Action Issued in European Patent Application No. 14727992.1”, dated Oct. 9, 2019, 8 Pages. |
Schuon, et al., “High-Quality Scanning Using Time-Of-Flight Depth Superresolution”, In Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, Jun. 23, 2008, 7 Pages. |
“Decision on Rejection Issued In Chinese Patent Application No. 201480021493.5”, dated Aug. 21, 2019, 6 Pages. |
“Office Action Issued in Indian Patent Application No. 6032/CHENP/2015”, dated Aug. 13, 2019, 6 Pages. |
“Office Action Issued in European Patent Application No. 14724934.6”, dated Oct. 24, 2019, 8 Pages. |
“Office Action Issued in Chinese Patent Application No. 201480021422.5”, dated Nov. 12, 2019, 6 Pages. |
“Office Action Issued in European Patent Application No. 14725861.0”, dated Nov. 8, 2019, 5 Pages. |
“Final Office Action Issued in U.S. Appl. No. 15/912,555”, dated Oct. 8, 2019, 15 Pages. |
“Third Office Action Issued in Chinese Patent Application No. 201480021487.X”, dated Feb. 28, 2020, 18 Pages. |
“Office Action Issued in Indian Patent Application No. 6362/CHENP/2015”, dated Feb. 26, 2020, 6 Pages. |
“Non Final Office Action Issued In U.S. Appl. No. 15/912,555”, dated Feb. 7, 2020, 17 Pages. |
“Office Action Issued in European Patent Application No. 14726261.2”, dated Nov. 29, 2019, 8 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 15/937,851”, dated Dec. 23, 2019, 14 Pages. |
“Final Office Action Issued in U.S. Appl. No. 13/915,622”, dated Jan. 21, 2016, 28 Pages. |
“Final Office Action Issued in U.S. Appl. No. 13/915,622”, dated Mar. 7, 2017, 28 Pages. |
“Final Office Action Issued in U.S. Appl. No. 13/915,622”, dated Mar. 9, 2018, 28 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 13/915,622”, dated Aug. 5, 2016, 39 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 13/915,622”, dated Sep. 2, 2015, 28 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 13/915,622”, dated Aug. 15, 2017, 27 pages. |
“Final Office Action Issued in U.S. Appl. No. 13/915,626”, dated Mar. 28, 2017, 17 Pages. |
“Final Office Action Issued in U.S. Appl. No. 13/915,626”, dated Jul. 11, 2016, 13 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 13/915,626”, dated Dec. 9, 2016, 16 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 13/915,626”, dated Jan. 29, 2016, 17 Pages. |
“Final Office Action Issued in U.S. Appl. No. 13/918,892”, dated Nov. 19, 2015, 7 Pages. |
“Final Office Action Issued in U.S. Appl. No. 13/918,892”, dated Dec. 21, 2016, 6 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 13/918,892”, dated Mar. 28, 2016, 6 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 13/918,892”, dated May 11, 2015, 6 Pages. |
“Final Office Action Issued in U.S. Appl. No. 13/923,135”, dated Oct. 3, 2016, 15 Pages. |
“Final Office Action Issued in U.S. Appl. No. 13/923,135”, dated Jul. 27, 2015, 21 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 13/923,135”, dated Mar. 31, 2016, 14 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 13/923,135”, dated May 5, 2017, 17 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 13/923,135”, dated Dec. 19, 2014, 13 Pages. |
“Final Office Action Issued in U.S. Appl. No. 13/924,464”, dated Sep. 15, 2016, 25 Pages. |
“Final Office Action Issued in U.S. Appl. No. 13/924,464”, dated Dec. 17, 2015, 25 Pages. |
“Final Office Action Issued in U.S. Appl. No. 13/924,464”, dated Sep. 27, 2017, 37 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 13/924,464”, dated Aug. 10, 2015, 20 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 13/924,464”, dated May 17, 2016, 26 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 13/924,464”, dated May 5, 2017, 38 Pages. |
“Final Office Action Issued in U.S. Appl. No. 13/924,475”, dated May 6, 2016, 13 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 13/924,475”, dated Sep. 8, 2016, 15 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 13/924,475”, dated Mar. 24, 2015, 17 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 13/924,475”, dated Oct. 7, 2015, 14 Pages. |
“Final Office Action Issued in U.S. Appl. No. 13/925,762”, dated May 11, 2016, 22 Pages. |
“Final Office Action Issued in U.S. Appl. No. 13/925,762”, dated May 9, 2017, 26 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 13/925,762”, dated Oct. 3, 2016, 28 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 13/925,762”, dated Nov. 5, 2015, 13 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 14/088,408”, dated Sep. 29, 2016, 16 Pages. |
“Final Office Action Issued in U.S. Appl. No. 14/253,696”, dated Feb. 25, 2016, 23 Pages. |
“Non Final Office Action Issued in U.S. Appl. No. 14/253,696”, dated Sep. 10, 2015, 18 Pages. |
“Notice of Allowance Issued in U.S. Appl. No. 14/253,696”, dated Jul. 21, 2016, 17 Pages. |
“Office Action Issued in European Patent Application No. 14723271.4”, dated Sep. 19, 2016, 9 Pages. |
“Office Action Issued in European Patent Application No. 14723271.4”, dated Jun. 29, 2017, 12 Pages. |
“Oral Hearing Issued in European Patent Application No. 14723271.4”, dated Mar. 29, 2018, 16 Pages. |
“Office Action Issued in Australian Patent Application No. 2014254219”, dated Apr. 19, 2017, 3 Pages. |
“First Office Action and Search Report Issued in Chinese Patent Application No. 201480021199.4”, dated May 31, 2017, 17 Pages. |
“Second Office Action Issued in Chinese Patent Application No. 201480021199.4”, dated Jan. 17, 2018, 7 Pages. |
“Third Office Action Issued in Chinese Patent Application No. 201480021199.4”, dated Jun. 19, 2018, 13 Pages. |
“Office Action Issued In Chinese Patent Application No. 201480021422.5”, dated Jul. 5, 2018, 10 Pages. |
“First Office Action & Search Report Issued in Chinese Patent Application No. 201480021422.5”, dated Mar. 21, 2017, 16 Pages. |
“Second Office Action Issued in Chinese Patent Application No. 201480021422.5”, dated Dec. 11, 2017, 14 Pages. |
“Connected component (Graph Theory)”, Retrieved From: https://en.wikipedia.org/w/index.php?title=Connected_component_(graph_theory)&oldid=841872370, May 18, 2018, 3 Pages. |
“Final Office Action Issued in U.S Appl. No. 13/924,485”, dated Sep. 13, 2016, 16 Pages. |
“Final Office Action Issued in U.S. Appl. No. 13/924,485”, dated Jul. 20, 2017, 16 Pages. |
“Non Final office Action issued in U.S. Appl. No. 13/924,485”, dated Dec. 15, 2015, 15 Pages. |
“Non-Final Office Action Issued in U.S. Appl. No. 13/924,485”, dated Mar. 7, 2017, 16 Pages. |
“Notice of Allowance Issued in U.S. Appl. No. 13/924,485”, dated Nov. 2, 2017, 8 Pages. |
“Office Action Issued in Chinese Application No. 201480021487.X”, dated Apr. 21, 2017, 13 Pages. |
“Second Office Action Issued in Chinese Patent Application No. 201480021487.X”, dated Dec. 15, 2017, 10 Pages. |
“International Preliminary Report on Patentability Issued in PCT Patent Application No. PCT/US2014/033911”, dated Jul. 13, 2015, 6 Pages. |
“International Search Report & Written Opinion Issued in PCT Application No. PCT/US2014/033911”, dated Aug. 29, 2014, 8 Pages. |
“Second Written Opinion Issued in PCT Application No. PCT/US2014/033911”, dated Apr. 2, 2015, 5 Pages. |
Yang, et al.“Spatial-Depth Super Resolution for Range Images”, In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Jun. 1, 2007, 8 Pages. |
“Notice of Allowance Issued in Korean Patent Application No. 10-2015-7032651”, dated Apr. 13, 2020, 4 Pages. |
“Office Action Issued in Korean Patent Application No. 10-2015-7032633”, dated Apr. 29, 2020, 8 Pages. |
“Final Office Action Issued in U.S. Appl. No. 15/912,555”, dated Jun. 10, 2020, 16 Pages. |
“Final Office Action Issued in U.S. Appl. No. 15/937,851”, dated Jul. 28, 2020, 10 Pages. |
“Office Action Issued in European Patent Application No. 14724942.9”, dated Jul. 27, 2020, 7 Pages. |
Number | Date | Country | |
---|---|---|---|
20180173947 A1 | Jun 2018 | US |
Number | Date | Country | |
---|---|---|---|
61812232 | Apr 2013 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13924485 | Jun 2013 | US |
Child | 15889188 | US |