The present invention relates generally to systems and methods for electronic imaging, and specifically to methods of illumination for enhancing the quality of captured images.
Most low-cost CMOS image sensors use a rolling shutter, in which successive rows of sensor elements are triggered sequentially to capture light. This method of image acquisition thus records each individual frame not as a single snapshot at a point in time, but rather as a sequence of image stripes scanning across the frame. The result of the rolling shutter is that not all parts of the optical image are recorded at exactly the same time (although the frame is stored as a single electronic image).
The use of a rolling shutter introduces a temporal shear in the image frame, which can create artifacts in imaging of moving objects. Bradley et al. address this problem in “Synchronization and Rolling Shutter Compensation for Consumer Video Camera Arrays,” IEEE International Workshop on Projector-Camera Systems—PROCAMS 2009 (Miami Beach, Fla., 2009), which is incorporated herein by reference. The authors propose to solve the problem using synchronized stroboscopic illumination.
Embodiments of the present invention that are described hereinbelow provide apparatus and methods for illuminating an object that can be advantageous when the object is imaged using a sensor with a rolling shutter.
There is therefore provided, in accordance with an embodiment of the present invention, imaging apparatus, including an illumination assembly, including a plurality of radiation sources and projection optics, which are configured to project radiation from the radiation sources onto different, respective regions of a scene. An imaging assembly includes an image sensor and objective optics configured to form an optical image of the scene on the image sensor, which includes an array of sensor elements arranged in multiple groups, which are triggered by a rolling shutter to capture the radiation from the scene in successive, respective exposure periods from different, respective areas of the scene so as to form an electronic image of the scene. A controller is coupled to actuate the radiation sources sequentially in a pulsed mode so that the illumination assembly illuminates the different, respective areas of the scene in synchronization with the rolling shutter.
In disclosed embodiments, each group includes one or more rows of the sensor elements, and the regions define stripes that extend across the scene in a direction parallel to the rows. Typically, each stripe illuminates a respective region that contains the areas of the scene from which the sensor elements in a respective set of multiple rows capture the radiation, and the controller is configured to actuate the radiation sources so that the projected radiation sweeps across the scene in a direction perpendicular to the rows.
In a disclosed embodiment, the rolling shutter defines a frame time for capturing the entire electronic image, and the controller is configured to actuate each of the radiation sources for a respective actuation period that is less than half the frame time. The controller may actuate each of the radiation sources so that the illumination assembly illuminates each area of the scene only during a respective exposure period of a corresponding group of the sensor elements that captures the radiation from the area.
In some embodiments, the projection optics include a patterning element, which is configured so that the radiation is projected onto the scene in a predefined pattern, which is detectable in the electronic image formed by the imaging assembly. Typically, the controller is configured to analyze the pattern in the electronic image so as to generate a depth map of the scene. In one embodiment, the radiation sources include a matrix of light-emitting elements, which are arranged on a substrate and are configured to emit the radiation in a direction perpendicular to the substrate. In another embodiment, the radiation sources include a row of edge-emitting elements, which are arranged on a substrate and are configured to emit the radiation in a direction parallel to the substrate, and the illumination assembly includes a reflector disposed on the substrate so as to turn the radiation emitted by the edge-emitting elements away from the substrate and toward the patterning element.
There is also provided, in accordance with an embodiment of the present invention, a method for imaging, including arranging a plurality of radiation sources to project radiation onto different, respective regions of a scene. An image sensor, which includes an array of sensor elements arranged in multiple groups, is configured to receive an optical image of the scene, in which the groups of the sensor elements receive the radiation from different, respective areas of the scene. The groups of the sensor elements are triggered with a rolling shutter to capture the radiation from the scene in successive, respective exposure periods so as to form an electronic image of the scene. The radiation sources are actuated sequentially in a pulsed mode so as to illuminate the different, respective areas of the scene in synchronization with the rolling shutter.
In one embodiment, configuring the image sensor includes arranging multiple image sensors, having respective rolling shutters, together with multiple, respective pluralities of the radiation sources to form respective electronic images of different, respective, overlapping parts of a scene, and actuating the radiation sources includes synchronizing the respective pluralities of the radiation sources over the multiple image sensors so as to control an overlap of the respective areas of the scene illuminated by the radiation sources at any given time. The method may include analyzing the pattern over the electronic images formed by the multiple image sensors in order to generate a depth map of the scene.
There is additionally provided, in accordance with an embodiment of the present invention, imaging apparatus, including multiple imaging units. The imaging units include respective pluralities of radiation sources and projection optics, which are configured to project radiation from the radiation sources onto different, respective regions of a scene, and respective imaging assemblies. The imaging assemblies include respective image sensors and objective optics configured to form respective optical images of different, respective, overlapping parts of the scene on the respective image sensors. Each image sensor includes an array of sensor elements arranged in multiple groups, which are triggered by a rolling shutter to capture the radiation from the scene in successive, respective exposure periods from different, respective areas of the scene so as to form respective electronic images of the scene. The radiation sources are actuated sequentially in a pulsed mode so that each imaging unit illuminates the different, respective areas of the scene in synchronization with the rolling shutter of its image sensor, while synchronizing the respective pluralities of the radiation sources over the multiple image sensors so as to control an overlap of the respective areas of the scene illuminated by the radiation sources at any given time.
Typically, the overlap is controlled so that the respective areas of the scene illuminated by the radiation sources at any given time are non-overlapping.
The present invention will be more fully understood from the following detailed description of the embodiments thereof, taken together with the drawings in which:
Various types of imaging systems include optical projectors for illuminating the scene of interest. For example, a projector may be used to cast a pattern of coded or structured light onto an object for purposes of three-dimensional (3D) depth mapping. In this regard, U.S. Patent Application Publication 2008/0240502, whose disclosure is incorporated herein by reference, describes an illumination assembly in which a light source, such as a laser diode or LED, transilluminates a transparency with optical radiation so as to project a pattern onto the object. (The terms “optical,” “light” and “illumination” as used herein refer generally to any of visible, infrared, and ultraviolet radiation.) An image sensor captures an image of the pattern that is projected onto the object, and a processor processes the image so as to reconstruct a three-dimensional (3D) map of the object.
Systems based on projection of patterned light may suffer from low signal/background ratio due to limitations on the power of the projector, particularly in conditions of strong ambient light. Embodiments of the present invention address this problem by projecting radiation onto the scene of interest in a synchronized spatial sweep, which is timed to take advantage of the rolling shutter of the image sensor in order to improve the signal/background ratio of the system.
In embodiments of the present invention, the rolling shutter is operated so as to cause different groups (typically successive rows) of sensor elements in the image sensor to capture radiation in different, successive exposure periods, which are much shorter than the total frame period (typically less than half, and possibly less than 10%). Each such group collects radiation from a different, respective area of the scene, which is focused onto the image sensor by objective optics. The illumination assembly is controlled so as to sweep the projected radiation over those areas of the scene in synchronization with the rolling shutter, so that each area of the scene is illuminated during the specific time that the corresponding group of sensor elements is active. As a result, the output power of the illumination assembly is concentrated, in each area of the scene, in the specific exposure periods during which the corresponding sensor elements are able to collect radiation from that area. Limitation of the exposure periods by the rolling shutter reduces the total amount of ambient radiation that is collected, without wasting any of the projected radiation. Therefore, the signal/background ratio of the system is enhanced substantially even without increasing the average power of the illumination.
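By way of illustration only, the gain can be estimated with a short back-of-envelope calculation. The Python sketch below uses assumed, representative numbers (frame time, number of stripes, power levels) that are not taken from any specific embodiment; it compares continuous illumination with a full-frame exposure against the synchronized pulsed sweep with short exposures, and shows the roughly N-fold improvement in signal/background ratio described above.

```python
# Illustrative estimate only: all numbers below are assumptions, not values from
# the embodiments. N illumination stripes divide the frame, and the projector's
# average power budget is concentrated into the short exposure of each stripe.

FRAME_TIME_MS = 33.3      # assumed frame period (~30 frames/sec)
NUM_STRIPES = 10          # assumed number of stripes / radiation sources
EXPOSURE_MS = FRAME_TIME_MS / NUM_STRIPES   # per-group exposure (10% of the frame)

AVG_PROJECTOR_MW = 100.0  # assumed average optical power of the illumination assembly
AMBIENT_MW = 50.0         # assumed ambient power collected from the same area

# Continuous illumination with a full-frame exposure: signal and ambient are both
# integrated over the whole frame time.
signal_cw = AVG_PROJECTOR_MW * FRAME_TIME_MS
background_cw = AMBIENT_MW * FRAME_TIME_MS

# Pulsed, synchronized illumination: the same average power is emitted at N-fold
# peak power only while the corresponding rows are exposed, so no projected
# radiation is wasted, while ambient is integrated only over the short exposure.
signal_pulsed = (AVG_PROJECTOR_MW * NUM_STRIPES) * EXPOSURE_MS   # == signal_cw
background_pulsed = AMBIENT_MW * EXPOSURE_MS                     # reduced by N

print(signal_cw / background_cw)          # 2.0
print(signal_pulsed / background_pulsed)  # ~20.0 -> roughly N-fold improvement
```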
In the embodiments that are disclosed hereinbelow, the illumination assembly comprises an array of radiation sources, with projection optics that project radiation from the radiation sources onto different, respective regions of the scene. The spatial sweep of the projected radiation is accomplished by pulsing the radiation sources sequentially. The respective region of the scene that is illuminated by each radiation source overlaps the areas in the scene that are sensed by one or more of the groups of the sensor elements. Each radiation source is thus pulsed on only during the time that the corresponding groups of sensor elements are active. This sequential pulsed operation of the array of radiation sources provides full flexibility in choosing the optimal timing for the spatial sweep of radiation, as well as high reliability in that no moving parts or active optical elements (other than the radiation sources themselves) are required to implement the sweep.
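The sequential pulsed sweep can be outlined schematically as follows. This is a minimal sketch only, assuming a hypothetical controller interface (`set_source` is an illustrative placeholder, not an actual driver call), a fixed number of sources, and one uniform time slot per stripe; a real controller would derive the slot boundaries from the rolling-shutter timing of the image sensor.

```python
# Minimal sketch, not an actual driver: pulse NUM_SOURCES radiation sources one
# after another, once per frame, so that the projected stripe sweeps the scene in
# step with the rolling shutter. Timing values are assumptions.

import time

NUM_SOURCES = 10                        # assumed: one source (or group) per stripe
FRAME_TIME_S = 1 / 30.0                 # assumed frame period
SLOT_S = FRAME_TIME_S / NUM_SOURCES     # actuation period per source (duty cycle ~1:N)


def set_source(index: int, on: bool) -> None:
    """Placeholder for driving radiation source `index` on or off."""
    pass


def run_one_frame(frame_start: float) -> None:
    for i in range(NUM_SOURCES):
        slot_start = frame_start + i * SLOT_S
        time.sleep(max(0.0, slot_start - time.monotonic()))
        set_source(i, True)     # stripe i is illuminated...
        time.sleep(SLOT_S)
        set_source(i, False)    # ...only while its rows of sensor elements integrate


for _ in range(3):              # a few frames; in practice this runs continuously
    run_one_frame(time.monotonic())
```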
Although the embodiments that are described below relate specifically to projection of patterned light in a 3D sensing system, the principles of the present invention may similarly be applied to enhance the performance of other projection-based imaging systems. The rolling shutter in these embodiments is assumed to activate the sensor elements in the image sensor row by row, as in conventional CMOS image sensors that are known in the art; but the principles of the present invention may similarly be applied in conjunction with image sensors that use other sorts of sequential activation of groups of sensor elements, such as block-by-block activation.
An illumination assembly 22 projects a patterned radiation field 24 onto an object 26 (in this case a hand of a user of the system) in a scene. An imaging assembly 28 captures an image of the scene within a field of view 30. A controller 31 or other electronic processor processes the image in order to generate a 3D depth map of object 26. Further details of this sort of mapping process are described, for example, in the above-mentioned U.S. 2008/0240502 and in PCT International Publication WO 2007/105205, whose disclosure is also incorporated herein by reference. The 3D map of the user's hand (and/or other parts of the user's body) may be used in a gesture-based computer interface, but this sort of functionality is beyond the scope of the present patent application.
Imaging assembly 28 comprises objective optics 36, which form an optical image of the scene containing object 26 on an image sensor 38, such as a CMOS integrated circuit image sensor. The image sensor comprises an array of sensor elements 40, arranged in multiple rows. The sensor elements generate respective signals in response to the radiation focused onto them by optics 36, wherein the pixel value of each pixel in the electronic images output by image sensor 38 corresponds to the signal from a respective sensor element 40. The sensor elements are activated and deactivated, row by row, by a rolling shutter, whose timing is set by controller 31. This sort of rolling shutter operation is a standard feature of many CMOS image sensors.
Illumination assembly 22 comprises a projection module 32, which generates a beam of patterned light, and projection optics 34, which project the beam onto field 24. Module 32 typically comprises multiple radiation sources, along with optics for pattern generation. Controller 31 actuates the radiation sources sequentially, in a pulsed mode, in synchronization with the rolling shutter of image sensor 38. The design of module 32 and the synchronization of its operation with the rolling shutter are described in detail hereinbelow.
Illumination assembly 22 generates multiple stripes 46, 48, 50, 52, . . . of illumination. Each such stripe is generated by a respective radiation source or group of radiation sources. (Example arrangements of radiation sources that can be used to generate this sort of multi-stripe illumination are shown in the figures that follow.) The region defined by each stripe covers the area of a number of the rows of pixels 44. In other words, each stripe illuminates a certain area of the scene from which the sensor elements in the corresponding rows capture radiation. Although stripes 46, 48, 50, 52 are shown in
Traces 62, 64, . . . correspond to actuation of the respective radiation sources that generate stripes 46, 48, . . . . In other words, when trace 62 is high, the radiation source that generates stripe 46 is actuated, and so on. For each group 58, 60, . . . , of the rows, the actuation period of the corresponding radiation source is set so as to fall entirely within the exposure periods of all the rows in the group. Thus, the illumination assembly illuminates each area of the scene only during the exposure periods of the sensor elements that capture the radiation from the area, and none of the illumination is wasted.
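This timing relation can be expressed numerically: the actuation window of each source is the interval common to the exposure periods of all rows in its group. The sketch below is illustrative only, with assumed line-time, exposure, and group-size values; it simply computes the intersection of the rows' exposure intervals, which is the window within which the corresponding source may be pulsed without wasting illumination.

```python
# Illustrative calculation only; the line time, exposure period and group size
# are assumptions, not values from the embodiments. Row k of the rolling shutter
# starts its exposure at k * ROW_TIME_US and ends it EXPOSURE_US later; the source
# for a group may be pulsed only inside the interval common to all of its rows.

ROW_TIME_US = 30.0       # assumed rolling-shutter line time
EXPOSURE_US = 3000.0     # assumed per-row exposure period
ROWS_PER_GROUP = 48      # assumed number of rows illuminated by one stripe


def actuation_window(first_row: int, rows: int = ROWS_PER_GROUP):
    """Return (start_us, end_us) of the interval shared by all rows in the group."""
    last_row = first_row + rows - 1
    latest_start = last_row * ROW_TIME_US                   # last row to begin exposing
    earliest_end = first_row * ROW_TIME_US + EXPOSURE_US    # first row to stop exposing
    if latest_start >= earliest_end:
        raise ValueError("exposure too short: the group's rows never overlap in time")
    return latest_start, earliest_end


print(actuation_window(first_row=0))    # e.g. (1410.0, 3000.0) microseconds
print(actuation_window(first_row=48))   # next group of rows, one slot later
```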
Trace 64 goes high just as trace 62 goes low, and so forth over all the radiation sources in illumination assembly 22. Thus, the stripe output of the illumination assembly sweeps across the scene in a sweep direction perpendicular to the rows of pixels 44 (and sensor elements 40), completing one such sweep in each image frame, in synchronization with the sweep of the rolling shutter of image sensor 38. The duty cycle of each radiation source is roughly 1:N, wherein N is the number of stripes (each illuminated by a respective radiation source or group of radiation sources). In the timing scheme of
Alternatively, other timing relations may be used between the frame rate, actuation periods and exposure times. These alternative timing arrangements may be advantageous in situations in which the geometrical relationships between illumination stripes and sensor rows are not maintained as precisely as in
A collecting lens 76 collimates and directs the radiation from optoelectronic elements 70 through one or more patterning elements 78. The patterning elements cause the radiation from elements 70 to be projected onto the scene in a predefined pattern, which is detectable in the electronic image formed by imaging assembly 28. This pattern in the image is processed in order to compute the depth map of the scene. Patterning elements 78 may comprise a patterned transparency, which may comprise a micro-lens array (MLA), as described, for example, in the above-mentioned U.S. 2008/0240502 or WO 2007/105205, and/or one or more diffractive optical elements (DOEs), as described in U.S. Patent Application Publication 2009/0185274, whose disclosure is also incorporated herein by reference. Additionally or alternatively, when elements 70 emit coherent radiation, patterning elements 78 may comprise a diffuser, which casts a laser speckle pattern on the scene.
Each of optoelectronic elements 70 emits radiation that forms a respective stripe 80, 82, 84, . . . , as shown in
In embodiments in which patterning elements 78 comprise an MLA or other transparency, each stripe 80, 82, 84, . . . , passes through a different, respective region of the transparency, and thus creates a respective part of the overall illumination pattern corresponding to the pattern embedded in the transparency. Projection optics 34 project this pattern onto the object.
On the other hand, in embodiments in which patterning elements 78 comprise a DOE, either lens 76 or one of elements 78 (or the geometry of optoelectronic elements 70) is typically configured to create an appropriate “carrier” angle for the beam emitted by each of the optoelectronic elements. In such embodiments, the beams emitted by the different optoelectronic elements use different parts of lens 76, which may therefore be designed so that the collimated beams exit at respective angles corresponding to the desired vertical fan-out. Alternatively, the illumination module may comprise some other type of optics, such as a blazed grating with as many different zones as there are optoelectronic elements.
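As a purely geometric illustration (not a description of any particular lens or DOE design), the carrier angles for such a fan-out can be chosen so that the N beams tile the desired vertical field of view. The sketch below assumes representative values for the number of emitters, the field of view, and the focal length of lens 76.

```python
# Purely geometric illustration with assumed values. Each of N emitters is
# assigned a "carrier" angle so that the N collimated, patterned beams tile the
# vertical field of view of the projector.

import math

NUM_EMITTERS = 10           # assumed number of optoelectronic elements 70
VERTICAL_FOV_DEG = 45.0     # assumed vertical extent of the projected field
F_MM = 5.0                  # assumed focal length of collecting lens 76

stripe_deg = VERTICAL_FOV_DEG / NUM_EMITTERS
carrier_angles_deg = [
    -VERTICAL_FOV_DEG / 2 + (i + 0.5) * stripe_deg    # center angle of stripe i
    for i in range(NUM_EMITTERS)
]

# If the carrier angles are produced by laterally offsetting each emitter from the
# lens axis, the required offset is approximately x_i = f * tan(theta_i).
offsets_mm = [F_MM * math.tan(math.radians(a)) for a in carrier_angles_deg]

print([round(a, 2) for a in carrier_angles_deg])   # -20.25 ... +20.25 degrees
print([round(x, 3) for x in offsets_mm])
```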
Further details of the fabrication of illumination module 32, as well as other, similar sorts of modules, are described in the above-mentioned U.S. Provisional Patent Application 61/300,465.
Optoelectronic subassembly 90 comprises a row of edge-emitting optoelectronic elements 70, such as laser diodes, which may be fabricated on a suitable substrate as in the preceding embodiment. In subassembly 90, however, the radiation emitted by elements 70 is reflected internally from an interior surface 94 (typically with a suitable reflective coating) of a prism 92. The radiation from elements 70 enters prism 92 via a curved entry surface 96. As a result, respective beams generated by elements 70 spread apart and overlap partially with the adjacent beams. Controller 31 actuates elements 70 to emit radiation sequentially during each image frame in synchronization with the rolling shutter of image sensor 38.
In contrast to the preceding embodiments, elements 110 comprise surface-emitting devices, such as light-emitting diodes (LEDs) or vertical-cavity surface-emitting laser (VCSEL) diodes, which emit radiation directly in the Z-direction. An array of microlenses 112 (or other suitable micro-optics, such as total internal reflection-based micro-structures) is aligned with elements 110, so that a respective microlens collects the radiation from each element and directs it into an optical module 104. The optical module comprises, inter alia, a suitable patterning element 106, as described above, and a projection lens 108, which projects the resulting pattern onto the scene.
Although the above embodiments are described, for the sake of clarity, in the context of system 20 and certain specific geometrical configurations of illumination and sensing, the principles of the present invention may similarly be applied in systems and configurations of other sorts.
In order to cover scene 130 completely, the projected patterned beams typically overlap in overlap regions 140. In conventional operation, the overlap of the patterns could lead to inability of sensing units 122, 124, 126, 128 to detect their own patterns reliably in regions 140 and thus to loss of 3D information in these regions. One way to overcome this problem could be to operate the sensing units at different wavelengths, so that each unit senses only its own pattern. This solution, however, can be cumbersome and require costly optoelectronics and optical filters.
Therefore, in system 120, controller 121 controls the timing of the illumination assemblies and the rolling shutters of the imaging assemblies in sensing units 122, 124, 126, 128 so as to control the overlap between the regions that are illuminated at any given time. Typically, the sensing units are controlled so that they illuminate and capture radiation from respective non-overlapping stripes 142, 144, 146, 148. Within each sensing unit, the illumination stripe and the sensing area that is triggered to receive radiation by the rolling shutter are internally synchronized as described above. Furthermore, the timing of all the sensing units is coordinated to avoid interference. Thus, for example, all of the sensing units simultaneously activate their respective stripes 142, followed by stripes 144, and so on, so that no more than a single sensing unit is active within each overlap region 140 at any given time. Each sensing unit provides 3D mapping data with respect to its own part of scene 130, and a processing unit (such as controller 121 or another computer) stitches the data together into a combined depth map.
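The coordination can be summarized schematically as follows. The sketch below is a simplified, assumed model rather than an actual control interface: a shared stripe index is stepped through once per frame, and every sensing unit activates the stripe with that index at the same moment, so that the instantaneously illuminated areas of the offset units remain disjoint, including within overlap regions 140. The method `activate_stripe` is an illustrative placeholder.

```python
# Simplified, assumed model of the coordinated sweep (not an actual control API).

from dataclasses import dataclass

NUM_STRIPES = 10             # assumed number of stripes per sensing unit
FRAME_TIME_S = 1 / 30.0
SLOT_S = FRAME_TIME_S / NUM_STRIPES


@dataclass
class SensingUnit:
    name: str

    def activate_stripe(self, index: int) -> None:
        # Placeholder: pulse the unit's radiation source `index` and trigger the
        # matching rows of its rolling shutter for one slot.
        print(f"{self.name}: stripe {index} active for {SLOT_S * 1e3:.1f} ms")


units = [SensingUnit(n) for n in ("unit_122", "unit_124", "unit_126", "unit_128")]


def run_one_frame() -> None:
    for stripe in range(NUM_STRIPES):
        for unit in units:           # all units fire the same stripe index together
            unit.activate_stripe(stripe)
        # (a real controller would now wait SLOT_S before advancing the index)


run_one_frame()
```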
The scheme illustrated in
Alternatively, sensing units 122, 124, 126, 128 may operate together without a centralized controller to regulate synchronization. For example, each sensing unit may adjust its own timing so as to maximize its depth readings. Thus, the entire system will converge to an optimal synchronization. Additionally or alternatively, the sensing units may communicate with one another using a token ring type protocol, without centralized control.
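One possible, purely hypothetical realization of such self-adjustment is sketched below: each unit periodically perturbs the phase of its own illumination sweep and keeps whichever phase yields the largest fraction of valid depth pixels, so units that interfere with one another tend to drift apart in time without any central coordination. The function `valid_depth_fraction` is a stand-in for the unit's own analysis of its captured images, not an existing API.

```python
# Hypothetical sketch of the decentralized alternative described above.

import random


def valid_depth_fraction(phase_s: float) -> float:
    """Placeholder: fraction of pixels with a decodable pattern in the last frame
    captured with the given sweep phase (here a random stand-in)."""
    return random.random()


def adjust_phase(phase_s: float, step_s: float = 0.5e-3) -> float:
    """Keep the candidate phase (current, earlier, or later) that scores best."""
    best_phase, best_score = phase_s, valid_depth_fraction(phase_s)
    for candidate in (phase_s - step_s, phase_s + step_s):
        score = valid_depth_fraction(candidate)
        if score > best_score:
            best_phase, best_score = candidate, score
    return best_phase


phase = 0.0
for _ in range(20):          # repeated over successive frames by each unit
    phase = adjust_phase(phase)
print(phase)
```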
In system 150, however, sensing units 152 and 154 and their beams 156 and 158 are offset from one another in a direction perpendicular to the scan direction of the illumination and rolling shutter (horizontal offset with vertical scan in the view shown in
It will thus be appreciated that the embodiments described above are cited by way of example, and that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and subcombinations of the various features described hereinabove, as well as variations and modifications thereof which would occur to persons skilled in the art upon reading the foregoing description and which are not disclosed in the prior art.
This application is a continuation of U.S. patent application Ser. No. 12/762,373, filed Apr. 19, 2010, which claims the benefit of U.S. Provisional Patent Application 61/300,465, filed Feb. 2, 2010, which is incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
3796498 | Post | Mar 1974 | A |
4850673 | Velzel et al. | Jul 1989 | A |
5406543 | Kobayashi et al. | Apr 1995 | A |
5477383 | Jain | Dec 1995 | A |
5606181 | Sakuma et al. | Feb 1997 | A |
5648951 | Kato | Jul 1997 | A |
5691989 | Rakuljic et al. | Nov 1997 | A |
5742262 | Tabata et al. | Apr 1998 | A |
5781332 | Ogata | Jul 1998 | A |
6002520 | Hoch et al. | Dec 1999 | A |
6031611 | Rosakis et al. | Feb 2000 | A |
6560019 | Nakai | May 2003 | B2 |
6583873 | Goncharov et al. | Jun 2003 | B1 |
6611000 | Tamura et al. | Aug 2003 | B2 |
6707027 | Liess et al. | Mar 2004 | B2 |
6927852 | Reel | Aug 2005 | B2 |
6940583 | Butt et al. | Sep 2005 | B2 |
7112774 | Baer | Sep 2006 | B2 |
7227618 | Bi | Jun 2007 | B1 |
7304735 | Wang et al. | Dec 2007 | B2 |
7335898 | Donders et al. | Feb 2008 | B2 |
7700904 | Toyoda et al. | Apr 2010 | B2 |
7952781 | Weiss et al. | May 2011 | B2 |
8384997 | Shpunt et al. | Feb 2013 | B2 |
20030090900 | Kim et al. | May 2003 | A1 |
20040012958 | Hashimoto et al. | Jan 2004 | A1 |
20040082112 | Stephens | Apr 2004 | A1 |
20040184270 | Halter | Sep 2004 | A1 |
20040258354 | Sekiya et al. | Dec 2004 | A1 |
20050088644 | Morcom | Apr 2005 | A1 |
20050178950 | Yoshida | Aug 2005 | A1 |
20060001055 | Ueno et al. | Jan 2006 | A1 |
20060044803 | Edwards | Mar 2006 | A1 |
20060252167 | Wang | Nov 2006 | A1 |
20060252169 | Ashida | Nov 2006 | A1 |
20060269896 | Liu et al. | Nov 2006 | A1 |
20070019909 | Yamauchi et al. | Jan 2007 | A1 |
20080106746 | Shpunt et al. | May 2008 | A1 |
20080198355 | Domenicali et al. | Aug 2008 | A1 |
20080212835 | Tavor | Sep 2008 | A1 |
20080240502 | Freedman et al. | Oct 2008 | A1 |
20080278572 | Gharib et al. | Nov 2008 | A1 |
20090090937 | Park | Apr 2009 | A1 |
20090096783 | Shpunt et al. | Apr 2009 | A1 |
20090183125 | Magal et al. | Jul 2009 | A1 |
20090185274 | Shpunt | Jul 2009 | A1 |
20100007717 | Spektor et al. | Jan 2010 | A1 |
20100013860 | Mandella et al. | Jan 2010 | A1 |
20100142014 | Rosen et al. | Jun 2010 | A1 |
20110019258 | Levola | Jan 2011 | A1 |
20110069389 | Shpunt | Mar 2011 | A1 |
20110075259 | Shpunt | Mar 2011 | A1 |
20110114857 | Akerman et al. | May 2011 | A1 |
20110187878 | Mor et al. | Aug 2011 | A1 |
20110188054 | Petronius et al. | Aug 2011 | A1 |
20110295331 | Wells et al. | Dec 2011 | A1 |
20120038986 | Pesach | Feb 2012 | A1 |
20120140094 | Shpunt et al. | Jun 2012 | A1 |
20120140109 | Shpunt et al. | Jun 2012 | A1 |
20130038881 | Pesach et al. | Feb 2013 | A1 |
20130038941 | Pesach et al. | Feb 2013 | A1 |
Number | Date | Country |
---|---|---|
1725042 | Jan 2006 | CN |
S62-011286 | Jan 1987 | JP |
H10123512 | May 1998 | JP |
H10301201 | Nov 1998 | JP |
2002372701 | Dec 2002 | JP |
2011118178 | Jun 2011 | JP |
2007043036 | Apr 2007 | WO |
2007105205 | Sep 2007 | WO |
2008120217 | Oct 2008 | WO |
2010004542 | Jan 2010 | WO |
2012020380 | Feb 2012 | WO |
2012066501 | May 2012 | WO |
Entry |
---|
U.S. Appl. No. 12/330,766 Office Action dated Jul. 16, 2013. |
International Application PCT/IB2013/051986 Search Report dated Jul. 30, 2013. |
U.S. Appl. No. 13/008,042 Office Action dated Jul. 15, 2013. |
Fienup, J.R., “Phase Retrieval Algorithms: A Comparison”, Applied Optics, vol. 21, No. 15, Aug. 1, 1982. |
International Application PCT/IL2008/01592 Search Report dated Apr. 3, 2009. |
U.S. Appl. No. 12/840,312 Office Action dated Jul. 12, 2012. |
Gerchberg et al., “A Practical Algorithm for the Determination of the Phase from Image and Diffraction Plane Pictures,” Journal Optik, vol. 35, No. 2, pp. 237-246, year 1972. |
Sazbon et al., “Qualitative Real-Time Range Extraction for Preplanned Scene Partitioning Using Laser Beam Coding,” Pattern Recognition Letters 26, pp. 1772-1781, year 2005. |
Moharam et al., “Rigorous coupled-wave analysis of planar-grating diffraction”, Journal of the Optical Society of America, vol. 71, No. 6, pp. 811-818, Jul. 1981. |
U.S. Appl. No. 12/945,908 Official Action dated Dec. 5, 2012. |
Eisen et al., “Total internal reflection diffraction grating in conical mounting”, Optics Communications 261, pp. 13-18, year 2006. |
O'Shea et al., “Diffractive Optics: Design, Fabrication and Test”, SPIE Tutorial Texts in Optical Engineering, vol. TT62, pp. 66-72, SPIE Press, USA 2004. |
U.S. Appl. No. 13/008,042 Official Action dated Jan. 3, 2013. |
U.S. Appl. No. 61/568,185, filed Dec. 8, 2011. |
U.S. Appl. No. 12/330,766 Official Action dated Dec. 14, 2010. |
EZCONN Czech A.S. “Site Presentation”, Oct. 2009. |
Luxtera Inc., “Luxtera Announces World's First 10GBit CMOS Photonics Platform”, Carlsbad, USA, Mar. 28, 2005 (press release). |
Bradley et al., “Synchronization and Rolling Shutter Compensation for Consumer Video Camera Arrays”, IEEE International Workshop on Projector-Camera Systems—PROCAMS 2009, Miami Beach, Florida, 2009. |
Marcia et al., “Fast Disambiguation of Superimposed Images for Increased Field of View”, IEEE International Conference on Image Processing, San Diego, USA, Oct. 12-15, 2008. |
U.S. Appl. No. 13/798,231, filed Mar. 13, 2013. |
Btendo, “Two Uni-axial Scanning Mirrors Vs One Bi-axial Scanning Mirror”, Kfar Saba, Israel, Aug. 13, 2008. |
Microvision Inc., “Micro-Electro-Mechanical System (MEMS) Scanning Mirror”, years 1996-2009. |
European Patent Application # 11150668.9 Partial European Search Report dated Apr. 1, 2011. |
U.S. Appl. No. 12/330,766 Official Action dated Jun. 7, 2011. |
Garcia et al., “Three-dimensional mapping and range measurement by means of projected speckle patterns”, Applied Optics, vol. 47, No. 16, pp. 3032-3040, Jun. 1, 2008. |
Garcia et al., “Projection of Speckle Patterns for 3D Sensing”, Journal of Physics: Conference Series 139, year 2008. |
CN Patent Application # 200880119911.9 Office Action dated Jan. 29, 2012. |
U.S. Appl. No. 12/955,939 Office Action dated Jan. 30, 2012. |
U.S. Appl. No. 12/955,940 Office Action dated Jan. 11, 2012. |
U.S. Appl. No. 12/762,373 Office Action dated Mar. 7, 2012. |
International Application PCT/IB2011/053560 Search Report dated Jan. 19, 2012. |
U.S. Appl. No. 61/611,075, filed Mar. 15, 2012. |
International Application PCT/IB2011/055155 Search Report dated Apr. 20, 2012. |
U.S. Appl. No. 12/955,939 Office Action dated Jun. 1, 2012. |
U.S. Appl. No. 12/955,940 Office Action dated Jun. 27, 2012. |
U.S. Appl. No. 13/567,095 Office Action dated Oct. 1, 2013. |
U.S. Appl. No. 13/008,042 Office Action dated Dec. 3, 2013. |
JP Application # 2011-009310 Office Action dated Nov. 19, 2014. |
JP Application # 2011-009310 Office Action dated Sep. 9, 2015. |
Number | Date | Country |
---|---|---|
20130147921 A1 | Jun 2013 | US |
Number | Date | Country |
---|---|---|
61300465 | Feb 2010 | US |
 | Number | Date | Country |
---|---|---|---|
Parent | 12762373 | Apr 2010 | US |
Child | 13765706 | | US |