Imaging apparatus having configurable stereoscopic perspective

Abstract
In some embodiments, a stereoscopic imaging apparatus includes a tubular housing having a bore extending longitudinally through the housing. First and second image sensors are disposed proximate a distal end of the bore, each including a plurality of light sensitive elements on a face and mounted facing laterally outward. The apparatus further includes a first beam steering element associated with the first image sensor and a second beam steering element associated with the second image sensor. The beam steering elements receive light from first and second perspective viewpoints and direct the received light onto the faces of the image sensors, forming first and second images. Either the first and second beam steering elements or the first and second image sensors are moveable to cause a change in a spacing between or an orientation of the perspective viewpoints to cause sufficient disparity between the first and second images to provide image data including three-dimensional information.
Description
BACKGROUND
1. Field

This disclosure relates generally to stereoscopic imaging and more particularly to a stereoscopic imaging apparatus wherein a spacing between or an orientation of the stereoscopic viewpoints may be changed to cause sufficient disparity between images for generating three-dimensional (3D) information.


2. Description of Related Art

Stereoscopic imaging generally involves capturing a pair of images from spaced apart perspective viewpoints and processing the images to generate a three-dimensional (3D) view or 3D information based on a disparity between the images. Small format image sensors may be used to generate stereoscopic images while being sufficiently small to fit within a small diameter tubular housing. However, when the spacing between image sensors is constrained by the size of the housing, the disparity between images may be insufficient, particularly when viewing objects that are close to the image sensors. The lack of disparity results in some views providing an inadequate 3D viewing effect. The extraction of 3D information may also be limited by the lack of disparity between stereo images.


SUMMARY

In accordance with some embodiments there is provided a stereoscopic imaging apparatus. The apparatus includes a tubular housing configured for insertion into a confined space, the tubular housing having a bore extending longitudinally through the housing. The apparatus also includes first and second image sensors disposed proximate a distal end of the bore, each image sensor including a plurality of light sensitive elements on a face of the image sensor and being mounted facing laterally outward with respect to a longitudinal axis extending through the bore. The apparatus further includes a first beam steering element associated with the first image sensor and a second beam steering element associated with the second image sensor, the beam steering elements being operably configured to receive light from respective first and second perspective viewpoints extending longitudinally outward into an object field and direct the received light onto the faces of the respective first and second image sensors for forming respective first and second images. Either the first and second beam steering elements or the first and second image sensors are moveable to cause a change in at least one of a spacing between and an orientation of the perspective viewpoints with respect to the longitudinal axis of the bore to cause sufficient disparity between the first and second images to provide image data including three-dimensional information.


Each of the first and second beam steering elements may include a plurality of beam steering elements disposed in different locations with respect to the longitudinal axis and the first and second image sensors may be moveable to cause the first and second images to be selectively received by one of the plurality of beam steering elements.


The first and second image sensors may be mounted back-to-back on a moveable carrier.


The moveable carrier may include a circuit substrate.


The moveable carrier may be constrained for longitudinal motion within the bore and may further include an actuator disposed within the bore and operably configured to cause longitudinal movement of the carrier.


The actuator may include one of a piezoelectric actuator, a rotary piezoelectric motor, and a control wire.


The plurality of beam steering elements disposed in different locations may include longitudinally spaced apart prisms at a periphery of the housing, each prism being operably configured to receive light from a different perspective viewpoint.


Each of the first and second beam steering elements may include a moveable reflective element operably configured to be pivoted to receive light from different perspective viewpoints.


The moveable reflective elements may be operably configured to be disposed along an outer periphery of the housing while the apparatus is being inserted into the confined space and may be deployable after insertion to receive light from the respective first and second perspective viewpoints.


Each of the first and second beam steering elements may include a deformable optical element operably configured to deform to receive light from different perspective viewpoints.


The deformable optical element may include at least one of a liquid lens and a liquid prism.


The apparatus may include an actuator operably configured to cause movement of imaging lenses associated with each of the first and second image sensors in a direction aligned with the longitudinal axis to cause a change in orientation of the perspective viewpoints with respect to the longitudinal axis.


The tubular housing may be attached to a distal end of an elongate sheath having a passage extending through the sheath for carrying signals to and from the image sensors.


At least a portion of the sheath may include a manipulator operably configured to cause the sheath to be bent for positioning the tubular housing within the confined space.


The confined space may include a body cavity of a patient undergoing a medical or surgical procedure.


The stereoscopic imaging apparatus may be used in a robotic surgery system.


The tubular housing may have a generally circular cross section.


The bore of the tubular housing may have a diameter of about 10 millimeters.


The apparatus may include a controller in communication with the apparatus and operably configured to cause movement of either the first and second beam steering elements or the first and second image sensors in response to a determination that the object field being captured by the apparatus provides insufficient disparity between the first and second images to provide image data including three-dimensional information.


Other aspects and features will become apparent to those ordinarily skilled in the art upon review of the following description of specific disclosed embodiments in conjunction with the accompanying figures.





BRIEF DESCRIPTION OF THE DRAWINGS

In drawings which illustrate disclosed embodiments,



FIG. 1 is a perspective view of a stereoscopic imaging apparatus;



FIG. 2 is a perspective view of an imaging assembly of the stereoscopic imaging apparatus shown in FIG. 1;



FIG. 3A is a schematic plan view of an optical configuration of the imaging apparatus shown in FIG. 2;



FIG. 3B is a schematic plan view of a further optical configuration of the imaging apparatus shown in FIG. 2 and FIG. 3A;



FIG. 4 is a perspective view of a stereoscopic imaging apparatus in accordance with another embodiment;



FIG. 5A is a schematic plan view of an optical configuration of the imaging apparatus shown in FIG. 4;



FIG. 5B is a schematic plan view of a further optical configuration of the imaging apparatus shown in FIG. 4 and FIG. 5A;



FIG. 6 is a schematic plan view of an optical configuration for implementing some embodiments;



FIG. 7A is a schematic plan view of another optical configuration of the imaging apparatus shown in FIG. 4 in accordance with another embodiment; and



FIG. 7B is a schematic plan view of the optical configuration of the imaging apparatus shown in FIG. 7A.





DETAILED DESCRIPTION

Referring to FIG. 1, a stereoscopic imaging apparatus in accordance with a first embodiment is shown generally at 100. The apparatus 100 includes a tubular housing 102 configured for insertion into a confined space. The tubular housing 102 has a bore 104 extending longitudinally through the housing that accommodates imaging components (shown in FIG. 2). In the embodiment shown, the tubular housing 102 has a generally circular cross section, which in one embodiment may have a diameter of about 10 millimeters.


In the embodiment shown, the tubular housing is attached to a distal end of an elongate sheath 106 having a passage 108 extending through the sheath for carrying signals to and from the imaging components within the tubular housing 102. A portion of the sheath 106 includes a manipulator 110, which is configured to cause the sheath to be bent to position the tubular housing within the confined space for capturing images. In one embodiment, the manipulator may include a plurality of vertebrae actuated to bend by a plurality of control links or cables 112 for disposing the apparatus 100 at various positions with respect to a longitudinal axis 120 of the bore 104. The passage 108 also accommodates various signal cables 114 for carrying image data to a host system controller 122 and for transmitting control and command signals to the apparatus 100. The host system controller 122 is in communication with a display 124 for displaying the images, which may be viewed through a stereoscopic viewing device (not shown) to provide separate left and right stereoscopic images to a user's left and right eyes.


The apparatus 100 includes a first beam steering element 116 laterally disposed on the tubular housing 102 of the apparatus 100 proximate a distal end 118. A second beam steering element (not visible in FIG. 1) is similarly laterally disposed on the opposite side of the tubular housing 102. The beam steering element 116 in FIG. 1 is shown schematically as a demarcated portion of the tubular housing 102 but may take on various forms, such as described in more detail below.


In one embodiment, the confined space within which the apparatus 100 may be employed may be a body cavity of a patient undergoing a medical or surgical procedure. For example, the apparatus 100 may be used for imaging during a laparoscopic surgery procedure or may be part of a robotic surgery system for performing robotic surgery.


Referring to FIG. 2, the apparatus 100 includes an imaging assembly shown generally at 200. The imaging assembly 200 includes a first image sensor 202 and a second image sensor 204 (of which only a portion is visible in FIG. 2). The first and second image sensors 202 and 204 are substantially identical and are disposed proximate the distal end 118 of the bore 104, mounted facing laterally outward with respect to the longitudinal axis 120 of the bore. Each of the image sensors 202 and 204 includes a plurality of light sensitive elements 206 on a face 208 of the image sensor.


The imaging assembly 200 also includes a first beam steering element 210 associated with the first image sensor 202 and a second beam steering element 212 associated with the second image sensor 204. The beam steering element 210 is operably configured to receive light from a first perspective viewpoint in an object field 218, which is directed through an imaging lens 214 onto the face 208 of the image sensor 202 for forming a first image. The beam steering element 212 is operably configured to receive light from a second perspective viewpoint in the object field 218, which is directed through an imaging lens 216 onto the face of the image sensor 204 for forming a second image.


In this embodiment, the first beam steering element 210 includes two prisms 220 and 222 longitudinally spaced apart at a periphery of the imaging assembly 200. Similarly, the second beam steering element 212 includes two prisms 224 and 226 longitudinally spaced apart on an opposite side of the imaging assembly 200. The first and second image sensors 202 and 204 are moveable along the longitudinal axis 120 to cause the first and second images to be selectively received by either the prisms 220, 224 or the prisms 222, 226. In the embodiment shown in FIG. 2, the first and second image sensors 202 and 204 are mounted back-to-back on a moveable carrier 228, which in the embodiment shown comprises respective circuit substrates 234 and 236 on which the image sensors are mounted. In the embodiment shown in FIG. 2, the imaging lenses 214 and 216 are each mounted in a lens tube (shown in FIG. 3A and FIG. 3B) which is coupled to the respective first and second image sensors 202 and 204 and thus moves with the sensors and the moveable carrier 228.


The moveable carrier 228 is received within a channel 230 in a frame 232 (shown partially cut away in FIG. 2 to reveal underlying elements). The frame 232 is received within and fixed relative to the bore 104 of the tubular housing 102. The moveable carrier 228 is constrained for longitudinal movement within the channel 230 of the frame 232 in a direction aligned with the longitudinal axis 120 of the bore. The imaging assembly 200 further includes an actuator 238 which is coupled to the moveable carrier 228 to cause the longitudinal movement of the carrier when actuated by a control signal provided by the host system controller 122. In some embodiments the actuator 238 may be a piezoelectric actuator, a rotary piezoelectric motor, or a control wire, for example.
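Purely for illustration, the two-station selection between prism pairs can be sketched in control code as follows; the station travel value, the send_actuator_setpoint callback, and the function names are hypothetical assumptions and are not part of the disclosure:

```python
# Hypothetical control-side sketch (not from the disclosure): the moveable carrier 228
# is driven by actuator 238 to one of two longitudinal stations so that the image
# sensors align with either the prism pair 220, 224 or the prism pair 222, 226.
# The station positions and the actuator interface below are assumed for illustration.

from enum import Enum

class PrismStation(Enum):
    NARROW_BASELINE = 0.0   # carrier aligned with prisms 220 and 224 (separation D1)
    WIDE_BASELINE = 2.5     # assumed travel in mm to align with prisms 222 and 226 (D2)

def move_carrier(send_actuator_setpoint, station: PrismStation) -> None:
    """Command the carrier to the longitudinal position of the requested station."""
    send_actuator_setpoint(station.value)   # e.g. a setpoint forwarded by the host controller

# Example: request the wider stereo baseline when more disparity is needed.
# move_carrier(host_controller.set_carrier_position_mm, PrismStation.WIDE_BASELINE)
```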


In the imaging assembly 200 shown in FIG. 2, the moveable carrier 228 is disposed such that the first and second image sensors 202 and 204 receive images via the prisms 220 and 224 respectively. The optical configuration corresponding to FIG. 2 is shown in plan view in FIG. 3A, in which the first and second image sensors 202 and 204 have perspective viewpoints 300 and 302 within the object field 218. The perspective viewpoints 300 and 302 are separated by a distance D1 and in this embodiment where the prisms 220 and 224 have a 45° prism angle, the perspective viewpoints are also substantially parallel.


Referring to FIG. 3B, when the moveable carrier 228 is moved by the actuator 238 to align the sensors 202 and 204 with the prisms 222 and 226, the first and second image sensors have respective perspective viewpoints 300′ and 302′ within the object field 218. The perspective viewpoints 300′ and 302′ are separated by a distance D2 and, due to the 45° prism angle of the prisms 222 and 226, are also substantially parallel. The increased separation between the perspective viewpoints from D1 to D2 increases the disparity between the first and second images received at the respective first and second image sensors 202 and 204. The increased image disparity may provide for more effective display and extraction of 3D information. Under some imaging conditions, the smaller separation D1 shown in FIG. 3A may produce insufficient disparity to provide a view having appreciable 3D depth.
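As a purely illustrative aside (not part of the disclosure), the first-order pinhole-stereo relation, disparity = focal length × separation / depth, shows that doubling the viewpoint separation doubles the image disparity at a given working distance; the focal length, separations, and working distance below are assumed values:

```python
# Illustrative only: first-order pinhole-stereo relation, disparity = f * B / Z.
# The focal length, separations D1/D2 and the working distance are hypothetical values.

def disparity_mm(focal_length_mm: float, separation_mm: float, depth_mm: float) -> float:
    """Horizontal disparity on the sensor for a point at the given depth (idealized)."""
    return focal_length_mm * separation_mm / depth_mm

f_mm = 3.0                    # assumed miniature imaging-lens focal length
working_distance_mm = 60.0    # assumed distance to the object field
for label, separation in (("D1", 4.0), ("D2", 8.0)):   # assumed viewpoint separations
    d = disparity_mm(f_mm, separation, working_distance_mm)
    print(f"{label} = {separation:.1f} mm -> disparity {d * 1000:.0f} micrometres on the sensor")
```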


Referring to FIG. 4, a stereoscopic imaging apparatus in accordance with another embodiment is shown generally at 400. The apparatus 400 includes a tubular housing 402, shown partially cut away in FIG. 4 to reveal imaging components. The apparatus 400 includes first and second image sensors 404 and 406 disposed back-to-back and proximate a distal end 408 of a bore 410. The back-to-back mounting has an advantage of providing options for packaging the optical components within the tubular housing 402 in that the image sensors 404 and 406 may be located proximate a widest portion of the bore 410. In systems where image sensors are disposed side-by-side at a distal end of a tubular housing and facing the object field, the maximum size of sensor that can be accommodated would have a width of less than half of the diameter of the tubular housing. For a 10 millimeter diameter housing, the maximum diagonal size of image sensor would be about 6 millimeters (or ¼ inch). The configuration of the imaging assembly 200 shown in FIG. 2 would permit the sensors to be increased in size to close to the full 10 millimeters (or just less than ½ inch). While image sensors as small as 3.62 millimeters (1/7 inch) are now available, a larger image sensor may provide improved light capture and imaging performance, reduced image signal noise, and increased image resolution.


The image sensors 404 and 406 each include a plurality of light sensitive elements 412 on a face 414 of the image sensor. The image sensors 404 and 406 are mounted on a carrier 418 facing laterally outward with respect to a longitudinal axis 416 extending through the bore 410. In this embodiment the carrier 418 is made up of circuit substrates 420 and 422 on which the sensors 404 and 406 are mounted. In this embodiment the carrier 418 and image sensors 404 and 406 are immobilized within the bore 410 of the tubular housing 402.


The apparatus 400 also includes a first beam steering element 424 associated with the first image sensor 404 and a second beam steering element 426 associated with the second image sensor 406. The first beam steering element 424 in this embodiment is implemented using a reflective element or mirror 428 mounted on a moveable support 430 hinged to the tubular housing 402 and operable to pivot outwardly as indicated by the arrow 432. Similarly, the second beam steering element 426 includes a mirror 434 mounted on a moveable support 436 hinged to the tubular housing 402 and operable to pivot outwardly. In this embodiment the first beam steering element 424 includes a miniature actuator 438 coupled to the moveable support 430 to cause the movement 432 for deploying the mirror. The second beam steering element 426 also includes an actuator (not visible in FIG. 4) for actuating movement of the moveable support 436. While the apparatus 400 is being inserted into a confined space, the beam steering elements 424 and 426 may be maintained in an un-deployed disposition lying along an outer periphery of the housing 402. Once the apparatus 400 is inserted, the beam steering elements 424 and 426 may be deployed to receive light from an object field 440. The mirrors 428 and 434 each receive light from different perspective viewpoints within the object field 440. The received light is directed by the respective mirrors 428 and 434 through lenses 442 and 444 toward the sensors 404 and 406 for forming left and right images on the sensors.


Referring to FIG. 5A, the apparatus 400 is shown in a first deployed operating condition where the mirrors 428 and 434 are pivoted outwardly to an angle α1 of about 35° with respect to the longitudinal axis 416. Under these conditions the image sensors 404 and 406 receive light from respective first and second perspective viewpoints 500 and 502 that are angled inwardly (or toed in) toward the longitudinal axis 416 and converge at a convergence plane 504. Images captured of objects located at the convergence plane 504 will not have any disparity and will appear to be located at a screen plane when viewed on the display 124 using a 3D viewing device. Objects closer to the apparatus 400 than the convergence plane 504 will exhibit negative parallax and will appear to be located forward of the screen plane, while objects behind the convergence plane 504 will have positive parallax and appear to be located rearward of the screen plane.


In FIG. 5B, the mirrors 428 and 434 are pivoted outwardly to an angle α2 of about 40° such that the image sensors 404 and 406 receive light from respective first and second perspective viewpoints 500′ and 502′ that are less inwardly angled with respect to the longitudinal axis 416. This has the effect of moving an associated convergence plane 504′ for the perspective viewpoints 500′ and 502′ outwardly with respect to the apparatus 400.
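A minimal geometric sketch of this behaviour, assuming ideal planar mirrors: rotating a mirror away from 45° by an amount Δ rotates its accepted viewing direction by 2Δ, so each viewpoint is toed in by approximately 2 × (45° − α) and the convergence plane lies where the two viewing directions intersect. Only the 35° and 40° angles come from the description above; the viewpoint separation used below is an assumed value.

```python
# Illustrative geometry only (idealized planar mirrors, thin beams).
# Only the mirror angles 35 deg and 40 deg come from the description; the viewpoint
# separation b below is an assumed value used to show the trend.

import math

def convergence_distance_mm(mirror_angle_deg: float, separation_mm: float) -> float:
    """Approximate distance from the viewpoints to the convergence plane.

    A mirror at 45 deg to the longitudinal axis gives a viewing direction parallel
    to the axis; tilting it to angle alpha toes the viewing direction in by
    2 * (45 - alpha) degrees per side.
    """
    toe_in_deg = 2.0 * (45.0 - mirror_angle_deg)
    if toe_in_deg <= 0:
        return math.inf                      # parallel (or diverging) viewpoints
    return (separation_mm / 2.0) / math.tan(math.radians(toe_in_deg))

b_mm = 8.0   # assumed separation between the two reflected viewpoints
for alpha in (35.0, 40.0):
    z = convergence_distance_mm(alpha, b_mm)
    print(f"mirror angle {alpha:.0f} deg -> convergence plane ~{z:.0f} mm ahead")
```

With these assumed values, increasing the mirror angle from 35° to 40° roughly doubles the convergence distance, consistent with the convergence plane 504′ moving outwardly in FIG. 5B.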


One advantage of the configuration shown in FIGS. 5A and 5B is that the convergence plane 504 may be located at a desired depth in the object field 440 to facilitate generation of 3D information at the desired depth. Some 3D information may also be generated for objects located away from the convergence plane 504, but for such objects the 3D effects are enhanced and the resulting view may result in increased eyestrain for the user.


Referring back to FIG. 2, FIG. 3A and FIG. 3B, in the embodiment shown the prisms 220 and 224 may be configured with a prism angle less than 45° to cause the perspective viewpoints 300 and 302 to be angled inwardly, generally as shown in FIG. 5A. The prisms 222 and 226 may similarly be configured with a prism angle less than 45° to cause the perspective viewpoints 300′ and 302′ to be angled inwardly. Other embodiments may be configured to maintain the parallel perspective viewpoints 300′ and 302′, while the perspective viewpoints 300 and 302 are toed in. Parallel perspective viewpoints effectively locate the convergence plane at infinity such that the screen plane is at infinity and all objects are displayed having negative parallax, appearing forward of the screen plane.


In an embodiment configured as shown in FIG. 3A and FIG. 3B, the prism angle once selected remains fixed. Referring to FIG. 6, in some embodiments the imaging assembly 200 may further include an actuator 600 and the imaging lenses 214 and 216 may be moveable in a direction aligned with the longitudinal axis 120 in response to movement of the actuator. Displacement of the imaging lenses 214 and 216 with respect to an optical centerline 602 of the first and second image sensors 202 and 204 causes the perspective viewpoints 300″ and 302″ to be toed in to a degree permitted by the optical design of the imaging lenses.
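As a first-order illustration of the lens-shift effect (thin-lens approximation; the focal length and displacement values below are assumptions, not values from the disclosure), a lateral offset δ between an imaging lens and the optical centerline of its image sensor steers the centre of the field of view by approximately arctan(δ/f):

```python
# Thin-lens approximation, for illustration only: a lateral offset delta between the
# lens axis and the sensor centre steers the centre of the field of view by about
# atan(delta / f). The focal length and offsets below are assumed values.

import math

def toe_in_deg(lens_offset_mm: float, focal_length_mm: float) -> float:
    """Approximate toe-in of the perspective viewpoint produced by a lens offset."""
    return math.degrees(math.atan2(lens_offset_mm, focal_length_mm))

f_mm = 3.0                        # assumed imaging-lens focal length
for delta in (0.1, 0.25, 0.5):    # assumed lens displacements in mm
    print(f"lens offset {delta:.2f} mm -> viewpoint toed in by ~{toe_in_deg(delta, f_mm):.1f} deg")
```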


In some embodiments one or more conventional optical elements of the imaging assembly 200 or apparatus 400 may be replaced with a deformable optical element. For example, the prisms 220-226 may be implemented as liquid prisms that are capable of changing beam steering characteristics in response to a control signal received from the host system controller 122. Similarly, one or more of the imaging lenses 214, 216, 442, or 444 may include a deformable optical element such as a liquid lens. The deformable optical element facilitates some adjustment of the perspective viewpoint orientation and/or separation by changing optical properties of the deformable element.


In some embodiments the host system controller 122 may be configured to make a determination whether the object field 218 or object field 440 being captured by the imaging assembly 200 or imaging apparatus 400 is capable of providing sufficient disparity between the first and second images for successful extraction of 3D information. The host system controller 122 may be further configured to cause movement of the applicable beam steering elements, imaging lenses, or deformable optical elements when insufficient disparity is found in the images currently being captured.
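A minimal sketch of the kind of disparity check the host system controller 122 could perform is shown below. It is not the disclosed method; it assumes an OpenCV block-matching step, and the threshold value, the callback, and the function names are hypothetical:

```python
# Hypothetical sketch only: estimate disparity statistics for the current stereo pair
# and request a wider baseline / re-oriented viewpoints when disparity is too small.
# Uses OpenCV block matching; the threshold and callback are assumed interfaces/values.

import cv2
import numpy as np

def has_sufficient_disparity(left_gray: np.ndarray, right_gray: np.ndarray,
                             min_median_disparity_px: float = 4.0) -> bool:
    """Return True when the scene currently yields usable stereo disparity."""
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    valid = disparity[disparity > 0]                    # discard unmatched pixels
    return valid.size > 0 and float(np.median(valid)) >= min_median_disparity_px

def adjust_if_needed(left_gray, right_gray, request_wider_baseline) -> None:
    """Ask the apparatus to move its beam steering elements or sensors when needed."""
    if not has_sufficient_disparity(left_gray, right_gray):
        request_wider_baseline()    # e.g. command the actuator described above
```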


In some embodiments the mirrors 428 and 434 (shown in FIG. 4, FIG. 5A and FIG. 5B) may be replaced by mirrors 700 and 702 as shown in FIG. 7A. Each of the mirrors 700 and 702 in FIG. 7A has a first reflective surface 704 and a second reflective surface 706. The first reflective surface 704 is disposed at an angle θ1, which in the example shown is 45°, resulting in perspective viewpoints 708 and 710 within the object field 440, generally as described in connection with FIG. 3A. The perspective viewpoints 708 and 710 are separated by a distance D1 and, because the first reflective surface 704 is at a 45° angle to the longitudinal axis 416, are also substantially parallel.


Referring to FIG. 7B, when the mirrors 700 and 702 are pivoted further outwardly with respect to the longitudinal axis 416, the first and second image sensors 404 and 406 will have respective perspective viewpoints 708′ and 710′ within the object field 440. The perspective viewpoints 708′ and 710′ are separated by a distance D2 and, due to the 45° angle of the second reflective surface 706, are also substantially parallel. In this embodiment the mirrors 700 and 702 may also be actuated to angles other than 45°, thus facilitating toeing in of the perspective viewpoints while also providing a selectable spacing between the perspective viewpoints.


The embodiments set forth above provide for selectively changing the orientation of and/or the spacing between perspective viewpoints for producing stereoscopic views of an object field. The back-to-back orientation of the laterally facing image sensors also facilitates the accommodation of the imaging components within a small bore housing suitable for insertion into confined spaces. The provision of beam steering elements located peripherally on the housing increases the spacing between perspective viewpoints relative to a side-by-side image sensor configuration.


While specific embodiments have been described and illustrated, such embodiments should be considered illustrative only and not as limiting the disclosed embodiments as construed in accordance with the accompanying claims.

Claims
  • 1. A stereoscopic imaging apparatus comprising: a tubular housing configured to be inserted into a confined space, the tubular housing including a bore extending longitudinally through the tubular housing; first and second image sensors disposed proximate a distal end of the bore, each image sensor including a plurality of light sensitive elements on a face of the image sensor and being mounted facing laterally outward with respect to a longitudinal axis extending through the bore; a first beam steering element associated with the first image sensor and a second beam steering element associated with the second image sensor, the first and second beam steering elements configured to receive light from respective first and second perspective viewpoints extending longitudinally outward into an object field and direct the received light onto the faces of the respective first and second image sensors to form respective first and second images; and a first movable support mounted via a first hinge to the tubular housing and fixedly supporting the first beam steering element thereon, and a second movable support mounted via a second hinge to the tubular housing and fixedly supporting the second beam steering element thereon, wherein the first and second beam steering elements are pivotably moveable outside the tubular housing to cause a change in an orientation of the perspective viewpoints with respect to the longitudinal axis of the bore to cause sufficient disparity between the first and second images to provide image data including three-dimensional (3D) information.
  • 2. The apparatus of claim 1, wherein each of the first and second beam steering elements comprises a moveable reflective element configured to be pivoted to receive light from different perspective viewpoints.
  • 3. The apparatus of claim 2, wherein the moveable reflective elements are configured to be disposed along an outer periphery of the tubular housing while the apparatus is being inserted into the confined space and are deployable after insertion to receive light from the respective first and second perspective viewpoints.
  • 4. The apparatus of claim 1 wherein each of the first and second beam steering elements comprises a deformable optical element configured to deform to receive light from different perspective viewpoints.
  • 5. The apparatus of claim 4 wherein the deformable optical element comprises at least one of a liquid lens or a liquid prism.
  • 6. The apparatus of claim 1 further comprising an actuator configured to cause movement of imaging lenses associated with each of the first and second image sensors in a direction aligned with the longitudinal axis to cause a change in orientation of the perspective viewpoints with respect to a longitudinal axis.
  • 7. The apparatus of claim 1, wherein the tubular housing is attached to a distal end of an elongate sheath having a passage extending through the elongate sheath to carry signals to and from the first and second image sensors.
  • 8. The apparatus of claim 7, wherein at least a portion of the elongate sheath comprises a manipulator configured to cause the elongate sheath to be bent to position the tubular housing within the confined space.
  • 9. The apparatus of claim 1, wherein the confined space comprises a body cavity of a patient undergoing a medical or surgical procedure.
  • 10. The apparatus of claim 9, wherein the stereoscopic imaging apparatus is used in a robotic surgery system.
  • 11. The apparatus of claim 1, wherein the tubular housing includes a generally circular cross section.
  • 12. The apparatus of claim 1, wherein the bore of the tubular housing has a diameter of about 10 millimeters.
  • 13. The apparatus of claim 1, further comprising a controller in communication with the apparatus and configured to cause movement of either the first and second beam steering elements or the first and second image sensors based on a determination of whether or not the object field being captured by the apparatus provides sufficient disparity between the first and second images to extract three-dimensional (3D) information.
  • 14. A stereoscopic imaging apparatus comprising: first and second image sensors configured to be disposed proximate a distal end of a bore of a tubular housing, each image sensor including a plurality of light sensitive elements on a face of the image sensor and being mounted facing laterally outward with respect to a longitudinal axis extending through the bore of the tubular housing; a first beam steering element associated with the first image sensor and a second beam steering element associated with the second image sensor, the first and second beam steering elements configured to receive light from respective first and second perspective viewpoints extending longitudinally outward into an object field and direct the received light onto the faces of the respective first and second image sensors to form respective first and second images; and a first movable support mounted via a first hinge to the tubular housing and fixedly supporting the first beam steering element thereon, and a second movable support mounted via a second hinge to the tubular housing and fixedly supporting the second beam steering element thereon, wherein the first and second beam steering elements are pivotably moveable outside the tubular housing to cause a change in an orientation of the perspective viewpoints with respect to the longitudinal axis of the bore of the tubular housing to cause sufficient disparity between the first and second images to provide image data including three-dimensional (3D) information.
Related Publications (1)
Number Date Country
20200209730 A1 Jul 2020 US