Surgical visualization systems and related methods are disclosed herein, e.g., for providing visualization during surgical procedures, including minimally-invasive spinal procedures.
There are many instances in which it may be desirable to provide a surgeon or other user with visualization of a surgical site. While a number of surgical visualization systems have been developed, they are often heavy and cumbersome to use, difficult to clean or sterilize, or have properties that render them inadequate or impossible to use for many types of procedures. A spinal endoscope, for example, is typically used only in limited procedures, such as herniated disc repair and other pathologies confined to a predictable, very small location. Such devices typically require very small, specialized tools with low output force and limited tissue processing capabilities. Such devices can also require multiple human operators, with an assistant or other user holding and operating the visualization system while the surgeon uses both hands to perform the surgery. In addition, such devices can have a steep learning curve, as the visualization orientation may be dictated by instrument orientation and therefore may change constantly over the course of the procedure.
Surgical visualization systems and related methods are disclosed herein, e.g., for providing visualization during surgical procedures. Systems and methods herein can be used in a wide range of surgical procedures, including spinal surgeries such as minimally-invasive fusion or discectomy procedures. Systems and methods herein can include various features for enhancing end user experience, improving clinical outcomes, or reducing the invasiveness of a surgery. Exemplary features can include access port integration, hands-free operation, active and/or passive lens cleaning, adjustable camera depth, and many others.
In some embodiments, a surgical system can include an access device having a working channel and a visualization channel; and a visualization system at least partially disposed in the visualization channel, the visualization system comprising a camera module and a housing in which the camera module is mounted.
The camera module can include an image sensor and a lens configured to direct reflected light onto the image sensor, the image sensor and the lens being disposed within the housing. The camera module can include an optical fiber having a distal end disposed within the housing and a proximal end in optical communication with a light source. The lens can be disposed within a lens lumen of a lens barrel and the optical fiber can be disposed in an illumination lumen of the lens barrel. The illumination lumen can be disposed closer to the center of the access device working channel than the lens lumen. The lens barrel can include a distal-facing surface that is obliquely angled relative to a central longitudinal axis of the working channel. The system can include a diffuser mounted in a recess formed in the distal-facing surface of the lens barrel over the optical fiber. The recess can be crescent-shaped and can substantially follow the perimeter of the lens lumen. A central region of the lens can be coated with a hydrophobic coating and a peripheral region of the lens can be coated with a hydrophilic coating.
The housing can include a main body and a distal end cap. The main body can include a camera lumen in which the camera module is disposed and first and second fluid lumens. The camera lumen can be disposed centrally in the main body, between the first and second fluid lumens. A distal facing surface of the main body and a distal facing surface of the end cap can be obliquely angled relative to a central longitudinal axis of the working channel. The main body can have a sidewall having a concave inner surface disposed adjacent the working channel and a convex outer surface disposed opposite the inner surface. The inner surface of the sidewall of the main body can define at least a portion of an inner surface of the working channel. The outer surface of the sidewall can include first and second planar regions connected by a central transition region that defines a section of a cylinder, the transition region following an outer perimeter of a camera lumen of the main body. The inner surface can be concavely curved and connected to the outer surface by first and second transition regions that define sections of respective cylinders, the first and second transition regions following the outer perimeters of first and second fluid lumens formed in the main body. The housing can be formed from an inner circular tube disposed within and attached to an outer oval tube to define a camera lumen and first and second fluid lumens. The housing can be formed from an inner circular tube having opposed first and second outer shells attached thereto to define a camera lumen and first and second fluid lumens. The end cap can include a cut-out axially aligned with a lens of the camera module and a cut-out axially aligned with an illumination system of the camera module. A proximal-facing surface of the end cap can define a recess configured to direct fluid flowing out of a first fluid lumen of the housing across a lens of the camera module and into a second fluid lumen of the housing. The recess can include lateral end portions axially aligned with the first and second fluid lumens and medial end portions that are open to a cut-out of the end cap.
The system can include a connector assembly extending from the housing, the connector assembly including electrical and optical connections to the camera module and fluid connections to the housing. The connector assembly can have an exterior shape that matches that of the housing. The system can include a controller having an electronic display configured to display image data captured by an image sensor of the camera module. The visualization channel can intersect or overlap with the working channel. The housing and the camera module can be axially-translatable relative to the access device. The visualization channel can have a central longitudinal axis disposed radially outward from a central longitudinal axis of the working channel.
The access device can include a mating feature configured to selectively hold the visualization system in a desired position relative to the access device. The mating feature can include a locking track configured to receive a portion of the visualization system to restrict movement of the visualization system relative to the access device. The locking track can be formed in a proximal extension of the access device. The locking track can receive the visualization system in a snap-fit or friction fit. The mating feature can include an adjustment track configured to receive at least a portion of the visualization system to allow movement of the visualization system relative to the access device. The visualization system can be configured such that the visualization system can be loaded into the locking track by moving it radially outward from a central longitudinal axis of the visualization channel. The locking track can be curved or obliquely angled away from a central longitudinal axis of the visualization channel. The visualization system can be configured such that the visualization system can be secured to the mating feature at any point along a connector assembly of the visualization system. The mating feature can include a wheel that, when rotated, advances or retracts the visualization system relative to the access device. The mating feature can include a coil spring clamp configured to selectively lock the visualization system in a fixed position relative to the access device. The mating feature can include an O-ring disposed around a portion of the visualization system and a cone movably coupled to the access device, the cone being movable between a first position in which it expands the O-ring to allow movement of the visualization system relative to the access device and a second position in which the O-ring is released to clamp onto the visualization system and prevent movement of the visualization system relative to the access device.
The system can include a sleeve insertable through the working channel of the access device, the sleeve having an outside diameter and a bulb movably coupled to the sleeve, the bulb being movable between a first position in which the bulb is disposed entirely within the outside diameter of the sleeve and a second position in which at least a portion of the bulb protrudes out from the outside diameter of the sleeve. The bulb in the second position can be configured to fill a void space within the access device distal to the visualization channel of the access device. The bulb can include a distal-facing surface that is curved or ramped to form a gradual transition between a cylindrical dilator inserted through the sleeve and an outer surface of the access device. The bulb can be movable between the first and second positions by translating the sleeve axially relative to the access device. The bulb can be movable between the first and second positions by rotating the sleeve relative to the access device. The bulb can be biased towards the second position. The access device can include a transition portion at a distal end thereof, the transition portion being movable between a first position in which the transition portion extends radially-inward to provide a gradual transition between a cylindrical dilator inserted through the working channel and an outer surface of the access device and a second position in which the transition portion is moved radially-outward from the first position. The system can include a dilation shaft insertable through the access device to move the transition portion to the second position. The transition portion can include a plurality of flexible and resilient fingers.
In some embodiments, a visualization system can include a camera module having an image sensor and a lens; and a housing in which the camera module is disposed, the housing having a main body, first and second fluid channels, and an end cap configured to direct fluid flow through the channels across the lens of the camera module.
The camera module can include an optical fiber having a distal end disposed within the housing and a proximal end in optical communication with a light source. The lens can be disposed within a lens lumen of a lens barrel and the optical fiber can be disposed in an illumination lumen of the lens barrel. The system can include a diffuser mounted in a recess formed in a distal-facing surface of the lens barrel over the optical fiber. The recess can be crescent-shaped and can substantially follow the perimeter of the lens lumen. A central region of the lens can be coated with a hydrophobic coating and a peripheral region of the lens can be coated with a hydrophilic coating.
The camera module can be disposed centrally in the main body, between the first and second fluid channels. A distal facing surface of the main body and a distal facing surface of the end cap can be obliquely angled relative to a central longitudinal axis of the camera module. The main body can have an outer surface that includes first and second planar regions connected by a central transition region that defines a section of a cylinder, the transition region following an outer perimeter of a camera lumen of the main body. The main body can have an inner surface that is concavely curved and connected to the outer surface by first and second transition regions that define sections of respective cylinders, the first and second transition regions following the outer perimeters of the first and second fluid channels. The housing can be formed from an inner circular tube disposed within and attached to an outer oval tube to define a camera lumen and the first and second fluid channels. The housing can be formed from an inner circular tube having opposed first and second outer shells attached thereto to define a camera lumen and the first and second fluid channels. The end cap can include a cut-out axially aligned with the lens. A proximal-facing surface of the end cap can define a recess configured to direct fluid flowing out of the first fluid channel of the housing across the lens and into the second fluid channel of the housing. The recess can include lateral end portions axially aligned with the first and second fluid channels and medial end portions that are open to the cut-out of the end cap.
The system can include a connector assembly extending from the housing, the connector assembly including electrical and optical connections to the camera module and fluid connections to the housing. The connector assembly can have an exterior shape that matches that of the housing. The system can include a controller having an electronic display configured to display image data captured by an image sensor of the camera module.
In some embodiments, a surgical method can include inserting an access device into a patient; mounting the access device to an anchor, the anchor comprising at least one of an anatomical structure of the patient and an implant implanted in the patient; inserting a camera module into the access device; adjusting a depth of the camera module relative to the access device; and securing the camera module to a mating feature of the access device to maintain the camera module at the adjusted depth.
The anchor can include a bone anchor implanted in a pedicle of the patient. The method can include performing a surgical procedure through the access device without using hands to hold the access device or the camera module. The method can include inserting a fusion cage through the access device and into an intervertebral disc space. Inserting the access device can include positioning a distal end of the access device in proximity to an intervertebral disc space of the patient via a TLIF approach. The method can include positioning the camera module in a relatively proximal position relative to the access device, performing a bone resection through the access device, positioning the camera module in a relatively distal position relative to the access device, and removing disc tissue through the access device. The method can include directing cleaning media through a housing in which the camera module is disposed and across a lens of the camera module. Inserting the access device can include positioning a distal end of the access device in a dry environment of the patient. Inserting the access device can include positioning a distal end of the access device in a sinus cavity of the patient. The method can include performing a laryngoscopy or a bronchoscopy using the camera module.
In some embodiments, a surgical system can include an access device having a working channel and a visualization channel; a visualization system at least partially disposed in the visualization channel, the visualization system comprising a camera module and a housing in which the camera module is mounted; and a tissue shield that extends distally beyond a terminal distal end surface of the housing.
The tissue shield can be longitudinally movable relative to the housing. The tissue shield can be slidably disposed within a lumen of the housing. The tissue shield can be slidably disposed within a lumen of the access device. The tissue shield can be slidably disposed along an exterior surface of the housing. The tissue shield can include a wiper configured to clear debris from a lens of the camera module as the tissue shield is moved longitudinally relative to the housing. The tissue shield can extend around less than an entire periphery of the housing. The tissue shield can include a curved inner surface that follows a curve of a lens of the camera module. The tissue shield can have an outer surface with a profile that matches that of an outer surface of the housing. The tissue shield can have a crescent-shaped transverse cross section.
In some embodiments, a surgical system can include an access device having a working channel and a visualization channel; a visualization system at least partially disposed in the visualization channel, the visualization system comprising a camera module and a housing in which the camera module is mounted; and an active lens cleaning device configured to remove debris from a lens of the camera module.
The lens cleaning device can include a source of positive pressure gas directed towards the lens through a lumen of the housing. The gas can include air or carbon dioxide. The lens cleaning device can include a fluid lumen having a nozzle opening through which fluid can be directed towards the lens, the nozzle opening being obliquely angled with respect to a central longitudinal axis of the housing. The nozzle opening can be formed in a tube that extends from a terminal distal end surface of the housing. The lens cleaning device can include a fluid lumen having a nozzle opening through which fluid can be directed towards the lens, the nozzle opening extending perpendicular to a central longitudinal axis of the housing. The lens cleaning device can include an ultrasound agitator. The lens cleaning device can include a membrane movable relative to the lens. The membrane can include a continuous loop of material configured to be carried across at least one of a wiper, a brush, a fluid jet, and a vacuum port to clean the membrane. The membrane can be rolled onto a spool. The membrane can extend through a first lumen of the housing, across the lens, and through a second lumen of the housing. The lens cleaning device can include a wiper at least partially disposed in the working channel of the access device. The wiper can include an offset portion that contacts a protrusion formed in the working channel to urge a tip of the wiper laterally across the lens as a shaft of the wiper is moved longitudinally within the working channel. The wiper can be biased towards the visualization channel such that the wiper wipes across the lens of the camera module as the camera module is advanced distally into the visualization channel.
In some embodiments, a surgical method can include inserting an access device into a patient; inserting a housing having a camera module therein into the access device, the camera module having a lens; and while the access device and camera module are inserted into the patient, actuating a lens cleaning device to clean a visualization path to the lens of the camera module.
Actuating the lens cleaning device can include moving a tissue shield protruding from a distal end of the housing longitudinally relative to the housing to carry a wiper across the lens. Actuating the lens cleaning device can include directing positive pressure air through a lumen of the housing and towards the lens. Actuating the lens cleaning device can include vibrating the lens. Actuating the lens cleaning device can include moving a membrane relative to the lens. Moving the membrane can include rotating a continuous loop of membrane material to move a soiled portion of the membrane away from the lens and to position a clean portion of the membrane over the lens. Actuating the lens cleaning device can include advancing the camera module through the access device to drag a wiper flap biased towards the visualization channel across the lens. Actuating the lens cleaning device can include moving a wiper longitudinally within the working channel, thereby causing an offset portion of the wiper to contact a protrusion disposed in the working channel to urge a tip of the wiper laterally across the lens.
In some embodiments, a visualization system can include a camera module having an image sensor and a lens; and a housing in which the camera module is disposed, the housing having a main body, one or more fluid channels, and an end cap configured to direct fluid flow through the one or more fluid channels across the lens of the camera module.
In some embodiments, a surgical system can include an access device having a working channel and a visualization channel; and a visualization system at least partially disposed in the visualization channel, the visualization system comprising a camera module; wherein the visualization system is axially translatable relative to the access device to position the visualization system such that the camera module protrudes from a distal end of the access device.
In some embodiments, a surgical method can include inserting an access device into a patient; inserting a camera module into the access device; and adjusting a depth of the camera module relative to the access device; wherein adjusting the depth of the camera module comprises positioning the camera module such that the camera module protrudes from a distal end of the access device.
Certain exemplary embodiments will now be described to provide an overall understanding of the principles of the structure, function, manufacture, and use of the devices and methods disclosed herein. One or more examples of these embodiments are illustrated in the accompanying drawings. Those skilled in the art will understand that the devices and methods specifically described herein and illustrated in the accompanying drawings are non-limiting exemplary embodiments. The features illustrated or described in connection with one exemplary embodiment may be combined with the features of other embodiments.
In some embodiments, the surgical visualization systems disclosed herein can enable hands-free visualization. For example, a camera module can be mounted in an access device in a manner that eliminates the need for a surgeon, assistant, or other user to manually hold the camera in place. As another example, the access device can be secured to a support, e.g., an operating table, an anatomical anchor, or the like, eliminating the need for the user to manually hold the access port or to hold a camera module disposed therein. Accordingly, the user's hands can be free to perform other steps of the surgery.
The camera module can be separate and independent from surgical instruments used to carry out the procedure. Accordingly, the instruments with which the visualization system can be used are not restricted, and the system can be used with any of a variety of custom or off-the-shelf instruments. Such instruments can include instruments with increased size, strength, output force, and/or tissue processing capabilities. In addition, the visual trajectory of the camera can be independent of instrument positioning. This can allow the camera, and/or an access device in which the camera is disposed, to remain relatively stationary as instruments are manipulated and as the procedure is carried out. The field of view of the camera can thus remain substantially fixed during the procedure, providing the user with good spatial orientation and providing a reduced learning curve as compared to other visualization systems.
Traditional spinal microscopes can protrude significantly from the patient and can occupy a large amount of space over the surgical incision, and a clear space must be maintained around the proximal end of the scope to allow a user to look through it. This can limit the degree to which other instruments can be manipulated during the surgery, for example by limiting the possible access angles of the instruments. It can also restrict the size and types of instruments that can be used, the placement of the user's hands during various steps of the surgery, and so forth. Visualization systems of the type described herein can be integrated with an access device, can have a low-profile design, and/or can display captured images on an external monitor, reducing or eliminating these potential concerns.
In some embodiments, the surgical visualization systems disclosed herein can enable adjustment of the depth of a camera within an access device. In some embodiments, the camera depth can be quickly and easily adjusted. In some embodiments, the camera depth can be adjusted with one hand in a tool-free manner. The ability to easily adjust camera depth can allow visualization to be optimized for each stage of a surgical procedure in a seamless manner that does not interfere with or disrupt the surgical flow. In a spinal surgery, for example, the camera can be retracted proximally when performing gross bone removal and other steps that are relatively low risk, but could cause the camera lens to become obscured with debris. Later, when doing nerve work or other tasks that require greater precision, the camera can be advanced distally within the access device. In some cases, the camera can be advanced to protrude from the distal end of the access device, e.g., to position the camera within an intervertebral disc space.
In some embodiments, the surgical visualization systems disclosed herein can provide improved visualization in harsh or challenging operating environments. In many spinal procedures, for example, the operating environment is dry (as compared to fluid filled cavities that exist in other surgeries where visualization systems are used) and a large amount of smoke and cutting debris can be generated during the surgery. The surgical visualization systems disclosed herein can include active and/or passive features for clearing such debris from the camera lens, or for preventing such debris from blocking or sticking to the lens in the first place. In some embodiments, the surgical visualization systems disclosed herein can provide high-resolution visualization in minimally-invasive spinal procedures, such as endoscopic transforaminal lumbar interbody fusion (TLIF) procedures.
In some embodiments, the surgical visualization systems disclosed herein can be a single-use disposable, can be easily cleanable or sterilizable, can have a low-profile, can be lightweight, can be low-cost, can have high resolution, and/or can be used in minimally-invasive surgery, e.g., spinal surgery.
The camera module 102 is shown in greater detail in
The camera module 102 can include a lens or optical element 114 configured to direct light onto the image sensor 112. The lens 114 can be a static lens. The lens 114 can be a monolithic block of optically-transparent material. The lens 114 can be polymeric. The lens 114 can have a fixed focal length or can have an adjustable focal length. The lens 114 can include a mechanical shutter. The lens 114 can include multiple optical elements movable relative to one another. The lens 114 can include one or more motors, actuators, gears, etc. for adjusting the focal length, aperture size, shutter speed, and other parameters of the lens. The camera module 102 can include a modular lens receiver such that a user can attach any of a variety of lenses to the camera module. Exemplary lenses 114 can include prime lenses, normal lenses, wide-angle lenses, fisheye lenses, telephoto lenses, zoom lenses, anamorphic lenses, catadioptric lenses, lenses of varying focal lengths or varying ranges of focal lengths, etc. The lens 114 can include a high speed autofocus. The lens 114 can include an adjustable aperture and an adjustable shutter speed.
The camera module 102 can include an illumination system for illuminating a field of view of the image sensor 112. The illumination system can include a digital or analog light source. The light source can include one or more laser emitters or light-emitting diodes, an incandescent bulb, or the like. The light source can emit light as a dithered, diffused, or collimated emission and can be controlled digitally or through analog methods or systems. The light source can be pulsed to illuminate a surgical scene. The light source can pulse in one or more partitions, where each partition is a pre-determined range of wavelengths of the electromagnetic spectrum that is less than the entire electromagnetic spectrum. The pixel array of the image sensor 112 can be paired with the light source electronically, such that they are synced during operation for both receiving emissions of reflected electromagnetic radiation and for the adjustments made within the system. The light source can be tuned to emit electromagnetic radiation in a desired wavelength range. The light source can pulse at an interval that corresponds to the operation and functionality of the pixel array. The light source can pulse light in a plurality of electromagnetic partitions, such that the pixel array receives reflected electromagnetic energy and produces a data set that corresponds (in time) with each specific electromagnetic partition. For example, the light source can emit a green electromagnetic partition, a blue electromagnetic partition, and a red electromagnetic partition in any desired sequence, which can be combined to form a color image. Any color combination or any electromagnetic partition can be used in place of the red, green, and blue partitions, such as cyan, magenta, and yellow; ultraviolet; infra-red; or any other combination, including all visible and non-visible wavelengths. Exemplary illumination systems, and other features that can be included or incorporated in the camera module 102, are disclosed in U.S. Pat. No. 9,516,239 entitled YCBCR PULSED ILLUMINATION SCHEME IN A LIGHT DEFICIENT ENVIRONMENT and U.S. Pat. No. 9,641,815 entitled SUPER RESOLUTION AND COLOR MOTION ARTIFACT CORRECTION IN A PULSED COLOR IMAGING SYSTEM, each of which is incorporated herein by reference.
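As a simple, hypothetical illustration of the frame-combination step described above (a sketch only; the capture routine and frame arrays are assumptions, not part of the disclosed system), monochrome frames acquired under sequential red, green, and blue pulses could be merged into a color image as follows:

```python
import numpy as np

def combine_pulsed_partitions(red_frame, green_frame, blue_frame):
    """Merge monochrome frames, each captured while the light source pulsed a
    single electromagnetic partition, into one RGB color image.

    Each argument is assumed to be a 2D numpy array read from the pixel array
    in sync with the corresponding pulse; the function simply stacks them into
    an (H, W, 3) color frame.
    """
    return np.stack([red_frame, green_frame, blue_frame], axis=-1)

# Hypothetical usage, where capture_frame() is a stand-in for reading the
# pixel array after each pulse (not an API of the system described herein):
# color_image = combine_pulsed_partitions(
#     capture_frame("red"), capture_frame("green"), capture_frame("blue"))
```

The same pattern extends to other partition combinations (e.g., cyan, magenta, and yellow) by substituting the corresponding frames.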
The light source can be disposed within the housing 104, or can be disposed remotely from the housing. For example, the illumination system can include an optical fiber 116 having a distal end disposed within or in close proximity to the housing 104 and a proximal end in optical communication with a light source disposed remotely from the housing. The optical fiber 116 can direct light emitted from the light source into the field of view of the image sensor 112. The illumination system can include an optical element 118 for adjusting various illumination properties. Exemplary optical elements 118 can include diffusers, filters, and so forth. In the illustrated embodiment, the camera module 102 includes an optical element 118, which is illustrated as a diffuser, disposed over the terminal distal end of an optical fiber 116.
The camera module 102 can include a printed circuit board assembly (PCBA) 120. The image sensor 112 can be mounted directly to the PCBA 120, or can be operably coupled thereto, e.g., using a connector 122. The PCBA 120 can include power conditioning circuitry, hardware logic, clocks, and/or other components for operating the image sensor 112 and communicating image data generated by the image sensor to the controller 106.
One or more of the above camera module 102 components can be mounted in a frame or lens barrel 124. The lens barrel 124 can include an outer sidewall having one or more channels or lumens formed therein. For example, as shown, the lens barrel 124 can include a lens lumen 126 and an illumination lumen 128.
The lens lumen 126 can include proximal and distal ends 126p, 126d and a central longitudinal axis A1. The lens lumen 126 can have a circular transverse cross-section. The lens 114 can be disposed within a distal end of the lens lumen 126. The image sensor 112 can be positioned at or in a proximal end of the lens lumen 126. The lens lumen 126 can be open at its proximal and distal ends and closed along its sides as shown, or a portion or all of the lens lumen can be open to an outer sidewall of the lens barrel 124.
The illumination lumen 128 can include proximal and distal ends 128p, 128d and a central longitudinal axis A2. The illumination lumen 128 can have a circular transverse cross-section. The optical fiber 116 can be disposed within the illumination lumen 128. The illumination lumen 128 can be open at its proximal and distal ends and closed along its sides or, as shown, a portion or all of the illumination lumen can be open to an outer sidewall of the lens barrel 124. This can facilitate insertion of the optical fiber 116 into the lens barrel 124 during assembly, e.g., by allowing the optical fiber to be side-loaded or laterally-inserted into the illumination lumen 128. The distal end of the illumination lumen 128 can be curved or angled, e.g., such that it extends in a direction that is not parallel to the axis A2. This arrangement can allow a planar distal end of a straight-cut optical fiber 116 to be oriented parallel to an angle-cut distal end of the lens barrel 124, as shown. In other arrangements, the optical fiber 116 can be angle-cut to match the distal end of the lens barrel 124.
The lumens 126, 128 can be coaxial with one another or can be non-coaxial. For example, the lumens 126, 128 can be laterally offset from each other. The lumens 126, 128 can be formed such that they extend parallel to one another, e.g., such that their respective central longitudinal axes A1, A2 are parallel. As described further below, when integrated with an access device 110, the illumination lumen 128 can be disposed radially-inward from the lens lumen 126, e.g., such that the illumination lumen is disposed closer to the center of the access device working channel than the lens lumen. This can provide a more even distribution of light within the working channel of the access device 110.
The proximal end of the lens barrel 124 can include a recess or pocket 130. The image sensor 112 can be disposed within the recess 130. An image sensor connector 122 and/or at least a portion of the PCBA 120 can be disposed within the recess 130. The lens barrel 124 can include an internal baffle 132 disposed between the lens 114 and the image sensor 112. The baffle 132 can include an aperture through which light passing through the lens 114 can be communicated to the image sensor 112. The aperture can have a fixed dimension or can be mechanically-adjustable.
The lens barrel 124 can include a distal-facing surface at a terminal distal end thereof. The lumens 126, 128 of the lens barrel 124 can be open to the distal-facing surface. The distal facing surface can be obliquely angled. For example, the distal facing surface can lie substantially in a plane that is obliquely angled with respect to the central axis A1 of the lens lumen 126, obliquely angled with respect to the central axis A2 of the illumination lumen 128, and/or obliquely angled with respect to the central longitudinal axis of a working channel or access device in which the camera module 102 is disposed. The lens 114 can be flush with the distal-facing surface as shown, or can be recessed or protruding therefrom. The distal-facing surface can include a recess 134 in which the diffuser or other optical element 118 of the illumination system can be disposed. The recess 134 can be formed about the perimeter of the illumination lumen 128. The recess 134, and the optical element 118 disposed therein, can have a curved or crescent shape as shown. The recess 134 can include an inner edge that follows or substantially follows the perimeter of the lens lumen 126.
As described further below, when integrated with an access device 110, the distal-facing surface of the lens barrel 124 can be angled, in a distal-to-proximal direction, towards the center of the access device or a working channel thereof in which the camera module 102 is disposed. This can provide a better view of the surgical site for the image sensor 112 and/or a more even distribution of light within the surgical site. The lens barrel 124 can have a transverse exterior cross-section that is oval, oblong, circular, elliptical, square, rectangular, etc.
As shown in
The main body 136 of the housing 104 can include a proximal-facing surface 136p, a distal-facing surface 136d, and a sidewall 136s connecting the proximal-facing and distal-facing surfaces. One or more channels or lumens can be formed in the main body 136. The main body 136 can be formed by extrusion of a multi-lumen shaft. The main body 136 can be formed by welding or otherwise attaching multiple longitudinal components to one another.
The main body 136 can include a camera lumen 140 in which the camera module 102 can be selectively mounted. At least a portion of the camera lumen 140 can be lined with or can otherwise include a metallic tube, which can provide electromagnetic shielding for the components of the camera module 102.
The main body 136 can include one or more fluid lumens 142. For example, in the illustrated embodiment, the main body 136 includes a first fluid lumen 142A, which can be used to convey material from a proximal end of the housing 104 to a distal end of the housing, and a second fluid lumen 142B, which can be used to convey material from a distal end of the housing to the proximal end of the housing. While first and second fluid lumens 142 are shown, the housing 104 can include any number of fluid lumens. The first and second lumens 142 can form part of a lens cleaning system in which cleaning fluid or media is delivered through one lumen to the distal end of the camera module 102 and suction or vacuum is applied to the other lumen to draw cleaning fluid, tissue, debris, or other material away from the camera module. In some embodiments, a fluid such as air or other gas can be directed towards the lens under positive pressure. The positive pressure fluid can be conveyed through one or more of the lumens 142. Each of the lumens 142 can include a fluid connector or coupling 144 at a proximal end thereof for establishing fluid communication between the lumens and the connector assembly 108. The fluid connectors 144 can be male barbed fittings that extend proximally from the proximal-facing surface 136p of the main body 136 as shown, or can be any other type of fluid fitting or connection, such as a luer connection, a female fitting, etc. Multiple different types of cleaning media can be delivered through the same lumen or through separate lumens. The multiple types of cleaning media can be delivered simultaneously, sequentially, intermittently, or otherwise. In one example, a spray of saline or other liquid can be directed through a lumen to the lens, and can be chased with a burst of carbon dioxide or other gas through the same lumen. In another example, the liquid and gas can be delivered through separate lumens. In some embodiments, the housing can include only a single fluid lumen.
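To make the sequencing described above concrete, the following minimal sketch shows a liquid spray chased by a gas burst through the same lumen; the open_valve/close_valve callbacks and media names are hypothetical placeholders supplied by the caller, not part of any disclosed interface:

```python
import time

def run_lens_flush(open_valve, close_valve, liquid_s=0.5, gas_s=1.0):
    """Illustrative cleaning sequence: spray saline across the lens through the
    delivery lumen, then chase it with a burst of carbon dioxide through the
    same lumen. open_valve/close_valve are caller-supplied callbacks that
    actuate hypothetical media valves identified by name."""
    open_valve("saline")
    time.sleep(liquid_s)   # deliver the liquid spray
    close_valve("saline")

    open_valve("co2")
    time.sleep(gas_s)      # chase with a burst of gas
    close_valve("co2")
```

Delivering the liquid and gas through separate lumens would simply address different valve names (or separate callbacks) in the same sequence.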
The lumens 140, 142 of the housing 104 can be co-axial, non-coaxial, or some lumens can be coaxial while others are non-coaxial. The camera lumen 140 can be disposed centrally between first and second fluid lumens 142 as shown. The lumens 140, 142 of the housing 104 can be parallel to one another. The lumens 140, 142 can have various shapes. The lumens 140, 142 can have a transverse interior cross-section that is oval, oblong, circular, elliptical, square, rectangular, etc. The camera lumen 140 can have an interior shape that matches the exterior shape of the camera module 102, or the lens barrel 124 thereof.
The distal facing surface 136d of the main body 136 can be obliquely angled. For example, the distal facing surface 136d can lie substantially in a plane that is obliquely angled with respect to a central longitudinal axis A3 of the housing 104 and/or obliquely angled with respect to the central longitudinal axis of a working channel or access device in which the housing is disposed. The distal-facing surface 136d of the main body 136 can be angled in the same manner as the distal-facing surface of the camera module 102 or lens barrel 124. The distal-facing surface of the camera module 102 can be flush with the distal-facing surface 136d of the main body 136, or can be recessed or protruding relative thereto.
The sidewall 136s of the main body 136 can have any of a variety of shapes. The sidewall 136s can have a transverse exterior cross-section that is substantially triangular, crescent-shaped, circular, square, or rectangular. The sidewall 136s can be cylindrical. The sidewall 136s can be curved. The sidewall 136s can be shaped to facilitate integration of the housing 104 with an access device 110, e.g., to reduce or eliminate the degree to which the housing interferes with a working channel of the access device. The sidewall 136s can include an inner portion 136i that is concave. The inner portion 136i can be curved. The inner portion 136i can form a section of a working channel of an access device 110, e.g., a cylindrical or elliptical working channel. The sidewall 136s can include an outer portion 136o that is convex. The outer portion 136o can be curved. As shown for example in
The end cap 138 of the housing 104 can be coupled to the distal end of the main body 136. An exterior sidewall of the end cap 138 can have a shape that matches that of the sidewall of the main body 136. The end cap 138 can include one or more cut-outs 146 formed therein. The end cap 138 can include a first cut-out 146A aligned with the lens 114 of the camera module 102 to allow light to pass through the end cap and into the lens. The end cap 138 can include a second cut-out 146B aligned with the illumination system of the camera module 102 to allow light to pass through the end cap and into the surgical site. The first and second cut-outs 146 can be discrete cut-outs or can be contiguous as shown. The first and second cut-outs 146 can be circular, can have a shape that matches that of the lens 114 or illumination system, or can be otherwise shaped to function as described above. A proximal-facing surface of the end cap 138 can include one or more recesses, grooves, or channels 148 for directing fluid flow. Alternatively, the proximal-facing surface of the end cap 138 can be a flat planar surface and the recesses, grooves, or channels 148 can be formed in the distal-facing surface of the main body 136. Lateral end portions of the recesses 148 can be axially aligned with the fluid lumens 142 of the main body 136. Medial end portions of the recesses 148 can be open to the cut-outs 146 and/or to the distal-facing surface of the lens 114 and the illumination system. In use, as shown in
The housing 104 and/or the camera module 102 can be coupled to the controller 106 via a connector or connector assembly 108. An exemplary connector assembly 108 is shown in
The connector assembly 108 can include a distal section 108d and a proximal section 108p. In the distal section 108d, all of the conductors of the connector assembly 108 can be disposed within the outer sheath 150. In the proximal section 108p, one or more of the conductors can exit the outer sheath 150 and can extend separately therefrom. For example, in the illustrated embodiment, the distal section 108d of the connector assembly 108 includes electrical conductors 152A, one or more optical fibers 152B, and first and second fluid lumens 152C all extending through a common outer sheath 150. Also in the illustrated embodiment, the proximal section 108p of the connector assembly 108 is configured such that the electrical conductors 152A and one or more optical fibers 152B continue through the common outer sheath 150 while the first and second fluid lumens 152C exit the outer sheath as discrete fluid tubes 154. The outer sheath 150 can include a mechanical connector 156 at a proximal end thereof for making optical and/or electrical connections with the controller 106. The fluid tubes 154 can include respective fluid fittings or connectors for making fluid connections, e.g., with a fluid reservoir and/or vacuum source or positive pressure source of the controller 106 or separate from the controller.
The distal section 108d of the connector assembly 108 can have an exterior shape that matches that of the housing 104, e.g., as shown in
An exemplary controller 106 is shown in
The controller 106 can include an interface 160, such as a communication interface or an I/O interface. A communication interface can enable the controller 106 to communicate with remote devices (e.g., other controllers or computer systems) over a network or communications bus (e.g., a universal serial bus). An I/O interface can facilitate communication between one or more input devices, one or more output devices, and the various other components of the controller 106. Exemplary input devices include touch screens, mechanical buttons, keyboards, and pointing devices. The controller 106 can include a storage device 162, which can include any conventional medium for storing data in a non-volatile and/or non-transient manner. The storage device 162 can include one or more hard disk drives, flash drives, USB drives, optical drives, various media disks or cards, and/or any combination thereof. The controller 106 can include a display 164, and can generate images to be displayed thereon. The display 164 can be an electronic display, such as a vacuum fluorescent display (VFD), an organic light-emitting diode (OLED) display, or a liquid crystal display (LCD). The controller 106 can include a power supply 166 and appropriate regulating and conditioning circuitry. Exemplary power supplies include batteries, such as polymer lithium ion batteries, or adapters for coupling the controller 106 to a DC or AC power source (e.g., a USB adapter or a wall adapter).
The controller 106 can include a fluid source 168 that can be in fluid communication with a fluid lumen 152C of the connector assembly 108 when the connector assembly is attached to the controller 106. The fluid source 168 can include a reservoir of cleaning media. The cleaning media can be a flowable gas or liquid. The cleaning media can include one or more of carbon dioxide, saline, oxygen, air, water, and the like. The fluid source can include a source of positive pressure air or other gas. The fluid source 168 can include a pump or other mechanism for urging cleaning media through the connector assembly 108 and the housing 104. The pump can be controlled by the processor 156 to execute a cleaning cycle. The cleaning cycle can be executed automatically, or in response to a user instruction. For example, the processor 156 can detect user actuation of a button, foot pedal, or other interface element and initiate a cleaning cycle in response thereto.
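As a rough sketch of that control flow (not the actual controller firmware), a pump-driven cleaning cycle and its button/pedal trigger might look like the following; the pump object and its start()/stop() methods are assumptions made for the example:

```python
import time

def run_cleaning_cycle(pump, duration_s=2.0):
    """Urge cleaning media through the connector assembly and housing for a
    fixed interval. pump is a hypothetical interface to the fluid source 168
    with start()/stop() methods; it is not part of any disclosed API."""
    pump.start()
    time.sleep(duration_s)
    pump.stop()

def on_cleaning_input(pressed, pump):
    """Initiate a cleaning cycle when actuation of a button, foot pedal, or
    other interface element is detected."""
    if pressed:
        run_cleaning_cycle(pump)
```

A suction-side cycle, as described below for the suction source 172, could follow the same pattern with a corresponding valve or pump interface.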
The controller 106 can include a light source 170 that can be optically coupled to an optical fiber 152B of the connector assembly 108 when the connector assembly is attached to the controller. Exemplary light sources include light-emitting diodes (LEDs), incandescent or fluorescent bulbs, etc.
The controller 106 can include a vacuum or suction source 172 that can be in fluid communication with a fluid lumen 152C of the connector assembly 108 when the connector assembly is attached to the controller. The controller 106 can include an onboard vacuum pump for generating suction, or can be configured to attach to a standard hospital or operating room vacuum supply. The suction source 172 can include a valve, regulator, or other component for adjusting the degree of suction applied to the housing 104 and/or for turning the suction on or off. The suction source 172 can be controlled by the processor 156, e.g., to execute a cleaning cycle. The cleaning cycle can be executed automatically, or in response to a user instruction. For example, as noted above, the processor 156 can detect user actuation of a button, foot pedal, or other interface element and initiate a cleaning cycle in response thereto.
The controller 106 can include one or more connectors 174 for mating with the connector 156 and/or the fluid couplings 154 of the connector assembly 108. When mated, the connectors can establish electrical, optical, and/or fluid connections between the controller 106 and the connector assembly 108. It will be appreciated that any one or more of the components above can be disposed external to the housing of the controller 106, and/or can be separate or isolated from the controller altogether. For example, any one or more of the fluid source 168, the light source 170, and the suction source 172 can be external to and/or separate from the controller 106.
The controller 106 can receive image data from the image sensor 112. The image data can be communicated via a wired or wireless connection. The image data can include still image data and/or video image data. The controller 106 can display the image data on an on-board display 164 of the controller, or on one or more external monitors or displays operably coupled to the controller. The image data can be displayed to a surgeon or other user to facilitate a surgical procedure. The controller 106 can display patient data, user interface controls for manipulating system settings, controls for capturing screen shots of displayed images, and so forth.
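Purely as an illustrative sketch of this display path (and not the controller's actual software), a loop that pulls frames from a caller-supplied source and presents them on a monitor could be written as follows; get_frame is a hypothetical callable, and OpenCV is used here only as a convenient display mechanism:

```python
import cv2  # OpenCV, used in this sketch solely to render frames on a monitor

def display_loop(get_frame, window="Surgical Camera"):
    """Continuously pull frames from a caller-supplied source (wired or
    wireless) and present them on an external monitor. get_frame is a
    hypothetical callable returning a BGR numpy array, or None to stop."""
    while True:
        frame = get_frame()
        if frame is None:
            break
        cv2.imshow(window, frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc key closes the view
            break
    cv2.destroyAllWindows()
```

Overlays such as patient data or screen-shot controls could be drawn on each frame before it is displayed.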
The various functions performed by the controller 106 can be logically described as being performed by one or more modules. It will be appreciated that such modules can be implemented in hardware, software, or a combination thereof. It will further be appreciated that, when implemented in software, modules can be part of a single program or one or more separate programs, and can be implemented in a variety of contexts (e.g., as part of an embedded software package, an operating system, a device driver, a standalone application, and/or combinations thereof). In addition, software embodying one or more modules can be stored as an executable program on one or more non-transitory computer-readable storage mediums.
The system 100 can be configured to be integrated or used with an access device. Exemplary access devices can include cannulas, retractors, tubes, and other structures for providing a working channel in a surgical application. The access device can define a working channel that extends from a surgical site, e.g., an intervertebral disc space or a spinal region proximate thereto, to a location outside a patient's body. The access device can include a visualization channel for receiving the housing 104 and the camera module 102 of the visualization system 100. The visualization channel can also receive at least a portion of the connector assembly 108. The visualization channel can be the same as the working channel, can be independent of the working channel, or can overlap or intersect with the working channel. At least a portion of the sidewall of the working channel can be defined by an exterior surface of the housing 104. The housing 104 and/or the camera module 102 can be disposed off-center within the access device. The housing 104 and/or the camera module 102 can be slidably and/or rotatably coupled to the access device.
The housing 104 and/or the camera module 102 can be axially translatable within the access device, e.g., in a proximal-distal direction, to adjust the depth of the camera module relative to the access device. For example, the housing 104 can be advanced distally relative to the access device to move the lens 114 and image sensor 112 closer to the surgical site. By way of further example, the housing 104 can be retracted proximally relative to the access device to move the lens 114 and image sensor 112 farther from the surgical site. The ability to reposition the camera module 102 within the access device can facilitate various surgical procedures. For example, in an exemplary TLIF procedure, the camera module 102 can be positioned relatively shallow within the access device when cutting through Kambin's triangle and can then be advanced deeper within the access device when performing subsequent discectomy. In some cases, the camera module 102 can be advanced distally into the disc space. The camera module 102 can be advanced distally so as to protrude or extend from a distal end of an access device while the access device is inserted into a patient. For example, the camera module 102 can be advanced such that the lens 114 and/or the image sensor 112 is disposed outside of and distal to a terminal distal end of the access device. The camera module 102 can be advanced or retracted to any of an infinite number of relative longitudinal positions with respect to the access device. The ability to reposition the camera module 102 can also allow the camera module to be a modular component interchangeably usable with many different types or sizes of access device, or with an adjustable-length access device.
An exemplary access device 110 is shown in
The access device 110 can define a working channel 174 extending between the proximal and distal ends and having a central longitudinal axis A4. The working channel 174 can be cylindrical. The working channel 174 can have a circular transverse cross-section. The working channel 174 can have a diameter in the range of about 3 mm to about 30 mm, in the range of about 10 mm to about 20 mm, and/or in the range of about 12 mm to about 15 mm. The working channel 174 can have a diameter of about 15 mm. While a single working channel 174 is shown, the access device 110 can include any number of working channels. In use, instruments and/or implants can be disposed in, passed through, and/or inserted into the working channel 174 to perform a surgical procedure. In some embodiments, the access device 110 can be used to access an intervertebral disc space. A cutting instrument can be inserted through the working channel 174 to cut tissue, such as bone or disc tissue. An aspiration instrument can be inserted through the working channel 174 to aspirate material from the disc space, including excised bone or disc tissue. The cutting instrument and the aspiration instrument can be a single tool. An implant such as a fusion cage, a height and/or width expandable fusion cage, a disc prosthesis, or the like can be inserted into the disc space through the working channel 174.
The access device 110 can define a visualization channel 176. The visualization channel 176 can extend between the proximal and distal ends of the access device 110, or can extend along less than an entire length of the access device. The visualization channel 176 can include a central longitudinal axis A5. The central axis A5 of the visualization channel 176 can be disposed radially-outward from the central axis A4 of the working channel 174. The working channel 174 can have a greater transverse cross-sectional area than the visualization channel 176. The visualization channel 176 can be open to, or can intersect with, the working channel 174 along its length. The visualization channel 176 can be isolated or separate from the working channel 174.
The visualization channel 176 can have an interior transverse cross section that matches or substantially matches the exterior transverse cross-section of the housing 104. When disposed within the visualization channel 176, an exterior surface of the housing 104 can define at least a portion of the inner sidewall of the working channel 174. The working channel 174 can be cylindrical about the central axis A4 and the surface of the housing 104 that faces the working channel can form a section of a cylinder centered on the axis A4. The inner sidewall of the working channel 174 and the outer surface of the housing 104 can define a substantially smooth and continuous surface.
The access device 110 can include an attachment feature 180, e.g., for attaching the access device to a support or other object. The attachment feature 180 can be formed at a proximal end of the access device 110. For example, the access device 110 can include an annular circumferential groove 180 formed in an exterior surface thereof.
The access device 110 can include a mating feature 178 for stabilizing, holding, and/or attaching the visualization system 100 to the access device. The mating feature 178 can be a proximal extension of the access device 110 as shown. The mating feature 178 can define one or more tracks 182 configured to receive the connector assembly 108 therein. The tracks 182 can be open to one side such that the connector assembly 108 can be loaded laterally into the track, e.g., by moving the connector assembly away from the central axis A5 of the visualization channel 176. Alternatively, or in addition, the connector assembly 108 can be loaded into the mating feature 178 by translating the connector assembly proximally or distally relative thereto.
One or more of the tracks 182 can define a connector path that is curved or obliquely-angled away from the central axis A5 of the visualization channel 176. The track 182 can have an interior transverse cross-section that matches or substantially matches the exterior transverse cross-section of the connector assembly 108. The track 182 can extend around the outer periphery of the connector assembly 108. The track 182 can extend around the connector assembly 108 to a sufficient degree that the free edges of the track interfere slightly with side-loading of the connector assembly into the track. Accordingly, slight deformation or deflection of the connector assembly 108 and/or the mating feature 178 can be required to load the connector assembly into the track 182. This can allow the connector assembly 108 to be held securely by the mating feature 178, e.g., by “snapping” the connector assembly into the track 182.
The mating feature 178 can include inner and outer tracks 182A, 182B, e.g., as shown in the accompanying figures.
The connector assembly 108 can be secured to the mating feature 178 at any point along its length, e.g., at any point along the distal section 108d of the connector assembly. Accordingly, the visualization system 100 can be locked to the access device 110 at any inserted depth of the camera module 102. In addition, the visualization system 100 can be locked with the camera module 102 at a desired depth, regardless of the length of the access device 110, the position of the mating feature 178 along the access device, etc. This can allow the visualization system 100 to be interchangeably used with any of a variety of different type or size access devices 110.
The ability to lock the system 100 to the mating feature 178 can allow the camera module 102 to be used in a hands-free manner. In other words, the surgeon or other user does not need to manually grasp and/or hold the camera module 102 in place during use. The mating feature 178 can provide for simple, quick, and/or one-handed depth adjustment of the camera module 102, resulting in minimal delay and disruption to the procedure.
The access device 110 can have an exterior transverse cross-section that is circular, e.g., as shown in the accompanying figures.
An exemplary method of using the system 10 is illustrated in the accompanying figures.
Exemplary properties of the camera module 102 and the lens 114 thereof are shown in the accompanying figures.
The system 100 can include active cleaning features. Active cleaning features can include application of an active force to the lens 114, the illumination system, or other components of the camera module 102. The active force can be or can include a fluid jet, fluid suction, mechanical or acoustic vibration, mechanical wipers, etc. The active force can be or can include positive pressure air or other gas directed towards, onto, and/or across the lens or other component(s) of the camera module.
The system 100 can include passive cleaning features. The passive cleaning features can be used independently, or can augment or improve the performance of active cleaning features. As one example, the lens 114 can have a coating applied thereto to resist or prevent adhesion of debris to the lens. The coating can be hydrophilic. The coating can be oleophilic. The coating can be hydrophobic. The coating can be oleophobic. The coating can be a pollution-repellant coating. The coating can be a gradient coating, e.g., one in which a central region of the lens has a hydrophobic coating and a peripheral region of the lens has a hydrophilic coating. The gradient lens coating can be effective to “walk” or direct fluid from the center of the lens towards the outer periphery of the lens and out of the way of the image sensor 112.
As noted above, the main body 136 of the housing 104 can be formed by welding or otherwise attaching multiple longitudinal components to one another, e.g., as shown in the accompanying figures.
The access devices disclosed herein can be inserted into a patient using various dilation techniques. In an exemplary method, a guidewire or needle can be inserted through a percutaneous incision in the patient. The guidewire can be placed using fluoroscopic guidance, a surgical navigation system, freehand, or otherwise. The incision can be sequentially or serially dilated, for example by inserting one or more dilators over the guidewire, each having a progressively larger outside diameter. Once sufficiently dilated, the access device can be inserted by placing the outer-most dilator into the working channel of the access device and sliding the access device distally along the dilator and into the patient. The one or more dilators can then be removed from the patient, leaving an open working channel through the access device through which the surgical procedure can be conducted.
There can be instances in which it may be necessary or desirable to augment standard cylindrical dilation techniques. For example, the system can include a dilation sleeve 113 having a radially-movable bulb 115, as described below.
The sleeve 113 can be substantially cylindrical. The sleeve 113 can be hollow to define a channel 119 through which a standard cylindrical dilator 109 can be inserted. The sleeve 113 can define an outside diameter D. The bulb 115 can be movable relative to the sleeve 113 between a first position in which the bulb is disposed entirely within the outside diameter D of the sleeve and a second position in which at least a portion of the bulb protrudes out from the outside diameter D of the sleeve. The bulb 115 can be biased towards the second position. The bulb 115 can be attached to or formed integrally with the sleeve 113. The bulb 115 can be attached to the sleeve by a spring. For example, the bulb 115 can be mounted at the distal end of a longitudinal leaf spring or flat spring 121 of the sleeve 113. The spring 121 can be an integral extension of the sleeve 113 defined between opposed longitudinal slits formed in the sleeve. The bulb 115 can include a distal-facing surface 115d that is ramped, curved, tapered, or otherwise configured to provide a smooth lead-in between the outside diameter of a dilator 109 disposed in the sleeve 113 and the outside diameter of the access device 110. The bulb 115 can include a proximal-facing surface 115p that is ramped, curved, tapered, or otherwise configured to urge the bulb radially-inward towards the first position as the sleeve 113 is withdrawn proximally from the access device 110.
In use, an incision can be sequentially dilated using standard cylindrical dilators, including an outermost dilator 109. The sleeve 113 can be loaded into the access device 110 with the bulb 115 disposed in the first, radially-inward position. The sleeve 113 can be advanced distally relative to the access device 110 until the bulb 115 is at the depth of the void space 107, at which point the bulb can move radially-outward to the second position under the bias of the spring 121. The bulb 115 can also be urged radially-outward, and maintained in that position, by insertion of a dilator 109 through the sleeve 113. The access device 110 with the inserted sleeve 113 can then be advanced distally over the outer-most dilator 109. As the access device 110 is advanced distally, the distal-facing surface 115d of the bulb 115 can gently urge tissue out of the path of the access device. Once the access device 110 is positioned as desired, the dilators 109 can be removed from the sleeve 113 by withdrawing the dilators proximally therefrom. The sleeve 113 can also be removed from the access device 110 by withdrawing the sleeve proximally therefrom. As the sleeve 113 is withdrawn proximally relative to the access device 110, the bulb 115 can be urged radially-inward, e.g., by the proximal-facing surface 115p of the bulb 115 bearing against the distal end of the visualization channel 176, thereby moving the bulb to the first, radially-inward position to allow the sleeve to be removed from the access device. The sleeve 113 can form a dilator having a profile at its distal end that differs from its profile at its proximal end.
In some embodiments, an access device 110A can include a transition portion 123 that is movable radially between first and second positions. The transition portion 123 can be moved between the first and second positions by a dilation shaft 127 insertable through the access device 110A. The dilation shaft 127 can include a distal shroud 129 configured to contact and bear against the transition portion 123 as the dilation shaft is advanced distally within the access device 110A to urge the transition portion radially outward. The shroud 129 can form a section of a cylinder as shown. At least a portion of the visualization channel 176 can be formed in the dilation shaft 127. The dilation shaft 127 can remain in place within the access device 110A when the camera module 102 is disposed in the access device and as the surgical procedure is performed.
The housing can include a wiper, brush, flap, or other feature for clearing debris from the lens. The wiper can be disposed within, inserted through, and/or deployable from a lumen of the housing. For example, the wiper can be selectively deployable through a nozzle opening of a fluid lumen of the housing. The wiper can be deployed from the opening to wipe debris from the lens before, during, or after a fluid is directed through the lumen and towards the lens.
The housing can include various features for retracting, shielding, or manipulating tissue adjacent to the camera lens. For example, the housing can include a tissue shield that protrudes distally beyond the lens to hold tissue out of the camera's field of view.
The shield can be movable with respect to the housing. For example, the shield can be retractable relative to the housing, e.g., longitudinally retractable. In some embodiments, the shield can be slidably disposed within a lumen formed in the housing. The shield can be configured to translate longitudinally within the lumen in a proximal-distal direction. This can allow the shield to be selectively deployed or retracted as desired by the user, or for the degree of shield protrusion to be adjusted during the surgery. Movement of the shield relative to the housing can be controlled in various ways, for example by the user manually grasping a proximal end of the shield and sliding it relative to the housing.
The shield can include a wiper, brush, flap, fluid jet, vacuum port, or other feature for clearing debris from the lens, for example, as the shield is moved relative to the housing.
The tissue shield can be disposed within a lumen of the housing as described above, or can be otherwise incorporated into the system. For example, the tissue shield can be formed integrally with the housing, can be formed integrally with the access device, and/or can be formed integrally with the camera module. As another example, the tissue shield can be slidably disposed within a lumen of the housing, a lumen of the access device, and/or a lumen of the camera module. As yet another example, the tissue shield can be slidable along an exterior surface of the housing, the access device, and/or the camera module.
The system can include active mechanical and/or acoustic systems for maintaining clear visualization for the camera. For example, the system can include an ultrasonic agitator that can be actuated to clear debris from the lens or to prevent debris from blocking the lens in the first place.
The agitator can include a transducer. The transducer can be a piezoelectric transducer. The transducer can emit ultrasonic waves, e.g., at a frequency in the range of about 20 kHz to about 40 kHz. The transducer can be a ring-shaped transducer, a plate-type transducer, or any other suitable transducer type.
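Purely for illustration, and not as part of the disclosure, the relationship between the drive frequency and the drive waveform can be sketched in a few lines of Python; the 30 kHz drive frequency, 1 MHz sample rate, and 5 ms burst length below are assumptions chosen for the example, with only the 20-40 kHz operating range taken from the description above.

```python
import numpy as np

# Illustrative sketch only: synthesize one sine burst for an ultrasonic
# agitator operating somewhere in the ~20-40 kHz range described above.
# The drive frequency, sample rate, and burst length are assumed values.
DRIVE_FREQ_HZ = 30_000       # assumed drive frequency within the stated range
SAMPLE_RATE_HZ = 1_000_000   # assumed 1 MHz sampling (~33 samples per cycle)
BURST_S = 0.005              # assumed 5 ms burst

t = np.arange(0.0, BURST_S, 1.0 / SAMPLE_RATE_HZ)
burst = np.sin(2.0 * np.pi * DRIVE_FREQ_HZ * t)   # unit-amplitude sine burst

print(f"period: {1e6 / DRIVE_FREQ_HZ:.1f} us, samples: {burst.size}")
```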
While ultrasonic agitators are described above, it will be appreciated that any means for applying vibration or agitation to the system can be used instead or in addition. In some embodiments, an electric motor having an eccentrically mounted mass can be used to apply vibration to the system. The motor can be mounted within the camera module, housing, or access device. In some embodiments, the system can include an actuator, such as a solenoid or linear actuator, configured to strike the camera module when an electric potential is applied thereto. In use, current can be selectively applied to the actuator to cause the actuator to strike the camera module and thereby dislodge or clear debris from the lens. The actuator can be mounted within the camera module, housing, or access device. In some embodiments, a generator that operates below the ultrasound frequency range, e.g., in the infrasound or acoustic ranges, can be used to clear debris from the lens.
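A minimal sketch of the "selectively applied current" behavior described above is given below; the `set_current` callback, pulse count, and timings are hypothetical stand-ins for whatever driver circuitry an actual embodiment would use, and are not taken from the disclosure.

```python
import time
from typing import Callable

def pulse_actuator(set_current: Callable[[bool], None],
                   pulses: int = 3,
                   on_s: float = 0.05,
                   off_s: float = 0.10) -> None:
    """Apply a short train of current pulses to a debris-clearing actuator.

    `set_current` is a hypothetical driver callback (True = energize,
    False = de-energize); the pulse count and timings are assumed values.
    """
    for _ in range(pulses):
        set_current(True)    # energize: actuator strikes or vibrates
        time.sleep(on_s)
        set_current(False)   # de-energize: actuator releases
        time.sleep(off_s)

if __name__ == "__main__":
    # Stand-in driver that simply logs state changes.
    pulse_actuator(lambda on: print("actuator", "ON" if on else "OFF"))
```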
The system can include a membrane movable across the lens to maintain visibility through the lens. The membrane can be transparent. The membrane can be drawn across the lens to change the portion of the membrane that is aligned with the lens, e.g., to move a soiled section of the membrane away from the lens and to replace it with a clean section of the membrane. The membrane can be a continuous loop of material that is drawn across the lens and moved past a wiper, brush, flap, fluid jet, vacuum port, or other cleaning element that removes debris from the membrane. Thus, a soiled section of the membrane can be moved away from the lens and replaced with a clean section of the membrane, the soiled section eventually being moved across a brush or wiper to clean that section before it is again aligned with the lens. The membrane can be wound around one or more spools, for example with soiled sections of the membrane being wound around one spool after use as clean sections are unwound from another spool to be aligned with the lens. Movement of the membrane can be continuous or intermittent. Movement of the membrane can be controlled by an electric motor, a manual crank or handle, or various other mechanisms. Movement of the membrane can occur automatically, e.g., in response to the controller detecting debris or lack of clarity in images captured from the camera, or manually, e.g., in response to user actuation of a button, wheel, or other input mechanism.
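As a purely illustrative sketch of the automatic mode described above, a controller could score each camera frame for clarity and advance the membrane when the score falls below a threshold; the clarity metric (variance of a discrete Laplacian), the threshold value, and the `advance` callback are assumptions for the example, not details taken from the disclosure.

```python
import numpy as np

def clarity_metric(gray: np.ndarray) -> float:
    """Variance of a discrete Laplacian of a grayscale frame; low values
    suggest a blurred or debris-obscured view (assumed metric)."""
    g = gray.astype(np.float64)
    lap = (-4.0 * g[1:-1, 1:-1]
           + g[:-2, 1:-1] + g[2:, 1:-1]
           + g[1:-1, :-2] + g[1:-1, 2:])
    return float(lap.var())

def maybe_advance_membrane(frame: np.ndarray, advance, threshold: float = 50.0) -> bool:
    """Advance the membrane via the hypothetical `advance` callback when the
    clarity score drops below the assumed threshold. Returns True if advanced."""
    if clarity_metric(frame) < threshold:
        advance()
        return True
    return False

if __name__ == "__main__":
    featureless = np.full((64, 64), 128.0)   # uniform frame, clarity score ~ 0
    maybe_advance_membrane(featureless, lambda: print("advance membrane"))
```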
The system can include a mechanical wiper movable across the lens to clear debris therefrom.
Various lens cleaning mechanisms described herein can be used individually or in combination. For example, a visualization system can include a mechanical wiper, an ultrasonic agitator, a movable membrane, and a fluid cleaning system. As another example, a visualization system can include an ultrasonic agitator and a fluid cleaning system. As another example, a visualization system can include a movable membrane and a fluid cleaning system. Any other combination or sub-combination can also be used.
The visualization systems and/or access devices disclosed herein can be used in any of a variety of surgical procedures. For example, such systems and devices can be used in ear, nose, and throat (ENT) surgery, sinus surgery, gastrointestinal (GI) surgery, abdominal surgery, intravascular surgery, cardiothoracic surgery, joint surgery, and so forth. In some embodiments, the visualization system can be used, with or without an access device, as a self-cleaning endoscope for sinus surgery. Active and/or passive cleaning features of the system can reduce or eliminate the need to repeatedly withdraw the scope from the patient to clean the lens. In some embodiments, the visualization system can be used, with or without an access device, as a self-cleaning endoscope for airway surgery (e.g., laryngoscopy, bronchoscopy, etc.). In some embodiments, the visualization system can be used, with or without an access device, as a self-cleaning upper and/or lower GI scope. The visualization system can form a rigid endoscope with self-cleaning abilities.
The various housings and camera modules disclosed herein can be used with an access device, or can be used independently without any access device. Any of the systems described herein can include a housing that is separate and distinct from the camera module, or can include an integral camera module and housing, e.g., a system in which the outer envelope of the camera module defines the housing.
It should be noted that any ordering of method steps expressed or implied in the description above or in the accompanying drawings is not to be construed as limiting the disclosed methods to performing the steps in that order. Rather, the various steps of each of the methods disclosed herein can be performed in any of a variety of sequences. In addition, as the described methods are merely exemplary embodiments, various other methods that include additional steps or include fewer steps are also within the scope of the present disclosure.
The devices disclosed herein can be constructed from any of a variety of known materials. Exemplary materials include those which are suitable for use in surgical applications, including metals such as stainless steel, titanium, nickel, cobalt-chromium, or alloys and combinations thereof, polymers such as PEEK, ceramics, carbon fiber, and so forth. The various components of the devices disclosed herein can be rigid or flexible. One or more components or portions of the device can be formed from a radiopaque material to facilitate visualization under fluoroscopy and other imaging techniques, or from a radiolucent material so as not to interfere with visualization of other structures. Exemplary radiolucent materials include carbon fiber and high-strength polymers.
The devices and methods disclosed herein can be used in minimally-invasive surgery and/or open surgery. While the devices and methods disclosed herein are generally described in the context of spinal surgery on a human patient, it will be appreciated that the methods and devices disclosed herein can be used in any type of surgery on a human or animal subject, in non-surgical applications, on non-living objects, and so forth.
Although specific embodiments are described above, it should be understood that numerous changes may be made within the spirit and scope of the concepts described.
The present application is a continuation of U.S. application Ser. No. 15/901,435, filed on Feb. 21, 2018. U.S. application Ser. No. 15/901,435 is a continuation-in-part of U.S. application Ser. No. 15/692,845, filed on Aug. 31, 2017. U.S. application Ser. No. 15/692,845 claims priority to U.S. Provisional Application No. 62/468,475, filed on Mar. 8, 2017. U.S. application Ser. No. 15/692,845 is also a continuation-in-part of U.S. application Ser. No. 15/437,792 filed on Feb. 21, 2017 (now U.S. Pat. No. 10,874,425). U.S. application Ser. No. 15/437,792 is a continuation-in-part of U.S. application Ser. No. 15/254,877, filed on Sep. 1, 2016. U.S. application Ser. No. 15/254,877 claims priority to U.S. Provisional Application No. 62/214,297 filed on Sep. 4, 2015. The entire contents of each of these applications are incorporated herein by reference.