Field
Embodiments of the present disclosure relate to surgical devices and visualization systems for use during surgery.
Description of Related Art
Some surgical operations involve the use of large incisions. These open surgical procedures provide ready access for surgical instruments and the hand or hands of the surgeon, allowing the user to visually observe and work in the surgical site, either directly or through an operating microscope or with the aid of loupes. Open surgery is associated with significant drawbacks, however, as the relatively large incisions result in pain, scarring, and the risk of infection as well as extended recovery time. To reduce these deleterious effects, techniques have been developed to provide for minimally invasive surgery. Minimally invasive surgical techniques, such as endoscopy, laparoscopy, arthroscopy, pharyngo-laryngoscopy, as well as small incision procedures utilizing an operating microscope for visualization, utilize a significantly smaller incision than typical open surgical procedures. Specialized tools may then be used to access the surgical site through the small incision. However, because of the small access opening, the surgeon's view of and workspace in the surgical site are limited. In some cases, visualization devices such as endoscopes, laparoscopes, and the like can be inserted percutaneously through the incision to allow the user to view the surgical site.
The visual information available to a user through laparoscopic or endoscopic approaches involves trade-offs. Accordingly, there is a need for improved visualization systems for use in minimally invasive surgery.
The systems, methods and devices of the disclosure each have several innovative aspects, no single one of which is solely responsible for the desirable attributes disclosed herein.
In one aspect, a medical apparatus for use with a surgical tubular retractor configured to hold open an incision and thereby provide a pathway for access of surgical tools to a surgical site, the surgical retractor having a first end, a second end, and a longitudinal axis extending between the first and second ends, wherein the pathway extends along the longitudinal axis into the surgical site and the first end is more proximal than the second end, the apparatus comprising: an imaging insert configured to be received within the tubular retractor, the imaging insert comprising a proximal end and a distal end, wherein the imaging insert is configured to extend along the longitudinal axis of the tubular retractor between the first and second ends of the tubular retractor without substantially obstructing the pathway, thereby maintaining the pathway and allowing the surgical tools to gain access to the surgical site through the proximal end of the imaging insert; and wherein the imaging insert comprises: a proximal head of the imaging insert at the proximal end, the proximal head configured to be disposed above the first end of the tubular retractor; and a plurality of cameras inwardly facing toward the pathway; and wherein the plurality of cameras are disposed on an inner surface of the imaging insert. In some embodiments, the medical apparatus wherein the imaging insert further comprises an illumination assembly disposed on an inner surface of the imaging insert. In some embodiments, the medical apparatus wherein the imaging insert is substantially tubular and an outer surface of the imaging insert is configured to contact an inner surface of the tubular retractor. In some embodiments, the medical apparatus wherein the imaging insert comprises one or more pieces configured to extend from the proximal head, and wherein an outer surface of the one or more pieces is configured to contact an inner surface of the tubular retractor. In some embodiments, the medical apparatus wherein the imaging insert comprises one or more pieces configured to be disposed adjacent to one another, and wherein an outer surface of the one or more pieces is configured to contact an inner surface of the tubular retractor. In some embodiments, the medical apparatus wherein the imaging insert comprises one or more annular pieces configured to be disposed adjacent to one another to form a substantially tubular insert, and wherein an outer surface of the one or more annular pieces is configured to contact an inner surface of the tubular retractor. In some embodiments, the medical apparatus wherein the imaging insert comprises a restraint configured to prohibit the imaging insert from passing completely through the pathway of the tubular retractor. In some embodiments, the medical apparatus wherein the proximal head of the imaging insert is wider than a distal portion of the imaging insert to restrain the imaging insert from passing completely through the pathway of the tubular retractor. In some embodiments, the medical apparatus wherein the proximal end of the imaging insert abuts the first end of the tubular retractor and the distal end of the insert is configured to be aligned with the second end of the tubular retractor. In some embodiments, the medical apparatus wherein the imaging insert is configured to slidably engage with the tubular retractor.
In some embodiments, the medical apparatus wherein the imaging insert comprises a ridge on an outer surface configured to correspond to a groove on an inner surface of the tubular retractor, wherein the groove is configured to receive the ridge and to allow the insert to be slidably engaged with the retractor. In some embodiments, the medical apparatus wherein the imaging insert comprises a groove on an outer surface configured to correspond to a ridge on an inner surface of the tubular retractor, wherein the groove is configured to receive the ridge and to allow the insert to be slidably engaged with the retractor. In some embodiments, the medical apparatus wherein at least one of the plurality of cameras is on a flexible cable configured to be moved between the proximal and distal ends of the imaging insert. In some embodiments, the medical apparatus wherein the flexible cable is configured to be fed through a slot on the proximal head of the imaging insert, wherein the at least one of the plurality of cameras on the flexible cable is moved closer to or further from the distal end as the flexible cable is raised and lowered within the imaging insert. In some embodiments, the medical apparatus further comprising a plurality of connectors on the proximal head of the imaging insert, wherein the plurality of connectors comprise an optical fiber input port, a fluid port, or an air port. In some embodiments, the medical apparatus wherein the illumination assembly comprises at least one illumination source, the at least one illumination source comprising light guides or fibers. In some embodiments, the medical apparatus wherein the illumination assembly comprises at least one illumination source, the at least one illumination source configured to be longitudinally movable along a length of the imaging insert. In some embodiments, the medical apparatus wherein the plurality of cameras are configured to be longitudinally movable along a length of the imaging insert.
In another aspect, a medical apparatus comprising: a camera platform, the camera platform comprising a camera module; a flexible joint; a movement assembly; and a retractor connector surface, the retractor connector surface configured to be mounted to a surface of a retractor; wherein the flexible joint is configured to permit movement of the camera platform relative to the retractor connector surface. In some embodiments, the medical apparatus wherein the movement assembly comprises at least one set of push-pull cables. In some embodiments, the medical apparatus wherein the movement assembly comprises an electro-mechanical actuator configured to actuate the at least one set of push-pull cables. In some embodiments, the medical apparatus wherein the movement assembly comprises an actuator configured to move the camera platform, wherein the actuator is pneumatically or hydraulically driven.
In certain aspects, a movable mechanical device for positioning a camera on a surgical device is disclosed. The device can include a camera platform configured to attach to the camera, a surgical device connector surface configured to attach to the surgical device, and an electro-mechanical actuator configured to control the position of the camera. The electro-mechanical actuator can comprise a Micro-Electro-Mechanical System (MEMS) actuator. In some embodiments, the surgical device can include a surgical tool. In other embodiments, the surgical device can include a retractor configured to hold open a surgical incision and to provide access to a surgical site.
In certain aspects, an imaging module for disposing on a surgical device is disclosed. The imaging module can be configured to provide images of a surgical site within a field-of-view of the imaging module. The imaging module can include at least one optical sensor having at least one active detection area on a front face of the at least one optical sensor. The imaging module can also include first and second channels. The first channel can include first imaging optics configured to focus light from the surgical site onto the at least one active detection area to form left-eye view images of the surgical site on the at least one active detection area. The first imaging optics can comprise one or more lenses. The second channel can include second imaging optics configured to focus light from the surgical site onto the at least one active detection area to form right-eye view images of the surgical site on the at least one active detection area. The second imaging optics can include one or more lenses.
The imaging module can also include redirection optics between the first and second imaging optics and the at least one active detection area. The redirection optics can be configured to redirect light from the first and second imaging optics to the at least one active detection area such that the at least one optical sensor can be oriented so as to reduce obstruction to the surgical site by the at least one optical sensor. In addition, the imaging module can include a mask associated with the at least one optical sensor. The mask can be configured to partition the at least one active detection area of the at least one optical sensor to define left-eye and right-eye views of the left-eye and right-eye view images.
In various embodiments, the mask can be an electronic mask implemented via software. The mask can be configured to be movable along an axis of the at least one optical sensor. The mask can include two portions. A distance between the two portions can be configured to be adjustable. For example, the distance between the two portions can be configured to be adjustable to control a convergence angle of the imaging module. In some embodiments, the mask can comprise an opening and a size of the opening can be adjustable. The mask can be configured to be controllable via a user interface.
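As a purely illustrative sketch (not part of the disclosure), the Python snippet below shows one way a software-implemented mask of this kind could partition a single active detection area into left-eye and right-eye views; the frame dimensions, the function name apply_electronic_mask, and the gap and offset parameters are assumptions introduced only for illustration.

```python
import numpy as np

def apply_electronic_mask(frame, window_width, gap, offset=0):
    """Partition a single sensor frame into left-eye and right-eye views.

    frame        : 2-D (or 3-D color) array read from the active detection area
    window_width : width in pixels of each eye's sub-window
    gap          : adjustable spacing in pixels between the two mask portions
                   (illustrative stand-in for the adjustable distance between
                   the two portions described above)
    offset       : shifts both windows together along the sensor axis,
                   emulating a mask that is movable along the sensor
    """
    h, w = frame.shape[:2]
    center = w // 2 + offset
    left_start = max(center - gap // 2 - window_width, 0)
    right_start = center + gap // 2
    left_view = frame[:, left_start:left_start + window_width]
    right_view = frame[:, right_start:right_start + window_width]
    return left_view, right_view

# Usage: a 480 x 1280 frame split into two 400-pixel-wide eye views.
frame = np.zeros((480, 1280, 3), dtype=np.uint8)
left, right = apply_electronic_mask(frame, window_width=400, gap=200)
print(left.shape, right.shape)  # (480, 400, 3) (480, 400, 3)
```

In this sketch, increasing the gap parameter stands in for increasing the distance between the two mask portions, which, as described above, could be used to adjust the convergence geometry of the stereo pair.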
In some embodiments, the at least one optical sensor can comprise a single sensor comprising a single chip. In other embodiments, the at least one optical sensor can comprise first and second sensors. In addition, in some embodiments, the redirection optics can comprise first and second prisms.
The imaging module of certain embodiments can be disposed on a surgical tool. In other embodiments, the imaging module can be disposed on a retractor configured to hold open a surgical incision and to provide access to the surgical site.
In certain aspects, a stereo optical assembly for disposing on a surgical device is disclosed. The assembly can be configured to provide stereo imaging of a surgical site within a field-of-view of the assembly. The stereo optical assembly can include an imaging module. The imaging module can include an optical sensor assembly comprising one or more optical sensors. The imaging module can also include first and second channels. The first channel can comprise first imaging optics configured to focus light from the surgical site onto the optical sensor assembly to form left-eye view images of the surgical site on the optical sensor assembly. The first imaging optics can include one or more lenses. The second channel can comprise second imaging optics configured to focus light from the surgical site onto the optical sensor assembly to form right-eye view images of the surgical site on the optical sensor assembly. The second imaging optics can include one or more lenses.
The imaging module can also include first and second redirection optics. The first redirection optics can be between the first and second imaging optics and the optical sensor assembly. The first redirection optics can be configured to redirect light from the first and second imaging optics to the optical sensor assembly such that the optical sensor assembly can be oriented so as to reduce obstruction to the surgical site by the optical sensor assembly. The second redirection optics can be configured to redirect light from the surgical site to the first and second imaging optics. The second redirection optics can be configured to control a convergence angle of the imaging module. For example, the second redirection optics can be configured to increase the convergence angle of the imaging module.
In some embodiments of the stereo optical assembly, the first and second channels can comprise first and second ends respectively. The first and second ends can be configured to receive light from the second redirection optics. The second redirection optics can comprise first and second optical apertures configured to receive light from the surgical site. A center-to-center distance between the first and second optical apertures can be greater than a center-to-center distance between the first and second ends. In some embodiments, the second redirection optics can include first and second prisms.
In various embodiments, the imaging module is a first imaging module, and the stereo optical assembly further comprises a second imaging module. The second imaging module can include a second optical sensor assembly comprising one or more optical sensors. The second imaging module can also include a third channel and a fourth channel. The third channel can comprise third imaging optics configured to focus light from the surgical site onto the second optical sensor assembly to form second left-eye view images of the surgical site on the second optical sensor assembly. The third imaging optics can include one or more lenses. The fourth channel can comprise fourth imaging optics configured to focus light from the surgical site onto the second optical sensor assembly to form second right-eye view images of the surgical site on the second optical sensor assembly. The fourth imaging optics can include one or more lenses.
The second imaging module can also include third redirection optics between the third and fourth imaging optics and the second optical sensor assembly. The third redirection optics can be configured to redirect light from the third and fourth imaging optics to the second optical sensor assembly.
In some such embodiments, the first imaging module has a first convergence angle, the second imaging module has a second convergence angle, and the first convergence angle is substantially equal to the second convergence angle. The first imaging module is located at a proximal location, the second imaging module is located at a distal location, and the distal location is configured to be disposed closer to the surgical site than the proximal location.
The first imaging module can comprise a movable electronic mask associated with the optical sensor assembly or the second imaging module can comprise a movable electronic mask associated with the second optical sensor assembly. In some embodiments, the optical sensor assembly comprises a single sensor comprising a single chip. In other embodiments, the optical sensor assembly comprises first and second sensors. The first redirection optics can include first and second prisms.
The stereo optical assembly of certain embodiments can be disposed on a surgical tool. In other embodiments, the stereo optical assembly can be disposed on a retractor configured to hold open a surgical incision and to provide access to the surgical site.
In certain aspects, a stereo optical assembly for disposing on a surgical device is disclosed. The assembly can be configured to provide stereo imaging of a surgical site within a field-of-view of the assembly. The assembly can include a proximal imaging module at a proximal location. The proximal imaging module can be configured to provide a first left-eye view and a first right-eye view of the surgical site. The assembly can also include a distal imaging module at a distal location. The distal imaging module can be configured to provide a second left-eye view and a second right-eye view of the surgical site. The distal location can be configured to be disposed closer to the surgical site than the proximal location. In addition, an effective separation distance between the first left-eye and right-eye views can be larger than an effective separation distance between the second left-eye and right-eye views.
In various embodiments, the proximal imaging module has a first convergence angle, the distal imaging module has a second convergence angle, and the first convergence angle is substantially equal to the second convergence angle. In some embodiments, at least one of the effective separation distance between the first left-eye and right-eye views and the effective separation distance between the second left-eye and right-eye views can be defined by a plurality of prisms. In some embodiments, the effective separation distance between the first left-eye and right-eye views can be defined by a movable electronic mask associated with an optical sensor of the proximal imaging module or the effective separation distance between the second left-eye and right-eye views can be defined by a movable electronic mask associated with an optical sensor of the distal imaging module.
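For orientation only, the following sketch illustrates the familiar stereo geometry in which a wider effective separation between views paired with a longer working distance can produce substantially the same convergence angle as a narrower separation viewed from a closer (more distal) location; the formula, the function name, and the example numbers are generic assumptions and are not taken from the disclosure.

```python
import math

def convergence_angle(separation_mm, working_distance_mm):
    """Full convergence angle (degrees) for two views separated by
    separation_mm that are both aimed at a point working_distance_mm away."""
    return math.degrees(2.0 * math.atan((separation_mm / 2.0) / working_distance_mm))

# A proximal module with a wider effective separation but a longer working
# distance can have roughly the same convergence angle as a distal module
# with a narrower separation positioned closer to the surgical site.
proximal = convergence_angle(separation_mm=10.0, working_distance_mm=100.0)
distal = convergence_angle(separation_mm=5.0, working_distance_mm=50.0)
print(round(proximal, 2), round(distal, 2))  # both ~5.72 degrees
```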
In certain aspects, an imaging module for disposing on a surgical device is disclosed. The imaging module can be configured to provide images of a surgical site within a field-of-view of the imaging module. The imaging module can include at least one optical sensor having at least one active detection area on a front face of the at least one optical sensor. The imaging module can also include first and second channels. The first channel can include first imaging optics configured to focus light from the surgical site onto the at least one active detection area to form left-eye view images of the surgical site on the at least one active detection area. The first imaging optics can comprise one or more lenses. The second channel can include second imaging optics configured to focus light from the surgical site onto the at least one active detection area to form right-eye view images of the surgical site on the at least one active detection area. The second imaging optics can include one or more lenses.
The imaging module can also include redirection optics between the first and second imaging optics and the at least one active detection area. The redirection optics can be configured to redirect light from the first and second imaging optics to the at least one active detection area such that the at least one optical sensor can be oriented so as to reduce obstruction to the surgical site by the at least one optical sensor.
In some embodiments, the at least one optical sensor can comprise a single sensor comprising a single chip. In other embodiments, the at least one optical sensor can comprise first and second sensors. In addition, in some embodiments, the redirection optics can comprise first and second prisms.
The imaging module of certain embodiments can be disposed on a surgical tool. In other embodiments, the imaging module can be disposed on a retractor configured to hold open a surgical incision and to provide access to the surgical site.
In various aspects, a stereo camera system is provided. The stereo camera system can include a pair of image sensors comprising a left image sensor and a right image sensor. Each of the pair of image sensors can have an active detection area on a front face of the image sensor. The left image sensor can be offset along a first direction from the right image sensor. In addition, the front face of the left image sensor can be oriented such that a plane of the front face of the left image sensor is parallel to a plane of the front face of the right image sensor. The front face of the left image sensor can face the front face of the right image sensor. Each of the planes of the front faces can be oriented perpendicular to the first direction.
In various embodiments, the stereo camera system can include a pair of lens trains comprising a left lens train having a plurality of lens elements along a left optical path and a right lens train having a plurality of lens elements along a right optical path. The left optical path can be offset along the first direction from the right optical path. The stereo camera system can also include a pair of optical redirection elements comprising a left optical redirection element positioned along the left optical path and configured to redirect the left optical path to the front face of the left image sensor and a right optical redirection element positioned along the right optical path and configured to redirect the right optical path to the front face of the right image sensor.
In some embodiments of the stereo camera system, the left optical redirection element comprises a left prism and the right redirection element comprises a right prism. In some such embodiments, the left prism can be offset from the right prism along the first direction. The left prism can comprise a primary reflective face that is orthogonal to a primary reflective face of the right prism. In some embodiments, the left optical redirection element comprises a left mirror and the right redirection element comprises a right mirror.
In some embodiments, the left optical redirection element can be configured to redirect the left optical path 90 degrees and the right optical redirection element can be configured to redirect the right optical path 90 degrees. The redirected left optical path and the redirected right optical path can be anti-parallel to one another. In some embodiments, the left optical path and the right optical path can be parallel.
In various embodiments, the left image sensor can be a two-dimensional detector array and the right image sensor can also be a two-dimensional detector array. For example, the left image sensor can be a CCD detector array and the right image sensor can also be a CCD detector array.
In various aspects, a stereo camera system is provided. The stereo camera system can comprise a pair of image sensors comprising a left image sensor and a right image sensor. The stereo camera system can also comprise a pair of lens trains comprising a left lens train having a plurality of lens elements along a left optical path and a right lens train having a plurality of lens elements along a right optical path. The left optical path can be offset laterally from the right optical path. The stereo camera system can also comprise a pair of optical redirection elements comprising a left optical redirection element positioned along the left optical path and configured to redirect the left optical path to the front face of the left image sensor and a right optical redirection element positioned along the right optical path and configured to redirect the right optical path to the front face of the right image sensor.
Furthermore, certain embodiments also include a surgical visualization system comprising a plurality of camera systems. At least one of the plurality of camera systems can include a stereo camera system in accordance with certain embodiments as described herein. Various embodiments also include a retractor including a stereo camera system as described herein disposed thereon. In addition, some embodiments include a surgical tool including a stereo camera system as described herein disposed thereon.
In various aspects, a surgical visualization system display is disclosed. The surgical visualization system display can include at least one camera configured to acquire video images of a surgical tool. The at least one camera can be configured to be disposed on a surgical device. The surgical visualization system can also include an image processing system in communication with the at least one camera. The image processing system can comprise at least one physical processor. In certain embodiments, the image processing system can be configured to receive tracking information associated with the location of the surgical tool, and to adjust a focal length and/or orientation of the at least one camera, based at least in part on the received tracking information.
In various embodiments, the image processing system can be configured to adjust a focal length and/or orientation of the at least one camera so as to maintain focus of the surgical tool with movement of the surgical tool. The at least one camera can comprise a plurality of cameras. The surgical device can comprise a retractor.
In some embodiments, the image processing system can be configured to adjust the focal length of the at least one camera, based at least in part on the received tracking information. In some embodiments, the image processing system can be configured to adjust the orientation of the at least one camera, based at least in part on the received tracking information.
The surgical visualization system display can further include a foot pedal, where actuation of the foot pedal can be configured to send a signal causing the image processing system to receive tracking information associated with the location of the surgical tool, and to adjust a focal length and/or orientation of the at least one camera, based at least in part on the received tracking information.
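A minimal sketch of the kind of adjustment described above follows, assuming hypothetical data structures for the camera pose and the tracked tool position; the function names and the simple aim-and-focus logic are illustrative assumptions rather than the disclosed implementation.

```python
import math

def update_camera_from_tracking(camera_pose, tool_position):
    """Return a new focus distance and pointing direction so a camera stays
    aimed at, and focused on, the tracked tool tip.

    camera_pose   : dict with 'position' (x, y, z) of the camera (hypothetical)
    tool_position : (x, y, z) of the tracked surgical tool tip (hypothetical)
    """
    cx, cy, cz = camera_pose["position"]
    tx, ty, tz = tool_position
    dx, dy, dz = tx - cx, ty - cy, tz - cz
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    # The desired line of sight points from the camera toward the tool tip;
    # the focus setting is driven toward the camera-to-tool distance.
    direction = (dx / distance, dy / distance, dz / distance)
    return {"focus_distance_mm": distance, "line_of_sight": direction}

def on_foot_pedal(pressed, camera_pose, tool_position):
    """Apply the tracking-based adjustment only when the pedal is actuated."""
    if pressed:
        return update_camera_from_tracking(camera_pose, tool_position)
    return None

print(on_foot_pedal(True, {"position": (0.0, 0.0, 0.0)}, (10.0, 0.0, 40.0)))
```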
In certain aspects, a surgical visualization system display is disclosed. The surgical visualization system display can include at least one camera disposed on a surgical tool and configured to acquire video images of a surgical site. The surgical visualization system display can also include an image processing system in communication with the at least one camera. The image processing system can comprise at least one physical processor. The image processing system can be configured to receive tracking information associated with the location of the surgical tool, and to adjust a focal length and/or orientation of the at least one camera, based at least in part on the received tracking information.
In some embodiments, the image processing system can be configured to adjust the focal length of the at least one camera, based at least in part on the received tracking information. In some embodiments, the image processing system can be configured to adjust the orientation of the at least one camera, based at least in part on the received tracking information.
In another aspect, a medical apparatus comprising a surgical visualization system console; and a drive system disposed within the console and in communication with at least one surgical tool, wherein the drive system comprises at least one drive board configured to drive the at least one surgical tool. In some embodiments, the medical apparatus wherein the drive system comprises an ultrasonic driver board. In some embodiments, the medical apparatus wherein the drive system comprises a radiofrequency (RF) driver board. In some embodiments, the medical apparatus wherein the RF driver board is configured to support at least two modulation formats associated with at least two surgical tools, respectively. In some embodiments, the medical apparatus wherein the at least one surgical tool comprises an ultrasonic tissue aspirator, bipolar coagulation and cutting tool, bipolar forceps, or a combination thereof. In some embodiments, the medical apparatus comprising a foot pedal in communication with the drive system. In some embodiments, the medical apparatus wherein the foot pedal is configured to send a signal to the drive system indicative of a power level associated with the one or more surgical tools. In some embodiments, the medical apparatus wherein the power level is proportional to a degree to which the foot pedal is configured to be depressed.
In another aspect, a method of driving surgical tools, the method comprising receiving a signal from a foot pedal to drive a first surgical tool, wherein the signal includes information as to a degree to which the foot pedal is depressed; and driving power to the first surgical tool in proportion to the degree to which the foot pedal is depressed. In some embodiments, the method further comprising, driving power to a second surgical tool, wherein the second surgical tool has a modulation format different than that of the first surgical tool.
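The proportional mapping described in this method can be sketched as follows; the tool names, maximum power values, and modulation labels are hypothetical placeholders, not values from the disclosure.

```python
def pedal_to_power(depression_fraction, max_power_watts):
    """Map pedal depression (0.0 fully released .. 1.0 fully depressed) to a
    drive power proportional to how far the pedal is depressed."""
    depression_fraction = min(max(depression_fraction, 0.0), 1.0)
    return depression_fraction * max_power_watts

# Two tools driven with different (hypothetical) modulation formats while
# sharing the same proportional pedal mapping.
TOOLS = {
    "ultrasonic_aspirator": {"max_power_watts": 40.0, "modulation": "continuous"},
    "bipolar_forceps":      {"max_power_watts": 60.0, "modulation": "pulsed"},
}

def drive(tool_name, depression_fraction):
    tool = TOOLS[tool_name]
    power = pedal_to_power(depression_fraction, tool["max_power_watts"])
    return {"tool": tool_name, "power_watts": power, "modulation": tool["modulation"]}

print(drive("ultrasonic_aspirator", 0.5))  # 20 W, continuous modulation
print(drive("bipolar_forceps", 0.25))      # 15 W, pulsed modulation
```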
In certain aspects, a surgical visualization system display is disclosed. The surgical visualization system display can include a plurality of cameras, at least one disposed on a retractor. A first camera of the plurality can be configured to image fluorescence in a surgical site. A second camera of the plurality can be configured to produce a non-fluorescence image of said surgical site. The first and second cameras can have different spectral responses. For example, in some embodiments, one of the first and second cameras is sensitive to infrared and the other is not.
In another aspect, a surgical visualization system configured to receive images from one or more cameras, the surgical visualization system comprising a display, electronics configured to receive and process image signals from the one or more cameras, an input connector configured to fluidly connect with a source of pneumatic pressure, and one or more pneumatic outputs configured to fluidly connect to the input connector. In some embodiments, the surgical visualization system further comprising a hydraulic pressure circuit having one or more valves, wherein at least one of the one or more pneumatic outputs is configured to operate one or more of the valves of the hydraulic pressure circuit. In some embodiments, the surgical visualization system wherein the hydraulic pressure circuit is fluidly connected to one or more surgical tools and is configured to operate the one or more surgical tools using hydraulic pressure. In some embodiments, the surgical visualization system wherein at least one of the one or more valves is an elastomeric proportional valve. In some embodiments, the surgical visualization system wherein at least one of the one or more pneumatic outputs is a solenoid. In some embodiments, the surgical visualization system wherein at least one of the one or more pneumatic outputs is a piston. In some embodiments, the surgical visualization system wherein at least one of the pneumatic outputs is a pneumatic actuator. In some embodiments, the surgical visualization system further comprising one or more valves positioned on fluid lines between the source of pneumatic pressure and the one or more pneumatic outputs. In some embodiments, the surgical visualization system further comprising a hydraulic cassette assembly. In some embodiments, the surgical visualization system wherein at least one of the one or more pneumatic outputs is configured to operate a cassette lifter configured to raise and/or lower the hydraulic cassette assembly. In some embodiments, the surgical visualization system wherein one or more of the pneumatic outputs are configured to operate as a tube ejector for a peristaltic pump.
In another aspect, a surgical tool comprising a proximal handle portion, a distal handle portion connected to the proximal handle portion and rotatable with respect to the proximal handle portion about an axis of rotation, a base portion connected to the distal handle portion and fixed thereto in a direction parallel to the axis of rotation, a top portion connected to the distal handle portion and movable with respect to the distal handle portion in the direction parallel to the axis of rotation, the top portion having a cutting edge on a distal end of the top portion configured to operate with a cutting portion on the base portion to cut bone or tissue, a proximal actuation chamber within the proximal handle portion, a piston connected to the top portion and fixed thereto in the direction parallel to the axis of rotation; and an actuation element positioned at least partially within the proximal actuation chamber and configured to exert an axial force on the piston in the direction parallel to the axis of rotation. In some embodiments, the surgical tool further comprising a distal actuation chamber within the distal handle portion. In some embodiments, the surgical tool further comprising a biasing element positioned within one or more of the proximal handle portion and the distal handle portion and configured to bias the piston in a direction parallel to the axis of rotation and away from a distal end of the base portion. In some embodiments, the surgical tool wherein the actuation element is a bag or balloon configured to be inflated by a source of physiological saline. In some embodiments, the surgical tool further comprising a return valve configured to introduce physiological saline to the tool to compress the actuation element.
In another aspect, a surgical tool comprising: a hydraulic impeller assembly comprising a turbine housing defining a blade cavity, a flow director positioned at least partially within the turbine housing, an impeller having a plurality of impeller blades, the impeller positioned at least partially within the blade cavity, an output shaft rotatably connected to the impeller and configured to transfer a torque from the impeller to a drill, and one or more ports in a wall of the blade cavity providing fluid communication between an interior of the blade cavity and an exterior of the blade cavity; a hydraulic fluid input port; a pneumatic fluid input port; and a controller configured to control a proportion of pneumatic and hydraulic fluids input into the blade cavity. In some embodiments, the surgical tool further comprising a fluid output port line configured to facilitate fluid communication between at least one of the one or more ports and a hydraulic pressure source. In some embodiments, the surgical tool further comprising a vacuum source configured to extract fluid from the blade cavity through the one or more ports in the wall of the blade cavity. In some embodiments, the surgical tool wherein the vacuum source is an external pump. In some embodiments, the surgical tool wherein the vacuum source is a bypass channel in the turbine housing in fluid communication with the blade cavity via the one or more ports, wherein a low pressure fluid is passed through the bypass channel to draw fluid out from the blade cavity via the Venturi effect. In some embodiments, the surgical tool wherein the low pressure fluid is a gas. In some embodiments, the surgical tool wherein at least a portion of the fluid extracted from the blade cavity is physiological saline. In some embodiments, the surgical tool wherein the hydraulic impeller assembly is configured to receive both pressurized hydraulic fluid and pressurized pneumatic fluid to rotate the impeller. In some embodiments, the surgical tool wherein the controller is configured to increase the proportion of hydraulic fluid input to the blade cavity when higher torque is desired and to increase the proportion of pneumatic fluid input to the blade cavity when a higher rotational speed is desired.
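One illustrative, deliberately simplified way a controller could apportion hydraulic and pneumatic fluid between torque and speed demands is sketched below; the function name and the linear mixing rule are assumptions for illustration only, not the disclosed control scheme.

```python
def fluid_mix_for_demand(torque_demand, speed_demand):
    """Return the fraction of hydraulic versus pneumatic fluid fed to the
    impeller blade cavity: more hydraulic fluid when high torque is wanted,
    more pneumatic fluid when high rotational speed is wanted.

    torque_demand, speed_demand : values in 0.0..1.0 (hypothetical inputs).
    """
    total = torque_demand + speed_demand
    if total == 0.0:
        return {"hydraulic_fraction": 0.0, "pneumatic_fraction": 0.0}
    hydraulic = torque_demand / total
    return {"hydraulic_fraction": hydraulic, "pneumatic_fraction": 1.0 - hydraulic}

print(fluid_mix_for_demand(torque_demand=0.8, speed_demand=0.2))  # mostly hydraulic
print(fluid_mix_for_demand(torque_demand=0.1, speed_demand=0.9))  # mostly pneumatic
```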
In another aspect, a medical apparatus comprising a surgical device, at least one camera disposed on the surgical device, and a hydraulic system configured to deliver fluid to the at least one camera to remove obstructions therefrom, wherein said hydraulic system comprises a pulsing valve connected to a high pressure source of said fluid configured to provide pulses of fluid. In some embodiments, the medical apparatus wherein said pulsing valve comprises a pop off valve configured to open when a pressure threshold is reached to provide increased pressure beyond the threshold value resulting in a pulse of liquid from the pulsing valve. In some embodiments, the medical apparatus wherein said at least one camera comprises a plurality of cameras and said pulsing valve is disposed in said hydraulic system such that the fluid is delivered to each of the plurality of cameras at the same time. In some embodiments, the medical apparatus wherein said pulsing valve is disposed in a line that splits into different fluid outlets to clean different cameras, said pulsing valve disposed upstream of said split. In some embodiments, the medical apparatus wherein said hydraulic system is further configured to deliver pressurized air to the at least one camera after said fluid is delivered. In some embodiments, the medical apparatus wherein said surgical device comprises a retractor.
In another aspect, a medical apparatus comprising a surgical device, at least one camera disposed on the surgical device, and a hydraulic system configured to deliver air to the at least one camera, wherein said hydraulic system comprises a pulsing valve connected to a high pressure source of air to provide pulses of air. In some embodiments, the medical apparatus wherein said pulsing valve comprises a pop off valve configured to open when a pressure threshold is reached to provide increased pressure beyond the threshold value resulting in a pulse of air from the pulsing valve.
In another aspect, a medical apparatus comprising a surgical device, at least one camera disposed on the surgical device, said at least one camera having camera optics, and a hydraulic system configured to deliver fluid and air to the camera optics of said at least one camera to remove obstructions therefrom, wherein said hydraulic system comprises a three way valve connected to a supply of said fluid and a supply of high pressure air, said three way valve configured to selectively shut off said supply of fluid and to provide instead pressurized air thereby reducing inadvertent leakage of fluid onto the camera optics. In some embodiments, the medical apparatus wherein said hydraulic system is configured to deliver fluid pulses and air pulses to said at least one camera. In some embodiments, the medical apparatus further comprising a pop off valve, wherein said three way valve is disposed downstream of said pop off valve. In some embodiments, the medical apparatus wherein said surgical device comprises a retractor.
In another aspect, a medical apparatus comprising a surgical device, at least one camera disposed on the surgical device, said at least one camera having camera optics, and a hydraulic system comprising a valve connected to a high pressure source of fluid and configured to deliver fluid to the camera optics of said at least one camera to remove obstructions therefrom, wherein said hydraulic system is configured to open said valve periodically based on a pre-programmed schedule or a schedule selected by a user.
In another aspect, a medical apparatus comprising a surgical device, at least one camera disposed on the surgical device, said at least one camera having camera optics, and a hydraulic system comprising a valve connected to a high pressure source of fluid and configured to deliver fluid to the camera optics of said at least one camera to remove obstructions therefrom, wherein said hydraulic system is configured to deliver fluid when an obstruction reducing the amount of light entering the camera is detected. In some embodiments, the medical apparatus wherein said at least one camera produces an image signal and said apparatus is configured to monitor said image signal to determine when visibility is compromised and thereby trigger delivery of said fluid to clean the camera optics. In some embodiments, the medical apparatus wherein camera intensity is monitored. In some embodiments, the medical apparatus wherein attenuation of red wavelengths compared to green wavelengths is monitored to determine whether blood is on the camera reducing the amount of light entering the camera.
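The following sketch illustrates, under stated assumptions, how an image signal could be monitored to trigger cleaning: overall intensity is checked for a drop in light reaching the camera, and the red and green channels are compared to flag a blood-like obstruction. The thresholds, the direction of the red/green comparison, and the function names are assumptions, not values or logic taken from the disclosure.

```python
import numpy as np

def obstruction_detected(frame, intensity_threshold=40.0, red_green_ratio_threshold=2.0):
    """Decide whether the camera optics appear obstructed.

    frame : H x W x 3 array of RGB pixel values (0-255) from the camera.
    Two illustrative cues are checked:
      * overall intensity dropping below a threshold (something blocking light), and
      * green strongly attenuated relative to red, treated here (as an
        assumption) as suggestive of blood on the optics.
    """
    mean_intensity = float(frame.mean())
    mean_red = float(frame[..., 0].mean()) + 1e-6
    mean_green = float(frame[..., 1].mean()) + 1e-6
    too_dark = mean_intensity < intensity_threshold
    blood_like = (mean_red / mean_green) > red_green_ratio_threshold
    return too_dark or blood_like

def maybe_trigger_cleaning(frame, open_valve):
    """Open the cleaning valve (passed as a callable) when an obstruction is detected."""
    if obstruction_detected(frame):
        open_valve()
        return True
    return False

# Usage with a synthetic dim, reddish frame and a stand-in valve command.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
frame[..., 0] = 90   # red channel
frame[..., 1] = 20   # green channel
print(maybe_trigger_cleaning(frame, open_valve=lambda: print("pulse fluid to optics")))
```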
In another aspect, a suction system for a surgical system, the suction system comprising: a suction cassette including: a cassette housing; and a plurality of ports facilitating fluid communication between an interior and an exterior of the cassette housing; a first suction line in fluid communication with the suction cassette and configured to be positioned in fluid communication with a surgical site; a second suction line in fluid communication with the suction cassette and configured to be positioned in fluid communication with the surgical site; and a storage tank in fluid communication with the suction cassette. In some embodiments, the suction system further comprising a connector in fluid communication with the suction cassette and with a vacuum source. In some embodiments, the suction system wherein the vacuum source maintains a pressure below ambient pressure within the storage tank. In some embodiments, the suction system wherein the first suction line is configured to operate as a high flow suction line to suction heavy bleeding. In some embodiments, the suction system wherein the second suction line is configured to operate as a low flow suction line to identify and coagulate low flow bleeding. In some embodiments, the suction system further comprising an intermediate storage tank positioned at least partially within the cassette housing and in fluid communication with one or more of the first suction line and the second suction line. In some embodiments, the suction system further comprising a pump positioned on a fluid line between the intermediate storage tank and the storage tank to pull material from the intermediate storage tank to the storage tank. In some embodiments, the suction system wherein the pump is a peristaltic pump.
A separate apparatus can include a translation system having an upper connecting member and a lower connecting member, the translation system designed to have a component attached to the translation system, wherein the component is configured to translate relative to the upper connecting member along at least a first axis and a second axis; a pitch-yaw adjustment system designed to attach to the component, the pitch-yaw adjustment system designed to rotate the component about a joint around an axis parallel to the first axis and rotate the component about the joint around an axis parallel to the second axis; and a first control member designed to attach to the translation system via one or more control member joints, wherein the first control member is physically coupled to both the translation system and the pitch-yaw adjustment system so as to provide control thereto.
In some embodiments, the component can be attached to the lower connecting member via an arm. In some embodiments, the first control member can be physically coupled to the translation system such that the component can be translated along the first axis or second axis by translating the first control member in the direction parallel to the first axis or second axis respectively. In some embodiments, the first control member can be designed to connect to a component of the apparatus via a joint having three rotational degrees of freedom. In some embodiments, the first control member can be designed to connect to the translation system via the lower connecting member. In some embodiments, the translation system can also include a guide assembly designed to attach the upper connecting member to the lower connecting member, wherein the guide assembly can be positioned between the upper connecting member and the lower connecting member. In some embodiments, the first control member can be physically coupled to the pitch-yaw adjustment system such that the component can be rotated about the joint around the first axis or second axis by rotating the first control member about the control member joint around an axis parallel to the first axis or second axis respectively. In some embodiments, the pitch-yaw adjustment system can be designed such that rotation of the first control member about the control member joint can result in about a one-to-one rotation of the component about the joint. In some embodiments, the pitch-yaw adjustment system can be designed such that rotation of the first control member about the control member joint can result in greater than a one-to-one rotation of the component about the joint. In some embodiments, the pitch-yaw adjustment system can be designed such that rotation of the first control member about the control member joint can result in less than a one-to-one rotation of the component about the joint.
The following description is directed to certain embodiments for the purposes of describing the innovative aspects of this disclosure. However, a person having ordinary skill in the art will readily recognize that the teachings herein can be applied in a multitude of different ways. Thus, the teachings are not intended to be limited solely to the embodiments depicted in the figures and described herein, but instead have wide applicability as will be readily apparent to one having ordinary skill in the art.
Cameras on Retractors
In the illustrated embodiment, the retractor blades 101 are substantially rigid. In various embodiments, the retractor blades may be malleable, and may have a wide range of different structural features such as width, tension, etc. For example, stronger, larger retractor blades may be desired for spinal and trans-oral surgery, while weaker, smaller retractor blades may be desired for neurosurgery. In some embodiments, the retractor can be configured such that different blades can be arranged as desired.
Each of the camera modules 105 is in electrical communication with an aggregator 104. The aggregator 104 is configured to receive input from each of the camera modules 105, and to connect to external components via electrical cable 108. For example, the hub or aggregator 104 may receive image data from each of the camera modules 105 and may transmit the image data to an image processing module (not shown). In the illustrated embodiment, the wiring connecting the camera modules 105 with the aggregator 104 is embedded within the retractor blades 101 and articulating arms 103 and is not visible. In some embodiments, as described in more detail below, cables connecting the camera modules 105 with the aggregator 104 may be adhered (either permanently or non-permanently, e.g., releasably) to the exterior surface of the retractor 100. In the illustrated embodiment, the hub or aggregator 104 is affixed to an upper surface of the retractor 100. The aggregator may be positioned at any location relative to the retractor 100, or may be disconnected from the retractor 100 altogether. The aggregator may contain camera interface electronics, tracker interface electronics, and SERDES to produce a high speed serial cable supporting all cameras in use. In various embodiments, the retractor camera output is coupled to a console which causes video from the retractor camera to be presented on a display.
Although the illustrated embodiment shows integrated camera modules 105, in various embodiments the camera modules 105 may be removably attached to the retractor blades 101. In some embodiments, the camera modules 105 can be disposed within pre-positioned receptacles on the retractor blades 101 or other surgical device. In some embodiments, the camera modules 105 can be disposed at a plurality or range of locations desired by the user on the retractor blades 101. In various embodiments, the orientation and position of the sensors can be adjusted by the user, e.g., physician, nurse, technician, or other clinician. In some embodiments, for example, the camera may be disposed on a track such that the camera can slide up and down the retractor, e.g., retractor blade. The height of the camera or cameras within or above the surgical site may thereby be adjusted as desired. Other arrangements for laterally adjusting the position of the camera may be used. Additionally, in various embodiments, the cameras may be configured to have tip and/or tilt adjustment such that the attitude or orientation of the camera may be adjusted. The line of sight or optical axis of the cameras can thereby be adjusted to, for example, be directed more downward into the surgical site or be directed less into the surgical site and more level or angled in different lateral directions. The camera modules 105 can include sensors or markers for, e.g., electromagnetic or optical tracking, or use encoders, accelerometers, gyroscopes, or inertial measurement units (IMUs) or combinations thereof or any other orientation and/or position sensors, as described in more detail below. Tracking can provide location and/or orientation of the cameras. The images obtained by the cameras may be stitched together or tiled using image processing techniques. Tracking or otherwise knowing the relative locations of the sensors can assist in image processing and display formatting.
In various embodiments, pairs of cameras together provide information for creating a stereo effect or 3-dimensional (3D) image. Pairs of cameras, for example, may be included on each of the blades 101 of the retractor 100.
As illustrated, the retractor is configured to hold open tissue so as to produce an open region or cavity centrally located between the blades. Notably, in various embodiments, this open central region is unobstructed by the retractor. In particular, the central portions of the open region would be unobstructed by features of the retractor such that the surgeon would have clear access to the surgical site. The surgeon could thus more freely introduce and utilize his or her tools on locations within the surgical site. Additionally, this may enable the surgeon to use tools with both hands without the need to hold an endoscope.
Also as illustrated, the cameras are disposed on the blades of the retractor such that the cameras face inward toward the surgical site that would be held open by the retractor blades. The cameras in this example would be disposed about the central open region held open by the retractor blades so as to provide views from locations surrounding the surgical site. The camera thus would face objects within the surgical site such as structures on which tools would be used by the surgeon to operate.
In this particular example, the cameras on two of the blades face each other such that the leftmost blade and the cameras thereon would be in the field-of-view of the cameras on the rightmost blade and vice versa. The cameras on the leftmost blade may be anti-parallel to the cameras on the rightmost blade and have optical axes oriented at an angle, θ, of 180° with respect to each other. The cameras on the remaining blade may be directed orthogonally to the other two blades and thus have optical axes directed at an angle, θ, of 90° with respect to each other. Retractors with cameras can be reaffixed to a frame or mounting structure during a procedure and the cameras can reorient themselves with respect to relative position within an array of the cameras through their communication protocol with the aggregator and video switching unit.
In some embodiments, the field-of-views of the different cameras, and hence the images produced by the different cameras, may overlap. Image processing may be employed to yield increased resolution at the regions of overlap. Likewise, the number of sensors used may be increased to provide increased field-of-view and/or resolution. Likewise, cameras with overlapping images can be electronically magnified thereby making their images adjacent rather than overlapping.
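As an illustration of making overlapping images adjacent rather than overlapping, the sketch below simply crops half of the overlap from each of two neighboring camera images before tiling them side by side; the cropping stands in for the electronic magnification described above, and the array sizes and function name are assumptions.

```python
import numpy as np

def tile_adjacent(left_img, right_img, overlap_px):
    """Electronically crop two horizontally overlapping camera images so that
    they tile side by side instead of overlapping.

    left_img, right_img : H x W arrays from two cameras whose fields of view
                          overlap by roughly overlap_px columns.
    Half of the overlap is trimmed from each image, then the crops are joined.
    """
    trim = overlap_px // 2
    left_crop = left_img[:, : left_img.shape[1] - trim]
    right_crop = right_img[:, trim:]
    return np.hstack([left_crop, right_crop])

# Usage with synthetic 480 x 640 frames overlapping by 80 pixels.
a = np.full((480, 640), 100, dtype=np.uint8)
b = np.full((480, 640), 150, dtype=np.uint8)
tiled = tile_adjacent(a, b, overlap_px=80)
print(tiled.shape)  # (480, 1200)
```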
A minimally invasive spine surgery can use a tubular retractor having a circular working space with a diameter of approximately 25 mm. The retractor contains blades, fingers, or at least one barrier, such as a tube, that holds tissue back to keep the surgical site open. Multiple cameras located on the retractor at locations within the surgical field or in very close proximity thereto, e.g., within 75 mm of the surgical opening, can provide a useful viewpoint for the surgeon. The cameras may, for example, be located on the blades, fingers, tubular barrier, or other portion of the retractor close to the surgical field or within the patient and the surgical field. The cameras may include pairs of cameras arranged and/or oriented to provide stereo and thus 3D imaging, or single CMOS camera chips with dual optics to provide stereo. The cameras may be located at various locations in relation to surgical devices; for example, the cameras can be located proximally and distally along or near a retractor, wherein the location of the cameras can be configured to facilitate both the progression of surgery and an enhanced view or view selection of an area of interest.
Tubular Retractor
In the embodiment of
In various embodiments, the retractor may assume other shapes, and need not be tubular. For example, the retractor 16050 may be rectangular, triangular, elliptical, or may have one or more openings on a side. An upper ring 10662 on the top of the proximal head 10660 includes a plurality of slots 10664 configured to receive flex cables 10608 therein. The flex cables are described in more detail herein. In the illustrated embodiment, two flex cables 10608 are provided, each directed to a different one of the slots 10664. Associated with each slot 10664 is a rotatable knob 10666 which, when actuated, causes the flex cable 10608 to be fed into or out of the insert 10652. With actuation of the rotary knob 10666, therefore, the associated flex cable 10608 can be raised or lowered within the insert 10652, nearer or further from the distal end 10668. In other embodiments, different actuation mechanisms may be used to lower or raise the flex cables. By lowering or raising the flex cables 10608, the cameras disposed on the distal ends thereof (as shown in
A plurality of connectors are provided on the proximal head 10660 of the insert 10652, including an optical fiber input port 10656, a fluid port 10670, and an air port 10672. The fluid port 10670 and air port 10672 can provide fluid, such as saline, and air to the cameras disposed on the flex cables 10608 for cleansing, drying, etc., as described elsewhere herein. In some embodiments, the insert can additionally include a port for aspiration, for example for removal of blood, saline, or other fluids from the surgical site. A rack 10674 is provided that supports the cables attached to the ports 10656, 10670, and 10672 as well as the flex cables 10608. Within the interior of the insert 10652, a plurality of illumination fibers 10676 extend downward along the length of the insert 10652. The illumination fibers 10676 carry light from the optical fiber input port 10656 and emit the light out of the distal ends of the illumination fibers 10676, as shown in
In some embodiments, an insert comprises an upper support configured to rest above a retractor. The proximal head of the insert 10660 shown in
The tubular insert can be characterized by a length and a width. For example, in the case of a tubular insert in the shape of a hollow cylinder such as a hollow right circular cylinder as shown in
Likewise, in some embodiments, the tubular insert as seen from the cross-section orthogonal to the length thereof (and to the z-axis) may be partitioned into separate components. For example, as illustrated in
Similarly, in some embodiments, the insert may be semi-annular corresponding to only a portion of the inner surface 10706 of the tubular retractor. For example, one or two of the separate pieces 10710 may be excluded from the insert shown in
In some embodiments, the length of the insert may be greater than the width of the insert. In some embodiments, this width is measured at the distal end of the insert although in other embodiments the width may be measured at the middle of the length of the insert. In some embodiments, the insert includes an upper support such as 10708 that is wider than the distal end of the insert and the length is measured from the distal end of the support structure where the insert enters the retractor. In some embodiments, the length of the insert may be at least 1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7, 1.8, or 1.9 times the width. In some embodiments, the length of the insert may be at least 2, 3, 4, 5, or more times the width. In various embodiments, however, the length is less than 6, 5, 4, 3, or 2 times the width thereof. The insert may be characterized by a central opening through which tools may be inserted. In some embodiments, the central opening has a width, W_inner, of at least ⅓ of the total width, W, of the insert. In some embodiments, the central opening can have a width of at least ⅔ of the total width, in some embodiments at least ¾ of the total width, in some embodiments at least ⅞ of the total width. In various embodiments, the central opening has a width less than 15/16, ⅞, ¾, ⅔, or ½ the total width.
In some embodiments, the insert can comprise an upper head portion (such as the upper support 10708 shown in
In some embodiments, the plurality of cameras 10714 comprises at least three cameras, at least four cameras, at least five cameras, or more. In some embodiments, the plurality of illumination sources 10714 comprises at least three illumination sources, at least four illumination sources, at least five illumination sources, or more. In some embodiments, the plurality of cameras 10714 comprises at least two pairs of stereo cameras, at least three pairs of stereo cameras, at least four pairs of stereo cameras, or more. The multiple stereo pairs can be arranged at angles of at least 30°, 45°, 90°, or 180° with respect to each other. In some embodiments, the plurality of elongate support structures comprises at least three elongate support structures, at least four elongate support structures, at least five elongate support structures, or more. In some embodiments, the elongate support structures are configured such that when the insert is received within the retractor, the average distance separating the outer surface 10702 of the elongate support structures and the inner surface of the retractor is less than 5 mm, less than 4 mm, less than 3 mm, less than 2 mm, less than 1 mm, or less, and can be touching or be at least 0.2 mm, at least 0.5 mm, at least 1 mm, at least 2 mm, at least 3 mm, or at least 4 mm in various embodiments. In some embodiments, the elongate support structures are configured to guide the insertion of the insert into the retractor. In some embodiments, at least one of the plurality of cameras 10714 is configured to be longitudinally movable along the length of the elongate support structure. In some embodiments, at least one of the plurality of illumination sources 10714 is configured to be longitudinally movable along the length of the elongate support structure.
In some embodiments, an insert for a retractor provides rigid support for one or more camera modules 10714 and one or more light sources 10714, wherein the insert is configured to cover at least 25% of the sidewall of the retractor. In some embodiments, the insert is configured to cover at least 35% of the sidewall, in some embodiments at least 50%, at least 60%, at least 70%, at least 80%, at least 90%, or more (e.g. 100%). In some embodiments, the insert is configured to cover at most 90% of the sidewall, in some embodiments at most 80%, at most 70%, at most 60%, at most 50%, or less. In some embodiments, a plurality of such inserts together covers at least 50% of the sidewall of the retractor, in some embodiments at least 60%, at least 70%, at least 80%, or more but may be at most 90% of the sidewall, in some embodiments at most 80%, at most 70%, at most 60%, at most 50%, or less. In some embodiments, the insert can be spaced apart from the sidewall of the retractor, on average, by less than about 5 mm, less than about 4 mm, less than about 3 mm, less than about 2 mm, or less and can be touching or be, on average, at least 0.2 mm, at least 0.5 mm, at least 1 mm, at least 2 mm, at least 3 mm, at least 4 mm in various embodiments. In some embodiments, the insert can be spaced apart from the sidewall of the retractor by on average less than about 10% of the width, W, of the insert, by less than about 5% of the width of the insert, by less than about 3% of the width of the insert, or less but can be spaced apart from the sidewall of the retractor by on average at least about 1% of the width, W, of the insert, at least about 3% of the width of the insert, at least about 5% of the width of the insert. In some embodiments the insert is generally disposed against and touches the inner surface 10706 of the retractor. In some embodiments, the plurality of inserts can together provide a central access 10712 for tools through the retractor to a surgical site.
Accordingly, in some embodiments, an insert for a retractor provides rigid support for one or more camera modules and/or one or more light sources 10714, wherein the insert has a width, W greater than its thickness, T. In some embodiments, the sidewall 10706 of the insert can be substantially arcuate or curved. In some embodiments, the sidewall 10706 of the insert is flat, linear, and/or planar. The cross-section of the insert, orthogonal to the length (and z axis as shown) can be substantially rectangular. In some embodiments, the width, W, can be at least twice the thickness, T, in some embodiments at least three times, at least four times, at least five times, or more and may be ten times or less, five times or less, four times or less, etc. In some embodiments, multiple inserts are configured to be simultaneously disposed within the retractor, the inserts together providing a central access 10712 for tools through the retractor to a surgical site. In some embodiments, a system comprises a retractor having inner sidewalls 10706, and a conformal insert having a shape corresponding to the inner sidewalls of the retractor. In some embodiments, the conformal insert has a shape corresponding to at least 25% of the inner sidewalls 10706 of the retractor. In some embodiments, the conformal insert has a shape corresponding to at least 30%, at least 40%, at least 50%, at least 75%, or more of the inner sidewalls 10706 of the retractor. In some embodiments, the conformal insert is spaced apart from the inner sidewalls 10706 of the retractor by on average less than about 5 mm, less than about 4 mm, less than about 3 mm, less than about 2 mm, or less. In some embodiments, the conformal insert is spaced apart from the inner sidewalls 10706 of the retractor by on average less than about 10% of the width of the insert, by less than about 5%, less than about 3%, or less.
Additionally, in some embodiments, an insert comprises a top head portion 10708 and a plurality of rigid elongate support structures for supporting proximal and distal cameras 10714 thereon and at least one illumination source 10714. In some embodiments, the length, L, of the support structures that is configured to extend into the retractor is greater than the width, W. In some embodiments, the length of the support structures is at least twice the width, at least three times the width, at least four times the width, or more.
Additionally, in some embodiments, an insert comprises a top head portion 10708 and a plurality of rigid elongate support structures configured to receive a camera 10714 and at least one illumination source 10714. In some embodiments, the length, L, of the support structures that is configured to extend into the retractor is greater than the width, W. In some embodiments, the length of the support structures is at least twice the width, at least three times the width, at least four times the width, or more.
Camera Modules with Flex Cable
In some embodiments, the cameras on the retractor blades or flexible cable can be tilted, for example, upward, downward, sideways, or combinations thereof. The cameras can be tilted to achieve different orientations of the camera. For example, in some embodiments, the cameras can be tilted with the use of hydraulic balloon, diaphragm, or bellows actuated pistons, thereby orienting the camera in different positions relative to the retractor blade. The tilt of the camera can be changed prior to or during a surgical procedure. For example, the cameras can be positioned on a stage and the stage can be tilted with the use of hydraulic balloon, diaphragm, or bellows actuated pistons, thereby orienting the camera in different positions relative to the retractor blade. In some embodiments, the cameras on the retractor blades can be situated on a track that allows the camera to move vertically (and/or laterally) on the retractor blade, thereby changing the position of the camera. Such a vertical position can be set prior to surgery or during surgery. The positioning may be performed manually or by using an actuator, such as a motor or other actuator.
In some embodiments, the camera module position and orientation with respect to the retractor can be controlled, for example, remotely controlled. The camera may be manipulated to tilt or move vertically or horizontally with respect to the retractor. Electrical, manual, mechanical, or other means can control and/or drive the position or orientation of the camera. In some embodiments, the camera movement can be provided by an electromechanical device such as a piezoelectric or other actuator. In some embodiments, the camera movement can be provided by one or more Micro-Electro-Mechanical System (MEMS) actuators. In some embodiments, however, MEMS devices may encounter limitations with regard to autoclaving and exposure to water.
In various embodiments, the camera modules can be positioned using a movable mechanical device or system that is manually controlled.
As illustrated in
As described above, in some embodiments, the movement is provided with electrical actuators. Instead of manually turning a knob or crank, for example, a motor or other electro-mechanical actuator could be used to drive the motion.
Additionally, in some embodiments, the camera can be actuated pneumatically or hydraulically. Pressurized air or saline can allow for remote actuation of the camera. Bellows or diaphragms may be used to provide the force necessary to rotate or tilt the camera with respect to the retractor blade. The pneumatic or hydraulic actuation requires no motors and has greater compatibility with EM tracking devices as described herein. In some embodiments that do not employ a MEMS actuation device, autoclaving can be employed and water damage issues are alleviated.
Foot Pedal and Frame
The foot pedal 10801 can activate or control the functions associated with or connected to the console 10001. During surgery, a medical professional 10029, 10031 can depress or otherwise activate the foot pedal 10801. In some embodiments, such activation can provide for activation of or communication with an associated hydraulic, mechanical, or electrical system. For example, the activation of the foot pedal 10801 by depressing the foot pedal 10801 can start a hydraulic circuit similar to one that can also control the tools or camera cleaning components as described herein. The foot pedal can be used to control the hydraulic system. In some embodiments, the foot pedal 10801 can be used to operate surgical devices such as a surgical retractor, camera, or tools such as a Kerrison, forceps, or any other tools as disclosed or described herein. Additionally, in some embodiments, the foot pedal can be used to control fluid and/or air pulses over the surface of the cameras or for use in the surgical area.
In some embodiments, the foot pedal 10801 can be a proportional foot pedal that allows for proportional control of the associated mechanism. For example, the speed or force applied by a tool can be proportional to the depression or force applied by the medical professional to the foot pedal 10801. Similarly, increasing depression of the foot pedal, for example, can close the tool by a proportional amount. Additionally, in some embodiments, surgical impedance feedback can be incorporated as described previously. Increased resistance felt by the surgical tool can be communicated to the operator by an increased resistance in the depression of the foot pedal. This allows the medical professional to receive a tactile response to the resistance of the tool even though the tool is operated by a remote foot pedal.
One or more foot pedals can be utilized to control one device. The multiple foot pedals can provide for control of different parameters of a device. In some embodiments, the one or more foot pedals can each operate or control different tools. One foot pedal can operate one device while a second foot pedal can control a second device. Therefore, the medical professional can utilize two tools actuated by foot pedals while leaving his or her hands free to perform other functions. In various embodiments, one or more of these foot pedals may be proportional foot pedals and may provide tactile feedback.
As illustrated in
Driver Boards Inside the Console
In some embodiments, the console 10001 can contain driver boards for controlling various surgical tools. Some embodiments further comprise a foot pedal, which can send a signal to a driver board in the console 10001 upon user actuation of the foot pedal. Upon receiving a signal from the foot pedal, the driver board can control various parameters of the surgical tools, such as power. In some embodiments, the foot pedal is proportional, so that the degree to which the foot pedal is depressed correlates to the amount of power supplied to a surgical tool and/or the result, e.g., the amount of movement, the speed, etc. Thus, for example, if the foot pedal is depressed to its maximum displacement (e.g., to the floor) the driver board can cause the surgical tool to operate at maximum power. Continuing the example, if the foot pedal is depressed to half its maximum displacement, the driver board can cause the surgical tool to operate at half the maximum power.
In some embodiments, the console 10001 can contain an ultrasonic driver board for driving an ultrasonic tissue aspirator. The ultrasonic tissue aspirator can be used for various surgical procedures and operations, such as brain tumor debulking. Brain tumor debulking can be facilitated by various aspirator hand pieces, which can also be driven by the ultrasonic driver board inside the console 10001. The aspirator hand pieces can be reusable according to some embodiments. In some embodiments, ultrasonic power can be controlled via a proportional foot pedal. For example, depressing the foot pedal can send a signal to the ultrasonic driver board in the console 10001 to deliver power to the ultrasonic tissue aspirator in proportion to the degree to which the foot pedal is depressed.
In some embodiments, one foot pedal may control both the aspiration level and the ultrasonic power level of the ultrasonic tissue aspirator. In some embodiments, depressing the foot pedal may cause the aspiration level and the ultrasonic power level to increase simultaneously and proportionally. In other embodiments, the aspiration level and the ultrasound power level may not increase simultaneously and/or proportionally. For example, a first depression of the foot pedal (e.g., depressing the foot pedal to half its maximum displacement) may increase the aspiration level while the ultrasound power level remains constant. A second depression of the foot pedal (e.g., depressing the foot pedal from half its maximum displacement to its full maximum displacement) may increase both the aspiration level and the ultrasound level proportionally. In other embodiments, the sequence may be reversed. Thus, a first depression of the foot pedal may increase the aspiration level and the ultrasound power level proportionally, and a second depression of the foot pedal may increase the aspiration power level while the ultrasound power level remains constant. In various embodiments, software may be provided which can be programmed to control the relation between the aspiration level and the ultrasound power level that is output when the foot pedal is depressed.
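By way of illustration only, one possible software mapping for such a two-stage pedal response is sketched below. The function name, the normalized 0-to-1 conventions for pedal displacement and output levels, and the breakpoint at half travel are assumptions made for this sketch rather than features of any particular embodiment.

def pedal_to_levels(displacement: float) -> tuple[float, float]:
    """Map normalized pedal displacement (0.0-1.0) to (aspiration, ultrasound) levels."""
    d = max(0.0, min(1.0, displacement))      # clamp to the valid pedal range
    aspiration = d                            # aspiration level rises over the full travel
    if d <= 0.5:
        ultrasound = 0.0                      # ultrasonic power held constant in the first half of travel
    else:
        ultrasound = (d - 0.5) / 0.5          # ultrasonic power ramps to full over the second half
    return aspiration, ultrasound

# Example: half travel gives (0.5, 0.0); three-quarter travel gives (0.75, 0.5).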
In some embodiments, the ultrasonic driver board may introduce noise, which may disturb the video output signal from the cameras on the retractors. In order to minimize noise introduced to the video camera output signal by the ultrasonic driver board, the ultrasonic driver board can be shielded and separated from the other electronic components in the console, such as the flex cables from the cameras on the retractors. Additionally, the surgical visualization system may include a filter configured to filter out the unwanted components of the video from the cameras. Because the noise introduced by the ultrasonic driver boards may be at a particular frequency, a notch filter can be used, the notch being at a dominant frequency of the noise (e.g., at 23 kHz and/or 36 kHz). This notch filter may be included with electronics for the surgical visualization system such as electronics on the console and may be in electronics on the console arm such as the distal end of the console arm, e.g., in the CIB. The notch filter may comprise one or more digital or analog filters. In some embodiments, the notch filter is included in a processor. In some embodiments, the notch filter is included with amplifier circuitry.
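As one illustrative way such notches could be realized digitally, a sketch using SciPy is given below; the sampling rate, quality factor, and signal length are placeholder values assumed only for the sketch, not values taken from the system described above.

import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 250_000.0                              # assumed sampling rate of the digitized video line, Hz
video_signal = np.random.randn(10_000)      # placeholder for a run of digitized video samples

for f_notch in (23_000.0, 36_000.0):        # dominant ultrasonic noise frequencies noted above
    b, a = iirnotch(w0=f_notch, Q=30.0, fs=fs)     # narrow IIR notch at each frequency
    video_signal = filtfilt(b, a, video_signal)    # zero-phase filtering of the samples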
In some instances, a phase-locked loop can be employed to remove noise introduced by the ultrasonic driver board. In particular, the phase-locked loop can be used to synchronize a version of the output signal from the ultrasonic driver board with the output video signal of the cameras, such that the noise contribution to the video signal resulting from the ultrasonic driver board can be removed or at least reduced. In various embodiments, for example, the ultrasonic driver board generates an oscillating signal at a particular frequency (e.g., at 23 kHz and/or 36 kHz) as well as potentially some harmonics. This oscillating signal can introduce noise having similar frequency components into the video signal produced by the camera. In an effort to cancel out this noise, an output signal of the ultrasonic driver board at this frequency (and possibly including the harmonics) can be compared to the video output signal from the cameras, or a facsimile thereof, using the phase-locked loop. The phase-locked loop can then lock onto the frequency and phase of the two signals. The phase-locked loop would then be used to produce an output similar to the ultrasonic driver board signal but with a frequency and phase that matches the output video signal of the cameras. This phase-locked signal can then be subtracted from the video signal output by the cameras so that noise in the video introduced by the ultrasonic driver board can be removed or at least reduced. In some embodiments, an adaptive filter can be used to provide the correct amplitude of the noise to be subtracted out of the video signal.
The ultrasonic driver board, in some embodiments, drives an ultrasonic tissue aspirator, which can be coordinated with a precision suction system. The precision suction system can be used for various surgical operations. For example, the suction system can be used near fragile veins with precise control of very low vacuum levels. The suction system can also be used near ruptured aneurysms where high vacuum levels can be used for suctioning blood. Suctioned blood and tissue can be transferred to disposable canisters according to some embodiments. In addition, the suction system can be used with conventional medical devices, such as reusable and disposable cannulas. The suction system can also be coordinated with the ultrasonic tissue aspirator.
In some embodiments, the power and/or vacuum level of the precision suction system can be controlled via a proportional foot pedal. Some embodiments include a proportional foot pedal for the precision suction system, and a separate proportional foot pedal for the ultrasonic tissue aspirator. In other embodiments, one proportional foot pedal can be used to control both the ultrasonic tissue aspirator or other tools and the precision suction system. In these embodiments, an auditory command, a gesture, a touch input, or the like can be used to select the surgical tool controlled by the proportional foot pedal.
Some embodiments of the console include a radiofrequency (RF) driver board in the console 10001. The RF driver board can be configured to drive bipolar radiofrequency (RF) coagulation and cutting tools as well as bipolar forceps. In some embodiments, the coagulation and cutting tools are disposable. In order to reduce charring and sticking, in some embodiments the bipolar forceps can be irrigated with pressurized saline from the console 10001. Saline irrigation can further be utilized to clean the bipolar forceps as well as other surgical tools.
A proportional foot pedal can be used to control the bipolar RF coagulation and cutting tools and the bipolar forceps. For example, depressing the proportional foot pedal can send a signal to the RF driver board causing the RF tools to operate with more or less power, in proportion to the degree to which the foot pedal is depressed. Some embodiments include a proportional foot pedal in communication with the RF driver board and a separate proportional foot pedal in communication with the ultrasonic driver board. Other embodiments include one foot pedal in communication with both the RF driver board and the ultrasonic driver board.
According to some embodiments, the RF driver board can support multiple modulation formats. Thus, one RF driver system included in the console can support multiple RF surgical tools and their respective modulation formats. The RF driver board may, for example, comprise a processor and/or other electronics configured to provide signals with the desired formatting. Amplifiers may be used as well to provide the desired formatting. In various embodiments, a format is selected and the processor or electronics adjusts so as to output the suitable format. The format selected may vary for different surgical tools, tool types, manufacturers, etc. Selection may be provided by the surgeon, a technician, a nurse, or another medical professional. In some embodiments, the selection may be provided by RFID, EEPROM, or other coding associated with the tool such that the presence or connection of the tool activates selection of the proper format. The electronics in the RF driver board reconfigure the output to be consistent with the selected format. Thus, despite different modulation formats for different RF surgical tools, only one RF driver system, which may be included in the console, may be used to drive the RF surgical tools.
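A minimal sketch of how such a selection could be keyed off a tool's RFID or EEPROM code is shown below; the tool codes, format names, and numeric parameters are invented placeholders assumed only for this sketch, not the actual formats of any tool or driver board.

# Hypothetical table mapping a tool's RFID/EEPROM code to its drive format.
RF_FORMATS = {
    "TOOL_BIPOLAR_FORCEPS_A": {"carrier_hz": 500_000, "modulation": "continuous", "max_power_w": 40},
    "TOOL_RF_CUTTER_B": {"carrier_hz": 450_000, "modulation": "30khz_burst", "max_power_w": 60},
}

def select_rf_format(tool_code: str) -> dict:
    """Return the drive format registered for the connected tool, or raise if unknown."""
    try:
        return RF_FORMATS[tool_code]
    except KeyError:
        raise ValueError(f"no RF drive format registered for tool code {tool_code!r}")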
As discussed above, in various embodiments, foot pedals may be used to provide control, such as proportional control of electronics, hydraulics, tools, suction, and other functions or combinations of functions. Foot pedals are beneficial because control can be provided without employing the hands, which can be used for other aspects of the surgery. In certain embodiments, however, other types of controls other than foot pedals may be used.
In some embodiments, the RF driver board may introduce noise, which may disturb the video output signal from the cameras on the retractors. In order to reduce or minimize noise introduced to the video camera output signal by the RF driver board, the RF driver board can be shielded and separated from the other electronic components in the console, such as the flex cables from the cameras on the retractors. Additionally, the surgical visualization system may include a filter configured to filter out the unwanted components of the video from the cameras. Because the noise introduced by the RF driver board may be at a particular frequency, a notch filter can be used, the notch being at a dominant frequency of the noise. For example, in some embodiments, the RF driver board may introduce to the video camera output signal a noise frequency of about 300-500 kHz with a modulation scheme of approximately 30 kHz and other harmonics that may be introduced by, for example, changes in the impedance of the tissue at the surgical site. Therefore, in these embodiments, the notch filter may be configured to filter out from the video camera output signal the dominant frequency, which may be about 500 kHz. In some embodiments, the noise introduced by the RF driver board may be at relatively high frequencies, such as 400 kHz to 3.5 MHz. Therefore, in some embodiments the surgical visualization system can include a low pass filter to remove these relatively high frequency noise signals from the video camera output signal.
This notch filter or low pass filter may be included with electronics for the surgical visualization system such as electronics in the console and may be in electronics on the console arm such as at the distal end of the console arm, e.g., in the CIB. The notch filter or low pass filter may comprise one or more digital or analog filters. In some embodiments, the notch filter or low pass filter is included in a processor. In some embodiments, the notch filter or low pass filter is included with amplifier circuitry.
In some instances, a phase-locked loop can be employed to remove noise introduced by the RF driver board. In particular, the phase-locked loop can be used to synchronize a version of the output signal from the RF driver board with the output video signal of the cameras, such that the noise contribution to the video signal resulting from the RF driver board can be removed or at least reduced. In various embodiments, for example, the RF driver board generates a signal at a particular frequency (e.g., at 500 kHz) as well as potentially some harmonics. This signal can introduce noise having similar frequency components into the video signal produced by the camera. In an effort to cancel out this noise, an output signal of the RF driver board at this frequency (and possibly including the harmonics) can be compared to the video output signal from the cameras, or a facsimile thereof, using the phase-locked loop. The phase-locked loop can then lock onto the frequency and phase of the two signals. The phase-locked loop would then be used to produce an output similar to the RF driver board signal but with a frequency and phase that matches the output video signal of the cameras. This phase-locked signal can then be subtracted from the video signal output by the cameras so that noise in the video introduced by the RF driver board can be removed or at least reduced. In some embodiments, an adaptive filter can be used to provide the correct amplitude of the noise to be subtracted out of the video signal. Paragraphs [0883]-[0898] and claims 111-115 from each of U.S. Prov. App. No. 61/880,808, U.S. Prov. App. No. 61/920,451, U.S. Prov. App. No. 61/921,051, U.S. Prov. App. No. 61/921,389, U.S. Prov. App. No. 61/922,068, and U.S. Prov. App. No. 61/923,188 are incorporated by reference herein.
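One textbook way the adaptive-filter step described above might be sketched in software is shown below, using a least-mean-squares (LMS) update; the reference input is assumed to be a sampled copy of the driver-board output, and the filter length and step size are arbitrary placeholders chosen only for this sketch.

import numpy as np

def lms_cancel(video: np.ndarray, reference: np.ndarray,
               n_taps: int = 32, mu: float = 1e-3) -> np.ndarray:
    """Subtract an adaptively scaled and phased copy of `reference` from `video`."""
    w = np.zeros(n_taps)                         # adaptive filter weights
    cleaned = np.empty_like(video)
    for n in range(len(video)):
        x = reference[max(0, n - n_taps + 1):n + 1][::-1]    # most recent reference samples
        x = np.pad(x, (0, n_taps - len(x)))                   # zero-pad during start-up
        cleaned[n] = video[n] - w @ x                         # error signal = cleaned video sample
        w += 2.0 * mu * cleaned[n] * x                        # LMS weight update
    return cleaned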
Distal Proximal Camera with Prism
As illustrated in
In various embodiments, the cameras can additionally include one or more distal optical elements (e.g., a prism) configured to redirect an optical path or primary optical axis from being parallel with a mechanical axis (e.g., a longitudinal axis of a cable forming part of a camera module) to point inward relative to the mechanical axis. The inward pointing angle can be at least 0.1° and/or less than or equal to about 90°, at least about 15° and/or less than or equal to about 75°, or at least about 30° and/or less than or equal to about 45°. The optical subassembly can include one or more proximal optical elements (e.g., a prism) to fold the optical path of the optical assembly to be parallel with the mechanical axis (e.g., the longitudinal axis of the cable).
In various embodiments, the optical subassembly can include distal optical elements (e.g., prisms) to further separate stereo optical paths to approximate an interpupillary distance, which may in some embodiments be similar to that provided by an operating room microscope. For example, for a close working distance (e.g., 10 mm, 15 mm, 20 mm, 25 mm), the adjacent right eye and left eye separation can be placed side by side, with the lens trains side by side and the front collection surfaces side by side. In some embodiments, a single image sensor can include masking to separate portions of the sensor for the respective lens trains. In various embodiments, a center-to-center distance between the front collection surfaces, for the respective left and right imaging optics, divided by the above working distances approximates the convergence physicians are familiar with, such as while using such operating room microscopes. For larger working distances (e.g., approaching between about 75 mm and 100 mm), the center-to-center separation can be increased by using deviating prisms configured to separate the front collection surfaces and/or entrance pupils. For example, the stereo optical assemblies 11200a, 11200b can be used respectively as proximal and distal cameras mounted on a retractor.
With regard to masking the sensor, in some embodiments, portions of the image sensor can be electronically mapped (e.g., through electronics or image processing methods) as left and right sides of the image sensor for the respective left and right channels of the stereo camera optics. As discussed above, in some embodiments, a center-to-center distance between the left and right sides, divided by the working distance, can be adjusted to approximate the convergence the surgeon is accustomed to. In some embodiments, the parameters are selected to approximate the convergence of typical operating room microscopes. The electronic masks can be used to create left- and right-eye views using circular electronic mask openings, square electronic mask openings, or some other shape for the electronic mask openings. In some embodiments, the electronic mask openings can be movable along an axis of the sensor (e.g., a vertical or horizontal axis) to control convergence. The distance between the electronic mask openings can be controlled by a user through user interface elements on the display, such as a graphical user interface. The size of the masks (e.g., a diameter) can be electronically stored in a non-transitory storage medium (e.g., an EPROM), and may be fixed or adjustable. The distance between the electronic mask openings can be configured according to one or more targeted or suitable effects such as, for example, alignment error correction, dead pixel masking, or the like.
In some embodiments, the second redirection optics 11220a can be configured to redirect the left- and right-side optical axes from a path that is substantially parallel with a mechanical axis of the structure with which it is associated to an axis that is between about 10 degrees and about 75 degrees, between about 20 degrees and about 60 degrees, or between about 30 degrees and about 45 degrees from coaxial with that mechanical axis. In some embodiments, the second redirection optics 11220b can be configured to separate left and right optical paths to an approximate inter-pupillary distance to provide stereo imaging for three-dimensional viewing. In certain embodiments, the second redirection optics 11220b can be configured to separate optical paths to an approximate inter-pupillary distance of a typical operating room microscope. For relatively close working distances (e.g., about 10 mm, 15 mm, 20 mm, or 25 mm), the adjacent right-eye and left-eye separation on the image sensor, with an optical, physical, or electronic mask, can be sufficient for the inter-pupillary separation distance. For longer working distances (e.g., at least about 75 mm and/or less than or equal to about 100 mm), the second redirection optics 11220b can be used to change the effective separation of the left- and right-eye views. In such a case, the line of sight of the proximal cameras can be decreased to be between about 30 degrees and about 50 degrees for viewing into, as opposed to within, a surgical or anatomical site. Accordingly, camera optics providing a line of sight of between about 30 degrees and about 50 degrees may be used in such cases. In various embodiments, arranging the left and right views in a side-by-side arrangement and adjusting the spacing therebetween to provide the desired convergence can reduce keystone distortions that would be caused by alternatively tilting the pair of camera views with respect to each other to provide the desired convergence.
As discussed above, in some embodiments, the stereo optical assemblies 11200a, 11200b can be used respectively as proximal and distal cameras mounted on a retractor. The stereo optical assemblies 11200a, 11200b can be mounted at the same azimuthal angle with respect to the surgical site, e.g., both at 12 o'clock, 3 o'clock, 6 o'clock, 9 o'clock, or points in between. The stereo optical assemblies 11200a, 11200b can be configured to have their optical axes generally align with the gravity vector when mounted to a retractor. In some embodiments, the angles of the optical axes can be different from one another, or non-parallel.
Stereo Camera Design
As discussed above, in various embodiments a single two-dimensional image sensor can be employed for a stereo camera. Separate optics, e.g., lens trains, for left (L) and right (R) channels can be directed to the single two-dimensional image sensor as shown in
Alternatively, more than a single chip can be employed. In particular, first and second two-dimensional detector arrays can be disposed to receive the left (L) and right (R) channels respectively as illustrated in
In the configuration shown, the first detector array is disposed above (e.g., in the +Y direction) and is offset laterally (e.g., in the X direction) with respect to the second detector array. Additionally, the active area of the chip for the first detector array faces downward (e.g., in the −Y direction) while the active area of the chip for the second detector array faces upward (e.g., in the +Y direction).
Likewise, the first and second prisms are oriented oppositely as well as being disposed laterally with respect to each other. For example, the first prism is disposed in the X direction with respect to the second prism. Moreover, in the embodiment shown, the first and second prisms comprise right-angle prisms; however, the first prism is flipped upside-down with respect to the second prism. Each of the first and second prisms has a primary reflective face that receives light from the lens train and directs the light to the active region of the 2D detector array chip. Because the second 2D array is below the prism while the first 2D array is above the prism, the reflective surfaces of the two prisms are orthogonal to each other. Accordingly, the reflective surface of the second prism reflects incoming light from the optics train downward (e.g., in the −Y direction) to the detector array thereunder. In contrast, the reflective surface of the first prism reflects incoming light from the optics train upward (e.g., in the +Y direction) to the detector array thereover.
Advantageously, this arrangement enables the two detector array chips to be in close proximity to each other, providing a more compact design. Variations are possible. For example, the prisms need not be right-angle prisms, and the primary reflective surface need not be at an angle of 45 degrees with respect to the front face of the 2D sensor arrays. Additionally, different types of prisms and/or reflective surfaces can be employed. Other variations are also possible.
In comparison to the embodiment shown in
Example Camera/Sensor Designs
As discussed above with reference to
In other designs, it may be possible to have a chip with two spaced apart active regions thereon corresponding to left and right image channels. A single chip in a single package can comprise semiconductor material patterned such that two spaced apart regions of pixels are created to receive light from the left and right lens trains. In some embodiments, the space between these regions may include electronics or dead space, and may not be an active area for collecting and sensing light. The spacing may accommodate, for example, the space needed for the two (left and right) lens trains or other physical components. A single 45° turning prism or a pair of 45° turning prisms may be employed to redirect light from said lens trains onto the front face and active regions of the sensor.
Methods of Surgery
A surgeon makes an incision for access into the body and introduces tools initially into the body. As the tools progress into the surgical site, the surgeon may use certain embodiments of the cameras as disclosed herein on a retractor. In certain cases, the surgeon will use the proximal retractor cameras initially and the distal retractor cameras thereafter as the surgical tool(s) passes deeper into the surgical site, for example, passing through proximal regions of the opening in the body into more distal regions into the surgical site. The various cameras can be employed to guide advancement of the tool into the desired depth in the body and into the surgical site. Similarly, with removal of the instruments, this process may be reversed (for example, the proximal cameras may be relied upon more after initially relying on the distal cameras).
Various embodiments of the system may additionally be configured to provide for the same convergence angle for each of the stereo cameras, for example, the stereo cameras on the retractor, including possibly both proximal and distal stereo cameras. Also, if a stereo camera is mounted on a surgical tool, such as for example, a Kerrison, this tool camera too may have the same convergence angle. Having a similar convergence angle from one stereo camera to another should provide a more comfortable viewing experience for the surgeon.
The convergence angle is determined by the separation of the left and right cameras that make up the stereo camera pair. As discussed herein, these cameras obtain images of the object from different perspectives, akin to a human's eyes separated by an interpupillary distance. The convergence angle is also determined by the distance to the object, for example, the working distance of the camera. In particular, the convergence angle depends on the ratio of the distance separating the left and right cameras and the working distance of the camera pair to the object.
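One common geometric way to express this relation, stated here only as an illustration (with d denoting the left-right camera separation and WD the working distance to the object), is:

\theta_{conv} \approx 2\arctan\!\left(\frac{d}{2\,WD}\right) \approx \frac{d}{WD} \quad \text{(small-angle approximation)}

Under this small-angle approximation, two stereo cameras present the same convergence when the ratio d/WD is the same for each, which is consistent with matching the camera separation to the working distance as described above.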
The human brain and eyes react to depth cues resulting at least in part from the convergence. Likewise, images produced by stereo cameras having a convergence angle (based on the interpupillary distance and working distance of the stereo camera pair) will provide depth cues to viewers of those images. As the surgeon may be transitioning between viewing images from the proximal and distal cameras on the retractor, and one or more cameras on surgical tools, the surgeon will receive depth cues from these different cameras. In various embodiments, the stereo cameras have the same convergence so as to avoid introducing changes among the depth cues as the surgeon moves from viewing video from one of the cameras to another and to yet another and back, for example.
In certain embodiments, stereo cameras may be configured to be adjusted to provide the same convergence angle. For example, the stereo camera or cameras on the retractor and/or surgical tool may be adjustable to provide the same convergence. As referred to above, these cameras may include proximal and/or distal cameras on the retractor.
As discussed above with reference to
Such a mask need not be limited to embodiments such as those disclosed in
Accordingly, the mask can be adjusted, for example, one or more openings therein can be translated, to provide for the same convergence between stereo cameras on the retractor and/or surgical tool. For example, a mask on one or more stereo cameras (e.g., a stereo camera pair on a surgical tool, proximal and/or distal stereo cameras on a retractor, etc.) may be changed or reconfigured, for example, by moving one or more openings therein, to provide the same convergence angle. Consequently, using the reconfigurable mask with movable aperture(s), stereo camera pairs on retractors or surgical tools may be provided with a similar convergence. By maintaining the same convergence for the different cameras, the depth cues provided to the surgeon can be maintained relatively constant despite viewing images from different stereo cameras (e.g., proximal retractor camera, distal retractor camera, surgical tool camera, etc.). As a result, a more comfortable viewing experience may be provided.
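Purely as an illustrative sketch of the geometry involved, the calculation below estimates the center-to-center spacing of the left and right mask openings needed to present a chosen convergence angle at a given working distance; the small-angle model, the fixed lateral magnification, and all variable names are assumptions made only for this sketch.

import math

def mask_center_spacing(convergence_deg: float, working_distance_mm: float,
                        magnification: float) -> float:
    """Sensor-plane spacing (mm) of the left/right mask openings for a target convergence.

    Assumes the object-space separation is 2 * WD * tan(theta / 2) and that a fixed
    lateral magnification maps object-space separation to the sensor plane.
    """
    theta = math.radians(convergence_deg)
    object_space_separation = 2.0 * working_distance_mm * math.tan(theta / 2.0)
    return object_space_separation * magnification

# Example: holding 6 degrees of convergence at 20 mm and 80 mm working distances
# requires the mask spacing to scale roughly in proportion to the working distance.
for wd in (20.0, 80.0):
    print(wd, round(mask_center_spacing(6.0, wd, magnification=0.1), 3))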
In certain embodiments, the stereo camera may additionally provide adjustable focus. One or more actuators may be included that are configured to translate one or more lenses in the camera optics that image the surgical site onto the two-dimensional detector array to change the focus of the camera. These actuators may be driven electrically in some embodiments, although different types of actuators could be employed. These actuators can be included in the package that supports the camera and is disposed on the retractor. Advantageously, cameras on retractors (in contrast, for example, to endoscopes) have available space lateral to the imaging lenses (e.g., in the radial direction) in which such actuation devices can be located. The result may be that the lateral dimensions (e.g., in x and y) exceed the longitudinal dimension (z); however, surgical access to the surgical site would not be impeded by utilization of the space surrounding the lenses in the lateral or radial directions.
In various embodiments, when the focus is changed using the actuator, the mask may be reconfigured or changed as discussed above. For example, one or more open regions or apertures in the mask through which light is directed to the left and/or right channel can be shifted laterally to increase or decrease the convergence angle. In this manner, the convergence angle of the stereo camera with the adjustable focus disposed on the retractor or surgical tool can be altered to remain the same as that of the other stereo cameras. A constant convergence angle for different stereo cameras can thus be provided even if such cameras include an adjustable focus. Both the focus and the mask can be changed as needed to provide the desired focus and convergence angle.
Incorporating an adjustable focus enables a camera lens having a smaller depth of focus to be employed. Such a camera lens will have a larger numerical aperture and smaller F-number than a similar lens that produces a larger depth of focus. Some benefits of the larger aperture lens are increased light collection and resolution.
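The familiar first-order relations, recalled here only for context and assuming a diffraction-limited lens of focal length f, aperture diameter D, and operating wavelength \lambda, are approximately:

F/\# = \frac{f}{D}, \qquad NA \approx \frac{1}{2\,(F/\#)}, \qquad \text{depth of focus} \propto \frac{\lambda}{NA^{2}}

Under these approximations, doubling the numerical aperture roughly quadruples the light collection while reducing the depth of focus to about one quarter, which is why an adjustable focus becomes useful with the larger-aperture lens described above.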
Adjustment of Camera Focal Length and/or Orientation
In various embodiments, the cameras on the retractor provide focused images of a surgical tool tip. Furthermore, various embodiments include software that maintains focus on the surgical tool tip, even while the tool moves about the surgical site. In particular, when the tool moves in any direction (e.g., laterally, vertically, or a combination of both), the software can be configured to maintain the retractor cameras' focus (or other camera's focus) on the tool tip by changing a focal length and/or orientation of the cameras. In various embodiments, the position and/or movement of the tool tip can be determined through a tracking device on the surgical tool, e.g., on or at the tool tip.
When the tool tip moves about the surgical site, the distance between the cameras imaging the surgical tool tip and the tool tip changes, according to some embodiments. Furthermore, the distance between one camera and the tool tip, and the distance between another camera and the tool tip, can be different. For example, when the instrument tip moves about the surgical site laterally, it can move closer to one camera and farther away from another camera. As the tool tip moves, the movement can cause the tool tip to become out of focus. In some embodiments, a foot pedal can be actuated (e.g., a foot pedal can be depressed) to enable adjustment of the focal distance and/or orientation of the cameras and thereby maintain focus of the target image. For example, actuation of a foot pedal can cause the retractor cameras to maintain focus on the surgical tool tip. Further, in order to account for the varying distances between each of the cameras and the tool tip, while maintaining focus of the tool tip, various embodiments can include software that changes the focus of the cameras as the tool tip changes position.
In addition, in the case where the cameras include electrically controlled transducers or actuators to vary their orientation, the software can be configured to change the orientation of the cameras (e.g., tip, tilt, etc.) using the actuators, as the tool tip moves vertically into or out of the surgical site. For example, if the tool tip moves deeper into the surgical site, the software can be configured to increase the angle of the cameras in a downstream direction (e.g., deeper into the surgical site). As another example, if the tool tip moves out of the surgical site, the software can be configured to increase the angle of the cameras in an upstream direction. In various embodiments, the software can be configured to change the orientation of the camera with movement of the surgical tool tip, such that the cameras' view follows movement of the tool tip. This software may for example cause a processor to drive the actuators that are configured to move the cameras.
In some embodiments, one or more cameras on the surgical tool can similarly be configured to have focal adjustment and orientation adjustment in order to maintain focus of the surgical site. In addition, tracking information can be received via a tracking device on the surgical tool. Based on this tracking information, when the surgical tool moves, the software can reposition the focal length and/or orientation of the tool camera(s) in order to maintain focus on the surgical site. Thus, the cameras can maintain focus on the surgical site while the surgical tool changes positions.
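A minimal sketch of the kind of calculation such software might perform is given below, assuming the tracking device reports the tool-tip position and the camera position and optical axis in a common coordinate frame; the function name and the representation of orientation as a single tilt angle are simplifications made only for this sketch.

import math
import numpy as np

def aim_and_focus(camera_pos: np.ndarray, camera_axis: np.ndarray,
                  tool_tip_pos: np.ndarray) -> tuple[float, float]:
    """Return (focus_distance, tilt_deg) to keep a tracked tool tip in focus and in view."""
    to_tip = tool_tip_pos - camera_pos
    focus_distance = float(np.linalg.norm(to_tip))             # new focal distance to command
    axis = camera_axis / np.linalg.norm(camera_axis)            # current optical axis (unit vector)
    cos_angle = float(np.clip(np.dot(to_tip / focus_distance, axis), -1.0, 1.0))
    tilt_deg = math.degrees(math.acos(cos_angle))                # re-aim angle toward the tool tip
    return focus_distance, tilt_deg

# Example: a camera at the origin looking down (-z) toward a tip 30 mm deep and 5 mm off-axis.
print(aim_and_focus(np.zeros(3), np.array([0.0, 0.0, -1.0]), np.array([5.0, 0.0, -30.0])))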
The images of the surgical tool and/or the surgical site can be displayed on a surgical visualization system display, such as one or more surgeon displays and/or assistant displays, according to some embodiments. Those images can also be displayed on a graphical user interface according to some embodiments. In some embodiments, a graphical user interface is displayed on the surgical visualization display. In other embodiments, a separate display is provided for the graphical user interface. In either case, the images of the surgical tool and/or the surgical site can be displayed on the surgical visualization system display(s) as well as a graphical user interface(s). In some embodiments, the movement of the tool or tool location is used to control selection of the camera images that are displayed on the displays. Additionally, actuation of a foot pedal can cause the surgical tool cameras to maintain focus on the surgical site. In some embodiments, actuation of one foot pedal device can cause both the retractor cameras and the surgical tool cameras to maintain focus of their targets.
Fluorescence Imaging
In various embodiments, images or information in addition to video from the cameras on the retractor can be presented via the display. For example, in some embodiments fluorescence information can be displayed. Cameras that image in different wavelengths, such as infrared, could image the surgical site or objects contained therein. In some embodiments, features could be made to fluoresce, for example, by injecting a fluorescent chemical and illuminating the area with light that will induce fluorescence. Such a technique may be useful to identify and/or highlight the location and/or boundaries of specific features of interest such as tumors, etc. The fluorescence or other wavelength of interest may be detected by the cameras on the retractor or one or more other cameras. In some embodiments, images produced by fluorescence or other wavelengths of interest are superimposed on one or more images from cameras on the retractor or other camera(s). Filtering could be provided to remove unwanted wavelengths and possibly increase contrast. The filter can remove excitation illumination. In some embodiments, emission image content (e.g., fluorescing tissue) can be parsed and superimposed on image content that is not emitting (e.g., tissue that is not fluorescing), or vice versa. In various embodiments, such as where the fluorescing wavelength is not visible (e.g., for fluorescence in the infrared), an artificial color rendition of the fluorescing content can be used in place of the actual fluorescing color so as to enable the fluorescing tissue to be visible.
Kerrison
In various embodiments, the console can be equipped with a hydraulic and/or pneumatic system that can be employed to drive hydraulic and pneumatic tools.
In some embodiments, the Kerrison 1900 includes a base 1930. The base 1930 can include a cutting portion at a distal end (e.g., the left end of FIG. 9B). The base 1930 can be fixed axially (e.g., parallel to the handle axis 1927) with respect to the distal handle portion 1923 and/or with respect to the proximal handle portion 1918. In some embodiments, the base 1930 and/or distal handle portion 1923 are rotatable about the handle axis 1927 with respect to the proximal handle portion 1918.
As shown in
In some embodiments, the distal handle portion 1923 defines a distal actuation chamber 1917. In some embodiments, the distal actuation chamber 1917 has a cross-section with substantially the same shape and/or size as a cross-section of at least a portion of the actuation chamber 1919.
The Kerrison 1900 can include a piston 1920. The piston 1920 can be operably coupled with and/or attached to a Kerrison top portion 1928. For example, the piston 1920 can be a unitary part with or attached/adhered/welded to the Kerrison top portion 1928. In some embodiments, the piston 1920 and top portion 1928 are connected via a releasable connection (e.g., a protrusion-slot connection). The piston 1920 can be fixed axially (e.g., parallel to the handle axis 1927) with respect to the top portion 1928. In some embodiments, the piston 1920 is fixed rotationally with respect to the top portion 1928 (e.g., rotation about the handle axis 1927).
The top portion 1928 can include a cutting edge on the distal end of the top portion 1928. The cutting edge of the top portion 1928 can be configured to operate with the cutting portion of the base 1930 to cut medical material (e.g., bone and/or other tissue). In some embodiments, the top portion 1928 is connected to the base 1930 via a track-protrusion engagement. For example, the top portion 1928 can include a protrusion configured to slidably engage with a track in the base 1930. Engagement between the track of the base 1930 and the protrusion of the top portion 1928 can limit the movement of the top portion 1928 with respect to the base 1930 to the axial direction (e.g., parallel to the handle axis 1927).
In some embodiments, the piston 1920 is configured to fit within the actuation chamber 1919 and/or within the distal actuation chamber 1917. For example, the piston 1920 can have a first guide portion 1921a configured to fit snugly within the actuation chamber 1919 (e.g., fit such that movement of the first guide portion 1921a within the actuation chamber is substantially limited to axial movement and rotational movement about the handle axis 1927). In some embodiments, the piston 1920 includes a second guide portion 1921b. The second guide portion 1921b can be configured to fit snugly within the distal actuation chamber 1917. Axial movement of the piston 1920 can be limited by interaction with a radially-inward projection 1913 of the proximal handle portion 1918. For example, proximal axial movement of the piston 1920 can be limited by interaction between the second guide portion 1921b and the radially-inward projection 1913. In some embodiments, distal axial movement of the piston 1920 is limited by interaction between the first guide portion 1921a and the radially-inward projection 1913.
The distal handle portion 1923 can include a distal opening 1905. The distal opening 1905 can be sized and/or shaped to accommodate passage of the top portion 1928 therethrough. In some embodiments, the top portion 1928 is sized and shaped to fit snugly within the distal opening 1905. For example, the top portion 1928 can have a non-circular cross-section sized to substantially match a cross-section shape of the distal opening 1905. In some embodiments, the top portion 1928 is rotationally locked to the distal handle portion 1923 via interaction between the distal opening 1905 and the top portion 1928. In some such embodiments, the grip 1915 can be rotated relative to the top portion 1928 and the base 1930. In some embodiments, sensors and/or optical devices (e.g., cameras, CMOS sensors, etc.) can be attached to the proximal handle portion 1918 such that the relative alignment of the sensors and/or optical devices with respect to the handle portion 1918 remains consistent independent of rotation of the top portion 1928 and base 1930 with respect to the handle portion 1918.
As illustrated in
In some embodiments, the Kerrison 1900 includes a return valve (not shown) configured to introduce physiological saline so as to provide compression to the actuation element 1916. The return valve may, for example, allow injection of pressurized gas into the distal actuation chamber 1917 or into the region of the proximal actuation chamber 1919 forward of the first guide portion 1921a. The fluid introduced via the return valve can be used to move the piston 1920 in the proximal direction. In some such embodiments, the Kerrison 1900 does not include a biasing structure 1924.
The actuation element 1916 can be fluidly connected to a conduit 1914 through which physiological saline can be input into and pulled out from the actuation element 1916. In some embodiments, hydraulic controls associated with the actuation element 1916 are operated via a foot pedal. Such embodiments can allow for greater dexterity for the user of the Kerrison 1900 by reducing the operating variables controlled by the Kerrison 1900 handle portions 1918, 1923. Elastomeric and/or proportional valves can be used to enhance the responsiveness of the Kerrison 1900 to operation of the foot pedal (see, e.g.,
Additionally, with reference to
Additionally, in any of the Kerrison embodiments described herein, the fixed cutting surface 1730 can be generally vertically oriented. As shown in
In the tool embodiments disclosed herein, including without limitation the Kerrison, the cutting surface or top surface of the tool can be bayonetted. The bayonetted structure of the tool can allow the tool to be inserted into the surgical area without interfering or obscuring the views of the surgical site or overhead views of the surgical field. The bayonet style tool can be utilized for the Kerrison, forceps, scissors, or other tools described herein. The bayonet configuration can be advantageous for small surgical sites or external viewing of the surgical site. The bayonet feature reduces the area obscured by the tool within the surgical site.
In any of the tool embodiments disclosed herein, including without limitation the Kerrison, the housing supporting or comprising the tool can be configured to have a port or lumen therein arranged to facilitate the removal of tissue and bone extracted from the surgical site. For example, the Kerrison can have a side port or opening located proximal of the cutting head 1728, through which cut tissue can be removed (e.g., pushed through the port or opening as the cutter withdraws and the Kerrison returns to the default position). In some embodiments, a source of suction, or a source of saline and suction, can be supplied to the port. Additionally, the removal port or lumen of the housing can also support a mechanical removal mechanism, such as but not limited to a screw type auger (which can be hydraulically actuated, via for example a gear motor, gerotor, or vane motor 1512 discussed above), to facilitate removal of bone debris and extracted tissue from the surgical site. In some embodiments, the removed tissue can be extracted to a waste reservoir supported by or tethered to the housing of the tool. In another embodiment, the movable cutting head of the Kerrison can be a generally cylindrical tube that can be actuated (in the manner described above) to slidably move against the fixed cutting surface 1730. For example, said cylindrical tube can be slidable within an outer housing of the Kerrison when a force is exerted thereon via the expansion of the second inflatable element 1716, as discussed above.
Additionally, in any of the tool embodiments disclosed herein, including without limitation the Kerrison, the housing supporting or comprising the tool can be configured to have a suction port and a source of saline so that the tool and/or the surgical site can be flushed with saline and the saline and debris can be removed via the suction line simultaneously or sequentially with the flushing. In some embodiments, the saline can be provided through the conduit used to provide saline to the second inflatable element, through the same or a different lumen of such conduit.
Additionally, the saline source or conduit and/or the suction source or conduit can be separate from the tool so that it can be independently positioned. In some embodiments, the saline source or conduit and/or the suction source or conduit can be tethered to the tool.
Any of the hydraulic system embodiments disclosed herein can be configured to incorporate or use any suitable surgical tools, including without limitation scissors, micro-scissors, forceps, micro-forceps, bipolar forceps, clip appliers including aneurysm clip appliers, rongeurs, and, as described, Kerrison tools.
Turbine
Another tool is a drill, which can be driven by a hydraulic turbine. In some embodiments, as illustrated in
The relative areas of the nozzle inlet 2074 and the nozzle outlet 2075 can vary. For example, the nozzle outlet 2075 can have an area that is greater than or equal to approximately 125% of the area of the nozzle inlet 2074 and/or less than or equal to about 600% of the area of the nozzle inlet 2074. In some embodiments, the area of the nozzle outlet 2075 is approximately 300% of the area of the nozzle inlet 2074.
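By way of a non-limiting illustration of the area relationship described above, the following short calculation computes a hypothetical outlet area at the approximately 300% example ratio and checks that it falls within the disclosed 125%-600% range. This is a sketch only; the 0.8 mm inlet bore is an assumed value, not taken from the disclosure.

```python
# Hypothetical sketch of the nozzle inlet/outlet area relationship described above.
# The inlet diameter is an assumed value; only the 125%-600% bounds and the ~300%
# example ratio come from the text.
import math

def circular_area(diameter_mm: float) -> float:
    """Cross-sectional area of a circular passage, in mm^2."""
    return math.pi * (diameter_mm / 2.0) ** 2

inlet_area = circular_area(0.8)      # assumed 0.8 mm inlet bore (illustrative only)
outlet_area = 3.0 * inlet_area       # approximately 300% of the inlet area

ratio = outlet_area / inlet_area
assert 1.25 <= ratio <= 6.0, "outlet/inlet area ratio outside the disclosed range"
print(f"inlet {inlet_area:.3f} mm^2, outlet {outlet_area:.3f} mm^2, ratio {ratio:.0%}")
```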
As illustrated in
In some embodiments, physiological saline is directed through the nozzle frame 2072 toward an impeller 2076. The impeller 2076 can include a plurality of impeller blades 2077 around the outer periphery of the hub of the impeller 2076. The impeller blades 2077 can rotate within a blade cavity 2077a. (See
In some cases, the impeller blades 2077 are oriented at an angle offset from the central axis of the impeller 2076. The hydraulic nozzles 2073 can be configured to turn the flow of physiological saline from an axial direction A to a nozzle direction 2078 as the flow passes through the nozzle frame 2072 toward the impeller 2076. The nozzle direction 2078 can be selected to be at an angle θT offset from axial A such that the nozzle direction 2078 is substantially perpendicular to the faces of the impeller blades 2077. The closer the nozzle outlets are to the plane of the impeller blades 2077 and the more radially-directed the flow from the nozzles, the more torque can be imparted upon the impeller blades 2077. For example, the nozzle outlets can be positioned close to the impeller blades 2077 in the axial direction and can direct physiological saline at a highly radial angle toward impeller blades 2077 whose surfaces are close to parallel to the axis of rotation of the impeller 2076.
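The influence of the nozzle angle θT on the torque delivered to the impeller 2076 can be sketched with an idealized momentum-flux (Euler turbine) estimate, T ≈ ṁ·v·sin(θT)·r. The code below is a simplified, hypothetical calculation only; the mass flow, jet speed, angle, and blade radius are assumed values and do not describe the actual turbine design.

```python
# Idealized Euler-turbine estimate of the effect described above: turning the jet
# further from the axial direction A (larger theta_T) and delivering it at the blade
# radius increases the angular momentum, and hence torque, imparted to the impeller.
# All numeric values are hypothetical.
import math

def ideal_torque(mass_flow_kg_s: float, jet_speed_m_s: float,
                 theta_t_deg: float, blade_radius_m: float) -> float:
    """Torque from the tangential component of the nozzle jet, in N*m."""
    tangential_velocity = jet_speed_m_s * math.sin(math.radians(theta_t_deg))
    return mass_flow_kg_s * tangential_velocity * blade_radius_m

# Assumed saline jet: 5 g/s at 40 m/s, nozzles angled 70 degrees from axial,
# striking blades at a 3 mm mean radius.
torque = ideal_torque(0.005, 40.0, 70.0, 0.003)
print(f"estimated torque ~ {torque * 1e3:.3f} mN*m")
```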
In some cases, utilizing a plurality of circumferentially-distributed turbine nozzles 2073 to drive a plurality of impeller blades 2077 can increase the torque output of the impeller 2076 as compared to a configuration wherein only one turbine nozzle 2073 is utilized. In some such configurations, the outer diameters of the nozzle frame 2072 and the impeller 2076 can be smaller than in a single-nozzle configuration of equal output torque.
In some embodiments, the hydraulic turbine 2070 can be configured to operate at rotational speeds of 40,000 rpm to 60,000 rpm, though higher and lower rpm values may be possible. In some embodiments, the hydraulic turbine 2070 is configured to operate at a rotational speed of 100,000 rpm. The hydraulic turbine 2070 can be configured to operate at operating pressures between 70 psi and 190 psi, though greater and lesser operating pressures are possible. In some embodiments, the operating pressure of the hydraulic turbine 2070 is designed to be approximately 120 psi.
As illustrated in
As illustrated in
In some embodiments, multiple impellers 2076 (e.g., multiple turbine wheels) can be utilized in the same turbine housing 2071. In some such embodiments, the overall diameter of the turbine 2070 and/or some of its components can be reduced relative to a single-impeller turbine 2070 without sacrificing output torque.
Some instruments such as surgical tools use torque or mechanical force to translate manual input into tool actuation. For example, a Kerrison for bone removal generally includes a handle mechanically coupled to a head including a stationary portion and a movable portion. When a user squeezes the handle, the movable portion moves closer to the stationary portion in a cutting manner (e.g., in a shearing manner), for example to remove bone by trapping the removed bone between the stationary portion and the movable portion (e.g., within a channel between the stationary portion and the movable portion). Other examples of tools include an aneurysm clipper, a rongeur, forceps, scissors, and the like, although many other hand-operated tools are known to those skilled in the art. Referring again to the Kerrison, the pace and force of the squeezing translates to the pace and force of the cutting, and this phenomenon is also applicable to other hand-operated tools. This translation can be disadvantageous, for example varying based on each user, being too slow or too fast or having variable speed, lacking force or imparting too much force or having variable force, etc. Additionally, periodic use of such manually operated tools (e.g., during a lengthy operation or procedure) can lead to hand fatigue of the surgeon or user. Manual actuation can also lead to inadvertent movement of the tool tip.
The hydraulic system may also be used for other purposes, such as cleaning optics and/or cooling light emitters for illuminating a surgical site.
Pop-Off Valve for Camera Cleaning/Cleaning Based on Detection of Blood
In some embodiments, a fluid reservoir can be fluidly connected to one or more fluid outlets (e.g., nozzles) configured to wash optical components (e.g., cameras, LEDs, and/or other components disposed in the surgical site that can be dirtied by blood, bodily fluids, or debris). The fluid reservoir can be fluidly connected to the fluid outlets via a hydraulic manifold in the console. In some embodiments, one or more valves (e.g., proportional, elastomeric, and/or on/off valves) can be positioned in the fluid path between the fluid reservoir and the fluid outlets.
For example, a pulsing valve (e.g., a pop-off valve) can be positioned in a fluid path between the fluid outlets and the fluid reservoir. In some embodiments, the pulsing valve is disposable. The pulsing valve is configured to open when a pressure threshold is reached. In various embodiments, the valve closes once the pressure returns to below the threshold. In some embodiments, the operation of the pulsing valve can produce a substantially square-wave fluid pulse. The pulsing valve can be used to provide short-duration liquid pulses to wash optics such as the camera optics. One benefit of such a short pulse is a reduction in the image distortion that would otherwise interrupt the usable video stream provided to the surgeon. Such degradation of the video can be caused by flowing liquid across the camera for a noticeable period of time during which the image is distorted. In various embodiments, the pulsing or pop-off valve can be located so as to wash all the cameras at the same time. The pulsing valve, for example, can be located upstream of where the line splits into different fluid outlets to clean different cameras.
To activate the pulsing valve, the pressure may be increased beyond the threshold for opening the pulsing valve. The pulsing valve may, for example, be connected to a high-pressure source of fluid via a valve. That valve can be opened sufficiently to provide increased pressure beyond the threshold value, resulting in a pulse of liquid from the pulsing valve. In certain embodiments, the liquid pulses are produced periodically. For example, a processor may cause the valve to be opened periodically based on a schedule programmed into the processor or selected by the user via the processor. In some embodiments, liquid pulses are produced when blood or another obstruction is detected. The intensity level of the camera image can be monitored to determine when visibility is compromised. In various cases, the color of the light reaching the camera can be analyzed to determine, for example, that blood is obstructing or impairing vision of the cameras and to thereby trigger pulse washing. The processor could be utilized to analyze the image signal and determine whether pulse washing is to be initiated. In some embodiments, attenuation of green wavelengths in comparison to other wavelengths such as red may indicate that blood is on the camera and reducing the amount of light entering the camera.
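One possible, non-limiting implementation of the wash-trigger logic described above is sketched below. It assumes an RGB frame is available as an array and that get_frame, open_supply_valve, and close_supply_valve are hypothetical interfaces to the camera and to the valve feeding the pulsing valve; the color-ratio threshold, pulse duration, and wash schedule are illustrative assumptions only.

```python
# Sketch of a scheduled and image-triggered wash pulse, under assumed interfaces.
import time
import numpy as np

GREEN_TO_RED_THRESHOLD = 0.5   # hypothetical: blood transmits red and absorbs green,
                               # so a low green/red ratio suggests blood on the optics
PULSE_SECONDS = 0.05           # hypothetical opening long enough to pop the pulsing valve
WASH_PERIOD_SECONDS = 30.0     # hypothetical schedule for routine washes

def blood_likely(frame_rgb: np.ndarray) -> bool:
    """Return True when green is strongly attenuated relative to red."""
    red = float(frame_rgb[..., 0].mean()) + 1e-6
    green = float(frame_rgb[..., 1].mean())
    return (green / red) < GREEN_TO_RED_THRESHOLD

def pulse_wash(open_supply_valve, close_supply_valve) -> None:
    """Briefly raise line pressure above the pulsing-valve threshold, then release."""
    open_supply_valve()
    time.sleep(PULSE_SECONDS)
    close_supply_valve()

def monitor(get_frame, open_supply_valve, close_supply_valve) -> None:
    """Trigger a wash pulse on a schedule or whenever the image suggests blood."""
    last_pulse = time.monotonic()
    while True:
        due = (time.monotonic() - last_pulse) > WASH_PERIOD_SECONDS
        if due or blood_likely(get_frame()):
            pulse_wash(open_supply_valve, close_supply_valve)
            last_pulse = time.monotonic()
```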
A three-way valve can also be employed. The three-way valve can operate to permit either fluid from the fluid reservoir or gas, such as pressurized gas from a pump, to reach the fluid outlets via the pulsing valve. The three-way valve can be configured to selectively shut off the supply of fluid from the fluid reservoir and instead provide gas (e.g., pressurized gas from a pump). In some embodiments, shutting off fluid from the fluid reservoir to the fluid outlets can reduce the likelihood of “dribbling” or other inadvertent leakage of liquid from the fluid outlets onto the optical components. The pressurized air stream used to dry the camera optics can carry away any residual liquid that would otherwise dribble onto the camera at a later time. Depending on the design, the three-way valve can be positioned upstream or downstream of the pulsing valve. In some embodiments, positioning the three-way valve downstream of the pop-off valve may further reduce dribbling.
In some embodiments, gas can be directed to the fluid outlets. For example, the pneumatic assembly can be configured to direct pressurized pneumatic gas (e.g., air) to the fluid outlets. The pneumatic gas can be supplied by a pump or other source of pressurized pneumatic gas. For example, an air pump can be positioned within the hydraulic manifold in the console or elsewhere in the hydraulic pressure circuit therein. In some embodiments, a pulsing valve is configured to open when a pressure threshold in the pneumatic gas is reached. The pulsing valve can produce a substantially square-wave pneumatic gas pulse. In some embodiments, the pressurized air output by the fluid outlets creates a squeegeeing effect wherein the pneumatic gas effectively squeegees the optical components to dry them.
Suction Cassette
In some embodiments, a console can include a medical suction system. As illustrated in
In some embodiments, the medical suction system 12000 includes a first suction line 12080a. The medical suction system 12000 can, in some embodiments, include a second suction line 12080b. The suction cassette assembly 12020 can be configured to utilize the first and second suction lines 12080a, 12080b simultaneously and/or in isolation.
As illustrated, the medical suction system 12000 can include a vacuum source 12040. In some embodiments, the vacuum source 12040 is a hospital vacuum source. In some embodiments, the vacuum source 12040 is a pump. Many variations are possible. In some embodiments, the vacuum source 12040 is fluidly connected to the suction cassette assembly 12020 via a first vacuum line 12044a. In some embodiments, a second vacuum line 12044b connects the vacuum source 12040 to the suction cassette 12020. The medical suction system 12000 can be configured to utilize the first and second vacuum lines 12044a, 12044b simultaneously and/or in isolation.
In some embodiments, the medical suction system 12000 includes a storage tank 12060. The storage tank 12060 can be configured to store blood and/or other tissue/fluids. In some embodiments, the storage tank 12060 is fluidly connected to a waste line (not shown) or other disposal line. The storage tank 12060 can be connected to the suction cassette assembly 12020 via a first storage line (e.g., storage inlet line 12062a). In some embodiments, the storage tank 12060 is connected to the suction cassette assembly 12020 via a second storage line (e.g., storage outlet line 12062b).
The suction cassette assembly 12020 can include an intermediate tank 12024. The intermediate tank 12024 can be housed at least partially within the suction cassette assembly 12020. In some embodiments, the intermediate storage tank 12024 is fluidly connected to the second suction line 12080b. The intermediate storage tank 12024 can be configured to store medical and/or bodily fluids/tissues (e.g., blood, saline, bone matter).
In some embodiments, the intermediate storage tank 12024 is fluidly connected to the storage tank 12060. In some embodiments, a pump 12026 (e.g., a peristaltic pump or other fluid pump) can be positioned on the fluid line between the intermediate storage tank 12024 and the storage tank 12060. The pump 12026 can be configured to pull material from the intermediate storage tank 12024 into the storage tank 12060. In some embodiments, the pump 12026 is configured to reduce the likelihood of fluid/bodily material transfer from the storage tank 12060 to the intermediate storage tank 12024. The pump 12026 can be configured to permit large portions of tissue (e.g., bone portions, muscle portions, and/or other tissue) to travel from the intermediate storage tank 12024 to the storage tank 12060. In some embodiments, the pump 12026 is positioned at least partially within the suction cassette assembly 12020.
The medical suction system 12000 can include one or more filters 12022. In some embodiments, the filters 12027a, 12027b (described below) are hydrophobic and/or anti-microbial. One or more of the filters 12022 can be positioned within the suction cassette assembly 12020. In some embodiments, a filter 12022 is positioned in the fluid line between the storage tank 12060 and the vacuum source 12040. Such a filter 12022 could be configured to reduce the likelihood that liquid and/or pathogens could pass from the storage tank 12060 to the vacuum source 12040. In some embodiments, a filter 12022 can be positioned in the fluid line between the intermediate storage tank 12024 and the vacuum source 12040. Such a filter 12022 could be configured to reduce the likelihood that liquid and/or pathogens could pass from the intermediate storage tank 12024 to the vacuum source 12040.
In some embodiments, one or more valves are positioned on the vacuum lines 12044a, 12044b. For example, a first vacuum valve 12042a can be positioned on the first vacuum line 12044a. The first vacuum valve 12042a can be configured to selectively occlude the first vacuum line 12044a. A filter 12027a can be positioned in the first vacuum line 12044a. For example, the filter 12027a can be positioned at least partially within the suction cassette assembly 12020, as illustrated in
A second vacuum valve 12042b can be positioned on the second vacuum line 12044b. In some embodiments, the second vacuum valve 12042b is configured to selectively occlude the second vacuum line 12044b. A filter 12027b can be positioned in the second vacuum line 12044b. For example, the filter 12027b can be positioned at least partially within the suction cassette assembly 12020. In some embodiments, the filter 12027b is an antimicrobial and/or hydrophobic filter. The filter 12027b can be configured to reduce the likelihood that blood or other materials from the intermediate storage tank 12024 will pass into the second vacuum valve 12042b and/or into the vacuum source 12040. In some embodiments, the intermediate storage tank 12024 includes a baffle (not shown) or other structure to reduce the likelihood of material ingress from the intermediate storage tank 12024 to the filter 12027b and/or to the second vacuum line 12044b. For example, the baffle can help to block fluid from splashing directly from the second suction line 12080b to the second vacuum line 12044b.
In some embodiments, the medical suction system 12000 includes a third vacuum valve 12042c. The third vacuum valve 12042c can be positioned on the second vacuum line 12044b. In some embodiments, the third vacuum valve 12042c is fluidly connected to the second vacuum line 12044b between the intermediate storage tank 12024 and the second vacuum valve 12042b. The third vacuum valve 12042c can be used to moderate the pressure P2 within the intermediate storage tank 12024. For example, the third vacuum valve 12042c can be used to maintain the pressure P2 within the intermediate storage tank 12024 within a predetermined range (e.g., 0-550 mm Hg, 200-800 mm Hg, etc.). In some embodiments, one or more pneumatic indicators 12051 can be positioned on the vacuum lines 12044a, 12044b. The indicators 12051 can be configured to provide feedback on pneumatic parameters (e.g., pressure).
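A minimal, hypothetical sketch of the pressure-moderation role described for the third vacuum valve 12042c follows, assuming that valve acts as a bleed valve and that read_p2_mmhg, open_bleed_valve, and close_bleed_valve are hypothetical interfaces to a pressure sensor and to the valve; the band limits are illustrative values within the ranges given above.

```python
# Bang-bang sketch: hold the intermediate-tank vacuum P2 inside a target band by
# opening the assumed bleed valve when the vacuum is too strong and closing it when
# the vacuum is too weak. Band limits and interfaces are hypothetical.
P2_MIN_MMHG = 200.0   # weakest acceptable vacuum in the band
P2_MAX_MMHG = 550.0   # strongest acceptable vacuum in the band

def regulate_p2(read_p2_mmhg, open_bleed_valve, close_bleed_valve) -> None:
    """One control step for moderating P2 in the intermediate storage tank."""
    p2 = read_p2_mmhg()            # vacuum level in the intermediate tank, mm Hg
    if p2 > P2_MAX_MMHG:
        open_bleed_valve()         # admit air to weaken the vacuum
    elif p2 < P2_MIN_MMHG:
        close_bleed_valve()        # let the vacuum source pull P2 back into the band
```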
In some embodiments, the vacuum source 12040 is configured to decrease the pressure P1 within the storage tank 12060. For example, the first vacuum valve 12042a can be at least partially opened to allow for fluid communication between the vacuum source 12040 and the storage tank 12060. In some embodiments, the vacuum source 12040 is configured to maintain the pressure P1 within the storage tank 12060 below ambient pressure Pa. In some embodiments, the vacuum source 12040 maintains the pressure P1 within the storage tank 12060 at or near 600 mm Hg.
The medical suction system 12000 is configured to operate with a low-flow suction input and/or with a high-flow suction input. For example, the first suction line 12080a can be configured to operate as a high-flow suction line. In some embodiments, the first suction line 12080a is used to produce high flow at a site of interest to suction, for example, heavy bleeding. In some embodiments, the second suction line 12080b is configured to operate as a low-flow suction line. For example, the second suction line 12080b can be used to identify and coagulate low-flow bleeding on the surface of the brain without damaging the brain.
In some embodiments, valves 12028a, 12028b (e.g., proportional elastomeric valves) can be positioned on the suction lines 12080a, 12080b, respectively. For example, valves 12028a, 12028b can be used to open and/or close fluid communication between the first suction line 12080a and the storage tank 12060 and to open and/or close fluid communication between the second suction line 12080b and the intermediate storage tank 12024, respectively.
Pneumatic Drive System
In some embodiments, the pneumatic actuators 12562 are configured to operate the valves of a hydraulic pressure circuit 3200 (and/or the valves of any of the embodiments of the hydraulic pressure circuits disclosed herein). In some embodiments, the actuator 12552 is configured to operate a cassette lifter. The pneumatic actuators 12554 can be configured to operate as tube ejectors for peristaltic pumps used, for example, in the hydraulic pressure circuits disclosed herein.
As illustrated in
Apparatus for Moving a Component
As illustrated in
The guide assembly 10230 can include multiple guide connectors 10250, 10255 which can be used to connect the multiple components of the guide assembly 10230 together. In the illustrated embodiment, the guide connectors come in pairs, although in other embodiments single guide connectors or multiple (e.g., greater than two) guide connectors can be used. These guide connectors 10250, 10255 can be designed to translate along the path of the guides. As such, in embodiments using tracks or rails as guides, the connectors 10250, 10255 can be slidably translatable along the tracks or rails and can include mechanisms such as rollers, ball bearings, or the like. During operation of the illustrated embodiment, the guide assembly 10230 can be translated relative to the upper connecting member 10210 by sliding across both the upper guide 10235 and the x-axis intermediate guide 10240. The lower connecting member 10220 can thus translate relative to the upper connecting member 10210 along the x-axis. The lower connecting member 10220 can be translated relative to the guide assembly 10230 by sliding across the y-axis intermediate guide 10245.
With continued reference to
The lower connection member 10220 can be translated relative to the upper connection member 10210 by translating the control members 10110, 10115. As described above, in various embodiments, the joints 10111, 10116 can be any joint that lacks translational degrees of freedom along at least the x-axis and the y-axis. As such, a user of the translation system 10200 can move one control member, such as control member 10110 or 10115, in the desired direction to translate the lower connection member 10220 and ultimately the component 18 which can be attached thereto. Because movement of the component 18 is linked directly to the user's physical movement of the control member 10110, 10115, dexterous users can find this type of mechanism more user-friendly and precise. Furthermore, in embodiments where the joints 10111, 10116 lack translational degrees of freedom along at least the x-axis and the y-axis, the other control member, such as control member 10110 or 10115, also translates along with the lower connection member 10220. As such, a user can control translation using any control member attached to the lower connection member 10220.
In some embodiments, the guide connectors 10250, 10255 can be designed to have a threshold friction such that the lower connection member 10220 can only translate upon a threshold force being applied to the lower connection member 10220. Requiring a threshold force to be applied prior to movement can reduce the likelihood of unintentional movement of the translation system 10200. In some embodiments, alternative control mechanisms can be used in conjunction with, or in lieu of, the control members 10110, 10115.
With reference to
The member 10310 can be configured to link the motion of multiple control members 10110, 10115 and the component arm 10120. For example, in the illustrated embodiment, when adjusting the yaw of one control member 10110, the yaw of the other control member 10115 and component arm 10120 correspondingly adjusts via mechanical movement. In much the same way, in the illustrated embodiment, when adjusting the pitch of one control member 10110, the pitch of the other control member 10115 and component arm 10120 also correspondingly adjusts. This provides the advantage of allowing a user of the pitch-yaw adjustment system 10300 to adjust pitch or yaw of the component arm 10120, and ultimately the component 18, using only one of potentially multiple control members such as control members 10110 or 10115 in the illustrated embodiment. A user may advantageously choose whichever control member 10110, 10115 is most convenient to use at the time an adjustment is necessary. Additionally, this may facilitate use by multiple users. For example, during a medical procedure, a medical professional on one side may use one control member 10110. If a second medical professional or assistant needs to make an adjustment, the second medical professional or assistant may use the other control member 10115 if more readily accessible.
In some embodiments, the member 10310 can be attached to the control members 10110, 10115 and the component arm 10120 such that adjustment of yaw and/or pitch of the control members 10110, 10115 can result in more than or less than a one-to-one adjustment in the yaw or pitch of the component arm 10120. For example, as illustrated in
In some embodiments, a link member 10305 can also be used. The link member 10305 can be designed to attach to the control members 10110 and 10115 to provide greater structural rigidity. For example, link member 10305 can be connected to control members 10110 and 10115 using joints having fewer degrees of freedom. For example, the link member 10305 can be attached to control member 10110 at joint 10113 and attached to control member 10115 at joint 10118. In some embodiments, the joint can have a single degree of rotational freedom such as a pin-and-aperture design. By limiting the degrees of freedom, the control members 10110 and 10115 can be more likely to remain co-planar when rotating in pitch and/or yaw thereby reducing the potential for twisting. In some embodiments, the link member 10305 can be used in lieu of member 10310.
In some embodiments, such as that illustrated in
With reference to
As illustrated in
As shown in
In some embodiments, the four-bar linkage assembly may include, in addition to a driven link 10470, a shaft link 10471 which can be rigidly attached to the shaft 10450, a free link 10472, and a hinge unit 10475. In some embodiments, a second four-bar linkage assembly can be included with a second free link 10480 and a third free link 10481, the hinge unit 10475 attached to the first four-bar linkage assembly, and a second hinge unit 10485 for attaching the component 18 to the component arm 10120. In such embodiments, mechanisms may be added between the first four-bar linkage assembly and the second four-bar linkage assembly such that the second four-bar linkage assembly will correspondingly rotate when the driven link 10470 is rotated. For example, the first free link 10472 and the second free link 10480 can be rotatably connected via gearing. The gearing can be chosen such that there is a one-to-one rotational transfer between the first free link 10472 and the second free link 10480. In other embodiments, the gearing can be chosen such that there is greater than a one-to-one rotational transfer or less than a one-to-one rotational transfer. As should be appreciated by one of ordinary skill in the art, the lengths of the links and the ratio of the gearing can be chosen such that the z-distance adjustment system 10400 can adjust the z-distance, that is, adjust the position of the component 18 in a direction generally parallel to the z-axis, while causing no translation, or only a small amount of translation, of the component along either the x-axis or the y-axis.
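The geometric idea behind adjusting the z-distance without x- or y-translation can be illustrated with a simplified model in which two links of equal length are geared for one-to-one counter-rotation, so the second link folds back as the first rises. The link length and angles below are assumed values, and the model ignores the offsets of a real linkage; it is a sketch of the principle only.

```python
# Simplified planar model: with equal link lengths and 1:1 counter-rotation, the
# horizontal offset of the arm tip stays fixed (up to rounding) while its height
# changes, illustrating z-adjustment without x/y translation. Values are hypothetical.
import math

def tip_position(theta_deg: float, link_len_mm: float = 50.0):
    """(x, z) of the arm tip for a 1:1 counter-rotating pair of equal links, in mm."""
    theta = math.radians(theta_deg)
    # First link rotates up by theta; the geared second link folds back by the same
    # amount, i.e. it sits at (180 degrees - theta) from the same datum.
    x = link_len_mm * (math.cos(theta) + math.cos(math.pi - theta))
    z = link_len_mm * (math.sin(theta) + math.sin(math.pi - theta))
    return x, z

for angle in (10, 30, 60):
    x, z = tip_position(angle)
    print(f"theta={angle:2d} deg -> x={x:.1e} mm (about zero), z={z:.1f} mm")
```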
The four-bar linkage component arm 10120 provides the advantage of reducing the form factor when the z-distance is greater. Due to the compact form factor of the four-bar linkage component arm 10120, the component 18 can be placed closer to the lower connecting member 10220.
In other embodiments, the component arm 10500 can include a screw-drive assembly for adjusting the z-distance. In this embodiment, the component arm 10500 can include two strut members 10505, 10510, rigidly attached to a first end 10515 and a second end 10520 and designed to support the component arm 10500; a screw member 10525 rigidly attached to the first end 10515, the second end 10520, and the component 18; and a threaded torque-receiving joint 10530, such as the illustrated capstan, at one end which can be received within aperture 10416 (see
The z-distance adjustment system 10400 can be operated by using the control members 10110, 10115. In the illustrated embodiment, control member 10110 can be rotatably coupled to transmitting drive member 10405 via aperture 10406, such as the illustrated keyed aperture, and control member 10115 can be rotatably coupled to transmitting drive member 10410 via aperture 10411. In some embodiments, the rotatable coupling can be achieved through use of a capstan, such as 10112 and 10117 of
As should be apparent to one of ordinary skill in the art, the radius of the transmitting drive members 10405, 10410 and receiving drive member 10415 can be chosen to modify the speed of the z-distance adjustment. For example, in some embodiments, the transmitting drive members 10405, 10410 can have a larger radius than the receiving drive member 10415 such that, for a single revolution of a control member 10110, 10115, the receiving drive member 10415 rotates more than one revolution.
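As a simple, non-limiting illustration of that speed relationship, assuming a cable or belt coupling between the members, the receiving drive member 10415 turns approximately (transmitting radius)/(receiving radius) revolutions per revolution of a control member; the radii used below are assumed values.

```python
# Hypothetical sketch of the drive-ratio relationship described above.
def receiving_revolutions(control_revs: float,
                          transmit_radius_mm: float,
                          receive_radius_mm: float) -> float:
    """Revolutions of the receiving drive member for a given control-member input."""
    return control_revs * (transmit_radius_mm / receive_radius_mm)

# Assumed example: a 20 mm transmitting member driving a 10 mm receiving member doubles
# the rotation, so one turn of the control member yields two turns at the z-adjustment.
print(receiving_revolutions(1.0, 20.0, 10.0))   # -> 2.0
```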
Various embodiments may comprise a retractor, use a retractor, or be configured to be used with a retractor and not an endoscope, laparoscope, or arthroscope. Similarly, many embodiments comprise retractors, use retractors, or are configured to be used with retractors wherein cameras are disposed on the retractors, not on an endoscope, laparoscope, or arthroscope.
In various embodiments, the retractor fits in an opening in a manner that provides ample room for the surgeon to operate but does not provide a gas seal for pumping up a cavity as a laparoscope may. Similarly, in many embodiments the retractor does not maintain alignment of the layers of tissue that are cut through to form the incision.
In various embodiments, the retractor is not employed as a fulcrum for surgical tools.
In various embodiments, the camera(s) can be positioned on top of the retractor, near and above the body surface, or on the retractor at a depth within the surgical site (e.g., at the far distal end of the retractor in the deepest portion of the surgical site, or elsewhere on the retractor, such as more proximally).
Many embodiments are employed for spine surgery, neurosurgery, head and/or neck surgery, and ear, nose, and throat surgery, and many embodiments involve the cutting and extraction of bone, for example, through the pathway provided by the retractor.
Various embodiments, however, may be used with devices other than retractors.
Many other embodiments are possible, including numerous combinations of the above recited features.
The embodiments described herein can differ from those specifically shown. For example, various elements of the different embodiments may be combined and/or rearranged. Components can be added, removed and/or rearranged. A wide variety of variations are possible.
Unless otherwise indicated, the functions described herein may be performed by software (e.g., including modules) comprising executable code and instructions running on one or more systems including one or more computers. The software may be stored in computer readable media, for example, some or all of the following: optical media (e.g., CD-ROM, DVD, Blu-ray, etc.), magnetic media (e.g., fixed or removable magnetic media), semiconductor memory (e.g., RAM, ROM, Flash memory, EPROM, etc.), and/or other types of computer readable media.
The one or more computers can include one or more central processing units (CPUs) that execute program code and process data; non-transitory, tangible memory, including, for example, one or more of volatile memory, such as random access memory (RAM), for temporarily storing data and data structures during program execution, and non-volatile memory, such as a hard disc drive, optical drive, or FLASH drive, for storing programs and data, including databases; a wired and/or wireless network interface for accessing an intranet and/or Internet; and/or other interfaces.
In addition, the computers can include a display for displaying user interfaces, data, and the like, and one or more user input devices, such as a keyboard, mouse, pointing device, touch screen, microphone and/or the like, used to navigate, provide commands, enter information, provide search queries, and/or the like. The systems described herein can also be implemented using general-purpose computers, special purpose computers, terminals, state machines, and/or hardwired electronic circuits.
Various embodiments provide for communications between one or more systems and one or more users. These user communications may be provided to a user terminal (e.g., an interactive television, a phone, a laptop/desktop computer, a device providing Internet access, or other networked device). For example, communications may be provided via Webpages, downloaded documents, email, SMS (short messaging service) message, MMS (multimedia messaging service) message, terminal vibrations, other forms of electronic communication, text-to-speech message, or otherwise.
Various modifications to the implementations described in this disclosure may be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other implementations without departing from the spirit or scope of this disclosure. Thus, the claims are not intended to be limited to the implementations shown herein, but are to be accorded the widest scope consistent with this disclosure, the principles and the novel features disclosed herein.
Certain features that are described in this specification in the context of separate embodiments also can be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment also can be implemented in multiple embodiments separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
Similarly, while operations may be described as occurring in a particular order, this should not be understood as requiring that such operations be performed in the particular order described or in sequential order, or that all described operations be performed, to achieve desirable results. Further, other operations that are not disclosed can be incorporated in the processes that are described herein. For example, one or more additional operations can be performed before, after, simultaneously, or between any of the disclosed operations. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single product or packaged into multiple products. Additionally, other embodiments are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results.
This application claims the benefit of priority to U.S. Prov. App. No. 61/880,808, entitled “SURGICAL VISUALIZATION SYSTEMS”, filed Sep. 20, 2013; to U.S. Prov. App. No. 61/920,451, entitled “SURGICAL VISUALIZATION SYSTEMS”, filed Dec. 23, 2013; to U.S. Prov. App. No. 61/921,051, entitled “SURGICAL VISUALIZATION SYSTEMS”, filed Dec. 26, 2013; to U.S. Prov. App. No. 61/921,389, entitled “SURGICAL VISUALIZATION SYSTEMS”, filed Dec. 27, 2013; to U.S. Prov. App. No. 61/922,068, entitled “SURGICAL VISUALIZATION SYSTEMS”, filed Dec. 30, 2013; and to U.S. Prov. App. No. 61/923,188, entitled “SURGICAL VISUALIZATION SYSTEMS”, filed Jan. 2, 2014.
Number | Name | Date | Kind |
---|---|---|---|
497064 | Van Meter | May 1893 | A |
2826114 | Bryan | Mar 1958 | A |
3050870 | Heilig | Aug 1962 | A |
3108781 | Saffir | Oct 1963 | A |
3128988 | Mandroian | Apr 1964 | A |
3141650 | Saffir | Jul 1964 | A |
3405990 | Nothnagle et al. | Oct 1968 | A |
3409346 | Stapsy | Nov 1968 | A |
3664330 | Deutsch | May 1972 | A |
4056310 | Shimizu et al. | Nov 1977 | A |
4063557 | Wuchinich et al. | Dec 1977 | A |
4087198 | Theis, Jr. | May 1978 | A |
4167302 | Karasawa | Sep 1979 | A |
4176453 | Abbott | Dec 1979 | A |
4223676 | Wuchinich et al. | Sep 1980 | A |
4226228 | Shin et al. | Oct 1980 | A |
4344746 | Leonard | Aug 1982 | A |
4354734 | Nkahashi | Oct 1982 | A |
4395731 | Schoolman | Jul 1983 | A |
4562832 | Wilder et al. | Jan 1986 | A |
4651201 | Schoolman | Mar 1987 | A |
4655557 | Takahashi | Apr 1987 | A |
4665391 | Spani | May 1987 | A |
4684224 | Yamashita et al. | Aug 1987 | A |
4703314 | Spani | Oct 1987 | A |
4718106 | Weinblatt | Jan 1988 | A |
4750488 | Wuchinich et al. | Jun 1988 | A |
4750902 | Wuchinich et al. | Jun 1988 | A |
4779968 | Sander | Oct 1988 | A |
4783156 | Yokota | Nov 1988 | A |
4786155 | Fantone et al. | Nov 1988 | A |
4813927 | Morris et al. | Mar 1989 | A |
4873572 | Miyazaki et al. | Oct 1989 | A |
4900301 | Morris et al. | Feb 1990 | A |
4905670 | Adair | Mar 1990 | A |
4920336 | Meijer | Apr 1990 | A |
4922902 | Wuchinich et al. | May 1990 | A |
4986622 | Martinez | Jan 1991 | A |
4989452 | Toon et al. | Feb 1991 | A |
5032111 | Morris et al. | Jul 1991 | A |
5047009 | Morris et al. | Sep 1991 | A |
5098426 | Sklar et al. | Mar 1992 | A |
5143054 | Adair | Sep 1992 | A |
5151821 | Marks | Sep 1992 | A |
5176677 | Wuchinich et al. | Jan 1993 | A |
5201325 | McEwen et al. | Apr 1993 | A |
5221282 | Wuchinich | Jun 1993 | A |
5251613 | Adair | Oct 1993 | A |
5327283 | Zobel | Jul 1994 | A |
5354314 | Hardy et al. | Oct 1994 | A |
5417210 | Funda | May 1995 | A |
5441059 | Dannan | Aug 1995 | A |
5464008 | Kim | Nov 1995 | A |
5523810 | Volk | Jun 1996 | A |
5537164 | Smith | Jul 1996 | A |
5553995 | Martinez | Sep 1996 | A |
5575789 | Bell et al. | Nov 1996 | A |
5584796 | Cohen | Dec 1996 | A |
5593402 | Patrick | Jan 1997 | A |
5601549 | Miyagi | Feb 1997 | A |
5625493 | Matsumura et al. | Apr 1997 | A |
5634790 | Pathmanabhan et al. | Jun 1997 | A |
5667481 | Villalta et al. | Sep 1997 | A |
5697891 | Hori | Dec 1997 | A |
5712995 | Cohn | Jan 1998 | A |
5716326 | Dannan | Feb 1998 | A |
5743731 | Lares et al. | Apr 1998 | A |
5743846 | Takahashi et al. | Apr 1998 | A |
5747824 | Jung et al. | May 1998 | A |
5751341 | Chaleki | May 1998 | A |
5797403 | DiLorenzo | Aug 1998 | A |
5803733 | Trott et al. | Sep 1998 | A |
5822036 | Massie et al. | Oct 1998 | A |
5825534 | Strahle | Oct 1998 | A |
5835266 | Kitajima | Nov 1998 | A |
5841510 | Roggy | Nov 1998 | A |
5861983 | Twisselman | Jan 1999 | A |
5889611 | Zonneveld | Mar 1999 | A |
5897491 | Kastenbauer et al. | Apr 1999 | A |
5909380 | Dubois | Jun 1999 | A |
5913818 | Co et al. | Jun 1999 | A |
5928139 | Koros et al. | Jul 1999 | A |
5949388 | Atsumi | Sep 1999 | A |
5982532 | Mittelstadt et al. | Nov 1999 | A |
6016607 | Morimoto et al. | Jan 2000 | A |
6023638 | Swanson | Feb 2000 | A |
6088154 | Morita | Jul 2000 | A |
6139493 | Koros et al. | Oct 2000 | A |
6152736 | Schmidinger | Nov 2000 | A |
6152871 | Foley et al. | Nov 2000 | A |
6176825 | Chin et al. | Jan 2001 | B1 |
6217188 | Wainwright et al. | Apr 2001 | B1 |
6246898 | Vesely et al. | Jun 2001 | B1 |
6293911 | Imaizumi et al. | Sep 2001 | B1 |
6317260 | Ito | Nov 2001 | B1 |
6319223 | Wortrich et al. | Nov 2001 | B1 |
6340363 | Bolger et al. | Jan 2002 | B1 |
6350235 | Cohen et al. | Feb 2002 | B1 |
6354992 | Kato | Mar 2002 | B1 |
6398721 | Nakamura | Jun 2002 | B1 |
6405072 | Cosman | Jun 2002 | B1 |
6434329 | Dube et al. | Aug 2002 | B1 |
6443594 | Marshall et al. | Sep 2002 | B1 |
6450706 | Chapman | Sep 2002 | B1 |
6450950 | Irion | Sep 2002 | B2 |
6491661 | Boukhny et al. | Dec 2002 | B1 |
6508759 | Taylor et al. | Jan 2003 | B1 |
6517207 | Chapman | Feb 2003 | B2 |
6525310 | Dunfield | Feb 2003 | B2 |
6525878 | Takahashi | Feb 2003 | B1 |
6527704 | Chang et al. | Mar 2003 | B1 |
6538665 | Crow et al. | Mar 2003 | B2 |
6549341 | Nomura et al. | Apr 2003 | B2 |
6561999 | Nazarifar et al. | May 2003 | B1 |
6582358 | Akui et al. | Jun 2003 | B2 |
6587711 | Alfano et al. | Jul 2003 | B1 |
6589168 | Thompson | Jul 2003 | B2 |
6618207 | Lei | Sep 2003 | B2 |
6626445 | Murphy et al. | Sep 2003 | B2 |
6633328 | Byrd et al. | Oct 2003 | B1 |
6635010 | Lederer | Oct 2003 | B1 |
6636254 | Onishi et al. | Oct 2003 | B1 |
6661571 | Shioda et al. | Dec 2003 | B1 |
6668841 | Chou | Dec 2003 | B1 |
6698886 | Pollack et al. | Mar 2004 | B2 |
6720988 | Gere et al. | Apr 2004 | B1 |
6757021 | Nguyen-Nhu | Jun 2004 | B1 |
6805127 | Karasic | Oct 2004 | B1 |
6817975 | Farr et al. | Nov 2004 | B1 |
6824525 | Nazarifar et al. | Nov 2004 | B2 |
6847336 | Lemelson et al. | Jan 2005 | B1 |
6869398 | Obenchain et al. | Mar 2005 | B2 |
6873867 | Vilsmeier | Mar 2005 | B2 |
6892597 | Tews | May 2005 | B2 |
6903883 | Amanai | Jun 2005 | B2 |
6908451 | Brody et al. | Jun 2005 | B2 |
6985765 | Morita | Jan 2006 | B2 |
6996460 | Krahnstoever et al. | Feb 2006 | B1 |
7034983 | Desimone et al. | Apr 2006 | B2 |
7050225 | Nakamura | May 2006 | B2 |
7050245 | Tesar et al. | May 2006 | B2 |
7054076 | Tesar et al. | May 2006 | B2 |
7116437 | Weinstein et al. | Oct 2006 | B2 |
7125119 | Farberov | Oct 2006 | B2 |
7150713 | Shener et al. | Dec 2006 | B2 |
7150714 | Myles | Dec 2006 | B2 |
7154527 | Goldstein et al. | Dec 2006 | B1 |
7155316 | Sutherland | Dec 2006 | B2 |
7163543 | Smedley et al. | Jan 2007 | B2 |
7226451 | Shluzas et al. | Jun 2007 | B2 |
7244240 | Nazarifar et al. | Jul 2007 | B2 |
7278092 | Krzanowski | Oct 2007 | B2 |
7298393 | Morita | Nov 2007 | B2 |
7306559 | Williams | Dec 2007 | B2 |
7307799 | Minefuji | Dec 2007 | B2 |
7326183 | Nazarifar et al. | Feb 2008 | B2 |
7471301 | Lefevre | Dec 2008 | B2 |
7480872 | Ubillos | Jan 2009 | B1 |
7494463 | Nehls | Feb 2009 | B2 |
7518791 | Sander | Apr 2009 | B2 |
7537565 | Bass | May 2009 | B2 |
7538939 | Zimmerman et al. | May 2009 | B2 |
7559887 | Dannan | Jul 2009 | B2 |
7621868 | Breidenthal et al. | Nov 2009 | B2 |
7633676 | Brunner et al. | Dec 2009 | B2 |
7644889 | Johnson | Jan 2010 | B2 |
7651465 | Sperling et al. | Jan 2010 | B1 |
7713237 | Nazarifar et al. | May 2010 | B2 |
7764370 | Williams et al. | Jul 2010 | B2 |
7766480 | Graham et al. | Aug 2010 | B1 |
7777941 | Zimmer | Aug 2010 | B2 |
7785253 | Arambula | Aug 2010 | B1 |
7786457 | Gao | Aug 2010 | B2 |
7806865 | Wilson | Oct 2010 | B1 |
7844320 | Shahidi | Nov 2010 | B2 |
7872746 | Gao et al. | Jan 2011 | B2 |
7874982 | Selover | Jan 2011 | B2 |
7896839 | Nazarifar et al. | Mar 2011 | B2 |
7907336 | Abele et al. | Mar 2011 | B2 |
7927272 | Bayer et al. | Apr 2011 | B2 |
7932925 | Inbar et al. | Apr 2011 | B2 |
7956341 | Gao | Jun 2011 | B2 |
8009141 | Chi et al. | Aug 2011 | B1 |
8012089 | Bayat | Sep 2011 | B2 |
8018523 | Choi | Sep 2011 | B2 |
8018579 | Krah | Sep 2011 | B1 |
8027710 | Dannan | Sep 2011 | B1 |
8038612 | Paz | Oct 2011 | B2 |
8070290 | Gille et al. | Dec 2011 | B2 |
8088066 | Grey | Jan 2012 | B2 |
8136779 | Wilson et al. | Mar 2012 | B2 |
8149270 | Yaron et al. | Apr 2012 | B1 |
8159743 | Abele et al. | Apr 2012 | B2 |
8169468 | Scott et al. | May 2012 | B2 |
8187167 | Kim | May 2012 | B2 |
8187180 | Pacey | May 2012 | B2 |
8194121 | Blumzvig et al. | Jun 2012 | B2 |
8221304 | Shioda et al. | Jul 2012 | B2 |
8229548 | Frangioni | Jul 2012 | B2 |
8294733 | Eino | Oct 2012 | B2 |
8295693 | McDowall | Oct 2012 | B2 |
8351434 | Fukuda et al. | Jan 2013 | B1 |
8358330 | Riederer | Jan 2013 | B2 |
8405733 | Saijo | Mar 2013 | B2 |
8408772 | Li | Apr 2013 | B2 |
8409088 | Grey et al. | Apr 2013 | B2 |
8419633 | Koshikawa et al. | Apr 2013 | B2 |
8419634 | Nearman et al. | Apr 2013 | B2 |
8430840 | Nazarifar et al. | Apr 2013 | B2 |
8439830 | McKinley et al. | May 2013 | B2 |
8460184 | Nearman et al. | Jun 2013 | B2 |
8464177 | Ben-Yoseph | Jun 2013 | B2 |
8482606 | Razzaque | Jul 2013 | B2 |
8498695 | Westwick et al. | Jul 2013 | B2 |
8521331 | Itkowitz | Aug 2013 | B2 |
8702592 | Langlois et al. | Apr 2014 | B2 |
8702602 | Berci et al. | Apr 2014 | B2 |
8734328 | McDowall | May 2014 | B2 |
8786946 | Nakamura | Jul 2014 | B2 |
8827899 | Farr et al. | Sep 2014 | B2 |
8827902 | Dietze, Jr. et al. | Sep 2014 | B2 |
8836723 | Tsao et al. | Sep 2014 | B2 |
8858425 | Farr et al. | Oct 2014 | B2 |
8876711 | Lin et al. | Nov 2014 | B2 |
8878924 | Farr | Nov 2014 | B2 |
8882662 | Charles | Nov 2014 | B2 |
8976238 | Ernsperger et al. | Mar 2015 | B2 |
8979301 | Moore | Mar 2015 | B2 |
9033870 | Farr et al. | May 2015 | B2 |
9216068 | Tesar | Dec 2015 | B2 |
9492065 | Tesar et al. | Nov 2016 | B2 |
9615728 | Charles et al. | Apr 2017 | B2 |
9629523 | Tesar et al. | Apr 2017 | B2 |
9642606 | Charles et al. | May 2017 | B2 |
9681796 | Tesar et al. | Jun 2017 | B2 |
9723976 | Tesar | Aug 2017 | B2 |
9782159 | Tesar | Oct 2017 | B2 |
9936863 | Tesar | Apr 2018 | B2 |
10022041 | Charles et al. | Jul 2018 | B2 |
10028651 | Tesar | Jul 2018 | B2 |
10231607 | Charles et al. | Mar 2019 | B2 |
10555728 | Charles et al. | Feb 2020 | B2 |
10568499 | Tesar | Feb 2020 | B2 |
10702353 | Tesar | Jul 2020 | B2 |
20010055062 | Shioda et al. | Dec 2001 | A1 |
20020013514 | Brau | Jan 2002 | A1 |
20020049367 | Irion et al. | Apr 2002 | A1 |
20020065461 | Cosman | May 2002 | A1 |
20020082498 | Wendt et al. | Jun 2002 | A1 |
20030055410 | Evans et al. | Mar 2003 | A1 |
20030059097 | Abovitz et al. | Mar 2003 | A1 |
20030078494 | Panescu et al. | Apr 2003 | A1 |
20030088179 | Seeley et al. | May 2003 | A1 |
20030102819 | Min et al. | Jun 2003 | A1 |
20030103191 | Staurenghi et al. | Jun 2003 | A1 |
20030142204 | Rus et al. | Jul 2003 | A1 |
20030147254 | Yoneda et al. | Aug 2003 | A1 |
20040017607 | Hauger et al. | Jan 2004 | A1 |
20040027652 | Erdogan et al. | Feb 2004 | A1 |
20040036962 | Brunner et al. | Feb 2004 | A1 |
20040070822 | Shioda et al. | Apr 2004 | A1 |
20040087833 | Bauer et al. | May 2004 | A1 |
20040111183 | Sutherland | Jun 2004 | A1 |
20040196553 | Banju et al. | Oct 2004 | A1 |
20040230191 | Frey et al. | Nov 2004 | A1 |
20050018280 | Richardson | Jan 2005 | A1 |
20050019722 | Schmid et al. | Jan 2005 | A1 |
20050026104 | Takahashi | Feb 2005 | A1 |
20050031192 | Sieckmann | Feb 2005 | A1 |
20050033117 | Ozaki et al. | Feb 2005 | A1 |
20050052527 | Remy et al. | Mar 2005 | A1 |
20050063047 | Obrebski et al. | Mar 2005 | A1 |
20050064936 | Pryor | Mar 2005 | A1 |
20050065435 | Rauch et al. | Mar 2005 | A1 |
20050095554 | Wilkinson | May 2005 | A1 |
20050107808 | Evans et al. | May 2005 | A1 |
20050171551 | Sukovich et al. | Aug 2005 | A1 |
20050215866 | Kim | Sep 2005 | A1 |
20050228232 | Gillinov et al. | Oct 2005 | A1 |
20050279355 | Loubser | Dec 2005 | A1 |
20060004261 | Douglas | Jan 2006 | A1 |
20060020213 | Whitman et al. | Jan 2006 | A1 |
20060025656 | Buckner et al. | Feb 2006 | A1 |
20060069315 | Miles et al. | Mar 2006 | A1 |
20060069316 | Dorfman et al. | Mar 2006 | A1 |
20060085969 | Bennett et al. | Apr 2006 | A1 |
20060092178 | Tanguya, Jr. et al. | May 2006 | A1 |
20060114411 | Wei et al. | Jun 2006 | A1 |
20060129140 | Todd et al. | Jun 2006 | A1 |
20060152516 | Plummer | Jul 2006 | A1 |
20060235279 | Hawkes et al. | Oct 2006 | A1 |
20060236264 | Cain et al. | Oct 2006 | A1 |
20060241499 | Irion et al. | Oct 2006 | A1 |
20060276693 | Pacey | Dec 2006 | A1 |
20060293557 | Chuanggui et al. | Dec 2006 | A1 |
20070010716 | Malandain | Jan 2007 | A1 |
20070019916 | Takami | Jan 2007 | A1 |
20070038080 | Salisbury, Jr. et al. | Feb 2007 | A1 |
20070086205 | Krupa et al. | Apr 2007 | A1 |
20070129608 | Sandhu | Jun 2007 | A1 |
20070129719 | Kendale | Jun 2007 | A1 |
20070153541 | Bennett et al. | Jul 2007 | A1 |
20070173853 | MacMillan | Jul 2007 | A1 |
20070238932 | Jones et al. | Oct 2007 | A1 |
20070282171 | Karpowicz et al. | Dec 2007 | A1 |
20080015417 | Hawkes et al. | Jan 2008 | A1 |
20080058606 | Miles et al. | Mar 2008 | A1 |
20080081947 | Irion et al. | Apr 2008 | A1 |
20080091066 | Sholev | Apr 2008 | A1 |
20080094583 | Williams et al. | Apr 2008 | A1 |
20080096165 | Virnicchi et al. | Apr 2008 | A1 |
20080097467 | Gruber et al. | Apr 2008 | A1 |
20080123183 | Awdeh | May 2008 | A1 |
20080151041 | Shafer et al. | Jun 2008 | A1 |
20080183038 | Tilson et al. | Jul 2008 | A1 |
20080195128 | Orbay et al. | Aug 2008 | A1 |
20080221394 | Melkent et al. | Sep 2008 | A1 |
20080221591 | Farritor et al. | Sep 2008 | A1 |
20080266840 | Nordmeyer et al. | Oct 2008 | A1 |
20080269564 | Gelnett | Oct 2008 | A1 |
20080269730 | Dotson | Oct 2008 | A1 |
20080278571 | Mora | Nov 2008 | A1 |
20080300465 | Feigenwinter et al. | Dec 2008 | A1 |
20080303899 | Berci | Dec 2008 | A1 |
20080310181 | Gurevich et al. | Dec 2008 | A1 |
20080319266 | Poll et al. | Dec 2008 | A1 |
20090030436 | Charles | Jan 2009 | A1 |
20090034286 | Krupa et al. | Feb 2009 | A1 |
20090040783 | Krupa et al. | Feb 2009 | A1 |
20090052059 | Lin | Feb 2009 | A1 |
20090105543 | Miller et al. | Apr 2009 | A1 |
20090137893 | Seibel et al. | May 2009 | A1 |
20090137989 | Kataoka | May 2009 | A1 |
20090149716 | Diao et al. | Jun 2009 | A1 |
20090156902 | Dewey et al. | Jun 2009 | A1 |
20090185392 | Krupa et al. | Jul 2009 | A1 |
20090190209 | Nakamura | Jul 2009 | A1 |
20090190371 | Root et al. | Jul 2009 | A1 |
20090209826 | Sanders et al. | Aug 2009 | A1 |
20090238442 | Upham et al. | Sep 2009 | A1 |
20090244259 | Kojima et al. | Oct 2009 | A1 |
20090245600 | Hoffman et al. | Oct 2009 | A1 |
20090248036 | Hoffman et al. | Oct 2009 | A1 |
20090258638 | Lee | Oct 2009 | A1 |
20090304582 | Rousso et al. | Dec 2009 | A1 |
20090318756 | Fisher et al. | Dec 2009 | A1 |
20090326322 | Diolaiti | Dec 2009 | A1 |
20090326331 | Rosen | Dec 2009 | A1 |
20100013910 | Farr | Jan 2010 | A1 |
20100013971 | Amano | Jan 2010 | A1 |
20100081919 | Hyde et al. | Apr 2010 | A1 |
20100107118 | Pearce | Apr 2010 | A1 |
20100128350 | Findlay et al. | May 2010 | A1 |
20100161129 | Costa et al. | Jun 2010 | A1 |
20100168520 | Poll et al. | Jul 2010 | A1 |
20100182340 | Bachelder et al. | Jul 2010 | A1 |
20100198014 | Poll et al. | Aug 2010 | A1 |
20100198241 | Gerrah et al. | Aug 2010 | A1 |
20100208046 | Takahashi | Aug 2010 | A1 |
20100245557 | Luley, III et al. | Sep 2010 | A1 |
20100249496 | Cardenas et al. | Sep 2010 | A1 |
20100286473 | Roberts | Nov 2010 | A1 |
20100305409 | Chang | Dec 2010 | A1 |
20100312069 | Sutherland et al. | Dec 2010 | A1 |
20100318099 | Itkowitz et al. | Dec 2010 | A1 |
20100331855 | Zhao et al. | Dec 2010 | A1 |
20110034781 | Loftus | Feb 2011 | A1 |
20110038040 | Abele et al. | Feb 2011 | A1 |
20110042452 | Cormack | Feb 2011 | A1 |
20110046439 | Pamnani et al. | Feb 2011 | A1 |
20110063734 | Sakaki | Mar 2011 | A1 |
20110065999 | Manzanares | Mar 2011 | A1 |
20110071359 | Bonadio et al. | Mar 2011 | A1 |
20110080536 | Nakamura et al. | Apr 2011 | A1 |
20110115882 | Shahinian et al. | May 2011 | A1 |
20110115891 | Trusty | May 2011 | A1 |
20110144436 | Nearman et al. | Jun 2011 | A1 |
20110178395 | Miesner et al. | Jul 2011 | A1 |
20110184243 | Wright et al. | Jul 2011 | A1 |
20110190588 | McKay | Aug 2011 | A1 |
20110234841 | Akeley et al. | Sep 2011 | A1 |
20110249323 | Tesar et al. | Oct 2011 | A1 |
20110257488 | Koyama et al. | Oct 2011 | A1 |
20110263938 | Levy | Oct 2011 | A1 |
20110264078 | Lipow et al. | Oct 2011 | A1 |
20110288560 | Shohat et al. | Nov 2011 | A1 |
20110298704 | Krah | Dec 2011 | A1 |
20110301421 | Michaeli et al. | Dec 2011 | A1 |
20110316994 | Lemchen | Dec 2011 | A1 |
20120029280 | Kucklick | Feb 2012 | A1 |
20120035423 | Sebastian et al. | Feb 2012 | A1 |
20120035638 | Mathaneswaran et al. | Feb 2012 | A1 |
20120040305 | Karazivan et al. | Feb 2012 | A1 |
20120041272 | Dietze, Jr. et al. | Feb 2012 | A1 |
20120041534 | Clerc et al. | Feb 2012 | A1 |
20120059222 | Yoshida | Mar 2012 | A1 |
20120065468 | Levy et al. | Mar 2012 | A1 |
20120087006 | Signaigo | Apr 2012 | A1 |
20120088974 | Maurice | Apr 2012 | A1 |
20120089093 | Trusty | Apr 2012 | A1 |
20120097567 | Zhao et al. | Apr 2012 | A1 |
20120108900 | Viola et al. | May 2012 | A1 |
20120116173 | Viola | May 2012 | A1 |
20120127573 | Robinson et al. | May 2012 | A1 |
20120130399 | Moll et al. | May 2012 | A1 |
20120134028 | Maruyama | May 2012 | A1 |
20120157775 | Yamaguchi | Jun 2012 | A1 |
20120157787 | Weinstein et al. | Jun 2012 | A1 |
20120157788 | Serowski et al. | Jun 2012 | A1 |
20120158015 | Fowler et al. | Jun 2012 | A1 |
20120190925 | Luiken | Jul 2012 | A1 |
20120197084 | Drach et al. | Aug 2012 | A1 |
20120230668 | Vogt | Sep 2012 | A1 |
20120232352 | Lin et al. | Sep 2012 | A1 |
20120245432 | Karpowicz et al. | Sep 2012 | A1 |
20120265023 | Berci et al. | Oct 2012 | A1 |
20120320102 | Jorgensen | Dec 2012 | A1 |
20120330129 | Awdeh | Dec 2012 | A1 |
20130012770 | Su | Jan 2013 | A1 |
20130027516 | Hart et al. | Jan 2013 | A1 |
20130041226 | McDowall | Feb 2013 | A1 |
20130041368 | Cunningham et al. | Feb 2013 | A1 |
20130060095 | Bouquet | Mar 2013 | A1 |
20130066304 | Belson et al. | Mar 2013 | A1 |
20130072917 | Kaschke et al. | Mar 2013 | A1 |
20130076863 | Rappel | Mar 2013 | A1 |
20130077048 | Mirlay | Mar 2013 | A1 |
20130085337 | Hess et al. | Apr 2013 | A1 |
20130159015 | O'Con | Jun 2013 | A1 |
20130197313 | Wan | Aug 2013 | A1 |
20130245383 | Friedrich et al. | Sep 2013 | A1 |
20130298208 | Ayed | Nov 2013 | A1 |
20130331730 | Fenech et al. | Dec 2013 | A1 |
20140005485 | Tesar et al. | Jan 2014 | A1 |
20140005486 | Charles | Jan 2014 | A1 |
20140005487 | Tesar | Jan 2014 | A1 |
20140005488 | Charles et al. | Jan 2014 | A1 |
20140005489 | Charles | Jan 2014 | A1 |
20140005555 | Tesar | Jan 2014 | A1 |
20140081659 | Nawana et al. | Mar 2014 | A1 |
20140168785 | Belgum | Jun 2014 | A1 |
20140168799 | Hurbert et al. | Jun 2014 | A1 |
20140179998 | Pacey et al. | Jun 2014 | A1 |
20140187859 | Leeuw et al. | Jul 2014 | A1 |
20140198190 | Okumu | Jul 2014 | A1 |
20140247482 | Doi et al. | Sep 2014 | A1 |
20140275801 | Menchaca et al. | Sep 2014 | A1 |
20140276008 | Steinbach et al. | Sep 2014 | A1 |
20140285403 | Kobayashi | Sep 2014 | A1 |
20140316209 | Overes et al. | Oct 2014 | A1 |
20140327742 | Kiening et al. | Nov 2014 | A1 |
20140347395 | Tsao et al. | Nov 2014 | A1 |
20140362228 | McCloskey et al. | Dec 2014 | A1 |
20140378843 | Valdes et al. | Dec 2014 | A1 |
20150018622 | Tesar | Jan 2015 | A1 |
20150025324 | Wan | Jan 2015 | A1 |
20150080982 | Van Funderburk | Mar 2015 | A1 |
20150085095 | Tesar | Mar 2015 | A1 |
20150087918 | Vasan | Mar 2015 | A1 |
20150094533 | Kleiner et al. | Apr 2015 | A1 |
20150112148 | Bouquet | Apr 2015 | A1 |
20150141759 | Tesar et al. | May 2015 | A1 |
20150238073 | Charles | Aug 2015 | A1 |
20150272694 | Charles | Oct 2015 | A1 |
20150297311 | Tesar | Oct 2015 | A1 |
20150300816 | Yang et al. | Oct 2015 | A1 |
20160018598 | Hansson | Jan 2016 | A1 |
20160089026 | Heerren | Mar 2016 | A1 |
20160100908 | Tesar | Apr 2016 | A1 |
20160139039 | Ikehara et al. | May 2016 | A1 |
20160220324 | Tesar | Aug 2016 | A1 |
20170020627 | Tesar | Jan 2017 | A1 |
20170143442 | Tesar | May 2017 | A1 |
20170258550 | Vazales | Sep 2017 | A1 |
20180055348 | Tesar et al. | Mar 2018 | A1 |
20180055502 | Charles et al. | Mar 2018 | A1 |
20180064316 | Charles et al. | Mar 2018 | A1 |
20180064317 | Tesar | Mar 2018 | A1 |
20180070804 | Tesar | Mar 2018 | A1 |
20180256145 | Tesar | Sep 2018 | A1 |
20180318033 | Tesar | Nov 2018 | A1 |
20180353059 | Tesar | Dec 2018 | A1 |
20180368656 | Austin et al. | Dec 2018 | A1 |
20190046021 | Charles et al. | Feb 2019 | A1 |
20190053700 | Tesar | Feb 2019 | A1 |
20190380566 | Charles | Dec 2019 | A1 |
Number | Date | Country |
---|---|---|
2336380 | Sep 1999 | CN |
101518438 | Sep 2009 | CN |
102495463 | Jun 2012 | CN |
202920720 | Nov 2012 | CN |
103 41 125 | Apr 2005 | DE |
10 2010 030 285 | Dec 2011 | DE |
10 2010 044 502 | Mar 2012 | DE |
0 293 228 | Nov 1988 | EP |
0 233 940 | Nov 1993 | EP |
0 466 705 | Jun 1996 | EP |
1 175 106 | Jan 2002 | EP |
1 333 305 | Aug 2003 | EP |
2 641 561 | Sep 2013 | EP |
49-009378 | Mar 1974 | JP |
03-018891 | Jan 1991 | JP |
06-315487 | Nov 1994 | JP |
07-194602 | Aug 1995 | JP |
07-261094 | Oct 1995 | JP |
08-131399 | May 1996 | JP |
2001-087212 | Apr 2001 | JP |
2001-117049 | Apr 2001 | JP |
2001-161638 | Jun 2001 | JP |
2001-161640 | Jun 2001 | JP |
2002-011022 | Jan 2002 | JP |
3402797 | May 2003 | JP |
2003-322803 | Nov 2003 | JP |
2004-024835 | Jan 2004 | JP |
3549253 | Aug 2004 | JP |
2004-305525 | Nov 2004 | JP |
2007-068876 | Mar 2007 | JP |
2009-288296 | Dec 2009 | JP |
4503748 | Jul 2010 | JP |
2010-206495 | Sep 2010 | JP |
2011-118741 | Jun 2011 | JP |
WO 87001276 | Mar 1987 | WO |
WO 91012034 | Aug 1991 | WO |
WO 99017661 | Apr 1999 | WO |
WO 00078372 | Dec 2000 | WO |
WO 01072209 | Oct 2001 | WO |
WO 2007047782 | Apr 2007 | WO |
WO 2008073243 | Jun 2008 | WO |
WO 2009051013 | Apr 2009 | WO |
WO 2010079817 | Jul 2010 | WO |
WO 2010114843 | Oct 2010 | WO |
WO 2010123578 | Oct 2010 | WO |
WO 2011069469 | Jun 2011 | WO |
WO 2012047962 | Apr 2012 | WO |
WO 2012078989 | Jun 2012 | WO |
WO 2013049679 | Apr 2013 | WO |
WO 2013109966 | Jul 2013 | WO |
WO 2013116489 | Aug 2013 | WO |
WO 2014004717 | Jan 2014 | WO |
WO 2014060412 | Apr 2014 | WO |
WO 2014189969 | Nov 2014 | WO |
WO 2015042460 | Mar 2015 | WO |
WO 2015042483 | Mar 2015 | WO |
WO 2015100310 | Jul 2015 | WO |
WO 2016090336 | Jun 2016 | WO |
WO 2016154589 | Sep 2016 | WO |
WO 2017091704 | Jun 2017 | WO |
WO 2018208691 | Nov 2018 | WO |
WO 2018217951 | Nov 2018 | WO |
Entry |
---|
Aesculap Inc.; Aesculap Neurosurgery Pneumatic Kerrison; http://www.aesculapusa.com/assets/base/doc/doc763-pneumatic_kerrison_brochure.pdf; 2008; pp. 12. |
Aliaga, Daniel G.; “Image Morphing and Warping”, Department of Computer Science; Purdue University; Spring 2010; in 61 pages. |
“Arri Medical Shows SeeFront 3D Display with HD 3D Surgical Microscope”, dated Jun. 9, 2013, downloaded from http://www.seefront.com/news-events/article/arri-medical-shows-seefront-3d-display-with-hd-3d-surgical-microscope/. |
“Arriscope: A New Era in Surgical Microscopy”, Arriscope Brochure published May 20, 2014 in 8 pages. |
AustriaMicroSystems; “AS5050: Smallest Magnetic Rotary Encoder for μA Low Power Applications”; www.austriamicrosystems.com/AS5050 printed Nov. 2012 in 2 pages. |
Bayonet Lock Video, 00:16 in length, Date Unknown, [Screenshots captured at 00:00, 00:02, 00:05, 00:08, and 00:16]. |
BellowsTech; “Actuators”; www.bellowstech.com/metal-bellows/actuators/, printed Jul. 17, 2012 in 4 pages. |
“Carl Zeiss Unveils $99 VR One Virtual Reality Headset”; www.electronista.com/articles/14/10/10/zeiss.vr.one.able.to.accept.variety.of.smartphones.using.custom.trays printed Oct. 13, 2014 in 2 pages. |
Designboom; “Bright LED”; http://www.designboom.com/project/fiber-optics-light-glove/; Sep. 28, 2007. |
Fei-Fei, Li; Lecture 10: Multi-View Geometry; Stanford Vision Lab; Oct. 24, 2011; pp. 89. |
“Fuse™. Full Spectrum Endoscopy™”; http://www.endochoice.com/Fuse printed Oct. 7, 2013 in 3 pages. |
Hardesty, Larry; “3-D Cameras for Cellphones: Clever math could enable a high-quality 3-D camera so simple, cheap and power-efficient that it could be incorporated into handheld devices”; MIT News Office; http://web.mit.edu/newsoffice/2011/lidar-3d-camera-cellphones-0105.html; Jan. 5, 2012; pp. 4. |
Hartley et al.; “Multiple View Geometry in Computer Vision: Chapter 9—Epipolar Geometry and the Fundamental Matrix”; http://www.robots.ox.ac.uk/˜vgg/hzbook/hzbook2/HZepipolar.pdf; Mar. 2004; 2nd Edition; Ch. 9; pp. 239-261. |
Heidelberg Engineering; “MultiColor: Scanning Laser Imaging”; http://www.heidelbergengineering.com/us/products/spectralis-models/imaging-modes/multicolor/; Copyright © 2013; printed Apr. 5, 2013. |
Krishna, Golden; “Watch: What Good is a Screen?” http://www.cooper.com/author/golden_krishna as printed Jul. 9, 2014 in 62 pages. |
Lang et al.; “Zeiss Microscopes for Microsurgery”; Springer-Verlag; Berlin, Heidelberg; 1981. |
“Leica Microsystems and TrueVision® 3D Surgical create the first 3D digital hybrid microscope”, Press Release, Oct. 5, 2012, pp. 2. |
Male Bayonet Video, 00:04 in length, Date Unknown, [Screenshots captured at 00:00, 00:01, 00:02, 00:03, and 00:04]. |
Melexis; “MLX75031 Optical Gesture and Proximity Sensing IC”; http://melexis.com/optical-sensors/optical-sensing.mlx75031-815.aspx?sta printed Mar. 15, 2013 in 1 page. |
MMR Technologies; “Micro Miniature Refrigerators”; www.mmr-tech.com/mmr_overview.php; Copyright © 2011; printed Feb. 11, 2013. |
Moog; “Surgical Handpieces: Therapeutic Ultrasonic Devices”; http://www.moog.com/products/surgical-hpieces/ printed Sep. 25, 2013 in 1 page. |
Morita; “TwinPower Turbine® High Speed Handpieces Standard, 45°, and Ultra Series Head Designs”; J. Morita Mfg. Corp., http://www.morita.com/usa/root/img/pool/pdf/product_brochures/twinpower_brochure_l-264_0512_web.pdf; May 2012; pp. 20. |
Olympus; “Olympus Introduces the World's First and Only Monopolar, Disposable Tonsil Adenoid Debrider (DTAD)”; http://www.olympusamerica.com/corporate/corp_presscenter_headline.asp?pressNo=926; Sep. 11, 2012; pp. 2. |
OmniVision; “OV2722 full HD (1080p) product brief: ⅙-Inch Native 1080p HD CameraChip Sensor for Ultra-Compact Applications”; http://web.archive.org/web/20120730043057/http://www.ovt.com/download_document.php?type=sensor&sensorid=119; May 2012 in 2 pages. |
Orthofix; “ProView MAP System Retractors”; www.us.orthofix.com/products/proviewretractors.asp?cid=39; Copyright © 2010; printed Apr. 1, 2013. |
OrtusTech; “Sample Shipment Start: World's Smallest Size Full-HD Color TFT LCD”; http://ortustech.co.jp/english/notice/20120427.html printed May 22, 2012 in 2 pages. |
Saab, Mark; “Applications of High-Pressure Balloons in the Medical Device Industry”; http://www.ventionmedical.com/documents/medicalballoonpaper.pdf; Copyright © 1999; pp. 19. |
Purcher, Jack, “Apple Wins a Patent for an Oculus Rift-Like Display System,” http://www.patentlyapple.com/patently-apple/2014/09/apple-wins-a-patent-for-an-oculus-rift-like-display-system.html, Sep. 9, 2014. |
Savage, Lynn; “Sound and Light, Signifying Improved Imaging”; www.photonics.com/Article.aspx?AID=45039; Nov. 1, 2010; pp. 6. |
Timm, Karl Walter; “Real-Time View Morphing of Video Streams”; University of Illinois; Chicago, Illinois; 2003; pp. 168. |
Whitney et al.; “Pop-up book MEMS”; Journal of Micromechanics and Microengineering; Oct. 14, 2011; vol. 21; No. 115021; pp. 7. |
Wikipedia, “Zoom Lens,” http://en.wikipedia.org/wiki/Optical_Zoom, printed Oct. 7, 2014 in 3 pages. |
Zeiss; “Informed for Medical Professionals, Focus: Fluorescence”; Carl Zeiss; 2nd Issue; Oct. 2006; 30-801-LBW-GFH-X-2006; Printed in Germany; pp. 32. |
Zeiss; “Ophthalmic Surgery in Its Highest Form, OPMI® Visu 210”; Carl Zeiss, 2005, 30-097/III-e/USA Printed in Germany AW-TS-V/2005 Uoo; pp. 19. |
Zeiss; “SteREO Discovery. V12, Expanding the Boundaries”; Carl Zeiss, Sep. 2004; 46-0008 e09.2004, pp. 6. |
Zeiss; “Time for a Change: OPMI® pico for ENT”; Carl Zeiss, 2005, 30-451/III-e Printed in Germany LBW-TS-V/2005 Uoo, pp. 8. |
Zhang, Michael; “LIFX: A WiFi-Enabled LED Bulb that May Revolutionize Photographic Lighting”; http://www.petapixel.com/2012/09/22/lifx-a-wifi-enabled-led-buib-that-may-revolutionize-photographic-lighting/ printed Sep. 28, 2012 in 9 pages. |
Restriction Requirement in U.S. Appl. No. 13/802,362, dated Oct. 23, 2013. |
Office Action in U.S. Appl. No. 13/802,362, dated Dec. 17, 2013. |
Office Action in U.S. Appl. No. 13/802,362, dated Apr. 7, 2014. |
International Search Report and Written Opinion in PCT Application No. PCT/US2013/047972, dated Jan. 3, 2014. |
Office Action in U.S. Appl. No. 13/802,485, dated Jun. 20, 2014. |
Restriction Requirement in U.S. Appl. No. 13/802,635, dated May 28, 2014. |
Office Action in U.S. Appl. No. 13/802,509, dated Sep. 9, 2013. |
Notice of Allowance in U.S. Appl. No. 13/802,509, dated Apr. 16, 2014. |
Restriction Requirement in U.S. Appl. No. 13/802,582, dated Oct. 23, 2013. |
Office Action in U.S. Appl. No. 13/802,582, dated Dec. 16, 2013. |
Office Action in U.S. Appl. No. 13/802,582, dated Apr. 16, 2014. |
International Search Report and Written Opinion in PCT Application No. PCT/US2014/038839, dated Oct. 17, 2014. |
International Search Report and Written Opinion in PCT Application No. PCT/US2014/056643, dated Dec. 11, 2014. |
Kramer, Jennifer; “The Right Filter Set Gets the Most out of a Microscope”; Biophotonics International; Jan./Feb. 1999; vol. 6; pp. 54-58. |
Leica Microsystems; “Images TrueVision Integrated 3D”; http://www.leica-microsystems.com/products/surgical-mcroscopes/neurosurgery-spine/details/product/truevison-integrated-3d/gallery/; Nov. 26, 2014; in 3 pages. |
Leica Microsystems; “Leica Microsystems' Ophthalmic Surgical Microscopes with TrueVision 3D Technology Available Globally”; http://www.leica-microsystems.com/products/surgical-microscopes/neurosurgery-spine/details/product/truevision-integrated-3d/news/; Sep. 18, 2014; in 5 pages. |
Lutze et al.; “Microsystems Technology for Use in a Minimally Invasive Endoscope Assisted Neurosurgical Operating System—MINOP II”; 2005; http://web.archive.org/web/20151120215151/http://www.meditec.hia.rwth-aachen.de/fileadmin/content/meditec/bilder/forschung/aktuelle_projekte/robotische/Exoscope_Aesculap.pdf; Nov. 20, 2015 in 4 pages. |
MediTec; “MINOP II—Robotical Microscope Platform”; http://web.archive.org/web/20151120213932/http://www.meditec.hia.rwth-aachen.de/en/research/former-projects/minop-ii/; Nov. 20, 2015 in 3 pages. |
“Narrow Band Imaging”; http://web.archive.org/web/20150701233623/https://en.wikipedia.org/wiki/Narrow_band_imaging printed Jul. 1, 2015 in 1 page. |
Rustum, Dr. Abu; “ICG Mapping Endometrial Cancer”; Pinpoint Endometrium Ca Lenfedenektomi MSKCC May 2013; Memorial Sloan Kettering Cancer Center; May 2013; Published to YouTube.com Sep. 1, 2013; in 2 pages; http://web.archive.org/web/20150402210857/https://www.youtube.com/watch?v-DhChvaUCe4I. |
Sun et al.; “Neurotoxin-Directed Synthesis and in Vitro Evaluation of Au Nanoclusters”; RSC Advances, 2015; vol. 5, No. 38; pp. 29647-29652. |
TrueVision Microscopes; http://truevisionmicroscopes.com/images/productsnew/081a-f.jpg; printed Nov. 26, 2014 in 1 page. |
TrueVision; “About TrueVision”; http://web.archive.org/web/20071208125103/http://www.truevisionsys.com/about.html; as viewed Dec. 8, 2007 in 2 pages. |
TrueVision; “TrueVision Technology”; http://web.archive.org/web/20071208125125/http://www.truevisionsys.com/technology.html; as viewed Dec. 8, 2007 in 2 pages. |
Zeiss; “Stereomicroscopes: Stemi SV 6, SV 11, SV 11 Apo”; the Profile; 1999; in 30 pages. |
Zhang, Sarah; “The Obscure Neuroscience Problem That's Plaguing VR”; http://web.archive.org/web/20150812172934/http://www.wired.com/2015/08/obscure-neuroscience-problem-thats-plaguing-vr/; Aug. 11, 2015 in 5 pages. |
Preliminary Amendment in U.S. Appl. No. 14/411,068, dated Aug. 13, 2015. |
Office Action in U.S. Appl. No. 14/411,068, dated Aug. 17, 2017. |
Official Communication in European Application No. 13808996.6, dated Jan. 4, 2016. |
Official Communication in European Application No. 13808996.6, dated Apr. 14, 2016. |
Official Communication in European Application No. 13808996.6, dated Feb. 21, 2017. |
Official Communication in European Application No. 13808996.6, dated Jun. 6, 2017. |
Official Communication in Japanese Application No. 2015-520471, dated May 9, 2017. |
Official Communication in Japanese Application No. 2015-520471, dated Nov. 21, 2017. |
International Preliminary Report on Patentability in PCT Application No. PCT/US2013/047972, dated Jan. 8, 2015. |
Office Action in U.S. Appl. No. 13/802,635, dated Mar. 27, 2015. |
Final Office Action in U.S. Appl. No. 13/802,635, dated Jan. 14, 2016. |
Response to Final Office Action in U.S. Appl. No. 13/802,635, dated Jul. 13, 2016. |
Office Action in U.S. Appl. No. 13/802,635, dated Sep. 27, 2016. |
Amendment and Response to Office Action in U.S. Appl. No. 13/802,635, dated Mar. 24, 2017. |
Notice of Allowance in U.S. Appl. No. 13/802,635, dated Apr. 27, 2017. |
Notice of Allowance in U.S. Appl. No. 13/802,635, dated Aug. 15, 2017. |
Amendment in U.S. Appl. No. 15/589,058, dated Nov. 15, 2017. |
Office Action in U.S. Appl. No. 15/589,058, dated Dec. 8, 2017. |
Office Action in U.S. Appl. No. 13/802,577, dated Sep. 30, 2016. |
Response to Office Action in U.S. Appl. No. 13/802,577, dated Mar. 29, 2017. |
Notice of Allowance in U.S. Appl. No. 13/802,577, dated Apr. 24, 2017. |
Amendment in U.S. Appl. No. 13/802,577, dated May 25, 2017. |
Office Action in U.S. Appl. No. 13/802,577, dated Jun. 20, 2017. |
Amendment in U.S. Appl. No. 13/802,577, dated Nov. 20, 2017. |
Notice of Allowance in U.S. Appl. No. 13/802,577, dated Dec. 6, 2017. |
Official Communication in European Application No. 14800423.7, dated Feb. 8, 2017. |
International Preliminary Report on Patentability in PCT Application No. PCT/US2014/038839, dated Dec. 3, 2015. |
Preliminary Amendment in U.S. Appl. No. 14/491,827, dated Nov. 25, 2014. |
Office Action in U.S. Appl. No. 14/491,827, dated Mar. 1, 2017. |
Amendment in U.S. Appl. No. 14/491,827, dated Aug. 1, 2017. |
Notice of Allowance in U.S. Appl. No. 14/491,827, dated Sep. 25, 2017. |
Partial Supplementary European Search Report in European Application No. 14845427.5, dated May 4, 2017. |
Extended European Search Report in European Application No. 14845427.5, dated Aug. 8, 2017. |
European Search Report in European Application No. 14846410.0, dated Jun. 23, 2017. |
International Preliminary Report on Patentability and Written Opinion in PCT Application No. PCT/US2014/056643, dated Mar. 31, 2016. |
Invitation to Pay Additional Fees in PCT Application No. PCT/US2014/056681, dated Jan. 14, 2015. |
International Search Report and Written Opinion in PCT Application No. PCT/US2014/056681, dated Mar. 20, 2015. |
International Preliminary Report on Patentability and Written Opinion in PCT Application No. PCT/US2014/056681, dated Mar. 31, 2016. |
Preliminary Amendment in U.S. Appl. No. 14/581,779, dated Jul. 6, 2015. |
Restriction Requirement in U.S. Appl. No. 14/581,779, dated Oct. 31, 2017. |
Extended European Search Report in European Application No. 14873324.9, dated Aug. 25, 2017. |
Invitation to Pay Additional Fees in PCT Application No. PCT/US2014/072121, dated Mar. 2, 2015. |
International Search Report and Written Opinion in PCT Application No. PCT/US2014/072121, dated May 1, 2015. |
International Preliminary Report on Patentability and Written Opinion in PCT Application No. PCT/US2014/072121, dated Jul. 7, 2016. |
Preliminary Amendment in U.S. Appl. No. 14/960,276, dated Apr. 18, 2016. |
Office Action in U.S. Appl. No. 14/960,276, dated Jul. 28, 2017. |
International Search Report and Written Opinion in PCT Application No. PCT/US2015/064133, dated Feb. 9, 2016. |
International Preliminary Report on Patentability and Written Opinion in PCT Application No. PCT/US2015/064133, dated Jun. 15, 2017. |
Preliminary Amendment in U.S. Appl. No. 15/081,653, dated Oct. 11, 2016. |
International Search Report and Written Opinion in PCT Application No. PCT/US2016/024330, dated Jul. 1, 2016. |
International Preliminary Report on Patentability and Written Opinion in PCT Application No. PCT/US2016/024330, dated Oct. 5, 2017. |
Preliminary Amendment in U.S. Appl. No. 15/360,565, dated Feb. 6, 2017. |
Invitation to Pay Additional Fees in PCT Application No. PCT/US2016/063549, dated Feb. 2, 2017. |
International Search Report and Written Opinion in PCT Application No. PCT/US2016/063549, dated Apr. 14, 2017. |
“Portion”; Definition; American Heritage® Dictionary of the English Language; Fifth Edition; 2016; Retrieved Apr. 12, 2018 from https://www.thefreedictionary.com/portion in 1 page. |
Preliminary Amendment in U.S. Appl. No. 16/357,081, dated Sep. 4, 2019. |
Official Communication in European Application No. 13808996.6, dated Jun. 15, 2018. |
Official Communication in European Application No. 13808996.6, dated May 13, 2019. |
Notice of Decision of Rejection in Japanese Application No. 2015-520471, dated Jul. 24, 2018. |
Preliminary Amendment in U.S. Appl. No. 15/483,995, dated Nov. 21, 2017. |
Office Action in U.S. Appl. No. 15/483,995, dated Mar. 9, 2018. |
Amendment in U.S. Appl. No. 15/483,995, dated Sep. 7, 2018. |
Final Office Action in U.S. Appl. No. 15/483,995, dated Nov. 29, 2018. |
Amendment in U.S. Appl. No. 15/483,995, dated May 28, 2019. |
Office Action in U.S. Appl. No. 15/483,995, dated Jun. 13, 2019. |
Office Action in U.S. Appl. No. 15/645,589, dated Feb. 9, 2018. |
Amendment in U.S. Appl. No. 15/645,589, dated Aug. 7, 2018. |
Final Office Action in U.S. Appl. No. 15/645,589, dated Nov. 28, 2018. |
Amendment in U.S. Appl. No. 15/645,589, dated May 28, 2019. |
Office Action in U.S. Appl. No. 15/645,589, dated Jun. 13, 2019. |
Preliminary Amendment filed in U.S. Appl. No. 16/036,665, dated Nov. 1, 2018. |
Preliminary Amendment filed in U.S. Appl. No. 16/036,665, dated Sep. 5, 2019. |
Office Action in U.S. Appl. No. 16/036,665, dated Sep. 26, 2019. |
Office Action in U.S. Appl. No. 15/626,516, dated Mar. 14, 2018. |
Amendment in U.S. Appl. No. 15/626,516, dated Sep. 13, 2018. |
Final Office Action in U.S. Appl. No. 15/626,516, dated Jan. 15, 2019. |
Response in U.S. Appl. No. 15/626,516, dated Jul. 15, 2019. |
Restriction Requirement in U.S. Appl. No. 15/495,484, dated May 14, 2019. |
Amendment in U.S. Appl. No. 15/589,058, dated Jun. 7, 2018. |
Final Office Action in U.S. Appl. No. 15/589,058, dated Aug. 27, 2018. |
Amendment in U.S. Appl. No. 15/589,058, dated Feb. 26, 2019. |
Office Action in U.S. Appl. No. 15/589,058, dated Mar. 5, 2019. |
Amendment in U.S. Appl. No. 15/589,058, dated Sep. 5, 2019. |
Notice of Allowance in U.S. Appl. No. 15/589,058, dated Sep. 25, 2019. |
Preliminary Amendment filed in U.S. Appl. No. 15/724,100, dated Jun. 5, 2018. |
Office Action in U.S. Appl. No. 15/724,100, dated Oct. 9, 2019. |
Preliminary Amendment in U.S. Appl. No. 16/042,318, dated Nov. 8, 2018. |
Office Action in U.S. Appl. No. 16/042,318, dated May 8, 2019. |
Amendment in U.S. Appl. No. 16/042,318, dated Sep. 9, 2019. |
Notice of Allowance in U.S. Appl. No. 16/042,318, dated Oct. 9, 2019. |
Official Communication in European Application No. 14846410.0, dated Jul. 18, 2018. |
Official Communication in European Application No. 14846410.0, dated Mar. 20, 2019. |
Official Communication in Japanese Application No. 2016-544032, dated Jun. 26, 2018. |
Restriction Requirement and Election of Species Response in U.S. Appl. No. 14/581,779, dated Jan. 2, 2018. |
Office Action in U.S. Appl. No. 14/581,779, dated Apr. 24, 2018. |
Amendment in U.S. Appl. No. 14/581,779, dated Sep. 24, 2018. |
Final Office Action in U.S. Appl. No. 14/581,779, dated Jan. 4, 2019. |
Amendment in U.S. Appl. No. 14/581,779, dated Jul. 2, 2019. |
Office Action in U.S. Appl. No. 14/581,779, dated Aug. 5, 2019. |
Official Communication in Japanese Application No. 2016-542194, dated Nov. 6, 2018. |
Decision of Rejection in Japanese Application No. 2016-542194, dated May 14, 2019. |
Amendment in U.S. Appl. No. 14/960,276, dated Jan. 26, 2018. |
Office Action in U.S. Appl. No. 14/960,276, dated Mar. 8, 2018. |
Amendment in U.S. Appl. No. 14/960,276, dated Sep. 7, 2018. |
Office Action in U.S. Appl. No. 14/960,276, dated Nov. 2, 2018. |
Amendment in U.S. Appl. No. 14/960,276, dated May 2, 2019. |
Final Office Action in U.S. Appl. No. 14/960,276, dated Jun. 7, 2019. |
Extended European Search Report in European Application No. 15865454.1, dated Jun. 27, 2018. |
Office Action in U.S. Appl. No. 15/081,653, dated Mar. 28, 2018. |
Amendment in U.S. Appl. No. 15/081,653, dated Sep. 27, 2018. |
Final Office Action in U.S. Appl. No. 15/081,653, dated Nov. 16, 2018. |
Final Amendment in U.S. Appl. No. 15/081,653, dated May 15, 2019. |
Office Action in U.S. Appl. No. 15/081,653, dated Jul. 12, 2019. |
Extended European Search Report in European Application No. 16769809.1, dated Nov. 23, 2018. |
Office Action in U.S. Appl. No. 15/360,565, dated Aug. 10, 2018. |
Amendment in U.S. Appl. No. 15/360,565, dated Feb. 8, 2019. |
Office Action in U.S. Appl. No. 15/360,565, dated May 22, 2019. |
Extended European Search Report in European Application No. 16869253.1, dated May 29, 2019. |
International Preliminary Report on Patentability and Written Opinion in PCT Application No. PCT/US2016/063549, dated Jun. 7, 2018. |
Office Action in U.S. Appl. No. 15/973,433, dated Jun. 28, 2019. |
International Search Report and Written Opinion in PCT Application No. PCT/US2018/031442, dated Sep. 14, 2018. |
International Search Report and Written Opinion in PCT Application No. PCT/US2018/034227, dated Jul. 30, 2018. |
Office Action in U.S. Appl. No. 16/357,081, dated Jul. 8, 2020. |
Official Communication in Japanese Application No. 2018-218745, dated Feb. 25, 2020. |
Amendment in U.S. Appl. No. 15/483,995, dated Dec. 12, 2019. |
Final Office Action in U.S. Appl. No. 15/483,995, dated Feb. 20, 2020. |
Office Action in U.S. Appl. No. 15/645,589, dated Dec. 26, 2019. |
Amendment in U.S. Appl. No. 15/645,589, dated Jun. 26, 2020. |
Notice of Allowance in U.S. Appl. No. 15/645,589, dated Jul. 14, 2020. |
Amendment filed in U.S. Appl. No. 16/036,665, dated Mar. 26, 2020. |
Office Action in U.S. Appl. No. 16/036,665, dated Jul. 13, 2020. |
Amendment in U.S. Appl. No. 15/626,516, dated Jan. 24, 2020. |
Notice of Allowance in U.S. Appl. No. 15/626,516, dated Mar. 9, 2020. |
Notice of Allowance in U.S. Appl. No. 15/626,516, dated Jun. 29, 2020. |
Response to Restriction Requirement in U.S. Appl. No. 15/495,484, dated Nov. 13, 2019. |
Office Action in U.S. Appl. No. 15/495,484, dated Nov. 27, 2019. |
Amendment in U.S. Appl. No. 15/495,484, dated May 27, 2020. |
Notice of Allowance in U.S. Appl. No. 15/495,484, dated Jun. 16, 2020. |
Restriction Requirement in U.S. Appl. No. 15/948,842, dated Jan. 22, 2020. |
Amendment filed in U.S. Appl. No. 15/724,100, dated Apr. 9, 2020. |
Office Action in U.S. Appl. No. 15/724,100, dated Apr. 22, 2020. |
Notice of Allowance in U.S. Appl. No. 15/724,100, dated Jul. 6, 2020. |
Amendment in U.S. Appl. No. 14/581,779, dated Feb. 4, 2020. |
Final Office Action in U.S. Appl. No. 14/581,779, dated Apr. 29, 2020. |
Amendment in U.S. Appl. No. 15/081,653, dated Jan. 10, 2020. |
Final Office Action in U.S. Appl. No. 15/081,653, dated Jan. 31, 2020. |
Amendment in U.S. Appl. No. 15/081,653, dated Jul. 30, 2020. |
Amendment in U.S. Appl. No. 15/360,565, dated Nov. 21, 2019. |
Office Action in U.S. Appl. No. 15/360,565, dated Jan. 30, 2020. |
Amendment in U.S. Appl. No. 15/360,565, dated Jul. 29, 2020. |
Amendment in U.S. Appl. No. 15/973,433, dated Sep. 30, 2019. |
Notice of Allowance in U.S. Appl. No. 15/973,433, dated Jan. 28, 2020. |
Notice of Allowance in U.S. Appl. No. 15/973,433, dated Jun. 25, 2020. |
International Preliminary Report on Patentability and Written Opinion in PCT Application No. PCT/US2018/031442, dated Nov. 21, 2019. |
International Preliminary Report on Patentability and Written Opinion in PCT Application No. PCT/US2018/034227, dated Dec. 5, 2019. |
Number | Date | Country | |
---|---|---|---|
20150141755 A1 | May 2015 | US |
Number | Date | Country | |
---|---|---|---|
61880808 | Sep 2013 | US | |
61920451 | Dec 2013 | US | |
61921051 | Dec 2013 | US | |
61921389 | Dec 2013 | US | |
61922068 | Dec 2013 | US | |
61923188 | Jan 2014 | US |