This application generally relates to minimally invasive surgery, minimally invasive surgical cameras and virtual reality minimally invasive surgical systems.
The field of minimally invasive surgery has undergone tremendous development and growth since its inception in the 1900s, with these developments yielding improved results for patients. One of the major developments in the minimally invasive surgery field has been the implementation of surgical robotic devices, which has increased the number and types of surgeries that can be performed. This increase has produced many benefits for patients, including shorter recovery times, improved outcomes and shorter operation times. It has also created an influx of devices capable of performing a myriad of functions and operations, and of being controlled and operated via various techniques.
During minimally invasive surgery, an endoscopic camera is typically used to provide the surgeon with imagery of the operation site and surgical cavity, allowing the surgeon to manipulate robotic tools and allowing others to view the procedure as it is performed. During such surgeries, the surgeon concentrates on tasks such as manipulating tissue and retracting organs. To accomplish these tasks, the surgeon manually maneuvers the endoscopic camera to a desired location and position in order to obtain a view adequate for performing the procedure. Endoscopic cameras typically provide a limited and narrow field of view, so the surgeon must manually move the camera back and forth, or to a different location, in order to view tools or tissue outside of his or her field of view. Requiring the surgeon to manually move the endoscopic camera forces the surgeon to shift focus from performing the operation to obtaining an adequate view, which results in longer operation times and longer recovery times for patients.
Typically, throughout a minimally invasive surgery, multiple views and angles of the operation field are needed for the surgeon to perform the operation. Generally, the endoscope may be manually moved marginally left, right, back and/or forward to obtain a larger or different view during the operation, and then moved back to its original position and orientation so as to allow the surgeon to view the tissue and/or organ at a desired magnification. Physically manipulating the endoscopic camera requires the surgeon to shift focus from performing the operation to obtaining the view, which can lead to accidental injuries to patients, as well as longer recovery times and longer operation times.
In order to eliminate the need to manually move endoscopic cameras to obtain multiple views of an operation site, as well as a larger field of view, some have utilized multiple endoscopic cameras, inserting each endoscopic camera through a different incision in the patient's cavity. While this has allowed surgeons to obtain multiple and different views, it has come at a cost to the patient, as multiple incisions must be made in order to insert multiple endoscopic cameras, increasing the risk of herniation, risk of infection, pain and general morbidity. Additionally, utilizing multiple endoscopic cameras decreases the surgeon's workspace, making it more difficult for the surgeon to perform the operation.
While manually maneuvering and manipulating one or more endoscopic cameras is a viable option in conventional minimally invasive surgeries and existing robotic surgeries, it is impractical and unintuitive for maneuvering and manipulating a camera during virtual reality surgeries. In virtual reality surgeries, the surgeon has the perception of being condensed inside a patient's body at the surgical site. In conjunction with three-dimensional visualization provided by virtual reality goggles, the surgeon views the operation and interacts with the robotic arms as if the robotic arms have taken the form of his or her own arms and hands. With virtual reality surgeries, the surgeon is engrossed in a natural and immersive virtual reality user interface. While the surgeon is immersed in the virtual reality user interface, it would be cumbersome for the surgeon to manually maneuver and relocate an endoscopic camera to a desired location and position, as doing so would require the surgeon to disconnect and remove him or herself from the natural and immersive virtual reality user interface. Alternatively, if the surgeon were to manually manipulate the endoscopic camera, such manipulation would be disorienting for the surgeon, and thus could increase operation time and disrupt the surgeon's workflow. To allow a surgeon to remain immersed in the natural and immersive virtual reality user interface, a different technique of controlling a camera and obtaining multiple views of the operation field is necessary for virtual reality surgery.
With human-like robotic systems, success depends on maintaining a natural and intuitive human-machine interface (HMI). As such, it is advantageous in a virtual reality surgery for a surgeon to be able to interact with and control the camera while maintaining the functionality of a human-like robot.
In one embodiment the invention includes a system comprising a console assembly comprising, a first actuator, and a first actuator pulley operably coupled to the first actuator, a trocar assembly operably coupled to the console assembly, the trocar assembly comprising, a trocar having an inner and an outer diameter, and a seal sub-assembly comprising at least one seal, the seal sub-assembly operably coupled to the trocar, a camera assembly operably coupled to the console assembly, the camera assembly comprising, a camera support tube having a distal end and a proximal end, a stereoscopic camera assembly operably coupled to the distal end of the camera support tube, the stereoscopic camera assembly comprising, a main camera body defining a cavity, a pitch actuation assembly, a yaw actuation assembly, the pitch and yaw actuation assemblies providing at least two rotational degrees of freedom, a first camera module having a first optical axis, and a second camera module having a second optical axis, and at least one rotational positional sensor configured to detect rotation of the stereoscopic camera assembly about at least one of a pitch axis or a yaw axis, wherein the yaw axis is normal to a plane in which the camera support tube lies, and the pitch axis is perpendicular to the yaw axis. The seal sub-assembly of the system may also comprise a second seal. The trocar assembly of the system may also comprise a seal plug. The stereoscopic camera assembly of the system may also comprise a peripheral camera. In the system, the first optical axis of the first camera module and the second optical axis of the second camera module have an inter-axial distance configured to provide stereo vision. The console assembly of the system may also comprise a plurality of actuators.
The trocar assembly of the system may also comprise a trocar mating fixture defining a pass through having a pass through axis, wherein the pass through axis is configured to permit access through the camera console assembly and through the trocar assembly.
The system may also comprise a pitch cable operably coupling the pitch actuation assembly to the first actuator pulley so that actuation of the first actuator rotates the stereoscopic camera assembly about the pitch axis. In the system comprising a pitch cable, the console assembly may also comprise a second actuator and a second actuator pulley operably coupled to the second actuator. The system comprising a console assembly with a second actuator may also comprise a yaw cable operably coupling the yaw actuation assembly to the second actuator pulley so that actuation of the second actuator rotates the stereoscopic camera assembly about the yaw axis. In the system comprising a yaw cable, the console assembly may also comprise a first redirecting pulley disposed along a path of the pitch cable between the first actuator pulley and the pitch actuation assembly, the first redirecting pulley being configured to redirect the path of the pitch cable from the first actuator pulley to a first cable lumen defined by the camera support tube. In the system comprising a yaw cable, the console assembly may also comprise a redirecting pulley disposed along a path of the yaw cable between the second actuator pulley and the yaw actuation assembly, the redirecting pulley being configured to redirect the path of the yaw cable from the second actuator pulley to a second cable lumen defined by the camera support tube.
The system comprising a console assembly with a first redirecting pulley may also comprise a second redirecting pulley disposed along a path of the yaw cable between the second actuator pulley and the yaw actuation assembly, the second redirecting pulley being configured to redirect the path of the yaw cable from the second actuator pulley to a second cable lumen defined by the camera support tube.
In other embodiments, the stereoscopic camera assembly of the system has an insertion configuration and a deployed configuration, and wherein, in the insertion configuration, the first optical axis of the first camera module and the second optical axis of the second camera module are oriented perpendicular to the camera support tube. In the system with a stereoscopic camera assembly having an insertion configuration and a deployed configuration, the first camera module comprises a first camera module body having a first outer edge, the second camera module comprises a second camera module body having a second outer edge, and a maximum distance from the first outer edge of the first camera module body to the second outer edge of the second camera module body is greater than a maximum width of a cross-section of the stereoscopic camera assembly taken perpendicular to an axis of the camera support tube.
In another aspect the invention includes a camera assembly comprising a camera support tube having a distal end and a proximal end, a stereoscopic camera assembly operably coupled to the distal end of the camera support tube, the stereoscopic camera assembly comprising, a main camera body operably coupled to the distal end of the camera support tube, wherein the main camera body defines an electrical component cavity, a first camera module having a first optical axis, a second camera module having a second optical axis, and an actuation system comprising a pitch actuation assembly and a yaw actuation assembly, the actuation system providing at least two rotational degrees of freedom; and at least one rotational positional sensor configured to detect rotation of the stereoscopic camera assembly about at least one of a pitch axis or a yaw axis, wherein the yaw axis is normal to a plane in which the camera support tube lies, and the pitch axis is perpendicular to the yaw axis. In one embodiment, the actuation system of the camera assembly is cable driven. In another embodiment, the actuation system of the camera assembly is motor driven. In other embodiments, the yaw actuation assembly of the camera assembly is configured to actuate the stereoscopic camera assembly about the yaw axis. In yet another embodiment, the pitch actuation assembly of the camera assembly is configured to actuate the stereoscopic camera assembly about the pitch axis. In other embodiments, the pitch actuation assembly of the camera assembly is configured to actuate the stereoscopic camera assembly about the pitch axis independent of the yaw actuation assembly. The stereoscopic camera assembly of the camera assembly may also comprise a lighting source operably coupled to a power supply.
In another embodiment, the stereoscopic camera assembly of the camera assembly may also comprise a first peripheral camera. In the camera assembly comprising a first peripheral camera, the stereoscopic camera assembly may also comprise a second peripheral camera.
In yet another embodiment, the stereoscopic camera assembly of the camera assembly may also comprise an electrical communication component, wherein the electrical communication component is configured to transmit information captured by at least one of the first camera module, the second camera module, or the at least one rotational positional sensor. In the camera assembly comprising an electrical communication component, the electrical communication component may also comprise a flexible printed circuit board (FPCB). In another embodiment of the camera assembly comprising an electrical communication component, the electrical communication component may also comprise a printed circuit board (PCB).
In another embodiment of the camera assembly comprising an electrical communication component, the electrical communication component is physically configured to permit the stereoscopic camera assembly to be actuated in the at least two rotational degrees of freedom, and wherein the electrical communication component is configured to transmit the information captured by the at least one of the first camera module, the second camera module, or the at least one rotational positional sensor during actuation of the stereoscopic camera assembly in the at least two rotational degrees of freedom. In another embodiment of the camera assembly comprising the electrical communication component physically configured to permit the stereoscopic camera assembly to be actuated in the at least two degrees of freedom, the electrical communication component can be bent to a minimum allowable bend radius without being damaged or rendered unusable. The camera assembly comprising the electrical communication component physically configured to permit the stereoscopic camera assembly to be actuated in the at least two degrees of freedom may also comprise an electrical communication component retainer, the retainer preventing the electrical communication components from being damaged while the actuation system is in use.
In another embodiment, the camera assembly comprising the electrical communication component physically configured to permit the stereoscopic camera assembly to be actuated in the at least two degrees of freedom may also comprise a flex shield that provides a protective casing for the electrical communication components, the flex shield preventing the electrical communication components from coming into contact with other objects and/or components while the camera assembly is in use. In the camera assembly comprising a flex shield, the flex shield may also comprise side walls.
In yet another embodiment of the camera assembly comprising the electrical communication component physically configured to permit the stereoscopic camera assembly to be actuated in the at least two degrees of freedom, the electrical communication components are situated in an electrical communication cavity defined by the main camera body. The camera assembly with the electrical communication components situated in an electrical communication cavity may also comprise a flex wrap guide and a constant-force spring, wherein the constant-force spring applies a radial force on the electrical communication component. In another embodiment of the camera assembly with the electrical communication components situated in an electrical communication cavity, the main camera body may also define machined surface apertures.
In other embodiments of the camera assembly, the first optical axis of the first camera module and the second optical axis of the second camera module have an inter-axial distance configured to provide stereo vision. In another embodiment, the stereoscopic camera assembly of the camera assembly has an insertion configuration and a deployed configuration, wherein, in the insertion configuration, the first optical axis of the first camera module and the second optical axis of the second camera module are oriented perpendicular to the camera support tube. In the camera assembly with the stereoscopic camera assembly having an insertion configuration and a deployed configuration, the first camera module comprises a first camera module body having a first outer edge, the second camera module comprises a second camera module body having a second outer edge, and the maximum distance from the first outer edge of the first camera module body to the second outer edge of the second camera module body is greater than the maximum width of a cross-section of the stereoscopic camera assembly taken perpendicular to an axis of the camera support tube.
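The role of the inter-axial distance in providing stereo vision can be illustrated with the standard pinhole stereo model, in which depth is recovered from the disparity between the two camera images. The following sketch is illustrative only; the focal length and baseline values are assumptions chosen for demonstration, not parameters of the disclosed camera assembly.

```python
# Illustrative sketch of the standard pinhole stereo model. The focal
# length (in pixels) and baseline (inter-axial distance, in mm) below
# are assumed values, not specifications of the disclosed device.

def depth_from_disparity(focal_px: float, baseline_mm: float,
                         disparity_px: float) -> float:
    """Depth (mm) of a point given its disparity between the two images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    # Classic stereo relation: Z = f * B / d.
    return focal_px * baseline_mm / disparity_px

# Example: with an assumed 700 px focal length and 5 mm inter-axial
# distance, a feature with 10 px disparity lies at 700 * 5 / 10 = 350 mm.
print(depth_from_disparity(700.0, 5.0, 10.0))  # -> 350.0
```

A larger inter-axial distance produces larger disparities for a given depth, which is why the baseline between the two optical axes is what enables usable stereo vision.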
Note that numbered items remain consistent across all figures. Items numbered with the same number are either the same item, or identical copies of the item. Items numbered with different numbers are either parts of different design or are occasionally identical parts serving different purposes.
While the present system is designed for use by a surgeon within the abdominal cavity, many alternative uses of the system are possible. For example, a user might be a physician's assistant, nurse, surgical aid, or any other surgical personnel. Additionally, the device could be disposed within any part of a patient's body, and future embodiments could be designed to be much smaller so as to allow for use within smaller areas of a patient's body. Both smaller and larger devices can be fabricated for use in areas such as the paranasal sinuses, colon, stomach, or any other area within the human body, including but not limited to the abdomen, cranium and cervicis. Micro-fabrication utilizing MEMS or other means could allow for a device to be positionable within extremely small areas such as human blood vessels.
In some embodiments, the device may be used for non-surgical or non-medical tasks such as bomb disposal, military reconnaissance, inspection services, or any other task which requires obtaining multiple camera views without manual manipulation of the camera. In addition, some embodiments may be used for educational purposes, such as for training personnel. Some embodiments of the device could be fabricated to be human-sized or even larger-than-life, allowing humans to obtain visuals from areas unable to be reached or viewed by a human. Obviously, in such embodiments, the user may not necessarily be a surgeon.
Overview
In particular embodiments, the surgical apparatus system disclosed herein is designed to be incorporated and utilized with the Virtual Reality Surgical Device disclosed in International Patent Application No. PCT/US2015/029247 (published as International Patent Publication No. WO2015171614A1), incorporated by reference herein in its entirety. Notwithstanding the foregoing, in some embodiments the surgical apparatus system disclosed herein can be implemented and utilized by other existing and future robotic surgery systems and/or devices.
The purpose of the system is to allow a surgeon performing minimally invasive surgery (MIS) to obtain multiple views and angles of a surgical site without having to manually manipulate and/or move an endoscopic camera. The system allows the surgeon to control the manipulation of a camera based on the surgeon's head movements, such that the surgeon can obtain a desired view by intuitively moving his or her head in the direction he or she would like to view. When the system is in use, the surgeon is able to view the operation site in such a way that he or she has the perception of being inside the patient's body, and by simply looking around the surgeon is able to view the entire operation field and obtain a desired view. This advantageously allows the surgeon to efficiently obtain a desired view, enabling him or her to maintain focus during a procedure, resulting in quicker operation times and faster recovery times for the patient.
Unless otherwise stated, the term “distal” as used herein means relatively further from a reference point, while “proximal” means relatively closer to a reference point. In general, the reference point will be the incision site on the patient's body for which the system is being used.
Camera Console Assembly
As illustrated in
As shown in the illustrative embodiment in
In one embodiment, the actuator mounts 116 are affixed to the camera console base 108 by a screw connection. In other embodiments, the actuator mounts 116 are affixed to the camera console base 108 via pin connections, while in further embodiments other connection types and/or methods known in the art are utilized such as adhesive connections, snap-fit connections, and/or welded connections. Alternatively, in other embodiments, the actuator mounts 116 and the camera console base 108 are fabricated as one rigid piece.
The actuator mounts 116 are configured to secure the actuators 106 (
In some embodiments, where the camera console base 108 is constructed as two halves, the trocar mating fixture 114 is used to mate the two halves of the camera console base. In this embodiment, one side of the proximal end of the trocar mating fixture 114 affixes to one half of the camera console base, while the other side of the proximal end of the trocar mating fixture 114 affixes to the other half of the camera console base, thus mating the two halves of the camera console base 108.
As mentioned above, in one embodiment, the trocar mating fixture 114 contains a distal end which protrudes from the bottom of the camera console base 108, with the proximal end of the trocar mating fixture 114 resting on the camera console base 108. In one embodiment, the trocar mating fixture 114 is outfitted with a connection housing 117 (
In various embodiments, a variety of connection components are utilized to mate the trocar mating fixture 114 and the trocar assembly 102. In one embodiment, a dog leg snap button connection is used to couple the trocar mating fixture 114 to the trocar assembly 102. In this embodiment, the trocar mating fixture 114 is outfitted with a connection aperture 118, which is situated on the front face of the trocar mating fixture 114. In this embodiment, the dog leg snap button (not pictured) sits within the connection housing 117 of the trocar mating fixture 114. The dog leg snap button is constrained in the connection housing 117 by friction, with the back tab of the dog leg snap button pressed against the wall of the connection housing 117 such that the snap button is partially compressed. The front tab of the dog leg snap button contains a button which protrudes and enters the connection aperture 118 of the trocar mating fixture 114, thus securing the dog leg snap button in place and coupling the camera console assembly 101 with the trocar assembly 102.
In some embodiments, the distal end of the trocar mating fixture 114 contains a pin and slot connection 119 as depicted in the illustrative embodiment shown in
In addition, in some embodiments the trocar mating fixture 114 contains a communications cut-out. The communications cut-out is configured to allow electrical communication components from the robotic camera assembly 103 that are routed through the trocar assembly 102 to mate and couple with a camera rigid board 115 (
In some embodiments, the dog leg snap connection detailed above is eliminated.
In some embodiments, the trocar mating fixture 114 is also utilized to mate and affix other components of the camera console assembly 101 to said assembly. In some embodiments, a pulley housing block 112 mates and couples to the proximal end of the trocar mating fixture 114. As depicted in the illustrative embodiment of the camera console assembly 101 shown in
The pulley housing block 112 is utilized to house and constrain camera redirecting pulleys 111. In one embodiment, the pulley housing block 112 is configured to house four camera redirecting pulleys 111, while in other embodiments the pulley housing block 112 is configured to house as few as one camera redirecting pulley 111. In alternative embodiments, the pulley housing block 112 is configured to house more than four redirecting pulleys 111.
The redirecting pulley(s) 111 redirect yaw and pitch cables of the robotic camera assembly 103 from a vertical orientation to a horizontal orientation such that the cables are able to be routed to a plurality of actuator pulleys 105, thus allowing the actuators 106 to actuate said cables.
In one embodiment, the redirecting pulley slot(s) 120 contains a shaft channel 121 located on the top of the redirecting pulley slot(s) 120 (
Additionally, in some embodiments, the redirecting pulleys 111 are also constrained in the redirecting pulley slots 120 via redirecting pulley covers 110.
In some embodiments, the redirecting pulley covers 110 are affixed to the top of the redirecting pulley slots 120 via a screw connection, while in alternative embodiments the redirecting pulley covers 110 are affixed to the redirecting pulley slots 120 via snap-fit connections. In alternative embodiments, a variety of connection techniques are utilized to affix the redirecting pulley covers 110 to the redirecting pulley slots 120, including but not limited to press-fit connections, adhesive connections and/or any other techniques and/or combinations of connection techniques known in the art.
Additionally, in some embodiments the pulley housing block 112, contains a stiffening rod aperture, which is configured to allow a stiffening rod (not shown) to enter said aperture. The stiffening rod acts as an alignment feature to align the robotic camera system 100 and devices to be inserted, such as the Virtual Reality Surgical Device disclosed in International Patent Application No. PCT/US2015/029247 (published as International Patent Publication No. WO2015171614A1), so as to ensure that said device(s) is properly aligned for insertion into a patient. In addition, in some embodiments, the stiffening rod is also used to mate the robotic camera system 100 with devices to be inserted into a patient.
In one embodiment, the stiffening rod contains two threaded ends that allow said rod to connect to the pulley housing block 112 via a screw which is pressed in the stiffening rod aperture. In other embodiments, different connection techniques are utilized including but not limited to snap-fit connections, press-fit connections, and/or any other techniques or combinations of techniques known in the art. Alternatively, in some embodiments, the stiffening rod is eliminated. In various embodiments, the stiffening rod is fabricated out of a variety of materials including but not limited to carbon fiber, stainless steel, and/or composite materials.
In some embodiments, the pulley housing block 112 contains a flex cavity 122 located on the interior of said housing block 112.
In some embodiments, the pulley housing block 112 contains a camera support tube aperture 123 (
In various embodiments, different connection techniques are utilized to mate and couple the camera support tube 124 in the camera support tube aperture 123, including but not limited to a screw connection, a press-fit connection and/or a snap fit connection. In alternative embodiments, a rivet connection and/or any other connection technique and/or combination of techniques can be utilized to mate and couple the camera support tube 124 in the camera support tube aperture 123.
Additionally, in some embodiments the pulley housing block 112 contains an assembly slot, located on the back of said housing block, for cables from the robotic camera assembly to pass in and out of the pulley housing block 112, such that any component of the camera console assembly can be removed and/or swapped out without having to de-cable or unstring the robotic camera assembly.
Furthermore, in various embodiments, the pulley housing block 112 contains an insertion opening 125 (
In some embodiments, the pulley housing block 112 also contains a cavity configured to match the configuration and shape of the connection housing 117 of the trocar mating fixture 114 (
In some embodiments, the camera console assembly 101 also contains a console mating support 113, which affixes to the camera console base 108.
In addition, as mentioned above, in various embodiments, the console mating support 113 is configured to provide added stability to the overall system. In some embodiments, located on the inner surface of the aperture for alignment are a plurality of stiffening bumpers 218 (
As mentioned above, the console mating support 113 affixes to the camera console base 108.
In alternative embodiments, the camera console assembly does not contain a console mating support. As depicted in the illustrative embodiment shown in
In some embodiments, the camera console assembly 101 contains a flex shield 109 which affixes to the bottom of the camera console base 108.
As illustrated in the embodiment shown in
As mentioned above, in some embodiments the flex shield 109 mates and couples to the bottom of the camera console base 108. In various embodiments, a variety of connection methods and techniques are utilized to couple the flex shield 109 to the bottom of the camera console base 108, including but not limited to threaded connections, snap fit connections, press-fit connections, welded connections and/or adhesive connections. In one embodiment, the flex shield 109 contains a plurality of flex shield stand-offs 127 (
As detailed above, the camera rigid board 115 rests upon the flex shield stand-offs 127. The camera rigid board 115 acts as an intermediary for electrical communication components routed from the robotic camera assembly 103, with said rigid board taking data and information from electrical communication components and routing the information to the requisite locations, such as motor control boards and external computers. In one embodiment, the camera rigid board 115 contains a top and a bottom surface, with both surfaces containing a plurality of connectors to which electrical communication components connect.
As mentioned above, a variety of electrical communication components are used in various embodiments, including but not limited to PCBs and FPCBs. In one embodiment, an FPCB from the camera rigid board 115 is routed to a motor control board, with said FPCB providing position and orientation sensor data from the robotic camera assembly. Likewise, in one embodiment, two FPCBs are routed from the camera rigid board 115 to a computer board, which provides camera feeds and data obtained from sensors located on the robotic camera assembly. In addition, in one embodiment, the camera rigid board 115 contains a plurality of traces which are routed across said board and configured to ensure that traces from the electrical communication components of the robotic camera assembly have the same length, so that data and camera feeds reach their respective locations at the same time, thus reducing any disruption in the data flow of the system.
As detailed above, in some embodiments, the camera rigid board 115 is operatively connected to the motor control board. The motor control board is configured to process position and orientation data obtained from sensors of the robotic camera assembly, as well as from a sensor system tracking the position and orientation of a head-mounted display worn by a surgeon, as detailed below. The motor control board processes the data obtained from the aforementioned sensors and sensor system and transmits actuation commands to a plurality of actuators 106 informing the actuators 106 how much actuation force should be applied to cable(s) of the robotic camera assembly in order to actuate a stereoscopic camera 143 to follow and align with the head movements of the surgeon. In these embodiments, the motor control board is rigidly affixed to the camera console assembly.
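The mapping described above, from head-mounted display orientation to camera actuation commands, can be sketched as a simple proportional control loop: the camera's pitch and yaw (reported by the rotational position sensors) are compared against the surgeon's head pitch and yaw, and commands proportional to the error are sent to the actuators. This is a hedged illustration only; the gain value, units and control law are assumptions, as the source does not specify how the motor control board computes actuation forces.

```python
# Hedged sketch of a proportional head-following control law. The gain
# (kp) and the command units are assumed for illustration; the actual
# motor control board's algorithm is not specified by the source.

def camera_follow_commands(head_pitch, head_yaw, cam_pitch, cam_yaw, kp=2.0):
    """Return (pitch_cmd, yaw_cmd) commands that drive the stereoscopic
    camera toward the surgeon's head orientation (angles in degrees)."""
    pitch_error = head_pitch - cam_pitch  # how far the camera lags in pitch
    yaw_error = head_yaw - cam_yaw        # how far the camera lags in yaw
    return kp * pitch_error, kp * yaw_error

# Example: the head is pitched 10 degrees down while the camera sensors
# report 4 degrees, so a positive pitch command is issued; yaw is already
# aligned, so no yaw command is issued.
pitch_cmd, yaw_cmd = camera_follow_commands(10.0, 0.0, 4.0, 0.0)
print(pitch_cmd, yaw_cmd)  # -> 12.0 0.0
```

In practice such a loop would run continuously, so that the camera tracks the head in real time and the error terms stay small.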
As detailed above, in some embodiments, the camera console base 108 is outfitted with a plurality of actuator mounts 116, which are used to secure a plurality of actuators 106 (
In different embodiments, different types of actuators are utilized, including but not limited to servomotors, rotary actuators, linear actuators and/or stepper motors. In one embodiment, the system 100 contains four (4) actuators, with two actuators designated to actuate yaw cables and two actuators designated to actuate pitch cables routed from the robotic camera assembly 103. Alternatively, in other embodiments, the system contains only two (2) actuators, with one actuator designated to actuate yaw cables and one actuator designated to actuate pitch cables routed from the robotic camera assembly.
In addition to the actuators 106 being secured to the camera console base 108 via the actuator mounts 116, in some embodiments a top console body 107 is utilized to provide additional stability to the entire system 100, as well as secure the actuators 106 in place. The top console body 107 is configured to constrain the camera console assembly 101 and prevent the assembly from bending inward and collapsing in on itself, due to tension forces from the actuators 106 during actuation of cables.
As depicted in the illustrative embodiment of the camera console assembly 101 shown in
As shown in
Likewise, in some embodiments, the top console body 107 contains a device opening 129 situated in the center of said plate. In one embodiment, the device opening 129 is located directly above the pulley housing block 112, with said opening configured to have a cross sectional area large enough to allow access to the redirecting pulleys 111, as well as allow a device and/or object to pass through the device opening 129 and enter the insertion opening 125 of the pulley housing block 112. In addition, in some embodiments, the device opening 129 is configured to provide space for a stiffening rod to pass through and enter an aperture located on the console mating support 113.
In some embodiments, the top console body 107 contains a proximal stiffening rod aperture 130. In these embodiments, the proximal stiffening rod aperture 130, is located directly above the stiffening rod aperture of the pulley housing block 112. In addition, in these embodiments, the proximal stiffening rod aperture 130 is configured to have the same cross-section as the stiffening rod aperture, so as to allow a stiffening rod to pass through the proximal stiffening rod aperture 130 and enter the stiffening rod aperture.
In one embodiment, the top console body 107 is fabricated as two halves that affix to one another via snap-fit connections. In other embodiments, the snap-fit connection is replaced by a pin-hole connection, while in further embodiments other connection types and/or methods are utilized, such as adhesive connections, welded connections, magnetic connections, and/or any other method or combination of methods known in the art. In alternative embodiments, the top console body 107 is fabricated as one rigid piece. In some embodiments, the top console body 107 is constructed out of stainless steel, while in alternative embodiments the plate is constructed out of plastics, ceramics and/or other material types known in the art that are capable of supporting the camera console assembly 101.
As mentioned above, in some embodiments the camera console assembly is configured to have two actuators.
In these embodiments, as the actuator actuates the bottom counter-rotating pulley 200b, said pulley pulls in on the end of the cable that is terminated on the pulley. While the bottom counter-rotating pulley 200b rotates, the top counter-rotating pulley 200a also rotates due to the coupling between said pulleys. The pulling in on the cable pans or tilts the stereoscopic camera, depending on how the cable is routed through the camera assembly. In order to pan or tilt the stereoscopic camera in the opposite direction, the actuator is rotated in the opposite direction. When the actuator is rotated in the opposite direction, the torsion spring 202 applies a force to the top counter-rotating pulley 200a as the torsion spring moves back towards its free state. The force applied by the torsion spring causes slack from the other end of the cable to be pulled in, which causes the stereoscopic camera to pan or tilt in the opposite direction.
Trocar Assembly
As detailed above, attached to the distal end of the trocar mating fixture 114 is the trocar assembly 102.
In one embodiment, the trocar assembly 102 contains a trocar 133.
During utilization, the main trocar body 135 is situated on the exterior abdomen wall of the patient, with the trocar neck 136 inserted in the patient's abdomen wall to allow for insertion of the robotic camera assembly 103, as well as various surgical devices and apparatuses. In one embodiment, affixed to the trocar neck 136 is a winged ring 134. The winged ring 134 is configured to affix the trocar assembly 102 to the exterior abdomen wall of the patient to prevent any movement of the trocar assembly 102 during the procedure, as well as when a robotic camera assembly and surgical devices and tools are inserted into the patient's body.
In one embodiment, the main trocar body 135 contains a seal port 137, which is located on the side wall of said main trocar body 135. In this embodiment, a Luer Lock connection passes through the seal port 137 and connects to an air-port 140 located on an inflatable seal 132, as detailed below. The seal port 137 is located on the sidewall of the main trocar body 135 to allow air to be pumped into a plurality of sheaths 139 located on the interior walls of the inflatable seal 132. In this embodiment, an air hose is coupled to the Luer Lock connection, and the air hose is in turn coupled to a pump. When air is pumped into the plurality of sheaths 139 via the air-port 140, the sheaths 139 inflate and expand, creating a seal that conforms around the devices inserted in the patient via the trocar 133 of the trocar assembly 102, which prevents loss of pneumoperitoneum.
In some embodiments, located on the interior of the main trocar body 135 is a seal sub-assembly 138 made up of two seals, the inflatable seal 132 and a universal seal 131.
Additionally, in these embodiments, located on one wall of the inflatable seal 132 is an air-port 140 which protrudes from said wall. The air-port 140 is fabricated to be threaded to allow it to mate with a Luer Lock fitting which passes through the seal port 137 of the main trocar body 135, thus constraining the inflatable seal 132 rotationally and in axial travel. Furthermore, in these embodiments, the air-port 140 is also operatively coupled to the plurality of elastic sheaths 139. In these embodiments, the air-port 140 contains a Luer Lock, so as to allow carbon dioxide (CO2) or air to be pumped into said plurality of sheaths 139. As detailed above, when air is pumped into the plurality of sheaths 139, the sheaths inflate and expand, creating a form-fitting seal which conforms to the devices passed through the main trocar body 135, thus preventing gas from escaping the patient's abdomen while allowing said devices to be manipulated simultaneously. Due to the elasticity of the sheaths 139, multiple devices can pass through and/or be routed through the trocar assembly 102 while maintaining a gas-tight seal. In various embodiments, the plurality of sheaths 139 can be fabricated out of a variety of materials including but not limited to latex, neoprene, rubber and/or any other materials known in the art capable of conforming to any shape when inflated. In addition, in further embodiments, the plurality of sheaths 139 are substituted for one elastic sheath that covers the entire interior perimeter of the inflatable seal 132. In alternative embodiments, the air-port located on the inflatable seal is eliminated. In these embodiments, the inflatable seal contains an aperture which aligns with the seal port of the trocar. An air hose is routed directly to a Luer Lock connection which passes through the seal port on the trocar and mates directly with the aperture on the inflatable seal.
In other embodiments, the inflatable seal is replaced by other seals known in the art, such as an AirSeal®. In this embodiment, CO2 is continually pumped through a channel in the trocar, creating a pressure differential which prevents loss of pneumoperitoneum during insertion of tools and operation. In alternative embodiments, the inflatable seal is replaced by a compliant material which similarly fills the space of the trocar and conforms to the shapes of tools, devices or other items that are passed through the trocar. In some embodiments, a duckbill seal is utilized, while in further embodiments a combination of seals known in the art is utilized to prevent loss of pneumoperitoneum during insertion of tools and throughout an operation.
In one embodiment, the proximal end and distal end of the inflatable seal 132 are configured to have threaded ends, so as to allow the distal end to mate and couple with the main trocar body 135 and the proximal end to mate and couple with the universal seal 131. In alternative embodiments, different connection methods and attachment methods known in the art are used to couple the inflatable seal 132 to the main trocar body 135, as well as to couple the universal seal 131 with the inflatable seal 132, including but not limited to adhesive connections, snap-fit connections, and/or screw connections.
As illustrated in the embodiment shown in
In one embodiment, the proximal end of the universal seal 131 is outfitted with a plurality of seal flaps 141 which extend inward from the outer perimeter of the seal to the hollow center of said seal (
In some embodiments, the plurality of seal flaps 141 are configured to overlap each other in such a way as to allow objects to pass through the hollow center of the universal seal 131 while conforming around the object passed through, such that a seal is created preventing air from escaping the patient's body. In one embodiment, the plurality of seal flaps 141 are fabricated as semicircles, with each one of the plurality of flaps overlapping a portion of another flap. In alternative embodiments, the plurality of seal flaps 141 can be configured to take on any of a variety of shapes, including but not limited to triangles, parallelograms, ovals, and/or crescents. Additionally, in various embodiments, the seal flaps 141 are constructed out of a variety of materials having flexible and resilient properties, including but not limited to rubber, latex, neoprene, silicone and/or any other materials known in the art that are flexible and resilient.
As mentioned above, in one embodiment, the universal seal 131 is configured to have a distal end with a larger diameter than its proximal end. In this embodiment, the distal end is configured to have a hollow center with side walls that extend distally, so as to allow the proximal end of the inflatable seal 132 to sit within the distal end of the universal seal 131, such that the side walls of the universal seal 131 encompass the proximal end of the inflatable seal 132. In one embodiment, located on the interior perimeter of the distal end side wall of the universal seal 131 is a groove that is configured to allow an o-ring from the elastic sheaths 139 of the inflatable seal 132 to sit in. In this embodiment, the distal end of the universal seal 131 fits around the proximal end of the inflatable seal 132, with the o-ring from the sheaths 139 entering the groove on the interior wall of the distal end of the universal seal 131, thus creating an interference fit and coupling the universal seal 131 and the inflatable seal 132. In alternative embodiments, standard attachments known in the art are utilized to couple the universal seal 131 and the inflatable seal 132, including but not limited to adhesive connections, threaded connections, press-fits and/or snap-fit connections.
In alternative embodiments, the universal seal can take on various configurations. As depicted in the illustrative embodiment shown in
Robotic Camera Assembly
As seen in
As mentioned above, in one embodiment, the robotic camera assembly 103 is coupled to the camera console assembly 101 via the camera support tube 124 (
In one embodiment, the camera support tube 124 is equipped with a plurality of channels and/or grooves configured to allow cables to sit within, so as to provide a track for said cables to be routed to the camera console assembly 101, as well as protect said cables during actuation of the system. In alternative embodiments, the camera support tube is outfitted with lumens through which cables are routed. In addition, in one embodiment, the camera support tube 124 contains grooves and/or corrugation configured to route electronic communication components from a stereoscopic camera to a camera console assembly. In alternative embodiments, grooves and/or corrugation are located on the side of the camera support tube 124 so as to allow electronic communication components to sit flush with the support tube, preventing the electronic communication components from bending and becoming damaged.
In one embodiment, the camera support tube 124 couples to the pulley housing block 112 of the camera console assembly 101 via a screw connection, while in other embodiments standard coupling methods and techniques known in the art are utilized, including but not limited to snap-fit connections, press-fit connections, adhesive connections, and/or welded connections. The camera support tube 124 is configured to fit and pass through the trocar assembly 102 so as to allow the robotic camera assembly 103 to be inserted into a patient's body.
In some embodiments, the camera support tube 124 is configured to have a vertical offset so as to allow a stereoscopic camera to not interfere with other instruments or devices being inserted through the trocar assembly 102. In these embodiments, the proximal portion of the camera support tube that remains in the trocar assembly has a small cross-sectional area to allow other instruments and/or devices to pass through the trocar and into the field of operation. The distal end of the camera support tube has a vertical offset such that when inside the field of operation, the distal end of the camera support tube is jogged upward, so as to allow the stereoscopic camera to remain elevated above the space allowed for other instruments and/or devices to enter and pass through the trocar assembly and into the field of operation. In other embodiments, the distal end of the camera support tube is fabricated to have a horizontal offset or angular offset, while in further embodiments the camera support tube is fabricated to have a horizontal and vertical offset.
As mentioned above, in one embodiment operatively coupled to the distal end of the camera support tube 124 is the stereoscopic camera 143 (
In one embodiment, the stereoscopic camera 143 contains a main camera body 144, which is used to couple and mate the stereoscopic camera 143 with the camera support tube 124.
As seen in
In one embodiment affixed to the top surface of the main camera body 144 is a main body flex cover 146 (
As mentioned above, in one embodiment, the main body flex cover 146 provides a bearing surface on which the main camera body mount 147 can rotate.
In one embodiment, the main camera body mount 147 is configured to have a hollow center, so as to allow pitch rotation of the left camera assembly 149 and the right camera assembly 150. In addition, in one embodiment, the main camera body mount 147 contains a plurality of apertures, with one aperture utilized to connect a main mount insert 153, one aperture utilized to route cables from the pitch actuation assembly 148, one aperture utilized as an alignment pinhole to align the left camera assembly 149 and the right camera assembly 150, and one aperture used to route electrical communication components through the main camera body mount 147. In one embodiment, the main camera body mount 147 contains machined surface apertures in which rotational positional sensors 209 and capacitors sit, so as to allow electrical communication components operatively coupled to said rotational positional sensors 209 and capacitors to sit flat against the outer surface of the main camera body mount 147 (
In addition, in one embodiment the main camera body mount 147 contains a pocket into which the main mount insert 153 enters and couples. In one embodiment, the pocket is fabricated such that the main mount insert 153 couples to the main camera body mount 147 and prevents the main camera body mount 147 from moving and becoming detached from the rest of the robotic camera assembly during actuation. The main mount insert 153 is discussed in further detail below. Additionally, in one embodiment, located on the interior wall of the main camera body mount 147 are keyed grooves that are configured to constrain the rotation of a pitch bearing race 154.
In a different embodiment, the main camera body mount is fabricated as multiple components that couple to each other, in order to provide ease of assembly and repair of the pitch actuation assembly 148. As depicted in the illustrative embodiment shown in
In addition, in some embodiments the main mount 181 contains a mechanical stop to prevent the camera assemblies from being actuated and rotated past their allowable actuation range. In one embodiment, the mechanical stop is configured as two concentric rings, with one being an inner ring fixed to the main body flex cover, and one being an outer ring which rotates concentrically and is located on the main mount. In this embodiment, both concentric rings contain radial protrusions, with the protrusion located on the inner ring extending radially outward and the protrusion on the outer ring extending radially inward, such that the protrusions do not interfere with each other directly. The inner and outer concentric rings are spaced apart such that they form a track in which a bearing ball is free to move. The track formed is configured to be slightly larger than the bearing ball but constrains the bearing ball along a circular path. During actuation of the camera assemblies about the yaw axis, the protrusion on the outer ring contacts the bearing ball and rotates freely until the bearing ball contacts and jams against the protrusion on the inner ring. In different embodiments the location of the protrusions can be adjusted to configure the start and stop of rotation, while in other embodiments the width of the protrusions can be configured to achieve a desired amount of rotation. The amount of rotation achievable can range from 0 degrees of rotation all the way to 700 degrees of rotation depending on the configuration and location of the protrusions on the concentric rings. In other embodiments, the mechanical stop is configured to have three concentric rings and two bearing balls, thus allowing a greater range of rotation to be achieved.
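A rough first-order estimate of the travel permitted by such a ring-and-ball stop can be computed from the angular widths of the protrusions and the bearing ball. The model and widths below are illustrative assumptions (a planar approximation ignoring radial geometry), not dimensions taken from the system.

```python
# Approximate outer-ring travel before the chain of bearing balls jams
# against the fixed inner-ring protrusion. Planar approximation; all
# angular widths are hypothetical example values.

def max_rotation_deg(n_balls, protrusion_deg, ball_deg):
    """Each ball adds one additional ~360-degree stage of free travel;
    the protrusions and the balls themselves consume part of each stage."""
    stages = n_balls + 1
    return stages * 360.0 - stages * protrusion_deg - n_balls * ball_deg

# Example: one ball with 20-degree protrusions and a 10-degree ball width
# gives 2*360 - 2*20 - 10 = 670 degrees, consistent with the near-700-degree
# range achievable with the single-ball, two-ring configuration above.
travel = max_rotation_deg(1, 20.0, 10.0)
```

Adding a second ball and a third ring adds another full stage of travel, which is why the three-ring, two-ball variant achieves a greater range.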
In addition, in some embodiments the main mount contains a plurality of apertures for routing cables from the pitch actuation assembly and electrical communication components from the camera assemblies. Additionally, on the interior surface of the main mount 181 are machined surface apertures in which rotational positional sensors and capacitors sit. In some embodiments, the rotational positional sensors used are two hall effect sensors, which, along with capacitors, sit within the interior surface of the main mount, while in further embodiments more than two hall effect sensors and capacitors sit within the interior surface of the main mount. In addition, on the interior of the main mount 181 are machined surfaces in which the pitch bearing races of the pitch actuation assembly sit.
Similar to the main mount 181, in some embodiments, located on the interior surface of the main mount cover 182 are machined surfaces for rotational positional sensors and capacitors to sit within, as well as corresponding machined surfaces for pitch bearing races from the pitch actuation assembly to sit in (
As mentioned above, in one embodiment, the main camera body mount is configured to house the pitch actuation assembly.
In one embodiment, the pitch pulley 155 contains a pitch pulley mandrel 158 (
In one embodiment, located on the exterior of the pitch pulley mandrel 158 are two apertures in which alignment pins sit. In this embodiment, the alignment pins are used to align the pitch pulley 155 with the pitch thrust bearing 156. In one embodiment, the pitch pulley 155 couples and mates with the pitch thrust bearing 156 via a screw connection, with the screw sitting in an aperture located on the pitch pulley mandrel 158. In other embodiments, different coupling methods known in the art are used to mate and couple the pitch pulley 155 with the pitch thrust bearing 156, including but not limited to press-fit connections, snap-fit connections, and/or adhesive connections.
In addition, in one embodiment the pitch pulley 155 contains a pitch cable channel 160 around which a cable is routed. In one embodiment, located on the interior of the pitch cable channel 160 is an aperture, with said aperture crossing through the center plane of the pitch pulley 155. In this embodiment, a cable is routed around the pitch pulley 155, where it enters the aperture, passes through to the other side of the pitch pulley 155 and is routed through the main camera body 144. In this embodiment, the cable sits within the pitch cable channel 160 until it is routed through the aperture in the pitch cable channel 160; once the cable passes through the aperture to the other side of the pulley, a set screw holds the cable in place within the aperture, thus preventing the cable from moving during actuation. In one embodiment, one end of the cable is routed to one of the actuators 106 of the camera console assembly 101 and the other end is routed to a different actuator 106. In this embodiment, one of the actuators 106 is configured to actuate the pitch pulley 155 in an upward direction about a pitch axis, and another one of the actuators 106 is configured to actuate the pitch pulley 155 in the downward direction about a pitch axis. In other embodiments, both ends of the cable may be routed to only one actuator 106, with that actuator configured to actuate the pitch pulley 155 in both the upward and downward directions about a pitch axis. In addition, in alternative embodiments, two or more cables may be used to actuate the pitch pulley 155. Additionally, in some embodiments, the pitch pulley 155 also contains a bearing surface 161, with said bearing surface fabricated to allow the pitch ball bearings 157a to sit in.
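The amount of cable an actuator must reel in to produce a given pitch rotation follows the ordinary arc-length relation s = r·θ for a cable wrapped in the pulley's channel. The radius used below is a hypothetical example value, not a dimension of the pitch pulley 155.

```python
import math

def cable_pull_mm(pulley_radius_mm, rotation_deg):
    """Cable length (mm) an actuator must reel in to rotate a cable-driven
    pulley by the given angle: arc length s = r * theta (theta in radians)."""
    return pulley_radius_mm * math.radians(rotation_deg)

# Example: a hypothetical 5 mm channel radius rotated 90 degrees requires
# 5 * pi/2, roughly 7.85 mm, of cable to be pulled in; the opposing cable
# end must pay out the same length.
pull = cable_pull_mm(5.0, 90.0)
```

The same relation governs the yaw pulley, which is why equal and opposite cable travel must be maintained between the two ends of each actuation cable.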
As mentioned above, in some embodiments, contained in the pitch actuation assembly 148 is the pitch bearing race 154 (
As mentioned above, in some embodiments, contained in the pitch actuation assembly 148 is the pitch thrust bearing 156. The pitch thrust bearing 156 is configured to act as a rotational bearing for pitch of the left camera assembly 149 and the right camera assembly 150, and as a thrust bearing to compensate for an axial force acting upon the robotic camera assembly 103 during actuation. The pitch thrust bearing 156 rotates inside of the main camera body mount 147 and mates with the pitch pulley 155. In one embodiment, the pitch thrust bearing 156 contains a bearing surface for which the pitch ball bearings 157b sit on. In this embodiment, the pitch ball bearings 157b sit on the bearing surface of the pitch thrust bearing 156 and a machined bearing race located in the interior of the main camera body mount 147.
In one embodiment, the pitch thrust bearing 156 contains a plurality of apertures, with two being used to mate and couple the pitch thrust bearing 156 to a front left camera support 162, one being used to align the pitch thrust bearing 156 with the pitch pulley 155, one to route electrical communication components through the pitch actuation assembly 148, two apertures for aligning the left camera assembly 149 and the right camera assembly 150 to a desired position and orientation, and one aperture to connect the left camera assembly 149 and the right camera assembly 150. In one embodiment, the pitch thrust bearing 156 contains a notch for aligning a back right camera support 163 of the right camera assembly 150.
In other embodiments, the pitch actuation assembly can take on different configurations.
In alternative embodiments, the pitch actuation assembly is fabricated to be directly driven by an actuator. In these embodiments, the actuator replaces the cable pulley systems detailed above. Different types of actuators can be utilized in different embodiments, including but not limited to, piezoelectric motors, linear actuators, rotational motors such as servo motors or stepper motors, or other known actuators in the field. In these embodiments, the actuators provide the rotational movement for the stereoscopic camera about a pitch axis.
In alternative embodiments, pitch actuation of a stereoscopic camera is done by rotating the camera support tube about a pitch axis located at the proximal end of the camera support tube. In these embodiments, an actuator may be affixed to the proximal end of the camera support tube for rotating said support tube about the pitch axis. In different embodiments, different types of actuators can be utilized including but not limited to, piezoelectric motors, linear actuators, rotational motors such as servomotors or stepper motors, and/or other actuators known in the field capable of providing rotational movement. Alternatively, in some embodiments, the camera support tube is manually rotated about a pitch axis.
In further embodiments, the camera support tube is outfitted with an actuator located at the distal end of the support tube, which rotates the main camera body about a pitch axis. In these embodiments, the pitch actuation assembly of the stereoscopic camera may be eliminated or may be used in conjunction with the above-mentioned actuation methods. In different embodiments, different types of actuators can be utilized including but not limited to, piezoelectric motors, linear actuators, rotational motors such as servomotors or stepper motors, and/or other actuators known in the field capable of providing rotational movement.
As previously mentioned, in one embodiment, the main camera body mount 147 is coupled to the main camera body 144 via the main mount insert 153. In one embodiment, the main mount insert 153 is configured to connect the main camera body mount 147 and the yaw actuation assembly 151. In one embodiment, the main mount insert 153 contains an aperture on the bottom surface that has filleted sides to allow cable(s) to move across the surface without said cable(s) becoming damaged. In addition, in some embodiments, the bottom surface of the main mount insert 153 is fabricated to be curved in shape to match the bore of the main camera body mount 147, such that the main mount insert 153 sits flush with the main camera body mount 147. Additionally, in some embodiments, the main mount insert 153 contains a stem which mates with and sits inside a slot in the yaw actuation assembly 151. In these embodiments, the stem of the main mount insert 153 passes through the main camera body mount 147 and mates with the yaw actuation assembly 151.
In one embodiment, perpendicular to the yaw cable surface 167 is an opening through which cable(s) are routed (
In one embodiment, located beneath the yaw cable surface 167 of the yaw pulley 165 is a connection boss 168 that is machined in the shape of a horseshoe. In this embodiment, the connection boss 168 is configured so that it can only mate with the yaw pulley block 166 in one orientation. In different embodiments, the connection boss 168 is fabricated to take on a variety of shapes that allow it to mate with the yaw pulley block 166 in only one orientation, including but not limited to hexagonal shape and/or any polygon. In alternative embodiments, two connection bosses 168 are used to connect the yaw pulley 165 with the yaw pulley block 166. In addition, in other embodiments, different connections known in the field are used to connect the yaw pulley 165 with the yaw pulley block 166, including but not limited to, a pin and slot connection, a screw connection, and/or a snap-fit connection. In addition, in some embodiments, the connection boss 168 is located on the yaw pulley block instead of the yaw pulley.
As mentioned above, in some embodiments, coupled to the connection boss 168 of the yaw pulley 165 is the yaw pulley block 166.
In addition, in some embodiments the connection pocket 169 contains space in which the yaw magnet 164 sits. In these embodiments, the yaw magnet 164 is sandwiched between the yaw pulley 165 and the yaw pulley block 166. In one embodiment, the yaw magnet 164 is configured as a ring magnet. In this embodiment, the yaw magnet 164 is diametrically magnetized, so that as the yaw magnet 164 rotates around its cylindrical axis the magnetic field changes. In this embodiment, the change in magnetic field is measured by rotational positional sensors, with said sensors transmitting the data to processors which convert the change in magnetic field to rotational position data. This conversion is done with knowledge of the physical configuration of the magnet(s) and sensor(s). In this embodiment, several rotational positional sensors are placed around the diametrically magnetized magnet orthogonally to each other. As the diametrically magnetized magnet rotates, the magnetic field it generates also changes relative to the rotational positional sensors. Using simple trigonometry, a combination of two orthogonally placed sensors can determine the direction of the magnetic field by comparing the relative field strength between the sensors. This calculation yields the orientation of the diametrically magnetized magnet and, by extension, the orientation of the stereoscopic camera. Additional rotational positional sensors are placed in this embodiment for redundancy, but the total number of sensors necessary for the absolute orientation calculation will depend on the chosen configuration of the magnet and the sensors.
With the rotational data obtained from the sensors, the system is able to pinpoint how far the stereoscopic camera 143 has been rotated about a yaw axis, and thus obtain the rotational position of the stereoscopic camera 143 during actuation. In other embodiments, the yaw magnet 164 is configured as a horseshoe magnet, disc magnet, sphere magnet, cylinder magnet and/or any other magnet shape known in the art. In addition, in different embodiments a variety of rotational positional sensors known in the art that are capable of magnetic field sensing can be used, including but not limited to hall effect sensors and/or magnetoresistors.
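The two-orthogonal-sensor trigonometry described above reduces to a two-argument arctangent of the two signed field readings. This is a minimal sketch assuming ideal, identically scaled sensors; the function name is illustrative and the actual processing on the boards is not specified here.

```python
import math

def yaw_angle_deg(field_x, field_y):
    """Recover the absolute orientation of a diametrically magnetized ring
    magnet from two hall effect sensors mounted 90 degrees apart.

    field_x, field_y: signed field strengths seen by the two sensors.
    Returns the rotation angle in degrees, normalized to [0, 360)."""
    return math.degrees(math.atan2(field_y, field_x)) % 360.0

# The common field magnitude cancels inside atan2, so the result depends
# only on the magnet's orientation, not on field strength or sensor gap.
```

Redundant sensor pairs, as described above, would allow the same computation to be cross-checked or averaged for robustness.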
In some embodiments, the electrical communication component cavity 145 of the main camera body 144 contains machined surface apertures in which sensors and capacitors sit, with said sensors and capacitors obtaining the rotational position data of the stereoscopic camera 143 about a yaw axis. Similarly, electrical communication component cavity 345 is outfitted with rotational positional sensors 209 and capacitors for obtaining rotational position data of the stereoscopic camera assembly (
In some embodiments, the yaw pulley block 166 contains a protrusion 171 on the bottom surface of said yaw pulley block 166. In some embodiments, the protrusion 171 contains a slot for electrical communication components to be routed through said slot and around the protrusion 171 during actuation. In some embodiments, the protrusion 171 is configured to be circular in shape, so as to allow electrical communication components to be wrapped around said protrusion during actuation. In other embodiments, the protrusion 171 can take on a variety of shapes that allow electrical communication components to wrap around it during actuation, including but not limited to oval, spherical, and/or cylindrical.
In some embodiments, the protrusion 171 contains a pitch cable aperture 170 configured to allow cable(s) from the pitch actuation assembly 148 to be routed through said yaw actuation assembly 151, and up to the actuators 106 of the camera console assembly 101. Additionally, in some embodiments, the protrusion 171 contains an alignment pocket and a main mount insert pocket 172. In these embodiments, the alignment pocket is configured to allow an alignment pin from the main camera body mount 147 to enter the alignment pocket to align the yaw pulley block 166 and the main camera body mount 147. The main mount insert pocket 172 is configured to allow the stem of the main mount insert 153 to enter and couple the yaw pulley block 166 with the main camera body mount 147. In one embodiment, a set screw hole is located on the side of the protrusion 171. In this embodiment, a set screw enters the set screw hole and couples the main mount insert 153 in the main mount insert pocket 172, thus affixing and securing the main camera body mount 147 to the yaw actuation assembly 151. In other embodiments, different attachment and coupling methods and/or techniques known in the art are utilized to affix the main mount insert 153 to the main mount insert pocket 172 including but not limited to press-fit connections, snap-fit connections, and/or adhesive connections.
In addition, in some embodiments, located around the top surface of the yaw pulley block 166 is a yaw bearing surface 173. In these embodiments, the yaw bearing surface 173 is configured to allow a set of yaw ball bearings 174a to sit on said bearing surface. In these embodiments, the yaw actuation assembly 151 rotates inside the main camera body 144 and the main body flex cover 146. In these embodiments, the main body flex cover 146 contains a bearing surface on which one set of yaw ball bearings 174a sits and rides, and the main camera body 144 contains a bearing surface on which another set of yaw ball bearings 174b sits and rides. In these embodiments, the yaw bearing surface 173 of the yaw pulley block 166 mates with one set of yaw ball bearings 174a and rides along the bearing surface of the main camera body 144, and the other set of yaw ball bearings 174b rides along a bearing surface of the main camera body mount 147 and a bearing surface of the main body flex cover 146. This configuration allows the stereoscopic camera 143 to be rotated about a yaw axis.
As depicted in
In alternative embodiments, the yaw actuation assembly is fabricated to be directly driven by an actuator. In these embodiments, the actuator replaces the cable pulley systems detailed above. Different types of actuators can be utilized in different embodiments, including but not limited to, piezoelectric motors, linear actuators, rotational motors such as servo motors or stepper motors, or other known actuators in the field. In these embodiments, the actuators provide the rotational movement for the stereoscopic camera about a yaw axis.
As mentioned above, in some embodiments, the stereoscopic camera assembly is constructed to have two camera assemblies, each having an optical axis. In some embodiments, the camera assemblies are fabricated to have the same components, while in other embodiments, the camera assemblies are fabricated to contain different components. In alternative embodiments, the camera assemblies are fabricated to contain different variations of the same components. As seen in the embodiment shown in
In one embodiment, the left camera assembly 149 is comprised of the front left camera support 162, a back left camera support 175, a camera case 176, a camera connector (not shown), and a camera module 177a (
In one embodiment the front left camera support 162 contains an alignment protrusion that mates and couples with an aperture on the pitch actuation assembly 148, thus affixing the front left camera support 162 with the pitch actuation assembly 148. Additionally, in one embodiment, the front left camera support 162 contains alignment pin holes for aligning and coupling the front left camera support 162 with the back left camera support 175. In addition, in one embodiment, the front left camera support 162 contains a screw hole for connection with the pitch pulley 155, as well as another screw hole for connection with an end cap 178. In this embodiment, the end cap is configured to help facilitate insertion of the stereoscopic camera 143. In one embodiment, the end cap 178 has rounded edges so that it can be inserted through the trocar assembly 102 without puncturing the seals of the trocar assembly.
In one embodiment, the front left camera support 162 contains a groove in which the camera connector (not shown) of the left camera assembly 149 sits. In one embodiment, the camera connector of the left camera assembly 149 is configured as a thirty-pin connection, which connects the camera module 177a of the left camera assembly 149 to electrical communication components that run to the camera rigid board. The camera connector allows video feed obtained by the camera module 177a of the left camera assembly 149 to be transmitted to the camera rigid board, where said rigid board processes the video feed and transmits it to an external processor, which outputs the video feed to an external monitor or head mounted display worn by a surgeon, allowing the surgeon to view the operation site.
As mentioned above, the front left camera support 162 is configured to constrain and house the camera module 177a of the left camera assembly. The camera module is utilized to provide live video feed of the operation site to a surgeon. In some embodiments, the camera module 177a is fabricated to have a lens stack, an infrared filter, a module body and a digital sensor board having a digital sensor. In some embodiments, camera modules currently on the market, such as Raspberry Pi camera modules, e-con System® camera modules, and/or similar camera modules are utilized, while in other embodiments custom made camera modules may be used to provide live video feed.
In one embodiment, the module body of the camera module is fabricated to have an outer and inner edge, with the outer edge being closer to the end-cap of the camera assembly. In addition, in one embodiment, the module body of the camera module is fabricated to allow the digital sensor of the camera module to be shifted such that there is a horizontal displacement from the center of the lens stack of the camera module. The horizontal displacement of the digital sensor from the center of the lens stack allows the images obtained from the camera module 177a of the left camera assembly 149 and the images obtained from the camera module 177b of the right camera assembly 150 to have a greater overlapping region, thus providing the surgeon with a wider stereoscopic field of view. With a wider stereoscopic field of view, the amount of disparity between the images obtained from the camera module 177a of the left camera assembly 149 and the camera module 177b of the right camera assembly 150 is limited, thus reducing the amount of eye strain experienced by the surgeon.
As mentioned above, in some embodiments, the camera module 177a of the left camera assembly 149 is constrained by the front left camera support 162 and the back-left camera support 175. In these embodiments, the back-left camera support 175 is configured to support the back of camera module 177a of the left camera assembly 149, providing a surface for the back of said camera module to rest on and also providing a surface for the front left camera support 162 to couple to. In one embodiment, the back portion of back left camera support 175 is configured to be a rounded surface such as to allow said support to fit within the camera case 176a. In one embodiment, the back-left camera support 175 contains a slot for electrical communication components to be routed through, with said components being routed through the main camera body mount 147 and to the camera rigid board 115 of the camera console assembly 101. In one embodiment, the back-left camera support 175 contains a plurality of through holes, with said through holes configured to allow a set screw to pass through and adjust the alignment of the camera module 177a. In addition, in some embodiments, the back-left camera support 175 contains a plurality of connection holes to couple and mate the back-left camera support 175 with the front left camera support 162, as well as for attachment of the end cap 178a.
As mentioned above, in some embodiments the back-left camera support 175 is configured to fit within the camera case 176a of the left camera assembly 149. The camera case 176a of the left camera assembly 149 is configured to house the camera module 177a of the left camera assembly 149, the left front camera support 162, the back-left camera support 175, the camera connector of the left camera assembly 149, as well as the electrical communication components routed from the camera module 177a of the left camera assembly 149. The camera case 176a of the left camera assembly 149 is fabricated to prevent liquids and other substances from entering the left camera assembly 149. The camera case 176a of the left camera assembly 149 is configured to slide over the above referenced parts and be constrained on one end by the end cap 178a of the left camera assembly 149, and on the other end by the main camera body mount 147. In one embodiment, the end cap 178a of the left camera assembly 149 is configured as two pieces that mate together. In various embodiments, different connection methods and techniques known in the art are utilized to couple the end cap 178a of the left camera assembly 149 to the camera case 176a of the left camera assembly 149, as well as to couple said case to the main camera body mount 147. Such methods include but are not limited to screw connections, adhesive connections, and/or press-fit connections. In addition, the camera case 176a of the left camera assembly 149 contains an aperture to allow the camera module 177a of the left camera assembly 149 to have a clear view of the operation site.
As mentioned above, the stereoscopic camera or stereoscopic camera assembly 143 also comprises the right camera assembly 150 (
In one embodiment, the front right camera support 179 is configured to support and house the camera module 177b of the right camera assembly 150, similar to how the front left camera support 162 houses and supports the camera module 177a of the left camera assembly 149. In one embodiment, the front right camera support 179 couples directly with the pitch thrust bearing 156 situated inside the main camera body mount 147. In addition, in one embodiment the front right camera support 179 mates and couples with the back-right camera support 163, in the same manner detailed above for the mating and coupling of the front left camera support 162 and the back-left camera support 175. In one embodiment, the back-right camera support 163 contains a protrusion that is configured to fit inside a slot on the pitch thrust bearing 156, thus mating the right camera assembly 150 with the pitch actuation assembly 148.
Additionally, similar to the camera module 177a of the left camera assembly 149, the camera module 177b of the right camera assembly 150 is configured to provide live video feed of the operation site to a surgeon. As detailed above for the camera module 177a of the left camera assembly 149, the camera module 177b of the right camera assembly 150 is comprised of a lens stack, an infrared filter, a module body and a digital sensor. Likewise, in some embodiments, camera modules currently on the market, such as Raspberry Pi camera modules, e-con System® camera modules, and/or similar camera modules are utilized, while in other embodiments custom made camera modules may be used to provide live video feed.
In addition, in some embodiments, the module body of the camera module 177b of the right camera assembly 150 is fabricated to have an inner and outer edge, with the outer edge being closer to the end cap of the camera assembly. Additionally, in some embodiments the module body of the camera module 177b of the right camera assembly 150 is fabricated to allow the digital sensor of the camera module 177b to be shifted such that there is a horizontal displacement from the center of the lens stack of said camera module. The horizontal displacement of the digital sensor of the camera module 177b from the center of the lens stack of said camera module allows the images obtained from the camera module 177b of the right camera assembly 150 and the images obtained by the camera module 177a of the left camera assembly 149 to have a greater overlapping region, thus providing the surgeon with a wider stereoscopic field of view. In these embodiments, the digital sensor of the camera module 177a of the left camera assembly 149 is shifted to the left, and the digital sensor of the camera module 177b contained in the right camera assembly 150 is shifted to the right.
As mentioned above, the right camera assembly 150 contains a camera connector (not shown). The camera connector of the right camera assembly 150 is analogous to the camera connector of the left camera assembly 149, in that the camera connector of the right camera assembly 150 sits within a groove in the front right camera support 179. Likewise, in one embodiment the camera connector of the right camera assembly 150 is configured as a thirty-pin connection, which connects the camera module 177b of the right camera assembly 150 to electrical communication components that run to the camera rigid board 115. The camera connector of the right camera assembly 150 allows video feed obtained by the camera module 177b of the right camera assembly 150 to be transmitted to the camera rigid board 115, where said rigid board processes the video feed and transmits it to an external processor, which outputs the video feed to an external monitor or head mounted display worn by a surgeon, allowing the surgeon to view the operation site.
Furthermore, the camera case 176b of the right camera assembly 150 is analogous to the camera case 176a of the left camera assembly 149, in that the camera case 176b is configured to house the camera module 177b of the right camera assembly 150, the front right camera support 179, the back-right camera support 163, the camera connector of the right camera assembly 150, as well as the electrical communication components routed from the camera module 177b of the right camera assembly 150. The camera case 176b is fabricated to prevent liquids and other substances from entering the right camera assembly 150. In addition, the camera case 176b is configured to slide over the above referenced parts and be constrained on one end by the end cap 178b of the right camera assembly 150, and on the other end by the main camera body mount 147. In one embodiment, the end cap 178b is configured to be two pieces that mate together. In various embodiments, different connection methods and techniques known in the art are utilized to couple the end cap 178b to the camera case 176b, as well as to couple said case to the main camera body mount 147. Such methods include but are not limited to screw connections, adhesive connections, and/or press-fit connections. In addition, the camera case 176b contains an aperture to allow the camera module 177b of the right camera assembly 150 to have a clear view of the operation site.
As stated above, in some embodiments, the left and right camera assemblies contain the same components and thus are identical.
As stated above, in one embodiment the camera assembly 187 contains an electrical communication component retainer 189. The electrical communication component retainer 189 provides a space in which the electrical communication components coupled to the digital sensor board 191 sit, so as to prevent said communication components from being damaged during actuation of the camera assembly 187. In this embodiment the camera module assembly 190 is coupled to the pitch actuation assembly, with the camera module body 195 holding the electrical communication component retainer 189 in place within the camera case 376. In some embodiments, the camera assembly 187 contains a flex mandrel 188. In these embodiments, the flex mandrel 188 is used to wrap and route electrical communication components coupled to the camera module assembly 190. The flex mandrel 188 is configured to sit within a space on the electrical communication component retainer 189, with said retainer configured to fit and sit within the camera case 376 and mate with the end cap 378, to seal the camera assembly 187.
In various embodiments, components of the stereoscopic camera and camera assemblies can be configured to provide a user experience that is keyed to a specific user, allowing the user to view stereo images within a head-mounted display in a manner which feels natural and comfortable. In some embodiments, the interaxial distance between camera assemblies is modified to adjust the depth of the operation site perceived by the user. In some embodiments the digital sensor or digital sensor board of the camera module is shifted relative to the lens stack in order to provide a wider stereoscopic field of view. Additionally, in some embodiments, the focal length of a camera module is adjusted to set the focus distance of the camera assemblies.
As mentioned above, the interaxial distance between camera assemblies can be modified to adjust the depth of the operation site perceived by a user. A greater interaxial distance increases the perceived depth, while a smaller interaxial distance decreases the perceived depth of the operation site. With an increase in the interaxial distance, the amount of overlap in images obtained by the camera assemblies will decrease. At distances close to the camera assemblies, the overlap in images may be nonexistent or insufficient for stereoscopic viewing.
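The geometric trade-off described above can be sketched with simple trigonometry: for two parallel camera modules, overlap first appears at the depth where each camera's half field width equals half the interaxial distance. The 6 mm interaxial distance and 60-degree field of view below are hypothetical values chosen only to illustrate the relationship, not parameters of the disclosed device:

```python
import math

def overlap_onset_distance(interaxial_mm, horizontal_fov_deg):
    """Distance (mm) beyond which two parallel camera modules with the
    given interaxial separation begin to share a common field of view.

    Overlap first appears midway between the optical axes, at the depth
    d where each camera's half field width equals half the baseline:
        tan(fov/2) * d = interaxial/2
    """
    half_fov = math.radians(horizontal_fov_deg) / 2.0
    return interaxial_mm / (2.0 * math.tan(half_fov))

# Hypothetical 6 mm interaxial distance with 60-degree FOV modules:
print(round(overlap_onset_distance(6.0, 60.0), 2))  # ~5.2 mm
```

The formula makes the stated trade-off explicit: doubling the interaxial distance doubles the onset distance, so a wider baseline (greater perceived depth) pushes the nearest stereoscopically viewable region farther from the camera.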
In addition, as detailed above, the digital sensor or digital sensor board of the camera modules or camera module assembly can be shifted in order to increase the stereoscopic field of view. Similar to the method detailed above for the left camera assembly 149 and the right camera assembly 150, in one embodiment the digital sensor board 191 of the camera module assembly 190 can be shifted such that there is horizontal displacement from the center of the lens stack 192. In this embodiment, the horizontal displacement of the digital sensor board 191 allows the images obtained from one camera assembly 187 and the images obtained from another camera assembly to have a greater overlapping region, thus providing the surgeon or user with a wider stereoscopic field of view. In these embodiments, the digital sensor board of the camera assembly located on the left is shifted left and the digital sensor board of the camera assembly located on the right is shifted to the right. When the shift distance of the digital sensor board in each of the camera assemblies is sufficient, a zero-disparity plane (ZDP) is achieved, at which both images from the camera assemblies completely overlap. As such, by adjusting the interaxial distance between camera assemblies and shifting the digital sensor boards of said camera assemblies, the stereoscopic view obtained can be maximized.
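The sensor shift that places the zero-disparity plane at a chosen working distance follows from similar triangles under an idealized pinhole model; the focal length, interaxial distance, and working distance below are hypothetical values, not parameters of the disclosed camera:

```python
def zdp_sensor_shift_mm(focal_length_mm, interaxial_mm, zdp_distance_mm):
    """Horizontal shift to apply to each digital sensor (left sensor
    shifted left, right sensor shifted right) so that the
    zero-disparity plane falls at the requested working distance.

    A point on the midline at depth Z projects f*(b/2)/Z off-axis in
    each camera; shifting each sensor by exactly that amount makes the
    two images of the plane at depth Z coincide, i.e. have zero
    disparity.
    """
    return focal_length_mm * interaxial_mm / (2.0 * zdp_distance_mm)

# Hypothetical 3 mm focal length, 6 mm interaxial distance, and a
# zero-disparity plane at a 60 mm working distance:
print(zdp_sensor_shift_mm(3.0, 6.0, 60.0))  # → 0.15
```

Objects nearer than the chosen plane then appear in front of the display surface and objects farther away appear behind it, which is why the shift distance directly controls the comfort of the stereo view.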
Additionally, as mentioned above, the focal length of camera assemblies can be adjusted in order to focus the camera modules or module assemblies. The focal length is adjusted by moving the lens stack of the camera assembly towards or away from the digital sensor or digital sensor board. In some embodiments, the lens stack has a threaded exterior that screws into a threaded hole in the camera module body 195 or housing of the camera module 177 depending on the embodiment. In these embodiments, the focal length is adjusted by screwing the lens stack so that it is closer to or farther away from the digital sensor or digital sensor board of the camera assembly. The focal length is adjusted such that the area viewed by the surgeon or user is focused, thus providing a clear image of the operation site. In some embodiments, the lens stack is manually adjusted, while in other embodiments the focal length is adjusted electromechanically utilizing a small actuator such as a linear actuator, or rotary actuator and/or any other small actuator known in the field.
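Under an idealized thin-lens model, the lens-to-sensor spacing that the screw adjustment described above must achieve for a given subject distance can be sketched as follows; the 3 mm focal length and 60 mm subject distance are illustrative values only:

```python
def lens_to_sensor_distance_mm(focal_length_mm, subject_distance_mm):
    """Image distance given by the thin-lens equation 1/f = 1/u + 1/v:
    how far the lens stack must sit from the digital sensor to bring a
    subject at distance u into sharp focus. Screwing the threaded lens
    stack in or out is what changes this spacing in practice.
    """
    if subject_distance_mm <= focal_length_mm:
        raise ValueError("subject must be beyond the focal length")
    return 1.0 / (1.0 / focal_length_mm - 1.0 / subject_distance_mm)

# Hypothetical 3 mm lens focused on tissue 60 mm away; the sensor must
# sit slightly beyond one focal length behind the lens:
print(round(lens_to_sensor_distance_mm(3.0, 60.0), 4))  # → 3.1579
```

The example also shows why only a small mechanical travel is needed: moving the subject from 60 mm to infinity changes the required spacing by less than 0.16 mm in this hypothetical configuration.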
In some embodiments, the camera assembly is outfitted with lights to illuminate the operation site and to help increase the visibility for the surgeon or user. In one embodiment, the end caps of camera assembly are equipped with an array of light emitting diodes (LEDs). The LEDs are powered via wires routed through the camera assembly from outside the patient's body, where said wires are coupled to a power supply. Heat from the LEDs dissipates within the main camera body. In some embodiments, a small amount of sterile saline or other biocompatible fluid flows through the main camera body to cool said camera body, while in other embodiments biocompatible fluid or gas is forced through said camera body for cooling purposes. In these embodiments, biocompatible fluid or gas is routed through the main camera body via a cooling line that is routed from outside the patient body and through the camera assembly. The cooling line is coupled to a fluid or gas source (depending on the embodiment) and a pump, which pumps the fluid or gas through the cooling line. In some embodiments, fluid or gas is continuously pumped and circulated through the cooling line, while in other embodiments, the fluid or gas may be pumped into the line once or at certain time intervals. In other embodiments, the main camera body is outfitted with a temperature sensor to ensure the camera assembly remains within a safe temperature range. In alternative embodiments, LEDs are located on the camera support tube, and/or the main camera body mount. In further embodiments, fiber optics are used in place of LEDs, to illuminate the operation site.
In some embodiments, the camera assembly is outfitted with lens wipers to wipe, brush and/or remove any matter or debris located on the lens of the camera assemblies. In one embodiment, two lens wipers are affixed to the main camera body, one wiper for each camera assembly. The lens wipers are fabricated to extend from the main camera body distally towards the camera assemblies during use. In this embodiment, the lens wipers are affixed to the main camera body via a hinged connection known in the art, such that during use the wipers are able to sway side to side across the lens of the camera assemblies. In other embodiments, lens wipers are rigidly fixed to the main camera body, and the stereoscopic camera is actuated such that the lens of the camera assemblies move across the lens wipers to remove and wipe away any debris or matter. In alternative embodiments, lens wipers are attached directly to the camera assemblies.
In alternative embodiments, lens wipers are fabricated to move up and down from the main camera body towards the camera assemblies. In these embodiments, the lens wipers are configured to be collapsible. The lens wipers expand and extend from the main camera body towards the lens of the camera assemblies; as the lens wipers move up and down, they contact the lens of the camera assemblies and wipe away and remove debris or matter located on the camera assemblies. In some embodiments, the lens wipers are fabricated out of soft biocompatible rubber known in the art, while in other embodiments the lens wipers are fabricated from other biocompatible materials known in the art, such as soft biocompatible ceramics.
In further embodiments, the camera assembly is outfitted with an irrigation system to spray water or other solutions or fluids, to help remove debris and matter from the camera assemblies, as well as prevent the lens of the camera assemblies from experiencing smudging when debris or matter is wiped away. In these embodiments, the camera assembly is equipped with a fluid line that is routed from outside the patient's body and through the camera assembly. The fluid line is coupled to a fluid source and a pump, which pumps the fluid through the fluid line to sprayers located on the main camera body. In this embodiment the sprayers are positioned so that the fluid is sprayed down onto the lens of the camera assemblies. In some embodiments, the pressure at which the fluid is sprayed is controlled by the surgeon or user, while in other embodiments the fluid is set to spray at a set rate. In some embodiments, the camera assembly contains both an irrigation system and lens wipers. In these embodiments, the irrigation system and lens wipers work in conjunction to remove any debris or matter on the camera assemblies.
In some embodiments, the camera assembly is outfitted with peripheral cameras to provide the surgeon or user with real-time images of the operation site during insertion and removal of the camera assembly. In one embodiment, the end caps of the camera assemblies contain peripheral cameras in order to capture the real-time images of insertion and removal. As the camera assembly is inserted the peripheral cameras provide the surgeon with images of the operation site. In these embodiments, the peripheral cameras are orientated to be forward facing with respect to insertion, such that the camera is looking in the direction of insertion so as to provide images of the operation site as the stereoscopic camera is inserted. With the images from the peripheral camera, the surgeon can determine if there are any unforeseen conditions in the operation site, as well as determine if the angle of insertion or the point of insertion needs to be modified.
As stated above, in some embodiments the end caps of both camera assemblies contain peripheral cameras. In these embodiments, one of the peripheral cameras is used to capture images of the operation site during insertion of the stereoscopic camera, and the second peripheral camera is used to capture images of the robotic device, tool and/or instrument being inserted. The second peripheral camera allows the surgeon or user to monitor the insertion of tools, robotic devices or instruments that are being inserted through the trocar assembly. With the images from the second peripheral camera, the surgeon or user can modify the insertion angle or position of the device or instrument being inserted. In addition, during operations, the peripheral cameras are utilized to capture additional images of the operation site. The images from the peripheral cameras during the operation, provide the surgeon or user with imagery that the stereoscopic camera is unable to capture without adjusting the orientation and/or position of the stereoscopic camera.
In alternative embodiments, only the end cap of one of the camera assemblies contains a peripheral camera, while in future embodiments an end cap may contain multiple peripheral cameras. In some embodiments, the peripheral cameras comprise camera modules known on the market such as Raspberry Pi camera modules, e-con System® camera modules, and/or other similar camera modules known in the field. In other embodiments, the peripheral cameras may comprise custom camera modules.
Insertion
As aforementioned, the robotic camera system is configured to obtain multiple views of an operation site during a surgical procedure, with the camera assembly being inserted into a patient's body. In one embodiment, in order to insert the camera assembly, the trocar assembly is first inserted into the patient's body. In this embodiment, the trocar assembly is inserted into the patient's body using a standard obturator known in the art. In this embodiment, the obturator punctures the patient's abdominal wall, creating an opening wide enough to allow the trocar to be inserted into the patient's abdomen. The trocar is inserted such that the winged ring sits flush with the exterior wall of the patient's abdomen, with the proximal portion of the trocar assembly located outside of the patient's body. The winged ring is then secured to the patient's body via surgical thread. In one embodiment, two pieces of surgical thread are used to secure the winged ring to the patient's body. In this embodiment, one end of each piece of surgical thread is fastened to a screw of the winged ring, and the other end of each piece of surgical thread is sewn into the patient's body, thus securing the trocar assembly to the patient's body. With the trocar assembly secured to the patient's body, the patient's abdominal cavity is insufflated, thus expanding the patient's abdominal cavity and creating room for the camera assembly to be inserted. In other embodiments, a standard trocar currently on the market and known in the art is inserted into the patient's body in order to insufflate the patient's abdominal cavity, and then the trocar assembly is inserted into the patient's body.
With the patient's abdominal cavity insufflated, the sheaths of the inflatable seal are inflated via a pump and/or compressor coupled to the air-port. With the sheaths inflated, the camera assembly is inserted through the trocar assembly, and into the patient's abdominal cavity. In one embodiment, prior to insertion of the camera assembly, the stereoscopic camera is orientated such that the end cap of the left camera assembly is first to pass through the trocar assembly and enter the patient's abdominal cavity. In alternative embodiments, the stereoscopic camera is orientated such that the end cap of the right camera assembly is first to pass through the trocar assembly and enter the patient's abdominal cavity. Alternatively, in embodiments where the stereoscopic camera contains a peripheral camera, that end of the stereoscopic camera may be inserted first.
Once the camera assembly has been inserted into the patient's abdominal cavity, the trocar mating fixture of the camera console assembly is coupled to the trocar, thus securing the camera console assembly and the trocar assembly. This connection is used to stabilize the system, such that during actuation the camera assembly remains aligned with the trocar assembly, allowing other devices to pass through the trocar assembly and enter the patient's abdominal cavity. In alternative embodiments, the camera console assembly and the trocar assembly are not coupled to each other, thus allowing the camera console assembly and camera assembly to be rotated while inserted in the patient's body, as well as allowing the camera assembly to be pushed further into the patient's abdominal cavity and/or pulled back out towards the trocar.
Once the stereoscopic camera has been inserted into the patient's abdominal cavity, tools, devices and/or instruments can be inserted through the trocar assembly into the patient's abdominal cavity. In one embodiment, prior to insertion of a tool, instrument or robotic device through the trocar assembly, said device, tool or instrument, enters into a seal plug. The seal plug serves as a passage vessel for the tool, device or instrument to be introduced into the patient's abdominal cavity, so as to allow the tool, device or instrument to pass through the seal sub-assembly while maintaining a seal and preventing any carbon dioxide from escaping or leaking out. In one embodiment, the seal plug is configured to have a hollow center for a device, tool or other object to fit within. Prior to insertion into the trocar assembly, the device, tool or other object is inserted into the seal plug. During operation, the seal plug is positioned such that a distal portion of the instrument is outside of the plug with a proximal portion of the instrument encompassed in the plug. As the seal plug is introduced into the trocar assembly, the seal plug passes through the seal sub-assembly, with a proximal portion of the seal plug remaining outside of the seal sub-assembly. The seal plug is fabricated to fill all of the open space in the trocar such that the seals of the seal sub-assembly surround the portion of the seal plug contained within the seal sub-assembly thus creating a seal. The seal plug remains inside the trocar assembly, until the instrument is ready to be removed from the operation field.
Actuation
The stereoscopic camera is configured to obtain multiple views of an operation site by actuating the stereoscopic camera to a desired position and orientation.
In one embodiment, the stereoscopic camera is actuated by the movement of the surgeon's head. For example, during an operation, if the surgeon wishes to view an object located above his current field of view, the surgeon looks up, which results in the stereoscopic camera being rotated up about a pitch axis. In this embodiment, as disclosed in International Patent Application No. PCT/US2015/029247 (published as International Patent Publication No. WO2015171614A1), the surgeon wears a virtual-reality head-mounted display to view the live camera feed(s) obtained by the stereoscopic camera. Appropriate head-mounted displays (HMDs) such as the Oculus Rift provide the user with a head-mounted view of the operation site, lenses to allow a focused view within the display, and a sensor system to provide position and orientation tracking of the display. HMDs such as the Oculus Rift and HTC Vive have built-in tracking and sensor systems that obtain raw orientation data for yaw, pitch and roll of the HMD as well as positional data in Cartesian space (x, y, z) of the HMD. However, alternative tracking systems may be used to provide supplementary position and orientation tracking data of the display, in lieu of or in addition to the built-in tracking system of the HMD. Position and orientation sensor systems may include accelerometers, gyroscopes, magnetometers, infrared tracking, computer vision, fiducial tracking, magnetic tracking, laser tracking, ultrasonic tracking, mechanical tracking with encoders, or any other method of tracking at least one of position and orientation, or any combination thereof. The above-mentioned sensor tracking systems can be used to track the head-mounted display as worn by the user, as well as to track the rotational position of the stereoscopic camera during actuation.
In this embodiment, a sensor system tracks the position and orientation of the surgeon's head-mounted display. The sensor system relays the orientation data to a computer in real time. The position data is not necessary in this embodiment, since this embodiment of the camera system cannot independently translate in space; however, other embodiments of the camera system may rely on the positional data for additional movement or to provide supplementary data. The orientation measurements are presented relative to the HMD's built-in coordinate system. The computer interprets the raw orientation data by transforming it from the built-in coordinate system of the HMD to one which matches the coordinate system defined by the camera system. In this embodiment, a simple constant rotation suffices for this transformation, since the built-in coordinate system of the HMD and the defined coordinate system of the camera system are both fixed and known.
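The constant change of basis described above can be sketched as follows. This is a minimal illustration only; the matrix values are hypothetical examples, since the actual mapping depends on how the HMD and camera coordinate frames are defined in a given system.

```python
# Hypothetical constant rotation taking the HMD's built-in axes into the
# camera system's axes. The values below are illustrative only; the real
# mapping depends on how the two frames are defined.
R_HMD_TO_CAM = [
    [0.0, 0.0, -1.0],
    [-1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
]

def matmul(a, b):
    """Multiply two 3x3 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(m):
    """Transpose a 3x3 matrix."""
    return [[m[j][i] for j in range(3)] for i in range(3)]

def hmd_to_camera(r_hmd):
    """Re-express an HMD orientation (3x3 rotation matrix) in the camera
    system's frame via the constant change of basis: C * R * C^T."""
    return matmul(matmul(R_HMD_TO_CAM, r_hmd), transpose(R_HMD_TO_CAM))
```

Because the transform is a fixed change of basis, it can be precomputed once and applied to every orientation sample received from the HMD.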
The computer also ensures that no singularities are encountered when enforcing the rotation order determined by the physical configuration of the camera actuators. To avoid the natural singularity occurring when the 2nd rotation angle approaches 90 degrees (pi/2 radians), an algorithm begins to weigh the 1st rotation angle more heavily than the 3rd rotation angle as the 2nd rotation angle crosses a defined singularity threshold. The computer then transmits the interpreted data to the motor control board, which is operatively coupled to the actuators of the camera console assembly.
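One way the weighting scheme above could be realized is sketched below. The threshold value and the linear blending are assumptions for illustration; the disclosed algorithm is only described qualitatively.

```python
import math

# Hypothetical threshold at which blending begins; the actual value would
# be tuned for the specific camera actuator configuration.
SINGULARITY_THRESHOLD = math.radians(75.0)
LIMIT = math.radians(90.0)

def blend_angles(first, second, third):
    """Near the gimbal singularity (2nd angle -> +/-90 deg) the 1st and
    3rd rotation axes align, so their contributions become redundant.
    Past the threshold, progressively fold the 3rd angle's motion into
    the 1st, weighting the 1st more heavily as the singularity nears."""
    if abs(second) <= SINGULARITY_THRESHOLD:
        return first, second, third
    # w ramps 0 -> 1 as |second| goes from the threshold to 90 degrees.
    w = (abs(second) - SINGULARITY_THRESHOLD) / (LIMIT - SINGULARITY_THRESHOLD)
    w = min(w, 1.0)
    sign = 1.0 if second >= 0 else -1.0
    # Shift the commanded 3rd-axis motion onto the 1st axis.
    return first + sign * w * third, second, (1.0 - w) * third
```

Below the threshold the angles pass through unchanged; at the singularity itself the 3rd angle's command has been fully transferred to the 1st axis, so the ill-conditioned axis is never commanded alone.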
The motor control board receives the orientation data sent from the computer and determines the control effort needed to drive the actuators to put the camera system in the desired orientation. Physical characteristics of the camera system, such as pulley diameters, cable diameters, friction profiles, and actuator constraints, are considered in calculating the actuator commands. In this embodiment of the camera system, the actuator commands are designed using position control to drive the actuators to a specific position, which results in a desired output orientation of the stereoscopic camera. In other embodiments of the camera system, using torque control or more advanced techniques, a control torque may be calculated instead to command the actuators to drive the stereoscopic camera to the desired orientation.
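As a minimal sketch of how pulley geometry enters the position-control calculation, the following converts a desired joint angle into a motor encoder target. All dimensions and the encoder resolution are hypothetical placeholders, not values from the disclosed system.

```python
import math

def actuator_command(desired_joint_angle_rad,
                     joint_pulley_diameter_mm=20.0,  # hypothetical
                     motor_pulley_diameter_mm=5.0,   # hypothetical
                     encoder_counts_per_rev=4096):   # hypothetical
    """Convert a desired camera joint angle into a motor encoder target.
    In a cable-and-pulley drive, the motor must turn
    (joint_pulley / motor_pulley) times the desired joint angle."""
    ratio = joint_pulley_diameter_mm / motor_pulley_diameter_mm
    motor_angle_rad = desired_joint_angle_rad * ratio
    counts = motor_angle_rad / (2 * math.pi) * encoder_counts_per_rev
    return round(counts)
```

A real controller would additionally compensate for cable stretch and friction profiles, as the text notes; this sketch captures only the kinematic ratio.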
The motor control board transmits these actuation commands to the actuators of the camera console assembly, such that the actuators actuate the stereoscopic camera to follow the movement of the surgeon's head in real time. In this embodiment, position and/or orientation data obtained from rotational position sensors operatively connected to the pitch actuation assembly and the yaw actuation assembly is simultaneously transmitted back to the motor control board, such that the motor control board constantly knows the position and orientation of the stereoscopic camera, allowing it to adjust the pan and tilt of the stereoscopic camera to align with the head movements of the surgeon. In other embodiments, position and/or orientation sensing of pitch and/or yaw actuation can be omitted if the actuation of the stereoscopic camera is sufficiently rigid that actuator (motor) position can be assumed to directly correlate to the pitch and/or yaw position of the stereoscopic camera. In alternative embodiments, position and/or orientation sensing is omitted entirely, with the stereoscopic camera actuated about the pitch and yaw axes relative to previous positions and/or orientations.
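The feedback behavior described above can be illustrated with a toy simulation of the loop: each cycle, the board reads the joint sensor, computes the error against the commanded angle, and moves the actuator by a proportional step. The gain and the idealized plant are assumptions for illustration; a real controller would add integral/derivative terms, limits, and a physical motor model.

```python
def track(commanded_rad, measured_rad, kp=0.5, steps=20):
    """Toy closed-loop position tracking: per cycle, read the sensed
    angle, compute the error, and move a fraction (kp) of the error.
    Returns the history of sensed angles over the run."""
    history = []
    for _ in range(steps):
        error = commanded_rad - measured_rad
        measured_rad += kp * error  # idealized actuator response
        history.append(measured_rad)
    return history
```

With a proportional step of half the error, the residual error shrinks geometrically each cycle, so the camera converges on the commanded head orientation.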
The camera rigid board processes the video feed obtained from the stereoscopic camera. The images and/or video feed obtained from the camera modules of the stereoscopic camera are displayed on the head-mounted display: images and/or video feed obtained from the camera assembly on the left side of the stereoscopic camera are displayed to the surgeon's left eye, and images and/or video feed obtained from the camera assembly on the right side are displayed to the surgeon's right eye. The combination of the left eye view and the right eye view obtained from the camera assemblies of the stereoscopic camera provides the surgeon with a stereoscopic view of the operation site. In some embodiments, software is utilized to adjust the views of the stereoscopic camera slightly to compensate for any difference between the position of the stereoscopic camera and the surgeon's head position.
As stated above, the camera modules and camera module assembly contain a digital sensor board which captures the images and/or video feeds of the operation site. The digital sensor boards of the camera modules and module assemblies are in communication with a video processor board. In different embodiments, a variety of video processor boards are utilized, including but not limited to various models of the Raspberry Pi, eInfochips' DVPB, the NVIDIA Jetson board, or other video processor boards known in the art. The digital sensor board communicates with the video processor board via the MIPI communication protocol. In some embodiments, the image and/or video feed from each camera module is sent to its own video processor board, while in other embodiments, the image and/or video feeds from both camera modules are sent to the same video processor board. The video processor board or boards are in communication with a computer which encodes the image/video feed using video rendering software. In some embodiments, FFmpeg is the video rendering software used, while in alternative embodiments other video rendering software known in the field is utilized. The computer then sends the image and/or video feed obtained from the camera modules or module assemblies to a virtual reality computer application via network streaming. The virtual reality computer application takes the image and/or video feed from the network stream and decodes it using the video rendering software. From the video rendering software, the image and/or video feed is sent to the HMD via the software of the HMD.
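As a rough sketch of the encode-and-stream step using FFmpeg, the helper below assembles a low-latency H.264 streaming command. The device path, host, port, and codec settings are illustrative assumptions, not the actual configuration of the disclosed system.

```python
def encode_stream_cmd(device="/dev/video0", host="127.0.0.1", port=5000):
    """Build a hypothetical FFmpeg command that captures raw frames from a
    camera device, encodes them with H.264, and streams them over UDP to
    the virtual reality application for decoding."""
    return [
        "ffmpeg",
        "-f", "v4l2", "-i", device,                      # capture source
        "-c:v", "libx264",                               # H.264 encode
        "-preset", "ultrafast", "-tune", "zerolatency",  # minimize latency
        "-f", "mpegts", f"udp://{host}:{port}",          # network stream
    ]
```

The resulting list could be handed to `subprocess.run` on the encoding computer, with a matching FFmpeg-based decoder on the virtual reality application side reading from the same UDP endpoint.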
Computer System
The subject matter described herein can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structural means disclosed in this specification and structural equivalents thereof, or in combinations of them. The subject matter described herein can be implemented as one or more computer program products, such as one or more computer programs tangibly embodied in an information carrier (e.g., in a machine readable storage device), or embodied in a propagated signal, for execution by, or to control the operation of, data processing apparatus (e.g., a programmable processor, a computer, or multiple computers). A computer program (also known as a program, software, software application, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file. A program can be stored in a portion of a file that holds other programs or data, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by a communication network.
The processes and logic flows described in this specification, including the method steps of the subject matter described herein, can be performed by one or more programmable processors executing one or more computer programs to perform functions of the subject matter described herein by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus of the subject matter described herein can be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processor of any kind of digital computer. Generally, a processor will receive instructions and data from a read only memory or a random access memory or both. The essential elements of a computer are a processor for executing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks. Information carriers suitable for embodying computer program instructions and data include all forms of nonvolatile memory, including by way of example semiconductor memory devices, (e.g., EPROM, EEPROM, and flash memory devices); magnetic disks, (e.g., internal hard disks or removable disks); magneto optical disks; and optical disks (e.g., CD and DVD disks). The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
To provide for interaction with a user, the subject matter described herein can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user and a keyboard and a pointing device, (e.g., a mouse or a trackball), by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well. For example, feedback provided to the user can be any form of sensory feedback, (e.g., visual feedback, auditory feedback, or tactile feedback), and input from the user can be received in any form, including acoustic, speech, or tactile input.
The subject matter described herein can be implemented in a computing system that includes a back end component (e.g., a data server), a middleware component (e.g., an application server), or a front end component (e.g., a client computer having a graphical user interface or a web browser through which a user can interact with an implementation of the subject matter described herein), or any combination of such back end, middleware, and front end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet.
It is to be understood that the disclosed subject matter is not limited in its application to the details of construction and to the arrangements of the components set forth in the following description or illustrated in the drawings. The disclosed subject matter is capable of other embodiments and of being practiced and carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein are for the purpose of description and should not be regarded as limiting.
As such, those skilled in the art will appreciate that the conception, upon which this disclosure is based, may readily be utilized as a basis for the designing of other structures, methods, and systems for carrying out the several purposes of the disclosed subject matter. It is important, therefore, that the claims be regarded as including such equivalent constructions insofar as they do not depart from the spirit and scope of the disclosed subject matter.
Although the disclosed subject matter has been described and illustrated in the foregoing exemplary embodiments, it is understood that the present disclosure has been made only by way of example, and that numerous changes in the details of implementation of the disclosed subject matter may be made without departing from the spirit and scope of the disclosed subject matter, which is limited only by the claims which follow.
This application is a continuation of U.S. patent application Ser. No. 16/130,734, filed on Sep. 13, 2018, which claims the benefit under 35 U.S.C. § 119(e) of U.S. Provisional Patent Application No. 62/558,583, entitled Virtual Reality Surgical Camera System, filed on Sep. 14, 2017, each of which is hereby incorporated by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
2053868 | Grosso | Sep 1936 | A |
2313164 | Nelson | Mar 1943 | A |
4563812 | Goddard-Watts | Jan 1986 | A |
4573452 | Greenberg | Mar 1986 | A |
4620362 | Reynolds | Nov 1986 | A |
4651201 | Schoolman | Mar 1987 | A |
4676142 | McCormick et al. | Jun 1987 | A |
4843921 | Kremer | Jul 1989 | A |
5203646 | Landsberger et al. | Apr 1993 | A |
5368015 | Wilk | Nov 1994 | A |
5507297 | Slater et al. | Apr 1996 | A |
5515478 | Wang | May 1996 | A |
5546508 | Jain et al. | Aug 1996 | A |
5593402 | Patrick | Jan 1997 | A |
5624398 | Smith | Apr 1997 | A |
5755661 | Schwartzman | May 1998 | A |
5797900 | Madhani et al. | Aug 1998 | A |
5825982 | Wright et al. | Oct 1998 | A |
5836869 | Kudo et al. | Nov 1998 | A |
5876325 | Mizuno | Mar 1999 | A |
5911036 | Wright et al. | Jun 1999 | A |
6132368 | Cooper | Oct 2000 | A |
6162172 | Cosgrove et al. | Dec 2000 | A |
6377011 | Ben-Ur | Apr 2002 | B1 |
6441577 | Blumenkranz et al. | Aug 2002 | B2 |
6459926 | Nowlin et al. | Oct 2002 | B1 |
6491701 | Tierney et al. | Dec 2002 | B2 |
6556741 | Fan | Apr 2003 | B1 |
6587750 | Gerbi et al. | Jul 2003 | B2 |
6594552 | Nowlin et al. | Jul 2003 | B1 |
6659939 | Moll et al. | Dec 2003 | B2 |
6676684 | Morley et al. | Jan 2004 | B1 |
6682287 | Glass et al. | Jan 2004 | B2 |
6714841 | Wright et al. | Mar 2004 | B1 |
6725866 | Johnson et al. | Apr 2004 | B2 |
6783524 | Anderson et al. | Aug 2004 | B2 |
6788018 | Blumenkranz | Sep 2004 | B1 |
6817974 | Cooper et al. | Nov 2004 | B2 |
6858003 | Evans et al. | Feb 2005 | B2 |
6860878 | Brock | Mar 2005 | B2 |
6963792 | Green | Nov 2005 | B1 |
6965812 | Wang et al. | Nov 2005 | B2 |
6969385 | Moreyra | Nov 2005 | B2 |
7042184 | Oleynikov et al. | May 2006 | B2 |
7121781 | Sanchez | Oct 2006 | B2 |
7125403 | Julian et al. | Oct 2006 | B2 |
7126303 | Farritor et al. | Oct 2006 | B2 |
7185657 | Johnson et al. | Mar 2007 | B1 |
7199545 | Oleynikov et al. | Apr 2007 | B2 |
7208005 | Frecker et al. | Apr 2007 | B2 |
7239940 | Wang et al. | Jul 2007 | B2 |
7297142 | Brock | Nov 2007 | B2 |
7339341 | Oleynikov et al. | Mar 2008 | B2 |
7367973 | Manzo et al. | May 2008 | B2 |
7372229 | Farritor et al. | May 2008 | B2 |
7594912 | Cooper et al. | Sep 2009 | B2 |
7691058 | Rioux et al. | Apr 2010 | B2 |
7717890 | Drogue et al. | May 2010 | B2 |
7736356 | Cooper et al. | Jun 2010 | B2 |
7763015 | Cooper et al. | Jul 2010 | B2 |
7772796 | Farritor et al. | Aug 2010 | B2 |
7778733 | Nowlin et al. | Aug 2010 | B2 |
7831292 | Quaid et al. | Nov 2010 | B2 |
7854738 | Lee et al. | Dec 2010 | B2 |
7862502 | Pool et al. | Jan 2011 | B2 |
7862580 | Cooper et al. | Jan 2011 | B2 |
7950306 | Stuart | May 2011 | B2 |
7981025 | Pool et al. | Jul 2011 | B2 |
8016845 | Sauer | Sep 2011 | B1 |
8066644 | Sarkar et al. | Nov 2011 | B2 |
RE43049 | Grace | Dec 2011 | E |
8073335 | Labonville et al. | Dec 2011 | B2 |
8088062 | Zwolinski | Jan 2012 | B2 |
8120301 | Goldberg et al. | Feb 2012 | B2 |
8123740 | Madhani et al. | Feb 2012 | B2 |
8142421 | Cooper et al. | Mar 2012 | B2 |
8241271 | Millman et al. | Aug 2012 | B2 |
8246533 | Chang et al. | Aug 2012 | B2 |
8303576 | Brock | Nov 2012 | B2 |
8317778 | Spaide | Nov 2012 | B2 |
8333780 | Pedros et al. | Dec 2012 | B1 |
8343171 | Farritor et al. | Jan 2013 | B2 |
8375808 | Blumenkranz et al. | Feb 2013 | B2 |
8377044 | Coe et al. | Feb 2013 | B2 |
8398541 | DiMaio et al. | Mar 2013 | B2 |
8398634 | Manzo et al. | Mar 2013 | B2 |
8400094 | Schena | Mar 2013 | B2 |
8409234 | Stahler et al. | Apr 2013 | B2 |
8418073 | Mohr et al. | Apr 2013 | B2 |
8444631 | Yeung et al. | May 2013 | B2 |
8479969 | Shelton, IV | Jul 2013 | B2 |
8506555 | Ruiz Morales | Aug 2013 | B2 |
8518024 | Williams et al. | Aug 2013 | B2 |
8540748 | Murphy et al. | Sep 2013 | B2 |
8551076 | Duval et al. | Oct 2013 | B2 |
8551114 | Ramos de la Pena | Oct 2013 | B2 |
8600551 | Itkowitz et al. | Dec 2013 | B2 |
8604742 | Farritor et al. | Dec 2013 | B2 |
8613230 | Blumenkranz et al. | Dec 2013 | B2 |
8620473 | Diolaiti et al. | Dec 2013 | B2 |
8623028 | Rogers et al. | Jan 2014 | B2 |
8641700 | Devengenzo et al. | Feb 2014 | B2 |
8667860 | Helmer et al. | Mar 2014 | B2 |
8679096 | Farritor et al. | Mar 2014 | B2 |
8682489 | Itkowitz et al. | Mar 2014 | B2 |
8706301 | Zhao et al. | Apr 2014 | B2 |
8715159 | Pool et al. | May 2014 | B2 |
8721539 | Shohat et al. | May 2014 | B2 |
8747394 | Belson et al. | Jun 2014 | B2 |
8758391 | Swayze et al. | Jun 2014 | B2 |
8761930 | Nixon | Jun 2014 | B2 |
8768516 | Diolaiti et al. | Jul 2014 | B2 |
8776632 | Gao et al. | Jul 2014 | B2 |
8792951 | Mao et al. | Jul 2014 | B1 |
8808163 | Pool et al. | Aug 2014 | B2 |
8827988 | Belson et al. | Sep 2014 | B2 |
8827996 | Scott et al. | Sep 2014 | B2 |
8828024 | Farritor et al. | Sep 2014 | B2 |
8834488 | Farritor et al. | Sep 2014 | B2 |
8844789 | Shelton, IV et al. | Sep 2014 | B2 |
8852174 | Burbank | Oct 2014 | B2 |
8858538 | Belson et al. | Oct 2014 | B2 |
8876857 | Burbank | Nov 2014 | B2 |
8882660 | Phee et al. | Nov 2014 | B2 |
8894633 | Farritor et al. | Nov 2014 | B2 |
8911428 | Cooper et al. | Dec 2014 | B2 |
8912746 | Reid et al. | Dec 2014 | B2 |
8919348 | Williams et al. | Dec 2014 | B2 |
8932208 | Kendale et al. | Jan 2015 | B2 |
8936544 | Shahoian et al. | Jan 2015 | B2 |
8942828 | Schecter | Jan 2015 | B1 |
8944997 | Fernandez et al. | Feb 2015 | B2 |
8945095 | Blumenkranz et al. | Feb 2015 | B2 |
8945163 | Voegele et al. | Feb 2015 | B2 |
8945174 | Blumenkranz | Feb 2015 | B2 |
8956351 | Ravikumar et al. | Feb 2015 | B2 |
8968332 | Farritor et al. | Mar 2015 | B2 |
8974374 | Schostek et al. | Mar 2015 | B2 |
8979857 | Stad et al. | Mar 2015 | B2 |
8989903 | Weir et al. | Mar 2015 | B2 |
8991678 | Wellman et al. | Mar 2015 | B2 |
8992422 | Spivey et al. | Mar 2015 | B2 |
8992565 | Brisson et al. | Mar 2015 | B2 |
8992566 | Baldwin | Mar 2015 | B2 |
8996173 | Itkowitz et al. | Mar 2015 | B2 |
9002518 | Manzo et al. | Apr 2015 | B2 |
9005112 | Hasser et al. | Apr 2015 | B2 |
9011434 | Kappel et al. | Apr 2015 | B2 |
9028468 | Scarfogliero et al. | May 2015 | B2 |
9028494 | Shelton, IV et al. | May 2015 | B2 |
9039685 | Larkin et al. | May 2015 | B2 |
9044256 | Cadeddu et al. | Jun 2015 | B2 |
9052710 | Farwell | Jun 2015 | B1 |
9055960 | Stoy et al. | Jun 2015 | B2 |
9060678 | Larkin et al. | Jun 2015 | B2 |
9060770 | Shelton, IV et al. | Jun 2015 | B2 |
9077973 | Aguren | Jul 2015 | B2 |
9078684 | Williams | Jul 2015 | B2 |
9078695 | Hess et al. | Jul 2015 | B2 |
9089352 | Jeong | Jul 2015 | B2 |
9089353 | Farritor et al. | Jul 2015 | B2 |
9095317 | Cooper et al. | Aug 2015 | B2 |
9095362 | Dachs, II et al. | Aug 2015 | B2 |
9096033 | Holop et al. | Aug 2015 | B2 |
9101381 | Burbank et al. | Aug 2015 | B2 |
9107686 | Moon et al. | Aug 2015 | B2 |
9119655 | Bowling et al. | Sep 2015 | B2 |
9144452 | Scott et al. | Sep 2015 | B2 |
9155764 | Ahn et al. | Oct 2015 | B1 |
9173643 | Morley et al. | Nov 2015 | B2 |
9173707 | Singh | Nov 2015 | B2 |
9173915 | Kador | Nov 2015 | B1 |
9179912 | Yates et al. | Nov 2015 | B2 |
9179979 | Jinno | Nov 2015 | B2 |
9186215 | Singh | Nov 2015 | B2 |
9186220 | Stefanchik et al. | Nov 2015 | B2 |
9194403 | Neyme | Nov 2015 | B2 |
9198714 | Worrell et al. | Dec 2015 | B2 |
9216062 | Duque et al. | Dec 2015 | B2 |
9220567 | Sutherland et al. | Dec 2015 | B2 |
9226750 | Weir et al. | Jan 2016 | B2 |
9226751 | Shelton, IV et al. | Jan 2016 | B2 |
9226761 | Burbank | Jan 2016 | B2 |
9241766 | Duque et al. | Jan 2016 | B2 |
9259274 | Prisco | Feb 2016 | B2 |
9259275 | Burbank | Feb 2016 | B2 |
9261172 | Solomon et al. | Feb 2016 | B2 |
9271857 | Pool et al. | Mar 2016 | B2 |
9272166 | Hartman et al. | Mar 2016 | B2 |
9301759 | Spivey et al. | Apr 2016 | B2 |
9303212 | Flegal | Apr 2016 | B2 |
9305123 | Leotta et al. | Apr 2016 | B2 |
9308011 | Chao et al. | Apr 2016 | B2 |
9308145 | Jackson | Apr 2016 | B2 |
9309094 | Hoffend, III | Apr 2016 | B2 |
9314153 | Stein et al. | Apr 2016 | B2 |
9314239 | Brown | Apr 2016 | B2 |
9315235 | Wood | Apr 2016 | B1 |
9326823 | McMillan et al. | May 2016 | B2 |
9327081 | Gobron et al. | May 2016 | B2 |
9333003 | Kappel et al. | May 2016 | B2 |
9333041 | Yeung et al. | May 2016 | B2 |
9358031 | Manzo | Jun 2016 | B2 |
9358075 | Kim et al. | Jun 2016 | B2 |
9360093 | Garner | Jun 2016 | B2 |
9366862 | Haddick et al. | Jun 2016 | B2 |
9375288 | Robinson et al. | Jun 2016 | B2 |
9386983 | Swensgard et al. | Jul 2016 | B2 |
9393017 | Flanagan et al. | Jul 2016 | B2 |
9398911 | Auld | Jul 2016 | B2 |
9399298 | Kang | Jul 2016 | B2 |
9399558 | Guernsey et al. | Jul 2016 | B2 |
9402688 | Min et al. | Aug 2016 | B2 |
9403281 | Farritor et al. | Aug 2016 | B2 |
9404734 | Ramamurthy et al. | Aug 2016 | B2 |
9408369 | Dubinsky | Aug 2016 | B2 |
9408607 | Cartledge et al. | Aug 2016 | B2 |
9408668 | Durant et al. | Aug 2016 | B2 |
9456735 | Hrayr et al. | Oct 2016 | B2 |
9457168 | Moll et al. | Oct 2016 | B2 |
9460880 | Melecio Ramirez et al. | Oct 2016 | B2 |
9463015 | Hausen | Oct 2016 | B2 |
9463059 | Suon et al. | Oct 2016 | B2 |
9464643 | Shu | Oct 2016 | B2 |
9476245 | Hansen | Oct 2016 | B2 |
9486241 | Zeiner et al. | Nov 2016 | B2 |
9566709 | Kwon et al. | Feb 2017 | B2 |
9579163 | Valdastri et al. | Feb 2017 | B2 |
9724077 | Aranyi | Aug 2017 | B2 |
9801618 | Sachs et al. | Oct 2017 | B2 |
11006975 | Cohen et al. | May 2021 | B1 |
20020049367 | Irion et al. | Apr 2002 | A1 |
20040230161 | Zeiner | Nov 2004 | A1 |
20040231061 | Irvin et al. | Nov 2004 | A1 |
20050096502 | Khalili | May 2005 | A1 |
20060052669 | Hart | Mar 2006 | A1 |
20060178556 | Hasser et al. | Aug 2006 | A1 |
20070074584 | Talarico et al. | Apr 2007 | A1 |
20070265502 | Minosawa et al. | Nov 2007 | A1 |
20080000317 | Patton et al. | Jan 2008 | A1 |
20080004634 | Farritor | Jan 2008 | A1 |
20080033450 | Bayer | Feb 2008 | A1 |
20080064931 | Schena et al. | Mar 2008 | A1 |
20080097476 | Peh et al. | Apr 2008 | A1 |
20080147018 | Squilla et al. | Jun 2008 | A1 |
20080159653 | Dunki-Jacobs et al. | Jul 2008 | A1 |
20080221591 | Farritor et al. | Sep 2008 | A1 |
20090076536 | Rentschler et al. | Mar 2009 | A1 |
20090112229 | Omori et al. | Apr 2009 | A1 |
20090157076 | Athas et al. | Jun 2009 | A1 |
20090171373 | Farritor et al. | Jul 2009 | A1 |
20090177452 | Ullrich et al. | Jul 2009 | A1 |
20090245600 | Hoffman | Oct 2009 | A1 |
20090248041 | Williams et al. | Oct 2009 | A1 |
20100041938 | Stoianovici et al. | Feb 2010 | A1 |
20100174293 | Orban, III et al. | Jul 2010 | A1 |
20100179479 | Albrecht et al. | Jul 2010 | A1 |
20100245549 | Allen et al. | Sep 2010 | A1 |
20100331858 | Simaan et al. | Dec 2010 | A1 |
20110063428 | Sonnenschein et al. | Mar 2011 | A1 |
20110071347 | Rogers | Mar 2011 | A1 |
20110184404 | Walberg et al. | Jul 2011 | A1 |
20110202070 | Dario et al. | Aug 2011 | A1 |
20110230894 | Simaan et al. | Sep 2011 | A1 |
20110238080 | Ranjit et al. | Sep 2011 | A1 |
20120046525 | Russell et al. | Feb 2012 | A1 |
20120078053 | Phee et al. | Mar 2012 | A1 |
20120158015 | Fowler et al. | Jun 2012 | A1 |
20120190920 | Hasser et al. | Jul 2012 | A1 |
20120265214 | Bender et al. | Oct 2012 | A1 |
20120290134 | Zhao et al. | Nov 2012 | A1 |
20120316575 | Farin et al. | Dec 2012 | A1 |
20130023860 | Nagashimada | Jan 2013 | A1 |
20130066136 | Palese et al. | Mar 2013 | A1 |
20130085510 | Stefanchik et al. | Apr 2013 | A1 |
20130107665 | Fletcher et al. | May 2013 | A1 |
20130131695 | Scarfogliero | May 2013 | A1 |
20130281924 | Shellenberger | Oct 2013 | A1 |
20130321262 | Schecter | Dec 2013 | A1 |
20140005640 | Shelton, IV et al. | Jan 2014 | A1 |
20140012287 | Oyola et al. | Jan 2014 | A1 |
20140066955 | Farritor et al. | Mar 2014 | A1 |
20140107417 | McKinley et al. | Apr 2014 | A1 |
20140107665 | Shellenberger et al. | Apr 2014 | A1 |
20140114327 | Boudreaux et al. | Apr 2014 | A1 |
20140142377 | Yang et al. | May 2014 | A1 |
20140180001 | von Grunberg | Jun 2014 | A1 |
20140222020 | Bender et al. | Aug 2014 | A1 |
20140276667 | Shellenberger et al. | Sep 2014 | A1 |
20140276944 | Farritor et al. | Sep 2014 | A1 |
20150026537 | Romanovskyy et al. | Jan 2015 | A1 |
20150038984 | Hiroe et al. | Feb 2015 | A1 |
20150073223 | Pravongviengkham et al. | Mar 2015 | A1 |
20150085095 | Tesar | Mar 2015 | A1 |
20150119638 | Yu et al. | Apr 2015 | A1 |
20150130599 | Berkley et al. | May 2015 | A1 |
20150250546 | Larkin et al. | Sep 2015 | A1 |
20150272694 | Charles | Oct 2015 | A1 |
20160007827 | Frimer | Jan 2016 | A1 |
20160038008 | Molnar | Feb 2016 | A1 |
20160184032 | Romo et al. | Jun 2016 | A1 |
20160234408 | Urakawa et al. | Aug 2016 | A1 |
20160332305 | Gonzalez et al. | Nov 2016 | A1 |
20170078583 | Haggerty | Mar 2017 | A1 |
20170181802 | Sachs et al. | Jun 2017 | A1 |
20170188795 | Ouyang | Jul 2017 | A1 |
20170273716 | Garofalo et al. | Sep 2017 | A1 |
20170319174 | Hill | Nov 2017 | A1 |
20180221102 | Wang et al. | Aug 2018 | A1 |
20190076199 | Kline et al. | Mar 2019 | A1 |
Number | Date | Country |
---|---|---|
106456145 | Feb 2017 | CN |
2005-288174 | Oct 2005 | JP |
20160050449 | May 2016 | KR |
WO-2007111571 | Oct 2007 | WO |
WO-2010067267 | Jun 2010 | WO |
WO-2010126127 | Nov 2010 | WO |
WO-2011040769 | Apr 2011 | WO |
WO-2011060046 | May 2011 | WO |
WO-2011135503 | Nov 2011 | WO |
WO-2011137336 | Nov 2011 | WO |
WO-2012044334 | Apr 2012 | WO |
WO-2012060586 | May 2012 | WO |
WO-2012153151 | Nov 2012 | WO |
WO-2012158458 | Nov 2012 | WO |
WO-2013180773 | Dec 2013 | WO |
WO-2014011969 | Jan 2014 | WO |
WO-2014073121 | May 2014 | WO |
WO-2015063524 | May 2015 | WO |
WO-2015115887 | Aug 2015 | WO |
WO-2015171614 | Nov 2015 | WO |
WO-2016083189 | Jun 2016 | WO |
Entry |
---|
Chinese Office Action for Application No. 201880073752.7, dated Feb. 18, 2023, 4 pages. |
Can et al., The “Highly Versatile Single Port System” for laparoscopic surgery: Introduction and first clinical application. 4th European Conference of the International Federation for Medical and Biological Engineering. 2009;22:1650-1654. |
Kim et al., A Novel Surgical Manipulator with Workspace-Conversion Ability for Telesurgery. IEEE/ASME Transactions on Mechatronics. Feb. 2013;18(10):200-211. |
Oppenheimer et al., Immersive surgical robotic interfaces. Stud Health Technol Inform. 1999;62:242-8. |
Roppenecker, Entwicklung und Validierung eines generativ gefertigten Snake-Like Manipulators für die minimal-invasive Chirurgie. Development and Validation of an Additive Manufactured Snake-Like Manipulator for Minimally-Invasive Surgery. The dissertation was submitted to the Technical University of Munich on Jul. 18, 2016 and accepted by the Faculty of Mechanical Engineering on May 18, 2017. p. 6, (2017). |
Song et al., The Development of Human-Arm Like Manipulator for Laparoscopic Surgery With Force Sensing. IEEE International Conference on Industrial Technology. Dec. 15-17, 2006, DOI: 10.1109/ICIT.2006.372460, pp. 1258-1262, (2006). |
Talasaz, Haptics-Enabled Teleoperation for Robotics-Assisted Minimally Invasive Surgery. The University of Western Ontario. A thesis submitted in partial fulfillment of the requirements for the degree in Doctor of Philosophy. 175 pages, May 2012. |
Supplementary European Search Report for Application No. 18856312.6, dated May 20, 2021, 17 pages. |
International Search Report and Written Opinion for Application No. PCT/US2018/050922, dated Dec. 7, 2018, 10 pages. |
International Preliminary Report on Patentability for Application No. PCT/US2018/050922, dated Mar. 17, 2020, 7 pages. |
Japanese Office Action for Application No. 2020-515936, dated Aug. 24, 2022, 9 pages. |
U.S. Appl. No. 16/130,734, filed Sep. 13, 2018, 2019-0076199, Published. |
Number | Date | Country | |
---|---|---|---|
20220370156 A1 | Nov 2022 | US |
Number | Date | Country | |
---|---|---|---|
62558583 | Sep 2017 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16130734 | Sep 2018 | US |
Child | 17876238 | US |