Methods, systems, and devices relating to robotic surgical devices, end effectors, and controllers

Information

  • Patent Grant
  • Patent Number
    9,743,987
  • Date Filed
    Thursday, March 13, 2014
  • Date Issued
    Tuesday, August 29, 2017
Abstract
The various embodiments disclosed herein relate to improved robotic surgical systems, including robotic surgical devices having improved arm components and/or biometric sensors, contact detection systems for robotic surgical devices, gross positioning systems and devices for use in robotic surgical systems, and improved external controllers and consoles.
Description
FIELD OF THE INVENTION

The various embodiments disclosed herein relate to improved robotic surgical systems, including improvements to various components therein.


BACKGROUND OF THE INVENTION

Various robotic surgical tools have been developed to perform certain procedures inside a target cavity of a patient. These robotic systems are intended to replace the standard laparoscopic tools and procedures that involve the insertion of long surgical tools through trocars positioned through incisions in the patient such that the surgical tools extend into the target cavity and allow the surgeon to perform a procedure using the long tools. As these systems are developed, various new components are developed to further improve the operation and effectiveness of these systems.


There is a need in the art for improved robotic surgical systems, including improved robotic devices and arm components, external controllers, and positioning systems.


BRIEF SUMMARY OF THE INVENTION

Discussed herein are various improvements for robotic surgical systems, including robotic surgical devices having improved arm components and/or biometric sensors, contact detection systems for robotic surgical devices, gross positioning systems and devices for use in robotic surgical systems, and improved external controllers and consoles.


In Example 1, a gross positioning system for use with a robotic surgical device comprises a base, a body operably coupled to the base, a first arm link operably coupled to the body at a first rotational joint, a second arm link operably coupled to the first arm link at a second rotational joint, and an extendable third arm link operably coupled to the second arm link. A portion of the third arm link is rotatable about a third rotational joint, and the third arm link comprises a connection component at a distal end of the third arm link. Further, the connection component is configured to be coupleable to the robotic surgical device.


Example 2 relates to the gross positioning system according to Example 1, wherein an axis of rotation of the first rotational joint is perpendicular to at least one of an axis of rotation of the second rotational joint and an axis of rotation of the third rotational joint.


Example 3 relates to the gross positioning system according to Example 1, wherein an axis of rotation of the second rotational joint is perpendicular to at least one of an axis of rotation of the first rotational joint and an axis of rotation of the third rotational joint.


Example 4 relates to the gross positioning system according to Example 1, wherein an axis of rotation of the third rotational joint is perpendicular to at least one of an axis of rotation of the first rotational joint and an axis of rotation of the second rotational joint.


Example 5 relates to the gross positioning system according to Example 1, wherein an axis of rotation of the first rotational joint, an axis of rotation of the second rotational joint, and an axis of rotation of the third rotational joint intersect at a spherical joint.
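As a rough illustration (not part of the claims), the axis relationships described in Examples 2-5 can be checked numerically with dot products; the axis directions below are hypothetical placeholders, not values from this disclosure.

```python
# Hypothetical axis directions for the three rotational joints of the
# gross positioning system. Mutually perpendicular axes that intersect
# at a single point behave as a spherical joint (Example 5).
axis1 = (1.0, 0.0, 0.0)  # first rotational joint (assumed direction)
axis2 = (0.0, 1.0, 0.0)  # second rotational joint (assumed direction)
axis3 = (0.0, 0.0, 1.0)  # third rotational joint (assumed direction)

def dot(a, b):
    """Dot product; a result of zero means the two axes are perpendicular."""
    return sum(x * y for x, y in zip(a, b))

# Each pair of rotation axes is perpendicular, as recited in Examples 2-4.
for a, b in [(axis1, axis2), (axis1, axis3), (axis2, axis3)]:
    assert dot(a, b) == 0.0
```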


Example 6 relates to the gross positioning system according to Example 1, wherein the extendable third arm link comprises an extender body and an extendable rod slidably coupled to the extender body, wherein the extendable rod is configured to move between an extended position and a retracted position.


Example 7 relates to the gross positioning system according to Example 1, wherein the robotic surgical device comprises at least one arm, wherein the gross positioning system and robotic surgical device are configured to operate together to position the robotic surgical device within a body cavity of a patient.


In Example 8, an arm component for a robotic device configured to be positioned within a cavity of a patient comprises an arm body, a grasper end effector disposed at a distal end of the arm body, a first actuator operably coupled to the grasper end effector, and a second actuator operably coupled to the grasper end effector. The grasper end effector comprises an open configuration and a closed configuration. The first actuator is configured to actuate the grasper end effector to rotate. The second actuator is configured to actuate the grasper end effector to move between the open and closed configurations.


Example 9 relates to the arm component according to Example 8, further comprising a yoke operably coupled to the grasper end effector and a drive rod slidably disposed within a lumen defined within the yoke. The yoke is operably coupled at a proximal end to the first actuator, wherein the first actuator is configured to actuate the yoke to rotate. The drive rod is operably coupled at a distal end to the grasper end effector and at a proximal end to the second actuator, wherein the second actuator is configured to actuate the drive rod to slide between a distal and a proximal position.


Example 10 relates to the arm component according to Example 9, wherein the second actuator comprises a hydraulic actuator.


Example 11 relates to the arm component according to Example 10, wherein the hydraulic actuator comprises an input port defined in the hydraulic actuator, and a piston rod slidably disposed within the hydraulic actuator. The piston rod is operably coupled to the drive rod and is configured to slide proximally when hydraulic fluid is added to the hydraulic actuator through the input port, thereby urging the drive rod proximally.


Example 12 relates to the arm component according to Example 9, wherein the second actuator comprises a pneumatic actuator.


Example 13 relates to the arm component according to Example 9, wherein the second actuator comprises a shape memory alloy (“SMA”) actuator.


Example 14 relates to the arm component according to Example 13, wherein the SMA actuator comprises a distal end component and a proximal end component, at least one elongate SMA component disposed within the SMA actuator, and a tensioned spring disposed within a lumen defined in the SMA actuator. The at least one elongate SMA component is operably coupled to the distal and proximal end components. The SMA component is configured to contract due to application of heat and thereby urge the distal component toward the proximal component, thereby urging the drive rod in a proximal direction.


Example 15 relates to the arm component according to Example 14, wherein the distal component is configured to move in a distal direction when the SMA component is allowed to expand due to removal of the heat, whereby the tensioned spring is configured to urge the distal component in a distal direction, thereby urging the drive rod in a distal direction.


In Example 16, a robotic surgical system comprises a console, a processor operably coupled to the console, a first software application operably coupled to the processor, and a robotic surgical device configured to be positioned into a body cavity of a patient. The console comprises a configurable user interface that comprises a visual display of the target surgical space, at least one overlay disposed on the user interface, and at least one deployable menu configured to appear on the user interface upon command. The at least one overlay is configured to provide information about a surgical procedure being performed. The first software application is configured to generate the at least one overlay and the at least one deployable menu on the user interface.


Example 17 relates to the robotic surgical system according to Example 16, further comprising a second software application configured to provide feedback relating to a surgical performance of the user.


Example 18 relates to the robotic surgical system according to Example 16, further comprising a second software application configured to generate warm-up or practice exercises at the console for the user.


Example 19 relates to the robotic surgical system according to Example 16, further comprising at least one biometric sensor disposed on the console, wherein the at least one biometric sensor is configured to collect biometric information relating to a user, and a second software application configured to utilize the biometric information to track the physiological state of the user.


Example 20 relates to the robotic surgical system according to Example 16, wherein the first software application is further configured to provide personalized settings for each unique user upon identification of the unique user.


While multiple embodiments are disclosed, still other embodiments of the present invention will become apparent to those skilled in the art from the following detailed description, which shows and describes illustrative embodiments of the invention. As will be realized, the invention is capable of modifications in various obvious aspects, all without departing from the spirit and scope of the present invention. Accordingly, the drawings and detailed description are to be regarded as illustrative in nature and not restrictive.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1A is a perspective view of a robotic arm component having a hydraulic actuator, according to one embodiment.



FIG. 1B is a perspective view of certain internal components of the robotic arm component of FIG. 1A.



FIG. 1C is a side view of certain internal components of the robotic arm component of FIG. 1A.



FIG. 2A is an exploded perspective view of the robotic arm component of FIG. 1A.



FIG. 2B is an exploded side view of the robotic arm component of FIG. 1A.



FIG. 3 is an expanded perspective view of a portion of the robotic arm component of FIG. 1A.



FIG. 4 is an expanded perspective view of a portion of the robotic arm component of FIG. 1A.



FIG. 5A is an expanded perspective view of certain internal components of the robotic arm component of FIG. 1A.



FIG. 5B is an expanded side view of the internal components of the robotic arm component of FIG. 5A.



FIG. 6 is an expanded perspective view of a portion of the robotic arm component of FIG. 1A.



FIG. 7A is a perspective view of a robotic arm component having a shape memory alloy actuator, according to one embodiment.



FIG. 7B is a side view of the robotic arm component of FIG. 7A.



FIG. 7C is a cross-sectional view of a portion of the robotic arm component of FIG. 7A.



FIG. 8A is a cross-sectional perspective view of a portion of the robotic arm component of FIG. 7A.



FIG. 8B is a cross-sectional side view of the portion of the robotic arm component of FIG. 8A.



FIG. 8C is an expanded cross-sectional perspective view of a smaller portion of the robotic arm component of FIG. 8A.



FIG. 8D is an expanded cross-sectional perspective view of another smaller portion of the robotic arm component of FIG. 8A.



FIG. 9 is a perspective view of a portion of the robotic arm component of FIG. 7A.



FIG. 10 is a perspective view of a robotic surgical device with a contact detection system, according to one embodiment.



FIG. 11 is a schematic view of a contact detection system, according to one embodiment.



FIG. 12 is a perspective view of a portion of the robotic surgical device of FIG. 10.



FIG. 13 is a perspective view of a robotic surgical device with a pressurization system to maintain a fluidic seal, according to one embodiment.



FIG. 14A is a perspective view of an external gross positioning system, according to one embodiment.



FIG. 14B is a further perspective view of the external gross positioning system of FIG. 14A.



FIG. 15 is a perspective view of a standard surgical table.



FIG. 16A is a schematic depiction of a user interface for a robotic surgical system, according to one embodiment.



FIG. 16B is a schematic depiction of a menu that can be displayed on the user interface of FIG. 16A.



FIG. 17 is a schematic depiction of a personalized display for a robotic surgical system, according to one embodiment.



FIG. 18 shows two graphs showing the endpoint positions of robotic surgical tools operated by two different surgeons, wherein the endpoint positions were tracked during a peg transfer test.



FIG. 19 is a perspective view of a console having at least one biometric sensor, according to one embodiment.



FIG. 20 is a perspective view of a controller for a robotic surgical system, according to another embodiment.



FIG. 21 is a perspective view of a foot controller for a robotic surgical system, according to one embodiment.



FIG. 22 is a perspective view of a handheld controller with a scroll wheel for a robotic surgical system, according to one embodiment.



FIG. 23 is a perspective view of a standard mouse controller.



FIG. 24 is a perspective view of a robotic surgical device having at least one biometric sensor, according to one embodiment.





DETAILED DESCRIPTION


FIGS. 1A-1C depict a forearm 10 having a hydraulic actuator 16, according to one embodiment. The forearm 10 has a body 12 that encases the internal components, including, as best shown in FIGS. 1B and 1C, a motor 14 and a hydraulic actuator 16 (in this case, a piston 16) that are positioned within the body 12. In this embodiment, the end effector 18 at the distal end of the forearm 10 is a grasper 18. In addition, the forearm 10 has a coupling component 20 at its proximal end that is configured to be coupleable to an upper arm (not shown) of a robotic surgical device.


In accordance with one implementation, a hydraulic actuator such as the hydraulic piston 16 in FIGS. 1A-1C can provide increased speed and force characteristics for the end effector in comparison to other types of actuators while also reducing the size requirements for the forearm. In some aspects, the increased speed and force and decreased size can be accomplished because the hydraulic actuator allows for direct linear actuation of the end effector, in contrast with threaded actuators that often require multiple gears in order to convert the rotary motion of the motor into linear motion.


As shown in FIGS. 2A and 2B, according to one embodiment, the body 12 is made up of three body components 12A, 12B, 12C (also referred to herein as “shell components”) that are configured to be coupled together to make up the body 12 (also referred to as a “shell”): a first body component 12A, a second body component 12B, and a cap component 12C. In the embodiment as shown, the first and second body components 12A, 12B are formed or configured to have form-fitting inner configurations as shown such that the motor 14, the hydraulic piston 16, and any other interior components mate with the inner configurations when positioned within the body 12. According to one specific implementation, the second body component 12B can be transparent, thereby allowing a user to visually confirm operation of the internal components of the device. In accordance with one embodiment, the cap component 12C constrains the bearings and protects the gears during use.


In this embodiment, the coupling components 22 as best shown in FIGS. 2A and 2B couple the three body components 12A, 12B, 12C together. More specifically, the coupling components 22 in this embodiment are screws 22 that are positioned through one of the coupling components 12A, 12B, 12C and into another, thereby coupling those components together. Alternatively, the coupling components 22 can be any known devices or components for coupling two or more body components together.


As best shown in FIG. 2A, the motor 14 has a drive shaft 24 at its distal end that is operably coupled to a motor gear 26 that is rotated by the motor 14 when the motor 14 is actuated. As will be explained in detail below, the actuation of the motor 14 causes the end effector 18 to rotate. Further, in this embodiment, the hydraulic piston 16 is a single acting piston having an input port 28 (also referred to as an “input barb”) formed or positioned along a side of the piston 16 (as best shown in FIGS. 1B, 1C, and 4) and a spring 40 positioned at a proximal end of the piston 16 (as best shown in FIG. 2B). In one embodiment, the input port 28 is configured to be coupleable to hydraulic tubing (not shown) that is configured to provide the hydraulic fluid to the piston 16. As best shown in FIG. 4, in certain implementations the port 28 extends out of the body 12 through a hole 30 defined in the body 12.


As best shown in FIGS. 2A, 5A, and 5B, the piston 16 is coupled to the end effector 18. In this particular embodiment, actuation of the piston 16 causes the grasper 18 to move between its open and closed positions. The piston 16 has a piston rod 32 extending from the distal end of the piston 16 with a threaded drive rod 34 coupled to and extending from the piston rod 32. The threaded drive rod 34 is threadably coupled at its distal end to a coupler 36 which, in turn, is coupled to a distal drive rod 38 that is operably coupled to the grasper arms 18. In one implementation, the coupler 36 rigidly connects the two components through the use of an adhesive such as, for example, a thread locking compound. In one specific example, the adhesive is one of the threadlocker products commercially available from Loctite®. Regardless, the adhesive retains the coupler 36 in place in relation to the threaded drive rod 34 and distal drive rod 38 such that rotation is transferred through the coupler 36 rather than unscrewing one end or the other. According to one embodiment, the coupler 36 is sized to slidably fit within the bearing 46 as described in further detail below, thereby helping to keep the overall length of the forearm short. Alternatively, the coupler 36 need not be sized to fit within the bearing 46. Regardless, actuation of the piston 16 can actuate the grasper arms 18 to open and close.



FIG. 3 depicts the proximal end of the forearm body 12 and the coupling component 20. It is understood that the coupling component 20 can be any known device or component for rigidly coupling one portion of a medical device to another at a joint, and is specific to the upper arm (not shown) to which the forearm body 12 is being coupled. In this embodiment, the coupling component 20 is a rectangular protrusion 20 having a retention bolt 50 disposed within the square opening 52 defined within the protrusion 20. In accordance with one implementation, the protrusion 20 is formed at the proximal end of the first body component 12A such that rigidity may be maintained from the coupling component 20 to the end effector 18.


In use, the piston 16 operates as follows, according to one embodiment. It is understood that, according to certain implementations, the grasper 18 operates in the same fashion as many known graspers, with the distal drive rod 38 slidably positioned within a lumen (not shown) defined in the yoke 44 such that the rod 38 (which is operably coupled to the arms of the grasper 18) can actuate the grasper to move between its open and closed configurations by sliding the rod 38 distally and proximally in relation to the yoke 44. To actuate this grasper 18 or any other known grasper requiring linear actuation, fluid is added to the piston 16 through the port 28, which is best shown in FIGS. 1A, 1B, 1C, and 4. Referring specifically to FIG. 4, the port 28 is positioned along the piston 16 such that the increased pressure causes the piston rod 32 to move proximally (back into the piston body 16). This movement of the rod 32 pulls the threaded drive rod 34 and the coupler 36 in a proximal direction, thereby pulling the distal drive rod 38 proximally as well, thereby causing the grasper arms 18 to move toward the closed position. To actuate the grasper arms 18 toward the open position, the pressure in the piston 16 is reduced, thereby allowing the spring 40 to urge the piston 16 in the distal direction, thereby urging the piston rod 32, the threaded drive rod 34, the coupler 36, and the distal drive rod 38 distally and thus urging the grasper arms 18 toward the open position.


In one embodiment, the piston 16 is a single acting piston 16 with the port 28 positioned such that increased pressure causes the piston rod 32 to move proximally as described above. This configuration eliminates the need for an excessively strong fluid vacuum to move the piston proximally. A piston that requires such a strong vacuum can malfunction if any air leaks into the system, and the vacuum can draw more air into the fluid tract.


In one embodiment, the fluid provided to the piston 16 through the port 28 is provided by a driving mechanism (not shown). The driving mechanism can be any known device or component that can be coupled to the port and thereby provide fluid to the piston 16 at varying levels of pressure. According to one implementation, the driver can also be configured to sense the applied pressure and regulate the pressure in the piston 16 to drive the end effector 18 motion based on force at the end effector 18 rather than position of the end effector.
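The force-based regulation described above can be sketched as a simple proportional pressure loop; this is a minimal illustration only, and the piston area, gain, and target force below are assumed values, not part of this disclosure.

```python
# Sketch of force-based pressure regulation for the hydraulic piston.
# All numeric values are illustrative assumptions; the disclosure does
# not specify a control law or piston dimensions.
PISTON_AREA_M2 = 1e-4   # assumed piston cross-section (1 cm^2)
KP = 0.5                # assumed proportional gain

def regulate_pressure(current_pressure_pa, target_force_n):
    """One proportional step toward the pressure that yields the desired
    force at the end effector (force = pressure * piston area)."""
    target_pressure = target_force_n / PISTON_AREA_M2
    error = target_pressure - current_pressure_pa
    return current_pressure_pa + KP * error

pressure = 0.0
for _ in range(20):  # iterate toward an assumed 5 N grasp force
    pressure = regulate_pressure(pressure, target_force_n=5.0)
# pressure converges toward 5 N / 1e-4 m^2 = 50 kPa
```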


In accordance with one implementation, the fluid in the hydraulic system can be water, saline, or any other known biosafe noncompressible fluid.


Alternatively, the piston 16 can be a dual acting piston that can provide better control of the position and performance of the piston.


In the embodiment as shown, the end effector 18 can also be rotated via the motor 14. That is, as mentioned above, the motor 14 can be actuated to cause the end effector 18 to rotate. As best shown in FIGS. 2A and 2B, the motor 14 rotates the motor shaft 24, which rotates the motor gear 26, which, in turn, rotates the driven gear 42. The driven gear 42 is rotationally coupled to the end effector 18 such that rotation of the driven gear 42 causes rotation of the end effector 18. More specifically, the driven gear 42 is coupled to the yoke 44 such that rotation of the driven gear 42 causes rotation of the yoke 44. In one embodiment, the gear ratio of the motor gear 26 and the driven gear 42 can be changed to provide different performance characteristics of the end effector 18 roll axis.
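The effect of changing the gear ratio on the roll axis follows the usual speed/torque trade-off for an ideal gear pair; the tooth counts and motor values below are hypothetical examples, not figures from the patent.

```python
# Speed/torque trade-off for the motor gear (26) driving the driven
# gear (42). Tooth counts and motor values are hypothetical.
def roll_axis_output(motor_speed_rpm, motor_torque_nm,
                     motor_teeth, driven_teeth):
    """Ideal spur-gear pair: output speed divides by the ratio and
    output torque multiplies by it (frictional losses ignored)."""
    ratio = driven_teeth / motor_teeth
    return motor_speed_rpm / ratio, motor_torque_nm * ratio

# With an assumed 3:1 reduction, the roll axis turns slower but with
# three times the torque of the motor shaft.
speed, torque = roll_axis_output(1000.0, 0.01,
                                 motor_teeth=12, driven_teeth=36)
```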


The rotation by the motor 14 as described above is decoupled from the push/pull motion of the piston rod 32. That is, the components that are used to cause the rotation and the push/pull motion are configured such that the two actions are separate and independent from each other. In this embodiment, the decoupling results from the bearing 46 that is rotationally coupled to the driven gear 42 such that rotation of the driven gear 42 causes the bearing 46 to rotate. The bearing 46 further is rotatably positioned over the coupler 36 and distal drive rod 38 such that the coupler and distal drive rod 38 are positioned through the bearing 46 and do not rotate when the bearing 46 rotates. Thus, the distal drive rod 38 can move distally and proximally while the bearing 46 rotates.


In one alternative embodiment, the hydraulic actuation can be replaced with pneumatics, shape memory alloy, or some other linear actuation component.


In accordance with an alternative implementation, a shape memory alloy is used to actuate the end effector. FIGS. 7A-9 depict a forearm 60 having an end effector 62 that is actuated using a shape memory alloy that contracts upon heating.


As shown in FIGS. 7A-7C, the end effector 62 in this implementation is a set of graspers 62. The forearm 60 has a shape memory alloy (“SMA”) actuator 64 that is operably coupled to the graspers 62 such that actuation of the SMA actuator 64 causes the graspers 62 to move between their open and closed positions. Further, the forearm 60 also has a motor 66 that is operably coupled with the graspers 62 such that actuation of the motor 66 causes the graspers 62 to rotate. As best shown in FIGS. 7B and 7C, the motor 66 has a motor shaft 68 that is operably coupled to a motor gear 70. The motor gear 70 is operably coupled to a driven gear 72 that is, in turn, operably coupled to the graspers 62.


According to one embodiment, the graspers 62 are identical or substantially similar to the graspers 18 described in detail above. Alternatively, any known grasper configuration or any other known end effector can be used.


In one implementation, the forearm 60 has a gearbox 74 that is operably coupled to motor 66, the motor gear 70, and the driven gear 72 such that the gearbox 74 is configured to maintain the relative position between the motor gear 70 and the driven gear 72, thereby ensuring that the motor gear 70 will maintain a uniform contact distance with the driven gear 72. According to one embodiment, the gearbox 74 has a clamping feature actuated through the tightening of a bolt 76. The tightening of the bolt 76 pulls the motor gear 70 and driven gear 72 together and further secures the motor 66, thereby helping to prevent rotation or translation of the motor 66 or any other components.



FIGS. 8A-8D are a set of cutaway figures depicting the SMA actuator 64 according to one embodiment. The actuator 64 provides linear motion to move the grasper 62 between its open and closed positions. According to one implementation, as best shown in FIG. 8B, the SMA actuator 64 has two SMA wires 82 that are configured to contract upon heating. As used herein, the term “SMA wire” is intended to mean any elongate shape memory alloy component (also referred to as a “cord,” “rope,” or “braid”) that can be used with the SMA actuator 64 as disclosed or contemplated herein. The actuator 64 also has a spring 84 positioned through a central lumen 100 of the actuator 64.


According to one embodiment, the SMA material used in the wires 82 is nitinol. Alternatively, the material is a copper-based SMA material or a ferromagnetic shape-memory alloy (“FSMA”) material. In a further alternative, the material is a shape-memory polymer such as, for example, a shape-memory plastic. In yet another alternative, the shape-memory material can be any known shape-memory material that can be used in a medical device. Further, alternative implementations of the actuator 64 can be made of any material or component that can change its physical structure in a repeatable and useful way.


As best shown in FIGS. 8A and 8B, the actuator 64 also has two end components 88, 90 and a bolt 86 inserted into the proximal end of the actuator 64 such that the bolt is positioned within the central lumen 100 and within the spring 84, thereby helping to constrain the spring 84 within the actuator 64. The bolt 86 helps to retain the proximal end component 88 in place. In one embodiment, each of the end components 88, 90 is made up of two components (such as, for example, circuit boards) that are coupled together to make a single end component 88, 90. That is, as best shown in FIG. 8B, the proximal end component 88 has a first end component 88A and a second end component 88B, while the distal end component 90 has a first end component 90A and a second end component 90B. Each of the end components 88, 90 has two openings 92, 94 defined within the component 88, 90, with each opening 92, 94 configured to receive one of the SMA wires 82. In accordance with one implementation, each of the SMA wires 82 is configured to be formed into a knot 96 that helps to couple the wires 82 to the end components 88, 90 in the openings 92, 94 such that the wires 82 are fixedly coupled to the end components 88, 90 while also maintaining an electrical connection between the wires 82 and the end components 88, 90.


In one embodiment, with reference to one wire 82 being coupled to the distal end component 90 (with the understanding that the same process is used for each opening 92, 94 in each end component 88, 90), the wire 82 is positioned through the opening 94 and then a knot 96 is tied into the wire 82 and positioned between the first component 90A and the second component 90B of the distal end component 90. The knot tail is then fed through the opening of the second component 90B and an adhesive or fixative is used to fix the first and second components 90A, 90B together to form the distal end component 90, thereby capturing the knot 96 within the end component 90.


In accordance with one implementation, each opening 94 has a conductive ring around the opening 94 that helps to establish the electrical connection with the wire 82 disposed through the opening 94. It is understood in the art that such rings are standard features of circuit boards.


Alternatively, the two end components 88, 90 can be any devices or components that can retain the wires 82 in place while also providing an electrical connection to the wires 82. Further, instead of a knot, any known attachment component or mechanism that secures the wire 82 to the end component 88, 90 while also maintaining an electrical connection can be used.


According to one implementation, the actuator 64 also has four bolts or pins 98 positioned strategically within the actuator 64 such that the two SMA wires 82 can be positioned around the pins 98 as shown in FIG. 8B. More specifically, the pins 98 are positioned such that each wire 82 can be looped around the pins 98 in a fashion that increases the length of the wire 82 in the actuator 64 (in comparison to the length of the wire if there were no pins) while preventing the wires 82 from contacting each other and thus causing a short-circuit. The longer the wire 82, the greater the amount of force that can be created by the actuator 64. According to one embodiment in which a type of nitinol is used for the SMA wires 82, the nitinol wires 82 shorten by an amount equaling about 4% of their total length. In such an embodiment, the length of the wires 82 must be maximized using the pins 98 to loop the wires 82 in order to achieve the amount of pull required for the actuator 64. Thus, if more force is required to ensure that the grasper 62 can be moved between its open and closed positions, then the length of the wire 82 within the actuator 64 can be increased by looping the wire 82 around the pins 98 as shown, especially given the need to keep the overall size of the forearm and thus the actuator 64 as small as possible. Of course, the amount of force required, and thus the length of the wire 82 that is needed, will vary based on the type of shape memory alloy that is used for the wire 82 and the type of end effector 62 that is used. In certain alternatives, a different SMA wire 82 or a different type of nitinol can be used that has the capacity to contract more than the nitinol wires 82 described above.
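The relationship between contraction and required wire length described above reduces to simple arithmetic; the ~4% contraction figure comes from the text, while the stroke requirement below is a hypothetical example.

```python
# Required SMA wire length for a given actuation stroke, using the ~4%
# contraction figure stated in the text. The stroke value is a
# hypothetical example, not a dimension from the disclosure.
CONTRACTION_FRACTION = 0.04  # nitinol shortens by ~4% of its length

def required_wire_length_mm(stroke_mm):
    """Minimum wire length that yields the desired pull distance."""
    return stroke_mm / CONTRACTION_FRACTION

# A hypothetical 3 mm drive-rod stroke needs roughly 75 mm of wire,
# which is why the wire is looped around the pins 98 to pack that
# length into a short forearm.
length = required_wire_length_mm(3.0)
```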


As best shown in FIG. 8D, the actuator 64, in one embodiment, also has a nut 110 and a bearing 112. The nut 110 is configured to receive and be threadably coupled to the drive rod 114 (as best shown in FIG. 8B) that is operably coupled to the end effector 62. Alternatively, instead of a nut, any other coupling component or mechanism can be used to couple the actuator 64 to the end effector 62. The bearing 112 is positioned to decouple the rotation of the end effector 62 (which is actuated by the motor 66 as discussed above) from the linear motion that is actuated by the SMA actuator 64.


In accordance with one implementation, the wire 82 is designed to be able to withstand a certain minimum amount of pull force. Further, as shown in FIGS. 8A-8D, the actuator 64 has two wires 82 that are used together to create the appropriate amount of total force for the actuator 64. Alternatively, three or more wires 82 could be incorporated into the actuator 64 to provide additional actuation force. In a further alternative, two or more wires 82 could be braided together into bundles.


In one embodiment, the wire 82 is actually a braided wire 82 having four braided strands. According to one implementation, the four braided strands provide sufficient strength such that the appropriate amount of force can be applied to the wire 82 without breaking the wire 82. In addition, the four strands in the wire 82 also make it possible to provide two separate electrical loops, such that, for example, the power passes up strand one, down strand two, up strand three, and down strand four.



FIG. 9 depicts an embodiment of an SMA actuator 64 having a coupling component 116 disposed on an exterior portion of the actuator 64. In the specific embodiment as shown, the coupling component 116 is a curved projection 116 configured to couple with the motor 66 discussed above such that the motor 66 fits or “nests” within the concave portion of the projection 116. The coupling component 116 prevents the rotational actuation of the motor 66 from causing the motor 66 and actuator 64 to rotate in relation to each other.


In use, according to one embodiment, the SMA wires 82 are actuated by applying heat to the wires 82 via a known process in which an electrical current is applied to the wires, which creates heat that causes the wires 82 to contract. The contraction of the wires 82 applies a pulling force on the distal end component 90, thereby causing the component 90 to be urged proximally, which causes the drive rod 114 (shown in FIG. 8B) to retract (move in a proximal direction), thereby actuating the grasper 62 to move between its closed and open positions. In one embodiment, the retraction of the drive rod 114 causes the grasper to move toward its closed position. Alternatively, any known process for applying heat can be used.


When the grasper 62 needs to be actuated to move to the other position, the heat being applied to the SMA wires 82 is removed and the wires 82 are allowed to cool. As they cool, they begin to expand and thus lengthen. The spring 84 within the actuator 64 is configured to provide the restoring force that urges the distal end component 90 in a distal direction, thereby urging the drive rod 114 in the same direction and thus actuating the grasper 62 to move toward the other position. In one implementation, the urging of the drive rod 114 in the distal direction causes the grasper to move toward its open position.
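The heat-driven actuation cycle described in the two preceding paragraphs can be summarized in a simple state sketch. All class and attribute names here are hypothetical, and real SMA drive electronics would regulate current and allow for cooling time rather than switching states instantaneously:

```python
# Minimal sketch of the SMA actuation cycle described above: applying
# current heats and contracts the wires (retracting the drive rod and
# closing the grasper); removing current lets the wires cool and lengthen
# while the spring urges the drive rod distally (opening the grasper).
# All names and behaviors here are hypothetical simplifications.

class SMAGrasperActuator:
    def __init__(self):
        self.current_on = False
        self.grasper_state = "open"

    def close_grasper(self):
        # Heat the wires: they contract, pulling the drive rod proximally.
        self.current_on = True
        self.grasper_state = "closed"

    def open_grasper(self):
        # Remove heat: the wires cool and lengthen, and the return
        # spring urges the drive rod distally.
        self.current_on = False
        self.grasper_state = "open"


actuator = SMAGrasperActuator()
actuator.close_grasper()
assert actuator.grasper_state == "closed"
actuator.open_grasper()
assert actuator.grasper_state == "open"
```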


In accordance with one embodiment, the SMA actuator 64 has channels defined within the actuator that provide fluidic communication between the interior and the exterior of the actuator 64, thereby allowing ambient air to flow into the interior of the actuator 64 and into contact with the SMA wires 82, thus providing natural convective cooling of the wires 82. Alternatively, active cooling can be provided, such as forced air or thermo-electric cooling (such as, for example, Peltier coolers, one version of which is commercially available from Beijing Huimao Cooling Equipment Co., Ltd., located in Beijing, China) (not shown), both of which increase the amount or the speed of the cooling action in comparison to natural convective cooling.


Operation of a robotic device having moveable arms, and especially moveable arms with elbow joints, creates the risk that those arms or the elbows of those arms can contact the patient's cavity wall, potentially causing serious damage to the patient and/or the device. During a procedure, a camera positioned on the device such that it is disposed between two arms has a viewpoint that does not capture the elbows, meaning that the user generally cannot detect or observe any contact between the arms and the cavity wall. One solution is a contact detection system such as the exemplary embodiment depicted in FIG. 10. In this embodiment, a robotic device 120 has a device body 122, two arms 124, 126 coupled to the body 122, and two contact detection sleeves 128, 130 positioned over those arms 124, 126. According to certain embodiments, the sleeves 128, 130 also serve as sterilization sleeves that help to maintain a sterile field for the robot arms 124, 126. In another implementation, as best shown in FIG. 11, the sleeves 128, 130 are part of a contact detection system 138 that is made up of the sleeves 128, 130, a grounding pad 134 attached to the patient's skin, and at least one sensor 136 that is operably coupled to the sleeves 128, 130 and the pad 134. More specifically, the sensor 136 is electrically coupled to the sleeves 128, 130 via a wire 140 that extends from the sleeves 128, 130 to the sensor 136. Further, the sensor 136 is electrically coupled to the pad 134 via a wire or elongate member 142 that extends from the sensor 136 to the pad 134. In addition, in one embodiment in which one end effector is a monopolar cautery device, the pad 134 is also electrically coupled to a cautery generator (not shown) via a wire or elongate member 144. That is, in embodiments having a monopolar cautery device, the device requires a grounding pad (independent of any grounding pad, such as pad 134, for the contact detection system).
Thus, in one embodiment, the grounding pad 134 serves as a grounding pad not only for the detection system 138, but also for the monopolar cautery device. Alternatively, separate grounding pads are provided for the system 138 and the end effector. It is understood that the wires 140, 142, 144 can also include a cord, an elongate member, or any other electrical connection component.


According to one embodiment, each of the contact detection sleeves 128, 130 has contact sections (also referred to as “patches”) 132 (as best shown in FIG. 10) that are made of a conductive material such as copper and positioned along an external portion of the sleeve 128, 130. For example, a contact section 132 made of copper is schematically depicted in FIG. 12. Alternatively, the material can be silver, aluminum, or gold. In a further alternative, the material can be any known conductive material that could be used in a contact detection system. In one embodiment, the contact sections 132 are copper mesh patches 132. In one implementation as best shown in FIGS. 10 and 12, the patches 132 can be positioned strategically along the sleeves 128, 130 at those areas of the arms 124, 126 that are most likely to make inadvertent contact with an interior wall or other portion of a patient's cavity. For example, as depicted in FIG. 10, there are three patches 132 positioned along an external portion of each sleeve 128, 130, with one patch 132 positioned near each shoulder, one patch 132 at each elbow, and one patch 132 near the distal end of the forearm of each arm 124, 126. In another example, as shown in FIG. 11, the patch 132 is positioned at the elbow of the robotic arm. Alternatively, the patches 132 can be positioned at regular intervals across the exterior of each sleeve 128, 130. Alternatively, the patches 132 are distributed according to any other known pattern or strategy for positioning the patches 132 on the sleeves 128, 130. In a further alternative, the sleeves do not have patches. Instead, the sleeves—such as sleeves 128, 130—could be made up of two or more layers of material that can interact such that the sleeves themselves can detect contact and transmit a signal based on such contact (similar to the way in which a touchscreen works). These patch-less sleeves also eliminate the need for a contact pad.


As shown in FIG. 11, the grounding pad 134, in one embodiment, is positioned on the patient's lower back. The pad 134 is electrically connected to the contact sections 132 via the electrical wire discussed above such that any contact between a contact section 132 and the patient's body (including an internal cavity of the patient) creates a conductive path between the contact section 132 and the grounding pad 134. If an electrical connection is made between the contact section 132 and the grounding pad 134 via such a conductive path as a result of contact between the contact section 132 and the patient's internal cavity, the sensor (or sensors) 136 is triggered, thereby notifying the surgeon or another user that contact has been made between one of the robotic arms and a wall of the patient's cavity. Alternatively, the pad 134 can be positioned anywhere on the patient or in contact with the patient so long as the pad 134 is electrically accessible through all parts of the patient. In one implementation, the grounding pad 134 is a commercially-available grounding pad used in monopolar electrocautery.


One specific embodiment of a contact patch 132 is schematically depicted in FIG. 12. In this embodiment, the sleeve 130 has this single contact patch 132 positioned at or near the elbow of the robotic arm over which the sleeve 130 is positioned. Alternatively, the sleeve 130 can have two or more patches 132 in any of a number of configurations. In various other embodiments, the positioning of the one or more patches 132 can depend on the structure or configuration of the robotic arm (or other portion of the robotic device) or the expected movements thereof.


In use, when one of the arms 124, 126 makes contact with the patient's cavity wall, the sleeve 128, 130 on that arm, and thus at least one of the contact patches 132 on that sleeve 128, 130, makes contact with the cavity wall, thereby completing an electrical circuit running through the patch 132, the grounding pad 134, and the at least one sensor 136 such that the sensor 136 provides a notification to the user about the contact between the arm 124, 126 and the cavity wall. In one embodiment, the extent of the contact can impact the extent of the notification or feedback. That is, the harder the arm 124, 126 contacts the wall or the greater the surface of the arm 124, 126 that contacts the wall, the more the wall deforms and conforms to the shape of the arm 124, 126, thus increasing the amount of surface of the sleeve 128, 130 that contacts the wall. The increased contact surface of the sleeve 128, 130 triggers a stronger electrical connection, and the sensor 136 can be configured to provide a greater or different notification based on the stronger electrical connection. Alternatively, the location of the contact can be provided in the notification or feedback. That is, each patch 132 can be uniquely identified according to any known fashion such that the notification or feedback provides information not only about the contact, but also information about the identity of the patch 132 at which the contact occurred. According to one embodiment, this system can help detect any collision or other contact between an arm and the patient's cavity wall or an internal organ and thus help the user to better control the movements that are made using the robotic device. The sensor's notification of the contact can help to prevent the user from doing further harm to the patient or the robotic device.
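The notification logic described above, in which each patch is uniquely identified and a stronger electrical connection escalates the feedback, can be sketched as follows. The patch identifiers, location names, and signal thresholds are hypothetical; they are not specified by the embodiments above:

```python
# Illustrative sketch of the contact-notification logic described above:
# each patch is uniquely identified, and a stronger electrical connection
# (from a larger contact area) escalates the feedback level. Patch
# identifiers, locations, and thresholds are hypothetical assumptions.

PATCH_LOCATIONS = {
    1: "left shoulder", 2: "left elbow", 3: "left forearm tip",
    4: "right shoulder", 5: "right elbow", 6: "right forearm tip",
}


def contact_notification(patch_id: int, signal_strength: float) -> str:
    """Map a sensed contact signal (normalized 0..1) to a notification."""
    location = PATCH_LOCATIONS.get(patch_id, "unknown patch")
    if signal_strength < 0.1:
        return "no contact"
    level = "light contact" if signal_strength < 0.5 else "firm contact"
    return f"{level} at {location}"


print(contact_notification(2, 0.7))   # firm contact at left elbow
print(contact_notification(3, 0.2))   # light contact at left forearm tip
```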


In a further embodiment, the sleeves 128, 130 can also be configured to be electronic noise reduction sleeves 128, 130 (also referred to herein as “Faraday sleeves”). More specifically, in certain embodiments, the sleeves 128, 130 are electronic noise reduction sleeves 128, 130 made at least in part of a woven copper mesh. In at least one exemplary implementation, the sleeves 128, 130 are made entirely of a tightly woven mesh made of copper and are grounded. Alternatively, the woven mesh is made of any known conductive material. For example, alternative conductive materials include, but are not limited to, silver, aluminum, and gold. The sleeves 128, 130, in addition to providing a sterilized field for the robotic arms 124, 126, can reduce or eliminate the electronic interference (also referred to herein as “noise”) created by the multiple different electronic components in the robotic device, including, for example, motors and end effectors such as cautery components.


Another embodiment disclosed herein relates to improved methods and devices for maintaining the sterilization of a robotic device such that the device can be reused. Robotic surgical devices such as the various embodiments disclosed herein are exposed to many different body fluids during a procedure. In order to be able to reuse a surgical device, the device must be fairly impermeable to those fluids. While most components of the various robotic devices disclosed and contemplated herein are positioned such that they generally do not contact any of the body fluids, the end effectors at the distal ends of the robotic arms are, by design, intentionally in contact with or even immersed in the fluids as the end effectors are used to perform a procedure. Typically, mechanical seals such as o-rings can be used to maintain the fluidic seal in the robotic devices contemplated herein. However, o-rings may not work as effectively for certain end effectors.



FIG. 13 depicts one embodiment of a robotic device 150 that uses pressurization to maintain a fluidic seal and thereby maintain the sterilization of the device 150. More specifically, the device 150 has a body 152 and two arms 154, 156 coupled to the body 152. Each of the arms 154, 156 has an upper arm 154A, 156A and a forearm 154B, 156B. In this embodiment, the device 150 also has at least one pressurization tube 158, 160 associated with each arm 154, 156. More specifically, each pressurization tube 158, 160 is operably coupled to a forearm 154B, 156B such that the tube 158, 160 forces pressurized air into an interior portion of the forearm 154B, 156B. It is understood that the term “tube” as used herein is intended to mean any tube, pipe, line, or any other known elongate member having a lumen that can be used to deliver pressurized air. In this embodiment, the interior portion of each forearm 154B, 156B is fluidically sealed in relation to the air outside the forearms 154B, 156B except for the distal opening 166, 168 in the forearm 154B, 156B from which the end effector 162, 164 extends. In accordance with one implementation, the pressurization tubes 158, 160 force pressurized air into the forearms 154B, 156B such that their interior portions have pressures that are higher than the air pressure inside the patient's cavity, thereby creating a constant flow of air out of the forearms 154B, 156B through the distal openings 166, 168.


In use, the pressurization tubes 158, 160 pressurize the interior portions of the forearms 154B, 156B such that there is a constant flow of pressurized air out of the distal openings 166, 168 of the forearms 154B, 156B. This constant flow from each forearm 154B, 156B operates based on the same principle as a clean room—the flow of air maintains the sterility of the interior portions of the forearms 154B, 156B by preventing the fluids from accessing those interior portions. That is, the constant flow of air keeps any liquids outside the forearms 154B, 156B from entering through those distal holes 166, 168.



FIGS. 14A and 14B depict an external gross positioning device and system 180 that can be used to automatically grossly position a surgical device 182 inside a cavity of a patient (as best shown in FIG. 14B), according to one embodiment. “Gross positioning,” as used herein, is intended to mean general positioning of an entire moveable surgical device (in contrast to precise placement of the specific components of such a device, such as an arm or end effector). In known robotic surgical systems, the positioning of those devices during a surgical procedure can be a challenging task. Further, minimally invasive surgical procedures (using either robotic or non-robotic systems) frequently require a surgical technician to reposition the surgical equipment, such as a laparoscope. Such repositioning takes time and additional effort. In addition, in some cases, the surgical technician is a junior medical student who is not fully trained in laparoscopy. As a result, the repositioning instructions from the surgeon often result in an obstructed and/or fogged view of the surgical site, requiring additional cognitive resources from the surgeon. Hence, the Da Vinci® system and known single incision surgical devices often require time-consuming repositioning of the patient, the robotic system, or both while performing complicated procedures.


The various gross positioning devices contemplated herein aid in the repositioning of surgical devices (including, for example, any surgical devices that have a device body or rod configured to be positioned through an incision and at least one robotic arm coupled to the device body that is positioned entirely within the cavity of the patient) throughout the procedure without additional intervention from the surgical staff. The gross positioning system embodiments are capable of controlling the degrees of freedom, azimuth and elevation angle, and roll and translation about the axis of insertion of laparoscopic surgical tools, including robotic laparoscopic surgical tools. As a result, the gross positioning device embodiments disclosed and contemplated herein can grossly position a surgical device through an incision into a patient cavity such as the abdominal cavity with high manipulability, reducing the operative time and stress induced upon the surgical staff. The combination of the external gross positioning system with the internal surgical device system will allow the degrees of freedom of the internal system to effectively increase without increasing the size of the surgical robot/device.


In one implementation, the various devices described and contemplated herein can be used with any single site surgical device with an available external positioning fixture, such as a protruding rod or magnetic handle.


This system embodiment 180 has a base 184 and a body 186 coupled to the base. An upper arm or first arm link 188 is rotatably coupled to the body 186 at a rotational coupling or joint 190 such that the upper arm 188 can rotate in relation to the body 186 around the axis 190A as best shown in FIG. 14B. A forearm or second arm link 192 is rotatably coupled to the upper arm 188 at a rotational coupling or joint 194 such that the forearm 192 can rotate in relation to the upper arm 188 around the axis 194A as best shown in FIG. 14B. The device 180 also has a third link or extender 198 (best shown in FIG. 14B) coupled to the forearm 192. The extender 198, according to one embodiment, has two degrees of freedom: it can both rotate and extend laterally. That is, the extender 198 is configured to move between an extended position and a retracted position and any position in between. In one embodiment, the amount of extension and retraction is depicted by the arrow 204 in FIG. 14B. As shown in FIG. 14B, the extender 198 can have two components: a stationary body 198B and an extendable rod 198C. In this embodiment, the extendable rod 198C extends from and retracts into the stationary body 198B as shown. Further, the extender 198 can also rotate around axis 198A. More specifically, in the depicted embodiment, the body 198B and the extendable rod 198C are rotationally coupled to each other such that they both rotate around axis 198A together. Alternatively, the extendable rod 198C can rotate in relation to the stationary body 198B around axis 198A while the stationary body 198B does not rotate. Alternatively, the extender 198 can be any known component or device that provides both extension and rotation as contemplated herein.


In one implementation, the base 184 is configured to keep the entire device 180 stable and secure during use. As shown, the base 184 is made up of two extended pieces or “feet” 184A, 184B (best shown in FIG. 14A) that provide stability and help to prevent the device 180 from tilting or tipping during use. In alternative embodiments, the base 184 can be any structure that provides such stability, including, for example, a very heavy or weighted structure that uses the weight to enhance stability. In certain implementations, the base can be stably coupled to a surgical table on which the patient is placed, such as the known surgical table 210 depicted in FIG. 15. According to one implementation, the base 184 can be coupled to a rail 212 on the table 210. In a further alternative, the base 184 can be coupled to any fixed object in the operating room. Alternatively, the base 184 can be coupled to or be an integral part of a cart or other mobile standalone unit.


In one embodiment, the rotational axis 190A at rotational joint 190 (between the body 186 and the upper arm 188) is perpendicular to both the rotational axis 194A at rotational joint 194 (between the upper arm 188 and the forearm 192) and the rotational axis 198A. In other words, each axis 190A, 194A, 198A can be perpendicular in relation to the other two. The three axes 190A, 194A, 198A being perpendicular can, in some implementations, simplify the control of the system 180 by causing each axis 190A, 194A, 198A to contribute solely to a single degree of freedom. For example, if the extender 198 is rotated around axis 198A, the tilt of the surgical device 182 does not change when all three axes 190A, 194A, 198A are perpendicular. Similarly, if the upper arm 188 is rotated around axis 190A, only the tilt of the surgical device 182 from side to side is affected. Alternatively, two of the three axes 190A, 194A, 198A are perpendicular to each other. In a further alternative, none of the axes 190A, 194A, 198A are perpendicular to each other.
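The decoupling property described above, in which each perpendicular axis contributes solely to a single degree of freedom, can be illustrated numerically. The axis assignment below (taking the extender axis 198A as the z axis) is an assumption made only for this sketch:

```python
# Sketch of why mutually perpendicular rotational axes decouple degrees
# of freedom: a rotation of the extender about its own axis (taken here
# as z, an assumption for illustration) leaves the direction of the
# insertion axis -- and hence the tilt of the surgical device -- unchanged.
import math


def rot_z(theta: float):
    """3x3 rotation matrix about the z axis."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]


def apply(matrix, vector):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m * v for m, v in zip(row, vector)) for row in matrix]


# Rolling the extender 30 degrees about 198A spins the surgical device
# but does not tilt it: the insertion axis direction is unchanged.
insertion_axis = [0.0, 0.0, 1.0]
rotated = apply(rot_z(math.radians(30)), insertion_axis)
assert rotated == [0.0, 0.0, 1.0]
```

If the axes were not perpendicular, the same rotation would mix roll with tilt, and the controller would have to compensate across joints.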


In one embodiment, the three axes 190A, 194A, 198A (as best shown in FIG. 14A) intersect at the intersection 200 (as best shown in FIG. 14B), also known as a “spherical joint” 200. The intersection 200 remains fixed at the same location, regardless of the positioning of the arm links 188, 192, 198, and can be used as the insertion point during surgeries. In one implementation, the intersection 200 causes the system 180 to act similarly to a spherical mechanism. A “spherical mechanism” is a physical mechanism or software application that can cause all end effector motions to pass through a single point, thereby allowing a surgical system to use long rigid tools that perform procedures through incisions that serve as single pivot points. As an example, both COBRASurge and the Raven have mechanical spherical mechanisms, while Da Vinci has a software-based spherical mechanism. In the device 180 as shown in FIG. 14A, the configuration of the device 180 creates the spherical joint 200 such that the extender 198 must pass through the single point of the spherical joint 200. The spherical joint 200 created by the device 180 increases the size of the effective workspace (depicted by the cone 202) for the surgical device 182.


Alternatively, the gross positioning device 180 can have a fourth link, a fifth link, or any number of additional links, and a related additional number of rotational joints. Further, the device 180 can also have fewer than three links, and a related number of rotational joints. Thus, in one specific alternative implementation, the device 180 can have solely a base (such as base 184), a body (such as body 186), and a first link (such as first link 188) with a single rotational joint (such as rotational joint 190). In sum, the gross positioning device 180 can have a single rotational joint, two rotational joints, or any number of rotational joints.


In use of the embodiment shown in FIGS. 14A and 14B, the arm links 188, 192, 198 rotate about axes 190A, 194A, 198A to position the surgical device 182 within the surgical space defined by the cone 202. The cone 202 is a schematic representation of the outer boundaries of the space in which the device 182 can be positioned by the positioning device 180. More specifically, the extender 198 can be rotated around axis 198A to rotate the surgical device 182 about the axis 198A. Further, the arm links 188, 192 in combination with the extender 198 can be used to articulate the device 182 through two separate angular planes. That is, the two axes 190A and 194A can affect the angular position of the extender 198. In addition, the extender 198 can be extended or retracted to allow for the surgical device 182 to be advanced into and out of the patient's body cavity.


In one implementation, the positioning system 180 and the surgical device 182 (as shown in FIG. 14B) can be used in combination, such that the surgical device 182 is treated as an extension of the positioning system 180 wherein both are used together to move and operate the surgical device 182. For example, the surgeon may want to move the surgical device 182 a total of one inch to the right and thus actuates an external controller to cause this move. The controller transmits the appropriate signals to the system 180 and the surgical device 182 such that the system 180 and device 182 work in combination to move the surgical device 182 one inch to the right. In one example, the system 180 could move 0.5 inches and the device 182 could move 0.5 inches, thereby resulting in the device 182 moving the full one inch as desired. According to one embodiment, the system 180 can thus be used to maximize the strength, workspace, and maneuverability of the combination of the system 180 and the device 182 by determining the optimal contribution of each component during use.
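The combined-motion example above, in which a commanded one-inch move is divided between the positioning system 180 and the surgical device 182, can be sketched as a simple allocation function. The even 50/50 split mirrors the example in the text; the function name and the fractional-weighting parameter are hypothetical, and a real controller would choose the split by optimizing strength, workspace, and maneuverability:

```python
# Sketch of splitting a commanded displacement between the external gross
# positioning system and the internal surgical device, per the one-inch
# example above. The 50/50 default mirrors that example; a real controller
# would compute the optimal contribution of each component during use.

def allocate_motion(total_inches: float, system_fraction: float = 0.5):
    """Split a commanded move between the positioner and the device."""
    system_move = total_inches * system_fraction
    device_move = total_inches - system_move
    return system_move, device_move


system_move, device_move = allocate_motion(1.0)
assert system_move == 0.5 and device_move == 0.5
# The net motion always matches the surgeon's command:
assert system_move + device_move == 1.0
```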


Alternatively, the system 180 and the device 182 operate separately. That is, the system 180 is not operable or does not operate while the device 182 is being used, and the device 182 is not operable or does not operate while the system 180 is being used. For example, if the device 182 is being used and it is determined that a target object in the surgical space is outside the reach of the device 182, the device 182 is “shut down,” otherwise rendered inoperable, or simply placed in a “pause mode,” and the system 180 is used to reposition the device 182 accordingly.


It is understood that the device 180 can be operably coupled to a processor or computer (not shown) such that the processor can be used to control the system 180, including movement of the arm links 188, 192, 198 to grossly position the surgical device 182.


In alternative embodiments, the system 180 can have an arm that has only two arm links, or in yet another alternative the arm can have only one arm link.


In a further alternative implementation, the system 180 can also be configured to incorporate or integrate equipment or devices that couple to the surgical device 182 to provide various functionalities to the device 182. For example, in one embodiment, the positioning device 180 can contain suction and irrigation equipment that couples to corresponding equipment in the surgical device 182 such that the surgical device 182 includes suction and irrigation components. In another example according to a further implementation, the positioning device 180 can contain any known equipment that is configured to couple to corresponding equipment in the surgical device 182.


Alternative embodiments contemplated herein also include systems that can be used with surgical devices that are magnetically controlled (in contrast to the surgical device depicted in FIGS. 14A and 14B, which is controlled via a positioning rod inserted through the surgical incision). In those implementations, the positioning system positions the surgical device anywhere along an internal surface inside the patient's cavity by positioning an external magnetic component (such as a magnetic handle or other type of external magnetic component) along the outer skin of the patient. This positioning of the device can include any combination of movement in two dimensions along the surface of the patient's skin as well as rotation of the external magnetic component about an axis perpendicular to the surface of the skin. Of course, it is understood that while the movement of the magnetic component along the skin of the patient is considered to be two dimensional, the patient's skin is curved such that movement of the external component along the skin can effect manipulation in all six degrees of freedom.


Another set of embodiments disclosed herein relates to a user interface and related software applications for use in surgical robotics. FIG. 16A depicts a user interface 280, according to one embodiment. The interface 280 provides a visual display 282 of the surgical space as captured by a camera. In addition, the interface 280 can also provide additional information via various icons or informational overlays positioned on the interface 280 in any configuration chosen by the user. For example, in the user interface 280 depicted in FIG. 16A, the left arm status overlay 284 is positioned in the upper left hand corner of the interface 280, while the right arm status overlay 286 is positioned in the upper right hand corner. Further, the device controller sensitivity overlay 288 is positioned in the lower left hand corner, and the device configuration overlay 290 is positioned in the lower right hand corner. In addition, the cautery status overlay 292 is positioned in a middle portion of the lower edge of the display 282. The interface 280, according to one embodiment, is fully customizable such that the user (typically a surgeon) can actively arrange the icons or overlays on the display 282 in any configuration that the user desires. In an alternative embodiment, all of the informational overlays can be positioned along one edge of the display 282.


In another implementation, the interface 280 can be triggered to display a menu, such as, for example, the menu 294 as shown in FIG. 16B. According to one embodiment, the menu 294 can display and provide access to various types of additional information. The menu 294 can be triggered by actuation of a button (not shown) that both freezes the robotic device and causes the display of the menu 294. Alternatively, the button can simply cause the display of the menu 294. In one exemplary embodiment as shown in FIG. 16B, the additional information on the menu 294 includes real-time patient information such as the patient's current heart rate and blood pressure. Further, the menu 294 can also provide access to historical patient information such as previous conditions, X-ray images, and MRI images of the patient. In addition, the menu 294 can also provide additional information about the current surgical procedure, such as the elapsed time in surgery. Further, the menu 294 can provide any other relevant or useful real-time or historical information.


In use, before a surgical procedure begins, the surgeon can position different informational overlays or icons on the display 282 as the surgeon desires. Further, the user can actuate a button on the user interface 280 or operably coupled thereto at any time before, during, or after a procedure to trigger the display of the menu 294. For example, the surgeon might notice something strange or unexpected during a procedure and actuate the button to display the menu 294 in order to access the patient's current heart rate or blood pressure, for example. Alternatively, the surgeon might want to select a different camera filter or different lighting during a procedure to better visualize certain structures or areas of the surgical space. According to one embodiment, the user interface 280 can act as an informational hub for decision-making during the procedure and/or during emergencies that might occur during the procedure. The user interface 280 provides enhanced surgeon comfort and ergonomics in comparison to known consoles and interfaces for robotic surgical systems by allowing for easy real-time adjustments to the display 282 to fit the needs of the surgeon, technician, or other user.


In a further implementation, a display 282 accessible to and used by multiple users over time (such as on a surgical system in an operating room in a hospital) can be configured to be quickly and easily set to the personalized settings of a specific user. That is, a specific user can configure the display 282 as desired at any time, including the placement of the informational overlays and other configurable settings of the display 282. That configuration of the display 282 can be saved and associated with the user's profile or account such that the user can access the configuration whenever the user accesses her or his account while using the interface 280. In one embodiment, the interface 280 allows for the saving of and access to the user personalized settings through a personalized login in which each user must log in to the interface 280 system each time the user wants to use it.


For example, the interface 280 can require a specific username and password for each user such that each time the user first interacts with the interface 280, the display 282 launches a screen or overlay that prompts the user to enter a username and password or any other type of personalized login information. Only after the user enters the correct personalized login information is the interface 280 triggered to provide access to the user and further to configure the display 282 as previously configured by the user. One example of a personalized configuration of a display 296 is shown in FIG. 17 for exemplary purposes only. Alternatively, the interface 280 can be operably coupled to a card reader (not shown) such as an RFID or NFC reader such that the user must swipe a personalized ID badge or card near or through the card reader in order to access the interface 280. In a further alternative, the interface 280 can be operably coupled to another type of scanner, such as a facial recognition or biometric (such as fingerprint, iris, etc.) scanner such that the user must first use the scanner in order to access the interface.
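For purposes of illustration only, the personalized-login behavior described above might be sketched as follows. All names here (DisplayProfile, ProfileStore, the example username) are hypothetical and are not part of the specification; a real system would use a hardened authentication service rather than this minimal store.

```python
# Illustrative sketch only: a minimal account store that gates access to a
# saved display configuration behind a per-user login.
import hashlib
from dataclasses import dataclass, field


@dataclass
class DisplayProfile:
    overlays: dict = field(default_factory=dict)  # overlay name -> screen position


class ProfileStore:
    def __init__(self):
        self._users = {}  # username -> (password hash, DisplayProfile)

    def register(self, username, password):
        digest = hashlib.sha256(password.encode()).hexdigest()
        self._users[username] = (digest, DisplayProfile())

    def login(self, username, password):
        """Return the user's saved profile only on a correct login."""
        entry = self._users.get(username)
        if entry is None:
            return None
        digest, profile = entry
        if hashlib.sha256(password.encode()).hexdigest() != digest:
            return None
        return profile


store = ProfileStore()
store.register("dr_smith", "s3cret")
profile = store.login("dr_smith", "s3cret")
profile.overlays["heart_rate"] = (0, 0)   # user places an overlay as desired
denied = store.login("dr_smith", "wrong")  # incorrect password -> no access
```

The point of the sketch is only the flow: a correct login returns the same saved configuration object each time, so the display can be restored exactly as the user last arranged it.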


Another set of embodiments disclosed herein relates to software applications to provide feedback to a user relating to her/his surgical performance. The software applications can be used with a user interface such as, for example, the user interface embodiments described above, or alternatively any processor or computer. Certain embodiments compare performance parameters to standard benchmark performance parameters, while other embodiments track performance parameters over time.


In one embodiment, a software application is provided to track and compare surgical tool endpoint positions. It is understood that during a procedure, the surgical tool endpoint positions are indicative of the skill level of the surgeon. That is, an experienced surgeon will operate the surgical tool with control and very little wasted motion. In contrast, a novice will operate the tool with less control and more wasted motion. Similarly, the total distance traveled by a surgical tool endpoint can also be indicative of skill level. The shorter the distance, the more experienced the surgeon.


According to one implementation, the software application tracks the tool endpoints and plots those tracks in a graph. For example, FIG. 18 depicts two different graphs of endpoint positions tracked during an FLS peg transfer test—the left graph depicts the endpoint track of an experienced surgeon, while the right graph depicts the endpoint track of a novice surgeon. Thus, such a graphical display qualitatively shows the experience or skill level of a surgeon. Alternatively, the software application can track the tool endpoints and report information about the total distance traveled or the total enclosed volume. In accordance with one implementation, the software application can collect the information over time, thereby allowing for tracking of a surgeon's progress from a novice to a more experienced surgeon or to an expert.
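As a rough illustration of the total-distance metric described above, the following sketch (illustrative only; the function and sample data are hypothetical, not from the specification) sums the straight-line distances between consecutive tracked endpoint samples. A wandering track covering the same start and end points yields a longer path than a direct one, consistent with the skill-level interpretation above.

```python
# Illustrative sketch: total distance traveled by a tracked tool endpoint,
# computed from a sequence of 3-D position samples.
import math


def path_length(samples):
    """Sum of straight-line distances between consecutive endpoint samples."""
    total = 0.0
    for p0, p1 in zip(samples, samples[1:]):
        total += math.dist(p0, p1)
    return total


# A direct move versus a wandering one between the same start and end points.
direct = [(0, 0, 0), (3, 4, 0)]
wandering = [(0, 0, 0), (3, 0, 0), (3, 4, 0)]
```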


In another embodiment, the software application records and tracks any forces, velocities, and/or accelerations of any components of the surgical tools. It is understood that forces must be applied very carefully and motions must be kept smooth during a surgical procedure due to the enclosed and delicate nature of the operating site. In accordance with one implementation, the software application can be configured to limit the force, velocity, or acceleration of any device component during a procedure, thereby preventing the device from exceeding any pre-established limits for those parameters. In a further embodiment, or in combination with the parameter limits, the software application can also collect and record the parameter information and make it available at a later time (such as post-surgery, for example), in some cases in combination with the surgical video, to identify any specific instances in which excess motion or force was used. Further, an overall smoothness factor could be calculated by the software application using some combination of these parameters.
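The limiting and after-the-fact review described above might be sketched, for illustration only, as a simple clamp plus a flagging pass over recorded samples. The limit value, units, and function names are hypothetical placeholders, not values from the specification.

```python
# Illustrative sketch: enforce a pre-established velocity limit on a
# commanded value, and flag recorded samples that exceeded the limit
# for post-surgery review.
VELOCITY_LIMIT = 10.0  # hypothetical limit, illustrative units


def limit_velocity(commanded, limit=VELOCITY_LIMIT):
    """Clamp the commanded velocity into [-limit, +limit]."""
    return max(-limit, min(limit, commanded))


def flag_excess(samples, limit=VELOCITY_LIMIT):
    """Indices of recorded samples that exceeded the limit (for later review)."""
    return [i for i, v in enumerate(samples) if abs(v) > limit]
```

In use, the clamp would sit between the controller input and the actuator command, while the flagged indices could be correlated with the surgical video as described above.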


One embodiment of the software application utilizes any of the previous parameters to provide benchmark testing for surgeons. The parameter data can be used to compare a surgeon's skills or the skills of groups of surgeons to other surgeons across the country or world. Alternatively, the software application can be used to test any individual surgeon against a known benchmark standard. In a further alternative, yearly or quarterly competency testing could be performed using the software application to certify that the surgeon meets or exceeds the set standard.


In addition to benchmark testing, in one embodiment the same information can be used by the software application to monitor the state of a surgeon or user during a procedure. As an example, the system can measure any tremor by the surgeon during the procedure and compare it to the surgeon's normal state as established based on information collected during past actual procedures or practice procedures.
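One possible form of the tremor comparison described above is sketched below, for illustration only. Root-mean-square deviation stands in for a real tremor estimate, and the threshold factor is a hypothetical value, not one from the specification.

```python
# Illustrative sketch: compare a current tremor measure against a surgeon's
# stored baseline. RMS of position deviations stands in for a real tremor
# estimate derived from sensor data.
import math


def rms(values):
    """Root-mean-square of a sequence of deviation samples."""
    return math.sqrt(sum(v * v for v in values) / len(values))


def tremor_exceeds_baseline(deviations, baseline_rms, factor=2.0):
    """True if the current tremor RMS exceeds the stored baseline by `factor`."""
    return rms(deviations) > factor * baseline_rms
```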


In another software application embodiment relating to feedback, the software is configured to provide warm-up or practice exercises for a surgeon, including providing such exercises just prior to performing an actual surgery. It has been shown that “warming up” prior to a surgical procedure improves the performance of a surgeon. In one embodiment, the user console contains the software application and the application provides a virtual reality environment for the user using the user console. The application can provide an example procedure in a virtual reality environment for the surgeon to perform that is similar to the actual procedure. Alternatively, the application can provide specially designed warm-up tasks, procedures, “games,” or any other type of warm-up activity for the surgeon to perform prior to the actual procedure.


It should be noted that any of the software application embodiments relating to feedback as described herein can be operated on any known operating system. For example, the software application can be used with any known user interface for a surgical system, any known controller for any surgical system, or any known processor or computer system. In certain embodiments, a surgical device can be coupled to the user interface or the computer, or alternatively, the user interface or computer can be used by the user to operate the software application without a surgical device coupled thereto. In yet another alternative, a surgeon at a remote training center could use a computer, controller, or user interface that is linked to a robotic trainer or other type of training system at a central location to interface with the software application and perform tasks such as tests or warm-up procedures.


It should also be noted that the data relating to the various parameters discussed above can be collected using sensors coupled to the surgical tools. Alternatively, the data can be provided by the controller based on information already available from the controller. In a further embodiment, the data can be collected in any known fashion using any known devices.


Another set of embodiments disclosed herein relates to controllers or consoles for use with various surgical systems, including robotic surgical systems.


Certain controller or console implementations are configured to collect biometric information relating to the user, including during use of the system for surgery or training. For example, in one embodiment as shown in FIG. 19, a console 300 is provided that is a known Da Vinci™ console or a similar controller. Alternatively, the console 300 can be any known controller that can be used by a surgeon to operate a surgical device and that requires physical contact between the controller and the surgeon. As shown, the console 300 has a viewer 302 that requires the user (such as a surgeon) to place her or his head within the viewer 302 in order to operate the console 300. This results in the head of the user coming into contact with the viewer 302. Similarly, the console 300 has an armrest 304 that allows the user to rest her or his arms while using the console 300. Like the viewer 302, the placement of the armrest 304 results in the user's arms coming into contact with the armrest 304. In this implementation, the console 300 is provided with a sensor or sensors (not shown) associated with the viewer 302 and separately a sensor or sensors (not shown) associated with the armrest 304. The viewer sensor is configured to be in contact with the user's head when the user has correctly placed her/his head within the viewer 302, while the armrest sensor is configured to be in contact with the user's arm or arms when the user rests her/his arm or arms on the armrest 304. These sensors can be configured to collect various biometric information regarding the user, such as, for example, temperature, breathing patterns, pupil dilation, blinking (excessive blinking can indicate irritation or tiredness), muscle tension, and/or pulse rate. Alternatively, any other biometric information that can be indicative of a user's physical state can be detected and/or collected. 
According to one embodiment, these metrics can be collected by the sensors and used to track the user's physical state during a procedure.


According to one implementation, the information about the user's physical state can be used to modify the operation of the surgical system. For example, in one implementation, any biometric information indicating excessive stress or anger or the like can trigger the system to automatically minimize, reduce, or shut down the movement of the components of the surgical device. That is, any predetermined biometric parameter that exceeds a certain predetermined level can cause the processor to trigger the actuators on the surgical device to move at a reduced speed, thereby slowing the movement of the device and reducing the risk of injury to the patient. Further, if the biometric parameter exceeds a second, higher predetermined level, the processor triggers the actuators to stop entirely for a predetermined period, thereby forcing the user to take a short break from the procedure. Alternatively, at this higher level, the processor can be triggered to disconnect the controls from the device for a predetermined period, thereby producing the same result of forcing the user to take a short break.
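The two-level response described above might be sketched, for illustration only, as a simple threshold ladder. The threshold values and the normalized stress parameter are hypothetical; a real system would derive the parameter from the biometric sensors and tune the levels clinically.

```python
# Illustrative sketch of the two-level biometric response: a reading above
# the first threshold slows the actuators; above a second, higher threshold
# it stops them entirely, forcing a short break.
SLOW_THRESHOLD = 0.7  # hypothetical normalized stress level
STOP_THRESHOLD = 0.9


def actuator_response(stress_level):
    """Map a normalized biometric reading to an actuator command."""
    if stress_level > STOP_THRESHOLD:
        return "stop"           # force a short break from the procedure
    if stress_level > SLOW_THRESHOLD:
        return "reduced_speed"  # slow the device to reduce risk of injury
    return "normal"
```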


In accordance with certain alternative embodiments, the console 300 can be linked to a robotic trainer or other type of training system to interface with a software application similar to one of the applications described above, thereby allowing a user to perform tasks such as tests, warm-up procedures, or surgical simulations while the console 300 collects biometric information as described above, thereby allowing for evaluation of the physical state of the user during the simulation or task.


In another controller embodiment as shown in FIG. 20, a controller system 310 is provided that is an open air controller system 310. As used herein, “open air controller” means a controller that allows a user to control a device or system via movement of the user's arms, legs, and body by taking a significant amount of position and orientation measurements via non-mechanical means. Commercial examples of open air controllers include the Wii and XBox Kinect gaming systems. The controller system 310 depicted in FIG. 20 has a monitor 312 configured to display a live video image of the surgical space as captured by one or more cameras associated with the surgical device being used in the procedure. Alternatively, instead of the monitor 312, the system 310 could have a “heads up” display (not shown) that is worn on the head of the user.


The system 310 also has at least one of the following: one or more handles 314 to be held and manipulated by the user and/or a tracking device 316 coupled with or positioned near the monitor 312. For system 310 embodiments having one or more handles 314, the handles 314 can be used to control the surgical device, including the position and orientation of one or more end effectors on the device. These handles 314 work as an electronic means of sensing position and orientation via wireless positioning, accelerometers, gyros, and/or compasses. A commercial example of such a handle is the Wii controller. In one implementation, the handles 314 work in conjunction with the tracking device 316 to control the surgical device using handle tracking in which the tracking device 316 is configured to track identifiable markers associated with each handle 314 and thereby be capable of tracking the position and orientation of each handle 314 and use that information to control the surgical device.


According to one implementation, the handle or handles 314 can also have additional components or features incorporated therein. For example, one handle 314 embodiment can have at least one button or other input component that can be used to allow the user to interact with the system via menus displayed on the monitor 312 or in any other fashion in which the user can communicate with the system via an input component. In one embodiment, the input component can be a button, a scroll wheel, a knob, or any other such component. Alternatively, the handle 314 can have a sensor or any other type of detection component. In use, the user could use the input component or detection component to provide fine adjustments to the system or otherwise communicate with the system as desired or needed.


Alternatively, the system has a tracking device 316 and no handles. In such an embodiment, the tracking device 316 tracks the location and movement of the user's arms and/or hands in a fashion similar to the tracking of the handles as described above, but without the use of any identifiable markers. Instead, the tracking device 316 uses the arms, hands, or other relevant body parts as natural markers. In one implementation, the tracking device 316 is a camera that can be used to identify and track the user or at least the hands and/or arms of the user. According to one embodiment, the tracking device 316 can fully map the user's body such that positional information about various parts of the user's body could be used to control a surgical device. For example, the positional information about the user's elbows might be used to control the surgical device. One commercial example of such a tracking device is the Kinect system used with the XBox gaming system.


Another similar embodiment relates to a tracking device 316 used in conjunction with a cuff or other type of device that is positioned around at least a portion of the user's forearm. The cuff is configured to detect and measure the firing of muscles in the forearm, such that the system can identify hand gestures. Other similar devices that directly measure muscle actions and that are coupled to other parts of the user's body can also be used with the current system.


In another alternative embodiment, the system 310 is scaled to a smaller size such that the system 310 tracks a user's hands or fingers instead of the user's arms or larger body parts. By tracking the user's hands, very fine motions can be used to control the surgical device. Such a system would reduce the risk of fatigue.


In use, a user or surgeon can stand or sit or otherwise position herself or himself in front of the monitor 312 so that the user can see the surgical space as displayed on the monitor 312, including at least a portion of the surgical device. The user can then use either handles 314 or the user's own arms and/or hands to make motions that will be detected by the system 310 and utilized to control the movements of the surgical device.


A similar embodiment relates to a system configured to control a surgical device based at least in part on eye motion. That is, a motion sensor or monitor—perhaps similar or identical to one of the tracking device embodiments described above—is positioned to track the motion of a user's eye, and that motion can be utilized to control a surgical device. In one implementation, eye motion tracking could help the system recognize the intended tooltip position by tracking where the user is looking.


Another controller embodiment relates to a controller (not shown) configured to monitor brainwaves and thereby control a surgical device based on those brainwaves. In one specific embodiment, the controller has an electroencephalography (EEG) sensor that is positioned against or adjacent to the user's head. The EEG sensor senses electrical activity along the scalp of the user and can be used to detect and thereby interpret the brainwaves of the user and use that information to control the surgical device. In one implementation, such a system eliminates the need for precise motor skills and relies instead on the user's thoughts to actuate the surgical device.


In a further alternative, the EEG sensor could be used to monitor a user's mental state and work in combination with the system to react to the information about the user's mental state in a fashion similar to that described above with respect to monitoring a user's physical state.


In a further embodiment, the controller has a microphone that detects voice commands and the controller uses those commands to control the surgical device. That is, the user can state verbal instructions that the controller detects via the microphone, and the software in the controller can be configured to analyze what the user said and trigger the device to take the action instructed. One commercial embodiment of a similar system is the Siri system used in Apple products. In use, the user could use voice commands instead of physical manipulation to control a surgical device. In one specific example, surgeons are often required to stop use of one component or device during a procedure to switch to a different task. With this system, the surgeon could verbally instruct the system to switch to the other task.
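A minimal dispatcher of the kind described above might be sketched as follows, for illustration only. The recognized phrases and action names are hypothetical placeholders; a real system would sit behind a speech-recognition front end and a validated command vocabulary.

```python
# Illustrative sketch: map recognized verbal phrases to device actions.
# Unrecognized utterances map to no action rather than a guess.
COMMANDS = {
    "switch task": "switch_active_tool",
    "stop": "halt_motion",
    "zoom in": "camera_zoom_in",
}


def interpret(utterance):
    """Return the device action for a recognized phrase, else None."""
    return COMMANDS.get(utterance.strip().lower())
```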


Another controller embodiment relates to various controller devices having a foot controller. Some prior art controllers have foot controllers made up of multiple foot pedals that require a user to use each foot for more than one function and/or actuate more than one pedal. The foot controller embodiments herein are configured to reduce the functions, make those functions easier, or eliminate the multiple pedals.



FIG. 21 depicts a foot controller 320, according to one embodiment. In this embodiment, the controller 320 has one pedal 322. Having a single pedal 322 eliminates the need for the user to release contact with the pedal 322 and make contact with one or more other pedals. In this embodiment, the pedal 322 is a multi-directional pedal 322 that acts like a joystick for the user's foot. The pedal can be moved in any of the four cardinal directions to activate a different function in relation to the surgical device or the surgical system.
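The single multi-directional pedal described above might resolve its deflection to one of the four cardinal directions as sketched below, for illustration only. The deadzone value and the function names assigned to each direction are hypothetical, not part of the specification.

```python
# Illustrative sketch: resolve a joystick-style pedal deflection (dx, dy)
# to one of four cardinal directions, each mapped to a distinct function.
import math

FUNCTIONS = {"north": "clutch", "south": "camera", "east": "energy", "west": "menu"}


def pedal_direction(dx, dy, deadzone=0.2):
    """Return the cardinal direction of a deflection, or None if centered."""
    if math.hypot(dx, dy) < deadzone:
        return None  # pedal is centered; no function selected
    if abs(dx) >= abs(dy):
        return "east" if dx > 0 else "west"
    return "north" if dy > 0 else "south"


def pedal_function(dx, dy):
    """Map a pedal deflection to its assigned system function."""
    return FUNCTIONS.get(pedal_direction(dx, dy))
```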


Alternatively, the foot controller can be configured to have multiple functions that are selectable via a hand- or finger-actuated button or input component. In one embodiment as shown in FIG. 22, the input component is a controller handle 330 with a scroll wheel 334, wherein the scroll wheel 334 can be actuated to select the desired function of a foot controller (such as the foot controller 320). Alternatively, as shown in FIG. 23, the input component can be a basic mouse 340. Regardless of the specific input component, the component 330 or 340 is operably coupled to the foot pedal (such as the foot controller 320 discussed above) such that the input component 330 or 340 can be used to select the function of the foot controller 320. In use, the user can use her or his hand to actuate the scroll wheel 334 of the handle 330 or the scroll wheel of the mouse 340 to select the appropriate function of the foot pedal, such as the foot controller 320. In one embodiment, a menu is displayed on a display or monitor of the controller when the user actuates the scroll wheel, and the user can then select from that menu. Of course, this menu can apply to one foot controller (like the foot controller 320 above) or two different foot controllers or pedals.


In a further alternative, the controller has a pedal selection indicator that is displayed on the display. As an example, the pedal selection indicator could be an overlay on a display such as the user interface display 286 discussed above with respect to FIG. 16A. Alternatively, the indicator could be displayed in any known fashion on any known controller. In one embodiment, the indicator is driven by a pedal selection detecting device, such as a camera that is positioned to capture all of the one or more foot pedals associated with a foot controller. Alternatively, the detecting device can be any sensor that can detect which foot pedal is being used.


Returning to FIG. 22, the controller handle 330 can also be used to control any part of a surgical device or system. The handle 330 has a handle body 332, the scroll wheel 334, and two actuatable finger loops 336A, 336B. The scroll wheel 334 and both finger loops 336A, 336B can be actuated by a user to trigger certain actions by the system or the surgical device. One example of such an action relates to selection of a foot pedal function as described above, but many other actions can be accomplished via the wheel 334 and loops 336A, 336B. In use, the user grasps the handle body 332 and positions her or his hand such that the thumb and index finger can be positioned in the loops 336A, 336B and the middle finger is in close proximity with the scroll wheel 334 such that the middle finger can be used to actuate the wheel 334 as needed. It is understood that this scroll wheel 334 operates in a fashion similar to known scroll wheels, by providing both a scrolling action and a clicking action.


Returning to the surgical device embodiments discussed above, another set of surgical device implementations relate to devices having at least one biometric sensor associated with the device to monitor the status of the patient.


One example of a surgical device 350 embodiment having a biometric sensor 352 is set forth in FIG. 24. In this embodiment, the sensor 352 is positioned in an arm 354 of the device 350. Alternatively, the sensor 352 can be positioned anywhere else in or on the device 350. The sensor 352 can be coupled to or with the existing electronics in the arm 354 such that the sensor 352 is electrically coupled to an external controller and thereby can provide feedback regarding one or more biometric parameters. The various parameters can include, but are certainly not limited to, temperature, pressure, or humidity. In use, the biometric parameter(s) of interest can be monitored by using the sensor 352 (or two or more sensors) to capture the relevant data and transmit it to a controller that provides the information on a display for the user/surgeon to see.


Another set of embodiments relates to best practices in cavity insufflation. It is understood that a patient's surgical cavity is typically expanded for purposes of a surgical procedure in that cavity by insufflating the cavity with a gas to maximize the amount of space in the cavity while minimizing the risk of damaging the cavity wall or organs by inadvertent contact with the surgical tools. The gas most commonly used is carbon dioxide (“CO2”). One problem with the use of CO2 is the absorption of excess CO2 into one or more tissues of the patient, which can cause or increase postoperative pain, including abdominal and shoulder pain. In one implementation, a method for maximizing insufflation while minimizing the problems of CO2 absorption involves flushing the CO2 from the patient's cavity at the completion of the surgical procedure. More specifically, once the procedure is complete, another gas (other than CO2) is pumped into the patient's cavity, thereby forcing or “flushing” the CO2 out of the cavity. In accordance with one implementation, the replacement or flushing gas can be a gas that is more reactive than CO2, because the reactive gas is used after the procedure is complete (when the risks of using such a reactive gas are significantly reduced). In one embodiment, the replacement gas is oxygen (“O2”). It is understood that the replacement gas is a gas that does not adversely affect the patient when the gas is absorbed.


EXAMPLE

One embodiment of a gross positioning system similar to those discussed above and depicted in FIGS. 14A and 14B was examined. In this specific example, the system was tested with a two-armed surgical device having either two or three degrees of freedom per arm. The degrees of freedom of each arm of the surgical device, from proximal to distal tip, were shoulder pitch, shoulder yaw, and elbow yaw. For the experiment, it was assumed that the two arms would work within close proximity of one another, as in a stretch and dissect operation.


The benchtop experiment took place in a mock surgical environment at the University of Nebraska-Lincoln to show the advantages of the gross positioning system over a fixed stand-alone device. The known Fundamentals of Laparoscopic Surgery (FLS) peg transfer task was used to demonstrate the dexterous workspace of the surgical device. The goal of this task was to touch the top of each peg.


The results of the benchtop testing with the gross positioning device show that the gross positioning system is advantageous for surgical devices that typically would have poor dexterity or a limited workspace without such a positioning device. Without the positioning device, when all six degrees of freedom were used, the stand-alone device could reach only a portion of the pegs. In contrast, when the gross positioning system was used, all of the pegs were reachable with both the four and six DOF surgical devices. These benchtop results indicate the advantages of coupling a gross positioning system with surgical devices having restricted workspaces.
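The workspace comparison in this example can be illustrated with a simple geometric sketch: a peg is reachable by a planar two-link arm only if its distance from the shoulder lies within the annulus [|l1 − l2|, l1 + l2], while a gross positioner that can relocate the shoulder expands the reachable set. The link lengths and peg positions below are illustrative values only, not the experimental geometry.

```python
# Illustrative sketch: reachability of pegs by a planar two-link arm, with
# and without gross repositioning of the arm's base ("shoulder").
import math


def reachable(peg, shoulder, l1, l2):
    """True if the peg lies in the arm's annular workspace from this base."""
    d = math.dist(peg, shoulder)
    return abs(l1 - l2) <= d <= l1 + l2


def reachable_with_repositioning(peg, shoulder_positions, l1, l2):
    """Gross positioning: try the arm from several base placements."""
    return any(reachable(peg, s, l1, l2) for s in shoulder_positions)


pegs = [(1.0, 0.0), (3.0, 0.0), (6.0, 0.0)]  # hypothetical peg layout
fixed = sum(reachable(p, (0.0, 0.0), 2.0, 2.0) for p in pegs)
moved = sum(reachable_with_repositioning(p, [(0.0, 0.0), (4.0, 0.0)], 2.0, 2.0)
            for p in pegs)
```

With the base fixed at the origin, the far peg lies outside the 4-unit reach; allowing a second base placement brings all pegs into the workspace, mirroring the qualitative result reported above.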


Although the present invention has been described with reference to preferred embodiments, persons skilled in the art will recognize that changes may be made in form and detail without departing from the spirit and scope of the invention.

Claims
  • 1. A gross positioning system for use with a robotic surgical device, the system comprising: (a) a base;(b) a body operably coupled to the base;(c) a first arm link operably coupled to the body at a first rotational joint;(d) a second arm link operably coupled to the first arm link at a second rotational joint; and(e) an extendable third arm link operably coupled to the second arm link, wherein at least a portion of the third arm link is rotatable about a third rotational joint, wherein the third arm link is configured to be positionable through an incision in a patient, the third arm link comprising a connection component at a distal end of the third arm link, wherein the connection component is configured to be coupleable to the robotic surgical device,wherein an axis of rotation of the first rotational joint, an axis of rotation of the second rotational joint, and an axis of rotation of the third rotational joint intersect at a spherical joint.
  • 2. The gross positioning system of claim 1, wherein the robotic surgical device comprises at least one arm, wherein the gross positioning system and robotic surgical device are configured to operate together to position the robotic surgical device within a body cavity of a patient.
  • 3. The gross positioning system of claim 1, wherein the extendable third arm link comprises an extender body and an extendable rod slidably coupled to the extender body, wherein the extendable rod is configured to move between an extended position and a retracted position.
  • 4. The gross positioning system of claim 1, wherein the spherical joint is disposed at the incision of the patient, wherein the third arm link is disposed through the spherical joint.
  • 5. The gross positioning system of claim 1, wherein the connection component is positionable in a body cavity of the patient.
  • 6. A gross positioning system for use with a robotic surgical device, the system comprising: (a) a base;(b) a body operably coupled to the base;(c) a first arm link operably coupled to the body at a first rotational joint;(d) a second arm link operably coupled to the first arm link at a second rotational joint;(e) an extendable third arm link operably coupled to the second arm link, wherein at least a portion of the third arm link is rotatable about a third rotational joint, the third arm link comprising a connection component at a distal end of the third arm link; and(f) the robotic surgical device operably coupled to the connection component, the robotic surgical device comprising: (i) a device body;(ii) a first arm operably coupled to the device body, the first arm comprising at least one first actuator; and (iii) a second arm operably coupled to the device body, the second arm comprising at least one second actuator,wherein an axis of rotation of the first rotational joint, an axis of rotation of the second rotational joint, and an axis of rotation of the third rotational joint intersect at a spherical joint.
  • 7. The gross positioning system of claim 6, wherein an axis of rotation of the first rotational joint is perpendicular to at least one of an axis of rotation of the second rotational joint and an axis of rotation of the third rotational joint.
  • 8. The gross positioning system of claim 6, wherein the extendable third arm link comprises an extender body and an extendable rod slidably coupled to the extender body, wherein the extendable rod is configured to move between an extended position and a retracted position.
  • 9. The gross positioning system of claim 6, wherein the third arm link is configured to be positionable through an incision in a patient.
  • 10. The gross positioning system of claim 6, wherein the spherical joint is disposed at an insertion point of a patient, wherein the third arm link is disposed through the spherical joint.
  • 11. An external gross positioning system for use with an internal robotic surgical device, the system comprising: (a) a base;(b) a body operably coupled to the base;(c) a first arm link operably coupled to the body at a first rotational joint;(d) a second arm link operably coupled to the first arm link at a second rotational joint;(e) an extendable third arm link operably coupled to the second arm link, wherein at least a portion of the third arm link is rotatable about a third rotational joint, the third arm link comprising a connection component at a distal end of the third arm link, wherein the connection component is configured to be coupleable to the robotic surgical device; and(f) a spherical joint at an intersection of an axis of rotation of the first rotational joint, an axis of rotation of the second rotational joint, and an axis of rotation of the third rotational joint, wherein the spherical joint is disposed at an insertion point of a patient, wherein the third arm link is disposed through the spherical joint.
  • 12. The gross positioning system of claim 11, wherein an axis of rotation of the first rotational joint is perpendicular to at least one of an axis of rotation of the second rotational joint and an axis of rotation of the third rotational joint.
  • 13. The gross positioning system of claim 11, wherein an axis of rotation of the second rotational joint is perpendicular to at least one of an axis of rotation of the first rotational joint and an axis of rotation of the third rotational joint.
  • 14. The gross positioning system of claim 11, wherein an axis of rotation of the third rotational joint is perpendicular to at least one of an axis of rotation of the first rotational joint and an axis of rotation of the second rotational joint.
  • 15. The gross positioning system of claim 11, wherein the extendable third arm link comprises an extender body and an extendable rod slidably coupled to the extender body, wherein the extendable rod is configured to move between an extended position and a retracted position.
  • 16. The gross positioning system of claim 11, wherein the robotic surgical device comprises at least one arm, wherein the gross positioning system and robotic surgical device are configured to operate together to position the robotic surgical device within a body cavity of the patient.
  • 17. The gross positioning system of claim 11, wherein the connection component is positionable in a body cavity of the patient.
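Claims 11-17 describe a remote-center-of-motion geometry: because the rotation axes of the three joints intersect at the spherical joint placed at the insertion point, rotating the joints sweeps the third arm link about that fixed point, and only the extendable rod changes insertion depth. The following minimal sketch illustrates that property; the function name, the yaw/pitch parameterization, and the coordinate convention are illustrative assumptions, not part of the claimed system.

```python
import math

def tip_position(theta1, theta2, extension, rcm=(0.0, 0.0, 0.0)):
    """Illustrative position of the connection component at the distal
    end of the extendable third arm link.

    theta1, theta2 -- hypothetical yaw/pitch angles (radians) about two
    of the intersecting rotation axes; extension -- travel of the
    extendable rod beyond the spherical joint; rcm -- the spherical
    joint, i.e. the fixed insertion point.
    """
    # Unit direction of the third link after the two rotations
    # (spherical coordinates about the remote center).
    dx = math.sin(theta2) * math.cos(theta1)
    dy = math.sin(theta2) * math.sin(theta1)
    dz = -math.cos(theta2)
    # The link passes through the spherical joint, so the tip is the
    # remote center plus the extension along the link direction.
    return (rcm[0] + extension * dx,
            rcm[1] + extension * dy,
            rcm[2] + extension * dz)

# Defining property: with zero extension the tip sits exactly at the
# insertion point regardless of the joint angles.
assert tip_position(0.4, 1.1, 0.0) == (0.0, 0.0, 0.0)
```

Whatever the joint angles, the tip's distance from the remote center equals the extension, which is why the incision experiences only rotation and axial sliding, never lateral translation.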
CROSS-REFERENCE TO RELATED APPLICATION(S)

This application claims priority to U.S. Provisional Application 61/782,413, filed on Mar. 14, 2013 and entitled “Methods, Systems, and Devices Relating to Robotic Surgical Devices, End Effectors, and Controllers,” which is hereby incorporated herein by reference in its entirety.

GOVERNMENT SUPPORT

This invention was made with government support under Grant No. DGE-10410000 awarded by the National Science Foundation; Grant Nos. NNX09AO71A and NNX10AJ26G awarded by the National Aeronautics and Space Administration; and Grant No. W81XWH-09-2-0185 awarded by U.S. Army Medical Research and Materiel Command within the Department of Defense. Accordingly, the government has certain rights in this invention.

US Referenced Citations (493)
Number Name Date Kind
3870264 Robinson Mar 1975 A
3989952 Hohmann Nov 1976 A
4246661 Pinson Jan 1981 A
4258716 Sutherland Mar 1981 A
4278077 Mizumoto Jul 1981 A
4538594 Boebel et al. Sep 1985 A
4568311 Miyake Feb 1986 A
4623183 Amori Nov 1986 A
4736645 Zimmer Apr 1988 A
4771652 Zimmer Sep 1988 A
4852391 Ruch et al. Aug 1989 A
4896015 Taboada et al. Jan 1990 A
4897014 Tietze Jan 1990 A
4922755 Oshiro et al. May 1990 A
4922782 Kawai May 1990 A
4990050 Tsuge et al. Feb 1991 A
5019968 Wang et al. May 1991 A
5108140 Bartholet Apr 1992 A
5172639 Wiesman et al. Dec 1992 A
5176649 Wakabayashi Jan 1993 A
5178032 Zona et al. Jan 1993 A
5187032 Sasaki et al. Feb 1993 A
5187796 Wang et al. Feb 1993 A
5195388 Zona et al. Mar 1993 A
5201325 McEwen et al. Apr 1993 A
5217003 Wilk Jun 1993 A
5263382 Brooks et al. Nov 1993 A
5271384 McEwen et al. Dec 1993 A
5284096 Pelrine et al. Feb 1994 A
5297443 Wentz Mar 1994 A
5297536 Wilk Mar 1994 A
5304899 Sasaki et al. Apr 1994 A
5307447 Asano et al. Apr 1994 A
5353807 DeMarco Oct 1994 A
5363935 Schempf et al. Nov 1994 A
5382885 Salcudean et al. Jan 1995 A
5388528 Pelrine et al. Feb 1995 A
5436542 Petelin et al. Jul 1995 A
5441494 Ortiz Aug 1995 A
5458131 Wilk Oct 1995 A
5458583 McNeely et al. Oct 1995 A
5458598 Feinberg et al. Oct 1995 A
5471515 Fossum et al. Nov 1995 A
5515478 Wang May 1996 A
5524180 Wang et al. Jun 1996 A
5553198 Wang et al. Sep 1996 A
5562448 Mushabac Oct 1996 A
5588442 Scovil et al. Dec 1996 A
5620417 Jang et al. Apr 1997 A
5623582 Rosenberg Apr 1997 A
5624380 Takayama et al. Apr 1997 A
5624398 Smith et al. Apr 1997 A
5632761 Smith et al. May 1997 A
5645520 Nakamura et al. Jul 1997 A
5657429 Wang et al. Aug 1997 A
5657584 Hamlin Aug 1997 A
5672168 de la Torre et al. Sep 1997 A
5674030 Sigel Oct 1997 A
5728599 Rostoker et al. Mar 1998 A
5736821 Suyama et al. Apr 1998 A
5754741 Wang et al. May 1998 A
5762458 Wang et al. Jun 1998 A
5769640 Jacobus et al. Jun 1998 A
5791231 Cohn et al. Aug 1998 A
5792135 Madhani et al. Aug 1998 A
5797538 Heaton et al. Aug 1998 A
5797900 Madhani et al. Aug 1998 A
5807377 Madhani et al. Sep 1998 A
5808665 Green Sep 1998 A
5815640 Wang et al. Sep 1998 A
5825982 Wright et al. Oct 1998 A
5841950 Wang et al. Nov 1998 A
5845646 Lemelson Dec 1998 A
5855583 Wang et al. Jan 1999 A
5876325 Mizuno et al. Mar 1999 A
5878193 Wang et al. Mar 1999 A
5878783 Smart Mar 1999 A
5895417 Pomeranz et al. Apr 1999 A
5906591 Dario et al. May 1999 A
5907664 Wang et al. May 1999 A
5910129 Koblish et al. Jun 1999 A
5911036 Wright et al. Jun 1999 A
5971976 Wang et al. Oct 1999 A
5993467 Yoon Nov 1999 A
6001108 Wang et al. Dec 1999 A
6007550 Wang et al. Dec 1999 A
6030365 Laufer Feb 2000 A
6031371 Smart Feb 2000 A
6058323 Lemelson May 2000 A
6063095 Wang et al. May 2000 A
6066090 Yoon May 2000 A
6102850 Wang et al. Aug 2000 A
6107795 Smart Aug 2000 A
6132368 Cooper Oct 2000 A
6132441 Grace Oct 2000 A
6139563 Cosgrove, III et al. Oct 2000 A
6156006 Brosens et al. Dec 2000 A
6159146 El Gazayerli Dec 2000 A
6162171 Ng et al. Dec 2000 A
D438617 Cooper et al. Mar 2001 S
6206903 Ramans Mar 2001 B1
D441076 Cooper et al. Apr 2001 S
6223100 Green Apr 2001 B1
D441862 Cooper et al. May 2001 S
6238415 Sepetka et al. May 2001 B1
6240312 Alfano et al. May 2001 B1
6241730 Alby Jun 2001 B1
6244809 Wang et al. Jun 2001 B1
6246200 Blumenkranz et al. Jun 2001 B1
D444555 Cooper et al. Jul 2001 S
6286514 Lemelson Sep 2001 B1
6292678 Hall et al. Sep 2001 B1
6293282 Lemelson Sep 2001 B1
6296635 Smith et al. Oct 2001 B1
6309397 Julian et al. Oct 2001 B1
6309403 Minor et al. Oct 2001 B1
6312435 Wallace et al. Nov 2001 B1
6321106 Lemelson Nov 2001 B1
6327492 Lemelson Dec 2001 B1
6331181 Tierney et al. Dec 2001 B1
6346072 Cooper Feb 2002 B1
6352503 Matsui Raifu et al. Mar 2002 B1
6364888 Niemeyer et al. Apr 2002 B1
6371952 Madhani et al. Apr 2002 B1
6394998 Wallace et al. May 2002 B1
6398726 Ramans et al. Jun 2002 B1
6400980 Lemelson Jun 2002 B1
6408224 Okamoto et al. Jun 2002 B1
6424885 Niemeyer et al. Jul 2002 B1
6432112 Brock et al. Aug 2002 B2
6436107 Wang et al. Aug 2002 B1
6441577 Blumenkranz et al. Aug 2002 B2
6450104 Grant et al. Sep 2002 B1
6451027 Cooper et al. Sep 2002 B1
6454758 Thompson et al. Sep 2002 B1
6459926 Nowlin et al. Oct 2002 B1
6463361 Wang et al. Oct 2002 B1
6468203 Belson Oct 2002 B2
6468265 Evans et al. Oct 2002 B1
6470236 Ohtsuki Oct 2002 B2
6491691 Morley et al. Dec 2002 B1
6491701 Tierney et al. Dec 2002 B2
6493608 Niemeyer et al. Dec 2002 B1
6496099 Wang et al. Dec 2002 B2
6508413 Bauer et al. Jan 2003 B2
6512345 Borenstein Jan 2003 B2
6522906 Salisbury, Jr. et al. Feb 2003 B1
6544276 Azizi Apr 2003 B1
6548982 Papanikolopoulos et al. Apr 2003 B1
6554790 Moll Apr 2003 B1
6565554 Niemeyer May 2003 B1
6574355 Green Jun 2003 B2
6587750 Gerbi et al. Jul 2003 B2
6591239 McCall et al. Jul 2003 B1
6594552 Nowlin et al. Jul 2003 B1
6610007 Belson et al. Aug 2003 B2
6620173 Gerbi et al. Sep 2003 B2
6642836 Wang et al. Nov 2003 B1
6645196 Nixon et al. Nov 2003 B1
6646541 Wang et al. Nov 2003 B1
6648814 Kim et al. Nov 2003 B2
6659939 Moll et al. Dec 2003 B2
6661571 Shioda et al. Dec 2003 B1
6671581 Niemeyer et al. Dec 2003 B2
6676684 Morley et al. Jan 2004 B1
6684129 Salisbury, Jr. et al. Jan 2004 B2
6685648 Flaherty et al. Feb 2004 B2
6685698 Morley et al. Feb 2004 B2
6687571 Byrne et al. Feb 2004 B1
6692485 Brock et al. Feb 2004 B1
6699177 Wang et al. Mar 2004 B1
6699235 Wallace et al. Mar 2004 B2
6702734 Kim et al. Mar 2004 B2
6702805 Stuart Mar 2004 B1
6714839 Salisbury, Jr. et al. Mar 2004 B2
6714841 Wright et al. Mar 2004 B1
6719684 Kim et al. Apr 2004 B2
6720988 Gere et al. Apr 2004 B1
6726699 Wright et al. Apr 2004 B1
6728599 Wright et al. Apr 2004 B2
6730021 Vassiliades, Jr. et al. May 2004 B2
6731988 Green May 2004 B1
6746443 Morley et al. Jun 2004 B1
6764441 Chiel et al. Jul 2004 B2
6764445 Ramans et al. Jul 2004 B2
6766204 Niemeyer et al. Jul 2004 B2
6770081 Cooper et al. Aug 2004 B1
6774597 Borenstein Aug 2004 B1
6776165 Jin Aug 2004 B2
6780184 Tanrisever Aug 2004 B2
6783524 Anderson et al. Aug 2004 B2
6785593 Wang et al. Aug 2004 B2
6788018 Blumenkranz Sep 2004 B1
6792663 Krzyzanowski Sep 2004 B2
6793653 Sanchez et al. Sep 2004 B2
6799065 Niemeyer Sep 2004 B1
6799088 Wang et al. Sep 2004 B2
6801325 Farr et al. Oct 2004 B2
6804581 Wang et al. Oct 2004 B2
6810281 Brock et al. Oct 2004 B2
6817972 Snow Nov 2004 B2
6817974 Cooper et al. Nov 2004 B2
6817975 Farr et al. Nov 2004 B1
6820653 Schempf et al. Nov 2004 B1
6824508 Kim et al. Nov 2004 B2
6824510 Kim et al. Nov 2004 B2
6832988 Sproul Dec 2004 B2
6832996 Woloszko et al. Dec 2004 B2
6836703 Wang et al. Dec 2004 B2
6837846 Jaffe et al. Jan 2005 B2
6837883 Moll et al. Jan 2005 B2
6839612 Sanchez et al. Jan 2005 B2
6840938 Morley et al. Jan 2005 B1
6852107 Wang et al. Feb 2005 B2
6858003 Evans et al. Feb 2005 B2
6860346 Burt et al. Mar 2005 B2
6860877 Sanchez et al. Mar 2005 B1
6866671 Tierney et al. Mar 2005 B2
6870343 Borenstein et al. Mar 2005 B2
6871117 Wang et al. Mar 2005 B2
6871563 Choset et al. Mar 2005 B2
6879880 Nowlin et al. Apr 2005 B2
6892112 Wang et al. May 2005 B2
6899705 Niemeyer May 2005 B2
6902560 Morley et al. Jun 2005 B1
6905460 Wang et al. Jun 2005 B2
6905491 Wang et al. Jun 2005 B1
6911916 Wang et al. Jun 2005 B1
6917176 Schempf et al. Jul 2005 B2
6933695 Blumenkranz Aug 2005 B2
6936001 Snow Aug 2005 B1
6936003 Iddan Aug 2005 B2
6936042 Wallace et al. Aug 2005 B2
6943663 Wang et al. Sep 2005 B2
6949096 Davison et al. Sep 2005 B2
6951535 Ghodoussi et al. Oct 2005 B2
6965812 Wang et al. Nov 2005 B2
6974411 Belson Dec 2005 B2
6974449 Niemeyer Dec 2005 B2
6979423 Moll Dec 2005 B2
6984203 Tartaglia et al. Jan 2006 B2
6984205 Gazdzinski Jan 2006 B2
6991627 Madhani et al. Jan 2006 B2
6993413 Sunaoshi Jan 2006 B2
6994703 Wang et al. Feb 2006 B2
6994708 Manzo Feb 2006 B2
6997908 Carrillo, Jr. et al. Feb 2006 B2
7025064 Wang et al. Apr 2006 B2
7027892 Wang et al. Apr 2006 B2
7033344 Imran Apr 2006 B2
7039453 Mullick May 2006 B2
7042184 Oleynikov et al. May 2006 B2
7048745 Tierney et al. May 2006 B2
7053752 Wang et al. May 2006 B2
7063682 Whayne et al. Jun 2006 B1
7066879 Fowler et al. Jun 2006 B2
7066926 Wallace et al. Jun 2006 B2
7074179 Wang et al. Jul 2006 B2
7077446 Kameda et al. Jul 2006 B2
7083571 Wang et al. Aug 2006 B2
7083615 Peterson et al. Aug 2006 B2
7087049 Nowlin et al. Aug 2006 B2
7090683 Brock et al. Aug 2006 B2
7097640 Wang et al. Aug 2006 B2
7105000 McBrayer Sep 2006 B2
7107090 Salisbury, Jr. et al. Sep 2006 B2
7109678 Kraus et al. Sep 2006 B2
7118582 Wang et al. Oct 2006 B1
7121781 Sanchez et al. Oct 2006 B2
7125403 Julian et al. Oct 2006 B2
7126303 Farritor et al. Oct 2006 B2
7147650 Lee Dec 2006 B2
7155315 Niemeyer et al. Dec 2006 B2
7169141 Brock et al. Jan 2007 B2
7182025 Ghorbel et al. Feb 2007 B2
7182089 Ries Feb 2007 B2
7199545 Oleynikov et al. Apr 2007 B2
7206626 Quaid, III Apr 2007 B2
7206627 Abovitz et al. Apr 2007 B2
7210364 Ghorbel et al. May 2007 B2
7214230 Brock et al. May 2007 B2
7217240 Snow May 2007 B2
7239940 Wang et al. Jul 2007 B2
7250028 Julian et al. Jul 2007 B2
7259652 Wang et al. Aug 2007 B2
7273488 Nakamura et al. Sep 2007 B2
7311107 Harel et al. Dec 2007 B2
7339341 Oleynikov et al. Mar 2008 B2
7372229 Farritor et al. May 2008 B2
7447537 Funda et al. Nov 2008 B1
7492116 Oleynikov et al. Feb 2009 B2
7566300 Devierre et al. Jul 2009 B2
7574250 Niemeyer Aug 2009 B2
7637905 Saadat et al. Dec 2009 B2
7645230 Mikkaichi et al. Jan 2010 B2
7655004 Long Feb 2010 B2
7670329 Flaherty et al. Mar 2010 B2
7678043 Gilad Mar 2010 B2
7731727 Sauer Jun 2010 B2
7762825 Burbank et al. Jul 2010 B2
7772796 Farritor et al. Aug 2010 B2
7785251 Wilk Aug 2010 B2
7785333 Miyamoto et al. Aug 2010 B2
7789825 Nobis et al. Sep 2010 B2
7794494 Sahatjian et al. Sep 2010 B2
7865266 Moll et al. Jan 2011 B2
7960935 Farritor et al. Jun 2011 B2
8021358 Doyle et al. Sep 2011 B2
8179073 Farritor et al. May 2012 B2
8353897 Doyle et al. Jan 2013 B2
8604742 Farritor et al. Dec 2013 B2
9089353 Farritor Jul 2015 B2
20010018591 Brock et al. Aug 2001 A1
20010049497 Kalloo et al. Dec 2001 A1
20020003173 Bauer et al. Jan 2002 A1
20020013601 Nobles et al. Jan 2002 A1
20020026186 Woloszko et al. Feb 2002 A1
20020038077 de la Torre et al. Mar 2002 A1
20020065507 Azizi May 2002 A1
20020091374 Cooper Jul 2002 A1
20020103417 Gazdzinski Aug 2002 A1
20020111535 Kim et al. Aug 2002 A1
20020120254 Julian et al. Aug 2002 A1
20020128552 Nowlin et al. Sep 2002 A1
20020140392 Borenstein et al. Oct 2002 A1
20020147487 Sundquist et al. Oct 2002 A1
20020151906 Demarais et al. Oct 2002 A1
20020156347 Kim et al. Oct 2002 A1
20020171385 Kim et al. Nov 2002 A1
20020173700 Kim et al. Nov 2002 A1
20020190682 Schempf et al. Dec 2002 A1
20030020810 Takizawa et al. Jan 2003 A1
20030045888 Brock et al. Mar 2003 A1
20030065250 Chiel et al. Apr 2003 A1
20030089267 Ghorbel et al. May 2003 A1
20030092964 Kim et al. May 2003 A1
20030097129 Davison et al. May 2003 A1
20030100817 Wang et al. May 2003 A1
20030114731 Cadeddu et al. Jun 2003 A1
20030135203 Wang et al. Jul 2003 A1
20030139742 Wampler et al. Jul 2003 A1
20030144656 Ocel et al. Jul 2003 A1
20030167000 Mullick Sep 2003 A1
20030172871 Scherer Sep 2003 A1
20030179308 Zamorano et al. Sep 2003 A1
20030181788 Yokoi et al. Sep 2003 A1
20030229268 Uchiyama et al. Dec 2003 A1
20030230372 Schmidt Dec 2003 A1
20040024311 Quaid Feb 2004 A1
20040034282 Quaid Feb 2004 A1
20040034283 Quaid Feb 2004 A1
20040034302 Abovitz et al. Feb 2004 A1
20040050394 Jin Mar 2004 A1
20040070822 Shioda et al. Apr 2004 A1
20040099175 Perrot et al. May 2004 A1
20040102772 Baxter et al. May 2004 A1
20040106916 Quaid et al. Jun 2004 A1
20040111113 Nakamura et al. Jun 2004 A1
20040117032 Roth et al. Jun 2004 A1
20040138525 Saadat Jul 2004 A1
20040138552 Harel et al. Jul 2004 A1
20040140786 Borenstein Jul 2004 A1
20040153057 Davison Aug 2004 A1
20040173116 Ghorbel et al. Sep 2004 A1
20040176664 Iddan Sep 2004 A1
20040215331 Chew et al. Oct 2004 A1
20040225229 Viola Nov 2004 A1
20040254680 Sunaoshi Dec 2004 A1
20040267326 Ocel et al. Dec 2004 A1
20050014994 Fowler et al. Jan 2005 A1
20050021069 Feuer et al. Jan 2005 A1
20050029978 Oleynikov et al. Feb 2005 A1
20050043583 Killmann et al. Feb 2005 A1
20050049462 Kanazawa Mar 2005 A1
20050054901 Yoshino Mar 2005 A1
20050054902 Konno Mar 2005 A1
20050064378 Toly Mar 2005 A1
20050065400 Banik et al. Mar 2005 A1
20050083460 Hattori et al. Apr 2005 A1
20050096502 Khalili et al. May 2005 A1
20050143644 Gilad et al. Jun 2005 A1
20050154376 Riviere et al. Jul 2005 A1
20050165449 Cadeddu et al. Jul 2005 A1
20050283137 Doyle et al. Dec 2005 A1
20050288555 Binmoeller Dec 2005 A1
20050288665 Woloszko Dec 2005 A1
20060020272 Gildenberg Jan 2006 A1
20060046226 Bergler et al. Mar 2006 A1
20060119304 Farritor et al. Jun 2006 A1
20060149135 Paz Jul 2006 A1
20060152591 Lin Jul 2006 A1
20060155263 Lipow Jul 2006 A1
20060195015 Mullick et al. Aug 2006 A1
20060196301 Oleynikov et al. Sep 2006 A1
20060198619 Oleynikov et al. Sep 2006 A1
20060241570 Wilk Oct 2006 A1
20060241732 Denker et al. Oct 2006 A1
20060253109 Chu Nov 2006 A1
20060258954 Timberlake Nov 2006 A1
20070032701 Fowler et al. Feb 2007 A1
20070043397 Ocel et al. Feb 2007 A1
20070055342 Wu et al. Mar 2007 A1
20070080658 Farritor et al. Apr 2007 A1
20070106113 Ravo May 2007 A1
20070123748 Meglan May 2007 A1
20070142725 Hardin et al. Jun 2007 A1
20070156019 Larkin et al. Jul 2007 A1
20070156211 Ferren et al. Jul 2007 A1
20070167955 De La Menardiere et al. Jul 2007 A1
20070225633 Ferren et al. Sep 2007 A1
20070225634 Ferren et al. Sep 2007 A1
20070241714 Oleynikov et al. Oct 2007 A1
20070244520 Ferren et al. Oct 2007 A1
20070250064 Darois et al. Oct 2007 A1
20070255273 Fernandez et al. Nov 2007 A1
20080004634 Farritor et al. Jan 2008 A1
20080015565 Davison Jan 2008 A1
20080015566 Livneh Jan 2008 A1
20080033569 Ferren et al. Feb 2008 A1
20080045803 Williams Feb 2008 A1
20080058835 Farritor et al. Mar 2008 A1
20080058989 Oleynikov et al. Mar 2008 A1
20080103440 Ferren et al. May 2008 A1
20080109014 Pena May 2008 A1
20080111513 Farritor et al. May 2008 A1
20080119870 Williams et al. May 2008 A1
20080132890 Woloszko et al. Jun 2008 A1
20080161804 Rioux et al. Jul 2008 A1
20080164079 Jacobsen Jul 2008 A1
20080183033 Bern et al. Jul 2008 A1
20080221591 Farritor et al. Sep 2008 A1
20080269557 Marescaux et al. Oct 2008 A1
20080269562 Marescaux et al. Oct 2008 A1
20090020724 Paffrath Jan 2009 A1
20090024142 Ruiz Morales Jan 2009 A1
20090048612 Farritor et al. Feb 2009 A1
20090054909 Farritor et al. Feb 2009 A1
20090069821 Farritor et al. Mar 2009 A1
20090076536 Rentschler et al. Mar 2009 A1
20090137952 Ramamurthy et al. May 2009 A1
20090143787 de la Pena Jun 2009 A9
20090163929 Yeung et al. Jun 2009 A1
20090171373 Farritor et al. Jul 2009 A1
20090234369 Bax et al. Sep 2009 A1
20090236400 Cole et al. Sep 2009 A1
20090240246 Devill et al. Sep 2009 A1
20090247821 Rogers Oct 2009 A1
20090248038 Blumenkranz et al. Oct 2009 A1
20090281377 Newell et al. Nov 2009 A1
20090305210 Guru et al. Dec 2009 A1
20100010294 Conlon et al. Jan 2010 A1
20100016659 Weitzner et al. Jan 2010 A1
20100016853 Burbank Jan 2010 A1
20100042097 Newton et al. Feb 2010 A1
20100056863 Dejima et al. Mar 2010 A1
20100069710 Yamatani et al. Mar 2010 A1
20100069940 Miller et al. Mar 2010 A1
20100081875 Fowler et al. Apr 2010 A1
20100139436 Kawashima et al. Jun 2010 A1
20100198231 Scott Aug 2010 A1
20100204713 Ruiz Aug 2010 A1
20100245549 Allen et al. Sep 2010 A1
20100250000 Blumenkranz Sep 2010 A1
20100262162 Omori Oct 2010 A1
20100292691 Brogna Nov 2010 A1
20100318059 Farritor et al. Dec 2010 A1
20110015569 Kirschenman et al. Jan 2011 A1
20110020779 Hannaford et al. Jan 2011 A1
20110071347 Rogers et al. Mar 2011 A1
20110071544 Steger et al. Mar 2011 A1
20110077478 Freeman et al. Mar 2011 A1
20110082365 McGrogan et al. Apr 2011 A1
20110098529 Ostrovsky et al. Apr 2011 A1
20110152615 Schostek et al. Jun 2011 A1
20110224605 Farritor et al. Sep 2011 A1
20110230894 Simaan et al. Sep 2011 A1
20110237890 Farritor et al. Sep 2011 A1
20110238080 Ranjit et al. Sep 2011 A1
20110264078 Lipow Oct 2011 A1
20110270443 Kamiya et al. Nov 2011 A1
20120035582 Nelson et al. Feb 2012 A1
20120109150 Quaid et al. May 2012 A1
20120116362 Kieturakis May 2012 A1
20120179168 Farritor Jul 2012 A1
20120253515 Coste-Maniere et al. Oct 2012 A1
20130041360 Farritor Feb 2013 A1
20130131695 Scarfogliero et al. May 2013 A1
20130345717 Markvicka et al. Dec 2013 A1
20140046340 Wilson et al. Feb 2014 A1
20140058205 Frederick et al. Feb 2014 A1
20140039515 Mondry et al. Jun 2014 A1
20140303434 Farritor et al. Oct 2014 A1
20150051446 Farritor et al. Feb 2015 A1
Foreign Referenced Citations (51)
Number Date Country
10828219198 Dec 2012 CN
102010040405 Mar 2012 DE
1354670 Oct 2003 EP
2286756 Feb 2011 EP
2329787 Aug 2011 EP
2563261 Mar 2013 EP
2004144533 May 1990 JP
5115425 May 1993 JP
200716235 Jun 1993 JP
2006507809 Sep 1994 JP
07 136173 May 1995 JP
7306155 Nov 1995 JP
08-224248 Sep 1996 JP
2001505810 May 2001 JP
2003220065 Aug 2003 JP
2004322310 Jun 2004 JP
2004180781 Jul 2004 JP
2004329292 Nov 2004 JP
2006508049 Mar 2006 JP
2009-106606 May 2009 JP
2010-533045 Oct 2010 JP
2010-536436 Dec 2010 JP
2011-504794 Feb 2011 JP
2011-045500 Mar 2011 JP
2011-115591 Jun 2011 JP
WO 9221291 May 1991 WO
WO 0189405 Nov 2001 WO
WO 02082979 Oct 2002 WO
WO 02100256 Dec 2002 WO
WO 2005009211 Jul 2004 WO
WO 2005009211 Feb 2005 WO
WO 2005044095 May 2005 WO
WO 2006052927 Aug 2005 WO
WO 2006005075 Jan 2006 WO
WO 2006079108 Jan 2006 WO
WO2006079108 Jul 2006 WO
WO 2007011654 Jan 2007 WO
WO 2007111571 Oct 2007 WO
WO 2007149559 Dec 2007 WO
WO 2009023851 Feb 2009 WO
WO 2009144729 Dec 2009 WO
WO2010042611 Apr 2010 WO
WO2010046823 Apr 2010 WO
WO201050771 May 2010 WO
WO 2011075693 Jun 2011 WO
WO 2011118646 Sep 2011 WO
WO 2011135503 Nov 2011 WO
WO 2013009887 Jan 2013 WO
WO 2014011238 Jan 2014 WO
Non-Patent Literature Citations (175)
Entry
International Preliminary Report on Patentability from related case PCT/US2007/014567, mailed Jan. 8, 2009, 11 pp.
International Search report and Written Opinion from international application No. PCT/US2012/41911, mailed Mar. 13, 2013.
International Search Report and Written Opinion from international application No. PCT/US12/46274, mailed Sep. 25, 2012.
International Search Report and Written Opinion from international application No. PCT/US2007/089191, mailed Nov. 10, 2008, 20 pp.
International Search Report and Written Opinion from international application No. PCT/US07/14567, mailed Apr. 28, 2008, 19 pp.
International Search Report and Written Opinion of international application No. PCT/US2008/069822, mailed Aug. 5, 2009, 12 pp.
International Search Report and Written Opinion of international application No. PCT/US2008/073334, mailed Jan. 12, 2009, 11 pp.
International Search Report and Written Opinion of international application No. PCT/US2008/073369, mailed Nov. 12, 2008, 12 pp.
International Search Report and Written Opinion issued in PCT/US11/46809, mailed Dec. 8, 2011.
Ishiyama et al., “Spiral-type Micro-machine for Medical Applications,” 2000 International Symposium on Micromechatronics and Human Science, 2000: 65-69.
Jagannath et al., “Peroral transgastric endoscopic ligation of fallopian tubes with long-term survival in a porcine model,” Gastrointestinal Endoscopy, 2005; 61(3): 449-453.
Kalloo et al., “Flexible transgastric peritoneoscopy: a novel approach to diagnostic and therapeutic interventions in the peritoneal cavity,” Gastrointestinal Endoscopy, 2004; 60(1): 114-117.
Kang et al., “Robotic Assistants Aid Surgeons During Minimally Invasive Procedures,” IEEE Engineering in Medicine and Biology, Jan.-Feb. 2001; pp. 94-104.
Kantsevoy et al., “Endoscopic gastrojejunostomy with survival in a porcine model,” Gastrointestinal Endoscopy, 2005; 62(2): 287-292.
Kantsevoy et al., “Transgastric endoscopic splenectomy,” Surgical Endoscopy, 2006; 20: 522-525.
Kazemier et al. (1998), “Vascular Injuries During Laparoscopy,” J. Am. Coll. Surg. 186(5): 604-5.
Kim, “Early Experience with Telemanipulative Robot-Assisted Laparoscopic Cholecystectomy Using da Vinci,” Surgical Laparoscopy, Endoscopy & Percutaneous Techniques, 2002; 12(1):33-40.
Ko et al., “Per-Oral transgastric abdominal surgery,” Chinese Journal of Digestive Diseases, 2006; 7: 67-70.
LaFullarde et al., “Laparoscopic Nissen Fundoplication: Five-year Results and Beyond,” Arch/Surg, Feb. 2001; 136:180-184.
Leggett et al. (2002), “Aortic injury during laparoscopic fundoplication,” Surg. Endoscopy 16(2): 362.
Li et al. (2000), “Microvascular Anastomoses Performed in Rats Using a Microsurgical Telemanipulator,” Comp. Aid. Surg. 5: 326-332.
Liem et al., “Comparison of Conventional Anterior Surgery and Laparoscopic Surgery for Inguinal-hernia Repair,” New England Journal of Medicine, 1997; 336 (22): 1541-1547.
MacFarlane et al., “Force-Feedback Grasper Helps Restore the Sense of Touch in Minimally Invasive Surgery,” Journal of Gastrointestinal Surgery, 1999; 3: 278-285.
Mack et al., “Present Role of Thoracoscopy in the Diagnosis and Treatment of Diseases of the Chest,” Ann Thorac Surgery, 1992; 54: 403-409.
Mack, “Minimally Invasive and Robotic Surgery,” JAMA, Feb. 2001; 285(5): 568-572.
Mei et al., “Wireless Drive and Control of a Swimming Microrobot,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, May 2002: 1131-1136.
Melvin et al., “Computer-Enhanced vs. Standard Laparoscopic Antireflux Surgery,” J Gastrointest Surg 2002; 6: 11-16.
Menciassi et al., “Locomotion of a Legged Capsule in the Gastrointestinal Tract: Theoretical Study and Preliminary Technological Results,” IEEE Int. Conf. on Engineering in Medicine and Biology, San Francisco, CA, pp. 2767-2770, Sep. 2004.
Menciassi et al., “Robotic Solutions and Mechanisms for a Semi-Autonomous Endoscope,” Proceedings of the 2002 IEEE/RSJ Intl. Conference on Intelligent Robots and Systems, Oct. 2002; 1379-1384.
Menciassi et al., “Shape memory alloy clamping devices of a capsule for monitoring tasks in the gastrointestinal tract,” J. Micromech. Microeng, 2005, 15: 2045-2055.
Meron, “The development of the swallowable video capsule (M2A),” Gastrointestinal Endoscopy 2000; 52 6: 817-819.
Micron, http://www.micron.com, 2006, 1/4-inch VGA NTSC/PAL CMOS Digital Image Sensor, 98 pp.
Midday Jeff et al., “Material Handling System for Robotic Natural Orifice Surgery,” Proceedings of the 2011 Design of Medical Devices Conference, Apr. 12-14, 2011, Minneapolis, MN, 4 pages.
Miller, Ph.D., et al., “In-Vivo Stereoscopic Imaging System with 5 Degrees-of-Freedom for Minimal Access Surgery,” Dept. of Computer Science and Dept. of Surgery, Columbia University, New York, NY, 7 pp.
Munro (2002), “Laparoscopic access: complications, technologies, and techniques,” Curr. Opin. Obstet. Gynecol., 14(4): 365-74.
Nio et al., “Efficiency of manual vs robotical (Zeus) assisted laparoscopic surgery in the performance of standardized tasks,” Surg Endosc, 2002; 16: 412-415.
Office Action dated Apr. 17, 2007, received in related case U.S. Appl. No. 11/552,379, 5 pp.
Office Action dated Apr. 3, 2009, received in related case U.S. Appl. No. 11/932,516, 43 pp.
Office Action dated Aug. 18, 2006, received in related case U.S. Appl. No. 11/398,174, 6 pp.
Office Action dated Aug. 21, 2006, received in related case U.S. Appl. No. 11/403,756, 6 pp.
Office Action dated Oct. 29, 2007, received in related case U.S. Appl. No. 11/695,944, 6 pp.
Office Action dated Oct. 9, 2008, received in related case U.S. Appl. No. 11/932,441, 4 pp.
Oleynikov et al., “In Vivo Camera Robots Provide Improved Vision for Laparoscopic Surgery,” Computer Assisted Radiology and Surgery (CARS), Chicago, IL, Jun. 23-26, 2004b.
Oleynikov et al., “In Vivo Robotic Laparoscopy,” Surgical Innovation, Jun. 2005, 12(2): 177-181.
Oleynikov et al., “Miniature Robots Can Assist in Laparoscopic Cholecystectomy,” Journal of Surgical Endoscopy, 19-4: 473-476, 2005.
O'Neill, “Surgeon takes new route to gallbladder,” The Oregonian, Jun. 2007, 2 pp.
Orlando et al., (2003), “Needle and Trocar Injuries in Diagnostic Laparoscopy under Local Anesthesia: What Is the True Incidence of These Complications?” Journal of Laparoendoscopic & Advanced Surgical Techniques 13(3): 181-184.
Park et al., “Trocar-less Instrumentation for Laparoscopy: Magnetic Positioning of Intra-abdominal Camera and Retractor,” Ann Surg, Mar. 2007; 245(3): 379-384.
Park et al., “Experimental studies of transgastric gallbladder surgery: cholecystectomy and cholecystogastric anastomosis (videos),” Gastrointestinal Endoscopy, 2005; 61(4): 601-606.
Abbott et al., “Design of an Endoluminal Notes Robotic System,” from the Proceedings of the 2007 IEEE/RSJ Int'l Conf. on Intelligent Robot Systems, San Diego, CA, Oct. 29-Nov. 2, 2007, pp. 410-416.
Allendorf et al., “Postoperative Immune Function Varies Inversely with the Degree of Surgical Trauma in a Model,” Surgical Endoscopy 1997; 11:427-430.
Ang, “Active Tremor Compensation in Handheld Instrument for Microsurgery,” Doctoral Dissertation, tech report CMU-RI-TR-04-28, Robotics Institute, Carnegie Mellon University, May 2004, 167 pp.
Applicant Amendment after Notice of Allowance under Rule 312, filed Aug. 25, 2008, in related case U.S. Appl. No. 11/695,944, 6pp.
Applicant Response to Office Action dated Apr. 17, 2007, in related case U.S. Appl. No. 11/552,379, filed Aug. 8, 2007, 7 pp.
Applicant Response to Office Action dated Aug. 18, 2006, in related case U.S. Appl. No. 11/398,174, filed Nov. 7, 2006, 8pp.
Applicant Response to Office Action dated Aug. 21, 2006, in related case U.S. Appl. No. 11/403,756, filed Nov. 21, 2006, 52pp.
Applicant Response to Office Action dated Oct. 29, 2007, in related case U.S. Appl. No. 11/695,944, filed Jan. 22, 2008, 6pp.
Atmel 80C5X2 Core, http://www.atmel.com, 2006, 186pp.
Bailey et al., “Complications of Laparoscopic Surgery,” Quality Medical Publishers, Inc., 1995, 25pp.
Ballantyne, “Robotic Surgery, Telerobotic Surgery, Telepresence, and Telementoring,” Surgical Endoscopy, 2002; 16: 1389-1402.
Bauer et al., “Case Report: Remote Percutaneous Renal Percutaneous Renal Access Using a New Automated Telesurgical Robotic System,” Telemedicine Journal and e-Health 2001; (4): 341-347.
Begos et al., “Laparoscopic Cholecystectomy: From Gimmick to Gold Standard,” J Clin Gastroenterol, 1994; 19(4): 325-330.
Berg et al., “Surgery with Cooperative Robots,” Medicine Meets Virtual Reality, Feb. 2007, 1 pg.
Breda et al., “Future developments and perspectives in laparoscopy,” Eur. Urology 2001; 40(1): 84-91.
Breedveld et al., “Design of Steerable Endoscopes to Improve the Visual Perception of Depth During Laparoscopic Surgery,” ASME, Jan. 2004; vol. 126, pp. 1-5.
Breedveld et al., “Locomotion through the Intestine by means of Rolling Stents,” Proceedings of the ASME Design Engineering Technical Conferences, 2004, pp. 1-7.
Calafiore et al., “Multiple Arterial Conduits Without Cardiopulmonary Bypass: Early Angiographic Results,” Ann Thorac Surg, 1999; 67: 450-456.
Camarillo et al., “Robotic Technology in Surgery: Past, Present and Future,” The American Journal of Surgery, 2004; 188: 2S-15.
Cavusoglu et al., “Telesurgery and Surgical Simulation: Haptic Interfaces to Real and Virtual Surgical Environments,” In McLaughlin, M.L., Hespanha, J.P., and Sukhatme, G., editors. Touch in virtual environments, IMSC Series in Multimedia 2001, 28 pp.
Cavusoglu et al., “Robotics for Telesurgery: Second Generation Berkeley/UCSF Laparoscopic Telesurgical Workstation and Looking Towards the Future Applications,” Industrial Robot: An International Journal, 2003; 30(1): 22-29.
Chanthasopeephan et al., (2003), “Measuring Forces in Liver Cutting: New Equipment and Experimental Results,” Annals of Biomedical Engineering 31: 1372-1382.
Choi et al., “Flexure-based Manipulator for Active Handheld Microsurgical Instrument,” Proceedings of the 27th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBS), Sep. 2005, 4pp.
Cuschieri, “Technology for Minimal Access Surgery,” BMJ, 1999, 319: 1-6.
Dakin et al., “Comparison of laparoscopic skills performance between standard instruments and two surgical robotic systems,” Surg Endosc., 2003; 17: 574-579.
Dumpert et al., “Improving in Vivo Robot Vision Quality,” from the Proceedings of Medicine Meets Virtual Reality, Long Beach, CA, Jan. 26-29, 2005. 1 pg.
Dumpert et al., “Stereoscopic in Vivo Surgical Robots,” IEEE Sensors Special Issue on in Vivo Sensors for Medicine, Jan. 2007, 10 pp.
Examiner Interview Summary dated Aug. 6 and Aug. 12, 2008, in related case U.S. Appl. No. 11/695,944, 1 pg.
Examiner Interview Summary dated May 9, 2008, in related case U.S. Appl. No. 11/695,944, 1 pg.
Examiner Interview Summary dated Nov. 30, 2006, in related case U.S. Appl. No. 11/398,174, 2pp.
Falcone et al., “Robotic Surgery,” Clin. Obstet. Gynecol. 2003, 46(1): 37-43.
Faraz et al., “Engineering Approaches to Mechanical and Robotic Design for Minimally Invasive Surgery (MIS),” Kluwer Academic Publishers (Boston), 2000, 13 pp.
Fearing et al., “Wing Transmission for a Micromechanical Flying Insect,” Proceedings of the 2000 IEEE International Conference to Robotics & Automation, Apr. 2000; 1509-1516.
Fireman et al., “Diagnosing small bowel Crohn's disease with wireless capsule endoscopy,” Gut 2003; 52: 390-392.
Flynn et al., “Tomorrow's Surgery: micromotors and microbots for minimally invasive procedures,” Minimally Invasive Surgery & Allied Technologies.
Franklin et al., “Prospective Comparison of Open vs. Laparoscopic Colon Surgery for Carcinoma: Five-Year Results,” Dis Colon Rectum, 1996; 39: S35-S46.
Franzino, “The Laprotek Surgical System and the Next Generation of Robotics,” Surg Clin North Am, 2003 83(6).
Fraulob et al., “Miniature assistance module for robot-assisted heart surgery,” Biomed. Tech. 2002, 47 Suppl. 1, Pt. 1: 12-15.
Fukuda et al., “Mechanism and Swimming Experiment of Micro Mobile Robot in Water,” Proceedings of the 1994 IEEE International Conference on Robotics and Automation, 1994: 814-819.
Fukuda et al., “Micro Active Catheter System with Multi Degrees of Freedom,” Proceedings of the IEEE International Conference on Robotics and Automation, May 1994, pp. 2290-2295.
Fuller et al., “Laparoscopic Trocar Injuries: A Report from a U.S. Food and Drug Administration (FDA) Center for Devices and Radiological Health (CDRH) Systematic Technology Assessment of Medical Products (STAMP) Committee,” U.S. Food and Drug Administration, available at http://www.fda.gov, Finalized: Nov. 7, 2003; Updated: Jun. 24, 2005, 11 pp.
Grady, “Doctors Try New Surgery for Gallbladder Removal,” The New York Times, Apr. 20, 2007, 3 pp.
Guber et al., "Miniaturized Instrument Systems for Minimally Invasive Diagnosis and Therapy," Biomedizinische Technik, 2002, Band 47, Ergänzungsband 1.
Patronik et al., “Development of a Tethered Epicardial Crawler for Minimally Invasive Cardiac Therapies,” IEEE, pp. 239-240.
Patronik et al., “Crawling on the Heart: A Mobile Robotic Device for Minimally Invasive Cardiac Interventions,” MICCAI, 2004, pp. 9-16.
Patronik et al., “Preliminary evaluation of a mobile robotic device for navigation and intervention on the beating heart,” Computer Aided Surgery, 10(4): 225-232, Jul. 2005.
Peirs et al., “A miniature manipulator for integration in a self-propelling endoscope,” Sensors and Actuators A, 2001, 92: 343-349.
Peters, “Minimally Invasive Colectomy: Are the Potential Benefits Realized?” Dis Colon Rectum 1993; 36: 751-756.
Phee et al., "Analysis and Development of Locomotion Devices for the Gastrointestinal Tract," IEEE Transactions on Biomedical Engineering, vol. 49, No. 6, Jun. 2002, pp. 613-616.
Phee et al., “Development of Microrobotic Devices for Locomotion in the Human Gastrointestinal Tract,” International Conference on Computational Intelligence, Robotics and Autonomous Systems (CIRAS 2001), Nov. 28-30, (2001), Singapore.
Platt et al., "In Vivo Robotic Cameras can Enhance Imaging Capability During Laparoscopic Surgery," in the Proceedings of the Society of American Gastrointestinal Endoscopic Surgeons (SAGES) Scientific Conference, Ft. Lauderdale, FL, Apr. 13-16, 2005, 1 pg.
Preliminary Amendment filed Apr. 11, 2007, in related case U.S. Appl. No. 11/403,756, 7 pp.
Preliminary Amendment filed Jul. 30, 2008, in related case U.S. Appl. No. 12/171,413, 4 pp.
RCE and Amendment filed Jun. 13, 2007, in related case U.S. Appl. No. 11/403,756, 8 pp.
Rentschler et al., "Mobile in Vivo Biopsy and Camera Robot," Studies in Health Technology and Informatics-Medicine Meets Virtual Reality, vol. 119, pp. 449-454, IOS Press, Long Beach, CA, 2006e.
Rentschler et al., “Mobile in Vivo Biopsy Robot,” IEEE International Conference on Robotics and Automation, Orlando, Florida, May 2006, pp. 4155-4160.
Rentschler et al., "Miniature in vivo Robots for Remote and Harsh Environments," IEEE Transactions on Information Technology in Biomedicine, Jan. 2006; 12(1): 66-75.
Rentschler et al., “An in Vivo Mobile Robot for Surgical Vision and Task Assistance,” Journal of Medical Devices, Mar. 2007, vol. 1: 23-29.
Rentschler et al., “In vivo Mobile Surgical Robotic Task Assistance,” 1 pg.
Rentschler et al., "In vivo Robotics during the NEEMO 9 Mission," Medicine Meets Virtual Reality, Feb. 2007, 1 pg.
Rentschler et al., "In Vivo Robots for Laparoscopic Surgery," Studies in Health Technology and Informatics-Medicine Meets Virtual Reality, IOS Press, Newport Beach, CA, 2004a, 98: 316-322.
Rentschler et al., "Mechanical Design of Robotic in Vivo Wheeled Mobility," ASME Journal of Mechanical Design, 2006a, pp. 1-11.
Rentschler et al., "Mobile in Vivo Camera Robots Provide Sole Visual Feedback for Abdominal Exploration and Cholecystectomy," Journal of Surgical Endoscopy, 20-1: 135-138, 2006b.
Rentschler et al., “Mobile in Vivo Robots Can Assist in Abdominal Exploration,” from the Proceedings of the Society of American Gastrointestinal Endoscopic Surgeons (SAGES) Scientific Conference, Ft. Lauderdale, FL, Apr. 13-16, 2005b.
Rentschler et al., “Modeling, Analysis, and Experimental Study of in Vivo Wheeled Robotic Mobility,” IEEE Transactions on Robotics, 22 (2): 308-321, 2005c.
Rentschler et al., “Natural Orifice Surgery with an Endoluminal Mobile Robot,” The Society of American Gastrointestinal Endoscopic Surgeons, Dallas, TX, Apr. 2006d, 14 pp.
Rentschler et al., “Theoretical and Experimental Analysis of in Vivo Wheeled Mobility,” ASME Design Engineering Technical Conferences: 28th Biennial Mechanisms and Robotics Conference, Salt Lake City, Utah, Sep. 28-Oct. 2, 2004, pp. 1-9.
Rentschler et al., "Toward in Vivo Mobility," Studies in Health Technology and Informatics-Medicine Meets Virtual Reality, IOS Press, Long Beach, CA, 2005a, 111: 397-403.
Response to Rule 312 Amendment in related case U.S. Appl. No. 11/695,944, dated Jan. 12, 2009, 2 pp.
Riviere et al., “Toward Active Tremor Canceling in Handheld Microsurgical Instruments,” IEEE Transactions on Robotics and Automation, Oct. 2003, 19(5): 793-800.
Rosen et al., "Force Controlled and Teleoperated Endoscopic Grasper for Minimally Invasive Surgery-Experimental Performance Evaluation," IEEE Transactions on Biomedical Engineering, Oct. 1999; 46(10): 1212-1221.
Rosen et al., “Objective Laparoscopic Skills Assessments of Surgical Residents Using Hidden Markov Models Based on Haptic Information and Tool/Tissue Interactions,” Studies in Health Technology and Informatics-Medicine Meets Virtual Reality, Jan. 2001, 7 pp.
Rosen et al., “Spherical Mechanism Analysis of a Surgical Robot for Minimally Invasive Surgery—Analytical and Experimental Approaches,” Studies in Health Technology and Informatics-Medicine Meets Virtual Reality, pp. 442-448, Jan. 2005.
Rosen et al., “Task Decomposition of Laparoscopic Surgery for Objective Evaluation of Surgical Residents' Learning Curve Using Hidden Markov Model,” Computer Aided Surgery, vol. 7, pp. 49-61, 2002.
Rosen et al., “The Blue Dragon—A System of Measuring the Kinematics and the Dynamics of Minimally Invasive Surgical Tools In-Vivo,” Proc. of the 2002 IEEE International Conference on Robotics and Automation, Washington, DC, pp. 1876-1881, May 2002.
Ruurda et al., "Robot-Assisted surgical systems: a new era in laparoscopic surgery," Ann R. Coll Surg Engl., 2002; 84: 223-226.
Ruurda et al., "Feasibility of Robot-Assisted Laparoscopic Surgery," Surgical Laparoscopy, Endoscopy & Percutaneous Techniques, 2002; 12(1): 41-45.
Sackier et al., “Robotically assisted laparoscopic surgery,” Surgical Endoscopy, 1994; 8: 63-66.
Salky, “What is the Penetration of Endoscopic Techniques into Surgical Practice?” Digestive Surgery, 2000; 17:422-426.
Satava, “Surgical Robotics: The Early Chronicles,” Surgical Laparoscopy, Endoscopy & Percutaneous Techniques, 2002; 12(1): 6-16.
Schippers et al., (1996) “Requirements and Possibilities of Computer-Assisted Endoscopic Surgery,” In: Computer Integrated Surgery: Technology and Clinical Applications, pp. 561-565.
Schurr et al., "Robotics and Telemanipulation Technologies for Endoscopic Surgery," Surgical Endoscopy, 2000; 14: 375-381.
Schwartz, “In the Lab: Robots that Slink and Squirm,” The New York Times, Mar. 27, 2007, 4 pp.
Sharp LL-151-3D, http://www.sharp3d.com, 2006, 2 pp.
Slatkin et al., “The Development of a Robotic Endoscope,” Proceedings of the 1995 IEEE International Conference on Robotics and Automation, pp. 162-171, 1995.
Smart Pill, "Fantastic Voyage: Smart Pill to Expand Testing," http://www.smartpilldiagnostics.com, Apr. 13, 2005, 1 pg.
Southern Surgeons Club (1991), "A prospective analysis of 1518 laparoscopic cholecystectomies," N. Engl. J. Med. 324(16): 1073-1078.
Stefanini et al., "Modeling and Experiments on a Legged Microrobot Locomoting in a Tubular Compliant and Slippery Environment," Int. Journal of Robotics Research, vol. 25, No. 5-6, pp. 551-560, May-Jun. 2006.
“Long-term Pain: Less Common After Laparoscopic than Open Cholecystectomy,” British Journal of Surgery, 1994; 81: 1368-1370.
Strong et al., "Efficacy of Novel Robotic Camera vs. a Standard Laparoscopic Camera," Surgical Innovation vol. 12, No. 4, Dec. 2005, Westminster Publications, Inc., pp. 315-318.
Suzumori et al., “Development of Flexible Microactuator and its Applications to Robotics Mechanisms,” Proceedings of the IEEE International Conference on Robotics and Automation, 1991: 1622-1627.
Taylor et al., “A Telerobotic Assistant for Laparoscopic Surgery,” IEEE Eng Med Biol, 1995; 279-287.
Tendick et al. (1993), "Sensing and Manipulation Problems in Endoscopic Surgery: Experiment, Analysis, and Observation," Presence 2(1): 66-81.
Palm, William, “Rapid Prototyping Primer” May 1998 (revised Jul. 30, 2002) (http://www.me.psu.edu/lamancusa/rapidpro/primer/chapter2.htm).
Guo et al., “Micro Active Guide Wire Catheter System—Characteristic Evaluation, Electrical Model and Operability Evaluation of Micro Active Catheter,” Proceedings of the 1996 IEEE International Conference on Robotics and Automation, Apr. 1996: 2226-2231.
Guo et al., “Fish-like Underwater Microrobot with 3 DOF,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, May 2002: 738-743.
Tendick et al., “Applications of Micromechatronics in Minimally Invasive Surgery,” IEEE/ASME Transactions on Mechatronics, 1998; 3(1): 34-42.
Thomann et al., “The Design of a new type of Micro Robot for the Intestinal Inspection,” Proceedings of the 2002 IEEE Intl. Conference on Intelligent Robots and Systems, Oct. 2002: 1385-1390.
U.S. Appl. No. 60/180,960, filed Feb. 2000.
U.S. Appl. No. 60/956,032, filed Aug. 15, 2007.
U.S. Appl. No. 60/983,445, filed Oct. 29, 2007.
U.S. Appl. No. 60/990,062, filed Nov. 26, 2007.
U.S. Appl. No. 60/990,076, filed Nov. 26, 2007.
U.S. Appl. No. 60/990,086, filed Nov. 26, 2007.
U.S. Appl. No. 60/990,106, filed Nov. 26, 2007.
U.S. Appl. No. 60/990,470, filed Nov. 27, 2007.
U.S. Appl. No. 61/025,346, filed Feb. 1, 2008.
U.S. Appl. No. 61/030,588, filed Feb. 22, 2008.
U.S. Appl. No. 61/030,617, filed Feb. 22, 2008.
Way et al., (editors), “Fundamentals of Laparoscopic Surgery,” Churchill Livingstone Inc., 1995, 14 pp.
Wolfe et al., “Endoscopic Cholecystectomy: An analysis of Complications,” Arch. Surg. Oct. 1991; 126: 1192-1196.
Worn et al., "Esprit Project No. 33915: Miniaturised Robot for Micro Manipulation (MINIMAN)," Nov. 1998; http://www.ipr.ira.uka.de/~microbot/miniman.
Yu et al., “Microrobotic Cell Injection,” Proceedings of the 2001 IEEE International Conference on Robotics and Automation, May 2001; 620-625.
Yu, BSN, RN, "M2ATM Capsule Endoscopy a Breakthrough Diagnostic Tool for Small Intestine Imaging," vol. 25, No. 1, Gastroenterology Nursing, pp. 24-27.
International Search Report and Written Opinion of international application No. PCT/US2010/061137, mailed Feb. 11, 2011, 10 pp.
Abbou et al., “Laparoscopic Radical Prostatectomy with a Remote Controlled Robot,” The Journal of Urology, Jun. 2001, 165: 1964-1966.
Glukhovsky et al., "The development and application of wireless capsule endoscopy," Int. J. Med. Robot. Comput. Assist. Surgery, 2004; 1(1): 114-123.
Gong et al., "Wireless endoscopy," Gastrointestinal Endoscopy 2000; 51(6): 725-729.
Hanly et al., “Value of the SAGES Learning Center in introducing new technology,” Surgical Endoscopy, 2004; 19 (4): 477-483.
Hanly et al., "Robotic Abdominal Surgery," The American Journal of Surgery 188 (Suppl. to Oct. 2004): 19S-26S, 2004.
Heikkinen et al., “Comparison of laparoscopic and open Nissen fundoplication two years after operation: A prospective randomized trial,” Surgical Endoscopy, 2000; 14: 1019-1023.
Hissink, "Olympus Medical develops capsule camera technology," Dec. 2004, accessed Aug. 29, 2007, http://www.letsgodigital.org, 3 pp.
Horgan et al., “Technical Report: Robots in Laparoscopic Surgery,” Journal of Laparoendoscopic & Advanced Surgical Techniques, 2001; 11(6): 415-419.
Stoianovici et al., “Robotic Tools for Minimally Invasive Urologic Surgery”, Jan. 1, 2002, pp. 1-17.
Cleary et al., "State of the Art in Surgical Robotics: Clinical Applications and Technology Challenges," "Computer Aided Surgery," Jan. 1, 2002, pp. 312-328, vol. 6.
Green, “Telepresence Surgery”, Jan. 1, 1995, Publisher: IEEE Engineering in Medicine and Biology.
Related Publications (1)
Number Date Country
20140276944 A1 Sep 2014 US
Provisional Applications (1)
Number Date Country
61782413 Mar 2013 US