The various embodiments disclosed herein relate to robotic devices used for medical procedures and related methods. More specifically, each implementation of the various robotic devices and methods includes a robotic device having an arm.
Laparoscopy is minimally invasive surgery (MIS) performed in the abdominal cavity. It has become the treatment of choice for several routinely performed interventions.
However, known laparoscopy technologies are limited in scope and complexity due in part to (1) mobility restrictions resulting from using rigid tools inserted through access ports, and (2) limited visual feedback. That is, long rigid laparoscopic tools inserted through small incisions in the abdominal wall limit the surgeon's range of motion and therefore the complexity of the surgical procedures being performed. Similarly, using a 2-D image from a typically rigid laparoscope inserted through a small incision limits the overall understanding of the surgical environment. Further, current technology requires a third port to accommodate a laparoscope (camera), and each new viewpoint requires an additional incision.
Robotic systems such as the da Vinci® Surgical System (available from Intuitive Surgical, Inc., located in Sunnyvale, Calif.) have been developed to address some of these limitations using stereoscopic vision and more maneuverable end effectors. However, da Vinci® is still restricted by the access ports. Further disadvantages include the size and high cost of the da Vinci® system, the fact that the system is not available in most hospitals, and the system's limited sensory and mobility capabilities. In addition, most studies suggest that current robotic systems such as the da Vinci® system offer little or no improvement over standard laparoscopic instruments in the performance of basic skills. See Dakin, G. F. and Gagner, M. (2003) "Comparison of Laparoscopic Skills Performance Between Standard Instruments and Two Surgical Robotic Systems," Surgical Endoscopy 17: 574-579; Nio, D., Bemelman, W. A., den Boer, K. T., Dunker, M. S., Gouma, D. J., and van Gulik, T. M. (2002) "Efficiency of Manual vs. Robotical (Zeus) Assisted Laparoscopic Surgery in the Performance of Standardized Tasks," Surgical Endoscopy 16: 412-415; and Melvin, W. S., Needleman, B. J., Krause, K. R., Schneider, C., and Ellison, E. C. (2002) "Computer-Enhanced vs. Standard Laparoscopic Antireflux Surgery," J. Gastrointest Surg 6: 11-16. Further, the da Vinci® system and similar systems are implemented from outside the body and will therefore always be constrained to some degree by the limitations of working through small incisions. For example, these small incisions do not allow the surgeon to view or touch the surgical environment directly, and they constrain the motion of the endpoint of the tools and cameras to arcs of a sphere whose center is the insertion point.
There is a need in the art for improved surgical methods, systems, and devices.
One embodiment disclosed herein relates to a robotic device having an agent delivery component.
In one implementation, the device is a mobile robotic device having an agent delivery component. The device can also have a body configured to be disposed within a patient cavity, a translational mobility component, an actuator coupled with the translational mobility component, a power source coupled with the actuator, and a controller component coupled with the actuator. In one embodiment, the mobility component is configured to apply translational pressure on a surface for purposes of mobility or immobility.
Various embodiments of agent delivery components disclosed herein have at least one agent reservoir. Further embodiments have a mixing and discharge component in fluidic communication with the at least one reservoir. The delivery component can also have at least one delivery tube in fluidic communication with the at least one reservoir, a manifold in fluidic communication with the at least one delivery tube, and/or a cannula in fluidic communication with the manifold.
The device, in another embodiment, is a robotic device having a body; an agent delivery component; a rotation component comprising at least one of a pan component and a tilt component; a handle coupled with the body; and a non-attachable support component coupled with the body. According to one embodiment, the body, rotation component, and support component are sized to fit within an animal body cavity.
Various methods of performing a procedure are also disclosed. One implementation includes positioning a robotic device in a cavity inside the patient, operating a controller component to move the robotic device to a desired location within the cavity, and delivering an agent to the desired location with an agent delivery component. In one embodiment, the device has a body, a mobility component, an actuator coupled with the mobility component, a power source, a controller component, and an agent delivery component. In a further embodiment, the method includes using a biopsy tool to obtain a biopsy sample from the desired location prior to delivering the agent.
While multiple embodiments are disclosed, still other embodiments will become apparent to those skilled in the art from the following detailed description, which shows and describes illustrative embodiments of the invention. As will be realized, the embodiments disclosed herein are capable of modifications in various obvious aspects, all without departing from the spirit and scope of the various inventions. Accordingly, the drawings and detailed description are to be regarded as illustrative in nature and not restrictive.
The present invention relates to various embodiments of robotic devices for use in surgical methods and systems. Generally, the robotic devices are configured to be inserted into and/or positioned in a patient's body, such as a body cavity, for example.
The robotic devices fall into two general categories: mobile devices and stationary or “fixed base” devices. A “mobile device” includes any robotic device configured to move from one point to another within a patient's body via motive force created by a motor in the device. For example, certain embodiments of mobile devices are capable of traversing abdominal organs in the abdominal cavity. A “fixed base device” is any robotic device that is positioned by a user, such as a surgeon.
In one alternative embodiment, the device 10 also has a rotation translation component 20 or "tail." The tail 20 can limit counter-rotation and assist the device 10 in translating the rotation of the wheels 14 into movement from one point to another. The "rotation translation component" is any component or element that assists with the translation or conversion of the wheel rotation into movement of the device. In one embodiment, the tail is spring-loaded to retract, thus providing for easy insertion of the robotic device 10 through the entry port of a laparoscopic surgical tool.
In another implementation, the device 10 has no tail 20 and the wired connection component 18 or some other component serves to limit counter-rotation.
Alternatively, a mobile robotic device according to another embodiment can also have one or more operational components (also referred to herein as “manipulators”) and/or one or more sensor components. In these embodiments, the device may or may not have an imaging component. That is, the device can have any combination of one or more imaging components, one or more operational components, and one or more sensor components.
The operational component might be, for example, biopsy graspers. Further, the one or more sensor components could be chosen from, for example, sensors to measure temperature, blood or other tissue or body fluids, humidity, pressure, and/or pH.
In a further alternative, the connection component is a wireless connection component. That is, the controller is wirelessly coupled to, and wirelessly in connection with, the device 10. In such embodiments, the wireless connection component of the device 10 is a transceiver or a transmitter and a receiver to communicate wirelessly with an external component such as a controller.
In accordance with one implementation, a mobile robotic device could be used inside the body of a patient to assist with or perform a surgical procedure. In one aspect, the device is sized to fit through standard laparoscopic tools for use during laparoscopic surgery. In another alternative, the device is sized to be inserted through a natural orifice of the patient, such as the esophagus, as will be described in further detail below. In yet another alternative, the device can be sized and configured in any fashion to be used in surgical procedures.
Any of the several embodiments of mobile robotic devices described herein can be used in any number of ways. For example, one implementation of a mobile robotic device could provide visual feedback with a camera system and tissue dissection or biopsy component with a grasper attached to it. Further, such a robot could also be equipped with a sensor suite that could measure pressure, temperature, pH, humidity, etc.
It is understood that a robotic device as described generally above can take on any known configuration and be equipped with any number of sensors, manipulators, imaging devices, or other known components. That is, a robotic device conforming to certain aspects described herein can, in various embodiments, take on many different configurations, such as cylindrical or spherical shapes, or, alternatively, a shape such as that of a small vehicle, and is not limited to the cylindrical robotic devices depicted in
In addition, as shown in
In one aspect of the invention, the body 32 has a center portion 54 having a radius that is larger than the rest of the body 32. Alternatively, the center portion 54 has the same radius as the rest of the body 32. According to one embodiment, the body 32 can be constructed in any known fashion. For example, according to one embodiment, the body 32 is fabricated via machining or stereolithography.
The device 30 as shown in
In one implementation, the device 30 also has a wireless connection component (not shown) in the form of transmitter and a receiver (not shown) or a transceiver (not shown) for use in a wireless configuration of the device 30 such that any images collected by the camera 38 can be transmitted to an external component for viewing and/or storage of the image and further such that any control signals can be transmitted from an external controller or other external component to the motor 42 and/or other components of the device 30. Alternatively, the device 30 has a wired connection component (not shown) that is attached to the device 30.
In another implementation, the device 30 can also have a light component (not shown) to illuminate the area to be captured by the imaging component. Alternatively, the device 30 has no light component.
According to one embodiment, a robotic device similar to the device 30 depicted in
The device 30 depicted in
According to another embodiment, the robotic device 30 can be constructed without any sharp edges, thereby reducing damage to the patient during use of the device 30. In a further embodiment, the device 30 is comprised of biocompatible materials and/or materials that are easy to sterilize.
A mobile robotic device conforming to certain characteristics of various embodiments discussed herein has a transport component, which is also referred to herein as a “mobility component.” “Transport component” is any component that provides for moving or transporting the device between two points. In one example, the transport component is one or more wheels. For example, the transport components of the mobile robotic devices depicted in
Alternatively, a robotic device as described herein can have any known transport component. That is, the transport component is any known component that allows the device to move from one place to another. The present application contemplates use of alternative methods of mobility such as walking components, treads or tracks (such as used in tanks), hybrid components that include combinations of both wheels and legs, inchworm or snake configurations that move by contorting the body of the device, and the like.
According to one embodiment as depicted in
Each wheel 48, according to one implementation, has a surface texture on its exterior surface as shown in
The raised portion 58, according to one embodiment, defines an outer diameter (d_o), while the wheel 48 defines an inner diameter 56 (d_r). According to another embodiment, the inner and outer diameters of the wheels in one implementation are 17 mm and 20 mm, respectively, corresponding to a grouser depth of 1.5 mm, where grouser depth is equal to (d_o − d_r)/2. In a further alternative, the diameters and/or the grouser depth are any that would be useful for wheels on the mobile devices disclosed herein.
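The grouser-depth relationship above is simple enough to verify directly. The following minimal sketch merely restates that formula in Python (the function name and unit conventions are illustrative, not from the disclosure):

```python
# Grouser depth from the wheel geometry described above: half the
# difference between the outer diameter of the raised helical profile
# (d_o) and the inner wheel diameter (d_r).
def grouser_depth_mm(d_o_mm: float, d_r_mm: float) -> float:
    return (d_o_mm - d_r_mm) / 2.0

# Example dimensions given above: 20 mm outer, 17 mm inner.
print(grouser_depth_mm(20.0, 17.0))  # -> 1.5 (mm)
```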
In another embodiment, the helical profile 59 of the wheels has a pitch of 30° as depicted in
In accordance with one implementation, the transport component constitutes at least about 80% of the external surface area of the robotic device. Alternatively, the transport component constitutes at least about 90% of the external surface area of the robotic device. In a further alternative, the transport component constitutes from about 80% to about 98% of the external surface area of the robotic device. In yet another alternative, the transport component constitutes any percentage of the external surface area of the robotic device.
The wheels depicted in
In addition, the wheels depicted in
In certain alternative embodiments, the robotic device has one or more sensor components. In various embodiments, such sensor components include, but are not limited to, sensors to measure or monitor temperature, blood, any other bodily fluids, fluid composition, presence of various gases, such as CO2, for example, or other parameters thereof, humidity, electrical potential, heart rate, respiration rate, pressure, and/or pH. Further, the one or more sensor components can include one or more imaging components, which shall be considered to be a type of sensor component for purposes of this application. The sensors, including imaging devices, can be any such components or devices known in the art that are compatible with the various designs and configurations of the robotic devices disclosed herein.
According to one embodiment, a robotic device having one or more of the sensors described herein assists the user in the performance of a surgical procedure. In accordance with one implementation, the one or more sensors restore some of the natural monitoring or sensing capabilities that are inherently lost when using standard laparoscopic tools. Thus, the one or more sensor components allow the user to perform more complex procedures and/or more accurately monitor the procedure or the patient.
According to one embodiment, the imaging component can be a camera or any other imaging device. The imaging component can help to increase or improve the view of the area of interest (such as, for example, the area where a procedure will be performed) for the user. According to one embodiment, the imaging component provides real-time video to the user.
Current standard laparoscopes use rigid, single view cameras inserted through a small incision. The camera has a limited field of view and its motion is highly constrained. To obtain a new perspective using this prior art technique often requires the removal and reinsertion of the camera through another incision, increasing patient risk. In contrast to such limited imaging, a robotic device having one or more imaging components according to various embodiments described herein eliminates many of the limitations and disadvantages of standard laparoscopy, providing for an expanded and adjustable field of view with almost unlimited motion, thereby improving the user's visual understanding of the procedural area.
As used herein, the terms “imaging component,” “camera,” and “imaging device” are interchangeable and shall mean the imaging elements and processing circuitry which are used to produce the image signal that travels from the image sensor or collector to a viewing component. According to one embodiment, the image is a moving video image and the viewing component is a standard video viewing component such as a television or video monitor. Alternatively, the image is a still image. In a further alternative, the images are a combination of still and moving video images. The term “image sensor” as used herein means any component that captures images and stores them. In one embodiment, the image sensor is a sensor that stores such images within the structure of each of the pixels in an array of pixels. The terms “signal” or “image signal” as used herein, and unless otherwise more specifically defined, means an image which is found in the form of electrons which have been placed in a specific format or domain. The term “processing circuitry” as used herein refers to the electronic components within the imaging device which receive the image signal from the image sensor and ultimately place the image signal in a usable format. The terms “timing and control circuits” or “circuitry” as used herein refer to the electronic components which control the release of the image signal from the pixel array.
In accordance with one implementation, the imaging component is a small camera. In one exemplary embodiment, the imaging component is a complementary metal oxide semiconductor (“CMOS”) digital image sensor such as Model No. MT9V125 from Micron Technology, Inc., located in Boise, Id. Alternatively, the imaging component is a square 7 mm camera. In an alternative example, the camera can be any small camera similar to those currently used in cellular or mobile phones. In another example, the imaging device can be any imaging device currently used in or with endoscopic devices. In one embodiment, the imaging device is any device that provides a sufficient depth of field to observe the entire abdominal cavity.
According to another embodiment, the imaging device can employ any common solid state image sensor including a charge-coupled device (CCD), charge injection device (CID), photo diode array (PDA), or a CMOS sensor, which offers functionality with simplified system interfacing. For example, a suitable CMOS imager including active pixel-type arrays is disclosed in U.S. Pat. No. 5,471,515, which is hereby incorporated herein by reference in its entirety. This CMOS imager can incorporate a number of other different electronic controls that are usually found on multiple circuit boards of much larger size. For example, timing circuits and special functions such as zoom and anti-jitter controls can be placed on the same circuit board containing the CMOS pixel array without significantly increasing the overall size of the host circuit board. Alternatively, the imaging device is a CCD/CMOS hybrid available from Suni Microsystems, Inc. in Mountain View, Calif.
In accordance with one implementation, the imaging device provides video output in NTSC format. For example, any commercially-available small NTSC video format transmission chips suitable for the devices described herein can be used. Alternatively, any known video output in any known format can be incorporated into any device described herein.
The imaging component, according to one embodiment, has a manual focus adjustment component. Alternatively, the imaging component has a mechanically-actuated adjustable-focus component. A variety of adjustable-focus mechanisms are known in the art and suitable for actuating focusing of many types of known imaging components.
In one embodiment, the imaging component is capable of focusing in a range from about 2 mm to infinity. Alternatively, the imaging component can have a focusing range similar to that of any known adjustable focus camera.
Alternatively, the imaging component has an adjustable-focus mechanism 60 as depicted in
In accordance with another embodiment, the imaging component can be controlled externally to adjust various characteristics relating to image quality. For example, according to one embodiment, one or more of the following can be adjusted by a user: color, white balance, saturation, and/or any other known adjustable characteristic. According to one embodiment, this adjustment capability can provide quality feedback in poor viewing conditions such as, for example, low lighting.
According to one implementation, any mobile imaging device disclosed herein can have any known lens that can be used with such devices. In one particular embodiment, the lens is model no. DSL756A, a plastic lens available from Sunex, located in Carlsbad, Calif. This embodiment provides only a short depth of field, which requires adjustable-focus capability. To attain this, the lens of this implementation is attached to an actuation mechanism to provide adjustable focus capability. The lens is moved by the actuation mechanism to provide a range of focus from 2 mm to infinity. Alternatively, the lens can be any lens that can be incorporated into any of the imaging devices described herein.
In a further alternative, the imaging component can include an image stabilization component. For example, according to one embodiment, the device could combine on-board accelerometer measurements with image motion estimates derived from optical flow to yield base motion estimates, such as are known in the art. Alternatively, the image stabilization component can be any such commercially-available component. Optical flow has been shown to yield reliable estimates of displacements computed across successive image frames. Using these robot base motion estimates, an image stabilization algorithm can be used to provide image stabilization. Alternatively, any known image stabilization technology can be incorporated for use with the imaging component.
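As one hedged illustration of the optical-flow approach mentioned above (a minimal sketch assuming an OpenCV-based pipeline, not the embodiment's actual algorithm), the per-frame base motion of the camera could be estimated from the median displacement of tracked features:

```python
# Sketch: estimate camera base motion across successive frames from
# sparse optical flow (OpenCV). A stabilization routine could then
# shift each frame to compensate for the estimated motion.
import cv2
import numpy as np

def frame_motion_estimate(prev_gray: np.ndarray, gray: np.ndarray) -> np.ndarray:
    # Detect corner features in the previous frame.
    p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                 qualityLevel=0.01, minDistance=8)
    if p0 is None:
        return np.zeros(2)
    # Track those features into the current frame (Lucas-Kanade).
    p1, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, gray, p0, None)
    good = status.ravel() == 1
    if not good.any():
        return np.zeros(2)
    # Median displacement is robust to outliers from moving tissue or tools.
    return np.median((p1[good] - p0[good]).reshape(-1, 2), axis=0)
```

In a full implementation, this image-based estimate would be fused with the on-board accelerometer measurements noted above.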
In certain embodiments, the camera is fixed with respect to the body of the robotic device, such that the position of the robot must be changed in order to change the area to be viewed. Alternatively, the camera position can be changed with respect to the device such that the user can move the camera with respect to the robotic device. According to one embodiment, the user controls the position of the camera using a controller that is operably coupled to the device as described in further detail herein.
The robotic device can also, according to one embodiment, have a lighting component to light the area to be viewed. In one example, the lighting component is an LED light. Alternatively, the lighting component can be any illumination source.
According to one implementation, the camera is disposed on the center portion of the body of the device, as shown in
According to one embodiment, the robotic device has one or more operational components. The "operational component," as used herein, is intended to mean any component that performs some action or procedure related to a surgical or exploratory procedure. According to one embodiment, the operational component is also referred to as a "manipulator" and can be a clamp, scalpel, any type of biopsy tool, a grasper, forceps, stapler, cutting device, cauterizing device, ultrasonic burning device, or other similar component, as set forth in further detail herein. In yet another embodiment, the operational component is any device that can perform, or assist in the performance of, any known surgical or exploratory laparoscopic procedure. In one aspect, the one or more operational components assist with procedures requiring high dexterity. In currently known techniques, passing a rigid laparoscopic tool through a small incision restricts the movement and positioning of the tool tip. In contrast, a robotic device having an operational component inside a cavity is not subject to the same constraints.
In one implementation, the operational component can also include an arm or other positioning component. For example, the operational component can include an arm and a biopsy tool. Alternatively, the operational component can include a positioning component and any operational component as described above.
According to one embodiment, any operational component described or contemplated herein can be an off-the-shelf surgical tool or modified version thereof. Alternatively, any such operational component can be constructed de novo.
The operational component depicted in
The joints 84 are configured in any known fashion. In one example as depicted in
In one implementation, the arm was constructed using stereolithography. According to one embodiment, stereolithography can be used to construct the linkages and the base section out of a cured resin material similar to plastic.
The motor that can be used in the linkages, according to one embodiment, is a DC micromotor with encoders manufactured by MicroMo Electronics, located in Clearwater, Fla. The motor is a 6 V motor having a 15,800 rpm no-load speed, 0.057 oz-in stall torque, and a weight of 0.12 oz. The motor has an 8 mm diameter and is 16 mm long. Due to its high no-load speed, a precision planetary gearhead is used. Further description of the motor, gearhead, and an encoder that can be used with the motor are described in U.S. Pat. No. 7,199,545. Alternatively, the arm can use a low voltage motor, such as a 3 V motor.
In one implementation, the arm has an encoder used for the indication and control of both shaft velocity and the direction of rotation, as well as for positioning. In one embodiment, the encoder is a 10 mm magnetic encoder. It is 16.5 mm long, but only adds 11.5 mm to the total length of the assembly.
In one embodiment as depicted in
In one embodiment, the manipulator is a biopsy forceps or grasper. According to one aspect, the manipulator includes a biopsy forceps or graspers at one end of an arm.
In another embodiment, the manipulator of the present invention includes an actuation mechanism that generates forces required for operating the manipulator. For example, according to one embodiment in which the manipulator is a biopsy forceps or graspers, the manipulator also has an actuation mechanism that generates sufficient force to allow the forceps or graspers to cut/obtain a biopsy sample. According to one embodiment, the actuation mechanism generates a drawbar force of magnitude greater than 0.6 N. Alternatively, the actuation mechanism generates any amount of force sufficient to obtain a biopsy sample. In a further alternative, the actuation mechanism generates a sufficient force to operate any type of manipulator, such as a clamp, stapler, cutter, cauterizer, burner, etc.
In one embodiment, the body 104 also contains an imaging component (not shown), a camera lens 108, motor and video control boards (not shown), an actuation motor (not shown), and a mechanism for camera adjustable-focus (not shown). In this embodiment, the imaging component and lens 108 are offset to the side to allow space for the biopsy grasper 102. The wheel 110 on the camera side has slots 112 machined in it to allow space for the camera lens 108 to see the abdominal environment and the biopsy grasper 102. Alternatively, the camera and lens 108 are disposed anywhere on the robotic device 100 such that the camera can be used to view the surgical area and/or the biopsy grasper 102 during use. The device 100 has a wired connection component 114 that is connected to an external component (not shown).
In use, a robotic device with a camera and a biopsy tool such as the devices depicted in
In an alternative embodiment, the manipulator is a drug delivery component. That is, according to one implementation, robotic devices disclosed herein can have a drug delivery component or system that delivers an agent to an animal, including a human. In one embodiment, the agent is a hemostatic agent. Alternatively, the agent can be any deliverable composition for delivery to an animal, including a human.
In one embodiment, the dual reservoirs 162 of
According to one embodiment, the spring-loaded catch lever 176 is a shape memory alloy (SMA) component and is actuated with an SMA wire trigger. SMA wires are made of a nickel-titanium alloy that is easily stretched at room temperature. However, as the wires are heated by passing an electric current through them, they shorten in length and exert a force that is greater than the force required to stretch them. In one embodiment, the wires shorten in length by up to approximately 8% and exert approximately 5 times the force required to stretch them.
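The stated contraction of roughly 8% implies a simple sizing rule for the SMA trigger: the available stroke is the wire length times the contraction fraction. The sketch below is illustrative only; the 50 mm wire length is a hypothetical value, not taken from the disclosure:

```python
# Linear stroke available from an SMA wire that contracts ~8% when heated.
def sma_stroke_mm(wire_length_mm: float, contraction: float = 0.08) -> float:
    return wire_length_mm * contraction

# Hypothetical 50 mm trigger wire -> about 4 mm of stroke.
print(sma_stroke_mm(50.0))  # -> 4.0
```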
A further alternative embodiment of the actuator mechanism is depicted in
Alternatively, the actuator mechanism can be any known device for providing for linear displacement of the reservoir plungers 180 that dispense the agent. According to one implementation, the actuator ensures uniform delivery of the agent from the storage reservoir(s).
Alternatively, the mixing component is any known component for mixing two agents, including, but not limited to, hemostatic agents, that can be implemented with one or more of the robotic devices described herein.
In accordance with one aspect, the reservoir or reservoirs have at least one externally accessible loading port configured to allow for loading, injecting, or otherwise placing the agent or components into the reservoir. In one embodiment, the loading port is a standard rubber stopper and seal commonly used for vaccine vials. Such a rubber stopper and seal facilitates transfer of any agent using a standard syringe. Alternatively, the loading port is any known type of loading port of any known configuration. According to one embodiment, such a loading port is useful for known agents that must be reconstituted on-site shortly before use. As such, the loading port or ports accommodate the need for on-site loading of the compounds.
According to one aspect, any robotic device embodiment described herein is connected to an external controller via a connection component. According to one embodiment, the connection component is a wire, cord, or other physical flexible coupling. For purposes of this application, the physical or "wired" connection component is also referred to as "tethered" or "a tether." The flexible connection component can be any component that is coupled at one end to the robotic device and is flexible, pliable, or otherwise capable of being easily formed or manipulated into different shapes or configurations. According to one embodiment, the connection component includes one or more wires or cords or any other type of component operably coupled at the second end to an external unit or device. The component in this embodiment is configured to transmit or convey power and/or data, or anything else necessary or useful for operation of the device, between the robotic unit and the external unit or device. In a further alternative, the connection component comprises at least two wires or cords or other such components, each of which is connected to a separate external unit (which, in one example, are a power source and a data transmission and receiver unit as described below).
Alternatively, the connection component is a wireless connection component. That is, the robotic device communicates wirelessly with a controller or any other external component. The wireless coupling is also referred to herein as “untethered.” An “untethered device” or “wireless device” is intended for purposes of this application to mean any device that is fully enclosed within the body such that no portion of the device is external to the body for at least a portion of the surgical procedure or, alternatively, any device that operates within the body while the device is not physically connected to any external object for at least a portion of the surgical procedure. In one embodiment, an untethered robotic device transmits and receives data wirelessly, including data required for controlling the device. In this embodiment, the robotic device has an internal power supply, along with a receiver and transmitter for wireless connection.
The receiver and transmitter used with a wireless robotic device as described herein can be any known receiver and transmitter. For example, any known receiver and/or transmitter used in remote vehicle locking devices, remote controls, or mobile phones can be used.
In one embodiment, the data or information transmitted to the robotic device could include user command signals for controlling the device, such as signals to move or otherwise operate various components. According to one implementation, the data or information transmitted from the robotic device to an external component/unit could include data from the imaging component or any sensors. Alternatively, the data or information transmitted between the device and any external component/unit can be any data or information that may be useful in the operation of the device.
According to another implementation, any robotic device embodiment described herein is connected via a connection component not only to the external controller, but also to one or more other robotic devices, such devices being either as described herein or otherwise known in the art. That is, according to one embodiment, two or more robotic devices can be operably coupled to each other as well as an external unit or device. According to one embodiment in which there are two robotic devices, the two devices are operably coupled to each other and an external unit or device by a flexible connection component. That is, the two devices are operably coupled to each other by a flexible connection component that is coupled to each device and each device is also operably coupled to an external unit or device by a flexible connection component. In one embodiment, there are three separate flexible connection components: (1) a connection component connecting the two robotic devices, (2) a connection component connecting one of the robotic devices to the external unit, and (3) a connection component connecting the other of the robotic devices to the external unit. Alternatively, one connection component is operably coupled to both devices and the external unit. In a further alternative, any number of connection components can be used in any configuration to provide for connection of two robotic devices to each other and an external unit.
Alternatively, the two or more robotic devices are operably coupled to each other as well as an external unit or device in an untethered fashion. That is, the robotic devices are operably coupled to each other and an external unit or device in a fashion such that they are not physically connected. In one embodiment, the devices and the external unit are operably coupled wirelessly.
In one aspect, any robotic device described herein has a drive component. The “drive component,” as defined herein, is any component configured to provide motive force such that the robotic device can move from one place to another or some component or piece of the robotic device can move, including any such component as described herein. The drive component is also referred to herein as an “actuator.” In one implementation, the drive component is a motor.
The actuator can be chosen from any number of different actuators. For example, one actuator that can be incorporated into many, if not all, of the robotic devices described herein, is a brushless direct current motor, such as, for example, model no. SBL04-0829 with gearhead PG04-337 (available from Namiki Precision of California, which is located in Belmont, Calif.). According to one embodiment, this motor requires external connection, which is generally provided by a circuit supplied by the manufacturer. In another implementation, the motor is model no. SBL02-06H1 with gearhead PG02-337, also available from Namiki.
Alternatively, any brushless direct current motor can be used. In a further alternative, another motor that can be used to operate various components of a robotic device, such as a manipulator, is a permanent magnet DC motor made by MicroMo™ Electronics, Inc. (located in Clearwater, Fla.). In yet another alternative, any known permanent magnet DC motors can be used with the robotic devices described herein.
The motor runs on a nominal 3 V and can provide 10.6 mN·m stall torque at 80 rpm. This motor provides a design factor of 4 for the robot on a 75-degree slope (if frictional force is sufficient to prevent sliding).
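The design factor of 4 can be reproduced with basic statics. In the sketch below, the robot mass and wheel radius are hypothetical values chosen only to illustrate the calculation (the disclosure does not state them); the stall torque and slope angle are from the text:

```python
import math

def slope_design_factor(mass_kg: float, wheel_radius_m: float,
                        slope_deg: float, stall_torque_mNm: float,
                        n_driven_wheels: int = 2) -> float:
    # Torque each driven wheel must supply to climb the slope,
    # assuming friction prevents sliding (as stated above).
    required_mNm = (mass_kg * 9.81 * math.sin(math.radians(slope_deg))
                    * wheel_radius_m / n_driven_wheels) * 1e3
    return stall_torque_mNm / required_mNm

# Hypothetical 56 g robot with 10 mm wheel radius on a 75-degree slope:
print(round(slope_design_factor(0.056, 0.010, 75.0, 10.6), 1))  # -> 4.0
```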
In addition, other actuators that can be used with the robotic devices described herein include shape memory alloys, piezoelectric-based actuators, pneumatic motors, hydraulic motors, or the like. Alternatively, the robotic devices described herein can use any type of compatible actuator.
According to one embodiment, the actuator can have a control component, also referred to as a "control board." The control board can have a potentiometer that controls the speed of the motor via the voltage-divider relationship between its terminals. According to one embodiment, the control board can also control the direction of the motor's rotation.
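As a hedged illustration of this potentiometer-based speed control (a sketch under assumed values; the resistances, supply voltage, and linear mapping are illustrative, not from the disclosure):

```python
# The potentiometer's two segments form a voltage divider; the wiper
# voltage is mapped to a motor drive level (here, a 0-1 PWM duty cycle).
def wiper_voltage(v_supply: float, r_above: float, r_below: float) -> float:
    return v_supply * r_below / (r_above + r_below)

def duty_cycle(v_wiper: float, v_supply: float) -> float:
    return max(0.0, min(1.0, v_wiper / v_supply))

# Pot rotated so 7 kOhm is above the wiper and 3 kOhm below, 3 V supply:
v = wiper_voltage(3.0, 7000.0, 3000.0)  # 0.9 V at the wiper
print(duty_cycle(v, 3.0))               # -> 0.3 (30% duty cycle)
```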
In accordance with one implementation, any robotic device as described herein can have an external control component, also referred to herein as a “controller.” That is, at least some of the devices herein are operated by a controller that is positioned at a location external to the animal or human.
In one embodiment, the external control component transmits and/or receives data. In one example, the unit is a controller unit configured to control the operation of the robotic device by transmitting data such as electronic operational instructions via the connection component, wherein the connection component can be a wired or physical component or a wireless component. The data transmitted or conveyed by the connection component can also include, but is not limited to, electronic data collected by the device such as electronic photographs or biopsy data or any other type of data collected by the device. Alternatively, the external unit is any component, device, or unit that can be used to transmit or receive data.
According to one embodiment, the external component is a joystick controller. In another example, the external component is any component, device, or unit that can be used to control or operate the robotic device, such as a touch screen, a keyboard, a steering wheel, a button or set of buttons, or any other known control device. Further, the external component can also be a controller that is actuated by voice, such as a voice activation component. In addition, a controller may be purchased from commercial sources or constructed de novo, or a commercially available controller may be customized to control any robotic device or any robotic device components disclosed herein.
In one example, the controller includes the "thumb sticks" from a Playstation™ Dual-Shock controller. In this example, the Playstation™ controller has two analog thumb sticks, each with two degrees of freedom. This allows the operator to move each thumb stick a finite amount in an XY coordinate plane such that pushing the stick forward a little yields a different output than pushing the stick forward a great deal. That is, the thumb sticks provide speed control such that movement can be sped up or slowed down based on the amount that the stick is pushed in the corresponding direction.
According to one embodiment, the connections between the controller and the robotic device are configured such that each wheel is controlled by a separate joystick.
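A minimal sketch of this one-stick-per-wheel mapping follows (illustrative only; the deflection range, linear scaling, and 80 rpm ceiling are assumptions, the last borrowed from the motor speed noted earlier):

```python
# Each analog thumb stick's forward/back deflection (-1.0 .. 1.0) sets
# the speed of one wheel, so partial deflection yields proportionally
# slower motion and opposite deflections spin the robot in place.
def wheel_commands(left_stick_y: float, right_stick_y: float,
                   max_rpm: float = 80.0) -> tuple:
    clamp = lambda v: max(-1.0, min(1.0, v))
    return (clamp(left_stick_y) * max_rpm, clamp(right_stick_y) * max_rpm)

print(wheel_commands(1.0, 0.5))    # -> (80.0, 40.0): gentle right turn
print(wheel_commands(1.0, -1.0))   # -> (80.0, -80.0): spin in place
```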
In another example, the controller is a directional pad similar to the directional pad on an original Nintendo™ game system. The pad resembles a + sign and has four discrete directions.
In use, the controller can be used to control the movement of the robotic device and further to control the operation of any components of the device such as a sensor component, a manipulator component, or any other such component. For example, one embodiment of the controller controls the wheels, the focus adjustment of the camera, and further controls the biopsy tool.
In accordance with one embodiment, the control component also serves as a power source for the robotic device.
In accordance with one embodiment, a mobile robotic device is coupled to an image display component. The signal from the camera is transmitted in any format (e.g., NTSC, digital, PAL) to the image display component. According to one embodiment, the signal is a video signal or a still image signal. In one embodiment, the image display component is a video display that can be viewed by the operator. Alternatively, the image display component is a still image display. In a further alternative, the image display component displays video and still images. In one embodiment, the image display component is a standard video monitor. Those of ordinary skill in the art recognize that a signal from a camera can be processed to produce a display signal for many different types of display devices, including televisions configured to display an NTSC signal, televisions configured to display a PAL signal, cathode ray tube based computer monitors, LCD monitors, and plasma displays. In a further embodiment, the image display component is any known image display component capable of displaying the images collected by a camera that can be used with any of the robotic devices described herein.
In one embodiment, the image display component is a component of the controller.
A robotic device as described herein, according to one implementation, has a power source or power supply. According to one embodiment, the power source is integrated into the body of the robotic device. In this embodiment, the power source can be one or more batteries. The battery can be an alkaline, lithium, nickel-cadmium, or any other type of battery known in the art.
Alternatively, the power source is positioned in a location external to the body of the patient. In this embodiment, the connection component operably coupled to the power source and the robotic device transmits or conveys power between the power source and the robotic device. For example, the external power source according to one embodiment is an electrical power source such as a battery or any other source of electricity. In this example, the electricity is conveyed from the battery to the robotic device via the connection component, which is any known wire or cord configured to convey electricity, and thereby supplies power to the robotic device, including the motor of the robotic device. In one example, the power source is integrated into the control component or is operably coupled to the control component.
According to one embodiment, the power source can be any battery as described above. Alternatively, the power source can be magnetic induction, piezoelectrics, nuclear, fluid dynamic, solar or any other known power source that can be used to supply power to any robotic device described herein.
Certain embodiments of robotic devices disclosed herein relate to fixed base robots. As discussed above, a “fixed base robotic device” is any robotic device that has no propelled transport component or is positioned manually by a user. Such a device is also referred to herein as a “stationary” robotic device. In one embodiment, a fixed base robot has a camera and is positioned manually by the user to provide visual feedback or a visual overview of the target area. A fixed base robotic camera device according to one implementation facilitates the application of laparoscopy and other surgical techniques by providing a remote-control camera robot to provide visual feedback during a surgical procedure, thereby minimizing incisions and patient risk.
In one embodiment, the device 220 is made of a biocompatible material capable of being easily sterilized. According to one embodiment, the materials can include, but are not limited to, sterilizable plastics and/or metals. Alternatively, the device 220 can be made of any material that can be used in surgical procedures.
The body 222 can take on many different configurations, such as cylindrical or spherical shapes so as to be compatible with laparoscopic tools known currently in the art. However, as with the other components, the body 222 configuration is not limited to that exemplified herein. In general, the only constraints on the shape of the body are that the body be able to incorporate at least one of the components described herein.
The handle 232, according to one embodiment as depicted in
The light component 226, according to one embodiment, is configured to light the area to be viewed, also referred to as the “field of view.” In one implementation, the light component 226 is proximate to the imaging component to provide constant or variable illumination for the camera. Alternatively, the light component 226 is associated with the handle 232 as depicted in
In one example, the lighting component 226 is an LED light. Alternatively, an exemplary light source is two 5 mm LEDs. In a further alternative, the lighting component 226 can be any suitable illumination source.
In one implementation, the imaging component 224 depicted in
The imaging component can help to increase or improve the view of the area of interest (such as, for example, the area where a procedure will be performed) for the user. According to one embodiment, the imaging component provides real-time video to the user. Alternatively, the imaging component can be any imaging component as described above with respect to the mobile robotic devices.
In accordance with one implementation, the tilting component 242 is pivotally coupled to the body 248 via a pin (not shown). Alternatively, the tilting component can be a standard ratchet mechanism or any other type of suitable component known in the art. According to one embodiment, the tilting component 242 can tilt up to about 45 degrees from vertical (i.e., a range of about 90 degrees). Alternatively, the tilting component 242 can tilt any amount ranging from about 0 degrees to about 360 degrees from vertical, or the tilting component 242 can be configured to rotate beyond 360 degrees or can rotate multiple times. In certain embodiments such as the embodiment depicted in
The panning component, according to one embodiment, comprises two components 244, 246 that rotate with respect to each other as described above with respect to
In one aspect, any fixed base robotic device described herein has a drive component (not shown). In accordance with certain embodiments, the fixed base robotic device can have more than one drive component. For example, in one embodiment, a fixed base robotic device has a motor for actuating the panning component and another motor for actuating the tilting component. Such motors can be housed in the body component and/or the support component. In one example, the actuator or actuators are independent permanent magnet DC motors available from MicroMo™ Electronics, Inc. in Clearwater, Fla. Other suitable actuators include shape memory alloys, piezoelectric-based actuators, pneumatic motors, hydraulic motors, or the like. Alternatively, the drive component can be any drive component as described in detail above with respect to mobile robotic devices. In a further alternative embodiment, the panning and tilting components can be actuated manually.
In one embodiment, the actuator is coupled to a standard rotary-to-translatory coupling such as a lead screw, a gear, or a pulley. In this fashion, the force created by the actuator is translated with the rotary-to-translatory coupling.
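For a lead screw, the rotary-to-translatory conversion is simply travel equals lead times revolutions. A tiny illustrative sketch (the 0.5 mm lead is a hypothetical value, not from the disclosure):

```python
# Linear travel produced by a lead screw for a given number of turns.
def linear_travel_mm(lead_mm_per_rev: float, revolutions: float) -> float:
    return lead_mm_per_rev * revolutions

print(linear_travel_mm(0.5, 10.0))  # -> 5.0 mm for ten revolutions
```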
Moreover, it is also contemplated that the body or camera in certain embodiments could be capable of a side-to-side motion (e.g., yaw).
Various embodiments of fixed base robotic devices have an adjustable-focus component. For example, one embodiment of an adjustable-focus component 60 that can be incorporated into various embodiments of the fixed base robotic devices described herein is depicted in
According to one embodiment, the imaging component can have a lens cleaning component. For example, the lens cleaning component can be a wiper blade or sacrificial film composed of multiple layers for maintaining a clear view of the target environment. In a further embodiment, the lens cleaning component can be any known mechanism or component for cleaning a camera lens.
Certain embodiments of the fixed base robotic devices, such as the embodiment depicted in
The support component 266, as depicted in
According to one aspect, any fixed base robotic device embodiment described herein is connected to an external controller via a connection component. According to one embodiment, the connection component is any wired or flexible connection component embodiment or configuration as described above with respect to mobile robotic devices. Alternatively, the connection component is a wireless connection component according to any embodiment or configuration as described above with respect to mobile robotic devices. The receiver and transmitter used with a wireless robotic device as described herein can be any known receiver and transmitter, as also described above. According to another implementation described in additional detail above with respect to the mobile devices, any fixed base robotic device embodiment described herein can be connected via a (wired or wireless) connection component not only to the external controller, but also to one or more other robotic devices of any type or configuration, such devices being either as described herein or otherwise known in the art.
In one embodiment, the data or information transmitted to the robotic device could include user command signals for controlling the device, such as signals to move or otherwise operate various components. According to one implementation, the data or information transmitted from the robotic device to an external component/unit could include data from the imaging component or any sensors. Alternatively, the data or information transmitted between the device and any external component/unit can be any data or information that may be useful in the operation of the device.
In accordance with one implementation, any fixed base robotic device as described herein can have an external control component according to any embodiment as described above with respect to the mobile robotic devices. That is, at least some of the fixed base devices herein are operated by a controller that is positioned at a location external to the animal or human. In one embodiment, the external control component transmits and/or receives data. In one example, the unit is a controller unit configured to control the operation of the robotic device by transmitting data such as electronic operational instructions via the connection component, wherein the connection component can be a wired or physical component or a wireless component. Alternatively, the external unit is any component, device, or unit that can be used to transmit or receive data.
In use, the controller can be used to control the movement or operation of any components of the device such as the camera component, a sensor component, or any other component. For example, one embodiment of the controller controls the focus adjustment of the camera, and further controls the panning and/or tilting functions of the device.
According to one embodiment, the control component is configured to control the operation of the image sensor, the panning component, and the tilting component. In one embodiment, the control component transmits signals containing operational instructions relating to controlling each of those components, such as, for example, signals containing operational instructions to the image sensor relating to image quality adjustment, etc.
In accordance with one embodiment, the control component also serves as a power source for the robotic device.
According to one implementation, the fixed base robotic device is coupled to an image display component. The image display component can be any image display component as described above with respect to the mobile robotic devices.
A fixed base robotic device as described herein, according to one implementation, has a power source or power supply. According to one embodiment, the power source is any power source having any configuration as described above with respect to the mobile robotic devices. According to various embodiments, power can be provided by an external tether or an internal power source. When the device is wireless (that is, the connection component is wireless), an internal power supply can be used. Various implementations of the fixed base robotic devices can use alkaline, lithium, nickel-cadmium, or any other type of battery known in the art. Alternatively, the power source can be magnetic induction, piezoelectrics, fluid dynamics, solar power, or any other known power source. In a further alternative, the power source is a power unit positioned within the patient's body. In this embodiment, the power unit can be used to supply power not only to one or more robotic camera devices, but also to any other surgical robotic devices.
In one embodiment, the fixed base robotic device has one or more sensor components. In various embodiments, such sensor components include any of the sensor components as described above with respect to the mobile robotic devices.
According to one embodiment, the fixed base robotic device has one or more operational components. In various embodiments, such operational components include any of the operational components as described above with respect to mobile robotic devices. For example, one embodiment of a fixed base robotic device has an agent delivery component disposed within the body of the device. In another implementation, the operational component can also include an arm or other positioning component. For example, the operational component can include an arm and a biopsy tool. Alternatively, the operational component can include a positioning component and any operational component as described above.
According to one embodiment, any of the components on any fixed base robotic device as described herein can be known, commercially available components.
In use, any of the fixed base robotic devices can be used in various surgical procedures. For example, a fixed base device can be used in combination with a laparoscopic surgical tool, wherein the device is adapted to fit through a port of the laparoscopic surgical tool and is used for obtaining an internal image of an animal. In still other embodiments, the whole of the device is introduced into an open space to obtain internal images.
Alternatively, the fixed base robotic devices can be used in oral surgery and general dental procedures to provide an image of particularly difficult-to-access locations. Additionally, it will also be appreciated by those skilled in the art that the devices set forth herein can be applied to other functional disciplines wherein the device can be used to view difficult-to-access locations for industrial equipment and the like. For example, the device could be used to replace many industrial borescopes.
Any of the robotic devices described herein can be used in various different surgical methods or procedures in which the device is used inside the patient's body. That is, the robotic devices can be used inside the patient's body to perform a surgical task or procedure and/or provide visual feedback to the user.
According to one embodiment, any of the mobile devices described above can be inserted entirely into the patient, wherein the patient can be any animal, including a human. In known laparoscopic procedures, the use of small incisions reduces patient trauma, but also limits the surgeon's ability to view and touch directly the surgical environment, resulting in poor sensory feedback, limited imaging, and limited mobility and dexterity. In contrast, the methods described herein using the various robotic devices inside the body can provide vision and surgical assistance and/or perform surgical procedures while the robotic device is not constrained by the entry incision.
In one embodiment, any of the above devices can be used inside an abdominal cavity in minimally invasive surgery, such as laparoscopy. Certain of the devices are sized and configured to fit through standard laparoscopic tools. According to one embodiment, the use of a robotic device inserted through one standard laparoscopy port eliminates the need for the second port required in standard laparoscopic procedures.
According to one embodiment, robotic devices as described herein having a camera can allow for planning of trocar insertion and tool placement, as well as for providing additional visual cues that will help the operator to explore and understand the surgical environment more easily and completely. Known laparoscopes use rigid, single-view cameras with limited fields of view inserted through a small incision. Obtaining a new perspective with these prior art devices often requires removing the camera and reinserting it through another incision, thereby increasing patient risk. In contrast, the robotic devices with cameras as described herein provide one or more robots inside the abdominal cavity that deliver additional cavity images and easy adjustment of the field of view, improving the surgeon's geometric understanding of the surgical area. The ability to reposition a camera rapidly to arbitrary locations will help the surgeon maintain optimal orientation with respect to other tools.
In accordance with one implementation, any of the mobile robotic devices described herein can be used not only in traditional surgical environments such as hospitals, but also in forward environments such as battlefield situations.
Although the present invention has been described with reference to preferred embodiments, persons skilled in the art will recognize that changes may be made in form and detail without departing from the spirit and scope of the invention.
This example is an examination of a biopsy tool design for a mobile robotic device. The device should produce sufficient clamping and drawbar forces to biopsy porcine tissue.
To examine clamping and drawbar forces used during a biopsy, experimental biopsies were conducted. A biopsy forceps device that is commonly used for tissue sampling during esophago-gastroduodenoscopy (EGD) and colonoscopies was modified to measure cutting forces during tissue biopsy. These forceps 280, shown schematically in
The diameter of the forceps (h) depicted in
For a cable force of 10 N, the force at the tip was approximately 1.4 N for this design where a was 2.9 mm, b was 1.7 mm, and d was 0.65 mm. The maximum area of the forceps in contact with tissue during a biopsy was 0.3756 mm2.
Assuming an even distribution of force, the applied pressure was approximately 3.75 MPa. However, taking a smaller "bite" reduces the contact area, which drastically increases the applied pressure while decreasing the required force.
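By way of illustration, the figures above can be reproduced with a short calculation. The following is a minimal sketch, assuming a simple moment balance in which the tip force scales with the cable force by the ratio d/(a + b); that ratio is an assumption adopted here because it reproduces the reported numbers, not a mechanism detail taken from this example.

```python
# Minimal sketch of the forceps force and pressure arithmetic above.
# The lever ratio d / (a + b) is an assumed moment balance, chosen because
# it reproduces the reported ~1.4 N tip force; it is not taken from the text.

def tip_force(cable_force_n, a_mm, b_mm, d_mm):
    """Approximate jaw-tip force (N) from cable force via the assumed lever ratio."""
    return cable_force_n * d_mm / (a_mm + b_mm)

def contact_pressure(force_n, area_mm2):
    """Average pressure in MPa (N/mm^2), assuming an even force distribution."""
    return force_n / area_mm2

f_tip = tip_force(10.0, a_mm=2.9, b_mm=1.7, d_mm=0.65)  # ~1.41 N
p = contact_pressure(f_tip, 0.3756)                     # ~3.76 MPa
print("tip force ~ %.2f N, pressure ~ %.2f MPa" % (f_tip, p))
```

Halving the contact area in this sketch doubles the computed pressure, which is the effect of the smaller "bite" described above.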
A normal biopsy device 300 was modified to contain a load cell 302 to measure clamping forces indirectly, as shown in
Measurements of cable force were made while sampling liver, omentum, small bowel and the abdominal wall of an anesthetized pig. Representative results for a liver biopsy are shown in
Generally, biopsy forceps do not completely sever the tissue. When this is the case, the forceps are gently pulled to free the sample. This extraction force must also be produced by a biopsy robot. The magnitude of the extraction force had to be determined so that a robot could be designed to provide sufficient drawbar force to free the sample.
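One way such measurements could be turned into a design requirement is sketched below; the per-trial extraction forces and the safety factor are hypothetical placeholders for illustration, not data from this example.

```python
# Hypothetical sketch: deriving a drawbar-force requirement from bench-top
# extraction-force trials. The trial values and safety factor below are
# illustrative placeholders, not measurements from this example.

extraction_forces_n = [0.4, 0.6, 0.5, 0.7]  # hypothetical peak pull force per trial (N)
safety_factor = 2.0                         # hypothetical design margin

required_drawbar_n = safety_factor * max(extraction_forces_n)
print("design drawbar force >= %.1f N" % required_drawbar_n)
```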
A laboratory test jig was built to measure the force needed to free a biopsy sample of bovine liver. After clamping the sample with the biopsy forceps, a load cell attached to the handle of the device was gently pulled to free the sample while the tensile force was recorded. Representative results shown in
As indicated, a complete cut of the tissue is rarely achieved, and some tearing is needed to extract the sample. To obtain a biopsy sample, the in vivo robot embodiment of the present example should produce enough drawbar force to pull the sample free. A biopsy robot similar to the devices shown in
In the second test, for which results are depicted in
As depicted in
A direct current motor 320 drives the lead screw 322 vertically as the linkage 326 transforms the vertical motion of the lead nut 324 to the horizontal translation of the slider 328. This allows for a large mechanical advantage at the point when the graspers are nearly closed.
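The origin of that mechanical advantage can be seen with an idealized, lossless slider-crank model, which is an illustrative assumption rather than the exact linkage geometry of this embodiment: a link inclined at angle theta to the slider axis converts vertical lead-nut force into horizontal slider force scaled by 1/tan(theta), so the available grasping force grows sharply as the jaws approach closure.

```python
import math

# Idealized, lossless slider-crank model (an illustrative assumption, not the
# exact linkage geometry of this embodiment). A link at angle theta from the
# slider axis converts vertical lead-nut force F_v into horizontal slider
# force F_h = F_v / tan(theta), so mechanical advantage rises as theta -> 0.

def slider_force(nut_force_n, theta_deg):
    return nut_force_n / math.tan(math.radians(theta_deg))

for theta in (45, 20, 10, 5):
    print("theta = %2d deg -> %5.1f N of slider force per N of nut force"
          % (theta, slider_force(1.0, theta)))
```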
Force measurements were made in the laboratory to determine the maximum amount of force that could be produced using the biopsy robot embodiment of this example. Representative results from these tests are shown in
In vivo mobility testing with the embodiment discussed herein indicated that the wheel design of the instant embodiment produces sufficient drawbar forces to maneuver within the abdominal environment, allowing the robot to traverse all of the abdominal organs (liver, spleen, small and large bowel), as well as climb organs two to three times its height. These tests were performed without causing any visible tissue damage.
After exploring the abdominal environment, the biopsy mechanism described in this example was used to acquire three samples of hepatic tissue from the liver of the animal. The robot camera was used to find a suitable sample site. The biopsy graspers were opened and the sample site was penetrated with the biopsy forceps' spike. The graspers were then actuated, cutting nearly all of the tissue sample free. The robot was then driven slowly away from the sample site, thereby pulling the sample free. The tissue sample was retrieved after the robot was extracted through the entry incision. This demonstrated a successful one-port biopsy and successful tissue manipulation by an in vivo robot, according to one embodiment.
A laboratory two-component drug delivery system is shown in
The ability of this system to adequately mix liquid and solid components of a drug was evaluated in a series of bench top experiments. The liquid and solid drug components were simulated using commonly available materials (e.g., corn starch, dyed saline solution, etc.). One visual metric of mixing efficiency is the color uniformity of the mixture, determined by measuring its RGB color components using image processing software. Representative results are shown in
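One plausible implementation of such a uniformity metric is sketched below, assuming the metric is the per-channel standard deviation of RGB values over the imaged mixture region; the exact image processing software used in these experiments is not specified here.

```python
import numpy as np

# Sketch of an RGB color-uniformity metric (an assumed implementation; the
# exact software used in these experiments is not specified). Lower
# per-channel standard deviation indicates a more uniformly mixed sample.

def color_uniformity(rgb_image):
    """rgb_image: H x W x 3 array. Returns the std of each channel (R, G, B)."""
    pixels = rgb_image.reshape(-1, 3).astype(float)
    return pixels.std(axis=0)

# Synthetic examples: a uniform image vs. an unmixed two-color image.
well_mixed = np.full((64, 64, 3), 128, dtype=np.uint8)
poorly_mixed = np.zeros((64, 64, 3), dtype=np.uint8)
poorly_mixed[:, :32] = (200, 50, 50)  # one unmixed component
poorly_mixed[:, 32:] = (50, 50, 200)  # the other component
print(color_uniformity(well_mixed))    # [0. 0. 0.] -> perfectly uniform
print(color_uniformity(poorly_mixed))  # large R and B deviations
```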
Bench top tests were also conducted to determine the force that could be applied by an actuation mechanism that could be incorporated into this type of drug delivery tool. One type of mechanism might use a permanent magnet direct current motor (MicroMo, 2005) with a lead screw mounted on the motor shaft. Rotation of the lead screw would move a lead nut attached to the fluid reservoir plunger in and out to dispense the two drug components. This concept was implemented in a test jig 180, illustrated in
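The plunger force available from such a motor and lead screw can be estimated with the standard power-screw relation F = 2*pi*eta*T / lead; in the sketch below the torque, lead, and efficiency are hypothetical placeholders rather than specifications of the motor used in this example.

```python
import math

# Hypothetical sizing sketch for the lead-screw plunger drive, using the
# standard power-screw relation F = 2*pi*eta*T / lead. The torque, lead,
# and efficiency values are illustrative placeholders, not the specs of
# the motor used in this example.

def plunger_force(torque_nm, lead_m, efficiency):
    return 2.0 * math.pi * efficiency * torque_nm / lead_m

force = plunger_force(torque_nm=0.002, lead_m=0.0005, efficiency=0.3)
print("available plunger force ~ %.1f N" % force)  # ~7.5 N
```

Under these placeholder values, the available force would comfortably exceed the 5 N dispensing requirement discussed next.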
Nagelschmidt (1999) found that the maximum force required to mix and dispense fibrin-based hemostatic agents through 1 mm diameter catheters 27 cm long was less than 5 N. These results strongly suggest that the actuation mechanism described above will generate sufficient forces to deliver dual component fibrin-based hemostatic agents.
This application claims priority as a continuation of U.S. patent application Ser. No. 13/107,272, filed on May 13, 2011, which claims priority as a continuation of U.S. patent application Ser. No. 12/816,909, filed on Jun. 16, 2010, which issued on Jun. 14, 2011 as U.S. Pat. No. 7,960,935, which claims priority as a continuation of U.S. patent application Ser. No. 11/947,097, filed on Nov. 29, 2007, which issued on Aug. 10, 2010 as U.S. Pat. No. 7,772,796, which claims priority to U.S. Provisional Patent Application Ser. No. 60/868,030, filed Nov. 30, 2006 and further claims priority as a continuation-in-part of U.S. patent application Ser. No. 11/695,944, filed on Apr. 3, 2007, which issued on Feb. 17, 2009 as U.S. Pat. No. 7,492,116, which is a continuation of U.S. patent application Ser. No. 11/398,174, filed on Apr. 5, 2006, which issued on Apr. 3, 2007 as U.S. Pat. No. 7,199,545, which is a continuation of U.S. patent application Ser. No. 10/616,096, filed on Jul. 8, 2003, which issued on May 9, 2006 as U.S. Pat. No. 7,042,184, all of which are hereby incorporated herein by reference in their entireties. Further, U.S. patent application Ser. No. 11/947,097 claims priority as a continuation-in-part of U.S. patent application Ser. No. 11/932,516, filed on Oct. 31, 2007, now abandoned, which is a continuation of U.S. patent application Ser. No. 11/403,756, filed on Apr. 13, 2006, which issued on Mar. 4, 2008 as U.S. Pat. No. 7,339,341, which is a continuation-in-part of U.S. patent application Ser. No. 10/616,096, filed on Jul. 8, 2003, which issued on May 9, 2006 as U.S. Pat. No. 7,042,184, all of which are hereby incorporated herein by reference in their entireties. U.S. patent application Ser. No. 11/947,097 also claims priority as a continuation-in-part of U.S. patent application Ser. No. 11/932,441, filed on Oct. 31, 2007, which is a continuation of U.S. patent application Ser. No. 11/552,379, filed on Oct. 24, 2006, which issued on May 13, 2008 as U.S. Pat. No. 7,372,229, which is a continuation of U.S. patent application Ser. No. 11/338,166, filed on Jan. 24, 2006, which issued on Oct. 24, 2006 as U.S. Pat. No. 7,126,303, which is a continuation-in-part of U.S. patent application Ser. No. 10/616,096, filed on Jul. 8, 2003, which issued on May 9, 2006 as U.S. Pat. No. 7,042,184, all of which are hereby incorporated herein by reference in their entireties.
Number | Name | Date | Kind |
---|---|---|---|
3870264 | Robinson | Mar 1975 | A |
3989952 | Hohmann | Nov 1976 | A |
4258716 | Sutherland | Mar 1981 | A |
4278077 | Mizumoto | Jul 1981 | A |
4538594 | Boebel et al. | Sep 1985 | A |
4568311 | Miyaki | Feb 1986 | A |
4736645 | Zimmer | Apr 1988 | A |
4771652 | Zimmer | Sep 1988 | A |
4852391 | Ruch et al. | Aug 1989 | A |
4896015 | Taboada et al. | Jan 1990 | A |
4922755 | Oshiro et al. | May 1990 | A |
4990050 | Tsuge et al. | Feb 1991 | A |
5019968 | Wang et al. | May 1991 | A |
5172639 | Wiesman et al. | Dec 1992 | A |
5176649 | Wakabayashi | Jan 1993 | A |
5178032 | Zona et al. | Jan 1993 | A |
5187032 | Sasaki et al. | Feb 1993 | A |
5187796 | Wang et al. | Feb 1993 | A |
5195388 | Zona et al. | Mar 1993 | A |
5201325 | McEwen et al. | Apr 1993 | A |
5271384 | McEwen et al. | Dec 1993 | A |
5284096 | Pelrine et al. | Feb 1994 | A |
5297443 | Wentz | Mar 1994 | A |
5297536 | Wilk | Mar 1994 | A |
5304899 | Sasaki et al. | Apr 1994 | A |
5307447 | Asano et al. | Apr 1994 | A |
5353807 | DeMarco | Oct 1994 | A |
5363935 | Schempf et al. | Nov 1994 | A |
5382885 | Salcudean et al. | Jan 1995 | A |
5388528 | Pelrine et al. | Feb 1995 | A |
5436542 | Petelin et al. | Jul 1995 | A |
5441494 | Ortiz | Aug 1995 | A |
5458131 | Wilk | Oct 1995 | A |
5458583 | McNeely et al. | Oct 1995 | A |
5458598 | Feinberg et al. | Oct 1995 | A |
5471515 | Fossum et al. | Nov 1995 | A |
5515478 | Wang | May 1996 | A |
5524180 | Wang et al. | Jun 1996 | A |
5553198 | Wang et al. | Sep 1996 | A |
5562448 | Mushabac | Oct 1996 | A |
5588442 | Scovil et al. | Dec 1996 | A |
5620417 | Jang et al. | Apr 1997 | A |
5623582 | Rosenberg | Apr 1997 | A |
5624398 | Smith et al. | Apr 1997 | A |
5632761 | Smith et al. | May 1997 | A |
5645520 | Nakamura et al. | Jul 1997 | A |
5657429 | Wang et al. | Aug 1997 | A |
5657584 | Hamlin | Aug 1997 | A |
5674030 | Sigel | Oct 1997 | A |
5728599 | Rostoker et al. | Mar 1998 | A |
5736821 | Suyama et al. | Apr 1998 | A |
5754741 | Wang et al. | May 1998 | A |
5762458 | Wang et al. | Jun 1998 | A |
5769640 | Jacobus et al. | Jun 1998 | A |
5791231 | Cohn et al. | Aug 1998 | A |
5792135 | Madhani et al. | Aug 1998 | A |
5797900 | Madhani et al. | Aug 1998 | A |
5807377 | Madhani et al. | Sep 1998 | A |
5815640 | Wang et al. | Sep 1998 | A |
5825982 | Wright et al. | Oct 1998 | A |
5841950 | Wang et al. | Nov 1998 | A |
5845646 | Lemelson | Dec 1998 | A |
5855583 | Wang et al. | Jan 1999 | A |
5876325 | Mizuno et al. | Mar 1999 | A |
5878193 | Wang et al. | Mar 1999 | A |
5878783 | Smart | Mar 1999 | A |
5895417 | Pomeranz et al. | Apr 1999 | A |
5906591 | Dario et al. | May 1999 | A |
5907664 | Wang et al. | May 1999 | A |
5911036 | Wright et al. | Jun 1999 | A |
5971976 | Wang et al. | Oct 1999 | A |
6001108 | Wang et al. | Dec 1999 | A |
6007550 | Wang et al. | Dec 1999 | A |
6030365 | Laufer | Feb 2000 | A |
6031371 | Smart | Feb 2000 | A |
6058323 | Lemelson | May 2000 | A |
6063095 | Wang et al. | May 2000 | A |
6066090 | Yoon | May 2000 | A |
6102850 | Wang et al. | Aug 2000 | A |
6107795 | Smart | Aug 2000 | A |
6132368 | Cooper | Oct 2000 | A |
6132441 | Grace | Oct 2000 | A |
6156006 | Brosens et al. | Dec 2000 | A |
6159146 | El Gazayerli | Dec 2000 | A |
6162171 | Ng et al. | Dec 2000 | A |
D438617 | Cooper et al. | Mar 2001 | S |
6206903 | Ramans | Mar 2001 | B1 |
D441076 | Cooper et al. | Apr 2001 | S |
6223100 | Green | Apr 2001 | B1 |
D441862 | Cooper et al. | May 2001 | S |
6238415 | Sepetka et al. | May 2001 | B1 |
6240312 | Alfano et al. | May 2001 | B1 |
6241730 | Alby | Jun 2001 | B1 |
6244809 | Wang et al. | Jun 2001 | B1 |
6246200 | Blumenkranz et al. | Jun 2001 | B1 |
D444555 | Cooper et al. | Jul 2001 | S |
6286514 | Lemelson | Sep 2001 | B1 |
6292678 | Hall et al. | Sep 2001 | B1 |
6293282 | Lemelson | Sep 2001 | B1 |
6296635 | Smith et al. | Oct 2001 | B1 |
6309397 | Julian et al. | Oct 2001 | B1 |
6309403 | Minoret et al. | Oct 2001 | B1 |
6312435 | Wallace et al. | Nov 2001 | B1 |
6321106 | Lemelson | Nov 2001 | B1 |
6327492 | Lemelson | Dec 2001 | B1 |
6331181 | Tierney et al. | Dec 2001 | B1 |
6346072 | Cooper | Feb 2002 | B1 |
6352503 | Matsui et al. | Mar 2002 | B1 |
6364888 | Niemeyer et al. | Apr 2002 | B1 |
6371952 | Madhani et al. | Apr 2002 | B1 |
6394998 | Wallace et al. | May 2002 | B1 |
6398726 | Ramans et al. | Jun 2002 | B1 |
6400980 | Lemelson | Jun 2002 | B1 |
6408224 | Okamoto et al. | Jun 2002 | B1 |
6424885 | Niemeyer et al. | Jul 2002 | B1 |
6432112 | Brock et al. | Aug 2002 | B2 |
6436107 | Wang et al. | Aug 2002 | B1 |
6441577 | Blumenkranz et al. | Aug 2002 | B2 |
6450104 | Grant et al. | Sep 2002 | B1 |
6451027 | Cooper et al. | Sep 2002 | B1 |
6454758 | Thompson et al. | Sep 2002 | B1 |
6459926 | Nowlin et al. | Oct 2002 | B1 |
6463361 | Wang et al. | Oct 2002 | B1 |
6468203 | Belson | Oct 2002 | B2 |
6468265 | Evans et al. | Oct 2002 | B1 |
6470236 | Ohtsuki | Oct 2002 | B2 |
6491691 | Morley et al. | Dec 2002 | B1 |
6491701 | Tierney et al. | Dec 2002 | B2 |
6493608 | Niemeyer et al. | Dec 2002 | B1 |
6496099 | Wang et al. | Dec 2002 | B2 |
6508413 | Bauer et al. | Jan 2003 | B2 |
6512345 | Borenstein | Jan 2003 | B2 |
6522906 | Salisbury, Jr. et al. | Feb 2003 | B1 |
6544276 | Azizi | Apr 2003 | B1 |
6548982 | Papanikolopoulos et al. | Apr 2003 | B1 |
6554790 | Moll | Apr 2003 | B1 |
6565554 | Niemeyer | May 2003 | B1 |
6574355 | Green | Jun 2003 | B2 |
6587750 | Gerbi et al. | Jul 2003 | B2 |
6591239 | McCall et al. | Jul 2003 | B1 |
6594552 | Nowlin et al. | Jul 2003 | B1 |
6610007 | Belson et al. | Aug 2003 | B2 |
6620173 | Gerbi et al. | Sep 2003 | B2 |
6642836 | Wang et al. | Nov 2003 | B1 |
6645196 | Nixon et al. | Nov 2003 | B1 |
6646541 | Wang et al. | Nov 2003 | B1 |
6648814 | Kim et al. | Nov 2003 | B2 |
6659939 | Moll et al. | Dec 2003 | B2 |
6661571 | Shioda et al. | Dec 2003 | B1 |
6671581 | Niemeyer et al. | Dec 2003 | B2 |
6676684 | Morley et al. | Jan 2004 | B1 |
6684129 | Salisbury, Jr. et al. | Jan 2004 | B2 |
6685648 | Flaherty et al. | Feb 2004 | B2 |
6685698 | Morley et al. | Feb 2004 | B2 |
6687571 | Byrne et al. | Feb 2004 | B1 |
6692485 | Brock et al. | Feb 2004 | B1 |
6699177 | Wang et al. | Mar 2004 | B1 |
6699235 | Wallace et al. | Mar 2004 | B2 |
6702734 | Kim et al. | Mar 2004 | B2 |
6702805 | Stuart | Mar 2004 | B1 |
6714839 | Salisbury, Jr. et al. | Mar 2004 | B2 |
6714841 | Wright et al. | Mar 2004 | B1 |
6719684 | Kim et al. | Apr 2004 | B2 |
6720988 | Gere et al. | Apr 2004 | B1 |
6726699 | Wright et al. | Apr 2004 | B1 |
6728599 | Wright et al. | Apr 2004 | B2 |
6730021 | Vassiliades, Jr. et al. | May 2004 | B2 |
6731988 | Green | May 2004 | B1 |
6746443 | Morley et al. | Jun 2004 | B1 |
6764441 | Chiel et al. | Jul 2004 | B2 |
6764445 | Ramans et al. | Jul 2004 | B2 |
6766204 | Niemeyer et al. | Jul 2004 | B2 |
6770081 | Cooper et al. | Aug 2004 | B1 |
6774597 | Borenstein | Aug 2004 | B1 |
6776165 | Jin | Aug 2004 | B2 |
6780184 | Tanrisever | Aug 2004 | B2 |
6783524 | Anderson et al. | Aug 2004 | B2 |
6785593 | Wang et al. | Aug 2004 | B2 |
6788018 | Blumenkranz | Sep 2004 | B1 |
6793653 | Sanchez et al. | Sep 2004 | B2 |
6799065 | Niemeyer | Sep 2004 | B1 |
6799088 | Wang et al. | Sep 2004 | B2 |
6801325 | Farr et al. | Oct 2004 | B2 |
6804581 | Wang et al. | Oct 2004 | B2 |
6810281 | Brock et al. | Oct 2004 | B2 |
6817972 | Snow | Nov 2004 | B2 |
6817974 | Cooper et al. | Nov 2004 | B2 |
6817975 | Farr et al. | Nov 2004 | B1 |
6820653 | Schempf et al. | Nov 2004 | B1 |
6824508 | Kim et al. | Nov 2004 | B2 |
6824510 | Kim et al. | Nov 2004 | B2 |
6832988 | Sprout | Dec 2004 | B2 |
6832996 | Woloszko et al. | Dec 2004 | B2 |
6836703 | Wang et al. | Dec 2004 | B2 |
6837846 | Jaffe et al. | Jan 2005 | B2 |
6837883 | Moll et al. | Jan 2005 | B2 |
6839612 | Sanchez et al. | Jan 2005 | B2 |
6840938 | Morley et al. | Jan 2005 | B1 |
6850817 | Green | Feb 2005 | B1 |
6852107 | Wang et al. | Feb 2005 | B2 |
6858003 | Evans et al. | Feb 2005 | B2 |
6860346 | Burt et al. | Mar 2005 | B2 |
6860877 | Sanchez et al. | Mar 2005 | B1 |
6866671 | Tierney et al. | Mar 2005 | B2 |
6870343 | Borenstein et al. | Mar 2005 | B2 |
6871117 | Wang et al. | Mar 2005 | B2 |
6871563 | Choset et al. | Mar 2005 | B2 |
6879880 | Nowlin et al. | Apr 2005 | B2 |
6892112 | Wang et al. | May 2005 | B2 |
6899705 | Niemeyer | May 2005 | B2 |
6902560 | Morley et al. | Jun 2005 | B1 |
6905460 | Wang et al. | Jun 2005 | B2 |
6905491 | Wang et al. | Jun 2005 | B1 |
6911916 | Wang et al. | Jun 2005 | B1 |
6917176 | Schempf et al. | Jul 2005 | B2 |
6933695 | Blumenkranz | Aug 2005 | B2 |
6936001 | Snow | Aug 2005 | B1 |
6936003 | Iddan | Aug 2005 | B2 |
6936042 | Wallace et al. | Aug 2005 | B2 |
6943663 | Wang et al. | Sep 2005 | B2 |
6949096 | Davison et al. | Sep 2005 | B2 |
6951535 | Ghodoussi et al. | Oct 2005 | B2 |
6965812 | Wang et al. | Nov 2005 | B2 |
6974411 | Belson | Dec 2005 | B2 |
6974449 | Niemeyer | Dec 2005 | B2 |
6979423 | Moll | Dec 2005 | B2 |
6984203 | Tartaglia et al. | Jan 2006 | B2 |
6984205 | Gazdzinski | Jan 2006 | B2 |
6991627 | Madhani et al. | Jan 2006 | B2 |
6993413 | Sunaoshi | Jan 2006 | B2 |
6994703 | Wang et al. | Feb 2006 | B2 |
6994708 | Manzo | Feb 2006 | B2 |
6997908 | Carrillo, Jr. et al. | Feb 2006 | B2 |
7025064 | Wang et al. | Apr 2006 | B2 |
7027892 | Wang et al. | Apr 2006 | B2 |
7033344 | Imran | Apr 2006 | B2 |
7039453 | Mullick | May 2006 | B2 |
7042184 | Oleynikov et al. | May 2006 | B2 |
7048745 | Tierney et al. | May 2006 | B2 |
7053752 | Wang et al. | May 2006 | B2 |
7063682 | Whayne et al. | Jun 2006 | B1 |
7066879 | Fowler et al. | Jun 2006 | B2 |
7066926 | Wallace et al. | Jun 2006 | B2 |
7074179 | Wang et al. | Jul 2006 | B2 |
7077446 | Kameda et al. | Jul 2006 | B2 |
7083571 | Wang et al. | Aug 2006 | B2 |
7083615 | Peterson et al. | Aug 2006 | B2 |
7087049 | Nowlin et al. | Aug 2006 | B2 |
7090683 | Brock et al. | Aug 2006 | B2 |
7097640 | Wang et al. | Aug 2006 | B2 |
7105000 | McBrayer | Sep 2006 | B2 |
7107090 | Salisbury, Jr. et al. | Sep 2006 | B2 |
7109678 | Kraus et al. | Sep 2006 | B2 |
7118582 | Wang et al. | Oct 2006 | B1 |
7121781 | Sanchez et al. | Oct 2006 | B2 |
7125403 | Julian et al. | Oct 2006 | B2 |
7126303 | Farritor et al. | Oct 2006 | B2 |
7147650 | Lee | Dec 2006 | B2 |
7155315 | Niemeyer et al. | Dec 2006 | B2 |
7169141 | Brock et al. | Jan 2007 | B2 |
7182025 | Ghorbel et al. | Feb 2007 | B2 |
7182089 | Ries | Feb 2007 | B2 |
7199545 | Oleynikov et al. | Apr 2007 | B2 |
7206626 | Quaid, III | Apr 2007 | B2 |
7206627 | Abovitz et al. | Apr 2007 | B2 |
7210364 | Ghorbel et al. | May 2007 | B2 |
7214230 | Brock et al. | May 2007 | B2 |
7217240 | Snow | May 2007 | B2 |
7239940 | Wang et al. | Jul 2007 | B2 |
7250028 | Julian et al. | Jul 2007 | B2 |
7259652 | Wang et al. | Aug 2007 | B2 |
7273488 | Nakamura et al. | Sep 2007 | B2 |
7338513 | Lee et al. | Mar 2008 | B2 |
7492116 | Oleynikov et al. | Feb 2009 | B2 |
7574250 | Niemeyer | Aug 2009 | B2 |
7615067 | Lee et al. | Nov 2009 | B2 |
7637905 | Saadat et al. | Dec 2009 | B2 |
8177794 | Cabrera et al. | May 2012 | B2 |
8292905 | Taylor et al. | Oct 2012 | B2 |
8292906 | Taylor et al. | Oct 2012 | B2 |
8372090 | Wingardner et al. | Feb 2013 | B2 |
20010018591 | Brock et al. | Aug 2001 | A1 |
20010049497 | Kalloo et al. | Dec 2001 | A1 |
20020003173 | Bauer et al. | Jan 2002 | A1 |
20020026186 | Woloszko et al. | Feb 2002 | A1 |
20020065507 | Azizi | May 2002 | A1 |
20020091374 | Cooper | Jul 2002 | A1 |
20020103417 | Gazdzinski | Aug 2002 | A1 |
20020111535 | Kim et al. | Aug 2002 | A1 |
20020120254 | Julian et al. | Aug 2002 | A1 |
20020140392 | Borenstein et al. | Oct 2002 | A1 |
20020147487 | Sundquist et al. | Oct 2002 | A1 |
20020151906 | Demarais et al. | Oct 2002 | A1 |
20020156347 | Kim et al. | Oct 2002 | A1 |
20020171385 | Kim et al. | Nov 2002 | A1 |
20020173700 | Kim et al. | Nov 2002 | A1 |
20020190682 | Schempf et al. | Dec 2002 | A1 |
20030020810 | Takizawa et al. | Jan 2003 | A1 |
20030045888 | Brock et al. | Mar 2003 | A1 |
20030065250 | Chiel et al. | Apr 2003 | A1 |
20030089267 | Ghorbel et al. | May 2003 | A1 |
20030092964 | Kim et al. | May 2003 | A1 |
20030097129 | Davison et al. | May 2003 | A1 |
20030100817 | Wang et al. | May 2003 | A1 |
20030114731 | Cadeddu et al. | Jun 2003 | A1 |
20030139742 | Wampler et al. | Jul 2003 | A1 |
20030144656 | Ocel et al. | Jul 2003 | A1 |
20030167000 | Mullick | Sep 2003 | A1 |
20030172871 | Scherer | Sep 2003 | A1 |
20030179308 | Zamorano et al. | Sep 2003 | A1 |
20030181788 | Yokoi et al. | Sep 2003 | A1 |
20030229268 | Uchiyama et al. | Dec 2003 | A1 |
20030230372 | Schmidt | Dec 2003 | A1 |
20040024311 | Quaid | Feb 2004 | A1 |
20040034282 | Quaid | Feb 2004 | A1 |
20040034283 | Quaid | Feb 2004 | A1 |
20040034302 | Abovitz et al. | Feb 2004 | A1 |
20040050394 | Jin | Mar 2004 | A1 |
20040070822 | Shioda et al. | Apr 2004 | A1 |
20040099175 | Perrot et al. | May 2004 | A1 |
20040106916 | Quaid et al. | Jun 2004 | A1 |
20040111113 | Nakamura et al. | Jun 2004 | A1 |
20040138552 | Harel et al. | Jul 2004 | A1 |
20040140786 | Borenstein | Jul 2004 | A1 |
20040153057 | Davison | Aug 2004 | A1 |
20040173116 | Ghorbel et al. | Sep 2004 | A1 |
20040176664 | Iddan | Sep 2004 | A1 |
20040215331 | Chew et al. | Oct 2004 | A1 |
20040225229 | Viola | Nov 2004 | A1 |
20040254680 | Sunaoshi | Dec 2004 | A1 |
20040267326 | Ocel et al. | Dec 2004 | A1 |
20050014994 | Fowler et al. | Jan 2005 | A1 |
20050029978 | Oleynikov et al. | Feb 2005 | A1 |
20050043583 | Killmann et al. | Feb 2005 | A1 |
20050049462 | Kanazawa | Mar 2005 | A1 |
20050054901 | Yoshino | Mar 2005 | A1 |
20050054902 | Konno | Mar 2005 | A1 |
20050064378 | Toly | Mar 2005 | A1 |
20050065400 | Banik et al. | Mar 2005 | A1 |
20050083460 | Hattori et al. | Apr 2005 | A1 |
20050096502 | Khalili | May 2005 | A1 |
20050143644 | Gilad et al. | Jun 2005 | A1 |
20050154376 | Riviere et al. | Jul 2005 | A1 |
20050165449 | Cadeddu et al. | Jul 2005 | A1 |
20050288555 | Binmoeller | Dec 2005 | A1 |
20050288665 | Woloszko | Dec 2005 | A1 |
20060020272 | Gildenberg | Jan 2006 | A1 |
20060046226 | Bergler et al. | Mar 2006 | A1 |
20060119304 | Farritor et al. | Jun 2006 | A1 |
20060149135 | Paz | Jul 2006 | A1 |
20060152591 | Lin | Jul 2006 | A1 |
20060155263 | Lipow | Jul 2006 | A1 |
20060195015 | Mullick et al. | Aug 2006 | A1 |
20060196301 | Oleynikov et al. | Sep 2006 | A1 |
20060198619 | Oleynikov et al. | Sep 2006 | A1 |
20060241570 | Wilk | Oct 2006 | A1 |
20060241732 | Denker et al. | Oct 2006 | A1 |
20060253109 | Chu | Nov 2006 | A1 |
20060258954 | Timberlake et al. | Nov 2006 | A1 |
20070032701 | Fowler et al. | Feb 2007 | A1 |
20070043397 | Ocel et al. | Feb 2007 | A1 |
20070055342 | Wu et al. | Mar 2007 | A1 |
20070080658 | Farritor et al. | Apr 2007 | A1 |
20070106113 | Ravo | May 2007 | A1 |
20070123748 | Meglan | May 2007 | A1 |
20070142725 | Hardin et al. | Jun 2007 | A1 |
20070156019 | Larkin et al. | Jul 2007 | A1 |
20070156211 | Ferren et al. | Jul 2007 | A1 |
20070167955 | De La Menardiere et al. | Jul 2007 | A1 |
20070225633 | Ferren et al. | Sep 2007 | A1 |
20070225634 | Ferren et al. | Sep 2007 | A1 |
20070241714 | Oleynikov et al. | Oct 2007 | A1 |
20070244520 | Ferren et al. | Oct 2007 | A1 |
20070250064 | Darois et al. | Oct 2007 | A1 |
20070255273 | Fernandez et al. | Nov 2007 | A1 |
20100010512 | Taylor et al. | Jan 2010 | A1 |
20100030028 | Cabrera et al. | Feb 2010 | A1 |
20100030238 | Viola et al. | Feb 2010 | A1 |
20100076460 | Taylor et al. | Mar 2010 | A1 |
20100076461 | Viola et al. | Mar 2010 | A1 |
20100094083 | Taylor et al. | Apr 2010 | A1 |
20100217282 | Cabrera et al. | Aug 2010 | A1 |
20100274265 | Wingardner et al. | Oct 2010 | A1 |
20100318059 | Farritor et al. | Dec 2010 | A1 |
20110224605 | Farritor et al. | Sep 2011 | A1 |
20120277769 | Cabrera et al. | Nov 2012 | A1 |
20130035703 | Taylor et al. | Feb 2013 | A1 |
Number | Date | Country |
---|---|---|
2004144533 | May 1990 | JP |
5115425 | May 1993 | JP |
07 136173 | May 1995 | JP |
7306155 | Nov 1995 | JP |
2009508049 | Sep 1999 | JP |
2004322310 | Jun 2004 | JP |
2004180781 | Jul 2004 | JP |
2004329292 | Nov 2004 | JP |
WO 9221291 | Dec 1992 | WO |
WO 02082979 | Oct 2002 | WO |
WO 02100256 | Dec 2002 | WO |
WO 2005009211 | Feb 2005 | WO |
WO 2006079108 | Jan 2006 | WO |
WO 2006052927 | May 2006 | WO |
WO 2007111571 | Oct 2007 | WO |
Entry |
---|
Patronik et al., “Development of a Tethered Epicardial Crawler for Minimally Invasive Cardiac Therapies,” IEEE, pp. 239-240. |
Patronik et al., “Crawling on the Heart: A Mobile Robotic Device for Minimally Invasive Cardiac Interventions,” MICCAI, 2004, pp. 9-16. |
Patronik et al., “Preliminary evaluation of a mobile robotic device for navigation and intervention on the beating heart,” Computer Aided Surgery, 10(4): 225-232, Jul. 2005. |
Peirs et al., “A miniature manipulator for integration in a self-propelling endoscope,” Sensors and Actuators A, 2001, 92: 343-349. |
Peters, “Minimally Invasive Colectomy: Are the Potential Benefits Realized?” Dis Colon Rectum 1993; 36: 751-756. |
Phee et al., “Analysis and Development of Locomotion Devices for the Gastrointestinal Tract,” IEEE Transaction on Biomedical Engineering, vol. 49, No. 6, Jun. 2002, pp. 613-616. |
Phee et al., “Development of Microrobotic Devices for Locomotion in the Human Gastrointestinal Tract,” International Conference on Computational Intelligence, Robotics and Autonomous Systems (CIRAS 2001), Nov. 28-30, (2001), Singapore. |
Platt et al., "In Vivo Robotic Cameras can Enhance Imaging Capability During Laparoscopic Surgery," in the Proceedings of the Society of American Gastrointestinal Endoscopic Surgeons (SAGES) Scientific Conference, Ft. Lauderdale, FL, Apr. 13-16, 2005, 1 pg. |
Preliminary Amendment filed Apr. 11, 2007, in related U.S. Appl. No. 11/403,756, 7 pp. |
RCE and Amendment filed Jun. 13, 2007, in related U.S. Appl. No. 11/403,756, 8 pp. |
Tendick et al. (1993), “Sensing and Manipulation Problems in Endoscopic Surgery: Experiment, Analysis, and Observation,” Presence 2(1): 66-81. |
Rentschler et al., "Mobile In Vivo Biopsy and Camera Robot," Studies in Health Technology and Informatics—Medicine Meets Virtual Reality, vol. 119, pp. 449-454, IOS Press, Long Beach, CA, 2006. |
Rentschler et al., Mobile In Vivo Biopsy Robot, IEEE International Conference on Robotics and Automation, Orlando, Florida, May 2006, pp. 4155-4160. |
Rentschler et al., “Miniature in vivo Robots for Remote and Harsh Environments,” IEEE Transactions on Information Technology in Biomedicine, Jan. 2006; 12(1): 66-75. |
Rentschler et al., “An In Vivo Mobile Robot for Surgical Vision and Task Assistance,” Journal of Medical Devices, Mar. 2007, vol. 1: 23-29. |
Rentschler et al., “In vivo Mobile Surgical Robotic Task Assistance,” 1 pg. |
Rentschler et al., "In vivo Robotics during the NEEMO 9 Mission," Medicine Meets Virtual Reality, Feb. 2007, 1 pg. |
Rentschler et al., "In Vivo Robots for Laparoscopic Surgery," Studies in Health Technology and Informatics—Medicine Meets Virtual Reality, IOS Press, Newport Beach, CA, 2004a, 98: 316-322. |
Rentschler et al., "Mechanical Design of Robotic in Vivo Wheeled Mobility," ASME Journal of Mechanical Design, 2006a, pp. 1-11. |
Rentschler et al., "Mobile In Vivo Camera Robots Provide Sole Visual Feedback for Abdominal Exploration and Cholecystectomy," Journal of Surgical Endoscopy, 20-1: 135-138, 2006b. |
Rentschler et al., “Mobile In Vivo Robots Can Assist in Abdominal Exploration,” from the Proceedings of the Society of American Gastrointestinal Endoscopic Surgeons (SAGES) Scientific Conference, Ft. Lauderdale, FL, Apr. 13-16, 2005b. |
Rentschler et al., “Modeling, Analysis, and Experimental Study of In Vivo Wheeled Robotic Mobility,” IEEE Transactions on Robotics, 22 (2): 308-321, 2005c. |
Rentschler et al., “Natural Orifice Surgery with an Endoluminal Mobile Robot,” The Society of American Gastrointestinal Endoscopic Surgeons, Dallas, TX, Apr. 2006d, 14 pp. |
Rentschler et al., “Theoretical and Experimental Analysis of In Vivo Wheeled Mobility,” ASME Design Engineering Technical Conferences: 28th Biennial Mechanisms and Robotics Conference, Salt Lake City, Utah, Sep. 28-Oct. 2, 2004, pp. 1-9. |
Rentschler et al., "Toward In Vivo Mobility," Studies in Health Technology and Informatics—Medicine Meets Virtual Reality, IOS Press, Long Beach, CA, 2005a, 111: 397-403. |
Taylor et al., “A Telerobotic Assistant for Laparoscopic Surgery,” IEEE Eng Med Biol, 1995; 279-287. |
Riviere et al., “Toward Active Tremor Canceling in Handheld Microsurgical Instruments,” IEEE Transactions on Robotics and Automation, Oct. 2003, 19(5): 793-800. |
Rosen et al., “Force Controlled and Teleoperated Endoscopic, Grasper for Minimally Invasive Surgery-Experimental Performance Evaluation,” IEEE Transactions of Biomedical Engineering, Oct. 1999; 46(10): 1212-1221. |
Rosen et al., “Objective Laparoscopic Skills Assessments of Surgical Residents Using Hidden Markov Models Based on Haptic Information and Tool/Tissue Interactions,” Studies in Health Technology and Informatics—Medicine Meets Virtual Reality, Jan. 2001, 7 pp. |
Rosen et al., “Spherical Mechanism Analysis of a Surgical Robot for Minimally Invasive Surgery—Analytical and Experimental Approaches,” Studies in Health Technology and Informatics—Medicine Meets Virtual Reality, pp. 442-448, Jan. 2005. |
Rosen et al., “Task Decomposition of Laparoscopic Surgery for Objective Evaluation of Surgical Residents' Learning Curve Using Hidden Markov Model,” Computer Aided Surgery, vol. 7, pp. 49-61, 2002. |
Rosen et al., “The Blue Dragon—A System of Measuring the Kinematics and the Dynamics of Minimally Invasive Surgical Tools In-Vivo,” Proc. of the 2002 IEEE International Conference on Robotics and Automation, Washington, DC, pp. 1876-1881, May 2002. |
Ruurda et al., “Robot-Assisted surgical systems: a new era in laparoscopic surgery,” Ann R. Coll Surg Engl., 2002; 84: 223-226. |
Ruurda et al., “Feasibility of Robot-Assisted Laparoscopic Surgery,” Surgical Laparoscopy, Endoscopy & Percutaneous Techniques, 2002; 12(1):41-45. |
Sackier et al., “Robotically assisted laparoscopic surgery,” Surgical Endoscopy, 1994; 8: 63-66. |
Salky, “What is the Penetration of Endoscopic Techniques into Surgical Practice?” Digestive Surgery, 2000; 17:422-426. |
Satava, “Surgical Robotics: The Early Chronicles,” Surgical Laparoscopy, Endoscopy & Percutaneous Techniques, 2002; 12(1): 6-16. |
Schippers et al., (1996) “Requirements and Possibilities of Computer-Assisted Endoscopic Surgery,” In: Computer Integrated Surgery: Technology and Clinical Applications, pp. 561-565. |
Schurr et al., “Robotics and Telemanipulation Technologies for Endoscopic Surgery,” Surgical Endoscopy, 2000; 14: 375-381. |
Schwartz, “In the Lab: Robots that Slink and Squirm,” The New York Times, Mar. 27, 2007, 4 pp. |
Sharp LL-151-3D, http://www.sharp3d.com, 2006, 2 pp. |
Slatkin et al., “The Development of a Robotic Endoscope,” Proceedings of the 1995 IEEE International Conference on Robotics and Automation, pp. 162-171, 1995. |
Smart Pill “Fantastic Voyage: Smart Pill to Expand Testing,” http://www.smartpilldiagnostics.com, Apr. 13, 2005, 1 pg. |
Southern Surgeons Club (1991), "A prospective analysis of 1518 laparoscopic cholecystectomies," N. Engl. J. Med. 324(16): 1073-1078, 1991. |
Stefanini et al., “Modeling and Experiments on a Legged Microrobot Locomoting in a Tubular Compliant and Slippery Environment,” Int. Journal of Robotics Research, vol. 25, No. 5-6, pp. 551-560, May-Jun. 2006. |
Stiff et al., “Long-term Pain: Less Common After Laparoscopic than Open Cholecystectomy,” British Journal of Surgery, 1994; 81: 1368-1370. |
Strong, et al., "Efficacy of Novel Robotic Camera vs. a Standard Laparoscopic Camera," Surgical Innovation vol. 12, No. 4, Dec. 2005, Westminster Publications, Inc., pp. 315-318. |
Suzumori et al., “Development of Flexible Microactuator and its Applications to Robotics Mechanisms,” Proceedings of the IEEE International Conference on Robotics and Automation, 1991: 1622-1627, 1991. |
Abbott et al., “Design of an Endoluminal NOTES Robotic System,” from the Proceedings of the 2007 IEEE/RSJ Int'l Conf. on Intelligent Robot Systems, San Diego, CA, Oct. 29-Nov. 2, 2007, pp. 410-416. |
Allendorf et al., “Postoperative Immune Function Varies Inversely with the Degree of Surgical Trauma in a Murine Model,” Surgical Endoscopy 1997; 11:427-430. |
Ang, "Active Tremor Compensation in Handheld Instrument for Microsurgery," Doctoral Dissertation, tech report CMU-RI-TR-04-28, Robotics Institute, Carnegie Mellon University, May 2004, 167 pp. |
Applicant Response to Office Action dated Apr. 17, 2007, in related U.S. Appl. No. 11/552,379, filed Aug. 8, 2007, 7 pp. |
Applicant Response to Office Action dated Aug. 18, 2006, in related U.S. Appl. No. 11/398,174, filed Nov. 7, 2006, 8 pp. |
Applicant Response to Office Action dated Aug. 21, 2006, in related U.S. Appl. No. 11/403,756, filed Nov. 21, 2006, 52 pp. |
Applicant Response to Office Action dated Oct. 29, 2007, in related U.S. Appl. No. 11/695,944, filed Jan. 22, 2008, 6 pp. |
Atmel 80C5X2 Core, http://www.atmel.com, 2006, 186 pp. |
Bailey et al., "Complications of Laparoscopic Surgery," Quality Medical Publishers, Inc., 1995, 25 pp. |
Ballantyne, “Robotic Surgery, Telerobotic Surgery, Telepresence, and Telementoring,” Surgical Endoscopy, 2002; 16: 1389-1402. |
Bauer et al., “Case Report: Remote Percutaneous Renal Percutaneous Renal Access Using a New Automated Telesurgical Robotic System,” Telemedicine Journal and e-Health 2001; (4): 341-347. |
Begos et al., “Laparoscopic Cholecystectomy: From Gimmick to Gold Standard,” J Clin Gastroenterol, 1994; 19(4): 325-330. |
Berg et al., “Surgery with Cooperative Robots,” Medicine Meets Virtual Reality, Feb. 2007, 1 pg. |
Breda et al., “Future developments and perspectives in laparoscopy,” Eur. Urology 2001; 40(1): 84-91. |
Breedveld et al., “Design of Steerable Endoscopes to Improve the Visual Perception of Depth During Laparoscopic Surgery,” ASME, Jan. 2004; vol. 126, pp. 1-5. |
Breedveld et al., “Locomotion through the Intestine by means of Rolling Stents,” Proceedings of the ASME Design Engineering Technical Conferences, 2004, pp. 1-7. |
Calafiore et al., "Multiple Arterial Conduits Without Cardiopulmonary Bypass: Early Angiographic Results," Ann Thorac Surg, 1999; 67: 450-456. |
Camarillo et al., "Robotic Technology in Surgery: Past, Present and Future," The American Journal of Surgery, 2004; 188: 2S-15S. |
Cavusoglu et al., "Telesurgery and Surgical Simulation: Haptic Interfaces to Real and Virtual Surgical Environments," In McLaughlin, M.L., Hespanha, J.P., and Sukhatme, G., editors, Touch in virtual environments, IMSC Series in Multimedia 2001, 28 pp. |
Cavusoglu et al., “Robotics for Telesurgery: Second Generation Berkeley/UCSF Laparoscopic Telesurgical Workstation and Looking Towards the Future Applications,” Industrial Robot: An International Journal, 2003; 30(1): 22-29. |
Chanthasopeephan et al., (2003), "Measuring Forces in Liver Cutting: New Equipment and Experimental Results," Annals of Biomedical Engineering 31: 1372-1382, 2003. |
Choi et al., "Flexure-based Manipulator for Active Handheld Microsurgical Instrument," Proceedings of the 27th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBS), Sep. 2005, 4 pp. |
Cuschieri, “Technology for Minimal Access Surgery,” BMJ, 1999, 319: 1-6. |
Dakin et al., “Comparison of laparoscopic skills performance between standard instruments and two surgical robotic systems,” Surg Endosc., 2003; 17: 574-579. |
Dumpert et al., "Improving in Vivo Robot Vision Quality," from the Proceedings of Medicine Meets Virtual Reality, Long Beach, CA, Jan. 26-29, 2005, 1 pg. |
Dumpert et al., “Stereoscopic in Vivo Surgical Robots,” IEEE Sensors Special Issue on In Vivo Sensors for Medicine, Jan. 2007, 10 pp. |
Grady, “Doctors Try New Surgery for Gallbladder Removal,” The New York Times, Apr. 20, 2007, 3 pp. |
Guber et al., "Miniaturized Instrument Systems for Minimally Invasive Diagnosis and Therapy," Biomedizinische Technik, 2002, Band 47, Ergänzungsband 1. |
Examiner Interview Summary dated Nov. 30, 2006, in related U.S. Appl. No. 11/398,174, 2pp. |
Falcone et al., “Robotic Surgery,” Clin. Obstet. Gynecol. 2003, 46(1): 37-43. |
Faraz et al., "Engineering Approaches to Mechanical and Robotic Design for Minimally Invasive Surgery (MIS)," Kluwer Academic Publishers (Boston), 2000, 13 pp. |
Fearing et al., “Wing Transmission for a Micromechanical Flying Insect,” Proceedings of the 2000 IEEE International Conference to Robotics & Automation, Apr. 2000; 1509-1516. |
Fireman et al., "Diagnosing small bowel Crohn's disease with wireless capsule endoscopy," Gut 2003; 52: 390-392. |
Flynn et al., “Tomorrow's Surgery: micromotors and microbots for minimally invasive procedures,” Minimally Invasive Surgery & Allied Technologies. |
Franklin et al., “Prospective Comparison of Open vs. Laparoscopic Colon Surgery for Carcinoma: Five-Year Results,” Dis Colon Rectum, 1996; 39: S35-S46. |
Franzino, “The Laprotek Surgical System and the Next Generation of Robotics,” Surg Clin North Am, 2003 83(6). |
Fraulob et al., “Miniature assistance module for robot-assisted heart surgery,” Biomed. Tech. 2002, 47 Suppl. 1, Pt. 1: 12-15. |
Fukuda et al., “Mechanism and Swimming Experiment of Micro Mobile Robot in Water,” Proceedings of the 1994 IEEE International Conference on Robotics and Automation, 1994: 814-819. |
Fukuda et al., “Micro Active Catheter System with Multi Degrees of Freedom,” Proceedings of the IEEE International Conference on Robotics and Automation, May 1994, pp. 2290-2295. |
Fuller et al., "Laparoscopic Trocar Injuries: A Report from a U.S. Food and Drug Administration (FDA) Center for Devices and Radiological Health (CDRH) Systematic Technology Assessment of Medical Products (STAMP) Committee," U.S. Food and Drug Administration, available at http://www.fda.gov, Finalized: Nov. 7, 2003; Updated: Jun. 24, 2005, 11 pp. |
Tendick et al., “Applications of Micromechatronics in Minimally Invasive Surgery,” IEEE/ASME Transactions on Mechatronics, 1998; 3(1): 34-42. |
Thomann et al., “The Design of a new type of Micro Robot for the Intestinal Inspection,” Proceedings of the 2002 IEEE Intl. Conference on Intelligent Robots and Systems, Oct. 2002: 1385-1390. |
U.S. Appl. No. 60/180,960, filed Feb. 2000. |
U.S. Appl. No. 60/956,032, filed Aug. 15, 2007. |
U.S. Appl. No. 60/983,445, filed Oct. 29, 2007. |
U.S. Appl. No. 60/990,062, filed Nov. 26, 2007. |
U.S. Appl. No. 60/990,076, filed Nov. 26, 2007. |
U.S. Appl. No. 60/990,086, filed Nov. 26, 2007. |
U.S. Appl. No. 60/990,106, filed Nov. 26, 2007. |
U.S. Appl. No. 60/990,470, filed Nov. 27, 2007. |
Way et al., (editors), “Fundamentals of Laparoscopic Surgery,” Churchill Livingstone Inc., 1995, 14 pp. |
Wolfe et al., “Endoscopic Cholecystectomy: An analysis of Complications,” Arch. Surg. Oct. 1991; 126: 1192-1196. |
Worn et al., "Esprit Project No. 33915: Miniaturised Robot for Micro Manipulation (MINIMAN)", Nov. 1998; http://www.ipr.ira.ujka.de/-microbot/miniman. |
Yu et al., “Microrobotic Cell Injection,” Proceedings of the 2001 IEEE International Conference on Robotics and Automation, May 2001; 620-625. |
Yu, BSN, RN, "M2A™ Capsule Endoscopy A Breakthrough Diagnostic Tool for Small Intestine Imaging," vol. 25, No. 1, Gastroenterology Nursing, pp. 24-27. |
Abbou et al., “Laparoscopic Radical Prostatectomy with a Remote Controlled Robot,” The Journal of Urology, Jun. 2001, 165: 1964-1966. |
Glukhovsky et al., "The development and application of wireless capsule endoscopy," Int. J. Med. Robot. Comput. Assist. Surgery, 2004; 1(1): 114-123. |
Gong et al., "Wireless endoscopy," Gastrointestinal Endoscopy 2000; 51(6): 725-729. |
Hanly et al., “Value of the SAGES Learning Center in introducing new technology,” Surgical Endoscopy, 2004; 19 (4): 477-483. |
Hanly et al., "Robotic Abdominal Surgery," The American Journal of Surgery 188 (Suppl. to Oct. 2004): 19S-26S, 2004. |
Ishiyama et al., “Spiral-type Micro-machine for Medical Applications,” 2000 International Symposium on Micromechatronics and Human Science, 2000: 65-69. |
Jagannath et al., “Peroral transgastric endoscopic ligation of fallopian tubes with long-term survival in a porcine model,” Gastrointestinal Endoscopy, 2005; 61(3): 449-453. |
Kalloo et al., “Flexible transgastric peritoneoscopy: a novel approach to diagnostic and therapeutic interventions in the peritoneal cavity,” Gastrointestinal Endoscopy, 2004; 60(1): 114-117. |
Kang et al., “Robotic Assistants Aid Surgeons During Minimally Invasive Procedures,” IEEE Engineering in Medicine and Biology, Jan.-Feb. 2001; pp. 94-104. |
Kantsevoy et al., “Endoscopic gastrojejunostomy with survival in a porcine model,” Gastrointestinal Endoscopy, 2005; 62(2): 287-292. |
Kantsevoy et al., “Transgastric endoscopic splenectomy,” Surgical Endoscopy, 2006; 20: 522-525. |
Kazemier et al. (1998), "Vascular Injuries During Laparoscopy," J. Am. Coll. Surg. 186(5): 604-5, 1998. |
Kim, “Early Experience with Telemanipulative Robot-Assisted Laparoscopic Cholecystectomy Using da Vinci,” Surgical Laparoscopy, Endoscopy & Percutaneous Techniques, 2002; 12(1):33-40. |
Ko et al., “Per-Oral transgastric abdominal surgery,” Chinese Journal of Digestive Diseases, 2006; 7: 67-70. |
Lafullarde et al., “Laparoscopic Nissen Fundoplication: Five-year Results and Beyond,” Arch/Surg, Feb. 2001; 136:180-184. |
Leggett et al. (2002), “Aortic injury during laparoscopic fundoplication,” Surg. Endoscopy 16(2): 362, 2002. |
Li et al. (2000), “Microvascular Anastomoses Performed in Rats Using a Microsurgical Telemanipulator,” Comp. Aid. Surg. 5: 326-332, 2000. |
Liem et al., “Comparison of Conventional Anterior Surgery and Laparoscopic Surgery for Inguinal-hernia Repair,” New England Journal of Medicine, 1997; 336 (22): 1541-1547. |
MacFarlane et al., “Force-Feedback Grasper Helps Restore the Sense of Touch in Minimally Invasive Surgery,” Journal of Gastrointestinal Surgery, 1999; 3: 278-285. |
Mack et al., “Present Role of Thoracoscopy in the Diagnosis and Treatment of Diseases of the Chest,” Ann Thorac Surgery, 1992; 54: 403-409. |
Mack, “Minimally Invasive and Robotic Surgery,” JAMA, Feb. 2001; 285(5): 568-572. |
Mei et al., “Wireless Drive and Control of a Swimming Microrobot,” Proceedings of the 2002 IEEE International Conference on Robotics & Automation, May 2002: 1131-1136. |
Melvin et al., “Computer-Enhanced vs. Standard Laparoscopic Antireflux Surgery,” J Gastrointest Surg 2002; 6: 11-16. |
Menciassi et al., "Locomotion of a Legged Capsule in the Gastrointestinal Tract: Theoretical Study and Preliminary Technological Results," IEEE Int. Conf. on Engineering in Medicine and Biology, San Francisco, CA, pp. 2767-2770, Sep. 2004. |
Menciassi et al., "Robotic Solutions and Mechanisms for a Semi-Autonomous Endoscope," Proceedings of the 2002 IEEE/RSJ Intl. Conference on Intelligent Robots and Systems, Oct. 2002; 1379-1384. |
Menciassi et al., “Shape memory alloy clamping devices of a capsule for monitoring tasks in the gastrointestinal tract,” J. Micromech. Microeng, 2005, 15: 2045-2055. |
Meron, “The development of the swallowable video capsule (M2A),” Gastrointestinal Endoscopy 2000; 52 6: 817-819, 2000. |
Micron, http://www.micron.com, 2006, 1/4-inch VGA NTSC/PAL CMOS Digital Image Sensor, 98 pp. |
Midday Jeff et al., "Material Handling System for Robotic Natural Orifice Surgery," Proceedings of the 2011 Design of Medical Devices Conference, Apr. 12-14, 2011, Minneapolis, MN, 4 pp. |
Miller, Ph.D., et al., “In-Vivo Stereoscopic Imaging System with 5 Degrees-of-Freedom for Minimal Access Surgery,” Dept. of Computer Science and Dept. of Surgery, Columbia University, New York, NY, 7 pp. |
Munro (2002), "Laparoscopic access: complications, technologies, and techniques," Curr. Opin. Obstet. Gynecol., 14(4): 365-74, 2002. |
Nio et al., “Efficiency of manual vs robotical (Zeus) assisted laparoscopic surgery in the performance of standardized tasks,” Surg Endosc, 2002; 16: 412-415. |
Office Action dated Apr. 17, 2007, received in related U.S. Appl. No. 11/552,379, 5 pp. |
Office Action dated Aug. 18, 2006, received in related U.S. Appl. No. 11/398,174, 6 pp. |
Office Action dated Aug. 21, 2006, received in related U.S. Appl. No. 11/403,756, 6 pp. |
Office Action dated Oct. 29, 2007, received in related U.S. Appl. No. 11/695,944, 6 pp. |
Oleynikov et al., “In Vivo Camera Robots Provide Improved Vision for Laparoscopic Surgery,” Computer Assisted Radiology and Surgery (CARS), Chicago, IL, Jun. 23-26, 2004b. |
Oleynikov et al., “In Vivo Robotic Laparoscopy,” Surgical Innovation, Jun. 2005, 12(2): 177-181. |
Oleynikov et al., “Miniature Robots Can Assist in Laparoscopic Cholecystectomy,” Journal of Surgical Endoscopy, 19-4: 473-476, 2005. |
O'Neill, “Surgeon takes new route to gallbladder,” The Oregonian, Jun. 2007, 2 pp. |
Orlando et al., (2003), “Needle and Trocar Injuries in Diagnostic Laparoscopy under Local Anesthesia: What Is the True Incidence of These Complications?” Journal of Laparoendoscopic & Advanced Surgical Techniques 13(3): 181-184, 2003. |
Park et al., “Trocar-less Instrumentation for Laparoscopy: Magnetic Positioning of Intra-abdominal Camera and Retractor,” Ann Surg, Mar. 2007; 245(3): 379-384. |
Park et al., “Experimental studies of transgastric gallbladder surgery: cholecystectomy and cholecystogastricanastomosis (videos),” Gastrointestinal Endoscopy, 2005; 61(4): 601-606, 2005. |
Number | Date | Country | |
---|---|---|---|
20130131694 A1 | May 2013 | US |
Number | Date | Country | |
---|---|---|---|
60868030 | Nov 2006 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13107272 | May 2011 | US |
Child | 13469839 | US | |
Parent | 12816909 | Jun 2010 | US |
Child | 13107272 | US | |
Parent | 11947097 | Nov 2007 | US |
Child | 12816909 | US | |
Parent | 11552379 | Oct 2006 | US |
Child | 11695944 | US | |
Parent | 11403756 | Apr 2006 | US |
Child | 11552379 | US | |
Parent | 11398174 | Apr 2006 | US |
Child | 11403756 | US | |
Parent | 11338166 | Jan 2006 | US |
Child | 11398174 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 11932441 | Oct 2007 | US |
Child | 11947097 | US | |
Parent | 11932516 | Oct 2007 | US |
Child | 11932441 | US | |
Parent | 11695944 | Apr 2007 | US |
Child | 11932516 | US | |
Parent | 10616096 | Jul 2003 | US |
Child | 11338166 | US |