Robotic surgical control system

Information

  • Patent Grant
  • Patent Number
    11,173,000
  • Date Filed
    Friday, January 11, 2019
  • Date Issued
    Tuesday, November 16, 2021
Abstract
The invention involves a system and method for controlling the movements of a multi-axis robot to perform a surgery at least on the spinal area of a human in vivo. The system includes controls and software coding to cause the robot to move in desired patterns to complete the surgery, which may include bone, disc and tissue removal, and may also include insertion of hardware for fusing adjacent bony structures.
Description
FIELD OF THE INVENTION

The invention relates to robotic surgical procedures and, more specifically, to a software system configuring a computer system that controls the movements of a multi-axial robot to perform an orthopedic surgical procedure.


BACKGROUND INFORMATION

The performance of an orthopedic surgical procedure on the spine with the assistance of a robot is known in the art. However, the robotically performed procedures known in the art constitute merely small portions of, or small steps in, the overall completion of the surgery. Thus, the known control systems configured by software for controlling the movements of the robot are deficient for complex and delicate procedures such as bone removal, disc removal and the like.


Thus, the present invention provides a software configured control system which overcomes the disadvantages of prior art robot surgical systems. The robotic surgical control system of the present invention not only provides for relative ease in the setup of the procedure and control over the robot during the actual operation, but also provides warnings and monitoring of the procedure to detect, warn of, and prevent injury to a patient undergoing the procedure.


SUMMARY OF THE INVENTION

Briefly, the invention involves a system and method for controlling the movements of a multi-axis robot to perform a surgery at least on the spinal area of a human in vivo. The system includes controls and software coding to cause the robot to move in desired patterns to complete a surgery, which may include bone, disc and tissue removal, and may also include insertion of hardware for fusing adjacent bony structures.


Accordingly, it is an objective of the present invention to provide a system for controlling a multi-axis robot for performance of a surgical procedure.


It is a further objective of the present invention to provide software for controlling the movements of a multi-axis robot for performance of a surgical procedure.


It is yet a further objective of the present invention to provide a method for controlling a multi-axis robot for the purpose of performing a surgical procedure.


Other objectives and advantages of this invention will become apparent from the following description taken in conjunction with the accompanying drawings wherein are set forth, by way of illustration and example, certain embodiments of this invention. The drawings constitute a part of this specification, include exemplary embodiments of the present invention, and illustrate various objects and features thereof.





BRIEF DESCRIPTION OF THE FIGURES


FIG. 1 is a screen shot illustrating the opening screen of the present system;



FIG. 2 is a screen shot illustrating a login screen for the present system;



FIG. 3 is a screen shot illustrating a patient selection screen for the present system;



FIG. 4 is a screen shot illustrating a patient information screen for the present system;



FIG. 5 is a screen shot illustrating a procedure selection screen for the present system;



FIG. 6 is a screen shot illustrating a cut planning screen for the present system;



FIG. 7 is a screen shot illustrating a cutting tool path screen for the present system;



FIG. 8 is a screen shot illustrating a screw insertion planning screen for the present system;



FIG. 9 is a screen shot illustrating another screw insertion planning screen for the present system;



FIG. 10 is a screen shot illustrating a digital imaging and communications in medicine (DICOM) screen for the present system for viewing the patient's anatomy;



FIG. 11 is a screen shot illustrating a DICOM slice view screen for the present system;



FIG. 12 is a screen shot illustrating an operation preview screen for the present system;



FIG. 13 is a screen shot illustrating an operation preview screen in slice format for the present system;



FIG. 14 is a general flow diagram for the present system; and



FIG. 15 is a view of the robotic control system including one embodiment of the robot.







DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS

While the present invention is susceptible of embodiment in various forms, there is shown in the drawings and will hereinafter be described a presently preferred embodiment with the understanding that the present disclosure is to be considered an exemplification of the invention and is not intended to limit the invention to the specific embodiments illustrated.


Referring generally to FIGS. 1-15, a robotic surgical system 10 that includes a computer system 11 operably connected or interfaced with a multi-axis robot 12 for controlling the movements of components of the multi-axis robot 12 to perform a surgical procedure on a human patient 14 in vivo is illustrated. The computer system 11 includes a user interface 16, shown as two touchscreen displays 18. The primary visual display 20 is positioned above the work display 22, and may be provided with a horizontal tilt toward the operator 68 to facilitate easier tactile interaction. It is contemplated that these touchscreen displays have built-in multi-touch support for scrolling, zooming and rotating viewpoints and target objects, in addition to standard click-based selection functionality. The displays 20, 22 are operable to display one or more digital images from one or more image files in storage 83. The computer system 11 includes a computer 82 having storage 83 and a processor 84. The computer 82 is operably connected to the robot 12 wirelessly or by suitable wiring 85. The storage 83 can have a primary storage (commonly referred to as memory) and/or a secondary storage device that can be read by the processor 84 to transfer the digital images from the primary storage to the secondary storage for subsequent long-term storage as an image file, such as a JPEG file. Primary and secondary storage are herein referred to collectively as storage, and the term storage can include one or both of primary and secondary storage. It is to be understood that the image file can be transferred wirelessly between primary storage and long-term storage. It is also to be understood that long-term storage can be remote, for example, cloud storage or other remote storage.


While operational information can be input by a user such as a surgeon via the display 20, 22, it can also be input by other input devices such as a keyboard 87, joystick or mouse 86 and a touchscreen display 18. Information can be input by a user touching the screen of a display 20, 22 with a finger, stylus or other suitable implement. Based upon the commands input into the computer 82, the computer outputs electrical signals to the multi-axis robot 12 to cause the motors in the robot to move the component arms 89 of the robot 12 to a desired position at a desired speed. A tool holder 91 is also associated with the robot 12 for holding a suitable surgical tool 92. The motors (not shown) of the robot may be servos or any other type of motor suitable to provide feedback to the computer system 11, and thus the control system 10, to ensure that the various robotic arms 89 are moving as commanded by the control system. Thus, it is desirable that the robotic motors are provided with, but not limited to, encoders to provide the feedback signals to the control system 10. In this manner, the control system 10 can modify the electrical signals to the robot 12 to provide the desired level of control needed for surgical procedures. In addition, the robot 12 may be provided with torque sensors, G-force sensors, speed sensors, orientation sensors and the like to provide additional feedback data to the control system 10 to further enhance control of the robot's movements.
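The encoder-feedback behavior described above can be illustrated with a short sketch. This is not the patent's software; the function and parameter names (read_encoder, send_signal, the proportional gain kp) are hypothetical, and a real surgical controller would use a far more sophisticated, safety-certified control law.

```python
def correct_joint(commanded_deg, read_encoder, send_signal,
                  kp=0.5, tolerance_deg=0.01, max_steps=1000):
    """Drive one robot joint toward commanded_deg using encoder feedback.

    read_encoder() returns the measured joint angle; send_signal(u) applies
    a correction to the joint motor. The loop modifies the signal until the
    encoder-reported error is within tolerance, mirroring how the control
    system uses feedback to ensure the arms move as commanded.
    """
    for _ in range(max_steps):
        measured = read_encoder()            # feedback from the joint encoder
        error = commanded_deg - measured     # discrepancy to be corrected
        if abs(error) <= tolerance_deg:
            return measured                  # joint reached commanded position
        send_signal(kp * error)              # proportional correction signal
    raise RuntimeError("joint failed to converge; halt and warn the operator")
```

A torque or G-force sensor would feed the same loop as an additional safety check, halting motion when readings leave an expected envelope.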


In at least one embodiment, the multi-axis robot 12 is provided with a mobile base assembly 90. The mobile base assembly 90 allows the control system 10 to reposition the robot to a more desirable position for completing the desired operation based upon fiducial landmarks in the patient's anatomy, which may be marked to provide electrical, magnetically induced or other types of signals that allow the control system to position the robot. These movements may be based upon limitations in the robot's ability to move in a desired trajectory or the like. In at least one embodiment, the base assembly 90 is provided with Mecanum wheels 92, which allow movement of the robot in any direction without rotation of the wheels about a vertical axis. Like the robot axes, the Mecanum wheels should be provided with motors and/or sensors that provide feedback to the control system 10, which has the capability to monitor the movement, and alter the movement, in real time based upon discrepancies such as wheel slippage and the like.
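For illustration, the standard inverse-kinematics relation for a four-wheel Mecanum base can be sketched as below; the geometry parameters (wheel radius r, half-spacings lx and ly) are hypothetical and not taken from the patent.

```python
def mecanum_wheel_speeds(vx, vy, wz, r=0.05, lx=0.30, ly=0.25):
    """Wheel angular speeds (rad/s) for a commanded chassis twist.

    vx: forward speed (m/s), vy: sideways speed (m/s), wz: yaw rate (rad/s).
    Returns speeds for the (front-left, front-right, rear-left, rear-right)
    wheels. Because the rollers sit at 45 degrees, every combination of the
    three chassis motions maps to a unique set of wheel speeds, which is
    what lets the base translate in any direction without steering.
    """
    k = lx + ly
    fl = (vx - vy - k * wz) / r
    fr = (vx + vy + k * wz) / r
    rl = (vx + vy - k * wz) / r
    rr = (vx - vy + k * wz) / r
    return fl, fr, rl, rr
```

Comparing the commanded speeds against wheel-encoder readings is one way the control system could detect the wheel slippage the paragraph mentions.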


Referring to FIGS. 1 and 2, the startup and login screens are illustrated. The startup screen 24 is initially viewed when the system is started. Once started, the login screen 26 becomes viewable. To log in, the user or operator 68 selects their name from a list of registered users and places their name in the user box. Thereafter, the operator 68 enters a numeric or other suitable password to open the system using the keypad 30 or keyboard 87. The data is entered into the computer using the login button 32, or cleared with the clear button 34. It should also be noted that other types of logins may be utilized without departing from the scope of the invention, including, but not limited to, retina scans, facial scans, fingerprint scans, RFID, and other suitable methods of determining the true identity of an operator 68.


Referring to FIG. 3, the patient selection screen is illustrated. To select a patient, the operator 68 may type in a name or scroll through a list of preloaded patient profiles. The patient's name is then entered into the patient selection box 38, and pressing the load patient button 39 selects the named patient. Once selected, the patient's imaging and medical data (FIG. 4) are available to the operator 68, and the system progresses to the next screen.


Referring to FIG. 4, the patient information screen is illustrated. In this screen, patient details 42 are viewed on the work display 22. Various surgical procedure types 46 are listed in the navigation toolbar 44 for selection by the operator 68.


Referring to FIG. 5, the operation builder screen 48 is illustrated. Each surgical procedure 46 is provided with picture icons 50. Selecting the desired procedure and pressing the add procedure button 52 appends the steps associated with the selected procedure to the sequence of operations in the Operation Procedure List 54. The selections made on this screen determine the available options on the planner screens; see FIGS. 6 and 7. Pressing procedure types 46 on the navigation toolbar 44 (FIG. 4) allows the operator 68 to toggle to and from the operation builder screen 48 throughout operation of the system 10.


Referring to FIGS. 6 and 7, the cut planner screens 56, 58 are illustrated. In these screens 56, 58, the system 10 allows the operator 68 to select vertebrae 62 and edit cutter paths 60 for the procedure. A proposed surgical tool path 60 is input by the operator as described above on a screen on at least one of the displays 20, 22. The specific procedure to be edited is selected from the Operation Procedure List 64 on the left side of the screen. Each procedure can include operations on any number of vertebrae 62. Tool selection buttons 66 allow the user to select the type of tool to be used for cutting. Each tool selected has an associated tool offset length and diameter that automatically offset and establish the cutting depth of the tool relative to the tool path chosen and directed by the operator 68 for the procedure. All tools chosen are stored for tool path generation. The navigation toolbar 44 allows the operator to toggle into and out of this screen throughout the procedure as desired. In this manner, the operator 68 can change the surgical tool paths and tools used for the procedure as desired. The on-screen view of the vertebrae 62 can be rotated and enlarged to allow the surgical tool path 60 to be placed as desired by the operator 68. In this manner, the operator 68 can view the surgical tool path 60 from various angles, and can zoom in on specific features that are to remain or be cut away.
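The role of the stored tool offsets can be sketched as follows. The data layout and names below are assumptions for illustration only; the patent specifies only that each tool's length and diameter are stored and used to offset the cutting depth relative to the operator's chosen path.

```python
from dataclasses import dataclass

@dataclass
class ToolOffset:
    name: str
    length_mm: float     # distance from holder reference to tool tip
    diameter_mm: float   # cutting diameter of the tool

def apply_tool_offset(path_points, tool):
    """Shift each operator-chosen path point along the approach axis so the
    tool tip, rather than the holder reference, lands on the path, and pair
    it with the cutting radius that sets the width of the cut.

    path_points: list of (x, y, z, approach_unit_vector) tuples.
    """
    adjusted = []
    for (x, y, z, (ax, ay, az)) in path_points:
        # retract the holder by the tool length along the approach direction
        holder = (x - tool.length_mm * ax,
                  y - tool.length_mm * ay,
                  z - tool.length_mm * az)
        adjusted.append((holder, tool.diameter_mm / 2.0))
    return adjusted
```

Storing offsets per tool, as the text describes, means the operator can swap tools mid-plan and the generated tool paths adjust automatically.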


Referring to FIGS. 8 and 9, the screw insertion planning screens 70, 72 are illustrated. In these screens, the system 10 allows the operator 68 to edit a specific procedure as selected from the Operation Procedure List 54 (FIG. 5) on the right side of the screen. This screen allows the operator 68 to select a desired vertebra 62 for examination, rotation and positioning, to allow an axis of insertion 76 to be oriented in the vertebra. Each procedure can include operations on any number of vertebrae, and may include the insertion of pedicle screws 74. Pedicle screws 74 typically include a threaded shank 78 and a tulip portion 80. The tulip portion 80 is typically formed to have two upright arms 82 for accepting a rod member (not shown) for connecting multiple screws together to allow fusion of the vertebra 62 to an adjacently positioned vertebra 62. The pedicle screw 74 is connected to a tube or non-tubular member (not shown) that immobilizes and orients the upright arms 82 of the tulip portion 80 while engaging the shank 78, so that the screw can be inserted along an axis 76 chosen by the operator 68, and to a desired depth in the vertebra 62, with the upright arms 82 oriented to allow a rod to be inserted across multiple pedicle screws 74. The tool 92 for inserting the screw, like the other tools 92 (often referred to as effectors), has a length and a diameter (which may change over its length) stored in the tool offsets, whereby the system knows the length and diameter of the tool for planning and insertion of the screw. This construction thereby provides warnings and notices to the operator 68 if there is interference with the chosen path, or if the screw will protrude from the bone in an undesirable manner.
In one embodiment, the control system 10 utilizes the volume of the vertebrae to predict break-out or collision in order to protect the patient from injury, and may also track nerve space, other vertebrae, other tools and sensitive tissue areas of the patient to avoid injury. It should also be noted that, while pedicle screws are illustrated herein, any type of bone screw may be inserted in a similar manner without departing from the scope of the invention.
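One way to realize the volume-based break-out prediction described above is to sample the planned insertion axis against the segmented vertebra volume. The sketch below is an assumption of how such a check might look; inside_bone stands in for a lookup into the segmented DICOM volume, and all names are illustrative.

```python
import numpy as np

def check_screw_path(entry, axis, screw_length_mm, inside_bone, step_mm=0.5):
    """Walk along the planned insertion axis and flag break-out.

    inside_bone(point) is a caller-supplied predicate (e.g. backed by the
    segmented vertebra volume) returning True while the point is in bone.
    Returns (ok, depth_of_first_breakout_or_None) so the planner can warn
    the operator before any robot motion occurs.
    """
    axis = np.asarray(axis, dtype=float)
    axis /= np.linalg.norm(axis)           # normalize the insertion direction
    entry = np.asarray(entry, dtype=float)
    n_steps = int(screw_length_mm / step_mm)
    for i in range(1, n_steps + 1):
        depth = i * step_mm
        point = entry + depth * axis
        if not inside_bone(point):
            return False, depth            # screw would protrude: warn operator
    return True, None
```

A fuller version would also sweep the screw radius and check the nerve-space and soft-tissue regions the embodiment tracks.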


Referring to FIGS. 10-14, the digital imaging and communications in medicine (DICOM) viewer screens 94, 96, as well as the operation overview screens 98, 100, are illustrated. In these screens, patient 3D simulated imaging in the DICOM format is loaded into the control system 10 to facilitate angled slice viewing. A 3D simulated image is often simply referred to as a 3D image, even though it is displayed on a two-dimensional screen. Tools 92 and surgical tool paths 60 for each selected tool can be overlaid upon the DICOM data to show precisely where the tools will enter and traverse according to the tool paths and tools that have been selected and set up for the operation (FIGS. 12 and 13). Electromagnetic or optical sensors may be utilized to align the DICOM data with the tool paths for robot operation. In this manner, the operator 68 can see precisely where cuts will be made and the depth to which the tools will travel in the anatomy of the patient. This allows a preliminary run-through of the surgery and modification if desired. In at least one embodiment, the control system 10 provides automatic segmentation and calibration of the DICOM data. In this embodiment, the control system 10 will automatically correlate previously taken DICOM data to the current patient on the table to facilitate live surgical tool path generation and tracking. Either anchor positions or ultrasound sweeping, or a combination of both, will allow the robot 12 to build up precision, relating the DICOM data to the real-time patient data until a sufficient match is achieved to begin the operation. This construction also allows minimally invasive surgery "pathing," whereby the control system includes path generation and planning software that provides for a minimally invasive mode. This mode will optimize robot tool trajectories so that the smallest possible incision (ideally just a single puncture) can be used. Surface and/or internal tissue damage can be minimized, or sensitive tissue preferentially protected, at the operator's discretion.
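The anchor-based correlation of preoperative DICOM data to the live patient amounts to a rigid registration problem. The patent does not disclose its algorithm; as one hedged illustration, a Kabsch (SVD) fit of DICOM anchor points onto live anchor points, with the RMS residual serving as the "sufficient match" measure, could look like this:

```python
import numpy as np

def register_anchors(dicom_pts, patient_pts):
    """Rigid (rotation + translation) fit of preoperative DICOM anchor
    points onto live patient anchor points via the Kabsch algorithm.

    Returns (R, t, rms_error); registration would be deemed "sufficient"
    when the RMS error falls below a clinically chosen threshold.
    """
    P = np.asarray(dicom_pts, dtype=float)
    Q = np.asarray(patient_pts, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)              # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T)) # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    aligned = (R @ P.T).T + t
    rms = float(np.sqrt(((aligned - Q) ** 2).sum(axis=1).mean()))
    return R, t, rms
```

An ultrasound-sweep variant would feed successively denser point sets through the same fit, letting the residual shrink until the match threshold is met.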

Claims
  • 1. A robotic surgical system comprising: a multi-axis surgical robot with a plurality of movable components and operable to hold and operate at least one surgical tool; a computer system having a processor, a display device and an input device, said computer system being operably interfaced with said robot for controlling operation of said at least one surgical tool and movement of robot components to effect movement of said at least one surgical tool, said computer system being configured to receive one or more image files of one or more pictures of a surgical site and display at least one said picture upon command, said one or more image files including at least one image taken prior to set up of a proposed surgical path and at least one real time image taken at the set-up of the proposed surgical path, the prior taken image aligned to the real time image prior to robot movement to complete the surgery, said display being operable to receive information from a user about a proposed surgical tool path overlaid on at least the image taken prior to set up of the proposed surgical path for at least one said surgical tool, said proposed path instruction being processed by a processor and used to direct movement of said at least one surgical tool by said surgical robot to perform a surgical procedure at said surgical site under control of the processor, the proposed surgical path viewable along with an image of the proposed tool on the display device with respect to the aligned images prior to movement of the robot.
  • 2. The system of claim 1 wherein said display includes a touch screen operable to function as said input device, and the information is input by a user touching said touch screen, and said computer system being operable to display the input information for viewing by said user.
  • 3. The system of claim 2 wherein said processor determines said surgical tool path and provides surgical tool path data to said robot to effect movement of said tool along said proposed surgical path.
  • 4. The system of claim 3 wherein said processor being configured to determine movement of said robot components to effect movement of said surgical tool along said surgical tool path, said robot providing feedback to said processor regarding the location of an effector.
  • 5. The system of claim 4 wherein said processor being configured to analyze said proposed surgical tool path and effect changes to said surgical tool path if there are interferences with the proposed surgical tool path.
  • 6. The system of claim 5 wherein said computer system being configured to receive instructions on surgical tool selection from said user, said processor configured to analyze the proposed surgical tool path using stored tool offsets to provide warnings of interference.
  • 7. The system of claim 6 wherein said computer system being configured to determine the location of said surgical tool relative to said surgical site prior to surgical operation using said surgical tool.
  • 8. The system of claim 1 wherein said computer system being configured to display an operation builder screen having a plurality of selectable operation procedures for selection to at least partially program operation of said robot and said surgical tool prior to commencement of the surgical procedure.
  • 9. The system of claim 1 wherein said computer system being configured to display at least one simulated image of said surgical site and show a currently predicted surgical tool path and surgical tool location on the image and allow said user to adjust a current predicted path to a new predicted path, said image including a simulated 3D image.
  • 10. A robotic surgical method comprising: positioning a multi-axis surgical robot with a plurality of computer system controlled movable components adjacent to a surgical patient; activating a computer system having a processor, a display device and an input device, said computer system being operably interfaced with said robot for controlling operation of a surgical tool and movement of robot components to effect movement of a surgical tool under control of the computer system, said computer system being configured to receive one or more image files of one or more pictures of a surgical site; displaying at least one surgical site picture on said display device; selecting a surgical tool using said computer system; a user inputting information of a proposed surgical path and displaying said proposed surgical path on said display device and overlaying said proposed surgical path for at least one said tool on at least one said picture; processing information about said proposed surgical path and said surgical tool and sending instruction to said robot to move said surgical tool along said proposed surgical path and performing a surgical procedure, said proposed surgical path instruction being processed by said processor and used to direct movement of said tool to perform a surgical procedure at said surgical site.
  • 11. The method of claim 10 wherein said user inputs the information and said information is displayed as a proposed surgical path on a touch screen for viewing by said user.
  • 12. The method of claim 11 wherein said processor analyzing said proposed surgical path and providing tool path data to said robot to effect movement of said surgical tool along said surgical path.
  • 13. The method of claim 12 wherein the processor determining the movement of said robot components to effect movement of said surgical tool along said surgical path.
  • 14. The method of claim 13 wherein said processor analyzing said proposed surgical path and effecting changes to said proposed path and generating a revised surgical path.
  • 15. The method of claim 14 wherein said user inputting instructions on surgical tool selection into said computer system.
  • 16. The method of claim 15 wherein said computer system determining the location of said surgical tool relative to said surgical site prior to surgical operation using a predetermined surgical tool offset, said surgical tool offset including a length of said surgical tool and a diameter of said surgical tool.
  • 17. The method of claim 10 wherein said computer system displaying an operation builder screen and said user selecting at least one of a plurality of selectable operation procedures to at least partially program operation of said robot and said surgical tool.
  • 18. The method of claim 10 wherein said computer system displaying at least one image of said surgical site and showing a currently proposed surgical path and tool location on the image and allowing at least one of said user and said computer system to adjust said proposed surgical path to a revised surgical path.
  • 19. The method of claim 18 wherein the at least one image including a simulated 3D image.
  • 20. The method of claim 10 wherein said surgical procedure being an orthopedic surgery procedure.
  • 21. The method of claim 20 wherein said orthopedic surgery procedure being a spinal surgery procedure.
  • 22. The method of claim 19 wherein said 3D image includes digital imaging and communications in medicine data.
  • 23. The method of claim 10 wherein said surgical site includes at least one electromagnetic sensor in electrical communication with said computer system for aligning said surgical site picture with a patient for control positioning of said multi-axis surgical robot.
PRIORITY CLAIM

In accordance with 37 C.F.R. 1.76, a claim of priority is included in an Application Data Sheet filed concurrently herewith. Accordingly, the present invention claims priority to U.S. Provisional Patent Application No. 62/616,700, entitled "ROBOTIC SURGICAL CONTROL SYSTEM", filed Jan. 12, 2018. The contents of the above referenced application are incorporated herein by reference in their entirety.

US Referenced Citations (106)
Number Name Date Kind
1154159 Ashworth Sep 1915 A
2557429 Hawley Jun 1951 A
2831295 Weiss Apr 1958 A
3091060 Geigerich et al. May 1963 A
3554197 Dobbie Jan 1971 A
4007528 Shea et al. Feb 1977 A
4008720 Brinckmann et al. Feb 1977 A
4081704 Vassos et al. Mar 1978 A
RE29736 Shea et al. Aug 1978 E
D248967 Shea et al. Aug 1978 S
4111208 Leuenberger Sep 1978 A
4596243 Bray Jun 1986 A
4620539 Andrews et al. Nov 1986 A
4828052 Duran et al. May 1989 A
4932935 Swartz Jun 1990 A
5092875 McLees Mar 1992 A
5522829 Michalos Jun 1996 A
5733119 Carr Mar 1998 A
5843110 Dross et al. Dec 1998 A
6021538 Kressner et al. Aug 2000 A
6110174 Nichter Aug 2000 A
6236875 Bucholz et al. May 2001 B1
6490467 Bucholz et al. Dec 2002 B1
6522906 Salisbury, Jr. Feb 2003 B1
6546279 Bova et al. Apr 2003 B1
6635067 Norman Oct 2003 B2
6676669 Charles et al. Jan 2004 B2
6689087 Pal et al. Feb 2004 B2
6716215 David et al. Apr 2004 B1
6721986 Zhuan Apr 2004 B2
6966912 Michelson Nov 2005 B2
7160304 Michelson Jan 2007 B2
7194120 Wicker et al. Mar 2007 B2
7346417 Luth et al. Mar 2008 B2
8029523 Wallis et al. Oct 2011 B2
8038630 Pal et al. Oct 2011 B2
8170717 Sutherland et al. May 2012 B2
8219178 Smith et al. Jul 2012 B2
8465491 Yedlicka et al. Jun 2013 B2
8491603 Yeung et al. Jul 2013 B2
8657821 Palermo Feb 2014 B2
8728085 Marsh et al. May 2014 B2
8828001 Stearns et al. Sep 2014 B2
9820818 Malackowski Nov 2017 B2
20030060927 Gerbi et al. Mar 2003 A1
20040050603 Jaeger Mar 2004 A1
20040147934 Kiester Jul 2004 A1
20040199072 Sprouse et al. Oct 2004 A1
20050027397 Niemeyer Feb 2005 A1
20050043718 Madhani et al. Feb 2005 A1
20050171557 Shoham Aug 2005 A1
20050283175 Tanner et al. Dec 2005 A1
20060229624 May et al. Oct 2006 A1
20060235305 Cotter et al. Oct 2006 A1
20060235306 Cotter et al. Oct 2006 A1
20070021738 Hasser et al. Jan 2007 A1
20070276423 Green Nov 2007 A1
20070282344 Yedlicka et al. Dec 2007 A1
20070282345 Yedlicka et al. Dec 2007 A1
20080027449 Gundlapalli et al. Jan 2008 A1
20080061784 Pal et al. Mar 2008 A1
20080108010 Wang May 2008 A1
20080108991 von Jako May 2008 A1
20080213899 Olgac Sep 2008 A1
20100145343 Johnson et al. Jun 2010 A1
20100165793 Haug Jul 2010 A1
20100198230 Shoham Aug 2010 A1
20100249506 Prisco Sep 2010 A1
20100249786 Schmieding et al. Sep 2010 A1
20110015635 Aryan Jan 2011 A1
20110015649 Anvari et al. Jan 2011 A1
20110118708 Burbank et al. May 2011 A1
20110118709 Burbank May 2011 A1
20110118778 Burbank May 2011 A1
20110196404 Dietz et al. Aug 2011 A1
20110230886 Gustilo et al. Sep 2011 A1
20110295270 Giordano et al. Dec 2011 A1
20110306873 Shenai et al. Dec 2011 A1
20110313428 Mohr et al. Dec 2011 A1
20110319941 Bar et al. Dec 2011 A1
20120059392 Diolaiti Mar 2012 A1
20120186372 Smith et al. Jul 2012 A1
20120211546 Shelton, IV Aug 2012 A1
20120220831 Cooper et al. Aug 2012 A1
20120266442 Rogers et al. Oct 2012 A1
20130096540 Cooper et al. Apr 2013 A1
20130123799 Smith et al. May 2013 A1
20130178856 Ye et al. Jul 2013 A1
20130206441 Roser et al. Aug 2013 A1
20130244820 Solomon et al. Sep 2013 A1
20130245629 Xie Sep 2013 A1
20130296886 Green et al. Nov 2013 A1
20130304069 Bono et al. Nov 2013 A1
20130345718 Crawford Dec 2013 A1
20140051922 Guthart et al. Feb 2014 A1
20140100574 Bono et al. Apr 2014 A1
20140194894 Dachs, II et al. Jul 2014 A1
20140222003 Herndon et al. Aug 2014 A1
20140276001 Ungi et al. Sep 2014 A1
20140350391 Prisco et al. Nov 2014 A1
20150100066 Kostrzewski et al. Apr 2015 A1
20150119916 Dietz et al. Apr 2015 A1
20160151120 Kostrzewski et al. Jun 2016 A1
20170065248 Ungi et al. Mar 2017 A1
20170245940 Piron et al. Aug 2017 A1
20170296292 Mahmood et al. Oct 2017 A1
Foreign Referenced Citations (40)
Number Date Country
42807 Jul 2005 AR
370608 Apr 1983 AT
2003200831 Sep 2004 AU
2011215901 Aug 2012 AU
861446 Mar 1978 BE
1112970 Nov 1981 CA
2513071 Jul 2004 CA
2788918 Aug 2011 CA
610753 May 1979 CH
102781349 Nov 2012 CN
2730227 Jun 1978 DE
570977 Jun 1978 DK
0148304 Jul 1985 EP
0261260 Mar 1988 EP
1581374 Oct 2005 EP
1690649 Aug 2006 EP
2533703 Dec 2012 EP
465719 Dec 1980 ES
197703650 Jun 1978 FI
2374886 Jul 1978 FR
1550577 Aug 1979 GB
1081824 May 1985 IT
S5380789 Jul 1978 JP
S5613462 Jul 1978 JP
2006512954 Apr 2006 JP
4481173 Jun 2010 JP
2013519434 May 2013 JP
5826771 Dec 2015 JP
197713563 Jun 1978 NL
197704411 Jun 1978 NO
WO9107116 May 1991 WO
WO0215799 Feb 2002 WO
WO2004062863 Jul 2004 WO
WO2007008703 Jan 2007 WO
WO2009151926 Dec 2009 WO
WO2011100313 Aug 2011 WO
WO2012166476 Dec 2012 WO
WO2013192598 Dec 2013 WO
WO2014150514 Sep 2014 WO
WO2015006296 Jan 2015 WO
Related Publications (1)
Number Date Country
20190216553 A1 Jul 2019 US
Provisional Applications (1)
Number Date Country
62616700 Jan 2018 US