The functions of a computer-assisted surgery (CAS) system may include pre-operative planning of a procedure, presenting pre-operative diagnostic information and images in useful formats, presenting status information about a procedure as it takes place, and enhancing performance. The CAS system may be used for procedures in traditional operating rooms, interventional radiology suites, mobile operating rooms or outpatient clinics.
Navigation systems may be used to display the positions of surgical tools with respect to preoperative or intraoperative image datasets. These images may include two-dimensional fluoroscopic images, and three-dimensional images generated using, for example, magnetic resonance imaging (MRI), computed tomography (CT) and positron emission tomography (PET). Some navigation systems make use of a tracking or localizing system. These systems locate markers attached or fixed to an object, such as an instrument or a patient, and track the position of markers. These tracking systems may be optical and/or magnetic, but may also include acoustic and/or ultrasonic systems. Optical systems may have a stationary stereo camera pair that observes passive reflective markers or active infrared LEDs attached to the tracked tools. Magnetic systems may have a stationary field generator that emits a magnetic field that is sensed by small coils integrated into the tracked tools.
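The stereo-camera arrangement described above can be illustrated with a minimal sketch of how a marker's depth is recovered from its disparity between two rectified pinhole cameras. The camera model, focal length, and baseline here are assumptions chosen for the example, not parameters of any particular tracking product.

```python
# Hypothetical depth recovery in a stereo optical tracker, assuming
# rectified pinhole cameras (Z = f * B / d). Illustrative only.

def stereo_depth(x_left, x_right, focal_px, baseline_mm):
    """Depth of a marker from its horizontal disparity between the
    left and right camera images of a rectified stereo pair."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("marker must have positive disparity")
    return focal_px * baseline_mm / disparity

# A marker 1000 mm away, with a 500 mm baseline and a 1000 px focal
# length, projects with a disparity of f * B / Z = 500 px.
depth = stereo_depth(700.0, 200.0, focal_px=1000.0, baseline_mm=500.0)
```

Real trackers triangulate full 3D positions from calibrated camera pairs; the one-axis disparity case above only conveys the geometric principle.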
Most navigation systems transmit information to the surgeon via a computer monitor. Conversely, the surgeon transmits information to the system via a keyboard and mouse, touchscreen, voice commands, control pendant, or foot pedals, and also by moving the tracked tool. The visual displays of navigation systems may display multiple slices through three-dimensional diagnostic image datasets.
Autonomous robots have been applied commercially to joint replacement procedures. These systems make precise bone resections, improving implant fit and placement relative to techniques that rely on manual instruments. Robots may also utilize haptic feedback systems to provide for semi-autonomous control, as described in greater detail below. Registration is performed by having the robot touch fiducial markers screwed into the bones or a series of points on the bone surfaces. Cutting is performed autonomously with a high-speed bur, although the surgeon can monitor progress and interrupt it if necessary. Bones may be clamped in place during registration and cutting, and are monitored for motion, which then requires re-registration.
Despite the advances in robotic devices and methods to perform or assist in the performance of certain surgeries, further advances are still desirable. For example, while robotic systems have been used to resect a patient's bone, it would be desirable to integrate the same robot into related procedures, such as in creating and/or applying a bone graft.
According to a first embodiment of the disclosure, a method of performing a surgical procedure on a patient includes planning a resection of a bone of the patient, and removing a volume of the bone with a surgical tool according to the planned resection. Data corresponding to a shape and volume of the removed bone is tracked with a computer system operatively coupled to the surgical tool, and a prosthesis is implanted onto the bone of the patient based on the tracked data corresponding to the shape and volume of the removed bone.
The surgical tool for removing the volume of the bone may be operatively coupled to a robotic device during the removal step. The surgical tool may be a manual tool. The implanted prosthesis may be deposited on the bone with a deposition tool operatively coupled to a robotic device during the implanting step. The deposition tool may be a syringe device. The implanted prosthesis may be an ultraviolet curable resin. A temperature of the prosthesis may be monitored during the implanting step.
The implanting step may further include forming a lattice on the resected bone with a first deposition tool containing a first prosthesis therein, with the first deposition tool being operatively coupled to a robotic device, and filling the lattice with a second prosthesis, which may be contained in a second deposition tool operatively coupled to the robotic device.
The implanting step may further include implanting a first prosthesis layer on the resected bone with a first deposition tool operatively coupled to a robotic device, the first prosthesis layer having a first density, and implanting a second prosthesis layer on the first prosthesis layer, which may be done with a second deposition tool operatively coupled to a robotic device, the second prosthesis layer having a second density different than the first density. The second density may be greater than the first density.
The method may further include shaping the prosthesis using the surgical tool so that the prosthesis has a shape complementary to the shape of the removed bone. The surgical tool may be selected from the group consisting of a bur, saw, laser, cautery device, and waterjet. During the step of shaping the prosthesis, the prosthesis may be secured to a holding device. The step of removing the volume of the bone may include forming a first geometric shape in the bone, and the step of shaping the prosthesis may include forming a second geometric shape in the prosthesis, the first geometric shape being keyed to the second geometric shape. The first and second geometric shapes may form a dovetail configuration. The step of implanting the prosthesis onto the bone of the patient may include coupling the prosthesis onto the bone with a fastener. The fastener may be selected from the group consisting of bone screws and bone pins. A feature for accepting the fastener may be formed into at least one of the bone and the prosthesis. The feature may be selected from the group consisting of a threaded screw hole and a pilot hole. The step of shaping the prosthesis using the surgical tool may include forming a plurality of discrete prostheses, and the step of implanting the prosthesis onto the bone may include implanting each of the discrete prostheses.
Haptic device 113 is, in the illustrated example, a robotic device. Haptic device 113 may be controlled by a processor based system, for example a computer 10. Computer 10 may also include power amplification and input/output hardware. Haptic device 113 may communicate with computer-assisted surgery system 11 by any suitable communication mechanism, whether wired or wireless.
Also shown in
Haptic object 110 is a virtual object used to guide and/or constrain the movement and operations of surgical tool 112 to a target area inside a patient's anatomy 114, for example the patient's leg. In this example, haptic object 110 is used to aid the surgeon 116 to target and approach the intended anatomical site of the patient. Haptic feedback forces may be used to slow and/or stop the surgical tool's movement if it is detected that a portion of surgical tool 112 will intrude or cross over pre-defined boundaries of the haptic object. Furthermore, haptic feedback forces can also be used to attract (or repulse) surgical tool 112 toward (or away from) haptic object 110 and to (or away from) the target. If desired, surgeon 116 may be presented with a representation of the anatomy being operated on and/or a virtual representation of surgical tool 112 and/or haptic object 110 on display 30.
The computer-assisted surgery (“CAS”) system preferably includes a localization or tracking system that determines or tracks the position and/or orientation of various trackable objects, such as surgical instruments, tools, haptic devices, patients, donor tissue and/or the like. The tracking system may continuously determine, or track, the position of one or more trackable markers disposed on, incorporated into, or inherently a part of the trackable objects, with respect to a three-dimensional coordinate frame of reference. Markers can take several forms, including those that can be located using optical (or visual), magnetic or acoustical methods. Furthermore, at least in the case of optical or visual systems, location of an object's position may be based on intrinsic features, landmarks, shape, color, or other visual appearances that, in effect, function as recognizable markers.
Any type of tracking system may be used, including optical, magnetic, and/or acoustic systems, which may or may not rely on markers. Tracking systems are typically optical, functioning primarily in the infrared range. They may include a stationary stereo camera pair that is focused around the area of interest and sensitive to infrared radiation. Markers emit infrared radiation, either actively or passively. An example of an active marker is a light emitting diode (LED). An example of a passive marker is a reflective marker, such as a ball-shaped marker with a surface that reflects incident infrared radiation. Passive systems may include an infrared radiation source to illuminate the area of focus. A magnetic system may have a stationary field generator that emits a magnetic field that is sensed by small coils integrated into the tracked tools.
With information from the tracking system on the location of the trackable markers, CAS system 11 may be programmed to be able to determine the three-dimensional coordinates of an end point or tip of a tool and, optionally, its primary axis using predefined or known (e.g. from calibration) geometrical relationships between trackable markers on the tool and the end point and/or axis of the tool. A patient, or portions of the patient's anatomy, can also be tracked by attachment of arrays of trackable markers. In the illustrated example, the localizer is an optical tracking system that comprises one or more cameras 14 that preferably track a probe 16. As shown in
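The tip computation described above reduces to applying the tracked marker array's pose to a calibrated tip offset. The sketch below uses an assumed rotation-plus-translation pose and an assumed tip offset in the marker frame; it is not the CAS system's actual data model.

```python
# Illustrative tool-tip localization: rotate the calibrated tip offset
# (fixed in the marker frame) by the tracked pose and translate it.

def mat_vec(R, v):
    """Multiply a 3x3 rotation matrix by a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

def tool_tip(R, t, tip_offset):
    """World-frame tip position from the marker array's pose (R, t)
    and the known tip offset in the marker frame."""
    rotated = mat_vec(R, tip_offset)
    return [rotated[i] + t[i] for i in range(3)]

# 90-degree rotation about z, marker array at (100, 0, 0) mm, and a
# tip calibrated 50 mm along the marker's x axis.
R = [[0, -1, 0], [1, 0, 0], [0, 0, 1]]
tip = tool_tip(R, [100.0, 0.0, 0.0], [50.0, 0.0, 0.0])
```

The tool's primary axis can be recovered the same way by rotating a calibrated direction vector rather than a point.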
In one implementation, processor based system 36 may include image guided surgery software to provide certain user functionality, e.g., retrieval of previously saved surgical information, preoperative surgical planning, determining the position of the tip and axis of instruments, registering a patient and preoperative and/or intraoperative diagnostic image datasets to the coordinate system of the tracking system, etc. Full user functionality may be enabled by providing the proper digital medium to storage medium 12 coupled to computer 36. The digital medium may include an application specific software module. The digital medium may also include descriptive information concerning the surgical tools and other accessories. The application specific software module may be used to assist a surgeon with planning and/or navigation during specific types of procedures. For example, the software module may display predefined pages or images corresponding to specific steps or stages of a surgical procedure. At a particular stage or part of a module, a surgeon may be automatically prompted to perform certain tasks or to define or enter specific data that will permit, for example, the module to determine and display appropriate placement and alignment of instrumentation or implants or provide feedback to the surgeon. Other pages may be set up to display diagnostic images for navigation and to provide certain data that is calculated by the system for feedback to the surgeon. Instead of or in addition to using visual means, the CAS system could also communicate information in other ways, including audibly (e.g. using voice synthesis) and tactilely, such as by using a haptic interface. For example, in addition to indicating visually a trajectory for a drill or saw on the screen, a CAS system may use an audible sound to indicate to the surgeon whether he is nearing some object or is on course.
To further reduce the burden on the surgeon, the module may automatically detect the stage of the procedure by recognizing the instrument picked up by a surgeon and move immediately to the part of the program in which that tool is used.
The software which resides on computer 36, alone or in conjunction with the software on the digital medium, may process electronic medical diagnostic images, register the acquired images to the patient's anatomy, and/or register the acquired images to any other acquired imaging modalities, e.g., fluoroscopy to CT, MRI, etc. If desired, the image datasets may be time variant, i.e. image datasets taken at different times may be used. Media storing the software module can be sold bundled with disposable instruments specifically intended for the procedure. Thus, the software module need not be distributed with the CAS system. Furthermore, the software module can be designed to work with specific tools and implants and distributed with those tools and implants. Moreover, CAS system can be used in some procedures without the diagnostic image datasets, with only the patient being registered. Thus, the CAS system need not support the use of diagnostic images in some applications—i.e. an imageless application.
Haptic device 113 may be used in combination with the tracking and imaging systems described above to perform highly accurate bone resections and grafting bone on the resected bone. A general description of such a procedure is described below, followed by at least two particular examples of the procedure.
In step 230, the surgeon may define the boundaries of haptic object 110. This may be accomplished in one of several ways. In one example, the haptic object 110 may be based on the geometry and/or volume of bone to be removed determined in step 220. The haptic object 110 may be defined to have boundaries along the geometry and/or volume of bone to be removed so that the surgical tool 112, as described above, may aid the surgeon 116 to target and approach the intended anatomical site of the patient with surgical tool 112. In another example, a number of pre-defined shapes or volumes may be pre-loaded into computer 10 and/or computer 36. For example, different procedures may have certain typical shapes or volumes of intended bone removal, and one or more pre-loaded geometries and/or volumes may be included in the software application on computer 10 and/or computer 36, for example with each geometry and/or volume corresponding to one or more types of procedures. These pre-loaded shapes or volumes may be used without modification, but in many cases the pre-loaded geometries and/or volumes will be modified by the surgeon and/or combined with other pre-loaded geometries and/or volumes to meet the needs of the particular patient.
In step 240, haptic device 113 is registered to the anatomy of the patient. If desired, a representation of the anatomy of the patient displayed on display device 30 may also be registered with the anatomy of the patient so that information in diagnostic or planning datasets may be correlated to locations in physical space. For example, the haptic device 113 (or a probe attached thereto) may be directed to touch fiducial markers screwed into the bones, to touch a series of points on the bone to define a surface, and/or to touch anatomical landmarks. The registration step 240 is preferably performed when the anatomy is clamped or otherwise secured from undesired movement. Registration may also be performed using, for example, intraoperative imaging systems. However, the anatomy does not need to be clamped in certain situations, for example if tracking devices are coupled to the anatomy. In that case, any movement of the anatomy is tracked so that rigid fixation is not necessary.
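A deliberately simplified sketch of the fiducial-based registration in step 240 is shown below, reduced to the translation-only case of matching centroids. A full rigid registration also solves for rotation (for example by a least-squares fit over the point pairs); the fiducial coordinates here are invented for the example.

```python
# Translation-only registration sketch: align the fiducials touched in
# physical space with the same fiducials in the image dataset by
# matching centroids. A real registration also solves for rotation.

def centroid(points):
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(3)]

def register_translation(physical_fiducials, image_fiducials):
    """Translation mapping image-space fiducials onto the fiducials
    touched on the patient."""
    c_phys = centroid(physical_fiducials)
    c_img = centroid(image_fiducials)
    return [c_phys[i] - c_img[i] for i in range(3)]

# Fiducials in the image, and the same fiducials as touched on the
# patient: the recovered offset is the rigid translation between them.
image_pts = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (0.0, 10.0, 0.0)]
phys_pts = [(5.0, 2.0, 1.0), (15.0, 2.0, 1.0), (5.0, 12.0, 1.0)]
offset = register_translation(phys_pts, image_pts)
```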
In step 250, with patient registration complete, the bone removal procedure is performed. The procedure may be any suitable procedure in which bone is to be removed, such as resection in preparation for joint replacement, bulk bone removal, or small volume bone removal for treating small tumors or the like. The actual process of removing bone may be performed semi-autonomously under haptic control, as described above, autonomously by haptic device 113, manually via free-hand resection by the surgeon, or any combination of the above. Regardless of the specific procedure or the level of surgeon control, the bone removal geometry and/or volume is tracked by computer 10 (and/or computer 36) by tracking the position of surgical tool 112 with the navigation system and/or joint encoders of haptic device 113. Thus, even if the bone actually removed differs from the surgical plan, the computer 10 (and/or computer 36) tracks and stores information relating to the bone actually removed. In other embodiments, photo and/or pressure sensors may be employed with haptic device 113 to precisely measure the geometry and/or volume of bone that is removed. It is also contemplated that, following the bone removal, additional imaging may be performed and compared to patient images prior to the resection to determine bone actually removed, which may be used as an alternative to the robotic tracking of bone removal or as confirmation of same. Still further, instead of tracking and storing information to the bone actually removed during the removal process, the bone may first be removed, and following the bone removal, the remaining surface of the bone may be probed to register the precise remaining volume and/or geometry of bone.
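The tracking of removed bone described in step 250 can be sketched as sweeping a voxel grid with the tracked bur tip: each voxel within the bur radius of the tip is marked removed, and the removed volume is the count of marked voxels times the voxel volume. The grid resolution and spherical bur model are assumptions for the example.

```python
# Sketch of tracking removed bone: as the tracked bur tip moves, mark
# every voxel within the bur radius as removed, then sum voxel volumes.
import math

def carve(removed, tip_mm, bur_radius_mm, voxel_mm=1.0):
    """Add to `removed` each voxel index whose center lies within the
    bur radius of the current tip position."""
    r = int(bur_radius_mm / voxel_mm) + 1
    cx, cy, cz = (int(round(c / voxel_mm)) for c in tip_mm)
    for i in range(cx - r, cx + r + 1):
        for j in range(cy - r, cy + r + 1):
            for k in range(cz - r, cz + r + 1):
                d = (i * voxel_mm - tip_mm[0],
                     j * voxel_mm - tip_mm[1],
                     k * voxel_mm - tip_mm[2])
                if math.sqrt(sum(c * c for c in d)) <= bur_radius_mm:
                    removed.add((i, j, k))

removed = set()
carve(removed, (0.0, 0.0, 0.0), bur_radius_mm=1.5)
volume_mm3 = len(removed) * 1.0 ** 3  # 19 voxels of 1 mm^3 each
```

Calling `carve` at each tracked tip sample accumulates the actually removed geometry even when it departs from the plan, which is the property the tracked data in step 260 relies on.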
With the information relating to the geometry and/or volume of bone removed from the patient, computer 10 and/or computer 36 determines the precise three-dimensional geometry of the prosthesis to be implanted into or onto the bone in step 260. Based on this determination, haptic device 113 may be used in any one of a number of ways to form and/or place the prosthesis. For example, if the prosthesis is an allograft bone, haptic device 113 may employ the determined geometry and/or volume to assist the surgeon in shaping the allograft bone to precisely fit the geometry of the resected bone. Alternately, a similar procedure may be used on the patient if the prosthesis is an autograft bone taken from another bone portion of the patient, with the haptic device 113 providing assistance to the surgeon in resecting the precise geometry and/or volume of autograft to replace the bone removed in step 250. In other embodiments, haptic device 113 may be employed to resect more autograft than will be needed to replace the bone removed in step 250 while taking into account whether such removal of autograft taken from another bone portion of the patient is safe for the patient. Still further, a liquid or putty-type bone graft may be applied to the site of bone removal in step 250, for example by attaching a syringe-like device as the tool of haptic device 113, with precise application of the bone graft to the site of bone removal. Some of these examples are described in greater detail below.
As noted above, steps 200 through 260 do not necessarily need to be performed in the order shown in
One particular example of a procedure utilizing steps 200-260 of
The processor-based system 36, for example with the aid of software, may automatically identify the location and/or boundaries of tumor(s) 310. In one example, this determination is based on bone density and/or quality information from the image 300. Tumor(s) 310 and surrounding portions of healthy femur 305 may have different density values, allowing for the correlation of image brightness to bone density in order to determine the boundaries between tumor(s) 310 and adjacent portions of healthy femur 305. The surgeon may review and confirm the determined location of tumor(s) 310, revise the determined location of the tumor(s), or otherwise manually identify the location of the tumor(s).
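The density-based boundary identification above can be illustrated by thresholding image intensities against an expected range for healthy bone. The intensity values and the healthy range below are invented calibration assumptions, not clinical values.

```python
# Hypothetical tumor-boundary detection by density thresholding:
# pixel intensity stands in for bone density, and pixels outside the
# assumed healthy range are flagged as candidate tumor.

def tumor_mask(image, healthy_min, healthy_max):
    """Boolean mask that is True where a pixel's intensity falls
    outside the expected range for healthy bone."""
    return [[not (healthy_min <= v <= healthy_max) for v in row]
            for row in image]

image = [
    [200, 210, 205],
    [198, 90, 202],   # the 90 pixel reads far less dense than bone
    [205, 207, 199],
]
mask = tumor_mask(image, healthy_min=150, healthy_max=255)
```

In practice the flagged region would then be reviewed and, if necessary, revised by the surgeon, as the paragraph above describes.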
Based on the determination of the boundary between tumor(s) 310 and healthy femur 305, the processor-based system 36 may automatically determine the geometry and/or volume 315 of femur 305 to be resected to effectively remove tumor(s) 310, as provided by step 220 and as shown in
Whether or not steps 220 and 230 are performed, the patient is then registered to the haptic device 113 as described above in connection with step 240. A surgical tool 112 in the form of a small bur may be coupled to haptic device 113 and used to remove the tumor(s) 310 on femur 305. If steps 220 and 230 were performed, the haptic device 113 may autonomously or semi-autonomously guide the bur using the constraints of the haptic object 110 to remove the desired geometry and/or volume 315 of bone, as shown in
In step 260, the precise geometry and/or volume of the prosthetic is determined. The prosthetic geometry and/or volume may be identical to that of the bone removed, as tracked during the removal step, whether the bone removal was autonomous, semi-autonomous, or manual. If the bone removal geometry and/or volume was pre-planned using computer 36, the geometry and/or volume of the prosthetic may be identical to the geometry and/or volume of the planned bone removal, since haptic device 113 helps ensure the bone removal occurs exactly (or nearly exactly) according to plan. Instead of forming the geometry of the prosthesis to be identical to the geometry and/or volume of the removed bone, modifications may be made, for example so that the prosthesis can have a press fit or interference fit with the patient's anatomy.
The prosthesis may take any suitable form, including, e.g., demineralized bone matrices (“DBM”), morselized autograft, morselized allograft, polymethyl methacrylate (“PMMA”) bone cements, synthetic calcium phosphate or calcium sulfate based bone grafts, and/or ultraviolet (“UV”) curable resins. If the prosthesis takes the form of one of the above void fillers, it may be delivered via syringe or syringe-like device. For example, as shown in
Rather than use a homogenous void filler 320, the process may be divided into steps to provide additional features of the prosthetic bone. For example, a surgical tool 112 with a syringe packed with a curable resin, such as a UV curable resin, may be coupled to haptic device 113. A curing source, such as a UV source, may be provided along with surgical tool 112 so that the curable resin cures contemporaneously or near-contemporaneously upon deposition into the bone void. A cured resin lattice may be formed in this manner, which may then be infused with a void filler or a bone growth composition. The lattice may take the form of a structural three-dimensional matrix with voids that can be filled with a void filler and/or bone growth composition. This infusion may be accomplished by coupling a surgical tool 112 in the form of a syringe-like device packed with the bone growth material to haptic device 113, or manually by the surgeon.
Another alternative, as shown in
With any of the void filler 320 deposition techniques described above, the void filler 320 may vary in quality in three-dimensions. For example, layers of filler 320 which have different densities may be applied as desired, for example by repeating the delivery described in connection with
Some void fillers 320, such as bone cement, may be applied to the bone at a relatively high temperature and cure as the cement cools. The surgical tool 112 may incorporate a thermal sensor so that computer 10 (and/or computer 36) is able to detect a temperature of the void filler 320 packed into the end effector. The computer 10 (and/or computer 36) may then control the deposition of the void filler 320 onto the bone so that the application occurs at an optimal viscosity and/or temperature. For example, if the void filler 320 is too hot, the native bone may be damaged. However, if the void filler 320 is allowed to cool too much prior to deposition, the deposition may not be effective if the void filler 320 has already begun to harden.
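The temperature-gated deposition logic above can be sketched as a simple decision rule with an assumed working window: too hot risks thermal damage to native bone, too cool means the cement has begun to harden. The threshold values are illustrative only, not clinical parameters.

```python
# Sketch of temperature-gated cement deposition. The thresholds are
# assumptions for the example, not validated clinical values.

def deposition_action(temp_c, bone_safe_max=55.0, workable_min=40.0):
    """Decide what to do with the cement charge at the measured
    temperature reported by the thermal sensor."""
    if temp_c > bone_safe_max:
        return "wait"      # too hot: risks thermal damage to bone
    if temp_c < workable_min:
        return "discard"   # already hardening: deposition ineffective
    return "deposit"       # within the assumed working window

actions = [deposition_action(t) for t in (70.0, 50.0, 30.0)]
```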
Although the procedure above is described as tracking bone removal coincident with the bone removal process, other alternatives may be suitable. For example, after the bone removal is complete, a shapeable material may be pressed into the bone void to create a mold having a volume and/or geometry corresponding to the resected bone. It should be understood that this mold may actually be a “reverse” mold of the resected bone, since the mold has the shape of what was removed. The mold, once formed, may be removed from the bone and the surface probed and registered to determine the shape of the removed bone (and correspondingly the shape of the remaining bone).
Another example of a procedure utilizing steps 200-260 of
Structural bulk allograft procedures using tissue prostheses may provide certain benefits because the allograft may include soft tissue attachments to allow the surgeon to reconstruct the soft tissue with the promise of increased restoration of function. For example, a proximal tibial allograft may include a patellar tendon, a proximal femoral allograft may include hip abductor tendons, and a proximal humeral allograft may include rotator cuff tendons.
As an example, a physician may determine that, following trauma to a proximal tibia, it would be beneficial to replace the proximal tibia with a bulk tissue allograft from a donor. The patient's bone, including the trauma site, is imaged in step 210. A schematic illustration of an image 400 of a patient's tibia 405 is shown in
An appropriate prosthesis, such as a donor tibia 505, may be secured into a holding device 515, as shown in
The donor tibia 505, including any soft tissue attachments, such as a patellar tendon, may be imaged if desired. In one example, imaging the donor tibia 505 and/or soft tissue attachments may provide information that may be useful to the surgeon in the procedure. For example, density information of the donor tibia 505 may be obtained from the image and a desired portion of donor tibia 505 may be selected for grafting based, at least in part, on the density profile determined from the image. In addition, information regarding any imaged soft tissue attachments may aid in planning placement of the soft tissue attachments with respect to the patient's anatomy.
Whether or not donor tibia 505 is imaged, the surgeon may use processor based system 36 to plan the surgical procedure on the patient's tibia 405, including the step 220 of defining the volume and/or geometry of the resection of the patient's tibia 405. Based on this determination, haptic object 110 may be defined in step 230. For example, as shown in
The patient's tibia 405 is registered as described above in connection with step 240. A surgical tool 112, such as a drill, bur, or other resecting tool, may be coupled to haptic device 113 and used to resect tibia 405 according to the surgical plan. As described above, the haptic device 113 may autonomously or semi-autonomously guide the surgical tool 112 using the constraints of the haptic object 110 to remove the desired geometry and/or volume of bone, as shown in
As the haptic device 113 resects tibia 405, data is stored in computer 10 (and/or computer 36) to determine the geometry and/or volume of donor tibia 505 that needs to be removed, as provided in step 260, in order to provide a corresponding fit with the resected tibia 405. If not already performed, the donor tibia 505 is registered, for example in the same manner in which the patient's tibia 405 was registered. The registration takes place after donor tibia 505 is securely positioned in holding device 515 to help ensure that the global or real world coordinate system does not change with respect to the registered coordinate system, for example by unintentional movement of donor tibia 505 within holding device 515. It should be noted that the holding device 515 may also be registered during this step. Whether the registration is performed before or after the registration and resection of the patient's tibia 405, the donor tibia 505 is resected using haptic device 113, as shown in
It should be understood that although a dovetail interlocking feature is described above, other features of aiding implantation may be used instead or in addition. For example, other types of geometric keys, including tongue and groove, may be correspondingly formed in the patient's tibia 405 and donor tibia 505. In fact, any corresponding geometries intended to mate with one another may be created. For example, corresponding geometries that provide for a press fit and/or interference fit may be created. Further, the donor bone may be resected into multiple pieces that fit together to form the desired implant shape. This type of procedure may be useful, for example, when a middle portion of a bone is being replaced, similar to the procedure described in connection with
Still further, in some procedures one or more pieces of hardware, such as a bone plate, may be implanted to additionally secure other prosthetic devices, such as a donor bone or multiple pieces of donor bone. In some cases, a bone plate may be bent by the surgeon intraoperatively to provide the best fit between the plate and the anatomy. However, such bending is often done by trial and error. With the above disclosure in mind, if a surgeon bends a plate intraoperatively, the surgeon may probe the bone-contacting surface of the plate to determine the geometry of the surface, which may be compared by computer 10 and/or 36 to the surface geometry of the patient anatomy to determine whether or not, and to what extent, the contour of the bone plate matches the contour of the anatomy to which the bone plate will be affixed.
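The plate-contour comparison described above can be sketched as a root-mean-square gap between paired probed points on the plate's bone-contacting surface and the corresponding points on the bone surface. The point pairing is assumed known here; a real comparison would also handle correspondence and alignment.

```python
# Illustrative fit check for an intraoperatively bent plate: RMS gap
# between paired plate and bone surface points (pairing assumed known).
import math

def contour_mismatch(plate_pts, bone_pts):
    """Root-mean-square distance between paired plate and bone points;
    smaller values mean the bent plate matches the anatomy better."""
    sq = 0.0
    for p, b in zip(plate_pts, bone_pts):
        sq += sum((p[i] - b[i]) ** 2 for i in range(3))
    return math.sqrt(sq / len(plate_pts))

bone = [(0.0, 0.0, 0.0), (10.0, 0.0, 1.0), (20.0, 0.0, 3.0)]
plate = [(0.0, 0.0, 1.0), (10.0, 0.0, 1.0), (20.0, 0.0, 2.0)]
rms = contour_mismatch(plate, bone)
```

A threshold on this mismatch could tell the surgeon whether further bending is warranted before fixation.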
Although the bulk allograft procedure is described above in relation to a tibia 405, it should be understood that the procedure applies to other bones and to other types of resections. In addition, the procedure could be performed with the donor bone being a portion of the patient's own bone from another site. With such an autograft procedure, the steps outlined above would be generally similar, but with the haptic device 113 being used to resect the patient's host bone and also the patient's own donor bone which may come from another part of the patient's body.
It should further be clear that the imaging and registration of the patient's bone and donor bone may be performed essentially in any order. For example, the patient's bone may be imaged and registered, then resected, and then the donor bone registered and resected. Alternatively, the patient and donor bone may both be registered prior to performing resection of either bone.
In addition, although the procedure described in connection with
Still further, although certain steps are described as being performed on processor based system 36 and/or computer 10, it should be understood that such steps may be performed on a separate computer device with the results imported to processor based system 36 and/or computer 10. For example, the surgical plan may be created on a separate computer device prior to the surgery and the results of such plan imported to processor based system 36 for use during the surgery.
Although the invention herein has been described with reference to particular embodiments, it is to be understood that these embodiments are merely illustrative of the principles and applications of the present invention. It is therefore to be understood that numerous modifications may be made to the illustrative embodiments and that other arrangements may be devised without departing from the spirit and scope of the present invention as defined by the appended claims. For example, features described in relation to one embodiment may be combined with features described in relation to another embodiment.
This application is a continuation of U.S. patent application Ser. No. 17/399,193, filed Aug. 11, 2021, which is a continuation of U.S. patent application Ser. No. 16/546,498, filed Aug. 21, 2019, which is a continuation of U.S. Pat. No. 10,433,921, filed Dec. 19, 2016, which claims the benefit of the filing date of U.S. Provisional Application No. 62/271,599, filed Dec. 28, 2015, the disclosures of which are hereby incorporated by reference herein.
Number | Name | Date | Kind |
---|---|---|---|
4743262 | Tronzo | May 1988 | A |
5598005 | Wang | Jan 1997 | A |
5824084 | Muschler | Oct 1998 | A |
6112109 | D'Urso | Aug 2000 | A |
6319712 | Meenen | Nov 2001 | B1 |
6475243 | Sheldon et al. | Nov 2002 | B1 |
6613092 | Kana et al. | Sep 2003 | B1 |
6711432 | Krause et al. | Mar 2004 | B1 |
6723131 | Muschler | Apr 2004 | B2 |
7641672 | Fallin et al. | Jan 2010 | B2 |
7747311 | Quaid, III | Jun 2010 | B2 |
7831292 | Quaid | Nov 2010 | B2 |
8095200 | Quaid, III | Jan 2012 | B2 |
8183042 | Liao et al. | May 2012 | B2 |
8287522 | Moses | Oct 2012 | B2 |
8391954 | Quaid, III | Mar 2013 | B2 |
8483863 | Knox | Jul 2013 | B1 |
8551178 | Sharkey et al. | Oct 2013 | B2 |
8617171 | Park et al. | Dec 2013 | B2 |
8652148 | Zuhars | Feb 2014 | B2 |
8657482 | Malackowski et al. | Feb 2014 | B2 |
8845736 | Zhang et al. | Sep 2014 | B2 |
8911499 | Quaid | Dec 2014 | B2 |
9002426 | Quaid | Apr 2015 | B2 |
9039998 | Guillemot et al. | May 2015 | B2 |
9056017 | Kotlus | Jun 2015 | B2 |
9060794 | Kang | Jun 2015 | B2 |
9278001 | Forsell | Mar 2016 | B2 |
9486321 | Smith et al. | Nov 2016 | B1 |
9492237 | Kang | Nov 2016 | B2 |
9636185 | Quaid | May 2017 | B2 |
9724165 | Arata | Aug 2017 | B2 |
9757242 | Dong et al. | Sep 2017 | B2 |
9757243 | Jones et al. | Sep 2017 | B2 |
9775681 | Quaid | Oct 2017 | B2 |
9775682 | Quaid | Oct 2017 | B2 |
9820861 | Smith | Nov 2017 | B2 |
10028789 | Quaid | Jul 2018 | B2 |
10085804 | Nortman | Oct 2018 | B2 |
10231790 | Quaid | Mar 2019 | B2 |
10433921 | Librot | Oct 2019 | B2 |
11154370 | Librot | Oct 2021 | B2 |
11819298 | Librot | Nov 2023 | B2 |
20030135216 | Sevrain | Jul 2003 | A1 |
20040193268 | Hazebrouck | Sep 2004 | A1 |
20050272153 | Xuenong et al. | Dec 2005 | A1 |
20060142657 | Quaid | Jun 2006 | A1 |
20070142751 | Kang | Jun 2007 | A1 |
20070173815 | Murase | Jul 2007 | A1 |
20070265705 | Gaissmaier et al. | Nov 2007 | A1 |
20070270685 | Kang | Nov 2007 | A1 |
20080004633 | Arata | Jan 2008 | A1 |
20080010705 | Quaid | Jan 2008 | A1 |
20080010706 | Moses | Jan 2008 | A1 |
20090000626 | Quaid | Jan 2009 | A1 |
20090000627 | Quaid | Jan 2009 | A1 |
20090012531 | Quaid | Jan 2009 | A1 |
20090012532 | Quaid | Jan 2009 | A1 |
20090306499 | Van Vorhis | Dec 2009 | A1 |
20090314925 | Van Vorhis | Dec 2009 | A1 |
20100016467 | Truckai | Jan 2010 | A1 |
20100217400 | Nortman | Aug 2010 | A1 |
20100256504 | Moreau-Gaudry | Oct 2010 | A1 |
20100256692 | Kang | Oct 2010 | A1 |
20110172611 | Yoo et al. | Jul 2011 | A1 |
20120109152 | Quaid, III | May 2012 | A1 |
20130053648 | Abovitz | Feb 2013 | A1 |
20130060278 | Bozung | Mar 2013 | A1 |
20130144392 | Hughes | Jun 2013 | A1 |
20130211523 | Southard et al. | Aug 2013 | A1 |
20140180290 | Otto | Jun 2014 | A1 |
20140188134 | Nortman | Jul 2014 | A1 |
20140194887 | Shenoy | Jul 2014 | A1 |
20140263214 | Dahotre et al. | Sep 2014 | A1 |
20140371897 | Lin et al. | Dec 2014 | A1 |
20150182295 | Bozung et al. | Jul 2015 | A1 |
20160338782 | Bowling | Nov 2016 | A1 |
20170007406 | Cui et al. | Jan 2017 | A1 |
20170020613 | Kang | Jan 2017 | A1 |
20170151021 | Quaid, III | Jun 2017 | A1 |
20170181755 | Librot | Jun 2017 | A1 |
20170333137 | Roessler | Nov 2017 | A1 |
20170333138 | Arata | Nov 2017 | A1 |
20170340389 | Otto | Nov 2017 | A1 |
20180168749 | Dozeman | Jun 2018 | A1 |
20180168750 | Staunton | Jun 2018 | A1 |
20190015164 | Quaid | Jan 2019 | A1 |
20190029764 | Nortman | Jan 2019 | A1 |
20190374295 | Librot | Dec 2019 | A1 |
20200046412 | Nachtrab et al. | Feb 2020 | A1 |
20200060843 | Evans et al. | Feb 2020 | A1 |
20210369361 | Librot | Dec 2021 | A1 |
20240065785 | Librot | Feb 2024 | A1 |
Number | Date | Country |
---|---|---|
WO-2014145406 | Sep 2014 | WO |
WO-2019104392 | Jun 2019 | WO
Entry |
---|
Extended European Search Report including Written Opinion for Application No. 21205871.3 dated Mar. 23, 2022, pp. 1-10. |
Number | Date | Country | |
---|---|---|---|
20240065785 A1 | Feb 2024 | US |
Number | Date | Country | |
---|---|---|---|
62271599 | Dec 2015 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 17399193 | Aug 2021 | US |
Child | 18388091 | US | |
Parent | 16546498 | Aug 2019 | US |
Child | 17399193 | US | |
Parent | 15383303 | Dec 2016 | US |
Child | 16546498 | US |