The invention relates to a system and method of computer assisted surgery (CAS) using stereotactic navigation with three-dimensional visualization, and more specifically to a CAS system that is reactive and does not disrupt operating room workflow procedures.
Implants (comprising, for example, a plate and associated screws) are currently inserted by positioning the plate at the corresponding anatomical location and inserting the screws with the assistance of fluoroscopy. The implantation of plating and nailing systems is often a difficult task because operating room (OR) procedures are generally minimally invasive, and placement is therefore achieved by trial and error using fluoroscopy, such as a C-arm apparatus (i.e., C-arm vision). This generally leads to long operating times. Furthermore, during such a procedure, both the patient and the operator are exposed to significant amounts of radiation.
In addition, in some cases it may be impossible to determine the position of implant components (for example, the screws in the bone) with sufficient precision because the fluoroscopic image is only two-dimensional. This may lead to misplacement or to insertion of screws of improper length, which in turn may cause high revision rates or even injuries (e.g., hip joint injury). To ensure that these implant components do not protrude from the bone, it is thus sometimes necessary to position them with an excessively large safety margin away from the edge of the bone. In many instances, the result is that the implant cannot be positioned as intended and the desired biomechanical stability cannot be achieved. In the case of femoral neck fractures, for example, the use of conventional fluoro-navigation does not provide any significant improvement.
Other cutting-edge technologies currently used in operating rooms to assist in surgery include intra-operative three-dimensional (3D) imaging and navigation systems based on tracking technology. However, only a few hospitals use these technologies. The limited adoption is primarily due to their high cost, the effort involved in installing these systems, and the significant resulting changes to OR procedures or workflow. For example, tracking technologies require a line of sight between the tracking device and the navigation detection system. This disrupts the normal workflow since the surgeon and other personnel must remain cognizant of the system's line-of-sight requirements.
Further, as a general matter, satisfactory positioning of a main implant, such as a plate or nail, cannot be defined pre-operatively. For example, during an operation, positioning can be done by haptic match on the bone surface or by reaming the bone to make space for an intra-medullary nail. In addition, although the position of the sub-implant(s) might be based only on pre-operative images (e.g., fluoroscope or CT images), such position is still relative to the position of the main implant. Thus, a positioning procedure cannot be completely planned pre-operatively, but must be optimized during the operation. In this regard, classical stereotaxis cannot be used because the position cannot be predefined.
Accordingly, there is a need for a computer assisted surgery (CAS) system that enhances surgical procedures without significantly disrupting the normal OR workflow. More specifically, there is a need for a combined 3D imaging and CAS system which can be easily and readily integrated into the clinical environment. Preferably, such a system would be low cost, easy to set-up and use, and minimize changes to the OR workflow.
An aspect of the present invention is a reactive method for stereotactic surgery. The method preferably comprises positioning an implant associated with a reference body on a region of interest of a patient's anatomy; detecting information associated with the implant using an imaging system; determining, based on the detected information associated with the implant, an action to be taken as part of the surgery; and displaying positional information associated with the implant and the region of interest based on the action to be taken.
In accordance with this aspect of the present invention, positioning comprises acquiring two fluoroscope images of the region of interest at two different angles.
Further in accordance with this aspect of the present invention, displaying further comprises processing detected information associated with the implant by estimating the contours of the region of interest in at least two dimensions based on the plurality of two-dimensional images.
Further still in accordance with this aspect of the present invention, detecting comprises detecting the presence of the reference body based on one or more fiducial markers.
In another aspect, the present invention is a method for stereotactic surgery. The method preferably comprises positioning a medical device associated with a reference body proximate a region of interest of a portion of an anatomy of a subject and imaging the region of interest at two or more angles to obtain a plurality of two-dimensional images. In a preferred embodiment, the reference body comprises a plurality of fiducial markers, most preferably at least four such markers that are visible to the imaging system. It is further preferred that the fiducial markers comprise spheres.
In accordance with this aspect of the present invention, the plurality of two-dimensional images are processed to produce three dimensional information associated with the region of interest. In addition, the method further preferably includes associating, based on the three dimensional information, a virtual medical device with the region of interest and the reference body and displaying the association as an image showing the virtual medical device superimposed onto the region of interest.
Further in accordance with this aspect of the present invention, the virtual medical device comprises a main implant and one or more sub-implants. In addition, the virtual main implant is superimposed over the current location of the actual implant, and the virtual sub-implants are generated so as to show their future positions. Accordingly, the virtual sub-implants inform the surgeon of where the actual sub-implants will be located before they are placed in the region of interest.
In accordance with this aspect of the present invention, imaging preferably comprises acquiring two fluoroscope images of the region of interest at two different angles. In addition, processing further preferably comprises estimating the contours of the region of interest in at least two dimensions based on the plurality of two-dimensional images.
Further in accordance with this aspect of the present invention, processing may further comprise forming a three dimensional image associated with the region of interest based on the estimation. In a further preferred aspect, the present invention may be applied to a surgical implant procedure wherein the region of interest comprises a femoral head, the plurality of two dimensional images comprise anterior-to-posterior and axial images of the femoral region and estimating comprises forming an outline of the femoral head on the anterior-to-posterior and axial images. In this regard, the method may further comprise forming parts of a three dimensional sphere representing important portions of the femoral head.
Further still in accordance with this aspect of the present invention, the medical device preferably comprises an intracapsular plate and the reference body is connected to the plate, and positioning comprises positioning the intracapsular plate on a femur proximate the femoral head. In addition, the virtual medical device preferably comprises a virtual intracapsular plate and displaying comprises showing the virtual intracapsular plate superimposed on the position of the intracapsular plate in relation to the femoral head.
In another aspect, the present invention is a computer assisted surgical system, comprising: an apparatus for imaging a region of interest of a portion of an anatomy of a subject; a memory containing executable instructions; and a processor programmed using the instructions to perform a method. In this regard, the processor preferably receives two or more two-dimensional images of the region of interest taken at different angles from the apparatus, processes the two or more two-dimensional images to produce three dimensional information associated with the region of interest, superimposes a virtual reference body onto the region of interest based on the three dimensional information to form an image showing the virtual reference body relative to the region of interest, and generates a display signal associated with the superimposed image. Preferably, the reference body is first detected and superimposed onto an object that models the region of interest, e.g., a sphere for a femoral head; the display signal is then generated.
In accordance with this aspect of the present invention, the processor preferably processes the two or more two-dimensional images by outlining the contours of the region of interest in two dimensions and creating a three dimensional object representing the region of interest. The three dimensional object may be derived from a database and based on the age and gender of the patient. The three dimensional object may also be determined based on landmarks associated with the region of interest.
Further in accordance with this aspect of the present invention, a medical device may comprise a device selected from the group consisting of an intracapsular plate, an artificial joint, a pacemaker and a valve.
In another aspect, the present invention is a system and method of computer assisted surgery (CAS) using stereotactic navigation with three-dimensional visualization, wherein an implant or implant system acts as a stereotactic device. The invention provides a reactive CAS system designed for use with mono-axial and poly-axial plates and nails. Based on the principles of stereotactics and 2D-3D matching, the system calculates the optimal position of an implant and virtually suggests or otherwise indicates that position. In addition, the system may also calculate screw lengths before drilling. Aided by image processing and virtual 3D visualization, the system can achieve optimal biomechanics.
In addition, unlike existing navigation systems, the CAS system of the present invention is designed to be reactive so as to reduce any additional effort for the surgeon. In particular, the system may be triggered by use of a reference body, implant, K-wires, or screws that are normally used as part of the surgical procedure. By detecting these devices, the system is able to determine which step in the workflow is being performed. More specifically, image processing is used to detect various objects during the workflow, determine which step is being performed by the surgeon, and adapt the system accordingly.
In another aspect, the system provides necessary 3D information without the need for intra-operative 3D imaging (e.g. 3D C-arms). The system is also low cost, easy to set-up and use, and minimizes changes to the OR workflow. The present system also requires fewer X-ray images and is therefore safer for patients.
In another aspect, the invention makes use of an iterative procedure which, for the example of using an ICP to fix a femoral neck fracture, includes one or more of the following steps:
These and additional aspects and features of the present invention are described in further detail below.
Generally, in one aspect, the system of the present invention is based on the registration of fluoroscopy images with an implant associated with a reference body. For example, the implant (e.g., an angle stable plate) may include the reference body or may be positioned in a predefined location in relation to the reference body, which is detected or recorded in a fluoro image. Thus, the actual spatial dimension and position of the implant can be determined by means of the correct identification and registration of the reference body in the fluoro images.
Where multiple related implants are included as part of the procedure, e.g., main implants and sub-implants, after registration of the main implant as described above, the location of any remaining sub-implants may be depicted virtually in the correct spatial position in relation to the fluoro images of the main implant. The sub-implants (e.g., screws of the associated angle stable plate) will be located in a fixed, pre-defined position in relation to the main implant after all implants have been implanted.
In order to provide the information necessary for an anatomically correct location of all (main and sub-) implants, important anatomical regions are approximated using three dimensional bodies or objects depicted in the fluoro image in the correct relative position. Target values are compared with the locations of the remaining implants, which are used in determining the current position of the main implant.
During pre-operative planning (for example, using a non-invasively applied reference body), the partial or sub-implants (e.g., screws) can first be placed in an optimum position, independent of the location of the main implant (plate). In a subsequent operation (using an invasive reference body), where the main implant location has been determined by the pre-operative planning (with a position estimate derived by the surgeon), the location of the main implant can be optimized using haptic feedback. After the registration described above, the resulting locations of the partial or sub-implants are depicted virtually; these positions are compared to the positions of the partial implants in the pre-operative plan and to the distances to important anatomical (three-dimensional) structures. In a reactive, iterative process (adjusting the plate as instructed by the system), it is possible to determine the optimum balance between an ideal main implant location (for example, plate fit) and an ideal partial implant position (for example, screw location).
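The balance described above can be thought of as minimizing a combined cost. The following is a minimal, illustrative sketch (not the patented algorithm) of scoring a candidate main-implant pose by weighing plate fit against sub-implant (screw) placement; the error terms, weights, and function names are assumptions introduced here purely for illustration.

```python
# Illustrative sketch only: scoring a candidate main-implant pose by trading off
# plate fit against sub-implant (screw) placement. Weights and error functions are
# assumptions chosen for illustration.
import numpy as np

def plate_fit_error(plate_points, bone_surface_points):
    """Mean distance from sampled plate contact points to the nearest bone-surface point."""
    p = np.asarray(plate_points, float)
    s = np.asarray(bone_surface_points, float)
    d = np.linalg.norm(p[:, None, :] - s[None, :, :], axis=2)
    return d.min(axis=1).mean()

def screw_placement_error(screw_tips, planned_tips):
    """Mean deviation of predicted screw-tip positions from the pre-operative plan."""
    return np.linalg.norm(np.asarray(screw_tips, float) - np.asarray(planned_tips, float), axis=1).mean()

def combined_cost(plate_points, bone_surface_points, screw_tips, planned_tips,
                  w_fit=1.0, w_screw=1.0):
    # Lower is better: balances ideal plate fit against ideal screw position.
    return (w_fit * plate_fit_error(plate_points, bone_surface_points)
            + w_screw * screw_placement_error(screw_tips, planned_tips))
```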
Turning now to
Memory 160 stores information accessible by processor 150, via bus 162 for example, including instructions 164 for execution by the processor 150 and data 166 which is retrieved, manipulated or stored by the processor 150. The memory 160 may be of any type capable of storing information accessible by the processor 150, such as a hard-drive, ROM, RAM, CD-ROM, write-capable, read-only, or the like. The instructions 164 may comprise any set of instructions to be executed directly (such as machine code) or indirectly (such as scripts) by the processor. In that regard, the terms “instructions,” “steps” and “programs” may be used interchangeably herein. The functions, methods and routines of the program in accordance with the present invention are explained in more detail below.
Data 166 may be retrieved, stored or modified by processor 150 in accordance with the instructions 164. The data may be stored as a collection of data. For instance, although the invention is not limited by any particular data structure, the data may be stored in computer registers, in a relational database as a table having a plurality of different fields and records, or as an XML document. The data may also be formatted in any computer readable format such as, but not limited to, binary values, ASCII or EBCDIC (Extended Binary-Coded Decimal Interchange Code). Moreover, any information sufficient to identify the relevant data may be stored along with the data, such as descriptive text, proprietary codes, pointers, or information which is used by a function to calculate the relevant data.
Although the processor 150 and memory 160 are functionally illustrated in
As shown, computer device 120 may comprise additional components typically found in a computer system such as a display (e.g., an LCD screen), user input (e.g., a keyboard, mouse, game pad, touch-sensitive screen), microphone, modem (e.g., telephone or cable modem), and all of the components used for connecting these elements to one another.
As is also shown in
In another aspect, the present invention addresses a problem with the current technique of ICP implantation, namely accurately positioning the plate using two-dimensional (2D) images. This problem is in part due to the dangerous screw placement needed to avoid cutouts: specifically, the ends/tips of the screws need to be set as close as possible to the second cortex. However, the 2D images used by the surgeon do not reflect the 3D nature of the problem.
In one aspect, the present invention provides a system and method which generates 3D information from the 2D imagery to allow for more accurate positioning of a medical device, e.g., an implant, and thereby avoiding the above problems. Generally, as used herein, the term medical device includes any biomedical device or structure that is introduced or implanted into the anatomy of a subject. Such devices include those that replace or act as missing biological structures, or that are placed over or within bones or portions of the anatomy. As mentioned above, the present invention is described using the illustrative example of implanting an intracapsular plate (ICP) to repair a femoral neck fracture. Note, however, that the invention may find application in numerous surgeries, including virtually all fields of bone surgery (e.g., trauma, orthopedics, and pediatrics).
By way of background, it is generally known that fractures are usually repaired by reduction and fixation of the broken bones. The individual fragments of bone are aligned in their normal anatomical position (i.e., reduced) so that separated parts can grow together again. It is necessary that the parts remain relatively stable with respect to each other over an extended period of time to allow for healing. In some cases, particularly for more complicated fractures, it is necessary to connect the individual broken bone pieces directly to one another. In these cases, the fracture is fixed or reduced via an invasive procedure wherein an implant is installed within the body with screws or nails.
Turning now to
Using the image of the virtual implants, the surgeon may then affix the implant, using the sub-implants for example, as is depicted at S424. Once the sub-implants (e.g., screws) and implants are in place, the system may perform a quality check, at S428, by detecting and displaying the actual location of these implants relative to their desired position. This quality check is desirable given that, during implantation, the position of an implant or sub-implant may change from its ideal position due to mechanical forces during, for example, drilling or screw placement, or as a result of movement by the patient. In this regard, quality checks, such as the step at S428, may also be performed during affixation of the implant at step S424. Additionally, quality checks may also be performed post-operatively using the system to detect movement in the implant caused by, for example, patient activity.
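By way of illustration only, such a quality check might compare the detected position and axis of an implant with its planned counterparts and flag deviations beyond a threshold; the threshold values and function names below are assumptions introduced for this sketch.

```python
# Illustrative sketch of a quality check: compare detected vs. planned implant pose
# and flag deviations beyond assumed thresholds (not the system's actual criteria).
import numpy as np

def quality_check(detected_tip, planned_tip, detected_axis, planned_axis,
                  max_shift_mm=2.0, max_angle_deg=3.0):
    """Return (passed, shift_mm, angle_deg) for one implant or sub-implant."""
    shift = float(np.linalg.norm(np.asarray(detected_tip, float) - np.asarray(planned_tip, float)))
    a = np.asarray(detected_axis, float); a /= np.linalg.norm(a)
    b = np.asarray(planned_axis, float); b /= np.linalg.norm(b)
    angle = float(np.degrees(np.arccos(np.clip(np.dot(a, b), -1.0, 1.0))))
    return (shift <= max_shift_mm and angle <= max_angle_deg), shift, angle
```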
Significantly, the above method 400 is reactive in that the surgeon is not required to inform the system 100 of which step he/she is performing as part of the OR workflow. In this regard, this system 100 is compatible with the normal OR workflow and is able to determine the step in the OR workflow that is being performed by, for example, detecting the presence of a reference body or object.
Turning now to
In this regard,
In accordance with an aspect of the present invention, prior to insertion of the main implant within the region of interest, the main implant 510 is connected to a reference body or object. The reference body is preferably attached to (or part of) the implant, but may also be attached to an aiming device or instrument (e.g., a drill guide). In this way, the position of the implant may be determined based on the location and position of the reference body. Preferably, each implant is associated with a different reference body that is detectable by the system 100, in particular the fluoroscope 110. In a preferred embodiment, the reference body comprises a plurality of spherical fiducial markers inserted on or in the instrument (e.g., aiming device). By arranging the fiducial markers in a predetermined pattern, they may serve as identifiers for different instruments. In addition, the size and shape of the fiducial markers may also serve as identifiers. In this regard, the fiducial markers and instrument may be conveniently referred to as a reference body—though the fiducial markers are what provide the reference.
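As one way to picture how a predetermined marker arrangement can identify an instrument, the sketch below matches the pairwise distances between detected marker centers against a catalogue of known patterns. The catalogue, tolerance, and function names are hypothetical and introduced only for illustration; they are not part of the described system.

```python
# A minimal sketch, assuming each reference body is characterized by the pairwise
# distances between its fiducial markers (the "predetermined pattern" described above).
import numpy as np

def pattern_signature(marker_positions):
    """Sorted pairwise distances between marker centers; invariant to rotation/translation."""
    p = np.asarray(marker_positions, float)
    i, j = np.triu_indices(len(p), k=1)
    return np.sort(np.linalg.norm(p[i] - p[j], axis=1))

def identify_reference_body(detected_markers, catalogue, tol_mm=1.5):
    """Return the name of the catalogued reference body whose signature best matches."""
    sig = pattern_signature(detected_markers)
    best_name, best_err = None, np.inf
    for name, pattern in catalogue.items():
        ref_sig = pattern_signature(pattern)
        if len(ref_sig) != len(sig):
            continue  # different number of markers; cannot be this reference body
        err = np.abs(ref_sig - sig).max()
        if err < best_err:
            best_name, best_err = name, err
    return best_name if best_err <= tol_mm else None
```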
For example,
Returning to
As the implant and reference body are located within the field of view of the imaging device, given their proximity to the region of interest, the fluoroscope 110 detects the presence of the reference body, i.e., the fiducial markers. Computer 120 then uses the image data it receives from the fluoroscope 110 to provide a visualization of the location of the implant relative to the region of interest. In particular, registration of the fluoroscopic images is performed using the reference body. As discussed above, the reference body is typically in a fixed position relative to the implant and bone. Usually, a three dimensional reference is attached to the image intensifier and visible in the X-ray image to determine the center of the X-ray beam and reduce distortion. As an alternative to using such a three dimensional reference body, a disk with fiducial markers may be used as a reference and may also provide compensation for distortion. In this latter embodiment, determination of the center of the X-ray beam may then be provided by the reference body in the implant system. In addition, where digital image intensifiers are used, a disk is not necessary.
Determination of the position of the implant relative to the anatomical region of interest is done using known image processing techniques based on the spatial variation of the radiation arriving at the detector, including the radiation directed at the region of interest and the reference body. Using this spatial variation, the computer is able to construct an image that accurately depicts the spatial relationship between the implant and the region of interest (e.g., the femur and femoral head) as a two-dimensional image.
Upon viewing this image, the surgeon may then determine if the implant should be re-positioned, as at step S438. For example, the surgeon may decide to adjust the position along the length of the femur closer to the femoral head, or along another degree of freedom. If the surgeon decides such an adjustment is warranted, he/she repositions the implant as is shown at step S440 and additional fluoroshots are taken at step S434. On the other hand, if the surgeon determines that no adjustment is needed along this dimension, the procedure continues at step S442 with stabilization of the implant. In keeping with the example, stabilization could be effected by insertion of a Kirschner wire (K-wire) through one or more openings in the ICP.
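For illustration only, the registration step can be sketched as a pose estimation problem: given the known 3D geometry of the fiducial markers and their detected 2D positions in the fluoro image, the pose of the reference body (and hence of the implant) relative to the C-arm can be estimated. The sketch below models the C-arm as a pinhole camera and uses OpenCV's solvePnP; both are simplifying assumptions and not necessarily the registration method of the invention.

```python
# Hedged sketch: estimate the pose of the reference body from its detected fiducial
# markers, treating the C-arm as a pinhole camera (a simplifying assumption).
import numpy as np
import cv2

def register_reference_body(marker_model_mm, detected_px, camera_matrix):
    """Estimate rotation/translation of the reference body relative to the C-arm.

    marker_model_mm: (N, 3) known marker coordinates in the reference-body frame (N >= 4).
    detected_px:     (N, 2) corresponding marker centers detected in the fluoro image.
    camera_matrix:   3x3 intrinsic matrix approximating the C-arm projection.
    """
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(marker_model_mm, dtype=np.float64),
        np.asarray(detected_px, dtype=np.float64),
        camera_matrix, None)
    if not ok:
        raise RuntimeError("pose estimation failed")
    rotation, _ = cv2.Rodrigues(rvec)   # 3x3 rotation matrix
    return rotation, tvec               # pose of reference body (and hence implant)
```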
With the implant fixed as described above, a fluoroshot may then be taken along a different dimension, step S446. In particular, if the fluoroshots in step S434 were taken along the anterior posterior direction, in step S446 they may be taken along the axial direction or at another angle. In this regard, as part of step S402, it may be sufficient to use a single image for this step to optimize position along only one degree of freedom (e.g., a distal shift of the implant) where 3D information is not needed.
Upon completion of the fluoroshot at step S446, the surgeon may then view an image of the position of the implant. If it is determined that the implant needs to be adjusted at step S448, e.g., rotated in the case of an ICP, the procedure returns to step S446 and additional fluoroshots are taken along this dimension. Once the surgeon is satisfied that the implant is appropriately positioned based on images obtained along this dimension, the procedure continues at step S450 with additional stabilization of the implant. For example, where the implant or medical device is an ICP, K-wires may be inserted through additional openings in the ICP. As a result of the foregoing procedure, the ICP or other implant may be positioned by the surgeon iteratively and in accordance with normal OR workflow procedures. That is, the surgeon may repeat any steps within the procedure until the implant is appropriately positioned.
With the implant positioned as described above in relation to step S402, the method then continues as shown at step S408 of
In accordance with this aspect of the present invention, the resulting 2D images are processed to locate and outline a three dimensional contour, i.e., a sphere, of the femoral head. For example,
In addition, using the 2D images, the computer 120 then determines and generates a 3D object that is associated with and models the region of interest, step S456, in accordance with another aspect of the present invention. In particular,
The visualization also allows the surgeon to manually adjust the position of the actual ICP if better alignment is considered necessary. For example,
As discussed above, the present invention is reactive in that the system reacts to the surgeon in lieu of requiring the surgeon to take action or interact with the computer or system. As such, if the surgeon decides that the implant is properly aligned, he/she can then decide to secure the implant and complete the procedure. This minimizes disruptions in current OR workflow and allows the surgeon to use his/her judgment as part of the workflow. In contrast, conventional approaches tend to disrupt the OR workflow by requiring the surgeon to interact with the CAS. This typically lengthens the surgical procedure and requires more in the way of equipment, both of which increase the cost of surgical procedures.
Upon completion of the steps outlined above in relation to step S408, the procedure continues to step S424, where the implant may be affixed to the region of interest. Additional fluoroshots may be taken during or after step S424 to verify reduction of the fracture and the position of the ICP and K-wires or screws. For example,
As discussed above, Kirschner wires (K-wires) can be inserted through openings in the reference body 604. More specifically, as shown in
Alternatively, an aiming apparatus with scaling, in combination with an oblong-shaped hole (wherein a K-wire may be inserted), may be directly attached to the ICP and used to assist in the mounting and any further adjustment deemed necessary by the surgeon.
Turning now to
As is shown in
Once the surgeon is satisfied with translational alignment of the nail 1654, he may then use the system to rotationally align the nail as is shown in
Once the surgeon determines that the nail 1654 is suitably aligned, he may then insert a K-wire 1687 as is shown in
Based on the tick marks 1693 shown in
The image processing performed by the invention includes: anatomic feature detection and segmentation; position detection of the reference body; generation of 3D information from 2D images; registration, rendering and display of 3D information on 2D images; and calculation of the optimal position of the implant. In addition, in another aspect, the system may propose an appropriate length for each screw.
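As a rough illustration of how a screw length might be proposed from the 3D model, the sketch below intersects the screw axis (known from the threaded plate hole) with the sphere approximating the femoral head and keeps a safety margin short of the joint surface. The margin value and the ray/sphere formulation are assumptions made here for illustration, not the system's actual calculation.

```python
# Illustrative sketch: propose a screw length by intersecting the screw axis with a
# sphere model of the femoral head, shrunk by an assumed safety margin.
import numpy as np

def propose_screw_length(entry, direction, sphere_center, sphere_radius, margin_mm=5.0):
    """Longest screw (in mm) along the given axis that stays margin_mm inside the sphere."""
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    oc = np.asarray(entry, float) - np.asarray(sphere_center, float)
    # Solve |entry + t*d - center|^2 = (radius - margin)^2 for the far intersection t.
    r = sphere_radius - margin_mm
    b = 2.0 * np.dot(d, oc)
    c = np.dot(oc, oc) - r * r
    disc = b * b - 4.0 * c
    if disc < 0:
        return None                      # axis misses the (shrunken) sphere
    t_far = (-b + np.sqrt(disc)) / 2.0
    return round(float(t_far), 1) if t_far > 0 else None
```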
As discussed above, at least two 2D images containing the reference body are required by the invention to provide 3D information. These images should be taken at different angles (preferably near a 90 degree angle). Additional 2D images can also be used to provide information. The images can be registered to one another by detecting distinctive anatomic features in the images and/or by using the reference body. The reference body (which occurs in each image) can be used to precisely register the images in three dimensions. The reference body can also be helpful in automatically detecting these anatomic structures for segmentation (e.g. detecting feature borders). The relative position of specific anatomical structures to the position of the reference body may also be estimated based on general bone shape statistics and on patient data (e.g. size, gender, age). This relative position may be used as a starting point for the segmentation algorithms. Once the anatomic structures have been segmented, the image processing software can correlate the structures from different images to generate 3D information.
Various three-dimensional reconstruction algorithms can be used to generate this information. Typically, the algorithms will approximate the segmented anatomic features with geometric shapes (e.g., a circle). The geometric shapes are then matched/registered to their known relative positions in the 2D images. These shapes are then projected into 3D space to form, for example, a sphere or cylinder. The invention may initially select a typical 3D shape for an anatomic region from a database and match it with the image by zooming, rotating, and/or translating the shape. The shape may also be altered, such as with a morphing algorithm, for a better match. In fact, pre-operative images may be taken of the same anatomic region to better determine the actual shape of various features.
Because the reference body is located within each image and is attached to an anatomic region (e.g., a bone), movement of the patient during surgery is not a problem in accordance with an aspect of the present invention. This is because the system can use the location of the reference body to register the different fluoroscope images (independent of the image content) and generate a low-artifact, real 3D image using 3D reconstruction algorithms. This ability to precisely register the images significantly reduces artifacts due to patient movement during surgery.
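A much-simplified sketch of the circle-to-sphere idea is shown below: a least-squares (algebraic) circle fit is applied to the segmented femoral-head outline in each of two views, and the results are combined into a sphere. The assumption that the AP and axial views map directly onto orthogonal, registered planes is a simplification for illustration; in the described system the views are registered via the reference body.

```python
# Simplified sketch of reconstructing a sphere model of the femoral head from two
# outlined 2D views, assuming the views correspond to orthogonal, registered planes.
import numpy as np

def fit_circle(points_2d):
    """Algebraic (Kasa) least-squares circle fit; returns (center_xy, radius)."""
    p = np.asarray(points_2d, float)
    A = np.column_stack([2 * p[:, 0], 2 * p[:, 1], np.ones(len(p))])
    b = (p ** 2).sum(axis=1)
    (cx, cy, c0), *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.array([cx, cy]), np.sqrt(c0 + cx * cx + cy * cy)

def sphere_from_two_views(ap_outline, ax_outline):
    """Combine AP (assumed x-z plane) and axial (assumed x-y plane) outlines into a sphere."""
    ap_c, ap_r = fit_circle(ap_outline)   # gives x and z of the head center
    ax_c, ax_r = fit_circle(ax_outline)   # gives x and y of the head center
    center = np.array([(ap_c[0] + ax_c[0]) / 2.0,  # x is seen in both views
                       ax_c[1],                    # y from the axial view
                       ap_c[1]])                   # z from the AP view
    return center, (ap_r + ax_r) / 2.0
```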
Preoperative planning may be performed by taking pre-operative images similar to the intra-operative images. This pre-operative planning can be used to determine the optimal sub-implant positioning which may then be checked against the intra-operative positioning. Such pre-operative images could be processed using different algorithms which are too time consuming to use during surgery or could be segmented and matched manually.
As discussed above, the invention may also provide a reactive workflow by automatically detecting the status of an operation and thus knowing the next operative steps to be performed. In this manner, the invention might provide suggestions to the surgeon. For example, the invention may suggest a specific type, size, or shape of a best-fit implant based on the detected geometry of a fracture. Moreover, the invention could modify a previous suggestion based on additional information determined during the surgery.
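A minimal sketch of this reactive behavior, assuming a hypothetical set of workflow steps and detection labels, is shown below; the actual steps would be defined by the surgical workflow and the objects the imaging system can detect.

```python
# Minimal sketch: infer the current workflow step from objects detected in the latest
# fluoro image, rather than asking the surgeon. Step names and labels are hypothetical.
def infer_workflow_step(detected_objects):
    """Map the set of detected objects to the most advanced workflow step they imply."""
    objs = set(detected_objects)
    if "screw" in objs:
        return "quality_check"          # implants fixed; verify final position
    if "k_wire" in objs:
        return "plan_sub_implants"      # plate stabilized; plan/verify screw trajectories
    if "reference_body" in objs:
        return "position_main_implant"  # plate with reference body visible; guide positioning
    return "acquire_images"             # nothing detected yet; wait for the next fluoro shot

# Example: detecting the reference body plus a K-wire implies the plate is stabilized.
print(infer_workflow_step({"reference_body", "k_wire"}))   # -> plan_sub_implants
```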
Additional distinctive aspects of the invention include that the stereotactic device is implanted in the body. In addition, the invention uses 2D images (e.g., fluoroscopic X-rays) to generate 3D information. The reference plate (ICP) is contoured to match the surface contour of the bone, which restricts the degrees of freedom for adjustments. The reference plate (ICP) is also threaded so that the relative screw positions are known. The invention calculates and proposes the reference plate position, sphere position, and screw positions and lengths.
Advantages of the invention include that it reduces the surgery time for insertion of an implant, requires almost no interaction between the surgeon and the system, provides three-dimensional information on important regions, requires little change to operating room procedures, and is cheaper than current tracking based navigation.
Additional features of the invention include that it takes into account any bending of Kirschner wires (K-wires) through automatic detection, calculates and displays any dislocation of the femoral head during implantation, and calculates the screw lengths.
Although the invention herein has been described with reference to an ICP procedure, it is to be understood that this embodiment is merely illustrative of the principles and applications of the present invention. It is therefore to be understood that numerous modifications may be made to the illustrative embodiment and that other arrangements may be devised without departing from the spirit and scope of the present invention.
This application claims the benefit of the filing date of U.S. Provisional Patent Application No. 61/010,543 filed Jan. 9, 2008, the disclosure of which is hereby incorporated herein by reference.