System and method for the registration of an anatomical feature

Information

  • Patent Grant
  • Patent Number
    10,624,764
  • Date Filed
    Monday, November 28, 2016
  • Date Issued
    Tuesday, April 21, 2020
Abstract
A computer-assisted surgery (CAS) system for navigating a surface of an anatomical feature in a coordinate system comprises an apparatus for obtaining points of a surface of an anatomical feature, including a base adapted to be secured to the anatomical feature, a spherical joint supported by the base, the spherical joint having a ball member rotatable in at least two rotational degrees of freedom relative to the base and having a center of rotation fixed relative to the base, and a distance-measurement device connected to the ball member such that a distance-measurement axis of the distance-measurement device passes through said center of rotation of the ball member. An inertial sensor unit produces signals representative of the orientation of the distance-measurement device. A CAS processor receives the signals from the inertial sensor unit and outputs orientation data relating at least an object to the surface of the anatomical feature, using a model of the surface in the coordinate system and the signals from the inertial sensor unit.
Description
TECHNICAL FIELD

The present application relates to computer-assisted surgery using inertial sensors and more particularly to referencing inertial sensors relative to a bone, for subsequent alterations to the bone.


BACKGROUND OF THE ART

In arthroplasty, a bone is altered to subsequently receive thereon an implant. For example, in hip arthroplasty, the acetabular cup implant is received in the reamed acetabulum and serves as a receptacle for a femoral head or femoral head implant. Accordingly, tools such as a reamer and a cup impactor are used in the procedure.


One of the challenges in such procedures is to provide an adequate orientation of the tool or implant relative to the bone. An inaccurate orientation, for instance in the case of an acetabular cup implant, may result in a loss of movement, improper gait, and/or premature wear of implant components. For example, the acetabular cup is typically positioned in the reamed acetabulum by way of an impactor. The impactor has a stem at an end of which is the acetabular cup. The stem is handled by an operator, who impacts its free end so as to drive the acetabular cup into the acetabulum. It is however important that the operator hold the stem of the impactor in a precise three-dimensional orientation so as to ensure the adequate orientation of the acetabular cup, in terms of inclination and anteversion. Accordingly, the knowledge of the initial position and orientation of the bone relative to an inertial sensor unit can contribute to subsequent steps of altering the bone and positioning an implant thereon.


Computer-assisted surgery has been developed in order to help operators in positioning and orienting implants to a desired orientation. Among the various tracking technologies used in computer-assisted surgery, optical navigation, C-arm validation and manual reference guides have been used. Optical navigation requires a dedicated navigation system, which adds operative time, and is bound to line-of-sight constraints that hamper the normal surgical flow. C-arm validation requires bulky equipment, is not cost-effective, does not provide a quantitative assessment of the cup positioning, and is generally performed post-operatively as opposed to intra-operatively. Finally, manual jigs, such as an A-frame, do not account for the position of the patient on the operative table. Accordingly, inertial sensors are used for their cost-effectiveness and the valuable information they provide.


SUMMARY

Therefore, in accordance with a first embodiment of the present disclosure, there is provided an apparatus for obtaining points of a surface of an anatomical feature comprising: a base adapted to be secured to an anatomical feature; a spherical joint supported by the base, the spherical joint having a ball member rotatable in at least two rotational degrees of freedom relative to the base and having a center of rotation fixed relative to the base; a distance-measurement device connected to the ball member such that a distance-measurement axis of the distance-measurement device passes through said center of rotation of the ball member, the distance-measurement device configured for providing a distance of any point of the surface intersecting the distance-measurement axis; and at least one receptacle configured to receive an inertial sensor unit for determining an orientation of the distance-measurement device; whereby a position of any point is obtained using said distance and an orientation of the distance-measurement device as connected to the ball member at a measurement of said point.


In accordance with a second embodiment of the present disclosure, there is provided a method for modelling a surface of an anatomical feature in computer-assisted surgery (CAS) in a coordinate system, comprising: obtaining a distance between a reference position and any point of the surface of the anatomical feature; determining, using at least one inertial sensor unit and one or more processors of a CAS system, an orientation of an axis passing through the reference position and said any point of the surface of the anatomical feature; calculating a position of said any point using the orientation of the axis and the distance for said any point; repeating the obtaining the distance, the determining the orientation and the calculating a position for a plurality of points on the anatomical feature, with the reference position being fixed throughout the repeating; generating, using the at least one inertial sensor unit and one or more processors of the CAS system, the model of the surface in the coordinate system using at least the position of the plurality of points; and outputting, using the at least one inertial sensor unit and one or more processors of the CAS system, orientation data relating at least an object relative to the surface of the anatomical feature using the model of the surface in the coordinate system.


In accordance with a third embodiment of the present disclosure, there is provided a computer-assisted surgery (CAS) system for navigating a surface of an anatomical feature in a coordinate system comprising: an apparatus for obtaining points of a surface of an anatomical feature including a base adapted to be secured to the anatomical feature, a spherical joint supported by the base, the spherical joint having a ball member rotatable in at least two rotational degrees of freedom relative to the base and having a center of rotation fixed relative to the base, a distance-measurement device connected to the ball member such that a distance-measurement axis of the distance-measurement device passes through said center of rotation of the ball member, the distance-measurement device for providing a distance of any point of the surface intersecting the distance-measurement axis, and at least one receptacle for receiving an inertial sensor unit; at least one inertial sensor unit received in the receptacle of the apparatus, the at least one inertial sensor unit producing signals representative of the orientation of the distance-measurement device; a CAS processor receiving the signal from the at least one inertial sensor unit and including a distance module for obtaining a distance between a reference position and a plurality of points of the surface of the anatomical feature, an orientation module for determining, using the signal from the at least one inertial sensor unit, an orientation of the distance-measurement axis for each of the plurality of points, a position calculator module for calculating a position of each of the plurality of points using the orientation of the distance-measurement axis, the distance for each of the plurality of points, and the reference position being common to each of the plurality of points, a model generating module for generating the model of the surface in the coordinate system using at least the position of the plurality of points and the signals from the at least one inertial sensor unit, and a navigation module for producing orientation data relating at least an object relative to the surface of the anatomical feature using the model of the surface in the coordinate system and the signals from the at least one inertial sensor unit; and an output for outputting the orientation data.


In accordance with a fourth embodiment of the present disclosure, there is provided a CAS processor for modelling a surface of an anatomical feature in computer-assisted surgery (CAS) in a coordinate system, comprising: a distance module for obtaining a distance between a reference position and a plurality of points of the surface of the anatomical feature; an orientation module for determining, using signals from at least one inertial sensor unit, an orientation of an axis passing through the reference position and the plurality of points of the surface of the anatomical feature; a position calculator module for calculating a position of each of the plurality of points using the orientation of the axis and the distance for each of the plurality of points, the reference position being common to each of the plurality of points; a model generating module for generating and outputting, using the signals from the at least one inertial sensor unit, the model of the surface in the coordinate system using at least the position of the plurality of points; and a navigation module for producing and outputting, using the signals from the at least one inertial sensor unit, orientation data relating at least an object relative to the surface of the anatomical feature using the model of the surface in the coordinate system.





DESCRIPTION OF THE DRAWINGS


FIG. 1A is a perspective view of an apparatus of a CAS system on a pelvis;



FIG. 1B is a series of perspective views of the apparatus of the CAS system, with a tab handle assisting in the positioning of the apparatus on the pelvis;



FIG. 2 is a perspective enlarged view showing a connection of the apparatus to the pelvis;



FIG. 3 is an assembly view of the apparatus of FIG. 1A, with a bracket;



FIG. 4 is a perspective view showing a relation between the apparatus of FIG. 1A and a distance-measuring device;



FIG. 5 is a block diagram of a CAS processor for modelling a surface of an anatomical feature in computer-assisted surgery in a coordinate system; and



FIG. 6 is a flow chart illustrating a method for modelling a surface of an anatomical feature in computer-assisted surgery in a coordinate system.





DETAILED DESCRIPTION

Referring to the drawings, a method for referencing an inertial sensor unit relative to an anatomical feature in computer-assisted hip surgery is generally shown. Although the example provided herein relates to hip surgery, with the anatomical feature being the pelvis, other types of surgery may benefit from the method and instrumentation of the present disclosure. The purpose of the method is to enable accurate navigation of instruments used in hip arthroplasty or like procedures using inertial sensors.


As an initial point, the bone may be modeled. The imaged model may be obtained and/or generated using imaging. The imaging may be done by any appropriate technology, such as CT scanning (computerized tomography), fluoroscopy, or like radiography methods, providing suitable image resolution. The model of the bone may include the surface geometry of the surface to be altered and of other exposed parts of the bone. In particular, if applicable, a combination of radiography and magnetic resonance imaging (MRI) may provide a suitable resolution between bone and cartilage, useful to recognize the boundaries of the cartilage relative to the bone. The bone modeling may comprise generating a 3D surface of the bone if the bone modeling is not directly performed by the imaging equipment, or if incomplete. The model may alternatively be composed of a two-dimensional (2D) outline instead of a three-dimensional (3D) surface, as such a 2D outline may provide sufficient data to determine how a reference will be secured to the bone.


In the case of hip arthroplasty, the pelvis may be imaged as a whole, or key parts may be more detailed in a generic model. For example, if given bone landmarks will be used to facilitate navigation or as abutment surfaces, the model may feature additional resolution for such landmarks. In hip arthroplasty, examples would be the acetabulum and its surroundings, as the acetabulum receives the cup implant, and the iliac crests (e.g., the anterior-superior iliac spines, ASIS), as they are landmarks often used to guide an operator in orienting tools (e.g., the impactor). The 3D imaged bone model may also include an orientation of the anatomical feature, such as a coordinate system. In the case of the pelvis, the coordinate system may include, for example, a medio-lateral axis passing through the anterior-superior iliac spines (ASIS) and a cranial-caudal axis using the position of the pubic tubercle relative to the ASIS, among other possibilities, as determined using the images and added to the imaged bone model. The anterior-posterior axis would be obtained as the normal to the plane including the medio-lateral axis and the cranial-caudal axis. Therefore, the orientation of the anatomical feature is a virtual orientation that may be part of the virtual 3D model of the surface.
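As a hedged illustration of how such a pelvic coordinate system can be derived from landmark positions, the following sketch builds the medio-lateral, cranial-caudal and anterior-posterior axes from the two ASIS and the pubic tubercle. The function name, the orthogonalization step and the numeric landmark values are assumptions made for illustration only; the sign conventions would have to match those chosen during planning.

```python
# Minimal sketch: deriving a pelvic coordinate system from three landmarks.
# The landmark coordinates below are illustrative placeholders, not patient data.
import numpy as np

def pelvic_frame(asis_right, asis_left, pubic_tubercle):
    """Return a 3x3 matrix whose columns are the medio-lateral,
    anterior-posterior and cranial-caudal unit axes."""
    ml = asis_left - asis_right                   # medio-lateral axis through the ASIS
    ml /= np.linalg.norm(ml)
    mid_asis = 0.5 * (asis_right + asis_left)
    cc = mid_asis - pubic_tubercle                # rough cranial-caudal direction
    cc -= np.dot(cc, ml) * ml                     # make it orthogonal to medio-lateral
    cc /= np.linalg.norm(cc)
    ap = np.cross(ml, cc)                         # anterior-posterior as the plane normal
    return np.column_stack((ml, ap, cc))

# Illustrative landmark positions (mm) in an arbitrary image coordinate system.
R_pelvis = pelvic_frame(np.array([110.0, 20.0, 5.0]),
                        np.array([-110.0, 22.0, 4.0]),
                        np.array([0.0, -15.0, -60.0]))
print(R_pelvis)  # columns: medio-lateral, anterior-posterior, cranial-caudal
```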


Referring to FIG. 1A, a system for navigating instruments in computer-assisted hip surgery is shown, and is of the type used to implement the method, as will be detailed below. The system comprises a computer-assisted surgery (CAS) processing unit 1, shown as a stand-alone unit in FIG. 1A. It is however pointed out that the CAS processing unit 1 may be integrated into one or more inertial sensor units, such as 14 and 23 described hereinafter, also known as pods, mounted to the various devices and instruments of the system, namely an apparatus 10 and a distance-measuring device 20.


The inertial sensor units incorporating the processing unit 1 may thus be equipped with user interfaces to provide the navigation data, whether it be in the form of LED displays, screens, numerical displays, etc. Alternatively, the inertial sensor units may be connected to a stand-alone CAS processing unit 1 that would include a screen or like monitor. The inertial sensor units may be known as micro-electro-mechanical sensors (MEMS) and may include one or more accelerometers, gyroscopes, inclinometers, magnetometers, among other possible inertial sensors. The inertial sensor units are of the type providing orientation data along 3 axes, hence tracking three rotational degrees of freedom of movement. The CAS processing unit 1 may comprise geometrical data for some of the devices and instruments. Accordingly, when an inertial sensor unit is mounted to one of the devices and instruments, the relation between the device/instrument and a coordinate system of the inertial sensor unit is known. For example, the relation is between an axis or a 3D coordinate system of the device/instrument and the coordinate system of the inertial sensor unit. Moreover, the inertial sensor units may be portable and detachable units, used with one device/instrument, and then transferred to another device/instrument, preserving in the process orientation data of a global coordinate system, using for example dead-reckoning tracking with readings from the inertial sensor unit(s). The navigation of instruments is intended to mean tracking at least some of the degrees of freedom of orientation in real-time or quasi-real time, such that the operator is provided with data calculated by computer assistance.
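Because the geometrical relation between a device or instrument and its inertial sensor unit is known, an orientation reading from the sensor can be converted into the orientation of the device axis in the global coordinate system by composing it with that fixed relation. The short sketch below is an illustration only; the rotation matrix and the calibrated axis direction are assumed inputs, not values defined in the present disclosure.

```python
# Sketch: expressing an instrument axis in the global frame from an inertial reading.
# R_global_sensor is the sensor orientation reported by the inertial sensor unit;
# axis_in_sensor is the fixed, pre-calibrated direction of the instrument axis in the
# sensor's own frame (from the geometrical data). All values are illustrative.
import numpy as np

def instrument_axis_global(R_global_sensor, axis_in_sensor):
    axis = np.asarray(axis_in_sensor, dtype=float)
    return R_global_sensor @ (axis / np.linalg.norm(axis))

# Example: the sensor reports a 30-degree rotation about the global z axis,
# and the instrument axis coincides with the sensor's x axis.
c, s = np.cos(np.radians(30.0)), np.sin(np.radians(30.0))
R_global_sensor = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
print(instrument_axis_global(R_global_sensor, [1.0, 0.0, 0.0]))  # ~[0.866, 0.5, 0.0]
```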


The apparatus 10 and the distance-measuring device 20 are provided to assist in defining a model of a surface of the bone, in a coordinate system, with the inertial sensor unit(s) of the apparatus 10 remaining active afterwards to track other tools relative to the bone. Other devices may be used to assist in the positioning of the apparatus 10 on the pelvis, such as a tab handle 30. Moreover, other devices may be used subsequently to complete the surgical procedure, such as drills, impactors, reamers, guiding pins, etc.


Referring to FIGS. 1A, 1B and 2, the apparatus 10 has a base 11 by which it is connected to a bone, such as the pelvis A featuring acetabulum B. For clarity, reference is made hereinafter to the case of hip surgery, as an example, even though the system and method may be applied to other bones. FIG. 1B illustrates the use of the tab handle 30, releasably secured to the base 11 of the apparatus 10, to position same against the pelvis. The tab handle 30 may stabilize the apparatus 10 over the acetabulum rim, to position the apparatus 10 on bone surfaces adjacent to the rim. The tab handle 30 may be used to hold the apparatus 10 at the chosen location while inserting fasteners such as screws 11A, for instance by having a contact portion defined to abut against the pelvis, in such a way that the apparatus 10 and the tab handle 30 concurrently grip the rim of the acetabulum. To assist in positioning the base 11 in a desired position against the anatomical feature, the base 11 may have a patient-specific surface, as is shown at 11B. The patient-specific surface 11B may be a contour-matching negative surface of the bone, defined using the 3D imaged bone model from pre-operative planning. However, the base 11 may also be without such a surface. The tab handle 30 may be removed once the apparatus 10 is secured to the pelvis A, its releasable connection being provided by a complementary tongue-and-groove joint shown as 31. As shown in FIG. 2, the base 11 may be secured to the bone by way of screws 11A, although other types of fasteners may be used as well. The base 11 has an upwardly projecting portion 12, which projects away from a remainder of the base 11 and has a receptacle 13 at its end for receiving inertial sensor unit 14.


Referring to FIG. 3, a bracket 15 may be releasably connectable to the base 11, or other part of the apparatus 10, and forms part of a spherical joint 16 at its end, with a central bore 17, such as a counterbore. In an embodiment, the geometric relation between the central bore 17 and the inertial sensor unit 14 is known, such that subsequent readings can be derived from this relation. The spherical joint 16 has a ball member 16A having its center of rotation fixed relative to the base 11.


Referring to FIG. 4, the spherical joint 16 is the interface of the base 11 with the distance-measuring device 20. The distance-measuring device 20 is used as a probe to obtain points representative of a surface of the bone, for the CAS processor 1 to define the model of the bone using readings obtained from the distance-measuring device 20. According to an embodiment, the distance-measuring device 20 has an elongated body 21 at the end of which is located a receptacle 22 to receive inertial sensor unit 23. In an embodiment, the elongated body 21 is sized so as to be received in the central bore 17 of the spherical joint 16, such that the distance-measuring device 20 is blocked from translating, yet movable in three rotational degrees of freedom because of the spherical joint 16.


The distance-measuring device 20 has a pointer end 24. In an embodiment, the pointer end 24 may be telescopically connected to the elongated body 21, to be displaced along the longitudinal axis of the elongated body 21. An encoder (e.g., standard distance encoder, Hall-effect sensor, etc.) may be placed at the telescopic joint so as to measure the length of the pointer end 24. Accordingly, the CAS processing unit 1 may calculate the distance of various points of the bone surface, i.e., the distance between the pointer end 24 and the central bore 17. Moreover, as the orientation of the distance-measuring device 20 (i.e., its axis) is known via the data produced by the inertial sensor unit 23, it is possible to obtain a cloud of points representative of the acetabulum surface, as described below. According to an embodiment, the cloud of points may then be transposed into the coordinate system tracked by the inertial sensor unit 14 on the base 11, as explained below. As another feature, the inertial sensor unit 14 may be used to detect movements of the pelvis during the gathering of points made with the distance-measuring device 20.
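The geometry behind the point acquisition is simple trigonometry: with the center of rotation of the spherical joint 16 taken as the reference position, each probed point lies at the measured distance along the tracked distance-measuring axis. The sketch below illustrates this under assumed numeric values; the function name and the example axis direction are illustrative and not defined in the present disclosure.

```python
# Sketch: position of a probed point from the measured distance and the tracked
# orientation of the distance-measuring axis. The reference position is taken as
# the center of rotation of the spherical joint; all numeric values are illustrative.
import numpy as np

def probed_point(reference_position, axis_direction_global, distance):
    """Point = reference position + distance along the unit axis direction (global frame)."""
    u = np.asarray(axis_direction_global, dtype=float)
    return np.asarray(reference_position, dtype=float) + distance * u / np.linalg.norm(u)

center_of_rotation = np.array([0.0, 0.0, 0.0])     # fixed relative to the base 11
axis_direction = np.array([0.2, -0.1, -0.97])      # from the inertial sensor unit readings
print(probed_point(center_of_rotation, axis_direction, 47.5))  # 47.5 mm along the axis
```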


As an alternative embodiment, the pointer end 24 is not telescopically connected to the elongated body 21. Instead, the distance-measuring device 20 is allowed to slide relative to the spherical joint 16, the pair forming a sliding joint. A distance-measuring encoder could be placed at the sliding joint to measure the displacement. Alternatively, the distance could be measured by the inertial sensor unit 23. The distance-measuring device 20 may be an optical rangefinder, connected to the ball member 16A and measuring a distance through light emission. It is also considered to use a simple ruler, with an operator entering the distance value between the point on the surface and the reference position along the distance-measuring axis.


Although a pair of inertial sensor units are shown, i.e., 14 and 23, the apparatus may be provided with additional encoders to determine the orientation of the ball member 16A relative to the base 11, in addition to encoders or a rangefinder providing distance values between a reference point, such as the center of rotation of the ball member 16A, and a point on the surface of the anatomical feature intersecting the distance-measuring axis (e.g., the longitudinal axis of the elongated body 21).


Referring to FIG. 5, an embodiment of the CAS processor 1 is shown in greater detail. The CAS processor 1 may include the following modules as part of a non-transitory computer readable memory having recorded thereon statements and instructions for execution by a computer to carry out a method for modelling a surface of an anatomical feature in computer-assisted surgery in a coordinate system. The coordinate system is virtual and may be updated with signals from the inertial sensor units, such as 14 and 23. The CAS processor 1 may output data via a user interface 40.


A distance module 51 obtains a distance between a reference position and a plurality of points of the surface of the anatomical feature. For example, the distance module 51 calculates a distance using the signals from an encoder of the distance-measuring device 20. According to an embodiment, the reference position is the center of rotation of the spherical joint 16 supporting the distance-measuring device 20, as the center of rotation is fixed relative to the base 11, and hence to the anatomical feature, and is therefore conveniently used for trigonometric calculations. Other reference positions may be used, such as any part of the distance-measuring device 20.


An orientation module 52 determines, using signals from the inertial sensor unit 23 and/or encoder signals, an orientation of the distance-measuring axis passing through the reference position and the plurality of points of the surface of the anatomical feature.


A position calculator module 53 calculates a position of each point using the orientation of the axis from the orientation module 52 and the distance for this point from the distance module 51. Hence, the position of a plurality of points is calculated, the reference position being common to each of the plurality of points.


A model generating module 54 generates and outputs, using the signals from the inertial sensor unit(s) 14 and 23, the model of the surface in the coordinate system using the position of the plurality of points. The cloud of points may be enough for a virtual model of the surface, for example the acetabulum, to be generated. The model generating module 54 may also obtain the imaged model M of the surface, to match the position of the plurality of points with the imaged model. This may include obtaining the orientation of the anatomical feature, e.g., the virtual coordinate system from pre-operative planning, as a reference for subsequent navigation. The CAS processor 1 may therefore perform surface matching to match (a.k.a. register) the imaged model to the actual measured surface, or may perform other registration methods and hence obtain other geometrical or outline data, such as the position of other landmarks, without resorting to an existing pre-operative 3D model. This information is captured by the apparatus 10 and inertial sensor units 14 and/or 23, and once the capture is completed, the distance-measuring device 20 may be removed along with the bracket 15.
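The present disclosure does not prescribe a particular registration technique. As one hedged illustration only, when probed points can be paired with corresponding points of the imaged model (e.g., probed landmarks matched to the same landmarks in the pre-operative model), the rigid transform that best aligns them can be obtained with a classical SVD-based least-squares fit:

```python
# Illustrative rigid registration of measured points to corresponding model points
# (SVD-based least-squares fit). Correspondences are assumed known here, e.g. probed
# landmarks paired with the same landmarks in the pre-operative model. This is only
# one possible registration approach, not the one mandated by the present disclosure.
import numpy as np

def rigid_fit(measured, model):
    """Return rotation R and translation t such that model ~= R @ measured + t."""
    P, Q = np.asarray(measured, dtype=float), np.asarray(model, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t

# Synthetic check: rotate and translate a few points, then recover the transform.
measured = np.array([[0, 0, 0], [50, 0, 0], [0, 40, 0], [0, 0, 30]], dtype=float)
angle = np.radians(20.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
model = measured @ R_true.T + np.array([5.0, -3.0, 12.0])
R_est, t_est = rigid_fit(measured, model)
print(np.allclose(R_est, R_true), t_est)  # expected: True and ~[5, -3, 12]
```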


A navigation module 55 produces and outputs, using the signals from the inertial sensor unit(s) 14 and 23, orientation data relating an object, such as a tool, relative to the surface of the anatomical feature, using the model of the surface in the coordinate system as generated by the model generating module 54. As described below, this may require detaching the inertial sensor unit 23 from the distance-measuring device 20, to connect it to a tool. For example, the navigation module 55 outputs an orientation of a bone-altering tool or an implant positioning tool T relative to the anatomical feature, using a tool guide positioned on the base 11 as a replacement for the bracket 15.


Referring to FIG. 6, a method for modelling a surface of an anatomical feature in computer-assisted surgery in a coordinate system is generally shown at 60, and may be performed by the CAS processing unit 1 using the apparatus 10 described above. According to an example, the method is performed on a pelvis to define a surface of the acetabulum.


According to 61, a distance between a reference position and any point of the surface of the anatomical feature is obtained. 61 may include calculating a distance from an encoder of the distance-measuring device 20. The method may be performed using the center of rotation of the spherical joint 16 as the reference position.


According to 62, using signals from inertial sensor unit(s) 14 and 23, an orientation of an axis passing through the reference position and any point of the surface of the anatomical feature is determined.


According to 63, a position of any point is calculated using the orientation of the axis and the distance for the point. 61, 62 and 63 are repeated for a plurality of points on the anatomical feature, with the reference position being fixed throughout 61, 62 and 63. According to an embodiment, 61, 62 and 63 may also be performed to create a coordinate system of the anatomical feature, in addition to obtaining points of the surface to model. For example, in the case of the pelvis, 61, 62 and 63 may obtain a suitable number of landmarks to define a coordinate system of the pelvis; as one possibility, 61, 62 and 63 may be used to obtain the ASIS and the pubic tubercle.
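As a hedged illustration only, the repetition of 61, 62 and 63 with a fixed reference position amounts to a simple acquisition loop, sketched below; the readings iterable and the optional landmark labels are assumptions standing in for the encoder and inertial sensor unit outputs.

```python
# Sketch of repeating 61, 62 and 63 for a plurality of points with a fixed reference
# position. The 'readings' iterable is a hypothetical stand-in for the encoder and
# inertial sensor unit outputs; labels tag the points used as landmarks (e.g. ASIS).
import numpy as np

def acquire_cloud(reference_position, readings):
    """readings: iterable of (distance, axis_direction, label_or_None) tuples."""
    ref = np.asarray(reference_position, dtype=float)
    cloud, landmarks = [], {}
    for distance, axis, label in readings:
        u = np.asarray(axis, dtype=float)
        point = ref + distance * u / np.linalg.norm(u)   # steps 61, 62 and 63
        cloud.append(point)
        if label is not None:                            # e.g. "ASIS_left"
            landmarks[label] = point
    return np.array(cloud), landmarks

# Illustrative readings: two surface points and one labelled landmark.
cloud, landmarks = acquire_cloud([0.0, 0.0, 0.0],
                                 [(45.0, [0.1, -0.2, -0.97], None),
                                  (46.5, [0.0, -0.1, -0.99], None),
                                  (80.0, [0.6, 0.3, 0.74], "ASIS_left")])
print(cloud.shape, list(landmarks))
```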


According to 64, using the inertial sensor unit(s) 14 and 23, the model of the surface in the coordinate system is generated using the position of the plurality of points. This may include using the points obtained for the known landmarks, to create the coordinate system for the model generated from the cloud of points. 64 may include obtaining an imaged model of the surface, such that generating the model of the surface comprises matching or registering the position of the plurality of points with the imaged model. In such a case, a virtual coordinate system representative of the orientation of the anatomical feature may be obtained with the 3D imaged model.


According to 65, using the signals from the inertial sensor unit(s) 14 and 23, orientation data relating an object relative to the surface of the anatomical feature is output, using the model of the surface in the coordinate system. Outputting the orientation data may include outputting an orientation of a bone-altering tool or an implant positioning tool relative to the anatomic feature. According to an embodiment, the inertial sensor unit 14 is on the base 11, and tools may or may not have the inertial sensor unit 23 thereon, detached from the distance-measuring device 20. For example, the base 11 may use the bracket 15 with spherical joint 16 (and associated encoders) to determine the orientation of the tool T. In an alternative embodiment, the tool T supports the inertial sensor unit 23, and the orientation of the tool T is navigated using the readings of both inertial sensor units 14 and 23.
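The orientation data for a cup impactor are typically expressed as inclination and anteversion relative to the pelvic coordinate system. As a hedged illustration only, assuming the radiographic definitions of these angles (other conventions exist, and the present disclosure does not prescribe one) and assuming the tool axis is already expressed in a medio-lateral / anterior-posterior / cranial-caudal frame:

```python
# Illustrative conversion of a tracked impactor axis, expressed in the pelvic
# coordinate system (components: medio-lateral, anterior-posterior, cranial-caudal),
# into cup inclination and anteversion. Radiographic definitions are assumed here;
# other conventions exist and are not prescribed by the present disclosure.
import numpy as np

def cup_angles(axis_in_pelvic_frame):
    a = np.asarray(axis_in_pelvic_frame, dtype=float)
    x, y, z = a / np.linalg.norm(a)
    inclination = np.degrees(np.arctan2(abs(x), abs(z)))          # within the coronal plane
    anteversion = np.degrees(np.arctan2(abs(y), np.hypot(x, z)))  # out of the coronal plane
    return inclination, anteversion

# Example: an axis built for 40 degrees of inclination and 15 degrees of anteversion,
# a commonly cited target orientation, is recovered as approximately (40.0, 15.0).
axis = np.array([np.sin(np.radians(40.0)) * np.cos(np.radians(15.0)),
                 np.sin(np.radians(15.0)),
                 np.cos(np.radians(40.0)) * np.cos(np.radians(15.0))])
print(cup_angles(axis))
```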


Therefore, the CAS processor 1 and method 60 may track tools relative to a bone using a single inertial sensor unit (14 or 23) on the base 11 during the navigation, after the modelling or registration has been performed, with the inertial sensor unit 14 and/or 23. The CAS processor 1 uses encoder data and/or geometrical data of its bracket 15 or like attachment guiding the tool T, to determine the orientation in the coordinate system tracked by the inertial sensor unit 14 secured to the base 11. In an embodiment, it is contemplated to use a single inertial sensor unit secured to the distance-measuring device 20 with landmarks being detected on the anatomical feature to create a coordinate system. If the inertial sensor unit is then detached from the distance-measuring device 20 to be positioned onto the base 11 for navigation, some geometric relation must be recorded prior to detaching the inertial sensor unit and tracking same in dead-reckoning, for the acquired cloud of points and coordinate system to be in a known geometric relation relative to the base 11.
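As a hedged sketch of the frame bookkeeping involved, the recorded relation between the sensor and the acquired bone coordinate system can be composed with subsequent dead-reckoned readings as simple rotation-matrix products; the matrix names below are illustrative and not defined in the present disclosure.

```python
# Sketch of carrying a recorded geometric relation through a sensor transfer.
# R_sensor_bone_recorded is the orientation of the acquired bone coordinate system
# relative to the sensor, recorded before detaching the inertial sensor unit;
# R_global_sensor_now is the dead-reckoned sensor orientation after re-attachment.
import numpy as np

def bone_in_global(R_global_sensor_now, R_sensor_bone_recorded):
    """Orientation of the bone coordinate system in the global frame."""
    return R_global_sensor_now @ R_sensor_bone_recorded

def tool_in_bone(R_global_tool, R_global_bone):
    """Orientation of a navigated tool expressed in the bone coordinate system."""
    return R_global_bone.T @ R_global_tool

# Trivial example with identity matrices standing in for actual readings.
I = np.eye(3)
print(tool_in_bone(I, bone_in_global(I, I)))
```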


While the methods and systems described herein have been described and shown with reference to particular steps performed in a particular order, it will be understood that these steps may be combined, subdivided or reordered to form an equivalent method without departing from the teachings of the present invention. Accordingly, the order and grouping of the steps is not a limitation of the present invention.

Claims
  • 1. An apparatus for obtaining points of a surface of an anatomical feature comprising: a base adapted to be secured to an anatomical feature; a spherical joint supported by the base, the spherical joint having a ball member rotatable in at least two rotational degrees of freedom relative to the base and having a center of rotation fixed relative to the base; a distance-measurement device connected to the ball member such that a distance-measurement axis of the distance-measurement device passes through said center of rotation of the ball member, the distance-measurement device configured for providing a distance of any point of the surface intersecting the distance-measurement axis; and at least one receptacle configured to receive an inertial sensor unit for determining an orientation of the distance-measurement device; whereby a position of any point is obtained using said distance and an orientation of the distance-measurement device as connected to the ball member at a measurement of said any point.
  • 2. The apparatus according to claim 1, wherein the distance-measurement device includes a probe having a body connected to the ball member with a contact end of the body configured to contact the surface of the anatomical feature, the body of the probe connected to the ball member such that the distance-measurement axis is a longitudinal axis of the body passing through said center of rotation of the ball member and said contact end of the body.
  • 3. The apparatus according to claim 2, further comprising a translational joint providing a translation degree of freedom between the contact end of the body of the probe and the center of rotation of the ball member, such that the contact end of the body is configured to come into contact with the surface of the anatomical feature.
  • 4. The apparatus according to claim 2, further comprising an encoder determining a distance between the contact end of the body and the center of rotation of the ball member.
  • 5. The apparatus according to claim 1, wherein the at least one receptacle is on the distance-measurement device.
  • 6. The apparatus according to claim 5, wherein another said receptacle is on the base, such that the apparatus is configured to support said inertial sensor unit on the distance-measurement device, and another inertial sensor unit on the base.
  • 7. The apparatus according to claim 1, further comprising a handle secured to the base for positioning the base against the anatomical feature.
  • 8. The apparatus according to claim 7, wherein the handle concurrently forms a concavity with the base configured for receiving a protuberance of the anatomical feature.
  • 9. The apparatus according to claim 7, wherein the handle is removable from the base once the base is secured to the anatomical feature.
  • 10. The apparatus according to claim 1, wherein the ball member defines a counterbore having an axis coincident with said center of rotation, the counterbore receiving and supporting the distance-measurement device.
  • 11. The apparatus according to claim 1, further comprising a tool guide having a guiding feature configured for receiving a tool altering the anatomic feature.
  • 12. The apparatus according to claim 11, wherein the tool guide is connected to the base after removal of the spherical joint.
CROSS-REFERENCE TO RELATED APPLICATION

The present application claims priority of U.S. Patent Application No. 62/260,296, filed on Nov. 26, 2015 and incorporated herein by reference.

Related Publications (1)
Number Date Country
20170151018 A1 Jun 2017 US
Provisional Applications (1)
Number Date Country
62260296 Nov 2015 US