The disclosure relates generally to navigation systems and methods for indicating and reducing line-of-sight errors when tracking one or more objects.
Navigation systems assist users in locating objects. For instance, navigation systems are used in industrial, aerospace, defense, and medical applications. In the medical field, navigation systems assist surgeons in placing surgical instruments relative to a patient's anatomy. Surgeries in which navigation systems are used include neurosurgery and orthopedic surgery. Typically, the instrument and the anatomy are tracked together with their relative movement shown on a display.
Navigation systems may employ light signals, sound waves, magnetic fields, radio frequency signals, etc. in order to track the position and/or orientation of objects. Often the navigation system includes tracking devices attached to the object being tracked. A localizer cooperates with tracking elements on the tracking devices to determine a position of the tracking devices, and ultimately to determine a position and/or orientation of the object. The navigation system monitors movement of the object via the tracking devices.
Many navigation systems rely on an unobstructed line-of-sight between the tracking elements and sensors that receive tracking signals from the tracking elements. When the line-of-sight is obstructed, tracking signals being transmitted from the tracking elements are not received by the sensors. As a result, errors can occur. Typically, in this situation, navigation is discontinued and error messages are conveyed to the user until the line-of-sight returns or the system is reset. In the medical field, in many instances, the error messages are displayed on a monitor remote from the surgeon, making it difficult for the surgeon to notice and remedy the error in a timely manner. This can delay surgical procedures.
As a result, there is a need in the art for navigation systems and methods that quickly identify line-of-sight issues so that they can be resolved without significant delay. There is also a need in the art for navigation systems and methods that help to improve the line-of-sight and reduce possible errors associated with obstructions to the line-of-sight between tracking elements and sensors.
In one embodiment, a navigation system is provided for tracking a plurality of objects. The navigation system includes a sensor and a plurality of tracking devices attachable to the plurality of objects. Each tracking device includes a tracking head having a tracking body and a tracking element coupled to the tracking body that transmits a tracking signal to the sensor to determine a position of one of the plurality of objects. Each tracking device also includes an error indicator supported by the tracking head that indicates an error in the sensor receiving the tracking signal from the tracking element. The error indicator includes at least one light emitting diode attached to the tracking body. A computing system determines if the sensor received the tracking signal from the tracking element. The computing system also generates an error signal if the sensor did not receive the tracking signal. The computing system also controls activation of the at least one light emitting diode of the error indicator to emit a colored light in response to absence of the error signal. The computing system further deactivates the at least one light emitting diode in response to generation of the error signal.
In another embodiment, a navigation system is provided for tracking a plurality of objects. The navigation system includes a sensor and a plurality of tracking devices attachable to the plurality of objects. Each tracking device includes a tracking head having a tracking body and a tracking element coupled to the tracking body that transmits a tracking signal to the sensor to determine a position of one of the plurality of objects. Each tracking device also includes an error indicator supported by the tracking head that indicates an error in the sensor receiving the tracking signal from the tracking element. The error indicator includes at least one light emitting diode attached to the tracking body. A computing system determines if the sensor received the tracking signal from the tracking element. The computing system also generates an error signal if the sensor did not receive the tracking signal. The computing system also controls activation of the at least one light emitting diode of the error indicator to emit a colored light in response to generation of the error signal. The computing system further deactivates the at least one light emitting diode in response to absence of the error signal.
A method is also provided for tracking a plurality of objects using a navigation system. The navigation system includes a sensor and a plurality of tracking devices. The plurality of tracking devices are attachable to the plurality of objects. Each tracking device includes a tracking head having a tracking body and a tracking element coupled to the tracking body that transmits a tracking signal to the sensor to determine a position of one of the plurality of objects. Each tracking device also includes an error indicator supported by the tracking head that indicates an error in the sensor receiving the tracking signal from the tracking element. The error indicator includes at least one light emitting diode attached to the tracking body. The method includes transmitting the tracking signal from the tracking element to the sensor while the tracking device is attached to the object and determining if the sensor received the tracking signal from the tracking element. An error signal is generated if the sensor did not receive the tracking signal. The method also includes controlling activation of the at least one light emitting diode to emit a colored light in response to absence of the error signal and to deactivate the at least one light emitting diode in response to generation of the error signal.
In another embodiment, a method is provided for tracking a plurality of objects using a navigation system. The navigation system includes a sensor and a plurality of tracking devices. The plurality of tracking devices are attachable to the plurality of objects. Each tracking device includes a tracking head having a tracking body and a tracking element coupled to the tracking body that transmits a tracking signal to the sensor to determine a position of one of the plurality of objects. Each tracking device also includes an error indicator supported by the tracking head that indicates an error in the sensor receiving the tracking signal from the tracking element. The error indicator includes at least one light emitting diode attached to the tracking body. The method includes transmitting the tracking signal from the tracking element to the sensor while the tracking device is attached to the object and determining if the sensor received the tracking signal from the tracking element. An error signal is generated if the sensor did not receive the tracking signal. The method also includes controlling activation of the at least one light emitting diode to emit a colored light in response to generation of the error signal and to deactivate the at least one light emitting diode in response to absence of the error signal.
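The two complementary activation schemes recited in the embodiments above differ only in when the indicator LED is lit. A minimal sketch of that logic follows; the function name `indicator_on` and the mode labels are hypothetical labels for illustration, not terms used in the disclosure.

```python
def indicator_on(error_signal_generated, mode="on_when_ok"):
    """Return True if the error-indicator LED on the tracking body
    should emit its colored light.

    mode "on_when_ok":    the LED is lit in the absence of the error
                          signal and deactivated when the error signal
                          is generated (first embodiment).
    mode "on_when_error": the LED is lit when the error signal is
                          generated and deactivated in its absence
                          (second embodiment).
    """
    if mode == "on_when_ok":
        return not error_signal_generated
    return error_signal_generated
```

Under the first scheme a dark indicator immediately signals a tracking fault on the tracker itself; under the second, a lit indicator does.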
One advantage of these navigation systems and methods is that the error is identified directly on the object being tracked. As a result, the user can better isolate signal transmission issues and remedy the issues without significant delay. Also, in some instances, by placing the indicator on the object being tracked, there is a greater likelihood that the user will notice the error as compared to an error message displayed on a remote monitor.
Advantages of the present invention will be readily appreciated as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings wherein:
Referring to
The navigation system 20 includes a computer cart assembly 24 that houses a navigation computer 26. A navigation interface is in operative communication with the navigation computer 26. The navigation interface includes a first display 28 adapted to be situated outside of a sterile field and a second display 29 adapted to be situated inside the sterile field. The displays 28, 29 are adjustably mounted to the computer cart assembly 24. First and second input devices 30, 32 such as a mouse and keyboard can be used to input information into the navigation computer 26 or otherwise select/control certain aspects of the navigation computer 26. Other input devices are contemplated including a touch screen (not shown) on displays 28, 29 or voice-activation.
A localizer 34 communicates with the navigation computer 26. In the embodiment shown, the localizer 34 is an optical localizer and includes a camera unit 36 (also referred to as a sensing device). The camera unit 36 has an outer casing 38 that houses one or more optical position sensors 40. In some embodiments, at least two optical sensors 40 are employed, preferably three. The optical sensors 40 may be three separate charge-coupled devices (CCD). In one embodiment, three one-dimensional CCDs are employed. It should be appreciated that in other embodiments, separate camera units, each with a separate CCD, or two or more CCDs, could also be arranged around the operating room. The CCDs detect infrared (IR) signals.
Camera unit 36 is mounted on an adjustable arm to position the optical sensors 40 with a field of view of the trackers discussed below that, ideally, is free from obstructions.
The camera unit 36 includes a camera controller 42 in communication with the optical sensors 40 to receive signals from the optical sensors 40. The camera controller 42 communicates with the navigation computer 26 through either a wired or wireless connection (not shown). One such connection may be an IEEE 1394 interface, which is a serial bus interface standard for high-speed communications and isochronous real-time data transfer. The connection could also use a company specific protocol. In other embodiments, the optical sensors 40 communicate directly with the navigation computer 26.
Position and orientation signals and/or data are transmitted to the navigation computer 26 for purposes of tracking the objects. The computer cart assembly 24, display 28, and camera unit 36 may be like those described in U.S. Pat. No. 7,725,162 to Malackowski, et al. issued on May 25, 2010, entitled “Surgery System”, hereby incorporated by reference.
The navigation computer 26 can be a personal computer or laptop computer. Navigation computer 26 has the displays 28, 29, central processing unit (CPU) and/or other processors, memory (not shown), and storage (not shown). The navigation computer 26 is loaded with software as described below. The software converts the signals received from the camera unit 36 into data representative of the position and orientation of the objects being tracked.
Navigation system 20 includes a plurality of tracking devices 44, 46, 48, also referred to herein as trackers. In the illustrated embodiment, one tracker 44 is firmly affixed to the femur F of the patient and another tracker 46 is firmly affixed to the tibia T of the patient. Trackers 44, 46 are firmly affixed to sections of bone. Trackers 44, 46 may be attached to the femur F and tibia T in the manner shown in U.S. Pat. No. 7,725,162, hereby incorporated by reference. Other methods of attachment are described further below. In additional embodiments, a tracker (not shown) is attached to the patella to track a position and orientation of the patella. In yet further embodiments, the trackers 44, 46 could be mounted to other tissue types or parts of the anatomy.
An instrument tracker 48 is firmly attached to the surgical instrument 22. The instrument tracker 48 may be integrated into the surgical instrument 22 during manufacture or may be separately mounted to the surgical instrument 22 in preparation for the surgical procedures. The working end of the surgical instrument 22, which is being tracked, may be a rotating bur, electrical ablation device, or the like.
The trackers 44, 46, 48 can be battery powered with an internal battery or may have leads to receive power through the navigation computer 26, which, like the camera unit 36, preferably receives external power.
In the embodiment shown, the surgical instrument 22 is an end effector of a surgical manipulator. Such an arrangement is shown in U.S. patent application Ser. No. 13/958,070, filed Aug. 2, 2013, entitled, “Surgical Manipulator Capable of Controlling a Surgical Instrument in Multiple Modes”, the disclosure of which is hereby incorporated by reference.
In other embodiments, the surgical instrument 22 may be manually positioned by only the hand of the user, without the aid of any cutting guide, jib, or other constraining mechanism such as a manipulator or robot. Such a surgical instrument is described in U.S. patent application Ser. No. 13/600,888, filed Aug. 31, 2012, entitled, “Surgical Instrument Including Housing, a Cutting Accessory that Extends from the Housing and Actuators that Establish the Position of the Cutting Accessory Relative to the Housing”, the disclosure of which is hereby incorporated by reference.
The optical sensors 40 of the localizer 34 receive light signals from the trackers 44, 46, 48. In the illustrated embodiment, the trackers 44, 46, 48 are active trackers. In this embodiment, each tracker 44, 46, 48 has at least three active tracking elements or markers 50 for transmitting light signals to the optical sensors 40. The active markers 50 can be, for example, light emitting diodes (LEDs) 50 transmitting light, such as infrared light. The optical sensors 40 preferably have sampling rates of 100 Hz or more, more preferably 300 Hz or more, and most preferably 500 Hz or more. In some embodiments, the optical sensors 40 have sampling rates of 1000 Hz. The sampling rate is the rate at which the optical sensors 40 receive light signals from sequentially fired LEDs 50. In some embodiments, the light signals from the LEDs 50 are fired at different rates for each tracker 44, 46, 48.
Referring to
In other embodiments, the trackers 44, 46, 48 may have passive markers (not shown), such as reflectors that reflect light emitted from the camera unit 36. The reflected light is then received by the optical sensors 40. Active and passive marker arrangements are well known in the art.
Each of the trackers 44, 46, 48 also includes a 3-dimensional gyroscope sensor 60 that measures angular velocities of the trackers 44, 46, 48. As is well known to those skilled in the art, the gyroscope sensors 60 output readings indicative of the angular velocities relative to x, y, and z axes of a gyroscope coordinate system. These readings are multiplied by a conversion constant defined by the manufacturer to obtain measurements in degrees/second with respect to each of the x, y, and z axes of the gyroscope coordinate system. These measurements can then be converted to an angular velocity vector defined in radians/second.
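The conversion described above can be sketched as follows; the conversion constant shown is a hypothetical placeholder for the manufacturer-defined value, not one taken from this disclosure.

```python
import math

# Hypothetical manufacturer conversion constant: degrees/second per raw
# gyroscope count (an actual value would come from the sensor datasheet).
DEG_PER_SEC_PER_COUNT = 0.0175

def angular_velocity_vector(raw_xyz):
    """Convert raw gyroscope readings about the x, y, and z axes of the
    gyroscope coordinate system into an angular velocity vector in
    radians/second."""
    deg_per_sec = [r * DEG_PER_SEC_PER_COUNT for r in raw_xyz]
    return [math.radians(d) for d in deg_per_sec]

# Raw readings of (100, -200, 40) counts about x, y, z:
omega = angular_velocity_vector((100, -200, 40))
```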
The angular velocities measured by the gyroscope sensors 60 provide additional non-optically based kinematic data for the navigation system 20 with which to track the trackers 44, 46, 48. The gyroscope sensors 60 may be oriented along the axes of the coordinate system of each tracker 44, 46, 48. In other embodiments, each gyroscope coordinate system is transformed to its tracker coordinate system such that the gyroscope data reflects the angular velocities with respect to the x, y, and z axes of the coordinate systems of the trackers 44, 46, 48.
Each of the trackers 44, 46, 48 also includes a 3-axis accelerometer 70 that measures acceleration along each of x, y, and z axes of an accelerometer coordinate system. The accelerometers 70 provide additional non-optically based data for the navigation system 20 with which to track the trackers 44, 46, 48.
The accelerometers 70 may be oriented along the axes of the coordinate system of each tracker 44, 46, 48. In other embodiments, each accelerometer coordinate system is transformed to its tracker coordinate system such that the accelerometer data reflects the accelerations with respect to the x, y, and z axes of the coordinate systems of the trackers 44, 46, 48.
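The transformation of sensor readings into a tracker coordinate system, mentioned for both the gyroscope and accelerometer data, amounts to applying a fixed calibration rotation. A sketch follows; the rotation matrix shown is a made-up example, not a calibration from the disclosure.

```python
def to_tracker_frame(R, v):
    """Rotate a 3-vector v (a gyroscope or accelerometer reading expressed
    in the sensor's own coordinate system) into the tracker coordinate
    system, where R is the fixed 3x3 sensor-to-tracker rotation matrix."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

# Hypothetical calibration: the sensor frame is rotated 90 degrees about
# the z axis relative to the tracker frame.
R = [[0.0, -1.0, 0.0],
     [1.0,  0.0, 0.0],
     [0.0,  0.0, 1.0]]

# A reading along the sensor's x axis maps to the tracker's y axis.
v_tracker = to_tracker_frame(R, [1.0, 0.0, 0.0])
```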
Each of the gyroscope sensors 60 and accelerometers 70 communicates with the tracker controller 62 located in the housing of the associated tracker, which transmits data to and receives data from the navigation computer 26. The data can be sent over either a wired or wireless connection.
The navigation computer 26 includes a navigation processor 52. The camera unit 36 receives optical signals from the LEDs 50 of the trackers 44, 46, 48 and outputs to the processor 52 signals and/or data relating to the position of the LEDs 50 of the trackers 44, 46, 48 relative to the localizer 34. The gyroscope sensors 60 transmit non-optical signals to the processor 52 relating to the 3-dimensional angular velocities measured by the gyroscope sensors 60. Based on the received optical and non-optical signals, navigation processor 52 generates data indicating the relative positions and orientations of the trackers 44, 46, 48 relative to the localizer 34.
It should be understood that the navigation processor 52 could include one or more processors to control operation of the navigation computer 26. The processors can be any type of microprocessor or multi-processor system. The term processor is not intended to be limited to a single processor.
Prior to the start of the surgical procedure, additional data are loaded into the navigation processor 52. Based on the position and orientation of the trackers 44, 46, 48 and the previously loaded data, navigation processor 52 determines the position of the working end of the surgical instrument 22 and the orientation of the surgical instrument 22 relative to the tissue against which the working end is to be applied. In some embodiments, navigation processor 52 forwards these data to a manipulator controller 54. The manipulator controller 54 can then use the data to control a robotic manipulator 56 as described in U.S. patent application Ser. No. 13/958,070, filed Aug. 2, 2013, entitled, “Surgical Manipulator Capable of Controlling a Surgical Instrument in Multiple Modes”, the disclosure of which is hereby incorporated by reference.
The navigation processor 52 also generates image signals that indicate the relative position of the surgical instrument working end to the surgical site. These image signals are applied to the displays 28, 29. Displays 28, 29, based on these signals, generate images that allow the surgeon and staff to view the relative position of the surgical instrument working end to the surgical site. The displays 28, 29, as discussed above, may include a touch screen or other input/output device that allows entry of commands.
Referring to
Each tracker 44, 46, 48 and object being tracked also has its own coordinate system separate from localizer coordinate system LCLZ. Components of the navigation system 20 that have their own coordinate systems are the bone trackers 44, 46 and the instrument tracker 48. These coordinate systems are represented as, respectively, bone tracker coordinate systems BTRK1, BTRK2, and instrument tracker coordinate system TLTR.
Navigation system 20 monitors the positions of the femur F and tibia T of the patient by monitoring the position of the bone trackers 44, 46 firmly attached to bone. The femur coordinate system is FBONE and the tibia coordinate system is TBONE; these are the coordinate systems of the bones to which the bone trackers 44, 46 are firmly attached.
Prior to the start of the procedure, pre-operative images of the femur F and tibia T are generated (or of other tissues in other embodiments). These images may be based on magnetic resonance imaging (MRI) scans, radiological scans or computed tomography (CT) scans of the patient's anatomy. These images are mapped to the femur coordinate system FBONE and tibia coordinate system TBONE using well known methods in the art. In one embodiment, a pointer instrument P, such as disclosed in U.S. Pat. No. 7,725,162 to Malackowski, et al., hereby incorporated by reference, having its own tracker PT (see
During the initial phase of the procedure, the bone trackers 44, 46 are firmly affixed to the bones of the patient. The poses (positions and orientations) of coordinate systems FBONE and TBONE are mapped to coordinate systems BTRK1 and BTRK2, respectively. Given the fixed relationship between the bones and their bone trackers 44, 46, the poses of coordinate systems FBONE and TBONE remain fixed relative to coordinate systems BTRK1 and BTRK2, respectively, throughout the procedure. The pose-describing data are stored in memory integral with both manipulator controller 54 and navigation processor 52.
The working end of the surgical instrument 22 (also referred to as energy applicator distal end) has its own coordinate system EAPP. The origin of the coordinate system EAPP may represent a centroid of a surgical cutting bur, for example. The pose of coordinate system EAPP is fixed to the pose of instrument tracker coordinate system TLTR before the procedure begins. Accordingly, the poses of these coordinate systems EAPP, TLTR relative to each other are determined. The pose-describing data are stored in memory integral with both manipulator controller 54 and navigation processor 52.
Referring to
Localization engine 100 receives as inputs the optically-based signals from the camera controller 42 and the non-optically based signals from the tracker controller 62. Based on these signals, localization engine 100 determines the pose of the bone tracker coordinate systems BTRK1 and BTRK2 in the localizer coordinate system LCLZ. Based on the same signals received for the instrument tracker 48, the localization engine 100 determines the pose of the instrument tracker coordinate system TLTR in the localizer coordinate system LCLZ.
The localization engine 100 forwards the signals representative of the poses of trackers 44, 46, 48 to a coordinate transformer 102. Coordinate transformer 102 is a navigation system software module that runs on navigation processor 52. Coordinate transformer 102 references the data that defines the relationship between the pre-operative images of the patient and the patient trackers 44, 46. Coordinate transformer 102 also stores the data indicating the pose of the working end of the surgical instrument 22 relative to the instrument tracker 48.
During the procedure, the coordinate transformer 102 receives the data indicating the relative poses of the trackers 44, 46, 48 to the localizer 34. Based on these data and the previously loaded data, the coordinate transformer 102 generates data indicating the position and orientation of both the coordinate system EAPP and the bone coordinate systems FBONE and TBONE relative to the localizer coordinate system LCLZ.
As a result, coordinate transformer 102 generates data indicating the position and orientation of the working end of the surgical instrument 22 relative to the tissue (e.g., bone) against which the instrument working end is applied. Image signals representative of these data are forwarded to displays 28, 29 enabling the surgeon and staff to view this information. In certain embodiments, other signals representative of these data can be forwarded to the manipulator controller 54 to control the manipulator 56 and corresponding movement of the surgical instrument 22.
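One way to picture the composition performed by the coordinate transformer 102 is as 4x4 homogeneous transforms chained through the common localizer coordinate system LCLZ. The sketch below is an assumption-laden illustration: the function and variable names are invented, and it treats every pose as a rigid-body transform.

```python
import numpy as np

def inv_rigid(T):
    """Invert a 4x4 rigid-body transform (rotation + translation)."""
    Ti = np.eye(4)
    Ti[:3, :3] = T[:3, :3].T
    Ti[:3, 3] = -T[:3, :3].T @ T[:3, 3]
    return Ti

def translation(x, y, z):
    """Build a pure-translation 4x4 transform."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def working_end_in_bone(T_lclz_tltr, T_tltr_eapp, T_lclz_btrk1, T_btrk1_fbone):
    """Pose of coordinate system EAPP expressed in FBONE, composed
    through the common localizer coordinate system LCLZ."""
    T_lclz_eapp = T_lclz_tltr @ T_tltr_eapp
    T_lclz_fbone = T_lclz_btrk1 @ T_btrk1_fbone
    return inv_rigid(T_lclz_fbone) @ T_lclz_eapp

# With the bone frame at the localizer origin, a working end localized
# at (10, 0, 5) in LCLZ is at (10, 0, 5) in FBONE as well.
T_fbone_eapp = working_end_in_bone(
    translation(10, 0, 5), np.eye(4), np.eye(4), np.eye(4))
```

With real data, `T_lclz_tltr` and `T_lclz_btrk1` would come from the localization engine 100, while `T_tltr_eapp` and `T_btrk1_fbone` correspond to the pre-registered relationships stored before the procedure.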
Steps for determining the pose of each of the tracker coordinate systems BTRK1, BTRK2, TLTR in the localizer coordinate system LCLZ and systems and methods for determining the pose of the trackers 44, 46, 48 and the corresponding poses of the surgical instrument 22 with respect to the femur F and tibia T are described in greater detail in U.S. patent application Ser. No. 14/035,207, filed Sep. 24, 2013, entitled “Navigation System Including Optical and Non-Optical Sensors”, the disclosure of which is hereby incorporated by reference.
In some embodiments, only one LED 50 can be read by the optical sensors 40 at a time. The camera controller 42, through one or more infrared or RF transceivers (on camera unit 36 and trackers 44, 46, 48), or through a wired connection, may control the firing of the LEDs 50, as described in U.S. Pat. No. 7,725,162 to Malackowski, et al., hereby incorporated by reference. Alternatively, each tracker 44, 46, 48 may be activated locally (such as by a switch on the tracker) and, once activated, fires its LEDs 50 sequentially without instruction from the camera controller 42.
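Sequential firing means the system always knows which single LED should be visible at a given sample. A round-robin schedule can be sketched as follows; the tracker names and LED labels are placeholders, not designations from the disclosure.

```python
from itertools import cycle

# Hypothetical layout: each tracker has three LEDs, fired one at a time
# so that only one LED is read by the optical sensors per sample.
trackers = {
    "femur":      ["LED1", "LED2", "LED3"],
    "tibia":      ["LED4", "LED5", "LED6"],
    "instrument": ["LED7", "LED8", "LED9"],
}

def firing_sequence(trackers):
    """Yield (tracker, led) pairs in a fixed round-robin so the camera
    controller can anticipate which LED should be seen at each sample."""
    flat = [(name, led) for name, leds in trackers.items() for led in leds]
    return cycle(flat)

seq = firing_sequence(trackers)
first = next(seq)  # ("femur", "LED1") with the dict order above
```

Because the schedule is deterministic, a missed detection can be attributed to a specific LED, which is what makes the per-tracker error indication described later possible.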
One embodiment of trackers 44, 46 is shown in
Referring to
The bone plate 200 includes a plurality of protrusions for engaging the bone. In the embodiment shown, the protrusions are spikes 204. Once the bone plate 200 is secured in place with one or more bone screws, the spikes 204 prevent rotation of the bone plate 200 relative to the bone.
An extension arm 206 is mounted to the bone plate 200. The extension arm 206 has a base plate 208 that is secured to the bone plate 200 with one of the bone screws 202. The extension arm 206 extends arcuately in a C-shape from the base plate 208 to a mounting end 210.
A tracking head 212 is coupled to the mounting end 210 of the extension arm 206. The tracking head 212 includes tracking elements. The tracking elements, in the embodiment shown and described, are the LEDs 50, gyroscope sensor 60 (not shown), and accelerometer 70 (not shown). These tracking elements operate as previously described. In further embodiments, the tracking head 212 may include other types of tracking elements such as radio frequency receivers and/or transmitters, magnetic field sensors and/or generators, passive reflector balls, ultrasonic transmitters and/or receivers, or the like.
A connector assembly 214 couples the tracking head 212 to the extension arm 206. The connector assembly 214 supports the tracking head 212 for movement in two degrees of freedom. In the embodiment shown, the tracking head 212 is rotatably and tiltably mounted to the mounting end 210 of the extension arm 206 via the connector assembly 214.
Referring to
The bone plate 200 has top and bottom surfaces 216, 218 with three side surfaces 220, 222, 224. The side surfaces 220, 222, 224 extend between the top and bottom surfaces 216, 218. The concavity of the bone plate 200 can be given by a radius of curvature R1 of the top surface 216 of from about 5 millimeters to about 50 millimeters and a radius of curvature R2 of the bottom surface 218 of from about 5 millimeters to about 50 millimeters (see
Three spikes 204 are formed as integral extensions of surfaces 218, 220, 222, 224. Each spike 204 has a sharp tip 226 (see
The sharp tips 226 are formed to cut through soft tissue, such as the periosteum, and pierce into bone when the bone plate 200 is secured to bone. When one or more of the sharp tips 226 pierce into bone, they, in conjunction with one or more of the bone screws 202, prevent movement of the bone plate 200 relative to the bone.
The sharp tips 226, when engaged in bone, also support the bone plate 200 to provide a space beneath the bone plate 200 and above the surface of the bone. In some cases, tissue such as muscle, ligaments, and the like may be present on top of the bone to which the bone plate 200 is to be secured. This tissue can be accommodated in this space without affecting the engagement of the sharp tips 226 in the bone.
Three openings 228 are defined through the bone plate 200 to receive the bone screws 202. These three openings 228 have the cross-sectional configuration shown in
Each of the openings 228 is defined about an axis A. Each opening 228 comprises a generally cylindrical throughbore 230 defined by an inner surface 234. The throughbore 230 is centered about the axis A.
An integral flange 232 is located in the throughbore 230 and directed radially inward toward axis A. The flange 232 is spaced from the top and bottom surfaces 216, 218 of the bone plate 200. This flange 232 is disposed annularly about axis A and generally perpendicular to axis A. The flange 232 tapers in cross-section from the inner surface 234 to an end surface 236. The end surface 236 defines an opening (not numbered) that is cylindrical in shape. The taper of the flange 232 is symmetrically formed by upper and lower surfaces 240, 242. The upper surface 240 extends at an acute angle α from end surface 236 to inner surface 234. The lower surface 242 extends at the same acute angle α from the end surface 236 to the inner surface 234, but in the opposite direction.
Referring back to
A central opening 246 is located in the recess 244 and is defined through the bone plate 200. The central opening 246 receives a bone screw 202 like the openings 228 do, but has a different cross-section than the openings 228. The central opening 246 is a generally cylindrical throughbore of a single diameter that is substantially perpendicular to the bone plate 200 at that location.
An axis C defines a center of the throughbore 246, as shown in
Referring to
The bone plate 200 may be formed of stainless steel, cobalt base alloys, bioceramics, titanium alloys, titanium, or other biocompatible materials.
Bone screws 202 are shown in
An arcuate segment 254 extends from the base plate 208 to the mounting end 210. A rib 256 is disposed partially on the base plate 208, extends along the arcuate segment 254, and ends at the mounting end 210. The rib 256 provides additional rigidity to the extension arm 206 to prevent bending, buckling, twisting, or other deformation of the extension arm 206.
A mounting surface 258 is located at the mounting end 210 of the extension arm 206. The mounting surface 258 is configured to support the connector assembly 214 and tracking head 212. The mounting surface 258 is generally planar. A threaded opening 260 is defined through the mounting end 210 for receiving a threaded adjustment fastener 261 (see
The extension arm 206 interconnects the bone plate 200 and the tracking head 212. The extension arm 206 spaces the tracking elements (such as LEDs 50) of the tracking head 212 from the bone plate 200. The tracking elements are spaced in this manner to extend above the anatomy thereby improving line-of-sight potential between the tracking elements and the optical sensors 40 of camera unit 36.
Referring to
The tissue receiving area 262 enables the user to retract soft tissue away from bone, mount the bone plate 200 directly to the bone, and then release the soft tissue back to a position above the bone plate 200. Accordingly, the soft tissue is not required to be continually retracted during the entire surgical procedure.
The bone plate 200 is firmly mounted to bone unicortically, meaning the bone screws 202 penetrate the cortical bone layer only once, from the outside.
The extension arm 206 may be formed of stainless steel, cobalt base alloys, bioceramics, titanium alloys, titanium, or other biocompatible materials.
Referring to
The tracking head 212 includes a first hinge member 264 for mounting to the connector assembly 214. The first hinge member 264 defines a non-threaded bore 268.
The connector assembly 214 is shown in
When the adjustment fastener 286 is tightened, the second hinge members 278 are drawn together to compress against the first hinge member 264. This prevents movement of the first hinge member 264 relative to the second hinge members 278. When the adjustment fastener 286 is loosened, the second hinge members 278 relax to a non-compressed position in which the first hinge member 264 is freely movable in the space 280.
The tracking head 212 can be tilted relative to the bone plate 200 via the hinge created by the hinge members 264, 278. Tilting occurs in one degree of freedom about pivot axis P (see
The connector 270 also has a rotational base 290. The rotational base 290 is integral with the second hinge members 278. The rotational base 290 has a flat bottom (not numbered) for mating with the mounting surface 258.
The rotational base 290 defines an opening 292. The opening 292 is shaped to receive a frusto-conical head (not separately numbered) of the adjustment fastener 261 (see
When the adjustment fastener 261 is tightened, the rotational base 290 is drawn against the mounting surface 258. Friction between the bottom of the rotational base 290 and the mounting surface 258 prevents rotational movement of the connector 270 relative to the mounting surface 258. When the adjustment fastener 261 is loosened, the connector 270 can be rotated freely relative to the mounting surface 258. Thus, the tracking head 212 can be rotated relative to the bone plate 200. Rotation occurs in one degree of freedom about rotational axis R (see
Some of the tracking elements, such as the LEDs 50, rely on line-of-sight with the optical sensors 40 to transmit tracking signals to the optical sensors 40. As a result, these tracking elements are also referred to as line-of-sight tracking elements. These tracking elements must be within the field of view of the camera unit 36 and not be blocked from transmitting tracking signals to the camera unit 36. When the signal path of one or more tracking elements is obstructed, an error message, in certain situations, may be generated.
The optical sensors 40 of the navigation system 20 are configured to receive signals from the LEDs 50. The navigation system 20 controls activation of the LEDs 50, as previously described, so that the navigation system 20 can anticipate when a signal should be received. When an anticipated signal from an LED 50 is not received, one possibility is that the signal path is obstructed and the signal is blocked from being sent to the camera unit 36. Another possibility is that the LED 50 is not functioning properly.
The navigation computer 26 determines that there is an error if any one of the optical sensors 40 fails to receive a signal from an LED 50, even though other sensors 40 still receive the signal. In other embodiments, the navigation computer 26 determines that there is an error if none of the optical sensors 40 receive the signal. In either case, when the navigation system 20 determines that there is an error based on the failure of one or more sensors 40 to receive signals from one or more LEDs 50, an error signal is generated by the navigation computer 26. An error message then appears on displays 28, 29. The navigation computer 26 also transmits an error signal to the tracker controller 62.
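The two error rules described above can be sketched as follows. This is an illustrative model only; the function and parameter names are assumptions, not part of the disclosure.

```python
def detect_error(sensor_received, require_all_missed=False):
    """Decide whether an error signal should be generated for one LED firing.

    sensor_received: one boolean per optical sensor 40, True if that sensor
    received the anticipated signal from the LED 50.
    require_all_missed: False models the first rule (an error if any single
    sensor missed the signal); True models the alternative rule (an error
    only if no sensor received the signal).
    """
    if require_all_missed:
        return not any(sensor_received)
    return not all(sensor_received)
```

For example, with two sensors where only one received the signal, the first rule flags an error while the alternative rule does not.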
An error indicator 300 is located on tracking head 312 in the embodiment shown in
The indicator 300 includes indicating light emitters such as indicating light emitting diodes (LEDs) 302. The indicating LEDs 302 emit a first colored light, such as a red, yellow, or orange light, when the tracker controller 62 receives the error signal from the navigation computer 26. The indicating LEDs 302 emit a second colored light, such as a green or blue light, when no error signal is received or when an all-clear signal is received from the navigation computer 26 after the error is cleared. This indicates that the line-of-sight is not broken and is being maintained between at least a required number of the LEDs 50 and the optical sensor or sensors 40. When an error is again detected, the light changes from the second colored light to the first colored light. It should be appreciated that the indicating LEDs 302 may include separate indicating LEDs that are alternately activated based on the error status: one or more first colored indicating LEDs for an error and one or more second colored indicating LEDs for no error. In other embodiments, the indicator 300 may include an LED or LCD display with an error message and/or audible alerts when there is an error.
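The error/all-clear behavior of the indicator 300 can be modeled as a small state machine. The class and method names here are illustrative assumptions; the color choices are examples drawn from the description above.

```python
class ErrorIndicator:
    """Illustrative model of the indicator 300: the first colored light is
    shown while an error signal is in effect, and the second colored light
    is shown when no error signal is received or after an all-clear."""

    def __init__(self, first_color="orange", second_color="green"):
        self.first_color = first_color    # e.g. red, yellow, or orange
        self.second_color = second_color  # e.g. green or blue
        self.error_active = False

    def on_error_signal(self):
        # Error signal received from the navigation computer.
        self.error_active = True

    def on_all_clear(self):
        # All-clear signal received after the error is cleared.
        self.error_active = False

    @property
    def color(self):
        return self.first_color if self.error_active else self.second_color
```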
Tracking head 312 has a body 306 supporting the LEDs 302. The body 306 defines openings (not numbered) covered by transparent windows 310. The LEDs 302 are located inside the body 306 behind the windows 310 so that light from the LEDs 302 is emitted through the windows 310 and is visible to the user. The windows 310 may also include lenses (not shown) to provide desired illumination characteristics for the LEDs 302.
In another embodiment shown in
A light ring 430 is captured between the top 414 and bottom 420. The indicator LEDs 402, 404 are located inside the tracking head 412 within the light ring 430, as schematically shown in
The light ring 430 is illuminated with the orange colored light from the first indicator LEDs 402 when the tracker controller 62 receives the error signal from the navigation computer 26. The light ring 430 is illuminated with the green colored light from the second indicator LEDs 404 when no error signal is received or when an all clear signal is received from the navigation computer 26 after the error is cleared. When an error is again detected the light ring 430 changes from being illuminated green to being illuminated orange. The indicator LEDs 402, 404 are alternately activated based on the error status—one or more first indicator LEDs 402 are activated for error conditions and one or more second indicator LEDs 404 are activated when no error conditions exist.
Alternating activation of the indicator LEDs 402, 404 is carried out using the circuit shown in
Each of the indicator LEDs 402, 404 includes an anode 432 and a cathode 434. In
Switching elements M1, M2 respectively connect between the cathodes 434 of the indicator LEDs 402, 404 and the second voltage reference 438. The switching elements M1, M2 control current flow through the indicator LEDs 402, 404, thereby controlling operation of the indicator LEDs 402, 404. The switching elements M1, M2 may be further defined as transistors, and more specifically, N-channel MOSFETs. Each of the MOSFETs M1, M2 includes a gate, source, and drain. The gate of each MOSFET M1, M2 connects to the tracker controller 62. The source of each MOSFET M1, M2 connects to the second voltage reference 438, and more specifically, signal ground. The drain of each MOSFET M1, M2 connects to the cathode 434 of the respective indicator LED 402, 404. Resistors R1, R2 respectively connect between the cathode 434 of each of the indicator LEDs 402, 404 and the drain of each MOSFET M1, M2. The resistors R1, R2 limit current flow through the indicator LEDs 402, 404 to suitable operating levels.
The tracker controller 62 controls activation/deactivation of the indicator LEDs 402, 404. The tracker controller 62 selectively controls the switching elements M1, M2 to allow or prevent current flow through the indicator LEDs 402, 404.
In one embodiment, the tracker controller 62 sends a first indicator control signal to the gate of the MOSFET M1. The tracker controller 62 may send the first indicator control signal in response to the tracker controller 62 receiving the error signal from the navigation computer 26. The first indicator control signal causes the MOSFET M1 to form a closed circuit path between the source and the drain such that current freely flows through indicator LED 402 between the first and second voltage references 436, 438 to illuminate indicator LED 402.
In other embodiments, the tracker controller 62 sends a second indicator control signal to the gate of the MOSFET M2. The tracker controller 62 may send the second indicator control signal in response to the tracker controller 62 receiving an all clear signal or no error signal from the navigation computer 26. In turn, the second indicator control signal causes the MOSFET M2 to form a closed circuit path between the source and the drain such that current freely flows through indicator LED 404 between the first and second voltage references 436, 438 to illuminate indicator LED 404.
The first and second indicator control signals may correspond to any suitable predetermined voltage or current for controlling MOSFETs M1, M2.
The tracker controller 62 may alternate activation/deactivation of the indicator LEDs 402, 404. In one embodiment, the tracker controller 62 ceases sending the second indicator control signal to the gate of the MOSFET M2 to deactivate indicator LED 404. According to one embodiment, the tracker controller 62 does so during activation of indicator LED 402. With the second indicator control signal removed, the MOSFET M2 forms an open circuit between the source and the drain such that current is prevented from flowing through indicator LED 404 between the first and second voltage references 436, 438. Alternatively, the tracker controller 62 may cease sending the first indicator control signal to the gate of the MOSFET M1 to deactivate indicator LED 402 during activation of indicator LED 404.
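The alternating gate control described above can be sketched as follows. Here `set_gate` is a hypothetical stand-in for the tracker controller 62 driving the gates of MOSFETs M1 and M2; a high gate closes the source-drain path and illuminates the corresponding indicator LED. All names are illustrative.

```python
class IndicatorDriver:
    """Sketch of alternating activation of indicator LEDs 402, 404 via
    two N-channel MOSFET gates (an assumption-based model, not the
    disclosure's firmware)."""

    def __init__(self, set_gate):
        self.set_gate = set_gate  # callable: set_gate(mosfet_name, high)

    def show_error(self):
        self.set_gate("M2", False)  # open M2: indicator LED 404 (no-error) off
        self.set_gate("M1", True)   # close M1: indicator LED 402 (error) on

    def show_all_clear(self):
        self.set_gate("M1", False)  # open M1: indicator LED 402 off
        self.set_gate("M2", True)   # close M2: indicator LED 404 on
```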
In one embodiment, the indicator LEDs 402, 404 may be selectively detachable from and attachable to the tracking head 412. The tracking head 412 may include a printed circuit board (PCB) assembly (not shown) disposed therein with the indicator LEDs 402, 404 being electrically connected to the PCB assembly. In
Control of the LEDs 50 is carried out using the circuit shown in
In
Switching elements M3, M4, M5, M6 respectively connect to the cathodes 452 of the LEDs 50. The switching elements M3, M4, M5, M6 control current flow through the LEDs 50 thereby controlling operation of the LEDs 50. The switching elements M3, M4, M5, M6 may be further defined as transistors, and more specifically, N-channel MOSFETs. Each of the MOSFETs M3, M4, M5, M6 includes a gate, source, and drain. The gate of each MOSFET M3, M4, M5, M6 connects to the tracker controller 62. The source of each MOSFET M3, M4, M5, M6 ultimately connects to the fourth voltage reference 456, and more specifically, signal ground. In
The tracker controller 62 controls activation/deactivation of the LEDs 50. Mainly, the tracker controller 62 selectively controls the switching elements M3, M4, M5, M6 to allow or prevent current flow through the LEDs 50.
In one embodiment, the tracker controller 62 receives an input signal indicating how the tracker controller 62 is to control any given LED 50 or combination of LEDs 50. The tracker controller 62 may receive the input signal from the navigation computer 26. In response, the tracker controller 62 sends an LED control signal to the gate of the MOSFET or MOSFETs M3, M4, M5, M6 connected to any given LED 50 or combination of LEDs 50. In
If the input signal indicates that the tracker controller 62 is to activate any given LED 50, the tracker controller 62 sends the LED control signal to the gate of the respective MOSFET M3, M4, M5, M6. Doing so causes the MOSFET M3, M4, M5, M6 to form a closed circuit path between the source and the drain such that current freely flows through the respective LED 50 between the third and fourth voltage references 454, 456 to illuminate the respective LED 50.
If no input signal is received by the tracker controller 62, or when the input signal indicates that the tracker controller 62 is to deactivate any given LED 50, the tracker controller 62 removes the LED control signal from the gate of the MOSFET M3, M4, M5, M6 to deactivate the given LED 50. Mainly, absent the LED control signal, the MOSFET M3, M4, M5, M6 forms an open circuit between the source and the drain such that current is prevented from flowing between the third voltage reference 454 and the signal ground 456 through the LED 50.
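The activation/deactivation scheme for the tracking LEDs 50 can be sketched in the same style. `set_gate` again stands in for the controller output, and the mapping from LED index to MOSFET is an illustrative assumption.

```python
def set_tracking_led(set_gate, led_index, on):
    """Drive one of four tracking LEDs 50 through its MOSFET M3-M6.
    A high gate closes the source-drain path and illuminates the LED;
    a low gate opens the path and deactivates it."""
    gate = ("M3", "M4", "M5", "M6")[led_index]
    set_gate(gate, on)

def run_tracking_cycle(set_gate, count=4):
    # Sequentially activate then deactivate each LED, one at a time,
    # as in the tracking cycles described elsewhere in this disclosure.
    for i in range(count):
        set_tracking_led(set_gate, i, True)
        set_tracking_led(set_gate, i, False)
```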
The LEDs 50 may be detachable from and attachable to the tracking head 412. In
In
The voltage sensing circuit 470 sends to the tracker controller 62 a voltage sense signal representing a measured operating voltage of the LEDs 50. In one embodiment, the voltage sensing circuit 470 protects the LEDs 50 from inappropriate voltage conditions. The tracker controller 62 may process the voltage sense signal and, in response, modify the LED control signal based on the value of the voltage sense signal. For instance, the tracker controller 62 may change the voltage of the LED control signal(s) if the tracker controller 62 determines that the voltage sense signal is above a predetermined threshold level. In another embodiment, the tracker controller 62 utilizes the voltage sensing circuit 470 for determining whether an LED 50 is malfunctioning. The tracker controller 62 can communicate the malfunction of the LED 50 to the navigation system 20 so that the navigation system 20 can anticipate such malfunction and respond accordingly. The voltage sensing circuit 470 may be implemented according to various other configurations and methods.
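A simple classification of the voltage sense signal might look like the following. The threshold values here are illustrative assumptions only; a failed (open) LED typically reads high, while a shorted LED reads low.

```python
def classify_led_voltage(v_sense, v_min=1.5, v_max=3.5):
    """Classify a measured LED operating voltage from a voltage sensing
    circuit such as circuit 470. Thresholds (volts) are hypothetical."""
    if v_sense > v_max:
        return "over-threshold"   # modify the LED control signal or flag a fault
    if v_sense < v_min:
        return "under-threshold"  # possible malfunctioning (shorted) LED
    return "ok"
```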
In
In one mode of operation, the current sensing circuit 480 provides to the tracker controller 62 a current sense signal. The current sense signal may be derived from the measured operating current of the LEDs 50. The tracker controller 62 may process the current sense signal and determine whether the current sense signal conforms to a predetermined value. For instance, the tracker controller 62 may determine that the current sense signal is above a predetermined threshold voltage level.
A current limiting circuit 482 is further provided in
The first input terminal 486 of the amplifier 484 connects to the tracker controller 62. The second input terminal 488 of the amplifier 484 connects to the gate and the source of the MOSFET M7. A capacitor C3 is included between the second input terminal 488 and the gate of the MOSFET M7. A resistor R5 is included between the second input terminal 488 and the source of the MOSFET M7. Resistor R6 connects between the source of the MOSFET M7 and the signal ground 456. The output terminal 490 of the amplifier 484 connects to the gate of the MOSFET M7. Resistor R7 connects between the output terminal 490 of the amplifier 484 and the gate of the MOSFET M7. The drain of the MOSFET M7 connects to the sources of the MOSFETs M3, M4, M5, M6 at the shared line 458.
In one mode of operation, the tracker controller 62 sends a current limiting signal to the current limiting circuit 482. The tracker controller 62 may send the current limiting signal based on the value of the current sense signal provided by the current sensing circuit 480. In
The current sensing circuit 480 and the current limiting circuit 482 may be implemented according to various other configurations and methods.
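One hypothetical software step tying the current sense signal to the current limiting signal is sketched below. The numbers, step size, and names are assumptions for illustration; the disclosure leaves the implementation open.

```python
def adjust_current_limit(i_sense, i_target, v_limit, step=0.05, v_max=3.3):
    """One feedback step for a current limiting signal: reduce the setpoint
    sent to a current limiting circuit (such as circuit 482) when the sensed
    current exceeds the target, otherwise raise it back toward a cap."""
    if i_sense > i_target:
        return max(0.0, v_limit - step)   # sensed current too high: back off
    return min(v_max, v_limit + step)     # headroom available: restore drive
```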
In some embodiments, the tracker 44 may include four or more tracking LEDs 50 so that if the tracking signal from one of the LEDs 50 is obstructed, position and orientation data can still be obtained from the remaining LEDs 50. In this instance, before any error signals are generated, the navigation computer 26 will first run through a complete tracking cycle. The complete tracking cycle includes sequentially activating all the LEDs 50 on the tracker 44 to determine if the optical sensors 40 receive tracking signals from at least three of the LEDs 50 in the tracking cycle. The error signal is then generated if an optical sensor 40 (or all optical sensors 40 in some embodiments) did not receive tracking signals from at least three LEDs 50 in the tracking cycle.
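The complete-tracking-cycle rule above can be sketched as follows; the function and parameter names are illustrative assumptions.

```python
def cycle_error(received_counts, min_leds=3, require_all=False):
    """After one complete tracking cycle, received_counts holds, for each
    optical sensor 40, the number of the tracker's LEDs 50 from which that
    sensor received tracking signals. An error is generated if a sensor saw
    fewer than min_leds LEDs; require_all=True models the variant in which
    all sensors must fall short before an error is generated."""
    misses = [n < min_leds for n in received_counts]
    return all(misses) if require_all else any(misses)
```

For example, with four LEDs on the tracker, a sensor that received signals from only two LEDs in the cycle triggers the error under the per-sensor rule.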
The navigation system 20 is configured to assist with positioning of the tracker 44 by the surgeon or other medical personnel. This assistance helps to place the tracking head 212 (or tracking heads 312, 412) in a desired orientation that provides line-of-sight between the LEDs 50 and the optical sensors 40 and helps to reduce line-of-sight errors that may otherwise be encountered during a surgical procedure.
Once the bone plate 200 is mounted to the bone, such as femur F, the tracking head 212 is movable relative to the bone plate 200 via the connector assembly 214. In particular, the tracking head 212 is movable about pivot axis P and rotational axis R (see
Before navigation begins, the medical personnel are instructed to place the tracking head 212 in an initial orientation in which, visually, the tracking head 212 appears to be oriented so that the LEDs 50 will be within the line-of-sight of the optical sensors 40 and unobstructed throughout the surgical procedure. Once the user has placed the tracking head 212 in the initial orientation, the navigation computer 26 provides instructions to the medical personnel setting up the tracker 44 on how to further move the tracking head 212 to reach the desired orientation, if necessary.
The navigation computer 26 first determines the initial orientation of the tracking head 212 and whether the tracking head 212 is already in the desired orientation. If the tracking head 212 is not in the desired orientation, the navigation system 20, through software instructions displayed on displays 28, 29, instructs the user to move the tracking head 212 relative to the bone plate 200 in one or more of the degrees of freedom to reach the desired orientation.
The tracker 44 includes an orientation sensor 320, as shown in
The gravity sensor 320 is located inside the tracking head 212. The gravity sensor 320 is operatively connected to the tracker controller 62. The tracker controller 62 receives gravity measurements from the gravity sensor 320 with respect to the x-axis, y-axis, and z-axis. These signals are analyzed by the navigation system 20 to determine the current orientation of the tracking head 212 relative to gravity. It should be appreciated that the accelerometer 70, in some embodiments, could be used as the gravity sensor 320.
Referring to
Referring to
Referring to
In this embodiment, the navigation computer 26 is configured to instruct the user to adjust the tilt angle of the tracking head 212 until the z-axis gravity measurement is zero. This occurs when the x-y plane of the tracking head 212 is oriented vertically relative to the gravity vector 326, as shown in
The instructions to the user to adjust the tilt angle are carried out by the navigation system 20 through displays 28, 29, in which the current orientation of the tracking head 212 is graphically represented. The current orientation is dynamically adjusted as the user changes the tilt angle. The desired orientation of the tracking head 212 is also shown graphically so that the user can visually determine how close the current orientation is to the desired orientation. When the current orientation is at the desired orientation, the displays 28, 29 may flash green or have some other visual indicator that the tracking head 212 is in the desired orientation relative to gravity. It should be appreciated that the desired orientation may include predefined deviations from an ideal orientation in which the x-y plane is perfectly vertical relative to gravity, such as deviations of +/− ten percent, +/− five percent, or +/− two percent.
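The tilt check described above reduces to comparing the z-axis gravity measurement against a tolerance. The sketch below assumes a z-axis accelerometer reading in m/s²; all names are hypothetical.

```python
import math

def tilt_error_deg(gz, g=9.81):
    """Degrees of tilt remaining before the x-y plane of the tracking head
    is vertical (i.e., before the z-axis gravity measurement reads zero).
    gz is the z-axis reading from a gravity sensor such as sensor 320."""
    gz = max(-g, min(g, gz))  # clamp sensor noise into the valid range
    return math.degrees(math.asin(gz / g))

def at_desired_tilt(gz, g=9.81, tolerance=0.05):
    # Within a predefined deviation of vertical (here +/- five percent of g,
    # mirroring the predefined deviations mentioned above).
    return abs(gz) <= tolerance * g
```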
In alternative embodiments, an LED on the tracking head 212 may indicate to the user when the current orientation is at the desired orientation by being activated to emit a green colored light. Alternatively, an audible indicator may be provided on the tracker 44 to indicate that the tracking head 212 is in the desired orientation.
Once the tilt angle is set so that the tracking head 212 is at the desired orientation relative to gravity (see, e.g.,
In other cases, rotational adjustment about rotational axis R may be needed after the tilt adjustment to place the tracking head 212 in the desired orientation. In these cases, the desired orientation may include an additional rotational adjustment in which the gravity measurement along the z-axis moves from being approximately zero to being non-zero. Adjustment may also be iterative in which tilt adjustment is performed first to place the tracking head 212 vertically, then rotational adjustment is performed which causes the gravity measurement along the z-axis to be non-zero, and then further tilt adjustment is performed to place the z-axis back to approximately zero (i.e., to move the tracking head 212 back to vertical).
During rotational adjustment, the LEDs 50 are being tracked by the optical sensors 40. In particular, the navigation computer 26 is configured to determine, based on the signals received by the optical sensors 40, which rotational orientation of the tracking head 212 provides the best line-of-sight to the LEDs 50.
The desired rotational orientation can be determined by instructing the user to rotate the tracking head 212 through a maximum range of movement, e.g., 360 degrees, one or more times. While the tracking head 212 is rotated, the navigation computer 26 determines at which rotational positions (e.g., rotational angles) about axis R line-of-sight is present for each LED 50 and at which rotational positions there is no line-of-sight for each LED 50. This may include rotating the tracking head 212 through its maximum range of movement at various positions of the knee joint, i.e., at the flexed and at the extended positions of the knee joint. This will determine a range of line-of-sight positions about axis R for each LED 50. A best fit algorithm can then be used to determine the position about axis R that best fits within the ranges of line-of-sight positions for all of the LEDs 50.
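One simple stand-in for the best fit algorithm mentioned above is an exhaustive scan of candidate angles; the disclosure does not specify the algorithm, so this is an assumption-based sketch.

```python
def best_fit_rotation(los_ranges, step=1.0):
    """Pick a rotational position about axis R that lies inside the observed
    line-of-sight range of as many LEDs 50 as possible (all of them in the
    ideal case). los_ranges: per-LED (start, end) angle intervals, degrees."""
    best_angle, best_count = 0.0, -1
    angle = 0.0
    while angle < 360.0:
        count = sum(1 for lo, hi in los_ranges if lo <= angle <= hi)
        if count > best_count:
            best_angle, best_count = angle, count
        angle += step
    return best_angle
```

For two LEDs with line-of-sight ranges of 10-50 and 30-90 degrees, the scan settles on the first angle covered by both ranges.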
The navigation computer 26 instructs the user, through the displays 28, 29 to rotate the tracking head 212 until the current rotational position meets the desired rotational position determined by the navigation computer 26. See, for instance, the desired rotational position shown in
Like with adjusting the tilt angle, an LED on the tracking head 212 may indicate to the user when the rotational position is at the desired rotational position by being activated to emit a green colored light. Alternatively, an audible indicator may be provided on the tracker 44 to indicate that the current rotational position is at the desired rotational position. It should be appreciated that the desired rotational position may include predefined deviations from an ideal rotational position, such as deviations of +/− ten percent, +/− five percent, or +/− two percent.
When the desired rotational orientation is met, the adjustment fastener 261 of the connector assembly 214 is tightened so that the tracking head 212 is unable to rotate relative to the bone plate 200. The tracking head 212 is now fixed from moving relative to the bone plate 200 and the bone.
In some embodiments, the LEDs 50 are being tracked by the optical sensors 40 during adjustment of both the tilt angle and rotational angle. In particular, the navigation computer 26 is configured to determine, based on the signals received by the optical sensors 40, which tilt and rotational orientation of the tracking head 212 provides the best line-of-sight from the LEDs 50 to the optical sensors 40.
In these embodiments, the desired orientation can be determined by instructing the user to tilt and rotate the tracking head 212 through their maximum ranges of movement, one or more times, either sequentially or alternately. While the tracking head 212 is tilted and rotated, the navigation computer 26 determines at which tilt and rotational positions (e.g., tilt and rotational angles) about axes P and R line-of-sight is present for each LED 50 and at which tilt and rotational positions there is no line-of-sight for each LED 50. This will determine a range of line-of-sight positions about axes P and R for each LED 50. A best fit algorithm can then be used to determine the positions about axes P and R that best fit within the ranges of line-of-sight positions for all of the LEDs 50.
This process may be iterative and include several adjustments by the user about the axes P and R to find a suitable position for the tracking head 212. In certain instances, the navigation computer 26 may be unable to identify an orientation in which line-of-sight is maintained for all of the LEDs 50 because of a poor initial orientation set by the user. In this case, the navigation computer 26 may first instruct the user to reorient the tracking head 212 so that the LEDs 50 visually appear to be facing the optical sensors 40 and then continue with measuring the orientation of the tracking head 212 through various movements to find the best fit that maintains the line-of-sight for all of the LEDs 50.
In some cases, tilting adjustment may be processed first with the tracking head 212 being moved through its entire range of tilting motion, one or more times, and possibly at multiple knee positions including flexed and extended positions. The best fit tilt angle is then determined and the tilt angle is then fixed at the best fit tilt angle. The tracking head 212 can thereafter be moved through its entire range of rotational motion, one or more times, and possibly at multiple knee positions including flexed and extended positions. The best fit rotational angle is then determined and the rotational angle is then fixed at the best fit rotational angle.
In some embodiments, a third degree of freedom may be adjusted to a desired position, such as a height of the tracker 44. In the embodiment shown, only two degrees of freedom are adjusted because the tracker 44 moves up and down as the surgeon flexes the knee joint. In this case, the LEDs 50 of the tracking head 212 may be raised or lowered relative to the optical sensors 40 without breaking the line-of-sight between the LEDs 50 and the sensors 40.
A screw driver 500 is shown in
Referring to
The screw driver 500 includes a body. The body comprises a nose tube 508, middle tube 510, and rear cap 512. The nose tube 508, middle tube 510, and rear cap 512 are separate parts that are releasably connected together for purposes of assembling internal components. It should be appreciated that in other embodiments, the nose tube 508, middle tube 510, and rear cap 512 may be permanently fixed together.
The nose tube 508 has a generally conical distal end 514. The nose tube 508 extends from its distal end 514 to an externally threaded proximal end 516. The nose tube 508 is hollow. The nose tube 508 defines a proximal bore 518 and distal bore 520. The proximal bore 518 is larger in cross-sectional area than the distal bore 520. The proximal bore 518 is circular in cross-section and the distal bore 520 is hexagonal in cross-section.
The middle tube 510 is generally cylindrical. The middle tube 510 has an internally threaded distal end 522 and an externally threaded proximal end 524. The internally threaded distal end 522 threads onto the externally threaded proximal end 516 of the nose tube 508.
The rear cap 512 has a top 526. The rear cap 512 extends distally from the top 526 to an internally threaded section 528. The internally threaded section 528 threads onto the externally threaded proximal end 524 of the middle tube 510.
The impactor 502 includes a hammer 530 disposed in the middle tube 510. The hammer 530 is generally cylindrical in shape. The rear spring 504 is disposed between the rear cap 512 and the hammer 530. The rear spring 504 biases the hammer 530 distally. Compression of the rear spring 504 can be adjusted by loosening or tightening the rear cap 512 to decrease or increase the stored energy of the hammer 530.
A receiving hole 532 is formed in the hammer 530. The receiving hole 532 is disposed about a central axis of the hammer 530. The receiving hole 532 is cylindrically shaped. A cross bore 534 is formed in a direction perpendicular to the receiving hole 532 (see
The impactor 502 includes a trigger 536 located in the cross bore 534. The trigger 536 is semi-cylindrical in shape. The trigger 536 has a flat bottom 538 (see
A driving rod 544 ultimately receives the energy stored in the impactor 502 to drive in the bone screws 202. The driving rod 544 has a cylindrical shaft 546 with a proximal end 548. The proximal end 548 is shaped for mating reception within the receiving hole 532 of the hammer 530.
A boss 550 is located on the proximal end 548 of the driving rod 544 to form a shoulder 552 (see
The driving rod 544 has a hexagonal shaft 554 with a distal end 556. The hexagonal shaft 554 mates with the hexagonal-shaped distal bore 520 of the nose tube 508 so that rotation of the body by the user also rotates the hexagonal shaft 554. The distal end 556 has features adapted to engage heads of the bone screws 202 to rotate and drive the bone screws 202 into the bone when the body of the screw driver 500 is rotated.
A collar 558 is fixed to the hexagonal shaft 554. The collar 558 prevents the driving rod 544 from falling out of the nose tube 508. The collar 558 is cylindrical in shape to mate with the proximal bore 518 of the nose tube 508.
A spring cap 560 is centrally disposed inside the middle tube 510 between the nose tube 508 and the hammer 530. The spring cap 560 is press fit inside the middle tube 510.
A rod spring 562 is disposed about the cylindrical shaft 546 of the driving rod 544. The rod spring 562 acts between the spring cap 560 and the collar 558 of the driving rod 544 to return the screw driver 500 to the rest state after the hammer 530 is actuated. The spring cap 560 defines a throughbore (not numbered) for receiving the cylindrical shaft 546 of the driving rod 544 and centering the cylindrical shaft 546 inside the middle tube 510.
The screw driver 500 is pressed against the bone screw 202 by gripping the rear cap 512 and/or middle tube 510 and urging them distally. The bone screw 202 is thus pressed against the bone. At the same time, the driving rod 544 travels proximally in the middle tube 510. The shoulder 552 pushes the trigger 536 proximally while the leaf spring 542 keeps the throughbore 540 of the trigger 536 misaligned with the receiving hole 532 of the hammer 530.
The middle tube 510 includes an inclined inner surface 566. The inclined inner surface 566 engages the trigger 536 when the trigger 536 reaches the inclined inner surface 566. When this occurs, the inclined inner surface 566 acts like a cam to push the trigger 536 in a manner that centers the throughbore 540 of the trigger 536 and places the throughbore 540 into alignment with the receiving hole 532 of the hammer 530. Likewise, the proximal end 548 of the driving rod 544 is now aligned to fit within the throughbore 540, which allows the trigger 536 to slide down the driving rod 544 thereby releasing the hammer 530. The hammer 530 then moves forward, propelled by the rear spring 504.
Because the receiving hole 532 in the hammer 530 has a predefined depth, the boss 550 on the proximal end 548 of the driving rod 544 eventually bottoms out in the receiving hole 532 and the force of the hammer 530 is transmitted into the driving rod 544 and the bone screw 202 punching the bone screw 202 into the bone.
In one embodiment, when each of the trackers 44, 46, 48 are being actively tracked, the firing of the LEDs occurs such that one LED 50 from tracker 44 is fired, then one LED 50 from tracker 46, then one LED 50 from tracker 48, then a second LED 50 from tracker 44, then a second LED 50 from tracker 46, and so on until all LEDs 50 have been fired and then the sequence repeats. This order of firing may occur through instruction signals sent from the transceivers (not shown) on the camera unit 36 to transceivers (not shown) on the trackers 44, 46, 48 or through wired connections from the navigation computer 26 to the tracker controller 62 on each of the trackers 44, 46, 48.
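The round-robin firing order above can be sketched as follows; the data structure is an illustrative assumption.

```python
def interleaved_firing_order(tracker_leds):
    """One LED from each tracker in turn until every LED has fired, then the
    sequence repeats. tracker_leds maps a tracker reference (e.g. 44, 46, 48)
    to the ordered list of its LED identifiers."""
    order = []
    rounds = max(len(leds) for leds in tracker_leds.values())
    for i in range(rounds):
        for tracker, leds in tracker_leds.items():
            if i < len(leds):
                order.append((tracker, leds[i]))
    return order
```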
The navigation system 20 can be used in a closed loop manner to control surgical procedures carried out by surgical cutting instruments. Both the instrument 22 and the anatomy being cut are outfitted with trackers such that the navigation system 20 can track the position and orientation of the instrument 22 and the anatomy being cut, such as bone.
In one embodiment, the navigation system 20 is part of a robotic surgical system for treating tissue. In some versions, the robotic surgical system is a robotic surgical cutting system for cutting away material from a patient's anatomy, such as bone or soft tissue. The cutting system could be used to prepare bone for surgical implants such as hip and knee implants, including unicompartmental, bicompartmental, or total knee implants. Some of these types of implants are shown in U.S. patent application Ser. No. 13/530,527, entitled, “Prosthetic Implant and Method of Implantation”, the disclosure of which is hereby incorporated by reference.
In one embodiment, the navigation system 20 communicates with a robotic control system (which can include the manipulator controller 54). The navigation system 20 communicates position and/or orientation data to the robotic control system. The position and/or orientation data is indicative of a position and/or orientation of the instrument 22 relative to the anatomy. This communication provides closed loop control of the cutting of the anatomy such that the cutting occurs within a predefined boundary.
In one embodiment, each of the femur F and tibia T has a target volume of material that is to be removed by the working end of the surgical instrument 22. The target volumes are defined by one or more boundaries. The boundaries define the surfaces of the bone that should remain after the procedure. In some embodiments, navigation system 20 tracks and controls the surgical instrument 22 to ensure that the working end, e.g., bur, only removes the target volume of material and does not extend beyond the boundary, as disclosed in U.S. patent application Ser. No. 13/958,070, filed Aug. 2, 2013, entitled, “Surgical Manipulator Capable of Controlling a Surgical Instrument in Multiple Modes”, the disclosure of which is hereby incorporated by reference.
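The closed-loop boundary constraint described above can be illustrated with a simple sketch. This is a hypothetical example, not the patented control method: it assumes a spherical boundary around the target volume and a single bur-tip position per control iteration, whereas the actual boundaries are defined by the surfaces of the bone that should remain.

```python
# Illustrative closed-loop boundary check (hypothetical, simplified):
# the cutter is enabled only while the working end (e.g., bur tip)
# remains inside a predefined boundary, here modeled as a sphere.

def within_boundary(tip, center, radius):
    """Return True if the bur tip lies inside the spherical boundary."""
    dx = tip[0] - center[0]
    dy = tip[1] - center[1]
    dz = tip[2] - center[2]
    return dx * dx + dy * dy + dz * dz <= radius * radius

def control_step(tip, center, radius):
    """One control-loop iteration: enable cutting only inside the boundary."""
    return "cut" if within_boundary(tip, center, radius) else "hold"
```

In an actual system, the tracked pose of the instrument relative to the anatomy would be re-evaluated against the boundary on every navigation update, and the robotic control system would constrain or retract the working end accordingly.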
Several embodiments have been discussed in the foregoing description. However, the embodiments discussed herein are not intended to be exhaustive or to limit the invention to any particular form. The terminology which has been used is intended to be in the nature of words of description rather than of limitation. Many modifications and variations are possible in light of the above teachings, and the invention may be practiced otherwise than as specifically described.
This application is a continuation of U.S. patent application Ser. No. 15/407,639, filed on Jan. 17, 2017, which is a divisional of U.S. patent application Ser. No. 14/156,856, filed on Jan. 16, 2014, now U.S. Pat. No. 9,566,120, which claims priority to and the benefit of U.S. Provisional Patent Application No. 61/753,219, filed on Jan. 16, 2013, the entire contents of each of the above being hereby incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
3741205 | Markoff et al. | Jun 1973 | A |
4362416 | Kaimo | Dec 1982 | A |
5017139 | Mushabac | May 1991 | A |
5108395 | Laurain | Apr 1992 | A |
5108397 | White | Apr 1992 | A |
5142930 | Allen et al. | Sep 1992 | A |
5167464 | Voellmer | Dec 1992 | A |
5174772 | Vranish | Dec 1992 | A |
5368593 | Stark | Nov 1994 | A |
5566681 | Manwaring et al. | Oct 1996 | A |
5676673 | Ferre et al. | Oct 1997 | A |
5683118 | Slocum | Nov 1997 | A |
5748827 | Holl et al. | May 1998 | A |
5834759 | Glossop | Nov 1998 | A |
5855582 | Gildenberg | Jan 1999 | A |
5987960 | Messner et al. | Nov 1999 | A |
6021343 | Foley et al. | Feb 2000 | A |
6052611 | Yanof et al. | Apr 2000 | A |
6066141 | Dall et al. | May 2000 | A |
6167145 | Foley et al. | Dec 2000 | A |
6190395 | Williams | Feb 2001 | B1 |
6193430 | Culpepper et al. | Feb 2001 | B1 |
6203543 | Glossop | Mar 2001 | B1 |
6226548 | Foley et al. | May 2001 | B1 |
6273896 | Franck et al. | Aug 2001 | B1 |
6322562 | Wolter | Nov 2001 | B1 |
6327491 | Franklin et al. | Dec 2001 | B1 |
6377011 | Ben-Ur | Apr 2002 | B1 |
6379071 | Sorvino | Apr 2002 | B1 |
6430434 | Mittelstadt | Aug 2002 | B1 |
6434507 | Clayton et al. | Aug 2002 | B1 |
6484049 | Seeley et al. | Nov 2002 | B1 |
6491699 | Henderson et al. | Dec 2002 | B1 |
6514259 | Picard et al. | Feb 2003 | B2 |
6517484 | Wilk et al. | Feb 2003 | B1 |
6529765 | Franck et al. | Mar 2003 | B1 |
6572624 | U et al. | Jun 2003 | B2 |
6640128 | Vilsmeier et al. | Oct 2003 | B2 |
6719757 | Neubauer et al. | Apr 2004 | B2 |
6729589 | Shelef | May 2004 | B2 |
6738657 | Franklin et al. | May 2004 | B1 |
6746172 | Culpepper | Jun 2004 | B2 |
6856828 | Cossette et al. | Feb 2005 | B2 |
6893447 | Dominguez et al. | May 2005 | B2 |
6932823 | Grimm et al. | Aug 2005 | B2 |
6974461 | Wolter | Dec 2005 | B1 |
6980849 | Sasso | Dec 2005 | B2 |
6993374 | Sasso | Jan 2006 | B2 |
7043961 | Pandey et al. | May 2006 | B2 |
7048741 | Swanson | May 2006 | B2 |
7104996 | Bonutti | Sep 2006 | B2 |
7107091 | Jutras et al. | Sep 2006 | B2 |
7153297 | Peterson | Dec 2006 | B2 |
7153308 | Peterson | Dec 2006 | B2 |
7166114 | Moctezuma De La Barrera et al. | Jan 2007 | B2 |
7233820 | Gilboa | Jun 2007 | B2 |
7274958 | Jutras et al. | Sep 2007 | B2 |
7300432 | Surma et al. | Nov 2007 | B2 |
7302288 | Schellengberg | Nov 2007 | B1 |
7302355 | Jansen et al. | Nov 2007 | B2 |
7314048 | Couture et al. | Jan 2008 | B2 |
7366561 | Mills et al. | Apr 2008 | B2 |
7366562 | Dukesherer et al. | Apr 2008 | B2 |
7377924 | Raistrick et al. | May 2008 | B2 |
7419492 | Yoon et al. | Sep 2008 | B2 |
7458977 | McGinley et al. | Dec 2008 | B2 |
7477926 | McCombs | Jan 2009 | B2 |
7547307 | Carson et al. | Jun 2009 | B2 |
7558617 | Vilsmeier | Jul 2009 | B2 |
7604639 | Swanson | Oct 2009 | B2 |
7641660 | Lakin et al. | Jan 2010 | B2 |
7643862 | Schoenefeld | Jan 2010 | B2 |
7646899 | Fitzpatrick | Jan 2010 | B2 |
7668584 | Jansen | Feb 2010 | B2 |
7688998 | Tuma et al. | Mar 2010 | B2 |
7702379 | Avinash et al. | Apr 2010 | B2 |
7725162 | Malackowski et al. | May 2010 | B2 |
7725182 | Sutardja | May 2010 | B2 |
7726564 | Goldbach | Jun 2010 | B2 |
7734327 | Colquhoun | Jun 2010 | B2 |
7736368 | Couture et al. | Jun 2010 | B2 |
7753910 | Ritland | Jul 2010 | B2 |
7764985 | McCombs et al. | Jul 2010 | B2 |
7771436 | Moctezuma de la Barrera et al. | Aug 2010 | B2 |
7776000 | Schaffrath et al. | Aug 2010 | B2 |
7780681 | Sarin et al. | Aug 2010 | B2 |
7794469 | Kao et al. | Sep 2010 | B2 |
7835778 | Foley et al. | Nov 2010 | B2 |
7857821 | Couture et al. | Dec 2010 | B2 |
7862570 | Russell et al. | Jan 2011 | B2 |
7875039 | Zohra et al. | Jan 2011 | B2 |
7876942 | Gilboa | Jan 2011 | B2 |
7881770 | Melkent et al. | Feb 2011 | B2 |
7925328 | Urquhart et al. | Apr 2011 | B2 |
7951176 | Grady et al. | May 2011 | B2 |
7970174 | Goldbach | Jun 2011 | B2 |
7970190 | Steinle et al. | Jun 2011 | B2 |
7993353 | Roeßner et al. | Aug 2011 | B2 |
7996059 | Porath et al. | Aug 2011 | B2 |
8021369 | Curry | Sep 2011 | B2 |
8066961 | Costello, III et al. | Nov 2011 | B2 |
8105339 | Melkent et al. | Jan 2012 | B2 |
8109877 | Moctezuma de la Barrera et al. | Feb 2012 | B2 |
8114134 | Winslow et al. | Feb 2012 | B2 |
8147496 | Couture et al. | Apr 2012 | B2 |
8152726 | Amiot et al. | Apr 2012 | B2 |
8165658 | Waynik et al. | Apr 2012 | B2 |
8202322 | Doty | Jun 2012 | B2 |
8226724 | Doty | Jul 2012 | B2 |
8239001 | Verard et al. | Aug 2012 | B2 |
8271069 | Jacob et al. | Sep 2012 | B2 |
8277505 | Doty | Oct 2012 | B1 |
8348954 | Carls et al. | Jan 2013 | B2 |
8357165 | Grant et al. | Jan 2013 | B2 |
8382766 | Warkentine et al. | Feb 2013 | B2 |
8386022 | Jutras et al. | Feb 2013 | B2 |
8457719 | Moctezuma de la Barrera et al. | Jun 2013 | B2 |
8460277 | Suarez et al. | Jun 2013 | B2 |
8469965 | Neubauer et al. | Jun 2013 | B2 |
8512346 | Couture et al. | Aug 2013 | B2 |
8535329 | Sarin et al. | Sep 2013 | B2 |
8549732 | Burg et al. | Oct 2013 | B2 |
8611985 | Lavallee et al. | Dec 2013 | B2 |
8644570 | Hartmann et al. | Feb 2014 | B2 |
8657809 | Schoepp | Feb 2014 | B2 |
8672490 | Shafer et al. | Mar 2014 | B2 |
8706185 | Foley et al. | Apr 2014 | B2 |
8709017 | Plasky et al. | Apr 2014 | B2 |
8721660 | Ulfarsson et al. | May 2014 | B2 |
8747419 | Solar et al. | Jun 2014 | B2 |
8800939 | Karsak et al. | Aug 2014 | B2 |
8820729 | Doi et al. | Sep 2014 | B2 |
8845655 | Henderson et al. | Sep 2014 | B2 |
8862200 | Sherman et al. | Oct 2014 | B2 |
8942788 | Roger | Jan 2015 | B2 |
8945132 | Plasky et al. | Feb 2015 | B2 |
9008757 | Wu | Apr 2015 | B2 |
9066751 | Sasso | Jun 2015 | B2 |
9082319 | Shimada et al. | Jul 2015 | B2 |
9085401 | Shafer et al. | Jul 2015 | B2 |
9095376 | Plasky et al. | Aug 2015 | B2 |
9119655 | Bowling et al. | Sep 2015 | B2 |
9125624 | Dekel et al. | Sep 2015 | B2 |
9131987 | Stefanchik et al. | Sep 2015 | B2 |
9157698 | Cosentino | Oct 2015 | B2 |
9161799 | Benson et al. | Oct 2015 | B2 |
9381085 | Axelson, Jr. et al. | Jul 2016 | B2 |
9495509 | Amiot et al. | Nov 2016 | B2 |
9513113 | Yang et al. | Dec 2016 | B2 |
9566120 | Malackowski et al. | Feb 2017 | B2 |
9707043 | Bozung | Jul 2017 | B2 |
9993273 | Moctezuma de la Barrera et al. | Jun 2018 | B2 |
10531925 | Malackowski et al. | Jan 2020 | B2 |
10537395 | Perez | Jan 2020 | B2 |
20020133175 | Carson | Sep 2002 | A1 |
20030069591 | Carson et al. | Apr 2003 | A1 |
20030078565 | Vilsmeier et al. | Apr 2003 | A1 |
20030086748 | Culpepper | May 2003 | A1 |
20030135213 | LeHuec et al. | Jul 2003 | A1 |
20030225329 | Rossner et al. | Dec 2003 | A1 |
20040068263 | Chouinard et al. | Apr 2004 | A1 |
20040127902 | Suzuki et al. | Jul 2004 | A1 |
20040171930 | Grimm et al. | Sep 2004 | A1 |
20050010226 | Grady et al. | Jan 2005 | A1 |
20050049485 | Harmon et al. | Mar 2005 | A1 |
20050109855 | McCombs | May 2005 | A1 |
20050124988 | Terrill-Grisoni et al. | Jun 2005 | A1 |
20050137599 | Grimm et al. | Jun 2005 | A1 |
20050149050 | Stifter et al. | Jul 2005 | A1 |
20050187562 | Grimm et al. | Aug 2005 | A1 |
20050203528 | Couture et al. | Sep 2005 | A1 |
20050228387 | Paul | Oct 2005 | A1 |
20050277933 | Wall et al. | Dec 2005 | A1 |
20060015018 | Jutras et al. | Jan 2006 | A1 |
20060015119 | Plasskey et al. | Jan 2006 | A1 |
20060052691 | Hall et al. | Mar 2006 | A1 |
20060052792 | Boettiger et al. | Mar 2006 | A1 |
20060100642 | Yang et al. | May 2006 | A1 |
20060142656 | Malackowski et al. | Jun 2006 | A1 |
20060161059 | Wilson | Jul 2006 | A1 |
20060195111 | Couture | Aug 2006 | A1 |
20060235290 | Gabriel et al. | Oct 2006 | A1 |
20060241405 | Leitner et al. | Oct 2006 | A1 |
20070016008 | Schoenefeld | Jan 2007 | A1 |
20070055232 | Colquhoun | Mar 2007 | A1 |
20070073133 | Schoenefeld | Mar 2007 | A1 |
20070073297 | Reynolds | Mar 2007 | A1 |
20070118139 | Cuellar et al. | May 2007 | A1 |
20070233156 | Metzger | Oct 2007 | A1 |
20080027452 | Sheffer et al. | Jan 2008 | A1 |
20080045972 | Wanger et al. | Feb 2008 | A1 |
20080065084 | Couture et al. | Mar 2008 | A1 |
20080114375 | von Jako | May 2008 | A1 |
20080161682 | Kendrick et al. | Jul 2008 | A1 |
20080177173 | Deffenbaugh | Jul 2008 | A1 |
20080183108 | Huber et al. | Jul 2008 | A1 |
20080195110 | Plassy et al. | Aug 2008 | A1 |
20090024127 | Lechner et al. | Jan 2009 | A1 |
20090099445 | Burger | Apr 2009 | A1 |
20090118742 | Hartmann et al. | May 2009 | A1 |
20090163930 | Aoude et al. | Jun 2009 | A1 |
20090183740 | Sheffer et al. | Jul 2009 | A1 |
20090247863 | Proulx | Oct 2009 | A1 |
20090270928 | Stone et al. | Oct 2009 | A1 |
20090281417 | Hartmann et al. | Nov 2009 | A1 |
20090281421 | Culp et al. | Nov 2009 | A1 |
20090306499 | Van Vorhis | Dec 2009 | A1 |
20100004259 | Liu et al. | Jan 2010 | A1 |
20100023062 | Faillace et al. | Jan 2010 | A1 |
20100042111 | Qureshi et al. | Feb 2010 | A1 |
20100063511 | Plassky et al. | Mar 2010 | A1 |
20100094358 | Moore et al. | Apr 2010 | A1 |
20100100131 | Wallenstein | Apr 2010 | A1 |
20100125286 | Wang et al. | May 2010 | A1 |
20100160932 | Gschwandtner et al. | Jun 2010 | A1 |
20100192961 | Amiot et al. | Aug 2010 | A1 |
20110004259 | Stallings et al. | Jan 2011 | A1 |
20110098553 | Robbins et al. | Apr 2011 | A1 |
20110160572 | McIntosh | Jun 2011 | A1 |
20110160738 | McIntosh | Jun 2011 | A1 |
20110166446 | Whitmore, III et al. | Jul 2011 | A1 |
20110263971 | Nikou et al. | Oct 2011 | A1 |
20120016427 | Stindel et al. | Jan 2012 | A1 |
20120109228 | Boyer et al. | May 2012 | A1 |
20120143048 | Finlay et al. | Jun 2012 | A1 |
20120197266 | Sasso | Aug 2012 | A1 |
20130053648 | Abovitz et al. | Feb 2013 | A1 |
20130053895 | Stoll et al. | Feb 2013 | A1 |
20130060278 | Bozung et al. | Mar 2013 | A1 |
20130096573 | Kang et al. | Apr 2013 | A1 |
20130123580 | Peters et al. | May 2013 | A1 |
20130165947 | Nguyen et al. | Jun 2013 | A1 |
20130261783 | Daon | Oct 2013 | A1 |
20130331686 | Freysinger et al. | Dec 2013 | A1 |
20130345718 | Crawford | Dec 2013 | A1 |
20140039681 | Bowling et al. | Feb 2014 | A1 |
20140049629 | Siewerdsen et al. | Feb 2014 | A1 |
20140088410 | Wu | Mar 2014 | A1 |
20140200621 | Malackowski et al. | Jul 2014 | A1 |
20140236159 | Haider et al. | Aug 2014 | A1 |
20140276943 | Bowling et al. | Sep 2014 | A1 |
20140364858 | Li et al. | Dec 2014 | A1 |
20150031982 | Piferi et al. | Jan 2015 | A1 |
20150088108 | Tyc et al. | Mar 2015 | A1 |
20150173911 | Doty | Jun 2015 | A1 |
20150182285 | Yen et al. | Jul 2015 | A1 |
20150182293 | Yang et al. | Jul 2015 | A1 |
20150209119 | Theodore et al. | Jul 2015 | A1 |
20150257851 | Plassky et al. | Sep 2015 | A1 |
20150265769 | Bratbak et al. | Sep 2015 | A1 |
20150282735 | Rossner | Oct 2015 | A1 |
20150309187 | Shafer et al. | Oct 2015 | A1 |
20160249988 | Pfeifer et al. | Sep 2016 | A1 |
20170119478 | Malackowski et al. | May 2017 | A1 |
20170245945 | Zuhars et al. | Aug 2017 | A1 |
20170333136 | Hladio et al. | Nov 2017 | A1 |
20180263670 | Moctezuma de la Barrera et al. | Sep 2018 | A1 |
20190142525 | Malackowski et al. | May 2019 | A1 |
Number | Date | Country |
---|---|---|
2587369 | Nov 2003 | CN |
1658789 | Aug 2005 | CN |
101032426 | Sep 2007 | CN |
101311882 | Nov 2008 | CN |
101327148 | Dec 2008 | CN |
101536013 | Sep 2009 | CN |
102449666 | May 2012 | CN |
4343117 | Nov 1999 | DE
19962317 | Mar 2001 | DE |
19629011 | Aug 2001 | DE |
10335388 | Jun 2006 | DE |
19858889 | Aug 2008 | DE |
1143867 | Jul 2002 | EP |
1211994 | Apr 2005 | EP |
1211993 | Oct 2005 | EP |
1570802 | Jun 2006 | EP |
1873666 | Jan 2008 | EP |
2435243 | Apr 1980 | FR |
2007503898 | Mar 2007 | JP |
2008538184 | Oct 2008 | JP |
2009511211 | Mar 2009 | JP |
2011515163 | May 2011 | JP |
2002080773 | Oct 2002 | WO |
2004069073 | Aug 2004 | WO |
2004069073 | Nov 2004 | WO |
2005104783 | Nov 2005 | WO |
2006091494 | Aug 2006 | WO |
2007014470 | Feb 2007 | WO |
2007038135 | Apr 2007 | WO |
2008082574 | Jul 2008 | WO |
2008104548 | Sep 2008 | WO |
2008113008 | Sep 2008 | WO |
2008133615 | Nov 2008 | WO |
2009117832 | Oct 2009 | WO |
2010055193 | May 2010 | WO |
2010111090 | Sep 2010 | WO |
2012103407 | Aug 2012 | WO |
2012127353 | Sep 2012 | WO |
2013091112 | Jun 2013 | WO |
2013177334 | Nov 2013 | WO |
2013192598 | Dec 2013 | WO |
2014091053 | Jun 2014 | WO |
2014139022 | Sep 2014 | WO |
2014198784 | Dec 2014 | WO |
2015013518 | Jan 2015 | WO |
2015067743 | May 2015 | WO |
2015090434 | Jun 2015 | WO |
2015150877 | Oct 2015 | WO |
Entry |
---|
U.S. Appl. No. 16/730,230, filed Dec. 30, 2019. |
English language abstract and machine-assisted English translation for CN 2587369 extracted from espacenet.com database on Jun. 13, 2018, 13 pages. |
English language abstract and machine-assisted English translation for DE 10335388 extracted from espacenet.com database on Dec. 16, 2016; 6 pages. |
English language abstract and machine-assisted English translation for DE 19629011 extracted from espacenet.com database on Jan. 12, 2017; 7 pages. |
English language abstract and machine-assisted English translation for DE 19858889 extracted from espacenet.com database on Jan. 12, 2017; 13 pages. |
English language abstract and machine-assisted English translation for DE 4343117 extracted from espacenet.com database on Jan. 12, 2017; 7 pages. |
English language abstract and machine-assisted English translation for FR 2 435 243 A1 extracted from the www.espacenet.com database on May 2, 2014. |
English language abstract for EP 1 873 666 A1 extracted from the www.espacenet.com database on May 5, 2014. |
English language abstract for FR 2435243 extracted from espacenet.com database on Dec. 16, 2016; 2 pages. |
English language abstract for JP 2007-503898 A extracted from the www.espacenet.com database on May 2, 2014. |
English language abstract for JP 2008-538184 extracted from espacenet.com database on Jun. 18, 2018, 2 pages. |
English language abstract for JP 2009 511211 A extracted from the www.espacenet.com database on May 2, 2014. |
English language abstract for JP 2011-515163 extracted from espacenet.com database on Jun. 18, 2018, 2 pages. |
English language abstract for WO 2014091053 extracted from espacenet.com database on Dec. 19, 2016; 2 pages. |
International Search Report for Application No. PCT/US2014/011821 dated Jul. 18, 2014; 7 pages. |
Liebergall, Meir; Mosheiff, Rami; Joskowicz, Leo; Chapter 23: Computer-Aided Orthopaedic Surgery in Skeletal Trauma; Rockwood & Green's Fractures in Adults, 6th Edition, © 2006 Lippincott Williams & Wilkins, pp. 739-767; 60 pages. |
Written Opinion for Application No. PCT/US2014/011821 dated Jul. 18, 2014; 10 pages. |
English language abstract for CN 101032426 A extracted from espacenet.com database on Mar. 29, 2021, 1 page. |
English language abstract for CN 101536013 A extracted from espacenet.com database on Mar. 29, 2021, 2 pages. |
English language abstract for CN 102449666 A extracted from espacenet.com database on Mar. 29, 2021, 2 pages. |
English language abstract and machine-assisted English translation for CN 101311882 A extracted from espacenet.com database on Feb. 14, 2022, 17 pages. |
English language abstract and machine-assisted English translation for CN 101327148 A extracted from espacenet.com database on Feb. 14, 2022, 13 pages. |
Number | Date | Country | |
---|---|---|---|
20200100849 A1 | Apr 2020 | US |
Number | Date | Country | |
---|---|---|---|
61753219 | Jan 2013 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 14156856 | Jan 2014 | US |
Child | 15407639 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15407639 | Jan 2017 | US |
Child | 16701972 | US |