Orthopedic fixation with imagery analysis

Information

  • Patent Grant
  • 11896313
  • Patent Number
    11,896,313
  • Date Filed
    Monday, February 1, 2021
  • Date Issued
    Tuesday, February 13, 2024
Abstract
Methods of orthopedic fixation and imagery analysis are provided. Images of first and second bone segments attached to a fixation apparatus are captured. Fixator elements identified in the images can be used to obtain imaging scene parameters. Bone elements identified in the images can be used with the imaging scene parameters to reconstruct a three dimensional representation of positions and/or orientations of the first and second bone segments with respect to the fixation apparatus.
Description
BACKGROUND

Techniques used to treat bone fractures and/or bone deformities can include the use of external fixators, such as fixation frames, that are surgically mounted to bone segments on opposed sides of a fracture site. A pair of radiographic images is taken of the fixator and bone segments at the fracture site. Typically, the radiographic images must be orthogonal, or perpendicular with respect to each other, and aligned with anatomical axes of the patient. Data from the images is then manipulated with orthogonal projection techniques to construct a three dimensional representation of the fixator and the bone segments that can be used in developing a treatment plan, which may, for example, comprise realigning the bone segments through adjustments to the fixator.


However, the ability to acquire orthogonal radiographic images of a fracture site can be limited by factors beyond a surgeon's control, for instance maneuverability of the imaging apparatus, the anatomical location of a fracture or deformity, and/or pain incurred by a patient in positioning a broken limb for orthogonal imaging. Limiting factors such as these can introduce inaccuracies into the imaging process. These inaccuracies can have undesirable consequences such as improper alignment of bone segments during the healing process, compromised union between the bone segments, the need for additional rounds of radiographic imaging to facilitate alignment corrections, or even the need for additional surgical procedures.


SUMMARY

In accordance with one embodiment, a method of orthopedic fixation includes attaching a fixation apparatus to first and second bone segments. The method further includes capturing a first image of the fixation apparatus and bone segments from a first orientation with respect to the fixation apparatus. The method further still includes capturing a second image of the fixation apparatus and bone segments from a second orientation with respect to the fixation apparatus that is different from the first orientation. The method further still includes computing first and second transformation matrices for the first and second images, respectively. The method further still includes utilizing the transformation matrices to reconstruct a three dimensional representation of the first and second bone segments with respect to the fixation apparatus.


In accordance with an alternative embodiment, a computer-readable storage medium has computer-readable instructions stored thereon that when executed by a processor perform a method of orthopedic fixation imagery analysis. The method includes capturing, via an imager, first and second images of a fixation apparatus and first and second bone segments attached thereto. The first image is captured from a first orientation and the second image is captured from a second orientation that is different from the first orientation. The method further includes obtaining a plurality of imaging scene parameters. The method further still includes reconstructing a three dimensional representation of the first and second bone segments with respect to the fixation apparatus based upon the plurality of imaging scene parameters.





BRIEF DESCRIPTION OF THE DRAWINGS

The foregoing summary, as well as the following detailed description of the preferred embodiments of the application, will be better understood when read in conjunction with the appended drawings. For the purposes of illustrating the methods and/or techniques of orthopedic fixation with imagery analysis, there are shown in the drawings preferred embodiments. It should be understood, however, that the instant application is not limited to the precise arrangements and/or instrumentalities illustrated in the drawings, in which:



FIG. 1 is a perspective view of a fixation assembly positioned for imaging in accordance with an embodiment;



FIG. 2 is a perspective view of an example imaging process of the fixation assembly illustrated in FIG. 1; and



FIG. 3 is a flow diagram illustrating an example orthopedic fixation with imagery analysis process in accordance with an embodiment.





DETAILED DESCRIPTION

For convenience, the same or equivalent elements in the various embodiments illustrated in the drawings have been identified with the same reference numerals. Certain terminology is used in the following description for convenience only and is not limiting. The words “right”, “left”, “top” and “bottom” designate directions in the drawings to which reference is made. The words “inward”, “inwardly”, “outward”, and “outwardly” refer to directions toward and away from, respectively, the geometric center of the device and designated parts thereof. The terminology intended to be non-limiting includes the above-listed words, derivatives thereof and words of similar import.


Referring initially to FIG. 1, bodily tissues, for instance first and second bone segments 102, 104, can be aligned and/or oriented to promote union or other healing between the bodily tissues. The alignment and/or orientation of the bodily tissues can be achieved by connecting the bodily tissues to an adjustable fixation apparatus, such as orthopedic fixator 100. The orthopedic fixator can comprise an external fixation apparatus that includes a plurality of discrete fixator members that remain external to the patient's body, but that are attached to respective discrete bodily tissues, for example with minimally invasive attachment members. By adjusting the spatial positioning of the fixator members with respect to each other, the respective bodily tissues attached thereto can be reoriented and/or otherwise brought into alignment with each other, for example to promote union between the bodily tissues during the healing process. The use of external orthopedic fixators in combination with the imagery analysis and positioning techniques described herein can be advantageous in applications where direct measurement and manipulation of the bodily tissues is not possible, where limited or minimally invasive access to the bodily tissues is desired, or the like.


The fixator members can be connected to each other via adjustment members, the adjustment members configured to facilitate the spatial repositioning of the fixator members with respect to each other. For example, in the illustrated embodiment, the orthopedic fixator 100 comprises a pair of fixator members in the form of an upper fixator ring 106 and a lower fixator ring 108. The fixator rings 106, 108 can be constructed the same or differently. For instance, the fixator rings 106, 108 can have diameters that are the same or different. Similarly, the fixator rings 106, 108 can be constructed with varying cross sectional diameters, thicknesses, etc. It should be appreciated that the fixator members of the orthopedic fixator 100 are not limited to the illustrated upper and lower fixator rings 106, 108, and that the orthopedic fixator 100 can be alternatively constructed. For example, additional fixator rings can be provided and interconnected with the fixator ring 106 and/or 108. It should further be appreciated that the geometries of the fixator members are not limited to rings, and that at least one, such as all of the fixator members can be alternatively constructed using any other suitable geometry.


The first and second bone segments 102, 104 can be rigidly attached to the upper and lower fixator rings 106, 108, respectively, with attachment members that can be mounted to the fixator rings 106, 108. For example, in the illustrated embodiment, attachment members are provided in the form of attachment rods 110 and attachment wires 112.


The rods 110 and the wires 112 extend between proximal ends attached to mounting members 114 that are mounted to the fixator rings 106, 108, and opposed distal ends that are inserted into or otherwise secured to the bone segments 102, 104. The mounting members 114 can be removably mounted to the fixator rings 106, 108 at predefined points along the peripheries of the fixator rings 106, 108, for example by disposing them into threaded apertures defined by the fixator rings. With respect to each fixator ring 106, 108, the mounting members 114 can be mounted to the upper surface of the ring, the lower surface of the ring, or any combination thereof. It should be appreciated that the attachment members are not limited to the configuration of the illustrated embodiment. For example, any number of attachment members, such as the illustrated rods 110 and wires 112 and any others, can be used to secure the bone segments to respective fixator members as desired. It should further be appreciated that one or more of the attachment members, for instance the rods 110 and/or wires 112, can be alternatively configured to mount directly to the fixator rings 106, 108, without utilizing mounting members 114.


The upper and lower fixator rings 106, 108 can be connected to each other by at least one, such as a plurality of adjustment members. At least one, such as all, of the adjustment members can be configured to allow the spatial positioning of the fixator rings with respect to each other to be adjusted. For example, in the illustrated embodiment, the upper and lower fixator rings 106, 108 are connected to each other with a plurality of adjustment members provided in the form of adjustable length struts 116. It should be appreciated that the construction of the orthopedic fixator 100 is not limited to the six struts 116 of the illustrated embodiment, and that more or fewer struts can be used as desired.


Each of the adjustable length struts 116 can comprise opposed upper and lower strut arms 118, 120. Each of the upper and lower strut arms 118, 120 have proximal ends disposed in a coupling member, or sleeve 122, and opposed distal ends that are coupled to universal joints 124 mounted to the upper and lower fixator rings 106, 108, respectively. The universal joints of the illustrated embodiment are disposed in pairs spaced evenly around the peripheries of the upper and lower fixator rings 106, 108, but can be alternatively placed in any other locations on the fixator rings as desired.


The proximal ends of the upper and lower strut arms 118, 120 of each strut 116 can have threads defined thereon that are configured to be received by complementary threads defined in the sleeve 122, such that when the proximal ends of the upper and lower strut arms 118, 120 of a strut 116 are received in a respective sleeve 122, rotation of the sleeve 122 causes the upper and lower strut arms 118, 120 to translate within the sleeve 122, thus causing the strut 116 to be elongated or shortened, depending on the direction of rotation. Thus, the length of each strut 116 can be independently adjusted with respect to the remaining struts. It should be appreciated that the adjustment members are not limited to the length adjustable struts 116 of the illustrated embodiment, and that the adjustment members can be alternatively constructed as desired, for example using one or more alternative geometries, alternative length adjustment mechanisms, and the like.
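
For illustration, the following minimal sketch relates sleeve rotation to strut length change. It assumes, purely for the example, that the two strut arms carry opposite-handed threads of equal pitch so that one full turn of the sleeve moves both arms by one pitch each; the thread arrangement and the pitch value are illustrative assumptions, not details given in the text.

```python
# Illustrative sketch only: relates sleeve rotation to strut length change,
# assuming opposite-handed threads of equal (hypothetical) pitch on the two arms.

def strut_length_change(turns: float, pitch_mm: float = 1.0) -> float:
    """Change in strut length (mm) produced by a given number of sleeve turns."""
    # Each arm translates by turns * pitch; with opposite-handed threads the two
    # translations add, so the strut lengthens or shortens by twice that amount.
    return 2.0 * turns * pitch_mm

print(strut_length_change(turns=1.5, pitch_mm=1.0))  # 3.0 mm of adjustment
```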


The adjustable length struts 116 and the universal joints 124 by which they are mounted to the upper and lower fixator rings 106, 108, allow the orthopedic fixator 100 to function much like a Stewart platform, and more specifically like a distraction osteogenesis ring system, a hexapod, or a Taylor spatial frame. That is, by making length adjustments to the struts 116, the spatial positioning of the upper and lower fixator rings 106, 108, and thus the bone segments 102, 104 can be altered. For example, in the illustrated embodiment the first bone segment 102 is attached to the upper fixator ring 106 and the second bone segment 104 is attached to the lower fixator ring 108. It should be appreciated that attachment of the first and second bone segments 102, 104 to the upper and lower fixator rings 106, 108 is not limited to the illustrated embodiment (e.g., where the central longitudinal axes L1, L2 of the first and second bone segments 102, 104 are substantially perpendicular to the respective planes of the upper and lower fixator rings 106, 108), and that a surgeon has complete flexibility in aligning the first and second bone segments 102, 104 within the upper and lower fixator rings 106, 108 when configuring the orthopedic fixator 100.


By varying the length of one or more of the struts 116, the upper and lower fixator rings 106, 108, and thus the bone segments 102 and 104, can be repositioned with respect to each other such that their respective longitudinal axes L1, L2 are substantially aligned with each other, and such that their respective fractured ends 103, 105 abut each other, so as to promote union during the healing process. It should be appreciated that adjustment of the struts 116 is not limited to the length adjustments as described herein, and that the struts 116 can be differently adjusted as desired. It should further be appreciated that adjusting the positions of the fixator members is not limited to adjusting the lengths of the length adjustable struts 116, and that the positioning of the fixator members with respect to each other can be alternatively adjusted, for example in accordance with the type and/or number of adjustment members connected to the fixation apparatus.


Repositioning of the fixator members of an orthopedic fixation apparatus, such as orthopedic fixator 100, can be used to correct displacements of angulation, translation, rotation, or any combination thereof, within bodily tissues. A fixation apparatus, such as orthopedic fixator 100, utilized with the techniques described herein, can correct a plurality of such displacement defects individually or simultaneously. However, it should be appreciated that the fixation apparatus is not limited to the illustrated orthopedic fixator 100, and that the fixation apparatus can be alternatively constructed as desired. For example, the fixation apparatus can include additional fixation members, can include fixation members having alternative geometries, can include more or fewer adjustment members, can include alternatively constructed adjustment members, or any combination thereof.


Referring now to FIGS. 2-3, an example orthopedic fixation with imagery analysis process, or method in accordance with an embodiment is illustrated. Steps for carrying out an example orthopedic fixation with imagery analysis method 300 are depicted in the flow chart of FIG. 3. At step 302, bodily tissues, such as first and second bone segments 102, 104, can be connected to an adjustable fixation apparatus, such as the orthopedic fixator 100, as described above.


At step 304, with the orthopedic fixator 100 secured to the bone segments 102, 104, at least one, such as a plurality of images can be taken of the fixator 100 and the bone segments 102, 104. The images can be captured using the same or different imaging techniques. For example, the images can be acquired using x-ray imaging, computed tomography, magnetic resonance imaging, ultrasound, infrared imaging, photography, fluoroscopy, visual spectrum imaging, or any combination thereof.


The images can be captured from any position and/or orientation with respect to each other and with respect to the fixator 100 and the bone segments 102, 104. In other words, there is no requirement that the captured images be orthogonal with respect to each other or aligned with anatomical axes of the patient, thereby providing a surgeon with near complete flexibility in positioning the imagers 130. Preferably, the images 126, 128 are captured from different directions, or orientations, such that the images do not overlap. For example, in the illustrated embodiment, the image planes of the pair of images 126, 128 are not perpendicular with respect to each other. In other words, the angle α between the image planes of the images 126, 128 is not equal to 90 degrees, such that the images 126, 128 are non-orthogonal with respect to each other. Preferably, at least two images are taken, although capturing additional images may increase the accuracy of the method.


The images 126, 128 can be captured using one or more imaging sources, or imagers, for instance the x-ray imagers 130 and/or corresponding image capturing devices 127, 129. The images 126, 128 can be x-ray images captured by a single repositionable x-ray imager 130, or can be captured by separately positioned imagers 130. Preferably, the position of the image capturing devices 127, 129 and/or the imagers 130 with respect to the space origin 135 of the three dimensional space, described in more detail below, are known. The imagers 130 can be manually positioned and/or oriented under the control of a surgeon, automatically positioned, for instance by a software assisted imager, or any combination thereof.


At step 306, imaging scene parameters pertaining to the fixator 100, the bone segments 102, 104, imager(s) 130, and image capturing devices 127, 129 are obtained. The imaging scene parameters can be used in constructing a three dimensional representation of the positioning of the bone segments 102, 104 in the fixator 100, as described in more detail below. One or more of the imaging scene parameters may be known. Imaging scene parameters that are not known can be obtained, for example by mathematically comparing the locations of fixator element representations in the two dimensional space of the x-ray images 126, 128 to the three dimensional locations of those elements on the geometry of the fixator 100. In a preferred embodiment, imaging scene parameters can be calculated using pinhole or perspective camera models. For example, the imaging scene parameters can be determined numerically using matrix algebra, as described in more detail below.
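
For orientation, the following minimal sketch shows the pinhole/perspective idea referred to above: a 3×4 projection matrix maps a homogeneous three dimensional point to homogeneous image coordinates, which are then normalized to a two dimensional image location (this is the relationship formalized in equation (1) below). The matrix and point values are illustrative placeholders, not data from the patent.

```python
# Minimal pinhole-projection sketch (illustrative values only).
import numpy as np

P = np.array([[800.0,   0.0, 320.0, 0.0],   # hypothetical 3x4 projection matrix
              [  0.0, 800.0, 240.0, 0.0],
              [  0.0,   0.0,   1.0, 0.0]])

X_world = np.array([0.05, -0.02, 1.5, 1.0])  # fixator element point, homogeneous

u, v, w = P @ X_world
x, y = u / w, v / w                          # 2D image location of the element
print(x, y)
```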


The imaging scene parameters can include, but are not limited to, image pixel scale factors, image pixel aspect ratio, the image sensor skew factor, the image size, the focal length, the position and orientation of the imaging source, the position of the principal point (defined as the point in the plane of a respective image 126, 128 that is closest to the respective imager 130), positions and orientations of elements of the fixator 100, the position and orientation of a respective image receiver, and the position and orientation of the imaging source's lens.
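
One convenient way to keep these quantities together in software is a simple record type. The grouping below is only an illustrative sketch; the field names are not drawn from the patent.

```python
# Illustrative grouping of the imaging scene parameters listed above;
# field names are illustrative and not taken from the patent text.
from dataclasses import dataclass
import numpy as np

@dataclass
class ImagingSceneParameters:
    pixel_scale: tuple[float, float]       # sx, sy
    pixel_aspect_ratio: float
    skew: float
    image_size: tuple[int, int]            # width, height in pixels
    focal_length: float                    # f
    principal_point: tuple[float, float]   # tx, ty
    source_position: np.ndarray            # imaging source position in 3D space
    source_orientation: np.ndarray         # 3x3 rotation of the imaging source
    receiver_pose: np.ndarray              # 4x4 pose of the image receiver
```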


In a preferred embodiment, at least some, such as all of the imaging scene parameters can be obtained by comparing the locations of representations of particular components, or fixator elements of the fixator 100 within the two dimensional spaces of the images 126, 128, with the corresponding locations of those same fixator elements in actual, three dimensional space. The fixator elements comprise components of the orthopedic fixator 100, and preferably are components that are easy to identify in the images 126, 128. Points, lines, conics, or the like, or any combination thereof can be used to describe the respective geometries of the fixator elements. For example, the representations of fixator elements used in the comparison could include center lines of one or more of the adjustable length struts 116, center points of the universal joints 124, center points of the mounting members 114, and the like.


The fixator elements can further include marker elements that are distinct from the above-described components of the fixator 100. The marker elements can be used in the comparison, as a supplement to or in lieu of using components of the fixator 100. The marker elements can be mounted to specific locations of components of the fixator 100 prior to imaging, can be embedded within components of the fixator 100, or any combination thereof. The marker elements can be configured for enhanced viewability in the images 126, 128 when compared to the viewability of the other components of the fixator 100. For example, the marker elements may be constructed of a different material, such as a radio-opaque material, or may be constructed with geometries that readily distinguish them from other components of the fixator 100 in the images 126, 128. In an example embodiment, the marker elements can have designated geometries that correspond to their respective locations on the fixator 100.


At step 306A, fixator elements can be identified for use in the comparison. The identification of fixator elements and the determination of their respective locations can be performed by a surgeon, with the assistance of software, or by any combination thereof.


The locations of the fixator elements in the two dimensional space of the images 126, 128 are determined with respect to local origins 125 defined in the imaging planes of the images 126, 128. The local origins 125 serve as “zero points” for determining the locations of the fixator elements in the images 126, 128. The locations of the fixator elements can be defined by their respective x and y coordinates with respect to a respective local origin 125. The location of the local origin 125 within the respective image can be arbitrary so long as it is in the plane of the image. Typically, the origin is located at the center of the image or at a corner of the image, such as the lower left hand corner. It should be appreciated that the locations of the local origins are not limited to the illustrated local origins 125, and that the local origins 125 can be alternatively defined at any other locations. It should further be appreciated that the locations of the local origins 125 can be designated by a surgeon, with the assistance of software, or by any combination thereof.


At step 306B, a respective transformation matrix P can be computed for each of the images 126, 128. The transformation matrices can be utilized to map location coordinates of one or more respective fixator elements in actual three dimensional space to corresponding location coordinates of the fixator element(s) in the two dimensional space of the respective image 126, 128. It should be appreciated that the same fixator element(s) need not be used in the comparisons of both images 126, 128. For example, a fixator element used in constructing the transformation matrix associated with image 126 can be the same or different from the fixator element used in constructing the transformation matrix associated with image 128. It should further be appreciated that increasing the number of fixator elements used in computing the transformation matrices can increase the accuracy of the method. The following equation represents this operation:










[ x ]         [ X ]
[ y ]  =  P · [ Y ]          (1)
[ 1 ]         [ Z ]
              [ 1 ]







The symbols x and y represent location coordinates, with respect to the local origin 125, of a fixator element point in the two dimensional space of images 126, 128. The symbols X, Y and Z represent corresponding location coordinates, with respect to a space origin 135, of the fixator element point in actual three dimensional space. In the illustrated embodiment, the point corresponding to the center of the plane defined by the upper surface of the upper fixator ring 106 has been designated as the space origin 135. The illustrated matrix P can be at least four elements wide and three elements tall. In a preferred embodiment, the elements of the matrix P can be computed by solving the following matrix equation:

A·p=B  (2)


The vector p can contain eleven elements representing values of the matrix P. The following equations present arrangements of the elements in the vector p and the matrix P:









p = [p1  p2  p3  p4  p5  p6  p7  p8  p9  p10  p11]ᵀ          (3)


    [ p1  p2   p3   p4  ]
P = [ p5  p6   p7   p8  ]          (4)
    [ p9  p10  p11  p12 ]







In the preferred embodiment, the twelfth element p12 of the matrix P can be set to a numerical value of one. The matrices A and B can be assembled using the two dimensional and three dimensional information of the fixator elements. For every point representing a respective fixator element, two rows of matrices A and B can be constructed. The following equation presents the values of the two rows added to the matrices A and B for every point of a fixator element (e.g., a center point of a respective universal joint 124):











[ X  Y  Z  1  0  0  0  0  -x·X  -x·Y  -x·Z ]         [ x ]
[ 0  0  0  0  X  Y  Z  1  -y·X  -y·Y  -y·Z ] · p  =  [ y ]          (5)







The symbols X, Y and Z represent location coordinate values of a fixator element point in actual three dimensional space relative to the space origin 135, and the symbols x and y represent location coordinate values of the corresponding fixator element point in the two dimensional space of the respective image 126, 128 relative to local origin 125.
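
The following sketch shows how the two rows contributed by one point-type fixator element could be assembled per equation (5). It is a minimal illustration; the function name is arbitrary and the coordinate values would come from the identification steps described above.

```python
# Two rows added to A and B for a point-type fixator element (equation (5)):
# (X, Y, Z) is the element's location in 3D space, (x, y) its image location.
import numpy as np

def point_rows(X, Y, Z, x, y):
    a_rows = np.array([
        [X, Y, Z, 1, 0, 0, 0, 0, -x * X, -x * Y, -x * Z],
        [0, 0, 0, 0, X, Y, Z, 1, -y * X, -y * Y, -y * Z],
    ])
    b_rows = np.array([x, y])
    return a_rows, b_rows
```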


For every line representing a respective fixator element, two rows of matrices A and B can be constructed. The following equation presents the values of the two rows added to the matrices A and B for every line of a fixator element (e.g., a center line of a respective adjustable length strut 116):











[ X·a   Y·a   Z·a   a  X·b   Y·b   Z·b   b  X·c   Y·c   Z·c  ]         [ -c ]
[ dX·a  dY·a  dZ·a  0  dX·b  dY·b  dZ·b  0  dX·c  dY·c  dZ·c ] · p  =  [  0 ]          (6)







The symbols X, Y and Z represent location coordinate values of a point belonging to a line of a fixator element in actual three dimensional space relative to the space origin 135. The symbols dX, dY and dZ represent gradient values of the line in actual three dimensional space. The symbols a, b and c represent constants defining a line in the two dimensional space of a respective image 126, 128. For example, a, b, and c can be computed using two points belonging to a line on a respective image 126, 128. In a preferred embodiment, the value of b is assumed to be 1, unless the line is a vertical line, in which case the value of b is zero. A correlation of constants a, b and c with the respective image coordinates x and y is presented in the following equation:

a·x+b·y+c=0  (7)


Equation (2) can be over-constrained by using six or more fixator elements, for example the adjustable length struts 116. It should be appreciated that it is not necessary for all of the fixator elements to be visible in a single one of the images 126, 128 in order to obtain the matrix P. It should further be appreciated that if one or more of the above-described imaging scene parameters are known, the known parameters can be used to reduce the minimum number of the fixator elements required to constrain equation (2). For instance, such information could be obtained from modern imaging systems in DICOM image headers. Preferably, a singular value decomposition or least squares method can be used to solve equation (2) for values of the vector p.
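
A minimal solver sketch is shown below: rows contributed by each identified fixator element (points per equation (5), lines per equation (6)) are stacked into A and B, A·p = B is solved by least squares, and p is reshaped into the 3×4 matrix P with p12 set to one. The function names are arbitrary, and the element coordinates would come from steps 306A and 306B; none are supplied by the patent text itself.

```python
# Assemble A and B from fixator elements and solve A·p = B by least squares.
import numpy as np

def line_rows(X, Y, Z, dX, dY, dZ, a, b, c):
    """Two rows for a line-type fixator element, per equation (6)."""
    a_rows = np.array([
        [X * a,  Y * a,  Z * a,  a, X * b,  Y * b,  Z * b,  b, X * c,  Y * c,  Z * c],
        [dX * a, dY * a, dZ * a, 0, dX * b, dY * b, dZ * b, 0, dX * c, dY * c, dZ * c],
    ])
    b_rows = np.array([-c, 0.0])
    return a_rows, b_rows

def solve_projection_matrix(row_blocks):
    """row_blocks: iterable of (a_rows, b_rows) pairs from point or line elements."""
    A = np.vstack([blk[0] for blk in row_blocks])
    B = np.hstack([blk[1] for blk in row_blocks])
    p, *_ = np.linalg.lstsq(A, B, rcond=None)   # least-squares solution of A·p = B
    return np.append(p, 1.0).reshape(3, 4)      # p12 set to 1, reshaped into P
```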


At step 306C, the transformation matrices can be decomposed into imaging scene parameters. The following equation can be used to relate the matrix P to matrices E and I:

P=I·E  (8)


It should be appreciated that additional terms can be introduced when decomposing the matrix P. For example, the method presented by Tsai, described in “A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses”, IEEE Journal of Robotics & Automation, RA-3, No. 4, 323-344, August 1987, which is incorporated herein by reference in its entirety, can be used to correct the images 126, 128 for radial distortion.


Matrices E and I contain imaging scene parameters. The following equation represents a composition of the matrix I:









    [ sx  0   -tx ]
I = [ 0   sy  -ty ]          (9)
    [ 0   0   1/f ]







The symbols sx and sy represent values of image coordinate scale factors (e.g., pixel scale factors). The symbol f, representing the focal length, corresponds to the value of the shortest distance between a respective imaging source 130 and the plane of a corresponding image 126, 128. The symbols tx and ty represent the coordinates of the principal point relative to the local origin 125 of the respective image 126, 128. The following equation represents the composition of the matrix E:









    [ r1  r2  r3  -(r1·ox + r2·oy + r3·oz) ]
E = [ r4  r5  r6  -(r4·ox + r5·oy + r6·oz) ]          (10)
    [ r7  r8  r9  -(r7·ox + r8·oy + r9·oz) ]







The symbols ox, oy and oz represent values of the position of the fixator 100 in actual three dimensional space. The symbols r1 to r9 describe the orientation of the fixator 100. These values can be assembled into a three dimensional rotational matrix R represented by the following equation:









    [ r1  r2  r3 ]
R = [ r4  r5  r6 ]          (11)
    [ r7  r8  r9 ]







The methods of Trucco and Verri, as described in “Introductory Techniques for 3-D Computer Vision”, Prentice Hall, 1998, or the method of Hartley, as described in “Euclidean Reconstruction from Uncalibrated Views”, Applications of Invariance in Computer Vision, pages 237-256, Springer Verlag, Berlin Heidelberg, 1994, which are incorporated herein in their entireties, can be used to obtain values of the matrices E and/or I. Utilizing the resulting values of matrices E and I, a complete three dimensional imaging scene of the fixator 100 and the bone segments 102, 104 can be reconstructed.
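
The sketch below shows one standard way such a split of P into an intrinsic matrix I (equation (9)) and an extrinsic matrix E (equation (10)) can be carried out: an RQ decomposition of the left 3×3 block of P. This is a textbook camera decomposition offered for illustration, not necessarily the exact procedure of the cited Trucco/Verri or Hartley methods, and the function names are arbitrary.

```python
# Sketch: split a 3x4 projection matrix P into intrinsics I and extrinsics E = [R | t]
# via an RQ decomposition of the left 3x3 block.
import numpy as np

def rq(M):
    """RQ decomposition: M = R @ Q with R upper triangular, Q orthogonal."""
    Q0, R0 = np.linalg.qr(np.flipud(M).T)
    R = np.fliplr(np.flipud(R0.T))
    Q = np.flipud(Q0.T)
    return R, Q

def decompose_projection(P):
    I_mat, R_mat = rq(P[:, :3])
    # Force positive scale factors and a positive 1/f term (sign convention).
    D = np.diag(np.sign(np.diag(I_mat)))
    I_mat, R_mat = I_mat @ D, D @ R_mat
    t = np.linalg.solve(I_mat, P[:, 3])        # translation column of E
    origin = -R_mat.T @ t                      # position ox, oy, oz of the fixator
    focal_length = 1.0 / I_mat[2, 2]
    E = np.hstack([R_mat, t.reshape(3, 1)])
    return I_mat, E, origin, focal_length
```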


For example, FIG. 2 illustrates an example three dimensional imaging scene reconstructed from the x-ray images 126, 128. In the illustrated embodiment, x-rays are emitted from x-ray imagers 130. It should be appreciated that the x-ray imagers 130 can be the same or different imagers, as described above. The x-rays emitted from the imagers 130 are received by corresponding imaging devices, thus capturing the images 126, 128. Preferably, the positioning of the imagers 130 with respect to the local origins 125 is known.


At step 308, the images 126, 128 and the imaging scene parameters can be used to obtain the positions and/or orientations of the bone segments 102, 104 in three dimensional space. The position and/or orientation data obtained can be used to develop a treatment plan for a patient, for example to change the orientation and/or position of the fractured first and second bone segments 102, 104 in order to promote union between the bone segments 102, 104, as described in more detail below. It should be appreciated that the methods and techniques of orthopedic fixation with imagery analysis described herein are not limited to applications of repositioning broken bones, and that orthopedic fixation with imagery analysis can be used in any other type of fixation procedure as desired, for example lengthening of bones, correction of anatomical defects, and the like.


At step 308A, bone elements comprising representations of particular portions (e.g., anatomical features) of the bone segments 102, 104, can be identified and their locations within the images 126, 128 determined. Preferably, the locations of the bone elements are determined with respect to the respective local origins 125 of images 126, 128. The identification of the bone elements and the determination of their respective locations can be performed by a surgeon, with the assistance of software, or by any combination thereof.


The bone elements can be used in the construction of the three dimensional representation of the position and/or orientation of the bone segments 102, 104. Preferably, the bone elements are easy to identify in the images 126, 128. Points, lines, conics, or the like, or any combination thereof can be used to describe the respective geometries of the bone elements. For example, in the illustrated embodiment, points 134 and 136 representing the fractured ends 103, 105 of the bone segments 102, 104, respectively, are identified as bone elements in the images 126, 128.


The bone elements can further include marker elements that are implanted into the bone segments 102, 104 prior to imaging. The marker elements can be used as a supplement to or in lieu of the above-described bone elements identified in the images 126, 128. The marker elements can be configured for enhanced viewability in the images 126, 128 when compared to the viewability of anatomical features of the bone segments 102, 104. For example, the marker elements may be constructed of a radio-opaque material, or may be constructed with readily distinguishable geometries.


At step 308B, a three dimensional representation 200 of the bone segments 102, 104 can be reconstructed. The three dimensional representation can be constructed with or without a corresponding representation of the fixator 100. In the illustrated embodiment, pairs of ray-lines, such as ray lines 138, 140 and 142, 144 can be constructed for the bone element points 134, 136, respectively. Each ray line connects a bone element in one of the images 126, 128 with a respective imager 130. Each pair of ray lines can be analyzed for a common intersection point, such as points 146, 148. The common intersection points 146, 148 represent the respective positions of the bone element points 134, 136, in the three dimensional representation of the bone segments 102, 104. Of course more than a pair of ray lines, such as a plurality, can be constructed, for example if more than two images were captured. If the ray lines of a particular set do not intersect, a point closest to all the ray lines in the set can be used as the common intersection point.
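
The following sketch illustrates locating a bone element from a pair of ray lines: when the two rays do not meet exactly, the point closest to both (here, the midpoint of their closest approach) is used as the common intersection point. It is a generic closest-point calculation offered as an illustration; the function name and any inputs are assumptions.

```python
# Closest point between two rays (e.g., ray lines 138/140 for a bone element).
import numpy as np

def closest_point_between_rays(p1, d1, p2, d2):
    """p1, p2: ray origins (imager positions); d1, d2: ray directions."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    # Find parameters t1, t2 minimizing |(p1 + t1*d1) - (p2 + t2*d2)|.
    b = np.dot(d1, d2)
    rhs = p2 - p1
    denom = 1.0 - b * b
    if np.isclose(denom, 0.0):                 # near-parallel rays
        t1, t2 = np.dot(rhs, d1), 0.0
    else:
        t1 = (np.dot(rhs, d1) - b * np.dot(rhs, d2)) / denom
        t2 = (b * np.dot(rhs, d1) - np.dot(rhs, d2)) / denom
    closest_1 = p1 + t1 * d1
    closest_2 = p2 + t2 * d2
    return 0.5 * (closest_1 + closest_2)       # midpoint of the closest approach
```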


The positions and/or orientations of the bone segments 102, 104 can be quantified or measured using common intersection points, for instance points 146, 148. For example, lines representing center lines of the bone segments 102, 104 can be constructed and can be compared to the anatomical axes of the patient. Additionally, the distance between the fractured ends 103, 105 of the bone segments 102, 104 can be quantified. Using these or similar techniques, the positions and/or orientations of the bone segments 102, 104 can be determined.
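
Two such measurements are sketched below: the gap between the reconstructed fractured ends and the angle between reconstructed center-line directions. The coordinate values are placeholders, not data from the patent.

```python
# Simple measurements on reconstructed points (illustrative values only).
import numpy as np

end_upper = np.array([1.2, 0.4, 10.8])     # reconstructed end of segment 102
end_lower = np.array([2.1, 0.9, 11.5])     # reconstructed end of segment 104
axis_upper = np.array([0.0, 0.1, 1.0])     # center-line direction of segment 102
axis_lower = np.array([0.2, 0.0, 1.0])     # center-line direction of segment 104

gap = np.linalg.norm(end_upper - end_lower)
cos_angle = np.dot(axis_upper, axis_lower) / (
    np.linalg.norm(axis_upper) * np.linalg.norm(axis_lower))
angulation_deg = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
print(f"gap = {gap:.2f}, angulation = {angulation_deg:.1f} degrees")
```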


At step 310, the three dimensional representation 200 can be used to determine desired changes to the positions and/or orientations of the bone segments 102, 104, for instance how the bone segments 102, 104 can be repositioned with respect to each other in order to promote union between the bone segments 102, 104. For example, in the illustrated embodiment, it may be desirable to change the angulation of the second bone segment 104 such that the axes L1 and L2 are brought into alignment, and to change the position of the second bone segment such that the fractured ends 103, 105 of the bone segments 102, 104 abut each other. Preferably, the determination of the desired changes to the positions and/or orientations of the bone segments 102, 104 is made by a surgeon. In an example embodiment, lines representing the longitudinal axes L1, L2 of the first and second bone segments 102, 104 can be generated in the three dimensional representation, in order to aid in determining desired changes to the positions and/or orientations of the bone segments 102, 104. In determining the desired changes to the positions and/or orientations of the bone segments, the surgeon may be aided by software, such as a computer program configured to determine the desired positions and/or orientations of the bone segments 102, 104. Preferably, the desired changes to the positions and/or orientations of the bone segments 102, 104 are defined relative to the space origin 135.


Once the desired changes to the positions and/or orientations of the bone segments 102, 104 have been determined, a treatment plan for effecting the position and/or orientation changes can be determined. In a preferred embodiment, the desired changes to the positions and/or orientations of the bone segments 102, 104 can be effected gradually, in a series of smaller changes. The positions and/or orientations of the bone segments 102, 104 can be changed by changing the positions and/or orientations of the upper and lower fixator rings 106, 108 with respect to each other, for instance by lengthening or shortening one or more of the length adjustable struts 116.


At step 312, the required changes to the geometry of the fixator 100 (i.e., the position and/or orientation of the fixator 100) that can enable the desired changes to the positions and/or orientations of the bone segments 102, 104 can be computed using the matrix algebra described above. For example, the required repositioning and/or reorientation of the second bone segment 104 with respect to the first bone segment 102 can be translated to changes in the position and/or orientation of the lower fixator ring 108 with respect to the upper fixator ring 106. The required changes to the geometry of the fixator can be expressed with respect to a fixator origin 145 designated for the orthopedic fixator 100. It should be appreciated that the fixator origin 145 need not coincide with the space origin 135, as depicted in the illustrated embodiment.
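
As a hedged illustration of this step, the sketch below treats the frame as a generic hexapod: each new strut length is the distance from an upper-ring universal-joint position to the corresponding lower-ring joint position after the desired rigid transform of the lower ring is applied. The joint layout, the transform values, and the function name are illustrative assumptions, and this is a generic hexapod calculation rather than the patent's specific procedure.

```python
# Generic hexapod-style strut length computation for a desired ring repositioning.
import numpy as np

def strut_lengths(upper_joints, lower_joints, R, t):
    """upper_joints, lower_joints: (6, 3) joint positions in the upper-ring frame;
    R, t: desired rotation and translation of the lower ring relative to the upper ring."""
    moved_lower = lower_joints @ R.T + t
    return np.linalg.norm(moved_lower - upper_joints, axis=1)

# Illustrative joint layout: six joints around rings of radius 80 mm.
angles = np.radians([0, 60, 120, 180, 240, 300])
upper = np.stack([80 * np.cos(angles), 80 * np.sin(angles), np.zeros(6)], axis=1)
lower = np.stack([80 * np.cos(angles + 0.3), 80 * np.sin(angles + 0.3),
                  np.full(6, -150.0)], axis=1)

R = np.eye(3)                       # desired orientation change (none, here)
t = np.array([0.0, 5.0, 10.0])      # desired translation of the lower ring (mm)
print(strut_lengths(upper, lower, R, t))
```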


At step 314, the treatment plan can be implemented; that is, the positions and/or orientations of the bone segments 102, 104 can be altered by changing the geometry of the fixator 100.


As described above, one or more of the method steps described herein and illustrated in FIG. 3 can be executed by a computer program, software, firmware or other form of computer-readable instructions incorporated in a computer-readable medium for execution by a computer or processor. Examples of computer-readable media can include computer-readable storage media and computer-readable communication media. Examples of computer-readable storage media include, but are not limited to, a read only memory (ROM), a random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks and digital versatile disks (DVDs). Examples of computer-readable communication media include, but are not limited to, electronic signals transmitted over wired or wireless connections.


It should be appreciated that the orthopedic fixation with imagery analysis techniques described herein provide not only for the use of non-orthogonal images, but also allow the use of overlapping images, images captured using different imaging techniques, images captured in different settings, and the like, thereby presenting a surgeon with greater flexibility when compared with existing fixation and imagery techniques.


It should further be appreciated that the methods and techniques described herein with respect to orthopedic fixation can also be applied to other uses. For example, a repositionable mechanical manipulation apparatus, such as a parallel manipulator, a Stewart platform, or the like, can have first and second objects connected to it. The manipulation apparatus can be made up of a plurality of components. The first and second objects can be any objects that are to be repositioned and/or realigned with respect to each other. Steps similar to those of the orthopedic fixation with imagery analysis method 300 can be applied to reconstruct a three dimensional representation of the first and second objects with respect to the repositionable manipulation apparatus. A three dimensional representation of the first and second objects can be reconstructed and used to determine one or more geometry changes of the manipulation apparatus that when implemented can reposition the first and second objects with respect to each other. The three dimensional representation can be reconstructed using respective first and second pluralities of imaging scene parameters, a location of an element of at least one of the objects in the first image, and a location of an element of at least one of the objects in the second image.


Although the orthopedic fixation with imagery analysis techniques have been described herein with reference to preferred embodiments and/or preferred methods, it should be understood that the words which have been used herein are words of description and illustration, rather than words of limitation, and that the scope of the instant disclosure is not intended to be limited to those particulars, but rather is meant to extend to all structures, methods, and/or uses of the herein described orthopedic fixation with imagery analysis techniques. Those skilled in the relevant art, having the benefit of the teachings of this specification, may effect numerous modifications to the orthopedic fixation with imagery analysis techniques as described herein, and changes may be made without departing from the scope and spirit of the instant disclosure, for instance as recited in the appended claims.

Claims
  • 1. A computer-implemented method of orthopedic fixation imagery analysis, the computer-implemented method comprising: acquiring, by one or more computing devices, first and second two dimensional images of a fixation apparatus and first and second bone segments attached thereto, wherein the first two dimensional image is captured from a first orientation and the second two dimensional image is captured from a second orientation that is different from the first orientation;obtaining, by the one or more computing devices, imaging scene parameters based in part on respective locations of a plurality of fixator elements in the first and the second two dimensional images, the plurality of fixator elements having corresponding physical locations in three dimensional space, wherein the obtaining of the imaging scene parameters comprises: identifying the respective locations of the plurality of fixator elements in the first and the second two dimensional images; andcomputing, based on the plurality of fixator elements, first and second transformation matrices associated with relating the corresponding physical locations of the plurality of fixator elements in the three dimensional space to the respective locations of the plurality of fixator elements in the first and the second two dimensional images; andreconstructing, by the one or more computing devices, a three dimensional representation of the first and the second bone segments with respect to the fixation apparatus based upon the imaging scene parameters.
  • 2. The computer-implemented method of claim 1, wherein the obtaining of the imaging scene parameters is based on a comparison of the respective locations of the plurality of fixator elements in the first and the second two dimensional images with the corresponding physical locations of the plurality of fixator elements in the three dimensional space.
  • 3. The computer-implemented method of claim 1, wherein the first and the second transformation matrices correspond to the first and the second two dimensional images, respectively.
  • 4. The computer-implemented method of claim 1, wherein the obtaining of the imaging scene parameters further comprises: decomposing the first and the second transformation matrices into the imaging scene parameters.
  • 5. The computer-implemented method of claim 1, wherein the computing of the first and the second transformation matrices is based at least in part on one or more lines representing at least one of the plurality of fixator elements.
  • 6. The computer-implemented method of claim 5, wherein the computing of the first and the second transformation matrices comprises constructing rows of matrices based on the one or more lines.
  • 7. The computer-implemented method of claim 5, wherein the computing of the first and the second transformation matrices is based, at least in part, on a point value and a gradient value for each of the one or more lines.
  • 8. The computer-implemented method of claim 1, further comprising identifying respective locations of a plurality of bone elements in the first and the second two dimensional images, the plurality of bone elements comprising anatomical features of the first and the second bone segments.
  • 9. The computer-implemented method of claim 8, wherein the three dimensional representation is further reconstructed based upon the respective locations of the plurality of bone elements.
  • 10. The computer-implemented method of claim 1, wherein the first and the second orientations are not orthogonal with respect to each other.
  • 11. One or more non-transitory computer-readable storage media having stored thereon instructions that, upon execution by one or more computing devices, cause the one or more computing devices to perform operations comprising: acquiring, by the one or more computing devices, first and second two dimensional images of a fixation apparatus and first and second bone segments attached thereto, wherein the first two dimensional image is captured from a first orientation and the second two dimensional image is captured from a second orientation that is different from the first orientation;obtaining, by the one or more computing devices, imaging scene parameters based in part on respective locations of a plurality of fixator elements in the first and the second two dimensional images, the plurality of fixator elements having corresponding physical locations in three dimensional space, wherein the obtaining of the imaging scene parameters comprises: identifying the respective locations of the plurality of fixator elements in the first and the second two dimensional images; andcomputing, based on the plurality of fixator elements, first and second transformation matrices associated with relating the corresponding physical locations of the plurality of fixator elements in the three dimensional space to the respective locations of the plurality of fixator elements in the first and the second two dimensional images; andreconstructing, by the one or more computing devices, a three dimensional representation of the first and the second bone segments with respect to the fixation apparatus based upon the imaging scene parameters.
  • 12. The one or more non-transitory computer-readable storage media of claim 11, wherein the obtaining of the imaging scene parameters is based on a comparison of the respective locations of the plurality of fixator elements in the first and the second two dimensional images with the corresponding physical locations of the plurality of fixator elements in the three dimensional space.
  • 13. The one or more non-transitory computer-readable storage media of claim 11, wherein the first and the second transformation matrices correspond to the first and the second two dimensional images, respectively.
  • 14. The one or more non-transitory computer-readable storage media of claim 11, wherein the obtaining of the imaging scene parameters further comprises: decomposing the first and the second transformation matrices into the imaging scene parameters.
  • 15. The one or more non-transitory computer-readable storage media of claim 11, wherein the computing of the first and the second transformation matrices is based at least in part on one or more lines representing at least one of the plurality of fixator elements.
  • 16. The one or more non-transitory computer-readable storage media of claim 15, wherein the computing of the first and the second transformation matrices comprises constructing rows of matrices based on the one or more lines.
  • 17. The one or more non-transitory computer-readable storage media of claim 15, wherein the computing of the first and the second transformation matrices is based, at least in part, on a point value and a gradient value for each of the one or more lines.
  • 18. The one or more non-transitory computer-readable storage media of claim 11, wherein the operations further comprise identifying respective locations of a plurality of bone elements in the first and the second two dimensional images, the plurality of bone elements comprising anatomical features of the first and the second bone segments.
  • 19. The one or more non-transitory computer-readable storage media of claim 18, wherein the three dimensional representation is further reconstructed based upon the respective locations of the plurality of bone elements.
  • 20. The one or more non-transitory computer-readable storage media of claim 11, wherein the first and the second orientations are not orthogonal with respect to each other.
Priority Claims (1)
Number Date Country Kind
1008281.6 May 2010 GB national
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of, and claims priority to, U.S. patent application Ser. No. 16/111,775, filed on Aug. 24, 2018, which is a continuation of, and claims priority to, U.S. patent application Ser. No. 15/461,969, filed on Mar. 17, 2017, which is a divisional of, and claims priority to, U.S. patent application Ser. No. 13/111,180, filed on May 19, 2011, now issued as U.S. Pat. No. 9,642,649, which claims priority to Great Britain Patent Application Serial Number GB1008281.6, filed May 19, 2010. U.S. patent application Ser. No. 16/111,775, filed on Aug. 24, 2018, is incorporated herein by reference in its entirety. U.S. patent application Ser. No. 15/461,969, filed on Mar. 17, 2017, is incorporated herein by reference in its entirety. U.S. patent application Ser. No. 13/111,180, filed on May 19, 2011, is incorporated herein by reference in its entirety. Great Britain Patent Application Serial Number GB1008281.6, filed May 19, 2010, is incorporated herein by reference in its entirety.

US Referenced Citations (217)
Number Name Date Kind
2055024 Bittner, Jr. Sep 1936 A
2391537 Anderson Dec 1945 A
3977397 Kalnberz et al. Aug 1976 A
4081686 Nieuweboer Mar 1978 A
4450834 Fischer May 1984 A
4489111 Woodrum Dec 1984 A
4615338 Ilizarov et al. Oct 1986 A
4620533 Mears Nov 1986 A
4630203 Szirtes Dec 1986 A
4768524 Hardy Sep 1988 A
4784125 Monticelli et al. Nov 1988 A
4875165 Fencil et al. Oct 1989 A
4889111 Ben-Dov Dec 1989 A
4890631 Hardy Jan 1990 A
4930961 Weis Jun 1990 A
4964320 Lee, Jr. Oct 1990 A
4973331 Pursley et al. Nov 1990 A
5062844 Jamison et al. Nov 1991 A
5074866 Sherman et al. Dec 1991 A
5087258 Schewior Feb 1992 A
5095919 Monticelli et al. Mar 1992 A
5108393 Ruffa Apr 1992 A
5156605 Pursley et al. Oct 1992 A
5179525 Griffis et al. Jan 1993 A
5180380 Pursley et al. Jan 1993 A
5209750 Stef May 1993 A
5275598 Cook Jan 1994 A
5358504 Paley et al. Oct 1994 A
5437668 Aronson et al. Aug 1995 A
5443464 Russell et al. Aug 1995 A
5451225 Ross et al. Sep 1995 A
5458599 Adobbati Oct 1995 A
5540686 Zippel et al. Jul 1996 A
5601551 Taylor et al. Feb 1997 A
5630814 Ross et al. May 1997 A
5653707 Taylor et al. Aug 1997 A
5681309 Ross et al. Oct 1997 A
5702389 Taylor et al. Dec 1997 A
5728095 Taylor et al. Mar 1998 A
5746741 Kraus et al. May 1998 A
5766173 Ross et al. Jun 1998 A
5776132 Blyakher Jul 1998 A
5871018 Delp et al. Feb 1999 A
5885282 Szabo Mar 1999 A
5891143 Taylor et al. Apr 1999 A
5919192 Shouts Jul 1999 A
5951556 Faccioli et al. Sep 1999 A
5961515 Taylor et al. Oct 1999 A
5963612 Navab Oct 1999 A
5967777 Klein Oct 1999 A
5968043 Ross et al. Oct 1999 A
5971984 Taylor et al. Oct 1999 A
5976142 Chin Nov 1999 A
6017341 Windhagen et al. Jan 2000 A
6021579 Schimmels et al. Feb 2000 A
6030386 Taylor et al. Feb 2000 A
6047080 Chen et al. Apr 2000 A
6129727 Austin et al. Oct 2000 A
6206566 Schuetz Mar 2001 B1
6293947 Buchbinder Sep 2001 B1
6320928 Vaillant et al. Nov 2001 B1
6363169 Ritter et al. Mar 2002 B1
6434278 Hashimoto Aug 2002 B1
6501848 Carroll et al. Dec 2002 B1
6510241 Vaillant et al. Jan 2003 B1
6537275 Venturini et al. Mar 2003 B2
6701174 Krause Mar 2004 B1
6711432 Krause et al. Mar 2004 B1
6912293 Korobkin Jun 2005 B1
7113623 Chen et al. Sep 2006 B2
7187792 Fu Mar 2007 B2
7226449 Venturini et al. Jun 2007 B2
7280687 Ban et al. Oct 2007 B2
7306601 McGrath et al. Dec 2007 B2
7388972 Kitson Jun 2008 B2
7490085 Walker et al. Feb 2009 B2
RE40914 Taylor et al. Sep 2009 E
7645279 Haupt Jan 2010 B1
7657079 Lake et al. Feb 2010 B2
7677078 Sauer et al. Mar 2010 B2
7758582 Ferrante et al. Jul 2010 B2
7828801 Mirza et al. Nov 2010 B2
7837621 Krause et al. Nov 2010 B2
7887537 Ferrante et al. Feb 2011 B2
7955334 Steiner et al. Jun 2011 B2
8029505 Hearn et al. Oct 2011 B2
8057474 Knuchel et al. Nov 2011 B2
8062293 Steiner et al. Nov 2011 B2
8147491 Lavi Apr 2012 B2
8157800 Vvedensky et al. Apr 2012 B2
8202273 Karidis Jun 2012 B2
8257353 Wong Sep 2012 B2
8282652 Mackenzi et al. Oct 2012 B2
8296094 Harrison et al. Oct 2012 B2
8323282 Taylor Dec 2012 B2
8333766 Edelhauser et al. Dec 2012 B2
8377060 Vasta et al. Feb 2013 B2
8419732 Mullaney Apr 2013 B2
8425512 Vasta et al. Apr 2013 B2
8430878 Vasta et al. Apr 2013 B2
8439914 Ross et al. May 2013 B2
8444644 Ross et al. May 2013 B2
8454604 Wong Jun 2013 B2
8469958 Stevens Jun 2013 B2
8574232 Ross et al. Nov 2013 B1
8654150 Haskell Feb 2014 B2
8777946 Lindahl et al. Jul 2014 B2
8834467 Singh et al. Sep 2014 B2
8858555 Crozet et al. Oct 2014 B2
8864763 Murray et al. Oct 2014 B2
8906021 Lehmann et al. Dec 2014 B1
8945128 Singh et al. Feb 2015 B2
8951252 Steiner et al. Feb 2015 B2
8952986 Haskell Feb 2015 B2
9011438 Steiner et al. Apr 2015 B2
9017339 Edelhauser et al. Apr 2015 B2
9039706 Murray et al. May 2015 B2
9044271 Edelhauser et al. Jun 2015 B2
9066756 Wong Jun 2015 B2
9078700 Ross et al. Jul 2015 B2
9101398 Singh et al. Aug 2015 B2
9155559 Ross et al. Oct 2015 B2
9204937 Edelhauser et al. Dec 2015 B2
9220533 Singh et al. Dec 2015 B2
9642649 Nikonovas May 2017 B2
9895167 Edelhauser et al. Feb 2018 B2
10932857 Nikonovas Mar 2021 B2
20010018617 Copf Aug 2001 A1
20020010465 Koo et al. Jan 2002 A1
20030106230 Hennessey Jun 2003 A1
20030191466 Austin et al. Oct 2003 A1
20040068187 Krause et al. Apr 2004 A1
20040073211 Austin et al. Apr 2004 A1
20040073212 Kim Apr 2004 A1
20040082849 Schweikard Apr 2004 A1
20040111024 Zheng et al. Jun 2004 A1
20040133199 Coati et al. Jul 2004 A1
20040167518 Estrada Aug 2004 A1
20040208279 Xiao et al. Oct 2004 A1
20050149018 Cooper et al. Jul 2005 A1
20050215997 Austin et al. Sep 2005 A1
20050256389 Koga Nov 2005 A1
20060276786 Brinker Dec 2006 A1
20070043354 Koo Feb 2007 A1
20070043429 Hegel Feb 2007 A1
20070161983 Cresina et al. Jul 2007 A1
20070161984 Cresina et al. Jul 2007 A1
20070238069 Lovald Oct 2007 A1
20080012850 Keating, III Jan 2008 A1
20080114267 Lloyd May 2008 A1
20090036890 Karidis Feb 2009 A1
20090036892 Karidis et al. Feb 2009 A1
20090105621 Boyd et al. Apr 2009 A1
20090143788 Fang Jun 2009 A1
20090161945 Morgan-Mar et al. Jun 2009 A1
20090177198 Theodoros et al. Jul 2009 A1
20090226055 Dankowicz et al. Sep 2009 A1
20090275944 Huebner et al. Nov 2009 A1
20090326532 Schulze Dec 2009 A1
20090326560 Lampropoulos Dec 2009 A1
20100030219 Lerner et al. Feb 2010 A1
20100039421 Toyomura et al. Feb 2010 A1
20100087819 Mullaney Apr 2010 A1
20100104150 Saint et al. Apr 2010 A1
20100172567 Prokoski Jul 2010 A1
20100179548 Marin Jul 2010 A1
20100191239 Sakkers et al. Jul 2010 A1
20100280516 Taylor Nov 2010 A1
20100305568 Ross et al. Dec 2010 A1
20110004199 Ross et al. Jan 2011 A1
20110029093 Bojarski Feb 2011 A1
20110103676 Mullaney May 2011 A1
20110131418 Teng et al. Jun 2011 A1
20110313419 Mullaney Dec 2011 A1
20120029280 Kucklick Feb 2012 A1
20120078251 Benenati et al. Mar 2012 A1
20120136355 Wolfson May 2012 A1
20120232554 Shaevitz et al. Sep 2012 A1
20120259343 Clark et al. Oct 2012 A1
20120330312 Burgherr et al. Dec 2012 A1
20130041288 Taylor et al. Feb 2013 A1
20130060146 Yang Mar 2013 A1
20130131675 Vasta et al. May 2013 A1
20130138017 Jundt et al. May 2013 A1
20130211521 Shenoy et al. Aug 2013 A1
20130245625 Vasta et al. Sep 2013 A1
20130296857 Barnett et al. Nov 2013 A1
20140135764 Ross et al. May 2014 A1
20140236152 Walberg et al. Aug 2014 A1
20140257286 Lindahl et al. Sep 2014 A1
20140278325 Burgherr et al. Sep 2014 A1
20140303670 Colloca Oct 2014 A1
20140379038 Dogramadzi et al. Dec 2014 A1
20150080892 Lehmann et al. Mar 2015 A1
20150088135 Singh Mar 2015 A1
20150112339 Lindahl et al. Apr 2015 A1
20150223842 Murray et al. Aug 2015 A1
20150238227 Singh et al. Aug 2015 A1
20150257788 Jay et al. Sep 2015 A1
20150265313 Wong Sep 2015 A1
20150272624 Singh Oct 2015 A1
20150305776 Ross et al. Oct 2015 A1
20150305777 Singh et al. Oct 2015 A1
20150313641 Ross et al. Nov 2015 A1
20160022314 Bordeaux et al. Jan 2016 A1
20160045225 Edelhauser et al. Feb 2016 A1
20160092651 Austin et al. Mar 2016 A1
20160113681 Singh Apr 2016 A1
20160125603 Tanji May 2016 A1
20160183979 Del Deo et al. Jun 2016 A1
20170181800 Nikonovas Jun 2017 A1
20170224520 Karasahin Aug 2017 A1
20170303966 Edelhauser et al. Oct 2017 A1
20170348054 Kumar et al. Dec 2017 A1
20170348057 Kumar et al. Dec 2017 A1
20170354439 Mannanal et al. Dec 2017 A1
20180055569 Wahl et al. Mar 2018 A1
Foreign Referenced Citations (39)
Number Date Country
1494397 May 2004 CN
101296664 Oct 2008 CN
102883671 Jan 2013 CN
103270513 Aug 2013 CN
105852985 Aug 2016 CN
1100048 May 2001 EP
1690506 Aug 2006 EP
2767252 Aug 2014 EP
2576774 Aug 1986 FR
2756025 May 1998 FR
2001-523985 Nov 2001 JP
2003-144454 May 2003 JP
2003-530177 Oct 2003 JP
2004-254899 Sep 2004 JP
2006-507056 Mar 2006 JP
2006-218298 Aug 2006 JP
2009-505736 Feb 2009 JP
2011-512883 Apr 2011 JP
2013-526377 Jun 2013 JP
20-0443058 Jan 2009 KR
2159091 Nov 2000 RU
2352283 Apr 2009 RU
9812975 Apr 1998 WO
9959100 Nov 1999 WO
0115611 Mar 2001 WO
0178015 Oct 2001 WO
0330759 Apr 2003 WO
2007024904 Mar 2007 WO
2009102904 Aug 2009 WO
2010002587 Jan 2010 WO
2010104567 Sep 2010 WO
2011026475 Mar 2011 WO
2011060264 May 2011 WO
2011060266 May 2011 WO
2011146703 Nov 2011 WO
2012021307 Feb 2012 WO
2014186453 Nov 2014 WO
2019040829 Feb 2019 WO
2020023686 Jan 2020 WO
Non-Patent Literature Citations (39)
Entry
Changjiang Yang et al.: “Planar conic based camera calibration”, Proceedings / 15th International Conference on Pattern Recognition Barcelona, Spain, Sep. 3-7, [Proceedings of the International Conference on Pattern Recognition. (ICPR)], IEEE Computer Society, Los Alamitos, Calif. [U.A.], vol. 1, Sep. 3, 2000 (Sep. 3, 2000) pp. 555-558.
Charlton, An Investigation into the Effect of Lateral Hillslope Inputs on Floodplain Hydraulic Model Predictions, Diss. University of Bristol, Sep. 1995, 289 pages.
Circle Hough Transform, Wikipedia, https://en.wikipedia.org/wiki/Circle_Hough_Transform, web-archive capture from Jan. 23, 2020, accessed on Mar. 4, 2021 from web.archive.org/web/20200123161407/https://en.wikipedia.org/wiki/Circle_Hough_Transform, 5 pages.
Circle Hough Transform, Wikipedia, https://en.wikipedia.org/wiki/Circle_Hough_Transform; webpage accessed Apr. 3, 2020, 5 pages.
Decision to Grant (Translation) dated Mar. 2016 in Russian patent application 2012147835, 6 pages.
Durali M, Shameli E. Full order neural velocity and acceleration observer for a general 6-6 Stewart platform. In Networking, Sensing and Control, 2004 IEEE International Conference on Mar. 21, 2004 (vol. 1, pp. 333-338).
Garreau et al., “A Knowledge-Based Approach for 3-D Reconstruction and Labeling of Vascular Networks from Biplane Angiographic Projections”, IEEE Transactions On Medical Imaging, Jun. 1991, vol. 10, No. 2, 122-131.
Hartley, "Euclidean Reconstruction from Uncalibrated Views", Applications of Invariance in Computer Vision, 1994, vol. 825, pp. 237-256.
Iterative Closest Point, Wikipedia, https://en.wikipedia.org/wiki/iterative_closest_point, web-archive capture from Jan. 17, 2019, accessed on Mar. 4, 2021 from web.archive.org/web/20190117001205/https://en.wikipedia.org/wiki/iterative_closest_point, 3 pages.
Iterative Closest Point, Wikipedia, https://en.wikipedia.org/wiki/iterative_closest_point, web-archive capture from Oct. 28, 2010, accessed on Oct. 26, 2020 from web.archive.org/web/20101028140305/https://en.wikipedia.org/wiki/iterative_closest_point, 3 pages.
Iterative Closest Point, Wikipedia, https://en.wikipedia.org/wiki/iterative_closest_point, web-archive capture from Sep. 13, 2006, accessed on Oct. 26, 2020 from web.archive.org/web/20060913000000/http://en.wikipedia.org/wiki/iterative_closest_point, 1 page.
Iterative Closest Point, Wikipedia, https://en.wikipedia.org/wiki/Iterative_closest_point, webpage accessed Apr. 3, 2020, 3 pages.
Kelly, “How to calculate 3D coordinates with two cameras, a calibration object, a java program, and a lot of MS Excel macros”, Jun. 10, 2002, 9 pages.
Larionova, et al.: “Roentgen absorptiometry for analyzing bone tissue mineral density in orthopaedic-and-traumatological patients”, Genius Orthopedics, No. 3, 2009, pp. 98-102.
Maiocchi et al.; "Instruments and Their Use"; Operative Principles of Ilizarov; Chapter 2, 1991, 26 pages.
Nikonovas, Arkadijus. Taylor Spatial Frame: Kinematics, Mechanical Properties and Automation. Diss. University of Bristol, May 2005, 230 pages.
Ortho-SUV Frame—Art of Deformity Correction, Ortho-SUV Ltd, captured by https://web.archive.org from http://www.miito.org/download/ortho-suv-frame-eng.pdf on Jun. 13, 2010; 11 pages.
Orthofix, TL-HEX Software User's Guide: Software version 1.4, Nov. 2015, 60 pages.
Paley et al., “Deformity Correction By The Ilizarov Technique”, Operative Orthopaedics, 1993, 883-948.
Paley, “The principles of deformity correction by the Ilizarov technique: Technical aspects”, Techniques in Orthopaedics, 1989, vol. 4, Issue 1, 15-29.
Parikh PJ, Lam SS. A hybrid strategy to solve the forward kinematics problem in parallel manipulators. IEEE Transactions on Robotics Feb. 2005; 21(1): 18-25.
Point set registration, Wikipedia, https://en.wikipedia.org/wiki/Point_set_registration, web-archive capture from Oct. 16, 2019, accessed on Mar. 4, 2021 from web.archive.org/web/20191016232144/https://en.wikipedia.org/wiki/Point_set_registration.
Point set registration, Wikipedia, https://en.wikipedia.org/wiki/Point_set_registration, webpage accessed Apr. 3, 2020, 11 pages.
Ren L, Feng Z, Mills JK. A self-tuning iterative calculation approach for the forward kinematics of a Stewart-Gough platform. In Mechatronics and Automation, Proceedings of the 2006 IEEE International Conference on Jun. 25, 2006, 2018-2023.
Russakoff et al., “Intensity-Based 2D-3D Spine Image Registration Incorporating a Single Fiducial Marker”, Academic Radiology, Jan. 2005, vol. 12, No. 1, 37-50.
Simard et al., “The Ilizarov Procedure: Limb Lengthening and Its Implications”, Physical Therapy, Jan. 1992, vol. 72, No. 1, 25-35.
Solomin et al., Deformity Correction and Fracture Treatment by software-based Ortho-SUV Frame User Manual Draft, year and date of publication are unknown, 90 pages.
Solomin et al., Deformity Correction and Fracture Treatment by software-based Ortho-SUV Frame, User Manual, For SUV-Software vp 1.0 and vr 1.0, Vreden Russian Research Institute of Traumatology and Orthopedics, (Ortho-SUV) Ltd., Saint Petersburg, 2013, 144 pages.
Solomin et al., Deformity Correction and Fracture Treatment by software-based Ortho-SUV Frame, User Manual, For SUV-Software vp 2.1, Vreden Russian Research Institute of Traumatology and Orthopedics, (Ortho-SUV) Ltd., Saint Petersburg, Apr. 2016, 158 pages.
Solomin, The Basic Principles of External Fixation Using The Ilizarov Device, 2005, 371 pages.
Stoughton et al., “A Modified Stewart Platform Manipulator with Improved Dexterity”, IEEE Transactions On Robotics And Automation, Apr. 1993, vol. 9, No. 2, 166-173.
Stryker, Hoffmann LRF, Gradual Correction, Operative technique, Jul. 2016, 36 pages.
Stryker, Hoffmann LRF Hexapod, Operative technique, Jul. 2016, 44 pages.
T.A. Larionova et al. “X-ray absorptiometry in the analysis of bone mineral density of a patient with an orthopaedic trauma”, Genius of Orthopaedy No. 3, pp. 98-102 (w/English abstract), 2009.
Trucco et al., “Introductory Techniques of 3-D Computer Vision”, 1998, pp. 178-194.
Tsai, "A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses", IEEE Journal of Robotics & Automation, RA-3, No. 4, Aug. 1987, 323-344.
U.S. Appl. No. 15/247,333, filed Aug. 25, 2016, Non-final Rejection dated Sep. 19, 2018, 21 pages.
Viceconti et al., “A software simulation of tibial fracture reduction with external fixator”, Computer Methods and Programs in Biomedicine, 1993, 40, 89-94.
Zhang, A Flexible New Technique for Camera Calibration, Technical Report, MSR-TR-98-71, Microsoft Research, Microsoft Corporation, Mar. 25, 1999, 22 pages.
Related Publications (1)
Number Date Country
20210153944 A1 May 2021 US
Divisions (1)
Number Date Country
Parent 13111180 May 2011 US
Child 15461969 US
Continuations (2)
Number Date Country
Parent 16111775 Aug 2018 US
Child 17163850 US
Parent 15461969 Mar 2017 US
Child 16111775 US