The disclosure relates generally to diagnostic imaging and in particular to methods and apparatus for characterization of bone joint structure and condition.
Joint damage in arthritis (such as osteoarthritis and rheumatoid arthritis (RA)) can result in functional impairment, disability, and overall mobility loss. Analysis from trial data has demonstrated that joint space narrowing, rather than erosive damage, is associated with an irreversible decline in physical function over time. Joint space narrowing has been largely ignored as a sign of progression of disease in comparison to erosion development, but is clearly of significant importance for improved diagnosis and care.
Current scoring methods for joint space narrowing using conventional radiographs are characterized by inaccuracy and relative insensitivity to change. In one scoring system widely used in randomized clinical trials in RA, an ordinal scale characterizes normal joint space, minimal narrowing, generalized narrowing with either <50% or >50% of the joint space remaining, or complete loss of joint space. Ordinal scales characterize incremental steps in change, but may miss small continuous changes that represent progression. To address this, methods have been described for directly measuring the joint space width using 2-D radiography or a variety of automated software programs. Additionally, joint space width measurements have been determined using digital X-ray radiogrammetry (DXR), a technology more traditionally used for measuring bone mineral density in the hand. These techniques attempt to determine the measurement from a 2-dimensional image and, at least in part due to the complexities of joint structure, can be subject to projection errors, discrepancies related to joint position, and obscured or damaged joint margins. Alternate imaging technologies that can be used for obtaining volume image data content include magnetic resonance imaging (MRI) and ultrasound.
To measure joint space width quantitatively, sensitive tools that reliably characterize the bone and related tissue interface at the joint are desired. Conventional 2-D radiography and other methods have provided some help for bone joint characterization, but fall short of what is needed for effectively visualizing and quantifying joint condition in order to provide accurate biometric data or any type of biomarker that is indicative of conditions such as ageing, bone loss, disease, damage, or infection.
Volume or 3-D imaging methods such as computed tomography (CT), including cone-beam computed tomography (CBCT), can be useful tools for imaging bone, with CT viewed as superior for detecting erosive changes and for providing more detailed information related to bone surfaces. High-resolution peripheral quantitative computed tomography is capable of imaging bones and can provide a high degree of accuracy for quantitative assessment of bone and joint condition. However, the diagnostician and clinician need tools and utilities that provide more accurate characterization of joint condition and that allow standardization and quantification of factors related to joint health, for long-term monitoring as well as immediate care functions.
The Applicants desire to improve the methodology to more accurately characterize joint spacing using volume imaging techniques.
Certain embodiments described herein address the need for a method for characterizing joint condition of a patient. This characterization can provide improved visualization and metrics that relate to distance and pressure information for localized joint areas as well as provide more global information related to the joint surface and interface volumes as indicators for the joint health.
Another aspect of the present disclosure is to display, store, or transmit imagery that characterizes joint spacing of a patient.
According to at least one aspect of the invention, there is described a method for characterizing bone joint spacing of a patient, the method executed at least in part by a computer. The method includes: accessing a 3-D volume image that includes at least bone content and background; automatically segmenting a 3-D bone region from the 3-D volume image to generate a 3-D bone volume image having a plurality of voxels and at least one joint; automatically computing, from the 3-D bone volume image, a 3-D distance map image of the at least one joint; computing one or more joint spacing parameters of the at least one joint from the 3-D distance map image; and displaying, storing, or transmitting the one or more joint spacing parameters.
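By way of illustration only, the sequence of these steps can be sketched in a short program. The global-threshold segmentation and the simple column-wise gap measure below are hypothetical stand-ins for the claimed segmentation and distance computations, and all names, thresholds, and values are invented for the example:

```python
import numpy as np

def segment_bone(volume_hu, threshold=300.0):
    # Hypothetical global-threshold segmentation: voxels above the
    # threshold are treated as bone, the rest as background.
    return volume_hu > threshold

def joint_spacing_parameters(bone_mask, voxel_mm=0.5):
    # For each (y, x) column that passes through two facing bone
    # surfaces, measure the interior non-bone gap along z, then
    # summarize the per-column gaps as joint spacing parameters.
    gaps_mm = []
    _, ny, nx = bone_mask.shape
    for y in range(ny):
        for x in range(nx):
            idx = np.flatnonzero(bone_mask[:, y, x])
            if idx.size < 2:
                continue  # column does not cross two surfaces
            interior = bone_mask[idx[0]:idx[-1] + 1, y, x]
            gap_voxels = int(np.count_nonzero(~interior))
            if gap_voxels > 0:
                gaps_mm.append(gap_voxels * voxel_mm)
    gaps_mm = np.asarray(gaps_mm)
    if gaps_mm.size == 0:
        return {}
    return {"min_mm": float(gaps_mm.min()),
            "mean_mm": float(gaps_mm.mean()),
            "max_mm": float(gaps_mm.max())}

# Synthetic two-slab "joint": bone above and below a 6-voxel gap.
vol = np.zeros((20, 5, 5))
vol[0:6] = 1000.0    # upper bone slab (synthetic HU values)
vol[12:20] = 1000.0  # lower bone slab
params = joint_spacing_parameters(segment_bone(vol), voxel_mm=0.5)
# Every column sees a 6-voxel (3.0 mm) gap in this synthetic case.
```

The resulting minimum, mean, and maximum spacing values are examples of joint spacing parameters that could then be displayed, stored, or transmitted.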
According to another aspect of the invention, there is provided a method for characterizing a bone joint of a patient, the method executed at least in part by a computer and comprising: accessing 3-D volume image content that includes the bone joint; automatically segmenting the bone joint volume image content from background content to define at least a first bone surface and a second bone surface that is spaced apart from and faces the first bone surface; computing one or more distances between at least a first point on the first bone surface and one or more points on the second bone surface; displaying at least the first and second bone surfaces, wherein the display appearance is conditioned by the one or more computed distances; and displaying, storing, or transmitting data relating to the one or more computed distances.
These aspects are given only by way of illustrative example, and such objects may be exemplary of one or more embodiments of the invention. Other desirable objectives and advantages inherently achieved by the disclosed invention may occur or become apparent to those skilled in the art. The invention is defined by the appended claims.
The foregoing and other objects, features, and advantages of the invention will be apparent from the following more particular description of the embodiments of the invention, as illustrated in the accompanying drawings. The elements of the drawings are not necessarily to scale relative to each other.
The following is a detailed description of the embodiments of the invention, reference being made to the drawings in which the same reference numerals identify the same elements of structure in each of the several figures.
Where they are used in the context of the present disclosure, the terms “first”, “second”, and so on, do not necessarily denote any ordinal, sequential, or priority relation, but are simply used to more clearly distinguish one step, element, or set of elements from another, unless specified otherwise.
As used herein, the term “energizable” relates to a device or set of components that perform an indicated function upon receiving power and, optionally, upon receiving an enabling signal.
In the context of the present disclosure, the phrase “in signal communication” indicates that two or more devices and/or components are capable of communicating with each other via signals that travel over some type of signal path. Signal communication may be wired or wireless. The signals may be communication, power, data, or energy signals. The signal paths may include physical, electrical, magnetic, electromagnetic, optical, wired, and/or wireless connections between the first device and/or component and second device and/or component. The signal paths may also include additional devices and/or components between the first device and/or component and second device and/or component.
In the context of the present disclosure, the term “extremity” has its meaning as conventionally understood in diagnostic imaging parlance, referring to knees, legs, ankles, fingers, hands, wrists, elbows, arms, and shoulders and any other anatomical extremity. The term “subject” is used to describe the extremity of the patient that is imaged, such as the “subject leg”, for example. The term “paired extremity” is used in general to refer to any anatomical extremity wherein normally two or more are present on the same patient. In the context of the present invention, the paired extremity is not imaged; only the subject extremity is imaged.
To describe an embodiment of the present disclosure in detail, the examples given herein focus on imaging of the load-bearing lower extremities of the human anatomy, such as the hip, the leg, the knee, the ankle, and the foot, for example. However, these examples are considered to be illustrative and non-limiting. The imaging and measurement methods of the present disclosure can similarly be applied for joints that may not be considered as load-bearing. Moreover, different metrics can be provided for the same joint under load-bearing and non-load-bearing conditions, as described in more detail subsequently.
In the context of the present disclosure, the term “arc” or, alternately, “circular arc”, has its conventional meaning as being a portion of a circle of less than 360 degrees or, considered alternately, of less than 2π radians for a given radius.
In the context of the present disclosure, “volume image”, “volume image content” or “3D volume image” describes the reconstructed image data for an imaged subject, generally generated and stored as a set of voxels derived from measurements of density to radiation energy. Image display utilities use the volume image content in order to display features within the volume, selecting specific voxels that represent the volume content for a particular slice or view of the imaged subject. Thus, volume image content is the body of resource information that is obtained from a CBCT or other volume imaging reconstruction process and that can be used to generate depth visualizations of the imaged subject. The 3-D volume image can be obtained from a volume imaging apparatus such as a CT (computed tomography), CBCT (cone beam computed tomography), and/or MRI (magnetic resonance imaging) system, for example.
In the context of the present disclosure, the term “bone joint” is used to include the combined skeletal structures, including bone mineral density (BMD), that can be imaged and calculated using the radiation energy of a CT, CBCT, or other volume imaging system using well-known techniques. The bone joint is associated with cartilage and connective tissue at the bone interface, which cooperate to provide joint function and movement. Unless specifically stated otherwise, the term “bone surface” does not include cartilaginous tissues that form a portion of the mating surfaces at the joint. Measurements obtained using an embodiment of the present disclosure can characterize features of the cartilaginous tissue, such as depth or width and overall volume, but do not directly image the cartilage that lies within and cooperates with the bone joint.
The term “highlighting” for a displayed feature has its conventional meaning as is understood to those skilled in the information and image display arts. In general, highlighting uses some form of localized display enhancement to attract the attention of the viewer. Highlighting a portion of an image, such as an individual organ, bone, or structure, or a path from one chamber to the next, for example, can be achieved in any of a number of ways, including, but not limited to, annotating, displaying a nearby or overlaying symbol, outlining or tracing, display in a different color or at a markedly different intensity or gray scale value than other image or information content, blinking or animation of a portion of a display, or display at higher sharpness or contrast.
The perspective schematic view of
A full 360 degree orbit of the source and detector may not be needed for conventional CBCT imaging; instead, sufficient information for image reconstruction can often be obtained with an orbital scan range that just exceeds 180 degrees by the angle of the cone beam itself, for example.
Using a volume imaging apparatus such as that shown schematically in
Embodiments of the present disclosure describe a number of measurements and calculations that can be used to characterize the condition of a bone joint for a patient. Characterization techniques described herein can provide metrics for various aspects of joint health, including measurements and calculations and distributions of data that can be visualized on a display or printed surface.
Bone joint distance is, to a first approximation, inversely proportional to the overall pressure that is applied within the joint and therefore provides a useful first indicator of overall joint condition.
Based on the sensed pressure measurements from transducer 110 and on the weight W applied, the relationship between weight and pressure over the surface area that corresponds to the transducer can be modeled. Results of the pressure measurements can be used, for example, to generate a look-up table (LUT) that gives the corresponding pressure related to weight. In addition, by scanning the joint J1 with a volume imaging apparatus such as that shown schematically in
Embodiments of the present disclosure provide methods for obtaining, from the volume image content, measurement data that characterizes the bone joint and provides useful information on joint spacing and contact area characteristics.
Information that can be particularly useful for diagnosis of RA and other joint conditions relates to spacing between skeletal surface structures and characterization of contact surface areas where bone and related articular cartilage come into close proximity in order to cooperate for allowing articulated movement at the joint. Of particular interest for methods of the present disclosure is the relationship between the distance separating skeletal surfaces and the corresponding pressure upon synovial fluid and cartilage within the joint. While the relationship of distance to pressure can be complex and can be affected by various factors depending on the particular joints being examined, it is apparent that the overall relationship of distance as inversely proportional to pressure is diagnostically useful as a first-approximation indicator of RA and other joint conditions.
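As a minimal sketch of how a measured weight-to-pressure relationship might be modeled as a look-up table, the example below interpolates between calibration points. The calibration values are invented for illustration and are not measured data:

```python
import numpy as np

# Hypothetical calibration pairs: applied weight vs. pressure sensed
# over the transducer area. These numbers are invented for the
# example and are not measured data.
calib_weight_kg = np.array([0.0, 20.0, 40.0, 60.0, 80.0])
calib_pressure_kpa = np.array([0.0, 55.0, 118.0, 190.0, 270.0])

def pressure_from_weight(weight_kg):
    # Piecewise-linear interpolation between LUT calibration points.
    return float(np.interp(weight_kg, calib_weight_kg, calib_pressure_kpa))
```

Between calibration points, intermediate pressures are interpolated; a denser calibration table would reduce interpolation error.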
Once segmentation and labeling steps have been completed, a distance map generation step S130 can be executed.
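A sketch of one possible distance map computation follows, assuming a Euclidean distance transform of the non-bone space; the function name, voxel size, and synthetic mask are illustrative choices, not necessarily the claimed implementation of step S130:

```python
import numpy as np
from scipy import ndimage

def joint_distance_map(bone_mask, voxel_mm=0.5):
    # Euclidean distance transform of the non-bone space: each
    # background voxel receives its distance (in mm) to the nearest
    # bone voxel, giving a 3-D distance map of the joint space.
    return ndimage.distance_transform_edt(~bone_mask, sampling=voxel_mm)

# Synthetic joint: two bone slabs separated by a 5-voxel gap.
mask = np.zeros((11, 3, 3), dtype=bool)
mask[0:3] = True   # upper bone
mask[8:11] = True  # lower bone
dmap = joint_distance_map(mask, voxel_mm=1.0)
# The gap midplane (z = 5) lies 3 voxels from either surface.
```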
Referring back to the sequence shown in
By way of example,
Distance values for a single point in the bone joint can be computed for a number of slightly different distance metrics, as shown in
The relative diagnostic significance of the different distance metrics shown for a joint in
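To make the distinction between such metrics concrete, the following sketch compares a closest-point (minimum Euclidean) distance against a vertical distance measured along the load axis; the surface points are invented for the example:

```python
import numpy as np

def nearest_distance(p, surface_pts):
    # "Closest point" metric: minimum Euclidean distance from point
    # p on one surface to any point of the facing surface.
    return float(np.min(np.linalg.norm(surface_pts - p, axis=1)))

def vertical_distance(p, surface_pts, tol=1e-6):
    # Vertical metric: distance measured straight along the z (load)
    # axis, using only facing-surface points directly above/below p.
    same_xy = np.all(np.abs(surface_pts[:, :2] - p[:2]) < tol, axis=1)
    return float(np.min(np.abs(surface_pts[same_xy, 2] - p[2])))

p = np.array([0.0, 0.0, 0.0])          # point on surface S1
s2 = np.array([[0.0, 0.0, 4.0],        # point directly above p
               [2.0, 0.0, 3.0]])       # oblique neighbor, closer overall
```

For this synthetic pair of surfaces, the two metrics disagree: the oblique neighbor gives the smaller Euclidean distance, while the vertical metric reports the larger straight-line gap along the load axis.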
In some cases, bone joint analysis can include additional processing to re-align bone position for one or more bone structures within the joint, according to pressure exerted by the patient's weight. In the case of an injury or fracture, simulated behavior along a weight-bearing joint may be used as a model, such as where it would not be feasible to obtain an image of the limb under actual weight-bearing conditions. Volume images of the joint features can be used to simulate joint behavior according to the model and can be used to guide treatment of the injury.
In addition to point-by-point proximity along joint surfaces, the pressure contribution from nearby points may be of diagnostic interest. The schematic diagram of
Distance map generation step S130 then computes distances between points on facing surfaces S1 and S2 using a predetermined distance metric, as described previously. A decision step S140 then determines whether or not the joint is imaged under load-bearing conditions. For a load-bearing joint, a processing step S150 applies higher weighting to vertical distance, with some consideration for weighting of other nearby distance values, as well as considering the contribution of sliding forces as shown in
A display step S170 then assigns color or other appearance characteristics to voxel values, conditioned by the computed distances. Deep red colors, for example, can be assigned to contact areas or areas within a minimum distance of a facing surface. Display step S170 can display the reconstructed volume, display only the contact surface features, or display only distance mapping information, depending on system design and, optionally, operator preference. The generated data can alternately be stored or transmitted to a different computer or processor.
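One hypothetical way to condition display appearance on a computed distance is a simple linear color ramp, with deep red at or below a contact threshold; the threshold values below are illustrative, not clinically validated:

```python
def distance_to_rgb(d_mm, contact_mm=1.0, max_mm=5.0):
    # Linear color ramp: deep red at or below the contact threshold,
    # shading toward blue as the gap widens. Threshold values are
    # illustrative, not clinically validated.
    t = (d_mm - contact_mm) / (max_mm - contact_mm)
    t = min(max(t, 0.0), 1.0)
    return (int(round(255 * (1.0 - t))), 0, int(round(255 * t)))
```

With these example thresholds, a voxel 0.5 mm from the facing surface would render deep red, while a 5 mm gap would render blue.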
In a similar manner,
Segmentation utilities allow the practitioner to view bones and surfaces of particular features, isolated from other structures of the joint.
According to an embodiment of the present disclosure, an image of the volume between facing bone surfaces can be generated.
According to an embodiment of the present disclosure, the volume of space 92 within a joint is calculated for comparison with the calculated volume from a previous imaging exam in order to characterize bone joint condition according to its rate of change. Optionally, histogram data related to the joint spacing is used to provide a metric based on changes in distance distribution over a time period. Color-encoded display of the bone volume can help to further characterize the condition of a particular joint with localized information. Using the detected volume within the bone joint can prove to be a fairly robust method for characterizing the joint and relative cartilage loss over time. Information on the total volume, the distribution of the volume, and the change in volume over time offers more insight into the joint than distance measures by themselves.
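A sketch of the volume and histogram metrics described above, using invented voxel counts; the resulting 20% loss figure is a synthetic example, not clinical data:

```python
import numpy as np

def joint_space_volume_mm3(gap_mask, voxel_mm=0.5):
    # Total joint-space volume: voxel count times voxel volume.
    return float(np.count_nonzero(gap_mask)) * voxel_mm ** 3

def spacing_histogram(distances_mm, bin_mm=0.5, max_mm=5.0):
    # Distribution of joint-space distances; comparing histograms
    # from successive exams captures changes in the distance
    # distribution over time.
    bins = np.arange(0.0, max_mm + bin_mm, bin_mm)
    counts, _ = np.histogram(distances_mm, bins=bins)
    return counts

# Synthetic prior and current exams: the joint space shrinks.
prior = joint_space_volume_mm3(np.ones((10, 10, 10), dtype=bool), 0.5)
current = joint_space_volume_mm3(np.ones((10, 10, 8), dtype=bool), 0.5)
loss_pct = 100.0 * (prior - current) / prior
```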
Another feature of an embodiment of the present disclosure relates to the capability to simulate selective, partial disassembly of the joint structure, allowing each bone feature of the joint to be displayed individually or in combination with only a subset of the other bones in the joint. Referring to
Another feature of methods and utilities provided by the present disclosure allows a practitioner to compare volume images obtained from the patient over a period of time. Images acquired one or two years previously, for example, can help the practitioner to view and quantify changes to bone spacing and corresponding pressure by allowing the use of volume images of the bone spacing itself. The use of a sliding bar or other visual tool can further enhance the view of bone spacing as shown in
According to an alternate embodiment of the present disclosure, the control logic for the volume image processing monitors spacing volume changes and automatically detects and reports change values that exceed predetermined threshold values. Reporting can use various metrics that have potential diagnostic value, such as the number of pixels or voxels with changed values or the overall volume calculations that vary between exams, for example.
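Such threshold-based reporting might be sketched as follows; the 10% default threshold is an arbitrary illustrative value:

```python
def report_spacing_change(prior_mm3, current_mm3, threshold_pct=10.0):
    # Flag an exam pair when the joint-space volume change exceeds a
    # predetermined threshold; the 10% default is an arbitrary
    # illustrative value, not a clinical recommendation.
    change_pct = 100.0 * (current_mm3 - prior_mm3) / prior_mm3
    return {"change_pct": change_pct,
            "flagged": abs(change_pct) >= threshold_pct}
```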
Visual and quantitative comparison of image content and measured values for smoothness and texture of bone surfaces at the joint can also be displayed as calculated or visual data in the display of
It can also be advantageous to show the bone joint characterization under both weight-bearing and non-weight-bearing conditions.
Additional data can also be provided by the imaging apparatus as indicators of overall bone density. Given the segmented and labeled bone structures of the joint, the relative Hounsfield values of voxels can be an indicator of trabecular bone mass and overall bone strength near the joint. Trabecular structure can be segmented and its relative volume near the joint calculated, for example, by showing the percentage of trabecular bone structure relative to other bone material.
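One hypothetical way to compute such a trabecular percentage from Hounsfield values is shown below; the HU cutoffs and synthetic voxel values are illustrative, not validated clinical thresholds:

```python
import numpy as np

def trabecular_fraction(volume_hu, bone_mask, hu_range=(150.0, 600.0)):
    # Fraction of segmented bone voxels whose Hounsfield values fall
    # in a trabecular range versus denser cortical bone. The HU
    # cutoffs are hypothetical, not validated clinical thresholds.
    hu = volume_hu[bone_mask]
    low, high = hu_range
    return float(np.count_nonzero((hu >= low) & (hu < high))) / hu.size

# Synthetic example: two trabecular and two cortical voxels.
vol_hu = np.array([200.0, 300.0, 800.0, 1000.0])
bone = np.ones(4, dtype=bool)
frac = trabecular_fraction(vol_hu, bone)
```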
Templates can be devised not only to specify fixed perspective views, but also to compare joint spacing and pressure for an individual patient with standard measurements from a population of patients, allowing a grading or scoring to be used as a metric for bone joint health assessment.
The logic flow diagram of
(i) identification of bone joint components for display;
(ii) initial angle for bone joint orientation;
(iii) initial scale specifications;
(iv) color mapping for different distance measurements; and
(v) segmentation specifications.
Various other content can also be contained in template 98.
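By way of illustration only, a template such as template 98 might be represented in memory as a simple mapping; all field names and values below are hypothetical:

```python
# Hypothetical in-memory representation of a display template such
# as template 98; all field names and values are invented for the
# example. Comments map entries to items (i)-(v) listed above.
template_98 = {
    "display_components": ["femur", "tibia", "patella"],  # (i)
    "initial_angle_deg": 30.0,                            # (ii)
    "initial_scale": 1.5,                                 # (iii)
    "distance_color_map": {                               # (iv)
        (0.0, 1.0): "deep_red",
        (1.0, 3.0): "yellow",
        (3.0, None): "blue",
    },
    "segmentation": {"hu_threshold": 300.0},              # (v)
}
```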
Continuing with the sequence of
Bone density and related trabecular structure of the inner portions of the bone near the joint surface can also be a useful indicator of overall bone health and joint condition. Bone surface smoothness, providing a close view of bone texture that is available from segmented views of the joint surfaces, can be a useful diagnostic tool.
The Applicants have developed a method for determining joint spacing of a patient. The method can be executed at least in part by a computer. A 3-D volume image that includes at least bone content and background is accessed. A 3-D bone region is automatically segmented from the 3-D volume image to generate a 3-D bone volume image. The 3-D bone volume image includes a plurality of voxels and at least one joint. From the 3-D bone volume image, a 3-D distance map image of the at least one joint is automatically computed. The method then computes one or more joint spacing parameters of the at least one joint from the 3-D distance map image. After the computation, the one or more joint spacing parameters can be displayed, stored, or transmitted. The 3-D bone region can be, for example, a knee, hand, wrist, or ankle.
One or more joint spacing parameters can be displayed in a time series. If drug therapy is employed, the display can be in a time series aligned with the drug therapy. Further, if displayed, the one or more joint spacing parameters can be displayed relative to a joint spacing parameter of an average/baseline/typical/standard patient. The average/baseline/typical/standard patient can be grouped, for example, by sex (male/female), age (child, teen, adult), and/or size (small, medium, large).
The method of the present embodiment can also include generating a 3-D mapping of the one or more computed joint space parameters and automatically labeling individual joints. The method can also include automatically connecting at least two components for labeling individual joints.
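The connected-component approach to labeling individual joints can be sketched as follows, assuming a scipy-style labeling utility; the synthetic 2-D mask stands in for segmented joint spaces:

```python
import numpy as np
from scipy import ndimage

# Each connected component of the non-bone space between segmented
# bones gets its own label, so joint spacing parameters can then be
# reported per individual joint. The mask is a synthetic stand-in.
gap_mask = np.zeros((4, 12), dtype=bool)
gap_mask[:, 2:4] = True    # first joint space
gap_mask[:, 8:10] = True   # second joint space
labels, n_joints = ndimage.label(gap_mask)
```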
A 3-D color-mapped surface rendering of joint space narrowing can be displayed on top of the bone surface. The method can further include 3-D interactive segmentation of bones and tracking of joint space narrowing change over time. The user interface may be configured to allow a user to select at least two bones for the automatic computing. The method can be configured to automatically identify a type of bone from the segmented 3-D bone region, and then individually display, store, or transmit the one or more joint spacing parameters for the identified type of bone. The method can be configured to allow a user to select a particular joint from the 3-D volume image for the computing of the joint space parameters.
Applicants have described an apparatus for characterizing a bone joint of a patient, the apparatus comprising: (a) a volume imaging apparatus; (b) a computer in signal communication with the volume imaging apparatus and configured with instructions for: (i) accessing 3-D volume image content that includes the bone joint; (ii) automatically segmenting the bone joint volume image content from background content to define at least a first bone surface and a second bone surface that is spaced apart from and faces the first bone surface; (iii) computing one or more distances between at least a first point on the first bone surface and one or more points on the second bone surface; and (iv) displaying at least the first and second bone surfaces, wherein the display appearance is conditioned by the one or more computed distances; and (c) a display for displaying data relating to the one or more computed distances. With such an apparatus, the volume imaging apparatus can be taken from the group consisting of a CT (computed tomography), a CBCT (cone beam computed tomography), and an MRI (magnetic resonance imaging) system.
The method of the present disclosure can also provide a computer storage product having at least one computer storage medium having instructions stored therein causing one or more computers to perform the described calculations and display features.
Consistent with one embodiment, the present invention utilizes a computer program with stored instructions that control system functions for image acquisition and image data processing for image data that is stored and accessed from an electronic memory. As can be appreciated by those skilled in the image processing arts, a computer program of an embodiment of the present invention can be utilized by a suitable, general-purpose computer system, such as a personal computer or workstation that acts as an image processor, when provided with a suitable software program so that the processor operates to acquire, process, transmit, store, and display data as described herein. Many other types of computer systems architectures can be used to execute the computer program of the present invention, including an arrangement of networked processors, for example.
The computer program for performing the method of the present invention may be stored in a computer readable storage medium. This medium may comprise, for example: magnetic storage media such as a magnetic disk (such as a hard drive or removable device) or magnetic tape; optical storage media such as an optical disc, optical tape, or machine readable optical encoding; solid state electronic storage devices such as random access memory (RAM) or read only memory (ROM); or any other physical device or medium employed to store a computer program. The computer program for performing the method of the present invention may also be stored on a computer readable storage medium that is connected to the image processor by way of the internet or other network or communication medium. Those skilled in the image data processing arts will further readily recognize that the equivalent of such a computer program product may also be constructed in hardware.
It is noted that the term “memory”, equivalent to “computer-accessible memory” in the context of the present disclosure, can refer to any type of temporary or more enduring data storage workspace used for storing and operating upon image data and accessible to a computer system, including a database. The memory could be non-volatile, using, for example, a long-term storage medium such as magnetic or optical storage. Alternately, the memory could be of a more volatile nature, using an electronic circuit, such as random-access memory (RAM) that is used as a temporary buffer or workspace by a microprocessor or other control logic processor device. Display data, for example, is typically stored in a temporary storage buffer that is directly associated with a display device and is periodically refreshed as needed in order to provide displayed data. This temporary storage buffer can also be considered to be a memory, as the term is used in the present disclosure. Memory is also used as the data workspace for executing and storing intermediate and final results of calculations and other processing. Computer-accessible memory can be volatile, non-volatile, or a hybrid combination of volatile and non-volatile types.
It is understood that the computer program product of the present invention may make use of various image manipulation algorithms and processes that are well known. It will be further understood that the computer program product embodiment of the present invention may embody algorithms and processes not specifically shown or described herein that are useful for implementation. Such algorithms and processes may include conventional utilities that are within the ordinary skill of the image processing arts. Additional aspects of such algorithms and systems, and hardware and/or software for producing and otherwise processing the images or co-operating with the computer program product of the present invention, are not specifically shown or described herein and may be selected from such algorithms, systems, hardware, components and elements known in the art.
The invention has been described in detail, and may have been described with particular reference to a suitable or presently preferred embodiment, but it will be understood that variations and modifications can be effected within the spirit and scope of the invention. The presently disclosed embodiments are therefore considered in all respects to be illustrative and not restrictive. The scope of the invention is indicated by the appended claims, and all changes that come within the meaning and range of equivalents thereof are intended to be embraced therein.
This application claims the benefit of U.S. Provisional application U.S. Ser. No. 62/093,119, provisionally filed on Dec. 17, 2014, entitled “QUANTITATIVE METHOD FOR 3-D JOINT SPACE ANALYSIS VISUALIZATION AND MONITORING”, in the names of Andre Souza et al., incorporated herein by reference in its entirety.