Systems and methods for matching, naming, and displaying medical images

Information

  • Patent Grant
  • 7787672
  • Patent Number
    7,787,672
  • Date Filed
    Thursday, November 3, 2005
  • Date Issued
    Tuesday, August 31, 2010
Abstract
A method of matching medical images according to user-defined matching rules. In one embodiment, the matched medical images are displayed according to user-defined display rules such that the matched medical images may be visually compared in a manner that is suitable to the viewer's viewing preferences.
Description
BACKGROUND OF THE INVENTION

1. Field of the Invention


This invention relates to matching, naming, and displaying medical images based upon user-defined rules.


2. Description of the Related Art


Medical imaging is increasingly moving into the digital realm. This includes imaging techniques that were traditionally analog, such as mammography, x-ray imaging, angiography, endoscopy, and pathology, where information can now be acquired directly using digital sensors, or by digitizing information that was acquired in analog form. In addition, many imaging modalities are inherently digital, such as MRI, CT, nuclear medicine, and ultrasound. Increasingly, these digital images are viewed, manipulated, and interpreted using computers and related computer equipment. When medical images are created, they are typically provided an identifier, label, or name. This name can be either automatically generated or provided by a technician. One problem with organizing medical images from different sources is that these identifiers do not necessarily provide a good description that is understandable to a subsequent viewer of the image. Furthermore, as the number of images increases, there is a need for improved methods of matching related medical images together for subsequent viewing and analysis.


SUMMARY OF THE INVENTION

One embodiment comprises a method of displaying medical data. The method comprises receiving a plurality of medical images of a first medical examination and receiving at least one user-defined matching rule, at least one of the user-defined matching rules identifying selection criteria for the medical images. The method also comprises selecting medical images that satisfy the selection criteria of the user-defined rules, thereby matching medical images according to user-specific rules, and receiving at least one user-defined display rule, at least one of the user-defined display rules identifying a display preference with respect to the selected medical images. The method also comprises displaying the selected medical images according to the identified display preferences, thereby allowing matched medical images to be visually compared and displayed in a manner that is suitable to the user's preferences.


Another embodiment comprises a method of displaying medical data. The method comprises receiving a plurality of medical images of a first medical examination and receiving a plurality of medical images of a second medical examination. The method also comprises receiving at least one user-defined matching rule, at least one of the user-defined matching rules identifying selection criteria for matching the medical images of the first and second medical examinations. The method also comprises selecting medical images that satisfy the selection criteria of the user-defined rules, thereby matching medical images of the first medical examination with medical images of the second medical examination according to user-specific rules. The method also comprises receiving a plurality of user-defined display rules, at least one of the user-defined display rules identifying a display preference with respect to the selected medical images. The method also comprises displaying the selected medical images according to the identified display preferences, thereby allowing matched medical images to be visually compared and displayed in a manner that is suitable to the user's preferences.


Another embodiment comprises a system for displaying medical data. The system comprises an electronic device configured to receive a plurality of medical images of a first medical examination. The electronic device is configured to receive a plurality of user-defined matching rules, at least one of the user-defined matching rules identifying selection criteria for the medical images. The electronic device is further configured to select medical images that satisfy the selection criteria of the user-defined rules, thereby matching medical images according to user-specific rules. The electronic device is further configured to receive at least one user-defined display rule, at least one of the user-defined display rules identifying a display preference with respect to the selected medical images. The electronic device is further configured to display the selected medical images according to the identified display preferences, thereby allowing matched medical images to be visually compared and displayed in a manner that is suitable to the user's preferences.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of an exemplary computing system in communication with a network and various networked devices.



FIG. 2 is a flowchart illustrating an exemplary process of matching and displaying medical images.





DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION

Embodiments of the invention will now be described with reference to the accompanying figures, wherein like numerals refer to like elements throughout. The terminology used in the description presented herein is not intended to be interpreted in any limited or restrictive manner, simply because it is being utilized in conjunction with a detailed description of certain specific embodiments of the invention. Furthermore, embodiments of the invention may include several novel features, no single one of which is solely responsible for its desirable attributes or which is essential to practicing the inventions herein described.



FIG. 1 is a block diagram of an exemplary computing system 100 in communication with a network 160 and various network devices. The computing system 100 may be used to implement certain systems and methods described herein. The functionality provided for in the components and modules of computing system 100 may be combined into fewer components and modules or further separated into additional components and modules.


The computing system 100 includes, for example, a personal computer that is IBM, Macintosh, or Linux/Unix compatible. In one embodiment, the exemplary computing system 100 includes a central processing unit (“CPU”) 105, which may include a conventional microprocessor, and an application module 145 that comprises one or more applications that may be executed by the CPU 105. The application module 145 may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.


The computing system 100 further includes a memory 130, such as random access memory (“RAM”) for temporary storage of information and a read only memory (“ROM”) for permanent storage of information, and a mass storage device 120, such as a hard drive, diskette, or optical media storage device. Typically, the modules of the computing system 100 are connected to the computer using a standards-based bus system. In different embodiments of the present invention, the standards-based bus system could be Peripheral Component Interconnect (PCI), Microchannel, SCSI, Industry Standard Architecture (ISA), or Extended ISA (EISA) architectures, for example.


The computing system 100 is generally controlled and coordinated by operating system software, such as the Windows 95, 98, NT, 2000, XP or other compatible operating systems. In Macintosh systems, the operating system may be any available operating system, such as MAC OS X. In other embodiments, the computing system 100 may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide a user interface, such as a graphical user interface (“GUI”), among other things.


The exemplary computing system 100 includes one or more commonly available input/output (I/O) devices and interfaces 110, such as a keyboard, mouse, touchpad, and printer. In one embodiment, the I/O devices and interfaces 110 include one or more display devices, such as a monitor, that allow the visual presentation of data to a user. More particularly, display devices provide for the presentation of GUIs, application software data, and multimedia presentations, for example. In one embodiment, a GUI includes one or more display panes in which medical images may be displayed. According to the systems and methods described below, medical images may be stored on the computing system 100 or another device that is local or remote, displayed on a display device, and manipulated by the application module 145. The computing system 100 may also include one or more multimedia devices 140, such as speakers, video cards, graphics accelerators, and microphones, for example.


In the embodiment of FIG. 1, the I/O devices and interfaces 110 provide a communication interface to various external devices. In the embodiment of FIG. 1, the computing system 100 is coupled to a network 160, such as a LAN, WAN, or the Internet, for example, via a communication link 115. The network 160 may be coupled to various computing devices and/or other electronic devices. In the exemplary embodiment of FIG. 1, the network 160 is coupled to imaging devices 170, an image server 180, and a medical facility 190. In addition to the devices that are illustrated in FIG. 1, the network 160 may communicate with other computing, imaging, and storage devices.


The imaging devices 170 may be any type of device capable of acquiring medical images, such as MRI, x-ray, mammography, or CT scan systems. The image server 180 includes a data store 182 that is configured to store images and data associated with images. In one embodiment, the imaging devices 170 communicate with the image server 180 via the network 160, and image information is transmitted to the image server 180 and stored in the data store 182. In one embodiment, the image data is stored in Digital Imaging and Communications in Medicine (“DICOM”) format. The complete DICOM specifications may be found on the National Electrical Manufacturers Association website at <medical.nema.org>. Also, NEMA PS 3—Digital Imaging and Communications in Medicine, 2004 ed., Global Engineering Documents, Englewood, Colo., 2004, provides an overview of the DICOM standard. Each of the above-cited references is hereby incorporated by reference in its entirety. In one embodiment, the data store 182 also stores the user-defined display rules associated with one or more of the images stored on the data store 182. As discussed in further detail below, the user-defined display rules may vary depending on the type of image, area imaged, clinical indication, source of image, display device, user, or other factors. Accordingly, any type of user-defined display rule is expressly contemplated for use in conjunction with the systems and methods described herein.
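
Because the matching, naming, and display rules discussed below operate on attributes carried in each image's header, a brief illustration may help. The following sketch is not part of the patent; it assumes the open-source pydicom package and reduces a few standard DICOM header attributes to a plain dictionary. The file path and the particular fields chosen are hypothetical.

```python
# Illustrative only: collect a few standard DICOM header attributes that
# user-defined rules might reference. Assumes the open-source pydicom package.
import pydicom

def read_header_fields(path):
    # Read the header only; pixel data is not needed for rule evaluation.
    ds = pydicom.dcmread(path, stop_before_pixels=True)
    return {
        "modality": getattr(ds, "Modality", None),            # e.g., "MR", "CT"
        "body_part": getattr(ds, "BodyPartExamined", None),   # e.g., "BRAIN"
        "study_date": getattr(ds, "StudyDate", None),         # "YYYYMMDD"
        "series_description": getattr(ds, "SeriesDescription", None),
        "slice_thickness": getattr(ds, "SliceThickness", None),
        "patient_name": str(getattr(ds, "PatientName", "")),
    }

# Example (hypothetical path):
# fields = read_header_fields("exams/brain_mri/img001.dcm")
```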


The exemplary image server 180 is configured to store images from multiple sources and in multiple formats. For example, the image server 180 may be configured to receive medical images in the DICOM format from multiple sources, store these images in the data store 182, and selectively transmit medical images to requesting computing devices.


The medical facility 190 may be a hospital, clinic, doctor's office, or any other medical facility. The medical facility 190 may include one or more imaging devices and may share medical images with the image server 180 or other authorized computing devices. In one embodiment, multiple computing systems, such as the computing system 100 may be housed at a medical facility, such as medical facility 190.


Definition of Terms


Below is a definition of certain terms used herein.


“Modality” is defined as a medical imaging device (a patient who undergoes an MRI is said to have been examined or scanned with the MRI modality).


“Medical image” is defined to include an image of an organism. It may include but is not limited to a radiograph, computed tomography (CT), magnetic resonance imaging (MRI), Ultrasound (US), mammogram, positron emission tomography scan (PET), nuclear scan (NM), pathology, endoscopy, ophthalmology, or many other types of medical images. While this description is directed to viewing and tracking of medical images, the methods and systems described herein may also be used in conjunction with non-medical images, such as, images of circuit boards, airplane wings, and satellite images, for example.


“Patient” refers to an individual who undergoes a medical imaging examination.


“Viewing” is defined to include the process of visually observing one or more medical images associated with exams.


“Viewer” is defined as any person who views a medical image.


“Reading” is defined to include the process of visually observing one or more medical images for the purpose of creating a professional medical report, also called an interpretation. When reading is complete, an exam may be labeled “read,” indicating that the medical professional has completed observation of the one or more medical images for purposes of creating a medical report.


“Reader” is defined to include one who is authorized to perform the reading process.


“User” is defined to include any person that is a viewer and/or a reader.


“Display rules” are defined to include methods of display of an image or exam. For example, an image or exam may be displayed with a certain pixel window level or width (similar to brightness and contrast), in color, based on a certain color map, opacity map, or other display parameters.


“User-defined display rules” refers to rules that a user can establish and store in a database that establish criteria for image display that is considered adequate. For example, a user-defined display rule might store a rule that triggers certain warnings or displays if all pixels in a medical image have not been displayed or, alternatively, if at least a predetermined portion of the pixels have not been displayed with a certain display method (such as image window, level, brightness, contrast, opacity, color look-up table, or other parameters). User-defined display rules may also refer to other image processing functions, such as edge enhancement and automated image analysis functions, e.g., computer-aided detection (CAD) techniques.
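
As a hedged illustration of the kind of check such a rule might encode (this is not taken from the patent), the sketch below flags an image when less than a required fraction of its pixel values fall within the window/level range actually applied during display; the function name, parameters, and threshold are invented for the example.

```python
# Illustrative sketch: warn if too few pixels were presented within the
# display window actually applied. Names and thresholds are hypothetical.
import numpy as np

def check_window_coverage(pixels, window_center, window_width, min_fraction=1.0):
    """Return (ok, fraction), where ok is True if enough pixels fell in-window."""
    low = window_center - window_width / 2.0
    high = window_center + window_width / 2.0
    inside = np.logical_and(pixels >= low, pixels <= high)
    fraction = float(np.count_nonzero(inside)) / pixels.size
    return fraction >= min_fraction, fraction

# ok, frac = check_window_coverage(image_array, window_center=40, window_width=400)
# if not ok: trigger the warning called for by the user-defined display rule
```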



FIG. 2 is a high-level flowchart describing an exemplary method that may be performed by the computing system 100 (FIG. 1). Depending on the embodiment, additional steps may be added, others removed, and the ordering of the steps rearranged.


Starting at step 200, matching rules are provided with respect to medical images that are accessible by the computing system 100. The medical images can be accessible via the image server 180, be local to the computing system 100, or be elsewhere accessible via the network 160. The matching rules establish criteria for matching related medical images. In one embodiment, the matching rules are defined for a particular individual or machine. In another embodiment, the matching rules are defined for a group or class of individuals. The rules may be provided by the users themselves and/or by a system administrator. The auto-matching rules may be established to select medical data based upon any of the following non-limiting criteria: modality (MRI, CT, X-ray, etc.); exam type (left knee X-ray, CT Chest, MRI Brain, etc.); archive status (has the exam been archived, archived and restored, or not yet archived); assigned physician (has the exam been assigned to a particular physician for interpretation); exam age (how long ago was the exam done); patient age; and any item in a DICOM header file, such as orientation, contrast use, thickness of slices, field of view, MRI tissue contrast weighting, and other items. With regard to some criteria, such as MRI tissue contrast weighting, the rules may analyze the MRI pulse sequence and the imaging parameters in order to determine the tissue contrast weighting and subcategorize images into weighting categories or weighting names.
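
A minimal sketch of how such selection criteria might be expressed follows, assuming each image's header has already been reduced to a dictionary of fields as in the earlier sketch; the rule structure and field names are illustrative, not the patent's data model.

```python
# Illustrative: a matching rule expressed as required field values or
# predicates over header fields. Field names are hypothetical.
def satisfies_rule(fields, rule):
    """Return True if every criterion in `rule` is met by `fields`."""
    for key, expected in rule.items():
        value = fields.get(key)
        if callable(expected):           # e.g., lambda days: days < 365 (exam age)
            if not expected(value):
                return False
        elif value != expected:
            return False
    return True

# Example rule: axial T2-weighted brain MRI images
brain_mr_t2 = {
    "modality": "MR",
    "body_part": "BRAIN",
    "orientation": "AXIAL",
    "weighting": "T2",
}
# matched = [img for img in images if satisfies_rule(img.header_fields, brain_mr_t2)]
```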


The matching rules can be used to match medical images in the same and/or different medical series. For example, assume the medical images relate to three series of six x-rays. The matching rules can be established such that like views amongst each of the different series are grouped together for subsequent viewing. The matching rules may be defined using simple or complex search expressions such as “AND” or “OR.” Complex filter criteria may be stored on the image server 180 and then used by local devices that access these records via the web.
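
The simple and complex expressions mentioned above could be pictured as boolean combinators over per-field predicates, as in the following sketch; the combinators and the chest-radiograph example are invented for illustration.

```python
# Illustrative AND/OR combinators over simple matching predicates.
def rule_and(*rules):
    return lambda fields: all(rule(fields) for rule in rules)

def rule_or(*rules):
    return lambda fields: any(rule(fields) for rule in rules)

# Example: a frontal chest radiograph is CR modality AND (PA view OR AP view).
is_cr = lambda f: f.get("modality") == "CR"
is_pa = lambda f: f.get("view") == "PA"
is_ap = lambda f: f.get("view") == "AP"
frontal_chest = rule_and(is_cr, rule_or(is_pa, is_ap))

# frontal_chest({"modality": "CR", "view": "PA"})  -> True
# frontal_chest({"modality": "CR", "view": "LAT"}) -> False
```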


Next, at a step 204, display rules are provided with respect to the medical images. In one embodiment, the display rules may be user-defined, allowing the user to determine the timing, positioning, and size of displayed matched images. For example, a user can define that matched medical images are all displayed concurrently on a display. Also, for example, a user can define that the most recent of the matched medical images is displayed on the left-hand portion of the display and the other matched medical images are displayed in sequence on the right-hand side of the display, the sequence advancing in response to user prompting. In one embodiment, the display rules include directives (timing, positioning, and size). As an example, directives can include the following for identifying location information: TOP_DISPLAY, BOTTOM_DISPLAY, RIGHT_DISPLAY, LEFT_DISPLAY, CENTER_DISPLAY. Furthermore, if the number of matched medical images is variable, the display rules can include instructions for identifying selected medical images based upon further rules, such as using the matching criteria listed above. In addition, the display rules may or may not define how many images or image series are displayed per monitor, a display grid (2×3, 5×4, etc.), or whether like images are displayed neighboring each other side by side horizontally or vertically. Furthermore, the display rules may also specify how different matched medical images from different series may be “interleaved” together for successive display. Using the computing system 100, a user may also manually interleave matched medical images, e.g., order the matched medical images for progressive display of each of the matched sets. The computing system 100 may also provide an interface to re-order images or image series to facilitate the matching display, and may even reorient images (flip, rotate) in order to best match the display. Using the display rules, the user can thus ensure that related medical images are readily comparable.
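
One hedged way to picture the location directives and the interleaving of matched series is sketched below; the constant names echo the directives listed above, while the layout and interleaving logic is invented for illustration.

```python
# Illustrative display-rule sketch: place matched images using location
# directives and interleave two matched series for successive display.
LEFT_DISPLAY, RIGHT_DISPLAY = "left", "right"   # echoes the directives above

def layout_matched(current_images, prior_images):
    """Most recent exam in the left pane; matched priors cycled on the right."""
    return {
        LEFT_DISPLAY: current_images[0] if current_images else None,
        RIGHT_DISPLAY: list(prior_images),   # advanced in response to user prompting
    }

def interleave(series_a, series_b):
    """Alternate images from two matched series for successive display."""
    out = []
    for a, b in zip(series_a, series_b):
        out.extend([a, b])
    return out
```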


In one embodiment, display rules may be set to display pre- and post-contrast axial T1-weighted images from a brain MRI from the same exam in adjacent panes on a monitor. Display rules may also be set to display axial T2-weighted MRI images from a prior spine MRI adjacent to axial T2-weighted images from the current exam. Display rules may also be set to display a PA projection from a chest radiograph from a prior exam adjacent to the same projection from the current exam.


Continuing to a step 206, naming rules are provided. The naming rules describe how the medical images can be provided a new “name,” label, or identifying description. In one embodiment, the naming rules are defined for a particular individual or machine. In another embodiment, the naming rules are defined for a group or class of individuals. The rules may be provided by the users themselves and/or by a system administrator. The naming rules can define sets of naming rules for different exam types or other classifications. The naming rules can also be defined to perform naming before or after the matching step 216. In one embodiment, the naming rules define how information in a header file, e.g., a DICOM file, that is associated with the medical image is described. The naming rules can define that the new name of the medical image is defined using meta or other data that is associated with the medical images. It is noted that the new name of the medical image need not be stored with the image, but such information can be stored by any device connected to the network.


Continuing to a step 208, in one embodiment, the medical images are received by the computing system 100. It is noted that in one embodiment, the medical images need not be stored locally by the computing system 100 but are merely made accessible to it via the network 160. Continuing to a step 212, the computing system 100 may optionally name the received medical images based upon the received user-specific naming rules. In one embodiment, the new “names” of the medical images may be used by the computing system 100 to facilitate matching (step 216) of related medical images. In one embodiment, the naming rules define how information in a header file, e.g., a DICOM file, is described. The naming rules can define that the “new” name of the medical image is defined using meta or other data that is associated with the medical image. For example, the naming rules can define that the new “name” of the medical image is based upon the name of the patient, the exam type, and the date of the medical image. Also for example, the naming rules can define categorization tables that define a particular name when certain conditions are met. The particular name can be predefined and/or based upon meta or other data that is associated with the medical images.
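
A minimal sketch of a naming rule of the kind described, assuming the dictionary of header fields used in the earlier sketches, appears below; both the name template and the categorization table are hypothetical examples rather than the patent's rules.

```python
# Illustrative naming rule: compose a new display name from header metadata
# and a categorization table. Field names and categories are hypothetical.
CATEGORY_TABLE = {
    ("MR", "BRAIN"): "Brain MRI",
    ("CT", "CHEST"): "CT Chest",
    ("CR", "CHEST"): "Chest Radiograph",
}

def name_image(fields):
    exam_type = CATEGORY_TABLE.get(
        (fields.get("modality"), fields.get("body_part")), "Exam")
    return "{patient} - {exam} - {date}".format(
        patient=fields.get("patient_name", "Unknown"),
        exam=exam_type,
        date=fields.get("study_date", ""))

# e.g., name_image(fields) might yield "DOE^JANE - Brain MRI - 20041104"
```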


Proceeding to a step 216, the computing system 100 matches medical images in the same or related series together as discussed above (step 200). The matched images may be collectively or individually provided a new name per the naming rules provided in step 206. Continuing to a step 220, the matched medical images are displayed. It is noted that the display rules may define user display preferences that are specific to particular types of medical images, e.g., the imaging modality, the body part, or whether one exam, two exams, or more medical images are being displayed. The display rules may be defined per individual user, site, or system. In one embodiment, the user can store the display rules in a database. In one embodiment, one set of display rules can apply to one modality and another set of display rules can apply to another modality. In addition, the display rules may include specific triggers or warnings that occur if the user-defined display rules are not satisfied.
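
As a final hedged sketch, per-user and per-modality display rule sets of the kind described could be resolved with a keyed lookup and fallbacks; the nested-dictionary store and the rule contents are invented for illustration and merely stand in for rules kept in a database such as the data store 182.

```python
# Illustrative lookup of display rules by user and modality, with fallbacks.
# The dictionary stands in for rules stored in a database such as data store 182.
DISPLAY_RULES = {
    ("dr_smith", "MR"): {"grid": (2, 2), "interleave": True},
    ("dr_smith", None): {"grid": (1, 2), "interleave": False},   # user default
    (None, None):       {"grid": (1, 1), "interleave": False},   # site default
}

def resolve_display_rules(user, modality):
    """Return the most specific rule set available for this user and modality."""
    for key in ((user, modality), (user, None), (None, None)):
        if key in DISPLAY_RULES:
            return DISPLAY_RULES[key]
    return {}

# resolve_display_rules("dr_smith", "CT") -> {"grid": (1, 2), "interleave": False}
```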


The foregoing description details certain embodiments of the invention. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the invention can be practiced in many ways. As is also stated above, it should be noted that the use of particular terminology when describing certain features or aspects of the invention should not be taken to imply that the terminology is being re-defined herein to be restricted to including any specific characteristics of the features or aspects of the invention with which that terminology is associated. The scope of the invention should therefore be construed in accordance with the appended claims and any equivalents thereof.

Claims
  • 1. A method of displaying medical data, the method comprising: receiving a plurality of medical images of a first medical examination; receiving a plurality of medical images of a second medical examination; receiving at least one user-defined matching rule, at least one of the user-defined matching rules identifying selection criteria for matching the medical images of the first and second medical examinations; selecting medical images that satisfy the selection criteria of the user-defined rules, thereby matching medical images of the first medical examination with medical images of the second medical examination according to the user-defined matching rules; receiving a plurality of user-defined display rules, at least one of the user-defined display rules identifying one or more display preferences with respect to the selected medical images; and displaying the selected medical images according to the one or more identified display preferences, thereby allowing matched medical images to be visually displayed in a manner that is suitable to the user's preferences, wherein at least some of the method is performed by a computing system comprising one or more computing devices.
  • 2. The method of claim 1, wherein the medical images of the first medical examination and of the second medical examination are each grouped in one or more image series.
  • 3. The method of claim 1, additionally comprising naming the matched medical images according to at least one user-defined naming rule.
  • 4. The method of claim 1, additionally comprising naming the medical images according to at least one user-defined naming rule prior to the selecting step.
  • 5. A system, comprising: an electronic device configured to receive a plurality of medical images of a first medical examination, the electronic device being further configured to receive a plurality of medical images of a second medical examination, the electronic device being further configured to receive a plurality of user-defined matching rules, at least one of the user-defined matching rules identifying selection criteria for matching the medical images of the first and second medical examinations, the electronic device being further configured to select medical images that satisfy the selection criteria of the user-defined rules, thereby matching medical images of the first medical examination with medical images of the second medical examination according to the user-defined matching rules, the electronic device being further configured to receive a plurality of user-defined display rules, at least one of the user-defined display rules identifying a display preference with respect to the selected medical images, and the electronic device being further configured to display the selected medical images according to the identified display preference, thereby allowing matched medical images to be visually displayed in a manner that is suitable to the user's preferences.
  • 6. The system of claim 5, wherein the medical images of the first medical examination and of the second medical examination are each grouped in one or more image series.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Application Ser. No. 60/625,690, filed on Nov. 4, 2004, which is hereby expressly incorporated by reference in its entirety.

Related Publications (1)
Number Date Country
20060106642 A1 May 2006 US
Provisional Applications (1)
Number Date Country
60625690 Nov 2004 US