Systems and methods for viewing medical 3D imaging volumes

Information

  • Patent Grant
  • Patent Number
    10,614,615
  • Date Filed
    Wednesday, October 12, 2016
  • Date Issued
    Tuesday, April 7, 2020
Abstract
A method of automatically tracking the portions of a 3D medical imaging volume, such as the voxels, that have already been displayed according to user-defined display parameters, notating those portions, and providing the user with information indicating what portions of the imaging volume have been displayed at full resolution.
Description
BACKGROUND OF THE INVENTION

Field of the Invention


This invention relates to management and viewing of medical images and, more particularly, to systems and methods of tracking which portions of three dimensional imaging volumes have been displayed using predetermined display parameters.


Description of the Related Art


Medical imaging is increasingly moving into the digital realm. This includes imaging techniques that were traditionally analog, such as mammography, x-ray imaging, angiography, endoscopy, and pathology, where information can now be acquired directly using digital sensors, or by digitizing information that was acquired in analog form. In addition, many imaging modalities are inherently digital, such as MRI, CT, nuclear medicine, and ultrasound. Increasingly these digital images are viewed, manipulated, and interpreted using computers and related computer equipment. Accordingly, there is a need for improved systems and methods of viewing and manipulating these digital images.


For projection modalities like mammography and radiography, information may be represented in digital images in the form of a two dimensional array of pixels (picture elements). Other techniques are capable of creating cross sectional views of the body, such as magnetic resonance imaging (MRI), computed tomography (CT), and positron emission tomography (PET). With these techniques, the information in an image represents a two-dimensional slice of the body, with each pixel in the image representing information from a small volume in space, a voxel (volume element). In a typical imaging exam (“scan”) using these modalities, a series of parallel images is acquired through the body. The information in this series of images therefore represents a 3D volume of space, or imaging volume. In addition, a three-dimensional imaging volume may be created from a set of nonparallel images as well. It is also possible to perform a three-dimensional acquisition directly with some modalities. Thus, the image volume may be generated by a device that creates a 3D data set.


While a radiologist or physician interpreting such an imaging volume might choose to view the acquired images, the plane in which the images were acquired may not be optimal for accurate and efficient analysis and interpretation of the scan. For example, imaging might be performed in the axial plane with the patient asymmetrically positioned in the scanner. The resulting images would therefore not be in the axial plane with respect to the patient's anatomy. In that case, a new set of axial images would be desired that were in the axial plane with respect to the patient's anatomy. Alternatively, structures of interest might be best evaluated in other planes, such as coronal, sagittal or oblique planes.


A reader may choose to view the information in planes that differ from the plane in which the images were originally acquired, as a supplement or substitute for viewing the original images. Those of skill in the art will recognize that, given the 3D imaging volume created from the original images, a new set of 2D images that slice the imaging volume along other axes may be created using multiplanar reformatting (MPR). The reformatting could be used to create a series of images in another axis prior to viewing, or the viewer might choose to interactively create new views in real time as part of his analysis of the information.


No matter how the reader chooses to view the imaging volume acquired by the scan, all the acquired relevant information in the imaging volume should be viewed. Currently available systems for viewing imaging volumes do not provide a mechanism for tracking which portions of the imaging volume, such as which voxels of the imaging volume, have been displayed on a display device. Thus, the viewer must independently determine which portions of an imaging volume have been viewed. As those of skill in the art will recognize, requiring the viewer to determine when all relevant portions of an imaging volume have been viewed introduces the possibility that portions of the imaging volume are not viewed and, thus, features and/or abnormalities expressed in the unviewed portions may not be detected. Systems and methods for tracking which portions of an imaging volume have been viewed, and for alerting the viewer accordingly, are therefore desired.


SUMMARY OF THE INVENTION

One embodiment comprises a computing system for viewing an imaging volume comprising a plurality of planar images that represent portions of an imaged object. Each of the planar images comprises a plurality of voxels. The system comprises a display device configured to display a predetermined number of pixels. The system also comprises an input interface configured to receive the imaging volume. The system also comprises an application module configured to initiate display of one or more navigation images in one or more navigation planes. The navigation images are displayed at a reduced resolution so that each navigation image may be viewed in its entirety on the display device. The application module is configured to initiate generation of a reformatted image comprising voxels that are along any plane determined by the user. In one embodiment, a portion of the navigation image corresponding to voxels that have been displayed at full resolution in any plane is visually distinguishable from the remaining portions of the navigation image.


Another embodiment comprises a system for viewing a three dimensional imaging volume. The system comprises: at least one three dimensional imaging volume; and a module configured to provide an interface to allow a user to selectively display portions of the imaging volume, wherein the module automatically determines whether at least a region of the three dimensional imaging volume has been displayed at a particular resolution.


Another embodiment comprises a system for viewing a three dimensional imaging volume. The system comprises: at least one three dimensional imaging volume; and means for selectively displaying portions of the imaging volume on a display, wherein the displaying means automatically determines whether at least a region of the three dimensional imaging volume has been displayed at a particular resolution.


Yet another embodiment comprises a method of viewing a 3D imaging volume on a display device coupled to a computing system. The display device is configured to concurrently display one or more navigation images and one of a plurality of reformatted images that comprise a plane of the imaging volume along any axis chosen by the viewer. The method comprises receiving an imaging volume at the computing system, wherein the imaging volume comprises a plurality of voxels arranged in a three dimensional array. The method also comprises displaying on the display device one or more navigation images comprising voxels in a first plane of the imaging volume. The method also comprises selecting a portion of the navigation image and generating a reformatted image comprising voxels of the imaging volume from a second plane of the imaging volume. In one embodiment, the first and second planes intersect at the selected portion of the navigation image. The method also comprises displaying on the display device the reformatted image and updating the navigation image to include a visual indication that the selected portion of the navigation image has been displayed on the display device.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of an exemplary computing system in communication with a network and various networked devices.



FIG. 2A is a three-dimensional depiction of an imaging volume comprising a plurality of voxels.



FIG. 2B illustrates a reformatted image generated from the imaging volume of FIG. 2A.



FIG. 3A is a CT image of a patient's lumbar spine in the axial plane taken from an imaging volume comprising a plurality of axial images.



FIG. 3B is a reformatted image in the sagittal plane that was generated from the imaging volume described with respect to FIG. 3A.



FIG. 4 is a flowchart illustrating a method of tracking which voxels of a medical imaging volume have been displayed according to user-defined display parameters.



FIG. 5 illustrates portions of an exemplary graphical user interface (GUI) that may be displayed on a display device, either concurrently or separately.



FIG. 6 illustrates a display device displaying a navigation image and a reformatted image representative of a cross-section of the imaging space at the location of a navigation line on the navigation image.





DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION

Embodiments of the invention will now be described with reference to the accompanying figures, wherein like numerals refer to like elements throughout. The terminology used in the description presented herein is not intended to be interpreted in any limited or restrictive manner, simply because it is being utilized in conjunction with a detailed description of certain specific embodiments of the invention. Furthermore, embodiments of the invention may include several novel features, no single one of which is solely responsible for its desirable attributes or which is essential to practicing the inventions herein described.



FIG. 1 is a block diagram of an exemplary computing system 100 in communication with a network 160 and various network devices. The computing system 100 may be used to implement certain systems and methods described herein. The functionality provided for in the components and modules of computing system 100 may be combined into fewer components and modules or further separated into additional components and modules.


The computing system 100 includes, for example, a personal computer that is IBM, Macintosh, or Linux/Unix compatible. In one embodiment, the exemplary computing system 100 includes a central processing unit (“CPU”) 105, which may include a conventional microprocessor, and an application module 145 that comprises one or more applications that may be executed by the CPU 105. The application module 145 may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.


The computing system 100 further includes a memory 130, such as random access memory (“RAM”) for temporary storage of information and a read only memory (“ROM”) for permanent storage of information, and a mass storage device 120, such as a hard drive, diskette, or optical media storage device. Typically, the modules of the computing system 100 are connected to the computer using a standards-based bus system. In different embodiments of the present invention, the standards-based bus system could be Peripheral Component Interconnect (PCI), Microchannel, SCSI, Industrial Standard Architecture (ISA), or Extended ISA (EISA) architectures, for example.


The computing system 100 is generally controlled and coordinated by operating system software, such as the Windows 95, 98, NT, 2000, XP or other compatible operating systems. In Macintosh systems, the operating system may be any available operating system, such as MAC OS X. In other embodiments, the computing system 100 may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide a user interface, such as a graphical user interface (“GUI”), among other things.


The exemplary computing system 100 includes one or more commonly available input/output (I/O) devices and interfaces 110, such as a keyboard, mouse, touchpad, and printer. In one embodiment, the I/O devices and interfaces 110 include one or more display devices, such as a monitor, that allow the visual presentation of data to a user. More particularly, display devices provide for the presentation of GUIs, application software data, and multimedia presentations, for example. In one embodiment, a GUI includes one or more display panes in which medical images may be displayed. According to the systems and methods described below, medical images may be stored on the computing system 100 or another device that is local or remote, displayed on a display device, and manipulated by the application module 145. The computing system 100 may also include one or more multimedia devices 140, such as speakers, video cards, graphics accelerators, and microphones, for example.


In the embodiment of FIG. 1, the I/O devices and interfaces 110 provide a communication interface to various external devices. In the embodiment of FIG. 1, the computing system 100 is coupled to a network 160, such as a LAN, WAN, or the Internet, for example, via a communication link 115. The network 160 may be coupled to various computing devices and/or other electronic devices. In the exemplary embodiment of FIG. 1, the network 160 is coupled to imaging devices 170, an image server 180, and a medical facility 190. In addition to the devices that are illustrated in FIG. 1, the network 160 may communicate with other computing, imaging, and storage devices.


The imaging devices 170 may be any type of device that is capable of acquiring medical images, such as an MRI, x-ray, mammography, or CT scan system. The image server 180 includes a data store 182 that is configured to store images and data associated with images. In one embodiment, the imaging devices 170 communicate with the image server 180 via the network 160 and image information is transmitted to the image server 180 and stored in the data store 182. In one embodiment, the image data is stored in Digital Imaging and Communications in Medicine (“DICOM”) format. The complete DICOM specifications may be found on the National Electrical Manufacturers Association Website at <medical.nema.org>. Also, NEMA PS 3—Digital Imaging and Communications in Medicine, 2004 ed., Global Engineering Documents, Englewood, Colo., 2004, provides an overview of the DICOM standard. Each of the above-cited references is hereby incorporated by reference in its entirety. In one embodiment, the data store 182 also stores the user-defined display parameters associated with one or more of the images stored on the data store 182. As discussed in further detail below, the user-defined display parameters may vary depending on the type of image, area imaged, clinical indication, source of image, display device, user, or other factors. Accordingly, any type of user-defined display parameter is expressly contemplated for use in conjunction with the systems and methods described herein.


The exemplary image server 180 is configured to store images from multiple sources and in multiple formats. For example, the image server 180 may be configured to receive medical images in the DICOM format from multiple sources, store these images in the data store 182, and selectively transmit medical images to requesting computing devices.


The medical facility 190 may be a hospital, clinic, doctor's office, or any other medical facility. The medical facility 190 may include one or more imaging devices and may share medical images with the image server 180 or other authorized computing devices. In one embodiment, multiple computing systems, such as the computing system 100, may be housed at a medical facility, such as the medical facility 190.


Definition of Terms

Below is a definition of certain terms used herein.


“Modality” is defined as a medical imaging device (a patient who undergoes an MRI is said to have been examined or scanned with the MRI modality).


“Medical image” is defined to include an image of an organism. It may include but is not limited to a radiograph, computed tomography (CT), magnetic resonance imaging (MRI), ultrasound (US), mammogram, positron emission tomography (PET) scan, nuclear scan (NM), pathology, endoscopy, ophthalmology, or many other types of medical images. While this description is directed to viewing and tracking of medical images, the methods and systems described herein may also be used in conjunction with non-medical images, such as images of circuit boards, airplane wings, geologic mapping, and satellite images, for example.


“Patient” refers to an individual who undergoes a medical imaging examination.


“3D Imaging Volume” or “Imaging Volume” refers to the information acquired by an imaging device (also referred to as a “scanner”) in the form of images that together form a 3D volume of digital information representative of the volume that was scanned, such as a portion of a patient.


“Viewing” is defined to include the process of visually observing one or more medical images associated with exams.


“Viewer” is defined as any person who views a medical image.


“Reading” is defined to include the process of visually observing one or more medical images for the purpose of creating a professional medical report, also called an interpretation. When reading is complete, an exam may be labeled “read,” indicating that the medical professional has completed observation of the one or more medical images for purposes of creating a medical report.


“Reader” is defined to include one who is authorized to perform the reading process.


“User” is defined to include any person that is a viewer and/or a reader.


“Display parameters” are defined to include methods of display of an image or exam. For example, an image or exam may be displayed with a certain pixel window level or width (similar to brightness and contrast), in color, based on a certain color map, opacity map, or other display parameters.


“Full voxel display” is defined to include display on a monitor or other display system of information from every voxel of a 3D imaging volume.


“Full Resolution” is defined to include the concurrent display of all voxels of a 3D imaging volume.


“Reduced Resolution” is defined to include display of less than all of the voxels of a 3D imaging volume.


“User-defined display parameter” refers to rules that a user can establish and store in a database and that set criteria for what is considered adequate image display. For example, a user-defined display parameter might store a rule that triggers certain warnings or displays if all voxels have not been displayed or, alternatively, if at least a predetermined portion of the voxels have not been displayed with a certain display method (such as image window, level, brightness, contrast, opacity, color look-up table, or other parameters). User-defined display parameters may also refer to other image processing functions, such as edge enhancement and automated image analysis functions, e.g., computer-aided detection (CAD) techniques.


As noted above, some medical imaging scanners, such as computed tomography (CT) and magnetic resonance imaging (MRI) scanners, are capable of collecting a series of adjacent 2D images through a region of interest that, when stacked up, represent a 3D imaging volume. In some cases, a three-dimensional volume is directly acquired. For example, each individual two-dimensional image might consist of 512×512 pixels (picture elements), with each pixel representing a 1 mm×1 mm×1 mm voxel (volume element) within the region scanned, although the dimensions of the voxel need not be equal. If 1,000 such contiguous images were acquired, these images would form an imaging volume consisting of 512×512×1,000 voxels representing a volume of space that is 51.2 cm×51.2 cm×100 cm in size.
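
The sizing arithmetic above can be made concrete with a short calculation. The following Python sketch is purely illustrative (the function name and values are assumptions, not part of the described system); it simply converts a voxel grid and per-voxel spacing into a physical extent:

    # Illustrative only: convert a voxel grid and per-voxel spacing (mm)
    # into the physical extent of the imaging volume in centimeters.
    def volume_extent_cm(shape, spacing_mm):
        """Return the physical size of a voxel grid in centimeters."""
        return tuple(n * s / 10.0 for n, s in zip(shape, spacing_mm))

    # 1,000 contiguous 512x512 images of 1 mm x 1 mm x 1 mm voxels:
    print(volume_extent_cm((512, 512, 1000), (1.0, 1.0, 1.0)))
    # -> (51.2, 51.2, 100.0), i.e., 51.2 cm x 51.2 cm x 100 cm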



FIG. 2A is a three-dimensional depiction of an imaging volume 320 comprising a plurality of voxels 322. In the embodiment of FIG. 2A, a series of contiguous axial MRI images 310A-310N of a brain are stacked to yield the 3D imaging volume 320. In certain embodiments, the orientation of the scanned images with respect to the patient is arbitrary, as images from any orientation can be stacked to form a 3D imaging volume, such as imaging volume 320. Once an imaging volume is created, 2D images can be created in any plane through the imaging volume using, for example, a technique known as multiplanar reformatting. In the embodiment of FIG. 2B, two-dimensional images in the coronal orientation 330 may be generated from the imaging volume 320, even though the original 2D images 310 were collected in the axial plane. Thus, coronal images, such as coronal image 340 may be generated for viewing. The coronal image 340 comprises voxels from each of the source images 310A-310N at the predetermined plane. In one embodiment, images may be reformatted from any arbitrary oblique plane of the imaging volume 320. This is often useful to physicians interpreting imaging volumes as particular features of anatomy and pathology may be best appreciated in a particular imaging plane.
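
As a hedged illustration of the stacking and multiplanar reformatting just described (a minimal NumPy sketch with random stand-in data rather than actual MRI images), a coronal or sagittal image is simply a fixed row or column read out across every axial slice:

    import numpy as np

    # Minimal MPR sketch: stack axial slices into a (z, y, x) volume,
    # then read out orthogonal planes. Data here are random stand-ins.
    axial_slices = [np.random.rand(512, 512) for _ in range(40)]
    volume = np.stack(axial_slices, axis=0)         # shape (z, y, x)

    coronal_image = volume[:, 256, :]    # one coronal plane, shape (z, x)
    sagittal_image = volume[:, :, 128]   # one sagittal plane, shape (z, y)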



FIG. 3A is a CT image 410 of a patient's lumbar spine in the axial plane taken from an imaging volume comprising a plurality of axial images. FIG. 3B is a reformatted image 420 from the imaging volume in the sagittal plane. As those of skill in the art will appreciate, alignment of the vertebral bodies may be best assessed in the sagittal plane. Accordingly, sagittal images, including image 420, may be created from the axial imaging volume using multiplanar reformatting. In the embodiment of FIG. 3A, the patient is rotated with respect to the scanner and, thus, creation of images that are in the true sagittal plane with respect to the patient's spine requires that the images be reformatted along an oblique axis 430.


As described above, the process of multiplanar reformatting allows generation of reformatted images from imaging volumes in planes other than the plane in which the original images were acquired. Therefore, a viewer of an imaging volume may choose to view the source images acquired from the imaging device and/or reformatted images, reformatted to be in a plane that differs from the plane of the source images. No matter whether the viewer is viewing source images or reformatted images, it may be important that the viewer examines the entire imaging volume so that important features of the exam are appreciated. In addition, it may be important that voxels of the imaging volume are displayed according to a predefined set of display parameters.



FIG. 4 is a flowchart illustrating a method of tracking which voxels of a medical imaging exam have been displayed according to user-defined display parameters. More particularly, the exemplary method of FIG. 4 may be used to track which voxels of an imaging volume have been displayed by the physician, regardless of the plane of viewing, and provide feedback to the physician indicating whether the exam has been completely viewed. As noted above, it may be important and/or required that a user view all of a medical imaging volume at full resolution, or according to other user-defined display parameters. The method described with reference to FIG. 4 is an example of one method of automatically tracking the portions of a 3D imaging volume, such as the voxels, that have already been displayed at full resolution, notating those portions, and providing the user with information indicating what portions of the imaging volume have been displayed at full resolution. This process is generally referred to as “voxel tracking.” In an advantageous embodiment, the user interface provides a real-time status of what portions of the imaging volume have been displayed at full resolution, or according to other user-defined display parameters.
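
One way to realize such voxel tracking is sketched below, under the assumption that a boolean mask shadows the imaging volume; the description does not prescribe any particular data structure, so the class and method names here are purely illustrative:

    import numpy as np

    # Sketch: a boolean mask the same shape as the volume records which
    # voxels have been displayed, regardless of the viewing plane.
    class VoxelTracker:
        def __init__(self, shape):
            self.viewed = np.zeros(shape, dtype=bool)   # (z, y, x) mask

        def mark_axial(self, z):
            self.viewed[z, :, :] = True    # an axial slice was displayed

        def mark_coronal(self, y):
            self.viewed[:, y, :] = True    # a coronal reformat was displayed

        def fraction_viewed(self):
            return self.viewed.mean()      # fraction of voxels displayed

    tracker = VoxelTracker((40, 512, 512))
    tracker.mark_axial(10)
    tracker.mark_coronal(256)
    print(f"{tracker.fraction_viewed():.2%} of voxels displayed")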


In one embodiment, the method described with respect to FIG. 4 is performed by a computing system 100, a medical facility 190, or an image server 180, for example. For ease of description, the method will be discussed below with reference to a computing system 100 performing the method. Depending on the embodiment, certain of the blocks described below may be removed, others may be added, and the sequence of the blocks may be altered.


In one embodiment, the user-defined display parameters specify that an entire medical imaging volume must be viewed at full resolution before a reader may mark the image as read. However, the user-defined display parameters may have different requirements, such as requiring that at least a defined portion of the voxels are displayed at full resolution and/or a defined portion of the voxels are viewed with a certain display method, for example. In another embodiment, the user-defined display parameters may specify that the imaging volume be viewed at a resolution that is less than full resolution. In other embodiments, the user-defined display parameters may specify additional display settings that must be satisfied in order to allow the reader to mark the imaging volume as read. For example, the display parameters may be set to require that every nth voxel, or a certain percentage of the total voxels of an imaging volume, is displayed. Thus, various user-defined display parameters may be established on a user, modality, or facility basis, for example. In one embodiment, such as when viewing CT images, the display parameters may specify that the CT images are viewed using a specified series of display parameters, such as lung windows, bone windows, and/or other types of windows, for example. In this embodiment, if the user forgets to view the images separately using all the required display parameters, the CT images may be misinterpreted.
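
Such requirements reduce to simple predicates over the tracked mask. The sketch below is hypothetical (the function names and thresholds are assumptions, not the patent's rules) and shows a percentage rule and an every-nth-voxel rule:

    import numpy as np

    def fraction_rule_satisfied(viewed, min_fraction=1.0):
        """True when at least min_fraction of the voxels were displayed."""
        return viewed.mean() >= min_fraction

    def every_nth_rule_satisfied(viewed, n):
        """True when every nth voxel (a regular subsample) was displayed."""
        return bool(viewed.flat[::n].all())

    # e.g. fraction_rule_satisfied(tracker.viewed, 0.95) for a 95% rule,
    # using the "tracker" object from the earlier tracking sketch.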


For ease of description, the following description refers to user-defined display parameters specifying that every voxel of the imaging volume is displayed at full resolution before it may be marked as read. However, the methods described herein are not limited to these display parameters and application of these methods using other user-defined display parameters is expressly contemplated. Any reference to tracking voxels at full resolution should be interpreted to cover similar systems and methods for monitoring and/or tracking of any other user-defined display parameter or combination of display parameters.


In one embodiment, the computing system 100 is configured to determine a portion of the imaging volume on which voxel tracking is to be applied. Many imaging volumes comprise regions outside the area of interest that are not important for a user, such as a doctor, to view and mark as read. For example, the imaging volume that results from a CT scan of the brain will include air outside of the head that is irrelevant to the analysis by the user. Accordingly, viewing of these irrelevant portions of the imaging volume according to the user-defined display parameters is not necessary. In one embodiment, the computing system 100 analyzes the medical image and determines those regions that are irrelevant to the user's analysis. These irrelevant regions are then excluded from the user-defined display parameters and a viewer may mark an image as read without viewing the irrelevant areas at full resolution, for example. In another embodiment, the user may define the irrelevant regions of an imaging volume prior to viewing portions of the image at full resolution. For example, the user may use the keyboard, mouse, or other input device, to select regions surrounding the volume of interest that do not require viewing according to the user-defined display parameters. In yet another embodiment, the user may determine that the relevant regions of an imaging volume have been viewed according to the display parameters, without the need to pre-select portions that should be viewed according to the display parameters. By providing for automatic and/or manual selection of irrelevant portions of a medical imaging volume, the viewer is not required to display those irrelevant portions of the medical imaging volume according to the user-defined display parameters, such as full resolution.
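
A hedged sketch of the automatic variant follows. For CT, air lies far below soft tissue on the Hounsfield scale, so a simple threshold can flag voxels outside the patient; the threshold value and function names are assumptions, since the description does not specify how irrelevant regions are found:

    import numpy as np

    def relevant_mask(ct_volume_hu, air_threshold=-500):
        """Boolean mask of voxels that must be viewed (non-air)."""
        return ct_volume_hu > air_threshold

    def volume_fully_viewed(viewed, relevant):
        """True once every relevant voxel has been displayed."""
        return bool(np.all(viewed | ~relevant))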


Returning to FIG. 4, in a block 210, a series of source images that make up an imaging volume are received from an image source. The image source may comprise one or more of the imaging devices 170, the image server 180, the medical facility 190, or any other device that is capable of transmitting medical images. The medical images may be received via the network 160, or by other means, such as transferred on a floppy disk, CD-ROM, or USB storage device. In certain embodiments, the received source images each comprise more pixels than the display device and, thus, an entire source image may not be concurrently displayed at full resolution. Commonly owned U.S. patent application Ser. No. 11/179,384, titled “Systems And Methods For Viewing Medical Images,” which is hereby incorporated by reference in its entirety, describes exemplary systems and methods of tracking portions of 2D images that have been displayed according to user-defined display parameters. Images of an imaging volume, either source images or reformatted images, may be tracked using the systems and methods described in that application so that relevant portions of each image are displayed according to the user-defined display parameters.


Continuing to a block 220, a plane through the imaging volume is selected for display on the display device. As discussed above, the source images of the imaging volume may not be of a plane that the viewer wishes to view. For example, as described above with respect to FIG. 3, certain features or abnormalities in an imaging volume may be more easily detected in certain planes, which may not be the plane of the source images. Accordingly, in certain embodiments, the user may select a plane of view in which reformatted images will be generated and displayed. In one embodiment, reformatted images are generated using the process of multiplanar reformatting. However, other methods of reformatting portions of a 3D imaging space in order to form images from any plane of the imaging space may also be used in conjunction with the systems and methods described herein.


Moving to a block 230, at least a portion of one of the images, e.g., source or reformatted images, is displayed on the display device. As noted above, in many embodiments, the source and reformatted images comprise more pixels than a display device is capable of displaying at full resolution. Accordingly, only a portion of the images may be concurrently displayed on the display device.


Moving to a block 240, the user can adjust the display characteristics of the image currently being viewed, for example by adjusting the window level and window width. In one embodiment, adjustments made to the display characteristics of the currently displayed image are made to other images of the imaging space. Accordingly, if a contrast, brightness, or color level, for example, of an image is adjusted, the remaining images in the imaging space may also be adjusted.
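
Window level and window width define a standard linear mapping from raw intensities to display values. The sketch below shows the usual form of that computation; the preset values in the comments are approximate and purely illustrative, not prescribed by the description:

    import numpy as np

    def apply_window(image, level, width):
        """Map raw intensities to 0-255 display values for a window."""
        lo, hi = level - width / 2.0, level + width / 2.0
        scaled = (np.clip(image, lo, hi) - lo) / (hi - lo)
        return (scaled * 255).astype(np.uint8)

    # Approximate CT presets, for illustration only:
    # lung = apply_window(ct_slice, level=-600, width=1500)
    # bone = apply_window(ct_slice, level=400, width=1800)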


Continuing to a block 250, an indication of the portion of the imaging volume that is displayed at full resolution is recorded. For example, the voxels being displayed in the imaging plane may be recorded, such as by storing voxel information in a memory 130 of the computing system 100. Alternatively, the information regarding displayed voxels may be stored on a local mass storage device 120 or central server, such as the image server 180.


Moving to a block 260, an indication of which regions of the imaging volume have been displayed at full resolution is displayed. For example, the user might interactively create a series of reformatted coronal images from a set of axial images that make up the imaging volume. One or more display characteristics of a reference axial image that the user has chosen could be altered to reflect voxels within that image that have been displayed at full resolution in one or more imaging planes, as illustrated in FIG. 5. Multiple reference images might be displayed, for example in axial, coronal and sagittal orientations, each displaying an indication of which voxels have been displayed at full resolution along the corresponding planes. Alternatively, a three-dimensional cube might be displayed, with regions on each of three displayed faces that illustrate what voxels have been displayed at full resolution along the corresponding plane. For example, a user interface may include visual indications as to which portions of an imaging volume have not been displayed, or which voxels have not been displayed using a user-defined display parameter, such as a specified window or level setting. In one embodiment, the computing system 100 automatically displays a message indicating which regions of the imaging volume have not been displayed with full pixel display and/or meeting user-defined display parameter criteria. In another embodiment, the computing system 100 automatically directs the user to any regions of the imaging volume that have not been displayed at full resolution and/or meeting user-defined display parameter criteria.
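
One concrete, purely illustrative way to alter a reference image's display characteristics as described above is to invert contrast wherever the corresponding voxels have been displayed; collapsing the 3D mask onto the navigation plane with a logical AND is an assumption made for this sketch:

    import numpy as np

    def annotate_navigation(nav_image_u8, viewed_mask_3d, axis=0):
        """Invert contrast where all voxels along `axis` were displayed."""
        viewed_2d = viewed_mask_3d.all(axis=axis)    # collapse onto plane
        out = nav_image_u8.copy()
        out[viewed_2d] = 255 - out[viewed_2d]        # contrast inversion
        return out

    # e.g. annotated = annotate_navigation(axial_nav, tracker.viewed)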


In one embodiment, the adjustment of a display characteristic to indicate regions that have or have not been displayed at full resolution comprises changing a color of the image portion. In another embodiment, other indicators, such as lines surrounding those image portions already displayed at full resolution, may be used to discriminate between portions of the imaging volume that have been displayed at full resolution and portions that have not been displayed at full resolution. Based on the distinguishing display characteristic, the user may select for display a portion of the imaging volume that has not yet been displayed at full resolution. In one embodiment, coloring of the viewed voxels may be toggled on and off by the user. In another embodiment, a text message, icon, or other indication may be displayed at the bottom of the display, for example, indicating that the imaging volume has been viewed according to the user-defined display parameters. In yet another embodiment, the outside margins of the viewing pane may change color or the system could beep or provide some other audible feedback when the imaging volume has been displayed according to the user-defined display parameters.


In a decision block 270, the computing system 100 determines if another plane of the imaging volume has been selected for display at full resolution. In one embodiment, the user is presented with a full or reduced resolution representation of certain images within the imaging volume, for example axial, coronal and sagittal images through the middle of the imaging volume. Selection of a plane to be displayed using multiplanar reformatting may be accomplished by pressing certain keys on a keyboard, such as the arrow keys, for example. In another embodiment, the user may change the selected portion for viewing by moving a mouse, or other input device. In another embodiment, the computing system 100 may be configured to periodically update the display with a plane of the imaging volume that has not yet been displayed at full resolution, or update the display in response to an input from the user.


If, in the decision block 270, the computing device 100 determines that instructions have been received to display another portion of the imaging volume on the display device, the method returns to block 220 where the image plane to be displayed is selected. The method then repeats the steps 220, 230, 240, 250, 260, and 270, allowing the user to view various portions of the imaging space by viewing reformatted images from various planes, and tracking which voxels of the imaging space have been viewed by the user. Because information indicating which voxels of the imaging space have been displayed according to the user-defined display parameters is stored by the computing system, the user can advantageously track the portions of the imaging space that have been viewed.


Referring again to the decision block 270, if the computing device 100 determines that instructions to display another portion of the imaging volume have not been received, the method continues to a decision block 280, wherein the computing device 100 determines whether all of the voxels in the imaging volume have been displayed at full resolution. If it is determined that not all of the voxels in the imaging volume have been displayed at full resolution, the method continues to a block 290, wherein an indication, such as an audible or visual alert, is provided to the user that not all of the voxels in the imaging volume have been viewed at full resolution. The method then returns to block 220. In one embodiment, the computing system 100 automatically selects an image, either a source image or reformatted image, for display from the portions of the imaging space that have not yet been displayed at full resolution, and/or that have not been displayed such that user-defined display parameters have been met.
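
The decision logic of blocks 270 through 290 might be summarized as follows. This is a sketch assuming the boolean mask from the earlier tracking sketch; the alert is reduced to a printed message here, whereas the description contemplates audible or visual alerts:

    import numpy as np

    def next_unviewed_axial(viewed):
        """Index of the first axial slice with undisplayed voxels, or None."""
        incomplete = ~viewed.all(axis=(1, 2))    # per-slice completion
        idx = np.flatnonzero(incomplete)
        return int(idx[0]) if idx.size else None

    def check_complete(viewed):
        z = next_unviewed_axial(viewed)
        if z is None:
            print("Entire imaging volume displayed at full resolution.")
            return True                           # block 285: may mark read
        print(f"Alert: axial slice {z} not fully displayed.")  # block 290
        return False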


If, however, in the decision block 280, the computing device 100 determines that the entire image has been displayed at full resolution, the method continues to a block 285, wherein an indication, such as an audible or visual alert, is provided to the user that the entire imaging volume has been displayed at full resolution. In certain embodiments, the user is only able to mark the imaging volume as read when the method reaches block 285.


As noted above, the flowchart of FIG. 4 illustrates an exemplary process of tracking voxels viewed by a user according to exemplary user-defined display parameters. In particular, the user-defined display parameters in the example of FIG. 4 specify that the entire imaging volume is viewed at full resolution. However, in other embodiments the user-defined display parameters may require that, for example, only a portion of the imaging volume is displayed at full resolution, or any other predetermined reduced resolution. In one embodiment, the user or the software may select portions of the imaging volume that must be displayed according to the user-defined display parameters. In another embodiment, the display parameters may specify that the viewer determines when the imaging volume has been viewed according to the user-defined display parameters. In this embodiment, the system may track the viewed voxels of the imaging volume, present the viewer with a view of the imaging volume that distinguishes portions of the imaging volume that have not been viewed at full resolution, or according to any other user-defined display parameters, and the viewer can determine whether the image can be marked as read.


In one embodiment, the user can establish user-defined display parameters and store those parameters in a database. In certain embodiments, rules may be established and linked to an individual user, user type, exam type, modality, or system, for example. These rules may trigger voxel tracking when certain events occur, such as a particular user opens an imaging space for viewing. In certain embodiments, rules may designate that voxel tracking applies to only certain viewers or users. In one embodiment, one set of display parameters can apply to one modality and another set of display parameters can apply to another modality. In addition, the rules may include specific triggers or warnings that occur if the user-defined display parameters are not satisfied. In one embodiment, the rules may indicate that the computing system 100 automatically direct the user to any voxels that have not been displayed with specific user-defined display parameters when the user attempts to mark an imaging space as read.


In one embodiment, the user is not able to notate an imaging volume as being read, or completely viewed, until the entire imaging volume has been displayed at full resolution. Accordingly, in the exemplary method of FIG. 4, if not all of the imaging volume has been displayed at full resolution, the method indicates that the entire imaging volume has not been viewed at full resolution in block 290, and the method returns to block 220, wherein another portion of the imaging volume may be selected for viewing.


It is noted that medical imaging exams can be subclassified into exam types. For example, “CT of the Brain” or “MRI of the Knee” may be exam types. Exam types can also be grouped into exam groups. For example, a user using the computing system 100 can indicate that two different exam types, such as “CT of the Chest” and “MRI of the Chest,” belong in the same exam group, or that “CT of the Chest” and “CT of the Abdomen” belong in the same group. A user can also, via the system 100, set up rules related to voxel tracking that are linked to the exam group and/or modality and/or exam type, or even to other exam specifiers. Thus, a user may require that all exams of a certain type or group are tracked according to user-defined rules, e.g., to determine if every voxel is displayed, and/or if every voxel is displayed with a bone window setting, etc. The rules can also be applied under other user-defined conditions, e.g., if the referring doctor is an oncologist or if the exam indications are for cancer evaluation.
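
A rules table keyed by modality and exam type, as described above, might look like the following sketch; the schema and field names are assumptions for illustration only, not the patent's data model:

    # Hypothetical rule store linking voxel tracking to exam specifiers.
    RULES = {
        ("CT", "CT of the Chest"): {
            "min_fraction": 1.0,                   # every voxel displayed
            "required_windows": ["lung", "bone"],  # each window must be used
        },
        ("MRI", "MRI of the Knee"): {
            "min_fraction": 0.95,
        },
    }

    def rule_for(modality, exam_type):
        """Look up the voxel tracking rule for an exam, if any."""
        return RULES.get((modality, exam_type))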


Similarly, users can also be grouped into user roles or user groups. A particular user or user having a user role may: (1) require a warning if the specified voxel tracking rules are not met (but still allow the user to mark the exam as “read”), (2) require that the rules are met before the exam is marked as “read”, and/or (3) display a warning or track displayed data even if the user does not have the right to mark the exam as “read.” The computing system 100 may also provide users guidance information. Voxel tracking might also be useful for a researcher or administrator who is studying the behavior of reading or viewing health care professionals.



FIG. 5 illustrates portions of an exemplary graphical user interface (GUI) that may be displayed on a display device, either concurrently or separately. More particularly, the exemplary GUI may display a reference image 510 that allows the user to select an image plane for viewing using the user-defined display parameters. In the embodiment of FIG. 5, the reference image 510 comprises a brain CT. A navigation line 520 may be adjusted by the user in order to move to another imaging plane. In one embodiment, as the user moves the navigation line, such as by using a mouse connected to the computing system 100, the computing system 100 generates 2D images perpendicular to the navigation line 520 using MPR (multiplanar reformatting), e.g., in the coronal orientation. The reformatted image 540 corresponding to the navigation line 520 is displayed in another portion of the GUI or, alternatively, may replace the navigation image 510. In one embodiment, the navigation image 510 is displayed in a pane of the GUI that is smaller than a pane that displays a corresponding reformatted image. In another embodiment, the navigation image 510 may be alternatively displayed on the display device so that more of the images may be concurrently displayed, for example. In the embodiment of FIG. 5, a contrast inverted region 530 of the navigation image 510 indicates the portions of the imaging volume that have been displayed according to the user-defined display parameters, such as full pixel resolution. Thus, the user can determine which portions of the imaging volume have been viewed according to the user-defined display parameters by noting the portions of the navigation image 510 that are within the inverted region 530. As noted above, other methods of indicating portions of the imaging space that have been displayed may also be used.



FIG. 6 illustrates a display device 610 displaying the navigation image 510 and the reformatted image 540. As noted above, the reformatted image 540 is a cross-section of the imaging space at the location of the navigation line 520 on the navigation image 510. In the embodiment of FIG. 6, as the user moves the navigation line 520 up and down the navigation image 510, the reformatted image 540 is replaced with other reformatted images corresponding to the current location of the navigation line 520. In this embodiment, the user can see which portions of the imaging volume have already been viewed while viewing portions of the imaging volume in another plane. In other embodiments, the navigation line 520 is oriented vertically, or at any other orientation, so that the user may cause generation of reformatted images in any plane desirable.


In another embodiment, multiple navigation images may be displayed on the display device 610. For example, two navigation images comprising orthogonal planes of the imaging volume may be concurrently displayed on the display device 610 and may provide the viewer with additional information regarding the location of the reformatted images displayed on the display device 610. In other embodiments, three or more navigation images from different planes of the imaging volume may be concurrently, or alternatively, displayed on the display device 610.


The foregoing description details certain embodiments of the invention. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the invention can be practiced in many ways. For example, the above-described voxel tracking method may be performed on other types of images, in addition to medical images: images of circuit boards, airplane wings, geology, and satellite imagery may be analyzed using the described systems and methods for monitoring whether a three dimensional imaging volume has been viewed according to predefined display parameters. As is also stated above, it should be noted that the use of particular terminology when describing certain features or aspects of the invention should not be taken to imply that the terminology is being re-defined herein to be restricted to including any specific characteristics of the features or aspects of the invention with which that terminology is associated. The scope of the invention should therefore be construed in accordance with the appended claims and any equivalents thereof.

Claims
  • 1. A computing system comprising: a computer readable storage medium having program instructions embodied therewith; and one or more processors configured to execute the program instructions to cause the one or more processors to: determine, for a three dimensional imaging volume comprising a plurality of voxels, a subset of voxels from the plurality of voxels to be displayed, wherein the subset of voxels is represented as at least one of: a particular percentage of the plurality of voxels are to be displayed, or particular portions of the plurality of voxels are to be displayed, and wherein the subset of voxels is based on one or more of: a resolution of the three dimensional imaging volume, a user of the computing system, a type of user of the computing system, a modality of the three dimensional imaging volume, an exam type associated with the three dimensional imaging volume, a group of exams of which the three dimensional imaging volume is a part, a group of users of the computing system of which the user is a member, a facility in which the three dimensional imaging volume is displayed, or a system on which the three dimensional imaging volume is displayed; receive indications of portions of the three dimensional imaging volume that are selectively displayed; and determine whether the subset of voxels has been displayed based on the indications of portions of the three dimensional imaging volume that have been selectively displayed.
  • 2. The computing system of claim 1, wherein the subset of voxels is based on a user the three dimensional imaging volume is displayed to.
  • 3. The computing system of claim 1, wherein the subset of voxels is based on an exam type associated with the three dimensional imaging volume.
  • 4. The computing system of claim 1, wherein the one or more processors are configured to execute the program instructions to further cause the one or more processors to: in response to determining that the subset of voxels has not been displayed based on the indications of portions of the three dimensional imaging volume that have been selectively displayed, provide a warning to a user.
  • 5. A computer-implemented method comprising: by one or more processors executing program instructions: determining, for a three dimensional imaging volume comprising a plurality of voxels, a subset of voxels from the plurality of voxels to be displayed, wherein the subset of voxels is represented as at least one of: a particular percentage of the plurality of voxels are to be displayed, or particular portions of the plurality of voxels are to be displayed, and wherein the subset of voxels is based on one or more of: a resolution of the three dimensional imaging volume, a user of the computing system, a type of user of the computing system, a modality of the three dimensional imaging volume, an exam type associated with the three dimensional imaging volume, a group of exams of which the three dimensional imaging volume is a part, a group of users of the computing system of which the user is a member, a facility in which the three dimensional imaging volume is displayed, or a system on which the three dimensional imaging volume is displayed; receiving indications of portions of the three dimensional imaging volume that are selectively displayed; and determining whether the subset of voxels has been displayed based on the indications of portions of the three dimensional imaging volume that have been selectively displayed.
  • 6. The computer-implemented method of claim 5, wherein the subset of voxels is based on a user the three dimensional imaging volume is displayed to.
  • 7. The computer-implemented method of claim 5, wherein the subset of voxels is based on an exam type associated with the three dimensional imaging volume.
  • 8. The computer-implemented method of claim 5 further comprising: by the one or more processors executing program instructions: in response to determining that the subset of voxels has not been displayed based on the indications of portions of the three dimensional imaging volume that have been selectively displayed, providing a warning to a user.
  • 9. A computer program product comprising a non-transitory computer readable storage medium having program instructions embodied therewith, the program instructions executable by one or more processors to cause the one or more processors to: determine, for a three dimensional imaging volume comprising a plurality of voxels, a subset of voxels from the plurality of voxels to be displayed, wherein the subset of voxels is represented as at least one of: a particular percentage of the plurality of voxels are to be displayed, or particular portions of the plurality of voxels are to be displayed, and wherein the subset of voxels is based on one or more of: a resolution of the three dimensional imaging volume, a user of the computing system, a type of user of the computing system, a modality of the three dimensional imaging volume, an exam type associated with the three dimensional imaging volume, a group of exams of which the three dimensional imaging volume is a part, a group of users of the computing system of which the user is a member, a facility in which the three dimensional imaging volume is displayed, or a system on which the three dimensional imaging volume is displayed; receive indications of portions of the three dimensional imaging volume that are selectively displayed; and determine whether the subset of voxels has been displayed based on the indications of the portions of the three dimensional imaging volume that have been selectively displayed.
  • 10. The computer program product of claim 9, wherein the subset of voxels is based on a user the three dimensional imaging volume is displayed to.
  • 11. The computer program product of claim 9, wherein the subset of voxels is based on an exam type associated with the three dimensional imaging volume.
  • 12. The computer program product of claim 9, wherein the program instructions are executable by one or more processors to further cause the one or more processors to: in response to determining that the subset of voxels has not been displayed based on the indications of portions of the three dimensional imaging volume that have been selectively displayed, provide a warning to a user.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 14/081,225, filed Nov. 15, 2013, entitled “SYSTEMS AND METHODS FOR VIEWING MEDICAL 3D IMAGING VOLUMES,” which is a continuation of U.S. application Ser. No. 13/535,758, filed Jun. 28, 2012, entitled “SYSTEMS AND METHODS FOR VIEWING MEDICAL 3D IMAGING VOLUMES,” now U.S. Pat. No. 8,610,746, which is a continuation of U.S. application Ser. No. 13/079,597, filed Apr. 4, 2011, entitled “SYSTEMS AND METHODS FOR VIEWING MEDICAL 3D IMAGING VOLUMES,” now U.S. Pat. No. 8,217,966, which is a continuation of U.S. application Ser. No. 11/268,262, filed Nov. 3, 2005, entitled “SYSTEMS AND METHODS FOR VIEWING MEDICAL 3D IMAGING VOLUMES,” now U.S. Pat. No. 7,920,152, which claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Application Ser. No. 60/625,690, filed on Nov. 4, 2004, each of which is hereby expressly incorporated by reference in its entirety.

US Referenced Citations (499)
Number Name Date Kind
4672683 Matsueda Jun 1987 A
5123056 Wilson Jun 1992 A
5172419 Manian Dec 1992 A
5179651 Taaffe et al. Jan 1993 A
5431161 Ryals et al. Jul 1995 A
5452416 Hilton et al. Sep 1995 A
5515375 DeClerck May 1996 A
5542003 Wofford Jul 1996 A
5734915 Roewer Mar 1998 A
5740267 Echerer et al. Apr 1998 A
5779634 Ema et al. Jul 1998 A
5807256 Taguchi Sep 1998 A
5835030 Tsutsui et al. Nov 1998 A
5852646 Klotz et al. Dec 1998 A
5857030 Gaborski Jan 1999 A
5867322 Morton Feb 1999 A
5926568 Chaney et al. Jul 1999 A
5954650 Saito et al. Sep 1999 A
5976088 Urbano et al. Nov 1999 A
5986662 Argiro et al. Nov 1999 A
5987345 Engelmann et al. Nov 1999 A
5995644 Lai et al. Nov 1999 A
6008813 Lauer et al. Dec 1999 A
6115486 Cantoni Sep 2000 A
6128002 Leiper Oct 2000 A
6130671 Agiro Oct 2000 A
6151581 Kraftson et al. Nov 2000 A
6175643 Lai et al. Jan 2001 B1
6177937 Stockham et al. Jan 2001 B1
6185320 Bick et al. Feb 2001 B1
6211795 Izuta Apr 2001 B1
6211884 Knittel et al. Apr 2001 B1
6219059 Argiro Apr 2001 B1
6219061 Lauer et al. Apr 2001 B1
6243095 Shile et al. Jun 2001 B1
6243098 Lauer et al. Jun 2001 B1
6262740 Lauer et al. Jul 2001 B1
6266733 Knittel et al. Jul 2001 B1
6269379 Hiyama et al. Jul 2001 B1
6297799 Knittel et al. Oct 2001 B1
6304667 Reitano Oct 2001 B1
6310620 Lauer et al. Oct 2001 B1
6313841 Ogata et al. Nov 2001 B1
6342885 Knittel et al. Jan 2002 B1
6347329 Evans Feb 2002 B1
6351547 Johnson et al. Feb 2002 B1
6356265 Knittel et al. Mar 2002 B1
6369816 Knittel et al. Apr 2002 B1
6383135 Chikovani et al. May 2002 B1
6388687 Brackett et al. May 2002 B1
6404429 Knittel Jun 2002 B1
6407737 Zhao et al. Jun 2002 B1
6411296 Knittel et al. Jun 2002 B1
6421057 Lauer et al. Jul 2002 B1
6424346 Correll et al. Jul 2002 B1
6424996 Killcommons et al. Jul 2002 B1
6426749 Knittel et al. Jul 2002 B1
6427022 Craine et al. Jul 2002 B1
6438533 Spackman et al. Aug 2002 B1
6463169 Ino et al. Oct 2002 B1
6476810 Simha et al. Nov 2002 B1
6512517 Knittel et al. Jan 2003 B1
6532299 Sachdeva et al. Mar 2003 B1
6532311 Pritt Mar 2003 B1
6556695 Packer et al. Apr 2003 B1
6556724 Chang et al. Apr 2003 B1
6563950 Wiskott et al. May 2003 B1
6574629 Cooke et al. Jun 2003 B1
6577753 Ogawa Jun 2003 B2
6603494 Banks et al. Aug 2003 B1
6606171 Renk et al. Aug 2003 B1
6614447 Bhatia et al. Sep 2003 B1
6618060 Brackett Sep 2003 B1
6621918 Hu et al. Sep 2003 B1
6630937 Kallergi et al. Oct 2003 B2
6650766 Rogers Nov 2003 B1
6654012 Lauer et al. Nov 2003 B1
6678764 Parvelescu et al. Jan 2004 B2
6680735 Seiler et al. Jan 2004 B1
6683933 Saito et al. Jan 2004 B2
6697067 Callahan et al. Feb 2004 B1
6697506 Qian et al. Feb 2004 B1
6734880 Chang et al. May 2004 B2
6760755 Brackett Jul 2004 B1
6775402 Bacus et al. Aug 2004 B2
6778689 Aksit et al. Aug 2004 B1
6785410 Vining et al. Aug 2004 B2
6820093 de la Huerga Nov 2004 B2
6820100 Funahashi Nov 2004 B2
6826297 Saito et al. Nov 2004 B2
6829377 Milioto Dec 2004 B2
6864794 Betz Mar 2005 B2
6886133 Bailey et al. Apr 2005 B2
6891920 Minyard et al. May 2005 B1
6894707 Nemoto May 2005 B2
6909436 Pianykh et al. Jun 2005 B1
6909795 Tecotzky et al. Jun 2005 B2
6917696 Soenksen Jul 2005 B2
6988075 Hacker Jan 2006 B1
6996205 Capolunghi et al. Feb 2006 B2
7016952 Mullen et al. Mar 2006 B2
7022073 Fan et al. Apr 2006 B2
7027633 Foran et al. Apr 2006 B2
7031504 Argiro et al. Apr 2006 B1
7031846 Kaushikkar et al. Apr 2006 B2
7039723 Hu et al. May 2006 B2
7043474 Mojsilovic May 2006 B2
7050620 Heckman May 2006 B2
7054473 Roehrig et al. May 2006 B1
7058901 Hafey et al. Jun 2006 B1
7092572 Huang et al. Aug 2006 B2
7103205 Wang et al. Sep 2006 B2
7106479 Roy et al. Sep 2006 B2
7110616 Ditt et al. Sep 2006 B2
7113186 Kim et al. Sep 2006 B2
7123684 Jing et al. Oct 2006 B2
7136064 Zuiderveld Nov 2006 B2
7139416 Vuylsteke Nov 2006 B2
7149334 Dehmeshki Dec 2006 B2
7155043 Daw Dec 2006 B2
7162623 Yngvesson Jan 2007 B2
7170532 Sako Jan 2007 B2
7174054 Manber et al. Feb 2007 B2
7209149 Jogo Apr 2007 B2
7209578 Saito et al. Apr 2007 B2
7212661 Samara et al. May 2007 B2
7218763 Belykh et al. May 2007 B2
7224852 Lipton et al. May 2007 B2
7236558 Saito et al. Jun 2007 B2
7260249 Smith Aug 2007 B2
7263710 Hummell et al. Aug 2007 B1
7272610 Torres Sep 2007 B2
7346199 Pfaff Mar 2008 B2
7366992 Thomas, III Apr 2008 B2
7379578 Soussaline et al. May 2008 B2
7412111 Battle et al. Aug 2008 B2
7450747 Jabri et al. Nov 2008 B2
7492970 Saito et al. Feb 2009 B2
7505782 Chu Mar 2009 B2
7516417 Amador et al. Apr 2009 B2
7523505 Menschik et al. Apr 2009 B2
7525554 Morita et al. Apr 2009 B2
7526114 Seul et al. Apr 2009 B2
7526132 Koenig Apr 2009 B2
7545965 Suzuki et al. Jun 2009 B2
7574029 Peterson et al. Aug 2009 B2
7583861 Hanna et al. Sep 2009 B2
7590272 Brejl et al. Sep 2009 B2
7599534 Krishnan Oct 2009 B2
7613335 McLennan et al. Nov 2009 B2
7634121 Novatzky et al. Dec 2009 B2
7636413 Toth Dec 2009 B2
7639879 Goto et al. Dec 2009 B2
7656543 Atkins Feb 2010 B2
7660413 Partovi et al. Feb 2010 B2
7660481 Schaap et al. Feb 2010 B2
7660488 Reicher et al. Feb 2010 B2
7668352 Tecotzky et al. Feb 2010 B2
7683909 Takekoshi Mar 2010 B2
7698152 Reid Apr 2010 B2
7716277 Yamatake May 2010 B2
7787672 Reicher et al. Aug 2010 B2
7834891 Yarger et al. Nov 2010 B2
7835560 Vining et al. Nov 2010 B2
7885440 Fram et al. Feb 2011 B2
7885828 Glaser-Seidnitzer et al. Feb 2011 B2
7899514 Kirkland Mar 2011 B1
7920152 Fram et al. Apr 2011 B2
7941462 Akinyemi et al. May 2011 B2
7953614 Reicher May 2011 B1
7970188 Mahesh et al. Jun 2011 B2
7970625 Reicher et al. Jun 2011 B2
7991210 Peterson et al. Aug 2011 B2
7995821 Nakamura Aug 2011 B2
8019138 Reicher et al. Sep 2011 B2
8046044 Stazzone et al. Oct 2011 B2
8050938 Green, Jr. et al. Nov 2011 B1
8065166 Maresh et al. Nov 2011 B2
8073225 Hagen et al. Dec 2011 B2
8094901 Reicher et al. Jan 2012 B1
8150708 Kotula et al. Apr 2012 B2
8214756 Salazar-Ferrer et al. Jul 2012 B2
8217966 Fram et al. Jul 2012 B2
8244014 Reicher et al. Aug 2012 B2
8249687 Peterson et al. Aug 2012 B2
8262572 Chono Sep 2012 B2
8292811 Relkuntwar et al. Oct 2012 B2
8298147 Huennekens et al. Oct 2012 B2
8370293 Iwase et al. Feb 2013 B2
8379051 Brown Feb 2013 B2
8380533 Reicher et al. Feb 2013 B2
8391643 Melbourne et al. Mar 2013 B2
8406491 Gee et al. Mar 2013 B2
8457990 Reicher et al. Jun 2013 B1
8520978 Jakobovits Aug 2013 B2
8554576 Reicher et al. Oct 2013 B1
8560050 Martin et al. Oct 2013 B2
8610746 Fram et al. Dec 2013 B2
8626527 Reicher et al. Jan 2014 B1
8693757 Gundel Apr 2014 B2
8712120 Reicher et al. Apr 2014 B1
8731259 Reicher et al. May 2014 B2
8751268 Reicher et al. Jun 2014 B1
8771189 Ionasec et al. Jul 2014 B2
8797350 Fram Aug 2014 B2
8879807 Fram et al. Nov 2014 B2
8913808 Reicher et al. Dec 2014 B2
8954884 Barger Feb 2015 B1
8976190 Westerhoff et al. Mar 2015 B1
9042617 Reicher et al. May 2015 B1
9075899 Reicher Jul 2015 B1
9092551 Reicher Jul 2015 B1
9092727 Reicher Jul 2015 B1
9324188 Fram et al. Apr 2016 B1
9386084 Reicher et al. Jul 2016 B1
9471210 Fram et al. Oct 2016 B1
9495604 Fram Nov 2016 B1
9501617 Reicher et al. Nov 2016 B1
9501627 Reicher et al. Nov 2016 B2
9501863 Fram et al. Nov 2016 B1
9536324 Fram Jan 2017 B1
9542082 Reicher et al. Jan 2017 B1
9672477 Reicher et al. Jun 2017 B1
9684762 Reicher et al. Jun 2017 B2
9727938 Reicher et al. Aug 2017 B1
9734576 Fram et al. Aug 2017 B2
9754074 Reicher et al. Sep 2017 B1
9836202 Reicher et al. Dec 2017 B1
9892341 Reicher et al. Feb 2018 B2
9934568 Reicher et al. Apr 2018 B2
1009611 Fram et al. Oct 2018 A1
1015768 Reicher et al. Dec 2018 A1
10387612 Wu et al. Aug 2019 B2
10437444 Reicher et al. Oct 2019 B2
10438352 Fram et al. Oct 2019 B2
20010016822 Bessette Aug 2001 A1
20010041991 Segal et al. Nov 2001 A1
20010042124 Barron Nov 2001 A1
20020016718 Rothschild et al. Feb 2002 A1
20020021828 Papier et al. Feb 2002 A1
20020039084 Yamaguchi Apr 2002 A1
20020044696 Sirohey et al. Apr 2002 A1
20020054038 Nemoto May 2002 A1
20020070970 Wood et al. Jun 2002 A1
20020073429 Beane et al. Jun 2002 A1
20020090118 Olschewski Jul 2002 A1
20020090119 Saito et al. Jul 2002 A1
20020090124 Soubelet et al. Jul 2002 A1
20020091659 Beaulieu et al. Jul 2002 A1
20020099273 Bocionek et al. Jul 2002 A1
20020103673 Atwood Aug 2002 A1
20020103827 Sesek Aug 2002 A1
20020106119 Foran et al. Aug 2002 A1
20020110285 Wang et al. Aug 2002 A1
20020144697 Betz Oct 2002 A1
20020145941 Poland et al. Oct 2002 A1
20020172408 Saito et al. Nov 2002 A1
20020172409 Saito et al. Nov 2002 A1
20020180883 Tomizawa et al. Dec 2002 A1
20020186820 Saito et al. Dec 2002 A1
20020188637 Bailey et al. Dec 2002 A1
20020190984 Seiler et al. Dec 2002 A1
20030005464 Gropper et al. Jan 2003 A1
20030013951 Stefanescu Jan 2003 A1
20030016850 Kaufman et al. Jan 2003 A1
20030028402 Ulrich et al. Feb 2003 A1
20030034973 Zuiderveld Feb 2003 A1
20030037054 Dutta et al. Feb 2003 A1
20030053668 Ditt et al. Mar 2003 A1
20030055896 Hu et al. Mar 2003 A1
20030065613 Smith Apr 2003 A1
20030071829 Bodicker et al. Apr 2003 A1
20030101291 Mussack et al. May 2003 A1
20030115083 Masarie et al. Jun 2003 A1
20030120516 Perednia Jun 2003 A1
20030130973 Sumner, II et al. Jul 2003 A1
20030140044 Mok et al. Jul 2003 A1
20030140141 Mullen et al. Jul 2003 A1
20030156745 Saito et al. Aug 2003 A1
20030160095 Segal Aug 2003 A1
20030164860 Shen et al. Sep 2003 A1
20030184778 Chiba Oct 2003 A1
20030187689 Barnes et al. Oct 2003 A1
20030190062 Noro et al. Oct 2003 A1
20030195416 Toth Oct 2003 A1
20030204420 Wilkes et al. Oct 2003 A1
20030215120 Uppaluri et al. Nov 2003 A1
20030215122 Tanaka Nov 2003 A1
20040008900 Jabri et al. Jan 2004 A1
20040015703 Madison et al. Jan 2004 A1
20040024303 Banks et al. Feb 2004 A1
20040027359 Aharon et al. Feb 2004 A1
20040061889 Wood et al. Apr 2004 A1
20040068170 Wang et al. Apr 2004 A1
20040077952 Rafter et al. Apr 2004 A1
20040086163 Moriyama et al. May 2004 A1
20040088192 Schmidt et al. May 2004 A1
20040105030 Yamane Jun 2004 A1
20040105574 Pfaff Jun 2004 A1
20040114714 Minyard et al. Jun 2004 A1
20040122705 Sabol et al. Jun 2004 A1
20040122787 Avinash et al. Jun 2004 A1
20040141661 Hanna et al. Jul 2004 A1
20040143582 Vu Jul 2004 A1
20040161139 Samara et al. Aug 2004 A1
20040161164 Dewaele Aug 2004 A1
20040165791 Kaltanji Aug 2004 A1
20040172306 Wohl et al. Sep 2004 A1
20040174429 Chu Sep 2004 A1
20040190780 Shiibashi et al. Sep 2004 A1
20040197015 Fan et al. Oct 2004 A1
20040202387 Yngvesson Oct 2004 A1
20040243435 Williams Dec 2004 A1
20040252871 Tecotzky et al. Dec 2004 A1
20040254816 Myers Dec 2004 A1
20040255252 Rodriguez et al. Dec 2004 A1
20050010531 Kushalnagar et al. Jan 2005 A1
20050027569 Gollogly et al. Feb 2005 A1
20050027570 Maier et al. Feb 2005 A1
20050043970 Hsieh Feb 2005 A1
20050063575 Ma et al. Mar 2005 A1
20050065424 Shah et al. Mar 2005 A1
20050074150 Bruss Apr 2005 A1
20050074157 Thomas, III Apr 2005 A1
20050075544 Shapiro et al. Apr 2005 A1
20050088534 Shen et al. Apr 2005 A1
20050107689 Sasano May 2005 A1
20050108058 Weidner et al. May 2005 A1
20050110791 Krishnamoorthy et al. May 2005 A1
20050111733 Fors et al. May 2005 A1
20050113681 DeFreitas et al. May 2005 A1
20050114178 Krishnamurthy et al. May 2005 A1
20050114179 Brackett et al. May 2005 A1
20050114283 Pearson et al. May 2005 A1
20050143654 Zuiderveld et al. Jun 2005 A1
20050171818 McLaughlin Aug 2005 A1
20050184988 Yanof et al. Aug 2005 A1
20050197860 Joffe et al. Sep 2005 A1
20050203775 Chesbrough Sep 2005 A1
20050238218 Nakamura Oct 2005 A1
20050244041 Tecotzky et al. Nov 2005 A1
20050251013 Krishnan Nov 2005 A1
20050254729 Saito et al. Nov 2005 A1
20050259118 Mojaver Nov 2005 A1
20050273009 Deischinger et al. Dec 2005 A1
20060008181 Takekoshi Jan 2006 A1
20060031097 Lipscher et al. Feb 2006 A1
20060050152 Rai et al. Mar 2006 A1
20060058603 Dave et al. Mar 2006 A1
20060061570 Cheryauka et al. Mar 2006 A1
20060093198 Fram et al. May 2006 A1
20060093199 Fram et al. May 2006 A1
20060093207 Reicher et al. May 2006 A1
20060095423 Reicher et al. May 2006 A1
20060095426 Takachio et al. May 2006 A1
20060106642 Reicher et al. May 2006 A1
20060111937 Yarger et al. May 2006 A1
20060111941 Blom May 2006 A1
20060122482 Mariotti et al. Jun 2006 A1
20060171574 DelMonego et al. Aug 2006 A1
20060181548 Hafey Aug 2006 A1
20060188134 Quist Aug 2006 A1
20060230072 Partovi et al. Oct 2006 A1
20060238546 Handley et al. Oct 2006 A1
20060239573 Novatzky et al. Oct 2006 A1
20060241979 Sato et al. Oct 2006 A1
20060267976 Saito et al. Nov 2006 A1
20060274145 Reiner Dec 2006 A1
20060276708 Peterson et al. Dec 2006 A1
20060277075 Salwan Dec 2006 A1
20060282408 Wisely et al. Dec 2006 A1
20060282447 Hollebeek Dec 2006 A1
20070009078 Saito et al. Jan 2007 A1
20070021977 Elsholz Jan 2007 A1
20070050701 El Emam et al. Mar 2007 A1
20070055550 Courtney et al. Mar 2007 A1
20070064984 Vassa et al. Mar 2007 A1
20070067124 Kimpe et al. Mar 2007 A1
20070073556 Lau et al. Mar 2007 A1
20070106535 Matsunaga May 2007 A1
20070106633 Reiner May 2007 A1
20070109299 Peterson May 2007 A1
20070109402 Niwa May 2007 A1
20070110294 Schaap et al. May 2007 A1
20070116345 Peterson et al. May 2007 A1
20070116346 Peterson et al. May 2007 A1
20070122016 Brejl et al. May 2007 A1
20070124541 Lang et al. May 2007 A1
20070140536 Sehnert Jun 2007 A1
20070159962 Mathavu et al. Jul 2007 A1
20070162308 Peters Jul 2007 A1
20070165917 Cao et al. Jul 2007 A1
20070174079 Kraus Jul 2007 A1
20070192138 Saito et al. Aug 2007 A1
20070192140 Gropper Aug 2007 A1
20070237380 Iwase et al. Oct 2007 A1
20070239481 DiSilvestro et al. Oct 2007 A1
20070245308 Hill et al. Oct 2007 A1
20070270695 Keen Nov 2007 A1
20080016111 Keen Jan 2008 A1
20080021877 Saito et al. Jan 2008 A1
20080031507 Uppaluri et al. Feb 2008 A1
20080059245 Sakaida et al. Mar 2008 A1
20080097186 Biglieri et al. Apr 2008 A1
20080100612 Dastmalchi et al. May 2008 A1
20080103828 Squilla et al. May 2008 A1
20080118120 Wegenkittl et al. May 2008 A1
20080125846 Battle et al. May 2008 A1
20080126982 Sadikali et al. May 2008 A1
20080130966 Crucs Jun 2008 A1
20080133526 Haitani et al. Jun 2008 A1
20080136838 Goede et al. Jun 2008 A1
20080275913 van Arragon et al. Nov 2008 A1
20080279439 Minyard et al. Nov 2008 A1
20080300484 Wang et al. Dec 2008 A1
20090005668 West et al. Jan 2009 A1
20090022375 Fidrich Jan 2009 A1
20090028410 Shimazaki Jan 2009 A1
20090080719 Watt Mar 2009 A1
20090091566 Turney et al. Apr 2009 A1
20090094513 Bay Apr 2009 A1
20090123052 Ruth et al. May 2009 A1
20090129643 Natanzon et al. May 2009 A1
20090129651 Zagzebski et al. May 2009 A1
20090132586 Napora et al. May 2009 A1
20090150481 Garcia et al. Jun 2009 A1
20090164247 Dobler et al. Jun 2009 A1
20090182577 Squilla et al. Jul 2009 A1
20090198514 Rhodes Aug 2009 A1
20090213034 Wu et al. Aug 2009 A1
20090248442 Pacheco et al. Oct 2009 A1
20090268986 Holstein et al. Oct 2009 A1
20090326373 Boese et al. Dec 2009 A1
20100053353 Hunter et al. Mar 2010 A1
20100086182 Hui Apr 2010 A1
20100131887 Salazar-Ferrer et al. May 2010 A1
20100138239 Reicher et al. Jun 2010 A1
20100198608 Kaboff et al. Aug 2010 A1
20100201714 Reicher et al. Aug 2010 A1
20100211409 Kotula et al. Aug 2010 A1
20100246981 Hu et al. Sep 2010 A1
20100299157 Fram et al. Nov 2010 A1
20110016430 Fram et al. Jan 2011 A1
20110019886 Mizuno Jan 2011 A1
20110110572 Guehring et al. May 2011 A1
20110267339 Fram et al. Nov 2011 A1
20110293162 Pajeau Dec 2011 A1
20110316873 Reicher et al. Dec 2011 A1
20120070048 Van Den Brink Mar 2012 A1
20120130729 Raizada et al. May 2012 A1
20120136794 Kushalnagar et al. May 2012 A1
20120163684 Natanzon et al. Jun 2012 A1
20120183191 Nakamura Jul 2012 A1
20120194540 Reicher et al. Aug 2012 A1
20120196258 Geijsen et al. Aug 2012 A1
20120208592 Davis et al. Aug 2012 A1
20120284657 Hafey et al. Nov 2012 A1
20120320093 Zhu et al. Dec 2012 A1
20130070998 Shibata Mar 2013 A1
20130076681 Sirpal et al. Mar 2013 A1
20130083023 Fram Apr 2013 A1
20130129198 Sherman et al. May 2013 A1
20130129231 Dale et al. May 2013 A1
20130159019 Reicher Jun 2013 A1
20130169661 Reicher et al. Jul 2013 A1
20130195329 Canda et al. Aug 2013 A1
20130198682 Matas et al. Aug 2013 A1
20130297331 Zuehlsdorff et al. Nov 2013 A1
20140022194 Ito Jan 2014 A1
20140096049 Vonshak et al. Apr 2014 A1
20140119514 Miyazawa May 2014 A1
20140142983 Backhaus et al. May 2014 A1
20140378810 Davis et al. Dec 2014 A1
20150046349 Michael, Jr. et al. Feb 2015 A1
20150101066 Fram Apr 2015 A1
20150363104 Ichioka et al. Dec 2015 A1
20160034110 Edwards Feb 2016 A1
20160270746 Foos et al. Sep 2016 A1
20170038951 Reicher et al. Feb 2017 A1
20170039321 Reicher et al. Feb 2017 A1
20170039322 Reicher et al. Feb 2017 A1
20170039350 Reicher et al. Feb 2017 A1
20170039705 Fram et al. Feb 2017 A1
20170046014 Fram Feb 2017 A1
20170046483 Reicher et al. Feb 2017 A1
20170046485 Reicher et al. Feb 2017 A1
20170046495 Fram Feb 2017 A1
20170046870 Fram Feb 2017 A1
20170053404 Reicher et al. Feb 2017 A1
20170200064 Reicher et al. Jul 2017 A1
20170200269 Reicher et al. Jul 2017 A1
20170200270 Reicher et al. Jul 2017 A1
20170206324 Reicher et al. Jul 2017 A1
20170239720 Levin et al. Aug 2017 A1
20170293720 Reicher et al. Oct 2017 A1
20170301090 Fram et al. Oct 2017 A1
20170308647 Reicher et al. Oct 2017 A1
20180059918 Reicher et al. Mar 2018 A1
20180225824 Fram et al. Aug 2018 A1
Foreign Referenced Citations (1)
Number Date Country
WO 2007131157 Nov 2007 WO
Non-Patent Literature Citations (372)
Entry
US 7,801,341 B2, 09/2010, Fram et al. (withdrawn)
US 8,208,705 B2, 06/2012, Reicher et al. (withdrawn)
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/631,313 dated Jan. 30, 2018 (10 pages).
Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/346,530 dated Mar. 26, 2018 (40 pages).
Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/799,657 dated Mar. 8, 2018 (25 pages).
U.S. Appl. No. 14/540,830, Systems and Methods for Viewing Medical Images, filed Nov. 13, 2014.
U.S. Appl. No. 15/254,627, Systems and Methods for Interleaving Series of Medical Images, filed Sep. 1, 2016.
U.S. Appl. No. 14/095,123, Systems and Methods for Retrieval of Medical Data, filed Dec. 3, 2013.
U.S. Appl. No. 14/244,431, Systems and Methods for Matching, Naming, and Displaying Medical Images, filed Apr. 3, 2014.
U.S. Appl. No. 14/298,806, Smart Placement Rules, filed Jun. 6, 2014.
U.S. Appl. No. 11/942,687, Smart Forms, filed Nov. 19, 2007.
U.S. Appl. No. 14/043,165, Automated Document Filings, filed Oct. 1, 2013.
U.S. Appl. No. 11/944,000, Exam Scheduling With Customer Configured Notifications, filed Nov. 21, 2007.
U.S. Appl. No. 13/768,765, System and Method of Providing Dynamic and Customizable Medical Examination Forms, filed Feb. 15, 2013.
U.S. Appl. No. 15/163,600, Rules-Based Approach to Transferring and/or Viewing Medical Images, filed May 24, 2016.
U.S. Appl. No. 14/792,210, Dynamic Montage Reconstruction, filed Jul. 6, 2015.
U.S. Appl. No. 15/140,346, Database Systems and Interactive User Interfaces for Dynamic Interaction With, and Sorting of, Digital Medical Image Data, filed Apr. 27, 2016.
U.S. Appl. No. 15/140,363, Database Systems and Interactive User Interfaces for Dynamic Interaction With, and Comparison of, Digital Medical Image Data, filed Apr. 27, 2016.
U.S. Appl. No. 15/140,351, Database Systems and Interactive User Interfaces for Dynamic Interaction With, and Review of, Digital Medical Image Data, filed Apr. 27, 2016.
U.S. Appl. No. 15/140,348, Database Systems and Interactive User Interfaces for Dynamic Interaction With, and Indications of, Digital Medical Image Data, filed Apr. 27, 2016.
U.S. Appl. No. 14/081,225, filed Nov. 15, 2013, including its ongoing prosecution history, including without limitation Office Actions, Amendments, Remarks, and any other potentially relevant documents, Fram et al.
U.S. Appl. No. 12/437,522, filed May 7, 2009, Fram.
U.S. Appl. No. 14/792,210, filed Jul. 6, 2015, including its ongoing prosecution history, including without limitation Office Actions, Amendments, Remarks, and any other potentially relevant documents, Reicher.
Non-Final Office Action dated Aug. 28, 2007 in U.S. Appl. No. 11/179,384.
Final Office Action dated Jun. 26, 2008 in U.S. Appl. No. 11/179,384.
Non-Final Office Action dated Dec. 29, 2008 in U.S. Appl. No. 11/179,384.
Final Office Action dated Jul. 24, 2009, in U.S. Appl. No. 11/179,384.
Notice of Allowance dated Nov. 3, 2009, in U.S. Appl. No. 11/179,384.
Non-Final Office Action dated Aug. 18, 2010 in U.S. Appl. No. 12/702,976.
Interview Summary dated Dec. 1, 2010, in U.S. Appl. No. 12/702,976.
Final Office Action dated Feb. 17, 2011 in U.S. Appl. No. 12/702,976.
Interview Summary dated May 31, 2011 in U.S. Appl. No. 12/702,976.
Notice of Allowance dated Jul. 20, 2011, in U.S. Appl. No. 12/702,976.
Office Action dated Dec. 1, 2011, in U.S. Appl. No. 13/228,349.
Notice of Allowance dated Feb. 6, 2012, in U.S. Appl. No. 13/228,349.
Notice of Allowance dated Jul. 20, 2012, in U.S. Appl. No. 13/228,349.
Office Action dated Dec. 11, 2013, in U.S. Appl. No. 13/477,853.
Interview Summary dated Mar. 14, 2014, in U.S. Appl. No. 13/477,853.
Final Office Action dated Jun. 13, 2014, in U.S. Appl. No. 13/477,853.
Notice of Allowance dated Aug. 15, 2014, in U.S. Appl. No. 13/477,853.
Non-Final Office Action dated Oct. 1, 2009, in U.S. Appl. No. 11/268,261.
Notice of Allowance dated Feb. 2, 2010, in U.S. Appl. No. 11/268,261.
Interview Summary dated Jan. 25, 2010, in U.S. Appl. No. 11/268,261.
Interview Summary dated May 14, 2010, in U.S. Appl. No. 11/268,261.
Notice of Allowance dated May 17, 2010, in U.S. Appl. No. 11/268,261.
Supplemental Notice of Allowance dated Aug. 6, 2010, in U.S. Appl. No. 11/268,261.
Notice of Allowance dated Oct. 8, 2010, in U.S. Appl. No. 11/268,261.
Notice of Allowance dated Dec. 3, 2010, in U.S. Appl. No. 11/268,261.
Notice of Allowance dated Jan. 6, 2011, in U.S. Appl. No. 11/268,261.
Office Action dated May 16, 2011, in U.S. Appl. No. 12/857,915.
Interview Summary dated Sep. 6, 2011, in U.S. Appl. No. 12/857,915.
Final Office Action dated Dec. 15, 2011, in U.S. Appl. No. 12/857,915.
Office Action dated Jun. 12, 2012, in U.S. Appl. No. 12/857,915.
Office Action dated Aug. 23, 2013, in U.S. Appl. No. 12/857,915.
Interview Summary dated Feb. 4, 2014, in U.S. Appl. No. 12/857,915.
Notice of Allowance dated Jul. 3, 2014, in U.S. Appl. No. 12/857,915.
“Corrected” Notice of Allowance dated Aug. 15, 2014, in U.S. Appl. No. 12/857,915.
Non-Final Office Action dated Jan. 20, 2016, in U.S. Appl. No. 14/502,055.
Interview Summary dated Apr. 14, 2016, in U.S. Appl. No. 14/502,055.
Notice of Allowance dated Jun. 2, 2016, in U.S. Appl. No. 14/502,055.
Notice of Corrected Allowability dated Jul. 14, 2016, in U.S. Appl. No. 14/502,055.
Notice of Corrected Allowability dated Sep. 19, 2016, in U.S. Appl. No. 14/502,055.
Non-Final Office Action dated May 13, 2009, in U.S. Appl. No. 11/265,979.
Final Office Action dated Dec. 22, 2009 in U.S. Appl. No. 11/265,979.
Non-Final Office Action dated Jul. 8, 2010 in U.S. Appl. No. 11/265,979.
Interview Summary dated Mar. 4, 2010 in U.S. Appl. No. 11/265,979.
Interview Summary dated Nov. 16, 2010 in U.S. Appl. No. 11/265,979.
Final Office Action dated Dec. 23, 2010 in U.S. Appl. No. 11/265,979.
Interview Summary dated Mar. 17, 2011 in U.S. Appl. No. 11/265,979.
Notice of Allowance dated May 26, 2011 in U.S. Appl. No. 11/265,979.
Office Action dated Jun. 8, 2012 in U.S. Appl. No. 13/171,081.
Interview Summary dated Jul. 31, 2012 in U.S. Appl. No. 13/171,081.
Final Office Action dated Oct. 12, 2012 in U.S. Appl. No. 13/171,081.
Interview Summary dated Nov. 6, 2012 in U.S. Appl. No. 13/171,081.
Notice of Allowance, dated Sep. 4, 2013, in U.S. Appl. No. 13/171,081.
Office Action dated Mar. 3, 2015 in U.S. Appl. No. 14/095,123.
Interview Summary dated May 1, 2015 in U.S. Appl. No. 14/095,123.
Final Office Action dated Jul. 23, 2015 in U.S. Appl. No. 14/095,123.
Interview Summary dated Aug. 27, 2015 in U.S. Appl. No. 14/095,123.
Office Action dated Feb. 23, 2016 in U.S. Appl. No. 14/095,123.
Final Office Action dated Jul. 20, 2016 in U.S. Appl. No. 14/095,123.
Non-Final Office Action dated Aug. 24, 2009 in U.S. Appl. No. 11/268,262.
Non-Final Office Action dated Apr. 16, 2010 in U.S. Appl. No. 11/268,262.
Interview Summary dated Nov. 24, 2009 in U.S. Appl. No. 11/268,262.
Interview Summary dated May 12, 2010 in U.S. Appl. No. 11/268,262.
Final Office Action dated Oct. 28, 2010 in U.S. Appl. No. 11/268,262.
Interview Summary dated Dec. 1, 2010 in U.S. Appl. No. 11/268,262.
Notice of Allowance dated Dec. 1, 2010 in U.S. Appl. No. 11/268,262.
Notice of Allowance dated Feb. 25, 2011 in U.S. Appl. No. 11/268,262.
Non-Final Office Action dated Jan. 11, 2012 in U.S. Appl. No. 13/079,597.
Notice of Allowance dated Apr. 25, 2012, in U.S. Appl. No. 13/079,597.
Non-Final Office Action dated Apr. 4, 2013 in U.S. Appl. No. 13/535,758.
Notice of Allowance, dated Aug. 23, 2013 in U.S. Appl. No. 13/535,758.
Corrected Notice of Allowance dated Jun. 27, 2016, in U.S. Appl. No. 14/502,055.
Office Action dated Mar. 10, 2016 in U.S. Appl. No. 14/081,225.
Notice of Allowance dated Sep. 2, 2016 in U.S. Appl. No. 14/081,225.
Non-Final Office Action dated Jul. 27, 2009 in U.S. Appl. No. 11/265,978.
Notice of Allowance dated Nov. 19, 2009 in U.S. Appl. No. 11/265,978.
Notice of Allowance dated Apr. 19, 2010 in U.S. Appl. No. 11/265,978.
Supplemental Notice of Allowance dated May 3, 2010 in U.S. Appl. No. 11/265,978.
Supplemental Notice of Allowance dated Aug. 3, 2010 in U.S. Appl. No. 11/265,978.
Non-Final Office Action dated May 5, 2011 in U.S. Appl. No. 12/870,645.
Non-Final Office Action dated May 31, 2013, in U.S. Appl. No. 13/345,606.
Interview Summary dated Aug. 15, 2013, in U.S. Appl. No. 13/345,606.
Notice of Allowance, dated Jan. 9, 2014 in U.S. Appl. No. 13/345,606.
Non-Final Office Action dated Mar. 18, 2016 in U.S. Appl. No. 14/244,431.
Interview Summary dated Jun. 17, 2016 in U.S. Appl. No. 14/244,431.
Notice of Allowance dated Aug. 18, 2016 in U.S. Appl. No. 14/244,431.
Non-Final Office Action dated May 26, 2010 in U.S. Appl. No. 11/942,674.
Interview Summary dated Jul. 26, 2010 in U.S. Appl. No. 11/942,674.
Final Office Action dated Nov. 26, 2010 in U.S. Appl. No. 11/942,674.
Interview Summary dated Mar. 2, 2011 in U.S. Appl. No. 11/942,674.
Notice of Allowance, dated Apr. 1, 2011 in U.S. Appl. No. 11/942,674.
Non-Final Office Action dated Nov. 10, 2011 in U.S. Appl. No. 13/118,085.
Interview Summary, dated Feb. 17, 2012, in U.S. Appl. No. 13/118,085.
Final Office Action, dated Apr. 13, 2012, in U.S. Appl. No. 13/118,085.
Notice of Allowance, dated Feb. 6, 2013, in U.S. Appl. No. 13/118,085.
Non-Final Office Action dated Aug. 23, 2013 in U.S. Appl. No. 13/907,128.
Final Office Action dated Oct. 9, 2013 in U.S. Appl. No. 13/907,128.
Interview Summary dated Nov. 22, 2013 in U.S. Appl. No. 13/907,128.
Notice of Allowance dated Jan. 31, 2014 in U.S. Appl. No. 13/907,128.
Office Action, dated Dec. 29, 2014 in U.S. Appl. No. 14/298,806.
Interview Summary, dated Mar. 2, 2015 in U.S. Appl. No. 14/298,806.
Final Office Action, dated Jun. 17, 2015 in U.S. Appl. No. 14/298,806.
Office Action, dated Feb. 16, 2016 in U.S. Appl. No. 14/298,806.
Final Office Action, dated Jul. 21, 2016 in U.S. Appl. No. 14/298,806.
Non-Final Office Action dated Sep. 16, 2010 in U.S. Appl. No. 11/942,687.
Interview Summary dated Dec. 3, 2010 in U.S. Appl. No. 11/942,687.
Final Office Action, dated Apr. 5, 2011 in U.S. Appl. No. 11/942,687.
Office Action, dated Mar. 13, 2014 in U.S. Appl. No. 11/942,687.
Interview Summary, dated Jun. 17, 2014 in U.S. Appl. No. 11/942,687.
Office Action, dated Jul. 18, 2014 in U.S. Appl. No. 11/942,687.
Final Office Action, dated Jan. 5, 2015 in U.S. Appl. No. 11/942,687.
Interview Summary, dated Mar. 4, 2015 in U.S. Appl. No. 11/942,687.
PTAB Examiner's Answer, dated Feb. 25, 2016 in U.S. Appl. No. 11/942,687.
Non-Final Office Action dated Apr. 14, 2010 in U.S. Appl. No. 11/944,027.
Interview Summary dated May 13, 2010 in U.S. Appl. No. 11/944,027.
Final Office Action dated Dec. 23, 2010 in U.S. Appl. No. 11/944,027.
Interview Summary dated Mar. 31, 2011 in U.S. Appl. No. 11/944,027.
Office Action dated Apr. 19, 2012 in U.S. Appl. No. 11/944,027.
Interview Summary dated Jun. 28, 2012 in U.S. Appl. No. 11/944,027.
Final Office Action dated Oct. 22, 2012 in U.S. Appl. No. 11/944,027.
Notice of Allowance dated Jun. 5, 2013 in U.S. Appl. No. 11/944,027.
Office Action dated Oct. 14, 2014 in U.S. Appl. No. 14/043,165.
Final Office Action dated Apr. 1, 2015 in U.S. Appl. No. 14/043,165.
Office Action dated Oct. 2, 2015 in U.S. Appl. No. 14/043,165.
Interview Summary dated Dec. 21, 2015 in U.S. Appl. No. 14/043,165.
Final Office Action dated Feb. 17, 2016 in U.S. Appl. No. 14/043,165.
Non-Final Office Action dated Sep. 29, 2010 in U.S. Appl. No. 11/944,000.
Final Office Action dated Apr. 20, 2011 in U.S. Appl. No. 11/944,000.
Interview Summary dated Jun. 7, 2011 in U.S. Appl. No. 11/944,000.
Appeal Brief dated Mar. 4, 2013 in U.S. Appl. No. 11/944,000.
Examiner's Answer dated Jun. 26, 2013 in U.S. Appl. No. 11/944,000.
Board Decision dated Mar. 23, 2016 in U.S. Appl. No. 11/944,000.
Office Action, dated Jul. 15, 2016 in U.S. Appl. No. 11/944,000.
Office Action dated Feb. 3, 2012 in U.S. Appl. No. 12/622,404.
Interview Summary dated May 8, 2012 in U.S. Appl. No. 12/622,404.
Final Office Action dated Aug. 6, 2012 in U.S. Appl. No. 12/622,404.
Notice of Allowance dated Oct. 15, 2012 in U.S. Appl. No. 12/622,404.
Office Action dated Mar. 17, 2015 in U.S. Appl. No. 13/768,765.
Interview Summary dated Jun. 11, 2015 in U.S. Appl. No. 13/768,765.
Notice of Allowance dated Aug. 28, 2015 in U.S. Appl. No. 13/768,765.
Notice of Allowability dated Nov. 20, 2015 in U.S. Appl. No. 13/768,765.
Notice of Allowability dated Jul. 28, 2016 in U.S. Appl. No. 13/768,765.
Office Action dated Mar. 4, 2013 in U.S. Appl. No. 12/891,543.
Interview Summary dated Apr. 5, 2013 in U.S. Appl. No. 12/891,543.
Notice of Allowance dated Nov. 14, 2013 in U.S. Appl. No. 12/891,543.
Office Action dated Sep. 11, 2014 in U.S. Appl. No. 14/179,328.
Notice of Allowance dated Jan. 14, 2015 in U.S. Appl. No. 14/179,328.
Office Action dated Aug. 13, 2015 in U.S. Appl. No. 14/687,853.
Notice of Allowance dated Feb. 25, 2016 in U.S. Appl. No. 14/687,853.
Supplemental Notice of Allowance dated Jun. 2, 2016 in U.S. Appl. No. 14/687,853.
Notice of Allowance dated Aug. 11, 2016 in U.S. Appl. No. 15/163,600.
Supplemental Notice of Allowance dated Sep. 14, 2016 in U.S. Appl. No. 15/163,600.
Office Action dated Jun. 27, 2014 in U.S. Appl. No. 13/572,397.
Final Office Action dated Jan. 13, 2015 in U.S. Appl. No. 13/572,397.
Notice of Allowance dated Mar. 19, 2015 in U.S. Appl. No. 13/572,397.
Office Action dated Aug. 6, 2014 in U.S. Appl. No. 13/572,547.
Notice of Allowance, dated Mar. 3, 2015 in U.S. Appl. No. 13/572,547.
Corrected Notice of Allowance, dated Apr. 10, 2015 in U.S. Appl. No. 13/572,547.
Corrected Notice of Allowance, dated May 21, 2015 in U.S. Appl. No. 13/572,547.
Office Action dated Jul. 30, 2014 in U.S. Appl. No. 13/572,552.
Interview Summary dated Sep. 3, 2014 in U.S. Appl. No. 13/572,552.
Final Office Action dated Jan. 28, 2015 in U.S. Appl. No. 13/572,552.
Interview Summary dated Apr. 23, 2015 in U.S. Appl. No. 13/572,552.
Notice of Allowance, dated May 8, 2015 in U.S. Appl. No. 13/572,552.
Agfa HealthCare, color brochure “IMPAX 6: Digital Image and Information Management,” © 2012 Agfa HealthCare N.V. Downloaded from http://www.agfahealthcare.com/global/en/he/library/libraryopen?ID=32882925. Accessed on Feb. 9, 2015.
Agfa HealthCare, IMPAX 6.5 Datasheet (US)2012. © 2012 Agfa HealthCare N.V. Downloaded from http://www.agfahealthcare.com/global/en/he/library/libraryopen?ID=37459801. Accessed on Feb. 9, 2015.
AMD Technologies, Inc., Catella PACS 5.0 Viewer User Manual (112 pgs), © 2010, AMD Technologies, Inc. (Doc. 340-3-503 Rev. 01). Downloaded from http://www.amdtechnologies.com/lit/cat5viewer.pdf. Accessed on Feb. 9, 2015.
Aspyra's Imaging Solutions, 3 page color printout. Accessed at http://www.aspyra.com/imaging-solutions. Accessed on Feb. 9, 2015.
Avreo, interWorks—RIS/PACS package, 2 page color brochure, © 2014, Avreo, Inc. (Document MR-5032 Rev. 4). Downloaded from http://www.avreo.com/ProductBrochures/MR-5032Rev.%204interWORKS%20RISPACSPackage.pdf. Accessed on Feb. 9, 2015.
BRIT Systems, BRIT PACS View Viewer, 2 page color brochure, (BPB-BPV-0001). Downloaded from http://www.brit.com/pdfs/britpacsview.pdf. Accessed on Feb. 9, 2015.
BRIT Systems, Roentgen Works—100% Browser-based VNA (Vendor Neutral Archive/PACS), © 2010 BRIT Systems, 1 page color sheet. Accessed at http://www.roentgenworks.com/PACS. Accessed on Feb. 9, 2015.
BRIT Systems, Vision Multi-modality Viewer—with 3D, 2 page color brochure, (BPB-BVV-0001 REVC). Downloaded from http://www.brit.com/pdfs/BPB-BVV-0001REVC_BRIT_Vision_Viewer.pdf. Accessed on Feb. 9, 2015.
CANDELiS, ImageGrid™: Image Management Appliance, 6 page color brochure. (AD-012 Rev. F Nov. 2012), © 2012 Candelis, Inc. Downloaded from http://www.candelis.com/images/pdf/Candelis_ImageGrid_Appliance_20111121.pdf. Accessed on Feb. 9, 2015.
Carestream, Cardiology PACS, 8 page color brochure. (CAT 866 6075 06/12). © Carestream Health, Inc., 2012. Downloaded from http://www.carestream.com/cardioPACS_brochure_M1-877.pdf. Accessed on Feb. 9, 2015.
Carestream, Vue PACS, 8 page color brochure. (CAT 300 1035 05/14). © Carestream Health, Inc., 2014. Downloaded from http://www.carestream.com/csPACS_brochure_M1-876.pdf. Accessed on Feb. 9, 2015.
Cerner, Radiology—Streamline image management, 2 page color brochure, (fl03_332_10_v3). Downloaded from http://www.cerner.com/uploadedFiles/Clinical_Imaging.pdf. Accessed on Feb. 9, 2015.
CoActiv, EXAM-PACS, 2 page color brochure, © 2014 CoActiv, LLC. Downloaded from http://coactiv.com/wp-content/uploads/2013/08/EXAM-PACS-BROCHURE-final-web.pdf. Accessed on Feb. 9, 2015.
Crowley, Rebecca et al., Development of Visual Diagnostic Expertise in Pathology: an Information-processing Study, Jan. 2003, Journal of the American Medical Informatics Association, vol. 10, No. 1, pp. 39-51.
DR Systems, Dominator™ Guide for Reading Physicians, Release 8.2, 546 pages, (TCP-000260-A), © 1997-2009, DR Systems, Inc. Downloaded from https://resources.dominator.com/assets/004/6999.pdf. Document accessed Feb. 9, 2015.
DR Systems, DR Scheduler User Guide, Release 8.2, 410 pages, (TCP-000115-A), © 1997-2009, DR Systems, Inc. Downloaded from https://resources.dominator.com/assets/003/6850.pdf. Document accessed Feb. 9, 2015.
Erickson, et al.: “Effect of Automated Image Registration on Radiologist Interpretation,” Journal of Digital Imaging, vol. 20, No. 2 (Jun. 2007); pp. 105-113.
Erickson, et al.: “Image Registration Improves Confidence and Accuracy of Image Interpretation,” Special Issue-Imaging Informatics, Cancer Informatics 2007:1, 19-24, May 2007.
Fujifilm Medical Systems, Synapse® Product Data, Synapse Release Version 3.2.1, Foundation Technologies, 4 page color brochure, (XBUSSY084) Aug. 2008. Downloaded from http://www.fujifilmusa.com/shared/bin/foundation.pdf. Accessed on Feb. 9, 2015.
Fujifilm Medical Systems, Synapse® Product Data, Synapse Release Version 3.2.1, Server Modules and Interfaces, 4 page color brochure, (XBUSSY085) Aug. 2008. Downloaded from http://www.fujifilmusa.com/shared/bin/server-interface.pdf. Accessed on Feb. 9, 2015.
Fujifilm Medical Systems, Synapse® Product Data, Synapse Release Version 3.2.1, Workstation Software, 4 page color brochure, (XBUSSY082) Aug. 2008. Downloaded from http://www.fujifilmusa.com/shared/bin/workstation.pdf. Accessed on Feb. 9, 2015.
GE Healthcare, Centricity PACS, in 8 page printout. Accessed at http://www3.gehealthcare.com/en/products/categories/healthcare_it/medical_imaging_informatics_-_ris-pacs-cvis/centricity_pacs. Accessed on Feb. 9, 2015.
Handylife.com—Overview of Handy Patients Enterprise, in 2 page printout. Accessed from http://www.handylife.com/en/software/overview.html. Accessed on Feb. 18, 2015.
Handylife.com—Features of Handy Patients Enterprise, in 4 page printout. Accessed from http://www.handylife.com/en/software/features.html. Accessed on Feb. 18, 2015.
Handylife.com—Screenshots of Handy Patients Enterprise, in 2 page printout. Accessed from http://www.handylife.com/en/software/screenshots.html. Accessed on Feb. 18, 2015.
ICRco, I See the Future, in 12 pages, color brochure, (BR080809AUS), © 2009 iCRco.ClarityPACS. Downloaded from http://www.claritypacs.com/pdfs/ISeeFuture_26_Web.pdf. Accessed on Feb. 9, 2015.
Imageanalysis, dynamika, 2 page color brochure. Downloaded from http://www.imageanalysis.org.uk/what-we-do. Accessed on Feb. 9, 2015.
Imageanalysis, MRI Software, in 5 page printout. Accessed at http://www.imageanalysis.org.uk/mri-software. Accessed on Feb. 9, 2015.
IMSI, Integrated Modular Systems, Inc., Hosted / Cloud PACS in one page printout. Accessed at http://www.imsimed.com/#!products-services/ctnu. Accessed on Feb. 9, 2015.
Infinitt, PACS, RIS, Mammo PACS, Cardiology Suite and 3D/Advanced Visualization | Infinittna, 2 page printout. Accessed at http://www.infinittna.com/products/radiology/radiology-pacs. Accessed on Feb. 9, 2015.
Intelerad, IntelePACS, 2 page color brochure, © 2014 Intelerad Medical Systems Incorporated. Downloaded from http://www.intelerad.com/wp-content/uploads/sites/2/2014/08/IntelePACS-brochure.pdf. Accessed on Feb. 9, 2015.
Intelerad, InteleViewer, 2 page color brochure, © 2014 Intelerad Medical Systems Incorporated. Downloaded from http://www.intelerad.com/wp-content/uploads/sites/2/2014/09/InteleViewer-brochure.pdf. Accessed on Feb. 9, 2015.
Intuitive Imaging Informatics, ImageQube, 1 page in color. Downloaded from http://www.intuitiveimaging.com/2013/pdf/ImageQube%20one-sheet.pdf. Accessed on Feb. 9, 2015.
Kuhl, Helen: Comparison Chart/PACS, Customers Are Happy, But Looking for More, (color) Imaging Technology News, itnonline.com, May 2012, pp. 24-27. Downloaded from http://www.merge.com/MergeHealthcare/media/company/In%20The%20News/merge-pacs-comparison.pdf. Accessed on Feb. 9, 2015.
LUMEDX CardioPACS 5.0 Web Viewer, Cardiopacs Module, 2 page color brochure, (506-10011 Rev A). Downloaded from http://cdn.medicexchange.com/images/whitepaper/cardiopacs_web_viewer.pdf?1295436926. Accessed on Feb. 9, 2015.
LUMEDX Cardiovascular Information System, CardioPACS, one page in color printout. Accessed at http://www.lumedx.com/pacs.aspx. Accessed on Feb. 9, 2015.
McKesson Enterprise Medical Imaging and PACS | McKesson, 1 page (color) printout. Accessed at http://www.mckesson.com/providers/health-systems/diagnostic-imaging/enterprise-medical-imaging. Accessed on Feb. 9, 2015.
Medweb Radiology Workflow Solutions, Radiology Workflow Solutions, Complete Workflow & Flexible Turnkey Solutions, Web RIS/PACS with Advanced Viewer, 3 page color brochure, © 2006-2014 Medweb. Downloaded from http://www.medweb.com/docs/rispacs_brochure_2014.pdf. Accessed on Feb. 9, 2015.
Mendelson, et al., “Informatics in Radiology—Image Exchange: IHE and the Evolution of Image Sharing,” RadioGraphics, Nov.-Dec. 2008, vol. 28, No. 7.
Merge Radiology Solutions, Merge PACS, A real-time picture archiving communication system, (PAX-21990 rev 2.0), 2 page color brochure. Downloaded from http://www.merge.com/MergeHealthcare/media/documents/brochures/Merge_PACS_web.pdf. Accessed on Feb. 9, 2015.
NOVARAD Enterprise Imaging Solutions, NOVAPACS, 2 page (color) printout. Accessed at http://ww1.novarad.net/novapacs. Accessed on Feb. 9, 2015.
PACSPLUS, PACSPLUS Server, 1 page (color) printout. Accessed at http://www.pacsplus.com/01_products/products_01.html. Accessed on Feb. 9, 2015.
PACSPLUS, PACSPLUS Workstation, 3 page (color) printout. Accessed at http://www.pacsplus.com/01_products/products_01.html. Accessed on Feb. 9, 2015.
Philips IntelliSpace PACS, in 2 color page printout. Accessed at https://www.healthcare.philips.com/main/products/healthcare_informatics/products/enterprise_imaging_informatics/isite_pacs. Accessed on Feb. 9, 2015.
Philips, IntelliSpace: Multi-modality tumor tracking application versus manual PACS methods, A time study for Response Evaluation Criteria in Solid Tumors (RECIST). 2012, Koninklijke Philips Electronics N.V., in four pages.
RamSoft, RIS PACS Teleradiology, PowerServer PACS, Lite PACS, XU PACS Compare RamSoft PACS Products, 2 color page printout. Accessed at http://www.ramsoft.com/products/powerserver-pacs-overview. Accessed on Feb. 9, 2015.
Rosset et al.: “OsiriX: An Open-Source Software for Navigating in Multidimensional DICOM Images,” Journal of Digital Imaging, Sep. 2004, pp. 205-216.
Sage Intergy PACS | Product Summary. Enhancing Your Workflow by Delivering Web-based Diagnostic Images When and Where You Need Them, in 2 color pages. (IRV-SS-INTPACS-PSS-031309). © 2009 Sage Software Healthcare, Inc. Downloaded from http://www.greenwayhealth.com/solutions/intergy/. Accessed on Feb. 9, 2015.
Schellingerhout, Dawid, MD, et al.: “Coregistration of Head CT Comparison Studies: Assessment of Clinical Utility,” Acad Radiol 2003; 10:242-248, Mar. 2003.
ScImage, Cardiology PACS, in 8 color page printout. Accessed at http://www.scimage.com/solutions/clinical-solutions/cardiology. Accessed on Feb. 9, 2015.
Sectra RIS PACS, in 2 color page printout. Accessed at https://www.sectra.com/medical/diagnostic_imaging/solutions/ris-pacs/. Accessed on Feb. 9, 2015.
Siemens syngo.plaza, Features and Benefits, in 2 color page printout. Accessed at http://www.healthcare.siemens.com/medical-imaging-it/imaging-it-radiology-image-management-pacs/syngoplaza/features. Accessed on Feb. 9, 2015.
Simms | RIS and PACS Medical Imaging Software, in 2 color page printout. Accessed at http://www.mysimms.com/ris-pacs.php. Accessed on Feb. 9, 2015.
Stryker, Imaging—OfficePACS Power Digital Imaging, in one color page printout. Accessed from http://www.stryker.com/emea/Solutions/Imaging/OfficePACSPowerDigitalImaging/index.htm. Accessed on Feb. 9, 2015.
Stryker, OfficePACS Power—Digital Imaging, 8 page color brochure, (MPP-022 Rev 4 BC/MP 300 1/07). © 2007 Stryker. Downloaded from http://www.stryker.com/emea/Solutions/Imaging/OfficePACSPowerDigitalImaging/ssLINK/emea/1557/022268. Accessed on Feb. 9, 2015.
UltraRAD—ultra VISION, 1 page (color). Downloaded from http://www.ultraradcorp.com/pdf/UltraVISION.pdf. Accessed on Feb. 9, 2015.
VioStream for VitreaView, 2 color pages printout. Accessed at http://www.vitalimages.com/solutions/universal-viewing/viostream-for-vitreaview. Accessed on Feb. 9, 2015.
Visage Imaging Visage 7, 3 color page printout. Accessed at http://www.visageimaging.com/visage-7. Accessed on Feb. 9, 2015.
Viztek Radiology PACS Software Viztek Opal-RAD, 4 color page printout. Accessed at http://viztek.net/products/opal-rad. Accessed on Feb. 9, 2015.
Voyager Imaging—Voyager PACS Radiologist Workstation, 2 page color brochure. Downloaded from http://www.intellirad.com.au/assets/Uploads/Voyager-PacsWorkstations.pdf?. Accessed on Feb. 9, 2015.
Voyager Imaging—Voyager PACS, 3 page color brochure. Downloaded from http://www.intellirad.com.au/index.php/assets/Uploads/Voyager-Pacs3.pdf. Accessed on Feb. 9, 2015.
Ivetic, D., and Dragan, D., Medical Image on the Go!, J Med Syst, vol. 35, pp. 499-516, Oct. 2009.
Tahmoush, D. and Samet, H., A New Database for Medical Images and Information, Medical Imaging 2007: PACS and Imaging Informatics, vol. 6516, pp. 1-9, Feb. 2007.
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/631,313 dated Jul. 20, 2018 (3 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/799,657 dated Aug. 15, 2018 (8 pages).
Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/945,448 dated Jul. 16, 2018 (7 pages).
U.S. Appl. No. 15/346,530, Systems and Methods for Matching, Naming, and Displaying Medical Images, filed Nov. 8, 2016.
U.S. Appl. No. 15/292,014, System and Method of Providing Dynamic and Customizable Medical Examination Forms, filed Oct. 12, 2016.
U.S. Appl. No. 15/292,023, Selective Display of Medical Images, filed Oct. 12, 2016.
U.S. Appl. No. 15/188,872, Intelligent Management of Computerized Advanced Processing, filed Jun. 21, 2016.
U.S. Appl. No. 15/188,819, Intelligent Management of Computerized Advanced Processing, filed Jun. 21, 2016.
Office Action dated Jan. 17, 2017, in U.S. Appl. No. 14/540,830.
Office Action dated Dec. 12, 2016, in U.S. Appl. No. 15/254,627.
Corrected Notice of Allowance dated Oct. 21, 2016 in U.S. Appl. No. 14/081,225.
Corrected Notice of Allowance dated Nov. 16, 2016 in U.S. Appl. No. 14/244,431.
Appeal Brief dated Jul. 15, 2016 in U.S. Appl. No. 14/043,165.
Examiner's Answer dated Nov. 14, 2016, in U.S. Appl. No. 14/043,165.
Office Action, dated Jan. 12, 2017 in U.S. Appl. No. 15/292,023.
Restriction Requirement, dated Jul. 28, 2015 in U.S. Appl. No. 14/139,068.
Office Action, dated Mar. 11, 2016 in U.S. Appl. No. 14/139,068.
Notice of Allowance, dated Sep. 21, 2016 in U.S. Appl. No. 14/139,068.
Sprawls, “Image Characteristics and Quality,” Physical Principles of Medical Imaging, http://www.sprawls.org/resources, pp. 1-14.
TeraRecon iNtuition pamphlet in 20 pages, retrieved on Nov. 8, 2013, available at http://int.terarecon.com/wp-content/uploads/2013/11/brochure_english2013.pdf.
TeraRecon iNtuition—Workflow. <www.terarecon.com/wordpress/our-solutions/intuition-workflow> Last accessed Nov. 8, 2013. 2 pages.
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/945,448 dated Jan. 10, 2019 (9 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/631,313 dated May 25, 2018 (10 pages).
Examiner Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/179,384 dated Feb. 18, 2009 (2 pages).
Examiner Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/179,384 dated Sep. 24, 2008 (4 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/540,830 dated Aug. 15, 2017 (11 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/540,830 dated May 15, 2017 (42 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/540,830 dated Jul. 28, 2017 (3 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/540,830 dated Mar. 24, 2017 (3 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,006 dated May 9, 2018 (17 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/346,530 dated Sep. 6, 2018 (14 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/346,530 dated May 17, 2018 (3 pages).
Patent Board Decision from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/942,687 dated Dec. 22, 2017 (13 pages).
Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/942,687 dated Jun. 10, 2011 (3 pages).
Patent Board Decision from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/043,165 dated Dec. 20, 2017 (11 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/043,165 dated Aug. 6, 2018 (11 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/043,165 dated Mar. 19, 2018 (11 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/475,930 dated Sep. 7, 2018 (16 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/475,930 dated Jan. 10, 2018 (11 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/475,930 dated Jun. 1, 2018 (17 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,342 dated Nov. 30, 2017 (11 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,342 dated Jun. 27, 2017 (62 pages).
Examiner-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,342 dated Nov. 30, 2017 (1 page).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,342 dated Dec. 13, 2017 (3 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,281 dated Apr. 2, 2018 (59 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,281 dated Jun. 26, 2017 (51 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,281 dated Sep. 20, 2018 (58 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,281 dated Jan. 11, 2018 (60 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,281 dated Jun. 26, 2018 (3 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,281 dated Oct. 13, 2017 (3 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,296 dated Jan. 22, 2018 (11 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,296 dated Jun. 27, 2017 (58 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,296 dated Oct. 13, 2017 (3 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/188,872 dated Oct. 19, 2018 (12 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/188,819 dated Jul. 3, 2018 (7 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/140,351 dated Jul. 30, 2018 (25 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/140,351 dated Dec. 6, 2018 (21 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/140,348 dated Nov. 19, 2018 (33 pages).
Examiner-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/857,915 dated Jul. 3, 2014 (1 page).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/254,627 dated Jul. 13, 2017 (4 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/254,627 dated Apr. 3, 2017 (11 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/265,979 dated May 13, 2011 (14 pages).
Examiner-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/171,081 dated Sep. 4, 2013 (1 page).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/095,123 dated Mar. 30, 2017 (10 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/870,645 dated Sep. 13, 2011 (8 pages).
Applicant Summary of Interview with Examiner from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/345,606 dated Oct. 21, 2013 (8 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/298,806 dated Apr. 12, 2017 (10 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/944,000 dated Jan. 30, 2017 (12 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/944,000 dated Oct. 5, 2012 (11 pages).
Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/944,000 dated Feb. 4, 2011 (3 pages).
Examiner-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/768,765 dated Aug. 28, 2015 (1 page).
Examiner-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/891,543 dated Nov. 14, 2013 (1 page).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/179,328 dated Dec. 11, 2014 (3 pages).
Examiner-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/163,600 dated Sep. 14, 2016 (1 page).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,023 dated Apr. 11, 2014 (11 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,342 dated Oct. 13, 2017 (3 pages).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/572,397 dated Jun. 29, 2015 (2 pages).
Patent Board Decision from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/437,522 dated Sep. 5, 2017 (12 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/437,522 dated Apr. 23, 2014 (11 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/437,522 dated Sep. 10, 2014 (4 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/437,522 dated Feb. 3, 2011 (16 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/437,522 dated Apr. 20, 2015 (5 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/437,522 dated Oct. 14, 2011 (17 pages).
Examiner's Answer to Appeal Brief from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/437,522 dated Jul. 5, 2016 (18 pages).
Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/437,522 dated Jun. 1, 2011 (3 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/437,522 dated Aug. 11, 2015 (3 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/799,657 dated Feb. 6, 2019 (8 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,014 dated Jan. 24, 2019 (7 pages).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,281 dated Mar. 4, 2019 (8 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,281 dated Jan. 11, 2019 (12 pages).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/945,448 dated Feb. 20, 2019 (2 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/945,448 dated May 6, 2019 (7 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/188,872 dated May 8, 2019 (14 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/346,530 dated May 15, 2019 (8 pages).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/945,448 dated Jul. 15, 2019 (2 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/140,346 dated May 28, 2019 (39 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/140,348 dated Jul. 7, 2019 (21 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/140,363 dated Jun. 3, 2019 (33 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/140,351 dated May 21, 2019 (14 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/346,530 dated Aug. 27, 2019 (7 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/646,756 dated Jul. 16, 2019 (12 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/475,930 dated Apr. 1, 2019 (18 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,014 dated Jul. 11, 2019 (8 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,281 dated Aug. 19, 2019 (11 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,281 dated Apr. 29, 2019 (10 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/188,872 dated Aug. 23, 2019 (10 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/188,819 dated Jan. 25, 2019 (7 pages).
Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/188,819 dated Mar. 15, 2019 (5 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/188,819 dated Aug. 21, 2019 (8 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/799,657 dated May 20, 2019 (8 pages).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/945,448 dated Aug. 28, 2019 (2 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/792,210 dated Jun. 17, 2019 (10 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/631,291 dated Oct. 8, 2019 (12 pages).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/346,530 dated Oct. 9, 2019 (4 pages).
Supplemental Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/188,872 dated Oct. 28, 2019 (5 pages).
Supplemental Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/188,819 dated Oct. 2, 2019 (4 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,014 dated Nov. 15, 2019 (7 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 16/529,378 dated Oct. 22, 2019 (8 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/140,346 dated Oct. 31, 2019 (41 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/140,351 dated Nov. 14, 2019 (14 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/140,348 dated Dec. 4, 2019 (21 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/792,210 dated Oct. 11, 2019 (9 pages).
Related Publications (1)
Number Date Country
20170046870 A1 Feb 2017 US
Provisional Applications (1)
Number Date Country
60625690 Nov 2004 US
Continuations (4)
Number Date Country
Parent 14081225 Nov 2013 US
Child 15292006 US
Parent 13535758 Jun 2012 US
Child 14081225 US
Parent 13079597 Apr 2011 US
Child 13535758 US
Parent 11268262 Nov 2005 US
Child 13079597 US