Systems and methods for viewing medical images

Information

  • Patent Grant
  • Patent Number
    10,782,862
  • Date Filed
    Thursday, August 1, 2019
  • Date Issued
    Tuesday, September 22, 2020
Abstract
For certain medical images, it is important and/or required that a user view all of a medical image at full resolution so that minute, but important, indicia in the medical image are not missed. A computing system monitors the portions of the medical image that are displayed on the display device, notates those portions that have been displayed at full resolution (or according to other user-defined display parameters), and provides the user with information indicating which portions have not been viewed at full resolution and/or which images of a multiple-image examination have been displayed with full pixel display. The process reduces the possibility of missing an abnormality in a medical image because a portion of the image was not viewed at full resolution or according to other user-defined display parameters.
Description
BACKGROUND OF THE INVENTION
Field of the Invention

This invention relates to management and viewing of medical images and, more particularly, to systems and methods of tracking which portions of medical images have been displayed using predetermined display parameters.


Description of the Related Art

Medical imaging is increasingly moving into the digital realm. This includes imaging techniques that were traditionally analog, such as mammography, x-ray imaging, angiography, endoscopy, and pathology, where information can now be acquired directly using digital sensors, or by digitizing information that was acquired in analog form. In addition, many imaging modalities are inherently digital, such as MRI, CT, nuclear medicine, and ultrasound. Increasingly these digital images are viewed, manipulated, and interpreted using computers and related computer equipment. Accordingly, there is a need for improved systems and methods of viewing and manipulating these digital images.


SUMMARY OF THE INVENTION

A pixel is the smallest changeable element of a digital image, where an image comprises a plurality of pixels. For example, a mammographic image may include an array of 4,000 horizontal by 6,000 vertical pixels. For all medical images, and particularly for certain modalities such as mammography, it is important that every image pixel is displayed on a display device and viewed by a viewer or reader, such as a doctor, nurse, or other medical staff member. For example, in the field of mammography, an indication of cancer may be detectable in only a small number of pixels. Accordingly, if this small number of pixels is not viewed by the viewer or reader, a misdiagnosis may be given to the patient. If a mammography image comprises 16 million pixels (e.g., an image resolution of 4,000×4,000), all of the 16 million pixels cannot be simultaneously displayed on a display device with a resolution of 2,048×1,536 (a 3.2 Megapixel display). Thus, only about ⅕ of the 16 million pixels of the mammography image may be displayed simultaneously at full resolution on such a display device. In other words, if one elects to display the image at full resolution, the entire 4,000×4,000 pixel image cannot be simultaneously displayed on a 2,048×1,536 or smaller matrix monitor. If one elects to display the complete area of the image, the only alternative is to display the entire image at a reduced resolution, discarding a fraction of the pixels.
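
The ⅕ figure follows directly from the pixel counts; the short Python sketch below reproduces the arithmetic for this example (the dimensions are those given above, and the variable names are illustrative):

    # Hypothetical dimensions from the example above: a 4,000 x 4,000 pixel
    # image shown on a 2,048 x 1,536 (3.2 megapixel class) display.
    image_pixels = 4_000 * 4_000        # 16 million image pixels
    display_pixels = 2_048 * 1_536      # about 3.1 million display pixels

    fraction_per_view = display_pixels / image_pixels    # about 0.20, i.e. 1/5
    min_views = -(-image_pixels // display_pixels)       # ceiling division -> 6

    print(f"{fraction_per_view:.0%} of the image fits per full-resolution view")
    print(f"at least {min_views} non-overlapping views to cover every pixel")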


Because an entire medical image cannot be concurrently viewed at full resolution on a typical display device, software applications currently allow viewing of portions of medical images at full resolution on the display device. In some embodiments, a user may be required to adjust the portion of the medical image that is displayed at full resolution on the display device in an attempt to view all of the image pixels. For example, the viewer may select up to hundreds of portions of the image for sequential viewing at full resolution before the entire image has been viewed at full resolution. As those of skill in the art will appreciate, manually tracking which portions of an image have been viewed at full resolution is cumbersome and may not allow the viewer to accurately determine when all relevant portions of the image have been viewed at full resolution. Currently, there are no systems or methods for automatically tracking the portions of a medical image that have been displayed at full resolution, or for indicating those images for which all pixels have been presented on a display device at full resolution. Accordingly, portions of medical images may not be viewed at full resolution and important indicia in the medical image may be overlooked. Thus, systems and methods for tracking portions of a medical image that have been viewed at full resolution are desired. Furthermore, systems and methods for allowing a viewer of the medical image to visually distinguish those portions that have not been viewed at full resolution are desired.


In one embodiment, the invention comprises a method of viewing medical images on a display device coupled to a computing system, wherein the display device is configured to concurrently display N pixels of an image to a user. In one embodiment, the method comprises (a) receiving an image at the computing system, wherein the image comprises M pixels, wherein M is greater than N; (b) displaying on the display device a portion of the image comprising N pixels, wherein the image portion is displayed at full resolution; (c) determining whether each of the M pixels of the image has been displayed on the display device; and (d) in response to determining that not all of the M pixels of the image have been displayed on the display device, returning to step (b).
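
The following is a minimal Python sketch of steps (a) through (d), assuming the image is held as an array with a boolean coverage mask, and assuming hypothetical select_portion and show callbacks supplied by the viewer application (neither name comes from the patent):

    import numpy as np

    def view_until_complete(image, viewport_shape, select_portion, show):
        """Loop over steps (b)-(d): display N-pixel portions at full
        resolution until all M pixels have been displayed at least once."""
        displayed = np.zeros(image.shape[:2], dtype=bool)  # one flag per pixel
        h, w = viewport_shape
        while not displayed.all():                         # steps (c) and (d)
            r, c = select_portion(displayed)               # choose next portion
            show(image[r:r + h, c:c + w])                  # step (b): full resolution
            displayed[r:r + h, c:c + w] = True             # record coverage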


In another embodiment, the invention comprises a method of viewing a mammographic image in a viewing pane depicted on a display device, wherein the viewing pane is configured to display a predetermined number of pixels. In one embodiment, the method comprises displaying the mammographic image at a reduced resolution in the viewing pane, displaying a portion of the mammographic image at full resolution in the viewing pane, and displaying the mammographic image at the reduced resolution in the viewing pane, wherein a portion of the reduced resolution image that corresponds with the portion of the mammographic image that was displayed at full resolution is visually distinguishable from the remaining portion of the reduced resolution image.


In another embodiment, the invention comprises a computing system for viewing a mammographic image. In one embodiment, the system comprises a display device depicting a viewing pane, means for displaying a portion of the mammographic image at full resolution in the viewing pane, and means for displaying the entire mammographic image at a reduced resolution in the viewing pane, wherein the portion of the reduced resolution mammographic image corresponding to the portion displayed at full resolution is visually distinguishable from the other portions of the reduced resolution mammographic image.


In another embodiment, the invention comprises a computing system for viewing a medical image. In one embodiment, the system comprises a display device having a predetermined number of pixels, an input interface configured to receive the medical image, an application module comprising software for initiating display of the medical image on the display device, and a processing unit configured to execute the software, wherein, in a first mode, the software initiates display of the entire medical image at a reduced resolution on the display device, in a second mode, the software initiates display of a portion of the medical image at full resolution on the display device, and, in a third mode, the software initiates display of the entire medical image at the reduced resolution on the display device, wherein a portion of the reduced resolution medical image corresponding to the portion of the medical image displayed at full resolution is visually distinguishable from the remaining portions of the reduced resolution medical image.
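
One way to picture this three-mode arrangement is as a small state machine. The sketch below is illustrative only: the subsampling step and the color inversion used to mark viewed regions are assumptions chosen for the example, not requirements of the embodiment:

    from enum import Enum, auto

    import numpy as np

    class Mode(Enum):
        OVERVIEW = auto()    # first mode: entire image, reduced resolution
        DETAIL = auto()      # second mode: one portion, full resolution
        REVIEW = auto()      # third mode: overview with viewed areas marked

    def render(image, mode, viewed, portion=None, step=5):
        if mode is Mode.OVERVIEW:
            return image[::step, ::step]             # every nth pixel only
        if mode is Mode.DETAIL:
            (r0, r1), (c0, c1) = portion
            return image[r0:r1, c0:c1]               # all pixels of the portion
        overview = image[::step, ::step].copy()      # Mode.REVIEW
        mask = viewed[::step, ::step]
        overview[mask] = image.max() - overview[mask]  # invert viewed regions
        return overview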


In another embodiment, the invention comprises a method of viewing medical images. In one embodiment, the method comprises selectively viewing portions of a high resolution image and verifying that the entire high resolution image has been viewed.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of an exemplary computing system in communication with a network and various networked devices.



FIG. 2 is a flowchart illustrating a method of tracking which portions of a medical image have been displayed at full resolution.



FIG. 3 is a mammographic image, wherein the entire medical image is displayed at a reduced resolution.



FIG. 4 is the mammographic image of FIG. 3, wherein a portion of the mammographic image is displayed at full resolution.



FIG. 5 is the mammographic image of FIG. 3, wherein portions of the image that have been displayed at full resolution are color inverted.



FIG. 6 is an exemplary mammographic image, wherein a portion of the image has been selected for display at full resolution and another portion of the image has already been displayed at full resolution.



FIG. 7 is a portion of a mammographic image, previously selected for display at full resolution, displayed at full resolution in the viewing pane.





DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION

Embodiments of the invention will now be described with reference to the accompanying figures, wherein like numerals refer to like elements throughout. The terminology used in the description presented herein is not intended to be interpreted in any limited or restrictive manner, simply because it is being utilized in conjunction with a detailed description of certain specific embodiments of the invention. Furthermore, embodiments of the invention may include several novel features, no single one of which is solely responsible for its desirable attributes or which is essential to practicing the inventions herein described.



FIG. 1 is a block diagram of an exemplary computing system 100 in communication with a network 160 and various network devices. The computing system 100 may be used to implement certain systems and methods described herein. The functionality provided for in the components and modules of computing system 100 may be combined into fewer components and modules or further separated into additional components and modules.


The computing system 100 includes, for example, a personal computer that is IBM, Macintosh, or Linux/Unix compatible. In one embodiment, the exemplary computing system 100 includes a central processing unit (“CPU”) 105, which may include a conventional microprocessor, and an application module 145 that comprises one or more applications that may be executed by the CPU 105. The application module 145 may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.


The computing system 100 further includes a memory 130, such as random access memory (“RAM”) for temporary storage of information and a read only memory (“ROM”) for permanent storage of information, and a mass storage device 120, such as a hard drive, diskette, or optical media storage device. Typically, the modules of the computing system 100 are connected to one another using a standards-based bus system. In different embodiments of the present invention, the standards-based bus system could be Peripheral Component Interconnect (PCI), Microchannel, SCSI, Industrial Standard Architecture (ISA), or Extended ISA (EISA) architectures, for example.


The computing system 100 is generally controlled and coordinated by operating system software, such as the Windows 95, 98, NT, 2000, XP or other compatible operating systems. In Macintosh systems, the operating system may be any available operating system, such as MAC OS X. In other embodiments, the computing system 100 may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide a user interface, such as a graphical user interface (“GUI”), among other things.


The exemplary computing system 100 includes one or more commonly available input/output (I/O) devices and interfaces 110, such as a keyboard, mouse, touchpad, and printer. In one embodiment, the I/O devices and interfaces 110 include one or more display devices, such as a monitor, that allow the visual presentation of data to a user. More particularly, display devices provide for the presentation of GUIs, application software data, and multimedia presentations, for example. In one embodiment, a GUI includes one or more display panes in which medical images may be displayed. According to the systems and methods described below, medical images may be stored on the computing system 100 or another device that is local or remote, displayed on a display device, and manipulated by the application module 145. The computing system 100 may also include one or more multimedia devices 140, such as speakers, video cards, graphics accelerators, and microphones, for example.


In the embodiment of FIG. 1, the I/O devices and interfaces 110 provide a communication interface to various external devices. In the embodiment of FIG. 1, the computing system 100 is coupled to a network 160, such as a LAN, WAN, or the Internet, for example, via a communication link 115. The network 160 may be coupled to various computing devices and/or other electronic devices. In the exemplary embodiment of FIG. 1, the network 160 is coupled to imaging devices 170, an image server 180, and a medical facility 190. In addition to the devices that are illustrated in FIG. 1, the network 160 may communicate with other computing, imaging, and storage devices.


The imaging devices 170 may be any type of device that is capable of acquiring medical images, such as MRI, x-ray, mammography, or CT scan systems. The image server 180 includes a data store 182 that is configured to store images and data associated with images. In one embodiment, the imaging devices 170 communicate with the image server 180 via the network 160 and image information is transmitted to the image server 180 and stored in the data store 182. In one embodiment, the image data is stored in Digital Imaging and Communications in Medicine (“DICOM”) format. The complete DICOM specifications may be found on the National Electrical Manufacturers Association website at <medical.nema.org>. Also, NEMA PS 3—Digital Imaging and Communications in Medicine, 2004 ed., Global Engineering Documents, Englewood Colo., 2004, provides an overview of the DICOM standard. Each of the above-cited references is hereby incorporated by reference in its entirety. In one embodiment, the data store 182 also stores the user-defined display parameters associated with one or more of the images stored on the data store 182. As discussed in further detail below, the user-defined display parameters may vary depending on the type of image, area imaged, clinical indication, source of image, display device, user, or other factors. Accordingly, any type of user-defined display parameter is expressly contemplated for use in conjunction with the systems and methods described herein.


The exemplary image server 180 is configured to store images from multiple sources and in multiple formats. For example, the image server 180 may be configured to receive medical images in the DICOM format from multiple sources, store these images in the data store 182, and selectively transmit medical images to requesting computing devices.


The medical facility 190 may be a hospital, clinic, doctor's office, or any other medical facility. The medical facility 190 may include one or more imaging devices and may share medical images with the image server 180 or other authorized computing devices. In one embodiment, multiple computing systems, such as the computing system 100, may be housed at a medical facility, such as the medical facility 190.


Definition of Terms

Below are definitions of certain terms used herein.


“Medical image” is defined to include an image of an organism. It may include but is not limited to a radiograph, computed tomography (CT), magnetic resonance imaging (MRI), ultrasound (US), mammogram, positron emission tomography scan (PET), nuclear scan (NM), pathology, endoscopy, ophthalmology, or many other types of medical images. While this description is directed to viewing and tracking of medical images, the methods and systems described herein may also be used in conjunction with non-medical images, such as images of circuit boards, airplane wings, and satellite images, for example.


“Modality” is defined as a medical imaging device (a patient who undergoes an MRI is said to have been examined with the MRI modality).


“Patient” refers to an individual who undergoes a medical imaging examination.


“Viewing” is defined to include the process of visually observing one or more medical images associated with exams.


“Viewer” is defined as any person who views a medical image.


“Reading” is defined to include the process of visually observing one or more medical images for the purpose of creating a professional medical report, also called an interpretation. When reading is complete, an exam may be labeled “read,” indicating that the medical professional has completed observation of the one or more medical images for purposes of creating a medical report.


“Reader” is defined to include one who is authorized to perform the reading process.


“User” is defined to include any person that is a viewer and/or a reader.


“Display parameters” are defined to include methods of display of an image or exam. For example, an image or exam may be displayed with a certain pixel window level or width (similar to brightness and contrast), in color, based on a certain color map, opacity map, or other display parameters.


“Full pixel display” is defined to include display on a monitor or other display system of every pixel of a medical image.


“Full Resolution” is defined to include the concurrent display of all pixels of a medical image portion on a display device.


“Reduced Resolution” is defined to include display of less than all of the pixels of a medical image portion on a display device.


“User-defined display parameter” refers to rules, established by a user and stored in a database, that set criteria for image display that is considered adequate. For example, a user-defined display parameter might store a rule that triggers certain warnings or displays if all pixels are not displayed or alternatively if at least half the pixels are not displayed, or alternatively, if some determined fraction of the pixels are not viewed with a certain display method (such as image window, level, brightness, contrast, opacity, color look-up table, or other parameters). User-defined display parameters may also refer to other image processing functions, such as edge enhancement and automated image analysis functions, e.g., computer-aided detection (CAD) techniques.
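
As one illustration of how such rules might be stored, the hedged sketch below models a user-defined display parameter as a database-style record that can be linked to a user or modality; every field name is hypothetical:

    from dataclasses import dataclass, field

    @dataclass
    class DisplayRule:
        """Hypothetical database record for a user-defined display parameter."""
        user: str = "*"                  # or a user type, e.g. "radiologist"
        modality: str = "*"              # e.g. "MG" for mammography
        min_fraction_viewed: float = 1.0           # 1.0 means every pixel
        required_windows: list = field(default_factory=list)  # e.g. ["lung", "bone"]

        def satisfied(self, fraction_viewed, windows_used):
            return (fraction_viewed >= self.min_fraction_viewed
                    and set(self.required_windows) <= set(windows_used))

    # Example: a mammography rule requiring full pixel display.
    rule = DisplayRule(modality="MG", min_fraction_viewed=1.0)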



FIG. 2 is a flowchart illustrating a method of tracking which pixels of a medical image have been displayed according to user-defined display parameters. In one embodiment, the method described with respect to FIG. 2 is performed by a computing system 100, a medical facility 190, or an image server 180, for example. For ease of description, the method will be discussed below with reference to a computing system 100 performing the method. Depending on the embodiment, certain of the blocks described below may be removed, others may be added, and the sequence of the blocks may be altered.


In one embodiment, the user-defined display parameters specify that an entire medical image must be viewed at full resolution before a reader may mark the image as read. However, the user-defined display parameters may have different requirements, such as requiring that at least a defined portion of the pixels are displayed at full resolution and/or a defined portion of the pixels are viewed with a certain display method, for example. In another embodiment, the user-defined display parameters may specify that the medical image is viewed at a resolution that is less than full resolution. In other embodiments, the user-defined display parameters may specify additional display settings that must be satisfied in order to allow the reader to mark the image as read. For example, the display parameters may be set to require that every nth pixel is displayed. Thus, various user-defined display parameters may be established on a user, modality, or facility basis, for example. In one embodiment, such as when viewing CT images, the display parameters specify that the CT images must be viewed using a specified series of display parameters, such as lung windows, bone windows, and/or other types of windows, for example. In this embodiment, if the user forgets to view the images separately using all the required display parameters, the CT images may be misinterpreted. For ease of description, the following description refers to user-defined display parameters specifying that every pixel of the image is displayed at full resolution before it may be marked as read. However, the methods described herein are not limited to these display parameters and application of these methods using other user-defined display parameters is expressly contemplated. Any reference to tracking pixels at full resolution should be interpreted to cover similar systems and methods for monitoring and/or tracking of any other user-defined display parameter or combination of display parameters.


In some embodiments, it may be important and/or required that a user view all of a medical image at full resolution. Thus, a user may be required to adjust the portion of the medical image that is displayed at full resolution on the display device in an attempt to view all of the image pixels. However, currently there are no systems or methods for automatically tracking the portions of a medical image that have already been displayed at full resolution, or for indicating those images for which all pixels have been presented on a display device at full resolution (“full pixel display”). Accordingly, there is a need for image viewing devices and computing systems that monitor the portions of the medical image that are displayed on the display device, notate those portions that have been displayed at full resolution (or other user-defined display parameters), and provide the user with information indicating portions that have not been viewed at full resolution and/or provide information indicating for which images of a multiple image examination full pixel display has been accomplished. These processes are referred to herein as “visual pixel tracking.”


In one embodiment, visual pixel tracking applies to the display of an individual image. In another embodiment, visual pixel tracking applies to a volume rendering, wherein after 3-D slicing, the computing system 100 indicates which pixels in a volume have not been displayed at full resolution or meeting other user-defined display parameters. FIG. 2, described in detail below, is a flowchart illustrating an exemplary method of tracking which pixels of a medical image are displayed according to user-defined display parameters. When volume rendering is performed, the method of FIG. 2 may be applied to each slice of the imaging volume. The user interface may provide a real time status of viewing each of the slices at full resolution.


In one embodiment, the computing system 100 is configured to determine a portion of the medical image on which visual pixel tracking is to be applied. Many medical images comprise areas surrounding the area of interest that are not important for a user, such as a doctor, to view and mark as read. For example, a medical image of a breast typically includes areas, such as air around the breast, that are irrelevant to the analysis by the user. Accordingly, viewing of these irrelevant portions of the image according to the user-defined display parameters is not necessary. In one embodiment, the computing system 100 analyzes the medical image and determines those areas that are irrelevant to the user's analysis. These irrelevant areas are then excluded from the user-defined display parameters and a viewer may mark an image as read without viewing the irrelevant areas at full resolution, for example. In another embodiment, the user may define the irrelevant areas of an image prior to viewing portions of the image at full resolution. For example, the user may use the keyboard, mouse, or other input device, to select areas surrounding the area of interest that do not require viewing according to the user-defined display parameters. In yet another embodiment, the user may determine that the relevant portions of an image have been viewed according to the display parameters, without the need to pre-select portions that should be viewed according to the display parameters. By providing for automatic and/or manual selection of irrelevant portions of a medical image, the viewer is not required to display those irrelevant portions of the medical image according to the user-defined display parameters, such as full resolution.
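
A simple way to approximate the automatic variant is to treat near-background pixels, such as the air around a breast, as irrelevant. In the sketch below, the intensity threshold is an illustrative assumption rather than a value taken from the patent:

    import numpy as np

    def relevant_mask(image, air_threshold=10):
        """Return a boolean mask that is True for pixels the user must view.

        Pixels at or below `air_threshold` (a hypothetical intensity cutoff)
        are treated as air and excluded from the viewing requirement.
        """
        return image > air_threshold

    # Coverage is then judged only over relevant pixels:
    # done = (displayed | ~relevant_mask(image)).all()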


In a block 210, one or more medical images are received from an image source. Although the process described below is directed to processing one image, it is possible to use the process in conjunction with multiple images. The image source may comprise one or more of the imaging devices 170, the image server 180, the medical facility 190, or any other device that is capable of transmitting medical images. The medical image may be received via the network 160, or by other means, such as transferred on a floppy disk or CD-ROM. For ease of description, in the description that follows the exemplary computing system 100 will be the device that receives and displays the medical image. However, other computing devices may perform the methods described herein. The received medical image comprises more pixels than the display device can concurrently display and, thus, the entire image may not be concurrently displayed at full resolution.


Continuing to a block 220, a portion of the medical image is selected for display on the display device. As discussed above, many medical images contain more pixels than are capable of being displayed concurrently on a display device. Accordingly, the user of the medical image may select a portion of the image to display at full resolution on the display device. Alternatively, the computing system 100 may automatically determine a portion of the image to display on the display device. For example, in one embodiment the computing system 100 may initially display a top-left portion of the received medical image and then proceed to display adjacent portions of the image in response to an input from the user.


Moving to a block 230, the portion of the image that was selected for display at full resolution is displayed on the display device. More particularly, all of the image pixels for the selected portion of the medical image are concurrently displayed on the display device. In one embodiment, depending on the resolution of the medical image and the resolution of the display device, about 1-25% of the image may be concurrently displayed at full resolution on the display device.


Continuing to a block 240, an indication of the portion of the image that is displayed at full resolution is recorded. For example, if ⅛ of the total pixels of an image are displayed at full resolution, an indication of these pixels is recorded, such as by storing pixel information in a memory 130 of the computing system 100. Alternatively, the information regarding displayed pixels may be stored on a central server, such as the image server 180, which may then be accessible to other medical facilities and imaging devices.


In a decision block 250, the computing system 100 determines if another portion of the image has been selected for display at full resolution. In one embodiment, the user is presented with a reduced resolution representation of the medical image and is allowed to select another portion of the image for display at full resolution. Selection of an image portion may be accomplished by pressing certain keys on a keyboard, such as the arrow keys, for example. In another embodiment, the user may change the selected portion for viewing by moving a mouse, or other input device. For example, a pan tool may be invoked by the user, allowing the user to adjust the portion of the image displayed at full resolution so that areas of the image that are above, below, or to the sides of the currently displayed portion are displayed at full resolution. In another embodiment, the computing system 100 may be configured to periodically update the display with a portion of the image that has not yet been displayed at full resolution, or update the display in response to an input from the user.
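
A pan operation of this kind amounts to shifting the viewport origin and clamping it to the image bounds. A minimal sketch, with hypothetical argument names:

    def pan(origin, delta, image_shape, viewport_shape):
        """Shift the full-resolution viewport by `delta` (rows, cols),
        keeping it inside the image."""
        r, c = origin
        dr, dc = delta
        h, w = viewport_shape
        ih, iw = image_shape
        r = min(max(r + dr, 0), ih - h)   # clamp vertically
        c = min(max(c + dc, 0), iw - w)   # clamp horizontally
        return r, c

    # e.g. an arrow-key step of 256 pixels to the right:
    # origin = pan(origin, (0, 256), image.shape, (1536, 2048))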


If in the decision block 250, the computing device 100 determines that instructions have been received to display another portion of the image on the display device, at a block 280 the requested portion of the image is selected for display at full resolution. In one embodiment, such as when a panning tool is used, the selected portion comprises much of the currently displayed portion of the image. In another embodiment, the selected portion comprises a portion of the image that is entirely different than the portion of the image that is currently displayed.


Moving to a block 290, one or more display characteristics of the selected portion of the image that is displayed at full resolution is altered. Thus, when the entire image is displayed on the display device at a reduced resolution, those portions of the image that have not been displayed at full resolution can be identified. These portions of the image may then be selected for display at full resolution.


In one embodiment, the adjustment of a display characteristic comprises changing a color of the image portion. In another embodiment, other indicators, such as a line surrounding those image portions already displayed at full resolution, may be used to discriminate between portions of the image that have been displayed at full resolution and portions that have not been displayed at full resolution. Accordingly, when the entire image is viewed at a reduced resolution, such as by displaying only every nth image pixel, where n is at least the ratio of image pixels to display pixels, areas of the image that have not been viewed at full resolution are distinguished from those that have been viewed at full resolution. Based on the distinguishing display characteristic, the user may select for display a portion of the image that has not yet been displayed at full resolution. In one embodiment, coloring of the viewed pixels may be toggled on and off by the user. In another embodiment, a text message, icon, or other indication, may be displayed at the bottom of the display, for example, indicating that the image has been viewed according to the user-defined display parameters. In yet another embodiment, the outside margins of the viewing pane may change color or the system could beep or provide some other audible feedback when the image has been displayed according to the user-defined display parameters.
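
For the every-nth-pixel overview to fit on the display, n must be at least the per-axis ratio of image pixels to display pixels. A short sketch of that calculation, using the example dimensions from above:

    import math

    def min_subsample_step(image_wh, display_wh):
        """Smallest integer n such that keeping every nth pixel in each
        dimension yields an overview that fits the display."""
        iw, ih = image_wh
        dw, dh = display_wh
        return max(math.ceil(iw / dw), math.ceil(ih / dh))

    min_subsample_step((4000, 4000), (2048, 1536))   # -> 3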


Moving from block 290, the selected portion of the image is displayed at full resolution in block 230, and the method continues to block 240 and block 250. Accordingly, blocks 230, 240, 250, 280, and 290 may be repeated multiple times in the process of selecting and displaying portions of an image at full resolution.


Referring again to the decision block 250, if the computing device 100 determines that instructions to display another portion of the image at full resolution have not been received, the method continues to a decision block 260, wherein the computing device 100 determines whether all of the image has been displayed at full resolution. If it is determined that not all of the image has been displayed at full resolution, the method continues to a block 295, wherein an indication is provided to the user that not all of the image has been viewed at full resolution. If, however, in the decision block 260, the computing device 100 determines that the entire image has been displayed at full resolution, the method continues to a block 270, wherein an indication is provided to the user that the entire image has been displayed at full resolution.
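
Decision blocks 250 through 295 reduce to a coverage test over the tracking record. A hedged sketch, reusing the boolean masks from the earlier examples (the relevant mask covers the case, described above, where irrelevant areas are excluded):

    def completion_message(displayed, relevant):
        """Blocks 260/270/295: report full-resolution viewing status,
        counting only pixels marked relevant."""
        remaining = (relevant & ~displayed).sum()
        if remaining:
            return f"{remaining} relevant pixels not yet viewed at full resolution"
        return "entire image has been displayed at full resolution"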


As noted above, the flowchart of FIG. 2 illustrates an exemplary process of tracking pixels viewed by a user according to exemplary user-defined display parameters. In particular, the user-defined display parameters in the example of FIG. 2 specify that the entire image is viewed at full resolution. However, in other embodiments the user-defined display parameters may require that, for example, only a portion of the image is displayed at full resolution, or at any other predetermined reduced resolution. For example, many images contain non-rectangular areas of interest. The portions outside of the areas of interest, such as a breast in a mammography image, may include air, other body portions, or imaging equipment, for example. Those of skill in the art will recognize that it is not important to analyze every pixel of the air surrounding an area of interest. Accordingly, in one embodiment, the user or the software may select portions of the image that must be displayed according to the user-defined display parameters. In another embodiment, the display parameters may specify that the viewer determines when the image has been viewed according to the user-defined display parameters. In this embodiment, the system may track the viewed pixels of the image, present the viewer with a view of the image that distinguishes portions of the image that have not been viewed at full resolution, or according to any other user-defined display parameters, and the viewer can determine whether the image can be marked as read.


In one embodiment, the user can establish user-defined display parameters and store those parameters in a database. For example, the user may establish a rule linked to the individual user, user type, exam type, modality, system or other links that triggers the above described automatic warnings and/or visual pixel tracking if a user-defined fraction of the pixels are displayed. The user may, alternatively or additionally, establish other rules linked to the individual user, user type, exam type, modality, and/or system that trigger the above-described automatic warnings and/or visual pixel tracking if an image is not viewed using one or more specified display parameters or combination of display parameters. For example, the computing system 100 may be configured to automatically direct the user to any pixels or images that have not been displayed with specific display parameters.


In another embodiment, rules may be generated to automatically designate when the pixel tracking process should be turned on and off. For example, rules may designate that visual pixel tracking applies to only certain viewers or users. In one embodiment, one type of display parameters can apply to one modality and another set of display parameters can apply to another modality.


In one embodiment, the user is not able to notate an image as being read, or completely viewed, until the entire image has been displayed at full resolution. Accordingly, in the exemplary method of FIG. 2, if not all of the image has been displayed at full resolution, the method indicates that the entire image has not been viewed at full resolution in block 295, and the method returns to block 250, wherein another portion of the image may be selected for viewing.


In one embodiment, the computing system 100 automatically displays portions of the image that have not yet been displayed at full resolution, and/or that have not been displayed such that user-defined display parameters have been met. For example, a user interface may include visual indications as to which portions of an image include pixels that have not been displayed, or which pixels have not been displayed using a user-defined display parameter, such as a specified window or level setting. In one embodiment, the computing system 100 automatically displays a message indicating which one or more of several images has not been displayed with full pixel display and/or meeting user-defined display parameter criteria. In another embodiment, the computing system 100 automatically directs the user to any image portions, or images, that have not been displayed at full resolution and/or meeting user-defined display parameter criteria.



FIG. 3 is an exemplary graphical user interface (GUI) 300 including a menu bar 320, multiple toolbars 310, and an image viewing pane 330. In the example of FIG. 3, a mammographic image is displayed in the viewing pane 330, wherein the entire image is displayed at a reduced resolution. For example, if the mammography image comprises 16 million pixels (e.g., an image resolution of 4,000×4,000), all of the 16 million pixels cannot be simultaneously displayed on a display device with a resolution of 2,048×1,536 (a 3.2 Megapixel display). Thus, only about ⅕ of the 16 million pixels of the mammography image may be displayed simultaneously at full resolution on such a display device. Accordingly, in order to view the entire mammographic image on a display device, the number of pixels in the image is reduced by removing about 4 of every 5 pixels. Those of skill in the art will recognize that there are many systems and methods for reducing the resolution of a digital image. These systems and methods are contemplated for use in generating a reduced resolution image, such as the image displayed in the viewing pane 330 of FIG. 3.


As noted above, because many medical images, such as mammography images, for example, may include features or abnormalities that are detectable in only 1, or a few, pixels, a viewer of the image 340 may not be able to detect all features and abnormalities. Importantly, the viewer of a mammographic image displayed at less than full resolution, such as image 340, may not be able to detect an abnormality related to breast cancer. For example, the area 350 shown in FIG. 3 may include many times more pixels than are displayed in the area 350.



FIGS. 4 and 5 illustrate a method of sequentially viewing portions of a medical image at full resolution and related methods of tracking and indicating the portions of the image that have been viewed at full resolution. In the embodiment of FIGS. 4 and 5, the reduced resolution image remains in the viewing pane 330 while a portion of the image is also displayed at full resolution within the viewing pane 330.



FIG. 4 is the GUI 300 with the mammographic image 340 (FIG. 3) displayed in the viewing pane 330 at a reduced resolution, where a window 410 displays a portion of the image at full resolution. In one embodiment, the portion of the mammographic image that is displayed at full resolution in the window 410 is selected by the user, such as by allowing the user to move a cursor, or other icon, over portions of the mammographic image. In this embodiment, when the cursor is over an area of the mammographic image, the window 410 displays at full resolution the portion of the mammographic image currently covered by the cursor. Thus, in the example of FIG. 4, the window 410 displays all of the pixels of a portion of the reduced resolution image 340.


In one embodiment, the portion of the image displayed at full resolution in the window 410 is updated as the user moves the cursor across the reduced resolution mammographic image 340. Thus, the user may determine the order in which portions of the reduced resolution image are viewed at full resolution and may control the speed at which portions of the image are viewed at full resolution.
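
Mapping the cursor position on the reduced-resolution image to the full-resolution window 410 is a matter of scaling coordinates. The sketch below assumes the overview was produced by the every-nth-pixel subsampling described earlier; the function and argument names are illustrative:

    def magnifier_tile(image, cursor_rc, step, window_shape):
        """Return the full-resolution tile centered on the image location
        under the cursor, given the overview's subsampling step."""
        cr, cc = cursor_rc                    # cursor position on the overview
        h, w = window_shape
        ih, iw = image.shape[:2]
        r = min(max(cr * step - h // 2, 0), ih - h)   # center, then clamp
        c = min(max(cc * step - w // 2, 0), iw - w)
        return image[r:r + h, c:c + w]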



FIG. 5 is the GUI 300 with the reduced resolution mammographic image 340 (FIG. 3) displayed in the viewing pane 330, wherein portions of the image that have been displayed at full resolution are distinguished from portions of the image that have not yet been displayed at full resolution. In the embodiment of FIG. 5, for example, the portions of the image that have already been viewed at full resolution, such as via the window 410 (FIG. 4), are color inverted from those portions that have not been viewed at full resolution. In FIG. 5, the non-inverted image portion 510 indicates that this portion has been displayed at full resolution. The inverted image portion 520, which substantially surrounds the non-inverted image portion 510, indicates that portion 520 has not been viewed at full resolution. Accordingly, if the viewer is required to view all of the image at full resolution, the portion 520 would need to be viewed at full resolution.


In another embodiment, the portions of the image that have been viewed at full resolution are distinguished from those portions that have not been viewed at full resolution in other manners. For example, in one embodiment a border may be displayed around those portions of the image 340 that have been displayed at full resolution. In another embodiment, the portion that has been viewed at full resolution is color-inverted. In another embodiment, the coloring of the portion that has been viewed at full resolution is adjusted, such as by adding a yellow tint to that portion.



FIGS. 6 and 7 illustrate another method of sequentially viewing portions of a reduced resolution medical image at full resolution and related methods of tracking and indicating the portions of the image that have been viewed at full resolution.



FIG. 6 is the GUI 300 with the reduced resolution mammographic image 640 displayed in the viewing pane 330, wherein a portion of the image has already been displayed at full resolution and another portion of the image is selected for display at full resolution. More particularly, the inverted portion 610 of the image (shaped generally as two overlapping rectangles), indicates that portion 610 has already been displayed at full resolution. As described above, other methods of indicating portions of an image that have been viewed at full resolution may be used also.


In one embodiment, a portion of the mammographic image 640 is selected for display at full resolution. The portion that is selected is sized so that the selected portion may be displayed at full resolution in the viewing pane 330. In FIG. 6, for example, the selection window 620 has approximately the same vertical to horizontal proportions as the viewing pane 330. In addition, the selection window 620 is sized so that the portion of the image covered by the selection window 620 substantially fills the viewing pane 330 when displayed at full resolution.
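
Because the overview is drawn at a 1/n scale, a selection window measuring pane_width/n by pane_height/n on screen covers exactly one pane's worth of image pixels. A small sketch of that sizing, under the same subsampling assumption as before:

    def selection_window_size(pane_wh, step):
        """On-screen size of the selection rectangle over the reduced image
        such that the covered region fills the pane at full resolution."""
        pw, ph = pane_wh
        return pw // step, ph // step

    # Example: a 2,048 x 1,536 pane over an overview subsampled by n = 3
    # gives a selection window of about 682 x 512 screen pixels.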


In one embodiment, the viewer may move the selection window 620 to any portion of the viewing pane 330, such as by moving a mouse or pressing designated keys on a keyboard. Once the selection window 620 is over a portion of the image that the user would like to view at full resolution, the user selects the portion by pressing a button on the mouse or pressing a designated key on the keyboard, for example. In one embodiment, after selecting a portion for viewing at full resolution, the viewing pane 330 is updated to display the selected portion at full resolution in the viewing pane 330. After the selected portion has been displayed at full resolution for a predetermined period of time, or after the user provides an input, the viewing pane 330 is updated with the reduced resolution image 640, including an indication of the portion of the image that was viewed at full resolution. Alternatively, the viewing pane 330 may be sequentially filled with full resolution portions of the image without returning to the reduced resolution image.



FIG. 7 is the GUI 300 with a portion of a mammographic image displayed at full resolution in the viewing pane 330. As noted above, many medical images, such as mammographic images, are taken at resolutions that are higher than the resolutions of typical display devices and, thus, all pixels of these medical images may not be concurrently displayed. As illustrated in FIG. 7, a portion of a mammographic image is displayed at full resolution 710 in the viewing pane 330 of the GUI 300. The portion of the image displayed at full resolution in the viewing pane may be selected by the user (such as is described with reference to FIG. 6) or may be selected by the computing device according to predetermined display criteria.


In another embodiment, a viewing pane includes two panes, where a first pane, or selection pane, displays the image at a reduced resolution and a second pane, or display pane, displays at least a portion of the image at full resolution, or other user-defined display parameters. For example, a single display device could concurrently display an image such as the mammographic image 640 in the selection pane and a selected portion of the mammographic image 640 may be viewed in the display pane at full resolution, such as the full resolution image portion 710. In one embodiment, the selection pane may be sized to correspond to the size and shape that will accommodate a full resolution display pane. In one embodiment, the user may be provided with at least two methods of selecting portions of the reduced resolution image for viewing in the display pane. In particular, the user may move a selection window, such as the selection window 620 (FIG. 6), in the selection pane and the corresponding image area may be updated in the display pane. Alternatively, the user may use a pan function in the display pane and the position of the selection window in the selection pane is dynamically updated. In either case, areas that have been viewed at full resolution, or according to other user-defined display parameters, are dynamically adjusted so that they may be distinguished from the remaining portions. In an embodiment incorporating a selection and display pane, the relative sizes of the panes may be adjusted by the user or, alternatively, may be automatically adjusted by the software according to predetermined criteria.
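
Keeping the two panes synchronized amounts to treating the selection-window origin and the display-pane pan offset as the same quantity expressed at two scales. The class below is a speculative sketch of that bookkeeping (bounds clamping is omitted for brevity):

    class TwoPaneView:
        """Selection pane at 1/step scale, display pane at full resolution;
        moving either one updates the shared full-resolution origin."""

        def __init__(self, step):
            self.step = step
            self.origin = (0, 0)        # full-resolution (row, col)

        def move_selection(self, overview_rc):
            # Selection window moved on the reduced image: scale up.
            r, c = overview_rc
            self.origin = (r * self.step, c * self.step)

        def pan_display(self, delta_rc):
            # Pan in the display pane: the selection window follows
            # automatically at 1/step scale.
            dr, dc = delta_rc
            r, c = self.origin
            self.origin = (r + dr, c + dc)

        def selection_position(self):
            r, c = self.origin
            return r // self.step, c // self.step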


In yet another embodiment, the selection pane may include multiple panes that each display a different image at a reduced resolution. For example, the multiple panes may display various images in a single image series. In one embodiment, the reduced resolution images are adjusted so that portions of the images that have been viewed at full resolution, or according to other user-defined display parameters, are visually distinguishable from the remaining portions. Accordingly, the display may provide an overview of the viewing status of multiple images.


The foregoing description details certain embodiments of the invention. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the invention can be practiced in many ways. For example, the above-described pixel checking method may be performed on other types of images, in addition to medical images. For example, images of circuit boards, airplane wings, and satellite imagery may be analyzed using the described systems and methods for monitoring whether an image has been viewed according to predefined display parameters. As is also stated above, it should be noted that the use of particular terminology when describing certain features or aspects of the invention should not be taken to imply that the terminology is being re-defined herein to be restricted to including any specific characteristics of the features or aspects of the invention with which that terminology is associated. The scope of the invention should therefore be construed in accordance with the appended claims and any equivalents thereof.

Claims
  • 1. A computer-implemented method for medical image display and analysis, the method comprising: by one or more processors executing program instructions: determining one or more irrelevant portions of a medical image; selectively displaying, in response to input received from a user, one or more portions of the medical image on a display device at a first resolution; storing tracking information indicating the one or more portions of the image displayed on the display device at the first resolution; and in response to determining, based on at least the stored tracking information, a portion of the medical image not displayed at the first resolution and not included in the one or more irrelevant portions of the medical image, providing a notification to the user.
  • 2. The method of claim 1, wherein providing the notification includes providing at least one selected from a group consisting of a visible notification and an audible notification.
  • 3. The method of claim 1 further comprising: adding a portion of the medical image to the one or more irrelevant portions of the medical image in response to user input.
  • 4. The method of claim 1 further comprising: removing a portion of the medical image from the one or more irrelevant portions of the medical image in response to user input.
  • 5. The method of claim 1, wherein determining the one or more irrelevant portions of the medical image includes performing at least one selected from a group consisting of edge enhancement, automated image analysis, and computer-aided detection.
  • 6. The method of claim 1, wherein the one or more irrelevant portions of the medical image include an area of the medical image representing air.
  • 7. The method of claim 1 further comprising: identifying each portion of the medical image not included in the one or more irrelevant portions of the medical image as a relevant portion; and in response to determining, based on the stored tracking information, that each relevant portion of the medical image has been displayed at the first resolution, providing a second notification.
  • 8. The method of claim 1 further comprising: determining a parameter of the medical image, wherein the parameter indicates one or more of a user type, an exam type, and a modality and wherein determining the one or more irrelevant portions of the medical image includes determining the one or more irrelevant portions of the medical image based on the parameter of the medical image.
  • 9. A non-transitory, computer-readable medium including instructions that, when executed by at least one electronic processor, perform a set of functions, the set of functions comprising: determining one or more irrelevant portions of a medical image; selectively displaying, in response to input received from a user, one or more portions of the medical image on a display device at a first resolution; storing tracking information indicating the one or more portions of the image displayed on the display device at the first resolution; and in response to determining, based on at least the stored tracking information, a portion of the medical image not displayed at the first resolution and not included in the one or more irrelevant portions of the medical image, providing a notification to the user.
  • 10. A system configured for image display and analysis, the system comprising: one or more processors; and a non-transitory computer readable medium operatively coupled to the one or more processors and storing executable instructions, the one or more processors configured, via execution of the executable instructions, to: determine a set of irrelevant portions of the image; selectively display, in response to input received from a user, one or more portions of the image on a display device at a first resolution; store tracking information indicating the one or more portions of the image displayed on the display device at the first resolution; and in response to determining, based on at least the stored tracking information, a portion of the image not displayed at the first resolution and not included in the set of irrelevant portions of the image, provide a notification to a user.
  • 11. The system of claim 10, wherein the notification includes at least one selected from a group consisting of a visible notification and an audible notification.
  • 12. The system of claim 10, wherein the one or more processors are further configured to add a portion of the image to the set of irrelevant portions of the image in response to user input.
  • 13. The system of claim 10, wherein the one or more processors are further configured to remove a portion of the image from the set of irrelevant portions of the image in response to user input.
  • 14. The system of claim 10, wherein the one or more processors are configured to determine the set of irrelevant portions of the image by performing at least one selected from a group consisting of edge enhancement, automated image analysis, and computer-aided detection.
  • 15. The system of claim 10, wherein the set of irrelevant portions of the image includes an area of the image representing air.
  • 16. The system of claim 10, wherein the image includes one selected from a group consisting of a medical image, an image of a circuit board, an airplane wing image, and a satellite image.
  • 17. The system of claim 10, wherein the image is stored in Digital Imaging and Communications in Medicine (“DICOM”) format.
  • 18. The system of claim 10, wherein the one or more processors are further configured to: identify each portion of the image not included in the set of irrelevant portions as a relevant portion of the image; and in response to determining, based on the stored tracking information, that each relevant portion of the image has been displayed at the first resolution, provide a second notification to the user.
  • 19. The system of claim 10, wherein the one or more processors are further configured to: determine a parameter of the image, wherein the parameter indicates one or more of a user type, an exam type, and a modality and wherein the one or more processors are configured to determine the set of irrelevant portions based on the parameter of the image.
  • 20. The system of claim 10, wherein the set of irrelevant portions of the image is an empty set.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 15/799,657, filed on Oct. 31, 2017 and entitled “SYSTEMS AND METHODS FOR VIEWING MEDICAL IMAGES,” which is a continuation of U.S. patent application Ser. No. 14/540,830, filed on Nov. 13, 2014 and entitled “SYSTEMS AND METHODS FOR VIEWING MEDICAL IMAGES,” now U.S. Pat. No. 9,836,202, which is a continuation of U.S. patent application Ser. No. 13/477,853, filed on May 22, 2012 and entitled “SYSTEMS AND METHODS FOR VIEWING MEDICAL IMAGES,” now U.S. Pat. No. 8,913,808, which is a continuation of U.S. patent application Ser. No. 13/228,349, filed on Sep. 8, 2011 and titled “SYSTEMS AND METHODS FOR VIEWING MEDICAL IMAGES,” now U.S. Pat. No. 8,244,014, which is a continuation of U.S. patent application Ser. No. 12/702,976, filed on Feb. 9, 2010 and titled “SYSTEMS AND METHODS FOR VIEWING MEDICAL IMAGES,” now U.S. Pat. No. 8,019,138, which is a continuation of U.S. patent application Ser. No. 11/179,384, filed on Jul. 11, 2005 and titled “SYSTEMS AND METHODS FOR VIEWING MEDICAL IMAGES,” now U.S. Pat. No. 7,660,488, which claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Application Ser. No. 60/625,690, filed on Nov. 4, 2004, each of which is hereby expressly incorporated by reference in its entirety.

9501627 Reicher et al. Nov 2016 B2
9501863 Fram et al. Nov 2016 B1
9536324 Fram Jan 2017 B1
9542082 Reicher et al. Jan 2017 B1
9672477 Reicher et al. Jun 2017 B1
9684762 Reicher et al. Jun 2017 B2
9727938 Reicher et al. Aug 2017 B1
9734576 Fram et al. Aug 2017 B2
9754074 Reicher et al. Sep 2017 B1
9836202 Reicher et al. Dec 2017 B1
9892341 Reicher et al. Feb 2018 B2
9934568 Reicher et al. Apr 2018 B2
10096111 Fram et al. Oct 2018 B2
10157686 Reicher et al. Dec 2018 B1
10387612 Wu et al. Aug 2019 B2
10437444 Reicher et al. Oct 2019 B2
10438352 Fram et al. Oct 2019 B2
10540763 Reicher et al. Jan 2020 B2
10592688 Reicher et al. Mar 2020 B2
10607341 Reicher et al. Mar 2020 B2
10614615 Fram et al. Apr 2020 B2
20010016822 Bessette Aug 2001 A1
20010041991 Segal et al. Nov 2001 A1
20010042124 Barron Nov 2001 A1
20020016718 Rothschild et al. Feb 2002 A1
20020021828 Papier et al. Feb 2002 A1
20020039084 Yamaguchi Apr 2002 A1
20020044696 Sirohey et al. Apr 2002 A1
20020054038 Nemoto May 2002 A1
20020070970 Wood et al. Jun 2002 A1
20020073429 Beane et al. Jun 2002 A1
20020090118 Olschewski Jul 2002 A1
20020090124 Soubelet et al. Jul 2002 A1
20020091659 Beaulieu et al. Jul 2002 A1
20020099273 Bocionek Jul 2002 A1
20020103673 Atwood Aug 2002 A1
20020106119 Foran et al. Aug 2002 A1
20020144697 Betz Oct 2002 A1
20020145941 Poland et al. Oct 2002 A1
20020180883 Tomizawa et al. Dec 2002 A1
20020188637 Bailey et al. Dec 2002 A1
20030005464 Gropper et al. Jan 2003 A1
20030016850 Kaufman et al. Jan 2003 A1
20030037054 Dutta et al. Feb 2003 A1
20030053668 Ditt et al. Mar 2003 A1
20030071829 Bodicker et al. Apr 2003 A1
20030101291 Mussack et al. May 2003 A1
20030130973 Sumner, II et al. Jul 2003 A1
20030140044 Mok et al. Jul 2003 A1
20030140141 Mullen et al. Jul 2003 A1
20030164860 Shen et al. Sep 2003 A1
20030184778 Chiba Oct 2003 A1
20030187689 Barnes et al. Oct 2003 A1
20030190062 Noro et al. Oct 2003 A1
20030195416 Toth Oct 2003 A1
20030215120 Uppaluri et al. Nov 2003 A1
20030215122 Tanaka Nov 2003 A1
20040008900 Jabri et al. Jan 2004 A1
20040015703 Madison et al. Jan 2004 A1
20040024303 Banks et al. Feb 2004 A1
20040027359 Aharon et al. Feb 2004 A1
20040061889 Wood et al. Apr 2004 A1
20040068170 Wang et al. Apr 2004 A1
20040077952 Rafter et al. Apr 2004 A1
20040105030 Yamane Jun 2004 A1
20040105574 Pfaff Jun 2004 A1
20040114714 Minyard et al. Jun 2004 A1
20040122787 Avinash et al. Jun 2004 A1
20040141661 Hanna et al. Jul 2004 A1
20040143582 Vu Jul 2004 A1
20040161139 Samara et al. Aug 2004 A1
20040174429 Chu Sep 2004 A1
20040190780 Shiibashi et al. Sep 2004 A1
20040197015 Fan et al. Oct 2004 A1
20040202387 Yngvesson Oct 2004 A1
20040243435 Williams Dec 2004 A1
20040252871 Tecotzky et al. Dec 2004 A1
20050010531 Kushalnagar et al. Jan 2005 A1
20050063575 Ma et al. Mar 2005 A1
20050065424 Shah et al. Mar 2005 A1
20050074157 Thomas, III Apr 2005 A1
20050075544 Shapiro et al. Apr 2005 A1
20050088534 Shen et al. Apr 2005 A1
20050108058 Weidner et al. May 2005 A1
20050111733 Fors et al. May 2005 A1
20050113681 DeFreitas et al. May 2005 A1
20050184988 Yanof et al. Aug 2005 A1
20050197860 Joffe et al. Sep 2005 A1
20050203775 Chesbrough Sep 2005 A1
20050238218 Nakamura Oct 2005 A1
20050244041 Tecotzky et al. Nov 2005 A1
20050259118 Mojaver et al. Nov 2005 A1
20050273009 Deischinger et al. Dec 2005 A1
20060008181 Takekoshi Jan 2006 A1
20060031097 Lipscher et al. Feb 2006 A1
20060050152 Rai et al. Mar 2006 A1
20060058603 Dave et al. Mar 2006 A1
20060061570 Cheryauka et al. Mar 2006 A1
20060093198 Fram et al. May 2006 A1
20060093199 Fram et al. May 2006 A1
20060093207 Reicher et al. May 2006 A1
20060095423 Reicher et al. May 2006 A1
20060095426 Takachio et al. May 2006 A1
20060106642 Reicher et al. May 2006 A1
20060111937 Yarger et al. May 2006 A1
20060122482 Mariotti et al. Jun 2006 A1
20060171574 DelMonego et al. Aug 2006 A1
20060181548 Hafey et al. Aug 2006 A1
20060238546 Handley et al. Oct 2006 A1
20060239573 Novatzky et al. Oct 2006 A1
20060241979 Sato et al. Oct 2006 A1
20060274145 Reiner Dec 2006 A1
20060282447 Hollebeek Dec 2006 A1
20070050701 El Emam et al. Mar 2007 A1
20070055550 Courtney et al. Mar 2007 A1
20070067124 Kimpe et al. Mar 2007 A1
20070109402 Niwa May 2007 A1
20070124541 Lang et al. May 2007 A1
20070162308 Peters Jul 2007 A1
20070192140 Gropper et al. Aug 2007 A1
20070245308 Hill et al. Oct 2007 A1
20070270695 Keen Nov 2007 A1
20080031507 Uppaluri et al. Feb 2008 A1
20080097186 Biglieri et al. Apr 2008 A1
20080100612 Dastmalchi et al. May 2008 A1
20080103828 Squilla et al. May 2008 A1
20080118120 Wegenkittl et al. May 2008 A1
20080125846 Battle et al. May 2008 A1
20080126982 Sadikali et al. May 2008 A1
20080130966 Crucs Jun 2008 A1
20080133526 Haitani et al. Jun 2008 A1
20090005668 West et al. Jan 2009 A1
20090028410 Shimazaki Jan 2009 A1
20090091566 Turney et al. Apr 2009 A1
20090094513 Bay Apr 2009 A1
20090123052 Ruth et al. May 2009 A1
20090129643 Natanzon et al. May 2009 A1
20090129651 Zagzebski et al. May 2009 A1
20090132586 Napora et al. May 2009 A1
20090150481 Garcia et al. Jun 2009 A1
20090164247 Dobler et al. Jun 2009 A1
20090198514 Rhodes Aug 2009 A1
20090326373 Boese et al. Dec 2009 A1
20100053353 Hunter et al. Mar 2010 A1
20100138239 Reicher et al. Jun 2010 A1
20100201714 Reicher et al. Aug 2010 A1
20100211409 Kotula et al. Aug 2010 A1
20110016430 Fram et al. Jan 2011 A1
20110019886 Mizuno Jan 2011 A1
20110110572 Guehring et al. May 2011 A1
20110267339 Fram et al. Nov 2011 A1
20110316873 Reicher et al. Dec 2011 A1
20120070048 Van Den Brink Mar 2012 A1
20120136794 Kushalnagar et al. May 2012 A1
20120163684 Natanzon et al. Jun 2012 A1
20120183191 Nakamura Jul 2012 A1
20120194540 Reicher et al. Aug 2012 A1
20120196258 Geijsen et al. Aug 2012 A1
20120208592 Davis et al. Aug 2012 A1
20130076681 Sirpal et al. Mar 2013 A1
20130083023 Fram et al. Apr 2013 A1
20130129198 Sherman et al. May 2013 A1
20130129231 Dale et al. May 2013 A1
20130159019 Reicher et al. Jun 2013 A1
20130169661 Reicher et al. Jul 2013 A1
20130195329 Canda et al. Aug 2013 A1
20130198682 Matas et al. Aug 2013 A1
20140022194 Ito Jan 2014 A1
20140096049 Vonshak et al. Apr 2014 A1
20140119514 Miyazawa May 2014 A1
20140378810 Davis et al. Dec 2014 A1
20150046349 Michael, Jr. et al. Feb 2015 A1
20150101066 Fram Apr 2015 A1
20150363104 Ichioka et al. Dec 2015 A1
20160034110 Edwards Feb 2016 A1
20160270746 Foos et al. Sep 2016 A1
20170038951 Reicher et al. Feb 2017 A1
20170039321 Reicher et al. Feb 2017 A1
20170039322 Reicher et al. Feb 2017 A1
20170039350 Reicher et al. Feb 2017 A1
20170039705 Fram et al. Feb 2017 A1
20170046014 Fram Feb 2017 A1
20170046483 Reicher et al. Feb 2017 A1
20170046485 Reicher et al. Feb 2017 A1
20170046495 Fram Feb 2017 A1
20170046870 Fram et al. Feb 2017 A1
20170053404 Reicher et al. Feb 2017 A1
20170200064 Reicher et al. Jul 2017 A1
20170200269 Reicher et al. Jul 2017 A1
20170200270 Reicher et al. Jul 2017 A1
20170206324 Reicher et al. Jul 2017 A1
20170239720 Levin et al. Aug 2017 A1
20170293720 Reicher et al. Oct 2017 A1
20170301090 Fram et al. Oct 2017 A1
20170308647 Reicher et al. Oct 2017 A1
20180059918 Reicher et al. Mar 2018 A1
20180225824 Fram et al. Aug 2018 A1
20190009371 Veerasamy et al. Jan 2019 A1
Non-Patent Literature Citations (252)
Entry
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/254,627 dated Jul. 13, 2017 (4 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/540,830 dated Aug. 15, 2017 (9 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/346,530 dated Mar. 26, 2018 (12 pages).
Examiner Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/179,384 dated Sep. 24, 2008 (4 pages).
Examiner Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/179,384 dated Feb. 18, 2009 (2 pages).
Examiner Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/870,645 dated Jun. 10, 2011 (2 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/799,657 dated Mar. 8, 2018 (25 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/944,000 dated Oct. 5, 2012 (11 pages).
Examiner Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/944,000 dated Feb. 4, 2011 (3 pages).
Applicant Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/179,328 dated Dec. 11, 2014 (3 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,296 dated Jun. 27, 2017 (58 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,281 dated Jun. 26, 2017 (51 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,281 dated Jan. 11, 2018 (60 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,281 dated Apr. 2, 2018 (59 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/857,915 dated Jul. 3, 2014 (20 pages).
Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/254,627 dated Jul. 13, 2017 (4 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/265,979 dated May 13, 2011 (14 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/171,081 dated Sep. 4, 2013 (12 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/870,645 dated Sep. 13, 2011 (8 pages).
Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/870,645 dated Dec. 7, 2011 (4 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/891,543 dated Nov. 14, 2013 (14 pages).
Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/163,600 dated Sep. 14, 2016 (6 pages).
Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/572,397 dated Sep. 29, 2015 (2 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,296 dated Jan. 22, 2018 (11 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/475,930 dated Jan. 10, 2018 (11 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,342 dated Jun. 27, 2017 (62 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,342 dated Nov. 30, 2017 (12 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/631,313 dated Jan. 30, 2018 (10 pages).
Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/346,530 dated Mar. 26, 2018 (40 pages).
Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/799,657 dated Mar. 8, 2018 (25 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/631,313 dated May 25, 2018 (10 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/799,657 dated Aug. 15, 2018 (8 pages).
Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/945,448 dated Jul. 16, 2018 (7 pages).
Correct Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/631,313 dated Jul. 20, 2018 (3 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,006 dated Oct. 17, 2018 (18 pages).
Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/179,384 dated Sep. 24, 2008 (4 pages).
Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/179,384 dated Feb. 18, 2009 (4 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/631,313 dated May 25, 2018 (14 pages).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/631,313 dated Jul. 20, 2018 (7 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,006 dated May 9, 2018 (17 pages).
Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/942,687 dated Jun. 10, 2011 (3 pages).
Patent Board Decision from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/942,687 dated Dec. 22, 2017 (13 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/043,165 dated Mar. 19, 2018 (11 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/043,165 dated Aug. 6, 2018 (11 pages).
Patent Board Decision from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/043,165 dated Dec. 20, 2017 (11 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/475,930 dated Jan. 10, 2018 (42 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/475,930 dated Jun. 1, 2018 (17 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/475,930 dated Sep. 7, 2018 (16 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,342 dated Nov. 30, 2017 (11 pages).
Examiner-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,342 dated Nov. 30, 2017 (1 page).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,281 dated Jun. 26, 2018 (3 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,281 dated Jan. 11, 2018 (64 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,281 dated Sep. 20, 2018 (58 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/188,819 dated Jul. 3, 2018 (7 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/140,351 dated Jul. 30, 2018 (25 pages).
Examiner-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/857,915 dated Jul. 3, 2014 (1 page).
Examiner-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/171,081 dated Sep. 4, 2013 (1 page).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/870,645 dated Sep. 13, 2011 (10 pages).
Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/944,000 dated Feb. 4, 2011 (3 pages).
Examiner-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/768,765 dated Aug. 28, 2015 (1 page).
Examiner-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/891,543 dated Nov. 14, 2013 (1 page).
Examiner-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/163,600 dated Sep. 14, 2016 (1 page).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/572,397 dated Jun. 29, 2015 (2 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,006 dated Nov. 29, 2018 (12 pages).
Examiner-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,006 dated Nov. 29, 2018 (1 page).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/945,448 dated Jan. 10, 2019 (9 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/540,830 dated Aug. 15, 2017 (11 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/540,830 dated May 15, 2017 (42 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/540,830 dated Jul. 28, 2017 (3 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/540,830 dated Mar. 24, 2017 (3 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/346,530 dated Sep. 6, 2018 (14 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/346,530 dated May 17, 2018 (3 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,342 dated Oct. 13, 2017 (3 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,281 dated Oct. 13, 2017 (3 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,296 dated Oct. 13, 2017 (3 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/188,872 dated Oct. 19, 2018 (12 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/140,351 dated Dec. 6, 2018 (21 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/140,348 dated Nov. 19, 2018 (33 pages).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/254,627 dated Jul. 13, 2017 (4 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/254,627 dated Apr. 3, 2017 (11 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/095,123 dated Mar. 30, 2017 (10 pages).
Applicant Summary of Interview of Examiner from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/345,606 dated Oct. 21, 2013 (8 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/298,806 dated Apr. 12, 2017 (10 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/944,000 dated Jan. 30, 2017 (12 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,023 dated Apr. 11, 2017 (11 pages).
Patent Board Decision from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/437,522 dated Sep. 5, 2017 (12 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/437,522 dated Apr. 23, 2014 (11 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/437,522 dated Sep. 10, 2014 (4 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/437,522 dated Feb. 3, 2011 (16 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/437,522 dated Apr. 20, 2015 (5 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/437,522 dated Oct. 14, 2011 (17 pages).
Examiner's Answer to Appeal Brief from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/437,522 dated Jul. 5, 2016 (18 pages).
Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/437,522 dated Jun. 1, 2011 (3 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/437,522 dated Aug. 11, 2015 (3 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/179,384 dated Nov. 3, 2009 (8 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/179,384 dated Dec. 29, 2008 (45 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/179,384 dated Aug. 28, 2007 (15 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/179,384 dated Jul. 24, 2009 (23 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/179,384 dated Jun. 26, 2008 (31 pages).
Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/179,384 dated Feb. 18, 2009 (2 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/702,976 dated Jul. 20, 2011 (7 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/702,976 dated Aug. 18, 2010 (17 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/702,976 dated Feb. 17, 2011 (13 pages).
Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/702,976 dated May 31, 2011 (3 pages).
Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/702,976 dated Dec. 1, 2010 (4 pages).
Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/228,349 dated Jul. 20, 2012 (3 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/228,349 dated Feb. 6, 2012 (5 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/228,349 dated Dec. 1, 2011 (17 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/477,853 dated Aug. 15, 2014 (7 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/477,853 dated Dec. 11, 2013 (20 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/477,853 dated Jun. 13, 2014 (13 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/477,853 dated Mar. 14, 2014 (3 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/540,830 dated Aug. 15, 2017 (8 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/540,830 dated Jan. 17, 2017 (26 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/540,830 dated Jul. 28, 2017 (6 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/631,313 dated Jan. 30, 2018 (42 pages).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,006 dated Jan. 28, 2019 (7 pages).
Examiner-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,006 dated Nov. 29, 2018 (35 pages).
Supplemental Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/268,261 dated Jan. 6, 2011 (3 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/268,261 dated Dec. 3, 2010 (6 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/268,261 dated Oct. 8, 2010 (6 pages).
Supplemental Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/268,261 dated Aug. 6, 2010 (4 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/268,261 dated May 17, 2010 (8 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/268,261 dated Feb. 2, 2010 (12 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/268,261 dated Oct. 1, 2009 (35 pages).
Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/268,261 dated Jan. 25, 2010 (3 pages).
Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/857,915 dated Aug. 15, 2014 (4 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/857,915 dated Jul. 3, 2014 (19 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/857,915 dated Aug. 23, 2013 (33 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/857,915 dated Jun. 12, 2012 (33 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/857,915 dated May 16, 2011 (26 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/857,915 dated Dec. 15, 2011 (37 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/857,915 dated Sep. 6, 2011 (3 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/857,915 dated Feb. 4, 2014 (3 pages).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/502,055 dated Sep. 19, 2016 (3 pages).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/502,055 dated Jul. 14, 2016 (2 pages).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/502,055 dated Jun. 27, 2016 (2 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/502,055 dated Jun. 2, 2016 (10 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/502,055 dated Jan. 20, 2016 (9 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/502,055 dated Apr. 14, 2016 (3 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/254,627 dated Dec. 12, 2016 (12 pages).
Supplemental Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/265,979 dated May 26, 2011 (4 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/265,979 dated Jul. 8, 2010 (18 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/265,979 dated May 13, 2009 (16 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/265,979 dated Dec. 23, 2010 (24 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/265,979 dated Dec. 22, 2009 (17 pages).
Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/265,979 dated Mar. 17, 2011 (3 pages).
Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/265,979 dated Nov. 16, 2010 (3 pages).
Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/265,979 dated Mar. 4, 2010 (3 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/171,081 dated Sep. 4, 2013 (11 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/171,081 dated Jun. 8, 2012 (21 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/171,081 dated Oct. 12, 2012 (21 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/171,081 dated Nov. 6, 2012 (3 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/171,081 dated Jul. 31, 2012 (3 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/095,123 dated Feb. 23, 2016 (14 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/095,123 dated Mar. 3, 2015 (15 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/095,123 dated Jul. 20, 2016 (15 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/095,123 dated Jul. 23, 2015 (13 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/095,123 dated Aug. 27, 2015 (3 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/095,123 dated May 1, 2015 (3 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/268,262 dated Feb. 25, 2011 (5 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/268,262 dated Dec. 1, 2010 (8 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/268,262 dated Apr. 16, 2010 (22 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/268,262 dated Aug. 24, 2009 (17 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/268,262 dated Oct. 28, 2010 (12 pages).
Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/268,262 dated Dec. 1, 2010 (4 pages).
Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/268,262 dated May 12, 2010 (4 pages).
Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/268,262 dated Nov. 24, 2009 (4 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/079,597 dated Apr. 25, 2012 (5 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/079,597 dated Nov. 11, 2012 (8 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/535,758 dated Aug. 23, 2013 (10 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/535,758 dated Apr. 4, 2013 (8 pages).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/081,225 dated Oct. 21, 2016 (2 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/081,225 dated Sep. 2, 2016 (11 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/081,225 dated Mar. 10, 2016 (23 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/265,978 dated Apr. 19, 2010 (7 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/265,978 dated Nov. 19, 2009 (6 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/265,978 dated Jul. 27, 2009 (7 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 12/870,645 dated May 5, 2011 (5 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/345,606 dated Jan. 9, 2014 (7 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/345,606 dated May 31, 2013 (12 pages).
Summary of Interview from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/345,606 dated Oct. 21, 2013 (1 page).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/244,431 dated Nov. 16, 2016 (4 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/244,431 dated Aug. 18, 2016 (8 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/244,431 dated Mar. 18, 2016 (15 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/244,431 dated Jun. 17, 2016 (3 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/799,657 dated Feb. 6, 2019 (8 pages).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,006 dated Jan. 28, 2019 (3 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,014 dated Jan. 24, 2019 (7 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/188,819 dated Jan. 25, 2019 (7 pages).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,281 dated Mar. 4, 2019 (8 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,006 dated Apr. 10, 2019 (6 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/475,930 dated Apr. 1, 2019 (18 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,281 dated Apr. 29, 2019 (10 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,281 dated Jan. 11, 2019 (12 pages).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/945,448 dated Feb. 20, 2019 (2 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/188,872 dated May 8, 2019 (14 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/945,448 dated May 6, 2019 (7 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/346,530 dated May 15, 2019 (8 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/140,346 dated May 28, 2019 (39 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/140,363 dated Jun. 3, 2019 (33 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/140,351 dated May 21, 2019 (14 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/799,657 dated May 20, 2019 (8 pages).
Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/942,674 dated Jul. 26, 2010 (3 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/942,687 dated Mar. 4, 2015 (3 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/942,687 dated Jun. 17, 2014 (3 pages).
Examiner's Answer to Appeal Brief from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/942,687 dated Feb. 25, 2016 (11 pages).
Examiner's Answer to Appeal Brief from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/944,000 dated Jun. 26, 2013 (14 pages).
Patent Board Decision from the U.S. Patent and Trademark Office for U.S. Appl. No. 11/944,000 dated Mar. 23, 2016 (8 pages).
Applicant-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 13/572,552 dated Sep. 3, 2014 (3 pages).
Examiner's Answer to Appeal Brief from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/043,165 dated Nov. 14, 2016 (13 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,006 dated Aug. 7, 2019 (6 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/645,448 dated Jul. 29, 2019 (26 pages).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/945,448 dated Jul. 15, 2019 (2 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/140,348 dated Jul. 9, 2019 (21 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/346,530 dated Aug. 27, 2019 (7 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/646,756 dated Jul. 16, 2019 (12 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,014 dated Jul. 11, 2019 (8 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,281 dated Aug. 19, 2019 (11 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/188,872 dated Aug. 23, 2019 (10 pages).
Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/188,819 dated Mar. 15, 2019 (5 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/188,819 dated Aug. 21, 2019 (8 pages).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/945,448 dated Aug. 28, 2019 (2 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 14/792,210 dated Jun. 17, 2019 (10 pages).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,006 dated Sep. 4, 2019 (7 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/631,291 dated Oct. 6, 2019 (12 pages).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/346,530 dated Oct. 9, 2019 (4 pages).
Supplemental Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/188,872 dated Oct. 28, 2019 (5 pages).
Supplemental Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/188,819 dated Oct. 2, 2019 (4 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/140,346 dated Oct. 31, 2019 (41 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,014 dated Nov. 15, 2019 (7 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/140,351 dated Nov. 14, 2019 (14 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/140,348 dated Dec. 4, 2019 (21 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/140,351 dated Mar. 5, 2020 (16 pages).
Examiner-Initiated Interview Summary from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/140,351 dated Mar. 5, 2020 (1 page).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/188,819 dated Jan. 24, 2020 (8 pages).
Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/188,819 dated Mar. 20, 2020 (4 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/188,872 dated Jan. 23, 2020 (9 pages).
Supplemental Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/188,872 dated Mar. 20, 2020 (5 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,006 dated Oct. 29, 2019 (6 pages).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,006 dated Dec. 12, 2019 (3 pages).
Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,014 dated Dec. 5, 2019 (4 pages).
Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,014 dated Jan. 8, 2020 (4 pages).
Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/292,014 dated Jan. 23, 2020 (4 pages).
Corrected Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/346,530 dated Dec. 17, 2019 (4 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,281 dated Nov. 19, 2019 (11 pages).
Notice of Allowability from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/469,281 dated Jan. 10, 2020 (4 pages).
Non-Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/631,291 dated Oct. 8, 2019 (6 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/631,291 dated Jan. 10, 2020 (8 pages).
Final Office Action from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/646,756 dated Jan. 15, 2020 (13 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/631,291 dated May 21, 2020 (8 pages).
Notice of Allowance from the U.S. Patent and Trademark Office for U.S. Appl. No. 15/646,756 dated May 20, 2020 (10 pages).
Related Publications (1)
Number Date Country
20190354270 A1 Nov 2019 US
Provisional Applications (1)
Number Date Country
60625690 Nov 2004 US
Continuations (6)
Number Date Country
Parent 15799657 Oct 2017 US
Child 16529378 US
Parent 14540830 Nov 2014 US
Child 15799657 US
Parent 13477853 May 2012 US
Child 14540830 US
Parent 13228349 Sep 2011 US
Child 13477853 US
Parent 12702976 Feb 2010 US
Child 13228349 US
Parent 11179384 Jul 2005 US
Child 12702976 US