Embodiments described herein relate to methods and systems for diagnostic image collaboration.
A picture archiving and communication system (“PACS”) is a central repository for various medical image studies of different modalities. A modality is an imaging technique or device that creates an image, such as an x-ray, a sonogram, a magnetic resonance imaging (“MRI”) scan, and the like. There are defined workflows for reviewing and analyzing medical images, and a PACS server manages access to the medical images by other systems. A PACS viewer provides an interface for accessing medical images and provides various viewing options for one or more types of images. In some embodiments, the PACS viewer also includes a dictation and speech-to-text mechanism that captures user audio input and converts the audio data to text data for insertion in a report, transmission to another system, or the like. Images awaiting review by a radiologist may be organized via worklists. Worklists are organizational structures that list the medical image studies a user is interested in reviewing and analyzing. For example, a radiologist may select a medical image study from a worklist, and the PACS viewer displays the medical images included within the selected medical image study. In some cases, the worklist is separate from the PACS viewer such that worklists are available for use with different viewers.
In some situations, multiple individuals collaborate on the treatment of a patient. For example, a radiologist or other specialist may be enlisted by a referring doctor to read medical images and provide the referring doctor with a report indicating the diagnosis of the patient's medical condition from the perspective of what is shown in the medical images. However, the report itself may not be sufficient to provide the referring doctor with the knowledge needed to adequately treat the patient. This generally results in further inquiries between the referring doctor and the radiologist.
Although online meetings and other collaboration systems exist, these systems may not be configured to handle medical image data. For example, in many collaboration systems, a screen shot is taken of a presenter's screen, which is then transmitted to other devices for display to other collaborators. Such screen shots may impair the diagnostic quality of medical images, which may hinder collaborators from properly reading the images and collaborating on a patient's health and treatment. Therefore, there is a need for a collaboration system that is directed to collaborating on medical imaging evaluations and that provides diagnostic quality imaging capabilities.
Accordingly, embodiments described herein provide a collaboration system that allows a presenter device to independently render diagnostic quality images while sharing presentation information for the images (in real-time or pseudo real-time) with other collaborator devices. The presentation information allows the collaborator devices to render the same images and apply the shared presentation information to display a medical image that mirrors the presentation on the presenter device.
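As a non-limiting illustration, the shared presentation information might take a JSON-serializable form such as the following sketch (written in TypeScript; the field names are hypothetical and not prescribed by the embodiments described herein):

```typescript
// Hypothetical shape of the presentation state shared by a presenter device.
// Field names are illustrative only; the embodiments do not prescribe a format.
interface PresentationState {
  studyId: string;   // identifies the medical image study being viewed
  imageId: string;   // identifies the specific medical image 140
  zoomLevel: number; // e.g., 1.0 = 100%
  brightness: number; // display property, not a pixel-data change
  contrast: number;   // display property, not a pixel-data change
  cursor?: { x: number; y: number }; // optional presenter cursor position
  annotations: Array<{ x: number; y: number; text: string }>; // presenter markups
}
```

Because only settings such as these travel between devices, each device can render its locally loaded copy of the image at full fidelity.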
For example, one embodiment provides a system for collaborating on medical image data captured as part of a medical imaging procedure. The system includes an electronic processor. The electronic processor is configured to receive, from a first user device collaborating on the medical image as a presenter within a collaboration session for the medical image, presentation state information representing a current presentation state of the medical image as displayed on a display device of the first user device as modified by a user interaction of the presenter, and automatically transmit a presentation model based on the presentation state information to a second user device collaborating on the medical image as a collaborator, the second user device automatically modifying the medical image as displayed on a display device of the second user device based on the presentation model to mirror the user interaction with the medical image as displayed on the display device of the first user device.
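A minimal sketch of this server-side flow is provided below, continuing the hypothetical PresentationState shape above; the in-memory session registry and the transport-agnostic send() callback are assumptions for illustration only (a real deployment might use WebSockets or another transport):

```typescript
// Minimal sketch of a collaboration session relay. Transport details are
// abstracted behind a send() callback supplied when a collaborator joins.
type Send = (model: PresentationState) => void;

class CollaborationSession {
  private collaborators = new Map<string, Send>();

  join(userId: string, send: Send): void {
    this.collaborators.set(userId, send);
  }

  leave(userId: string): void {
    this.collaborators.delete(userId);
  }

  // Called when presentation state information arrives from the presenter
  // device; the presentation model is forwarded to every collaborator.
  onPresenterState(state: PresentationState): void {
    for (const send of this.collaborators.values()) {
      send(state); // here the model is the unaltered state (relay mode)
    }
  }
}
```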
Another embodiment provides a method for collaborating on a medical image. The method includes displaying, with an electronic processor included in a first user device used by a presenter, the medical image on a display device of the first user device and transmitting an invitation to a collaborator via a second user device to join a collaboration session for the medical image. The method also includes, in response to receiving an acceptance of the invitation from the second user device, transmitting a presentation model to the second user device, the presentation model based on a current presentation state of the medical image as displayed on the display device of the first user device. In addition, the method includes, during the collaboration session, transmitting a subsequent presentation model to the second user device representing a subsequent presentation state of the medical image as displayed on the display device of the first user device as modified by the presenter using the first user device.
Yet another embodiment provides a non-transitory, computer-readable medium storing instructions that, when executed by an electronic processor, perform a set of functions. The set of functions includes displaying a medical image captured as part of a medical imaging procedure during a collaboration session on a display device of a collaborator device, receiving a presentation model for the medical image, the presentation model representing a current presentation state of the medical image as displayed on a display device of a presenter device as modified by a presenter, and automatically modifying the medical image as displayed on the display device of the collaborator device based on the presentation model to mirror the medical image as displayed on the display device of the presenter device.
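On the collaborator side, applying a received presentation model might look like the following sketch, again assuming the hypothetical PresentationState shape above; renderImage is a stand-in for whatever rendering pipeline the viewer actually uses:

```typescript
// Local viewer state on a collaborator device.
let viewerState: PresentationState | null = null;

// Stub for illustration: a real viewer would redraw the locally loaded,
// full-fidelity medical image (e.g., on a canvas) using these settings.
function renderImage(state: PresentationState): void {
  void state;
}

// Called whenever a presentation model arrives from the server. The pixel
// data never travels with the model; only the settings do, so the image
// is re-rendered locally at diagnostic quality.
function onPresentationModel(model: PresentationState): void {
  viewerState = model;
  renderImage(viewerState);
}
```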
Other aspects of the embodiments described herein will become apparent by consideration of the detailed description and accompanying drawings.
Before embodiments of the invention are explained in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the accompanying drawings. The invention is capable of other embodiments and of being practiced or of being carried out in various ways.
Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. The terms “mounted,” “connected” and “coupled” are used broadly and encompass both direct and indirect mounting, connecting and coupling. Further, “connected” and “coupled” are not restricted to physical or mechanical connections or couplings, and may include electrical connections or couplings, whether direct or indirect. Also, electronic communications and notifications may be performed using any known means including direct connections, wireless connections, and the like.
A plurality of hardware and software based devices, as well as a plurality of different structural components, may be utilized to implement the embodiments described herein. In addition, embodiments described herein may include hardware, software, and electronic components or modules that, for purposes of discussion, may be illustrated and described as if the majority of the components were implemented solely in hardware. However, one of ordinary skill in the art, based on a reading of this detailed description, would recognize that, in at least one embodiment, the electronic-based aspects of the embodiments described herein may be implemented in software (for example, stored on non-transitory computer-readable medium) executable by one or more processors. For example, “mobile device,” “computing device,” and “server” as described in the specification may include one or more electronic processors, one or more memory modules including non-transitory computer-readable medium, one or more input/output interfaces, and various connections (for example, a system bus) connecting the components.
As described above, multiple individuals may collaborate on an evaluation of a patient based on imaging data. Typical collaboration systems, however, capture a screen shot of a presenter's screen and share the screen shot with the other collaborators. This type of collaboration can impact the quality of the imaging data displayed within the shared screen shot, which can impact a collaborator's contribution to the evaluation. Accordingly, embodiments described herein provide methods and systems for sharing presentation information from a presenter device to each collaborator device, wherein each collaborator device is configured to independently render a medical image and apply the shared presentation information to mirror the presenter's screen.
For example, FIG. 1 illustrates a system 100 for collaborating on a medical image according to some embodiments. The system 100 includes a server 105, a first user device 110, a second user device 112, and a medical image database 115.
The server 105, the first user device 110, the second user device 112, and the medical image database 115 communicate over one or more wired or wireless communication networks 120. Portions of the communication network 120 may be implemented using a wide area network, such as the Internet, a local area network, such as a Bluetooth™ network or Wi-Fi, and combinations or derivatives thereof. Alternatively or in addition, in some embodiments, components of the system 100 communicate directly as compared to through the communication network 120. Also, in some embodiments, the components of the system 100 communicate through one or more intermediary devices not illustrated in FIG. 1.
The server 105 is a computing device that serves as a gateway for the medical image database 115. For example, in some embodiments, the server 105 is a PACS server. Alternatively, in some embodiments, the server 105 may be a server that communicates with a PACS server to access the medical image database 115. As illustrated in FIG. 1, the server 105 includes an electronic processor 125, a memory 130, and a communication interface 135.
The electronic processor 125 includes a microprocessor, an application-specific integrated circuit (ASIC), or another suitable electronic device for processing data. The memory 130 includes a non-transitory computer-readable medium, such as read-only memory (ROM), random access memory (RAM) (for example, dynamic RAM (DRAM), synchronous DRAM (SDRAM), and the like), electrically erasable programmable read-only memory (EEPROM), flash memory, a hard disk, a secure digital (SD) card, another suitable memory device, or a combination thereof. The electronic processor 125 is configured to access and execute computer-readable instructions (“software”) stored in the memory 130. The software may include firmware, one or more applications, program data, filters, rules, one or more program modules, and other executable instructions. For example, the software may include instructions and associated data for performing a set of functions, including the methods described herein.
The communication interface 135 allows the server 105 to communicate with devices external to the server 105. For example, as illustrated in FIG. 1, the communication interface 135 allows the server 105 to communicate with the first user device 110, the second user device 112, and the medical image database 115 over the communication network 120.
The medical image database 115 stores a plurality of medical images 140 (collectively referred to as “the medical images 140” and individually referred to as “a medical image 140”). In some embodiments, the medical image database 115 is combined with the server 105, the first user device 110, the second user device 112, or a combination thereof. Alternatively or in addition, the medical images 140 may be stored within a plurality of databases, such as within a cloud service. Although not illustrated in FIG. 1, the medical image database 115 may include similar components as the server 105, such as an electronic processor, a memory, and a communication interface.
The first user device 110 and the second user device 112 are also computing devices and may include desktop computers, terminals, workstations, laptop computers, tablet computers, smart watches or other wearables, smart televisions or whiteboards, or the like. Although not described in detail herein, the first user device 110 and the second user device 112 may include similar components as the server 105, such as an electronic processor, a memory, and a communication interface, as well as a human-machine interface 160 and a display device 170. In some embodiments, the first user device 110 is located remotely from the second user device 112.
A user may use the first user device 110 or the second user device 112 to access and view one or more medical images 140 and interact with a medical image 140. For example, the user may access a medical image 140 stored in the medical image database 115 (through a browser application or a dedicated application stored on the first user device 110 that communicates with the server 105) and view the medical images 140 on the display device 170 associated with the first user device 110. The user may also interact with the medical images 140 using the human-machine interface 160 of the first user device 110. For example, a user may modify content of a medical image 140 (for example, by drawing on a medical image 140), modify a display property of a medical image 140 (for example, by modifying a contrast property, a brightness property, and the like), or a combination thereof.
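As one hedged illustration of such a display-property modification, brightness and contrast adjustments on grayscale medical images are commonly implemented as a window/level mapping; the generic sketch below is an illustration of that well-known technique, not a method prescribed by the embodiments:

```typescript
// Generic window/level mapping often used for grayscale medical images:
// raw intensities inside the window are linearly stretched to 0..255 and
// values outside are clamped. windowCenter behaves like a brightness
// control and windowWidth behaves like a contrast control.
function applyWindowLevel(
  raw: Uint16Array,
  windowCenter: number,
  windowWidth: number
): Uint8ClampedArray {
  const out = new Uint8ClampedArray(raw.length);
  const lower = windowCenter - windowWidth / 2;
  for (let i = 0; i < raw.length; i++) {
    // Uint8ClampedArray clamps out-of-range values automatically.
    out[i] = ((raw[i] - lower) / windowWidth) * 255;
  }
  return out;
}
```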
As noted above, the first user device 110 and the second user device 112 may be used to collaboratively evaluate or diagnose a patient based on one or more medical images 140. For example, a radiologist (for example, a first user) may use the first user device 110 to collaborate with a physician (for example, a second user) using the second user device 112. In particular, the radiologist may annotate the medical image 140 to identify a portion of the medical image 140 on which the radiologist is basing a diagnostic opinion. As described in more detail below, the radiologist's interaction with the medical image 140 may be simultaneously mirrored within the medical image 140 as displayed to the physician (for example, in real time or pseudo real time). As used in the present application, “real-time” means receiving data, processing the data, and returning results of processing the data without significant delay. For example, “real-time” may include processing data within seconds or milliseconds so that results of the processing are available virtually immediately.
As illustrated in FIG. 2, a user of the first user device 110 (hereinafter referred to as the “presenter”) may submit a request to initiate a collaboration session for a medical image 140, and an invitation to join the collaboration session may be transmitted to a user of the second user device 112 (hereinafter referred to as the “first collaborator”).
In some embodiments, the electronic processor 125 initiates the collaboration session in response to receiving the request from the presenter. Alternatively or in addition, the electronic processor 125 may initiate the collaboration session in response to the first collaborator accepting the invitation to join the collaboration session. In some embodiments, the collaboration session is a web-based collaboration session. It should be understood that more than two users may participate in a collaboration session and users may join a session at different times. For example, continuing with the example scenario set forth above, the presenter may also invite a user of a third user device (hereinafter referred to as the “second collaborator”) to join the collaboration session at the same time the first collaborator is invited or at a later time.
Returning to FIG. 2, during the collaboration session, the server 105 receives presentation state information from the first user device 110 representing a current presentation state of the medical image 140 as displayed on the first user device 110 and transmits a presentation model based on the presentation state information to the second user device 112.
Accordingly, after joining the collaboration session, the second user device 112 (used by the first collaborator) receives the presentation model from the server 105 and uses the received presentation model to modify the same medical image as displayed on the second user device 112 to match the medical image as displayed on the first user device 110. For example, the second user device 112 may update the presentation state of the medical image as displayed on the second user device 112 to the settings included in the presentation model. Thus, the medical image 140 displayed on the second user device 112 is automatically modified to mirror the medical image 140 as displayed on the first user device 110 without impacting a quality of the medical image 140 as displayed on each collaborator device. In particular, by rendering the medical image 140 independently at each collaborator device (as compared to merely sharing a screen shot), diagnostic image quality is preserved. It should be understood that, in some embodiments, the second user device 112 is configured to apply the presentation model to the medical image 140. However, in other embodiments, a separate device, such as the server 105, may be configured to apply the presentation model to the medical image 140 and transmit a modified version of the medical image 140 to the second user device 112 for display. In other words, embodiments described herein may be used with client-side rendering of medical images and server-side rendering of medical images. As noted above, in some embodiments, only one user included in a collaboration session is designated as the presenter, and a user that is not designated as a presenter is thus restricted from interacting with or otherwise modifying the displayed medical image.
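The two rendering options described above (client-side and server-side application of the presentation model) might be sketched as follows, reusing the hypothetical PresentationState and renderImage names from the earlier sketches; renderOnServer is likewise hypothetical:

```typescript
// Client-side rendering: the collaborator device applies the model itself
// and re-renders its local copy of the image at full diagnostic quality.
function clientSideUpdate(model: PresentationState): void {
  renderImage(model);
}

// Server-side rendering: the server applies the model and transmits the
// resulting pixels. renderOnServer is a hypothetical stand-in that would
// rasterize the image with the model applied and return encoded pixel data.
async function serverSideUpdate(
  model: PresentationState,
  renderOnServer: (m: PresentationState) => Promise<Uint8Array>
): Promise<Uint8Array> {
  return renderOnServer(model);
}
```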
While the collaboration session is active, the server 105 continues to receive presentation state information from the presenter's device and transmits this presentation state information to the device used by each collaborator to keep the medical image 140 as displayed to each collaborator synchronized (in real-time or pseudo real-time) with the medical image as displayed to the presenter. The presenter device may be configured to transmit presentation state information periodically, in response to a change in the presentation state of the medical image as displayed by the presenter's device (such as based on the presenter's interaction with the displayed image), or a combination thereof. For example, if the presenter selects a zoom button to increase the zoom level of the medical image to X %, the device used by the presenter transmits presentation state information to the server 105 indicating the new presentation state of the medical image (for example, that the zoom level for the image has changed to X %). A presenter may interact with a displayed medical image in various ways. As one example, a presenter may interact with a medical image 140 by annotating the medical image, such as by identifying a particular portion of the medical image 140. A presenter may also interact with a displayed medical image 140 by modifying the brightness property, contrast property, or other display property of the image, modifying a size of the image or a zoom level of the image, or the like. In some embodiments, the presenter's cursor position may also be displayed within the medical image 140. For example, as illustrated in FIG. 3, a graphical representation of the presenter's cursor may be displayed within the medical image 140 as displayed on each collaborator device.
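A change-driven transmission of presentation state from the presenter device might be sketched as follows; the JSON-equality comparison and the sendToServer callback are illustrative assumptions:

```typescript
// Sketch of change-driven transmission from the presenter device:
// presentation state is sent only when it differs from the last state
// sent, e.g., after a zoom button press changes zoomLevel.
let lastSent: string | null = null;

function maybeTransmit(
  state: PresentationState,
  sendToServer: (s: PresentationState) => void
): void {
  const serialized = JSON.stringify(state);
  if (serialized !== lastSent) {
    lastSent = serialized;
    sendToServer(state); // only the changed presentation state travels
  }
}
```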
It should be understood that the presentation state information transmitted to the server 105 by the presenter device may take various forms. For example, as described above, the presentation state information may include presentation control commands that describe the effect of the presenter's interactions with a displayed medical image 140. In particular, the presentation state information can represent the current display state of the image 140 on the presenter's device, including, for example, a zoom level, a brightness level, a layout or configuration applied by the viewer, cursor position coordinates, and the like. Alternatively or in addition, however, the actual user interactions with the medical image 140 (mouse clicks, mouse movements, key presses, touches, and the like) may be transmitted to the server 105 as the presentation state information. The server 105 may forward these interactions to the collaborator devices, which can effectively replay the interactions on the medical image 140 as displayed on each collaborator device. Alternatively or in addition, the server 105 may convert these received interactions into presentation state information, which the server 105 can forward to each collaborator (for example, included in a presentation model), use to generate a modified version of the medical image 140 for transmission to each collaborator device, or a combination thereof. Also, in some embodiments, the presentation model transmitted by the server 105 may include the presentation state information as received from the presenter device (unaltered). Thus, in these situations, the server 105 may act as a relay for the presentation state information transmitted by the presenter device.
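The raw-interaction alternative might be sketched as follows; the RawInteraction type and the dispatch callback are hypothetical stand-ins for the local viewer's input handling:

```typescript
// Hypothetical raw-interaction form of the presentation state information:
// the presenter's input events are forwarded and replayed on each
// collaborator device rather than converted to display settings first.
type RawInteraction =
  | { kind: "click"; x: number; y: number }
  | { kind: "move"; x: number; y: number }
  | { kind: "key"; key: string };

function replayOnCollaborator(
  events: RawInteraction[],
  dispatch: (e: RawInteraction) => void // feeds the local viewer's input handler
): void {
  for (const e of events) {
    dispatch(e); // the local viewer reacts exactly as the presenter's did
  }
}
```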
Embodiments described herein provide a collaboration system that preserves diagnostic image quality by sharing image modifications between participants and allowing the shared modifications to be applied to the medical image displayed at each participant device, as compared to capturing and sharing screen shots. In other words, each participant in the collaboration session (each collaboration endpoint) renders its own version of a medical image while maintaining synchronization of the medical images displayed to each participant based on the medical image rendered by the presenter of the session. Not only does this configuration preserve the quality of the medical image, but it may also reduce the amount of data shared between the participant devices, as only modifications need to be shared and not screen shots.
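As a rough, illustrative comparison of the data volumes involved (the numbers below are assumptions, not measurements), consider an uncompressed screen capture versus a serialized presentation model:

```typescript
// Back-of-envelope comparison with illustrative numbers only:
// an uncompressed 1920x1080 RGB screen shot versus a small JSON model.
const screenshotBytes = 1920 * 1080 * 3; // 6,220,800 bytes (~6.2 MB) per update
const modelBytes = JSON.stringify({
  zoomLevel: 2.0,
  brightness: 0.5,
  contrast: 0.7,
  cursor: { x: 312, y: 480 },
}).length; // tens of bytes per update
console.log(screenshotBytes, modelBytes); // e.g., 6220800 vs ~74
```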
Various features and advantages of the embodiments described herein are set forth in the following claims.