This invention relates to medical diagnostic ultrasound systems and, in particular, to the use of ultrasound systems for remote diagnosis with remote system control.
Currently available medical ultrasound systems enable clinicians to conduct ultrasound scans on a patient, capture images, make measurements, and use built-in algorithms and report generation software to make diagnoses and report the results of a diagnosis. The clinicians who perform these procedures are experienced radiologists, cardiologists, obstetricians, or trained sonographers. Sufficient training is expected before a clinician can conduct the procedure and interpret the scanned images. However, such medical experts may not be readily available in a remote area. One approach to providing necessary diagnostic care is teleradiology, whereby an inexperienced person at the location of the patient is guided in the conduct of a scan by a remotely located medical expert. Teleradiology can be used not only to provide diagnostic imaging services to remote areas where skilled diagnostic personnel are not otherwise available, but also to provide convenience and reduce costs for patients who need frequent scanning, such as those at high risk for certain conditions or patients who have recently undergone a procedure which requires periodic follow-up monitoring. In the usual diagnostic scenario, ultrasound scanning is done by an experienced sonographer at a facility with a variety of ultrasound systems for different diagnostic procedures. The patient must travel to the diagnostic facility, which may often be inconvenient or difficult for the patient. Moreover, the patient or his or her insurance company can incur considerable expense for scanning performed at such highly equipped facilities staffed by expert sonographic personnel. These costs and inconvenience can mount when a patient requires frequent scanning to monitor an ongoing medical condition, for instance.
An approach which reduces such costs and inconvenience is to have the scanning performed by someone who is not an experienced sonographer, but under the direction of a remotely located clinician who is in communication with the person performing the scan. That person may be a visiting nurse, a caregiver, or even the patient himself. With teleradiology, the images produced by the scan at the patient site are viewed by the remote expert, who assures that the proper diagnostic images are acquired and subsequently reviewed. When teleradiology is employed in this way, the patient does not have to travel to a hospital or diagnostic imaging facility, and the costs incurred by the reviewing facility are minimized.
Providing ultrasound scanning in this way does require, however, that the ultrasound scanner be present at the patient site, such as the home of the patient. The costs attendant to providing an ultrasound scanner for patient use can be minimized by using a compact, highly portable scanner, such as one of the Lumify™ ultrasound probes available from Philips Healthcare of Andover, Mass. The Lumify probes are designed for specific scanning procedures, such as curved array probes for radiological procedures and phased array probes for cardiovascular procedures. These probes contain all of the circuitry necessary to produce an ultrasound image in the probe case itself. A Lumify probe thus needs only a display device to complete a fully functional ultrasonic diagnostic imaging system. The display device can be any one of the commonly available desktop, laptop, or tablet computers, or even a standard smartphone. The software necessary to interface a Lumify probe to one of these standard devices can be downloaded from the user's application provider. Once the app is downloaded and installed, the Lumify probe and the user's display device form a fully functional ultrasound system.
Conventionally, fully functional ultrasound systems are always under the control of medical professionals who take great pains to ensure that they are always used properly. One guiding principle in the use of ultrasound by ultrasound professionals is the ALARA principle. This acronym stands for “as low as reasonably achievable,” which means that the ultrasound professional uses ultrasound in a way which achieves the diagnostic objective while exposing the patient to the minimal amount of ultrasonic energy necessary to provide the required images. While ultrasound uses non-ionizing radiation and is generally safe under virtually all conditions, including the imaging of unborn infants, the ALARA principle is nonetheless followed by all responsible ultrasound diagnosticians. But when an ultrasound system is left in the hands of someone inexperienced in ultrasound, or of the patient himself, this guiding principle may not always be followed. The patient or the person doing the scanning may use the system to perform unauthorized scanning, not only of the patient but even of other persons. Conventional preventive measures such as password-protecting the ultrasound system may not be fully effective. Once the patient or caregiver has received the password for the initial scanning procedure, there is no way to prevent its use for subsequent unauthorized scanning. It is therefore desirable to provide an ultrasound system which can be used by laypersons or caregivers to scan a person at home or another location outside a diagnostic facility, but which still prevents any use of the ultrasound system for unauthorized scanning.
Accordingly, it is an object of the present invention to make ultrasound imaging available at the site of a patient with the help of an expert at a remote location. It is a further object to do so in a way that facilitates local scanning with remote expert assistance without also facilitating improper use of the scanning device, such as unauthorized scanning with the device. A solution which meets these objectives will enable diagnostic imaging in locations where diagnostic imaging experts are not readily available, whether for reasons of cost or simply of convenience.
In accordance with the principles of the present invention, a system and method are described which enable ultrasonic imaging by inexperienced personnel under the guidance of a remotely located diagnostic imaging expert. Improper use of the scanning device is prevented by having the remote expert control the operation of the ultrasound system, including such operations as enabling or disabling ultrasound energy transmission by an ultrasound probe and enabling or disabling imaging on the ultrasound system. The remote expert can thereby direct and control a scanning procedure, remotely enabling the ultrasound system when an authorized scan is to be performed, viewing images from the ultrasound exam on his or her image display, changing scan settings if needed, and limiting the local display and/or disabling the ultrasound system when the authorized procedure is completed.
In the drawings:
FIG. 1 illustrates an ultrasound scanning device and a remote expert diagnostic workstation configured in accordance with the principles of the present invention.
FIG. 2 illustrates in block diagram form an ultrasound system configured for use by an untrained user under the control of a remote diagnostic clinician in accordance with the principles of the present invention.
FIG. 2a illustrates in detail the command decoder circuitry of FIG. 2.
FIG. 3 is a flowchart illustrating the sequence of a scanning procedure using an ultrasonic scanning device at the site of a patient and a remotely located expert in accordance with the present invention.
FIG. 1 illustrates at the top of the drawing a highly portable ultrasonic scanning device comprising an L12-5 Lumify™ ultrasound probe 8 coupled to a handheld display device in the form of a smartphone 100. The Lumify ultrasound system, available from Philips Healthcare of Andover, Mass., USA, consists of a Lumify probe such as the one shown in the drawing, and an image display device, which may comprise a PDA, laptop computer, desktop computer, tablet computer, or smartphone 100 as shown in this illustration. In this system all of the ultrasound-specific components, such as the transducer array, beamformer, ultrasound signal processor, and B mode and Doppler processors, are constructed in piezoelectric and integrated circuit form and located in the probe 10. The display device 100 contains the image display and the user interface (control) components. The display device also contains a software application specific to the Lumify probe, which causes the ultrasound image and the user controls to be displayed on the device screen as shown in the drawing. In this smartphone implementation the user holds the ultrasound probe 8 against the skin of the patient with one hand while holding the display and user control device 100 with the other hand. The acquired ultrasound images are displayed on the display device, and the same images are sent wirelessly to the display device of a remote expert. In the illustrated implementation the expert's ultrasound image display device is the screen of a workstation 108 which is controlled by a keyboard 106. The expert sends commands to the scanning device from the workstation and receives images from the scanning device by means of a modem or wireless radio 132, these communications being carried over a data network 40, e.g., the Internet, which may take the form of a wired network 42 or a wireless network as indicated by 44.
FIG. 2 illustrates the components of an exemplary ultrasonic scanning device in block diagram form. A transducer array 12 is provided in an ultrasound probe 10 for transmitting ultrasonic waves and receiving echo information. The transducer array 12 may be a one- or two-dimensional array of transducer elements capable of scanning in two or three dimensions, for instance, in both elevation (in 3D) and azimuth. The transducer array 12 is coupled to a microbeamformer 14 in the probe which controls transmission and reception of signals by the array elements. Microbeamformers are capable of at least partial beamforming of the signals received by groups or “patches” of transducer elements as described in U.S. Pat. No. 5,997,479 (Savord et al.), U.S. Pat. No. 6,013,032 (Savord), and U.S. Pat. No. 6,623,432 (Powers et al.). The microbeamformer is coupled to a transmit/receive (T/R) switch 16 which switches between transmission and reception and protects the main beamformer 20 from high energy transmit signals. In the event that the microbeamformer 14 performs all of the beamformation, as it does in the Lumify probe, a main beamformer is unnecessary. The transmission of ultrasonic beams from the transducer array 12 under control of the microbeamformer 14 is directed by a transmit controller 18 coupled to the T/R switch and the microbeamformer 14, which receives input from the user's operation of a user interface or control panel (not shown) and, in this implementation of the present invention, receives an “enable” signal from a command decoder 34. Among the transmit characteristics controlled by the transmit controller are the timing, amplitude, phase, and polarity of transmit waveforms. Beams formed in the direction of pulse transmission may be steered straight ahead from (orthogonal to) the transducer array, or at different angles for a wider field of view. They may also be focused at different depths in the body of a subject.
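As a rough illustration of the transmit timing the transmit controller computes, the following sketch calculates per-element delays for steering a beam from a one-dimensional array. The element count, element pitch, sound speed, and steering angle are illustrative assumptions only and are not parameters of any particular probe.

```python
import numpy as np

def steering_delays(num_elements, pitch_m, angle_deg, c=1540.0):
    """Per-element transmit delays (seconds) that steer a beam from a 1-D
    array by angle_deg away from the array normal. Delays are shifted so
    the earliest-firing element has zero delay."""
    angle = np.radians(angle_deg)
    x = (np.arange(num_elements) - (num_elements - 1) / 2.0) * pitch_m
    delays = x * np.sin(angle) / c          # classic geometric steering law
    return delays - delays.min()

# Illustrative numbers: 128 elements, 0.3 mm pitch, beam steered 20 degrees
d = steering_delays(128, 0.3e-3, 20.0)
print(d[:4])                                # delays (s) for the first elements
```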
The echoes received by a group of transducer elements are beamformed by appropriately delaying them and then combining them. The partially beamformed signals produced by the microbeamformer 14 from each patch are coupled to a main beamformer 20 where partially beamformed signals from individual patches of transducer elements are combined into a fully beamformed coherent echo signal. For example, the main beamformer 20 may have 128 channels, each of which receives a partially beamformed signal from a patch of 12 transducer elements. In this way the signals received by over 1500 transducer elements of a two-dimensional array can contribute efficiently to a single beamformed signal.
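A minimal sketch of the two-stage delay-and-sum beamforming described above, in which each patch of elements is partially summed by the microbeamformer and the patch outputs are then combined by the main beamformer, is given below. The patch size, channel count, and crude integer-sample delays are simplifying assumptions for illustration.

```python
import numpy as np

def patch_partial_sum(patch_signals, patch_delays_samples):
    """Microbeamformer stage: delay each element signal in a patch by an
    integer number of samples and sum them into one partial beam signal."""
    n = patch_signals.shape[1]
    out = np.zeros(n)
    for sig, d in zip(patch_signals, patch_delays_samples):
        out[d:] += sig[:n - d]              # crude integer-sample delay
    return out

def main_beamform(partial_sums, channel_delays_samples):
    """Main beamformer stage: align and sum the partial sums from all
    patches into a single coherent echo signal."""
    n = partial_sums.shape[1]
    beam = np.zeros(n)
    for sig, d in zip(partial_sums, channel_delays_samples):
        beam[d:] += sig[:n - d]
    return beam

# 128 patches of 12 elements each: 1536 elements contribute to one beam
rng = np.random.default_rng(0)
elements = rng.standard_normal((128, 12, 2048))      # per-element RF traces
partials = np.array([patch_partial_sum(p, [0] * 12) for p in elements])
beam = main_beamform(partials, [0] * 128)
```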
The coherent echo signals undergo signal processing by a signal processor 26, which includes filtering by a digital filter and noise reduction, such as by spatial or frequency compounding, as shown in U.S. Pat. No. 4,561,019 (Lizzi et al.) and U.S. Pat. No. 6,390,981 (Jago), for instance. The signal processor can also shift the frequency band to a lower or baseband frequency range. The digital filter of the signal processor 26 can be a filter of the type disclosed in U.S. Pat. No. 5,833,613 (Averkiou et al.), for example. The processed echo signals are then demodulated into quadrature (I and Q) components by a quadrature demodulator 28, which provides signal phase information.
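The quadrature demodulation step can be pictured as mixing the echo signal with a cosine and a sine at the transducer center frequency and low-pass filtering the products. The center frequency, sampling rate, and simple moving-average filter in the sketch below are illustrative assumptions, not characteristics of the signal processor 26.

```python
import numpy as np

def quadrature_demodulate(rf, fs, f0, lp_taps=33):
    """Demodulate a real RF echo trace into baseband I and Q components.
    fs: sampling rate (Hz); f0: nominal transducer center frequency (Hz)."""
    t = np.arange(len(rf)) / fs
    i_mix = rf * np.cos(2 * np.pi * f0 * t)
    q_mix = -rf * np.sin(2 * np.pi * f0 * t)
    lp = np.ones(lp_taps) / lp_taps          # simple moving-average low-pass
    i = np.convolve(i_mix, lp, mode="same")
    q = np.convolve(q_mix, lp, mode="same")
    return i, q

# Illustrative numbers only: 20 MHz sampling, 5 MHz center frequency
fs, f0 = 20e6, 5e6
rf = np.cos(2 * np.pi * f0 * np.arange(1024) / fs)   # stand-in echo trace
I, Q = quadrature_demodulate(rf, fs, f0)
```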
The beamformed and processed coherent echo signals are coupled to a B mode processor 52 which produces a B mode tissue image. The B mode processor performs amplitude (envelope) detection of the quadrature demodulated I and Q signal components by calculating the echo signal amplitude as the square root of (I²+Q²). The quadrature echo signal components are also coupled to a Doppler processor 54, which stores ensembles of echo signals from discrete points in an image field which are then used to estimate the Doppler shift at points in the image with a fast Fourier transform (FFT) processor. For a color Doppler image as shown on the display device 100 of FIG. 1, the estimated Doppler flow values at each point in a blood vessel are wall filtered and converted to color values using a look-up table. The B mode image signals and the Doppler flow values are coupled to a scan converter 30 which converts the B mode and Doppler samples from their acquired R-θ coordinates to Cartesian (x,y) coordinates for display in a desired display format, e.g., a sector display format or a rectilinear display format as shown in FIG. 1. Either the B mode image or the Doppler image may be displayed alone, or the two may be shown together in anatomical registration, in which case the color Doppler overlay shows the blood flow within the structure of tissue and vessels in the image.
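To make the amplitude detection and Doppler estimation steps concrete, the sketch below detects the B mode envelope as the square root of (I²+Q²) and estimates the Doppler shift at one image point from an ensemble of I/Q samples with an FFT. The ensemble length, pulse repetition frequency, and crude mean-removal wall filter are illustrative assumptions.

```python
import numpy as np

def bmode_envelope(i, q):
    """Amplitude (envelope) detection of quadrature components: sqrt(I^2 + Q^2)."""
    return np.sqrt(i**2 + q**2)

def doppler_shift_fft(ensemble_iq, prf):
    """Estimate the Doppler shift (Hz) at one image point from an ensemble of
    complex I/Q samples acquired at the pulse repetition frequency prf."""
    spectrum = np.fft.fft(ensemble_iq - ensemble_iq.mean())  # crude wall filter: remove DC
    freqs = np.fft.fftfreq(len(ensemble_iq), d=1.0 / prf)
    return freqs[np.argmax(np.abs(spectrum))]

# Illustrative: 16-pulse ensemble at a 4 kHz PRF with a 500 Hz Doppler shift
prf, f_d = 4000.0, 500.0
n = np.arange(16)
ensemble = np.exp(2j * np.pi * f_d * n / prf)
print(doppler_shift_fft(ensemble, prf))    # prints approximately 500 Hz
```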
The ultrasound images produced by the scan converter 30 are coupled to an image processor 32. The image processor may further smooth and filter the image for display, and add graphical information such as patient name, date, and scanning parameters. The image processor may also convert 3D data sets or sets of image planes into three dimensional images by volume rendering as described in U.S. Pat. No. 6,530,885 (Entrekin et al.), or extract individual image planes from a 3D data set by multiplanar reformatting as described in U.S. Pat. No. 6,443,896 (Detmer). For local display on the display device the processed images are coupled to the display device's display controller 138, which displays the images on a display 140.
In an implementation of the present invention the ultrasound images produced by the image processor 32 are coupled to a MODEM or WiFi radio 130 for transmission to the display device of a remote expert. The MODEM or WiFi radio 130 can be the conventional WiFi transceiver found in smartphones and in laptop, tablet, and desktop computers. In addition to transmitting ultrasound images to the remote expert, the MODEM/WiFi radio 130 also receives commands transmitted by the remote expert to control the ultrasound functionality of the system of FIG. 2. The received commands, generally in the form of digital words, are coupled to a command decoder 34 which decodes the commands into signals which control the ultrasound system. In the implementation of FIG. 2, a command which enables ultrasound transmission from the transducer array 12 decodes to produce an “enable” signal that is applied to the transmit controller 18. The transmit controller, microbeamformer 14, and transducer array 12 are then enabled to transmit ultrasonic energy. When the enable signal is changed to disable by another command, the transmit controller 18 and transducer array 12 cease ultrasound transmission. Also shown in the drawing is an enable signal produced by decoding of a command to enable local display of ultrasound images. When this enable signal is produced and applied to the display controller 138, the display controller will display the ultrasound images produced by the system on the local display 140. When another command disables the enable signal, the display controller is inhibited from displaying the ultrasound images on the local display 140. In a typical implementation the command decoder 34 receives a variety of commands from the remote expert to control an ultrasound scan, in addition to the two shown in FIG. 2. Other commands which may be sent to the ultrasound system include those to select or change the imaging mode, e.g., changing from B mode to colorflow mode and back again, or to adjust the focal depth of the image. The command set generally would include those commands used in the conduct of a usual ultrasound exam.
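One way to picture the commands described above is as small, fixed digital words sent from the expert's workstation to the scanning device. The command names, word values, and use of a plain TCP socket in the sketch below are hypothetical choices made for illustration; they are not the actual command set or transport of the system.

```python
import socket
import struct

# Hypothetical 16-bit command words; the actual command set and encoding
# used by the system are not specified here.
COMMANDS = {
    "TRANSMIT_ENABLE":  0x0001,
    "TRANSMIT_DISABLE": 0x0002,
    "DISPLAY_ENABLE":   0x0003,
    "DISPLAY_DISABLE":  0x0004,
    "MODE_BMODE":       0x0010,
    "MODE_COLORFLOW":   0x0011,
}

def send_command(host, port, name):
    """Send one command word to the scanning device over a TCP connection."""
    word = COMMANDS[name]
    with socket.create_connection((host, port), timeout=5) as s:
        s.sendall(struct.pack("!H", word))   # network byte order, 16 bits

# Example (hypothetical address): enable transmission for an authorized exam
# send_command("192.0.2.10", 5000, "TRANSMIT_ENABLE")
```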
When the ultrasound system display device is a smartphone 100 as shown in FIG. 1, it contains cellular network circuitry 110 for wireless transmission in addition to WiFi communication over the Internet. In the implementation shown in FIG. 2, audio communication is done over a cellular network. This includes voice input from the smartphone's microphone 116, which is coupled to the cellular network circuitry and transmitted by means of the antenna 120, and voice communication from the remote expert, which is received over the cellular network and reproduced on the smartphone's loudspeaker 114. The smartphone's camera 118, which in the case of a desktop computer implementation would comprise a Webcam, is coupled to the image processor 32 so that live images can be sent to the remote expert via the MODEM/WiFi radio 130. Thus, in the implementation shown in FIG. 2, video communication is via WiFi, and audio communication is via the cellular network. However, this partitioning of communication in a particular implementation is a matter of design choice; all communications can be via WiFi, all can be via the cellular network, or they can be partitioned in some other way. The example of FIG. 2 shows a dotted line from the output of the image processor 32 to the cellular network circuitry 110, which would be used to conduct both audio and video communication over the cellular network. The cellular network circuitry can also be configured to receive the control commands sent from the clinician's remote location for the command decoder 34.
The structure of one implementation of the command decoder 34 is shown in FIG. 2a. A received digital command word is coupled to a digital comparator 82. The comparator sequentially compares the received command with command words stored in a command register 80, implemented as a digital memory, each of which corresponds to a known control effect which the ultrasound system can implement. When a received command matches one stored in the command register, the control function of the identified command is implemented by the ultrasound system. For example, when the received command word matches the Transmit Enable command word from the command register, the comparator initiates the application of a transmit enable signal to the transmit controller 18 as shown in FIG. 2, which enables the transmission of ultrasound energy by the transducer array. Using this command, a remote expert can enable the ultrasound probe in the hands of a patient or layperson for ultrasonic imaging. When the imaging procedure is completed, the remote expert can send a corresponding Transmit Disable command which, when matched with the known Transmit Disable command from the command register, disables the ultrasound probe from further ultrasonic energy transmission. By enabling the ultrasound probe to transmit ultrasound only for a legitimate imaging exam, the remote expert assures adherence to the ALARA principle in the use of the system in the hands of a layperson.
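The comparator-and-register behavior of the command decoder can be sketched in software as follows: each received word is compared against the stored command words and, on a match, the corresponding enable or disable signal is asserted. The register contents and signal names mirror the hypothetical command words of the earlier sketch.

```python
# Hypothetical command register contents, mirroring the sending sketch above.
COMMAND_REGISTER = {
    0x0001: ("transmit_enable", True),
    0x0002: ("transmit_enable", False),
    0x0003: ("display_enable", True),
    0x0004: ("display_enable", False),
}

class CommandDecoder:
    """Compares each received command word against the stored words and,
    on a match, drives the corresponding control signal."""

    def __init__(self):
        self.signals = {"transmit_enable": False, "display_enable": False}

    def decode(self, word):
        match = COMMAND_REGISTER.get(word)
        if match is None:
            return False                     # unknown word: ignore it
        signal, value = match
        self.signals[signal] = value         # e.g. enable the transmit controller
        return True

decoder = CommandDecoder()
decoder.decode(0x0001)                       # remote expert enables transmission
assert decoder.signals["transmit_enable"] is True
decoder.decode(0x0002)                       # exam over: transmission disabled
assert decoder.signals["transmit_enable"] is False
```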
Similarly, when the received command matches the Display Enable command from the command register, the comparator initiates the application of a display enable signal to the display controller 138 as shown in FIG. 2. Using this command the remote expert can control when the ultrasound images produced during the scan will be shown to individuals at the site of the scanning, and when display of the images will be inhibited. Many countries have laws in place restricting the use of ultrasound to use by an expert for medical diagnostic purposes only. In some countries gender identification is an issue, and it is illegal to determine the gender of a fetus before birth using ultrasound. Only licensed persons can perform ultrasound imaging. If an ultrasound system is provided to a patient for legitimate use, such as monitoring the condition of the patient after surgery, it is conceivable that the device could be misused for gender identification purposes. One approach to preventing such misuse is to block the display of images to unqualified or unauthorized users at the scanning site, and to allow only the remote expert to view the ultrasound images. The implementation of FIG. 2 provides this capability. In the Lumify system of FIG. 1, the display controller 138 and the display 140, the MODEM/WiFi radio 130, the camera 118, the microphone 116, the loudspeaker 114, and the cellular network circuitry 110 and antenna 120 are located in the smartphone display unit 100, while the other components of FIG. 2 are located in the probe housing 8. The user controls on the touchscreen display of the smartphone display unit, not shown in FIG. 2, can also send control signals to the probe unit 8, which appropriately control the other components of the scanning system in accordance with the user's manipulation of the controls. Other implementations, for instance ones using a tablet or laptop computer as the display device, may partition the components differently so that more components are located in the computer.
A method for use of the ultrasound scanning device of FIG. 1 by an untrained user to perform scanning with the probe, with control of the procedure by a remote expert clinician at the image workstation 106, 108, is outlined by the flowchart of FIG. 3. In step 60 a patient is notified of the need for an ultrasound scan. This notification may be by phone, email, or text message, and may be made either because the patient has developed symptoms which the patient described to the medical professional and which, in the judgment of the medical professional, require an ultrasound examination, or because the time has come for a scheduled ultrasound monitoring examination, such as monitoring a post-surgery patient for possible internal bleeding or fluid build-up. When a person at the patient site, such as the patient himself or herself, is ready to conduct the necessary scan, the clinician establishes voice and video communication in step 62 with the patient and/or the person who will scan the patient. The video can be provided, for instance, by the camera of the smartphone or by a Webcam, and the audio can be over a cellular network or by using a voice-over-IP protocol communicating over the Internet. In step 64 the remote clinician sends a command from the workstation 106, 108 to enable the transmission of ultrasound by the ultrasound probe 8. In step 66 the remote clinician then guides the person holding the ultrasound probe, the patient in this example, in properly positioning the probe against the body of the patient so that the desired anatomy is in the image field. This is done in this example by observing the probe placement with the live images transmitted by the smartphone camera or a Webcam. Once the remote clinician sees ultrasound images of the proper anatomy on the workstation (images which may or may not be visible to the person(s) at the scanning site, depending upon transmission of a Display Enable command), the remote clinician will begin carefully observing the ultrasound images and generally recording them. In step 68 the remote clinician sends additional commands to the ultrasound system to optimize the exam, such as mode change commands and focal depth change commands, and may issue additional verbal instructions to guide probe manipulation. The exam is complete when the clinician has acquired and stored the necessary images for a diagnosis. With the exam completed, the remote expert sends a command to disable further ultrasound transmission by the probe in step 70 and ends the voice and video communication with the patient and others at the scanning site.
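The remote clinician's side of the procedure of FIG. 3 can be summarized as a simple sequence of steps. The sketch below assumes hypothetical callables for the command link, image review, and voice guidance; it is an outline of the workflow, not an implementation of the workstation software.

```python
def remote_guided_exam(send_command, review_images, guide_operator):
    """Sketch of the remote clinician's workflow of FIG. 3 (steps 62-70).
    send_command, review_images, and guide_operator are hypothetical callables
    standing in for the command link, the image display, and the voice link."""
    send_command("TRANSMIT_ENABLE")                       # step 64: enable the probe
    guide_operator("Place the probe as instructed.")      # step 66: guide positioning
    while not review_images():                            # step 68: observe, record, adjust
        guide_operator("Adjust the probe position slightly.")
    send_command("TRANSMIT_DISABLE")                      # step 70: exam complete, disable probe
```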
Variations of the system and method described above will readily occur to those skilled in the art. Instead of enabling or disabling the transmit controller to control the production of ultrasonic energy by the probe, the enabling command may enable an ultrasound image acquisition program executed by the ultrasound system for image acquisition. Alternatively, the image acquisition program may be downloaded to the ultrasound system under control of the remote clinician, then removed from the ultrasound system after the ultrasound exam is completed. Some or all of the image acquisition program may be resident on the cloud and executed there in whole or in part and never be fully loaded onto the ultrasound system in the hands of a layperson scanner. Instead of enabling and disabling the transmit controller, the enable command may control the application of high voltage to the transducer drivers in the microbeamformer by closing (and later opening) a switch in the high voltage supply line. Instead of using command words, an enabling code may be verbally given to the person conducting the scan to input into the ultrasound system, provided that the code is only effective for a single scan and thus cannot be misused at a later time.
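The single-use enabling code mentioned above could behave like a one-time token: the system stores a hash of the code issued for the current exam and marks it consumed after the first successful entry. The six-digit format, hashing scheme, and in-memory storage in the sketch below are assumptions for illustration only.

```python
import hashlib
import secrets

class OneTimeEnableCode:
    """Issue a verbal enabling code that is valid for a single scan only."""

    def __init__(self):
        self._digest = None              # hash of the currently issued code
        self._used = False

    def issue(self):
        code = f"{secrets.randbelow(10**6):06d}"          # e.g. a six-digit code
        self._digest = hashlib.sha256(code.encode()).hexdigest()
        self._used = False
        return code                      # spoken to the person doing the scan

    def redeem(self, entered):
        """Return True (and consume the code) only on the first correct entry."""
        if self._used or self._digest is None:
            return False
        if hashlib.sha256(entered.encode()).hexdigest() == self._digest:
            self._used = True
            return True
        return False

codes = OneTimeEnableCode()
code = codes.issue()
assert codes.redeem(code) is True        # enables this one scan
assert codes.redeem(code) is False       # cannot be reused at a later time
```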
It should be noted that an ultrasound system suitable for use in an implementation of the present invention, and in particular the component structure of the workstation and ultrasound systems of FIGS. 1 and 2, may be implemented in hardware, software or a combination thereof. The various embodiments and/or components of an ultrasound system, for example, the modules, or components and controllers therein, also may be implemented as part of one or more computers or microprocessors. The computer or processor may include a computing device, an input device, a display unit and an interface, for example, for accessing the Internet as shown in FIG. 1. The computer or processor may include a microprocessor. The microprocessor may be connected to a communication bus, for example, to access a PACS system or the data network shown in FIG. 1. The computer or processor may also include a memory. The memory devices such as the command register 80 may include Random Access Memory (RAM) and Read Only Memory (ROM). The computer or processor further may include a storage device, which may be a hard disk drive or a removable storage drive such as a floppy disk drive, optical disk drive, solid-state thumb drive, and the like. The storage device may also be other similar means for loading computer programs or other instructions into the computer or processor.
As used herein, the term “computer” or “module” or “processor” or “workstation” may include any processor-based or microprocessor-based system including systems using microcontrollers, reduced instruction set computers (RISC), ASICs, logic circuits, and any other circuit or processor capable of executing the functions described herein. The above examples are exemplary only, and are thus not intended to limit in any way the definition and/or meaning of these terms.
The computer or processor executes a set of instructions that are stored in one or more storage elements, in order to process input data. The storage elements may also store data or other information as desired or needed. The storage element may be in the form of an information source or a physical memory element within a processing machine.
The set of instructions of an ultrasound system, including those controlling the acquisition, processing, and transmission of ultrasound images as described above, may include various commands that instruct a computer or processor as a processing machine to perform specific operations such as the methods and processes of the various embodiments of the invention. The set of instructions may be in the form of a software program. The software may be in various forms, such as system software or application software, and may be embodied on a tangible and non-transitory computer readable medium. Further, the software may be in the form of a collection of separate programs or modules, a program module within a larger program, or a portion of a program module. The software also may include modular programming in the form of object-oriented programming. The processing of input data by the processing machine may be in response to operator commands, or in response to results of previous processing, or in response to a request made by another processing machine. In the Lumify system smartphone shown in FIG. 1, for instance, software instructions are conventionally employed to create and control the display and user control functions described above and are downloaded to the smartphone as an app (application software).
Furthermore, the limitations of the following claims are not written in means-plus-function format and are not intended to be interpreted based on 35 U.S.C. 112, sixth paragraph, unless and until such claim limitations expressly use the phrase “means for” followed by a statement of function devoid of further structure.