The present invention relates to substantially real-time medical scanning and imaging, and more particularly to a virtual control interface for controlling real-time scanning and display machines in medical contexts.
Effective use of a substantially real-time medical scanner, such as, for example, an ultrasound machine, generally requires a user to control both the position and orientation of a probe as well as the scanning machine itself.
Conventionally, substantially real-time scanning machines, such as, for example, ultrasound machines, provide customized mouse and keyboard controls for the scanner, as well as a selection of scanning probes which are attached to the scanner. While scanning a patient, a user (generally a health care clinician, known as a “sonographer” in ultrasound contexts) handles a probe with one hand (for example, the right hand for abdominal scans or the left hand for cardiac scans) and manipulates keyboard and mouse interfaces to the scanning machine with the other. A user must perform this handiwork while simultaneously watching a computer monitor or other display where the acquired images appear. Given the general complexity of image controls and the close attention to the displayed anatomies that is required for diagnosis and/or intervention, dividing a user's attention in this manner can impede or even degrade his performance of these tasks.
Such image control tasks can include, for example: (i) gain control for an ultrasound signal (conventionally implemented using several slide potentiometers to control the gain at several depths from a probe's tip); (ii) transmit power and overall gain control (conventionally implemented using rotary potentiometers); (iii) linear measurements and area/perimeter measurements using elliptical approximation, continuous trace or trace by points (conventionally implemented using a mouse-like track ball for measurements and text positioning); (iv) starting and stopping 3D modes, Doppler mode, panoramic view mode, etc., and controlling each mode's particular tools; and (v) adjusting a probe's scanning depth and angle of scan (in convex probes) (also conventionally implemented using rotary potentiometers).
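Task (i) above, depth-dependent gain, can be made concrete with a short sketch. The following is a minimal, illustrative model (not taken from the specification) of how a bank of slide-potentiometer settings, one per depth band, might be interpolated into a gain curve and applied to a single scan line; the function name, the normalized depth axis, and the dB convention are all assumptions for illustration only:

```python
import numpy as np

def apply_depth_gain(scan_line, slider_gains_db):
    """Apply depth-dependent gain to one ultrasound scan line.

    scan_line       -- 1-D array of echo amplitudes, ordered from the
                       probe tip (shallow) to maximum depth.
    slider_gains_db -- gain settings in dB, one per slide potentiometer,
                       each nominally controlling one depth band.
    """
    # Normalize depth to [0, 1] along the scan line.
    depths = np.linspace(0.0, 1.0, len(scan_line))
    # Place each slider's setting at an evenly spaced depth position.
    slider_positions = np.linspace(0.0, 1.0, len(slider_gains_db))
    # Interpolate the discrete slider settings into a smooth gain curve.
    gain_db = np.interp(depths, slider_positions, slider_gains_db)
    # Convert dB to a linear amplitude factor and apply it.
    return scan_line * 10.0 ** (gain_db / 20.0)
```

With all sliders at 0 dB the line passes through unchanged; setting every slider to 20 dB scales each amplitude by a factor of ten.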
Additionally, conventional real-time medical scanner interfaces, such as, for example, those to ultrasound machines, are not programmable. In general, once a given functionality is assigned to a particular key, lever or button on a given ultrasound machine, that interface device's functionality cannot be reconfigured. Some machines provide function keys (such as, for example, F1, F2, etc.) that can be customized by a user, and some buttons can have an integrated light so that they can indicate their active/inactive status by being on or off. Nonetheless, it is often confusing to have buttons in place that are not active.
Notwithstanding the cumbersomeness of conventional interfaces, state of the art ultrasound machines allow a user to perform numerous image processing functionalities on raw ultrasound data, and these functionalities are capable of being updated, modified, upgraded or reprogrammed. Often such image processing functionalities are specific to a given medical specialty, such as, for example, fetal ultrasound or cardiology. In such cases enhanced ultrasound machines can, for example, automatically calculate cranial size and diameter, head to body ratios, heart and lung size, etc., or can be optimized to display the face of a baby.
Thus, using a set of fixed interface controls which are hard-wired to fixed operational and control functions presents a significant problem for real-time scanning interfaces, where specialized functionalities and upgrades thereto are becoming more and more common. For example, a designer of an ultrasound scanner has to decide whether to provide a few programmable buttons that can each have many functionalities mapped to them, or to use many buttons, each dedicated to a specific function. The latter choice is good for operators, since the needed buttons can be memorized and thus quickly located, but it tends to clutter the keyboard with keys that might never be used.
Additionally, conventional real-time scanning modalities use a series of two-dimensional images to offer insight into what are essentially three-dimensional anatomical structures, such as, for example, fetuses, livers, kidneys, hearts, lungs, etc. Thus, for example, state of the art 3D ultrasound technology converts acquired 2D scan images into 3D volumes, and provides users with 3D interactive display and processing functionality (such as, for example, rotation, translation, segmentation, color look-up tables, zoom, cropping, etc.) to allow users to better depict the actual structures under observation and to operate on the displayed 3D images in a three-dimensional way. It is thus a difficult task to map such 3D display and processing operations to a conventional ultrasound interface, which is simply a keyboard and mouse. It is also a difficult task to ask a user to interact with a standard keyboard-and-mouse type interface for basic image control operations, as described above, and to then use another, perhaps more natural, interface for 3D interaction with a displayed volume. These difficulties will only be exacerbated as increasingly complex 3D interactive functionalities are offered on substantially real-time scanning machines.
What is needed in the art is a control interface for substantially real-time imaging systems that solves the above described problems of the prior art.
A virtual control system for real-time imaging machines is presented. In exemplary embodiments according to the present invention, a virtual control system comprises a physical interface communicably connected to a scanner/imager, such as, for example, an ultrasound machine. The scanner/imager has, or is communicably connected to, a processor that controls the display of, and user interaction with, a virtual control interface. In operation, a user can interact with the virtual control interface by physically interacting with the physical interface. In exemplary embodiments according to the present invention the physical interface can comprise a handheld tool and a stationary tablet-like device. In exemplary embodiments according to the present invention the control system can further include a 3D tracking device that can track both an ultrasound probe as well as a handheld physical interface tool. In such exemplary embodiments a user can control scan and display functions of the ultrasound machine by moving a handheld tool relative to the stationary tablet, and can perform 3D interactive display and image processing operations on a displayed 3D image by manipulating the handheld tool within a defined 3D space. Alternatively, all control functions, both those associated with scan and display control and those associated with 3D interactive display and image processing, can be mapped to manipulations of the handheld tool in a defined 3D space.
It is noted that the patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawings will be provided by the U.S. Patent Office upon request and payment of the necessary fee.
In exemplary embodiments of the present invention an interface to real-time imaging systems (for example, ultrasound, but in general any scanner that obtains images from a body or object that need to be seen and interacted with in 3D) is provided. An interface according to such exemplary embodiments can, for example, allow an imaging system operator both to work on the imaged body or object in 3D and to control its 2D (and 1D, that is, pushing a button) interface in a ‘seamless’ manner, i.e., a manner that does not involve a change of tools, which wastes time and complicates the procedure.
In exemplary embodiments according to the present invention methods and apparatus for controlling ultrasound scanning machines using a virtual control panel are presented. According to such exemplary embodiments, both standard 2D image control as well as image acquisition and display operations of conventional ultrasound scanners (such as, for example, depth of scan or mode of scan, which are conventionally controlled by a keyboard and mouse) as well as 3D operations on volumetric 3D data (such as, for example, rotating or cropping a 3D volume, zooming into any part of a volume, defining a 3D cutting plane or picking an object within a volume) can be effected. If 3D imaging is not required in a given ultrasound application, an exemplary virtual keyboard according to an exemplary embodiment of the present invention can be used simply to more efficiently control a 2D ultrasound process. This method is significantly advantageous with respect to prior art touch screen interfaces, which are also virtual, in that the interaction space for 2D could be anywhere within easy reach of the clinician, whereas the touch screen requires that the monitor remain within his reach. In exemplary embodiments of the present invention the display is decoupled from the interaction space; with a touch screen, by contrast, if the display is large, so are the interaction gestures.
With reference to FIGS. 1(a)-1(c), an exemplary virtually controlled ultrasound system according to an exemplary embodiment of the present invention is depicted. Such an exemplary system contains a computer with graphics capabilities 165, image acquisition equipment 150, and a 3D tracking system. The virtual interface can be, for example, displayed on an ultrasound machine display 171 and interacted with using a variety of physical input devices as may be known. Moreover, a virtual control interface as described herein can have any design as may be desirable, ergonomic or appropriate in given contexts.
For example, in markets where users are accustomed to the physical keyboard of a conventional ultrasound machine, a virtual interface can appear like a keyboard displayed on the ultrasound machine, i.e., as a “virtual keyboard.” Functions can be, for example, mapped to the virtual keyboard as they are actually mapped to an actual keyboard, and a user, by interacting with the physical interface, can push virtual buttons on the virtual keyboard (and watch them light up or change color on the display to indicate their being virtually “pushed”) precisely as he would have on the actual keyboard. Such a virtual keyboard is shown in
For more sophisticated users, a virtual interface can appear more like a standard GUI on a computer, yet can be optimized for the imaging modality being controlled. Such a virtual interface could have, for example, a series of virtual control panels, each of which has various function buttons, sliders, display parameter choices or other input/output interfaces. Such a virtual interface would not need to be optimized as to available space and the ergonomics of pushing and reaching real buttons and a physical mouse, but rather could be optimized for ease of identification of control buttons, and for common work flow sequences. For example, for 3D ultrasound machines, the virtual interfaces and palettes provided by Volume Interactions Pte Ltd. of Singapore on its interactive 3D data display system, the Dextroscope™, used in connection with its RadioDexter™ software, could be ported and used as a virtual interface, offering optimized interactive control for a variety of 3D manipulations of data. Such an exemplary virtual interface is depicted in
As noted,
In exemplary embodiments of the present invention a virtual keyboard (as shown in the computer generated image 170) can contain, for example, a slider, various buttons, color look-up tables and other graphic interfaces, and can be manipulated by a user using, for example, a hand-held tool 105 which can be grasped much as one grasps a pen. Using such a tool 105, a user can, for example, interact with a surface 102 much as one interacts with a tablet. This “pen-and-tablet” 105, 102 mechanism can thus, for example, replace a standard physical ultrasound keyboard and mouse (or trackball, etc.). For example, the virtual interface can appear on monitor 171 upon touching the tool 105 to the tablet 102 and then disappear when the tool is removed from the surface. A user's position with respect to the tablet 102 can be tracked using, for example, standard tablet tracking systems, such as pressure sensing, or any other known means. The image of the hand-held tool 105 can be, for example, a virtual stylus, as seen in the image 170. Physical motion of the hand-held tool can be, for example, mirrored by virtual motions of the stylus in the image 170. The stylus can be used, for example, to manipulate the virtual keyboard. Thus, in 2D applications, such an exemplary “pen-and-tablet” 105, 102 physical interface can be, for example, all that is required to control an ultrasound scanner.
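The pen-and-tablet behavior just described can be summarized as a small event handler: touching the tablet shows the virtual keyboard, tool motion is mirrored by the virtual stylus, and lifting the tool hides the keyboard. The following sketch is illustrative only; the class and event names are assumptions, and the actual tracking hardware and rendering are abstracted away:

```python
class PenAndTabletInterface:
    """Minimal sketch of the pen-and-tablet interaction described above."""

    def __init__(self):
        self.keyboard_visible = False
        self.stylus_pos = None  # virtual stylus position on the display

    def on_tool_touch(self, x, y):
        # Touching the tool to the tablet makes the virtual keyboard appear.
        self.keyboard_visible = True
        self.stylus_pos = (x, y)

    def on_tool_move(self, x, y):
        # Physical motion of the tool is mirrored by the virtual stylus.
        if self.keyboard_visible:
            self.stylus_pos = (x, y)

    def on_tool_lift(self):
        # Removing the tool from the surface hides the virtual keyboard.
        self.keyboard_visible = False
        self.stylus_pos = None
```

In a real system the tracked tablet coordinates would drive these events, and `stylus_pos` would be rendered into the displayed image.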
In alternate exemplary embodiments according to the present invention, a hand-held device 105 can also be tracked by a 3D tracking system 160 as to 3D position and orientation within a defined 3D space. This can be effected, for example, by a hand-held device 105 containing, for example, 3D sensor 104, which can comprise, for example, a radio frequency or optical device which can be “seen” by tracking system 160, or can be effected using other known techniques. Such tracking in 3D space can enable a user to control and/or interact with a displayed 3D ultrasound image obtained from the ultrasound scanner.
For example, a 3D interface can be activated by lifting hand-held tool 105 from the surface of tablet 102 and moving it into a defined 3D space. Such a defined 3D space can be, for example, immediately above tablet 102, or, for example, closer to a patient's body. The latter case illustrates a useful feature of the present invention: a user, by means of the virtual interface, is physically decoupled from the actual scanning machine, and need not be physically proximate to it to control it.
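The switch between tablet-based 2D control and tracked 3D interaction can be modeled as a simple decision on the tool's tracked position. The sketch below is a hypothetical illustration, not the specification's implementation; the coordinate convention (z measured above the tablet surface, in meters) and the thresholds are assumptions:

```python
def select_mode(tool_pos, tablet_height=0.0, workspace_top=0.3):
    """Decide which interaction mode the tracked hand-held tool is in.

    tool_pos is an (x, y, z) reading from the 3D tracking system, with z
    measured above the tablet surface; thresholds are illustrative only.
    """
    x, y, z = tool_pos
    if z <= tablet_height + 0.005:   # within ~5 mm of the tablet surface
        return "2D"                  # virtual keyboard / panel control
    elif z <= workspace_top:
        return "3D"                  # interactive 3D volume manipulation
    return "idle"                    # outside the defined 3D space
```

Lifting the tool off the tablet thus moves the system from "2D" to "3D" without any change of physical tool, which is the seamlessness the interface aims for.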
Thus, in exemplary embodiments according to the present invention, a user can control the full functionality of an ultrasound scanner as well as interactively operate in 3D on 3D ultrasound images generated by the ultrasound machine. In such embodiments not only would the virtual stylus be used to interact with the virtual keyboard, but, when the user lifts hand-held tool 105 from the surface of tablet 102 and moves it into a defined 3D space, the stylus can operate as a virtual tool, as seen in
The virtual stylus could, for example, be active on the virtual object and be used to select portions of the virtual image to operate on using 3D (or 2D) image manipulations and processes in the same manner as a user operates on a 3D data set using a Dextroscope™.
Additionally, each of the ultrasound probe 130 and the hand-held tool 105 can have, for example, a user controllable button 120 and 103, respectively, in
An exemplary physical interface to a virtual control panel and associated apparatus according to an exemplary embodiment of the present invention is next described with reference to
With reference to
With reference to
As noted, when combined with a 3D tracking device, a user can both control scan and display parameters as well as operate in 3D space upon a 3D image displayed by the ultrasound machine. This functionality is next illustrated with reference to
In exemplary embodiments according to the present invention, a user can easily shift from controlling scan and display functions to controlling a resulting 3D volume obtained by an ultrasound scanner. Illustrating this functionality,
Finally,
There are various advantages to using a virtual interface according to an exemplary embodiment of the present invention. It allows a user to maintain a uniform line of vision while viewing images derived from an ultrasound or other substantially real-time scan. This contrasts with the conventional viewing of ultrasound images, where a user is required to shift focus from a monitor which displays an image to a keyboard and mouse in order to perform desired control functions. The present invention also solves the problems inherent in certain conventional devices which attempt to partially free a user's hand by mapping certain high-use control functions to an ultrasound probe and the remaining functions to the standard keyboard. Moreover, since a virtual keyboard can be made to appear in a form similar to the conventional physical keyboard, users of conventional systems can more easily adapt to the use of a virtual keyboard system.
Another advantage of a virtual control panel and associated physical interface is their ability to be programmed and reprogrammed, as opposed to a fixed control interface which can quickly become outdated or domain restricted. This is of great benefit to the manufacturers, who do not need to build new plastic and electronic interfaces each time new features are added.
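The reprogrammability described above comes down to the fact that, in a virtual panel, the binding from control to function is a software mapping rather than wiring. A minimal sketch, with entirely hypothetical button names and handlers, might look like:

```python
class VirtualControlPanel:
    """Sketch of a reprogrammable virtual control panel: the mapping from
    virtual buttons to functions lives in software and can be redefined
    at any time, e.g. by a software upgrade or a specialty-specific preset."""

    def __init__(self):
        self._bindings = {}

    def bind(self, button, handler):
        # (Re)assign a function to a virtual button.
        self._bindings[button] = handler

    def press(self, button, *args):
        handler = self._bindings.get(button)
        if handler is None:
            raise KeyError(f"no function bound to {button!r}")
        return handler(*args)

panel = VirtualControlPanel()
panel.bind("F1", lambda: "start Doppler mode")
# A later upgrade can remap the same virtual button to a new feature,
# with no new plastic or electronics:
panel.bind("F1", lambda: "start panoramic view mode")
```

Unbound buttons simply need not be drawn, avoiding the inactive-button clutter of a fixed physical keyboard.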
While the present invention has been described with reference to certain exemplary embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the invention. For example, the disclosed system and method can be used to control, via a virtual interface, any substantially real time medical imaging system. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from its scope. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.
This application claims the benefit of the following U.S. Provisional Patent Applications: (i) Ser. No. 60/585,214, entitled “SYSTEM AND METHOD FOR SCANNING AND IMAGING MANAGEMENT WITHIN A 3D SPACE (“SonoDEX”)”, filed on Jul. 1, 2004; (ii) Ser. No. 60/585,462, entitled “SYSTEM AND METHOD FOR A VIRTUAL INTERFACE FOR ULTRASOUND SCANNERS (“Virtual Interface”)”, filed on Jul. 1, 2004; and (iii) Ser. No. 60/660,858, entitled “SONODEX: 3D SPACE MANAGEMENT AND VISUALIZATION OF ULTRASOUND DATA”, filed on Mar. 11, 2005. The following related United States Patent applications, under common assignment herewith, are also fully incorporated herein by this reference: Ser. No. 10/469,294 (hereinafter “A Display Apparatus”), filed on Aug. 29, 2003; Ser. Nos. 10/725,773 (hereinafter “Zoom Slider”), 10/727,344 (hereinafter “Zoom Context”), and 10/725,772 (hereinafter “3D Matching”), each filed on Dec. 1, 2003; Ser. No. 10/744,869 (hereinafter “UltraSonar”), filed on Dec. 22, 2003, and Ser. No. 60/660,563 entitled “A METHOD FOR CREATING 4D IMAGES USING MULTIPLE 2D IMAGES ACQUIRED IN REAL-TIME (“4D Ultrasound”), filed on Mar. 9, 2005.
Number | Date | Country
---|---|---
60585214 | Jul 2004 | US
60585462 | Jul 2004 | US
60660858 | Mar 2005 | US
60660563 | Mar 2005 | US