The disclosed technology relates generally to touch-based user input and in particular to systems and methods for receiving touch-based user input on ultrasound imaging devices.
In ultrasound imaging devices, images of a subject are created by transmitting one or more acoustic pulses into the body from a transducer. Reflected echo signals created in response to the pulses are detected by the same or a different transducer. The echo signals cause the transducer elements to produce electronic signals that are analyzed by the ultrasound system in order to create a map of some characteristic of the echo signals, such as their amplitude, power, phase, or frequency shift. The map can then be displayed to a user as an image.
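By way of illustration only, and not as part of the disclosed embodiments, the following sketch (assuming the NumPy and SciPy libraries) shows one conventional way such an amplitude map can be computed from raw echo signals: the echo amplitude is taken as the signal envelope and log-compressed to a fixed dynamic range for display.

```python
import numpy as np
from scipy.signal import hilbert

def echo_amplitude_map(rf_lines, dynamic_range_db=60.0):
    """Convert raw RF echo lines (scanlines x samples) into a log-compressed
    amplitude map normalized to [0, 1] for display."""
    envelope = np.abs(hilbert(rf_lines, axis=-1))        # echo amplitude per sample
    envelope /= envelope.max() + 1e-12                    # normalize before log compression
    db = 20.0 * np.log10(envelope + 1e-12)                # convert to decibels
    return np.clip((db + dynamic_range_db) / dynamic_range_db, 0.0, 1.0)

# Example: 128 simulated scanlines of 2048 samples each.
rf = np.random.randn(128, 2048)
image = echo_amplitude_map(rf)   # values in [0, 1], ready to be rendered as an image
```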
Many ultrasound imaging devices include a screen for displaying ultrasound images and a separate input device (e.g., a hardware control panel and/or keyboard) for inputting commands and adjusting the display of the images on the screen. Use of a control panel to adjust ultrasound images can be awkward and cumbersome, as an operator may have to manipulate several variables simultaneously to adjust the image to his or her liking. Furthermore, inputting commands using a control panel may require that the operator break visual contact with the image display to focus on the control panel. In addition, a control panel on an ultrasound imaging device may include several commands and/or functions, requiring an operator to undergo extensive training before becoming proficient in using the device. A need exists for an intuitive ultrasound image display system that reduces the need for an operator to break visual contact with the display while decreasing time spent adjusting images on the display.
The present technology is generally directed to ultrasound imaging devices configured to receive touch-based input. It will be appreciated that several of the details set forth below are provided to describe the following embodiments in a manner sufficient to enable a person skilled in the relevant art to make and use the disclosed embodiments. Several of the details described below, however, may not be necessary to practice certain embodiments of the technology. Additionally, the technology can include other embodiments that are within the scope of the claims but are not described in detail with reference to
The processing unit 110 can be configured to receive ultrasound data from a probe 112 having an ultrasound transducer array 114. The array 114 can include, for example, a plurality of ultrasound transducers (e.g., piezoelectric transducers) configured to transmit ultrasound energy into a subject and receive ultrasound energy from the subject. The received ultrasound energy may then be transmitted as one or more ultrasound data signals via a link 116 to the ultrasound processing unit 110. The processing unit 110 may be further configured to process the ultrasound signals and form an ultrasound image, which can be included in the first and second display outputs 106 and 109 shown on the displays 104 and 108, respectively.
In the example shown, either of the displays 104 and 108 may be configured as a touchscreen, and the processing unit 110 can be configured to adjust the display outputs 106 and 109, respectively, based on touch-based input received from an operator. The displays 104 and 108 can include any suitable touch-sensitive display system such as, for example, resistive touchscreens, surface acoustic wave touchscreens, capacitive touchscreens, surface capacitance touchscreens, projected capacitance touchscreens, mutual capacitance touchscreens, self-capacitance touchscreens, infrared touchscreens, optical imaging touchscreens, dispersive signal touchscreens, acoustic pulse recognition touchscreens, etc. In addition, the displays 104 and 108 can be configured to receive input from a user via one or more fingers (e.g., a fingertip, a fingernail, etc.), a stylus, and/or any other suitable pointing implement.
In operation, for example, the operator may hold the probe 112 with a first hand while adjusting the ultrasound image presented in the display output 106 with a second hand, using, for example, one or more touch-based inputs or gestures. These inputs may include, for example, direct manipulation (e.g., dragging one or more fingers on the display 104 to move an element on the display output 106), single and double tapping the display 104 with one or more fingers, flicking the display 104 with one or more fingers, pressing and holding one or more fingers on the display 104, pinching and expanding two or more fingers on the display 104, rotating two or more fingers on the display 104, etc. As explained in further detail below, the processing unit 110 can be configured to receive the inputs from the display 104 and update the display output 106 to correspond to the operator input.
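Purely as an illustrative sketch, using hypothetical names rather than the disclosed implementation, the following shows how recognized gestures might be dispatched to updates of the display output so that the output corresponds to the operator input.

```python
def drag(state, dx, dy):
    state["pan"] = (state["pan"][0] + dx, state["pan"][1] + dy)   # move the image element

def pinch_expand(state, scale):
    state["zoom"] *= scale                                        # zoom in or out

def double_tap(state):
    state["zoom"] = 1.0                                           # e.g., reset the zoom level

GESTURE_ACTIONS = {"drag": drag, "pinch_expand": pinch_expand, "double_tap": double_tap}

def update_display_output(state, gesture, *args):
    """Apply the action associated with a recognized gesture to the display state."""
    handler = GESTURE_ACTIONS.get(gesture)
    if handler is not None:
        handler(state, *args)
    return state

state = {"pan": (0, 0), "zoom": 1.0}
update_display_output(state, "pinch_expand", 1.25)   # expand two fingers: zoom in by 25%
update_display_output(state, "drag", 10, -5)         # drag one finger: pan the image
```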
As noted above, the display output 106 may include a user interface (UI) to control measurements and/or output of the device 100. In some embodiments, for example, the display output 109 may be similar or identical to the display output 106. In other embodiments, however, the display output 109 may be tailored for persons within close proximity to the device 100 (e.g., a patient and/or a physician). For example, the display output 109 may include larger sized renderings of ultrasound images formed by the processing unit 110 compared to those displayed in the display output 106. In other embodiments, either of the display outputs 106 and 109 can be configured for direct manipulation. For example, the display outputs 106 and 109 can be configured such that there is generally a one-to-one size relationship between a region in the subject being imaged and the image presented to the operator. This can offer the advantage of allowing the operator an intuitive experience when interacting with the image.
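As a non-limiting illustration with hypothetical parameter values, the following sketch shows one way a one-to-one size relationship can be established: the display's pixel density determines how many pixels must represent one millimeter of the imaged region.

```python
def pixels_per_mm(screen_width_px, screen_width_mm):
    """Physical pixel density of the display."""
    return screen_width_px / screen_width_mm

def one_to_one_image_width_px(imaged_width_mm, density_px_per_mm):
    """Width in pixels so that 1 mm of tissue spans 1 mm on the screen."""
    return round(imaged_width_mm * density_px_per_mm)

density = pixels_per_mm(screen_width_px=1920, screen_width_mm=480)                # 4 px per mm
print(one_to_one_image_width_px(imaged_width_mm=60, density_px_per_mm=density))   # 240 px
```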
In the illustrated embodiment of
In some other embodiments, the device 100 may comprise the display 104 and the processing unit 110 as a single integrated component. For example, the ultrasound imaging device 100 may comprise a handheld portable ultrasound system having the display 104, the processing unit 110, and the probe 112, without the support structure 120.
The technology disclosed herein allows an operator to collect ultrasound images of a subject while manipulating the images on a first display without, for example, looking away from the second display while operating the imaging device. The disclosed technology allows the operator to manipulate the image using an interface having intuitive touch-based inputs, reducing the time spent learning a set of commands associated with a hardware control panel. Furthermore, in some embodiments of the disclosed technology, the user interface is provided on a touchscreen display with a flat, cleanable surface, allowing the operator to disinfect the input area more effectively than many conventional input devices allow.
Aspects of the technology may be stored or distributed on computer-readable media, including magnetically or optically readable computer disks, as microcode on semiconductor memory, nanotechnology memory, organic or optical memory, or other portable data storage media. Indeed, computer-implemented instructions, data structures, screen displays, and other data under aspects of the technology may be distributed over the Internet or over other networks (including wireless networks), on a propagated signal on a propagation medium (e.g., an electromagnetic wave(s), a sound wave, etc.) over a period of time, or may be provided on any analog or digital network (packet switched, circuit switched, or other scheme).
Referring to
The display 210 can be configured to display, for example, a user interface to receive commands from an operator and/or present measured ultrasound images. The display 210 may include any suitable visual and/or audio display system such as, for example, a liquid crystal display (LCD) panel, a plasma-based display, a video projection display, etc. While only one display 210 is shown in
In some embodiments, the input 220 may be implemented as a touch-sensitive surface on the display 210. In other embodiments, the input 220 may include additional inputs such as, for example, inputs from a control panel, a keyboard, a trackball, a system accelerometer and/or pressure sensors in the touch screen, audio inputs (e.g., voice input), visual inputs, etc. In further embodiments, the input 220 may be configured to receive non-tactile gestures performed by an operator without contacting a surface. In these embodiments, for example, the system 200 may include one or more sensors (e.g., one or more cameras, one or more infrared transmitters and/or receivers, one or more laser emitters and/or receivers, etc.) configured to detect, for example, one or more operator hand movements. The system 200 can be configured to analyze the operator hand movements and perform a corresponding action associated with the hand movements.
The system 200 can receive input from an operator at the input 220 (e.g., one or more touchscreen displays), which can be converted to one or more input signals and transmitted to the input recognition engine 230 and/or the processor 250. The input signals may include, for example, X-Y coordinate information of the tactile contact with the input 220, the time duration of each input, the amount of pressure applied during each input, or a combination thereof. The input recognition engine 230 can, based on the input signals, identify the input features (e.g., taps, swipes, dragging, etc.) and relay information regarding the identified input features to the one or more processors 250.
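For illustration only, and with hypothetical thresholds, the following sketch shows the kind of input signal described above, combining X-Y coordinates, duration, and pressure, and a simple way an input feature such as a tap, swipe, or drag might be identified from it.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Contact:
    x0: float          # where the contact started
    y0: float
    x1: float          # where the contact ended
    y1: float
    duration_s: float  # how long the finger stayed down
    pressure: float = 1.0

def identify_feature(contact, tap_time=0.2, move_tol=10.0, swipe_time=0.3):
    """Return 'tap', 'swipe', or 'drag' from the contact's duration and displacement (pixels)."""
    distance = hypot(contact.x1 - contact.x0, contact.y1 - contact.y0)
    if distance < move_tol and contact.duration_s < tap_time:
        return "tap"
    if distance >= move_tol and contact.duration_s < swipe_time:
        return "swipe"
    return "drag"

print(identify_feature(Contact(100, 100, 102, 101, 0.12)))   # 'tap'
print(identify_feature(Contact(100, 100, 300, 110, 0.15)))   # 'swipe'
```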
The processors 250 can perform one or more corresponding actions (e.g., adjusting an image output to the display 210) based on the identified input features from the input recognition engine 230. The input recognition engine 230, for example, can be configured to detect the presence of two of the operator's fingers at the input 220 in an area corresponding to the output of an ultrasound image on the display 210. For example, the operator may place his or her two fingers on an image and subsequently move them apart in a “pinch and expand” motion, which may be associated with zooming in on or expanding the view of an area of interest in the image display. The input recognition engine 230 can identify the pinch and expand input and the one or more processors 250 can correspondingly update the output to the display 210 (e.g., increase the zoom level of the currently displayed image at the region where the finger movement was detected).
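The following sketch (a hypothetical helper, not the disclosed implementation) illustrates how a pinch-and-expand input can be converted to a zoom update: the zoom factor follows the ratio of the finger separations, and the zoom is centered between the two contact points.

```python
from math import hypot

def pinch_zoom(p1_old, p2_old, p1_new, p2_new, current_zoom):
    """Return (new_zoom, zoom_center) for a two-finger pinch or expand gesture."""
    d_old = hypot(p2_old[0] - p1_old[0], p2_old[1] - p1_old[1])
    d_new = hypot(p2_new[0] - p1_new[0], p2_new[1] - p1_new[1])
    scale = d_new / d_old if d_old > 0 else 1.0           # expand -> >1, pinch -> <1
    center = ((p1_new[0] + p2_new[0]) / 2, (p1_new[1] + p2_new[1]) / 2)
    return current_zoom * scale, center

# Fingers move from 100 px apart to 140 px apart: the zoom level increases by 40%.
zoom, center = pinch_zoom((100, 200), (200, 200), (80, 200), (220, 200), current_zoom=1.0)
print(zoom, center)   # 1.4 (150.0, 200.0)
```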
The system 200 may control components and/or the flow or processing of information or data between components using one or more processors 250 in communication with the memory 260 (e.g., ROM or RAM, and the instructions or data contained therein) and with the other components via a bus 260. The memory 260 may, for example, contain data structures or other files or applications that provide information related to the processing and formation of ultrasound images. The memory 260 may also, for example, contain one or more instructions for providing an operating system and/or a user interface configured to display commands and receive input from the operator.
The measurement system 270 can be configured to transmit ultrasound energy into a subject (e.g., a patient), receive ultrasound energy from the subject, and send the acquired ultrasound data to the processor 250 for image processing. The measurement system 270 can include, for example, an ultrasound probe (e.g., the probe 112 in
Components of the system 200 may receive energy via a power component 280. Additionally, the system 200 may receive or transmit information or data to other modules, remote computing devices, and so on via a communication component 235. The communication component 235 may be any wired or wireless component capable of communicating data to and from the system 200. Examples include a wireless radio frequency transmitter, an infrared transmitter, or a hard-wired cable, such as a USB cable. The system 200 may include an additional component 290 having modules 292 and 294 not explicitly described herein, such as additional microprocessor components, removable memory components (flash memory components, smart cards, hard drives), and/or other components.
The user interface 302 is configured to present output and input components to an operator. A first user control bar 304, a second user control bar 306, a third user control bar 308, and a fourth user control bar 310 present icons to the operator associated with, for example, various operating system commands (e.g., displaying an image, saving an image, etc.) and/or ultrasound measurements. An adjustable scale 312 can be configured to adjust image generation, measurement, and/or image display parameters such as, for example, time, dynamic range, frequency, vertical depth, distance, Doppler velocity, etc.
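As an illustrative sketch with hypothetical ranges, the following shows how a touch position along the adjustable scale 312 might be mapped linearly onto an imaging parameter such as vertical depth.

```python
def scale_to_parameter(touch_pos, scale_start, scale_end, param_min, param_max):
    """Map a touch position along the scale onto the parameter range [param_min, param_max]."""
    span = scale_end - scale_start
    fraction = 0.0 if span == 0 else (touch_pos - scale_start) / span
    fraction = min(max(fraction, 0.0), 1.0)        # clamp to the extent of the scale
    return param_min + fraction * (param_max - param_min)

# A scale drawn from y = 100 px to y = 500 px controlling vertical depth from 2 cm to 16 cm.
print(scale_to_parameter(300, 100, 500, 2.0, 16.0))   # 9.0 cm
```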
Referring to
In the illustrated examples shown in
In some embodiments, for example, the boundary 324 can also be adjusted through the use of other touch-based inputs and/or gestures. For example, the interface 302 may be configured to recognize a double tap input (e.g., multiple touch-based inputs by one or more fingers in the same general location) and correspondingly display an expanded view (e.g., a zoomed view) of the image within the boundary 324. In other embodiments, for example, the boundary 324 may be configured to allow the operator to resize the boundary 324 in only one dimension. For example, the boundary 324 can be configured to allow adjustment in only the horizontal (x) or only the vertical (y) dimension, as opposed to conventional “pinch and expand” gestures, which may simply scale a user interface element at the same rate in both directions (i.e., the scaling depends only on the distance between the two contact points).
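The following sketch (a hypothetical helper, not the disclosed implementation) illustrates a single-axis resize of the kind described above, in which only the permitted dimension of the boundary is scaled by the change in finger separation.

```python
def resize_boundary(width, height, d_old, d_new, axis="x"):
    """Scale the boundary along one permitted axis only.

    d_old and d_new are the separations of the two contact points, measured along
    the permitted axis, before and after the gesture."""
    if d_old <= 0:
        return width, height
    factor = d_new / d_old
    if axis == "x":
        return width * factor, height     # horizontal-only adjustment
    return width, height * factor         # vertical-only adjustment

print(resize_boundary(200, 150, d_old=80, d_new=120, axis="x"))   # (300.0, 150)
```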
In further embodiments, the user interface 302 can be configured to receive a gesture from the operator associated with, for example, a freeze command to freeze the current image (e.g., the image 322) displayed in the display region 320. For example, the user interface 302 may be configured to associate a gesture with a freeze command. In conventional ultrasound display systems, the operator may have to break visual contact with an ultrasound image (e.g., the image 322) to find a freeze button on a control panel. The present system, however, allows the operator to use the gesture anywhere in and/or on the user interface 302 without breaking visual contact with the display. For example, the user interface can be configured to receive a two-finger double-tap from the operator and accordingly freeze the image 322. A two-finger double-tap can offer the advantage of avoiding false positives that may occur with, for example, a single-finger gesture (e.g., an operator accidentally freezing the image when he or she intended to perform a different action, such as pressing a button).
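Purely for illustration, and with hypothetical timing thresholds, the following sketch shows one way a two-finger double-tap could be recognized while ignoring single-finger taps, reducing the chance of an accidental freeze.

```python
class FreezeGestureDetector:
    def __init__(self, max_interval_s=0.4):
        self.max_interval_s = max_interval_s
        self._last_two_finger_tap = None

    def on_tap(self, finger_count, timestamp_s):
        """Feed each tap; returns True when a two-finger double-tap is recognized."""
        if finger_count != 2:
            self._last_two_finger_tap = None     # single-finger taps never trigger a freeze
            return False
        previous = self._last_two_finger_tap
        self._last_two_finger_tap = timestamp_s
        if previous is not None and timestamp_s - previous <= self.max_interval_s:
            self._last_two_finger_tap = None
            return True                          # freeze the currently displayed image
        return False

detector = FreezeGestureDetector()
print(detector.on_tap(2, 10.00))   # False (first two-finger tap)
print(detector.on_tap(2, 10.25))   # True  (second tap within the window: freeze)
```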
Referring to
In the examples shown in
A pinch-and-expand gesture may be associated with resizing the Doppler gate 334 (e.g., increasing or decreasing the distance between the two contact points will increase or decrease, respectively, the Doppler gate size). In some embodiments, for example, the x and y components of the movement may not be considered, and only the pixel distance between the contact points may be taken into account.
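As an illustrative sketch using a hypothetical helper, the new gate size can be computed from the change in pixel distance between the contact points, ignoring the separate x and y components of the movement.

```python
from math import hypot

def gate_size_from_pinch(size_mm, p1_old, p2_old, p1_new, p2_new, min_mm=0.5, max_mm=20.0):
    """Scale the Doppler gate size by the ratio of new to old contact-point distance."""
    d_old = hypot(p2_old[0] - p1_old[0], p2_old[1] - p1_old[1])
    d_new = hypot(p2_new[0] - p1_new[0], p2_new[1] - p1_new[1])
    if d_old <= 0:
        return size_mm
    return min(max(size_mm * d_new / d_old, min_mm), max_mm)

# Expanding the fingers from 30 px apart to 50 px apart enlarges a 4 mm gate to about 6.7 mm.
print(round(gate_size_from_pinch(4.0, (10, 10), (40, 10), (5, 10), (55, 10)), 1))   # 6.7
```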
A multi-touch rotational gesture may, for example, be associated with adjusting the angle correction display 330. For example, the operator may place two fingers (e.g., a finger and a thumb) on the angle correction display 330 or within the display region 320. While holding the two fingers approximately the same distance apart from each other, the operator may rotate the fingers in a circular pattern clockwise or counterclockwise to correspondingly adjust the angle correction display 330 (e.g., to adjust a Doppler angle). The operator can perform the rotational gesture until the Doppler gate 334 is suitably aligned with an area of interest in the image 322. While holding the fingers in the same position, the operator may also move the Doppler gate 334 to another location within the image 322. As those skilled in the art would appreciate, the operator may also use any other combination of fingers to perform the rotational gesture (e.g., an index finger and a middle finger, or a first finger on a first hand and a second finger on a second hand). In some embodiments, the user interface 302 can be configured to receive a circular tactile input with which the operator can trace, for example, a rotational path of the angle correction display 330 with one or more fingers. In further embodiments, the user interface can be configured to receive three or more separate tactile inputs (e.g., three or more fingers) to rotate the angle correction display 330.
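The following sketch (a hypothetical helper) illustrates one way the angle adjustment can be derived from the rotation of the line joining the two contact points.

```python
from math import atan2, degrees

def rotation_delta_deg(p1_old, p2_old, p1_new, p2_new):
    """Signed change, in degrees, of the angle of the line joining the two fingers
    (wrap-around near +/-180 degrees is ignored in this sketch)."""
    old = atan2(p2_old[1] - p1_old[1], p2_old[0] - p1_old[0])
    new = atan2(p2_new[1] - p1_new[1], p2_new[0] - p1_new[0])
    return degrees(new - old)

doppler_angle = 60.0
doppler_angle += rotation_delta_deg((0, 0), (100, 0), (0, 0), (100, 20))
print(round(doppler_angle, 1))   # about 71.3 degrees after a small rotation of the fingers
```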
An accelerated single touch movement (e.g., a flick) within the display region 320 may be interpreted as controlling a steering angle of the Doppler line 332 for linear transducers. For example, the operator may apply the accelerated single touch movement to the Doppler line 332 to adjust its angle and thereby suitably align the Doppler gate 334 with the image 322.
An operator may also use, for example, a single touch and drag along the Doppler line 332 to correspondingly align the Doppler gate 334 along ultrasound ray boundaries (e.g., horizontally for linear transducers and at an angle for phased or curved transducers).
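For illustration only, the following sketch (a hypothetical helper) constrains such a drag to the Doppler line by projecting the drag vector onto the line direction, so the Doppler gate 334 stays on the ray regardless of small finger deviations.

```python
def move_gate_along_line(gate_pos, drag_vec, line_dir):
    """Move the gate by the component of the drag that lies along the line direction."""
    length = (line_dir[0] ** 2 + line_dir[1] ** 2) ** 0.5
    if length == 0:
        return gate_pos
    ux, uy = line_dir[0] / length, line_dir[1] / length
    along = drag_vec[0] * ux + drag_vec[1] * uy        # scalar projection of the drag
    return (gate_pos[0] + along * ux, gate_pos[1] + along * uy)

# A drag of (30, 5) px near a vertical Doppler line moves the gate only vertically.
print(move_gate_along_line((200, 150), (30, 5), (0, 1)))   # (200.0, 155.0)
```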
In the example shown in
The flow diagrams described herein do not show all functions or exchanges of data, but instead provide an understanding of commands and data exchanged under the system. Those skilled in the relevant art will recognize that some functions or exchange of commands and data may be repeated, varied, omitted, or supplemented, and other (less important) aspects not shown may be readily implemented. Further, although process steps, method steps, blocks, algorithms or the like may be described in a particular order, such processes, methods, blocks and algorithms may be configured to work in alternate orders. In other words, any sequence or order described herein does not necessarily indicate a requirement that the steps or blocks be performed in that order. The steps or blocks of processes and methods described herein may be performed in any order practical, and some steps may be performed simultaneously.
Unless the context clearly requires otherwise, throughout the description and the claims, the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense, as opposed to an exclusive or exhaustive sense; that is to say, in the sense of “including, but not limited to.” As used herein, the terms “connected,” “coupled,” or any variant thereof means any connection or coupling, either direct or indirect, between two or more elements; the coupling or connection between the elements can be physical, logical, or a combination thereof. Additionally, the words “herein,” “above,” “below,” and words of similar import, when used in this application, refer to this application as a whole and not to any particular portions of this application. Where the context permits, words in the above Detailed Description using the singular or plural number may also include the plural or singular number respectively. The word “or,” in reference to a list of two or more items, covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list.
The above Detailed Description of examples of the disclosed technology is not intended to be exhaustive or to limit the disclosed technology to the precise form disclosed above. While specific examples for the disclosed technology are described above for illustrative purposes, various equivalent modifications are possible within the scope of the disclosed technology, as those skilled in the relevant art will recognize. For example, while processes or blocks are presented in a given order, alternative implementations may perform routines having steps, or employ systems having blocks, in a different order, and some processes or blocks may be deleted, moved, added, subdivided, combined, and/or modified to provide alternatives or subcombinations. Each of these processes or blocks may be implemented in a variety of different ways. Also, while processes or blocks are at times shown as being performed in series, these processes or blocks may instead be performed or implemented in parallel, or may be performed at different times. Further, any specific numbers noted herein are only examples; alternative implementations may employ differing values or ranges.
The teachings of the disclosed technology provided herein can be applied to other systems, not necessarily the system described above. The elements and acts of the various examples described above can be combined to provide further implementations of the disclosed technology. Some alternative implementations of the disclosed technology may include not only additional elements to those implementations noted above, but also may include fewer elements.
These and other changes can be made to the disclosed technology in light of the above Detailed Description. While the above description describes certain examples of the disclosed technology, and describes the best mode contemplated, no matter how detailed the above appears in text, the disclosed technology can be practiced in many ways. Details of the system may vary considerably in their specific implementation, while still being encompassed by the technology disclosed herein. As noted above, particular terminology used when describing certain features or aspects of the disclosed technology should not be taken to imply that the terminology is being redefined herein to be restricted to any specific characteristics, features, or aspects of the disclosed technology with which that terminology is associated. In general, the terms used in the following claims should not be construed to limit the disclosed technology to the specific examples disclosed in the specification, unless the above Detailed Description section explicitly defines such terms.
The present application claims priority under 35 U.S.C. 119(e) to U.S. Provisional Application Ser. No. 61/711,185, filed Oct. 8, 2012, the disclosure of which is incorporated herein by reference in its entirety.