Medical ultrasound imaging has become an industry standard for many medical imaging applications. In recent years, there has been an increasing need for medical ultrasound imaging equipment that is portable, allowing medical personnel to easily transport the equipment to and from hospital and/or field locations, and user-friendly, accommodating medical personnel who may possess a range of skill levels.
Conventional medical ultrasound imaging equipment typically includes at least one ultrasound probe/transducer, a keyboard and/or a knob, a computer, and a display. In a typical mode of operation, the ultrasound probe/transducer generates ultrasound waves that can penetrate tissue to different depths based on frequency level, and receives ultrasound waves reflected back from the tissue. Further, medical personnel can enter system inputs to the computer via the keyboard and/or the knob, and view ultrasound images of tissue structures on the display.
However, conventional medical ultrasound imaging equipment that employs such keyboards and/or knobs can be bulky, and therefore may not be amenable to portable use in hospital and/or field locations. Moreover, because such keyboards and/or knobs typically have uneven surfaces, they can be difficult to keep clean in hospital and/or field environments, where maintenance of a sterile field can be crucial to patient health. Some conventional medical ultrasound imaging equipment has incorporated touch screen technology to provide a partial user input interface. However, conventional medical ultrasound imaging equipment that employs such touch screen technology generally provides only limited touch screen functionality in conjunction with a traditional keyboard and/or knob, and can therefore not only be difficult to keep clean, but also complicated to use.
In accordance with the present application, systems and methods of medical ultrasound imaging are disclosed. The presently disclosed systems and methods of medical ultrasound imaging employ medical ultrasound imaging equipment that includes a handheld housing in a tablet form factor, and a touch screen display disposed on a front panel of the housing. The touch screen display includes a multi-touch touchscreen that can recognize and distinguish one or more single, multiple, and/or simultaneous touches on a surface of the touch screen display, thereby allowing the use of gestures, ranging from simple single point gestures to complex multipoint moving gestures, as user inputs to the medical ultrasound imaging equipment.
In accordance with one aspect, an exemplary medical ultrasound imaging system includes a housing having a front panel and a rear panel rigidly mounted to each other in parallel planes, a touch screen display, a computer having at least one processor and at least one memory, an ultrasound beamforming system, and a battery. The housing of the medical ultrasound imaging equipment is implemented in a tablet form factor. The touch screen display is disposed on the front panel of the housing, and includes a multi-touch LCD touch screen that can recognize and distinguish one or more single, multiple, and/or simultaneous touches or gestures on a surface of the touch screen display. The computer, the ultrasound beamforming system or engine, and the battery are operatively disposed within the housing. The medical ultrasound imaging equipment can use a Firewire connection operatively connected between the computer and the ultrasound engine within the housing, and a probe connector having a probe attach/detach lever to facilitate the connection of at least one ultrasound probe/transducer. In addition, the exemplary medical ultrasound imaging system includes an I/O port connector and a DC power input.
In an exemplary mode of operation, medical personnel can employ simple single point gestures and/or more complex multipoint gestures as user inputs to the multi-touch LCD touch screen for controlling operational modes and/or functions of the exemplary medical ultrasound imaging equipment. Such single point/multipoint gestures can correspond to single and/or multipoint touch events that are mapped to one or more predetermined operations that can be performed by the computer and/or the ultrasound engine. Medical personnel can make such single point/multipoint gestures by various finger, palm, and/or stylus motions on the surface of the touch screen display. The multi-touch LCD touch screen receives the single point/multipoint gestures as user inputs, and provides the user inputs to the computer, which executes, using the processor, program instructions stored in the memory to carry out the predetermined operations associated with the single point/multipoint gestures, at times in conjunction with the ultrasound engine. Such single point/multipoint gestures on the surface of the touch screen display can include, but are not limited to, a tap gesture, a pinch gesture, a flick gesture, a rotate gesture, a double tap gesture, a spread gesture, a drag gesture, a press gesture, a press and drag gesture, and a palm gesture. In contrast to existing ultrasound systems that rely on numerous control features operated by mechanical switches, keyboard elements, or a touchpad/trackball interface, preferred embodiments of the present invention employ a single on/off switch. All other operations have been implemented using touch screen controls. Moreover, the preferred embodiments employ a capacitive touch screen display that is sufficiently sensitive to detect touch gestures actuated by bare fingers of the user as well as gloved fingers of the user. Often medical personnel must wear sterilized plastic gloves during medical procedures.
Consequently, it is highly desirable to provide a portable ultrasound device that can be used by gloved hands; however, this has previously prevented the use of touchscreen display control functions in ultrasound systems for many applications requiring sterile precautions. Preferred embodiments of the present invention provide control of all ultrasound imaging operations by gloved personnel on the touchscreen display using the programmed touch gestures.
In accordance with an exemplary aspect, at least one flick gesture may be employed to control the depth of tissue penetration of ultrasound waves generated by the ultrasound probe/transducer. For example, a single flick gesture in the “up” direction on the touch screen display surface can increase the penetration depth by one (1) centimeter or any other suitable amount, and a single flick gesture in the “down” direction on the touch screen display surface can decrease the penetration depth by one (1) centimeter or any other suitable amount. Further, a drag gesture in the “up” or “down” direction on the touch screen display surface can increase or decrease the penetration depth in multiples of one (1) centimeter or any other suitable amount. Additional operational modes and/or functions controlled by specific single point/multipoint gestures on the touch screen display surface can include, but are not limited to, freeze/store operations, 2-dimensional mode operations, gain control, color control, split screen control, PW imaging control, cine/time-series image clip scrolling control, zoom and pan control, full screen control, Doppler and 2-dimensional beam steering control, and/or body marking control. At least some of the operational modes and/or functions of the exemplary medical ultrasound imaging equipment can be controlled by one or more touch controls implemented on the touch screen display, in which beamforming parameters can be reset by moving touch gestures. Medical personnel can provide one or more specific single point/multipoint gestures as user inputs for specifying at least one selected subset of the touch controls to be implemented, as required and/or desired, on the touch screen display. A larger number of touch screen controls enables greater functionality when operating in full screen mode, in which additional virtual buttons or icons are available for use.
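The flick/drag depth control described above can be sketched as follows. This is an illustrative sketch only: the gesture names, the one-centimeter step, and the depth limits are assumptions for the example, not the actual equipment's interface.

```python
# Hypothetical sketch of flick/drag penetration-depth control.
# The gesture labels, step size, and depth limits are illustrative
# assumptions, not the equipment's actual API.

DEPTH_STEP_CM = 1.0
MIN_DEPTH_CM, MAX_DEPTH_CM = 2.0, 24.0  # assumed usable depth range

def apply_depth_gesture(depth_cm, gesture, drag_steps=1):
    """Return the new penetration depth after a touch gesture.

    A flick changes depth by one step; a drag changes it by a
    multiple of the step, per the description above."""
    if gesture == "flick_up":
        depth_cm += DEPTH_STEP_CM
    elif gesture == "flick_down":
        depth_cm -= DEPTH_STEP_CM
    elif gesture == "drag_up":
        depth_cm += DEPTH_STEP_CM * drag_steps
    elif gesture == "drag_down":
        depth_cm -= DEPTH_STEP_CM * drag_steps
    # Clamp to the probe's usable depth range.
    return max(MIN_DEPTH_CM, min(MAX_DEPTH_CM, depth_cm))
```

A flick up from 10 cm would yield 11 cm, while a flick down at the minimum depth leaves the depth unchanged, reflecting the clamping behavior.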
In accordance with another exemplary aspect, a press gesture can be employed inside a region of the touch screen display, and, in response to the press gesture, a virtual window can be provided on the touch screen display for displaying at least a magnified portion of an ultrasound image displayed on the touch screen display. In accordance with still another exemplary aspect, a press and drag gesture can be employed inside the region of the touch screen display, and, in response to the press and drag gesture, a predetermined feature of the ultrasound image can be traced. Further, a tap gesture can be employed inside the region of the touch screen display, substantially simultaneously with a portion of the press and drag gesture, and, in response to the tap gesture, the tracing of the predetermined feature of the ultrasound image can be completed. These operations can operate in different regions of a single display format, so that a moving gesture within a region of interest within the image, for example, may perform a different function than the same gesture executed within the image but outside the region of interest.
By providing medical ultrasound imaging equipment with a multi-touch touchscreen, medical personnel can control the equipment using simple single point gestures and/or more complex multipoint gestures, without the need of a traditional keyboard or knob. Because the multi-touch touch screen obviates the need for a traditional keyboard or knob, such medical ultrasound imaging equipment is easier to keep clean in hospital and/or field environments, provides an intuitive user friendly interface, while providing fully functional operations. Moreover, by providing such medical ultrasound imaging equipment in a tablet form factor, medical personnel can easily transport the equipment between hospital and/or field locations.
Certain exemplary embodiments provide a multi-chip module for an ultrasound engine of a portable medical ultrasound imaging system, in which a transmit/receive (TR) chip, a pre-amp/time gain compensation (TGC) chip and a beamformer chip are assembled in a vertically stacked configuration. The transmission circuit provides high voltage electrical driving pulses to the transducer elements to generate a transmit beam. Because the transmit chip operates at voltages greater than 80V, a CMOS process with a 1 micron design rule has been used for the transmit chip, and a submicron design rule has been used for the low-voltage (less than 5V) receiving circuits.
Preferred embodiments of the present invention utilize a submicron process to provide integrated circuits with sub-circuits operating at a plurality of voltages, for example, 2.5V, 5V and 60V or higher. These features can be used in conjunction with a bi-plane transducer probe in accordance with certain preferred embodiments of the invention.
Thus, a single IC chip can be utilized that incorporates high voltage transmission, low voltage amplifier/TGC and low voltage beamforming circuits in a single chip. Using a 0.25 micron design rule, this mixed signal circuit can accommodate beamforming of 32 transducer channels in a chip area of less than 0.7 cm×0.7 cm (0.49 cm2). Thus, 128 channels can be processed using four 32 channel chips in a total circuit board area of less than 1.5 cm×1.5 cm (2.25 cm2).
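The area figures quoted above are internally consistent, as a quick arithmetic check shows (the values are those stated in the text, not additional specifications):

```python
# Arithmetic behind the chip-area figures quoted above.
chip_side_cm = 0.7
chip_area = chip_side_cm ** 2        # 0.49 cm^2 per 32-channel chip
chips = 128 // 32                    # four chips cover 128 channels
total_chip_area = chips * chip_area  # 1.96 cm^2 of silicon
board_area = 1.5 ** 2                # 2.25 cm^2 board budget
assert total_chip_area < board_area  # four chips fit within the budget
```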
The term “multi-chip module,” as used herein, refers to an electronic package in which multiple integrated circuits (IC) are packaged with a unifying substrate, facilitating their use as a single component, i.e., as a higher processing capacity IC packaged in a much smaller volume. Each IC can comprise a circuit fabricated in a thinned semiconductor wafer. Exemplary embodiments also provide an ultrasound engine including one or more such multi-chip modules, and a portable medical ultrasound imaging system including an ultrasound engine circuit board with one or more multi-chip modules. Exemplary embodiments also provide methods for fabricating and assembling multi-chip modules as taught herein. Vertically stacking the TR chip, the pre-amp/TGC chip, and the beamformer chip on a circuit board minimizes the packaging size (e.g., the length and width) and the footprint occupied by the chips on the circuit board.
The TR chip, the pre-amp/TGC chip, and the beamformer chip in a multi-chip module may each include multiple channels (for example, 8 channels per chip to 64 channels per chip). In certain embodiments, the high-voltage TR chip, the pre-amp/TGC chip, and the sample-interpolate receive beamformer chip may each include 8, 16, 32, or 64 channels. In a preferred embodiment, each circuit in a two layer beamformer module has 32 beamformer receive channels to provide a 64 channel receiving beamformer. A second 64 channel two layer module can be used to form a 128 channel handheld tablet ultrasound device having an overall thickness of less than 2 cm. A transmit multi-chip beamformer can also be used having the same or similar channel density in each layer.
Exemplary numbers of chips vertically integrated in a multi-chip module may include, but are not limited to, two, three, four, five, six, seven, eight, and the like. In one embodiment of an ultrasound device, a single multi-chip module is provided on a circuit board of an ultrasound engine that performs ultrasound-specific operations. In other embodiments, a plurality of multi-chip modules are provided on a circuit board of an ultrasound engine. The plurality of multi-chip modules may be stacked vertically on top of one another on the circuit board of the ultrasound engine to further minimize the packaging size and the footprint of the circuit board.
Providing one or more multi-chip modules on a circuit board of an ultrasound engine achieves a high channel count while minimizing the overall packaging size and footprint. For example, a 128-channel ultrasound engine circuit board can be assembled, using multi-chip modules, within exemplary planar dimensions of about 10 cm×about 10 cm, which is a significant improvement over the much larger space requirements of conventional ultrasound circuits. A single circuit board of an ultrasound engine including one or more multi-chip modules may have 16 to 128 channels in some embodiments. In certain embodiments, a single circuit board of an ultrasound engine including one or more multi-chip modules may have 16, 32, 64, 128 or 192 channels, and the like.
The foregoing and other objects, aspects, features, and advantages of exemplary embodiments will become more apparent and may be better understood by referring to the following description taken in conjunction with the accompanying drawings, in which:
Systems and methods of medical ultrasound imaging are disclosed. The presently disclosed systems and methods of medical ultrasound imaging employ medical ultrasound imaging equipment that includes housing in a tablet form factor, and a touch screen display disposed on a front panel of the housing. The touch screen display includes a multi-touch touch screen that can recognize and distinguish one or more single, multiple, and/or simultaneous touches on a surface of the touch screen display, thereby allowing the use of gestures, ranging from simple single point gestures to complex multipoint gestures, as user inputs to the medical ultrasound imaging equipment. Further details regarding tablet ultrasound systems and operations are described in U.S. application Ser. No. 10/997,062 filed on Nov. 11, 2004, Ser. No. 10/386,360 filed Mar. 11, 2003 and U.S. Pat. No. 6,969,352, the entire contents of these patents and applications are incorporated herein by reference.
In an exemplary mode of operation, medical personnel (also referred to herein as the “user” or “users”) can employ simple single point gestures and/or more complex multipoint gestures as user inputs to the multi-touch LCD touch screen of the touch screen display 104 for controlling one or more operational modes and/or functions of the medical ultrasound imaging equipment 100. Such a gesture is defined herein as a movement, a stroke, or a position of at least one finger, a stylus, and/or a palm on the surface 105 of the touch screen display 104. For example, such single point/multipoint gestures can include static or dynamic gestures, continuous or segmented gestures, and/or any other suitable gestures. A single point gesture is defined herein as a gesture that can be performed with a single touch contact point on the touch screen display 104 by a single finger, a stylus, or a palm. A multipoint gesture is defined herein as a gesture that can be performed with multiple touch contact points on the touch screen display 104 by multiple fingers, or any suitable combination of at least one finger, a stylus, and a palm. A static gesture is defined herein as a gesture that does not involve the movement of at least one finger, a stylus, or a palm on the surface 105 of the touch screen display 104. A dynamic gesture is defined herein as a gesture that involves the movement of at least one finger, a stylus, or a palm, such as the movement caused by dragging one or more fingers across the surface 105 of the touch screen display 104. A continuous gesture is defined herein as a gesture that can be performed in a single movement or stroke of at least one finger, a stylus, or a palm on the surface 105 of the touch screen display 104. A segmented gesture is defined herein as a gesture that can be performed in multiple movements or strokes of at least one finger, a stylus, or a palm on the surface 105 of the touch screen display 104.
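The gesture taxonomy defined above can be sketched as a simple classifier. The event representation (contact count, whether the contact moved, stroke count) is a hypothetical simplification for illustration, not the equipment's actual touch event structure.

```python
# Illustrative classifier for the gesture taxonomy defined above.
# The inputs are a hypothetical simplification of a touch event.

def classify_gesture(contact_points, moved, strokes):
    """Label a touch event using the definitions in the text:
    single point vs. multipoint, static vs. dynamic,
    continuous vs. segmented."""
    kind = "single point" if contact_points == 1 else "multipoint"
    motion = "dynamic" if moved else "static"
    continuity = "continuous" if strokes == 1 else "segmented"
    return (kind, motion, continuity)
```

For example, a one-finger tap classifies as a single point, static, continuous gesture, while a two-finger, two-stroke trace classifies as multipoint, dynamic, and segmented.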
Such single point/multipoint gestures performed on the surface 105 of the touch screen display 104 can correspond to single or multipoint touch events, which are mapped to one or more predetermined operations that can be performed by the computer and/or the ultrasound engine 108. Users can make such single point/multipoint gestures by various single finger, multi-finger, stylus, and/or palm motions on the surface 105 of the touch screen display 104. The multi-touch LCD touch screen receives the single point/multipoint gestures as user inputs, and provides the user inputs to the processor, which executes program instructions stored in the memory to carry out the predetermined operations associated with the single point/multipoint gestures, at least at some times, in conjunction with the ultrasound engine 108. As shown in
In accordance with the illustrative embodiment of
Additional operational modes and/or functions controlled by specific single point/multipoint gestures on the surface 105 of the touch screen display 104 can include, but are not limited to, freeze/store operations, 2-dimensional mode operations, gain control, color control, split screen control, PW imaging control, cine/time-series image clip scrolling control, zoom and pan control, full screen display, Doppler and 2-dimensional beam steering control, and/or body marking control. At least some of the operational modes and/or functions of the medical ultrasound imaging equipment 100 can be controlled by one or more touch controls implemented on the touch screen display 104. Further, users can provide one or more specific single point/multipoint gestures as user inputs for specifying at least one selected subset of the touch controls to be implemented, as required and/or desired, on the touch screen display 104.
Shown in
Ultrasound images of flow or tissue movement, whether color flow or spectral Doppler, are essentially obtained from measurements of movement. In ultrasound scanners, a series of pulses is transmitted to detect movement of blood. Echoes from stationary targets are the same from pulse to pulse. Echoes from moving scatterers exhibit slight differences in the time for the signal to be returned to the scanner.
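The pulse-to-pulse differences described above can be converted to a velocity estimate using the standard Doppler relation. The numeric values below (speed of sound, transmit frequency, pulse repetition frequency) are illustrative assumptions, not parameters from the disclosed system.

```python
import math

# Sketch of how pulse-to-pulse echo differences yield a velocity
# estimate. Uses the standard Doppler relation v = c * f_d / (2 * f0),
# with the Doppler frequency f_d derived from the phase change between
# successive pulses. All numeric values are illustrative assumptions.

c = 1540.0    # assumed speed of sound in tissue, m/s
f0 = 3.0e6    # assumed transmit center frequency, Hz
prf = 4000.0  # assumed pulse repetition frequency, Hz

def velocity_from_phase_shift(delta_phi_rad):
    """Axial scatterer velocity from the phase change between pulses."""
    f_doppler = prf * delta_phi_rad / (2.0 * math.pi)
    return c * f_doppler / (2.0 * f0)

# Echoes from stationary targets are identical pulse to pulse
# (zero phase change), so the computed velocity is zero.
```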
As can be seen from
In this tablet ultrasound system, an ROI, region of interest, is also used to define the direction in response to a moving gesture of the ultrasound transmit beam. A liver image with a branch of renal flow in color flow mode is shown in
As shown in
As shown in
In accordance with the present application, various measurements and/or tracings of objects (such as organs, tissues, etc.) displayed as ultrasound images on the touch screen display 104 of the medical ultrasound imaging equipment 100 (see
For example, using his or her finger (see, e.g., a finger 508;
Once the cursor 607 is at the desired location on the touch screen display 104, as determined by the location of the finger 610, the user can fix the cursor 607 at that location by employing a tap gesture (see, e.g., the tap gesture 302; see
As described above, the user can perform measurements and/or tracings of objects on a magnified portion of an original ultrasound image of a displayed object within a virtual window on the touch screen display 104.
For example, using his or her fingers (see, e.g., the fingers 710, 712;
For example, using his or her fingers (see, e.g., the fingers 810, 812;
There are many types of ultrasound transducers. They differ by geometry, number of elements, and frequency response. For example, a linear array with center frequency of 10 to 15 MHz is better suited for breast imaging, and a curved array with center frequency of 3 to 5 MHz is better suited for abdominal imaging.
It is often necessary to use different types of transducers for the same or different ultrasound scanning sessions. For ultrasound systems with only one transducer connection, the operator will change the transducer prior to the start of a new scanning session.
In some applications, it is necessary to switch among different types of transducers during one ultrasound scanning session. In this case, it is more convenient to have multiple transducers connected to the same ultrasound system, and the operator can quickly switch among these connected transducers by hitting a button on the operator console, without having to physically detach and re-attach the transducers, which takes a longer time. Preferred embodiments of the invention can include a multiplexor within the tablet housing that can select between a plurality of probe connector ports within the tablet housing, or alternatively, the tablet housing can be connected to an external multiplexor that can be mounted on a cart as described herein.
The ultrasound transducer element 960, on the needle guide 962, may be connected to the ultrasound engine. The connection may be made through a separate cable to a dedicated probe connector on the engine, similar to a shared pencil CW probe connector. In an alternate embodiment, a small short cable may be plugged into the larger image transducer probe handle, or a split cable may connect to the same probe connector at the engine. In another alternate embodiment, the connection may be made via an electrical connector between the image probe handle and the needle guide without a cable in between. In an alternate embodiment, the ultrasound transducer elements on the needle guide may be connected to the ultrasound engine by enclosing the needle guide and transducer elements in the same mechanical enclosure as the imaging probe handle.
The system 986 includes a needle guide 962 that may be mounted to a needle guide mounting bracket 966, which may be coupled to an ultrasound imaging probe assembly for imaging the patient's body 982, or alternative suitable form factors. The ultrasound reflector disc 964 may be mounted at the exposed end of the needle 956. In this embodiment a linear ultrasound acoustic array 978 is mounted parallel to the direction of movement of the needle 956. The linear ultrasound acoustic array 978 includes an ultrasound transducer array 980 positioned parallel to the needle 956. In this embodiment an ultrasound imaging probe assembly 982 is positioned for imaging the patient's body. The ultrasound imaging probe assembly for imaging the patient's body 982 is configured with an ultrasound transducer array 984.
In this embodiment, the position of the ultrasound reflector disc 964 can be detected by using the ultrasound transducer array 980 coupled to the ultrasound imaging probe assembly 978. The position of the reflector disc 964 is located by transmitting an ultrasonic wave 972 from the transducer element 980 on the ultrasound imaging probe assembly 978. The ultrasound wave 972 travels through the air towards the reflector disc 964 and is reflected by it. The reflected ultrasound wave 974 reaches the transducer element 980 on the ultrasound imaging probe assembly 978. The distance 976 between the reflector disc 964 and the transducer element 980 is calculated from the time elapsed and the speed of sound in air. In an alternate embodiment, an alternate algorithm may be used to sequentially scan the plurality of elements in the transducer array and analyze the reflections produced per transducer array element. In an alternate embodiment, a plurality of scans may occur prior to forming an ultrasound image.
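The time-of-flight calculation described above can be sketched as follows. Because the wave travels from the transducer to the reflector disc and back, the one-way distance is half the total path. The speed of sound in air used here is an illustrative room-temperature value.

```python
# Sketch of the round-trip distance calculation described above.
# The speed of sound in air is an assumed room-temperature value.

SPEED_OF_SOUND_AIR = 343.0  # m/s, approximate value at ~20 C

def reflector_distance(elapsed_s):
    """One-way distance (m) from the transducer element to the
    reflector disc, given the round-trip elapsed time."""
    return SPEED_OF_SOUND_AIR * elapsed_s / 2.0
```

For an elapsed round-trip time of 200 microseconds, this yields a one-way distance of about 3.4 cm.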
At times, identification of endocardial borders may be difficult, and when such difficulties are encountered tissue Doppler imaging of the same view may be employed (per step 934). A reference template for identifying the septal and lateral free wall is provided (per step 936). Next, standard tissue Doppler imaging (TDI) with pre-set velocity scales of, say, ±30 cm/sec may be used (per step 938).
Then, a reference of the desired triplex image may be provided (per step 940). Either B-mode or TDI may be used to guide the range gate (per step 942). B-mode can be used for guiding the range gate (per step 944) or TDI for guiding the range gate (per step 946). Using TDI or B-mode for guiding the range gate also allows the use of a direction correction angle, allowing the spectral Doppler to display the radial mean velocity of the septal wall. A first pulsed-wave spectral Doppler is then used to measure the septal wall mean velocity using duplex or triplex mode (per step 948). The software used to process the data and calculate dyssynchrony can utilize a location (e.g., a center point) to automatically set an angle between gated locations on a heart wall to assist in simplifying the setting of parameters.
A second range-gate position is also guided using a duplex image or a TDI (per step 950), and a directional correction angle may be used if desired. After step 950, the mean velocity of the septal wall and lateral free wall are being tracked by the system. Time integration of the Spectral Doppler mean velocities 952 at regions of interest (e.g., the septum wall and the left ventricular free wall) then provides the displacement of the septal and left free wall, respectively.
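The time-integration step described above, which converts the spectral Doppler mean velocity at each region of interest into wall displacement, can be sketched numerically. The trapezoidal rule and the sample values are illustrative assumptions; the actual system's integration method is not specified in the text.

```python
# Sketch of time-integrating a sampled spectral Doppler mean velocity
# to obtain wall displacement, as described above. The trapezoidal
# rule is an assumed integration method for illustration.

def displacement(mean_velocities_cm_s, dt_s):
    """Cumulative wall displacement (cm) from sampled mean velocity.

    Applies the trapezoidal rule between successive samples spaced
    dt_s seconds apart."""
    disp = [0.0]
    for i in range(1, len(mean_velocities_cm_s)):
        step = 0.5 * (mean_velocities_cm_s[i]
                      + mean_velocities_cm_s[i - 1]) * dt_s
        disp.append(disp[-1] + step)
    return disp
```

Applying this to the septal wall and the left ventricular free wall velocities separately yields the two displacement curves whose timing difference characterizes dyssynchrony.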
The above method steps may be utilized in conjunction with a high pass filtering means, analog or digital, known in the relevant arts for removing any baseline disturbance present in collected signals. In addition, the disclosed method employs multiple simultaneous PW spectral Doppler lines for tracking movement of the interventricular septum and the left ventricular free wall. In addition, a multiple gate structure may be employed along each spectral line, thus allowing quantitative measurement of regional wall motion. Averaging over multiple gates may allow measurement of global wall movement.
The ultrasound probe 1040 can include sub-arrays/apertures 1052 consisting of neighboring elements with an aperture smaller than that of the whole array. Returned echoes are received by the 1D transducer array 1062 and transmitted to the controller 1044. The controller initiates formation of a coarse beam by transmitting the signals to memory 1058, 1046. The memory 1058, 1046 transmits a signal to Transmit Driver 1 1050 and Transmit Driver m 1054. Transmit Driver 1 1050 and Transmit Driver m 1054 then send the signal to mux 1 1048 and mux m 1056, respectively. The signal is transmitted to sub-array beamformer 1 1052 and sub-array beamformer n 1060.
The outputs of each coarse beamforming operation can undergo further processing through a second stage of beamforming in the interface unit 1020 to convert the beamforming output to a digital representation. The coarse beamforming operations can be coherently summed to form a fine beam output for the array. The signals can be transmitted from the ultrasound probe 1040 sub-array beamformer 1 1052 and sub-array beamformer n 1060 to the A/D converters 1030 and 1028 within the interface unit 1020. Within the interface unit 1020 there are A/D converters 1028, 1030 for converting the first stage beamforming output to a digital representation. The digital conversion can be received from the A/D converters 1030, 1028 by a custom ASIC such as an FPGA 1026 to complete the second stage beamforming. The FPGA digital beamforming 1026 can transmit information to the system controller 1024. The system controller can transmit information to a memory 1032, which may send a signal back to the FPGA digital beamforming 1026. Alternatively, the system controller 1024 may transmit information to the custom USB3 Chipset 1022. The USB3 Chipset 1022 may then transmit information to a DC-DC converter 1034. In turn, the DC-DC converter 1034 may transmit power from the interface unit 1020 to the ultrasound probe 1040. Within the ultrasound probe 1040, a power supply 1042 may receive the power signal and interface with Transmit Driver 1 1050 to provide the power to the front end integrated probe.
The interface unit 1020 custom or USB3 Chipset 1022 may be used to provide a communication link between the interface unit 1020 and the host computer 1010. The custom or USB3 Chipset 1022 transmits a signal to the host computer's 1010 custom or USB3 Chipset 1012. The custom or USB3 Chipset 1012 then interfaces with the microprocessor 1014. The microprocessor 1014 then may display information or send information to a device 1075.
In an alternate embodiment, a narrow band beamformer can be used. For example, an individual analog phase shifter is applied to each of the received echoes. The phase shifted outputs within each sub-array are then summed to form a coarse beam. The A/D converters can be used to digitize each of the coarse beams; a digital beamformer is then used to form the fine beam.
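The phase-shift-and-sum step of the narrow band sub-array beamformer described above can be sketched with complex phasors. The element count, echo phases, and complex-number model of the analog signals are illustrative assumptions for the example.

```python
import cmath

# Sketch of the narrow-band sub-array beamformer described above:
# each element's echo (modeled here as a complex phasor) receives an
# individual phase shift, and the shifted outputs within a sub-array
# are summed into one coarse beam. Element count and phases are
# illustrative assumptions.

def coarse_beam(echoes, phase_shifts_rad):
    """Phase-shift each element's echo, then sum within the sub-array."""
    return sum(e * cmath.exp(1j * p)
               for e, p in zip(echoes, phase_shifts_rad))

# When the phase shifts are chosen to cancel each echo's phase, the
# echoes add coherently and the summed magnitude equals the number of
# elements (for unit-amplitude echoes).
```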
In another embodiment, beamforming for a 64 element linear array may use eight adjacent elements to form each coarse beam output. Such an arrangement may utilize eight output analog cables connecting the outputs of the integrated probe to the interface units. The coarse beams may be sent through the cable to the corresponding A/D converters located in the interface unit. The digital delay is used to form a fine beam output. Eight A/D converters may be required to form the digital representation.
In another embodiment, forming a 128 element array may use sixteen sub-array beamforming circuits. Each circuit may form a coarse beam from an adjacent eight element array provided in the first stage output to the interface unit. Such an arrangement may utilize sixteen output analog cables connecting the outputs of the integrated probe to the interface units to digitize the output. A PC microprocessor or a DSP may be used to perform the down conversion, base-banding, scan conversion and post image processing functions. The microprocessor or DSP can also be used to perform all the Doppler processing functions.
The ultrasound probe 1040 includes subarrays/apertures 1052 consisting of neighboring elements with an aperture smaller than that of the whole array. Returned echoes are received by the 1D transducer array 1062 and transmitted to the controller 1044. The controller initiates formation of a coarse beam by transmitting the signals to memory 1058, 1046. The memory 1058, 1046 transmits a signal to Transmit Driver 1 1050 and Transmit Driver m 1054. Transmit Driver 1 1050 and Transmit Driver m 1054 then send the signal to mux 1 1048 and mux m 1056, respectively. The signal is transmitted to subarray beamformer 1 1052 and subarray beamformer n 1060.
The outputs of each coarse beamforming operation then go through a second-stage beamforming in the interface unit 1020, which converts the beamforming output to a digital representation. The coarse beamforming outputs are coherently summed to form a fine beam output for the array. The signals are transmitted from subarray beamformer 1 1052 and subarray beamformer n 1060 of the ultrasound probe 1040 to the A/D converters 1030, 1028 within the host computer 1082, which convert the first-stage beamforming output to a digital representation. The digital output is received from the A/D converters 1030, 1028 by a custom ASIC such as an FPGA 1026 to complete the second-stage beamforming. The FPGA digital beamforming 1026 transmits information to the system controller 1024. The system controller transmits information to a memory 1032, which may send a signal back to the FPGA digital beamforming 1026. Alternatively, the system controller 1024 may transmit information to the custom USB3 chipset 1022. The USB3 chipset 1022 may then transmit information to a DC-DC converter 1034. In turn, the DC-DC converter 1034 may transmit power from the interface unit 1020 to the ultrasound probe 1040. Within the ultrasound probe 1040, a power supply 1042 may receive the power signal and interface with Transmit Driver 1 1050 to provide power to the front-end integrated probe. The power supply can include a battery to enable wireless operation of the transducer assembly. A wireless transceiver can be integrated into the controller circuit or a separate communications circuit to enable wireless transfer of image data and control signals.
The custom or USB3 chipset 1022 may be used to provide a communication link to the custom or USB3 chipset 1012 of the host computer 1082, which transmits a signal to the microprocessor 1014. The microprocessor 1014 may then display information or send information to a device 1075.
A transducer array 152 is configured to transmit ultrasound waves to and receive reflected ultrasound waves from one or more image targets 1102. The transducer array 152 is coupled to the ultrasound engine 108 using one or more cables 1104.
The ultrasound engine 108 includes a high-voltage transmit/receive (TR) module 1106 for applying drive signals to the transducer array 152 and for receiving return echo signals from the transducer array 152. The ultrasound engine 108 includes a pre-amp/time gain compensation (TGC) module 1108 for amplifying the return echo signals and applying suitable TGC functions to the signals. The ultrasound engine 108 includes a sampled-data beamformer 1110 that applies the delay coefficients used in each channel after the return echo signals have been amplified and processed by the pre-amp/TGC module 1108.
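The TGC function mentioned above compensates for depth-dependent attenuation of the echoes: deeper echoes are weaker, so gain increases with arrival time. A minimal sketch, assuming a simple frequency-proportional attenuation model and illustrative values for sampling rate, attenuation coefficient, and center frequency (none of which come from the text):

```python
import numpy as np

fs = 20e6        # assumed sample rate, Hz
c = 1540.0       # speed of sound in soft tissue, m/s
alpha_db = 0.5   # assumed one-way attenuation, dB/(cm*MHz)
f_mhz = 5.0      # assumed probe center frequency, MHz

n = 1024
t = np.arange(n) / fs                  # time since transmit, s
depth_cm = 100.0 * c * t / 2.0         # echo depth per sample (round trip)

# round-trip attenuation in dB, and the TGC gain curve that undoes it
atten_db = alpha_db * f_mhz * 2.0 * depth_cm
gain = 10.0 ** (atten_db / 20.0)

# a synthetic echo train whose amplitude decays exactly per that model
echo = 10.0 ** (-atten_db / 20.0)

compensated = gain * echo              # flat (unit) amplitude at all depths
```

In hardware the gain curve is applied by a variable-gain amplifier stage; the point of the sketch is only the shape of the curve: exponential in depth, scaled by frequency.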
In some exemplary embodiments, the high-voltage TR module 1106, the pre-amp/TGC module 1108, and the sample-interpolate receive beamformer 1110 may each be a silicon chip having 8 to 64 channels per chip, but exemplary embodiments are not limited to this range. In certain embodiments, the high-voltage TR module 1106, the pre-amp/TGC module 1108, and the sample-interpolate receive beamformer 1110 may each be a silicon chip having 8, 16, 32, 64 channels, and the like. As illustrated in
The ultrasound engine 108 includes a first-in first-out (FIFO) buffer module 1112 which is used for buffering the processed data output by the beamformer 1110. The ultrasound engine 108 also includes a memory 1114 for storing program instructions and data, and a system controller 1116 for controlling the operations of the ultrasound engine modules.
The ultrasound engine 108 interfaces with the computer motherboard 106 over a communications link 112, which can follow a standard high-speed communications protocol, such as the FireWire (IEEE 1394 Standard Serial Interface) or fast (e.g., 200-400 Mbits/second or faster) Universal Serial Bus (USB 2.0, USB 3.0) protocol. The standard communication link to the computer motherboard operates at 400 Mbits/second or higher, preferably at 800 Mbits/second or higher. Alternatively, the link 112 can be a wireless connection, such as an infrared (IR) link. The ultrasound engine 108 includes a communications chipset 1118 (e.g., a FireWire chipset) to establish and maintain the communications link 112.
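As a rough plausibility check of the quoted link speeds, consider the data rate of a single beamformed RF stream. The word size, sampling rate, and receive duty cycle below are assumed for illustration, not figures from the text:

```python
# Back-of-envelope data-rate estimate for one beamformed RF stream.
bits_per_sample = 16      # assumed A/D word size after beamforming
sample_rate_hz = 40e6     # assumed RF sampling rate
receive_duty = 0.5        # assumed fraction of time spent receiving

rf_mbit_s = bits_per_sample * sample_rate_hz * receive_duty / 1e6
# 320 Mbit/s for one stream: near USB 2.0's 480 Mbit/s peak,
# and comfortable for an 800 Mbit/s IEEE 1394b link.
```

Under these assumptions the stream fits a 400-800 Mbit/s link with margin, which is consistent with the preference for 800 Mbit/s or higher stated above.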
Similarly, the computer motherboard 106 also includes a communications chipset 1120 (e.g., a FireWire chipset) to establish and maintain the communications link 112. The computer motherboard 106 includes a core computer-readable memory 1122 for storing data and/or computer-executable instructions for performing ultrasound imaging operations. The memory 1122 forms the main memory for the computer and, in an exemplary embodiment, may comprise about 4 GB of DDR3 memory. The computer motherboard 106 also includes a microprocessor 1124 for executing computer-executable instructions stored on the core computer-readable memory 1122 for performing ultrasound imaging processing operations. An exemplary microprocessor 1124 may be an off-the-shelf commercial computer processor, such as an Intel Core-i5 processor. Another exemplary microprocessor 1124 may be a digital signal processor (DSP) based processor, such as one or more DaVinci™ processors from Texas Instruments. The computer motherboard 106 also includes a display controller 1126 for controlling a display device that may be used to display ultrasound data, scans and maps.
Exemplary operations performed by the microprocessor 1124 include, but are not limited to, down conversion (for generating I, Q samples from received ultrasound data), scan conversion (for converting ultrasound data into a display format of a display device), Doppler processing (for determining and/or imaging movement and/or flow information from the ultrasound data), Color Flow processing (for generating, using autocorrelation in one embodiment, a color-coded map of Doppler shifts superimposed on a B-mode ultrasound image), Power Doppler processing (for determining power Doppler data and/or generating a power Doppler map), Spectral Doppler processing (for determining spectral Doppler data and/or generating a spectral Doppler map), and post signal processing. These operations are described in further detail in WO 03/079038 A2, filed Mar. 11, 2003, titled “Ultrasound Probe with Integrated Electronics,” the entire contents of which are expressly incorporated herein by reference.
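The Color Flow autocorrelation mentioned above can be illustrated with the classic lag-one (Kasai) estimator on simulated I/Q data: the phase of the lag-one autocorrelation over the pulse ensemble gives the mean Doppler shift, hence the flow velocity. The carrier frequency, PRF, ensemble length, and flow velocity below are assumed illustrative values:

```python
import numpy as np

c = 1540.0      # speed of sound, m/s
f0 = 5e6        # assumed transmit center frequency, Hz
prf = 4e3       # assumed pulse repetition frequency, Hz
v_true = 0.2    # m/s, simulated flow velocity (below the 0.308 m/s Nyquist limit)

# simulate a noise-free I/Q ensemble from one range gate:
# each successive pulse advances the phase by 4*pi*f0*v/(c*prf)
n_ens = 16
dphi = 4 * np.pi * f0 * v_true / (c * prf)
iq = np.exp(1j * dphi * np.arange(n_ens))

# Kasai estimator: mean lag-one autocorrelation across the ensemble
r1 = np.mean(iq[1:] * np.conj(iq[:-1]))
v_est = np.angle(r1) * c * prf / (4 * np.pi * f0)
```

In a real scanner this estimate is computed per range gate and per scan line, then mapped to a color scale and superimposed on the B-mode image, as described above.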
To achieve a smaller and lighter portable ultrasound device, the overall packaging size and footprint of the circuit board providing the ultrasound engine 108 are reduced. To this end, exemplary embodiments provide a small and light portable ultrasound device that minimizes overall packaging size and footprint while providing a high channel count. In some embodiments, a high channel count circuit board of an exemplary ultrasound engine may include one or more multi-chip modules in which each chip provides multiple channels, for example, 32 channels. The term "multi-chip module," as used herein, refers to an electronic package in which multiple integrated circuits (ICs) are packaged onto a unifying substrate, facilitating their use as a single component, i.e., as a larger IC. A multi-chip module may be used in an exemplary circuit board to integrate two or more active IC components on a High Density Interconnection (HDI) substrate, reducing the overall packaging size. In an exemplary embodiment, a multi-chip module may be assembled by vertically stacking a transmit/receive (TR) silicon chip, an amplifier silicon chip and a beamformer silicon chip of an ultrasound engine. A single circuit board of the ultrasound engine may include one or more of these multi-chip modules to provide a high channel count, while minimizing the overall packaging size and footprint of the circuit board.
As illustrated in
In one embodiment of an ultrasound engine circuit board, a single multi-chip module as illustrated in
In addition to the need for reducing the footprint, there is also a need for decreasing the overall package height of multi-chip modules. Exemplary embodiments may employ wafer thinning to sub-hundred micron thicknesses to reduce the package height of multi-chip modules.
Any suitable technique can be used to assemble a multi-chip module on a substrate. Exemplary assembly techniques include, but are not limited to, laminated MCM (MCM-L) in which the substrate is a multi-layer laminated printed circuit board, deposited MCM (MCM-D) in which the multi-chip modules are deposited on the base substrate using thin film technology, and ceramic substrate MCM (MCM-C) in which several conductive layers are deposited on a ceramic substrate and embedded in glass layers that are co-fired at high temperatures (HTCC) or low temperatures (LTCC).
Exemplary chip layers in a multi-chip module may be coupled to each other using any suitable technique. For example, in the embodiment illustrated in
Important requirements for the die attach (DA) paste or film are excellent adhesion to the passivation materials of adjacent dies and a uniform bond-line thickness (BLT), which is required for large-die applications. In addition, high cohesive strength at high temperatures and low moisture absorption are preferred for reliability.
The DA material illustrated in
In method (A), a first passive silicon layer is bonded to the first die in a stacked manner using a dicing die-attach film (D-DAF). A second die is bonded to the first passive layer in a stacked manner using D-DAF. Wire bonding is used to couple the second die to the metal frame. A second passive silicon layer is bonded to the second die in a stacked manner using D-DAF. A third die is bonded to the second passive layer in a stacked manner using D-DAF. Wire bonding is used to couple the third die to the metal frame. A third passive silicon layer is bonded to the third die in a stacked manner using D-DAF. A fourth die is bonded to the third passive layer in a stacked manner using D-DAF. Wire bonding is used to couple the fourth die to the metal frame.
In method (B), die attach (DA) paste dispensing and curing is repeated for multi-thin die stack application. DA paste is dispensed onto a first die, and a second die is provided on the DA paste and cured to the first die. Wire bonding is used to couple the second die to the metal frame. DA paste is dispensed onto the second die, and a third die is provided on the DA paste and cured to the second die. Wire bonding is used to couple the third die to the metal frame. DA paste is dispensed onto the third die, and a fourth die is provided on the DA paste and cured to the third die. Wire bonding is used to couple the fourth die to the metal frame.
In method (C), die attach films (DAF) are cut and pressed to a bottom die and a top die is then placed and thermal compressed onto the DAF. For example, a DAF is pressed to the first die and a second die is thermal compressed onto the DAF. Wire bonding is used to couple the second die to the metal frame. Similarly, a DAF is pressed to the second die and a third die is thermal compressed onto the DAF. Wire bonding is used to couple the third die to the metal frame. A DAF is pressed to the third die and a fourth die is thermal compressed onto the DAF. Wire bonding is used to couple the fourth die to the metal frame.
In method (D), film-over-wire (FOW) employs a die-attach film with wire-penetrating capability that allows the same or similar-sized wire-bonded dies to be stacked directly on top of one another without passive silicon spacers. A second die is bonded and cured to the first die in a stacked manner. Film-over-wire bonding is used to couple the second die to the metal frame. A third die is bonded and cured to the second die in a stacked manner. Film-over-wire bonding is used to couple the third die to the metal frame. A fourth die is bonded and cured to the third die in a stacked manner. Film-over-wire bonding is used to couple the fourth die to the metal frame.
After the above-described steps are completed, in each of methods (A)-(D), wafer molding and post-mold curing (PMC) are performed. Subsequently, ball mount and singulation are performed.
Further details on the above-described die attachment techniques are provided in TOH C H et al., “Die Attach Adhesives for 3D Same-Sized Dies Stacked Packages,” the 58th Electronic Components and Technology Conference (ECTC2008), pp. 1538-43, Florida, US (27-30 May 2008), the entire contents of which are expressly incorporated herein by reference.
In this exemplary embodiment, each multi-chip module may handle the complete transmit, receive, TGC amplification and beamforming operations for a large number of channels, for example, 32 channels. By vertically integrating the three silicon chips into a single multi-chip module, the space and footprint required for the printed circuit board are further reduced. A plurality of multi-chip modules may be provided on a single ultrasound engine circuit board to further increase the number of channels while minimizing the packaging size and footprint. For example, a 128-channel ultrasound engine circuit board 108 can be fabricated within exemplary planar dimensions of about 10 cm×about 10 cm, which is a significant improvement over the space requirements of conventional ultrasound circuits. A single circuit board of an ultrasound engine including one or more multi-chip modules may have 16 to 128 channels in preferred embodiments. In certain embodiments, a single circuit board of an ultrasound engine including one or more multi-chip modules may have 16, 32, 64, or 128 channels, and the like.
The ultrasound engine 108 includes a probe connector 114 to facilitate the connection of at least one ultrasound probe/transducer. In the ultrasound engine 108, a TR module, an amplifier module and a beamformer module may be vertically stacked to form a multi-chip module as shown in
The ASICs and the multi-chip module configuration enable a complete 128-channel ultrasound system to be implemented on a small single board in a tablet computer format. An exemplary 128-channel ultrasound engine 108, for example, can be accommodated within exemplary planar dimensions of about 10 cm×about 10 cm, which is a significant improvement over the space requirements of conventional ultrasound circuits. An exemplary 128-channel ultrasound engine 108 can also be accommodated within an exemplary area of about 100 cm2.
The ultrasound engine 108 also includes a clock generation complex programmable logic device (CPLD) 1714 for generating timing clocks for performing an ultrasound scan using the transducer array. The ultrasound engine 108 includes an analog-to-digital converter (ADC) 1716 for converting analog ultrasound signals received from the transducer array to digital RF formed beams. The ultrasound engine 108 also includes one or more delay profile and waveform generator field programmable gate arrays (FPGA) 1718 for managing the receive delay profiles and generating the transmit waveforms. The ultrasound engine 108 includes a memory 1720 for storing the delay profiles for ultrasound scanning. An exemplary memory 1720 may be a single DDR3 memory chip. The ultrasound engine 108 includes a scan sequence control field programmable gate array (FPGA) 1722 configured to manage the ultrasound scan sequence, transmit/receive timing, storing and fetching of profiles to/from the memory 1720, and buffering and moving of digital RF data streams to the computer motherboard 106 via a high-speed serial interface 112. The high-speed serial interface 112 may include FireWire or another serial or parallel bus interface between the computer motherboard 106 and the ultrasound engine 108. The ultrasound engine 108 includes a communications chipset 1118 (e.g., a FireWire chipset) to establish and maintain the communications link 112.
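A receive delay profile of the kind stored in the memory 1720 can be sketched geometrically: each element's delay compensates its extra path length to the focal point, quantized to beamformer clock ticks. The element pitch, clock rate, and focal depth below are assumed illustrative values, not parameters from the text:

```python
import numpy as np

c = 1540.0                 # speed of sound, m/s
fs = 40e6                  # assumed beamformer clock, Hz
pitch = 0.3e-3             # assumed element pitch, m
n_elem = 128

# element x-positions, centered on the array
x = (np.arange(n_elem) - (n_elem - 1) / 2) * pitch

focus_depth = 30e-3        # focal point straight ahead at 30 mm (assumed)
path = np.sqrt(x**2 + focus_depth**2)          # element-to-focus distance
delay_s = (path.max() - path) / c              # relative receive delays
delay_clk = np.round(delay_s * fs).astype(int) # quantized to clock ticks

# edge elements are farthest from the focus, so center elements wait longest
```

One such profile is needed per focal zone and steering angle, which is why the profiles are precomputed and fetched from memory by the scan sequence controller rather than computed on the fly.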
A power module 1724 is provided to supply power to the ultrasound engine 108, manage a battery charging environment and perform power management operations. The power module 1724 may generate regulated, low noise power for the ultrasound circuitry and may generate high voltages for the ultrasound transmit pulser in the TR module.
The computer motherboard 106 includes a core computer-readable memory 1122 for storing data and/or computer-executable instructions for performing ultrasound imaging operations. The memory 1122 forms the main memory for the computer and, in an exemplary embodiment, may comprise about 4 GB of DDR3 memory. The memory 1122 may include a solid state hard drive (SSD) for storing an operating system, computer-executable instructions, programs and image data. An exemplary SSD may have a capacity of about 128 GB.
The computer motherboard 106 also includes a microprocessor 1124 for executing computer-executable instructions stored on the core computer-readable memory 1122 for performing ultrasound imaging processing operations. Exemplary operations include, but are not limited to, down conversion, scan conversion, Doppler processing, Color Flow processing, Power Doppler processing, Spectral Doppler processing, and post signal processing. An exemplary microprocessor 1124 may be an off-the-shelf commercial computer processor, such as an Intel Core-i5 processor. Another exemplary microprocessor 1124 may be a digital signal processor (DSP) based processor, such as DaVinci™ processors from Texas Instruments.
The computer motherboard 106 includes an input/output (I/O) and graphics chipset 1704, which includes a co-processor configured to control I/O and graphics peripherals such as USB ports, video display ports and the like. The computer motherboard 106 includes a wireless network adapter 1702 configured to provide a wireless network connection. An exemplary adapter 1702 supports the 802.11g and 802.11n standards. The computer motherboard 106 includes a display controller 1126 configured to interface the computer motherboard 106 to the display 104. The computer motherboard 106 includes a communications chipset 1120 (e.g., a FireWire chipset or interface) configured to provide fast data communication between the computer motherboard 106 and the ultrasound engine 108. An exemplary communications chipset 1120 may be an IEEE 1394b 800 Mbit/sec interface. Other serial or parallel interfaces 1706 may alternatively be provided, such as USB3, Thunderbolt, PCIe, and the like. A power module 1708 is provided to supply power to the computer motherboard 106, manage a battery charging environment and perform power management operations.
An exemplary computer motherboard 106 may be accommodated within exemplary planar dimensions of about 12 cm×about 10 cm. An exemplary computer motherboard 106 can be accommodated within an exemplary area of about 120 cm2.
The housing 102 includes or is coupled to a probe connector 114 to facilitate connection of at least one ultrasound probe/transducer 150. The ultrasound probe 150 includes a transducer housing including one or more transducer arrays 152. The ultrasound probe 150 is couplable to the probe connector 114 using a housing connector 1804 provided along a flexible cable 1806. One of ordinary skill in the art will recognize that the ultrasound probe 150 may be coupled to the housing 102 using any other suitable mechanism, for example, an interface housing that includes circuitry for performing ultrasound-specific operations such as beamforming. Other exemplary embodiments of ultrasound systems are described in further detail in WO 03/079038 A2, filed Mar. 11, 2003, titled "Ultrasound Probe with Integrated Electronics," the entire contents of which are expressly incorporated herein by reference. Preferred embodiments can employ a wireless connection between the hand-held transducer probe 150 and the display housing. Beamformer electronics can be incorporated into the probe housing 150 to provide beamforming of subarrays in a 1D or 2D transducer array as described herein. The display housing can be sized to be held in the palm of the user's hand and can include wireless network connectivity to public access networks such as the internet.
The menu bar 1902 enables a user to select ultrasound data, images and/or videos for display in the image display window 1904. The menu bar 1902 may include, for example, GUI components for selecting one or more files in a patient folder directory and an image folder directory. The image display window 1904 displays ultrasound data, images and/or videos and may, optionally, provide patient information. The tool bar 1908 provides functionalities associated with an image or video display including, but not limited to, a save button for saving the current image and/or video to a file, a save Loop button that saves a maximum allowed number of previous frames as a Cine loop, a print button for printing the current image, a freeze image button for freezing an image, a playback toolbar for controlling aspects of playback of a Cine loop, and the like. Exemplary GUI functionalities that may be provided in the main GUI 1900 are described in further detail in WO 03/079038 A2, filed Mar. 11, 2003, titled “Ultrasound Probe with Integrated Electronics,” the entire contents of which are expressly incorporated herein by reference.
The image control bar 1906 includes touch controls that may be operated by touch and touch gestures applied by a user directly to the surface of the display 104. Exemplary touch controls may include, but are not limited to, a 2D touch control 408, a gain touch control 410, a color touch control 412, a storage touch control 414, a split touch control 416, a PW imaging touch control 418, a beamsteering touch control 420, an annotation touch control 422, a dynamic range operations touch control 424, a Teravision™ touch control 426, a map operations touch control 428, and a needle guide touch control 428. These exemplary touch controls are described in further detail in connection with
The capacitive touchscreen module comprises an insulator, for example glass, coated with a transparent conductor such as indium tin oxide. The manufacturing process may include a bonding process among the glass, an x-sensor film, a y-sensor film and a liquid crystal material. The tablet is configured to allow a user to perform multi-touch gestures such as pinching and stretching while wearing a dry or a wet glove. The surface of the screen registers the electrical conductor making contact with the screen. The contact distorts the screen's electrostatic field, resulting in measurable changes in capacitance. A processor then interprets the change in the electrostatic field. Increasing levels of responsiveness are enabled by reducing the number of layers and by producing touch screens with "in-cell" technology, which eliminates layers by placing the capacitors inside the display. Applying "in-cell" technology reduces the visible distance between the user's finger and the touchscreen target, creating more direct contact with the displayed content and enabling taps and gestures to be more responsive.
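The change-in-capacitance interpretation described above can be sketched as a baseline subtraction followed by a weighted centroid over the touched nodes, which is how a touch controller typically resolves a finger position to sub-node precision. The grid size, baseline level, and touch profile below are invented for illustration:

```python
import numpy as np

# baseline mutual-capacitance image of an (assumed) 8x8 sensor grid
baseline = np.full((8, 8), 100.0)        # arbitrary units

# a finger near node (row 2, col 5) locally reduces mutual capacitance
frame = baseline.copy()
frame[1:4, 4:7] -= np.array([[2.0, 5.0, 2.0],
                             [5.0, 9.0, 5.0],
                             [2.0, 5.0, 2.0]])

delta = baseline - frame                 # positive where touched
mask = delta > 3.0                       # threshold out sensor noise

# weighted centroid of the touch blob gives the reported position
ys, xs = np.nonzero(mask)
w = delta[ys, xs]
pos = (np.sum(ys * w) / w.sum(), np.sum(xs * w) / w.sum())
```

Distinguishing multiple simultaneous touches, as the multi-touch gestures above require, amounts to segmenting the thresholded delta image into connected blobs and computing one centroid per blob.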
The menu bar 3104 enables users to select ultrasound data, images and/or video for display in the image display window 3102. The menu bar may include components for selecting one or more files in a patient folder directory and an image folder directory.
The image control bar 3106 includes touch controls that may be operated by touch and touch gestures applied by the user directly to the surface of the display. Exemplary touch controls may include, but are not limited to, a depth control touch control 3108, a 2-dimensional gain touch control 3110, a full screen touch control 3112, a text touch control 3114, a split screen touch control 3116, an ENV touch control 3118, a CD touch control 3120, a PWD touch control 3122, a freeze touch control 3124, a store touch control 3126, and an optimize touch control 3128.
The menu bar 3204 enables users to select ultrasound data, images and/or video for display in the image display window 3202. The menu bar 3204 may include touch control components for selecting one or more files in a patient folder directory and an image folder directory. Depicted in an expanded format, the menu bar may include exemplary touch controls such as a patient touch control 3208, a pre-sets touch control 3210, a review touch control 3212, a report touch control 3214, and a setup touch control 3216.
The image control bar 3220 includes touch controls that may be operated by touch and touch gestures applied by the user directly to the surface of the display. Exemplary touch controls may include, but are not limited to, a depth control touch control 3222, a 2-dimensional gain touch control 3224, a full screen touch control 3226, a text touch control 3228, a split screen touch control 3230, a needle visualization (ENV) touch control 3232, a CD touch control 3234, a PWD touch control 3236, a freeze touch control 3238, a store touch control 3240, and an optimize touch control 3242.
Within the patient data screen 3300, the image control bar 3318 includes touch controls that may be operated by touch and touch gestures applied by the user directly to the surface of the display. Exemplary touch controls may include, but are not limited to, an accept study touch control 3320, a close study touch control 3322, a print touch control 3324, a print preview touch control 3326, a cancel touch control 3328, a 2-dimensional touch control 3330, a freeze touch control 3332, and a store touch control 3334.
Within the pre-sets screen 3400, the image control bar 3408 includes touch controls that may be operated by touch and touch gestures applied by the user directly to the surface of the display. Exemplary touch controls may include, but are not limited to, a save settings touch control 3410, a delete touch control 3412, a CD touch control 3414, a PWD touch control 3416, a freeze touch control 3418, a store touch control 3420, and an optimize touch control 3422.
Within the review screen 3500, the image control bar 3516 includes touch controls that may be operated by touch and touch gestures applied by the user directly to the surface of the display. Exemplary touch controls may include, but are not limited to, a thumbnail settings touch control 3518, a sync touch control 3520, a selection touch control 3522, a previous image touch control 3524, a next image touch control 3526, a 2-dimensional image touch control 3528, a pause image touch control 3530, and a store image touch control 3532.
An image display window 3506 may allow the user to review images in a plurality of formats. The image display window 3506 may allow a user to view images 3508, 3510, 3512, 3514 in combination or as a subset, or allow any image 3508, 3510, 3512, 3514 to be viewed individually. The image display window 3506 may be configured to display up to four images 3508, 3510, 3512, 3514 simultaneously.
Within the report screen 3600, the image control bar 3608 includes touch controls that may be operated by touch and touch gestures applied by the user directly to the surface of the display. Exemplary touch controls may include, but are not limited to, a save touch control 3610, a save as touch control 3612, a print touch control 3614, a print preview touch control 3616, a close study touch control 3618, a 2-dimensional image touch control 3620, a freeze image touch control 3622, and a store image touch control 3624.
Within the setup expanded screen 3704, the setup control bar 3744 includes touch controls that may be operated by touch and touch gestures applied by the user directly to the surface of the display. Exemplary touch controls may include, but are not limited to, a general touch control 3706, a display touch control 3708, a measurements touch control 3710, an annotation touch control 3712, a print touch control 3714, a store/acquire touch control 3716, a DICOM touch control 3718, an export touch control 3720, and a study information image touch control 3722. The touch controls may contain a display screen that allows the user to enter configuration information. For example, the general touch control 3706 contains a configuration screen 3724 in which the user may enter configuration information. Additionally, the general touch control 3706 contains a section allowing user configuration of the soft key docking position 3726.
Within the review screen 3700, the image control bar 3728 includes touch controls that may be operated by touch and touch gestures applied by the user directly to the surface of the display. Exemplary touch controls may include, but are not limited to, a thumbnail settings touch control 3730, a sync touch control 3732, a selection touch control 3734, a previous image touch control 3736, a next image touch control 3738, a 2-dimensional image touch control 3740, and a pause image touch control 3742.
Within the setup expanded screen 3804, the setup control bar 3844 includes touch controls that may be operated by touch and touch gestures applied by the user directly to the surface of the display. Exemplary touch controls may include, but are not limited to, a plurality of icons such as a general touch control 3806, a display touch control 3808, a measurements touch control 3810, an annotation touch control 3812, a print touch control 3814, a store/acquire touch control 3816, a DICOM touch control 3818, an export touch control 3820, and a study information image touch control 3822. The touch controls can contain a display screen that allows the user to enter store/acquire information. For example, the store/acquire touch control 3816 contains a configuration screen 3802 in which the user may enter configuration information. The user can actuate a virtual keyboard allowing entry of alphanumeric characters in different touch-activated fields. Additionally, the store/acquire touch control 3816 contains a section allowing user enablement of retrospective acquisition 3804. When the user enables the store function, the system defaults to storing prospective cine loops. If the user enables retrospective capture, the store function may collect the cine loop retrospectively.
Within the setup screen 3800, the image control bar 3828 includes touch controls that may be operated by touch and touch gestures applied by the user directly to the surface of the display. Exemplary touch controls may include, but are not limited to, a thumbnail settings touch control 3830, a synchronize touch control 3832, a selection touch control 3834, a previous image touch control 3836, a next image touch control 3838, a 2-dimensional image touch control 3840, and a pause image touch control 3842.
Further illustrated by
And the ejection fraction is calculated by EF = 100% × (EDV − ESV)/EDV, where EDV is the end-diastolic volume and ESV is the end-systolic volume.
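The ejection-fraction calculation itself is a short arithmetic step once the two chamber volumes have been measured; a minimal sketch, with arbitrary illustrative volumes:

```python
def ejection_fraction(edv_ml, esv_ml):
    """Ejection fraction (%) from end-diastolic (EDV) and
    end-systolic (ESV) left-ventricular volumes, in ml."""
    stroke_volume = edv_ml - esv_ml
    return 100.0 * stroke_volume / edv_ml

# e.g. EDV = 120 ml, ESV = 50 ml gives an EF of roughly 58 %
ef = ejection_fraction(120.0, 50.0)
```

In practice the EDV and ESV inputs would come from the volume measurements traced on the ultrasound images as described above.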
It is noted that the operations described herein are purely exemplary, and imply no particular order. Further, the operations can be used in any sequence, when appropriate, and/or can be partially used. Exemplary flowcharts are provided herein for illustrative purposes and are non-limiting examples of methods. One of ordinary skill in the art will recognize that exemplary methods may include more or fewer steps than those illustrated in the exemplary flowcharts, and that the steps in the exemplary flowcharts may be performed in a different order than shown.
In describing exemplary embodiments, specific terminology is used for the sake of clarity. For purposes of description, each specific term is intended to at least include all technical and functional equivalents that operate in a similar manner to accomplish a similar purpose. Additionally, in some instances where a particular exemplary embodiment includes a plurality of system elements or method steps, those elements or steps may be replaced with a single element or step. Likewise, a single element or step may be replaced with a plurality of elements or steps that serve the same purpose. Further, where parameters for various properties are specified herein for exemplary embodiments, those parameters may be adjusted up or down by 1/20th, 1/10th, ⅕th, ⅓rd, ½, etc., or by rounded-off approximations thereof, unless otherwise specified.
With the above illustrative embodiments in mind, it should be understood that such embodiments can employ various computer-implemented operations involving data transferred or stored in computer systems. Such operations are those requiring physical manipulation of physical quantities. Typically, though not necessarily, such quantities take the form of electrical, magnetic, and/or optical signals capable of being stored, transferred, combined, compared, and/or otherwise manipulated.
Further, any of the operations described herein that form part of the illustrative embodiments are useful machine operations. The illustrative embodiments also relate to a device or an apparatus for performing such operations. The apparatus can be specially constructed for the required purpose, or can incorporate general-purpose computer devices selectively activated or configured by a computer program stored in the computer. In particular, various general-purpose machines employing one or more processors coupled to one or more computer readable media can be used with computer programs written in accordance with the teachings disclosed herein, or it may be more convenient to construct a more specialized apparatus to perform the required operations.
The foregoing description has been directed to particular illustrative embodiments of this disclosure. It will be apparent, however, that other variations and modifications may be made to the described embodiments, with the attainment of some or all of their associated advantages. Moreover, the procedures, processes, and/or modules described herein may be implemented in hardware, software, embodied as a computer-readable medium having program instructions, firmware, or a combination thereof. For example, one or more of the functions described herein may be performed by a processor executing program instructions out of a memory or other storage device.
It will be appreciated by those skilled in the art that modifications to and variations of the above-described systems and methods may be made without departing from the inventive concepts disclosed herein. Accordingly, the disclosure should not be viewed as limited except as by the scope and spirit of the appended claims.
This application is a continuation of U.S. application Ser. No. 17/520,150, filed Nov. 5, 2021, which is a continuation of U.S. application Ser. No. 15/833,547, filed Dec. 6, 2017, which is a continuation of U.S. application Ser. No. 14/037,106, filed Sep. 25, 2013, which is a continuation-in-part of PCT Application PCT/US2013/033941 filed Mar. 26, 2013, which is a continuation of U.S. application Ser. No. 13/838,694 filed Mar. 15, 2013, which claims priority to U.S. Provisional Application No. 61/615,627, filed Mar. 26, 2012 and to U.S. Provisional Application No. 61/704,254, filed Sep. 21, 2012, all of these applications being incorporated herein by reference in their entirety.
U.S. Appl. No. 13/838,694, filed Mar. 15, 2013, U.S. Pat. No. 10,667,790, Issued. |
U.S. Appl. No. 14/037,106, filed Sep. 25, 2013, U.S. Pat. No. 9,877,699, Issued. |
U.S. Appl. No. 15/025,058, filed Mar. 25, 2016, 2016-0228091, Published. |
U.S. Appl. No. 15/833,547, filed Dec. 6, 2017, U.S. Pat. No. 11,179,138, Issued. |
U.S. Appl. No. 16/806,118, filed Mar. 2, 2020, 2020-0268351, Published. |
U.S. Appl. No. 17/520,150, filed Nov. 5, 2021, U.S. Pat. No. 11,857,363, Issued. |
U.S. Appl. No. 17/834,771, filed Jun. 7, 2022, 2022-0304661, Allowed. |
U.S. Appl. No. 16/461,581, filed May 16, 2019, 2019-0365350, Published. |
U.S. Appl. No. 16/414,215, filed May 16, 2019, 2019-0336101, Published. |
U.S. Appl. No. 16/938,515, filed Jul. 24, 2020, 2021-0015456, Published. |
U.S. Appl. No. 18/090,316, filed Dec. 28, 2022, 2023-0181160, Published. |
Number | Date | Country | |
---|---|---|---|
20240148358 A1 | May 2024 | US |
Number | Date | Country | |
---|---|---|---|
61704254 | Sep 2012 | US | |
61615627 | Mar 2012 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 17520150 | Nov 2021 | US |
Child | 18397557 | US | |
Parent | 15833547 | Dec 2017 | US |
Child | 17520150 | US | |
Parent | 14037106 | Sep 2013 | US |
Child | 15833547 | US |
Number | Date | Country | |
---|---|---|---|
Parent | PCT/US2013/033941 | Mar 2013 | WO |
Child | 14037106 | US | |
Parent | 13838694 | Mar 2013 | US |
Child | PCT/US2013/033941 | US |