The systems and methods described herein relate to ultrasound imaging for medical imaging applications and to user interfaces that adjust image characteristics to make clinical conduct of imaging procedures more facile.
Scientists and engineers have made remarkable advances in medical imaging technologies, including ultrasound imaging probes that are portable, easy to use, readily transported, and capable of producing clinically excellent images. Such devices allow doctors and other clinicians to use medical imaging for patients who are remote from hospitals and clinics with imaging equipment, or to treat patients who are confined to their homes. As such, these portable systems give patients access to the excellent care that advanced medical imaging can provide.
Examples of such portable imaging systems include those described in U.S. Pat. No. 10,856,840, which is assigned to the assignee hereof. These handheld ultrasound imaging devices have demonstrated accuracy similar to that provided by cart-based ultrasound machines. See Le et al.; Comparison of four handheld point-of-care ultrasound devices by expert users; The Ultrasound Journal (2022) 14:27. In studies, such as the study set out in the above-cited publication of Le et al., multiple ultrasound imaging applications were considered, including bedside procedures, such as thoracentesis and epidural analgesia, and diagnostic evaluation of left ventricular function. Handheld ultrasound imaging devices performed well in these studies and were found to be as clinically effective as cart-based ultrasound systems.
Although handheld ultrasound imaging devices have been shown in studies to be as clinically effective as cart-based ultrasound imaging systems, there are differences between the cart-based devices and the handheld devices. Some of these differences appear in how images are rendered. Cart-based ultrasound devices may generate images with a sharper and cleaner look than handheld systems. These differences in appearance may arise from the different device form factors and the differences in components used in the two systems. The larger form factor of cart-based systems allows for the use of different components from those used in the smaller handheld units. This can include components, such as power supplies and transmitters, that are larger, more power-hungry, more expensive, and higher performing, but less suited for use in a battery-powered handheld device. Collectively, these different components can cause images generated by cart-based systems to look different from images generated by handheld ultrasound systems.
As used herein, a handheld ultrasound imaging device encompasses, but is not limited to, ultrasound devices that are free from the cart-carried power and processing systems used by cart-based ultrasound devices. They typically include a handheld probe and a mobile phone or tablet that connects to the probe. Typically, they are battery powered, and the battery is carried in the probe. In any case, handheld devices typically, but not always, are physical devices that can generate and receive ultrasonic pulses and that have size, weight, and power requirements such that they can be operated by hand without needing to be attached to an external source of power or external processing units to generate clinically useful ultrasound images. Of course, carts can be associated with these handheld devices, but the handheld device does not rely on the cart for power and processing during operations. It will be apparent to those skilled in the art that in alternate embodiments handheld ultrasound imaging devices can encompass other elements based on the application at hand.
Additionally, cart-based devices typically have larger image display screens as well as different image display resolutions. Resolution, as used in this embodiment, includes, but is not limited to, a visual characteristic of an image that defines the amount of detail the image holds, such that an image holding more detail is categorized as having a higher resolution. It will be clear to those skilled in the art that in other embodiments this term can encompass other elements based on the application at hand. Further, some cart-based devices have specialized graphics processors for more sophisticated image processing. As such, the clinician can experience a different visual aesthetic for the images produced by a cart-based ultrasound device as compared to a handheld ultrasound device. Moreover, the images generated by cart-based systems may have sharper features and a cleaner, less cluttered appearance. This may help clinicians find the anatomical structure of interest, such as a heart chamber, more quickly and allow the clinician to make a more accurate and faster diagnosis. Further, some clinicians may be more comfortable with the familiar visual aesthetic provided by cart-based ultrasound devices.
Although these portable ultrasound systems work well, are clinically effective, and offer significant advantages over cart-based ultrasound devices, there remains a need to improve the ease with which clinicians who are facile with operating cart-based devices can transition to handheld ultrasound devices, and as such there is a remaining need for improved systems.
The systems and methods described herein provide, in one aspect, a selectable image filter that can adjust an image collected by the handheld ultrasound device and transform that image using a neural filter that renders the image with the look and feel of a cart-based ultrasound device. In one embodiment, the clinician-user can activate this filter while seeking the proper position for the ultrasound probe on the habitus of the patient. The clinician-user may toggle the filter between an active and inactive state during positioning and imaging in order to employ adjusted images during the probe location and orientation process and to employ raw, that is, unadjusted, images during study of a target anatomical feature.
Further disclosed herein are systems for generating ultrasound images during a medical imaging procedure, comprising an ultrasound imaging device for generating a stream of image data, an image processor for processing the stream of image data to generate images for use by a clinician, a user interface for controlling operation of the image processor and having a neural filter UI switch that activates a neural filter, and a neural filter responsive to the UI switch for processing image data within the stream of images by adjusting an output distribution of the image data to conform with an output distribution of image data generated by cart-based ultrasound systems and to maintain content of the image data within the generated stream of image data. Optionally, the system may have the UI switch associated with a cardiac preset configuration for generating cardiac image data. Further optionally, adjusting an output distribution may include adjusting an output distribution of an image to match an image output distribution associated with a cart-based ultrasound imaging device. Still further optionally, the neural filter may include a mapping function for translating image data generated from a handheld ultrasound device to images of the type generated by cart-based ultrasound devices. The neural filter may process paired image data to generate the mapping function or may process unpaired image data to generate the mapping function. Further optionally, the system may include a training module that employs a cycle-consistent adversarial network to process unpaired image data to generate the mapping function. One advantage of the systems and methods described herein is that such systems and methods generate the level of ultrasound imaging quality produced by the high-power processors of cart-based ultrasound systems, often having specialized and power-intensive image processing circuits, using the relatively low-power methods of a neural filter. Still other embodiments may be realized and will be apparent from the figures and disclosure below.
In particular, in certain embodiments the systems and methods described herein include systems of generating ultrasound images during a medical imaging procedure comprising a handheld ultrasound imaging device for generating a stream of image data from the habitus of the patient, and an image processor for processing the stream of image data to produce images. Additionally, there may be a neural filter which receives the stream of image data from the handheld ultrasound imaging device and processes it to generate a new stream of image data that produces images such that the output distribution of images conforms to the visual properties of the image distribution of ultrasound images of the type produced by cart-based ultrasound systems, and a user interface control for controlling operation of the neural filter and having a UI switch for activating the neural filter. Optionally, the UI switch may comprise a preset configuration for configuring the handheld ultrasound imaging device to generate image data for an image study requirement associated with the preset and for processing generated image data with the neural filter to conform the visual properties of the generated image data to have an image distribution of ultrasound images of the type produced by cart-based ultrasound systems for the respective preset image study requirements. Optionally, adjusting the output distribution to conform to the visual properties of the image distribution of the type produced by cart-based ultrasound systems may include adjusting the image data such that the measures of the visual properties of sharpness, resolution, and noise of the resulting output image conform to the measures of the visual properties of sharpness, resolution, and noise of the output distribution of ultrasound images produced by cart-based ultrasound imaging systems. Further optionally, the neural filter may include a mapping function for translating image data produced by a handheld ultrasound imaging system into image data of the type produced by cart-based ultrasound imaging systems by employing a training module to define for the neural filter the visual properties of image data produced by handheld ultrasound imaging devices and cart-based ultrasound imaging systems respectively. Further optionally, the training module may process the image data of paired images across an image distribution of the type produced by handheld ultrasound imaging systems and an image distribution of the type produced by cart-based ultrasound imaging systems to generate the mapping function. Further optionally, the training module may process the image data of unpaired images across an image distribution of the type produced by handheld ultrasound imaging systems and an image distribution of the type produced by cart-based ultrasound imaging systems to generate the mapping function. Optionally, the training module may employ a cycle-consistent adversarial network to evaluate unpaired images translated from a first image distribution into a second image distribution to determine the accuracy of the translation, and to evaluate images translated from a first image distribution into a second image distribution and then back into the first image distribution to determine the content lost across the translation, to generate the mapping function.
In another aspect, the methods described herein include a method of generating ultrasound images during a medical imaging procedure comprising generating with a handheld ultrasound imaging device a stream of image data from the habitus of the patient, processing the stream of image data to produce images, receiving the stream of image data and processing it with a neural filter to generate a new stream of image data that produces images such that the output distribution of images conforms to the visual properties of the image distribution of ultrasound images of the type produced by cart-based ultrasound systems, and controlling operation of the neural filter through a UI switch for activating the neural filter. Additionally, there may be a method for controlling operation of the neural filter including accessing a preset configuration for configuring the handheld ultrasound imaging device to generate image data for image study requirements associated with the preset and processing generated image data with the neural filter to conform the visual properties of the generated image data to have an image distribution of ultrasound images of the type produced by cart-based ultrasound systems for the respective preset image study requirements. Optionally, adjusting the output distribution to conform to the visual properties of the image distribution of the type produced by cart-based ultrasound systems may include adjusting the image data such that the measures of the visual properties of sharpness, resolution, and noise of the resulting output image conform to the measures of the visual properties of sharpness, resolution, and noise of the output distribution of ultrasound images produced by cart-based ultrasound imaging systems. Optionally, receiving the stream of image data and processing it to generate a new stream of image data may include employing a neural filter, which includes employing a mapping function for translating image data produced by a handheld ultrasound imaging system into image data of the type produced by cart-based ultrasound imaging systems by employing a training module to define for the neural filter the visual properties of image data produced by handheld ultrasound imaging devices and cart-based ultrasound imaging systems respectively. Optionally, employing a training module may include processing the image data of paired images across an image distribution of the type produced by handheld ultrasound imaging systems and an image distribution of the type produced by cart-based ultrasound imaging systems to generate the mapping function. Further optionally, employing a training module may include processing the image data of unpaired images across an image distribution of the type produced by handheld ultrasound imaging systems and an image distribution of the type produced by cart-based ultrasound imaging systems to generate the mapping function. Further optionally, employing a training module may include employing a cycle-consistent adversarial network to evaluate unpaired images translated from a first image distribution into a second image distribution to determine the accuracy of the translation, and to evaluate images translated from a first image distribution into a second image distribution and then back into the first image distribution to determine the content lost across the translation, to generate the mapping function.
The systems and methods described herein are set forth in the appended claims.
However, for purpose of explanation of these systems and methods, several embodiments are set forth in the following figures and the related description.
In the following description, numerous details are set forth for purpose of explanation. However, one of ordinary skill in the art will realize that the embodiments described herein may be practiced without the use of these specific details. Further, for clarity, well-known structures and devices are shown in block diagram form to not obscure the description with unnecessary detail.
In one embodiment, the systems and methods described herein include, among other things, the system 100 depicted in FIG. 1, which includes a handheld ultrasound imaging probe 102 and a handheld device 108, such as a smartphone or tablet, that executes an application 109.
The probe 102, in this embodiment, is an ultrasound probe of the type disclosed in U.S. Pat. No. 10,856,840, assigned to the assignee hereof. The probe 102 is a handheld ultrasonic imaging probe that can be used by the clinician to image a patient and collect medical images useful in the clinical process of diagnosing and treating the patient. In the depicted embodiment, the probe 102 is a handheld battery-powered unit, although in other embodiments the probe 102 may draw power from the handheld device 108 or from a remote power supply. The probe 102 has a transducer head 106 that the clinician may place against the tissue of the patient, such as by placing the transducer head 106 in contact with the patient's chest proximate to the heart of the patient or proximate the carotid artery. The depicted probe 102 has a single UI control button 104, although in other embodiments there may be more than one UI control button, or no UI control button. The depicted probe 102 is an example of an ultrasound imaging device for generating a stream of image data. The stream of image data is a stream of image data generated by ultrasound imaging during a medical imaging procedure, and that stream includes reflected ultrasound energy detected by the transducer head 106 of the probe 102 and capable of being processed to reveal as images the anatomical structures, including organs, bones and tissue, of a patient. The stream of image data herein includes, but is not limited to, a sequence of data received by a processing unit which, when processed, produces a visual image represented by the data. This may refer to the sequence of data which the ultrasound probe 102 produces after processing the ultrasound signals it receives, or additionally it may refer to the sequence of data that the neural filter 224 produces after translating an image into another style. It will be clear to those skilled in the art that in alternate embodiments this may encompass other elements depending on the application at hand.
In typical operation, the clinician uses the probe 102 and the application 109 executing on the handheld device 108 to capture and display images of anatomical features of the patient. To this end, the application 109 may render the captured image in the image window 110 for the clinician to view. The UI window 112 may provide the clinician with a series of optional user interface controls that the clinician may use to operate the application 109 executing on the handheld device 108 to change how the captured image is rendered, store the image, mark the image, and perform other types of operations useful during the tomographic procedure.
During a tomographic procedure, the clinician adjusts the position and angle of the probe 102 until an image of interest appears in the image window 110. In some embodiments, the clinician may operate the application program 109 by activating UI controls in window 112 to capture images to study, or activate various functions such as, but not limited to, selecting preset configurations, performing imaging control such as for controlling depth, controlling gain, switching modes, turning color on and/or off, controlling a wireless pairing process, or for soft resetting of the probe. In any case, the application 109 allows the clinician to adjust how images are presented and stored.
In the depicted embodiment, the executing application 109 may display the constructed images, including video images, such as the ultrasound images 116 and 117, in the image window 110 so that the clinician can see live images of the patient as those images are being generated by the probe 102. In operation, the application 109 performs image processing for processing the stream of image data generated by the probe 102 for use by a clinician. Image processing may include analyzing and operating on the stream of image data generated by the probe 102, which when generated by the probe 102 is typically in the form of the energies and frequencies reflected back to the transducer head 106 by the anatomical structure of the patient. The application 109 will process this stream of ultrasound image data to generate frames of video data of the type that can be displayed on a screen and understood and put to use by a clinician.
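By way of illustration only, the following minimal Python sketch shows one conventional way that a stream of reflected echo data might be converted into displayable B-mode frames using envelope detection and log compression. The function name, array shapes, and the 60 dB dynamic range are illustrative assumptions and are not drawn from this disclosure.

```python
import numpy as np
from scipy.signal import hilbert

def rf_to_bmode_frame(rf_lines: np.ndarray, dynamic_range_db: float = 60.0) -> np.ndarray:
    """Convert one frame of raw echo lines (scanlines x samples) into an
    8-bit B-mode image. Shapes and parameters are illustrative only."""
    # Envelope detection: magnitude of the analytic signal of each scanline.
    envelope = np.abs(hilbert(rf_lines, axis=-1))
    # Normalize, then log-compress the wide echo dynamic range for display.
    envelope = envelope / (envelope.max() + 1e-12)
    compressed = 20.0 * np.log10(envelope + 1e-12)
    # Clip to the chosen dynamic range and rescale to 0..255 gray levels.
    compressed = np.clip(compressed, -dynamic_range_db, 0.0)
    return ((compressed + dynamic_range_db) / dynamic_range_db * 255.0).astype(np.uint8)

# Example: one frame of 128 scanlines, each with 2048 samples.
frame = rf_to_bmode_frame(np.random.randn(128, 2048))
```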
In the depicted embodiment, the application 109 also provides a UI window 112 that has a series of software control buttons, sometimes called widgets, that the clinician may use for controlling the operation of the probe 102. These controls allow the clinician to change how images, such as image 116 and image 117, are rendered, captured, and displayed on the handheld device 108.
In other examples, a preset for a needle visualization procedure may be employed wherein the handheld device 108 overlays a B-mode ultrasound image, generated using parameters set for visualizing an inserted needle, on top of a B-mode image obtained with parameters set for normal vascular imaging. Still other presets may be employed, and the preset employed will typically depend, at least in part, on the imaging procedure the clinician is conducting and the clinician's choice of preset.
The menu 114 may include text boxes that display text indicating the preset used during the ultrasound imaging and indicating that the images are displayed in a biplane view. Biplane views may be understood as displays of two planes of imaging, one along the longitudinal axis of the probe and the other along the transverse axis of the probe. In the depicted example, the longitudinal axis view is displayed on the bottom of the screen and the transverse axis view is displayed on the top of the screen, to provide perpendicular planes of view of the patient lumen.
In certain embodiments, the application 109 responds to controls entered through the UI controls of window 112. For example, the user interface window 112 may include a control, typically referred to as a widget, that is presented as an icon within the user interface window 112. The clinician can activate the widget to direct the application 109 to activate the neural filter for processing images presented within window 110. In certain embodiments, the widget may be a preset selection widget that allows the clinician to configure the system 100 for certain preset operations. For example, a preset control may be presented within window 112 to allow the clinician to select preset parameters for a cardiac imaging procedure. In this embodiment, the preset control acts as a user interface to instruct the application 109 to control operation of the neural filter. A preset control, or preset configuration, includes, but is not limited to, a set of parameter values within a larger system's capabilities that are configured to specific values that optimize or improve the performance of the overall system for a specific purpose, with those values grouped together into a single selectable setting for the overall system. Typically, the parameters that are part of the preset can be set individually, and it would be possible for a user to manually set each parameter value to the same level as in the preset. The benefit of the preset is that the combination of parameter values is optimized for a specific, common use-case of the system, which allows the user to save time by setting all parameter values to the optimal level for that use-case all at once. It will be clear to those skilled in the art that in other embodiments a preset control may encompass other aspects depending on the application at hand. The preset control therefore acts as a UI switch that activates the neural filter and causes the application 109 to load parameters associated with the cardiac preset controls. In this embodiment, it can be understood that a UI switch may encompass a button or selectable digital icon incorporated into the user interface of a system or tool that allows the user to toggle on and off the feature associated with the button or digital icon. It will be clear to those skilled in the art that in other embodiments a UI switch can encompass other forms which are suited to the application at hand. Typically, such parameters are set to optimize the system 100 for imaging cardiovascular anatomy of the patient. In this embodiment, the preset parameter configurations may include the activation of the neural filter of application 109. As such, the application 109 will transform images collected by probe 102 to have an image distribution associated with cart-based images. In this embodiment, an image distribution can be understood as a set of images that are of the same type based on fitting a pre-determined categorization, but that differ in their visual characteristics and content while maintaining their membership in the category, with those differences defining the range of the distribution. For example, if you have a set of images which are all ultrasound images of the heart, those images will not all be identical to each other.
Variations in patient anatomy, probe placement, device strength, and other factors will create differences between the images; however, all of these images can still be categorized as of the same type since they are all ultrasound images of the heart. It will be clear to those skilled in the art that in other embodiments an image distribution may encompass other elements based on the application at hand. The transformed images will be presented within window 110 for use by the clinician during the cardiac imaging operation. In one embodiment, the clinician may have an additional control within user interface window 112 which allows the clinician to toggle the use of the neural filter within application 109 from active to inactive. In such embodiments, the clinician may activate the neural filter by selecting the cardiac preset user control within window 112. As the clinician moves and orients the probe 102 over the habitus of the patient, the clinician may achieve a location and orientation the clinician deems proper for collecting images for a cardiac imaging procedure.
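As a purely illustrative sketch, the following Python fragment shows one way a preset could group parameter values into a single selectable setting that also acts as a UI switch for the neural filter, consistent with the behavior described above. All class names, parameters, and values are hypothetical assumptions, not a description of any particular implementation.

```python
from dataclasses import dataclass

@dataclass
class Preset:
    """A preset groups parameter values into one selectable setting.
    The names and values below are illustrative assumptions only."""
    name: str
    depth_cm: float
    gain_db: float
    neural_filter_on: bool

CARDIAC = Preset(name="Cardiac", depth_cm=16.0, gain_db=50.0, neural_filter_on=True)

class ImagingApp:
    """Hypothetical stand-in for the application controlling the probe."""
    def __init__(self) -> None:
        self.neural_filter_active = False

    def apply_preset(self, preset: Preset) -> None:
        # Selecting the preset loads its grouped parameters and, as
        # described above, acts as a UI switch for the neural filter.
        self.neural_filter_active = preset.neural_filter_on

    def toggle_neural_filter(self) -> None:
        # A separate UI control may toggle the filter during positioning.
        self.neural_filter_active = not self.neural_filter_active

app = ImagingApp()
app.apply_preset(CARDIAC)   # filter active while positioning the probe
app.toggle_neural_filter()  # clinician toggles it off to view raw images
```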
In some practices, the clinician will generate images for the imaging procedure with the neural filter active and operating as discussed herein. The clinician will locate and orient the probe 102 on the patient's habitus and capture and filter images. The filtered images may be employed during the diagnostic process. This process may include taking measurements of organ characteristics, such as ejection fraction of the patient's heart, chamber volume, hypodermic needle placement and depth relative to a patient's vein, lung volume, and other measures and observations that the clinician may employ during diagnosis and treatment. Thus, the clinician may use the images filtered by the neural filter to position the probe 102 and collect images, and the filtered images may be analyzed by the clinician as part of the clinical treatment process. Optionally and alternatively, once the probe 102 is located and oriented at a selected location on the patient's habitus, the clinician may employ the user control within window 112 to toggle the application 109 to deactivate the neural filter. Once deactivated, the application 109 will present within window 110 the image data that is collected by the probe 102 without transforming such image data to cart-based image distributions. In this way, the clinician may employ the neural filter of application 109 during the location and orientation of the probe 102 to facilitate easy and efficient location and orientation of the probe 102 relative to the habitus of the patient. Once so located, the clinician may use the UI to deactivate the neural filter of application 109 to collect images that are ground images, that is, images representative of the image data collected by the probe 102 without transforming such data to achieve the look and feel of cart-based imaging devices.
In alternative embodiments, the application 109 may respond to the UI 104 on probe 102. In this way the clinician may manually toggle the UI switch 104 to direct the application 109 to activate or deactivate the neural filter that may be filtering image data generated by probe 102. Those of skill in the art will recognize other ways to activate or deactivate the neural filter of application 109 so that the clinician has control over whether the neural filter is being applied to image data generated by probe 102.
In FIG. 2, a functional block diagram depicts an embodiment in which a probe 202 couples over a data path 207 to a handheld device 208 that executes an application 209.
The application 209 may be a computer program executing on a processor built within the circuit board of the handheld device 208 that couples to the probe 202. The development of such applications that execute on a handheld device such as a smartphone or tablet and that carry out the depicted functions of the application 209 is well known to those of skill in the art. Techniques for developing such applications are set out in, for example, Alebicto et al.; Mastering iOS 14 Programming: Build professional-grade iOS 14 applications with Swift 5.3 and Xcode 12.4, 4th ed.; Packt Publishing Ltd (2021). For clarity and ease of illustration, the application 209 is depicted as a functional block diagram. The functional blocks of application 209 include the cardiac preset 210, image data memory 214, memory buffers 218 and 220, and the neural filter module 224. The handheld device 208 also includes a device interface 204, also depicted as a functional block, that interfaces with the probe 202 to allow the exchange of data between the probe 202 and the handheld device 208. The device interface 204, in one embodiment, is a conventional electronic transceiver circuit capable of transmitting and receiving data over a wire connection, such as the depicted data path 207. Additionally, the device interface 204 couples to the bus 212. The bus 212 may be a typical data bus used on circuit boards to exchange data among different functional modules on the circuit board. In the depicted embodiment the device interface 204 couples to the data bus 212 to exchange information with the cardiac preset module 210 and the image data memory 214. The cardiac preset module 210 may be a module, in some embodiments connected and responsive to a UI widget, and capable of delivering data to the probe 202. In one embodiment, the cardiac preset module 210 of the application 209 detects that the clinician has selected this preset. This selection may occur through any suitable process, and in one particular embodiment, the selection occurs by the clinician selecting the cardiac preset option from a menu displayed on a screen of the handheld device 208, as described above.
Additionally, the device interface 204 communicates via the bus 212 with the image data memory 214. The image data memory 214 may be a data memory allocated and maintained by the application 209 and capable of storing image data generated by the probe 202.
In one embodiment, the neural filter 224 is a software module executing as part of the application 209. The neural filter 224 in this embodiment is a software module capable of translating the style associated with one set of images, in this case images generated by probe 202, to the style associated with another set of images, in this case images of the type generated by cart-based ultrasound imaging systems. As such, the neural filter 224 may be understood as an image-to-image translator that converts an image from one representation of a given scene, x, in this example an anatomical image rendered by the ultrasound imaging probe 202, to another representation, y. As used herein, translating image data can be understood to encompass, but is not limited to, a process of altering the data which define a visual image in such a way that the original content of the image is largely maintained, but the resulting translated image resembles a different category of images enough that a person familiar with both the original and translated categories would recognize the translated image as being a part of the translated category. For example, translating the image data of an ultrasound image of a heart taken by a handheld ultrasound imaging device into an image of the type produced by cart-based ultrasound systems would mean altering the image data which define the image from the handheld device such that the image produced by the translated image data would look, to a knowledgeable clinician, like a typical ultrasound image of the heart taken by a cart-based ultrasound system. It will be apparent to those skilled in the art that in alternate embodiments translating image data may encompass other elements depending on the application at hand. As discussed below, the neural filter 224 can process image data from the probe 202 to adjust an output distribution or output distributions of the image data to conform with an output distribution of image data generated by cart-based ultrasound systems and to maintain the content of the image data that had been generated by the probe 202. Thus, the neural filter 224 may maintain the content of the ground images generated by the probe 202, while processing those ground images to achieve output distributions of cart-based images and thereby, for example, achieve the sharpness and clarity of images associated with cart-based ultrasound devices. In this embodiment, an output distribution may be understood as a set of images that have been processed by a neural filter and that, while similar enough to be categorized as of the same type, have some variations in their image content and visual properties that define the range of the distribution. For example, if a neural filter was given a set of photographs of a house and was tasked with translating them into images in the style of a newspaper comic strip, the output distribution of the neural filter would be a set of images which all depicted houses, with roofs, doors, windows, etc., that generally align with the corresponding elements of the original photographs, but with those elements altered, possibly by changing the color palette or rounding the edges of sharp corners, so that the houses are still recognizable as houses but would not look out of place in a newspaper comic strip. It will be clear to those skilled in the art that in other embodiments an output distribution may encompass other elements based on the application at hand.
In one embodiment, the neural filter 224 may be implemented as an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), a micro-controller, a software module executing on a microprocessor, or any other implementation capable of being configured to carry out the image-to-image functions described herein. In one particular embodiment, the neural filter 224 is a software module executing on the microprocessor of the handheld device 208.
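The following is a minimal PyTorch sketch of one hypothetical software realization of such a neural filter: a small fully convolutional generator applied frame by frame. The architecture, class name, and tensor sizes shown are assumptions made for illustration and do not describe any particular device.

```python
import torch
import torch.nn as nn

class TinyGenerator(nn.Module):
    """A small fully convolutional generator: one hypothetical way to map
    a handheld-style frame toward a cart-based-style frame."""
    def __init__(self, channels: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(channels, 1, 3, padding=1), nn.Tanh(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The residual connection helps preserve the content of the ground
        # image while the filter adjusts its style.
        return torch.clamp(x + self.net(x), -1.0, 1.0)

neural_filter = TinyGenerator().eval()
with torch.no_grad():
    frame = torch.rand(1, 1, 256, 256) * 2 - 1  # one frame scaled to [-1, 1]
    filtered = neural_filter(frame)             # same shape, adjusted style
```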
In particular, FIG. 3 depicts a system 300 for developing the neural filter 324, in which a training module 306 processes a first data set 302 of images and a second data set 304 of images.
In one embodiment, the neural filter 324 may include a translation mapping function developed by a training module 306 that carries out an analysis of supervised settings, where example image pairs {x, y} are available. These techniques may be employed to create a training module 306 that is suitable for use with paired image data. In such an example, the images in the first data set 302 and the second data set 304 are paired. A paired data set is a data set where an image in the first data set has a corresponding respective image in the second data set. Typically, that corresponding image includes essentially the same anatomical scene as depicted in the respective image of the first data set. In this embodiment, the training module 306 employs a training process suitable for use with paired image data, such as the training process disclosed in J. Johnson, A. Alahi, and L. Fei-Fei; Perceptual losses for real-time style transfer and super-resolution; In ECCV, pages 694-711; Springer (2016). Such training modules 306 may be software modules that typically implement pixel-based image comparisons between respective images of a pair of images. The training module 306, based on these comparisons, develops a translation map for translating images from one domain to the other. The developed translation map may be supervised, which is essentially a testing process to determine the accuracy and robustness of the translation mapping function. Such supervision may occur by using known pairings of respective images between the sets to ensure that the developed translation map translates images from one domain to the other without introducing unacceptable losses and inaccuracies in the content of the translated image, and may ensure that errors in style do not occur, such as failures that introduce noise, such as line fuzziness, into the translated image. The term noise can be understood to encompass, but is not limited to, a visual characteristic of an image which defines the amount of visual information displayed in the image which may distort, transform, block, or add to what is seen in the image compared to the reality which the image is meant to capture. It will be apparent to those skilled in the art that in other embodiments this term may encompass other elements based on the application at hand.
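A minimal sketch of one supervised training step for the paired case follows, using a simple pixel-wise L1 comparison as a stand-in for the fuller perceptual losses of Johnson et al. cited above. The generator, tensor shapes, and learning rate are hypothetical placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical placeholder generator standing in for the neural filter.
generator = nn.Sequential(nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
                          nn.Conv2d(32, 1, 3, padding=1))
optimizer = torch.optim.Adam(generator.parameters(), lr=2e-4)

def paired_training_step(x_handheld: torch.Tensor, y_cart: torch.Tensor) -> float:
    """One supervised step on a pair {x, y}: the same anatomical scene as
    imaged by a handheld device (x) and a cart-based system (y)."""
    optimizer.zero_grad()
    y_hat = generator(x_handheld)
    # Pixel-wise comparison of the translated image with its cart-based
    # counterpart; Johnson et al. would add feature-space (perceptual)
    # terms computed with a pretrained network.
    loss = F.l1_loss(y_hat, y_cart)
    loss.backward()
    optimizer.step()
    return loss.item()

loss = paired_training_step(torch.rand(4, 1, 256, 256), torch.rand(4, 1, 256, 256))
```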
In alternate embodiments, the neural filter 224 may include a translation mapping developed by a training module 306 that works from unpaired input-output examples. In this embodiment, unpaired images can be understood to include, but are not limited to, images of different categories which are identified only by their category, and not by any relationship between each other. For example, if two sets of ultrasound images are differentiated by one set being from handheld ultrasound systems and the other from cart-based ultrasound systems, no other identifiers will be added to the images even if there are images in both sets which depict the heart, liver, etc. In still other embodiments, unpaired images may encompass other elements without departing from the scope of this disclosure. Such techniques are disclosed in, for example, Zhu et al.; Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks; arXiv:1703.10593 (2020). For such embodiments, supervision of the developed translation may be provided at the level of sets of images. In these processes that employ unpaired data, the development of the neural filter 324 includes collecting a first set of images in Domain X, which in this case will typically be images generated by a handheld device of the type depicted in FIG. 1, and collecting a second set of images in Domain Y, which will typically be images of the type generated by cart-based ultrasound systems.
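The cycle-consistency idea of Zhu et al. for unpaired data may be sketched as follows, with hypothetical placeholder generators G (Domain X to Domain Y) and F_inv (Domain Y to Domain X). This is a schematic loss computation under assumed shapes, not a complete training system.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_generator() -> nn.Module:
    # Placeholder generator; real cycle-consistent generators are deeper.
    return nn.Sequential(nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
                         nn.Conv2d(32, 1, 3, padding=1))

G = make_generator()      # maps Domain X (handheld) to Domain Y (cart-based)
F_inv = make_generator()  # maps Domain Y back to Domain X

def cycle_consistency_loss(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    """Translate X -> Y -> X and Y -> X -> Y, and measure the content lost
    across each round trip, as described above for unpaired training."""
    forward_cycle = F.l1_loss(F_inv(G(x)), x)   # x translated and back
    backward_cycle = F.l1_loss(G(F_inv(y)), y)  # y translated and back
    return forward_cycle + backward_cycle

loss = cycle_consistency_loss(torch.rand(2, 1, 128, 128),
                              torch.rand(2, 1, 128, 128))
```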
In operation, the system 300 delivers the images from image sets 302 and 304 into the training module 306. The training module 306 trains a neural network to identify and analyze characteristics of the images related to image set 302, and characteristics of the images in the image set 304. Distribution information about these characteristics is developed for the input data set 302 and for the output data set 304. The training module develops a mapping that maps images of the type from the first data set 302 to the domain space associated with images of the type in data set 304. In particular, the translation map developed by the training module 306 maps images of the type from the image set 302 such that translated images from set 302 will have image distributions that match, or are similar to, the image distribution characteristics of images of the type found in data set 304. In this way, images captured by the handheld device can be translated to have the distributions found in the Domain of images generated by cart-based devices. This will give the translated images a look and feel that is highly correlated to the look and feel of images found in the second data set 304. This correlation is high enough that discriminating translated images from native images within the set 304 is difficult for the human observer. In one embodiment, discriminators are applied to challenge both generative functions. The discriminators aim to distinguish between ground images from a Domain, whether X or Y, and images that have been translated into the respective Domain. The use of discriminators can increase the match between distributions of ground images of one Domain and images translated into that Domain. The test processes employed to discriminate translated images from actual images from the Domain may include human visual inspection or machine inspection that analyzes image characteristics such as distribution of contrast, image noise, and other criteria. It will be apparent to those of skill in the art that any suitable criteria may be employed, and the inspection criteria for discriminating between translated images and ground images originally from the Domain will typically depend upon the application being addressed. The discriminator criteria for convergence may, in one example, be set such that the discriminator has an accuracy of about fifty percent (50%) on both domains and the discriminator cannot distinguish effectively between the translated images and the ground images. This fifty percent level, meaning the discriminator is as likely to be correct as incorrect in determining, for instance, whether a translated image is actually a ground image, indicates that the generative network is generating translated images with data distributions that the discriminator cannot distinguish from those of the ground images.
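The fifty-percent convergence criterion described above can be illustrated with the following hedged sketch. The discriminator architecture, the sign convention for its scores, and the random data are all hypothetical assumptions used only to make the check concrete.

```python
import torch
import torch.nn as nn

# Hypothetical patch-style discriminator; positive mean score = "ground image".
discriminator = nn.Sequential(
    nn.Conv2d(1, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
    nn.Conv2d(32, 1, 4, stride=2, padding=1),
)

def discriminator_accuracy(ground: torch.Tensor, translated: torch.Tensor) -> float:
    """Fraction of images the discriminator classifies correctly. Near 0.5
    it can no longer tell translated images from ground images, which is
    the convergence criterion discussed above."""
    with torch.no_grad():
        ground_scores = discriminator(ground).mean(dim=(1, 2, 3))
        translated_scores = discriminator(translated).mean(dim=(1, 2, 3))
    correct = (ground_scores > 0).sum() + (translated_scores <= 0).sum()
    return correct.item() / (ground.shape[0] + translated.shape[0])

acc = discriminator_accuracy(torch.rand(8, 1, 128, 128), torch.rand(8, 1, 128, 128))
print(f"discriminator accuracy: {acc:.2f} (~0.50 at convergence)")
```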
The mapping developed by the training module 306 may then be loaded into a neural filter module 324. In one embodiment, the training module 306 loads the map into the neural filter 324 as a portion of a neural network capable of translating images from the first domain associated with the set of images 302 into images having a distribution that matches images in the data set 304. Alternatively, the training module 306 may develop look-up tables, a series of functions, or any other suitable mechanism for storing the translation map developed by training module 306 into the neural filter 324. In one optional embodiment, the systems and methods described herein employ two stages of training to develop a training module 306 that is of a size more suited for use in a handheld device. The first stage of this two-stage training is as described above, and the training module developed may be referred to as a “teacher model”. The two-stage training employs a second stage for the purpose of processing the teacher model to develop a student model that is effective for the application at hand, which in this example is translating the data distribution of ultrasound images. However, the student model will be smaller in size than the teacher model and therefore more readily installed in an application executing on a handheld device and may be more able to execute on the handheld device as a real-time application.
To this end, in one embodiment the teacher model is distilled into a smaller student model having a model architecture that runs in real-time on the handheld device. Techniques for such a distillation process are known in the art and include those disclosed in, or similar to those disclosed in, Chen et al.; Distilling Portable Generative Adversarial Networks for Image Translation; The Thirty-Fourth AAAI Conference on Artificial Intelligence; pages 3585-3592; (2020). In one embodiment, the distillation process includes sampling a source image and running it through the teacher model to generate the teacher target image. Additionally, the distillation process will run that source image through a student model and generate an output. Optionally, the student model may be size constrained. For example, the student model may be constrained to have half or a quarter of the channels of the teacher model. This can reduce the size of the student model as compared to the teacher model. Thus, in these embodiments, the student model generated output may be generated by a size-constrained student model.
The distillation process will reduce the distance of the student model generated output to the teacher target using a suitable technique such as, but not limited to, pixel-wise mean absolute distance measures. The distillation process will also update the discriminator to ensure that the student model generated outputs have data distributions from the target domain. The smaller student model may be employed as the training module 306.
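One hypothetical sketch of such a distillation step follows, with a frozen teacher, a channel-constrained student, and a pixel-wise mean absolute distance. All modules, channel counts, and shapes shown are illustrative assumptions rather than a description of the actual models.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_generator(channels: int) -> nn.Module:
    # Placeholder generator; channel count is the size constraint here.
    return nn.Sequential(nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(),
                         nn.Conv2d(channels, 1, 3, padding=1))

teacher = make_generator(64).eval()  # trained, frozen teacher model
student = make_generator(16)         # student with a quarter of the channels
optimizer = torch.optim.Adam(student.parameters(), lr=2e-4)

def distillation_step(source: torch.Tensor) -> float:
    """Run a sampled source image through teacher and student, then reduce
    the student output's pixel-wise mean absolute distance to the teacher
    target, as in the distillation process described above."""
    with torch.no_grad():
        teacher_target = teacher(source)
    optimizer.zero_grad()
    loss = F.l1_loss(student(source), teacher_target)
    loss.backward()
    optimizer.step()
    return loss.item()

loss = distillation_step(torch.rand(4, 1, 256, 256))
```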
In several of the embodiments discussed above, the neural filter has been activated in concert with activation of a preset such as the cardiac preset. Although in some embodiments the neural filter may be activated independent of any preset, in other embodiments the neural filter is activated in response to selection of a preset. Although a cardiac preset has been discussed, it will be understood by those of skill in the art that any useful preset may cause the activation of the neural filters disclosed herein. Table 1 presents a short list of presets of the type that may be used with the neural filters described herein.
Table 1
Each preset is a mode adapted for a particular type of imaging study. Presets may help with imaging studies and may be employed within systems that have the neural filter feature described herein, but optionally, presets may be employed within systems that do not have such a neural filter feature. In further optional embodiments, the clinician may be able to override a preset parameter that activates a neural filter to either activate or deactivate the neural filter for all operations regardless of the selected preset.
The systems and methods described herein reference circuits, CPUs, and other devices, and those of skill in the art will understand that these embodiments are examples, and the actual implementation of these circuits may be carried out as software modules running on microprocessor devices and may comprise firmware, software, hardware, or any combination thereof that is configured to perform the functions of the systems and processes described herein. Further, some embodiments may also be implemented by the preparation of application-specific integrated circuits or by interconnecting an appropriate network of conventional component circuits. To illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described herein generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the embodiments described herein.
Various aspects of the present disclosure may be used alone, in combination, or in a variety of arrangements not specifically described in the embodiments set forth in the foregoing, and the disclosure is therefore not limited in its application to the details and arrangement of components set forth in the foregoing description or illustrated in the drawings. For example, aspects described in one embodiment may be combined in any manner with aspects described in other embodiments.
The indefinite articles “a” and “an,” as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean “at least one.”
The phrase “and/or,” as used herein in the specification and in the claims, should be understood to mean “either or both” of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with “and/or” should be construed in the same fashion, i.e., “one or more” of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the “and/or” clause, whether related or unrelated to those elements specifically identified.
As used herein in the specification and in the claims, the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified.
Use of ordinal terms such as “first,” “second,” “third,” etc., in the claims to modify a claim element does not by itself connote any priority, precedence, or order of one claim element over another or the temporal order in which acts of a method are performed, but are used merely as labels to distinguish one claim element having a certain name from another element having a same name (but for use of the ordinal term) to distinguish the claim elements.
Having described above several aspects of at least one embodiment, it is to be appreciated that various alterations, modifications, and improvements will readily occur to those skilled in the art. Such alterations, modifications, and improvements are intended to be part of this disclosure. Accordingly, the foregoing description and drawings are by way of example only.
Accordingly, it will be understood that the invention is not to be limited to the embodiments disclosed herein, which include, but are not limited to:
A system for generating ultrasound images during a medical imaging procedure, comprising
an ultrasound imaging device for generating a stream of image data,
an image processor for processing the stream of image data to generate images for use by a clinician,
a user interface for controlling operation of the image processor and having a neural filter UI switch that activates a neural filter, and
a neural filter responsive to the UI switch for processing image data within the stream of images by adjusting an output distribution of the image data to conform with an output distribution of image data generated by cart-based ultrasound systems and to maintain content of the image data within the generated stream of image data.
The system above, wherein the UI switch is associated with a cardiac preset configuration for generating cardiac image data.
The system above, wherein adjusting an output distribution includes adjusting an output distribution of an image to match an image output distribution associated with a cart-based ultrasound imaging device.
The system above, wherein the neural filter includes a mapping function for translating image data generated from a handheld ultrasound device to images of the type generated by cart-based ultrasound devices.
The system above, wherein the neural filter processes paired image data to generate the mapping function.
The system above, wherein the neural filter processes unpaired image data to generate the mapping function.
The system above, having a training module that employs a cycle-consistent adversarial network to process unpaired image data to generate the mapping function.
Number | Date | Country
--- | --- | ---
63423777 | Nov 2022 | US