This disclosure relates generally to a method and ultrasound imaging system for obtaining head progression measurements. The method and ultrasound imaging system include displaying a reference image frame superimposed over a live image.
A head progression measurement obtained from an ultrasound image is used as a reliable method for assessing the progress of labor and fetal head descent for pregnant women. One commonly used head progression measurement is an angle of progression (AOP) measurement. The AOP is the angle formed between a long axis of the pubic symphysis and a line extending tangentially from an inferior edge of the pubic symphysis to a fetal skull. The AOP should increase over time for women who will undergo vaginal delivery.
The AOP, and how it changes over time, is a good indicator of fetal station. The AOP, and its progression, has been shown to be a good parameter for determining the type of delivery that would be most appropriate for a given pregnancy. Thus far, the AOP has been found to be useful in predicting the following: the likelihood of a successful vaginal delivery, the length of the second stage of labor, the chances of a successful induction of labor, and the likelihood of a successful vacuum extraction.
In order for AOP, or any other head progression measurement, to be a reliable method to assess the progress of labor and fetal head descent, it is desirable to use ultrasound images acquired from the same, or nearly the same, scan plane. Using ultrasound images acquired from different scan planes may result in undesirable error in the determination of head progression which, in turn, may contribute to an incorrect clinical determination regarding the type of delivery for a given patient.
For these and other reasons, an improved ultrasound imaging system and method for obtaining head progression measurements is desired.
The above-mentioned shortcomings, disadvantages, and problems are addressed herein, as will be understood by reading and understanding the following specification.
In an embodiment, a method of obtaining a head progression measurement using an ultrasound imaging system includes accessing, from a memory, a reference image frame that was acquired along a first scan plane, the reference image frame including a fetal head and an anatomical reference structure. The method includes obtaining a first head progression measurement from the reference image frame, acquiring, with an ultrasound probe, a live image comprising a sequence of image frames along a second scan plane, and displaying both the live image and the reference image frame on a display screen at the same time, where the reference image frame is superimposed over the live image. The method includes comparing the live image to the reference image frame and adjusting a position of the ultrasound probe, based on comparing the live image to the reference image frame, to align the second scan plane with the first scan plane. The method includes selecting an image frame from the live image after aligning the second scan plane with the first scan plane. The method includes obtaining a second head progression measurement from the image frame and displaying the second head progression measurement on the display screen.
In an embodiment, an ultrasound imaging system includes an ultrasound probe, a memory, an input device, a display screen, and a processor in electronic communication with the memory, the input device, and the display screen. The processor is configured to access a reference image frame from the memory that was acquired along a first scan plane, the reference image frame including an anatomical reference structure and a fetal head. The processor is configured to obtain a first head progression measurement from the reference image frame and acquire, with the ultrasound probe, a live image comprising a sequence of image frames along a second scan plane. The processor is configured to display both the live image and the reference image frame on the display screen at the same time, where the reference image frame is superimposed over the live image. The processor is configured to obtain a second head progression measurement from an image frame selected from the live image, and display the second head progression measurement on the display screen.
Various other features, objects, and advantages of the invention will be made apparent to those skilled in the art from the accompanying drawings and detailed description thereof.
In the following detailed description, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration specific embodiments that may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the embodiments, and it is to be understood that other embodiments may be utilized and that logical, mechanical, electrical, and other changes may be made without departing from the scope of the embodiments. The following detailed description is, therefore, not to be taken as limiting the scope of the invention.
According to various embodiments, the input device 115 may include an off-the-shelf consumer electronic device such as a smartphone, a tablet, a laptop, etc. For purposes of this disclosure, the term “off-the-shelf consumer electronic device” is defined to be an electronic device that was designed and developed for general consumer use and one that was not specifically designed for use in a medical environment. According to some embodiments, the consumer electronic device may be physically separate from the rest of the ultrasound imaging system. The consumer electronic device may communicate with the processor 116 through a wireless protocol, such as Wi-Fi, Bluetooth, Wireless Local Area Network (WLAN), near-field communication, etc. According to an embodiment, the consumer electronic device may communicate with the processor 116 through an open Application Programming Interface (API).
The ultrasound imaging system 100 also includes a processor 116 to control the transmit beamformer 101, the transmitter 102, the receiver 108, and the receive beamformer 110. The processor 116 is configured to receive inputs from the input device 115. The receive beamformer 110 may be either a conventional hardware beamformer or a software beamformer according to various embodiments. If the receive beamformer 110 is a software beamformer, it may comprise one or more of the following components: a graphics processing unit (GPU), a microprocessor, a central processing unit (CPU), a digital signal processor (DSP), or any other type of processor capable of performing logical operations. The receive beamformer 110 may be configured to perform conventional beamforming techniques as well as techniques such as retrospective transmit beamforming (RTB).
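By way of illustration only, the following is a minimal sketch of conventional delay-and-sum receive beamforming of the kind a software beamformer may perform. The function name, element geometry, sampling rate, and single-focus simplification are assumptions for this sketch, not a description of retrospective transmit beamforming itself:

```python
import numpy as np

def delay_and_sum(rf, element_x, focus, c=1540.0, fs=40e6):
    """Beamform one focal point from per-element RF traces.

    rf        : (n_elements, n_samples) received RF data
    element_x : (n_elements,) lateral element positions in meters
    focus     : (x, z) focal point in meters
    c         : assumed speed of sound in m/s
    fs        : sampling rate in Hz
    """
    fx, fz = focus
    tx_dist = np.hypot(fx, fz)               # transmit path, assumed from array center
    rx_dist = np.hypot(element_x - fx, fz)   # receive path back to each element
    delays = (tx_dist + rx_dist) / c         # round-trip time per element
    idx = np.clip(np.round(delays * fs).astype(int), 0, rf.shape[1] - 1)
    # Sum the time-aligned contributions from all elements (no apodization).
    return rf[np.arange(rf.shape[0]), idx].sum()
```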
The processor 116 is in electronic communication with the ultrasound probe 106. The processor 116 may control the ultrasound probe 106 to acquire ultrasound data. The processor 116 controls which of the elements 104 are active and the shape of a beam emitted from the ultrasound probe 106. The processor 116 is also in electronic communication with the display screen 118, and the processor 116 may process the ultrasound data into images for display on the display screen 118. The processor 116 may be configured to display one or more non-image elements on the display screen 118. The instructions for displaying each of the one or more non-image elements may be stored in the memory 120, which will be described in additional detail hereinafter. For purposes of this disclosure, the term “electronic communication” may be defined to include both wired and wireless connections. The processor 116 may include a central processing unit (CPU) according to an embodiment. According to other embodiments, the processor 116 may include other electronic components capable of carrying out processing functions, such as a digital signal processor (DSP), a field-programmable gate array (FPGA), a graphics processing unit (GPU), or another type of processor. According to other embodiments, the processor 116 may include multiple electronic components capable of carrying out processing functions. For example, the processor 116 may include two or more electronic components selected from a list of electronic components including: a central processing unit (CPU), a digital signal processor (DSP), a field-programmable gate array (FPGA), and a graphics processing unit (GPU). According to another embodiment, the processor 116 may also include a complex demodulator (not shown) that demodulates the RF data and generates raw data. In another embodiment, the demodulation may be carried out earlier in the processing chain. The processor 116 may be adapted to perform one or more processing operations according to a plurality of selectable ultrasound modalities on the data. The data may be processed in real-time during a scanning session as the echo signals are received. For the purposes of this disclosure, the term “real-time” is defined to include a procedure that is performed without any intentional delay. Real-time volume rates may vary based on the size of the volume from which data is acquired and the specific parameters used during the acquisition. The data may be stored temporarily in a buffer during a scanning session and processed in less than real-time in a live or off-line operation. Some embodiments of the invention may include multiple processors (not shown) to handle the processing tasks. For example, an embodiment may use a first processor to demodulate and decimate the RF signal and a second processor to further process the data prior to displaying an image. It should be appreciated that other embodiments may use a different arrangement of processors. For embodiments where the receive beamformer 110 is a software beamformer, the processing functions attributed to the processor 116 and the software beamformer hereinabove may be performed by a single processor, such as the receive beamformer 110 or the processor 116. Alternatively, the processing functions attributed to the processor 116 and the software beamformer may be allocated in a different manner between any number of separate processing components.
According to an embodiment, the ultrasound imaging system 100 may continuously acquire real-time 3D ultrasound data at a volume-rate of, for example, 10 Hz to 30 Hz. A live image may be generated based on the real-time 3D ultrasound data. The live image may be refreshed at a frame-rate that is similar to the volume-rate. Other embodiments may acquire data and/or display the live image at different volume-rates and/or frame-rates. For example, some embodiments may acquire real-time 3D ultrasound data at a volume-rate of less than 10 Hz or greater than 30 Hz depending on the size of the volume and the intended application. Other embodiments may use 3D ultrasound data that is not real-time 3D ultrasound data. The memory 120 is included for storing processed frames of acquired data and instructions for displaying one or more non-image elements on the display screen 118. In an exemplary embodiment, the memory 120 is of sufficient capacity to store image frames of ultrasound data acquired over a period of time at least several seconds in length. The memory 120 may comprise any known data storage medium. In embodiments where the 3D ultrasound data is not real-time 3D ultrasound data, the 3D ultrasound data may be accessed from the memory 120, or any other memory or storage device. The memory or storage device may be a component of the ultrasound imaging system 100, or the memory or storage device may be external to the ultrasound imaging system 100.
Optionally, embodiments of the present invention may be implemented utilizing contrast agents and contrast imaging. Contrast imaging generates enhanced images of anatomical structures and blood flow in a body when using ultrasound contrast agents including microbubbles. After acquiring data while using a contrast agent, the image analysis includes separating harmonic and linear components, enhancing the harmonic component, and generating an ultrasound image by utilizing the enhanced harmonic component. Separation of harmonic components from the received signals is performed using suitable filters. The use of contrast agents for ultrasound imaging is well-known by those skilled in the art and will therefore not be described in further detail.
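As one hedged example of separating the harmonic component from the received signals, the sketch below isolates the band around the second harmonic with a Butterworth bandpass filter. The transmit frequency, bandwidth, sampling rate, and function name are assumed values for this example, and practical contrast implementations may use other techniques such as pulse inversion:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def harmonic_component(rf, f0=3e6, fs=40e6, half_bw=0.5e6):
    """Keep the second-harmonic band (around 2*f0) of an RF trace and
    reject the linear (fundamental) band."""
    nyquist = fs / 2.0
    band = [(2 * f0 - half_bw) / nyquist, (2 * f0 + half_bw) / nyquist]
    b, a = butter(4, band, btype="bandpass")
    return filtfilt(b, a, rf)   # zero-phase filtering of the trace
```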
In various embodiments of the present invention, data may be processed by other or different mode-related modules by the processor 116 (e.g., B-mode, Color Doppler, M-mode, Color M-mode, spectral Doppler, Elastography, TVI, strain, strain rate, and combinations thereof, and the like) to form 2D or 3D images or data. For example, one or more modules may generate B-mode, color Doppler, M-mode, color M-mode, spectral Doppler, Elastography, TVI, strain, strain rate, and combinations thereof, and the like. The image beams and/or frames are stored in memory, and timing information indicating a time at which the data was acquired may be recorded. The modules may include, for example, a scan conversion module to perform scan conversion operations to convert the image frames from beam space coordinates to display space coordinates. A video processor module may be provided that reads the image frames from a memory and displays the image frames in real time while a procedure is being carried out on a patient. A video processor module may store the image frames in an image memory, from which the images are read and displayed.
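To illustrate the scan conversion module, the sketch below maps a sector frame from beam space coordinates (range, steering angle) to a Cartesian display grid using nearest-sample lookup; the grid sizes and function name are assumptions for this example:

```python
import numpy as np

def scan_convert(frame, angles, ranges, nx=512, nz=512):
    """Convert a (n_ranges, n_angles) beam-space frame to an (nz, nx)
    display-space image of a sector scan."""
    x = np.linspace(ranges[-1] * np.sin(angles[0]),
                    ranges[-1] * np.sin(angles[-1]), nx)
    z = np.linspace(0.0, ranges[-1], nz)
    xx, zz = np.meshgrid(x, z)
    r = np.hypot(xx, zz)            # range of each display pixel
    th = np.arctan2(xx, zz)         # steering angle of each display pixel
    ri = np.clip(np.searchsorted(ranges, r), 0, len(ranges) - 1)
    ti = np.clip(np.searchsorted(angles, th), 0, len(angles) - 1)
    out = frame[ri, ti]
    # Blank pixels that fall outside the imaged sector.
    out[(r > ranges[-1]) | (th < angles[0]) | (th > angles[-1])] = 0.0
    return out
```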
The method 200 will be described with respect to an exemplary embodiment where the method 200 is performed with the ultrasound imaging system 100 shown in
At step 202, the processor 116 accesses, from the memory 120, a reference image frame 301 that was acquired along a first scan plane. The reference image frame 301 may have been previously acquired with the ultrasound imaging system 100, or the reference image frame 301 may have been acquired with a different ultrasound imaging system. According to an embodiment, the reference image frame 301 includes both a pubic symphysis 302 and a fetal head 304. The reference image frame 301 may be acquired from a 2D acquisition, or the reference image frame 301 may be a representation of a plane from a 3D acquisition.
At step 204, the processor 116 obtains a first head progression measurement from the reference image frame 301. According to an exemplary embodiment, the first head progression measurement may be a first angle of progression measurement 303. The first angle of progression measurement 303 is obtained by measuring the angle between a longitudinal axis of the pubic symphysis, represented by a pubic symphysis line 306, and a tangent line 308 extending tangentially from an inferior edge of the pubic symphysis 302 to the fetal head 304. According to some embodiments, the first angle of progression may be determined semi-automatically. For example, one or both of the pubic symphysis line 306 and the tangent line 308 may be manually placed by an operator. According to other embodiments, the first angle of progression 303 may be determined automatically, and the processor 116 may automatically place both the pubic symphysis line 306 and the tangent line 308 using image processing techniques to determine the position of the pubic symphysis 302 and the fetal head 304. Any type of image processing technique may be used, including thresholding, edge detection, intensity vectors, shape recognition, and the like. Angle of progression is a standard measurement that is well-known by those skilled in the art, so it will not be described in additional detail.
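By way of illustration only, once the two lines are placed, the angle of progression may be computed from their endpoints as the angle between the corresponding direction vectors. The three points below are hypothetical pixel coordinates: the two ends of the pubic symphysis line 306 and the point where the tangent line 308 meets the fetal skull:

```python
import numpy as np

def angle_of_progression(sym_sup, sym_inf, skull_tangent):
    """Angle, in degrees, at the inferior edge of the pubic symphysis
    between the symphysis long axis and the tangent line to the skull."""
    u = np.subtract(sym_inf, sym_sup)        # along the pubic symphysis line
    v = np.subtract(skull_tangent, sym_inf)  # along the tangent line
    cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

# Hypothetical pixel coordinates; prints an AOP of about 107 degrees.
print(angle_of_progression((200, 80), (220, 160), (320, 100)))
```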
At step 206, a live image 401 comprising a sequence of image frames is acquired with the ultrasound probe 106 along a second scan plane. The live image 401 includes the pubic symphysis 302 and the fetal head 304. Since the operator will ultimately want to compare the first angle of progression 303 obtained from the reference image frame 301 with an angle of progression 305 determined from one of the image frames of the live image 401, it is desirable to have the second scan plane, represented in the live image 401, align as closely as possible with the first scan plane, represented in the reference image frame 301. Since the live image 401 updates as additional ultrasound data is acquired, it should be appreciated that the position of the second scan plane, as represented in the live image 401, may change during the course of acquiring the live image 401. The operator may, for instance, either intentionally or inadvertently adjust the position of the ultrasound probe 106 with respect to the patient.
At step 208, the processor displays the reference image frame 301 superimposed over the live image 401.
According to an embodiment, the processor 116 may automatically highlight one or more structures in either one or both of the live image 401 and the reference image frame 301 to aid the operator in adjusting the ultrasound probe position so that the live image 401 matches the reference image frame 301. For instance, the processor 116 may use one or more image processing techniques in order to automatically identify the pubic symphysis 302, the fetal head 304, a portion of the fetal head 304, such as the skull, or shadows present in either or both of the reference image frame 301 and the live image 401 to help the operator more easily visualize differences between the reference image frame 301 and the live image 401. The operator may use real-time feedback from the live image 401 with respect to the superimposed reference image frame 301 to make adjustments to the position of the ultrasound probe 106. According to some embodiments, the processor 116 may superimpose only select features from the reference image frame 301 on the live image 401. For example, the processor 116 may superimpose only one or more of the pubic symphysis 302, the fetal head 304, or shadows present in the reference image frame 301. At step 210, the operator deliberately adjusts the position of the ultrasound probe 106 to align the second scan plane (as represented by the live image 401) with the first scan plane (as represented by the reference image frame 301). When the live image 401 matches or closely matches the reference image frame 301, it is a good indication to the user that the second scan plane is aligned with the first scan plane. According to other embodiments, the matching of the live image 401 to the reference image frame 301 may be based primarily upon one or more anatomical structures present in both the reference image frame 301 and the live image 401. For example, determining how well the live image 401 matches the reference image frame 301 may be based primarily on how well the pubic symphysis in the live image matches the pubic symphysis in the reference image frame, as the pubic symphysis is expected to be a fixed landmark throughout labor.
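As a minimal sketch of the superimposition described above, assuming grayscale frames normalized to [0, 1] and a hypothetical green tint that keeps the reference image frame 301 distinguishable from the live image 401, the blending may be expressed as:

```python
import numpy as np

def superimpose(live, reference, alpha=0.35, tint=(0.0, 1.0, 0.0)):
    """Blend a tinted reference frame over a live frame.

    live, reference : (H, W) grayscale frames with values in [0, 1]
    alpha           : opacity of the superimposed reference frame
    """
    live_rgb = np.repeat(live[..., None], 3, axis=-1)   # gray live image
    ref_rgb = reference[..., None] * np.asarray(tint)   # tinted reference
    return (1.0 - alpha) * live_rgb + alpha * ref_rgb
```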
At step 212, after adjusting the position of the ultrasound probe 106 so that the live image 401 matches the reference image frame 301, an image frame is selected from the live image 401. It should be appreciated that it may not always be possible for the live image 401 to perfectly match the reference image frame 301. The operator may subjectively determine when the match between the live image 401 and the reference image frame 301 is close enough, or the processor 116 may use a similarity metric, based on any parameter such as grayscale values, shape-based matching, edge detection, or the like, to determine when the similarity metric exceeds a threshold. The threshold may be selectable by the operator, or the threshold may be preset on the ultrasound imaging system 100.
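One possible similarity metric, offered only as an example based on grayscale values, is normalized cross-correlation between the live frame and the reference frame; the function names and the 0.85 threshold are assumptions:

```python
import numpy as np

def similarity(live, reference):
    """Normalized cross-correlation of grayscale values; 1.0 is a perfect match."""
    a = live - live.mean()
    b = reference - reference.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def planes_aligned(live, reference, threshold=0.85):
    """True when the live frame is close enough to the reference frame."""
    return similarity(live, reference) >= threshold
```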
According to an embodiment, the processor 116 may automatically select an image frame from the live image 401 once the similarity metric exceeds the threshold. According to other embodiments, an operator may manually select the image frame from the live image 401 based on an input through the input device 115, such as pressing a freeze button. According to other embodiments, the processor 116 may display a graphical indicator on the display screen 118 to indicate the value of the similarity metric between the live image 401 and the reference image frame 301. Graphical indicators may include the use of various colors, shapes, or icons to indicate the similarity metric between the live image 401 and the reference image frame 301. For instance, a first color may be used to indicate when the similarity metric between the live image 401 and the reference image frame 301 exceeds the threshold. A second color may be used to indicate when the similarity metric between the live image 401 and the reference image frame 301 is below the threshold. According to other embodiments, different icons may be used to indicate when the similarity metric is above or below the threshold. According to one exemplary embodiment, the graphical indicator may include a traffic light icon. The traffic light icon may, for instance, show a green light when the similarity metric exceeds the threshold and a red light when the similarity metric is below the threshold. According to other embodiments, the traffic light icon may additionally include a yellow light to indicate when the similarity metric is within a predetermined range of the threshold. The operator may select the image frame from the live image 401 by using the graphical indicator as a guide, or, according to other embodiments, the operator may determine when the match between the live image 401 and the reference image frame 301 is acceptable based solely on a visual comparison.
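A minimal sketch of the traffic light logic, assuming the similarity metric above and a hypothetical margin that defines when the yellow light is shown:

```python
def traffic_light(metric, threshold=0.85, margin=0.05):
    """Map a similarity metric to a traffic light state for the display."""
    if metric >= threshold:
        return "green"             # similarity metric exceeds the threshold
    if metric >= threshold - margin:
        return "yellow"            # within the predetermined range of the threshold
    return "red"                   # similarity metric is below the threshold
```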
Next, at step 214, a second head progression measurement is obtained from the image frame selected from the live image 401. According to an exemplary embodiment, the second head progression measurement may be a second angle of progression. As described with respect to step 204, obtaining the second angle of progression measurement may either be an automatic process performed by the processor 116 or a semi-automatic process requiring one or more inputs from the operator. For example, the operator may manually identify one or more of the following: the pubic symphysis 302, the pubic symphysis line 306, the fetal head 304, a fetal skull 307, or the tangent line 308.
At step 216, the processor 116 displays the second head progression measurement on the display screen 118. According to an exemplary embodiment, the second head progression measurement may be the second angle of progression measurement. The processor may, for instance, display the second angle of progression measurement in degrees on the display screen 118. The second angle of progression measurement is displayed in the lower right-hand corner of
At step 218, the processor 116 overlays a head progression indicator, such as an angle of progression indicator 309, on the display screen 118. According to an embodiment, the angle of progression indicator 309 includes the pubic symphysis line 306 and the tangent line 308. Different head progression indicators may be used in other embodiments, including a line to represent a length.
Steps 220 and 222 are optional for the method 200. Not all embodiments will implement steps 220 and 222. At step 220, it is determined whether it is desired to obtain additional head progression measurements. According to an exemplary embodiment, it may be desired to obtain additional head progression measurements on a regular basis. For example, embodiments may obtain additional head progression measurements relatively frequently, such as multiple times a minute or multiple times an hour. Alternatively, the method 200 may obtain additional head progression measurements relatively infrequently, such as at an interval of greater than an hour. If it is desired to obtain an additional head progression measurement, the method 200 returns to step 206, and steps 206, 208, 210, 212, 214, 216, 218, and 220 are repeated. Steps 206, 208, 210, 212, 214, 216, 218, and 220 may be repeated as many times as desired by the operator. If it is not desired to obtain additional head progression measurements, the method 200 advances to step 222.
As discussed previously, step 222 is optional. Some embodiments may be completed after either step 218 or step 220. However, according to some embodiments, at step 222, the method 200 displays the reference image frame and one or more image frames selected during each iteration through steps 206, 208, 210, 212, 214, 216, 218, and 220 as a cine loop. Other embodiments may include displaying a cine loop that does not include the reference image frame. For example, the cine loop may consist of image frames acquired only from multiple iterations of steps 206, 208, 210, 212, 214, 216, 218, and 220. According to an embodiment, the angle of progression indicator 309 may be included in the cine loop. The angle of progression indicator 309 may be adjusted to fit the angle of progression represented in each image frame of the cine loop. Displaying the cine loop allows the operator to easily see how the angle of progression measurement changes during labor, and focusing on the angle of progression indicator 309 while viewing the cine loop gives the operator a clear visual representation of how labor is progressing. The cine loop may represent angle of progression measurements acquired over a relatively short time, such as several minutes up to an hour, or over a longer period of time, such as multiple hours. Either way, watching the angle of progression indicator 309 while viewing the cine loop provides the operator with an easily discernible representation of how labor is progressing and any changes in the angle of progression and descent of the fetal head that have occurred over the time represented in the cine loop. Different types of head progression indicators may be displayed in other embodiments. For example, a line showing the distance between the fetal head and an anatomical reference structure may be shown in other embodiments. The length of the line may be adjusted as the cine loop is played to visually depict the head progression measurement in each frame of the cine loop.
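By way of a hedged example, a cine loop with a burned-in angle of progression indicator could be assembled as sketched below; the frame format, the point triples describing each indicator, and the helper names are assumptions for illustration:

```python
import numpy as np

def draw_line(img, p0, p1, value=1.0, n=256):
    """Rasterize a line segment by sampling points between its endpoints."""
    for t in np.linspace(0.0, 1.0, n):
        x = int(round(p0[0] + t * (p1[0] - p0[0])))
        y = int(round(p0[1] + t * (p1[1] - p0[1])))
        if 0 <= y < img.shape[0] and 0 <= x < img.shape[1]:
            img[y, x] = value
    return img

def build_cine_loop(frames, indicators):
    """Overlay an angle of progression indicator on each frame of a cine loop.

    frames     : list of (H, W) grayscale frames (e.g., the reference frame
                 followed by one selected frame per measurement iteration)
    indicators : list of (sym_sup, sym_inf, skull_tangent) pixel-coordinate
                 triples, one per frame
    """
    loop = []
    for frame, (sup, inf, tan) in zip(frames, indicators):
        f = frame.copy()
        draw_line(f, sup, inf)   # pubic symphysis line
        draw_line(f, inf, tan)   # tangent line to the fetal skull
        loop.append(f)
    return loop
```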
In the exemplary embodiment described with respect to
This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.