The present disclosure pertains to imaging systems and methods for automatically detecting a pleural line in an image. In particular, imaging systems and methods for automatically detecting a pleural line based on rib location in ultrasound images are disclosed.
Lung ultrasound may be useful in the management of a spectrum of respiratory diseases, particularly at the point of care. Its advantages include avoiding patient transport for a computed tomography (CT) scan or chest x-ray, thereby improving infection control. There is growing interest in developing automated solutions for lung ultrasound to address clinical applications including the management of infectious diseases such as COVID-19, pneumonia, chronic pulmonary diseases, and chronic or congestive heart failure. In these applications, automated lung ultrasound would aid inexperienced users in triage and clinical decision-making when monitoring disease progression over time.
The pleural line is an important landmark in lung ultrasound. Other lung features such as B-lines and sub-pleural consolidations are related to the position of the pleural line. Recent clinical literature shows that changes in the pleural line, such as thickening or irregularity, are associated with the severity of respiratory and infectious diseases including COVID-19 pneumonia.
Currently, manual identification and measurement of properties of the pleural line, such as by emergency physicians, is a time-consuming process, and interpretation of the ultrasound images and evaluation of the pleural line are subjective. Previous approaches for automatic or semi-automatic pleural line detection have used a variety of techniques, including confidence map-based approaches, three-seed-point-based peak detection, and deep learning-based approaches. However, each of these previous methods has limitations. In the confidence map-based approach, the region with values greater than a global threshold of the entire map may be considered the upper pleural region and the remainder may be considered the lower pleural region. Confidence maps may provide reliable pleural line detection from the sharp drop in confidence value along the axial direction. However, computing a confidence map requires the radio-frequency ultrasound signal, which may not be available in most portable point-of-care (POC) ultrasound imaging systems, and carries a high computational load. The three-seed-point-based peak detection approach may work well in most cases, but may suffer from false positives when there is 1) forced breathing by the patient in both horizontal and vertical directions or 2) strong motion around the location of an A-line.
A deep learning-based pleural line detection approach using a multi-layer convolutional neural network (CNN) with collected ultrasound sequence frames as input may be used. The prediction model is trained on a relatively small number of lung ultrasound images. The major limitation of deep learning-based approaches is the requirement for large datasets from multiple clinical sites and significant annotation of the images. Furthermore, the lack of transparency of the features detected by deep learning remains an issue for clinical acceptance and regulatory approval.
Accordingly, improved automatic pleural line detection techniques are desired.
A method and system for automated pleural line detection in lung ultrasound images may include: selecting a region of interest (ROI) in the image data (the first ROI image) that is likely to encompass the rib shadow, with a typical ROI image depth of 2.5 cm to 3 cm measured up from the bottom of the image (e.g., a range of [(depth−2.5) to depth]); summing intensity along the axial direction to obtain an intensity projection curve over the scan lines of the first ROI image; determining the small-value regions in the intensity projection curve, which are rib candidates because small-value regions arise from rib shadows; determining the second ROI image from a small-value region; summing intensity along the lateral direction to obtain an axial intensity projection curve on the second ROI image; determining the surface of the rib by searching for peaks in reverse (bottom-up) on the axial intensity projection curve; determining the third ROI image, which extends from the surface of the rib to approximately 1 cm deeper; computing a lung sliding motion map from several adjacent frames (e.g., 9 to 16 frames) over the region of the third ROI image; using the locations of the ribs to limit the search region on the lung sliding motion map, where the largest increase in motion marks the pleural line; and displaying the detected pleural line using a color indicator on the screen. The techniques disclosed herein may also be used to detect ribs and/or surfaces of ribs and display visual indicators thereof.
The system and methods disclosed herein may provide a uniform method that all operators can follow, thereby reducing operator dependence.
According to at least one example of the present disclosure, a method for determining a pleural line in an image may include automatically selecting a first region of interest (ROI) within an image acquired from a lung of a subject based, at least in part, on a depth of the image, analyzing the first ROI to determine at least one rib shadow region in the image, based, at least in part, on the rib shadow region, automatically selecting a second ROI within the image, analyzing the second ROI to determine a location of a rib surface in the image, based, at least in part, on the location of the rib surface, automatically selecting a third ROI within the image, analyzing the third ROI, and determining the pleural line based on the analyzing of the third ROI.
In some examples, the first ROI extends from a pre-determined depth to the depth of the image.
In some examples, analyzing the first ROI comprises computing, with at least one processor, a lateral projection for the first region of interest.
In some examples, the second ROI extends from a top of the image to the depth of the image and extends across a portion of a width of the image based on a comparison of lateral intensities of the lateral projection to a threshold value.
In some examples, analyzing the second ROI includes computing, with at least one processor, an axial projection for the second region of interest, and detecting, with the at least one processor, a peak in the axial projection.
In some examples, the third ROI extends from a first depth in the image corresponding to a location of the peak to a second depth greater than the first depth.
In some examples, the peak is detected based, at least in part, on a comparison of a difference between a peak value and a neighboring value with a threshold value.
In some examples, analyzing the third ROI includes computing, with at least one processor, an axial projection for the third region of interest, detecting, with the at least one processor, one or more peaks in the axial projection, wherein individual ones of the one or more peaks correspond to a corresponding candidate pleural line, and computing, with the at least one processor, a motion map for the third region of interest.
In some examples, determining the pleural line includes, for individual ones of the corresponding candidate pleural lines: calculating an average motion above the candidate pleural line and an average motion below the candidate pleural line from the motion map, and calculating a difference between the average motion above and the average motion below the candidate pleural line, determining the candidate pleural line having a greatest difference between the average motion above and the average motion below, and selecting the candidate pleural line having the greatest difference as the pleural line. In some examples, the method may further include determining a location of individual ones of the candidate pleural lines based, at least in part, on locations of corresponding ones of the one or more peaks. In some examples, the method may further include determining a local brightness of individual ones of the candidate pleural lines.
In some examples, the method may include analyzing the pleural line to determine a width, a thickness, a smoothness, or a combination thereof, of the pleural line.
In some examples, the method may include displaying the image on a display with a visual indicator of the pleural line overlaid on the image.
According to at least one example of the present disclosure, an ultrasound imaging system configured to determine a pleural line in an ultrasound image may include an ultrasound probe configured to acquire an ultrasound image from a lung of a subject, at least one processor configured to automatically select a first region of interest (ROI) within the ultrasound image based, at least in part, on a depth of the ultrasound image, analyze the first ROI to determine at least one rib shadow region in the ultrasound image, based, at least in part, on the rib shadow region, automatically select a second ROI within the ultrasound image, analyze the second ROI to determine a location of a rib surface in the ultrasound image, based, at least in part, on the location of the rib surface, automatically select a third ROI within the ultrasound image, analyze the third ROI, and determine the pleural line based on the analyzing of the third ROI, and a display configured to display a visual indication of the pleural line overlaid on the ultrasound image.
In some examples, the system may include a user interface configured to receive a user input indicating an age, a weight, or a combination thereof, of the subject, wherein the user input determines, at least in part, the first ROI.
According to at least one example of the present disclosure, a method for determining a rib surface in an image may include automatically selecting a first region of interest (ROI) within an image acquired from a lung of a subject based, at least in part, on a depth of the image, analyzing the first ROI to determine at least one rib shadow region in the image, based, at least in part, on the rib shadow region, automatically selecting a second ROI within the image, and analyzing the second ROI to determine a location of a rib surface in the image.
In some examples, analyzing the second ROI includes computing, with at least one processor, an axial projection for the second region of interest and detecting, with the at least one processor, a peak in the axial projection.
In some examples, the peak is detected based, at least in part, on a comparison of a difference between a peak value and a neighboring value with a threshold value.
In some examples, the method may further include displaying the image on a display with a visual indicator of the rib surface overlaid on the image.
In some examples, the first ROI extends from a pre-determined depth to the depth of the image, analyzing the first ROI comprises computing, with at least one processor, a lateral projection for the first region of interest, and the second ROI extends from a top of the image to the depth of the image and extends across a portion of a width of the image based on a comparison of lateral intensities of the lateral projection to a threshold value.
The following description of certain embodiments is merely exemplary in nature and is in no way intended to limit the invention or its applications or uses. In the following detailed description of embodiments of the present systems and methods, reference is made to the accompanying drawings, which form a part hereof, and in which are shown, by way of illustration, specific embodiments in which the described systems and methods may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the presently disclosed apparatuses, systems, and methods, and it is to be understood that other embodiments may be utilized and that structural and logical changes may be made without departing from the spirit and scope of the present disclosure. Moreover, for the purpose of clarity, detailed descriptions of certain features will not be discussed when they would be apparent to those with skill in the art so as not to obscure the description of the present apparatuses, systems, and methods. The following detailed description is therefore not to be taken in a limiting sense, and the scope of the present system is defined only by the appended claims.
Ultrasound signals (e.g., beams) propagate through skin, subcutaneous tissue, and muscle (e.g., chest wall soft tissue) to reach a surface of a rib and a visceral pleural-lung interface. At the rib surface, the ultrasound signals are almost totally reflected back (e.g., as echo signals) and received by the ultrasound probe due to the large acoustic impedance mismatch between the bony surface and the nearby soft tissue. The rib in the resulting ultrasound images may appear as a convex hyperechoic bow-like line. Regions of the lung “behind” the rib relative to the ultrasound probe may receive little to no ultrasound signal due to the high reflection at the ribs. Thus, in the resulting ultrasound images, these regions may appear as hypoechoic regions (e.g., lower amplitude, darker regions), which may extend from the rib to the bottom of the ultrasound image. These regions may be referred to as rib shadows.
The pleural line (e.g., pleural lines 310) is usually a horizontal hyperechoic line approximately 0.5 cm deeper than the rib surface in adults. The inventors have recognized that, as long as the surface of the rib can be determined, a search region for the pleural line can be selected 0.5 cm to 1.0 cm deeper than the location of the rib. Defining this search region may reduce possible false positives for the pleural line from: superficial horizontal bright lines, which are usually located at a depth range of 0.6 cm to 3 cm, a range that overlaps the location of pleural lines for some subjects; strong intensity from A-lines, especially in cases where the pleural line is below or around 1 cm; and bright horizontal lines in the superficial region (e.g., intercostal muscles 308).
According to examples of the present disclosure, a first region of interest (ROI) is automatically selected within an ultrasound image acquired from a lung ultrasound scan of a patient based, at least in part, on a depth of the image. The first ROI may be analyzed to determine at least one rib shadow region. In some examples, a lateral projection is computed for the first ROI for analysis. The rib shadow region may be used to automatically select a second ROI. The second ROI may be analyzed to determine a location of a rib surface. In some examples, the second ROI may be used to compute an axial projection for analysis. The location of the rib surface may be used to automatically select a third ROI. The third ROI may be analyzed to identify the pleural line. In some examples, the third ROI may be used to compute an axial projection and/or a motion map for analysis. Using the systems and methods disclosed herein, the pleural line may be correctly identified (e.g., a depth/location of the pleural line may be determined) based on the position of the rib surface and rib shadows in the lung ultrasound image, even in difficult cases.
As indicated by block 402, within an ultrasound image acquired from a lung of a subject, a first ROI may be selected. The first ROI may extend from a pre-determined depth, as measured from a face of the ultrasound probe, to the bottom of the image, across an entire width of the image. Alternatively, the pre-determined depth may be defined relative to the bottom of the image.
The pre-determined depth may be approximately 4 cm to 5 cm for a typical adult subject (e.g., alternatively, 2 cm to 3 cm as measured from the bottom of the image). In some examples, the pre-determined depth may be adjusted based, at least in part, on the type of ultrasound probe used, whether the subject is an adult or a child, and/or whether the subject is overweight. For example, if the subject is a child, the pre-determined depth may be smaller (e.g., approximately 2 cm to 3 cm). In another example, if the subject is an overweight adult, the pre-determined depth may be greater (e.g., approximately 5 cm to 5.5 cm). In some examples, the pre-determined depth may be adjusted (if necessary) automatically; for example, an ultrasound imaging system may detect the type of ultrasound probe being used, and/or an ultrasound image may have metadata (e.g., a DICOM header/tag) indicating the type of probe used that can be interpreted by a computing system. In some examples, the pre-determined depth may be adjusted based on input from a user (e.g., an emergency physician or ultrasound technician). For example, during the ultrasound exam, a user interface of the ultrasound system may prompt the user for information about the subject and adjust the pre-determined depth (if necessary) based on the user's input.
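By way of illustration, a minimal Python/NumPy sketch of the first-ROI selection is given below. The function name and its parameters are illustrative assumptions for a sketch, not the disclosed implementation; the bottom-of-image convention follows the example range above.

```python
import numpy as np

def select_first_roi(image, px_per_cm, strip_cm=2.5):
    """Select the first ROI: a bottom strip of the B-mode image expected
    to contain the rib shadows, spanning the entire image width.

    image: 2D array (rows = depth samples, cols = scan lines).
    px_per_cm: axial pixels per centimeter for the current depth setting.
    strip_cm: pre-determined strip height measured up from the image
        bottom (e.g., 2.5-3 cm for a typical adult; it could be adjusted
        for children, overweight adults, or a different probe type).
    """
    strip_rows = int(round(strip_cm * px_per_cm))
    top = max(image.shape[0] - strip_rows, 0)
    return image[top:, :], top  # ROI pixels and the ROI's top-row offset
```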
Within the first ROI, a lateral projection is computed as indicated by block 404.
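The lateral projection step might look like the following sketch, which continues from the ROI helper above: intensities are summed over depth (the axial direction) so each scan line contributes one value, and contiguous runs of low values are flagged as rib shadow candidates. The relative threshold and minimum run width are illustrative assumptions.

```python
import numpy as np

def find_rib_shadow_runs(roi, rel_threshold=0.5, min_width=5):
    """Return (start, stop) column runs of the first ROI whose lateral
    projection falls below a threshold; low runs are rib shadow candidates.
    """
    lateral_projection = roi.sum(axis=0)          # sum along the axial direction
    threshold = rel_threshold * lateral_projection.mean()
    low = np.append(lateral_projection < threshold, False)

    runs, start = [], None
    for col, is_low in enumerate(low):
        if is_low and start is None:
            start = col                           # a run of shadow columns begins
        elif not is_low and start is not None:
            if col - start >= min_width:          # ignore spurious narrow dips
                runs.append((start, col))
            start = None
    return runs
```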
Returning to the method, the rib shadow region determined from the lateral projection may be used to automatically select a second ROI within the ultrasound image.
Continuing with the example, scan lines whose lateral projection intensities fall below a threshold value may be identified as a rib shadow region, and the lateral extent of that region may define the width of the second ROI.
Returning to the method, the second ROI may be analyzed to determine a location of a rib surface, for example, by computing an axial projection for the second ROI and detecting a peak in the axial projection, searching from the bottom of the image toward shallower depths.
Because the rib shadow found in the first ROI was used to select the second ROI, and the rib shadow is caused by the rib, in some applications, the probability of finding the rib within the second ROI may be higher than with other techniques for finding the rib surface that do not rely on the location of the rib shadow region. Furthermore, since the peak detection is performed starting at a greater depth and working toward shallower depths, there may be fewer false positives due to peaks caused by strong reflectors in the subcutaneous tissues and intercostal muscles.
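A sketch of the bottom-up peak search, under the same assumptions as the helpers above; the prominence rule (the peak must exceed values a few samples above and below it by a threshold) is an illustrative stand-in for the disclosed comparison of the peak-to-neighbor difference with a threshold value.

```python
import numpy as np

def find_rib_surface(image, shadow_run, neighbor_offset=5, min_prominence=20.0):
    """Return the depth row of the rib surface within the second ROI.

    The second ROI spans the full image depth over the shadow columns;
    searching bottom-up means bright superficial reflectors (skin,
    intercostal muscle) are encountered last, reducing false positives.
    """
    c0, c1 = shadow_run
    axial_projection = image[:, c0:c1].sum(axis=1)   # sum along the lateral direction

    for row in range(len(axial_projection) - 1 - neighbor_offset, neighbor_offset, -1):
        value = axial_projection[row]
        if (value - axial_projection[row + neighbor_offset] >= min_prominence and
                value - axial_projection[row - neighbor_offset] >= min_prominence):
            return row                               # first strong peak from the bottom
    return None                                      # no rib surface found in this run
```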
A third ROI may be selected within the ultrasound image based on the determined depth of the rib surface as indicated by block 410. In some examples, the third ROI may extend from the determined depth of the rib surface to a pre-determined depth below the rib surface. In typical adults, the pleural line is around 0.5 cm below the rib surface. Accordingly, in some examples, the pre-determined depth may be 1 cm. This may provide a margin for biological variability while still limiting the region in which to search for the pleural line. Limiting the search region may reduce computing time and/or reduce false positives in some applications. In some examples, the third ROI may further extend to a depth slightly above the rib surface (e.g., 0.2 cm). The relationship of the third ROI to the ultrasound image, the first ROI, and the second ROI is illustrated in the accompanying drawings.
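In code, the third-ROI selection reduces to a small depth-window computation around the rib surface; the 0.2 cm upward margin and 1 cm downward extent in this sketch mirror the example values above.

```python
def third_roi_rows(rib_row, px_per_cm, up_cm=0.2, down_cm=1.0):
    """Depth window (top row, bottom row) for the pleural line search,
    anchored at the detected rib surface row."""
    top = max(rib_row - int(round(up_cm * px_per_cm)), 0)
    bottom = rib_row + int(round(down_cm * px_per_cm))
    return top, bottom
```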
As shown in the example image, the third ROI extends from slightly above the rib surface to approximately 1 cm below it, encompassing the expected depth of the pleural line.
Returning to the method, the third ROI may be analyzed to determine the pleural line. In some examples, an axial projection may be computed for the third ROI and one or more peaks may be detected in the axial projection, where individual ones of the peaks correspond to candidate pleural lines.
Within the third ROI, a motion map is generated from a series of ultrasound images. The series of ultrasound images may include the ultrasound image used to select the ROI. The ultrasound image may be one frame of a multi-frame acquisition (e.g., a cine loop or movie) in which all frames are acquired by an ultrasound probe at the same (or approximately the same) location and orientation at different times. The series of ultrasound images used to generate the motion map may be frames temporally adjacent to one another. In some examples, the ultrasound image may be the first frame, the last frame, or a middle frame of the series of frames used to generate the motion map. In some examples, nine to sixteen adjacent frames may be used to generate the motion map. In other examples, more or fewer frames may be used.
For each pair of adjacent frames, in the third ROI, the magnitudes of the intensity differences between corresponding pixels are calculated. Differences in pixel intensity between adjacent frames may correlate to motion between the frames. The magnitudes of the differences for individual pixels are summed across all of the frames to compute a total magnitude value. Thus, each pixel in the ultrasound image may be associated with a total magnitude value. Similar to pixels being assigned a color or grayscale value based on the intensity value associated with the pixel in a B-mode image, a color or grayscale value may be assigned to each pixel based on the total magnitude value associated with the pixel. This color or grayscale mapping may be used to generate a motion map. Regions of greater motion may be associated with larger total magnitude values than regions of lesser motion. In some examples, even though the values of the motion map are calculated, the motion map may not be provided on a display.
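This accumulation of absolute inter-frame differences is straightforward to express directly; the sketch below follows the description above, with the frame stack assumed to be cropped to the third ROI.

```python
import numpy as np

def motion_map(frames):
    """Per-pixel total magnitude of inter-frame intensity differences.

    frames: (n_frames, rows, cols) array of temporally adjacent B-mode
    frames cropped to the third ROI (e.g., 9 to 16 frames). Larger output
    values indicate more motion, e.g., lung sliding below the pleura.
    """
    frames = frames.astype(np.float32)
    total = np.zeros(frames.shape[1:], dtype=np.float32)
    for prev, curr in zip(frames[:-1], frames[1:]):
        total += np.abs(curr - prev)    # magnitude of the intensity difference
    return total
```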
As indicated by block 414 of the method, the pleural line may be determined from the candidate pleural lines based, at least in part, on the motion map computed for the third ROI.
In some applications, using motion to determine the pleural line from multiple candidate pleural lines may allow detection of the pleural line when it is not the brightest line in the region. For example, fibers may be present in the vicinity of the pleural line in some lung conditions. Depending on their orientation and/or other properties, the fibers may reflect the ultrasound signal more strongly than the pleural line. Thus, fibers may cause false positives in some cases when brightness is the sole criterion for assessing candidate pleural lines.
Optionally, in some examples, additional criteria may be used to determine the pleural line from multiple candidate pleural lines. For example, as previously noted, in adults, the pleural line is typically around 0.5 cm below the surface of the rib. In some examples, candidate pleural lines that are closer to this depth may be favored and/or selected over pleural lines further away from this depth. In some examples, the local brightness of the candidate pleural line may be used (e.g., the height of the peak relative to its surrounding values on the axial projection curve). Pleural lines with greater local brightness may be favored and/or selected over pleural lines with lower local brightness.
In some applications, one or more of the additional criteria may be required to determine the pleural line from candidate pleural lines. While the difference in motion above and below the pleural line may typically be a reliable indicator of the “true” pleural line, little to no lung sliding motion may be present in certain conditions. For example, in pneumothorax (PTX), lung sliding may not be present. When little to no motion is detected in the motion map, the pleural line may be determined based on depth and/or local brightness. In some examples, if no candidate pleural lines are found near the depth and/or have a local brightness equal to and/or above a threshold value, it may indicate that the pleural line cannot be determined from the candidate pleural lines. In some examples, the depth and/or local brightness may be used to eliminate some or all of the candidate pleural lines even when motion is present.
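Putting these criteria together, the following sketch scores each candidate row (a peak in the third-ROI axial projection) by the contrast between the average motion below and above it, and falls back to the expected depth below the rib surface when little or no lung sliding is present (e.g., possible pneumothorax). The motion-contrast threshold and the fallback rule are illustrative assumptions; local brightness could be incorporated analogously.

```python
import numpy as np

def select_pleural_line(motion, candidate_rows, expected_row, min_contrast=1.0):
    """Pick the pleural line row from candidates within the third ROI.

    motion: 2D motion map for the third ROI.
    candidate_rows: depth indices of axial-projection peaks (candidates).
    expected_row: row about 0.5 cm below the rib surface (typical adult).
    """
    if not candidate_rows:
        return None

    best_row, best_score = None, -np.inf
    for row in candidate_rows:
        above = motion[:row, :].mean() if row > 0 else 0.0
        below = motion[row + 1:, :].mean() if row + 1 < motion.shape[0] else 0.0
        score = below - above           # lung sliding lies below the pleura
        if score > best_score:
            best_row, best_score = row, score

    if best_score < min_contrast:
        # Little or no detected sliding: fall back to the candidate closest
        # to the expected depth below the rib surface.
        best_row = min(candidate_rows, key=lambda r: abs(r - expected_row))
    return best_row
```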
As described in block 416 of the method, the ultrasound image may be displayed with a visual indicator of the detected pleural line (e.g., a color indicator) overlaid on the image.
Optionally, once the pleural line has been determined using the technique summarized above, the pleural line may be further analyzed to determine one or more of its properties, such as a width, a thickness, and/or a smoothness of the pleural line.
In some embodiments, the transducer array 1114 may be coupled to a microbeamformer 1116, which may be located in the ultrasound probe 1112, and which may control the transmission and reception of signals by the transducer elements in the array 1114. In some embodiments, the microbeamformer 1116 may control the transmission and reception of signals by active elements in the array 1114 (e.g., an active subset of elements of the array that define the active aperture at any given time).
In some embodiments, the microbeamformer 1116 may be coupled, e.g., by a probe cable or wirelessly, to a transmit/receive (T/R) switch 1118, which switches between transmission and reception and protects the main beamformer 1122 from high energy transmit signals. In some embodiments, for example in portable ultrasound systems, the T/R switch 1118 and other elements in the system can be included in the ultrasound probe 1112 rather than in the ultrasound system base, which may house the image processing electronics. An ultrasound system base typically includes software and hardware components including circuitry for signal processing and image data generation as well as executable instructions for providing a user interface (e.g., processing circuitry 1150 and user interface 1124).
The transmission of ultrasonic signals from the transducer array 1114 under control of the microbeamformer 1116 is directed by the transmit controller 1120, which may be coupled to the T/R switch 1118 and a main beamformer 1122. The transmit controller 1120 may control the direction in which beams are steered. Beams may be steered straight ahead from (orthogonal to) the transducer array 1114, or at different angles for a wider field of view. The transmit controller 1120 may also be coupled to a user interface 1124 and receive input from the user's operation of a user control. The user interface 1124 may include one or more input devices such as a control panel 1152, which may include one or more mechanical controls (e.g., buttons, encoders, etc.), touch sensitive controls (e.g., a trackpad, a touchscreen, or the like), and/or other known input devices.
In some embodiments, the partially beamformed signals produced by the microbeamformer 1116 may be coupled to a main beamformer 1122 where partially beamformed signals from individual patches of transducer elements may be combined into a fully beamformed signal. In some embodiments, microbeamformer 1116 is omitted, and the transducer array 1114 is under the control of the main beamformer 1122 which performs all beamforming of signals. In embodiments with and without the microbeamformer 1116, the beamformed signals of the main beamformer 1122 are coupled to processing circuitry 1150, which may include one or more processors (e.g., a signal processor 1126, a B-mode processor 1128, a Doppler processor 1160, and one or more image generation and processing components 1168) configured to produce an ultrasound image from the beamformed signals (e.g., beamformed RF data).
The signal processor 1126 may be configured to process the received beamformed RF data in various ways, such as bandpass filtering, decimation, I and Q component separation, and harmonic signal separation. The signal processor 1126 may also perform additional signal enhancement such as speckle reduction, signal compounding, and noise elimination. The processed signals (also referred to as I and Q components or IQ signals) may be coupled to additional downstream signal processing circuits for image generation. The IQ signals may be coupled to a plurality of signal paths within the system, each of which may be associated with a specific arrangement of signal processing components suitable for generating different types of image data (e.g., B-mode image data, Doppler image data). For example, the system may include a B-mode signal path 1158 which couples the signals from the signal processor 1126 to a B-mode processor 1128 for producing B-mode image data.
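As one concrete illustration of the I and Q component separation mentioned above, a common baseband demodulation sketch is shown below: the beamformed RF line is mixed down by the transmit frequency, low-pass filtered, and decimated. The filter length, cutoff, and decimation factor are illustrative assumptions, not parameters of the disclosed system.

```python
import numpy as np
from scipy.signal import firwin, lfilter

def rf_to_iq(rf, fs, f0, decim=4, numtaps=64):
    """Demodulate one beamformed RF line (fast time) to baseband IQ.

    rf: 1D real RF samples at sampling rate fs (Hz).
    f0: transmit center frequency (Hz).
    """
    t = np.arange(rf.size) / fs
    mixed = rf * np.exp(-2j * np.pi * f0 * t)   # shift the signal band to baseband
    lowpass = firwin(numtaps, cutoff=f0 / 2, fs=fs)
    iq = lfilter(lowpass, 1.0, mixed)           # suppress the double-frequency term
    return iq[::decim]                          # decimate the oversampled baseband
```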
The B-mode processor can employ amplitude detection for the imaging of structures in the body. The signals produced by the B-mode processor 1128 may be coupled to a scan converter 1130 and/or a multiplanar reformatter 1132. The scan converter 1130 may be configured to arrange the echo signals from the spatial relationship in which they were received to a desired image format. For instance, the scan converter 1130 may arrange the echo signal into a two dimensional (2D) sector-shaped format, or a pyramidal or otherwise shaped three dimensional (3D) format. The multiplanar reformatter 1132 can convert echoes which are received from points in a common plane in a volumetric region of the body into an ultrasonic image (e.g., a B-mode image) of that plane, for example as described in U.S. Pat. No. 6,443,896 (Detmer). The scan converter 1130 and multiplanar reformatter 1132 may be implemented as one or more processors in some embodiments.
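Amplitude detection itself is envelope extraction followed by log compression for display; a minimal sketch follows (the 60 dB dynamic range is an illustrative choice, and a nonzero signal is assumed).

```python
import numpy as np

def bmode_from_iq(iq, dynamic_range_db=60.0):
    """Envelope-detect and log-compress IQ data into 8-bit B-mode values."""
    envelope = np.abs(iq)                        # amplitude detection
    db = 20.0 * np.log10(envelope / envelope.max() + 1e-10)
    # Map [-dynamic_range_db, 0] dB onto [0, 255] grayscale.
    scaled = np.clip((db + dynamic_range_db) / dynamic_range_db, 0.0, 1.0)
    return (scaled * 255).astype(np.uint8)
```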
A volume renderer 1134 may generate an image (also referred to as a projection, render, or rendering) of the 3D dataset as viewed from a given reference point, e.g., as described in U.S. Pat. No. 6,530,885 (Entrekin et al.). The volume renderer 1134 may be implemented as one or more processors in some embodiments. The volume renderer 1134 may generate a render, such as a positive render or a negative render, by any known or future known technique such as surface rendering and maximum intensity rendering.
In some embodiments, the system may include a Doppler signal path 1162 which couples the output from the signal processor 1126 to a Doppler processor 1160. The Doppler processor 1160 may be configured to estimate the Doppler shift and generate Doppler image data. The Doppler image data may include color data which is then overlaid with B-mode (i.e., grayscale) image data for display. The Doppler processor 1160 may be configured to filter out unwanted signals (i.e., noise or clutter associated with non-moving tissue), for example using a wall filter. The Doppler processor 1160 may be further configured to estimate velocity and power in accordance with known techniques. For example, the Doppler processor may include a Doppler estimator such as an auto-correlator, in which velocity (Doppler frequency, spectral Doppler) estimation is based on the argument of the lag-one autocorrelation function and Doppler power estimation is based on the magnitude of the lag-zero autocorrelation function. Motion can also be estimated by known phase-domain (for example, parametric frequency estimators such as MUSIC, ESPRIT, etc.) or time-domain (for example, cross-correlation) signal processing techniques. The velocity and/or power estimates may then be mapped to a desired range of display colors in accordance with a color map. The color data, also referred to as Doppler image data, may then be coupled to the scan converter 1130, where the Doppler image data may be converted to the desired image format and overlaid on the B-mode image of the tissue structure to form a color Doppler or a power Doppler image.
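For reference, a minimal sketch of the lag-one autocorrelation ("Kasai") estimator described above, operating on the slow-time IQ ensemble at one sample volume: velocity follows from the argument of the lag-one autocorrelation, and Doppler power from the magnitude of the lag-zero autocorrelation. The parameter names are illustrative.

```python
import numpy as np

def kasai_velocity_power(iq_ensemble, prf, f0, c=1540.0):
    """Autocorrelation Doppler estimates for one sample volume.

    iq_ensemble: complex slow-time samples, shape (n_pulses,).
    prf: pulse repetition frequency (Hz); f0: transmit frequency (Hz);
    c: assumed speed of sound (m/s).
    """
    r1 = np.mean(iq_ensemble[1:] * np.conj(iq_ensemble[:-1]))  # lag-one autocorrelation
    r0 = np.mean(np.abs(iq_ensemble) ** 2)                     # lag-zero (Doppler power)
    doppler_freq = prf * np.angle(r1) / (2.0 * np.pi)          # mean Doppler shift
    velocity = c * doppler_freq / (2.0 * f0)                   # axial velocity estimate
    return velocity, r0
```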
Outputs from the scan converter 1130, the multiplanar reformatter 1132, and/or the volume renderer 1134 may be coupled to an image processor 1136 for further enhancement, buffering and temporary storage before being displayed on an image display 1138. A graphics processor 1140 may generate graphic overlays for display with the images. These graphic overlays can contain, e.g., standard identifying information such as patient name, date and time of the image, imaging parameters, and the like. For these purposes the graphics processor 1140 may be configured to receive input from the user interface 1124, such as a typed patient name or other annotations (e.g., labels, bodymarkers). The user interface 1124 can also be coupled to the multiplanar reformatter 1132 for selection and control of a display of multiple multiplanar reformatted (MPR) images.
The system 1100 may include local memory 1142. Local memory 1142 may be implemented as any suitable non-transitory computer readable medium (e.g., flash drive, disk drive). Local memory 1142 may store data generated by the system 1100 including ultrasound images, executable instructions, imaging parameters, or any other information necessary for the operation of the system 1100. In some examples, local memory 1142 may include multiple memories, which may be the same or of different type. For example, local memory 1142 may include a dynamic random access memory (DRAM) and a flash memory.
As mentioned previously system 1100 includes user interface 1124. User interface 1124 may include display 1138 and control panel 1152. The display 1138 may include a display device implemented using a variety of known display technologies, such as LCD, LED, OLED, or plasma display technology. In some embodiments, display 1138 may comprise multiple displays. The control panel 1152 may be configured to receive user inputs (e.g., exam type, patient parameters). The control panel 1152 may include one or more hard controls (e.g., buttons, knobs, dials, encoders, mouse, trackball or others). In some embodiments, the control panel 1152 may additionally or alternatively include soft controls (e.g., GUI control elements or simply, GUI controls) provided on a touch sensitive display. In some embodiments, display 1138 may be a touch sensitive display that includes one or more soft controls of the control panel 1152.
In some embodiments, various components shown as separate elements of the system may be combined, and/or the functionality of one or more components may be implemented by a shared processor or circuit.
According to examples of the present disclosure, the system 1100 may be used to perform an ultrasound lung scan to acquire one or more ultrasound images of a lung of a subject, such as ultrasound image 700. For example ultrasound probe 1112 may acquire one or more ultrasound images. In some examples, the system 1100 may acquire a series of temporally spaced ultrasound images, such as those used to compute a motion map, such as motion map 904. In some examples, the user interface 1124 may receive user inputs, such as an indication that a pleural line should be detected in the acquired ultrasound image, and/or information relating to the subject that may alter a predetermined depth (e.g., age, weight).
In some examples, one or more of the processors of system 1100 may perform the portions of the technique described in blocks 402-414 of the method described above.
The processor 1200 may include one or more cores 1202. The core 1202 may include one or more arithmetic logic units (ALU) 1204. In some embodiments, the core 1202 may include a floating point logic unit (FPLU) 1206 and/or a digital signal processing unit (DSPU) 1208 in addition to or instead of the ALU 1204.
The processor 1200 may include one or more registers 1212 communicatively coupled to the core 1202. The registers 1212 may be implemented using dedicated logic gate circuits (e.g., flip-flops) and/or any memory technology. In some embodiments, the registers 1212 may be implemented using static memory. The registers may provide data, instructions, and addresses to the core 1202.
In some embodiments, processor 1200 may include one or more levels of cache memory 1210 communicatively coupled to the core 1202. The cache memory 1210 may provide computer-readable instructions to the core 1202 for execution. The cache memory 1210 may provide data for processing by the core 1202. In some embodiments, the computer-readable instructions may have been provided to the cache memory 1210 by a local memory, for example, local memory attached to the external bus 1216. The cache memory 1210 may be implemented with any suitable cache memory type, for example, metal-oxide semiconductor (MOS) memory such as static random access memory (SRAM), dynamic random access memory (DRAM), and/or any other suitable memory technology.
The processor 1200 may include a controller 1214, which may control input to the processor 1200 from other processors and/or components included in a system (e.g., the control panel 1152 and scan converter 1130 described above).
The registers 1212 and the cache memory 1210 may communicate with controller 1214 and core 1202 via internal connections 1220A, 1220B, 1220C and 1220D. Internal connections may be implemented as a bus, multiplexor, crossbar switch, and/or any other suitable connection technology. Inputs and outputs for the processor 1200 may be provided via a bus 1216, which may include one or more conductive lines. The bus 1216 may be communicatively coupled to one or more components of processor 1200, for example the controller 1214, cache memory 1210, and/or register 1212. The bus 1216 may be coupled to one or more components of the system, such as display 1138 and control panel 1152 mentioned previously.
The bus 1216 may be coupled to one or more external memories. The external memories may include Read Only Memory (ROM) 1232. ROM 1232 may be a masked ROM, Electronically Programmable Read Only Memory (EPROM) or any other suitable technology. The external memory may include Random Access Memory (RAM) 1233. RAM 1233 may be a static RAM, battery backed up static RAM, Dynamic RAM (DRAM) or any other suitable technology. The external memory may include Electrically Erasable Programmable Read Only Memory (EEPROM) 1235. The external memory may include Flash memory 1234. The external memory may include a magnetic storage device such as disc 1236. In some embodiments, the external memories may be included in a system, such as the ultrasound imaging system 1100 described above.
At block 1302, “automatically selecting a first ROI within an image acquired from a lung of a subject based, at least in part, on a depth of the image” may be performed. In some examples, the image may be an ultrasound image. In some examples, the image may have been acquired by an ultrasound probe, such as ultrasound probe 1112 of system 1100. In some examples, the first region of interest extends from a pre-determined depth to the depth (e.g., bottom) of the image. For example, the first ROI may extend from [(Depth)−(Pre-determined depth)] to [Depth], as described above.
At block 1304, “analyzing the first ROI to determine at least one rib shadow region in the image” may be performed. In some examples, the analyzing may be performed by the at least one processor. In some examples, analyzing the first ROI may include computing, with at least one processor, a lateral projection for the first region of interest, for example, as described above.
At block 1306, “based, at least in part, on the rib shadow region, automatically selecting a second ROI within the image” may be performed. In some examples, the selecting may be performed by the at least one processor. In some examples, the second region of interest extends from a top of the image to the depth of the image and extends across a portion of a width of the image based on a comparison of lateral intensities of the lateral projection to a threshold value, as described above.
At block 1308, “analyzing the second ROI to determine a location of a rib surface in the image” may be performed. In some examples, the analyzing may be performed by the at least one processor. In some examples, analyzing the second ROI includes computing, with at least one processor, an axial projection for the second region of interest and detecting, with the at least one processor, a peak in the axial projection. In some examples, the peak is detected based, at least in part, on a comparison of a difference between a peak value and a neighboring value with a threshold value. For example, the peak may be detected by searching the axial projection from the bottom of the image toward shallower depths, as described above.
At block 1310, “based, at least in part, on the location of the rib surface, automatically selecting a third ROI within the image” may be performed. In some examples, the selecting may be performed by the at least one processor. In some examples, the third region of interest extends from a first depth in the image corresponding to a location of the peak to a second depth greater than the first depth, as described above.
At block 1312, “analyzing the third ROI” may be performed. In some examples, the analyzing may be performed by the at least one processor. In some examples, analyzing the third ROI may include computing, with at least one processor, an axial projection for the third region of interest, detecting, with the at least one processor, one or more peaks in the axial projection, wherein individual ones of the one or more peaks correspond to a corresponding candidate pleural line, and computing, with the at least one processor, a motion map for the third region of interest. The motion map may be computed as described above.
At block 1314, “determining the pleural line based on the analyzing of the third ROI” may be performed. In some examples, the determining may be performed by the at least one processor. In some examples, determining the pleural line may include for individual ones of the corresponding candidate pleural lines: calculating an average motion above the candidate pleural line and an average motion below the candidate pleural line from the motion map and calculating a difference between the average motion above and the average motion below the candidate pleural line. Determining the pleural line may further include determining the candidate pleural line having a greatest difference between the average motion above and the average motion below and selecting the candidate pleural line having the greatest difference as the pleural line.
Optionally, method 1300 may further include “analyzing the pleural line to determine a width, a thickness, a smoothness, or a combination thereof, of the pleural line” at block 1316. In some examples, the analyzing may be performed by the at least one processor.
Optionally, method 1300 may further include, “displaying the image on a display with a visual indicator of the pleural line overlaid on the image” as indicated by block 1318. In some examples, the display may include display 1138. In some examples, the visual indicator may be generated by the at least one processor, such as graphics processor 1140.
While the examples of the present disclosure have highlighted techniques for determining (e.g., identifying, detecting) a pleural line within an image, the techniques disclosed herein may also be used to determine a rib and/or a surface of a rib in the image. In these examples, blocks 1302-1308 of method 1300 may be performed. Optionally, in some examples, instead of showing a visual indicator of the pleural line, a visual indicator of the rib and/or the rib surface may be displayed overlaid on the image.
An automated system and method for detecting and quantifying the pleural line using ultrasound imaging systems in emergency cases of acute respiratory or thoracic diseases is disclosed herein. The system and method may be used in pre-hospital settings, for initial evaluation in the emergency room, and/or for follow-up after treatment. The systems and methods may be applicable to all ultrasound imaging systems, especially in point-of-care applications ranging from trauma to surgical use, and may be used in a variety of settings including the ambulance, emergency room, critical care, and surgery. The systems and methods disclosed herein may improve automated detection and analysis of the pleural line without requiring access to RF data or the high computing demands of AI approaches in some applications.
In various embodiments where components, systems and/or methods are implemented using a programmable device, such as a computer-based system or programmable logic, it should be appreciated that the above-described systems and methods can be implemented using any of various known or later developed programming languages, such as “C”, “C++”, “C#”, “Java”, “Python”, and the like. Accordingly, various storage media, such as magnetic computer disks, optical disks, electronic memories and the like, can be prepared that can contain information that can direct a device, such as a computer, to implement the above-described systems and/or methods. Once an appropriate device has access to the information and programs contained on the storage media, the storage media can provide the information and programs to the device, thus enabling the device to perform functions of the systems and/or methods described herein. For example, if a computer disk containing appropriate materials, such as a source file, an object file, an executable file or the like, were provided to a computer, the computer could receive the information, appropriately configure itself and perform the functions of the various systems and methods outlined in the diagrams and flowcharts above to implement the various functions. That is, the computer could receive various portions of information from the disk relating to different elements of the above-described systems and/or methods, implement the individual systems and/or methods and coordinate the functions of the individual systems and/or methods described above.
In view of this disclosure it is noted that the various methods and devices described herein can be implemented in hardware, software, and firmware. Further, the various methods and parameters are included by way of example only and not in any limiting sense. In view of this disclosure, those of ordinary skill in the art can implement the present teachings in determining their own techniques and needed equipment to effect these techniques, while remaining within the scope of the invention. The functionality of one or more of the processors described herein may be incorporated into a fewer number of processing units or a single processing unit (e.g., a CPU) and may be implemented using application specific integrated circuits (ASICs) or general purpose processing circuits which are programmed responsive to executable instructions to perform the functions described herein.
Of course, it is to be appreciated that any one of the examples, embodiments or processes described herein may be combined with one or more other examples, embodiments and/or processes or be separated and/or performed amongst separate devices or device portions in accordance with the present systems, devices and methods.
Finally, the above-discussion is intended to be merely illustrative of the present system and should not be construed as limiting the appended claims to any particular embodiment or group of embodiments. Thus, while the present system has been described in particular detail with reference to exemplary embodiments, it should also be appreciated that numerous modifications and alternative embodiments may be devised by those having ordinary skill in the art without departing from the broader and intended spirit and scope of the present system as set forth in the claims that follow. Accordingly, the specification and drawings are to be regarded in an illustrative manner and are not intended to limit the scope of the appended claims.
Number | Date | Country | Kind
--- | --- | --- | ---
PCT/CN2022/071512 | Jan 2022 | WO | international
22168408.7 | Apr 2022 | EP | regional
Filing Document | Filing Date | Country | Kind
--- | --- | --- | ---
PCT/EP2023/050444 | 1/10/2023 | WO |