This application claims priority to European Patent Application No. 23166161.2, filed Mar. 31, 2023, the entirety of which is incorporated herein by reference.
Embodiments of the invention relate generally to methods, devices, systems and apparatuses for medical imaging, in particular intra-operative imaging of tissues of a patient's body.
Intra-operative medical imaging systems, such as endoscopic or laparoscopic imaging systems, typically comprise camera systems adapted to be placed close to or within a patient's body for capturing live images during a medical procedure or surgery. Typically, the live images are captured within a visible light range, such as wavelengths in the range of 400-780 nanometers. In the tissue of the patient's body, fluorescent features may be present that emit fluorescent light when excited by excitation light. The fluorescent light emitted by the fluorescent features may be captured in the live images; however, the fluorescent light is usually weak in the live images. Accordingly, intra-operative medical imaging systems typically comprise fluorescence camera sensors for capturing fluorescence images in the wavelength range in which at least a major part of the light emitted by the fluorescent features is expected to lie, such that the fluorescent features are more clearly visible. The wavelength of the fluorescent light may be situated at least partly in the visible range and at least partly in the infrared range, such as in a wavelength range of 780 nm to 1000 nm.
U.S. Pat. No. 10,694,152 B2 describes an endoscopic video system with a camera comprising a single color image sensor for fluorescence and color imaging. The tissue under observation is illuminated continuously with fluorescence excitation light as well as periodically with visible light comprising wavelengths outside the fluorescence excitation wavelength range. The fluorescence and color images are displayed simultaneously which can be inconvenient for the person performing the operation, who must alternate between the fluorescence and the color images in order to gather all details in both images.
US patent application US 2019/0167083 A1 describes an endoscopic system which includes a light source apparatus configured to emit excitation light for exciting a fluorescent medical agent and illumination light. The endoscopic system includes an image sensor configured to capture an image of fluorescent light emitted in response to the excitation light from the fluorescent medical agent and an image of the reflected light. A combined image is generated by mixing a first image of the reflected light with a second image of the fluorescent light in a set of non-overlapping pixel arrays.
Further endoscopic camera systems can be found, for instance, in JP 2012-085917 A, JP 2012-085916 A and JP 2003-070721 A.
In medical imaging, such as intra-operative imaging, the signals of the fluorescence images are usually weak and noisy compared to live images. Due to the comparatively low signal intensity and high noise intensity, the surgical staff may miss important details in the fluorescence images because the visibility of the details may be affected by the noise.
US 2008/0204600 A1 describes an adaptive temporal noise reduction method that adaptively combines motion adaptive filtering results and motion compensated results in order to reduce Gaussian additive noise in video sequences.
US 2017/0281093 A1 further describes a method for de-noising an image stream of a moving structure in a medical device. In the method, the movement of the structure is measured by comparing two images each representing the same portion of the moved structure.
Moreover, there are approaches that reduce noise in images by frame addition. In this approach, a plurality of successive frames are added together to form a filtered image that includes a reduced level of noise. This approach can be found in JP 2012-085917 A, JP 2012-085916 A, EP 3469420 A1 and US 2021/0052226 A1. In JP 2012-085917 A, the number of the added frames depends on the motion detected.
Further prior art documents can be found in EP 3834407 A1, U.S. Pat. No. 5,667,474 A, US 2019/0167083 A1, JP H07-250804 A and JP 2003-070721 A.
However, generating overlay images from fluorescence live images and live images with a good visibility remains a challenge. Usually, the boundaries of fluorescence features are not clearly visible so that there is an increased risk that the person operating the imaging device will miss features, in particular of thin, delicate features, in the overlay image. On the other hand, strong fluorescence signals can result in a saturation of the overlay images so that details from the live images may no longer be visible due to strong fluorescence features.
Therefore, it is an objective of embodiments of the present invention to provide an improved medical imaging apparatus and method in which the visualization of fluorescent images in an overlay image is enhanced.
The objective of embodiments of the present invention is achieved by the device for medical imaging according to claim 1 and the method for medical imaging according to claim 15.
Embodiments of the inventive device for medical imaging of tissue of a patient's body, in particular for intra-operative imaging, comprise an illumination unit, an imaging unit, a processing unit and a display unit.
The illumination unit is configured to illuminate an area of the tissue by emitting illumination light that includes at least excitation light that excites fluorescent features in the tissue to emit fluorescent light. The illumination light may be in the visible range of light, whereas the excitation light may be in the range of visible light or also in the range of invisible light, such as ultraviolet and infrared light. The excitation light may—for instance—comprise wavelengths that are larger than 500 nanometers or larger than 600 nanometers. For example, the wavelength of the excitation light may range from 600 to 800 nanometers, preferably 600 to 900 nanometers or 650 to 1350 nanometers, i.e. in the near infrared range. The excitation light may—for example—also comprise wavelengths that are at least 1400 nanometers, at least 3000 nanometers, at least 8000 nanometers or at least 15000 nanometers. The excitation light may be pulsed or continuous.
The fluorescent features in the tissue may—for instance—be fluorophores which are present in the tissue. For instance, the hemoglobin in blood vessels may be excited to fluoresce by the excitation light. The fluorescent features may also result from fluorescent dyes injected into the tissue or marker devices inserted into the patient's body.
The imaging unit is configured to capture fluorescent images and corresponding live images of the illuminated area. In particular, the imaging unit may comprise an image separation unit configured to separate the fluorescent images from the live images such that the imaging unit is configured to capture fluorescent images and separate live images of the illuminated area at least substantially at the same time. For example, the image separation unit can be a dichroic prism that splits the light entering the imaging unit into a live image, which—for instance—contains light in the visible range, and a fluorescent image, which—for instance—contains light in a fluorescent wavelength range. The imaging unit may comprise separate sensors for capturing the fluorescent images and the live images.
The processing unit is configured to receive the fluorescent images and the live images from the imaging unit. The processing unit further includes a motion detection module, an adaptive motion filter module and a mapping module. The motion detection module, the adaptive motion filter module and the mapping module may be implemented as software and/or hardware modules.
The motion detection module is configured to determine motion between the tissue and the imaging unit. The adaptive motion filter module is configured to generate filtered fluorescent images by applying motion based adaptive filtering to the fluorescent images. The mapping module is configured to generate overlay images by non-linearly mapping the filtered fluorescent images into the live images.
For determining the motion between the tissue and the imaging unit, the motion detection module may evaluate changes in the image data of the imaging unit, for instance by computing and evaluating the optical flow or by comparing particularly significant image features between frames. Additionally or alternatively, the motion detection module may receive and evaluate motion data from a motion sensor, such as an inertial measurement unit (IMU), which can be located in the camera head.
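As an illustrative sketch of the image-based variant, a scalar motion value can be derived from the mean absolute difference between consecutive frames. The function name and the normalization of frames to [0, 1] are assumptions for illustration, not taken from the embodiments, which may instead use optical flow, feature tracking or IMU data:

```python
import numpy as np

def estimate_motion(prev_frame: np.ndarray, curr_frame: np.ndarray) -> float:
    """Return a scalar motion measure from two grayscale frames.

    A simple proxy for relative motion between tissue and imaging
    unit: the mean absolute intensity difference between consecutive
    frames. Frames are assumed to be float arrays normalized to
    [0, 1], so the result also lies in [0, 1].
    """
    diff = np.abs(curr_frame.astype(np.float64) - prev_frame.astype(np.float64))
    return float(diff.mean())
```

A static scene yields a motion value near zero, while large frame-to-frame changes drive the value toward one.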
Motion between the tissue and the imaging unit may be present due to movement of the imaging unit relative to the tissue, movement of the tissue relative to the imaging unit, or both combined.
Moreover, the display unit is configured to receive and display the overlay images. The display unit may also be configured to additionally receive and display the fluorescent images and the live images.
An aspect of embodiments of the present invention is based on the notion that the processing unit is configured to adaptively filter the fluorescent images by considering the motion between the tissue and the imaging unit. In the motion-based adaptive filtering, the processing unit may add previous and present fluorescent images, wherein at least one parameter for the addition of the fluorescent images depends on the motion determined by the motion detection module. The at least one parameter defines the influence of the present fluorescent image in relation to the previous filtered fluorescent image. In particular, if a comparably high motion value is determined, the at least one parameter is increased such that the present fluorescent image has a higher influence on the resulting filtered fluorescent image than the previous filtered fluorescent images. In contrast, if the determined motion value is relatively low, the at least one parameter is decreased such that the influence of the previous filtered fluorescent images is higher than the influence of the present fluorescent image in the resulting filtered fluorescent image.
Another aspect of the embodiments of the present invention is based on the notion that the processing unit is further configured to generate an overlay image in which the filtered fluorescent images are mapped into the live images in the mapping module, wherein the mapping is non-linear. With the non-linear mapping, it is possible to amplify regions in the filtered fluorescent images with lower intensity values more than regions with comparably higher intensity values. This may result in an amplification of the details in the overlay images without the occurrence of over-saturation. Accordingly, combining the motion-based filtering with the non-linear mapping results in an amplification of the fluorescent details in the overlay images, while the noise present in the fluorescent images can be effectively suppressed. The mapping module may also restrict the output values in the non-linear mapping of the filtered fluorescent images to a maximal value so that details from the live images may remain visible even when the filtered fluorescent images have a comparatively large intensity value.
Fluorescent images can be images that collect only light in a predetermined, preferably narrow-band, wavelength range. The wavelength collected in the fluorescent images may be in the visible light range and/or the invisible light range. If the fluorescent images are in the visible light range, they may—for example—comprise wavelengths larger than 500 nanometers. If the fluorescent images are in the infrared light range, they may—for example—comprise wavelengths that are larger than 600 nanometers, e.g. ranging from 600 to 800 nanometers, 600 to 900 nanometers or 650 to 1350 nanometers (i.e. the near infrared (NIR) range). The infrared light may—for example—also comprise wavelengths that are at least 1400 nanometers, at least 3000 nanometers, at least 8000 nanometers or at least 15000 nanometers.
In particular, the generation of the filtered fluorescent images in the adaptive motion filter module involves a weighted averaging operation of a present fluorescent image and a previous filtered fluorescent image, wherein the weights in the averaging are adapted to the motion determined. This allows effective suppression of noise in the fluorescent images without filtering out the details of the fluorescent features. Furthermore, trailing effects due to motion in the fluorescent images can be avoided, since the weights in the average operation are adjusted to the motion.
Preferably, the weights in the weighted averaging operation comprise a first factor and a second factor, wherein the first factor determines the influence of the present fluorescent image in relation to the previous filtered fluorescent image and the second factor determines the amplification of the previous filtered fluorescent image.
In particular, the adaptive motion filter module is configured to successively generate a filtered fluorescent image for a plurality of time steps, wherein the filtered fluorescent image is calculated for a present time step as follows:

I_filt = α · I_fluo_pres + (1 - α) · β · I_filt_prev

wherein I_filt is the filtered fluorescent image, I_fluo_pres is the present fluorescent image, I_filt_prev is the previous filtered image, α is the first factor and β is the second factor. The filtered fluorescent image in the present time step preferably becomes the previous filtered fluorescent image in a next time step.
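Assuming the images are stored as floating-point arrays normalized to [0, 1], one time step of this recursive weighted average can be sketched as follows; the function name and the clipping to a valid intensity range are illustrative choices:

```python
import numpy as np

def adaptive_motion_filter(i_fluo_pres: np.ndarray,
                           i_filt_prev: np.ndarray,
                           alpha: float,
                           beta: float) -> np.ndarray:
    """One time step of the recursive weighted average.

    alpha weights the present fluorescent image against the previous
    filtered image; beta amplifies the previous filtered image. The
    returned image becomes i_filt_prev in the next time step.
    """
    i_filt = alpha * i_fluo_pres + (1.0 - alpha) * beta * i_filt_prev
    return np.clip(i_filt, 0.0, 1.0)  # keep intensities in a valid range
```

With alpha close to 1.0 (high motion) the present frame dominates, avoiding trailing effects; with alpha close to 0.0 (static scene) the temporal average suppresses noise.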
Preferably, the adaptive motion filter module is configured to adapt the first factor to the determined motion such that the first factor decreases when the motion decreases and the first factor increases when the motion increases. Accordingly, the first factor is proportional to the motion determined in the motion detection module.
In particular, the second factor is proportional to at least the first factor, preferably the second factor is also proportional to the previous filtered fluorescent image. For example, the second factor can be proportional to the brightness of the previous filtered fluorescent image. The lower the brightness of the previous filtered fluorescent image, the higher the value of the second factor. This leads to an enhancement of the brightness of the filtered image even in situations when the brightness of the present fluorescent image is rather low.
Preferably, the second factor is defined to be in a predetermined range, the predetermined range in particular being between 1.00 and 1.50, preferably between 1.00 and 1.30. Defining the second factor in a predetermined range leads to a stable amplification avoiding over-saturation of the fluorescent image signal. Accordingly, the second factor can be dynamically changed with the brightness of the filtered fluorescent images and the motion. Ensuring that the second factor is at least 1.00 can result in the adaptive motion filter module not attenuating the previous filtered fluorescent image signal.
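A minimal sketch of how the two factors could be derived, assuming the motion value and the brightness of the previous filtered image are both normalized to [0, 1]; the specific linear mappings, the function name and the clamping bounds are illustrative assumptions, not mandated by the embodiments:

```python
def adapt_factors(motion: float,
                  prev_brightness: float,
                  beta_min: float = 1.0,
                  beta_max: float = 1.3) -> tuple:
    """Derive the filter weights from the measured motion.

    alpha is proportional to the motion, so fast scenes favour the
    present frame. beta grows with alpha and with decreasing
    brightness of the previous filtered image, and is clamped to
    [beta_min, beta_max] so the amplification stays stable and
    never attenuates the previous filtered signal.
    """
    alpha = min(max(motion, 0.0), 1.0)
    beta = beta_min + (beta_max - beta_min) * alpha * (1.0 - prev_brightness)
    beta = min(max(beta, beta_min), beta_max)
    return alpha, beta
```

In a static, bright scene both factors stay low, yielding strong temporal averaging; under motion over dark fluorescence regions, the present frame is favoured and the previous filtered image is amplified.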
In particular, the processing unit further comprises a pre-processing noise filter module configured to:
Furthermore, the mapping module is preferably configured to apply a non-linear amplification function on the filtered fluorescent images in the mapping of the filtered fluorescent images into the live images. In doing so, the mapping module can generate an overlay image in which lower intensity values of the filtered fluorescent image can be displayed with a stronger value. This particularly amplifies the lower signal values while avoiding oversaturation of relatively high signal values.
Preferably, the non-linear amplification function is under-linear homogeneous. This means that the degree of homogeneity is below 1.00. The under-linear homogeneous amplification function may result in lower intensity values being mapped to a higher value while higher intensity values are mapped to a lower value, in particular when regarding the interval between 0.00 and 1.00 of the amplification function. For instance, the non-linear amplification function may be a square root function, a cubic root function, or a root function of a higher degree.
Additionally or alternatively, the amplification function can be a non-linear function that is specifically adapted to amplify lower intensity values and attenuate higher intensity values compared to a linear mapping.
Preferably, the value range of the non-linear amplification function is limited to a predetermined maximal value. The predetermined maximal value particularly has a value below 1.00, preferably below 0.80, 0.70 or 0.50, so that the live image is displayed in the background of the overlay image even when the intensity value of the filtered fluorescent image is close to the maximum value. This ensures that details in the live images remain visible in the overlay images, even when the fluorescence signal is comparably large.
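A capped square-root amplification with these properties can be sketched as follows; the cap of 0.70, the function name and the normalization to [0, 1] are illustrative assumptions:

```python
import numpy as np

def amplify(i_filt: np.ndarray, max_value: float = 0.7) -> np.ndarray:
    """Under-linear (square-root) amplification capped at max_value.

    Low intensities are boosted more than high ones, and the output
    never exceeds max_value, so the live image remains visible in
    the background of the overlay even for strong fluorescence.
    """
    return np.minimum(np.sqrt(np.clip(i_filt, 0.0, 1.0)), max_value)
```

For example, an intensity of 0.25 is lifted to 0.5, while a saturated intensity of 1.0 is limited to the cap of 0.7.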
In particular, the mapping module is configured to assign the intensity values of the mapped and filtered fluorescent images to different visually distinguishable colors. The colors should be distinguishable to the naked eye. This can further improve the visualization of the resulting color map for the operating person. For example, the distinguishable colors may be multiple of the following: blue, green, orange, yellow, red, violet and brown.
The overlay images are preferably calculated for every time step as follows:

I_over = I_live + color(γ(I_filt))

wherein γ is the amplification function, color is the color assigning operator, I_filt is the filtered fluorescent image, I_live is the corresponding live image and I_over is the overlay image. The color assigning operator can be a linear mapping operator.
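Assuming normalized RGB live images and a small set of illustrative color bands, the color assignment and the overlay blending could be sketched as follows; the color table, the function names and the additive blending with clipping are assumptions for illustration:

```python
import numpy as np

# Hypothetical visually distinguishable colors for intensity bands, low to high.
_COLORS = np.array([[0.0, 0.0, 1.0],   # blue
                    [0.0, 1.0, 0.0],   # green
                    [1.0, 1.0, 0.0],   # yellow
                    [1.0, 0.0, 0.0]])  # red

def color_map(values: np.ndarray) -> np.ndarray:
    """Assign each amplified intensity to one color band, scaled by intensity."""
    idx = np.clip((values * len(_COLORS)).astype(int), 0, len(_COLORS) - 1)
    return _COLORS[idx] * values[..., None]

def overlay(i_live: np.ndarray, i_filt_amplified: np.ndarray) -> np.ndarray:
    """Blend the color-mapped fluorescence signal into the RGB live image."""
    return np.clip(i_live + color_map(i_filt_amplified), 0.0, 1.0)
```

Where the fluorescence signal is zero, the overlay reduces to the unmodified live image; strong signals add a saturated color while clipping keeps the result displayable.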
In a preferred embodiment, the device further comprises an endoscope with a distal end and a proximal end between which an elongated shaft extends. An inertial measurement unit may be positioned at the distal end of the endoscope or the laparoscope. The illumination unit and/or the imaging unit can be positioned at the distal end and/or the proximal end of the endoscope. In case the illumination unit and the imaging unit are located at the distal end of the endoscope, the endoscope comprises signal lines in the elongated shaft for connecting at least the imaging unit with the processing unit. Moreover, the endoscope may comprise signal lines in the elongated shaft of the endoscope for controlling the illumination unit. In case the illumination unit and the imaging unit are located at the proximal end of the endoscope, the endoscope comprises optical channels and/or optical objectives for passing the illumination light, which contains the excitation light, from the proximal end to the distal end of the endoscope, as well as for passing the reflected and/or fluorescent light entering the optical objective at the distal end back to the proximal end of the endoscope.
Embodiments of the inventive method for medical imaging of tissue of a patient's body, in particular for intra-operative imaging, preferably with the above-described device, includes the method steps of illuminating an area of the tissue with an illumination unit by emitting illumination light containing at least excitation light which excites fluorescent features, such as fluorophores, in the tissue to emit fluorescent light, capturing fluorescent images and (separate) corresponding live images of the illuminated area with an imaging unit as well as processing the fluorescent images and the live images including the following steps:
All the features and advantages described with respect to embodiments of the inventive device are equally applicable to the inventive method.
Further details and advantages can be taken from the drawings, the respective description and from the claims as well.
Examples of the embodiments of the invention are illustrated in the drawings in which
In the displayed example of
The illumination unit 2 is configured to illuminate an area A of the tissue 21 by emitting illumination light 14 containing at least excitation light 15 which excites fluorophores in the tissue 21 to emit fluorescent light 16. In
The imaging unit 3 is configured to capture fluorescent images I_fluo and corresponding live images I_live of the illuminated area A. The imaging unit 3 is connected with the processing unit 4 so that the processing unit 4 is configured to receive the fluorescent images I_fluo and the live images I_live from the imaging unit 3.
The processing unit 4 is configured to process the fluorescent images I_fluo and the live images I_live in order to form overlay images I_over. The processing unit 4 and the display unit 8 are connected such that the display unit is configured to receive the overlay images I_over.
The display unit 8 also comprises a screen for at least displaying the overlay images. The display unit 8 can also be configured to receive the live images I_live, the fluorescent images I_fluo and/or the filtered images I_filt. Additionally, the display unit 8 can be configured to display the live images I_live, the fluorescent images I_fluo and/or the filtered fluorescent images I_filt.
The processing unit 4 comprises a motion detection module 5, an adaptive motion filter module 6 and a mapping module 7. Optionally, the processing unit 4 may also comprise a pre-processing noise filter module 9.
The motion detection module 5 is configured to determine motion between the tissue 21 and the imaging unit 3. The motion detection module 5 can be configured to determine motion in the fluorescent images I_fluo and/or the live images I_live, for example with image processing methods such as the optical flow operator, feature detection methods and/or tracking methods. Alternatively, the motion detection module 5 may be configured to determine the motion by using an external tracking system, a motion sensor or an integrated inertial measurement unit (IMU). In the illustrated example, however, the motion detection module 5 is configured to detect the motion in the images by image processing. Also, the alternative example with an IMU 38 is illustrated in
The adaptive motion filter module 6 is configured to generate filtered fluorescent images I_filt by applying motion based adaptive filtering to the fluorescent images I_fluo. In the adaptive motion filter module 6, the fluorescent images I_fluo are continuously averaged. After initializing the filtered fluorescent images I_filt with an empty image which contains only zeroes, the adaptive motion filter module 6 forms a weighted sum of a present fluorescent image I_fluo_pres and a previous filtered image I_filt_prev, wherein the weights in the averaging are adapted to the motion determined in the motion detection module 5.
The mapping module 7 is configured to generate overlay images I_over by non-linearly mapping the filtered fluorescent images I_filt into the live images I_live.
The adaptive motion filter module 6 and the mapping module 7 are discussed in more detail below.
The device 1 displayed in
The illumination unit 2 and the imaging unit 3 are both positioned at the proximal end 12 of the endoscope 10 in
On the upper right side in
On the middle left side of
I_filt = α · I_fluo_pres + (1 - α) · β · I_filt_prev

wherein I_filt is the filtered fluorescent image, I_fluo_pres is the present fluorescent image, I_filt_prev is the previous filtered image, α is the first factor and β is the second factor. The filtered fluorescent image I_filt of the present time step becomes the previous filtered image I_filt_prev in the next time step.
In the mapping module 7, a non-linear amplification function γ is applied on the filtered fluorescent image I_filt such that pixels with a lower intensity value are amplified more than pixels with a high intensity value. This may result in an amplification of the border regions 22 while an over-amplification, and therefore an over-saturation, of the pixels with high intensity values can be avoided. In the mapping module 7, the intensity values of the mapped and filtered fluorescent images I_filt are assigned to different visually distinguishable color groups. As shown in
On the lower end of
Moreover,
The present motion m_pres and the present fluorescent image I_fluo_pres are transmitted to the adaptive motion filter module 6. In the adaptive motion filter module 6, the present fluorescent image I_fluo_pres is added to a previous filtered fluorescent image I_filt_prev by using the first factor α and the second factor β. The first factor α is proportional to the present motion m_pres. Also, the second factor β is proportional to the first factor α. The resulting filtered fluorescent image I_filt is transmitted to the mapping module 7, which is configured to generate overlay images by non-linearly mapping the filtered fluorescent images I_filt into the live images I_live. For the non-linear mapping, the mapping module 7 uses a non-linear amplification function γ. The processing unit 4 has the resulting overlay images I_over as an output.
The invention relates to a method and a device 1 for medical imaging, in particular intra-operative imaging, of tissue 21 of a patient's body. The device 1 comprises the illumination unit 2, the imaging unit 3, the processing unit 4 and the display unit 8. The imaging unit captures fluorescent images I_fluo and live images I_live of the same scene. In the processing unit 4, the fluorescent images I_fluo are filtered based on the detected motion between the tissue 21 and the imaging unit 3. The filtered fluorescent images are also non-linearly mapped into the live images in order to enhance the visibility of fluorophores in the tissue, resulting in an overlay image that comprises details of the live images on the one hand and details of the fluorescent images on the other hand.