Systems and methods for smoke-reduction in images

Information

  • Patent Grant
  • 12223629
  • Patent Number
    12,223,629
  • Date Filed
    Wednesday, September 11, 2019
  • Date Issued
    Tuesday, February 11, 2025
Abstract
A method for smoke reduction in images includes accessing an RGB image of an object obscured by smoke, determining a dark channel matrix of the RGB image, estimating an atmospheric light matrix for the RGB image based on the dark channel matrix, determining a transmission map based on the atmospheric light matrix and the dark channel matrix, de-hazing the RGB image based on the transmission map to reduce the smoke in the RGB image, and displaying the de-hazed RGB image on a display device. The RGB image includes a plurality of pixels. The dark channel matrix includes, for each pixel of the plurality of pixels, a minimum color component intensity for a respective pixel area centered at the respective pixel. The atmospheric light matrix includes an atmospheric light component value for each of the plurality of pixels.
Description
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application is a U.S. National Stage Application filed under 35 U.S.C. § 371(a) claiming the benefit of and priority to International Patent Application No. PCT/CN2019/105311, filed Sep. 11, 2019, the entire disclosure of which is incorporated by reference herein.


FIELD

The present disclosure relates to devices, systems and methods for smoke-reduction in images, and more particularly, to smoke-reduction in images during surgical procedures.


BACKGROUND

Endoscopes are introduced through an incision or a natural body orifice to observe internal features of a body. Conventional endoscopes are used for visualization during endoscopic or laparoscopic surgical procedures. During such procedures, smoke may be generated when an energy-based surgical instrument is used, for example, to cut tissue with electrosurgical energy. The image acquired by the endoscope may thus become blurry because of this smoke. The smoke may obscure features of the surgical site and delay the surgical procedure while surgeons wait for the smoke to clear. Other procedures may experience similar issues where smoke is present during the capture of an image. Accordingly, there is interest in improving imaging technology.


SUMMARY

The present disclosure relates to devices, systems, and methods for smoke reduction in images. In accordance with aspects of the present disclosure, a method for smoke reduction in images includes accessing an RGB image of an object obscured by smoke, determining a dark channel matrix of the RGB image, estimating an atmospheric light matrix for the RGB image based on the dark channel matrix, determining a transmission map based on the atmospheric light matrix and the dark channel matrix, de-hazing the RGB image based on the transmission map to reduce the smoke in the RGB image, and displaying the de-hazed RGB image on a display device. The RGB image includes a plurality of pixels. The dark channel matrix includes, for each pixel of the plurality of pixels, a minimum color component intensity for a respective pixel area centered at the respective pixel. The atmospheric light matrix includes an atmospheric light component value for each of the plurality of pixels.


In an aspect of the present disclosure, the de-hazing the RGB image includes converting the RGB image to a YUV image, performing a de-hazing operation on the YUV image to provide a Y′UV image, and converting the Y′UV image to the de-hazed RGB image.


In an aspect of the present disclosure, the performing the de-hazing operation on the YUV image includes, for each pixel x of the plurality of pixels, determining Y′ as:

Y′(x) = (Y(x) − A(x)) / T(x),

where T(x) is a transmission component for the pixel x and A(x) is the atmospheric light component value for the pixel x.


In another aspect of the present disclosure, the performing the de-hazing operation on the YUV image further includes replacing a Y channel of the YUV image with the determined Y′ to provide a Y′UV image.


In an aspect of the present disclosure, the estimating the atmospheric light matrix includes, for each pixel x of the plurality of pixels: determining an atmospheric light component value for the pixel x as: A(x)=max(min(Ic(y)))*coef, for all y∈Ω(x), where Ω(x) is a pixel area centered at pixel x, y is a pixel of the pixel area Ω(x), Ic(y) is an intensity value of a color component c of the pixel y, and coef is a predetermined coefficient value.


In an aspect of the present disclosure, the determining the transmission map includes determining, for each pixel x of the plurality of pixels, a transmission component as:

T(x) = 1 − ω * I_DARK(x) / A(x),

where ω is a predetermined constant, I_DARK(x) is the dark channel matrix value for the pixel x, and A(x) is the atmospheric light component value for the pixel x.


In accordance with aspects of the present disclosure, a method for smoke reduction in images is presented. The method includes accessing an image obscured by smoke; for each pixel of the plurality of pixels: (i) determining a dark channel matrix value for the respective pixel as a minimum color component intensity value for a respective pixel area centered at the respective pixel, and (ii) estimating an atmospheric light component value for the respective pixel based on the minimum color component intensity value for each pixel of the pixel area; de-hazing the image based on the atmospheric light component values for the plurality of pixels; and displaying the de-hazed image on a display device. The image includes a plurality of pixels, where each pixel of the image includes a plurality of color components.


In a further aspect of the present disclosure, the de-hazing the image includes determining a transmission map value for each pixel x of the plurality of pixels as:

T(x) = 1 − ω * I_DARK(x) / A(x),

converting the image to a YUV image, determining Y′ for each pixel x of the plurality of pixels as:

Y′(x) = (Y(x) − A(x)) / T(x),

and replacing a Y channel of the YUV image with the determined Y′ to provide a Y′UV image, where ω is a predetermined constant, I_DARK(x) is the dark channel matrix value for the pixel x, and A(x) is the atmospheric light component value for the pixel x.


In an aspect of the present disclosure, the de-hazing the image further includes converting the Y′UV image to a de-hazed image.


In yet another aspect of the present disclosure, the image includes at least one of an RGB image, a CMYK image, a CIELAB image, or a CIEXYZ image.


In an aspect of the present disclosure, the estimating the atmospheric light matrix includes, for each pixel x of the plurality of pixels: determining an atmospheric light component value for the pixel x as A(x)=max(min(Ic(y)))*coef, for all y∈Ω(x), where Ω(x) is a pixel area centered at pixel x, y is a pixel of the pixel area Ω(x), Ic(y) is an intensity value of a color component c of the pixel y, and coef is a predetermined coefficient value.


In accordance with aspects of the present disclosure, a system for smoke reduction in images is presented. The system may include a light source configured to provide light, an imaging device configured to acquire images, and an imaging device control unit configured to control the imaging device. The image includes a plurality of pixels, where each pixel of the image may include a plurality of color components. The control unit may include a processor and a memory storing instructions that, when executed by the processor, cause the system to: access the image; for each of the pixels: determine a dark channel matrix value for the respective pixel as a minimum color component intensity value for a respective pixel area centered at the respective pixel, and estimate an atmospheric light component value for each pixel based on the minimum color component intensity value for each pixel of the pixel area; de-haze the image based on the atmospheric light component value for each of the pixels; and display the de-hazed image on a display device.


In a further aspect of the present disclosure, the instructions, when de-hazing the image, may further cause the system to determine a transmission map value for each pixel x of the plurality of pixels as:

T(x) = 1 − ω * I_DARK(x) / A(x),

convert the image to a YUV image, determine Y′ as:

Y′(x) = (Y(x) − A(x)) / T(x),

and replace a Y channel of the YUV image with the determined Y′ to provide a Y′UV image, where ω is a predetermined constant, I_DARK(x) is the dark channel matrix value for the pixel x, and A(x) is the atmospheric light component value for the pixel x.


In yet a further aspect of the present disclosure, the instructions, when de-hazing the image, may further cause the system to convert the Y′UV image to a de-hazed image.


In yet another aspect of the present disclosure, the image includes at least one of an RGB image, a CMYK image, a CIELAB image, or a CIEXYZ image.


In an aspect of the present disclosure, the estimating the atmospheric light matrix includes determining an atmospheric light component value for the pixel x as A(x)=max(min(Ic(y)))*coef, for all y∈Ω(x), where y is a pixel, Ic(y) is an intensity value of a color component c of the pixel y, and coef is a predetermined coefficient value.


Further details and aspects of various embodiments of the present disclosure are described in more detail below with reference to the appended figures.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the present disclosure are described herein with reference to the accompanying drawings, wherein:



FIG. 1 is a diagram of an exemplary visualization or endoscope system in accordance with the present disclosure;



FIG. 2 is a schematic configuration of the visualization or endoscope system of FIG. 1;



FIG. 3 is a diagram illustrating another schematic configuration of an optical system of the system of FIG. 1;



FIG. 4 is a schematic configuration of the visualization or endoscope system in accordance with an embodiment of the present disclosure;



FIG. 5 is a flowchart of a method for smoke reduction in accordance with an exemplary embodiment of the disclosure;



FIG. 6 is an exemplary input image including an area of pixels in accordance with the present disclosure;



FIG. 7 is a flowchart of a method for performing de-hazing in accordance with the disclosure;



FIG. 8 is an exemplary image with smoke in accordance with the present disclosure;



FIG. 9 is an exemplary de-hazed image with constant atmospheric light; and



FIG. 10 is an exemplary de-hazed image with atmospheric light calculated in accordance with the present disclosure.





Further details and aspects of exemplary embodiments of the disclosure are described in more detail below with reference to the appended figures. Any of the above aspects and embodiments of the disclosure may be combined without departing from the scope of the disclosure.


DETAILED DESCRIPTION OF EMBODIMENTS

Embodiments of the presently disclosed devices, systems, and methods of treatment are described in detail with reference to the drawings, in which like reference numerals designate identical or corresponding elements in each of the several views. As used herein, the term “distal” refers to that portion of a structure that is farther from a user, while the term “proximal” refers to that portion of a structure that is closer to the user. The term “clinician” refers to a doctor, nurse, or other care provider and may include support personnel.


The present disclosure is applicable where images of a surgical site are captured. Endoscope systems are provided as an example, but it will be understood that such description is exemplary and does not limit the scope and applicability of the present disclosure to other systems and procedures.


Referring initially to FIGS. 1-3, an endoscope system 1, in accordance with the present disclosure, includes an endoscope 10, a light source 20, a video system 30, and a display device 40. With continued reference to FIG. 1, the light source 20, such as an LED/Xenon light source, is connected to the endoscope 10 via a fiber guide 22 that is operatively coupled to the light source 20 and to an endocoupler 16 disposed on, or adjacent to, a handle 18 of the endoscope 10. The fiber guide 22 includes, for example, fiber optic cable which extends through the elongated body 12 of the endoscope 10 and terminates at a distal end 14 of the endoscope 10. Accordingly, light is transmitted from the light source 20, through the fiber guide 22, and emitted out the distal end 14 of the endoscope 10 toward a targeted internal feature, such as tissue or an organ, of a body of a patient. As the light transmission pathway in such a configuration is relatively long (the fiber guide 22 may be, for example, about 1.0 m to about 1.5 m in length), only about 15% (or less) of the light flux emitted from the light source 20 is output from the distal end 14 of the endoscope 10.


With reference to FIG. 2 and FIG. 3, the video system 30 is operatively connected to an image sensor 32 mounted to, or disposed within, the handle 18 of the endoscope 10 via a data cable 34. An objective lens 36 is disposed at the distal end 14 of the elongated body 12 of the endoscope 10, and a series of spaced-apart relay lenses 38, such as rod lenses, are positioned along the length of the elongated body 12 between the objective lens 36 and the image sensor 32. Images captured by the objective lens 36 are forwarded through the elongated body 12 of the endoscope 10 via the relay lenses 38 to the image sensor 32 and are then communicated to the video system 30 for processing and output to the display device 40 via cable 39. The image sensor 32 is located within, or mounted to, the handle 18 of the endoscope 10, which can be up to about 30 cm away from the distal end 14 of the endoscope 10.


With reference to FIGS. 4-7, the flow diagrams include various blocks described in an ordered sequence. However, those skilled in the art will appreciate that one or more blocks of the flow diagram may be performed in a different order, repeated, and/or omitted without departing from the scope of the present disclosure. The description below refers to various actions or tasks performed by the video system 30, but those skilled in the art will appreciate that the video system 30 is exemplary. In various embodiments, the disclosed operations can be performed by another component, device, or system. In various embodiments, the video system 30 or other component/device performs the actions or tasks via one or more software applications executing on a processor. In various embodiments, at least some of the operations can be implemented by firmware, programmable logic devices, and/or hardware circuitry. Other implementations are contemplated to be within the scope of the present disclosure.


Referring to FIG. 4, there is shown a schematic configuration of a system, which may be the endoscope system of FIG. 1 or may be a different type of system (e.g., visualization system, etc.). The system, in accordance with the present disclosure, includes an imaging device 410, a light source 420, a video system 430, and a display device 440. The light source 420 is configured to provide light to a surgical site through the imaging device 410 via the fiber guide 422. The distal end 414 of the imaging device 410 includes an objective lens 436 for capturing the image at the surgical site. The objective lens 436 forwards the image to the image sensor 432. The image is then communicated to the video system 430 for processing. The video system 430 includes an imaging device controller 450 for controlling the imaging device 410 and processing the images. The imaging device controller 450 includes a processor 452 connected to a computer-readable storage medium or a memory 454, which may be a volatile type memory, such as RAM, or a non-volatile type memory, such as flash media, disk media, or other types of memory. In various embodiments, the processor 452 may be another type of processor such as, without limitation, a digital signal processor, a microprocessor, an ASIC, a graphics processing unit (GPU), a field-programmable gate array (FPGA), or a central processing unit (CPU).


In various embodiments, the memory 454 can be random access memory, read only memory, magnetic disk memory, solid state memory, optical disc memory, and/or another type of memory. In various embodiments, the memory 454 can be separate from the imaging device controller 450 and can communicate with the processor 452 through communication buses of a circuit board and/or through communication cables such as serial ATA cables or other types of cables. The memory 454 includes computer-readable instructions that are executable by the processor 452 to operate the imaging device controller 450. In various embodiments, the imaging device controller 450 may include a network interface 540 to communicate with other computers or a server.


Referring now to FIG. 5, there is shown an operation for smoke reduction in images. In various embodiments, the operation of FIG. 5 can be performed by an endoscope system 1 described above herein. In various embodiments, the operation of FIG. 5 can be performed by another type of system and/or during another type of procedure. The following description will refer to an endoscope system, but it will be understood that such description is exemplary and does not limit the scope and applicability of the present disclosure to other systems and procedures. The following description will refer to an RGB (Red, Green, Blue) image or RGB color model, but it will be understood that such description is exemplary and does not limit the scope and applicability of the present disclosure to other types of images or color models (for example, CMYK (Cyan, Magenta, Yellow, Key), CIELAB, or CIEXYZ). The image sensor 32 may capture raw data. The format of the raw data may be RGGB, RGBG, GRGB, or BGGR. The video system 30 may convert the raw data to RGB using a demosaicing algorithm. A demosaicing algorithm is a digital image process used to reconstruct a full-color image from the incomplete color samples output from an image sensor overlaid with a color filter array (CFA). It is also known as CFA interpolation or color reconstruction. The RGB image may be further converted by the video system 30 to another color model, such as CMYK, CIELAB, or CIEXYZ.
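As an aside, a minimal sketch of such a raw-to-RGB conversion using OpenCV's Bayer demosaicing is shown below; the pattern constant assumes a BGGR-family sensor and the frame itself is synthetic, both purely illustrative rather than taken from the disclosed system:

```python
import cv2
import numpy as np

# Hypothetical 8-bit raw Bayer frame standing in for the sensor output.
raw = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)

# Demosaic (CFA interpolation): reconstruct a full-color image from the mosaic.
rgb = cv2.cvtColor(raw, cv2.COLOR_BayerBG2RGB)
```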


Initially, at step 502, an image of a surgical site is captured via the objective lens 36 and forwarded to the image sensor 32 of endoscope system 1. The term “image” as used herein may include still images or moving images (for example, video). In various embodiments, the captured image is communicated to the video system 30 for processing. For example, during an endoscopic procedure, a surgeon may cut tissue with an electrosurgical instrument, and smoke may be generated during this cutting. When the image is captured, it may include the smoke. Smoke is generally a turbid medium (such as particles and water droplets) suspended in the atmosphere. The irradiance received by the objective lens 36 from a scene point is attenuated along the line of sight, and the incoming light is mixed with ambient light (air-light) reflected into the line of sight by atmospheric particles such as smoke. This smoke degrades the image, causing it to lose contrast and color fidelity.
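For background, dark-channel de-hazing methods (see the He et al. reference cited below) commonly model a hazy image with the atmospheric scattering equation; written with this document's per-pixel notation for A and T, a sketch of that model is:

I(x) = J(x) * T(x) + A(x) * (1 − T(x)),

where I(x) is the observed intensity at pixel x and J(x) is the haze-free scene radiance. The steps below effectively estimate A and T and invert this relation.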



FIG. 6 shows an exemplary pixel representation of an image captured in step 502. In various embodiments, the captured image may or may not have been processed during the capture process or after the capture process. In various embodiments, an image 600 includes a number of pixels, and the dimensions of the image 600 are often represented as the number of pixels in an X by Y format, such as 500×500 pixels, for example. In accordance with aspects of the present disclosure, and as explained in more detail later herein, each pixel of the image 600 may be processed based on a pixel area 602, 610 centered at that pixel, which will also be referred to herein as a patch. In various embodiments, each patch/pixel area of the image can have the same size. In various embodiments, different pixel areas or patches can have different sizes. Each pixel area or patch can be denoted as Ω(x), which is a pixel area/patch having a particular pixel x as its center pixel. In the illustrative example of FIG. 6, the pixel area 602 has a size of 3×3 pixels and is centered at a particular pixel x1 606. For example, if an image has 18 by 18 pixels, a patch size may be 3×3 pixels. The illustrated image size and patch size are exemplary, and other image sizes and patch sizes are contemplated to be within the scope of the present disclosure.


With continuing reference to FIG. 6, each pixel 601 in an image 600 may have combinations of color components 612, such as red, green, and blue, which are also referred to herein as color channels. Ic(y) is used herein to denote the intensity value of a color component c of a particular pixel y in the image 600. For a pixel 601, each of the color components 612 has an intensity value representing the brightness intensity of that color component. For example, for a 24 bit RGB image, each of the color components 612 has 8 bits, which corresponds to each color component having 256 possible intensity values.


Referring again to FIG. 5, at step 504, the video system 30 determines a dark channel matrix for the image 600. As used herein, the phrase “dark channel” of a pixel refers to the lowest color component intensity value among all pixels of a patch Ω(x) 602 centered at a particular pixel x. The term “dark channel matrix” of an image, as used herein, refers to a matrix of the dark channel of every pixel of the image. The dark channel of a pixel x will be denoted as I_DARK(x). In various embodiments, the video system 30 calculates the dark channel of a pixel as follows:

I_DARK(x)=min(min(Ic(y))), for all c∈{r,g,b}, y∈Ω(x)

where y denotes a pixel of the patch Ω(x), c denotes a color component, and Ic(y) denotes the intensity value of the color component c of pixel y. Thus, the dark channel of a pixel x is the outcome of two minimum operations across two variables c and y, which together determine the lowest color component intensity value among all pixels of a patch centered at pixel x. In various embodiments, the video system 30 can calculate the dark channel of a pixel by acquiring the lowest color component intensity value for every pixel in the patch and then finding the minimum value among all those values. For cases where the center pixel of the patch is at or near the edge of the image, only the part of the patch in the image is used.
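A minimal sketch of this computation in Python, assuming a float RGB array of shape (H, W, 3) and borrowing scipy's minimum filter for the patch minimum (the helper name dark_channel is illustrative, not from the patent):

```python
import numpy as np
from scipy.ndimage import minimum_filter

def dark_channel(rgb: np.ndarray, patch: int = 3) -> np.ndarray:
    """I_DARK(x): minimum over pixels y in the patch of the per-pixel channel minimum."""
    per_pixel_min = rgb.min(axis=2)  # min over color components c for each pixel
    # Patch minimum; "nearest" edge handling matches using only the in-image part
    # of the patch, since replicated border values cannot lower the minimum.
    return minimum_filter(per_pixel_min, size=patch, mode="nearest")
```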


For example, with reference to FIG. 6, for an image 600 that was captured in step 502, the image 600 may have a height and width of 18×18 pixels, and the pixel area (patch) size may be 3×3 pixels. For example, a 3×3 pixel area Ω(x1) 602 centered at x1 606 may have the following (R, G, B) intensities for each of the 9 pixels in the patch:

[ (1, 3, 6)  (2, 0, 1)  (5, 3, 4)
  (2, 4, 3)  (6, 7, 4)  (7, 6, 9)
  (1, 3, 2)  (5, 8, 9)  (9, 11, 25) ]





In this example, for the top-left pixel in the pixel area Ω(x1) 602, the R channel may have an intensity of 1, the G channel may have an intensity of 3, and the B channel may have an intensity of 6. Here, the R channel has the minimum intensity value (a value of 1) of the RGB channels for that pixel.


The minimum color component intensity value of each of the pixels would then be determined. For example, for the 3×3-pixel area Ω(x1) 602 centered at x1, the minimum color component intensity values for the pixels in the pixel area Ω(x1) 602 are:

[ 1  0  3
  2  4  6
  1  5  9 ]





Thus, the dark channel of pixel x1 would have an intensity value of 0 for this exemplary 3×3-pixel area Ω(x1) 602 centered at x1.
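The worked example can be checked with a few lines of Python; this is only a verification sketch, with the patch values transcribed from the example above:

```python
import numpy as np

# (R, G, B) triples of the 3x3 patch Omega(x1) from the example above.
patch = np.array([[[1, 3, 6], [2, 0, 1], [5, 3, 4]],
                  [[2, 4, 3], [6, 7, 4], [7, 6, 9]],
                  [[1, 3, 2], [5, 8, 9], [9, 11, 25]]])

per_pixel_min = patch.min(axis=2)  # [[1 0 3] [2 4 6] [1 5 9]]
print(per_pixel_min.min())         # 0, the dark channel value I_DARK(x1)
```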


Referring again to FIG. 5, at step 506, the video system 30 estimates an atmospheric light component for each pixel, and the atmospheric light components for all of the pixels are together referred to herein as an “atmospheric light matrix.” The estimated atmospheric light component for a pixel x will be denoted herein as A(x). In various embodiments, A(x) can be determined based on the lowest color component intensity value for each pixel y 604 in a pixel area Ω(x) 602, which can be denoted as:

A(x)=f(min(Ic(y))), for all c∈{r,g,b}, y∈Ω(x),

where f( ) is an operation for estimating the atmospheric light component based on the lowest color component intensity value for each pixel y 604 in the patch Ω(x1) 602. In various embodiments, the operation f( ) may determine the maximum value among min(Ic(y)), for y∈Ω(x). In various embodiments, the maximum value can be scaled by a coefficient “coef,” which may have a value between 0 and 1, such as 0.85. The embodiment of the atmospheric light component described above may be expressed as follows:

A(x)=f(min(Ic(y)))=max(min(Ic(y)))*coef, for all c∈{r,g,b}, y∈Ω(x)

For example, using the same example above for intensity values in patch Ω(x1) 602, the video system 30 determines the atmospheric light component A(x1) to be 9*coef.
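A sketch of this estimate under the same assumptions as the dark-channel sketch above, using scipy's maximum filter for the patch maximum (the helper name and default values are illustrative):

```python
import numpy as np
from scipy.ndimage import maximum_filter

def atmospheric_light(rgb: np.ndarray, patch: int = 3, coef: float = 0.85) -> np.ndarray:
    """A(x) = coef * maximum over y in the patch of the per-pixel channel minimum."""
    per_pixel_min = rgb.min(axis=2)  # min over color components c
    return coef * maximum_filter(per_pixel_min, size=patch, mode="nearest")
```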


At step 508, the video system 30 determines what is referred to herein as a transmission map T. The transmission map T is determined based on the dark channel matrix and the atmospheric light matrix, which were determined in steps 504 and 506. The transmission map includes a transmission component T(x) for each pixel x. In various embodiments, the transmission component can be determined as follows:

T(x) = 1 − ω * I_DARK(x) / A(x),





where ω is a parameter having a value between 0 and 1, such as 0.85. In practice, even in clear images, there are some particles, so some haze exists when distant objects are observed. The presence of haze is a cue to human perception of depth, and if all haze is removed, the perception of depth may be lost. Therefore, to retain some haze, the parameter ω (0<ω<=1) is introduced. In various embodiments, the value of ω can vary based on the particular application. Thus, the transmission component is equal to 1 minus ω times the dark channel of a pixel (I_DARK(x)) divided by the atmospheric light component of the pixel, A(x).
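Continuing the sketches above, the transmission component translates directly into code; the small epsilon guarding against division by zero is an added implementation detail, not part of the text:

```python
import numpy as np

def transmission_map(dark: np.ndarray, air: np.ndarray,
                     omega: float = 0.85, eps: float = 1e-6) -> np.ndarray:
    """T(x) = 1 - omega * I_DARK(x) / A(x)."""
    return 1.0 - omega * dark / np.maximum(air, eps)
```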


At step 510, the video system 30 de-hazes the image based on the transmission map. FIG. 7 illustrates one way to perform the de-hazing operation.


With reference to FIG. 7, the illustrated operation assumes that the original image is an RGB image. The operation attempts to retain the color of the original RGB image 600 as much as possible in the de-hazing process. In various embodiments, the de-hazing operation converts the image 600 from the RGB color space to the YUV color space (Y is luminance; U and V are chrominance, or color) and applies de-hazing to the Y (luma) channel, which is generally a weighted sum of the RGB color channels.


Initially, at step 702 the video system 30 converts the RGB image 600 to a YUV image denoted as I-YUV. The conversion of each pixel from RGB to YUV may be performed as follows:

[Y]   [ 0.2126    0.7152    0.0722 ] [R]
[U] = [-0.09991  -0.33609   0.436  ] [G]
[V]   [ 0.615    -0.55861  -0.05639] [B]
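As a sketch, this matrix can be applied to every pixel of an H×W×3 float image with a single matrix product (the helper name is illustrative):

```python
import numpy as np

RGB_TO_YUV = np.array([[ 0.2126 ,  0.7152 ,  0.0722 ],
                       [-0.09991, -0.33609,  0.436  ],
                       [ 0.615  , -0.55861, -0.05639]])

def rgb_to_yuv(rgb: np.ndarray) -> np.ndarray:
    # Multiply the 3x3 conversion matrix against the channel axis of each pixel.
    return rgb @ RGB_TO_YUV.T
```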





Next, at step 704 the video system 30 performs a de-hazing operation on the channel Y (luma) of the I-YUV image. In accordance with aspects of the present disclosure, the de-hazing operation is as follows:

Y′(x) = (Y(x) − A(x)) / T(x),
where Y′(x) is the Y (luma) channel of the de-hazed image I-Y′UV, A(x) is the estimated atmospheric light component for pixel x, and T(x) is the transmission map value for pixel x. Thus, the Y (luma) channel of the de-hazed image I-Y′UV is equal to the difference between the Y (luma) channel of image I-YUV and the estimated atmospheric light component A(x) calculated in step 506, divided by the transmission map value T(x) determined in step 508.
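A sketch of this step, reusing the hypothetical helpers above; the floor on T(x) avoids division by values near zero, a safeguard the text does not address:

```python
import numpy as np

def dehaze_luma(yuv: np.ndarray, air: np.ndarray, tmap: np.ndarray,
                t_min: float = 0.1) -> np.ndarray:
    """Replace the Y channel with Y'(x) = (Y(x) - A(x)) / T(x), giving I-Y'UV."""
    out = yuv.copy()
    out[..., 0] = (yuv[..., 0] - air) / np.maximum(tmap, t_min)
    return out
```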


Finally, at step 706 the video system 30 converts the Y′UV image I-Y′UV to a de-hazed RGB image; the conversion from YUV to RGB is as follows:

[R]   [ 1   0         1.28033 ] [Y]
[G] = [ 1  -0.21482  -0.38059 ] [U]
[B]   [ 1   2.12798   0       ] [V]
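As with the forward conversion, this matrix translates into a per-pixel product; clipping to the valid range is an added safeguard, not part of the text:

```python
import numpy as np

YUV_TO_RGB = np.array([[1.0,  0.0    ,  1.28033],
                       [1.0, -0.21482, -0.38059],
                       [1.0,  2.12798,  0.0    ]])

def yuv_to_rgb(yuv: np.ndarray) -> np.ndarray:
    return np.clip(yuv @ YUV_TO_RGB.T, 0.0, 1.0)
```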





In various embodiments, the video system 30 may display the resultant de-hazed RGB image on the display device 40 or save it to a memory or external storage device for later recall or further processing. Although the operation of FIG. 7 is described with respect to an RGB image, it will be understood that the disclosed operation can be applied to other color spaces as well.



FIGS. 8-10 show an example result of the methods described in the previous sections. FIG. 8 shows an image 800 with smoke captured during a surgical procedure using the endoscope system 1. For example, during an endoscopic procedure, a surgeon may cut tissue 804 with an electrosurgical instrument 802. During this cutting, smoke 806 may be generated. This smoke 806 would be captured in the image 800.



FIG. 9 shows a de-hazed image 900, where the image 800 from FIG. 8 was de-hazed based on a constant atmospheric light value, that is, a case where a constant atmospheric light value A was used instead of the atmospheric light matrix A estimated by the formula used in step 506. The image 900, still somewhat obscured by smoke 806, may include an electrosurgical instrument 802 and tissue 804.



FIG. 10 shows a de-hazed RGB image 1000, de-hazed using the method of FIGS. 5 and 7, as described herein. The de-hazed RGB image 1000 may include an electrosurgical instrument 802 and tissue 804. The method may start with the capture of the image 800 of FIG. 8 during a surgical procedure, as in step 502, using the endoscope system 1. For example, the image may be approximately 20×20 pixels. Next, the video system 30 determines the dark channel matrix of the image, as in step 504. For example, the size of the pixel area Ω(x) may be set to approximately 3×3 pixels.


The determined dark channel matrix of the image of FIG. 8 is used by the video system 30 to estimate the atmospheric light matrix by finding the maximum value among the minimum color component intensities of the pixels in each pixel area and multiplying this maximum value by a coefficient (e.g., 0.85), as in step 506. Next, as in step 508, the video system 30 calculates a transmission map (T) according to the dark channel matrix and the estimated atmospheric light matrix.


The transmission map (T) is used in a de-hazing operation as described in FIG. 7. At step 702, the video system 30 converts the RGB image I to a YUV image I-YUV. Next, at step 704, the video system 30 applies the de-hazing operation to channel Y (luma) of the I-YUV image by subtracting the estimated atmospheric light component A(x) from the Y (luma) channel and then dividing this difference by the determined transmission map value, creating image I-Y′UV. Finally, in step 706, the I-Y′UV image is converted to a de-hazed RGB image 1000 (see FIG. 10).
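Putting the steps of FIGS. 5 and 7 together, an end-to-end sketch built from the hypothetical helpers defined above (an illustration of the flow, not the patent's reference implementation):

```python
def reduce_smoke(rgb, patch=3, coef=0.85, omega=0.85):
    dark = dark_channel(rgb, patch)            # step 504: dark channel matrix
    air = atmospheric_light(rgb, patch, coef)  # step 506: atmospheric light matrix
    tmap = transmission_map(dark, air, omega)  # step 508: transmission map
    yuv = rgb_to_yuv(rgb)                      # step 702: RGB -> YUV
    dehazed = dehaze_luma(yuv, air, tmap)      # step 704: Y' = (Y - A) / T
    return yuv_to_rgb(dehazed)                 # step 706: Y'UV -> de-hazed RGB
```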


The embodiments disclosed herein are examples of the present disclosure and may be embodied in various forms. For instance, although certain embodiments herein are described as separate embodiments, each of the embodiments herein may be combined with one or more of the other embodiments herein. Specific structural and functional details disclosed herein are not to be interpreted as limiting, but as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present disclosure in virtually any appropriately detailed structure. Like reference numerals may refer to similar or identical elements throughout the description of the figures.


The phrases “in an embodiment,” “in embodiments,” “in some embodiments,” or “in other embodiments” may each refer to one or more of the same or different embodiments in accordance with the present disclosure. A phrase in the form “A or B” means “(A), (B), or (A and B).” A phrase in the form “at least one of A, B, or C” means “(A); (B); (C); (A and B); (A and C); (B and C); or (A, B, and C).” The term “clinician” may refer to a clinician or any medical professional, such as a doctor, nurse, technician, medical assistant, or the like, performing a medical procedure.


The systems described herein may also utilize one or more controllers to receive various information and transform the received information to generate an output. The controller may include any type of computing device, computational circuit, or any type of processor or processing circuit capable of executing a series of instructions that are stored in a memory. The controller may include multiple processors and/or multicore central processing units (CPUs) and may include any type of processor, such as a microprocessor, digital signal processor, microcontroller, programmable logic device (PLD), field programmable gate array (FPGA), or the like. The controller may also include a memory to store data and/or instructions that, when executed by the one or more processors, causes the one or more processors to perform one or more methods and/or algorithms.


Any of the herein described methods, programs, algorithms or codes may be converted to, or expressed in, a programming language or computer program. The terms “programming language” and “computer program,” as used herein, each include any language used to specify instructions to a computer, and include (but are not limited to) the following languages and their derivatives: Assembler, Basic, Batch files, BCPL, C, C+, C++, Delphi, Fortran, Java, JavaScript, machine code, operating system command languages, Pascal, Perl, PL1, scripting languages, Visual Basic, metalanguages which themselves specify programs, and all first, second, third, fourth, fifth, or further generation computer languages. Also included are database and other data schemas, and any other meta-languages. No distinction is made between languages which are interpreted, compiled, or use both compiled and interpreted approaches. No distinction is made between compiled and source versions of a program. Thus, reference to a program, where the programming language could exist in more than one state (such as source, compiled, object, or linked), is a reference to any and all such states. Reference to a program may encompass the actual instructions and/or the intent of those instructions.


Any of the herein described methods, programs, algorithms or codes may be contained on one or more machine-readable media or memory. The term “memory” may include a mechanism that provides (for example, stores and/or transmits) information in a form readable by a machine such as a processor, computer, or a digital processing device. For example, a memory may include a read only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, or any other volatile or non-volatile memory storage device. Code or instructions contained thereon can be represented by carrier wave signals, infrared signals, digital signals, and by other like signals.


It should be understood that the foregoing description is only illustrative of the present disclosure. Various alternatives and modifications can be devised by those skilled in the art without departing from the present disclosure. Accordingly, the present disclosure is intended to embrace all such alternatives, modifications and variances. The embodiments described with reference to the attached drawing figures are presented only to demonstrate certain examples of the present disclosure. Other elements, steps, methods, and techniques that are insubstantially different from those described above and/or in the appended claims are also intended to be within the scope of the present disclosure.

Claims
  • 1. A method for smoke reduction in images comprising: accessing an RGB image of an object obscured by smoke, the RGB image including a plurality of pixels; determining a dark channel matrix of the RGB image, where the dark channel matrix includes, for each pixel of the plurality of pixels, a minimum color component intensity for a respective pixel area centered at the respective pixel; estimating an atmospheric light matrix for the RGB image based on the dark channel matrix, wherein the atmospheric light matrix includes an atmospheric light component value for each of the plurality of pixels; determining a transmission map based on the atmospheric light matrix and the dark channel matrix; de-hazing the RGB image based on the transmission map to reduce the smoke in the RGB image, wherein the de-hazing the RGB image includes: converting the RGB image to a YUV image; performing a de-hazing operation on the YUV image to provide a Y′UV image, wherein the performing the de-hazing operation on the YUV image includes, for each pixel x of the plurality of pixels: determining Y′ as Y′(x)=(Y(x)−A(x))/T(x).
  • 2. The method of claim 1, wherein the performing the de-hazing operation on the YUV image further includes replacing a Y channel of the YUV image with the determined Y′ to provide a Y′UV image.
  • 3. A method for smoke reduction in images comprising: accessing an RGB image of an object obscured by smoke, the RGB image including a plurality of pixels; determining a dark channel matrix of the RGB image, where the dark channel matrix includes, for each pixel of the plurality of pixels, a minimum color component intensity for a respective pixel area centered at the respective pixel; estimating an atmospheric light matrix for the RGB image based on the dark channel matrix, wherein the atmospheric light matrix includes an atmospheric light component value for each of the plurality of pixels, wherein the estimating the atmospheric light matrix includes, for each pixel x of the plurality of pixels: determining an atmospheric light component value for the pixel x as: A(x)=max(min(Ic(y)))*coef, for all y∈Ω(x), wherein: Ω(x) is a pixel area centered at pixel x, y is a pixel of the pixel area Ω(x), Ic(y) is an intensity value of a color component c of the pixel y, and coef is a predetermined coefficient value; determining a transmission map based on the atmospheric light matrix and the dark channel matrix; de-hazing the RGB image based on the transmission map to reduce the smoke in the RGB image; and displaying the de-hazed RGB image on a display device.
  • 4. A method for smoke reduction in images comprising: accessing an RGB image of an object obscured by smoke, the RGB image including a plurality of pixels; determining a dark channel matrix of the RGB image, where the dark channel matrix includes, for each pixel of the plurality of pixels, a minimum color component intensity for a respective pixel area centered at the respective pixel; estimating an atmospheric light matrix for the RGB image based on the dark channel matrix, wherein the atmospheric light matrix includes an atmospheric light component value for each of the plurality of pixels; determining a transmission map based on the atmospheric light matrix and the dark channel matrix, wherein the determining the transmission map includes determining, for each pixel x of the plurality of pixels, a transmission component value as: T(x)=1−ω*I_DARK(x)/A(x).
  • 5. A method for smoke reduction in images comprising: accessing an image obscured by smoke, the image including a plurality of pixels, where each pixel of the image includes a plurality of color components; for each pixel of the plurality of pixels: determining a dark channel matrix value for the respective pixel as a minimum color component intensity value for a respective pixel area centered at the respective pixel; and estimating an atmospheric light component value for the pixel based on the minimum color component intensity value for each pixel of the pixel area; de-hazing the image based on the atmospheric light component value for each of the plurality of pixels, wherein the de-hazing the image includes: determining a transmission map, for each pixel x of the plurality of pixels, as: T(x)=1−ω*I_DARK(x)/A(x).
  • 6. The method of claim 5, wherein the de-hazing the image further includes converting the Y′UV image to a de-hazed image.
  • 7. The method of claim 6, wherein the image includes at least one of an RGB image, a CMYK image, a CIELAB image, or a CIEXYZ image.
  • 8. A method for smoke reduction in images comprising: accessing an image obscured by smoke, the image including a plurality of pixels, where each pixel of the image includes a plurality of color components; for each pixel of the plurality of pixels: determining a dark channel matrix value for the respective pixel as a minimum color component intensity value for a respective pixel area centered at the respective pixel; and estimating an atmospheric light component value for the pixel based on the minimum color component intensity value for each pixel of the pixel area, wherein the estimating the atmospheric light component value includes, for each pixel x of the plurality of pixels: determining the atmospheric light component value for the pixel x as: A(x)=max(min(Ic(y)))*coef, for all y∈Ω(x), wherein: Ω(x) is a pixel area centered at pixel x, y is a pixel of the pixel area Ω(x), Ic(y) is an intensity value of a color component c of the pixel y, and coef is a predetermined coefficient value; de-hazing the image based on the atmospheric light component value for each of the plurality of pixels; and displaying the de-hazed image on a display device.
  • 9. A system for smoke reduction in images comprising: a light source configured to provide light; an imaging device configured to acquire images; an imaging device control unit configured to control the imaging device, the control unit comprising: a processor; and a memory storing instructions which, when executed by the processor, cause the system to: capture an image of an object obscured by smoke, by the imaging device, the image including a plurality of pixels, where each pixel of the image includes a plurality of color components; access the image; for each of the pixels: determine a dark channel matrix value for the respective pixel as a minimum color component intensity value for a respective pixel area centered at the respective pixel; estimate an atmospheric light component value for each pixel based on the minimum color component intensity value for each pixel of the pixel area; and de-haze the image based on the atmospheric light component value for each of the pixels, wherein the instructions, when de-hazing the image, further cause the system to: determine a transmission map, for each pixel x of the plurality of pixels, as: T(x)=1−ω*I_DARK(x)/A(x).
  • 10. The system of claim 9, wherein the instructions when de-hazing the image, further cause the system to convert the Y′UV image to a de-hazed image.
  • 11. The system of claim 10, wherein the image includes at least one of an RGB image, a CMYK image, a CIELAB image, or a CIEXYZ image.
  • 12. A system for smoke reduction in images comprising: a light source configured to provide light; an imaging device configured to acquire images; an imaging device control unit configured to control the imaging device, the control unit comprising: a processor; and a memory storing instructions which, when executed by the processor, cause the system to: capture an image of an object obscured by smoke, by the imaging device, the image including a plurality of pixels, where each pixel of the image includes a plurality of color components; access the image; for each of the pixels: determine a dark channel matrix value for the respective pixel as a minimum color component intensity value for a respective pixel area centered at the respective pixel; and estimate an atmospheric light component value for each pixel based on the minimum color component intensity value for each pixel of the pixel area, wherein the instructions, when estimating the atmospheric light component of the image, further cause the system to: determine the atmospheric light component value for the pixel x as: A(x)=max(min(Ic(y)))*coef, wherein: y is a pixel, Ic is a color component of the pixel y, and coef is a predetermined coefficient; de-haze the image based on the atmospheric light component value for each of the pixels; and display the de-hazed image on a display device.
PCT Information
Filing Document Filing Date Country Kind
PCT/CN2019/105311 9/11/2019 WO
Publishing Document Publishing Date Country Kind
WO2021/046743 3/18/2021 WO A
US Referenced Citations (357)
Number Name Date Kind
6132368 Cooper Oct 2000 A
6206903 Ramans Mar 2001 B1
6246200 Blumenkranz et al. Jun 2001 B1
6312435 Wallace et al. Nov 2001 B1
6331181 Tierney et al. Dec 2001 B1
6394998 Wallace et al. May 2002 B1
6424885 Niemeyer et al. Jul 2002 B1
6441577 Blumenkranz et al. Aug 2002 B2
6459926 Nowlin et al. Oct 2002 B1
6491691 Morley et al. Dec 2002 B1
6491701 Tierney et al. Dec 2002 B2
6493608 Niemeyer Dec 2002 B1
6565554 Niemeyer May 2003 B1
6645196 Nixon et al. Nov 2003 B1
6659939 Moll Dec 2003 B2
6671581 Niemeyer et al. Dec 2003 B2
6676684 Morley et al. Jan 2004 B1
6685698 Morley et al. Feb 2004 B2
6699235 Wallace et al. Mar 2004 B2
6714839 Salisbury, Jr. et al. Mar 2004 B2
6716233 Whitman Apr 2004 B1
6728599 Wang et al. Apr 2004 B2
6746443 Morley et al. Jun 2004 B1
6766204 Niemeyer et al. Jul 2004 B2
6770081 Cooper et al. Aug 2004 B1
6772053 Niemeyer Aug 2004 B2
6783524 Anderson et al. Aug 2004 B2
6793652 Whitman et al. Sep 2004 B1
6793653 Sanchez et al. Sep 2004 B2
6799065 Niemeyer Sep 2004 B1
6837883 Moll et al. Jan 2005 B2
6839612 Sanchez et al. Jan 2005 B2
6840938 Morley et al. Jan 2005 B1
6843403 Whitman Jan 2005 B2
6846309 Whitman et al. Jan 2005 B2
6866671 Tierney et al. Mar 2005 B2
6871117 Wang et al. Mar 2005 B2
6879880 Nowlin et al. Apr 2005 B2
6899705 Niemeyer May 2005 B2
6902560 Morley et al. Jun 2005 B1
6936042 Wallace et al. Aug 2005 B2
6951535 Ghodoussi et al. Oct 2005 B2
6974449 Niemeyer Dec 2005 B2
6991627 Madhani et al. Jan 2006 B2
6994708 Manzo Feb 2006 B2
7048745 Tierney et al. May 2006 B2
7066926 Wallace et al. Jun 2006 B2
7118582 Wang et al. Oct 2006 B1
7125403 Julian et al. Oct 2006 B2
7155315 Niemeyer et al. Dec 2006 B2
7239940 Wang et al. Jul 2007 B2
7306597 Manzo Dec 2007 B2
7357774 Cooper Apr 2008 B2
7373219 Nowlin et al. May 2008 B2
7379790 Toth et al. May 2008 B2
7386365 Nixon Jun 2008 B2
7391173 Schena Jun 2008 B2
7398707 Morley et al. Jul 2008 B2
7413565 Wang et al. Aug 2008 B2
7453227 Prisco et al. Nov 2008 B2
7524320 Tierney et al. Apr 2009 B2
7574250 Niemeyer Aug 2009 B2
7594912 Cooper et al. Sep 2009 B2
7607440 Coste-Maniere et al. Oct 2009 B2
7666191 Orban, III et al. Feb 2010 B2
7682357 Ghodoussi et al. Mar 2010 B2
7689320 Prisco et al. Mar 2010 B2
7695481 Wang et al. Apr 2010 B2
7695485 Whitman et al. Apr 2010 B2
7699855 Anderson et al. Apr 2010 B2
7713263 Niemeyer May 2010 B2
7725214 Diolaiti May 2010 B2
7727244 Orban, III et al. Jun 2010 B2
7741802 Prisco Jun 2010 B2
7756036 Druke et al. Jul 2010 B2
7757028 Druke et al. Jul 2010 B2
7762825 Burbank et al. Jul 2010 B2
7778733 Nowlin et al. Aug 2010 B2
7803151 Whitman Sep 2010 B2
7806891 Nowlin et al. Oct 2010 B2
7819859 Prisco et al. Oct 2010 B2
7819885 Cooper Oct 2010 B2
7824401 Manzo et al. Nov 2010 B2
7835823 Sillman et al. Nov 2010 B2
7843158 Prisco Nov 2010 B2
7865266 Moll et al. Jan 2011 B2
7865269 Prisco et al. Jan 2011 B2
7886743 Cooper et al. Feb 2011 B2
7899578 Prisco et al. Mar 2011 B2
7907166 Lamprecht et al. Mar 2011 B2
7935130 Williams May 2011 B2
7963913 Devengenzo et al. Jun 2011 B2
7983793 Toth et al. Jul 2011 B2
8002767 Sanchez Aug 2011 B2
8004229 Nowlin et al. Aug 2011 B2
8012170 Whitman et al. Sep 2011 B2
8054752 Druke et al. Nov 2011 B2
8062288 Cooper et al. Nov 2011 B2
8079950 Stern et al. Dec 2011 B2
8100133 Mintz et al. Jan 2012 B2
8108072 Zhao et al. Jan 2012 B2
8120301 Goldberg et al. Feb 2012 B2
8142447 Cooper et al. Mar 2012 B2
8147503 Zhao et al. Apr 2012 B2
8151661 Schena et al. Apr 2012 B2
8155479 Hoffman et al. Apr 2012 B2
8182469 Anderson et al. May 2012 B2
8202278 Orban, III et al. Jun 2012 B2
8206406 Orban, III Jun 2012 B2
8210413 Whitman et al. Jul 2012 B2
8216250 Orban, III et al. Jul 2012 B2
8220468 Cooper et al. Jul 2012 B2
8256319 Cooper et al. Sep 2012 B2
8285517 Sillman et al. Oct 2012 B2
8315720 Mohr et al. Nov 2012 B2
8335590 Costa et al. Dec 2012 B2
8347757 Duval Jan 2013 B2
8374723 Zhao et al. Feb 2013 B2
8418073 Mohr et al. Apr 2013 B2
8419717 Diolaiti et al. Apr 2013 B2
8423182 Robinson et al. Apr 2013 B2
8452447 Nixon May 2013 B2
8454585 Whitman Jun 2013 B2
8499992 Whitman et al. Aug 2013 B2
8508173 Goldberg et al. Aug 2013 B2
8528440 Morley et al. Sep 2013 B2
8529582 Devengenzo et al. Sep 2013 B2
8540748 Murphy et al. Sep 2013 B2
8551116 Julian et al. Oct 2013 B2
8562594 Cooper et al. Oct 2013 B2
8594841 Zhao et al. Nov 2013 B2
8597182 Stein et al. Dec 2013 B2
8597280 Cooper et al. Dec 2013 B2
8600551 Itkowitz et al. Dec 2013 B2
8608773 Tierney et al. Dec 2013 B2
8620473 Diolaiti et al. Dec 2013 B2
8624537 Nowlin et al. Jan 2014 B2
8634957 Toth et al. Jan 2014 B2
8638056 Goldberg et al. Jan 2014 B2
8638057 Goldberg et al. Jan 2014 B2
8644988 Prisco et al. Feb 2014 B2
8666544 Moll et al. Mar 2014 B2
8668638 Donhowe et al. Mar 2014 B2
8746252 McGrogan et al. Jun 2014 B2
8749189 Nowlin et al. Jun 2014 B2
8749190 Nowlin et al. Jun 2014 B2
8758352 Cooper et al. Jun 2014 B2
8761930 Nixon Jun 2014 B2
8768516 Diolaiti et al. Jul 2014 B2
8786241 Nowlin et al. Jul 2014 B2
8790243 Cooper et al. Jul 2014 B2
8808164 Hoffman et al. Aug 2014 B2
8816628 Nowlin et al. Aug 2014 B2
8821480 Burbank Sep 2014 B2
8823308 Nowlin et al. Sep 2014 B2
8827989 Niemeyer Sep 2014 B2
8838270 Druke et al. Sep 2014 B2
8852174 Burbank Oct 2014 B2
8858547 Brogna Oct 2014 B2
8862268 Robinson et al. Oct 2014 B2
8864751 Prisco et al. Oct 2014 B2
8864752 Piolaiti et al. Oct 2014 B2
8903546 Diolaiti et al. Dec 2014 B2
8903549 Itkowitz et al. Dec 2014 B2
8911428 Cooper et al. Dec 2014 B2
8912746 Reid et al. Dec 2014 B2
8944070 Guthart Feb 2015 B2
8989903 Weir et al. Mar 2015 B2
9002518 Manzo Apr 2015 B2
9014856 Manzo et al. Apr 2015 B2
9016540 Whitman et al. Apr 2015 B2
9019345 O'Grady et al. Apr 2015 B2
9043027 Durant et al. May 2015 B2
9050120 Swarup et al. Jun 2015 B2
9055961 Manzo et al. Jun 2015 B2
9068628 Solomon et al. Jun 2015 B2
9078684 Williams Jul 2015 B2
9084623 Gomez et al. Jul 2015 B2
9095362 Dachs, II et al. Aug 2015 B2
9096033 Holop et al. Aug 2015 B2
9101381 Burbank et al. Aug 2015 B2
9113877 Whitman et al. Aug 2015 B1
9138284 Krom et al. Sep 2015 B2
9144456 Rosa et al. Sep 2015 B2
9198730 Prisco et al. Dec 2015 B2
9204923 Manzo et al. Dec 2015 B2
9226648 Saadat et al. Jan 2016 B2
9226750 Weir et al. Jan 2016 B2
9226761 Burbank Jan 2016 B2
9232984 Guthart et al. Jan 2016 B2
9241766 Duque et al. Jan 2016 B2
9241767 Prisco et al. Jan 2016 B2
9241769 Larkin et al. Jan 2016 B2
9259275 Burbank Feb 2016 B2
9259277 Rogers et al. Feb 2016 B2
9259281 Griffiths et al. Feb 2016 B2
9259282 Azizian et al. Feb 2016 B2
9261172 Solomon et al. Feb 2016 B2
9265567 Orban, III et al. Feb 2016 B2
9265584 Itkowitz et al. Feb 2016 B2
9283049 Diolaiti et al. Mar 2016 B2
9301811 Goldberg et al. Apr 2016 B2
9314307 Richmond et al. Apr 2016 B2
9317651 Nixon Apr 2016 B2
9345546 Toth et al. May 2016 B2
9393017 Flanagan et al. Jul 2016 B2
9402689 Prisco et al. Aug 2016 B2
9417621 Diolaiti Aug 2016 B2
9424303 Hoffman et al. Aug 2016 B2
9433418 Whitman et al. Sep 2016 B2
9446517 Burns et al. Sep 2016 B2
9452020 Griffiths et al. Sep 2016 B2
9474569 Manzo et al. Oct 2016 B2
9480533 Devengenzo et al. Nov 2016 B2
9503713 Zhao et al. Nov 2016 B2
9550300 Danitz et al. Jan 2017 B2
9554859 Nowlin et al. Jan 2017 B2
9566124 Prisco et al. Feb 2017 B2
9579164 Itkowitz et al. Feb 2017 B2
9585641 Cooper et al. Mar 2017 B2
9615883 Schena et al. Apr 2017 B2
9623563 Nixon Apr 2017 B2
9623902 Griffiths et al. Apr 2017 B2
9629520 Diolaiti Apr 2017 B2
9662177 Weir et al. May 2017 B2
9664262 Donlon et al. May 2017 B2
9687312 Dachs, II et al. Jun 2017 B2
9700334 Hinman et al. Jul 2017 B2
9718190 Larkin et al. Aug 2017 B2
9730719 Brisson et al. Aug 2017 B2
9737199 Pistor et al. Aug 2017 B2
9795446 DiMaio et al. Oct 2017 B2
9797484 Solomon et al. Oct 2017 B2
9801690 Larkin et al. Oct 2017 B2
9814530 Weir et al. Nov 2017 B2
9814536 Goldberg et al. Nov 2017 B2
9814537 Itkowitz et al. Nov 2017 B2
9820823 Richmond et al. Nov 2017 B2
9827059 Robinson et al. Nov 2017 B2
9830371 Hoffman et al. Nov 2017 B2
9839481 Blumenkranz et al. Dec 2017 B2
9839487 Dachs, II Dec 2017 B2
9850994 Schena Dec 2017 B2
9855102 Blumenkranz Jan 2018 B2
9855107 Labonville et al. Jan 2018 B2
9872737 Nixon Jan 2018 B2
9877718 Weir et al. Jan 2018 B2
9883920 Blumenkranz Feb 2018 B2
9888974 Niemeyer Feb 2018 B2
9895813 Blumenkranz et al. Feb 2018 B2
9901408 Larkin Feb 2018 B2
9918800 Itkowitz et al. Mar 2018 B2
9943375 Blumenkranz et al. Apr 2018 B2
9948852 Lilagan et al. Apr 2018 B2
9949798 Weir Apr 2018 B2
9949802 Cooper Apr 2018 B2
9952107 Blumenkranz et al. Apr 2018 B2
9956044 Gomez et al. May 2018 B2
9980778 Ohline et al. May 2018 B2
10008017 Itkowitz et al. Jun 2018 B2
10028793 Griffiths et al. Jul 2018 B2
10033308 Chaghajerdi et al. Jul 2018 B2
10034719 Richmond et al. Jul 2018 B2
10052167 Au et al. Aug 2018 B2
10085811 Weir et al. Oct 2018 B2
10092344 Mohr et al. Oct 2018 B2
10123844 Nowlin Nov 2018 B2
10188471 Brisson Jan 2019 B2
10201390 Swarup et al. Feb 2019 B2
10213202 Flanagan et al. Feb 2019 B2
10258416 Mintz et al. Apr 2019 B2
10278782 Jarc et al. May 2019 B2
10278783 Itkowitz et al. May 2019 B2
10282881 Itkowitz et al. May 2019 B2
10335242 Devengenzo et al. Jul 2019 B2
10405934 Prisco et al. Sep 2019 B2
10433922 Itkowitz et al. Oct 2019 B2
10464219 Robinson et al. Nov 2019 B2
10485621 Morrissette et al. Nov 2019 B2
10500004 Hanuschik et al. Dec 2019 B2
10500005 Weir et al. Dec 2019 B2
10500007 Richmond et al. Dec 2019 B2
10507066 DiMaio et al. Dec 2019 B2
10510267 Jarc et al. Dec 2019 B2
10524871 Liao Jan 2020 B2
10548459 Itkowitz et al. Feb 2020 B2
10575909 Robinson et al. Mar 2020 B2
10592529 Hoffman et al. Mar 2020 B2
10595946 Nixon Mar 2020 B2
10881469 Robinson Jan 2021 B2
10881473 Itkowitz et al. Jan 2021 B2
10898188 Burbank Jan 2021 B2
10898189 McDonald, II Jan 2021 B2
10905506 Itkowitz et al. Feb 2021 B2
10912544 Brisson et al. Feb 2021 B2
10912619 Jarc et al. Feb 2021 B2
10918387 Duque et al. Feb 2021 B2
10918449 Solomon et al. Feb 2021 B2
10932873 Griffiths et al. Mar 2021 B2
10932877 Devengenzo et al. Mar 2021 B2
10939969 Swarup et al. Mar 2021 B2
10939973 DiMaio et al. Mar 2021 B2
10952801 Miller et al. Mar 2021 B2
10965933 Jarc Mar 2021 B2
10966742 Rosa et al. Apr 2021 B2
10973517 Wixey Apr 2021 B2
10973519 Weir et al. Apr 2021 B2
10984567 Itkowitz et al. Apr 2021 B2
10993773 Cooper et al. May 2021 B2
10993775 Cooper et al. May 2021 B2
11000331 Krom et al. May 2021 B2
11013567 Wu et al. May 2021 B2
11020138 Ragosta Jun 2021 B2
11020191 Diolaiti et al. Jun 2021 B2
11020193 Wixey et al. Jun 2021 B2
11026755 Weir et al. Jun 2021 B2
11026759 Donlon et al. Jun 2021 B2
11040189 Vaders et al. Jun 2021 B2
11045077 Stern et al. Jun 2021 B2
11045274 Dachs, II et al. Jun 2021 B2
11058501 Tokarchuk et al. Jul 2021 B2
11076925 DiMaio et al. Aug 2021 B2
11090119 Burbank Aug 2021 B2
11096687 Flanagan et al. Aug 2021 B2
11098803 Duque et al. Aug 2021 B2
11109925 Cooper et al. Sep 2021 B2
11116578 Hoffman et al. Sep 2021 B2
11129683 Steger et al. Sep 2021 B2
11135029 Suresh et al. Oct 2021 B2
11147552 Burbank et al. Oct 2021 B2
11147640 Jarc et al. Oct 2021 B2
11154373 Abbott et al. Oct 2021 B2
11154374 Hanuschik et al. Oct 2021 B2
11160622 Goldberg et al. Nov 2021 B2
11160625 Wixey et al. Nov 2021 B2
11161243 Rabindran et al. Nov 2021 B2
11166758 Mohr et al. Nov 2021 B2
11166770 DiMaio et al. Nov 2021 B2
11166773 Ragosta et al. Nov 2021 B2
11173597 Rabindran et al. Nov 2021 B2
11185378 Weir et al. Nov 2021 B2
11191596 Thompson et al. Dec 2021 B2
11197729 Thompson et al. Dec 2021 B2
11213360 Hourtash et al. Jan 2022 B2
11221863 Azizian et al. Jan 2022 B2
11234700 Ragosta et al. Feb 2022 B2
11241274 Vaders et al. Feb 2022 B2
11241290 Waterbury et al. Feb 2022 B2
11259870 DiMaio et al. Mar 2022 B2
11259884 Burbank Mar 2022 B2
11272993 Gomez et al. Mar 2022 B2
11272994 Saraliev et al. Mar 2022 B2
11291442 Wixey et al. Apr 2022 B2
11291513 Manzo et al. Apr 2022 B2
20110188775 Sun Aug 2011 A1
20180308225 Shah et al. Oct 2018 A1
20200250797 Park Aug 2020 A1
Foreign Referenced Citations (5)
Number Date Country
102982513 Mar 2013 CN
104392417 Mar 2015 CN
106023092 Oct 2016 CN
106 846 259 Jun 2017 CN
109754372 May 2019 CN
Non-Patent Literature Citations (4)
Entry
Extended European Search Report for application No. 19945282.2 dated May 9, 2023.
Kaiming He et al: “Single Image Haze Removal Using Dark Channel Prior”, IEEE Transactions on Pattern Analysis and Machine Intelligence, IEEE Computer Society, USA, vol. 33, No. 12, Dec. 1, 2011 , pp. 2341-2353.
Tchaka et al: “Chromaticity based smoke removal in endoscopic images”, Proceedings of SPIE, vol. 10133, Feb. 24, 2017, p. 101331M.
International Search Report mailed May 29, 2020 and Written Opinion completed May 19, 2020 corresponding to counterpart Int'l Patent Application PCT/CN2019/105311.
Related Publications (1)
Number Date Country
20220392022 A1 Dec 2022 US