Real-time Marine Snow Noise Removal from Underwater Video

Information

  • Patent Application
  • Publication Number: 20240119607
  • Date Filed: June 19, 2023
  • Date Published: April 11, 2024
Abstract
Optical flow refers to the pattern of apparent motion of objects, surfaces, and edges in a visual scene caused by the relative motion between an observer and a scene. Optical flow algorithms can be used to detect and delineate independently moving objects, even in the presence of camera motion. The present invention uses optical flow algorithms to detect and remove marine snow particles from live video. Portions of an image scene which are identified as marine snow are reconstructed in a manner intended to reveal underwater scenery which had been occluded by the marine snow. Pixel locations within the regions of marine snow are replaced with new pixel values that are determined based on either historical data for each pixel or a mathematical operation, such as one which uses data from neighboring pixels.
Description
FIELD OF THE INVENTION

The present invention relates to real-time video enhancement.


BACKGROUND

Underwater video is of broad interest in areas such as equipment inspections, surveillance, search & rescue, mine countermeasures, and deep-sea exploration. However, the acquisition of underwater images presents a set of challenges not encountered with images captured in air. In general, underwater visibility tends to be poor due to conditions including limited natural light at depth, the presence of suspended particles, and the consequent scattering of light in all directions. Thus, underwater imagery suffers from poor quality and loss of conveyed information resulting from contrast and color decay, light scattering, blur, haze, and various types of noise. Numerous image processing methods can filter out these unwanted effects. However, marine snow is a special type of noise which can profoundly degrade the quality of underwater images and is difficult to filter out.


Marine snow is a phenomenon caused by light backscattering from small organic and mineral particles and air bubbles. These particles tend to grow as they sink through the water and appear in images as bright spots of various shapes and sizes, which resemble snowflakes.


The problem of filtering out marine snow has been difficult to address because the particles can be quite large and have structural and lighting characteristics that make them fundamentally different from other types of noise encountered in digital images. Thus, there is a need for a method for marine snow filtering, such as the one presented herein.


SUMMARY OF INVENTION

The present invention seeks to provide a solution to this problem by using optical flow algorithms to remove visual occlusions caused by marine snow in live video.


These and other aspects, objects, features and advantages of the present invention are specifically set forth in, or will become apparent from, the following detailed description of an exemplary embodiment of the invention.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a flow diagram of the video processing platform architecture.



FIG. 2 is an image of a live underwater video stream.



FIG. 3 is an image of a Synthetic Video Frame.



FIG. 4 is a diagram of the Dynamic Chroma Mask.



FIG. 5 is an image of an Augmented Live Video Frame.



FIG. 6 is a flow diagram of the Dynamic Chroma Mask.



FIG. 7 is a flow diagram of the Synthetic Video Frame.





DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT

In a preferred embodiment, the present invention uses chroma key masks created from optical flow algorithms to remove visual occlusions caused by marine snow in live video. Chroma key is a technique used in video to replace a portion of an image with a new image. For example, chroma key is commonly used in the film industry to replace a monochrome background, such as a green screen, with a different setting. The preferred embodiment uses a chroma key operation to merge pixels from the live video stream with pixels from a synthetic video frame to deliver a video frame with no occluded pixels. The live video frame contains clusters of “snow pixels” that are extracted using optical flow algorithms to create a chroma mask for use in extracting unoccluded live pixels. In essence, this mask replaces particles of marine snow within each video frame with similarly shaped regions of a single color.
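By way of illustration only (this sketch is not part of the original disclosure), the chroma key merge described above can be expressed in a few lines of NumPy; the pure-green key color is an assumption chosen for the example:

```python
import numpy as np

# Assumed chroma key color (pure green in BGR); any color absent from the
# underwater scene would serve.
KEY_COLOR = np.array([0, 255, 0], dtype=np.uint8)

def chroma_key_merge(live, synthetic, mask):
    """Per-pixel selection: where the mask carries the key color (a snow
    region), take the synthetic pixel; everywhere else keep the live pixel."""
    is_key = np.all(mask == KEY_COLOR, axis=-1)           # H x W boolean
    return np.where(is_key[..., None], synthetic, live)   # merged H x W x 3 frame
```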


The synthetic video frame contains live unoccluded pixels extracted from the current video frame and unoccluded pixels carried forward from previous frame(s). The merging operation is controlled by a dynamic chroma mask created for each frame by optically identifying and tracking the “snow” particles. The chroma mask selects a live pixel if it is not hidden behind a “snow” pixel; otherwise, it selects the corresponding pixel from the synthetic video frame.


The present invention provides a method for removing certain visual occlusions referred to as marine snow from live underwater video using optical flow algorithms. The preferred embodiment uses optical flow algorithms to create chroma key masks that are used to remove marine snow and merge pixels from the live video stream with pixels from a synthetic video frame to deliver a video with no occluded pixels. The method is described in detail in the diagrams and associated text that follow.


For the purposes of this description, image pixels can be defined to exist in one of the following four states: 1) a Live pixel is visible and unoccluded; 2) a Snow pixel is part of an unwanted particle floating in the frame; 3) a Hidden pixel is one that is hidden by snow; 4) a History pixel is one that was previously visible but is now occluded in the current frame.
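For illustration, these four states can be represented directly in code; this enumeration is a sketch and not part of the disclosure:

```python
from enum import Enum, auto

class PixelState(Enum):
    LIVE = auto()     # visible and unoccluded in the current frame
    SNOW = auto()     # part of an unwanted floating particle
    HIDDEN = auto()   # scene pixel currently occluded by snow
    HISTORY = auto()  # previously visible; its last value is carried forward
```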


There are five basic steps comprising the method for removing marine snow (a code sketch follows the list):

    • 1) The live video frame contains clusters of “snow pixels” that are extracted using optical flow algorithms to create a chroma mask for use in extracting unoccluded live pixels.
    • 2) The synthetic video frame contains live unoccluded pixels extracted from the current video frame and unoccluded pixels carried forward from previous frame(s).
    • 3) The merging operation is controlled by a dynamic chroma mask created for each frame by optically identifying and tracking the “snow” particles.
    • 4) The chroma mask selects a live pixel if it is not hidden behind a “snow” pixel; otherwise, it selects the corresponding pixel from the synthetic video frame.
    • 5) A video frame with no occluded pixels is created as the chroma key operation merges pixels from the live video stream with pixels from the synthetic video frame.
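The following minimal sketch (not part of the disclosure) strings the five steps together, assuming OpenCV's Farneback dense optical flow as the optical flow algorithm and a simple residual-motion threshold as the snow detector; the threshold value is an assumed tuning parameter:

```python
import cv2
import numpy as np

# Assumed tuning parameter: residual motion (pixels/frame) above which a
# pixel is treated as an independently moving particle.
SNOW_FLOW_THRESH = 3.0

def process_frame(live, prev_gray, synthetic):
    """One iteration of the five-step method (illustrative sketch)."""
    gray = cv2.cvtColor(live, cv2.COLOR_BGR2GRAY)

    # Steps 1 and 3: dense optical flow, then isolate motion that deviates
    # from the dominant (camera/background) motion to find snow clusters.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    background = np.median(flow.reshape(-1, 2), axis=0)   # global motion
    residual = np.linalg.norm(flow - background, axis=-1)
    snow = residual > SNOW_FLOW_THRESH   # boolean mask; plays the role of
                                         # the chroma mask in this sketch

    # Steps 4 and 5: merge -- live pixel where unoccluded, synthetic
    # (history) pixel where the mask marks snow.
    augmented = np.where(snow[..., None], synthetic, live)

    # Step 2, for the next frame: inverted keying -- only unoccluded live
    # pixels update the synthetic frame; occluded locations carry forward.
    synthetic[~snow] = live[~snow]

    return augmented, gray, synthetic
```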



FIG. 1 shows an example architecture of a video processing platform 100 that could be used to implement the method for removing marine snow from video streams. It shows a video input 114 for receiving a video stream from a connected video camera 110 or video archive 108. Also shown are a connected video display, a DVR storage module, and a network module to which the processed video is output 116. The video processing platform 100 integrates a CPU 102 for command and control, a GPU 104 for image processing through a user interface 112, and an FPGA 106 to run the optical flow algorithms 118 in real time. A video controller (not shown) is also included, which provides the capability to fuse incoming video with the chroma masks and to encode/decode the IP video streams.
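As a software stand-in for the platform's video input 114 and output 116 (the CPU/GPU/FPGA partitioning of the hardware is not modeled), a capture-process-write loop might look as follows; the device index, codec, frame rate, and resolution are assumptions, and process_frame is the sketch given above:

```python
import cv2

# Assumed capture device and output settings, for illustration only.
cap = cv2.VideoCapture(0)                         # connected video camera 110
fourcc = cv2.VideoWriter_fourcc(*"mp4v")
out = cv2.VideoWriter("augmented.mp4", fourcc, 30.0, (1280, 720))

ok, frame = cap.read()                            # seed with the first frame
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
synthetic = frame.copy()                          # initial synthetic frame

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # process_frame is the five-step sketch above; frames are assumed to
    # match the writer's 1280x720 size.
    augmented, prev_gray, synthetic = process_frame(frame, prev_gray, synthetic)
    out.write(augmented)

cap.release()
out.release()
```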



FIG. 2 shows a live underwater video stream that contains Snow Pixels which hide Live Pixels.



FIG. 3 shows a Synthetic Video Frame, which consists of unoccluded Live Pixels from the current video frame and History Pixels carried forward from previous frames. It contains no Snow Pixels because each has been replaced with a History Pixel.



FIG. 4 shows a Dynamic Chroma Mask 400 created for each frame using Optical Flow algorithms 402 to identify and track floating pixel clusters (“Snow Pixels”) 404. Snow Pixel clusters are geo-positioned on the mask in the chroma key 406 color.
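A minimal sketch (not from the application) of building the mask itself: connected-component labeling stands in for the cluster identification, and the key color and minimum cluster area are assumptions:

```python
import cv2
import numpy as np

KEY_COLOR = (0, 255, 0)   # assumed chroma key color (pure green in BGR)

def make_chroma_mask(snow, min_area=4):
    """Paint each detected snow cluster onto a blank mask in the key color.

    `snow` is the boolean per-pixel motion mask from the optical flow stage;
    connected-component labeling groups it into the floating clusters that
    are tracked, and tiny components are dropped as flow noise (min_area is
    an assumed parameter)."""
    n, labels, stats, _ = cv2.connectedComponentsWithStats(
        snow.astype(np.uint8), connectivity=8)
    mask = np.zeros((*snow.shape, 3), dtype=np.uint8)
    for i in range(1, n):                  # label 0 is the background
        if stats[i, cv2.CC_STAT_AREA] >= min_area:
            mask[labels == i] = KEY_COLOR  # position the cluster on the mask
    return mask
```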



FIG. 5 shows an Augmented Live Video Frame wherein Snow Pixels have been replaced by History Pixels selected from the Synthetic Video Frame.



FIG. 6 shows how the Dynamic Chroma Mask 600 is created for each frame to filter out the Snow Pixels from the live video. Optical Flow algorithms 602 identify and track floating pixel clusters (“Snow Pixels”) 604, and the Snow Pixel clusters are geo-positioned on the mask in the chroma key color. The resulting chroma mask is used to filter out clusters of Snow Pixels from the live video 606.



FIG. 7 shows how the Synthetic Video Frame 700 is updated as the last operation to prepare for the next frame. When the Synthetic Video Frame is updated through a Dynamic Chroma Mask 702 using inverted chroma keying 704, only unoccluded pixels 706 are extracted from the live video frame 708 to update the synthetic frame 700. The pixels in the synthetic frame that are currently occluded by snow pixel clusters in the live frame remain and carry forward.
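The inverted keying update can be sketched as follows (illustrative only; the key color is the same assumption as above):

```python
import numpy as np

KEY_COLOR = np.array([0, 255, 0], dtype=np.uint8)  # assumed key color

def update_synthetic(synthetic, live, mask):
    """Inverted chroma keying: copy only the unoccluded live pixels into the
    synthetic frame; pixels under snow clusters remain and carry forward."""
    unoccluded = ~np.all(mask == KEY_COLOR, axis=-1)
    synthetic[unoccluded] = live[unoccluded]
    return synthetic
```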


In certain embodiments, pixels in the synthetic frame that are not from the current frame may comprise pixels convolved from neighboring pixels instead of carrying forward the last instance where the pixel was visible.
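As one possible reading of this alternative (a sketch, not the disclosed method), OpenCV's Telea inpainting can stand in for estimating occluded pixels from their spatial neighbors:

```python
import cv2
import numpy as np

def fill_from_neighbors(frame, snow):
    """Fill occluded pixels from spatial neighbors instead of history.

    Telea inpainting (cv2.INPAINT_TELEA) is used here as one convenient
    stand-in for the neighborhood convolution mentioned in the text."""
    hole_mask = snow.astype(np.uint8) * 255   # nonzero = pixels to fill
    return cv2.inpaint(frame, hole_mask, 3, cv2.INPAINT_TELEA)
```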


While various embodiments of the disclosed technology have been described above, it should be understood that they have been presented by way of example only, and not of limitation. Likewise, the various diagrams may depict an example architectural or other configuration for the disclosed technology, which is done to aid in understanding the features and functionality that may be included in the disclosed technology. The disclosed technology is not restricted to the illustrated example architectures or configurations, but the desired features may be implemented using a variety of alternative architectures and configurations. Indeed, it will be apparent to one of skill in the art how alternative functional, logical or physical partitioning and configurations may be implemented to achieve the desired features of the technology disclosed herein. Also, a multitude of different constituent module names other than those depicted herein may be applied to the various partitions. Additionally, with regard to flow diagrams, operational descriptions and method claims, the order in which the steps are presented herein shall not mandate that various embodiments be implemented to perform the recited functionality in the same order unless the context dictates otherwise.


Although the disclosed technology is described above in terms of various exemplary embodiments and implementations, it should be understood that the various features, aspects and functionality described in one or more of the individual embodiments are not limited in their applicability to the particular embodiment with which they are described, but instead may be applied, alone or in various combinations, to one or more of the other embodiments of the disclosed technology, whether or not such embodiments are described and whether or not such features are presented as being a part of a described embodiment. Thus, the breadth and scope of the technology disclosed herein should not be limited by any of the above-described exemplary embodiments.


Terms and phrases used in this document, and variations thereof, unless otherwise expressly stated, should be construed as open ended as opposed to limiting. As examples of the foregoing: the term “including” should be read as meaning “including, without limitation” or the like; the term “example” is used to provide exemplary instances of the item in discussion, not an exhaustive or limiting list thereof; the terms “a” or “an” should be read as meaning “at least one,” “one or more” or the like; and adjectives such as “conventional,” “traditional,” “normal,” “standard,” “known” and terms of similar meaning should not be construed as limiting the item described to a given time period or to an item available as of a given time, but instead should be read to encompass conventional, traditional, normal, or standard technologies that may be available or known now or at any time in the future. Likewise, where this document refers to technologies that would be apparent or known to one of ordinary skill in the art, such technologies encompass those apparent or known to the skilled artisan now or at any time in the future.

Claims
  • 1. An optical flow sensor to detect the movement of water while immersed under the surface of said water, comprising: a camera for detecting optical flow events within said water; an optical flow comparison process wherein independently moving objects in the presence of said water are isolated against a backdrop of water flow not associated with said moving objects, and wherein movement of said camera is subtracted from movement associated with said objects; and a process performed by a computer on video signals produced by said camera wherein marine snow noise is identified and removed from an overall underwater image, so that said overall underwater image consists of said moving objects, wherein said overall underwater image is relatively clear for human observation, and wherein underwater scenery otherwise occluded by marine snow noise becomes more clearly visible for human observation.
  • 2. The optical flow sensor of claim 1, further comprising said camera for detecting optical flow presenting a video input and output stream.
  • 3. The optical flow sensor of claim 1, wherein said video signals produced are shown in a video display.
  • 4. The optical flow sensor of claim 1, wherein said overall underwater image is processed through a user interface and an FPGA to run an algorithm using said optical flow sensor to clarify said underwater image in real time.
  • 5. The optical flow sensor of claim 4, further comprising said overall underwater image featuring a synthetic video frame with unoccluded live pixels from a current video frame and historical pixels from previous frames.
  • 6. The optical flow sensor of claim 1, wherein said process entails creating a dynamic chroma mask for each frame using said algorithm from said optical flow sensor to identify and track floating pixel clusters.
  • 7. The process of claim 6, wherein said pixel clusters are geo-positioned on said chroma mask.
  • 8. A system for detecting the movement of water with an optical flow sensor while immersed under the surface of said water, the system comprising: a live video stream derived from a video recording device and camera; a video processing platform with an integrated central processing unit for command and control, presented by way of a video display connected to said video recording device and said camera, and a network module to process a video output; a DVR storage module for storing and retrieving video data; an image processing module through a user interface on said video processing platform with a designated FPGA to run an optical flow algorithm; and a video controller for managing live frames retrieved from said live video stream.
  • 9. The system of claim 8, wherein said image processing module uses a dynamic chroma mask for said live frames retrieved from said live video stream using said optical flow algorithm to identify and track floating pixel clusters.
  • 10. The system of claim 9, wherein said image processing module utilizes a synthetic video frame for said dynamic chroma mask for inverted chroma keying to clarify a video frame and clarify occlusion of said floating pixel clusters.
  • 11. The system of claim 10, wherein said floating pixel clusters are geo-positioned on said dynamic chroma mask in chroma key color.
  • 12. The system of claim 10, wherein pixels in said synthetic video frame comprise pixels convolved from neighboring pixels.
  • 13. The system of claim 10, wherein said synthetic video frame has unoccluded live pixels from a current video frame and historical pixels from previous frames that replace snow pixels.
  • 14. The system of claim 8, wherein said image processing module employs a graphic processing unit for generating a visual display of said live video stream and said video output.
  • 15. A method for detecting the movement of water with an optical flow sensor while immersed under the surface of said water, the method comprising: capturing a video stream by way of a video recording device and camera in an aquatic environment; transmitting said video stream to a video processing platform with an integrated central processing unit for command and control and a graphic processing unit for presenting a visual display of occluded video frames; storing data of said video stream to a DVR storage module; processing said video stream on said video processing platform, by way of a video display, over a network module to process a video output; processing video frames of said video stream on an image processing module through a user interface on said video processing platform with a designated FPGA to run an optical flow algorithm; managing said video frames by way of a video controller; and fusing incoming videos with chroma masks to encode and decode internet protocol video streams over said network module.
  • 16. The method of claim 15, wherein said image processing module uses a dynamic chroma mask for said video frames retrieved from said video stream using said optical flow algorithm to identify and track floating pixel clusters.
  • 17. The method of claim 16, wherein said image processing module utilizes a synthetic video frame for said dynamic chroma mask for inverted chroma keying to clarify a video frame and clarify occlusion of said floating pixel clusters.
  • 18. The method of claim 16, wherein said floating pixel clusters are geo-positioned on said chroma mask in chroma key color.
  • 19. The method of claim 17, wherein pixels in said synthetic video frame comprise pixels convolved from neighboring pixels.
  • 20. The method of claim 19, wherein said synthetic video frame has unoccluded live pixels from a current video frame and historical pixels from previous frames that replace snow pixels.
PRIORITY CLAIMS

This application is a continuation of U.S. patent application Ser. No. 17/184,076, filed Feb. 24, 2021, which claims the benefit of U.S. Provisional Application Ser. No. 62/981,157 filed on Feb. 25, 2020, the contents of which are incorporated herein by reference.

Provisional Applications (1)
Number Date Country
62981157 Feb 2020 US
Continuations (1)
Number Date Country
Parent 17184076 Feb 2021 US
Child 18211566 US