This application claims the benefit, under 35 U.S.C. §119, of EP Patent Application 11306621.1, filed 7 Dec. 2011.
The invention relates to a method and an apparatus for processing occlusions in motion estimation, i.e. the estimation of motion vectors between two frames in those areas that are visible in one frame only and are occluded in the other frame.
During motion estimation between two frames of a sequence of frames there is sometimes the problem that certain areas of one frame are not visible in the other frame. Typically this is the case for pixels belonging to an object in the background, which in one of the frames are hidden by pixels belonging to a foreground object. In such cases it is not possible to determine a motion vector for the occluded pixels directly from the two frames. Most of today's motion estimators simply assume that these pixels are present in the other frame and provide motion vectors for them that optimize a selected criterion. However, these motion vectors generally do not correspond to the correct motion vectors.
It is thus an object of the invention to propose a solution for providing improved motion vectors in case of occlusions.
According to the invention, a method for processing an occlusion in a forward motion field of a second frame relative to a first frame comprises the steps of:
Accordingly, an apparatus for processing an occlusion in a forward motion field of a second frame relative to a first frame comprises:
The proposed solution considers neighboring motion fields in order to correct the motion field in the areas that are occluded in the other frame. In this way more reliable motion vectors are generated for occluded areas with only limited additional processing cost. The third frame, the second frame, and the first frame, from which the neighboring motion fields are derived, are either temporally successive frames or temporally non-successive frames. In the latter case the relative temporal distances between the frames are taken into account when filling the forward motion field.
Advantageously, the motion estimator further determines a backward motion field of the first frame relative to the second frame, which is preferably stored in a memory. This backward motion field simplifies the detection of occluded areas.
Favorably, a corrected previous backward motion field is determined by filling the previous backward motion field in areas of the second frame that are occluded in the third frame using motion vectors of the forward motion field of the second frame. In the case of temporally non-successive frames the relative temporal distances between the frames are taken into account when filling the previous backward motion field. In this way the earlier backward motion fields are also further improved as later motion fields are processed.
Advantageously, an acceleration between the previous backward motion field and the forward motion field is determined. This acceleration is then taken into account when filling the forward motion field and/or the previous backward motion field. The determination of acceleration ensures that the motion filtering converges faster despite the presence of acceleration.
Preferably, an interpolation filter is applied to the filled motion vectors, e.g. a bilateral filter. This post-processing of the filled motion field reduces noise in the motion map and ensures that the filled motion field is more homogeneous.
For a better understanding the invention shall now be explained in more detail in the following description with reference to the figures. It is understood that the invention is not limited to this exemplary embodiment and that specified features can also expediently be combined and/or modified without departing from the scope of the present invention as defined in the appended claims. In the figures:
In order to be able to address occlusions, the areas in the current frame that are occluded in the other frame first need to be identified. A known method to identify such areas is to perform a double motion estimation, i.e. to estimate motion in both directions between frames t and t−1. This results in a backward motion field for frame t, as in
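How the estimator marks these holes is not detailed above; one common heuristic, shown here purely for illustration (it is not necessarily the detection used in the described method), is a forward/backward consistency check on the two estimated fields:

```python
import numpy as np

def detect_occlusion_holes(d_backward_t, d_forward_t_minus_1, tol=1.0):
    """Mark pixels of frame t without a consistent counterpart in frame t-1.

    Illustrative heuristic only (not necessarily the patent's detection):
    a pixel whose backward vector does not roughly cancel the forward vector
    found at its target position in frame t-1 is treated as occluded.
    d_backward_t        : (H, W, 2) backward field t/t-1, vectors as (dy, dx).
    d_forward_t_minus_1 : (H, W, 2) forward field t-1/t.
    """
    h, w = d_backward_t.shape[:2]
    holes = np.zeros((h, w), dtype=bool)
    for r in range(h):
        for c in range(w):
            dy, dx = d_backward_t[r, c]
            r1, c1 = int(round(r + dy)), int(round(c + dx))
            if not (0 <= r1 < h and 0 <= c1 < w):
                holes[r, c] = True          # vector points outside the image
                continue
            fy, fx = d_forward_t_minus_1[r1, c1]
            holes[r, c] = np.hypot(dy + fy, dx + fx) > tol
    return holes
```

The holes of the forward field of frame t−1 can be obtained symmetrically by swapping the roles of the two fields.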
In order to correct the motion field in the areas that are occluded in the other frame, other neighboring motion fields are considered. In
The second further motion field is the backward motion field at frame t−1 that describes the vectors of the pixels in frame t−1 with regard to the pixels in frame t−2, i.e. the motion field t−1/t−2. Pixels in frame t−1 now have a link to the past through the backward map t−1/t−2 and a link to the future through the forward map t−1/t. This is schematically illustrated in
The areas in frame t that are occluded in frame t−1 as well as the areas in frame t−1 that are occluded in frame t correspond to holes in the associated motion maps. These holes in frame t are filled by simply assigning to a backward motion vector dBf(p,t) of pixel (p,t) the vector opposite to the forward motion vector dF(p,t). Similarly, the holes in frame t−1 can be filled by simply assigning to the forward motion vector dFf(p,t−1) of pixel (p,t−1) the vector opposite to the backward motion vector dB(p,t−1):
dBf(p,t)=−dF(p,t) (1)
dFf(p,t−1)=−dB(p,t−1) (2)
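As a minimal illustration of equations (1) and (2), the following Python sketch fills such holes in a pair of dense motion fields; the (H, W, 2) array layout, the function name and the boolean hole masks are assumptions made for the example, not part of the described method:

```python
import numpy as np

def fill_occlusion_holes(d_forward, d_backward, holes_backward, holes_forward):
    """Fill occlusion holes in the two motion fields of one frame.

    d_forward, d_backward : (H, W, 2) motion fields of the frame, in pixels.
    holes_backward        : mask of pixels whose backward vector is unknown
                            (occluded in the previous frame), cf. eq. (1).
    holes_forward         : mask of pixels whose forward vector is unknown
                            (occluded in the next frame), cf. eq. (2).
    """
    d_backward_filled = d_backward.copy()
    d_forward_filled = d_forward.copy()
    # Equation (1): dBf(p) = -dF(p) in the holes of the backward field.
    d_backward_filled[holes_backward] = -d_forward[holes_backward]
    # Equation (2): dFf(p) = -dB(p) in the holes of the forward field.
    d_forward_filled[holes_forward] = -d_backward[holes_forward]
    return d_forward_filled, d_backward_filled
```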
As the filling process is pixel-based, the resulting filled motion map can be noisy. Therefore, a post-processing of the filled motion field is advantageously applied in order to homogenize the filled motion field, for example via cross bilateral filtering. The bilateral filter is applied to the filled vectors and is defined by
where x is the current pixel, y is a pixel of the N×N window centered on x, and d(y) is the motion vector at pixel y. Wxy is the weight assigned to the motion vector of pixel y. Wxy is defined as follows:
Wxy=e^(−δΔxy−γΓxy−βBxy) (4)
Δxy results from the color difference between pixel x and its neighboring pixels y. It is defined as:
Γxy is defined as the distance in the image grid between pixel x and pixel y (Euclidean norm):
Γxy=∥x−y∥2 (6)
Bxy is defined as the distance between motion values of the pixels x and y:
Bxy=∥d(x)−d(y)∥2 (7)
The parameters δ, γ, β weight the three costs relative to each other.
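A per-pixel Python sketch of this cross bilateral filtering follows. Since equations (3) and (5) are not reproduced above, two details are assumptions: the filtered vector is taken as the weighted average of the window vectors normalized by the sum of the weights, and the color cost Δxy is taken as the Euclidean distance between the colors of x and y; the window size and argument names are likewise illustrative:

```python
import numpy as np

def cross_bilateral_pixel(x, image, motion, delta, gamma, beta, n=7):
    """Filter the filled motion vector at pixel x = (row, col).

    image  : (H, W, 3) color frame; motion : (H, W, 2) filled motion field.
    The weights follow equation (4); the normalized weighted average and the
    Euclidean color difference are assumed forms of equations (3) and (5).
    """
    h, w = motion.shape[:2]
    r, c = x
    half = n // 2
    num, den = np.zeros(2), 0.0
    for yr in range(max(0, r - half), min(h, r + half + 1)):
        for yc in range(max(0, c - half), min(w, c + half + 1)):
            # Color cost (assumed form of eq. (5)).
            delta_xy = np.linalg.norm(image[r, c].astype(float) - image[yr, yc].astype(float))
            # Spatial cost, eq. (6), and motion cost, eq. (7).
            gamma_xy = np.hypot(yr - r, yc - c)
            beta_xy = np.linalg.norm(motion[r, c] - motion[yr, yc])
            # Weight, eq. (4).
            w_xy = np.exp(-delta * delta_xy - gamma * gamma_xy - beta * beta_xy)
            num += w_xy * motion[yr, yc]
            den += w_xy
    return num / den
```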
A practical implementation of the sequential motion estimation and correction process is schematically illustrated in
In the presence of acceleration, which for three successive frames t, t−1 and t−2 is defined as the difference between the forward motion field t−1/t and the backward motion field t−1/t−2, an acceleration vector is preferably estimated before motion filling via a second bilateral filtering in order to make the motion filtering converge faster:
Apart from bilateral filtering, any 2D+t interpolation filter that fulfills the following requirements can be used. The filter needs to take the forward vectors t−1/t and the backward vectors t−1/t−2 to fill the holes either in the forward map or in the backward map. Furthermore, the interpolation filter must consider the spatial neighborhood, to take into account the spatial variation of the motion, as well as the neighborhood in the facing map, in order to cope with acceleration. In addition, the filter should discard the data of the occluding object.
As discussed above, both the forward motion field and the backward motion field for frame t−1 shall be improved. The acceleration can be computed based on the pixels of frame t−1 that are visible in the three frames t−2, t−1 and t. The acceleration vector a( ) is defined as follows for the backward motion field t−1/t−2:
aB(y,t−1)=dB(y,t−1)+dF(y,t−1) (9)
For the forward motion field t−1/t, the acceleration vector a( ) is defined as:
aF(y,t−1)=dF(y,t−1)+dB(y,t−1) (10)
The bilateral filtering is used to estimate the acceleration in the areas in frame t−1 that are occluded in the other frame. The acceleration is not known for these areas and hence needs to be derived from the neighboring pixels for which the acceleration can be computed. Thus, the bilateral filter makes it possible to estimate an acceleration vector for the occluded areas from the neighboring visible areas.
The weights Wxy are adapted to the current pixel; they are defined to give more importance to the acceleration values corresponding to pixels with a similar color and a similar motion with regard to the current occluded pixel. Therefore, the weights Wxy can be the same as those defined above in equation (4).
The filling process is then realized as follows:
This is schematically illustrated in
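A sketch of this acceleration-compensated filling is given below. Since the filling equations themselves are not reproduced above, the sketch assumes that the filled vector is obtained by rearranging equations (9) and (10), i.e. dBf = aB − dF and dFf = aF − dB, with the acceleration at an occluded pixel estimated from the visible neighbors via the bilateral weights of equation (4); the function name, the 7×7 window and the weights callback are illustrative:

```python
import numpy as np

def fill_with_acceleration(d_forward, d_backward, visible,
                           holes_forward, holes_backward, weights):
    """Acceleration-compensated filling at frame t-1 (illustrative sketch).

    d_forward, d_backward : (H, W, 2) fields t-1/t and t-1/t-2.
    visible               : mask of pixels seen in frames t-2, t-1 and t,
                            where eqs. (9)/(10) can be evaluated directly.
    weights(x, y)         : assumed callback returning the bilateral weight
                            Wxy of eq. (4) for pixels x and y.
    dBf = aB - dF and dFf = aF - dB are an assumed rearrangement of
    eqs. (9)/(10); the patent's own filling equations are not shown above.
    """
    accel = d_backward + d_forward        # eqs. (9)/(10), same sum for aB and aF
    h, w = accel.shape[:2]
    dF_f, dB_f = d_forward.copy(), d_backward.copy()
    for r in range(h):
        for c in range(w):
            if not (holes_forward[r, c] or holes_backward[r, c]):
                continue
            # Bilateral estimate of the acceleration from the visible
            # neighbours inside a 7 x 7 window.
            num, den = np.zeros(2), 0.0
            for yr in range(max(0, r - 3), min(h, r + 4)):
                for yc in range(max(0, c - 3), min(w, c + 4)):
                    if visible[yr, yc]:
                        w_xy = weights((r, c), (yr, yc))
                        num += w_xy * accel[yr, yc]
                        den += w_xy
            if den == 0.0:
                continue                  # no visible neighbour: leave the hole
            a = num / den
            if holes_backward[r, c]:
                dB_f[r, c] = a - d_forward[r, c]    # dBf = aB - dF
            if holes_forward[r, c]:
                dF_f[r, c] = a - d_backward[r, c]   # dFf = aF - dB
    return dF_f, dB_f
```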
As before, the filled motion field is advantageously filtered in the filled areas via bilateral filtering.
Up to now, triplets of successive frames have been considered, assuming that the temporal distance between two successive frames is constant. The above processing can be applied to any triplet of frames if the distances between the frames are taken into account. If three frames t−i, t, and t+j are considered, the transfer of motion vectors (filling process) between the forward and backward fields, for example dBf(p,t)=−dF(p,t) for the triplet (t−1,t,t+1), becomes:
where the relative distances of the reference frames t−i and t+j to the current frame t are taken into account.
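For example, under the constant-velocity assumption behind equation (1), a natural form of this scaled transfer for the triplet (t−i, t, t+j) is dBf(p,t) = −(i/j)·dF(p,t); since the exact equation is not reproduced above, the small sketch below uses this assumed scaling (with i = j = 1 it reduces to equation (1)):

```python
def transfer_forward_to_backward(d_forward_vec, i, j):
    """Assumed scaling for the triplet (t-i, t, t+j): dBf(p,t) = -(i/j) * dF(p,t).

    Constant velocity over [t-i, t+j] is assumed; with i = j = 1 this is eq. (1).
    """
    return tuple(-(i / j) * v for v in d_forward_vec)

# e.g. a forward vector (4, -2) pixels towards frame t+1 (j = 1) transfers to
# (-8, 4) pixels towards frame t-2 (i = 2):
print(transfer_forward_to_backward((4.0, -2.0), i=2, j=1))   # (-8.0, 4.0)
```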
Furthermore, if distant frames are considered, acceleration will occur more frequently and needs to be taken into account for accuracy. In this context, the acceleration is defined as follows for the backward motion field t/t−i with regard to the forward motion field t/t+j:
The filling process is then: