The present invention relates to the technical field of omnidirectional image processing, and more particularly relates to an omnidirectional image processing method and device.
A 360° omnidirectional image captures the entire scene around an observation point in space, that is, all the light that can reach that point, and this scene can be described by a sphere centered on the observation point. Since spherical images are difficult to store, and existing image codecs are designed for conventional non-omnidirectional images and therefore code spherical images poorly, the spherical images need to be converted into planar omnidirectional image formats through projection. Common projection formats include Equirectangular Projection (ERP), Cubemap Projection (CMP), and so on. Omnidirectional images in different formats may be coded with an existing coding scheme for conventional non-omnidirectional images, such as HEVC, AVS2 or AV1, or with a coding method locally adapted to the particular format.
ERP sampling is shown in
As shown in
After determining a position mapping relationship between the sphere and the omnidirectional image, the omnidirectional image may be sampled so as to generate a pixel value for each pixel position, and the omnidirectional image may then be coded and decoded. The position mapping relationship between the sphere and the omnidirectional image may also be used for format conversion of the omnidirectional image; for example, the ERP format may be converted to the CMP format or to other formats.
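As an illustration of such a position mapping relationship, the ERP format relates image positions and sphere positions by a simple linear rule (the same relationship appears as the source-format mapping in one of the embodiments below). The following Python sketch is illustrative only; the normalized coordinates u and v in [0, 1] and the helper names are conventions assumed here rather than definitions taken from the invention.

```python
import math

def erp_to_sphere(u, v):
    """Map a normalized ERP position (u, v), each in [0, 1], to sphere
    coordinates: longitude phi in [-pi, pi], latitude theta in [-pi/2, pi/2]."""
    phi = (u - 0.5) * 2.0 * math.pi
    theta = (0.5 - v) * math.pi
    return theta, phi

def sphere_to_erp(theta, phi):
    """Inverse mapping from sphere coordinates back to a normalized ERP position."""
    u = phi / (2.0 * math.pi) + 0.5
    v = 0.5 - theta / math.pi
    return u, v
```

Once such a pair of mappings is available, every pixel position of the planar image corresponds to a direction on the sphere, which is what sampling, coding and format conversion rely on.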
In view of the defects in the prior art, the present invention provides a novel omnidirectional image processing method and device.
In a Cubemap Projection (CMP) format, for image regions of the same area, the central region of each face corresponds to a larger region on the sphere than the marginal regions do. This leads to non-uniform sampling of the sphere and redundant information in the marginal regions of the cube faces, and reduces the representation efficiency.
A main idea of the present invention is to make the arc length of the unit interval at each latitude in an omnidirectional image as equal as possible on the sphere by adjusting the stretching relationship between the omnidirectional image and the sphere at different positions, thereby reducing the information redundancy in the marginal regions of the Cubemap projection and improving the representation efficiency.
For this purpose, the present invention adopts the following technical solution:
A first objective of the present invention is to provide an omnidirectional image processing method, including:
decoding a code stream to obtain omnidirectional image coding format information; and determining a mapping relationship between a position (x, y) in at least one region in an omnidirectional image obtained by decoding the code stream and a position (θ, φ) on a sphere according to the format information, wherein the mapping relationship is:
x and φ are in linear relationship, and y and
are in linear relationship,
where x represents a first dimensional coordinate position of the region in the decoded omnidirectional image, y represents a second dimensional coordinate position of the region in the decoded omnidirectional image, φ represents a longitude position of the sphere, θ represents a latitude position of the sphere, a line mapped to the sphere by a first dimensional coordinate axis is an equator line, and a line mapped to the sphere by a second dimensional coordinate axis is a prime meridian.
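To make this structure concrete, the sketch below shows one way such a linear mapping could be evaluated for a single region. It is only an illustration under stated assumptions: the function g stands in for the expression (not reproduced in this text) that y is linearly related to, the face coordinate range [−1, 1] and the region extents are assumptions, and g_inverse is a hypothetical helper that recovers the latitude θ from a value of that expression.

```python
import math

def lerp(t, a, b):
    """Map t in [-1, 1] linearly onto the interval [a, b]."""
    return a + (t + 1.0) * 0.5 * (b - a)

def region_to_sphere(x, y, g_inverse,
                     phi_min=-math.pi / 4, phi_max=math.pi / 4,
                     g_min=-1.0, g_max=1.0):
    """Illustrative mapping for one region with face coordinates x, y in [-1, 1].

    x is linear in the longitude phi (the first dimensional axis maps to the
    equator), and y is linear in g(theta), where g is the expression defined
    by the format; g_inverse maps a value of g back to the latitude theta.
    The extents phi_min/phi_max and g_min/g_max are assumed values chosen to
    cover a single region."""
    phi = lerp(x, phi_min, phi_max)      # x and phi are in linear relationship
    g_value = lerp(y, g_min, g_max)      # y and g(theta) are in linear relationship
    theta = g_inverse(g_value)
    return theta, phi

# Example call with an identity stand-in for g (i.e. y taken as linear in theta),
# used only to show the structure, not the actual expression of the format:
theta, phi = region_to_sphere(0.0, 0.5, g_inverse=lambda value: value,
                              g_min=-math.pi / 4, g_max=math.pi / 4)
```

Whether x runs horizontally or vertically in the image is determined by the format information described next.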
Preferably, the omnidirectional image format information is one of the following:
1) The format information is a format number, and the format number specifies a direction of a first dimensional coordinate of the region in the decoded omnidirectional image by default.
2) The format information includes a format number and direction information of the first dimensional coordinate of the region in the decoded omnidirectional image.
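A minimal sketch of how such format information might be represented after decoding is given below; the field names, the encoding of the direction, and the default value are all hypothetical, since the method only requires that a format number, and optionally direction information, be recoverable from the code stream.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OmniFormatInfo:
    """Hypothetical container for decoded omnidirectional image format information."""
    format_number: int                          # identifies the mapping/layout
    first_dim_direction: Optional[str] = None   # e.g. "horizontal" or "vertical"

def resolve_first_dim_direction(info: OmniFormatInfo, default: str = "horizontal") -> str:
    """Case 1): only a format number is present, so the direction of the first
    dimensional coordinate falls back to the default specified by that number.
    Case 2): explicit direction information in the code stream takes precedence."""
    return info.first_dim_direction if info.first_dim_direction else default
```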
A second objective of the present invention is to provide an omnidirectional image processing method, including the following steps:
expressing at least one image region in an omnidirectional image by the following mapping relationship, wherein the mapping relationship between a position (x, y) in the image region and a position (θ, φ) on a sphere is:
x and φ are in linear relationship, and y and
are in linear relationship,
where x represents a first dimensional coordinate position of the region in the omnidirectional image, y represents a second dimensional coordinate position of the region in the omnidirectional image, φ represents a longitude position of the sphere, θ represents a latitude position of the sphere, a line mapped to the sphere by a first dimensional coordinate axis is an equator line, and a line mapped to the sphere by a second dimensional coordinate axis is a prime meridian;
coding and writing the omnidirectional image expressed by the mapping relationship to a code stream; and
writing format information of the mapping relationship to the code stream.
Preferably, the omnidirectional image format information is one of the following:
1) The format information is a format number, and the format number specifies a direction of a first dimensional coordinate of the region in the omnidirectional image by default.
2) The format information includes a format number and direction information of the first dimensional coordinate of the region in the omnidirectional image.
A third objective of the present invention is to provide an omnidirectional image processing method, including the following steps:
decoding a code stream to obtain omnidirectional image coding format information; and
determining, when at least one pair of adjacent regions, i.e. a region 1 and a region 2, is present in a decoded omnidirectional image, mapping relationships between a position (x1, y1) in the region 1 and a position (x2, y2) in the region 2 and their respective corresponding positions (θ1, φ1) and (θ2, φ2) on a sphere according to the format information, wherein the mapping relationships are:
x1 and φ1 are in linear relationship, and y1 and
are in linear relationship;
x2 and φ2 are in linear relationship, and y2 and
are in linear relationship,
where x1 represents a first dimensional coordinate position of the region 1 in the omnidirectional image, y1 represents a second dimensional coordinate position of the image region 1, φ1 represents a longitude position of the sphere corresponding to the position (x1, y1) in the region 1 in the omnidirectional image, and θ1 represents a latitude position of the sphere corresponding to the position (x1, y1) in the region 1 in the omnidirectional image; x2 represents a first dimensional coordinate position of the region 2 in the omnidirectional image, y2 represents a second dimensional coordinate position of the region 2 in the omnidirectional image, φ2 represents a longitude position of the sphere corresponding to the position (x2, y2) in the image region 2, and θ2 represents a latitude position of the sphere corresponding to the position (x2, y2) in the region 2 in the omnidirectional image; and a line mapped to the sphere by a first dimensional coordinate axis of the region 1 is the same as a line mapped to the sphere by a first dimensional coordinate axis of the region 2, and is an equator line of the sphere.
A fourth objective of the present invention is to provide an omnidirectional image processing device, including the following modules:
a format information extraction module, wherein input of the format information extraction module is a code stream, output of the format information extraction module is omnidirectional image format information, and the module decodes the code stream to obtain the omnidirectional image format information; and
a position mapping module, wherein input of the position mapping module is the omnidirectional image format information, output of the position mapping module is a mapping relationship between a position (x, y) in at least one region in an omnidirectional image obtained by decoding the code stream and a position (θ, φ) on a sphere, and the mapping relationship is:
x and φ are in linear relationship, and y and
are in linear relationship,
where x represents a first dimensional coordinate position of the region in the omnidirectional image, y represents a second dimensional coordinate position of the region in the omnidirectional image, φ represents a longitude position of the sphere, θ represents a latitude position of the sphere, a line mapped to the sphere by a first dimensional coordinate axis is an equator line, and a line mapped to the sphere by a second dimensional coordinate axis is a prime meridian.
Preferably, the omnidirectional image format information is one of the following:
1) The format information is a format number, and the format number specifies a direction of a first dimensional coordinate of the region in the omnidirectional image by default.
2) The format information includes a format number and direction information of the first dimensional coordinate of the region in the omnidirectional image.
A fifth objective of the present invention is to provide an omnidirectional image processing device, including the following modules:
a position mapping module, wherein output of the position mapping module is omnidirectional image format information, and a mapping relationship between a position (x, y) in at least one region in an omnidirectional image and a position (θ, φ) on a sphere is determined in the module as:
x and φ are in linear relationship, and y and
are in linear relationship,
where x represents a first dimensional coordinate position of the region in the omnidirectional image, y represents a second dimensional coordinate position of the region in the omnidirectional image, φ represents a longitude position of the sphere, θ represents a latitude position of the sphere, a line mapped to the sphere by a first dimensional coordinate axis is an equator line, and a line mapped to the sphere by a second dimensional coordinate axis is a prime meridian; and
a format information and image coding module, wherein input of the format information and image coding module is the omnidirectional image format information and the omnidirectional image expressed by the mapping relationship determined by the format information, output of the format information and image coding module is a code stream including the omnidirectional image and the format information of the omnidirectional image, and the module codes and writes the omnidirectional image and the format information to the code stream.
Preferably, the omnidirectional image format information is one of the following:
1) The format information is a format number, and the format number specifies a direction of a first dimensional coordinate of the region in the omnidirectional image by default.
2) The format information includes a format number and direction information of the first dimensional coordinate of the region in the omnidirectional image.
Compared with the prior art, by designing a special position mapping relationship between the sphere and the omnidirectional image, the present invention makes the arc length of the unit interval at each latitude in the omnidirectional image as equal as possible on the sphere and distributes the omnidirectional image more uniformly over the sphere, so that the spherical uniformity of the representation is improved, the sampling loss is reduced for the same number of samples, and the coding efficiency is improved.
The principle of the present invention is explained below through the embodiments, in combination with the accompanying drawings.
The accompanying drawings described here are provided for further understanding of the present invention and constitute a part of this application. The embodiments of the present invention are merely illustrative of the present invention and represent only particular cases, and the scope of application of the present invention is not limited to these embodiments. In the drawings:
The embodiment of the present invention provides an omnidirectional image processing method.
The omnidirectional image processing method provided by the present embodiment includes the following steps:
A code stream is decoded to obtain omnidirectional image coding format information. A mapping relationship between a position (x, y) in at least one region in an omnidirectional image obtained by decoding the code stream and a position (θ, φ) on a sphere is determined according to the format information. The mapping relationship is: x and φ are in linear relationship, and y and
are in linear relationship.
x represents a first dimensional coordinate position of the region in the decoded omnidirectional image, y represents a second dimensional coordinate position of the region in the decoded omnidirectional image, φ represents a longitude position of the sphere, θ represents a latitude position of the sphere, a line mapped to the sphere by a first dimensional coordinate axis is an equator line, and a line mapped to the sphere by a second dimensional coordinate axis is a prime meridian.
In the present embodiment, the decoded omnidirectional image is in a Cubemap Projection (CMP) format, the region is a front face of the cube in
In this embodiment, the format information is a format number, and this format number specifies by default that a direction of a first dimensional coordinate of the region in the omnidirectional image is a horizontal direction, as shown in
The embodiment of the present invention provides an omnidirectional image processing method.
The omnidirectional image processing method provided by the present embodiment includes the following steps:
A code stream is decoded to obtain omnidirectional image coding format information. A mapping relationship between a position (x, y) in at least one region in an omnidirectional image obtained by decoding the code stream and a position (θ, φ) on a sphere is determined according to the format information. The mapping relationship is:
x and φ are in linear relationship, and y and
are in linear relationship.
x represents a first dimensional coordinate position of the region in the decoded omnidirectional image, y represents a second dimensional coordinate position of the region in the decoded omnidirectional image, φ represents a longitude position of the sphere, θ represents a latitude position of the sphere, a line mapped to the sphere by a first dimensional coordinate axis is an equator line, and a line mapped to the sphere by a second dimensional coordinate axis is a prime meridian.
In the present embodiment, the decoded omnidirectional image is in a CMP format, the region is a front face of the cube in
In this embodiment, the format information includes a format number and direction information of a first dimensional coordinate of the region, and this direction information specifies that a direction of the first dimensional coordinate of the region in the omnidirectional image is a vertical direction, as shown in
The embodiment of the present invention provides an omnidirectional image processing method.
The omnidirectional image processing method provided by the present embodiment includes the following steps:
At least one image region in an omnidirectional image is expressed by the following mapping relationship. The mapping relationship between a position (x, y) in the image region and a position (θ, φ) on a sphere is:
x and φ are in linear relationship, and y and
are in linear relationship.
x represents a first dimensional coordinate position of the region in the omnidirectional image, y represents a second dimensional coordinate position of the region in the omnidirectional image, φ represents a longitude position of the sphere, θ represents a latitude position of the sphere, a line mapped to the sphere by a first dimensional coordinate axis is an equator line, and a line mapped to the sphere by a second dimensional coordinate axis is a prime meridian.
In the present embodiment, the omnidirectional image is in a CMP format, the region is a front face of the cube in
In this embodiment, format information is a format number, and this format number specifies by default that a direction of a first dimensional coordinate of the region in the omnidirectional image is a horizontal direction, as shown in
The omnidirectional image expressed by the mapping relationship is coded and written to a code stream.
The format information of the mapping relationship is also written to the code stream.
The omnidirectional image and the format information may be written to the code stream in either order. That is, the omnidirectional image may be coded and written to the code stream first and the format information of the mapping relationship written afterwards, or the format information of the mapping relationship may be written to the code stream first and the omnidirectional image expressed by the mapping relationship coded and written afterwards.
The embodiment of the present invention provides an omnidirectional image processing method.
The omnidirectional image processing method provided by the present embodiment includes the following steps:
At least one image region in an omnidirectional image is expressed by the following mapping relationship. The mapping relationship between a position (x, y) in the image region and a position (θ, φ) on a sphere is:
x and φ are in linear relationship, and y and
are in linear relationship.
x represents a first dimensional coordinate position of the region in the omnidirectional image, y represents a second dimensional coordinate position of the region in the omnidirectional image, φ represents a longitude position of the sphere, θ represents a latitude position of the sphere, a line mapped to the sphere by a first dimensional coordinate axis is an equator line, and a line mapped to the sphere by a second dimensional coordinate axis is a prime meridian.
In the present embodiment, the omnidirectional image is in a CMP format, the region in the omnidirectional image is a front face of the cube in
In this embodiment, format information includes a format number and direction information of a first dimensional coordinate of the region, and this direction information specifies that a direction of the first dimensional coordinate of the region in the omnidirectional image is a vertical direction, as shown in
The omnidirectional image expressed by the mapping relationship is coded and written to a code stream.
The format information of the mapping relationship is also written to the code stream.
The embodiment of the present invention provides a method of sampling by using a mapping relationship.
The sampling method provided by the present embodiment includes the following steps:
At least one image region is included in a to-be-generated omnidirectional image. The mapping relationship between a position (x, y) in the image region and a position (θ, φ) on a sphere is:
x and φ are in linear relationship, and y and
are in linear relationship.
x represents a first dimensional coordinate position of the region in the omnidirectional image, y represents a second dimensional coordinate position of the region in the omnidirectional image, φ represents a longitude position of the sphere, θ represents a latitude position of the sphere, a line mapped to the sphere by a first dimensional coordinate axis is an equator line, and a line mapped to the sphere by a second dimensional coordinate axis is a prime meridian.
In the present embodiment, the to-be-generated omnidirectional image is in a CMP format, the region is a front face of the cube in
Pixel sampling positions in a vertical direction are shown in
During sampling, first, the equator is sampled with equal arc length in the horizontal direction, and then meridians corresponding to all the horizontal sampling positions are sampled with equal arc length in the vertical direction.
According to the pixel sampling positions in the to-be-generated omnidirectional image, each pixel point in the to-be-generated omnidirectional image is interpolated on the sphere to obtain its corresponding pixel value.
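A sketch of this sampling procedure is given below. Sampling the equator with equal arc length corresponds to uniform spacing in longitude φ, and sampling each meridian with equal arc length corresponds to uniform spacing in latitude θ along that meridian. The face extents, the image size, and the sample_sphere helper (which stands for interpolation of a pixel value on the sphere, e.g. from a source image) are assumptions of this sketch.

```python
import numpy as np

def sample_face(width, height, phi_range, theta_range, sample_sphere):
    """Generate pixel values for one region of the to-be-generated image.

    The equator is sampled with equal arc length in the horizontal direction
    (uniform longitude phi over phi_range), and the meridian through each
    horizontal sampling position is sampled with equal arc length in the
    vertical direction (uniform latitude theta over theta_range).
    sample_sphere(theta, phi) is a hypothetical function that interpolates
    the pixel value at that sphere position."""
    phis = np.linspace(phi_range[0], phi_range[1], width)         # equal arc along the equator
    thetas = np.linspace(theta_range[0], theta_range[1], height)  # equal arc along each meridian
    face = np.zeros((height, width, 3), dtype=np.float64)
    for row, theta in enumerate(thetas):
        for col, phi in enumerate(phis):
            face[row, col] = sample_sphere(theta, phi)
    return face
```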
The embodiment of the present invention provides an omnidirectional image processing method.
The omnidirectional image processing method provided by the present embodiment includes the following steps:
A code stream is decoded to obtain omnidirectional image coding format information.
A decoded omnidirectional image includes at least one pair of adjacent regions, i.e. a region 1 and a region 2. Mapping relationships between a position (x1, y1) in the region 1 and a position (x2, y2) in the region 2 and their respective corresponding positions (θ1, φ1) and (θ2, φ2) on a sphere are determined according to the format information. The mapping relationships are:
x1 and φ1 are in linear relationship, and y1 and
are in linear relationship;
x2 and φ2 are in linear relationship, and y2 and
are in linear relationship.
x1 represents a first dimensional coordinate position of the region 1 in the omnidirectional image, y1 represents a second dimensional coordinate position of the image region 1, φ1 represents a longitude position of the sphere corresponding to the position (x1, y1) in the region 1 in the omnidirectional image, and θ1 represents a latitude position of the sphere corresponding to the position (x1, y1) in the region 1 in the omnidirectional image; x2 represents a first dimensional coordinate position of the region 2 in the omnidirectional image, y2 represents a second dimensional coordinate position of the region 2 in the omnidirectional image, φ2 represents a longitude position of the sphere corresponding to the position (x2, y2) in the image region 2, and θ2 represents a latitude position of the sphere corresponding to the position (x2, y2) in the region 2 in the omnidirectional image; and a line mapped to the sphere by a first dimensional coordinate axis of the region 1 is the same as a line mapped to the sphere by a first dimensional coordinate axis of the region 2, and is an equator line of the sphere.
In this embodiment, a front face and a right face in
The relationship between (x2, y2) and (φ2, θ2) is:
In this embodiment, first dimensional directions of the region 1 and the region 2 are both horizontal directions, as shown in
The embodiment of the present invention provides an omnidirectional image processing method.
The omnidirectional image processing method provided by the present embodiment includes the following steps:
A code stream is decoded to obtain omnidirectional image coding format information.
A decoded omnidirectional image includes at least one pair of adjacent regions, i.e. a region 1 and a region 2. Mapping relationships between a position (x1, y1) in the region 1 and a position (x2, y2) in the region 2 and their respective corresponding positions (θ1, φ1) and (θ2, φ2) on a sphere are determined according to the format information. The mapping relationships are:
x1 and φ1 are in linear relationship, and y1 and
are in linear relationship;
x2 and φ2 are in linear relationship, and y2 and
are in linear relationship.
x1 represents a first dimensional coordinate position of the region 1 in the omnidirectional image, y1 represents a second dimensional coordinate position of the image region 1, φ1 represents a longitude position of the sphere corresponding to the position (x1, y1) in the region 1 in the omnidirectional image, and θ1 represents a latitude position of the sphere corresponding to the position (x1, y1) in the region 1 in the omnidirectional image; x2 represents a first dimensional coordinate position of the region 2 in the omnidirectional image, y2 represents a second dimensional coordinate position of the region 2 in the omnidirectional image, φ2 represents a longitude position of the sphere corresponding to the position (x2, y2) in the image region 2, and θ2 represents a latitude position of the sphere corresponding to the position (x2, y2) in the region 2 in the omnidirectional image; and a line mapped to the sphere by a first dimensional coordinate axis of the region 1 is the same as a line mapped to the sphere by a first dimensional coordinate axis of the region 2, and is an equator line of the sphere.
In this embodiment, a front face and a right face in
The relationship between (x2, y2) and (φ2, θ2) is:
In this embodiment, first dimensional directions of the region 1 and the region 2 are both vertical directions, as shown in
The embodiment of the present invention provides an omnidirectional image format conversion method.
The omnidirectional image format conversion method provided by the present embodiment includes the following steps:
At least one image region is included in a to-be-generated omnidirectional image. A mapping relationship between a position (x, y) in the image region and a position (θ, φ) on a sphere is:
x and φ are in linear relationship, and y and
are in linear relationship.
x represents a first dimensional coordinate position of the region in the omnidirectional image, y represents a second dimensional coordinate position of the region in the omnidirectional image, φ represents a longitude position of the sphere, θ represents a latitude position of the sphere, a line mapped to the sphere by a first dimensional coordinate axis is an equator line, and a line mapped to the sphere by a second dimensional coordinate axis is a prime meridian.
In the present embodiment, the omnidirectional image is in a CMP format, the region is a front face of the cube in
In this embodiment, a first dimensional direction is a horizontal direction, as shown in
A position on the sphere is mapped to a source format. An Equirectangular Projection (ERP) is taken as the source format, and a mapping relationship between the position (u, v) in the ERP and the position (θ, φ) on the sphere is:
φ = (u − 0.5) × 2π, and
θ = (0.5 − v) × π.
The values of u and v are both in [0,1].
Interpolation is performed on the ERP to generate pixel values in the to-be-generated omnidirectional image.
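A sketch of this conversion step is shown below, assuming the source ERP image is stored as an H×W(×C) array and using bilinear interpolation as one possible choice (the embodiment does not prescribe a particular interpolation method). The pixel-coordinate convention, the clamping at the image border, and the neglect of longitude wrap-around at the ERP seam are simplifications of this sketch.

```python
import numpy as np

def sphere_to_erp_uv(theta, phi):
    """Invert phi = (u - 0.5) * 2*pi and theta = (0.5 - v) * pi to obtain
    the normalized ERP position (u, v), each in [0, 1]."""
    u = phi / (2.0 * np.pi) + 0.5
    v = 0.5 - theta / np.pi
    return u, v

def sample_erp_bilinear(erp, theta, phi):
    """Bilinearly interpolate a pixel value from the source ERP image `erp`
    at the sphere position (theta, phi)."""
    height, width = erp.shape[:2]
    u, v = sphere_to_erp_uv(theta, phi)
    col = u * (width - 1)     # continuous pixel coordinates
    row = v * (height - 1)
    c0 = min(max(int(np.floor(col)), 0), width - 1)
    r0 = min(max(int(np.floor(row)), 0), height - 1)
    c1, r1 = min(c0 + 1, width - 1), min(r0 + 1, height - 1)
    wc, wr = col - c0, row - r0
    top = (1 - wc) * erp[r0, c0] + wc * erp[r0, c1]
    bottom = (1 - wc) * erp[r1, c0] + wc * erp[r1, c1]
    return (1 - wr) * top + wr * bottom   # wrap-around at the seam is ignored here
```

Each pixel of the to-be-generated omnidirectional image is then obtained by mapping its position to the sphere according to the mapping relationship above and evaluating sample_erp_bilinear at that sphere position.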
The embodiment of the present invention provides an omnidirectional image processing method.
The omnidirectional image processing method provided by the present embodiment includes the following steps:
A code stream is decoded to obtain omnidirectional image coding format information. A mapping relationship between a position (x, y) in at least one region in an omnidirectional image obtained by decoding the code stream and a position (θ, φ) on a sphere is determined according to the format information. The mapping relationship is:
x and φ are in linear relationship, and y and
are in linear relationship.
x represents a first dimensional coordinate position of the region in the decoded omnidirectional image, y represents a second dimensional coordinate position of the region in the decoded omnidirectional image, φ represents a longitude position of the sphere, θ represents a latitude position of the sphere, a line mapped to the sphere by a first dimensional coordinate axis is an equator line, and a line mapped to the sphere by a second dimensional coordinate axis is a prime meridian.
In this embodiment, the region in the omnidirectional image is a rear face of the cube in
and when y=1,
then
the mapping relationship between (x, y) and (θ, φ) is:
In the X′-Y′ coordinate system, when y′=−1,
and when y′=1,
then the mapping relationship between (x′, y′) and (θ, φ) is:
The region bounded by y=±1 is the effective content corresponding to the sphere in the omnidirectional image. The conversion between (x′, y′) and (x, y) is as follows:
x=x′, and
y=(y′−½)×2.
Through the above conversion, the mapping relationship between the position (x′, y′) in the omnidirectional image and the position (θ, φ) on the sphere can be derived as:
The embodiment of the present invention provides an omnidirectional image processing device.
The omnidirectional image processing device provided by the present embodiment includes the following modules:
a format information extraction module, wherein input of the format information extraction module is a code stream, output of the format information extraction module is omnidirectional image format information, and the module decodes the code stream to obtain the omnidirectional image format information; and
a position mapping module, wherein input of the position mapping module is the omnidirectional image format information, output of the position mapping module is a mapping relationship between a position (x, y) in at least one region in an omnidirectional image obtained by decoding the code stream and a position (θ, φ) on a sphere, and the mapping relationship is:
x and φ are in linear relationship, and y and
are in linear relationship,
where x represents a first dimensional coordinate position of the region in the omnidirectional image, y represents a second dimensional coordinate position of the region in the omnidirectional image, φ represents a longitude position of the sphere, θ represents a latitude position of the sphere, a line mapped to the sphere by a first dimensional coordinate axis is an equator line, and a line mapped to the sphere by a second dimensional coordinate axis is a prime meridian.
In the present embodiment, the omnidirectional image is in a CMP format, the region is a front face of the cube in
In this embodiment, the format information is a format number, and this format number specifies by default that a direction of a first dimensional coordinate of the region in the omnidirectional image is a horizontal direction, as shown in
The embodiment of the present invention provides an omnidirectional image processing device.
The omnidirectional image processing device provided by the present embodiment includes the following modules:
a format information extraction module, wherein input of the format information extraction module is a code stream, output of the format information extraction module is omnidirectional image format information, and the module decodes the code stream to obtain the omnidirectional image format information; and
a position mapping module, wherein input of the position mapping module is the omnidirectional image format information, output of the position mapping module is a mapping relationship between a position (x, y) in at least one region in an omnidirectional image obtained by decoding the code stream and a position (θ, φ) on a sphere, and the mapping relationship is:
x and φ are in linear relationship, and y and
are in linear relationship,
where x represents a first dimensional coordinate position of the region in the omnidirectional image, y represents a second dimensional coordinate position of the region in the omnidirectional image, φ represents a longitude position of the sphere, θ represents a latitude position of the sphere, a line mapped to the sphere by a first dimensional coordinate axis is an equator line, and a line mapped to the sphere by a second dimensional coordinate axis is a prime meridian.
In the present embodiment, the omnidirectional image is in a CMP format, the region is a front face of the cube in
In this embodiment, the format information includes a format number and direction information of a first dimensional coordinate of the region in the omnidirectional image, and this direction information specifies that a direction of the first dimensional coordinate of the region in the omnidirectional image is a vertical direction, as shown in
The embodiment of the present invention provides an omnidirectional image processing device.
The omnidirectional image processing device provided by the present embodiment includes the following modules:
a position mapping module, wherein output of the position mapping module is omnidirectional image format information, and a mapping relationship between a position (x, y) in at least one region in an omnidirectional image and a position (θ, φ) on a sphere is determined in the module as:
x and φ are in linear relationship, and y and
are in linear relationship,
where x represents a first dimensional coordinate position of the region in the omnidirectional image, y represents a second dimensional coordinate position of the region in the omnidirectional image, φ represents a longitude position of the sphere, θ represents a latitude position of the sphere, a line mapped to the sphere by a first dimensional coordinate axis is an equator line, and a line mapped to the sphere by a second dimensional coordinate axis is a prime meridian; and
a format information and image coding module, wherein input of the format information and image coding module is the omnidirectional image format information and the omnidirectional image expressed by the mapping relationship determined by the format information, output of the format information and image coding module is a code stream including the omnidirectional image and the format information of the omnidirectional image, and the module codes and writes the omnidirectional image and the format information to the code stream.
In the present embodiment, the omnidirectional image is in a CMP format, the region is a front face of the cube in
In this embodiment, the format information is a format number, and this format number specifies by default that a direction of a first dimensional coordinate of the region in the omnidirectional image is a horizontal direction, as shown in
The embodiment of the present invention provides an omnidirectional image processing device.
The omnidirectional image processing device provided by the present embodiment includes the following modules:
a position mapping module, wherein output of the position mapping module is omnidirectional image format information, and a mapping relationship between a position (x, y) in at least one region in an omnidirectional image and a position (θ, φ) on a sphere is determined in the module as:
x and φ are in linear relationship, and y and
are in linear relationship,
where x represents a first dimensional coordinate position of the region in the omnidirectional image, y represents a second dimensional coordinate position of the region in the omnidirectional image, φ represents a longitude position of the sphere, θ represents a latitude position of the sphere, a line mapped to the sphere by a first dimensional coordinate axis is an equator line, and a line mapped to the sphere by a second dimensional coordinate axis is a prime meridian; and
a format information and image coding module, wherein input of the format information and image coding module is the omnidirectional image format information and the omnidirectional image expressed by the mapping relationship determined by the format information, output of the format information and image coding module is a code stream including the omnidirectional image and the format information of the omnidirectional image, and the module codes and writes the omnidirectional image and the format information to the code stream.
In the present embodiment, the omnidirectional image is in a CMP format, the region is a front face of the cube in
In this embodiment, the format information includes a format number and direction information of a first dimensional coordinate of the region, and this direction information specifies that a direction of the first dimensional coordinate of the region in the omnidirectional image is a vertical direction, as shown in
The embodiment of the present invention provides an omnidirectional image generation method.
The omnidirectional image generation method provided by the present embodiment includes the following steps:
At least one image region is included in a to-be-generated omnidirectional image. A mapping relationship between a position (x, y) in the image region and a position (θ, φ) on a sphere is specified as:
x and φ are in linear relationship, and y and
are in linear relationship.
x represents a first dimensional coordinate position of the region in the omnidirectional image, y represents a second dimensional coordinate position of the region in the omnidirectional image, φ represents a longitude position of the sphere, θ represents a latitude position of the sphere, a line mapped to the sphere by a first dimensional coordinate axis is an equator line, and a line mapped to the sphere by a second dimensional coordinate axis is a prime meridian.
In the present embodiment, the to-be-generated omnidirectional image is in a CMP format. As shown in
and when y=1 and x=0,
then the mapping relationship between (x, y) and (θ, φ) is:
In
For the faces in the second row of the CMP layout, namely the upper, lower and rear faces, the first dimensional direction is the horizontal direction. When y=0.8 and x=0,
and when y=−1 and x=0,
then the mapping relationship between (x, y) and (θ, φ) is:
In the rear face, the region outside the thick lines is a non-effective region; the pixels in this region are filled with the pixels of the line where y=0.8. The pixels in the non-effective regions of the upper and lower faces are likewise filled with the pixels of the lines where y=0.8 in the corresponding regions.
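A sketch of this padding step is given below, assuming the rear face is stored as an array in which the non-effective rows lie on one side of the line y = 0.8 (which side, and the exact extent of the non-effective region, depend on the face layout shown in the corresponding figure).

```python
import numpy as np

def pad_with_boundary_line(face, boundary_row):
    """Fill the non-effective part of a face by replicating the pixel line
    at index `boundary_row` (the line where y = 0.8 in this embodiment).

    This sketch assumes the rows with indices smaller than boundary_row are
    the non-effective region; the upper and lower faces would be padded in
    the same way within their corresponding regions."""
    padded = face.copy()
    padded[:boundary_row] = face[boundary_row]   # broadcast the boundary line over the padded rows
    return padded
```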
The embodiment of the present invention provides an omnidirectional image generation method.
The omnidirectional image generation method provided by the present embodiment includes the following steps:
At least one image region is included in a to-be-generated omnidirectional image. A mapping relationship between a position (x, y) in the image region and a position (θ, φ) on a sphere is:
x and φ are in linear relationship, and y and
are in linear relationship.
x represents a first dimensional coordinate position of the region in the omnidirectional image, y represents a second dimensional coordinate position of the region in the omnidirectional image, φ represents a longitude position of the sphere, θ represents a latitude position of the sphere, a line mapped to the sphere by a first dimensional coordinate axis is an equator line, and a line mapped to the sphere by a second dimensional coordinate axis is a prime meridian.
In the present embodiment, the to-be-generated omnidirectional image is in a CMP format. As shown in
and when y=1 and x=0,
then the mapping relationship between (x, y) and (θ, φ) is:
In
For the faces in the second row of the CMP layout, namely the upper, lower and rear faces, the first dimensional direction is a vertical direction. When x=−0.8,
and when x=1,
then the mapping relationship between (x, y) and (θ, φ) is:
In the rear face, the region outside the thick lines is a non-effective region. The pixels in this region are generated by interpolation on the sphere according to the same mapping relationship as in the effective region, in which case the value of x is less than −0.8. The pixels in the non-effective regions of the upper and lower faces are also generated by interpolation on the sphere according to the same mapping relationships as in the corresponding effective regions.
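In contrast to the replication-based padding of the previous embodiment, here the non-effective pixels are regenerated from the sphere by extrapolating the same linear mapping beyond the effective range (values of x below −0.8 in this case). The sketch below reuses hypothetical helpers in the spirit of the earlier sketches; the coordinate grids, the effective bound, and sample_sphere are assumptions of this text.

```python
def fill_noneffective_by_reprojection(face, x_grid, y_grid,
                                      mapping_to_sphere, sample_sphere,
                                      x_effective_min=-0.8):
    """Regenerate non-effective pixels (positions whose first dimensional
    coordinate x lies below x_effective_min) by extrapolating the same
    face-to-sphere mapping and interpolating on the sphere.

    x_grid and y_grid hold the (x, y) face coordinates of each pixel and have
    the same 2-D shape as the face; whether x runs along rows or columns
    depends on the direction information of the format."""
    rows, cols = x_grid.shape
    for i in range(rows):
        for j in range(cols):
            x, y = x_grid[i, j], y_grid[i, j]
            if x >= x_effective_min:
                continue                             # effective region: keep as generated
            theta, phi = mapping_to_sphere(x, y)     # same mapping, extrapolated
            face[i, j] = sample_sphere(theta, phi)
    return face
```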
Finally, it should be noted that the above embodiments are merely intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that modifications may still be made to the technical solutions recorded in the various embodiments, or equivalent replacements may be made for some of the technical features; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the various embodiments of the present invention.