The aspect of the embodiments relates to an image processing apparatus, an image processing method therefor, and a storage medium, and in particular to an image processing apparatus that carries out a noise reduction process when playing video, an image processing method, and a storage medium.
There is a technique with which shooting is performed using a camera capable of shooting video in RAW format (hereafter referred to as “RAW video”), and the shot RAW video is edited in a variety of ways. Information about an image signal generated by an image sensor of the camera is recorded as it is in each of the frames constituting the RAW video. The image quality of the RAW video can be freely adjusted by subjecting the RAW video to a developing process using the camera, a personal computer, or the like.
On the other hand, there is a known technique with which noise is reduced in each frame of video by a cyclic noise reduction process. The cyclic noise reduction process is a technique for reducing noise using correlation between a developed frame (image) and a frame (image) that is one or more frame periods before the developed frame and stored in memory.
As related arts, a technique described in Japanese Laid-Open Patent Publication (Kokai) No. H11-215403 and a technique described in Japanese Laid-Open Patent Publication (Kokai) No. 2013-239957 have been proposed. With the technique described in Japanese Laid-Open Patent Publication (Kokai) No. H11-215403, frame noise reduction based on frame correlation is performed. According to the technique described in Japanese Laid-Open Patent Publication (Kokai) No. 2013-239957, when a first video and a second video are sequentially played, a scene transition effect is applied to a video following the first video and a video preceding the second video over a period of time during which a scene is judged to be continuing.
As described above, the cyclic noise reduction process is the technique for reducing noise using correlation between one frame and a frame that is one or more frame periods before the frame. Thus, the cyclic noise reduction process cannot be carried out on a first frame among frames constituting a video because there is no frame before the first frame. For this reason, a problem arises in that the image quality of the first frame worsens. The technique described in Japanese Laid-Open Patent Publication (Kokai) No. H11-215403 is not intended to reduce noise in the first frame. The technique described in Japanese Laid-Open Patent Publication (Kokai) No. 2013-239957 is not related to the cyclic noise reduction process. Moreover, the above problem may also arise in video other than the RAW video.
Accordingly, the aspect of the embodiments provides an image processing apparatus comprising at least one processor configured to: obtain a video; and, when playing the video, subject a first frame from which playing of the video starts to a cyclic noise reduction process using one or more frames after the first frame.
Further features of the disclosure will become apparent from the following description of exemplary embodiments (with reference to the attached drawings).
The disclosure will now be described in detail below with reference to the accompanying drawings showing embodiments thereof.
The lens drive unit 102 controls zoom, focus, etc. of the lens 101. A subject image entering through the lens 101 is adjusted to have an appropriate amount of light by the diaphragm 104 and formed on an imaging surface of the image pickup device 106. The subject image formed on the imaging surface of the image pickup device 106 is photoelectrically converted. The image signal obtained by the photoelectric conversion is subjected to, for example, a gain adjustment process and an A/D conversion process in which an analog signal is converted into a digital signal. The digital signal obtained by the conversion is sent as, for example, an R, Gr, Gb, or B image signal to the image signal processing circuit 107. The image signal processing circuit 107 carries out various types of image signal processing such as a developing process, a low-pass filtering process that reduces noise, a shading process, a white balance (WB) process, and a cyclic noise reduction process. The image signal processing circuit 107 also performs various types of correction, compression of image data, and so forth.
Operation of the mechanical shutter 103 and the diaphragm 104 is controlled by the shutter/diaphragm drive unit 105. The control unit 109 controls the entire image pickup apparatus 100 and performs various types of computations. The first memory unit 108 stores image data. The recording medium control I/F unit 110 records image data onto and reads image data from the recording medium 112. The display unit 111 displays images and various types of information. The recording medium 112, which is a removable recording medium such as a semiconductor memory, records image data and various types of information. The external I/F unit 113 is an interface for carrying out communications with external apparatuses such as an external computer. The second memory unit 114 stores results of computations performed by the control unit 109.
The control unit 109 is implemented by, for example, a CPU. In each embodiment, control is implemented by a CPU, which performs the functions of the control unit 109, executing predetermined programs. The functions of the control unit 109 may be implemented by predetermined circuits as well. The control unit 109 functions as an image processing apparatus. The first memory unit 108 and the second memory unit 114 may be either separate memories as shown in the figure or a single memory.
The operating unit 115 is a member that can be operated by a user. Information about driving conditions for the image pickup apparatus 100, which is input by the user via the operating unit 115, is sent to the control unit 109. Based on the received information, the control unit 109 controls the overall operation of the image pickup apparatus 100. The operating unit 115 and the display unit 111 may be an integral touch panel display. The control unit 109 includes a cyclic frame holding control unit 116 and a cyclic frame determination unit 117. The cyclic frame holding control unit 116 causes the first memory unit 108 to hold image data of frames for use in the cyclic noise reduction process. In the following description, the cyclic noise reduction process may be referred to as “the cyclic NR process” as well. Based on a frame number, the cyclic frame determination unit 117 determines image data of frames for use in the cyclic NR process. The image data of frames for use in the cyclic NR process may be determined based on something other than the frame number.
A description will now be given of a configuration of the moov box 201. Arbitrary information such as a thumbnail image 218, which is displayed when playing video, and management information 219, which is used when playing, can be additionally stored in a uuid box 213 included in the moov box 201.
Management information about image data, audio data, time code data, frame-by-frame metadata, and so forth is stored in track boxes 205 to 208. Data sizes of image data, audio data, time code data, frame-by-frame metadata, and so forth in each coding unit are stored in stsz boxes 209 to 212. Information indicating the storage locations in the mdat box 204 at which the image data, audio data, time code data, and frame-by-frame metadata are stored is stored in the stco boxes 214 to 217 in the track boxes 205 to 208. In the mdat box 204, data are stored in units called chunks comprised of one or more coding units.
A configuration of the mdat box 204 will now be described. Data 230 to 241 are image data, audio data, time code data, frame-by-frame metadata, and so forth written in the mdat box 204. The data can be accessed on a chunk-by-chunk basis based on values written in the respective stco boxes. For example, the data 230 (CV1) can be traced from CV1 in the stco box 214. It should be noted that the configuration of the video file is not limited to the example in
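For illustration only, the following sketch shows how chunk offsets (such as those in an stco box) and coding unit sizes (such as those in an stsz box) can be combined to locate one coding unit in the mdat box. It assumes, purely for simplicity, a fixed number of coding units per chunk; the function and parameter names are hypothetical and not part of the apparatus.

```python
# Minimal sketch (hypothetical names): locating one coding unit in the mdat
# box from stco-style chunk offsets and stsz-style coding unit sizes.
from typing import List, Tuple

def locate_sample(chunk_offsets: List[int],
                  samples_per_chunk: int,
                  sample_sizes: List[int],
                  sample_index: int) -> Tuple[int, int]:
    """Return (byte_offset, byte_size) of the coding unit inside the file."""
    chunk = sample_index // samples_per_chunk       # which chunk holds the unit
    first_in_chunk = chunk * samples_per_chunk      # index of the chunk's first unit
    offset = chunk_offsets[chunk]                   # stco value: chunk start position
    # Skip the sizes of the coding units that precede the target inside the chunk.
    for i in range(first_in_chunk, sample_index):
        offset += sample_sizes[i]
    return offset, sample_sizes[sample_index]

# Example: third video coding unit (index 2) in a file with two-unit chunks.
print(locate_sample([4096, 131072], 2, [60000, 61000, 59000, 62000], 2))
```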
Although the process in the flowchart of
In S301, the cyclic frame determination unit 117 of the control unit 109 judges whether or not a present frame N among a plurality of frames constituting a video to be played is a first frame N (N is a natural number). When the result of the judgment in S301 is Yes (when it is judged that the present frame N is the first frame N), the control unit 109 lets the process proceed to S302. When the result of the judgment in S301 is No (when it is judged that the present frame N is not the first frame N), the control unit 109 lets the process proceed to S308.
In S302, the control unit 109 judges whether a cyclic NR strength setting is “Strong” or not (“Strong” or “Weak”). When the result of the judgment in S302 is Yes (when it is judged that the cyclic NR strength setting is “Strong”), the control unit 109 lets the process proceed to S303. In S303, the control unit 109 reads a frame (N+2) two frames after the first frame N from the RAW video file and causes the image signal processing circuit 107 to carry out the developing process on the read frame (N+2). In S304, the control unit 109 causes the first memory unit 108 to hold the frame (frame (N+2)), which has been subjected to the developing process, via the cyclic frame holding control unit 116.
In S305, the control unit 109 reads a frame (N+1) one frame after the first frame N from the video file and causes the image signal processing circuit 107 to carry out the developing process on the read frame (N+1). In S306, the control unit 109 subjects the frame (N+1) developed in S305 to the cyclic NR process using the frame (N+2) held in the first memory unit 108 in S304. As a result of the cyclic NR process carried out using the two consecutive frames, various types of noise in a time direction can be reduced. In S307, the control unit 109 causes the first memory unit 108 to hold the frame (N+1), which has been subjected to the cyclic NR process, via the cyclic frame holding control unit 116.
In S308, the control unit 109 causes the image signal processing circuit 107 to carry out the developing process on the present frame N. In S309, the control unit 109 subjects the present frame N, which was subjected to the developing process in S308, to the cyclic NR process using the frame (N+1) held in the first memory unit 108. In S310, the control unit 109 causes the display unit 111 to display image data in the frame N obtained by carrying out the cyclic NR process on the present frame N. In S311, the cyclic frame holding control unit 116 of the control unit 109 causes the first memory unit 108 to hold the image data in the frame N, which was obtained by carrying out the cyclic NR process on the present frame N.
When the result of the judgment in S302 is No (when it is judged that the cyclic NR strength setting is not “Strong”), the control unit 109 lets the process proceed to S312. In S312, the control unit 109 reads a frame (N+1) one frame after the present frame N (the first frame N) from the RAW video file and causes the image signal processing circuit 107 to carry out the developing process on the read frame (N+1). In S313, the control unit 109 causes the first memory unit 108 to hold the frame (N+1), which has been subjected to the developing process, via the cyclic frame holding control unit 116. The control unit 109 then lets the process proceed to S308. After that, the control unit 109 carries out the processes in S308 and the subsequent steps.
As described above, the control unit 109 subjects the present frame N (the first frame N) to the cyclic NR process using a frame played backward (treating a frame after the frame N as a frame before the frame N). Conventionally, the cyclic NR process is a method that reduces noise using the correlation between the present frame and the immediately preceding frame. Namely, to carry out the cyclic NR process on one frame, a frame that is immediately preceding that frame is needed. However, there is no frame before the first frame in a RAW video file, and hence the cyclic NR process cannot be carried out on the first frame. Therefore, in the present embodiment, the control unit 109 carries out the cyclic NR process on the first frame N using one or more frames after the first frame N. As a result, when video is played, the cyclic NR process can be carried out on the first frame N as well, and this improves the image quality of the first frame N. Accordingly, the image quality of the video as a whole is also improved.
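The following is a minimal sketch of this playback flow. The develop() and cyclic_nr() functions below are placeholder stand-ins, not the actual developing process or cyclic NR circuit; only the ordering of the steps mirrors the flowchart described above.

```python
# Illustrative sketch of the first-frame handling (S301 to S313).
# develop() and cyclic_nr() are simple stand-ins so the sketch runs.

def develop(raw):                    # stand-in for the developing process
    return float(raw)

def cyclic_nr(image, held, k=0.5):   # stand-in for the cyclic NR process
    return (1.0 - k) * image + k * held

def play(raw_frames, nr_strength="Strong"):
    shown = []
    held = None  # image held in the first memory unit for the cyclic NR process
    for n, raw in enumerate(raw_frames):
        if n == 0:  # S301: first frame, no preceding frame exists
            if nr_strength == "Strong":
                held = develop(raw_frames[n + 2])                    # S303/S304
                held = cyclic_nr(develop(raw_frames[n + 1]), held)   # S305 to S307
            else:  # "Weak"
                held = develop(raw_frames[n + 1])                    # S312/S313
        image = develop(raw)                                         # S308
        image = cyclic_nr(image, held)                               # S309
        shown.append(image)                                          # S310 (display)
        held = image                                                 # S311
    return shown

# Example: a short clip of numeric "frames".
print(play([10, 12, 11, 13], nr_strength="Strong"))
```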
Next, the control unit 109 reads image data corresponding to the frame 8 (the first frame 8), and as indicated by an arrow 402, carries out the developing process on the read frame 8. Namely, the developing process on the frame 9 read from the RAW video file is carried out before that on the first frame 8. As indicated by an arrow 403, the control unit 109 carries out the cyclic NR process on the first frame 8, which was subjected to the developing process, using the frame 9 held in the first memory unit 108. As a result, image data in the frame 8 subjected to the cyclic NR process is generated. The generated image data in the frame 8 is displayed as a display image on the display unit 111, as indicated by an arrow 404. The image data in the frame 8 subjected to the cyclic NR process is held in the first memory unit 108.
For frames following the first frame, the control unit 109, on a frame-by-frame basis, carries out the cyclic NR process on an image in the present frame using an image in the immediately preceding frame subjected to the cyclic NR process, as indicated by an arrow 405. In the example shown in
The control unit 109 also reads the frame 20 as a play frame 1, and as indicated by an arrow 502, carries out the developing process on the read frame 20. As indicated by an arrow 503, the control unit 109 carries out the cyclic NR process on the frame 20, which was subjected to the developing process, using the frame 19 held in the first memory unit 108. As a result, image data in the frame 20 subjected to the cyclic NR process is generated. The image data in the frame 20 subjected to the cyclic NR process is displayed as a display image on the display unit 111, as indicated by an arrow 504. The image data in the frame 20 subjected to the cyclic NR process is held in the first memory unit 108.
For frames following the frame 20, the same process as in the example shown in
The control unit 109 also reads the frame 8, and as indicated by an arrow 602, carries out the developing process on the read frame 8. At this time, the control unit 109 treats the read frame 8 as a play frame 1. As indicated by an arrow 603, the control unit 109 carries out the cyclic NR process on the frame 8, which has been subjected to the developing process, using the frame 12 held in the first memory unit 108. As a result, image data in the frame 8 subjected to the cyclic NR process is generated. The generated image data in the frame 8 subjected to the cyclic NR process is displayed as a display image on the display unit 111, as indicated by an arrow 604. The image data in the frame 8 subjected to the cyclic NR process is held in the first memory unit 108.
For frames following the frame 9, the same cyclic NR process as in the examples shown in
Although the case where the cyclic NR strength setting is “Weak” has been described with reference to
The control unit 109 may cause the display unit 111 to display a video selection screen.
A video selection screen 1851 in
On the video selection screen 1801 in
Assume that before the OK button 1804 is depressed, the currently selected video is changed from the video 2 to a video 3 by an operation performed via the operating unit 115. In this case, the control unit 109 subjects the currently selected video 3 to the same process as the above-described process carried out on the video 2. As a result, the time required for the cyclic NR process on the first frame can be shortened, and hence when the selection of an arbitrary video is confirmed, the first frame can be quickly subjected to the cyclic NR process and the video can quickly start playing. Although the case where the video selection screen 1801 in
As described above, the control unit 109 uses one or more frames after the first frame in carrying out the cyclic NR process on the first frame when playing a video. As a result, the cyclic NR process can be carried out also on the first frame among frames constituting the video, and hence the effect of the noise reduction process can be obtained even for the first frame. Here, proxy video (video with a smaller number of pixels or in a smaller size than a RAW video) is often recorded as video for editing at the same time when RAW video is recorded. If the cyclic NR process has not been carried out on the first frame of the RAW video, the noise reduction effect for the RAW video is lower than that for the proxy video recorded at the same time when the RAW video is recorded.
However, in the present embodiment, the control unit 109 subjects the first frame to the cyclic NR process using one or more frames after the first frame as described above, and hence the RAW video can be played while the same level of noise reduction effect as for the proxy video is achieved.
Although the example in which RAW video is targeted for the process has been described, a target for the process may be video other than RAW video. For example, the process may also be applied to a case where the cyclic NR process is carried out on the first frame when an image that has not been subjected to the cyclic NR process or a video file recorded in an image format (RAW, YUV, etc.) is played. Moreover, the process carried out by the control unit 109 may be applied to a case where a predetermined editing process is to be carried out. Examples of the predetermined editing process include dividing/combining video files, transcoding in which a video file is recorded after its recording format, resolution, and frame rate are converted, and still image clipping in which one frame of RAW video is clipped to be recorded as a still image. In this case, the process in the flowchart of
In a second embodiment, the control unit 109 combines a plurality of video files into one and plays a combined video file.
To carry out the cyclic NR process on a first frame 1-1 of the video 1, the control unit 109 carries out the developing process on a frame 1-2 following the first frame 1-1 first. Likewise, to carry out the cyclic NR process on a first frame 2-1 of the video 2, the control unit 109 carries out the developing process on a frame 2-2 following the first frame 2-1 first. To carry out the cyclic NR process on a first frame 3-1 of the video 3, the control unit 109 carries out a developing process on a frame 3-2 following the first frame 3-1 first. The frames 1-2, 2-2, and 3-2 for use in carrying out (applied to) the cyclic NR process on the first frames 1-1, 2-1, and 3-1 are not displayed on the display unit 111.
Assume here that when the combined video has been played up to a point (a point of combination) at which the video 1 and the video 2 are combined together, the developing process is carried out on the frame 2-2 following the frame 2-1 so as to carry out the cyclic NR process on the first frame 2-1 of the video 2. In this case, the video playing pauses due to the frame 2-2 being subjected to the developing process. In the second embodiment, the control unit 109 completes the developing process on the combined frames 2-2 and 3-2 before the frames 2-2 and 3-2 of the combined video start playing. As a result, the same effects as those in the first embodiment can be obtained without pausing the video playing even in the case where the combined video is played.
In S803, when the cyclic NR process on a first frame has been completed, the control unit 109 starts playing the combined video. Namely, the above-described cyclic NR process is carried out on a first frame of a first video among videos constituting the combined video and completed. Then, the combined video is played. In S804, while playing the combined video, the control unit 109 carries out the developing process on frames following frames at points of combination, at twice the speed at which the combined video is played. Namely, while playing the combined video, the control unit 109 carries out the cyclic NR process on the frames at points of combination (first frames of respective videos constituting the combined video) in advance to complete preparation for playing.
Here, the process in S804 is carried out when the result of the judgment in S802 is Yes. For this reason, even in a case where a video to be played first in the combined video is comprised of two frames, the cyclic NR process has been completed at the time when a first frame (a frame at a point of combination) of the next video is played. Accordingly, since preparation for playing a frame at a point of combination has been completed by the time the frame is played, the combined video being currently played is not paused. After the process in S804, the control unit 109 ends the process in the flowchart of
When the result of the judgment in S802 is No (when the developing process cannot be carried out at such a speed that it is completed within a time period of no more than half of the playing cycle of the combined video), the control unit 109 lets the process proceed to S805. In S805, the control unit 109 completes the cyclic NR process on a first frame and frames at all points of combination (first frames of all videos constituting the combined video). By the process in S805 being carried out, the cyclic NR process on the first frame and the frames at the points of combination is completed to finish the preparation for playing those frames before the combined video starts playing. In S806, the control unit 109 starts playing the combined video. At the time when a frame at a point of combination in the combined video is played, the cyclic NR process on the frame (a preparation for playing the frame) has been completed, and hence the combined video being currently played is not paused. After the process in S806, the control unit 109 ends the process in the flowchart of
When the result of the judgment in S801 is No (when the video to be played is not a combined video), the control unit 109 lets the process proceed to S807. In this case, videos have not been combined together, and hence the cyclic NR process on frames at points of combination need not be taken into consideration. In S807, the control unit 109 carries out the cyclic NR process on a first frame of the video to be played. The process in S807 is the same as the process in the first embodiment described above. As a result, the cyclic NR process on the first frame is completed. In S808, playing of the video is started. After the process in S808, the control unit 109 ends the process in the flowchart of
In the example described above, in S804, the control unit 109 carries out the developing process on a frame following a frame at a point of combination (first frames of videos constituting the combined video or a first frame of “Next video”) in advance to generate images subjected to the cyclic NR process (post-cyclic NR-processed images). Alternatively, the control unit 109 may carry out the developing process on “Next frame” following frames at points of combination in advance and carry out the developing process and the cyclic NR process on the frames at the points of combination in real time.
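The sketch below illustrates how the branching described above might be organized. The helper functions and the timing parameters are placeholder assumptions; only the branching and ordering mirror the flowchart.

```python
# Illustrative sketch of the combined-video playback preparation (S801 to S808).
# The helpers only print what they stand in for, so the sketch runs as-is.

def prepare_first_frame_nr(label):         # stand-in: develop the following frame and
    print(f"cyclic NR ready for {label}")  # carry out cyclic NR on the first frame

def start_playing(label):
    print(f"playing {label}")

def develop_ahead(label, speed):           # stand-in: develop during playback
    print(f"developing {label} at {speed}x speed while playing")

def prepare_combined_playback(num_videos, develop_time, half_play_cycle):
    if num_videos <= 1:                         # S801: No (not a combined video)
        prepare_first_frame_nr("first frame")   # S807
        start_playing("video")                  # S808
        return
    if develop_time <= half_play_cycle:         # S802: develop finishes within half a cycle?
        prepare_first_frame_nr("first frame of video 1")                  # S803
        start_playing("combined video")
        for i in range(2, num_videos + 1):                                # S804
            develop_ahead(f"frame after the point of combination with video {i}", 2)
    else:                                       # S802: No
        prepare_first_frame_nr("first frame of video 1")                  # S805
        for i in range(2, num_videos + 1):
            prepare_first_frame_nr(f"frame at the point of combination with video {i}")
        start_playing("combined video")                                    # S806

prepare_combined_playback(num_videos=3, develop_time=0.010, half_play_cycle=0.016)
```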
In a third embodiment, the strength of the cyclic NR process carried out on a first frame of a RAW video when the RAW video is played is decreased to reduce the phenomenon of an afterimage occurring in the first frame.
Assume here that as in the first embodiment described above, for the RAW video file, the control unit 109 carries out the developing process on the next frame 903 first to generate the frame 901, and carries out the cyclic NR process on the first frame 902 using the frame 901. The frame 901 has the same image as the frame 903 and is used for the cyclic NR process but is not displayed. Here, in the frame 901, a ball of the pendulum swings widely to the right (in a direction of B). When the cyclic NR process is carried out on the first frame 902 using the frame 901, an afterimage of the ball occurs on the B side. The afterimage is a phenomenon that occurs due to processing between frames being performed by taking a difference between an area of the ball in the frame 901 and the same area in the first frame 902.
On the other hand, for the proxy video file, the cyclic NR process is carried out on the frame 905 using the frame 904. In the frame 904, the ball of the pendulum swings widely to the left (in a direction of A). For this reason, when the cyclic NR process is carried out on the first frame 905 using the frame 904, an afterimage of the ball occurs on the A side. Namely, the directions in which the afterimage of the ball occurs are opposite to each other between the RAW video file and the proxy video file. If the directions in which afterimages occur differ between the RAW video file and the proxy video file as shown in
The control unit 109 according to the third embodiment carries out a process for changing settings on the cyclic NR process carried out on a first frame when a RAW video is played and reducing the phenomenon of an afterimage occurring in the first frame.
Arrangements of components other than the control unit 109 in the image pickup apparatus 100 are the same as those in
The image analysis unit 1001 analyzes images in at least two frames among frames of the RAW video and analyzes the amount of movement of the images or the amount of movement of a subject in the images. The reference frame judgment unit 1002 judges, based on the analyzed amount of movement, whether or not to use a result (image data), which has been obtained by carrying out the developing process on a frame after a first frame, as a reference frame for use in cyclic NR. The noise reduction setting control unit 1003 changes settings on the cyclic NR process, which is carried out by the image signal processing circuit 107, based on the result of the judgment by the reference frame judgment unit 1002.
In S1101, the image analysis unit 1001 of the control unit 109 reads the frame 903 as the frame 901, which is to be used in the cyclic NR process carried out on the first frame 902 of a RAW video, as well as the first frame 902, and performs an image analysis using these two frames. At this time, the image analysis unit 1001 performs the image analysis and analyzes the amount of movement of images between the two frames. For example, the image analysis unit 1001 divides each of the images in the two frames using a predetermined number of mesh frames, and based on the similarity in the distribution of brightness values of pixels inside the mesh frames, obtains movement amount information that represents the amount of frame movement between the two frames as horizontal and vertical pixel values. It should be noted that the number of mesh frames by which an image is divided may be an arbitrary value such as “255×255”.
In S1102, the image analysis unit 1001 calculates, based on the result of the analysis in S1101, an evaluation value (frame evaluation value) that represents the tendency of an afterimage to occur. The frame evaluation value is an index of the tendency of an afterimage to occur in a frame. The image analysis unit 1001 uses a maximum value of the amount of movement of each mesh frame as the frame evaluation value. As the amount of movement of a mesh frame increases, the frame evaluation value increases. To judge whether or not a frame includes a scene in which an afterimage tends to occur, the image analysis unit 1001 may calculate the frame evaluation value using arbitrary information for detecting panning and tilting of a camera and movements of a subject. For example, in a case where subject recognition is performed and it is judged that a frame includes a subject likely to be a moving object (a moving animal, a sport scene, etc.), the image analysis unit 1001 may set the frame evaluation value to a greater value than in a case where the subject is not a moving object. The image analysis unit 1001 may calculate the frame evaluation value based on metadata recorded in association with the video when the video is shot and recorded.
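As one possible illustration of the mesh-based analysis, the sketch below estimates per-mesh movement with a simple block-matching search over brightness values and uses the maximum movement as the frame evaluation value. The mesh count, search range, and matching criterion are assumptions, not the apparatus's actual analysis.

```python
# Illustrative sketch of the movement analysis (S1101) and the frame
# evaluation value (S1102) using a simple block-matching search.
import numpy as np

def frame_evaluation_value(prev: np.ndarray, cur: np.ndarray,
                           mesh: int = 16, search: int = 4) -> float:
    """Return the maximum per-mesh movement (in pixels) between two frames."""
    h, w = cur.shape
    bh, bw = h // mesh, w // mesh
    max_move = 0.0
    for by in range(mesh):
        for bx in range(mesh):
            y0, x0 = by * bh, bx * bw
            block = cur[y0:y0 + bh, x0:x0 + bw].astype(np.float32)
            best, best_err = (0, 0), np.inf
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    ys, xs = y0 + dy, x0 + dx
                    if ys < 0 or xs < 0 or ys + bh > h or xs + bw > w:
                        continue
                    ref = prev[ys:ys + bh, xs:xs + bw].astype(np.float32)
                    err = np.abs(block - ref).mean()   # brightness similarity
                    if err < best_err:
                        best_err, best = err, (dy, dx)
            max_move = max(max_move, float(np.hypot(*best)))
    return max_move  # the larger the value, the more an afterimage tends to occur

# Example: a synthetic pair of frames with a small horizontal movement.
rng = np.random.default_rng(0)
prev = rng.integers(0, 255, size=(64, 64)).astype(np.float32)
cur = np.roll(prev, shift=3, axis=1)
print(frame_evaluation_value(prev, cur, mesh=8, search=4))
```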
In S1103, the reference frame judgment unit 1002 of the control unit 109 judges whether or not the frame evaluation value calculated by the image analysis unit 1001 has reached a predetermined threshold value. When the result of the judgment in S1103 is No (when the frame evaluation value has not reached the predetermined threshold value), the reference frame judgment unit 1002 (the control unit 109) proceeds the process to S1104. In this case, a frame after the first frame 902 is allowed to be used for the cyclic NR process on the first frame 902 of the RAW video file. When the result of the judgment in S1103 is No, the amount of movement between frames is not so large, and hence it is likely that the effect of an afterimage on visibility will be somewhat low.
In S1104, the noise reduction setting control unit 1003 sets the setting on the cyclic NR process to a first NR setting. The first NR setting is a setting that is applied so as to carry out the cyclic NR process according to the first or second embodiment described above. The first NR setting may be the same as a setting on the cyclic NR process carried out on a proxy video.
When the result of the judgment in S1103 is Yes (when the frame evaluation value has reached the predetermined threshold value), the reference frame judgment unit 1002 (the control unit 109) proceeds the process to S1105. In this case, a frame after the first frame 902 is not allowed to be used for the cyclic NR process on the first frame 902 of the RAW video file. In S1105, the noise reduction setting control unit 1003 sets the setting on the cyclic NR process to a second NR setting. The second NR setting is a setting that reduces afterimages.
The strength of the cyclic NR process is represented by a cyclic coefficient. The cyclic coefficient is a numeric value that represents an extent to which the effects of a difference between frames are removed. The greater the numeric value, the higher the noise reduction effect. When it is judged that the subject is a moving object (that is, the frame evaluation value is large, i.e. no less than a predetermined threshold value), a process that reduces afterimages by carrying out the cyclic noise reduction process with a reduced cyclic coefficient is carried out. In S1105, the noise reduction setting control unit 1003 may switch to a setting that intends to reduce afterimages by, for example, lowering a threshold value for the moving object judgment by the cyclic NR processing circuit.
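As an illustration of how the cyclic coefficient acts (the formula below is a common textbook formulation given only under that assumption, not necessarily the one used by the image signal processing circuit 107), a cyclic NR process with a normalized coefficient $k$ can be written as

$$ y_N = (1 - k)\,x_N + k\,y_{N-1}, \qquad 0 \le k < 1, $$

where $x_N$ is the developed present frame and $y_{N-1}$ is the processed frame held in the first memory unit 108. A larger $k$ suppresses more of the frame-to-frame difference and hence more noise, but it also lets a moving subject from the reference frame persist as an afterimage; this is why the second NR setting reduces the cyclic coefficient or lowers the threshold value for the moving object judgment. The integer cyclic coefficient values such as 2 and 4 mentioned in the fourth embodiment would correspond to an implementation-specific scale of such a weight.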
In S1106, the control unit 109 carries out the cyclic NR process on the first frame 902 of the RAW video file with the cyclic NR process setting (the first NR setting/the second NR setting) made in S1104/S1105. In the case where the cyclic NR process is carried out with the first NR setting made in S1104, the control unit 109 carries out the cyclic NR process by using a frame after the first frame 902 as a frame for use in the cyclic NR process on the first frame 902. Namely, the same cyclic NR process as in the first embodiment and the second embodiment is carried out. On the other hand, in the case where the cyclic NR process is carried out with the second NR setting made in S1105, the control unit 109 does not use a frame after the first frame 902 as a frame for use in the cyclic NR process on the first frame 902. For this reason, in the case where the cyclic NR process is carried out with the second NR setting made in S1105, the noise reduction effect is lower, but the phenomenon of afterimages is expected to be reduced to a greater extent.
Assume here that the cyclic NR process is carried out on the first frame 902 of the RAW video file with the second NR setting made in S1105, and the cyclic NR process is carried out on the frame 903 following the first frame 902 with the first NR setting made in S1104. In this case, there is a great difference in image quality between the first frame 902 and the subsequent frame 903. In this case, for example, noise appearing when the first frame 902 is played suddenly disappears when the next frame 903 is played. Accordingly, the great difference in image quality between the two frames causes visibility to decrease.
Thus, the control unit 109 may slow down the change in image quality by decreasing, with a predetermined time constant, the amount of change applied to the cyclic NR settings for the first frame 902 as the subsequent frames continue to be played. The control unit 109 may also increase the time constant as the amount of change in the cyclic NR settings for the first frame 902 becomes larger, so that the settings change more gradually.
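A minimal sketch of such a time-constant controlled transition is shown below; the exponential form and the names are assumptions rather than the apparatus's actual control.

```python
# Minimal sketch of easing the change in the cyclic NR settings over successive
# frames with a time constant.
import math

def eased_setting(first_frame_setting: float, steady_setting: float,
                  frames_played: int, time_constant: float) -> float:
    """Approach the steady setting gradually; a larger time constant changes slower."""
    blend = 1.0 - math.exp(-frames_played / time_constant)
    return first_frame_setting + (steady_setting - first_frame_setting) * blend

print([round(eased_setting(0.2, 0.8, i, time_constant=5.0), 2) for i in range(8)])
```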
Although the image analysis unit 1001 performs image analysis of frames in a RAW video file, it may perform image analysis of a proxy video file recorded at the same time as the RAW video file. In this case, the same process as above is carried out based on the result of image analysis on the proxy video file. The image analysis unit 1001 may compare the results of image analysis on the developed frames between the frames 904 and 905 of the proxy video file and the frames 901 and 902 of the RAW video file. In this case, the settings on the cyclic NR process may be changed based on an estimate, obtained from the comparison, of the difference in the direction in which an afterimage appears.
In the third embodiment described above, by changing the settings on the cyclic NR process, the process for reducing the phenomenon of afterimages is carried out, wherein the cyclic NR process is a process that reduces noise using frame correlation. In a fourth embodiment, according to whether or not the frame evaluation value has reached the predetermined threshold value, a process that reduces afterimages is carried out by a method that does not use frame correlation. In the fourth embodiment, for example, a spatial NR process such as an epsilon filter process is applied. In the spatial NR process, information within one frame is used, and hence the phenomenon of afterimages never occurs.
In the fourth embodiment, the cyclic NR process is carried out on a first frame of a proxy video file. An NR process using an epsilon filter, which is a spatial NR process, is carried out on a first frame of a RAW video file when the phenomenon of afterimages tends to occur. The epsilon filter is strengthened by increasing a weighted ratio of a neighboring pixel area or neighboring pixels to a pixel of interest. A description will now be given of an example in which reference pixels are expanded from 3×3 pixels to 5×5 pixels to raise the noise reduction effect.
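The following is a minimal sketch of an epsilon filter as a spatial NR process, assuming a simple per-pixel implementation; the window radius (3×3 or 5×5 reference pixels) and the epsilon threshold are the tuning knobs, and the code is not the image signal processing circuit's actual implementation.

```python
# Illustrative sketch of an epsilon filter (spatial NR): only neighbours whose
# difference from the pixel of interest is within epsilon contribute, so edges
# are preserved while noise is averaged away; a wider window raises the effect.
import numpy as np

def epsilon_filter(img: np.ndarray, radius: int = 1, eps: float = 8.0) -> np.ndarray:
    """radius=1 uses 3x3 reference pixels, radius=2 uses 5x5 (stronger NR)."""
    h, w = img.shape
    src = img.astype(np.float32)
    out = src.copy()
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            window = src[y0:y1, x0:x1]
            center = src[y, x]
            near = window[np.abs(window - center) <= eps]
            out[y, x] = near.mean() if near.size else center
    return out.astype(img.dtype)

# Example: a flat gray patch with added noise; the 5x5 setting lowers the spread.
rng = np.random.default_rng(1)
img = (np.full((32, 32), 128.0) + rng.normal(0, 5, (32, 32))).astype(np.float32)
print(img.std(), epsilon_filter(img, radius=2, eps=16.0).std())
```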
The cyclic NR process is adversely affected by the phenomenon of afterimages in video, and the spatial NR process is adversely affected by a decrease in resolution. In view of these adverse effects, the conversion setting table data to be prepared are adjusted to appropriate values in advance. The conversion setting table data are configured such that the cyclic NR setting is converted into the epsilon filter setting (epsilon type NR) as follows: R1 (Weak: cyclic coefficient 2) into P1 (Weak: 3×3 pixels), and R2 (Strong: cyclic coefficient 4) into P2 (Strong: 5×5 pixels). Namely, the conversion setting table data are configured such that, as the noise reduction effect in the cyclic NR setting increases, the effect of a noise reduction process that is not cyclic (the epsilon filtering process that is the spatial NR process) increases. When the process in S1305 has been carried out, the epsilon NR process, which is the noise reduction process that is not cyclic, is carried out in S1306.
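A minimal sketch of such conversion setting table data could look as follows; the labels and numeric values mirror the description above, and everything else (the dictionary layout and field names) is an assumption.

```python
# Illustrative sketch of conversion setting table data mapping a cyclic NR
# setting to a non-cyclic (epsilon type NR) setting of comparable effect.
CYCLIC_TO_EPSILON = {
    "R1": {"strength": "Weak", "cyclic_coefficient": 2,
           "epsilon_setting": "P1", "reference_pixels": "3x3"},
    "R2": {"strength": "Strong", "cyclic_coefficient": 4,
           "epsilon_setting": "P2", "reference_pixels": "5x5"},
}

def convert_nr_setting(cyclic_setting: str) -> dict:
    """Look up the epsilon type NR setting corresponding to a cyclic NR setting."""
    return CYCLIC_TO_EPSILON[cyclic_setting]

print(convert_nr_setting("R2"))  # the Strong cyclic setting maps to the 5x5 epsilon filter
```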
As described above, in the fourth embodiment, when the phenomenon of afterimages tends to occur, the spatial NR process such as the epsilon filtering process is carried out. As a result, when a RAW video file is played, the extent of afterimages appearing in a direction different from that in a proxy video file can be decreased, and the noise reduction effect can be substantially constant.
In S1501, the image analysis unit 1001 of the control unit 109 performs image analysis for at least two frames (a present frame and a previous frame), and based on the result of the image analysis, calculates the amount of movement. In S1502, the reference frame judgment unit 1002 of the control unit 109 judges whether or not to use the result of development of a frame (N+1) and subsequent frames as reference frames to be used for the cyclic NR process. The judgment in S1502 is made based on whether or not the amount of movement calculated in S1501 is equal to or greater than a predetermined amount. The predetermined amount can be set to an arbitrary value. For example, in S1502, the control unit 109 may make the same judgment as in S1103 in
When the result of the judgment by the reference frame judgment unit 1002 in S1502 is Yes (when the result of development of the frame (N+1) and the subsequent frames is to be used), the control unit 109 proceeds the process to S1503. On the other hand, when the result of the judgment by the reference frame judgment unit 1002 in S1502 is No (when the result of development of the frame (N+1) and the subsequent frames is not to be used), the control unit 109 proceeds the process to S1504. In S1503, the metadata adding unit 1404 of the control unit 109 sets, to metadata, information indicating that the result of development of the frame (N+1) and the subsequent frames is allowed to be applied to the cyclic NR process. In S1504, the metadata adding unit 1404 sets, to metadata, information indicating that the result of development of the frame (N+1) and the subsequent frames is not allowed to be applied to the cyclic NR process.
In S1505, the metadata adding unit 1404 sets the amount of movement calculated in S1501 as metadata. In S1506, the control unit 109 records the above-described metadata as well as the RAW video file in the recording medium 112.
Values calculated by the image analysis unit 1001 and the cyclic frame determination unit 117 as well as the metadata described above are stored in the second memory unit 114. The first example is a process that is carried out when a RAW video is shot in a normal scene in which an afterimage does not tend to be visually recognized, and hence, in S1503, the information indicating that application to the cyclic NR process is possible is added to metadata by the metadata adding unit 1404. In the first example, the information indicating that application to the cyclic NR process is possible is recorded as a value “1” as “usage/non-usage in cyclic NR”.
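For illustration, the sketch below builds the metadata described above. The key names and the movement threshold are assumptions; only the value convention, in which "1" means usable in the cyclic NR process, follows the example in the text.

```python
# Minimal sketch of the metadata recorded together with the RAW video file
# (S1501 to S1506), under assumed key names and threshold.

def build_cyclic_nr_metadata(movement_amount: float, threshold: float) -> dict:
    # S1502 (assumed): a small amount of movement means the frame (N+1) and the
    # subsequent frames may be used as reference frames for the cyclic NR process.
    usable = movement_amount < threshold
    return {
        "usage/non-usage in cyclic NR": 1 if usable else 0,   # S1503 / S1504
        "movement amount": movement_amount,                   # S1505
    }

# A normal scene with little movement would be recorded with the value "1" (S1506).
print(build_cyclic_nr_metadata(movement_amount=1.5, threshold=8.0))
```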
In the fifth embodiment, metadata relating to the cyclic NR process as well as a RAW video can be recorded when the RAW video recording process is carried out. The fifth embodiment is useful when the control according to the third or fourth embodiment is performed.
Embodiment(s) of the disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the disclosure has been described with reference to exemplary embodiments, it is to be understood that the disclosure is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2021-057794, filed Mar. 30, 2021, which is hereby incorporated by reference herein in its entirety.