This application claims the priority benefit of Taiwan application serial no. 113100479, filed on Jan. 4, 2024. The entirety of the above-mentioned patent application is hereby incorporated by reference herein and made a part of this specification.
The application relates to an image processing method, and in particular, to a method for adjusting a video signal.
In general, when photographing fireworks, the fireworks spread through the air relatively slowly in the camera scene, so the trajectory length of the fireworks is often limited by the exposure time of a single frame. To maintain smoothness in general video recording, the frame rate must be kept at a certain level of high frame rate (HFR). As a result, the exposure time of a single frame is limited, so fireworks videos typically capture only bright points of the fireworks and cannot record the lines of their trajectories.
If a longer fireworks trail is to be retained, the exposure time of a single frame needs to be extended. As a result, the output frame rate of the photosensitive element is reduced, which affects the smoothness of video playback. Furthermore, even if time-lapse photography is used to shorten the display time of a single frame for the sake of smoothness (i.e., changing the perceived speed of time to maintain the frame rate), the time-lapse effect makes the fireworks fleeting. This further distorts the viewer's perception of time, so the beauty of slowly drifting fireworks cannot be presented.
The present invention provides a method for adjusting video signal including: obtaining N consecutive input frames obtained by an imaging apparatus by a processor; and generating N result frames based on the N input frames and generating a video signal based on the N result frames by the processor. The step of generating the N result frames includes: storing a first input frame of the N input frames as a first result frame; and selecting multiple input frames of the N input frames according to a time sequence as a source for image-overlaying of an i-th result frame, performing an image-overlaying operation on the selected multiple input frames to obtain the i-th result frame, and storing the i-th result frame, wherein i=2, 3, 4, . . . , N.
Based on the above, the disclosure applies the image-overlaying operation to a video signal originally shot with a short exposure to obtain a video signal that simulates the effect of a long exposure. In this way, the smoothness of video playback is not affected, while the aesthetic quality of long-exposure trails is retained.
The imaging apparatus 140 is implemented, for example, by a video recorder, a camera, or the like using a charge-coupled device (CCD) image sensor or a complementary metal-oxide-semiconductor (CMOS) image sensor. In other embodiments, the imaging apparatus 140 may further be configured with a gyroscope, an accelerometer, a voice coil motor, and so on.
The electronic device 100 includes a processor 110, a memory 120 and a display 130. The processor 110 is coupled to the memory 120 and the display 130. The electronic device 100 may be a smart phone, a smart wearable device, a tablet computer, a notebook computer, a personal computer, or other electronic device with computing capabilities.
In addition, in other embodiments, the display 130 may also be an external device that communicates with the electronic device 100 through wired or wireless means. Here, it is not limited whether the display 130 is integrated with the electronic device 100.
The processor 110 is, for example, a Central Processing Unit (CPU), a Physics Processing Unit (PPU), a Microprocessor, an embedded control chip, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), or other similar devices.
The memory 120 includes one or more program code fragments. After being installed, the program code fragments are executed by the processor 110 to perform the method for adjusting the video signal described below. The memory 120 may be implemented using any type of fixed or removable random-access memory (RAM), read-only memory (ROM), flash memory, hard disk, other similar devices, or a combination of these devices.
The display 130 is used to display the adjusted video signal. The display 130 may be implemented by, for example, a liquid crystal display (LCD), a plasma display, an organic light-emitting diode (OLED) display, a projection system, or the like.
Then, in step S220, the processor 110 generates N result frames RF1 to RFN based on the N input frames IF1 to IFN. Specifically, in step S221, the processor 110 stores the first input frame IF1 as the first result frame RF1. Then, in step S222, the processor 110 selects multiple input frames from the N input frames IF1 to IFN according to a time sequence as a source for image-overlaying of an i-th result frame RFi, performs an image-overlaying operation on the multiple input frames to obtain the i-th result frame RFi, and stores the i-th result frame RFi, wherein i=2, 3, 4, . . . , N. That is to say, the processor 110 takes out the corresponding multiple input frames for each of the second result frame RF2 to the N-th result frame RFN to perform the image-overlaying operation.
For example,
Then, the processor 110 takes out the corresponding source for image-overlaying for the second result frame RF2 to the N-th result frame RFN. In the embodiment shown in
Specifically, the processor 110 takes out two consecutive input frames (IF1 to IF2) starting from the first input frame IF1 in the first region R1 as the source for image-overlaying of the second result frame RF2, and performs the image-overlaying operation on the input frames IF1 to IF2 to obtain the second result frame RF2. The processor 110 takes out three consecutive input frames (IF1 to IF3) starting from the first input frame IF1 in the first region R1 as the source for image-overlaying of the third result frame RF3, and performs the image-overlaying operation on the input frames IF1 to IF3 to obtain the third result frame RF3. The processor 110 takes out four consecutive input frames (IF1 to IF4) starting from the first input frame IF1 in the first region R1 as the source for image-overlaying of the fourth result frame RF4, and performs the image-overlaying operation on the input frames IF1 to IF4 to obtain the fourth result frame RF4. By analogy, subsequent result frames (RF4 to RFN) are obtained.
In other embodiments, Mi consecutive or non-consecutive input frames may also be selected as required as the source for image-overlaying of the i-th result frame RFi, where 1&lt;Mi≤i.
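For illustration only, the frame-selection logic of steps S221 and S222 in the illustrated embodiment (where Mi = i, i.e., the i-th result frame overlays input frames IF1 to IFi) may be sketched in Python as follows. The sketch, including the helper name `generate_result_frames` and the use of a pixel-wise maximum as the overlay operation, is an assumption for illustration and does not limit the disclosure:

```python
import numpy as np

def generate_result_frames(input_frames, overlay):
    """Cumulative selection: the i-th result frame overlays IF1..IFi
    (the Mi = i scheme of the illustrated embodiment)."""
    result_frames = [input_frames[0]]          # step S221: RF1 = IF1
    for i in range(2, len(input_frames) + 1):  # step S222: i = 2, 3, ..., N
        source = input_frames[:i]              # IF1..IFi as overlay source
        result_frames.append(overlay(source))
    return result_frames

# Example overlay: a pixel-wise maximum keeps the brightest trail.
frames = [np.zeros((4, 4), dtype=np.uint8) for _ in range(3)]
for k, frame in enumerate(frames):
    frame[k, k] = 255                          # a bright point moving along the diagonal
results = generate_result_frames(frames, lambda src: np.maximum.reduce(src))
```

With the moving bright point above, the last result frame accumulates all three positions into one trail, which is the simulated long-exposure effect.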
Before performing the image-overlaying operation, the processor 110 may selectively perform an image alignment action or an image stabilization process so that the input frames to be overlaid are aligned. The image alignment operation, for example, uses control points to align the images to the correct positions. The image stabilization process is described below. The processor 110 uses related image feature calculation technologies, such as digital image stabilization (DIS) calculation, to calculate the video jitter caused by hand shake, and thereby determines whether the input frames to be overlaid are aligned. Alternatively, the processor 110 uses electronic image stabilization (EIS) related technology to process the gyroscope signal of the imaging apparatus 140 to estimate the relationship between the viewing angle change and the input frames, and thereby determines whether the input frames to be overlaid are aligned. Alternatively, optical image stabilization (OIS) or gimbal stabilization related technologies are used to process the gyroscope signal and accelerometer signal of the imaging apparatus 140 to estimate the degree of rotation or movement of the imaging apparatus 140, and thereby determine whether the input frames to be overlaid are aligned. The above methods of determining alignment are only examples, and the disclosure is not limited thereto.
After that, during the image-overlaying operation, the processor 110 respectively retains, for each of the first input frame to the (Mi−1)-th input frame included in the source for image-overlaying (Mi input frames), one or more pixels with a brightness value greater than a preset value (preset brightness). Then, the processor 110 overlays the pixels retained from the first input frame to the (Mi−1)-th input frame onto the Mi-th input frame to obtain the i-th result frame RFi.
Alternatively, the processor 110 respectively retains, for each of the first input frame to the (Mi−1)-th input frame included in the source for image-overlaying (Mi input frames), one or more pixels with a saturation value greater than a preset value (preset saturation). Then, the processor 110 overlays the pixels retained from the first input frame to the (Mi−1)-th input frame onto the Mi-th input frame to obtain the i-th result frame RFi.
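For illustration only, the brightness-based retention described above may be sketched in Python as follows. The function name `overlay_bright_pixels` and the preset value of 200 are assumptions for illustration and do not limit the disclosure; the saturation-based variant is analogous, with the comparison applied to a saturation channel instead:

```python
import numpy as np

def overlay_bright_pixels(source_frames, preset_value=200):
    """Overlay pixels brighter than preset_value from the first Mi-1
    frames onto the Mi-th (last) frame; preset_value is an assumed
    preset brightness for illustration."""
    result = source_frames[-1].copy()          # start from the Mi-th input frame
    for frame in source_frames[:-1]:           # first to (Mi-1)-th input frames
        mask = frame > preset_value            # retain only bright pixels
        result[mask] = frame[mask]             # overlay retained pixels
    return result
```

In fireworks footage, only the bright firework pixels of the earlier frames survive the threshold, so the dark sky of the Mi-th frame is preserved while the trail accumulates.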
In addition, the processor 110 may also sum and average the corresponding pixels of the multiple input frames serving as the source for image-overlaying to obtain the i-th result frame RFi. Alternatively, the value of each pixel of the multiple input frames serving as the source for image-overlaying is weighted, summed, and then averaged. The above are only examples, and the disclosure is not limited thereto.
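For illustration only, the two averaging variants may be sketched in Python as follows. The function names and the choice of per-frame weights are assumptions for illustration and do not limit the disclosure:

```python
import numpy as np

def overlay_average(source_frames):
    """Plain average of corresponding pixels across the source frames."""
    stack = np.stack(source_frames).astype(np.float64)
    return stack.mean(axis=0)

def overlay_weighted_average(source_frames, weights):
    """Weighted average; weights holds one assumed weight per source frame."""
    stack = np.stack(source_frames).astype(np.float64)
    w = np.asarray(weights, dtype=np.float64).reshape(-1, 1, 1)
    return (stack * w).sum(axis=0) / w.sum()   # weighted sum, then normalize
```

A weighted average that favors later frames, for example, keeps the newest part of the trail brighter than the older part.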
After that, in step S230, the processor 110 generates the video signal based on the N result frames RF1 to RFN. The processor 110 determines the time interval of the simulated exposure of the N result frames RF1 to RFN based on the frame rate, that is, the number of frames played per second. Assuming the frame rate is 30 fps (frames per second), the time interval corresponding to each result frame is 1/30 seconds, and the playback time length of the video signal formed by the N result frames is N/30 seconds. The frame rate can be flexibly adjusted based on the exposure length to be simulated.
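The timing relation above can be checked with simple arithmetic; the value of N below is an assumption for illustration only:

```python
# At 30 fps each result frame is displayed for 1/30 s,
# so N result frames play back in N/30 seconds.
frame_rate = 30                   # fps, from the example in the text
N = 300                           # assumed number of result frames
frame_interval = 1 / frame_rate   # seconds per result frame
playback_length = N / frame_rate  # total playback time: 300/30 = 10 seconds
```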
Herein, the input frames and the result frames may adopt a Bayer pattern, RGB raw data, YUV, or other data formats. The embodiment uses YUV for processing; however, the data format is not limited thereto. In practical applications, taking the electronic device 100 with the built-in imaging apparatus 140 as an example, after the user obtains the original video signal through photography by the imaging apparatus 140, the processor 110 can perform the image-overlaying operation on the original video signal in real time and transmit the adjusted video signal to the control circuit of the display 130, so that each result frame is rendered to the display screen by the control circuit based on the time interval (for example, 1/30 seconds) between two adjacent frames.
Fireworks photography is used as an illustration below, but the disclosure is not limited thereto.
It can be known from
In summary, the disclosure applies the image-overlaying operation to a video signal originally shot with a short exposure to obtain a video signal that simulates the effect of a long exposure. In this way, the smoothness of video playback is not affected, while the aesthetic quality of long-exposure trails is retained.
Number | Date | Country | Kind
---|---|---|---
113100479 | Jan 2024 | TW | national