This application claims the benefit under 35 U.S.C. § 119(a) of Korean Patent Application No. 10-2023-0150106 filed on Nov. 2, 2023 in the Korean Intellectual Property Office, the entire disclosure of which is incorporated herein by reference for all purposes.
The following disclosure relates to a method and device with image acquisition.
An optical camera system may convert light incident through a lens to an imaging sensor into an electrical signal through a photodiode and construct an image by measuring the electrical signal. In the process of converting photons into an electrical signal through the photodiode, electric charges converted during an exposure time may be accumulated in pixels.
High-speed continuous shooting may acquire images using the same technology as general shooting while operating the camera at an accelerated rate. In high-speed shooting, the exposure time per frame may decrease because of the fast image acquisition rate needed to acquire multiple images at a high speed. The short exposure time may reduce the amount of light received by a camera sensor system and thus may reduce the magnitude of the acquired signals. This effect may intensify as the shooting speed increases, resulting in deterioration of image quality.
Analog/digital amplification that increases the magnitude of electrical signals may be used to compensate for the reduced amount of light during high-speed shooting, but such amplification also amplifies noise in the process. To increase the amount of light received, the area per unit pixel may be increased by enlarging the sensor; this approach, however, has limitations because it affects the size of the device's form factor.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
In one or more general aspects, a processor-implemented method with image acquisition includes: acquiring image frames comprising a shutter-off frame corresponding to a shutter-off through a sensor by performing the shutter-off during continuous shooting; acquiring a measurement signal corresponding to a target image frame; removing, based on a target shutter-off frame corresponding to the target image frame, a first remaining signal corresponding to the target shutter-off frame from a measurement signal corresponding to shutter-off period frames comprising image frames between the target image frame and the target shutter-off frame; and generating a target image frame from which a second remaining signal is removed based on one or more of the shutter-off period frames from which the first remaining signal is removed.
The removing of the first remaining signal may include: estimating the first remaining signal based on the target shutter-off frame and parameter information of the sensor; and removing the first remaining signal from the measurement signal corresponding to the shutter-off period frames.
The estimating of the first remaining signal may include estimating a remaining signal due to a signal generated before a shutter-off corresponding to the target shutter-off frame based on the target shutter-off frame and information on a photodiode forming the sensor.
The removing of the first remaining signal may include removing the first remaining signal from the measurement signal corresponding to the shutter-off period frames by inputting the target shutter-off frame into a first artificial neural network model.
The generating of the target image frame from which the second remaining signal is removed may include generating the target image frame from which the second remaining signal is removed by inputting shutter-off period frames from which the first remaining signal is removed into a second artificial neural network model.
The method may include restoring an image frame corresponding to the target shutter-off frame by inputting a predetermined number of before and after image frames based on the target shutter-off frame into a third artificial neural network model.
The shutter-off frame may construct an image with a residual charge of a photodiode forming the sensor without acquiring a signal during a time corresponding to the shutter-off.
The generating of the target image frame from which the second remaining signal is removed may include generating a target image frame from which a remaining signal due to a signal after a shutter-off corresponding to the target shutter-off frame is removed.
The sensor may include an organic photodiode (OPD).
The sensor may include a hybrid image sensor comprising an OPD and a silicon photodiode.
The sensor may include an image sensor and an optical structure for improving sensitivity.
The acquiring of the image frames may include: determining a frequency and timing of the shutter-off; and acquiring the image frames by performing the shutter-off according to the determined frequency and timing.
The acquiring of the image frames may include acquiring the image frames by periodically performing the shutter-off.
The removing of the first remaining signal may include: acquiring a residual signal between the shutter-off period frames; and acquiring the shutter-off period frames from which the first remaining signal is removed based on the residual signal between the shutter-off period frames.
The generating of the target image frame from which the second remaining signal is removed may include: generating a residual signal between the shutter-off period frames from which the first remaining signal is removed; and generating the target image frame from which the second remaining signal is removed, based on the residual signal between the shutter-off period frames from which the first remaining signal is removed.
The method may include: acquiring a short-term image frame corresponding to an exposure time shorter than an exposure time of the image frames; and generating a high dynamic range (HDR) image based on the short-term image frame.
In one or more general aspects, a non-transitory computer-readable storage medium may store instructions that, when executed by one or more processors, configure the one or more processors to perform any one, any combination, or all of operations and/or methods described herein.
In one or more general aspects, an electronic device includes: an image signal acquisition device comprising a sensor configured to acquire image frames comprising a shutter-off frame corresponding to a shutter-off by performing the shutter-off during continuous shooting, and acquire a measurement signal corresponding to a target image frame; and an image signal restoration device configured to remove, based on a target shutter-off frame corresponding to the target image frame, a first remaining signal corresponding to the target shutter-off frame from a measurement signal corresponding to shutter-off period frames comprising image frames between the target image frame and the target shutter-off frame, and generate a target image frame from which a second remaining signal is removed based on one or more of the shutter-off period frames from which the first remaining signal is removed.
For the removing of the first remaining signal, the image signal restoration device may be configured to: estimate the first remaining signal based on the target shutter-off frame and parameter information of the sensor, and remove the first remaining signal from the measurement signal corresponding to the shutter-off period frames.
For the estimating of the first remaining signal, the image signal restoration device may be configured to estimate a remaining signal due to a signal generated before a shutter-off corresponding to the target shutter-off frame based on the target shutter-off frame and information on a photodiode forming the sensor.
For the removing of the first remaining signal, the image signal restoration device may be configured to remove the first remaining signal from the measurement signal corresponding to the shutter-off period frames by inputting the target shutter-off frame into a first artificial neural network model.
For the generating of the target image frame, the image signal restoration device may be configured to generate the target image frame from which the second remaining signal is removed by inputting shutter-off period frames from which the first remaining signal is removed into a second artificial neural network model.
The image signal restoration device may be configured to restore an image frame corresponding to the target shutter-off frame by inputting a predetermined number of before and after image frames based on the target shutter-off frame into a third artificial neural network model.
The shutter-off frame may construct an image with a residual charge of a photodiode forming the sensor without acquiring a signal during a time corresponding to the shutter-off.
For the generating of the target image frame, the image signal restoration device may be configured to generate a target image frame from which a remaining signal due to a signal after a shutter-off corresponding to the target shutter-off frame is removed.
The sensor may include an organic photodiode (OPD).
The sensor may include a hybrid image sensor comprising an OPD and a silicon photodiode.
The sensor may include an image sensor and an optical structure for improving sensitivity.
For the acquiring of the image frames, the image signal acquisition device may be configured to: determine a frequency and timing of the shutter-off, and acquire the image frames by performing the shutter-off according to the determined frequency and timing.
For the acquiring of the image frames, the image signal acquisition device may be configured to acquire the image frames by periodically performing the shutter-off.
For the removing of the first remaining signal, the image signal restoration device may be configured to: acquire a residual signal between the shutter-off period frames, and acquire the shutter-off period frames from which the first remaining signal is removed based on the residual signal between the shutter-off period frames.
For the generating of the target image frame, the image signal restoration device may be configured to: generate a residual signal between the shutter-off period frames from which the first remaining signal is removed, and generate the target image frame from which the second remaining signal is removed, based on the residual signal between the shutter-off period frames from which the first remaining signal is removed.
The image signal acquisition device may be configured to acquire a short-term image frame corresponding to an exposure time shorter than an exposure time of the image frames, and the image signal restoration device may be configured to generate a high dynamic range (HDR) image based on the short-term image frame.
In one or more general aspects, a processor-implemented method with image acquisition includes: generating a restored signal by removing, from a measurement signal corresponding to an image frame acquired after a shutter-off frame, a first remaining signal corresponding to the shutter-off frame, and removing, from the measurement signal corresponding to the image frame acquired after the shutter-off frame, a second remaining signal corresponding to one or more image frames acquired after the shutter-off frame and before the image frame acquired after the shutter-off frame; and generating a target image frame corresponding to the image frame acquired after the shutter-off frame, based on the restored signal.
Other features and aspects will be apparent from the following detailed description, the drawings, and the claims.
Throughout the drawings and the detailed description, unless otherwise described or provided, the same drawing reference numerals may be understood to refer to the same or like elements, features, and structures. The drawings may not be to scale, and the relative size, proportions, and depiction of elements in the drawings may be exaggerated for clarity, illustration, and convenience.
The following detailed description is provided to assist the reader in gaining a comprehensive understanding of the methods, apparatuses, and/or systems described herein. However, various changes, modifications, and equivalents of the methods, apparatuses, and/or systems described herein will be apparent after an understanding of the disclosure of this application. For example, the sequences within and/or of operations described herein are merely examples, and are not limited to those set forth herein, but may be changed as will be apparent after an understanding of the disclosure of this application, except for sequences within and/or of operations necessarily occurring in a certain order. As another example, the sequences of and/or within operations may be performed in parallel, except for at least a portion of sequences of and/or within operations necessarily occurring in an order, e.g., a certain order. Also, descriptions of features that are known after an understanding of the disclosure of this application may be omitted for increased clarity and conciseness.
Although terms such as “first,” “second,” and “third”, or A, B, (a), (b), and the like may be used herein to describe various members, components, regions, layers, or sections, these members, components, regions, layers, or sections are not to be limited by these terms. Each of these terminologies is not used to define an essence, order, or sequence of corresponding members, components, regions, layers, or sections, for example, but used merely to distinguish the corresponding members, components, regions, layers, or sections from other members, components, regions, layers, or sections. Thus, a first member, component, region, layer, or section referred to in the examples described herein may also be referred to as a second member, component, region, layer, or section without departing from the teachings of the examples.
Throughout the specification, when a component or element is described as “on,” “connected to,” “coupled to,” or “joined to” another component, element, or layer, it may be directly (e.g., in contact with the other component, element, or layer) “on,” “connected to,” “coupled to,” or “joined to” the other component, element, or layer, or there may reasonably be one or more other components, elements, or layers intervening therebetween. When a component or element is described as “directly on,” “directly connected to,” “directly coupled to,” or “directly joined to” another component, element, or layer, there can be no other components, elements, or layers intervening therebetween. Likewise, expressions, for example, “between” and “immediately between” and “adjacent to” and “immediately adjacent to” may also be construed as described in the foregoing.
The terminology used herein is for describing various examples only and is not to be used to limit the disclosure. The articles “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. As non-limiting examples, the terms “comprise” or “comprises,” “include” or “includes,” and “have” or “has” specify the presence of stated features, numbers, operations, members, elements, and/or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, operations, members, elements, and/or combinations thereof, or the alternate presence of alternative stated features, numbers, operations, members, elements, and/or combinations thereof. Additionally, while one embodiment may use such terms as “comprise” or “comprises,” “include” or “includes,” and “have” or “has” to specify the presence of stated features, numbers, operations, members, elements, and/or combinations thereof, other embodiments may exist where one or more of the stated features, numbers, operations, members, elements, and/or combinations thereof are not present.
Unless otherwise defined, all terms, including technical and scientific terms, used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains and based on an understanding of the disclosure of the present application. It will be further understood that terms, such as those defined in commonly-used dictionaries, are to be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and the disclosure of the present application, and are not to be interpreted in an idealized or overly formal sense unless expressly so defined herein.
As used herein, the term “and/or” includes any one and any combination of any two or more of the associated listed items. The phrases “at least one of A, B, and C”, “at least one of A, B, or C”, and the like are intended to have disjunctive meanings, and these phrases “at least one of A, B, and C”, “at least one of A, B, or C”, and the like also include examples where there may be one or more of each of A, B, and/or C (e.g., any combination of one or more of each of A, B, and C), unless the corresponding description and embodiment necessitates such listings (e.g., “at least one of A, B, and C”) to be interpreted to have a conjunctive meaning.
The features described herein may be embodied in different forms, and are not to be construed as being limited to the examples described herein. Rather, the examples described herein have been provided merely to illustrate some of the many possible ways of implementing the methods, apparatuses, and/or systems described herein that will be apparent after an understanding of the disclosure of this application. The use of the term “may” herein with respect to an example or embodiment (e.g., as to what an example or embodiment may include or implement) means that at least one example or embodiment exists where such a feature is included or implemented, while all examples are not limited thereto. The use of the terms “example” and “embodiment” herein has the same meaning (e.g., the phrasing “in one example” has the same meaning as “in one embodiment,” and “one or more examples” has the same meaning as “in one or more embodiments”).
The examples may be implemented as various types of products, such as, for example, a personal computer (PC), a laptop computer, a tablet computer, a smartphone, a television (TV), a smart home appliance, an intelligent vehicle, a kiosk, and a wearable device. Hereinafter, examples will be described in detail with reference to the accompanying drawings. In the drawings, like reference numerals are used for like elements.
Referring to
The image signal acquisition device 110 according to one or more embodiments may generate a plurality of image frames through continuous shooting. The image signal acquisition device 110 may include various optical image acquisition devices capable of high-speed continuous shooting, for example, an ultra-high-speed camera, a medical imaging device, a semiconductor measurement device, a vehicle camera, and the like.
The image signal acquisition device 110 may include a sensor (e.g., a sensor 1105 of
Light incident through the lens to the imaging sensor may be converted into an electrical signal through the photodiode, and the image signal acquisition device 110 may construct or generate an image by measuring the electrical signal. In the process of converting photons into an electrical signal through the photodiode, electric charges converted during an exposure time may be accumulated in pixels.
When capturing images continuously at high speed, it may be difficult for a typical electronic device to secure a signal-to-noise ratio because an image signal is small due to a short exposure time for image acquisition. In contrast to the typical electronic device, the electronic device 100 according to one or more embodiments may acquire (e.g., generate) high-quality images with a high signal-to-noise ratio even in high-speed shooting using the image signal acquisition device 110 and the image signal restoration device 150.
For example, the image signal acquisition device 110 according to one or more embodiments may use an image sensor including a photodiode with high sensitivity to acquire high-quality images with a high signal-to-noise ratio even in high-speed shooting. For example, the image signal acquisition device 110 may use an image sensor including an organic photodiode (OPD). By using the OPD to increase sensitivity, the image signal acquisition device 110 may acquire high-quality images with a high signal-to-noise ratio in high-speed shooting such as a 960-frames-per-second (fps) slow-motion video. As described in detail in the examples below, an OPD may substitute for a portion or the entirety of a photodiode in a CMOS image sensor (CIS) and thereby increase sensitivity while maintaining the same CIS structure.
When an OPD is used to improve sensitivity during high-speed shooting, high-quality images with a high signal-to-noise ratio may be acquired for the same acquisition time due to the high sensitivity. Conversely, when a signal of the same signal-to-noise ratio is acquired, the acquisition time may be reduced, and thus images may be acquired at a higher frame rate.
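This trade-off between sensitivity, exposure time, and signal level can be illustrated with simple arithmetic; the 3x sensitivity gain, photon rate, and target signal level below are hypothetical figures for illustration only and are not taken from this disclosure.

```python
# Hypothetical illustration of the sensitivity/exposure trade-off: a more
# sensitive photodiode accumulates the same signal level (and hence a
# comparable signal-to-noise ratio) in a proportionally shorter exposure.
# All numbers here are assumptions for illustration.

def exposure_for_target_signal(target_signal, sensitivity, photon_rate):
    """Exposure time needed to accumulate `target_signal` charge units."""
    return target_signal / (sensitivity * photon_rate)

baseline_t = exposure_for_target_signal(target_signal=1000.0,
                                        sensitivity=1.0, photon_rate=1e6)
opd_t = exposure_for_target_signal(target_signal=1000.0,
                                   sensitivity=3.0, photon_rate=1e6)

print(baseline_t / opd_t)  # a 3x more sensitive photodiode needs 1/3 the exposure
```

Under these assumed numbers, the same signal level is reached in one third of the exposure time, which is why a higher-sensitivity photodiode can support a higher frame rate at a comparable signal-to-noise ratio.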
However, when the charge in the OPD decays more slowly than in an existing silicon photodiode (SiPD), a remaining signal may persist in subsequent frames. In the typical electronic device, the remaining signal due to such a slow photoresponse may cause image deterioration such as motion blur, halation, color distortion due to signal saturation, and the like.
In contrast to the typical electronic device, the electronic device 100 according to one or more embodiments may use a shutter-off shooting technique that reduces image deterioration due to remaining signals and allows remaining signals to be easily estimated and removed. Through a shutter-off during continuous shooting, the image signal acquisition device 110 may construct an image with the signal that remains in the photodiode, without acquiring a signal of the corresponding time.
Because no signal of the corresponding time is acquired during the shutter-off of one or more embodiments, a remaining signal due to the corresponding signal may be prevented from remaining in a subsequent image frame, thereby reducing deterioration caused by the remaining signal.
Thereafter, the image signal restoration device 150 may acquire, from a corresponding frame, remaining signal information generated before the shutter-off, and estimate and compensate for image deterioration that occurs in an image after the shutter-off due to the previous signal. The image signal restoration device 150 may restore a deteriorated image acquired by the image signal acquisition device 110, which is configured with a high-sensitivity photodiode and a shutter-off high-speed shooting technique, using an artificial neural network-based model. Examples of the detailed operation of the image signal restoration device 150 according to one or more embodiments will be described below with reference to
The description provided with reference to
The electronic device 100 according to one or more embodiments may perform a shutter-off at a predetermined time and frequency during continuous shooting. For example, the electronic device 100 may periodically perform a shutter-off for every N frames. Alternatively, the electronic device 100 may aperiodically perform a shutter-off.
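The periodic scheduling described above can be sketched as follows, assuming (for illustration only) that a shutter-off is performed at every N-th frame starting from frame 0:

```python
# Minimal sketch of periodic shutter-off scheduling during continuous
# shooting. The choice of starting at frame 0 is an assumption; the
# disclosure also allows aperiodic shutter-offs.

def shutter_off_frames(num_frames, period):
    """Indices of frames captured as shutter-off frames, once every `period` frames."""
    return [t for t in range(num_frames) if t % period == 0]

# With a shutter-off every 4th frame, the frames between consecutive
# shutter-off frames (e.g., frames 1-3) form the shutter-off period frames.
print(shutter_off_frames(12, 4))  # [0, 4, 8]
```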
Through the shutter-off shooting technique, the electronic device 100 of one or more embodiments may minimize the remaining signal that persists in a subsequent frame due to the slow photoresponse decay of a high-sensitivity photodiode (for example, an OPD).
The image signal restoration device 150 may estimate a remaining signal before the shutter-off using the shutter-off frame and remove the remaining signal from an image frame. Examples of methods of removing a remaining signal before a shutter-off will be described below with reference to
The image signal restoration device 150 may estimate a remaining signal that remains in an image frame after a shutter-off using a shutter-off frame acquired during the shutter-off and a parameter of a sensor, and remove the estimated remaining signal from the image frame.
Referring to
N image frames may be generated between the first shutter-off frame 310 and the second shutter-off frame 330, and the image frames between the shutter-off frames may be referred to as shutter-off period frames 320. The image signal restoration device 150 may estimate a remaining signal due to a signal before the first shutter-off using the first shutter-off frame 310, and remove the remaining signal due to the signal before the first shutter-off from the shutter-off period frames 320. Thus, the first shutter-off frame 310, which is the shutter-off frame used to remove the remaining signal in the shutter-off period frames 320, may be referred to as the target shutter-off frame of the shutter-off period frames 320.
Of the shutter-off period frames 320, an image frame corresponding to a time t>ts may be referred to as a target image frame. The image signal restoration device 150 may remove the remaining signal due to a slow photoresponse of the photodiode from the target image frame.
When an ideal signal of the target image frame is denoted as x_t, a measurement signal measured or generated by a sensor is denoted as y_t, a remaining signal due to a signal before a shutter-off is denoted as e_t^before
Hereinafter, the remaining signal due to the signal before the shutter-off (e.g., e_t^before
When an image is constructed with residual charges of the photodiode without acquiring a signal of a corresponding time in the first shutter-off frame 310, the first shutter-off frame 310 may be a remaining signal image before the shutter-off.
The image signal restoration device 150 may estimate the effect of the measured afterimage on the image after the shutter-off using the first shutter-off frame 310 and photoresponse information 340 of the photodiode. Thereafter, the image signal restoration device 150 may remove the remaining signal by which the signal before the shutter-off affects the image after the shutter-off. For example, the image signal restoration device 150 may estimate and remove e_t^before
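A signal-processing variant of this first-remaining-signal removal can be sketched as follows. The exponential photoresponse decay model and the value of `tau` are assumptions standing in for the photoresponse information of the photodiode; the disclosure does not prescribe this particular decay law.

```python
import numpy as np

# Sketch: the shutter-off frame images the residual charge, and that
# afterimage is assumed (for illustration) to decay exponentially in the
# frames that follow. Subtracting the decayed afterimage yields the
# shutter-off period frames with the first remaining signal removed.

def remove_first_remaining(shutter_off_frame, measurements, tau):
    """Subtract the decaying pre-shutter-off afterimage from each frame.

    shutter_off_frame: afterimage measured during the shutter-off (frame ts)
    measurements: frames at t = ts+1, ts+2, ... (shutter-off period frames)
    tau: assumed photoresponse decay constant of the photodiode, in frames
    """
    return [y - shutter_off_frame * np.exp(-k / tau)
            for k, y in enumerate(measurements, start=1)]

ideal = np.full((2, 2), 10.0)
afterimage = np.full((2, 2), 4.0)   # residual charge imaged at the shutter-off
ys = [ideal + afterimage * np.exp(-k / 2.0) for k in (1, 2, 3)]
restored = remove_first_remaining(afterimage, ys, tau=2.0)
print(np.allclose(restored, ideal))  # True
```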
Referring to
The first artificial neural network model 350 according to one or more embodiments may be trained to remove the first remaining signal from the measurement signal corresponding to the shutter-off period frames using the shutter-off frame. Training of an artificial neural network model may be performed to determine a parameter that minimizes a loss function. The loss function may be used as an index to determine an optimal parameter in the process of training the artificial neural network model. The loss function according to one or more embodiments may be defined as the difference between ground truth data (the shutter-off period frames from which the first remaining signal is actually removed) and the shutter-off period frames from which the first remaining signal is removed, output from the first artificial neural network model 350, and a parameter that minimizes the loss function may be determined. The method of estimating the first remaining signal is not limited to the method described above, and various methods may be adopted.
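The loss-minimization described above can be illustrated with a deliberately tiny stand-in: a single scalar weight plays the role of the first artificial neural network model 350, and gradient descent minimizes the mean-squared loss between the model output and ground truth. The synthetic data, learning rate, and mixing weight below are assumptions for illustration only; the actual model would be a learned network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training pair: the measurement is the clean frame plus a
# 0.3-weighted copy of the shutter-off frame (the "first remaining signal").
shutter_off = rng.uniform(0.0, 1.0, size=(8, 8))
clean = rng.uniform(0.0, 1.0, size=(8, 8))       # ground truth (signal actually removed)
measurement = clean + 0.3 * shutter_off

w = 0.0     # the single trainable parameter of this toy "model"
lr = 0.5
for _ in range(200):
    pred = measurement - w * shutter_off                    # model output
    grad = np.mean(2.0 * (pred - clean) * (-shutter_off))   # d(MSE loss)/dw
    w -= lr * grad

print(round(w, 3))  # converges toward the true mixing weight, 0.3
```

Minimizing the loss drives the parameter toward the weight that reproduces the ground truth, which is the same principle the disclosure describes for determining the network parameters.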
Referring to
When signal acquisition is resumed after a first shutter-off, a remaining signal (e.g., the second remaining signal) due to the signal acquisition after the first shutter-off may remain in the target image frame. For example, when the target image frame is a fourth image frame after the first shutter-off, the target image frame may include a remaining signal due to acquisition of first to third image frames after the first shutter-off.
The image signal restoration device 150 may acquire a target image frame from which the second remaining signal is removed based on an artificial neural network model. For example, the image signal restoration device 150 may acquire the target image frame from which the second remaining signal is removed by inputting the shutter-off period frames from which the first remaining signal is removed into a second artificial neural network model 410. The image signal restoration device 150 may acquire a signal from which the second remaining signal is removed, as expressed by Equation 3 below, for example, through this process.
The second artificial neural network model 410 according to one or more embodiments may be trained to output the target image frame from which the second remaining signal is removed using shutter-off period frames. A loss function according to one or more embodiments may be defined as the difference between ground truth data (the target image frame from which the second remaining signal is actually removed) and the target image frame from which the second remaining signal is removed, output from the second artificial neural network model 410, and a parameter that minimizes the loss function may be determined.
Since the remaining signal information changes based on the first shutter-off frame 310, the electronic device 100 of one or more embodiments may consider the shutter-off period frames 320 without the need to consider the entire image series for image restoration, which is efficient. For example, when the image signal restoration device 150 compensates for the first remaining signal, the image signal restoration device 150 of one or more embodiments may restore an image using only the shutter-off period frames 320 instead of the entire image series to compensate for the second remaining signal, thereby reducing computing resources to be used for image restoration. However, the method of removing the remaining signal after the shutter-off is not limited to the method described above. For example, the image signal restoration device 150 may estimate and remove the second remaining signal through a signal processing method.
Referring to
Accordingly, the image signal restoration device 150 may restore the shutter-off frame using neighboring frame information. For example, the image signal restoration device 150 may restore an image frame corresponding to the shutter-off frame by inputting a predetermined number of before and after image frames based on the shutter-off frame into a third artificial neural network model 510.
For example, the image signal restoration device 150 may restore an image frame 540 corresponding to the first shutter-off frame 310 by inputting two previous image frames 520 and two subsequent image frames 530 of the first shutter-off frame 310 into the third artificial neural network model 510. However, the number of image frames used to restore the shutter-off frame is not limited to the example described above.
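A deliberately simple stand-in for the third artificial neural network model 510 illustrates the idea: the missing shutter-off frame is estimated from its temporal neighbors, here by a plain average. A learned model would replace this average; the two-before/two-after configuration mirrors the example above.

```python
import numpy as np

# Sketch: restore the frame missing at the shutter-off time from a fixed
# number of neighboring frames. The plain temporal average used here is an
# illustrative assumption, not the disclosure's neural network model.

def restore_shutter_off_frame(prev_frames, next_frames):
    """Estimate the missing frame from its temporal neighbors."""
    neighbors = np.stack(prev_frames + next_frames)
    return neighbors.mean(axis=0)

prev_frames = [np.full((2, 2), v) for v in (1.0, 2.0)]  # two frames before
next_frames = [np.full((2, 2), v) for v in (4.0, 5.0)]  # two frames after
restored = restore_shutter_off_frame(prev_frames, next_frames)
print(restored[0, 0])  # 3.0
```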
Here, the third artificial neural network model 510 used to restore the shutter-off frame may be integrated with the second artificial neural network model 410, or the two models may perform processing sequentially.
For ease of description, it is described that operations 610 to 650 are performed by the electronic device 100 illustrated in
Furthermore, the operations of
In operation 610, the electronic device 100 according to one or more embodiments may acquire image frames including a shutter-off frame corresponding to a shutter-off through a sensor by performing the shutter-off during continuous shooting. The shutter-off frame may construct an image with a residual charge of a photodiode forming the sensor without acquiring a signal during a time corresponding to the shutter-off.
In operation 620, the electronic device 100 according to one or more embodiments may acquire a measurement signal corresponding to a target image frame.
In operation 630, the electronic device 100 according to one or more embodiments may remove, based on a target shutter-off frame corresponding to the target image frame, a first remaining signal corresponding to the target shutter-off frame from a measurement signal corresponding to shutter-off period frames.
The image signal restoration device 150 may estimate the first remaining signal corresponding to the target shutter-off frame based on the target shutter-off frame corresponding to the target image frame and parameter information of the sensor. The image signal restoration device 150 may estimate a remaining signal due to a signal before a shutter-off corresponding to the target shutter-off frame based on the target shutter-off frame and information on a photodiode forming the sensor.
The electronic device 100 may remove the first remaining signal from the measurement signal corresponding to the shutter-off period frames including image frames between the target image frame and the target shutter-off frame. The image signal restoration device 150 may acquire a target image frame from which a remaining signal due to a signal after the shutter-off corresponding to the target shutter-off frame is removed.
Alternatively or additionally, the image signal restoration device 150 may remove the first remaining signal from a measurement signal corresponding to shutter-off period frames by inputting the target shutter-off frame into a first artificial neural network model.
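A minimal sketch of the signal-processing alternative, assuming a hypothetical exponential decay of the residual charge; the decay constant tau, the function names, and the flattened 1-D frames are assumptions for illustration, not part of the disclosure:

```python
import math

def estimate_first_remaining_signal(shutter_off_frame, frames_after, tau):
    """Hypothetical model: the residual charge captured in the shutter-off
    frame decays exponentially, so the remaining signal in the k-th frame
    after the shutter-off is the shutter-off frame scaled by exp(-k/tau)."""
    scale = math.exp(-frames_after / tau)
    return [p * scale for p in shutter_off_frame]

def remove_first_remaining_signal(measured, shutter_off_frame, frames_after, tau):
    """Subtract the estimated first remaining signal from a measured
    shutter-off period frame."""
    remaining = estimate_first_remaining_signal(shutter_off_frame, frames_after, tau)
    return [m - r for m, r in zip(measured, remaining)]

# A shutter-off frame of pure residual charge, and the next measured frame.
off_frame = [8.0, 4.0]
measured = [100.0, 60.0]
cleaned = remove_first_remaining_signal(measured, off_frame, frames_after=1, tau=2.0)
```

In the disclosure the estimate may instead come from sensor parameter information or from the first artificial neural network model; the decay model above only illustrates the subtraction step.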
In operation 640, the electronic device 100 according to one or more embodiments may acquire a target image frame from which a second remaining signal is removed based on at least one of the shutter-off period frames from which the first remaining signal is removed.
The image signal restoration device 150 may acquire the target image frame from which the second remaining signal is removed by inputting the shutter-off period frames from which the first remaining signal is removed into a second artificial neural network model.
In operation 650, the electronic device 100 according to one or more embodiments may restore an image frame corresponding to the target shutter-off frame. The image signal restoration device 150 may restore the image frame corresponding to the target shutter-off frame by inputting a predetermined number of image frames before and after the target shutter-off frame into a third artificial neural network model.
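The flow of operations 630 to 650 can be sketched with the artificial neural network models abstracted as callables; their internals, the two-before/two-after window, and the scalar stand-in frames are assumptions for illustration:

```python
def restore_frames(frames, off_idx, remove_first, remove_second, restore_off):
    """Sketch of operations 630-650; the models are passed in as callables
    rather than implemented here."""
    off_frame = frames[off_idx]
    # Operation 630: remove the first remaining signal (due to the signal
    # before the shutter-off) from the shutter-off period frames.
    cleaned = [remove_first(f, off_frame) for f in frames[off_idx + 1:]]
    # Operation 640: remove the second remaining signal based on the
    # cleaned shutter-off period frames.
    targets = [remove_second(f, cleaned) for f in cleaned]
    # Operation 650: restore an image frame in place of the shutter-off
    # frame from its neighbors (two before and two after).
    before = frames[max(0, off_idx - 2):off_idx]
    after = frames[off_idx + 1:off_idx + 3]
    restored_off = restore_off(before, after)
    return restored_off, targets

# Trivial stand-ins: a scaled subtraction, an identity, and a mean.
frames = [1.0, 2.0, 0.5, 4.0, 5.0, 6.0]  # frame 2 is the shutter-off frame
restored, targets = restore_frames(
    frames, off_idx=2,
    remove_first=lambda f, off: f - 0.1 * off,
    remove_second=lambda f, cleaned: f,
    restore_off=lambda before, after: sum(before + after) / len(before + after))
```

Passing the models as callables mirrors the disclosure's point that the models may be integrated or run sequentially without changing the overall data flow.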
Referring to
The image sensor 710 may include an OPD or may be a hybrid image sensor including an OPD and a SiPD.
The sensor 700 may reduce an exposure time without a loss in signal-to-noise ratio and secure a reset time by introducing, to the image sensor 710, the optical structure 720 for improving sensitivity.
Diagram 730 in
Comparing the first graph 741 and the second graph 742, it may be seen that, when the sensor 700 to which the optical structure 720 for improving sensitivity is applied is used, the residual charge of the photodiode decreases during the additionally secured reset time, and thus remaining signals may be reduced.
As described above, the electronic device 100 may perform a shutter-off at a predetermined time and frequency during continuous shooting. For example, the electronic device 100 may periodically perform a shutter-off for every N frames.
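For the periodic case, the shutter-off schedule can be sketched as follows; the function name and zero-based frame indexing are assumptions:

```python
def is_shutter_off_frame(frame_index, period):
    """Periodic shutter-off: with period N, every N-th frame
    (zero-based indices 0, N, 2N, ...) is acquired with the shutter off."""
    return frame_index % period == 0

# With N = 4, frames 0 and 4 of the first eight frames are shutter-off frames.
schedule = [is_shutter_off_frame(i, period=4) for i in range(8)]
print(schedule)  # [True, False, False, False, True, False, False, False]
```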
Alternatively, referring to
For example, the first shutter-off period frames 810 may include N_{i-1} frames, the second shutter-off period frames 820 may include N_i frames, the third shutter-off period frames 830 may include N_{i+1} frames, and the fourth shutter-off period frames 840 may include N_{i+2} frames.
Furthermore, the electronic device 100 may optimize the image signal restoration device 150 to be specialized to an aperiodic image acquisition method.
When a plurality of image frames are acquired through high-speed continuous shooting, the difference between neighboring frames may not be significant. Accordingly, the image signal restoration device 150 according to one or more embodiments may restore an image using residual signals of the image frames, rather than using the acquired signals of the image frames directly. A residual signal of an i-th image frame may be the difference between the i-th image frame and an (i-1)-th image frame.
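The residual-signal definition can be sketched directly; frames are flattened to 1-D lists here for brevity:

```python
def residual_signals(frames):
    """Residual signal of the i-th frame: the difference between the i-th
    and the (i-1)-th frame, which stays small when neighboring frames of a
    high-speed burst barely differ."""
    return [[cur_p - prev_p for prev_p, cur_p in zip(prev, cur)]
            for prev, cur in zip(frames, frames[1:])]

frames = [[100, 100], [102, 101], [103, 105]]
print(residual_signals(frames))  # [[2, 1], [1, 4]]
```

Because neighboring frames in a burst are nearly identical, the residuals concentrate most of their energy near zero, which is what makes them an attractive input representation for restoration.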
Referring to
For example, the residual signal of the target image frame from which the first remaining signal is removed may be expressed by Equation 4 below.
Since y_t′ is input into the second artificial neural network model 410 in
Similarly, the image signal restoration device 150 may acquire a residual signal between shutter-off period frames, and acquire shutter-off period frames from which the first remaining signal is removed based on the residual signal between the shutter-off period frames.
For example, the residual signal between the shutter-off period frames from which the first remaining signal is removed may be expressed by Equation 5 below.
The image signal restoration device 150 may acquire the target image frame from which the second remaining signal is removed by inputting the residual signal between the shutter-off period frames into a second artificial neural network model.
The image signal acquisition device 110 according to one or more embodiments may acquire an image with a short exposure time instead of a shutter-off.
Referring to
The image signal restoration device 150 may generate a high dynamic range (HDR) image based on the image 1010 with a short exposure time. In this case, as well, the image signal restoration device 150 may remove the first remaining signal and the second remaining signal using the image 1010 with a short exposure time.
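As a loose illustration only, since the disclosure does not specify the HDR pipeline, saturated pixels of a normally exposed frame could be replaced by scaled pixels from the short-exposure image 1010; the exposure ratio, saturation threshold, and function name are assumptions:

```python
def fuse_hdr(long_exp, short_exp, exposure_ratio, sat_threshold=255):
    """Toy HDR fusion: where the long-exposure pixel is saturated, use the
    short-exposure pixel scaled up by the exposure ratio instead."""
    return [s * exposure_ratio if l >= sat_threshold else l
            for l, s in zip(long_exp, short_exp)]

# A saturated highlight (255) is recovered from the short exposure.
print(fuse_hdr([50, 255], [6, 40], exposure_ratio=8))  # [50, 320]
```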
Referring to
The processor 1101 according to one or more embodiments may perform any one, any combination, or all of the operations and/or methods described above with reference to
The memory 1103 according to one or more embodiments may be a volatile memory or a non-volatile memory, and the memory 1103 may store data (e.g., parameters of the trained first to third artificial neural network models) to be used for performing image restoration. For example, the memory 1103 may include a non-transitory computer-readable storage medium storing instructions that, when executed by the processor 1101, configure the processor 1101 to perform any one, any combination, or all of the operations and/or methods described herein with reference to
The sensor 1105 according to one or more embodiments may include an image sensor. The sensor 1105 may include an image sensor including an OPD. The sensor 1105 may include a hybrid image sensor including an OPD and a SiPD. The sensor 1105 may include an image sensor and an optical structure for improving sensitivity.
The electronic device 1100 according to one or more embodiments may further include other components not shown in the drawings. For example, the electronic device 1100 may further include a communication module, and an input/output interface including an input device and an output device as a means for interfacing with the communication module. In addition, for example, the electronic device 1100 may further include other components such as a transceiver, various sensors, and a database.
The electronic devices, image signal acquisition devices, image signal restoration devices, sensors, image sensors, optical structures, processors, memories, electronic device 100, image signal acquisition device 110, image signal restoration device 150, sensor 700, image sensor 710, optical structure 720, electronic device 1100, processor 1101, memory 1103, and sensor 1105 described herein, including descriptions with respect to
The methods illustrated in, and discussed with respect to,
Instructions or software to control computing hardware, for example, one or more processors or computers, to implement the hardware components and perform the methods as described above may be written as computer programs, code segments, instructions or any combination thereof, for individually or collectively instructing or configuring the one or more processors or computers to operate as a machine or special-purpose computer to perform the operations that are performed by the hardware components and the methods as described above. In one example, the instructions or software include machine code that is directly executed by the one or more processors or computers, such as machine code produced by a compiler. In another example, the instructions or software include higher-level code that is executed by the one or more processors or computers using an interpreter. The instructions or software may be written using any programming language based on the block diagrams and the flow charts illustrated in the drawings and the corresponding descriptions herein, which disclose algorithms for performing the operations that are performed by the hardware components and the methods as described above.
The instructions or software to control computing hardware, for example, one or more processors or computers, to implement the hardware components and perform the methods as described above, and any associated data, data files, and data structures, may be recorded, stored, or fixed in or on one or more non-transitory computer-readable storage media, and thus, not a signal per se. As described above, or in addition to the descriptions above, examples of a non-transitory computer-readable storage medium include one or more of any of read-only memory (ROM), random-access programmable read only memory (PROM), electrically erasable programmable read-only memory (EEPROM), random-access memory (RAM), dynamic random access memory (DRAM), static random access memory (SRAM), flash memory, non-volatile memory, CD-ROMs, CD-Rs, CD+Rs, CD-RWs, CD+RWs, DVD-ROMs, DVD-Rs, DVD+Rs, DVD-RWs, DVD+RWs, DVD-RAMs, BD-ROMs, BD-Rs, BD-R LTHs, BD-REs, Blu-ray or optical disk storage, hard disk drive (HDD), solid state drive (SSD), flash memory, a card type memory such as multimedia card micro or a card (for example, secure digital (SD) or extreme digital (XD)), magnetic tapes, floppy disks, magneto-optical data storage devices, optical data storage devices, hard disks, solid-state disks, and/or any other device that is configured to store the instructions or software and any associated data, data files, and data structures in a non-transitory manner and provide the instructions or software and any associated data, data files, and data structures to one or more processors or computers so that the one or more processors or computers can execute the instructions.
In one example, the instructions or software and any associated data, data files, and data structures are distributed over network-coupled computer systems so that the instructions and software and any associated data, data files, and data structures are stored, accessed, and executed in a distributed fashion by the one or more processors or computers.
While this disclosure includes specific examples, it will be apparent after an understanding of the disclosure of this application that various changes in form and details may be made in these examples without departing from the spirit and scope of the claims and their equivalents. The examples described herein are to be considered in a descriptive sense only, and not for purposes of limitation. Descriptions of features or aspects in each example are to be considered as being applicable to similar features or aspects in other examples. Suitable results may be achieved if the described techniques are performed in a different order, and/or if components in a described system, architecture, device, or circuit are combined in a different manner, and/or replaced or supplemented by other components or their equivalents.
Therefore, in addition to the above and all drawing disclosures, the scope of the disclosure is also inclusive of the claims and their equivalents, i.e., all variations within the scope of the claims and their equivalents are to be construed as being included in the disclosure.
| Number | Date | Country | Kind |
|---|---|---|---|
| 10-2023-0150106 | Nov 2023 | KR | national |