This application is based on and claims priority under 35 U.S.C. § 119 to Korean Patent Application No. 10-2022-0121116, filed on Sep. 23, 2022, in the Korean Intellectual Property Office, the disclosure of which is incorporated by reference herein in its entirety.
The disclosure relates to an image acquisition apparatus and an electronic apparatus including the same, and a method of controlling the image acquisition apparatus.
An imaging sensor is an apparatus that receives light incident from an object and photoelectrically converts the received light to generate electrical signals. For color expression, the imaging sensor usually uses a color filter composed of an array of filter elements that selectively transmit red light, green light, and blue light, and, after sensing the amount of light transmitted through each filter element, forms a color image of the object through image processing.
A spectral filter included in a multispectral imaging (MSI) sensor has a Fabry-Perot cavity structure that uses resonance of light of a specific wavelength between two reflectors to generate an image composed of only a specific wavelength band. These spectral channels may be configured in, for example, a 4×4 array. Such a unit filter structure represents one piece of spatial information, and there are, for example, 640×480 such unit filters, thereby enabling multispectral imaging of 640×480 pixels composed of 16 channels of different wavelength bands.
In general, resonance efficiency of a reflector of the Fabry-Perot cavity for each wavelength varies due to the difference in reflectivity according to wavelengths, which affects transmission efficiency (hereinafter referred to as TE). In addition, with respect to a CMOS sensor, quantum efficiency (hereinafter referred to as QE) for each wavelength is not constant and decreases toward a longer wavelength. The QE×TE values considering both of these effects vary greatly for each spectral channel, and a channel of a specific wavelength is saturated more easily than a channel of another wavelength when imaging with an MSI sensor.
In addition, due to the dynamic range of the scene itself, a specific object in the scene may be too dark or saturated at a single exposure time. For such a case, there is a high dynamic range (hereinafter referred to as HDR) imaging method capable of obtaining an image in which no object is saturated by combining multiple images having different exposure times. However, it is difficult to apply this technique to a dynamic object because the several images must be obtained sequentially.
Provided are an image acquisition apparatus and an electronic apparatus including the same, and a method of controlling the image acquisition apparatus.
Additional aspects will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the presented embodiments of the disclosure.
According to an aspect of the disclosure, there is provided an image acquisition apparatus including: a multispectral imaging sensor configured to acquire a plurality of image signals corresponding to at least four channels based on a wavelength band of about 10 nm to about 1000 nm; and a processor configured to: set an exposure time for each of the at least four channels based on transmission efficiency for each of a plurality of wavelengths of each of the at least four channels and quantum efficiency for each of the plurality of wavelengths of each of the at least four channels, and generate a high dynamic range (HDR) image based on the plurality of image signals corresponding to the at least four channels, the plurality of image signals being obtained based on the set exposure time.
The processor may be further configured to set a different exposure time for each of the at least four channels based on a value obtained by multiplying the transmission efficiency for each of the plurality of wavelengths and the quantum efficiency for each of the plurality of wavelengths.
Each of the at least four channels may include N×N pixels, wherein the processor may be further configured to set a different exposure time for each of the N×N pixels, and wherein N is a natural number greater than or equal to 2.
The processor may be further configured to extract a first image signal, among the plurality of image signals, from a first pixel for which a shortest exposure time is set among the N×N pixels when a brightness of an object is greater than or equal to a first threshold value.
The processor may be further configured to extract a second image signal, among the plurality of image signals, from a second pixel for which a longest exposure time is set among the N×N pixels when a brightness of an object is less than a first threshold value.
A maximum value of the exposure time may be set based on a value obtained by multiplying the transmission efficiency for each of the plurality of wavelengths and the quantum efficiency for each of the plurality of wavelengths.
A value obtained by multiplying the transmission efficiency for each of the plurality of wavelengths and the quantum efficiency for each of the plurality of wavelengths may be a magnitude of a signal of the MSI sensor, the signal being output per unit time.
The processor may be further configured to: set a first exposure time for a first channel in which the value obtained by multiplying the transmission efficiency for each of the plurality of wavelengths and the quantum efficiency for each of the plurality of wavelengths of the first channel is equal to or greater than a first threshold value, and set a second exposure time for a second channel, in which the value obtained by multiplying the transmission efficiency for each of the plurality of wavelengths and the quantum efficiency for each of the plurality of wavelengths of the second channel is less than the first threshold value, the second exposure time being longer than the first exposure time.
The processor may be further configured to: group channels having same wavelength bands among the at least four channels, and set a different exposure time for each of the grouped channels.
The respective areas of the at least four channels of the MSI sensor may be different from each other.
The processor may be further configured to set a different analog gain value for each of the at least four channels.
According to another example embodiment, an electronic apparatus includes the image acquisition apparatus.
According to another aspect of the disclosure, there is provided a method of controlling an image acquisition apparatus including a multispectral imaging sensor configured to acquire a plurality of image signals corresponding to at least four channels based on a wavelength band of about 10 nm to about 1000 nm, the method including: setting an exposure time for each of the at least four channels based on transmission efficiency for each of a plurality of wavelengths of each of the at least four channels and quantum efficiency for each of the plurality of wavelengths of each of the at least four channels; and generating a high dynamic range (HDR) image based on the plurality of image signals corresponding to the at least four channels, the plurality of image signals being obtained based on the set exposure time.
The above and other aspects, features, and advantages of certain embodiments of the disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout. In this regard, the present embodiments may have different forms and should not be construed as being limited to the descriptions set forth herein. Accordingly, the embodiments are merely described below, by referring to the figures, to explain aspects. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items. Expressions such as “at least one of,” when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list.
Hereinafter, embodiments will be described in detail with reference to the accompanying drawings. The described embodiments are merely illustrative, and various modifications are possible from these embodiments. In the following drawings, the same reference numbers refer to the same components, and the size of each component in the drawings may be exaggerated for clarity and convenience of description.
Hereinafter, terms such as "above" and "on" may include both a case of being physically in contact and a case of being positioned at a higher level without contact.
The terms such as “first”, “second” etc. may be used to describe various components, but are used only for the purpose of distinguishing one component from another. These terms are not intended to limit the difference in material or structure of components.
Singular expressions include plural expressions unless the context clearly dictates otherwise. In addition, when a part "includes" a certain component, this means that the part may further include other components rather than excluding them, unless otherwise stated.
In addition, terms such as “unit”, “module” etc. described in the description may include a unit that processes at least one function or operation, which may be implemented as hardware or software or a combination thereof.
Use of the term “the” and similar denoting terms may correspond to both singular and plural.
Operations of a method may be performed in any suitable order, unless explicitly stated that they must be performed in the order described. In addition, the use of all exemplary terms ("for example", "etc.", and the like) is simply for explaining technical ideas in detail, and the scope of rights is not limited by these terms unless limited by the claims.
In an example embodiment, HDR or High Dynamic Range Imaging (HDRI) is a technique used in imaging and photography to reproduce a greater dynamic range of light intensity than is possible with standard digital imaging or photography techniques. Human eyes can adapt to a wide range of lighting conditions, but most imaging apparatuses are limited to 256 levels since they use 8 bits per channel. When taking a photo of a real scene, a single exposure may not capture all the detail as bright areas can be overexposed while dark areas can be underexposed. HDR imaging works with images that use more than 8 bits per channel (usually 32-bit floating point values), allowing for a much wider dynamic range.
There are many ways to get HDR images, but the most common way is to use photos of a scene taken with different exposure values. In order to combine these exposures, it is useful to know camera response functions and various algorithms are available to estimate them. After the HDR images are merged, they must be converted back to 8-bit to be viewed on a typical display, a process called tone mapping.
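As an illustration only, the merging step described above can be sketched as follows. This minimal sketch assumes a linear camera response, whereas a real pipeline would first estimate the response curve (for example with the Debevec method); the pixel values and exposure times are hypothetical.

```python
# Minimal HDR merge sketch: combine exposures of one pixel into a radiance
# estimate. Assumes 8-bit pixel values and a LINEAR camera response (a real
# pipeline would first estimate the response curve).

def hat_weight(z, z_min=0, z_max=255):
    """Triangle ("hat") weight: trust mid-range values, distrust extremes."""
    mid = 0.5 * (z_min + z_max)
    return (z - z_min) if z <= mid else (z_max - z)

def merge_exposures(pixels, exposure_times):
    """pixels[i] is the 8-bit value of one pixel at exposure_times[i] (s).
    Returns the weighted-average radiance estimate (value per second)."""
    num = 0.0
    den = 0.0
    for z, t in zip(pixels, exposure_times):
        w = hat_weight(z)
        num += w * (z / t)   # linear response: radiance ~ value / exposure
        den += w
    return num / den if den > 0 else 0.0

# One pixel observed at three exposure times; all three observations are
# consistent with a radiance of 4000 counts per second.
radiance = merge_exposures([40, 80, 160], [0.01, 0.02, 0.04])
```

After such a merge, tone mapping would compress the result back to a displayable 8-bit range.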
In an example embodiment, multispectral imaging (MSI) may use a spectral filter having a Fabry-Perot structure. A Fabry-Perot structure is generally constructed by inserting a cavity or resonant layer between two mirrors having high reflectivity. The basic principle of such filters is that when light of various wavelengths such as λ1, λ2, λ3, λ4, λ5, etc. transmitted through an optical fiber is incident on the filter, multiple interference occurs in the cavity so that only a specific wavelength is transmitted and other wavelengths are reflected. By doing so, only data or information of interest may be selected. In an example embodiment, MSI uses a spectral filter having a Fabry-Perot structure, but is not limited thereto. As such, according to another example embodiment, multispectral imaging may be performed using a different structure.
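For illustration, the idealized resonance condition of such a cavity can be sketched numerically. This sketch uses the textbook normal-incidence condition m·λ = 2·n·L (cavity index n, thickness L, interference order m); the numeric values are illustrative assumptions, not values from the disclosure.

```python
# Idealized Fabry-Perot resonance sketch (normal incidence): the cavity
# transmits wavelengths satisfying m * wavelength = 2 * n * L, where n is
# the cavity refractive index, L its thickness, and m the interference
# order. Values below are illustrative only.

def resonance_wavelengths_nm(n, thickness_nm, orders):
    """Transmission-peak wavelengths for the given interference orders."""
    return [2.0 * n * thickness_nm / m for m in orders]

# A hypothetical 300 nm cavity with index 1.5 resonates at 900 nm (m=1),
# 450 nm (m=2), and 300 nm (m=3).
peaks = resonance_wavelengths_nm(n=1.5, thickness_nm=300.0, orders=[1, 2, 3])
```

In practice the transmitted band also depends on the reflector materials and mirror reflectivity, which is why transmission efficiency varies with wavelength.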
Referring to
The multispectral imaging sensor 100 acquires images of a plurality of channels. According to an example embodiment, the MSI sensor 100 may acquire image signals of at least four channels based on a wavelength band of about 10 nm to about 1000 nm. In addition, the MSI sensor 100 may generate images of 16 channels within a wavelength range of about 10 nm to about 1000 nm, or may generate images of 31 channels by interpolating the images of 16 channels. The number of channels that may be acquired or generated by the MSI sensor 100 is not limited to 4, 16, or 31.
According to an example embodiment, the processor 500 sets an exposure time for each of the four channels based on transmission efficiency and quantum efficiency for each wavelength of each of the four channels from the MSI sensor 100. The processor 500 acquires image signals by controlling the exposure time of four channels of the MSI sensor 100 according to the set exposure time, and generates an HDR image using the obtained image signals corresponding to the four channels.
For reference, the transmission efficiency for each wavelength means a transmission ratio for each wavelength of the spectral filter. In general, a reflector of the Fabry-Perot cavity of the spectral filter has different resonance efficiency for each wavelength due to the difference in reflectivity according to wavelengths, which affects transmission efficiency. Referring to
In addition, the quantum efficiency (QE) for each wavelength is a conversion ratio indicating the efficiency of each pixel of the sensor in converting photons into electrons at a specific wavelength (nm). A higher QE indicates higher sensitivity for detecting light. In addition, with respect to a CMOS sensor, quantum efficiency is not constant for each wavelength and tends to decrease toward a longer wavelength. Referring to
The QE×TE values, which consider both the QE and TE effects, vary greatly for each spectral channel, and a channel of a specific wavelength is saturated more easily than a channel of another wavelength when imaging with the MSI sensor 100. Referring to
In an example embodiment, by applying a different exposure time for each spectral channel, QE×TE values which differ greatly from channel to channel as illustrated in
According to an example embodiment, the processor 500 may set the exposure time differently for each channel of the MSI sensor 100. The processor 500 acquires image signals for each channel according to different exposure times, and generates an HDR image using the obtained image signals. For reference, various algorithms may be used to generate an HDR image using image signals for each channel. For example, a Debevec algorithm, a Robertson algorithm, or a Mertens algorithm may be used. However, in an example embodiment, a method of generating an HDR image using image signals for each channel is not limited thereto.
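For illustration only, one simple way to realize the per-channel exposure setting described above is to make the exposure time inversely proportional to each channel's QE×TE product, so that less sensitive channels integrate longer and all channels reach a similar signal level. The channel names and values in this sketch are hypothetical.

```python
# Sketch: set each channel's exposure time inversely proportional to its
# QE*TE product. The most sensitive channel keeps the base exposure; less
# sensitive channels get proportionally longer exposures. Hypothetical values.

def set_exposure_times(qe_te, base_exposure_s):
    """qe_te maps channel name -> QE*TE product (unitless sensitivity).
    Returns a per-channel exposure-time map in seconds."""
    max_qe_te = max(qe_te.values())
    return {ch: base_exposure_s * max_qe_te / v for ch, v in qe_te.items()}

# Three hypothetical spectral channels with very different QE*TE products.
qe_te = {"ch_450nm": 0.40, "ch_550nm": 0.20, "ch_900nm": 0.05}
times = set_exposure_times(qe_te, base_exposure_s=0.004)
# ch_450nm keeps 4 ms, ch_550nm gets 8 ms, ch_900nm gets 32 ms
```

This is only one possible mapping from QE×TE to exposure time; the disclosure does not fix a particular formula.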
Referring to
The imaging sensor 200, which is a sensor employed in a general RGB camera, may be a CMOS imaging sensor using a Bayer color filter array. A first image IM1 acquired by the imaging sensor 200 may be an RGB image based on red, green, and blue.
The MSI sensor 100 is a sensor that senses light of more types of wavelengths than the imaging sensor 200. The MSI sensor 100 may use, for example, 16 channels, 31 channels, or any other number of channels. The bandwidth of each channel is set to be narrower than the R, G, and B bands, and the total bandwidth, which is the sum of the bandwidths of all channels, includes and may be wider than the RGB bandwidth, that is, the visible light bandwidth. For example, the total bandwidth may range from about 10 nm to about 1000 nm. A second image IM2 obtained by the MSI sensor 100 may be a multispectral or hyperspectral image based on a wavelength band wider than the RGB (visible light) band, for example, an ultraviolet to infrared wavelength band divided into 16 or more channels. The second image IM2 may be an image obtained by utilizing all available channels of the MSI sensor 100 or an image obtained by selecting specific channels. The spatial resolution of the second image IM2 may be lower than that of the first image IM1, but is not limited thereto.
In an example embodiment, the imaging sensor 200 may be an RGB sensor. According to an example embodiment, the RGB sensor may be a CMOS imaging sensor. The RGB sensor may generate images of three channels by sensing spectrums representing R, G, and B using a Bayer color filter array. However, the disclosure is not limited thereto, and as such, according to other example embodiments, other types of color filter arrays may also be used. The MSI sensor senses light of different wavelengths than the RGB sensor and is configured to sense light of more types of wavelengths by having a larger number of channels. According to an example embodiment, the MSI sensor may have 16 channels. In another example embodiment, the MSI sensor may have 31 channels. Each channel may adjust the band through which light passes, the transmission amount, and the bandwidth so as to sense light of a desired band. The total bandwidth, which is the sum of the bandwidths of all channels, includes and may be wider than the bandwidth of a conventional RGB sensor. Sensing spectrums or wavelength bands of the RGB sensor and the MSI sensor will be described later with reference to
Referring to
Referring to
The exposure time setting unit 510 sets the exposure time for each channel based on transmission efficiency (TE) and quantum efficiency (QE) for each wavelength of each channel.
The image acquisition controller 520 controls acquisition of image signals for a plurality of channels according to the exposure time for each channel set by the exposure time setting unit 510.
Referring to
Referring to
As shown in
According to an example embodiment, exposure of all channels may start at a rising edge of a frame trigger, but the exposure time of each channel is different. That is, the duration of the exposure of each of the channels may be different. For example, as shown in
Referring to
Referring to
In an example embodiment, only a signal of a specific pixel may be extracted from each channel according to the brightness of an object. For example, a bright object may be recorded by extracting a signal of a pixel to which the shortest exposure time is applied from each channel, and a dark object may be recorded by extracting a signal of a pixel to which a long exposure time is applied from each channel. Referring to
In an example embodiment, the maximum exposure time of a pixel of each channel may be set based on a value obtained by multiplying transmission efficiency (TE) and quantum efficiency (QE) for each wavelength of each channel. For example, the maximum exposure time may be set based on the minimum TE×QE value for each channel. For reference, an average value for each channel or another reference value may be used as the TE×QE value.
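As a purely illustrative sketch of the cap described above, the longest allowable exposure can be derived from a channel's minimum TE×QE value under a simple linear signal model (signal ≈ TE×QE × incident radiance × time, limited by full-well capacity). The model, the full-well value, and the radiance figure are all assumptions for illustration.

```python
# Sketch: cap a channel's longest exposure using its minimum TE*QE value,
# so that even the most sensitive wavelength within the channel stays below
# saturation. Linear signal model and numeric values are hypothetical.

def max_exposure_s(min_te_qe, full_well_signal, radiance_per_s):
    """Longest exposure whose expected signal stays at or below full well,
    assuming signal = te_qe * radiance_per_s * exposure_time."""
    return full_well_signal / (min_te_qe * radiance_per_s)

# A hypothetical channel whose least efficient wavelength has TE*QE = 0.05.
t_max = max_exposure_s(min_te_qe=0.05, full_well_signal=10_000.0,
                       radiance_per_s=1_000_000.0)   # 0.2 s cap
```

As the paragraph above notes, an average TE×QE value per channel, or another reference value, could be used in place of the minimum.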
Referring to
In an example embodiment, a different exposure time is set for each of the four groups. As shown in
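The grouping described above can be sketched as follows: channels sharing a wavelength band are collected into groups, and one exposure time per group is expanded to every channel in that group. The band labels, channel indices, and exposure times are hypothetical.

```python
# Sketch: group channels that share a wavelength band and assign one
# exposure time per group, so channels in a group are exposed identically.
# Band labels and times are hypothetical illustrations.

from collections import defaultdict

def group_channels(channel_bands):
    """channel_bands maps channel index -> wavelength-band label.
    Returns band label -> list of channel indices."""
    groups = defaultdict(list)
    for ch, band in channel_bands.items():
        groups[band].append(ch)
    return dict(groups)

def exposures_per_channel(groups, group_exposure_s):
    """Expand per-group exposure times to per-channel exposure times."""
    return {ch: group_exposure_s[band]
            for band, chans in groups.items() for ch in chans}

bands = {0: "blue", 1: "green", 2: "green", 3: "red"}
groups = group_channels(bands)
times = exposures_per_channel(groups,
                              {"blue": 0.004, "green": 0.002, "red": 0.008})
# channels 1 and 2 share the "green" group and thus the same exposure time
```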
Referring to
Referring to
The HDR image generator 530 generates an HDR image using image signals for each channel obtained under the control of the image acquisition controller 520. For generating an HDR image, a Debevec algorithm, a Robertson algorithm, or a Mertens algorithm may be used. However, in an example embodiment, a method of generating an HDR image using image signals for each channel is not limited thereto.
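Of the algorithms named above, the Mertens approach fuses the differently exposed values directly rather than recovering radiance. A single-pixel sketch of that idea follows; the Gaussian "well-exposedness" weight around mid-gray is the commonly used form, and the sigma and pixel values are assumptions for illustration.

```python
# Sketch of Mertens-style exposure fusion for one pixel: blend differently
# exposed, normalized values with "well-exposedness" weights that favor
# mid-gray. Parameters are illustrative assumptions.

import math

def well_exposedness(v, sigma=0.2):
    """Weight of a normalized value v in [0, 1]: highest near 0.5."""
    return math.exp(-((v - 0.5) ** 2) / (2.0 * sigma ** 2))

def fuse(values):
    """values: the same pixel observed at several exposures, in [0, 1]."""
    weights = [well_exposedness(v) for v in values]
    total = sum(weights)
    return sum(w * v for w, v in zip(weights, values)) / total

# The under- and over-exposed observations get small weights; the
# well-exposed middle observation dominates the blend.
fused = fuse([0.10, 0.45, 0.95])
```

A full implementation would apply such weights per pixel across whole images and blend multi-scale, but the weighting principle is the same.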
The image acquisition apparatus 1000 includes an MSI sensor 100 that acquires a first image IM1 based on a first wavelength band of about 10 nm to about 1000 nm, an imaging sensor 200 that acquires a second image IM2 based on a second wavelength band, and a processor 500 that generates a third image by signal-processing the first image IM1 and the second image IM2. The image acquisition apparatus 1000 may further include a first memory 300 storing data related to the first image IM1 and a second memory 310 storing data related to the second image IM2, and may further include an image output unit 700 that outputs an image.
The image acquisition apparatus 1000 may also include a first imaging optical system 190 that forms an optical image of an object OBJ in the MSI sensor 100 and a second imaging optical system 290 forming an optical image of the object OBJ in the imaging sensor 200. The first imaging optical system 190 and the second imaging optical system 290 are illustrated as including one lens, but this is an example and is not limited thereto. The first imaging optical system 190 and the second imaging optical system 290 may be configured to have the same focal length and the same angle of view.
The MSI sensor 100 includes a first pixel array PA1, which includes a first sensor layer 110 in which a plurality of first sensing elements are arrayed, and a spectral filter 120 arranged on the first sensor layer 110. The spectral filter 120 includes a plurality of filter groups, and each of the plurality of filter groups may include a plurality of unit filters having different transmission wavelength bands. The spectral filter 120 may be configured to subdivide and filter a wavelength band more finely than the color filter 220 of the imaging sensor 200, over a wavelength band wider than that of the color filter 220, for example, the ultraviolet to infrared wavelength range. A first micro-lens array 130 may be arranged on the first pixel array PA1. Examples of pixel arrangement applied to the first pixel array PA1 will be described later with reference to
The imaging sensor 200 includes a second pixel array PA2, which includes a second sensor layer 210 in which a plurality of second sensing elements are arrayed, and a color filter 220 arranged on the second sensor layer 210. The color filter 220 may include red filters, green filters, and blue filters that are alternately arranged. A second micro-lens array 230 may be arranged on the second pixel array PA2. Examples of pixel arrangement applied to the second pixel array PA2 will be described later with reference to
The first sensor layer 110 and the second sensor layer 210 may include, but are not limited to, a charge-coupled device (CCD) sensor or a complementary metal-oxide-semiconductor (CMOS) sensor.
The first pixel array PA1 and the second pixel array PA2 may be arranged horizontally on the same substrate SU, for example, spaced apart from each other in the X direction.
The substrate SU may include first circuit elements that process signals from the first sensor layer 110 and second circuit elements that process signals from the second sensor layer 210. However, it is not limited thereto, and the first circuit elements and the second circuit elements may be provided on separate substrates.
The memory 300 in which data for the first image IM1 and the second image IM2 are stored is shown separately from the substrate SU, but this is an example. The memory 300 may be arranged on the same layer as, or on a separate layer from, the circuit elements within the substrate. The memory 300 may be a line memory that stores images in line units or may be a frame buffer that stores an entire image. A static random access memory (SRAM) or a dynamic random access memory (DRAM) may be used as the memory 300.
Various circuit elements necessary for the image acquisition apparatus 1000 may be integrated and arranged on the substrate SU. For example, a logic layer including various analog circuits and digital circuits may be provided, and a memory layer storing data may also be provided. The logic layer may be configured on the same layer as, or on a separate layer from, the memory layer.
Referring to
A row decoder 202, an output circuit 203, and a timing controller (TC) 201 may be connected to the second pixel array PA2 to process signals therefrom, which is similar to the above. In addition, a processor for processing the second image IM2 output through the output circuit 203 may be implemented as a single chip together with the timing controller 201, the row decoder 202, and the output circuit 203.
The first pixel array PA1 and the second pixel array PA2 are described as having the same size and the same number of pixels, but this is merely an example for convenience, and the disclosure is not limited thereto.
In operating two different types of sensors, timing control may be required according to their different resolutions and output speeds and the size of the area required for image matching. For example, when reading one image column based on the imaging sensor 200, the corresponding image column of the MSI sensor 100 may already be stored in a buffer or may need to be newly read. Alternatively, operations of the imaging sensor 200 and the MSI sensor 100 may be synchronized using the same synchronization signal. For example, the timing controller 400 may be further provided to transmit a sync signal to the imaging sensor 200 and the MSI sensor 100.
Referring to
This may also be applied to the image acquisition apparatus 1000 including the MSI sensor 100 and the processor 500 according to an example embodiment, without the configuration and function of the imaging sensor 200 illustrated in
Referring to
For example, referring to
Referring to
The first and second unit filters F1 and F2 may have central wavelengths UV1 and UV2 of ultraviolet light, and the third to fifth unit filters F3 to F5 may have central wavelengths B1 to B3 of blue light. The sixth to eleventh unit filters F6 to F11 may have central wavelengths G1 to G6 of green light, and the twelfth to fourteenth unit filters F12 to F14 may have central wavelengths R1 to R3 of red light. Also, the fifteenth and sixteenth unit filters F15 and F16 may have central wavelengths NIR1 and NIR2 of near-infrared light.
The above-mentioned unit filters provided in the spectral filter 120 may have a resonance structure having two reflectors, and a transmission wavelength band may be determined according to characteristics of the resonance structure. The transmission wavelength band may be adjusted according to the material of the reflectors, the material of the dielectric in the cavity, and the thickness of the cavity. A structure using a grating, a structure using a distributed Bragg reflector (DBR), and the like may also be applied to the unit filter.
Referring to
Each channel may include N×N pixels, where N is a natural number greater than or equal to 2, and a different exposure time may be set for each pixel. If the brightness of an object is greater than or equal to a first threshold value, an image signal acquired from the pixel for which the shortest exposure time is set among the N×N pixels may be extracted; if the brightness of the object is less than the first threshold value, an image signal acquired from the pixel for which the longest exposure time is set among the N×N pixels may be extracted. The brightness of the object may be measured in advance using a brightness histogram acquired by the image acquisition apparatus or an illuminance sensor. The first threshold value is an arbitrary value that may be predetermined according to the structure and application of the MSI sensor.
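The per-pixel selection described above can be sketched as follows; the exposure times, pixel values, brightness scale, and threshold are hypothetical illustrations, not values from the disclosure.

```python
# Sketch of per-pixel signal selection within one channel of N x N pixels,
# each having a different exposure time: for a bright object pick the pixel
# with the shortest exposure (least likely saturated); for a dark object
# pick the pixel with the longest exposure. Hypothetical values throughout.

def select_pixel(exposures_s, values, brightness, threshold):
    """exposures_s[i] and values[i] describe the i-th pixel of the unit.
    Returns the value of the pixel chosen for the given object brightness."""
    if brightness >= threshold:
        idx = min(range(len(exposures_s)), key=lambda i: exposures_s[i])
    else:
        idx = max(range(len(exposures_s)), key=lambda i: exposures_s[i])
    return values[idx]

exposures = [0.001, 0.004, 0.008, 0.016]   # a 2x2 unit, four exposure times
values = [30, 120, 210, 255]
bright = select_pixel(exposures, values, brightness=0.9, threshold=0.5)  # 30
dark = select_pixel(exposures, values, brightness=0.2, threshold=0.5)    # 255
```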
In operation 1902, an HDR image is generated using image signals corresponding to the four channels obtained according to the exposure time set in operation 1900. For reference, various algorithms may be used to generate an HDR image using image signals for each channel. For example, a Debevec algorithm, a Robertson algorithm, or a Mertens algorithm may be used. The image signal for each channel may be a signal obtained from a pixel or pixels constituting each channel, or a signal extracted from any one pixel among a plurality of pixels constituting a channel.
In a method of controlling the image acquisition apparatus according to an example embodiment, an HDR image may be obtained using image signals acquired by applying a different exposure time to each channel or each pixel constituting each channel in a MSI sensor. A stable HDR image may be obtained even when a dynamic object as well as a static object is photographed.
The imaging sensor 1000 including the above-mentioned spectral filter may be employed in various high-performance optical apparatuses or high-performance electronic apparatuses. Such electronic apparatuses include, for example, smart phones, mobile phones, cell phones, personal digital assistants (PDAs), laptops, PCs, various portable devices, home appliances, security cameras, medical cameras, vehicles, and Internet of Things (IoT) devices, as well as other mobile or non-mobile computing devices, but are not limited thereto.
In addition to the imaging sensor 1000, the electronic apparatuses may further include a processor that controls the imaging sensor, for example, an application processor (AP), and may control a plurality of hardware or software components by driving an operating system or an application program through the processor and perform various data processing and calculations. The processor may further include a graphics processing unit (GPU) and/or an image signal processor. If the processor includes an image signal processor, the image (or video) obtained by the imaging sensor may be stored and/or output using the processor.
The processor ED20 may execute software (a program ED40, etc.) to control one or a plurality of other components (e.g., hardware, software components etc.) of the electronic apparatus ED01 connected to the processor ED20, and may perform various data processing or calculations. As part of data processing or calculation, the processor ED20 may load commands and/or data received from other components (the sensor module ED76, the communication module ED90 etc.) into volatile memory ED32, process the command and/or data stored in the volatile memory ED32, and store the resulting data in non-volatile memory ED34.
The memory ED30 may store various data required by components (e.g., the processor ED20, the sensor module ED76, etc.) of the electronic apparatus ED01. The data may include, for example, input data and/or output data for software (e.g., the program ED40) and commands related thereto. The memory ED30 may include a volatile memory ED32 and/or a non-volatile memory ED34. The non-volatile memory ED34 may include a built-in memory ED36 fixedly mounted in the electronic apparatus ED01 and a removable external memory ED38.
The program ED40 may be stored as software in the memory ED30 and may include an operating system ED42, middleware ED44, and/or an application ED46. The camera module ED80 may capture still images and moving images. The camera module ED80 may include a lens assembly including one or a plurality of lenses, an imaging sensor 1000 of
The flash CM20 may emit light used to enhance light emitted or reflected from an object. The flash CM20 may include one or a plurality of light emitting diodes (e.g., red-green-blue (RGB) LED, white LED, infrared LED, ultraviolet LED, etc.), and/or a Xenon Lamp. The imaging sensor 1000 may be the imaging sensor described in
The image stabilizer CM40 may move one or a plurality of lenses or the imaging sensor 1000 included in the lens assembly CM10 in a specific direction in response to the movement of the camera module ED80 or the electronic apparatus ED01 including the same, or may control the operation characteristics of the imaging sensor 1000 (e.g., adjustment of read-out timing, etc.) so that negative effects caused by motion may be compensated for. The image stabilizer CM40 may detect the movement of the camera module ED80 or the electronic apparatus ED01 by using a gyro sensor (not shown) or an acceleration sensor (not shown) arranged inside or outside the camera module ED80. The image stabilizer CM40 may be implemented optically.
The memory CM50 may store some or all data of an image acquired through the imaging sensor 1000 for a next image processing task. For example, when a plurality of images are acquired at high speed, the memory CM50 may be used to store acquired original data (e.g., Bayer-Patterned data, high resolution data etc.), display only low resolution images for selection and deliver the original data of the selected image to the image signal processor CM60. The memory CM50 may be integrated into the memory ED30 of the electronic apparatus ED01, or may be configured as a separate memory operating independently.
The image signal processor CM60 may perform image processing on images acquired through the imaging sensor 1000 or image data stored in the memory CM50. Image processing may include depth map generation, 3D modeling, panorama generation, feature point extraction, image synthesis, and/or image compensation (e.g., noise reduction, resolution adjustment, brightness adjustment, blurring, sharpening, softening, etc.). The image signal processor CM60 may perform control (e.g., exposure time control, read-out timing control, etc.) for components (such as the imaging sensor 1000) included in the camera module ED80. The images processed by the image signal processor CM60 may be stored back in the memory CM50 for further processing or provided to external components of the camera module ED80 (e.g., the memory ED30, the display ED60, the electronic apparatus ED02, an electronic apparatus ED04, the server ED08, etc.). The image signal processor CM60 may be integrated into the processor ED20 or configured as a separate processor that operates independently of the processor ED20. If the image signal processor CM60 is configured separately from the processor ED20, the image processed by the image signal processor CM60 may be displayed on the display ED60 after undergoing additional image processing by the processor ED20.
The electronic apparatus ED01 may include a plurality of camera modules ED80 each having different properties or functions. One of the plurality of camera modules ED80 may be a wide-angle camera and another may be a telephoto camera. Similarly, one of the plurality of camera modules ED80 may be a front camera and another may be a rear camera. In addition, the camera module ED80 may be a composite camera module in which an imaging sensor having a conventional RGB three-color filter and a spectral imaging sensor composed of a spectral filter are combined, and the data of the two combined imaging sensors is integrated and processed.
The RGB imaging sensor may be a CMOS imaging sensor. The RGB sensor may generate images of three channels by sensing the spectra representing R, G, and B using a Bayer color filter array. Other types of color filter arrays may also be used. The MSI sensor senses light of wavelengths different from those sensed by the RGB imaging sensor. Because it has a larger number of channels, the MSI sensor senses light in a greater number of wavelength bands.
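As an illustration of the channel structures above, a periodic filter mosaic can be split into per-channel planes by strided slicing. This is a sketch, not the patent's implementation; the function name and the 2560×1920 raw frame size (matching the 640×480 unit filters of 4×4 channels mentioned earlier) are assumptions for the example.

```python
import numpy as np

def mosaic_to_channels(raw, pattern):
    """Split a periodic filter mosaic into per-channel planes.
    pattern=2 yields the 4 positions of a Bayer-like 2x2 tile;
    pattern=4 yields the 16 channels of a 4x4 MSI tile."""
    channels = [raw[i::pattern, j::pattern]
                for i in range(pattern) for j in range(pattern)]
    # shape: (pattern**2, rows // pattern, cols // pattern)
    return np.stack(channels)

# 640x480 unit filters, each a 4x4 tile of spectral channels
raw = np.random.randint(0, 1024, (2560, 1920), dtype=np.uint16)
msi = mosaic_to_channels(raw, pattern=4)
print(msi.shape)  # (16, 640, 480)
```

Each of the 16 planes then corresponds to one spectral channel at the 640×480 spatial resolution of the unit-filter grid.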
The processor 500 may process image information obtained from each of the imaging sensors 100 and 200 and combine the data in a desired manner to improve image quality or the performance of identifying an object in an image.
The imaging sensor 1000 according to the embodiments may be applied to a mobile phone or smart phone 5100m shown in
In addition, the imaging sensor 1000 may be applied to a smart refrigerator 5600 shown in
In addition, the imaging sensor 1000 may be applied to a vehicle 6000 as shown in
The image acquisition apparatus according to an example embodiment may acquire an HDR image using image signals obtained by applying a different exposure time for each channel in an MSI sensor.
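One way to realize per-channel exposure HDR is to mask out saturated samples in each channel and scale the remaining samples back to a common exposure. The sketch below assumes this approach; the function name, the 10-bit full-scale value, and the NaN marker for saturated pixels are illustrative assumptions, not the claimed method.

```python
import numpy as np

def hdr_from_channel_exposures(channels, exposures, full_scale=1023):
    """Normalize each spectral channel, captured with its own exposure
    time, to a common radiance scale. Saturated pixels (at full scale)
    are marked invalid so an easily saturated channel, given a shorter
    exposure, still contributes unclipped data."""
    channels = np.asarray(channels, dtype=np.float64)
    exposures = np.asarray(exposures, dtype=np.float64)
    hdr = np.empty_like(channels)
    for c in range(channels.shape[0]):
        plane = channels[c]
        valid = plane < full_scale                  # drop clipped samples
        hdr[c] = np.where(valid, plane / exposures[c], np.nan)
    return hdr

# two channels; the more efficient (brighter) one gets a shorter exposure
chans = np.array([[[512.0, 1023.0]],
                  [[256.0,  300.0]]])
expos = [8.0, 2.0]
out = hdr_from_channel_exposures(chans, expos)
print(out[0, 0, 0], out[1, 0, 0])  # 64.0 128.0
```

Dividing by the exposure time puts all channels on a comparable radiance scale, which is what compensates for the per-channel QE×TE differences noted earlier.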
A stable HDR image may be obtained even when photographing a dynamic object as well as a static object.
The above-described image acquisition apparatus may be employed in various electronic apparatuses.
Although the imaging sensor including the above-described spectral filter and the electronic apparatus including the same have been described with reference to the embodiments shown in the drawings, this is only an example, and a person skilled in the art may understand that various modifications and other equivalent embodiments are possible. Therefore, the disclosed embodiments should be considered from an illustrative rather than a limiting point of view. The scope of rights is shown in the claims rather than the foregoing description, and all differences within an equivalent scope should be construed as being included in the scope of rights.
It should be understood that embodiments described herein should be considered in a descriptive sense only and not for purposes of limitation. Descriptions of features or aspects within each embodiment should typically be considered as available for other similar features or aspects in other embodiments. While one or more embodiments have been described with reference to the figures, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope as defined by the following claims.
Number | Date | Country | Kind |
---|---|---|---|
10-2022-0121116 | Sep 2022 | KR | national |