This application relates to the field of data processing technologies, and in particular, to a multispectral image sensor and an imaging module.
Spectral imaging is one of the main existing imaging technologies. Data based on spectral imaging includes both image information and spectral information. The spectral information can reflect the spectral intensity of each pixel at each wave band during the shooting of an image. A shooting object in the image may be qualitatively or even quantitatively analyzed by using the spectral information. Therefore, spectral imaging is applicable to scenarios with various different demands.
In existing technologies of multispectral image sensors, multispectral image sensors in a filter switching manner are generally used. When a multispectral image needs to be obtained, the multispectral image is acquired by switching, on a photosensitive chip, filters corresponding to different preset wavelengths. However, when a multispectral image sensor generated in the foregoing manner obtains a multispectral image, different spectrums are acquired at different moments rather than simultaneously, so that the real-time performance is relatively low, impairing the precision and efficiency of imaging.
The embodiments of this application provide a multispectral image sensor and an imaging module, to resolve the following problem in existing technologies: multispectral image sensors in a filter switching manner are generally used, and when a multispectral image sensor based on this principle obtains a multispectral image, different spectrums are acquired at different moments rather than simultaneously, so that the real-time performance is relatively low, resulting in relatively low precision and efficiency of imaging.
The embodiments of this application provide an image sensor, where the image sensor includes a microlens array, a filter array, and a photosensitive chip sequentially arranged in a direction of incident light, where
The embodiments of this application provide an imaging device, which includes at least one of the above image sensor, at least one lens, and a circuit board. The at least one image sensor and the at least one lens are arranged on the circuit board, and the at least one lens is arranged on the at least one image sensor, so that the incident light irradiates the at least one image sensor after passing through the at least one lens.
The implementation of the multispectral image sensor and the imaging module provided in the embodiments of this application has the following beneficial effects.
The multispectral image sensor provided in the embodiments of this application includes a filter array, the filter array includes at least one filter unit set, and each filter unit set includes a plurality of filters corresponding to preset wavelengths that are not all identical, so that the simultaneous acquisition of optical signals in a plurality of different wave bands and the generation of multispectral image data can be implemented, thereby ensuring the real-time performance of acquisition of different channels in the multispectral image data, and improving the precision and efficiency of imaging.
To describe the technical solutions in the embodiments of this application more clearly, the accompanying drawings required in the description of the embodiments or existing technologies are briefly described below. Apparently, the accompanying drawings in the following description show merely some embodiments of this application, and a person of ordinary skill in the art may still derive other drawings from these accompanying drawings without creative efforts.
To make the objectives, technical solutions, and advantages of this application clearer, this application is further described below in detail with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are only used for describing this application rather than limiting this application.
Data based on spectral imaging includes both image information and spectral information, and has a data type integrating an image and a spectrum. Data obtained through spectral imaging can reflect the spectral intensity of each pixel at each wave band during the shooting of an image. Qualitative and quantitative analysis, positioning analysis, and the like may be performed on an object by using spectral imaging technologies. The spectral imaging technologies may be categorized into three classes in an ascending order of the spectral resolution: multispectral imaging, hyperspectral imaging, and ultraspectral imaging technologies. The spectral imaging technologies have both the spectral discrimination capability and the image discrimination capability, and are applicable to the identification of geological minerals and vegetation ecology, the reconnaissance of military targets, and other scenarios.
At present, a spectral imaging device may mainly be implemented by using several solutions as follows. The first solution is a filter switching method. A multispectral image sensor based on the foregoing method includes a plurality of filters. The plurality of filters are generally located between an object under test and a lens. When image acquisition is required, the sensor switches to a specific filter based on a preset switching sequence. Only a single image with specific filtering characteristics can be outputted through a single exposure. A spectral image with multiple channels, that is, a multispectral image, is obtained by continuously switching the filters to perform a plurality of exposures. In the second solution, a push-broom method is used in the implementation of a multispectral image sensor. Only the multispectral information of the object under test in one pixel width (that is, corresponding to one column of pixels) can be outputted through a single exposure. To obtain a spatially complete two-dimensional image of the object under test, exposures need to be performed in a push-broom manner to obtain the multispectral information corresponding to a plurality of columns of pixels, to eventually synthesize a spectral image with multiple channels.
However, for the multispectral images generated in both a filter switching manner and a push-broom manner, there is a real-time performance problem. For example, for a multispectral image obtained in the filter switching manner, different spectrums have inconsistent acquisition moments, that is, there is a real-time deviation in the time domain. For a multispectral image obtained in a push-broom manner, only the multispectral information of one column of pixels can be obtained each time, and different columns have inconsistent obtaining moments, that is, there is a real-time performance deviation in the spatial domain, causing great impact on the imaging precision and efficiency of the multispectral images.
Therefore, to resolve the problem in existing technologies, this application provides a multispectral image sensor and a manufacturing method of a multispectral image sensor, to simultaneously obtain the overall multispectral information of an object under test, thereby satisfying the real-time performance of a multispectral image in the spatial domain and the time domain, and improving the imaging precision and efficiency of multispectral images.
Referring to
The photosensitive chip 103 includes a plurality of pixel units.
The filter array 102 includes at least one filter unit set. Each filter unit set includes a plurality of filters corresponding to preset wavelengths that are not all identical, that is, the set covers a plurality of different wavelengths. Each filter is configured to allow the passage of the light, in the incident light, with the preset wavelength corresponding to the filter.
The microlens array 101 includes at least one microlens unit. The microlens unit is configured to converge the incident light, and make the converged incident light pass through the filter array to focus on the photosensitive chip.
In this embodiment, the photosensitive chip 103 included in the multispectral image sensor may convert the acquired optical image information into an electrical signal, to obtain and store the image data including multiple spectrums.
In an embodiment, the photosensitive chip 103 may be a complementary metal-oxide-semiconductor (Complementary Metal-Oxide-Semiconductor, CMOS) sensor chip or may be a charge-coupled device (Charge-coupled Device, CCD) chip. Certainly, another chip that can convert an optical signal into an electrical signal may alternatively be used as the photosensitive chip 103 in this embodiment.
Further,
In this embodiment, the photosensitive chip 103 includes a plurality of pixel units. Each pixel unit may acquire corresponding multispectral data. The multispectral data corresponding to the plurality of pixel units are synthesized to obtain the multispectral image data. It needs to be noted that the quantity of pixel units included in one photosensitive chip 103 may be determined according to the acquisition resolution and image size of the pixel units, and may be correspondingly adjusted according to a use scenario. The quantity of the pixel units is not limited herein.
In an embodiment,
In an embodiment,
In this embodiment, the multispectral image sensor includes a microlens array 101. The microlens array includes at least one microlens unit, or certainly may include two or more microlens units. A specific quantity of microlens units may be correspondingly configured according to an actual scenario or a sensor requirement. The quantity of microlens units is not limited herein. The microlens array is configured to converge the incident light, and make the converged incident light pass through the filter array to focus on the photosensitive chip. The incident light may be light that is emitted by a preset light source and is reflected by an object under test or may be the light generated by the object under test.
In an embodiment, each microlens unit in the microlens array 101 corresponds to one filter unit set in a filter matrix. That is, there is a one-to-one correspondence between the microlens units and the filter unit sets. Each microlens unit is configured to converge the incident light into an area corresponding to the filter unit set, and make the incident light irradiate the photosensitive chip 103 through the filter unit set. Certainly, one microlens unit may further correspond to two or more filter unit sets. A specific correspondence manner may be determined according to an actual case.
In this embodiment, the multispectral image sensor includes a filter array 102. The filter array 102 includes at least one filter unit set. One filter unit set includes a plurality of filters whose corresponding preset wavelengths are not all identical. That is, one filter unit set may include two or more filters corresponding to the same preset wavelength, but includes filters corresponding to at least two different preset wavelengths, so that optical signals corresponding to different spectrums may be acquired. Because one filter unit set includes filters having different preset wavelengths, and each filter can only allow the passage of the light with its specific wavelength (that is, the light with a preset wavelength is obtained from the incident light through filtering), optical signals with multiple spectrums may be obtained by using one filter unit set. After the incident light passes through the filter unit set, the photosensitive chip may acquire the optical signals with multiple spectrums, and convert the optical signals into corresponding electrical signals, thereby generating the multispectral image data.
In this embodiment, the filter array 102 of the multispectral image sensor includes a plurality of filters corresponding to different preset wavelengths. Therefore, when incident light passes through the filter array 102 to irradiate the photosensitive chip 103, the photosensitive chip may filter the light with the filters in the ranges of visible light and near infrared light (for example, light with a wave band ranging from 300 nm to 1100 nm) to obtain a multispectral image. The bandwidth of the multispectral image may range from 50 nm to 700 nm, or certainly may be greater than or less than this range. A multispectral image acquired or reconstructed by using the multispectral image sensor provided in this embodiment may be used for qualitatively analyzing the components of a shooting object, for example, recognizing the components of a substance, or obtaining a more precise environmental color temperature and performing color recovery on the shooting object based on the environmental color temperature, or may be used to perform more accurate liveness detection, facial recognition, and the like. That is, the image data based on multispectral acquisition may be applied to various different use scenarios.
In an embodiment, one filter unit set may include four or more filters, for example, four filters, nine filters or sixteen filters. A specific quantity of filters is determined according to a channel quantity of the multispectral image sensor. If the filter unit set includes nine filters, the filter unit set may be a 3*3 filter matrix.
In an embodiment, for example, different filters in the same filter unit set are arranged on a two-dimensional plane in a preset arrangement manner. When a filter array includes two or more filter unit sets, because filters corresponding to different preset wavelengths in each filter unit set are arranged in the same arrangement manner, for the entire filter array, the filters corresponding to different preset wavelengths are periodically arranged in a two-dimensional plane in a preset arrangement order. For example,
In an embodiment, for example, the filter unit set is a broadband filter matrix. Similarly, the broadband filter matrix includes a plurality of filters corresponding to different preset wavelengths. Compared with an existing multispectral image sensor, a filter unit set in the multispectral image sensor provided in the embodiments of this application may be considered as one broadband filter matrix, that is, a "broadband filter" formed by a plurality of filters corresponding to different preset wavelengths. Therefore, the wave band formed by the preset wavelengths corresponding to all the filters included in the filter unit set may cover a relatively wide range, for example, from 300 nm to 1100 nm, or from 350 nm to 1000 nm. That is, the spectral range may cover the wave bands of visible light and near infrared light. A spectral transmittance curve of the broadband filter matrix may be similar to a spectral transmittance curve of a Bayer filter. The full width at half maximum (that is, the width of the transmission peak at half of the peak height) of a transmission spectrum ranges from 50 nm to 700 nm. Different spectral transmission characteristics correspond to different colors. That is, after white light enters a filter corresponding to a preset wavelength in the broadband filter matrix, only the light with the corresponding wavelength is allowed to pass through, and the light with all the remaining wave bands is blocked. For example,
In an embodiment, the broadband filter matrix includes a filter that allows the passage of light in a near infrared wave band, so that the spectral range in which the entire broadband filter matrix allows light to pass through can be expanded. In most existing color camera modules, a filter that filters out light in the near infrared wave band (that is, does not allow the passage of light in the near infrared wave band), that is, an IR-cut filter, is usually added between a lens and a photosensitive chip, to completely cut off light in the near infrared spectrum (650 nm to 1100 nm) and implement better color recovery. However, to expand the spectrum utilization range and obtain more spectral data to adapt to demands in different application scenarios, the multispectral image sensor provided in this application also utilizes the near infrared spectrum (when the spectrum utilization range is wider, the spectral information is richer). Therefore, the multispectral image sensor may choose not to use an infrared cut-off filter. That is, a filter that allows the passage of the near infrared light may be added to the broadband filter matrix, so that while color recovery can be similarly ensured, more spectral information is introduced. The filter that allows the passage of only the near infrared light (that is, the black filter) and the filters of the other preset wave bands have close response curves in the near infrared wave band. Therefore, a spectral curve corresponding to each preset wavelength may be recovered by subtracting the spectral information acquired by the black filter from the spectral information acquired by the filters corresponding to all the other preset wave bands. In this way, the filter that responds only to near infrared light takes over the role of the IR-cut filter.
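The subtraction step described above can be sketched as follows; the channel names and grayscale values are illustrative assumptions for a single filter unit set, not values from this application:

```python
# Illustrative grayscale values for one filter unit set: four color
# channels (R, G1, G2, B) plus the near infrared (IR) black-filter channel.
# Each color reading is modeled as a color signal plus an IR component.
readings = {"R": 180.0, "G1": 150.0, "G2": 152.0, "B": 120.0, "IR": 40.0}

def remove_ir_component(readings):
    """Subtract the IR channel's grayscale value from each color channel,
    recovering the IR-free color signals used for color recovery."""
    ir = readings["IR"]
    return {ch: v - ir for ch, v in readings.items() if ch != "IR"}

color_only = remove_ir_component(readings)
```

This mirrors the idea that all channels share a close near infrared response, so the black filter's reading approximates the IR component leaked into every color channel.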
Further, in another embodiment of this application, the multispectral image sensor further includes a base 104. The photosensitive chip 103, the filter array 102, and the microlens unit 101 are sequentially arranged on the base. For example,
Further, in another embodiment of this application, this application further provides an imaging module based on the foregoing multispectral image sensor. The imaging module includes the multispectral image sensor provided in any foregoing embodiment. In addition to the foregoing multispectral image sensor, the imaging module further includes a lens and a circuit board. For example,
In an embodiment, the lens 82 in the imaging module includes an imaging lens 821 and a pedestal 822. The imaging lens 821 is arranged on the pedestal 822. The multispectral image sensor 81 connected to the pedestal 822 is arranged on the circuit board 83. That is, after the actual installation, the pedestal 822 covers the entire multispectral image sensor 81 and is arranged on the circuit board 83.
In the embodiments of this application, the multispectral image sensor includes a filter array, the filter array includes at least one filter unit set, and each filter unit set includes filters corresponding to different preset wavelengths, so that the simultaneous acquisition of optical signals in a plurality of different wave bands and the generation of multispectral image data can be implemented, thereby ensuring the real-time performance of the acquisition of different channels in the multispectral image data, and improving the precision and efficiency of imaging.
The multispectral image sensor includes a microlens array 901, a filter array 902, and a photosensitive chip 903 that are sequentially arranged along a direction of incident light.
The photosensitive chip 903 includes a plurality of pixel units.
The filter array 902 includes at least one filter unit set. Each filter unit set includes a plurality of filters corresponding to preset wavelengths that are not all identical. Each filter is configured to allow the passage of the light, in the incident light, with the preset wavelength corresponding to the filter. The filters in each filter unit set are arranged in a target manner. The target manner is an arrangement manner in which the image acquisition indicators corresponding to the filter unit set are optimal.
The microlens array 901 includes at least one microlens unit. The microlens unit is configured to converge the incident light, and make the converged incident light pass through the filter array to focus on the photosensitive chip.
In this embodiment, the photosensitive chip 903 and the microlens array 901 are respectively the same as the photosensitive chip 103 and the microlens array 101 in Embodiment 1, and are both configured to convert an optical signal into an electrical signal and configured to converge light. For detailed description, reference may be made to the related description in Embodiment 1. Details are not described herein again.
In this embodiment, the filter array 902 is similar to the filter array 102 in the previous embodiment, both including at least one filter unit set, and the filter unit set includes filters corresponding to different preset wavelengths. Different from the filter array 102 in Embodiment 1, the filters in the filter unit set of the filter array 902 in this embodiment are arranged in a preset target manner, and when the filters are arranged in the foregoing manner, image acquisition indicators corresponding to the filter unit set are optimal.
In an embodiment, before the target manner is determined, image acquisition indicators corresponding to candidate manners may be respectively determined. An optimal image acquisition indicator is determined based on the image acquisition indicators of all candidate manners. The candidate manner corresponding to the optimal image acquisition indicator is used as the target manner. In an embodiment, the image acquisition indicator includes a plurality of indicator dimensions. Different weight values may be configured for the different indicator dimensions according to different use scenarios. A weighting operation is performed according to the indicator values corresponding to a candidate manner in the indicator dimensions and the configured weight values, so that an image acquisition indicator corresponding to the candidate manner can be calculated. A larger value of the image acquisition indicator indicates a higher adaptability to the use scenario, a better imaging effect, and higher recognition accuracy. Based on this, the candidate manner corresponding to the image acquisition indicator with the largest value may be chosen as the target manner.
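The weighting operation over indicator dimensions can be sketched as follows; the candidate manners, indicator values, and weights are all hypothetical, and giving negative weights to smaller-is-better dimensions (so that the largest score always wins) is an assumption about one reasonable encoding:

```python
# Hypothetical indicator values for three candidate arrangement manners.
candidates = {
    "manner_a": {"sampling_rate": 0.78, "distortion": 1.4, "ir_distance": 5.2},
    "manner_b": {"sampling_rate": 0.89, "distortion": 1.1, "ir_distance": 4.8},
    "manner_c": {"sampling_rate": 0.83, "distortion": 1.6, "ir_distance": 5.0},
}

# Scenario-dependent weights for each indicator dimension (assumed values);
# dimensions where smaller is better carry negative weights.
weights = {"sampling_rate": 1.0, "distortion": -0.5, "ir_distance": -0.2}

def indicator_score(dims, weights):
    """Weighted sum of the indicator dimensions for one candidate manner."""
    return sum(weights[k] * v for k, v in dims.items())

# The candidate manner with the largest score is used as the target manner.
target = max(candidates, key=lambda m: indicator_score(candidates[m], weights))
```

Here `manner_b` wins because it has the highest sampling rate and the lowest distortion and reference-channel distance among the invented candidates.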
In an embodiment, the filter unit set includes an m*n filter matrix. That is, in one filter unit set, the filters are arranged in a manner of m rows and n columns, to form one m*n filter matrix. For example, the filters in the filter matrix may be square filters or may be rectangular filters. m and n are both positive integers greater than 1. For example, m may be 2, 3, 4, or the like. Correspondingly, n may also be 2, 3, 4, or the like. The values of m and n may be the same or may be different. Specific values of m and n are not limited herein.
For example, according to the colors of filters included in a filter unit set, the filter unit set (that is, the filter matrix) may be categorized into several types as follows, which are respectively: a GRBG filter, an RGGB filter, a BGGR filter, and a GBRG filter, where G represents a filter that allows the passage of green light, R represents a filter that allows the passage of red light, and B represents a filter that allows the passage of blue light.
A 3*3 filter matrix is used as an example for description. For example,
A 3*3 filter matrix with a total of 9 different preset wavelengths (that is, allowing the passage of 9 different specific colors) continues to be used as an example for description. The positions of the filters in the filter matrix are mainly determined based on several aspects as follows. 1) From the perspective of the entire filter array, there is no requirement on the absolute position of a single color in the 3*3 matrix. Therefore, only the relative positions between different colors in one filter matrix (that is, the filter unit set) need to be considered. 2) It needs to be considered whether there is a specific requirement for the relative positions of colors in subsequent scenario applications. 3) The recovery effect of a color image (for example, an RGB image) is strongly correlated with the relative positions of colors. Therefore, if a scenario application does not have a specific requirement for the relative positions of colors, the demand of a color image recovery algorithm (hereinafter referred to as an RGB recovery algorithm) is mainly considered in the design of the spatial arrangement of the filters corresponding to different preset wavelengths in a filter array.
In this embodiment,
A part of the resolution in the filter matrix is sacrificed in the foregoing manner, and 5/9 of spatial information is discarded in a sampling process. For a multispectral image sensor with an original resolution output of 3a*3b, an image resolution of an RGB output of the image sensor is 2a*2b. However, in the foregoing manner, the RGB recovery of the multispectral image sensor can be completed by using a universal color signal processing model, so that the universality and efficiency of the color image recovery can be improved. Therefore, after it is determined that the RGB recovery algorithm is applicable, the image acquisition indicators may be determined according to the recovery effects of the RGB recovery algorithm in different arrangement manners, and the target manner of the filters in the filter matrix may be determined based on the image acquisition indicators.
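The resolution arithmetic above (keeping four of the nine channels per 3*3 tile turns a 3a*3b mosaic into a 2a*2b Bayer-style output) can be sketched as follows; the specific kept positions and the use of NumPy are assumptions for illustration, not details from this application:

```python
import numpy as np

def bayer_from_mosaic(raw, keep_positions):
    """Keep four of the nine channels in each 3x3 tile, producing a
    2x2 quad per tile: a 3a x 3b mosaic becomes a 2a x 2b output."""
    h, w = raw.shape
    a, b = h // 3, w // 3
    out = np.empty((2 * a, 2 * b), dtype=raw.dtype)
    for q, (di, dj) in enumerate(keep_positions):
        # Quad slot (q // 2, q % 2) is filled from mosaic offset (di, dj).
        out[q // 2::2, q % 2::2] = raw[di::3, dj::3]
    return out

# Hypothetical example: a 6x6 sensor output (a = b = 2), keeping the four
# corner positions of each 3x3 tile as the four color samples.
raw = np.arange(36, dtype=np.float64).reshape(6, 6)
rgb_raw = bayer_from_mosaic(raw, [(0, 0), (0, 2), (2, 0), (2, 2)])
```

The 4x4 output discards the other 5/9 of the spatial samples, matching the resolution trade-off described above.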
Further, in another embodiment of this application, the image acquisition indicators include an information sampling rate, a distortion distance, a distance parameter with respect to a reference channel, and a spectral similarity calculated based on a transmittance curve. That the image acquisition indicators are optimal refers to that, when the filters are arranged in the target manner, the information sampling rate is greater than a sampling rate threshold, the distortion distance is less than a distortion threshold, the distance parameter is less than a preset distance threshold, and the spectral similarity between adjacent filters is less than a preset similarity threshold. The sampling rate threshold is determined based on the information sampling rates of all the candidate manners. The distortion threshold is determined based on the distortion distances of all the candidate manners.
In this embodiment, the image acquisition indicators include four types of feature parameters: an information sampling rate, a distortion distance, a distance parameter with respect to a reference channel, and a spectral similarity between different filters. The calculation manners of these feature parameters are respectively described below. Detailed description is as follows.
1) Information Sampling Rate
The 3*3 filter matrix continues to be used as an example for description. As described above, during the execution of the RGB recovery algorithm, only four filters provide color information for the RGB recovery algorithm. That is, information of five channels (that is, data acquired by filters at five positions) in the 3*3 array is discarded, and information of only four channels is kept. When the four filters are at different positions in the 3*3 array, the sampling effects on the spatial information of the entire filter matrix are different. Therefore, a sampling effect on the spatial information when the filters of the four colors are at different positions may be indicated by using an information sampling rate. For example,
Based on this, when the filter matrix is arranged in the foregoing manner in
Further, the sampling rate threshold may be determined according to the information sampling rates corresponding to all candidate manners. For example, the information sampling rate with the second largest value in all the candidate manners may be used as the sampling rate threshold, so that the candidate manner with the largest information sampling rate is selected.
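The application does not state a closed-form definition of the information sampling rate (the defining figure is referenced above), so the following is only one plausible formalization, entirely an assumption: the fraction of rows and columns of the filter matrix that contain at least one of the four kept filter positions.

```python
def information_sampling_rate(kept_positions, size=3):
    """Assumed formalization: fraction of rows and columns of the
    size x size filter matrix covered by the kept filter positions."""
    rows = {i for i, _ in kept_positions}
    cols = {j for _, j in kept_positions}
    return (len(rows) + len(cols)) / (2 * size)

# Four kept channels clustered in one corner cover fewer rows and columns
# than four kept channels spread across the matrix.
clustered = [(0, 0), (0, 1), (1, 0), (1, 1)]
spread = [(0, 0), (1, 1), (2, 2), (0, 2)]
assert information_sampling_rate(clustered) < information_sampling_rate(spread)
```

Under this sketch, spreading the kept filters maximizes the rate, consistent with the idea that filter placement changes how well the matrix samples spatial information.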
2) Distortion Distance
For example,
Further, the distortion threshold may be determined according to the distortion distances corresponding to all candidate manners. For example, the distortion distance with the second smallest value in the distortion distances of all the candidate manners may be used as the distortion threshold, so that the candidate manner with the smallest distortion distance is selected.
3) Distance Parameter with Respect to a Reference Channel
As described above, during the execution of the RGB recovery algorithm, the grayscale value of the near infrared light IR channel first needs to be respectively subtracted from the grayscale values of four channels. Therefore, the IR channel may be used as a reference channel. Certainly, in other application scenarios, if a channel corresponding to a filter with another wave band is used as the reference channel, the IR channel may be replaced with a channel of the corresponding wave band. During the execution of the RGB recovery algorithm, IR components in the grayscale values of the four channels are the same as the grayscale value of the IR channel. Therefore, in the determination of the arrangement manner of the filter matrix, the positions of the filters corresponding to the four channels (that is, the RGGB channels) in the filter matrix need to be as close to the position of the filter corresponding to the IR channel as possible, and the four channels should have distances as close to the IR channel as possible. For this, the distances between the filters corresponding to the four channels and the IR filter and the fluctuation values of the distances are defined. For example,
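The sum-of-distances and fluctuation parameters described above can be sketched as follows; placing the IR filter at the center of the 3*3 matrix and using max-minus-min as the fluctuation value are assumptions, not definitions from this application:

```python
import math

def ir_distance_parameters(color_positions, ir_position):
    """Sum of Euclidean distances from the color-filter positions to the
    IR (reference) filter position, and the fluctuation of those distances
    taken here as max minus min (an assumed definition)."""
    dists = [math.dist(p, ir_position) for p in color_positions]
    return sum(dists), max(dists) - min(dists)

# Hypothetical 3x3 layout: IR at the center, the four RGGB filters at the
# edge-adjacent positions, so every color filter is at distance 1 from IR
# and the fluctuation is zero.
total, fluctuation = ir_distance_parameters(
    [(0, 1), (1, 0), (1, 2), (2, 1)], ir_position=(1, 1))
```

A small sum keeps the color filters close to the reference, and a small fluctuation keeps their IR components equally well approximated by the IR channel's reading.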
4) Spectral Similarity Calculated Based on a Transmittance Curve
After the information sampling rate, the distortion distance, and the distance parameter (that is, the sum of distances and the IR distance fluctuation) corresponding to the candidate manners are calculated, all the candidate manners may be quantitatively evaluated. For example,
After the positions corresponding to the five channels are determined, the filters that need to be placed at the remaining four positions in the matrix may be determined. Because different colors should be distributed as uniformly as possible in space, filters with close colors should not be located in excessively close proximity in the 3*3 filter matrix, that is, close colors should be kept from being adjacent as much as possible. The foregoing figure is used as an example for description. The transmittance curves corresponding to the remaining four filters whose positions are to be determined are determined, and any filter whose position is to be determined is placed in a vacant position. A spectral similarity between the transmittance curve of this filter and the transmittance curve of an adjacent filter whose position has been determined is calculated. The similarity between the two transmittance curves may be determined based on a similarity measurement indicator for a spectral curve used in the field of spectral measurement, for example, a Euclidean distance, a spectral angle, or a correlation coefficient between the two transmittance curves. This is not limited herein. The position with the smallest similarity is determined from the plurality of vacant positions as the position of this filter. The position of each filter in the filter matrix is obtained in the foregoing manner, so that the transmittance curve corresponding to each filter has a preset weighting correlation with the transmittance curves of the filters in its neighboring region. The foregoing steps are sequentially performed for all the filters whose positions are to be determined, so that the corresponding target manner for the arrangement of the filters in the filter matrix may be determined from all the candidate manners.
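The spectral-angle similarity measure mentioned above can be sketched as follows; the 5-point transmittance curves are invented illustrative values, and using the cosine of the spectral angle as the similarity score (larger meaning more similar) is an assumption:

```python
import math

def spectral_angle(curve_a, curve_b):
    """Spectral angle between two transmittance curves sampled at the same
    wavelengths; a smaller angle means more similar spectra."""
    dot = sum(a * b for a, b in zip(curve_a, curve_b))
    norm_a = math.sqrt(sum(a * a for a in curve_a))
    norm_b = math.sqrt(sum(b * b for b in curve_b))
    return math.acos(dot / (norm_a * norm_b))

def cosine_similarity(curve_a, curve_b):
    return math.cos(spectral_angle(curve_a, curve_b))

# Illustrative curves (assumed values): two "green-ish" filters with close
# transmittance curves, and one "red" filter with a distinct curve.
green_1 = [0.1, 0.8, 0.9, 0.3, 0.1]
green_2 = [0.1, 0.7, 0.9, 0.4, 0.1]
red = [0.0, 0.1, 0.2, 0.8, 0.9]

# Close colors yield a higher similarity, so when filling a vacant position
# adjacent to green_1, the red filter would be preferred over green_2.
assert cosine_similarity(green_1, green_2) > cosine_similarity(green_1, red)
```

Choosing, for each unplaced filter, the vacant position minimizing this similarity to its already-placed neighbors implements the "close colors should not be adjacent" rule.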
In an embodiment, based on the calculation manners of the four feature parameters, a terminal device may perform traversal to calculate the parameter values corresponding to the four feature parameters of all candidate manners of the filter matrix, and calculate the image acquisition indicators corresponding to the candidate manners based on the parameter values corresponding to the feature parameters, so that an optimal image acquisition indicator can be chosen, and a candidate manner corresponding to the optimal image acquisition indicator is used as the target manner.
In an embodiment, the multispectral image sensor provided in this embodiment may be integrated in an imaging module. In this case, the imaging module includes the multispectral image sensor, a lens, and a circuit board. At least one multispectral image sensor and at least one lens are arranged on the circuit board. The lens is arranged on the multispectral image sensor, to make incident light pass through the lens to irradiate the multispectral image sensor.
In the embodiments of this application, the image acquisition indicators are determined in a plurality of feature dimensions. The feature dimensions include an information acquisition degree, a distortion degree, a correlation between filters, and a fluctuation range from a center point, to quantitatively assess the acquisition effect of a filter matrix in a plurality of aspects, so that an optimal target arrangement manner can be accurately and effectively determined, thereby improving the acquisition precision of a multispectral image sensor and the adaptability to application scenarios.
The foregoing embodiments are merely for describing the technical solutions of this application, but not for limiting this application. This application is described in detail with reference to the foregoing embodiments, and a person of ordinary skill in the art may make modifications to the technical solutions described in the foregoing embodiments or make equivalent replacements to some technical features thereof, without departing from the spirit and scope of the technical solutions of the embodiments of this application, which are included in the protection scope of this application.
Number | Date | Country | Kind |
---|---|---|---|
202110621865.X | Jun 2021 | CN | national |
This application is a Continuation application of International Patent Application No. PCT/CN2021/107954 filed with the China National Intellectual Property Administration (CNIPA) on Jul. 22, 2021, which is based on and claims priority to and benefits of Chinese Patent Application No. 202110621865.X, filed on Jun. 3, 2021. The entire content of all of the above-referenced applications is incorporated herein by reference.
Number | Date | Country | |
---|---|---|---|
Parent | PCT/CN2021/107954 | Jul 2021 | US |
Child | 18237241 | US |