The present application is based on PCT filing PCT/EP2018/086783, filed Dec. 21, 2018, and claims priority to EP 17209808.9, filed Dec. 21, 2017, the entire contents of each of which are incorporated herein by reference.
The present disclosure generally pertains to the field of spectral imaging systems.
Spectral imaging is a field of spectroscopy and of photography in which a complete spectrum, or some spectral information, is collected at respective locations in an image plane. Multispectral imaging measures light in a number of spectral bands. Hyperspectral imaging is a special case of spectral imaging in which often hundreds of contiguous spectral bands are available. Multi-/hyperspectral image sensing is, for example, of high interest for material classification. It is important to obtain spatially, temporally and spectrally accurate measurements. The amount of data captured in such high-precision systems is very high and requires data compression (e.g. a 16-bit full-HD image sampled from 400 nm to 800 nm in 1 nm steps would generate ~1.5 Gbyte of data per single capture).
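The data-volume figure quoted above can be verified with a short back-of-the-envelope calculation (resolution and sampling values are taken from the example in the text):

```python
# Data volume of one hyperspectral capture: a 16-bit full-HD frame
# (1920 x 1080 pixels), sampled from 400 nm to 800 nm in 1 nm steps.
width, height = 1920, 1080
bands = (800 - 400) // 1 + 1      # 401 spectral bands
bytes_per_sample = 2              # 16 bit per sample

total_bytes = width * height * bands * bytes_per_sample
print(total_bytes)                # 1663027200 bytes
print(total_bytes / 2**30)        # ~1.55 GiB, i.e. on the order of 1.5 Gbyte
```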
State-of-the-art spectral imaging systems apply various approaches to acquire spectral image information. The best-known technique applies mosaicking on an image sensor (similar to normal imaging sensors, where a red-green-blue, so-called Bayer pattern is applied), in which each image pixel is covered with a different type of color filter with a known spectral sensitivity. This approach reduces the spatial image accuracy depending on the number of color filters and provides only a low spectral resolution capability. To recover the spectral image information, complex signal reconstruction methods must be applied. Other systems use globally changing color filters, but those systems suffer from a temporal sampling limitation of the sensor and the camera system's large form factor, and face data bandwidth problems due to the high amount of acquired data.
According to a first aspect the disclosure provides an apparatus comprising a spectral filter with a variable spectral filter transmission; an event-based imaging sensor configured to produce measurement events which correspond to a change in a filter response that is generated by an observed spectrum; and a processor configured to control the filter transmission of the spectral filter so that it sweeps over wavelength with time, and to generate an estimation of the observed spectrum based on the measurement events that correspond to the filter response and based on the filter transmission.
According to another aspect the disclosure provides a method, comprising producing measurement events which correspond to a change in a filter response of a spectral filter that is generated by an observed spectrum; controlling the filter transmission of the spectral filter so that it sweeps over wavelength with time; and generating an estimation of the observed spectrum based on the measurement events that correspond to the filter response and based on the filter transmission. Further aspects are set forth in the dependent claims, the following description and the drawings.
Embodiments are explained by way of example with respect to the accompanying drawings, in which:
The embodiments described below provide an apparatus comprising a spectral filter with a variable spectral filter transmission; an event-based imaging sensor configured to produce measurement events which correspond to a change in a filter response that is generated by an observed spectrum, and a processor configured to control the filter transmission of the spectral filter so that it sweeps over wavelength with time, and to generate an estimation of the observed spectrum based on the measurement events that correspond to the filter response and based on the filter transmission.
For example, the processor may be configured to reconstruct the observed spectrum by generating the estimation of the observed spectrum. In this way, the processor may be configured to output a reconstructed spectral scene volume which contains image and spectral information of a spectral scene.
The processor may be configured to control the spectral filter transmission of the spectral filter so that it sweeps linearly or non-linearly over wavelength with time.
For example, the spectral filter transmission of the spectral filter may be controlled in such a way that a sensitivity maximum of the spectral filter transmission sweeps linearly or non-linearly over wavelength with time.
The processor may be configured to control the spectral filter transmission characteristic arbitrarily over time.
The event-based imaging sensor may be configured to record any intensity changes that result from changes in spectral filter transmission.
The event-based imaging sensor may be configured to register pixel intensity changes in an asynchronous way.
The event-based imaging sensor may be configured to create, from light incident on the imaging sensor, events that indicate an intensity increase or decrease.
The events may indicate an intensity increase or decrease by a predefined threshold. The threshold may be fixed or adaptively changed by the processor.
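The thresholding behaviour of such a pixel can be sketched as follows. This is a minimal single-pixel emulation for illustration only: the function name, the sampled-intensity input and the fixed per-event update of the reference level are assumptions, not the sensor's actual circuit.

```python
def generate_events(intensity, threshold):
    """Emit (time, +1) or (time, -1) events whenever the pixel
    intensity moves by at least `threshold` away from the level at
    which the last event fired. `intensity` is a list of samples
    for a single pixel."""
    events = []
    ref = intensity[0]                      # reference level of the pixel
    for t, value in enumerate(intensity[1:], start=1):
        while value - ref >= threshold:     # intensity increase events
            events.append((t, +1))
            ref += threshold
        while ref - value >= threshold:     # intensity decrease events
            events.append((t, -1))
            ref -= threshold
    return events

print(generate_events([0.0, 0.25, 0.5, 0.4], 0.2))  # [(1, 1), (2, 1)]
```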
The processor may be configured to create the filter response from the detected events.
The processor may be configured to reconstruct the observed spectrum from the measured filter response by a numerical reconstruction method.
The numerical reconstruction method may be a Tikhonov regularization.
The numerical reconstruction method may be based on a matrix containing the known filter responses of the spectral filter. For example, an equation system can be set up from a combination of multiple measurements, and the inverse problem can be solved with known methods, e.g. Tikhonov regularization.
In the numerical reconstruction method, a single event or a sum of events related to a measurement within a time interval may be formulated as a linear (or higher-order) combination.
In an alternative embodiment, the numerical reconstruction method may be based on compressive sensing (CS) principles. For example, compressive sensing principles may be used in the control of the spectral filter; in that case compressive sensing methods can also be utilized for the reconstruction of the original signal.
The processor may for example be a digital signal processor (DSP), a computer, a desktop computer, a workstation, or the like. The processor may also be implemented in a laptop, a tablet computer, a smartphone or the like. Circuitry of the electronic device may include one or more processors, one or more microprocessors, dedicated circuits, logic circuits, a memory (RAM, ROM, or the like), a storage, output means (a display, e.g. liquid crystal or (organic) light emitting diode, etc.), a loudspeaker, an interface (e.g. a touch screen, a wireless interface such as Bluetooth, infrared, an audio interface, etc.), etc. The processor may be collocated with the event-based imaging sensor and/or the spectral filter, or it may be remote from the event-based imaging sensor and/or the spectral filter. For example, the event-based imaging sensor, the spectral filter and the processor may be implemented in a single camera device. Alternatively, a camera device may comprise only the event-based imaging sensor and the spectral filter, and the processor may be arranged remote from the event-based imaging sensor and may be arranged to receive the data produced by the event-based imaging sensor via a data communication path such as Ethernet, WLAN or the like.
The apparatus may further comprise a lens system and a band-limiting filter.
The embodiments also disclose a method, comprising producing measurement events which correspond to a change in a filter response of a spectral filter that is generated by an observed spectrum; controlling the filter transmission of the spectral filter so that it sweeps over wavelength with time; and generating an estimation of the observed spectrum based on the measurement events that correspond to the filter response and based on the filter transmission.
Spectral Imaging
Each filter measurement yi is obtained by integrating the observed spectrum x(λ) against a filter transmission curve Ti(λ) (a Fredholm integral of the first kind):

yi=∫x(λ)·Ti(λ)dλ
For the purpose of processing, the spectrum is discretized into N wavelength bins Δλ1, Δλ2, . . . , ΔλN, and it is assumed, as a simplification, that there is one discrete intensity value xj associated with each wavelength bin Δλj (the discrete intensity values X(λ)={x1, x2, . . . , xN} corresponding to the mean or median intensity in the respective wavelength bin). Under this assumption, the Fredholm integral can be written as a matrix multiplication y = T·x,
with the filter matrix T = (Tij) computed from the filters Ti(λ), which are known.
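The discretized forward model can be sketched numerically as follows. The Gaussian filter shapes, bin counts and the example spectrum below are purely illustrative assumptions, not the filters of the disclosure:

```python
import numpy as np

# Discretized filter model y = T x: row i of T samples the known
# transmission curve T_i(lambda) on the N wavelength bins.
wavelengths = np.linspace(400, 800, 81)           # N = 81 bins (5 nm spacing)
centers = np.linspace(420, 780, 40)               # M = 40 filter positions
T = np.exp(-((wavelengths[None, :] - centers[:, None]) / 15.0) ** 2)

x = np.exp(-((wavelengths - 550.0) / 40.0) ** 2)  # example spectrum
y = T @ x                                         # simulated filter responses
print(y.shape)                                    # (40,)
```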
The above mathematical relation between the observed spectrum X(λ) and the filter response Y(λ) allows the original spectrum x(λ) to be reconstructed from the measured filter responses Y(λ) by numerical reconstruction methods, for example by using ordinary least squares linear regression to minimize ∥Ax−b∥² (where A corresponds to the matrix T of filter responses Tij, and b corresponds to the vector of filter responses Y(λ)), or by using e.g. Tikhonov regularization. In Tikhonov regularization a regularization term ∥Γx∥² is introduced into the minimization problem:
∥Ax−b∥² + ∥Γx∥²
where ∥·∥ is the Euclidean norm, and Γ is a suitably chosen Tikhonov matrix (e.g. Γ=αB with a suitably chosen scalar α and a suitably chosen matrix B).
An explicit solution of the minimization problem, denoted x̂, is given by:

x̂ = (AᵀA + ΓᵀΓ)⁻¹Aᵀb = (AᵀA + α²BᵀB)⁻¹Aᵀb.
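The closed-form Tikhonov solution above translates directly into code. The helper name and the test matrix below are illustrative; B defaults to the identity as suggested in the text:

```python
import numpy as np

def tikhonov_solve(A, b, alpha, B=None):
    """Closed-form Tikhonov solution
    x_hat = (A^T A + alpha^2 B^T B)^-1 A^T b  (B defaults to identity)."""
    n = A.shape[1]
    if B is None:
        B = np.eye(n)
    # Solve the normal equations rather than inverting explicitly.
    return np.linalg.solve(A.T @ A + alpha**2 * (B.T @ B), A.T @ b)
```

For a well-conditioned A and small α the result is close to the ordinary least-squares solution; larger α trades data fidelity for stability of the inverse problem.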
The regularization matrix B may, for example, be chosen as the identity matrix I, or as follows:
Compressive Event-Based Spectrum Imaging System
The compressive event-based spectral imaging system applies a high-speed sweep over the spectral transmission FT(λ) of the filter 26, so that the transmission becomes time-dependent, FT(t), i.e. its transmission amplitude changes over time t. During the sweep the event-based imaging sensor 27 records any intensity changes that result from the changes in the filter transmission FT(t).
Event-Driven Spectral Imaging
Linear interpolation may be applied on the set of measurements to obtain a smooth filter response Y(t). The filter response Y(λ) as function of the wavelength λ can be obtained by the known correspondence between time t and wavelength λ as defined by the filter sweep (see e.g.
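This interpolation step can be sketched as follows. Each event contributes one signed threshold step, so the cumulative sum of event polarities gives Y at the event time stamps, and `np.interp` fills in between them. The event times, polarities and threshold value below are illustrative, not measured data:

```python
import numpy as np

event_times = np.array([0.0, 1.2, 2.5, 3.1, 4.0])   # time stamps t_i
polarities  = np.array([+1, +1, -1, +1, +1])         # signed events
threshold   = 0.1                                    # intensity step per event

# Filter response at the event time stamps ...
Y_at_events = np.cumsum(polarities) * threshold
# ... linearly interpolated to a dense, smooth Y(t).
t_dense = np.linspace(0.0, 4.0, 101)
Y_dense = np.interp(t_dense, event_times, Y_at_events)
```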
Reconstructing the Observed Spectrum X(λ)
As it is described with regard to
That is, the observed filter response Y(t) can be expressed as a sum over all the events ei within the time interval [0, t], where 0 indicates the start of the measurement.
Here, the output of the event sensor is a series of events ei, each event being attributed a respective time stamp ti.
Here, ti is the time stamp of an event, and ti-1 is the time stamp of the preceding event. X(λ) is the observed spectrum.
In the case of an effective transmission as Dirac impulse (
yi-1,i=Y(ti)−Y(ti-1)
yi-1,i=[FS(ti)−FS(ti-1)]·x·(ti−ti-1).
and the originally observed spectrum X(λ) can be estimated by dividing Y(λ) by the spectral filter curve FS(λ).
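For this Dirac-like (very narrow) filter case the estimate is a per-bin division, which can be sketched as below. The filter curve and spectrum are illustrative; the small epsilon guard for near-zero transmission is an added safeguard, not part of the disclosure:

```python
import numpy as np

# Narrow filter: at sweep time t the sensor sees essentially one
# wavelength, so Y(lambda) ~ FS(lambda) * X(lambda) and the spectrum
# is recovered by dividing out the known filter curve.
wavelengths = np.linspace(400, 800, 81)
FS = 0.2 + 0.8 * np.exp(-((wavelengths - 600) / 120) ** 2)  # known curve
X_true = np.exp(-((wavelengths - 550) / 40) ** 2)           # example spectrum
Y = FS * X_true                                             # measured response

eps = 1e-6                               # guard against division by ~0
X_est = Y / np.maximum(FS, eps)
```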
the measure yΔ (a single event or a sum of events related to a measurement within a sampling time interval [t0, t1]) can be formulated for instance as a linear (or higher-order) combination, e.g.
It is possible to solve the inverse problem with known methods, e.g. Tikhonov regularization. The solution of this inverse problem represents an estimation X* of the observed spectrum X.
Here, FS(t1) and FS(t0) are the filter transmissions FT at times t1 and t0, respectively, which in the case of an effective transmission as Dirac impulse results in one sample at time t1 and one sample at time t0. The measurement yΔ depends on the change of the filter response between t0 and t1. That is, yΔ is the sum of measurements between time samples t0 and t1. The sampling interval [t0, t1] correlates with the dynamic range, sensitivity and threshold for detecting an event on the sensor. It may be a predefined value depending on the sensor configuration. The number of time intervals depends on the time over which the filter transmission FS is swept.
For the more complicated case of a broader spectral filter behaviour (see
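For this broader-filter case, each interval measurement mixes many wavelengths, so the measurements form a linear system that is solved with Tikhonov regularization as described above. A numerical sketch follows; the Gaussian filter shapes, the linear sweep, the sizes and the regularization weight are all illustrative assumptions:

```python
import numpy as np

# Rows of the system matrix are filter-transmission differences
# between consecutive sampling times: y_k = sum_j [F(t_k, lam_j)
# - F(t_{k-1}, lam_j)] * x_j, i.e. A x = y.
wavelengths = np.linspace(400, 800, 81)
sample_times = np.linspace(0.0, 1.0, 60)            # interval boundaries t_k
centers = 400 + 400 * sample_times                  # linear sweep 400 -> 800 nm
F = np.exp(-((wavelengths[None, :] - centers[:, None]) / 30.0) ** 2)

A = np.diff(F, axis=0)                              # rows F(t_k) - F(t_{k-1})
x_true = np.exp(-((wavelengths - 550) / 40) ** 2)   # example spectrum
y = A @ x_true                                      # simulated measurements

alpha = 1e-3                                        # regularization weight
x_est = np.linalg.solve(A.T @ A + alpha**2 * np.eye(len(wavelengths)),
                        A.T @ y)                    # Tikhonov estimate X*
```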
Compressive Sensing (CS)
Another option is to utilize compressive sensing (CS) principles in the control of the spectral filter SF; in that case CS methods can also be utilized for the reconstruction of the original signal X(λ).
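A compressive-sensing style reconstruction can be sketched with iterative soft thresholding (ISTA), which minimizes ∥Ax−y∥² + λ∥x∥₁ and exploits sparsity of the spectrum to get by with fewer measurements than wavelength bins. The random measurement matrix, the sparse test spectrum and all parameter values below are illustrative assumptions, not the method of the disclosure:

```python
import numpy as np

rng = np.random.default_rng(0)
n_bins, n_meas = 80, 25                             # fewer measurements than bins
A = rng.standard_normal((n_meas, n_bins)) / np.sqrt(n_meas)

x_true = np.zeros(n_bins)
x_true[[10, 37, 60]] = [1.0, -0.7, 0.5]             # sparse example spectrum
y = A @ x_true                                      # compressed measurements

x = np.zeros(n_bins)
step = 1.0 / np.linalg.norm(A, 2) ** 2              # 1 / Lipschitz constant
lam = 0.01                                          # sparsity weight
for _ in range(500):                                # ISTA iterations
    z = x - step * (A.T @ (A @ x - y))              # gradient step on the residual
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
```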
Implementation
In the following, an embodiment of an electronic device 930 is described under reference of
Embodiments which use software, firmware, programs, plugins or the like for performing the processes as described herein can be installed on computer 930, which is then configured to be suitable for the embodiment.
The computer 930 has a CPU 931 (Central Processing Unit), which can execute various types of procedures and methods as described herein, for example, in accordance with programs stored in a read-only memory (ROM) 932, stored in a storage 937 and loaded into a random access memory (RAM) 933, stored on a medium 940, which can be inserted in a respective drive 939, etc.
The CPU 931, the ROM 932 and the RAM 933 are connected with a bus 941, which in turn is connected to an input/output interface 934. The number of CPUs, memories and storages is only exemplary, and the skilled person will appreciate that the computer 930 can be adapted and configured accordingly for meeting specific requirements which arise when it functions in an image processing apparatus or system.
At the input/output interface 934, several components are connected: an input 935, an output 936, the storage 937, a communication interface 938 and the drive 939, into which a medium 940 (compact disc, digital video disc, compact flash memory, or the like) can be inserted.
The input 935 can be a pointer device (mouse, graphic table, or the like), a keyboard, a microphone, a camera, a touchscreen, etc. and also the event-based image sensor 27 already described above.
The output 936 can have a display (liquid crystal display, cathode ray tube display, light emitting diode display, etc.), loudspeakers, etc. Most commonly the output will be used to output the calculated reconstructed spectral scene X* described in the embodiments above (see e.g.
The storage 937 can have a hard disk, a solid state drive and the like. It can be used to store local measurement data like the measurement events y1, y2 . . . , yn obtained from the event-based imaging sensor, or to store calculation data like the observed filter response Y(t) or the reconstructed spectral scene X*.
The communication interface 938 can be adapted to communicate, for example, via a local area network (LAN), wireless local area network (WLAN), mobile telecommunications system (GSM, UMTS, LTE, etc.), Bluetooth, infrared, etc.
It should be noted that the description above only pertains to an example configuration of computer 930. Alternative configurations may be implemented with additional or other sensors, storage devices, interfaces or the like. For example, the communication interface 938 may support other radio access technologies than the mentioned WLAN, GSM, UMTS and LTE. Also there could be multiple event-based imaging sensors, etc.
The methods as described herein are also implemented in some embodiments as a computer program causing a computer and/or a processor and/or circuitry to perform the method when being carried out on the computer and/or processor and/or circuitry. In some embodiments, a non-transitory computer-readable recording medium is also provided that stores therein a computer program product which, when executed by a processor/circuitry, such as the processor/circuitry described above, causes the methods described herein to be performed.
It should be recognized that the embodiments describe methods with an exemplary ordering of method steps. The specific ordering of method steps is, however, given for illustrative purposes only and should not be construed as binding.
It should also be noted that the division of the control or circuitry of
All units and entities described in this specification and claimed in the appended claims can, if not stated otherwise, be implemented as integrated circuit logic, for example on a chip, and functionality provided by such units and entities can, if not stated otherwise, be implemented by software.
In so far as the embodiments of the disclosure described above are implemented, at least in part, using software-controlled data processing apparatus, it will be appreciated that a computer program providing such software control and a transmission, storage or other medium by which such a computer program is provided are envisaged as aspects of the present disclosure.
Note that the present technology can also be configured as described below:
[1] An apparatus comprising
[2] The apparatus of [1], in which the processor (28) is configured to control the spectral filter transmission (FT(λ)) of the spectral filter (26) so that it sweeps linearly or non-linearly over wavelength (λ) with time (t).
[3] The apparatus of [1] or [2], wherein the event-based imaging sensor (27) is configured to create the filter response (Y(t)) by recording any intensity changes that result from changes in the spectral filter transmission (FT(t)).
[4] The apparatus of any one of [1] to [3], wherein the event-based imaging sensor (27) is configured to register pixel intensity changes in an asynchronous way.
[5] The apparatus of any one of [1] to [4], wherein the event-based imaging sensor (27) is configured to create, from incidents on the imaging sensor, events (y1, . . . , yN) that indicate an intensity increase or decrease.
[6] The apparatus of [5], wherein the events (y1 to yN) indicate an intensity increase or decrease of a pre-defined threshold (λ) which is fixed or adaptively changed by the processor (28).
[7] The apparatus of [5] or [6], wherein the processor (28) is configured to create the filter response (Y(t)) from the detected events (y1, . . . , yN).
[8] The apparatus of [7], wherein the processor (28) is configured to reconstruct the observed spectrum (X(λ)) from the measured filter responses (Y(t)) by a numerical reconstruction method.
[9] The apparatus of [8], wherein the numerical reconstruction method is a Tikhonov regularization.
[10] The apparatus of [9], wherein the numerical reconstruction method is based on a matrix containing the known filter responses of the spectral filter.
[11] The apparatus of any one of [8] to [10], wherein in the numerical reconstruction method, a single event (ym) or a sum of events related to a measurement within a time interval ([t0, t1]) is formulated as a linear or higher-order combination.
[12] The apparatus of [11], wherein the linear or higher order combination is ym=[FS(t1)−FS(t0)]·x·(t1−t0), where FS is the filter transmission of the spectral filter, x is the observed spectrum and (t1−t0) is the time interval.
[13] The apparatus of any one of [8] to [12], wherein the numerical reconstruction method is based on compressive sensing (CS) principles.
[14] The apparatus of any one of [1] to [13], further comprising a lens system (24) and a band-limiting filter (25).
[15] A method, comprising
[16] A computer program comprising instructions, the instructions when carried out on a processor cause the processor to
[17] A computer-readable medium comprising instructions, the instructions when carried out on a processor cause the processor to
Number | Date | Country | Kind |
---|---|---|---|
17209808 | Dec 2017 | EP | regional |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/EP2018/086783 | 12/21/2018 | WO | 00 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2019/122426 | 6/27/2019 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
9041932 | Priore et al. | May 2015 | B2 |
9347831 | Funamoto | May 2016 | B2 |
9588099 | Levenson et al. | Mar 2017 | B2 |
10955654 | Chenegros | Mar 2021 | B2 |
20140134712 | Na | May 2014 | A1 |
20180024343 | Chenegros | Jan 2018 | A1 |
Number | Date | Country |
---|---|---|
2012040466 | Mar 2012 | WO |
2017017684 | Feb 2017 | WO |
Entry |
---|
Moeys, Diederik Paul, et al. "Color temporal contrast sensitivity in dynamic vision sensors." 2017 IEEE International Symposium on Circuits and Systems (ISCAS). IEEE, 2017. |
International Search Report and Written Opinion dated May 8, 2019 for PCT/EP2018/086783 filed on Dec. 21, 2018, 11 pages. |
Number | Date | Country | Kind
---|---|---|---|
20200333186 | Oct 2020 | US | A1