SYSTEM FOR IMAGING A SCENE

Information

  • Patent Application
  • Publication Number
    20230266600
  • Date Filed
    June 09, 2021
  • Date Published
    August 24, 2023
Abstract
A system for imaging a scene includes a capturing unit to capture two-dimensional and/or three-dimensional information of the scene, the information including light waves from the scene; a first diffractive optical element to receive the light waves from the capturing unit; an optical waveguide to forward the light waves received by the first diffractive optical element, the first diffractive optical element additionally to couple the light waves into the optical waveguide; and a second diffractive optical element to couple the light waves forwarded by the optical waveguide out of the optical waveguide. The system additionally includes a first image sensor and at least one second image sensor to capture the light waves coupled out of the optical waveguide and to generate first image data and second image data therefrom. The first image sensor and the second image sensor are arranged in a region associated with the second diffractive optical element.
Description
BACKGROUND

Disclosed herein is a system for imaging a scene. The system includes a capturing unit configured to capture two-dimensional and/or three-dimensional information of the scene, the information including light waves from the scene; a first diffractive optical element configured to receive the light waves from the capturing unit; an optical waveguide configured to forward the light waves received from the first diffractive optical element, the first diffractive optical element further configured to couple the light waves into the optical waveguide; and a second diffractive optical element configured to couple the light waves forwarded by the optical waveguide out of the optical waveguide.


U.S. Pat. No. 9,753,141 B1 discloses a sensor-based imaging system with minimized delay time between sensor captures.


WO 94/24527 discloses a spectrograph with multiple holographic optical transmission gratings, which diffract incident light such that different spectral components impinge on spatially separated regions of an optoelectronic detector.


U.S. Pat. No. 10,116,915 B2 discloses a system for generating artificial or augmented reality with mutually different data capturing devices, each of which comprises multiple sensors. These data capturing devices, for example holographic cameras, collect scene information from their respective field of view (FOV).


In a holographic camera, or "holocam" for short, light is coupled into a light guide by an optical element and forwarded by total internal reflection to a further, outcoupling optical element. A suitable image sensor is provided behind the outcoupling optical element.


Here, the so-called sensor noise exerts a great influence on the achievable image quality. Sensor noise covers various disturbances that influence the temporal course of a pixel value, for example from image to image. This so-called temporal noise includes photon noise, dark current, readout and quantization noise, local noise, as well as offset noise and gain noise. The current trend toward ever smaller sensors with ever higher resolution yields ever smaller and thereby less sensitive pixels, and thus a greater susceptibility to noise. However, the noise behavior of the captured image does not depend only on the size and sensitivity of the individual pixels.
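
For illustration only (not part of the application; every parameter value below is assumed), the following Python sketch simulates how these temporal noise sources combine in a single pixel read: photon shot noise and dark current are modeled as Poisson processes, readout noise as approximately Gaussian, and quantization happens at the analog-to-digital conversion. The loop at the end shows why smaller pixels, which collect fewer photons, are noisier in relative terms.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def noisy_pixel_read(photons, exposure_s, qe=0.6, dark_e_per_s=5.0,
                     read_noise_e=2.0, gain_e_per_dn=1.5, bits=12):
    """Simulate one pixel read with the main temporal noise sources."""
    signal_e = rng.poisson(photons * qe)             # photon (shot) noise
    dark_e = rng.poisson(dark_e_per_s * exposure_s)  # dark current
    read_e = rng.normal(0.0, read_noise_e)           # readout noise (Gaussian approx.)
    electrons = signal_e + dark_e + read_e
    # Quantization: electrons -> digital number (DN), clipped to the ADC range.
    return int(np.clip(round(electrons / gain_e_per_dn), 0, 2**bits - 1))

# Smaller pixels collect fewer photons from the same scene, so the relative
# noise grows roughly as 1/sqrt(photons).
for photons in (10_000, 1_000, 100):
    reads = [noisy_pixel_read(photons, exposure_s=0.01) for _ in range(1000)]
    print(photons, float(np.mean(reads)), float(np.std(reads)))
```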


The dynamic range is also an important capturing criterion. In image sensors, it is considerably lower than, for example, in the human eye. The effect is known from photography: if one looks out of a window, the eye is capable of seeing both the darker interior and the brighter outside world. A camera with a considerably lower dynamic range, in contrast, can usually capture only one of the two correctly exposed: the darker interior, with the window overexposed, or the outside world, with the interior considerably underexposed. While underexposed regions can be brightened afterwards to a certain extent, information in overexposed regions cannot be recovered.
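
For orientation, a standard textbook relation with purely illustrative numbers (not figures from the application): the dynamic range of an image sensor is commonly expressed as the ratio of the saturation signal (full-well capacity) to the noise floor,

```latex
\mathrm{DR} \;=\; 20 \,\log_{10}\!\left(\frac{N_{\mathrm{sat}}}{N_{\mathrm{noise}}}\right)\ \mathrm{dB}.
```

With an assumed full-well capacity of 20,000 electrons and a noise floor of 4 electrons, DR = 20 log10(5000), which is approximately 74 dB, or about log2(5000), roughly 12.3 photographic stops; the adapting human eye, by comparison, is often credited with well over 100 dB.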


Up to now, captures with extended dynamic range can essentially be obtained in the following two ways. On the one hand, a capture can be digitally post-processed: usually it is correctly exposed for the bright region, and the dark region is subsequently brightened, so that the impression of better illumination arises in the image. On the other hand, multiple individual captures with different settings, for example different exposure times, can be taken one after the other. This yields multiple individual captures which ideally cover suitable settings for the darkest up to the brightest image regions. These individual captures can then be composed, with the correctly exposed image portion used from each capture. For example, bright regions are correctly captured, and adopted, with a short exposure time; dark image regions, in contrast, are better exposed, and adopted, with a long exposure time.
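
The following Python sketch illustrates the second approach with a deliberately minimal, hypothetical composition rule (not taken from the application): each pixel comes from the long exposure where that capture is not saturated, and otherwise from the short exposure rescaled by the exposure ratio.

```python
import numpy as np

def merge_exposures(short_img, long_img, exposure_ratio, saturation=0.95):
    """Compose two captures of the same scene into one extended-range image.

    short_img, long_img: same scene, pixel values normalized to [0, 1];
    exposure_ratio: long exposure time divided by short exposure time.
    Rule: trust the long capture where it is not saturated (dark regions
    are well exposed there); elsewhere use the short capture, rescaled to
    the long capture's radiometric scale.
    """
    long_ok = long_img < saturation
    return np.where(long_ok, long_img, short_img * exposure_ratio)

# Usage with simulated data: the long exposure is 8x the short one.
rng = np.random.default_rng(2)
short = rng.random((4, 4)) * 0.2         # dim capture, never saturated
long_ = np.clip(short * 8.0, 0.0, 1.0)   # brighter capture, may clip at 1.0
hdr = merge_exposures(short, long_, exposure_ratio=8.0)
```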


SUMMARY

An aspect of the invention is to provide an improved system for imaging a scene, which on the one hand allows a picture of the scene with an extended dynamic range and on the other hand reduces disturbances caused by sensor noise.


Thus, a system for imaging a scene is disclosed. The system comprises: a capturing unit configured to capture two-dimensional and/or three-dimensional information of the scene, the information comprising light waves from the scene; a first diffractive optical element configured to receive the light waves from the capturing unit; an optical waveguide configured to forward the light waves received from the first diffractive optical element, the first diffractive optical element further configured to couple the light waves into the optical waveguide; and a second diffractive optical element configured to couple the light waves forwarded by the optical waveguide out of the optical waveguide. The system further comprises a first image sensor and at least one second image sensor, which are configured to capture the light waves coupled out and to generate first image data and second image data therefrom, the first image sensor and the second image sensor being arranged in a region associated with the second diffractive optical element.


Instead of one capturing image sensor, at least two image sensors are provided here in the region of the outcoupling diffractive optical element. Thus, the image sensors can capture the scene from an identical viewing angle, i.e. from the same position. This in turn allows the image data generated by the two sensors to be combined with each other, whereby an averaged, noise-reduced picture of the scene can be created.
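
To see why combining helps (a standard statistical result, not specific to the application): if the sensors' noise contributions are independent, averaging N aligned frames reduces the noise standard deviation by a factor of sqrt(N). A minimal simulation with assumed noise levels:

```python
import numpy as np

rng = np.random.default_rng(1)

# Both sensors view the scene from the same position through the same
# outcoupling region, so their frames are aligned pixel by pixel.
scene = np.full((64, 64), 100.0)                     # assumed true values
frame_a = scene + rng.normal(0.0, 4.0, scene.shape)  # sensor 1, sigma = 4
frame_b = scene + rng.normal(0.0, 4.0, scene.shape)  # sensor 2, sigma = 4

averaged = 0.5 * (frame_a + frame_b)

print(np.std(frame_a - scene))   # ~4.0  (single sensor)
print(np.std(averaged - scene))  # ~2.8  (= 4.0 / sqrt(2))
```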


The region associated with the second diffractive optical element can have an area into which the second diffractive optical element couples the light waves out, wherein the first image sensor and the second image sensor are arranged within that area.


In this context, the size of the second diffractive optical element can determine the size of the area, i.e. the size of the outcoupling region (also called the "eyebox"). In other words, the size of the area in which light waves of the scene are received can be set by the size of the outcoupling second diffractive optical element such that the two image sensors can be arranged therein and capture the same scene. It is also possible to select the size of the second diffractive optical element such that further image sensors can be integrated within the area.


In an example, the first diffractive optical element can comprise a first holographic optical element and the second diffractive optical element can comprise a second holographic optical element. The use of holographic optical elements, which are lighter and more space-saving than conventional optical elements, allows the optical arrangement of the system to be further simplified and its weight reduced.


The first holographic optical element and the second holographic optical element can comprise volume holograms, which couple the light waves into and out of the waveguide, respectively, according to their wavelengths. The volume holograms can be employed as transmission gratings as well as reflection gratings. Thus, different arrangements of the holographic elements in the waveguide can be realized.


In this context, the second holographic optical element can comprise further optical functions for image correction, for example to reduce distortion in the picture of the scene.


The first holographic optical element and the second holographic optical element can comprise photosensitive material, including photopolymer. Materials such as dichromated gelatin, silver halides, photorefractive crystals and/or the like are also possible.


The waveguide can comprise a prism. However, an arrangement with a waveguide comprising an optical fiber is also possible.


The first image sensor can have a first sensitivity and/or a first exposure time, and the second image sensor can have a second sensitivity different from that of the first image sensor and/or a second exposure time different from that of the first image sensor. With differently sensitive image sensors, combined with varying exposure times, captures of scenes with great brightness differences become possible. For example, the first image sensor can capture bright image regions without overexposure, while the second image sensor captures dark image regions without underexposure. The image data generated therefrom can be combined, which allows the scene to be imaged with an improved dynamic range.


The first image sensor and the second image sensor can further convert the photons associated with the light waves into electrical signals to generate the first image data and the second image data therefrom.


The first image sensor and the second image sensor can comprise CMOS sensors and/or CCD sensors.


The system can further comprise a processing unit (or processor), which processes the image data generated by the first image sensor and the second image sensor and generates a picture of the scene therefrom. The processing unit can additionally perform corrections.


The above aspect may also include a holographic camera with the previously described system for imaging a scene.





BRIEF DESCRIPTION OF THE DRAWINGS

These and other aspects and advantages will become more apparent and more readily appreciated from the following description of the exemplary embodiments, taken in conjunction with the accompanying drawings of which:



FIG. 1 shows an example of a system for imaging a scene;



FIG. 2 shows an example of the system;



FIG. 3 shows an example of the system; and



FIG. 4 shows an example of image sensors in cooperation with a processing unit of the system.





DETAILED DESCRIPTION

Reference will now be made in detail to the embodiments, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout.


In FIG. 1 to FIG. 3, various examples of a system 10, 20, 30 for imaging a scene S are illustrated. The scene S can be an environment of the system 10, 20, 30. The system 10, 20, 30 comprises a capturing unit 12, 22, 32, a first diffractive optical element 14, 24, 34 and a second diffractive optical element 14, 24, 34, a waveguide 16, 26, 36 as well as a first image sensor 18, 28, 38 and at least one second image sensor 18, 28, 38.


The capturing unit 12, 22, 32 captures two-dimensional and/or three-dimensional information of the scene S, wherein the information includes light waves LW from the scene S. The capturing unit 12, 22, 32, as illustrated in FIG. 1 to FIG. 3, can be formed as an objective with a lens, which images the light waves LW onto the first diffractive optical element 14, 24, 34. However, this is not restrictive. The capturing unit 12, 22, 32 can similarly be formed by further optical elements such as, for example, mirrors, shutters and/or the like.


The first diffractive optical element 14, 24, 34 receives the light waves LW from the capturing unit 12, 22, 32 and couples them into the waveguide 16, 26, 36. The waveguide 16, 26, 36 forwards the light waves LW received from the first diffractive optical element 14, 24, 34 to the second diffractive optical element 14, 24, 34 by total reflection. The second diffractive optical element 14, 24, 34 then couples the light waves LW forwarded by the optical waveguide 16, 26, 36 out of the optical waveguide 16, 26, 36.


This is illustrated by the corresponding arrows in FIG. 1 to FIG. 3.


Various arrangements of the first and the second diffractive optical element 14, 24, 34 in the system 10, 20, 30 are possible. The diffractive optical elements 14, 24, 34 can, for example, be arranged within the waveguide 16, 36, as illustrated in FIG. 1 and FIG. 3. However, they can also be arranged outside it, as illustrated in FIG. 2. Depending on the requirements of the system 10, 20, 30, any arrangement can be realized.


The light waves LW coupled out are then captured by the first image sensor 18, 28, 38 and the at least one second image sensor 18, 28, 38, and first image data BD and second image data BD are generated therefrom. This is described in detail with reference to FIG. 4 later. The first image sensor 18, 28, 38 and the at least one second image sensor 18, 28, 38 are arranged in a region EB associated with the second diffractive optical element 14, 24, 34, which is indicated in simplified manner by the dashed border in FIG. 1 to FIG. 3.


The arrangement in the region EB allows the image sensors 18, 28, 38 to capture the light waves LW from the scene S from the same perspective. Thus, the generated image data can be combined with each other or averaged to obtain a noise-reduced picture AS of the scene.


In an example, the region EB associated with the second diffractive optical element 14, 24, 34 can have an area into which the second diffractive optical element 14, 24, 34 couples the light waves LW out, wherein the first image sensor 18, 28, 38 and the at least one second image sensor 18, 28, 38 are arranged within the area.


Herein, the size of the second diffractive optical element 14, 24, 34 determines the size of the area, i.e. the size of the outcoupling region, also referred to as the "eyebox". By suitable choice of the size of the second diffractive optical element 14, 24, 34, the size of the area can be set such that further image sensors 18, 28, 38 can be arranged within it, as illustrated in the example shown in FIG. 3. In FIG. 3, the size of the second diffractive optical element 34 is selected such that four image sensors can be arranged in the area overall. However, this is not restrictive. Various sizes of the second diffractive optical element 14, 24, 34, and thus any number of image sensors 18, 28, 38, can be realized. By combining the image data BD generated by the further image sensors, the image quality of the generated picture AS of the scene can be further improved.
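
Purely for illustration, with assumed dimensions (none of these values appear in the application), the number of sensors that fit into the eyebox follows directly from the ratio of the eyebox size to the sensor footprint:

```python
# Assumed, illustrative dimensions only.
eyebox_w_mm, eyebox_h_mm = 16.0, 12.0  # outcoupling area set by the second DOE
sensor_w_mm, sensor_h_mm = 7.2, 5.4    # footprint of one image sensor

cols = int(eyebox_w_mm // sensor_w_mm)  # -> 2
rows = int(eyebox_h_mm // sensor_h_mm)  # -> 2
print(rows * cols)  # -> 4, matching the four-sensor arrangement of FIG. 3
```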


The first diffractive optical element 14, 24, 34 can comprise a first holographic optical element and the second diffractive optical element 14, 24, 34 can comprise a second holographic optical element. The holographic optical elements have diffraction gratings produced by holographic methods. These diffraction gratings are, for example, written or recorded into volume holograms.


The first holographic optical element and the second holographic optical element can comprise these volume holograms, which couple the light waves LW into and out of the waveguide 16, 26, 36, respectively, according to their wavelengths. More precisely, the light waves LW are coupled in and out according to the Bragg condition, i.e. the light waves LW must have the correct wavelength (color) and the correct shape (beam direction, wavefront profile). A distinction is made between volume holograms with reflection gratings and volume holograms with transmission gratings. In transmission gratings, the incident light waves LW are diffracted as they pass through the grating. In reflection gratings, the light waves LW are diffracted back for certain angles and wavelengths such that constructive interference arises. FIG. 1 illustrates an example with reflection gratings: the light wave LW from the scene S is reflected at the coupling-in first holographic optical element and forwarded by total reflection to the second holographic optical element, which then couples the light wave LW out. FIG. 1 shows a monochromatic (single-color) construction, in which only light waves LW with a certain wavelength are reflected. A polychromatic (multicolor) construction is also possible; it is illustrated in FIG. 2.
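
For reference, the Bragg condition invoked above can be stated in its simplest textbook form (a standard relation, not quoted from the application) for a volume grating of period Λ in a medium of refractive index n:

```latex
2\,\Lambda\,\sin\theta \;=\; m\,\frac{\lambda}{n}, \qquad m = 1, 2, \ldots
```

Here λ is the vacuum wavelength and θ the angle between the incident ray and the grating planes inside the medium. Only light waves whose wavelength and direction jointly satisfy this relation are strongly diffracted, which is why each volume hologram selects the correct color and beam geometry for coupling in and out.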


In FIG. 2, light waves LW with different wavelengths are coupled in and out, respectively. This can be realized by using multiple holographic optical elements (not shown). The use of holographic optical elements with volume holograms into which multiple diffraction gratings are written is also possible.


FIG. 3 shows an example illustrating a monochromatic construction using transmission gratings. In this example, the holographic optical elements are arranged centrally in the waveguide 36, and the incident light wave LW is diffracted as it passes through the gratings.


As mentioned above, the examples illustrated in FIG. 1 to FIG. 3 are not restrictive. Further arrangements and combinations are possible.


Further, the second holographic optical element can comprise further functions for image correction. These can, for example, be written into the volume hologram and reduce additional disturbances such as distortion when the light waves LW are coupled out.


The first holographic optical element and the second holographic optical element can comprise a photosensitive material, preferably photopolymer. Photopolymer offers good diffraction efficiency and has the advantage that it does not have to be additionally chemically processed. Materials such as dichromated gelatin, silver halides and/or the like are also possible.


The waveguide 16, 26, 36 can comprise a prism. An optical fiber is also possible.


The first image sensor 18, 28, 38 can have a first sensitivity and/or a first exposure time, and the second image sensor 18, 28, 38 can have a second sensitivity different from that of the first image sensor 18, 28, 38 and/or a second exposure time different from that of the first image sensor. With the different sensitivities and/or exposure times, different regions of a scene S can be captured, and combined, with varying accuracy. Thus, the scene S can be imaged with all its brightness differences. For example, the sensitivity and/or exposure time can be adjusted such that the first image sensor 18, 28, 38 captures bright regions of the scene S without overexposure and the second image sensor 18, 28, 38 captures dark regions of the scene S without underexposure. By combining the generated image data BD, for example superposing image data with different exposure times, a high-contrast image of the scene S can be generated. With the polychromatic arrangement shown in FIG. 2, for example, a color image with extended dynamic range can also be generated.
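
A per-pixel weighted fusion is a slightly more refined combination than the binary selection sketched in the background section. The following Python sketch is a hypothetical scheme, not taken from the application: a triangular weight trusts mid-range pixel values, and each capture is normalized by its exposure time to a common radiance scale before blending.

```python
import numpy as np

def fuse_dual_sensor(img_lo, img_hi, t_lo, t_hi):
    """Fuse two simultaneous captures taken with exposure times t_lo < t_hi.

    img_lo, img_hi: pixel values normalized to [0, 1]. A triangular 'hat'
    weight trusts mid-range values and distrusts values near under- (0)
    or overexposure (1).
    """
    def weight(img):
        # 1 at mid-gray, falling to (almost) 0 at the extremes.
        return np.clip(1.0 - np.abs(2.0 * img - 1.0), 1e-3, None)

    w_lo, w_hi = weight(img_lo), weight(img_hi)
    # Divide by exposure time to a common radiance scale, then blend.
    return (w_lo * img_lo / t_lo + w_hi * img_hi / t_hi) / (w_lo + w_hi)
```

With, say, t_lo = 1 ms and t_hi = 8 ms, bright regions draw mostly on img_lo and dark regions mostly on img_hi, yielding a linear, extended-range result that can then be tone-mapped for display.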


The first image sensor 18, 28, 38 and the second image sensor 18, 28, 38 can convert the photons associated with the light waves LW into electrical signals to generate the first image data BD and the second image data BD therefrom. This occurs via the photoelectric effect: in simplified terms, photons are absorbed by the image sensors and electrons, i.e. charges, are released.


The first image sensor 18, 28, 38 and the second image sensor 18, 28, 38 can comprise CMOS sensors and/or CCD sensors. In CMOS (complementary metal-oxide-semiconductor) sensors, the charges in a pixel are converted into a voltage, which is amplified, quantized and output as a digital value. CCD (charge-coupled device) sensors are composed of a plurality of light-sensitive semiconductor elements arranged over an area. Each semiconductor element represents a photodetector, which converts the incident photons into electrons.


The system 10, 20, 30 can further comprise a processing unit VB, which processes the image data BD generated by the first image sensor 18, 28, 38 and the second image sensor 18, 28, 38 and generates a picture AS of the scene therefrom. In this context, FIG. 4 shows a simplified and schematic diagram of an example of the image sensors 18, 28, 38 in cooperation with the processing unit VB of the system 10, 20, 30. The light waves LW received by the first and the second image sensor 18, 28, 38 are converted into image data BD as described above and provided to the processing unit VB. The processing unit VB processes the image data BD, wherein additional image corrections can be performed.
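
A minimal sketch of what such a processing unit might do; the correction steps and their order are assumptions for illustration, not taken from the application:

```python
import numpy as np

def processing_unit(frames, dark_frame=None, flat_frame=None):
    """Correct each sensor frame, then combine them into the picture AS.

    dark_frame and flat_frame are optional, assumed calibration inputs;
    the combination here is a plain per-pixel mean of the aligned frames.
    """
    corrected = []
    for f in frames:
        f = np.asarray(f, dtype=float)
        if dark_frame is not None:
            f = f - dark_frame   # remove the fixed dark-signal offset
        if flat_frame is not None:
            f = f / flat_frame   # compensate response non-uniformity
        corrected.append(f)
    return np.mean(corrected, axis=0)  # picture AS of the scene
```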


The picture of the scene AS is generated from the processed image data BD by suitable combination.


Further, a holographic camera can be equipped with the above-described system 10, 20, 30 for imaging a scene S.


A description has been provided with particular reference to preferred embodiments thereof and examples, but it will be understood that variations and modifications can be effected within the spirit and scope of the claims which may include the phrase “at least one of A, B and C” as an alternative expression that means one or more of A, B and C may be used, contrary to the holding in Superguide v. DIRECTV, 358 F3d 870, 69 USPQ2d 1865 (Fed. Cir. 2004).

Claims
  • 1-13. (canceled)
  • 14. A system for imaging a scene, the system comprising: a capturing unit configured to capture two-dimensional and/or three-dimensional information of the scene, the information comprising light waves from the scene; a first diffractive optical element configured to receive the light waves from the capturing unit; an optical waveguide configured to forward the light waves received from the first diffractive optical element, the first diffractive optical element further configured to couple the light waves into the optical waveguide; a second diffractive optical element configured to couple the light waves forwarded by the optical waveguide out of the optical waveguide; and a first image sensor and at least one second image sensor, which are configured to capture the light waves coupled out and to generate first image data and second image data therefrom, the first image sensor and the second image sensor being in a region associated with the second diffractive optical element.
  • 15. The system according to claim 14, wherein the region associated with the second diffractive optical element comprises an area into which the second diffractive optical element couples the light waves out, and the first image sensor and the at least one second image sensor are within the area.
  • 16. The system according to claim 14, wherein the size of the second diffractive optical element determines the size of the area.
  • 17. The system according to claim 15, wherein the size of the second diffractive optical element determines the size of the area.
  • 18. The system according to claim 14, wherein the first diffractive optical element comprises a first holographic optical element and the second diffractive optical element comprises a second holographic optical element.
  • 19. The system according to claim 15, wherein the first diffractive optical element comprises a first holographic optical element and the second diffractive optical element comprises a second holographic optical element.
  • 20. The system according to claim 16, wherein the first diffractive optical element comprises a first holographic optical element and the second diffractive optical element comprises a second holographic optical element.
  • 21. The system according to claim 17, wherein the first diffractive optical element comprises a first holographic optical element and the second diffractive optical element comprises a second holographic optical element.
  • 22. The system according to claim 18, wherein the first holographic optical element and the second holographic optical element comprise volume holograms which couple the light waves into and out of the waveguide, respectively, corresponding to their wavelengths.
  • 23. The system according to claim 18, wherein the second holographic element comprises further optical functions for image correction.
  • 24. The system according to claim 22, wherein the second holographic element comprises further optical functions for image correction.
  • 25. The system according to claim 18, wherein the first holographic optical element and the second holographic optical element comprise a photosensitive material including photopolymer.
  • 26. The system according to claim 22, wherein the first holographic optical element and the second holographic optical element comprise a photosensitive material including photopolymer.
  • 27. The system according to claim 23, wherein the first holographic optical element and the second holographic optical element comprise a photosensitive material including photopolymer.
  • 28. The system according to claim 14, wherein the waveguide comprises a prism.
  • 29. The system according to claim 14, wherein the first image sensor has a first sensitivity and/or a first exposure time, and that the second image sensor has a second sensitivity different from the first image sensor and/or a second exposure time different from the first image sensor.
  • 30. The system according to claim 14, wherein the first image sensor and the second image sensor convert the photons associated with the light waves into electrical signals, to generate the first image data and the second image data therefrom.
  • 31. The system according to claim 30, wherein the first image sensor and the second image sensor comprise a CMOS sensor and/or a CCD sensor.
  • 32. The system according to claim 14, further comprising a processing unit to process the image data generated by the first image sensor and the second image sensor and to generate an image of the scene (AS) therefrom.
  • 33. A holographic camera comprising: a system for imaging a scene, the system comprising: a capturing unit configured to capture two-dimensional and/or three-dimensional information of the scene, the information comprising light waves from the scene; a first diffractive optical element configured to receive the light waves from the capturing unit; an optical waveguide configured to forward the light waves received from the first diffractive optical element, the first diffractive optical element further configured to couple the light waves into the optical waveguide; a second diffractive optical element configured to couple the light waves forwarded by the optical waveguide out of the optical waveguide; and a first image sensor and at least one second image sensor, which are configured to capture the light waves coupled out and to generate first image data and second image data therefrom, the first image sensor and the second image sensor being in a region associated with the second diffractive optical element.
Priority Claims (1)
Number             Date      Country  Kind
10 2020 117 278.8  Jul 2020  DE       national
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a U.S. National Stage Application which claims the benefit under 35 U.S.C. § 371 of International Patent Application No. PCT/EP2021/065545 filed on Jun. 9, 2021, which claims the priority benefit of German Patent Application No. 10 2020 117 278.8 filed on Jul. 1, 2020, the contents of each of which are incorporated herein by reference in their entireties.

PCT Information
Filing Document    Filing Date  Country  Kind
PCT/EP2021/065545  6/9/2021     WO