CAMERA SENSOR HIDDEN BEHIND LUMINAIRE OPTICS

Information

  • Patent Application
  • Publication Number: 20190331334
  • Date Filed: December 14, 2017
  • Date Published: October 31, 2019
Abstract
The invention provides a sensor unit (100) comprising a sensor array (120) having at least 100×100 sensor pixels (125), a lens system (200), and a light transmissive sensor window (300) comprising optical structures (320), wherein the light transmissive sensor window (300) is configured at a window distance (d) from said sensor array (120), wherein the lens system (200) is configured between the sensor array (120) and said sensor window (300), and wherein an object plane (OP) and an image plane (IP) defined by said lens system (200) and sensor array (120) have an object-image plane distance (d1) selected from the range of 0.1*d-2*d, wherein the optical structures (320) are configured in a pattern, wherein the optical structures (320) have one or more dimensions selected from length (L), width (W) and diameter (D) selected from the range of 50 μm-20 mm, and wherein neighboring optical structures (320) have shortest distances (d2) selected from the range of 0-50 mm.
Description
FIELD OF THE INVENTION

The invention relates to a sensor unit and a sensor system comprising such sensor unit. The invention further relates to a lighting system comprising such sensor unit or sensor system. Yet further, the invention also relates to a method of e.g. sensing motion.


BACKGROUND OF THE INVENTION

Security cameras in combination with lighting fixtures are known in the art. U.S. Pat. No. 8,622,561 B2, for instance, describes a luminaire, comprising a housing having an outer surface, a cover extending a height above the outer surface, and a camera concealed within the cover. This document also describes a luminaire, comprising a housing made of first and second portions having respective first and second outer surfaces, wherein the first outer surface extends a height substantially above the second outer surface, and a camera concealed within the first portion. Further, this document describes a luminaire, comprising a housing including first and second portions having respective first and second outer surfaces, wherein the first outer surface extends a height substantially above the second outer surface, a privacy film disposed on the first surface, and a camera concealed within the first portion. The privacy film can reduce the visibility within at least the visible portion of the electromagnetic spectrum through generally transparent materials. The privacy film can be silvered, thereby offering an unimpeded view from the low-light side but virtually no view from the high-light side. The privacy film can be made by frosting transparent materials, which can render the material translucent. There are a number of privacy film gradients that can be lighter or darker. Further, the privacy film can include material having a plurality of small holes, referred to as a perforated film. Such perforations are designed to allow the graphic to be seen from the outside, whilst allowing people to see out from the inside. Here, we refer to the variety of privacy films generally as a tint, silvered, mirrored or a perforated film and the like. The camera tracks and records a target.


SUMMARY OF THE INVENTION

PIR (passive infrared) sensors may be used in presence detectors for lighting systems. A low number of infrared sensors (typically four) is combined with an (IR light transparent) lens array. Depending on the position of a person with respect to the PIR sensor, the IR light emitted by that person will be focused on one of the sensors. If the person moves, the light will be focused on another sensor. The difference signal triggers a motion detection signal. A disadvantage of PIR sensors is the low resolution: a lens array is needed to create a repeating virtual pattern of the small cluster of sensors such that a certain area may be covered. The lens array design must be specifically tuned to the application height and detection area; otherwise gaps may occur in the virtual sensor array where no movement can be detected, or the virtual sensor images overlap, which reduces the sensitivity of the array. Thus every application requires a different PIR sensor design, or a completely different solution.
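To illustrate the difference-signal principle described above, a minimal sketch follows; the four-element readout, the threshold, and the example readings are hypothetical values chosen only to mirror the mechanism, not part of this disclosure.

```python
# Minimal sketch of PIR-style difference-signal motion detection.
# The four-element readout and threshold are hypothetical illustrations.

def pir_motion_detected(prev, curr, threshold=0.5):
    """Trigger when the focused IR signal shifts between the (typically four) sensors."""
    diffs = [abs(c - p) for p, c in zip(prev, curr)]
    return max(diffs) > threshold

# A person's IR spot moves from sensor 0 to sensor 1 between two readouts:
print(pir_motion_detected([2.0, 0.1, 0.1, 0.1], [0.1, 2.0, 0.1, 0.1]))  # True
# A stationary scene produces no difference signal:
print(pir_motion_detected([2.0, 0.1, 0.1, 0.1], [2.0, 0.1, 0.1, 0.1]))  # False
```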


Hence, it is an aspect of the invention to provide an alternative sensor, which preferably further at least partly obviates one or more of above-described drawbacks.


Herein, it is proposed to use a camera for sensing, such as in a lighting system. To avoid issues with privacy, the camera may have a fixed focus a few cm in front of the camera. Since people are typically more than a meter away, only very blurry images of people can be generated and privacy can be guaranteed.


In specific embodiments, the “near sighted” camera is hidden behind a layer or cover with an optical structure (prisms, lenslets, controlled scattering) that scrambles the image even more. This not only emphasizes the privacy aspect, but it also improves the sensitivity of the camera to movement in case the optical structures are made of clear material and are positioned in the focus of the camera. The high resolution also enables distinguishing clusters of pixels that show uncorrelated dynamics, which can be used to count the number of people present and even estimate the speed of movement and deduce the type of activity from that. The cost of a high-resolution smartphone camera is already close to the cost of a 4-pixel PIR sensor. Hence, the present solution may also be cost effective. Further, the present solution may have fewer issues with distances and may have a larger distance range over which the sensor may sense well, whereas prior art sensors may only sense well over a limited range of distances. Further, the system may sense with relatively high precision while nevertheless not causing an infringement on privacy.
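One way such pixel clustering might be sketched is given below, assuming frames arrive as a NumPy array; the variance threshold, the connected-component labeling, and all names are illustrative assumptions, not the method of this disclosure.

```python
# Illustrative sketch: count independently moving regions in a stack of frames
# by thresholding per-pixel temporal variance and labeling connected regions.
# The threshold and the synthetic data are hypothetical and would need tuning.
import numpy as np
from scipy import ndimage

def count_moving_clusters(frames, var_threshold=0.01):
    """frames: array of shape (T, H, W); returns the number of active clusters."""
    activity = frames.var(axis=0) > var_threshold  # pixels with temporal dynamics
    _, n_clusters = ndimage.label(activity)        # connected active regions
    return n_clusters

rng = np.random.default_rng(0)
frames = np.zeros((10, 100, 100))
frames[:, 20:30, 20:30] = rng.random((10, 10, 10))  # one "moving" region
frames[:, 60:70, 70:80] = rng.random((10, 10, 10))  # a second, uncorrelated one
print(count_moving_clusters(frames))  # 2
```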


Hence, in a first aspect the invention provides in embodiments a sensor unit (“unit”) comprising a (2D) sensor array (“array”) having at least 100×100 sensor pixels (“pixels”) and a lens system. In other embodiments, the invention provides a sensor unit comprising a (2D) sensor array having at least 100×100 sensor pixels, and a light transmissive sensor window (“window” or “sensor window”) comprising optical structures (“structures”). Especially, in embodiments the invention provides a sensor unit comprising a (2D) sensor array having at least 100×100 sensor pixels, a lens system, and a light transmissive sensor window comprising optical structures. In such embodiments, the lens system is configured between the sensor array and light transmissive sensor window. The light transmissive sensor window is configured at a window distance (d) from said (2D) sensor array.


Further, the lens system may especially be configured to provide a nearsighted (2D) sensor array, such as having a sharp focus at distances equal to or less than about 50 cm, such as equal to or less than 25 cm (from the lens system). An object plane and an image plane defined by said lens system have an object-image plane distance (d1) selected from the range of 0.1*d-5*d, especially 0.1*d-4*d, even more especially of 0.1*d-2*d, such as 0.5*d-2*d, even more especially 0.9*d-1.1*d. In this way, privacy of a person may essentially be guaranteed as the (2D) sensor array essentially focuses on the window, or nearby the window. Further, especially the image plane is configured close to or at the sensor array. Therefore, in specific embodiments the sensor array is configured at the image plane.


The image plane may thus coincide with the sensor array. Hence, especially the object plane and the image plane are defined by the lens system and the sensor array, with (thus) especially the sensor array at the image plane. Thereby, also the object plane is defined.


Further, especially the optical structures are configured in a pattern, wherein more especially the optical structures have one or more dimensions selected from length (L), width (W) and diameter (D) selected from the range of 50 μm-20 mm, and wherein more especially neighboring optical structures have shortest distances (d2) selected from the range of 0-50 mm.


Further, the optical structures are selected from the group of dots, facets, pyramids, lines, grooves, lamellae, and lenses.


When the window comprises a transmissive layer, the optical structures may e.g. be provided at one side of the layer, such as the upstream side or the downstream side of such layer. The transmissive layer is herein also indicated as “optical layer”. In embodiments, the transmissive layer comprises a (light transmissive) plate. In other embodiments, the transmissive layer comprises a (light transmissive) foil.


As indicated above, such sensor unit may allow sensing with a relatively high precision but without violating privacy rules. For instance, one or more of motion, light, color, human presence, human behavior, etc., may be sensed with said control system as function of the sensor signal from said sensor array.


The sensor unit comprises a (2D) sensor array and one or more of a lens system and a light transmissive sensor window, especially both the lens system and light transmissive sensor window.


The (2D) sensor array may have at least 100×100 sensor pixels. With 100×100 or more pixels the sensor may sense a (moving) object with relatively high precision. Present day smartphone cameras have a much higher resolution. Hence, the number of pixels may be much larger, without much additional costs. Therefore, in specific embodiments the (2D) sensor array has at least 400×400 sensor pixels, like at least 1000×1000 pixels. When technology advances, a higher number of pixels may also be possible. Especially however, the number of pixels is at least 400×400. In embodiments, the (2D) sensor array may be a CCD (semiconductor charge-coupled device) or CMOS (complementary metal-oxide-semiconductor) sensor array. Hence, in embodiments the (2D) sensor array may comprise one or more of CCD and CMOS. The (2D) sensor array may be configured to sense one or more of (a) light from the visible spectrum (i.e. one or more wavelengths selected from the range of 380-780 nm) and (b) light from the infrared (i.e. one or more wavelengths selected from e.g. the range of 780-3000 nm, such as e.g. selected from the range of 780-1500 nm). For instance, the (2D) sensor array may have sensitivity in the wavelength range of 700-900 nm. The term “pixel” or “sensor pixel” refers to a single sensor. If no color filter is used, or if all sensors have the same color filter, all pixels are identical. Typically, in camera sensors, the pixels are grouped in triplets of sensors equipped with a red, green and blue color filter, respectively.


Especially, the sensor unit comprises a light transmissive sensor window. This light transmissive sensor window may be a plate or foil of light transmissive material, which may be planar or may include one or more curvatures relative to a plane of the window.


The light transmissive material may comprise one or more materials selected from the group consisting of a transmissive organic material, such as selected from the group consisting of PE (polyethylene), PP (polypropylene), PEN (polyethylene naphthalate), PC (polycarbonate), polymethylacrylate (PMA), polymethylmethacrylate (PMMA) (Plexiglas or Perspex), cellulose acetate butyrate (CAB), silicone, polyvinylchloride (PVC), polyethylene terephthalate (PET), including in an embodiment PETG (glycol modified polyethylene terephthalate), PDMS (polydimethylsiloxane), and COC (cyclo olefin copolymer). Especially, the light transmissive material may comprise an aromatic polyester, or a copolymer thereof, such as e.g. polycarbonate (PC), poly (methyl)methacrylate (P(M)MA), polyglycolide or polyglycolic acid (PGA), polylactic acid (PLA), polycaprolactone (PCL), polyethylene adipate (PEA), polyhydroxy alkanoate (PHA), polyhydroxy butyrate (PHB), poly(3-hydroxybutyrate-co-3-hydroxyvalerate) (PHBV), polyethylene terephthalate (PET), polybutylene terephthalate (PBT), polytrimethylene terephthalate (PTT), polyethylene naphthalate (PEN); especially, the light transmissive material may comprise polyethylene terephthalate (PET). Hence, the light transmissive material is especially a polymeric light transmissive material. However, in another embodiment the light transmissive material may comprise an inorganic material. Especially, the inorganic light transmissive material may be selected from the group consisting of glasses, (fused) quartz, transmissive ceramic materials, and silicones. Also hybrid materials, comprising both inorganic and organic parts may be applied. Especially, the light transmissive material comprises one or more of PMMA, PC, or glass. In yet another embodiment, especially when IR sensitivity may be relevant the light transmissive material may especially comprise PE.


In embodiments, the light transmissive sensor window is a closed window, such as a foil or plate. Hence, the layer may be an integral piece of material. In such embodiments, the window essentially consists of light transmissive material. Further, in such embodiments, the window may not include through-openings.


The transmission or light permeability can be determined by providing light at a specific wavelength with a first intensity to the material and relating the intensity of the light at that wavelength measured after transmission through the material, to the first intensity of the light provided at that specific wavelength to the material (see also E-208 and E-406 of the CRC Handbook of Chemistry and Physics, 69th edition, 1988-1989).
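In formula form (notation ours, not taken from the cited handbook), the transmission at wavelength λ is simply the ratio of transmitted to incident intensity:

```latex
% Transmission (light permeability) at wavelength \lambda:
T(\lambda) = \frac{I_\mathrm{t}(\lambda)}{I_0(\lambda)}
% with I_0 the first (incident) intensity and I_t the intensity
% measured after transmission through the material.
```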


In specific embodiments, a material may be considered transmissive when the transmission of the radiation at a wavelength or in a wavelength range, especially at a wavelength or in a wavelength range of radiation generated by a source of radiation as herein described, through a 1 mm thick layer of the material, especially even through a 5 mm thick layer of the material, under perpendicular irradiation with the visible (or optionally IR) radiation is at least about 20%, such as at least about 30%, such as even at least about 40%, such as at least 50%. The transmission may depend upon one or more of the material of the optical structures, the density, the window material, etc.


In specific embodiments, however, the light transmissive sensor window may include through openings. In such embodiments, the light transmissive sensor window may not essentially consist of light transmissive material, but may (also) comprise non-transmissive material. In such embodiments, transmission may (essentially) be provided by the openings, which may also be indicated as through openings. An example of such light transmissive sensor window may e.g. be a lamella structure (like louvres).


Especially, the light transmissive sensor window is configured at a window distance (d) from said (2D) sensor array. This distance may in embodiments be selected from the range of 0.5-300 mm, such as 1-250 mm, like 5-150 mm. Further, the light transmissive sensor window may be a light transmissive (lighting device) window of a lighting device, as will be further elucidated below.


As indicated above, in further embodiments the sensor unit comprises a lens system. Especially, the lens system may be configured to provide a nearsighted (2D) sensor array as the object-image plane distance is about the distance between the sensor array and the light transmissive window. The terms “object plane” and “image plane” especially refer to the conjugate planes of an imaging system (such as a lens) defined by the property that an object in the object plane has an image in the image plane with the optimum resolution allowed by the optical system. The specific distances of these planes can be calculated for thin lenses with the well-known lens equation, but they can also be determined for more complex imaging systems with calculation software known to people skilled in the art of optical design. The object-image plane distance (d1) is related to the lens system and depends on, e.g. the focal distance and the thickness of the lens system; the sensor window distance is related to the sensor array.
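As a worked illustration of the thin-lens case mentioned above (the lens equation 1/s_o + 1/s_i = 1/f), the sketch below computes the object plane position and the object-image plane distance d1 for an assumed focal length and lens-to-sensor distance; the numbers are illustrative only, and a real lens system would be designed with dedicated optical software.

```python
# Hedged sketch of the thin-lens relation: given focal length f and the
# lens-to-sensor distance s_i (sensor at the image plane), recover the object
# distance s_o from 1/s_o + 1/s_i = 1/f, and d1 = s_o + s_i (thin lens, so
# lens thickness is neglected). All values are illustrative assumptions.

def conjugate_distances(f_mm, s_i_mm):
    if s_i_mm <= f_mm:
        raise ValueError("sensor must sit beyond the focal point for a real image")
    s_o_mm = f_mm * s_i_mm / (s_i_mm - f_mm)
    return s_o_mm, s_o_mm + s_i_mm

# A nearsighted configuration: short focal length, sensor slightly beyond f.
s_o, d1 = conjugate_distances(f_mm=4.0, s_i_mm=5.0)
print(f"object plane at {s_o:.0f} mm, object-image plane distance d1 = {d1:.0f} mm")
# -> object plane at 20 mm: sharp focus a few cm in front of the camera
```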


In embodiments, the lens system may include a single lens, like a dome (i.e. dome shaped lens), configured downstream from the 2D array. In specific embodiments, the lens system may comprise a plurality of lenses. In yet further embodiments the lens system comprises one or more Fresnel lenses, like a plurality of Fresnel lenses.


The terms “upstream” and “downstream” relate to an arrangement of items or features relative to the direction of sensing from a sensing means (here especially the sensor(s)), wherein relative to a first position following the direction of sensing (in a direction away) of the sensing means, a second position following the direction of sensing of the sensing means closer to the sensing means than the first position is “upstream” (of the first position), and a third position following the direction of sensing of the sensing means further away from the sensing means (than the first position) is “downstream” (from the first position (and also of the second position)).


Further, especially the sensor window comprises optical structures. The optical structures are configured in a pattern which may be regular or irregular, or a combination of a regular pattern and an irregular pattern. Hence, the term “pattern” may also refer to a plurality of different patterns. Especially, the optical structures are configured in a regular pattern, such as e.g. a cubic or a hexagonal configuration, or a configuration of a plurality of parallel elongated optical structures. In embodiments, the optical structures are light transmissive (see also below). This will especially be the case when the sensor window comprises light transmissive material as in embodiments the optical structures are comprised by the window.


In specific embodiments, the optical structures have one or more dimensions selected from length (L), width (W) and diameter (D) selected from the range of 50 μm-20 mm. Even more especially, the optical structures have one or more dimensions selected from length (L), width (W) and diameter (D) selected from the range of 100 μm-2 mm.


In specific embodiments neighboring optical structures have shortest distances (d2) selected from the range of 0-50 mm.


In further specific embodiments, neighboring optical structures have shortest distances (d2) selected from the range of 2-50 mm, like 5-50 mm, such as 5-20 mm. This may especially be relevant in the case of lamellae.


In further specific embodiments, neighboring optical structures have shortest distances (d2) selected from the range of 0-10 mm, such as 0-5 mm, like 0-2 mm. This may especially be relevant in the case of windows with optical structures that are essentially light transmissive, for instance in the case of a micro lens optical (MLO) layer.


In further specific embodiments, neighboring optical structures have shortest distances (d2) selected from the range of 50 μm-50 mm, like 50 μm-20 mm. This may especially be relevant in the case of windows with optical structures that are essentially light transmissive or which are not light transmissive, but which are not adjacent, such as with through-holes or light transmissive material in between.


The shortest distances can be zero when the optical structures are light transmissive. When the optical structures are not light transmissive, the shortest distances will be larger than zero, such as at least 50 μm, thereby providing space for light transmissive material or a through-hole.


In embodiments, the optical structures are configured in a regular pattern, with the shortest distances selected from the range of 50 μm-50 mm, especially 50 μm-20 mm. In embodiments, a pitch of optical structures may be selected from the range of 50 μm-50 mm, especially 50 μm-20 mm.


The term “optical structure” is applied as the optical structures may influence incoming radiation such that a pattern of the radiation is generated on the 2D array. In this way, in a much higher resolution or precision e.g. motion can be sensed. Further, with such dimensions of and distances between the optical structures, privacy of persons may essentially be guaranteed, as the optical structures may inhibit or prevent face recognition. In combination with the nearsighted aspect of the 2D sensor array imposed by the lens system, face recognition may be impossible.


The optical structures are selected from the group of dots, facets, pyramids, lines, grooves, lamellae, and lenses. Especially, the optical structures are selected from the group of dots, facets, pyramids, lines, grooves, and lenses. Such optical structures may be comprised by the light transmissive sensor window as structures in the window and/or as structures on the window. For instance, dots and lines may be provided on a window, e.g. by screen printing. Grooves may be provided in the window, e.g. by etching. Pyramids, facets and lenses may e.g. be provided by casting a polymer with a suitable mold. Further, such structures may be provided at a downstream side of the window and/or an upstream side of the window.


In specific embodiments, the light transmissive sensor window comprises a light transmissive layer comprising a micro lens array. The micro lens array (or micro lens optics (MLO)) may be better than a diffuser, because the camera lens, which may in embodiments be essentially focused on the window, will see more details (compared to the situation with a diffuser).


In yet other embodiments, the optical structures may comprise lamellae. In such embodiments, a window may comprise a plurality of through-holes between the lamellae.


In specific embodiments, the light transmissive sensor window comprises a light transmissive layer comprising one or more of pyramids with a triangular base, pyramids with a square base, pyramids with a hexagonal base, and conical pyramids.


The sensor system may be used to sense one or more of a light level, presence of people (via motion or via recognition of a human shape), color of light (this may also enable the lighting system to adjust the CCT to the varying daylighting conditions), color of objects (e.g. in a supermarket the color of the goods can be detected and the spectrum adjusted to highlight blue, green, red or yellow for making fish, vegetables, tomatoes/meat or bread look more attractive), activity detection (by combining position/movement and possibly object recognition, or by combining with other sensor input like sound), etc. The presently described sensor may however not be able to recognize a face. Hence, the sensor may provide detection of light, light color, object color (though not very specifically), motion, speed of motion (and thus also a (rough) activity classification), and (rough) people counting (more people generate more motion, but they may not be distinguished separately).


In yet a further aspect, the invention also provides a sensor system comprising the sensor unit as described herein and a control system functionally coupled with said sensor unit, wherein the sensor system is configured to sense (e.g. motion) as function of radiation received by said (2D) sensor array, especially through the light transmissive sensor window. Hence, the sensor unit as described herein may be coupled with a control system for e.g. evaluation of the sensor signal of the (2D) sensor array. The term “sensor signal” may also refer to a plurality of sensor signals, e.g. from different pixels. Herein, the term “control” and similar terms in relation to the control system that is functionally coupled with the sensor unit especially refers to monitoring. Hence, the control system is especially configured to monitor the sensor unit, more especially the (2D) sensor array. The control system receives the sensor signal and may e.g. convert this to information on one or more of the presence of a person, the location of a person, the behavior of a person, etc.
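A minimal sketch of how such a control system might derive a motion signal from successive sensor-array frames follows; frame differencing is only one possible evaluation, and the function names and threshold are illustrative assumptions rather than the prescribed method.

```python
# Hedged sketch: derive a scalar motion signal from two successive frames of
# the (2D) sensor array by frame differencing. The threshold is a hypothetical
# tuning parameter, not a value taken from this disclosure.
import numpy as np

def motion_signal(frame_prev, frame_curr):
    """Mean absolute per-pixel change between two (H, W) frames."""
    return float(np.abs(frame_curr.astype(float) - frame_prev.astype(float)).mean())

def sense_motion(frame_prev, frame_curr, threshold=2.0):
    """Boolean motion decision, e.g. to feed a lighting controller."""
    return motion_signal(frame_prev, frame_curr) > threshold
```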


As the pattern on the (2D) sensor array may depend upon the distance of an object to the sensor array, and thus also on the final position of the sensor unit during operation, it may be useful to calibrate the sensor system and/or to allow the sensor system to self-learn. Hence, in embodiments the control system comprises one or more of a self-learning algorithm and a calibration procedure for improving sensitivity of the sensor system when installed. The calibration procedure may further include measuring a signal of a light source configured in different positions in a space where the sensor unit is configured. The interpretation of the image may require calibration input and/or machine learning algorithms (like neural networks).


The space may be an indoor space or an outdoor space. The term “space” may for instance relate to (a part of) a hospitality area, such as a restaurant, a hotel, a clinic, or a hospital, etc. The term “space” may also relate to (a part of) an office, a department store, a warehouse, a cinema, a church, a theatre, a library, etc. However, the term “space” may also relate to (a part of) a working space in a vehicle, such as a cabin of a truck, a cabin of an air plane, a cabin of a vessel (ship), a cabin of a car, a cabin of a crane, a cabin of an engineering vehicle like a tractor, etc. The term “space” may also relate to (a part of) a working space, such as an office, a (production) plant, a power plant (like a nuclear power plant, a gas power plant, a coal power plant, etc.), etc. For instance, the term “space” may also relate to a control room, a security room, etc.


As already indicated above, the sensor unit may be implemented in a lighting device or lighting system. Lighting devices are necessary in many cases, such as in indoor situations. Hence, the infrastructure of the lighting may be used to arrange the sensor unit at a location where electricity is anyhow available and which location may also be chosen strategically (to provide a good illumination).


Hence, in yet a further aspect the invention provides a lighting system comprising a lighting device configured to provide lighting device light, the lighting system further comprising the sensor unit as defined herein. The sensor unit may be separate from the lighting device. However, in specific embodiments the sensor unit may be comprised by the lighting device. Moreover, as the lighting device may include a light exit window (light transmissive lighting device window) that may also be used as light transmissive sensor window, the sensor unit may especially be comprised by the lighting device. Hence, in embodiments the lighting device further comprises a light exit window, wherein said sensor unit is configured upstream of said light exit window (and thus sensing may be done through the light exit window and the light transmissive sensor window). Even more especially, the light exit window comprises said light transmissive sensor window (and thus sensing may be done through the light exit window which is or comprises the light transmissive sensor window).


For instance, the sensor unit may be functionally coupled with a control system (see also above). Hence, in embodiments the lighting system further comprises a control system functionally coupled with said sensor unit, wherein the lighting system is further configured to sense (e.g. motion) as function of radiation from external of said lighting device received by said sensor array through said light transmissive sensor window. Further, the lighting device may (also) be functionally coupled with the control system. Hence, the control system may also control the light generated by the lighting device. Here, the term “control” and similar terms in combination with the lighting device and the control system may especially refer to determining the behavior or supervising the running of the lighting device, more especially the lighting device light generated thereby.


Further, in yet an aspect the invention also provides a method of sensing, e.g. motion, with the sensor system or the lighting system as defined herein, the method comprising detecting radiation from external of said sensor unit or said lighting device with said sensor array and sensing, e.g. motion, with said control system as function of a sensor signal from said sensor array. Especially, the method may comprise sensing changes in radiation patterns on said sensor array. The method may amongst others be used for sensing one or more of motion, light, color, human presence, human behavior, etc. These may be sensed with said control system as function of the sensor signal from said sensor array.


Especially, the sensor system or the lighting system as defined herein may be used for sensing (in high resolution) motion while preventing face recognition.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the invention will now be described, by way of example only, with reference to the accompanying schematic drawings in which corresponding reference symbols indicate corresponding parts, and in which:



FIG. 1a-1b schematically depict a number of embodiments and variants;



FIG. 2 schematically depicts a number of embodiments and variants in more detail;



FIGS. 3a-3e schematically depict some (comparative) results;



FIGS. 4a-4c schematically depict some further variants and embodiments.





The schematic drawings are not necessarily to scale.


DETAILED DESCRIPTION OF THE EMBODIMENTS


FIG. 1a schematically depicts an embodiment of a sensor unit 100 comprising a sensor array 120 having at least 100×100 sensor pixels 125, though this is shown in a very simplified version with only 6 pixels by way of example. Further, an optional lens system 200 is schematically depicted. Also an optional light transmissive sensor window 300 comprising optical structures 320 is schematically depicted.


The light transmissive sensor window 300 is configured at a window distance d from said sensor array 120. The lens system 200 is configured to provide an object-image plane distance d1 selected from the range of 0.1*d-4*d, such as especially 0.9*d-1.1*d. Reference IP indicates the image plane, here coinciding with the 2D sensor array 120, and reference OP indicates the object plane, here by way of example essentially coinciding with the light transmissive sensor window 300. However, the object plane may also be upstream of this window 300 or downstream thereof (but relatively close to the sensor window 300).


The optical structures 320 are configured in a pattern, wherein the optical structures 320 have one or more dimensions selected from length L, width W and diameter D, and may have values selected from the range of 50 μm-20 mm. Further, neighboring optical structures 320 may have shortest distances d2 selected from the range of 0-50 mm. By way of example, the optical structures have the shape of pyramids 320b.



FIG. 1a indicates with the dashed line d1* the range of object-image plane distances that may be chosen: from an object plane very close to the lens system 200 (but still upstream of the optional window 300) up to about 50 cm away from the sensor array 120 (which may be downstream of the optional window 300). Reference d3 indicates the distance between the pixels 125 and the lens system. This distance may essentially be 0 mm, but may optionally be a few mm, such as in the range of 0.2-20 mm. The lens system may comprise a single lens or a plurality of lenses.



FIG. 1b very schematically depicts some embodiments and variants wherein the optical structures (320) are selected from the group of dots 320a (variants I, III), lines 320c (variants II, III), grooves 320d (variant III), lenses 320f (variant III), and facets 320g (variant III; pyramids in fact also comprise facets). The pitch between the optical structures is indicated with reference p. Variant III is a fictive variant, only used to show a number of different optical structures. Note that as this figure is a cross section, the dots may be dots or lines. Variant IV again shows other variants, such as e.g. cubic, tetrahedral or other types of structures that may have a (base) width W and a (base) length L. Note that L and W can be used for different dimensions; attributing L to one dimension may imply attributing W to another dimension.



FIG. 2 schematically also depicts a number of embodiments and variants. The near sighted camera is either integrated in a luminaire, or in a lamp that is to be used inside a luminaire. The camera is focused on the position of the exit window of the luminaire, which may be an optical layer containing prisms, pyramids, cones, lenslets, or a controlled diffuser (with limited scattering angle).


The camera may also sense the light from the luminaire itself, but that should be easy to filter out with a frequency filter. In an embodiment, the camera may be located in a separate compartment to avoid interference with light from the luminaire itself. Hence, in variant I the dashed line indicates the two options.
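One possible reading of the “frequency filter” mentioned above is a temporal high-pass that removes the (quasi-)static contribution of the luminaire's own light; the exponential-moving-average background and its alpha parameter in the sketch below are illustrative assumptions, not a prescribed implementation.

```python
# Hedged sketch: suppress the steady luminaire light with a temporal high-pass,
# here implemented as exponential-moving-average background subtraction.
# alpha is an illustrative smoothing constant.
import numpy as np

class HighPassBackground:
    def __init__(self, alpha=0.05):
        self.alpha = alpha
        self.background = None

    def filter(self, frame):
        frame = frame.astype(float)
        if self.background is None:
            self.background = frame.copy()  # warm-up with the first frame
        self.background += self.alpha * (frame - self.background)
        return frame - self.background  # static luminaire light cancels out
```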



FIG. 2 thus also shows embodiments and variants (I, II, and III) of a lighting system 500 comprising a lighting device 510 configured to provide lighting device light 511, the lighting system 500 further comprising the sensor unit 100. In fact, hereby also embodiments and variants are shown of a sensor system 1000 comprising the sensor unit 100 as described herein and a control system 400 functionally coupled with said sensor unit 100, wherein the sensor system 1000 is configured to sense e.g. motion as function of radiation received by said sensor array 120 through said light transmissive sensor window 300. The control system 400 may also be integrated in the lighting device 510. These embodiments and variants also show variants wherein the lighting device 510 further comprises a light exit window 520, wherein said sensor unit 100 is configured upstream of said light exit window 520. More precisely, the light exit window 520 comprises said light transmissive sensor window 300. Reference 560 indicates a light source, such as a solid state light source, a fluorescent lamp, etc.


FIG. 3a shows the image of a point source at 1 m distance in case there is no optical layer (such as a light transmissive window with optical structures configured in a pattern as described herein). The point is blurred a lot because the lens is focusing at 2 cm and not at 1 m. Still, movement of the point can be detected. The point source is at 1 m distance and 0.5 m off-axis.


When a diffuse optical layer is placed in between the point and the camera, the following images are observed. In the case of Gaussian scattering (scattering within a limited angular range), the point image is even more blurred than in the case without layer, but some movement may still be visible, see FIG. 3b.


When the optical layer has Lambertian scattering properties (scattering in all directions, independent of the incoming direction), the image contains no information at all and it does not change when the point moves. So the camera cannot be used for detecting movement behind a strong scattering optical layer (can still be used to detect light though), see FIG. 3c.


Often, luminaires contain optical layers containing lenslets (i.e. small lenses), pyramids, or cones for either hiding the source or shaping the beam and controlling the glare. In FIG. 3d, we show examples of the image of a point seen on such an optical layer for square pyramids. Similar results (but other patterns) may be obtained for so-called MLO (micro lens optics: small indented cones) structures. The clear optical structures create a point source response that contains a pattern of sharply delineated structures. These structures mask the details of the image, which is good for privacy, but they are very sensitive to movement, which is good for presence detection. The use of such images in presence detection with cameras is an essential part of our invention.


To test the movement sensitivity of various optical layouts (no layer vs diffuser vs clear optical layer), we calculated the “camera” image of a model face at 1 m distance. The model face is an emitting disk (20 cm diameter) facing the camera with three black areas to draw two eyes and a mouth.


Without optical layer, the face image is blurred to a disk of light, see below. If the face moves 5 cm, the image also slightly moves. So such a movement may be detectable, but the correlation between the two images is still high (0.9) because the images contain largely the same information (very much overlapping disks). Therefore, to detect such a movement against a background of other objects and with other sources of noise may not be feasible. When a Gaussian scattering layer is used, the images are blurred more. Surprisingly, the correlation between the reference image and the image of the 5 cm displaced face is slightly lower (0.83). When the face is moved 20 cm, the correlation becomes 0.63. If we now consider the case of the optical layer containing pyramids, we notice that the face image is completely unrecognizable, but it does still contain small features. Moving the face by 5 cm shifts those small features and the correlation with the reference image is only 0.64, i.e. comparable to the correlation of the 20 cm shifted image in case a Gaussian diffuser is used. The pyramid optics layer is therefore much more sensitive to small movements than a Gaussian diffuser layer (and also more sensitive than the situation without layer). We have found the best result for the MLO layer: a correlation of 0.49 after a 5 cm displacement.
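The correlation figures quoted above can be reproduced in kind (not in value) with a Pearson correlation between the flattened reference image and the displaced image; the sketch below shows that metric as our illustration, not the simulation actually used for the reported numbers.

```python
# Hedged sketch of the movement-sensitivity metric quoted above: Pearson
# correlation between a reference image and the image after displacement.
# Low correlation after a small shift means high movement sensitivity.
import numpy as np

def image_correlation(img_ref, img_shifted):
    a = img_ref.astype(float).ravel()
    b = img_shifted.astype(float).ravel()
    return float(np.corrcoef(a, b)[0, 1])

# E.g. compare a blurred-disk image against its 5 cm shifted counterpart;
# values near 0.9 indicate poor sensitivity, values near 0.5 good sensitivity.
```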


Depending on the type of implementation, the optical layer and therefore the point response of the imaging system may be known (e.g. in case it is implemented directly in a luminaire or as a standalone unit) or it may depend on the application (in case it is embedded in a lamp and we don't know the luminaire it will be placed in).


To improve the sensitivity of the sensor, the point response may be implemented in the software filters during manufacturing, or it may be obtained through a calibration procedure (e.g. measure the response in a dark room by using a laser pointer from different locations), or it may be obtained by a self-learning algorithm during operation.
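A hedged sketch of that calibration idea follows: record the point response for a probe light (such as the laser pointer mentioned above) at several known positions, then locate later activity by matching live frames against the stored responses. The structure and all names are our assumptions; the disclosure does not prescribe this form.

```python
# Illustrative sketch: template matching against calibrated point responses.
# point_responses maps a known position label to the frame recorded when the
# probe light was at that position; all names here are hypothetical.
import numpy as np

def calibrate(point_responses):
    """Normalize each recorded response so they can be compared fairly."""
    return {pos: r.astype(float) / np.linalg.norm(r)
            for pos, r in point_responses.items()}

def locate(frame, templates):
    """Return the calibrated position whose response best matches the frame."""
    f = frame.astype(float).ravel()
    scores = {pos: float(t.ravel() @ f) for pos, t in templates.items()}
    return max(scores, key=scores.get)
```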



FIGS. 3d and 3e show results of a sensor unit comprising a sensor array having at least 100×100 sensor pixels, a fixed focus lens system, and a light transmissive sensor window comprising optical structures, wherein the light transmissive sensor window is configured at a window distance from said sensor array, wherein the lens system is configured between the sensor array and said sensor window, and wherein an object plane and an image plane defined by said lens system and sensor array have an object-image plane distance selected from the range of 0.1*d-2*d, wherein the optical structures are configured in a pattern, wherein the optical structures have one or more dimensions selected from length, width and diameter selected from the range of 50 μm-20 mm, and wherein neighboring optical structures have shortest distances selected from the range of 0-50 mm.


In a further set of embodiments, the camera may be integrated inside a lamp, with the focus on the envelope of the lamp. This is schematically depicted in FIGS. 4a-4c. FIG. 4a shows the sensor unit integrated in a bulb, using controlled scattering or a partly transparent envelope. FIG. 4b shows the sensor unit also integrated in the bulb with optical structures in or on the bulb window, such as lenslets or facets. FIG. 4c schematically depicts a similar variant but now only part of the bulb window comprises the optical structures. Note that when the window is essentially fully covered with optical structures with mutual distances of neighboring structures essentially zero, the structures are transmissive for radiation, such as visible and/or infrared radiation, especially transparent.


In yet a further set of embodiments, the camera sensor may be used as a standalone sensor, connected to a lighting system (wireless or wired). The camera may be used without cover (but near-sighted to avoid face recognition) or with a cover, similar to the covers used in previous embodiments.


The term “substantially” herein, such as in “substantially all light” or in “substantially consists”, will be understood by the person skilled in the art. The term “substantially” may also include embodiments with “entirely”, “completely”, “all”, etc. Hence, in embodiments the adjective substantially may also be removed. Where applicable, the term “substantially” may also relate to 90% or higher, such as 95% or higher, especially 99% or higher, even more especially 99.5% or higher, including 100%. The term “comprise” includes also embodiments wherein the term “comprises” means “consists of”. The term “and/or” especially relates to one or more of the items mentioned before and after “and/or”. For instance, a phrase “item 1 and/or item 2” and similar phrases may relate to one or more of item 1 and item 2. The term “comprising” may in an embodiment refer to “consisting of” but may in another embodiment also refer to “containing at least the defined species and optionally one or more other species”.


Furthermore, the terms first, second, third and the like in the description and in the claims, are used for distinguishing between similar elements and not necessarily for describing a sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances and that the embodiments of the invention described herein are capable of operation in other sequences than described or illustrated herein.


The devices herein are amongst others described during operation. As will be clear to the person skilled in the art, the invention is not limited to methods of operation or devices in operation.


It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design many alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. Use of the verb “to comprise” and its conjugations does not exclude the presence of elements or steps other than those stated in a claim. Unless the context clearly requires otherwise, throughout the description and the claims, the words “comprise”, “comprising”, and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is to say, in the sense of “including, but not limited to”. The article “a” or “an” preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.


The invention further applies to a device comprising one or more of the characterizing features described in the description and/or shown in the attached drawings. The invention further pertains to a method or process comprising one or more of the characterizing features described in the description and/or shown in the attached drawings.


The various aspects discussed in this patent can be combined in order to provide additional advantages. Further, the person skilled in the art will understand that embodiments can be combined, and that also more than two embodiments can be combined. Furthermore, some of the features can form the basis for one or more divisional applications.

Claims
  • 1. A sensor unit comprising a sensor array having at least 100×100 sensor pixels, a fixed focus lens system, and a light transmissive sensor window comprising optical structures, wherein the light transmissive sensor window is configured at a window distance from said sensor array, wherein the lens system is configured between the sensor array and said sensor window, and wherein an object plane and an image plane defined by said lens system and sensor array have an object-image plane distance selected from the range of 0.1*d-2*d, wherein the optical structures are configured in a pattern, wherein the optical structures have one or more dimensions selected from length, width and diameter selected from the range of 50 μm-20 mm, and wherein neighboring optical structures have shortest distances selected from the range of 0-50 mm.
  • 2. The sensor unit according to claim 1, wherein the lens system is configured to provide said object-image plane distance selected from the range of 0.9*d-1.1*d, and wherein said image plane coincides with the sensor array.
  • 3. The sensor unit according to claim 1, wherein the optical structures are configured in a regular pattern, wherein the optical structures have one or more dimensions selected from length, width and diameter selected from the range of 100 μm-2 mm, wherein neighboring optical structures have shortest distances selected from the range of 0-2 mm, and wherein the sensor array has at least 400×400 sensor pixels.
  • 4. The sensor unit according to claim 1, wherein the optical structures comprise lamellae having shortest distances selected from the range of 5-20 mm.
  • 5. The sensor unit according to claim 1, wherein the sensor array comprises one or more of a CCD and a CMOS, and wherein the light transmissive sensor window comprises a light transmissive layer comprising a micro lens array.
  • 6. A sensor system comprising the sensor unit according to claim 1 and a control system functionally coupled with said sensor unit, wherein the sensor system is configured to sense as function of radiation received by said sensor array through said light transmissive sensor window.
  • 7. The sensor system according to claim 6, wherein the control system comprises one or more of a self-learning algorithm and a calibration procedure for improving sensitivity of the sensor system when installed.
  • 8. A lighting system comprising a lighting device configured to provide lighting device light, the lighting system further comprising the sensor unit according to claim 1.
  • 9. The lighting system according to claim 8, wherein the lighting device further comprises a light exit window wherein said sensor unit is configured upstream of said light exit window.
  • 10. The lighting system according to claim 9, wherein said light exit window comprises said light transmissive sensor window.
  • 11. The lighting system according to claim 8, wherein the lighting system further comprises a control system functionally coupled with said sensor unit, wherein the lighting system is further configured to sense as function of radiation from external of said lighting device received by said sensor array through said light transmissive sensor window.
  • 12. A method of sensing with the sensor system according to claim 7, the method comprising detecting radiation from external of said sensor unit or said lighting device with said sensor array and sensing one or more of motion, light, color, human presence, human behavior with said control system as function of a sensor signal from said sensor array.
  • 13. The method according to claim 12, wherein the method comprises sensing changes in radiation patterns on said sensor array.
  • 14. Use of the sensor system according to claim 6, for sensing motion while preventing face recognition.
Priority Claims (1)
  • Number: 17150181.0 · Date: Jan 2017 · Country: EP · Kind: regional
PCT Information
  • Filing Document: PCT/EP2017/082884 · Filing Date: 12/14/2017 · Country: WO · Kind: 00