The present disclosure relates to the field of surveillance cameras for safety and security applications. A surveillance apparatus, having an optical camera and an additional radar sensor, and a corresponding surveillance method are disclosed. Application scenarios include burglar, theft or intruder alarm as well as monitoring public and private areas.
Optical surveillance cameras are used in many public places, such as train stations, stadiums, supermarkets and airports, to prevent crimes or to identify criminals after they have committed a crime. Optical surveillance cameras are also widely used in retail stores for video surveillance. Other important applications are safety-related, including the monitoring of hallways, doors, entrance areas and exits, for example emergency exits.
While optical surveillance cameras show very good performance under regular operating conditions, these systems are prone to visual impairments. In particular, the images of optical surveillance cameras are impaired by smoke, dust, fog, fire and the like. Furthermore, a sufficient amount of ambient light or an additional artificial light source is required, for example at night.
An optical surveillance camera is also vulnerable to attacks on the optical system, for example paint from a spray attack, stickers glued to the optical system, cardboard or paper obstructing the field of view, or simply a photograph that pretends that the expected scene is being monitored. Furthermore, the optical system can be attacked with laser pointers, by blinding the camera, or by mechanically repositioning the optical system.
In addition to imaging a scene, it can be advantageous to obtain information about the distance to an object or the position of an object or person in the monitored scene. A three-dimensional image of a scene can be obtained, for example, with a stereoscopic camera system. However, this requires proper calibration of the optical surveillance cameras, which is complex, time-consuming, and expensive. Furthermore, a stereoscopic camera system is typically significantly larger and more expensive than a monocular, single-camera setup.
In a completely different technological field, automotive driver assistance systems, US 2011/0163904 A1 discloses an integrated radar-camera sensor for enhanced vehicle safety. The radar sensor and the camera are rigidly fixed with respect to each other and have a substantially identical, limited field of view.
The “background” description provided herein is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventor(s), to the extent it is described in this background section, as well as aspects of the description which may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present disclosure.
It is an object of the present disclosure to provide a surveillance apparatus and a corresponding surveillance method which overcome the above-mentioned drawbacks. It is a further object to provide a corresponding computer program and a non-transitory computer-readable recording medium for implementing said method. In particular, it is an object to expand the surveillance capabilities to measurement scenarios where a purely optical camera fails and to efficiently and flexibly monitor a desired field of view.
According to an aspect of the present disclosure there is provided a surveillance apparatus comprising
wherein said first field of view is variable with respect to said second field of view.
According to a further aspect of the present disclosure there is provided a corresponding surveillance method comprising the steps of
According to a further aspect of the present disclosure there is provided a surveillance apparatus comprising
wherein said second field of view differs from said first field of view.
According to a further aspect of the present disclosure there is provided a surveillance radar apparatus for retrofitting an optical surveillance camera, said surveillance radar apparatus comprising
wherein said first field of view is variable with respect to said second field of view.
According to still further aspects, there are provided a computer program comprising program means for causing a computer to carry out the steps of the method disclosed herein when said computer program is carried out on a computer, as well as a non-transitory computer-readable recording medium that stores therein a computer program product which, when executed by a processor, causes the method disclosed herein to be performed.
Preferred embodiments are defined in the dependent claims. It shall be understood that the claimed surveillance radar apparatus for retrofitting a surveillance camera, the claimed surveillance method, the claimed computer program and the claimed computer-readable recording medium have similar and/or identical preferred embodiments as the claimed surveillance apparatus and as defined in the dependent claims.
The present disclosure is based on the idea of providing additional sensing means, i.e., a radar sensor, that complements surveillance with an optical camera. A radar sensor can work in certain scenarios where an optical sensor has difficulties, such as adverse weather or visual conditions, for example snowfall, fog, smoke, a sandstorm, heavy rain, poor illumination or darkness. Moreover, a radar sensor can still operate after vandalism to the optical system. Synergy effects are provided by jointly evaluating the images captured by the (high-resolution) optical camera and the electromagnetic radiation received by the radar sensor.
The field of view of an optical camera that captures images based on received light is typically limited to a confined angular range. Attempts to widen the field of view of an optical camera exist, for example in the form of a fish-eye lens. While such optical elements significantly broaden the field of view of the optical camera, they also create a significantly distorted image of the observed scene. This makes image analysis difficult for an operator who monitors the images captured by the surveillance camera, if no additional correction and post-processing is applied.
The surveillance apparatus according to the present disclosure uses a different approach by combining an optical camera that captures images based on received light with a radar sensor that emits and receives electromagnetic radiation. The optical camera has a first field of view and the radar sensor has a second field of view. The first field of view is variable with respect to the second field of view. Alternatively, the second field of view differs from the first field of view. For example, the first field of view of the optical camera covers an angular range of about 50-80° to avoid substantial image distortions, whereas the second field of view of the radar sensor covers an angular range of at least 90°, preferably 180°, or even a full 360°. Thus, the field of view of the radar sensor is larger than that of the optical camera, so the radar sensor monitors a wider area. However, the information gained from the radar sensor alone is often not sufficient for surveillance applications, since a high-resolution optical image is frequently desired. Therefore, the field of view of the optical camera is variable with respect to the field of view of the radar sensor. In particular, the size and/or orientation of the first field of view are variable with respect to the second field of view. For example, an object can be identified with the radar sensor and the field of view of the optical camera adjusted to cover said object. This is particularly beneficial if an object that is initially not covered by the field of view of the optical camera is detected in the field of view of the radar sensor.
The foregoing paragraphs have been provided by way of general introduction, and are not intended to limit the scope of the following claims. The described embodiments, together with further advantages, will be best understood by reference to the following detailed description taken in conjunction with the accompanying drawings.
A more complete appreciation of the disclosure and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
Referring now to the drawings, wherein like reference numerals designate identical or corresponding parts throughout the several views,
The optical camera 101 of the surveillance apparatus 100 optionally features a light source for illuminating a region of interest in front of the camera. In this example, the camera 101 comprises a ring of infrared (IR) light emitting diodes (LEDs) 106 for illuminating the region of interest with non-visible light. To a certain extent, this enables unrecognized surveillance and surveillance in darkness over a limited distance.
Further optionally, the surveillance apparatus 100 comprises an actor 107 for moving the camera 101. By moving the camera, a larger area can be monitored. However, the movement speed is limited, and different areas cannot be monitored at the same time but have to be monitored sequentially.
The field of view 118 of the optical camera 111 defines the region that is covered and thus imaged by the optical camera 111. In order to increase the area that can be monitored with the surveillance apparatus 110, the surveillance apparatus 110 can further comprise a first actor and a second actor to pan 119a and tilt 119b the optical camera 111.
The camera 301 is arranged at the center of the housing, for example, a dome-type camera as discussed with reference to
In a first configuration, the field of view 308a of the optical camera 301 corresponds to the portion of the field of view of the radar sensor that is covered by the antenna element 304a. Even if the view of the optical camera 301 is obscured by smoke, the radar sensor can still detect the object 306a, since the frequency spectrum used for the electromagnetic radiation of the radar sensor penetrates through smoke. For example with reference to the application scenario in
Using a radar sensor employing a frequency-modulated continuous wave (FMCW) or stepped CW modulation scheme allows ranging and relative speed detection. Other measurement schemes, such as pulsed radar, can be used as alternatives. In principle, a single antenna is sufficient for ranging, such that in the most basic configuration a single antenna 304a can be used. Thus, the range and speed of the target 306a can be determined.
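The FMCW ranging principle mentioned above can be sketched with a short numeric example; all parameters below (24 GHz carrier, 250 MHz sweep, 1 ms chirp, the measured frequencies) are illustrative assumptions and not values from the disclosure.

```python
# Illustrative sketch of FMCW ranging and Doppler speed estimation.
# All numeric parameters are assumed example values.

C = 3e8  # speed of light in m/s

def fmcw_range(beat_hz, bandwidth_hz, sweep_s):
    """Range from the beat frequency of a linear FMCW chirp:
    R = c * f_b * T / (2 * B)."""
    return C * beat_hz * sweep_s / (2.0 * bandwidth_hz)

def doppler_speed(doppler_hz, carrier_hz):
    """Radial speed from the Doppler shift: v = f_d * lambda / 2."""
    return doppler_hz * (C / carrier_hz) / 2.0

# A 20 kHz beat tone with a 250 MHz sweep in 1 ms corresponds to 12 m;
# a 160 Hz Doppler shift at 24 GHz corresponds to 1 m/s radial speed.
r = fmcw_range(20e3, 250e6, 1e-3)   # -> 12.0 m
v = doppler_speed(160.0, 24e9)      # -> 1.0 m/s
```

A pulsed radar, mentioned as an alternative, would instead derive the range from the round-trip time of a pulse.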
The field of view of the radar sensor that emits and receives electromagnetic radiation comprises the field of view of the individual antenna elements 304a-304f. In the configuration shown in
In a further scenario, the optical camera 301 is oriented to cover the field of view 308a with the object 306a. The radar sensor covering the entire 360° field of view detects an object 306b in the sector of antenna element 304b. The surveillance apparatus 300 can comprise a control unit 307 as part of the radar front end system 305 (as shown in
Advantageously, this control of the optical camera 301 can be automated, such that a single optical camera 301 having a limited field of view 308a, 308b can be used to cover an extended area, in this example the entire surroundings of the surveillance apparatus. Furthermore, the system cost can be lowered by combining the radar functionality for coarse monitoring of an entire area with selective high-resolution monitoring of only limited parts of the area. The high-resolution monitoring is triggered if an object has been detected by the radar sensor.
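The automated camera control described above can be sketched as a simple sector lookup. The six 60° sectors mirror the antenna elements 304a-304f, while the function names and the mapping itself are assumptions made for illustration only.

```python
# Hypothetical sketch of radar-triggered camera steering: a detection
# in one of six 60-degree antenna sectors yields a pan command that
# centers the camera on that sector. Sector layout and names are
# illustrative assumptions, not taken from the disclosure.

NUM_SECTORS = 6
SECTOR_WIDTH = 360.0 / NUM_SECTORS  # 60 degrees per antenna element

def sector_of(azimuth_deg):
    """Map a detection azimuth (degrees) to an antenna sector index."""
    return int(azimuth_deg % 360.0 // SECTOR_WIDTH)

def pan_target(sector):
    """Pan angle that centers the camera's field of view on a sector."""
    return sector * SECTOR_WIDTH + SECTOR_WIDTH / 2.0

# A radar detection at 95 degrees falls into sector 1 (60-120 degrees),
# so the camera is panned to 90 degrees, the center of that sector.
pan = pan_target(sector_of(95.0))   # -> 90.0
```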
The housing 303 accommodates the electronics of the surveillance apparatus 300. In
The range and/or direction of the object 406b can be determined by use of the generally known principles of interferometry or phase monopulse. The principle of phase monopulse is sketched in
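The phase monopulse relation can be made concrete with a short numeric sketch; the antenna spacing and the 24 GHz wavelength below are assumed example values, since the disclosure does not fix them.

```python
# Sketch of direction finding by phase monopulse: two antennas spaced
# d apart see a path-length difference d*sin(theta), so the measured
# phase difference yields the direction of arrival. Values are
# illustrative assumptions.

import math

def monopulse_angle(delta_phase_rad, spacing_m, wavelength_m):
    """Direction of arrival in radians:
    theta = arcsin(lambda * dphi / (2 * pi * d)).
    The estimate is unambiguous only while d <= lambda / 2."""
    return math.asin(wavelength_m * delta_phase_rad /
                     (2.0 * math.pi * spacing_m))

# Half-wavelength spacing and a 90 degree phase difference correspond
# to a direction of arrival of about 30 degrees off boresight.
wavelength = 0.0125  # assumed 24 GHz radar
theta = monopulse_angle(math.pi / 2, wavelength / 2, wavelength)
```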
An alternative approach for determining the direction to an object is described with reference to
In the case of +/−180° scanning, a flexible cable interconnect can be used between the static housing 503 and the movable part 510 including the antenna element 504. For a continuously scanning system, a rotary joint is required that may optionally comprise a filter for radio frequency (RF) signals, DC signals, intermediate frequency (IF) signals, and the like. Alternatively, multiple slip rings can be employed to provide a connection between the static housing 503 and the moving parts 510.
The processing circuitry 512 identifies a first object 506a and a second object 506b in the field of view 508a of the optical camera 501. For example, the processing circuitry performs image analysis on the captured image and identifies two dark spots as objects 506a and 506b. More advanced image processing algorithms can of course be employed that identify the outline of a person in both objects 506a and 506b. In addition to this result from the optical analysis, information acquired using the radar sensor with narrow beam antenna 504 can be used.
For example, the distances corresponding to the directions of objects 506a and 506b are evaluated. In the optical image, a person and their shadow may be falsely identified as two persons. However, using the information from the radar sensor, it can be clearly determined whether there are actually two persons or one person (a short distance is measured) and their shadow. In the case of a shadow, the distance measured with the radar sensor does not correspond to the distance of the object expected from the image captured by the optical camera. This use case is very important for counting people, for example to ensure that all children have left an amusement park, that all customers have left a shop, or that everybody has left a danger zone.
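The plausibility check described above can be sketched as follows; the ranges, the tolerance, and the helper name are illustrative assumptions, not part of the disclosure.

```python
# Hedged sketch of distinguishing a person from their shadow: for each
# object found in the optical image, the range expected from its image
# position is compared with the range the radar measures in the same
# direction. A shadow produces no matching radar return.

def is_real_object(expected_range_m, radar_range_m, tolerance_m=1.0):
    """Confirm an optical detection if the radar measures a matching
    range in the same direction; None means no radar return."""
    if radar_range_m is None:
        return False
    return abs(radar_range_m - expected_range_m) <= tolerance_m

# Two optical detections, e.g. a person expected at 8 m (radar confirms
# 8.3 m) and a shadow for which the radar reports no return:
detections = [(8.0, 8.3), (8.0, None)]  # (expected, radar-measured)
count = sum(is_real_object(e, r) for e, r in detections)  # -> 1 person
```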
The surveillance apparatus 600 in
According to a further aspect of the disclosure, the beam forming, for example digital beam forming with MIMO antenna elements, can be used to generate different beam forms. For example, a wide antenna beam similar to
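As a non-authoritative illustration of the wide-versus-narrow beam trade-off discussed above, the array factor of a uniform linear array shows how driving more elements narrows the beam while raising the gain; the element counts and half-wavelength spacing are assumptions.

```python
# Illustrative sketch (not from the disclosure): the array factor of a
# uniform linear array. Driving all elements with a progressive phase
# shift yields a narrow, steerable beam; driving few elements yields a
# wide beam, mirroring the different beam forms discussed above.

import cmath, math

def array_factor(n_elements, spacing_wavelengths, steer_deg, look_deg):
    """Magnitude of the array factor of an n-element uniform linear
    array steered to steer_deg, evaluated at look_deg."""
    psi = 2.0 * math.pi * spacing_wavelengths * (
        math.sin(math.radians(look_deg)) -
        math.sin(math.radians(steer_deg)))
    return abs(sum(cmath.exp(1j * n * psi) for n in range(n_elements)))

# At the steering angle the response equals the element count, so the
# 8-element beam has 4x the peak amplitude of the 2-element beam and a
# correspondingly narrower main lobe.
narrow = array_factor(8, 0.5, 0.0, 0.0)  # -> 8.0
wide = array_factor(2, 0.5, 0.0, 0.0)    # -> 2.0
```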
The previous embodiments have illustrated scanning an antenna beam in one direction, i.e. in the azimuth plane. In order to monitor a room in three dimensions, however, the radar sensor can scan in the elevation plane in addition to the azimuth plane.
The azimuth and the elevation can be monitored with a mechanical scanning radar system, a hybrid mechanical/electronic scanning radar system, or a purely electronic scanning radar system.
In the example shown in
In an alternative embodiment, the outline of the surveillance apparatus is a polygonal shape. The two-dimensional antenna arrays can then be implemented, for example, as patch antenna arrays on individual printed circuit boards placed at the sides of the polygonal shape. This reduces fabrication costs.
A further aspect of the present disclosure relates to retrofitting an optical surveillance camera, as for example shown in
Optionally, the surveillance radar apparatus includes further functionalities, such as a converter for converting analog video signals of an existing analog optical camera to digital video signals, for example for connecting the existing analog optical camera via the surveillance radar apparatus to an IP network.
To ensure proper alignment of the optical surveillance camera 901 and the surveillance radar apparatus 900, the housing 902 of the surveillance radar apparatus 900 further comprises an alignment member 921 for aligning a position of the surveillance radar apparatus 900 with respect to the surveillance camera 901. For this purpose, the housing 912 of the surveillance camera 901 comprises a second alignment member 922 for engagement with the alignment member 921 of the housing of the surveillance radar apparatus 900. In this embodiment, the second alignment member 922 of the camera housing 912 is a type of slot or groove into which a tapped structure 921 of the housing 902 of the surveillance radar apparatus 900 fits. Of course, this form fit can also be implemented the other way around. Other alignment structures, or multiple such structures, can also be provided.
A conventional camera cover usually only comprises one translucent layer, for example a translucent dome made from glass or a transparent polymer. Optionally, the camera cover comprises an anti-reflective coating, a tinting, or a one-way mirror, in order to obscure the direction the camera is pointing at.
According to an embodiment of the translucent antenna, the patch antennas 1132 make up a conformal patch antenna array. The array can cover the entire hemispherical camera cover and can consist of multiple arrays of patch antenna elements that are arranged for observing different sectors. Alternatively, the individual patch antenna elements can be controlled individually to form a hemispherical phased antenna array. A corresponding feeding network for routing to the radar circuitry 1139 and feeding the individual patch antenna elements is then provided, with corresponding individual microstrip feed lines 1138 and power dividers for individually feeding the antenna elements. The same holds true for the receiving path.
In this embodiment, the conductive layers of the translucent antenna are preferably implemented by electrically conductive ITO (indium tin oxide) layers 1240. As a further alternative, the conductive layers of the translucent antenna elements comprise AgHT (silver-coated polyester film). Alternatively, printed patch antennas approximated by wire meshes can be used. This approach does not require any special type of material; standard metallic conductors such as copper, gold, chrome, etc. can be employed. By perforating large metal areas of the antenna, a high optical transparency can be achieved. In a wire mesh, the metal grid is typically spaced by 0.01 to 0.1 lambda (i.e., 0.01 to 0.1 times the wavelength used). The thickness of the metal strips can be as small as 0.01 lambda.
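The quoted mesh dimensions can be turned into absolute numbers for an assumed operating frequency; the 24 GHz figure below is an assumption, as the disclosure does not fix a frequency.

```python
# Back-of-the-envelope evaluation of the wire-mesh dimensions quoted
# above (grid pitch 0.01-0.1 lambda, strip width about 0.01 lambda)
# for an assumed 24 GHz radar frequency.

C = 3e8                       # speed of light in m/s
lam = C / 24e9                # wavelength: 12.5 mm

pitch_min = 0.01 * lam        # 0.125 mm
pitch_max = 0.1 * lam         # 1.25 mm
strip = 0.01 * lam            # 0.125 mm

# Open-area fraction of a square mesh with pitch p and strip width w:
# (1 - w/p)^2. At the coarsest pitch the mesh is ~81% optically open.
open_fraction = (1.0 - strip / pitch_max) ** 2   # -> 0.81
```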
The conductive layers 1240 are separated by dielectric layers made from glass or, alternatively, a translucent polymer that is not electrically conductive but can serve as a dielectric. Of course, the translucent antenna can be implemented using different layer structures; however, the layer structure preferably comprises a first electrically conductive layer comprising a ground plane and a second electrically conductive layer comprising an antenna element.
For example, the base of the surveillance apparatus 1000 comprises radar circuitry, in particular, a printed circuit board (PCB) 1250 further comprising a ground plane 1251 and a microstrip line 1252. The microstrip line 1252 feeds the patch antenna elements 1232 via the shown structure. The ground plane 1251 further comprises a slot 1254 for coupling a signal from the microstrip line 1252 of the PCB to the microstrip line 1253 which connects the printed circuit board 1250 with the translucent antenna cover 1215 comprising the patch antenna elements 1232. The patch antenna element 1232 is fed by the microstrip line 1253 via further slots 1255 in the ground plane 1256 which is at least electrically connected to the ground plane 1251. In other words, an interconnection between the printed circuit board of the radar circuitry and the microstrip feed lines 1253, 1138 of the translucent camera cover 1215 is realized by a coupling structure which interconnects a microstrip line 1252 on the printed circuit board with a microstrip line 1253 on the translucent camera dome.
Thus, the foregoing discussion discloses and describes merely exemplary embodiments of the present disclosure. As will be understood by those skilled in the art, the present disclosure may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. Accordingly, the present disclosure is intended to be illustrative, but not limiting of the scope of the disclosure, as well as of other claims. The disclosure, including any readily discernible variants of the teachings herein, defines, in part, the scope of the foregoing claim terminology such that no inventive subject matter is dedicated to the public.
In the claims, the word “comprising” does not exclude other elements or steps, and the indefinite article “a” or “an” does not exclude a plurality. A single element or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
In so far as embodiments of the disclosure have been described as being implemented, at least in part, by software-controlled data processing apparatus, it will be appreciated that a non-transitory machine-readable medium carrying such software, such as an optical disk, a magnetic disk, semiconductor memory or the like, is also considered to represent an embodiment of the present disclosure. Further, such software may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems, or implemented in fixed-wired logic, for example an ASIC (application-specific integrated circuit) or FPGA (field-programmable gate array).
It follows a list of further embodiments of the disclosed subject matter:
1. A surveillance apparatus comprising
2. The surveillance apparatus according to embodiment 1,
wherein size and/or orientation of said first field of view are variable with respect to said second field of view.
3. The surveillance apparatus according to any preceding embodiment,
wherein said optical camera is movable with respect to the radar sensor.
4. The surveillance apparatus according to any preceding embodiment,
further comprising a control unit that controls the optical camera based on radar information obtained with the radar sensor.
5. The surveillance apparatus according to any preceding embodiment,
wherein the optical camera further comprises a translucent camera cover.
6. The surveillance apparatus according to embodiment 5,
wherein the camera cover comprises a substantially hemispheric camera dome.
7. The surveillance apparatus according to any preceding embodiment,
having a polygonal, cylindrical or circular outline.
8. The surveillance apparatus according to any preceding embodiment,
wherein the radar sensor comprises an antenna element arranged on the periphery of the surveillance apparatus.
9. The surveillance apparatus according to any preceding embodiment,
wherein the radar sensor is adapted to provide at least one of a direction, range and speed of an object relative to the surveillance apparatus.
10. The surveillance apparatus according to embodiment 5,
wherein the camera cover further comprises a translucent antenna.
11. The surveillance apparatus according to embodiment 10,
wherein the translucent antenna comprises an electrically conductive layer comprising at least one of a translucent electrically conductive material and an electrically conductive mesh structure.
12. The surveillance apparatus according to embodiment 11,
wherein a first electrically conductive layer comprises a ground plane and a second electrically conductive layer comprises an antenna element.
13. The surveillance apparatus according to embodiment 12,
wherein the ground plane comprises a slot for feeding the antenna element.
14. The surveillance apparatus according to embodiment 11, 12 or 13,
wherein the camera cover comprises at least one dielectric layer and two electrically conductive layers.
15. The surveillance apparatus according to embodiment 14,
wherein said dielectric layer is made from at least one of glass or a translucent polymer.
16. The surveillance apparatus according to any one of embodiments 10 to 15,
further comprising a feed structure comprising a microstrip feed line.
17. The surveillance apparatus according to any preceding embodiment,
further comprising processing circuitry that processes the captured images of the optical camera and the received electromagnetic radiation of the radar sensor and provides an indication of the detection of the presence of one or more objects.
18. The surveillance apparatus according to embodiment 17,
wherein the processing circuitry verifies the detection of an object in the captured images of the optical camera or in the received electromagnetic radiation of the radar sensor based on the received electromagnetic radiation of the radar sensor or the captured images of the optical camera, respectively.
19. The surveillance apparatus according to embodiment 18,
wherein the processing circuitry provides an indication of whether two persons identified in the captured images of the optical camera are actually two persons or one person and their shadow by evaluating distance information to the two identified persons based on the received electromagnetic radiation of the radar sensor.
20. A surveillance apparatus comprising
21. The surveillance apparatus according to embodiment 20,
wherein the second field of view is larger than the first field of view.
22. The surveillance apparatus according to embodiment 20 or 21,
wherein the second field of view covers an angular range of at least 90°.
23. A surveillance radar apparatus for retrofitting an optical surveillance camera, having a first field of view, comprising
24. The surveillance radar apparatus according to embodiment 23,
wherein the housing of the surveillance radar apparatus encompasses the surveillance camera.
25. The surveillance radar apparatus according to embodiment 23,
wherein said housing of the surveillance radar apparatus further comprises an alignment member for aligning a position of the surveillance radar apparatus with respect to the surveillance camera.
26. A surveillance method comprising the steps of
27. A computer program comprising program code means for causing a computer to perform the steps of said method as claimed in embodiment 26 when said computer program is carried out on a computer.
28. A non-transitory computer-readable recording medium that stores therein a computer program product, which, when executed by a processor, causes the method according to embodiment 26 to be performed.
Number | Date | Country | Kind |
---|---|---|---|
13169006 | May 2013 | EP | regional |
The present application is a continuation of U.S. application Ser. No. 14/889,081, filed Nov. 4, 2015 which is a National Stage Application based on PCT/EP2014/058755, filed Apr. 29, 2014, and claims priority to European Patent Application 13169006.7, filed in the European Patent Office on May 23, 2013, the entire contents of each of which being incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
6484619 | Thomas et al. | Nov 2002 | B1 |
20060033674 | Essig, Jr. | Feb 2006 | A1 |
20060139162 | Flynn | Jun 2006 | A1 |
20060238617 | Tamir | Oct 2006 | A1 |
20060244826 | Chew | Nov 2006 | A1 |
20090167862 | Jentoft | Jul 2009 | A1 |
20100182434 | Koch | Jul 2010 | A1 |
20110037640 | Schmidlin | Feb 2011 | A1 |
20110163904 | Alland et al. | Jul 2011 | A1 |
20120080944 | Recker | Apr 2012 | A1 |
20120092499 | Klar et al. | Apr 2012 | A1 |
20130093615 | Jeon | Apr 2013 | A1 |
20130093744 | Van Lier | Apr 2013 | A1 |
20140204215 | Kriel | Jul 2014 | A1 |
20140368373 | Crain | Dec 2014 | A1 |
Number | Date | Country |
---|---|---|
2 284 568 | Feb 2011 | EP |
2006074161 | Jul 2006 | WO |
2010042483 | Apr 2010 | WO |
Entry |
---|
European Communication Pursuant to Article 94(3) EPC dated Sep. 11, 2018 in European Application No. 14722155.0-1206. |
“Gyrocam Systems” Lockheed Martin, Feb. 8, 2013 http://www.lockheedmartin.com/us/products/gyrocam.html, (total 2 pages). |
“MicroCoMPASS Micro Compact Multi-purpose Advanced Stabilized System—Airborne” Elbit Systems Electro-Optics—ELOP, 2009, www.elbitsystems.com/elop, (total 2 pages). |
“IllumiNITE Evolutionary Visual Extension of the Eye” Ferranti Technologies, www.ferranti-technoloqies.co.uk, (total 2 pages). |
Katsutoshi Ochiai, et al. “Development of the Laser Radar Surveillance System Technology at Long-distances with High-resolution Under Inclement Weather” Mitsubishi Heavy Industries, Ltd. Technical Review, vol. 42, No. 5, Dec. 2005, pp. 1-4. |
International Search Report and Written Opinion dated Oct. 8, 2014 for PCT/EP2014/058755 filed on Apr. 29, 2014. |
Number | Date | Country | |
---|---|---|---|
20190096205 A1 | Mar 2019 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 14889081 | US | |
Child | 16199604 | US |