This patent application relates generally to remote sensing display systems, and more specifically, to remote sensing security and communication systems that include dual-purpose visible and infrared (IR) based camera systems to monitor premises and generate alerts in case of emergencies.
Video surveillance systems use one or more cameras to monitor indoor premises and/or outdoor spaces to detect various activities. These may range from package deliveries to intruders. With burgeoning advancements in data communications, camera and sensor technologies, and augmented reality (AR), virtual reality (VR), and mixed reality (MR) devices, a more robust and comprehensive video surveillance system may be provided.
Features of the present disclosure are illustrated by way of example and not limited in the following figures, in which like numerals indicate like elements. One skilled in the art will readily recognize from the following that alternative examples of the structures and methods illustrated in the figures can be employed without departing from the principles described herein.
For simplicity and illustrative purposes, the present application is described by referring mainly to examples thereof. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. It will be readily apparent, however, that the present application may be practiced without limitation to these specific details. In other instances, some methods and structures readily understood by one of ordinary skill in the art have not been described in detail so as not to unnecessarily obscure the present application. As used herein, the terms “a” and “an” are intended to denote at least one of a particular element, the term “includes” means includes but not limited to, the term “including” means including but not limited to, and the term “based on” means based at least in part on.
Surveillance systems may generally include two types of video cameras: analog cameras, such as those that may be used in closed-circuit TV (CCTV) systems, or digital cameras used in conjunction with internet protocol (IP) networks. Different types of video surveillance services, including Video Surveillance as a Service (VSaaS) and hybrid-hosted solutions, may be offered by different providers. While some solutions may include installation of the video equipment and the surveillance occurring at the same site, many modern services may offer remotely monitored video surveillance, which may also be referred to as "network video surveillance," a term used to describe a setup wherein a physical location is monitored remotely from another geographical location. Again, different types of video cameras may also be employed for different types of surveillance. Some video cameras may record continuous video while other types of video cameras may record time-lapse footage upon movements detected by motion sensors.
The systems and methods described herein may be directed to a remote sensing security system for remotely monitoring a location or premises. In some examples, the remote sensing security system may include a dual-purpose camera system that has at least one dual-purpose camera with a visible light sensor and/or an infrared (IR) sensor. The visible light sensor may detect objects and/or movements in the visible spectrum whereas the IR sensor may detect objects and/or movements in the IR spectrum.
In some examples, the remote sensing security system may also include a server, such as a cloud service. The data from the dual-purpose camera system, for example, may be provided to a cloud server for analysis and/or detection of any emergency conditions at the monitored location or premises. In an example, the cloud server may be located in a geographic location remote from the monitored premises. In some examples, the cloud server may include machine learning (ML) based object detection models, which, when used in conjunction with one or more computer vision techniques, may detect various objects, such as humans, animals, nonliving objects, and/or conditions that may be indicative of an emergency on the premises. If no objects, movements, and/or conditions are detected, then the cloud server may determine that there is no emergency at the premises.
In an example, the data from the dual-purpose camera system may be analyzed in one or more stages for a confirmed identification of the type of emergency. For example, the data from the visible light sensor may be initially analyzed for object identification and/or for determining the type of emergency. In the event that objects identified by the system in the visible spectrum data are indicative of a potential fire emergency, the data from the IR sensor, which may sense or detect thermal signals, may be further analyzed for confirmation of the fire emergency.
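By way of illustration only, the two-stage analysis described above may be sketched as follows. The label set, the temperature threshold, and the function interface are illustrative assumptions and not part of any particular implementation of the disclosure.

```python
# Illustrative sketch of a two-stage emergency classification: stage 1
# inspects visible-spectrum detections; stage 2 consults IR (thermal)
# data only when the visible data indicates a potential fire.

POTENTIAL_FIRE_LABELS = {"fire", "smoke"}  # assumed label set

def classify_emergency(visible_labels, ir_max_temp_c=None,
                       fire_temp_threshold_c=150.0):
    """Return a confirmed emergency type, or None if no emergency.

    visible_labels: object labels detected in the visible-spectrum data.
    ir_max_temp_c: peak temperature inferred from the IR sensor, if any.
    """
    # Stage 1: visible-spectrum object identification.
    if not POTENTIAL_FIRE_LABELS & set(visible_labels):
        if "intruder" in visible_labels:
            return "intruder"          # non-fire emergency, no IR check
        return None                    # nothing indicative of an emergency
    # Stage 2: confirm the potential fire against IR (thermal) data.
    if ir_max_temp_c is not None and ir_max_temp_c >= fire_temp_threshold_c:
        return "fire"
    return "unconfirmed_fire"
```

The staged structure mirrors the description: IR data is consulted only after the visible-spectrum stage flags a potential fire.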
Once such an emergency is detected and identified, various actions for dealing with the fire emergency may be executed by the cloud server. For example, these actions may include transmitting one or more notifications to one or more client devices registered with the cloud server to receive notifications related to the particular premises. For fire emergencies, additional notifications, such as to the fire department, may also be transmitted. In some examples, the one or more client devices may be a mobile phone or an AR/VR device capable of providing the one or more notifications to a user in real-time or near real-time. Various other non-fire related emergencies, e.g., intruders, water leakage, etc., may also be detected and/or identified by the remote sensing security system as described herein. In such cases, the cloud server may be configured to execute one or more actions, such as transmitting one or more notifications to any number of registered client devices.
In some examples, the registered client devices may include, but are not limited to, mobile computers, tablets, phones, watches, or other similar portable devices capable of transmitting and receiving data signals. In some examples, the registered client device may also include a head-mounted display (HMD) device, such as augmented reality (AR) eyewear or glasses. If no objects are detected, the cloud server may determine that there is no emergency at the premises and may continue to monitor the premises by receiving data periodically or continuously from the dual-purpose camera system.
The remote sensing security system as described herein may also include a dual-purpose camera system that includes at least two dual-purpose cameras for surveillance at a premises. In this example, there may be at least one indoor dual-purpose camera to monitor the interior of a building that may be located on the premises and at least one outdoor dual-purpose camera to monitor the exterior of the building. The indoor dual-purpose camera may be communicatively coupled to the outdoor dual-purpose camera to form a network that communicates with the cloud server. Alternatively or additionally, the various dual-purpose cameras of the dual-purpose camera system may be individually coupled to the cloud server so that each dual-purpose camera may independently communicate the generated data to the cloud server. In an example, the indoor dual-purpose camera may form part of a device such as a tablet device, a laptop, a desktop, etc., which in turn may be communicatively coupled to the cloud server. Other various configurations may also be provided.
Each dual-purpose camera may be configured with a compact optical design that may accommodate at least two sensors that may function in different portions of the electromagnetic spectrum. An imaging lens may be included in the dual-purpose camera for capturing the light rays that may be focused on a beam split cube. In an example, the imaging lens may comprise multiple imaging lenses. The beam split cube may include a surface coated so that an IR component of a light beam incident on the coated surface may be reflected and the visible light component of the incident light beam may be transmitted. A visible light sensor may be arranged behind the beam split cube to receive the visible light component, and an IR sensor may be arranged below the beam split cube to receive the reflected IR component of the incident beam. An additional lens may be arranged between the beam split cube and the IR sensor to generate a sharper IR image.
The dual-purpose cameras 102 and 104 may continuously monitor the interior and the exterior of the building 120. In an example, the dual-purpose cameras may be communicatively coupled to form a local network which in turn may be connected to the cloud server 140 via the internet. In an example, each of the dual-purpose cameras 102 and 104 may be individually connected to the cloud server 140 via the internet. In an example, one or more of the dual-purpose cameras 102 and 104 may be associated with or included as part of a user device such as a desktop, a laptop, or a tablet device (not shown) which may form part of the dual-purpose camera system 108. The user device may in turn be connected to the cloud server 140 via the internet.
The image/video data 130 from the dual-purpose camera system 108 may be continuously, discontinuously, or periodically received at the cloud server 140 wherein it may be analyzed for identification of specific objects and/or movements. The cloud server 140 may be configured to identify specific objects and in response to identifying the specific objects, the cloud server 140 may be further configured to trigger notifications or alerts 172 to at least one client device 162 which may be disparate and/or remote from the dual-purpose cameras 102, 104. The client device 162 may include but is not limited to one or more of smartphones, smartwatches, HMDs which may include Augmented Reality (AR), Virtual Reality (VR), or Mixed Reality (MR) devices. The remote sensing security system 100 described above may be configured to monitor the building 120 for safety and security issues. Although only two dual-purpose cameras are illustrated, it may be appreciated that any number of dual-purpose cameras may be similarly installed and communicatively coupled to each other and/or the cloud server 140 to enable monitoring of the building 120 remotely by the cloud server 140.
It should be appreciated that the processor 210 may be a semiconductor-based microprocessor, a central processing unit (CPU), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), and/or other suitable hardware device. In some examples, the memory 220 may have stored thereon machine-readable instructions (which may also be termed computer-readable instructions) that the processor 210 may execute. The memory 220 may be an electronic, magnetic, optical, or other physical storage device that contains or stores executable instructions. The memory 220 may be, for example, Random Access Memory (RAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage device, an optical disc, and the like. The memory 220, which may also be referred to as a computer-readable storage medium, may be a non-transitory machine-readable storage medium, where the term “non-transitory” does not encompass transitory propagating signals.
The processor 210 may execute instructions 202 to receive and store images/video recorded by the dual-purpose camera(s) 102 and optionally dual-purpose camera 104 in case the dual-purpose camera 104 is a stand-alone camera capable of being communicatively coupled to the user device 200. The processor 210 may execute instructions 204 to determine that the stored video/image may be transmitted to the cloud server 140 as data 130. In an example, the user device 200 may be configured to periodically transmit the data 130 to the cloud server 140 as push notifications. In an example, the cloud server 140 may pull the data 130 from the user device 200. In either case, the processor 210 may execute instructions 206 to transmit the images/video to the cloud server 140 whenever it is determined that the images/video are to be transmitted.
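The periodic push-style transmission described above may be sketched, purely by way of illustration, as follows. The buffer, the push interval, and the `send` callback are assumptions introduced for this sketch and do not correspond to any specific implementation of instructions 202 through 206.

```python
# Illustrative sketch of a camera-side uploader that buffers recorded
# frames and pushes them to a server once a configured interval elapses.
import time

class CameraUploader:
    def __init__(self, send, interval_s=30.0, clock=time.monotonic):
        self.send = send              # callable that transmits a batch to the server
        self.interval_s = interval_s  # push period (seconds)
        self.clock = clock            # injectable clock, eases testing
        self._buffer = []
        self._last_push = clock()

    def record(self, frame):
        """Store a captured frame; push the batch when the interval elapses."""
        self._buffer.append(frame)
        if self.clock() - self._last_push >= self.interval_s:
            self.flush()

    def flush(self):
        """Transmit and clear any buffered frames, then reset the timer."""
        if self._buffer:
            self.send(list(self._buffer))
            self._buffer.clear()
        self._last_push = self.clock()
```

A pull-based variant would instead expose the buffer through an endpoint the server polls; the push interval here simply stands in for the "periodically transmit" behavior described above.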
The memory 320 may include instructions 302 to receive the data 130 including the images/video from the dual-purpose camera system 108. The instructions 302 may cause the processor 310 to pull the data 130 from the dual-purpose camera system 108 periodically. Alternatively or additionally, the instructions 302 may cause the cloud server 140 to receive the data 130 when it is pushed to the cloud server 140. The data 130 provided to the cloud server 140 may include not only images/video in the visible spectrum but also images from the IR spectrum.
While the images in the visible spectrum enable identifying nonliving and living objects, the images in the IR spectrum may enable confirming the type of emergency at the building 120 based on the objects identified from the data 130. Accordingly, the processor 310 may execute instructions 304 to identify objects and/or conditions from the data 130. In an example, machine learning (ML) based models 350 pre-trained to identify specific objects/conditions such as but not limited to, fire, smoke, living beings, etc. may be employed for object detection and identification. In an example, the data 130 may include both visible spectrum data as well as data from the IR spectrum. Although ML models 350 are shown as being stored in the memory 320, it may be appreciated that the ML models 350 may even be stored remotely from the cloud server 140 and may yet be accessed by the processor 310 for object recognition. Infrared imaging-based machine vision technology may be used to automatically inspect, detect and analyze infrared images (or videos) obtained from the dual-purpose camera system 108.
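To illustrate how pre-trained models might be applied to both spectra of the data 130, the following sketch runs a detector over the visible and IR images and aggregates the resulting labels. The model interface (a callable returning label/score dictionaries) and the score cutoff are assumptions for this sketch, not a specific library API.

```python
# Illustrative sketch: run pre-trained detection models over the visible
# and IR portions of incoming data and collect the detected labels.

def detect_objects(model, image, min_score=0.5):
    """Run a detector over an image and keep detections above a cutoff."""
    return [d for d in model(image) if d["score"] >= min_score]

def analyze_frame(visible_model, ir_model, visible_img, ir_img):
    """Return per-spectrum detections plus the union of detected labels."""
    detections = {
        "visible": detect_objects(visible_model, visible_img),
        "ir": detect_objects(ir_model, ir_img),
    }
    labels = {d["label"] for dets in detections.values() for d in dets}
    return detections, labels
```

The per-spectrum split reflects the description above: visible-spectrum detections identify objects, while IR detections serve to confirm thermal conditions such as fire or smoke.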
The processor 310 may execute instructions 306 to determine if an emergency exists at the building 120. A positive identification (i.e., an identification having a confidence level greater than a predetermined threshold) may be made by one or more of the ML models 350 to cause the instructions 306 to determine that an emergency exists. If, on the other hand, no positive identifications are made from the data 130 received from the dual-purpose camera system 108, then it may be determined that no emergency exists at the building 120 and the data 130 may be ignored and/or stored in archives. If it is determined that there is an emergency at the building 120, the processor 310 may execute instructions 308 to transmit an alert 172 to the client device 162. In an example, the alert 172 may include the images and/or video from the data 130.
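The decision logic described above (alert when a detection clears the confidence threshold, otherwise archive) may be sketched as follows. The threshold value, the alert payload shape, and the `notify`/`archive` callbacks are illustrative assumptions for this sketch only.

```python
# Illustrative sketch of the emergency-determination step: alert on any
# detection above a confidence threshold, otherwise archive the data.

CONFIDENCE_THRESHOLD = 0.75  # assumed predetermined threshold

def evaluate(detections, frames, notify, archive):
    """Return True and send an alert if any detection is confident
    enough; otherwise archive the received frames and return False."""
    confirmed = [d for d in detections if d["confidence"] > CONFIDENCE_THRESHOLD]
    if confirmed:
        notify({"emergency": True,
                "objects": [d["label"] for d in confirmed],
                "media": frames})   # the alert may carry the images/video
        return True
    archive(frames)                 # no emergency: retain data in archives
    return False
```
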
As shown in
As a result of the splitting of the incident ray 412 by the beam split prism 404, a visible image of the object 420, for example, may be formed on the visible light sensor 406 from the visible light component of the incident rays 412 and 414. The IR portion of the incident rays 412 and 414 may be split off by the coated surface 462 to be reflected onto the lens element 408. The reflected IR component may be rendered parallel by the lens element 408 to form a sharp IR image on the IR sensor 410. The coated surface 462 may be arranged at such a distance from the lens element 408 that a beam of the IR spectrum is made to be incident on the IR sensor 410. The image information from the visible light sensor 406 and the IR sensor 410 may be provided as the data 130 to the cloud server 140. The optical system 400, therefore, affords a compact optical design and configuration for the dual-purpose cameras to be used in the dual-purpose camera system 108.
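As a toy numeric illustration of the beam-split behavior described above, the following sketch models the coated surface transmitting the visible component toward one sensor and reflecting the IR component toward the other. The spectral fraction and efficiency values are arbitrary assumptions, not measured properties of any disclosed coating.

```python
# Toy model of the beam-split cube: the coated surface transmits the
# visible component and reflects the IR component of an incident ray.

def split_beam(incident, visible_fraction=0.6,
               transmit_eff=0.95, reflect_eff=0.9):
    """Split incident radiant power (arbitrary units) into the power
    reaching the visible light sensor and the IR sensor.

    visible_fraction: assumed share of incident power in the visible band.
    transmit_eff / reflect_eff: assumed coating efficiencies.
    """
    visible = incident * visible_fraction * transmit_eff     # transmitted
    ir = incident * (1.0 - visible_fraction) * reflect_eff   # reflected
    return visible, ir
```
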
The method detailed in the flowchart below is provided by way of an example. There may be a variety of ways to carry out the method described herein. Although the method detailed below is primarily described as being performed by the cloud server 140, as shown in
Accordingly, a type of emergency may be determined at 506. For example, it may be determined if the emergency is a fire-related emergency or a non-fire emergency, i.e., an emergency not related to fire, such as, but not limited to, flood, breakage, intruders, etc. For particular objects/conditions such as fire and smoke, further confirmation may be obtained from the IR sensor 410 at 506. In case further confirmation is obtained from analyzing the portion of the data 130 from the IR sensor 410, then one or more notifications/alerts may be transmitted at 508 based on the type of emergency. For example, an alert, in addition to the alert 172 to the client device 162, may also be transmitted to public services such as a fire department in case the data 130 from the visible light sensor 406 and the IR sensor 410 indicate a fire emergency. If, at 506, particular objects are detected which may not require further confirmation, or which may not be confirmed by the IR sensor 410 at 506, then an alert 172 may be transmitted at 508 only to the client device 162.
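The alert routing at 508 may be sketched, by way of illustration only, as a small function mapping the determined emergency type and IR confirmation status to a set of recipients. The recipient names are illustrative placeholders.

```python
# Illustrative sketch of alert routing: the client device always
# receives the alert; public services are added only for an
# IR-confirmed fire emergency.

def route_alerts(emergency_type, ir_confirmed):
    """Return the list of recipients for an alert."""
    recipients = ["client_device"]            # the alert to the client device
    if emergency_type == "fire" and ir_confirmed:
        recipients.append("fire_department")  # additional public-service alert
    return recipients
```
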
The interconnect 610 may interconnect various subsystems, elements, and/or components of the computer system 600. As shown, the interconnect 610 may be an abstraction that may represent any one or more separate physical buses, point-to-point connections, or both, connected by appropriate bridges, adapters, or controllers. In some examples, the interconnect 610 may include a system bus, a peripheral component interconnect (PCI) bus or PCI-Express bus, a HyperTransport or industry standard architecture (ISA) bus, a small computer system interface (SCSI) bus, a universal serial bus (USB), an IIC (I2C) bus, an Institute of Electrical and Electronics Engineers (IEEE) standard 1394 bus (also referred to as "FireWire"), or another similar interconnection element.
In some examples, the interconnect 610 may allow data communication between the processor 612 and system memory 618, which may correspond to one or more of the memories 220 and 320. The system memory 618 may include read-only memory (ROM) or flash memory (neither shown), and random access memory (RAM) (not shown). It should be appreciated that the RAM may be the main memory into which an operating system and various application programs may be loaded. The ROM or flash memory may contain, among other code, the Basic Input-Output system (BIOS) which controls basic hardware operation such as the interaction with one or more peripheral components.
The processor 612 (which may correspond to the processor 210 or the processor 310) may be the central processing unit (CPU) of the computing device and may control the overall operation of the computing device. In some examples, the processor 612 may accomplish this by executing software or firmware stored in system memory 618 or other data via the storage adapter 620. The processor 612 may be, or may include, one or more programmable general-purpose or special-purpose microprocessors, digital signal processors (DSPs), programmable controllers, application-specific integrated circuits (ASICs), programmable logic devices (PLDs), trusted platform modules (TPMs), field-programmable gate arrays (FPGAs), other processing circuits, or a combination of these and other devices.
The multimedia adapter 614 may connect to various multimedia elements or peripherals. These may include devices associated with visual (e.g., video card or display), audio (e.g., sound card or speakers), and/or various input/output interfaces (e.g., mouse, keyboard, touchscreen).
The network interface 616 may provide the computing device with an ability to communicate with a variety of remote devices over a network and may include, for example, an Ethernet adapter, a Fibre Channel adapter, and/or other wired- or wireless-enabled adapter. The network interface 616 may provide a direct or indirect connection from one network element to another, and may facilitate communication between various network elements.
The storage adapter 620 may connect to a standard computer-readable medium for storage and/or retrieval of information, such as a fixed disk drive (internal or external).
Many other devices, components, elements, or subsystems (not shown) may be connected in a similar manner to the interconnect 610 or via a network. Conversely, all of the devices shown in
The figures and description are not intended to be restrictive. The terms and expressions that have been employed in this disclosure are used as terms of description and not of limitation, and there is no intention in the use of such terms and expressions of excluding any equivalents of the features shown and described or portions thereof. The word "example" may be used herein to mean "serving as an example, instance, or illustration." Any embodiment or design described herein as an "example" is not necessarily to be construed as preferred or advantageous over other embodiments or designs.
Although the methods and systems as described herein may be directed mainly to digital content, such as videos or interactive media, it should be appreciated that the methods and systems as described herein may be used for other types of content or scenarios as well. Other applications or uses of the methods and systems as described herein may also include social networking, marketing, content-based recommendation engines, and/or other types of knowledge or data-driven systems.