As technology has advanced, computing devices have proliferated into an increasing number of areas while decreasing in price. Consequently, devices such as smartphones, laptops, GPS units, etc., have become prevalent in our communities, thereby increasing the amount of data being gathered in an ever-increasing number of locations. Unfortunately, most of the gathered information is used for marketing and advertising to the end user, e.g., a smartphone user receives a coupon for a nearby coffee shop, etc., while the security of our communities is left exposed and at risk of terrorist attacks such as the Boston Marathon bombing.
Accordingly, a need has arisen for a solution to allow monitoring and collection of data from a plurality of sensors and management of the plurality of sensors for improving the security of our communities, e.g., by detecting radiation, etc. Further, there is a need to provide relevant information based on the sensors in an efficient manner to increase security. For example, relevant information of the sensors may be gathered by grouping sensors together based on readings of the sensors relative to a condition, threshold, or heuristics. The grouping of sensors may allow for efficient monitoring of the sensors by interested parties.
According to some embodiments, data associated with a number of sensors are received. The data of the sensors may be compared to a certain condition, for example a threshold value, and based on the comparison, two or more of the sensors may be grouped together. In some embodiments, the grouping of sensors may include combining the data and metadata of the sensors in a data structure.
According to some embodiments, data associated with a first detection sensor and data associated with a second detection sensor are received. The first detection sensor and the second detection sensor are grouped together if the data associated with the first detection sensor satisfies a first condition and if the data associated with the second detection sensor satisfies a second condition.
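The condition-based grouping summarized above can be sketched as follows. This is a minimal illustration assuming a simple dict representation for sensors; all names, fields, and thresholds are hypothetical, not part of any specified implementation.

```python
# Illustrative sketch: group two detection sensors when each sensor's
# data satisfies its own condition. All identifiers are hypothetical.
def group_if_conditions_met(sensor_a, sensor_b, cond_a, cond_b):
    """Group two sensors when each sensor's reading satisfies its condition."""
    if cond_a(sensor_a["reading"]) and cond_b(sensor_b["reading"]):
        # The grouping combines the sensors' data and metadata in one structure.
        return {
            "members": [sensor_a["id"], sensor_b["id"]],
            "readings": [sensor_a["reading"], sensor_b["reading"]],
            "metadata": [sensor_a.get("meta", {}), sensor_b.get("meta", {})],
        }
    return None  # conditions not met; no grouping formed

s1 = {"id": "sensor-1", "reading": 7.2, "meta": {"location": "Terminal A"}}
s2 = {"id": "sensor-2", "reading": 6.8, "meta": {"location": "Terminal A"}}
group = group_if_conditions_met(s1, s2, lambda r: r > 5.0, lambda r: r > 5.0)
```

Here both readings exceed the illustrative threshold of 5.0, so a single combined structure is returned; if either condition failed, no grouping would be formed.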
According to some embodiments, a data store is configured to store data associated with a first and a second detection sensor. Furthermore, a state change manager is configured to determine whether the data of the first detection sensor satisfies a first condition and whether the data of the second detection sensor satisfies a second condition. A sensor data representation module is configured to group the first detection sensor and the second detection sensor together based on the determination that the data of the first and second detection sensors satisfy the first and second conditions, respectively.
According to some embodiments, data associated with a first detection sensor is received, and a second detection sensor is identified based on data associated with the second detection sensor satisfying a certain condition. The first detection sensor is grouped together with the identified second detection sensor.
These and other features and aspects may be better understood with reference to the following drawings, description, and appended claims.
Reference will now be made in detail to various embodiments, examples of which are illustrated in the accompanying drawings. While the claimed embodiments will be described in conjunction with various embodiments, it will be understood that these various embodiments are not intended to limit the scope of the embodiments. On the contrary, the claimed embodiments are intended to cover alternatives, modifications, and equivalents, which may be included within the scope of the appended Claims. Furthermore, in the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the claimed embodiments. However, it will be evident to one of ordinary skill in the art that the claimed embodiments may be practiced without these specific details. In other instances, well known methods, procedures, components, and circuits are not described in detail so that aspects of the claimed embodiments are not obscured.
Some portions of the detailed descriptions that follow are presented in terms of procedures, logic blocks, processing, and other symbolic representations of operations on data bits within a computer memory. These descriptions and representations are the means used by those skilled in the data processing arts and data communication arts to most effectively convey the substance of their work to others skilled in the art. In the present application, a procedure, logic block, process, or the like, is conceived to be a self-consistent sequence of operations or steps or instructions leading to a desired result. The operations or steps are those utilizing physical manipulations of physical quantities. Usually, although not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in a computer system or computing device. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as transactions, bits, values, elements, symbols, characters, samples, pixels, or the like.
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout the present disclosure, discussions utilizing terms such as “receiving,” “identifying,” “grouping,” “ungrouping,” “rendering,” “determining,” or the like, refer to actions and processes of a computer system or similar electronic computing device or processor. The computer system or similar electronic computing device manipulates and transforms data represented as physical (electronic) quantities within the computer system memories, registers or other such information storage, transmission or display devices.
It is appreciated that present systems and methods can be implemented in a variety of architectures and configurations. For example, present systems and methods can be implemented as part of a distributed computing environment, a cloud computing environment, a client server environment, etc. Embodiments described herein may be discussed in the general context of computer-executable instructions residing on some form of computer-readable storage medium, such as program modules, executed by one or more computers, computing devices, or other devices. By way of example, and not limitation, computer-readable storage media may comprise computer storage media and communication media. Generally, program modules include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types. The functionality of the program modules may be combined or distributed as desired in various embodiments.
Computer storage media can include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Computer storage media can include, but is not limited to, random access memory (RAM), read only memory (ROM), electrically erasable programmable ROM (EEPROM), flash memory, or other memory technology, compact disk ROM (CD-ROM), digital versatile disks (DVDs) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed to retrieve that information.
Communication media can embody computer-executable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media can include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared and other wireless media. Combinations of any of the above can also be included within the scope of computer-readable storage media.
Provided herein are embodiments for grouping/ungrouping multiple sensors of a sensor-based system. The sensors are configured for monitoring certain conditions, e.g., radiation levels, acoustic thresholds, moisture, or playback of events. For example, the sensor-based system may include any of a variety of sensors, including thermal sensors (e.g., temperature, heat, etc.), electromagnetic sensors (e.g., metal detectors, light sensors, particle sensors, Geiger counters, charge-coupled devices (CCDs), etc.), mechanical sensors (e.g., tachometers, odometers, etc.), biological/chemical sensors (e.g., toxins, nutrients, etc.), or any combination thereof. The sensor-based system may further include any of a variety of sensors or a combination thereof including, but not limited to, acoustic, sound, vibration, automotive/transportation, chemical, electrical, magnetic, radio, environmental, weather, moisture, humidity, flow, fluid velocity, ionizing, atomic, subatomic, navigational, position, angle, displacement, distance, speed, acceleration, optical, light imaging, photon, pressure, force, density, level, thermal, heat, temperature, proximity, presence, radiation, Geiger counter, crystal-based portal sensors, biochemical, pressure, air quality, water quality, fire, flood, intrusion detection, motion detection, particle count, water level, or surveillance cameras. The grouping of sensors may be based on various conditions, e.g., proximity of sensors to one another, geo-location of the sensors and their particular location, type of sensor, range of sensor detection, physical proximity of sensors, floor plan of a structure where the sensor is positioned or which the sensor is next to, etc. In some embodiments, the system for grouping of sensors may provide functionality to alert appropriate entities or individuals to the status of events captured by the sensor-based system as events evolve, either in real-time or based on recorded sensor data.
The sensors 110-114 detect a reading associated therewith, e.g., gamma radiation, vibration, etc., and transmit that information to the sensor based detection system 102 for analysis. The sensor based detection system 102 may use the received information and compare it to a threshold value, e.g., historical values, user selected values, etc., in order to determine whether a potentially hazardous event has occurred. In response to the determination, the sensor based detection system 102 may transmit that information to the messaging system 108 for appropriate action, e.g., emailing the appropriate personnel, sounding an alarm, tweeting an alert, alerting the police department, alerting homeland security department, etc. Accordingly, appropriate actions may be taken in order to avert the risk.
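The threshold comparison and messaging hand-off described above might look like the following sketch. The threshold value, message fields, and recipient address are all invented for illustration; an actual system could derive the threshold from historical values or user selection as noted.

```python
# Hypothetical sketch of comparing a sensor reading to a threshold and
# building a notification for a downstream messaging system.
def check_event(sensor_id, reading, threshold):
    """Return a notification dict if the reading exceeds the threshold."""
    if reading <= threshold:
        return None  # no potentially hazardous event detected
    return {
        "sensor": sensor_id,
        "reading": reading,
        "message": f"Sensor {sensor_id} reading {reading} exceeds {threshold}",
        "recipients": ["operations@example.org"],  # hypothetical address
    }

alert = check_event("sensor-7", 9.6, threshold=5.0)
```

A returned notification could then be routed to email, SMS, or other channels; a `None` result means no action is required.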
The sensors 110-114 may be any of a variety of sensors including thermal sensors (e.g., temperature, heat, etc.), electromagnetic sensors (e.g., metal detectors, light sensors, particle sensors, Geiger counter, charge-coupled device (CCD), etc.), mechanical sensors (e.g., tachometer, odometer, etc.), complementary metal-oxide-semiconductor (CMOS), biological/chemical (e.g., toxins, nutrients, etc.), etc. The sensors 110-114 may further be any of a variety of sensors or a combination thereof including, but not limited to, acoustic, sound, vibration, automotive/transportation, chemical, electrical, magnetic, radio, environmental, weather, moisture, humidity, flow, fluid velocity, ionizing, atomic, subatomic, navigational, position, angle, displacement, distance, speed, acceleration, optical, light imaging, photon, pressure, force, density, level, thermal, heat, temperature, proximity, presence, radiation, Geiger counter, crystal-based portal sensors, biochemical, pressure, air quality, water quality, fire, flood, intrusion detection, motion detection, particle count, water level, surveillance cameras, etc. The sensors 110-114 may be video cameras (e.g., internet protocol (IP) video cameras) or purpose built sensors.
The sensors 110-114 may be fixed in location (e.g., surveillance cameras or sensors), semi-fixed (e.g., sensors on a cell tower on wheels or affixed to another semi portable object), or mobile (e.g., part of a mobile device, smartphone, etc.). The sensors 110-114 may provide data to the sensor based detection system 102 according to the type of the sensors 110-114. For example, sensors 110-114 may be CMOS sensors configured for gamma radiation detection. Gamma radiation may thus illuminate a pixel, which is converted into an electrical signal and sent to the sensor based detection system 102.
The sensor based detection system 102 is configured to receive data and manage sensors 110-114. The sensor based detection system 102 is configured to assist users in monitoring and tracking sensor readings or levels at one or more locations. The sensor based detection system 102 may have various components that allow for easy deployment of new sensors within a location (e.g., by an administrator) and allow for monitoring of the sensors to detect events based on user preferences, heuristics, etc. The events may be used by the messaging system 108 to generate sensor-based alerts (e.g., based on sensor readings above a threshold for one sensor, based on the sensor readings of two sensors within a certain proximity being above a threshold, etc.) in order for the appropriate personnel to take action. The sensor based detection system 102 may receive data and manage any number of sensors, which may be located at geographically disparate locations. In some embodiments, the sensors 110-114 and components of a sensor based detection system 102 may be distributed over multiple systems (e.g., and virtualized) and a large geographical area.
The sensor based detection system 102 may track and store location information (e.g., board room B, floor 2, terminal A, etc.) and global positioning system (GPS) coordinates, e.g., latitude, longitude, etc. for each sensor or group of sensors. The sensor based detection system 102 may be configured to monitor sensors and track sensor values to determine whether a defined event has occurred, e.g., whether a detected radiation level is above a certain threshold, etc., and if so then the sensor based detection system 102 may determine a route or path of travel that dangerous or contraband material is taking around or within range of the sensors. For example, the path of travel of radioactive material relative to fixed sensors may be determined and displayed via a graphical user interface. It is appreciated that the path of travel of radioactive material relative to mobile sensors, e.g., smartphones, etc., or relative to a mixture of fixed and mobile sensors may similarly be determined and displayed via a graphical user interface. It is appreciated that the analysis and/or the sensed values may be displayed in real-time or stored for later retrieval.
The sensor based detection system 102 may display a graphical user interface (GUI) for monitoring and managing sensors 110-114. The GUI may be configured for indicating sensor readings, sensor status, sensor locations on a map, etc. The sensor based detection system 102 may allow review of past sensor readings and movement of sensor detected material or conditions based on stop, play, pause, fast forward, and rewind functionality of stored sensor values. The sensor based detection system 102 may also allow viewing of an image or video footage (e.g., motion or still images) corresponding to sensors that had sensor readings above a threshold (e.g., based on a predetermined value or based on ambient sensor readings). For example, a sensor may be selected in a GUI and video footage associated with an area within a sensor's range of detection may be displayed, thereby enabling a user to see an individual or person transporting hazardous material. According to one embodiment, the footage is displayed in response to a user selection, or it may be displayed automatically in response to a certain event, e.g., the sensor reading associated with a particular sensor or group of sensors being above a certain threshold.
In some embodiments, sensor readings of one or more sensors may be displayed on a graph or chart for easy viewing. A visual map-based display depicting sensors may be displayed with sensor representations and/or indicators, which may include color coding, shapes, icons, flash rates, etc., according to the sensors' readings and certain events. For example, gray may be associated with a calibrating sensor, green may be associated with a normal reading from the sensor, yellow may be associated with an elevated sensor reading, orange associated with a potential hazard sensor reading, and red associated with a hazard alert sensor reading.
The sensor based detection system 102 may determine alerts or sensor readings above a specified threshold (e.g., predetermined, dynamic, or ambient based) or based on heuristics and display the alerts in the GUI. The sensor based detection system 102 may allow a user (e.g., operator) to group multiple sensors together to create an event associated with multiple alerts from multiple sensors. For example, a code red event may be created when three or more sensors within twenty feet of one another and within the same physical space have a sensor reading that is at least 40% above the historical values. In some embodiments, the sensor based detection system 102 may automatically group sensors together based on geographical proximity of the sensors, e.g., sensors of gates 1, 2, and 3 within terminal A at LAX airport may be grouped together due to their proximate location with respect to one another, e.g., physical proximity within the same physical space, whereas sensors in different terminals may not be grouped because of their disparate locations. However, in certain circumstances sensors within the same airport may be grouped together in order to monitor events at the airport and not at a more granular level of terminals, gates, etc.
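The code red heuristic in the example above (three or more sensors within twenty feet reading at least 40% above historical values) can be sketched as follows. The planar-coordinate distance check and the data layout are simplifying assumptions for illustration.

```python
from math import hypot

def code_red(sensors, historical, radius_ft=20.0, pct=0.40, min_count=3):
    """True if at least `min_count` elevated sensors lie within `radius_ft`
    of some common anchor sensor. An elevated sensor is one reading at
    least `pct` above its historical baseline. Illustrative sketch only."""
    elevated = [s for s in sensors
                if s["reading"] >= historical[s["id"]] * (1.0 + pct)]
    for anchor in elevated:
        nearby = [s for s in elevated
                  if hypot(s["x"] - anchor["x"], s["y"] - anchor["y"]) <= radius_ft]
        if len(nearby) >= min_count:
            return True
    return False

# Three sensors clustered within a few feet, each 50% above baseline.
sensors = [
    {"id": "a", "x": 0.0, "y": 0.0, "reading": 1.5},
    {"id": "b", "x": 5.0, "y": 0.0, "reading": 1.5},
    {"id": "c", "x": 0.0, "y": 5.0, "reading": 1.5},
]
historical = {"a": 1.0, "b": 1.0, "c": 1.0}
```

With these sample values the heuristic triggers; if the same sensors read only 20% above baseline, it would not.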
The sensor based detection system 102 may send information to a messaging system 108 based on the determination of an event created from the information collected from the sensors 110-114. The messaging system 108 may include one or more messaging systems or platforms which may include a database (e.g., messaging, SQL, or other database), short message service (SMS), multimedia messaging service (MMS), instant messaging services, TWITTER available from Twitter, Inc. of San Francisco, Calif., Extensible Markup Language (XML) based messaging service (e.g., for communication with a Fusion center), JAVASCRIPT Object Notation (JSON) messaging service, etc. For example, national information exchange model (NIEM) compliant messaging may be used to report chemical, biological, radiological and nuclear defense (CBRN) suspicious activity reports (SARs) to government entities (e.g., local, state, or federal government).
The sensor process manager 204 receives analyzed sensor data from sensor analytics processes 202. The sensor process manager 204 may then send the analyzed sensor data to the data store 206 for storage. The sensor process manager 204 may further send metadata associated with sensors 210-214 for storage in the data store 206 with the associated analyzed sensor data. In some embodiments, the sensor process manager 204 may send the analyzed sensor data and metadata associated with sensors 210-214 to the sensor data representation module 216. It is appreciated that the information transmitted to the sensor data representation module 216 from the sensor process manager 204 may be in a message based format.
The sensor process manager 204 is configured to initiate or launch sensor analytics processes 202. The sensor process manager 204 is operable to configure each instance or process of the sensor analytics processes 202 based on configuration parameters (e.g., preset, configured by a user, etc.). In some embodiments, the sensor analytics processes 202 may be configured by the sensor process manager 204 to organize sensor readings over time intervals (e.g., 30 seconds, one minute, one hour, one day, one week, one year). It is appreciated that the particular time intervals may be preset or may be user configurable. It is further appreciated that the particular time intervals may be changed dynamically, e.g., during run time, or statically. In some embodiments, a process of the sensor analytics processes 202 may be executed for each time interval. The sensor process manager 204 may also be configured to access or receive metadata associated with sensors 210-214 (e.g., geospatial coordinates, network settings, user entered information, etc.).
In some embodiments, sensor analytics processes 202 may then send the analyzed sensor data to the data store 206 for storage. The sensor analytics processes 202 may further send metadata associated with sensors 210-214 for storage in the data store 206 with the associated analyzed sensor data.
The state change manager 208 may access or receive analyzed sensor data and associated metadata from the data store 206. The state change manager 208 may be configured to analyze sensor readings for a possible change in the state of the sensor. It is appreciated that in one embodiment, the state change manager 208 may receive the analyzed sensor data and/or associated metadata from the sensor analytics processes 202 directly without having to fetch that information from the data store 206 (not shown).
The state change manager 208 may determine whether a state of a sensor has changed based on current sensor data and previous sensor data. Changes in sensor state based on the sensor readings exceeding a threshold, within or outside of a range, etc., may be sent to a sensor data representation module 216 (e.g., on a per sensor basis, on a per group of sensors basis, etc.). For example, a state change of the sensor 212 may be determined based on the sensor 212 changing from a prior normal reading to an elevated reading (e.g., above a certain threshold, within an elevated reading, within a dangerous reading, etc.). In another example, the state of sensor 210 may be determined not to have changed based on the sensor 210 having an elevated reading within the same range as the prior sensor reading.
In some embodiments, various states of sensors and associated alerts may be configured via the sensor process manager 204. For example, the sensor process manager 204 may be used to configure thresholds, ranges, etc., that may be compared against sensor readings to determine whether an alert should be generated. For example, the sensors 210-214 may have five possible states: calibration, nominal, elevated, potential, and warning. It is appreciated that the configuring of sensor process manager 204 may be in response to a user input. For example, a user may set the threshold values, ranges, etc., and conditions to be met for generating an alert. In some embodiments, a color may be associated with each state. For example, dark gray may be associated with a calibration state, green associated with a nominal state, yellow associated with an elevated state, orange associated with a potential state, and red associated with an alert state. Light gray may be used to represent a sensor that is offline or not functioning. It is appreciated that any number of states may be present and discussing five possible states is for illustrative purposes and not intended to limit the scope of the embodiments.
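The five-state model and color coding described above might be configured along these lines. The numeric thresholds here are invented purely for illustration; in the embodiments they would be preset or user configured.

```python
# Sketch of the five-state model and color coding described above.
# The numeric thresholds are invented, not specified values.
STATE_COLORS = {
    "calibration": "dark gray",
    "nominal": "green",
    "elevated": "yellow",
    "potential": "orange",
    "warning": "red",
    "offline": "light gray",
}

def classify(reading, online=True):
    """Map a sensor reading to one of the configured states."""
    if not online:
        return "offline"
    if reading is None:
        return "calibration"  # no reading yet: still calibrating
    if reading < 1.0:
        return "nominal"
    if reading < 2.0:
        return "elevated"
    if reading < 4.0:
        return "potential"
    return "warning"
```

A GUI could then look up `STATE_COLORS[classify(reading)]` to render each sensor's indicator.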
In some embodiments, the state change manager 208 is configured to generate an alert or alert signal if there is a change in the state of a sensor 210-214 to a new state. For example, an alert may be generated for a sensor that goes from a nominal state to an elevated state or a potential state. In some embodiments, the state change manager 208 includes an active state table. The active state table may be used to store the current and/or previous states, and the active state table is thereby maintained to determine state changes of the sensors 210-214. The state change manager 208 may thus provide real-time sensing information based on sensor state changes.
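An active state table of the kind described above can be sketched as a small class that records each sensor's current state and emits an alert on change. The class name and alert format are hypothetical.

```python
class StateChangeManager:
    """Minimal sketch of an active state table tracking per-sensor states.
    The alert rule follows the description above; the API is hypothetical."""

    def __init__(self):
        self._active = {}  # sensor id -> current state

    def update(self, sensor_id, new_state):
        """Record `new_state`; return an alert dict only if the state changed."""
        prev = self._active.get(sensor_id)
        self._active[sensor_id] = new_state
        if prev is not None and prev != new_state:
            return {"sensor": sensor_id, "from": prev, "to": new_state}
        return None  # first observation or unchanged state: no alert
```

Repeated readings in the same state produce no alert, so only genuine transitions (e.g., nominal to elevated) propagate downstream.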
In some embodiments, the state change manager 208 may determine whether sensor readings exceed normal sensor readings from ambient sources or whether there has been a change in the state of the sensor and generate an alert. For example, with gamma radiation, the state change manager 208 may determine if gamma radiation sensor readings are from a natural source (e.g., the sun, another celestial source, etc.) or other natural ambient source based on a nominal sensor state, or from radioactive material that is being transported within range of a sensor based on an elevated, potential, or warning sensor state. In one exemplary embodiment, it is determined whether the gamma radiation reading is within a safe range based on a sensor state of nominal or outside of the safe range based on the sensor state of elevated, potential, or warning.
In some embodiments, individual alerts may be sent to an external system (e.g., a messaging system 108). For example, one or more alerts that occur in a certain building within time spans of one minute, two minutes, or 10 minutes may be sent to a messaging system. It is appreciated that the time spans that the alerts are transmitted may be preset or selected by the system operator. In one embodiment, the time spans that the alerts are transmitted may be set dynamically, e.g., in real time, or statically.
The sensor data representation module 216 may access or receive analyzed sensor data and associated metadata from the sensor process manager 204 or data store 206. The sensor data representation module 216 may further receive alerts (e.g., on a per sensor basis, on per location basis, etc.) based on sensor state changes determined by the state change manager 208.
The sensor data representation module 216 may be operable to render a graphical user interface (GUI) depicting sensors 210-214, sensor state, alerts, sensor readings, etc. Sensor data representation module 216 may display one or more alerts, which occur when a sensor reading satisfies a certain condition, visually on a map, e.g., when a sensor reading exceeds a threshold, falls within a certain range, is below a certain threshold, etc. The sensor data representation module 216 may thus notify a user (e.g., operator, administrator, etc.) visually, audibly, etc., that a certain condition has been met by the sensors, e.g., possible bio-hazardous material has been detected, elevated gamma radiation has been detected, etc. The user may have the opportunity to inspect the various data that the sensor analytics processes 202 have generated (e.g., mSv values, bio-hazard reading level values, etc.) and generate an appropriate event case file including the original sensor analytics process 202 data (e.g., raw stream data, converted stream data, preprocessed sensor data, etc.) that triggered the alert. The sensor data representation module 216 may be used (e.g., by operators, administrators, etc.) to gain awareness of any materials (e.g., radioactive material, bio-hazardous material, etc.) or other conditions that travel through or occur in a monitored area.
In some embodiments, the sensor data representation module 216 includes location functionality operable to show a sensor, alerts, and events geographically. The location functionality may be used to plot the various sensors at their respective location on a map within a GUI. The GUI may allow for visual maps with detailed floor plans at various zoom levels, etc. The sensor data representation module 216 may send sensor data, alerts, and events to a messaging system (e.g., messaging system 108) for distribution (e.g., other users, safety officials, etc.).
As described below, sensor data representation module 216 may group multiple sensors together or ungroup one or more sensors from a previously created grouping. Herein, reference to grouping may refer to an aggregation of sensor captured data, metadata associated with multiple sensors 210-214, etc. Additionally, reference to ungrouping may refer to detaching one or more sensors 210-214 from a previously formed grouping of sensors 210-214. As an example, sensor data representation module 216 may ungroup sensor 212 from a grouping of sensors 210-214 by removing data corresponding to sensor 212 from the data structure of the grouping. As an example, sensor data representation module 216 may form a grouping of sensors, e.g., 210-214, by creating a data structure that aggregates readings from sensors 210-214 of the grouping, a data structure that aggregates readings from sensors but displays the highest reading, a data structure that aggregates readings from sensors but displays the average reading of the sensor grouping, a data structure that aggregates readings from sensors and displays associated metadata such as geo-positional information, etc. As another example, sensor data representation module 216 may form a grouping of sensors, e.g., 210-214, by creating a data structure that aggregates readings from sensors 210-214 of the grouping having similar characteristics, e.g., similar sensors, sensors with similar state, sensors with similar metadata, sensors with similar readings, etc.
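The display alternatives above (highest reading, average reading, or the raw aggregate) can be sketched with a small helper. The data layout and function name are illustrative assumptions.

```python
def summarize_group(readings_by_sensor, display="max"):
    """Aggregate a grouping's readings and select what to display: the
    highest reading, the average across the grouping, or the raw aggregate.
    Illustrative sketch of the alternatives described above."""
    all_readings = [r for rs in readings_by_sensor.values() for r in rs]
    if display == "max":
        return max(all_readings)  # show the highest reading in the grouping
    if display == "avg":
        return sum(all_readings) / len(all_readings)  # grouping average
    return all_readings  # raw aggregated readings
```

For example, a grouping with readings `{"s1": [1.0, 2.0], "s2": [4.0]}` would display 4.0 in "max" mode and the mean of all three readings in "avg" mode.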
The created data structure may be stored in data store 206. In some embodiments, sensor data representation module 216 may group sensors 210-214 in the data structure using a MapReduce framework. The data structure may describe a grouping of sensors 210-214 with respect to any parameter associated therewith, e.g., location, sensor data, type, etc. As an example, the data structure of the grouping may be stored locally or in data store 206 as a relational database. The data structure may be a hierarchy of entries and each entry may have one or more sub-entries. For example, entries in the data structure may correspond to the individual sensors and the sub-entries may be the metadata of the individual sensors. As another example, a sub-entry may be the sensed data of the individual sensors. Entries in the data structure may be implemented as JSON or XML documents that have attribute-value pairs. For a sensor, an example attribute may be "location" and a corresponding value may be "Terminal A".
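An entry/sub-entry grouping document with attribute-value pairs, serializable as JSON as described above, might look like this. The field names and values are hypothetical.

```python
import json

# Hypothetical grouping document: one entry per sensor, with metadata
# sub-entries as attribute-value pairs and the sensed data as a sub-entry.
grouping = {
    "entries": [
        {"sensor": "sensor-1",
         "metadata": {"location": "Terminal A", "type": "gamma"},
         "readings": [0.8, 0.9, 2.1]},
        {"sensor": "sensor-2",
         "metadata": {"location": "Terminal A", "type": "gamma"},
         "readings": [0.7, 1.8]},
    ]
}

doc = json.dumps(grouping)        # serialize for storage or messaging
restored = json.loads(doc)        # round-trip back to the hierarchy
```

The same hierarchy could equally be rendered as XML or persisted as rows in a relational database, with entries keyed by sensor identifier.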
The data structure may include sensor readings of sensors 210-214 captured over a fixed time scale (e.g., period of time). In some embodiments, sensor readings may be added to the data structure starting at a time that is determined based on the sensor readings of the sensors 210-214 of the grouping. As an example, the sensor readings included in the data structure may start at a time when one or more of sensors 210-214 has an elevated reading. As another example, the sensor readings included in the data structure may start at a time when one or more of sensors 210-214 has a reading within a threshold. In other embodiments, the data structure of grouped sensors 210-214 may be open ended and may add readings from sensors 210-214 on an on-going basis until an operator manually closes out the data collection or the collection is closed automatically based on heuristics. For example, sensor readings of a grouping of sensors may be discontinued when all sensors 210-214 of the grouping no longer have elevated readings, readings of the sensors are within a certain range, etc.
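The start and stop rules above (begin collecting when any sensor's reading is elevated, stop when all sensors fall back below a threshold) can be sketched over a time-ordered series of samples. The sample format and thresholds are illustrative assumptions.

```python
def collect_window(samples, start_threshold, stop_threshold):
    """Collect readings into a grouping's data structure, starting when any
    sensor exceeds `start_threshold` and stopping once every sensor in a
    sample falls back below `stop_threshold`. `samples` is a time-ordered
    list of {sensor_id: reading} dicts. Hypothetical sketch."""
    window = []
    collecting = False
    for sample in samples:
        if not collecting and any(r > start_threshold for r in sample.values()):
            collecting = True  # an elevated reading opens the window
        if collecting:
            window.append(sample)
            if all(r < stop_threshold for r in sample.values()):
                break  # all readings back to normal: close the window
    return window

samples = [
    {"a": 0.5, "b": 0.4},   # nominal: not yet collecting
    {"a": 2.5, "b": 0.4},   # "a" elevated: window opens here
    {"a": 2.0, "b": 1.5},
    {"a": 0.5, "b": 0.4},   # all nominal again: window closes here
    {"a": 3.0, "b": 0.1},   # not included; collection already closed
]
window = collect_window(samples, start_threshold=1.0, stop_threshold=1.0)
```

An open-ended variant would simply omit the stop check and let an operator close the collection manually.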
The data structure may allow adding or removing an entry at any time. As an example, sensor data representation module 216 may access or receive one or more conditions, parameters, or heuristics via a graphical user interface, as input by an operator for instance, that may be used to configure sensor data representation module 216. The user input information accessed by sensor data representation module 216 may be used to group or ungroup sensors 210-214. The conditions, parameters, or heuristics may be received via the graphical user interface of sensor data representation module 216, sensor process manager 204, state change manager 208, etc. As described below, sensor data representation module 216 may determine grouping or ungrouping of sensors 210-214 based on an evaluation (e.g., a comparison, an algorithm, etc.) of sensor data, sensor metadata, or the conditions, parameters, heuristics, etc. For example, a sensor previously not included in an existing sensor grouping and satisfying a certain condition may be added to the existing sensor grouping by adding an entry corresponding to the sensor into the data structure. Furthermore, a sensor in the existing sensor grouping that no longer satisfies a certain condition may be removed from the existing sensor grouping by removing the entry corresponding to the sensor from the data structure.
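The add/remove behavior might be sketched as a single pass over candidate sensors (the names and the shape of the condition are illustrative, not part of this description):

```python
def regroup(grouping, sensors, condition):
    """Add sensors whose data satisfies `condition` to the grouping and
    remove grouped sensors that no longer satisfy it. All names are
    illustrative; `grouping` maps sensor ids to their stored data."""
    for sensor_id, data in sensors.items():
        if condition(data):
            grouping[sensor_id] = data  # add (or refresh) the entry
        elif sensor_id in grouping:
            del grouping[sensor_id]  # drop the entry once the condition fails
    return grouping
```

In practice the condition could be any comparison, algorithm, or heuristic supplied via the graphical user interface.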
Furthermore, data associated with a sensor grouping may be used to generate messages, monitor readings from sensors 210-214 of the sensor grouping, visualize the status or location of sensors 210-214 of the sensor grouping, etc. In some embodiments, a grouping of sensors 210-214 may group the sensed data (readings) of sensors 210-214 in a data structure. Although this disclosure describes grouping and ungrouping of sensors using a data structure, this disclosure contemplates any suitable grouping and ungrouping of sensors using any suitable data structure.
An indicator may be output from sensor data representation module 216 based on determining that a grouping of sensors 210-214 has been formed. In some embodiments, the indicator may be output visually, audibly, or via a signal to another system (e.g., messaging system 108). As described below, groups of sensors may be selected manually (e.g., via a GUI, command line interface, etc.) or automatically (e.g., based on an automatic grouping determined by the sensor based detection system 102) based on heuristics. In some embodiments, the indicator (e.g., alert, event, message, etc.) may be output to a messaging system (e.g., messaging system 108). For example, the indicator may be output to notify a person (e.g., operator, administrator, safety official, etc.) or group of persons (e.g., safety department, police department, fire department, homeland security, etc.).
In some embodiments, sensor data representation module 216 may group sensors together based on metadata showing the sensors are located within a geographic location, for example a structure, city, county, region, etc. As illustrated in
As described above, metadata associated with sensors 310A-C including location, etc., may be used by sensor data representation module 216 for determining sensor groupings. As illustrated in
In some embodiments, sensors may be grouped together based on data from state change manager 208. Examples of data from state change manager 208 may include alerts of elevated readings received from one or more sensors. In some embodiments, state change manager 208 may determine whether a state of a sensor has changed based on current sensor data or previous sensor data. As an example, sensors 310A-D may have five possible states: calibration, nominal, elevated, potential, or warning. Changes in the state of sensors 310A-D may be determined based on the readings of sensors 310A-D being above a threshold, within or outside of a range, etc. As illustrated in
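One possible way to map a reading onto the reading-driven states named above is a set of upper-bound threshold bands. The band values below are assumptions for illustration only, and the calibration state is treated as set administratively rather than derived from readings:

```python
# Assumed upper-bound thresholds for the reading-driven states; the
# "calibration" state is treated as set administratively, not from readings.
BANDS = [("nominal", 10.0), ("elevated", 50.0), ("potential", 100.0)]

def classify(reading, bands=BANDS):
    """Return the state whose band the reading falls into; anything
    above the last band is treated as "warning". Band values and the
    function name are illustrative, not part of this description."""
    for state, upper in bands:
        if reading <= upper:
            return state
    return "warning"
```

A state change would then correspond to consecutive readings classifying into different states, which state change manager 208 could report as an alert.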
It is appreciated that the grouping of sensors can be used to provide a more accurate and precise picture of events happening. For example, a change of sensor state of a sensor may be caused by a fluke or a blip in a single sensor reading. However, when a sensor captures multiple elevated readings or multiple sensors have elevated readings, there is a higher probability of an event taking place. A change of sensor state of multiple sensors may indicate that an event has occurred that warrants further attention, and sensor data representation module 216 may group sensors 310A-D in response to the elevated readings. As an example, elevated readings from radiation sensors 310A-D with a change of status from nominal to elevated may indicate that radioactive material is present. In some embodiments, the sensor data representation module 216 may automatically identify and group sensors 310A-D together, such that the metadata and sensed data from sensors 310A-D are stored in a data structure of data store 206. As another example, readings from thermal sensors 314A-B within a same area or facility 334 may be grouped together based on change of status from nominal to elevated. The change in status of sensors 314A-B may indicate that a fire or ignition source is present in building 334.
In some embodiments, the grouping of sensors may correspond to an inferred path of a moving radiation source. The heuristics may be based on an inferred time of travel between sensors (e.g., 410C-D), as illustrated in
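The travel-time heuristic might be sketched as a plausibility check between two elevated readings: the second reading is attributed to the same moving source only if the distance between the two sensors could be covered in the elapsed time. The (x, y, t) record shape and the speed bound are assumptions for illustration:

```python
import math

def plausible_path(hit_a, hit_b, max_speed):
    """Return True if the second elevated reading could plausibly be
    the same moving source as the first, i.e., the distance between
    the two sensors is coverable in the elapsed time at `max_speed`.
    The record shape and the speed bound are illustrative."""
    (xa, ya, ta), (xb, yb, tb) = hit_a, hit_b
    distance = math.hypot(xb - xa, yb - ya)
    elapsed = abs(tb - ta)
    return elapsed > 0 and distance / elapsed <= max_speed
```

Sensors whose hits pass this check could then be grouped into an inferred-path grouping.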
As described above, sensors may be grouped based on the metadata of the sensors. In some embodiments, sensor data representation module 216 may group sensors 410A-D in disparate locations based on the type of sensor 412A-D, as illustrated by
In some embodiments, sensor based detection system 102 may create an event to facilitate monitoring the readings of the grouped sensors. Sensor process manager 204 may configure thresholds, ranges, etc. that are compared against sensor readings to determine whether a grouping should be created, as illustrated in
As another example, geographic location 436 may be an airport terminal managed by an airport authority. The airport authority may group motion sensors 422A-C together to monitor activity at airport terminal 436. In some embodiments, sensor based detection system 102 may create an event based on the grouped motion sensors 422A-C of geographic location 436 detecting movement during off-hours, and the event may be sent to the airport authority for subsequent monitoring.
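The off-hours rule might be sketched as a simple filter over motion-detection timestamps (the operating hours, function name, and record shape are illustrative assumptions):

```python
from datetime import time

def off_hours_hits(motion_hits, open_t=time(5, 0), close_t=time(23, 0)):
    """Return the motion-detection timestamps that fall outside assumed
    operating hours; a non-empty result would trigger an event. The
    default hours are illustrative, not part of this description."""
    return [t for t in motion_hits if not (open_t <= t <= close_t)]
```

The resulting event could then be routed to the responsible authority via the messaging system described above.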
As illustrated in
As described above, alerts or readings from the manually grouped sensors 710A-F may then be displayed or sent to a responsible organization as an event. A condition may be applied to the manual grouping of sensors 710A-F, such that an event is triggered based on one or more of the sensors in the group of sensors 710A-F satisfying the condition (e.g., reaching a particular reading level, exceeding a range of reading levels, etc.). According to some embodiments, the conditions may be set manually via a GUI by a user or may be set via heuristics. It is appreciated that the selected sensors 710A-F may be of varying types, each with its own conditions appropriate for the type of sensor 710A-F of the grouping.
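Applying per-type conditions to a manually selected group might be sketched as follows (the condition table and sensor record shapes are illustrative, not part of this description):

```python
def check_group(group, conditions):
    """Evaluate a per-type condition for each sensor in a manually
    selected group and return the ids that satisfy it; a non-empty
    result would trigger an event. Record shapes are illustrative."""
    return [
        sensor["id"]
        for sensor in group
        if conditions[sensor["type"]](sensor["reading"])
    ]
```

Keying the conditions by sensor type lets one grouping mix, for example, radiation and thermal sensors while evaluating each against an appropriate threshold.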
Map view 900 of geographic location 902 may be enlarged or zoomed in to display a map view 1000 of geographic location 902 in more detail. As illustrated in
Map view 1000 may be enlarged or zoomed in to display a map view 1100 of the geographic location in more detail, as illustrated in
As described above, the GUI may also be used to render information in response to a user interaction. As illustrated in
As illustrated in
Additionally, in various embodiments, computing system environment 2100 may also have other features/functionality. For example, computing system environment 2100 may also include additional storage (removable and/or non-removable) including, but not limited to, magnetic or optical disks or tape. Such additional storage is illustrated by removable storage 2108 and non-removable storage 2110. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer readable medium 2104, removable storage 2108 and non-removable storage 2110 are all examples of computer storage media. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, expandable memory (e.g., USB sticks, compact flash cards, SD cards), CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computing system environment 2100. Any such computer storage media may be part of computing system environment 2100.
In some embodiments, computing system environment 2100 may also contain communications connection(s) 2112 that allow it to communicate with other devices. Communications connection(s) 2112 is an example of communication media. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. The term computer readable media as used herein includes both storage media and communication media.
Communications connection(s) 2112 may allow computing system environment 2100 to communicate over various network types including, but not limited to, fibre channel, small computer system interface (SCSI), Bluetooth, Ethernet, Wi-Fi, Infrared Data Association (IrDA), local area networks (LAN), wireless local area networks (WLAN), wide area networks (WAN) such as the internet, serial, and universal serial bus (USB). It is appreciated that the various network types that communication connection(s) 2112 connect to may run a plurality of network protocols including, but not limited to, transmission control protocol (TCP), user datagram protocol (UDP), internet protocol (IP), real-time transport protocol (RTP), real-time transport control protocol (RTCP), file transfer protocol (FTP), and hypertext transfer protocol (HTTP).
In further embodiments, computing system environment 2100 may also have input device(s) 2114 such as keyboard, mouse, a terminal or terminal emulator (either connected or remotely accessible via telnet, SSH, http, SSL, etc.), pen, voice input device, touch input device, remote control, etc. Output device(s) 2116 such as a display, a terminal or terminal emulator (either connected or remotely accessible via telnet, SSH, http, SSL, etc.), speakers, light emitting diodes (LEDs), etc. may also be included. All these devices are well known in the art and are not discussed at length.
In one embodiment, computer readable storage medium 2104 includes a data store 2122, a state change manager 2126, a sensor data representation module 2128, and a visualization module 2130. The data store 2122 may be similar to data store 206 described above and is operable to store data associated with first and second detection sensors according to flow diagrams 1600, 1700, 1900, and 2000, for instance. The state change manager 2126 may be similar to state change manager 208 described above and may be used to determine whether the data of the first and second radiation detection sensors satisfy a certain condition. The sensor data representation module 2128 may be similar to sensor data representation module 216 described above and may operate to group the first radiation detection sensor and the second radiation detection sensor together based on the determination that the data of the first and second radiation detection sensors satisfy the certain condition, as discussed with respect to flows 1600, 1700, 1900, and 2000. The visualization module 2130 is operable to render a portion of the data associated with the first detection sensor, as discussed with respect to flows 1600, 1700, 1900, and 2000.
It is appreciated that implementations according to embodiments of the present invention that are described with respect to a computer system are merely exemplary and not intended to limit the scope of the present invention. For example, embodiments of the present invention may be implemented on devices such as switches and routers, which may contain application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), etc. It is appreciated that these devices may include a computer readable medium for storing instructions for implementing methods according to flow diagrams 1600, 1700, 1900, and 2000.
Bus 2212 allows data communication between central processor 2214 and system memory 2217, which may include read-only memory (ROM) or flash memory (neither shown), and random access memory (RAM) (not shown), as previously noted. The RAM is generally the main memory into which the operating system and application programs are loaded. The ROM or flash memory can contain, among other code, the Basic Input-Output system (BIOS) which controls basic hardware operation such as the interaction with peripheral components. Applications resident with computer system 2210 are generally stored on and accessed via a computer readable medium, such as a hard disk drive (e.g., fixed disk 2244), an optical drive (e.g., optical drive 2240), a floppy disk unit 2237, or other storage medium. Additionally, applications can be in the form of electronic signals modulated in accordance with the application and data communication technology when accessed via network modem 2247 or interface 2248.
Storage interface 2234, as with the other storage interfaces of computer system 2210, can connect to a standard computer readable medium for storage and/or retrieval of information, such as a fixed disk drive 2244. Fixed disk drive 2244 may be a part of computer system 2210 or may be separate and accessed through other interface systems. Network interface 2248 may provide multiple connections to other devices. Furthermore, modem 2247 may provide a direct connection to a remote server via a telephone link or to the Internet via an internet service provider (ISP). Network interface 2248 may provide one or more connections to a data network, which may include any number of networked devices. It is appreciated that connections via network interface 2248 may be made through a direct connection to a remote server or through a direct network link to the Internet via a POP (point of presence). Network interface 2248 may provide such connection using wireless techniques, including digital cellular telephone connection, Cellular Digital Packet Data (CDPD) connection, digital satellite data connection or the like.
Many other devices or subsystems (not shown) may be connected in a similar manner (e.g., document scanners, digital cameras and so on). Conversely, all of the devices shown in
Moreover, regarding the signals described herein, those skilled in the art will recognize that a signal can be directly transmitted from a first block to a second block, or a signal can be modified (e.g., amplified, attenuated, delayed, latched, buffered, inverted, filtered, or otherwise modified) between the blocks. Although the signals of the above described embodiment are characterized as transmitted from one block to the next, other embodiments of the present disclosure may include modified signals in place of such directly transmitted signals as long as the informational and/or functional aspect of the signal is transmitted between blocks. To some extent, a signal input at a second block can be conceptualized as a second signal derived from a first signal output from a first block due to physical limitations of the circuitry involved (e.g., there will inevitably be some attenuation and delay). Therefore, as used herein, a second signal derived from a first signal includes the first signal or any modifications to the first signal, whether due to circuit limitations or due to passage through other circuit elements which do not change the informational and/or final functional aspect of the first signal.
The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings.
This application claims priority to U.S. patent application Ser. No. 14/336,994, entitled “SENSOR GROUPING FOR A SENSOR BASED DETECTION SYSTEM”, filed Jul. 21, 2014, which is incorporated by reference herein. This application is a continuation in part of U.S. patent application Ser. No. 14/281,896, entitled “SENSOR BASED DETECTION SYSTEM”, by Joseph L. Gallo et al. (Attorney Docket No. 13-012-00-US), filed May 20, 2014, which is incorporated herein by reference. This application is a continuation in part of U.S. patent application Ser. No. 14/281,901, entitled “SENSOR MANAGEMENT AND SENSOR ANALYTICS SYSTEM,” by Joseph L. Gallo et al. (Attorney Docket No. 13-013-00-US), filed May 20, 2014, which is incorporated herein by reference. This application is a continuation in part of U.S. patent application Ser. No. 14/315,286, entitled “METHOD AND SYSTEM FOR REPRESENTING SENSOR ASSOCIATED DATA”, by Joseph L. Gallo et al. (Attorney Docket No. 13-014-00-US), filed Jun. 25, 2014, which is incorporated herein by reference. This application is a continuation in part of U.S. patent application Ser. No. 14/315,289, entitled “METHOD AND SYSTEM FOR SENSOR BASED MESSAGING”, by Joseph L. Gallo et al. (Attorney Docket No. 13-015-00-US), filed Jun. 25, 2014, which is incorporated herein by reference. This application is a continuation in part of U.S. patent application Ser. No. 14/315,317, entitled “PATH DETERMINATION OF A SENSOR BASED DETECTION SYSTEM”, by Joseph L. Gallo et al. (Attorney Docket No. 13-016-00-US), filed Jun. 25, 2014, which is incorporated herein by reference. This application is a continuation in part of U.S. patent application Ser. No. 14/315,320, entitled “GRAPHICAL USER INTERFACE OF A SENSOR BASED DETECTION SYSTEM”, by Joseph L. Gallo et al. (Attorney Docket No. 13-017-00-US), filed Jun. 25, 2014, which is incorporated herein by reference. This application is a continuation in part of U.S. patent application Ser. No. 
14/315,322, entitled “GRAPHICAL USER INTERFACE FOR PATH DETERMINATION OF A SENSOR BASED DETECTION SYSTEM” by Joseph L. Gallo et al. (Attorney Docket No. 13-018-00-US), filed Jun. 25, 2014, which is incorporated herein by reference. This application is a continuation in part of U.S. patent application Ser. No. 14/281,904, entitled “EVENT MANAGEMENT FOR A SENSOR BASED DETECTION SYSTEM”, by Joseph L. Gallo et al. (Attorney Docket No. 13-020-00-US), filed May 20, 2014, which is incorporated herein by reference. This application is a continuation in part of U.S. patent application Ser. No. 14/284,009, entitled “USER QUERY AND GAUGE-READING RELATIONSHIPS”, by Ferdinand E. K. de Antoni (Attorney Docket No. 13-027-00-US), filed May 21, 2014, which is incorporated herein by reference. This application is related to Philippines Patent Application No. 1/2013/000136, entitled “A DOMAIN AGNOSTIC METHOD AND SYSTEM FOR THE CAPTURE, STORAGE, AND ANALYSIS OF SENSOR READINGS”, by Ferdinand E. K. de Antoni (Attorney Docket No. 13-027-00-PH), filed May 23, 2013, which is incorporated herein by reference. This application is a continuation in part of U.S. patent application Ser. No. 14/337,012, entitled “DATA STRUCTURE FOR SENSOR BASED DETECTION SYSTEM”, by Joseph L. Gallo et al. (Attorney Docket No. 13-022-00-US), filed Jul. 21, 2014, which is incorporated herein by reference.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/US15/31835 | 5/20/2015 | WO | 00 |
Number | Date | Country | |
---|---|---|---|
Parent | 14281901 | May 2014 | US |
Child | 15312618 | US | |
Parent | 14281896 | May 2014 | US |
Child | 14281901 | US | |
Parent | 14281904 | May 2014 | US |
Child | 14281896 | US | |
Parent | 14284009 | May 2014 | US |
Child | 14281904 | US | |
Parent | 14315286 | Jun 2014 | US |
Child | 14284009 | US | |
Parent | 14315317 | Jun 2014 | US |
Child | 14315286 | US | |
Parent | 14315289 | Jun 2014 | US |
Child | 14315317 | US | |
Parent | 14315322 | Jun 2014 | US |
Child | 14315289 | US | |
Parent | 14315320 | Jun 2014 | US |
Child | 14315322 | US | |
Parent | 14337012 | Jul 2014 | US |
Child | 14315320 | US | |
Parent | 14336994 | Jul 2014 | US |
Child | 14337012 | US |