As recording equipment, computers, and other monitoring and location management solutions have become more widely available and affordable, businesses and individuals have adopted the use of this equipment for location monitoring and activity management. This activity monitoring may be pursued for a wide variety of reasons, including, for example, security, loss prevention, employee observation, quality assurance, and other purposes.
The increased availability and implementation of activity monitoring devices has brought about conditions where the amount of information collected by these devices is greater than the average user can effectively browse and manage. Sufficient personnel may not be available to monitor (or capable of monitoring) the large number of live recordings from these devices as they happen, and the volume of information may be difficult to review after the fact, even when played back at higher speeds.
In certain types of businesses, thefts (including employee thefts) are an unfortunately frequent occurrence, and activity monitoring equipment is used to investigate and prevent these events. However, even when owners or managers know what happened during a theft, they must often document the occurrence by referencing the monitoring equipment recordings, often across many devices' simultaneous feeds. A significant time investment is required to diligently search for and find the events of interest to property owners.
According to at least one embodiment, a computer-implemented method for monitoring activity at a location is provided. The method may comprise receiving a first data signal and a second data signal, wherein the first and second data signals are related to activities detected at a location, and reproducing the first and second data signals in a chronology. The chronology may indicate relative timing of receipt of the first and second data signals via a user interface. The first data signal may include output from a first activity detector, and the second data signal may include output from a second activity detector. Each of the activity detectors may detect activity at the location. The two activity detectors may transduce first and second types of activity that are different from each other. At least one of the first and second activity detectors may transduce at the location light, sound pressure, and/or movement. In some cases, at least one of the activity detectors may detect a radio signal emitted from a portable device at the location, or may detect interaction of a user with a device at the location. The chronology may include a visual representation of the first and second data signals.
The method may also comprise steps of determining that a threshold condition has been met by at least one of the first and second data signals, and indicating that the threshold condition has been met by the at least one of the first and second data signals in the chronology. Additionally, the method may include receiving a time indicator via the user interface and displaying the chronology for the time indicator via the user interface. The method may also include receiving a time indicator via the user interface and transmitting at least a portion of the first and second data signals received at the time indicator.
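The chronology and threshold behavior described above can be illustrated with a simple data structure. The Python sketch below is purely illustrative — the class and method names are assumptions, not part of the described system — but it shows how time-stamped samples from multiple data signals might be stored, flagged against a threshold condition, and sliced for a received time indicator:

```python
from dataclasses import dataclass, field

@dataclass
class Sample:
    timestamp: float  # e.g., seconds since epoch
    value: float      # normalized activity level from a detector

@dataclass
class Chronology:
    """Holds time-stamped samples per signal so relative timing of receipt
    of the signals can be compared and displayed."""
    signals: dict = field(default_factory=dict)

    def record(self, signal_id: str, timestamp: float, value: float) -> None:
        self.signals.setdefault(signal_id, []).append(Sample(timestamp, value))

    def flagged(self, signal_id: str, threshold: float) -> list:
        """Timestamps at which the signal met the threshold condition."""
        return [s.timestamp for s in self.signals.get(signal_id, [])
                if s.value >= threshold]

    def window(self, start: float, end: float) -> dict:
        """Slice every signal to the samples falling in [start, end],
        e.g., in response to a time indicator received via a user interface."""
        return {sid: [s for s in samples if start <= s.timestamp <= end]
                for sid, samples in self.signals.items()}
```

A reproduction step would then iterate over `window(...)` to draw each signal's samples on a shared time axis.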
In some embodiments, the method may further comprise receiving a plurality of data signals related to activities detected at the location, identifying a third data signal in the plurality of data signals, and reproducing the third data signal in the chronology. In this case, the chronology may indicate relative timing of receipt of the third data signal relative to the first and second data signals via the user interface. This method may also include steps of receiving a time indicator via the user interface, wherein the third data signal is identified in the plurality of data signals based on activity detected in the third data signal at the time indicator. The third data signal may also be identified in the plurality of data signals based on activity in the third data signal corresponding with activity in at least one of the first and second data signals. The third data signal may also be identified in the plurality of data signals based on being related to at least one of the first and second data signals.
According to another aspect of the invention, a computer program product for monitoring activity at a location may be provided. The computer program product may comprise a non-transitory computer-readable medium storing instructions executable by a processor to receive a first data signal and a second data signal, the first and second data signals being related to activities detected at a location; and reproduce the first and second data signals in a chronology, wherein the chronology indicates relative timing of receipt of the first and second data signals via a user interface. The first data signal may comprise output from a first activity detector and the second data signal may comprise output from a second activity detector, wherein the first and second activity detectors each detect activity at the location. In another embodiment, the instructions are further executable to receive a plurality of data signals related to activities detected at the location, identify a third data signal in the plurality of data signals, and reproduce the third data signal in the chronology. Here, the chronology may indicate relative timing of receipt of the third data signal relative to the first and second data signals via the user interface.
In another aspect of the invention, an apparatus for monitoring activity at a location may be provided which comprises a processor, a memory in electronic communication with the processor, and instructions stored in the memory. The instructions may be executable by a processor to receive a first data signal and a second data signal, the first and second data signals being related to activities detected at a location; and reproduce the first and second data signals in a chronology. The chronology may indicate relative timing of receipt of the first and second data signals via a user interface. The first data signal may comprise output from a first activity detector, and the second data signal may comprise output from a second activity detector, wherein the first and second activity detectors each detect activity at the location. In the apparatus, the instructions may be further executable to receive a plurality of data signals related to activities detected at the location, identify a third data signal in the plurality of data signals, and reproduce the third data signal in the chronology. In this case, the chronology may indicate relative timing of receipt of the third data signal relative to the first and second data signals via the user interface.
The foregoing and other features, utilities and advantages of the invention will be apparent from the following more particular description of a preferred embodiment of the invention as illustrated in the accompanying drawings.
The accompanying drawings and figures illustrate a number of exemplary embodiments and are part of the specification. Together with the present description, these drawings demonstrate and explain various principles of this disclosure. A further understanding of the nature and advantages of the present invention may be realized by reference to the following drawings. In the appended figures, similar components or features may have the same reference label.
While the embodiments described herein are susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and will be described in detail herein. However, the exemplary embodiments described herein are not intended to be limited to the particular forms disclosed. Rather, the instant disclosure covers all modifications, equivalents, and alternatives falling within the scope of the appended claims.
According to embodiments described herein, systems and methods are provided for monitoring activities at a location which may include activity sensors or other detecting devices that are in communication with a computing device configured to receive their data signals and reproduce them in a chronology. In most cases, existing activity monitoring review solutions lack time indicators (e.g., in audio feeds), and this makes the search for an event and the analysis of the various feeds a tedious and time-consuming task. However, the chronology produced in the present systems may at least visually indicate relative timing of receipt or recording of the data signals received, thereby allowing review of many device feeds in parallel and in context of other feeds.
Some vendors provide detection of events in feeds of specific types of monitoring equipment. For example, a camera's video feeds may be monitored for movement in the field of view of the camera, and when activity is detected, it can be marked in software to bring it to the attention of a later reviewer. However, this technology may fail outright to detect or flag events that take place outside the camera's field of view, even when other sensors are present in the area that the camera monitors. The present systems may integrate in a chronology the output of video recording devices with non-video sensors and detectors (such as motion sensors, microphones, pressure sensors, position sensors, and other devices) to provide context, correlate activity detected by each sensor, and give a clearer perspective of events that happen out of view of a camera or enhance the information that is provided by a camera.
The present systems and methods may analyze the data signals collected by the sensors to automatically detect events that are likely to be significant to a reviewer, thereby accelerating review and understanding of significant events. For instance, the system may determine that a number of different related sensors simultaneously sense a level of activity exceeding a threshold activity level and either flag that incident for later review or generate and transmit an alert to a reviewer to closely review the events that triggered the alert. Information from video or audio feeds, the time in which the sensors are triggered, the employees on duty at that time, and other factors may be part of the processes by which the most significant events and activities may be detected and logged. The alert may comprise a transmitted signal that presents live or recorded sensor information to a remote user using a user interface such as, for example, a smartphone, tablet, or other computing device.
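The multi-sensor alert condition described above — several related sensors simultaneously sensing activity above a threshold level — can be sketched as a minimal check. The function name and signature here are illustrative assumptions, not part of the described system:

```python
def concurrent_alert(readings, threshold, min_sensors=2):
    """Return the sensors reporting activity above `threshold` at one
    instant, but only when at least `min_sensors` of them do so together;
    otherwise return an empty list (no alert).

    `readings` maps sensor id -> activity level sampled at the same moment."""
    over = [sid for sid, level in readings.items() if level > threshold]
    return over if len(over) >= min_sensors else []
```

A non-empty result could then be used either to flag the incident for later review or to trigger transmission of an alert to a reviewer's device.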
In some embodiments, the chronology is played back for a viewer via a user interface. For example, the chronology may be played back as a timeline showing time-correlated activity of various sensors at the location. The user interface may comprise controls by which the user may identify which sensor information will be most relevant, the time period to display, and other desired settings. The systems and methods herein may intelligently determine what sensor information to display based on other sensors being concurrently triggered, user input, or other relevant factors, as described further herein.
The present description provides examples, and is not limiting of the scope, applicability, or configuration set forth in the claims. Thus, it will be understood that changes may be made in the function and arrangement of elements discussed without departing from the spirit and scope of the disclosure, and various embodiments may omit, substitute, or add other procedures or components as appropriate. For instance, the methods described may be performed in an order different from that described, and various steps may be added, omitted, or combined. Also, features described with respect to certain embodiments may be combined in other embodiments.
Referring now to the figures in detail,
The computing device 105 may comprise a device enabled for data computation, including a device such as a computer or server. In some embodiments the computing device 105 may be a mobile computing device such as, for instance, a notebook computer, portable phone, tablet, or related device. The computing device 105 may comprise a processor, memory, and other computer components, as described in further detail in connection with the example of
The user interface 110 may comprise a software and/or hardware interface by which a user may interact with or receive information from the computing device 105. Therefore, the user interface 110 may comprise a display (e.g., monitor or other screen) and other output devices, and may comprise input devices (e.g., a keyboard or touch screen sensor). The user interface may preferably comprise a display capable of reproducing video recordings and speakers capable of reproducing audio recordings. Other components that may be part of the user interface 110 are discussed below in connection with
The sensor interface 115 may comprise electronics for interfacing with the sensors 130 from which the computing device 105 receives data signals. Thus, the sensor interface 115 may comprise a device for wireless or wired connection to one or more sensors 130 such as an analog-digital conversion interface, wired IP interface (e.g., Ethernet), or a wireless interface such as Bluetooth®, Wi-Fi, or another system for wireless exchange of information collected by the sensors 130. In some embodiments, the sensor interface 115 may support multiple connection types, such as, for example, a wired interface connectable to a first sensor 130-a-1 and a radio frequency identification (RFID) interface for receiving information from a second sensor 130-a-2. A plurality of different connection types may be supported based on the needs of the device owner and the number and types of activities monitored at the location where the device 105 is implemented.
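The multi-connection-type behavior of the sensor interface 115 can be sketched as a dispatch table of per-connection readers. The class, the connection-type keys, and the handler shape below are illustrative assumptions, not part of the described system:

```python
class SensorInterface:
    """Dispatches reads to a handler registered per connection type
    (e.g., a wired reader for one sensor and an RFID reader for another)."""

    def __init__(self):
        self._readers = {}

    def register(self, connection_type, reader):
        """`reader` is a callable taking a sensor id and returning its output."""
        self._readers[connection_type] = reader

    def read(self, connection_type, sensor_id):
        reader = self._readers.get(connection_type)
        if reader is None:
            raise ValueError(f"no reader for connection type {connection_type!r}")
        return reader(sensor_id)
```

New connection types could then be supported by registering additional readers, matching the idea that the supported connection types depend on the needs of the device owner.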
The application module 120 may be resident on the device 105 and capable of receiving, analyzing, and displaying data signals received via the sensor interface 115 using the user interface 110. The application module 120 may, for example, receive a video feed from a first sensor 130-a-1 via the sensor interface 115, store the feed using the storage module 125, and display the feed using the user interface 110. Additional functions of an application module 120 are described in connection with
The storage module 125 may comprise hardware and software components for storing, indexing, and producing recorded data signals from the sensors 130. The storage module 125 may, for example, comprise a disk or computer memory device capable of writing and producing digital signals such as audio recordings, digital photographs and videos, sensor readings, related timing data, and other data signals used in monitoring activity at the location of the sensors 130.
The sensors 130 may include any sensors for monitoring specific or general activities at a location of interest. Such sensors are contemplated to include at least video cameras, light sensors, motion sensors, microphones and other audio-sensitive equipment, pressure-sensitive pads and load cells, position detectors (e.g., door position detectors or door lock position detectors), metal detectors, wireless device detectors such as radio frequency (RF) antennas (e.g., radio frequency identification (RFID), Wi-Fi, Bluetooth®, Zigbee®, near field communications (NFC), cellular (or other wireless) telecommunications devices, etc.), and like devices. The sensors 130 may be configured to output signals readable by the computing device 105 via the sensor interface 115. Preferably, the sensor output is stored and recorded by the storage module 125 in parallel with timing information that allows the recorded signal output of one sensor 130 to be later referenced relative to other signals collected simultaneously. Sensors 130 may each be one or more of many different types of detecting devices, and the sensor interface 115 may be permitted to interact with these different types of devices.
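Storing sensor output in parallel with timing information, so that one sensor's recorded signal can later be referenced relative to signals collected simultaneously, can be sketched as follows. The class and method names are illustrative assumptions, not part of the described system:

```python
import bisect
import time

class RecordedSignal:
    """Stores one sensor's output alongside capture timestamps so the
    recording can later be cross-referenced against other signals
    collected at the same time."""

    def __init__(self):
        self.timestamps = []  # kept sorted: samples are appended in time order
        self.values = []

    def append(self, value, timestamp=None):
        self.timestamps.append(time.time() if timestamp is None else timestamp)
        self.values.append(value)

    def value_at(self, timestamp):
        """Latest recorded value at or before `timestamp`, or None if the
        recording had not started yet."""
        i = bisect.bisect_right(self.timestamps, timestamp)
        return self.values[i - 1] if i else None
```

Calling `value_at` with the same timestamp on several `RecordedSignal` instances yields the time-aligned view a chronology needs.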
The system 100 provided in
The application module 120-a may be configured substantially similar to application module 120 of
The network interface 210 may comprise a wired or wireless network communications interface capable of linking to the network 212 of sensors and other network connected devices (e.g., a database 235). The network interface 210 may also be configured to direct network signals to the sensor interface 115, application module 120-a, and user interface 110. The network 212 may comprise a computer network such as, for example, a local area network (LAN), wide area network (WAN) (e.g., the Internet), or other comparable network. The network 212 may be configured to support a common communication standard used by the network interface 210 and all other devices connected to the network 212. In some embodiments, the network 212 may support multiple communication or transmission standards. For example, some sensors may be wired to the network 212 and others may be wirelessly connected, or sensors may connect via multiple wireless standards, such as some connecting via Wi-Fi and others connecting via a telecommunications standard such as long term evolution (LTE) or Global System for Mobile Communications (GSM). Thus, the network 212 may link a plurality of different types of sensors and other network-attached devices to the computing device 105-a to provide data signals. In some embodiments, the computing device 105-a may also provide commands to sensors 130-b or to the database 235 or storage module 125-a via the network 212. For example, the computing device 105-a may direct a motorized camera sensor to change position, zoom, or perform another action via the network 212.
The database 235 may collect the information from the plurality of sensors 130-b. Output of the sensors may be constantly provided and stored at the database 235, or may be recorded when sensor output signals trigger a predetermined minimum threshold. The database 235 may be located remotely from the computing device 105-a and the sensors 130-b, in which case the system 200 may provide that the computing device 105-a accesses the database 235 through the network 212. In other embodiments, the database 235 may be local or connected to the device 105-a. In some arrangements, the computing device 105-a may display sensor information collected by the database 235. The computing device 105-a may be granted access to specific sensors' recorded output, or may be sent all sensors' information simultaneously by the storage module 125-a.
A storage module 125-a running at the database 235 may manage the storage and recording of sensor output, such as by indexing the information or managing the amount of information stored at one time, and the storage module 125-a may also provide a server function to grant the computing device 105-a access to requested files and other information stored by the database 235. Thus, the storage module 125-a may act as a storage controller or network controller for system 200. In some embodiments, the storage module 125-a may organize data collected from the sensors into zones or regions (e.g., by zones 240, 245). The method of organization of the data may be directed by the application module 120-a according to the needs of the user of the system 200. In one example, the device 105-a may be coupled directly with the database 235, and the database 235 may be internal or external to the device 105-a.
Receiving module 305 may be configured to receive data signals, such as data signals output from various sensors in the system in which the module 300 is operating. For example, the receiving module 305 may manage the communications sent or received between a device (e.g., computing device 105) and a sensor (e.g., sensors 130) in the system (e.g., systems 100, 200). The receiving module 305 may be enabled to drive and control a sensor interface (e.g., sensor interface 115) and/or a network interface (e.g., network interface 210) to receive these signals and properly route them to other modules or components. Certain data signals may be forwarded by the receiving module 305 to other components such as a storage module 125 or database 235.
An application module 120-b having monitoring module 310 may be configured to access and monitor data signals received by receiving module 305 or stored by a database connected to the device having the application module 120-b. Not all embodiments may have a monitoring module 310. The monitoring module 310 may permit the application module 120-b to determine the type of data signal received, its point of origin, timing information, and other information related to the signal. In some embodiments, the monitoring module 310 may monitor the signals by periodically measuring the signal data. This may allow the application module 120-b to detect events based on the status of the sensors being monitored using the detection module 315.
Detection module 315 may be configured to work in conjunction with the monitoring module 310 and receiving module 305 to measure the output of sensors. In this manner, the application module 120-b may determine when sensor data conditions meet predetermined conditions. For example, the output of a microphone sensor may be received by the receiving module 305 and monitored by the monitoring module 310. The microphone may detect ambient sounds (e.g., sounds of a refrigerator, HVAC, etc.) at a certain level, and the detection module 315 may measure the signals from the microphone as being in a first condition at that point. If the microphone output increases beyond a predetermined threshold (e.g., exceeds a predetermined decibel level for a certain amount of time), the detection module 315 may register the unusual activity at a second level and, in conjunction with the interface module 320 and/or transmission module 325, may generate an alert or highlight the microphone output for a user of the system in which the application module 120-b is operating. In some embodiments, receiving any signal from a sensor may be a triggering condition for an alert or heightened activity to be indicated. In other cases, the detection module 315 may be used to detect when various signals are received coincident with each other, such as when multiple sensors in a small area (e.g., zones 240, 245) report notable activity.
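The "exceeds a predetermined decibel level for a certain amount of time" check described above can be sketched as a small function. The name and the (timestamp, decibel) sample format are illustrative assumptions, not part of the described system:

```python
def sustained_threshold(samples, level_db, min_duration):
    """Return True when consecutive samples exceed `level_db` continuously
    for at least `min_duration` seconds.

    `samples` is a time-ordered list of (timestamp_seconds, decibels)."""
    run_start = None  # timestamp at which the current above-threshold run began
    for t, db in samples:
        if db > level_db:
            if run_start is None:
                run_start = t
            if t - run_start >= min_duration:
                return True
        else:
            run_start = None  # run broken; start over
    return False
```

A detection module could apply such a check periodically and, on a True result, register the signal as being in its second (unusual-activity) condition.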
Interface module 320 may be configured to display signal data collected by the receiving module 305 by generating visual and audible elements at a user interface (e.g., user interface 110 and display 205). The interface module 320 may also be configured to receive input via the user interface and control how the data signals are reproduced by the application module 120-b. In some embodiments, the interface module 320 generates a chronology showing a comparative view of multiple data signals based on the time in which they were collected, as described and depicted in greater detail in connection with
The transmission module 325 is an optional part of the application module 120-b that may interact with other portions of the application module 120-b and components of a computing device 105 on which the application module 120-b is resident in order to receive and send transmissions from the computing device. The transmissions managed by the transmission module 325 may comprise control signals for sensors in the network (e.g., network 212), alerts at the device on which the application module 120-b is operating, and/or alerts to a remote device. In one example embodiment, when the detection module 315 determines that a data signal is detected as being significant, such as when activity is detected in a secure area of the location at which activity is monitored, the transmission module 325 may be configured to prepare an alert to security personnel to review the event. The transmission module 325 may also be used to send a portion of the suspicious data signal to security personnel for immediate review. In another embodiment, certain conditions may cause the transmission module 325 to immediately trigger real-time monitoring of the data signal by transmitting it to a device. Thus, the transmission module 325 may be used to accelerate a process of identification and response to events of interest at the location being monitored.
In the location 400 of
Information may be collected and stored from each of the sensors and recording devices at the location 400. That information may then be accessed through a user interface to compare data signals collected from some sensors with others and to identify data signals that may be unusual or of other particular interest (see also
The system at the location 400 may also comprise means for tracking individuals such as employees who are in the building. For example, an employee time clock 436 may also be part of the system. The checking in and out of employees via the time clock 436 may be logged by the system and correlated with other information collected by the sensors. In another example, employees may be required to carry RFID devices while in the building, and the RFID reader 430 may log whether a certain RFID device is in the building at certain times. In other embodiments, one or more RFID readers may be implemented to estimate the areas of the building in which the RFID devices are detected. In such cases, employees on duty may be questioned regarding events taking place at the location 400 at a time of interest, such as when a theft occurred.
In another example embodiment, the business operator may observe and collect data regarding the quality of customer service at the location 400. The operator may therefore monitor video feeds that indicate the amount of activity outside the front door 412, how long persons stand waiting on a pressure pad 428 near the front of the building, whether customers are using the Wi-Fi 432, what customers are talking about (as recorded by audio recording equipment 424), etc. This information may be correlated in the chronology provided by the central control computer 434 to narrow down the time required to review hours of recorded data by bringing specific instances to the attention of the reviewing individual. For example, the reviewer may be alerted to instances where loud noises are detected by the audio recording equipment 424 near the front door, the pressure pad 428 at the front door is triggered, and then the door 412 is opened and movement is detected in the outside video feed 418 within a short time. This sequence of events may indicate that a customer was upset and left the building, so such events may be quickly directed to the attention of the reviewer. The reviewer may then contact employees on duty at the time, as indicated by the time clock 436 or RFID tags active in the building.
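The ordered sequence of sensor events described above (loud noise, then pressure pad, then door opening, then outside movement, all within a short time) can be sketched as a pattern match over a time-sorted event log. The function, event-type strings, and tuple format are illustrative assumptions, not part of the described system:

```python
def match_sequence(events, pattern, max_span):
    """True if the event types in `pattern` occur in order, all within
    `max_span` seconds of the first matching event.

    `events` is a time-sorted list of (timestamp_seconds, event_type)."""
    for i, (t0, etype) in enumerate(events):
        if etype != pattern[0]:
            continue
        if len(pattern) == 1:
            return True
        idx = 1  # next pattern element to find
        for t, e in events[i + 1:]:
            if t - t0 > max_span:
                break  # window expired; try a later starting event
            if e == pattern[idx]:
                idx += 1
                if idx == len(pattern):
                    return True
    return False
```

A library of such patterns could encode the combinations of sensor readings that the reviewer considers significant, with matches surfaced in the chronology.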
Other such combinations of sensor readings may be devised to identify other events that are likely to be of interest to the reviewer. By detecting these events and showing them to the reviewer in a chronology, he or she may quickly sift through hours of recordings that are not important and gain a clear understanding of how events unfolded by comparing the data signals next to each other immediately.
The location 400 may be divided into zones, such as a front zone comprising the front room 402 and including front room cameras 418 and other sensors, a back zone comprising the back room 404 and including back room cameras 420 and other sensors, and a supply room zone comprising the supply room 406 and having the supply room camera 422 and other sensors. These zones may be referenced by an application module (e.g., application module 120) to index which information may be most valuable to a reviewing person. For example, if motion is detected by one of the motion detectors 426 at an unusual time, the application module may be configured to display all other sensor data collected from the zone in which the motion detector 426 is located in the chronology.
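The zone-based indexing described above — motion in one detector causing all other sensor data from its zone to be displayed — can be sketched as a simple lookup. The zone names and sensor identifiers below are hypothetical, chosen only for illustration:

```python
# Hypothetical mapping of zones to the sensors they contain.
ZONES = {
    "front": ["front-camera", "front-motion", "front-pressure-pad"],
    "back": ["back-camera", "back-motion", "back-door-lock"],
    "supply": ["supply-camera", "supply-motion"],
}

def related_sensors(triggered_sensor, zones=ZONES):
    """Return the other sensors sharing a zone with the triggered sensor,
    so their data can be pulled into the chronology alongside it."""
    for zone, sensors in zones.items():
        if triggered_sensor in sensors:
            return [s for s in sensors if s != triggered_sensor]
    return []
```

On an unusual-time motion event, an application module could call `related_sensors` and add each returned sensor's feed to the displayed chronology.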
To illustrate how the chronology 500 may be used, a user may access the chronology 500 and indicate a time to display on the timeline 508. In some cases, the system (e.g., an application module 120) may suggest a portion of the timeline that shows potentially interesting activity. In this case, the user may have selected 2:30 p.m. to 9:30 p.m. on January 15. The chronology 500 may therefore be populated with data signals from sensors that have correlative activities in that time period. For example, the chronology 500 may display employee clock records 512, 514, back door lock position activity 516, and back room and supply room sensor activity 518, 520. Also, the recordings captured by the front room camera 502, back door camera 504, and supply room camera 506 may be selected and shown. These data signals may be correlated by a type, intensity, or pattern in the information collected from them during the selected time period, or by being associated with each other in a zone or other location designation.
In this example, one employee clocked in at 2:30 p.m. and clocked out at 5:30 p.m., and a second employee clocked in at 5:00 p.m. and clocked out at 8:00 p.m. Thus, the time clock is used as an activity detector based on interaction of a user with the device at the location. The back door lock signal 516 shows that it was locked since 1:16 p.m. and unlocked between 5:30 p.m. and 7:00 p.m. During that same time, motion sensors in the back room and supply room detected sequential activity. In this example, the width of each sensor's bar indicates the length of time that the signal was detected. Thus, this visualization of the data signals 512 through 520 allows the user to quickly and easily compare the data and the times at which the data was collected, and to come to conclusions about the meaning of the data collected.
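The bar-style visualization described above — where a bar's width indicates how long a signal was detected — can be sketched as a plain-text renderer. The function name and output format are illustrative assumptions, not part of the described system:

```python
def render_row(label, intervals, start, end, width=40):
    """Draw one chronology row as text, marking '#' where the signal was
    active so the bar's width reflects the duration of detection.

    `intervals` is a list of (t_on, t_off) pairs within [start, end]."""
    scale = width / (end - start)       # characters per second
    cells = [" "] * width
    for t_on, t_off in intervals:
        a = max(0, int((t_on - start) * scale))
        b = min(width, max(a + 1, int((t_off - start) * scale)))
        for i in range(a, b):
            cells[i] = "#"
    return f"{label:<12}|{''.join(cells)}|"
```

Rendering one such row per data signal, all sharing the same `start` and `end`, produces a stacked timeline in which durations and overlaps can be compared at a glance.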
Here, the user may quickly determine that the second employee was on duty while the back door was unlocked and there was activity in the supply room via the back room, so the employee may be interviewed about the activity recorded. The user may also control playback of the video feeds 502, 504, 506 over the time period indicated (e.g., by moving the indicator 510 to a specified position) and view the camera recordings to see whether the cameras recorded other notable activity not indicated in the timeline 508. Additionally, the user may be capable of customizing the information presented in the timeline 508 through use of the controls 532 or other settings. For example, the employee information may be color-coded or shaped differently to improve readability and to differentiate between types of employees. In at least one example, an employee's information may be coded to indicate whether the employee is normally permitted access in a certain area or at a certain time. In this way, unusual activity can be quickly brought to the attention of the reviewer. In some embodiments, the user may also be able to indicate data signals to save or transmit to another device, such as a mobile device.
Playback and search controls 704 may also be associated with the chronology 700 to improve user experience and interaction with the chronology 700. Individual sensor data may also have playback controls 706 to quickly browse through sensor data associated with a specific sensor. This chronology 700 also shows how rows shown in the timeline 508 may be changed (e.g., the different employees shown 708, 710). These changes may be manually made by the reviewer or may be selected by the computing device (e.g., using an application module 120) based on an algorithm identifying relevant data signals to display, such as sensors being in the same zone or group, or sensor information following a predetermined or unusual pattern, as described elsewhere herein.
In some alternative embodiments, the video feeds may be represented in the timeline 508. A video feed may be represented by periodic frames from the video feed displayed in the timeline rows, such as frames where movement is detected. This may allow the user to visually scan through a large amount of video information without playing it back in whole.
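The frame-selection idea above — periodic frames plus frames where movement is detected — can be sketched as a simple filter. The function name, the motion-score input, and the default values are illustrative assumptions, not part of the described system:

```python
def keyframes(frames, motion_scores, threshold=0.5, stride=30):
    """Pick every `stride`-th frame plus any frame whose motion score
    exceeds `threshold`, so a long feed can be scanned visually in the
    timeline without playing it back in whole.

    `frames` and `motion_scores` are parallel lists of equal length."""
    picked = []
    for i, score in enumerate(motion_scores):
        if i % stride == 0 or score > threshold:
            picked.append(frames[i])
    return picked
```

The selected frames could then be laid out left-to-right in the video feed's timeline row, with movement-triggered frames naturally clustering around events of interest.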
In some arrangements, the device or module may also perform block 815, wherein the device or module determines that a threshold condition has been met by at least one of the first and second data signals and then indicates in the chronology that the threshold condition has been met. The threshold condition may relate to the magnitude or frequency of one of the data signals. For example, the threshold condition may be reached if an audio recording device registers a sound level above a certain decibel level, or if activity at a motion detector is detected too frequently during a typically quiet time. In the chronology, the displayed sensor data may indicate that the threshold condition has been reached by various means, such as, for example, a highlighted color, shape, or icon, an audible alert, or another similar indication. In another example, a person or device may be detected at the location at an unusual time or in an unusual place. The indication may comprise an alert transmitted to a user, such as an email or Short Message Service (SMS) message informing him or her of the recorded threshold-reaching conditions. Bringing threshold conditions to users' attention may accelerate their browsing of recorded data signals or surface information that could otherwise be missed.
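The magnitude and frequency checks of block 815 could be sketched as follows; this is an illustrative assumption, and the function names, sample formats, and limits are hypothetical rather than drawn from the disclosure.

```python
# Hypothetical sketch of block 815's threshold checks: a magnitude
# threshold (e.g., a decibel limit) and a frequency threshold (too many
# events within a sliding time window).

def exceeds_magnitude(samples, limit):
    """True if any (time, value) sample exceeds `limit` (e.g., dB level)."""
    return any(value > limit for _, value in samples)

def exceeds_frequency(event_times, window, max_events):
    """True if more than `max_events` events fall within any
    `window`-second span (e.g., motion detected too frequently)."""
    times = sorted(event_times)
    for i, start in enumerate(times):
        count = sum(1 for t in times[i:] if t - start <= window)
        if count > max_events:
            return True
    return False

audio = [(0.0, 55), (1.0, 92), (2.0, 60)]   # (time, dB) samples
motion = [10, 11, 12, 13, 300]              # motion event timestamps (s)
print(exceeds_magnitude(audio, limit=85))                  # True
print(exceeds_frequency(motion, window=5, max_events=3))   # True
```

When either check returns true, the chronology could mark the corresponding span (e.g., with a highlighted color or icon) or trigger an outbound alert.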
In some embodiments, the process 800 may include block 820, wherein the device or module may receive a time indicator via the user interface and display the chronology for the indicated time. The time indicator may be input by the user in response to a prompt from the user interface, or may be an unprompted input from the user, such as selecting a time in a timeline (e.g., using the indicator 510 to select a time or span of time in timeline 508). The chronology displayed for the time indicator may present the various sensors and detectors that were active at the indicated time. In some cases, at least a portion of the first and second data signals received at and/or around the indicated time may be transmitted in response to receiving the time indicator via the user interface.
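One possible reading of block 820, sketched below under assumed data representations (the `chronology_window` name, the (timestamp, value) sample format, and the `margin` parameter are all illustrative):

```python
# Hypothetical sketch of block 820: given a time span selected in the
# timeline, keep only the portions of each data signal that fall within
# or near that span, dropping signals with no activity in it.

def chronology_window(signals, start, end, margin=0.0):
    """Filter each signal's (timestamp, value) samples to the range
    [start - margin, end + margin]; omit signals left empty."""
    window = {}
    for name, samples in signals.items():
        kept = [(t, v) for t, v in samples
                if start - margin <= t <= end + margin]
        if kept:
            window[name] = kept
    return window

signals = {
    "door_sensor": [(5, "open"), (42, "closed")],
    "motion": [(41, 1), (43, 1)],
}
print(chronology_window(signals, start=40, end=45))
```

The filtered result could then be rendered as the chronology for the indicated time, or transmitted to another device in response to the selection.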
In block 910, the device or module may reproduce the first and second data signals in a chronology via a user interface indicating relative timing of collection of the first and second data signals, similar to the performance of block 810. Next, the device or module may in block 915 identify the third data signal in the plurality of data signals. In some cases, this means that the user identifies a data signal from the plurality of data signals as the data signal he or she wishes to include with the first or second data signals in the chronology. The third data signal is then reproduced in the chronology via the user interface relative to the timing of the first and second data signals in block 920.
In some cases, the device or module may identify the third data signal from the plurality of data signals by detecting that the third data signal is in the same or a nearby zone when compared to the first and second data signals, by detecting that the activity sensed by the sensor providing the third data signal is heightened near the time that the first and second data signals are heightened, by reference to preferences of the user (e.g., the user previously indicated that the data signal is important), or by any other method that would indicate that the third data signal should be brought to the user's attention by including it in the chronology. Some of these other methods are discussed elsewhere herein.
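The zone- and time-based relevance criteria above could be combined into a simple score; the sketch below is a hypothetical illustration under assumed data structures (the `pick_third_signal` name, score weights, and candidate format are not from the disclosure).

```python
# Hypothetical sketch: select a third data signal from a plurality by
# scoring zone proximity and heightened activity near the times the
# first and second signals are heightened.

def pick_third_signal(candidates, reference_zone, reference_peaks, window=30):
    """Return the candidate name with the highest relevance score.
    candidates: {name: {"zone": str, "peaks": [timestamps]}}"""
    best, best_score = None, -1
    for name, info in candidates.items():
        score = 0
        if info["zone"] == reference_zone:
            score += 2                       # same (or nearby) zone
        for peak in info["peaks"]:
            if any(abs(peak - r) <= window for r in reference_peaks):
                score += 1                   # activity heightened near reference peaks
        if score > best_score:
            best, best_score = name, score
    return best

candidates = {
    "back_door": {"zone": "supply", "peaks": [100]},
    "register": {"zone": "front", "peaks": [900]},
}
print(pick_third_signal(candidates, "supply", [110]))  # back_door
```

User preferences or other criteria discussed elsewhere herein could be folded into the score as additional weighted terms.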
A data signal may be reproduced via the user interface relative to timing of other data signals by being shown in a timeline (e.g., timeline 508) or by being reproduced for the user in a manner reflecting the same times in which the signals were originally received. Thus, a first and second signal may be reproduced so that the first signal received at a point in time is reproduced concurrently with the second signal received at that point in time.
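A minimal sketch of such time-relative reproduction, assuming each signal is a pre-sorted list of timestamped samples (the (timestamp, source, value) format is an illustrative assumption):

```python
# Hypothetical sketch: merge two timestamped sample streams into one
# chronologically ordered stream, so samples received at the same point
# in time are reproduced together.

import heapq

def merge_by_time(first, second):
    """Merge two time-sorted lists of (timestamp, source, value) samples
    into a single chronologically ordered stream."""
    return list(heapq.merge(first, second))

first = [(1, "camera", "frame_a"), (3, "camera", "frame_b")]
second = [(1, "mic", "clip_a"), (2, "mic", "clip_b")]
print(merge_by_time(first, second))
```

A playback loop could then walk the merged stream, emitting each sample at its recorded offset so concurrent samples are presented concurrently.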
In some embodiments, the activity pattern may be a sequence of changes in activity of one data signal, such as movement in a certain portion of a video camera recording or a pattern of activity occurring at an abnormal time. In response to detecting the activity pattern, the device or module may then generate an activity pattern alert in block 1015. The activity pattern alert may be an indication that a pattern has been detected, such as a message being displayed on the device or a message being transmitted in block 1020. In some embodiments, a reproduction of the data signal may accompany the transmitted activity pattern alert.
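Pattern detection and alert generation of this kind could be sketched as follows; the event labels, `detect_pattern`/`pattern_alert` names, and alert format are hypothetical illustrations, not taken from the disclosure.

```python
# Hypothetical sketch of blocks 1010-1020: detect a predefined sequence
# of state changes in one data signal, then generate an activity
# pattern alert (e.g., a message displayed on the device or transmitted).

def detect_pattern(signal_events, pattern):
    """Return True if `pattern` appears as a contiguous subsequence of
    the signal's ordered events."""
    n = len(pattern)
    return any(signal_events[i:i + n] == pattern
               for i in range(len(signal_events) - n + 1))

def pattern_alert(sensor_name, pattern):
    """Format an activity pattern alert for display or transmission."""
    return f"ALERT: pattern {pattern} detected on {sensor_name}"

events = ["idle", "motion", "door_open", "motion", "idle"]
if detect_pattern(events, ["motion", "door_open", "motion"]):
    print(pattern_alert("supply_room", ["motion", "door_open", "motion"]))
```

The transmitted alert could also carry a reproduction of the relevant portion of the data signal, as noted above.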
Bus 1105 allows data communication between central processor 1110 and system memory 1115, which may include read-only memory (ROM) or flash memory (neither shown), and random access memory (RAM) (not shown), as previously noted. The RAM is generally the main memory into which the operating system and application programs are loaded. The ROM or flash memory can contain, among other code, the Basic Input/Output System (BIOS), which controls basic hardware operation such as the interaction with peripheral components or devices. For example, an application module 120-c which may implement the present systems and methods may be stored within the system memory 1115. Applications resident with computer system 1100 are generally stored on and accessed via a non-transitory computer readable medium, such as a hard disk drive (e.g., fixed disk 1170), an optical drive (e.g., an optical drive that is part of a USB device 1155 or that connects to storage interface 1165), or other storage medium. These storage media may be embodied as computer program products. Additionally, applications can be in the form of electronic signals modulated in accordance with the application and data communication technology when accessed via network interface 1185.
Storage interface 1180, as with other storage interfaces of computer system 1100, can connect to a standard computer readable medium for storage and/or retrieval of information, such as a fixed disk drive 1175. Fixed disk drive 1175 may be a part of computer system 1100 or may be separate and accessed through other interface systems. A modem connected to the network interface 1185 may provide a direct connection to a remote server via a telephone link or to the Internet via an internet service provider (ISP). Network interface 1185 may provide a direct connection to a remote server via a direct network link to the Internet via a POP (point of presence). Network interface 1185 may provide such connection using wireless techniques, including digital cellular telephone connection, Cellular Digital Packet Data (CDPD) connection, digital satellite data connection or the like.
Many other devices or subsystems (not shown) may be connected in a similar manner (e.g., document scanners, digital cameras, and so on). Conversely, all of the devices shown need not be present to practice the present systems and methods.
Moreover, regarding the signals and network communications described herein, those skilled in the art will recognize that a signal can be directly transmitted from a first block to a second block, or a signal can be modified (e.g., amplified, attenuated, delayed, latched, buffered, inverted, filtered, or otherwise modified) between the blocks. Although the signals of the above described embodiments are characterized as transmitted from one block to the next, other embodiments of the present systems and methods may include modified signals in place of such directly transmitted signals as long as the informational and/or functional aspect of the signal is transmitted between blocks. To some extent, a signal input at a second block can be conceptualized as a second signal derived from a first signal output from a first block due to physical limitations of the circuitry involved (e.g., there will inevitably be some attenuation and delay). Therefore, as used herein, a second signal derived from a first signal includes the first signal or any modifications to the first signal, whether due to circuit limitations or due to passage through other circuit elements which do not change the informational and/or final functional aspect of the first signal.
The previous description of the disclosure is provided to enable a person skilled in the art to make or use the disclosure. Various modifications to the disclosure will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other variations without departing from the spirit or scope of the disclosure. Throughout this disclosure the term “example” or “exemplary” indicates an example or instance and does not imply or require any preference for the noted example. Thus, the disclosure is not to be limited to the examples and designs described herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The present application is a continuation of U.S. patent application Ser. No. 14/251,498, titled: “Chronological Activity Monitoring and Review”, filed on Apr. 11, 2014. The disclosure of the application listed above is incorporated by reference herein in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
6950989 | Rosenzweig et al. | Sep 2005 | B2 |
6970183 | Monroe | Nov 2005 | B1 |
6975220 | Foodman et al. | Dec 2005 | B1 |
8089563 | Girgensohn et al. | Jan 2012 | B2 |
8200669 | Iampietro et al. | Jun 2012 | B1 |
9615065 | Frenette | Apr 2017 | B2 |
20060156246 | Williams | Jul 2006 | A1 |
20060174302 | Mattern et al. | Aug 2006 | A1 |
20070182818 | Buehler | Aug 2007 | A1 |
20070291117 | Velipasalar et al. | Dec 2007 | A1 |
20080231595 | Krantz et al. | Sep 2008 | A1 |
20090049400 | Ishihara | Feb 2009 | A1 |
20100123579 | Midkiff | May 2010 | A1 |
20100185976 | Sadanandan | Jul 2010 | A1 |
20100205203 | Anderson et al. | Aug 2010 | A1 |
20100238286 | Boghossian et al. | Sep 2010 | A1 |
20110010623 | Vanslette | Jan 2011 | A1 |
20110010624 | Vanslette | Jan 2011 | A1 |
20120075475 | Mariadoss | Mar 2012 | A1 |
20120083675 | el Kaliouby | Apr 2012 | A1 |
20130167041 | Huang | Jun 2013 | A1 |
20140201627 | Freeman | Jul 2014 | A1 |
20140245151 | Carter | Aug 2014 | A1 |
Entry |
---|
Sumi, Y. et al., "Analysis Environment of Conversational Structure with Nonverbal Multimodal Data", Proceedings of 2010 Int'l Conf. on Multimodal Interfaces / Workshop on Machine Learning for Multimodal Interfaces (ICMI-MLMI '10), Nov. 2010. |
Girgensohn, A. et al., “Support for Effective Use of Multiple Video Streams in Security”, Proceedings of 4th ACM Int'l Workshop on Video Surveillance & Sensor Networks (VSSN '06), Oct. 2006, 19-26. |
Zahariev, A., “Graphical User Interface for Intrusion Detection in Telecommunications Networks”, Aalto University, Mar. 2011. |
Number | Date | Country | |
---|---|---|---|
Parent | 14251498 | Apr 2014 | US |
Child | 15671106 | US |