The present invention relates to processing data that is obtained by recording units for recording sequences of monitoring data in a monitoring system.
Monitoring systems, such as video surveillance systems and systems in which ambient conditions such as sound and temperature are monitored, typically generate very large amounts of monitoring data. The monitoring data is usually in the form of time series originating from several detectors and recording units and representing the monitored conditions, for example video, thermal video, audio and temperature sequences.
Processing of such monitoring data may entail more or less automatic procedures and analysis algorithms for the purpose of providing a user or operator of the monitoring system with a concise and manageable data set that makes it possible to take action if an undesired condition occurs. Such procedures and algorithms may involve transformation, filtering and other mathematical treatment of the data recorded during the monitoring in order to make the data more understandable and easier to handle.
However, notwithstanding this need for handling of the monitoring data itself, it is also desirable from the point of view of a user or operator to obtain information about when the monitoring data has been obtained. This is typically a minor problem in situations where the monitoring system comprises a very small number of recording units and these units record monitoring data continuously. But where a larger number of recording units is involved, and where these units record monitoring data in a more or less intermittent manner over longer periods of time, the amount of monitoring data becomes very large and difficult to manage, and the user or operator of the system will find it difficult to get an overview of when the monitoring data was recorded.
Prior art solutions typically involve textual presentation of monitoring data in the form of lists and tables or graphical presentations in the form of bar charts and other types of charts for presenting statistical summaries of the data. One such system is the NVR system provided by Mirasys Ltd.
In view of the problems discussed above in relation to monitoring systems, there is provided a method for processing monitoring data in a monitoring system. The system comprises a plurality of recording units for recording sequences of monitoring data (e.g. video sequences, thermal video sequences, audio sequences, temperature sequences, metadata related to monitoring data, etc.) and a system control station. The method comprises, in the system control station, obtaining timing information related to each sequence of monitoring data in a plurality of sequences of monitoring data recorded by each of the plurality of recording units, the timing information indicating a respective start time and stop time for each sequence. A recording unit selection signal is received that indicates a selected recording unit. The timing information is then processed together with the recording unit selection signal. The processing comprises displaying, using a first graphic characteristic (e.g. a first color), a graphic representation of said start and stop times for at least a subset of the sequences of monitoring data, and displaying, using a second graphic characteristic different from the first graphic characteristic (e.g. a second color), a graphic representation of said start and stop times for each sequence of monitoring data recorded by the selected recording unit.
Such a method provides advantages in situations where recording units are recording monitoring data in a more or less intermittent manner over longer periods of time. Because the amount of monitoring data in these situations is typically very large, the user or operator of the system in which the method is realized will be able to get an overview of when the monitoring data has been recorded.
The subset of sequences that are displayed using a first graphic characteristic may be created by excluding the sequence recorded by the selected recording unit from the plurality of sequences of monitoring data recorded by each of the plurality of recording units.
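As a minimal sketch of this exclusion, the filter below assumes sequence records are dicts carrying a "unit" field; this record layout is an illustrative assumption, not prescribed by the text.

```python
# Sketch of forming the displayed subset by excluding the selected
# unit's sequences. The record structure is an assumed example.
def subset_excluding(records, selected_unit):
    """Return all sequence records except those from the selected unit."""
    return [r for r in records if r["unit"] != selected_unit]
```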
The displaying of graphic representations of the start and stop times for each sequence may comprise displaying polygons having a respective size and placement that depend on the start and stop times of each respective sequence.
The displaying of graphic representations of the start and stop times for each sequence may comprise displaying the graphic representations along a timeline, for example superimposed on each other.
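As a sketch of how such a polygon's size and placement might be derived, the function below maps a sequence's start and stop times to a rectangle along a horizontal timeline; the function name, signature and default height are illustrative assumptions, not taken from the source.

```python
def sequence_to_rect(start, stop, view_start, view_stop,
                     width_px, y=0, height_px=12):
    """Map one sequence's start/stop times to an (x, y, w, h) rectangle
    on a timeline spanning [view_start, view_stop] over width_px pixels."""
    span = view_stop - view_start
    x = (start - view_start) / span * width_px
    # Clamp to at least one pixel so very short sequences stay visible.
    w = max(1, (stop - start) / span * width_px)
    return (round(x), y, round(w), height_px)
```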
Some embodiments include a sequence of steps that commences with creation of a list of list records, each list record comprising the timing information related to one sequence of monitoring data. A respective vector representation of the polygons is then calculated, followed by calculation of a respective bitmap corresponding to each vector-represented polygon. An aggregated bitmap of the bitmaps corresponding to at least a subset of the vector-represented polygons is then calculated. The aggregated bitmap is then rendered using the first graphic characteristic, and the bitmap corresponding to the selected recording unit is rendered using the second graphic characteristic.
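The step sequence above can be sketched as follows. For brevity, bitmaps are modeled as boolean lists over a one-dimensional timeline (a real implementation would rasterize two-dimensional polygons), and all names, colors and the record layout are illustrative assumptions rather than details from the source.

```python
WIDTH = 100  # timeline resolution in "pixels"

def to_polygon(record):
    # Vector representation: here simply the (start_px, stop_px) span.
    return (record["start_px"], record["stop_px"])

def rasterize(polygon):
    # Bitmap corresponding to one vector-represented polygon.
    start, stop = polygon
    return [start <= x < stop for x in range(WIDTH)]

def aggregate(bitmaps, width=WIDTH):
    # OR the per-sequence bitmaps together into one aggregated bitmap.
    if not bitmaps:
        return [False] * width
    return [any(col) for col in zip(*bitmaps)]

def render(records, selected_unit, first="grey", second="red"):
    # Aggregated bitmap in the first graphic characteristic, the
    # selected unit's sequences in the second.
    others = [rasterize(to_polygon(r)) for r in records
              if r["unit"] != selected_unit]
    selected = [rasterize(to_polygon(r)) for r in records
                if r["unit"] == selected_unit]
    canvas = [None] * WIDTH
    for x, on in enumerate(aggregate(others)):
        if on:
            canvas[x] = first
    for x, on in enumerate(aggregate(selected)):
        if on:
            canvas[x] = second  # selected unit drawn in the second colour
    return canvas
```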
The obtaining of timing information may comprise sending a request for the timing information to each recording unit and receiving the timing information from each recording unit. Alternatively or additionally, the obtaining of timing information may comprise sending a request for the timing information to a sequence server and receiving the timing information from the sequence server.
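The two retrieval variants can be sketched as below; the `get_timing` and `get_all_timing` calls are hypothetical stand-ins for whatever network requests the system actually uses.

```python
def collect_timing(recording_units, sequence_server=None):
    """Return {unit_id: [(start, stop), ...]} for all units."""
    if sequence_server is not None:
        # One request to the sequence server covers all units.
        return sequence_server.get_all_timing()
    timing = {}
    for unit in recording_units:
        # One request/response exchange per recording unit.
        timing[unit.id] = unit.get_timing()
    return timing
```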
In another aspect there is provided a system control station for a monitoring system, the system comprising a plurality of recording units for recording sequences of monitoring data. The system control station comprises control and communication circuitry configured to obtain timing information related to each sequence of monitoring data in a plurality of sequences of monitoring data recorded by each of the plurality of recording units, the timing information indicating a respective start time and a stop time for each sequence, receive a recording unit selection signal indicating a selected recording unit, and process the timing information and the recording unit selection signal. The processing comprises displaying, using a first graphic characteristic, a graphic representation of said start and stop times for at least a subset of the sequences of monitoring data, and displaying, using a second graphic characteristic different from the first graphic characteristic, a graphic representation of said start and stop times for each sequence of monitoring data recorded by the selected recording unit.
In yet another aspect there is provided a monitoring system comprising such a system control station and a plurality of recording units for recording sequences of monitoring data, for example video cameras, audio recording units and temperature recording units.
In yet another aspect there is provided a computer program product comprising software instructions that, when executed in a processing unit, perform the method as summarized above.
These further aspects provide effects and advantages corresponding to those of the method as summarized above.
Embodiments will now be described with reference to the attached drawings, where:
FIG. 3a schematically illustrates timing of recorded monitoring data,
FIGS. 3b and 3c schematically illustrate processed monitoring data that is displayed along a respective timeline, and
With reference to
The control station 106 comprises, from a hardware perspective, a processing unit 104, memory 102 and input/output circuitry 108. Software instructions stored in the memory 102 are configured to control the station 106 and its interaction with the system 100 and implement, when executed by the processor and in combination with the hardware units, a user interface 110. The user interface includes a display for displaying video data and other information, including monitoring data, to a user or operator. The skilled person will realize that the user interface 110 may include other input/output units, including keypads, keyboards, loudspeakers, etc., that enable an operator of the control station 106 to interact with the monitoring system 100.
The network 112 is of a type suitable for communicating digital data from the recording units 114, 118, 120 and signaling information between the control station 106 and the recording units. For example, the network 112 may be any combination of local area networks and wide area networks, wired as well as wireless, that are configured to convey digital data according to any suitable network protocols known in the art, such as the Internet Protocol (IP) suite and other telecommunication protocols, including any communication protocols established within the framework of 3GPP. Consequently, any of the communicating units 106, 114, 116, 118 and 120 may be connected via wired as well as wireless communication means, such as Ethernet wired communication means and/or wireless means capable of communicating under any of the IEEE 802.11 set of standards and/or the 3GPP standards.
The cameras 114 may be any suitable digital camera capable of generating video sequences and communicating the video sequences, or other type of image data, such as image and video metadata, over the network 112 to the control station 106. The cameras 114 may comprise image storage memory for storing a plurality of images. The cameras 114 comprise a lens system for collecting incident light, an image sensor, for example in the form of a Charge Coupled Device (CCD), a CMOS-sensor or similar sensor, for registering incident light and/or thermal radiation, as well as circuitry as is known in the art (and therefore not illustrated in detail in
Although the monitoring data generated by the cameras is typically in the form of video sequences, it may also be in the form of, or at least include, metadata. Such metadata may be any kind of information related to video data recorded by the cameras. For example, processing in the cameras may involve detecting movement in the recorded scene, and the metadata may then be information regarding this detected movement.
The audio recording units 118 may be any suitable microphone-equipped unit and may in some cases be incorporated in a video camera such as any of the cameras 114. Similarly, the sensors 120 for sensing ambient conditions, such as temperature, may be of any suitable type.
The monitoring data storage unit 116 is capable of communicating sequences of monitoring data over the network 112 with the control station 106 and the recording units 114, 118, 120. The storage unit 116 may form a functional part of the control station 106, and may even be completely integrated in the control station 106.
Turning now to
A recording unit selection signal is received, in a reception step 204, which indicates a selected recording unit. The selection signal may originate from an action taken by a user or operator interacting with a user interface in a control station such as the control station 106 in
The timing information obtained in the obtaining step 202 is then processed together with the recording unit selection signal received in the reception step 204. The processing takes place in two steps that may operate in parallel or in sequence. In a first display step 206, a graphic representation of said start and stop times for at least a subset of the sequences of monitoring data is displayed using a first graphic characteristic (e.g. a first color). In a second display step 208, a graphic representation of said start and stop times for each sequence of monitoring data recorded by the selected recording unit is displayed using a second graphic characteristic different from the first graphic characteristic (e.g. a second color).
The method may be realized in the form of a computer program product 122 comprising software instructions that are configured such that they can be loaded into the memory 102 and executed in the processing unit 104.
Turning now to
Groups 304a, 304b, 306a, 306b of sequences recorded by a respective second and third recording unit are illustrated in the timeline 310 in the same manner as that of the groups 302a, 302b recorded by the first recording unit. Of course, the time scale of interest may depend on the particular situation in which a monitoring system is operating.
Continuing with the flow of the embodiment of the method, a respective vector representation of the polygons is then calculated in a vector calculation step 404. The actual algorithm for this calculation is outside the scope of the present disclosure.
The calculation step 404 is followed by a bitmap calculation step 406 in which calculation of a respective bitmap takes place, where the bitmaps correspond to the vector represented polygons. The actual algorithm for this calculation is outside the scope of the present disclosure.
An aggregated bitmap of bitmaps corresponding to at least a subset of the vector represented polygons is then calculated in an aggregate bitmap calculation step 408. For example, as will be illustrated in connection with a description of
The aggregated bitmap is then rendered, in a first rendering step 410, using the first graphic characteristic, and the bitmap corresponding to the selected recording unit is rendered, in a second rendering step 412, using the second graphic characteristic.
b and
In
In
As clearly illustrated in
The process described with reference to
Yet another variation is a procedure in which no aggregation takes place. Such a procedure may entail creation of individual bitmaps from the polygons, followed by rendering of the bitmaps one by one. In such a procedure, the rendering takes place using a first (or at least similar) graphic characteristic for all but the last rendering. The last rendering, which uses a graphic characteristic that differs from those of the already rendered polygons, is the rendering of the polygons representing the selected recording unit. That is, the last rendering may be seen as taking place “on top of” the already rendered polygons.
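This non-aggregating variant can be sketched as follows: every sequence's bitmap is rendered individually, with the selected unit's sequences drawn last, "on top", in the differing graphic characteristic. The one-dimensional canvas, colors and record layout are illustrative assumptions.

```python
def render_one_by_one(records, selected_unit, width=100,
                      first="grey", second="red"):
    """Painter-style rendering: selected unit's sequences drawn last."""
    canvas = [None] * width
    # Order so that the selected unit's sequences come after the others.
    ordered = ([r for r in records if r["unit"] != selected_unit] +
               [r for r in records if r["unit"] == selected_unit])
    for r in ordered:
        colour = second if r["unit"] == selected_unit else first
        for x in range(max(0, r["start_px"]), min(r["stop_px"], width)):
            canvas[x] = colour  # later renderings overwrite earlier ones
    return canvas
```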
Number | Date | Country | Kind
---|---|---|---
11172103.1 | Jun 2011 | EP | regional
Number | Date | Country
---|---|---
61505383 | Jul 2011 | US