Systems and methods for managing video data

Information

  • Patent Grant
  • Patent Number
    10,038,872
  • Date Filed
    Friday, August 3, 2012
  • Date Issued
    Tuesday, July 31, 2018
Abstract
A DVM system is configured to include an incident trigger, which is actuated either manually or automatically in response to an “incident”. In the case of manual actuation, the definition of an incident may be subjectively determined by an operator (for example based on training and/or protocols). In the case of automatic actuation, an incident is defined by predefined criteria (for example a signal from analytics software, an alarm, or the like). In response to the actuation of the trigger, an incident identifier is defined. During the period of actuation, a plurality of recordings is automatically made, based on an incident recording protocol. These recordings are all associated with the incident identifier. The automation allows an operator to, at the time of the incident, focus on factors other than ensuring important video evidence is being recorded (such as following a subject through a building by looking at various cameras).
Description

This application is a National Stage of International Application No. PCT/AU2012/000914, filed Aug. 3, 2012, which claims the benefit of Australian Patent Application No. 2011903153, filed Aug. 5, 2011, both of which are incorporated herein by reference.


FIELD OF THE INVENTION

The present invention relates to systems and methods for managing video data. Embodiments of the invention have been particularly developed for providing incident-centric recording in a Digital Video Management (DVM) system. While some embodiments will be described herein with particular reference to that application, it will be appreciated that the invention is not limited to such a field of use, and is applicable in broader contexts.


BACKGROUND

Any discussion of the background art throughout the specification should in no way be considered as an admission that such art is widely known or forms part of common general knowledge in the field.


DVM systems are widely used for surveillance purposes. In such contexts, it is important to collect evidence for the purpose of incident recording. For example, the collection of evidence is an extremely important aspect of incident management due to potential police involvement, subsequent legal proceedings, and training purposes. In known surveillance systems, evidence collection is generally achieved using a number of functionalities such as:

    • Automatic background recording on all cameras.
    • User or event driven recording on key cameras.
    • User initiated image snapshots.


These functionalities are either pre-configured, or must be triggered manually by the operator during an incident. This can lead to costly configuration and additional stress on operators in an already stressful situation. It can also require time-consuming analysis after the event, where one or more operators must manually reconstruct the event, finding appropriate video regions and ordering them based on time or, for example, the movements of a suspect.


There is a need in the art for improved systems and methods for managing video data.


SUMMARY OF THE INVENTION

It is an object of the present invention to overcome or ameliorate at least one of the disadvantages of the prior art, or to provide a useful alternative.


One embodiment provides a method for controlling a DVM system, the method including:


providing an incident trigger;


in response to actuation of the incident trigger, defining an incident identifier;


during a period of time for which the incident trigger is actuated, applying an incident recording protocol, thereby to automatically make recordings in accordance with the incident recording protocol; and


associating each of the recordings with the incident identifier.


One embodiment provides a DVM system configured to perform a method as described herein.


One embodiment provides a tangible non-transitory carrier medium carrying computer executable code that, when executed via one or more processors, allows the performance of a method as described herein.


Reference throughout this specification to “one embodiment”, “some embodiments” or “an embodiment” means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases “in one embodiment”, “in some embodiments” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment, but may. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner, as would be apparent to one of ordinary skill in the art from this disclosure, in one or more embodiments.


As used herein, unless otherwise specified, the use of the ordinal adjectives “first”, “second”, “third”, etc., to describe a common object merely indicates that different instances of like objects are being referred to, and is not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.


In the claims below and the description herein, any one of the terms comprising, comprised of or which comprises is an open term that means including at least the elements/features that follow, but not excluding others. Thus, the term comprising, when used in the claims, should not be interpreted as being limitative to the means or elements or steps listed thereafter. For example, the scope of the expression a device comprising A and B should not be limited to devices consisting only of elements A and B. Any one of the terms including or which includes or that includes as used herein is also an open term that also means including at least the elements/features that follow the term, but not excluding others. Thus, including is synonymous with and means comprising.


As used herein, the term “exemplary” is used in the sense of providing examples, as opposed to indicating quality. That is, an “exemplary embodiment” is an embodiment provided as an example, as opposed to necessarily being an embodiment of exemplary quality.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings in which:



FIG. 1 schematically illustrates a DVM system according to one embodiment.



FIG. 2 schematically illustrates a DVM system according to one embodiment.



FIG. 3A illustrates a method according to one embodiment.



FIG. 3B illustrates a method according to one embodiment.





DETAILED DESCRIPTION

Described herein are systems and methods for managing video data. Embodiments are described by reference to a Digital Video Management (DVM) system, for example in terms of methods for providing incident-centric recording in a DVM system. In overview, a DVM system is configured to include an incident trigger, which is actuated either manually or automatically in response to an “incident”. In the case of manual actuation, the definition of an incident may be subjectively determined by an operator (for example based on training and/or protocols). In the case of automatic actuation, an incident is defined by predefined criteria (for example a signal from analytics software, an alarm, or the like). In response to the actuation of the trigger, an incident identifier is defined. During the period of actuation, a plurality of recordings is automatically made, based on an incident recording protocol. These recordings are all associated with the incident identifier. The automation allows an operator to, at the time of the incident, focus on factors other than ensuring important video evidence is being recorded (such as following a subject through a building by looking at various cameras). The association streamlines subsequent review of an incident, as recordings relevant to the incident are commonly identifiable based on the incident identifier.
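
To make the flow concrete, the following Python sketch shows one possible shape for this trigger/identifier/recording cycle. It is a minimal illustration only: the names IncidentSession and on_trigger_actuated, and the use of a UUID as the identifier, are assumptions for this example rather than anything specified by the disclosure.

```python
import uuid
from datetime import datetime, timezone

class IncidentSession:
    """Groups every recording made while the incident trigger is actuated."""

    def __init__(self, operator: str):
        # The incident identifier: an alphanumeric value that can be
        # associated with files (video recordings, screen captures, etc.).
        self.incident_id = uuid.uuid4().hex
        self.start_time = datetime.now(timezone.utc)
        self.operator = operator
        self.recordings: list[str] = []

    def associate(self, recording_ref: str) -> None:
        # Recordings made under the incident recording protocol are tagged
        # with the incident identifier as they are made, in real time.
        self.recordings.append(recording_ref)

def on_trigger_actuated(operator: str) -> IncidentSession:
    """Invoked when the trigger fires, whether manually or automatically."""
    return IncidentSession(operator)
```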


System Level Overview



FIG. 1 illustrates a general Digital Video Management (DVM) system 101. System 101 is described to provide general context to various embodiments discussed below. Although embodiments are described by reference to DVM systems based on system 101, the present invention is not limited as such. That is, system 101 is provided as a general example to highlight various features of an exemplary DVM system. In practice, many systems omit one or more of these features, and/or include additional features.


System 101 includes a plurality of video streaming units 102. Units 102 include conventional cameras 104 (including analogue video cameras) coupled to discrete video streaming units, and IP streaming cameras 105. Video streaming units 102 stream video data, presently in the form of surveillance footage, on a TCP/IP network 106. This is readily achieved using IP streaming cameras 105, which are inherently adapted for such a task. However, in the case of other cameras 104 (such as conventional analogue cameras), a discrete video streaming unit 107 is required to convert a captured video signal into a format suitable for IP streaming.


For the purposes of the present disclosure, the term “video streaming unit” should be read to include IP streaming cameras 105 and video streaming units 107. That is, the term “video streaming unit” describes any hardware component configured to stream video data onto a network, independent of the source of the video data.


For the present purposes, the terms “video streaming unit” and “camera” are generally used interchangeably, on the assumption that each video streaming unit corresponds to a unique set of optical components used to capture video. That is, there is a one-to-one relationship between streaming units 107 and cameras 104. However, in other embodiments there is a one-to-many relationship between streaming units 107 and cameras 104 (i.e. a streaming unit is configured for connection to multiple cameras).


One or more camera servers 109 are also connected to network 106 (these may be either physical servers or virtual servers). Each camera server can have one or more of video streaming units 102 assigned to it. In some embodiments the assignment is on a stream-by-stream basis rather than a camera-by-camera basis. This assignment is carried out using a software-based configuration tool, and it follows that camera assignment is virtual rather than physical. That is, the relationships are set by software configuration rather than hardware manipulation. In practice, each camera has a unique identifier. Data indicative of this identifier is included with surveillance footage being streamed by that camera such that components on the network are able to ascertain from which camera a given stream originates.
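
A small Python sketch of such virtual assignment follows. The class name AssignmentRegistry and its methods are hypothetical, chosen purely to illustrate that assignment is a software mapping from camera identifiers to camera servers rather than a physical connection.

```python
class AssignmentRegistry:
    """Hypothetical configuration-tool state: maps each camera's unique
    identifier to its currently assigned camera server."""

    def __init__(self) -> None:
        self._server_for_camera: dict[str, str] = {}

    def assign(self, camera_id: str, server_id: str) -> None:
        # Assignment (and reassignment) is purely a configuration change;
        # no physical rewiring is involved.
        self._server_for_camera[camera_id] = server_id

    def resolve(self, camera_id: str) -> str:
        # Streams carry the camera's unique identifier, so any network
        # component can look up which server handles a given stream.
        return self._server_for_camera[camera_id]

# usage
registry = AssignmentRegistry()
registry.assign("cam-104-01", "camera-server-109A")
assert registry.resolve("cam-104-01") == "camera-server-109A"
```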


In the present embodiment, camera servers are responsible for making available both live and stored video data. In relation to the former, each camera server provides a live stream interface, which consists of socket connections between the camera manager and clients. Clients request live video through the camera server's COM interfaces and the camera server then pipes video and audio straight from the relevant streaming unit to the client through TCP sockets. In relation to the latter, each camera server has access to a data store for recording video data. Although FIG. 1 suggests a one-to-one relationship between camera servers and data stores, this is by no means necessary. Each camera server also provides a playback stream interface, which consists of socket connections between the camera manager and clients. Clients create and control the playback of video stored at the camera server's data store through the camera manager's COM interfaces, and the stream is sent to clients via TCP sockets.
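
The byte-piping half of this arrangement can be sketched in a few lines of Python. The control path (COM interfaces, stream setup) is omitted, and the function name pipe_live_stream is an assumption; the sketch simply forwards bytes from a source socket to a client socket, as the live stream interface does over TCP.

```python
import socket

def pipe_live_stream(source: socket.socket, client: socket.socket,
                     chunk_size: int = 64 * 1024) -> None:
    # Forward video/audio bytes straight from the streaming unit's socket
    # to the client's socket until the source closes the connection.
    while True:
        data = source.recv(chunk_size)
        if not data:
            break
        client.sendall(data)
```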


Although, in the context of the present disclosure, there is discussion of one or more cameras or streaming units being assigned to a common camera server, this is a conceptual notion, and is essentially no different from a camera server being assigned to one or more cameras or streaming units.


Clients 110 execute on a plurality of client terminals, which in some embodiments include all computational platforms on network 106 that are provided with appropriate permissions. Clients 110 provide a user interface (UI) that allows surveillance footage to be viewed in real time by an end-user. For example, one UI component is a render window, in which streamed video data is rendered for display to a user. In some cases this user interface is provided through an existing application (such as Microsoft Internet Explorer), whilst in other cases it is a standalone application. The user interface optionally provides the end-user with access to other system and camera functionalities, including mechanical, digital and optical camera controls, control over video storage, and other configuration and administrative functionalities (such as the assignment and reassignment of cameras to camera servers). Typically clients 110 are relatively “thin”, and commands provided via the relevant user interfaces are implemented at a remote server, typically a camera server. In some embodiments different clients have different levels of access rights. For example, in some embodiments there is a desire to limit the number of users with access to change configuration settings or mechanically control cameras.


System 101 also includes a DVM database server 115. Database server 115 is responsible for maintaining various information relating to configurations and operational characteristics of system 101, and for managing events within the system. In terms of events, the general notion is that an action in the system (such as the modification of data in the database, or the reservation of a camera, as discussed below) causes an event to be “fired” (i.e. published), this having follow-on effects depending on the nature of the event.


In the present example, the system makes use of a preferred and redundant database server (115 and 116 respectively), the redundant server essentially operating as a backup for the preferred server. The relationship between these database servers is generally beyond the concern of the present disclosure.


Some embodiments of the present invention are directed to distributed DVM systems, also referred to as “distributed system architecture” (DSA). In general terms, a distributed DVM system includes a plurality of (i.e. two or more) discrete DVM systems, such as system 101. These systems are discrete in the sense that they are in essence standalone systems, able to function autonomously without the other by way of their own DVM servers. They may be distributed geographically (for example in different buildings, cities or countries), or notionally (in a common geographic location, but split due to individual system constraints, for example camera server numbers, or simply to take advantage of benefits of a distributed architecture). In the context of FIG. 1, a remote system 150 communicates with the local system via a DSA link 151. For the present purposes, it is assumed that remote system 150 is in a general sense similar to the local system. Various components (hardware and software) are configured to allow communications between the systems, for example via a network connection (including, but not limited to, an Intranet or Internet connection), or other communications interface. For the sake of the present embodiments, it is assumed that the inter-system communications occur by way of TCP/IP connections, and in this manner any communications channel supporting TCP/IP may be used.


Incident Centric Recording



FIG. 2 illustrates components of an exemplary DVM system (such as the system of FIG. 1), but simplified to illustrate components relevant to incident-centric recording.


A camera 201 is associated with a camera server 202. Camera server 202 is configured to access video data made available by camera 201, either for live viewing or for recording to a storage device 203. Camera server 202 is configured/controlled by a DVM server 204. There are a number of further cameras and camera servers, which are not illustrated in FIG. 2 for the sake of simplicity.


DVM server 204 executes DVM administration modules 205. The functional block for modules 205 is used to simplistically represent a wide range of software components implemented within a DVM system. A small selection of these components is illustrated in FIG. 2, and these are described in more detail further below.


DVM server 204 communicates with a user interface 210 which executes on a client terminal 211. In the present embodiment, this user interface is provided via various of modules 205 (including modules that are not specifically illustrated) via a web-server type arrangement (i.e. user interface 210 is provided via a web-browser at terminal 211 which renders data transmitted by server 204).


User interface 210 is configured to display live and recorded video data to a user via video display objects. In the example of FIG. 2, a plurality of display objects are shown as being rendered on-screen simultaneously, including a major display object 212 and minor display objects 213A-E. These are configured to each display live video data from respective cameras (such as camera 201). The size, geometric layout, and number of major/minor display objects are illustrative only, and are in some cases modifiable by a user or in response to other inputs. User interface 210 additionally includes an incident trigger button 216 and other controls 217 (which simplistically represents a variety of GUI controls available to an operator of terminal 211, such as record control, camera position control, camera view selection, and so on).


Operation of the arrangement of FIG. 2 relevant to incident centric recording is now described by reference to method 300 of FIG. 3A, which is performed by DVM server 204.


Module 250 provides an incident trigger, which is able to be actuated manually by an operator of terminal 211. Specifically, the incident trigger is actuated in response to an operator clicking button 216. In practice, an operator determines that an incident is occurring (for example based on training and/or defined surveillance protocols), and clicks button 216 thereby to initiate incident-centric recording. Clicking button 216 transmits a signal to server 204, thereby to actuate incident trigger module 250. In some embodiments trigger module 250 may be actuated automatically in response to predefined criteria being met. These criteria may include the triggering of an alarm or alarm condition in the DVM system, a signal from analytics software (for example analytics software configured to identify movement in a camera's field of view), or an assessment of activity levels at a client terminal (for example if operator activity exceeds a threshold level it may be deemed that an incident is underway). In the context of method 300, the incident trigger is actuated at 301.
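
The manual and automatic actuation paths might be combined as in the following Python sketch. The class name IncidentTrigger and the activity threshold value are assumptions for illustration; the criteria mirror those named above (an alarm, an analytics signal, or operator activity exceeding a threshold level).

```python
ACTIVITY_THRESHOLD = 30.0  # assumed operations per minute; illustrative only

class IncidentTrigger:
    """Rough stand-in for trigger module 250."""

    def __init__(self) -> None:
        self.active = False

    def actuate_manual(self) -> None:
        # The operator clicked button 216 in the client user interface.
        self.active = True

    def evaluate_automatic(self, alarm_raised: bool, analytics_signal: bool,
                           operator_ops_per_min: float) -> None:
        # Predefined criteria: an alarm condition, a signal from analytics
        # software (e.g. movement in a camera's field of view), or operator
        # activity exceeding a threshold level.
        if (alarm_raised or analytics_signal
                or operator_ops_per_min > ACTIVITY_THRESHOLD):
            self.active = True
```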


Functional block 302 represents a process including, in response to actuation of the incident trigger, defining an incident identifier. In some cases this is an alphanumeric identifier which is able to be associated with files (such as video recordings and screen captures) made by the DVM system. In other cases the identifier may be defined by a file folder in which such files are stored. However, it will be appreciated that the former approach is advantageous in the sense that the files are readily identifiable in an incident-centric manner (based on the identifier) and in conventional DVM system manners (based, for example, on other properties of the files, such as time, camera, camera server, and so on). In some cases defining an incident identifier includes selecting one of a plurality of pre-created identifiers for use in the current incident.
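
Both identifier strategies (selecting from a pool of pre-created identifiers, or minting a fresh alphanumeric one) can be sketched briefly in Python. The INC-nnnnnn format and the function name are illustrative assumptions only.

```python
import itertools
import uuid

# One assumed approach: a pool of pre-created identifiers, drawn in order.
_precreated_pool = (f"INC-{n:06d}" for n in itertools.count(1))

def define_incident_identifier(use_precreated: bool = True) -> str:
    if use_precreated:
        # Select one of a plurality of pre-created identifiers.
        return next(_precreated_pool)
    # Otherwise mint a fresh alphanumeric identifier.
    return uuid.uuid4().hex
```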


Functional block 303 represents a process including, during a period of time for which the incident trigger is actuated, applying an incident recording protocol. Application of this incident recording protocol causes the DVM system to automatically make recordings in accordance with the incident recording protocol. This may include any of: (i) making recordings that would not otherwise have been made; (ii) making recordings at a different quality compared to what would have been made by default (for example where a camera is always configured to record at a background level); or (iii) determining that a recording that would otherwise have been made by default is to be associated with the incident identifier. That is, the concept of “automatically making recordings” should be afforded a broad interpretation.
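
One way to picture the three cases is as a per-camera dispatch, as in the sketch below. The Camera stand-in and the quality labels are invented for this example; only the three-way distinction comes from the text.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Action(Enum):
    START_NEW = auto()      # (i) a recording that would not otherwise be made
    RAISE_QUALITY = auto()  # (ii) recording above the default background level
    TAG_EXISTING = auto()   # (iii) a default recording merely associated

@dataclass
class Camera:
    """Minimal stand-in for a DVM camera/stream; illustrative only."""
    recording: bool = False
    quality: str = "background"
    tags: set = field(default_factory=set)

def apply_protocol(camera: Camera, incident_id: str) -> Action:
    if not camera.recording:
        camera.recording = True
        camera.tags.add(incident_id)
        return Action.START_NEW
    if camera.quality == "background":
        camera.quality = "incident"
        camera.tags.add(incident_id)
        return Action.RAISE_QUALITY
    camera.tags.add(incident_id)
    return Action.TAG_EXISTING
```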


The period of time is, in some cases, defined by an “incident end” event. For example, this in some embodiments includes the operator clicking button 216 a further time to indicate that the incident has ended. In other cases the end of an incident is determined by different factors, which may be automatically determined.


As noted, the incident recording protocol defines a set of rules for automatically making recordings. These may include recordings from one or more cameras, and/or recordings of activity at the client terminal. The recordings are, in the present embodiment, made under the control of an incident-based recording control module 251. That is, module 251 implements the rules thereby to coordinate the making of automated recordings (which may include instructing a camera server to begin recording, modifying recording parameters for a given camera, and/or associating an existing recording with the incident identifier). Various exemplary rules are outlined below; a sketch of the second rule follows the list:

    • A rule to set a predefined minimum level of low-quality background recording for all, or a selected group, of the cameras.
    • A rule to make recordings for all cameras currently being viewed by an operator of the terminal. For example, as shown in FIG. 3B, functional block 303 may include monitoring activity at a DVM client (for example using a client monitor module 252), identifying a set of one or more cameras for which live video is being accessed by the DVM client (for example in object 212 and objects 213A-E), and making recordings for the cameras in that set. That is, recordings are made to correspond with what the operator views. This allows the operator to focus on switching between views thereby to monitor the incident, without needing to be concerned with manually making recordings; the operator can rest assured that recordings are being made for everything he/she observes in the user interface.
    • A rule to make recordings of activity at a DVM client. For example, this may include making recordings defined by screenshots of the DVM client user interface 210. This may include sequential screenshots that define a video of what is presented to the user of the DVM client during the period of time for which the incident trigger is actuated. Such a video allows subsequent review of what the operator saw, how the operator reacted, and so on. In some cases logic is used thereby to recreate such a video using an approach other than screen capture (for example based on data indicative of what video was being displayed in each object at specific times, and use of automated recordings from the relevant cameras).
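
The record-what-the-operator-views rule might be sketched as follows. The CameraServerStub class and its start_recording method are assumptions standing in for the camera-server interface; the point is simply that coverage tracks the cameras shown in the major and minor display objects.

```python
class CameraServerStub:
    """Stand-in for a camera server's recording interface; the method
    name start_recording is an assumption for this sketch."""

    def start_recording(self, camera_id: str) -> str:
        return f"rec:{camera_id}"  # reference to the new recording

def record_viewed_cameras(viewed_ids: list[str],
                          incident_recordings: list[str],
                          servers: dict[str, CameraServerStub]) -> None:
    # Record every camera currently shown in the operator's major or minor
    # display objects and associate each recording with the incident, so
    # recordings correspond exactly to what the operator watched.
    for camera_id in viewed_ids:
        rec = servers[camera_id].start_recording(camera_id)
        incident_recordings.append(rec)

# usage: the operator is viewing two cameras when the trigger is actuated
servers = {"cam-1": CameraServerStub(), "cam-2": CameraServerStub()}
recordings: list[str] = []
record_viewed_cameras(["cam-1", "cam-2"], recordings, servers)
```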


Additional rules may also be used, including rules relating to specified important cameras, cameras in regions proximal those viewed when the incident trigger is actuated, and so on. The concept of “recordings” should be interpreted broadly enough to cover both video and image recordings, in addition to other data (such as map data, location data, access card data from an associated access control system, and so on). In some embodiments the “recordings” include an audit log of all operator activity (for example as a list of operations performed in the system), thereby to allow review of steps taken by the operator in response to an incident.


As indicated by functional block 304, recordings made in accordance with the incident recording protocol are associated with the incident identifier defined at 302. It will be appreciated that, in practice, this preferably occurs in real time as recordings are made, rather than as a separate, subsequent step once the incident is over.


Functional block 305 represents a process whereby recordings are made available via an incident viewer interface. For example, server 204 provides an incident viewer module 253 for providing a user of a client terminal with access to all of the recordings associated with a given incident identifier. Preferably, a user is enabled to search for or browse incidents based on parameters other than the incident identifier alone, such as date and/or time. In this regard, each incident is associated with a start time (based on the time of actuation of the incident trigger). Each incident may additionally be associated with an operator responsible for actuating the incident trigger, and/or other parameters.
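
An incident viewer query over such metadata might look like the following sketch. The dictionary keys ('incident_id', 'time', 'operator') are assumed metadata fields for this example, not fields named in the disclosure.

```python
from datetime import datetime
from typing import Optional

def find_recordings(recordings: list[dict],
                    incident_id: Optional[str] = None,
                    start: Optional[datetime] = None,
                    end: Optional[datetime] = None,
                    operator: Optional[str] = None) -> list[dict]:
    # Filter stored recordings by incident identifier, start/end time
    # window, and/or responsible operator; omitted filters are ignored.
    results = []
    for rec in recordings:
        if incident_id is not None and rec["incident_id"] != incident_id:
            continue
        if operator is not None and rec["operator"] != operator:
            continue
        if start is not None and rec["time"] < start:
            continue
        if end is not None and rec["time"] > end:
            continue
        results.append(rec)
    return results
```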


In some embodiments functionality is provided to export an incident package including all recordings having the incident identifier. This preferably includes identifying the relevant recordings, converting them into a standard video format such as MPEG or AVI (if required), and exporting the converted recordings (for example to a portable carrier medium such as a flash drive, CD, DVD or the like). This is particularly useful if incident recordings are to be provided to a third party, such as the police.
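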


In some embodiments the incident package is defined as a file or file set that is executable in a DVM data viewer client. The term DVM data viewer client describes a software application that provides similar control functionality to a DVM client, but which does not actually interact with a physical DVM system. Rather, the software application is configured for allowing a user to access an incident package in an offline mode, and navigate/view recordings within that package via a DVM-style interface.


CONCLUSIONS AND INTERPRETATION

It will be appreciated that the disclosure above provides various significant systems and methods for managing video data. For example, the present embodiments allow for improved control of DVM systems, thereby to enable incident-centric recording. This frees up an operator to focus on monitoring of an incident, without needing to be concerned with the collection of evidence; the collection of evidence is automated based on an incident recording protocol, and that evidence is associated with a common identifier thereby to streamline subsequent identification and review of evidence.


Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as “processing,” “computing,” “calculating,” “determining”, “analyzing” or the like, refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulate and/or transform data represented as physical, such as electronic, quantities into other data similarly represented as physical quantities.


In a similar manner, the term “processor” may refer to any device or portion of a device that processes electronic data, e.g., from registers and/or memory to transform that electronic data into other electronic data that, e.g., may be stored in registers and/or memory. A “computer” or a “computing machine” or a “computing platform” may include one or more processors.


The methodologies described herein are, in one embodiment, performable by one or more processors that accept computer-readable (also called machine-readable) code containing a set of instructions that when executed by one or more of the processors carry out at least one of the methods described herein. Any processor capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken is included. Thus, one example is a typical processing system that includes one or more processors. Each processor may include one or more of a CPU, a graphics processing unit, and a programmable DSP unit. The processing system further may include a memory subsystem including main RAM and/or a static RAM, and/or ROM. A bus subsystem may be included for communicating between the components. The processing system further may be a distributed processing system with processors coupled by a network. If the processing system requires a display, such a display may be included, e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT) display. If manual data entry is required, the processing system also includes an input device such as one or more of an alphanumeric input unit such as a keyboard, a pointing control device such as a mouse, and so forth. The term memory unit as used herein, if clear from the context and unless explicitly stated otherwise, also encompasses a storage system such as a disk drive unit. The processing system in some configurations may include a sound output device, and a network interface device. The memory subsystem thus includes a computer-readable carrier medium that carries computer-readable code (e.g., software) including a set of instructions to cause performing, when executed by one or more processors, one or more of the methods described herein. Note that when the method includes several elements, e.g., several steps, no ordering of such elements is implied, unless specifically stated. The software may reside in the hard disk, or may also reside, completely or at least partially, within the RAM and/or within the processor during execution thereof by the computer system. Thus, the memory and the processor also constitute a computer-readable carrier medium carrying computer-readable code.


Furthermore, a computer-readable carrier medium may form, or be included in a computer program product.


In alternative embodiments, the one or more processors operate as a standalone device or may be connected, e.g., networked, to other processor(s) in a networked deployment; the one or more processors may operate in the capacity of a server or a user machine in a server-user network environment, or as a peer machine in a peer-to-peer or distributed network environment. The one or more processors may form a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.


Note that while some diagrams only show a single processor and a single memory that carries the computer-readable code, those in the art will understand that many of the components described above are included, but not explicitly shown or described in order not to obscure the inventive aspect. For example, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.


Thus, one embodiment of each of the methods described herein is in the form of a computer-readable carrier medium carrying a set of instructions, e.g., a computer program that is for execution on one or more processors, e.g., one or more processors that are part of a web server arrangement. Thus, as will be appreciated by those skilled in the art, embodiments of the present invention may be embodied as a method, an apparatus such as a special purpose apparatus, an apparatus such as a data processing system, or a computer-readable carrier medium, e.g., a computer program product. The computer-readable carrier medium carries computer readable code including a set of instructions that when executed on one or more processors cause the processor or processors to implement a method. Accordingly, aspects of the present invention may take the form of a method, an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a carrier medium (e.g., a computer program product on a computer-readable storage medium) carrying computer-readable program code embodied in the medium.


The software may further be transmitted or received over a network via a network interface device. While the carrier medium is shown in an exemplary embodiment to be a single medium, the term “carrier medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “carrier medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by one or more of the processors and that cause the one or more processors to perform any one or more of the methodologies of the present invention. A carrier medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media includes, for example, optical, magnetic disks, and magneto-optical disks. Volatile media includes dynamic memory, such as main memory. Transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise a bus subsystem. Transmission media may also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications. For example, the term “carrier medium” shall accordingly be taken to include, but not be limited to, solid-state memories; a computer product embodied in optical and magnetic media; a medium bearing a propagated signal detectable by at least one processor of the one or more processors and representing a set of instructions that, when executed, implement a method; a carrier wave bearing a propagated signal detectable by at least one processor of the one or more processors and representing the set of instructions; and a transmission medium in a network bearing a propagated signal detectable by at least one processor of the one or more processors and representing the set of instructions.


It will be understood that the steps of methods discussed are performed in one embodiment by an appropriate processor (or processors) of a processing (i.e., computer) system executing instructions (computer-readable code) stored in storage. It will also be understood that the invention is not limited to any particular implementation or programming technique and that the invention may be implemented using any appropriate techniques for implementing the functionality described herein. The invention is not limited to any particular programming language or operating system.


Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment, but may. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner, as would be apparent to one of ordinary skill in the art from this disclosure, in one or more embodiments.


Similarly it should be appreciated that in the above description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, FIG., or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the Detailed Description are hereby expressly incorporated into this Detailed Description, with each claim standing on its own as a separate embodiment of this invention.


Furthermore, while some embodiments described herein include some but not other features included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the invention, and form different embodiments, as would be understood by those skilled in the art. For example, in the following claims, any of the claimed embodiments can be used in any combination.


Furthermore, some of the embodiments are described herein as a method or combination of elements of a method that can be implemented by a processor of a computer system or by other means of carrying out the function. Thus, a processor with the necessary instructions for carrying out such a method or element of a method forms a means for carrying out the method or element of a method. Furthermore, an element described herein of an apparatus embodiment is an example of a means for carrying out the function performed by the element for the purpose of carrying out the invention.


In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In other instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.


Similarly, it is to be noticed that the term coupled, when used in the claims, should not be interpreted as being limited to direct connections only. The terms “coupled” and “connected,” along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other. Thus, the scope of the expression a device A coupled to a device B should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B which may be a path including other devices or means. “Coupled” may mean that two or more elements are either in direct physical or electrical contact, or that two or more elements are not in direct contact with each other but yet still co-operate or interact with each other.


Thus, while there has been described what are believed to be the preferred embodiments of the invention, those skilled in the art will recognize that other and further modifications may be made thereto without departing from the spirit of the invention, and it is intended to claim all such changes and modifications as falling within the scope of the invention. For example, any formulas given above are merely representative of procedures that may be used. Functionality may be added or deleted from the block diagrams and operations may be interchanged among functional blocks. Steps may be added or deleted to methods described within the scope of the present invention.

Claims
  • 1. A method, performed by a computing device, for controlling a Digital Video Management (DVM) system, the method comprising: configuring the DVM system to operate in a normal mode of operation, whereby recordings are made manually and in response to a set of automated recording triggers; providing an incident trigger which is configured to be selectively actuated; in response to actuation of the incident trigger, defining a unique incident identifier; during a period of time for which the incident trigger is actuated, configuring the DVM system to cease operation in the normal mode of operation, and instead operate in an incident recording mode of operation, wherein operating in the incident recording mode of operation includes applying an incident recording protocol, thereby to automatically make recordings in accordance with the incident recording protocol, wherein the recordings made automatically in accordance with the incident recording protocol include recordings that are not made automatically in the normal mode of operation, wherein applying the incident recording protocol comprises: (i) monitoring a user interaction with a DVM client thereby to identify a set of one or more cameras for which live video is being viewed by the user of the DVM client; (ii) triggering recordings of live video data responsive to the monitoring of the set of one or more cameras for which live video is being viewed by the user of the DVM client, such that a set of recordings is made that directly corresponds with what the user viewed; and (iii) associating each of the recordings made whilst in the incident recording mode of operation with the unique incident identifier and storing the recordings in association with the unique incident identifier in a database, wherein the recordings associated with the unique incident identifier are identified and retrieved from the database based on a user query including the unique incident identifier.
  • 2. A method according to claim 1 wherein the set of one or more cameras includes a camera for which live video is displayed in a major video display object and one or more cameras for which live video is displayed in respective minor video display objects.
  • 3. A method according to claim 1 wherein applying the incident recording protocol includes making recordings of activity at a DVM client.
  • 4. A method according to claim 3 wherein making recordings of activity at the DVM client includes making recordings of one or more screen images presented to a user of the DVM client.
  • 5. A method according to claim 4 wherein the one or more screen images define a video of what is presented to the user of the DVM client during the period of time for which the incident trigger is actuated.
  • 6. A method according to claim 1 wherein applying the incident recording protocol includes associating all image snapshots created at a DVM client with the incident identifier.
  • 7. A method according to claim 1 wherein the incident trigger is selectively manually actuated by a user of a DVM client.
  • 8. A method according to claim 1 wherein the incident trigger is automatically actuated in response to predefined criteria being met.
  • 9. A method according to claim 1 comprising providing an incident review interface via a DVM client for providing a user with access to all of the recordings associated with a given incident identifier.
  • 10. A Digital Video Management (DVM) system configured to perform a method comprising: providing an incident trigger; in response to actuation of the incident trigger: defining a unique incident identifier; configuring the DVM system to cease operation in a normal mode of operation, and instead operate in an incident recording mode of operation, wherein operating in the incident recording mode of operation includes applying an incident recording protocol, thereby to automatically make recordings in accordance with the incident recording protocol, wherein the recordings made automatically in accordance with the incident recording protocol include recordings that are not made automatically in the normal mode of operation, wherein applying the incident recording protocol comprises: (i) monitoring a user interaction with a DVM client thereby to identify a set of one or more cameras for which live video is being viewed by the user of the DVM client; (ii) triggering recordings of live video data responsive to the monitoring of the set of one or more cameras for which live video is being viewed by the user of the DVM client, such that a set of recordings is made that directly corresponds with what the user viewed; and (iii) associating each of the recordings made whilst in the incident recording mode of operation with the unique incident identifier and storing the recordings in association with the unique incident identifier in a database, wherein the recordings associated with the unique incident identifier are identified and retrieved from the database based on a user query including the unique incident identifier.
  • 11. A DVM system according to claim 10 comprising: a plurality of camera servers, wherein each camera server is configured to make available video data from an assigned one or more video streaming units; anda plurality of video streaming units, wherein each streaming unit is configured to stream, onto a network, video data for a respective camera.
  • 12. A system according to claim 10 wherein the set of one or more cameras includes a camera for which live video is displayed in a major video display object and one or more cameras for which live video is displayed in respective minor video display objects.
  • 13. A system according to claim 10 wherein applying the incident recording protocol includes making recordings of activity at a DVM client.
  • 14. A tangible non-transitory carrier medium carrying computer executable code that, when executed via one or more processors, allows the performance of a method comprising: configuring a Digital Video Management (DVM) system to operate in a normal mode of operation, whereby recordings are made manually and in response to a set of automated recording triggers; providing an incident trigger; in response to actuation of the incident trigger, defining a unique incident identifier; during a period of time for which the incident trigger is actuated, configuring the DVM system to cease operation in the normal mode of operation, and instead operate in an incident recording mode of operation, wherein operating in the incident recording mode of operation includes applying an incident recording protocol, thereby to automatically make recordings in accordance with the incident recording protocol, wherein the recordings made automatically in accordance with the incident recording protocol include recordings that are not made automatically in the normal mode of operation, wherein applying the incident recording protocol comprises: (i) monitoring a user interaction with a DVM client thereby to identify a set of one or more cameras for which live video is being viewed by the user of the DVM client; (ii) triggering recordings of live video data responsive to the monitoring of the set of one or more cameras for which live video is being viewed by the user of the DVM client, such that a set of recordings is made that directly corresponds with what the user viewed; and (iii) associating each of the recordings made whilst in the incident recording mode of operation with the unique incident identifier and storing the recordings in association with the unique incident identifier in a database, wherein the recordings associated with the unique incident identifier are identified and retrieved from the database based on a user query including the unique incident identifier.
  • 15. A tangible non-transitory carrier medium according to claim 14 wherein the set of one or more cameras includes a camera for which live video is displayed in a major video display object and one or more cameras for which live video is displayed in respective minor video display objects.
  • 16. A tangible non-transitory carrier medium according to claim 14 wherein applying the incident recording protocol includes making recordings of activity at a DVM client.
Priority Claims (1)
Number Date Country Kind
2011903153 Aug 2011 AU national
PCT Information
Filing Document Filing Date Country Kind 371c Date
PCT/AU2012/000914 8/3/2012 WO 00 2/5/2014
Publishing Document Publishing Date Country Kind
WO2013/020165 2/14/2013 WO A
US Referenced Citations (322)
Number Name Date Kind
3753232 Sporer Aug 1973 A
3806911 Pripusich Apr 1974 A
3857018 Stark et al. Dec 1974 A
3860911 Hinman et al. Jan 1975 A
3866173 Moorman et al. Feb 1975 A
3906447 Crafton Sep 1975 A
4001881 Folsom Jan 1977 A
4095739 Fox et al. Jun 1978 A
4146085 Wills Mar 1979 A
4148012 Baump et al. Apr 1979 A
4161778 Getson, Jr. et al. Jul 1979 A
4213118 Genest et al. Jul 1980 A
4283710 Genest et al. Aug 1981 A
4298946 Hartsell et al. Nov 1981 A
4332852 Korklan et al. Jun 1982 A
4336902 Neal Jun 1982 A
4337893 Flanders et al. Jul 1982 A
4353064 Stamm Oct 1982 A
4373664 Barker et al. Feb 1983 A
4379483 Farley Apr 1983 A
4462028 Ryan et al. Jul 1984 A
4525777 Webster et al. Jun 1985 A
4538056 Young et al. Aug 1985 A
4556169 Zervos Dec 1985 A
4628201 Schmitt Dec 1986 A
4646964 Parker et al. Mar 1987 A
4685615 Hart Aug 1987 A
4821177 Koegel et al. Apr 1989 A
4847839 Hudson, Jr. et al. Jul 1989 A
5070468 Niinomi et al. Dec 1991 A
5071065 Aalto et al. Dec 1991 A
5099420 Barlow et al. Mar 1992 A
5172565 Wruck et al. Dec 1992 A
5204663 Lee Apr 1993 A
5227122 Scarola et al. Jul 1993 A
5259553 Shyu Nov 1993 A
5271453 Yoshida et al. Dec 1993 A
5361982 Liebl et al. Nov 1994 A
5404934 Carlson et al. Apr 1995 A
5420927 Micali May 1995 A
5449112 Heitman et al. Sep 1995 A
5465082 Chaco Nov 1995 A
5479154 Wolfram Dec 1995 A
5481481 Frey et al. Jan 1996 A
5526871 Musser et al. Jun 1996 A
5541585 Duhame et al. Jul 1996 A
5591950 Imedio-Ocana Jan 1997 A
5594429 Nakahara Jan 1997 A
5604804 Micali Feb 1997 A
5610982 Micali Mar 1997 A
5631825 van Weele et al. May 1997 A
5640151 Reis et al. Jun 1997 A
5644302 Hana et al. Jul 1997 A
5663957 Dent Sep 1997 A
5666416 Micali Sep 1997 A
5717757 Micali Feb 1998 A
5717758 Micali Feb 1998 A
5717759 Micali Feb 1998 A
5732691 Maiello et al. Mar 1998 A
5774058 Henry et al. Jun 1998 A
5778256 Darbee Jul 1998 A
5793868 Micali Aug 1998 A
5844553 Hao et al. Dec 1998 A
5914875 Monta et al. Jun 1999 A
5915069 Nishijima Jun 1999 A
5915473 Ganesh et al. Jun 1999 A
5923817 Nakamura Jul 1999 A
5927398 Maciulewicz Jul 1999 A
5930773 Crooks et al. Jul 1999 A
5960083 Micali Sep 1999 A
5973613 Reis et al. Oct 1999 A
5992194 Baukholt et al. Nov 1999 A
6072402 Kniffin et al. Jun 2000 A
6097811 Micali Aug 2000 A
6104963 Cebasek et al. Aug 2000 A
6119125 Gloudeman et al. Sep 2000 A
6141595 Gloudeman et al. Oct 2000 A
6149065 White et al. Nov 2000 A
6154681 Drees et al. Nov 2000 A
6167316 Gloudeman et al. Dec 2000 A
6233954 Mehaffey et al. May 2001 B1
6241156 Kline et al. Jun 2001 B1
6249755 Yemini et al. Jun 2001 B1
6260765 Natale et al. Jul 2001 B1
6268797 Berube et al. Jul 2001 B1
6292893 Micali Sep 2001 B1
6301659 Micali Oct 2001 B1
6318137 Chaum Nov 2001 B1
6324854 Jayanth Dec 2001 B1
6334121 Primeaux et al. Dec 2001 B1
6347374 Drake et al. Feb 2002 B1
6366558 Howes et al. Apr 2002 B1
6369719 Tracy et al. Apr 2002 B1
6374356 Daigneault et al. Apr 2002 B1
6393848 Roh et al. May 2002 B2
6394359 Morgan May 2002 B1
6424068 Nakagashi Jul 2002 B2
6453426 Gamache et al. Sep 2002 B1
6453687 Sharood et al. Sep 2002 B2
6476858 Ramirez Diaz et al. Nov 2002 B1
6483697 Jenks et al. Nov 2002 B1
6487658 Micali Nov 2002 B1
6490610 Rizvi et al. Dec 2002 B1
6496575 Vasell et al. Dec 2002 B1
6516357 Hamann et al. Feb 2003 B1
6518953 Armstrong Feb 2003 B1
6546419 Humpleman et al. Apr 2003 B1
6556899 Harvey et al. Apr 2003 B1
6574537 Kipersztok et al. Jun 2003 B2
6583712 Reed et al. Jun 2003 B1
6604023 Brown et al. Aug 2003 B1
6615594 Jayanth et al. Sep 2003 B2
6628997 Fox et al. Sep 2003 B1
6647317 Takai et al. Nov 2003 B2
6647400 Moran Nov 2003 B1
6658373 Rossi et al. Dec 2003 B2
6663010 Chene et al. Dec 2003 B2
6665669 Han et al. Dec 2003 B2
6667690 Durej et al. Dec 2003 B2
6741915 Poth May 2004 B2
6758051 Jayanth et al. Jul 2004 B2
6766450 Micali Jul 2004 B2
6789739 Rosen Sep 2004 B2
6796494 Gonzalo Sep 2004 B1
6801849 Szukala et al. Oct 2004 B2
6801907 Zagami Oct 2004 B1
6826454 Sulfstede Nov 2004 B2
6829332 Farris et al. Dec 2004 B2
6851621 Wacker et al. Feb 2005 B1
6871193 Campbell et al. Mar 2005 B1
6886742 Stoutenburg et al. May 2005 B2
6895215 Uhlmann May 2005 B2
6910135 Grainger Jun 2005 B1
6967612 Gorman et al. Nov 2005 B1
6969542 Klasen-Memmer et al. Nov 2005 B2
6970070 Juels et al. Nov 2005 B2
6970183 Monroe Nov 2005 B1
6973410 Seigel Dec 2005 B2
6983889 Alles Jan 2006 B2
6989742 Ueno et al. Jan 2006 B2
7004401 Kallestad Feb 2006 B2
7019614 Lavelle et al. Mar 2006 B2
7032114 Moran Apr 2006 B1
7055759 Wacker et al. Jun 2006 B2
7076083 Blazey Jul 2006 B2
7117356 LaCous Oct 2006 B2
7119832 Blanco Oct 2006 B2
7124943 Quan et al. Oct 2006 B2
7130719 Ehlers et al. Oct 2006 B2
7151772 Kalmanek, Jr. Dec 2006 B1
7183894 Yui et al. Feb 2007 B2
7203962 Moran Apr 2007 B1
7205882 Libin Apr 2007 B2
7216007 Johnson May 2007 B2
7216015 Poth May 2007 B2
7218243 Hayes et al. May 2007 B2
7222800 Wruck May 2007 B2
7233243 Roche et al. Jun 2007 B2
7243001 Janert et al. Jul 2007 B2
7245223 Trela Jul 2007 B2
7250853 Flynn Jul 2007 B2
7274676 Cardei et al. Sep 2007 B2
7280030 Monaco Oct 2007 B1