Systems and methods for generating an audit trail for auditable devices

Information

  • Patent Grant
  • Patent Number
    10,848,717
  • Date Filed
    Tuesday, December 11, 2018
  • Date Issued
    Tuesday, November 24, 2020
Abstract
In some embodiments, auditable devices such as auditable cameras are provided. The auditable devices maintain audit trail data that is digitally signed and stored on the devices until it can be uploaded to an evidence management system. When intermittent data connections to the evidence management system are available, an auditable device may transmit records of urgent events to the evidence management system. The transmission of urgent events via ad-hoc data connections and the digital signing of audit trail data help overcome technical obstacles in establishing provably reliable data collection while minimizing power consumption by the auditable device.
Description
BACKGROUND

Existing systems for collecting digital video data from wearable cameras have a plethora of technical shortcomings, especially when used in a law enforcement context. Wearable cameras for law enforcement require long battery life to last through an entire shift. However, it is also desirable to monitor the activities performed with such cameras in order to ensure that a chain of custody of any videos recorded by the cameras can be provably established, and to ensure that any videos recorded by the cameras are eventually uploaded to an evidence management system for storage. Because the cameras require minimal power consumption and have only intermittent connectivity to an evidence management system (if any at all), technical hurdles exist in collecting data from wearable cameras in a way that can provide reliable chain-of-custody information and preserve battery life at the same time.


SUMMARY

This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.


In some embodiments, an auditable camera is provided. The auditable camera comprises a video sensor, a video data store, an audit trail data store, an audit trail gathering engine, and an audit trail reporting engine. The video data store is configured to store video data generated by the video sensor. The audit trail data store is configured to store a set of auditable event entries. The audit trail gathering engine is configured to, in response to receiving a notification of an event reported by a component of the auditable camera, record an auditable event entry representing the event in the audit trail data store. The audit trail reporting engine is configured to transmit one or more auditable event entries from the audit trail data store to an evidence management system.


In some embodiments, a coordinator computing device is provided. The coordinator computing device comprises a short range wireless interface, a long range wireless interface, an auditable device communication engine, and a system communication engine. The auditable device communication engine is configured to establish a short range communication channel between the coordinator computing device and an auditable device via the short range wireless interface; and to receive an auditable event entry from the auditable device via the short range communication channel. The system communication engine is configured to establish a long range communication channel between the coordinator computing device and an evidence management system via the long range wireless interface; and to transmit the auditable event entry to the evidence management system via the long range communication channel.


In some embodiments, an evidence management system is provided. The evidence management system comprises a video data store, an audit trail data store, and at least one computing device configured to provide a data gathering engine. The video data store is configured to store video data received from a plurality of wearable cameras. The audit trail data store is configured to store a plurality of auditable event entries generated by the plurality of wearable cameras. The data gathering engine is configured to receive video data from a plurality of wearable cameras and store the video data in the video data store; and to receive auditable event entries generated by the plurality of wearable cameras and store the auditable event entries in the audit trail data store.





DESCRIPTION OF THE DRAWINGS

The foregoing aspects and many of the attendant advantages of this invention will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings, wherein:



FIG. 1 is a high-level schematic diagram that illustrates communication between various components of an exemplary embodiment of a system according to various aspects of the present disclosure;



FIG. 2 is a block diagram that illustrates components of an exemplary embodiment of an auditable camera according to various aspects of the present disclosure;



FIG. 3 is a block diagram that illustrates an exemplary embodiment of a coordinator computing device according to various aspects of the present disclosure;



FIG. 4 is a block diagram that illustrates an exemplary embodiment of an evidence management system according to various aspects of the present disclosure;



FIGS. 5A-5B are a flowchart that illustrates an exemplary embodiment of a method of auditing camera activity according to various aspects of the present disclosure;



FIG. 6 is a flowchart that illustrates an exemplary embodiment of a method of managing collected audit trail data according to various aspects of the present disclosure; and



FIG. 7 is a block diagram that illustrates aspects of an exemplary computing device 700 appropriate for use as a computing device of the present disclosure.





DETAILED DESCRIPTION

Embodiments of the present disclosure solve the technical problems introduced above, as well as others. In some embodiments, auditable devices such as auditable cameras are provided. The auditable devices maintain audit trail data that is digitally signed and stored on the devices until it can be uploaded to an evidence management system. When an intermittent data connection to the evidence management system is available, the auditable devices transmit records of urgent events to the evidence management system, so that the evidence management system is notified to expect further data associated with the events despite the lack of persistent data connectivity to the device. By using these and other techniques, the technical problems introduced above and others can be overcome.



FIG. 1 is a high-level schematic diagram that illustrates communication between various components of an exemplary embodiment of a system according to various aspects of the present disclosure. In some embodiments, the system 100 is configured to allow for collection of auditing information from a plurality of auditable devices. The auditing information is generated by the auditable devices while they are being operated, and is gathered by an evidence management system 102.


In general, a user 92, such as a law enforcement officer, may carry one or more auditable devices. The devices may include, but are not limited to, an auditable camera 106, an auditable weapon 108, and a light bar sensor 110. The auditable camera 106 may be, for example, a wearable camera that records video and/or audio data when activated, and stores a record of auditable events that occur (such as the start of recording, the end of recording, various device faults, and/or the like, as described further below). The auditable weapon 108 may be, for example, a conducted energy weapon (CEW) that records firing events, cartridge loading, holster removal, and/or the like. The light bar sensor 110 may detect activation of the light bar on the vehicle 94, which is usually associated with an emergency situation. Other auditable devices, such as a dashboard camera or a heart rate sensor, may also be included in the system 100 but are not illustrated in FIG. 1.


In some embodiments, the auditable devices may have limited communication functionality. For example, auditable devices may only be able to transmit information to the evidence management system 102 when physically connected to an evidence collection dock 104 that communicates with the evidence management system 102 via a broadband network 90 such as a LAN, a WAN, and/or the Internet. Accordingly, technical problems arise when attempting to obtain verifiably complete auditing information, because there is no reliable, always-on communication path between the evidence management system 102 and the auditable devices.


In some embodiments, an ad-hoc communication path may be created between the auditable device and the evidence management system 102 via a coordinator computing device 107. The coordinator computing device 107 is illustrated as a smartphone computing device, but in some embodiments may be a laptop computing device, a tablet computing device, or any other suitable computing device capable of being carried by the user 92 or a vehicle 94 associated with the user 92 and capable of performing the actions described herein. The coordinator computing device 107 communicates with the auditable devices, and may also communicate with other devices such as a light bar sensor 110 of a vehicle 94 associated with the user 92, using a short-range wireless communication technology such as Bluetooth, Zigbee, IrDA, ANT, ANT+, 802.15.4, near-field communication (NFC), a radio frequency identifier (RFID) tag, and/or the like. In some embodiments, the auditable camera 106 may receive data directly from other auditable devices such as an auditable weapon 108 or a light bar sensor 110 in order to enhance the auditable event entries collected by the auditable camera 106.


The coordinator computing device 107 is capable of communicating with the evidence management system 102, though the communication path between the coordinator computing device 107 and the evidence management system 102 may be unreliable, may drain the battery too quickly to be constantly enabled, or may be of relatively low bandwidth. As such, the communication of information from the coordinator computing device 107 to the evidence management system 102 may be sporadic and may include only a subset of the information gathered by or audited on the auditable devices. Further aspects of these devices and their capabilities will be discussed below.



FIG. 2 is a block diagram that illustrates components of an exemplary embodiment of an auditable camera according to various aspects of the present disclosure. In some embodiments, the auditable camera 106 is a wearable camera that provides a point of view associated with the user 92. In some embodiments, the auditable camera 106 may be attached to another device carried by the user 92, such as a weapon.


As with any camera, the auditable camera 106 includes at least a video sensor 202, and may also include an audio sensor 206. Data collected by the video sensor 202 and the audio sensor 206 may be stored in a video data store 222 and an audio data store 224, respectively, though in some embodiments the audio and video information is stored together in a single data store and/or in a combined data file. One example of an appropriate video sensor is a charge-coupled device (CCD), though any other digital image sensor, such as a complementary metal-oxide-semiconductor (CMOS) sensor, an active pixel sensor, or any other type of digital image sensor could be used instead. Any type of microphone may be used as an audio sensor 206.


As understood by one of ordinary skill in the art, a “data store” as described herein may be any suitable device configured to store data for access by a computing device. One example of a data store suitable for use with the high capacity needs of the evidence management system 102 is a highly reliable, high-speed relational database management system (RDBMS) executing on one or more computing devices and accessible over a high-speed network. However, any other suitable storage technique and/or device capable of quickly and reliably providing the stored data in response to queries may be used, such as a key-value store, an object database, and/or the like. Further, for the evidence management system 102, the computing device providing the data store may be accessible locally instead of over a network, or may be provided as a cloud-based service. A data store may also include data stored in an organized manner on a computer-readable storage medium, as described further below. One example of a data store suitable for use with the needs of the auditable camera 106 and the coordinator computing device 107, which includes reliable storage but also low overhead, is a file system or database management system that stores data in files (or records) on a computer readable medium such as flash memory, random access memory (RAM), hard disk drives, and/or the like. One of ordinary skill in the art will recognize that separate data stores described herein may be combined into a single data store, and/or a single data store described herein may be separated into multiple data stores, without departing from the scope of the present disclosure.


The auditable camera 106 includes a set of engines, including a camera control engine 204, an audit trail signing engine 210, an audit trail gathering engine 214, and an audit trail reporting engine 212. In general, the term “engine” as used herein refers to logic embodied in hardware or software instructions, which can be written in a programming language, such as C, C++, COBOL, JAVA™, PHP, Perl, HTML, CSS, JavaScript, VBScript, ASPX, Microsoft .NET™ languages such as C#, and/or the like. An engine may be compiled into executable programs or written in interpreted programming languages. Engines may be callable from other engines or from themselves. Generally, the engines described herein refer to logical modules that can be merged with other engines or applications, or can be divided into sub-engines. The engines can be stored in any type of computer readable medium or computer storage device and be stored on and executed by one or more general purpose computers, thus creating a special purpose computer configured to provide the engine. Accordingly, the devices and systems illustrated herein include one or more computing devices configured to provide the illustrated engines, though the computing devices themselves have not been illustrated in every case for the sake of clarity.


The camera control engine 204 is configured to cause the auditable camera 106 to perform camera functions. For example, the camera control engine 204 may cause the video sensor 202 and audio sensor 206 to begin obtaining data, and may save the video and/or audio data in a video data store 222 and/or audio data store 224 after receiving it from the sensor. The camera control engine 204 may receive commands to start, pause, or stop the video recording from a physical user interface device 208, or may automatically start, pause, or stop the video recording in response to a signal received from some other component of the auditable camera 106 (such as the battery sensor 234, the clock 230, the motion sensor 238, the short-range wireless interface 228, and/or the like). The camera control engine 204 may also change settings on the video sensor 202 and/or audio sensor 206, such as an image quality, a white balance setting, a gain, and/or any other common video or audio recording setting, based on configuration settings received via a physical user interface device 208, via the short-range wireless interface 228, via the physical dock interface 232, or by any other suitable technique. Any type of physical user interface device 208 that can transmit commands to the camera control engine 204 may be used, including but not limited to push button switches, toggle switches, slide switches, touch switches, and/or the like.


In some embodiments, the camera control engine 204 may report starting, pausing, or stopping the video recording, as well as the settings for the video sensor 202 and audio sensor 206, as auditable events to the audit trail gathering engine 214. In some embodiments, the camera control engine 204 may embed the sensor configuration information in the data stored in the video data store 222 and/or audio data store 224, along with other information about the state of the auditable camera 106 or other information received via the short-range wireless interface about other auditable devices, to prevent the additional information from being disassociated from the video and/or audio data.


The auditable camera 106 also includes an audit trail data store 216 for storing records of auditable events and a certificate data store 218 for storing data usable for digital signing. The audit trail gathering engine 214 is configured to receive reports of auditable events from the other components of the auditable camera 106, and to record auditable event records that represent the events in the audit trail data store 216. The audit trail signing engine 210 is configured to use one or more signing certificates from the certificate data store 218 to apply a digital signature to the auditable event records in the audit trail data store 216. The audit trail reporting engine 212 is configured to transmit the audit trail information from the audit trail data store 216 to the evidence management system 102, either directly via the long-range wireless interface 226 or physical dock interface 232, or via the short-range wireless interface 228 and the coordinator computing device 107. Further details of the functionality of these components are provided below.


The auditable camera 106 also includes a number of general components, including a clock 230, a motion sensor 238, a physical dock interface 232, a battery sensor 234, and a battery 236. The clock 230 may be used by the audit trail gathering engine 214, the audit trail reporting engine 212, the camera control engine 204, or any other component of the auditable camera 106 to include timestamp information in generated data. The motion sensor 238, such as a multi-axis accelerometer, also produces information that may be used by other components. For example, the audit trail gathering engine 214 may use the motion sensor 238 to detect a certain type of motion, such as running, falling, and/or the like, and to treat the detected motion as an auditable event.
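

For illustration, the following Python sketch shows one way such an accelerometer-based check could be implemented. It is a minimal sketch under assumed thresholds and assumed helper callbacks (read_sample, report_event); the disclosure does not prescribe a particular detection algorithm.

```python
# Minimal, assumed sketch of motion-based auditable event detection.
# Threshold values and the report_event() callback are illustrative only.
import math
from typing import Optional

FREE_FALL_G = 0.3   # acceleration magnitude near zero g suggests the device is falling
IMPACT_G = 3.0      # a large spike suggests a drop or sudden impact

def classify_motion_sample(ax: float, ay: float, az: float) -> Optional[str]:
    """Return an auditable event type for one accelerometer sample, or None."""
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)  # in units of g
    if magnitude < FREE_FALL_G:
        return "DROP_EVENT"
    if magnitude > IMPACT_G:
        return "PHYSICAL_MOTION_EVENT"
    return None

def poll_motion_sensor(read_sample, report_event) -> None:
    """Poll the motion sensor once and report any detected auditable event."""
    event_type = classify_motion_sample(*read_sample())
    if event_type is not None:
        report_event(event_type)
```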


The physical dock interface 232 is configured to mate with a physical connector on the evidence collection dock 104. In some embodiments, the physical dock interface 232 may include a female 2.5 mm socket, which mates with a male 2.5 mm plug of the evidence collection dock 104. Once docked, the auditable camera 106 may then transfer data to the evidence management system 102 via the connection using any suitable data transmission protocol. In some embodiments, power may be transferred to the auditable camera 106 via the physical dock interface 232 instead of or in addition to the data transfer. In some embodiments, other connection hardware that can provide both power and data connectivity may be used, such as a USB connector, a Firewire connector, and/or the like.


The battery sensor 234 and the battery 236 are an example of an internal system that may generate auditable events that are captured by the audit trail gathering engine 214 without user intervention. For example, the battery sensor 234 may detect a low battery state, a battery overheating state, and/or the like, and may generate alerts to be logged as auditable events by the audit trail gathering engine. Other well-known internal device systems, such as a file system controller, a free-fall sensor, and/or the like, may similarly generate alerts to be logged as auditable events, but are not illustrated here.


The auditable camera 106 also includes an optional long-range wireless interface 226 and an optional short-range wireless interface 228. The long-range wireless interface 226 may use any suitable networking technology capable of establishing a wireless data connection to the evidence management system 102 from the auditable camera 106 from any geographical area, including but not limited to 3G, 4G, LTE, and/or the like. The short-range wireless interface 228 may use any suitable wireless networking technology capable of establishing a wireless data connection to the coordinator computing device 107 when within range of the coordinator computing device 107, including but not limited to Bluetooth, ZigBee, NFC, and/or the like. These wireless interfaces are illustrated as optional, because some embodiments may be missing one or both interfaces. For example, in some embodiments, the only communication performed by the auditable camera 106 may be through the physical dock interface 232. As another example, in some embodiments only the short-range wireless interface 228 may be present, and so the auditable camera 106 may only be able to communicate with the evidence management system 102 via the physical dock interface 232 or via the short-range wireless interface 228 through the coordinator computing device 107.


Though FIG. 2 illustrates components of a camera, one of ordinary skill in the art will recognize that, other than the components that make the device a camera such as the video sensor 202, audio sensor 206, and the associated data stores, similar components may be included in an auditable weapon 108 or an auditable device of some other type.



FIG. 3 is a block diagram that illustrates an exemplary embodiment of a coordinator computing device according to various aspects of the present disclosure. As illustrated, the coordinator computing device 107, which as stated above may be a smart phone or any other suitable computing device, includes a short-range wireless interface 306, a long-range wireless interface 310, and a GPS sensor 314. The short-range wireless interface 306 is configured to allow the components of the coordinator computing device 107 to communicate with the other nearby components of the system 100 when they are within range, including auditable devices such as the auditable camera 106 and the auditable weapon 108 and the light bar sensor 110 of the vehicle 94. The short-range wireless interface 306 may use one or more short-range wireless networking technologies such as those discussed above with respect to the auditable camera 106 in order to provide these communication paths. The long-range wireless interface 310 is configured to allow the coordinator computing device 107 to communicate data with the evidence management system 102 as long as the coordinator computing device 107 is within a service area of a corresponding long-range wireless data service, such as 3G, 4G, LTE, and/or the like. The GPS sensor 314 is configured to obtain a geographic position of the coordinator computing device 107 using satellite signals, triangulation of cellular phone towers, or any other technique known to one of ordinary skill in the art.


The coordinator computing device 107 also includes one or more user interface devices 302, a user interface engine 304, an auditable device communication engine 308, and a system communication engine 312. The one or more user interface devices 302 may include, but are not limited to, physical buttons, status lights, a display, a touch-screen display, a loudspeaker, and/or the like. The user interface engine 304 is configured to present an interactive interface to the user 92 and receive input from the user 92 via the user interface devices 302. The interactive interface may allow the user 92 to control the auditable devices, add information to auditable events, generate auditable events, manually initiate an upload of urgent auditable events to the evidence management system 102, and/or take other actions within the system 100. The auditable device communication engine 308 manages communication with the auditable devices via the short-range wireless interface 306, and the system communication engine 312 manages communication with the evidence management system 102. Further details of the functionality of the components of the coordinator computing device 107 are provided below.



FIG. 4 is a block diagram that illustrates an exemplary embodiment of an evidence management system according to various aspects of the present disclosure. In some embodiments, the evidence management system 102 comprises a plurality of computing devices configured to provide the illustrated components, though they are described as a single system for clarity. One of ordinary skill in the art will recognize that any suitable server system, such as a single server, a server farm, a cloud service, and/or the like, may be used to provide the functionality of the evidence management system 102.


As illustrated, the evidence management system 102 includes a computer aided dispatch system interface 406, a records management system interface 408, and a network interface 422. The computer-aided dispatch system interface 406 enables the evidence management system 102 to communicate with one or more computer aided dispatch (CAD) systems 402 operated by other parties. This communication allows the evidence management system 102 to automatically import information from the CAD systems 402, such as incident codes, user 92 or vehicle 94 locations at given times, and/or the like. This information may then be correlated with or otherwise used to enhance information received from auditable devices. The records management system interface 408 enables the evidence management system 102 to communicate with one or more records management systems 404 operated by other parties. This communication likewise allows the evidence management system 102 to automatically import information such as case numbers and/or the like. The records management system interface 408 may also provide information from the evidence management system 102 back to the records management systems 404, including but not limited to links to data stored in the video data store 412, the audio data store 414, and/or the audit trail data store 416; copies of audit trail information stored in the audit trail data store 416; and/or the like.


The evidence management system 102 also includes an auditable device data store 410, a video data store 412, an audio data store 414, and an audit trail data store 416. The auditable device data store 410 may be configured to store information associated with each of the auditable devices of the system 100. For example, for a given auditable camera 106, the auditable device data store 410 may store information such as a unique device identifier, an identifier (such as a badge number or the like) of a user 92 associated with the auditable camera 106 at a given time or date, capabilities of the auditable camera 106, and/or the like. The video data store 412 and audio data store 414 are configured to store data captured by one or more auditable cameras 106 or other devices that can capture audio and/or video data and are enrolled with the system 100. In some embodiments, the video data store 412 and audio data store 414 are merged into a single data store, and audio and video data that are recorded contemporaneously may be stored together in a single file. The audit trail data store 416 stores records of auditable events detected by the auditable devices of the system 100, as described further below.


The evidence management system 102 also includes a data gathering engine 418 and a user interface engine 420. The data gathering engine 418 is configured to receive audit trail information, video data, and audio data from the auditable devices via the evidence collection dock 104 and the coordinator computing device 107. The user interface engine 420 is configured to generate and present user interfaces for displaying and interacting with the data collected by the evidence management system 102 via web pages, application programming interfaces, or any other suitable technology. Each of the interfaces and engines of the evidence management system 102 is configured to use a network interface 422 for communication with other components of the system 100 via the Internet. Further description of the actions taken by the components of the evidence management system 102 is provided below.



FIGS. 5A-5B are a flowchart that illustrates an exemplary embodiment of a method of auditing camera activity according to various aspects of the present disclosure. One of ordinary skill in the art will recognize that, although a method of auditing camera activity is illustrated and described, other devices that detect events (such as the auditable weapon 108, the light bar sensor 110, and/or the like) could also be audited using a similar method, though for such devices the actions relating to transfer of video and/or audio data may not be performed.


From a start block, the method 500 proceeds to block 502, where an audit trail gathering engine 214 of an auditable camera 106 detects a disconnection of the auditable camera 106 from a physical dock interface 232. In some embodiments, when the physical dock interface 232 is electrically disconnected from the evidence collection dock 104, the physical dock interface 232 transmits a signal to the audit trail gathering engine 214 to notify it of the disconnection event. Next, at block 504, the audit trail gathering engine 214 creates an auditable event entry in an audit trail data store 216 of the auditable camera 106 indicating that the auditable camera 106 has been undocked.


In some embodiments, particularly where the user 92 is a law enforcement official, the undocking auditable event entry may indicate the start of a shift and the related start of auditing activities related to that shift. In some embodiments, the audit trail data store 216 may have been emptied after the previously recorded information was transmitted to the evidence management system 102 (as described further below), so at this point, the auditable event entry indicating the start of shift may be the only auditable event entry in the audit trail data store 216.


The method 500 then proceeds to block 508, where the audit trail gathering engine 214 receives a signal indicating an auditable event. The signal may be received from another component of the auditable camera 106, such as the camera control engine 204, the battery sensor 234, a physical user interface device 208, the physical dock interface 232, or any other component. The signal may indicate any of a wide variety of auditable events. These events include, but are not limited to: a start recording event, a stop recording event, a pause recording event, a button press event, a battery or other hardware fault, a battery charge status threshold event, a storage error, a free storage threshold event, an interaction or connection status change via a wireless interface, an interaction or connection status change via the physical dock interface 232, a physical motion event, a lack-of-physical-motion event (i.e., the auditable camera 106 has remained stationary for a given amount of time), an environmental event such as a change in environmental temperature or humidity past a threshold value, a geofence event, a drop event, and a debug event.


At block 510, the audit trail gathering engine 214 creates an auditable event entry in the audit trail data store 216. In some embodiments, the auditable event entry includes information that identifies the type of the event (such as a numeric event type identifier, an event type string, and/or the like), and may include further information associated with the event. As several non-limiting examples, a hardware fault auditable event entry may include a fault type and a fault code; a start recording event may include camera settings such as exposure, F-stop, frame rate, and/or the like; a battery fault event may also indicate the type of battery fault; and so on. In some embodiments, the audit trail gathering engine 214 may actively obtain other information to include in the auditable event entry that cannot be derived solely from information included in the signal. As several non-limiting examples, the audit trail gathering engine 214 may itself obtain a timestamp from the clock 230 to be included in the auditable event entry; the audit trail gathering engine 214 may use the short-range wireless interface 228 to obtain GPS information or user-entered metadata from the coordinator computing device 107 to be included in the auditable event entry; and so on.
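

As a concrete illustration, the following Python sketch shows one possible in-memory representation of an auditable event entry and how the audit trail gathering engine 214 might record one. The field names, event type strings, and helper function are assumptions made for the example; the disclosure does not mandate a particular data format.

```python
# Assumed sketch of an auditable event entry and of recording one in the audit trail.
import json
import time
from dataclasses import dataclass, field, asdict

@dataclass
class AuditableEventEntry:
    device_id: str
    event_type: str            # e.g. "START_RECORDING", "BATTERY_FAULT"
    timestamp: float           # obtained from the clock 230
    details: dict = field(default_factory=dict)  # fault codes, camera settings, GPS, ...

def record_event(audit_trail, device_id, event_type, **details):
    """Create an entry, enrich it with a timestamp, and append it to the audit trail."""
    entry = AuditableEventEntry(
        device_id=device_id,
        event_type=event_type,
        timestamp=time.time(),
        details=details,
    )
    audit_trail.append(entry)
    return entry

# Example: a hardware fault entry that includes a fault type and fault code.
trail = []
record_event(trail, "camera-0042", "BATTERY_FAULT", fault_type="overheat", fault_code=17)
print(json.dumps(asdict(trail[-1]), indent=2))
```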


At optional block 512, data associated with the event obtained by one or more sensors of the auditable camera 106 is stored in one or more data stores 222, 224 of the auditable camera 106 and is associated with the auditable event entry. For example, if the event is a start recording event, then data obtained by the camera control engine 204 from the video sensor 202 may be stored in the video data store 222, and/or data obtained by the camera control engine 204 from the audio sensor 206 may be stored in the audio data store 224. To associate the stored data with the auditable event entry, a unique identifier of the auditable event entry may be stored along with the video or audio data, and/or a unique identifier of the video and/or audio data may be stored in the auditable event entry. The actions described with respect to block 512 are optional because some events, like battery faults, may not be related to the recording of data to be stored in one of the data stores 222, 224.


At block 514, an audit trail signing engine 210 of the auditable camera 106 applies a digital signature to the auditable event entry. In some embodiments, applying the digital signature uses a cryptographic hash to guarantee that the auditable event entry has not been altered or corrupted since being signed. For example, in one non-limiting embodiment, the evidence management system 102 may assign a private key to the auditable camera 106, which is stored in the certificate data store 218 when the auditable camera 106 is connected via the physical dock interface 232. The private key is used by the audit trail signing engine 210 to create a cryptographic hash of the auditable event entry, which is then added to the auditable event entry to sign it. Subsequently, a public key made available by the evidence management system 102 may be used to verify that the auditable event entry has not been tampered with since it was signed. In some embodiments, the cryptographic hash may be stored elsewhere, such as in the certificate data store 218, and may simply be associated with the auditable event entry instead of being added to it. In some embodiments, the audit trail signing engine 210 may sign the data stored in the video data store 222 and the audio data store 224 using a similar technique. In some embodiments, the signatures generated for the video and/or audio data may be added to the auditable event entry before the auditable event entry is signed. In some embodiments, some other suitable technique for applying digital signatures that ensure that data has not been altered or corrupted is used instead of the technique described above.
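

The following sketch illustrates the general sign-then-verify pattern described above, assuming the third-party Python `cryptography` package and an ECDSA key pair. The disclosure does not specify a particular signature algorithm, and key provisioning and storage (e.g., in the certificate data store 218) are simplified here to in-memory key generation.

```python
# Assumed sketch of signing an auditable event entry and later verifying it.
import json
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.exceptions import InvalidSignature

# In the system described above, the private key would be provisioned by the
# evidence management system and held in the certificate data store 218.
private_key = ec.generate_private_key(ec.SECP256R1())
public_key = private_key.public_key()

def sign_entry(entry: dict) -> bytes:
    """Serialize the entry deterministically and sign it with the device's private key."""
    payload = json.dumps(entry, sort_keys=True).encode("utf-8")
    return private_key.sign(payload, ec.ECDSA(hashes.SHA256()))

def verify_entry(entry: dict, signature: bytes) -> bool:
    """Verify that the entry has not been altered since it was signed."""
    payload = json.dumps(entry, sort_keys=True).encode("utf-8")
    try:
        public_key.verify(signature, payload, ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False

entry = {"device_id": "camera-0042", "event_type": "START_RECORDING", "timestamp": 1577836800.0}
sig = sign_entry(entry)
assert verify_entry(entry, sig)
entry["event_type"] = "STOP_RECORDING"   # any tampering invalidates the signature
assert not verify_entry(entry, sig)
```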


In some embodiments, the actions described with respect to block 514 may be performed in response to the creation of the auditable event entry as is illustrated. In some embodiments, the actions described with respect to block 514 may instead be performed later, such as in response to a request to transmit the auditable event entry to the evidence management system 102. Waiting to apply the signature until shortly before upload may help to conserve battery life, as the signature processing for much of the information could be deferred until the auditable camera 106 is coupled to the evidence collection dock 104 and is therefore receiving power.


One will note that, if the identifier or signature of the video data and/or audio data is stored in the auditable event entry before the signature is applied to the auditable event entry, then the signatures ensure that the auditable event entry cannot be tampered with to refer to a different video and the video/audio data cannot be replaced without invalidating the signatures. Accordingly, the technical problem of how to irrefutably associate recorded videos with a lightweight log of actions performed by an auditable camera 106 despite intermittent connectivity can be overcome.


At this point in the method 500, the storage of the auditable event entry in the audit trail data store 216 is complete. In most cases, the method 500 would then loop back and wait for a subsequent auditable event, though there are two special cases where the method 500 would either perform additional processing or exit the loop instead of simply looping back. Accordingly, the method 500 proceeds to a decision block 516, where a determination is made regarding whether the auditable event indicated that the auditable camera 106 was placed in an evidence collection dock 104. As with the disconnection event described above with respect to block 502, this may be detected by an electrical connection being made via the physical dock interface 232, and a signal being provided by the physical dock interface 232 to the audit trail gathering engine 214 accordingly. If the result of the determination at decision block 516 is YES, then the method 500 proceeds to a continuation terminal (“terminal B”).


From terminal B (FIG. 5B), the method 500 proceeds to block 526, where the audit trail reporting engine 212 transmits contents of the audit trail data store 216 to the data gathering engine 418 of the evidence management system 102 via the physical dock interface 232 and the evidence collection dock 104. In some embodiments, the evidence collection dock 104 may include a memory, a processor, and networking hardware that are configured to enable the evidence collection dock 104 to download the contents of the audit trail data store 216 from the auditable camera 106 and to subsequently upload the contents to the data gathering engine 418 via the network 90 using any suitable data transmission technique.


At block 528, in response to receiving an acknowledgement receipt from the evidence management system 102, the audit trail gathering engine 214 clears the contents of the audit trail data store 216. Waiting for the acknowledgement receipt ensures that the audit trail data was received by the evidence management system 102 and was stored successfully before it is cleared from the auditable camera 106. In some embodiments, the evidence management system 102 may provide one acknowledgement receipt for the entire contents of the audit trail data store 216. In some embodiments, the evidence management system 102 may provide acknowledgement receipts for subsets of the uploaded data. For example, data may be uploaded in chronological order from the audit trail data store 216, and the evidence management system 102 may transmit an acknowledgement receipt for each individual piece of content. Subsequently, each piece of content would be deleted from the auditable camera 106 once its corresponding acknowledgement receipt is received. Such an embodiment would allow an upload to be interrupted and the auditable camera 106 to be reused without losing any of the data already stored on the auditable camera 106.
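

A minimal sketch of this per-entry acknowledgement behavior follows. The upload_entry() transport function is an assumed placeholder for the transfer via the evidence collection dock 104, and the entry format is illustrative.

```python
# Assumed sketch of chronological upload with per-entry acknowledgement: each entry is
# deleted locally only after the evidence management system confirms it was stored,
# so an interrupted upload loses nothing.
def upload_audit_trail(audit_trail: list, upload_entry) -> None:
    """Upload entries oldest-first, deleting each one only after it is acknowledged."""
    for entry in sorted(audit_trail, key=lambda e: e["timestamp"]):
        acknowledged = upload_entry(entry)   # returns True once the server confirms storage
        if not acknowledged:
            break                            # stop on failure; remaining entries stay on the device
        audit_trail.remove(entry)
```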


At optional block 530, the auditable camera 106 transmits contents of other data stores of the auditable camera 106 to the evidence management system 102 via the physical dock interface 232. For example, the auditable camera 106 may transmit the data from the video data store 222 and/or the audio data store 224, if any, to the evidence management system 102. The actions associated with block 530 are described as optional, because in some embodiments, there might not be any such data, particularly if the method 500 is being used with an auditable device other than an auditable camera 106, such as an auditable weapon 108. In some embodiments, the actions of block 530 may happen before (or contemporaneously with) the actions of block 526. In some embodiments, an acknowledgement receipt may be generated by the evidence management system 102 for the video/audio data as well, and receiving the acknowledgement receipt may cause the auditable camera 106 to delete its copy of the transferred audio/video data. The method 500 then proceeds to an end block and terminates.


Returning to decision block 516 (FIG. 5A), if the result of the determination regarding whether the event indicates that the auditable camera 106 has been connected to a physical dock is NO, then the method 500 proceeds to another decision block 518, where a determination is made regarding whether the auditable event is an urgent event. An urgent event is an auditable event that the system 100 is configured to bring to the attention of the evidence management system 102 as soon as possible, even before the auditable device is returned to the evidence collection dock 104. This functionality is useful to ensure that, for urgent events, the associated data is eventually uploaded to the evidence management system 102 and is not instead forgotten. One example of an urgent event may be a start recording event, and an example of a non-urgent event may be a battery fault event. If the system 100 is configured to consider a start recording event to be an urgent event, then the evidence management system 102 will be notified each time a recording is started with an auditable camera 106, even if the associated video and/or audio is not uploaded to the evidence management system 102. This information can be used in many ways, including prompting the user 92 to dock the auditable camera 106 in order to upload the missing video, providing a data trail indicating that all relevant videos for an incident or during a given time period were uploaded, and/or the like.
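

One simple way to realize the urgency determination at decision block 518 is a configurable set of urgent event types, as in the sketch below. The set membership shown is only an assumption for illustration; the passage above gives a start recording event as an example of an urgent event and a battery fault event as a non-urgent one.

```python
# Assumed sketch of the urgency check at decision block 518: event types in this
# configurable set are reported immediately via the coordinator computing device.
URGENT_EVENT_TYPES = {"START_RECORDING"}

def is_urgent(entry: dict) -> bool:
    """Return True if the auditable event entry should be transmitted immediately."""
    return entry.get("event_type") in URGENT_EVENT_TYPES
```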


If the result of the determination at decision block 518 is NO and the auditable event is not an urgent event, then the method 500 proceeds to a continuation terminal (“terminal A”), and from terminal A returns to block 508 upon the receipt of a subsequent signal indicating an auditable event. Otherwise, if the result of the determination at decision block 518 is YES and the auditable event is an urgent event, then the method 500 proceeds to another continuation terminal (“terminal C”).


From terminal C (FIG. 5B), the method 500 proceeds to block 520, where an audit trail reporting engine 212 of the auditable camera 106 initiates a connection to a coordinator computing device 107 via a short-range wireless interface 228 of the auditable camera 106. In some embodiments, the audit trail reporting engine 212 uses the short-range wireless interface 228 to establish a wireless connection to the coordinator computing device 107. In some embodiments, the wireless connection to the coordinator computing device 107 may already be available, and the audit trail reporting engine 212 just uses the existing connection.


At block 522, the audit trail reporting engine 212 transmits the auditable event entry to the coordinator computing device 107 via the short-range wireless interface 228. Next, at block 524, the coordinator computing device 107 transmits the auditable event entry to a data gathering engine 418 of an evidence management system 102 via a long-range wireless interface 310 of the coordinator computing device 107.


In some embodiments, one or more of the actions described in blocks 520-524 may be performed when an opportunity arises to do so, if the actions cannot be performed immediately. For example, upon detecting the urgent event, the auditable camera 106 may not be within short-range wireless communication range of the coordinator computing device 107. In such a case, the audit trail reporting engine 212 may wait for a notification from the short-range wireless interface 228 that it has returned within range before proceeding. An example of this case would be if the user 92 leaves the vehicle 94, and the coordinator computing device 107 is left within the vehicle 94. The communication channel to the coordinator computing device 107 would be opened once the user 92 returned to the vehicle 94, or to a location near enough to the vehicle 94 to be within short-range wireless communication range. As another example, in some embodiments, the long-range wireless interface 310 of the coordinator computing device 107 could be 3G, 4G, LTE, or another wireless network with broad geographic coverage such that the coordinator computing device 107 would usually remain connected to the evidence management system 102. However, even with broad geographic coverage, the user 92 may take the coordinator computing device 107 through an area without wireless data coverage (such as a tunnel, a valley, a parking garage, and/or the like), or may use a long-range wireless technology with more limited connection range or geographic coverage (such as WiFi). In such limited connectivity environments, the coordinator computing device 107 may receive and store the auditable event entry from the auditable camera 106, and transmit it to the evidence management system 102 once a data connection becomes available via the long-range wireless interface 310.
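

The store-and-forward behavior described in this paragraph can be sketched as a small relay queue on the coordinator computing device 107, as below. The connectivity check and send function are assumed placeholders for the long-range wireless interface 310 and the upload to the evidence management system 102.

```python
# Assumed sketch of store-and-forward relaying on the coordinator computing device:
# urgent entries received over the short-range link are queued locally and flushed
# whenever the long-range connection is available.
from collections import deque

class CoordinatorRelay:
    def __init__(self, is_connected, send_to_evidence_system):
        self.pending = deque()
        self.is_connected = is_connected
        self.send = send_to_evidence_system

    def on_urgent_entry(self, entry) -> None:
        """Called when an urgent auditable event entry arrives via the short-range interface."""
        self.pending.append(entry)
        self.flush()

    def flush(self) -> None:
        """Transmit queued entries while the long-range connection is available."""
        while self.pending and self.is_connected():
            entry = self.pending[0]
            if not self.send(entry):
                break                 # transient failure; retry on the next flush
            self.pending.popleft()
```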


The method 500 then proceeds to terminal A. One of ordinary skill in the art will recognize that the actions described in blocks 520-524 are not blocking steps. Instead, the method 500 may return to terminal A upon receiving a signal indicating an auditable event (as described in block 508) even while the actions described in blocks 520-524 are being executed (or are awaiting execution).



FIG. 6 is a flowchart that illustrates an exemplary embodiment of a method of managing collected audit trail data according to various aspects of the present disclosure. From a start block, the method 600 proceeds to block 602, where a data gathering engine 418 of an evidence management system 102 receives one or more auditable event entries generated by an auditable device such as an auditable camera 106, an auditable weapon 108, and/or the like. The auditable event entries may be received via the evidence collection dock 104 as described in blocks 526-530 of the method 500, or could be received via the coordinator computing device 107 as described in blocks 520-524 of the method 500.


At block 604, the data gathering engine 418 stores the auditable event entries in an audit trail data store 416 of the evidence management system 102. At block 606, the data gathering engine 418 receives one or more device data files generated by the auditable device, and stores them in one or more device data stores of the evidence management system 102. For example, video data may be stored in a video data store 412, and audio data may be stored in an audio data store 414. In some embodiments, the video data and audio data may be stored in the same data store. In some embodiments, data in the video data store 412 and in the audio data store 414 may be stored separately, but may include links or other associations to each other so that they may be treated as a single unit. The device data files may be received using actions such as those described in block 530 of the method 500.


At block 608, the data gathering engine 418 determines one or more auditable event entries from the audit trail data store 416 that should have matching device data files in the device data stores but do not. For example, an urgent event may have been received and stored in the audit trail data store 416 indicating a start recording event, but no video file associated with the event is found in the video data store 412. This determination can help find situations where vital evidence is missing but could still be uploaded. This determination can also provide proof that all existing videos associated with an auditable camera 106 are present within the evidence management system 102. At block 610, a user interface engine 420 of the evidence management system 102 generates alerts based on auditable event entries having missing matching device data files. These auditable event entries are those found at block 608. In some embodiments, the alerts may include email or text message upload reminders sent to the user 92 associated with the appropriate auditable device, or to the user's supervisor. In some embodiments, the alerts may include a report presented as a web page (or in another format) of all missing device data files for a group of devices or for a period of time.
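

For example, the missing-file determination at block 608 might reduce to a comparison like the following sketch, assuming each start recording entry carries an identifier of its associated video. The field names and store interface are illustrative assumptions, not part of the disclosure.

```python
# Assumed sketch of finding start-recording entries whose referenced video has no
# matching file in the evidence management system's video data store.
def find_missing_video_entries(audit_entries, stored_video_ids):
    """Return start-recording entries whose associated video has not been uploaded."""
    return [
        entry
        for entry in audit_entries
        if entry.get("event_type") == "START_RECORDING"
        and entry.get("video_id") not in stored_video_ids
    ]

# Example: one recording was uploaded, one was not, so one alert should be generated.
entries = [
    {"event_type": "START_RECORDING", "video_id": "vid-001"},
    {"event_type": "START_RECORDING", "video_id": "vid-002"},
]
missing = find_missing_video_entries(entries, stored_video_ids={"vid-001"})
assert [e["video_id"] for e in missing] == ["vid-002"]
```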


At block 612, the user interface engine 420 retrieves information from a records management system 404 describing an incident and generates a timeline presentation of auditable events associated with the incident. The information about the incident may include, but is not limited to: a start time and an end time of the incident; one or more geographic locations associated with the incident; one or more users 92 associated with the incident; and one or more auditable devices associated with said users 92. The information about the incident may be used to find relevant auditable event entries in the audit trail data store 416. The user interface engine 420 queries the audit trail data store 416 to find the relevant auditable event entries, and places indications of the auditable events on the timeline. In the timeline presentation, interacting with an auditable event indication or a portion of the timeline (such as clicking or tapping on the indication or a portion of the timeline) may cause data associated with the auditable event, such as event detail information, and/or associated data stored in the video data store 412 and/or the audio data store 414 to be presented.
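

A minimal sketch of the timeline assembly described above follows: select the entries that fall within the incident's time window (and, optionally, belong to its associated devices), then order them chronologically. Field names and the filtering criteria are assumptions; in the system described, the query would run against the audit trail data store 416.

```python
# Assumed sketch of building a chronological timeline of auditable events for an incident.
def build_timeline(audit_entries, incident_start, incident_end, device_ids=None):
    """Return the incident's auditable events in chronological order."""
    selected = [
        e for e in audit_entries
        if incident_start <= e["timestamp"] <= incident_end
        and (device_ids is None or e["device_id"] in device_ids)
    ]
    return sorted(selected, key=lambda e: e["timestamp"])
```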


At block 614, the user interface engine 420 generates one or more alerts based on auditable event entries from the audit trail data store 416 that indicate errors with an auditable device. In some embodiments, the alerts could be an email or SMS message transmitted to the user 92 associated with the auditable device. In some embodiments, the alerts could be displayed in a summary list of errors from all devices managed by a given technical support professional or a given agency. In some embodiments, the auditable device data store 410 may be queried to determine a location or contact information of the associated user 92 in order to deliver the alert.


At block 616, the user interface engine 420 generates a summary presentation of auditable events that indicate errors across multiple auditable devices of a matching type. For example, a summary presentation may present all battery faults for all auditable cameras, regardless of user. This summary presentation may be presented to information technology management at the agency operating the auditable devices, or may be aggregated for multiple agencies and viewed by the proprietor of the evidence management system 102 and/or the developer of the auditable devices to improve engineering processes or to update future versions of the auditable devices.


The method 600 then proceeds to an end block and terminates. One of ordinary skill in the art will recognize that the steps of method 600 are presented as ordered for sake of discussion only. In some embodiments, the steps may be performed in any order, repeatedly, in parallel, and/or the like. Though it is clear that the data gathering steps would happen at some point before the presentation steps happen for the first time, it is still possible that the presentation steps could happen multiple times for a single execution of the data gathering step, or that the data gathering steps could happen without the presentation steps occurring.



FIG. 7 is a block diagram that illustrates aspects of an exemplary computing device 700 appropriate for use as a computing device of the present disclosure. While multiple different types of computing devices were discussed above, the exemplary computing device 700 describes various elements that are common to many different types of computing devices. While FIG. 7 is described with reference to a computing device that is implemented as a device on a network, the description below is applicable to servers, personal computers, mobile phones, smart phones, tablet computers, embedded computing devices, and other devices that may be used to implement portions of embodiments of the present disclosure. Moreover, those of ordinary skill in the art and others will recognize that the computing device 700 may be any one of any number of currently available or yet to be developed devices.


In its most basic configuration, the computing device 700 includes at least one processor 702 and a system memory 704 connected by a communication bus 706. Depending on the exact configuration and type of device, the system memory 704 may be volatile or nonvolatile memory, such as read only memory (“ROM”), random access memory (“RAM”), EEPROM, flash memory, or similar memory technology. Those of ordinary skill in the art and others will recognize that system memory 704 typically stores data and/or program modules that are immediately accessible to and/or currently being operated on by the processor 702. In this regard, the processor 702 may serve as a computational center of the computing device 700 by supporting the execution of instructions.


As further illustrated in FIG. 7, the computing device 700 may include a network interface 710 comprising one or more components for communicating with other devices over a network. Embodiments of the present disclosure may access basic services that utilize the network interface 710 to perform communications using common network protocols. The network interface 710 may also include a wireless network interface configured to communicate via one or more wireless communication protocols, such as WiFi, 2G, 3G, LTE, WiMAX, Bluetooth, and/or the like. As will be appreciated by one of ordinary skill in the art, the network interface 710 illustrated in FIG. 7 may represent one or more wireless interfaces or physical communication interfaces described and illustrated above with respect to particular components of the system 100.


In the exemplary embodiment depicted in FIG. 7, the computing device 700 also includes a storage medium 708. However, services may be accessed using a computing device that does not include means for persisting data to a local storage medium. Therefore, the storage medium 708 depicted in FIG. 7 is represented with a dashed line to indicate that the storage medium 708 is optional. In any event, the storage medium 708 may be volatile or nonvolatile, removable or nonremovable, implemented using any technology capable of storing information such as, but not limited to, a hard drive, solid state drive, CD ROM, DVD, or other disk storage, magnetic cassettes, magnetic tape, magnetic disk storage, and/or the like.


As used herein, the term “computer readable medium” includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology capable of storing information, such as computer readable instructions, data structures, program modules, or other data. In this regard, the system memory 704 and storage medium 708 depicted in FIG. 7 are merely examples of computer readable media.


Suitable implementations of computing devices that include a processor 702, system memory 704, communication bus 706, storage medium 708, and network interface 710 are known and commercially available. For ease of illustration and because it is not important for an understanding of the claimed subject matter, FIG. 7 does not show some of the typical components of many computing devices. In this regard, the computing device 700 may include input devices, such as a keyboard, keypad, mouse, microphone, touch input device, touch screen, tablet, and/or the like. Such input devices may be coupled to the computing device 700 by wired or wireless connections, including RF, infrared, serial, parallel, Bluetooth, USB, or other suitable connection protocols. Similarly, the computing device 700 may also include output devices such as a display, speakers, printer, etc. Since these devices are well known in the art, they are not illustrated or described further herein.


While illustrative embodiments have been shown and described, it will be appreciated that various changes can be made therein without departing from the spirit and scope of the invention.


The foregoing description discusses preferred embodiments of the present invention, which may be changed or modified without departing from the scope of the present invention as defined in the claims. Examples listed in parentheses may be used in the alternative or in any practical combination. As used in the specification and claims, the words ‘comprising’, ‘comprises’, ‘including’, ‘includes’, ‘having’, and ‘has’ introduce an open-ended statement of component structures and/or functions. In the specification and claims, the words ‘a’ and ‘an’ are used as indefinite articles meaning ‘one or more’. When a descriptive phrase includes a series of nouns and/or adjectives, each successive word is intended to modify the entire combination of words preceding it. For example, a black dog house is intended to mean a house for a black dog. In the claims, the term “provided” is used to definitively identify an object that is not a claimed element of the invention but an object that performs the function of a workpiece that cooperates with the claimed invention. For example, in the claim “an apparatus for aiming a provided barrel, the apparatus comprising: a housing, the barrel positioned in the housing”, the barrel is not a claimed element of the apparatus, but an object that cooperates with the “housing” of the “apparatus” by being positioned in the “housing”. The invention includes any practical combination of the structures and methods disclosed. While several specific embodiments of the invention have been described for the sake of clarity of description, the scope of the invention is intended to be measured by the claims as set forth below.

Claims
  • 1. An auditable camera, comprising: a video sensor; at least one processor; a non-transitory computer-readable storage medium; a video data store stored on the non-transitory computer-readable storage medium, wherein the video data store is configured to store video data generated by the video sensor; an audit trail data store stored on the non-transitory computer-readable storage medium, wherein the audit trail data store is configured to store a set of auditable event entries; an audit trail gathering engine stored on the non-transitory computer-readable storage medium, wherein the audit trail gathering engine, when executed by the at least one processor, causes the at least one processor to: in response to receiving a notification of an event reported by a component of the auditable camera, record an auditable event entry representing the event in the set of auditable event entries in the audit trail data store; and an audit trail reporting engine stored on the non-transitory computer-readable storage medium, wherein the audit trail reporting engine, when executed by the at least one processor, causes the at least one processor to: transmit one or more auditable event entries of the set of auditable event entries from the audit trail data store to an evidence management system.
  • 2. The auditable camera of claim 1 further comprising at least one of a short-range wireless interface and a long-range wireless interface, wherein the audit trail reporting engine further causes the at least one processor to: receive a notification that an urgent auditable event entry has been recorded in the audit trail data store; establish a wireless data connection via the at least one of the short-range wireless interface and the long-range wireless interface after receiving the notification that the urgent auditable event entry has been recorded in the audit trail data store; and transmit the urgent auditable event entry via the established wireless data connection.
  • 3. The auditable camera of claim 2, wherein the camera is configured to transmit the video data from the video data store to an evidence management server via a physical dock interface after the urgent auditable event entry has been transmitted via the established wireless data connection, the video data including a video data file associated with the urgent auditable event entry.
  • 4. The auditable camera of claim 1, wherein the notification is a first notification of a first event reported by a first component of the auditable camera and the audit trail gathering engine, when executed by the at least one processor, further causes the at least one processor to: receive a second notification of a second event reported by a second component of the auditable camera; and record a second auditable event entry representing the second event in the audit trail data store, wherein the second component is different from the first component.
  • 5. The auditable camera of claim 4, wherein the first component is a battery sensor and the second component is a motion sensor.
  • 6. The auditable camera of claim 4, wherein the first component is a battery sensor and the second component is a physical dock interface.
  • 7. The auditable camera of claim 4, wherein the first event reported by the first component is a disconnection event, the first component is a physical dock interface, and the first notification includes a signal transmitted to the audit trail gathering engine by the physical dock interface.
  • 8. The auditable camera of claim 1 further comprising: a certificate data store stored on the non-transitory computer-readable storage medium, wherein the certificate data store is configured to store one or more digital signing certificates; and an audit trail signing engine stored on the non-transitory computer-readable storage medium, wherein the audit trail signing engine, when executed by the at least one processor, causes the at least one processor to apply digital signatures using the one or more digital signing certificates in the certificate data store to the set of auditable event entries in the audit trail data store.
  • 9. The auditable camera of claim 8, wherein the video data is associated with the auditable event entry and the audit trail signing engine, when executed by the at least one processor, further causes the at least one processor to apply a digital signature to the auditable event entry after a signature of the video data is stored in the auditable event entry.
  • 10. The auditable camera of claim 8, wherein a digital signature is applied to the auditable event entry in response to a request to transmit the auditable event entry to the evidence management system.
  • 11. The auditable camera of claim 1, wherein the notification includes a signal from the component and the auditable event entry includes information derived from the signal.
  • 12. The auditable camera of claim 1, wherein the notification includes a signal from the component and the auditable event entry includes information not derived solely from the signal.
  • 13. The auditable camera of claim 12, wherein the information not derived solely from the signal includes a timestamp from a clock in the auditable camera, the timestamp obtained separately from the notification of the event reported by the component.
  • 14. The auditable camera of claim 1, wherein the event includes at least one of a battery fault and a battery charge status threshold event.
  • 15. The auditable camera of claim 1, wherein the event includes at least one of a hardware fault and a debug event.
  • 16. The auditable camera of claim 1, wherein the event includes at least one of a storage error and a free storage threshold event.
  • 17. The auditable camera of claim 1, further comprising a camera control engine stored on the non-transitory computer-readable storage medium, wherein the camera control engine, when executed by the at least one processor, causes the at least one processor to both cause the video sensor to begin obtaining the video data and transmit a signal to the audit trail gathering engine to report starting video recording.
  • 18. A method performed by an auditable device for generating an audit trail in the auditable device, the auditable device including a video sensor, a video data store, an audit trail data store, and at least one processor, the method comprising: generating video data with the video sensor; storing the video data in the video data store; receiving a signal associated with a first event from a first component of the auditable device; recording a first auditable event entry representing the first event in the audit trail data store; receiving a signal associated with a second event from a second component of the auditable device; recording a second auditable event entry representing the second event in the audit trail data store; and transmitting the first auditable event entry and the second auditable event entry from the auditable device, wherein the first component of the auditable device is different from the second component of the auditable device.
  • 19. The method of claim 18, further comprising transmitting the video data from the auditable device via a physical dock interface of the auditable device, and wherein transmitting the second auditable event entry from the auditable device includes transmitting the second auditable event entry from the auditable device via a wireless interface of the auditable device.
  • 20. The method of claim 19, wherein the first component is a battery sensor and the first event is associated with a battery overheating state detected by the battery sensor, and the second event is related to the video data stored in the video data store.
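The following sketch is provided for illustration only and forms no part of the claims. Under simplifying assumptions, it shows how an audit trail gathering engine might build an auditable event entry that combines information derived from a component's signal with a timestamp obtained from the device clock, and how the entry might then be protected before transmission. An HMAC keyed with a device secret is used purely as a stand-in for the certificate-based digital signing recited above; the component names, event names, and key are hypothetical.

    # Illustrative sketch only; not part of the claims. One way an audit trail
    # gathering engine could record an event entry with a device-clock timestamp
    # and protect it with a signature. HMAC-SHA256 with a device key stands in
    # for certificate-based digital signing; a real implementation would use the
    # device's signing certificate from the certificate data store.
    import hmac
    import json
    import hashlib
    from datetime import datetime, timezone

    DEVICE_SIGNING_KEY = b"hypothetical-device-key"  # placeholder for a real credential

    def record_auditable_event(component: str, event: str, detail: dict) -> dict:
        """Build a signed auditable event entry from a component notification."""
        entry = {
            "component": component,  # e.g., "battery_sensor", "physical_dock"
            "event": event,          # e.g., "battery_fault", "recording_started"
            "detail": detail,        # information derived from the component's signal
            "timestamp": datetime.now(timezone.utc).isoformat(),  # device clock
        }
        payload = json.dumps(entry, sort_keys=True).encode("utf-8")
        entry["signature"] = hmac.new(DEVICE_SIGNING_KEY, payload, hashlib.sha256).hexdigest()
        return entry

    # Example: a low-battery threshold event reported by the battery sensor.
    # record_auditable_event("battery_sensor", "battery_charge_threshold", {"percent": 10})

Signing each entry before it leaves the device supports the chain-of-custody goal discussed above, since later alteration of a stored or transmitted entry would no longer match its signature.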
Related Publications (1)
Number Date Country
20190244315 A1 Aug 2019 US
Provisional Applications (1)
Number Date Country
62192466 Jul 2015 US
Divisions (1)
Number Date Country
Parent 15048218 Feb 2016 US
Child 16216781 US