Embodiments of the invention relate to video recording. More specifically, embodiments of the invention relate to redundant storage for mobile video recording.
Traditional video recordings lack the reliability to be properly authenticated and used as evidence in legal proceedings. Further, existing video recording systems fail to provide redundant storage of said video recordings, which makes the recordings vulnerable to tampering and loss. Further still, traditional video recordings fail to incorporate additional information associated with a triggering event, which would otherwise provide additional authentication and insight into the event.
Embodiments of the invention solve the above-mentioned problems by providing systems, methods, and computer-readable media for storing and authenticating event related video data. Redundant storage of additional copies of event related video data provides further reliability and security, which, in some cases, makes the video data more suitable as evidence. Further, bi-directional communication between a video recording manager device and one or more cameras provides additional opportunities in terms of allocating recording and storage resources, as well as providing a control hierarchy for optimizing the security and reliability of recording operations.
In some aspects, the techniques described herein relate to a video recording authentication system for redundantly storing event related video data, the video recording authentication system including: one or more cameras configured to continuously record video data, wherein each of the one or more cameras includes an internal storage medium using a circular storage buffer for storing the continuously recorded video data; at least one sensor; and a video recording manager device communicatively coupled to the one or more cameras and the at least one sensor, the video recording manager device configured to transmit a triggering event signal to the one or more cameras based on a triggering event indication received from the at least one sensor, the video recording manager device including: a first storage medium storing triggering event related video data received from the one or more cameras, the triggering event related video data including at least a portion of the video data stored in the circular storage buffer recorded prior to receiving the triggering event indication and authentication metadata associated with the triggering event indication for authenticating the triggering event related video data; a second storage medium storing an additional copy of the triggering event related video data received from the one or more cameras, wherein the second storage medium is removable from the video recording manager device; and a wireless transceiver configured to transmit the triggering event related video data including the authentication metadata over a wireless network to a cloud-based storage system.
In some aspects, the techniques described herein relate to a video recording authentication system, wherein each of the one or more cameras, the at least one sensor, and the video recording manager device are mounted within a law enforcement vehicle.
In some aspects, the techniques described herein relate to a video recording authentication system, wherein the at least one sensor includes a proximity tag reader.
In some aspects, the techniques described herein relate to a video recording authentication system, wherein the authentication metadata is augmented to include proximity tag data indicative of an officer identifier associated with a law enforcement officer in proximity to the law enforcement vehicle.
In some aspects, the techniques described herein relate to a video recording authentication system, wherein the at least one sensor includes a GPS receiver and the authentication metadata further includes location information.
In some aspects, the techniques described herein relate to a video recording authentication system, wherein the one or more cameras include a body-mounted camera mounted on a law enforcement officer, the body-mounted camera including a wireless transceiver for communicating with the video recording manager device.
In some aspects, the techniques described herein relate to a video recording authentication system, wherein the internal storage medium of the one or more cameras includes a partitioned storage including a first portion associated with the circular storage buffer and a second portion for storing the triggering event related video data.
In some aspects, the techniques described herein relate to a method for redundantly storing event related video data, the method including: continuously recording video data using one or more cameras; storing the continuously recorded video data from the one or more cameras within an internal storage medium of each respective camera; receiving, via a video recording manager device, a triggering event indication from at least one sensor; responsive to receiving the triggering event indication, transmitting, via the video recording manager device, a triggering event signal to the one or more cameras, the triggering event signal initiating a triggering event recording procedure within each of the one or more cameras; receiving triggering event related video data from the one or more cameras into the video recording manager device, the triggering event related video data including at least a portion of the video data stored in the internal storage medium recorded prior to receiving the triggering event indication; storing the triggering event related video data within a first storage medium of the video recording manager device along with authentication metadata associated with the triggering event indication for authenticating the triggering event related video data; storing an additional copy of the triggering event related video data within a second storage medium of the video recording manager device along with the authentication metadata; and transmitting, from the video recording manager device, the triggering event related video data with the authentication metadata to a cloud-based storage system.
In some aspects, the techniques described herein relate to a method, further including: transmitting, from the video recording manager device, the authentication metadata to the one or more cameras to authenticate the video data stored in the internal storage medium of each respective camera.
In some aspects, the techniques described herein relate to a method, further including: preventing overwriting of the triggering event related video data on the internal storage medium of each of the one or more cameras before the triggering event related video data is stored by the cloud-based storage system.
In some aspects, the techniques described herein relate to a method, further including: receiving, at the video recording manager device, a confirmation message confirming storage of the triggering event related video data by the cloud-based storage system; and responsive to receiving the confirmation message, allowing overwriting of the triggering event related video data from the internal storage of each of the one or more cameras.
In some aspects, the techniques described herein relate to a method, further including: comparing the triggering event related video data to the additional copy of the triggering event related video data to authenticate the triggering event related video data as evidence.
In some aspects, the techniques described herein relate to a method, wherein the triggering event signal transmitted by the video recording manager device includes an instruction to adjust a set of recording parameters of the one or more cameras based at least in part on a type of the triggering event indication.
In some aspects, the techniques described herein relate to a method, wherein the set of recording parameters includes a video resolution, a frame rate, and a shutter speed.
In some aspects, the techniques described herein relate to one or more non-transitory computer-readable media storing computer-executable instructions that, when executed by at least one processor, perform a method for redundantly storing event related video data, the method including: continuously recording video data using one or more cameras; storing the continuously recorded video data from the one or more cameras within an internal storage medium of each respective camera using a circular storage buffer; receiving, via a video recording manager device, a triggering event indication from at least one sensor; responsive to receiving the triggering event indication, transmitting, via the video recording manager device, a triggering event signal to the one or more cameras, the triggering event signal initiating a triggering event recording procedure within each of the one or more cameras; receiving triggering event related video data from the one or more cameras into the video recording manager device, the triggering event related video data including at least a portion of the video data stored in the circular storage buffer recorded prior to receiving the triggering event indication; storing the triggering event related video data within a first storage medium of the video recording manager device along with authentication metadata associated with the triggering event indication for authenticating the triggering event related video data; and storing an additional copy of the triggering event related video data within a second storage medium of the video recording manager device along with the authentication metadata.
In some aspects, the techniques described herein relate to computer-readable media, further including: transmitting, from the video recording manager device, the triggering event related video data with the authentication metadata to a cloud-based storage system.
In some aspects, the techniques described herein relate to computer-readable media, further including: responsive to the triggering event signal, transferring the one or more cameras from a standard continuous recording mode into a triggering event recording mode.
In some aspects, the techniques described herein relate to computer-readable media, wherein the triggering event recording mode is associated with an updated set of video recording parameters for increasing a video quality of the triggering event related video data compared to the standard continuous recording mode.
In some aspects, the techniques described herein relate to computer-readable media, wherein the standard continuous recording mode is associated with a pixel resolution of 720p and the triggering event related video data is associated with a pixel resolution selected from the set consisting of 1080p and 4K.
In some aspects, the techniques described herein relate to computer-readable media, wherein the circular storage buffer includes a temporary first-in-first-out storage procedure operable to store up to 168 hours of video data.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Other aspects and advantages of the invention will be apparent from the following detailed description of the embodiments and the accompanying drawing figures.
Embodiments of the invention are described in detail below with reference to the attached drawing figures, wherein:
The drawing figures do not limit the invention to the specific embodiments disclosed and described herein. The drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the invention.
The following detailed description references the accompanying drawings that illustrate specific embodiments in which the invention can be practiced. The embodiments are intended to describe aspects of the invention in sufficient detail to enable those skilled in the art to practice the invention. Other embodiments can be utilized and changes can be made without departing from the scope of the invention. The following detailed description is, therefore, not to be taken in a limiting sense. The scope of the invention is defined only by the appended claims, along with the full scope of equivalents to which such claims are entitled.
In this description, references to “one embodiment,” “an embodiment,” or “embodiments” mean that the feature or features being referred to are included in at least one embodiment of the technology. Separate references to “one embodiment,” “an embodiment,” or “embodiments” in this description do not necessarily refer to the same embodiment and are also not mutually exclusive unless so stated and/or except as will be readily apparent to those skilled in the art from the description. For example, a feature, structure, act, etc. described in one embodiment may also be included in other embodiments, but is not necessarily included. Thus, the technology can include a variety of combinations and/or integrations of the embodiments described herein.
Turning first to
Computer-readable media include both volatile and nonvolatile media, removable and nonremovable media, and contemplate media readable by a database, as well as transitory and non-transitory forms of media. For example, computer-readable media include (but are not limited to) RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile discs (DVD), holographic media or other optical disc storage, magnetic cassettes, magnetic tape, magnetic disk storage, and other magnetic storage devices. These technologies can store data temporarily or permanently. However, unless explicitly specified otherwise, the term “computer-readable media” should not be construed to include physical, but transitory, forms of signal transmission such as radio broadcasts, electrical signals through a wire, or light pulses through a fiber-optic cable. Examples of stored information include computer-useable instructions, data structures, program modules, and other data representations.
Finally, network interface card (NIC) 124 is also attached to system bus 104 and allows computer 102 to communicate over a network such as local network 126. NIC 124 can be any form of network interface known in the art, such as Ethernet, ATM, fiber, Bluetooth®, or Wi-Fi (i.e., the IEEE 802.11 family of standards). NIC 124 connects computer 102 to local network 126, which may also include one or more other computers, such as computer 128, and network storage, such as data store 130. Generally, a data store such as data store 130 may be any repository from which information can be stored and retrieved as needed. Examples of data stores include relational or object oriented databases, spreadsheets, file systems, flat files, directory services such as LDAP and Active Directory, or email storage systems. A data store may be accessible via a complex API (such as, for example, Structured Query Language), a simple API providing only read, write and seek operations, or any level of complexity in between. Some data stores may additionally provide management functions for data sets stored therein such as backup or versioning. Data stores can be local to a single computer such as computer 128, accessible on a local network such as local network 126, or remotely accessible over Internet 132. Local network 126 is in turn connected to Internet 132, which connects many networks such as local network 126, remote network 134 or directly attached computers such as computer 136. In some embodiments, computer 102 can itself be directly connected to Internet 132.
Turning now to
In some embodiments, the one or more cameras 202 are adapted to various different recording environments. For example, the one or more cameras 202 may be adapted to operate at a wide temperature range such that video recording quality is not affected by extreme temperatures. Additionally, the one or more cameras 202 may be suitable for recording under extreme vibration and other physically challenging recording circumstances. In some embodiments, at least one of the one or more cameras 202 may comprise a vibration dampening mounting structure which reduces vibration to increase recording quality. Further, in some embodiments, various software techniques may be used on the backend to remove vibration effects from video data. For example, an optical tracking algorithm may be applied to remove vibration effects after the video data has been recorded.
In some embodiments, the video recording authentication system 200 further comprises a sensor 206 or a plurality of such sensors. In some embodiments, the sensor 206 comprises any of a radio frequency identifier (RFID) tag reader, an accelerometer, a global positioning system (GPS) receiver, a motion sensor, an acoustic sensor, a pressure sensor, or other suitable types of sensors. Embodiments are contemplated in which the sensor 206 may be incorporated into one of the one or more cameras 202. For example, in some embodiments, the sensor 206 comprises an RFID tag reader disposed on or within at least one of the one or more cameras 202.
In some embodiments, the video recording authentication system 200 comprises a video recording manager device 210, as shown, for managing one or more sets of video data from the one or more cameras 202. Accordingly, in some embodiments, the video recording manager device 210 may be communicatively coupled to the one or more cameras 202. In some such embodiments, any combination of wired and wireless communication connections are contemplated. For example, in some embodiments, a BLUETOOTH wireless connection may be established between at least one of the one or more cameras 202 and the video recording manager device 210. Additionally, in some embodiments, a wired connection may be established, for example, using a USB or Ethernet connection, to transmit signals between the video recording manager device 210 and the one or more cameras 202.
In some embodiments, the communication connection between the video recording manager device 210 and the one or more cameras 202, whether wired or wireless, allows video data captured by the one or more cameras 202 to be transmitted to the video recording manager device 210 and control signals to be communicated from the video recording manager device 210 to the one or more cameras 202. Further, embodiments are contemplated in which control signals may additionally be transmitted from the one or more cameras 202 to the video recording manager device 210. Accordingly, the communication connection may be established such that bidirectional communication is supported between the video recording manager device 210 and the one or more cameras 202. This bidirectional connection allows both control and data signals to be transmitted in either direction between the video recording manager device 210 and the one or more cameras 202. Accordingly, embodiments are contemplated in which the video recording manager device 210 provides control signals for controlling recording operations of the one or more cameras 202 based at least in part on data signals received from the one or more cameras 202. Further still, the video recording manager device 210 may monitor parameters of the one or more cameras 202 such as, for example, battery life, remaining storage capacity, recording quality, as well as other camera-specific parameters.
In some embodiments, the video recording manager device 210 comprises a first storage medium 212 and a second storage medium 214 disposed within the video recording manager device 210. In some such embodiments, the first storage medium 212 and the second storage medium 214 are configured to store video data captured by the one or more cameras 202 in addition to related metadata, as will be described in further detail below. In some embodiments, at least one of the storage mediums may be removable from the video recording manager device 210. For example, in some embodiments, the second storage medium 214 is removably mounted within or onto the video recording manager device 210. In one example, the second storage medium 214 comprises a removable USB flash memory device, an SD card, or the like, such that the second storage medium 214 may be added to and removed from the video recording manager device 210 to physically transfer the contents of the second storage medium 214, which, as described above, may include video data captured by the one or more cameras 202.
In some embodiments, the video recording manager device 210 further comprises a controller 216 disposed within the video recording manager device 210. In some embodiments, the controller 216 is configured to control the operation of the video recording manager device 210. For example, the controller 216 may monitor data received from the one or more cameras 202 and the sensor 206 to determine instructions to be sent to the one or more cameras 202 and to further instruct storage of data on the first storage medium 212 and the second storage medium 214. In some embodiments, the controller 216 further instructs storage on the internal storage medium 204 of each respective camera of the one or more cameras 202. In some embodiments, the controller 216 monitors a signal received from the sensor 206 to determine a triggering event. For example, in some embodiments, the sensor 206 may provide a signal including a triggering event indication which initiates a triggering event procedure of the video recording manager device 210.
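For illustration only, the following Python sketch shows one way a controller such as the controller 216 could watch raw sensor data and emit a triggering event signal; the class names, the accelerometer threshold, and the callback interface are hypothetical rather than required by any embodiment.

```python
# Minimal sketch (assumed names): a controller loop that watches raw sensor
# readings and emits a triggering event signal when a threshold is crossed.
from dataclasses import dataclass
import time


@dataclass
class SensorReading:
    kind: str          # e.g. "accelerometer", "rfid", "manual_button"
    value: float
    timestamp: float


class TriggerController:
    """Hypothetical stand-in for the controller 216."""

    ACCEL_THRESHOLD_G = 2.5  # assumed value for a hard-braking/collision trigger

    def __init__(self, cameras, on_trigger):
        self.cameras = cameras
        self.on_trigger = on_trigger  # callback that broadcasts the trigger signal

    def is_triggering_event(self, reading: SensorReading) -> bool:
        # The manager device, not the sensor, classifies raw data as a trigger.
        if reading.kind == "accelerometer":
            return abs(reading.value) >= self.ACCEL_THRESHOLD_G
        if reading.kind == "manual_button":
            return reading.value > 0
        return False

    def poll(self, read_sensor):
        while True:
            reading = read_sensor()
            if reading and self.is_triggering_event(reading):
                self.on_trigger(reading)  # send triggering event signal to cameras
            time.sleep(0.05)
```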
In some embodiments, a number of different types of triggering events are contemplated, including a variety of manually and automatically initiated triggers. For example, in some embodiments, a triggering event may be initiated by a law enforcement officer or other user performing any of a number of actions such as flashing lights, braking, or activating sirens of a law enforcement vehicle, as well as by driver monitoring parameters such as using a cell phone, not wearing a seatbelt, or falling asleep. Further, in some embodiments, triggering events may be initiated based on vehicle-specific parameters such as lane departure and following too closely behind another vehicle, as well as other vehicle-specific parameters, such that instances of traffic accidents may be recorded as event-related video data.
In some embodiments, the video recording manager device 210 further comprises a wireless transceiver 218 that may be internal or external to the video recording manager device 210. In some such embodiments, the wireless transceiver 218 may comprise a radio transceiver for receiving and transmitting radio waves. Additionally, in some embodiments, the video recording manager device 210 comprises a Wi-Fi transceiver 220 that may be internal or external to the video recording manager device 210. Here, the Wi-Fi transceiver 220 may be configured to wirelessly transmit and receive Wi-Fi signals over a network. Embodiments are contemplated in which either of the wireless transceiver 218 or the Wi-Fi transceiver 220 may be used to establish communication with the one or more cameras 202 and/or the sensor 206. Alternatively, or additionally, in some embodiments, as described above, the one or more cameras 202 and the sensor 206 may be communicatively coupled via a wired connection. Further still, embodiments are contemplated in which a dedicated wireless connection may be established with each of the one or more cameras 202. For example, one or more additional wireless transceivers may be included to communicate with the one or more cameras 202.
In some embodiments, the wireless transceiver 218 and/or the Wi-Fi transceiver 220 are operable to communicate with a cloud-based storage system 230. In some embodiments, the cloud-based storage system 230 comprises a cloud data store 232, as shown, for remotely storing data. As such, embodiments are contemplated in which video data captured by the one or more cameras 202 is transmitted to the cloud-based storage system 230 from the video recording manager device 210 and stored in the cloud data store 232. For example, in some embodiments, it may be desirable to store the video data within the cloud data store 232 to provide a redundant copy of the video data which is insured against physical destruction of the video recording manager device 210 and the one or more cameras 202. In some such embodiments, the video data may be transmitted to the cloud-based storage system 230, for example, by using the Wi-Fi transceiver 220 to transmit the video data over a wireless network.
In some embodiments, the recording and storage parameters of the one or more cameras 202 may be controlled based on various signals monitored by the video recording manager device 210. For example, in some embodiments, a sensor 206 (or a plurality of such sensors) may be used to monitor ambient conditions such that the video recording manager device 210 can provide control signals for optimizing recording based on said ambient conditions. In one example, a humidity sensor such as a hygrometer may be used to measure the amount of water vapor in the ambient air such that the recording parameters may be adjusted for recording in a foggy environment. Further still, embodiments are contemplated in which the one or more cameras 202 may include internal fans for reducing or removing condensation from the camera lens. In another example, a sensor may be included for monitoring the ambient lighting conditions. Accordingly, the video recording manager device 210 may adjust the recording parameters such as by instructing the one or more cameras 202 to switch into a night-vision recording mode. Embodiments are also contemplated in which the one or more cameras 202 may be controlled individually. For example, in some embodiments, the one or more cameras 202 include an internal controller interfacing with one or more sensors to perform any of the operations described herein with respect to the controller 216 and the sensors 206.
Additionally, embodiments are contemplated in which computer-vision techniques may be applied for monitoring video data from the one or more cameras 202. For example, in some embodiments, the video recording manager device 210 may utilize computer vision to identify objects within the video data and control recording parameters accordingly. In one example, computer vision may determine that one of the cameras is covered by an object or obstacle such that an event scene is not visible. Based on this determination, the video recording manager device 210 may transmit a signal to the camera to adjust the recording angle to move around the obstacle or, alternatively, to turn the camera off to save storage space and battery life. Further still, computer-vision techniques may be utilized to identify persons or objects of interest and to focus or adjust recording parameters to ensure the persons or objects of interest are clearly visible within the video data. In some embodiments, the computer-vision techniques may be applied using any combination of additional hardware and software. For example, a computer-vision algorithm may be applied using the controller 216 to analyze the received video data. Alternatively, in some embodiments, said computer-vision techniques may be applied independently by the one or more cameras 202, for example, using a respective internal controller of each of the one or more cameras 202.
Turning now to
In some embodiments, any number of sensors may be included within the vehicle 302. In some embodiments, a GPS receiver 308 may be included for collecting location data associated with the vehicle 302. In some embodiments, one or more additional sensors 310 may be included for collecting data associated with the vehicle 302 or additional data related to a triggering event. In some embodiments, the GPS receiver 308 and the one or more additional sensors 310 may be communicatively coupled to the video recording manager device 210. For example, in some such embodiments, the GPS receiver 308 and the one or more additional sensors 310 may perform similar operations as described herein with respect to the sensor 206. Embodiments are contemplated in which existing sensors of the vehicle 302 may be incorporated into the video recording system by establishing communication with the video recording manager device 210. Said communication may include any suitable form of wireless or wired communication connection, such as BLUETOOTH, Wi-Fi, Ethernet, USB, or another suitable communication connection.
In some embodiments, an operator 320, who may be a law enforcement officer, wears a body-mounted camera 322. For example, the body-mounted camera 322 may comprise a video camera including a clip for securing the camera to a shirt or other garment of the operator 320. In some embodiments, the operator 320 may also carry a proximity tag 324. For example, the proximity tag 324 may be included within a badge of the law enforcement officer and may be associated with a unique officer identifier for identifying the law enforcement officer. Accordingly, embodiments are contemplated in which the one or more additional sensors 310 include a proximity tag reader operable to read the proximity tag 324 for determining the presence of the operator 320. For example, in some embodiments, the proximity tag reader captures proximity tag data including an officer identifier which may be included within the authenticated video data. In some embodiments, the proximity tag data may be timestamped or even included within individual frames of the video data, such that the video data indicates when the operator 320 is present even if the operator 320 is off screen.
In some embodiments, the video recording manager device 210 is further interfaced with the electronics of the vehicle 302 such that various aspects of the vehicle 302 may be monitored in order to detect a triggering event. For example, in some embodiments, the video recording manager device 210 may receive a signal from the electronics of the vehicle 302 indicative of an airbag status, such that a triggering event can be associated with deployment of the airbags. It should be understood that various other vehicle-specific parameters may be monitored and potentially associated with triggering events.
It should be understood that, though embodiments of the invention are described above as relating to law enforcement, additional embodiments are contemplated that relate to other operational environments. For example, in some embodiments, the video recording/authentication systems and methods described herein may be employed for general security recording and surveillance applications. As such, said systems may include any number of cameras, which may be vehicle-mounted, body-mounted, or mounted onto stationary structures such as buildings. Further, embodiments may be employed for monitoring delivery operations and other services.
Turning now to
In some embodiments, the combiner 406 also receives one or more sets of sensor data 408 from one or more respective sensors. The combiner 406 is configured to combine the one or more sets of individual video data 402 and the one or more sets of sensor data 408. In some embodiments, the combiner 406 combines multiple streams of video data and sensor data in real time, such that live streams are combined as they are recorded by the one or more cameras 202. Alternatively, in some embodiments, the video data may be combined after completion of the triggering event. For example, the video data may be combined directly after completion of the triggering event responsive to a completion indication, or at a later time such as an hour, a day, a week, or another suitable period of time after the triggering event. Further, in some embodiments, the video data may be combined at any time in response to a manual combination request submitted by a user.
In some embodiments, the combiner 406 outputs a set of composite authenticated video data 410, as shown. Here, the composite authenticated video data 410 comprises authentication metadata 412 and the one or more sets of individual video data 402 received from the one or more cameras 202. In some embodiments, the authentication metadata includes any of the sensor data 408, a triggering event indication, a time stamp, a user identifier, one or more digital signatures, as well as other data suitable to authenticate the video data. In some embodiments, a digital signature may be added to each set of individual video data 402 (for example, within the individual video metadata 404 corresponding to the individual video data 402) to identify the camera which recorded the individual video data 402. Further, in some embodiments, a digital signature associated with the video recording manager device 210 may be included within the authentication metadata 412 to identify the video recording manager device 210. In some such embodiments, the digital signature may include a unique identifier corresponding to the video recording manager device 210. In some embodiments, additional digital signatures may be included for identifying the sensor 206.
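As a non-limiting illustration, the sketch below shows one way a combiner such as the combiner 406 could assemble composite authenticated video data by hashing and signing each individual video stream and then signing the combined package; the HMAC-based signing scheme and the field names are assumptions made for this example only.

```python
# Illustrative sketch: combine per-camera video data and sensor data into a
# composite record with authentication metadata (assumed structure).
import hashlib, hmac, json, time


def sign(key: bytes, payload: bytes) -> str:
    # A simple HMAC-SHA256 stands in for any digital-signature scheme.
    return hmac.new(key, payload, hashlib.sha256).hexdigest()


def combine(individual_videos: dict, sensor_data: list, trigger_indication: dict,
            manager_key: bytes, camera_keys: dict) -> dict:
    signed_videos = []
    for camera_id, video_bytes in individual_videos.items():
        signed_videos.append({
            "camera_id": camera_id,
            "video_sha256": hashlib.sha256(video_bytes).hexdigest(),
            "camera_signature": sign(camera_keys[camera_id], video_bytes),
        })
    authentication_metadata = {
        "trigger": trigger_indication,
        "timestamp": time.time(),
        "sensor_data": sensor_data,
        "videos": signed_videos,
    }
    # A device-level signature over the combined package identifies the manager device.
    body = json.dumps(authentication_metadata, sort_keys=True).encode()
    authentication_metadata["manager_signature"] = sign(manager_key, body)
    return authentication_metadata
```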
Turning now to
It should be understood that the amount of video data that can be stored depends further on the specific recording parameters of the video data. For example, a set of video data with a resolution of 2160 pixels (2160p or 4K) or 1080 pixels (1080p) may require significantly more storage than video data with a lower resolution. Thus, a similar storage device may be operable to store about 168 hours of standard quality video data at a resolution of 720 pixels (720p) or about 70 hours of higher quality video data at a resolution of 1080p. Further still, embodiments are contemplated in which a combination of video data at multiple resolutions may be stored.
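The relationship between storage capacity, bitrate, and recording time can be illustrated with a short back-of-the-envelope calculation; the 512 GB capacity and the bitrates below are assumptions chosen only to roughly reproduce the 168-hour and 70-hour figures above.

```python
# Back-of-the-envelope sketch: hours of video a given storage capacity can hold.
# The capacity and bitrates are assumptions; real values depend on codec,
# frame rate, and scene content.
def storage_hours(capacity_gb: float, bitrate_mbps: float) -> float:
    capacity_bits = capacity_gb * 8e9
    return capacity_bits / (bitrate_mbps * 1e6) / 3600


if __name__ == "__main__":
    CAPACITY_GB = 512                                 # hypothetical internal storage size
    print(round(storage_hours(CAPACITY_GB, 6.77)))    # roughly 168 hours at ~6.8 Mbps (720p-class)
    print(round(storage_hours(CAPACITY_GB, 16.3)))    # roughly 70 hours at ~16.3 Mbps (1080p-class)
```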
In some embodiments, the internal storage medium 204 further comprises a second portion 510. Here, the second portion 510 may be configured to store triggering event related video data 512 along with authentication metadata 514. In some embodiments, the triggering event related video data 512 may be received from the video recording manager device 210. For example, in some embodiments, the composite authenticated video data 410 may be communicated back to the one or more cameras 202 and stored within the internal storage medium 204 of each camera. Alternatively, in some embodiments, the triggering event related video data 512 may comprise video data transferred from the circular storage buffer 504 based on a received triggering event signal. For example, in some embodiments, the internal storage medium 204 may be configured to automatically transfer a portion of the video data in the circular storage buffer 504 to the second portion 510 responsive to receiving a triggering event signal. Accordingly, the video data leading up to a triggering event indication may be captured and preserved within the triggering event related video data 512. In some embodiments, the triggering event related video data 512 comprises video data captured prior to receiving the triggering event indication, as well as video data recorded for a predetermined period of time after receiving the triggering event indication. In some embodiments, the predetermined period of time may be adjusted based on a type of triggering event.
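For illustration, a minimal sketch of this partitioned arrangement is shown below, modeling the internal storage medium 204 as a circular buffer of one-second segments plus a protected second portion that receives pre-event footage when a triggering event signal arrives; the segment granularity and data structures are assumptions.

```python
# Sketch (assumed granularity): a circular buffer of one-second video segments
# plus a protected partition for triggering event related video data.
from collections import deque


class PartitionedCameraStorage:
    def __init__(self, buffer_seconds: int):
        # First portion: circular (FIFO) buffer; oldest segments are overwritten.
        self.circular_buffer = deque(maxlen=buffer_seconds)
        # Second portion: event clips that must not be overwritten automatically.
        self.event_partition = []

    def record_segment(self, segment_bytes: bytes):
        self.circular_buffer.append(segment_bytes)  # evicts the oldest segment when full

    def on_trigger(self, pre_event_seconds: int, authentication_metadata: dict):
        # Copy the most recent pre-event footage out of the circular buffer so it
        # survives subsequent overwrites, and attach the authentication metadata.
        pre_event = list(self.circular_buffer)[-pre_event_seconds:]
        self.event_partition.append({
            "metadata": authentication_metadata,
            "segments": pre_event,
        })
        return pre_event
```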
Additionally, embodiments are contemplated in which alternative storage procedures may be included. For example, in some embodiments, video data may be continuously recorded at a high video quality such as a resolution of 1080p or 4K. Here, the high-quality video data may be stored within the circular buffer for a predetermined period of time. If a triggering event occurs, at least a portion of the high-quality video data may be transmitted to the video recording manager device 210. Alternatively, if no triggering event occurs over the predetermined period of time, the high-quality video data may be deleted or converted into a lower quality to increase storage availability. For example, the video data may initially be stored at a resolution of 1080p or 4K and, after an hour, be converted to a resolution of 720p or lower such that the video data occupies less storage space in the internal storage medium 204 of the one or more cameras 202. By initially storing video data at a high quality, the video data may be retroactively captured as event data leading up to a triggering event while maintaining high video quality. Further, since the high-quality storage may only be temporary, additional benefits in terms of storage capacity are achieved. Further still, embodiments are contemplated in which the predetermined time period for temporarily storing high-quality video data may be determined based at least in part on the remaining storage capacity or a signal received from the video recording manager device 210, such that, if storage is available, high-quality video data may be captured and maintained. Accordingly, all of the available storage space may be utilized based on availability such that the storage is optimized for various recording environments.
In some embodiments, the one or more cameras may be configured to continuously record video data at a lower video quality with a lower resolution and lower bit rate to reduce the data resources used. Here, a higher video quality with a higher resolution and higher bit rate may be used while recording event-related video data to increase the video quality of the event-related video data. In some embodiments, various other resolutions and bit rates may be used. For example, in some embodiments, a 4K resolution may be used to record at a higher video quality, while a pixel resolution of 720p or 1080p may be used for a lower video quality. Alternatively, in some embodiments, a single video quality may be used for both continuous recording and event-related recording.
Turning now to
At step 602, video data is recorded at the one or more cameras 202. In some embodiments, the video data is continuously recorded at each camera. Further, in some embodiments, the video data further comprises audio data recorded by one or more microphones associated with each respective camera. In some embodiments, continuously recording video data allows the preservation of video footage leading up to a triggering event.
At step 604, the recorded video data is stored on each of the one or more cameras 202. In some embodiments, the video data may be stored within the internal storage medium 204 of each camera. Further, in some embodiments, the circular storage buffer 504 may be used to continuously store the video data recorded by each respective camera.
At step 606, a triggering event indication is received from at least one sensor, such as sensor 206. In some embodiments, the triggering event indication may comprise sensor data received from the sensor 206, which is monitored by the video recording manager device 210. Accordingly, embodiments are contemplated in which the sensor 206 does not identify or classify the sensor data as including a triggering event indication. Instead, in such embodiments, the sensor 206 provides raw sensor data to the video recording manager device 210 and the video recording manager device 210 determines whether the raw sensor data comprises a triggering event indication. Alternatively, or additionally, in some embodiments, the sensor 206 may be operable to identify triggering event indications. Further still, embodiments are contemplated in which the triggering event indication may be received manually by an operator. For example, the sensor 206 may comprise a button or other interface element operable to receive an input which provides a triggering event indication. For example, an operator may manually opt to initiate a triggering event by providing such an input, which may be received in proximity to the video recording manager device 210 or may be received remotely. For example, embodiments are contemplated which allow a remote operator to transmit a triggering event indication input over a network to the video recording manager device 210.
At step 608, a triggering event signal is transmitted from the video recording manager device 210 to the one or more cameras 202. In some embodiments, the triggering event signal is transmitted in response to receiving the triggering event indication from the at least one sensor. For example, when the video recording manager device 210 receives sensor data which is indicative of a triggering event, the video recording manager device 210 transmits the triggering event signal such that a triggering event procedure is initiated within the system. In some embodiments, the triggering event signal changes the operation and/or recording parameters of the one or more cameras 202. In some embodiments, the triggering event signal requests video data from the one or more cameras 202.
In some embodiments, the triggering event signal is operable to control video recording parameters of the one or more cameras 202. For example, in some embodiments, the triggering event signal instructs the one or more cameras 202 to switch from a standard continuous recording mode to a triggering event recording mode that may be associated with higher quality recording parameters. In some embodiments, the triggering event signal adjusts one or more recording parameters of the one or more cameras 202, such as, any of a video resolution, a frame rate, a shutter speed, as well as other recording parameters. In some embodiments, it may be desirable to increase video recording quality for a certain duration of time associated with a triggering event, such that the aspects of the event are suitably captured.
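Purely as an illustration of such a control message, the sketch below builds a triggering event signal whose payload selects an updated set of recording parameters based on the type of triggering event; the specific profiles, trigger types, and field names are assumptions.

```python
# Illustrative sketch: build a triggering event signal that updates the cameras'
# recording parameters based on the type of triggering event (assumed values).
EVENT_RECORDING_PROFILES = {
    "collision": {"resolution": "4K",    "frame_rate": 60, "shutter_speed": "1/500"},
    "siren":     {"resolution": "1080p", "frame_rate": 30, "shutter_speed": "1/250"},
    "manual":    {"resolution": "1080p", "frame_rate": 30, "shutter_speed": "1/250"},
}

DEFAULT_PROFILE = {"resolution": "720p", "frame_rate": 30, "shutter_speed": "1/125"}


def build_trigger_signal(trigger_type: str, pre_event_seconds: int = 300) -> dict:
    return {
        "command": "enter_triggering_event_recording_mode",
        "recording_parameters": EVENT_RECORDING_PROFILES.get(trigger_type, DEFAULT_PROFILE),
        "pre_event_seconds": pre_event_seconds,  # how much buffered footage to return
    }
```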
At step 610, triggering event related video data is received from the one or more cameras 202 at the video recording manager device 210. In some embodiments, the triggering event related video data comprises any combination of previously recorded video data from before receiving the triggering event indication, post-recorded video data from after receiving the triggering event indication, and live recorded video data received in real time directly after being recorded by the one or more cameras 202. In some embodiments, the one or more cameras 202 are configured to provide the triggering event related video data responsive to receiving the triggering event signal from the video recording manager device 210. In some embodiments, the triggering event related video data comprises at least a portion of the video data stored within the internal storage medium of the one or more cameras 202. For example, in some embodiments, a portion of the video data stored using the circular storage buffer 504 corresponding to a predetermined period of time before the triggering event was received may be transmitted to the video recording manager device 210 as triggering event related video data. In one example, five minutes of video data leading up to the time a triggering event signal is received is included within the triggering event related video data and is redundantly stored across a plurality of storage devices.
At step 612, the triggering event related video data is stored by the video recording manager device 210. In some embodiments, the triggering event related video data is stored within a first storage medium 212 of the video recording manager device 210. In some embodiments, the stored triggering event related video data may comprise the composite authenticated video data 410 including authentication metadata 412 and one or more sets of individual video data 402.
In some embodiments, the triggering event related video data further comprises additional metadata relating to the triggering event. For example, the video data may be augmented with metadata including sensor data captured during the triggering event. In some embodiments, event related metadata may be included within frames of the video data. Accordingly, it may be possible to augment video frames with dynamic data. For example, location data or other sensor data may be continuously or periodically updated and recorded over time within the video frames. In some embodiments, it may be desirable to only augment certain frames of the video data with sensor data. For example, every tenth frame may be augmented or frames may be augmented once a second or once a minute. Further, in some embodiments, certain types of sensor data may be updated and augmented into video frames more frequently. For example, location data may be augmented into the video frames once a minute, while acceleration data may be augmented into the video data once a second.
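The following sketch illustrates one way frames could be augmented with sensor metadata at different cadences, for example acceleration once a second and location once a minute as described above; the 30 frames-per-second rate and the frame and sample representations are assumptions.

```python
# Sketch: attach sensor metadata to selected frames at different cadences
# (assumed: 30 fps, location once per minute, acceleration once per second).
FPS = 30


def augment_frames(frames, location_samples, accel_samples):
    """frames: dicts with at least an 'index' key; samples: per-second / per-minute lists."""
    for frame in frames:
        second = frame["index"] // FPS
        if frame["index"] % FPS == 0:              # once per second
            frame.setdefault("metadata", {})["acceleration"] = accel_samples[second]
        if frame["index"] % (FPS * 60) == 0:       # once per minute
            frame.setdefault("metadata", {})["location"] = location_samples[second // 60]
    return frames
```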
At step 614, an additional copy of the triggering event related video data may be stored by the video recording manager device 210. In some embodiments, the additional copy may be stored in the second storage medium 214 of the video recording manager device 210. As described above, in some embodiments, the second storage medium 214 may be removable from the video recording manager device 210. In some embodiments, the additional copy of the triggering event related video data may be identical to the first triggering event related video data stored at step 612. Embodiments are contemplated in which the original triggering event related video data and the additional copy may be compared to ensure that neither version of the video data has been tampered with or altered. In some embodiments, it may be desirable to distribute copies of the video data between multiple different entities such that copies of the data from one entity may be compared to other copies to verify authenticity. Accordingly, embodiments are contemplated in which the triggering event related video data is compared to the additional copy to authenticate the triggering event related video data as evidence, for example, in a legal proceeding.
In some embodiments, redundant copies of the triggering event related video data further provide evidence capture assurance insuring against physical damage to any of the storage mediums. Embodiments of the invention provide further benefit because the triggering event related data may be stored across physically distributed storage media. For example, the one or more cameras 202 may be positioned in a separate location from the video recording manager device 210 such that if the video recording manager device 210 is damaged the video data is still available within the internal storage medium of the one or more cameras 202. Additionally, the second storage medium 214 of the video recording manager device 210 may be removable such that additional copies of the video data may be removed from the video recording manager device 210 and transported to another location.
At step 616, the triggering event related video data including the authentication metadata is transmitted to the cloud-based storage system 230. In some embodiments, the triggering event related video data may be transmitted upon completion of a triggering event recording procedure after all of the triggering event related video data has been recorded. Alternatively, in some embodiments, the triggering event related video data may be transmitted to the cloud-based storage system 230 as it is received and as a suitable communication connection is available. For example, in some embodiments, the video recording manager device 210 may monitor the quality of a communication connection over a network to determine when the triggering event related video data should be transmitted. Further, embodiments are contemplated in which the triggering event related video data is transmitted after a certain amount of triggering event related video data has been recorded. For example, the triggering event related video data may be transmitted after 300 megabytes of data have been captured. It should be understood that, in some embodiments, the example of 300 megabytes is arbitrary and that other data amounts are also contemplated.
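As an illustration of such a transmission policy, the sketch below gates uploads on both an accumulated-data threshold and a connection-quality check; the 300-megabyte figure comes from the example above, while the link-quality score and its threshold are assumptions.

```python
# Sketch: decide when buffered triggering event related video data should be
# uploaded to the cloud-based storage system (thresholds are assumptions,
# apart from the 300-megabyte example mentioned above).
UPLOAD_SIZE_THRESHOLD_BYTES = 300 * 1024 * 1024   # example figure from the text
MIN_LINK_QUALITY = 0.6                            # assumed 0..1 link-quality score


def should_upload(pending_bytes: int, link_quality: float,
                  event_complete: bool) -> bool:
    if event_complete and link_quality >= MIN_LINK_QUALITY:
        return True                               # flush everything once the event recording ends
    return (pending_bytes >= UPLOAD_SIZE_THRESHOLD_BYTES
            and link_quality >= MIN_LINK_QUALITY)
```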
Transmitting the triggering event related video data to external storage systems such as the cloud-based storage system 230 provides further evidence capture assurance by persisting the video data remotely such that damage to any other storage media will not result in total loss of the video data, which in some cases, may include important video evidence. Further, embodiments are contemplated in which the video data may be published to online resources or the like to persist yet another copy of the video data distributed to various online databases.
In some embodiments, a video signature may be associated with the event-related video data and may be sent, for example, to the cloud-based storage system 230 as soon as possible. In some embodiments, the video signature may comprise a unique identifier, hash, or checksum of the associated video data, which may be used to confirm that the video data has not been altered. In some embodiments, comparing video signatures may be much faster and less resource intensive than, for example, comparing the video data itself, as described above. In some embodiments, said video signature comprises a hash of the video data which ensures that the video data has not been altered after recording. In some such embodiments, the hash is generated for the video as soon as the video recording is closed. Said hash may be configured such that the hash will be altered, destroyed, or changed in some way if the video data is altered. In some embodiments, the hash may be included within the authentication metadata for the video data. In some embodiments, a cellular connection may be used to upload the hash and trigger information indicative of the triggering event within seconds of recording a video. In some embodiments, time stamps may be generated for various operations within the recording and storage process. For example, a first time stamp may be generated for when the recording of a set of video data is closed and a second time stamp may be generated for when any combination of the hash, the trigger information, and the metadata arrives at a secure data storage, such as, for example, the cloud-based storage system 230. Accordingly, further assurance that the video has not been altered may be deduced from determining that there was not sufficient time to alter the video data between the time when recording ended and the time when the video data or its associated metadata or hash was stored within a secure data store. Further, if the hash of a set of video data matches the hash stored within the secure data store, the set of video data is confirmed to be unaltered.
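For illustration, the sketch below generates a SHA-256 video signature when a recording closes, records the two time stamps described above, and verifies a copy against the stored signature; the field names and the use of SHA-256 specifically are assumptions.

```python
# Sketch: generate a video signature (SHA-256 hash) when a recording closes,
# record the relevant time stamps, and verify a copy later (field names assumed).
import hashlib, time


def close_recording(video_bytes: bytes, trigger_info: dict) -> dict:
    return {
        "video_sha256": hashlib.sha256(video_bytes).hexdigest(),
        "trigger": trigger_info,
        "closed_at": time.time(),          # first time stamp: recording closed
    }


def store_signature(signature: dict) -> dict:
    # In practice this record would be uploaded (e.g., over a cellular link)
    # to secure storage such as the cloud-based storage system.
    signature["stored_at"] = time.time()   # second time stamp: arrival at secure storage
    return signature


def verify_copy(video_bytes: bytes, stored_signature: dict) -> bool:
    return hashlib.sha256(video_bytes).hexdigest() == stored_signature["video_sha256"]
```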
It should be understood that, in some embodiments, any number of the steps described herein with respect to method 600 may be performed simultaneously or in a different order than as explicitly described herein. Further, certain steps may be optional or removed entirely. For example, in some embodiments, the triggering event related video data may not be transmitted to the cloud-based storage system 230. Further still, certain steps of the method 600 may be repeated or may be performed continuously. For example, step 602 of recording video data may be performed continuously as other steps are performed.
In some embodiments, redundant copies of the triggering event related video data may be persisted until a confirmation is received that the triggering event related video data has been successfully received and stored at the cloud-based storage system 230. Here, overwriting or deletion of the triggering event related video data may be prevented until it is confirmed that the triggering event related video data has been stored elsewhere, such as on the cloud data store 232. For example, the cloud-based storage system 230 may receive the triggering event related video data and store the triggering event related video data within the cloud data store 232 before responding with a confirmation message or confirmation signal. In some embodiments, after receiving the confirmation message or confirmation signal the video recording manager device 210 may allow the redundant copies of the triggering event related video data to be deleted to make additional storage capacity available for subsequent triggering events.
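A minimal sketch of this confirmation-gated retention is shown below; the confirmation message format and the allow_overwrite call on a storage location are hypothetical.

```python
# Sketch: hold redundant copies until the cloud-based storage system confirms
# receipt, then release them for overwriting (assumed message format).
class RetentionGuard:
    def __init__(self):
        self.protected_events = {}       # event_id -> storage locations holding copies

    def protect(self, event_id: str, locations: list):
        self.protected_events[event_id] = locations

    def on_cloud_confirmation(self, confirmation: dict):
        # Only after the confirmation message arrives may local copies be overwritten.
        event_id = confirmation.get("event_id")
        if confirmation.get("status") == "stored" and event_id in self.protected_events:
            released = self.protected_events.pop(event_id)
            for location in released:
                location.allow_overwrite(event_id)   # hypothetical storage API
```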
Although the invention has been described with reference to the embodiments illustrated in the attached drawing figures, it is noted that equivalents may be employed and substitutions made herein without departing from the scope of the invention as recited in the claims.
Number | Name | Date | Kind |
---|---|---|---|
4409670 | Herndon et al. | Oct 1983 | A |
4789904 | Peterson | Dec 1988 | A |
4863130 | Marks, Jr. | Sep 1989 | A |
4918473 | Blackshear | Apr 1990 | A |
5027104 | Reid | Jun 1991 | A |
5064157 | O'Neal | Nov 1991 | A |
5096287 | Kaikinami et al. | Mar 1992 | A |
5111289 | Lucas et al. | May 1992 | A |
5289321 | Secor | Feb 1994 | A |
5381155 | Gerber | Jan 1995 | A |
5408330 | Squicciarii et al. | Apr 1995 | A |
5446659 | Yamawaki | Aug 1995 | A |
5453939 | Hoffman et al. | Sep 1995 | A |
5473501 | Claypool | Dec 1995 | A |
5473729 | Bryant et al. | Dec 1995 | A |
5479149 | Pike | Dec 1995 | A |
5497419 | Hill | Mar 1996 | A |
5526133 | Paff | Jun 1996 | A |
5585798 | Yoshioka et al. | Dec 1996 | A |
5642285 | Woo et al. | Jun 1997 | A |
5668675 | Fredricks | Sep 1997 | A |
5689442 | Swanson et al. | Nov 1997 | A |
5742336 | Lee | Apr 1998 | A |
5752632 | Sanderson et al. | May 1998 | A |
5798458 | Monroe | Aug 1998 | A |
5815093 | Kikinis | Sep 1998 | A |
5850613 | Bullecks | Dec 1998 | A |
5878283 | House et al. | Mar 1999 | A |
5886739 | Winningstad | Mar 1999 | A |
5890079 | Levine | Mar 1999 | A |
5926210 | Hackett et al. | Jul 1999 | A |
5962806 | Coakley et al. | Oct 1999 | A |
5978017 | Tino | Nov 1999 | A |
5983161 | Lemelson et al. | Nov 1999 | A |
5996023 | Winter et al. | Nov 1999 | A |
6008841 | Charlson | Dec 1999 | A |
6028528 | Lorenzetti et al. | Feb 2000 | A |
6052068 | Price R-W et al. | Apr 2000 | A |
6097429 | Seeley et al. | Aug 2000 | A |
6100806 | Gaukel | Aug 2000 | A |
6121881 | Bieback et al. | Sep 2000 | A |
6141609 | Herdeg et al. | Oct 2000 | A |
6141611 | Mackey et al. | Oct 2000 | A |
6163338 | Johnson et al. | Dec 2000 | A |
6175300 | Kendrick | Jan 2001 | B1 |
6298290 | Abe et al. | Oct 2001 | B1 |
6310541 | Atkins | Oct 2001 | B1 |
6314364 | Nakamura | Nov 2001 | B1 |
6324053 | Kamijo | Nov 2001 | B1 |
6326900 | Deline et al. | Dec 2001 | B2 |
6333694 | Pierce et al. | Dec 2001 | B2 |
6333759 | Mazzilli | Dec 2001 | B1 |
6370475 | Breed et al. | Apr 2002 | B1 |
RE37709 | Dukek | May 2002 | E |
6389340 | Rayner | May 2002 | B1 |
6396403 | Haner | May 2002 | B1 |
6405112 | Rayner | Jun 2002 | B1 |
6449540 | Rayner | Sep 2002 | B1 |
6452572 | Fan et al. | Sep 2002 | B1 |
6490409 | Walker | Dec 2002 | B1 |
6518881 | Monroe | Feb 2003 | B2 |
6525672 | Chainer et al. | Feb 2003 | B2 |
6546119 | Ciolli et al. | Apr 2003 | B2 |
6560463 | Santhoff | May 2003 | B1 |
6563532 | Strub et al. | May 2003 | B1 |
6583813 | Enright et al. | Jul 2003 | B1 |
6591242 | Karp et al. | Jul 2003 | B1 |
6681195 | Poland et al. | Jan 2004 | B1 |
6690268 | Schofield et al. | Feb 2004 | B2 |
6697103 | Fernandez et al. | Feb 2004 | B1 |
6718239 | Rayer | Apr 2004 | B2 |
6727816 | Helgeson | Apr 2004 | B1 |
6747687 | Alves | Jun 2004 | B1 |
6748792 | Freund et al. | Jun 2004 | B1 |
6783040 | Batchelor | Aug 2004 | B2 |
6823621 | Gotfried | Nov 2004 | B2 |
6831556 | Boykin | Dec 2004 | B1 |
6856873 | Breed et al. | Feb 2005 | B2 |
6877434 | NcNulty, Jr. | Apr 2005 | B1 |
6883694 | Abelow | Apr 2005 | B2 |
6894601 | Grunden et al. | May 2005 | B1 |
6947071 | Eichmann | Sep 2005 | B2 |
6970183 | Monroe | Nov 2005 | B1 |
7012632 | Freeman et al. | Mar 2006 | B2 |
7034683 | Ghazarian | Apr 2006 | B2 |
D520738 | Tarantino | May 2006 | S |
7038590 | Hoffman et al. | May 2006 | B2 |
7071969 | Stimson, III | Jul 2006 | B1 |
7088387 | Freeman et al. | Aug 2006 | B1 |
7102496 | Ernst et al. | Sep 2006 | B1 |
7119832 | Blanco et al. | Oct 2006 | B2 |
7126472 | Kraus et al. | Oct 2006 | B2 |
7147155 | Weekes | Dec 2006 | B2 |
7180407 | Guo et al. | Feb 2007 | B1 |
7190822 | Gammenthaler | Mar 2007 | B2 |
7350437 | Mangano et al. | Apr 2008 | B2 |
7353086 | Ennis | Apr 2008 | B2 |
7363742 | Nerheim | Apr 2008 | B2 |
7371021 | Ross et al. | May 2008 | B2 |
7421024 | Castillo | Sep 2008 | B2 |
7436143 | Lakshmanan et al. | Oct 2008 | B2 |
7436955 | Yan et al. | Oct 2008 | B2 |
7448996 | Khanuja et al. | Nov 2008 | B2 |
7456875 | Kashiwa | Nov 2008 | B2 |
7496140 | Winningstad et al. | Feb 2009 | B2 |
7500794 | Clark | Mar 2009 | B1 |
7508941 | O'Toole, Jr. et al. | Mar 2009 | B1 |
7536457 | Miller | May 2009 | B2 |
7539533 | Tran | May 2009 | B2 |
7561037 | Monroe | Jul 2009 | B1 |
7594305 | Moore | Sep 2009 | B2 |
7602301 | Stirling et al. | Oct 2009 | B1 |
7602597 | Smith et al. | Oct 2009 | B2 |
7631452 | Brundula et al. | Dec 2009 | B1 |
7656439 | Manico et al. | Feb 2010 | B1 |
7659827 | Gunderson et al. | Feb 2010 | B2 |
7680947 | Nicholl et al. | Mar 2010 | B2 |
7697035 | Suber, III et al. | Apr 2010 | B1 |
7701692 | Smith et al. | Apr 2010 | B2 |
7714704 | Mellen | May 2010 | B1 |
7778004 | Nerheim et al. | Aug 2010 | B2 |
7804426 | Etcheson | Sep 2010 | B2 |
7806525 | Howell et al. | Oct 2010 | B2 |
7853944 | Choe | Dec 2010 | B2 |
7944676 | Smith et al. | May 2011 | B2 |
7984579 | Brundula et al. | Jul 2011 | B2 |
8077029 | Daniel et al. | Dec 2011 | B1 |
8121306 | Cilia et al. | Feb 2012 | B2 |
8175314 | Webster | May 2012 | B1 |
8269617 | Cook et al. | Sep 2012 | B2 |
8314708 | Gunderson et al. | Nov 2012 | B2 |
8350907 | Blanco et al. | Jan 2013 | B1 |
8356438 | Brundula et al. | Jan 2013 | B2 |
8373567 | Denson | Feb 2013 | B2 |
8373797 | Ishii et al. | Feb 2013 | B2 |
8384539 | Denny et al. | Feb 2013 | B2 |
8446469 | Blanco et al. | May 2013 | B2 |
8456293 | Trundle et al. | Jun 2013 | B1 |
8508353 | Cook et al. | Aug 2013 | B2 |
8559486 | Kitayoshi | Oct 2013 | B2 |
8594485 | Brundula | Nov 2013 | B2 |
8606492 | Botnen | Dec 2013 | B1 |
8676428 | Richardson et al. | Mar 2014 | B2 |
8690365 | Williams | Apr 2014 | B1 |
8707758 | Keays | Apr 2014 | B2 |
8725462 | Jain et al. | May 2014 | B2 |
8744642 | Nemat-Nasser et al. | Jun 2014 | B2 |
8780205 | Boutell et al. | Jul 2014 | B2 |
8781292 | Ross et al. | Jul 2014 | B1 |
8805431 | Vasavada et al. | Aug 2014 | B2 |
8849501 | Cook et al. | Sep 2014 | B2 |
8854199 | Cook et al. | Oct 2014 | B2 |
8887208 | Merritt et al. | Nov 2014 | B1 |
8890954 | O'Donnell et al. | Nov 2014 | B2 |
8903593 | Addepalli et al. | Dec 2014 | B1 |
8930072 | Lambert et al. | Jan 2015 | B1 |
8934045 | Karn et al. | Jan 2015 | B2 |
8989914 | Nemat-Nasser et al. | Mar 2015 | B1 |
8996234 | Tamari et al. | Mar 2015 | B1 |
8996240 | Plante | Mar 2015 | B2 |
9002313 | Sink et al. | Apr 2015 | B2 |
9003474 | Smith | Apr 2015 | B1 |
9058499 | Smith | Jun 2015 | B1 |
9122082 | Abreau | Sep 2015 | B2 |
9123241 | Grigsby et al. | Sep 2015 | B2 |
9159371 | Ross et al. | Oct 2015 | B2 |
9164543 | Minn et al. | Oct 2015 | B2 |
9253452 | Ross et al. | Feb 2016 | B2 |
9518727 | Markle et al. | Dec 2016 | B1 |
9582979 | Mader et al. | Feb 2017 | B2 |
9591255 | Skiewica et al. | Mar 2017 | B2 |
9728228 | Palmer et al. | Aug 2017 | B2 |
9774816 | Rios, III et al. | Sep 2017 | B2 |
9781348 | Bart et al. | Oct 2017 | B1 |
10271015 | Haler et al. | Apr 2019 | B2 |
10964351 | Ross et al. | Mar 2021 | B2 |
11232685 | Nixon | Jan 2022 | B1 |
20010033661 | Prokoski | Oct 2001 | A1 |
20020013517 | West et al. | Jan 2002 | A1 |
20020019696 | Kruse | Feb 2002 | A1 |
20020032510 | Tumball et al. | Mar 2002 | A1 |
20020044065 | Quist et al. | Apr 2002 | A1 |
20020049881 | Sugimura | Apr 2002 | A1 |
20020077086 | Tuomela et al. | Jul 2002 | A1 |
20020084130 | Der Gazarian et al. | Jul 2002 | A1 |
20020131768 | Gammenthaler | Sep 2002 | A1 |
20020135336 | Zhou et al. | Sep 2002 | A1 |
20020159434 | Gosior et al. | Oct 2002 | A1 |
20020191952 | Fiore et al. | Dec 2002 | A1 |
20030040917 | Fiedler | Feb 2003 | A1 |
20030080713 | Kirmuss | May 2003 | A1 |
20030080878 | Kirmuss | May 2003 | A1 |
20030081121 | Kirmuss | May 2003 | A1 |
20030081934 | Kirmuss | May 2003 | A1 |
20030081935 | Kirmuss | May 2003 | A1 |
20030081942 | Melnyk et al. | May 2003 | A1 |
20030095688 | Kirmuss | May 2003 | A1 |
20030106917 | Shelter et al. | Jun 2003 | A1 |
20030133018 | Ziemkowski | Jul 2003 | A1 |
20030151510 | Quintana et al. | Aug 2003 | A1 |
20030184674 | Manico et al. | Oct 2003 | A1 |
20030185417 | Alattar et al. | Oct 2003 | A1 |
20030215010 | Kashiwa | Nov 2003 | A1 |
20030215114 | Kyle | Nov 2003 | A1 |
20030222982 | Hamdan et al. | Dec 2003 | A1 |
20030229493 | McIntyre et al. | Dec 2003 | A1 |
20040008255 | Lewellen | Jan 2004 | A1 |
20040043765 | Tolhurst | Mar 2004 | A1 |
20040143373 | Ennis | Jun 2004 | A1 |
20040131184 | Wu et al. | Jul 2004 | A1 |
20040141059 | Enright et al. | Jul 2004 | A1 |
20040145457 | Schofield et al. | Jul 2004 | A1 |
20040150717 | Page et al. | Aug 2004 | A1 |
20040168002 | Accarie et al. | Aug 2004 | A1 |
20040199785 | Pederson | Oct 2004 | A1 |
20040223054 | Rotholtz | Nov 2004 | A1 |
20040243734 | Kitagawa et al. | Dec 2004 | A1 |
20040267419 | Jing | Dec 2004 | A1 |
20050030151 | Singh | Feb 2005 | A1 |
20050046583 | Richards | Mar 2005 | A1 |
20050050266 | Haas et al. | Mar 2005 | A1 |
20050068169 | Copley et al. | Mar 2005 | A1 |
20050068417 | Kreiner et al. | Mar 2005 | A1 |
20050083404 | Pierce et al. | Apr 2005 | A1 |
20050094966 | Elberbaum | May 2005 | A1 |
20050099498 | Lao et al. | May 2005 | A1 |
20050100329 | Lao et al. | May 2005 | A1 |
20050101334 | Brown et al. | May 2005 | A1 |
20050134966 | Burgner | May 2005 | A1 |
20050132200 | Jaffe et al. | Jun 2005 | A1 |
20050151852 | Jomppanen | Jul 2005 | A1 |
20050035161 | Shioda | Aug 2005 | A1 |
20050168574 | Lipton et al. | Aug 2005 | A1 |
20050185438 | Ching | Aug 2005 | A1 |
20050206532 | Lock | Sep 2005 | A1 |
20050206741 | Raber | Sep 2005 | A1 |
20050228234 | Yang | Oct 2005 | A1 |
20050232469 | Schofield et al. | Oct 2005 | A1 |
20050243171 | Ross et al. | Nov 2005 | A1 |
20050258942 | Manasseh et al. | Nov 2005 | A1 |
20060009238 | Stanco et al. | Jan 2006 | A1 |
20060028811 | Ross, Jr. et al. | Feb 2006 | A1 |
20060055786 | Ollila | Mar 2006 | A1 |
20060082730 | Franks | Apr 2006 | A1 |
20060125919 | Camilleri et al. | Jul 2006 | A1 |
20060153740 | Sultan et al. | Jul 2006 | A1 |
20060158968 | Vanman et al. | Jul 2006 | A1 |
20060164220 | Harter, Jr. et al. | Jul 2006 | A1 |
20060164534 | Robinson et al. | Jul 2006 | A1 |
20060170770 | MacCarthy | Aug 2006 | A1 |
20060176149 | Douglas | Aug 2006 | A1 |
20060183505 | Willrich | Aug 2006 | A1 |
20060193749 | Ghazarian et al. | Aug 2006 | A1 |
20060203090 | Wang et al. | Sep 2006 | A1 |
20060208857 | Wong | Oct 2006 | A1 |
20060220826 | Rast | Oct 2006 | A1 |
20060225253 | Bates | Oct 2006 | A1 |
20060232406 | Filibeck | Oct 2006 | A1 |
20060244601 | Nishimura | Nov 2006 | A1 |
20060256822 | Kwong et al. | Nov 2006 | A1 |
20060270465 | Lee et al. | Nov 2006 | A1 |
20060271287 | Gold et al. | Nov 2006 | A1 |
20060274166 | Lee et al. | Dec 2006 | A1 |
20060274828 | Siemens et al. | Dec 2006 | A1 |
20060274829 | Siemens et al. | Dec 2006 | A1 |
20060276200 | Radhakrishnan et al. | Dec 2006 | A1 |
20060282021 | DeVaul et al. | Dec 2006 | A1 |
20060287821 | Lin | Dec 2006 | A1 |
20060293571 | Bao et al. | Dec 2006 | A1 |
20070021134 | Liou | Jan 2007 | A1 |
20070035622 | Hanna et al. | Feb 2007 | A1 |
20070064108 | Haler | Mar 2007 | A1 |
20070067079 | Kosugi | Mar 2007 | A1 |
20070081818 | Castaneda et al. | Apr 2007 | A1 |
20070091557 | Kim et al. | Apr 2007 | A1 |
20070102508 | Mcintosh | May 2007 | A1 |
20070117083 | Winneg et al. | May 2007 | A1 |
20070132567 | Schofield et al. | Jun 2007 | A1 |
20070152811 | Anderson | Jul 2007 | A1 |
20070172053 | Poirier | Jul 2007 | A1 |
20070177023 | Beuhler et al. | Aug 2007 | A1 |
20070195939 | Sink et al. | Aug 2007 | A1 |
20070199076 | Rensin et al. | Aug 2007 | A1 |
20070213088 | Sink | Sep 2007 | A1 |
20070229350 | Scalisi et al. | Oct 2007 | A1 |
20070257781 | Denson | Nov 2007 | A1 |
20070257782 | Etcheson | Nov 2007 | A1 |
20070257804 | Gunderson et al. | Nov 2007 | A1 |
20070257815 | Gunderson et al. | Nov 2007 | A1 |
20070260361 | Etcheson | Nov 2007 | A1 |
20070268158 | Gunderson et al. | Nov 2007 | A1 |
20070271105 | Gunderson et al. | Nov 2007 | A1 |
20070274705 | Kashiwa | Nov 2007 | A1 |
20070277352 | Maron et al. | Dec 2007 | A1 |
20070285222 | Zadnikar | Dec 2007 | A1 |
20070287425 | Bates | Dec 2007 | A1 |
20070297320 | Brummette et al. | Dec 2007 | A1 |
20080001735 | Tran | Jan 2008 | A1 |
20080002031 | Cana et al. | Jan 2008 | A1 |
20080002599 | Denny et al. | Feb 2008 | A1 |
20080030580 | Kashiwa et al. | Feb 2008 | A1 |
20080042825 | Denny et al. | Feb 2008 | A1 |
20080043736 | Stanley | Feb 2008 | A1 |
20080049830 | Richardson | Feb 2008 | A1 |
20080061991 | Urban et al. | Mar 2008 | A1 |
20080063252 | Dobbs et al. | Mar 2008 | A1 |
20080084473 | Romanowich | Apr 2008 | A1 |
20080100705 | Kister et al. | May 2008 | A1 |
20080101789 | Sharma | May 2008 | A1 |
20080122603 | Plante et al. | May 2008 | A1 |
20080129518 | Carlton-Foss | Jun 2008 | A1 |
20080143481 | Abraham et al. | Jun 2008 | A1 |
20080144705 | Rackin et al. | Jun 2008 | A1 |
20080169929 | Albertson et al. | Jul 2008 | A1 |
20080170130 | Ollila et al. | Jul 2008 | A1 |
20080175565 | Takakura et al. | Jul 2008 | A1 |
20080177569 | Chen et al. | Jul 2008 | A1 |
20080211906 | Lovric | Sep 2008 | A1 |
20080222849 | Lavoie | Sep 2008 | A1 |
20080239064 | Iwasaki | Oct 2008 | A1 |
20080246656 | Ghazarian | Oct 2008 | A1 |
20080266118 | Pierson et al. | Oct 2008 | A1 |
20080307435 | Rehman | Dec 2008 | A1 |
20080316314 | Bedell et al. | Dec 2008 | A1 |
20090002491 | Haler | Jan 2009 | A1 |
20090002556 | Manapragada et al. | Jan 2009 | A1 |
20090023422 | MacInnis et al. | Jan 2009 | A1 |
20090027499 | Nicholl | Jan 2009 | A1 |
20090052685 | Cilia et al. | Feb 2009 | A1 |
20090070820 | Li | Mar 2009 | A1 |
20090085740 | Klein et al. | Apr 2009 | A1 |
20090109292 | Ennis | Apr 2009 | A1 |
20090122142 | Shapley | May 2009 | A1 |
20090135007 | Donovan et al. | May 2009 | A1 |
20090177679 | Boomer et al. | Jun 2009 | A1 |
20090157255 | Plante | Jul 2009 | A1 |
20090169068 | Okamoto | Jul 2009 | A1 |
20090189981 | Siann et al. | Jul 2009 | A1 |
20090195686 | Shintani | Aug 2009 | A1 |
20090207252 | Raghunath | Aug 2009 | A1 |
20090213204 | Wong | Aug 2009 | A1 |
20090225189 | Morin | Sep 2009 | A1 |
20090243794 | Morrow | Oct 2009 | A1 |
20090251545 | Shekarri et al. | Oct 2009 | A1 |
20090252370 | Picard et al. | Oct 2009 | A1 |
20090252486 | Ross, Jr. et al. | Oct 2009 | A1 |
20090276708 | Smith et al. | Nov 2009 | A1 |
20090294538 | Wihlborg et al. | Dec 2009 | A1 |
20090324203 | Wiklof | Dec 2009 | A1 |
20100045798 | Sugimoto et al. | Feb 2010 | A1 |
20100050734 | Chou | Mar 2010 | A1 |
20100060747 | Woodman | Mar 2010 | A1 |
20100097221 | Kriener et al. | Apr 2010 | A1 |
20100106707 | Brown et al. | Apr 2010 | A1 |
20100118147 | Dorneich et al. | May 2010 | A1 |
20100122435 | Markham | May 2010 | A1 |
20100123779 | Snyder et al. | May 2010 | A1 |
20100157049 | Dvir et al. | Jun 2010 | A1 |
20100177193 | Flores | Jul 2010 | A1 |
20100177891 | Keidar et al. | Jul 2010 | A1 |
20100188201 | Cook et al. | Jul 2010 | A1 |
20100191411 | Cook et al. | Jul 2010 | A1 |
20100194885 | Plaster | Aug 2010 | A1 |
20100217836 | Rofougaran | Aug 2010 | A1 |
20100238009 | Cook et al. | Sep 2010 | A1 |
20100238262 | Kurtz et al. | Sep 2010 | A1 |
20100242076 | Potesta et al. | Sep 2010 | A1 |
20100265331 | Tanaka | Oct 2010 | A1 |
20100274816 | Guzik | Oct 2010 | A1 |
20100287473 | Recesso et al. | Nov 2010 | A1 |
20110006151 | Beard | Jan 2011 | A1 |
20110018998 | Guzik | Jan 2011 | A1 |
20110050904 | Anderson | Mar 2011 | A1 |
20110069151 | Orimoto | Mar 2011 | A1 |
20110084820 | Walter et al. | Apr 2011 | A1 |
20110094003 | Spiewak et al. | Apr 2011 | A1 |
20110098924 | Baladeta et al. | Apr 2011 | A1 |
20110129151 | Saito et al. | Jun 2011 | A1 |
20110157759 | Smith et al. | Jun 2011 | A1 |
20110187895 | Cheng et al. | Aug 2011 | A1 |
20110261176 | Monaghan, Sr. et al. | Oct 2011 | A1 |
20110281547 | Cordero | Nov 2011 | A1 |
20110301971 | Roesch et al. | Dec 2011 | A1 |
20110314401 | Salisbury et al. | Dec 2011 | A1 |
20120038689 | Ishii | Feb 2012 | A1 |
20120056722 | Kawaguchi | Mar 2012 | A1 |
20120063736 | Simmons et al. | Mar 2012 | A1 |
20120120258 | Boutell et al. | May 2012 | A1 |
20120162436 | Cordell et al. | Jun 2012 | A1 |
20120188345 | Salow | Jul 2012 | A1 |
20120189286 | Takayama et al. | Jul 2012 | A1 |
20120195574 | Wallace | Aug 2012 | A1 |
20120206565 | Villmer | Aug 2012 | A1 |
20120230540 | Calman et al. | Sep 2012 | A1 |
20120257320 | Brundula et al. | Oct 2012 | A1 |
20120268259 | Igel et al. | Oct 2012 | A1 |
20120276954 | Kowalsky | Nov 2012 | A1 |
20120314063 | Cirker | Dec 2012 | A1 |
20130021153 | Keays | Jan 2013 | A1 |
20130033610 | Osborn | Feb 2013 | A1 |
20130035602 | Gemer | Feb 2013 | A1 |
20130080836 | Stergiou et al. | Mar 2013 | A1 |
20130095855 | Bort | Apr 2013 | A1 |
20130096731 | Tamari et al. | Apr 2013 | A1 |
20130125000 | Flischhauser et al. | May 2013 | A1 |
20130148295 | Minn et al. | Jun 2013 | A1 |
20130222640 | Baek et al. | Aug 2013 | A1 |
20130225309 | Bentley et al. | Aug 2013 | A1 |
20130265453 | Middleton et al. | Oct 2013 | A1 |
20130285232 | Sheth | Oct 2013 | A1 |
20130290018 | Anderson et al. | Oct 2013 | A1 |
20130300563 | Glaze | Nov 2013 | A1 |
20130329063 | Zhou | Dec 2013 | A1 |
20130343571 | Lee | Dec 2013 | A1 |
20140037262 | Sako | Feb 2014 | A1 |
20140040158 | Dalley, Jr. et al. | Feb 2014 | A1 |
20140049636 | O'Donnell et al. | Feb 2014 | A1 |
20140092299 | Phillips et al. | Apr 2014 | A1 |
20140094992 | Lambert et al. | Apr 2014 | A1 |
20140098453 | Brundula et al. | Apr 2014 | A1 |
20140131435 | Harrington et al. | May 2014 | A1 |
20140139680 | Huang et al. | May 2014 | A1 |
20140140575 | Wolf | May 2014 | A1 |
20140143545 | McKeeman et al. | May 2014 | A1 |
20140167954 | Johnson et al. | Jun 2014 | A1 |
20140170602 | Reed | Jun 2014 | A1 |
20140176733 | Drooker et al. | Jun 2014 | A1 |
20140178031 | Walker | Jun 2014 | A1 |
20140192194 | Bedell et al. | Jul 2014 | A1 |
20140195105 | Lambert et al. | Jul 2014 | A1 |
20140195272 | Sadiq et al. | Jul 2014 | A1 |
20140210625 | Nemat-Nasser | Jul 2014 | A1 |
20140218544 | Senot et al. | Aug 2014 | A1 |
20140227671 | Olmstead et al. | Aug 2014 | A1 |
20140311215 | Keays et al. | Oct 2014 | A1 |
20140341532 | Marathe et al. | Nov 2014 | A1 |
20140355951 | Tabak | Dec 2014 | A1 |
20140368658 | Costa et al. | Dec 2014 | A1 |
20150019982 | Petitt, Jr. et al. | Jan 2015 | A1 |
20150050003 | Ross et al. | Feb 2015 | A1 |
20150051502 | Ross | Feb 2015 | A1 |
20150053776 | Rose et al. | Mar 2015 | A1 |
20150078727 | Ross et al. | Mar 2015 | A1 |
20150088335 | Lambert et al. | Mar 2015 | A1 |
20150103246 | Phillips et al. | Apr 2015 | A1 |
20150163390 | Lee et al. | Jun 2015 | A1 |
20150180746 | Day et al. | Jun 2015 | A1 |
20150229630 | Smith | Aug 2015 | A1 |
20150256808 | MacMillan et al. | Sep 2015 | A1 |
20150312773 | Joshi et al. | Oct 2015 | A1 |
20150317368 | Rhoads et al. | Nov 2015 | A1 |
20150332424 | Kane et al. | Nov 2015 | A1 |
20150348417 | Ignaczak et al. | Dec 2015 | A1 |
20150356081 | Cronin | Dec 2015 | A1 |
20150358549 | Cho et al. | Dec 2015 | A1 |
20160006922 | Boudreau | Jan 2016 | A1 |
20160042767 | Araya et al. | Feb 2016 | A1 |
20160050345 | Longbotham | Feb 2016 | A1 |
20160054735 | Switkes et al. | Feb 2016 | A1 |
20160057392 | Meidan et al. | Feb 2016 | A1 |
20160064036 | Chen et al. | Mar 2016 | A1 |
20160066085 | Chang et al. | Mar 2016 | A1 |
20160104508 | Chee et al. | Apr 2016 | A1 |
20160112636 | Yamaguchi et al. | Apr 2016 | A1 |
20160127695 | Zhang et al. | May 2016 | A1 |
20160165192 | Saatchi et al. | Jun 2016 | A1 |
20160295089 | Farahani | Oct 2016 | A1 |
20160358393 | Penland | Dec 2016 | A1 |
20160360160 | Eizenberg | Dec 2016 | A1 |
20160364621 | Hill et al. | Dec 2016 | A1 |
20170028935 | Dutta et al. | Feb 2017 | A1 |
20170059265 | Winter | Mar 2017 | A1 |
20170070659 | Kievsky et al. | Mar 2017 | A1 |
20170161382 | Ouimet et al. | Jun 2017 | A1 |
20170178475 | Renkis | Jun 2017 | A1 |
20170195635 | Yokomitsu et al. | Jul 2017 | A1 |
20170200476 | Chen et al. | Jul 2017 | A1 |
20170230605 | Han et al. | Aug 2017 | A1 |
20170237950 | Araya et al. | Aug 2017 | A1 |
20170244884 | Burtey et al. | Aug 2017 | A1 |
20170277700 | Davis et al. | Sep 2017 | A1 |
20170287523 | Hodulik et al. | Oct 2017 | A1 |
20180023910 | Kramer | Jan 2018 | A1 |
20180050800 | Boykin et al. | Feb 2018 | A1 |
20180053394 | Gersten | Feb 2018 | A1 |
20180131844 | Lau | May 2018 | A1 |
20180262724 | Ross | Sep 2018 | A1 |
20190020827 | Siminoff | Jan 2019 | A1 |
20190057314 | Julian | Feb 2019 | A1 |
Number | Date | Country |
---|---|---
102010019451 | Nov 2011 | DE |
2479993 | Jul 2012 | EP |
3073449 | Sep 2016 | EP |
2273624 | Jun 1994 | GB |
2320389 | May 1998 | GB |
2343252 | May 2000 | GB |
2351055 | Dec 2000 | GB |
2417151 | Feb 2006 | GB |
2425427 | Oct 2006 | GB |
2455885 | Jul 2009 | GB |
2485804 | May 2012 | GB |
20090923 | Sep 2010 | IE |
294188 | Sep 1993 | JP |
153298 | Jun 1996 | JP |
198858 | Jul 1997 | JP |
10076880 | Mar 1998 | JP |
210395 | Jul 1998 | JP |
2000137263 | May 2000 | JP |
2005119631 | May 2005 | JP |
20-0236817 | Aug 2001 | KR |
1050897 | Jul 2011 | KR |
2383915 | Mar 2010 | RU |
107851 | Aug 2011 | RU |
124780 | Feb 2013 | RU |
9005076 | May 1990 | WO |
9738526 | Oct 1997 | WO |
9831146 | Jul 1998 | WO |
9948308 | Sep 1999 | WO |
0039556 | Jul 2000 | WO |
0051360 | Aug 2000 | WO |
0123214 | Apr 2001 | WO |
0249881 | Jun 2002 | WO |
02095757 | Nov 2002 | WO |
03049446 | Jun 2003 | WO |
2004036926 | Apr 2004 | WO |
2009013526 | Jan 2009 | WO |
2011001180 | Jan 2011 | WO |
2012037139 | Mar 2012 | WO |
2012120083 | Sep 2012 | WO |
2014000161 | Jan 2014 | WO |
2014052898 | Apr 2014 | WO |
Entry |
---|
Automation Systems Article, Know-How Bank Co. Ltd. Takes Leap Forward as a Company Specializing in R&D and Technology Consulting, published Jan. 2005. |
Car Rear View Camera—Multimedia Rear View Mirror—4′ LCD color monitor, Retrieved from the Internet: <URL: http://web.archive.org/web/20050209014751/http://laipac.com/multimedia-rear-mirror.htm>, Feb. 9, 2005. |
ATC Chameleon. Techdad Review [Online] Jun. 19, 2013 [Retrieved on Dec. 30, 2015]. Retrieved from Internet. <URL:http://www.techdadreview.com/2013/06/19atc-chameleon/>. |
“Breathalyzer.” Wikipedia. Printed Date: Oct. 16, 2014; Date Page Last Modified: Sep. 14, 2014; <http://en.wikipedia.org/wiki/Breathalyzer>. |
Dees, Tim; Taser Axon Flex: The next generation of body camera; <http://www.policeone.com/police-products/body-cameras/articles/527231-0-TASER-Axon-Flex-The-next-generation-of-body-camera/>, Date Posted: Mar. 12, 2012; Date Printed: Oct. 27, 2015. |
Brown, TP-Link TL-WDR3500 Wireless N600 Router Review, Mar. 6, 2013. |
Controller Area Network (CAN) Overview, National Instruments White Paper, Aug. 1, 2014. |
Daskam, Samuel W., Law Enforcement Armed Robbery Alarm System Utilizing Recorded Voice Addresses via Police Radio Channels, Source: Univ. of Ky, Off of Res and Eng., Serv (UKY BU107), pp. 18-22, 1975. |
Digital Ally vs. Taser International, Inc., Case No. 2:16-cv-232 (CJM/TJ); US D. Kan, Defendant Taser International Inc.'s Preliminary Invalidity Contentions, Jul. 5, 2016. |
Electronic Times Article, published Feb. 24, 2005. |
Supplementary European Search Report dated Sep. 28, 2010 in European Patent Application No. 06803645.8; Applicant: Digital Ally, Inc. |
W. Fincham, Data Recorders for Accident Investigation, Monitoring of Driver and Vehicle Performance (Digest No. 1997/122), Publication Date: Apr. 10, 1997, pp. 6/1-6/3. |
Frankel, Harry; Riter, Stephen, Bernat, Andrew, Automated Imaging System for Border Control, Source: University of Kentucky, Office of Engineering Services, (Bulletin) UKY BU, pp. 169-173, Aug. 1986. |
Freudenrich, Craig, Ph.D.; "How Breathalyzers Work—Why Test?" HowStuffWorks. Printed Date: Oct. 16, 2014; Posted Date: Unknown; <http://electronics.howstuffworks.com/gadgets/automotive/breathalyzer1.htm>. |
Hankyung Auto News Article, Know-How Bank's Black Box for Cars “Multi-Black Box,” Copyright 2005. |
Guide to Bluetooth Security: Recommendations of the National Institute of Standards and Technology, National Institute of Standards and Technology, U.S. Dep't of Commerce, NIST Special Publication 800-121, Revision 1 (Jun. 2012). |
ICOP Extreme Wireless Mic, Operation Supplement, Copyright 2008. |
ICOP Model 20/20-W Specifications; Enhanced Digital In-Car Video and Audio recording Systems, date: Unknown. |
ICOP Mobile DVRs; ICOP Model 20/20-W & ICOP 20/20 Vision, date: Unknown. |
Bertomen, Lindsey J., PoliceOne.com News; “Product Review: ICOP Model 20/20-W,” May 19, 2009. |
ICOP Raytheon JPS communications, Raytheon Model 20/20-W, Raytheon 20/20 Vision Digital In-Car Video Systems, date: Unknown. |
Overview of the IEEE 802.15.4 standards for Low rate Wireless Personal Area Networks, 2010 7th International Symposium on Wireless Communication Systems (ISWCS), Copyright 2010. |
Lewis, S.R., Future System Specifications for Traffic Enforcement Equipment, S.R. 1 Source: IEE Colloquium (Digest), N 252, Publication Date: Nov. 18, 1996, pp. 8/1-8/2. |
Kopin Corporation; Home Page; Printed Date: Oct. 16, 2014; Posted Date: Unknown; <http://www.kopin.com>. |
Translation of Korean Patent No. 10-1050897, published Jul. 20, 2011. |
Lilliput RV 18-50NP 5″ Rear View Mirror TFT LCD Screen with Camera, Retrieved from the Internet: <URL: http://www.case-mod.com/lilliput-rv1850np-rear-view-mirror-tft-lcd-screen-with-camera-p-1271.html>, Mar. 4, 2005. |
Motor Magazine Article, Recreating the Scene of an Accident, published 2005. |
New Rearview-Mirror-Based Camera Display Takes the Guesswork Out of Backing Up Retrieved from the Internet: URL: http://news.thomasnet.com/fullstory/497750>, Press Release, Oct. 30, 2006. |
SIIF Award for Multi Black Box, published Dec. 10, 2004. |
Near Field Communication; Sony Corporation; pp. 1-7, Date: Unknown. |
Oregon Scientific ATC Chameleon Dual Lens HD Action Camera, http://www.oregonscientificstore.com/Oregon-Scientific-ATC-Chameleon-Dual-Lens-HD-Action-Camera.data, Date Posted: Unknown; Date Printed: Oct. 13, 2014, pp. 1-4. |
Asian Wolf High Quality Angel Eye Body Video Spy Camera Recorder System, http://www.asianwolf.com/covert-bodycam-hq-angeleye.html, Sep. 26, 2013, Date Posted: Unknown, pp. 1-3. |
Brick House Security Body Worn Cameras / Hidden Cameras / Covert Spy Cameras, https://www.brickhousesecurity.com/hidden-cameras/bodyworn-cameras/, Sep. 26, 2013, Date Posted: Unknown, pp. 1-2. |
Amazon.com wearable camcorders, http://www.amazon.com/s/ref=nb_sb_ss_i_0_4?url=search-alias%3Dphoto&field-keywords=wearable+camcorder&x=0&y=0&sprefix=wear, Sep. 26, 2013, Date Posted: Unknown, pp. 1-4. |
Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration dated Feb. 4, 2016; International Application No. PCT/US2015/056052; International Filing Date: Oct. 16, 2015; Applicant: Digital Ally, Inc. |
Http://www.k-h-b.com/board/board.php?board=products01&comand=body&no=1, Current State of Technology Held by the Company, Copyright 2005. |
City of Pomona Request for Proposals for Mobile Video Recording System for Police Vehicles, dated prior to Apr. 4, 2013. |
Http://www.k-h-b.com/sub1_02.html, Copyright 2005. |
Renstrom, Joell; “Tiny 3D Projectors Allow You to Transmit Holograms From a Cell Phone.” Giant Freakin Robot. Printed Date: Oct. 16, 2014; Posted Date: Jun. 13, 2014; <http://www.giantfreakinrobot.com/sci/coming-3d-projectors-transmit-holograms-cell-phone.html>. |
Request for Comment 1323 of the Internet Engineering Task Force, TCP Extensions for High Performance, Date: May 1992. |
RevealMedia RS3-SX high definition video recorder, http://www.revealmedia.com/buy-t166/cameras/rs3-sx.aspx, Sep. 26, 2013, Date Posted: Unknown, pp. 1-2. |
Scorpion Micro DV Video Audio Recorder, http://www.leacorp.com/scorpion-micro-dv-video-audio-recorder/, Sep. 26, 2013, Date Posted: Unknown, pp. 1-3. |
“Stalker Press Room—Using In-Car Video, the Internet, and the Cloud to keep police officers safe is the subject of CopTrax live, free webinar.” Stalker. Printed Date: Oct. 16, 2014; Posted Date: Jul. 31, 2014. |
State of Utah Invitation to Bid State Cooperative Contract; Vendor: ICOP Digital, Inc., Contract No. MA503, Jul. 1, 2008. |
Wasson, Brian; “Digital Eyewear for Law Enforcement.” Printed Date: Oct. 16, 2014; Posted Date: Dec. 9, 2013; <http://www.wassom.com/digital-eyewear-for-law-enforcement.html>. |
X26 Taser, Date Unknown. |
Taser International; Taser X26 Specification Sheet, 2003. |
Digital Ally First Vu Mountable Digital Camera Video Recorder, http://www.opticsplanet.com/digital-ally-first-vu-mountable-digital-camera-video-recorder.html?gclid=CIKohcX05rkCFSlo7AodU0IA0g&ef_id=FUjCGEAAAAWGEjrQF:20130925155534:s, Sep. 25, 2013, Date Posted: Unknown, pp. 1-4. |
Drift X170, http://driftinnovation.com/support/firmware-update/x170/, Sep. 26, 2013, Date Posted: Unknown, p. 1. |
Shapton, Dave “Digital Microphones: A new approach?” from soundonsound.com published Mar. 2004, 4 pages (Year: 2004). |
Sharper Image User Guide, https://cdn4.sharperimage.com/si/pdf/manuals/206463.pdf, Jan. 2, 2012 (Year: 2012). |
Prospero, Oregon Scientific ATC Chameleon Review, https://www.laptopmag.com/reviews/cameras/orgon-scientific-atc-chameleon, Mar. 27, 2013 (Year: 2013). |
Ecplaza HY-001HD law enforcement DVR, http://fireeye.en.ecplaza.net/law-enforcement-dvr--238185-1619696.html, Sep. 26, 2013, Date Posted: Unknown, pp. 1-3. |
Edesix VideoBadge, http://www.edesix.com/edesix-products, Sep. 26, 2013, Date Posted: Unknown, pp. 1-3. |
GoPro Official Website: The World's Most Versatile Camera, http://gopro.com/products/?gclid=CKqHv9jT4rkCFWZk7AodyiAAaQ, Sep. 23, 2013, Date Posted: Unknown, pp. 4-9. |
Isaw Advance Hull HD EXtreme, www.isawcam.co.kr, Sep. 26, 2013, Date Posted: Unknown, p. 1. |
Kustom Signals VieVu, http://www.kustomsignals.com/index.php/mvideo/vievu, Sep. 26, 2013, Date Posted: Unknown, pp. 1-4. |
Lea-Aid Scorpion Micro Recorder Patrol kit,http://www.leacorp.com/products/SCORPION-Micro-Recorder-Patrol-kit.html, Sep. 26, 2013, Date Posted: Unknown, pp. 1-2. |
Looxcie Wearable & mountable streaming video cams, http://www.looxcie.com/overview?gclid=CPbDyv6piq8CFWeFQAodlhXC-w, Sep. 26, 2013, Date Posted: Unknown, pp. 1-4. |
Midland XTC HD Video Camera, http://midlandradio.com/Company/xtc100-signup, Sep. 26, 2013, Date Posted: Unknown, pp. 1-3. |
Panasonic Handheld AVCCAM HD Recorder/Player, http://www.panasonic.com/business/provideo/ag-hmr10.asp, Sep. 26, 2013, Date Posted: Unknown, pp. 1-2. |
Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration dated Jan. 30, 2014; International Application No. PCT/US2013/062415; International Filing Date: Sep. 27, 2013; Applicant: Digital Ally, Inc. |
Point of View Cameras Military & Police, http://pointofviewcameras.com/military-police, Sep. 26, 2013, Date Posted: Unknown, pp. 1-2. |
Pov.hd System Digital Video Camera, http://www.vio-pov.com/index.php, Sep. 26, 2013, Date Posted: Unknown, pp. 1-3. |
Invalidity Chart for International Publication No. WO2014/000161 dated Oct. 31, 2017. |
PCT Patent Application PCT/US17/16383 International Search Report and Written Opinion dated May 4, 2017. |
SIV Security in Vehicle Driving Partner, http://www.siv.co.kr/, Sep. 26, 2013, Date Posted: Unknown, p. 1. |
Spy Chest Mini Spy Camera / Self Contained Mini camcorder / Audio & Video Recorder, http://www.spytechs.com/spy_cameras/mini-spy-camera.htm, Sep. 26, 2013, Date Posted: Unknown, pp. 1-3. |
Stalker VUE Law Enforcement Grade Body Worn Video Camera/Recorder, http://www.stalkerradar.com/law_vue.shtml, Sep. 26, 2013, Date Posted: Unknown, pp. 1-2. |
SUV Cam, http://www.elmo.co.jp/suv-cam/en/product/index.html, Sep. 26, 2013, Date Posted: Unknown, p. 1. |
Taser Axon Body On Officer Video/Police Body Camera, http://www.taser.com/products/on-officer-video/axon-body-on-officer-video, Sep. 23, 2013, Date Posted: Unknown, pp. 1-8. |
Taser Axon Flex On-Officer Video/Police Video Camera, http://www.taser.com/products/on-officer-video/taser-axon, Sep. 26, 2013, Date Posted: Unknown, pp. 1-8. |
Taser Cam Law Enforcement Audio/Video Recorder (gun mounted), http://www.taser.com/products/on-officer-video/taser-cam, Sep. 26, 2013, Date Posted: Unknown, pp. 1-3. |
Tide Leader police body worn camera, http://tideleader.en.gongchang.com/product/14899076, Sep. 26, 2013, Date Posted: Unknown, pp. 1-3. |
UCorder Pockito Wearable Mini Pocket Camcorder, http://www.ucorder.com/, Sep. 26, 2013, Date Posted: Unknown, p. 1. |
Veho MUVI HD, http://veho-uk.fastnet.co.uk/main/shop.aspx?category=CAMMUVIHD, Sep. 26, 2013, Date Posted: Unknown, pp. 1-5. |
Veho MUVI portable wireless speaker with dock, http://veho-uk.fastnet.co.uk/main/shop.aspx?category=camcorder, Sep. 26, 2013, Date Posted: Unknown, p. 1. |
Vidmic Officer Worn Video & Radio Accessories, http://www.vidmic.com/, Sep. 26, 2013, Date Posted: Unknown, p. 1. |
VIEVU Products, http://www.vievu.com/vievu-products/vievu-squared/, Sep. 25, 2013, Date Posted: Unknown, pp. 1-2. |
WatchGuard CopVu Wearable Video Camera System, http://watchguardvideo.com/copvu/overview, Sep. 26, 2013, Date Posted: Unknown, pp. 1-2. |
Witness Cam headset, http://www.secgru.com/DVR-Witness-Cam-Headset-Video-Recorder-SG-DVR-1-COP.html, Sep. 26, 2013, Date Posted: Unknown, pp. 1-2. |
WolfCom 3rd Eye, X1 A/V Recorder for Police and Military, http://wolfcomusa.com/Products/Products.html, Sep. 26, 2013, Date Posted: Unknown, pp. 1-3. |
Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration dated Jan. 14, 2016; International Application No. PCT/US2015/056039; International Filing Date: Oct. 16, 2015; Applicant: Digital Ally, Inc. |
U.S. Appl. No. 13/959,142 Final Office Action dated Jul. 20, 2016. |
U.S. Appl. No. 13/959,142 Office Action dated Nov. 3, 2015. |
Digital Ally, Inc. vs. Taser International, Inc., Case No. 2:16-cv-020232 (CJM/TJ); US D. Kan, Complaint for Patent Infringement, Jan. 14, 2016. |
Digital Ally, Inc. vs. Enforcement Video LLC d/b/a WatchGuard Video, Case No. 2:16-cv-02349 (CJM/TJ); US D. Kan, Complaint for Patent Infringement, May 27, 2016. |
International Association of Chiefs of Police Digital Video System Minimum Specifications; Nov. 21, 2008. |
Petition for Inter Partes Review No. 2017-00375, Taser International, Inc. v. Digital Ally, Inc., filed Dec. 1, 2016. |
Petition for Inter Partes Review No. 2017-00376, Taser International, Inc. v. Digital Ally, Inc., filed Dec. 1, 2016. |
Petition for Inter Partes Review No. 2017-00515, Taser International, Inc. v. Digital Ally Inc., filed Jan. 11, 2017. |
Petition for Inter Partes Review No. 2017-00775, Taser International, Inc. v. Digital Ally Inc., filed Jan. 25, 2017. |
PCT Patent Application PCT/US16/34345 International Search Report and Written Opinion dated Dec. 29, 2016. |
State of Utah Invitation to Bid State Cooperative Contract; Vendor: Kustom Signals Inc., Contract No. MA1991, Apr. 25, 2008. |
Dyna Spy Inc. hidden cameras, https://www.dynaspy.com/hidden-cameras/spy-cameras/body-worn-wearable-spy-cameras, Sep. 26, 2013, Date Posted: Unknown, pp. 1-3. |
U.S. Appl. No. 15/011,132 Office Action dated Apr. 18, 2016, 19 pages. |
Zepcam Wearable Video Technology, http://www.zepcam.com/product.aspx, Sep. 26, 2013, Date Posted: Unknown, pp. 1-2. |
Petition for Post Grant Review No. PGR2018-00052, Axon Enterprise, Inc. v. Digital Ally, Inc., filed Mar. 19, 2018. |
MPEG-4 Coding of Moving Pictures and Audio ISO/IEC JTC1/SC29/WG11 N4668 dated Mar. 2002. |
European Patent Application 15850436.6 Search Report dated May 4, 2018. |
Final Written Decision for Inter Partes Review No. 2017-00375, Axon Enterprise Inc. v. Digital Ally, Inc., dated Jun. 1, 2018. |
Decision Denying Institution of Post Grant Review for Post Grant Review No. PGR2018-00052, Axon Enterprise, Inc. v. Digital Ally, Inc., issued Oct. 1, 2018. |
Number | Date | Country
---|---|---
20230379430 A1 | Nov 2023 | US |