Embodiments of the invention are broadly related to automatic triggering of sensors. More specifically, embodiments of the invention are directed to the automatic collection of data from sensors triggered by the sensors or other devices to record events of interest.
Cameras are becoming widespread for documenting events of interest. For example, dashboard cameras may be triggered to record when an accident occurs so that a driver has a record of the event. Pre-event recording may be used to create a record of the events immediately preceding the triggering accident. Similarly, law-enforcement officers may wear body-mounted cameras to record law-enforcement events from a first-person point of view. However, the triggers for automatically recording traffic accidents or law-enforcement events are not generally suitable for other, commercial applications. As such, camera systems that can be triggered to record by a wide variety of conditions are needed.
Embodiments of the invention address the above-described need by providing for a system that may automatically trigger recording of an event, store the event recording with relevant information, and perform an action in response to the triggering event.
In particular, in a first embodiment, the invention includes a system for automatically recording an event, comprising a first sensor configured to collect a first set of data, a second sensor configured to collect a second set of data, and a recording device manager comprising a data store, a processor, and one or more non-transitory computer-readable media storing computer-executable instructions that, when executed by the processor, perform the steps of receiving the first set of data from the first sensor, detecting a triggering event from the first set of data, transmitting, in response to the detection of the triggering event, a signal from the recording device manager to the second sensor, wherein the signal instructs the second sensor to begin collecting the second set of data, storing, in the data store, the first set of data collected at the first sensor, storing, in the data store, the second set of data collected at the second sensor, and storing a third set of data in the data store, wherein the third set of data is not collected at the first sensor or at the second sensor.
In a second embodiment, the invention includes a method of automatically recording an event, comprising the steps of collecting a first set of data at a first sensor, detecting a triggering event from the first set of data at a recording device manager, sending a first signal, in response to the detection of the triggering event, from the recording device manager to a second sensor, wherein the first signal instructs the second sensor to begin collecting a second set of data, sending a second signal, in response to the detection of the triggering event, from the recording device manager to an electromechanical device, wherein the electromechanical device performs an action in response to the second signal, and storing the first set of data collected at the first sensor and the second set of data collected at the second sensor.
In a third embodiment, the invention includes one or more non-transitory computer storage media storing computer-executable instructions that, when executed by a processor, perform a method of automatically recording an event, the method comprising the steps of collecting a first set of data at a first sensor, detecting a triggering event from the first set of data at a recording device manager, sending a first signal from the recording device manager to a second sensor instructing the second sensor to collect a second set of data, sending a second signal from the recording device manager to an electromechanical device instructing the electromechanical device to perform an action responsive to the second signal, and storing the first set of data and the second set of data.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Other aspects and advantages of the current invention will be apparent from the following detailed description of the embodiments and the accompanying drawing figures.
Embodiments of the invention are described in detail below with reference to the attached drawing figures, wherein:
The drawing figures do not limit the invention to the specific embodiments disclosed and described herein. The drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the invention.
Embodiments of the invention solve the above problems by providing a system and method for more versatile triggering of video recordings appropriate to a wider range of applications. Broadly, the video triggering system includes one or more sensors and a recording device manager. One or more of the sensors may be an image sensor such as a camera. When an event of interest occurs, it is captured by the sensors, and the recording device manager may trigger other sensors such as a camera to begin recording the event. Where multiple sensors are positioned to record the event, the triggered sensors may communicate with the recording device manager to cause all other sensors to record as well. For example, a camera may be the sensor that triggers other sensors based on a triggering event captured by the camera.
In one embodiment, an event is detected and a signal is sent to a camera to begin recording. These may also be referred to as a “triggering event” and a “triggering signal.” The camera may be communicatively coupled to a recording device manager and one or more mobile devices or computers. The detecting sensor may send a signal directly to one or more cameras or to the recording device manager. The recording device manager may send one or more signals to a plurality of cameras or other sensors to begin recording. The camera may begin recording the video when the signal is received. The recording is terminated as determined by a predetermined set of instructions, when the user turns off the camera, or when a separate event is sensed that, according to predetermined rules executed by the camera or the recording device manager, signals the camera to end the video recording.
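The triggering flow just described can be sketched in a few lines. This is a minimal illustration under assumed names, not the claimed implementation: a detecting sensor reports an event to a recording device manager, which fans the triggering signal out to every registered camera and ends the recording according to predetermined rules.

```python
# Minimal sketch, with illustrative names, of the trigger fan-out described
# above: a sensor event reaches the recording device manager, which signals
# all registered cameras to start or stop recording.

class Camera:
    def __init__(self, camera_id):
        self.camera_id = camera_id
        self.recording = False

    def start_recording(self):
        self.recording = True

    def stop_recording(self):
        self.recording = False

class RecordingDeviceManager:
    def __init__(self):
        self.cameras = []

    def register(self, camera):
        self.cameras.append(camera)

    def on_sensor_event(self, event_type):
        # Predetermined rules decide which events start or stop recording.
        if event_type == "trigger":
            for cam in self.cameras:
                cam.start_recording()
        elif event_type == "end":
            for cam in self.cameras:
                cam.stop_recording()

manager = RecordingDeviceManager()
front, body = Camera("vehicle"), Camera("body-mounted")
manager.register(front)
manager.register(body)
manager.on_sensor_event("trigger")   # a triggering event starts both cameras
```

In a deployed system the event types and termination rules would be drawn from the predetermined instruction set; here a single string stands in for them.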
In this description, references to “one embodiment,” “an embodiment,” or “embodiments” mean that the feature or features being referred to are included in at least one embodiment of the technology. Separate references to “one embodiment,” “an embodiment,” or “embodiments” in this description do not necessarily refer to the same embodiment and are also not mutually exclusive unless so stated and/or except as will be readily apparent to those skilled in the art from the description. For example, a feature, structure, act, etc. described in one embodiment may also be included in other embodiments, but is not necessarily included. Thus, the current technology can include a variety of combinations and/or integrations of the embodiments described herein.
Turning first to
Computer-readable media include both volatile and nonvolatile media, removable and nonremovable media, and contemplate media readable by a database. For example, computer-readable media include (but are not limited to) RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile discs (DVD), holographic media or other optical disc storage, magnetic cassettes, magnetic tape, magnetic disk storage, and other magnetic storage devices. These technologies can store data temporarily or permanently. However, unless explicitly specified otherwise, the term “computer-readable media” should not be construed to include physical, but transitory, forms of signal transmission such as radio broadcasts, electrical signals through a wire, or light pulses through a fiber-optic cable. Examples of stored information include computer-useable instructions, data structures, program modules, and other data representations.
Finally, network interface card (NIC) 124 is also attached to system bus 104 and allows computer 102 to communicate over a network such as network 126. NIC 124 can be any form of network interface known in the art, such as Ethernet, ATM, fiber, Bluetooth, or Wi-Fi (i.e., the IEEE 802.11 family of standards). NIC 124 connects computer 102 to local network 126, which may also include one or more other computers, such as computer 128, and network storage, such as data store 130. Generally, a data store such as data store 130 may be any repository in which information can be stored and from which it can be retrieved as needed. Examples of data stores include relational or object-oriented databases, spreadsheets, file systems, flat files, directory services such as LDAP and Active Directory, and email storage systems. A data store may be accessible via a complex API (such as, for example, Structured Query Language), a simple API providing only read, write, and seek operations, or any level of complexity in between. Some data stores may additionally provide management functions for data sets stored therein, such as backup or versioning. Data stores can be local to a single computer such as computer 128, accessible on a local network such as local network 126, or remotely accessible over Internet 132. Local network 126 is in turn connected to Internet 132, which connects many networks such as local network 126, remote network 134, or directly attached computers such as computer 136. In certain embodiments, computer 102 can itself be directly connected to Internet 132.
In the following embodiment, as depicted in
In some embodiments, the time, location, personal identification of people, numbers associated with people or animals, and other information about the event that may be sensed by either analog or digital sensors and recorded in digital form may be recorded as metadata. This data may be received from any of the sensors mentioned above, from RFID tags, or from computers or mobile devices communicatively connected to the triggering system and may be received from a recording device manager and not directly from the sensors. The real-time information transmitted by any of the devices can be embedded in the video data. This data may be used at a later date for verification purposes. For example, in a court case it may be persuasive to have the data embedded on the video recording, in real time, rather than just stored in a folder that may be or may not have to be manipulated between the time of recording and presentation. This may also be used for statistical, performance, or productivity purposes. For example, data may be extracted from a video recording of a day of delivering parcels from a parcel delivery service. With the time record, the number of parcels, and the driver ID transmitted from an RFID, the driver's performance that day may be evaluated against other drivers.
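The real-time embedding described above can be sketched as bundling each frame with the metadata available at the instant of capture. This is an illustrative sketch only; the field names, the fixed timestamp, and the `tag_frame` helper are assumptions, not the claimed implementation.

```python
# Illustrative sketch of embedding real-time metadata with each recorded
# frame, so time, driver ID, and parcel count travel with the video rather
# than in a separate, possibly manipulated folder. All names are assumed.
import datetime
import json

def tag_frame(frame_bytes, driver_id, parcel_count, when=None):
    """Bundle a frame with metadata captured at the moment of recording."""
    when = when or datetime.datetime(2020, 1, 1, 12, 0, 0)  # fixed for demo
    metadata = {
        "timestamp": when.isoformat(),
        "driver_id": driver_id,        # e.g. read from an RFID tag
        "parcel_count": parcel_count,
    }
    return {"frame": frame_bytes, "meta": json.dumps(metadata)}

record = tag_frame(b"\x00\x01", driver_id="D-1024", parcel_count=37)
```

A real recorder would write this structure into the video container's metadata track rather than a Python dictionary; the point is that the tag is created at capture time.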
In the exemplary embodiment of the invention depicted in
In certain embodiments, camera 202 may send the video signal to a data store or to the recording device manager 208. The recording device manager 208 may relay the video signal to the data store. The recording device manager 208 may embed additional data in the video signal such as date, time, RFID number, or any other data that may be useful in some embodiments of the invention. The additional data may be stored as metadata.
As shown, a secondary, body-mounted camera 210 may be attached to a parcel service employee 212 in order to provide a secondary recording. As with camera 202, camera 210 may be mounted in any fashion and may record in any visual spectrum. Parcel service employee 212 may drive the delivery vehicle 204, be a person designated to deliver parcels, or both drive and deliver. When camera 202 signals recording device manager 208 that recording is in process, recording device manager 208 may signal the secondary camera 210 to begin recording. Alternatively, camera 202 may signal camera 210 directly. The recording device manager 208 may receive the signal from the GPS sensor 206 for location, and the recording device manager 208 may determine that the delivery vehicle 204 is proximate the delivery location. When the predefined proximity has been reached, the recording device manager 208 may send a signal to all sensors to start recording. The information from the sensors may be tagged with information such as location, time, parcel 214 information such as tracking number, weight, size, and contents, and employee 212 information such as employee number, rank, history, and any other information associated with the parcel 214 or the employee 212. An RFID tag may be attached to the employee 212, the parcel 214, or both, and information such as the current location, time, employee ID number, and parcel number and information may be stored as metadata on the video recording. Additionally, the information may be received not from the sensors but from a clock or other self-contained device independently tracking information such as date and time, or from an online database, website, or mobile application.
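The proximity determination above can be sketched as a simple geofence check. This is a hedged illustration, not the claimed method: the haversine distance formula and the 100-meter radius are assumptions standing in for whatever predefined proximity the recording device manager uses.

```python
# Minimal geofence sketch: the recording device manager compares the GPS
# sensor reading against a stored delivery location and triggers recording
# inside a predefined radius. Formula choice and radius are illustrative.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    r = 6371000.0                      # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_trigger(vehicle_pos, delivery_pos, radius_m=100.0):
    """True when the vehicle is within the predefined delivery proximity."""
    return haversine_m(*vehicle_pos, *delivery_pos) <= radius_m

# Roughly 50 m from the delivery address: within the predefined proximity.
near = should_trigger((38.97170, -95.2353), (38.97215, -95.2353))
# Roughly 2 km away: no trigger.
far = should_trigger((38.97170, -95.2353), (38.99000, -95.2353))
```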
Alternatively, recording device manager 208 may be incorporated into any of the sensors, including one of camera 202 or camera 210, or into both. Camera 210 may also be triggered independently to begin recording (in which case it may or may not trigger camera 202 to begin recording). For example, a triggering event for camera 210 may be activated by motion detection of the parcel service employee 212 exiting delivery vehicle 204, opening the service door 216, or tripping an infrared laser (not shown) at the exit of the vehicle 204. In certain embodiments, manual activation of camera 210 may also serve as a triggering event.
In some embodiments, camera 202 and/or camera 210 may be connected to a monitor 218 for real-time viewing of the destination to ensure a safe environment prior to delivery. The monitor 218 may be in the interior of the delivery vehicle 204, on a communicatively coupled mobile device 220, or in a remote location. In some embodiments of the invention, when camera 202 or 210 is triggered to begin recording, recording device manager 208 may send a notification to mobile device 220, notifying the user that an event of interest has occurred. The recording device manager 208 may also send the signal to start recording to all sensors simultaneously or individually. The employee 212 can then remotely monitor the feeds from one or more of cameras 202 and 210. In some such embodiments, the employee 212 may also be able to use the mobile device 220 to activate or deactivate cameras 202 and 210, recording device manager 208, monitor 218, or any online data upload by the recording device manager 208 or any sensors.
As another application of embodiments of the invention in the same context, the body-mounted camera 210 may be activated when the parcel bar code 222 is scanned on parcel 214. The bar code 222 may be scanned with the mobile device 220 that may be a phone or a scanning device issued by the parcel delivery service. A signal may be transmitted from the mobile device 220 to the recording device manager 208 and the recording device manager 208 may signal the other sensors to activate.
In an exemplary application of the embodiment depicted in
Alternatively, the recorded video and associated metadata may be uploaded to an online website, computer, or mobile application. When a parcel 214 has been delivered to an incorrect address, or a customer cannot find the parcel 214 that has been delivered, the customer may go online to view the information associated with the delivery and/or access part or all of the associated metadata. The customer can then determine whether the parcel 214 has been delivered to the correct address and where the parcel 214 may be located. Similarly, when the recording is triggered for a customer's package, a notification may be sent to a mobile device or an account of the customer. The customer may then view the parcel delivery in real time.
The customer account may also be linked to the customer's home security system. When a notification is sent to the customer account, the customer's security system may be notified and automatically send an alert to the customer or automatically begin recording the delivery event. The customer may receive the notification and view the delivery via a mobile application or computer.
In another exemplary scenario depicted in
The system may be beneficial not only for the transportation of goods but also for the transportation of people. In another exemplary scenario depicted in
A driver 320 may view the video footage from the camera 302 and the camera 318 on a monitor 322. The video may also be viewed remotely or via a mobile application. The driver 320 may also view, via the monitor 322, payment information provided by sensors that may be activated during payment of a fare. For example, the passenger 306 may swipe a card to pay for a ride, and a screen may be displayed to the driver 320 requesting input from the driver 320. The camera 302 may also be activated upon receipt of a payment attempt or in the event that the passenger 306 signs an electronic signature device. The passenger 306 may fail to pay or may not have funds, in which case the system may actuate locks to trap the passenger 306 or allow the passenger to leave. The system may also alert authorities and signal a visual emergency alert, such as a flashing light, or an audible noise alerting anyone nearby that the passenger has not paid. This may also be useful in the event that the passenger 306 has a weapon or is conducting any illegal activities in the taxicab 304.
Initiation of the signal by the system may be based in part on the state of the taxicab 304 and not necessarily on the actions of the driver 320 or the passenger 306. In this case, the system may be operationally independent of the driver 320. As such, the system may be fully operational with the driver 320 or autonomously. In some embodiments, the driver 320 may initiate a signal that triggers the system, or any remote signal used to operate the taxicab 304 may initiate the system. The initiation of a mode of the taxicab 304 may itself be a triggering event. For example, the passenger 306 may hail the taxicab 304 with a voice command and a gesture. The autonomous taxicab 304 may access a database of voice and motion cues that exemplify ride-hailing activity. The system matches the gestures with the stored cues and determines that the passenger 306 needs a ride. Upon this determination, the system activates the necessary sensors and begins recording. Alternatively, the triggering event may be specific GPS coordinates communicatively relayed from the sensor 310, a navigation system, or a mobile device. The triggering event may be motion-based, such as when the passenger 306 reaches for the door handle 324 or enters the taxicab 304. The triggering event may be pressure-related, such as when the passenger 306 sits down on the seat 326.
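The several trigger types listed above (cue matching, GPS coordinates, motion, seat pressure) can be sketched as one dispatch routine. This is purely illustrative: the cue vocabulary, the zone list, and the 20 kg pressure threshold are assumptions, not values taken from the disclosure.

```python
# Hedged sketch of checking the taxicab triggering events above with a
# single dispatch routine. All constants and event field names are assumed.

HAIL_CUES = {("voice:taxi", "motion:arm_raised"), ("voice:hey", "motion:wave")}
PICKUP_ZONES = {(40.7580, -73.9855)}       # illustrative stored coordinates

def is_triggering_event(event):
    """Return True when the event matches any predefined triggering rule."""
    kind = event.get("kind")
    if kind == "cue":                       # voice + gesture matched to database
        return (event["voice"], event["motion"]) in HAIL_CUES
    if kind == "gps":                       # relayed coordinates in a pickup zone
        return event["coords"] in PICKUP_ZONES
    if kind == "motion":                    # door handle or cabin motion
        return event["source"] in ("door_handle", "cabin_interior")
    if kind == "pressure":                  # passenger sits on the seat
        return event["seat_kg"] > 20
    return False

hail = is_triggering_event({"kind": "cue", "voice": "voice:taxi",
                            "motion": "motion:arm_raised"})
seat = is_triggering_event({"kind": "pressure", "seat_kg": 68})
```

Each `kind` corresponds to one of the triggering modalities in the paragraph above; a production system would draw these rules from its stored cue database rather than literals.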
Embodiments of the invention may be suitable for a typical school.
In some embodiments, the camera 402 may be on the exterior of the school such that, in the event that a suspicious person 406 approaches the school, the system automatically locks the exterior doors that the suspicious person 406 is attempting to enter. In some embodiments, the system may lock all exterior doors. In the event that the suspicious person 406 is inside the building, all doors, windows, or any other possible escape routes may be locked, thus trapping the suspicious person 406 in a particular room or section of the school. This may be done to cut the suspicious person 406 off from any other people and trap the suspicious person 406 until authorities arrive.
In some embodiments as depicted in
In some embodiments, the suspicious person 406 may be a student, teacher, parent, or any other person that may have been granted access to the facility (in this case a school). The suspicious person 406 may be carrying a weapon 408 such as a firearm or knife. The system may use object recognition software to recognize that the suspicious person 406 is carrying a weapon 408 and though the suspicious person 406 may register as an approved person for entry, the suspicious person 406 may be locked out and denied entry based on the object recognition. The system may employ any of the above-mentioned features for detaining, trapping, or preventing the suspicious person 406 from movement throughout the building.
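The access decision above can be sketched as a credential check that object recognition can override. This is a hedged sketch, not the claimed software: the approved-ID list and the `weapon_detected` stub stand in for a real credential database and recognizer.

```python
# Illustrative access decision: a person whose credential would normally
# grant entry is nevertheless denied when object recognition reports a
# weapon. The ID list and recognizer stub are assumptions.

APPROVED_IDS = {"student-7731", "teacher-0042"}

def weapon_detected(detected_objects):
    """Stand-in for object recognition output listing recognized objects."""
    return any(obj in ("firearm", "knife") for obj in detected_objects)

def grant_entry(person_id, detected_objects):
    """A valid credential alone is not sufficient; recognition can override."""
    if person_id not in APPROVED_IDS:
        return False
    if weapon_detected(detected_objects):
        return False          # approved person, but carrying a weapon
    return True

allowed = grant_entry("teacher-0042", ["backpack"])
denied = grant_entry("teacher-0042", ["backpack", "firearm"])
```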
Continuing with exemplary embodiments depicted in
Embodiments of the invention may be advantageous for capturing school-related events that do not necessarily take place on the school premises, such as field trips, playground activities, athletic events, and other events outside of school buildings. In some embodiments, a body-mounted camera may access and utilize a smartphone to transfer data while outside of communication range of a recording device manager. Embodiments of the invention may also be useful for providing oversight of the teachers without a significant administrative burden, allowing administrators or even concerned parents to review a brief, random clip of video data. For example, a ten-minute clip of classroom discussion may be uploaded daily or upon request.
The exemplary embodiment above may apply to any commercial or government building. For example, sensors may be attached to the exterior or interior of a building. The sensors may receive data indicative of a person or object, such as transmitted numbers, names, or any other data that may be transmitted over radio frequency, infrared, or any other method. The sensors may also be cameras that employ facial or object recognition software as described above. The sensors may track individuals or objects. As the individuals or objects move throughout or around the building, the system may track the movements via the sensors. The sensors may be RFID readers, retinal scanners, fingerprint readers, push-button pads, biometric sensors, or any other means by which information may be digitized. Any one of the sensor inputs may be a triggering event for other sensors, including the cameras.
An exemplary embodiment depicted in
Once the gas 504 is detected, the event that led to the gas 504 exposure has already occurred. In this case, it may be beneficial to utilize pre-event recording. The triggering event may be gas detection, and a signal may be sent to the camera 508 to activate. The camera 508 may begin sending information to a remote data store, and the 30, 60, or 120 seconds, 10 minutes, half hour, or any other predefined time range prior to the triggering event may be transmitted to the data store. This allows the events leading up to the triggering event to be recorded.
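Pre-event recording is commonly implemented with a fixed-length ring buffer, and that approach can be sketched as follows. This is an assumed mechanism for illustration: the camera continuously buffers its most recent frames, and when the trigger (e.g. gas detection) fires, the buffered pre-event frames are flushed to the data store along with everything recorded afterward.

```python
# Sketch of pre-event recording via a ring buffer. The buffer length (here
# 3 frames for brevity; in practice seconds or minutes of video) is an
# illustrative assumption.
from collections import deque

class PreEventRecorder:
    def __init__(self, pre_event_frames):
        self.buffer = deque(maxlen=pre_event_frames)   # pre-event ring buffer
        self.stored = []                               # frames sent to the store
        self.triggered = False

    def on_frame(self, frame):
        if self.triggered:
            self.stored.append(frame)
        else:
            self.buffer.append(frame)    # oldest frame silently overwritten

    def on_trigger(self):
        self.triggered = True
        self.stored.extend(self.buffer)  # flush the pre-event window
        self.buffer.clear()

rec = PreEventRecorder(pre_event_frames=3)
for f in ["f1", "f2", "f3", "f4"]:   # only the last 3 frames are retained
    rec.on_frame(f)
rec.on_trigger()                     # e.g. gas detected
rec.on_frame("f5")                   # post-event frame
```

Because the buffer has a fixed maximum length, frames older than the predefined window are discarded automatically, so the camera never needs to store unbounded video while waiting for a trigger.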
Alternatively, sensors such as the gas detector 506 may be mounted along exposed gas lines and triggered to begin recording when the gas within the line is sensed. Sensors may be mounted in sewer tunnels and triggered to record when toxic gases climb to specified levels. These sensors may be body-mounted sensors on workers inspecting the tunnels and activated, along with an alert, when toxic gases reach specified levels. Any video or sensor data may be transmitted to a remote monitoring location to ensure the workers' safety. This transmission may be by radio frequency, infrared, electrical lines, or any other form of signaled communication, depending on availability and necessity. The levels recorded, location, time, employee ID numbers, or any other information associated with any data in some embodiments of the invention may be recorded as metadata associated with the video recording and other sensor data.
As described above, the selective triggering of recording of one or more cameras, sensors, and data stores based on a variety of sensors can provide great benefits. However, each application for the embodiment may require a different configuration of sensors, cameras, and metadata. As such, a variety of exemplary scenarios is disclosed herein to describe the variety of triggering signals and applications that can be used with the invention.
In one exemplary scenario, a camera may be attached to a service vehicle, such as a tow truck. The location of a distressed vehicle is known. When the camera is within a proximity defined by GPS coordinates, the video recorder is triggered to begin. A body-mounted camera may also be attached to the employee and triggered to begin recording when in proximity of the destination as determined by GPS or geofencing, or the body-mounted camera may receive a triggering signal from the other camera or the recording device manager. The tow company employee may initiate a triggering event so that the camera begins recording, and the camera or recording device manager may send a signal to other cameras to begin recording. Information such as GPS coordinates, time, date, and employee identification number may be stored as metadata on the video recording. The employee may also wear an RFID tag, and the customer may fill out information electronically. The employee identification and the customer information may be stored as metadata on the video recording. This may be used to ensure the safety of the employee and the customer and that proper procedures are followed. The recording may also be helpful in the event of property damage during the tow event. The tow truck may operate autonomously, and the initiation of this state, or other autonomous states, may be a triggering event.
In a second exemplary scenario, embodiments of the invention may be used in the industrial field. A camera may be installed on a type of industrial machine such as a lathe, a press, or other industrial machines. The camera may be triggered to record when the machine is in a specific state, such as on, running, off, or standby. A plurality of cameras may be in communication. A triggering event may cause a camera to begin recording and that camera may send a signal to start other cameras. The signal may also be sent from a recording device manager. The industrial machines may be autonomous and the initiation of the autonomous state may be a triggering event.
In another embodiment of the invention in the industrial or construction field, a sensor may be installed on a construction or industrial vehicle such as a forklift. Alternatively, the industrial vehicle may be a dump truck, a bulldozer, a loader, a scraper, a crane, or any other machine used in construction or industry. The sensor may be a camera and may be triggered to begin recording when the forklift ignition is turned to the on position. The camera may be triggered by an accelerometer sending a signal to a device manager, which in turn sends a signal to the camera to begin recording when the forklift moves in a direction or comes to a stop.
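The accelerometer-to-manager-to-camera chain can be sketched with a simple magnitude threshold. This is illustrative only: the 0.5 m/s² threshold and the class names are assumptions, not parameters from the disclosure.

```python
# Hedged sketch of the forklift chain above: an accelerometer sample above a
# threshold is reported to the device manager, which signals the camera to
# begin recording. Threshold and names are illustrative assumptions.
import math

MOVE_THRESHOLD = 0.5    # m/s^2, illustrative

def movement_detected(ax, ay, az):
    """True when the acceleration magnitude suggests the vehicle is moving."""
    return math.sqrt(ax * ax + ay * ay + az * az) >= MOVE_THRESHOLD

class DeviceManager:
    def __init__(self):
        self.camera_recording = False

    def on_accelerometer(self, sample):
        if movement_detected(*sample):
            self.camera_recording = True    # signal the camera to record

dm = DeviceManager()
dm.on_accelerometer((0.05, 0.02, 0.01))     # vibration at rest: no trigger
dm.on_accelerometer((1.2, 0.3, 0.0))        # forklift moves: camera triggered
```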
The triggering event may be accident related. For example, the camera may turn on due to a motion detection sensor sensing motion. A triggering event may be the detection of objects falling, relative motion between two objects in close proximity, or objects moving at or above a specified speed. The triggering event may be a collision. In this case, the camera may employ pre- and post-recording options. The camera may be connected to a monitor such that the machine operator has better visibility of the area surrounding the industrial vehicle. The camera may be mounted at a stationary position in a warehouse and triggered by a motion sensor when machines come into frame. The stationary camera may be triggered by the camera onboard the forklift. The stationary camera may be used to monitor progress, procedure, or safety. Any data associated with the data collected by the cameras, such as date, time, and employee identification number, may be stored as metadata on the video recording. The camera may be triggered by autonomous initiation; the initiation of this state may be a triggering event. In some embodiments of the invention, a camera may be installed in a construction zone to monitor traffic while connected to a monitor in a construction vehicle. This may signal and monitor traffic in and out of the construction zone, letting the machine operator know when it is safe to operate the machinery.
In a fourth exemplary scenario, embodiments of the invention may be used for identification of a person. A camera may be at a location where a client or customer's identification is required, such as when using a credit card or conducting a bank transaction. When the client or customer swipes a credit card through a magnetic card reader or signs a digitizer, the video recorder is triggered to begin. A camera may be triggered directly by the user input sensor or by the recording device manager. The card and digitizer identification may be used immediately along with facial recognition software to determine the person's identity. The camera may be triggered by the facial recognition software when a person's identity is either known or not known. If the person's identity is known, then no identification is required. If the person's identity is unknown, identification is required. In either case, the identification is stored as metadata on the video recording. Upon repeat visits, a customer's identity is known through the facial recognition software. This may also be used in multi-level identity verification. The customer's previously entered information may be stored as metadata on the video recording and made accessible to the employee. This data may be downloaded, making the interaction more efficient.
In a fifth exemplary scenario, embodiments of the invention may be used in connection with computers. A camera may be triggered to record when a computer is accessed. The camera may be triggered by opening the computer, starting the computer, or logging on. The video may keep a record of the user and time of use. Upon determination of criminal activity, the video may be referenced for prosecution. It may also be used for productivity and for tracking time of use for a public computer system. The user may store personal information on the computer that may be accessed at a later date. For example, if a book is not available at a public library, a customer may scan their library card and enter the book information while their identity is recorded by facial recognition software and stored as metadata on the video recording. Upon the next visit, the facial recognition software recognizes the customer, initiates a triggering event that begins a new recording session under the previously obtained user information, and retrieves the information from the previous session. Without any input from the customer, the customer is alerted that the book is now available.
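The library example above amounts to keying a stored session by a recognition result. The sketch below is a hedged illustration: the face key is a stub string standing in for a real facial-recognition embedding, and the session dictionary stands in for the metadata store.

```python
# Illustrative sketch of the library example: a facial-recognition match
# keys a stored session, so a returning customer's earlier request is
# retrieved without any input. All names here are assumptions.

sessions = {}   # face_key -> data stored during a previous recording session

def begin_session(face_key, request=None):
    """Start a session; return any request stored for this face previously."""
    prior = sessions.get(face_key)
    if request is not None:
        sessions[face_key] = {"request": request}
    return prior

# First visit: the unavailable book is recorded under the recognized face.
begin_session("face:customer-1", request="The Example Book, J. Doe")
# Next visit: recognition alone retrieves the earlier request, and the
# customer can be alerted that the book is now available.
prior = begin_session("face:customer-1")
```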
In a sixth exemplary scenario, embodiments of the invention may be used in a medical context. The video recording may be triggered by an RFID tag associated with a medical practitioner or patient. In an exemplary embodiment, the camera may be a body-mounted camera worn by doctors, nurses or other medical or hospital administrative personnel or staff. The RFID tag may be worn by the patient, attached to the patient's bed, or in the patient's room. When the body-mounted camera comes into proximity with the RFID tag, video recording is triggered to begin, and the medical personnel's interactions with the patients can be recorded. The recording may help provide concrete and credible evidence in medical malpractice cases. As such, the doctor may be able to prove what information was given to the patient, what responses were given by the patient, etc. This may provide a video record of the interactions between doctor and patient. Any information such as medication or patient information, statistics, and vitals may be recorded on the video and may be embedded as metadata.
In yet another exemplary scenario, embodiments of the invention may be used in signaling an emergency. A triggering event may be the pressing of a Code Blue button on medical equipment. Upon the triggering of the Code Blue, the recording device manager may instruct all body-mounted video cameras on the floor or in the room to begin recording. As another example, a defibrillator being removed from its holster may trigger the body-mounted video camera or any other sensors to begin recording. As yet another example, an eye- or hand-washing station within the hospital may be equipped with a sensor that automatically triggers a recording and tags which camera is being used. RFID tags may be placed throughout the hospital to track the locations of doctors so that a record of which doctor was interacting with a patient can be better maintained. Locations, actions, and medical information may all be tracked by the recording device manager and stored as metadata on the video recordings.
Moving now to step 604, the first sensor may generate a first signal to send to a recording device manager. The recording device manager may be standalone or located on, or in, one of the sensors. The recording device manager may contain a data store, a processor, and memory, and may signal other sensors to activate and signal an external data store to store data from the sensors. Continuing with the example from step 602, in which a signal was sent from the camera to the recording device manager, upon receipt of the signal from the camera the images are run through object recognition software and it is determined that the person is carrying a weapon.
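The flow of step 604 can be illustrated with a short sketch, assuming a simple manager that receives the first signal and analyzes its payload. The object-recognition step is stubbed out here with a hypothetical `recognize_objects` function; in practice it would be an actual vision model, which the disclosure does not specify.

```python
def recognize_objects(frame):
    """Stand-in for object recognition software (illustrative assumption)."""
    return frame.get("detected", [])


class RecordingDeviceManager:
    """Minimal sketch of the manager: receives signals and keeps a data store."""

    def __init__(self):
        self.data_store = []

    def receive_first_signal(self, sensor_id, frame):
        """Analyze the first signal's payload and decide whether a
        triggering event (here, a detected weapon) has occurred."""
        objects = recognize_objects(frame)
        triggered = "weapon" in objects
        self.data_store.append({"sensor": sensor_id, "objects": objects})
        return triggered


mgr = RecordingDeviceManager()
mgr.receive_first_signal("camera-1", {"detected": ["person", "weapon"]})
```

The same call with a frame containing no weapon returns no trigger, while still storing the first sensor's data, consistent with the steps that follow.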
In step 606, in some embodiments the recording device manager may signal other sensors to activate. A second signal may be generated at the recording device manager and transmitted to the other sensors. In the event that there is no recording device manager, or that no delay is acceptable, the second signal may be sent from the triggering sensor, or the second signal may be the first signal, sent directly to the recording device manager and all other sensors. Continuing with the example from above, once it is determined that a triggering event has occurred, a signal may be sent from the recording device manager to an electromechanical locking mechanism.
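The fan-out of the second signal in step 606 can be sketched as follows. The `Device` class and device names are illustrative assumptions; the point is only that one broadcast activates every registered sensor and electromechanical device.

```python
class Device:
    """Illustrative sensor or electromechanical device that can be activated."""

    def __init__(self, name):
        self.name = name
        self.active = False

    def receive_signal(self):
        self.active = True


def broadcast_second_signal(devices):
    """Transmit the second signal to every registered device and
    return the names of the devices that activated."""
    for d in devices:
        d.receive_signal()
    return [d.name for d in devices if d.active]


devices = [Device("camera-2"), Device("microphone"), Device("door-lock")]
activated = broadcast_second_signal(devices)
```

In the no-manager variant described above, the triggering sensor itself would simply call the same broadcast routine directly.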
In step 608, electromechanical devices may be activated. The electromechanical devices may actuate physical objects to achieve a purpose in response to the triggering event. Continuing the example above, the electromechanical locking mechanism receives the signal from the recording device manager and locks the door. In some embodiments, examples of electromechanical devices may include actuators, switches, motors, or any other mechanical device that may be operated in response to or by an electrical signal.
In step 610, additional sensor information may be stored, along with any information generated or sensed by the electromechanical devices or resulting from their activities. Continuing the example from above, a sensor may be used to verify that the electromechanical lock secured the door. The sensor may detect that the door is locked and send a signal to the recording device manager.
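The verification step can be sketched with a small reporting function. The `door_state` dictionary and log format are hypothetical; the sketch only shows a sensor reading being checked and the result reported back for storage.

```python
def verify_and_report(door_state, log):
    """Check the lock sensor's reading and record a verification signal
    for the recording device manager (log format is illustrative)."""
    locked = door_state.get("locked", False)
    log.append({"event": "lock-verified", "locked": locked})
    return locked


log = []
result = verify_and_report({"locked": True}, log)
```

A reading that lacks a lock confirmation would be reported the same way, so a failed actuation also leaves a stored record.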
In step 612, the system may store all data from any of the sensors. The system may store data generated at the first sensor, data from the other sensors, and alternative data, which may be stored as metadata. Initiation of the data store may be triggered by the first or second signal. Pre-recording may also be implemented for any sensor, and any recording or pre-recording may take place at the data store or at the sensing device. Continuing the example from above, a triggering event is detected, the lock actuates to lock the door, and the sensor sends verification of the locked door to the recording device manager. Information that the door was locked, and sensed to be locked, at 3:00 pm may then be stored as metadata with the video data from the camera, or with any other data from the sensors.
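The storage step above can be sketched as pairing sensor data with its alternative data. The field names (`video`, `metadata`, `door_locked`, `time`) are illustrative assumptions, not a format defined by the disclosure.

```python
def store_event(data_store, video_data, metadata):
    """Persist sensor data together with its alternative data,
    stored here as a metadata dictionary."""
    entry = {"video": video_data, "metadata": metadata}
    data_store.append(entry)
    return entry


store = []
entry = store_event(
    store,
    video_data=b"...frames...",
    metadata={"door_locked": True, "time": "15:00"},
)
```

The same routine would accept data from any sensor, so the first sensor's recording, the second sensor's recording, and the lock verification can all land in one data store keyed by their metadata.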
In some embodiments of the invention, the steps of flow chart 600 may be interchanged or deleted, or steps may be added. Any action described within a step may be deleted or moved to a different step. For example, in some embodiments the recording device manager may not be necessary to send signals between sensors; a first sensor, such as a camera with a processor, may send a signal to all other sensors and the data store to begin recording data without the need for an intervening recording device manager. The step of storing may begin before or concurrently with sending the first signal, or at any step of the process. In some embodiments, the peripheral sensors may also contain processors that may calculate or process other information, such as information from online databases, to be included in the stored data. Any sensor may send a signal to any electromechanical device and activate it.
Many different arrangements of the various components depicted, as well as components not shown, are possible without departing from the scope of the claims below. Embodiments of the invention have been described with the intent to be illustrative rather than restrictive. Alternative embodiments will become apparent to readers of this disclosure after and because of reading it. Alternative means of implementing the aforementioned can be completed without departing from the scope of the claims below. Certain features and subcombinations are of utility and may be employed without reference to other features and subcombinations and are contemplated within the scope of the claims. Although the invention has been described with reference to the embodiments illustrated in the attached drawing figures, it is noted that equivalents may be employed and substitutions made herein without departing from the scope of the invention as recited in the claims.
This non-provisional patent application claims priority benefit, with regard to all common subject matter, of earlier-filed U.S. Provisional Patent Application No. 62/469,241 filed Mar. 9, 2017 and entitled AUTOMATIC VIDEO TRIGGERING SYSTEM. The identified earlier-filed provisional patent application is hereby incorporated by reference in its entirety into the present application.
Number | Name | Date | Kind |
---|---|---|---|
4409670 | Herndon et al. | Oct 1983 | A |
4789904 | Peterson | Dec 1988 | A |
4863130 | Marks, Jr. | Sep 1989 | A |
4918473 | Blackshear | Apr 1990 | A |
5027104 | Reid | Jun 1991 | A |
5096287 | Kaikinami et al. | Mar 1992 | A |
5111289 | Lucas et al. | May 1992 | A |
5289321 | Secor | Feb 1994 | A |
5381155 | Gerber | Jan 1995 | A |
5408330 | Squicciarii et al. | Apr 1995 | A |
5446659 | Yamawaki | Aug 1995 | A |
5453939 | Hoffman et al. | Sep 1995 | A |
5473729 | Bryant et al. | Dec 1995 | A |
5479149 | Pike | Dec 1995 | A |
5497419 | Hill | Mar 1996 | A |
5526133 | Paff | Jun 1996 | A |
5585798 | Yoshioka et al. | Dec 1996 | A |
5642285 | Woo et al. | Jun 1997 | A |
5668675 | Fredricks | Sep 1997 | A |
5689442 | Swanson et al. | Nov 1997 | A |
5742336 | Lee | Apr 1998 | A |
5752632 | Sanderson et al. | May 1998 | A |
5798458 | Monroe | Aug 1998 | A |
5815093 | Kikinis | Sep 1998 | A |
5850613 | Bullecks | Dec 1998 | A |
5878283 | House et al. | Mar 1999 | A |
5886739 | Winningstad | Mar 1999 | A |
5890079 | Levine | Mar 1999 | A |
5926210 | Hackett et al. | Jul 1999 | A |
5962806 | Coakley et al. | Oct 1999 | A |
5978017 | Tino | Nov 1999 | A |
5983161 | Lemelson et al. | Nov 1999 | A |
5996023 | Winter et al. | Nov 1999 | A |
6008841 | Charlson | Dec 1999 | A |
6028528 | Lorenzetti et al. | Feb 2000 | A |
6052068 | Price R-W et al. | Apr 2000 | A |
6097429 | Seeley et al. | Aug 2000 | A |
6100806 | Gaukel | Aug 2000 | A |
6121881 | Bieback et al. | Sep 2000 | A |
6141609 | Herdeg et al. | Oct 2000 | A |
6141611 | Mackey et al. | Oct 2000 | A |
6163338 | Johnson et al. | Dec 2000 | A |
6175300 | Kendrick | Jan 2001 | B1 |
6298290 | Abe et al. | Oct 2001 | B1 |
6310541 | Atkins | Oct 2001 | B1 |
6314364 | Nakamura | Nov 2001 | B1 |
6324053 | Kamijo | Nov 2001 | B1 |
6326900 | Deline et al. | Dec 2001 | B2 |
6333694 | Pierce et al. | Dec 2001 | B2 |
6333759 | Mazzilli | Dec 2001 | B1 |
6370475 | Breed et al. | Apr 2002 | B1 |
RE37709 | Dukek | May 2002 | E |
6389340 | Rayner | May 2002 | B1 |
6396403 | Haner | May 2002 | B1 |
6405112 | Rayner | Jun 2002 | B1 |
6449540 | Rayner | Sep 2002 | B1 |
6452572 | Fan et al. | Sep 2002 | B1 |
6490409 | Walker | Dec 2002 | B1 |
6518881 | Monroe | Feb 2003 | B2 |
6525672 | Chainer et al. | Feb 2003 | B2 |
6546119 | Ciolli et al. | Apr 2003 | B2 |
6560463 | Santhoff | May 2003 | B1 |
6563532 | Strub et al. | May 2003 | B1 |
6583813 | Enright | Jun 2003 | B1 |
6591242 | Karp et al. | Jul 2003 | B1 |
6681195 | Poland et al. | Jan 2004 | B1 |
6690268 | Schofield et al. | Feb 2004 | B2 |
6697103 | Fernandez et al. | Feb 2004 | B1 |
6718239 | Rayer | Apr 2004 | B2 |
6727816 | Helgeson | Apr 2004 | B1 |
6747687 | Alves | Jun 2004 | B1 |
6748792 | Freund et al. | Jun 2004 | B1 |
6823621 | Gotfried | Nov 2004 | B2 |
6831556 | Boykin | Dec 2004 | B1 |
6856873 | Breed et al. | Feb 2005 | B2 |
6883694 | Abelow | Apr 2005 | B2 |
6970183 | Monroe | Nov 2005 | B1 |
7012632 | Freeman et al. | Mar 2006 | B2 |
7034683 | Ghazarian | Apr 2006 | B2 |
D520738 | Tarantino | May 2006 | S |
7038590 | Hoffman et al. | May 2006 | B2 |
7088387 | Freeman et al. | Aug 2006 | B1 |
7119832 | Blanco et al. | Oct 2006 | B2 |
7126472 | Kraus et al. | Oct 2006 | B2 |
7147155 | Weekes | Dec 2006 | B2 |
7180407 | Guo et al. | Feb 2007 | B1 |
7190822 | Gammenthaler | Mar 2007 | B2 |
7363742 | Nerheim | Apr 2008 | B2 |
7371021 | Ross et al. | May 2008 | B2 |
7421024 | Castillo | Sep 2008 | B2 |
7436143 | Lakshmanan et al. | Oct 2008 | B2 |
7436955 | Yan et al. | Oct 2008 | B2 |
7448996 | Khanuja et al. | Nov 2008 | B2 |
7456875 | Kashiwa | Nov 2008 | B2 |
7496140 | Winningstad et al. | Feb 2009 | B2 |
7500794 | Clark | Mar 2009 | B1 |
7508941 | O'Toole, Jr. et al. | Mar 2009 | B1 |
7536457 | Miller | May 2009 | B2 |
7539533 | Tran | May 2009 | B2 |
7561037 | Monroe | Jul 2009 | B1 |
7594305 | Moore | Sep 2009 | B2 |
7602301 | Stirling et al. | Oct 2009 | B1 |
7656439 | Manico et al. | Feb 2010 | B1 |
7659827 | Gunderson et al. | Feb 2010 | B2 |
7680947 | Nicholl et al. | Mar 2010 | B2 |
7697035 | Suber, III et al. | Apr 2010 | B1 |
7804426 | Etcheson | Sep 2010 | B2 |
7806525 | Howell et al. | Oct 2010 | B2 |
7853944 | Choe | Dec 2010 | B2 |
7944676 | Smith et al. | May 2011 | B2 |
8077029 | Daniel et al. | Dec 2011 | B1 |
8121306 | Cilia et al. | Feb 2012 | B2 |
8175314 | Webster | May 2012 | B1 |
8269617 | Cook et al. | Sep 2012 | B2 |
8314708 | Gunderson et al. | Nov 2012 | B2 |
8350907 | Blanco et al. | Jan 2013 | B1 |
8356438 | Brundula et al. | Jan 2013 | B2 |
8373567 | Denson | Feb 2013 | B2 |
8373797 | Ishii et al. | Feb 2013 | B2 |
8384539 | Denny et al. | Feb 2013 | B2 |
8446469 | Blanco et al. | May 2013 | B2 |
8456293 | Trundle et al. | Jun 2013 | B1 |
8508353 | Cook et al. | Aug 2013 | B2 |
8594485 | Brundula | Nov 2013 | B2 |
8606492 | Botnen | Dec 2013 | B1 |
8676428 | Richardson et al. | Mar 2014 | B2 |
8690365 | Williams | Apr 2014 | B1 |
8707758 | Keays | Apr 2014 | B2 |
8725462 | Jain et al. | May 2014 | B2 |
8744642 | Nemat-Nasser et al. | Jun 2014 | B2 |
8780205 | Boutell et al. | Jul 2014 | B2 |
8781292 | Ross et al. | Jul 2014 | B1 |
8805431 | Vasavada et al. | Aug 2014 | B2 |
8849501 | Cook et al. | Sep 2014 | B2 |
8854199 | Cook et al. | Oct 2014 | B2 |
8887208 | Merrit et al. | Nov 2014 | B1 |
8930072 | Lambert et al. | Jan 2015 | B1 |
8989914 | Nemat-Nasser et al. | Mar 2015 | B1 |
8996234 | Tamari et al. | Mar 2015 | B1 |
9003474 | Smith | Apr 2015 | B1 |
9058499 | Smith | Jun 2015 | B1 |
9122082 | Abreau | Sep 2015 | B2 |
9164543 | Minn et al. | Oct 2015 | B2 |
9253452 | Ross et al. | Feb 2016 | B2 |
9591255 | Sakiewica et al. | Mar 2017 | B2 |
9781348 | Bart | Oct 2017 | B1 |
20020013517 | West et al. | Jan 2002 | A1 |
20020019696 | Kruse | Feb 2002 | A1 |
20020032510 | Tumball et al. | Mar 2002 | A1 |
20020044065 | Quist et al. | Apr 2002 | A1 |
20020049881 | Sugimura | Apr 2002 | A1 |
20020084130 | Der Gazarian et al. | Jul 2002 | A1 |
20020131768 | Gammenthaler | Sep 2002 | A1 |
20020135336 | Zhou et al. | Sep 2002 | A1 |
20020159434 | Gosior et al. | Oct 2002 | A1 |
20020191952 | Fiore et al. | Dec 2002 | A1 |
20030040917 | Fiedler | Feb 2003 | A1 |
20030080713 | Kirmuss | May 2003 | A1 |
20030080878 | Kirmuss | May 2003 | A1 |
20030081121 | Kirmuss | May 2003 | A1 |
20030081934 | Kirmuss | May 2003 | A1 |
20030081935 | Kirmuss | May 2003 | A1 |
20030081942 | Melnyk et al. | May 2003 | A1 |
20030095688 | Kirmuss | May 2003 | A1 |
20030106917 | Shelter et al. | Jun 2003 | A1 |
20030133018 | Ziemkowski | Jul 2003 | A1 |
20030151510 | Quintana et al. | Aug 2003 | A1 |
20030184674 | Manico et al. | Oct 2003 | A1 |
20030185417 | Alattar et al. | Oct 2003 | A1 |
20030215010 | Kashiwa | Nov 2003 | A1 |
20030215114 | Kyle | Nov 2003 | A1 |
20030222982 | Hamdan et al. | Dec 2003 | A1 |
20040008255 | Lewellen | Jan 2004 | A1 |
20040043765 | Tolhurst | Mar 2004 | A1 |
20040119591 | Peeters | Jun 2004 | A1 |
20040143373 | Ennis | Jun 2004 | A1 |
20040141059 | Enright | Jul 2004 | A1 |
20040145457 | Schofield et al. | Jul 2004 | A1 |
20040150717 | Page et al. | Aug 2004 | A1 |
20040168002 | Accarie et al. | Aug 2004 | A1 |
20040199785 | Pederson | Oct 2004 | A1 |
20040223054 | Rotholtz | Nov 2004 | A1 |
20040243734 | Kitagawa et al. | Dec 2004 | A1 |
20040267419 | Jing | Dec 2004 | A1 |
20050030151 | Singh | Feb 2005 | A1 |
20050046583 | Richards | Mar 2005 | A1 |
20050050266 | Haas et al. | Mar 2005 | A1 |
20050068169 | Copley et al. | Mar 2005 | A1 |
20050068417 | Kreiner et al. | Mar 2005 | A1 |
20050083404 | Pierce et al. | Apr 2005 | A1 |
20050094966 | Elberbaum | May 2005 | A1 |
20050100329 | Lao et al. | May 2005 | A1 |
20050101334 | Brown et al. | May 2005 | A1 |
20050134966 | Burgner | May 2005 | A1 |
20050132200 | Jaffe et al. | Jun 2005 | A1 |
20050151852 | Jomppanen | Jul 2005 | A1 |
20050035161 | Shioda | Aug 2005 | A1 |
20050168574 | Lipton | Aug 2005 | A1 |
20050185438 | Ching | Aug 2005 | A1 |
20050206532 | Lock | Sep 2005 | A1 |
20050206741 | Raber | Sep 2005 | A1 |
20050228234 | Yang | Oct 2005 | A1 |
20050232469 | Schofield et al. | Oct 2005 | A1 |
20050243171 | Ross, Sr. et al. | Nov 2005 | A1 |
20050258942 | Manasseh et al. | Nov 2005 | A1 |
20060009238 | Stanco et al. | Jan 2006 | A1 |
20060028811 | Ross, Jr. et al. | Feb 2006 | A1 |
20060055786 | Olilla | Mar 2006 | A1 |
20060158968 | Vanman et al. | Jul 2006 | A1 |
20060164220 | Harter, Jr. et al. | Jul 2006 | A1 |
20060164534 | Robinson et al. | Jul 2006 | A1 |
20060170770 | MacCarthy | Aug 2006 | A1 |
20060176149 | Douglas | Aug 2006 | A1 |
20060183505 | Willrich | Aug 2006 | A1 |
20060193749 | Ghazarian et al. | Aug 2006 | A1 |
20060203090 | Wang et al. | Sep 2006 | A1 |
20060220826 | Rast | Oct 2006 | A1 |
20060225253 | Bates | Oct 2006 | A1 |
20060244601 | Nishimura | Nov 2006 | A1 |
20060256822 | Kwong et al. | Nov 2006 | A1 |
20060270465 | Lee et al. | Nov 2006 | A1 |
20060271287 | Gold et al. | Nov 2006 | A1 |
20060274166 | Lee et al. | Dec 2006 | A1 |
20060274828 | Siemens et al. | Dec 2006 | A1 |
20060276200 | Radhakrishnan et al. | Dec 2006 | A1 |
20060282021 | DeVaul et al. | Dec 2006 | A1 |
20060287821 | Lin | Dec 2006 | A1 |
20060293571 | Bao et al. | Dec 2006 | A1 |
20070021134 | Liou | Jan 2007 | A1 |
20070064108 | Haler | Mar 2007 | A1 |
20070067079 | Kosugi | Mar 2007 | A1 |
20070091557 | Kim et al. | Apr 2007 | A1 |
20070102508 | Mcintosh | May 2007 | A1 |
20070117083 | Winneg et al. | May 2007 | A1 |
20070132567 | Schofield et al. | Jun 2007 | A1 |
20070152811 | Anderson | Jul 2007 | A1 |
20070172053 | Poirier | Jul 2007 | A1 |
20070177023 | Beuhler et al. | Aug 2007 | A1 |
20070199076 | Rensin et al. | Aug 2007 | A1 |
20070229350 | Scalisi et al. | Oct 2007 | A1 |
20070257781 | Denson | Nov 2007 | A1 |
20070257782 | Etcheson | Nov 2007 | A1 |
20070257804 | Gunderson et al. | Nov 2007 | A1 |
20070257815 | Gunderson et al. | Nov 2007 | A1 |
20070260361 | Etcheson | Nov 2007 | A1 |
20070268158 | Gunderson et al. | Nov 2007 | A1 |
20070271105 | Gunderson et al. | Nov 2007 | A1 |
20070274705 | Kashiwa | Nov 2007 | A1 |
20070277352 | Maron et al. | Dec 2007 | A1 |
20070285222 | Zadnikar | Dec 2007 | A1 |
20070287425 | Bates | Dec 2007 | A1 |
20070297320 | Brummette et al. | Dec 2007 | A1 |
20080001735 | Tran | Jan 2008 | A1 |
20080002031 | Cana et al. | Jan 2008 | A1 |
20080002599 | Denny et al. | Feb 2008 | A1 |
20080030580 | Kashhiawa et al. | Feb 2008 | A1 |
20080042825 | Denny et al. | Feb 2008 | A1 |
20080043736 | Stanley | Feb 2008 | A1 |
20080049830 | Richardson | Feb 2008 | A1 |
20080063252 | Dobbs et al. | Mar 2008 | A1 |
20080084473 | Romanowich | Apr 2008 | A1 |
20080100705 | Kister et al. | May 2008 | A1 |
20080101789 | Sharma | May 2008 | A1 |
20080122603 | Piante et al. | May 2008 | A1 |
20080129518 | Carlton-Foss | Jun 2008 | A1 |
20080143481 | Abraham et al. | Jun 2008 | A1 |
20080144705 | Rackin et al. | Jun 2008 | A1 |
20080169929 | Albertson et al. | Jul 2008 | A1 |
20080170130 | Ollila et al. | Jul 2008 | A1 |
20080211906 | Lovric | Sep 2008 | A1 |
20080222849 | Lavoie | Sep 2008 | A1 |
20080239064 | Iwasaki | Oct 2008 | A1 |
20080246656 | Ghazarian | Oct 2008 | A1 |
20080266118 | Pierson et al. | Oct 2008 | A1 |
20080307435 | Rehman | Dec 2008 | A1 |
20080316314 | Bedell et al. | Dec 2008 | A1 |
20090002491 | Haler | Jan 2009 | A1 |
20090002556 | Manapragada et al. | Jan 2009 | A1 |
20090027499 | Nicholl | Jan 2009 | A1 |
20090052685 | Cilia et al. | Feb 2009 | A1 |
20090070820 | Li | Mar 2009 | A1 |
20090122142 | Shapley | May 2009 | A1 |
20090135007 | Donovan et al. | May 2009 | A1 |
20090169068 | Okamoto | Jul 2009 | A1 |
20090189981 | Siann et al. | Jul 2009 | A1 |
20090195686 | Shintani | Aug 2009 | A1 |
20090207252 | Raghunath | Aug 2009 | A1 |
20090213204 | Wong | Aug 2009 | A1 |
20090243794 | Morrow | Oct 2009 | A1 |
20090251545 | Shekarri et al. | Oct 2009 | A1 |
20090252486 | Ross, Jr. et al. | Oct 2009 | A1 |
20090276708 | Smith et al. | Nov 2009 | A1 |
20090294538 | Wihlborg et al. | Dec 2009 | A1 |
20090324203 | Wiklof | Dec 2009 | A1 |
20100045798 | Sugimoto et al. | Feb 2010 | A1 |
20100050734 | Chou | Mar 2010 | A1 |
20100060747 | Woodman | Mar 2010 | A1 |
20100097221 | Kriener et al. | Apr 2010 | A1 |
20100106707 | Brown et al. | Apr 2010 | A1 |
20100118147 | Dorneich et al. | May 2010 | A1 |
20100122435 | Markham | May 2010 | A1 |
20100123779 | Snyder et al. | May 2010 | A1 |
20100177193 | Flores | Jul 2010 | A1 |
20100177891 | Keidar et al. | Jul 2010 | A1 |
20100188201 | Cook et al. | Jul 2010 | A1 |
20100191411 | Cook et al. | Jul 2010 | A1 |
20100194885 | Plaster | Aug 2010 | A1 |
20100217836 | Rofougaran | Aug 2010 | A1 |
20100238009 | Cook et al. | Sep 2010 | A1 |
20100238262 | Kurtz et al. | Sep 2010 | A1 |
20100242076 | Potesta et al. | Sep 2010 | A1 |
20100265331 | Tanaka | Oct 2010 | A1 |
20100274816 | Guzik | Oct 2010 | A1 |
20100287473 | Recesso et al. | Nov 2010 | A1 |
20110006151 | Beard | Jan 2011 | A1 |
20110018998 | Guzik | Jan 2011 | A1 |
20110050904 | Anderson | Mar 2011 | A1 |
20110069151 | Orimoto | Mar 2011 | A1 |
20110084820 | Walter et al. | Apr 2011 | A1 |
20110094003 | Spiewak et al. | Apr 2011 | A1 |
20110098924 | Baladeta et al. | Apr 2011 | A1 |
20110129151 | Saito et al. | Jun 2011 | A1 |
20110261176 | Monaghan, Sr. et al. | Oct 2011 | A1 |
20110281547 | Cordero | Nov 2011 | A1 |
20110301971 | Roesch et al. | Dec 2011 | A1 |
20110314401 | Salisbury et al. | Dec 2011 | A1 |
20120038689 | Ishii | Feb 2012 | A1 |
20120056722 | Kawaguchi | Mar 2012 | A1 |
20120063736 | Simmons et al. | Mar 2012 | A1 |
20120120258 | Boutell et al. | May 2012 | A1 |
20120162436 | Cordell et al. | Jun 2012 | A1 |
20120188345 | Salow | Jul 2012 | A1 |
20120189286 | Takayama et al. | Jul 2012 | A1 |
20120195574 | Wallace | Aug 2012 | A1 |
20120230540 | Calman et al. | Sep 2012 | A1 |
20120257320 | Brundula et al. | Oct 2012 | A1 |
20120268259 | Igel et al. | Oct 2012 | A1 |
20120276954 | Kowalsky | Nov 2012 | A1 |
20130021153 | Keays | Jan 2013 | A1 |
20130033610 | Osborn | Feb 2013 | A1 |
20130035602 | Gemer | Feb 2013 | A1 |
20130080836 | Stergiou et al. | Mar 2013 | A1 |
20130096731 | Tamari et al. | Apr 2013 | A1 |
20130148295 | Minn et al. | Jun 2013 | A1 |
20130222640 | Baek et al. | Aug 2013 | A1 |
20130225309 | Bentley et al. | Aug 2013 | A1 |
20130285232 | Sheth | Oct 2013 | A1 |
20130300563 | Glaze | Nov 2013 | A1 |
20130343571 | Lee | Dec 2013 | A1 |
20140037262 | Sako | Feb 2014 | A1 |
20140049636 | O'Donnell et al. | Feb 2014 | A1 |
20140092299 | Phillips et al. | Apr 2014 | A1 |
20140094992 | Lambert et al. | Apr 2014 | A1 |
20140098453 | Brundula et al. | Apr 2014 | A1 |
20140140575 | Wolf | May 2014 | A1 |
20140170602 | Reed | Jun 2014 | A1 |
20140176733 | Drooker | Jun 2014 | A1 |
20140192194 | Bedell et al. | Jul 2014 | A1 |
20140195105 | Lambert et al. | Jul 2014 | A1 |
20140195272 | Sadiq et al. | Jul 2014 | A1 |
20140210625 | Nemat-Nasser | Jul 2014 | A1 |
20140218544 | Senot et al. | Aug 2014 | A1 |
20140227671 | Olmstead et al. | Aug 2014 | A1 |
20140311215 | Keays et al. | Oct 2014 | A1 |
20140355951 | Tabak | Dec 2014 | A1 |
20150050003 | Ross et al. | Feb 2015 | A1 |
20150050345 | Smyth et al. | Feb 2015 | A1 |
20150051502 | Ross | Feb 2015 | A1 |
20150053776 | Rose et al. | Mar 2015 | A1 |
20150078727 | Ross et al. | Mar 2015 | A1 |
20150088335 | Lambert et al. | Mar 2015 | A1 |
20150103246 | Phillips et al. | Apr 2015 | A1 |
20150163390 | Lee | Jun 2015 | A1 |
20150229630 | Smith | Aug 2015 | A1 |
20150317368 | Rhoads et al. | Nov 2015 | A1 |
20150332424 | Kane et al. | Nov 2015 | A1 |
20150358549 | Cho et al. | Dec 2015 | A1 |
20160042767 | Araya et al. | Feb 2016 | A1 |
20160104508 | Chee et al. | Apr 2016 | A1 |
20160127695 | Zhang et al. | May 2016 | A1 |
20160165192 | Saatchi et al. | Jun 2016 | A1 |
20160358393 | Penland | Dec 2016 | A1 |
20160364621 | Hill et al. | Dec 2016 | A1 |
20170070659 | Kievsky et al. | Mar 2017 | A1 |
20170161382 | Ouimet | Jun 2017 | A1 |
20170178475 | Renkis | Jun 2017 | A1 |
20170195635 | Yokomitsu et al. | Jul 2017 | A1 |
20170230605 | Han et al. | Aug 2017 | A1 |
20170237950 | Araya et al. | Aug 2017 | A1 |
20170244884 | Burtey et al. | Aug 2017 | A1 |
20170277700 | Davis et al. | Sep 2017 | A1 |
20170287523 | Hodulik et al. | Oct 2017 | A1 |
20180023910 | Kramer | Jan 2018 | A1 |
20180050800 | Boykin et al. | Feb 2018 | A1 |
20180053394 | Gersten | Feb 2018 | A1 |
Number | Date | Country |
---|---|---|
102010019451 | Nov 2011 | DE |
2479993 | Jul 2012 | EP |
3073449 | Sep 2016 | EP |
2273624 | Jun 1994 | GB |
2320389 | May 1998 | GB |
2343252 | May 2000 | GB |
2351055 | Dec 2000 | GB |
2417151 | Feb 2006 | GB |
2425427 | Oct 2006 | GB |
2455885 | Jul 2009 | GB |
2485804 | May 2012 | GB |
20090923 | Sep 2010 | IE |
294188 | Sep 1993 | JP |
153298 | Jun 1996 | JP |
198858 | Jul 1997 | JP |
10076880 | Mar 1998 | JP |
210395 | Jul 1998 | JP |
2000137263 | May 2000 | JP |
2005119631 | May 2005 | JP |
20-0236817 | Aug 2001 | KR |
1050897 | Jul 2011 | KR |
2383915 | Mar 2010 | RU |
107851 | Aug 2011 | RU |
124780 | Feb 2013 | RU |
9005076 | May 1990 | WO |
9738526 | Oct 1997 | WO |
9831146 | Jul 1998 | WO |
9948308 | Sep 1999 | WO |
0039556 | Jul 2000 | WO |
0051360 | Aug 2000 | WO |
0123214 | Apr 2001 | WO |
0249881 | Jun 2002 | WO |
02095757 | Nov 2002 | WO |
03049446 | Jun 2003 | WO |
2004036926 | Apr 2004 | WO |
2009013526 | Jan 2009 | WO |
2011001180 | Jan 2011 | WO |
2012037139 | Mar 2012 | WO |
2012120083 | Sep 2012 | WO |
2014000161 | Jan 2014 | WO |
2014052898 | Apr 2014 | WO |
Entry |
---|
Automation Systems Article, Know-How Bank Co. Ltd. Takes Leap Forward as a Company Specializing in R&D and Technology Consulting, published Jan. 2005. |
Car Rear View Camera—Multimedia Rear View Mirror—4′ LCD color monitor, Retrieved from the Internet: <URL: http://web.archive.org/web/20050209014751/http://laipac.com/multimedia-rear-mirror.htm>, Feb. 9, 2005. |
ATC Chameleon. Techdad Review [Online] Jun. 19, 2013 [Retrieved on Dec. 30, 2015]. Retrieved from Internet. <URL:http://www.techdadreview.com/2013/06/19atc-chameleon/>. |
“Breathalyzer.” Wikipedia. Printed Date: Oct. 16, 2014; Date Page Last Modified: Sep. 14, 2014; <http://en.wikipedia.org/wiki/Breathalyzer>. |
Dees, Tim; Taser Axon Flex: The next generation of body camera; <http://www.policeone.com/police-products/body-cameras/articles/527231- 0-TASER-Axon-Flex-The-next-generation-of-body-camera/>, Date Posted: Mar. 12, 2012; Date Printed: Oct. 27, 2015. |
Brown, TP-LINK TL-WDR3500 Wireless N600 Router Review, Mar. 6, 2013. |
Controller Area Network (CAN) Overview, National Instruments White Paper, Aug. 1, 2014. |
Daskam, Samuel W., Law Enforcement Armed Robbery Alarm System Utilizing Recorded Voice Addresses Via Police Radio Channels, Source: Univ. of Ky, Off of Res and Eng., Sery (UKY BU107), pp. 18-22, 1975. |
Digital Ally vs. Taser International, Inc., Case No. 2:16-cv-232 (CJM/TJ); US D. Kan, Defendant Taser International Inc.'s Preliminary Invalidity Contentions, Jul. 5, 2016. |
Electronic Times Article, published Feb. 24, 2005. |
Supplementary European Search Report dated Sep. 28, 2010 in European Patent Application No. 06803645.8; Applicant: Digital Ally, Inc. |
W. Fincham, Data Recorders for Accident Investigation, Monitoring of Driver and Vehicle Performance (Digest No. 1997/122), Publication Date: Apr. 10, 1997, pp. 6/1-6/3. |
Frankel, Harry; Riter, Stephen, Bernat, Andrew, Automated Imaging System for Border Control, Source: University of Kentucky, Office of Engineering Services, (Bulletin) UKY BU, pp. 169-173, Aug. 1986. |
Freudenrich, Craig, Ph.D.; “How Breathalyzers Work—Why Test?.” HowStuff Works. Printed Date: Oct. 16, 2014; Posted Date: Unknown; <http://electronics.howstuffworks.com/gadgets/automotive/breathalyzer1.htm>. |
Hankyung Auto News Article, Know-How Bank's Black Box for Cars “Multi-Black Box,” Copyright 2005. |
Guide to Bluetooth Security: Recommendations of the National Institute of Standards and Technology, National Institute of Standards and Technology, U.S. Dep't of Commerce, NIST Special Publication 800-121, Revision 1 (Jun. 2012). |
ICOP Extreme Wireless Mic, Operation Supplement, Copyright 2008. |
ICOP Model 20/20-W Specifications; Enhanced Digital In-Car Video and Audio recording Systems, date: Unknown. |
ICOP Mobile DVRS; ICOP Model 20/20-W & ICOP 20120 Vision, date: Unknown. |
Bertomen, Lindsey J., PoliceOne.com News; “Product Review: ICOP Model 20/20-W,” May 19, 2009. |
ICOP Raytheon JPS communications, Raytheon Model 20/20-W, Raytheon 20/20 Vision Digital In-Car Video Systems, date: Unknown. |
Overview of the IEEE 802.15.4 standards for Low rate Wireless Personal Area Networks, 2010 7th International Symposium on Wireless Communication Systems (ISWCS), Copyright 2010. |
Lewis, S.R., Future System Specifications for Traffic Enforcement Equipment, S.R. 1 Source: IEE Colloquium (Digest), N 252, Publication Date: Nov. 18, 1996, pp. 8/1-8/2. |
Kopin Corporation; Home Page; Printed Date: Oct. 16, 2014; Posted Date: Unknown; <http://www.kopin.com>. |
Translation of Korean Patent No. 10-1050897, published Jul. 20, 2011. |
Lilliput RV 18-50NP 5″ Rear View Mirror TFT LCD Screen with Camera, Retrieved from the Internet: <URL: http://www.case-mod.com/lilliput-rv1850np-rear-view-mirror-tft-lcd-screen-with-camera-p-1271.html>, Mar. 4, 2005. |
Motor Magazine Article, Recreating the Scene of an Accident, published 2005. |
Renstrom, Joell; “Tiny 3D Projectors Allow You To Transmit Holograms From A Cell Phone.” Giant Freakin Robot. Printed Date: Oct. 16, 2014; Posted Date: Jun. 13, 2014; <http://www.giantfreakinrobot.com/sci/coming-3d-projectors-transmit-holograms-cell-phone.html>. |
Request for Comment 1323 of the Internet Engineering Task Force, TCP Extensions for High Performance, Date: May 1992. |
RevealMedia RS3-SX high definition video recorder, http://www.revealmedia.com/buy-t166/cameras/rs3-sx.aspx, Sep. 26, 2013, Date Posted: Unknown, pp. 1-2. |
Scorpion Micro DV Video Audio Recorder, http://www.leacorp.com/scorpion-micro-dv-video-audio-recorder/, Sep. 26, 2013, Date Posted: Unknown, pp. 1-3. |
“Stalker Press Room—Using In-Car Video, the Internet, and the Cloud to keep police officers safe is the subject of CopTrax live, free webinar.” Stalker. Printed Date: Oct. 16, 2014; Posted Date: Jul. 31, 2014. |
State of Utah Invitation to Bid State Cooperative Contract; Vendor: ICOP Digital, Inc., Contract No. MA503, Jul. 1, 2008. |
Wasson, Brian; “Digital Eyewear for Law Enforcement.” Printed Date: Oct. 16, 2014; Posted Date: Dec. 9, 2013; <http://www.wassom.com/digital-eyewear-for-law-enforcement.html>. |
X26 Taser, Date Unknown. |
Taser International; Taser X26 Specification Sheet, 2003. |
Digital Ally First Vu Mountable Digital Camera Video Recorder, http://www.opticsplanet.com/digital-ally-first-vu-mountable-digital-camera-video-recorder.html?gclid=CIKohcX05rkCFSIo7AodU0IA0g&ef_id=UjCGEAAAAWGEjrQF:20130925155534:s, Sep. 25, 2013, Date Posted: Unknown, pp. 1-4. |
Drift X170, http://driftinnovation.com/support/firmware-update/x170/, Sep. 26, 2013, Date Posted: Unknown, p. 1. |
Dyna Spy Inc. hidden cameras, https://www.dynaspy.com/hidden-cameras/spy-cameras/body-worn-wearable-spy-cameras, Sep. 26, 2013, Date Posted: Unknown, pp. 1-3. |
European Patent Application 15850436.6 Search Report dated May 4, 2018. |
Final Written Decision for Inter Partes Review No. 2017-00375, Axon Enterprise Inc. v. Digital Ally, Inc., issued Jun. 1, 2018. |
Petition for Post Grant Review No. PGR2018-00052, Axon Enterprise, Inc. v. Digital Ally, Inc., filed Mar. 19, 2018. |
MPEG-4 Coding of Moving Pictures and Audio ISO/IEC JTC1/SC29/WG11 N4668 dated Mar. 2002. |
Invalidity Chart for International Publication No. WO2014/000161 Oct. 31, 2017 (Resubmitted). |
Ecplaza HY-001HD law enforcement DVR, http://fireeye.en.ecplaza.net/law-enforcement-dvr--238185-1619696.html, Sep. 26, 2013, Date Posted: Unknown, pp. 1-3. |
Edesix VideoBadge, http://www.edesix.com/edesix-products, Sep. 26, 2013, Date Posted: Unknown, pp. 1-3. |
GoPro Official Website: The World's Most Versatile Camera, http://gopro.com/products/?gclid=CKqHv9jT4rkCFWZk7AodyiAAaQ, Sep. 23, 2013, Date Posted: Unknown, pp. 4-9. |
Isaw Advance Hull HD EXtreme, www.isawcam.co.kr, Sep. 26, 2013, Date Posted: Unknown, p. 1. |
Kustom Signals VieVu, http://www.kustomsignals.com/index.php/mvideo/vievu, Sep. 26, 2013, Date Posted: Unknown, pp. 1-4. |
LEA-AID SCORPION Micro Recorder Patrol kit,http://www.leacorp.com/products/SCORPION-Micro-Recorder-Patrol-kit.html, Sep. 26, 2013, Date Posted: Unknown, pp. 1-2. |
Looxcie Wearable & mountable streaming video cams, http://www.looxcie.com/overview?gclid=CPbDyv6piq8CFWeFQADdlhXC-w, Sep. 26, 2013, Date Posted: Unknown, pp. 1-4. |
Midland XTC HD Video Camera, http://midlandradio.com/Company/xtc100-signup, Sep. 26, 2013, Date Posted: Unknown, pp. 1-3. |
Panasonic Handheld AVCCAM HD Recorder/Player, http://www.panasonic.com/business/provideo/ag-hmr10.asp, Sep. 26, 2013, Date Posted: Unknown, pp. 1-2. |
Notification of Transmittal of the International Search Report and the Written Opinion of the International Search Authority, or the Declaration dated Jan. 30, 2014, International Application No. PCT/US2013/062415; International Filing date Sep. 27, 2013, Applicant: Digital Ally, Inc. |
Point of View Cameras Military & Police, http://pointofviewcameras.com/military-police, Sep. 26, 2013, Date Posted: Unknown, pp. 1-2.
POV.HD System Digital Video Camera, http://www.vio-pov.com/index.php, Sep. 26, 2013, Date Posted: Unknown, pp. 1-3.
Invalidity Chart for International Publication No. WO2014/000161, Oct. 31, 2017.
PCT Patent Application PCT/US17/16383 International Search Report and Written Opinion dated May 4, 2017.
SIV Security in Vehicle Driving Partner, http://www.siv.co.kr/, Sep. 26, 2013, Date Posted: Unknown, p. 1.
Spy Chest Mini Spy Camera / Self Contained Mini camcorder / Audio & Video Recorder, http://www.spytechs.com/spy_cameras/mini-spy-camera.htm, Sep. 26, 2013, Date Posted: Unknown, pp. 1-3.
Stalker VUE Law Enforcement Grade Body Worn Video Camera/Recorder, http://www.stalkerradar.com/law_vue.shtml, Sep. 26, 2013, Date Posted: Unknown, pp. 1-2.
SUV Cam, http://www.elmo.co.jp/suv-cam/en/product/index.html, Sep. 26, 2013, Date Posted: Unknown, p. 1.
TASER AXON Body On Officer Video/Police Body Camera, http://www.taser.com/products/on-officer-video/axon-body-on-officer-video, Sep. 23, 2013, Date Posted: Unknown, pp. 1-8.
TASER AXON Flex On-Officer Video/Police Video Camera, http://www.taser.com/products/on-officer-video/taser-axon, Sep. 26, 2013, Date Posted: Unknown, pp. 1-8.
Taser Cam Law Enforcement Audio/Video Recorder (gun mounted), http://www.taser.com/products/on-officer-video/taser-cam, Sep. 26, 2013, Date Posted: Unknown, pp. 1-3.
Tide Leader police body worn camera, http://tideleader.en.gongchang.com/product/14899076, Sep. 26, 2013, Date Posted: Unknown, pp. 1-3.
uCorder Pockito Wearable Mini Pocket Camcorder, http://www.ucorder.com/, Sep. 26, 2013, Date Posted: Unknown, p. 1.
Veho MUVI HD, http://veho-uk.fastnet.co.uk/main/shop.aspx?category=CAMMUVIHD, Sep. 26, 2013, Date Posted: Unknown, pp. 1-5.
Veho MUVI portable wireless speaker with dock, http://veho-uk.fastnet.co.uk/main/shop.aspx?category=camcorder, Sep. 26, 2013, Date Posted: Unknown, p. 1.
Vidmic Officer Worn Video & Radio Accessories, http://www.vidmic.com/, Sep. 26, 2013, Date Posted: Unknown, p. 1.
VIEVU Products, http://www.vievu.com/vievu-products/vievu-squared/, Sep. 25, 2013, Date Posted: Unknown, pp. 1-2.
WatchGuard CopVu Wearable Video Camera System, http://watchguardvideo.com/copvu/overview, Sep. 26, 2013, Date Posted: Unknown, pp. 1-2.
Witness Cam headset, http://www.secgru.com/DVR-Witness-Cam-Headset-Video-Recorder-SG-DVR-1-COP.html, Sep. 26, 2013, Date Posted: Unknown, pp. 1-2.
WolfCom 3rd Eye, X1 A/V Recorder for Police and Military, http://wolfcomusa.com/Products/Products.html, Sep. 26, 2013, Date Posted: Unknown, pp. 1-3.
Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration, dated Jan. 14, 2016; International Application No. PCT/US2015/056039; International Filing Date: Oct. 16, 2015; Applicant: Digital Ally, Inc.
U.S. Appl. No. 13/959,142 Final Office Action dated Jul. 20, 2016.
U.S. Appl. No. 13/959,142 Office Action dated Nov. 3, 2015.
Digital Ally, Inc. vs. Taser International, Inc., Case No. 2:16-cv-020232 (CJM/TJ); U.S. D. Kan., Complaint for Patent Infringement, Jan. 14, 2016.
Digital Ally, Inc. vs. Enforcement Video LLC d/b/a WatchGuard Video, Case No. 2:16-cv-02349 (CJM/TJ); U.S. D. Kan., Complaint for Patent Infringement, May 27, 2016.
International Association of Chiefs of Police Digital Video System Minimum Specifications, Nov. 21, 2008.
Petition for Inter Partes Review No. IPR2017-00375, Taser International, Inc. v. Digital Ally, Inc., filed Dec. 1, 2016.
Petition for Inter Partes Review No. IPR2017-00376, Taser International, Inc. v. Digital Ally, Inc., filed Dec. 1, 2016.
Petition for Inter Partes Review No. IPR2017-00515, Taser International, Inc. v. Digital Ally, Inc., filed Jan. 11, 2017.
Petition for Inter Partes Review No. IPR2017-00775, Taser International, Inc. v. Digital Ally, Inc., filed Jan. 25, 2017.
PCT Patent Application PCT/US16/34345 International Search Report and Written Opinion dated Dec. 29, 2016.
State of Utah Invitation to Bid State Cooperative Contract; Vendor: Kustom Signals Inc., Contract No. MA1991, Apr. 25, 2008.
U.S. Appl. No. 15/011,132 Office Action dated Apr. 18, 2016, 19 pages.
Zepcam Wearable Video Technology, http://www.zepcam.com/product.aspx, Sep. 26, 2013, Date Posted: Unknown, pp. 1-2.
New Rearview-Mirror-Based Camera Display Takes the Guesswork Out of Backing Up, Retrieved from the Internet: <URL: http://news.thomasnet.com/fullstory/497750>, Press Release, Oct. 30, 2006.
SIIF Award for Multi Black Box, published Dec. 10, 2004.
Near Field Communication; Sony Corporation; pp. 1-7, Date: Unknown.
Oregon Scientific ATC Chameleon Dual Lens HD Action Camera, http://www.oregonscientificstore.com/Oregon-Scientific-ATC-Chameleon-Dual-Lens-HD-Action-Camera.data, Date Posted: Unknown; Date Printed: Oct. 13, 2014, pp. 1-4.
Asian Wolf High Quality Angel Eye Body Video Spy Camera Recorder System, http://www.asianwolf.com/covert-bodycam-hq-angeleye.html, Sep. 26, 2013, Date Posted: Unknown, pp. 1-3.
Amazon.com wearable camcorders, http://www.amazon.com/s/ref=nb_sb_ss_i_0_4?url=search-alias%3Dphoto&field-keywords=wearable+camcorder&x=0&y=0&sprefix=wear, Sep. 26, 2013, Date Posted: Unknown, pp. 1-4.
Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration, dated Feb. 4, 2016; International Application No. PCT/US2015/056052; International Filing Date: Oct. 16, 2015; Applicant: Digital Ally, Inc.
http://www.k-h-b.com/board/board.php?board=products01&comand=body&no=1, Current State of Technology Held by the Company, Copyright 2005.
City of Pomona Request for Proposals for Mobile Video Recording System for Police Vehicles, dated prior to Apr. 4, 2013.
http://www.k-h-b.com/sub1_02.html, Copyright 2005.
| Number | Date | Country |
| --- | --- | --- |
| 20180262724 A1 | Sep 2018 | US |

| Number | Date | Country |
| --- | --- | --- |
| 62469241 | Mar 2017 | US |