Systems, apparatuses, and methods for controlling audiovisual apparatuses

Information

  • Patent Grant
  • 10165171
  • Patent Number
    10,165,171
  • Date Filed
    Sunday, January 22, 2017
    7 years ago
  • Date Issued
    Tuesday, December 25, 2018
    6 years ago
Abstract
Systems and methods for automatically controlling audiovisual apparatuses. A portable camera and a docking module configured to wirelessly trigger the camera to activate/deactivate the buffering and storage of data captured by the camera.
Description
STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH

Not Applicable.


TECHNICAL FIELD OF THE INVENTION

This disclosure relates generally to techniques for controlling audiovisual apparatuses. More particularly, but not by way of limitation, this disclosure relates to systems and methods for automatic wireless activation and control of portable audiovisual devices.


BACKGROUND

Today's law enforcement officers have various means of technology at their disposal to perform their tasks. Police vehicles are typically equipped with video/audio equipment that captures on-scene information that is commonly used as evidence in legal proceedings. More recently, officers have begun to use body-worn-cameras (BWC) to capture on-scene audio and video while on patrol. However, while technology has provided law enforcement officers powerful tools to perform their jobs, it has also added a level of complexity for officers on patrol.


An officer on patrol performs a number of tasks in addition to controlling the vehicle, including addressing bulletins and communications, running checks on license plate numbers, scanning for identified suspects and vehicles, etc. The BWCs present an additional peace of gear that the officer has to contend with. In the heat of a sudden emergency, the officer may not always remember to activate his BWC. Thus, while modern technology has provided law enforcement officers better tools to perform their jobs, the tools still have to be activated and operated. In addition to law enforcement, other institutions and establishments (e.g., armored car officers, emergency responders, firemen, inspectors, interviewers, etc.) can make use of BWCs.


A need remains for techniques to improve the operation and control of audiovisual technology as used for law enforcement and other functions.


SUMMARY

In view of the aforementioned problems and trends, embodiments of the present invention provide systems and methods for automatically controlling one or more audiovisual apparatuses such as camera devices that capture data (audio, video, and metadata).


According to an aspect of the invention, a method includes wirelessly linking a portable camera with a docking module disposed in a vehicle, wherein the camera is disposed remote from the docking module and wherein the portable camera is configured to capture image data and is configured with a buffer to temporarily hold captured image data and a memory to store captured image data; and using the docking module, sending a wireless command to the portable camera if a specified condition is met, wherein the command causes performance of one or more actions selected from the group consisting of: (a) causing image data captured by the portable camera to be temporarily held in the buffer; (b) causing image data captured by the portable camera not to be held in the buffer; (c) causing image data captured by the portable camera to be stored in the memory; and (d) causing image data captured by the portable camera not to be stored in the memory.


According to another aspect of the invention, a method includes wirelessly linking a portable camera with a docking module disposed in a vehicle, wherein the camera is disposed remote from the docking module and wherein the portable camera is configured to capture image data and temporarily hold the captured image data in a buffer in a continuous circulating stream; and using the docking module, sending a wireless command to the portable camera if a specified condition is met, wherein the command causes performance of one or more actions selected from the group consisting of: (a) causing image data captured by the portable camera not to be held in the buffer; (b) causing image data held in the buffer to be transferred to a memory in the portable camera; (c) causing image data captured by the portable camera to be stored in the memory; and (d) causing image data captured by the portable camera not to be stored in the memory.


According to another aspect of the invention, a system includes a docking module disposed in a vehicle; a portable camera disposed remote from the docking module; wherein the portable camera is wirelessly linked with the docking module and configured to capture image data and configured with a buffer to temporarily hold captured image data and a memory to store captured image data; and wherein the docking module is configured to send a wireless command to the portable camera if a specified condition is met, wherein the command is to cause performance of one or more actions selected from the group consisting of: (a) to cause image data captured by the portable camera to be temporarily held in the buffer; (b) to cause image data captured by the portable camera not to be held in the buffer; (c) to cause image data captured by the portable camera to be stored in the memory; and (d) to cause image data captured by the portable camera not to be stored in the memory.


Other aspects of the embodiments described herein will become apparent from the following description and the accompanying drawings, illustrating the principles of the embodiments by way of example only.





BRIEF DESCRIPTION OF THE DRAWINGS

The following figures form part of the present specification and are included to further demonstrate certain aspects of the present claimed subject matter, and should not be used to limit or define the present claimed subject matter. The present claimed subject matter may be better understood by reference to one or more of these drawings in combination with the description of embodiments presented herein. Consequently, a more complete understanding of the present embodiments and further features and advantages thereof may be acquired by referring to the following description taken in conjunction with the accompanying drawings, in which like reference numerals may identify like elements, wherein:



FIG. 1, in accordance with some embodiments of the present disclosure, depicts a vehicle configured with a camera device and a docking module;



FIG. 2, in accordance with some embodiments of the present disclosure, depicts a portable camera;



FIG. 3, in accordance with some embodiments of the present disclosure, depicts a communication scheme, specifically between a police vehicle with an onboard docking module and an officer with a body worn camera;



FIG. 4, in accordance with some embodiments of the present disclosure, depicts a perspective view of a docking module;



FIG. 5, in accordance with some embodiments of the present disclosure, depicts a top view of a docking module;



FIG. 6, in accordance with some embodiments of the present disclosure, depicts a schematic of a portable camera and a docking module for docking the camera;



FIG. 7, in accordance with some embodiments of the present disclosure, depicts a schematic of a portable camera docked in a docking module;



FIG. 8, in accordance with some embodiments of the present disclosure, depicts a block diagram of the circuitry and components of a docking module;



FIG. 9 is a flow chart depicting, at a top level, a method in accordance with some embodiments of the present disclosure; and



FIG. 10 is a flow chart depicting, at a top level, another method in accordance with some embodiments of the present disclosure.





NOTATION AND NOMENCLATURE

Certain terms are used throughout the following description and claims to refer to particular system components and configurations. As one skilled in the art will appreciate, the same component may be referred to by different names. This document does not intend to distinguish between components that differ in name but not function. In the following discussion and in the claims, the terms “including” (and the like) and “comprising” (and the like) are used in an open-ended fashion, and thus should be interpreted to mean “including, but not limited to . . . .” Also, the term “couple,” “coupled,” or “linked” is intended to mean either an indirect or direct electrical, mechanical, or wireless connection. Thus, if a first device couples to or is linked to a second device, that connection may be through a direct electrical, mechanical, or wireless connection, or through an indirect electrical, mechanical, or wireless connection via other devices and connections.


As used throughout this disclosure the term “computer” encompasses special purpose microprocessor-based devices such as a digital video surveillance system primarily configured for executing a limited number of applications, and general purpose computers such as laptops, workstations, or servers which may be configured by a user to run any number of off the shelf or specially designed software applications. Computer systems and computer devices will generally interact in the same way with elements and aspects of disclosed embodiments. This disclosure also refers to memory or storage devices and storage drives interchangeably. In general, memory or a storage device/drive represents a medium accessible by a computer (via wired or wireless connection) to store data and computer program instructions. It will also be appreciated that use of the term “microprocessor” in this disclosure encompasses one or more processors.


The terms “video data” and “visual data” refer to still image data, moving image data, or both still and moving image data, as traditionally understood. Further, the terms “video data” and “visual data” refer to such image data alone, i.e., without audio data and without metadata. The term “image data” (in contrast to “still image data” and “moving image data”) encompasses not only video or visual data but also audio data and/or metadata. That is, image data may include visual or video data, audio data, metadata, or any combination of these three. This image data may be compressed using industry standard compression technology (e.g., Motion Picture Expert Group (MPEG) standards, Audio Video Interleave (AVI), etc.) or another proprietary compression or storage format. The terms “camera,” “camera device,” and the like are understood to encompass devices configured to record or capture visual/video data or image data. Such devices may also be referred to as video recording devices, image capture devices, or the like. Metadata may be included in the files containing the video (or audio and video) data or in separate, associated data files, that may be configured in a structured text format such as eXtensible Markup Language (XML).


As used throughout this disclosure the term “record” is interchangeable with the term “store” and refers to the retention of image data in a storage medium designed for long-term retention (e.g., solid state memory, hard disk, CD, DVD, memory card, etc.), as compared to the temporary retention offered by conventional memory means such as volatile RAM. The temporary retention of data, image data or otherwise, is referred to herein as the “holding” of data or as data being “held.”


The term “metadata” refers to information associated with the recording of video (or audio and video) data, or information included in the recording of image data, and metadata may contain information describing attributes associated with one or more acts of actual recording of video data, audio and video data, or image data. That is, the metadata may describe who (e.g., Officer ID) or what (e.g., automatic trigger) initiated or performed the recording. The metadata may also describe where the recording was made. Metadata may also include telemetry or other types of data. For example, location may be obtained using global positioning system (GPS) information or other telemetry information. The metadata may also describe why the recording was made (e.g., event tag describing the nature of the subject matter recorded). The metadata may also describe when the recording was made, using timestamp information obtained in association with GPS information or from an internal clock, for example, for the first frame of a recording or each individual frame may also have time information inserted that can be used to synchronize multiple file playback from various sources after the data has been transferred to a storage location. Metadata may also include information relating to the device(s) used to capture or process information (e.g. a unit serial number). From these types of metadata, circumstances that prompted the recording may be inferred and may provide additional information about the recorded information. This metadata may include useful information to correlate recordings from multiple distinct recording systems. This type of correlation information may assist in many different functions (e.g., query, data retention, chain of custody, and so on).


As used throughout this disclosure the term “portable” refers to the ability to be easily carried or moved. The term encompasses a wearable device (i.e. a device that can be worn or carried by a person or an animal).


DETAILED DESCRIPTION

The foregoing description of the figures is provided for the convenience of the reader. It should be understood, however, that the embodiments are not limited to the precise arrangements and configurations shown in the figures. Also, the figures are not necessarily drawn to scale, and certain features may be shown exaggerated in scale or in generalized or schematic form, in the interest of clarity and conciseness. The same or similar parts may be marked with the same or similar reference numerals.


While various embodiments are described herein, it should be appreciated that the present invention encompasses many inventive concepts that may be embodied in a wide variety of contexts. The following detailed description of exemplary embodiments, read in conjunction with the accompanying drawings, is merely illustrative and is not to be taken as limiting the scope of the invention, as it would be impossible or impractical to include all of the possible embodiments and contexts of the invention in this disclosure. Upon reading this disclosure, many alternative embodiments of the present invention will be apparent to persons of ordinary skill in the art. The scope of the invention is defined by the appended claims and equivalents thereof.


Illustrative embodiments of the invention are described below. In the interest of clarity, not all features of an actual implementation are necessarily described for each embodiment disclosed in this specification. In the development of any such actual embodiment, numerous implementation-specific decisions may need to be made to achieve the design-specific goals, which may vary from one implementation to another. It will be appreciated that such a development effort, while possibly complex and time-consuming, would nevertheless be a routine undertaking for persons of ordinary skill in the art having the benefit of this disclosure. It will also be appreciated that the parts and component dimensions of the embodiments disclosed herein may not be drawn to scale.



FIG. 1 depicts an embodiment of this disclosure. A vehicle 10 (e.g. a police car) is equipped with a docking module 12 and a portable camera 14. Embodiments can be implemented with the docking module 12 and portable camera 14 disposed on any type of vehicle. FIG. 2 depicts a closer view of a portable camera 14 embodiment of this disclosure. In some embodiments, the camera 14 is configured with a record activation switch 16, which permits a user to manually deactivate or activate the camera to record data captured via the lens 18 and microphone 20. Some embodiments may be configured to respectively allow for separate manual activation/deactivation of the microphone 20 and lens 18, respectively, permitting a user to capture only audio data or only image data as desired. Some embodiments may also include a programmable function button 19 that provides a user the ability to select among different programmed/programmable modes of operation. The camera 14 is configured with an internal buffer 22 (e.g. RAM) to temporarily hold captured image data and memory 24 (e.g. hard disk) to store captured image data. Some embodiments may also include an audio buzzer 23 to provide an audible indication during various modes of operation. For example, the buzzer 23 may be configured for activation: when camera 14 recording starts or stops, to provide a camera low battery alert, to provide a camera memory full alert, to indicate successful camera pairing with another device, to provide warning beeps that may be sent from another device, etc. It will be appreciated by those skilled in the art that camera 14 embodiments of this disclosure can be implemented with various types of additional sensors to capture/store desired information (e.g. temperature) and with conventional data storage means as known in the art. Embodiments of the camera 14 are also equipped with internal Bluetooth® circuitry and associated electronics to permit wireless communication and/or signal transfer to or from the camera. Bluetooth® pairing may be manually activated by a button 26 on the camera 14. Some embodiments may also include a light-emitting diode (LED) 27 to indicate when the camera 14 is recording or performing other functions. Suitable camera devices 14 that may be used to implement embodiments of this disclosure include the devices commercially available from COBAN Technologies, Inc., in Houston, Tex. (http//www.cobantech.com).


Although the embodiment depicted in FIG. 1 shows the camera 14 disposed in a vehicle 10, other embodiments of this disclosure provide implementations with a fully portable camera. FIG. 3 depicts the camera 14 being worn by a user (e.g. an officer) as a BWC. This implementation provides a user the ability to capture on-scene image data with the camera 14 when the user is not in the vehicle 10. For law enforcement, the wearable camera 14 provides the officer complete freedom to move in and out of wireless communication range with the docking module 12.


When not in use as a BWC, the camera 14 is held in the docking module 12 in the vehicle 10. FIG. 4 depicts an embodiment of the docking module 12. The docking module 12 provides a slot 28 to cradle and hold the camera 14. In some embodiments, the docking module 12 is configured with an articulating backing plate 30 that is held against the back panel 31 of the module in the open position with embedded magnets 32, as depicted in FIG. 5. When the camera 14 is inserted into the docking module 12, the camera pushes down on pivoting feet on the bottom of the plate 30, which moves the plate forward. The docking module 12 is also equipped with a Pogo pin connection base 35 that accepts the Pogo pins 37 at the bottom of the camera 14 (depicted in FIG. 6) to form an electrical connection. In some embodiments, the camera 14 may be configured with magnets on a mount plate on the back side (not shown) that assist in pulling the plate 30 forward, holding it in place to lock the camera 14 in the docking module 12. FIG. 5 shows an opening 34 in the plate 30 that accepts a protruding stud 36 on the back of the camera 14 (depicted in FIG. 6) when the camera 14 is cradled in the docking module 12, providing additional support to keep the camera 14 locked in under vibration and shock in a mobile environment. FIG. 6 depicts the docking module 12 with the backing plate 30 held against the back panel 31 via the magnets 32 (FIG. 5), ready to receive the camera 14. FIG. 7 depicts the camera 14 housed in the docking module 12. Embodiments of the docking module 12 provide a secure mounting mechanism to hold the camera 14 stable while in the vehicle 10, while at the same time facilitating one-hand operation for the removal of the camera 14. Suitable docking modules 12 that may be used to implement embodiments of this disclosure include the devices commercially available from COBAN Technologies, Inc., in Houston, Tex. (http//www.cobantech.com).


In some embodiments, the docking module 12 is configured with Bluetooth® circuitry, microprocessors, and electronics to implement the aspects and features disclosed herein. In some embodiments, the docking module 12 is configured with one or more cable connectors 38 (FIG. 5) to connect the module to an onboard computer 29 (FIG. 1) in the vehicle 10. In other embodiments, the computer 29 may be configured with Bluetooth® circuitry, software, and electronics to implement the aspects and features disclosed herein. It will be appreciated by those skilled in the art that other wireless communication standards may be used in implementations of the embodiments disclosed herein (e.g., RuBee, Wi-Fi, 3G, 4G, LTE, etc.).


Embodiments of the docking module 12 are configured to automatically send wireless commands to the portable camera 14 when certain specified conditions are met. The commands cause performance of one or more actions in the camera 14, including: (a) causing image data captured by the camera 14 to be temporarily held in the buffer 22; (b) causing image data captured by the camera 14 not to be held in the buffer 22; (c) causing image data captured by the camera 14 to be stored in memory 24; and (d) causing image data captured by the camera 14 not to be stored in memory 24. Additional description of the docking module 12 command structure is provided in the following disclosure.


In some embodiments, the specified condition that triggers the docking module 12 to send a command to the camera 14 is an input signal received by the docking module 12 from one or more sensors 40 mounted in the vehicle (see FIG. 1). One sensor 40 may be a sensor coupled into or connected to the output of the light bar 42 circuitry and configured to send a signal (e.g. a 12 V DC signal via wiring) to the docking module 12 when the light bar is activated. Another sensor 40 may be a sensor disposed on a gun rack 44 in the vehicle and configured to send a signal (e.g. a 12 V DC signal via wiring) to the docking module 12 when a shotgun is removed from the rack 44. Another sensor 40 may be a vehicle 10 door switch that activates interior lighting in the vehicle when a door is opened. The door switch circuitry may be coupled to the docking module 12 and configured to send a signal (e.g. a 12 V DC signal via wiring) to the docking module 12 when a door is opened. The vehicle 10 may also be equipped with a Controller Area Network (CAN) bus coupled with the docking module 12 to provide the module with input data and signals from the devices and sensors on the vehicle 10, such as the signal indicating a door has been opened. It will be appreciated by those skilled in the art that other sensors (e.g. engine ignition sensor, siren sensor, voice activation sensor, crash detection sensor, etc.) and input signals may be used with implementations of embodiments of this disclosure. Thus, the specified condition that triggers docking module 12 to send a command to camera 14 may be (receipt by the docking module 12 of a sensor signal indicating): activation of the light bar 42; deactivation of the light bar 42; removal of a gun from the gun rack 44; return of a gun to the gun rack 44; opening of a door of the vehicle; closing of a door of the vehicle; activation of the siren; deactivation of the siren; turning on of the engine; turning off of the engine; or another condition. The specified condition that triggers docking module 12 to send a command to camera 14 may also be (receipt by the docking module 12 of a plurality of sensor signals indicating) a combination of any two or more of the afore-mentioned conditions.


In some embodiments, the specified condition that triggers the docking module 12 to send a command to the camera 14 is when the distance between the location of the camera 14 and the location of the docking module 12 satisfies (i.e., meets or exceeds) a threshold (i.e., a threshold minimum distance or a threshold maximum distance). The docking module 12 and/or the camera 14 can be configured with GPS circuitry and software to automatically calculate the proximity of the camera 14 to the docking module 12. In some embodiments, the software and electronics in the camera 14 and/or the docking module 12 may be configured to use the communication signal (e.g. Bluetooth® signal) to calculate the proximity of the camera 14 to the docking module 12. Other embodiments may be configured with conventional means to calculate the proximity of the portable camera 14 to the docking module 12 as known in the art.


In some embodiments, the specified condition that triggers the docking module 12 to send a command to the camera 14 is when the velocity of the vehicle 10 satisfies (i.e., meets or exceeds) a threshold (i.e., a threshold minimum velocity or a threshold maximum velocity). Velocity data from the vehicle 10 speedometer may be sent to the docking module 12 via the CAN bus. In some embodiments, the velocity data can also be provided from the camera 14 or an onboard GPS.


As previously discussed, the docking module 12 is configured to automatically send wireless commands to the camera 14 to cause performance of one or more actions in the camera when the module is triggered by a specified condition as determined by the disclosed means. In some embodiments, the image data captured by the camera 14 is temporarily held in the buffer 22 in a continuous circulating stream to perform “pre-event” circular buffering, not storing the data to memory 24 until activated to store the data to memory 24 by a wireless command from the docking module 12. This “smart buffering” feature provides a circular buffer that temporarily holds captured image data until the docking module 12 sends a wireless command causing performance of one or more actions in the camera 14 as disclosed herein. The software of the docking module 12 can be configured to send commands to the camera 14 based on any desired configuration of the specified conditions, which configurations can include:

    • (i) Distance-based buffering—Causing image data captured by the camera 14 not to be held in the buffer 22 if the camera is within a preset specified threshold distance or range from the docking module 12 (e.g. this condition may be satisfied when an officer is sitting in the vehicle). This feature avoids the temporary holding of needless data to the buffer 22, improving overall efficiency and conserving unit power. In some embodiments, when the camera 14 is beyond a preset specified threshold distance or range from the docking module 12, the camera 14 automatically starts holding captured image data in the buffer 22. In some embodiments, when the camera 14 is beyond a preset specified threshold distance or range from the docking module 12, the camera 14 automatically starts storing captured image data to memory 24. In such embodiments, the camera 14 may be configured to require manual deactivation of the image data storing via the record activation/deactivation switch 16 on the camera.
    • (ii) Sensor signal-based recording—Causing image data captured by the camera 14 not to be held in the buffer 22 if the camera is within a preset specified threshold distance or range from the docking module 12, until the module detects a signal input from a sensor 40 (e.g., lightbar activation, gun rack signal, door opened). If a sensor 40 signal input is detected, the docking module 12 automatically sends a command to the camera 14 to start storing to memory 24 the image data captured by the camera 14. In some embodiments, if a sensor 40 signal input is detected, the docking module 12 automatically sends a command to the camera 14 to transfer or flush to memory 24 any image data that the buffer 22 may be holding and to start storing to memory the image data captured by the camera.
    • (iii) Vehicle velocity-conditioned sensor signal-based buffering—Causing image data being captured by the camera 14 not to be held in the buffer 22 and not to be stored to memory 24 if the camera is within a preset specified threshold distance or range from the docking module 12 and the vehicle 10: (i) starts moving, (ii) reaches a specified threshold minimum velocity (e.g. 15 MPH), or (iii) exceeds a specified threshold maximum velocity. While in this mode, if the docking module 12 detects a signal input from a sensor 40 (e.g., lightbar activation, gun rack signal, door opened), the module automatically sends a command to the camera 14 to start storing to memory the image data captured by the camera. In some embodiments, while in this mode, the docking module 12 automatically sends a command to the camera 14 to transfer or flush to memory 24 any image data that the buffer 22 may be holding and to start storing to memory the image data captured by the camera 14.
    • (iv) Distance-conditioned sensor signal-based recording—Causing image data captured by the camera 14 not to be stored in memory 24 if the camera is within a preset specified threshold distance or range from the docking module 12. In this inhibiting mode, so long as the camera 14 is within the specified threshold range, the camera 14 will not resume recording to memory 24, regardless of the sensor 40 signal(s). Once outside the specified threshold range, if a sensor 40 signal is detected, then the camera 14 resumes its storing to memory 24 function.


In some embodiments, the camera 14 is configured to bypass any commands from the docking module 12 and continue storing captured image data to memory 24 if the camera has been activated to store data when the camera is outside of the vehicle 10. For example, when an officer manually activates the camera 14 to store data to memory 24 as he is approaching the vehicle 10. In this mode, the camera 14 will continue to record to memory 24 until the officer deactivates recording manually, regardless of any sensor 40 signal inputs or the satisfaction of specified conditions.


It will be appreciated by those having the benefit of this disclosure that the docking module 12 and camera 14 embodiments can be configured to operate using commands and performing actions based on other configurations of specified conditions and using signal inputs originating from other sensors in the vehicle or outside of the vehicle (not shown). Generally speaking, any command among those described herein may be sent by the docking module 12 to the camera 14 upon any of the following conditions being satisfied: a change in distance between camera 14 and docking module 12; a change in velocity of the vehicle 10 in which the docking module 12 resides; a change in the vehicle 10 acceleration exceeding a threshold; receipt of a sensor 40 signal by the docking module 12 indicating any of the conditions mentioned above (pertaining to the light bar, gun rack, door, siren, engine, voice activation, a crash detection sensor, etc.); any combination of any two or more of the foregoing conditions.


Turning to FIG. 8, a block diagram 50 of the circuitry and components within a docking module 12 embodiment is depicted. Block 52 depicts the power management components (e.g. input 10-30 V DC with over-voltage/surge protection). Block 54 depicts general-purpose input/output (GPIO) circuitry with I/O pigtail connections 56 providing isolated I/O including: power ground, power voltage, sensor 40 inputs (e.g., light bar, gun rack, door, voice activation, crash detection, etc.), and voltage output. Block 58 depicts RS-232 connections comprising: port connect to in-car video system (ICV) as a Near-Field Communication (NFC) login function, port connect to CAN bus for velocity data (disable/enable function combined with proximity detect/velocity detect). Block 60 depicts a USB HUB IC. A first portable camera 14 is docked in docking module 12. A USB cable can connect to the ICV or a laptop computer for data upload/exchange. The docking module 12 may be configured with LEDS to indicate: detection/sync of first camera, detection/sync of second camera, power, light bar, Aux, data exchange via USB such as upload/download or firmware updates. Block 62 depicts a NFC reader module. The NFC reader module may be equipped with an internal NFC antenna (for pairing and ICV login). Block 64 depicts a Bluetooth® radio. The docking module 12 may also be equipped with an external antenna pigtail connection 66 in association with the Bluetooth® radio. Block 68 depicts a buzzer to provide feedback for NFC read or fail. The docking module 12 shown on the left-hand side of FIG. 8 represents the main dock with wireless components, USB hub and power distribution to the ICV or a laptop. The camera 14 is docked to the docking module 12 via a Pogo pin connection (see FIGS. 5-6) for data transfer and charging. A second portable camera 14′ may be linked in with a detachable docking module 12′, as shown on the right-hand side of FIG. 8. Both cameras 14 and 14′ may be linked to the HUB to connect to the ICV over a single USB. The second docking module 12′ provides the basic functionality (charging, USB connectivity to the ICV or a laptop, data upload/exchange). Both docking module 12 and docking module 12′ may be equipped with a power on/off button and a backlight on/off button (not shown). The general functions provided by the disclosed docking module 12 embodiments include: NFC Bluetooth® auto-pairing, received signal strength indication (RSSI) based proximity/range detection for the linked cameras 14, proximity and/or condition-based smart buffering control (enable/disable circular buffer for pre-event hold), record inhibit or record activation functions based on proximity and condition-based record control, and Pogo pin data exchange via USB and RS-232 connections. The Bluetooth® communications protocol includes available status information sent from the camera 14 to the docking module 12 (e.g., camera firmware version, hardware version, storage status, battery status, record on/off status, mute on/off status, recording resolution, camera IR LED status, etc.). In addition to sending the wireless commands described above, the docking module 12 may also be configured to remotely control other camera functions (e.g., turn on/off mute, turn on/off Wi-Fi, turn on/off camera IR LEDs, etc.).



FIG. 9 is a flow chart depicting a method 100 according to an embodiment of this disclosure. At step 110, a portable camera 14 is wirelessly linked with a docking module 12 disposed in a vehicle 10. The portable camera 14 is disposed remote from the docking module 12 and configured: to capture image data, with a buffer 22 to temporarily hold captured image data, and with a memory 24 to store captured image data. At step 120, using the docking module 12, a wireless command is sent to the portable camera 14 if a specified condition is met. The command causes performance of one or more actions selected from the group consisting of: (a) causing image data captured by the portable camera to be temporarily held in the buffer; (b) causing image data captured by the portable camera not to be held in the buffer; (c) causing image data captured by the portable camera to be stored in the memory; and (d) causing image data captured by the portable camera not to be stored in the memory. This method may be implemented using the techniques and embodiments disclosed herein.



FIG. 10 is a flow chart depicting a method 200 according to an embodiment of this disclosure. At step 210, a portable camera 14 is wirelessly linked with a docking module 12 disposed in a vehicle 10. The portable camera 14 is disposed remote from the docking module 12 and configured to capture image data and temporarily hold the captured image data in a buffer 22 in a continuous circulating stream. At step 220, using the docking module, a wireless command is sent to the portable camera if a specified condition is met. The command causes performance of one or more actions selected from the group consisting of: (a) causing image data captured by the portable camera not to be held in the buffer; (b) causing image data held in the buffer to be transferred to a memory 24 in the portable camera; (c) causing image data captured by the portable camera to be stored in the memory; and (d) causing image data captured by the portable camera not to be stored in the memory. This method may be implemented using the techniques and embodiments disclosed herein.


In light of the principles and example embodiments described and depicted herein, it will be recognized that the example embodiments can be modified in arrangement and detail without departing from such principles. Also, the foregoing discussion has focused on particular embodiments, but other configurations are also contemplated. In particular, even though expressions such as “in one embodiment,” “in another embodiment,” or the like are used herein, these phrases are meant to generally reference embodiment possibilities, and are not intended to limit the invention to particular embodiment configurations. As used herein, these terms may reference the same or different embodiments that are combinable into other embodiments. As a rule, any embodiment referenced herein is freely combinable with any one or more of the other embodiments referenced herein, and any number of features of different embodiments are combinable with one another, unless indicated otherwise.


Similarly, although example processes have been described with regard to particular operations performed in a particular sequence, numerous modifications could be applied to those processes to derive numerous alternative embodiments of the present invention. For example, alternative embodiments may include processes that use fewer than all of the disclosed operations, processes that use additional operations, and processes in which the individual operations disclosed herein are combined, subdivided, rearranged, or otherwise altered. This disclosure describes one or more embodiments wherein various operations are performed by certain systems, applications, modules, components, etc. In alternative embodiments, however, those operations could be performed by different components. Also, items such as applications, modules, components, etc., may be implemented as software constructs stored in a machine accessible storage medium, such as an optical disk, a hard disk drive, etc., and those constructs may take the form of applications, programs, subroutines, instructions, objects, methods, classes, or any other suitable form of control logic; such items may also be implemented as firmware or hardware, or as any combination of software, firmware and hardware, or any combination of any two of software, firmware and hardware.


This disclosure may include descriptions of various benefits and advantages that may be provided by various embodiments. One, some, all, or different benefits or advantages may be provided by different embodiments.


In view of the wide variety of useful permutations that may be readily derived from the example embodiments described herein, this detailed description is intended to be illustrative only, and should not be taken as limiting the scope of the invention. What is claimed as the invention, therefore, are all implementations that come within the scope of the following claims, and all equivalents to such implementations.

Claims
  • 1. A method, comprising: disposing a docking module in a vehicle;disposing a portable camera remote from the docking module and wirelessly linking the portable camera with the docking module, the portable camera configured to capture image data, the portable camera comprising: a buffer to temporarily hold the captured image data in a continuous circulating stream; anda memory to store captured image data received from the buffer; andif the portable camera is located within a specified distance of the docking module, then: (a) recognizing the portable camera when the portable camera is undocked and detecting operating conditions from signals sent by triggers, the triggers comprising buttons for manual activation, sensors to sense equipment activation, global positioning satellite circuitry to detect distance between the docking module and the portable camera, and a speedometer to detect velocity data of the vehicle; and(b) receiving the signals and wirelessly sending commands from the docking module to the portable camera based on a desired configuration of operating conditions based on the triggers, the commands comprising: a do not hold and do not store command to cause the portable camera to not hold the captured image data in the buffer and to not store the captured image data in the memory;a buffer command to cause the portable camera to temporarily hold the captured image data in the buffer; anda store command to trigger the portable camera to store to the memory the captured image data held in the buffer; andif the portable camera has been triggered to store image data to the memory, then activating the portable camera to bypass the commands from the docking module based on the triggers and to continue storing the image data to the memory by overriding any command from the docking module to terminate storing the image data to the memory.
  • 2. The method of claim 1, wherein the commands sent to the portable camera also cause the portable camera to transfer to the memory any image data that is already held in the buffer when the commands are received by the portable camera.
  • 3. The method of claim 1, wherein the sensors comprise a vehicle light bar sensor configured to send a vehicle light bar signal to the docking module when a light bar of the vehicle is activated.
  • 4. The method of claim 1, wherein the sensors comprise a vehicle gun rack sensor configured to send a vehicle gun rack signal to the docking module when a gun is removed from a gun rack of the vehicle.
  • 5. The method of claim 1, wherein the sensors comprise a vehicle door sensor configured to send a vehicle door signal to the docking module when a door of the vehicle is opened.
  • 6. The method of claim 1, wherein the sending of one of the commands to the portable camera is triggered directly by the receiving, by the docking module, of one or more of the signals from one or more of the triggers, without any intermediate event between the receiving, by the docking module, of the one or more of the signals from the one or more of the triggers and the sending of the one of the commands to the portable camera.
  • 7. The method of claim 1, wherein the receiving the signals comprises: upon receipt, by the docking module, of an indication that a velocity of the vehicle exceeds a threshold velocity, sending a wireless command to the portable camera, the wireless command causing the portable camera to start storing to the memory image data captured by the portable camera.
  • 8. The method of claim 7, wherein the wireless command sent to the portable camera also causes the portable camera to transfer to the memory any image data that is already held in the buffer when the wireless command is received by the portable camera.
  • 9. The method of claim 7, wherein the sending of the wireless command to the portable camera is triggered directly by the receipt, by the docking module, of the indication that the velocity of the vehicle exceeds the threshold velocity, without any intermediate event between the receipt, by the docking module, of the indication that the velocity of the vehicle exceeds the threshold velocity and the sending of the wireless command to the portable camera.
  • 10. The method of claim 1, wherein the portable camera is configured to bypass the commands from the docking module and continue storing the image data to the memory if the portable camera has been activated, at a time when the portable camera is disposed outside of the vehicle, to store captured image data to the memory.
  • 11. A system, comprising: a docking module disposed in a vehicle;a portable camera disposed remote from the docking module, the portable camera wirelessly linked with the docking module, the portable camera configured to capture image data, the portable camera comprising: a buffer to temporarily hold the captured image data in a continuous circulating stream; anda memory to store captured image data received from the buffer;wherein the docking module is configured to perform the following operations if the portable camera is located within a specified distance of the docking module: (a) to recognize the portable camera when the portable camera is undocked and to detect operating conditions from signals sent by triggers, the triggers comprising buttons for manual activation, sensors to sense equipment activation, global positioning satellite circuitry to detect distance between the docking module and the portable camera, and a speedometer to detect velocity data of the vehicle; and(b) to receive the signals and to wirelessly send commands from the docking module to the portable camera based on a desired configuration of operating conditions based on the triggers, the commands comprising: a do not hold and do not store command to cause the portable camera to not hold the captured image data in the buffer and to not store the captured image data in the memory;a buffer command to cause the portable camera to temporarily hold the captured image data in the buffer; anda store command to trigger the portable camera to store to the memory the captured image data held in the buffer; andwherein the portable camera is configured such that, if the portable camera has been triggered to store image data to the memory, the portable camera is activatable to bypass the commands from the docking module based on the triggers and to continue storing the image data to the memory by overriding any command from the docking module to terminate storing the image data to the memory.
  • 12. The system of claim 11, wherein the wireless commands sent to the portable camera also cause the portable camera to transfer to the memory any image data that is already held in the buffer when the wireless commands are received by the portable camera.
  • 13. The system of claim 11, wherein the sensors comprise a vehicle light bar sensor configured to send a vehicle light bar signal to the docking module when a light bar of the vehicle is activated.
  • 14. The system of claim 11, wherein the sensors comprise a vehicle gun rack sensor configured to send a vehicle gun rack signal to the docking module when a gun is removed from a gun rack of the vehicle.
  • 15. The system of claim 11, wherein the sensors comprise a vehicle door sensor configured to send a vehicle door signal to the docking module when a door of the vehicle is opened.
  • 16. The system of claim 11, wherein the sending of one of the commands to the portable camera is triggered directly by the receiving, by the docking module, of one or more of the signals from one or more of the triggers, without any intermediate event between the receiving, by the docking module, of the one or more of the signals from the one or more of the triggers and the sending of the one of the commands to the portable camera.
  • 17. The system of claim 11, wherein the docking module is further configured to perform the following: upon receipt, by the docking module, of an indication that a velocity of the vehicle exceeds a threshold velocity, send a wireless command to the portable camera, the wireless command causing the portable camera to start storing to the memory image data captured by the portable camera.
  • 18. The system of claim 17, wherein the wireless command sent to the portable camera also causes the portable camera to transfer to the memory any image data that is already held in the buffer when the wireless command is received by the portable camera.
  • 19. The system of claim 17, wherein the sending of the wireless command to the portable camera is triggered directly by the receipt, by the docking module, of the indication that the velocity of the vehicle exceeds the threshold velocity, without any intermediate event between the receipt, by the docking module, of the indication that the velocity of the vehicle exceeds the threshold velocity and the sending of the wireless command to the portable camera.
  • 20. The system of claim 11, wherein the portable camera is configured to bypass the commands from the docking module based on the triggers and to continue storing the image data to the memory if the portable camera has been activated, at a time when the portable camera is disposed outside of the vehicle, to store captured image data to the memory.
  • 21. The system of claim 11, wherein the portable camera is configured to dock with the docking module to: (i) electrically charge the portable camera, (ii) enable data transfer to or from the portable camera through the docking module, or (iii) enable data exchange between the portable camera and the docking module.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority to U.S. Provisional Patent Application No. 62/333,818, filed on May 9, 2016, titled “Systems, Apparatuses and Methods for Creating, Identifying, Enhancing, and Distributing Evidentiary Data” and to U.S. Provisional Patent Application No. 62/286,139, filed on Jan. 22, 2016, titled “Systems, Apparatuses and Methods for Securely Attaching Wearable Devices.” The entire disclosures of Application No. 62/333,818 and Application No. 62/286,139 are hereby incorporated herein by reference.

US Referenced Citations (294)
Number Name Date Kind
4344184 Edwards Aug 1982 A
4543665 Sotelo et al. Sep 1985 A
4590614 Erat May 1986 A
4910795 McCowen et al. Mar 1990 A
5012335 Cohodar Apr 1991 A
5111289 Lucas et al. May 1992 A
5408330 Squicciarini et al. Apr 1995 A
5477397 Naimpally et al. Dec 1995 A
5613032 Cruz et al. Mar 1997 A
5724475 Kirsten Mar 1998 A
5815093 Kikinis Sep 1998 A
5841978 Rhoads Nov 1998 A
5862260 Rhoads Jan 1999 A
5926218 Smith Jul 1999 A
5946343 Schotz et al. Aug 1999 A
5970098 Herzberg Oct 1999 A
6002326 Turner Dec 1999 A
6009229 Kawamura Dec 1999 A
6028528 Lorenzetti et al. Feb 2000 A
6038257 Brusewitz et al. Mar 2000 A
6122403 Rhoads Sep 2000 A
6141611 MacKey et al. Oct 2000 A
6163338 Johnson et al. Dec 2000 A
6175860 Gaucher Jan 2001 B1
6181711 Zhang et al. Jan 2001 B1
6275773 Lemelson et al. Aug 2001 B1
6298290 Abe et al. Oct 2001 B1
6346965 Toh Feb 2002 B1
6405112 Rayner Jun 2002 B1
6411874 Morgan et al. Jun 2002 B2
6421080 Lambert Jul 2002 B1
6424820 Burdick et al. Jul 2002 B1
6462778 Abram et al. Oct 2002 B1
6505160 Levy et al. Jan 2003 B1
6510177 De Bonet et al. Jan 2003 B1
6518881 Monroe Feb 2003 B2
6624611 Kirmuss Sep 2003 B2
6778814 Koike Aug 2004 B2
6788338 Dinev et al. Sep 2004 B1
6788983 Zheng Sep 2004 B2
6789030 Coyle et al. Sep 2004 B1
6791922 Suzuki Sep 2004 B2
6825780 Saunders et al. Nov 2004 B2
6831556 Boykin Dec 2004 B1
7010328 Kawasaki et al. Mar 2006 B2
7091851 Mason et al. Aug 2006 B2
7119832 Blanco et al. Oct 2006 B2
7120477 Huang Oct 2006 B2
7155615 Silvester Dec 2006 B1
7167519 Comaniciu et al. Jan 2007 B2
7190882 Gammenthaler Mar 2007 B2
7231233 Gosieski, Jr. Jun 2007 B2
7272179 Siemens et al. Sep 2007 B2
7317837 Yatabe et al. Jan 2008 B2
7356473 Kates Apr 2008 B2
7386219 Ishige Jun 2008 B2
7410371 Shabtai et al. Aug 2008 B2
7414587 Stanton Aug 2008 B2
7428314 Henson Sep 2008 B2
7515760 Sai et al. Apr 2009 B2
7542813 Nam Jun 2009 B2
7551894 Gerber et al. Jun 2009 B2
7554587 Shizukuishi Jun 2009 B2
7618260 Daniel et al. Nov 2009 B2
7631195 Yu et al. Dec 2009 B1
7688203 Rockefeller et al. Mar 2010 B2
7693289 Stathem et al. Apr 2010 B2
7768548 Silvernail et al. Aug 2010 B2
7778601 Seshadri et al. Aug 2010 B2
7792189 Finizio et al. Sep 2010 B2
7818078 Iriarte Oct 2010 B2
7835530 Avigni Nov 2010 B2
7868912 Venetianer et al. Jan 2011 B2
7877115 Seshadri et al. Jan 2011 B2
7974429 Tsai Jul 2011 B2
7995652 Washington Aug 2011 B2
8068023 Dulin et al. Nov 2011 B2
8081214 Vanman et al. Dec 2011 B2
8086277 Ganley et al. Dec 2011 B2
8121306 Cilia et al. Feb 2012 B2
8126276 Bolle et al. Feb 2012 B2
8126968 Rodman et al. Feb 2012 B2
8139796 Nakashima et al. Mar 2012 B2
8144892 Shemesh et al. Mar 2012 B2
8145134 Henry et al. Mar 2012 B2
8150089 Segawa et al. Apr 2012 B2
8154666 Mody Apr 2012 B2
8166220 Ben-Yacov et al. Apr 2012 B2
8174577 Chou May 2012 B2
8195145 Angelhag Jun 2012 B2
8208024 Dischinger Jun 2012 B2
8228364 Cilia Jul 2012 B2
8230149 Long et al. Jul 2012 B1
8253796 Renkis Aug 2012 B2
8254844 Kuffner et al. Aug 2012 B2
8260217 Chang et al. Sep 2012 B2
8264540 Chang et al. Sep 2012 B2
8270647 Crawford et al. Sep 2012 B2
8289370 Civanlar et al. Oct 2012 B2
8300863 Knudsen et al. Oct 2012 B2
8311549 Chang et al. Nov 2012 B2
8311983 Guzik Nov 2012 B2
8358980 Tajima et al. Jan 2013 B2
8380131 Chiang Feb 2013 B2
8422944 Flygh et al. Apr 2013 B2
8446469 Blanco et al. May 2013 B2
8457827 Ferguson et al. Jun 2013 B1
8489065 Green et al. Jul 2013 B2
8489151 Engelen et al. Jul 2013 B2
8497940 Green et al. Jul 2013 B2
8554145 Fehr Oct 2013 B2
8612708 Drosch Dec 2013 B2
8630908 Forster Jan 2014 B2
8661507 Hesselink et al. Feb 2014 B1
8707392 Birtwhistle et al. Apr 2014 B2
8731742 Zagorski et al. May 2014 B2
8780199 Mimar Jul 2014 B2
8781292 Ross et al. Jul 2014 B1
8849557 Levandowski et al. Sep 2014 B1
9041803 Chen et al. May 2015 B2
9070289 Saund et al. Jun 2015 B2
9159371 Ross et al. Oct 2015 B2
9201842 Plante Dec 2015 B2
9225527 Chang Dec 2015 B1
9253452 Ross et al. Feb 2016 B2
9307317 Chang et al. Apr 2016 B2
9325950 Haler Apr 2016 B2
9471059 Wilkins Oct 2016 B1
9589448 Schneider et al. Mar 2017 B1
9665094 Russell May 2017 B1
10074394 Ross et al. Sep 2018 B2
20020003571 Schofield et al. Jan 2002 A1
20020051061 Peters et al. May 2002 A1
20020135679 Scaman Sep 2002 A1
20030052970 Dodds et al. Mar 2003 A1
20030080878 Kirmuss May 2003 A1
20030081122 Kirmuss May 2003 A1
20030081127 Kirmuss May 2003 A1
20030081128 Kirmuss May 2003 A1
20030081934 Kirmuss May 2003 A1
20030081935 Kirmuss May 2003 A1
20030095688 Kirmuss May 2003 A1
20030103140 Watkins Jun 2003 A1
20030151663 Lorenzetti et al. Aug 2003 A1
20030197629 Saunders et al. Oct 2003 A1
20040008255 Lewellen Jan 2004 A1
20040051793 Tecu et al. Mar 2004 A1
20040107030 Nishira et al. Jun 2004 A1
20040146272 Kessel et al. Jul 2004 A1
20040177253 Wu et al. Sep 2004 A1
20050007458 Benattou Jan 2005 A1
20050078195 VanWagner Apr 2005 A1
20050083404 Pierce et al. Apr 2005 A1
20050088521 Blanco et al. Apr 2005 A1
20050122397 Henson et al. Jun 2005 A1
20050154907 Han et al. Jul 2005 A1
20050158031 David Jul 2005 A1
20050185936 Lao et al. Aug 2005 A9
20050243171 Ross, Sr. et al. Nov 2005 A1
20050286476 Crosswy et al. Dec 2005 A1
20060055521 Blanco et al. Mar 2006 A1
20060072672 Holcomb et al. Apr 2006 A1
20060077256 Silvemail et al. Apr 2006 A1
20060078046 Lu Apr 2006 A1
20060130129 Dai et al. Jun 2006 A1
20060133476 Page et al. Jun 2006 A1
20060165386 Garoutte Jul 2006 A1
20060270465 Lee et al. Nov 2006 A1
20060274116 Wu Dec 2006 A1
20070005609 Breed Jan 2007 A1
20070064108 Haler Mar 2007 A1
20070086601 Mitchler Apr 2007 A1
20070111754 Marshall et al. May 2007 A1
20070124292 Kirshenbaum et al. May 2007 A1
20070217761 Chen et al. Sep 2007 A1
20070219685 Plante Sep 2007 A1
20080005472 Khalidi et al. Jan 2008 A1
20080030782 Watanabe Feb 2008 A1
20080129825 DeAngelis et al. Jun 2008 A1
20080165250 Ekdahl et al. Jul 2008 A1
20080186129 Fitzgibbon Aug 2008 A1
20080208755 Malcolm Aug 2008 A1
20080294315 Breed Nov 2008 A1
20080303903 Bentley et al. Dec 2008 A1
20090017881 Madrigal Jan 2009 A1
20090022362 Gagvani et al. Jan 2009 A1
20090074216 Bradford et al. Mar 2009 A1
20090076636 Bradford et al. Mar 2009 A1
20090118896 Gustafsson May 2009 A1
20090195651 Leonard et al. Aug 2009 A1
20090195655 Pandey Aug 2009 A1
20090213902 Jeng Aug 2009 A1
20100026809 Curry Feb 2010 A1
20100030929 Ben-Yacov et al. Feb 2010 A1
20100057444 Cilia Mar 2010 A1
20100081466 Mao Apr 2010 A1
20100131748 Lin May 2010 A1
20100136944 Taylor et al. Jun 2010 A1
20100180051 Harris Jul 2010 A1
20100238009 Cook et al. Sep 2010 A1
20100274816 Guzik Oct 2010 A1
20100287545 Corbefin Nov 2010 A1
20100289648 Ree Nov 2010 A1
20100302979 Reunamaki Dec 2010 A1
20100309971 Vanman et al. Dec 2010 A1
20110016256 Hatada Jan 2011 A1
20110044605 Vanman et al. Feb 2011 A1
20110092248 Evanitsky Apr 2011 A1
20110142156 Haartsen Jun 2011 A1
20110233078 Monaco et al. Sep 2011 A1
20110234379 Lee Sep 2011 A1
20110280143 Li et al. Nov 2011 A1
20110280413 Wu et al. Nov 2011 A1
20110299457 Green, III et al. Dec 2011 A1
20120014534 Bodley et al. Jan 2012 A1
20120078397 Lee et al. Mar 2012 A1
20120083960 Zhu et al. Apr 2012 A1
20120163309 Ma et al. Jun 2012 A1
20120173577 Millar et al. Jul 2012 A1
20120266251 Birtwhistle et al. Oct 2012 A1
20120300081 Kim Nov 2012 A1
20120307070 Pierce Dec 2012 A1
20120310394 El-Hoiydi Dec 2012 A1
20120310395 El-Hoiydi Dec 2012 A1
20130114849 Pengelly et al. May 2013 A1
20130135472 Wu et al. May 2013 A1
20130163822 Chigos et al. Jun 2013 A1
20130201884 Freda et al. Aug 2013 A1
20130218427 Mukhopadhyay et al. Aug 2013 A1
20130223653 Chang Aug 2013 A1
20130236160 Gentile et al. Sep 2013 A1
20130242262 Lewis Sep 2013 A1
20130251173 Ejima et al. Sep 2013 A1
20130268357 Heath Oct 2013 A1
20130287261 Lee et al. Oct 2013 A1
20130302758 Wright Nov 2013 A1
20130339447 Ervine Dec 2013 A1
20130346660 Kwidzinski et al. Dec 2013 A1
20140037142 Bhanu et al. Feb 2014 A1
20140038668 Vasavada et al. Feb 2014 A1
20140078304 Othmer Mar 2014 A1
20140085475 Bhanu et al. Mar 2014 A1
20140092251 Troxel Apr 2014 A1
20140100891 Turner et al. Apr 2014 A1
20140114691 Pearce Apr 2014 A1
20140143545 McKeeman et al. May 2014 A1
20140162598 Villa-Real Jun 2014 A1
20140184796 Klein et al. Jul 2014 A1
20140236414 Droz et al. Aug 2014 A1
20140236472 Rosario Aug 2014 A1
20140278052 Slavin et al. Sep 2014 A1
20140280584 Ervine Sep 2014 A1
20140281498 Bransom et al. Sep 2014 A1
20140297687 Lin Oct 2014 A1
20140309849 Ricci Oct 2014 A1
20140321702 Schmalstieg Oct 2014 A1
20140355951 Tabak Dec 2014 A1
20140375807 Muetzel et al. Dec 2014 A1
20150012825 Rezvani et al. Jan 2015 A1
20150032535 Li et al. Jan 2015 A1
20150066349 Chan et al. Mar 2015 A1
20150084790 Arpin et al. Mar 2015 A1
20150086175 Lorenzetti Mar 2015 A1
20150088335 Lambert et al. Mar 2015 A1
20150103159 Shashua et al. Apr 2015 A1
20150161483 Allen et al. Jun 2015 A1
20150211868 Matsushita et al. Jul 2015 A1
20150266575 Borko Sep 2015 A1
20150294174 Karkowski et al. Oct 2015 A1
20160023762 Wang Jan 2016 A1
20160035391 Ross et al. Feb 2016 A1
20160042767 Araya Feb 2016 A1
20160062762 Chen et al. Mar 2016 A1
20160062992 Chen et al. Mar 2016 A1
20160063642 Luciani et al. Mar 2016 A1
20160064036 Chen et al. Mar 2016 A1
20160065908 Chang et al. Mar 2016 A1
20160144788 Perrin et al. May 2016 A1
20160148638 Ross et al. May 2016 A1
20160285492 Vembar Sep 2016 A1
20160332747 Bradlow et al. Nov 2016 A1
20170032673 Scofield et al. Feb 2017 A1
20170053169 Cuban et al. Feb 2017 A1
20170053674 Fisher et al. Feb 2017 A1
20170059265 Winter Mar 2017 A1
20170066374 Hoye Mar 2017 A1
20170076396 Sudak Mar 2017 A1
20170085829 Waniguchi Mar 2017 A1
20170113664 Nix Apr 2017 A1
20170178422 Wright Jun 2017 A1
20170178423 Wright Jun 2017 A1
20170193828 Holtzman et al. Jul 2017 A1
20170253330 Saigh et al. Sep 2017 A1
20170324897 Swaminathan et al. Nov 2017 A1
Foreign Referenced Citations (40)
Number Date Country
2907145 May 2007 CN
101309088 Nov 2008 CN
102355618 Feb 2012 CN
102932703 Feb 2013 CN
202957973 May 2013 CN
103617005 Mar 2014 CN
1148726 Oct 2001 EP
1655855 May 2006 EP
2107837 Oct 2009 EP
2391687 Nov 2004 GB
2003150450 May 2003 JP
2005266934 Sep 2005 JP
2009169922 Jul 2009 JP
2012058832 Mar 2012 JP
1997038526 Oct 1997 WO
2000013410 Mar 2000 WO
2000021258 Apr 2000 WO
2000045587 Aug 2000 WO
2000072186 Nov 2000 WO
2002061955 Aug 2002 WO
2004066590 Aug 2004 WO
2004111851 Dec 2004 WO
2005053325 Jun 2005 WO
2005054997 Jun 2005 WO
2007114988 Oct 2007 WO
2009058611 May 2009 WO
2009148374 Dec 2009 WO
2012001143 Jan 2012 WO
2012100114 Jul 2012 WO
2012116123 Aug 2012 WO
2013020588 Feb 2013 WO
2013074947 May 2013 WO
2013106740 Jul 2013 WO
2013107516 Jul 2013 WO
2013150326 Oct 2013 WO
2014057496 Apr 2014 WO
2016033523 Mar 2016 WO
2016061516 Apr 2016 WO
2016061525 Apr 2016 WO
2016061533 Apr 2016 WO
Non-Patent Literature Citations (43)
Entry
Office Action issued in U.S. Appl. No. 11/369,502 dated Mar. 16, 2010, 10 pages.
Office Action issued in U.S. Appl. No. 11/369,502 dated Sep. 30, 2010, 12 pages.
Office Action issued in U.S. Appl. No. 11/369,502 dated Jul. 14, 2011, 17 pages.
Office Action issued in U.S. Appl. No. 11/369,502 dated Jan. 31, 2012, 18 pages.
Examiner's Answer (to Appeal Brief) issued in U.S. Appl. No. 11/369,502 dated Oct. 24, 2012, 20 pages.
Office Action issued in U.S. Appl. No. 13/723,747 dated Mar. 22, 2013, 6 pages.
Office Action issued in U.S. Appl. No. 13/723,747 dated Jun. 26, 2013, 6 pages.
Office Action issued in U.S. Appl. No. 13/723,747 dated Sep. 10, 2013, 7 pages.
Advisory Action issued in U.S. Appl. No. 13/723,747 dated Feb. 24, 2014, 4 pages.
Office Action issued in U.S. Appl. No. 13/723,747 dated Mar. 20, 2014, 6 pages.
Office Action issued in U.S. Appl. No. 13/723,747 dated Nov. 10, 2014, 9 pages.
Notice of Allowance and Fees Due issued in U.S. Appl. No. 13/723,747 dated Mar. 30, 2015, 6 pages.
First Action Interview Pilot Program Pre-Interview Communication issued in U.S. Appl. No. 14/588,139 dated May 14, 2015, 4 pages.
Office Action issued in U.S. Appl. No. 14/593,853 dated Apr. 20, 2015, 30 pages.
Office Action issued in U.S. Appl. No. 14/593,956 dated May 6, 2015, 10 pages.
PCT International Search Report and Written Opinion issued in Application No. PCT/US07/63485 dated Feb. 8, 2008, 10 pages.
Chapter 5: “Main Memory,” Introduction to Computer Science course, 2004, 20 pages, available at http://www2.cs.ucy.ac.cy/~nicolast/courses/lectures/MainMemory.pdf.
Sony Corporation, Digital Still Camera (MVC-CD200/CD300), Operation Manual, 2001, 108 pages, Sony, Japan.
Steve's Digicams, Kodak Professional DCS 620 Digital Camera, 1999, 11 pages, United States, available at: http://www.steves-digicams.com/dcs620.html.
Gregory J. Allen, “The Feasibility of Implementing Video Teleconferencing Systems Aboard Afloat Naval Units” (Master's Thesis, Naval Postgraduate School, Monterey, California), Mar. 1990, 143 pages.
Bell-Northern Research Ltd., “A Multi-Bid Rate Interframe Movement Compensated Multimode Coder for Video Conferencing” (Final Report prepared for DARPA), Apr. 1982, 92 pages, Ottawa, Ontario, Canada.
Xiaoqing Zhu, Eric Setton, Bernd Girod, “Rate Allocation for Multi-Camera Surveillance Over an Ad Hoc Wireless Network,” 2004, 6 pages, available at http://msw3.stanford.edu/~zhuxq/papers/pcs2004.pdf.
Office Action issued in U.S. Appl. No. 14/593,722 dated Sep. 25, 2015, 39 pages.
Office Action issued in U.S. Appl. No. 14/593,853 dated Sep. 11, 2015 (including Summary of Interview conducted on May 9, 2015), 45 pages.
Notice of Allowance issued in U.S. Appl. No. 14/593,956 dated Oct. 26, 2015, 10 pages.
“IEEE 802.1X,” Wikipedia, Aug. 23, 2013, 8 pages, available at: http://en.wikipedia.org/w/index.php?title=IEEE_802.1X&oldid=569887090.
Notice of Allowance issued in U.S. Appl. No. 14/588,139 dated Aug. 14, 2015, 19 pages.
“Near Field Communication,” Wikipedia, Jul. 19, 2014, 8 pages, available at: https://en.wikipedia.org/w/index.php?title=near_field_communication&oldid=617538619.
PCT International Search Report and Written Opinion issued in Application No. PCT/US15/47532 dated Jan. 8, 2016, 22 pages.
Office Action issued in U.S. Appl. No. 14/686,192 dated Apr. 8, 2016, 19 pages.
Office Action issued in U.S. Appl. No. 14/715,742 dated Aug. 21, 2015, 13 pages.
Office Action issued in U.S. Appl. No. 14/715,742 dated Mar. 11, 2016, 14 pages.
Office Action issued in U.S. Appl. No. 14/593,722 dated Apr. 10, 2015, 28 pages.
Office Action issued in U.S. Appl. No. 14/686,192 dated Dec. 24, 2015, 12 pages.
“Portable Application,” Wikipedia, Jun. 26, 2014, 4 pages, available at: http://en.wikipedia.org/w/index.php?title=Portable_application&oldid=614543759.
“Radio-Frequency Identification,” Wikipedia, Oct. 18, 2013, 31 pages, available at: http://en.wikipedia.org/w/index.php?title=Radio-frequency_identification&oldid=577711262.
Advisory Action issued in U.S. Appl. No. 14/715,742 dated May 20, 2016 (including Summary of Interview conducted on May 12, 2016), 4 pages.
Advisory Action issued in U.S. Appl. No. 14/715,742 dated Jun. 14, 2016, 3 pages.
Office Action issued in U.S. Appl. No. 14/715,742 dated Sep. 23, 2016, 17 pages.
Office Action issued in U.S. Appl. No. 15/413,205 dated Mar. 17, 2017, 7 pages.
Office Action issued in U.S. Appl. No. 15/438,166 dated Apr. 21, 2017, 17 pages.
U.S. Appl. No. 62/197,493 (Fisher et al.), filed Jul. 27, 2015, 12 pages.
Office Action issued in U.S. Appl. No. 15/467,924 dated May 8, 2017, 10 pages.
Related Publications (1)
Number Date Country
20170214843 A1 Jul 2017 US
Provisional Applications (2)
Number Date Country
62333818 May 2016 US
62286139 Jan 2016 US