The invention relates to voice, data and audio-video streaming to and from a hands-free wireless mobile device. Further, the invention relates to multiple embodiments of a video camera telecommunications headset with a retractable viewfinder/monocular display and a laser rangefinder, offered either as a single-unit hands-free wireless real-time bi-directional and multi-feed telecommunications headset or as a modular Bluetooth earpiece headphone (Headphone) unit with a detachable handheld cellular video camera unit. In the modular embodiment, the earpiece unit serves either as a stand-alone wireless Bluetooth Headphone and personal media player or as a wearable multimedia port for the detachable handheld cellular video camera telephone unit; when linked together, the two units operate as a single autonomous hands-free wireless video communications headset capable of still image and audio-video capturing, recording and streaming to and from a receiver or group of receivers, and of real-time viewing and control of captured and/or received audio-video feeds via the retractable monocular viewfinder or other paired multimedia display system.
Further, the invention relates to the headset accessories, including an adjustable docking station for recharging and/or use of the headset as a stationary wired or wireless IP camera, removable/rechargeable earpiece and neckpiece battery units for continuous hands-free wireless mobile headset, headphone and/or other ported device operations, and a pair of multimedia sunglasses for active binocular display and eye-to-camera automation of the headset's optical and digital zoom, day and near infrared night vision camera.
Further, the invention relates to device control software that is embedded into the headset and/or downloaded to the headset (or other networked device) and/or executed remotely from an external server. The device control software provides a system and method for infrastructure and ad-hoc networking, operational behavior management, communications protocol selection, security, data and power characteristics, modular data broadcasting, hands-free and remote systems control, and autonomous, paired and/or networked device optimization.
Field of Invention
When wireless/cellular phone technology was initially commercialized, cellular phones were voice-only communication devices, i.e. such devices did not have any capability beyond voice communication. These cellular phones were hand-held devices. A hands-free option was introduced that required plugging in a wired headphone connection between the phone and an earpiece and microphone. Over time, some new technologies evolved.
Most notably:
Description of the Prior Art
Comparatively, US patent publication 20100039493, Mobile video headset, Feb. 18, 2010, comprises an embodiment that allows hands-free imaging of the user's surroundings via a camera and transfer of a simplex (one-way) audio and video stream to a personal (nearby) electronic device. This invention advances all communications, including the audio-video stream, to duplex operation with a plurality of devices for full multimedia exchange among multiple devices. Paired or connected devices do not have to be on the person or nearby; instead, they can be anywhere a network connection can be established. Additionally, other unified communication data (email, text message, voice, GPS) is integral to the device, providing a rich communication platform among a group of connected devices. Also, hands-free user control capabilities are advanced by voice command, eye tracking, motion gesture, and hand-operated controls.
Other patents utilize off-board computing power to process data and coordinate control of a wearable (headset) device, whereas this invention integrates all functionality within the headset enclosure.
Technology referenced in some of the prior art should be considered experimental; it largely remains in the laboratory or is prohibitively expensive. Conversely, much of the technology in this invention is proven and available as commercial off-the-shelf solutions ready for integration into a design.
Multiple prior art references incorporate large and heavy headset apparatus. This invention is less intrusive to the user's movement and other worn clothing or accessories. The target weight of this headset is less than 100 grams.
The battery powered video camera telecommunications headset is the next innovative step in video capture and wearable hands-free wireless communications devices. Designed to capture, record, stream and display what the eyes see and what the ears hear, the Headset offers two-way and multi-feed real-time voice, data and audio-video communications and streaming to and from multiple wireless network connections, or locally to persistent storage. Operating as an autonomous, paired or peer-to-peer networked computing and multimedia device, the Headset offers a wide range of hands-free device operational controls and interfaces, including voice command, automated eye-motion and facial recognition for eye-camera control, a light and motion sensor for the embedded day and night vision camera, laser rangefinder automated optical and digital zoom, remote paired or networked device and camera control, and redundant embedded and removable/rechargeable, rapidly swappable battery power sources for uninterrupted hands-free wireless device operation.
The headset can operate as a paired accessory to a cell phone, PC, TV, video game console or other external wired or wireless Bluetooth, Wi-Fi or WiMax, cellular, or otherwise wired or wirelessly networked device. The video camera telecommunications Headset is worn on the ear similar to other wireless Bluetooth and/or Wi-Fi earpiece Headsets, yet unlike its predecessors the Headset is more than a Bluetooth accessory; it is a hands-free wireless GPS enabled Wide, Metro or Local and Personal Area Network cellular communications system capable of wholly autonomous operations. Incorporating an embedded optical and digital zoom video and still image camera with both day and infrared night vision capabilities and a retractable monocular viewfinder display, the headset is an intelligent hands-free wireless device capable of simultaneously capturing, recording, streaming, receiving and outputting voice, data and audio-video in real-time over multiple networks, and of pairing with one or more networked devices to create an ad-hoc virtual private network over any WAN, MAN, LAN, PAN or a combination of a paired, managed network, federation of autonomous networked devices or other peer-to-peer networking configuration.
Until now, Bluetooth headsets, wearable video cameras and other hands-free wireless devices have been limited in their capability due to one of five major constraints: power, wireless range, size, weight and data processing capability. In addition to introducing functional hardware solutions to each of these physical constraints, the Headset also incorporates a mobile device control system that introduces a system and a method for modular device, group and network data, protocol and power management and optimization, modular data channeling, broadcasting and communications, coordinated and shared operations, security and remote systems control.
The Gateway Operating System (GOS) Control Software is designed to support numerous embodiments of the telecommunications headset and other unified, paired and networked communication devices (the Headset in this invention, wireless Internet devices and stationary/wired Internet devices). It introduces a modular device architecture and management system and a model for wired and wireless device networking, paired, autonomous, hierarchical or other managed device groups, peer-to-peer and Virtual Private Networking (VPN), while constantly optimizing single or multiple networked device configurations for data, power and network optimization, shared and/or unified operations, and inter-device and multi-device secure ad-hoc networking over any combination or type of wired and/or wireless network, public or private federation of autonomous wired or wireless devices.
The GOS is a modular device, data and network management platform and operations model designed to allow components or whole devices to be attached or detached and systems to be turned on, shut off, monitored and constantly optimized specific to the available power, memory, wireless transmit power and network bandwidth of all available networked devices and/or other user, group, network, event, location and/or application based device preferences. The GOS algorithms continuously generate and update optimized device operations based on automated and/or user specified guidelines and priorities for each of its coupled systems and/or modules.
The GOS is designed to expand to include one or many devices in a synchronized ad-hoc or coordinated, hierarchical, managed or other device network organizational model offering a standard modular device and data management system. It is also a model for individual and group device power and data management and storage, protocol selection, transmission, broadcasting, networking, applications and systems coordination over any number of wired and wireless networks for optimized individual and grouped device operations and remote systems control.
In addition to supporting multi-device coordinated networking, the GOS organizes systems through standardized APIs for interaction of system processes, organized data and communication protocol types, and applications such as device, user and network identification and security, location and contextual data, voice, audio and video, communications protocols, system controls, data management, and power management. The GOS identifies, quantifies, and prioritizes local and remote networked device operations, networks and applications for each of its designated system modules based on a set of variables such as power use, data processing speed, network strength, bandwidth, and available data and memory storage.
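By way of a non-authoritative illustration, the following Python sketch shows one way such per-module prioritization could be computed; the Candidate fields, weights and normalization are hypothetical assumptions rather than values taken from this disclosure.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    """A firmware component or networked device able to perform a module task."""
    name: str
    power_draw_mw: float      # lower is better
    throughput_mbps: float    # higher is better
    link_strength: float      # 0..1, higher is better
    free_storage_mb: float    # higher is better

# Hypothetical per-module weights; a real GOS would derive these from
# user, group, or application priority settings.
WEIGHTS = {"power": 0.4, "throughput": 0.3, "link": 0.2, "storage": 0.1}

def score(c: Candidate) -> float:
    """Combine normalized variables into a single priority score."""
    return (WEIGHTS["power"] * (1.0 / (1.0 + c.power_draw_mw))
            + WEIGHTS["throughput"] * c.throughput_mbps / 100.0
            + WEIGHTS["link"] * c.link_strength
            + WEIGHTS["storage"] * c.free_storage_mb / 1024.0)

def prioritize(candidates: list[Candidate]) -> list[Candidate]:
    """Return candidates ordered from most to least preferred."""
    return sorted(candidates, key=score, reverse=True)
```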
For example, a “Location and Contextual Data Module” may be designated to acquire all data and manage all tasks and processes specific to the location of the device and/or its relation to another device, address, position, location, time/date, contextual event or coordinated application, and/or the source and location of any incoming or outgoing data stream. All location and contextual data acquisition, processing and transmission are managed by the location and contextual data module. The data module automatically locates, organizes and prioritizes all available firmware components and/or software processes designated for the acquisition, processing and/or application of location and contextual data. The Location and Contextual Data Module in each independent or networked device independently quantifies and prioritizes all of its local firmware components, related software processing systems and location and contextual data applications based on a universal set of variables.
Each Headset or other networked device, or group of networked devices, may have more than one method, system and/or set of firmware components capable of acquiring and utilizing the same or similar data. Each module then prioritizes all local or networked firmware components for each of its designated tasks and related data sets in order to select the optimum system(s) and method for acquiring a specific data set and accomplishing each task and/or process, and to determine which tasks should be acquired, processed and/or designated to each available system and/or networked device, as in the sketch below.
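For instance, several redundant components may all supply position fixes. The following sketch, with illustrative (not specified) accuracy and power figures and a hypothetical pick_location_source helper, shows one way the Location and Contextual Data Module could choose among them.

```python
# Illustrative characteristics of redundant position sources on one or
# more networked devices; real values would be measured at runtime.
LOCATION_SOURCES = [
    {"name": "gps",        "available": True, "accuracy_m": 5,   "power_mw": 60},
    {"name": "wifi_scan",  "available": True, "accuracy_m": 30,  "power_mw": 25},
    {"name": "cell_tower", "available": True, "accuracy_m": 500, "power_mw": 10},
]

def pick_location_source(max_error_m: float) -> str:
    """Among sources accurate enough for the task, prefer the one drawing
    the least power; otherwise fall back to the most accurate source."""
    usable = [s for s in LOCATION_SOURCES
              if s["available"] and s["accuracy_m"] <= max_error_m]
    if usable:
        return min(usable, key=lambda s: s["power_mw"])["name"]
    return min((s for s in LOCATION_SOURCES if s["available"]),
               key=lambda s: s["accuracy_m"])["name"]

print(pick_location_source(max_error_m=50))   # -> "wifi_scan"
print(pick_location_source(max_error_m=10))   # -> "gps"
```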
Each individual device then coordinates all networked and independent functions for each module and selects which local tasks, processes, components and/or networks should be idled or shut down at any given moment based on the independent and group-generated modular algorithm for optimized system operations. Each device, device group and/or network can be pre-defined with user preferences and updated for coordinated device system settings. Each module responds based on networked, group-defined standards for measuring the data, power and networking efficiency and application for all designated processes, tasks, data sets and all available firmware and software systems specific to each task. All data, system, component, device and network operations are then synchronized, coordinated and managed on a modular systems level for one or a group of networked devices, with each module continuously updating its operation specific to system capabilities, optimized use case scenarios and user priority settings for all available components and networked devices. Each module is assigned its own data channel on which all module-specific data streams are transmitted. Modular channels and related data streams can be encrypted and grouped together with other channel data streams and sent out via a single network port.
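As a non-authoritative sketch of the modular channeling just described, the following Python example frames each hypothetical channel with an ID and length and multiplexes the frames onto a single UDP port; per-channel encryption, which the disclosure allows, is omitted here for brevity.

```python
import json
import socket
import struct

def frame(channel_id: int, payload: bytes) -> bytes:
    """Prefix a module's data stream with its channel ID and length."""
    return struct.pack("!HI", channel_id, len(payload)) + payload

def multiplex(streams: dict[int, bytes]) -> bytes:
    """Group several modular channel frames into one outbound buffer."""
    return b"".join(frame(cid, data) for cid, data in streams.items())

# Example: location, audio-video and power-status channels share one UDP port.
packet = multiplex({
    1: json.dumps({"lat": 0.0, "lon": 0.0}).encode(),  # location module
    2: b"<audio-video chunk>",                         # A/V module
    3: json.dumps({"battery_pct": 87}).encode(),       # power module
})
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("192.0.2.10", 5004))  # placeholder receiver address
```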
The Headset and GOS Control Software offer a modular device, data and network management, operations and network optimization model that can be installed as an embedded system, downloaded and customized to an existing operating system as a middleware application, and/or networked to a device either via a networked hub or via a peer-to-peer ad-hoc or managed network configuration. The software is configurable for all headset and other networked device embodiments and applications including, but not limited to, wearable, hands-free and remote wireless systems controls.
The Headset is a mash-up of a GPS enabled video camera and camcorder, IP camera, video VOIP phone (IP telephony), personal media player, mobile handset (cell phone) and hands-free wireless earpiece Bluetooth Headset. As a cell phone, the headset is a mobile, wide area networked device that offers one-to-one and one-to-many communication, file storage, and audio, text, picture and video communication. As an IP camera phone, the headset provides real-time voice, data and audio-video over a Metro or Local Area, wired or Cloud wireless or wired Internet link and can serve as a wireless router and hot spot for an ad-hoc off-network peer-to-peer group of devices or secure Virtual Private Network (VPN). As a personal media player, the headset offers an entirely autonomous hands-free wireless solution.
The Headset is designed as a modular device in its physical, functional and embedded system design architecture, allowing components to be attached or detached specific to the desired application. A first embodiment comprises a one-piece GPS enabled hands-free wireless wearable audio-video camera headset capable of two-way, multi-feed and duplex networked voice, data and audio-video communications, offered in a curved or rectangular video camera phone (CP) Headset unit body design, while a second embodiment introduces a two-piece detachable handheld cellular CP unit and a Bluetooth earpiece Headphone unit, with both units capable of operating autonomously or, when attached, as a single hands-free wireless telecommunications headset device with the earpiece Headphone unit serving as a universal port for the handheld CP unit.
The headset supports a redundant data and power management and storage system designed to offer both embedded and removable/rechargeable, rapidly swappable data and power storage options, while offering a modular data management system capable of supporting entirely standalone autonomous device operations, communications, automated and user control, and/or securely transferring voice, data and/or video interface control to an authorized external wired or wirelessly networked device or group of devices. Additionally, an embedded laser sight and rangefinder support automated zoom functions and hands-free control of the optical and digital zoom camera, together with the light and motion sensor and accelerometer, and allow retractable monocular viewfinder, data management and/or headset systems control to be transferred partially or entirely to an authorized remote wired or wirelessly networked device or group of devices.
The cellular CP module and earpiece Headphone module, which are introduced in both single one-piece and detachable two-piece embodiments, both include independent battery and power systems and independent flash memory, with the CP unit housing a removable/rechargeable battery and the Headphone unit housing an embedded rechargeable battery. The earpiece Headphone unit also has a dual data/power accessory port system for a removable/rechargeable earpiece or neckpiece battery unit, both serving as a counterbalance and support system for the headset and/or Headphone units.
In both the single-unit and two-piece autonomous CP and Headphone embodiments, the system automatically selects from one or more of the available batteries based on power charge status, application and run-time scenarios, with the CP unit handling the primary processing, system and network drivers and the earpiece Headphone unit housing a microprocessor for the earpiece microphone, speaker, media player and user controls, Bluetooth and multimedia device port and connector functions. Both the CP and Headphone units have independent embedded and removable flash memory ports and independent data storage and management functionality, allowing both the CP unit and earpiece unit to operate jointly or independently in both the single-unit and two-piece unit embodiments.
When the CP and Headphone units are attached, either by default in the single-unit embodiment or by attachment in the two-piece embodiment, the system may revert to the larger CP unit battery and only use the Headphone battery as a backup, or the entire headset may revert to the external earpiece battery unit for all high-level applications and the CP unit battery for all core communications functions, so that should the earpiece battery be removed the headset will continue to function without interruption. The modular headset system is able to calculate and use all available resources and make operational decisions based on optimum and redundant functionality.
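A minimal sketch of the battery fallback behavior described above, assuming hypothetical unit names and a 10% charge floor that is not specified in this disclosure:

```python
def select_battery(batteries: dict[str, dict]) -> str:
    """Pick a power source by preference, skipping removed or depleted units.

    `batteries` maps a unit name to {"present": bool, "charge_pct": float}.
    The preference order and the 10% floor are illustrative assumptions.
    """
    preference = ["earpiece_external", "cp_unit", "headphone_embedded"]
    for name in preference:
        unit = batteries.get(name)
        if unit and unit["present"] and unit["charge_pct"] > 10.0:
            return name
    raise RuntimeError("no usable power source available")

# The swappable external unit is preferred; removing it falls back to the
# CP unit battery without interrupting core communications.
state = {
    "earpiece_external": {"present": False, "charge_pct": 0.0},
    "cp_unit": {"present": True, "charge_pct": 64.0},
    "headphone_embedded": {"present": True, "charge_pct": 80.0},
}
print(select_battery(state))  # -> "cp_unit"
```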
By introducing the external removable/replaceable tether-free battery power storage option, the Headset is able to optimize space and manage heat, weight, size and device operations more effectively, and by offering redundant power, storage and system functionality, the Headset becomes a truly hands-free wireless communications device and tool capable of uninterrupted use by simply replacing the external earpiece or neckpiece battery unit with a fully charged backup unit or by directly plugging the earpiece or neckpiece battery unit into an external battery pack or other networked or battery powered device.
Because of the size and weight constraints that are lifted by utilizing the removable earpiece and neckpiece battery solution, the CP unit is more comfortably able to house a full optical and digital zoom camera with light and motion sensor, a day and night vision lens system, a laser rangefinder for target location, motion sensing and automated camera zoom, as well as a retractable monocular viewfinder for real-time display of captured or incoming audio-video feed, an eye motion sensor for eye-camera control, eye retina scan security, a dual camera and voice microphone system for noise cancellation, voice recognition and voice command, and a multi-channel, duplex networking communications module for multi-device telecommunications, teleconferencing, telenetworking, coordination and remote systems control.
An expandable platform is possible with the adaptable CP module, which is offered in both a curved and a rectangular body design with either a multimedia and power connector port or a retractable high-speed USB port option. The CP module can be operated either as a handheld GPS, Cellular, Wi-Fi or WiMax and Bluetooth duplex and multi-channel cellular, IP and peer-to-peer networked mobile phone and camera/camcorder device or, when ported with the Bluetooth earpiece Headphone module, as a hands-free wireless audio-video camera and two-way and multi-feed telecommunications headset. The main processing component is in the CP module, which contains an optical and digital zoom day and near infrared night vision camera with an expansion connector for interfacing to other peripheral components, a retractable display/monocular eyepiece viewfinder with eye and facial scanning system for real-time hands-free viewing of the day or night vision camera video feed, a laser sight and rangefinder for visual camera targeting and automated optical and digital zoom, dual front camera and voice microphones for isolated audio-video and audio-voice feeds for noise cancellation, a speaker for use as a handheld phone or for audio-video playback, a hot-swappable battery, an SD storage slot, a complete numerical keypad and an independent camera/rangefinder audio and video control button array.
The Bluetooth earpiece Headphone wearable plug-and-play module serves as a universal port with both a multimedia and power port and a high-speed USB port for multi-device synced networking and operations, and dual data/power earpiece/neckpiece removable battery ports on either end of the Headphone, allowing the Headphone to be reversible and worn on either ear by supporting an external earpiece battery clip or adjustable neckpiece headset battery clip that can be inserted into either end of the earpiece Headphone for support, counterbalance and an external, rechargeable and swappable power source for uninterrupted use. A second SD storage slot serves as additional backup memory for the CP module when ported, or for audio file storage and playback when the earpiece Headphone is worn as one or a set of Bluetooth earpiece Headphones for use either as a hands-free Bluetooth communicator or as a standalone wireless Bluetooth personal media player capable of entirely autonomous operations (free of a connection to an external networked or media storage device), as a unified extension of the CP module via the multimedia or USB port, as a paired wired network accessory, or wirelessly via a Bluetooth paired connection or a Wi-Fi peer-to-peer networked connection (when connected with the CP module or other Wi-Fi enabled device).
Rechargeable/Replaceable External Earpiece Battery Units
The video camera is preferably capable of duplex audio/video streaming; detecting daylight, lowlight, and night light conditions; video streaming and still photography in varying light conditions; responding to control commands received via the headset; and stabilizing the image as the headset wearer moves.
Embodiments of the invention also provide for the control of the behavior of the headset. An embodiment of the headset has software that controls the operational behavior of the headset by responding to external stimuli or commands; communicates with external devices over wireless networks such as Bluetooth, cellular, or Wi-Fi; manages the modes of operation of the headset, which modes can comprise any of (a) voice-only, e.g. voice of the headset wearer, (b) video-only-with-sound, e.g. video streaming as seen through the onboard video camera and sound of the video stream, but no voice of the headset wearer, (c) video-only-without-sound, similar to (b) but without sound in the video stream, and (d) video with voice; responds to (a) onboard manual controls that are operated by the headset wearer, (b) voice commands, and (c) instructions received from remote control devices, such as but not limited to a PC or another wireless device; and is capable of video transmission for (a) storing to another device, (b) uploading to the Internet, and (c) real-time duplex audio/video communications via a cellular, wireless, direct, or LAN connected device.
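The four modes of operation listed above can be summarized in a short sketch; the enumeration values and the capture_paths helper are illustrative assumptions, not part of the specification.

```python
from enum import Enum

class Mode(Enum):
    VOICE_ONLY = "voice_only"
    VIDEO_ONLY_WITH_SOUND = "video_only_with_sound"
    VIDEO_ONLY_WITHOUT_SOUND = "video_only_without_sound"
    VIDEO_WITH_VOICE = "video_with_voice"

def capture_paths(mode: Mode) -> dict[str, bool]:
    """Which sources are enabled in each mode: the wearer's voice microphone,
    the video camera image, and the video stream's ambient sound."""
    return {
        Mode.VOICE_ONLY:               {"voice": True,  "video": False, "video_sound": False},
        Mode.VIDEO_ONLY_WITH_SOUND:    {"voice": False, "video": True,  "video_sound": True},
        Mode.VIDEO_ONLY_WITHOUT_SOUND: {"voice": False, "video": True,  "video_sound": False},
        Mode.VIDEO_WITH_VOICE:         {"voice": True,  "video": True,  "video_sound": True},
    }[mode]
```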
One form of operational control of the headset comprises the following steps and sequence:
Although in the above depiction there is a notion of a controller and a controlee, in reality the headset and its participating devices are peers in the system. The headset or any of the other devices can assume the role of controller. The control software in the headset includes an algorithm that electronically negotiates the role of the device, i.e. controller or controlee, with the participating device based on that device's capabilities.
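A minimal sketch of one possible negotiation algorithm, assuming a hypothetical capability score based on display, keypad and battery level; the actual capability metric is not specified in this disclosure.

```python
from dataclasses import dataclass

@dataclass
class Peer:
    device_id: str
    has_display: bool
    has_keypad: bool
    battery_pct: float

def capability_score(p: Peer) -> float:
    """Illustrative weighting: a richer user interface and more remaining
    charge make a device a better controller."""
    return 2.0 * p.has_display + 1.0 * p.has_keypad + p.battery_pct / 100.0

def negotiate_controller(local: Peer, remote: Peer) -> str:
    """Return the device_id of the peer that assumes the controller role;
    the other becomes the controlee. Ties break on device_id so both
    sides reach the same answer."""
    a, b = capability_score(local), capability_score(remote)
    if a != b:
        return local.device_id if a > b else remote.device_id
    return min(local.device_id, remote.device_id)
```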
Having described the sequence of steps for operational control, control actions available in one embodiment are grouped in the following high-level categories:
Further, an embodiment of the invention provides for communication amongst the headset, a hand-held device (typically a cellular phone), and external entities. The available communication protocols include, for example, Bluetooth, Wi-Fi, and cellular. Protocol selection is made, for example, based upon such factors as optimum power consumption, desired audio or video quality, available bit-rate, and the type of data transmitted.
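A minimal sketch of such protocol selection; the per-protocol power, bit-rate and range figures are illustrative placeholders rather than measured values.

```python
# Illustrative properties only; actual figures depend on hardware and network.
PROTOCOLS = {
    "bluetooth": {"power_mw": 50,  "max_kbps": 700,   "range_m": 10},
    "wifi":      {"power_mw": 400, "max_kbps": 20000, "range_m": 50},
    "cellular":  {"power_mw": 800, "max_kbps": 5000,  "range_m": 10_000},
}

def select_protocol(required_kbps: float, distance_m: float) -> str:
    """Choose the lowest-power protocol that satisfies the bit-rate needed
    by the data type and the distance to the peer."""
    usable = [
        (name, props) for name, props in PROTOCOLS.items()
        if props["max_kbps"] >= required_kbps and props["range_m"] >= distance_m
    ]
    if not usable:
        raise RuntimeError("no protocol satisfies the requested stream")
    return min(usable, key=lambda item: item[1]["power_mw"])[0]

print(select_protocol(required_kbps=64, distance_m=5))     # voice -> "bluetooth"
print(select_protocol(required_kbps=4000, distance_m=30))  # video -> "wifi"
```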
Embodiments of the invention relate to a hands-free wireless video camera telephone headset, which comprises a real-time two-way and multi-feed voice, data, and audio-video communications device that can operate as a standalone, long range cellular communications terminal and/or as a short-range, paired wireless accessory to a mobile phone, PC, TV, video game console, other wireless device or to a wired Internet terminal.
Embodiments of the invention also provide manual and/or automated communications protocol selection for device power optimization, paired and/or independent wireless voice, data, and audio-video communications, streaming, networking, recording, archiving, storage, device and caller authorization and security, remote wireless video camera and device control, and a remote viewfinder via any mobile phone, cellular, Bluetooth, and/or Wi-Fi enabled terminal.
The inventions now will be described more fully hereinafter with reference to the accompanying drawings, in which some examples of the embodiments of the inventions are shown. Indeed, these inventions may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided by way of example so that this disclosure will satisfy applicable legal requirements. When appropriate, like reference numerals and characters may be used to designate identical, corresponding or similar components in differing figure drawings.
An embodiment of the invention comprises a headset 10, 12, 15, 17, 20, 25 earpiece Headphone module 30, removable/rechargeable power accessories 100, multimedia glasses accessory 120 and headset docking station 150. An exemplary embodiment of the headset comprises the following components:
The headset power accessories include numerous embodiments of removable/rechargeable batteries. A docking station 150 comprises a seat 153 for the headset for storage, battery charging, host connection 111, or any other operational purpose. The docking station capabilities include:
After the headset is awakened as a result of one of the actions listed above, the headset is ready to perform one or more of the following actions:
Record/Capture
The following paragraphs explain the system controls in detail.
Intrinsic System Controls
These are the controls that operate inside the headset as a part of the headset operating system. These controls are essential to the proper functioning of the headset. These controls are activated as soon as power is switched on and run continuously until the headset is switched off. The major subsystems of the intrinsic controls are:
Power systems—storage, management
Memory management
Camera controls
Wireless Activation/Listen/Streaming/Networking
Audio Systems & Volume Controls
Security Controls
These controls 50 and 31 are operated by the user by pressing the onboard buttons and switches. These controls include:
Camera Control Buttons 50, Switches and Mount 87
Earpiece Control Buttons & Switches 31
Phone Control Buttons 55, Switches & Display 73
These controls include all non-button or physical array manual and semi-automated hands-free user control systems for the device, communications and/or camera operations, including:
Voice Command
Motion Gesture Recognition
Optical Scan and Eye and Facial Recognition
Laser Rangefinder and automated optical and/or digital zoom
Light Sensor for day and near infrared night vision camera
Motion Sensor and accelerometer
Paired Device Controls
These controls are similar to manual control. However, the controls are operated using a paired device. These controls include all of the manual controls listed above. In addition, the paired device's screen acts as a remote viewfinder for the control operation.
Headset Operational Systems
The headset supports multiple wireless communication protocols. These include cellular, WiMax or Wi-Fi, and Bluetooth. The communications protocols subsystem manages all incoming and outgoing communication for the headset. The headset will participate in a network infrastructure (client-server) or ad-hoc (peer-to-peer) arrangement. Attributes of a Mobile Ad-hoc Network (MANET) allow network connected units to share and forward data when operating in a mesh. Internet telephony is the conduit for the multimedia data streams operating on wideband IP networks. The TCP/IP stack will be IPv6 and IPSec compliant. The TCP/IP stack provides IP sockets for duplex inter-process communication amongst processes or threads for the delivery of multimedia data and control messages. The Session Initiation Protocol (SIP) will be used for multimedia session creation, with the Real-time Transport Protocol (RTP) for voice and video stream distribution.
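As a non-authoritative illustration of the transport just described, the following sketch packs a minimal RTP header per RFC 3550 and sends it over an IPv6 UDP socket; the peer address and payload type are placeholders, and SIP session setup is not shown.

```python
import socket
import struct

def rtp_packet(seq: int, timestamp: int, ssrc: int, payload: bytes,
               payload_type: int = 96) -> bytes:
    """Build a minimal RTP packet (RFC 3550): version 2, no padding,
    no extension, no CSRCs, marker bit clear."""
    header = struct.pack("!BBHII",
                         0x80,                 # V=2, P=0, X=0, CC=0
                         payload_type & 0x7F,  # M=0, PT
                         seq & 0xFFFF,
                         timestamp & 0xFFFFFFFF,
                         ssrc & 0xFFFFFFFF)
    return header + payload

# Placeholder peer address; in the headset this would come from SIP/SDP.
sock = socket.socket(socket.AF_INET6, socket.SOCK_DGRAM)
sock.sendto(rtp_packet(seq=1, timestamp=0, ssrc=0x1234, payload=b"frame"),
            ("2001:db8::1", 5004))
```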
Call Type Management Subsystem
Device Security/ID Subsystem
Power Management Subsystem
The headset is powered by at least two sources: removable and rechargeable batteries, and grid power when the headset is seated in the docking station. The power management subsystem manages the power consumption of the headset. It makes operational decisions, such as what communication protocol to use, whether to use the camera light, recording options, or any other action that involves power consumption. These decisions optimize power consumption.
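A minimal sketch of how the power management subsystem might gate power-hungry options; the thresholds are illustrative assumptions.

```python
def allowed_features(battery_pct: float, on_dock_power: bool) -> dict[str, bool]:
    """Gate power-hungry options by remaining charge; grid power from the
    docking station lifts all restrictions. Thresholds are illustrative."""
    if on_dock_power:
        return {"camera_light": True, "hd_recording": True, "wifi_streaming": True}
    return {
        "camera_light": battery_pct > 40,
        "hd_recording": battery_pct > 25,
        "wifi_streaming": battery_pct > 15,
    }

print(allowed_features(battery_pct=30.0, on_dock_power=False))
# {'camera_light': False, 'hd_recording': True, 'wifi_streaming': True}
```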
Device Operations Subsystem
The device operations subsystem manages the behavior of the headset itself. It is further subdivided into lower level of subsystems such as camera controls, memory management, manual controls, hands-free controls, video codec control, user display output, and audio controls.
Events/Context-Driven Subsystem
This subsystem responds to external events or the external context of the device. For example, the headset may switch itself on or off when in a specific location or at a specific time of day; the camera may begin recording when the motion sensor is activated and/or begin broadcasting live or recorded audio-video content to the Internet or directly to one or more authorized networked devices; the flashlight, laser sight and/or night-vision sensor may be activated automatically if the quality of the light degrades; etc.
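A minimal sketch of such event- and context-driven behavior; the light threshold, quiet-hours window and action names are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Context:
    lux: float            # ambient light level
    motion_detected: bool
    hour: int             # local time, 0-23

def react(ctx: Context) -> list[str]:
    """Map external context to headset actions; thresholds and the
    quiet-hours window are illustrative assumptions."""
    actions = []
    if ctx.motion_detected:
        actions.append("start_recording")
        actions.append("broadcast_to_authorized_devices")
    if ctx.lux < 10.0:
        actions.append("enable_night_vision")
    if 1 <= ctx.hour < 5:
        actions.append("enter_standby")
    return actions

print(react(Context(lux=3.0, motion_detected=True, hour=22)))
# ['start_recording', 'broadcast_to_authorized_devices', 'enable_night_vision']
```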
Modes of Communication
Although the invention is described herein with reference to the preferred embodiment, one skilled in the art will readily appreciate that other applications may be substituted for those set forth herein without departing from the spirit and scope of the present invention. Accordingly, the invention should only be limited by the Claims included below.
This application is a Continuation of U.S. application Ser. No. 12/714,693, filed 1 Mar. 2010, which was issued as U.S. Pat. No. 8,902,315 on 2 Dec. 2014, which is a Continuation of PCT Patent Application No. PCT/US10/25603, filed 26 Feb. 2010, which claims priority to U.S. Provisional Application Ser. No. 61/208,783, filed 27 Feb. 2009, and to U.S. Provisional Application Ser. No. 61/270,221, filed 6 Jul. 2009, each of which is incorporated herein in its entirety by this reference thereto.
Number | Name | Date | Kind |
---|---|---|---|
4484029 | Kenney et al. | Nov 1984 | A |
5457751 | Such | Oct 1995 | A |
5751341 | McKinley et al. | May 1998 | A |
5886735 | Bullister et al. | Mar 1999 | A |
6078825 | Hahn et al. | Jun 2000 | A |
6101038 | Hebert et al. | Aug 2000 | A |
6406811 | Hall et al. | Jun 2002 | B1 |
6407673 | Lane | Jun 2002 | B1 |
6510325 | Mack, II | Jan 2003 | B1 |
6563532 | Strub et al. | May 2003 | B1 |
6769767 | Swab et al. | Aug 2004 | B2 |
6868284 | Bae | Mar 2005 | B2 |
6952617 | Kumar | Oct 2005 | B1 |
D525962 | Elson | Aug 2006 | S |
7130654 | Cho | Oct 2006 | B2 |
7321783 | Kim | Jan 2008 | B2 |
8483754 | Rao et al. | Jul 2013 | B2 |
8784206 | Loose et al. | Jul 2014 | B1 |
8872941 | Sako et al. | Oct 2014 | B2 |
9317124 | Kongqiao | Apr 2016 | B2 |
20020044152 | Abbott, III et al. | Apr 2002 | A1 |
20020109579 | Pollard et al. | Aug 2002 | A1 |
20020198685 | Mann | Dec 2002 | A1 |
20030156208 | Obradovich | Aug 2003 | A1 |
20040215958 | Ellis et al. | Oct 2004 | A1 |
20050136949 | Barnes, Jr. | Jun 2005 | A1 |
20050278446 | Bryant | Dec 2005 | A1 |
20060025074 | Liang et al. | Feb 2006 | A1 |
20060039574 | Chen | Feb 2006 | A1 |
20060229012 | Tsai et al. | Oct 2006 | A1 |
20060244727 | Salman et al. | Nov 2006 | A1 |
20070040889 | Sahashi | Feb 2007 | A1 |
20070054697 | Money et al. | Mar 2007 | A1 |
20070072649 | Park | Mar 2007 | A1 |
20070118426 | Barnes, Jr. | May 2007 | A1 |
20070165875 | Rezvani et al. | Jul 2007 | A1 |
20070173266 | Barnes | Jul 2007 | A1 |
20070182812 | Ritchey | Aug 2007 | A1 |
20070202934 | Kim | Aug 2007 | A1 |
20070206829 | Weinans et al. | Sep 2007 | A1 |
20070249932 | Shahinian et al. | Oct 2007 | A1 |
20070260797 | Chen | Nov 2007 | A1 |
20080039072 | Bloebaum | Feb 2008 | A1 |
20080104018 | Xia | May 2008 | A1 |
20080132168 | Segev et al. | Jun 2008 | A1 |
20080133227 | Kong et al. | Jun 2008 | A1 |
20080259045 | Kim et al. | Oct 2008 | A1 |
20080284899 | Haubmann et al. | Nov 2008 | A1 |
20090221229 | Baumgartner | Sep 2009 | A1 |
20090268921 | Tang | Oct 2009 | A1 |
20090323975 | Groesch | Dec 2009 | A1 |
20100039493 | Chao et al. | Feb 2010 | A1 |
20100304783 | Logan et al. | Dec 2010 | A1 |
20110125063 | Shalon | May 2011 | A1 |
20130044992 | Boland et al. | Feb 2013 | A1 |
Number | Date | Country |
---|---|---|
2237939 | Aug 1998 | CA |
2688004 | Mar 2005 | CN |
197 15 321 | Oct 1998 | DE |
19841262 | Mar 2000 | DE |
101 06 072 | Aug 2002 | DE |
1126434 | Aug 2001 | EP |
1386490 | Feb 2004 | EP |
1391111 | Feb 2004 | EP |
1422639 | May 2004 | EP |
1445938 | Aug 2004 | EP |
1617361 | Jan 2006 | EP |
1711006 | Oct 2006 | EP |
1926051 | May 2008 | EP |
2003802 | Dec 2008 | EP |
H06141308 | May 1994 | JP |
H1173273 | Mar 1999 | JP |
2002152371 | May 2002 | JP |
2002300238 | Oct 2002 | JP |
2006148842 | Jun 2004 | JP |
2004206707 | Jul 2004 | JP |
2006086748 | Mar 2006 | JP |
2006146542 | Jun 2006 | JP |
2006186904 | Jul 2006 | JP |
2006332970 | Dec 2006 | JP |
2006352540 | Dec 2006 | JP |
2007103091 | Apr 2007 | JP |
2007310815 | Nov 2007 | JP |
200828552 | Feb 2008 | JP |
2008536443 | Sep 2008 | JP |
2009021914 | Jan 2009 | JP |
200927489 | Feb 2009 | JP |
2009033765 | Feb 2009 | JP |
2009232133 | Oct 2009 | JP |
03024006 | Mar 2003 | WO |
03088533 | Oct 2003 | WO |
2006110109 | Oct 2006 | WO |
2008027447 | Mar 2008 | WO |
2008045453 | Apr 2008 | WO |
2008060442 | May 2008 | WO |
2008091485 | Jul 2008 | WO |
2008156596 | Dec 2008 | WO |
2009017797 | Feb 2009 | WO |
2009052618 | Apr 2009 | WO |
2009062153 | May 2009 | WO |
2009094591 | Jul 2009 | WO |
2009105539 | Aug 2009 | WO |
Entry |
---|
“Wearable Cameras—Lowest Prices & Best Deals on Wearable Cameras”, Pronto.com, Retrieved from website on Sep. 28, 2009: http://www.pronto.com/user/search.do?displayQuery=wearable%20car . . . reativeid?%7Bcreative%7D&site=%7Bplacement%7D&loadingComplete=true. |
“Osiris tunable imager and spectrograph instrument status”, CEPA, RevMex AA (Serie de Conferencias), vol. 16, retrieved from the Internet on Jun. 16, 2010: http://www.astroscu.unam.mx/rmaa/RMxAC..16/DF/RMxAC..16—session1/cepanew/cepa3.pdf, 2003, pp. 13-18. |
“Sunglasses Camera: all in one video system built into a pair of sunglasses”, Http://www.eaprotection.com/1-3-mp-sunglasses-camcorder.aspx, Retrieved from website on Sep. 28, 2009. |
“Wearable Video Camera”, http://igargoyle.com/archives/2006/08/wearable—video—camera.html, Retrieved from website on Sep. 28, 2009, Aug. 2006. |
“Wearable video cameras”, Http://www.bing.com/shopping/search?q=wearable+camera&go=&form=QBRE, Retrieved from website on Sep. 28, 2009. |
“Wearable video cameras”, CES Report posted online Jan. 10, 2009, retrieved from website on Sep. 28, 2009: http://www.ces-show.com/0320/vievu/video/wearable—video-cameras/, Jan. 10, 2009. |
Bryon, Linda , “Police Testing Wearable Video Cameras”, King 5 News, retrieved from website on Sep. 28, 2009: http://www.king5.com/localnews/sotries/NW—090908WAB—lapel—cameras—KS.5d62e424.html, Sep. 9, 2008. |
Chartrand, S. , “Patent for No-Fuss video Camera”, New York Times, Retrieved from the Technology section of the website on Sep. 28, 2009: http://www.nytimes.com/2003/08/25/technology/26PATE.html?8hpib, Aug. 2003, 1-3. |
“Blue-tooth enabled phone can act as wireless web cam”, IBN Live. retrieved online on Mar. 25, 2010 from: http://ibnlive.in.com/news/bluetooth-enabled-phone-can-act-as-wireless-web-cam/45790-11.html, Jul. 28, 2007, pp. 1-3. |
Number | Date | Country | |
---|---|---|---|
20150085059 A1 | Mar 2015 | US |
Number | Date | Country | |
---|---|---|---|
61208783 | Feb 2009 | US | |
61270221 | Jul 2009 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 12714693 | Mar 2010 | US |
Child | 14557072 | US | |
Parent | PCT/US2010/025603 | Feb 2010 | US |
Child | 12714693 | US |