Physical buttons are not always an appealing addition to hardware such as a camera. This is particularly true when the hardware has a small form factor. Additional physical buttons can crowd the device and lead to an unaesthetic appearance. Further, crowded buttons increase the likelihood that a user will inadvertently press the wrong button.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter.
Various embodiments provide a wearable camera that can be worn by a user. In one or more embodiments, the wearable camera can include a non-touch switch that can be utilized to access and activate various camera functionality.
The detailed description references the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items.
Overview
Various embodiments provide a wearable camera that can be worn by a user. In one or more embodiments, the wearable camera can include a non-touch switch that can be utilized to access and activate various camera functionality.
The camera can be worn in any suitable location. For example, the camera can be worn on a user's head such as, by way of example and not limitation, a hat-mounted camera, glasses-mounted camera, headband-mounted camera, helmet-mounted camera, and the like. Alternately or additionally, the camera can be worn on locations other than the user's head. For example, the camera can be configured to be mounted on the user's clothing.
Various other embodiments provide a wearable camera that is mountable on a user's clothing. The camera is designed to be unobtrusive and user-friendly insofar as it is mounted away from the user's face so as not to interfere with their view. In at least some embodiments, the camera includes a housing and a clip mounted to the housing to enable the camera to be clipped onto the user's clothing. The camera is designed to be lightweight, with its weight balanced toward the user when clipped to the user's clothing.
In one or more embodiments, the camera includes a replay mode. When the replay mode is selected, the camera automatically captures image data, such as video or still images, and saves the image data to a memory buffer. In at least some embodiments, the size of the memory buffer can be set by the user to determine how much image data is to be collected. Once the memory buffer is full, the oldest image data is erased to make room for currently-captured image data. If an event occurs that the user wishes to memorialize through video or still images, a record button can be activated which saves the image data from the beginning of the memory buffer and continues recording until the user presses the record button again. Alternately, in embodiments that include the non-touch switch, such switch can be used to cause the image data to be saved from the beginning of the memory buffer and continue recording until the non-touch switch is again engaged. In this manner, if an event occurs, the user is assured of capturing the event from a time t−x, where x is the length, in time, of the memory buffer.
In the discussion that follows, a section entitled “Example Environment” describes an example environment in which the various embodiments can be utilized. Next, a section entitled “Replay Functionality” describes an example replay mode in accordance with one or more embodiments. Following this, a section entitled “Camera Non-touch Switch” describes a camera switch with non-touch activation features in accordance with one or more embodiments. Next, a section entitled “Dual Encoding” describes an embodiment in which captured image data can be dual encoded in accordance with one or more embodiments. Next, a section entitled “Photo Log” describes an example photo log in accordance with one or more embodiments.
Consider now an example environment in which various embodiments can be practiced.
Example Environment
The sensor 104 can serve as a non-touch switch to enable activation of one or more camera device features or functionalities, as will become apparent below. Alternately or additionally, the camera device can include a separate non-touch switch 105 to enable activation of one or more camera device features or functionalities as will become apparent below.
It should be appreciated that the camera device 100 may include other components, such as a battery or power source and other processor components that are required for a processor to operate. However, to avoid obscuring the teachings, these well-known components are omitted. In one embodiment, the camera device 100 does not include a view finder or a preview display. In other embodiments, however, a preview display may be provided. The techniques described herein can be used in any type of camera, and are particularly effective in small, highly portable cameras, such as those implemented in mobile telephones and other portable user equipment. Thus, in one embodiment, the camera device 100 includes hardware or software for making and receiving phone calls. Alternately, the camera device 100 can be a dedicated, stand-alone camera.
In at least some embodiments, the camera device 100 further includes a motion detector 108 that can include an accelerometer and, in some embodiments, a gyroscope. The accelerometer is used for determining the direction of gravity and acceleration in any direction. The gyroscope may be used either in addition to the accelerometer or instead of it. The gyroscope can provide information about how the rotational angle of the camera device 100 changes over time. If the camera device 100 is rotated, its angle of rotation may be calculated from this rotational-angle information. Any other type of sensor may be used to detect the camera's motion.
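As one illustration of the gyroscope-based approach, the rotation angle can be estimated by integrating angular-rate samples over time. The following is a minimal sketch under stated assumptions: the `gyro_read` driver call is hypothetical and returns an angular rate in degrees per second, and the fixed-step timing is simplified relative to real firmware.

```python
import time

def integrate_rotation(gyro_read, duration_s, dt=0.01):
    """Estimate the camera's rotation angle by integrating gyroscope
    angular-rate samples: theta is approximately the sum of omega * dt.
    gyro_read is a hypothetical driver callable (degrees/second)."""
    angle_deg = 0.0
    t_end = time.monotonic() + duration_s
    while time.monotonic() < t_end:
        omega_dps = gyro_read()      # sample the angular rate
        angle_deg += omega_dps * dt  # accumulate rotation over this step
        time.sleep(dt)               # fixed-step sampling (simplified)
    return angle_deg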
Further included is an input/output (I/O) port 114 for connecting the camera device 100 to an external device, including a general purpose computer. The I/O port 114 may be used for enabling the external device to configure the camera device 100 or to upload/download data. In one embodiment, the I/O port 114 may also be used for streaming video or pictures from the camera device 100 to the external device. In one embodiment, the I/O port may also be used for powering the camera device 100 or charging a rechargeable battery (not shown) in the camera device 100.
The camera device 100 may also include an antenna 118 that is coupled to a transmitter/receiver (Tx/Rx) module 116. The Tx/Rx module 116 is coupled to a processor 106. The antenna 118 may be fully or partly exposed outside the body of the camera device 100. However, in another embodiment, the antenna 118 may be fully encapsulated within the body of the camera device 100. The Tx/Rx module 116 may be configured for Wi-Fi transmission/reception, Bluetooth transmission/reception, or both. In another embodiment, the Tx/Rx module 116 may be configured to use a proprietary protocol for transmission/reception of the radio signals. In yet another embodiment, any radio transmission or data transmission standard may be used so long as that standard is capable of transmitting/receiving digital data and control signals. In one embodiment, the Tx/Rx module 116 is a low power module with a transmission range of less than ten feet. In another embodiment, the Tx/Rx module 116 is a low power module with a transmission range of less than five feet. In other embodiments, the transmission range may be configurable using control signals received by the camera device 100 either via the I/O port 114 or via the antenna 118.
The camera device 100 further includes a processor 106. The processor 106 is coupled to, among other components, the sensor 104, the non-touch switch 105, and the motion detector 108. The processor 106 may also be coupled to storage 110, which, in one embodiment, is external to the processor 106. The storage 110 may be used for storing programming instructions for controlling and operating other components of the camera device 100. The storage 110 may also be used for storing captured media (e.g., pictures and/or videos). In another embodiment, the storage 110 may be a part of the processor 106 itself.
In one embodiment, the processor 106 may include an image processor 112. The image processor 112 may be a hardware component or may be a software module that is executed by the processor 106. It may be noted that the processor 106 and/or the image processor 112 may reside in different chips. For example, multiple chips may be used to implement the processor 106. In one example, the image processor 112 may be a Digital Signal Processor (DSP). The image processor can also be configured as a processing module, that is, a computer program executable by a processor. In at least some embodiments, the image processor 112 is used to process a raw image received from the sensor 104 based, at least in part, on the input received from the motion detector 108. Other components, such as an Image Signal Processor (ISP), may be used for image processing.
In one embodiment, the storage 110 is configured to store both the raw (unmodified) image and the corresponding modified image. In one or more embodiments, the storage 110 can include a memory buffer, such as a flash memory buffer, that can be used as a circular buffer to facilitate capturing image data when the camera is set to a replay mode that is supported by replay module 120. The replay module 120 can be implemented in connection with any suitable hardware, software, firmware, or combination thereof. When the replay mode is selected, the camera automatically captures image data, such as video or still images, and saves the image data to the memory buffer. In at least some embodiments, the size of the memory buffer can be set by the user to determine how much image data is to be collected. If an event occurs that the user wishes to memorialize through video or still images, in some embodiments, a record button can be activated which saves the image data from the beginning of the memory buffer and continues recording until the user presses the record button again. Alternately, in at least some embodiments, input by way of the non-touch switch, as implemented by either the sensor 104 or the non-touch switch 105, can be used to save the image data from the beginning of the memory buffer and continue recording until an additional input is received by way of the non-touch switch. In this manner, if an event occurs, the user is assured of capturing the event from a time t−x, where x is the length, in time, of the memory buffer.
A processor buffer (not shown) may also be used to store the image data. The pictures can be downloaded to the external device via the I/O port 114 or via the wireless channels using the antenna 118. In one embodiment, both unmodified and modified images are downloaded to the external device when the external device sends a command to download images from the camera device 100. In one embodiment, the camera device 100 may be configured to start capturing a series of images at a selected interval.
In one embodiment, a raw image from the sensor 104 is input to an image processor (such as an ISP) for image processing or blur detection. After image processing is applied, the modified image output by the image processor is encoded. The image encoding is typically performed to compress the image data.
In an example embodiment, the camera device 100 may not include the components for processing the image captured by the sensor 104. Instead, the camera device 100 may include programming instructions to transmit the raw image, after extracting it from the sensor 104, to a cloud-based processing system that is connected to the camera device 100 via the Internet or a local area network. The cloud-based system is configured to receive the raw image and process the image or images as described above and below. The encoded image is then either stored in a selected cloud-based storage or sent back to the camera device 100 or to any other device according to a user configuration. The use of a cloud-based image processing system can reduce the need to incorporate several image processing components in each camera device, thus making a camera device lighter, more energy-efficient, and cheaper.
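To make the offload path concrete, the following is a minimal sketch of how a camera might transmit a raw image to a cloud processing service over HTTP. The endpoint URL, query parameter, and return convention are all hypothetical; the embodiments described here do not prescribe a particular transport or protocol.

```python
import requests  # assumes the third-party 'requests' package is available

CLOUD_ENDPOINT = "https://example.com/process"  # hypothetical service URL

def offload_raw_image(raw_bytes, return_to_camera=True):
    """Send a raw sensor image to a cloud service for processing and
    encoding. Depending on configuration, the encoded result is either
    returned to the camera or kept in cloud storage."""
    resp = requests.post(
        CLOUD_ENDPOINT,
        data=raw_bytes,
        headers={"Content-Type": "application/octet-stream"},
        params={"result": "return" if return_to_camera else "store"},
    )
    resp.raise_for_status()
    return resp.content if return_to_camera else None
```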
In another example embodiment, instead of a cloud based image processing, the camera device 100 may send either a raw image or the image processed through an image processor to another device, e.g., a mobile phone or a computer. The image may be transmitted to the mobile phone (or a computer) for further processing via Wi-Fi, Bluetooth or any other type of networking protocol that is suitable for transmitting digital data from one device to another device. After the mobile device receives the image or images, according to one or more embodiments described herein, the produced image may be saved to local storage on the device, transferred for storage in a cloud based storage system, or transmitted to another device, according to user or system configurations.
In one embodiment, the native image processing system in the camera device 100 may produce images and/or videos in a non-standard format. For example, a 1200×1500 pixel image may be produced. This may be done by cropping, scaling, or using an image sensor with a non-standard resolution. Because methods for transforming images into a selected standard resolution are well known, they are not discussed further here.
Various embodiments described above and below can be implemented utilizing a computer-readable storage medium that includes instructions that enable a processing unit to implement one or more aspects of the disclosed methods as well as a system configured to implement one or more aspects of the disclosed methods. By “computer-readable storage medium” is meant all statutory forms of media. Accordingly, non-statutory forms of media such as carrier waves and signals per se are not intended to be covered by the term “computer-readable storage medium”.
As noted above, camera device 100 can assume any suitable form of wearable camera. The camera can be worn in any suitable location relative to a user. For example, the camera can be worn on a user's head such as, by way of example and not limitation, a hat-mounted camera, glasses-mounted camera, headband-mounted camera, helmet-mounted camera, and the like. Alternately or additionally, the camera can be worn on locations other than the user's head. For example, the camera can be configured to be mounted on the user's clothing or other items carried by a user, such as a backpack, purse, briefcase, and the like.
In the example provided just below, a wearable camera is described in the context of a camera that is mountable on the user's clothing. It is to be appreciated and understood, however, that other types of non-clothing mountable, wearable cameras can be utilized without departing from the spirit and scope of the claimed subject matter.
In addition, camera device 200 can include a number of input buttons shown generally at 310. The input buttons can include, by way of example and not limitation, an input button to take a still picture, an input button to initiate the replay mode, an input button to initiate a video capture mode, and an input button to enable the user to adjust the buffer size that is utilized during the replay mode. In some embodiments, the input button to initiate the replay mode can be eliminated through the use of the non-touch switch as described below in more detail. It is to be appreciated and understood that the various input buttons can be located anywhere on the camera device 200.
It may be noted that even though the camera device 200 is shown to have a particular shape, the camera device 100 can be manufactured in any shape and size suitable and sufficient to accommodate the above-described components of the camera device 100. The housing 202 of the camera device may be made of a metal molding, a synthetic material molding, or a combination thereof. In other embodiments, any suitable type of material may be used to provide a durable and strong outer shell for typical portable device use.
In addition, the fastening device 300 can comprise any suitable type of fastening device. For example, the fastening device may be a simple slip-on clip, a crocodile clip, a hook, a Velcro fastener, or a magnet or a piece of metal to receive a magnet. The camera device 200 may be affixed permanently or semi-permanently to another object using the fastening device 300.
Generally, any of the functions described herein can be implemented using software, firmware, hardware (e.g., fixed logic circuitry), or a combination of these implementations. The terms “module,” “functionality,” “component” and “logic” as used herein generally represent software, firmware, hardware, or a combination thereof. In the case of a software implementation, the module, functionality, or logic represents program code that performs specified tasks when executed on a processor (e.g., CPU or CPUs). The program code can be stored in one or more computer readable memory devices. The features of the techniques described below are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.
For example, the camera device 200 may include a computer-readable medium that may be configured to maintain instructions that cause the camera's software and associated hardware to perform operations. Thus, the instructions function to configure the camera's software and associated hardware to perform the operations and in this way result in transformation of the software and associated hardware to perform functions. The instructions may be provided by the computer-readable medium to the camera device through a variety of different configurations.
One such configuration of a computer-readable medium is a signal-bearing medium and thus is configured to transmit the instructions (e.g., as a carrier wave) to the camera device, such as via a network. The computer-readable medium may also be configured as a computer-readable storage medium and thus is not a signal-bearing medium. Examples of a computer-readable storage medium include random-access memory (RAM), read-only memory (ROM), an optical disc, flash memory, hard disk memory, and other memory devices that may use magnetic, optical, and other techniques to store instructions and other data.
Having considered an example operating environment in accordance with one or more embodiments, consider now a discussion of replay functionality and other features that can be provided by the camera device.
Replay Functionality
As noted above, camera device 200 includes a replay mode. When the replay mode is selected, as by the user pressing an input button associated with initiating the replay mode or through the use of the non-touch switch, the camera automatically captures image data, such as video or still images, and saves the image data to a memory buffer. In one or more embodiments, the memory buffer is a circular buffer that saves an amount of image data, for example video data. When the memory buffer is full of image data, the oldest image data is deleted to make room for newly recorded image data. This continues until either the user exits the replay mode or presses a button associated with initiating video capture, i.e., the “record” button, or, in embodiments that include the non-touch switch, uses the non-touch switch to initiate video capture.
In at least some embodiments, the size of the memory buffer can be set by the user to determine how much image data is to be collected. As an example, the user might set the length of the memory buffer to correspond to 5 seconds, 30 seconds, 1 minute, 2 minutes, or longer.
Assume now that an event occurs that the user wishes to memorialize through video or still images. Assume also that the user has initiated the replay mode so that video data is currently being buffered in the memory buffer. By pressing the “record” button, the video data is now saved from the beginning of the memory buffer and recording continues until the user presses the record button again. In embodiments where the non-touch switch can be used in place of the “record” button, by activating the non-touch switch, the video data can now be saved from the beginning of the memory buffer and recording can continue until the user again activates the non-touch switch.
In this manner, if an event occurs, the user is assured of capturing the event from a time t−x, where x is the length, in time, of the memory buffer. So, for example, if the user initially set the memory buffer to capture 2 minutes' worth of video data, by pressing the “record” button or alternatively using the non-touch switch, the last 2 minutes of video data will be recorded in addition to the currently recorded video data.
In one or more embodiments, the memory buffer comprises flash memory. When the user presses the “record” button or uses the non-touch switch, and the camera device is in replay mode, a pointer is used to designate where, in flash memory, the beginning of the captured video data occurs, e.g., the beginning of the last 2 minutes of video data prior to entering the “record” mode. In other embodiments, the video data captured during replay mode and “record” mode can be written to an alternate storage location.
Step 400 receives input associated with a replay mode. This step can be performed in any suitable way. For example, in at least some embodiments, this step can be performed by receiving input from the user via a suitable input device on the camera device. Responsive to receiving the input associated with the replay mode, step 402 captures image data and saves the image data to a memory buffer. Step 404 ascertains whether the buffer is full. If the buffer is not full, the method returns to step 402 and continues to capture image data and save image data to the memory buffer. If, on the other hand, the buffer is full, step 406 deletes the oldest image data in the memory buffer and returns to step 402 to capture subsequent image data.
This process continues until the user either presses the “record” button (or, in alternate embodiments, engages the non-touch switch) or exits the replay mode.
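The following is a minimal sketch of the circular-buffer behavior just described, including the pointer that marks where the saved video begins. It assumes video arrives as fixed-size chunks; the class and member names are illustrative, not the device's actual implementation.

```python
class ReplayBuffer:
    """Fixed-size circular buffer for continuously captured video chunks.
    When recording is triggered, the oldest buffered chunk marks where
    saving begins, so the event is captured from time t - x."""

    def __init__(self, capacity_chunks):
        self._buf = [None] * capacity_chunks
        self._write = 0    # next slot to (over)write
        self._filled = 0   # number of slots holding valid data

    def push(self, chunk):
        # Overwriting the slot at self._write implicitly deletes the
        # oldest chunk once the buffer is full (steps 404/406 above).
        self._buf[self._write] = chunk
        self._write = (self._write + 1) % len(self._buf)
        self._filled = min(self._filled + 1, len(self._buf))

    def begin_record(self):
        """On a "record" or non-touch trigger, return the buffered chunks
        in chronological order, starting at the oldest chunk (the pointer
        into the buffer described above)."""
        start = (self._write - self._filled) % len(self._buf)
        return [self._buf[(start + i) % len(self._buf)]
                for i in range(self._filled)]
```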
Step 500 receives input to set a memory buffer size. This step can be performed in any suitable way. For example, in at least some embodiments, the step can be performed by receiving user input by way of a suitably-configured input mechanism such as a button on the camera device. Responsive to receiving this input, step 502 sets the memory buffer size.
Step 504 receives input associated with a replay mode. This step can be performed in any suitable way. For example, in at least some embodiments, this step can be performed by receiving input from the user via a suitable input device on the camera device. Responsive to receiving the input associated with the replay mode, step 506 captures image data and saves the image data to a memory buffer. Step 508 ascertains whether the buffer is full. If the buffer is not full, the method returns to step 506 and continues to capture image data and save image data to the memory buffer. If, on the other hand, the buffer is full, step 510 deletes the oldest image data in the memory buffer and returns to step 506 to capture subsequent image data.
This process continues until the user either presses the “record” button (or, in alternate embodiments, uses the non-touch switch) or exits the replay mode.
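Reusing the ReplayBuffer sketch above, steps 500 through 510 might be wired together as follows. The chunk granularity and the camera-side callables are hypothetical stand-ins for the device's capture, input, and storage paths.

```python
CHUNK_SECONDS = 1      # assumed granularity: one buffered chunk per second
user_choice_s = 120    # step 500: user selects a 2-minute replay window

buf = ReplayBuffer(capacity_chunks=user_choice_s // CHUNK_SECONDS)  # step 502

def run_replay_mode(buf, capture_chunk, trigger_active, save_video):
    """Buffer chunks (steps 506-510) until the record trigger fires,
    then save from the beginning of the buffer. The three callables
    are hypothetical camera-side helpers."""
    while True:
        buf.push(capture_chunk())           # capture and buffer image data
        if trigger_active():                # "record" button or non-touch switch
            save_video(buf.begin_record())  # save from time t - x onward
            break
```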
Step 600 captures image data and saves the image data to a memory buffer. The step can be performed in any suitable way. For example, the step can be performed as described above.
Having considered an example replay mode and how it can be implemented with a suitably-configured camera device, consider now aspects of a camera non-touch switch.
Camera Non-Touch Switch
In one or more embodiments, the camera device can include a non-touch switch that can be utilized to access and activate various camera functionality. In the example described just above, the non-touch switch can be utilized to access the replay mode. Alternately or additionally, the non-touch switch can be utilized to access and activate other various camera functionality. Utilization of a non-touch switch can enable the number of hardware buttons on the camera device to be reduced. This, in turn, can lead to a more simplified construction that reduces material used to construct the camera device such as input button gaskets, assembly layers utilized in connection with input buttons, and the like.
Before considering various types of non-touch switches that can be employed in connection with the above-described camera device, consider a use scenario which illustrates the utility of such a switch. Assume that a user has turned their camera device on so that it is capturing image data in the replay mode, and has attached the camera to themselves in some manner. For example, the user may have clipped the camera on their shirt or mounted it on a hat or helmet. Now, as the user progresses throughout the day, assume that an interesting event occurs that they wish to capture. Because the user is moving around, access to the camera device's hardware buttons may be challenging; instead, the user can activate the replay mode using the non-touch switch. Specifically, by interacting with the camera device's non-touch switch, the user can cause the camera device to enter the replay mode and capture the video data that has been buffered in the memory buffer, as well as the currently-captured video data, as described above.
In some instances, the non-touch switch requires less precision to activate the camera device's functionality which, in the example above, is the replay mode functionality. So, active users who are on the go need not worry about fumbling around to find a particular hardware button. Rather, through the non-touch switch as described above and below, the user can access camera functionality in a manner that demands less precision than a hardware button.
As noted above, the non-touch switch can comprise any suitably-configured switch that can be utilized to access and activate various camera functionality without necessarily relying on physical touch. That is to say, while some touch can occur when utilizing the non-touch switch, the non-touch switch does not rely on touch in order to be activated.
In at least some embodiments, the non-touch switch can be implemented by the camera device's sensor, such as sensor 104, which receives light that enters the camera. Specifically, in these embodiments, the non-touch switch can be activated by a user briefly covering the camera lens 102 to block light from entering the camera and being sensed by sensor 104. When this occurs, the camera device's processor recognizes the sudden change from light to dark as a trigger to access and activate an associated camera device functionality.
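A minimal sketch of this light-to-dark trigger appears below. The lux threshold, timing window, and the `read_lux` and `on_trigger` callables are assumptions for illustration; the embodiments only require that the processor recognize a sudden change from light to dark as a trigger.

```python
import time

DARK_THRESHOLD_LUX = 10  # assumed light level treated as "lens covered"
MAX_COVER_S = 1.0        # a brief covering gesture, not a lens cap or pocket

def poll_non_touch_switch(read_lux, on_trigger):
    """Treat a brief light-to-dark-to-light transition at the image
    sensor as a switch activation. read_lux and on_trigger are
    hypothetical sensor and callback helpers."""
    while True:
        if read_lux() < DARK_THRESHOLD_LUX:           # lens just covered
            t0 = time.monotonic()
            while read_lux() < DARK_THRESHOLD_LUX:    # wait for uncovering
                time.sleep(0.01)
            if time.monotonic() - t0 <= MAX_COVER_S:  # brief cover only
                on_trigger()                          # activate the feature
        time.sleep(0.01)
```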
In at least some other embodiments, the non-touch switch can be implemented by a separate light aperture through which light enters the camera. By briefly covering the light aperture, the non-touch switch, such as non-touch switch 105, can be activated in a manner similar to that just described for the sensor 104.
In yet other embodiments, various types of proximity sensors can be utilized to implement a non-touch switch, such as non-touch switch 105. Such proximity sensors can utilize any suitable type of technology including, by way of example and not limitation, electromagnetic field emission, electromagnetic radiation such as infrared light, capacitive sensors, inductive sensors, and the like.
Having considered an example camera non-touch switch and how it can be implemented with a suitably-configured camera device, consider now aspects of a dual encoding process.
Dual Encoding
In one or more embodiments, the camera device's processor 106 can encode captured image data at different levels of resolution. For example, the image data can be encoded at both a low level of resolution and a high level of resolution.
Encoding image data at different resolution levels can enhance the user's experience by giving the user various options to transfer the saved image data. For example, at lower resolution levels, the captured image data can be streamed to a device such as a smart phone. Alternately or additionally, at higher resolution levels, when the user has Wi-Fi accessibility, they can transfer the image data to a network device such as a laptop or desktop computer.
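As a sketch of what dual encoding could look like in practice, the snippet below transcodes one capture into a low-resolution and a high-resolution file using the ffmpeg command-line tool. The tool choice, target heights, and output names are assumptions; the embodiments do not prescribe a codec or toolchain.

```python
import subprocess

def dual_encode(raw_video_path):
    """Encode one capture twice: a low-resolution rendition suitable for
    streaming to a phone, and a high-resolution rendition for transfer
    over Wi-Fi. Assumes the ffmpeg CLI is installed."""
    for label, height in (("low", 360), ("high", 1080)):
        subprocess.run(
            ["ffmpeg", "-y", "-i", raw_video_path,
             "-vf", f"scale=-2:{height}",  # scale to the target height
             f"capture_{label}.mp4"],      # hypothetical output name
            check=True,
        )
```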
Having considered a dual encoding scenario, consider now aspects of a photo log that can be constructed using the principles described above.
Photo Log
Photo log refers to a feature that enables a user to log their day in still photos at intervals of their own choosing. So, for example, if the user wishes to photo log their day every 3 minutes, they can provide input to the camera device so that every 3 minutes the camera automatically takes a still photo and saves it. At the end of the day, the user will have documented their day with a number of different still photos.
In at least some embodiments, the photo log feature can work in concert with the replay mode described above. For example, if the user has entered the replay mode, causing image data to be captured and saved to the memory buffer, the camera device's processor can process portions of the captured video data at defined intervals to provide the still photos. This can be performed in any suitable way. For example, the camera device's processor can read predefined areas of the camera's photosensor and process the read areas into the still photos. In some instances, the photo format is a square format, so that its aspect ratio differs from the aspect ratio of the video data.
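The sketch below illustrates one way the interval-and-crop behavior might look, assuming a PIL-style image object with `width`, `height`, and `crop`; the capture and save helpers are hypothetical.

```python
import time

PHOTO_INTERVAL_S = 180  # e.g., one still every 3 minutes, per the example above

def photo_log(capture_still, save_photo, interval_s=PHOTO_INTERVAL_S):
    """Document the day in stills: capture and save one photo per interval,
    cropped to a square so its aspect ratio differs from the video's.
    capture_still and save_photo are hypothetical camera/storage helpers."""
    while True:
        frame = capture_still()          # PIL-style image object assumed
        side = min(frame.width, frame.height)
        left = (frame.width - side) // 2
        top = (frame.height - side) // 2
        save_photo(frame.crop((left, top, left + side, top + side)))
        time.sleep(interval_s)
```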
Conclusion
Various embodiments provide a wearable camera that can be worn by a user. In one or more embodiments, the wearable camera can include a non-touch switch that can be utilized to access and activate various camera functionality.
Although the embodiments have been described in language specific to structural features and/or methodological acts, it is to be understood that the various embodiments defined in the appended claims are not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the various embodiments.
This application is a continuation of and claims priority to U.S. patent application Ser. No. 13/873,069, filed on Apr. 29, 2013, which is a continuation-in-part of and claims priority to U.S. patent application Ser. No. 13/828,139, filed on Mar. 14, 2013, the disclosures of which are incorporated by reference herein.