The present disclosure relates to wearable cameras, and in particular to wearable cameras with gesture-based control.
Cameras are used to capture images or video in a variety of circumstances. As a specific type of camera, wearable cameras are typically worn on a user's body, e.g. at her/his temple, shoulder or chest, to capture an image or video of a scene or event, e.g. in front of the user.
For wearable cameras, it is desirable to minimize size in order to facilitate portability. Wearable cameras therefore generally have a small form factor, which increases the need to reduce the number of physical buttons on the wearable camera and the need for gesture-based control.
Furthermore, in many circumstances, it is desirable to capture an image or a video with minimal intervention and disturbance.
Therefore, a need exists for a wearable camera with gesture-based control to capture an image or video.
Embodiments are presented herein of, inter alia, a wearable camera with gesture-based control.
In an embodiment of the present disclosure, a wearable camera is provided comprising: a housing; and a camera lens supported by the housing and configured to capture an image or video. When the wearable camera is worn on a user's body, the camera lens is configured to monitor the forward proximity of the user's upper body in order to capture or sense a hand gesture by the user, and is configured to transition to capture the image or video in response to the hand gesture.
The various preferred embodiments of the present invention described herein can be better understood by those skilled in the art when the following detailed description is read with reference to the accompanying drawings. The components in the figures are not necessarily drawn to scale and any reference numeral identifying an element in one drawing will represent the same element throughout the drawings. The figures of the drawing are briefly described as follows.
While the features described herein are susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and are herein described in detail. It should be understood, however, that the drawings and detailed description thereto are not intended to be limiting to the particular form disclosed, but on the contrary, the intention is to cover all modifications, equivalents and alternatives falling within the spirit and scope of the subject matter as defined by the appended claims.
The camera lens 102 is movable in its capturing direction so that the wearable camera 100 can capture an image or video in different directions. As an example, the camera lens 102 is capable of swiveling in the housing 110 so as to change where it points in order to obtain a desired perspective. As in conventional cameras, the camera lens 102 is capable of zooming in and out in order to adjust the size of the field of view of the camera 100.
The communication module 206 communicates with an external electronic device by means of a wireless communication protocol such as Bluetooth, Near-field Communication (NFC), Wi-Fi, Ultra Wideband (UWB), or another wireless communication protocol. With the communication module 206, the image or video captured by the wearable camera 200 may be transferred to an external electronic device, e.g. for storage or processing.
To allow for a small form factor and/or convenient manipulation, the wearable camera 200 is configured to be manipulated or activated with a hand gesture by the user, according to an embodiment of the present disclosure. To this end, when the wearable camera 200 is worn on a user's body, the camera lens 202 is configured to monitor the forward proximity of the user's upper body, e.g. in front of the user's eyes or chest, in order to capture or sense a hand gesture by the user, in an embodiment of the present disclosure.
As an example, the camera lens 202 may be configured to continuously monitor the forward proximity of the user's upper body in order to capture or sense a hand gesture by the user, when the wearable camera 200 is worn on a user's body. As another example, the camera lens 202 may be normally in standby or sleep mode, and be configured to be woken up by an input from a user (e.g. a tap) and to then monitor the forward proximity of the user's upper body in order to capture or sense a hand gesture by the user, when the wearable camera 200 is worn on a user's body.
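The standby-then-monitor behavior described above can be sketched as a small state machine. The following is a minimal illustrative sketch, not an implementation from the disclosure; the class and method names (`GestureController`, `on_tap`, `on_frame`) and the string gesture labels are all hypothetical:

```python
from enum import Enum, auto

class CameraState(Enum):
    SLEEP = auto()       # low-power standby; lens not monitoring
    MONITORING = auto()  # lens watches the region in front of the user
    CAPTURING = auto()   # an image or video capture is in progress

class GestureController:
    """Hypothetical controller for the wake-then-monitor behavior."""

    def __init__(self):
        self.state = CameraState.SLEEP

    def on_tap(self):
        # A user input such as a tap wakes the camera from standby.
        if self.state is CameraState.SLEEP:
            self.state = CameraState.MONITORING

    def on_frame(self, gesture):
        # Called per frame while monitoring; `gesture` is the (assumed)
        # classifier output for that frame, or None if nothing detected.
        if self.state is CameraState.MONITORING and gesture in ("image", "video"):
            self.state = CameraState.CAPTURING
            return f"start {gesture} capture"
        return None
```

In the continuous-monitoring variant, the controller would simply start in `MONITORING` instead of `SLEEP`.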
In order to activate the wearable camera 200 to capture an image or video, the user who wears the wearable camera 200 provides a hand gesture. As an example, the user stretches out one or both of her/his arms, positions one or both of her/his hands in front of her/his body, e.g. upper body, and makes a hand gesture.
In one aspect, in order to activate the wearable camera 200 to capture an image, the user who wears the wearable camera 200 provides an image capturing hand gesture, e.g. provides a frame virtually with two hands as illustrated in
On the other hand, in order to activate the wearable camera 200 to capture a video, the user who wears the wearable camera 200 provides a video capturing hand gesture, e.g. swipes one of her/his fingers in the air with the start 401 and end 403 of the swipe defining the field of view to be captured in the video as illustrated in
It is to be noted that, the image and video capturing hand gestures as illustrated in
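Both example gestures delimit the desired field of view with two points: the diagonally opposite corners of the two-handed frame, or the start 401 and end 403 of the swipe. One hypothetical way to turn such a pair of points into a crop rectangle in the sensor frame (the helper name and coordinate convention are assumptions for illustration):

```python
def field_of_view_from_points(p_start, p_end, frame_w, frame_h):
    """Derive a crop rectangle (left, top, width, height) in pixels
    from two gesture points given in frame coordinates.

    The two points are treated as diagonally opposite corners, so the
    same helper serves a two-handed "frame" gesture or the start and
    end points of a swipe.
    """
    (x0, y0), (x1, y1) = p_start, p_end
    left, right = sorted((x0, x1))
    top, bottom = sorted((y0, y1))
    # Clamp to the sensor frame so the crop never leaves the image.
    left, top = max(0, left), max(0, top)
    right, bottom = min(frame_w, right), min(frame_h, bottom)
    return (left, top, right - left, bottom - top)
```
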
It is understood that the user provides her/his hand gesture in relation to her/his vision or eyes; that is, the field of view defined by the user's hand gesture is chosen by the user in relation to her/his vision or eyes, i.e. based on the scene seen by the user in that field of view. However, due to the location offset between the camera and the user's eyes, the wearable camera cannot capture the same scene as that seen by the user when the camera lens is simply directed towards the user's hand gesture for capturing.
As an example,
Accordingly, when capturing an image or video based on the hand gesture, the wearable camera needs to change perspective based on the locations of the camera, the user's eyes, and the hand gesture, so as to capture the image or video that the user actually intends, i.e. the same image or video as that seen by the user in the field of view defined by the hand gesture.
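One simple way to sketch such a perspective change is to re-aim the camera's optical axis at the scene point the user's gesture designates, compensating for the eye-to-camera offset. This is an illustrative geometric sketch only, not the disclosure's method; the coordinate convention (x right, y up, z forward, meters) and the function name are assumptions:

```python
import math

def aim_correction(eye_to_camera_offset, target_point):
    """Pan/tilt angles (radians) that point the camera at the scene
    point the user is looking at.

    eye_to_camera_offset: (dx, dy, dz) from the user's eyes to the
        camera, e.g. (0, -0.3, 0) for a chest-worn camera 0.3 m
        below the eyes.
    target_point: (x, y, z) of the gestured-at scene point, relative
        to the user's eyes, same convention.
    """
    ex, ey, ez = eye_to_camera_offset
    tx, ty, tz = target_point
    # Vector from the camera's position to the target point.
    vx, vy, vz = tx - ex, ty - ey, tz - ez
    pan = math.atan2(vx, vz)                    # left/right rotation
    tilt = math.atan2(vy, math.hypot(vx, vz))   # up/down rotation
    return pan, tilt
```

Note that this simple re-aiming assumes the target depth is known or estimated; a fuller treatment would warp the captured image with a perspective transformation rather than only rotating the lens.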
Referring back to
It is understood that viewing from different angles may result in parallax error, i.e. a perceived or apparent shift. In order to obtain an accurate image or video, the wearable camera 200 conducts image adjustment or processing to reduce or eliminate the parallax error, in an embodiment of the present disclosure. It is to be noted that the image adjustment or processing for reducing or eliminating parallax error may be implemented in the perspective transformation unit 208 as described above or by another unit.
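The magnitude of the apparent shift follows the standard pinhole-camera disparity relation, shift = f · b / Z, where f is the focal length in pixels, b the eye-to-camera baseline, and Z the scene depth. A minimal sketch of this textbook relation (the function name is hypothetical; the disclosure does not prescribe this particular correction):

```python
def parallax_shift_pixels(baseline_m, depth_m, focal_px):
    """Apparent image shift, in pixels, of a point at depth `depth_m`
    caused by a `baseline_m` offset between the user's eyes and the
    camera, under the pinhole model: shift = f * b / Z.

    The correction could translate the captured image by this amount
    (per depth) to reduce the parallax error described above.
    """
    if depth_m <= 0:
        raise ValueError("depth must be positive")
    return focal_px * baseline_m / depth_m
```

As the relation shows, the error shrinks with scene depth, so distant scenes need little correction while nearby subjects need the most.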
Although the embodiments above have been described in considerable detail, numerous variations and modifications will become apparent to those skilled in the art once the above disclosure is fully appreciated. It is intended that the following claims be interpreted to embrace all such variations and modifications.
Number | Date | Country
---|---|---
63342706 | May 2022 | US