The present invention generally relates to human interface devices and, more particularly, to apparatus and methods of enabling the control of electronic devices and/or applications.
An electronic control system may serve as a hub that communicates with and controls different electronic devices and/or applications. A user can control, via the hub, the different electronic devices and/or applications. However, the ease of controlling the electronic devices and/or applications is limited by the ease of using the hub.
As can be seen, there is a need for improved apparatus and methods of human interaction to control different electronic devices and/or applications.
In one aspect of the present invention, a human interface device (HID) is configured to communicate with a control system that can communicate with user mechanisms. The HID comprises an outer component; an inner input component radially inward of the outer component; and a central input component radially inward of the inner input component; wherein the central input component is configured to display a menu list of user mechanism icons which correspond to respective user mechanisms; wherein the inner input component is configured to select one of the user mechanism icons and thereby one of the respective user mechanisms; and wherein the outer component is configured to control the respective user mechanism if the respective user mechanism employs audio.
In another aspect of the present invention, a human interface device (HID) is configured to communicate with a control system that can communicate with user mechanisms. The HID comprises an outer component; an inner input component radially inward of the outer component; a central input component radially inward of the inner input component; and a controller configured to: receive outer user depression signals generated from user depressions of the outer component; receive inner user touch signals generated from user touches of the inner input component; and receive central user depression signals generated from user depressions of the central input component.
In a further aspect of the present invention, apparatus for controlling user mechanisms comprises a human interface device (HID) having: an outer input component configured to be depressed by a user; an inner input component radially inward of the outer component and configured to be touched but not depressed by the user; a central component radially inward of the inner input component and configured to be depressed by the user; and a microcontroller in communication with the outer component, the inner input component, and the central component; and a control system in communication with the HID, wherein the control system is configured to communicate with a plurality of user mechanisms.
These and other features, aspects and advantages of the present invention will become better understood with reference to the following drawings, description and claims.
The following detailed description is of the best currently contemplated modes of carrying out the invention. The description is not to be taken in a limiting sense, but is made merely for the purpose of illustrating the general principles of the invention, since the scope of the invention is best defined by the appended claims.
Various inventive features are described below that can each be used independently of one another or in combination with other features. However, any single inventive feature may not address any of the problems discussed above or may only address one of the problems discussed above. Further, one or more of the problems discussed above may not be fully addressed by any of the features described below.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable media having computer readable program code embodied thereon.
Any combination of one or more computer readable storage media may be utilized. A computer readable storage medium is an electronic, magnetic, optical, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium is any tangible medium that can store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer readable storage medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable storage medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
Broadly, the present invention provides apparatus and methods for controlling user mechanisms with a human interface device (HID). The user mechanisms can include electronic devices such as a computer and display screen. They can also include software applications such as Zoom™ and Skype™. The HID can communicate with a control system or hub which, in turn, can communicate with the user mechanisms.
According to the present invention, the HID may activate a display of a menu list of user mechanisms. The HID may then be used to navigate to and select a user mechanism. The HID may further be used to control the volume of an audio/video call.
Also, according to the present invention, the HID may be configured to determine whether a user touch of the HID was intentional or accidental. For example, the HID may determine whether a length of time of a user touch exceeds a threshold. If so, the HID can consider the user touch as intentional.
The HID 100 can be configured to communicate with a control system 104 which, in turn, can be configured to communicate with one or more user mechanisms 105. The user mechanisms 105 may include electronic devices such as a computer and a computer display 111. The user mechanisms 105 may also include software applications and, more specifically, access points to software applications. Some applications may include Zoom™ and Skype™.
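As a non-limiting illustration only, communication between the HID 100 and the control system 104 could be structured around a small command frame, sketched below in C; the command names, field sizes, and layout are hypothetical assumptions for illustration and are not part of any described embodiment.

#include <stdint.h>

typedef enum {
    CMD_MENU_ACTIVATE    = 1,   /* central input component depressed        */
    CMD_MENU_NAVIGATE    = 2,   /* inner input component touched             */
    CMD_MECHANISM_SELECT = 3,   /* a user mechanism icon has been selected   */
    CMD_VOLUME_ADJUST    = 4    /* selected user mechanism employs audio     */
} hid_command_t;

typedef struct {
    uint8_t command;   /* one of hid_command_t                 */
    int16_t value;     /* e.g., a menu index or a volume step  */
} hid_frame_t;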
In embodiments, a menu activation state 201 may be initiated by a user depressing the first central input portion 103a. In the menu activation state 201, the menu list 111a of user mechanism icons 111b may be shown on the computer display 111.
A menu navigation state 202 may be initiated by a user touching (but not depressing) the inner input component 102. The user touching can enable the user to navigate to and select one of the user mechanism icons 111b shown on the display 111.
A volume control state 204 may be initiated by a user touching (but not depressing) the inner input component 102. The user touching can enable the user to increase or decrease an audio volume of a user mechanism 105, such as Skype™. In embodiments, the touching can be a continuous touching with movement, such as at different points of the inner input component 102. For example, the continuous touching with movement may be a circular touching by the user's finger around the inner input component 102. In certain embodiments, the continuous touching with movement may be for a period of time that is at least a volume time threshold, as determined by the microcontroller 106. In still other embodiments, the continuous touching with movement may be an amount of angular distance that exceeds a volume angular distance threshold, as determined by the microcontroller 106.
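Purely by way of example, the following C sketch shows one way firmware might accumulate the angular movement of a continuous touch in the volume control state 204 and emit volume steps each time a volume angular distance threshold is crossed; the threshold value, the function names, and the mapping of clockwise movement to volume-up are assumptions for illustration rather than a required implementation.

#include <math.h>

/* Illustrative value only; the description leaves the exact volume angular
 * distance threshold open. */
#define VOLUME_ANGLE_THRESHOLD_DEG 30.0f

static float accumulated_deg = 0.0f;

/* Called for each touch sample while the volume control state 204 is active.
 * delta_deg is the signed angular movement since the previous sample
 * (assumed positive for clockwise movement, mapped here to volume up). */
void volume_state_on_touch(float delta_deg, void (*send_volume_step)(int step))
{
    accumulated_deg += delta_deg;

    /* Each time a full threshold of angular travel is covered, emit one
     * volume step and keep the remainder so continued movement keeps
     * adjusting the volume. */
    while (fabsf(accumulated_deg) >= VOLUME_ANGLE_THRESHOLD_DEG) {
        int step = (accumulated_deg > 0.0f) ? 1 : -1;
        send_volume_step(step);
        accumulated_deg -= (float)step * VOLUME_ANGLE_THRESHOLD_DEG;
    }
}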
In embodiments, the third angular distance may be greater than the second angular distance, and the second angular distance may be greater than the first angular distance. For example, the first angular distance may be 36°, the second angular distance may be 60°, and the third angular distance may be 90°.
In embodiments, the present invention, and in particular the microcontroller 106, can execute a predetermined processing if the user continuously touches the inner input component 102 over a first angular distance threshold 210. The present invention may re-execute the predetermined processing if the user continuously touches the inner input component 102 over a second angular distance threshold 211 after moving over the first angular distance threshold, with the second angular distance threshold being greater than the first angular distance threshold.
Thus, in certain embodiments, the microcontroller 106 can be configured to receive inner user touch signals generated from user touches, in a direction of a touching path, of the inner input component 102, and to determine an initial angular distance of the user touches along the touching path.
The microcontroller 106 may then execute a predetermined processing if the user continuously touches the inner input component 102 over the entire first angular distance threshold 210. And the microcontroller 106 may re-execute the predetermined processing if the user continuously touches the inner input component 102 over the entire second angular distance threshold 211 after continuously touching over the entire first angular distance threshold.
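For illustration only, this execute/re-execute behavior might be expressed as in the C sketch below; it assumes the second angular distance threshold 211 is measured after the first angular distance threshold 210 has been crossed, and the degree values are only the example distances given in the description (210 and 211 being reference numerals, not degree values).

#include <stdbool.h>

#define FIRST_THRESHOLD_DEG  36.0f
#define SECOND_THRESHOLD_DEG 60.0f

void on_continuous_touch(float cumulative_deg,
                         bool *first_done, bool *second_done,
                         void (*predetermined_processing)(void))
{
    /* Execute once when the continuous touch has covered the entire first
     * angular distance threshold. */
    if (!*first_done && cumulative_deg >= FIRST_THRESHOLD_DEG) {
        predetermined_processing();
        *first_done = true;
    }

    /* Re-execute once the same continuous touch has also covered the entire
     * second angular distance threshold. */
    if (*first_done && !*second_done &&
        cumulative_deg >= FIRST_THRESHOLD_DEG + SECOND_THRESHOLD_DEG) {
        predetermined_processing();
        *second_done = true;
    }
}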
With the foregoing, the microcontroller 106 can be further configured to determine a subsequent angular distance of the user touches of the inner input component, wherein the subsequent angular distance is after the initial angular distance along the touching path. The microcontroller can be further configured to re-execute the predetermined processing when the subsequent angular distance is at least equal to one of the second angular distance threshold 211 and the additional angular distance thresholds 212.
For example, by the user's continuous touching over the first angular distance, and then stopping, the user may select the first user mechanism icon 111b in the menu list 111a. By the user continuously touching over the first and second angular distances, and then stopping, the user may select the second user mechanism icon 111b in the menu list 111a. By the user continuously touching over the first, second, and third angular distances, and then stopping, the user may select the third user mechanism icon 111b in the menu list 111a. User selection of additional user mechanism icons may be enabled by additional angular touching movement equal to the third angular distance, as an example.
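As a non-limiting sketch of this progression, the C function below maps the cumulative angular travel of one continuous touch to the index of the selected user mechanism icon, using the example distances given above (36°, then 60°, then 90°, then 90° per further icon); the function name and return convention are assumptions for illustration.

/* Returns 0 if no icon has been selected yet, 1 for the first icon, etc. */
int icon_index_from_angle(float cumulative_deg)
{
    static const float segment_deg[] = { 36.0f, 60.0f, 90.0f };
    const int num_segments = (int)(sizeof(segment_deg) / sizeof(segment_deg[0]));

    int index = 0;
    float remaining = cumulative_deg;

    for (int i = 0; i < num_segments; i++) {
        if (remaining < segment_deg[i])
            return index;
        remaining -= segment_deg[i];
        index++;               /* another angular distance crossed: next icon */
    }

    /* Each additional icon requires further movement equal to the third
     * angular distance. */
    return index + (int)(remaining / segment_deg[num_segments - 1]);
}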
In embodiments, the inner input component 102 may, upon the user touching the component 102, send user inner component touch signals to the microcontroller 106 in the HID 100. In turn, the microcontroller 106 may be configured to receive the user inner component touch signals and determine an angular distance of the user touches of the inner input component. If the determined angular distance is at least equal to an angular distance threshold—for example, the first angular distance threshold—the first user mechanism icon is selected.
In other embodiments, the microcontroller 106 may be configured to determine whether a user's touch of the inner input component 102 meets at least a minimum touch threshold. In embodiments, the minimum touch threshold is based on the angular distance of the user's touch. If a user's touch does not meet the minimum touch threshold, the microcontroller 106 may determine that the touch was accidental, and the touch is ignored by the HID 100. If the user's touch meets the minimum touch threshold, the microcontroller 106 may determine that the touch was intentional, and the touch is then evaluated against the angular distance thresholds. For example, the minimum touch threshold may be 6°. In such an example, if a user's continuous touch with movement over the inner input component covers at least 6°, the touch is determined to be intentional and continuous, and the microcontroller 106 also uses the touch to determine whether an angular distance threshold is met.
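By way of example only, such a minimum touch threshold check might be expressed as follows in C; the 6° value is the example given above, and the function name is an assumption.

#include <stdbool.h>

/* A touch covering less than this angular distance is treated as accidental. */
#define MIN_TOUCH_DEG 6.0f

/* Returns true if the touch should be treated as intentional and passed on to
 * the angular distance threshold logic; false if it should be ignored. */
bool touch_is_intentional(float touch_angular_distance_deg)
{
    return touch_angular_distance_deg >= MIN_TOUCH_DEG;
}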
In yet other embodiments, the microcontroller 106 may be configured to determine whether a time of non-touching between consecutive user touches of the inner input component 102 is less than a time threshold. If the determined time is less than the time threshold, the microcontroller is further configured to determine that the touches before and after the non-touching interval are part of a single continuous touch. If the determined time is equal to or greater than the time threshold, the touches before and after the interval are not considered part of a continuous touch. For example, the time threshold may be 100 ms.
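For illustration only, the C sketch below merges touch samples separated by less than the time threshold into one continuous touch; the 100 ms value is the example given above, and the sampling model and names are assumptions.

#include <stdint.h>

/* Non-touching gaps shorter than this are treated as part of one touch. */
#define TOUCH_GAP_THRESHOLD_MS 100u

static uint32_t last_touch_ms;
static float continuous_angle_deg;

/* Called for each touch sample at time now_ms, with delta_deg the angular
 * movement since the previous sample. Returns the angular distance of the
 * (possibly merged) continuous touch so far. */
float on_touch_sample(uint32_t now_ms, float delta_deg)
{
    if (now_ms - last_touch_ms >= TOUCH_GAP_THRESHOLD_MS) {
        /* The non-touching interval was too long: start a new touch. */
        continuous_angle_deg = 0.0f;
    }
    continuous_angle_deg += delta_deg;
    last_touch_ms = now_ms;
    return continuous_angle_deg;
}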
In further embodiments, the microcontroller 106 may be configured to receive outer user depression signals generated from user depressions of the outer component 101. The microcontroller 106 may also be configured to receive central user depression signals generated from user depressions of the central component 103. The microcontroller 106 may use the foregoing depression signals to initiate activation of HID 100 operating states described above.
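As one non-limiting illustration of how a microcontroller might route such depression and touch signals into the operating states described above, consider the C sketch below; the event names and dispatch logic are assumptions for illustration, and only states 201, 202, and 204 are shown.

#include <stdbool.h>

typedef enum {
    STATE_IDLE            = 0,
    STATE_MENU_ACTIVATION = 201,   /* menu list shown on the display       */
    STATE_MENU_NAVIGATION = 202,   /* user navigating the menu by touch    */
    STATE_VOLUME_CONTROL  = 204    /* user adjusting audio volume by touch */
} hid_state_t;

typedef enum {
    EVT_CENTRAL_DEPRESSED,   /* central input component depressed          */
    EVT_OUTER_DEPRESSED,     /* outer component depressed                  */
    EVT_INNER_TOUCHED        /* inner input component touched, not pressed */
} hid_event_t;

static hid_state_t state = STATE_IDLE;

void hid_dispatch(hid_event_t evt, bool selected_mechanism_uses_audio)
{
    switch (evt) {
    case EVT_CENTRAL_DEPRESSED:
        state = STATE_MENU_ACTIVATION;           /* show the menu list      */
        break;
    case EVT_INNER_TOUCHED:
        if (state == STATE_MENU_ACTIVATION || state == STATE_MENU_NAVIGATION)
            state = STATE_MENU_NAVIGATION;       /* navigate/select an icon */
        else if (selected_mechanism_uses_audio)
            state = STATE_VOLUME_CONTROL;        /* adjust audio volume     */
        break;
    case EVT_OUTER_DEPRESSED:
        /* Outer depressions may control the selected mechanism; the exact
         * behavior is left open here. */
        break;
    }
}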
It should be understood, of course, that the foregoing relates to exemplary embodiments of the invention and that modifications may be made without departing from the spirit and scope of the invention as set forth in the following claims.