When interacting with a computing device, a user can access input components of the computing device, such as a keyboard and a mouse. A first hand can reposition the mouse for a first input and a second hand can access alphanumeric keys of the keyboard for a second input. The computing device can detect the first input and the second input from each separate input component to identify corresponding commands for the computing device.
Various features and advantages of the disclosed embodiments will be apparent from the detailed description which follows, taken in conjunction with the accompanying drawings, which together illustrate, by way of example, features of the disclosed embodiments.
An input device includes an input module, coupled to a chassis of the input device, which can be repositioned along an axis. For the purposes of this application, the input module is a hardware component which the user can reposition with a finger and/or hand movement. When repositioning, the input module can slide and/or pivot along one or more axes. In one embodiment, the input module includes a flat elongated top surface and is elevated above a surface of the chassis of the input device. The chassis houses components of the input device, such as a first input sensor and a second input sensor. The first input sensor can include a position sensor and/or a directional sensor which detects position data of the input module repositioning. A second input sensor coupled to the input module, such as a touch pad, detects touch data in response to a touch gesture being made at a top surface of the input module.
By accessing the input module for a first input and a second input, the user can conveniently use a single hand to enter one or more inputs for a computing device. If the first input sensor detects the input module being repositioned, the first input sensor can share the position data as a first input signal for the computing device. If the second input sensor detects a touch gesture at the surface of the input module, the second input sensor can share the touch data as a second input signal for the computing device. The first input signal and the second input signal are shared in parallel with one another.
Using the first input signal and the second input signal, the computing device can identify a first input command associated with the input module repositioning and a second input command associated with the touch gesture. In one embodiment, the computing device can identify a combination input command associated with the information from both the first input and the second input. As a result, a user-friendly experience can be created by providing the user the ability to conveniently enter one or more input commands for the computing device by accessing the input module with a gesture.
The chassis 180 is a frame, an enclosure, and/or a casing to house one or more components of the input device 100. For the purposes of this application, the input module 140 is a hardware component of the input device 100, such as a track pad, which is coupled to a surface of the chassis 180. In one embodiment, the input module 140 is coupled to the chassis 180 through a mechanism which elevates the input module 140 to a position substantially parallel to the surface of the chassis 180. The mechanism is a hardware component, such as a control stick, which allows the input module 140 to reposition along one or more axes. An axis can include an X, Y, and/or Z axis.
A first input sensor 130 of the input device 100, such as a position sensor and/or a potentiometer, can detect the input module 140 repositioning in response to a user accessing the input module 140 with a hand gesture. The hand gesture can be made with a finger and/or hand of the user to reposition the input module 140 along an axis. In one embodiment, when repositioning along the X and/or Y axes, the input module 140 can pivot or slide laterally along the X and/or Y axes. Additionally, when repositioning along a Z axis, the input module 140 can reposition vertically. The first input sensor 130 can detect information of the input module 140 repositioning and share the information with the controller 120 as a first input signal 160. For the purposes of this application, the first input signal 160 includes data and/or information, such as position data, of the input module 140 repositioning.
A second input sensor 135 of the input device 100, such as a touchpad or a touch sensitive surface at a top surface of the input module 140, can detect a touch gesture from the user. The touch gesture can be made with a finger of the user at the top surface of the input module 140. The second input sensor 135 can detect information of the touch gesture and share the information with the controller 120 as a second input signal 165. The second input signal 165 includes data and/or information, such as touch data, of a touch gesture detected at the surface of the input module 140. For the purposes of this application, the first input signal 160 is shared by the first input sensor 130 in parallel with the second input sensor 135 sharing the second input signal 165 with the controller 120.
The controller 120 may be any suitable controller and/or processor, such as a central processing unit (CPU), a semiconductor-based microprocessor, or any other device suitable for retrieval and execution of instructions. In one embodiment, the input device 100 includes logic in addition to and/or in lieu of the controller 120. In other embodiments, the controller 120 is an integrated circuit (IC) of the input device 100. The controller 120 is connected to the first input sensor 130 and the second input sensor 135 to receive the first input signal 160 and the second input signal 165 for a computing device. In one embodiment, the controller 120 receives the first input signal 160 and the second input signal 165 in parallel with one another. In another embodiment, the controller 120 receives the first input signal 160 and the second input signal 165 in sequence.
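The parallel delivery described above can be illustrated with a minimal sketch, assuming a hypothetical software model in which each sensor pushes its signal into its own queue and the controller drains both queues, so that the two signals arrive independently rather than through a shared, ordered channel. All names and data shapes here are illustrative assumptions, not part of the disclosed device.

```python
import queue
import threading

# Each sensor shares its signal over its own independent channel.
position_signals = queue.Queue()   # first input signal (position data)
touch_signals = queue.Queue()      # second input signal (touch data)

def first_sensor():
    # Stand-in for the position/directional sensor detecting the
    # input module repositioning along an axis.
    position_signals.put({"dx": 3, "dy": -1})

def second_sensor():
    # Stand-in for the touch-sensitive surface at the top of the module.
    touch_signals.put({"x": 10, "y": 20, "event": "tap"})

def controller_poll():
    # The controller receives both signals; neither stream is ordered
    # with respect to the other.
    return {
        "first": position_signals.get(timeout=1),
        "second": touch_signals.get(timeout=1),
    }

threading.Thread(target=first_sensor).start()
threading.Thread(target=second_sensor).start()
received = controller_poll()
print(received)
```

A sequential embodiment would simply read one queue before the other; the queues decouple when each sensor produces its signal from when the controller consumes it.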
The computing device can be a laptop, a notebook, a tablet, a netbook, an all-in-one system, a desktop, a workstation, and/or a server. In another embodiment, the computing device can be a cellular device, a PDA (Personal Digital Assistant), an E-Reader (Electronic Reader), and/or any additional computing device coupled to the input device 100. The computing device can use the information of the first input signal 160 and the information of the second input signal 165 to identify and execute one or more input commands for the computing device.
As shown in
A first input sensor 230 is a hardware component of the input device 200 which detects information of the input module 240 repositioning along an axis. In one embodiment, the first input sensor 230 is coupled to the mechanism 270 to detect the input module 240 repositioning. The first input sensor 230 can include a potentiometer, a position sensor, a directional pad, a proximity sensor, and/or any additional sensor which detects information of the input module 240 repositioning. The position sensor can be an analog or digital position sensor. The information of the input module 240 repositioning includes position data. The position data can include one or more coordinates corresponding to the input module 240 repositioning from one location to another.
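The position data described above can be sketched as coordinate samples taken as the module moves from one location to another. The function name and sample values below are illustrative assumptions only.

```python
# Hypothetical sketch: derive position data from two successive
# coordinate samples of the input module's location.
def position_delta(previous, current):
    """Return (dx, dy) between two (x, y) coordinate samples."""
    return (current[0] - previous[0], current[1] - previous[1])

# Two samples of the input module repositioning along the X and Y axes.
samples = [(0, 0), (4, -2)]
dx, dy = position_delta(samples[0], samples[1])
print(dx, dy)  # prints: 4 -2
```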
As the first input sensor 230 is detecting information of the input module 240 repositioning, a second input sensor 235 can detect a touch gesture at a top surface of the input module 240. The second input sensor 235 is a hardware component of the input device 200 which detects information of a touch gesture at a top surface of the input module 240. The information of the touch gesture can include touch data of the user 205 touching one or more locations of the surface of the input module 240. As shown in the present embodiment, the top surface of the input module 240 can include a flat elongated surface. In one embodiment, the second input sensor 235 is a touchpad, a touch screen, and/or any additional touch sensitive surface which can be coupled to the top surface of the input module 240. In other embodiments, the second input sensor 235 can include any additional sensor which can detect a touch gesture.
As shown in
The user 205 can also use a finger to make a touch gesture at the top surface of the input module 240. When making the touch gesture, a finger of the user 205 can touch or be within proximity of the top surface of the input module 240. As the finger is touching or within proximity, the user 205 can reposition the finger over the top surface of the input module 240. In one example, when repositioning the input module 240 with gestures, the user 205 can use a thumb and middle finger to grasp and reposition the input module 240. While the input module 240 is being repositioned, the index finger of the user 205 can touch and reposition over the surface of the input module 240. In another example, the user 205 can use a first hand to reposition the input module 240 and a finger on the second hand of the user 205 can be used to make touch gestures over the surface of the input module 240.
The first input sensor 230 can detect position data of the input module repositioning in parallel with the second input sensor 235 detecting touch data of the hand gesture. The position data and the touch data from the first input sensor 230 and the second input sensor 235 are shared in parallel with one another with a controller of the input device 200. The position data is received by the controller as a first input signal for a computing device. The touch data is received by the controller as a second input signal for the computing device. In one embodiment, the position data and the touch data are received in parallel with one another by the controller. In another embodiment, even though the position data and the touch data are shared in parallel, they are received by the controller sequentially. In response to receiving a first input signal and a second input signal, the computing device can identify one or more input commands associated with the hand gesture and/or the touch gesture.
The controller 320 can access a list, table, and/or database of input commands to compare the first input signal and the second input signal to predefined information corresponding to input commands of the computing device. The list, table, and/or database of input commands can be stored on the computing device or remotely on another device accessible to the computing device. In one embodiment, the list, table, and/or database of input commands can include first input commands and second input commands. A first input command corresponds to a command of the computing device associated with the position data from the first input sensor 330. A second input command corresponds to a command of the computing device associated with the touch data from the second input sensor 335.
The controller 320 can access the input commands list and compare the position data from the first input sensor 330 to a list of first input commands to determine whether a matching first input command can be found. In one embodiment, a first input command corresponds with a scrolling input command to vertically and/or horizontally scroll presently rendered content of the computing device. The content can include an application, a document, a webpage, and/or media of the computing device. If a matching first input command is identified, the controller 320 can proceed to execute the matching first input command on the computing device.
The controller 320 also compares the touch data from the second input sensor 335 to the list of second input commands to identify a matching second input command. In one embodiment, the second input command corresponds with a pointer input command to reposition a pointer of the computing device and/or to select an item rendered at the location of the pointer. The pointer can include a visual cursor which is rendered on a display component of the computing device. If a matching second input command is identified, the controller 320 proceeds to execute the matching second input command on the computing device.
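The table lookup described in the preceding paragraphs can be sketched as follows. This is a minimal illustration assuming hypothetical pattern and command names ("slide_vertical", "scroll_vertical", and so on); the disclosure does not specify a data format for the stored list, table, or database.

```python
# Hypothetical lookup tables mapping detected data patterns to
# predefined input commands of the computing device.
FIRST_COMMANDS = {
    # position data pattern -> first input command
    "slide_vertical": "scroll_vertical",
    "slide_horizontal": "scroll_horizontal",
}
SECOND_COMMANDS = {
    # touch data pattern -> second input command
    "drag": "move_pointer",
    "tap": "select_item",
}

def match_command(table, pattern):
    """Return the input command matching the detected pattern, or None."""
    return table.get(pattern)

first = match_command(FIRST_COMMANDS, "slide_vertical")
second = match_command(SECOND_COMMANDS, "tap")
print(first, second)  # prints: scroll_vertical select_item
```

If no matching entry is found for a signal, the lookup yields nothing and that signal produces no command, consistent with the "if a matching command is identified" conditions above.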
If the controller 320 identifies both a matching first input command and a matching second input command, the controller 320 executes both the first input command and the second input command in parallel. As a result, both of the input commands can concurrently be executed on the computing device. In another embodiment, the first input command and the second input command can be in different input threads which are executed by the controller 320 sequentially. The controller 320 can alternate back and forth between the two separate input threads and execute portions of the first input command and portions of the second input command from each input thread in rapid succession, such that they appear to be executing in parallel with one another.
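The alternating-thread embodiment above amounts to round-robin interleaving of small portions of each command. A minimal sketch, with made-up portion names standing in for whatever units of work the controller schedules:

```python
import itertools

# Hypothetical portions of each input command's work.
first_portions = ["scroll-step-1", "scroll-step-2", "scroll-step-3"]
second_portions = ["pointer-step-1", "pointer-step-2"]

def interleave(a, b):
    """Alternate portions from two command threads in rapid succession."""
    out = []
    for x, y in itertools.zip_longest(a, b):
        if x is not None:
            out.append(x)
        if y is not None:
            out.append(y)
    return out

schedule = interleave(first_portions, second_portions)
print(schedule)
```

Executed quickly enough, this sequential schedule is indistinguishable to the user from the two commands running in parallel.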
In another embodiment, instead of executing both a first input command and a second input command, the controller 320 can identify and execute a combination input command associated with the position data and touch data. As shown in
If a match is found, the controller 320 can proceed to execute the combination input command on the computing device. Additionally, if the combination input command is executed, any first input command matching the position data and any second input command matching the touch data are not executed on the computing device. In another embodiment, if a matching combination input command is not identified, the controller 320 proceeds to identify a first input command matching the position data and a second input command matching the touch data. The matching first input command and second input command are then executed in parallel with one another on the computing device.
In other embodiments, if the input device is a peripheral input component of the computing device, the controller 320 can share the first input and the second input with the computing device. The computing device can use information of the first input and the second input to identify a first input command, a second input command, and/or a combination input command associated with the hand gesture repositioning the input module 340 and the touch gesture at the surface of the input module 340.
The controller can determine if the combination of the first input signal and the second input signal corresponds to an input command for the computing device at 540. If the combination of the first input signal and the second input signal corresponds to a combination input command, the controller can execute the combination input command on the computing device at 570. In another embodiment, if the combination of the first input signal and the second input signal does not correspond to an input command for the computing device, the controller can identify a first input command for the computing device associated with the hand gesture repositioning the input module at 550. The controller can also identify a second input command for the computing device associated with the touch gesture at 560. The first input command and the second input command can be perceived to be executed in parallel with one another. The method is then complete. In other embodiments, the method of
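The decision flow at steps 540 through 570 can be sketched as a single function: try a combination input command first, and only if none matches fall back to identifying the first and second input commands separately. The table contents and command names below are illustrative assumptions.

```python
# Hypothetical command tables for the flow at 540-570.
COMBINATION_COMMANDS = {("slide_vertical", "tap"): "zoom"}
FIRST_COMMANDS = {"slide_vertical": "scroll_vertical"}
SECOND_COMMANDS = {"tap": "select_item"}

def handle_signals(position_pattern, touch_pattern):
    # Step 540: check whether the combination of signals matches a
    # combination input command.
    combo = COMBINATION_COMMANDS.get((position_pattern, touch_pattern))
    if combo is not None:
        return [combo]  # step 570: execute only the combination command
    commands = []
    first = FIRST_COMMANDS.get(position_pattern)   # step 550
    second = SECOND_COMMANDS.get(touch_pattern)    # step 560
    if first:
        commands.append(first)
    if second:
        commands.append(second)
    return commands  # executed so as to appear parallel

print(handle_signals("slide_vertical", "tap"))     # prints: ['zoom']
print(handle_signals("slide_horizontal", "tap"))   # prints: ['select_item']
```

Note that when the combination command fires, the individual first and second commands are suppressed, matching the behavior described earlier for combination input commands.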