This application claims benefit of Japanese Patent Application No. 2011-150604 filed on Jul. 7, 2011, which is hereby incorporated by reference in its entirety.
1. Field of the Disclosure
The present disclosure relates to an input processing apparatus in which input units, each including a stick pointer, are arranged at two locations in an operation input unit, such as a keyboard input device, of an information processing apparatus.
2. Description of the Related Art
On an operation panel of a personal computer, a keyboard input device and an input unit having a stick pointer are provided. Because the operation body of the input unit is arranged between the keys constituting the keyboard input device, the stick pointer can be operated with the fingers while the hands remain in a typing posture, so that input operations can be performed quickly.
Japanese Unexamined Patent Application Publication No. 2007-328475 discloses an input processing apparatus in which two input units having a stick pointer are arranged within a region of an array of keys in a keyboard input.
In that input processing apparatus, two independent cursors displayed on a screen can be controlled individually by the two stick pointers, or a cursor can be controlled by one stick pointer while a scroll operation is performed with the other stick pointer.
As in the related art, in an input processing apparatus in which a single stick pointer is provided in the keyboard input device, operation of the stick pointer generates an input signal of only a single set of coordinate data, so control is limited to moving a cursor displayed on the screen, and a variety of input controls cannot be performed.
In the input processing apparatus disclosed in Japanese Unexamined Patent Application Publication No. 2007-328475, the input unit having the stick pointer is provided at two portions, so that it is possible to generate input signals of two kinds of coordinate data. However, the processing performed is limited to movement control of the cursor and scroll control, so that a variety of other input controls cannot be performed.
An input processing apparatus includes: a first input unit and a second input unit that are arranged in an operation input unit; a control processing unit to which an input signal from the first input unit and an input signal from the second input unit are applied; and a stick pointer that is provided in each of the first input unit and the second input unit, and includes an operation body and a detection element for detecting an operational direction and an operational force which are applied to the operation body, wherein, when the input signal is obtained from the stick pointer of any one of the first input unit and the second input unit, a coordinate input process corresponding to the operational direction and the operational force which are applied to the operation body of the stick pointer is performed in the control processing unit, and when the input signal is obtained from the two stick pointers of the first input unit and the second input unit, a gesture control process in accordance with a combination of the operational directions applied to the two operation bodies is performed in the control processing unit.
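The dispatch just described, in which input from a single stick yields a coordinate input process while simultaneous input from both sticks yields a gesture control process, can be sketched as follows. This is only an illustrative model: the disclosure defines no software API, and the function names, signal representation, and threshold value here are assumptions.

```python
# Sketch of the control processing unit's dispatch, assuming each stick
# pointer reports an (x, y) operational-force vector. All names and the
# threshold value are illustrative, not taken from the disclosure.

THRESHOLD = 0.2  # assumed normalized activation threshold


def active(signal):
    """True when a stick's input exceeds the threshold on either axis."""
    x, y = signal
    return abs(x) > THRESHOLD or abs(y) > THRESHOLD


def dispatch(sp1, sp2):
    """Select the process for a pair of stick-pointer input signals."""
    if active(sp1) and active(sp2):
        return "gesture"   # combination of both operational directions
    if active(sp1):
        return "scroll"    # coordinate input process of the first stick
    if active(sp2):
        return "cursor"    # coordinate input process of the second stick
    return "idle"
```

Mapping the first stick to scrolling and the second to cursor movement follows the ST6/ST8 behavior described later in this embodiment.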
In
In the operation input unit 4 of an input processing apparatus 10 according to an embodiment of the invention, a keyboard input device 11, a touch pad 7 which is arranged closer to the front than the keyboard input device 11, and a left click button 8a and a right click button 8b which are arranged adjacent to each other on the front side of the touch pad 7 are provided. The touch pad 7 outputs coordinate data corresponding to the contact position of a finger, based on the change in capacitance generated when the finger contacts the touch pad 7.
As shown in
The first input unit 20A and the second input unit 20B are arranged between the keys 12 adjacent to each other. In an example shown in
An arrangement position of the first input unit 20A and the second input unit 20B is not limited to the embodiment shown in
In
The first stick pointer 21A includes a supporting base 22 formed of a synthetic resin. A plus X-deformable portion 23a and a minus X-deformable portion 23b, which extend in the X direction, and a plus Y-deformable portion 24a and a minus Y-deformable portion 24b, which extend in the Y direction, are integrally formed on the supporting base 22. At the center of the supporting base 22, a first operation body 25A that protrudes upward is integrally provided. The first operation body 25A is positioned at the center between the plus X-deformable portion 23a and the minus X-deformable portion 23b, and between the plus Y-deformable portion 24a and the minus Y-deformable portion 24b.
An outer edge portion of the supporting base 22 is fixed to a substrate of the keyboard input device 11. When an operational force is applied to the first operation body 25A from the finger in an X direction, a Y direction, or the like, curvature occurs in the plus X-deformable portion 23a, the minus X-deformable portion 23b, the plus Y-deformable portion 24a, and the minus Y-deformable portion 24b in such a manner as to correspond to the operational direction and the operational force.
In the supporting base 22, a plus X-strain sensor 26a is mounted on an upper surface of the plus X-deformable portion 23a, and a minus X-strain sensor 26b is mounted on an upper surface of the minus X-deformable portion 23b. A plus Y-strain sensor 27a is mounted on an upper surface of the plus Y-deformable portion 24a, and a minus Y-strain sensor 27b is mounted on an upper surface of the minus Y-deformable portion 24b. Alternatively, each of the strain sensors 26a, 26b, 27a, and 27b may be mounted on the lower surfaces of the deformable portions 23a, 23b, 24a, and 24b.
The strain sensors 26a, 26b, 27a, and 27b are detection elements of the first stick pointer 21A. Each of the strain sensors 26a, 26b, 27a, and 27b is a resistance film. The strain sensors 26a, 26b, 27a, and 27b are connected to each other, so that a bridge circuit shown in
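As a rough illustration of how the four opposing strain sensors can yield signed X and Y operation outputs, the difference of each opposing pair of resistances can be taken. This is only a sketch of one plausible bridge-style readout; the resistance values, scaling, and function name are assumptions, not the circuit of the disclosure.

```python
def operation_outputs(r_px, r_mx, r_py, r_my, r0=1000.0):
    """Derive signed X/Y operation outputs from strain-sensor resistances.

    r_px, r_mx: resistances of the plus/minus X sensors 26a, 26b.
    r_py, r_my: resistances of the plus/minus Y sensors 27a, 27b.
    r0: assumed unstrained resistance (illustrative value).

    Tilting the operation body toward one direction strains the opposing
    deformable portions asymmetrically, so the difference of each
    opposing pair gives a signed, force-proportional output; an equal
    change in all four (axial press) cancels out to (0, 0).
    """
    x_out = (r_px - r_mx) / r0
    y_out = (r_py - r_my) / r0
    return x_out, y_out
```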
In
As shown in
The first light source 29A provided in the first input unit 20A includes a single LED or a plurality of LEDs that emit light of different colors. At least a part of the first operation body 25A is configured to transmit the light, so that the first operation body 25A is brightly illuminated when the first light source 29A is lit.
A structure of the second input unit 20B is the same as that of the first input unit 20A. The second input unit 20B includes a second stick pointer 21B having the same structure as that shown in
An X operation output and a Y operation output of the first stick pointer 21A of the first input unit 20A, and a switch detection output of the first switch unit 28A are applied to a main signal generation unit 31. An X operation output and a Y operation output of the second stick pointer 21B of the second input unit 20B, and a switch detection output of the second switch unit 28B are applied to a sub signal generation unit 32.
Each output of the second input unit 20B is A/D-converted in the sub signal generation unit 32, converted into a signal having a predetermined number of bytes, and transmitted to the main signal generation unit 31. In the main signal generation unit 31, each output of the first input unit 20A is A/D-converted and, together with the output signal from the second input unit 20B applied from the sub signal generation unit 32, is converted into an input signal having a predetermined number of bytes to thereby be formatted.
A key detection output applied from each of the key switches of the keyboard input device 11 is A/D-converted in a key signal generation unit 33, and converted into an input signal having a predetermined number of bytes to thereby be formatted.
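The formatting into an input signal having a predetermined number of bytes might, for example, pack each A/D-converted axis value and the switch state into a fixed-length record. The layout below is purely illustrative; the disclosure does not specify the byte format.

```python
import struct


def format_input_signal(unit_id, x_adc, y_adc, switch_on):
    """Pack one input unit's outputs into a fixed 6-byte record.

    Assumed layout (not from the disclosure): 1 signed byte unit id,
    2 bytes signed X value, 2 bytes signed Y value, 1 byte switch state,
    all little-endian with no padding.
    """
    return struct.pack("<bhhB", unit_id, x_adc, y_adc, 1 if switch_on else 0)


def parse_input_signal(record):
    """Unpack a record produced by format_input_signal."""
    unit_id, x_adc, y_adc, sw = struct.unpack("<bhhB", record)
    return unit_id, x_adc, y_adc, bool(sw)
```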
The main signal generation unit 31, the sub signal generation unit 32, and the key signal generation unit 33 are constituted of an integrated circuit that is mounted on the substrate of the keyboard input device 11.
An input signal 31a generated in the main signal generation unit 31 and an input signal 33a generated in the key signal generation unit 33 are applied to application software 34 installed in a main body control unit of the personal computer 1. Next, control information generated by the application software 34 is applied to an operating system (OS) 35, so that the display screen of the display device 5 of the personal computer 1 is controlled. In the present embodiment, the control operation of the application software 34 functions as a control processing unit.
As shown in a modified example in
Next, an operation control of the input processing apparatus 10 will be described. In a flowchart shown in
When power is turned on and the application software 34 is enabled, a processing operation starts in ST1 (step 1). In ST2, the input signal 31a from the main signal generation unit 31 is monitored by a control operation of the application software 34, and whether a change exceeding a threshold value in at least one of an input signal from the first stick pointer (SP1) 21A of the first input unit 20A and an input signal from the second stick pointer (SP2) 21B of the second input unit 20B occurs is determined.
When the threshold value is exceeded, in ST3, whether the input signals from both the first stick pointer 21A and the second stick pointer 21B exceed the threshold value is determined. Here, when both the input signal from the first stick pointer 21A and the input signal from the second stick pointer 21B simultaneously exceed the threshold value, it is determined that "the input signals from both exceed the threshold value". In addition, when both the input signal from the first stick pointer 21A and the input signal from the second stick pointer 21B exceed the threshold value within a fixed period of time determined in advance, it is also determined that "the input signals from both exceed the threshold value".
That is, a monitoring time having a fixed length determined in advance is set, and when both the input signal from the first stick pointer 21A and the input signal from the second stick pointer 21B exceed the threshold value within the monitoring time, it is determined that "the input signals from both exceed the threshold value". By repeatedly executing the monitoring time, it is possible to determine whether the first stick pointer 21A and the second stick pointer 21B are operated simultaneously.
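The repeated monitoring window can be sketched as follows: within each fixed-length window, the two sticks are treated as simultaneously operated if each exceeds the threshold at some sample in that window, even when the two crossings are not at the exact same instant. The sample representation, window handling, and threshold value are assumptions.

```python
THRESHOLD = 0.2  # assumed activation threshold


def both_exceed_in_window(samples):
    """Decide ST3's "input signals from both exceed the threshold value".

    samples: list of (sp1_level, sp2_level) pairs taken during one
    monitoring window of fixed, predetermined length. Returns True when
    each stick exceeds the threshold at least once within the window,
    so the two sticks count as simultaneously operated even if their
    threshold crossings are offset within the window.
    """
    sp1_hit = any(s1 > THRESHOLD for s1, _ in samples)
    sp2_hit = any(s2 > THRESHOLD for _, s2 in samples)
    return sp1_hit and sp2_hit
```

Re-running this check for each successive window corresponds to "repeatedly executing the monitoring time".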
In ST3, when it is not determined that both the input signal from the first stick pointer 21A and the input signal from the second stick pointer 21B exceed the threshold value, that is, when the input signal from only one of the stick pointers exceeds the threshold value, the corresponding step proceeds to ST4.
In ST4, when it is determined that the input signal from the first stick pointer 21A exceeds the threshold value, the corresponding step proceeds to ST5, and the input signal from the first stick pointer 21A is confirmed. When it is confirmed that the input signal from the first stick pointer 21A is a coordinate signal showing a movement of a predetermined distance or more in the X direction or the Y direction, the corresponding step proceeds to ST6, and the information group displayed on the screen of the display device 5 is subjected to a coordinate input process for scrolling in the X direction or the Y direction. In this instance, the scroll direction is determined based on the operational direction applied to the first operation body 25A, and the speed of the scroll process is varied in proportion to the magnitude of the operational force.
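The ST6 scroll behavior, with the direction taken from the operational direction and the speed proportional to the operational force, could be sketched as follows; the axis-selection rule and gain constant are assumptions for illustration.

```python
def scroll_step(x_force, y_force, gain=10.0):
    """Return an (axis, amount) scroll step for ST6-style scrolling.

    The dominant axis of the operational direction applied to the
    operation body selects the scroll axis, and the per-update step
    grows in proportion to the operational force on that axis.
    `gain` is an assumed tuning constant, not from the disclosure.
    """
    if abs(x_force) >= abs(y_force):
        return ("x", gain * x_force)
    return ("y", gain * y_force)
```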
In ST4, when it is determined that the input signal from the first stick pointer 21A does not exceed the threshold value, the input signal from the second stick pointer 21B is determined to exceed the threshold value, the corresponding step proceeds to ST7, and the input signal from the second stick pointer 21B is confirmed. When it is determined that a coordinate signal of the predetermined distance or more in the X direction or the Y direction is input from the second stick pointer 21B, the corresponding step proceeds to ST8, and a coordinate input process for moving a cursor 9 shown on the display screen of the display device 5 is performed.
In ST8, the movement direction of the cursor 9 is determined in accordance with the direction of the operational force applied to the second operation body 25B of the second input unit 20B, and the movement distance of the cursor 9 is determined in proportion to the magnitude of that operational force.
In ST3, when it is determined that both the input signal from the first stick pointer 21A and the input signal from the second stick pointer 21B exceed the threshold value, the corresponding step proceeds to ST9, the operational direction and the operational force are confirmed from the coordinate signal of the first stick pointer 21A, the operational direction and the operational force are confirmed from the input signal of the second stick pointer 21B, and the corresponding step proceeds to ST10.
In ST10, in accordance with both the input signals of the first stick pointer 21A and the second stick pointer 21B, a gesture signal to be executed is selected.
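The selection in ST10 amounts to mapping the pair of operational directions to a gesture. A lookup-table sketch is shown below; the gesture names follow the (1) to (8) examples described in this embodiment, but the specific direction pairs are assumptions, since the actual combinations are defined in the figures and not reproduced here.

```python
# Assumed direction pairs (dominant direction of stick 1, of stick 2).
# The disclosure's figures define the real combinations; these are
# illustrative placeholders.
GESTURES = {
    ("+x", "-x"): "zoom-out",          # sticks pushed toward each other
    ("-x", "+x"): "zoom-in",           # sticks pushed apart
    ("-y", "+y"): "left rotation",
    ("+y", "-y"): "right rotation",
    ("-y", "-y"): "forward tracking",  # both toward minus Y
    ("+y", "+y"): "backward tracking",
    ("-x", "-x"): "left tracking",
    ("+x", "+x"): "right tracking",
}


def select_gesture(dir1, dir2):
    """ST10: pick the gesture for a pair of dominant stick directions."""
    return GESTURES.get((dir1, dir2), "none")
```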
In
As shown in (1) of
In the gesture control process of zoom-out, an image that is displayed on the screen of the display device 5 is reduced. The reduction ratio of the image is changed in accordance with the magnitude of the operational force that is applied to the first operation body 25A and the second operation body 25B. In addition, since the first operation body 25A and the second operation body 25B always return to a neutral position, the size of the image is returned to the initial size when the fingers are separated from the first operation body 25A and the second operation body 25B. Alternatively, when the fingers are separated from the first operation body 25A and the second operation body 25B, the reduced image may be maintained as is.
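The zoom-out behavior, with the reduction ratio depending on the applied force and the scale reverting when the operation bodies spring back to neutral, can be sketched as follows. The scaling formula and sensitivity constant are assumptions for illustration.

```python
def zoom_scale(force1, force2, sensitivity=0.5):
    """Return a display scale factor for the zoom-out gesture.

    force1, force2: magnitudes of the operational forces applied to the
    two operation bodies (0 = neutral). The reduction grows with the
    applied force; at neutral the scale is 1.0, mirroring the image
    returning to its initial size when the fingers are released and the
    operation bodies return to the neutral position.
    `sensitivity` is an assumed tuning constant.
    """
    mean_force = (force1 + force2) / 2.0
    return 1.0 / (1.0 + sensitivity * mean_force)
```

Holding the last computed scale instead of re-evaluating it at neutral would correspond to the alternative in which the reduced image is maintained after the fingers are released.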
As shown in (2) of
In the gesture control process of zoom-in, the image that is displayed on the screen of the display device 5 is enlarged. In accordance with the magnitude of the operational force that is applied to the first operation body 25A and the second operation body 25B, an enlargement ratio of the image is changed. In addition, when the finger is separated from the first operation body 25A and the second operation body 25B, a size of the image is returned to the initial size. Alternatively, the enlarged image may be held as is.
As shown in (3) of
In the gesture control process of left rotation, the image that is displayed on the screen of the display device 5 is rotated in a counter-clockwise direction with respect to an axis perpendicular to the screen. In accordance with the magnitude of the operational force that is applied to the first operation body 25A and the second operation body 25B, a rotational angle or a rotational speed of the image is changed. When the finger is separated from the first operation body 25A and the second operation body 25B, a rotational posture of the image is returned to the initial rotational posture.
As shown in (4) of
In the gesture control process of right rotation, the image that is displayed on the screen of the display device 5 is rotated in a clockwise direction with respect to the axis perpendicular to the screen. In accordance with the magnitude of the operational force that is applied to the first operation body 25A and the second operation body 25B, a rotational angle or a rotational speed of the image is changed. When the finger is separated from the first operation body 25A and the second operation body 25B, the rotational posture of the image is returned to the initial rotational posture.
As shown in (5) of
In the gesture control process of forward tracking, all of the images that are displayed on the screen of the display device 5 are moved downward (minus Y direction) at a high speed. The gesture control process of forward tracking is a processing operation different from the scroll control of ST6 of
Alternatively, in the gesture control process shown in (5) of
As shown in (6) of
In the gesture control process of backward tracking, all of the images that are displayed on the screen of the display device 5 are moved at a high speed in an upward direction (plus Y direction). That is, the images displayed on the screen become units of one group, and are successively gathered and moved in the plus Y direction.
Alternatively, in the gesture control process shown in (6) of
As shown in (7) of
In the gesture control process of left tracking, all of the images that are displayed on the screen of the display device 5 become information of one group, are gathered, and are successively moved in a left direction (minus X direction).
Alternatively, all of the images may become one group and may be successively moved in a right direction (plus X direction) in accordance with the operational direction of both hands. In addition, when images that contain pictures or characters are displayed in page units, the gesture control process of right rotation-over can be performed by the gesture control process shown in (7) of
As shown in (8) of
In the gesture control process of right tracking, all of the images that are displayed on the screen of the display device 5 become information of one group, are gathered, and are successively moved in a right direction (plus X direction).
Alternatively, all of the images may become one group and may be successively moved in the left direction (minus X direction) in accordance with the operational direction of both hands. In addition, when images that contain pictures or characters are displayed in page units, the gesture control process of left rotation-over can be performed by the gesture control process shown in (8) of
When the gesture control processes shown in (5), (6), (7), and (8) of
Next, when the operation body 25A of the first input unit 20A is pushed in an axial direction, the first switch unit 28A enters an on state, and when the operation body 25B of the second input unit 20B is pushed in the axial direction, the second switch unit 28B enters an on state. In this instance, a switch signal is transmitted to the application software 34 as the input signal 31a from the main signal generation unit 31.
In the control operation of the application software 34, different control processes are performed in accordance with which one of the first switch unit 28A and the second switch unit 28B is operated. For example, when the switch unit 28A of the first input unit 20A is operated, the same control process in which the left click button 8a shown in
In addition, when both the first switch unit 28A and the second switch unit 28B are pressed, the same control process as when a middle click button of a mouse, positioned between the left click button 8a and the right click button 8b, is pressed is performed.
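The mapping of the two switch units to mouse-button equivalents described above can be sketched as follows; the event names are illustrative.

```python
def click_event(sw1_pressed, sw2_pressed):
    """Map the two switch units to mouse-button equivalents.

    The first switch unit 28A acts like the left click button, the
    second switch unit 28B like the right click button, and pressing
    both together acts like a middle click. Event strings are assumed
    labels, not from the disclosure.
    """
    if sw1_pressed and sw2_pressed:
        return "middle-click"
    if sw1_pressed:
        return "left-click"
    if sw2_pressed:
        return "right-click"
    return "none"
```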
As described above, even when the mechanical switch units 28A and 28B are not provided, it may be determined that the operation bodies 25A and 25B are pressed in the axial direction when the resistance values of the four strain sensors 26a, 26b, 27a, and 27b provided in the stick pointers 21A and 21B change to the same state, so that a first or a second switch function is operated.
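The switch-free press detection just described, in which all four strain sensors changing by the same amount indicates a straight axial press rather than a tilt, can be sketched as follows; the minimum change and uniformity tolerance are assumed values.

```python
def axial_press(deltas, press_min=5.0, tolerance=1.0):
    """Detect an axial press from strain-sensor resistance changes.

    deltas: resistance changes of the four sensors 26a, 26b, 27a, 27b.
    When all four change by (nearly) the same amount and that common
    amount is large enough, the operation body is judged to be pressed
    straight down in the axial direction, so a switch function can be
    triggered without a mechanical switch unit. A tilt instead produces
    asymmetric changes and is rejected. press_min and tolerance are
    assumed thresholds, not from the disclosure.
    """
    mean = sum(deltas) / len(deltas)
    uniform = all(abs(d - mean) <= tolerance for d in deltas)
    return uniform and abs(mean) >= press_min
```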
In this case, in ST11 of
In addition, by starting the application software 34, a setting menu is displayed on the screen of the display device 5, and by operating the keyboard input device 11, it is possible to change the setting or allocation of the variety of gesture functions shown in
In addition, as shown in
For example, by executing each of the gesture control processes shown in (1) to (8) of
In the present embodiment, in the application software 34 that functions as the control processing unit, as shown in
In
A portable information processing apparatus 101 is shown in
The information processing apparatus 101 is small-sized and is suited to being operated while the main body portion 102 is held with both hands; for example, the first input unit 20A is operated by the thumb of the left hand, and the second input unit 20B is operated by the thumb of the right hand.
In
A small-sized information processing apparatus 301 shown in
In the information processing apparatus 301, a variety of input operations are made possible by touching the display screen using the finger, so that it is possible to operate the first input unit 20A with the thumb of the left hand, and the second input unit 20B with the thumb of the right hand.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations, and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
Number | Date | Country | Kind |
---|---|---|---
2011-150604 | Jul 2011 | JP | national |