This application claims priority under 35 U.S.C. §119 to Chinese Patent Application No. 201110081252.8, filed on Mar. 31, 2011, the content of which is incorporated herein by reference in its entirety.
Example embodiments of the present disclosure relate generally to a method of identifying gestures on a touchpad, and more particularly, to a method of identifying a shifting gesture and device thereof.
Although the keyboard remains a primary input device of a computer, the prevalence of graphical user interfaces (GUIs) may require use of a mouse or other pointing device such as a trackball, joystick, touch device or the like. Due to its compact size, the touch device has become popular and is widely used in various areas of daily life, such as mobile phones, media players, navigation systems, digital cameras, digital photo frames, personal digital assistants (PDAs), gaming devices, monitors, electrical controls, medical equipment and so on.
A touch device features a sensing surface that can translate the motion and position of a user's fingers to a relative position on screen. Touchpads operate in one of several ways. The most common technology senses the capacitive virtual ground effect of a finger, or the capacitance between sensors. For example, by independently measuring the self-capacitance of each X-axis and Y-axis electrode on a sensor, the (X, Y) location of a single touch may be determined.
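As a minimal sketch of this idea, the following Python snippet picks the peak electrode on each axis from hypothetical self-capacitance readings; the array contents, the simple baseline handling and the absence of interpolation between electrodes are simplifying assumptions for illustration, not details from the disclosure.

```python
def locate_single_touch(x_caps, y_caps, baseline=0.0):
    """Return the (X, Y) electrode index of the strongest capacitance change.

    A real controller would baseline-compensate and interpolate between
    electrodes; here we simply pick the peak electrode on each axis.
    """
    deltas_x = [c - baseline for c in x_caps]
    deltas_y = [c - baseline for c in y_caps]
    x = max(range(len(deltas_x)), key=deltas_x.__getitem__)
    y = max(range(len(deltas_y)), key=deltas_y.__getitem__)
    return x, y

# Example: a finger centered near X electrode 2 and Y electrode 1.
print(locate_single_touch([0.1, 0.4, 0.9, 0.3], [0.2, 0.8, 0.2]))  # (2, 1)
```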
According to one exemplary embodiment of the present invention, a method of identifying a shifting gesture comprises: detecting one or more induction signals induced by one or more pointing objects that come into contact with a touch-sensitive surface; determining the number of pointing objects that come into contact with the touch-sensitive surface; recording the moving status and coordinates of each pointing object in an instance in which the number of pointing objects is larger than a preset number; determining, according to the recorded moving statuses and coordinates of the pointing objects, whether one pointing object moves in a direction parallel to the direction in which another pointing object moves; and generating control signals to execute a gesture associated with the determination result.
According to one exemplary embodiment of the present invention, a device for identifying multi-touch points comprises a detecting module, a determination module, a recording module and a processing module. The detecting module is configured to detect one or more induction signals induced by one or more pointing objects that come into contact with a touch-sensitive surface. The determination module is configured to determine the number of pointing objects. The recording module is configured to record the moving status and coordinates of each pointing object if the number of pointing objects is larger than a preset number. The processing module is configured to determine, according to the moving status and coordinates of each pointing object, whether one pointing object moves in a direction parallel to the direction in which another pointing object moves, and to generate control signals to execute a gesture associated with the determination result.
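The following Python sketch arranges the described modules as one pipeline. The class name, method names, the injected callables and the "SHIFT" control-signal placeholder are assumptions made for this sketch; the disclosure defines the behavior of the modules, not an API.

```python
PRESET_NUMBER = 1  # analyze only when more pointing objects than this touch

class ShiftGestureDevice:
    def __init__(self, count_objects, moves_are_parallel):
        self.count_objects = count_objects            # determination module
        self.moves_are_parallel = moves_are_parallel  # processing module
        self.tracks = {}                              # recording module storage

    def on_induction_signal(self, signal, coords_by_object):
        """Detect, count, record, then decide, per the described flow."""
        if self.count_objects(signal) <= PRESET_NUMBER:
            return None
        for obj_id, xy in coords_by_object.items():   # record coordinates
            self.tracks.setdefault(obj_id, []).append(xy)
        # generate a control signal when the movements are parallel
        return "SHIFT" if self.moves_are_parallel(self.tracks) else None
```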
Having thus described example embodiments of the present disclosure in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
The present invention now will be described more fully hereinafter with reference to the accompanying drawings, in which preferred embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. In this regard, although example embodiments may be described herein in the context of a touch screen or touch-screen panel, it should be understood that example embodiments are equally applicable to any of a number of different types of touch-sensitive surfaces, including those with and without an integral display (e.g., touchpad). Also, for example, references may be made herein to axes, directions and orientations including X-axis, Y-axis, vertical, horizontal and/or diagonal; it should be understood, however, that any direction and orientation references are simply examples and that any particular direction or orientation may depend on the particular object, and/or the orientation of the particular object, with which the direction or orientation reference is made. Like numbers refer to like elements throughout.
As illustrated in
In operation, when a pointing object, such as a user's finger or a stylus, is placed on the touch screen, the touch-sensitive module 102 may generate one or more induction signals induced by the pointing object. The generated induction signals may be associated with a change in electrical current, capacitance, acoustic waves, electrostatic field, optical fields or infrared light. The detecting module 104 may detect the induction signals associated with the change induced by one or more pointing objects, such as two pointing objects moving in one or more directions on the touch screen. In an instance in which two pointing objects are simultaneously applied to the touch screen, the calculating unit 1062 may determine the number of pointing objects applied to the touch screen based on the number of rising waves and/or the number of falling waves of the induction signal. The number determining unit 1064 may output the calculated result to the recording module 108. The calculating unit 1062 may comprise a comparison unit (not shown) to compare values of the detected induction signal with a reference signal to determine at least one of the number of rising waves and the number of falling waves of the detected induction signal.
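As an illustration of the comparison unit's role, the sketch below marks where a sampled induction signal sits above the reference; the sample values are invented, and counting the False-to-True transitions in the resulting mask then yields the number of rising waves.

```python
def above_reference(signal, reference):
    """Return, per sampled point, whether the signal exceeds the reference."""
    return [v > reference for v in signal]

signal = [0.1, 0.6, 0.7, 0.2, 0.1, 0.8, 0.9, 0.3]
mask = above_reference(signal, reference=0.5)
print(mask)  # two False->True transitions: two rising waves, two objects
```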
In one exemplary embodiment, there may be a plurality of pointing objects in contact with the touch screen. The recording module 108 may record the moving status of each pointing object. The angle determining unit 1102 may determine an angle between a reference and a line connecting a start point and an end point. The reference may be any of a number of axes, directions and orientations, including the X-axis, Y-axis, vertical, horizontal, diagonal, right and/or left. The direction confirming unit 1104 may determine whether a pointing object moves in a direction parallel to the direction in which another pointing object moves. In some embodiments of the present invention, the processing module may further comprise a shift direction determining unit. The shift direction determining unit may determine the direction in which the pointing objects move.
As described herein, the touch-sensitive module 102 and the processing unit are implemented in hardware, alone or in combination with software or firmware. Similarly, the detecting module 104, the determination module 106 and the recording module 108 may each be implemented in hardware, software or firmware, or some combination of hardware, software and/or firmware. As hardware, the respective components may be embodied in a number of different manners, such as one or more CPUs (central processing units), microprocessors, coprocessors, controllers and/or various other hardware devices including integrated circuits such as ASICs (application-specific integrated circuits), FPGAs (field-programmable gate arrays) or the like. As will be appreciated, the hardware may include or otherwise be configured to communicate with memory, such as volatile memory and/or non-volatile memory, which may store data received or calculated by the hardware, and may also store one or more software or firmware applications, instructions or the like for the hardware to perform functions associated with operation of the device in accordance with exemplary embodiments of the present invention.
At step 600, the value of a first point of the induction signal is compared to a reference signal by the calculating unit 1062. In an instance in which the value of the first point is larger than the reference signal, the value of the previous point of the induction signal is compared to the reference signal by the calculating unit 1062. In an instance in which the value of the previous point is less than or equal to the reference signal at step 601, the wave is determined as a rising wave at step 602. In an instance in which the value of the previous point is larger than the reference signal, the determination module 106 may determine whether the first point is the last point in the induction signal at step 605. If it is determined to be the last point, the number of pointing objects may be determined at step 606 based on the number of rising waves and/or the number of falling waves, and may be output by the number determining unit 1064 to the recording module 108.
In an instance in which the value of the first point is less than the reference signal at step 600, the value of the previous point in the induction signal is compared to the reference signal at step 603. In an instance in which the value of the previous point is larger than or equal to the reference signal, the wave is determined as a falling wave at step 604. The process may then proceed to step 605 to determine whether the first point is the last point in the induction signal. In an instance in which the first point is not the last point in the induction signal at step 605, the process may proceed to select a next point and compare the value of the next point to the reference signal at step 600. If the point is determined to be the last point, the number of pointing objects may be determined at step 606 based on the number of rising waves and/or the number of falling waves and may be output by the number determining unit 1064. In an exemplary embodiment, the number of pointing objects is determined according to the maximum number of rising waves or falling waves of the first induction signal or the second induction signal. In an exemplary embodiment, if the number of rising waves is not equal to the number of falling waves, the process may await the next induction signals. In one exemplary embodiment, a first initial induction value and a second initial induction value may be predetermined. In the exemplary embodiment as illustrated in
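The counting loop of steps 600 through 606 can be sketched as follows. The assumption that the signal arrives as a list of sampled values, and that the predetermined initial induction value sits below the reference, are made for this sketch only.

```python
def count_pointing_objects(signal, reference):
    """Count rising/falling waves per steps 600-606 and derive the number
    of pointing objects; returns None while the counts are unequal."""
    prev = reference - 1.0  # assumed initial induction value below reference
    rising = falling = 0
    for value in signal:                          # steps 600/605: walk points
        if value > reference and prev <= reference:
            rising += 1                           # steps 601-602: rising wave
        elif value < reference and prev >= reference:
            falling += 1                          # steps 603-604: falling wave
        prev = value
    if rising != falling:
        return None                               # await next induction signals
    return rising                                 # step 606: number of objects

print(count_pointing_objects([0.1, 0.6, 0.2, 0.7, 0.1], reference=0.5))  # 2
```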
Different induction signal waves may be obtained with different analysis or processing methods.
Touch points may be determined by measuring the attenuation of waves, such as ultrasonic waves, across the surface of the touch screen. For instance, the processing unit may send a first electrical signal to a transmitting transducer. The transmitting transducer may convert the first electrical signal into ultrasonic waves and emit the ultrasonic waves to reflectors. The reflectors may refract the ultrasonic waves to a receiving transducer. The receiving transducer may convert the ultrasonic waves into a second electrical signal and send it back to the processing unit. When a pointing object touches the touch screen, a part of the ultrasonic wave may be absorbed, causing a touch event that may be detected by the detecting module 104 at that touch point. Coordinates of the touch point are then determined. An attenuated induction signal 902 crossed by a reference signal 904 and two attenuation parts 906 and 908 are illustrated in
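A sketch of locating the attenuation parts follows: regions where the received signal dips below the reference correspond to points at which a finger absorbed part of the ultrasonic wave. The sample values and the dip-finding helper are illustrative assumptions, and the mapping from a dip's timing to screen coordinates is omitted.

```python
def find_attenuated_regions(received, reference):
    """Return (start_index, end_index) pairs of attenuation dips."""
    regions, start = [], None
    for i, v in enumerate(received):
        if v < reference and start is None:
            start = i                      # dip begins
        elif v >= reference and start is not None:
            regions.append((start, i - 1))  # dip ends
            start = None
    if start is not None:
        regions.append((start, len(received) - 1))
    return regions

# Two dips, analogous to the two attenuation parts 906 and 908.
received = [1.0, 1.0, 0.4, 0.3, 1.0, 1.0, 0.5, 1.0]
print(find_attenuated_regions(received, reference=0.8))  # [(2, 3), (6, 6)]
```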
In an instance in which the absolute value of the difference between the coordinates of the start point F1′ and the end point F1 of the first pointing object on the X-axis is not greater than a predetermined value L, i.e., |X1−X1′|<=L, the first angle θ1 is obtained depending on the coordinates of the end point F1 and the start point F1′ of the first pointing object on the Y-axis. In an instance in which the difference between the coordinates of the end point F1 and the start point F1′ of the first pointing object on the Y-axis is not less than the predetermined value L, i.e., Y1−Y1′>=L, the first angle θ1 is determined as 90°. In an instance in which the difference between the coordinates of the end point F1 and the start point F1′ of the first pointing object on the Y-axis is not greater than the predetermined value −L, i.e., Y1−Y1′<=−L, the first angle θ1 is determined as −90°. In an instance in which the difference between the coordinates of the end point F1 and the start point F1′ of the first pointing object on the Y-axis is greater than the predetermined value −L and less than the predetermined value L, i.e., −L<Y1−Y1′<L, the first angle θ1 is obtained through various methods, such as the function arctan, i.e., θ1=arctan((Y1−Y1′)/(X1−X1′)).
Similarly, when the absolute value of the difference between the coordinates of the start point F2′ and the end point F2 of the second pointing object on the X-axis is not greater than the predetermined value L, i.e., |X2−X2′|<=L, the second angle θ2 is obtained depending on the coordinates of the end point F2 and the start point F2′ of the second pointing object on the Y-axis. In an instance in which the difference between the coordinates of the end point F2 and the start point F2′ of the second pointing object on the Y-axis is not less than the predetermined value L, i.e., Y2−Y2′>=L, the second angle θ2 of the second pointing object is determined as 90°. In an instance in which the difference between the coordinates of the end point F2 and the start point F2′ of the second pointing object on the Y-axis is not greater than the predetermined value −L, i.e., Y2−Y2′<=−L, the second angle θ2 of the second pointing object is determined as −90°. In an instance in which the difference between the coordinates of the end point F2 and the start point F2′ of the second pointing object on the Y-axis is greater than the predetermined value −L and less than the predetermined value L, i.e., −L<Y2−Y2′<L, the second angle θ2 is obtained through various mathematical functions, such as arctan, i.e., θ2=arctan((Y2−Y2′)/(X2−X2′)).
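The piecewise angle rule of the two preceding paragraphs can be expressed as a single function, sketched below in Python; atan2 is used in place of the quotient form of arctan to avoid dividing by zero when X1−X1′ is zero, which is an implementation choice for this sketch rather than part of the disclosure.

```python
import math

def movement_angle(x_start, y_start, x_end, y_end, L):
    """Angle between the start-to-end line and the X-axis, in degrees."""
    dx, dy = x_end - x_start, y_end - y_start
    if abs(dx) <= L:              # negligible horizontal movement
        if dy >= L:
            return 90.0           # moved essentially straight up
        if dy <= -L:
            return -90.0          # moved essentially straight down
    return math.degrees(math.atan2(dy, dx))

theta1 = movement_angle(10, 10, 60, 35, L=5)   # example first pointing object
theta2 = movement_angle(80, 12, 130, 40, L=5)  # example second pointing object
print(theta1, theta2)
```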
In an instance in which the difference between the first angle θ1 and the second angle θ2 is determined to be less than a predetermined value M at step 1010, the difference between the coordinates of the start point F1′ and the end point F1 of the first pointing object on the X-axis, i.e., X1−X1′, and the difference between the coordinates of the start point F2′ and the end point F2 of the second pointing object on the X-axis, i.e., X2−X2′, are compared to zero at step 1012. If both differences are greater than zero (X1−X1′>0 and X2−X2′>0), it is determined at step 1016 that the first pointing object moves in a direction parallel to the direction in which the second pointing object moves. In an instance in which at least one of the differences (X1−X1′ and/or X2−X2′) is less than zero at step 1012, the method proceeds to step 1014. In an instance in which both differences are less than zero (X1−X1′<0 and X2−X2′<0) at step 1014, it is determined at step 1016 that the first pointing object moves in a direction parallel to the direction in which the second pointing object moves. In an instance in which one and only one of the differences is greater than zero at step 1014, i.e., either X1−X1′>0 or X2−X2′>0, the method proceeds back to step 1002 to record new coordinates of the pointing objects. The predetermined values M and L are adjustable. The processing module 110 may then generate control signals to execute commands associated with the generated control signals. If the first pointing object moves in a direction parallel to the direction in which the second pointing object moves, the processing module 110 determines that the pointing objects perform a shifting gesture and generates control signals to execute shifting commands.
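The decision at steps 1010 through 1016 reduces to two tests, sketched below: the angles must differ by less than M, and both X-axis displacements must share a sign. The example threshold and values are invented.

```python
def is_shifting_gesture(theta1, theta2, dx1, dx2, M):
    """True when the two movements are parallel per steps 1010-1016."""
    if abs(theta1 - theta2) >= M:
        return False                  # step 1010: angles differ too much
    if dx1 > 0 and dx2 > 0:
        return True                   # step 1012 -> 1016: both move right
    if dx1 < 0 and dx2 < 0:
        return True                   # step 1014 -> 1016: both move left
    return False                      # mixed signs: back to recording (1002)

print(is_shifting_gesture(theta1=27.0, theta2=29.0, dx1=50, dx2=50, M=10))  # True
```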
Motion information may further comprise the moving direction, the coordinates of the pointing objects, and an angle θ between a reference and the line connecting a start point and an end point of a pointing object. The reference may be any of a number of axes, directions and orientations, including the X-axis, Y-axis, vertical, horizontal and/or diagonal. The processing module 110 may obtain the recorded motion information at step 1208. Control parameters and settings of the control parameters of the preset function may be determined by the parameter setting module 114 at step 1210 according to the motion information obtained at step 1208. The control parameters may comprise a paging direction or scrolling direction according to the angle θ, and a paging speed or scrolling speed according to the displacement S.
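A sketch of how motion information might map onto control parameters follows; the 45° and 135° sector boundaries, the direction labels and the speed scale factor are all assumptions for illustration, since the disclosure leaves these settings to the parameter setting module.

```python
def control_parameters(theta, S, speed_scale=0.5):
    """Map an angle and a displacement to a direction and speed setting."""
    if 45.0 <= abs(theta) <= 135.0:          # near-vertical: scroll
        direction = "scroll_up" if theta > 0 else "scroll_down"
    elif abs(theta) < 45.0:                  # near-horizontal, rightward
        direction = "page_forward"
    else:                                    # near-horizontal, leftward
        direction = "page_back"
    return {"direction": direction, "speed": S * speed_scale}

print(control_parameters(theta=90.0, S=120))  # {'direction': 'scroll_up', 'speed': 60.0}
```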
All or a portion of the system of the present invention, such as all or portions of the aforementioned processing module 110 and/or one or more modules of the device 100 for identifying a shifting gesture, may generally operate under control of a computer program product. The computer program product for performing the methods of embodiments of the present invention includes a computer-readable storage medium, such as a non-volatile storage medium, and computer-readable program code portions, such as a series of computer instructions, embodied in the computer-readable storage medium.
It will be understood that each block or step of the flowcharts, and combinations of blocks in the flowcharts, can be implemented by computer program instructions. These computer program instructions may be loaded onto a computer or other programmable apparatus to produce a machine, such that the instructions which execute on the computer or other programmable apparatus create means for implementing the functions specified in the block(s) or step(s) of the flowcharts. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the block(s) or step(s) of the flowcharts. The computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the block(s) or step(s) of the flowcharts.
Accordingly, blocks or steps of the flowcharts support combinations of means for performing the specified functions, combinations of steps for performing the specified functions and program instruction means for performing the specified functions. It will also be understood that each block or step of the flowcharts, and combinations of blocks or steps in the flowcharts, can be implemented by special purpose hardware-based computer systems which perform the specified functions or steps, or combinations of special purpose hardware and computer instructions.
It will be appreciated by those skilled in the art that changes could be made to the examples described above without departing from the broad inventive concept. It is understood, therefore, that this invention is not limited to the particular examples disclosed, but it is intended to cover modifications within the spirit and scope of the present invention as defined by the appended claims.
Foreign Application Priority Data
Number: 201110081252.8 | Date: Mar. 31, 2011 | Country: CN | Kind: national