This application claims the priority of Chinese patent application No. 201110005101.4, titled “METHOD AND DEVICE FOR CONTROLLING ZOOMING OF INTERFACE CONTENT OF TERMINAL” and filed with the State Intellectual Property Office on Jan. 4, 2011, which is hereby incorporated by reference in its entirety.
The invention relates to the field of terminal control, and in particular to a method and a device for zoom control of interface content of a terminal.
Existing mobile terminals display certain content, such as a web page or a picture, in such a way that the displayed object is zoomed out if its content is too large and exceeds the display region of the terminal screen, and is magnified when details need to be observed. Presently, there are mainly two methods for zooming the displayed object in or out. The first method is to operate by invoking a menu on the interface. This method has the disadvantages that multiple clicks on a keyboard or a touch screen are needed to implement the zooming, which makes the operation cumbersome and inefficient, and the presence of the menu on the interface obscures the displayed content; moreover, it is difficult to realize zoom control centered on an accurate point. The second method is to operate on a multi-touch enabled screen, i.e., by sliding two fingers such as the thumb and the index finger on the touch screen: the content is zoomed in when the two fingers move apart and zoomed out when the two fingers move together. This method requires multiple fingers, which makes the operation inconvenient and inefficient; moreover, it cannot realize zooming centered on an accurate point, and thus lacks operation accuracy.
A first object of the invention is to provide an efficient method for zoom control of interface content of a terminal.
A second object of the invention is to provide an efficient device for zoom control of interface content of a terminal.
In order to achieve the first object above, a method for zoom control of interface content of a terminal is provided according to the invention, the method includes: sensing a displacement of the terminal between two time points; and performing zoom control on the interface content of the terminal according to the displacement.
In order to achieve the second object above, a device for zoom control of interface content of a terminal is provided according to the invention, the device includes: a displacement detecting unit of the terminal, configured to sense a displacement of the terminal between two time points; and a zoom control unit of the terminal, configured to perform zoom control on the interface content of the terminal according to the displacement.
In the embodiments of the invention, the displacement of the terminal between two time points is sensed, and the zooming of the interface content is controlled according to the displacement. Therefore, the operation is simple and convenient, and the zooming efficiency is improved.
The drawings are provided for a further understanding of the invention and constitute a part of the specification. The invention is explained by the drawings together with the embodiments of the invention, and the drawings do not constitute limitations of the invention. In the drawings:
Preferred embodiments of the invention are illustrated below in conjunction with the drawings. It should be understood that, the preferred embodiments described herein are used only to illustrate and explain the invention, but not used to limit the invention.
Step 102: sensing a displacement of the terminal between two time points.
Step 104: performing zoom control on the interface content of the terminal according to the displacement.
In the embodiment, the displacement of the terminal between two time points is sensed, and the zooming of the interface content is performed according to the displacement. Therefore, the operation is simple and convenient, and the zooming efficiency is improved.
Step 201: accessing a content viewing program, such as a browser or a picture viewer, in the terminal to view a web page or a picture.
Step 202: activating a displacement detecting unit of the terminal (see explanations of
Step 203: judging whether the touch screen is clicked (i.e., whether there is touch information), and if it is determined that there is a click, performing step 204; otherwise, repeating step 203.
Step 204: determining a chosen point, i.e., a touch point, on the interface content of the terminal according to the touch information; triggering the displacement detecting unit to make it start to work at the time when the chosen point is determined (this time is determined as a motion start time, i.e., the time when the touch screen is clicked. As a matter of course, in a specific operation, the motion start time for starting the detection of the displacement may be independent from the time when the chosen point is determined. For example, the interface of the terminal may be touched to determine the detection start time, or the motion of the terminal may be sensed and the time when the terminal starts to move is determined as the detection start time, and thus the operation for determining the detection start time is omitted).
In a specific operation, the displacement detecting unit may sense in real time the displacement of the terminal between the motion start time and every motion lasting time until the motion stop time, i.e., continuously sense the displacement between the start time and every time before the motion stop time (including the motion stop time) (correspondingly, the zooming out or zooming in of the interface content is continuously performed in step 205); or, the displacement detecting unit may sense the displacement of the terminal between the motion start time and the motion stop time, that is, only the displacement between the two points, i.e., the start time and the motion stop time, is calculated (correspondingly, the zooming out or zooming in of the interface content is performed only at the motion stop time in the step 205).
The displacement detecting unit may be any one of an acceleration sensor, an ultrasonic detector or a camera. Specific explanations of these displacement detecting units are as follows. For the displacement detecting unit based on acceleration measurement, considering the user's habit when viewing the interface content, i.e., the screen faces the user and moves in a direction away from or toward the user, a sensitive axis of the acceleration sensor may be set in a direction perpendicular to the screen (this solution accounts for the user's habit and is a preferred solution). In this way, when the acceleration sensor works, the displacement in the direction perpendicular to the screen caused by external forces other than gravity may be calculated by removing the effect of the gravity component. The specific implementation is as follows.
The acceleration sensor may be a three-axis micromachined acceleration sensor (other types of acceleration sensors, such as a two-axis micromachined acceleration sensor, may be chosen as needed, and this should not be construed as a limitation). In a stationary state of the terminal, the attitude of the terminal in the air is in a certain direction, and the angles θ1, θ2, θ3 between the three axes of the three-axis micromachined acceleration sensor and the direction of gravity (assuming θ1 is the angle between gravity and the sensitive axis perpendicular to the screen, the displacement sensitive axis for short) may be calculated from the acceleration values output on the three axes; the angles θ1, θ2, θ3 are the initial attitude angles of the terminal in the air. If the terminal does not rotate in the stationary state or during the motion, i.e., the attitude angles of the terminal remain unchanged, then at every motion time, the value obtained by subtracting the cosine component of the gravity acceleration at the angle θ1 from the acceleration output on the displacement sensitive axis is the acceleration used to calculate the displacement along that axis (the displacement is obtained from the acceleration by double integration; various approximation algorithms may be employed in specific implementations, as explained in the following paragraphs).
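The gravity-removal step described above can be sketched as follows. This is a minimal illustration only, assuming the terminal's attitude (and hence θ1) stays fixed during the motion and that the accelerometer output on the sensitive axis includes the gravity component; the function name is a hypothetical choice, not taken from the text.

```python
import math

def motion_acceleration(axis_output, theta1_deg, g=9.81):
    """Acceleration along the displacement sensitive axis caused by
    external forces other than gravity, obtained by subtracting the
    gravity component projected onto that axis (fixed-attitude case)."""
    return axis_output - g * math.cos(math.radians(theta1_deg))
```

For example, a terminal lying still with the sensitive axis aligned with gravity outputs roughly g on that axis, and the motion acceleration correctly evaluates to zero.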
If the terminal rotates during the motion, i.e., the attitude angles of the terminal change during the motion, a gyroscope (for example a three-axis micromachined gyroscope) may be used to calculate the dynamic attitude angles. The initial attitude angles of the terminal in the air may be determined by the acceleration sensor as described in the preceding paragraph. During the motion, the three-axis gyroscope outputs the angular speed ω. Assuming that the initial angular speed is ωc, the sampling time points are T0, T1, T2, . . . Tn, the angular speeds at those time points are ω0, ω1, ω2, . . . ωn respectively, and the rotation angles at those time points are θ0, θ1, θ2, . . . θn respectively, the relationship among the sampling time, the angular speed and the rotation angle is shown in formula (1) and formula (2):
θ0=ωc*T0 (1)
θn=θ0+ω1*(T1−T0)+ω2*(T2−T1)+ . . . +ωn*(Tn−Tn−1), n≥1 (2)
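Formulas (1) and (2) amount to a cumulative sum of angular speed over the sampling intervals; a sketch (the function name and list representation are illustrative assumptions):

```python
def rotation_angles(omega_c, times, omegas):
    """Rotation angle at each sampling time point.
    theta_0 = omega_c * T0 per formula (1); each later theta_n adds
    omega_n * (T_n - T_{n-1}) per formula (2)."""
    thetas = [omega_c * times[0]]  # formula (1)
    for n in range(1, len(times)):
        thetas.append(thetas[-1] + omegas[n] * (times[n] - times[n - 1]))  # formula (2)
    return thetas
```

With ωc = 0 (the terminal held still before the motion, as the text assumes below), θ0 is zero and the angles accumulate purely from the gyroscope output.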
If ωc=0, i.e., the initial rotation speed is zero (that is, the user keeps the terminal from rotating while the terminal is stationary), the rotation angle about each axis (each sensitive axis of the acceleration sensor) may be calculated according to the above-described formulas, so as to obtain the attitude angles of the terminal in the air at any time. The component of the gravity acceleration on each sensitive axis of the acceleration sensor at any time is calculated from the gravity acceleration and the attitude angles (the sensitive axes of the acceleration sensor may be set to coincide with the sensitive axes of the gyroscope respectively). Thus, the acceleration α of the terminal in the direction of each sensitive axis caused by external forces other than gravity is obtained by subtracting the corresponding gravity component from the acceleration output of that sensitive axis of the three-axis micromachined acceleration sensor (illustration is made here by taking the acceleration and the displacement in the direction of the displacement sensitive axis as an example). The specific implementation is as follows: assuming that the initial speed of the terminal is C0, the sampling time points are T0, T1, T2, . . . Tn, the accelerations at those time points are α0, α1, α2, . . . αn respectively, and the speeds at those time points are V0, V1, V2, . . . Vn respectively, the relationship may be represented by the following formulas:
V0=C0+T0*α0 (3)
Vn=V0+α1*(T1−T0)+α2*(T2−T1)+ . . . +αn*(Tn−Tn−1), n≥1 (4)
For the displacement at each time point, various approximation methods may be used; for example, a rectangular approximation in which the speed over each sampling interval is treated as constant, i.e., Sn=Sn−1+Vn*(Tn−Tn−1).
In a specific operation, C0 may be set to 0, i.e., the moving speed of the terminal is zero when the user presses the screen. It should be understood by those skilled in the art that, the sensitive axis of the acceleration sensor may be set in any direction, which is not limited to the direction parallel to the direction perpendicular to the screen; and the zooming of the interface content may be controlled according to the displacements on the multiple sensitive axes of the acceleration sensor.
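Formulas (3) and (4), combined with a simple rectangular approximation for the displacement, can be sketched as follows. The choice of approximation is left open by the text, so the displacement step here is one possible reading; the function name is hypothetical.

```python
def integrate_motion(times, accels, c0=0.0):
    """Speed at each sampling time per formulas (3) and (4), plus a
    rectangular approximation of the displacement (speed treated as
    constant over each sampling interval)."""
    v = [c0 + times[0] * accels[0]]  # formula (3)
    for n in range(1, len(times)):
        v.append(v[-1] + accels[n] * (times[n] - times[n - 1]))  # formula (4)
    s = [0.0]
    for n in range(1, len(times)):
        s.append(s[-1] + v[n] * (times[n] - times[n - 1]))  # rectangle rule
    return v, s
```

Setting c0=0 matches the text's assumption that the terminal is not moving at the instant the user presses the screen.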
In a specific operation, the ultrasonic detector may be mounted on the same side of the terminal as the screen. By emitting an ultrasonic wave that is reflected by a reference object (e.g., the user's face), the ultrasonic detector may obtain the distance between the terminal and the reference object at different times (it should be noted that the reference object must be the same object at the different times in order to ensure calculation accuracy). The time difference between the reflected wave and the incident wave may be output in real time based on the inherent property of the ultrasonic detector itself; thus the displacement between the start time and every motion lasting time until the motion stop time (including the motion stop time) is obtained. In a specific operation, considering that the terminal is operated by a person's hand, there will be no sudden change in the displacement; therefore the calculation of the displacement may be stopped once a sudden change appears in the ultrasonic measurement.
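The time-of-flight computation described above can be sketched as follows; the speed-of-sound constant and the function names are illustrative assumptions, not values given in the text.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (assumed)

def echo_distance(delay_s):
    """Distance to the reference object from the round-trip echo delay
    (the wave travels to the object and back, hence the division by 2)."""
    return SPEED_OF_SOUND * delay_s / 2.0

def displacement_between(delay_start_s, delay_now_s):
    """Terminal displacement between two time points as the change in
    echo distance; positive means the terminal moved away from the
    reference object."""
    return echo_distance(delay_now_s) - echo_distance(delay_start_s)
```

For example, an echo delay growing from 2 ms to 3 ms corresponds to the terminal moving about 17 cm away from the face.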
For the camera, the scaling of an image of a target object (such as a person's face or eye) between the motion start time and each motion operating time (including every motion lasting time until the motion of the terminal stops) is tracked by real-time shooting, and the displacement of the terminal between the motion start time and the motion operating time is calculated according to a preset correspondence between scalings and distances. Alternatively, the zoom control of the interface content of the terminal is performed directly according to the scaling of the image of the target object between the motion start time and the motion operating time as sensed by the camera of the terminal.
Specifically, assume that at the motion start time the camera locks onto several special positions on the face, such as the eye, the nose or the mouth, and the pixel count or the profile area of each special position is calculated. When the terminal or the locked object moves, a scaling factor is obtained by measuring the size or the profile area of each special position in real time, from which the scaling factor of the displacement is deduced, and zoom control of the interface content of the terminal is performed accordingly. In a specific operation, when all the special positions are lost, the zooming is stopped with the scaling factor being the last scaling factor before they were lost; when only some of the special positions are lost, the calculation is performed according to the remaining special positions.
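The feature-area logic above, including the handling of lost positions, can be sketched as follows. This is a hypothetical illustration: the dictionary representation, the averaging over features, and the use of the square root (linear size scales as the square root of area) are assumptions layered on top of the text.

```python
import math

def scaling_factor(start_areas, current_areas):
    """Scaling factor from the profile areas of tracked facial features.
    Each dict maps a feature name to its pixel area; a feature absent
    from current_areas is treated as lost. Returns None when every
    feature is lost (the caller should then keep the last factor,
    per the text); otherwise averages over the remaining features."""
    common = [f for f in start_areas if f in current_areas]
    if not common:
        return None  # all special positions lost: stop updating the zoom
    ratios = [math.sqrt(current_areas[f] / start_areas[f]) for f in common]
    return sum(ratios) / len(ratios)
```

For instance, if every tracked feature's area quadruples, the face appears twice as large linearly, i.e., the terminal has moved toward the user.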
Step 205, the displacement detecting unit sends the measured displacement to a transmission unit (see the explanations with respect to
Specifically, the zoom control unit performs, according to the displacement and the chosen point on the interface content of the terminal, zoom control on the interface content of the terminal by using the chosen point as a center. The zoom control unit performs zooming out or zooming in on the interface content of the terminal according to the direction of the displacement, and controls the scaling of the zooming according to the magnitude of the displacement. For example, whether zooming in or zooming out is performed when the terminal moves toward the face may be set according to actual needs; furthermore, a control switch may be set to choose between two control modes, where in one mode the interface content is zoomed in when the terminal moves towards the face, and in the other mode the interface content is zoomed out when the terminal moves towards the face. The scaling controlled by the magnitude of the displacement may also be set according to actual needs; for example, no zooming is performed if the movement distance is within 1 cm, zooming of 5% per 1 cm is performed on the interface content if the movement distance ranges from 1 cm to 5 cm, and zooming of 10% is performed on the interface content if the movement distance exceeds 5 cm.
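The example thresholds in the paragraph above can be sketched as a tiered mapping. Note that the text does not state whether the "10%" beyond 5 cm is per centimetre; the continuous per-centimetre reading below is an assumption, as is the function name.

```python
def zoom_percent(displacement_cm):
    """Zoom percentage from movement distance, per the example tiers:
    dead zone within 1 cm, 5% per cm from 1 cm to 5 cm, then 10% per
    additional cm beyond 5 cm (assumed continuation of the text's 10%)."""
    d = abs(displacement_cm)  # direction selects zoom-in vs zoom-out separately
    if d < 1.0:
        return 0.0
    if d <= 5.0:
        return 5.0 * d
    return 25.0 + 10.0 * (d - 5.0)
```

The sign of the displacement (toward or away from the face) would then choose between zooming in and zooming out, according to the configured control mode.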
Step 206, detecting the motion stop time; notifying the displacement detecting unit to stop working and stop the zooming of the interface content if the motion stop time is detected, then the operation is finished; and performing step 204 if the motion stop time is not detected.
In a specific operation, the motion start time and the motion stop time may be determined in any preset manner. For example, the time when the touch screen is initially clicked is determined as the motion start time; the time when the user lifts his finger from the touch screen after the real-time zooming reaches a satisfactory effect is set as the motion stop time; or, if the user's finger leaves the touch screen directly after clicking, the time when the user re-clicks the touch screen is set as the motion stop time.
It can be understood by those skilled in the art that performing zoom control on the interface content of the terminal by using a chosen point as a center is a preferred solution, since it achieves zooming centered on an accurate point. Controlling whether the interface content is zoomed out or zoomed in according to the direction of the displacement is also a preferred solution; in an actual operation, however, it is also possible to zoom out if the displacement is within one displacement value range and zoom in if the displacement is within another displacement value range. Controlling the scaling according to the displacement value is also a preferred technical solution; in an actual operation, the zooming may instead be performed according to a preset scale value. Sensing in real time the displacement of the terminal between the motion start time and every motion lasting time until the motion stop time, and performing the zooming of the interface content in real time, is also a preferred technical solution, which makes it convenient for the user to know in real time whether the desired zooming has been achieved. Using the time when the chosen point is determined as the displacement detecting time is also a preferred technical solution. Moreover, the core of the solution is to control the zooming of the interface content according to the displacement, so it is not limited to sensing the displacement only when the terminal moves; that is, the sensing of the displacement is not based on the determination of a specific sensing time, and the displacement may be sensed at any time so as to perform zoom control on the interface.
In the embodiment, by adding a displacement detecting unit to a terminal having a touch screen, the terminal can sense in real time its motion displacement between two time points and perform zoom control according to the displacement; moreover, the zooming of the displayed content can be performed by using a touch point on the touch screen as a center while the terminal moves. The method according to the embodiment can be operated simply and rapidly, and the scaling can be controlled more accurately, thus achieving a better human-computer interaction experience.
In a specific operation, the displacement detecting unit 34 may be triggered to work when the touch screen 32 is touched or clicked. Further, when the touch screen 32 is triggered, the processor 36 may send a control signal to the displacement detecting unit 34 to control the displacement detecting unit 34 to start and stop working. As illustrated in
It will be understood by those skilled in the art that, corresponding to the explanation of
The displacement detecting unit 42 may include: a displacement detecting sub-unit 422, configured to sense in real time the displacement of the terminal between a motion start time and every motion lasting time until the motion stop time of the terminal, or sense the displacement of the terminal between the motion start time and the motion stop time, where the motion start time is the time when the chosen point is determined on the interface content of the terminal; and an activation sub-unit 424, configured to trigger the displacement detecting sub-unit 422 to work when the touch control unit determines the chosen point.
The transmission unit 44 may include: a first transmission unit (not shown), configured to send the displacement sensed by an acceleration sensor, an ultrasonic detector or a camera and the chosen point to the zoom control unit 46; a second transmission unit (not shown), configured to send scaling of an image of a target object between two time points which is sensed by the camera to the zoom control unit for performing zoom control on the interface content of the terminal.
The zoom control unit 46 may include: a first zoom control sub-unit 460, configured to perform, according to the displacement and the chosen point of the interface content, zoom control on the interface content of the terminal by using the chosen point of the interface content as a center; a second zoom control sub-unit 462, configured to perform zooming out or zooming in on the interface content of the terminal according to a direction of the displacement; and a scale control sub-unit 464, configured to control the scaling of the zooming out or zooming in of the interface content of the terminal according to the magnitude of the displacement.
As the explanation of each of the embodiments in
an acceleration sensor, configured to sense the displacement of the terminal in a direction perpendicular to the screen of the terminal between two time points, preferably configured to sense in real time the displacement of the terminal in a direction perpendicular to the screen of the terminal between a motion start time and every motion lasting time until the motion stop time of the terminal; or sense the displacement of the terminal in a direction perpendicular to the screen of the terminal between the motion start time and the motion stop time; or
an ultrasonic detector, configured to sense the displacement of the terminal between two time points with reference to a same reference object; or
a camera, configured to sense scaling of an image of a target object between two time points, and calculate the displacement of the terminal between the two time points according to a preset correspondence between scalings and distances.
In the embodiment, by embedding the displacement detecting unit 42 in the terminal and triggering the displacement detecting unit 42 to work at the time when a determined point (i.e., the chosen point) on the touch control unit 40 is clicked, zoom control of the interface content is achieved according to the magnitude and direction of the displacement when the terminal moves, so the operation efficiency of the interface content zooming is improved; furthermore, the zooming accuracy is improved since the zooming of the interface content is performed by using the determined point as a center.
Finally, it should be noted that the above-disclosed are only preferred embodiments of the invention, and are not intended to limit the invention. Although the invention is illustrated in detail with reference to the embodiments described above, those skilled in the art can make modifications to the technical solution in the embodiments mentioned above, or substitute equivalent features for some of the technical features. Any changes, equivalents or modifications made within the spirit and principle of the invention should be included in the protection scope of the invention.
Number | Date | Country | Kind |
---|---|---|---|
201110005101.4 | Jan 2011 | CN | national |
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/CN2011/085159 | 12/31/2011 | WO | 00 | 7/1/2013 |