This application claims priority under 35 U.S.C. §119(a) to an application entitled “METHOD FOR PROCESSING IMAGE FOR MOBILE COMMUNICATION TERMINAL” filed in the Korean Intellectual Property Office on Dec. 4, 2006 and assigned Serial No. 2006-0121662, the contents of which are incorporated herein by reference.
1. Field of the Invention
The present invention relates to a method for processing an image for a mobile communication terminal, and in particular to a method that enables the mobile communication terminal to display the image while enlarging, reducing, or moving the image.
2. Description of the Related Art
Owing to developments in mobile communication terminal technology, it is possible to perform voice communication using a mobile communication terminal with few constraints of time and place. By adding functions to the mobile communication terminal, a user may be provided with, for example, character information, picture information, an MP3 music file, and a game, and receives the character information, the picture information, the MP3 music, and the game through a screen of the mobile communication terminal.
An image, such as a multimedia file, may often be processed through the mobile communication terminal. In particular, zooming and panning are commonly used methods for processing the image. For example, zooming and panning are used to enlarge or reduce an image in a preview condition or an album of a camera, or to enlarge or reduce a character or data in a particular application, such as a file viewer.
A method of processing the image, such as zooming and panning, must be performed by selecting a key or an option item of the mobile communication terminal. A conventional mobile communication terminal additionally assigns the zooming function and the panning function to function keys when the mobile communication terminal enters a specific mode, for example, for the camera, the album, or the file viewer. Two keys are required to perform a zoom-in process and a zoom-out process, and a further key is required to move an image up, down, left, and right. Typically, two volume keys are used to perform the zooming function and a navigation key is used to perform the panning function. When a user uses both the zooming function and the panning function, it is inconvenient for the user to alternately operate the volume keys and the navigation key. It is also inconvenient for the user to press a key for an extended period, or to press the key repeatedly, to adjust the image to a required size.
The present invention has been made in an effort to solve the above problems, and provides a method that enables convenient use of a zooming function or a panning function.
The present invention further provides a method that removes an inconvenience of alternately using a plurality of keys for performing both the zooming function and the panning function.
The present invention further provides a method that enables the same key to be used for performing both the zooming function and the panning function.
In accordance with the present invention, a method for processing an image for a mobile communication terminal includes selecting and displaying a beginning point on a screen displayed with an image; and zooming the image using the beginning point as a center of the image according to location information of an end point corresponding to the beginning point by establishing and moving the end point connected to the beginning point.
In accordance with the present invention, a method for processing an image for a mobile communication terminal includes selecting a point on an image displayed on a screen as a beginning point using a pointer; moving the beginning point with the image to a preset point on the screen and displaying the beginning point with the image; and zooming the image using the beginning point as a center of the image according to location information of an end point of the pointer corresponding to the beginning point by moving the pointer from the beginning point to the end point.
In accordance with the present invention, a method for processing an image for a mobile communication terminal includes selecting a point on an image displayed on a touch screen as a beginning point by touching the point with a touch apparatus; moving the beginning point to a preset point on the touch screen and displaying the image by using the beginning point as a center of the image; and zooming the image using the beginning point as a center of the image according to location information of an end point corresponding to the beginning point of drag by establishing the end point with a dragging motion of the touch apparatus on the touch screen.
In accordance with the present invention, a method for processing an image for a mobile communication terminal includes selecting a point on an image displayed on a touch screen as a beginning point by touching the point with a touch apparatus; and zooming the image using the beginning point as a center of the image according to location information of an end point of a drag corresponding to the beginning point by dragging the touch screen using the beginning point as a start point while touching the beginning point.
The objects, features and advantages of the present invention will be more apparent from the following detailed description in conjunction with the accompanying drawings, in which:
The present invention will now be described with reference to the accompanying drawings in detail. The same reference numbers are used throughout the drawings to refer to the same or like parts. Detailed descriptions of well-known functions and structures incorporated herein may be omitted to avoid obscuring the subject matter of the present invention.
Referring to
The control unit 11 performs overall control of the mobile communication terminal 10. The control unit 11 controls image processes, such as a zooming process and a panning process.
The input unit 12 provides a plurality of keys for a user input to the mobile communication terminal 10 and outputs key data to the control unit 11 corresponding to a key selected by a user. User commands input through the input unit 12 may be for controlling an image display or an image process.
The storage unit 13 stores required programs for controlling operation of the mobile communication terminal 10 and data resulting from execution of the required programs. The storage unit 13 also stores an image, programs for processing the image, and data resulting from execution of the programs. The image includes a preview image of the camera 16, a stored image in an album, and a displayed image that is displayed by executing a file viewer.
The wireless communication unit 14 modulates a signal output from the control unit 11, up-converts the signal to a radio frequency, and transmits the radio frequency signal through an ANTenna (ANT). The wireless communication unit 14 also down-converts and demodulates a radio frequency signal received through the antenna ANT, and outputs the resulting signal to the control unit 11.
The audio processing unit 15 converts an audio signal input through a MiCrophone (MIC) into digital format under the control of the control unit 11, demodulates audio data received by the wireless communication unit 14, and outputs the audio data through a SPeaKer (SPK).
The camera 16 produces image data by photographing an image. That is, the camera 16 produces the image data by photographing the image according to selection of a photographing mode through the input unit 12. The camera 16 includes an image sensor to transform an optical signal of a viewed object into an analog signal, and a signal-processing unit to transform the analog signal into a digital signal.
The image-processing unit 17 processes image data output from the camera 16 into a format required by the display unit 18. The image-processing unit 17 also edits the image data under the control of the control unit 11.
The display unit 18 displays function menus performed in the mobile communication terminal 10 and data stored in the storage unit 13, as an image on a screen 18a. The display unit 18 displays a preview image output from the image-processing unit 17, an image stored in an album, and a document or data output using a file viewer.
Particularly, the control unit 11 processes an image through a Graphical User Interface (GUI) using a zoom vector 19, displayed on the screen 18a. Operation of the zoom vector 19 may be performed by a navigation key, a touch pad, or an optical sensor. The zoom vector 19 may be operated by a pointer 21.
The zoom vector 19 includes a beginning point 19a, a connect line 19b, and an end point 19c. The beginning point 19a is determined by a first selecting point using a pointer 21 on an image. The beginning point 19a is a base line of a zooming as well as a start point of a panning. The connect line 19b is a line connecting the beginning point 19a and a point at which the pointer is currently located (“current point”). The length of the connect line 19b is related to a zooming scale. The end point 19c is the point at which the pointer 21 is currently located.
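The zoom vector described above can be sketched as a small data structure; the class and field names below are illustrative assumptions, not part of the original disclosure:

```python
import math
from dataclasses import dataclass

@dataclass
class ZoomVector:
    """Sketch of the zoom vector 19: a beginning point (the zoom center
    and pan start), an end point (the current pointer location), and the
    connect line implied between them."""
    begin_x: float  # beginning point 19a
    begin_y: float
    end_x: float    # end point 19c (current pointer location)
    end_y: float

    def connect_line_length(self) -> float:
        # Length of the connect line 19b; related to the zooming scale.
        return math.hypot(self.end_x - self.begin_x,
                          self.end_y - self.begin_y)
```

For example, a pointer moved from (0, 0) to (3, 4) yields a connect line of length 5.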
The zoom vector 19 is displayed on an image of the screen 18a through operation of the pointer 21. The image is processed and displayed according to the displayed zoom vector 19.
If the pointer 21 selects a point on an image displayed on the screen 18a, the control unit 11 displays a beginning point 19a on the image. The control unit 11 either displays the beginning point 19a at the point selected by the pointer 21, or moves the image, together with the selected point, to a preset point on the screen 18a and displays the beginning point 19a there. The latter case enables panning to be performed by selecting the beginning point 19a. The panning movement corresponds to the distance between the point selected by the pointer 21 and the preset point.
If the pointer 21 is moved from the beginning point 19a through the input unit 12, the control unit 11 displays the beginning point 19a, the connect line 19b, and the end point 19c on the screen 18a of the display unit 18. The control unit 11 processes an image by location information of the zoom vector 19, that is, by location information of the beginning point 19a and the end point 19c.
If the pointer 21 then selects the end point 19c through the input unit 12, the control unit 11 terminates the zooming and removes the zoom vector 19 from the screen 18a of the display unit 18.
Location information of the zoom vector 19, and the zooming related to it, is as follows.
If the end point 19c of a movement from the beginning point 19a is located in an upper-right quadrant or an upper-left quadrant of the screen 18a, a zoom-in is performed. The zoom-in enlargement is in direct proportion to the length of the zoom vector 19 in the direction of the Y-axis.
If the end point 19c of a movement from the beginning point 19a is located on the X-axis, a zoom-stop is performed. That is, the zoom-stop is performed when the zoom vector 19 is located on the X-axis.
If the end point 19c of a movement from the beginning point 19a is located in a lower-right quadrant or a lower-left quadrant of the screen 18a, a zoom-out is performed. The zoom-out reduction is in direct proportion to the length of the zoom vector 19 in the direction of the Y-axis. If the end point 19c is located on a −A line shown in
The beginning point 19a is displayed at the center of the X-axis and Y-axis in an ellipse format for a user to easily distinguish the zoom-in, the zoom-stop and the zoom-out. As shown in
According to this exemplary embodiment of the present invention, the zooming scale is in direct proportion to the length of the zoom vector 19 in the direction of the Y-axis; however, the present invention is not limited thereto. In other embodiments, the zooming scale may be in direct proportion to the full length of the zoom vector 19, that is, to the distance between the beginning point 19a and the end point 19c. Either the X-axis or the Y-axis may be established as a base line of the zooming. The zooming enlargement may thus be in direct proportion to the length of the zoom vector 19 in the direction of the X-axis, or to the full length of the zoom vector 19.
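Under the axis convention above (beginning point as origin, Y increasing upward, zoom-in above the X-axis, zoom-stop on it, zoom-out below it, scale proportional to the Y-axis length), the zoom decision might be sketched as follows. The 1% scale change per unit of Y-axis length and the minimum factor are invented constants for illustration:

```python
def zoom_action(end_y: float) -> str:
    """Classify the zoom from the end point's Y coordinate, taken with
    the beginning point as origin (positive = above the X-axis)."""
    if end_y > 0:
        return "zoom-in"    # upper-left or upper-right quadrant
    if end_y < 0:
        return "zoom-out"   # lower-left or lower-right quadrant
    return "zoom-stop"      # end point on the X-axis

def zoom_factor(end_y: float, step: float = 0.01,
                min_factor: float = 0.1) -> float:
    """Scale factor in direct proportion to the Y-axis length of the
    zoom vector; `step` and `min_factor` are assumed values."""
    return max(1.0 + step * end_y, min_factor)
```

A drag 10 units above the beginning point would then enlarge the image by roughly 10%, while a drag the same distance below would reduce it accordingly.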
According to the present invention, an image is processed by a pointer 21 displayed on the screen 18a. However, the display unit 18 may also process the image through touching of a touch screen by a user. The beginning point 19a may be selected by touching the touch screen with a touch apparatus, and the zooming may be performed by dragging from the touched point on the touch screen. The touch screen then functions as both the display unit 18 and the input unit 12.
Referring to
The control unit 11 determines whether a point of the image is selected as a beginning point 19a on the screen 18a, in Step S32. If a beginning point 19a is not selected, the control unit 11 continues to display the current image. If a beginning point 19a is selected, the control unit 11 displays the beginning point 19a on the image, in Step S33.
The control unit 11 determines whether an end point 19c is moved starting from the beginning point 19a after establishing the end point, in Step S34. If an end point 19c is not moved, the control unit 11 continues to display the beginning point 19a. If an end point 19c is moved, the control unit 11 displays a zoom vector 19 on the screen 18a and calculates location information of the beginning point 19a and the end point 19c as location information of the zoom vector 19, in Step S35. Because the beginning point 19a is the point of origin of an X-axis and a Y-axis, location information of the zoom vector 19 is an X-Y coordinate of the end point 19c. The control unit 11 determines whether the end point 19c of a movement from the beginning point 19a is located above the beginning point 19a, below the beginning point 19a, or horizontally level with the beginning point 19a, through the coordinate of the end point 19c. The control unit 11 calculates the length of the zoom vector 19 in the direction of the Y-axis through the coordinate of the end point 19c. The length of the zoom vector 19 in the direction of the Y-axis corresponds to the distance in the direction of the Y-axis from the coordinate of the end point 19c to the beginning point 19a.
The control unit 11 performs the zooming process using the beginning point 19a as a center according to the calculated location information of the zoom vector 19, in Step S36.
The control unit 11 determines whether the end point 19c is selected, in Step S37. If the end point 19c is not selected, the control unit 11 repeats the processes of Steps S35 and S36. If the end point 19c is selected, the control unit 11 terminates the zooming, in Step S38. The control unit 11 removes the zoom vector 19 from the screen 18a.
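The flow of Steps S32 through S38 above can be sketched as a small event loop; the event names and the `apply_zoom` callback are illustrative assumptions, not part of the original disclosure:

```python
def run_zoom_loop(events, apply_zoom):
    """Drive the zooming from a stream of (kind, x, y) pointer events,
    where `kind` is one of 'select', 'move', 'end' (assumed names).
    `apply_zoom` receives the beginning point and the end point's
    coordinate relative to it, as in Steps S35 and S36."""
    begin = None
    for kind, x, y in events:
        if kind == "select" and begin is None:
            begin = (x, y)                        # Step S33: beginning point
        elif kind == "move" and begin is not None:
            rel = (x - begin[0], y - begin[1])    # Step S35: vector location
            apply_zoom(begin, rel)                # Step S36: zoom about begin
        elif kind == "end":
            begin = None                          # Steps S37-S38: terminate
    return begin
```

Each movement of the end point re-computes the zoom vector's location information and re-applies the zoom, until the end point is selected and the vector is removed.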
If the end point 19c is located above the beginning point 19a, the control unit 11 performs a zoom-in process at a zoom enlargement corresponding to the calculated length of the zoom vector 19 in the direction of the Y-axis.
If the end point 19c is located horizontally level with the beginning point 19a, in Step S363, the control unit 11 stops the zooming process, in Step S364.
If the end point 19c is located below the beginning point 19a, in Step S365, the control unit 11 performs a zoom-out process at a zoom reduction corresponding to the calculated length of the zoom vector 19 in the direction of the Y-axis, in Step S366. As described before, the zoom vector 19 is embodied by a Graphical User Interface (GUI) using a pointer 21 or by a GUI using a touch screen.
As shown in
The zoom-in process is indicated by displaying arrows centered on the beginning point 19a and pointing outwards away from the beginning point 19a in the direction of the X-axis and Y-axis.
As shown in
The zoom-out process is indicated by displaying arrows centered on the beginning point 19a and pointing inwards towards the beginning point 19a in the direction of the X-axis and Y-axis.
As shown in
In the process sequence herein described with reference to
Examples of a pointer 21 operated by an optical sensor are shown; however, the pointer 21 may also be operated by a touch pad or a navigation key. Movement of the end point 19c of the zoom vector 19 using a touch pad is performed by a dragging movement, the same as with the optical sensor. Movement of the end point 19c of the zoom vector 19 using a navigation key is performed in a stepwise manner by repeated pressing of a key.
As shown in
If a user drags from the beginning point 19a as a start point with the touch apparatus 23, the zooming process is performed. Touching to select the beginning point 19a and dragging to perform the zooming process may be performed consecutively or independently. When the zooming process is performed consecutively, the zooming is performed by dragging without detaching the touch apparatus 23 from the touch screen 18b after the beginning point 19a is selected with the touch apparatus 23. When the zooming process is performed independently, the beginning point 19a is selected by touching the touch screen 18b with the touch apparatus 23 and then detaching the touch apparatus 23 from the touch screen 18b. The zooming process is then performed by dragging from the beginning point 19a as a start point.
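The consecutive and independent modes differ only in whether the touch is lifted between selecting the beginning point and dragging. A hedged sketch of that distinction, with touch-event names invented for illustration:

```python
def classify_zoom_gesture(kinds):
    """kinds: ordered touch-event kinds, e.g. ['down', 'drag', 'up'] or
    ['down', 'up', 'down', 'drag', 'up'] (names are assumptions).
    Returns how the zooming drag relates to the selecting touch."""
    if "drag" not in kinds or "down" not in kinds:
        return None  # no zooming drag occurred
    first_drag = kinds.index("drag")
    # If the touch was lifted before dragging began, selection and
    # zooming were performed independently; otherwise consecutively.
    return "independent" if "up" in kinds[:first_drag] else "consecutive"
```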
As shown in
Although, in this exemplary embodiment, the beginning point 19a is selected as the start point of the drag, the present invention is not limited thereto. For example, the drag may start not only from the beginning point but from any point on the touch screen. The zooming is then performed by a zoom vector established by that start point and the end point.
The method for processing an image according to the first exemplary embodiment describes a process of displaying a beginning point on a selected point on a screen.
Referring to
The control unit 11 determines whether a point of the image is selected as a beginning point 19a on the screen 18a, in Step S52. If a beginning point 19a is not selected, the control unit 11 continues to display the current image. If a beginning point 19a is selected, the control unit 11 moves the beginning point 19a with the image to a preset point on the screen 18a, and displays the beginning point 19a with the image, in Step S53. The control unit 11 thereby performs a panning process. The preset point may be a center of the screen 18a or a center of an image displayed on the screen 18a.
The panning process moves the image on the screen 18a according to the length and direction of movement of the selected point to the preset point while displaying the image on the screen 18a.
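The panning of Step S53, moving the selected point together with the image onto the preset point, amounts to translating the image by the offset between the two points. A minimal sketch, assuming the preset point is the center of the screen:

```python
def pan_offset(selected, preset):
    """Return the (dx, dy) translation that moves the selected point
    onto the preset point; the image is shifted by the same offset."""
    return (preset[0] - selected[0], preset[1] - selected[1])

def pan_image(image_origin, selected, screen_size):
    """Pan so the selected point lands on the screen center (assumed
    preset point). `image_origin` is the image's top-left on screen."""
    preset = (screen_size[0] / 2, screen_size[1] / 2)
    dx, dy = pan_offset(selected, preset)
    return (image_origin[0] + dx, image_origin[1] + dy)
```

Selecting a point at (10, 20) on a 240x320 screen would thus shift the image by (110, 140) so that the selected point becomes the zoom center.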
Steps S54 through S58 in the method of processing an image according to the second exemplary embodiment are performed in the same manner as steps S34 through S38, respectively, in the method of processing an image according to the first exemplary embodiment, and therefore a detailed description thereof is omitted.
As shown in
The processes of zoom-in, zoom-stop, zoom-out, and selection of an end point in the method for processing an image according to the second exemplary embodiment are performed in the same manner as in the first exemplary embodiment.
According to an exemplary embodiment of the present invention, zooming and panning may be implemented through a Graphical User Interface (GUI) using a zoom vector. Therefore, it is convenient for a user to perform the zooming and the panning by an input unit provided in a mobile communication terminal.
Because the zooming and the panning may be used together through operation of a zoom vector by a navigation key, a touch pad, an optical sensor, or a touch screen provided as an input unit in the mobile communication terminal, the zooming function and the panning function are improved by reducing the number of keys used. That is, the present invention removes the inconvenience, present in a conventional method, of alternately using a plurality of keys to use the zooming function and the panning function together.
Because the end point on the screen is freely moved by using the beginning point as a center for the zoom vector, the zoom-in, the zoom-stop, and the zoom-out may be performed consecutively or alternately.
Although exemplary embodiments of the present invention have been described in detail hereinabove, it should be understood that many variations and modifications of the basic inventive concepts herein described, which may appear to those skilled in the present art, will still fall within the spirit and scope of the present invention as defined in the appended claims.
Number | Date | Country | Kind
---|---|---|---
2006-0121662 | Dec 2006 | KR | national