To fit today's busy lifestyle, various space-efficient and highly portable mobile apparatuses have been developed. Taking personal digital assistants (PDAs), PDA phones, and smartphones as examples, such apparatuses not only offer the functions of conventional communication apparatuses but also allow users to edit files, send and receive e-mails, browse web pages, and perform instant messaging through a built-in operating system (OS).
A light, slim, and small portable electronic apparatus must have a very small volume. Thus, if both a screen and a keyboard are disposed on the electronic apparatus, the size of the screen has to be reduced. To dispose a larger screen within the limited space, touch screens have been developed in recent years, in which the keyboard and the screen are integrated to serve as the input interface of the portable electronic apparatus, so that both the cost and the surface area required for a conventional keyboard are saved.
The operation of a touch screen is simple and straightforward: a user can perform various operations by simply touching the screen with a stylus or a finger. Thus, the touch screen is a very convenient input interface. However, how to simplify touch gestures so that a user can operate screen objects or even edit an entire screen frame remains an objective to be accomplished in the industry.
Accordingly, the present invention is directed to a screen frame cropping method and a screen frame cropping apparatus, in which a user is allowed to crop a screen frame through a simple touch operation.
The present invention provides a screen frame cropping method adapted to an electronic apparatus having a touch screen. In the screen frame cropping method, a frame is displayed on the touch screen, and a first touch and a second touch performed by a user are detected on the touch screen. When the first touch and the second touch satisfy a predetermined condition, a cropped frame of the frame is stored as an image file. The cropped frame is determined according to at least one first touch position of the first touch and at least one second touch position of the second touch.
According to an embodiment of the present invention, the screen frame cropping method further includes recognizing the cropped frame by using a character recognition technique to generate at least one character and storing the at least one character as a text file.
The present invention provides a screen frame cropping apparatus including a touch screen, a storage unit, and one or more processing units. The storage unit records a plurality of modules. The processing units are coupled to the touch screen and the storage unit. The processing units access and execute the modules recorded in the storage unit. The modules include a display module, a detection module, and a cropping module. The display module displays a frame on the touch screen. The detection module detects a first touch and a second touch performed by a user on the touch screen. The cropping module stores a cropped frame of the frame as an image file when the first touch and the second touch satisfy a predetermined condition, where the cropped frame is determined according to at least one first touch position of the first touch and at least one second touch position of the second touch.
According to an embodiment of the present invention, the predetermined condition includes one of a first displacement of the first touch and a second displacement of the second touch being greater than a predetermined value.
According to an embodiment of the present invention, the predetermined condition includes one of the first touch and the second touch starting from a predetermined starting border area of the touch screen.
According to an embodiment of the present invention, the cropped frame is determined according to a first starting touch position of the first touch, a second starting touch position of the second touch, and a first current touch position of the first touch and a second current touch position of the second touch when one of a first displacement of the first touch and a second displacement of the second touch is greater than a predetermined value.
According to an embodiment of the present invention, the cropped frame is a rectangle including a cropping area and a transparent area outside the cropping area, where the cropping area and the transparent area are determined according to a first moving path of the first touch and a second moving path of the second touch.
According to an embodiment of the present invention, the modules further include a character recognition module. The character recognition module recognizes the cropped frame by using a character recognition technique to generate at least one character and stores the at least one character as a text file.
According to an embodiment of the present invention, the predetermined condition includes one of the first touch and the second touch ending at a predetermined ending border area of the touch screen.
According to an embodiment of the present invention, the predetermined condition includes one of a first touch time of the first touch and a second touch time of the second touch being greater than a predetermined value.
According to an embodiment of the present invention, the cropped frame is determined according to a first starting touch position of the first touch, a second starting touch position of the second touch, a first ending touch position of the first touch, and a second ending touch position of the second touch.
The present invention provides a computer program product. The computer program product is loaded into an electronic apparatus to execute the following steps. A frame is displayed on a touch screen of the electronic apparatus. A first touch and a second touch performed by a user are detected on the touch screen. When the first touch and the second touch satisfy a predetermined condition, a cropped frame of the frame is stored as an image file. The cropped frame is determined according to at least one first touch position of the first touch and at least one second touch position of the second touch.
As described above, the present invention provides a screen frame cropping method, a screen frame cropping apparatus, and a computer program product. In the present invention, two touches performed by a user on a touch screen are detected, and when these two touches satisfy a predetermined condition, a cropping area is defined according to the touch positions of the two touches to crop the screen frame, so that the cropping operation is simplified. Besides being stored as an image file, the cropped frame can also be recognized through a character recognition technique and stored as a text file. Thereby, a user is allowed to crop screen frames through simple touch operations.
These and other exemplary embodiments, features, aspects, and advantages of the invention will be described and become more apparent from the detailed description of exemplary embodiments when read in conjunction with accompanying drawings.
The accompanying drawings are included to provide a further understanding of the invention, and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
Reference will now be made in detail to the present preferred embodiments of the invention, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the description to refer to the same or like parts.
It can be observed in an image cropping operation that the major action is to define a cropping area. If the object to be cropped is the screen frame displayed by an electronic apparatus, the cropping area can be defined by using the borders of the screen frame as parts of the cropping area. Thereby, in the present invention, a cropping area is defined by taking both the touch operations of a user and the corresponding frame borders into consideration, and whether a cropping operation is executed is determined according to the displacements and durations of the touch operations, so that the user can quickly and conveniently crop screen frames.
The touch screen 12 is fabricated by integrating resistive, capacitive, or any other type of touch sensing device with a liquid crystal display (LCD), and it can detect the touch operations performed by a user while displaying the frames of the electronic apparatus 10.
The storage unit 14 is one or a combination of a stationary or mobile random access memory (RAM), a read-only memory (ROM), a flash memory, a hard disk, or any other similar device, and it records a plurality of modules that can be executed by the processing units 16. These modules can be loaded into the processing units 16 to execute a screen frame cropping function.
Each of the processing units 16 is one or a combination of a central processing unit (CPU), a programmable general-purpose or special-purpose microprocessor, a digital signal processor (DSP), a programmable controller, an application-specific integrated circuit (ASIC), a programmable logic device (PLD), or any other similar device. The processing units 16 are coupled to the touch screen 12 and the storage unit 14, and they access and execute the modules recorded in the storage unit 14 to carry out a screen frame cropping function.
Aforementioned modules include a display module 142, a detection module 144, and a cropping module 146. These modules may be computer programs which can be loaded into the processing units 16 to execute the screen frame cropping function. Below, how the electronic apparatus 10 crops a screen frame will be described in detail with reference to embodiments of the present invention.
First, the display module 142 displays a frame on the touch screen 12 (step S202). Then, the detection module 144 detects a first touch and a second touch performed by a user on the touch screen 12 (step S204) and determines whether the first touch and the second touch satisfy a predetermined condition (step S206). The predetermined condition may be determined by one or a combination of the displacements, starting positions, ending positions, and durations of the first touch and the second touch.
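As a rough, non-authoritative sketch of this flow, the following Python snippet strings steps S204 and S206 together. The helper names (condition_met, handle_touches) and the representation of a touch as a list of sampled (x, y) positions are assumptions for illustration only and are not tied to any particular touch-screen API.

```python
# Minimal sketch of steps S204-S206: detect two touches, test the predetermined
# condition, and only then pick the touch positions used to determine the cropped frame.
from typing import List, Optional, Tuple

Point = Tuple[int, int]

def condition_met(first_path: List[Point], second_path: List[Point]) -> bool:
    # Step S206: a concrete combination of displacement, border, and duration
    # checks would go here (see the sketches that follow).
    return len(first_path) > 1 and len(second_path) > 1

def handle_touches(first_path: List[Point],
                   second_path: List[Point]) -> Optional[Tuple[Point, Point]]:
    """Return the two touch positions used to determine the cropped frame,
    or None when the predetermined condition is not satisfied."""
    if not condition_met(first_path, second_path):
        return None  # keep detecting further touches (back to step S204)
    return first_path[-1], second_path[-1]  # e.g. the current touch positions
```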
To be specific, in an embodiment, the detection module 144 detects a first displacement of the first touch on the touch screen 12 and a second displacement of the second touch on the touch screen 12 and determines whether one or both of the first displacement and the second displacement are greater than a predetermined value, so as to determine whether the first touch and the second touch satisfy a predetermined condition. The first displacement and the second displacement may be the displacements of the first touch and the second touch in the direction of the axis X or the axis Y on the touch screen 12 or 2-dimensional displacements on the touch screen 12, which is not limited in the present invention. In addition, aforementioned predetermined value may be the product of the screen length (or width) and a predetermined ratio. However, the predetermined value is not limited in the present invention.
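A minimal sketch of this displacement test follows, assuming the displacement is measured along the X axis and the predetermined value is the screen width multiplied by a ratio of 0.3; both choices are illustrative, since the embodiment leaves them open.

```python
# Displacement check: the condition is met when either touch has moved farther
# than screen_width * ratio along the X axis.
from typing import List, Tuple

Point = Tuple[int, int]

def displacement_exceeds(path: List[Point], screen_width: int, ratio: float = 0.3) -> bool:
    if len(path) < 2:
        return False
    dx = abs(path[-1][0] - path[0][0])   # displacement along the X axis
    return dx > screen_width * ratio     # predetermined value = screen width * ratio

def displacement_condition(first_path: List[Point], second_path: List[Point],
                           screen_width: int) -> bool:
    return (displacement_exceeds(first_path, screen_width)
            or displacement_exceeds(second_path, screen_width))
```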
In an embodiment, besides determining whether the displacements of the first touch and the second touch are greater than a predetermined value, the detection module 144 further determines whether one or both of the first touch and the second touch start from a predetermined starting border area of the touch screen 12, so as to determine whether the first touch and the second touch satisfy the predetermined condition. Aforementioned predetermined starting border area may be the left/right side area, the top/bottom side area, or both the left/right and top/bottom side areas of the touch screen 12 and extends a specific width (for example, 1 to 15 pixels) from the borders of the touch screen 12. However, the predetermined starting border area is not limited in the present invention.
In an embodiment, the detection module 144 determines whether the first touch and the second touch satisfy the predetermined condition according to whether one or both of the first touch and the second touch end at a predetermined ending border area of the touch screen 12. Similarly, the predetermined ending border area may be the left/right side area, the top/bottom side area, or both the left/right and top/bottom side areas of the touch screen 12 and extends a specific width (for example, 1 to 15 pixels) from the borders of the touch screen 12. However, the predetermined ending border area is not limited in the present invention.
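The border-area tests might be sketched as follows, assuming the left/right side areas are used and the border strip is 10 pixels wide, an illustrative value within the 1 to 15 pixel range mentioned above.

```python
# Border-area check: a touch satisfies the condition when its starting position
# lies in the predetermined starting border area and its ending position lies in
# the predetermined ending border area (here, strips along the left/right edges).
from typing import List, Tuple

Point = Tuple[int, int]

def in_side_border(point: Point, screen_width: int, border: int = 10) -> bool:
    x, _ = point
    return x <= border or x >= screen_width - border

def border_condition(path: List[Point], screen_width: int) -> bool:
    return in_side_border(path[0], screen_width) and in_side_border(path[-1], screen_width)
```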
In an embodiment, the detection module 144 detects a first touch time of the first touch on the touch screen 12 and a second touch time of the second touch on the touch screen 12 and determines whether one or both of the first touch time and the second touch time are greater than a predetermined value, so as to determine whether the first touch and the second touch satisfy the predetermined condition. Aforementioned predetermined value may be 1 to 3 seconds. However, the predetermined value is not limited in the present invention.
It should be noted that in an embodiment, the predetermined condition can be any combination of the displacements, starting positions, ending positions, and touch durations mentioned in foregoing embodiments, and these factors can be set to be applicable to one or both of the first touch and the second touch. For example, when both the first touch and the second touch start from the predetermined starting border area of the touch screen and end at the predetermined ending border area of the touch screen, it can be determined that the first touch and the second touch satisfy the predetermined condition and a screen frame cropping operation can be executed accordingly.
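For instance, the combination given above (both touches starting from the starting border area and ending at the ending border area), together with a touch-duration requirement, might be expressed as the following sketch; the particular combination of factors and the 1-second threshold are illustrative assumptions.

```python
# Combined predetermined condition: both touches start in the starting border
# area, end in the ending border area, and stay on the screen longer than
# min_time seconds. The individual border tests are assumed to be computed
# beforehand (e.g. with the border check sketched above).
def predetermined_condition(first_starts_in_border: bool, first_ends_in_border: bool,
                            second_starts_in_border: bool, second_ends_in_border: bool,
                            first_time: float, second_time: float,
                            min_time: float = 1.0) -> bool:
    border_ok = (first_starts_in_border and first_ends_in_border and
                 second_starts_in_border and second_ends_in_border)
    time_ok = first_time > min_time and second_time > min_time
    return border_ok and time_ok
```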
When the first touch and the second touch satisfy the predetermined condition, the cropping module 146 determines a cropping area according to at least one first touch position of the first touch and at least one second touch position of the second touch, crops the frame within the cropping area into a cropped frame, and stores the cropped frame as an image file.
To be specific, in an embodiment, the cropping module 146 defines a rectangular cropping area by taking both the touch positions of the first touch and the second touch and the borders of the touch screen 12 into consideration. For example, the cropping module 146 defines the upper and lower borders of the cropping area by using the touch positions of the first touch and the second touch and defines the left and right borders of the cropping area by using the left and right borders of the touch screen 12. Similarly, the cropping module 146 may also define the left and right borders of the cropping area by using the touch positions of the first touch and the second touch and define the upper and lower borders of the cropping area by using the upper and lower borders of the touch screen 12. Aforementioned touch positions of the first touch and the second touch may be the starting touch positions or the ending touch positions of the first touch and the second touch, the positions of the first touch and the second touch at the moment the predetermined condition is satisfied, or the highest/lowest or leftmost/rightmost positions on the moving paths of the first touch and the second touch. However, the definition of the touch positions of the first touch and the second touch is not limited in the present invention. Additionally, in another embodiment, the cropping module 146 may define the left, right, upper, and lower borders of the cropping area by using only the highest, lowest, leftmost, and rightmost positions on the moving paths of the first touch and the second touch, without using the borders of the touch screen 12. However, the present invention is not limited thereto.
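As a concrete sketch of the rectangular case, the following assumes the current positions of the two touches supply the upper and lower borders of the cropping area while the left and right borders of the screen supply the remaining sides; which touch positions are used is only one of the options listed above.

```python
# Rectangular cropping area: upper/lower borders from the two touch positions,
# left/right borders from the edges of the touch screen.
from typing import Tuple

Point = Tuple[int, int]

def rectangular_cropping_area(first_pos: Point, second_pos: Point,
                              screen_width: int) -> Tuple[int, int, int, int]:
    """Return (left, top, right, bottom) of the cropping area in screen pixels."""
    top = min(first_pos[1], second_pos[1])     # upper border from the higher touch
    bottom = max(first_pos[1], second_pos[1])  # lower border from the lower touch
    return (0, top, screen_width, bottom)      # left/right borders from the screen edges

# Example: touches at y = 120 and y = 480 on a 720-pixel-wide screen
# give the cropping area (0, 120, 720, 480).
```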
Embodiments in which the upper and lower borders of the cropping area are defined by using the touch positions of a first touch and a second touch and the left and right borders of the cropping area are defined by using the left and right borders of the touch screen 12 are illustrated in the accompanying drawings. These embodiments are equally applicable to the definition of the left and right borders of the cropping area by using the touch positions of the first touch and the second touch and the definition of the upper and lower borders of the cropping area by using the upper and lower borders of the touch screen 12.
After the electronic apparatus defines the cropping area through one of the techniques described above, the electronic apparatus crops the frame within the cropping area into a cropped frame and stores the cropped frame as an image file. Thereby, the user can select and crop a part of interest in a frame displayed on the touch screen and obtain a file of the cropped frame through simple touch operations.
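If the displayed frame is available as an ordinary bitmap, the cropping and storing step might look like the following sketch; Pillow is used here purely as an illustrative image library, and the output file name is arbitrary.

```python
# Crop the displayed frame to the cropping area and store it as an image file.
from PIL import Image

def save_cropped_frame(frame: Image.Image, area: tuple, path: str = "cropped.png") -> None:
    """Crop the frame to area = (left, top, right, bottom) and store it as an image file."""
    cropped = frame.crop(area)
    cropped.save(path)

# Usage: save_cropped_frame(Image.open("screen.png"), (0, 120, 720, 480))
```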
It should be noted that in another embodiment, after the cropping area is defined, the electronic apparatus further recognizes the cropped frame by using a character recognition technique through a character recognition module (not shown) to generate at least one character and stores the recognized character as a text file.
First, the display module 142 displays a frame on the touch screen 12 (step S802). Then, the detection module 144 detects a first touch and a second touch performed by a user on the touch screen 12 (step S804) and determines whether the first touch and the second touch satisfy a predetermined condition (step S806). Steps S802-S806 are the same as or similar to steps S202-S206 in the embodiment described above and therefore will not be described again herein.
The difference from the embodiment described above lies in that, in the present embodiment, when the detection module 144 determines that the first touch and the second touch satisfy the predetermined condition, the electronic apparatus 10 further recognizes the cropped frame through a character recognition module (not shown) by using a character recognition technique to generate at least one character (step S806). After that, the character recognition module stores the recognized characters as a text file, and the cropping module 146 stores the image in the cropped frame as an image file (step S808). Conversely, when the detection module 144 determines that the first touch and the second touch do not satisfy the predetermined condition, step S804 is executed again so that the detection module 144 continues to detect other touches performed by the user on the touch screen 12.
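A possible sketch of the character recognition and text storing steps follows, using pytesseract as a stand-in for the character recognition module; the library choice and the file name are assumptions, not part of the embodiment.

```python
# Recognize the characters in the cropped frame and store them as a text file.
from PIL import Image
import pytesseract

def store_cropped_frame_as_text(cropped: Image.Image, path: str = "cropped.txt") -> None:
    text = pytesseract.image_to_string(cropped)   # OCR on the cropped frame
    with open(path, "w", encoding="utf-8") as f:
        f.write(text)
```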
Besides the definitions of the rectangular cropping area and rectangular cropped frame described above, in another embodiment, the electronic apparatus may also determine the cropping area according to a first moving path of the first touch and a second moving path of the second touch and keep only the frame within the cropping area visible, while the transparent area outside the cropping area is rendered transparent.
In the embodiments described above, except for the definition of the cropping area by using the highest, lowest, leftmost, and rightmost positions of the first touch and the second touch, the borders of the touch screen are used in every definition of the cropping area, and the borders involved are opposite borders (i.e., the left and right borders or the upper and lower borders) of the touch screen. However, in another embodiment, in case the first touch and the second touch start from or end at different borders, the present invention provides a technique of defining a cropping area by using adjacent borders (for example, the left border and the lower border, or the right border and the upper border). To be specific, when the first touch and the second touch start from or end at different borders, the electronic apparatus defines the cropping area according to a first moving path of the first touch, a second moving path of the second touch, and the cross points between these two moving paths and the borders of the touch screen. While cropping the frame, the electronic apparatus keeps the frame within the cropping area visible, while the transparent areas outside the cropping area are rendered transparent.
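A simplified sketch of this path-based cropping follows; the polygon construction (walking along the first moving path and back along the reversed second moving path) and the use of Pillow's alpha mask to render the outside area transparent are assumptions about one possible realization, not the apparatus's actual method.

```python
# Keep only the region enclosed by the two touch paths visible; everything
# outside the cropping area becomes transparent in the stored image.
from PIL import Image, ImageDraw

def crop_with_paths(frame: Image.Image, first_path, second_path,
                    out_path: str = "cropped.png") -> None:
    # Walk along the first path and back along the reversed second path to
    # close the polygon that bounds the cropping area.
    polygon = list(first_path) + list(reversed(second_path))
    mask = Image.new("L", frame.size, 0)              # 0 = fully transparent
    ImageDraw.Draw(mask).polygon(polygon, fill=255)   # 255 = keep the cropping area visible
    result = frame.convert("RGBA")
    result.putalpha(mask)
    result.save(out_path)
```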
Through the screen frame cropping method described above, even if the first touch and the second touch performed by a user start from or end at different borders, the electronic apparatus can still crop the frame selected by the user and store the cropped frame as an image file. It should be mentioned that the screen frame cropping method described above is also applicable when the first touch and the second touch start from different borders but end at the same border, or when the first touch and the second touch start from the same border but end at different borders. The present invention is not limited thereto.
The present invention further provides a computer program product for executing various steps of the screen frame cropping method described above. The computer program product is composed of a plurality of code snippets (for example, a frame display code snippet, a touch detection code snippet, a cropping code snippet, and a character recognition code snippet). After these code snippets are loaded into an electronic apparatus and executed by the same, the steps of the screen frame cropping method described above can be accomplished.
As described above, the present invention provides a screen frame cropping method, a screen frame cropping apparatus, and a computer program product. In the present invention, two touches performed by a user on a touch screen are detected, and a cropping operation and a cropping area are defined according to the starting positions, ending positions, moving paths, displacements, and touch durations of these two touches. Thereby, a user can crop screen frames through simple touch operations.
It will be apparent to those skilled in the art that various modifications and variations can be made to the structure of the present invention without departing from the scope or spirit of the invention. In view of the foregoing, it is intended that the present invention cover modifications and variations of this invention provided they fall within the scope of the following claims and their equivalents.
This application claims the priority benefit of U.S. provisional application Ser. No. 61/654,072, filed on May 31, 2012. The entirety of the above-mentioned patent application is hereby incorporated by reference herein and made a part of this specification.