A certain aspect of the embodiments discussed herein relates to a portable terminal, and more particularly to a portable terminal capable of controlling display or nondisplay of a window displayed on a display unit.
An example of a portable terminal has a pressure-sensitive or electrostatic touch panel combined with a display unit. A user may instruct various processes through a window displayed on the display unit by operating the touch panel installed in the portable terminal with a pointing device such as a stylus pen or the user's finger.
In order to improve operability of windows displayed on the display unit, there is a known technique of setting the size and position of a window in addition to the operating method of the window, as in Japanese Laid-open Patent Publication No. 2006-185025. Japanese Laid-open Patent Publication No. 2006-185025 discloses changing the window size and moving the window at once by determining in advance a “size and coordinate of maximum window”, a “size and coordinate of minimum window”, and an “interrelation between size and coordinate”.
The following technique is also known in a portable terminal having a touch panel. The occupied area of the touch panel, which ordinarily has substantially the same size as the display area of the display unit, may be enlarged so as to provide a dedicated touch panel area over which the display unit does not exist, and sections of this dedicated area may be allocated in advance to respective applications. By operating one of the previously allocated sections with the pointing device, the application allocated to that section is activated.
However, in a portable terminal having a pressure-sensitive or electrostatic touch panel, the display area may be partly or totally hidden by various displayed windows. There is thus a problem that the limited display area may not be effectively used. It is possible to carry out the change of the window size and the movement of the window at once by applying the technique proposed in Japanese Laid-open Patent Publication No. 2006-185025, provided the settings are made in advance. However, this is cumbersome work for the user. Moreover, even if the window size is changed and the window is moved, part of the display area remains hidden under the window at its new position. Therefore, it is still difficult to use the display area effectively.
According to an aspect of the embodiment, a portable terminal includes a display unit that displays an image based on image information; a touch panel provided to overlap a display area on which the image is displayed; a move unit that moves a window in response to movement of an object while the object is in contact with any position within an area of the touch panel corresponding to a predetermined area of the window displayed by the display unit; a determining unit that determines whether a part of the window moved by the move unit goes out of the display area; and a display control unit that, when it is determined that the part of the window moved by the move unit goes out of the display area, controls the display unit to change the display of the window having a predetermined size to nondisplay and to display an image based on indication information representing the window.
The object and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the appended claims.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention, as claimed.
Preferred embodiments of the present invention are explained next with reference to the accompanying drawings.
The portable terminal 1 has an input unit 19 having operation keys 19-1 to 19-4, and various instructions are input through the input unit 19. A display unit 20 is provided on a front face of the portable terminal 1. The display unit 20 may be an organic EL display or a liquid crystal display. A transparent touch panel 18 overlaps with and is bonded to the display unit 20. The touch panel 18 is provided over a portion extending beyond the display area of the display unit 20. A touch on the touch panel 18 with a stylus pen 2 or a finger is detected because the touch panel 18 is of a pressure-sensitive or electrostatic type. Needless to say, the touch panel 18 may instead be provided under the display unit 20.
The CPU 12, the ROM 13, and the RAM 14 are mutually connected via a bus 15. Further, an input output interface 16 is connected to the bus 15. The input unit 19 including the operation keys 19-1 to 19-4, the display unit 20, and a memory unit 21 including a hard disk or a nonvolatile memory are connected to the input output interface 16.
A touch input control unit 17 is connected to the input output interface 16. When a user performs a touch input on the touch panel 18 with the stylus pen 2 or a finger, the touch input control unit 17 detects the coordinates at which the touch input is carried out and outputs a coordinate detection signal to the control unit 11. The coordinate detection signal includes coordinate values represented along two axes, the X-axis and the Y-axis. In this way, input to the portable terminal 1 may be performed through the touch panel 18.
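For illustration, the coordinate detection signal described above may be modeled as follows. This is a minimal sketch with hypothetical names; the actual signal format of the touch input control unit 17 is not specified by the text.

```python
from dataclasses import dataclass

@dataclass
class CoordinateDetectionSignal:
    x: int          # coordinate value along the X-axis of the touch panel
    y: int          # coordinate value along the Y-axis of the touch panel
    touching: bool  # True while the stylus pen or finger keeps contact

# Example: a touch detected at panel coordinates (120, 48)
signal = CoordinateDetectionSignal(x=120, y=48, touching=True)
print(signal)
```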
A portable phone radio communication unit 22 is connected to the input output interface 16. The portable phone radio communication unit 22 is connected to a base station (not illustrated) via an integrated antenna (not illustrated) using a W-CDMA communication method or the like.
However, in the portable terminal 1 having the pressure-sensitive or electrostatic touch panel 18, the display area α may be partly or totally hidden by various displayed windows. There is thus a problem that the limited display area α may not be effectively used.
It is possible to carry out the change of the window size and the movement of the window at once by applying the technique proposed in Japanese Laid-open Patent Publication No. 2006-185025, provided the settings are made in advance. However, this is cumbersome work for the user. Even if the size of the window is changed and the changed window is moved, or the title bar P of a window is dragged to move the window, a portion of the display area α hidden under the window is generated at the location to which the window is moved. Therefore, it is still difficult to use the display area α effectively.
With the embodiment, when the title bar P of the window is moved from an initial setup position to a location at which a part of the window goes out of the display area α of the display unit 20, the window is changed to nondisplay and is iconized at a predetermined area of the display unit 20. With this, nondisplay of the window displayed on the display unit 20 is realized in an appropriate manner during a sequence of a drag operation using the stylus pen 2 or a user's finger. Hereinafter, a window nondisplay control process using this method is described.
The window nondisplay control process is described with reference to a flowchart.
In step S1, when an instruction to display the window is received through an operation of the input unit 19 by the user, the CPU 12 of the control unit 11 controls the display unit 20 and causes the window to be displayed at a predetermined initial position.
In step S2, the CPU 12 of the control unit 11 determines whether there is a touch input into the title bar of the window based on a coordinate detection signal from the touch input control unit 17 obtained when the user brings down the stylus pen 2 onto a predetermined display area α of the display unit 20. For example, when the stylus pen 2 as the pointing device is not brought down on any part of the title bar of the window displayed on the display unit 20, it is determined that there is no touch input into the title bar of the window by the stylus pen 2. On the other hand, when the stylus pen 2 is brought down on a part of the title bar of the window, it is determined that there is a touch input into the title bar of the window.
When the CPU 12 of the control unit 11 determines that there is no touch input with the stylus pen 2 into the title bar of the window in step S2, the process waits at step S2. On the other hand, when it is determined that there is a touch input with the stylus pen 2 into the title bar of the window in step S2, the CPU 12 of the control unit 11 determines whether the title bar of the window is dragged after the touch input. Here, the terminology “drag” means an action of moving the stylus pen 2, a user's finger, or the like from a first position on the touch panel, at which the stylus pen 2, the user's finger, or the like first makes contact, to a second position different from the first position while the stylus pen 2, the user's finger, or the like maintains contact with the touch panel.
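The “drag” definition above may be sketched as follows. The (x, y, touching) event tuples are a hypothetical representation, not the terminal's actual interface.

```python
# A drag: movement from a first contact position to a different second
# position while contact with the touch panel is maintained throughout.
def is_drag(events, threshold=1):
    """Return True if the touch moved from its first position to a different
    position while every reported event kept contact with the touch panel."""
    if len(events) < 2 or not all(touching for (_, _, touching) in events):
        return False
    (x0, y0, _), (x1, y1, _) = events[0], events[-1]
    return abs(x1 - x0) >= threshold or abs(y1 - y0) >= threshold

# A touch that moves from (10, 10) to (40, 12) while keeping contact:
print(is_drag([(10, 10, True), (25, 11, True), (40, 12, True)]))  # True
```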
When it is determined in step S3 that the title bar of the window is not dragged after the touch input into the title bar of the window, the process returns to step S2.
When it is determined in step S3 that the title bar of the window is dragged after the touch input into the title bar of the window, the CPU 12 of the control unit 11 controls the display unit 20 to move the window in response to the dragged amount and the dragged direction.
In step S5, after dragging of the title bar of the window starts, the CPU 12 of the control unit 11 determines, based on the coordinate detection signal, whether the user has operated the stylus pen 2 so as to cease the touch input into the title bar of the window. When it is determined in step S5 that the touch input into the title bar of the window has ceased, the display unit 20 is controlled to stop the movement of the window in response to the dragged amount and the dragged direction, and the movement of the window is finished. Thereafter, the process goes back to step S2, and the processes on and after step S2 are repeated.
When the CPU 12 determines that the touch input into the title bar of the window still exists, the CPU 12 of the control unit 11 determines in step S7 whether a part of the window displayed on the display unit 20 goes out of the display area α of the display unit 20. Said differently, the CPU 12 of the control unit 11 determines whether at least one of the coordinates in the X and Y axes indicating any of the four corners S, T, U, and V is not included in the display area α of the display unit 20. For example, when at least one of the coordinates in the X and Y axes of the corners S, T, U, and V exceeds the minimum value or the maximum value of the coordinates in the X and Y axes within the display area α of the display unit 20, it is determined that that coordinate is not included in the display area.
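The step S7 determination may be sketched as below: a window is judged partly out when any of its four corners S, T, U, and V has an X or Y coordinate outside the display area α. The display size, window representation, and coordinate convention (origin at the upper left) are assumptions.

```python
from dataclasses import dataclass

DISPLAY_W, DISPLAY_H = 240, 320  # assumed size of the display area α (pixels)

@dataclass
class Window:
    x: int  # left edge
    y: int  # top edge
    w: int  # width
    h: int  # height

    def corners(self):
        """Corners S (top-left), T (top-right), U (bottom-left), V (bottom-right)."""
        return [(self.x, self.y), (self.x + self.w, self.y),
                (self.x, self.y + self.h), (self.x + self.w, self.y + self.h)]

def part_goes_out(win: Window) -> bool:
    """True if at least one corner coordinate lies outside the display area α."""
    return any(not (0 <= cx <= DISPLAY_W and 0 <= cy <= DISPLAY_H)
               for cx, cy in win.corners())

print(part_goes_out(Window(x=200, y=50, w=100, h=80)))  # True: right edge at 300 > 240
```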
In the embodiment, whether a part of the window goes out of the display area α of the display unit 20 is determined for any one of the corners S, T, U, and V. However, the determination may alternatively be made when any point or part of the window goes out of the display area α.
When the CPU 12 of the control unit 11 determines that no part of the window displayed on the display unit 20 goes out of the display area α of the display unit 20, the CPU 12 of the control unit 11 continues, in step S8, to move the window in response to the dragged amount and the dragged direction, and the process returns to step S5. With this, the window is moved in response to the dragged amount and the dragged direction until any one of the coordinates in the X and Y axes indicative of the corners S, T, U, and V is no longer contained in the display area α of the display unit 20.
In step S7, when the CPU 12 of the control unit 11 determines that a part of the window goes out of the display area of the display unit 20, the CPU 12 of the control unit 11 recognizes that the window displayed on the display unit 20 is instructed to be put in a nondisplay state. Then, the CPU 12 controls the display unit 20, ceases the display of the window having the predetermined size, and iconizes the window so that it is displayed on a predetermined display area (a display area which hardly obstructs watching an image displayed on the display unit 20).
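The flow of steps S2 through S8 can be pictured with the following compact sketch, which simulates a drag that pushes the window's right edge past the display area α. The sizes, names, and the simple edge test stand in for details the text leaves open.

```python
DISPLAY_W, DISPLAY_H = 240, 320  # assumed size of the display area α
win = {"x": 20, "y": 40, "w": 180, "h": 120, "displayed": True}

def out_of_area(w):
    """True once any edge of the window leaves the display area α."""
    return (w["x"] < 0 or w["y"] < 0 or
            w["x"] + w["w"] > DISPLAY_W or w["y"] + w["h"] > DISPLAY_H)

drag_path = [(30, 45), (80, 45), (150, 45)]   # successive title-bar positions
prev = drag_path[0]
for x, y in drag_path[1:]:
    win["x"] += x - prev[0]                   # move by the dragged amount
    win["y"] += y - prev[1]                   # and in the dragged direction
    prev = (x, y)
    if out_of_area(win):
        win["displayed"] = False              # change display to nondisplay
        icon_at = (DISPLAY_W - 40, DISPLAY_H - 30)  # assumed predetermined area
        print("iconized at", icon_at)
        break
print(win)
```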
Types of windows displayed on the display unit 20 are described.
In the window nondisplay control process illustrated with reference to the above flowchart, the display of the window is changed to nondisplay as soon as a part of the window goes out of the display area of the display unit 20. However, the change to nondisplay may instead be carried out only when the touch input into the title bar ceases after the part of the window goes out of the display area. A window nondisplay control process using this method is described next with reference to a flowchart.
When the CPU 12 of the control unit 11 determines in step S27 that a part of the window displayed on the display unit 20 goes out of the display area of the display unit 20, the CPU 12 of the control unit 11 determines in step S29 whether the touch input into the title bar of the window has ceased. Based on the coordinate detection signal from the touch input control unit 17, it is determined that the touch input has ceased when the stylus pen 2, moved by the user, is no longer in contact with the title bar of the window displayed on the predetermined display area of the display unit 20. When the CPU 12 of the control unit 11 determines in step S29 that the touch input into the title bar of the window has ceased, the CPU 12 of the control unit 11 recognizes that the nondisplay of the window is finally instructed by the user. The CPU 12 then controls the display unit 20 in step S30 to change the display of the window having the predetermined size to nondisplay, iconizes the window, and displays the iconized window on the predetermined area. On the other hand, when the CPU 12 of the control unit 11 determines that the touch input into the title bar still exists, the CPU 12 of the control unit 11 recognizes that the nondisplay of the window is not yet finally instructed by the user. Then, the process returns to step S27, and the processes on and after step S27 are repeatedly carried out.
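The difference from the earlier process can be summarized in a small sketch: here, nondisplay is finally instructed only when the touch input ceases while a part of the window is out of the display area. The function and argument names are assumptions.

```python
def should_iconize(part_is_out: bool, touch_released: bool) -> bool:
    """Nondisplay is finally instructed only when the touch input ceases
    while a part of the window is out of the display area (step S29)."""
    return part_is_out and touch_released

print(should_iconize(part_is_out=True, touch_released=False))  # False: keep dragging
print(should_iconize(part_is_out=True, touch_released=True))   # True: iconize now
```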
In the window nondisplay control processes illustrated in the flowcharts described above, the iconized window is displayed at a fixed predetermined area. However, the position at which the iconized window is displayed may instead be changed depending on the end at which the window goes out of the display area α of the display unit 20. A window nondisplay control process using this method is described next with reference to a flowchart.
In step S129, the CPU 12 of the control unit 11 determines whether the part of the window that has gone out of the display area α is at the right end.
When the CPU 12 of the control unit 11 determines that the part of the window that has gone out of the display area α of the display unit 20 is at the right end, the CPU 12 of the control unit 11 recognizes that an instruction has been given to change the display of the window displayed on the display unit 20 to nondisplay. Then, the CPU 12 controls the display unit 20 to change the display of the window to nondisplay, iconizes the window, and displays the iconized window in the lower right corner of the display area α.
On the other hand, when the CPU 12 of the control unit 11 determines that the part of the window that has gone out of the display area α of the display unit 20 is not at the right end, the CPU 12 of the control unit 11 further determines in step S131 whether the part of the window that has gone out of the display area α of the display unit 20 is at the left end.
When the CPU 12 of the control unit 11 determines in step S131 that the part of the window that has gone out of the display area α of the display unit 20 is at the left end, the CPU 12 of the control unit 11 recognizes that an instruction has been given to change the display of the window displayed on the display unit 20 to nondisplay. Then, the CPU 12 controls the display unit 20 to change the display of the window to nondisplay, iconizes the window, and displays the iconized window in the lower left corner of the display area.
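Steps S129 and S131 may be sketched as follows, assuming a coordinate system with the origin at the upper left and hypothetical display and icon sizes: the window is judged to have left the display area α at its right or left end, and the icon corner is chosen accordingly.

```python
DISPLAY_W, DISPLAY_H = 240, 320   # assumed size of the display area α
ICON_W, ICON_H = 40, 30           # assumed size of the iconized window

def icon_position(win_x, win_w):
    """Lower right corner if the window went out at its right end (S129),
    lower left corner if at its left end (S131), None if still inside."""
    if win_x + win_w > DISPLAY_W:                       # right end went out
        return (DISPLAY_W - ICON_W, DISPLAY_H - ICON_H)
    if win_x < 0:                                       # left end went out
        return (0, DISPLAY_H - ICON_H)
    return None

print(icon_position(win_x=220, win_w=100))  # (200, 290): lower right corner
print(icon_position(win_x=-30, win_w=100))  # (0, 290): lower left corner
```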
With this, even when the user operates the portable terminal 1 with only the right hand, the iconized window is displayed in the lower right corner, so that operability during right-hand operation is improved. Likewise, even when the user operates the portable terminal 1 with only the left hand, the iconized window is displayed in the lower left corner, so that operability during left-hand operation is improved in a similar manner.
A window redisplay control process for redisplaying the iconized window is described next with reference to a flowchart.
In step S241, the CPU 12 of the control unit 11 determines whether a touch input is given to the iconized window and waits until it is determined that the touch input is given to the iconized window. The determination is made based on the coordinate detection signal from the touch input control unit 17 generated when the iconized window is tapped on the touch panel 18.
When the CPU 12 of the control unit 11 determines that the touch input is given to the iconized window in step S241, the CPU 12 of the control unit 11 controls the display unit 20 in step S242 to redisplay the window in the original state it had immediately before the nondisplay of the window.
When the iconized window is redisplayed, the window may be redisplayed at any position. For example, the window may be redisplayed at a position within the display unit 20 in the vicinity of the position at which it went out of the display area α.
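Steps S241 and S242 can be illustrated by restoring geometry saved at iconization time. The store mapping window identifiers to saved geometry is a hypothetical construct, not an interface the text specifies.

```python
def redisplay(win_id, windows, saved_geometry):
    """Restore the window to the state it had immediately before nondisplay."""
    windows[win_id] = saved_geometry.pop(win_id)   # original (x, y, w, h)
    return windows[win_id]

windows = {"input_window": None}                   # currently in nondisplay
saved_geometry = {"input_window": (20, 40, 180, 120)}
print(redisplay("input_window", windows, saved_geometry))  # (20, 40, 180, 120)
```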
With the embodiment, the display unit 20 displays an image based on image information; a window is moved in response to movement of an object, such as the stylus pen 2 or a user's finger, while the object keeps contact with any position inside an area of the touch panel 18, provided to overlap the display area, corresponding to a predetermined area of the window displayed on the display unit 20; it is determined whether a part of the moved window goes out of the display area; and when it does, the display of the window having the predetermined size is ceased and an image based on indication information representing the window is displayed. In this manner, the display unit 20 is controlled.
With this, it is possible to realize nondisplay or redisplay of a window displayed on the display unit 20 during a dragging operation using the stylus pen 2 or a user's finger. Thus, display or nondisplay of the window displayed on the display unit 20 can be suitably controlled by a touch input. Therefore, the display area hidden by the window can be made as small as possible, which enables effective use of the limited display area. Further, a dedicated operation key for the nondisplay or the redisplay can be omitted, so that the additional work of attaching such a dedicated operation key as hardware is unnecessary, which reduces the manufacturing cost. Further, the user can easily realize the nondisplay or the redisplay of the window through a touch input into a window, such as a window for inputting characters. Therefore, operability of the portable terminal 1 can be improved.
When the window is iconized and displayed, the transparency of the iconized window may be increased in order to make an image displayed on the display unit 20 easier to watch.
In the embodiment, when a part of the window displayed on the display unit 20 goes out of the display area of the display unit 20, the display of the window having the predetermined size is ceased and the iconized window is displayed in the predetermined display area. However, the embodiment is not limited to this case. When the stylus pen 2 moves along a locus of a predetermined shape such as a circle or a rectangle, the display of the window may be ceased and the iconized window may be displayed in the predetermined display area. Hereinafter, a window nondisplay control process using this method is described.
This window nondisplay control process is described with reference to a flowchart.
In step S251, when an instruction to display a window is received through an operation of the input unit 19 by the user, the CPU 12 of the control unit 11 controls the display unit 20 and causes the window to be displayed at a predetermined initial position.
In step S252, when the user operates the stylus pen 2, the CPU 12 of the control unit 11 determines, based on a coordinate detection signal from the touch input control unit 17, the position at which the stylus pen 2 is brought down, determines whether there is a touch input, and waits until a touch input is determined. When the CPU 12 determines that there is a touch input in step S252, the CPU 12 determines in step S253 whether the locus of the stylus pen 2 moving on the touch panel 18 has a predetermined shape such as a circle or a rectangle.
When the CPU 12 determines in step S253 that the locus of the stylus pen 2 moving on the touch panel 18 has the predetermined shape, the CPU 12 recognizes that an instruction to cease the display of the window is given, controls the display unit 20 to change the display of the window of the predetermined size to nondisplay, and iconizes the window, displaying the iconized window in the predetermined display area.
When the CPU 12 of the control unit 11 determines that the locus of the stylus pen 2 moving on the touch panel 18 is not the predetermined shape, the process returns to step S252.
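A crude heuristic for the locus check in step S253 is sketched below: the traced path is treated as a closed shape when it returns near its starting point after covering enough distance. The text does not specify how the shape is recognized, so the function, thresholds, and criterion are illustrative assumptions only.

```python
import math

def is_closed_locus(points, close_tol=15, min_length=100):
    """True if the traced path ends near where it started (within close_tol)
    and its total length is at least min_length, i.e. a closed-shape locus."""
    if len(points) < 3:
        return False
    length = sum(math.dist(points[i], points[i + 1])
                 for i in range(len(points) - 1))
    return math.dist(points[0], points[-1]) <= close_tol and length >= min_length

# A rough square traced back to near its starting point:
square = [(0, 0), (60, 0), (60, 60), (0, 60), (2, 3)]
print(is_closed_locus(square))  # True
```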
With this, it is possible to change the display of the window on the display unit 20 to nondisplay or redisplay. Thus, the display or the nondisplay of the window displayed on the display unit 20 can be suitably controlled by the touch input.
The embodiment is applicable to a Personal Digital Assistant (PDA), a personal computer, a portable music reproducer, a portable movie reproducer, or other portable terminals.
Further, the sequence of the processes described in the embodiment may be carried out by software or hardware.
Furthermore, the steps of the flowcharts are examples of processes carried out in a temporal sequence along the described order. However, the processes are not necessarily carried out in this temporal sequence and may be carried out in parallel or in an independent manner.
All examples and conditional language recited herein are intended for pedagogical purposes to aid the reader in understanding the invention and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority or inferiority of the invention. Although the embodiments of the present invention have been described in detail, it should be understood that the various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.
This application is a U.S. continuation application filed under 35 USC 111(a) and 365(c) of PCT application JP2008/066357, filed Sep. 10, 2008. The foregoing application is hereby incorporated herein by reference.