The present invention relates to a mobile information terminal, and more particularly to a mobile information terminal allowing for operation of a touch panel displayed on a display unit, a computer-readable program, and a recording medium.
In various types of techniques conventionally utilized for information terminals having a display unit provided with a touch panel, video is displayed on the display unit, and an image corresponding to an operation unit is also displayed for a user to operate the touch panel, so that an input of operation information is accepted.
In some of these techniques, for example, a user touch triggers display of an operation window as described above.
Patent Document 1 (Japanese Patent Laying-Open No. 2007-52795) discloses a technique for a digital camera of, when a user touch on a touch panel is detected, displaying an image including operation buttons, such as a shutter button, a zoom-in button, and a zoom-out button, relative to a touch position on the touch panel for user convenience of operation.
There has been a growing user demand to operate a mobile information terminal by a finger of one hand while holding the terminal by that hand. Many mobile information terminals are accordingly manufactured on the assumption that they are held and operated by one hand.
Meanwhile, in recent years, mobile information terminals have been equipped with an increasing number of functions. Demand is growing accordingly that more information, such as buttons and menus, is displayed in the operation window displayed on the display unit.
As information displayed in the operation window increases, the area required of the operation window is expected to increase. As the operation window increases in area, a situation may arise where, even when the operation window is displayed at a position supposed to be easily operated by a user as disclosed in the above-mentioned Patent Document 1, the user does not actually find the terminal easy to operate. More specifically, even when the operation window is displayed at a position supposed to be easily operated by the user, a button located at a corner of the operation window may be too far for the user to operate by a finger of the one hand holding the terminal. To operate the button in such a case, the user needs to change the position of his/her hand holding the terminal or to operate the button by the other hand.
The present invention was made in light of these circumstances, and an object of the invention is to ensure improved user convenience of a mobile information terminal displaying an operation window on a touch panel provided on a display unit.
A mobile information terminal in accordance with an aspect of the present invention includes a display unit, a touch panel arranged in the display unit, an application execution unit executing an application, and a controller executing processing related to the application in accordance with an operation on the touch panel. The controller displays, on the display unit, an operation window in which information for use in the processing related to the application is input, and shifts a display position of the operation window on the display unit based on a first operation on the touch panel.
A mobile information terminal in accordance with another aspect of the present invention includes a display unit, a touch panel arranged in the display unit, an application execution unit executing an application, and a controller executing processing related to the application in accordance with an operation on the touch panel. The controller displays, on the display unit, an operation window in which information for use in the processing related to the application is input. The controller shifts a display position of the operation window on the display unit based on a first operation on the touch panel. The controller is capable of returning the display position of the operation window shifted by the first operation, to a position before being shifted. When an operation is performed on the touch panel, the controller determines whether or not the operation satisfies a requirement for the first operation, and when determining that the requirement is satisfied, shifts the display position of the operation window on the display unit.
A mobile information terminal in accordance with a yet another aspect of the present invention includes a display box, a touch panel arranged in the display box, an application execution unit executing an application, and a controller executing processing related to the application in accordance with an operation on the touch panel. An operation window of the application is larger than a size of the display box. The controller displays, on the display box, a partial window constituting a portion of the operation window. The partial window includes items for input of information for use in the processing related to the application. In response to a first operation on the touch panel, the controller changes the portion of the operation window displayed on the display box as the partial window, and determining that information for selecting from among the items has been input by a second operation performed on the partial window as changed, executes the processing related to the application corresponding to a selected item. When the first operation is performed with the partial window located at an end of the operation window, the controller displays the end of the operation window at a shifted position in the display box from an end of the display box in a direction identical to an operation direction in the first operation. When the second operation is performed on the operation window located at the shifted position, the controller, determining that the information for selecting from among the items has been input, executes the processing related to the application corresponding to the selected item.
A computer-readable program in accordance with the present invention is a computer-readable program for controlling a mobile information terminal including a display unit, a touch panel arranged in the display unit, and an application execution unit executing an application. The computer-readable program causes the mobile information terminal to execute the steps of displaying, on the display unit, an operation window in which information for use in processing related to the application is input, determining whether or not an operation on the touch panel is performed, and shifting a display position of the operation window on the display unit based on the operation on the touch panel.
A recording medium in accordance with the present invention is a recording medium storing a computer-readable program for controlling a mobile information terminal including a display unit, a touch panel arranged in the display unit, and an application execution unit executing an application. The computer-readable program causes the mobile information terminal to execute the steps of displaying, on the display unit, an operation window in which information for use in processing related to the application is input, determining whether or not an operation on the touch panel is performed, and shifting a display position of the operation window on the display unit based on the operation on the touch panel.
According to the present invention, the display position of the operation window displayed on the display unit can be shifted based on an operation on the touch panel.
Therefore, even when a button that a user intends to operate in the operation window displayed on the display unit is located too far from a finger of a user's hand holding the mobile information terminal, the user can shift the display position of the button closer to that finger. The user can then operate the operation window at a desired position, such as a desired button, without having to change the position of his/her hand holding the mobile information terminal, for example.
A mobile phone according to an embodiment of a mobile information terminal of the present invention will be described hereinbelow with reference to the drawings. It is to be noted that the mobile information terminal according to the present invention is not limited to the mobile phone. More specifically, the mobile information terminal according to the present invention may be any terminal provided with a touch panel, and is not required to have a specific function, such as a verbal communications function provided for a mobile phone, for example.
First, with reference to
Mobile phone 100 is provided with a touch panel (a touch panel 40 which will be described later) on the front face of display unit 30. In mobile phone 100, an operation window 31 for input of information for use in a process related to an application executed in mobile phone 100 is displayed. An area of the touch panel that corresponds to a left area of display unit 30 is touched or otherwise operated, so that operation window 31 is displayed in the left area of display unit 30 as shown in
In
Operation window 31 includes a plurality of operation buttons 310 that correspond to individual functions, respectively. Mobile phone 100 stores, as appropriate, which of the plurality of operation buttons 310 on operation window 31 corresponds to which function and is displayed at which position on the touch panel. Mobile phone 100 then refers to such information and detects at which position the touch panel is operated, to thereby determine a procedure to be executed.
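The stored association among buttons, functions, and positions described above may be sketched, merely for illustration, as follows; the button names, coordinates, and layout are assumptions and not those actually used in mobile phone 100.

```python
# Illustrative sketch: mapping touch coordinates to operation buttons.
# Button names and coordinates are hypothetical assumptions.

BUTTONS = [
    # (function name, x, y, width, height) in touch-panel coordinates
    ("zoom_in",  10, 10, 40, 40),
    ("zoom_out", 10, 60, 40, 40),
    ("shutter",  10, 110, 40, 40),
]

def hit_test(x, y):
    """Return the function of the button at (x, y), or None if no button is hit."""
    for name, bx, by, bw, bh in BUTTONS:
        if bx <= x < bx + bw and by <= y < by + bh:
            return name
    return None
```

Detecting at which position the touch panel is operated and looking up the stored association in this way yields the procedure to be executed.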
With reference to
Attitude detection unit 90 is to detect the orientation and the moving direction of mobile phone 100 as well as an acceleration given to mobile phone 100, and includes a plurality of gyroscopes, acceleration sensors, and geomagnetic sensors, for example. The orientation of mobile phone 100 includes, for example, a horizontally-long state when held by the user as shown in
Storage unit 60 includes a program storage unit 61 storing programs executed by the CPU of controller 50, a setting details storage unit 62 storing details of setting, such as an address book, made in mobile phone 100, and a data storage unit 63 storing various tables which will be described later and various types of data required to execute the programs stored in program storage unit 61. Program storage unit 61 may be fixed to or may be removable from mobile phone 100.
Details of a procedure executed in mobile phone 100 will now be described.
With reference to
In mobile phone 100, the operations that can subsequently be accepted and the types of application that can be activated in combination often vary depending on the application being activated. This raises the need to change the contents displayed as a menu or, as the case may be, to refrain from executing the process of displaying a menu, by checking the state of mobile phone 100, such as whether no application is activated, which application among the television function, the Web browser function, the e-mail function, and the like is activated, or whether a telephone conversation is in progress. For these reasons, the activation state of an application (e.g., which application is activated) is checked at step S1.
At step S2, the CPU checks the state of touch panel 40, and then advances the process into step S3.
Generally, in an operating system (OS) of an information terminal, the input function through the touch panel is not offered by each application, but is provided in many cases as a function of the OS. Except for a brief time period after a touch on the touch panel is finished, during which a menu or screen transition may be displayed in an animated manner, the process of displaying a menu is often left unexecuted until a touch operation is performed, so that power consumption is reduced. The following description will be made assuming that, except for some cases, checking the touch state on the touch panel is executed outside the menu control process, and that the touch state is not changed during execution of an algorithm for the menu control process.
At step S3, the CPU determines whether or not calling of the menu control process at step S5 which will be described later is necessary. When a determination is made that calling is necessary, the process proceeds into step S5. When a determination is made that calling is unnecessary, the process proceeds into step S4.
At step S4, the CPU waits for the lapse of the above-mentioned certain interval after the execution of the current main routine is started, and then returns the process to step S1.
At step S5, the CPU, after executing the menu control process, waits for the lapse of the above-mentioned certain interval after the execution of the current main routine is started, and then returns the process to step S1.
In the menu control process at step S5, various types of processing including the following five types of processing are executed in parallel or sequentially:
The touch-operation-type identification process is to identify the type of operation pattern performed on touch panel 40, based on a user operation on touch panel 40.
The first-display-mode change process is executed when the display of operation window 31 on display unit 30 is started. Throughout the present specification, operation window 31 will also be called “menu” as necessary.
The menu drag process is to shift the display position of operation window 31 displayed on display unit 30 by, for example, sliding operation window 31 in accordance with the user operation performed on touch panel 40.
The menu-position return process is to return the display position of the operation window having been shifted in display position by the above-described menu drag process to the position before the shift.
The second-display-mode change process is executed when the display of operation window 31 on display unit 30 is terminated.
In mobile phone 100, the type of touch operation includes a single tap, a double tap, a drag, and so on. In the present specification, distinction between a single tap and a double tap will be described with reference to the flow chart of a single tap/double tap distinction process shown in
It is to be noted that the following mode of identifying the type of touch operation is merely for illustration. In the mobile information terminal, the type of touch operation may be identified by another mode generally used, rather than the method described in the present specification.
With reference to
At step SA104, the CPU determines whether or not the value of a during-touch flag Q0, which indicates whether or not a touch operation has been performed during execution of a preceding touch-operation-type identification process, is 0. When a YES determination is made, the process proceeds into step SA106, and when a NO determination is made, that is, when a determination is made that the value of during-touch flag Q0 is 1, the process proceeds into step SA112.
It is to be noted that, as will be described later, during-touch flag Q0 is a flag whose value is updated every time the touch-operation-type identification process is executed, with the value set at 1 when a touch operation is currently performed on touch panel 40, and the value set at 0 when a touch operation is not performed.
At step SA106, the CPU determines whether or not the difference between the current time and a touch start time T0 falls below a predetermined threshold value Td. When a YES determination is made, the process proceeds into step SA108, and when a NO determination is made, that is, when a determination is made that the touch operation on touch panel 40 continues for a time period longer than or equal to above-mentioned time Td, the process proceeds into step SA110.
At step SA108, the CPU determines that the current operation on touch panel 40 is a double touch, and advances the process into step SA116. It is to be noted that, at step SA108, a double-touch-state flag DT, which indicates whether or not mobile phone 100 is subjected to a double touch operation (double touch state), is set at 1.
At step SA110, the CPU determines that the current operation on touch panel 40 is a provisional touch, sets the value of a provisional-touch-state flag ET at 1, records the current time timed by timer 50A as the value of touch start time T0, sets the values of above-mentioned double-touch-state flag DT, a single-touch-state flag ST, a double-tap-state flag DU, and a single-tap-state flag SU, which will be described later, at 0, and then advances the process into step SA116.
It is to be noted that data storage unit 63 stores a touch information storage table as shown in Table 2, as a table for storing values used when various processes including the touch-operation-type identification process are executed.
In the touch information storage table, touch start position P0 is information indicating the position at which the user has started a touch operation on touch panel 40, and is represented, for example, by coordinates defined on touch panel 40.
It is to be noted that values of the respective items in the touch information storage table are updated when a provisional touch is detected.
Touch start time T0 indicates the time at which the user has started the touch operation on touch panel 40, as described above.
Touch position P1 is information indicating the current touch position at which the user touches touch panel 40 while the CPU executes various types of processing including the touch-operation-type identification process.
Touch time T1 is information indicating the time recorded when a user's touch is detected while the CPU executes various types of processing.
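The items of the touch information storage table described above (Table 2) may be modeled, merely for illustration, as the following structure; the field types and the update method are assumptions.

```python
from dataclasses import dataclass

@dataclass
class TouchInfo:
    """Illustrative model of the touch information storage table (Table 2)."""
    p0: tuple = (0, 0)   # touch start position P0 (coordinates on touch panel 40)
    t0: float = 0.0      # touch start time T0
    p1: tuple = (0, 0)   # current touch position P1
    t1: float = 0.0      # touch time T1 at which the current touch was detected

    def record_provisional_touch(self, pos, now):
        """Values of the table items are updated when a provisional touch is detected."""
        self.p0 = pos
        self.t0 = now

# Example: a provisional touch detected at an assumed position and time.
info = TouchInfo()
info.record_provisional_touch((12, 34), 1.5)
```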
Referring back to
At step SA114, the CPU sets single-touch-state flag ST at 1, determining that the type of touch operation is a single touch, and then advances the process into step SA116.
At step SA116, the CPU sets the value of above-described during-touch flag Q0 at 1, and terminates the touch-operation-type identification process.
At step SA118, the CPU determines whether or not the value of during-touch flag Q0 is 0. When a determination is made that the value is 0, the process proceeds into step SA124. When a determination is made that the value of during-touch flag Q0 is 1, the process proceeds into step SA120.
At step SA120, the CPU determines whether the value of double-touch-state flag DT is 1 or the value of single-touch-state flag ST is 1. When a YES determination is made, the process proceeds into step SA122, and when a NO determination is made, that is, when a determination is made that double-touch-state flag DT and single-touch-state flag ST both have the value of 0, the process proceeds into step SA126.
At step SA122, the CPU determines that the type of operation is a double tap when the current value of double-touch-state flag DT is 1, and determines that the type of operation is a single tap when the value of double-touch-state flag DT is 0 and the value of single-touch-state flag ST is 1. In the case of a double tap, the value of double-tap-state flag DU is set at 1. In the case of a single tap, the value of single-tap-state flag SU is set at 1. The values of double-touch-state flag DT, single-touch-state flag ST, provisional-touch-state flag ET, and a provisional-release-state flag EU are all updated to 0.
At step SA124, a determination is made whether or not the value of provisional-touch-state flag ET is 1. When a YES determination is made, the process proceeds into step SA128, and when a NO determination is made, the process proceeds into step SA132.
At step SA126, the CPU sets the value of provisional-release-state flag EU at 1 with a determination that the current operation is a provisional release, and the process proceeds into step SA132.
At step SA128, the CPU determines whether or not the difference between the current time and touch start time T0 is shorter than time Td, similarly to step SA106. When a YES determination is made, the process proceeds into step SA132. When a determination is made that the difference is longer than or equal to time Td, the process proceeds into step SA130.
At step SA130, the CPU sets the value of single-tap-state flag SU at 1 determining that the current touch operation is a single tap. The values of single-touch-state flag ST, double-touch-state flag DT, provisional-release-state flag EU, and provisional-touch-state flag ET are all updated to 0, and the process proceeds into step SA132.
At step SA132, the value of during-touch flag Q0 is updated to 0 to terminate the touch-operation-type identification process.
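The flag-based distinction flow described in steps SA102 through SA132 may be sketched, merely for illustration, as the following class. The flag names (Q0, ET, EU, ST, DT, SU, DU) follow the present specification; the branch at step SA112, whose details are omitted above, is an assumption, and the threshold time Td is an arbitrary example.

```python
class TouchTypeIdentifier:
    """Simplified, assumed rendering of the touch-operation-type
    identification process (steps SA102-SA132)."""

    def __init__(self, td=0.35):
        self.td = td            # threshold time Td (assumed value, seconds)
        self.t0 = -1e9          # touch start time T0
        self.q0 = 0             # during-touch flag Q0
        self.et = self.eu = 0   # provisional-touch / provisional-release flags
        self.st = self.dt = 0   # single-touch / double-touch state flags
        self.su = self.du = 0   # single-tap / double-tap result flags

    def update(self, touching, now):
        if touching:                          # a touch is detected
            if self.q0 == 0:                  # SA104: touch has just started
                if now - self.t0 < self.td:   # SA106: soon after previous touch
                    self.dt = 1               # SA108: double touch
                else:                         # SA110: provisional touch
                    self.et, self.t0 = 1, now
                    self.dt = self.st = self.du = self.su = 0
            else:                             # SA112 (assumed): touch continues
                if self.et and now - self.t0 >= self.td:
                    self.st = 1               # SA114: single touch
            self.q0 = 1                       # SA116
        else:                                 # no touch is detected
            if self.q0 == 1:                  # SA118/SA120
                if self.dt or self.st:        # SA122: classify the tap
                    if self.dt:
                        self.du = 1           # double tap
                    else:
                        self.su = 1           # single tap
                    self.dt = self.st = self.et = self.eu = 0
                else:
                    self.eu = 1               # SA126: provisional release
            elif self.et:                     # SA124/SA128
                if now - self.t0 >= self.td:
                    self.su = 1               # SA130: single tap
                    self.st = self.dt = self.eu = self.et = 0
            self.q0 = 0                       # SA132
```

Calling update periodically with the current touch state reproduces the distinction: a touch released and not followed by another touch within time Td yields a single tap (SU = 1), whereas a second touch within Td yields a double tap (DU = 1).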
The values of flags for use in the respective processes including the above-described touch-operation-type identification process are stored in data storage unit 63 as a table as shown in Table 3, for example.
The first-display-mode change process will now be described with reference to
With reference to
At step S104, the CPU detects the current position at which touch panel 40 is operated (touch position), and advances the process into step S106.
At step S106, the CPU checks the type of touch operation by, for example, referring to the touch-type identification result storage table (Table 3), and advances the process into step S108.
At step S108, the CPU determines whether a menu display requirement is satisfied based on the touch position detected at step S104 and the type of touch operation checked at step S106.
At step S110, the CPU determines whether or not the menu display requirement is satisfied as a result of the determination at step S108. When a YES determination is made, the process proceeds into step S112, and when a NO determination is made, the first-display-mode change process is terminated. It is to be noted that the menu display requirement is determined previously depending on the type of touch operation, and is stored in setting details storage unit 62, for example.
At step S112, the type of menu (operation window) to be displayed on display unit 30 is determined based on the state of an application being activated in mobile phone 100 and the orientation of the mobile phone (e.g., the horizontally-long orientation as shown in
In mobile phone 100, data storage unit 63 stores data for displaying various operation windows depending on the state of an application being activated.
For each operation window, data storage unit 63 stores an operation window with a design including a button arrangement suitable for display on display unit 30 when mobile phone 100 is in the horizontal orientation, as well as data for displaying a window with a design suitable for display on display unit 30 when mobile phone 100 is in the vertical orientation. More specifically, data for displaying operation windows as shown in
With reference to
Window 351 has a horizontally-long design, and window 352 has a vertically-long design including buttons causing mobile phone 100 to exert functions identical to those of buttons 350A, 350B, and 350C included in window 351.
Window 351 is displayed on display unit 30 when mobile phone 100 is in the horizontal orientation as shown in
Referring back to
It is to be noted that the horizontal orientation shown in
At step S120, the CPU calculates coordinates at which operation window 31 is displayed on display unit 30, and advances the process into step S122. It is to be noted that, at step S120, the coordinates at which the display of operation window 31 is to be centered are calculated.
At step S122, the CPU displays a vertical orientation window at the coordinates calculated at step S120, and advances the process into step S124.
At step S116, the CPU calculates coordinates at which operation window 31 is displayed, and at step S118, displays a horizontal orientation window (e.g., window 351 shown in
How to calculate the coordinates at steps S116 and S120 will now be described.
A first method may be to divide the touch panel into two areas A1 and A2 as indicated by long and short dashed lines in
A second method may be to display operation window 31 with a touch position B on touch panel 40 detected at step S104 being placed at the center in the lateral and longitudinal directions, as shown in
Depending on the touch position, part of operation window 31 cannot be displayed on display unit 30 in some cases even when operation window 31 is to be displayed with the touch position placed at the center. In such a case, the display position of operation window 31 is preferably designed to fall within display unit 30 as shown in
In a mode of correcting the display position of operation window 31, as shown in
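The correction described above, by which the display position of operation window 31 is made to fall within display unit 30, amounts to clamping the window position; the following sketch assumes a window centered on the touch position, with all dimensions hypothetical.

```python
def corrected_window_origin(touch_x, touch_y, win_w, win_h, disp_w, disp_h):
    """Center the window on the touch position, then clamp it so the
    whole window fits within the display unit (illustrative sketch)."""
    # Place the window with the touch position at its center.
    left = touch_x - win_w // 2
    top = touch_y - win_h // 2
    # Correct the position so no part of the window lies off the display.
    left = max(0, min(left, disp_w - win_w))
    top = max(0, min(top, disp_h - win_h))
    return left, top
```

When the touch position is near an edge of the display unit, the clamp shifts the window inward so that it is displayed in its entirety.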
Referring back to
At step S126, the CPU stores, as a display start position p, the central coordinates of operation window 31 (or the coordinates after the correction when a correction is made as described with reference to
It is to be noted that above-mentioned display start time t and display start position p are stored in a display information storage table stored in data storage unit 63, for example. The details of the display information storage table are shown in Table 4, by way of example.
It is to be noted that a menu as used herein refers to an operation window for input of information for use in an application-related process, and includes a window object being displayed on the display unit subjected to a menu drag process.
In the menu drag process, the CPU first determines at step S202 whether or not a touch input is currently made on touch panel 40. When a YES determination is made, the process proceeds into step S204, and when a NO determination is made, the process proceeds into step S228.
At step S204, the CPU determines whether or not mobile phone 100 is in a menu display mode. When a YES determination is made, the process proceeds into step S206, and when a NO determination is made, the process proceeds into step S212.
Modes of mobile phone 100 will be described now.
Mobile phone 100 allows for selection from among four modes of the menu display mode, a menu selection mode, a drag mode, and a menu non-display mode, as shown in Table 5.
The mode information storage table is stored in data storage unit 63, for example. The mode information storage table stores information by the flag value (1 or 0) so as to show which one of the four modes shown in Table 5 is valid.
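The mode information storage table, in which the flag value (1 or 0) shows which one of the four modes is valid, may be modeled, merely for illustration, as follows; the dictionary representation and the mode names are assumptions.

```python
# Illustrative model of the mode information storage table (Table 5).
# Exactly one of the four mode flags holds the value 1 at a time.
MODES = ("menu_display", "menu_selection", "drag", "menu_non_display")

def set_mode(table, mode):
    """Mark exactly one of the four modes as valid (flag value 1)."""
    for m in MODES:
        table[m] = 1 if m == mode else 0

mode_table = {m: 0 for m in MODES}
set_mode(mode_table, "menu_display")
```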
Referring back to
At step S208, the CPU stores the current touch position as touch start position P0 and the current time at this time point as touch start time T0, and advances the process into step S210. Touch start position P0 and touch start time T0 as used herein correspond to start position p and start time t shown in Table 4, respectively.
That the touch position falls within an area corresponding to operation window 31 means that the touch position falls within an area of touch panel 40 that overlies the area in which operation window 31 is displayed on display unit 30.
At step S210, the CPU changes the mode of the mobile phone to the menu selection mode, and advances the process into step S216.
At step S212, the CPU stores the current touch position as touch position P1 and the current time as touch time T1, and advances the process into step S214.
At step S214, the CPU determines whether or not the current mode of mobile phone 100 is the menu selection mode or the drag mode. In the case of the menu selection mode, the process proceeds into step S216, and in the case of the drag mode, the process proceeds into step S224.
At step S216, the CPU calculates the difference between touch position P1 and touch start position P0 to obtain a shift distance, and determines whether or not the shift distance is longer than a predetermined threshold value. When a YES determination is made, the process proceeds into step S222, and when a NO determination is made, that is, when a determination is made that the shift distance is shorter than or equal to the threshold value, the process proceeds into step S218.
At step S222, the CPU changes the mode of mobile phone 100 to the drag mode, and advances the process into step S224.
At step S224, a new menu position is calculated based on touch position P1, and the process proceeds into step S226.
More specifically, the new display position of operation window 31 is calculated by placing the central coordinates of operation window 31 at touch position P1.
At step S226, the CPU changes the display position of operation window 31 on display unit 30 to the new position calculated at step S224, to terminate the menu drag process. Changing the display position is desirably performed such that the shift of the operation window from the initial position to the new position is displayed continuously on display unit 30, so that the display of operation window 31 appears to the user to be gradually shifting without disappearing from display unit 30.
In the above-described menu drag process, when mobile phone 100 is in the menu display mode and the shift distance (P1−P0) of the user's finger on touch panel 40 during a continuous (successive) operation of touch panel 40 is greater (longer) than the certain threshold value, the display position of operation window 31 is shifted based on the operation position (touch position P1) on touch panel 40 after the shift. Herein, the continuous operation of touch panel 40 includes a state in which a touch-and-release is never detected after the user starts touching touch panel 40.
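The mode transition and window shift of the menu drag process (steps S216 through S226) may be sketched, merely for illustration, as follows; the Euclidean shift distance, the threshold value, and the mode names are assumptions.

```python
import math

DRAG_THRESHOLD = 8.0  # assumed threshold value (in panel coordinate units)

def drag_step(mode, p0, p1):
    """One pass of the menu drag process (steps S216-S226, simplified).
    Returns the new mode and, in the drag mode, the new menu center."""
    shift = math.dist(p0, p1)               # shift distance between P1 and P0
    if mode == "menu_selection" and shift > DRAG_THRESHOLD:
        mode = "drag"                       # S222: change to the drag mode
    if mode == "drag":
        return mode, p1                     # S224: center the menu on P1
    return mode, None                       # S218: menu position unchanged
```

Once the drag mode is entered, every subsequent pass centers the menu on the current touch position P1, so the window follows the finger continuously.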
More specifically, as shown in
In
In another example, as shown in
Operation window 390 is a window with an address-book application activated. Displayed on display unit 30 is a cursor 381 indicating that “na” has been selected from among indices, such as “a”, “ka”, “ta”, “na”, displayed in a display box 380. Headers of individuals contained in the address book whose names start from the row of “na” are displayed in operation window 390.
It is to be noted that, with the display position of operation window 390 shifted, a portion underlying operation window 390 in
Although
Another example of a dragged window will now be described, illustrating a Web contents browser window. It is to be noted that the Web contents browser window involves reproducing contents and displaying items linked to URL (Uniform Resource Locator) addresses of other homepages. A selection is made from among (character strings corresponding to) the items by a single tap or the like, so that processing such as accessing a link corresponding to a selected item is executed. In this respect, the Web contents browser window is also regarded as an operation window in which information for causing mobile phone 100 to execute processing is input.
First, with reference to
The Web browser, installed in mobile phone 100, is executed so that the Web contents browser window as shown in
With reference to
Referring back to
Information indicative of the relative position of the portion displayed in display box 30C with respect to the entire Web contents is displayed in display box 30A, in addition to information specifying an application (Web browser) being executed.
The information of “80%” in display box 30A shown in
More specifically, a calculation is made according to the following expression (1) using L1 and L2 shown in
R%=(L2/L1)×100 (1)
Herein, L1 represents the longitudinal dimension of the entire Web contents (the longitudinal dimension of entire Web contents 1000), and L2 represents the distance between the upper end of the portion displayed in display box 30C and the upper end of the Web contents (the distance between the upper ends of portion 1001 and Web contents 1000). Information displayed in display box 30A is denoted by R %. With such information displayed in display box 30A, the user can readily identify at which position the information displayed in display box 30C resides in the entire Web contents.
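Expression (1) can be computed directly as follows; the pixel dimensions used in the example are assumptions.

```python
def scroll_ratio_percent(l1, l2):
    """Expression (1): R% = (L2 / L1) x 100, where L1 is the longitudinal
    dimension of the entire Web contents and L2 is the distance from the
    upper end of the contents to the upper end of the displayed portion."""
    return (l2 / l1) * 100
```

For example, assumed contents 1000 pixels long whose displayed portion starts 800 pixels from the upper end give the value of "80%" shown in display box 30A.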
It is to be noted that the display for allowing users to identify such proportion is not limited to the display in percentage as shown in
When the user performs an operation (e.g., drag) on touch panel 40 in such a manner as to slide downwardly as indicated by an arrow 301 starting from the state shown in
This scrolling may be performed in such a manner that the window has inertia. In this case, the scrolling speed is gradually increased after the start of scrolling, and then decreased to stop the scrolling.
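One way to realize such inertia-like scrolling (a sketch, not the implementation described in the source) is to ease the scroll offset so that per-step movement first grows and then shrinks; the smoothstep curve used here is an assumption.

```python
def inertial_scroll_offsets(total_distance, steps):
    """Offsets for a scroll whose speed gradually increases after the
    start and then decreases until the scrolling stops (smoothstep easing)."""
    offsets = []
    for i in range(1, steps + 1):
        t = i / steps                 # normalized time, 0..1
        eased = t * t * (3 - 2 * t)   # smoothstep: slow, then fast, then slow
        offsets.append(total_distance * eased)
    return offsets
```

For example, `inertial_scroll_offsets(100, 4)` yields step increments of 15.625, 34.375, 34.375, 15.625: the window accelerates and then decelerates to a stop.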
When a finger is slid downwardly on touch panel 40 as described above, portion 1001 of Web contents 1000 displayed in display box 30C is changed from that shown in
When the finger is slid on touch panel 40, Web contents 1000 is shifted relative to portion 1001 in the direction that the finger is slid (the direction of arrow 301A in
In
When the user's finger is further slid on touch panel 40 downwardly as indicated by an arrow 302 from the state shown in
In
When an item in window 363 (e.g., each of the items of “1st-5th ranks”, “6th-10th ranks”, and “11th-15th ranks” shown as tabs in the menu of “Keywords of interest”) is operated in a pattern different from sliding the user's finger on touch panel 40 (e.g., a touch-and-release on touch panel 40), the Web browser, determining that information corresponding to an operated item has been input, executes processing such as changing the display contents in window 363.
According to the example of window dragging described above with reference to
In this manner, after the display position of the operation window on display unit 30 is shifted in the shift direction, the operation window continues to be displayed at the shifted display position, so that the user can select from among selection items by a touch-and-release or the like on the operation window at that position.
It is to be noted that, although the above description is made with the upper end of the Web contents displayed in display box 30C, the drag operation can be performed similarly with the lower, right, or left end of the Web contents displayed.
Referring back to
Herein, the shift distance may be an actual shift distance of an operation target position on touch panel 40, or may be a shift distance in a certain direction.
More specifically, when the operation target position is shifted from a point A to a point B as shown in
Then, as shown in
In this manner, the threshold value is set at distance R between the central positions of adjacent buttons, so that the threshold value can be determined depending on the size of and the distance between the buttons in the operation window. Determining the threshold value per operation window enables a more precise distinction between the menu selection mode and the drag mode irrespective of the size of and the distance between the buttons.
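The per-window threshold and the resulting mode decision can be sketched as follows; the button coordinates are hypothetical, and only the rule itself (threshold R equals the distance between adjacent button centers, drag mode when the shift reaches R) comes from the source.

```python
import math

def drag_threshold(button_centers):
    """Threshold value R: the distance between the central positions of
    two adjacent buttons in the operation window (layout is an assumption)."""
    (x0, y0), (x1, y1) = button_centers[0], button_centers[1]
    return math.hypot(x1 - x0, y1 - y0)

def mode_for_shift(start, end, threshold):
    """Drag mode when the shift distance is longer than or equal to the
    threshold; menu selection mode otherwise."""
    dist = math.hypot(end[0] - start[0], end[1] - start[1])
    return "drag" if dist >= threshold else "menu_selection"
```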
When a drag operation is performed on operation window 31 (menu) at least in the horizontal direction by a distance longer than or equal to distance R, the display position of operation window 31 (operation window 390) on display unit 30 is shifted by steps S216 and S222 to S226. The mobile phone is in the menu selection mode when a drag operation is not performed on operation window 31 on display unit 30 (an area of touch panel 40 corresponding to operation window 31) as shown in
At step S218, the CPU specifies the operation button at the closest position to the touch position in operation window 31, and advances the process into step S220.
At step S220, the CPU highlights the button specified at step S218, and terminates the menu drag process. This button highlighting enables the user to readily identify the selected item and to readily identify that mobile phone 100 is in the menu selection mode. When highlighting occurs during a touch operation for the purpose of dragging, the user can identify that the drag distance is too short to drag and display operation window 31, and then continues performing a drag operation by a longer distance to cause mobile phone 100 to shift the display position of operation window 31 in a dragged manner.
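Step S218 amounts to a nearest-neighbor search over the button centers; a minimal sketch is shown below, with hypothetical button names and coordinates.

```python
import math

def closest_button(touch, buttons):
    """Step S218: specify the operation button at the closest position to
    the touch position (step S220 would then highlight it).
    `buttons` maps a hypothetical button name to its center coordinates."""
    return min(buttons,
               key=lambda name: math.hypot(buttons[name][0] - touch[0],
                                           buttons[name][1] - touch[1]))

buttons = {"311A": (20, 100), "311B": (80, 100)}
print(closest_button((30, 95), buttons))  # → 311A
```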
With reference to
At step S230, the CPU executes processing corresponding to the menu item currently selected in mobile phone 100, and then advances the process into step S234.
At step S232, the CPU determines whether or not the mode of mobile phone 100 is the drag mode. When a YES determination is made, the process proceeds into step S234, and when a NO determination is made, the menu drag process is terminated (with the menu (operation window 31) being displayed).
At step S234, the CPU changes the mode of mobile phone 100 to the menu display mode, and terminates the menu drag process.
The contents displayed on display unit 30 in
Just after the user's finger touches operation window 31 as shown in
If the finger is not shifted or the shift distance is too short to exceed the above-described certain threshold value, a NO determination is made at step S216. The process then proceeds into step S218 (
During a period in which the user touches touch panel 40 continuously in a shift operation of shifting his/her finger in the direction indicated by A31, from the state shown in
When the user lifts his/her finger off touch panel 40 at the position indicated by H1 (
Once the user performs a touch-and-release after the shift operation, and then touches the position indicated by H2 in
When the user lifts his/her finger off touch panel 40 at the position indicated by H2 (i.e., when the shift distance is shorter than the threshold value at step S216, and a touch-and-release occurs in the menu selection mode), the process proceeds from step S202 to S228. Because the operation mode of mobile phone 100 has not been changed from the menu selection mode after the touch operation is performed at the position of H2, a YES determination is made at step S228, and the process proceeds into step S230. At step S230, the selected menu item, that is, processing corresponding to button 311A shown in
In the above-described menu drag process, the display position of operation window 31 having been shifted is centered at the end point of the user's drag operation on touch panel 40.
For example, when a drag operation is performed from point P0 to point P1 on touch panel 40 as shown in
Alternatively, as shown in
If operation window 31 cannot be displayed entirely on display unit 30 even when the user wishes to display operation window 31 on display unit 30 with point Pc placed at the center, point Pc is preferably corrected as appropriate (as described with reference to
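The correction of point Pc can be sketched as a clamp that keeps the whole operation window inside display unit 30; the pixel dimensions in the usage note are hypothetical.

```python
def corrected_center(pc, window_size, display_size):
    """Correct the requested center point Pc so that the entire operation
    window can be displayed on the display unit (origin at top-left, px)."""
    cx, cy = pc
    ww, wh = window_size
    dw, dh = display_size
    cx = min(max(cx, ww / 2), dw - ww / 2)  # keep left/right edges on screen
    cy = min(max(cy, wh / 2), dh - wh / 2)  # keep top/bottom edges on screen
    return (cx, cy)
```

For a 240×320 display and a 100×80 window, a drag ending at (10, 10) is corrected to (50.0, 40.0), while an end point already far enough from the edges is left unchanged.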
In the above-described menu drag process, when the shift distance is longer than or equal to the certain distance at step S216, the mode of shifting operation window 31 in accordance with a user operation on touch panel 40, that is, the drag mode, is brought about.
In the above description, the shift distance in a touch operation is used as a requirement for bringing about the drag mode; however, other operation modes may also serve as requirements for bringing about the drag mode. An accurate determination of whether to bring about the drag mode based on a user operation is important, because the need arises to determine whether to execute an executable selection item, such as a button, if any, at the touch position or whether to shift the operation window without executing the item.
Providing a mobile phone with determination means for determining whether to bring about the drag mode based on a user operation or whether to select a menu corresponding to the touch position will allow a touch-panel-equipped device having a small display window to exert advantageous effects in terms of design of operation window and usability.
It is to be noted that, instead of the shift distance of a user operated position, mobile phone 100 may be configured to be changed to the drag mode provided that the user operates touch panel 40 without making any touch-and-release for a somewhat long time period or provided that the user performs a drag operation by shifting his/her finger over operation window 31 at a speed greater than or equal to a certain speed. An example of such processing is shown in
In
At step S216A, a determination is made whether or not the difference between touch start time T0 and touch time T1 (T1−T0), that is, a time period during which a touch operation continues, exceeds a predetermined threshold value Tx, or whether or not the shift speed, that is, the value obtained by dividing the shift distance (P1−P0) by the shift time (T1−T0) (herein, an initial speed of shifting), exceeds a predetermined threshold value Vx. Mobile phone 100 is configured to be changed to the drag mode when the time period during which a touch operation continues exceeds predetermined threshold value Tx or when the shift speed exceeds threshold value Vx. Continuation of a touch operation refers to a state of being kept touched without any touch-and-release after the touch.
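The two conditions of step S216A can be sketched as below; the concrete threshold values Tx = 0.5 s and Vx = 200 px/s are assumptions for illustration, not values given in the source.

```python
import math

def should_enter_drag_mode(p0, p1, t0, t1, tx=0.5, vx=200.0):
    """Step S216A sketch: enter the drag mode when the touch has continued
    longer than threshold Tx (seconds), or when the initial shift speed
    |P1 - P0| / (T1 - T0) exceeds threshold Vx (px/s). Tx and Vx defaults
    are illustrative assumptions."""
    dt = t1 - t0
    if dt > tx:                      # touch continued beyond Tx
        return True
    if dt <= 0:
        return False
    dist = math.hypot(p1[0] - p0[0], p1[1] - p0[1])
    return dist / dt > vx            # initial shift speed beyond Vx
```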
Mobile phone 100 may also be configured to be changed to the above-described drag mode provided that a user operation corresponds to a reciprocating motion.
In
Mobile phone 100 may also be configured to be changed to the above-described drag mode based on the position at which the user operates touch panel 40.
With reference to
It is to be noted that located proximate to an end is an area defined by broken lines 330 and long and short dashed lines 331, as shown in
When a determination is made at step S205A that the touch position is proximate to the end, then, the mode of mobile phone 100 is changed to the drag mode at step S205B. It is to be noted that, at step S205A, when the touch position is in an area other than the area where the buttons (selection items) are displayed in the operation window, the process may proceed into step S205B.
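The end-proximity determination of step S205A can be sketched as a band test along the window edges; the 12-px margin standing in for the area between broken lines 330 and long and short dashed lines 331 is an assumed value.

```python
def touch_near_edge(touch, window_rect, margin=12):
    """Step S205A sketch: True when the touch position is inside the
    operation window but within `margin` px of one of its edges
    (the band corresponding to lines 330/331; margin is assumed)."""
    x, y = touch
    left, top, right, bottom = window_rect
    if not (left <= x <= right and top <= y <= bottom):
        return False  # outside the operation window entirely
    return (x - left < margin or right - x < margin or
            y - top < margin or bottom - y < margin)
```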
Mobile phone 100 may also be configured to be changed to the above-described drag mode provided that a certain type of touch operation is performed.
With reference to
A menu-position return process of returning the position of a dragged menu will now be described.
As shown in
More specifically, after the display position of the operation window is shifted such that the operation window is dragged closer to the user's finger in accordance with a user operation on touch panel 40, a touch-and-release occurs at a position indicated by broken lines H11, as shown as mobile phone 100B in
The menu-position return process may be such that the display position is returned to its original position when no new touch operation is performed for a certain time period after the drag operation is finished. In other words, the display position may also be returned to its original position when processing for a second operation on the operation window, such as a selection, is not executed. Alternatively, when no new touch operation is performed for a brief period, the operation window may no longer be displayed, as shown as mobile phone 100C in
As shown in
With reference to
At step S306, the CPU executes a processing item selected by the user operating a button (e.g., button 311) in operation window 31, and advances the process into step S308.
At step S308, the CPU determines whether display start position p (see Table 4) and the current menu display position (the central coordinates of operation window 31) are different from each other. When a YES determination is made, the process proceeds into step S310, and when a NO determination is made, that is, when a determination is made that the central coordinates of current operation window 31 coincide with the coordinates stored as display start position p, the process proceeds into step S312.
At step S310, the CPU shifts (returns) the display position of operation window 31 such that its central coordinates coincide with the coordinates stored as display start position p, and advances the process into step S312.
At step S312, the CPU changes the mode of mobile phone 100 to the menu display mode, and terminates the menu-position return process.
At step S314, the CPU determines whether or not the mode of mobile phone 100 is the drag mode. When a YES determination is made, the process proceeds into step S316, and when a NO determination is made, the process proceeds into step S318.
At step S316, the CPU changes the mode of mobile phone 100 to the menu display mode, and advances the process into step S318.
At step S318, the CPU determines whether or not a state without any touch input has continued for a predetermined time period Tx or longer. When a YES determination is made, the process proceeds into step S320, and when a NO determination is made, the menu-position return process is terminated.
At step S320, the CPU changes the mode of mobile phone 100 to the menu non-display mode, and terminates the menu-position return process.
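Steps S308 through S320 can be compressed into a small sketch of the return logic; the idle threshold `tx` of 3.0 s is an assumed value, and the function only models the position correction and the resulting mode, not the full flow chart.

```python
def menu_position_return(current_center, start_p, idle_time, tx=3.0):
    """Sketch of steps S308-S320: return the operation window to its stored
    display start position p when it has been dragged away (S308/S310), and
    report the menu non-display mode after `tx` seconds (assumed value)
    without any touch input (S318/S320), else the menu display mode."""
    new_center = start_p if current_center != start_p else current_center
    mode = "menu_non_display" if idle_time >= tx else "menu_display"
    return new_center, mode
```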
The second-display-mode change process will now be described with reference to the flow chart of the process shown in
With reference to
At step S404, the CPU determines whether or not the mode of mobile phone 100 is the menu selection mode. When a YES determination is made, the process proceeds into step S406, and when a NO determination is made, the process proceeds into step S408.
At step S406, the CPU executes a control item selected currently, and advances the process into step S414.
At step S408, the CPU determines whether or not mobile phone 100 is in the drag mode. When a YES determination is made, the process proceeds into step S410, and when a NO determination is made, the process proceeds into step S412.
At step S410, the CPU changes the mode of mobile phone 100 to the menu display mode, and advances the process into step S412.
At step S412, similarly to step S318, the CPU determines whether or not a time period without any touch input on touch panel 40 is longer than or equal to time period Tx stored in setting details storage unit 62, for example. When a YES determination is made, the process proceeds into step S414, and when a NO determination is made, the second-display-mode change process is terminated.
At step S414, the CPU changes the mode of mobile phone 100 to the menu non-display mode, and terminates the second-display-mode change process.
It should be understood that the embodiments disclosed herein are illustrative and non-restrictive in every respect. The scope of the present invention is defined by the claims, not by the description above, and is intended to include any modification within the meaning and scope equivalent to the claims.
30 display unit; 31, 390 operation windows; 40 touch panel; 50 controller; 50A timer; 51 display control unit; 53 to 55 audio output control units; 56 receiver; 57 speaker; 58 microphone; 60 storage unit; 61 program storage unit; 62 setting details storage unit; 63 data storage unit; 80 communication control unit; 81 antenna; 90 attitude detection unit; 100 mobile phone; 310, 311 buttons.
Number | Date | Country | Kind |
---|---|---|---|
2008-112849 | Apr 2008 | JP | national |
2009-064587 | Mar 2009 | JP | national |
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/JP2009/057838 | 4/20/2009 | WO | 00 | 10/22/2010 |