The disclosure of Japanese Patent Application No. 2015-119549, filed on Jun. 12, 2015, is incorporated herein by reference.
An exemplary embodiment relates to a technique for scrolling a content displayed on a display device.
Scrolling is known in the art as a technique for viewing a content extending beyond a display area of a display device.
An exemplary embodiment provides a non-transitory storage medium storing an information-processing program for causing a computer to execute a process, the process comprising: accepting a pointing operation input using a pointing device to indicate a coordinate of a display area of a display device; setting a first stop position for a scroll of a content, a part of the content appearing in the display area, based on a positional relationship between a start coordinate and an end coordinate of the pointing operation; and scrolling the content to the first stop position based on the pointing operation.
CPU 110 is a processing unit for executing a program stored in main memory 120 or data memory 130. Main memory 120 is a storage device for use as a work area or buffer area for CPU 110. Main memory 120 is, for example, a pseudo-SRAM (PSRAM).
Inertial performance data D1 is data used in a processing for performing an inertial scrolling (described later). The processing will hereinafter be referred to as “inertial performance.” Inertial performance flag D2 is a flag indicative of whether an inertial performance is in progress. An on-state of the flag indicates that an inertial performance is being currently executed.
Input operation data D3 is data indicative of a state of input to touch screen 140. Input operation data D3 includes data indicative of an input coordinate. Last input coordinate data D4 is data indicative of an input coordinate relative to touch screen 140, which has been detected in a last frame. When no input is detected in a last frame, no input coordinate is stored. On the other hand, when an input is detected, a coordinate of the input is stored. Reference to last input coordinate data D4 enables calculating parameters including a change in a touch position (or an input coordinate) of a drag operation, and an amount of movement of an indicator such as a finger or a stylus. Last but one input coordinate data D5 is data indicative of an input coordinate, which has been detected in a frame immediately prior to a frame in which last input coordinate data D4 has been detected. In other words, last but one input coordinate data D5 is data indicative of an input coordinate detected in a frame preceding a current frame by two frames.
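The frame-by-frame bookkeeping of last input coordinate data D4 and last but one input coordinate data D5 described above can be sketched as follows; the class and method names are illustrative and do not appear in the embodiment:

```python
# Sketch of the per-frame input-coordinate history (D4 and D5).
# Names (InputHistory, update, movement) are illustrative assumptions.

class InputHistory:
    def __init__(self):
        self.last = None          # last input coordinate data D4 (previous frame)
        self.last_but_one = None  # last but one input coordinate data D5 (two frames back)

    def update(self, coord):
        """Shift the history by one frame; coord is (x, y), or None if no input."""
        self.last_but_one = self.last
        self.last = coord

    def movement(self, current):
        """Amount of movement of the indicator since the last frame."""
        if self.last is None or current is None:
            return None
        return (current[0] - self.last[0], current[1] - self.last[1])
```

Referring to the stored history in this way yields the change in a touch position of a drag operation, as the text describes.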
Touch-on coordinate data D6 is data indicative of a coordinate of a position at which an indicator has come into contact with touch screen 140. Touch-off coordinate data D7 is data indicative of a coordinate of a position at which an indicator has moved away from touch screen 140.
Layout data D8 is data indicative of a layout of a screen to be displayed on touch screen 140. Touch screen 140 displays a screen according to layout data D8.
Content C is, for example, a menu screen of an application such as a game. Specifically, content C is, for example, a game stage selection screen, a possessed game item list screen, or an other-users list screen for a network game. Content C is, alternatively, a Web page or an electronic book content. A display element constituting content C is, for example, text or an image (specifically, a still image or a video).
The foregoing is a description of main memory 120.
Data memory 130 is a non-volatile storage device for storing programs to be executed by CPU 110, and various types of data. Data memory 130 is, for example, a flash memory or a hard disk. Data memory 130 stores menu processing program P. Data memory 130 may be detachable from information-processing device 100.
Touch screen 140 includes a display panel and a touch sensor arranged on the display panel. The display panel is a display device such as a liquid crystal display or an organic electroluminescence (EL) display. The touch sensor is, for example, a capacitive touch sensor or a resistive touch sensor. Touch screen 140 performs a process to detect a touching object at a predetermined frame rate to generate touch position data, which data is output to CPU 110 as input operation data D3. Touch position data is, for example, data indicative of a coordinate of a position at which an input has been detected on an input surface of touch screen 140. Touch screen 140 is an example of a pointing device. A pointing device as used herein refers to a device used to indicate a position (or a coordinate) on a computer screen.
Operation button 150 is a key used for input operation, such as a cross key or a push button. Operation button 150 outputs input operation data indicative of a state of input (specifically, whether it has been pressed) to CPU 110.
Communication module 160 is a communication interface such as a data communication card. Communication module 160 controls data communication with another device.
A functional configuration of information-processing device 100 will be described. Specifically, a configuration of functions for enabling a scroll operation performed in a menu processing (described later) will be described.
Input operation accepting unit 111 accepts a pointing operation using touch screen 140 to indicate a coordinate of display area R of touch screen 140. The pointing operation scrolls content C, a part of which appears in display area R. The pointing operation is, specifically, an input operation such as a drag operation or a flick operation. A drag operation as used in the present exemplary embodiment refers to an input operation to cause an indicator to touch touch screen 140 and to slide the indicator thereon in a certain direction. A flick operation as used in the present exemplary embodiment refers to an input operation to cause an indicator to touch touch screen 140 and to move the indicator away from the display surface while swiping the indicator in a certain direction on the display surface. A flick operation enables a scrolling to continue for a short time even after a touch-off, due to an inertial force corresponding to an intensity of the flick operation. The scrolling caused by an inertial force will hereinafter be referred to as “inertial scrolling.” Input operation accepting unit 111 distinguishes between a drag operation and a flick operation by referring to items of data stored in main memory 120.
Setting unit 112 sets a first stop position for scrolling of content C. When doing so, setting unit 112 sets the first stop position based on a positional relationship between a start coordinate and an end coordinate of a pointing operation accepted by input operation accepting unit 111. Specifically, setting unit 112 initially sets a second stop position for the scrolling of content C based on the pointing operation, and moves the second stop position in a direction determined based on a vector directed from the start coordinate to the end coordinate of the pointing operation to set the first stop position.
When setting the second stop position, setting unit 112 sets the second stop position based on, for example, a drag operation or a flick operation. When setting the second stop position based on a flick operation, setting unit 112 refers to input coordinates detected in frames within a predetermined time period immediately prior to a touch-off. When setting the first stop position, setting unit 112, for example, moves the second stop position in a direction determined based on the vector directed from the start coordinate to the end coordinate of the pointing operation accepted by input operation accepting unit 111. When scrolling content C vertically, setting unit 112 moves the second stop position in a direction identical to a vertical component of the vector. When scrolling content C horizontally, setting unit 112 moves the second stop position in a direction identical to a horizontal component of the vector. Setting unit 112 sets first and second stop positions by referring to items of data stored in main memory 120 to store data on the first and second stop positions in main memory 120. Setting unit 112 may omit storing data on a second stop position in main memory 120, which has been set based on a flick operation.
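The adjustment of the second stop position described above might be sketched as follows for a vertically scrolled content. The function name, the `shift` parameter, and the sign conventions (y grows downward in screen coordinates; display area R moves to larger y in content coordinates when content scrolls upward, matching the concrete examples later in the text) are assumptions for illustration:

```python
# Hypothetical sketch: deriving the first stop position from the second
# stop position and the vector of the pointing operation.

def first_stop_from_second(second_stop_y, touch_on, touch_off, shift):
    """Move the second stop position (a y-coordinate of display area R in
    content coordinates) by `shift` in the direction determined by the
    vertical component of the vector from touch-on to touch-off."""
    dy = touch_off[1] - touch_on[1]
    if dy < 0:   # upward swipe: content scrolls up, display area R moves to larger y
        return second_stop_y + shift
    if dy > 0:   # downward swipe: content scrolls down, display area R moves to smaller y
        return second_stop_y - shift
    return second_stop_y  # no vertical component: leave the stop position unchanged
```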
Setting unit 112 includes complementary slide processing unit 114 and scrolling amount adjustment processing unit 115. Complementary slide processing unit 114, upon detecting that an amount of scrolling indicated by the pointing operation accepted by input operation accepting unit 111 fails to reach a predetermined amount, sets the first stop position by moving the second stop position in a direction determined based on the vector. Specifically, complementary slide processing unit 114 sets the first stop position in such a manner when input operation accepting unit 111 has accepted a drag operation. When doing so, complementary slide processing unit 114 may ignore input coordinates detected in frames within a predetermined time period immediately prior to a touch-off of the drag operation.
Complementary slide processing unit 114, upon detecting that scrolling content C to the second stop position makes a display element constituting content C appear across the boundary of display area R, sets the first stop position so that more area of the display element is included in display area R. For example, complementary slide processing unit 114 sets the first stop position so that the whole display element is included in display area R. When doing so, complementary slide processing unit 114 may set the first stop position so that a part of the outer periphery of the display element comes into contact with the boundary of display area R. Namely, complementary slide processing unit 114 may snap the display element relative to the boundary of display area R. Alternatively, complementary slide processing unit 114, upon detecting that the display element appears across a part of display area R, the part being determined based on the vector, may set the first stop position so that more area of the display element is included in display area R.
The part determined based on the vector as used herein refers to, for example, a lower side of display area R in a case where display area R has a rectangular shape, content C is vertically scrolled, and a vertical component of the vector indicates an upward direction. The part determined based on the vector refers to an upper side of display area R in a case where a vertical component of the vector indicates a downward direction. The part determined based on the vector refers to a left side of display area R in a case where content C is horizontally scrolled, and a horizontal component of the vector indicates a rightward direction. The part determined based on the vector refers to a right side of display area R in a case where a horizontal component of the vector indicates a leftward direction.
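The mapping from a vector component to a side of display area R described in this paragraph can be expressed as a small lookup; the function and the string labels are hypothetical, and the component sign follows screen coordinates (positive means rightward or downward):

```python
# Hypothetical mapping of a vector component to the side of rectangular
# display area R that is checked for a display element crossing it.

def boundary_side(scroll_axis, component):
    """scroll_axis is 'vertical' or 'horizontal'; component is the matching
    component of the vector (negative = upward/leftward in screen coords)."""
    if scroll_axis == "vertical":
        # upward vector -> lower side; downward vector -> upper side
        return "lower" if component < 0 else "upper"
    # rightward vector -> left side; leftward vector -> right side
    return "left" if component > 0 else "right"
```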
In another exemplary embodiment, complementary slide processing unit 114, upon detecting that a display element appears across a predetermined part of display area R, may set a first stop position so that a larger area of the display element is included in display area R. The predetermined part as used herein refers to, for example, an upper or lower side of display area R in a case where display area R has a rectangular shape, and content C is vertically scrolled. The predetermined part refers to a right or left side of display area R in a case where content C is horizontally scrolled.
Scrolling amount adjustment processing unit 115, upon detecting that an amount of scrolling indicated by a pointing operation accepted by input operation accepting unit 111 equals or exceeds a predetermined amount, sets a first stop position by moving a second stop position in a direction determined based on the pointing operation. Specifically, scrolling amount adjustment processing unit 115 sets a first stop position in such a manner when input operation accepting unit 111 has accepted a flick operation.
Scrolling unit 113 scrolls content C to a first stop position set by setting unit 112, based on a pointing operation accepted by input operation accepting unit 111. When doing so, scrolling unit 113, specifically, scrolls content C by referring to a coordinate value of display area R stored in main memory 120. Scrolling unit 113, in response to acceptance of a drag operation by input operation accepting unit 111, scrolls content C to a second stop position, and thereafter scrolls the content to the first stop position. Scrolling unit 113 may scroll content C, which is constituted by an array of plural display elements, in a direction in which the plural display elements are arranged.
A menu processing performed in information-processing device 100 will be described.
At step S1, CPU 110 initially performs an initial processing of data to be used in the following process. Specifically, CPU 110 generates and arranges content C in virtual space V.
It is to be noted that it is possible to scroll content C by sliding a virtual camera (or display area R) vertically relative to content C while content C remains in a fixed position. The method of scroll processing employed in the present exemplary embodiment is merely an example; any other method, including a method that does not use a virtual camera, may be employed.
Subsequent to step S1, CPU 110 proceeds with the menu processing by repeatedly performing a processing loop formed by steps S2 to S23 for each frame.
At step S2, CPU 110 acquires input operation data D3 from main memory 120. Subsequently, CPU 110 determines whether a touch input has been detected by touch screen 140 by referring to acquired input operation data D3 (step S3). When determining that a touch input has been detected (step S3: YES), CPU 110 acquires an input coordinate value of the touch input. Subsequently, CPU 110 determines whether a continuous touch input has been detected (step S4). Specifically, CPU 110 determines whether any data is set as last input coordinate data D4 in main memory 120. When determining that no continuous touch input has been detected (step S4: NO), CPU 110 stores data on the acquired input coordinate value in main memory 120 as touch-on coordinate data D6 because the detected touch input corresponds to a touch-on (step S5).
Subsequently, CPU 110 determines whether an inertial performance is in progress (step S6). Specifically, CPU 110 determines whether an inertial scrolling caused by a flick operation continues by referring to inertial performance flag D2 stored in main memory 120. When determining that an inertial performance is in progress (step S6: YES), CPU 110 performs a processing to cancel the inertial performance (step S7). When determining that no inertial performance is in progress (step S6: NO), CPU 110 skips the processing of step S7. Subsequently, CPU 110 performs a touch-on processing (step S8). Specifically, CPU 110 performs a processing according to the above input coordinate. For example, CPU 110 performs a processing to display an explanation of a display element to which the touch-on is directed. When doing so, CPU 110 may cause a pop up explanation to be displayed. Subsequently, CPU 110 proceeds to a processing of step S13.
At step S4, when determining that a continuous touch input has been detected (step S4: YES), CPU 110 determines whether the input operation corresponds to a drag operation (or a scroll operation) (step S9). Specifically, CPU 110 determines whether an amount of change of an input coordinate determined based on input operation data D3 and last input coordinate data D4 equals or exceeds a predetermined value. The fact that the amount of change equals or exceeds the predetermined value means that a drag operation has been detected. CPU 110, when determining that the input operation corresponds to a drag operation (step S9: YES), identifies parameters necessary to scroll content C (step S10). Specifically, CPU 110 identifies a scrolling amount and a scrolling direction for content C based on an amount of change and a direction of change of an input coordinate determined based on input operation data D3 and last input coordinate data D4. Subsequently, CPU 110 updates layout data D8 based on the identified scrolling amount and scrolling direction (step S11). Specifically, CPU 110 updates a coordinate value of display area R.
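The drag-detection test of steps S9 and S10 might be sketched as follows; the threshold value and function name are assumptions, not taken from the embodiment:

```python
# Sketch of steps S9-S10: a drag operation is detected when the change of
# the input coordinate between frames equals or exceeds a threshold; the
# change then gives the scrolling amount and direction. The threshold value
# is an illustrative assumption.
import math

def detect_drag(current, last, threshold=2.0):
    """Return (scrolling amount, (dx, dy)) for a drag operation, or None
    when the input is a continuous touch on (nearly) the same position."""
    dx, dy = current[0] - last[0], current[1] - last[1]
    amount = math.hypot(dx, dy)
    if amount < threshold:
        return None  # step S9: NO - skip steps S10 and S11
    return amount, (dx, dy)
```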
At step S9, when determining that the input operation does not correspond to a drag operation; namely, that the input operation corresponds to a continuous touch on an identical position (step S9: NO), CPU 110 skips steps S10 and S11.
Subsequently, CPU 110 sets last input coordinate data D4 (step S12). Specifically, CPU 110 stores last input coordinate data D4 as last but one input coordinate data D5 in main memory 120. CPU 110 also stores, as last input coordinate data D4 in main memory 120, data indicative of the input coordinate of the touch position indicated by input operation data D3 acquired at step S2. Subsequently, CPU 110 performs a display processing (step S13). Specifically, CPU 110 generates an image with reference to layout data D8 to cause touch screen 140 to display the image. Subsequently, CPU 110 returns to step S2.
At step S3, when determining that no touch input has been detected (step S3: NO), CPU 110 determines whether the current state of an input operation corresponds to a touch-off (step S14). Specifically, CPU 110, when any data is stored as last input coordinate data D4, determines that the current state of an input operation corresponds to a touch-off, and when no data is stored as last input coordinate data D4, determines that the current state of an input operation does not correspond to a touch-off; namely, that no touch has been continuously detected. When determining that the current state of an input operation corresponds to a touch-off (step S14: YES), CPU 110 stores data on the input coordinate value indicated by last input coordinate data D4 in main memory 130 as touch-off coordinate data D7 (step S15).
Subsequently, CPU 110 determines whether the current state of an input operation corresponds to an inertial touch-off caused by a flick operation (step S16). Specifically, CPU 110 determines whether an amount of change of an input coordinate determined based on last input coordinate data D4 and last but one input coordinate data D5 equals or exceeds a predetermined value. The fact that the amount of change equals or exceeds the predetermined value means that an inertial touch-off caused by a flick operation has been detected. When determining that an inertial touch-off has been detected (step S16: YES), CPU 110 executes an inertial touch-off processing (step S17), which is a processing to enable an inertial performance.
Subsequently, CPU 110 determines whether a display element would appear across the boundary of display area R after an inertial scrolling of content C (step S53). Specifically, CPU 110 determines whether a display element would appear across a part of the boundary of display area R, the part corresponding to the identified scroll adjustment direction, by referring to layout data D8. When determining that a display element would appear across the part of the boundary of display area R (step S53: YES), CPU 110 identifies the display element.
Subsequently, CPU 110 identifies a correction scrolling amount necessary to display the whole area of the identified display element within display area R (step S54). Specifically, CPU 110 identifies a correction scrolling amount based on a coordinate value of the identified display element and the calculated preliminary coordinate value of display area R, by referring to layout data D8. Subsequently, CPU 110 calculates an actual scrolling amount by adding the identified correction scrolling amount to the identified basic scrolling amount (step S55). Subsequently, CPU 110 updates layout data D8 based on the calculated actual scrolling amount and the identified scrolling direction (step S56). Specifically, CPU 110 updates a coordinate value of display area R. CPU 110 also stores data on the calculated actual scrolling amount and the identified scrolling direction in main memory 120 as inertial performance data D1 (step S57).
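Steps S54 and S55 amount to the following arithmetic; the helper names and the `direction` labels are hypothetical, and y is assumed to grow downward:

```python
# Hypothetical helpers for steps S54-S55 of the scrolling amount adjustment.

def correction_amount(elem_y, elem_h, area_y, area_h, direction):
    """Extra scrolling needed so that the whole identified display element
    fits inside display area R after the inertial scrolling (step S54).
    direction is 'up' for an upward scroll of content (element may stick out
    below the area's lower side), 'down' otherwise."""
    if direction == "up":
        return max(0, (elem_y + elem_h) - (area_y + area_h))
    return max(0, area_y - elem_y)

def actual_scrolling_amount(basic, correction):
    """Step S55: actual amount = basic amount + correction amount."""
    return basic + correction
```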
At step S53, when determining that no display element would appear across the part of the boundary of display area R (step S53: NO), CPU 110 identifies zero as a correction scrolling amount, and proceeds to the processing of step S55. The foregoing is a description of the scrolling amount adjustment processing.
At step S16, when determining that no inertial touch-off has been detected (step S16: NO); namely, that a normal touch-off not caused by a flick operation has been detected, CPU 110 executes a complementary slide processing (step S18), which is a processing to scroll content C so that display elements shown on a screen come to an easily visible position.
Subsequently, CPU 110 identifies a complementary slide amount (step S73). Specifically, CPU 110 identifies a complementary slide amount based on a coordinate value of the identified display element and a coordinate value of display area R, by referring to layout data D8. Subsequently, CPU 110 updates layout data D8 based on the identified complementary slide direction and the identified complementary slide amount (step S74). Specifically, CPU 110 updates a coordinate value of display area R.
At step S72, when determining that no display element appears across the part of the boundary of display area R (step S72: NO), CPU 110 skips steps S73 and S74. The foregoing is a description of the complementary slide processing.
At step S14, when determining that the current state of an input operation does not correspond to a touch-off; namely, that no touch on touch screen 140 by a user has been continuously detected (step S14: NO), CPU 110 determines whether an inertial performance is in progress by referring to inertial performance flag D2 (step S20). When determining that no inertial performance is in progress (step S20: NO), CPU 110 skips a processing of steps S21 to S23 (described later). On the other hand, when determining that an inertial performance is in progress (step S20: YES), CPU 110 continues a processing for inertial performance based on inertial performance data D1 (step S21).
Subsequently, CPU 110 determines whether an ending condition for the inertial performance has been met (step S22). For example, CPU 110 determines whether to end the inertial performance based on whether an inertial scrolling for an inertial scrolling amount (or an actual scrolling amount) indicated by inertial performance data D1 has been completed. Alternatively, CPU 110 determines that an ending condition for the inertial performance has been met when display area R has reached an edge of content C during an inertial scrolling. At step S22, when determining that an ending condition for the inertial performance has not been met (step S22: NO), CPU 110 skips a processing of step S23 (described later). On the other hand, when determining that an ending condition for the inertial performance has been met (step S22: YES), CPU 110 clears inertial performance flag D2 (step S23). Subsequently, CPU 110 proceeds to the processing of step S13.
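The per-frame continuation and ending test of steps S21 to S23 can be sketched as follows; the function name and the per-frame step size are illustrative assumptions:

```python
# Sketch of one frame of the inertial performance (steps S21-S23).
# A full implementation would also end when display area R reaches an
# edge of content C, as the text notes.

def inertial_step(remaining_amount, per_frame_amount):
    """Advance the inertial scrolling by one frame; return the amount
    scrolled this frame, the remaining amount, and whether the ending
    condition (full actual scrolling amount consumed) has been met,
    at which point inertial performance flag D2 would be cleared."""
    advance = min(per_frame_amount, remaining_amount)
    remaining_amount -= advance
    return advance, remaining_amount, remaining_amount == 0
```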
The foregoing is a description of the menu processing according to the present exemplary embodiment.
A first concrete example of the menu processing described in the foregoing will be described. The present concrete example describes scrolling content C in response to a user's drag operation. The present concrete example especially describes scrolling content C in an upward direction in a screen where content C can be vertically scrolled.
In the complementary slide processing, CPU 110 initially identifies a complementary slide direction based on a direction of a vector (specifically, a vertical component thereof) directed from touch-on coordinate P1 to touch-off coordinate P2 (step S71).
Subsequently, CPU 110 identifies a complementary slide amount necessary to include the identified display element C5 within display area R (step S73). Specifically, CPU 110 identifies a complementary slide amount by subtracting a y-coordinate value of the lower side of display area R (in other words, a y-coordinate value of the upper left corner plus a height) from a value (a y-coordinate plus a height) for display element C5. CPU 110 identifies a value “50 (=(670+110)−(250+480))” as a complementary slide amount in the present example. Subsequently, CPU 110 updates a coordinate value of display area R based on the identified complementary slide direction and complementary slide amount. CPU 110 adds the value “50” to a y-coordinate “250” of display area R to update a coordinate value of display area R to “(0,300).”
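The arithmetic of this first concrete example can be checked with a short script using the coordinate values given above (y grows downward):

```python
# First concrete example: upward scroll; display element C5 sticks out
# below the lower side of display area R.
elem_y, elem_h = 670, 110             # display element C5
area_x, area_y, area_h = 0, 250, 480  # display area R before the slide

# step S73: complementary slide amount = (element bottom) - (area bottom)
slide = (elem_y + elem_h) - (area_y + area_h)   # 780 - 730
assert slide == 50

# step S74: move display area R by the complementary slide amount
area_y += slide
assert (area_x, area_y) == (0, 300)
```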
A second concrete example of the menu processing described in the foregoing will be described. The present concrete example differs from the first concrete example in scrolling content C in a downward direction.
In the complementary slide processing, CPU 110 initially identifies a complementary slide direction based on a direction of a vector (specifically, a vertical component thereof) directed from touch-on coordinate P1 to touch-off coordinate P2 (step S71).
Subsequently, CPU 110 identifies a complementary slide amount necessary to include the identified display element C1 within display area R (step S73). Specifically, CPU 110 identifies a complementary slide amount by subtracting a y-coordinate of display element C1 from a y-coordinate value of the upper side (in other words, a y-coordinate value of the upper left corner) of display area R. CPU 110 identifies a value “40 (=110−70)” as a complementary slide amount in the present example. Subsequently, CPU 110 updates a coordinate value of display area R based on the identified complementary slide direction and complementary slide amount. CPU 110 subtracts the value “40” from a y-coordinate “110” of display area R to update a coordinate value of display area R to “(0,70).”
A third concrete example of the menu processing described in the foregoing will be described. The present concrete example differs from the first concrete example in scrolling content C in response to a user's flick operation.
When a user touches the screen shown in
In the inertial touch-off processing, CPU 110 initially identifies a basic scrolling amount and a scrolling direction of content C (step S31).
In the scrolling amount adjustment processing, CPU 110 calculates a preliminary coordinate value of display area R after an inertial scrolling of content C (step S51).
Subsequently, CPU 110 identifies a correction scrolling amount based on a coordinate value of the identified display element C15 and the calculated preliminary coordinate value of display area R (step S54). Specifically, CPU 110 identifies a correction scrolling amount by subtracting a y-coordinate value of the lower side of display area R from a value (a y-coordinate value plus a height) for display element C15. CPU 110 identifies a value “40” as a correction scrolling amount in the present example. Subsequently, CPU 110 calculates an actual scrolling amount by adding the identified correction scrolling amount to the identified basic scrolling amount (step S55). CPU 110 calculates a value “1620 (=1580+40)” as an actual scrolling amount in the present example. Subsequently, CPU 110 updates a coordinate value of display area R based on the calculated actual scrolling amount and the identified scrolling direction. CPU 110 adds the value “1620” to a y-coordinate “180” of display area R to update a coordinate value of display area R to “(0,1800).”
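The arithmetic of this third concrete example can be checked in the same way (values as given above):

```python
# Third concrete example: upward inertial scroll with a correction.
basic_amount = 1580       # basic scrolling amount identified from the flick
correction = 40           # correction so that display element C15 fits entirely

# step S55: actual scrolling amount = basic amount + correction amount
actual_amount = basic_amount + correction
assert actual_amount == 1620

area_y = 180              # y-coordinate of display area R before the scroll
area_y += actual_amount   # upward scroll: display area R moves to larger y
assert area_y == 1800     # display area R updated to (0,1800)
```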
A fourth concrete example of the menu processing described in the foregoing will be described. The present concrete example differs from the third concrete example in performing a downward inertial scrolling of content C.
When a user touches the screen shown in
In the inertial touch-off processing, CPU 110 initially identifies a basic scrolling amount and a scrolling direction of content C (step S31).
In the scrolling amount adjustment processing, CPU 110 calculates a preliminary coordinate value of display area R after an inertial scrolling of content C (step S51).
Subsequently, CPU 110 identifies a correction scrolling amount based on a coordinate value of the identified display element C1 and the calculated preliminary coordinate value of display area R (step S54). Specifically, CPU 110 identifies a correction scrolling amount by subtracting a y-coordinate value of display element C1 from a y-coordinate value of the upper side (in other words, a y-coordinate value of the upper left corner) of display area R. CPU 110 identifies a value “40” as a correction scrolling amount in the present example. Subsequently, CPU 110 calculates an actual scrolling amount by adding the identified correction scrolling amount to the identified basic scrolling amount (step S55). CPU 110 calculates a value “1610 (=1570+40)” as an actual scrolling amount in the present example. Subsequently, CPU 110 updates a coordinate value of display area R based on the calculated actual scrolling amount and the identified scrolling direction. CPU 110 subtracts the value “1610” from a y-coordinate “1680” of display area R to update a coordinate value of display area R to “(0,70).”
The above exemplary embodiment may be modified as described below. Any two or more of the following modifications may be combined with each other.
In the above exemplary embodiment, a user uses touch screen 140 as a pointing device; however, the user may instead use another pointing device such as a mouse, a pen tablet (or a graphics tablet), a touch-pad, a track-pad, or a trackball. Alternatively, a user may use a controller such as a joypad or a joystick for use in an electronic device such as a home video game machine.
In the above exemplary embodiment it is assumed that content C is scrolled vertically; however, a direction of scroll may be, for example, a horizontal direction, an oblique direction, or a depth direction. The above exemplary embodiment mainly assumes list form content C as an object to be scrolled; however, an object to be scrolled may be, for example, a map of a game such as a simulation game, or a real-world map displayed by a map application. Assuming a map as an object to be scrolled, a complementary slide processing or a scrolling amount adjustment processing according to the above exemplary embodiment adjusts a stop position for a scroll based on partition lines drawn on a map at predetermined intervals and the boundary of a display area.
In the above exemplary embodiment it is assumed that a single device achieves the above menu processing; however, plural devices that can access each other via a network may cooperate to achieve the menu processing. Namely, a display control system may achieve the menu processing.
Menu processing program P stored in information-processing device 100 may be provided to the device via a computer-readable non-transitory storage medium. A storage medium as used herein may refer to, for example, a magnetic storage medium such as a magnetic tape or a magnetic disk, an optical storage medium such as an optical disk, a magneto-optical storage medium, or a semiconductor memory. The program may be provided to information-processing device 100 via a network such as the Internet.
Number | Date | Country | Kind
---|---|---|---
2015-119549 | Jun 2015 | JP | national