The following disclosure relates to a terminal device, a display control method, and a program.
Since it may be difficult for farsighted or weak-sighted people to see fine characters clearly, magnifying glasses and reading glasses are often used to make viewing such characters easy. However, using magnifying glasses and reading glasses is tiresome, and carrying them around is inconvenient.
In recent years, as smart phones have become popular, proposals have been made to provide mobile electronic devices such as smart phones, which are carried about when going out, with the function of magnifying glasses and reading glasses. Such a character enlargement function may be implemented, for example, by capturing an object with a camera installed in a smart phone, and displaying an enlarged image of a predetermined place of the captured object on a screen.
For example, according to Patent Document 1, an object of interest such as a face of a person is detected in an image displayed on a screen, and a display area and an enlargement ratio for enlarging and displaying the detected object are determined in accordance with the position and the size of the object. Then, the object of interest is enlarged and displayed by the determined enlargement ratio in the determined display area.
However, it is difficult for the above technology to enlarge and display strings spanning multiple lines at a high enlargement ratio, because camera shake makes it hard to keep specific lines within the area captured by the camera. Since an electronic device such as a smart phone is operated while being held in a hand, the above technology is strongly affected by camera shake when the enlargement ratio is high, and it is difficult to keep the line to be enlarged and displayed precisely within the imaging area of the camera. As a result, there are cases where it is difficult to move the position to be enlarged and displayed along a line.
According to an aspect, a terminal device includes a display unit; a memory; and a processor. The processor is configured to extract one or more strings from a character area included in image data by units of lines; to determine whether a position specified in a specified line among the strings extracted by units of lines has been moved by a first threshold or greater in a first axis direction that represents a line direction; and to enlarge and display, if it has been determined that the position in the first axis direction has been moved by the first threshold or greater, a string at and around a position in the first axis direction in the specified line.
The object and advantages of the embodiment will be realized and attained by means of the elements and combinations particularly pointed out in the claims. It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention as claimed.
In the following, embodiments in the disclosure will be described with reference to the drawings. Note that elements having substantially the same functional configurations throughout the specification and drawings are assigned the same codes to avoid duplicated description.
[Example of Hardware Configuration]
First, an example of a hardware configuration of a terminal device will be described according to an embodiment in the disclosure.
A smart phone 1 according to an embodiment includes a CPU (Central Processing Unit) 10, a memory 11, a touch panel 12, a camera 13, an operational button 14, a secondary storage unit 15, a communication I/F (Interface) 16, a wireless communication I/F 17, an external I/F 18, and a sound input/output I/F 19.
The CPU 10 manages and controls the units included in the smart phone 1. Functions provided by the smart phone 1 are implemented by the CPU 10 reading programs stored in the memory 11, which is constituted by a ROM (Read-Only Memory), a RAM (Random Access Memory), and the like, and executing the programs.
For example, the CPU 10 takes in and decodes instructions of an application program one after another, and executes the contents, calculation, data transfer, control, and the like. In the embodiment, the CPU 10 reads a program for enlarging and displaying characters, other application programs, and data from the memory 11 and the secondary storage unit 15, and executes a process for enlarging and displaying characters. Thus, the CPU 10 implements overall control of the smart phone 1 and control functions of enlarging and displaying characters that are installed on the smart phone 1.
The touch panel 12 has a sensor installed with which contact of an operating object, such as a finger of a user or a touch pen, can be detected on a touch surface, and has a function of accepting data input in response to a user operation. The touch panel 12 also has a function of displaying a desired object on a display such as an LCD (liquid crystal display). In the embodiment, when a user performs an operation of contacting the touch surface of the touch panel with a finger, a string specified by the operation is enlarged and displayed. Examples of the sensor include a pressure sensor, an electrostatic capacitance sensor, and an optical sensor. However, the sensor installed in the touch panel 12 may be any other sensor as long as it can detect contact and non-contact between the operating object and the touch surface.
The camera 13 includes a lens and an imaging element to capture images of printed materials and documents having objects printed, and takes in the image data. The operational button 14 is a button provided for executing a predetermined function of the smart phone 1; examples may include a power button to turn on/off the power, and a button to return to a previously displayed image (also referred to as a “back button”, below).
The secondary storage unit 15 may be constituted with a storage device such as an EEPROM, a flash memory, and an HDD (Hard Disk Drive). The secondary storage unit 15 stores control programs and an OS program executed by the CPU 10, and application programs with which the CPU 10 executes various functions provided by the smart phone 1.
The communication I/F 16 is an interface to communicate with an external apparatus via a communication network. The communication I/F 16 connects to various communication terminals via the communication network, to implement reception/transmission of data between the smart phone 1 and the communication terminals. The communication I/F 16 may also function as an interface to transmit and receive electronic mail data and the like, with other apparatuses via a cellular phone communication channel network.
The wireless communication I/F 17 is an interface to execute wireless communication with an external apparatus. For example, the wireless communication I/F 17 is an interface to implement one of wireless communication protocols among infrared communication such as IrDA and IrSS, Bluetooth (trademark) communication, Wi-Fi (trademark) communication, and contactless IC cards.
The external I/F 18 is an interface to have the smart phone 1 connect with an external apparatus. For example, the external I/F 18 is implemented by a socket to have an external recording medium (a memory card or the like) inserted, an HDMI (High Definition Multimedia Interface) (trademark) terminal, a USB (Universal Serial Bus) terminal, or the like. In this case, the CPU 10 transmits and receives data with an external apparatus via the external I/F 18.
The sound input/output I/F 19 is an interface to output sound data processed by the smart phone 1, and is implemented by, for example, a loudspeaker, a headphone terminal, or a headphone. The sound input/output I/F 19 is also an interface to input sound generated outside of the smart phone 1, and is implemented by, for example, a microphone.
[Example of Functional Configuration]
Next, a functional configuration of the terminal device according to an embodiment in the disclosure will be described with reference to
The smart phone 1 according to an embodiment includes an imaging unit 101, a storage unit 102, an extraction unit 103, a position detection unit 104, a processing unit 105, a determination unit 106, a display control unit 107, a communication I/F 108, a wireless communication I/F 109, and a sound input/output I/F 110.
The imaging unit 101 takes in image data that captures a document or the like. The imaging unit 101 is implemented by, for example, the camera 13.
The storage unit 102 stores image data that have been taken in, various programs, and various data items. The storage unit 102 stores a first threshold and a second threshold that have been set in advance as will be described later. The storage unit 102 is implemented by, for example, the memory 11 and the secondary storage unit 15.
The extraction unit 103 extracts strings from a character area included in image data by units of lines.
The position detection unit 104 detects contact on the touch surface by an operating object, and release of contact on the touch surface by an operating object (release of a finger or the like). The position detection unit 104 is implemented by, for example, a sensor installed in the touch panel 12.
The processing unit 105 calculates, based on detected contact on the touch surface, coordinates (x, y) of a touch position of an operating object, and calculates the moving direction and the moved distance of the operating object.
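Although the disclosure does not specify an implementation, the calculation performed by the processing unit 105 can be pictured with the following minimal sketch; the class name, method, and sampling model are hypothetical, and the code merely derives the moving direction and moved distance from two successive touch coordinates.

```python
import math

class TouchTracker:
    """Hypothetical sketch: derives movement from successive touch coordinates."""

    def __init__(self):
        self.last = None  # (x, y) of the previous touch sample

    def update(self, x, y):
        """Return (dx, dy, distance, angle_deg) of the move since the last sample."""
        if self.last is None:
            self.last = (x, y)
            return 0.0, 0.0, 0.0, 0.0
        dx, dy = x - self.last[0], y - self.last[1]
        distance = math.hypot(dx, dy)              # moved distance
        angle_deg = math.degrees(math.atan2(dy, dx))  # moving direction
        self.last = (x, y)
        return dx, dy, distance, angle_deg
```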
The determination unit 106 determines whether there is a line following a line specified by a touch position of the operating object.
The display control unit 107 enlarges and displays a string at and around the touch position in the line specified by the operating object among strings extracted by units of lines. In a predetermined case, which will be described later, the display control unit 107 enlarges and displays a string at and around the head of the next line or the previous line. Functions of the processing unit 105, the determination unit 106, and the display control unit 107 are implemented by the CPU 10.
The communication I/F 108 transmits and receives information with an external apparatus. The wireless communication I/F 109 executes wireless communication with an external apparatus. The sound input/output I/F 110 inputs and outputs sound data.
So far, as an example of the terminal device according to an embodiment, the hardware configuration and the functional configuration of the smart phone 1 have been described. Next, a process for enlarging and displaying characters will be described according to the first to third embodiments in order.
An example of a process for enlarging and displaying characters executed by the smart phone 1 according to the first embodiment will be described with reference to
Note that if the direction of strings included in image data is less than ±45 degrees with respect to the horizontal direction of the screen, it is determined that the strings are written in lateral writing. In this case, the horizontal direction (lateral direction) of the screen is taken as the first axis, and the vertical direction (longitudinal direction) of the screen is taken as the second axis. If the direction of strings included in image data is less than ±45 degrees with respect to the vertical direction of the screen, it is determined that the strings are written in vertical writing. In this case, the vertical direction (longitudinal direction) of the screen is taken as the first axis, and the horizontal direction (lateral direction) of the screen is taken as the second axis. The following description assumes lateral writing, in which the horizontal direction of the screen is taken as the first axis (line direction), and the vertical direction of the screen is taken as the second axis.
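As an illustrative sketch of the ±45 degree rule above (not part of the claimed implementation), the writing direction and the resulting axis assignment could be decided as follows; the function name and angle convention are assumptions.

```python
def classify_writing(line_angle_deg):
    """Classify a line as lateral or vertical writing from its angle to the
    screen's horizontal axis, following the +/-45 degree rule (sketch only)."""
    a = abs(line_angle_deg) % 180
    if a > 90:
        a = 180 - a          # fold the angle into [0, 90]
    if a < 45:
        return "lateral"     # first axis = horizontal, second axis = vertical
    return "vertical"        # first axis = vertical, second axis = horizontal
```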
Once a process for enlarging and displaying characters according to the first embodiment is started, the imaging unit 101 captures an image of a document or the like, and takes image data that includes characters into the smart phone 1 (Step S10).
Next, the extraction unit 103 analyzes the layout of the obtained image data, and extracts strings from the character area included in the image data by units of lines (Step S12). For example, by using an optical character recognition (OCR) technology, the extraction unit 103 executes a process for analyzing the layout, extracts the line direction, and extracts strings by units of lines. At this moment, the extraction unit 103 extracts not only one line but also multiple lines, and determines the order of the lines from the positional relationship between the lines. Even for image data in which charts and strings coexist, the extraction unit 103 executes the process for analyzing the layout to automatically separate the charts and the strings, and to extract only the strings by units of lines.
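A minimal sketch of line-by-line extraction, under the assumption that word (or connected-component) bounding boxes have already been produced by a layout-analysis front end; the grouping heuristic, tolerance value, and function name are hypothetical, and no character recognition is performed.

```python
def group_boxes_into_lines(boxes, y_tolerance=10):
    """Group word bounding boxes (x, y, w, h) into lines by vertical proximity,
    then order the lines top-to-bottom and words left-to-right (sketch only)."""
    lines = []
    for box in sorted(boxes, key=lambda b: (b[1], b[0])):
        x, y, w, h = box
        cy = y + h / 2.0
        for line in lines:
            if abs(line["cy"] - cy) <= y_tolerance:
                line["boxes"].append(box)
                # keep the line's center height as the mean of its box centers
                line["cy"] = sum(b[1] + b[3] / 2.0 for b in line["boxes"]) / len(line["boxes"])
                break
        else:
            lines.append({"cy": cy, "boxes": [box]})
    lines.sort(key=lambda l: l["cy"])              # reading order, top to bottom
    for line in lines:
        line["boxes"].sort(key=lambda b: b[0])     # left to right within a line
        xs = [b[0] for b in line["boxes"]] + [b[0] + b[2] for b in line["boxes"]]
        # two endpoints of the line's centerline (lateral writing assumed)
        line["endpoints"] = ((min(xs), line["cy"]), (max(xs), line["cy"]))
    return lines
```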
Note that image data whose layout is analyzed by the extraction unit 103 may not be image data captured by the imaging unit 101. For example, it may be image data stored in the smart phone 1. In this case, at Step S10, instead of capturing an image, image data is read from the secondary storage unit 15.
When a finger of a user touches the touch surface, the display control unit 107 displays a line specified by the touch position (Step S14).
For example, in
Referring to
For example, on the enlarged display screens 2 in
Referring to
Based on the detected position of the drag button 5, the determination unit 106 determines whether the drag button 5 is moving in the line direction (Step S22). If having determined that the drag button 5 is moving in the line direction, the determination unit 106 determines whether the drag button 5 has moved to the end of the specified line (Step S24). At this moment, the determination unit 106 determines whether it is the end of the line, based on the coordinates of the two endpoints of the centerline of the line extracted by the extraction unit 103.
If having determined at Step S24 that the drag button 5 has not moved to the end of the specified line, the determination unit 106 determines whether the drag button 5 has moved to the head of the specified line (Step S28). At this moment, the determination unit 106 determines whether it is the head of the line, based on the coordinates of the two endpoints of the centerline of the line extracted by the extraction unit 103. If the determination unit 106 has determined that the drag button 5 has not moved to the head of the line, the display control unit 107 extracts the component in the line direction of the movement made when the finger of the user has moved the drag button 5. The display control unit 107 moves the display area of the string displayed on the enlarged display screen 2, by the amount of the extracted component, in the line direction along the centerline of the line (Step S30).
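A hedged sketch of the end-of-line and head-of-line checks at Steps S24 and S28, assuming lateral writing and using the x coordinates of the two endpoints of the extracted centerline; the margin value and function name are illustrative.

```python
def at_line_boundary(pos_x, head_x, end_x, margin=8):
    """Return 'end', 'head', or None depending on whether the dragged position
    has reached an endpoint of the line's centerline (lateral writing; sketch)."""
    if pos_x >= end_x - margin:
        return "end"    # corresponds to the Step S24 branch
    if pos_x <= head_x + margin:
        return "head"   # corresponds to the Step S28 branch
    return None          # keep scrolling along the line (Step S30)
```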
For example, in
Referring to
Next, based on the detected position of the drag button 5, the determination unit 106 determines whether the drag button 5 is moving in the line direction (Step S22).
As in a case where the position of the drag button 5 is moved in the direction of the second axis in
Note that also for vertical writing, the display control unit 107 similarly separates the movement of the position of the drag button 5 into the component of the first axis and the component of the second axis, and moves the drag button 5 by the amount of the component of the first axis (namely, the vertical direction (longitudinal direction) of the screen). Thus, the user can read the enlarged strings on the specified line smoothly in the line direction.
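The separation of the drag movement into first-axis and second-axis components described above might be sketched as follows; the function name and the boolean flag distinguishing lateral from vertical writing are assumptions.

```python
def constrained_move(start, current, lateral=True):
    """Split a drag movement into the line-direction (first axis) component and
    the orthogonal (second axis) component; only the first-axis component is
    used to move the enlarged view along the centerline (sketch only)."""
    dx = current[0] - start[0]
    dy = current[1] - start[1]
    first_axis = dx if lateral else dy    # component along the line direction
    second_axis = dy if lateral else dx   # orthogonal component (ignored for the view)
    return first_axis, second_axis
```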
If having determined at Step S22 that the drag button 5 is not moving in the line direction, the determination unit 106 determines whether the drag button 5 is moving in a direction towards the line next to the specified line (Step S32). For example, if the drag button 5 is moving in the vertical direction of the screen, or if the moving direction of the drag button 5 is less than ±45 degrees with respect to the vertical direction of the screen, the determination unit 106 may determine that the drag button 5 is not moving in the line direction. At Step S32, if having determined that the drag button 5 is moving in a direction towards the next line, the display control unit 107 moves the head of the position displayed on the screen to the head of the next line (Step S26), displays the next line on the line display screen 3 (Step S14), and displays the head of the next line on the enlarged display screen 2 (Step S16). Consequently, as illustrated in
On the other hand, if the determination unit 106 has determined at Step S32 that the drag button 5 is not moving in a direction towards the next line, the display control unit 107 moves the head of the position displayed on the screen to the head of the previous line (Step S34), displays the previous line on the line display screen 3 (Step S14), and displays the head of the previous line on the enlarged display screen 2 (Step S16).
Here, a case will be described in which, based on the position of the drag button 5 detected when an end operation is not executed (Step S20), it is determined at Step S22 that the drag button 5 is moving in the line direction and at Step S24 that it has moved to the end of the specified line. This corresponds to a case, for example, in which the drag button 5 illustrated in
Next, a case will be described in which, based on the detected position of the drag button 5 (Step S20), it is determined that the drag button 5 has moved in the line direction (Step S22), not to the end of the specified line, but to the head of the line (Steps S24 and S28). This corresponds to a case, for example, in which the drag button 5 illustrated in
As described above, depending on the position of the drag button 5 detected at Step S20, at least one of the sequences including Steps S22 to S34 is executed in each repetition. Consequently, the entire string on the specified line is displayed on the line display screen 3 at Step S14. At the same time, the string at and around the specified position in the specified line is enlarged and displayed on the enlarged display screen 2 at Step S16. Thus, the user can have the specified line enlarged to read it smoothly. So far, an example of the process for enlarging and displaying characters executed by the smart phone 1 according to the first embodiment has been described.
Specifying a position by a touch operation or the like on a screen has lower precision compared to specifying a position by using a mouse. Therefore, if a character area to be enlarged is specified by a touch operation or the like on a screen, not a character part that the user desires to view, but another part in the neighborhood may be enlarged and displayed.
In contrast to this, by the process for enlarging and displaying characters according to the first embodiment, specifying a position at which a string is enlarged and displayed can be done easily. Specifically, in the embodiment, strings are extracted from a character area included in image data by units of lines, by layout analysis of a document. Next, the centerlines of the extracted lines and the points at both ends are calculated, and the area for enlarged display is moved along the centerline. Thus, even if a character area to be enlarged and displayed is specified in an imprecise way due to shake of an operational finger, it is possible to stably enlarge and display the specified position on the specified line, just by moving the finger along the line direction on the line display screen 3.
Also, by the process for enlarging and displaying characters according to the embodiment, after a line has been read through, the head of the next line is enlarged and displayed automatically. The same applies for a case where the user wants to go back to the previous line. Therefore, the user does not need to search for the head of the next line or the previous line on the screen. Also in this regard, specifying a position at which a string is enlarged and displayed can be done easily.
Further, the process for enlarging and displaying characters according to the embodiment can display a string at a location at which enlargement is desired, promptly and without errors. For example, in a case where character recognition is applied by OCR to a document printed on a paper medium in order to enlarge and display character codes on a screen, erroneous recognition of characters may occur at the location to be enlarged, which makes it difficult to display the characters 100% correctly. Also, character recognition by OCR takes time because the process requires two stages: extracting strings in lines in image data, and recognizing the characters in the strings in the extracted lines. In contrast to this, the process for enlarging and displaying characters according to the embodiment does not recognize characters at the locations to be enlarged by units of characters, but recognizes strings by units of lines. Therefore, it is possible to enlarge and display characters at the locations to be enlarged without an error. Also, since the process for enlarging and displaying characters according to the embodiment executes enlarging and displaying by units of lines, the processing time can be shortened compared to the case where enlarging and displaying are executed by units of characters, and the enlarging and displaying can be executed faster. Thus, the response time to have a specified string enlarged and displayed is shorter. Therefore, it is possible even for a farsighted or weak-sighted user to read a document by using the smart phone 1 more smoothly.
(Displaying by Units of Words)
For a language that puts spaces between words, such as English, enlarging and displaying can be controlled by units of words in a specified line. In this case, a string at and around the specified position is enlarged and displayed by units of words. Specifically, when the position of the drag button 5 has moved, with respect to the middle point between the center position of a previous word and the center position of a next word, towards the next word, the display control unit 107 may have the entire next word enlarged and displayed.
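A sketch, under assumed data structures, of the word-unit switching rule described above: the enlarged word changes once the drag position passes the midpoint between the centers of adjacent words; the word centers are assumed to be given in reading order along the first axis.

```python
def word_to_display(drag_x, word_centers):
    """Pick the index of the word to enlarge: the display switches to the next
    word once the midpoint between adjacent word centers is passed (sketch)."""
    if not word_centers:
        return None
    index = 0
    for i in range(len(word_centers) - 1):
        midpoint = (word_centers[i] + word_centers[i + 1]) / 2.0
        if drag_x >= midpoint:
            index = i + 1   # drag has passed the midpoint toward the next word
    return index
```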
In this way, as illustrated in
Next, an example of a process for enlarging and displaying characters executed by a smart phone 1 according to a second embodiment will be described with reference to
Once a process for enlarging and displaying characters according to the second embodiment is started, an image is captured, strings are extracted by units of lines by layout analysis of image data, the string on a specified line is displayed, and the string around the specified position is enlarged and displayed (Steps S10 to S16). Also, while the operation button is not pressed (Step S18), the position detection unit 104 detects coordinates of a position of the drag button 5 (touch position) (Step S20).
Next, based on the detected position of the drag button 5, the determination unit 106 determines whether the dragging operation has ended (Step S40). If having determined that the dragging operation has ended, the determination unit 106 determines whether there is a line next to the specified line (Step S42). If there is a next line, the display control unit 107 moves the head of the position displayed on the screen to the head of the next line (Step S26), displays the next line on the line display screen 3 (Step S14), and displays the head of the next line on the enlarged display screen 2 (Step S16). For example, when the finger is detached from the drag button 5 as illustrated in
If it has been determined that the operational finger has moved to the end or head of the line (Step S24 or S28), the string to be enlarged and displayed is automatically shifted to the head of the next line or the previous line (Steps S26 or S34, S16, and S18) in the same way as in the first embodiment. Also, steps to move the string to be enlarged and displayed depending on movement of the operational finger (Steps S30, S16, and S18) are the same as in the first embodiment. Therefore, description is omitted for these steps. So far, an example of the process for enlarging and displaying characters executed by the smart phone 1 according to the second embodiment has been described.
Note that if having determined that there is a line next to the specified line, the determination unit 106 may separate a position specified after the previous specification of the position has been released into a position on the first axis that represents the line direction and a position on the second axis orthogonal to the first axis, to determine whether the position on the first axis is within a predetermined range from the head of the next line. Alternatively, if having determined that there is a line next to the specified line, the determination unit 106 may determine whether a position specified after the previous specification of the position has been released is within the predetermined range from the head of the next line. If the determination unit 106 has determined that the specified position is within the predetermined range from the head of the next line, the display control unit 107 may enlarge and display the string at and around the head of the next line.
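A possible sketch of the range check described above; the concrete range values stand in for the predetermined range stored in advance and are not taken from the disclosure.

```python
def snaps_to_next_line(new_pos, next_line_head, x_range=60, y_range=30):
    """After the previous specification is released, decide whether a newly
    specified position is close enough to the head of the next line to snap to
    it (sketch; range values illustrative, lateral writing assumed)."""
    dx = abs(new_pos[0] - next_line_head[0])   # first axis (line direction)
    dy = abs(new_pos[1] - next_line_head[1])   # second axis
    return dx <= x_range and dy <= y_range
```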
Specifying a position by a touch operation or the like on a screen has lower precision compared to specifying a position by using a mouse. Therefore, if the user wants to enlarge and display a line that is next to a line that has been enlarged and displayed, it may be difficult to specify the head of the next line. It is especially difficult to specify the head of the next line for strings if the line spacing is dense. In this case, if a finger of the user to specify the position shakes up and down with respect to the position of the next line, a character part at a shifted position that is different from the head of the next line is enlarged and displayed, which makes it difficult for the user to have characters on the desired line enlarged, and hinders the user from reading the document smoothly by using the smart phone 1.
In contrast to this, by the process for enlarging and displaying characters according to the second embodiment, the head position of an adjacent line can be easily specified for enlarging and displaying strings.
Specifically, by the process for enlarging and displaying characters according to the embodiment, strings are extracted from the screen by units of lines, and it is determined which of the lines corresponds to the part specified by the user to be enlarged. Then, in the embodiment, once it is determined that specifying the line has been completed, when determining an object to be enlarged and displayed next, a string to be enlarged and displayed can be determined from the next line even if a position specified by a finger of the user or the like to be enlarged is not at the position of the next line.
In other words, for example, in the embodiment, if the operational finger is detached from the screen, the string to be enlarged and displayed is automatically moved to the head of the next line or the previous line. Therefore, the user does not need to search for the head of the desired line on the screen in order to specify it on the touch screen.
Also, in the embodiment, the same effects can be obtained that have been described by examples of effects in the first embodiment.
Note that an operation to release the specification of the position is not limited to a detaching operation of a finger from the screen. For example, when an operation is performed that moves a finger in a direction reverse to the moving direction, it may be determined that the specification of the position is released, and in the same way as the finger is detached from the screen in the above embodiment, the string to be enlarged and displayed may be automatically moved to the head of the next line.
(Displaying by Units of Words)
As done in the first embodiment, the display control unit 107 may enlarge and display the string at and around the specified position by units of words. In this way, while the finger is being moved along the line direction, a string enlarged by units of words is displayed on the enlarged display screen 2. For example, in
Next, an example of a process for enlarging and displaying characters executed by a smart phone 1 according to a third embodiment will be described with reference to
Once a process for enlarging and displaying characters according to the third embodiment is started, an image is captured (Step S10), and then the extraction unit 103 extracts strings in the image data by units of lines by layout analysis, and extracts the centerline of each line (ax+by+c=0) (Step S50). Next, when the user starts dragging, the position detection unit 104 stores the coordinates of the start position of the drag button 5 in the storage unit 102 (Step S52). In the following, the coordinates of the start position of the drag button 5 are denoted as the start point of dragging (x0, y0).
Next, the display control unit 107 displays the specified line on the line display screen 3 (Step S14), and enlarges and displays the string around the drag button 5 on the enlarged display screen 2 (Step S16). Next, if the back button is not pressed, the determination unit 106 determines that an end operation has not been performed (Step S18). In this case, the position detection unit 104 detects the coordinates of the drag button 5 on the move (Step S54). In the following, the coordinates of the drag button 5 on the move are denoted as the intermediate point of dragging (x, y).
Next, the processing unit 105 calculates the difference of the distance Δ from the start point of dragging (x0, y0) to the intermediate point of dragging (x, y), and the determination unit 106 determines whether the calculated difference of the distance Δ is greater than or equal to a predetermined threshold (Step S56).
A calculation method of this difference of the distance Δ will be described with reference to
A point (x1, y1), which is a projection of the start point of dragging (x0, y0) on the centerline, is represented by the following formula (1).
(x1, y1) = (x0, y0) − (a·x0 + b·y0 + c)·(a, b)/(a² + b²)  (1)
A point (x2, y2), which is a projection of the intermediate point of dragging (x, y) on the centerline, is represented by the following formula (2).
(x2, y2) = (x, y) − (a·x + b·y + c)·(a, b)/(a² + b²)  (2)
The difference of the distance Δ between the point (x1, y1) being the projection of the start point of dragging (x0, y0), and the point (x2, y2) being the projection of the intermediate point of dragging (x, y) on the centerline is defined as follows.
Difference Δ = |x2 − x1|, if |a/b| < 1, or
Difference Δ = |y2 − y1|, if |a/b| ≧ 1
The slope of the centerline (ax+by+c=0) is represented by “−a/b” because y=−a/b·x−c/b. As illustrated in
On the other hand, if the slope of the centerline is less than ±45 degrees with respect to the vertical direction (the second axis) of the screen, |a/b| ≧ 1 is satisfied. In other words, as in a case where the drag button 5 is dragged from a position A to a position C, the component of the second axis of the movement of the drag button 5 is greater than the component of the first axis. In this way, if |a/b| ≧ 1 is satisfied, the processing unit 105 calculates the difference of the distance Δ when the drag button 5 moves in the second axis direction by using the formula of the difference of the distance Δ = |y2 − y1|.
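Formulas (1) and (2) and the piecewise definition of the difference Δ can be sketched directly as follows; the function names are illustrative, and the axis selection follows the |a/b| rule stated above.

```python
def project_onto_centerline(p, a, b, c):
    """Project point p = (x, y) onto the line a*x + b*y + c = 0,
    as in formulas (1) and (2)."""
    x, y = p
    d = (a * x + b * y + c) / (a * a + b * b)
    return (x - a * d, y - b * d)

def drag_difference(start, mid, a, b, c):
    """Difference of the distance delta between the projections of the start
    point and the intermediate point of dragging, measured along the first
    axis if |a/b| < 1, otherwise along the second axis (sketch)."""
    x1, y1 = project_onto_centerline(start, a, b, c)
    x2, y2 = project_onto_centerline(mid, a, b, c)
    if b != 0 and abs(a / b) < 1:   # centerline closer to horizontal
        return abs(x2 - x1)
    return abs(y2 - y1)             # centerline closer to vertical
```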
Referring to
For example, if |a/b| < 1 is satisfied, the amount of movement from the start point of dragging to the intermediate point of dragging is defined by the difference Δ = |x2 − x1| in the first axis direction. The determination unit 106 determines whether the calculated difference of the distance Δ (= |x2 − x1|) is greater than or equal to the predetermined first threshold (Step S56). If the determination unit 106 has determined that the calculated difference of the distance Δ (= |x2 − x1|) is greater than or equal to the predetermined first threshold, the display control unit 107 moves the drag button 5 in the first axis direction by the amount of the difference of the distance Δ (= |x2 − x1|) from the start point of dragging, and enlarges and displays a string at and around the intermediate point of dragging at which the drag button 5 is positioned. For example, in
Thus, even if a dragging operation by the user shakes as illustrated in
Note that as examples of timing when the specification of the position is released, timing when the drag button 5 on the screen is positioned at the end or head of a specified line as illustrated in Step S24 or S28 in
After execution of Step S60 in
On the other hand, if |a/b| ≧ 1 is satisfied, the amount of movement from the start point of dragging to the intermediate point of dragging is defined by the difference Δ = |y2 − y1| in the second axis direction (the direction orthogonal to the line direction). The determination unit 106 determines whether the calculated difference of the distance Δ (= |y2 − y1|) is greater than or equal to the predetermined second threshold (Step S56). If the determination unit 106 has determined that the calculated difference of the distance Δ (= |y2 − y1|) is greater than or equal to the predetermined second threshold, the display control unit 107 moves the drag button 5 in the second axis direction by the amount of the difference of the distance Δ (= |y2 − y1|), determines a line at or around the moved position, and enlarges and displays a string at and around the head of the determined line. For example, in
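A hedged sketch of the decision made around Step S56 and after: a first-axis difference at or above the first threshold scrolls along the specified line, a second-axis difference at or above the second threshold switches lines, and smaller movements (finger shake) are ignored; the threshold values below are illustrative only.

```python
def classify_drag(delta_first, delta_second, first_threshold=20, second_threshold=30):
    """Decide whether a drag scrolls along the specified line, switches to an
    adjacent line, or is ignored as shake (sketch; thresholds illustrative)."""
    if delta_first >= first_threshold:
        return "scroll_along_line"   # enlarge the string around the moved position
    if delta_second >= second_threshold:
        return "switch_line"         # enlarge the string at the head of the nearby line
    return "no_change"               # movement below both thresholds is treated as shake
```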
Note that if the drag button 5 is moved to the point P8 in
While the back button is not pressed (Step S18), depending on the position of the intermediate point of dragging detected at Step S54, Steps S14 and after are executed repeatedly. So far, an example of the process for enlarging and displaying characters executed by the smart phone 1 according to the third embodiment has been described.
Note that the determination unit 106 may determine whether the specified position has been moved by the first threshold or greater in the first axis direction that represents the line direction. If the determination unit 106 has determined that the position in the first axis direction has been moved by the first threshold or greater, the display control unit 107 may enlarge and display a string at and around a position in the specified line in the first axis direction until the specification of the position is released.
When strings are displayed with dense line spacing, the tip of a finger tracing the positions of strings to be enlarged and displayed along a line may shake up and down, and a string in a line above or below may be enlarged and displayed unintentionally.
In contrast to this, by the process for enlarging and displaying characters according to the third embodiment, when strings are enlarged and displayed, erroneous operations of the specification of the position to be enlarged and displayed can be reduced.
Specifically, by the process for enlarging and displaying characters according to the embodiment, string parts are extracted from the screen by units of lines, and based on a comparison between the amount of movement of a dragging operation and the first or second threshold, it is determined whether the line specified by the user to be enlarged and displayed is currently being enlarged and displayed. Consequently, if it has been determined that the line is currently being enlarged and displayed, displaying is controlled so as not to change the line to be enlarged and displayed even if the position specified to be enlarged and displayed is moved to a part in a line above or below the line currently being enlarged and displayed. Thus, erroneous operations in the specification of the position to be enlarged and displayed due to a shaking finger can be reduced. Also, in the embodiment, the same effects can be obtained that have been described by examples of effects in the first and second embodiments.
(Displaying by Units of Words)
As done in the first and second embodiments, the display control unit 107 may enlarge and display the string at and around the specified position by units of words. In this way, while a finger is being moved along the line direction, a string enlarged by units of words is displayed on the enlarged display screen 2. Thus, a word is not displayed in a state in which the word is cut in the middle, and the string can be displayed in a state that is easier to recognize.
So far, a terminal device, a display control method, and a program have been described with embodiments above. Note that the invention is not limited to the above embodiments, but various modifications and improvements can be made within the scope of the invention. Also, the above embodiments can be combined as long as no inconsistency is introduced. For example, in the above embodiments, the screen is partitioned into two areas to display an entire line in one of the areas, and to enlarge and display a string to be processed in the specified line in the other area. However, the screen may not be partitioned, and a string to be processed in the specified line may be enlarged and displayed in the entire area of a single screen. For example, as illustrated in
Also, as illustrated in
Also, in the above embodiments, although the examples have been described with the drag button 5 displayed so that the specified position can be indicated, the drag button 5 does not necessarily have to be displayed.
Also, in the above embodiments, although the examples have been described with strings in lateral writing, the process for enlarging and displaying characters according to the disclosure is applicable to strings in vertical writing, namely, strings represented with the first axis in the vertical direction.
All examples and conditional language recited herein are intended for pedagogical purposes to aid the reader in understanding the invention and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority and inferiority of the invention. Although the embodiments in the disclosure have been described in detail, it should be understood that various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.
This application is a continuation application of International Application PCT/JP2014/058285 filed on Mar. 25, 2014 and designated the U.S., the entire contents of which are incorporated herein by reference.
Related application data: Parent — PCT/JP2014/058285, Mar. 2014, US; Child — 15198768, US.