1. Technical Field
The present disclosure relates to an electronic device and a text reading guide method for the electronic device.
2. Description of Related Art
Many electronic devices, e.g., mobile phones, computers, and electronic readers (e-readers), are capable of storing and displaying electronic documents (e.g., digital images and digital texts). Users may manually flip through the displayed pages of an electronic document on these electronic devices. However, many electronic documents include a large number of pages, and the pages are usually displayed on the electronic device one at a time. Thus, the user must press the page flipping keys many times to flip through the pages, which is inconvenient, especially when a large number of pages need to be displayed. Some electronic devices can automatically flip through the pages at a flipping frequency preset by the user, but then a page may be flipped before the user has finished reading it.
Therefore, what is needed is an electronic device and a text reading guide method thereof to alleviate the limitations described above.
The components in the drawings are not necessarily drawn to scale, the emphasis instead being placed upon clearly illustrating the principles of an electronic device and a text reading guide method for the electronic device. Moreover, in the drawings, like reference numerals designate corresponding sections throughout the several views.
The electronic device 100 includes a storage unit 10, an input unit 20, a display screen 30, a camera 40, and a processor 50.
Together referring to
The input unit 20 receives user commands and selections. The user selections may include, for example, activating, executing, and ending the adjustment of text for reading function of the electronic device 100, and setting the adjustment of text for reading function.
The camera 40 captures images of one or more fingers of a user in real-time and transmits the images of the one or more fingers to the processor 50. In the embodiment, the camera 40 is secured on the middle top of the display screen 30 for the purpose of capturing images of the one or more fingers of the user, and the camera 40 is activated as long as the adjustment of text for reading function of the electronic device 100 is activated. In alternative embodiments, the camera 40 is secured on the middle left or other portions of the display screen 30.
The processor 50 includes an image processing module 501, a determining module 502, an effect control module 503, and a display control module 504.
The image processing module 501 analyzes and processes the images of the one or more fingers by running a variety of image processing algorithms, thus extracting the image feature values of the one or more fingers from the captured images of the one or more fingers of the user.
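The disclosure does not specify which image processing algorithms the image processing module 501 runs. As a purely illustrative sketch, one simple way to reduce a captured finger image to a comparable feature value is a normalized intensity histogram; the function name and the image representation (a 2-D list of grayscale pixels, 0-255) are assumptions:

```python
def extract_feature_value(image, bins=8):
    """Reduce a grayscale image (2-D list of 0-255 pixels) to a
    normalized intensity histogram used as a coarse feature value."""
    histogram = [0] * bins
    pixel_count = 0
    for row in image:
        for pixel in row:
            # Map the 0-255 intensity into one of `bins` buckets.
            histogram[min(pixel * bins // 256, bins - 1)] += 1
            pixel_count += 1
    # Normalize so images of different sizes remain comparable.
    return [count / pixel_count for count in histogram]
```

A real module 501 would use far more discriminative features; the point here is only that each captured image is condensed into a fixed-length value that can be stored in and matched against the posture feature database 102.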
The determining module 502 searches the posture feature database 102 to find the image feature value of the one or more fingers of the user which matches the extracted image feature value of the one or more fingers of the user. The determining module 502 is further configured to retrieve the screen coordinates associated with the image feature value of the one or more fingers of the user recorded in the posture feature database 102, and to transmit the retrieved coordinates to the effect control module 503.
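The structure of the posture feature database 102 is not specified in the disclosure. A minimal sketch of the lookup performed by the determining module 502, assuming the database is modeled as per-user lists of (feature value, screen coordinates) records, might look like this:

```python
# Hypothetical stand-in for the posture feature database 102:
# user name -> list of (stored feature value, screen coordinates) records.
posture_feature_database = {
    "alice": [
        ([0.25, 0.25, 0.25, 0.25], (120, 80)),
        ([0.50, 0.25, 0.25, 0.00], (300, 200)),
    ],
}

def find_coordinates(user_name, feature_value):
    """Return the screen coordinates stored for an exactly matching
    feature value, or None when no match is recorded."""
    for stored_value, coordinates in posture_feature_database.get(user_name, []):
        if stored_value == feature_value:
            return coordinates
    return None
```

The returned coordinates correspond to what module 502 transmits onward to the effect control module 503.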
The effect control module 503 determines the display content, such as single words, phrases, or complete sentences, on the display region corresponding to the retrieved coordinates on the display screen 30, according to the type of effect predefined by the user or the system of the electronic device 100. For example, the marking of words may be done by zooming, coloring, or underlining the display content on the display region.
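As an illustrative sketch of the effect control step, the retrieved coordinates can be used to look up the content laid out at that display region and mark it with the predefined effect. The layout table, effect names, and the textual markers standing in for visual effects are all assumptions made for illustration:

```python
# Hypothetical layout: display-region coordinates -> content shown there.
page_layout = {
    (120, 80): "single word",
    (300, 200): "a complete sentence",
}

def apply_effect(coordinates, effect="underline"):
    """Return the content at `coordinates` marked with the chosen effect,
    or None when no content is laid out at those coordinates."""
    content = page_layout.get(coordinates)
    if content is None:
        return None
    if effect == "zoom":
        return content.upper()            # stand-in for enlarging the text
    if effect == "color":
        return f"[color]{content}[/color]"  # stand-in for recoloring
    return f"_{content}_"                 # underline as the default marking
```

In the device itself the marking would of course be rendered visually by the display control module 504 rather than encoded in the string.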
The display control module 504 displays the determined contents in a manner of highlighting on the display screen 30.
In use, when a user activates the adjustment of text for reading function of the electronic device 100 via the input unit 20, the display control module 504 controls the display screen 30 to display an information input box for the user to input a user name. The determining module 502 determines whether the posture feature database 102 records the user name and other data for that user name. If the posture feature database 102 records the user name and corresponding data, the image processing module 501, the determining module 502, the effect control module 503, and the display control module 504 cooperate to execute the adjustment of text for reading function.
When the determining module 502 determines that the user name and the corresponding data do not exist in the posture feature database 102, it is the first time the user has used the adjustment of text for reading function of the electronic device 100. The display control module 504 controls the display screen 30 to display a dialog box inviting the user to do a test for recording feature values of his/her finger images. If the user chooses to do the test, the display control module 504 further controls the display screen 30 to display the test page of the electronic text file 101. In the embodiment, the content of the test page includes a number of different portions. The display screen 30 defines a coordinate system. Each portion of the test page is displayed on a particular display region with coordinates associated therewith. The display control module 504 also controls the display screen 30 to display a dialog box prompting the user to follow the highlighted contents.
In the embodiment, each portion of the test page corresponds to (is located at) particular coordinates of the display screen 30. The camera 40 captures finger images of the user when the finger of the user points toward any portion, and transmits the finger image to the image processing module 501. The image processing module 501 is further configured to extract the finger feature values from the image of the finger of the user, and to store the extracted finger feature values, corresponding to the user name and the coordinates of the portion of the display screen 30 to which the finger was pointing, in the posture feature database 102. When all portions of the text have been read, the test is completed, and the user can then activate the adjustment of text for reading function of the electronic device 100.
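The calibration test described above can be sketched as follows, again assuming a dict-based stand-in for the posture feature database 102: for each portion of the test page, the feature value extracted from the finger image is stored against the user name together with the coordinates of the portion being pointed at.

```python
def record_test(database, user_name, samples):
    """Record calibration samples for one user.

    samples: iterable of (feature_value, portion_coordinates) pairs,
    one per portion of the test page the user pointed at."""
    records = database.setdefault(user_name, [])
    for feature_value, coordinates in samples:
        records.append((feature_value, coordinates))
    return database
```

After the test completes, these per-user records are exactly what the determining module 502 searches when the reading function is later activated.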
Referring to
In step S30, when a user activates the adjustment of text for reading function of the electronic device 100, the determining module 502 determines whether it is the first time for the user to activate the adjustment of text for reading function. If no, the process goes to step S31; if yes, the process goes to step S36. In this embodiment, if the user name input by the user exists in the posture feature database 102, the determining module 502 determines it is not the first time for the user to activate the adjustment of text for reading function. Otherwise, the determining module 502 determines it is the first time for the user to activate the adjustment of text for reading function. In the embodiment, the camera 40 is activated when the user activates the adjustment of text for reading function.
In step S31, the camera 40 captures one or more images of one or more fingers of the user.
In step S32, the image processing module 501 analyzes and processes the captured images by running a variety of image processing algorithms, to extract a feature value for each image of the one or more fingers of the user.
In step S33, the determining module 502 searches the posture feature database 102 to find the feature value of the image of the one or more fingers which matches an extracted finger image feature value of the user, and retrieves the screen coordinates associated with the finger image feature value recorded in the posture feature database 102. In an alternative embodiment, when the exact finger image feature value of the user is not found in the database, the determining module 502 searches the posture feature database 102 to find a finger image feature value with the highest percentage similarity to an extracted finger image feature value of the user, and then retrieves the screen coordinates associated with that finger image feature value recorded in the posture feature database 102.
In step S34, the effect control module 503 determines the display content such as single words, phrases, or complete sentences on the display region corresponding to the retrieved coordinates on the display screen 30, according to a predefined type of text reading guide effect.
In step S35, the display control module 504 displays the contents determined as being the target of the pointing finger in a manner of highlighting on the display screen 30 in place of the originally-displayed content. Referring to
In step S36, if it is the first time for the user to activate the adjustment of text for reading function, the determining module 502 invites the user to do a test for recording feature values of his/her finger images. If the user accepts, the process goes to step S37; otherwise, the process ends.
In step S37, the display control module 504 controls the display screen 30 to display the test page of the electronic text file 101, and controls the display screen 30 to display a dialog box prompting the user to follow the content displayed in the highlighted fashion to read. In the embodiment, the content of the test page includes a number of different portions, and each different portion of the test page is displayed on a display region with coordinates associated therewith.
In step S38, the camera 40 captures images of the finger of the user when the user points his finger at each of the portions to be read.
In step S39, the image processing module 501 extracts the finger feature values from the captured images of the finger of the user, and stores the extracted finger feature values and the coordinates corresponding to the extracted finger feature values in the posture feature database 102.
With such a configuration, when the adjustment of text for reading function of the electronic device 100 is activated, the display content corresponding to the coordinates of the display screen 30 being pointed at by the user is given special treatment and then displayed to the user. Thus, a vivid content displaying effect is presented to the user of the electronic device 100 when the user is reading the display screen 30, which makes viewing and reading more expedient and convenient.
Although the present disclosure has been specifically described on the basis of the embodiments thereof, the disclosure is not to be construed as being limited thereto. Various changes or modifications may be made to the embodiments without departing from the scope and spirit of the disclosure.
Number | Date | Country | Kind |
---|---|---|---|
201110363220.7 | Nov 2011 | CN | national |