This application claims the benefit under 35 U.S.C. §119(a) of a Korean patent application filed on Oct. 17, 2012 in the Korean Intellectual Property Office and assigned Serial No. 10-2012-0115204, the entire disclosure of which is hereby incorporated by reference.
1. Field of the Invention
The present invention relates to a method of controlling a mobile terminal based on a user input. More particularly, the present invention relates to a mobile terminal and a control method based on a user input for the same wherein a text string can be erased from content according to the user input and results of a comparison between an input text string and the erased text string can be displayed.
2. Description of the Related Art
With recent technological advances, smart electronic devices have been increasingly utilized in voice communication, entertainment, culture, creative writing, and social networking. Smart electronic devices enable a mobile terminal to detect a user input using a finger or a stylus pen, so that the user may easily control the mobile terminal in any situation. With advanced detecting technology, a mobile terminal having a sensor may accurately recognize an elaborate user input entered using a stylus pen.
Smart electronic devices may be used as learning aids in education. However, current learning aids tend to be limited to drawing simple pictures or answering questions through menu selection, and fail to fully utilize a user interface with various input means.
Therefore, a need exists for a mobile terminal and a control method based on a user input for the same wherein a text string contained in content may be erased through an input using a finger or a pen.
The above information is presented as background information only to assist with an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present invention.
Aspects of the present invention are to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present invention is to provide a mobile terminal and a control method based on a user input for the same wherein a text string contained in a content may be erased through an input using a finger or pen.
Another aspect of the present invention is to provide a mobile terminal and a control method based on a user input for the same wherein an erased text string is stored and, when the user input is entered at the erasure region, a result of a comparison between the stored text string and the user input is output.
In accordance with an aspect of the present invention, a control method for a mobile terminal is provided. The control method includes detecting a first input, identifying color distribution of an input region corresponding to the first input, and erasing at least one object in the input region by applying the most commonly used color to the input region based on the identified color distribution.
The object may include at least one text string and the first input may include at least one of a touch input using a finger or a pen and a proximity input.
The identifying of the color distribution may include dividing, when the input region exceeds a preset size, the input region into sub-regions and separately determining color distribution of the sub-regions, and the erasing of the at least one object may include erasing at least one object in the input region by applying the most commonly used colors to the sub-regions.
The erasing of the at least one object may include underlining or shading the input region where at least one object has been erased.
The erasing of the at least one object may include extracting at least one text string from the input region, and storing the extracted text string together with information regarding the input region.
The storing of the extracted text string may include storing at least one of information on content from which the text string has been extracted and information regarding the input region.
The control method may further include detecting a second input at the input region, extracting at least one input text string from the second input, comparing the input text string with the stored text string, and displaying comparison results of the text strings.
The displaying of the comparison results may include displaying at least one of identical portions of the compared text strings, non-identical portions of the compared text strings, a score computed based on the text comparison result, a list of input text strings, and a list of stored text strings.
The displaying of the comparison results may include displaying comparison results using at least one of an underline, a strikethrough, and shading.
In accordance with another aspect of the present invention, a mobile terminal is provided. The mobile terminal includes a display unit for displaying at least one object, an input unit for detecting a first input, and a control unit for identifying color distribution of an input region corresponding to the first input, and for controlling the display unit to erase the object by applying the most commonly used color to the input region based on the identified color distribution.
The object may include at least one text string and the first input may include at least one of a touch input using a finger or a pen and a proximity input.
When the input region exceeds a preset size, the control unit may divide the input region into sub-regions, separately determine color distribution of the sub-regions, and control the display unit to erase the object by applying the most commonly used colors to the sub-regions.
The control unit may control the display unit to underline or shade the input region where at least one object has been erased.
The mobile terminal may further include a storage unit for storing data, and the control unit may extract at least one text string from the input region and control the storage unit to store the extracted text string together with information regarding the input region.
The storage unit may store at least one of information on content from which the text string has been extracted and information regarding the input region.
The input unit may detect a second input at the input region, and the control unit may extract at least one input text string from the second input, compare the input text string with the stored text string, and control the display unit to display comparison results of the text strings.
The control unit may control the display unit to display at least one of identical portions of the compared text strings, non-identical portions of the compared text strings, a score computed based on the text comparison result, a list of input text strings, and a list of stored text strings.
The control unit may control the display unit to display comparison results using at least one of an underline, a strikethrough, and shading.
Other aspects, advantages, and salient features of the invention will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses exemplary embodiments of the invention.
The above and other aspects, features, and advantages of certain exemplary embodiments of the present invention will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
Throughout the drawings, it should be noted that like reference numbers are used to depict the same or similar elements, features, and structures.
The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of exemplary embodiments of the invention as defined by the claims and their equivalents. It includes various specific details to assist in that understanding but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
The terms and words used in the following description and claims are not limited to the bibliographical meanings, but, are merely used by the inventor to enable a clear and consistent understanding of the invention. Accordingly, it should be apparent to those skilled in the art that the following description of exemplary embodiments of the present invention is provided for illustration purpose only and not for the purpose of limiting the invention as defined by the appended claims and their equivalents.
It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces.
By the term “substantially” it is meant that the recited characteristic, parameter, or value need not be achieved exactly, but that deviations or variations, including for example, tolerances, measurement error, measurement accuracy limitations and other factors known to those of skill in the art, may occur in amounts that do not preclude the effect the characteristic was intended to provide.
Exemplary embodiments of the present invention are applicable to control of a terminal based on a user input.
Exemplary embodiments of the present invention may be applied to any information appliance capable of detecting a user input, such as a smartphone, a portable terminal, a mobile terminal, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), a laptop computer, a note pad, a Wireless Broadband (WiBro) terminal, a tablet computer, and the like. Exemplary embodiments of the present invention may also be applied to a consumer electronic device capable of detecting a user input, such as a smart TV or smart refrigerator.
Referring to
The input unit 110 generates an input signal corresponding to user manipulation. The input unit 110 may include a touch sensor 111, a proximity sensor 112, and an electromagnetic sensor 113.
The touch sensor 111 may detect a user touch input and may include a touch film, a touch sheet, a touch pad, and the like. The touch sensor 111 may detect a touch input and send a touch signal corresponding to the touch input to the control unit 120. The control unit 120 may analyze the touch signal and perform a corresponding operation. Information indicated by the touch signal may be displayed on the display unit 140. The touch sensor 111 may detect a user touch input through various input means, such as a finger, a stylus pen, a button of the stylus pen, and the like. The touch sensor 111 may be configured to detect both a direct touch input and a contactless input within a given distance.
The proximity sensor 112 may detect the presence, access, movement, direction, speed, or shape of an object on the detection surface of the input unit 110 using an electromagnetic field, without any physical contact. The proximity sensor 112 may be a through-beam, retro-reflective, or diffuse reflective photoelectric sensor, or a high frequency oscillation, capacitive, magnetic, or infrared proximity sensor.
The electromagnetic sensor 113 may detect a touch input or a proximity input according to changes in electromagnetic field intensity, and may be configured as an ElectroMagnetic Resonant (EMR) or as an ElectroMagnetic Interference (EMI) input pad. The electromagnetic sensor 113 may include a coil producing a magnetic field, and may detect the presence of an object containing a resonant circuit causing a change in the magnetic field. The electromagnetic sensor 113 may detect an input by a stylus pen or the like acting as an object containing a resonant circuit. The electromagnetic sensor 113 may detect both a direct contact with the mobile terminal 100 and proximity or hovering without a direct contact.
Referring to
The combination of the input unit 110 and the display unit 140 shown in
More particularly, the input unit 110 may detect a user input indicating a text string. The input unit 110 may also detect a user input indicating a region where a text string has been erased.
Referring back to
The control unit 120 may include a color extractor 121, a text extractor 122, and a text comparator 123.
The color extractor 121 may extract color information of an object displayed on the display unit 140. The color extractor 121 may extract color values for a given region or pixel. The color extractor 121 may extract color information as RGB color values or palette color values.
The text extractor 122 may extract text from content. The text extractor 122 may extract a text string from content or a user input. Here, a text string may include a character, a numeral, a special character, a symbol, a space, and the like. The text extractor 122 may recognize a text string using pattern matching and structural analysis and convert the recognized text string into digital codes. The text extractor 122 may include an Optical Character Reader (OCR) and an Optical Mark Reader (OMR).
The text comparator 123 may compare two text strings to determine whether they are identical. The text comparator 123 may use, for example, a “strcmp” function to compare the two text strings.
When a user input is entered through the input unit 110, the control unit 120 may extract color information of a region indicated by the user input. The control unit 120 may identify color distribution of the region and the most commonly used color in the input region. The control unit 120 may erase an object in the input region by applying the most commonly used color to the input region.
The control unit 120 may extract a text string present in a region through the text extractor 122, and control the storage unit 130 to store the extracted text string.
When a user input is entered, the control unit 120 may control the text extractor 122 to extract a text string from the user input, control the text comparator 123 to compare the extracted text string with the stored text string, and control the display unit 140 to display the text comparison result.
Operation of the control unit 120 is described below with reference to the drawings.
The storage unit 130 may store programs and commands for the mobile terminal 100. The control unit 120 may execute a program or a command stored in the storage unit 130.
The storage unit 130 may include one or more of various types of storage media, such as a flash memory, a hard disk, a multimedia or other memory card (i.e., a micro Secure Digital (SD) or eXtreme Digital (XD)), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a magnetic memory, a magnetic disk, an optical disc, and the like.
In one exemplary embodiment, the storage unit 130 may store content containing at least one text string. The storage unit 130 may temporarily or semi-permanently store text strings extracted from content or a user input. Here, an extracted text string may be stored together with information describing the source content and extraction location. An extracted text string may be stored in connection with the source content.
The storage unit 130 may store information on an operation applied to the user input entered through the input unit 110. For example, the storage unit 130 may store information on an erase or a compare operation applied to a given user input.
The display unit 140 displays information processed or to be processed by the mobile terminal 100. For example, the display unit 140 may display a User Interface (UI) or Graphical User Interface (GUI) related to voice detection, context awareness, function control, and the like.
The display unit 140 may be realized using one or more of display techniques based on a Liquid Crystal Display (LCD), a Thin Film Transistor-Liquid Crystal Display (TFT-LCD), an Organic Light Emitting Diode (OLED), a flexible display, a 3 Dimensional (3D) display, and the like.
When the display unit 140 is layered with the touch sensor of the input unit 110, it may act as a touchscreen. In this case, the display unit 140 may act as an input means as well as a display means.
More particularly, the display unit 140 may display specific content containing at least one text string under control of the control unit 120. The display unit 140 may display a portion of the content with erasure of a text string, or display a result of a comparison between text strings.
As the components of the mobile terminal 100 shown in
Referring to
The mobile terminal 100 receives a user input for the content in step 220. The mobile terminal 100 may receive a user input indicating a text string contained in the content. Here, the user input may be a touch input with a finger or a pen or a proximity input. The user input may correspond to a command for deleting an object from the content.
Referring to
Referring to
The mobile terminal 100 may extract color information for sub-regions or pixels of the input region. The mobile terminal 100 may extract color information as RGB color values or palette color values. The mobile terminal 100 may count frequencies of color values used in the region to identify color distribution in the region. Hence, the mobile terminal 100 may identify the most commonly used color in the region. Here, the most commonly used color may correspond to the color of a background image of the indicated object.
Referring to
The mobile terminal 100 may extract color values of pixels in an input region. The mobile terminal 100 may extract the text color 10, the spread color 20, and the background color 30 of each pixel. The mobile terminal 100 may identify distribution of the extracted colors. For example, the mobile terminal 100 may count the number of pixels having the text color 10, the number of pixels having the spread color 20, and the number of pixels having the background color 30 to identify color distribution. The mobile terminal 100 may identify the most commonly used color in the input region. In
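The frequency count described above can be sketched as follows. This is a minimal illustration, not part of the original disclosure; the pixel values and region contents are hypothetical, and real color extraction would read actual display pixels.

```python
from collections import Counter

def most_common_color(pixels):
    """Return the most frequently occurring color value in a region.

    `pixels` is an iterable of (r, g, b) tuples sampled from the
    input region; because most pixels in an erasure region carry the
    background color, that color wins the frequency count.
    """
    return Counter(pixels).most_common(1)[0][0]

# Hypothetical 3x3 region: one text-color pixel, two spread-color
# pixels, six background pixels -> the background color is selected
# and would be applied to the region to erase the object.
TEXT, SPREAD, BACKGROUND = (0, 0, 0), (200, 200, 210), (255, 255, 255)
region = [TEXT, SPREAD, SPREAD] + [BACKGROUND] * 6
print(most_common_color(region))  # -> (255, 255, 255)
```

Applying the returned color over the whole input region repaints the indicated text string with its own background, which is the erasure effect described above.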
When the input region exceeds a preset size, the mobile terminal 100 may identify color distribution by dividing the input region into sub-regions. In this case, the mobile terminal 100 may repeatedly identify the color distribution of each sub-region in real-time while the user input is being processed. Alternatively, the mobile terminal 100 may determine the input region after the user input is processed, divide the input region into sub-regions when the input region exceeds the preset size, and identify the color distribution of the sub-regions.
Referring to
Referring to
Referring to
When the most commonly used color is found for each sub-region, the mobile terminal 100 may apply the most commonly used color to each corresponding sub-region, producing a gradation or fading effect in the input region with object erasure.
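The per-sub-region variant can be sketched as follows. This sketch assumes a fixed tile width; the disclosure leaves the preset size and the shape of the sub-regions open, so both are hypothetical choices here.

```python
from collections import Counter

def erase_by_subregions(grid, tile_w):
    """Erase objects in a wide input region by splitting it into
    fixed-width vertical sub-regions (tiles) and filling each tile
    with that tile's own most common color.

    `grid` is a mutable list of pixel rows. Filling each tile with
    its own dominant color reproduces the gradation effect when the
    background shade varies across the region.
    """
    width = len(grid[0])
    for x0 in range(0, width, tile_w):
        cols = range(x0, min(x0 + tile_w, width))
        tile = [row[x] for row in grid for x in cols]
        fill = Counter(tile).most_common(1)[0][0]
        for row in grid:
            for x in cols:
                row[x] = fill  # apply the tile's dominant color
    return grid

# Hypothetical 2x4 region: left tile background "A", right tile
# background "B"; "T" and "S" stand for text/spread pixels.
grid = [["A", "A", "B", "B"],
        ["A", "T", "B", "S"]]
erase_by_subregions(grid, tile_w=2)
# grid is now [["A", "A", "B", "B"], ["A", "A", "B", "B"]]
```

Each tile keeps its own background shade, so the erased region blends with a non-uniform background instead of being painted one flat color.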
Referring to
Referring to
The mobile terminal 100 stores the extracted text string in step 260. The mobile terminal 100 may store the text string together with information describing the input region where the text string is erased. The mobile terminal 100 may store the text string together with information describing the source content (for example, content name, player file location, content author, and the like) and information on the input region (for example, page, paragraph, line, coordinates, and the like).
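The stored record can be sketched as a simple structure. The field names below are illustrative only; the disclosure lists the kinds of information stored (content name, location, page, paragraph, line, coordinates) without fixing a schema.

```python
from dataclasses import dataclass

@dataclass
class ErasedString:
    """Record for a text string erased from content (hypothetical
    schema; field names are not taken from the disclosure)."""
    text: str           # the erased text string
    content_name: str   # source content the string was extracted from
    page: int           # page within the content
    line: int           # line within the page
    coordinates: tuple  # (x, y, width, height) of the erasure region

erased = ErasedString(text="smart", content_name="lesson_notes",
                      page=3, line=12, coordinates=(40, 120, 90, 18))
```

Keeping the region coordinates alongside the string is what later allows a second input at the same region to be matched against the stored answer.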
The mobile terminal 100 determines whether a termination request is issued. When a termination request is issued, the mobile terminal 100 ends operation on the content. When a termination request is not issued, the mobile terminal 100 returns to step 210 and continues the procedure.
Referring to
The mobile terminal 100 detects the user input at an erasure region in step 320. The mobile terminal 100 may detect the user input at a previous input region where an object has been erased. The previous input region where an object has been erased may be a region where at least one text string has been extracted for storage. Here, the user input may be a touch input using a finger or a pen, or a proximity input. For example, referring to
When the user input is detected, the mobile terminal 100 extracts a text string from the user input in step 330. The mobile terminal 100 may extract recognizable text from the detected user input. The mobile terminal 100 may recognize the text string through pattern matching and structure analysis, and extract the recognized text string. For example, referring to
In step 340, the mobile terminal 100 compares the extracted text string with the stored text string. The mobile terminal 100 may find a text string that has been extracted from the previous input region and stored. The text string to be found may have been stored together with information on a region where the text string has been extracted (or where a corresponding object has been erased). When such a text string is found, the mobile terminal 100 compares the text string extracted from the user input with the stored text string. The mobile terminal 100 may use the “strcmp” function for text comparison. Text comparison may be performed on a word basis in consideration of a space character. The mobile terminal 100 may determine whether the extracted text string is identical to the stored text string based on the text comparison result. The mobile terminal 100 may temporarily or semi-permanently store the text comparison result. When text comparison is repeatedly performed, the mobile terminal 100 may display a list of text comparison results as a review list of incorrect answers.
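The word-by-word comparison can be sketched as follows. Pairing words positionally is only one possible reading of the description; the disclosure says comparison may consider the space character but does not fix an alignment strategy.

```python
from itertools import zip_longest

def compare_word_by_word(entered, stored):
    """Compare two text strings word by word, splitting on spaces.

    Returns a list of (entered_word, stored_word, match) triples.
    Words are paired by position; missing words on either side are
    paired with an empty string and count as non-identical.
    """
    pairs = zip_longest(entered.split(), stored.split(), fillvalue="")
    return [(a, b, a == b) for a, b in pairs]

# Hypothetical answer attempt: the last word is misspelled.
result = compare_word_by_word("smart electronic devise",
                              "smart electronic devices")
# -> [('smart', 'smart', True), ('electronic', 'electronic', True),
#     ('devise', 'devices', False)]
```

The per-word triples identify both the identical and the non-identical portions, which is exactly what the display step below needs to highlight.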
The mobile terminal 100 outputs the comparison result in step 350. The mobile terminal 100 may display identical and non-identical portions of the compared text strings, a score computed based on the text comparison result, and a list of input and stored text strings. The mobile terminal 100 may use at least one of an underline, a strikethrough, and shading to highlight the comparison result. The mobile terminal 100 may display non-identical portions of the compared text strings together with one of the extracted text string and the stored text string. The mobile terminal 100 may display the extracted text string and the stored text string in a preset layout.
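One way to compute the score mentioned above is the fraction of matching words, expressed as a percentage. The scoring rule below is hypothetical; the disclosure only states that a score is computed from the text comparison result.

```python
def score(comparison):
    """Compute a percentage score from word-comparison triples of
    the form (entered_word, stored_word, match)."""
    matches = sum(1 for _, _, ok in comparison if ok)
    return round(100 * matches / len(comparison))

# Hypothetical result: two of three words matched.
comparison = [("smart", "smart", True),
              ("electronic", "electronic", True),
              ("devise", "devices", False)]
print(score(comparison))  # -> 67
```

The non-matching triples would then be rendered with a strikethrough or shading, while the matching portions may be underlined.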
Referring to
Referring to
Referring to
Referring to
In an exemplary embodiment of the present invention, a control method based on a user input enables a mobile terminal to erase a text string in a simple and effective way according to the user input based on content color distribution.
The mobile terminal may identify and store an erased text string, and output, when the user input is entered at the erasure region, a result of a comparison between the stored text string and the user input. Hence, exemplary embodiments of the present invention may be used for fill-in-the-blanks exercises, and the like, in learning content.
While the invention has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims and their equivalents.
Number | Date | Country | Kind
--- | --- | --- | ---
10-2012-0115204 | Oct 2012 | KR | national