The Wubihua, or five-stroke, input method is currently used for inputting Chinese text on a computer based on the stroke sequence of a character. Physical buttons (e.g., on a keyboard) or soft input buttons displayed on a touchscreen may be assigned a specific stroke. Currently, a tap-to-input method is utilized to select a stroke sequence of a Chinese character. Current input methods do not leverage the advantages of a touchscreen or gesture input. A swipe-stroke input may provide users with a more comfortable and efficient experience for inputting Chinese text.
A current method for Chinese handwriting input includes drawing a Chinese character via an input device, wherein a handwriting engine is operable to receive and recognize the handwriting input as a character. A limitation to this approach is that after a user enters a handwriting input, a delay is experienced while the handwriting engine determines whether the handwriting input has been completed or whether the user may be providing additional input. While current Chinese handwriting engines provide a high recognition rate, the delay may be frustrating to users who desire a continuous handwriting experience.
It is with respect to these and other considerations that the present invention has been made.
Embodiments of the present invention solve the above and other problems by providing swipe-stroke input and continuous handwriting. According to embodiments, a user interface may be provided for allowing a user to input a stroke sequence or a portion of a stroke sequence of a Chinese character via a swipe gesture. When a stroke sequence input is ended (e.g., when the user lifts his finger from the user interface), one or more candidates may be provided. The user may select a candidate or may continue to input a next stroke sequence. As additional input is received, phrase candidates may be predicted and provided. Swipe-stroke input may provide an improved and more efficient input experience.
According to embodiments, an “end-of-input” (EOI) panel may be provided, which, when selected, provides an indication of an end of a current handwriting input. By selecting the EOI panel, a next handwriting input may be received, providing a continuous and more efficient handwriting experience. Embodiments may also store a past handwriting input. A past handwriting input may be provided in a recognized character panel, which, when selected, allows a user to edit the past handwriting input.
The details of one or more embodiments are set forth in the accompanying drawings and description below. Other features and advantages will be apparent from a reading of the following detailed description and a review of the associated drawings. It is to be understood that the following detailed description is explanatory only and is not restrictive of the invention as claimed.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended as an aid in determining the scope of the claimed subject matter.
The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate various embodiments of the present invention. In the drawings:
As briefly described above, embodiments of the present invention are directed to providing swipe-stroke input and continuous handwriting. According to embodiments, stroke buttons may be provided, wherein a user may input a stroke sequence or a portion of a stroke sequence of a Chinese character by selecting one or more stroke buttons via a swipe gesture. One or more candidates may be determined and provided when a stroke sequence input is ended (e.g., when the user lifts his finger from the user interface). The user may select a candidate or may continue to input a next stroke sequence. Multiple characters or phrases may share the same stroke sequence. As additional input is received, phrase candidates may be predicted and dynamically provided.
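By way of example and not limitation, the following Python sketch shows one way a swipe gesture passing over stroke buttons could be translated into a stroke sequence; the button layout, the five stroke labels, and all function names are illustrative assumptions rather than a required implementation.

# Illustrative sketch only: translating a swipe path over stroke buttons
# into a stroke sequence. The button layout and names are hypothetical.

STROKE_BUTTONS = {
    "heng": (0, 0, 60, 60),      # horizontal stroke; bounds are (x1, y1, x2, y2)
    "shu":  (60, 0, 120, 60),    # vertical stroke
    "pie":  (120, 0, 180, 60),   # left-falling stroke
    "dian": (180, 0, 240, 60),   # dot / right-falling stroke
    "zhe":  (240, 0, 300, 60),   # turning stroke
}

def button_at(x, y):
    """Return the stroke button under a touch point, or None."""
    for stroke, (x1, y1, x2, y2) in STROKE_BUTTONS.items():
        if x1 <= x < x2 and y1 <= y < y2:
            return stroke
    return None

def swipe_to_strokes(path):
    """Convert a swipe path (a list of (x, y) points ending when the finger
    lifts) into the sequence of stroke buttons it passed over, collapsing
    consecutive points that fall on the same button."""
    strokes = []
    for x, y in path:
        stroke = button_at(x, y)
        if stroke and (not strokes or strokes[-1] != stroke):
            strokes.append(stroke)
    return strokes

# A swipe crossing the first three buttons yields the partial stroke
# sequence ["heng", "shu", "pie"].
print(swipe_to_strokes([(10, 30), (70, 30), (130, 30)]))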
Embodiments may also provide continuous handwriting for a faster stroke input method. According to embodiments, an “end-of-input” (EOI) panel may be provided. When the EOI panel is selected, an indication of an end of a current handwriting input may be received, and a next handwriting input may be entered. As described above, with current systems, the indication of an end of a current handwriting input is a timeout between handwriting inputs. By providing a selectable functionality to indicate an end of a current handwriting input, a continuous and more efficient handwriting experience may be provided. Embodiments may also store a past handwriting input, allowing a user to edit the past handwriting input.
The following detailed description refers to the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the following description to refer to the same or similar elements. While embodiments of the invention may be described, modifications, adaptations, and other implementations are possible. For example, substitutions, additions, or modifications may be made to the elements illustrated in the drawings, and the methods described herein may be modified by substituting, reordering, or adding stages to the disclosed methods. Accordingly, the following detailed description does not limit the invention; instead, the proper scope of the invention is defined by the appended claims.
Referring now to the drawings, in which like numerals represent like elements, various embodiments will be described. Referring now to
Referring now to
The swipe-stroke input UI may comprise a candidate line 210, as illustrated in
According to embodiments, a stroke sequence of a character may be a complete stroke sequence of a character or may be a portion of a stroke sequence of a character. Candidates 240 may be provided according to a received stroke sequence. As additional stroke sequences are received, candidates 240 may be dynamically updated.
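As one non-limiting illustration of how candidates 240 could be determined from a complete or partial stroke sequence, the Python sketch below filters a small stroke-sequence lexicon by prefix; the lexicon contents and function names are hypothetical examples, not an authoritative mapping.

# Illustrative sketch only: looking up character candidates for a complete
# or partial stroke sequence. The lexicon is a small hypothetical example.
# Stroke codes: 1 = horizontal, 2 = vertical, 3 = left-falling,
# 4 = dot/right-falling, 5 = turning (a common five-stroke convention).

STROKE_LEXICON = {
    "木": "1234",
    "本": "12341",
    "林": "12341234",
    "大": "134",
    "天": "1134",
}

def candidates_for(stroke_sequence):
    """Return characters whose stroke sequence begins with the input, so a
    partial sequence still produces candidates."""
    return [ch for ch, seq in STROKE_LEXICON.items()
            if seq.startswith(stroke_sequence)]

# The partial sequence "123" matches 木, 本, and 林; the candidate list can
# be refreshed as each additional stroke is received.
print(candidates_for("123"))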
Embodiments of the present invention may be applied to various software applications and may be utilized with various input methods. For example, embodiments are illustrated as applied to a messaging application; however, embodiments may be applied to various types of software applications where Chinese text may be input via a five-stroke input method (sometimes referred to as the Wubihua method).
Although the examples illustrated in the figures show touchscreen UIs on mobile 100 and tablet 200 devices, embodiments may be utilized on a vast array of devices including, but not limited to, desktop computer systems, wired and wireless computing systems, mobile computing systems (e.g., mobile telephones, netbooks, tablet or slate type computers, notebook computers, and laptop computers), hand-held devices, IP telephones, gaming devices, cameras, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, and mainframe computers.
With reference now to
According to embodiments, a stroke sequence input 405 may comprise a portion of a stroke sequence of a character, for example, the first few strokes of a character. As can be appreciated, some Chinese characters may include many strokes. Embodiments allow a user to input a portion of a stroke sequence of a character via a stroke or swipe gesture, thereby providing a faster stroke input. The example stroke sequence input 405 illustrated in
The method 300 proceeds to OPERATION 315, where the received stroke sequence input 405 may be displayed. An example stroke sequence 510 displayed in a message bar 140 is illustrated in
Referring back to
If the received stroke sequence input 405 is recognized as a complete or partial stroke sequence 510 of a character or phrase, the method 300 may proceed to OPERATION 325, where one or more candidates may be provided. The one or more candidates 240 may be provided in the candidate line 210, for example, as illustrated in
According to the example illustrated in
Referring again to
A determination may be made at DECISION OPERATION 320 whether the received additional stroke sequence input 405 matches a portion of or a complete stroke sequence of a character. According to embodiments, a determination may also be made to determine whether possible character matches of the first stroke sequence 510 and one or more additional stroke sequences 510 may match one or more phrases. Phrase candidates 705A-D may be provided in the candidate line 210 (OPERATION 325) as illustrated in
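The following Python sketch illustrates, with a small hypothetical phrase list, how the character candidates of two consecutive stroke sequences might be combined and filtered to yield phrase candidates 705; the data and function names are assumptions for illustration only.

# Illustrative sketch only: predicting phrase candidates from the character
# candidates of consecutive stroke sequences. The phrase list is hypothetical.

PHRASES = {"明天", "明日", "日子", "天气"}

def phrase_candidates(first_candidates, second_candidates):
    """Return phrases whose first and second characters are drawn from the
    candidate lists of the first and second stroke sequences."""
    return [a + b
            for a in first_candidates
            for b in second_candidates
            if (a + b) in PHRASES]

# If the first stroke sequence yields the candidates 日 and 明 and the second
# yields 天 and 子, the phrase candidates include 日子 and 明天.
print(phrase_candidates(["日", "明"], ["天", "子"]))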
The method 300 may proceed to OPERATION 330, where an indication of a selection of a candidate 240,705 is received. For example and as illustrated in
The method 300 may proceed to OPERATION 335, where the selected candidate 805 may be displayed in the message bar 140 as illustrated in
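By way of further illustration, the Python sketch below outlines one possible control flow tying the above operations of method 300 together; every callback name is a hypothetical stand-in for a UI or lookup step described above, not a prescribed interface.

# Illustrative sketch only: one possible flow for method 300. All callbacks
# are hypothetical stand-ins for the UI and lookup steps described above.

def run_swipe_stroke_input(receive_swipe, swipe_to_strokes, lookup_candidates,
                           show_candidates, await_selection, commit):
    """Receive swipes, map each to strokes, refresh the candidate line, and
    either commit a selected candidate or keep accumulating strokes for the
    same character or phrase."""
    pending = []                         # strokes received so far
    while True:
        path = receive_swipe()           # returns None when input ends
        if path is None:
            break
        pending.extend(swipe_to_strokes(path))
        show_candidates(lookup_candidates(pending))
        choice = await_selection()       # None means "keep swiping"
        if choice is not None:
            commit(choice)               # e.g., display in the message bar
            pending = []                 # start the next input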
Embodiments of the present invention also provide for continuous handwriting. As described briefly above, while current Chinese handwriting engine recognition rates are very high, unwanted delays may be experienced while a determination is made whether a handwriting input is complete. For example, a user may “write” a character on an interface 205 via one of various input methods. The user may then experience a delay while a handwriting engine determines whether the user has finished writing the character. Embodiments provide for continuous handwriting, allowing a user to input a plurality of characters without having to wait after inputting each character. Embodiments also provide for allowing a user to edit a recognized character.
Referring now to
Embodiments may also provide for character correction. As illustrated in
Referring now to
An example of a user using his finger to enter handwriting input 920 into a writing panel 910 displayed on a display interface 205 of a mobile computing device 100 is illustrated in
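As a non-limiting illustration, handwriting input 920 in the writing panel 910 could be captured as a list of ink strokes, each stroke being the points traced between a touch-down and a touch-up; the class and method names in the Python sketch below are assumptions.

# Illustrative sketch only: capturing handwriting input in a writing panel
# as ink strokes. Class and method names are hypothetical.

from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[float, float]

@dataclass
class HandwritingInput:
    strokes: List[List[Point]] = field(default_factory=list)

    def begin_stroke(self, x: float, y: float) -> None:
        """Start a new ink stroke on touch-down."""
        self.strokes.append([(x, y)])

    def extend_stroke(self, x: float, y: float) -> None:
        """Add a point to the current stroke while the finger moves."""
        if self.strokes:
            self.strokes[-1].append((x, y))

# Two short strokes captured for a character in progress; further strokes may
# follow until an end of input is indicated.
ink = HandwritingInput()
ink.begin_stroke(10, 10); ink.extend_stroke(40, 10)
ink.begin_stroke(25, 0);  ink.extend_stroke(25, 40)
print(len(ink.strokes))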
Referring back to
With reference back to
If at DECISION OPERATION 1025 an indication of a selection of a character candidate 240 is not received, the method 1000 may return to OPERATION 1010, where additional handwriting input 920 is received, or may proceed to OPERATION 1035, where an indication of a selection of the EOI selector 915 is received. The EOI selector 915 may be selected via a touch or other input device, as illustrated in
After an indication of a selection of the EOI selector 915 is received, the method 1000 may proceed to OPERATION 1040, where the recognized character 1105 may be displayed in the recognized character panel 905. According to embodiments, the recognized character panel 905 may allow a user to select a recognized character 1105 and edit or correct the recognized character if desired. The method 1000 may then proceed to OPERATION 1045, where one or more word predictions 1405 may be displayed in the candidate line 210 (illustrated in
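The Python sketch below illustrates, with hypothetical event and callback names, how an explicit end-of-input selection could replace the timeout relied upon by current handwriting engines: ink strokes accumulate and are recognized continuously, and the pending ink is committed only when the EOI selector 915 is tapped.

# Illustrative sketch only: a continuous-handwriting loop driven by an
# explicit end-of-input (EOI) selection instead of a timeout. The event and
# callback names are hypothetical.

def continuous_handwriting(next_event, recognize, show_candidates,
                           commit_character):
    """Accumulate ink strokes, refresh character candidates after each stroke,
    and commit the top candidate when the EOI selector is tapped, immediately
    clearing the panel for the next character."""
    pending_strokes = []
    while True:
        kind, data = next_event()        # e.g., ("stroke", points)
        if kind == "stroke":
            pending_strokes.append(data)
            show_candidates(recognize(pending_strokes))
        elif kind == "eoi":              # EOI selector tapped
            if pending_strokes:
                best = recognize(pending_strokes)[0]
                commit_character(best, pending_strokes)
                pending_strokes = []     # ready for the next character
        elif kind == "quit":
            break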
The method 1000 may proceed to DECISION OPERATION 1050, where a determination is made whether the recognized character 1105 displayed in the recognized character panel 905 is selected. If the recognized character 1105 displayed in the recognized character panel 905 is selected (illustrated in
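One hedged way to support this editing of a past input is sketched below in Python: each recognized character is stored together with its original ink so that selecting it can restore the ink for correction; the class and method names are hypothetical.

# Illustrative sketch only: keeping past handwriting input alongside each
# recognized character so it can be restored for editing. Names are hypothetical.

class RecognizedCharacterPanel:
    def __init__(self):
        self.entries = []                # list of (character, ink_strokes) pairs

    def add(self, character, ink_strokes):
        """Store a recognized character together with its original ink."""
        self.entries.append((character, ink_strokes))

    def select(self, index):
        """Return the stored character and ink so the writing panel can
        redisplay the ink for correction."""
        return self.entries[index]

    def replace(self, index, new_character, new_ink):
        """Apply the user's correction to a past entry."""
        self.entries[index] = (new_character, new_ink)

# Select the first past input for editing, then replace it with a correction.
panel = RecognizedCharacterPanel()
panel.add("未", [[(0, 0), (1, 0)]])
char, ink = panel.select(0)
panel.replace(0, "末", ink)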
Referring again to
If an indication of a selection of a word prediction 1405 is not received at DECISION OPERATION 1060, the method 1000 may return to OPERATION 1010, where additional handwriting input 920 may be received (as illustrated in
The embodiments and functionalities described herein may operate via a multitude of computing systems including, without limitation, desktop computer systems, wired and wireless computing systems, mobile computing systems (e.g., mobile telephones, netbooks, tablet or slate type computers, notebook computers, and laptop computers), hand-held devices, IP phones, gaming devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, and mainframe computers. In addition, the embodiments and functionalities described herein may operate over distributed systems (e.g., cloud-based computing systems), where application functionality, memory, data storage and retrieval, and various processing functions may be operated remotely from each other over a distributed computing network, such as the Internet or an intranet. User interfaces and information of various types may be displayed via on-board computing device displays or via remote display units associated with one or more computing devices. For example, user interfaces and information of various types may be displayed and interacted with on a wall surface onto which they are projected.
Interaction with the multitude of computing systems with which embodiments of the invention may be practiced includes keystroke entry, touch screen entry, voice or other audio entry, gesture entry where an associated computing device is equipped with detection (e.g., camera) functionality for capturing and interpreting user gestures for controlling the functionality of the computing device, and the like. As described above, gesture entry may also include an input made with a mechanical input device (e.g., a mouse, touchscreen, stylus, etc.), the input originating from a bodily motion that can be received, recognized, and translated into a selection and/or movement of an element or object on a graphical user interface that mimics the bodily motion.
As stated above, a number of program modules and data files may be stored in the system memory 1804. While executing on the processing unit 1802, the program modules 1806, such as the IME Character Application 1850 or the Handwriting Engine 1860, may perform processes including, for example, one or more of the stages of methods 300 and 1000. The aforementioned processes are examples, and the processing unit 1802 may perform other processes. Other program modules that may be used in accordance with embodiments of the present invention may include electronic mail and contacts applications, word processing applications, database applications, slide presentation applications, drawing or computer-aided application programs, etc.
Furthermore, embodiments of the invention may be practiced in an electrical circuit comprising discrete electronic elements, packaged or integrated electronic chips containing logic gates, a circuit utilizing a microprocessor, or on a single chip containing electronic elements or microprocessors. For example, embodiments of the invention may be practiced via a system-on-a-chip (SOC) where each or many of the components illustrated in
The computing device 1800 may also have one or more input device(s) 1812, such as a keyboard, a mouse, a pen, a sound input device, a touch input device, a microphone, a gesture recognition device, etc. Output device(s) 1814, such as a display, speakers, a printer, etc., may also be included. The aforementioned devices are examples, and others may be used. The computing device 1800 may include one or more communication connections 1816 allowing communications with other computing devices 1818. Examples of suitable communication connections 1816 include, but are not limited to, RF transmitter, receiver, and/or transceiver circuitry; universal serial bus (USB), parallel, or serial ports; and other connections appropriate for use with the applicable computer readable media.
Embodiments of the invention, for example, may be implemented as a computer process (method), a computing system, or as an article of manufacture, such as a computer program product or computer readable media. The computer program product may be a computer storage media readable by a computer system and encoding a computer program of instructions for executing a computer process.
The term computer readable media as used herein may include computer storage media and communication media. Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. The system memory 1804, the removable storage device 1809, and the non-removable storage device 1810 are all examples of computer storage media (i.e., memory storage). Computer storage media may include, but is not limited to, RAM, ROM, electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store information and which can be accessed by the computing device 1800. Any such computer storage media may be part of the computing device 1800.
Communication media may be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” may describe a signal that has one or more characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media.
One or more application programs 1966 may be loaded into the memory 1962 and run on or in association with the operating system 1964. Examples of the application programs include phone dialer programs, e-mail programs, personal information management (PIM) programs, word processing programs, spreadsheet programs, Internet browser programs, messaging programs, and so forth. The system 1902 also includes a non-volatile storage area 1968 within the memory 1962. The non-volatile storage area 1968 may be used to store persistent information that should not be lost if the system 1902 is powered down. The application programs 1966 may use and store information in the non-volatile storage area 1968, such as e-mail or other messages used by an e-mail application, and the like. A synchronization application (not shown) also resides on the system 1902 and is programmed to interact with a corresponding synchronization application resident on a host computer to keep the information stored in the non-volatile storage area 1968 synchronized with corresponding information stored at the host computer. As should be appreciated, other applications may be loaded into the memory 1962 and run on the mobile computing device 1900, including the IME Character Application 1850 and/or the Handwriting Engine 1860 described herein.
The system 1902 has a power supply 1970, which may be implemented as one or more batteries. The power supply 1970 might further include an external power source, such as an AC adapter or a powered docking cradle that supplements or recharges the batteries. The system 1902 may also include a radio 1972 that performs the function of transmitting and receiving radio frequency communications. The radio 1972 facilitates wireless connectivity between the system 1902 and the “outside world”, via a communications carrier or service provider. Transmissions to and from the radio 1972 are conducted under control of the operating system 1964. In other words, communications received by the radio 1972 may be disseminated to the application programs 1966 via the operating system 1964, and vice versa.
The radio 1972 allows the system 1902 to communicate with other computing devices, such as over a network. The radio 1972 is one example of communication media. Communication media may typically be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. The term computer readable media as used herein includes both storage media and communication media.
This embodiment of the system 1902 provides notifications using the visual indicator 1920, which can be used to provide visual notifications, and/or an audio interface 1974, which produces audible notifications via the audio transducer 1925. In the illustrated embodiment, the visual indicator 1920 is a light emitting diode (LED) and the audio transducer 1925 is a speaker. These devices may be directly coupled to the power supply 1970 so that when activated, they remain on for a duration dictated by the notification mechanism even though the processor 1960 and other components might shut down for conserving battery power. The LED may be programmed to remain on indefinitely until the user takes action to indicate the powered-on status of the device. The audio interface 1974 is used to provide audible signals to and receive audible signals from the user. For example, in addition to being coupled to the audio transducer 1925, the audio interface 1974 may also be coupled to a microphone to receive audible input, such as to facilitate a telephone conversation. In accordance with embodiments of the present invention, the microphone may also serve as an audio sensor to facilitate control of notifications, as will be described below. The system 1902 may further include a video interface 1976 that enables an operation of an on-board camera 1930 to record still images, video streams, and the like.
A mobile computing device 1900 implementing the system 1902 may have additional features or functionality. For example, the mobile computing device 1900 may also include additional data storage devices (removable and/or non-removable) such as, magnetic disks, optical disks, or tape. Such additional storage is illustrated in
Data/information generated or captured by the mobile computing device 1900 and stored via the system 1902 may be stored locally on the mobile computing device 1900, as described above, or the data may be stored on any number of storage media that may be accessed by the device via the radio 1972 or via a wired connection between the mobile computing device 1900 and a separate computing device associated with the mobile computing device 1900, for example, a server computer in a distributed computing network, such as the Internet. As should be appreciated, such data/information may be accessed via the mobile computing device 1900 via the radio 1972 or via a distributed computing network. Similarly, such data/information may be readily transferred between computing devices for storage and use according to well-known data/information transfer and storage means, including electronic mail and collaborative data/information sharing systems.
The description and illustration of one or more embodiments provided in this application are not intended to limit or restrict the scope of the invention as claimed in any way. The embodiments, examples, and details provided in this application are considered sufficient to convey possession and to enable others to make and use the best mode of the claimed invention. The claimed invention should not be construed as being limited to any embodiment, example, or detail provided in this application. Regardless of whether shown and described in combination or separately, the various features (both structural and methodological) are intended to be selectively included or omitted to produce an embodiment with a particular set of features. Having been provided with the description and illustration of the present application, one skilled in the art may envision variations, modifications, and alternate embodiments falling within the spirit of the broader aspects of the claimed invention and the general inventive concept embodied in this application that do not depart from the broader scope.