This patent application is also related to the following non-provisional patent applications: U.S. patent application Ser. No. 10/186,388, entitled “Writing Guide for a Free-Form Document Editor”, U.S. patent application Ser. No. 10/186,847, entitled “Method and System for Editing Electronic Ink”, U.S. patent application Ser. No. 10/186,874, entitled “Method and System for Selecting Objects on a Display Device”, U.S. patent application Ser. No. 10/186,812, entitled “Resolving Document Object Collisions”, U.S. patent application Ser. No. 10/186,837, entitled “Space Management for Electronic Documents”, U.S. patent application Ser. No. 10/186,820, entitled “Method and System for Categorizing Data Objects with Designation Tools”, U.S. patent application Ser. No. 10/186,463, entitled “Method and System for Displaying and Linking Ink Objects with Recognized Text and Objects”. These applications and the application herein are all being filed on the same date, Jun. 28, 2002, and are assigned to the Microsoft Corporation. The subject matter of each of these applications is hereby fully incorporated by reference.
The present invention is generally directed to writing on an electronic tablet using electronic ink. More particularly described, the present invention supports automatically recognizing a user's handwriting entered into an electronic document in electronic ink and automatically converting the electronic ink to text upon the occurrence of a predefined event.
One of the simplest methods of recording and communicating information is the traditional method of writing the information down on a piece of paper with a writing instrument such as a pen. Writing information by hand on a piece of paper is inexpensive and can be done quickly and easily with little preparation. The traditional method is also flexible in that a writer can generally write in any format anywhere on the page. One of the limitations with handwritten work is that it is not easily manipulated or transferred to other contexts. In other words, changing or transferring a piece of handwritten text typically requires rewriting the text on another medium.
With the widespread use of personal computers, textual information often is recorded using word processing software running on a personal computer. The advantage of such electronic methods of recording information is that the information can be easily stored and transferred to other remote computing devices and electronic media. Such electronically recorded text can also be easily corrected, modified, and manipulated in a variety of different ways.
Typical computer systems, especially computer systems using graphical user interface (GUI) systems such as Microsoft's WINDOWS operating system, are optimized for accepting user input from one or more discrete input devices. Common input devices include a keyboard for entering text and a pointing device, such as a mouse with one or more buttons, for controlling the user interface. The keyboard and mouse interface facilitates creation and modification of electronic documents including text, spreadsheets, database fields, drawings, and photos.
One of the limitations of conventional GUI systems is that a user must generally enter text into the personal computer by typing on the keyboard. Entering text using a keyboard is generally slower and more cumbersome than handwriting. Although recent advances have been made in reducing the size of personal computers, they are still not as portable and easily accessible as traditional paper and pen. Furthermore, traditional pen and paper provide the user with considerable flexibility for editing a document, recording notes in the margin, and drawing figures and shapes. In some instances, a user may prefer to use a pen to edit a document rather than review the document on-screen because of the ability to make notes freely outside of the confines of the keyboard and mouse interface.
To address the shortcomings of traditional keyboard and mouse interfaces, there have been various attempts to create an electronic tablet that can record handwriting. Such electronic tablets typically comprise a screen and a handheld device that is similar to a pen (or “stylus”). A user can write with the handheld device on the screen of the electronic tablet in a similar manner to traditional pen and paper. The electronic tablet can “read” the strokes of the user's handwriting with the handheld device and recreate the handwriting in electronic form on the screen with “electronic ink.” This electronic tablet approach can be employed in a variety of ways including on a personal computer and on a handheld computing device.
Despite the advances in electronic tablets and electronic ink, several limitations still exist with the performance of such electronic handwriting devices. Typically, the placement of an insertion point (or cursor) on an electronic page dictates where electronic ink will be entered by the user. In other words, when a user writes in electronic ink on an electronic tablet, the electronic ink is inserted directly where the insertion point exists on the electronic page. In this way, the insertion point indicates to the user where electronic ink will be inserted on the electronic page once the user starts writing. Therefore, if the user wants to write on another part of the electronic page, the user must first move the insertion point to the spot on the electronic page where the user wants to insert the electronic ink, and then begin writing. Additionally, in the conventional art, as a user writes in electronic ink on an electronic page, the electronic ink remains on the page as electronic ink. If the user wants the electronic ink converted to traditional text (such as the text that appears on a traditional computer screen) at a later time, the user must manually select the electronic ink to be converted and then request that the electronic ink be converted to text. These manual steps are unnecessarily time-consuming for the user to perform. Additionally, if a user handwrites a significant amount of information into the electronic document and then later converts the electronic ink to text, the user may find that some of the information could not be converted to text. This may occur if the user's handwriting becomes too illegible or erratic for the electronic tablet to decipher.
Consequently, there is a need in the art for a system and method that will allow a user to write electronic ink anywhere on an electronic page without manually moving the insertion point. Additionally, there is a need in the art for a system and method for automatically recognizing electronic ink as handwriting and converting the electronic ink to text, without manual intervention. Finally, there is a need in the art for a system and method for automatically recognizing and converting electronic ink to text as it is entered by a user and upon the occurrence of a predefined event.
The present invention can solve the aforementioned problems by providing a system and method for automatically recognizing and converting electronic ink written by a user on an electronic tablet to text. In one aspect of the present invention, a user can enter handwritten electronic ink into a writing guide on an electronic page. A stylus and ink module can display the handwritten electronic ink as it is rendered on the electronic page by the user. A writing guide module can also send the electronic ink to a recognizer upon the occurrence of a predefined event. After the recognizer converts the electronic ink to text, the writing guide module can replace the electronic ink in the writing guide with the text provided by the recognizer. In other words, the writing guide module can insert the converted text in the same location where the electronic ink was written by the user, no matter where the user wrote the electronic ink on the page.
Various aspects of the present invention may be more clearly understood and appreciated from a review of the following detailed description of the disclosed embodiments and by reference to the drawings and claims.
The present invention, which can be embodied in one or more program modules that run in a distributed computing environment, enhances the performance of computing devices, such as electronic tablets, that convert a user's handwriting strokes into electronic ink in an electronic document. Specifically, the present invention improves upon the user's ability to review and proof handwritten electronic ink. In one exemplary embodiment of the present invention, upon the occurrence of a predefined event, a user's handwritten electronic ink is automatically recognized and converted to text ink. In this way, the user can ensure that the application can recognize his or her handwriting and properly convert it to text as the user continues to hand write information on the electronic page.
In one exemplary embodiment of the present invention, a user can input handwritten information in electronic ink on an electronic page. As the user writes the information, an exemplary stylus and ink module displays the electronic ink on the page of the electronic tablet. Upon the occurrence of a predefined event, a writing guide module automatically sends the electronic ink to an exemplary recognizer. The recognizer receives the electronic ink from the writing guide module and converts the electronic ink to text ink. Upon receiving the text ink from the recognizer, the writing guide module replaces the electronic ink with the text ink in the exact location on the page where the electronic ink was originally written.
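The flow described above can be sketched in a few lines of Python. This is a minimal illustration only: the class names (`WritingGuide`, `Recognizer`), the dictionary representation of a stroke, and the stub recognizer are assumptions introduced for the sketch, not elements of the specification.

```python
class Recognizer:
    """Stand-in for the word recognizer: maps ink strokes to text."""
    def convert(self, ink_strokes):
        # A real recognizer would perform handwriting recognition here;
        # this stub joins per-stroke labels as a placeholder.
        return " ".join(stroke["label"] for stroke in ink_strokes)


class WritingGuide:
    """Holds wet electronic ink and replaces it in place with dry text."""
    def __init__(self, recognizer):
        self.recognizer = recognizer
        self.ink = []     # wet electronic ink strokes
        self.text = None  # dry text ink after conversion

    def add_stroke(self, stroke):
        self.ink.append(stroke)

    def on_predefined_event(self):
        # Triggered by a predefined event: convert the wet ink and
        # swap it for text in the same location on the page.
        self.text = self.recognizer.convert(self.ink)
        self.ink = []


guide = WritingGuide(Recognizer())
guide.add_stroke({"label": "hello"})
guide.add_stroke({"label": "world"})
guide.on_predefined_event()
print(guide.text)  # prints: hello world
```

The key point the sketch captures is that conversion is driven by an event, not by a manual select-and-convert step, and that the converted text replaces the ink where it was written.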
Although the exemplary embodiments will be generally described in the context of a software module and an operating system running on a personal computer, those skilled in the art will recognize that the present invention also can be implemented in conjunction with other program modules for other types of computers. Furthermore, those skilled in the art will recognize that the present invention may be implemented in a stand-alone or in a distributed computing environment. In a distributed computing environment, program modules may be physically located in different local and remote memory storage devices. Execution of the program modules may occur locally in a stand-alone manner or remotely in a client/server manner.
An exemplary embodiment of the present invention comprises a computer program, which embodies the functions described herein and illustrated in the appended flow charts. However, those skilled in the art recognize that there could be many different ways of implementing the invention in computer programming, and the invention should not be construed as limited to any one set of computer program instructions. Further, a skilled programmer would be able to write such a computer program to implement the disclosed invention without difficulty based on the flow charts and associated description in the application text, for example. Therefore, disclosure of a particular set of program code instructions is not considered necessary for an adequate understanding of how to make and use the invention. The inventive functionality of the claimed computer program will be explained in more detail in the following description in conjunction with the remaining figures illustrating the program flow.
Referring now to the drawings in which like numerals represent like elements throughout the several figures, exemplary embodiments of the present invention and the illustrative operating environment will be described in connection with the drawings.
In the representative architecture 100, the hardware components are coupled to an ink processing software module 125. It should be understood by those skilled in the art that the architecture 100 is merely representative, and that the present invention is not limited to this particular arrangement of hardware and software components.
In one exemplary embodiment, the ink processing module 125 comprises a collection of software modules that perform different tasks for rendering handwriting strokes as electronic ink. For example, the stylus and ink module 128 receives data describing the positions and angles of the stylus 155 for a series of handwriting strokes. The stylus and ink module 128 interprets the data for rendering electronic ink. Other software modules, such as a gesture recognizer 130 and word recognizer 135 identify certain handwriting strokes and assign them a particular significance. For example, the word recognizer 135 converts electronic ink to text ink. Specifically, the word recognizer 135 breaks the user's handwriting in electronic ink down into separate word blocks, evaluates the word blocks, and then upon recognizing the word contained in each word block, converts the handwritten word to word-based text (i.e. text that is typically viewed on a computer monitor). Additionally, certain gestures (such as a cross-out) may be recognized and associated with other editing processes. The ink processing module 125 can also include an erasing functions module 140 for removing electronic ink that has been previously rendered.
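One way to picture how the word recognizer 135 "breaks the user's handwriting in electronic ink down into separate word blocks" is spatial segmentation by gaps between strokes. The sketch below assumes each stroke is reduced to a horizontal extent `(x_min, x_max)` and uses an illustrative gap threshold; both are assumptions of this example, not details from the specification.

```python
def split_into_word_blocks(strokes, gap_threshold=15.0):
    """Group ink strokes into word blocks by horizontal gaps.

    Each stroke is an (x_min, x_max) horizontal extent; a gap wider
    than gap_threshold between consecutive strokes starts a new block.
    """
    blocks, current = [], []
    prev_right = None
    for x_min, x_max in sorted(strokes):
        if prev_right is not None and x_min - prev_right > gap_threshold:
            # Gap is wide enough: close the current word block.
            blocks.append(current)
            current = []
        current.append((x_min, x_max))
        prev_right = x_max if prev_right is None else max(prev_right, x_max)
    if current:
        blocks.append(current)
    return blocks


print(split_into_word_blocks([(0, 10), (12, 20), (40, 55)]))
# prints: [[(0, 10), (12, 20)], [(40, 55)]]
```

Each resulting block would then be evaluated independently and, once its word is recognized, replaced with word-based text.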
A document editing module 105 facilitates the manipulation of electronic ink so that a user can create and manipulate an electronic document 120 with greater ease and sophistication. The document editing module 105 typically comprises a collection of software modules for controlling and manipulating electronic ink rendered on the monitor 170. For example, a parsing module 110 identifies handwriting strokes that are selected by the user for editing. Selected strokes may be highlighted or shaded to assist the user in identifying which strokes are to be edited. A classifier module 115 identifies certain handwriting strokes as being part of a word or drawing. Software modules such as the layout module 116 and the insert space module 117 can be designed to control how electronic ink is rendered and moved. A writing guide module 127 facilitates the creation of writing guides, which assist a user in entering information in an electronic document 120 and in converting electronic handwriting to text. The editing modules shown in the drawings are representative, and other combinations of editing modules can be used.
Certain steps in the processes described below must naturally precede others for the present invention to function as described. However, the present invention is not limited to the order of the steps described if such order or sequence does not alter the functionality of the present invention.
If the stylus and ink module 128 has detected a stylus 155 on or near the input screen of the electronic tablet 150, then in Step 220, the writing guide module 127 creates a writing guide based upon the user's stroke input into an electronic document 120 on the electronic tablet 150. A writing guide visually guides the user in entering his or her information by showing the user the structure of the information to be entered. A writing guide can comprise a handwriting writing guide (for electronic handwritten notes) or a drawing guide (for creating drawings). As the user enters strokes on the input screen, the classifier module 115 determines whether the guide to be created should comprise a drawing guide or a handwriting writing guide. In other words, if it appears to the classifier module 115 that the user is entering handwriting onto the input screen, then the writing guide module 127 creates a handwriting writing guide. On the other hand, if it appears to the classifier module 115 that the user is drawing on the input screen, then the writing guide module 127 creates a drawing guide. In one exemplary embodiment, the classifier module 115 makes this determination based upon the size, number, and aspect ratio of the input strokes made by the user. In another exemplary embodiment, the classifier module 115 makes this determination based upon another type of user input. For example, a user can select a particular handwriting pen from a set of handwriting pens that have particular pre-defined formatting characteristics for electronic ink. In this way, if the user selects a handwriting pen before writing on the electronic tablet 150, then the classifier module 115 knows automatically that the user will be entering handwriting onto the electronic page.
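A heuristic of the kind the classifier module 115 applies (stroke size, stroke count, and aspect ratio) can be sketched as follows. The thresholds and the bounding-box stroke representation are illustrative assumptions; the specification does not fix particular values. The intuition is that handwriting tends to be many short strokes forming a wide, low band, while drawings tend to be fewer, larger strokes.

```python
def classify_strokes(strokes, max_height=40.0, min_aspect=2.0, min_strokes=2):
    """Heuristic sketch of the handwriting-vs-drawing decision.

    Each stroke is a dict with bounding-box "width" and "height";
    all thresholds are illustrative, not values from the specification.
    """
    if not strokes:
        return "unknown"
    total_width = sum(s["width"] for s in strokes)
    tallest = max(s["height"] for s in strokes)
    # Wide, low input (high aspect ratio) suggests a line of handwriting.
    aspect = total_width / tallest if tallest else float("inf")
    if (len(strokes) >= min_strokes
            and tallest <= max_height
            and aspect >= min_aspect):
        return "handwriting"
    return "drawing"


print(classify_strokes([{"width": 10, "height": 12}] * 4))  # prints: handwriting
print(classify_strokes([{"width": 100, "height": 120}]))    # prints: drawing
```

Selecting a handwriting pen, as described above, would simply bypass this heuristic and force the "handwriting" classification.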
In Step 230, if the writing guide module 127 created a drawing guide or activated an existing drawing guide, then the process ends because a drawing does not need to be converted to text. However, if the writing guide module 127 created a handwriting writing guide or activated an existing handwriting writing guide as a result of the user's input, then in Step 240, the user continues to write wet electronic ink on the electronic page.
In Step 250, the writing guide module 127 determines whether the wet electronic ink should be automatically converted to text. In one exemplary embodiment, a user can select to write with an “Auto-Recognition” handwriting pen on the electronic tablet 150. If the user writes with an “Auto-Recognition” pen, then the document editing module 105 knows that the user wants his or her electronic ink to be automatically recognized and converted to text upon the occurrence of a particular event. In one exemplary embodiment, the occurrence of any of the following events will trigger the automatic conversion of wet electronic ink to dry text ink: the user enters enough input strokes to fill one line of the writing guide; the user changes his or her focus from the electronic document 120 to another application window; or the user moves the stylus 155 away from the current writing guide to another location on the electronic tablet 150.
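The Step 250 decision reduces to a pen-mode check plus a membership test on the trigger events listed above. In this sketch, the event names and the `pen_mode` strings are hypothetical identifiers invented for illustration.

```python
# Hypothetical names standing in for the three predefined trigger events.
AUTO_CONVERT_TRIGGERS = {"line_filled", "focus_changed", "stylus_left_guide"}

def should_auto_convert(pen_mode, event):
    """Return True only when the user writes with the Auto-Recognition
    pen and one of the predefined trigger events has occurred."""
    return pen_mode == "auto_recognition" and event in AUTO_CONVERT_TRIGGERS


print(should_auto_convert("auto_recognition", "line_filled"))  # prints: True
print(should_auto_convert("standard", "line_filled"))          # prints: False
```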
If the writing guide module 127 determines that the wet electronic ink should be automatically converted to text, then in Step 260, the writing guide module 127 sends the wet handwriting ink to the word recognizer 135 to be converted to dry text ink. Then, in Step 270, upon receiving the converted text back from the word recognizer 135, the writing guide module 127 replaces the wet electronic ink in the writing guide with the dry text ink received from the word recognizer 135.
However, if in Step 320, a writing guide is not visible, or if in Step 310, the writing guide module 127 does not detect the stylus 155 near any writing guides that currently exist on the electronic page, then in Step 340, the stylus and ink module 128 determines whether the user has physically touched the stylus 155 to the input screen of the electronic tablet 150. If the stylus 155 has been placed directly on the input screen, then in Step 350, the classifier module 115 determines whether the user is inputting handwriting or a drawing. As discussed above, this determination can be made upon evaluating a number of factors. In one exemplary embodiment, these factors include the number of input strokes entered by the user, the height of the input strokes, and the aspect ratio of the input strokes. If in Step 350, the classifier module 115 determines that the user is inputting a drawing, or if it cannot recognize the input provided by the user, then the process ends. However, if in Step 350, the classifier module 115 detects that the user is entering handwriting onto the electronic page, then in Step 360, the writing guide module 127 creates a handwriting writing guide.
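The branching just described (activate an existing visible guide; otherwise require a screen touch, classify the input, and create a guide only for recognized handwriting or drawing) can be summarized as one decision function. The argument names and returned action strings are illustrative assumptions.

```python
def handle_stylus(near_guide, guide_visible, touched_screen, classification):
    """Sketch of the branch logic when the stylus appears on the tablet.

    Returns the action the writing guide module would take; the flow
    mirrors Steps 310-360 described above.
    """
    if near_guide and guide_visible:
        # Stylus is near a visible existing guide: reuse it.
        return "activate_existing_guide"
    if not touched_screen:
        # Hovering without touching the input screen does nothing.
        return "no_action"
    if classification == "handwriting":
        return "create_handwriting_guide"
    if classification == "drawing":
        return "create_drawing_guide"
    return "no_action"  # unrecognized input ends the process


print(handle_stylus(False, False, True, "handwriting"))
# prints: create_handwriting_guide
```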
It should be understood that the foregoing relates only to illustrative embodiments of the present invention, and that numerous changes may be made therein without departing from the scope and spirit of the invention as defined by the following claims.