The present application is related to the following copending U.S. patent applications, assigned to the assignee of the present application, filed concurrently herewith and hereby incorporated by reference: “Digital Ink-Based Search,” U.S. patent application Ser. No. 11/821,837, and “Unified Digital Ink Recognition,” U.S. patent application Ser. No. 11/821,858.
Digital ink is becoming an important medium for users to interact with computer applications. For example, with a tablet personal computer, a user can use a pen to handwrite digital ink to input information.
While tablet PC and digital ink technologies thus make a user's use of a computer much more natural, a user has to consider what type of information is going to be input at any given time, and then what program to use. For example, some programs are not digital ink aware, whereby digital ink cannot be input directly to such a program. For non-aware programs, a tablet PC input panel (TIP) may be used to convert handwriting to recognized text that is then input to the program as if it were typed on the keyboard.
However, at various times a user may want to input different kinds of information, such as characters (text), a shape, a sketch, a math equation, and so forth, but a tablet PC input panel can only provide text recognized from handwriting. Thus, the user needs to run a different, appropriate digital ink-aware program each time a different type of digital ink is to be input; otherwise, for example, using the TIP will cause the system to attempt to convert the input to text.
This Summary is provided to introduce a selection of representative concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used in any way that would limit the scope of the claimed subject matter.
Briefly, various aspects of the subject matter described herein are directed towards a technology by which a platform processes different types of digital ink input for sending corresponding information to different application programs. The platform includes an ink panel having different operating modes for receiving different types of digital ink, and a recognition service that recognizes different types of digital ink. A mechanism such as an API set couples the ink panel to the recognition service such that a recognition result (or information corresponding to the recognition result) may be provided to an application program.
In one example aspect, the recognition service includes a unified recognizer that recognizes at least two different types of digital ink input, e.g., characters and shapes. Another recognizer may be included in the recognition service, e.g., an equation recognizer; if so, a recognizer is selected for recognizing ink based upon a current operating mode of the ink panel.
In one example aspect, while in a non-text input mode such as a shape mode, the panel sends the ink input to a unified recognizer that recognizes both text and non-text items, e.g., shapes. If the recognition result corresponds to text and the input panel was in a non-text input mode when the recognized ink was received, the text is used in a keyword search to locate a non-character item for output. Otherwise, the recognition result is used for output without keyword searching.
Other advantages may become apparent from the following detailed description when taken in conjunction with the drawings.
The present invention is illustrated by way of example and not limited in the accompanying figures in which like reference numerals indicate similar elements and in which:
Various aspects of the technology described herein are generally directed towards digital ink-based input technology that can be used to input items such as characters, shapes, sketches, formulas and so forth efficiently and naturally. In one aspect, digital ink is used as input to an integrated smart digital ink panel, which can output appropriate corresponding data to any program that can receive digital ink and/or recognized output from a recognizer. For example, as is known, digital ink is input naturally by a user's handwriting, and can be used to represent different kinds of information intuitively and effectively. As will be understood, the advantages of digital ink handwritten input are leveraged herein, whereby a user may naturally and efficiently input several kinds of information, such as text, shapes and math equations, for output into prevalent applications.
In one aspect, the smart digital ink panel provides a user with an integrated experience for inputting data to a computer by handwriting, for use by any application that can consume corresponding information. For example, a user can input text (such as recognized from Chinese characters) into applications containing text editing functionality, such as a note taker program (e.g., Microsoft® Notepad) or a word processing application (e.g., Microsoft® Word); math equations via the panel into a suitable word processing application (e.g., Microsoft® Word); and shapes into a diagramming program (e.g., Microsoft® Visio®).
While various examples herein are primarily directed to differentiating between characters, equations and shapes, any handwritten input may be differentiated and/or benefit from the technology described herein, including handwritten characters, sketched shapes, handwritten gestures, handwritten formulas (e.g., mathematical, chemical and so forth) and/or drawn pictures or the like. Further, while an example implementation of a unified digital ink recognizer is described herein that can, among other aspects, differentiate between characters and shapes, other implementations of a unified digital ink recognizer may be used.
As such, the present invention is not limited to any particular embodiments, aspects, concepts, structures, functionalities or examples described herein. Rather, any of the embodiments, aspects, concepts, structures, functionalities or examples described herein are non-limiting, and the present invention may be used in various ways that provide benefits and advantages in computing, and digital ink in general.
Example Unified Digital Ink Recognizer
As described below with reference to FIG. 1, an example unified digital ink recognizer 102 is built to recognize a single, defined dataset containing different types of items, e.g., a set of characters and a set of shapes.
For the shape set, the customizable private use area of Unicode, ranging from 0xF000 to 0xF0FF, is used. When building a unified digital ink recognizer, any item to be recognized can be assigned a Unicode value from the private use area, although an item with an already-assigned Unicode value (e.g., a character) can use that value.
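Although no code appears in the specification, the following non-limiting Python sketch illustrates one way such a mapping might be defined; the shape names and the helper function are hypothetical:

```python
# Hypothetical dataset definition: characters keep their already-assigned
# Unicode values, while shapes receive code points from the private use
# area subrange mentioned above (0xF000-0xF0FF).
PUA_START, PUA_END = 0xF000, 0xF0FF

def build_dataset(characters, shape_names):
    """Map each item to be recognized to a unique Unicode value."""
    dataset = {ch: ord(ch) for ch in characters}  # existing assignments
    for offset, name in enumerate(shape_names):
        code_point = PUA_START + offset
        if code_point > PUA_END:
            raise ValueError("private use subrange exhausted")
        dataset[name] = code_point  # custom assignment from the PUA
    return dataset

# e.g., two Chinese characters plus three hypothetical shape classes
dataset = build_dataset("永和", ["rectangle", "ellipse", "triangle"])
```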
To build the unified digital ink recognizer 102, a learning-based pattern recognition approach is used, as generally represented by the example components shown in FIG. 1.
With the classifier, given a new item to be recognized, the features of the item are matched against the features of the existing classes; the new item is recognized as belonging to the class whose features it best matches.
One aspect of building the digital ink recognizer 102 with this approach is the collection of digital ink samples for each item in the defined dataset to be recognized. In the implementation represented in FIG. 1, the collected samples 104 are divided into a training set 106 used to build the recognizer model, along with a testing set and a tuning set used to evaluate and optimize the model, as described below.
Based on the digital ink samples 104, a first mechanism (process step) 114 develops and/or selects a set of one or more core algorithms 116 for extracting the features of the training set 106; the core algorithms are then performed on the training set 106 to build the digital ink recognizer model 112 from the extracted features.
More particularly, a recognition algorithm is used to build the recognizer model (classifier) 112 for the items to be recognized. As represented in FIG. 1, the core algorithms 116 may include a feature extraction algorithm, a feature selection algorithm and the recognition algorithm itself, corresponding to steps 408, 410 and 412 of FIG. 4 described below.
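The features themselves are not specified herein; purely as a non-limiting sketch, suppose a digital ink sample comprises a list of strokes, each a list of (x, y) points, from which a few simple geometric features are extracted (the particular features below are hypothetical):

```python
def extract_features(strokes):
    """strokes: [[(x, y), ...], ...] for one digital ink sample."""
    xs = [x for stroke in strokes for x, _ in stroke]
    ys = [y for stroke in strokes for _, y in stroke]
    width = (max(xs) - min(xs)) or 1.0   # guard against zero extent
    height = (max(ys) - min(ys)) or 1.0
    first, last = strokes[0][0], strokes[-1][-1]
    return [
        float(len(strokes)),             # stroke count
        width / height,                  # bounding-box aspect ratio
        (first[0] - min(xs)) / width,    # normalized pen-down x
        (first[1] - min(ys)) / height,   # normalized pen-down y
        (last[0] - min(xs)) / width,     # normalized pen-up x
        (last[1] - min(ys)) / height,    # normalized pen-up y
    ]
```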
As is known, there are many recognition algorithms that may be used to build a recognition system, including nearest neighbor classification (sometimes referred to as k-nearest neighbor, or KNN), the Gaussian Mixture Model (GMM), the Hidden Markov Model (HMM), and so forth. In one implementation of the unified digital ink recognition system, nearest neighbor classification is used to recognize digital ink.
A primary concept in nearest neighbor classification is to use one point in multi-dimensional space to represent each class of samples, such as classes A-C as generally represented in FIG. 2; each class's point is determined from the features extracted from that class's training samples.
After the recognizer model 112 is built, when a new item "New Item" is to be recognized, that item is also represented by a point in this space. As represented in FIG. 3, the distance from the new item's point to each class's point is computed, and the new item is recognized as belonging to the nearest class.
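The following minimal sketch illustrates this scheme, under the assumption (not stated in the text) that each class's point is the mean of that class's training feature vectors:

```python
import math

def build_model(samples_by_class):
    """samples_by_class: {unicode_value: [feature_vector, ...]}.
    Represents each class by one point: the mean of its samples."""
    return {
        label: [sum(dim) / len(vectors) for dim in zip(*vectors)]
        for label, vectors in samples_by_class.items()
    }

def recognize(model, features):
    """Classify a new item as belonging to the nearest class point."""
    return min(model, key=lambda label: math.dist(model[label], features))
```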
Returning to FIG. 1, after the recognizer model 112 is built from the training set 106, its accuracy and efficiency are evaluated using a testing set of the digital ink samples, and the model may be tuned using a separate tuning set, as described below with reference to FIG. 4.
When complete, a unified digital ink recognizer 102 is provided, comprising the core algorithm or algorithms 116 and the recognizer model 112. In one implementation, the unified digital ink recognizer recognizes both handwritten characters (e.g., Chinese characters) and sketched shapes (including sketched graphs). As a result, whether the user inputs a Chinese character by handwriting or a shape by sketching, the unified digital ink recognizer correctly interprets the digital ink as a character or as a shape.
Step 408 represents using a feature extraction algorithm to extract features from each selected item in the training set, with step 410 representing the feature selection algorithm, and step 412 representing the building of the recognizer model, e.g., processing the feature data of each selected item as needed to adjust the feature data for the class to which that item belongs (each class being identified by its Unicode value) in the recognizer model, such as by representing the class in multi-dimensional coordinates.
Step 414 represents evaluating the accuracy and/or efficiency of the model using the testing set of digital ink samples. Based on an error analysis at step 416 as to how accurate and/or efficient the model is, samples from the tuning set may be applied in an attempt to better optimize the recognizer. Step 418 represents repeating any or all of steps 406, 408, 410, 412, 414 and 416 for further optimization. Note that the evaluation at step 414 may be used to determine whether further optimization is necessary; further, a model that is less accurate and/or efficient than another may be discarded, so that only the best model of those evaluated is kept.
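One plausible reading of this build/evaluate/optimize loop is sketched below, assuming that "applying the tuning set" means folding tuning samples into the training data when accuracy falls short; build_model, recognize and extract_features are the earlier sketches:

```python
def accuracy(model, labeled_samples):
    """labeled_samples: [(unicode_value, strokes), ...] from the testing set."""
    hits = sum(recognize(model, extract_features(strokes)) == label
               for label, strokes in labeled_samples)
    return hits / len(labeled_samples)

def train_and_tune(training, testing, tuning, target=0.95, max_rounds=5):
    """training/tuning: {unicode_value: [strokes, ...]}; keeps the best model."""
    best_model, best_acc = None, 0.0
    for _ in range(max_rounds):
        model = build_model({
            label: [extract_features(s) for s in samples]
            for label, samples in training.items()
        })
        acc = accuracy(model, testing)        # step 414: evaluate
        if acc > best_acc:
            best_model, best_acc = model, acc  # discard the weaker model
        if acc >= target or not tuning:        # done, or nothing left to apply
            break
        for label, extra in tuning.items():    # step 416: apply tuning samples
            training.setdefault(label, []).extend(extra)
        tuning = {}                            # applied once in this sketch
    return best_model, best_acc
```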
Integrated Platform for User Input of Digital Ink
Turning to FIG. 15, there is shown an example of an integrated platform for user input of digital ink. As represented in FIG. 15, a user inputs handwriting to a smart digital ink panel 1504, which provides different operating modes, e.g., a text mode, a shape mode and a math equation mode, for receiving different types of digital ink.
In this example implementation, a set of recognition APIs 1506 is provided by which the panel 1504 communicates with a recognition engine service 1508. The recognition service 1508 may include the unified digital ink recognizer 102 of FIG. 1, and may further include one or more other recognizers, such as an equation recognizer; a recognizer is selected based upon the current operating mode of the panel.
As the user inputs information to the panel 1504 by handwriting, the digital ink of the user's input is collected and is passed via the APIs 1506 to the recognition engine service 1508 for recognition. The recognition engine service 1508 returns a recognition result to the panel 1504. In different operating modes of the smart digital ink panel, the recognition result is different and is used differently for input.
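The shape of the API set 1506 is not specified herein; the following sketch merely illustrates the round trip just described, with invented class and method names, and with the recognizer chosen according to the panel's current mode:

```python
class RecognitionService:
    """Routes collected ink to a recognizer chosen by the panel's mode."""

    def __init__(self, unified_recognizer, equation_recognizer):
        self._recognizers = {
            "text": unified_recognizer,       # unified handles text...
            "shape": unified_recognizer,      # ...and shapes
            "equation": equation_recognizer,  # equations use a separate recognizer
        }

    def recognize(self, ink_strokes, mode):
        return self._recognizers[mode].recognize(ink_strokes)


class InkPanel:
    """Collects handwriting and forwards it for recognition."""

    def __init__(self, service, mode="text"):
        self.service, self.mode = service, mode

    def on_ink_collected(self, ink_strokes):
        result = self.service.recognize(ink_strokes, self.mode)
        return result  # shown in the panel, then sent to the program in focus
```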
More particularly, when operating in the text mode as illustrated in FIG. 16, the panel sends the user's handwritten ink for recognition as text (e.g., Chinese characters); the recognized text is returned to the panel, from where it may be sent (e.g., as one or more Unicode values) to the application program currently in focus.
In the shape mode as illustrated in FIG. 17, the ink input is sent to the unified recognizer, which may recognize the input as a shape or as text. A recognized shape is returned to the panel for output; recognized text is instead used as a keyword for locating a corresponding shape, as described below with reference to FIG. 19.
In the math equation mode, which is illustrated in FIG. 18, the ink input is sent to an equation recognizer; the recognition result (the recognized equation, or corresponding equation information) is returned to the panel for sending to the application program in focus.
Step 1902 of FIG. 19 represents selecting the input mode, e.g., text, shape or equation, which may be done by default upon opening the panel. Step 1904 represents receiving the digital ink input to recognize, which as described above may be a character or some other, non-character item depending on the current mode.
Steps 1906 and 1910 represent routing the input to an appropriate recognizer; in this example the text mode uses the unified recognizer, but a text-only recognizer or the like may be used instead. Step 1908 represents the text mode handling, in which the input is sent to the unified recognizer, a recognition result is received and output to the panel, and thereafter handled as directed by the user or another process, e.g., to send the text (e.g., one or more Unicode values) to the program currently in focus or to cancel the send.
If not in the text mode, step 1910 determines whether the input is to be routed to a shape-enabled (e.g., unified) recognizer or to the equation recognizer. If in the equation mode, step 1912 is executed, in which the input is sent to the equation recognizer, a recognition result is received and output to the panel, and thereafter handled as directed by the user or another process, e.g., to send the equation (or corresponding equation information) to the program currently in focus or to cancel the send.
Steps 1914, 1916, 1918 and 1920 represent example actions taken when digital ink is received in the shape mode, beginning at step 1914 where the ink input is sent to the unified recognizer. In this example, if a shape was recognized, the shape is output to the user via step 1920. Step 1920 also represents handling the shape as directed by the user or other process, e.g., to send shape information to the current program in focus or cancel the send.
If instead a character was recognized while in the shape mode as described above, step 1916 branches to step 1918, where the character is used as a keyword (or to build a keyword) for searching a shape data store or the like. Step 1918 also represents performing the search and obtaining the search results, with step 1920 representing outputting the found item or items to the user, which are thereafter handled as directed by the user or another process, e.g., to send shape information to the program currently in focus or to cancel the send.
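A sketch of this shape-mode branch follows; the result object's is_text and value fields and the shape_store mapping are hypothetical stand-ins for whatever the recognizer and shape data store actually provide:

```python
def handle_shape_mode_ink(ink_strokes, unified_recognizer, shape_store):
    """Returns the item(s) to output to the user in shape mode."""
    result = unified_recognizer.recognize(ink_strokes)   # step 1914
    if not result.is_text:                               # step 1916: got a shape
        return [result.value]                            # step 1920: output it
    keyword = result.value                               # a character was recognized
    matches = [shape for name, shape in shape_store.items()
               if keyword.lower() in name.lower()]       # step 1918: keyword search
    return matches                                       # step 1920: output results
```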
Step 1922 represents handling the user's next action, which in this example is ending the panel, inputting more ink, or changing modes. In this manner, the smart digital ink panel provides an integrated digital ink input experience, serving as a single platform for inputting different kinds of information into application programs.
Exemplary Operating Environment
The invention is operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known computing systems, environments, and/or configurations that may be suitable for use with the invention include, but are not limited to: personal computers, server computers, hand-held or laptop devices, tablet devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
The invention may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, and so forth, which perform particular tasks or implement particular abstract data types. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in local and/or remote computer storage media including memory storage devices.
With reference to FIG. 20, an example system for implementing various aspects of the invention includes a general purpose computing device in the form of a computer 2010. Components of the computer 2010 may include, but are not limited to, a processing unit 2020, a system memory 2030, and a system bus 2021 that couples various system components including the system memory to the processing unit 2020.
The computer 2010 typically includes a variety of computer-readable media. Computer-readable media can be any available media that can be accessed by the computer 2010 and includes both volatile and nonvolatile media, and removable and non-removable media. By way of example, and not limitation, computer-readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computer 2010. Communication media typically embodies computer-readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer-readable media.
The system memory 2030 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 2031 and random access memory (RAM) 2032. A basic input/output system 2033 (BIOS), containing the basic routines that help to transfer information between elements within the computer 2010, such as during start-up, is typically stored in ROM 2031. RAM 2032 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by the processing unit 2020. By way of example, and not limitation, FIG. 20 illustrates an operating system, application programs, other program modules and program data.
The computer 2010 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only, FIG. 20 illustrates a hard disk drive that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive that reads from or writes to a removable, nonvolatile magnetic disk, and an optical disk drive that reads from or writes to a removable, nonvolatile optical disk such as a CD-ROM or other optical media.
The drives and their associated computer storage media, described above and illustrated in FIG. 20, provide storage of computer-readable instructions, data structures, program modules and other data for the computer 2010.
The computer 2010 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 2080. The remote computer 2080 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 2010, although only a memory storage device 2081 has been illustrated in FIG. 20. The logical connections depicted in FIG. 20 include a local area network (LAN) 2071 and a wide area network (WAN) 2073, but may also include other networks.
When used in a LAN networking environment, the computer 2010 is connected to the LAN 2071 through a network interface or adapter 2070. When used in a WAN networking environment, the computer 2010 typically includes a modem 2072 or other means for establishing communications over the WAN 2073, such as the Internet. The modem 2072, which may be internal or external, may be connected to the system bus 2021 via the user input interface 2060 or other appropriate mechanism. A wireless networking component 2074, such as one comprising an interface and antenna, may be coupled through a suitable device such as an access point or peer computer to a WAN or LAN. In a networked environment, program modules depicted relative to the computer 2010, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation, FIG. 20 illustrates remote application programs as residing on the memory storage device 2081.
An auxiliary subsystem 2099 (e.g., for auxiliary display of content) may be connected via the user input interface 2060 to allow data such as program content, system status and event notifications to be provided to the user, even if the main portions of the computer system are in a low power state. The auxiliary subsystem 2099 may be connected to the modem 2072 and/or network interface 2070 to allow communication between these systems while the main processing unit 2020 is in a low power state.
While the invention is susceptible to various modifications and alternative constructions, certain illustrated embodiments thereof are shown in the drawings and have been described above in detail. It should be understood, however, that there is no intention to limit the invention to the specific forms disclosed, but on the contrary, the intention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention.