Utilization of temporal and spatial parameters to enhance the writing capability of an electronic device

Information

  • Patent Grant
  • Patent Number
    9,141,134
  • Date Filed
    Tuesday, May 31, 2011
  • Date Issued
    Tuesday, September 22, 2015
Abstract
The enhanced feature of this invention is the direction of a pen input on an electronic device to a separate canvas, where it can be separately manipulated and processed for use. Temporal and spatial analysis can ensure that each canvas contains pen strokes that spatially belong together, such as an individual word composed of multiple strokes, for easy manipulation of the word on the screen. Another related enhanced feature of the invention, for erasing portions of the inputted writing, is the process of generating a new curve with different points than the original curve.
Description
FIELD OF THE INVENTION

The invention relates generally to electronic devices. More particularly, the invention relates to methods and devices for manipulating ink strokes on touch-sensitive screens of an electronic device.


BACKGROUND OF THE INVENTION

In numerous electronic touch-sensitive devices, a pen or stylus can be used to input writing and display the writing on the screen. In some devices, this is implemented by processing co-ordinate streams from the user's manipulation of a pen near or touching a sensing apparatus attached to a display, and drawing lines or curves that approximate the pen's motion directly on the device's graphics surface. Other devices convert the co-ordinates produced by the motion of a pen into letters or words and then display these letters and words as typewritten text. In still other devices, the strokes are interpreted and categorized as simple geometric shapes, some processing is done to ‘clean up’ those shapes (e.g. ‘straightening’ a line, recognizing and adjusting circles and ovals, adjusting lines with an appropriate spatial relationship to form a triangle, rectangle, square, etc.), and these predefined shapes are displayed. Numerous methods also exist to erase writing on the screen.


The need still exists, however, for a better way to capture, display and edit writing on the screen. Manipulation of writing on the display and in storage could be more streamlined and efficient.


SUMMARY OF THE INVENTION

The enhanced feature of this invention is the direction of a pen input to a separate canvas, where it can be separately manipulated and processed for use. Temporal and spatial analysis can ensure that each canvas contains pen strokes that spatially belong together, such as an individual word composed of multiple strokes, for easy manipulation of the word on the screen. Another related enhanced feature of the invention, for erasing portions of the inputted writing, is the process of generating a new curve with different points than the original curve.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 shows a flowchart in accordance with the instant invention.



FIG. 2 shows an erasing process in accordance with the instant invention.



FIG. 3 shows an erasing process in accordance with the instant invention.





DETAILED DESCRIPTION OF THE INVENTION

The term canvas refers to a graphics surface. It is typically used as an object within a web page and can initially contain lines and curves when downloaded from the web server, or can be manipulated later locally using a scripting language built into the browser (most commonly JavaScript).
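
For concreteness, a minimal sketch of such a canvas object manipulated locally with JavaScript follows; the element id and the drawn curve are illustrative assumptions, not taken from the patent.

    // A minimal sketch of a canvas as a locally scriptable graphics surface.
    // Assumes a <canvas id="ink"> element in the page (illustrative only).
    const canvas = document.getElementById('ink');
    const ctx = canvas.getContext('2d');

    // Draw a simple curve on the graphics surface.
    ctx.beginPath();
    ctx.moveTo(20, 80);
    ctx.quadraticCurveTo(100, 10, 180, 80);
    ctx.stroke();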


In the instant invention, input pen events on a touch screen or similar device that accepts pen inputs are directed to a canvas object. The object can then be passed to another software component for processing, and the software component then performs filtering, curve-fitting, and similar processing techniques to determine a path the pen is following. The software component then determines and updates the canvas object with lines or curves that approximate the path that the pen took.
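
A rough sketch of this arrangement in JavaScript, assuming a browser canvas and the Pointer Events interface; the element id and the stand-in processing routine are illustrative, not the patent's actual interfaces.

    // Pen events are directed to a canvas, accumulated, and handed to a
    // separate processing routine. "processStroke" is an illustrative
    // stand-in for the filtering and curve-fitting component described above.
    const surface = document.getElementById('ink');
    const ctx = surface.getContext('2d');
    let points = [];

    function processStroke(pts) {
      // Stand-in for filtering/curve-fitting: approximate the path as drawn.
      if (pts.length < 2) return;
      ctx.beginPath();
      ctx.moveTo(pts[0].x, pts[0].y);
      for (const p of pts.slice(1)) ctx.lineTo(p.x, p.y);
      ctx.stroke();
    }

    surface.addEventListener('pointermove', (e) => {
      if (e.pressure > 0) points.push({ x: e.offsetX, y: e.offsetY });
    });
    surface.addEventListener('pointerup', () => {
      processStroke(points);   // update the canvas with the approximated path
      points = [];
    });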


The advantage of the invention derives from extending an existing type of object (the canvas object) with integrated pen support. This hybrid approach differs from conventional means of supporting pen input, where capturing, processing and drawing are all performed in the same software component.


Another novel feature of this device is the method of performing temporal and spatial analysis of the co-ordinate information to group related strokes together, as opposed to attempting higher-level analysis of the strokes to recognize handwriting or geometric figures. The benefit is that handwritten words or figures can be selected and manipulated (moved, copied/pasted) as a logical unit (e.g. a word, a diagram) rather than as individual strokes. A word like ‘little’ written cursively is typically three strokes, and without this ‘super stroke’ grouping, selecting and moving the first long cursive stroke would not move the dot over the ‘i’ or the cross over the ‘tt’. The benefits get progressively greater for printed and mixed printed/cursive handwriting.
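
One plausible form of such a grouping test, sketched in JavaScript; the thresholds and the bounding-box representation are illustrative assumptions, not values from the patent.

    // A new stroke joins the current group (e.g. a word) if it begins soon
    // enough after the previous stroke ended and close enough to the group.
    const MAX_GAP_MS  = 500;   // illustrative temporal threshold
    const MAX_DIST_PX = 40;    // illustrative spatial threshold

    function belongsToGroup(group, stroke) {
      const dt = stroke.startTime - group.lastEndTime;
      // Gap between the stroke's bounding box and the group's bounding box.
      const dx = Math.max(0, group.box.left - stroke.box.right,
                             stroke.box.left - group.box.right);
      const dy = Math.max(0, group.box.top - stroke.box.bottom,
                             stroke.box.top - group.box.bottom);
      return dt < MAX_GAP_MS && Math.hypot(dx, dy) < MAX_DIST_PX;
    }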


The flowchart shown in FIG. 1 illustrates how the super strokes are created and manipulated. The process begins with a master canvas that can accept input pen strokes. The first operation 2 is a read of the pen event. The first decision operation 4 is whether pen input is being received (the pen is down, or interacting with the screen). If the pen is down, the next decision operation 7 is whether the pen was down in the previous event. If the pen was down in the previous event, then operations 10 and 11 are executed and the pen coordinates are processed and inserted into the master canvas. The process then returns to operation 2.


If decision operation 4 was false and the pen is not down, the next decision operation 5 checks whether the pen was down in the previous event. If it was, operation 3, starting a timer, is executed and the process returns to operation 2. If the pen was not down at decision operation 5, then decision operation 6 determines whether the timer set in operation 3 has expired. If it has not expired, the process returns to operation 2. If it has expired, then operations 12, 13, 14, 15 and 16 are executed before the process returns to operation 2. Operations 12, 13, 14, 15 and 16 reset the timer and create a child canvas from the last set of curves from the master canvas.


If decision operation 4 was true but decision operation 7 was false, decision operation 8, which checks whether the timer has expired, is executed. If at decision operation 8 the timer has expired, operations 12, 13, 14, 15 and 16 are executed and the process returns to operation 2.


If, on the other hand, at decision operation 8 it is determined that the timer has not expired, decision operation 9, determining whether the new strokes are in close proximity to the previous ones, is executed. If they are, operations 10 and 11, processing the pen input and inserting or updating the curve in the master canvas, are executed and the process returns to operation 2. If instead at decision operation 9 it is determined that the new strokes are not in close proximity to the previous strokes, then operations 12, 13, 14, 15 and 16 are executed and the process returns to operation 2.
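
Under the assumption of a simple event interface, the loop of FIG. 1 can be sketched as follows; the `master` canvas methods, the event fields, and the timeout value are illustrative stand-ins, not the patent's actual interfaces.

    const GROUP_TIMEOUT_MS = 500;   // illustrative timeout for the operation 3 timer
    let penWasDown = false;
    let timerExpiry = null;

    // Operation 2: each pen event read from the device is handled here.
    function onPenEvent(e, master) {
      const timerExpired = timerExpiry !== null && e.time >= timerExpiry;
      if (e.penDown) {                                  // decision operation 4
        if (penWasDown) {                               // decision operation 7
          master.updateCurve(e.x, e.y);                 // operations 10-11
        } else if (timerExpired) {                      // decision operation 8
          splitToChildCanvas(master);                   // operations 12-16
        } else if (master.nearLastStrokes(e.x, e.y)) {  // decision operation 9
          master.updateCurve(e.x, e.y);                 // operations 10-11
        } else {
          splitToChildCanvas(master);                   // operations 12-16
        }
      } else {
        if (penWasDown) {                               // decision operation 5
          timerExpiry = e.time + GROUP_TIMEOUT_MS;      // operation 3: start timer
        } else if (timerExpired) {                      // decision operation 6
          splitToChildCanvas(master);                   // operations 12-16
        }
      }
      penWasDown = e.penDown;                           // then return to operation 2
    }

    // Operations 12-16: reset the timer and split the last set of curves
    // off into a child canvas inserted back into the master canvas.
    function splitToChildCanvas(master) {
      timerExpiry = null;
      master.insertChild(master.extractLastCurves());
    }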


In this way, child canvases are created that are based on temporal and spatial information. The child canvases make screen manipulation of the displayed input much easier and more efficient. For example, in this way, handwritten words or figures can be selected and manipulated (moved, copy/pasted) as a logical unit (e.g. a word, diagram) rather than individual strokes.


In many cases it is desired to edit the displayed canvases in a manner similar to using an eraser. This is typically done in one of two ways. There are several variants of systems that draw lines on the screen according to coordinates generated by a digitizer overlaying the display; in some cases, the coordinates are filtered and curves are fitted to them. The ‘vector’ form of the data (i.e. the series of control points that determine the curve) is retained, and portions of the curve can be manipulated by adjusting ‘control points’ or erased by cutting the curve at specific points. Alternatively, the curve can be converted to a bitmap, and erasing can be done by pixel-level tools that affect the pixels under the eraser area, irrespective of the means originally used to create the curve.


In the present invention, the improvement is to use an eraser tool on a curve or path wherein the area being erased is determined by a geometric shape (e.g. a circle) that tracks the pointing device (finger, stylus, pen, etc.). The path is maintained in a vector format, not flattened to a bitmap. The area erased is determined by calculating the intersection(s) between the eraser shape and the path, inserting (or relocating) points at the intersections, and removing the line segment(s) and control points that fall within the eraser shape. This process is illustrated in FIG. 2 and FIG. 3.
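
A simplified sketch of this erase step in JavaScript, operating on a path represented as a list of points with a circular eraser; the crossing is located by bisection for brevity, whereas a fuller implementation would intersect the eraser with the fitted curve analytically, as described below.

    // Remove the points of `path` inside a circular eraser, inserting
    // synthetic points at the boundary crossings; returns surviving subpaths
    // (a bisected path becomes two subpaths).
    function erase(path, cx, cy, r) {
      const inside = (p) => Math.hypot(p.x - cx, p.y - cy) < r;
      const subpaths = [];
      let current = [];
      for (let i = 0; i < path.length; i++) {
        const p = path[i], q = path[i + 1];
        if (!inside(p)) current.push(p);
        if (q && inside(p) !== inside(q)) {
          // Locate the crossing on segment p-q by bisection.
          let a = p, b = q;
          for (let k = 0; k < 24; k++) {
            const m = { x: (a.x + b.x) / 2, y: (a.y + b.y) / 2 };
            if (inside(m) === inside(a)) a = m; else b = m;
          }
          current.push({ x: a.x, y: a.y, synthetic: true });  // inserted, not recorded
          if (!inside(p)) {          // path enters the eraser: close this subpath
            subpaths.push(current);
            current = [];
          }
        }
      }
      if (current.length) subpaths.push(current);
      return subpaths;
    }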


As can be seen in FIG. 2 and FIG. 3, the eraser tool (1) intersects a given curve with control points (2). As a result, the curve has some of its control points deleted and new points (3) inserted. In particular, in one embodiment, the eraser tool affects a curve that has been fitted to the recorded co-ordinates, not the line segments connecting them. For example, the points used to fit a new curve do not necessarily coincide with the original recorded points, or even lie on a straight segment connecting the original recorded points. A segment of the curve is removed when the area covered by the eraser shape intersects it. New co-ordinates are added at the points of intersection of the eraser shape and the fitted curve, so that the curve is preserved. Inserting points at the intersection of the eraser shape and the line segment(s) would alter the fitted curve, in some cases drastically.


A curve-fitting algorithm treats endpoints specially. Because removing a segment of the curve with the eraser shape will often create one or two new endpoints (depending on whether the curve was trimmed or bisected, respectively), the algorithm may choose to insert more than the one new co-ordinate at the point of intersection. One or more points may be inserted or moved between the new co-ordinate at the point of intersection and the nearest recorded co-ordinate, to preserve the original shape of the fitted curve in the absence of the original co-ordinates now erased. Since recorded co-ordinates represent ‘original data’, recorded co-ordinates are only ever deleted (i.e. if the eraser shape occludes them). Inserted points are marked as being synthetic, and the erasing algorithm may insert and move these freely as the eraser shape moves and progressively clips larger amounts of the line segment.
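
As a usage note on the erase() sketch above, re-applying the step as the eraser advances illustrates this behavior: recorded points are deleted once the circle covers them, while the synthetic boundary points are re-derived at each step and so effectively track the eraser.

    // Illustrative only: an eraser of radius 15 sweeping right along a
    // horizontal stroke. The synthetic boundary point moves with the eraser;
    // the recorded point at x=50 is simply deleted.
    let subpaths = [[{ x: 0, y: 50 }, { x: 50, y: 50 },
                     { x: 100, y: 50 }, { x: 150, y: 50 }]];
    for (const cx of [40, 60, 80]) {
      subpaths = subpaths.flatMap((sp) => erase(sp, cx, 50, 15));
    }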

Claims
  • 1. An electronic device implemented method for receiving writing input, comprising: receiving, via a first application operating on the electronic device, a pen input onto a master canvas of the first application; creating, by the first application, a plurality of child canvases based on spatial and temporal information associated with the pen input, wherein each of the plurality of child canvases corresponds to one of a plurality of time periods defined by the temporal information, and the pen input made onto the master canvas during each time period of the plurality of time periods being inserted into a corresponding one of the plurality of child canvases after each time period of the plurality of time periods elapses, wherein the plurality of child canvases are separately processed by second one or more applications supporting the first application, and the plurality of child canvases enable individual manipulation of all the pen input made during the corresponding one of the plurality of time periods within the corresponding child canvas within the first application; and inserting, by the first application, the plurality of child canvases into the master canvas.
  • 2. The method of claim 1, wherein the plurality of child canvases are smaller in area than a master canvas that receives the pen input.
  • 3. The method of claim 2 wherein at least one of the plurality of child canvases encloses a single word.
  • 4. The method of claim 1, further comprising: receiving, by the electronic device, an eraser input that intersects a curve, wherein the curve is determined by a series of control points; determining, by the electronic device, a shape of an erasing contour of the eraser input; identifying, by the electronic device, based on the eraser input, one or more new control points that did not previously exist on the curve; and modifying, by the electronic device, the curve based on the erasing contour and the new control points.
  • 5. The method of claim 4 wherein at least one of the new control points is between a point at which the eraser input intersects the curve and a nearest control point of the series of control points.
  • 6. The method of claim 4, wherein the one or more new control points do not lie on a straight segment connecting two control points of the series of control points.
  • 7. The method of claim 4, further comprising deleting, by the electronic device, one or more of the series of control points based on the eraser input.
  • 8. The method of claim 4, wherein the one or more new control points are marked as being synthetic, and wherein the one or more new control points move in response to receiving additional eraser input.
  • 9. The method of claim 1, wherein the spatial information is based on a proximity of pen strokes received as the pen input.
  • 10. The method of claim 1, wherein the temporal information is based on the elapsing of a period of time between pen strokes received as the pen input.
  • 11. A non-transitory computer readable medium having a plurality of instructions stored thereon, which, when executed by a processor of a computing device cause the computing device to: receive, via a first application of the computing device, a pen input on a master canvas of the first application; create, via the first application, a plurality of child canvases based on spatial and temporal information associated with the pen input, wherein each of the plurality of child canvases corresponds to one of a plurality of time periods defined by the temporal information, and the pen input made onto the master canvas during each time period of the plurality of time periods being inserted into a corresponding one of the plurality of child canvases after each time period of the plurality of time periods elapses, wherein the plurality of child canvases are separately processed by second one or more applications that support the first application, and the plurality of child canvases enable individual manipulation of all the pen input made during the corresponding one of the plurality of time periods within the corresponding child canvases within the first application; and insert, by the first application, the plurality of child canvases into the master canvas.
  • 12. The non-transitory computer readable medium of claim 11, wherein the computing device is further caused to: receive an eraser input that intersects a curve, wherein the curve is determined by a series of control points; determine a shape of an erasing contour associated with the eraser input; identify, based on the eraser input, one or more new control points that did not previously exist on the curve; and modify the curve based on the erasing contour and the new control points.
  • 13. The non-transitory computer readable medium of claim 12, wherein at least one of the new control points is between a point at which the eraser input intersects the curve and a nearest control point of the series of control points.
  • 14. The computer-readable medium of claim 12, wherein the one or more new control points do not lie on a straight segment connecting two control points of the series of control points.
  • 15. The computer-readable medium of claim 12, wherein the instructions, when executed by the processor, further cause the computing device to delete one or more of the series of control points based on the eraser input.
  • 16. The computer-readable medium of claim 12, wherein the instructions, when executed by the processor, further cause the computing device to: mark the one or more new control points as being synthetic; and move the one or more new control points in response to receiving additional eraser input.
  • 17. The non-transitory computer readable medium of claim 11 wherein the plurality of child canvases are smaller in area than the master canvas that receives the pen input.
  • 18. The non-transitory computer readable medium of claim 17 wherein at least one of the plurality of child canvases encloses a single word.
  • 19. The computer-readable medium of claim 11, wherein the spatial information is based on a proximity of pen strokes received as the pen input.
  • 20. The computer-readable medium of claim 11, wherein the temporal information is based on the elapsing of a period of time between pen strokes received as the pen input.
CROSS REFERENCE TO RELATED APPLICATIONS

This application claims priority to U.S. Provisional Patent Application 61/396,789 filed Jun. 1, 2010, entitled “Electronic Device for Education”, the contents of which are incorporated herein by reference.

US Referenced Citations (190)
Number Name Date Kind
3132911 Heidler May 1964 A
4163303 Hanna Aug 1979 A
4619304 Smith Oct 1986 A
4633436 Flurry Dec 1986 A
4821373 Maidment et al. Apr 1989 A
5355555 Zarelius Oct 1994 A
5461581 Hallwirth et al. Oct 1995 A
5610825 Johnson et al. Mar 1997 A
5714971 Shalit et al. Feb 1998 A
5819032 de Vries et al. Oct 1998 A
5870552 Dozier et al. Feb 1999 A
5870559 Leshem et al. Feb 1999 A
5893899 Johnson et al. Apr 1999 A
5920864 Zhao Jul 1999 A
5958008 Pogrebisky et al. Sep 1999 A
5987704 Tang Nov 1999 A
6037937 Beaton et al. Mar 2000 A
6094197 Buxton et al. Jul 2000 A
6138072 Nagai Oct 2000 A
6144962 Weinberg et al. Nov 2000 A
6157381 Bates et al. Dec 2000 A
6168341 Chene et al. Jan 2001 B1
6237006 Weinberg et al. May 2001 B1
6288704 Flack et al. Sep 2001 B1
6292188 Carlson et al. Sep 2001 B1
6333994 Perrone et al. Dec 2001 B1
6340967 Maxted Jan 2002 B1
6377249 Mumford Apr 2002 B1
6411302 Chiraz Jun 2002 B1
6466220 Cesana et al. Oct 2002 B1
6493464 Hawkins et al. Dec 2002 B1
6537103 Jamison Mar 2003 B2
6647145 Gay Nov 2003 B1
6697524 Arai et al. Feb 2004 B1
7032187 Keely et al. Apr 2006 B2
7100119 Keely et al. Aug 2006 B2
7158678 Nagel et al. Jan 2007 B2
7167585 Gounares et al. Jan 2007 B2
7168035 Bell et al. Jan 2007 B1
7251413 Dow et al. Jul 2007 B2
7425103 Perez-Sanchez Sep 2008 B2
7427984 Smirnov et al. Sep 2008 B2
7450114 Anwar Nov 2008 B2
7477205 de Waal et al. Jan 2009 B1
7480858 Chen et al. Jan 2009 B2
7551312 Hull et al. Jun 2009 B1
7564995 Yaeger et al. Jul 2009 B1
7567239 Seni Jul 2009 B2
7576730 Anwar Aug 2009 B2
7689928 Gilra Mar 2010 B1
7735104 Dow et al. Jun 2010 B2
7757184 Martin et al. Jul 2010 B2
7774358 Tamas et al. Aug 2010 B2
7873243 Cohen et al. Jan 2011 B2
7886233 Rainisto et al. Feb 2011 B2
7889186 Nishimura et al. Feb 2011 B2
7890919 Williams Feb 2011 B1
8140560 Dinn Mar 2012 B2
8155498 Dow et al. Apr 2012 B2
8200796 Margulis Jun 2012 B1
8340476 Cohen et al. Dec 2012 B2
8407606 Davidson et al. Mar 2013 B1
8510677 van Os Aug 2013 B2
8576222 Handley et al. Nov 2013 B2
8599174 Cohen et al. Dec 2013 B2
8610672 Kun et al. Dec 2013 B2
8749480 Cohen et al. Jun 2014 B2
20010005207 Muikaichi et al. Jun 2001 A1
20020011990 Anwar Jan 2002 A1
20020024506 Flack et al. Feb 2002 A1
20020067319 Hensel Jun 2002 A1
20020080195 Carlson et al. Jun 2002 A1
20020097910 Guha Jul 2002 A1
20020109668 Rosenberg et al. Aug 2002 A1
20020113823 Card et al. Aug 2002 A1
20020133906 Fedon Sep 2002 A1
20030028851 Leung et al. Feb 2003 A1
20030030852 Sampson et al. Feb 2003 A1
20030202772 Dow et al. Oct 2003 A1
20030202773 Dow et al. Oct 2003 A1
20030214491 Keely et al. Nov 2003 A1
20040080498 Fujiwara et al. Apr 2004 A1
20040194014 Anwar Sep 2004 A1
20040196255 Cheng Oct 2004 A1
20040221311 Dow et al. Nov 2004 A1
20040257369 Fang Dec 2004 A1
20050010871 Ruthfield et al. Jan 2005 A1
20050051350 Porter et al. Mar 2005 A1
20050052427 Wu et al. Mar 2005 A1
20050078098 Dresevic et al. Apr 2005 A1
20050079477 Diesel et al. Apr 2005 A1
20050162413 Dresevic et al. Jul 2005 A1
20050183031 Onslow Aug 2005 A1
20060028457 Burns Feb 2006 A1
20060061551 Fateh Mar 2006 A1
20060152496 Knaven Jul 2006 A1
20060159345 Clary et al. Jul 2006 A1
20060184901 Dietz Aug 2006 A1
20060239505 Bjorklund et al. Oct 2006 A1
20060244738 Nishimura et al. Nov 2006 A1
20060253493 Tamas et al. Nov 2006 A1
20060256139 Gikandi Nov 2006 A1
20060274086 Forstall et al. Dec 2006 A1
20060277460 Forstall et al. Dec 2006 A1
20060284851 Pittman Dec 2006 A1
20060294466 Muller et al. Dec 2006 A1
20070061707 Sally et al. Mar 2007 A1
20070094267 Good et al. Apr 2007 A1
20070132763 Chu et al. Jun 2007 A1
20070180397 Hoyer et al. Aug 2007 A1
20070180471 Unz Aug 2007 A1
20070247445 Lynggaard et al. Oct 2007 A1
20070256031 Martin et al. Nov 2007 A1
20070291017 Syeda-Mahmood et al. Dec 2007 A1
20080076472 Hyatt Mar 2008 A1
20080078055 Estlander Apr 2008 A1
20080150946 Kuo Jun 2008 A1
20080165255 Christie et al. Jul 2008 A1
20080180409 Matsuda Jul 2008 A1
20080219556 Han et al. Sep 2008 A1
20080243808 Reiman et al. Oct 2008 A1
20080296074 Hollstron et al. Dec 2008 A1
20090015793 Suzuki et al. Jan 2009 A1
20090021493 Marggraff et al. Jan 2009 A1
20090021494 Marggraff et al. Jan 2009 A1
20090021495 Edgecomb et al. Jan 2009 A1
20090044236 Bendiabdallah et al. Feb 2009 A1
20090052778 Edgecomb et al. Feb 2009 A1
20090063960 Anwar Mar 2009 A1
20090083618 Campbell Mar 2009 A1
20090083655 Beharie et al. Mar 2009 A1
20090100380 Gardner et al. Apr 2009 A1
20090119365 Tomic May 2009 A1
20090161958 Markiewicz et al. Jun 2009 A1
20090184972 Weybrew et al. Jul 2009 A1
20090198132 Pelissier et al. Aug 2009 A1
20090199123 Albertson et al. Aug 2009 A1
20090202112 Nielsen et al. Aug 2009 A1
20090204663 Patwari Aug 2009 A1
20090213085 Zhen et al. Aug 2009 A1
20090241054 Hendricks Sep 2009 A1
20090251441 Edgecomb et al. Oct 2009 A1
20090253107 Marggraff Oct 2009 A1
20090267923 Van Schaack et al. Oct 2009 A1
20090304281 Yipu Dec 2009 A1
20090324082 Liu et al. Dec 2009 A1
20100054845 Marggraff et al. Mar 2010 A1
20100077059 Shen Mar 2010 A1
20100077343 Uhl et al. Mar 2010 A1
20100097331 Wu Apr 2010 A1
20100104269 Prestenback et al. Apr 2010 A1
20100138875 Johnson et al. Jun 2010 A1
20100161653 Krasnow Jun 2010 A1
20100175018 Petschnigg et al. Jul 2010 A1
20100177047 Brenneman et al. Jul 2010 A1
20100185948 Anwar Jul 2010 A1
20100185975 Anwar Jul 2010 A1
20100192062 Anwar Jul 2010 A1
20100210332 Imai Aug 2010 A1
20100211866 Nicholas et al. Aug 2010 A1
20100245295 Kimpara Sep 2010 A1
20100259494 Kii Oct 2010 A1
20100278504 Lyons et al. Nov 2010 A1
20100281372 Lyons et al. Nov 2010 A1
20100281384 Lyons et al. Nov 2010 A1
20100289820 Hoyer et al. Nov 2010 A1
20100309131 Clary Dec 2010 A1
20100315266 Gunawardana et al. Dec 2010 A1
20110018821 Kii Jan 2011 A1
20110066965 Choi Mar 2011 A1
20110090155 Caskey et al. Apr 2011 A1
20110122081 Kushler May 2011 A1
20110145724 Tsai et al. Jun 2011 A1
20110148892 Shreiner et al. Jun 2011 A1
20110167369 van Os Jul 2011 A1
20110185318 Hinckley et al. Jul 2011 A1
20110191719 Hinckley et al. Aug 2011 A1
20110199297 Antonyuk et al. Aug 2011 A1
20110202856 Handley et al. Aug 2011 A1
20110209058 Hinckley et al. Aug 2011 A1
20110261060 Waibel et al. Oct 2011 A1
20110289444 Winsky Nov 2011 A1
20110292042 Vaganov Dec 2011 A1
20110296344 Habib et al. Dec 2011 A1
20110320950 Rajput et al. Dec 2011 A1
20120023433 Choi et al. Jan 2012 A1
20120032886 Ciesla et al. Feb 2012 A1
20120036468 Colley Feb 2012 A1
20120090135 Soh Apr 2012 A1
20120144283 Hill et al. Jun 2012 A1
Non-Patent Literature Citations (14)
Entry
Office Action mailed Jun. 6, 2014 for U.S. Appl. No. 12/964,660, 15 pages.
Final Office Action mailed Oct. 17, 2014 for U.S. Appl. No. 12/964,660, 51 pages.
Office Action mailed Nov. 19, 2012 for U.S. Appl. No. 13/117,080, 22 pages.
Final Office Action mailed Mar. 5, 2013 for U.S. Appl. No. 13/117,080, 26 pages.
Office Action mailed Dec. 19, 2013 for U.S. Appl. No. 13/117,080, 23 pages.
Final Office Action mailed Apr. 2, 2014 for U.S. Appl. No. 13/117,080, 20 pages.
Office Action mailed Apr. 15, 2013 for U.S. Appl. No. 13/117,087, 11 pages.
Final Office Action mailed Aug. 2, 2013 for U.S. Appl. No. 13/117,087, 7 pages.
Office Action mailed Aug. 5, 2014 for U.S. Appl. No. 13/117,087, 10 pages.
Office Action mailed Dec. 28, 2012 for U.S. Appl. No. 13/149,887, 9 pages.
Final Office Action mailed May 22, 2013 for U.S. Appl. No. 13/149,887, 11 pages.
Song Ho Ahn, “OpenGL Frame Buffer Object (FBO)”, 2008, http://wayback.archive.org/web/20080822025141/http://ww.songho.ca/opengl/gl_fbo.html.
Microsoft Word—Split Function, Mar. 23, 2014, 5 pages.
Final Office Action mailed Nov. 26, 2014 for U.S. Appl. No. 13/117,087, 9 pages.
Related Publications (1)
Number Date Country
20120200540 A1 Aug 2012 US
Provisional Applications (1)
Number Date Country
61396789 Jun 2010 US