Claims
- 1. A computer generated user interface for accepting handwritten ink comprising:
a first region for accepting handwritten ink associated with a pen; a second region whose contents are controlled by configuring an object associated with said user interface.
- 2. The user interface according to claim 1, wherein said object controls the placement of said user interface near an insertion point in an insertion region.
- 3. The user interface according to claim 1, wherein said handwritten ink is sent to an insertion point upon said pen moving away from said user interface.
- 4. The user interface according to claim 3, wherein said pen moves at least a predetermined distance from a surface on which said user interface is displayed before said handwritten ink is sent to said insertion point.
- 5. The user interface according to claim 1, further comprising:
a third region that, upon interaction, sends said handwritten ink in said first region to an insertion point in an insertion region.
- 6. The user interface according to claim 1, wherein said object is modifiable by a developer.
- 7. The user interface according to claim 2, wherein said insertion region is associated with an edit control.
- 8. The user interface according to claim 7, wherein the edit control specifies the offset of a panel housing said first and second regions from said insertion region.
- 9. The user interface according to claim 8, wherein said offset includes a vertical offset.
- 10. The user interface according to claim 8, wherein said offset includes a horizontal offset.
- 11. The user interface according to claim 3, wherein said handwritten ink is inserted as ink in said insertion region.
- 12. The user interface according to claim 3, wherein said handwritten ink is converted into text and said text is inserted into said insertion region.
- 13. The user interface according to claim 8, wherein the offset is a default offset.
- 14. The user interface according to claim 8, wherein the offset is specific to said edit control.
- 15. The user interface according to claim 8, wherein the offset is definable by a developer.
- 16. A computer display window having an upper region and a lower region comprising:
an insertion point active in an insertion region; a handwriting capture panel including a first region for capturing handwritten ink; wherein said handwriting capture panel is displayed above said insertion region when said insertion region is in said lower region and wherein said handwriting capture panel is displayed below said insertion region when said insertion region is in said upper region.
- 17. The computer display window according to claim 16, wherein a separation between said upper region and said lower region is definable.
- 18. A method of controlling a user interface, comprising:
receiving information from a software application; and invoking a user interface based upon the received information.
- 19. The method recited in claim 18, further comprising:
invoking the user interface to appear at a specified position based upon the received information.
- 20. The method recited in claim 18, further comprising:
invoking the user interface to include specified features based upon the received information.
- 21. The method recited in claim 18, further comprising:
invoking the user interface to include specified features and to appear at a specified position based upon the received information.
- 22. The method recited in claim 18, wherein the information identifies a control.
- 23. The method recited in claim 22, wherein the information further includes position information for the control.
- 24. The method recited in claim 23, further comprising:
invoking the user interface to appear at a position proximal to the control.
- 25. The method recited in claim 23, further comprising:
invoking the user interface to appear at a position vertically offset from a boundary of the control.
- 26. The method recited in claim 23, further comprising:
invoking the user interface to appear at a position horizontally offset from a boundary of the control.
- 27. The method recited in claim 23, further comprising:
invoking the user interface to appear at a position horizontally and vertically offset from a boundary of the control.
- 28. The method recited in claim 23, further comprising:
invoking the user interface to have a specified boundary size based upon a position of the control.
- 29. The method recited in claim 22, wherein the information further includes size information for a boundary of the control.
- 30. The method recited in claim 29, further comprising:
invoking the user interface to appear within the control based upon the size information for the boundary of the control.
- 31. The method recited in claim 30, further comprising:
invoking the user interface to appear at a position vertically offset from a boundary of the control.
- 32. The method recited in claim 30, further comprising:
invoking the user interface to appear at a position horizontally offset from a boundary of the control.
- 33. The method recited in claim 22, wherein the information further includes a boundary of a work area.
- 34. The method recited in claim 33, further comprising:
invoking the user interface to appear at a specified position relative to the control and the boundary of the work area.
- 35. The method recited in claim 23, wherein the information further includes position information for a second control.
- 36. The method recited in claim 35, further comprising:
invoking the user interface to appear at a specified position relative to the first control and the second control.
- 37. The method recited in claim 18, wherein the information identifies a type of control.
- 38. The method recited in claim 37, further comprising:
invoking the user interface to include an input surface corresponding to the identified type of control.
- 39. The method recited in claim 38, further comprising:
invoking the user interface to include a keyboard input surface.
- 40. The method recited in claim 38, further comprising:
invoking the user interface to include a writing input surface.
- 41. The method recited in claim 38, further comprising:
invoking the user interface to include an Asian language writing input surface.
- 42. The method recited in claim 18, further comprising:
receiving the information at a utility object; and in response to receiving the information, having the utility object control an application programming interface to invoke the user interface.
- 43. The method recited in claim 18, further comprising:
employing an invocation technique for invoking the user interface based upon the received information.
- 44. The method recited in claim 43, wherein the user interface is invoked in response to activation of a target.
- 45. The method recited in claim 44, wherein the target is activated when a pointing device moves proximal to the target.
- 46. The method recited in claim 44, wherein the target is activated when a pointing device moves over the target.
- 47. The method recited in claim 44, wherein the target is activated when a pointing device is held over the target for a predetermined period of time.
- 48. The method recited in claim 18, further comprising employing a dismissal technique for dismissing the user interface based upon the received information.
- 49. A computing environment, comprising:
an application having at least one control for receiving input data; a user interface for entering data into the at least one control; and a utility object that receives information from the application regarding the at least one control, and controls the user interface in response to the received information.
- 50. The computing environment recited in claim 49, wherein:
the application has a plurality of controls; and the utility object receives information from the application regarding each of the plurality of controls.
- 51. The computing environment recited in claim 49, wherein:
the application includes a second control; and further comprising a second utility object that receives information from the application regarding the second control, and controls the user interface in response to the received information regarding the second control.
- 52. The computing environment recited in claim 49, further comprising:
a component shared among the at least one control for controlling the user interface in response to instructions from the utility object.
- 53. The computing environment recited in claim 52, wherein the shared component receives data from the user interface and routes the data into the at least one control.
- 54. The computing environment recited in claim 52, further including a proxy component that relays instructions from the utility object to the shared component.
- 55. The computing environment recited in claim 54, wherein
the shared component includes a message queue that receives and stores messages; and the proxy component relays instructions from the utility object to the message queue of the shared component, such that the shared component retrieves and responds to the instructions relayed to the message queue in the order in which the instructions are received at the message queue.
- 56. The computing environment recited in claim 52, further comprising:
a plurality of utility objects; and a proxy component corresponding to each of the plurality of utility objects for relaying instructions from the utility objects to the shared component.
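Claims 49-56 recite an architecture in which a utility object receives control information from an application and directs a user interface through a proxy component that relays instructions to a shared component holding a FIFO message queue. A minimal sketch of that arrangement follows; all class and method names are illustrative assumptions, since the claims do not specify an implementation:

```python
import queue


class SharedComponent:
    """Shared singleton (claims 52-55): holds a message queue and handles
    instructions in the order they were received."""

    def __init__(self):
        self.messages = queue.Queue()  # FIFO ordering, per claim 55

    def post(self, instruction):
        self.messages.put(instruction)

    def process_pending(self, ui):
        # Retrieve and respond to queued instructions in arrival order.
        while not self.messages.empty():
            ui.apply(self.messages.get())


class Proxy:
    """Relays instructions from a utility object to the shared
    component's message queue (claims 54-56)."""

    def __init__(self, shared):
        self.shared = shared

    def relay(self, instruction):
        self.shared.post(instruction)


class UtilityObject:
    """Receives control information from the application and controls
    the user interface through the proxy (claims 49 and 54)."""

    def __init__(self, proxy):
        self.proxy = proxy

    def on_control_info(self, control_info):
        # E.g. invoke the input panel at an offset from the control
        # (claims 23-27); the tuple format here is a placeholder.
        self.proxy.relay(("invoke_ui", control_info))
```

Routing every instruction through the shared singleton's queue is what gives claim 55 its ordering guarantee when several utility objects (one per control, as in claim 56) target the same user interface.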
RELATED APPLICATIONS
[0001] This application is a continuation-in-part application of copending U.S. patent application Ser. No. 10/356,315, filed on Jan. 31, 2003, entitled “Utility Object For Specialized Data Entry,” and naming Kyril Feldman et al. as inventors, which copending application is incorporated entirely herein by reference. This application also is a continuation-in-part application of copending U.S. Provisional Patent Application No. 60/444,444, filed on Feb. 1, 2003, entitled “Synchronous Data Exchange Between In-Process Components And A Shared Singleton,” and naming Tobias Zielinski et al. as inventors, which copending application also is incorporated entirely herein by reference.
Provisional Applications (2)
| Number   | Date     | Country |
|----------|----------|---------|
| 60444444 | Feb 2003 | US      |
| 60453701 | Jan 2003 | US      |