Fonts with feelings

Abstract
Various technologies and techniques are disclosed that improve the instructional nature of fonts and/or the ability to create instructional fonts. Font characters are modified based on user interaction to enhance the user's understanding of and/or fluency with a word. The font characters can have sound, motion, and altered appearance. When altering the appearance of the characters, the system operates on a set of control points associated with the characters, changes the position of a portion of the control points, and changes the influence of that portion on a set of respective spline curves. A designer or other user can customize the fonts and user experience by creating an episode package that specifies the words to include in the user interface and details about the actions to take when certain events fire. The episode package can include media effects to play when a particular event associated with the media effect occurs.
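By way of illustration only, the character-distortion idea above can be sketched in a few lines of code: a glyph is treated as a list of spline control points, and a sinusoidal displacement (the kind of transformation FIG. 11 contemplates) shifts each point, which in turn changes that portion's influence on the rendered curve. The names and parameters below (ControlPoint, Glyph, amplitude, wavelength) are illustrative assumptions, not the claimed implementation.

```python
import math
from dataclasses import dataclass
from typing import List


@dataclass
class ControlPoint:
    x: float
    y: float
    weight: float = 1.0   # influence this point exerts on its spline curve


@dataclass
class Glyph:
    char: str
    points: List[ControlPoint]


def apply_sinusoid(glyph: Glyph, amplitude: float = 3.0,
                   wavelength: float = 40.0, phase: float = 0.0) -> Glyph:
    """Return a copy of the glyph with a sinusoidal vertical displacement.

    Each control point is shifted in y, which changes how strongly that
    portion of the character pulls on the rendered spline.
    """
    shifted = [
        ControlPoint(
            p.x,
            p.y + amplitude * math.sin(2 * math.pi * p.x / wavelength + phase),
            p.weight,
        )
        for p in glyph.points
    ]
    return Glyph(glyph.char, shifted)


# Example: advance the phase on successive frames to make a word "wiggle"
# while a related sound plays.
glyph = Glyph("a", [ControlPoint(float(x), 0.0) for x in range(0, 100, 10)])
frame = apply_sinusoid(glyph, phase=0.5)
```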
Description

BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a diagrammatic view of the properties available in an application operating on a computer system of one implementation.



FIG. 2 is a diagrammatic view of a computer system of one implementation.



FIG. 3 is a high-level process flow diagram for one implementation of the system of FIG. 2.



FIGS. 4-8 are diagrams for one implementation of the system of FIG. 2 illustrating program logic that executes at the appropriate times to implement one or more techniques for the system.



FIG. 9 is a process flow diagram for one implementation of the system of FIG. 2 illustrating the stages involved in tracing characters.



FIG. 10 is a diagrammatic view for one implementation of the system of FIG. 2 illustrating the components that allow customization of the user experience.



FIG. 11 is a diagram for one implementation of the system of FIG. 2 illustrating an exemplary sinusoid transformation.



FIG. 12 is a process flow diagram for one implementation of the system of FIG. 2 illustrating the stages involved in customizing the user experience using content in external files.



FIG. 13 is a process flow diagram for one implementation of the system of FIG. 2 illustrating the stages involved in resizing the words based on user interaction.



FIGS. 14-17 are simulated screens for one implementation of the system of FIG. 2 illustrating variations of the sizing and placement of words based on user interaction.



FIG. 18 is a diagrammatic view for one implementation of the system of FIG. 2 illustrating content being authored by multiple authors and used in customizing the user experience.



FIG. 19 is a process flow diagram for one implementation of the system of FIG. 2 illustrating the stages involved in customizing and processing events.



FIG. 20 is a diagram illustrating a sample text file for one implementation of the system of FIG. 2.



FIG. 21 is a process flow diagram for one implementation of the system of FIG. 2 illustrating the stages involved in processing programmatic events.



FIG. 22 is a logical diagram for one implementation of the system of FIG. 2 and the stages of FIG. 21 illustrating an exemplary programmatic event being processed.



FIG. 23 is a process flow diagram for one implementation of the system of FIG. 2 illustrating the stages involved in processing hover events.



FIG. 24 is a logical diagram for one implementation of the system of FIG. 2 and the stages of FIG. 23 illustrating an exemplary hover event being processed.



FIG. 25 is a process flow diagram for one implementation of the system of FIG. 2 illustrating the stages involved in processing a speech event.



FIG. 26 is a logical diagram for one implementation of the system of FIG. 2 and the stages of FIG. 25 illustrating an exemplary speech event being processed.



FIG. 27 is a logical diagram for one implementation of the system of FIG. 2 illustrating an exemplary event being processed for a comic.



FIG. 28 is a process flow diagram for one implementation of the system of FIG. 2 illustrating the stages involved in helping a user understand a word.



FIG. 29 is a process flow diagram for one implementation of the system of FIG. 2 illustrating the stages involved in helping a user reinforce an understanding they already have of a word.



FIG. 30 is a process flow diagram for one implementation of the system of FIG. 2 illustrating the stages involved in setting some exemplary types of tags in the settings files for customizing the actions.


Claims
  • 1. A computer-readable medium having computer-executable instructions for causing a computer to perform steps comprising: reading a plurality of words in at least one font from a file; displaying the plurality of words in a user interface of a program that aids a word understanding of a user; and upon receiving input from the user to interact with at least a portion of the plurality of words, performing at least one action associated with the portion of the words, the action being specified in the file.
  • 2. The computer-readable medium of claim 1, wherein the file is a text file.
  • 3. The computer-readable medium of claim 1, wherein the file is a database.
  • 4. The computer-readable medium of claim 1, wherein the action is rendering a media effect.
  • 5. The computer-readable medium of claim 1, wherein the action is distorting the portion of words.
  • 6. A computer-readable medium having computer-executable instructions for causing a computer to perform steps comprising: reading a plurality of words in at least one font from a data store; displaying the plurality of words in a user interface; detecting an event being fired based upon an interaction with at least a portion of the words by a user; reading the data store to locate an event tag associated with the event, the event tag identifying a media effect to be played for the event; and playing the media effect.
  • 7. The computer-readable medium of claim 6, further comprising the step of: altering the portion of words while the media effect is played.
  • 8. The computer-readable medium of claim 7, wherein the altering step includes stretching the portion of words.
  • 9. The computer-readable medium of claim 7, wherein the altering step includes distorting the portion of words.
  • 10. The computer-readable medium of claim 6, wherein the event is a selection event that fires when the user selects the portion of words.
  • 11. The computer-readable medium of claim 6, wherein the event is a hover event that fires when the user hovers an input device over the portion of words.
  • 12. The computer-readable medium of claim 6, wherein the event is a spoken command event that fires when the user speaks a sound related to the portion of words into a microphone.
  • 13. A method for providing customizable fonts comprising: reading at least one data store to determine at least some words to display on a user interface; detecting a particular event as a user interacts with at least a portion of the words; performing an action specified in the data store for the particular event when the particular event occurs; wherein the data store includes at least one media effect; wherein the action involves playing the media effect; and wherein the media effect is designed to aid an understanding of the portion of words.
  • 14. The method of claim 13, wherein the data store is a text file.
  • 15. The method of claim 13, wherein the data store is an episode package.
  • 16. The method of claim 15, wherein the episode package is supplied by at least one font designer.
  • 17. The method of claim 13, wherein the event is a stylus event.
  • 18. The method of claim 13, wherein the event is a speech event.
  • 19. The method of claim 13, further comprising: altering an appearance of the portion of words while the media effect is playing.
  • 20. A computer-readable medium having computer-executable instructions for causing a computer to perform the steps recited in claim 13.
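As a non-limiting sketch of the flow recited in claims 6 and 13, the fragment below assumes a hypothetical episode data store that maps words to event tags naming media effects; play_media and handle_event are placeholder names introduced for illustration and are not elements of the claimed system.

```python
from typing import Any, Dict

# Hypothetical episode package content: each word carries event tags that
# name the media effect to play when that event fires.
EPISODE: Dict[str, Any] = {
    "words": {
        "elephant": {"events": {"hover": "trumpet.wav", "speech": "cheer.wav"}}
    }
}


def play_media(effect: str) -> None:
    """Stand-in for actual sound or animation playback."""
    print(f"[playing media effect: {effect}]")


def handle_event(word: str, event: str,
                 episode: Dict[str, Any] = EPISODE) -> None:
    """Look up the event tag for the word and perform the associated action."""
    effect = episode["words"].get(word, {}).get("events", {}).get(event)
    if effect is not None:
        play_media(effect)


handle_event("elephant", "hover")   # a hover over "elephant" plays trumpet.wav
```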