The present invention relates to conveying meaning to a reader of a text message.
Users of mobile appliances, such as cell phones and pocket computers, have discovered great utility in being able to exchange text messages. In order to improve efficiency, it has become commonplace to abbreviate or to modify text so as to speed its entry. The practice is now so widespread that it amounts to a comprehensive subset of language. One such codification of text has recently been referred to as “teen-speak” or “lingo” and has a significant following, but it is not universally useful and it is not trivial to interchange with conventional text.
As an example of how text may be codified, in teen-speak the numeric value 2 is used as a phonetic substitute for the words “to”, “two” and “too”. The numeric value 4 is used as a substitute for “for”, and the numeric value 8 may be seen standing in for the letter trigram “ate”, either as a word or as a syllable in longer words such as “mate” (m8), “gate” (g8) and “innovate” (innov8).
Professional users of text-based services also use codification. In its simplest form, this may be simple abbreviation. As an example, aviation weather reports will usually alter, contract or abbreviate words in the interest of brevity, relying on a trained reader to interpret the full meaning. In part there is an historical basis for this: because teletype systems were used to retrieve and display information, it was essential that the information could be represented entirely by the available machinery. A sample taken from a Chicago O'Hare terminal area forecast, transmitted as FM0700 19013KT P6SM VCSH SCT035 OVC070, is translated to read “From 7 am coordinated universal time, the wind is expected to be from 190 degrees at 13 knots. The visibility is expected to exceed 6 statute miles with showers expected in the vicinity. There is a scattered cloud layer expected at 3,500 feet with an overcast layer expected at 7,000 feet.” There is considerable efficiency in such codification, but it is exceptionally difficult for an untrained user to read. Of course it is understood that codification extends to obfuscation of meaning as well, but in the case of this invention we may be less concerned with deliberate secrecy resulting from engineered cryptography.
Staying with the weather example, the intensity of a weather phenomenon may be indicated by simple characters such as + or −, but this raises the problem of needing to understand the contextual relationships between the elements of a sequence. For instance, +RA would mean heavy rain, but −TSRA would be interpreted to mean a thunderstorm with light rain. The user is assumed to understand that it is the precipitation that is light, not the overarching thunderstorm conditions.
Recently, codification of text has come to include objects, such as icons. Use of these icons has been facilitated by improvements in the appliances and in the networks. Common examples of such objects are the symbols known as “emoticons”. Emoticons typically use punctuation symbols in combination to give an emotional depth to electronic text exchanges. For example, using three punctuation symbols, a happy face may be made :-) or a sad face may be made :-(, either of which connotes a much greater span of meaning than the text alone. The addition of a richer graphic environment allows these base symbologies to be replaced by more expressive graphical icons.
The invention may be embodied as a method of inserting a pictorial artifact into a text document. In one such method, a relational database is provided. The database may have at least one possible text-entry that is linked to a pictorial artifact. Text may be received from an input device, and the database may be searched to determine whether the received text matches the possible text-entry that is linked. If the received text matches a possible text entry that is linked, then the linked pictorial artifact is provided. The provided pictorial artifact may be selected, and inserted into a text document.
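By way of a purely illustrative sketch, and not as part of any claimed embodiment, the flow just described (a link between text entries and pictorial artifacts, a search on the received text, and insertion of a selected artifact) might be modeled as follows. The names ARTIFACT_DB, find_linked_artifacts and insert_artifact are hypothetical, and a simple mapping stands in for the relational database.

```python
# Hypothetical sketch of the text-to-artifact lookup and insertion flow.
# The mapping stands in for the relational database of linked text entries.
ARTIFACT_DB = {
    "happy": [":-)"],          # text entry linked to one artifact
    "appreciate": [":-)"],     # several entries may share an artifact
    "sad": [":-("],
}

def find_linked_artifacts(received_text: str) -> list[str]:
    """Search the database for artifacts linked to the received text."""
    return ARTIFACT_DB.get(received_text.lower(), [])

def insert_artifact(document: str, text: str, artifact: str) -> str:
    """Insert the selected artifact immediately after the matching text."""
    return document.replace(text, f"{text} {artifact}", 1)

document = "I really appreciate your help."
candidates = find_linked_artifacts("appreciate")
if candidates:
    # In the appliance the user would choose among the candidates;
    # here the first one is taken for brevity.
    document = insert_artifact(document, "appreciate", candidates[0])
print(document)  # I really appreciate :-) your help.
```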
Modification options may be provided and used to modify the selected pictorial artifact. If a modification option is selected, the pictorial artifact may be modified accordingly. If the pictorial artifact is modified, alternate text and/or an alternate pictorial artifact may be provided for selection.
The invention may also be embodied as an electronic appliance. One such appliance has a database having at least one possible text-entry that is linked to a pictorial artifact. An input device of the appliance may be capable of receiving text from a user. The appliance may also have a microprocessor that is programmed to carry out a method according to the invention. For example, the microprocessor may be programmed to (a) search the database to determine whether text received by the input device matches the possible text-entry that is linked, and (b) provide the linked pictorial artifact if the received text matches the possible text entry that is linked. A selection device of the appliance may allow the user to select the provided pictorial artifact, and a display device of the appliance may be able to display the selected pictorial artifact in conjunction with the received text. The display device may be able to provide modification options that may be used to modify the selected pictorial artifact. Such modification options may be provided to the display device by the microprocessor. To facilitate selection of a modification option, the selection device may include a controller that may be used to identify and select from a plurality of modification options.
For a fuller understanding of the nature and objects of the invention, reference should be made to the accompanying drawings and the subsequent description.
The use of artifacts to convey meaning is a focus of the invention.
The appliance 10 may include a database 13, an input device 16, a selection device 22, a display 25 and a microprocessor 19, each of which is described in greater detail below. The database 13 may be searched by the microprocessor 19. Furthermore, the input device 16, selection device 22 and display 25 may be under the control of the microprocessor 19. As such, the appliance 10 will have software that is executable by the microprocessor 19 and that provides instructions to the microprocessor 19 for carrying out tasks.
The database 13 has a plurality of possible text-entries. For example, the possible text-entries may be words commonly found in a dictionary. At least one of the possible text-entries in the database 13 may be linked to a pictorial artifact. More than one pictorial artifact may be linked to a particular possible text-entry. As such, if a particular possible text-entry that is linked is identified, its corresponding pictorial artifact (or artifacts, as the case may be) may be retrieved. Also, more than one text entry may be linked to a particular pictorial artifact, and as such, if a particular pictorial artifact is identified, its corresponding text entry (or entries, as the case may be) may be retrieved.
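As a hypothetical illustration of the many-to-many linkage described above, and not a required implementation, the forward and reverse lookups might be sketched as follows; TEXT_TO_ARTIFACTS and ARTIFACT_TO_TEXTS are invented names, and short strings stand in for the pictorial artifacts.

```python
from collections import defaultdict

# Hypothetical sketch of the many-to-many linkage: a text entry may map to
# several artifacts, and an artifact may map to several text entries.
TEXT_TO_ARTIFACTS = {
    "happy": ["smiley_yellow", "smiley_pink"],
    "delighted": ["smiley_pink"],
    "glum": ["sad_blue"],
}

# The reverse index allows retrieval of text entries from an identified artifact.
ARTIFACT_TO_TEXTS = defaultdict(list)
for text, artifacts in TEXT_TO_ARTIFACTS.items():
    for artifact in artifacts:
        ARTIFACT_TO_TEXTS[artifact].append(text)

print(TEXT_TO_ARTIFACTS["happy"])        # ['smiley_yellow', 'smiley_pink']
print(ARTIFACT_TO_TEXTS["smiley_pink"])  # ['happy', 'delighted']
```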
The input device 16 may be capable of receiving text from a user of the appliance 10. For example, the input device 16 may include a monitor 28 and a means for selecting objects displayed by the microprocessor 19 on the monitor 28. If the monitor 28 is a touch sensitive screen which displays a virtual keyboard, the user can select letters from the virtual keyboard using his finger or a stylus. Alternatively, the input device 16 may include a plurality of buttons 31, each of which may be used to represent one or more letters. By pressing the buttons 31, the user may select letters in order to provide text. Alternatively, the input device 16 may use a track ball, joy stick or arrow buttons to move a cursor on the monitor 28 and a select button may be pressed by the user when the cursor identifies a desired letter or text object. Many cell phones include one or more such input devices. By using the input device 16, the user can provide text to the appliance 10, and that text may be compared to entries in the database 13 as part of an effort to locate a desired pictorial artifact.
Text may be received from a user of the appliance 10 using, for example, any of the usual methods of entry. In the case of deterministic entry, a text string may be entered letter by letter. Text may be entered, using a single press per letter, by keys 31 which may have more than one letter associated with a particular key 31, as in the case of a standard telephone keypad. In keeping with the predictive ability of certain implementations, after a certain number of the initial letters of a word have been entered, full or partial text candidates, which correspond to the initial sequence of letters, may be predicted by the microprocessor 19 and then displayed on the monitor 28 for user selection.
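A minimal sketch of such predictive entry, assuming a simple in-memory word list rather than the database 13, might look like the following; predict_candidates and DICTIONARY are illustrative names only.

```python
# Hypothetical sketch of predictive entry: after a few initial letters are
# entered, candidate words are offered for selection on the monitor.
DICTIONARY = ["happy", "happen", "appreciate", "apple", "glum"]

def predict_candidates(initial_letters: str, limit: int = 5) -> list[str]:
    """Return dictionary words beginning with the letters entered so far."""
    prefix = initial_letters.lower()
    return [word for word in DICTIONARY if word.startswith(prefix)][:limit]

print(predict_candidates("hap"))  # ['happy', 'happen']
```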
The microprocessor 19 may be programmed to carry out certain functions. For example, the microprocessor 19 may be programmed to search the database 13 to determine whether text received from the input device 16 matches a possible text-entry in the database 13 that is linked to a pictorial artifact. Furthermore, the microprocessor 19 may be programmed so that if the received text matches a possible text entry that is so linked, then the microprocessor 19 provides the linked pictorial artifact for that matching text entry. The microprocessor 19 may provide the pictorial artifact to a monitor 28, such as the touch sensitive screen used with the input device 16 (described above).
To illustrate this aspect of the appliance 10, the input device 16 may be used to provide the text “appreciate”, and the database 13 may be searched by the microprocessor 19 to determine whether the word “appreciate” is among the possible text-entries stored in the database 13. Upon locating the possible text-entry “appreciate” in the database 13, the microprocessor 19 may determine that the linked pictorial artifact is a smiley face, and that smiley face may be provided to the user via the monitor 28 for consideration and possibly selection by the user. If the user selects the smiley face, then it is inserted into the text document. For instance, the smiley face may be inserted immediately after the text “appreciate”, or it may be inserted at the end of the sentence in which the text “appreciate” appears. In this manner, the user may be able to better express the intensity with which he appreciates something.
The selection device 22 may be used by the user to select at least one of the provided pictorial artifacts. The selection device 22 may include a monitor 28 and a means for selecting objects caused to be displayed by the microprocessor 19 on the monitor 28. If the monitor 28 is a touch sensitive screen, the screen may display the provided pictorial artifacts so that the user can select one of the pictorial artifacts using his finger or a stylus to press in the vicinity of the desired artifact. It is possible for the monitor 28 used in the input device 16 to also be used in the selection device 22. For example, the touch sensitive screen used as part of the input device 16 may also be used as part of the selection device 22. Alternatively, the selection device 22 may use a track ball, joy stick or arrow buttons to move a cursor on the monitor 28 and a select button may be pressed by the user when the cursor identifies a desired pictorial artifact. It will now be recognized that the input device 16 and the selection device 22 may use the same components, the difference being afforded by what the microprocessor 19 displays on the monitor 28, and how the microprocessor 19 interprets indications provided by the user.
Once a pictorial artifact is selected, that pictorial artifact may be displayed on a display device 25, such as the monitor 28. The monitor 28 used for the input device 16 and/or the selection device 22 may be used as the display device 25. Under the control of the microprocessor 19, the display device 25 may be caused to provide the selected pictorial artifact in a desired position, such as at an end of a sentence in which the received text resides or next to the text which caused the pictorial artifact to be displayed.
The display device 25 may also be caused to provide modification options that may be used to modify the selected pictorial artifact. For example, one of the modification options might allow the user to change a color of the pictorial artifact, a fill-pattern of the pictorial artifact, or a density parameter used to depict the artifact on the display device 25.
The addition of color may extend the emotion exhibited via the artifact so that it has depth or intensity in addition to just a value. For example, colors at the red end of the spectrum may be interpreted as intense or positive, while colors at the blue end might signify indifference or negative attribution. An example of this can be seen in the “smiley face” artifact, where yellow may be used to signify “smiling and happy,” while light red might be used to signify “smiling and embarrassed.” In contrast, a “sad face” in blue might be used to signify sadness, whereas a “sad face” in red/purple could imply the sender was upset. In summary then, expression sets the basic value of the artifact, and the hue may be used to qualify the intensity of the emotion expressed by the artifact.
In one embodiment of the invention, the modification options may be provided and displayed as a spectral palette which may be used to change the fill parameter of the pictorial artifact. By way of example, the fill parameter may be the color used inside the pictorial artifact, or may be the adjustment of a density parameter in a monochrome system. Adjustment of the density may be perceived as a change in the boldness of the lines used to create the pictorial artifact. It should be noted that in addition to modifying the density, the color of the lines may be modified too. In lieu of a spectral palette, a numerical value may be used to identify the degree of boldness or color to be used in modifying the pictorial artifact. A slider control may also be used.
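As a hedged illustration of how a numeric or slider value might drive the density parameter, the following sketch maps an assumed 0-100 slider reading onto a line-width value; the function name apply_density and the chosen range are arbitrary.

```python
# Hypothetical sketch: a numeric slider value adjusts the density (line
# boldness) of a monochrome artifact, clamped to a sensible range.
def apply_density(artifact: dict, slider_value: int) -> dict:
    """Map a 0-100 slider value to a line-width parameter for the artifact."""
    clamped = max(0, min(100, slider_value))
    artifact = dict(artifact)                    # leave the original unmodified
    artifact["line_width"] = 1 + clamped / 25.0  # 1.0 (faint) to 5.0 (bold)
    return artifact

smiley = {"shape": "smiley", "line_width": 1.0}
print(apply_density(smiley, 75))  # {'shape': 'smiley', 'line_width': 4.0}
```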
To illustrate the concept, consider that the text input may be the word “happy” and the linked artifact might be a smiley face. Upon selection of the word “happy” into the text document being prepared by the user, the word “happy” may remain highlighted or active so that the user will know which word has been selected, thereby assisting the user in determining how the corresponding artifact should be modified by further actuation of a controller 33, such as a joystick or track ball. Upon indicating that a modification of the density parameter for the smiley face is desired, the controller 33 of the appliance 10 may be arranged so that up-down movement allows changing the density of the lines making up the smiley face, and left-right movement allows the adjustment of the hue or tint of those lines.
To facilitate selection of a desired modification option, the various modification options may be grouped together into a plurality of groups. For example, one group of modification options might be the color of the lines comprising the artifact, and another group of modification options might be the color of the fill-pattern inside the artifact. If the appliance provides a controller 33, the user may select one group by using the controller 33 in a first manner, and select a second group by using the controller 33 in a second manner. For instance, if the controller 33 is a joy stick, the user might press the joy stick away (the first manner) from the user to indicate a desire to select a different color for the pictorial artifact, and in response, a cursor may be moved on the monitor 28 until the cursor identifies which of the colors is desired for the lines of the pictorial artifact. However, if the user desires a different fill-pattern, the joy stick may be pressed to the left (the second manner) until a cursor on the monitor 28 identifies the color of the desired fill-pattern for the pictorial artifact.
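The grouping of modification options and their selection by controller gestures might be sketched, purely by way of example, as follows; the mapping of joystick directions to groups is an assumption made for illustration.

```python
# Hypothetical sketch: joystick directions select between groups of
# modification options (line color vs. fill color), as described above.
GROUPS = {
    "up": ("line_color", ["black", "red", "blue", "green"]),
    "left": ("fill_color", ["yellow", "pink", "blue", "none"]),
}

def select_group(joystick_direction: str):
    """Return the option group addressed by the given controller gesture."""
    return GROUPS.get(joystick_direction)

group_name, options = select_group("up")
print(group_name, options)  # line_color ['black', 'red', 'blue', 'green']
```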
The appliance 10 may also include a speaker 36. If one of the modification options includes sound, then that sound may be made audible using the speaker 36.
In an embodiment of the invention, the microprocessor 19 may be programmed to search the database 13 for alternate text corresponding to a modified pictorial artifact. If alternate text corresponding to the modified pictorial artifact is located, then the display device 25 may be caused to provide the alternate text corresponding to the modified pictorial artifact. In this fashion, the user may provide an initial text object, select a pictorial artifact corresponding to that initial text object, identify a modification of that pictorial artifact, and then the microprocessor 19 may search the database 13 for possible text entries that are linked to the modified pictorial artifact. Consequently, although the user may initially identify a text object, the user may ultimately replace that text object with another text object that better expresses the user's intent, and this may be done by selecting and modifying a pictorial object.
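A minimal sketch of this reverse lookup, assuming the modified artifact can be described by a shape and a hue, might look like the following; ARTIFACT_TO_TEXTS and alternate_text are hypothetical names.

```python
# Hypothetical sketch of the reverse lookup: after the user modifies an
# artifact (e.g. recolors the smiley face pink), the database is searched
# for alternate text linked to the modified artifact.
ARTIFACT_TO_TEXTS = {
    ("smiley", "yellow"): ["happy"],
    ("smiley", "pink"): ["delighted"],
    ("sad", "blue"): ["sad", "unhappy", "glum"],
}

def alternate_text(shape: str, hue: str) -> list[str]:
    """Return possible text entries linked to the modified artifact."""
    return ARTIFACT_TO_TEXTS.get((shape, hue), [])

print(alternate_text("smiley", "pink"))  # ['delighted']
```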
The invention may be embodied as a method of conveying meaning. One such method is depicted in the accompanying drawings.
The method depicted in the drawings includes providing a database, receiving text from a user, searching the database to determine whether the received text matches a possible text-entry that is linked to a pictorial artifact, providing the linked pictorial artifact, and inserting a selected pictorial artifact into a text document.
The method described above may be carried out to afford the user the opportunity to modify the selected pictorial artifact.
One or more of the modification options may correspond to an intensity. For example, color may be used to express the intensity with which the user feels that the selected pictorial artifact comports with his/her feelings. Other modification options may permit the user to select a pattern, such as dashed, dotted or solid lines. To illustrate the idea, the pictorial artifact may be modified to change the solid lines to dotted lines, those lines may be made green, and the fill pattern may be changed to be blue dots.
Another type of modification option may permit the user to alter a density parameter of the selected pictorial artifact. For example, the lines that create the selected pictorial artifact may be made darker, lighter, wider, or narrower depending on what the user selects as the modification option.
Yet another type of modification option may permit the user to include sound corresponding to a pictorial artifact. For example, if the user selects a smiley face, the user may modify the smiley face artifact to associate with the artifact a sound that comports with a happy person. For instance, the sound may be a whistled rendition of the song “Put On A Happy Face”.
Modification options may be selected 206 by moving a controller 33. For example, a cursor may be made to move through a list of modification options displayed on monitor 28 by pressing a joy stick in a particular manner, or an arrow key 31 may be used to move the cursor through the list of modification options. Once the desired modification option is identified using the controller 33, the user may select the desired modification option, for example by pressing the joy stick into the appliance 10, or pressing a “select” key 31 provided on the appliance 10.
If there are many modification options, it may be useful to group the modification options and permit the user to scroll through a first group of the modification options by moving the controller 33 in a first manner, and scroll through a second group of the modification options by moving the controller 33 in a second manner. When the controller 33 is a joy stick, the “first manner” may be pressing the joy stick in one direction, and the “second manner” may be pressing the joy stick in another direction. In doing so, the user may easily scroll through a list of modification options quickly, thereby allowing the user to identify a desired modification option more easily.
If the user selects 206 a modification option, the method may be carried out so as to provide alternate text 212, which corresponds to the modified pictorial artifact. To do so, the database may be searched for text that is linked to the modified version of the pictorial artifact, and if the search identifies text that is different from that selected by the user, then the alternate text may be provided to the user for selection. If the user decides that the alternate text is desirable, the user may select the alternate text. Upon selecting alternate text, the alternate text may be substituted for the text initially selected by the user.
To illustrate this aspect more concretely, consider the situation in which the user provides the word “happy” and the smiley face artifact is provided. The user may alter the hue of the smiley face from the standard light yellow, which might be suggestive of a sunny disposition, to a different hue such as pink. Assuming that a pink smiley face is normally associated with mild pleasure or delight, the database might have previously linked the word “delighted” with a pink smiley face. In that situation, the microprocessor may search the database for text linked to the pink smiley face, and provide the user with the option to substitute the word “delighted” for the word “happy”.
In keeping with cultural conventions, the modification of a pictorial artifact may be accompanied by a change in the artifact itself. For example, moving the hue of the standard smiley face from yellow to blue might adjust the smile to a neutral expression or a sad expression. In this manner, the artifact itself may be adjusted so as to move from the standard smiley face to a sad face, and the corresponding linked words “sad”, “unhappy” and/or “glum” may be presented to the user for selection as alternate text.
In one embodiment of the invention, the alternate text provided in response to modifying a pictorial object may be text that is in a language different from that of the text initially received from the user. In this manner, a foreign-language word may be provided in the text document, and this may be particularly useful when the foreign-language word carries a meaning that is more precisely in keeping with the user's feelings. Furthermore, since language has strong cultural linkages, when this technique is applied, emotional data may be conveyed via the pictorial artifact so as to transcend simple transliteration, and may allow much more accurate translation of the text.
In a similar manner, when a pictorial artifact is modified, an alternate pictorial artifact may be provided to the user for selection. In this manner, although the user may initially select a pictorial artifact that seems acceptable, the appliance may subsequently permit the user to identify and select a pictorial artifact that is closer to that desired by the user. For example, if the user initially selects the smiley face artifact and subsequently modifies that artifact to have a red fill-pattern, the appliance 10 may provide a suggested alternate pictorial artifact, such as a stick figure with its hands on its hips.
The method may be carried out so as to highlight the received text until the linked pictorial artifact is selected. In this manner, the user may quickly and easily remind himself about the text for which a pictorial artifact may be selected. This may assist the user in identifying an appropriate pictorial artifact, and in modifying 209 a selected pictorial artifact.
Although indicated above, it may be useful to remember that a pictorial artifact may be selected by touching a portion of the monitor 28 where the pictorial artifact is displayed. Alternatively, a cursor may be moved so as to highlight or underline an artifact, and then the user may press an “enter” button on the appliance to indicate that the identified artifact should be selected and provided to the text document. In a particularly sophisticated embodiment of the invention, the user may navigate over a spectral palette using a touch sensitive screen so as to not only select a particular pictorial artifact, but also modify the artifact using a continuous motion. The artifact may be activated by placing a stylus or finger on the monitor 28 at the artifact location, and the hue or tint may be altered by moving the stylus or finger in a particular direction, for example radially from the artifact; the radial direction and distance may be used by the microprocessor to determine the type of modification desired by the user. In this way an exceptionally rich range of emotions may be easily and quickly expressed.
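As an illustrative sketch only, interpreting such a radial gesture might amount to converting the drag vector into a direction and a distance, as below; the assignment of angles to attributes and the 200-pixel scaling are assumptions made for the example.

```python
import math

# Hypothetical sketch: a stylus drag away from the artifact is interpreted
# radially, with the direction choosing the attribute to modify and the
# distance choosing its intensity, as described above.
def interpret_gesture(origin: tuple, release: tuple) -> dict:
    dx, dy = release[0] - origin[0], release[1] - origin[1]
    distance = math.hypot(dx, dy)
    angle = math.degrees(math.atan2(dy, dx)) % 360
    # The split of angles between attributes is arbitrary for illustration.
    attribute = "hue" if angle < 180 else "density"
    return {"attribute": attribute, "intensity": min(1.0, distance / 200.0)}

print(interpret_gesture((100, 100), (220, 100)))
# {'attribute': 'hue', 'intensity': 0.6}
```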
In a particularly interesting embodiment of the invention, the pictorial artifact may be a sprite. In this manner, the movement provided by the sprite may be somewhat entertaining, more likely to catch the attention of the reader, and/or be particularly memorable. In so doing, the corresponding text may be made more valuable to both the sender and the receiver. For example, if the user provides the word “frustrated”, a sprite artifact that animates to a stick figure banging its head on a wall might be provided to the user for selection. When combined with a sound, such a sprite may be made even more meaningful. In this example, a dull repetitive thud might be representative of an obstacle being encountered when the head of the sprite strikes a wall. Alternatively, the sound associated with the sprite banging its head might be the sound of a brief scream, which may be intended to signify the failure of a protracted effort resulting in extreme frustration. By enabling the user to associate different sounds with a sprite, the user is enabled to associate different connotations with a particular sprite.
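By way of a brief, hypothetical sketch, associating different sounds with the same sprite to yield different connotations might be modeled as a simple lookup keyed on the sprite and sound pair; all names here are invented for illustration.

```python
# Hypothetical sketch: different sounds may be associated with the same
# sprite to give it different connotations, as in the "frustrated" example.
SPRITE_SOUNDS = {
    ("head_bang", "thud"): "an obstacle has been encountered",
    ("head_bang", "scream"): "a protracted effort has failed",
}

def connotation(sprite: str, sound: str) -> str:
    return SPRITE_SOUNDS.get((sprite, sound), "no particular connotation")

print(connotation("head_bang", "scream"))  # a protracted effort has failed
```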
Although the present invention has been described with respect to one or more particular embodiments, it will be understood that other embodiments of the present invention may be made without departing from the spirit and scope of the invention. Hence, the present invention is deemed limited only by the appended claims and the reasonable interpretation thereof.
This application claims the benefit of priority to U.S. provisional patent application serial number 60/803,342, filed on May 26, 2006.