This invention relates to electronic messaging. More particularly, this invention relates to inserting graphical elements into text messages.
Today, users typing text messages on touch-screen devices commonly use emoticons (e.g., emoji, smileys, animations, or visuals) to enliven the messages, making them more personal and enriching them with feeling. A user must go through several steps to access emoticons before inserting them into the text. For example, a user navigates to the location in the text where she wants to insert an emoticon. Once at the location, she taps a button on her keyboard, which takes her away from the text field (typing box), away from the ongoing chat session, and away from anything else she might be viewing at the time. She may then have to enter a complex sequence of commands to return to the screen she was previously viewing. All of these steps discourage users from using emoticons, decreasing the enjoyment of texting.
In a first aspect, a method of inserting images into a text message on an electronic device includes presenting a library of images to a user on the electronic device during a chat session without leaving the chat session and inserting a selected image from the library of images into the text message. In one embodiment, the library of images is presented in response to user input entered on the electronic device, such as a swipe, a tap, or any combination thereof on a touchscreen surface of the electronic device. Preferably, the images comprise emoticons. In one embodiment, the method also includes receiving user input, such as swiping or tapping, on the electronic device and, in response, displaying a list of content to insert within the chat session. As one example, the content includes rich media.
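As a concrete illustration of the first aspect, the following is a minimal sketch assuming an iOS/UIKit environment; the names ChatViewController, EmoticonLibraryView, and onSelect are hypothetical and are not taken from any existing framework. The library is shown as an overlay so the chat session is never left, and the chosen emoticon is inserted into the draft text message.

```swift
import UIKit

// Hypothetical chat screen owning the message text field (a sketch only).
final class ChatViewController: UIViewController {

    private let messageField = UITextField()
    private let emoticonLibrary = EmoticonLibraryView()

    // Present the image library as an overlay so the chat session stays on screen.
    func presentEmoticonLibrary() {
        emoticonLibrary.onSelect = { [weak self] emoticon in
            self?.messageField.insertText(emoticon)         // insert into the text message
            self?.emoticonLibrary.removeFromSuperview()      // close the library after selection
        }
        emoticonLibrary.frame = CGRect(x: 0, y: view.bounds.midY,
                                       width: view.bounds.width,
                                       height: view.bounds.height / 2)
        view.addSubview(emoticonLibrary)                     // chat view remains visible above
    }
}

// Hypothetical overlay view listing the available images (emoticons).
final class EmoticonLibraryView: UIView {
    var onSelect: ((String) -> Void)?
}
```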
In another embodiment, the method also includes receiving user input, such as swiping or tapping, on the electronic device and, in response, modifying options or settings on the chat session. In yet another embodiment, the method also includes embedding user text in the chat session within a mood message.
In one embodiment, the electronic device is a wireless device, such as a smartphone, a mobile telephone, a personal digital assistant, an iPad®, a smart watch, smart glasses, or any other handheld device. In other embodiments, the electronic device is a personal computer, a laptop computer, or a desktop computer.
In a second aspect, an electronic device includes a processor, a memory storing a plurality of images in an image library, and logic configured (e.g., programmed) to receive user input that selects an image from the plurality of images during a chat session and to insert the selected image into a text message during the chat session. In one embodiment, the images are emoticons. Preferably, the electronic device includes a touchscreen for receiving the user input, such as swiping or tapping.
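A data-level sketch of the second aspect is shown below, again under assumed names (Emoticon, ImageLibrary, ChatSession); it only illustrates the relationship between the in-memory image library and the logic that inserts a selected image into the text message of an ongoing chat session.

```swift
// Illustrative model of the device-side pieces: a plurality of images held in
// memory and a small piece of logic that inserts a selected image into the
// draft text message of a chat session. All names here are assumptions.
struct Emoticon {
    let name: String
    let character: String          // e.g., a Unicode emoji such as "🎉"
}

struct ImageLibrary {
    let images: [Emoticon]         // the plurality of images stored in memory
}

final class ChatSession {
    private(set) var draftMessage = ""
    private let library: ImageLibrary

    init(library: ImageLibrary) {
        self.library = library
    }

    // Receive a selection from the library and insert it into the text message
    // without leaving the chat session.
    func insertImage(at index: Int) {
        guard library.images.indices.contains(index) else { return }
        draftMessage += library.images[index].character
    }
}
```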
In different embodiments, the logic is also configured to display a list of content and insert content selected from the list into a chat session, modify options or settings in the chat session messaging client, add “mood” graphics to text messages to create mood messages, or perform any combination of these functions.
Examples of the electronic device include a smartphone, a mobile phone, a laptop computer, a desktop computer, a tablet, a smart watch, or smart glasses. In one embodiment, the logic comprises a widget for executing an application for selecting and inserting the images.
In all the figures, identical labels refer to the same or a similar element.
In accordance with the principles of the invention, emoticons and other images are easily inserted into a chat session or other texting application. During a chat session, for example, a user inputs pre-determined gestures on an electronic device, causing an emoticon library (selector) to appear on the device. As some examples, the electronic device has a touch screen, and the gestures are swiping motions, taps, or some combination of these. Preferably, the emoticon library does not entirely obscure the chat session, but instead allows the user to view a portion of the chat session. Preferably, once the user selects an emoticon from the emoticon library, the emoticon library is automatically closed. Alternatively, the user enters a pre-determined sequence of gestures to close the emoticon library. Preferably, the user is able to configure the system to set the pre-determined gestures for opening (e.g., presenting or displaying) and closing (removing from view) the emoticon library, the location of the emoticon library when opened, and the replacement of one emoticon with another, to name only a few configuration parameters.
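One possible realization of this gesture handling is sketched below, assuming UIKit. The inputArea and emoticonLibrary names are placeholders, and the left-to-right/right-to-left directions are simply the defaults from the example above; as noted, the gestures are configurable.

```swift
import UIKit

// Sketch of gesture-driven opening and closing of the emoticon library.
// `inputArea` stands in for the text-entry bar and `emoticonLibrary` for the
// selector overlay; both are placeholder views.
final class ChatGestureController: UIViewController {

    let inputArea = UIView()
    let emoticonLibrary = UIView()

    override func viewDidLoad() {
        super.viewDidLoad()

        inputArea.frame = CGRect(x: 0, y: view.bounds.maxY - 44,
                                 width: view.bounds.width, height: 44)
        view.addSubview(inputArea)

        let open = UISwipeGestureRecognizer(target: self, action: #selector(openLibrary))
        open.direction = .right                  // left-to-right swipe opens the library
        inputArea.addGestureRecognizer(open)

        let close = UISwipeGestureRecognizer(target: self, action: #selector(closeLibrary))
        close.direction = .left                  // right-to-left swipe closes it
        inputArea.addGestureRecognizer(close)
    }

    @objc private func openLibrary() {
        // Cover only the lower half of the screen so a portion of the chat
        // session remains visible, per the preference described above.
        emoticonLibrary.frame = CGRect(x: 0, y: view.bounds.midY,
                                       width: view.bounds.width,
                                       height: view.bounds.height / 2)
        view.addSubview(emoticonLibrary)
    }

    @objc private func closeLibrary() {
        emoticonLibrary.removeFromSuperview()
    }
}
```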
Adding Emoticons to Chat Sessions
In one embodiment, the emoticon application program is an “embedded widget,” which appears to the user as an application executed within the texting application, such that both applications appear to be executing simultaneously. In one embodiment, the emoticon application program causes an emoticon library to be displayed on an electronic device during a chat session, an emoticon to be selected, and the selected emoticon to be embedded with a text message of the chat session.
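On iOS, one way to approximate this "embedded widget" behavior is child view controller containment, so that the emoticon selector executes inside the texting application's screen. This is only a sketch under that assumption; TextingViewController and EmoticonWidgetViewController are hypothetical names.

```swift
import UIKit

// Sketch: host the emoticon widget inside the texting application so both
// appear to run simultaneously on the same screen.
final class TextingViewController: UIViewController {

    func embedEmoticonWidget() {
        let widget = EmoticonWidgetViewController()
        addChild(widget)                                   // widget executes within the host
        widget.view.frame = CGRect(x: 0, y: view.bounds.midY,
                                   width: view.bounds.width,
                                   height: view.bounds.height / 2)
        view.addSubview(widget.view)
        widget.didMove(toParent: self)
    }
}

// Placeholder for the emoticon application program.
final class EmoticonWidgetViewController: UIViewController {}
```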
In other embodiments, the emoticon library 250 is able to be presented in different configurations and displayed in different locations on the electronic device. As only one example, shown in the screenshot 400B in
Once the emoticon library 250 is displayed (e.g.,
While the examples above describe swiping left-to-right along the input area 220 to open the emoticon library 250 and swiping right-to-left along the input area 220 to close the emoticon library 250, it will be appreciated that these are only illustrative ways to open and close the emoticon library 250 during a chat session. An electronic device in accordance with the principles of the invention is able to be configured to open and close an emoticon library in response to other user inputs, such as swiping from left-to-right on the input area 220 to open the emoticon library 250 and then swiping again from left-to-right to close the emoticon library 250; tapping once on the input area 220 to open the emoticon library 250 and tapping a second time on the input area 220 to close the emoticon library 250; tapping once on the input area 220 to open the emoticon library 250 and tapping twice on the input area 220 to close the emoticon library 250; tapping once on the input area 220 to open the emoticon library 250 and swiping in any direction on the input area 220 to close the emoticon library 250; or using any other predetermined sequence of swiping or tapping to open the emoticon library 250 and, after selection of an emoticon, automatically closing the emoticon library 250.
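The open/close behavior described above lends itself to a small configuration object. The sketch below uses assumed names (LibraryGesture, GestureConfiguration) purely to show how the alternative gesture pairings in the preceding paragraph could be expressed as data.

```swift
// Illustrative configuration of the predetermined gestures that open and close
// the emoticon library; every name here is an assumption for the sketch.
enum LibraryGesture {
    case swipeLeftToRight
    case swipeRightToLeft
    case singleTap
    case doubleTap
}

struct GestureConfiguration {
    var openGesture: LibraryGesture = .swipeLeftToRight
    var closeGesture: LibraryGesture = .swipeRightToLeft
    var closeAutomaticallyAfterSelection = true
}

// Example: a single tap opens the library, a double tap closes it, and the
// library also closes on its own once an emoticon has been selected.
let tapConfiguration = GestureConfiguration(openGesture: .singleTap,
                                            closeGesture: .doubleTap,
                                            closeAutomaticallyAfterSelection: true)
```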
It will be appreciated that the emoticon library 250 in
Adding Content to Chat Sessions
In one embodiment, in response to any gesture (e.g., typing, a swipe, or a tap), a user is presented with a list of options from which to choose. For example, in response to a gesture, a user is presented with content, such as rich media, that is able to be added to the chat session. Examples of rich media include pictures, video, text designs, and “push-to-talk” links, to name only a few such examples. Users are also able to swipe, type, tap, double-tap, etc., to be presented with options that allow them to modify other options or settings in their text messaging client, such as described in U.S. patent application Ser. No. 14/079,957, titled “Embedding Rich Media Into Text Messages,” filed Nov. 14, 2013, which is hereby incorporated by reference in its entirety.
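As a sketch of presenting such a list of content, the example below uses a standard UIKit action sheet; the content names mirror the examples given above, and the insertion handlers are left as stubs.

```swift
import UIKit

// Sketch: in response to a gesture, present a list of rich media that the user
// is able to add to the chat session. The insertion itself is stubbed out.
extension UIViewController {
    func presentContentChoices() {
        let sheet = UIAlertController(title: "Add to chat",
                                      message: nil,
                                      preferredStyle: .actionSheet)
        for choice in ["Picture", "Video", "Text design", "Push-to-talk link"] {
            sheet.addAction(UIAlertAction(title: choice, style: .default, handler: { _ in
                // Insert the chosen content into the chat session here.
            }))
        }
        sheet.addAction(UIAlertAction(title: "Cancel", style: .cancel))
        present(sheet, animated: true)
    }
}
```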
Adding Mood Messages to Chat Sessions
In yet another embodiment, text messages are enhanced by mood messages. For example, when a user types a message, such as “Happy Birthday Tim!” into the text field, she has the option to visually enhance the message through mood messages. The user also has the option to navigate to mood messages before typing any text. Once the user has selected the option to create a mood message, she is able to choose from many different graphic templates. For example, there may be a template of a colorful banner, and the text that she has typed (e.g., “Happy Birthday Tim!”) is displayed within the designed mood message, in this case the banner. The user is now able to send the banner directly to a recipient, or she is able to navigate to other mood message templates, where the text that she has already typed is presented within other designs. As some examples, mood messages in accordance with embodiments of the invention are static, are animated, or contain media from photos or videos.
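A minimal data sketch of mood messages is shown below; MoodTemplate and MoodMessage are assumed names used only to illustrate how the same typed text can be previewed inside different graphic templates before sending.

```swift
// Illustrative mood-message model: the user's text is embedded in a chosen
// graphic template. The template cases are examples, not a fixed catalog.
enum MoodTemplate: CaseIterable {
    case colorfulBanner
    case animatedConfetti
    case plain
}

struct MoodMessage {
    let text: String
    let template: MoodTemplate
}

// The same typed text previewed inside each available template.
let typedText = "Happy Birthday Tim!"
let previews = MoodTemplate.allCases.map { MoodMessage(text: typedText, template: $0) }
```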
Configuring a System
It will be appreciated that the steps 1300 are merely illustrative of one embodiment of the invention. In other embodiments, other steps are added, some steps are deleted, and some steps are combined, to name only a few modifications.
Hardware Components
The electronic device 1400 is able to be any electronic device. In one embodiment, the electronic device 1400 is a smartphone. In other embodiments, the electronic device 1400 is a mobile phone, a personal computer, a laptop computer, a tablet, a desktop computer, a personal digital assistant, an iPad®, a smart watch, smart glasses, such as Google® glasses, or any other mobile or handheld device, to name only a few such devices.
It will be appreciated that the steps 1500 are merely illustrative of one embodiment of the invention. In other embodiments, other steps are added, some steps are deleted, and some steps are combined, to name only a few modifications.
It will be appreciated that while the examples above describe inserting emoticons into text messages, other graphical images are able to be inserted into text messages in accordance with the principles of the invention.
While the examples describe different embodiments, it will be appreciated that the embodiments are able to be combined in any combination of ways. For example, an electronic device in accordance with the invention is able to present emoticons, or present a list of items to add to a chat, or add mood messages, or perform all of these functions, or perform any subset of these functions.
The present invention has been described above with reference to exemplary embodiments. It will be apparent to those skilled in the art that various modifications may be made to the embodiments without departing from the spirit and scope of the invention as defined by the appended claims.
This patent application is a continuation of U.S. patent application Ser. No. 14/091,248, filed Nov. 26, 2013, which claims the benefit of U.S. Provisional Application No. 61/730,038, filed Nov. 26, 2012, both of which are incorporated by reference herein in their entirety for all purposes.
Prior Publication Data

Number | Date | Country
---|---|---
20180059885 A1 | Mar 2018 | US

Provisional Applications

Number | Date | Country
---|---|---
61730038 | Nov 2012 | US

Related U.S. Application Data

 | Number | Date | Country
---|---|---|---
Parent | 14091248 | Nov 2013 | US
Child | 15802442 | | US