The present subject matter relates to electronic devices and, more particularly, to using emojis in communications between electronic devices.
Textual communication, such as texting, is a common means of communication between users of electronic devices. Textual communication is conventionally performed using standardized computer fonts. Emojis can be used in text communications to enhance communication between users.
The drawing figures depict one or more implementations, by way of example only, not by way of limitation. In the figures, like reference numerals refer to the same or similar elements.
One aspect of the present disclosure describes a personalized emoji dictionary, such as for use with emoji-first messaging. Text entered by a user is automatically converted to emojis by an emoji-first application so that only emojis are communicated from one client device to another client device. Each client device has a personalized emoji library of emojis that are mapped to words; the libraries are customizable and unique to the users of the client devices, such that the users can communicate secretly in code. Upon receipt of a string of emojis, a user can convert the emoji string to text if desired, such as by tapping the displayed emoji string for a predetermined period of time. This disclosure provides a more engaging user experience.
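By way of illustration only, such a personalized library can be thought of as a customizable mapping from words to emojis that a pair of users shares. The following Swift sketch is hypothetical; the EmojiLibrary type and its method names are assumptions, not the claimed implementation:

```swift
// Minimal sketch of a personalized emoji library: a customizable
// word-to-emoji mapping shared by a pair of users. Names are hypothetical.
struct EmojiLibrary {
    // Each word maps to the emoji string the pair has agreed on.
    private var entries: [String: String]

    init(entries: [String: String] = [:]) {
        self.entries = entries
    }

    // Users can customize the mapping, making the "code" unique to the pair.
    mutating func setMapping(word: String, emoji: String) {
        entries[word.lowercased()] = emoji
    }

    func emoji(for word: String) -> String? {
        entries[word.lowercased()]
    }
}

// Example: a pair's shared, customized vocabulary.
var library = EmojiLibrary(entries: [
    "pizza": "🍕",
    "later": "⏰➡️",
    "love": "❤️"
])
library.setMapping(word: "home", emoji: "🏠")
```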
The description that follows includes systems, methods, techniques, instruction sequences, and computing machine program products illustrative of examples of the disclosure. In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide an understanding of various examples of the disclosed subject matter. It will be evident, however, to those skilled in the art, that examples of the disclosed subject matter may be practiced without these specific details. In general, well-known instruction instances, protocols, structures, and techniques are not necessarily shown in detail.
For example, the client device 110 is a device of a given user who uses a client application 114 on an online social platform, a gaming platform, or other communication application. The client device 110 accesses a website, such as an online social platform hosted by a server system 108. The user inputs login credentials associated with the user, and the server system 108 receives the request and provides access to the online social platform.
A user of the client device 110 launches and engages a client application 114 hosted by the server system 108, which in one example is a messaging application. The client device 110 includes an emoji-first module 116 including a processor running client code for performing the emoji-first messaging on the client device 110. The emoji-first module 116 automatically converts text words entered by a user on a client device 110 into a string of one or more emojis based on a customizable library 118. The library 118 contains a list of emojis matched to one or more words of text. The messaging client application 114 communicates the emoji string between client devices 110. When a user of another client device 110 having the same customizable library 118 receives the generated emoji string, the receiving client device 110 displays the string of emojis on its display, and the user can optionally convert the received string of emojis to text, such as by tapping on the emoji string.
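For illustration, a text-to-emoji conversion of the kind described might split the entered text into words and substitute each word that has an entry in the library. The sketch below reuses the hypothetical EmojiLibrary above and is only an assumption about the mechanics; the emoji-first module 116 may handle phrases, punctuation, and unmapped words differently:

```swift
// Sketch of the emoji-first conversion performed before a message is sent:
// each entered word with a library mapping is replaced by its emoji.
// Words without a mapping are simply dropped in this sketch; the actual
// handling in the emoji-first module 116 may differ.
func convertToEmojiString(_ text: String, using library: EmojiLibrary) -> String {
    let words = text.lowercased().split(separator: " ").map(String.init)
    let emojis = words.compactMap { library.emoji(for: $0) }
    return emojis.joined()
}

// Example: "Pizza later" becomes "🍕⏰➡️" with the library sketched above.
let outgoing = convertToEmojiString("Pizza later", using: library)
```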
The one or more users may each be a person, a machine, or another means of interacting with the client device 110. In examples, the user may not be part of the system 100 but may interact with the system 100 via the client device 110 or other means. For instance, the user may provide input (e.g., touch screen input, alphanumeric input, verbal input, or visual input) to the client device 110, and the input may be communicated to other entities in the system 100 (e.g., third-party servers 128, server system 108, etc.) via a network 102 (e.g., the Internet). In this instance, the other entities in the system 100, in response to receiving the input from the user, may communicate information to the client device 110 via the network 102 to be presented to the user. In this way, the user interacts with the various entities in the system 100 using the client device 110.
One or more portions of the network 102 may be an ad hoc network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a wireless WAN (WWAN), a metropolitan area network (MAN), a portion of the Internet, a portion of the public switched telephone network (PSTN), a cellular telephone network, a wireless network, a WiFi network, a 4G LTE network, another type of network, or a combination of two or more such networks.
The client device 110 may access the various data and applications provided by other entities in the system 100 via a web client 112 (e.g., a browser) or one or more client applications 114. The client device 110 may include one or more client application(s) 114 (also referred to as “apps”) such as, but not limited to, a web browser, messaging application, multi-player gaming application, electronic mail (email) application, an e-commerce site application, a mapping or location application, and the like.
In some examples, one or more client application(s) 114 are included in a given one of the client devices 110 and configured to locally provide the user interface and at least some of the functionalities, with the client application(s) 114 configured to communicate with other entities in the system 100 (e.g., third-party server(s) 128, server system 108, etc.), on an as-needed basis, for data processing capabilities not locally available (e.g., to access location information, to authenticate a user, etc.). Conversely, one or more client application(s) 114 may not be included in the client device 110, in which case the client device 110 may use its web browser to access the one or more applications hosted on other entities in the system 100 (e.g., third-party server(s) 128, server system 108, etc.).
The server system 108 provides server-side functionality via the network 102 (e.g., the Internet or a wide area network (WAN)) to one or more third-party server(s) 128 and one or more client devices 110. The server system 108 includes an application server 104 including an application program interface (API) server 120, a web server 122, and one or more personalized font modules 124, which may be communicatively coupled with one or more database(s) 126. The one or more database(s) 126 may be storage devices that store data related to users of the server system 108, applications associated with the server system 108, cloud services, and so forth. The one or more database(s) 126 may further store information related to third-party server(s) 128, third-party application(s) 130, the client device 110, client application(s) 114, users, and so forth. In one example, the one or more database(s) 126 may be cloud-based storage.
The server system 108 may be a cloud computing environment, according to some examples. The server system 108, and any servers associated with the server system 108, may be associated with a cloud-based application, in one example.
The emoji-first module 116 is stored on the client device 110 and/or server 108 to optimize processing efficiency. In some examples, all modules for performing a specific task are stored on the device/server performing that action. In other examples, some modules for performing a task are stored on the client device 110 and other modules for performing that task are stored on the server 108 and/or other devices. In some examples, modules may be duplicated on the client device 110 and the server 108.
The one or more third-party application(s) 130, executing on the third-party server(s) 128, may interact with the server system 108 via a programmatic interface provided by the API server 120. For example, one or more of the third-party applications 130 may request and utilize information from the server system 108 via the API server 120 to support one or more features or functions on a website hosted by the third party or an application hosted by the third party. The third-party application(s) 130, for example, may provide software version analysis functionality that is supported by relevant functionality and data in the server system 108.
The emoji-first module 116 is an application, such as an iOS app, that enables emoji-first communication between two people in a close relationship, leveraging their closeness and history with each other to foster a shared emoji vocabulary between them. Each user creates an account and specifies a single partner with whom they will use the emoji-first module 116. The chat interface 200 allows the user pair to send and receive emoji-first messages between them, such that the messages comprise only emojis, such as shown in
The chat interface 200 allows users to exchange emoji-first messages with their partners. That is, the users receive sequences of emoji, referred to as strings, representing a text message without being accompanied by text at first, though they may choose to view the message in text later by tapping the messages. As shown in
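The disclosure does not spell out how the receiving device recovers the text, but because both client devices 110 hold the same library 118, one plausible approach is a reverse lookup from emoji back to words. The following is a sketch under that assumption (the function name is hypothetical, and only single-emoji mappings are handled):

```swift
// Sketch of recovering text from a received emoji string by reverse lookup
// in the same shared word-to-emoji entries. Assumes each word maps to a
// single, unique emoji; multi-emoji mappings would need longest-match
// parsing, which the disclosure does not describe.
func convertToText(_ emojiString: String, using entries: [String: String]) -> String {
    // Invert word -> emoji into emoji -> word.
    let reverse = Dictionary(uniqueKeysWithValues: entries.map { ($0.value, $0.key) })

    // Iterate over Characters so multi-scalar emoji stay intact, and leave
    // any emoji that has no known word unchanged.
    return emojiString
        .map { reverse[String($0)] ?? String($0) }
        .joined(separator: " ")
}

// Example: "🍕❤️" becomes "pizza love".
let recovered = convertToText("🍕❤️", using: ["pizza": "🍕", "love": "❤️"])
```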
Referring to
At block 402, the recipient, referred to as “Friend 1”, always sees the received emoji string first, shown as the notification message 312 in
At block 404, Friend 1 can long press message 312 from Friend 2 to toggle between the emoji string and text words as shown at 314 in
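By way of example only, the long-press toggle of block 404 could be realized with a SwiftUI view along the following lines; the view and property names are hypothetical, and this sketch is not the claimed interface:

```swift
import SwiftUI

// Illustrative sketch of the block-404 interaction: a received message is
// shown as its emoji string first, and a long press toggles between the
// emoji string and its text words. View and property names are hypothetical.
struct ReceivedMessageView: View {
    let emojiString: String   // e.g., "🍕⏰➡️"
    let textWords: String     // e.g., "pizza later"

    @State private var showText = false

    var body: some View {
        Text(showText ? textWords : emojiString)
            .padding(10)
            .background(Color.gray.opacity(0.2))
            .cornerRadius(12)
            .onLongPressGesture {
                // Toggle between the emoji-first and text representations.
                showText.toggle()
            }
    }
}
```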
At block 406, as shown in
At block 408, as shown in
Referring to
The activities that are the focus of the discussion here involve emoji-first messaging, and also the personalized library of emojis that is shared between two users of client devices 110. The emoji-first application 116 and the library 118 may be stored in memory 640 for execution by the CPU 630, such as in flash memory 640A or RAM 640B.
As shown in
To generate location coordinates for positioning of the client device 110, the client device 110 can include a global positioning system (GPS) receiver (not shown). Alternatively, or additionally, the client device 110 can utilize either or both the short range XCVRs 620 and WWAN XCVRs 610 for generating location coordinates for positioning. For example, cellular network, WiFi, or Bluetooth™ based positioning systems can generate very accurate location coordinates, particularly when used in combination. Such location coordinates can be transmitted to other devices over one or more network connections via the XCVRs 620.
The transceivers 610, 620 (network communication interface) conform to one or more of the various digital wireless communication standards utilized by modern mobile networks. Examples of WWAN transceivers 610 include (but are not limited to) transceivers configured to operate in accordance with Code Division Multiple Access (CDMA) and 3rd Generation Partnership Project (3GPP) network technologies including, for example and without limitation, 3GPP type 2 (or 3GPP2), LTE (at times referred to as “4G”), and 5G. For example, the transceivers 610, 620 provide two-way wireless communication of information including digitized audio signals, still image and video signals, web page information for display as well as web-related inputs, and various types of mobile message communications to/from the client device 110 for user identification strategies.
Several of these types of communications through the transceivers 610, 620 and a network, as discussed previously, relate to protocols and procedures in support of communications with the server system 108 for obtaining and storing friend device capabilities. Such communications, for example, may transport packet data via the short range XCVRs 620 over the wireless connections of network 102 to and from the server system 108 as shown in
The client device 110 further includes a microprocessor 630, shown as a CPU, sometimes referred to herein as the host controller. A processor is a circuit having elements structured and arranged to perform one or more processing functions, typically various data processing functions. Although discrete logic components could be used, the examples utilize components forming a programmable CPU. A microprocessor, for example, includes one or more integrated circuit (IC) chips incorporating the electronic elements to perform the functions of the CPU. The processor 630, for example, may be based on any known or available microprocessor architecture, such as Reduced Instruction Set Computing (RISC) using an ARM architecture, as commonly used today in client devices and other portable electronic devices. Other processor circuitry may be used to form the CPU 630 or processor hardware in smartphones, laptop computers, and tablets.
The microprocessor 630 serves as a programmable host controller for the client device 110 by configuring the device to perform various operations, for example, in accordance with instructions or programming executable by the processor 630. For example, such operations may include various general operations of the client device 110, as well as operations related to emoji-first messaging using the emoji-first application 116 and the personalized libraries 118 mapping emojis to text between two or more users. Although a processor may be configured by use of hardwired logic, typical processors in client devices are general processing circuits configured by execution of programming.
The client device 110 includes a memory or storage device system, for storing data and programming. In the example, the memory system may include a flash memory 640A and a random access memory (RAM) 640B. The RAM 640B serves as short term storage for instructions and data being handled by the processor 630, e.g., as a working data processing memory. The flash memory 640A typically provides longer term storage.
Hence, in the example of the client device 110, the flash memory 640A is used to store programming or instructions for execution by the processor 630. Depending on the type of device, the client device 110 stores and runs a mobile operating system through which specific applications, including the client application 114, are executed. Examples of mobile operating systems include Google Android®, Apple iOS® (iPhone or iPad devices), Windows Mobile®, Amazon Fire OS®, RIM BlackBerry® operating system, or the like.
The terms and expressions used herein are understood to have the ordinary meaning as is accorded to such terms and expressions with respect to their corresponding respective areas of inquiry and study except where specific meanings have otherwise been set forth herein. Relational terms such as first and second and the like may be used solely to distinguish one entity or action from another without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms “comprises,” “comprising,” “includes,” “including,” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises or includes a list of elements or steps does not include only those elements or steps but may include other elements or steps not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “a” or “an” does not, without further constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises the element.
In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various examples for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed examples require more features than are expressly recited in each claim. Rather, as the following claims reflect, the subject matter to be protected lies in less than all features of any single disclosed example. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.
The examples illustrated herein are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed. Other examples may be used and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. The Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various examples is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.
Additional objects, advantages and novel features of the examples will be set forth in part in the description which follows, and in part will become apparent to those skilled in the art upon examination of the following and the accompanying drawings or may be learned by production or operation of the examples. The objects and advantages of the present subject matter may be realized and attained by means of the methodologies, instrumentalities and combinations particularly pointed out in the appended claims.
This application is a continuation of U.S. application Ser. No. 17/234,905 filed on Apr. 20, 2021, the contents of which are incorporated fully herein by reference.
Publication

Number | Date | Country
---|---|---
20230090565 A1 | Mar 2023 | US

Related U.S. Application Data

Relation | Number | Date | Country
---|---|---|---
Parent | 17234905 | Apr 2021 | US
Child | 18072367 |  | US