The embodiments described in this disclosure relate to messaging systems such as text messaging systems on cellular telephones or other communication devices or data processing systems.
The use of text messaging systems began many years ago. For example, wireless cellular telephone carriers, such as Verizon or AT&T, allowed text messages through the Short Message Service (SMS) for cell phones in the 1990s before smartphones were available. Typically, the amount of data transmitted has been limited by rules established by the carriers. Recently, as the use of smartphones (e.g. iPhones) and tablet computers (e.g. iPad) has increased, the text messaging systems have developed the ability to send images, such as photos or emojis. In addition, messaging systems such as iMessage from Apple Inc. of Cupertino, Calif. have allowed users to also send and receive text and images through “public” networks which include “public” WiFi access points and the Internet (in addition to using the wireless carrier's private cellular telephone networks), and messaging systems such as iMessage can seamlessly transition between the use of public and private networks depending on the availability of, for example, WiFi access points or the compatibility of the other user's device (which may not be compatible with iMessage).
The embodiments described herein relate to a text messaging system that can include a set of one or more layers displayed within a message transcript of a messaging app. These one or more layers are in addition to one or more message bubbles on a received message layer in the message transcript and one or more message bubbles on a sent message layer in the message transcript in one embodiment. These one or more layers can include animated content and can be overlaid relative to each other and to the sent message layer and to the received message layer. In one embodiment, each of these one or more layers can have a Z depth relative to each other and to the sent and received message layers. In one embodiment, the layers can be composited using a set of alpha values and the relative Z depth values to create a composite or sequence of composite images if there are one or more animations in the one or more layers.
A method according to one embodiment can include the following operations: receiving, by a first messaging app, text entered by a user; detecting that the text includes a set of one or more predetermined words associated with a first layer identifier; and sending, by the first messaging app, the text and the first layer identifier to a second messaging app in response to a send command after detecting the text. In one embodiment, the first messaging app and the second messaging app are configured to communicate text messages through one or more messaging servers. In one embodiment, the first layer identifier specifies, for a first layer, a Z depth of the first layer relative to at least one of one or more message bubbles in a sent message layer and one or more message bubbles in a received message layer. In one embodiment, the first messaging app can send one or more additional layer identifiers to the second messaging app, and these additional layer identifiers can specify for each layer a Z depth of the layer relative to other layers used to display a message transcript. In one embodiment, each layer identifier can specify or refer to content which can be created or generated on a receiving device.
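The following is a minimal, illustrative sketch (in Swift) of the sending-side detection described above; the word-to-identifier table, the LayerIdentifier type, and the send placeholder are assumptions introduced here for illustration only and are not part of the disclosure or of any particular framework.

```swift
import Foundation

// Hypothetical descriptor for a layer; the fields are assumptions.
struct LayerIdentifier {
    let id: String   // refers to content that can be created or generated on the receiving device
    let zDepth: Int  // Z depth relative to the sent and received message layers
}

// Hypothetical table of predetermined words associated with layer identifiers.
let predeterminedWords: [String: LayerIdentifier] = [
    "congratulations": LayerIdentifier(id: "balloons", zDepth: 1),
    "happy birthday": LayerIdentifier(id: "confetti", zDepth: 2)
]

// Detect whether the entered text includes any of the predetermined words.
func layerIdentifiers(for text: String) -> [LayerIdentifier] {
    let lowered = text.lowercased()
    return predeterminedWords.filter { lowered.contains($0.key) }.map { $0.value }
}

// In response to a send command, the first messaging app would send the text
// and any detected layer identifiers to the second messaging app through the
// messaging servers; the transport call here is only a placeholder.
func send(text: String) {
    let identifiers = layerIdentifiers(for: text)
    print("sending text \"\(text)\" with layer identifiers \(identifiers.map { $0.id })")
}
```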
A method according to another embodiment can include the following operations: receiving, by a first messaging app, a message from a second messaging app; receiving, by the first messaging app, a first layer identifier associated with the message, wherein the first layer identifier specifies for a first layer a Z depth of the first layer relative to at least one of one or more message bubbles in a sent message layer and one or more message bubbles in a received message layer in a message transcript; receiving or generating, by the first messaging app, content for display in the first layer; and displaying the sent message layer, the received message layer, and the content in the first layer, wherein the first layer is displayed at the Z depth relative to at least one of the one or more message bubbles in the sent message layer and the one or more message bubbles in the received message layer. In one embodiment, the sent message layer and the received message layer can have different Z depths, and the first layer can be, in Z depth, between the sent message layer and the received message layer. In one embodiment, the messaging app can use a plurality of layers in addition to the sent message layer and the received message layer, and each of those layers can have a Z depth relative to each other and relative to the sent message layer and the received message layer. In one embodiment, all of the layers are overlaid and composited based on a set of relative Z depth values and a set of alpha values in order to create a composite image. In one embodiment, each of the layers (or a subset of the layers) can display an animation which is displayed over a period of time, and once the animation is completed, all layers are removed except for the sent message layer and the received message layer. In one embodiment, each layer is identified by a layer identifier which can specify the Z depth of the layer relative to the other layers and can also specify content for display in the particular layer. In one embodiment, a receiving messaging app can generate the content for display in a particular layer based on the identifier for the layer.
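As a further illustration, the following sketch shows one possible way (not necessarily the disclosed implementation) to overlay an animated content layer between the received and sent message layers using Core Animation Z positions and opacity; the depth values, opacity, and animation are placeholder choices.

```swift
import UIKit

// Overlay an animated content layer between the received and sent message
// layers; depth and alpha values here are illustrative placeholders.
func addOverlayLayers(to transcriptView: UIView) {
    let receivedMessageLayer = CALayer()
    receivedMessageLayer.zPosition = 0

    let animatedContentLayer = CALayer()
    animatedContentLayer.zPosition = 1   // in Z depth, between the received and sent layers
    animatedContentLayer.opacity = 0.8   // alpha value used when the layers are composited

    let sentMessageLayer = CALayer()
    sentMessageLayer.zPosition = 2

    for layer in [receivedMessageLayer, animatedContentLayer, sentMessageLayer] {
        layer.frame = transcriptView.bounds
        transcriptView.layer.addSublayer(layer)
    }

    // An animation displayed over a period of time; once it completes, the
    // extra layer can be removed so that only the sent and received message
    // layers remain.
    let fade = CABasicAnimation(keyPath: "opacity")
    fade.fromValue = 0.8
    fade.toValue = 0.0
    fade.duration = 2.0
    animatedContentLayer.add(fade, forKey: "fadeOut")
}
```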
The methods and systems described herein can be implemented by data processing systems, such as one or more smartphones, tablet computers, desktop computers, laptop computers, smart watches, audio accessories, and other data processing systems and other consumer electronic devices. The methods and systems described herein can also be implemented by one or more data processing systems which execute executable computer program instructions, stored in one or more non-transitory machine readable media that cause the one or more data processing systems to perform the one or more methods described herein when the program instructions are executed. Thus, the embodiments described herein can include methods, data processing systems, and non-transitory machine readable media.
The above summary does not include an exhaustive list of all embodiments in this disclosure. All systems and methods can be practiced from all suitable combinations of the various aspects and embodiments summarized above, and also those disclosed in the Detailed Description below.
The present invention is illustrated by way of example and not limitation in the figures of the accompanying drawings in which like references indicate similar elements.
Various embodiments and aspects will be described with reference to details discussed below, and the accompanying drawings will illustrate the various embodiments. The following description and drawings are illustrative and are not to be construed as limiting. Numerous specific details are described to provide a thorough understanding of various embodiments. However, in certain instances, well-known or conventional details are not described in order to provide a concise discussion of embodiments.
Reference in the specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in conjunction with the embodiment can be included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification do not necessarily all refer to the same embodiment. The processes depicted in the figures that follow are performed by processing logic that comprises hardware (e.g. circuitry, dedicated logic, etc.), software, or a combination of both. Although the processes are described below in terms of some sequential operations, it should be appreciated that some of the operations described may be performed in a different order. Moreover, some operations may be performed in parallel rather than sequentially.
The various embodiments described herein relate to messaging systems such as text messaging systems or “chat” messaging systems or other systems which allow devices to communicate messages between the devices. For example, iMessage from Apple Inc. of Cupertino, Calif. is an example of a messaging service for iOS devices and Mac (OS X) computers. Typically, a messaging system includes a plurality of client devices, each including at least one messaging app, and a set of one or more messaging servers that can receive messages from client devices and transmit messages to client devices.
A brief overview of an example of a messaging system will now be provided in conjunction with
A messaging system in one embodiment on a client device includes a messaging app and one or more extension apps that each operate as separate processes. In one embodiment, the messaging app and the one or more extension apps can each be separate sandboxed processes that operate or execute in their own memory spaces. In addition, the messaging app can also operate with plug-ins, such as an image creation plug-in shown in
In one embodiment the messaging app provides a view of content obtained from the extension app through the interprocess communication. The extension app can create the content in its own process and then provide that content in formats known to be acceptable to the messaging app (such as standard image formats or other standard formats). This allows the messaging app to then present the content from the extension app within one or more message bubbles within a message transcript (without needing to execute the extension app at least on the receiving device).
Objects created by an extension app in one embodiment are shown in the message transcript on sending and receiving devices without launching the extension app. The extension app should provide enough information to construct a message bubble as part of the object. The object can consist of some opaque data encoded in a resource locator and a layout specification provided as an MSMessageTemplateLayout object. MSMessageTemplateLayout is a subclass of MSMessageLayout and represents one method of specifying message bubble layout.
MSMessageTemplateLayout can have the following properties in one embodiment which are shown in
1) image or mediaFileURL : An image provided as a UIImage, as a file URL to an image file, or as a file URL to a video
2) imageTitle : A string that will be rendered on top of the image or movie
3) imageSubTitle : A string that will be rendered on top of the image or movie below the imageTitle
4) caption : A string that will be rendered in a caption bar below the image or movie
5) trailingCaption : A string that will be rendered right aligned in a caption bar below the image or movie
6) subCaption : A string that will be rendered in a caption bar below the caption
7) trailingSubCaption : A string that will be rendered right aligned in a caption bar below the trailingCaption
8) Extension icon: This is not provided as part of the MSMessageTemplateLayout but is derived from the bundle identifier of the extension that created the MSMessage.
The messaging app can use this information to construct the message bubble similar to the example shown in
The MSMessageTemplateLayout is serialized and transferred to the remote devices along with the opaque data. On receipt, the messaging app on the receiving device will create an MSMessageTemplateLayout using the serialized data and use this to draw the message bubble in the receiver's message transcript.
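For reference, Apple's Messages framework provides classes with the names used above; the following is a brief, illustrative usage sketch rather than a definition of the embodiment, and the image, caption strings, and URL are placeholders (error handling is omitted).

```swift
import Messages
import UIKit

// Build a message bubble layout and insert the message into the conversation;
// the caption strings, image, and URL are placeholders.
func composeMessage(in conversation: MSConversation, image: UIImage) {
    let layout = MSMessageTemplateLayout()
    layout.image = image                 // alternatively, set mediaFileURL for a movie file
    layout.imageTitle = "Dinner?"        // rendered on top of the image
    layout.caption = "Table for two"     // rendered in a caption bar below the image
    layout.trailingCaption = "7:30 PM"   // right aligned in the caption bar

    // Opaque data encoded in a resource locator and carried with the layout.
    let message = MSMessage(session: conversation.selectedMessage?.session ?? MSSession())
    message.layout = layout
    message.url = URL(string: "https://example.com/reservation?people=2")

    conversation.insert(message) { error in
        if let error = error {
            print("insert failed: \(error)")
        }
    }
}
```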
In one embodiment, the extension apps which are configured to operate with a messaging app are not executable outside of the messaging application, and thus their life cycle is managed entirely by the messaging app. Moreover, as further described below, the downloading and installing of the extension apps can be controlled exclusively by the messaging app in one embodiment.
In one embodiment, each extension app can be obtained from an app marketplace or distribution facility such as the Apple App Store (trademark) for message extension apps and can be launched from within the messaging app.
Referring to
Referring back to
In one embodiment, the message transmitted from the remote device to the communication device 250 contains metadata which specifies the remote extension app used to create the content. In one embodiment, this metadata can be an app identifier, such as an identifier provided by an app marketplace or an extension app marketplace from which the extension apps can be downloaded and installed, or can be a different identifier that can be associated with the identifier used by the app marketplace. In one embodiment, the notice 259 can result from the selection of the message bubble 253, while in another embodiment it can result automatically if the extension app identified by the app identifier in the metadata for the content is not installed when the content for the message bubble 253 is received by the communication device 250.
In one embodiment, a messaging app can launch different types of extension apps in different ways depending upon the type of the extension app. For example, one type of extension app can be launched automatically in response to receipt of a message bubble containing content from an extension app having a certain predetermined type. Other extension apps having a different type may only be launched in one embodiment in response to the selection of a message bubble containing content from that extension app or by the selection of an icon representing the extension app in a browsable view, such as browsable view 571. It may be desirable to allow certain extension apps having a certain type to be launched automatically in response to receiving content that is displayed within the message transcript while not automatically launching other types of extension apps. In another alternative embodiment, one or more extension apps can be permitted to execute in the background and can be allowed to update their respective user interfaces that are presented in their respective message bubbles.
In an alternative embodiment, the metadata can include a format or extension identifier such as an identifier of an image format that can be used to determine available extension apps that can process that image format on the receiving device.
The method shown in
In operation 351 of
In operation 451 of
At this point, the extension app 413 can receive user input from the user of client device 405 and can modify one or more of the content, the resource locator, or the data. For example, the user of client device 405 can cause the extension app 413 to access one or more websites to make a modified restaurant reservation by modifying the time, the number of people, the particular restaurant, etc. In one embodiment, the extension app 413, and also the extension app 407, can interact directly (but separately and independently) with the web server by sending the resource locator and the data to the web server and receiving responses from the web server which may include modified data or modified resource locators, or new data and/or new resource locators, etc. In one embodiment, the web server can store data for use during the session, and this stored data can include information for some or all of the state information that can also be maintained by the two extension apps in the session. Again, if the extension app 413 is presented for display in a compact view, then the user of the device 405 can interact with the extension app 413 to make the restaurant reservation while the context and conversation of the chat or messaging session is shown in the message transcript of the messaging app 411. The user of the client device 405 can scroll through the message transcript while continuing to look at and interact with the extension app 413. Thus, the extension app 413 can, in operation 463, receive user input and may modify at least one of the content, the resource locator, or the data, and then can pass, in operation 465, the resource locator and data 427 (which may be modified or new) to the messaging app 411. In turn, the messaging app 411, in operation 467, can send the content, which may be modified, and the app identifier and the resource locator (which may be modified) and data (which may be modified) and the bubble ID back to the client device 401. As shown in operation 469, this process can repeat over time as the two users work on setting up a restaurant reservation in the example provided herein.
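As one way to illustrate how the shared reservation state in this example could be carried in the resource locator and data passed between the extension apps, the following sketch encodes and decodes hypothetical reservation fields in URL query items; the field names and URL are assumptions, not part of the disclosure.

```swift
import Foundation

// Hypothetical shared state for the collaborative reservation example.
struct ReservationState {
    var restaurant: String
    var partySize: Int
    var time: String
}

// Encode the state into a resource locator that can be sent with the message.
func resourceLocator(for state: ReservationState) -> URL? {
    var components = URLComponents(string: "https://example.com/reservation")
    components?.queryItems = [
        URLQueryItem(name: "restaurant", value: state.restaurant),
        URLQueryItem(name: "partySize", value: String(state.partySize)),
        URLQueryItem(name: "time", value: state.time)
    ]
    return components?.url
}

// Decode the state from a received resource locator so the other extension
// app can present and modify it.
func reservationState(from url: URL) -> ReservationState? {
    guard let items = URLComponents(url: url, resolvingAgainstBaseURL: false)?.queryItems else {
        return nil
    }
    func value(_ name: String) -> String? { items.first { $0.name == name }?.value }
    guard let restaurant = value("restaurant"),
          let sizeText = value("partySize"), let partySize = Int(sizeText),
          let time = value("time") else {
        return nil
    }
    return ReservationState(restaurant: restaurant, partySize: partySize, time: time)
}
```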
It will be appreciated that many different types of extension apps can provide a collaborative environment between the users of client devices 401 and 405 to exchange information and collaborate together and that restaurant reservation is one such type. Thus, it will be appreciated that the restaurant reservation example that is described relative to
The collaborative environment shown in
It can be seen from
If the receiving device, such as client device 405 in operation 459, is capable of installing and using the extension app (identified by the app identifier provided in communication 419) but the extension app is not installed on the receiving device, the receiving device can, within the user interface of the messaging app, offer to download and install the extension app (again specified by the app identifier in communication 419) on the receiving device.
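A minimal sketch of the install check described above follows; the helper functions here (installedExtensionIdentifiers and offerToInstall) are invented placeholders for whatever mechanism the receiving device uses to enumerate installed extension apps and present the offer, and are not real framework calls.

```swift
import Foundation

// Placeholder: a real implementation would query the extension apps installed
// for use with the messaging app on the receiving device.
func installedExtensionIdentifiers() -> Set<String> {
    return ["com.example.someextension"]
}

// Placeholder: present an offer, within the messaging app's user interface,
// to download and install the extension app specified by the app identifier.
func offerToInstall(appIdentifier: String) {
    print("Offering to download and install extension app: \(appIdentifier)")
}

// Handle incoming content using the app identifier carried in the metadata.
func handleIncomingContent(appIdentifier: String) {
    if installedExtensionIdentifiers().contains(appIdentifier) {
        print("Extension app \(appIdentifier) is installed; it can present the content.")
    } else {
        offerToInstall(appIdentifier: appIdentifier)
    }
}
```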
In some embodiments, it may be desirable to provide an identifier of each user to each extension app executing on a client device, particularly in the case of a collaborative environment in which two or more users are interacting through the messaging apps and the extension apps.
In one embodiment, the messaging systems described herein can provide confirmation of receipt of messages to one or more extension apps, and this may be useful in certain cases in which the extension app needs to be assured that the remote extension app has the same known state as the local extension app.
Another aspect of the embodiments described herein relates to backward compatibility, and that aspect is shown in
The communication device 675 shown in
Another aspect of the embodiments described herein relates to a service, such as an app marketplace that can provide a plurality of different extension apps for use within a messaging app according to the one or more embodiments described herein. The service or app marketplace can present browsable views of the plurality of different extension apps and messaging app plug-ins, provide information about those various extension apps, and provide for downloading of those extension apps to a client device to allow the client device to install one or more extension apps.
Referring now to
In one embodiment, the messaging app may cause the automatic updating of extension apps which have been installed. In another embodiment, the messaging app may provide alerts or notices to the user that certain extension apps need to be updated, and the notifications about these updates can be received from an extension app marketplace in one embodiment. This can allow a user to selectively decide whether or not to update specific extension apps.
Another aspect of the embodiments described herein relates to the use of one or more layers of content which are displayed along with one or more received message bubbles and one or more sent message bubbles.
A method which can be performed on a sending device according to one embodiment will now be described in conjunction with
In an alternative embodiment, the processing of the text to detect the predetermined words and the processing to find the associated layer identifiers for the detected predetermined words can occur on the receiving device. In other words, the receiving device receives the text from the sending device and then performs the detection of the words associated with layer identifiers and then performs operations 1075, 1077 and 1079.
Referring now to
One or more Application Programming Interfaces (APIs) may be used in some embodiments. An API is an interface implemented by a program code component or hardware component (hereinafter “API-implementing component”) that allows a different program code component or hardware component (hereinafter “API-calling component”) to access and use one or more functions, methods, procedures, data structures, classes, and/or other services provided by the API-implementing component. An API can define one or more parameters that are passed between the API-calling component and the API-implementing component.
An API allows a developer of an API-calling component (which may be a third party developer) to leverage specified features provided by an API-implementing component. There may be one API-calling component or there may be more than one such component. An API can be a source code interface that a computer system or program library provides in order to support requests for services from an application. An operating system (OS) can have multiple APIs to allow applications running on the OS to call one or more of those APIs, and a service (such as a program library) can have multiple APIs to allow an application that uses the service to call one or more of those APIs. An API can be specified in terms of a programming language that can be interpreted or compiled when an application is built.
In some embodiments the API-implementing component may provide more than one API, each providing a different view of or with different aspects that access different aspects of the functionality implemented by the API-implementing component. For example, one API of an API-implementing component can provide a first set of functions and can be exposed to third party developers, and another API of the API-implementing component can be hidden (not exposed) and provide a subset of the first set of functions and also provide another set of functions, such as testing or debugging functions which are not in the first set of functions. In other embodiments the API-implementing component may itself call one or more other components via an underlying API and thus be both an API-calling component and an API-implementing component.
An API defines the language and parameters that API-calling components use when accessing and using specified features of the API-implementing component. For example, an API-calling component accesses the specified features of the API-implementing component through one or more API calls or invocations (embodied for example by function or method calls) exposed by the API and passes data and control information using parameters via the API calls or invocations. The API-implementing component may return a value through the API in response to an API call from an API-calling component. While the API defines the syntax and result of an API call (e.g., how to invoke the API call and what the API call does), the API may not reveal how the API call accomplishes the function specified by the API call. Various API calls are transferred via the one or more application programming interfaces between the calling (API-calling component) and an API-implementing component. Transferring the API calls may include issuing, initiating, invoking, calling, receiving, returning, or responding to the function calls or messages; in other words, transferring can describe actions by either of the API-calling component or the API-implementing component. The function calls or other invocations of the API may send or receive one or more parameters through a parameter list or other structure. A parameter can be a constant, key, data structure, object, object class, variable, data type, pointer, array, list or a pointer to a function or method or another way to reference a data or other item to be passed via the API.
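To make the relationship concrete, here is a small, generic sketch of an API-implementing component and an API-calling component; the protocol, type, and data in it are invented purely for illustration and do not correspond to any real library.

```swift
import Foundation

// The API: the interface through which the calling component accesses the
// specified features, passing data via parameters and receiving a return value.
protocol GeocodingAPI {
    func coordinates(for address: String) -> (latitude: Double, longitude: Double)?
}

// API-implementing component: provides the function without revealing how
// the call is accomplished internally.
struct SimpleGeocoder: GeocodingAPI {
    private let table = ["1 Infinite Loop": (latitude: 37.33182, longitude: -122.03118)]
    func coordinates(for address: String) -> (latitude: Double, longitude: Double)? {
        return table[address]
    }
}

// API-calling component: uses only the API, not the implementation details.
func printCoordinates(using api: GeocodingAPI) {
    if let coords = api.coordinates(for: "1 Infinite Loop") {
        print("latitude \(coords.latitude), longitude \(coords.longitude)")
    }
}

printCoordinates(using: SimpleGeocoder())
```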
Furthermore, data types or classes may be provided by the API and implemented by the API-implementing component. Thus, the API-calling component may declare variables, use pointers to, use or instantiate constant values of such types or classes by using definitions provided in the API.
Generally, an API can be used to access a service or data provided by the API-implementing component or to initiate performance of an operation or computation provided by the API-implementing component. By way of example, the API-implementing component and the API-calling component may each be any one of an operating system, a library, a device driver, an API, an application program, or other module (it should be understood that the API-implementing component and the API-calling component may be the same or different type of module from each other). API-implementing components may in some cases be embodied at least in part in firmware, microcode, or other hardware logic. In some embodiments, an API may allow a client program (e.g., game center application) to use the services provided by a Software Development Kit (SDK) library. In other embodiments an application or other client program may use an API provided by an Application Framework. In these embodiments the application or client program may incorporate calls to functions or methods provided by the SDK and provided by the API or use data types or objects defined in the SDK and provided by the API. An Application Framework may in these embodiments provide a main event loop for a program that responds to various events defined by the Framework. The API allows the application to specify the events and the responses to the events using the Application Framework. In some implementations, an API call can report to an application the capabilities or state of a hardware device, including those related to aspects such as input capabilities and state, output capabilities and state, processing capability, power state, storage capacity and state, communications capability, etc., and the API may be implemented in part by firmware, microcode, or other low level logic that executes in part on the hardware component.
The API-calling component may be a local component (i.e., on the same data processing system as the API-implementing component) or a remote component (i.e., on a different data processing system from the API-implementing component) that communicates with the API-implementing component through the API over a network. It should be understood that an API-implementing component may also act as an API-calling component (i.e., it may make API calls to an API exposed by a different API-implementing component) and an API-calling component may also act as an API-implementing component by implementing an API that is exposed to a different API-calling component.
The API may allow multiple API-calling components written in different programming languages to communicate with the API-implementing component (thus the API may include features for translating calls and returns between the API-implementing component and the API-calling component); however, the API may be implemented in terms of a specific programming language. An API-calling component can, in one embodiment, call APIs from different providers such as a set of APIs from an OS provider, another set of APIs from a plug-in provider, and another set of APIs from another provider (e.g. the provider of a software library) or creator of another set of APIs.
It will be appreciated that the API-implementing component 3210 may include additional functions, methods, classes, data structures, and/or other features that are not specified through the API 3220 and are not available to the API-calling component 3230. It should be understood that the API-calling component 3230 may be on the same system as the API-implementing component 3210 or may be located remotely and access the API-implementing component 3210 using the API 3220 over a network. While
The API-implementing component 3210, the API 3220, and the API-calling component 3230 may be stored in a machine-readable medium (e.g., computer-readable medium), which includes any mechanism for storing information in a form readable by a machine (e.g., a computer or other data processing system). For example, a machine-readable medium includes magnetic disks, optical disks, random access memory, read only memory, flash memory devices, etc.
In
Note that the Service 2 has two APIs, one of which (Service 2 API 1) receives calls from and returns values to Application 1 and the other (Service 2 API 2) receives calls from and returns values to Application 2. Service 1 (which can be, for example, a software library) makes calls to and receives returned values from OS API 1, and Service 2 (which can be, for example, a software library) makes calls to and receives returned values from both OS API 1 and OS API 2. Application 2 makes calls to and receives returned values from OS API 2.
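The layering just described can be illustrated with a small sketch in which an application calls a service's API and the service in turn calls an OS-level API; all of the names below are invented for illustration and do not correspond to real OS or library interfaces.

```swift
import Foundation

// A stand-in for an OS API (e.g., "OS API 1" above).
func osWriteLog(_ entry: String) {
    print("[os] \(entry)")
}

// A stand-in for a service such as a software library; it exposes its own
// API ("Service 1 API") and makes calls to the OS API underneath.
struct LoggingService {
    func log(event: String) {
        osWriteLog("event=\(event)")
    }
}

// A stand-in for an application; it makes calls to and receives returned
// values from the service's API rather than calling the OS API directly.
let service = LoggingService()
service.log(event: "message_sent")
```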
The systems and methods described herein can be implemented in a variety of different data processing systems and devices, including general-purpose computer systems, special purpose computer systems, or a hybrid of general purpose and special purpose computer systems. Exemplary data processing systems that can use any one of the methods described herein include desktop computers, laptop computers, tablet computers, smart phones, cellular telephones, personal digital assistants (PDAs), embedded electronic devices, or consumer electronic devices.
As shown in
While
It will be apparent from this description that aspects of the present invention may be embodied, at least in part, in software. That is, the techniques may be carried out in a data processing system in response to its processor executing a sequence of instructions contained in a storage medium, such as a non-transitory machine-readable storage medium (e.g. DRAM or flash memory). In various embodiments, hardwired circuitry may be used in combination with software instructions to implement the present invention. Thus the techniques are not limited to any specific combination of hardware circuitry and software, or to any particular source for the instructions executed by the data processing system. Moreover, it will be understood that where mobile or handheld devices are described, the description encompasses mobile devices (e.g., laptop devices, tablet devices), handheld devices (e.g., smartphones), as well as embedded systems suitable for use in wearable electronic devices.
The present disclosure recognizes that the use of personal information data (such as health data collected by one or more watches), in the present technology, can be used to the benefit of users. For example, the personal information data can be used to deliver health related information or targeted content that is of greater interest to the user. Accordingly, use of such personal information data can enable calculated control of the delivered content. Further, other uses for personal information data that benefit the user are also contemplated by the present disclosure.
The present disclosure further contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining personal information data private and secure. For example, personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection should occur only after receiving the informed consent of the users. Additionally, such entities would take any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices.
Despite the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, in the case of health information or advertisement delivery services, the present technology can be configured to allow users to select to “opt in” or “opt out” of participation in the collection of personal information data during registration for services. In another example, users can select not to provide location information for targeted content delivery services. In yet another example, users can select to not provide precise location information, but permit the transfer of location zone information. In yet another example, users can select not to provide pertinent health information such as weight, personal characteristics, traits, etc.
In the foregoing specification, specific exemplary embodiments have been described. It will be evident that various modifications may be made to those embodiments without departing from the broader spirit and scope set forth in the following claims. The specification and drawings are, accordingly, to be regarded in an illustrative sense rather than a restrictive sense.
This application is a continuation of U.S. patent application Ser. No. 15/400,780, entitled “LAYERS IN MESSAGING APPLICATIONS,” filed on Jan. 6, 2017, which claims the benefit of the following U.S. Provisional Patent Application Nos. (all of which are incorporated herein by reference): 62/349,086, filed on Jun. 12, 2016; 62/349,101, filed on Jun. 12, 2016; 62/349,113, filed on Jun. 12, 2016; and 62/349,091, filed on Jun. 12, 2016.
Danish Patent, dated Jul. 22, 2019, received in Danish Patent Application No. 201670655, 6 pages. |
Danish Patent, dated May 22, 2018, received in Danish Patent Application No. 201670648, 3 pages. |
Danish Patent, dated May 8, 2019, received in Danish Patent Application No. 201670654, 4 pages. |
European Office Action, dated Dec. 17, 2018, received in European Patent Application No. 17728317.3, 11 pages. |
European Office Action, dated Feb. 6, 2020, received in European Patent Application No. 18167254.4, 9 pages. |
European Office Action, dated Jan. 30, 2020, received in European Patent Application No. 19180887.2, 5 pages. |
European Office Action, dated Jan. 9, 2020, received in European Patent Application No. 19181254.4, 6 pages. |
European Office Action, dated Jun. 27, 2019, received in European Patent Application No. 17728317.3, 6 pages. |
European Office Action, dated May 24, 2018, received in European Patent Application No. 17728317.3, 3 pages. |
European Office Action, dated Sep. 6, 2018, received in European Patent Application No. 18167254.4, 6 pages. |
European Search Report, dated Jan. 13, 2020, received in European Patent Application No. 19180887.2, 4 pages. |
European Search Report, dated Jul. 26, 2017, received in European Patent Application No. 17174969.0, 13 pages. |
European Search Report, dated Jul. 27, 2018, received in European Patent Application No. 18167254.4, 6 pages. |
International Search Report and Written Opinion, dated Jul. 19, 2017, received in International Patent Application No. PCT/US2017/034340, 10 pages. |
International Search Report and Written Opinion, dated Sep. 15, 2017, received in International Patent Application No. PCT/US2017/033395, 16 pages. |
Japanese Office Action, dated May 13, 2019, received in Japanese Patent Application No. 2018510791, 4 pages. |
Japanese Patent, dated Jun. 14, 2019, received in Japanese Patent Application No. 2018510791, 3 pages. |
Korean Office Action, dated Jul. 25, 2019, received in Korean Patent Application No. 2019-7003574, 5 pages. |
Korean Office Action, dated Mar. 6, 2018, received in Korean Patent Application No. 2018-7003537, 2 pages. |
Korean Office Action, dated Oct. 7, 2019, received in Korean Patent Application No. 2019-7019197, 4 pages. |
Korean Patent, dated Feb. 1, 2019, received in Korean Patent Application No. 2018-7003537, 5 pages. |
Summons to Attend Oral Proceedings, dated Jan. 27, 2020, received in European Patent Application No. 17728317.3, 10 pages. |
Taiwanese Office Action, dated May 1, 2019, received in Taiwanese Patent Application No. 106118670, 6 pages. |
U.S. Final Office Action, dated Aug. 7, 2019, received in U.S. Appl. No. 15/272,416, 25 pages. |
U.S. Final Office Action, dated Jan. 24, 2020, received in U.S. Appl. No. 15/272,402, 39 pages. |
U.S. Final Office Action, dated Jan. 8, 2020, received in U.S. Appl. No. 15/272,424, 7 pages. |
U.S. Notice of Allowance, dated Dec. 20, 2017, received in U.S. Appl. No. 15/272,399, 8 pages. |
U.S. Notice of Allowance, dated Jan. 14, 2019, received in U.S. Appl. No. 15/272,429, 8 pages. |
U.S. Notice of Allowance, dated Nov. 6, 2019, received in U.S. Appl. No. 15/272,421, 15 pages. |
U.S. Notice of Allowance, dated Nov. 7, 2018, received in U.S. Appl. No. 15/272,430, 7 pages. |
U.S. Office Action, dated Aug. 15, 2019, received in U.S. Appl. No. 15/272,424, 12 pages. |
U.S. Office Action, dated Feb. 17, 2017, received in U.S. Appl. No. 15/272,399, 13 pages. |
U.S. Office Action, dated Jan. 10, 2020, received in U.S. Appl. No. 15/272,416, 27 pages. |
U.S. Office Action, dated Jul. 10, 2018, received in U.S. Appl. No. 15/272,430, 16 pages. |
U.S. Office Action, dated Jul. 11, 2019, received in U.S. Appl. No. 15/272,402, 40 pages. |
U.S. Office Action, dated Jul. 22, 2019, received in U.S. Appl. No. 15/272,419, 21 pages. |
U.S. Office Action, dated Jul. 28, 2017, received in U.S. Appl. No. 15/272,399, 19 pages. |
U.S. Office Action, dated Jul. 3, 2018, received in U.S. Appl. No. 15/272,429, 25 pages. |
U.S. Office Action, dated Mar. 7, 2019, received in U.S. Appl. No. 15/272,416, 29 pages. |
U.S. Office Action, dated May 8, 2019, received in U.S. Appl. No. 15/272,421, 13 pages. |
U.S. Office Action, dated Jul. 22, 2019, received in U.S. Appl. No. 15/272,411, 23 pages. |
U.S. Office Action, dated Feb. 3, 2020, received in U.S. Appl. No. 15/272,411, 29 pages. |
Complete Guide—Messenger Platform—Technical Implementation, downloaded May 27, 2016, https://developers.facebook.com/docs/messenger-platform/implementation, 18 pages. |
Getting Started—Messenger Platform, downloaded May 27, 2016, http://developers.facebook.com/docs/messenger-platform/quickstart, 6 pages. |
Google launches time-saving keyboard for iPhones, May 12, 2016, 2 pages. |
Business Insider—Snapchat now lets you add fun stickers to photos and videos, May 23, 2016, 4 pages. |
European Patent Application No. 17174969.0, Partial European Search Report dated Jul. 26, 2017, 13 pages. |
PCT International Search Report and Written Opinion for International Application No. PCT/US2017/034340, dated Jul. 19, 2017, 10 pages. |
D'Onfro, “Facebook Built a Basketball Game Directly into Messenger—Here's How to Play,” Business Insider, Mar. 17, 2016, retrieved from https://www.businessinsider.com/facebook-messenger-basketball-game-2016-3. |
Haslam, “Enable and Play Facebook Messenger's Hidden Basketball Game—Here's How,” Redmond Pie, Mar. 30, 2016. |
Publication Data:

Number | Date | Country
---|---|---
20200029181 A1 | Jan 2020 | US
Provisional Application Data:

Number | Date | Country
---|---|---
62349086 | Jun 2016 | US
62349113 | Jun 2016 | US
62349101 | Jun 2016 | US
62349091 | Jun 2016 | US
Parent and Child Application Data:

Relation | Number | Date | Country
---|---|---|---
Parent | 15400780 | Jan 2017 | US
Child | 16525377 | | US