Location-based virtual avatars

Information

  • Patent Grant
  • 11842411
  • Patent Number
    11,842,411
  • Date Filed
    Tuesday, March 26, 2019
  • Date Issued
    Tuesday, December 12, 2023
  • Field of Search
    • US
    • 705/319.000
    • CPC
    • G06Q50/01
    • G06F16/487
    • G06F3/0484
    • H04L51/20
    • H04L51/32
    • H04L51/10
    • H04L67/18
    • H04L51/08
    • H04L67/52
    • A63F13/216
    • A63F13/79
    • A63F13/795
    • A63F13/87
    • A63F2300/5553
    • G06N3/006
    • H04W4/02
    • H04W64/00
    • H04W4/80
    • G06T13/80
    • G06V20/47
  • International Classifications
    • G06Q50/00
    • G06F16/487
    • H04L51/52
    • H04L51/222
    • H04L51/10
  • Disclaimer
    This patent is subject to a terminal disclaimer.
Abstract
Among other things, embodiments of the present disclosure improve the functionality of electronic messaging and imaging software and systems by determining the current activities of users based on location sensor information from the users' computing devices and generating customized media content items based on their activities. The media content can be generated for a variety of topics and shared with other users. For example, media content (e.g., images or video) can be generated and displayed on a user's computing device, as well as transmitted to other users via electronic communications, such as short message service (SMS) or multimedia message service (MMS) texts and emails.
Description
BACKGROUND

The popularity of electronic messaging, particularly instant messaging, continues to grow. Users increasingly share media content items such as electronic images and videos with each other, reflecting a global demand to communicate more visually. Similarly, users increasingly seek to customize the media content items they share with others, providing challenges to social networking systems seeking to generate custom media content for their members. Embodiments of the present disclosure address these and other issues.





BRIEF DESCRIPTION OF THE DRAWINGS

In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. Like numerals having different letter suffixes may represent different instances of similar components. Some embodiments are illustrated by way of example, and not limitation, in the figures of the accompanying drawings in which:



FIG. 1 is a block diagram showing an example messaging system for exchanging data (e.g., messages and associated content) over a network.



FIG. 2 is a block diagram illustrating further details regarding a messaging system, according to exemplary embodiments.



FIG. 3 is a schematic diagram illustrating data which may be stored in the database of the messaging server system, according to various exemplary embodiments.



FIG. 4 is a flow diagram of an exemplary process according to various aspects of the disclosure.



FIGS. 5A-5D are screenshots illustrating aspects of the method described in FIG. 4.



FIG. 6 is a block diagram illustrating a representative software architecture, which may be used in conjunction with various hardware architectures herein described.



FIG. 7 is a block diagram illustrating components of a machine, according to some exemplary embodiments, able to read instructions from a machine-readable medium (e.g., a machine-readable storage medium) and perform any one or more of the methodologies discussed herein.





DETAILED DESCRIPTION

The description that follows includes systems, methods, techniques, instruction sequences, and computing machine program products that embody illustrative embodiments of the disclosure. In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide an understanding of various embodiments of the inventive subject matter. It will be evident, however, to those skilled in the art, that embodiments of the inventive subject matter may be practiced without these specific details. In general, well-known instruction instances, protocols, structures, and techniques are not necessarily shown in detail.


Among other things, embodiments of the present disclosure improve the functionality of electronic messaging and imaging software and systems by determining the current activities of users based on location sensor information from the users' computing devices and generating customized media content items based on their activities. The media content can be generated for a variety of topics and shared with other users. For example, media content (e.g., images or video) can be generated and displayed on a user's computing device, as well as transmitted to other users via electronic communications, such as short message service (SMS) or multimedia message service (MMS) texts and emails.



FIG. 1 is a block diagram showing an example of a messaging system 100 for exchanging data (e.g., messages and associated content) over a network. The messaging system 100 includes multiple client devices 102, each of which hosts a number of applications including a messaging client application 104. Each messaging client application 104 is communicatively coupled to other instances of the messaging client application 104 and a messaging server system 108 via a network 106 (e.g., the Internet). As used herein, the term “client device” may refer to any machine that interfaces to a communications network (such as network 106) to obtain resources from one or more server systems or other client devices. A client device may be, but is not limited to, a mobile phone, desktop computer, laptop, portable digital assistant (PDA), smartphone, tablet, ultrabook, netbook, multi-processor system, microprocessor-based or programmable consumer electronics device, game console, set-top box, or any other communication device that a user may use to access a network.


In the example shown in FIG. 1, each messaging client application 104 is able to communicate and exchange data with another messaging client application 104 and with the messaging server system 108 via the network 106. The data exchanged between messaging client applications 104, and between a messaging client application 104 and the messaging server system 108, includes functions (e.g., commands to invoke functions) as well as payload data (e.g., text, audio, video or other multimedia data).


The network 106 may include, or operate in conjunction with, an ad hoc network, an intranet, an extranet, a virtual private network (VPN), a local area network (LAN), a wireless LAN (WLAN), a wide area network (WAN), a wireless WAN (WWAN), a metropolitan area network (MAN), the Internet, a portion of the Internet, a portion of the Public Switched Telephone Network (PSTN), a plain old telephone service (POTS) network, a cellular telephone network, a wireless network, a Wi-Fi® network, another type of network, or a combination of two or more such networks. For example, a network or a portion of a network may include a wireless or cellular network and the coupling may be a Code Division Multiple Access (CDMA) connection, a Global System for Mobile communications (GSM) connection, or another type of cellular or wireless coupling. In this example, the coupling may implement any of a variety of types of data transfer technology, such as Single Carrier Radio Transmission Technology (1×RTT), Evolution-Data Optimized (EVDO) technology, General Packet Radio Service (GPRS) technology, Enhanced Data rates for GSM Evolution (EDGE) technology, Third Generation Partnership Project (3GPP) technology including 3G, fourth generation wireless (4G) networks, Universal Mobile Telecommunications System (UMTS), High Speed Packet Access (HSPA), Worldwide Interoperability for Microwave Access (WiMAX), the Long Term Evolution (LTE) standard, others defined by various standard-setting organizations, other long-range protocols, or other data transfer technology.


The messaging server system 108 provides server-side functionality via the network 106 to a particular messaging client application 104. While certain functions of the messaging system 100 are described herein as being performed by either a messaging client application 104 or by the messaging server system 108, it will be appreciated that the location of certain functionality either within the messaging client application 104 or the messaging server system 108 is a design choice. For example, it may be technically preferable to initially deploy certain technology and functionality within the messaging server system 108, but to later migrate this technology and functionality to the messaging client application 104 where a client device 102 has a sufficient processing capacity.


The messaging server system 108 supports various services and operations that are provided to the messaging client application 104. Such operations include transmitting data to, receiving data from, and processing data generated by the messaging client application 104. This data may include message content, client device information, geolocation information, media annotation and overlays, message content persistence conditions, social network information, and live event information, as examples. Data exchanges within the messaging system 100 are invoked and controlled through functions available via user interfaces (UIs) of the messaging client application 104.


Turning now specifically to the messaging server system 108, an Application Program Interface (API) server 110 is coupled to, and provides a programmatic interface to, an application server 112. The application server 112 is communicatively coupled to a database server 118, which facilitates access to a database 120 in which is stored data associated with messages processed by the application server 112.


Dealing specifically with the Application Program Interface (API) server 110, this server receives and transmits message data (e.g., commands and message payloads) between the client device 102 and the application server 112. Specifically, the Application Program Interface (API) server 110 provides a set of interfaces (e.g., routines and protocols) that can be called or queried by the messaging client application 104 in order to invoke functionality of the application server 112. The Application Program Interface (API) server 110 exposes various functions supported by the application server 112, including account registration; login functionality; the sending of messages, via the application server 112, from a particular messaging client application 104 to another messaging client application 104; the sending of electronic media files (e.g., electronic images or video) from a messaging client application 104 to the messaging server application 114, for possible access by another messaging client application 104; the setting of a collection of media data (e.g., a story); the retrieval of a list of friends of a user of a client device 102; the retrieval of such collections; the retrieval of messages and content; the adding and deletion of friends to and from a social graph; the location of friends within a social graph; and the opening of an application event (e.g., relating to the messaging client application 104).


The application server 112 hosts a number of applications and subsystems, including a messaging server application 114, an image processing system 116 and a social network system 122. The messaging server application 114 implements a number of message processing technologies and functions, particularly related to the aggregation and other processing of content (e.g., textual and multimedia content including images and video clips) included in messages received from multiple instances of the messaging client application 104. As will be described in further detail, the text and media content from multiple sources may be aggregated into collections of content (e.g., called stories or galleries). These collections are then made available, by the messaging server application 114, to the messaging client application 104. Other processor- and memory-intensive processing of data may also be performed server-side by the messaging server application 114, in view of the hardware requirements for such processing.


The application server 112 also includes an image processing system 116 that is dedicated to performing various image processing operations, typically with respect to electronic images or video received within the payload of a message at the messaging server application 114.


The social network system 122 supports various social networking functions and services, and makes these functions and services available to the messaging server application 114. To this end, the social network system 122 maintains and accesses an entity graph 304 within the database 120. Examples of functions and services supported by the social network system 122 include the identification of other users of the messaging system 100 with which a particular user has relationships or is “following”, and also the identification of other entities and interests of a particular user.


The application server 112 is communicatively coupled to a database server 118, which facilitates access to a database 120 in which is stored data associated with messages processed by the messaging server application 114.


Some embodiments may include one or more wearable devices, such as a pendant with an integrated camera that is integrated with, in communication with, or coupled to, a client device 102. Any desired wearable device may be used in conjunction with the embodiments of the present disclosure, such as a watch, eyeglasses, goggles, a headset, a wristband, earbuds, clothing (such as a hat or jacket with integrated electronics), a clip-on electronic device, or any other wearable devices.



FIG. 2 is a block diagram illustrating further details regarding the messaging system 100, according to exemplary embodiments. Specifically, the messaging system 100 is shown to comprise the messaging client application 104 and the application server 112, which in turn embody a number of subsystems, namely an ephemeral timer system 202, a collection management system 204 and an annotation system 206.


The ephemeral timer system 202 is responsible for enforcing the temporary access to content permitted by the messaging client application 104 and the messaging server application 114. To this end, the ephemeral timer system 202 incorporates a number of timers that, based on duration and display parameters associated with a message, or collection of messages (e.g., a story), selectively display and enable access to messages and associated content via the messaging client application 104.


The collection management system 204 is responsible for managing collections of media (e.g., collections of text, image, video and audio data). In some examples, a collection of content (e.g., messages, including images, video, text, and audio) may be organized into an “event gallery” or an “event story.” Such a collection may be made available for a specified time period, such as the duration of an event to which the content relates. For example, content relating to a music concert may be made available as a “story” for the duration of that music concert. The collection management system 204 may also be responsible for publishing an icon that provides notification of the existence of a particular collection to the user interface of the messaging client application 104.


The collection management system 204 furthermore includes a curation interface 208 that allows a collection manager to manage and curate a particular collection of content. For example, the curation interface 208 enables an event organizer to curate a collection of content relating to a specific event (e.g., delete inappropriate content or redundant messages). Additionally, the collection management system 204 employs machine vision (or image recognition technology) and content rules to automatically curate a content collection. In certain embodiments, compensation may be paid to a user for inclusion of user generated content into a collection. In such cases, the curation interface 208 operates to automatically make payments to such users for the use of their content.


The annotation system 206 provides various functions that enable a user to annotate or otherwise modify or edit media content associated with a message. For example, the annotation system 206 provides functions related to the generation and publishing of media overlays for messages processed by the messaging system 100. The annotation system 206 operatively supplies a media overlay (e.g., a filter) to the messaging client application 104 based on a geolocation of the client device 102. In another example, the annotation system 206 operatively supplies a media overlay to the messaging client application 104 based on other information, such as social network information of the user of the client device 102. A media overlay may include audio and visual content and visual effects. Examples of audio and visual content include pictures, texts, logos, animations, and sound effects. An example of a visual effect includes color overlaying. The audio and visual content or the visual effects can be applied to a media content item (e.g., an image or video) at the client device 102. For example, the media overlay may include text that can be overlaid on top of a photograph/electronic image generated by the client device 102. In another example, the media overlay includes an identification of a location overlay (e.g., Venice Beach), a name of a live event, or a name of a merchant overlay (e.g., Beach Coffee House). In another example, the annotation system 206 uses the geolocation of the client device 102 to identify a media overlay that includes the name of a merchant at the geolocation of the client device 102. The media overlay may include other indicia associated with the merchant. The media overlays may be stored in the database 120 and accessed through the database server 118.


In some exemplary embodiments, as discussed in more detail below, embodiments of the present disclosure may generate, display, distribute, and apply media overlays to media content items. For example, embodiments may utilize media content items generated by a client device 102 (e.g., an image or video captured using a digital camera coupled to the client device 102) to generate media overlays that can be applied to other media content items.



FIG. 3 is a schematic diagram illustrating data 300 that is stored in the database 120 of the messaging server system 108, according to certain exemplary embodiments. While the content of the database 120 is shown to comprise a number of tables, the data could be stored in other types of data structures (e.g., as an object-oriented database).


The database 120 includes message data stored within a message table 314. The entity table 302 stores entity data, including an entity graph 304. Entities for which records are maintained within the entity table 302 may include individuals, corporate entities, organizations, objects, places, events, etc. Regardless of type, any entity regarding which the messaging server system 108 stores data may be a recognized entity. Each entity is provided with a unique identifier, as well as an entity type identifier (not shown).


The entity graph 304 furthermore stores information regarding relationships and associations between entities. Such relationships may be social, professional (e.g., working at a common corporation or organization), interest-based, or activity-based, merely for example.


The database 120 also stores annotation data, in the example form of filters, in an annotation table 312. Filters for which data is stored within the annotation table 312 are associated with and applied to videos (for which data is stored in a video table 310) or images (for which data is stored in an image table 308). Filters, in one example, are overlays that are displayed as overlaid on an image or video during presentation to a recipient user. Filters may be of various types, including user-selected filters from a gallery of filters presented to a sending user by the messaging client application 104 when the sending user is composing a message.


Other types of filters include geolocation filters (also known as Geofilters), which may be presented to a sending user based on geographic location. For example, geolocation filters specific to a neighborhood or special location may be presented within a user interface by the messaging client application 104, based on geolocation information determined by a GPS unit of the client device 102. Another type of filter is a data filter, which may be selectively presented to a sending user by the messaging client application 104, based on other inputs or information gathered by the client device 102 during the message creation process. Examples of data filters include the current temperature at a specific location, the current speed at which a sending user is traveling, the battery life of a client device 102, or the current time. Other annotation data that may be stored within the image table 308 is so-called “Lens” data. A “Lens” may be a real-time special effect and sound that may be added to an image or a video.


As mentioned above, the video table 310 stores video data which, in one embodiment, is associated with messages for which records are maintained within the message table 314. Similarly, the image table 308 stores image data associated with messages for which message data is stored in the entity table 302. The entity table 302 may associate various annotations from the annotation table 312 with various images and videos stored in the image table 308 and the video table 310.


A story table 306 stores data regarding collections of messages and associated image, video or audio data, which are compiled into a collection (e.g., a story or a gallery). The creation of a particular collection may be initiated by a particular user (e.g., each user for which a record is maintained in the entity table 302). A user may create a “personal story” in the form of a collection of content that has been created and sent/broadcast by that user. To this end, the user interface of the messaging client application 104 may include an icon that is user selectable to enable a sending user to add specific content to his or her personal story.


A collection may also constitute a “live story,” which is a collection of content from multiple users that is created manually, automatically, or using a combination of manual and automatic techniques. For example, a “live story” may constitute a curated stream of user-submitted content from various locations and events. Users whose client devices have location services enabled and who are at a common location event at a particular time may, for example, be presented with an option, via a user interface of the messaging client application 104, to contribute content to a particular live story. The live story may be identified to the user by the messaging client application 104, based on his or her location. The end result is a “live story” told from a community perspective.


A further type of content collection is known as a “location story,” which enables a user whose client device 102 is located within a specific geographic location (e.g., on a college or university campus) to contribute to a particular collection. In some embodiments, a contribution to a location story may require a second degree of authentication to verify that the end user belongs to a specific organization or other entity (e.g., is a student on the university campus).


Embodiments of the present disclosure may generate and present customized images for use within electronic messages/communications such as short message service (SMS) or multimedia message service (MMS) texts and emails. The customized images may also be utilized in conjunction with the stories, filters, and ephemeral messaging functionality discussed herein.



FIG. 4 depicts an exemplary process according to various aspects of the present disclosure. In this example, method 400 includes receiving authorization from a user (405) to use location information from the user's computing device, receiving location information from the user's computing device (410), determining a current activity for the user based on the location information (415), retrieving avatar information for the user (420), generating a media content item (425) based on the location information and the retrieved avatar information, and displaying the media content item (430). The steps of method 400 may be performed in whole or in part, may be performed in conjunction with each other as well as with some or all of the steps in other methods, and may be performed by any number of different systems, such as the systems described in FIGS. 1 and 7.
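
By way of illustration only, the flow of method 400 can be expressed as a short pipeline. The Python sketch below is a minimal stand-in; every helper, field name, and threshold is a hypothetical placeholder for the corresponding step of FIG. 4, not an implementation taken from the disclosure:

```python
def method_400(user, device):
    """Toy end-to-end pipeline for FIG. 4 (step numbers in comments)."""
    if not user.get("authorized"):                     # 405: receive authorization
        return None
    lat, lon = device["lat"], device["lon"]            # 410: receive location info
    speed = device.get("speed_mph", 0.0)
    activity = "walking" if speed < 4 else "moving"    # 415: crude activity guess
    avatar = user.get("avatar", {"style": "default"})  # 420: retrieve avatar info
    item = {"map_center": (lat, lon),                  # 425: generate content item
            "avatar": avatar,
            "activity": activity}
    print("display on device:", item)                  # 430: display content item
    return item

# Example: an authorized user walking at roughly 3 mph.
method_400({"authorized": True, "avatar": {"style": "casual"}},
           {"lat": 33.99, "lon": -118.47, "speed_mph": 3.0})
```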


Embodiments of the present disclosure may be used to create customized media content items (such as images) displaying maps and other backgrounds. The customized media content items may include avatars of users engaged in (or associated with) various activities, such as walking, eating, playing a sport, sleeping, etc. In method 400, the system receives authorization (405) from a user to utilize location information from the user's computing device and/or to display the user's avatar or location in media content items prior to performing the remaining steps of method 400. Such authorization may be obtained via acceptance of a terms-of-service agreement for utilizing an online social network or other service provided by the system, by acceptance on a case-by-case basis by the user (e.g., via popups displayed on the user's computing device), or using any other suitable method for obtaining authorization from the user(s).


The system (e.g., messaging server system 108 in FIG. 1) may receive (410) an electronic communication transmitted from a client computing device of a user (e.g., client device 102 in FIG. 1) over a network such as the Internet (e.g., network 106 in FIG. 1) containing location information from a location sensor (e.g., position components 738 in system 700 of FIG. 7, discussed below) coupled to the user's computing device. In some embodiments, the location sensor may include a Global Positioning System (GPS) component integrated in the user's computing device, as well as other types of location sensors. The system may receive (410) location information on a periodic basis and may request information from the user's computing device and/or receive such information from the user's device without such a request. In one exemplary embodiment, for instance, the user's client computing device contains software that monitors the location sensor information from the user's device and transmits updates to the system in response to the location changing. In some cases, the user's device may update the system with a new location only after the location changes by at least a predetermined distance, to allow a user to move about a building or other location without triggering updates.
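
By way of illustration, the "update only after moving a predetermined distance" behavior described above amounts to comparing each new location fix against the last fix that was reported. The following Python sketch assumes a 50-meter threshold and a send_update callback, both of which are illustrative choices rather than values taken from the disclosure:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) fixes."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

class LocationReporter:
    """Reports a fix only after the device moves at least THRESHOLD_M meters."""
    THRESHOLD_M = 50  # illustrative "predetermined distance"

    def __init__(self, send_update):
        self.send_update = send_update  # callback that transmits to the server
        self.last_fix = None

    def on_new_fix(self, lat, lon):
        if (self.last_fix is None or
                haversine_m(*self.last_fix, lat, lon) >= self.THRESHOLD_M):
            self.send_update(lat, lon)
            self.last_fix = (lat, lon)

# Example: only fixes far enough from the last reported one are transmitted.
reporter = LocationReporter(lambda lat, lon: print("update:", lat, lon))
reporter.on_new_fix(40.7480, -73.9855)  # first fix is always reported
reporter.on_new_fix(40.7481, -73.9855)  # ~11 m away: suppressed
reporter.on_new_fix(40.7490, -73.9855)  # ~111 m away: reported
```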


The system analyzes the received location information and determines a current activity (415) of the user. The system may use any number of different location measurements to determine a user's activity. In some embodiments, for example, the system may determine a speed of the user's client computing device (e.g., in real-time or near-real-time) based on first location information from the location sensor on the user's device at a first time, and second location information from the location sensor at a second (subsequent) time. The speed and location information can be analyzed together to help determine the user's activity.
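
By way of illustration, the speed estimate described above can be computed directly from two timestamped fixes. The sketch below reuses haversine_m from the previous sketch and assumes Unix-second timestamps:

```python
def speed_mph(fix1, fix2):
    """Estimate speed from two timestamped fixes, each (lat, lon, unix_seconds)."""
    lat1, lon1, t1 = fix1
    lat2, lon2, t2 = fix2
    if t2 <= t1:
        raise ValueError("second fix must be later than the first")
    meters_per_second = haversine_m(lat1, lon1, lat2, lon2) / (t2 - t1)
    return meters_per_second * 2.23694  # 1 m/s is about 2.23694 mph

# Example: ~133 m covered in 100 s is roughly 3 mph, consistent with walking.
print(speed_mph((40.7480, -73.9855, 0), (40.7492, -73.9855, 100)))
```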


The system further retrieves avatar information for the user (420) and generates (425), based on the avatar information for the user and the current activity, a media content item containing an avatar of the user engaged in the current activity. As used herein, an “avatar” of a user is any visual representation of the user. The avatar of a user may be based on information (e.g., characteristics) derived from images of the user in conjunction with avatar characteristics identified from the user's relationships with other users. Alternatively or additionally, the user may select and customize characteristics of the user's avatar via the user's computing device. Such avatar characteristics may include, for example, the user's bodily features (e.g., muscular, thin, etc.), facial features, clothing and accessories, text displayed in conjunction with the avatar, and images displayed in conjunction with the avatar. The avatar information may be retrieved (420) from a variety of sources, such as the local memory of a device performing the steps of method 400 (e.g., messaging server system 108 in FIG. 1), as well as from other systems and devices.
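
By way of illustration, the avatar characteristics enumerated above map naturally onto a simple record type. The field names below are invented for the sketch; the disclosure does not define a schema:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class AvatarInfo:
    """Illustrative container for the avatar characteristics listed above."""
    user_id: str
    body_type: str = "average"                      # e.g., "muscular", "thin"
    facial_features: Dict[str, str] = field(default_factory=dict)
    clothing: List[str] = field(default_factory=list)
    accessories: List[str] = field(default_factory=list)
    caption_text: str = ""                          # text shown with the avatar
    image_refs: List[str] = field(default_factory=list)

def retrieve_avatar_info(user_id: str, store: Dict[str, AvatarInfo]) -> AvatarInfo:
    """Step 420: look up avatar info in a local store, falling back to defaults."""
    return store.get(user_id, AvatarInfo(user_id=user_id))

# Example: a user who has customized clothing and accessories.
store = {"user-123": AvatarInfo("user-123", body_type="thin",
                                clothing=["jacket"], accessories=["headphones"])}
print(retrieve_avatar_info("user-123", store))
```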


For example, if the system determines the user (carrying his/her computing device) is moving along a sidewalk at a rate of three miles per hour, the system may determine (based on the user's speed and the fact that a sidewalk cannot accommodate vehicles) that the user is walking, and generate (425) a corresponding avatar showing the user walking. If, on the other hand, the user is moving at six miles per hour along the sidewalk, the system may determine the user is running and generate an avatar of the user running. The system may likewise identify other activities for the user, such as biking, driving, flying, traveling on a train, and traveling on a boat.
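
By way of illustration, the walking/running distinction above is a threshold test on estimated speed. In the toy classifier below, only the walking/running split reflects the 3 mph and 6 mph examples in the text; the remaining cut-offs are invented for the sketch:

```python
def classify_activity(speed_mph, on_sidewalk=False):
    """Map an estimated speed (mph) to an activity label (step 415)."""
    if on_sidewalk:                      # a sidewalk excludes vehicles
        return "walking" if speed_mph <= 4 else "running"
    if speed_mph <= 20:
        return "biking"                  # illustrative threshold
    if speed_mph <= 90:
        return "driving"                 # illustrative threshold
    return "flying"

print(classify_activity(3, on_sidewalk=True))  # walking
print(classify_activity(6, on_sidewalk=True))  # running
```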


As shown in the exemplary screenshots depicted in FIGS. 5A-5D, the system may present the avatars of different users in conjunction with a media content item such as an image of a map. The media content item may include a still image, animated image, video, or other content. In some embodiments, the system updates the image of the map as the location of a user's computing device changes. For example, FIG. 5A depicts a media content item (an image in this example) with the avatar of a user walking 505 along a street. The system may present the avatar 505 of the user at a first position in the image at a first time, and then modify the image to remove the user's avatar from the first position and present the avatar 505 at a second position at a second (subsequent) time. In this manner, the system visually tracks the location of the walking user's avatar 505, while the positions of the cluster of avatars 515 and solo avatar 520 may remain static. The user may share the media content item (e.g., via social media, text, or other electronic communication) with the respective users corresponding to avatars 515 and 520, as well as with others.


Additionally or alternatively, the system may utilize information from other types of sensors and sources to help determine the activity of the user. For example, the system may utilize information from an altimeter to determine that the user is flying, or data from an accelerometer (e.g., showing repeated sudden jolts to the user's movement) to determine a user is mountain biking. Such sensors need not be integrated into a user's computing device, and may simply be in communication with the user's device (e.g., via a wireless connection).
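
By way of illustration, one simple way to fold such auxiliary sensors into the determination is to let them override a speed-only guess. The altitude and jolt thresholds below are invented for the sketch:

```python
def refine_activity(base_activity, altitude_m=None, accel_g=None):
    """Override a speed-based activity guess using other sensor data."""
    if altitude_m is not None and altitude_m > 3000:
        return "flying"                   # altimeter: well above ground level
    if accel_g and (max(accel_g) - min(accel_g)) > 2.0:
        return "mountain biking"          # repeated sudden jolts in the samples
    return base_activity

# Example: accelerometer jolts turn a generic "biking" guess into mountain biking.
print(refine_activity("biking", accel_g=[0.9, 3.2, 0.7, 3.5]))
```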


In some exemplary embodiments, the system may identify a computing device in communication with the user's computing device to help generate the media content item. For example, the system may collect data from a sensor in communication with the user's computing device and integrated with a vehicle or other device operated or used by the user. In one example, the system may determine the make and model of an automobile that the user is riding in based on information gathered about the vehicle via a wireless connection between the user's device and the automobile. As shown in FIG. 5B, for example, the system may generate a media content item containing an avatar of the user 515 sitting in a representation of an automobile of the same make and model.


Similarly, the system may identify one or more wearable devices or other systems with which the user's device is in communication. In FIG. 5C, for example, the system detects that the user's device is in communication with a set of wireless headphones, and generates (425) a media content item depicting a map with the location of the user (i.e., Bob's Bar at reference 520) along with the avatar of the user 525 wearing the headphones.


The system may also analyze information from an online social network (e.g., where the user has an account or is mentioned) to help determine the user's activity. For example, the system may connect to the online social network and analyze posts by the user and/or the user's connections to determine an upcoming or current activity. In other cases, the system may analyze electronic communications transmitted from, or received by, the user's computing device. In a particular example, the user may post on a Friday (e.g., in a text message and/or to the user's online social network feed) “sure looking forward to skydiving this weekend.” The system may identify the key word “skydiving” and the temporal aspect “this weekend” from the user's post, and use this information along with data from an accelerometer and/or altimeter the following day to determine when the user is in the act of skydiving. The system may then generate a media content item displaying an avatar of the user skydiving at the same time the user is actually skydiving. The system may share the media content item to the user's contacts (e.g., via the online social network, text message, or other electronic communication) automatically and without input from the user. In this manner, the system can automatically share the user's activities with the user's friends and other contacts, even when it would be difficult or impossible for the user to do so himself/herself (e.g., when the user is busy falling from a plane).
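
By way of illustration, the keyword and temporal cues in the skydiving example can be picked out with simple pattern matching. The lexicon and patterns below are toy stand-ins for whatever text analysis the system actually performs:

```python
import re

ACTIVITY_KEYWORDS = {"skydiving", "skiing", "surfing"}          # illustrative lexicon
TEMPORAL_PATTERNS = [r"\bthis weekend\b", r"\btomorrow\b", r"\btonight\b"]

def extract_planned_activity(post_text):
    """Return (activity, temporal hint) if a post announces an upcoming activity."""
    text = post_text.lower()
    activity = next((w for w in ACTIVITY_KEYWORDS if w in text), None)
    temporal = next((m.group(0) for p in TEMPORAL_PATTERNS
                     for m in [re.search(p, text)] if m), None)
    return (activity, temporal) if activity and temporal else None

# Example: the post from the text yields both an activity and a time frame.
print(extract_planned_activity("sure looking forward to skydiving this weekend"))
```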


The system may analyze periods of inactivity (or relative inactivity) by the user and adjust the generation of the media content item accordingly. For example, the system may determine, based on the location information from a user's computing device, that the user's device (carried by the user) has not moved beyond a predetermined distance from a location for a predetermined period of time. In response to such inactivity, the system may modify a media content item to remove the avatar of the user, gray out the user's avatar, make the user's avatar translucent, display an avatar of the user sleeping, or provide another visual indicator that the user is inactive.


In some embodiments, the inactivity of a user may be analyzed together with other information to determine that the user is sleeping. For example, the system may determine that the user is sleeping based on a lack of movement by the user's computing device for a predetermined period of time, the time of day at the user's current location, and/or the user's current location being the user's residence. The system may also infer inactivity/sleeping by the user based on a lack of interaction with the user's computing device by the user for a predetermined period of time.
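
By way of illustration, those sleep signals combine into a small conjunction of tests. The one-hour threshold and nighttime window below are assumptions made for the sketch:

```python
from datetime import datetime, timedelta

def infer_sleeping(last_move, last_interaction, now, at_home,
                   idle_threshold=timedelta(hours=1)):
    """Combine inactivity, device interaction, local time, and location."""
    idle_move = now - last_move >= idle_threshold          # device has not moved
    idle_touch = now - last_interaction >= idle_threshold  # no user interaction
    nighttime = now.hour >= 22 or now.hour < 7             # local time at the user
    return idle_move and idle_touch and nighttime and at_home

# Example: two hours of stillness at home, late at night.
now = datetime(2023, 6, 10, 23, 30)
print(infer_sleeping(now - timedelta(hours=2), now - timedelta(hours=2),
                     now, at_home=True))  # True
```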


The system may determine the current activity of a user based on identifying a venue associated with the user's current location and one or more activities associated with the venue. For example, referring now to FIG. 5C, the system determines, based on location information from the user's mobile computing device that the user is carrying, that the user is at a bar (Bob's Bar 520). The system identifies drinking spirits as an activity associated with the bar venue, and generates (425) a media content item showing the user's avatar 525 holding a beer. As noted above, the system may use other information (such as from the user's social media posts and/or electronic communications) to determine the user is drinking a beer. For example, the user depicted in FIG. 5C might post to his social media feed that he's “enjoying a beer at Bob's Bar.”
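
By way of illustration, the venue-to-activity step can be as simple as a lookup table keyed on the venue's category, with an avatar prop attached to each activity. The entries below echo the examples in FIGS. 5C and 5D but are otherwise invented:

```python
# Illustrative venue-category -> (activity, avatar prop) lookup (steps 415/425).
VENUE_ACTIVITIES = {
    "bar":          ("drinking", "beer glass"),
    "wine bar":     ("drinking", "wine glass"),
    "restaurant":   ("eating", "knife and fork"),
    "theater":      ("watching a show", "ticket"),
    "sports field": ("playing a sport", "ball"),
}

def activity_for_venue(category):
    """Return the activity and avatar prop associated with a venue category."""
    return VENUE_ACTIVITIES.get(category, ("visiting", None))

print(activity_for_venue("bar"))         # ('drinking', 'beer glass')
print(activity_for_venue("restaurant"))  # ('eating', 'knife and fork')
```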


The system may identify other venues, such as restaurants, theaters, sporting events, sports fields, and transportation hubs to help identify the user's activity. In FIG. 5D, for example, a media content item comprising a map depicts an avatar of a first user 535 at “Joe's BBQ” restaurant (with an image of the front of the restaurant shown in bubble 540). In this case, the system identifies Joe's BBQ as a restaurant and depicts the user's avatar holding a knife and fork. A second user's avatar 545 is depicted nearby at a wine bar (“Flo's Wine Bar” 550). The system identifies this location as a wine bar and customizes the user's avatar to show her holding an oversized glass of wine. In this manner, the system can generate customized avatars of users holding items, wearing apparel and accessories, sitting in vehicles, and the like that illustrate the activities they are engaged in. Media content items containing such avatar images can be shared with the user's friends and other contacts for a deeper, more interactive experience than provided by conventional messaging and social media systems.


The system may cause the user's computing device to display (430) the media content item (e.g., on the device's display screen). A variety of media content items may be generated (425) and displayed (430) in conjunction with embodiments of the present disclosure. In this context, a “media content item” may include any type of electronic media in any format. For example, a media content item may include an image in JPG format, an image in PNG format, a video in FLV format, a video in AVI format, etc. In some exemplary embodiments, a media content item may include content that is captured using an image capture device or component (such as a digital camera) coupled to, or in communication with, a system performing the functionality of method 400. For example, the exemplary system 700 depicted in FIG. 7 may include a digital camera among its input components 728. Additionally or alternatively, the media content item may be received from another system or device, such as the messaging server system 108 in FIG. 1. Media content items may also include audio and combinations of different media formats (e.g., still images and video).


In some embodiments, the media content item generated (425) by the system may be included in a media overlay such as a “sticker” (i.e., an image that can be overlaid onto other images), filter (discussed above), or another media overlay. Such overlays may include static (i.e., non-moving) features as well as dynamic (i.e., moving) features.


Generation of the media content item (425) may include the generation of one or more data structure fields containing information regarding the content item. For example, the system may generate a name field in a data structure for the media overlay that includes a name for the media content item received from the content provider.
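
By way of illustration, generating such fields might look like building a small record around the content item. The field names below are invented; the text above only indicates that a name field is populated from the content provider:

```python
def build_overlay_record(content_item, name_from_provider):
    """Sketch of generating data structure fields for a media overlay (step 425)."""
    return {
        "name": name_from_provider,   # the name field described in the text
        "type": "sticker",            # illustrative: a sticker or filter overlay
        "payload": content_item,      # the media content item itself
    }

print(build_overlay_record(b"<image bytes>", "Beach Day"))
```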


Embodiments of the present disclosure may transmit and receive electronic communications containing media content items, media overlays, or other content using any form of electronic communication, such as SMS texts, MMS texts, emails, and other communications. Media content items included in such communications may be provided as attachments, displayed inline in the message, within media overlays, or conveyed in any other suitable manner.


Software Architecture



FIG. 6 is a block diagram illustrating an exemplary software architecture 606, which may be used in conjunction with various hardware architectures herein described. FIG. 6 is a non-limiting example of a software architecture, and it will be appreciated that many other architectures may be implemented to facilitate the functionality described herein. The software architecture 606 may execute on hardware such as machine 700 of FIG. 7 that includes, among other things, processors 704, memory 714, and I/O components 718. A representative hardware layer 652 is illustrated and can represent, for example, the machine 700 of FIG. 7. The representative hardware layer 652 includes a processing unit 654 having associated executable instructions 604. Executable instructions 604 represent the executable instructions of the software architecture 606, including implementation of the methods, components, and so forth described herein. The hardware layer 652 also includes memory/storage modules 656, which also have executable instructions 604. The hardware layer 652 may also comprise other hardware 658.


As used herein, the term “component” may refer to a device, physical entity, or logic having boundaries defined by function or subroutine calls, branch points, application program interfaces (APIs), or other technologies that provide for the partitioning or modularization of particular processing or control functions. Components may be combined via their interfaces with other components to carry out a machine process. A component may be a packaged functional hardware unit designed for use with other components, or a part of a program that usually performs a particular function of related functions.


Components may constitute either software components (e.g., code embodied on a machine-readable medium) or hardware components. A “hardware component” is a tangible unit capable of performing certain operations and may be configured or arranged in a certain physical manner. In various exemplary embodiments, one or more computer systems (e.g., a standalone computer system, a client computer system, or a server computer system) or one or more hardware components of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware component that operates to perform certain operations as described herein. A hardware component may also be implemented mechanically, electronically, or any suitable combination thereof. For example, a hardware component may include dedicated circuitry or logic that is permanently configured to perform certain operations.


A hardware component may be a special-purpose processor, such as a Field-Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC). A hardware component may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations. For example, a hardware component may include software executed by a general-purpose processor or other programmable processor. Once configured by such software, hardware components become specific machines (or specific components of a machine) uniquely tailored to perform the configured functions and are no longer general-purpose processors. It will be appreciated that the decision to implement a hardware component mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.


A processor may be, or include, any circuit or virtual circuit (a physical circuit emulated by logic executing on an actual processor) that manipulates data values according to control signals (e.g., “commands”, “op codes”, “machine code”, etc.) and which produces corresponding output signals that are applied to operate a machine. A processor may, for example, be a Central Processing Unit (CPU), a Reduced Instruction Set Computing (RISC) processor, a Complex Instruction Set Computing (CISC) processor, a Graphics Processing Unit (GPU), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Radio-Frequency Integrated Circuit (RFIC), or any combination thereof. A processor may further be a multi-core processor having two or more independent processors (sometimes referred to as “cores”) that may execute instructions contemporaneously.


Accordingly, the phrase “hardware component” (or “hardware-implemented component”) should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. Considering embodiments in which hardware components are temporarily configured (e.g., programmed), each of the hardware components need not be configured or instantiated at any one instance in time. For example, where a hardware component comprises a general-purpose processor configured by software to become a special-purpose processor, the general-purpose processor may be configured as respectively different special-purpose processors (e.g., comprising different hardware components) at different times. Software accordingly configures a particular processor or processors, for example, to constitute a particular hardware component at one instance of time and to constitute a different hardware component at a different instance of time. Hardware components can provide information to, and receive information from, other hardware components. Accordingly, the described hardware components may be regarded as being communicatively coupled. Where multiple hardware components exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) between or among two or more of the hardware components. In embodiments in which multiple hardware components are configured or instantiated at different times, communications between such hardware components may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware components have access.


For example, one hardware component may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware component may then, at a later time, access the memory device to retrieve and process the stored output. Hardware components may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information). The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented components that operate to perform one or more operations or functions described herein. As used herein, “processor-implemented component” refers to a hardware component implemented using one or more processors. Similarly, the methods described herein may be at least partially processor-implemented, with a particular processor or processors being an example of hardware. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented components.


Moreover, the one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., an Application Program Interface (API)). The performance of certain of the operations may be distributed among the processors, not only residing within a single machine, but deployed across a number of machines. In some exemplary embodiments, the processors or processor-implemented components may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other exemplary embodiments, the processors or processor-implemented components may be distributed across a number of geographic locations.


In the exemplary architecture of FIG. 6, the software architecture 606 may be conceptualized as a stack of layers where each layer provides particular functionality. For example, the software architecture 606 may include layers such as an operating system 602, libraries 620, applications 616, and a presentation layer 614. Operationally, the applications 616 or other components within the layers may invoke application programming interface (API) calls 608 through the software stack and receive messages 612 in response to the API calls 608. The layers illustrated are representative in nature and not all software architectures have all layers. For example, some mobile or special purpose operating systems may not provide a frameworks/middleware 618, while others may provide such a layer. Other software architectures may include additional or different layers.


The operating system 602 may manage hardware resources and provide common services. The operating system 602 may include, for example, a kernel 622, services 624 and drivers 626. The kernel 622 may act as an abstraction layer between the hardware and the other software layers. For example, the kernel 622 may be responsible for memory management, processor management (e.g., scheduling), component management, networking, security settings, and so on. The services 624 may provide other common services for the other software layers. The drivers 626 are responsible for controlling or interfacing with the underlying hardware. For instance, the drivers 626 include display drivers, camera drivers, Bluetooth® drivers, flash memory drivers, serial communication drivers (e.g., Universal Serial Bus (USB) drivers), Wi-Fi® drivers, audio drivers, power management drivers, and so forth depending on the hardware configuration.


The libraries 620 provide a common infrastructure that is used by the applications 616 or other components or layers. The libraries 620 provide functionality that allows other software components to perform tasks in an easier fashion than interfacing directly with the underlying operating system 602 functionality (e.g., kernel 622, services 624 or drivers 626). The libraries 620 may include system libraries 644 (e.g., C standard library) that may provide functions such as memory allocation functions, string manipulation functions, mathematical functions, and the like. In addition, the libraries 620 may include API libraries 646 such as media libraries (e.g., libraries to support presentation and manipulation of various media formats such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG), graphics libraries (e.g., an OpenGL framework that may be used to render 2D and 3D graphic content on a display), database libraries (e.g., SQLite that may provide various relational database functions), web libraries (e.g., WebKit that may provide web browsing functionality), and the like. The libraries 620 may also include a wide variety of other libraries 648 to provide many other APIs to the applications 616 and other software components/modules.


The frameworks/middleware 618 (also sometimes referred to as middleware) provide a higher-level common infrastructure that may be used by the applications 616 or other software components/modules. For example, the frameworks/middleware 618 may provide various graphic user interface (GUI) functions, high-level resource management, high-level location services, and so forth. The frameworks/middleware 618 may provide a broad spectrum of other APIs that may be utilized by the applications 616 or other software components/modules, some of which may be specific to a particular operating system 602 or platform.


The applications 616 include built-in applications 638 or third-party applications 640. Examples of representative built-in applications 638 may include, but are not limited to, a contacts application, a browser application, a book reader application, a location application, a media application, a messaging application, or a game application. Third-party applications 640 may include an application developed using the ANDROID™ or IOS™ software development kit (SDK) by an entity other than the vendor of the particular platform, and may be mobile software running on a mobile operating system such as IOS™, ANDROID™, WINDOWS® Phone, or other mobile operating systems. The third-party applications 640 may invoke the API calls 608 provided by the mobile operating system (such as operating system 602) to facilitate functionality described herein.


The applications 616 may use built-in operating system functions (e.g., kernel 622, services 624 or drivers 626), libraries 620, and frameworks/middleware 618 to create user interfaces to interact with users of the system. Alternatively, or additionally, in some systems interactions with a user may occur through a presentation layer, such as presentation layer 614. In these systems, the application/component “logic” can be separated from the aspects of the application/component that interact with a user.



FIG. 7 is a block diagram illustrating components (also referred to herein as “modules”) of a machine 700, according to some exemplary embodiments, able to read instructions from a machine-readable medium (e.g., a machine-readable storage medium) and perform any one or more of the methodologies discussed herein. Specifically, FIG. 7 shows a diagrammatic representation of the machine 700 in the example form of a computer system, within which instructions 710 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing the machine 700 to perform any one or more of the methodologies discussed herein may be executed. As such, the instructions 710 may be used to implement modules or components described herein. The instructions 710 transform the general, non-programmed machine 700 into a particular machine 700 programmed to carry out the described and illustrated functions in the manner described. In alternative embodiments, the machine 700 operates as a standalone device or may be coupled (e.g., networked) to other machines. In a networked deployment, the machine 700 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine 700 may comprise, but not be limited to, a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a personal digital assistant (PDA), an entertainment media system, a cellular telephone, a smart phone, a mobile device, a wearable device (e.g., a smart watch), a smart home device (e.g., a smart appliance), other smart devices, a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing the instructions 710, sequentially or otherwise, that specify actions to be taken by machine 700. Further, while only a single machine 700 is illustrated, the term “machine” shall also be taken to include a collection of machines that individually or jointly execute the instructions 710 to perform any one or more of the methodologies discussed herein.


The machine 700 may include processors 704, memory/storage 706, and I/O components 718, which may be configured to communicate with each other such as via a bus 702. The memory/storage 706 may include a memory 714, such as a main memory, or other memory storage, and a storage unit 716, both accessible to the processors 704 such as via the bus 702. The storage unit 716 and memory 714 store the instructions 710 embodying any one or more of the methodologies or functions described herein. The instructions 710 may also reside, completely or partially, within the memory 714, within the storage unit 716, within at least one of the processors 704 (e.g., within the processor's cache memory), or any suitable combination thereof, during execution thereof by the machine 700. Accordingly, the memory 714, the storage unit 716, and the memory of processors 704 are examples of machine-readable media.


As used herein, the term “machine-readable medium,” “computer-readable medium,” or the like may refer to any component, device, or other tangible media able to store instructions and data temporarily or permanently. Examples of such media may include, but are not limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, optical media, magnetic media, cache memory, other types of storage (e.g., Electrically Erasable Programmable Read-Only Memory (EEPROM)), or any suitable combination thereof. The term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store instructions. The term “machine-readable medium” may also be taken to include any medium, or combination of multiple media, that is capable of storing instructions (e.g., code) for execution by a machine, such that the instructions, when executed by one or more processors of the machine, cause the machine to perform any one or more of the methodologies described herein. Accordingly, a “machine-readable medium” may refer to a single storage apparatus or device, as well as “cloud-based” storage systems or storage networks that include multiple storage apparatus or devices. The term “machine-readable medium” excludes signals per se.


The I/O components 718 may include a wide variety of components to provide a user interface for receiving input, providing output, transmitting information, exchanging information, capturing measurements, and so on. The specific I/O components 718 that are included in the user interface of a particular machine 700 will depend on the type of machine. For example, portable machines such as mobile phones will likely include a touch input device or other such input mechanisms, while a headless server machine will likely not include such a touch input device. It will be appreciated that the I/O components 718 may include many other components that are not shown in FIG. 7. The I/O components 718 are grouped according to functionality merely to simplify the following discussion, and the grouping is in no way limiting. In various exemplary embodiments, the I/O components 718 may include output components 726 and input components 728. The output components 726 may include visual components (e.g., a display such as a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)), acoustic components (e.g., speakers), haptic components (e.g., a vibratory motor, resistance mechanisms), other signal generators, and so forth. The input components 728 may include alphanumeric input components (e.g., a keyboard, a touch screen configured to receive alphanumeric input, a photo-optical keyboard, or other alphanumeric input components), point-based input components (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instrument), tactile input components (e.g., a physical button, a touch screen that provides the location or force of touches or touch gestures, or other tactile input components), audio input components (e.g., a microphone), and the like. The input components 728 may also include one or more image-capturing devices, such as a digital camera for generating digital images or video.


In further exemplary embodiments, the I/O components 718 may include biometric components 730, motion components 734, environment components 736, or position components 738, as well as a wide array of other components. One or more of such components (or portions thereof) may collectively be referred to herein as a "sensor component" or "sensor" for collecting various data related to the machine 700, the environment of the machine 700, a user of the machine 700, or a combination thereof.


For example, the biometric components 730 may include components to detect expressions (e.g., hand expressions, facial expressions, vocal expressions, body gestures, or eye tracking), measure biosignals (e.g., blood pressure, heart rate, body temperature, perspiration, or brain waves), identify a person (e.g., voice identification, retinal identification, facial identification, fingerprint identification, or electroencephalogram-based identification), and the like. The motion components 734 may include acceleration sensor components (e.g., an accelerometer), gravitation sensor components, velocity sensor components (e.g., a speedometer), rotation sensor components (e.g., a gyroscope), and so forth. The environment components 736 may include, for example, illumination sensor components (e.g., a photometer), temperature sensor components (e.g., one or more thermometers that detect ambient temperature), humidity sensor components, pressure sensor components (e.g., a barometer), acoustic sensor components (e.g., one or more microphones that detect background noise), proximity sensor components (e.g., infrared sensors that detect nearby objects), gas sensors (e.g., gas detection sensors to detect concentrations of hazardous gases for safety or to measure pollutants in the atmosphere), or other components that may provide indications, measurements, or signals corresponding to a surrounding physical environment. The position components 738 may include location sensor components (e.g., a Global Positioning System (GPS) receiver component), altitude sensor components (e.g., altimeters or barometers that detect air pressure from which altitude may be derived), orientation sensor components (e.g., magnetometers), and the like. For example, the location sensor component may provide location information associated with the machine 700, such as the GPS coordinates of the machine 700 or information regarding the location at which the machine 700 is currently located (e.g., the name of a restaurant or other business).
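The preceding paragraph notes that altitude may be derived from air pressure measured by a barometer. As a minimal, hedged sketch (not part of the disclosure), the standard international barometric formula performs this derivation; the function name and constants below are textbook assumptions rather than elements of the described system:

def pressure_to_altitude_m(pressure_hpa, sea_level_hpa=1013.25):
    # Approximate altitude in meters from a barometric pressure reading,
    # using the standard-atmosphere formula with textbook constants.
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

For example, a reading of about 899 hPa corresponds to an altitude of roughly 1,000 meters above sea level.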


Communication may be implemented using a wide variety of technologies. The I/O components 718 may include communication components 740 operable to couple the machine 700 to a network 732 or devices 720 via a coupling 722 and a coupling 724, respectively. For example, the communication components 740 may include a network interface component or other suitable device to interface with the network 732. In further examples, the communication components 740 may include wired communication components, wireless communication components, cellular communication components, Near Field Communication (NFC) components, Bluetooth® components, Wi-Fi® components, and other communication components to provide communication via other modalities. The devices 720 may be another machine or any of a wide variety of peripheral devices (e.g., a peripheral device coupled via a Universal Serial Bus (USB)).


Moreover, the communication components 740 may detect identifiers or include components operable to detect identifiers. For example, the communication components 740 may include Radio Frequency Identification (RFID) tag reader components, NFC smart tag detection components, optical reader components (e.g., an optical sensor to detect one-dimensional bar codes such as a Universal Product Code (UPC) bar code, multi-dimensional bar codes such as a Quick Response (QR) code, Aztec code, Data Matrix, Dataglyph, MaxiCode, PDF417, Ultra Code, UCC RSS-2D bar code, and other optical codes), or acoustic detection components (e.g., microphones to identify tagged audio signals). In addition, a variety of information may be derived via the communication components 740, such as location via Internet Protocol (IP) geolocation, location via Wi-Fi® signal triangulation, location via detecting an NFC beacon signal that may indicate a particular location, and so forth.
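Because several of the listed techniques (GPS, Wi-Fi® triangulation, NFC beacons, IP geolocation) can each yield a position estimate, an implementation might prefer higher-accuracy sources and fall back to coarser ones. The following is only an illustrative sketch under that assumption; the source names and the readings structure are hypothetical and not an API of the disclosed system:

PREFERENCE = ["gps", "nfc_beacon", "wifi_triangulation", "ip_geolocation"]

def best_location(readings):
    # readings: dict mapping a source name to a (lat, lon) tuple or None.
    # Returns the first available fix in decreasing order of assumed accuracy.
    for source in PREFERENCE:
        fix = readings.get(source)
        if fix is not None:
            return source, fix
    return None, None

For example, if only Wi-Fi® and IP-based fixes are available, the Wi-Fi® fix would be selected.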


Where a phrase similar to “at least one of A, B, or C,” “at least one of A, B, and C,” “one or more A, B, or C,” or “one or more of A, B, and C” is used, it is intended that the phrase be interpreted to mean that A alone may be present in an embodiment, B alone may be present in an embodiment, C alone may be present in an embodiment, or that any combination of the elements A, B and C may be present in a single embodiment; for example, A and B, A and C, B and C, or A and B and C.


Changes and modifications may be made to the disclosed embodiments without departing from the scope of the present disclosure. These and other changes or modifications are intended to be included within the scope of the present disclosure, as expressed in the following claims.

Claims
  • 1. A system comprising: a processor; and memory coupled to the processor and storing instructions that, when executed by the processor, cause the system to perform operations comprising: receiving, at a server remote from a vehicle, from a first user's client computing device in communication with the system over a network, an electronic communication including location information from a location sensor coupled to the first user's client computing device, the location information indicating a current location of the first user's client computing device; retrieving, via the server, from a database, avatar information for the first user; determining, based on the location information, that the first user is traveling in the vehicle; receiving, via the server, from the first user's client computing device and based on determining that the first user is traveling in the vehicle, a make and model of the vehicle, the make and model having been provided from the vehicle to the first user's client computing device via a wireless connection between the vehicle and the first user's client computing device; generating, via the server, a representation of the vehicle based on the received make and model of the vehicle; generating, via the server, a media content item comprising a map interface depicting an avatar of the first user and the representation of the vehicle at the current location of the first user's client computing device in the map interface, wherein the map interface is generated based on the location information, the avatar of the first user is generated based on the avatar information for the first user, and the avatar of the first user is associated with the representation of the vehicle; causing, via a first transmission from the server to the first user's client computing device, the media content item to be displayed on a display screen of the first user's client computing device; causing, via a second transmission from the server to a second user's client computing device, the media content item to be displayed on a display screen of the second user's client computing device, wherein the avatar of the first user is selectable via the display screen of the first user's client computing device; and presenting the avatar of the first user at a first position in a map image included in the map interface at a first time based on the current location, modifying the map interface to remove the first user's avatar from the first position based on a second position of the first user's client computing device at a second, subsequent time, updating the map interface based on the second position, and presenting the avatar of the first user at the second position in the map interface, wherein the system ceases displaying the avatar of the first user on the map interface on the display screen of the second user's computing device in response to one or more of: expiration of a predetermined period of time, or an input from the first user, via the first user's computing device, to turn off sharing of the first user's location, and wherein the media content item comprises a media overlay created based on a geolocation of the first position and of the second position, the media overlay comprising location text describing a venue at the current location, the media overlay being overlaid on or adjacent to the venue on top of the map image.
  • 2. The system of claim 1, wherein the avatar of the first user is selectable via the display screen of the first user's client computing device and the display screen of the second user's client computing device, and wherein selection of the avatar of the first user via a respective display screen causes the system to display the media content item associated with the first user on the respective display screen.
  • 3. The system of claim 2, wherein an icon associated with the venue is selectable via the display screen of the first user's client computing device or the display screen of the second user's client computing device, and wherein selection of the icon associated with the venue via a respective display screen causes the system to display a second media content item associated with the venue on the respective display screen.
  • 4. The system of claim 1, wherein the system allows access to the media content item by the second user's computing device for a predetermined period of time.
  • 5. The system of claim 1, wherein the avatar of the first user remains displayed on the map interface displayed on the display screen of the first user's computing device subsequent to the input from the first user to turn off sharing of the first user's location.
  • 6. The system of claim 5, wherein the system presents an icon on the display screen of the first user's computing device to indicate the first user's location is not being shared.
  • 7. The system of claim 1, wherein the memory further stores instructions for causing the system to perform operations comprising: receiving, from the second user's client computing device, an electronic communication containing location information from a location sensor coupled to the second user's client computing device, the location information indicating a current location of the second user's client computing device; identifying a second venue based on the location information indicating the current location of the second user's client computing device; and retrieving, from the database, avatar information for the second user, wherein generating the media content item includes generating an avatar of the second user at the current location of the second user's client computing device on the map interface and an icon including a name of the second venue based on the location information indicating the current location of the second user's client computing device.
  • 8. The system of claim 1, wherein generating the media content item includes: receiving, from the computing device of the first user, a granularity option; and generating the avatar of the first user on the map within a predetermined distance of the current location of the first user's client computing device based on the received granularity option.
  • 9. The system of claim 8, wherein the predetermined distance of the current location of the first user's client computing device is one of: a precise location of the first user's client computing device based on the location information, and a random location within a predetermined area based on the location information.
  • 10. The system of claim 1, wherein the memory further stores instructions for causing the system to receive, from the first user's client computing device over the network, authorization from the first user to utilize the location information.
  • 11. The system of claim 1, wherein the memory further stores instructions for causing the system to perform operations comprising: determining a current activity of the first user based on the location information, wherein generating the media content item includes depicting the avatar of the first user engaged in the current activity.
  • 12. The system of claim 11, wherein determining the current activity of the first user includes determining a speed of the first user's client computing device based on first location information from the location sensor at a first time and second location information from the location sensor at a second time, the second time subsequent to the first time.
  • 13. The system of claim 1, wherein generating the media content item includes: determining, based on the location information, that the client computing device of the first user has not moved beyond a predetermined distance from a location for a predetermined period of time; and in response to determining the client computing device of the first user has not moved beyond the predetermined distance from the location, modifying the media content item to remove the avatar of the first user.
  • 14. The system of claim 1, wherein generating the media content item includes identifying the venue associated with the current location of the first user's client computing device and displaying an icon on the map associated with the venue.
  • 15. The system of claim 1, wherein generating the media content item includes identifying a wearable device in communication with the client computing device and displaying the avatar of the first user wearing a representation of the wearable device.
  • 16. The system of claim 1, wherein causing, via the server, the media content item to be displayed on the display of the second user's client computing device comprises determining that the second user is included in a contacts list of the first user and displaying the media content item on the display of the second user's client computing device without input from the first user.
  • 17. A computer-implemented method comprising: receiving, at a server remote from a vehicle, from a first user's client computing device in communication with the system over a network, an electronic communication including location information from a location sensor coupled to the first user's client computing device, the location information indicating a current location of the first user's client computing device; retrieving, via the server, from a database, avatar information for the first user; determining, based on the location information, that the first user is traveling in the vehicle; receiving, via the server, from the first user's client computing device and based on determining that the first user is traveling in the vehicle, a make and model of the vehicle, the make and model having been provided from the vehicle to the first user's client computing device via a wireless connection between the vehicle and the first user's client computing device; generating, via the server, a representation of the vehicle based on the received make and model of the vehicle; generating, via the server, a media content item comprising a map interface depicting an avatar of the first user and the representation of the vehicle at the current location of the first user's client computing device in the map interface, wherein the map interface is generated based on the location information, the avatar of the first user is generated based on the avatar information for the first user, and the avatar of the first user is associated with the representation of the vehicle; causing, via a first transmission from the server to the first user's client computing device, the media content item to be displayed on a display screen of the first user's client computing device; causing, via a second transmission from the server to a second user's client computing device, the media content item to be displayed on a display screen of the second user's client computing device, wherein the avatar of the first user is selectable via the display screen of the first user's client computing device; and presenting the avatar of the first user at a first position in a map image included in the map interface at a first time based on the current location, modifying the map interface to remove the first user's avatar from the first position based on a second position of the first user's client computing device at a second, subsequent time, updating the map interface based on the second position, and presenting the avatar of the first user at the second position in the map interface, wherein the system ceases displaying the avatar of the first user on the map interface on the display screen of the second user's computing device in response to one or more of: expiration of a predetermined period of time, or an input from the first user, via the first user's computing device, to turn off sharing of the first user's location, and wherein the media content item comprises a media overlay created based on a geolocation of the first position and of the second position, the media overlay comprising location text describing a venue at the current location, the media overlay being overlaid on or adjacent to the venue on top of the map image.
  • 18. A non-transitory computer-readable medium storing instructions that, when executed by a computer system, cause the computer system to perform operations comprising: receiving, at a server remote from a vehicle, from a first user's client computing device in communication with the system over a network, an electronic communication including location information from a location sensor coupled to the first user's client computing device, the location information indicating a current location of the first user's client computing device; retrieving, via the server, from a database, avatar information for the first user; determining, based on the location information, that the first user is traveling in the vehicle; receiving, via the server, from the first user's client computing device and based on determining that the first user is traveling in the vehicle, a make and model of the vehicle, the make and model having been provided from the vehicle to the first user's client computing device via a wireless connection between the vehicle and the first user's client computing device; generating, via the server, a representation of the vehicle based on the received make and model of the vehicle; generating, via the server, a media content item comprising a map interface depicting an avatar of the first user and the representation of the vehicle at the current location of the first user's client computing device in the map interface, wherein the map interface is generated based on the location information, the avatar of the first user is generated based on the avatar information for the first user, and the avatar of the first user is associated with the representation of the vehicle; causing, via a first transmission from the server to the first user's client computing device, the media content item to be displayed on a display screen of the first user's client computing device; causing, via a second transmission from the server to a second user's client computing device, the media content item to be displayed on a display screen of the second user's client computing device, wherein the avatar of the first user is selectable via the display screen of the first user's client computing device; and presenting the avatar of the first user at a first position in a map image included in the map interface at a first time based on the current location, modifying the map interface to remove the first user's avatar from the first position based on a second position of the first user's client computing device at a second, subsequent time, updating the map interface based on the second position, and presenting the avatar of the first user at the second position in the map interface, wherein the system ceases displaying the avatar of the first user on the map interface on the display screen of the second user's computing device in response to one or more of: expiration of a predetermined period of time, or an input from the first user, via the first user's computing device, to turn off sharing of the first user's location, and wherein the media content item comprises a media overlay created based on a geolocation of the first position and of the second position, the media overlay comprising location text describing a venue at the current location, the media overlay being overlaid on or adjacent to the venue on top of the map image.
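
By way of illustration only, claim 12 determines a speed from two timestamped location readings, and claims 8-9 place the avatar at a random location within a predetermined area of the true fix. The following is a minimal sketch of both, assuming latitude/longitude fixes and Unix timestamps; every name below is hypothetical and not part of the claims:

import math
import random

EARTH_RADIUS_M = 6371000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in meters between two latitude/longitude fixes.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def speed_mps(fix1, fix2):
    # Speed between two (lat, lon, unix_time) fixes, per claim 12's
    # two-sample approach: distance traveled divided by elapsed time.
    (lat1, lon1, t1), (lat2, lon2, t2) = fix1, fix2
    if t2 <= t1:
        raise ValueError("second fix must be later than the first")
    return haversine_m(lat1, lon1, lat2, lon2) / (t2 - t1)

def fuzz_location(lat, lon, radius_m):
    # Random point within radius_m of the true fix (the "granularity option"
    # of claims 8-9). The square root keeps the samples uniform over the
    # disc rather than clustered at its center.
    r = radius_m * math.sqrt(random.random())
    theta = random.uniform(0, 2 * math.pi)
    dlat = (r * math.cos(theta)) / EARTH_RADIUS_M
    dlon = (r * math.sin(theta)) / (EARTH_RADIUS_M * math.cos(math.radians(lat)))
    return lat + math.degrees(dlat), lon + math.degrees(dlon)

For example, two fixes taken five seconds and 150 meters apart yield 30 m/s (roughly 108 km/h), a speed consistent with determining that the user is traveling in a vehicle.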
PRIORITY

This patent application is a continuation of, and claims the benefit of priority to, U.S. patent application Ser. No. 15/628,408, filed on Jun. 20, 2017, which claims the benefit of priority to U.S. Provisional Patent Application Ser. No. 62/491,115, filed on Apr. 27, 2017, each of which is hereby incorporated by reference herein in its entirety.

20190001223 Blackstock et al. Jan 2019 A1
20190057616 Cohen et al. Feb 2019 A1
20190188920 Mcphee et al. Jun 2019 A1
20200117339 Amitay et al. Apr 2020 A1
20200117340 Amitay et al. Apr 2020 A1
20200120097 Amitay et al. Apr 2020 A1
20200120170 Amitay et al. Apr 2020 A1
20200404464 Constantinides Dec 2020 A1
20210243548 Brody et al. Aug 2021 A1
20210266277 Allen et al. Aug 2021 A1
20210286840 Amitay et al. Sep 2021 A1
20210357104 Amitay et al. Nov 2021 A1
20220291812 Amitay et al. Sep 2022 A1
20230033214 Brody et al. Feb 2023 A1
20230051468 Amitay et al. Feb 2023 A1
20230067248 Amitay et al. Mar 2023 A1
Foreign Referenced Citations (121)
Number Date Country
2887596 Jul 2015 CA
101127109 Feb 2008 CN
101363743 Feb 2009 CN
102037716 Apr 2011 CN
102450031 May 2012 CN
102461218 May 2012 CN
102664819 Sep 2012 CN
103116853 May 2013 CN
103124894 May 2013 CN
103154994 Jun 2013 CN
104054077 Sep 2014 CN
104508426 Apr 2015 CN
104616540 May 2015 CN
104854615 Aug 2015 CN
105554311 May 2016 CN
105893579 Aug 2016 CN
105897565 Aug 2016 CN
106066990 Nov 2016 CN
106157155 Nov 2016 CN
106530008 Mar 2017 CN
107210948 Sep 2017 CN
108885795 Nov 2018 CN
109863532 Jun 2019 CN
110168478 Aug 2019 CN
110799937 Feb 2020 CN
110800018 Feb 2020 CN
110832538 Feb 2020 CN
110945555 Mar 2020 CN
111010882 Apr 2020 CN
111343075 Jun 2020 CN
111489264 Aug 2020 CN
111343075 Sep 2022 CN
2051480 Apr 2009 EP
2151797 Feb 2010 EP
2184092 May 2010 EP
2399928 Sep 2004 GB
2001230801 Aug 2001 JP
2014006881 Jan 2014 JP
5497931 Mar 2014 JP
2014191414 Oct 2014 JP
19990073076 Oct 1999 KR
20010078417 Aug 2001 KR
20040063436 Jul 2004 KR
1020050036963 Apr 2005 KR
20060124865 Dec 2006 KR
20110014224 Feb 2011 KR
20110054492 May 2011 KR
101060961 Aug 2011 KR
1020120070898 Jul 2012 KR
20130075380 Jul 2013 KR
20140015725 Feb 2014 KR
101445263 Sep 2014 KR
20160001847 Jan 2016 KR
20160018954 Feb 2016 KR
101604654 Mar 2016 KR
20160051536 May 2016 KR
101698031 Jan 2017 KR
20170025454 Mar 2017 KR
102434361 Aug 2022 KR
102449545 Oct 2022 KR
102455041 Oct 2022 KR
102486490 Jan 2023 KR
WO-1996024213 Aug 1996 WO
WO-1999063453 Dec 1999 WO
WO-2000058882 Oct 2000 WO
WO-2001029642 Apr 2001 WO
WO-2001050703 Jul 2001 WO
WO-03094072 Nov 2003 WO
WO-2003094072 Nov 2003 WO
WO-2004079530 Sep 2004 WO
WO-2004095308 Nov 2004 WO
WO-2006107182 Oct 2006 WO
WO-2006118755 Nov 2006 WO
WO-2007092668 Aug 2007 WO
WO-2007134402 Nov 2007 WO
WO-2009043020 Apr 2009 WO
WO-2011040821 Apr 2011 WO
WO-2011119407 Sep 2011 WO
WO-2012000107 Jan 2012 WO
WO-2012139276 Oct 2012 WO
WO-2013008238 Jan 2013 WO
WO-2013008251 Jan 2013 WO
WO-2013027893 Feb 2013 WO
WO-2013045753 Apr 2013 WO
WO-2013152454 Oct 2013 WO
WO-2013166588 Nov 2013 WO
WO-2014006129 Jan 2014 WO
WO-2014031899 Feb 2014 WO
WO-2014068573 May 2014 WO
WO-2014115136 Jul 2014 WO
WO-2014194262 Dec 2014 WO
WO-2014194439 Dec 2014 WO
WO-2015192026 Dec 2015 WO
WO-2016044424 Mar 2016 WO
WO-2016054562 Apr 2016 WO
WO-2016065131 Apr 2016 WO
WO-2016090605 Jun 2016 WO
WO-2016100318 Jun 2016 WO
WO-2016100318 Jun 2016 WO
WO-2016100342 Jun 2016 WO
WO-2016112299 Jul 2016 WO
WO-2016149594 Sep 2016 WO
WO-2016179166 Nov 2016 WO
WO-2016179235 Nov 2016 WO
WO-2017173319 Oct 2017 WO
WO-2017176739 Oct 2017 WO
WO-2017176992 Oct 2017 WO
WO-2018005644 Jan 2018 WO
WO-2018006053 Jan 2018 WO
WO-2018081013 May 2018 WO
WO-2018102562 Jun 2018 WO
WO-2018129531 Jul 2018 WO
WO-2018200042 Nov 2018 WO
WO-2018200043 Nov 2018 WO
WO-2018201102 Nov 2018 WO
WO-2018201104 Nov 2018 WO
WO-2018201106 Nov 2018 WO
WO-2018201107 Nov 2018 WO
WO-2018201108 Nov 2018 WO
WO-2018201109 Nov 2018 WO
WO-2019089613 May 2019 WO
Non-Patent Literature Citations (417)
Entry
Google UK. Introducing Google Latitude. youtube.com. Feb. 3, 2009. [Retrieved on: Oct. 23, 2019]. Retrieved from internet: <URL:https://www.youtube.com/watch?v=XecGMKqiA5A>. entire document (Year: 2009).
Zibreg. How to share your real time location on Google Maps. idownloadblog.com. Apr. 12, 2017. [Retrieved on: Oct. 23, 2019]. Retrieved from internet: <URL:https://www.idownloadblog.com/2017/04/12/how-to-share-location-google-maps/>. entire document (Year: 2017).
The Official Google Blog. Check in with Google Latitude. waybackmachine. Feb. 1, 2011. [Retrieved on: Oct. 23, 2019]. Retrieved from internet: <URL:https://web.archive.org/web/20110201201006/https://googleblog.blogspot.com/2011/02/check-in-with-google-latitude.html>. entire document (Year: 2011).
Petovello. How does a GNSS receiver estimate velocity?. insidegnss.com. Apr. 2015. [Retrieved on: Dec. 23, 2018]. Retrieved from internet: <URL:http://insidegnss.com/wp-content/uploads/2018/01/marapr15-SOLUTIONS.pdf>. entire document (Year: 2015).
Neis. The OpenStreetMap Contributors Map aka Who's around me?. neis-one.org. Jan. 6, 2013. [Retrieved on: Jun. 5, 2019]. Retrieved from internet: <URL:https://neis-one.org/2013/01/oooc/>. entire document (Year: 2013).
Sulleyman. Google Maps Could Let Strangers Track Your Real-Time Location for Days at a Time. Mar. 23, 2017. [Retrieved: Jun. 5, 2019]. <URL:https://www.independent.co.uk/life-style/gadgets-and-tech/news/google-maps-track-location-real-time-days-privacy-security-stalk-gps-days-a7645721.html>. (Year: 2017).
Finn. Miss Google Latitude? Google+ With Location Sharing Is Now a Suitable Alternative. cypressnorth.com. Nov. 27, 2013. <URL:https://cypressnorth.com/social-media/miss-google-latitude-google-location-sharing-now-suitable-alternative/>. entire document (Year: 2013).
Perez. Life360, The Family Locator With More Users Than Foursquare, Raises a $10 Million Series B. techcrunch.com. Jul. 10, 2013. [Retrieved on: Apr. 8, 2020]. <URL:https://techcrunch.com/2013/07/10/life360-the-family-locator-with-more-users-than-foursquare-raises-10-million-series-b/>. (Year: 2013).
Grubert. Towards Pervasive Augmented Reality: Context-Awareness in Augmented Reality. (Year: 2017).
“A Whole New Story”, Snap, Inc., [Online] Retrieved from the Internet: < URL: https://www.snap.com/en-US/news/>, (2017), 13 pgs.
“Adding photos to your listing”, eBay, [Online] Retrieved from the Internet: < URL: http://pages.ebay.com/help/sell/pictures.html>, (accessed May 24, 2017), 4 pgs.
“U.S. Appl. No. 12/471,811, Advisory Action dated Mar. 28, 2012”, 6 pgs.
“U.S. Appl. No. 12/471,811, Examiner Interview Summary dated Feb. 2, 2012”, 3 pgs.
“U.S. Appl. No. 12/471,811, Examiner Interview Summary dated Apr. 18, 2011”, 3 pgs.
“U.S. Appl. No. 12/471,811, Examiner Interview Summary dated May 27, 2014”, 2 pgs.
“U.S. Appl. No. 12/471,811, Final Office Action dated Dec. 23, 2011”, 20 pgs.
“U.S. Appl. No. 12/471,811, Non Final Office Action dated Jan. 13, 2011”, 15 pgs.
“U.S. Appl. No. 12/471,811, Non Final Office Action dated Jun. 28, 2011”, 26 pgs.
“U.S. Appl. No. 12/471,811, Non Final Office Action dated Oct. 24, 2014”, 21 pgs.
“U.S. Appl. No. 12/471,811, Notice of Allowance dated Apr. 1, 2015”, 6 pgs.
“U.S. Appl. No. 12/471,811, Response filed Jan. 26, 2015 to Non Final Office Action dated Oct. 24, 2014”, 18 pgs.
“U.S. Appl. No. 12/471,811, Response filed Feb. 23, 2012 to Final Office Action dated Dec. 23, 2011”, 12 pgs.
“U.S. Appl. No. 12/471,811, Response filed Mar. 28, 2012 to Advisory Action dated Mar. 28, 2012”, 14 pgs.
“U.S. Appl. No. 12/471,811, Response filed Apr. 13, 2011 to Non Final Office Action dated Jan. 13, 2011”, 5 pgs.
“U.S. Appl. No. 12/471,811, Response filed Sep. 28, 2011 to Non Final Office Action dated Jun. 28, 2011”, 19 pgs.
“U.S. Appl. No. 13/979,974, Corrected Notice of Allowability dated Nov. 19, 2018”, 2 pgs.
“U.S. Appl. No. 13/979,974, Examiner Interview Summary dated Jun. 29, 2017”, 3 pgs.
“U.S. Appl. No. 13/979,974, Examiner Interview Summary dated Sep. 15, 2017”, 3 pgs.
“U.S. Appl. No. 13/979,974, Final Office Action dated Apr. 25, 2018”, 18 pgs.
“U.S. Appl. No. 13/979,974, Final Office Action dated Jun. 9, 2017”, 20 pgs.
“U.S. Appl. No. 13/979,974, Final Office Action dated Oct. 12, 2016”, 13 pgs.
“U.S. Appl. No. 13/979,974, Non Final Office Action dated Feb. 22, 2017”, 17 pgs.
“U.S. Appl. No. 13/979,974, Non Final Office Action dated Apr. 27, 2016”, 16 pgs.
“U.S. Appl. No. 13/979,974, Non Final Office Action dated Oct. 3, 2017”, 17 pgs.
“U.S. Appl. No. 13/979,974, Notice of Allowance dated Aug. 10, 2018”, 9 pgs.
“U.S. Appl. No. 13/979,974, Response filed Jan. 3, 2018 to Non Final Office Action dated Oct. 3, 2017”, 8 pgs.
“U.S. Appl. No. 13/979,974, Response filed May 22, 2017 to Non Final Office Action dated Feb. 22, 2017”, 10 pgs.
“U.S. Appl. No. 13/979,974, Response filed Jul. 25, 2018 to Final Office Action dated Apr. 25, 2018”, 10 pgs.
“U.S. Appl. No. 13/979,974, Response filed Jul. 26, 2016 to Non Final Office Action dated Apr. 27, 2016”, 8 pgs.
“U.S. Appl. No. 13/979,974, Response filed Sep. 11, 2017 to Final Office Action dated Jun. 9, 2017”, 8 pgs.
“U.S. Appl. No. 13/979,974, Response filed Jan. 12, 2017 to Non Final Office Action dated Apr. 27, 2016”, 8 pgs.
“U.S. Appl. No. 14/753,200, Non Final Office Action dated Oct. 11, 2016”, 6 pgs.
“U.S. Appl. No. 14/753,200, Notice of Allowance dated Apr. 27, 2017”, 7 pgs.
“U.S. Appl. No. 14/753,200, Response filed Feb. 13, 2017 to Non Final Office Action dated Oct. 11, 2016”, 9 pgs.
“U.S. Appl. No. 15/086,749, Final Office Action dated Oct. 31, 2017”, 15 pgs.
“U.S. Appl. No. 15/086,749, Final Office Action dated Dec. 31, 2018”, 14 pgs.
“U.S. Appl. No. 15/086,749, Non Final Office Action dated Mar. 13, 2017”, 12 pgs.
“U.S. Appl. No. 15/086,749, Non Final Office Action dated Apr. 30, 2018”, 14 pgs.
“U.S. Appl. No. 15/086,749, Notice of Allowance dated Feb. 26, 2019”, 7 pgs.
“U.S. Appl. No. 15/086,749, Response filed Feb. 11, 2019 to Final Office Action dated Dec. 31, 2018”, 10 pgs.
“U.S. Appl. No. 15/086,749, Response filed Apr. 2, 2018 to Final Office Action dated Oct. 31, 2017”, 14 pgs.
“U.S. Appl. No. 15/086,749, Response filed Aug. 29, 2018 to Non Final Office Action dated Apr. 30, 2018”, 12 pgs.
“U.S. Appl. No. 15/199,472, Final Office Action dated Mar. 1, 2018”, 31 pgs.
“U.S. Appl. No. 15/199,472, Final Office Action dated Nov. 15, 2018”, 37 pgs.
“U.S. Appl. No. 15/199,472, Non Final Office Action dated Jul. 25, 2017”, 30 pgs.
“U.S. Appl. No. 15/199,472, Non Final Office Action dated Sep. 21, 2018”, 33 pgs.
“U.S. Appl. No. 15/199,472, Notice of Allowance dated Mar. 18, 2019”, 23 pgs.
“U.S. Appl. No. 15/199,472, Response filed Jan. 15, 2019 to Final Office Action dated Nov. 15, 2018”, 14 pgs.
“U.S. Appl. No. 15/199,472, Response filed Jan. 25, 2018 to Non Final Office Action dated Jul. 25, 2017”, 13 pgs.
“U.S. Appl. No. 15/199,472, Response filed Aug. 31, 2018 to Final Office Action dated Mar. 1, 2018”, 14 pgs.
“U.S. Appl. No. 15/199,472, Response filed Oct. 17, 2018 to Non Final Office Action dated Sep. 21, 2018”, 11 pgs.
“U.S. Appl. No. 15/365,046, Non Final Office Action dated Dec. 20, 2018”, 36 pgs.
“U.S. Appl. No. 15/365,046, Response filed Mar. 20, 2019 to Non Final Office Action dated Dec. 20, 2018”, 20 pgs.
“U.S. Appl. No. 15/369,499, Final Office Action dated Jan. 31, 2019”, 22 pgs.
“U.S. Appl. No. 15/369,499, Non Final Office Action dated Jun. 17, 2019”, 17 pgs.
“U.S. Appl. No. 15/369,499, Non Final Office Action dated Aug. 15, 2018”, 22 pgs.
“U.S. Appl. No. 15/369,499, Response filed Mar. 14, 2019 to Final Office Action dated Jan. 31, 2019”, 12 pgs.
“U.S. Appl. No. 15/369,499, Response filed Nov. 15, 2018 to Non Final Office Action dated Aug. 15, 2018”, 10 pgs.
“U.S. Appl. No. 15/583,142, Response filed Jan. 28, 2019 to Non Final Office Action dated Oct. 25, 2018”, 19 pgs.
“U.S. Appl. No. 15/583,142, Final Office Action dated Mar. 22, 2019”, 11 pgs.
“U.S. Appl. No. 15/583,142, Non Final Office Action dated Oct. 25, 2018”, 14 pgs.
“U.S. Appl. No. 15/628,408, Final Office Action dated Jun. 10, 2019”, 44 pgs.
“U.S. Appl. No. 15/628,408, Non Final Office Action dated Jan. 2, 2019”, 28 pgs.
“U.S. Appl. No. 15/628,408, Response filed Apr. 2, 2019 to Non Final Office Action dated Jan. 2, 2019”, 15 pgs.
“U.S. Appl. No. 15/628,408, Supplemental Amendment filed Apr. 4, 2019 to Non Final Office Action dated Jan. 2, 2019”, 12 pgs.
“U.S. Appl. No. 15/661,953, Examiner Interview Summary dated Nov. 13, 2018”, 3 pgs.
“U.S. Appl. No. 15/661,953, Non Final Office Action dated Mar. 26, 2018”, 6 pgs.
“U.S. Appl. No. 15/661,953, Notice of Allowance dated Aug. 10, 2018”, 7 pgs.
“U.S. Appl. No. 15/661,953, PTO Response to Rule 312 Communication dated Oct. 30, 2018”, 2 pgs.
“U.S. Appl. No. 15/661,953, PTO Response to Rule 312 Communication dated Nov. 7, 2018”, 2 pgs.
“U.S. Appl. No. 15/661,953, Response Filed Jun. 26, 2018 to Non Final Office Action dated Mar. 26, 2018”, 13 pgs.
“U.S. Appl. No. 15/965,744, Non Final Office Action dated Jun. 12, 2019”, 18 pgs.
“U.S. Appl. No. 15/965,749, Non Final Office Action dated Jul. 10, 2019”, 8 pgs.
“U.S. Appl. No. 16/115,259, Preliminary Amendment filed Oct. 18, 2018”, 6 pgs.
“U.S. Appl. No. 16/193,938, Preliminary Amendment filed Nov. 27, 2018”, 7 pgs.
“BlogStomp”, StompSoftware, [Online] Retrieved from the Internet: <URL: http://stompsoftware.com/blogstomp>, (accessed May 24, 2017), 12 pgs.
“Cup Magic Starbucks Holiday Red Cups come to life with AR app”, Blast Radius, [Online] Retrieved from the Internet: <URL: https://web.archive.org/web/20160711202454/http://www.blastradius.com/work/cup-magic>, (2016), 7 pgs.
“Daily App: InstaPlace (IOS/Android): Give Pictures a Sense of Place”, TechPP, [Online] Retrieved from the Internet: <URL: http://techpp.com/2013/02/15/instaplace-app-review>, (2013), 13 pgs.
“European Application Serial No. 17776809.0, Extended European Search Report dated Feb. 27, 2019”, 7 pgs.
“InstaPlace Photo App Tell the Whole Story”, [Online] Retrieved from the internet: <URL: https://youtu.be/uF_gFkg1hBM>, (Nov. 8, 2013), 113 pgs., 1:02 min.
“International Application Serial No. PCT/CA2013/000454, International Preliminary Report on Patentability dated Nov. 20, 2014”, 9 pgs.
“International Application Serial No. PCT/CA2013/000454, International Search Report dated Aug. 20, 2013”, 3 pgs.
“International Application Serial No. PCT/CA2013/000454, Written Opinion dated Aug. 20, 2013”, 7 pgs.
“International Application Serial No. PCT/US2015/037251, International Search Report dated Sep. 29, 2015”, 2 pgs.
“International Application Serial No. PCT/US2017/025460, International Preliminary Report on Patentability dated Oct. 11, 2018”, 9 pgs.
“International Application Serial No. PCT/US2017/025460, International Search Report dated Jun. 20, 2017”, 2 pgs.
“International Application Serial No. PCT/US2017/025460, Written Opinion dated Jun. 20, 2017”, 7 pgs.
“International Application Serial No. PCT/US2017/040447, International Preliminary Report on Patentability dated Jan. 10, 2019”, 8 pgs.
“International Application Serial No. PCT/US2017/040447, International Search Report dated Oct. 2, 2017”, 4 pgs.
“International Application Serial No. PCT/US2017/040447, Written Opinion dated Oct. 2, 2017”, 6 pgs.
“International Application Serial No. PCT/US2017/057918, International Search Report dated Jan. 19, 2018”, 3 pgs.
“International Application Serial No. PCT/US2017/057918, Written Opinion dated Jan. 19, 2018”, 7 pgs.
“International Application Serial No. PCT/US2017/063981, International Search Report dated Mar. 22, 2018”, 3 pgs.
“International Application Serial No. PCT/US2017/063981, Written Opinion dated Mar. 22, 2018”, 8 pgs.
“International Application Serial No. PCT/US2018/000112, International Search Report dated Jul. 20, 2018”, 2 pgs.
“International Application Serial No. PCT/US2018/000112, Written Opinion dated Jul. 20, 2018”, 4 pgs.
“International Application Serial No. PCT/US2018/000113, International Search Report dated Jul. 13, 2018”, 2 pgs.
“International Application Serial No. PCT/US2018/000113, Written Opinion dated Jul. 13, 2018”, 4 pgs.
“International Application Serial No. PCT/US2018/030039, International Search Report dated Jul. 11, 2018”, 2 pgs.
“International Application Serial No. PCT/US2018/030039, Written Opinion dated Jul. 11, 2018”, 4 pgs.
“International Application Serial No. PCT/US2018/030041, International Search Report dated Jul. 11, 2018”, 2 pgs.
“International Application Serial No. PCT/US2018/030041, Written Opinion dated Jul. 11, 2018”, 3 pgs.
“International Application Serial No. PCT/US2018/030043, International Search Report dated Jul. 23, 2018”, 2 pgs.
“International Application Serial No. PCT/US2018/030043, Written Opinion dated Jul. 23, 2018”, 5 pgs.
“International Application Serial No. PCT/US2018/030044, International Search Report dated Jun. 26, 2018”, 2 pgs.
“International Application Serial No. PCT/US2018/030044, Written Opinion dated Jun. 26, 2018”, 6 pgs.
“International Application Serial No. PCT/US2018/030045, International Search Report dated Jul. 3, 2018”, 2 pgs.
“International Application Serial No. PCT/US2018/030045, Written Opinion dated Jul. 3, 2018”, 6 pgs.
“International Application Serial No. PCT/US2018/030046, International Search Report dated Jul. 6, 2018”, 2 pgs.
“International Application Serial No. PCT/US2018/030046, Written Opinion dated Jul. 6, 2018”, 6 pgs.
“Introducing Snapchat Stories”, [Online] Retrieved from the Internet: <URL: https://web.archive.org/web/20131026084921/https://www.youtube.com/watch?v=88Cu3yN-LIM>, (Oct. 3, 2013), 92 pgs.; 00:47 min.
“List of IBM Patents or Patent Applications Treated as Related, Filed Herewith.”, 2 pgs.
“Macy's Believe-o-Magic”, [Online] Retrieved from the Internet: <URL: https://web.archive.org/web/20190422101854/https://www.youtube.com/watch?v=xvzRXy3J0Z0&feature=youtu.be>, (Nov. 7, 2011), 102 pgs.; 00:51 min.
“Macy's Introduces Augmented Reality Experience in Stores across Country as Part of Its 2011 Believe Campaign”, Business Wire, [Online] Retrieved from the Internet: <URL: https://www.businesswire.com/news/home/20111102006759/en/Macys-Introduces-Augmented-Reality-Experience-Stores-Country>, (Nov. 2, 2011), 6 pgs.
“Starbucks Cup Magic”, [Online] Retrieved from the Internet: <URL: https://www.youtube.com/watch?v=RWwQXi9RG0w>, (Nov. 8, 2011), 87 pgs.; 00:47 min.
“Starbucks Cup Magic for Valentine's Day”, [Online] Retrieved from the Internet: <URL: https://www.youtube.com/watch?v=8nvqOzjq10w>, (Feb. 6, 2012), 88 pgs.; 00:45 min.
“Starbucks Holiday Red Cups Come to Life, Signaling the Return of the Merriest Season”, Business Wire, [Online] Retrieved from the Internet: <URL: http://www.businesswire.com/news/home/20111115005744/en/2479513/Starbucks-Holiday-Red-Cups-Life-Signaling-Return>, (Nov. 15, 2011), 5 pgs.
Broderick, Ryan, “Everything You Need to Know About Japan's Amazing Photo Booths”, [Online] Retrieved from the Internet: <https://www.buzzfeed.com/ryanhatesthis/look-how-kawaii-i-am?utm_term=.kra5QwGNZ#.muYoVB7qJ>, (Jan. 22, 2016), 30 pgs.
Carthy, Roi, “Dear All Photo Apps: Mobli Just Won Filters”, TechCrunch, [Online] Retrieved from the Internet: <URL: https://techcrunch.com/2011/09/08/mobli-filters>, (Sep. 8, 2011), 10 pgs.
Castelluccia, Claude, et al., “EphPub: Toward robust Ephemeral Publishing”, 19th IEEE International Conference on Network Protocols (ICNP), (Oct. 17, 2011), 18 pgs.
Chan, Connie, “The Elements of Stickers”, [Online] Retrieved from the Internet: <https://a16z.com/2016/06/17/stickers/>, (Jun. 20, 2016), 15 pgs.
Collet, Jean Luc, et al., “Interactive avatar in messaging environment”, U.S. Appl. No. 12/471,811, filed May 26, 2009, (May 26, 2009), 31 pgs.
Dillet, Romain, “Zenly proves that location sharing isn't dead”, [Online] Retrieved from the Internet: <URL: https://techcrunch.com/2016/05/19/zenly-solomoyolo/>, (accessed Jun. 27, 2018), 6 pgs.
Fajman, “An Extensible Message Format for Message Disposition Notifications”, Request for Comments: 2298, National Institutes of Health, (Mar. 1998), 28 pgs.
Janthong, Isaranu, “Instaplace ready on Android Google Play store”, Android App Review Thailand, [Online] Retrieved from the Internet: <URL: http://www.android-free-app-review.com/2013/01/instaplace-android-google-play-store.html>, (Jan. 23, 2013), 9 pgs.
Leyden, John, “This SMS will self-destruct in 40 seconds”, [Online] Retrieved from the Internet: <URL: http://www.theregister.co.uk/2005/12/12/stealthtext/>, (Dec. 12, 2005), 1 pg.
MacLeod, Duncan, “Macys Believe-o-Magic App”, [Online] Retrieved from the Internet: <URL: http://theinspirationroom.com/daily/2011/macys-believe-o-magic-app>, (Nov. 14, 2011), 10 pgs.
MacLeod, Duncan, “Starbucks Cup Magic Lets Merry”, [Online] Retrieved from the Internet: <URL: http://theinspirationroom.com/daily/2011/starbucks-cup-magic>, (Nov. 12, 2011), 8 pgs.
Melanson, Mike, “This text message will self destruct in 60 seconds”, [Online] Retrieved from the Internet: <URL: http://readwrite.com/2011/02/11/this_text_message_will_self_destruct_in_60_seconds>, (Feb. 18, 2015), 4 pgs.
Neis, Pascal, “The OpenStreetMap Contributors Map aka Who's around me?”, [Online] Retrieved from the Internet by the examiner on Jun. 5, 2019: <URL: https://neis-one.org/2013/01/oooc/>, (Jan. 6, 2013), 7 pgs.
Notopoulos, Katie, “A Guide to the New Snapchat Filters and Big Fonts”, [Online] Retrieved from the Internet: <URL: https://www.buzzfeed.com/katienotopoulos/a-guide-to-the-new-snapchat-filters-and-big-fonts?utm_term=.bkQ9qVZWe#.nv58YXpkV>, (Dec. 22, 2013), 13 pgs.
Panzarino, Matthew, “Snapchat Adds Filters, a Replay Function and for Whatever Reason, Time, Temperature and Speed Overlays”, TechCrunch, [Online] Retrieved from the Internet: <URL: https://techcrunch.com/2013/12/20/snapchat-adds-filters-new-font-and-for-some-reason-time-temperature-and-speed-overlays/>, (Dec. 20, 2013), 12 pgs.
Petovello, Mark, “How does a GNSS receiver estimate velocity?”, InsideGNSS, [Online] Retrieved from the Internet: <http://insidegnss.com/wp-content/uploads/2018/01/marapr15-SOLUTIONS.pdf>., (Mar.-Apr. 2015), 3 pgs.
Rhee, Chi-Hyoung, et al., “Cartoon-like Avatar Generation Using Facial Component Matching”, International Journal of Multimedia and Ubiquitous Engineering, (Jul. 30, 2013), 69-78.
Sawers, Paul, “Snapchat for IOS Lets You Send Photos to Friends and Set How Long They're Visible For”, [Online] Retrieved from the Internet: <URL: https://thenextweb.com/apps/2012/05/07/snapchat-for-ios-lets-you-send-photos-to-friends-and-set-how-long-theyre-visible-for/>, (May 7, 2012), 5 pgs.
Shein, Esther, “Ephemeral Data”, Communications of the ACM, vol. 56, No. 9, (Sep. 2013), 3 pgs.
Sulleyman, Aatif, “Google Maps Could Let Strangers Track Your Real-Time Location for Days at a Time”, The Independent, [Online] Retrieved from the Internet by the examiner on Jun. 5, 2019: <URL: https://www.independent.co.uk/life-style/gadgets-and-tech/news/google-maps-track-location-real-time-days-privacy-security-stalk-gps-days-a7645721.html>, (Mar. 23, 2017), 5 pgs.
Tripathi, Rohit, “Watermark Images in PHP and Save File on Server”, [Online] Retrieved from the Internet: <URL: http://code.rohitink.com/2012/12/28/watermark-images-in-php-and-save-file-on-server>, (Dec. 28, 2012), 4 pgs.
Vaas, Lisa, “StealthText, Should You Choose to Accept It”, [Online] Retrieved from the Internet: <URL: http://www.eweek.com/print/c/a/MessagingandCollaboration/StealthTextShouldYouChoosetoAcceptIt>, (Dec. 13, 2005), 2 pgs.
“U.S. Appl. No. 15/369,499, Final Office Action dated Oct. 1, 2019”, 17 pgs.
“U.S. Appl. No. 15/369,499, Response filed Sep. 10, 2019 to Non-Final Office Action dated Jun. 17, 2019”, 9 pgs.
“U.S. Appl. No. 15/628,408, Non Final Office Action dated Oct. 30, 2019”, 45 pgs.
“U.S. Appl. No. 15/628,408, Response filed Jan. 30, 2020 to Non Final Office Action dated Oct. 30, 2019”, 17 pgs.
“U.S. Appl. No. 15/628,408, Response filed Aug. 12, 2019 to Final Office Action dated Jun. 10, 2019”, 12 pgs.
“U.S. Appl. No. 15/901,387, Non Final Office Action dated Oct. 30, 2019”, 40 pgs.
“U.S. Appl. No. 15/965,744, Response filed Nov. 12, 2019 to Non Final Office Action dated Jun. 12, 2019”, 10 pgs.
“U.S. Appl. No. 15/965,749, Non Final Office Action dated Jan. 27, 2020”, 9 pgs.
“U.S. Appl. No. 15/965,749, Response filed Oct. 10, 2019 to Non-Final Office Action dated Jul. 10, 2019”, 11 pgs.
“U.S. Appl. No. 15/965,764, Non Final Office Action dated Jan. 2, 2020”, 18 pgs.
“U.S. Appl. No. 15/965,775, Final Office Action dated Jan. 30, 2020”, 10 pgs.
“U.S. Appl. No. 15/965,775, Non Final Office Action dated Jul. 29, 2019”, 8 pgs.
“U.S. Appl. No. 15/965,775, Response filed Oct. 29, 2019 to Non Final Office Action dated Jul. 29, 2019”, 10 pgs.
“U.S. Appl. No. 16/115,259, Final Office Action dated Dec. 16, 2019”, 23 pgs.
“U.S. Appl. No. 16/115,259, Non Final Office Action dated Jul. 30, 2019”, 21 pgs.
“U.S. Appl. No. 16/115,259, Response filed Oct. 30, 2019 to Non Final Office Action dated Jul. 30, 2019”, 9 pgs.
“U.S. Appl. No. 16/232,824, Non Final Office Action dated Oct. 21, 2019”, 18 pgs.
“European Application Serial No. 18789872.1, Extended European Search Report dated Jan. 2, 2020”, 8 pgs.
“European Application Serial No. 18790189.7, Extended European Search Report dated Jan. 2, 2020”, 7 pgs.
“European Application Serial No. 18791925.3, Extended European Search Report dated Jan. 2, 2020”, 6 pgs.
“International Application Serial No. PCT/US2018/000112, International Preliminary Report on Patentability dated Nov. 7, 2019”, 6 pgs.
“International Application Serial No. PCT/US2018/030039, International Preliminary Report on Patentability dated Nov. 7, 2019”, 6 pgs.
“International Application Serial No. PCT/US2018/030043, International Preliminary Report on Patentability dated Nov. 7, 2019”, 7 pgs.
“International Application Serial No. PCT/US2018/030044, International Preliminary Report on Patentability dated Nov. 7, 2019”, 8 pgs.
“International Application Serial No. PCT/US2018/030045, International Preliminary Report on Patentability dated Nov. 7, 2019”, 8 pgs.
“International Application Serial No. PCT/US2018/030046, International Preliminary Report on Patentability dated Nov. 7, 2019”, 8 pgs.
“International Application Serial No. PCT/US2018/000113, International Preliminary Report on Patentability dated Nov. 7, 2019”, 6 pgs.
“U.S. Appl. No. 15/369,499, Final Office Action dated Jun. 15, 2020”, 17 pgs.
“U.S. Appl. No. 15/369,499, Non Final Office Action dated Mar. 2, 2020”, 17 pgs.
“U.S. Appl. No. 15/369,499, Response filed Feb. 3, 2020 to Final Office Action dated Oct. 1, 2019”, 10 pgs.
“U.S. Appl. No. 15/369,499, Response filed Jun. 2, 2020 to Non Final Office Action dated Mar. 2, 2020”, 9 pgs.
“U.S. Appl. No. 15/628,408, Final Office Action dated Apr. 13, 2020”, 45 pgs.
“U.S. Appl. No. 15/628,408, Response filed Jul. 13, 2020 to Final Office Action dated Apr. 13, 2020”, 20 pgs.
“U.S. Appl. No. 15/965,361, Non Final Office Action dated Jun. 22, 2020”, 35 pgs.
“U.S. Appl. No. 15/965,744, Examiner Interview Summary dated Feb. 21, 2020”, 3 pgs.
“U.S. Appl. No. 15/965,744, Final Office Action dated Feb. 6, 2020”, 19 pgs.
“U.S. Appl. No. 15/965,744, Response filed Jun. 8, 2020 to Final Office Action dated Feb. 6, 2020”, 11 pgs.
“U.S. Appl. No. 15/965,749, Examiner Interview Summary dated Jul. 29, 2020”, 3 pgs.
“U.S. Appl. No. 15/965,749, Final Office Action dated Jun. 11, 2020”, 12 pgs.
“U.S. Appl. No. 15/965,749, Response filed Feb. 28, 2020 to Non Final Office Action dated Jan. 27, 2020”, 12 pgs.
“U.S. Appl. No. 15/965,754, Final Office Action dated Jul. 17, 2020”, 14 pgs.
“U.S. Appl. No. 15/965,754, Non Final Office Action dated Mar. 30, 2020”, 13 pgs.
“U.S. Appl. No. 15/965,754, Response filed Jun. 30, 2020 to Non Final Office Action dated Mar. 30, 2020”, 12 pgs.
“U.S. Appl. No. 15/965,756, Non Final Office Action dated Jun. 24, 2020”, 16 pgs.
“U.S. Appl. No. 15/965,764, Examiner Interview Summary dated Aug. 6, 2020”, 3 pgs.
“U.S. Appl. No. 15/965,764, Final Office Action dated May 14, 2020”, 18 pgs.
“U.S. Appl. No. 15/965,764, Response filed Apr. 2, 2020 to Non Final Office Action dated Jan. 2, 2020”, 11 pgs.
“U.S. Appl. No. 15/965,775, Non Final Office Action dated Jun. 19, 2020”, 12 pgs.
“U.S. Appl. No. 15/965,775, Response filed Jun. 1, 2020 to Final Office Action dated Jan. 30, 2020”, 10 pgs.
“U.S. Appl. No. 15/965,775, Response filed Jul. 7, 2020 to Non Final Office Action dated Jun. 19, 2020”, 9 pgs.
“U.S. Appl. No. 16/115,259, Final Office Action dated Jul. 22, 2020”, 20 pgs.
“U.S. Appl. No. 16/115,259, Non Final Office Action dated Apr. 9, 2020”, 18 pgs.
“U.S. Appl. No. 16/115,259, Response filed Mar. 13, 2020 to Final Office Action dated Dec. 16, 2019”, 9 pgs.
“U.S. Appl. No. 16/115,259, Response filed Jul. 9, 2020 to Non Final Office Action dated Apr. 9, 2020”.
“U.S. Appl. No. 16/232,824, Examiner Interview Summary dated Jul. 24, 2020”, 3 pgs.
“U.S. Appl. No. 16/232,824, Final Office Action dated Apr. 30, 2020”, 19 pgs.
“U.S. Appl. No. 16/232,824, Response filed Feb. 21, 2020 to Non Final Office Action dated Oct. 21, 2019”, 9 pgs.
“U.S. Appl. No. 16/232,824, Response filed Jul. 15, 2020 to Final Office Action dated Apr. 30, 2020”, 11 pgs.
“European Application Serial No. 19206595.1, Extended European Search Report dated Mar. 31, 2020”, 6 pgs.
“European Application Serial No. 18789872.1, Communication Pursuant to Article 94(3) EPC dated Aug. 11, 2020”, 6 pgs.
“European Application Serial No. 18790189.7, Communication Pursuant to Article 94(3) EPC dated Jul. 30, 2020”, 9 pgs.
“European Application Serial No. 18790189.7, Response filed Jul. 14, 2020 to Extended European Search Report dated Jan. 2, 2020”, 21 pgs.
“European Application Serial No. 18790319.0, Extended European Search Report dated Feb. 12, 2020”, 6 pgs.
“European Application Serial No. 18791925.3, Response filed Jul. 27, 2020 to Extended European Search Report dated Jan. 2, 2020”, 19 pgs.
“European Application Serial No. 19206610.8, Extended European Search Report dated Feb. 12, 2020”, 6 pgs.
Gundersen, Eric, “Foursquare Switches to MapBox Streets, Joins the OpenStreetMap Movement”, [Online] Retrieved from the Internet: <URL: https://blog.mapbox.com/foursquare-switches-to-mapbox-streets-joins-the-openstreetmap-movement-29e6a17f4464>, (Mar. 6, 2012), 4 pgs.
“U.S. Appl. No. 15/369,499, Corrected Notice of Allowability dated Jan. 28, 2021”, 3 pgs.
“U.S. Appl. No. 15/369,499, Examiner Interview Summary dated Sep. 21, 2020”, 3 pgs.
“U.S. Appl. No. 15/369,499, Examiner Interview Summary dated Oct. 9, 2020”, 2 pgs.
“U.S. Appl. No. 15/369,499, Notice of Allowance dated Oct. 26, 2020”, 17 pgs.
“U.S. Appl. No. 15/369,499, Response filed Sep. 15, 2020 to Final Office Action dated Jun. 15, 2020”, 10 pgs.
“U.S. Appl. No. 15/628,408, Notice of Allowance dated Sep. 29, 2020”, 13 pgs.
“U.S. Appl. No. 15/859,101, Examiner Interview Summary dated Sep. 18, 2018”, 3 pgs.
“U.S. Appl. No. 15/859,101, Non Final Office Action dated Jun. 15, 2018”, 10 pgs.
“U.S. Appl. No. 15/859,101, Notice of Allowance dated Oct. 4, 2018”, 9 pgs.
“U.S. Appl. No. 15/859,101, Response filed Sep. 17, 2018 to Non Final Office Action dated Jun. 15, 2018”, 17 pgs.
“U.S. Appl. No. 15/965,749, Non Final Office Action dated Nov. 30, 2020”, 13 pgs.
“U.S. Appl. No. 15/965,749, Response filed Oct. 12, 2020 to Final Office Action dated Jun. 11, 2020”, 14 pgs.
“U.S. Appl. No. 15/965,754, Corrected Notice of Allowability dated Jan. 6, 2021”, 2 pgs.
“U.S. Appl. No. 15/965,754, Notice of Allowance dated Nov. 16, 2020”, 7 pgs.
“U.S. Appl. No. 15/965,754, Response filed Oct. 19, 2020 to Final Office Action dated Jul. 17, 2020”, 14 pgs.
“U.S. Appl. No. 15/965,754, Supplemental Notice of Allowability dated Dec. 16, 2020”, 2 pgs.
“U.S. Appl. No. 15/965,756, Non Final Office Action dated Jan. 13, 2021”, 16 pgs.
“U.S. Appl. No. 15/965,756, Response filed Sep. 24, 2020 to Non Final Office Action dated Jun. 24, 2020”, 11 pgs.
“U.S. Appl. No. 15/965,764, Response filed Oct. 14, 2020 to Final Office Action dated May 14, 2020”, 11 pgs.
“U.S. Appl. No. 15/965,775, Non Final Office Action dated Oct. 16, 2020”, 11 pgs.
“U.S. Appl. No. 15/965,811, Final Office Action dated Feb. 12, 2020”, 16 pgs.
“U.S. Appl. No. 15/965,811, Non Final Office Action dated Jun. 26, 2020”, 20 pgs.
“U.S. Appl. No. 15/965,811, Non Final Office Action dated Aug. 8, 2019”, 15 pgs.
“U.S. Appl. No. 15/965,811, Response filed Jun. 12, 2020 to Final Office Action dated Feb. 12, 2020”, 13 pgs.
“U.S. Appl. No. 15/965,811, Response filed Nov. 8, 2019 to Non Final Office Action dated Aug. 8, 2019”, 14 pgs.
“U.S. Appl. No. 16/115,259, Non Final Office Action dated Jan. 11, 2021”, 17 pgs.
“U.S. Appl. No. 16/115,259, Response filed Oct. 22, 2020 to Final Office Action dated Jul. 22, 2020”, 10 pgs.
“U.S. Appl. No. 16/245,660, Final Office Action dated Feb. 6, 2020”, 12 pgs.
“U.S. Appl. No. 16/245,660, Non Final Office Action dated Jun. 27, 2019”, 11 pgs.
“U.S. Appl. No. 16/245,660, Notice of Allowability dated Nov. 18, 2020”, 2 pgs.
“U.S. Appl. No. 16/245,660, Notice of Allowance dated Jul. 8, 2020”, 8 pgs.
“U.S. Appl. No. 16/245,660, Notice of Allowance dated Nov. 3, 2020”, 8 pgs.
“U.S. Appl. No. 16/245,660, Response filed Jun. 8, 2020 to Final Office Action dated Feb. 6, 2020”, 16 pgs.
“U.S. Appl. No. 16/245,660, Response filed Nov. 6, 2019 to Non Final Office Action dated Jun. 27, 2019”, 11 pgs.
“European Application Serial No. 18790319.0, Response filed Aug. 27, 2020 to Extended European Search Report dated Feb. 12, 2020”, 19 pgs.
“European Application Serial No. 18791363.7, Communication Pursuant to Article 94(3) EPC dated Aug. 11, 2020”, 9 pgs.
“European Application Serial No. 18791363.7, Extended European Search Report dated Jan. 2, 2020”, 8 pgs.
“European Application Serial No. 18791363.7, Response filed Jul. 14, 2020 to Extended European Search Report dated Jan. 2, 2020”, 31 pgs.
“European Application Serial No. 19206595.1, Response filed Dec. 16, 2020 to Extended European Search Report dated Mar. 31, 2020”, w/ English Claims, 43 pgs.
“European Application Serial No. 19206610.8, Response filed Sep. 23, 2020 to Extended European Search Report dated Feb. 12, 2020”, 109 pgs.
“International Application Serial No. PCT/US2018/030041, International Preliminary Report on Patentability dated Nov. 7, 2019”, 5 pgs.
“The One Million Tweet Map: Using Maptimize to Visualize Tweets in a World Map | PowerPoint Presentation”, fppt.com, [Online] Retrieved from the Internet: <URL: https://web.archive.org/web/20121103231906/http://www.free-power-point-templates.com/articles/the-one-million-tweet-map-using-maptimize-to-visualize-tweets-in-a-world-map/>, (Nov. 3, 2012), 6 pgs.
“U.S. Appl. No. 15/628,408, Notice of Allowance dated Jul. 8, 2021”, 11 pgs.
“U.S. Appl. No. 15/965,744, Response filed Jun. 1, 2021 to Non Final Office Action dated Feb. 1, 2021”, 11 pgs.
“U.S. Appl. No. 15/965,749, Non Final Office Action dated Jul. 9, 2021”, 14 pgs.
“U.S. Appl. No. 15/965,756, Response filed May 13, 2021 to Non Final Office Action dated Jan. 13, 2021”, 12 pgs.
“U.S. Appl. No. 15/965,764, Final Office Action dated Jun. 15, 2021”, 19 pgs.
“U.S. Appl. No. 15/965,764, Response filed May 24, 2021 to Non Final Office Action dated Feb. 22, 2021”, 13 pgs.
“U.S. Appl. No. 15/965,775, Final Office Action dated Jul. 6, 2021”, 12 pgs.
“U.S. Appl. No. 16/115,259, Final Office Action dated Jul. 13, 2021”, 18 pgs.
“U.S. Appl. No. 16/115,259, Response filed May 11, 2021 to Non Final Office Action dated Jan. 11, 2021”, 14 pgs.
“U.S. Appl. No. 17/131,598, Preliminary Amendment filed Jun. 8, 2021”, 10 pgs.
“European Application Serial No. 18789872.1, Summons to Attend Oral Proceedings mailed Jun. 23, 2021”, 9 pgs.
“European Application Serial No. 18790189.7, Summons to attend oral proceedings mailed Jul. 8, 2021”, 13 pgs.
“European Application Serial No. 18791925.3, Communication Pursuant to Article 94(3) EPC dated May 11, 2021”, 7 pgs.
“Korean Application Serial No. 10-2019-7034512, Notice of Preliminary Rejection dated May 17, 2021”, w/ English Translation, 15 pgs.
“Korean Application Serial No. 10-2019-7034598, Notice of Preliminary Rejection dated Jun. 3, 2021”, w/ English translation, 10 pgs.
“Korean Application Serial No. 10-2019-7034715, Notice of Preliminary Rejection dated May 21, 2021”, w/ English Translation, 15 pgs.
“Korean Application Serial No. 10-2019-7034751, Notice of Preliminary Rejection dated May 21, 2021”, w/ English Translation, 18 pgs.
“Korean Application Serial No. 10-2019-7034899, Notice of Preliminary Rejection dated May 27, 2021”, w/ English Translation, 17 pgs.
“Korean Application Serial No. 10-2019-7035443, Notice of Preliminary Rejection dated May 26, 2021”, w/ English Translation, 14 pgs.
Wilmott, Clancy, et al., “Playful Mapping in the Digital Age”, Playful Mapping Collective, Institute of Network Cultures, Amsterdam, (2016), 158 pgs.
“U.S. Appl. No. 15/628,408, Corrected Notice of Allowability dated Jul. 21, 2021”, 7 pgs.
“U.S. Appl. No. 15/628,408, Supplemental Notice of Allowability dated Oct. 21, 2021”, 2 pgs.
“U.S. Appl. No. 15/965,744, Final Office Action dated Jul. 28, 2021”, 29 pgs.
“U.S. Appl. No. 15/965,744, Response filed Nov. 29, 2021 to Final Office Action dated Jul. 28, 2021”, 13 pgs.
“U.S. Appl. No. 15/965,749, Response filed Nov. 9, 2021 to Non Final Office Action dated Jul. 9, 2021”, 14 pgs.
“U.S. Appl. No. 15/965,756, Final Office Action dated Aug. 19, 2021”, 17 pgs.
“U.S. Appl. No. 15/965,764, Response filed Nov. 15, 2021 to Final Office Action dated Jun. 15, 2021”, 12 pgs.
“U.S. Appl. No. 15/965,775, Response filed Oct. 6, 2021 to Final Office Action dated Jul. 6, 2021”, 11 pgs.
“U.S. Appl. No. 16/115,259, Non Final Office Action dated Nov. 8, 2021”, 17 pgs.
“U.S. Appl. No. 16/115,259, Response filed Oct. 13, 2021 to Final Office Action dated Jul. 13, 2021”, 10 pgs.
“U.S. Appl. No. 16/232,824, Final Office Action dated Nov. 2, 2021”, 25 pgs.
“U.S. Appl. No. 16/232,824, Response filed Aug. 19, 2021 to Non Final Office Action dated Feb. 19, 2021”, 12 pgs.
“U.S. Appl. No. 17/249,201, Preliminary Amendment filed Oct. 6, 2021”, 9 pgs.
“Chinese Application Serial No. 202010079763.5, Office Action dated Aug. 27, 2021”, w/ English Translation, 15 pgs.
“European Application Serial No. 18789872.1, Summons to Attend Oral Proceedings dated Sep. 13, 2021”, 9 pgs.
“European Application Serial No. 18790319.0, Communication Pursuant to Article 94(3) EPC dated Jul. 21, 2021”, 7 pgs.
“European Application Serial No. 19206595.1, Communication Pursuant to Article 94(3) EPC dated Jul. 22, 2021”, 7 pgs.
“European Application Serial No. 19206610.8, Communication Pursuant to Article 94(3) EPC dated Jul. 21, 2021”, 8 pgs.
“Korean Application Serial No. 10-2019-7034512, Notice of Preliminary Rejection dated Nov. 2, 2021”, With English translation, 8 pgs.
“Korean Application Serial No. 10-2019-7034598, Response filed Sep. 3, 2021 to Notice of Preliminary Rejection dated Jun. 3, 2021”, w/ English Claims, 27 pgs.
“Korean Application Serial No. 10-2019-7034899, Response filed Aug. 11, 2021 to Notice of Preliminary Rejection dated May 27, 2021”, With English claims, 26 pgs.
“U.S. Appl. No. 15/628,408, Final Office Action dated Jun. 10, 2022”, 33 pgs.
“U.S. Appl. No. 15/628,408, Non Final Office Action dated Feb. 4, 2022”, 30 pgs.
“U.S. Appl. No. 15/628,408, Response filed May 4, 2022 to Non Final Office Action dated Feb. 4, 2022”, 9 pgs.
“U.S. Appl. No. 15/965,744, Non Final Office Action dated Mar. 4, 2022”, 31 pgs.
“U.S. Appl. No. 15/965,744, Response filed Jul. 5, 2022 to Non Final Office Action dated Mar. 4, 2022”, 13 pgs.
“U.S. Appl. No. 15/965,749, Corrected Notice of Allowability dated Jun. 16, 2022”, 2 pgs.
“U.S. Appl. No. 15/965,749, Notice of Allowance dated Feb. 2, 2022”, 25 pgs.
“U.S. Appl. No. 15/965,749, Supplemental Notice of Allowability dated Apr. 7, 2022”, 3 pgs.
“U.S. Appl. No. 15/965,749, Supplemental Notice of Allowability dated May 5, 2022”, 3 pgs.
“U.S. Appl. No. 15/965,756, Non Final Office Action dated Mar. 31, 2022”, 17 pgs.
“U.S. Appl. No. 15/965,756, Response filed Dec. 20, 2021 to Final Office Action dated Aug. 19, 2021”, 13 pgs.
“U.S. Appl. No. 15/965,764, Corrected Notice of Allowability dated Mar. 30, 2022”, 1 pg.
“U.S. Appl. No. 15/965,764, Corrected Notice of Allowability dated Apr. 20, 2022”, 2 pgs.
“U.S. Appl. No. 15/965,764, Corrected Notice of Allowability dated Jun. 28, 2022”, 2 pgs.
“U.S. Appl. No. 15/965,764, Notice of Allowance dated Mar. 9, 2022”, 8 pgs.
“U.S. Appl. No. 15/965,775, Corrected Notice of Allowability dated Apr. 7, 2022”, 3 pgs.
“U.S. Appl. No. 15/965,775, Corrected Notice of Allowability dated Jun. 1, 2022”, 3 pgs.
“U.S. Appl. No. 15/965,775, Corrected Notice of Allowability dated Jun. 15, 2022”, 2 pgs.
“U.S. Appl. No. 15/965,775, Notice of Allowance dated Feb. 22, 2022”, 19 pgs.
“U.S. Appl. No. 16/115,259, Final Office Action dated Apr. 4, 2022”, 18 pgs.
“U.S. Appl. No. 16/115,259, Response filed Feb. 8, 2022 to Non Final Office Action dated Nov. 8, 2021”, 9 pgs.
“U.S. Appl. No. 16/232,824, Final Office Action dated Jun. 23, 2022”, 26 pgs.
“U.S. Appl. No. 16/232,824, Response filed May 2, 2022 to Final Office Action dated Nov. 2, 2021”, 13 pgs.
“U.S. Appl. No. 17/248,841, Notice of Allowability dated Jul. 18, 2022”, 2 pgs.
“U.S. Appl. No. 17/248,841, Notice of Allowance dated Apr. 7, 2022”, 9 pgs.
“U.S. Appl. No. 17/249,201, Corrected Notice of Allowability dated Jun. 24, 2022”, 2 pgs.
“U.S. Appl. No. 17/249,201, Non Final Office Action dated May 26, 2022”, 5 pgs.
“U.S. Appl. No. 17/249,201, Notice of Allowance dated Jun. 9, 2022”, 7 pgs.
“U.S. Appl. No. 17/249,201, Response filed May 27, 2022 to Non Final Office Action dated May 26, 2022”, 10 pgs.
“U.S. Appl. No. 17/314,963, Final Office Action dated Jul. 11, 2022”, 25 pgs.
“U.S. Appl. No. 17/314,963, Non Final Office Action dated Feb. 2, 2022”, 24 pgs.
“U.S. Appl. No. 17/314,963, Response filed May 2, 2022 to Non Final Office Action dated Feb. 2, 2022”, 10 pgs.
“Chinese Application Serial No. 202010079763.5, Office Action dated Apr. 12, 2022”, W/English Translation, 14 pgs.
“Chinese Application Serial No. 202010079763.5, Response filed Jun. 1, 2022 to Office Action dated Apr. 12, 2022”, w/ English Claims, 12 pgs.
“Chinese Application Serial No. 202010079763.5, Response Filed Jan. 11, 2022 to Office Action dated Aug. 27, 2021”, w/ English Claims, 13 pgs.
“European Application Serial No. 18790319.0, Response Filed Jan. 28, 2022 to Communication Pursuant to Article 94(3) EPC dated Jul. 21, 2021”, 16 pgs.
“European Application Serial No. 19206595.1, Response filed Jan. 28, 2022 to Communication Pursuant to Article 94(3) EPC dated Jul. 22, 2021”, 18 pgs.
“European Application Serial No. 19206610.8, Response filed Jan. 26, 2022 to Communication Pursuant to Article 94(3) EPC dated Jul. 21, 2021”, 23 pgs.
“European Application Serial No. 22165083.1, Extended European Search Report dated Jul. 12, 2022”, 7 pgs.
“Korean Application Serial No. 10-2019-7034512, Response Filed Jan. 3, 2022 to Notice of Preliminary Rejection dated Nov. 2, 2021”, w/ English Claims, 18 pgs.
“Korean Application Serial No. 10-2019-7034598, Notice of Preliminary Rejection dated Jan. 10, 2022”, w/ English translation, 13 pgs.
“Korean Application Serial No. 10-2019-7034598, Response Filed Mar. 10, 2022 to Notice of Preliminary Rejection dated Jan. 10, 2022”, W/ English Claims, 24 pgs.
“Korean Application Serial No. 10-2019-7034715, Final Office Action dated Mar. 7, 2022”, w/ English translation, 9 pgs.
“Korean Application Serial No. 10-2019-7034715, Response filed Nov. 22, 2021 to Office Action dated May 21, 2021”, w/ English Claims, 22 pgs.
“Korean Application Serial No. 10-2019-7034751, Final Office Action dated Mar. 7, 2022”, w/ English translation, 11 pgs.
“Korean Application Serial No. 10-2019-7034751, Response filed Jun. 7, 2022 to Office Action dated Mar. 7, 2022”, w/ English Claims, 27 pgs.
“Korean Application Serial No. 10-2019-7034899, Final Office Action dated Dec. 3, 2021”, w/ English translation, 10 pgs.
“Korean Application Serial No. 10-2019-7035443, Notice of Preliminary Rejection dated Apr. 11, 2022”, w/ English translation, 8 pgs.
“Korean Application Serial No. 10-2019-7035443, Response filed May 6, 2022 to Office Action dated Apr. 12, 2022”, w/ English Claims, 17 pgs.
“Korean Application Serial No. 10-2019-7034715, Response filed Jun. 7, 2022 to Office Action dated Mar. 7, 2022”, w/ English Claims, 18 pgs.
“Korean Application Serial No. 10-2019-7034751, Response filed Nov. 22, 2021 to Office Action dated May 21, 2021”, w/ English Claims, 28 pgs.
“Korean Application Serial No. 10-2019-7034899, Office Action dated Jan. 24, 2022”, w/ English Translation, 12 pgs.
“Korean Application Serial No. 10-2019-7034899, Response filed Jan. 5, 2022 to Office Action dated Dec. 3, 2021”, w/ English Translation of Claims, 12 pgs.
“Tiled web map—Wikipedia”, <URL:https://en.wikipedia.org/w/index.php?title=Tiled_web_map&oldid=758691778>, (Jan. 6, 2017), 1-3.
Birchall, Andrew Alexander, “The delivery of notifications that user perceives”, IP.com English Translation of CN 107210948 A, filed Dec. 16, 2014, (2014), 28 pgs.
“U.S. Appl. No. 15/628,408, Examiner Interview Summary dated Jul. 27, 2022”, 2 pgs.
“U.S. Appl. No. 15/628,408, Response filed Aug. 10, 2022 to Final Office Action dated Jun. 10, 2022”, 14 pgs.
“U.S. Appl. No. 15/965,744, Final Office Action dated Oct. 21, 2022”, 31 pgs.
“U.S. Appl. No. 15/965,756, Non Final Office Action dated Nov. 14, 2022”, 16 pgs.
“U.S. Appl. No. 15/965,756, Response filed Aug. 1, 2022 to Non Final Office Action dated Mar. 31, 2022”, 12 pgs.
“U.S. Appl. No. 15/965,764, PTO Response to Rule 312 Communication dated Aug. 16, 2022”, 2 pgs.
“U.S. Appl. No. 16/115,259, Non Final Office Action dated Nov. 1, 2022”, 18 pgs.
“U.S. Appl. No. 16/115,259, Response filed Sep. 6, 2022 to Final Office Action dated Apr. 4, 2022”, 10 pgs.
“U.S. Appl. No. 16/232,824, Response filed Nov. 22, 2022 to Final Office Action dated Jun. 23, 2022”, 12 pgs.
“U.S. Appl. No. 17/131,598, Non Final Office Action dated Sep. 27, 2022”, 27 pgs.
“U.S. Appl. No. 17/249,201, Corrected Notice of Allowability dated Sep. 22, 2022”, 2 pgs.
“U.S. Appl. No. 17/314,963, Advisory Action dated Sep. 27, 2022”, 3 pgs.
“U.S. Appl. No. 17/314,963, Response filed Sep. 12, 2022 to Final Office Action dated Jul. 11, 2022”, 11 pgs.
“U.S. Appl. No. 17/314,963, Response filed Oct. 11, 2022 to Advisory Action dated Sep. 27, 2022”, 10 pgs.
“U.S. Appl. No. 17/805,127, Preliminary Amendment Filed Nov. 8, 2022”, 10 pgs.
“U.S. Appl. No. 17/818,896, Preliminary Amendment filed Oct. 24, 2022”, 6 pgs.
“U.S. Appl. No. 17/946,337, Preliminary Amendment Filed Oct. 31, 2022”, 7 pgs.
“U.S. Appl. No. 18/047,213, Preliminary Amendment filed Oct. 31, 2022”, 8 pgs.
“European Application Serial No. 22173072.4, Extended European Search Report dated Aug. 26, 2022”, 6 pgs.
“Korean Application Serial No. 10-2019-7034715, Office Action dated Jun. 27, 2022”, w/ English Translation, 7 pgs.
U.S. Appl. No. 17/249,201, filed Feb. 23, 2021, Location-Based Search Mechanism in a Graphical User Interface.
U.S. Appl. No. 17/248,841, filed Feb. 10, 2021, Selective Location-Based Identity Communication.
U.S. Appl. No. 17/314,963, filed May 7, 2021, Generating and Displaying Customized Avatars in Media Overlays.
“Korean Application Serial No. 10-2019-7034715, Office Action dated Jan. 30, 2023”, With English machine translation, 3 pgs.
“Korean Application Serial No. 10-2022-7013956, Notice of Preliminary Rejection dated Jan. 13, 2023”, w/ English Translation, 22 pgs.
“U.S. Appl. No. 15/628,408, Non Final Office Action dated Dec. 27, 2022”, 36 pgs.
“U.S. Appl. No. 15/965,744, Response filed Feb. 21, 2023 to Final Office Action dated Oct. 21, 2022”, 14 pgs.
“U.S. Appl. No. 15/965,756, Response filed Feb. 14, 2023 to Non Final Office Action dated Nov. 14, 2022”, 13 pgs.
“U.S. Appl. No. 16/115,259, Examiner Interview Summary dated Feb. 16, 2023”, 2 pgs.
“U.S. Appl. No. 16/115,259, Response filed Feb. 1, 2023 to Non Final Office Action dated Nov. 1, 2022”, 10 pgs.
“U.S. Appl. No. 17/131,598, Response filed Jan. 27, 2023 to Non Final Office Action dated Sep. 27, 2022”, 15 pgs.
“U.S. Appl. No. 17/314,963, Corrected Notice of Allowability dated Jan. 26, 2023”, 2 pgs.
“U.S. Appl. No. 17/314,963, Notice of Allowance dated Jan. 13, 2023”, 6 pgs.
“Application Serial No., Notice of Allowance dated Feb. 27, 2023”, 8 pgs.
“U.S. Appl. No. 16/115,259, Corrected Notice of Allowability dated Jun. 12, 2023”, 3 pgs.
“U.S. Appl. No. 17/818,896, Examiner Interview Summary dated Jun. 12, 2023”, 2 pgs.
“U.S. Appl. No. 15/628,408, Response filed Apr. 27, 2023 to Non Final Office Action dated Dec. 27, 2022”, 13 pgs.
“U.S. Appl. No. 15/965,744, Non Final Office Action dated Mar. 30, 2023”, 34 pgs.
“U.S. Appl. No. 16/232,824, Non Final Office Action dated Mar. 30, 2023”, 27 pgs.
“U.S. Appl. No. 17/314,963, Notice of Allowance dated Apr. 14, 2023”, 5 pgs.
“U.S. Appl. No. 17/804,771, Examiner Interview Summary dated Apr. 11, 2023”, 2 pgs.
“U.S. Appl. No. 17/804,771, Non Final Office Action dated Mar. 17, 2023”, 20 pgs.
“U.S. Appl. No. 17/805,127, Non Final Office Action dated Apr. 27, 2023”, 98 pgs.
“U.S. Appl. No. 17/818,896, Non Final Office Action dated Mar. 16, 2023”, 13 pgs.
“Chinese Application Serial No. 201880042674.4, Office Action dated Feb. 20, 2023”, w/ English Translation, 13 pgs.
“Chinese Application Serial No. 201880043068.4, Office Action dated Mar. 1, 2023”, W/English Translation, 22 pgs.
“Chinese Application Serial No. 201880043121.0, Office Action dated Feb. 20, 2023”, W/English Translation, 12 pgs.
“Chinese Application Serial No. 201880043199.2, Office Action dated Mar. 31, 2023”, w/ English Translation, 12 pgs.
“Korean Application Serial No. 10-2019-7034715, Office Action dated Feb. 27, 2023”, w/ English Machine Translation, 46 pgs.
“Korean Application Serial No. 10-2022-7028257, Notice of Preliminary Rejection dated Apr. 5, 2023”, W/English Translation, 6 pgs.
“What is interpolation?”, CUNY, [Online] Retrieved from the internet: <http://www.geography.hunter.cuny.edu/~jochen/gtech361/lectures/lecture11/concepts/What%20is%20interpolation.htm>, (May 8, 2016), 2 pgs.
Dempsey, C, “What is the difference between a heat map and a hot spot map?”, [Online] Retrieved from the internet: <https://www.gislounge.com/difference-heat-map-hot-spot-map/>, (Aug. 10, 2014), 8 pgs.
“U.S. Appl. No. 15/965,744, Non Final Office Action dated Feb. 1, 2021”, 29 pgs.
“U.S. Appl. No. 15/965,749, Response filed Mar. 30, 2021 to Non Final Office Action dated Nov. 30, 2020”, 13 pgs.
“U.S. Appl. No. 15/965,754, Corrected Notice of Allowability dated Mar. 1, 2021”, 2 pgs.
“U.S. Appl. No. 15/965,764, Non Final Office Action dated Feb. 22, 2021”, 18 pgs.
“U.S. Appl. No. 15/965,775, Response filed Mar. 16, 2021 to Non Final Office Action dated Oct. 16, 2020”, 10 pgs.
“U.S. Appl. No. 16/232,824, Non Final Office Action dated Feb. 19, 2021”, 28 pgs.
“U.S. Appl. No. 17/248,841, Preliminary Amendment filed Apr. 22, 2021”, 7 pgs.
“European Application Serial No. 18789872.1, Response filed Feb. 18, 2021 to Communication Pursuant to Article 94(3) EPC dated Aug. 11, 2020”, 15 pgs.
“European Application Serial No. 18790189.7, Response filed Feb. 9, 2021 to Communication Pursuant to Article 94(3) EPC dated Jul. 30, 2020”, 11 pgs.
“U.S. Appl. No. 15/965,756, Final Office Action dated May 12, 2023”, 18 pgs.
“U.S. Appl. No. 17/131,598, Final Office Action dated May 11, 2023”, 26 pgs.
“Chinese Application Serial No. 202010086283.1, Office Action dated Mar. 16, 2023”, W/English Translation, 12 pgs.
Related Publications (1)
Number Date Country
20190220932 A1 Jul 2019 US
Provisional Applications (1)
Number Date Country
62491115 Apr 2017 US
Continuations (1)
Number Date Country
Parent 15628408 Jun 2017 US
Child 16365300 US