User interface to augment an image using geolocation

Information

  • Patent Grant
  • Patent Number
    11,216,869
  • Date Filed
    Tuesday, September 23, 2014
  • Date Issued
    Tuesday, January 4, 2022
Abstract
A system and method for a media filter publication application are described. The media filter publication application receives a content item and a selected geolocation, generates a media filter based on the content item and the selected geolocation, and supplies the media filter to a client device located at the selected geolocation.
Description
TECHNICAL FIELD

The subject matter disclosed herein generally relates to user interface technology. Specifically, the present disclosure addresses systems and methods for a platform for publishing context relevant media filters, for presentation on the user interfaces of mobile devices.


BACKGROUND

Digital photographs taken with mobile wireless devices increasingly outnumber photographs taken with dedicated digital and film-based cameras. Thus, there is a growing need to improve the experience associated with mobile wireless digital photography.





BRIEF DESCRIPTION OF THE DRAWINGS

The present disclosure is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings, in which:



FIG. 1 is a network diagram depicting a network system having a client-server architecture configured for exchanging data over a network, according to one embodiment.



FIG. 2 shows a block diagram illustrating one example embodiment of a messaging application.



FIG. 3 shows a block diagram illustrating one example embodiment of a media filter application.



FIG. 4A shows a block diagram illustrating one example embodiment of a user-based media filter publication module.



FIG. 4B shows an example of a graphical user interface for a user-based media filter publication module.



FIG. 4C shows an example of an operation of the graphical user interface of FIG. 4B.



FIG. 4D illustrates an example of a publication of a user-based media filter.



FIG. 5A shows a block diagram illustrating one example embodiment of a merchant-based media filter publication module.



FIG. 5B illustrates an example of a common geolocation.



FIG. 5C illustrates an example of a graphical user interface for a merchant-based media filter publication module.



FIG. 5D illustrates an example of a bid from a first merchant using the graphical user interface of FIG. 5C.



FIG. 5E illustrates an example of a bid from a second merchant using the graphical user interface of FIG. 5C.



FIG. 5F illustrates an example of an operation of a merchant-based media filter.



FIG. 6A shows a block diagram illustrating one example embodiment of a predefined media filter module.



FIG. 6B shows a diagram illustrating an example of a media filter with live data content.



FIG. 6C shows a diagram illustrating an example of a media filter with promotional and dynamic content.



FIG. 6D shows a diagram illustrating an example of a media filter with collectible content.



FIG. 6E shows a diagram illustrating an example of a media filter with viral content.



FIG. 7 shows an interaction diagram illustrating one example embodiment of an operation of the user-based media filter publication module.



FIG. 8 shows an interaction diagram illustrating another example embodiment of an operation of the merchant-based media filter publication module.



FIG. 9 shows a flow diagram illustrating one example embodiment of an operation of the user-based media filter publication module.



FIG. 10 shows a flow diagram illustrating one example embodiment of an operation of the merchant-based media filter publication module.



FIG. 11 shows a flow diagram illustrating one example embodiment of an operation of the live event module.



FIG. 12 shows a flow diagram illustrating one example embodiment of an operation of the social network module.



FIG. 13 shows a flow diagram illustrating one example embodiment of an operation of the promotion module.



FIG. 14 shows a flow diagram illustrating one example embodiment of an operation of the collection module.



FIG. 15 shows a flow diagram illustrating one example embodiment of an operation of the progressive use module.



FIG. 16 shows a flow diagram illustrating one example embodiment of an operation of the viral use module.



FIG. 17 shows a flow diagram illustrating one example embodiment of an operation of the actionable module.



FIG. 18 shows a diagrammatic representation of a machine, in the example form of a computer system, within which a set of instructions may be executed to cause the machine to perform any one or more of the methodologies discussed herein.



FIG. 19 is a block diagram illustrating a mobile device, according to an example embodiment.





DETAILED DESCRIPTION

Although the present disclosure is described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the disclosure. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.


The addition of labels, drawings, and other artwork to images (e.g., pictures or video) provides a compelling way for users to personalize, supplement, and enhance these images before storage or publication to a broader audience. An example embodiment seeks to provide users with a set of geo-filters (e.g., enhancements and augmentations) that can be applied to an image. The set of enhancements and augmentations, in the example form of image overlays, may be determined based on a location associated with the image. The image overlays are presented to a user for selection and combination with an image based on a determined location of the image, or content of the image. For example, where a user takes a picture on a mobile device in Disneyland, an image overlay indicating the name “Disneyland”, in a particular style, is presented to the user. Further Disneyland-themed image overlays may also be presented to the user. The presentation of the image overlay may be in response to the user performing a gesture (e.g., a swipe operation) on a screen of the mobile device. The user is then able to select the image overlay and have it applied to the image, thereby personalizing and enhancing the image.


Third party entities (e.g., merchants, restaurants, individuals, etc.) may, in one example embodiment, seek to have geo-filters included in the set presented for user selection at a particular geographic location. For example, a restaurant at a particular location in San Francisco may wish to have its restaurant name and logo included in a set of geo-filters presented to a user, for the purpose of augmenting a photograph taken by the user proximate to the restaurant. According to one example embodiment, such third party entities may bid (or otherwise purchase opportunities) to have a particular geo-filter included in a set presented to a user for augmentation of a particular image. Described below are various systems and methodologies that may be used to technically implement the image enhancement technologies and capabilities described above.


More specifically, various examples of a media filter publication application are described. The media filter publication application operates at a server and generates media filters that include content based on geographic locations (also referred to as geolocations). A media filter may include audio and visual content or visual effects that can be applied to augment a media item at a mobile device. The media item may be a picture or a video. The media filter publication application includes a user-based media filter publication platform and a merchant-based publication platform.


In the user-based media filter publication platform, the media filter publication application provides a Graphical User Interface (GUI) for a user to upload content and select a geolocation on a map. For example, the user may upload a logo and define boundaries on the map to identify a particular geolocation associated with the logo. Once the user submits the logo and identifies the particular geolocation, the media filter publication application generates a media filter that includes the logo associated with the particular geolocation. As such, mobile devices that are located within the particular geolocation have access to the media filter.
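
By way of illustration only, the following minimal sketch in Python shows the shape of this flow; the names MediaFilter and publish_user_filter are hypothetical, a simple polygon boundary stands in for the map selection, and an in-memory list stands in for persistent storage, so this is not the patent's actual implementation:

    from dataclasses import dataclass
    from typing import List, Tuple

    Point = Tuple[float, float]  # (latitude, longitude)

    @dataclass
    class MediaFilter:
        title: str             # e.g., "Fun in the sun!"
        content: bytes         # the uploaded logo or image
        boundary: List[Point]  # boundary points drawn on the map

    FILTERS: List[MediaFilter] = []  # stands in for database(s) 126

    def publish_user_filter(title: str, content: bytes,
                            boundary: List[Point]) -> MediaFilter:
        """Generate a media filter from uploaded content and a selected geolocation."""
        media_filter = MediaFilter(title, content, boundary)
        FILTERS.append(media_filter)  # now available to devices inside the boundary
        return media_filter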


In the merchant-based media filter publication platform, the media filter publication application provides a GUI for merchants to upload content, select geolocations on a map, and submit bids for the corresponding geolocations. A bidding process determines the merchant with the highest bid amount. That merchant can then exclude publication of media filters from other merchants at a selected geolocation of the merchant. Therefore, the media filter of the highest bidding merchant may be the only media filter that can be accessed by mobile devices that are located at the selected geolocation.
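
A minimal sketch of this winner-take-all selection, assuming bids are simple records (the names Bid and select_winning_filter are illustrative; the amounts reuse the $300 and $500 example bids from the figures discussed later):

    from dataclasses import dataclass

    @dataclass
    class Bid:
        merchant: str
        amount: float   # bid amount in dollars
        filter_id: int

    def select_winning_filter(bids: list[Bid]) -> Bid:
        """Only the highest bidder's filter is published at the geolocation."""
        return max(bids, key=lambda b: b.amount)

    # Merchant B outbids merchant A, so only merchant B's filter is served.
    bids = [Bid("A", 300.0, 1), Bid("B", 500.0, 2)]
    assert select_winning_filter(bids).merchant == "B"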


In other examples, the media filter includes context relevant data, such as a current temperature, an identification of the geolocation of the mobile device (e.g., Venice Beach), a name of a live event associated with the geolocation of the mobile device, or a name of a business.


In one example embodiment, a media filter application at a server provides a live event media filter to a mobile device. The live event media filter includes live event data associated with a live event, such as a sporting event or an award ceremony, at a geolocation of the mobile device. For example, a user attending a football game can access a sports media filter that includes the current score of the football game. In another example, a user attending the Oscar® award ceremony can access an entertainment media filter that includes a name of an Oscar® winner.


In one example embodiment, the media filter application at the server provides a social network media filter to the mobile device. The social network media filter may be based on social network activities of the user of the mobile device. For example, if the user follows a brand such as McDonald's® on a social network service, and the mobile device of the user is located at a McDonald's® restaurant, the mobile device of the user can access a McDonald's® media filter. Other users located at the same restaurant would not have access to the McDonald's® media filter unless they also follow McDonald's® on the social network service. In another example, the order in which the media filters are presented to users located at a McDonald's® restaurant may be modified so that the McDonald's® media filter is ranked higher for users who follow McDonald's® on the social network service.


In one example embodiment, the media filter application at the server provides a promotion media filter to a mobile device. The promotion media filter may be based on promotions from a merchant. For example, the media filter may be used to implement a Monopoly™ game at McDonald's® by randomly selecting a media filter every time the user of the mobile device walks into a McDonald's® restaurant and purchases an item. The media filter can be used to obtain Monopoly™ puzzle pieces that can be redeemed towards prizes.


In one example embodiment, the media filter application at the server enables the mobile device to collect media filters. For example, the media filter application provides the mobile device with permanent access to collected media filters. The collected media filters may be stored in a collection portfolio for the mobile device. The mobile device may access any of the media filters in the collection portfolio at any time.


In one example embodiment, the media filter application at the server provides a history media filter to the mobile device. The history media filter may be based on geographic locations of historical sites visited by the user of the mobile device. For example, the mobile device is awarded with a unique media filter associated with one of the Seven Wonders of the World when the mobile device is located at one of the corresponding Seven Wonders geographic locations.


In one example embodiment, the media filter application at the server provides a progressive use media filter to the mobile device. The content in the progressive use media filter changes depending on the number of people that have previously used the progressive use media filter.


In one example embodiment, users can “purchase” a geolocation for a predetermined amount of time and select a media filter associated with the geolocation. For example, a college can purchase and select a particular media filter associated with the geolocation of its campus.


In one example embodiment, the media filter application provides a viral media filter to the mobile device. For example, when the user of the mobile device obtains the viral media filter at a geolocation, that user can send the viral media filter to mobile devices located outside the geolocation of the original user. Users of the mobile devices located outside the geolocation of the original user can make use of the viral media filter for the next hour. Those users can also forward the viral media filter to other users.


In one example embodiment, the media filter application 122 provides an actionable media filter to the mobile device. For example, the actionable media filter can be a link to open a browser page in the mobile device to obtain a coupon. The actionable media filter can trigger other functions of the mobile device.


System Architecture



FIG. 1 is a network diagram depicting a network system 100 having a client-server architecture configured for exchanging data over a network, according to one embodiment. For example, the network system 100 may be a messaging system where clients may communicate and exchange data within the network system 100. The data may pertain to various functions (e.g., sending and receiving text and media communication, determining geolocation) and aspects (e.g., publication of media filters, management of media filters) associated with the network system 100 and its users. Although illustrated herein as client-server architecture, other embodiments may include other network architectures, such as peer-to-peer or distributed network environments.


A data exchange platform, in an example, includes a messaging application 120 and a media filter application 122, and may provide server-side functionality via a network 104 (e.g., the Internet) to one or more clients. The one or more clients may include users that utilize the network system 100 and, more specifically, the messaging application 120 and the media filter application 122, to exchange data over the network 104. These operations may include transmitting, receiving (communicating), and processing data to, from, and regarding content and users of the network system 100. The data may include, but is not limited to, content and user data such as user profiles, messaging content, messaging attributes, media attributes, client device information, geolocation information, photo filter content, messaging content persistence conditions, social network information, and live event data information, among others.


In various embodiments, the data exchanges within the network system 100 may be dependent upon user-selected functions available through one or more client or user interfaces (UIs). The UIs may be associated with a client machine, such as client devices 110, 112 using a programmatic client 106, such as a client application. The programmatic client 106 may be in communication with the messaging application 120 and media filter application 122 via an application server 118. The client devices 110, 112 include mobile devices with wireless communication components, and audio and optical components for capturing various forms of media including photos and videos.


Turning specifically to the messaging application 120 and the media filter application 122, an application program interface (API) server 114 is coupled to, and provides a programmatic interface to, one or more application server(s) 118. The application server 118 hosts the messaging application 120 and the media filter application 122. The application server 118 is, in turn, shown to be coupled to one or more database servers 124 that facilitate access to one or more databases 126.


The API server 114 communicates and receives data pertaining to messages and media filters, among other things, via various user input tools. For example, the API server 114 may send and receive data to and from an application (e.g., the programmatic client 106) running on another client machine (e.g., client devices 110, 112 or a third party server).


In one example embodiment, the messaging application 120 provides messaging mechanisms for users of the client devices 110, 112 to send messages that include text and media content such as pictures and video. The client devices 110, 112 can access and view the messages from the messaging application 120 for a limited period of time. For example, the client device 110 can send a message to the client device 112 via the messaging application 120. Once the client device 112 accesses the message from the messaging application 120, the message is deleted after a predefined duration has elapsed from the time the client device 112 started viewing the message. Components of the messaging application 120 are described in more detail below with respect to FIG. 2.
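
As a rough sketch of this deletion rule, assuming the persistence metadata reduces to a per-message viewing duration (the class name EphemeralMessage is illustrative, not the patent's implementation):

    import time

    class EphemeralMessage:
        def __init__(self, body: str, view_duration_s: float):
            self.body = body
            self.view_duration_s = view_duration_s  # e.g., ten seconds
            self.first_viewed_at = None

        def view(self):
            """Return the message until the viewing window elapses, then delete it."""
            now = time.time()
            if self.first_viewed_at is None:
                self.first_viewed_at = now          # viewing clock starts here
            if now - self.first_viewed_at > self.view_duration_s:
                self.body = None                    # deleted after the predefined duration
            return self.body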


In one example embodiment, the media filter application 122 provides a system and a method for operating and publishing media filters for messages processed by the messaging application 120. The media filter application 122 supplies a media filter to the client device 110 based on a geolocation of the client device 110. In another example, the media filter application 122 supplies a media filter to the client device 110 based on other information, such as, social network information of the user of the client device 110.


The media filter may include audio and visual content and visual effects. Examples of audio and visual content include pictures, text, logos, animations, and sound effects. An example of a visual effect includes color filtering. The audio and visual content or the visual effects can be applied to a media content item (e.g., a photo) at the client device 110. For example, the media filter includes text that can be overlaid on top of a photo generated at the client device 110. In another example, the media filter includes an identification of a location overlay (e.g., Venice Beach), a name of a live event, or a name of a merchant overlay (e.g., Beach Coffee House). In another example, the media filter application 122 uses the geolocation of the client device 110 to identify a media filter that includes the name of a merchant at the geolocation of the client device 110. The media filter may include other indicia associated with the merchant. Examples of indicia include logos and other pictures related to the merchant. The media filters may be stored in the database(s) 126 and accessed through the database server 124.
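
Applying a filter to a photo is essentially an alpha composite of the overlay onto the captured image. A minimal sketch using the Pillow imaging library as a stand-in for the device's rendering pipeline (the patent does not specify an implementation):

    from PIL import Image

    def apply_filter(photo_path: str, overlay_path: str, out_path: str) -> None:
        """Composite a media filter overlay on top of a captured photo."""
        base = Image.open(photo_path).convert("RGBA")
        overlay = Image.open(overlay_path).convert("RGBA").resize(base.size)
        # Use a .png output path so the alpha channel is preserved.
        Image.alpha_composite(base, overlay).save(out_path)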


In one example embodiment, the media filter application 122 includes a user-based publication platform that enables users to select a geolocation on a map, and upload content associated with the selected geolocation. The user may also indicate other circumstances under which a particular media filter should be provided. The media filter application 122 generates a media filter that includes the uploaded content and associates the uploaded content with the selected geolocation.


In another example embodiment, the media filter application 122 includes a merchant-based publication platform that enables merchants to select a particular media filter associated with a geolocation via a bidding process. For example, the media filter application 122 associates the media filter of a highest bidding merchant with a corresponding geolocation for a predefined amount of time. Components of the media filter application 122 are described in more detail below with respect to FIG. 3.


Messaging Application



FIG. 2 shows a block diagram illustrating one example embodiment of the messaging application 120. The messaging application 120 may be hosted on dedicated or shared server machines (not shown) that are communicatively coupled to enable communications between server machines. The messaging application 120 and the media filter application 122 themselves are communicatively coupled (e.g., via appropriate interfaces) to each other and to various data sources, so as to allow information to be passed between the messaging application 120 and the media filter application 122, or so as to allow the messaging application 120 and the media filter application 122 to share and access common data. The messaging application 120 and the media filter application 122 may, furthermore, access the one or more databases 126 via the database server(s) 124.


The messaging application 120 is responsible for the generation and delivery of messages between users of the programmatic client 106. The messaging application 120 may utilize any one of a number of message delivery networks and platforms to deliver messages to users. For example, the messaging application 120 may deliver messages using electronic mail (e-mail), instant message (IM), Short Message Service (SMS), text, facsimile, or voice (e.g., Voice over IP (VoIP)) messages via wired (e.g., the Internet), plain old telephone service (POTS), or wireless networks (e.g., mobile, cellular, WiFi, Long Term Evolution (LTE), Bluetooth).


In one example embodiment, the messaging application 120 includes a media receiver module 202, a media filter application interface 204, a message generator module 206, an ephemeral message access module 208, and an ephemeral message storage module 210. The media receiver module 202 receives a message from the programmatic client 106 of the client device 110. The message may include a combination of text, photo, or video. The media receiver module 202 also receives persistence metadata associated with the message. The persistence metadata defines how long a message can be viewed. For example, the user of client device 110 may specify that the message is persistent or can be viewed or accessed only for a user-determined amount of time (e.g., ten seconds). The media filter application interface 204 communicates with the media filter application 122 to access and retrieve a media filter associated with the metadata in the message. The message generator module 206 applies the media filter to the message from the programmatic client 106 to create an ephemeral message, and temporarily stores the ephemeral message with the ephemeral message storage module 210.


The ephemeral message access module 208 notifies a recipient of the message of the availability of the ephemeral message. The ephemeral message access module 208 receives a request to access the ephemeral message from the recipient and causes the ephemeral message to be displayed on a client device of the recipient for the maximum duration specified in the persistence metadata. Once the recipient views the message for the maximum duration, the ephemeral message access module 208 causes the client device of the recipient to stop displaying the ephemeral message, and deletes the ephemeral message from the ephemeral message storage module 210.


Media Filter Application



FIG. 3 shows a block diagram illustrating one example embodiment of the media filter application 122. The media filter application 122 includes a media filter publication module 304 and a media filter engine 306.


The media filter publication module 304 provides a platform for publication of media filters. In an example embodiment, the media filter publication module 304 includes a user-based media filter publication module 314 and a merchant-based media filter publication module 316. The user-based media filter publication module 314 enables users of client devices (either mobile or web clients) to upload content and select a geolocation for a user-based media filter. The merchant-based media filter publication module 316 enables merchants to upload content, select a geolocation, and submit a bid amount for a merchant-based media filter. The user-based media filter publication module 314 is described in more detail below with respect to FIG. 4A. The merchant-based media filter publication module 316 is described in more detail below with respect to FIG. 5A.


The media filter engine 306 generates and supplies a media filter based on the geolocation of a client device. In one example embodiment, the media filter engine 306 includes a predefined media filter module 318, a user-based media filter module 320, and a merchant-based media filter module 322. The media filter may be based on predefined media filters from the predefined media filter module 318, user-based media filters from the user-based media filter module 320, and merchant-based media filters from the merchant-based media filter module 322.


The predefined media filter module 318 supplies the client device with one of the predefined media filters. Examples of predefined media filters are described in more detail below with respect to FIGS. 6A-6E.


The user-based media filter module 320 supplies the client device with a user-based media filter generated by the user-based media filter publication module 314. The merchant-based media filter module 322 supplies the client device with a merchant-based media filter generated by the merchant-based media filter publication module 316.



FIG. 4A shows a block diagram illustrating one example embodiment of the user-based media filter publication module 314. The user-based media filter publication module 314 includes a user-based content upload module 402, a user-based geolocation selection module 404, a user-based duration selection module 406, and a user-based publication engine 408.


The user-based content upload module 402 receives uploaded content from a user. The content may include a media item such as a photo or a video. The user-based content upload module 402 may be implemented on a web server to allow a user to upload the content using a GUI as illustrated in FIG. 4B.


The user-based geolocation selection module 404 receives geolocation identification information from the user to identify a selected geolocation. The geolocation identification information may include an address, an identification of an establishment already associated with the address, Global Positioning System (GPS) coordinates, or a geographic boundary. For example, the address may include a street number, street address, city, state, and country. The user may also identify a location based on an existing establishment. For example, the geolocation information may include “restaurant x” in Venice Beach. The geographic boundary identifies a region or a zone. For example, the geographic boundary may define a region located within a predetermined radius of an address, a point of interest, or a name of an existing establishment.
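
A radius-style boundary reduces to a great-circle distance test. A minimal sketch, assuming coordinates in decimal degrees and a radius in meters (the helper name within_radius is illustrative):

    import math

    def within_radius(lat, lon, center_lat, center_lon, radius_m):
        """Haversine test: is a device within the predetermined radius of a point?"""
        r = 6_371_000.0  # mean Earth radius in meters
        p1, p2 = math.radians(lat), math.radians(center_lat)
        dp = math.radians(center_lat - lat)
        dl = math.radians(center_lon - lon)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a)) <= radius_m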


In one example embodiment, the geolocation identification information may be embedded in a message or communication from a client device to the user-based geolocation selection module 404. For example, the user of the client device may take a picture of a sunset at Venice Beach and send the picture to the user-based geolocation selection module 404 that may then extract the geolocation attribute from the metadata associated with the picture of the sunset. The user-based geolocation selection module 404 may be implemented on a web server to present a user with a GUI in a web page that allows the user to select the geolocation for the content as illustrated in FIG. 4C.
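
Extracting the geolocation attribute from picture metadata typically means converting EXIF-style degrees/minutes/seconds into signed decimal degrees. A sketch, assuming the EXIF fields have already been parsed out of the image (the Venice Beach coordinates below are approximate example values):

    def dms_to_decimal(degrees, minutes, seconds, ref):
        """Convert EXIF-style degrees/minutes/seconds to signed decimal degrees."""
        value = degrees + minutes / 60.0 + seconds / 3600.0
        return -value if ref in ("S", "W") else value

    # e.g., a photo tagged near Venice Beach: 33° 59' 06" N, 118° 28' 48" W
    lat = dms_to_decimal(33, 59, 6, "N")    # ≈ 33.985
    lon = dms_to_decimal(118, 28, 48, "W")  # ≈ -118.480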


The user-based duration selection module 406 receives, from the user, time duration information related to the uploaded content and selected geolocation. The time duration may identify a period of time during which the uploaded content is associated with the selected geolocation. Once the period of time has elapsed, the uploaded content is no longer associated with the selected geolocation. For example, if the time duration indicates twenty-four hours, the media filter engine 306 makes the user-based media filter available to client devices that are located at the selected geolocation. Once the twenty-four hours have elapsed, the user-based media filter is no longer accessible by the client devices at the selected geolocation.


Other embodiments include periodic time duration information or specific time duration information. For example, for the periodic time duration information, the user-based media filter is published and made available at the selected geolocation every Sunday (e.g., a religion related media filter available on days of religious services). For the specific time duration information, the user-based media filter is published and made available at the selected geolocation around a specific holiday or date (e.g., Thanksgiving weekend, New Year's day).
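
A minimal sketch covering the three duration styles described above (fixed window, weekly recurrence, single date); the function filter_is_live and its parameters are illustrative:

    from datetime import datetime, date

    def filter_is_live(now, start=None, end=None, weekday=None, on_date=None):
        """One rule per duration style: fixed window, weekly recurrence, or single date."""
        if start is not None and end is not None:
            return start <= now <= end          # e.g., a twenty-four-hour campaign
        if weekday is not None:
            return now.weekday() == weekday     # e.g., every Sunday (weekday 6)
        if on_date is not None:
            return now.date() == on_date        # e.g., New Year's Day
        return False

    assert filter_is_live(datetime(2022, 1, 2, 10, 0), weekday=6)           # a Sunday
    assert filter_is_live(datetime(2022, 1, 1, 9, 0), on_date=date(2022, 1, 1))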


The user-based publication engine 408 generates a user-based media filter that associates the uploaded content from the user-based content upload module 402 with the selected geolocation from the user-based geolocation selection module 404. The user-based publication engine 408 publishes the user-based media filter to client devices that are located within the selected geolocation for the time duration identified with the user-based duration selection module 406.


In another example embodiment, the user-based publication engine 408 determines that no other user-based media filters exist during the same period of time for the same selected geolocation. The user-based publication engine 408 may publish just one user-based media filter at any time for the same selected geolocation. In another example embodiment, a limit may be placed on the number of user-based media filters available at any time for the same selected geolocation. Thus, the user-based publication engine 408 may publish and make available a limited number of user-based media filters at any time for the same selected geolocation. In another example embodiment, user-based media filters may be published only to contacts or ‘friends’ of the uploading user.



FIG. 4B illustrates an example of a GUI 410 for uploading content and for selecting a geographic region on a map. The GUI 410 includes a map 412, an upload image box 414, a select location button 416, a filter title box 418, and a submit button 420. The upload image box 414 enables a user to upload content (e.g., a picture) to the user-based content upload module 402. The select location button 416 enables the user to identify a geolocation by drawing boundaries on the map 412 or by inputting an address or a zip code. The identified geolocation is submitted to the user-based geolocation selection module 404. The filter title box 418 enables the user to submit a name for the media filter. The user may submit the content and the requested geolocation by clicking on the submit button 420. Once the content and requested geolocation are submitted, the user-based publication engine 408 generates a user-based media filter that includes the uploaded content for the identified geolocation.



FIG. 4C illustrates an example where user-identified boundary points 424, 426, 428, and 430 on the map 412 define a geolocation 422. The user has uploaded a picture of the sun 415 displayed in the upload image box 414. The user has entered the title of the content “Fun in the sun!” in the filter title box 418. The user may submit the picture of the sun 415 and the geolocation 422 by clicking on the submit button 420. Once the picture of the sun 415 and the geolocation 422 are submitted, the user-based publication engine 408 generates a user-based media filter.



FIG. 4D illustrates an example of a publication of a user-based media filter. The media filter application 122 detects that a mobile device 1802 of a user 1816 is located at the geolocation 422. The media filter application 122 retrieves the user-based media filter 440 corresponding to the geolocation 422 and publishes the user-based media filter 440 to the mobile device 1802. The user-based media filter 440 is applied to media content 1806 in a display 1804 of the mobile device 1802.



FIG. 5A shows a block diagram illustrating one example embodiment of the merchant-based media filter publication module 316. The merchant-based media filter publication module 316 includes a merchant-based content upload module 502, a merchant-based geolocation selection module 504, a merchant-based duration selection module 506, a merchant-based bidding module 508, and a merchant-based publication engine 510.


The merchant-based content upload module 502 receives content from a merchant. The content may include a media item such as a picture, a video, a graphic, or text. The merchant-based content upload module 502 may be implemented on a web server to allow a merchant to upload the content using a webpage.


The merchant-based geolocation selection module 504 receives geolocation identification information from the merchant to identify a selected geolocation. The geolocation identification information may include an address of an establishment, an identification of an establishment already associated with the address, GPS coordinates, or a geographic boundary. For example, the address of the establishment may include a street number, street address, city, state, and country. The merchant may also identify a location based on an existing establishment. For example, the geolocation information may include “restaurant x” in Venice Beach. The geographic boundary identifies a region or a zone. For example, the geographic boundary may define a region located within a predetermined radius of an address, a point of interest, or a name of an existing establishment. The merchant may further define the geographic boundary by drawing a virtual fence on a map. The merchant-based geolocation selection module 504 may be implemented on a web server to allow a merchant to draw boundaries on a map in a web page.


The merchant-based duration selection module 506 receives, from the merchant, time duration information related to the uploaded content and selected geolocation. The time duration may identify a period of time in which the uploaded content is associated with the selected geolocation. Once the period of time has elapsed, the uploaded content is no longer associated with the selected geolocation. Other embodiments include periodic time duration information or specific time duration information. For example, for the periodic time duration information, the merchant-based media filter is published or made available at the selected geolocation (e.g., corner of two identified streets) every Saturday night (e.g., a night club related media filter available every Saturday night). For the specific time duration information, the selected media filter is published or made available at the selected geolocation around a specific date (e.g., party event date).


The merchant-based bidding module 508 provides an interface to enable merchants to submit a bid amount for a common geolocation. The common geolocation may include, for example, a same street address. For example, several businesses may have the same street address but different suite numbers in a shopping center. FIG. 5B illustrates an example of a common geolocation. Merchant A geolocation boundaries 512 overlap with merchant B geolocation boundaries 514 to define a common geolocation 516. Thus, merchants A and B may submit respective bids corresponding to the common geolocation 516. In one example embodiment, the merchant-based geolocation selection module 504 determines common geolocations from the geolocations selected by the merchants. The merchant-based bidding module 508 identifies a highest bidder for the common geolocation and awards the highest bidder with the ability to exclude other merchant-based media filters from the common geolocation 516 for a predefined amount of time.


In another example embodiment, the merchant-based bidding module 508 prorates bid amounts based on their corresponding time duration information. For example, merchant A submits a bid amount of $100 for one day for a specific geolocation. Merchant B submits a bid amount of $160 for two days for the same specific geolocation. The merchant-based bidding module 508 may prorate the bid from merchant B for one day (e.g., $80) and compare both bids for the same period of time (e.g., one day) to determine a highest bidder.
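
The proration here is a per-day normalization; a sketch using the amounts from the text (the helper name daily_rate is illustrative):

    def daily_rate(bid_amount: float, days: int) -> float:
        """Prorate a bid to a common one-day basis for comparison."""
        return bid_amount / days

    # Merchant A: $100 for one day; merchant B: $160 for two days.
    assert daily_rate(100, 1) == 100.0
    assert daily_rate(160, 2) == 80.0  # merchant A wins on a per-day basis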


The merchant-based publication engine 510 generates a merchant-based media filter that associates the uploaded content of the highest bidder with the geolocation identified by the highest bidder. The merchant-based publication engine 510 publishes the merchant-based media filter to client devices that are located at the geolocation selected by the highest bidder for the time duration identified with the merchant-based duration selection module 506. Merchant-based media filters from other merchants in the common geolocation 516 are excluded from publication. In another embodiment, a quota may be placed on the number of merchant-based media filters available for the common geolocation 516. For example, the merchant-based publication engine 510 may publish and make available a limited number of merchant-based media filters (e.g., a maximum of two merchant-based media filters) for the common geolocation 516.


In another example embodiment, the merchant-based publication engine 510 forms a priority relationship that associates the uploaded content of the highest bidder with the geolocation selected by the highest bidder. For example, an order in which media filters are displayed at the client device 110 may be manipulated based on the results from the merchant-based bidding module 508. A media filter of a merchant with the highest bid may be prioritized and displayed first at the client device 110. Media filters from other merchants may be displayed at the client device 110 after the media filter of the highest bidder. In another example embodiment, a merchant may be able to bid on all locations at which it maintains a presence. Thus, a restaurant chain may be able to have its media filter(s) published at each of its restaurant chain locations.
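
A minimal sketch of this bid-based display ordering, assuming each candidate filter carries its bid amount (the dictionary layout is illustrative):

    def order_for_display(filters: list[dict]) -> list[dict]:
        """Sort candidate media filters so the highest bid is displayed first."""
        return sorted(filters, key=lambda f: f["bid_amount"], reverse=True)

    candidates = [{"title": "Coffee shop A", "bid_amount": 300},
                  {"title": "Coffee shop B", "bid_amount": 500}]
    assert order_for_display(candidates)[0]["title"] == "Coffee shop B"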



FIG. 5C illustrates an example of a GUI 520 for uploading content and for selecting a geolocation on a map. The GUI 520 includes a map 522, an upload image box 524, a select location button 526, a filter title box 528, a bid amount entry box 530, a campaign length entry box 532, and a submit button 534. The upload image box 524 enables a merchant to upload content (e.g., a picture, a video, or an animation) to the merchant-based content upload module 502. The select location button 526 enables the merchant to identify a geolocation by drawing boundaries on the map 522 or by inputting an address or a zip code. The filter title box 528 enables the merchant to submit a name for the media filter. The bid amount entry box 530 enables the merchant to enter a bid amount for the identified geolocation. The campaign length entry box 532 enables the merchant to specify a length of a campaign in which the uploaded content is associated with the identified geolocation. The merchant may submit the uploaded content and entered information by clicking on the submit button 534.



FIG. 5D illustrates an example where a merchant A has identified boundary points 542, 544, 546, and 548 on the map 522 to define a geolocation 540. Merchant A has uploaded a picture 525 displayed in the upload image box 524. Merchant A has entered a title “Coffee shop A” in the filter title box 528, a bid amount of $300 in the bid amount entry box 530, and a campaign length of 30 days in the campaign length entry box 532. Merchant A submits the picture 525, the requested geolocation 540, and other entered information by clicking on the submit button 534. The merchant-based publication engine 510 generates a media filter for merchant A.



FIG. 5E illustrates an example where another merchant, merchant B, has identified boundary points 552, 554, 556, and 558 on the map 522 to define a geolocation 550. Merchant B has uploaded a picture 527 displayed in the upload image box 524. Merchant B has entered a title “Coffee shop B” in the filter title box 528, a bid amount of $500 in the bid amount entry box 530, and a campaign length of 30 days in the campaign length entry box 532. Merchant B may submit the picture 527, the requested geolocation 550, bid amount, and campaign length by clicking on the submit button 534. The merchant-based publication engine 510 generates a media filter for merchant B.



FIG. 5F shows a diagram illustrating an example of a merchant-based media filter selected based on a bidding process. The geolocation 540 of merchant A and the geolocation 550 of merchant B overlap at a common geolocation 545. The user 1816 is located at the common geolocation 545 and uses his mobile device 1802 to generate the media content 1806 (e.g., user 1816 takes a picture) in the display 1804 of the mobile device 1802. The media filter of the merchant with the highest bid for the common geolocation 545 is published to the mobile device 1802. In the present example, merchant B has outbid merchant A. As such, media filter 560 of merchant B is provided and displayed in the display 1804 on top of the media content 1806. The media filter 560 contains the uploaded content from merchant B. In addition, it should be noted that ‘merchant’ in the context of the current example embodiments may include not only entities involved in the trade or sale of merchandise but any other entity as well, including individuals, universities, non-profit organizations, student organizations, clubs, etc.



FIG. 6A shows a block diagram illustrating one example embodiment of the predefined media filter module 318. The predefined media filter module 318 includes, for example, a live event module 602, a social network module 604, a promotion module 606, a collection module 608, a progressive use module 610, a viral use module 612, an actionable module 614, and a history aware module 616.


The live event module 602 generates a media filter based on live event information. The live event information may be related to a live game score of a sporting event associated with a corresponding geolocation, or a live news event related to an entertainment or social event associated with a corresponding geolocation. For example, a user of the client device 110 attends a game at a stadium. As such, media metadata from the client device 110 may identify the location of the stadium with a date and time. The live event module 602 uses that information to search for a live event associated with the location of the stadium, date, and time. The live event module 602 retrieves a current or nearly current game score associated with the live sporting event at the stadium (via, e.g., the ESPN API). The live event module 602 may also retrieve insignias or team logos associated with the live sporting event. As such, the live event module 602 generates a media filter containing the latest score based on news sources covering the live sporting event.
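
A sketch of the score-to-filter step, with a canned fetch_score stub standing in for whatever sports-data source is used (the patent names an ESPN API only as an example; no real endpoint or response schema is assumed here):

    def fetch_score(venue: str, when: str) -> dict:
        """Stub for a sports-data lookup; returns a canned example score."""
        return {"home": "Home Team", "away": "Away Team", "score": "21-17"}

    def live_event_filter_text(venue: str, when: str) -> str:
        """Build the text content of a live event media filter."""
        game = fetch_score(venue, when)
        return f"{game['home']} vs. {game['away']}  |  {game['score']}"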


In another example, the user of the client device 110 attends a social event at a venue. Similarly, media metadata identifies the location of the venue with a date and time. The live event module 602 uses that information to search for a live event associated with the location of the venue, date, and time from sources such as a social network server or news media service. The live event module 602 retrieves a news feed associated with the live social event at the venue. As such, the live event module 602 generates a media filter containing information or content based on news retrieved from a news feed associated with the live social event at the venue.


The social network module 604 generates a media filter based on social network information of a user of the client device 110. The social network information may include social network data retrieved from a social network service provider. The social network data may include profile data of the user, “likes” of the user, establishments that the user follows, friends of the user, and postings of the user among others. For example, the media filter associated with a restaurant may be available to the user at the location of the restaurant if the user has identified himself as a fan of the restaurant or indicates a “like” of the restaurant with the social network service provider. In another example, the ranking or priority of displaying the media filter in the client device 110 of the user may be based on the profile of the user or the number of “check-ins” of the user at the restaurant.


In another example embodiment, the media filter may be restricted and available only to the user and the social network (e.g., friends or other users in different categories) of the user of the client device 110. As such, the user may forward the media filter to his friends.


The promotion module 606 generates media filters for a promotion (e.g., a game, contest, lottery). For example, a set of unique media filters may be generated. One media filter from the set of unique media filters may be provided to the client device 110 when the client device 110 is at a predefined location associated with the media filters. For example, the user may visit a fast food restaurant. The media metadata from the client device 110 identifies the location of the fast food restaurant. The promotion module 606 retrieves a unique media filter from the set of unique media filters and provides it to the client device 110. The promotion module 606 may remove the unique media filter from the set of unique media filters after it has been provided to the client device 110. In another embodiment, the promotion module 606 removes the unique media filter from the set of unique media filters after it has been provided to other client devices for a predefined number of times.
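
A minimal sketch of such a unique-filter pool, assuming each filter may be served a fixed number of times before being removed (the class name PromotionPool is illustrative):

    import random

    class PromotionPool:
        """A set of unique promotional filters, each served at most max_uses times."""
        def __init__(self, filter_ids, max_uses=1):
            self.remaining = {fid: max_uses for fid in filter_ids}

        def draw(self):
            """Randomly select a filter; remove it once its uses are exhausted."""
            if not self.remaining:
                return None                  # promotion exhausted
            fid = random.choice(list(self.remaining))
            self.remaining[fid] -= 1
            if self.remaining[fid] == 0:
                del self.remaining[fid]      # no longer in the set of unique filters
            return fid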


The media filter includes content related to a game or promotion. In another example, the media filter may include dynamic content adjusted based on the game or promotion. For example, the dynamic content may include a current number of remaining media filters of the game or promotion. The media filters from the promotion module 606 may be “collected” by the client device 110. For example, the client device 110 may store the media filter in a collection at the client device 110. A prize may be redeemed upon collection of each filter of a predefined set of media filters.


The collection module 608 generates collectible media filters. For example, the client device 110 is provided with a media filter associated with the geolocation of the client device 110. The media filter may be collected by the client device 110 and be made permanently available to the client device 110. The client device 110 may store the collected media filter in a collection folder at the client device 110.


The progressive use module 610 generates media filters with dynamic content that changes based on a number of uses of the media filters. For example, a media filter can be set to be used for a limited number of times. Every time the media filter is provided to a client device, the content of the media filter is adjusted. For example, the media filter may include a fundraising progress bar in which a level of the bar rises every time the media filter is used. The dynamic content in the media filter may include a countdown displaying the number of remaining uses of the media filter.
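
A sketch of this dynamic content, rendering a fundraising-style bar and a remaining-use countdown from a usage counter (the rendering format is illustrative):

    def progress_bar(uses: int, max_uses: int, width: int = 20) -> str:
        """Render a rising bar plus a countdown of the remaining uses."""
        filled = round(width * min(uses, max_uses) / max_uses)
        return f"[{'#' * filled}{'-' * (width - filled)}] {max_uses - uses} uses left"

    print(progress_bar(uses=15, max_uses=20))  # [###############-----] 5 uses left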


The viral use module 612 generates media filters that can be forwarded to other users outside a geolocation associated with the media filters. For example, the client device 110 receives a media filter based on a geolocation of the client device 110. The client device 110 can send the media filter to the client device 112, which is outside the geolocation of the client device 110. The forwarded media filter may be available for use by the client device 112 for a predefined time limit (e.g., one hour). Similarly, the client device 112 may forward the media filter to other client devices outside the geolocation of the client device 110 for use within the predefined time limit.
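
A minimal sketch of this forwarding rule, assuming a one-hour window that starts when a copy is forwarded (the class name ViralFilter is illustrative):

    from datetime import datetime, timedelta

    class ViralFilter:
        """A media filter that stays usable for one hour after being forwarded."""
        TTL = timedelta(hours=1)

        def __init__(self, filter_id: str):
            self.filter_id = filter_id
            self.forwarded_at = None  # None while held by the original user

        def forward(self) -> "ViralFilter":
            copy = ViralFilter(self.filter_id)
            copy.forwarded_at = datetime.now()  # recipient's one-hour window starts
            return copy

        def usable(self) -> bool:
            if self.forwarded_at is None:
                return True  # the original user keeps access at the geolocation
            return datetime.now() - self.forwarded_at <= self.TTL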


The actionable module 614 generates media filters with an action associated with a content of the media filter. For example, the media filter can start a browser of the client device 110 and open a predetermined website in the browser. In another embodiment, the media filter is capable of opening other functionalities (e.g., payment application) or executing other programs at the client device 110. For example, a user can tap on the media filter to download or display a coupon associated with the media filter at the client device 110.


The history aware module 616 generates media filters based on the geolocation of the client device 110 and historical events associated with the geolocation. For example, a media filter may include pictures of a pyramid associated with the geolocation of the client device 110. The media filters may be collected based on historical events or landmarks, for example, one for each of the Seven Wonders of the World. Similarly, a media filter associated with a national park may be collected when the user visits that park, and the client device 110 can eventually collect the media filters associated with all national parks.



FIG. 6B shows a diagram illustrating an example of a media filter 1820 with live data content. The media filter 1820 contains live data associated with a geolocation of the mobile device 1802. For example, the live data contains a live weather status 1822 and latest score update 1824 of a sporting event associated with the geolocation of the mobile device 1802. The mobile device 1802 displays the media filter 1820 on top of (i.e., as a transparent overlay) the media content 1806. In one example embodiment, the media filter 1820 may be implemented with the live event module 602 of FIG. 6A.



FIG. 6C shows a diagram illustrating an example of a media filter 1830 with promotional content. For example, the media filter 1830 includes a digital coupon 1832 that can be redeemed at a coffee shop. The media filter 1830 may include dynamic content 1834. For example, the dynamic content 1834 may include a remaining number of times the coupon can be used. Furthermore, the media filter 1830 may include an actionable area 1836 that is associated with an executable function. For example, when the user taps the actionable area 1836, the media filter 1830 is forwarded to a mobile device of a friend of the user. The mobile device 1802 displays the media filter 1830 on top of the media content 1806. In one example embodiment, the media filter 1830 may be implemented with the social network module 604, the promotion module 606, the progressive use module 610, and the actionable module 614 of FIG. 6A.



FIG. 6D shows a diagram illustrating an example of a collectible media filter 1840. The collectible media filter 1840 may be randomly supplied to the mobile device 1802 in response to detecting the mobile device 1802 at a geolocation associated with the collectible media filter 1840. The collectible media filter 1840 can be stored at the mobile device 1802. Once the mobile device 1802 detects that related collectible media filters have been stored, the mobile device 1802 may cause the related collectible media filters or a corresponding unique media filter to be displayed in the display 1804. The mobile device 1802 displays the media filter 1840 on top of the media content 1806. In one example embodiment, the media filter 1840 may be implemented with the collection module 608 of FIG. 6A.



FIG. 6E shows a diagram illustrating an example of a viral media filter 1850. The viral media filter 1850 may include dynamic content 1854 and an actionable area 1852. For example, the dynamic content 1854 shows a progress bar and goal of a fundraising event. The progress bar is adjusted based on a latest amount raised. The actionable area 1852 may trigger the mobile device 1802 to cause a financial transaction (e.g., donation) and a communication to another mobile device (e.g., message to another mobile device using the messaging application 120). The mobile device 1802 displays the media filter 1850 on top of the media content 1806. In one example embodiment, the media filter 1850 may be implemented with the progressive use module 610, the viral use module 612, and the actionable module 614 of FIG. 6A.



FIG. 7 shows an interaction diagram illustrating one example embodiment of an operation of the user-based media filter publication module 314. At operation 710, the client device 110 of a first user uploads content and sends a requested geolocation and a requested time duration to the media filter application 122. At operation 712, the media filter application 122 generates a media filter based on the uploaded content and associates the media filter with the requested geolocation for the requested time duration. In one example embodiment, operations 710 and 712 may be implemented with the user-based media filter publication module 314 of FIG. 3.


At operation 714, the client device 112 of a second user sends geolocation information to the messaging application 120. At operation 716, the messaging application 120 identifies, from the media filter application 122, a media filter based on the geolocation of the client device 112. At operation 718, the media filter application 122 supplies the client device 112 with the identified media filter. In one example embodiment, operations 716 and 718 may be implemented with the media filter engine 306 of FIG. 3.



FIG. 8 shows an interaction diagram illustrating another example embodiment of an operation of the merchant-based media filter publication module 316. At operation 808, a client device 802 of merchant A uploads content with geolocation information (e.g., geolocation X) and a bid amount (e.g., bid amount A) to the media filter application 122 to form media filter A. At operation 810, a client device 804 of merchant B uploads content with the same geolocation information (e.g., geolocation X) and a bid amount (e.g., bid amount B) to the media filter application 122 to form media filter B. At operation 812, the media filter application 122 determines a highest bidder, and associates the media filter of the highest bidder with geolocation X. For example, if bid amount A is greater than bid amount B, media filter A is provided to client devices that are located at geolocation X. In one example embodiment, operations 808, 810, 812 may be implemented with the merchant-based media filter publication module 316 of FIG. 3.


At operation 814, a client device 806 at geolocation X sends its geolocation information to the messaging application 120. At operation 816, the messaging application 120 identifies, from the media filter application 122, the media filter associated with the geolocation X. At operation 818, the media filter application 122 supplies the client device 806 with media filter A. In one example embodiment, operations 816 and 818 may be implemented with the media filter engine 306 of FIG. 3. In another example embodiment, the media filter application 122 supplies both media filters A and B to the client device 806 with instructions for the client device 806 to display media filter A first before media filter B since merchant A was the highest bidder.



FIG. 9 shows a flow diagram illustrating one example embodiment of a method 900 of the user-based media filter publication module 314. At operation 902, the user-based media filter publication module 314 receives uploaded content and requested geolocation information from a first client device. In one example embodiment, operation 902 may be implemented with the user-based content upload module 402, the user-based geolocation selection module 404, and the user-based duration selection module 406 of FIG. 4A.


At operation 904, the user-based media filter publication module 314 forms a user-based media filter that includes the uploaded content, and is associated with the requested geolocation. In one example embodiment, operation 904 may be implemented with the user-based publication engine 408 of FIG. 4A.


At operation 906, the user-based media filter publication module 314 receives geolocation information from a second client device. At operation 908, the user-based media filter publication module 314 determines whether the geolocation of the second client device is within the requested geolocation from the first client device. At operation 910, the user-based media filter publication module 314 publishes the user-based media filter from the first client device to the second client device in response to the geolocation of the second client device being within the requested geolocation from the first client device. In one example embodiment, operation 910 may be implemented with the user-based media filter module 320 of FIG. 3.
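Operation 908 reduces to a containment test. Where the requested geolocation is a region drawn on a map (as with the boundary-selection interface described for the media filter publication application), a standard ray-casting point-in-polygon test is one way to implement it; the sketch below is minimal Python, with a hypothetical rectangular fence used only for illustration.

def point_in_polygon(lat, lon, boundary):
    # Ray casting: count how many polygon edges a horizontal ray from
    # the point crosses; an odd count means the point is inside.
    inside = False
    n = len(boundary)
    for i in range(n):
        y1, x1 = boundary[i]
        y2, x2 = boundary[(i + 1) % n]
        if (y1 > lat) != (y2 > lat):  # edge straddles the point's latitude
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# Operation 908: test the second device's fix against the boundary
# requested by the first device.
fence = [(34.0, -118.5), (34.1, -118.5), (34.1, -118.3), (34.0, -118.3)]
assert point_in_polygon(34.05, -118.4, fence)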


At operation 912, the media filter engine 306 supplies predefined media filters corresponding to the geolocation of the second client device to the second client device. In one example embodiment, operation 912 may be implemented with the predefined media filter module 318 of FIG. 3.



FIG. 10 shows a flow diagram illustrating one example embodiment of a method 1000 of operation for the merchant-based media filter publication module 316. At operations 1002 and 1004, the merchant-based media filter publication module 316 receives uploaded content, geolocation information, and corresponding bid amounts from merchants. For example, at operation 1002, the merchant-based content upload module 502 receives content A from merchant A. The merchant-based geolocation selection module 504 receives geolocation X from merchant A. The merchant-based bidding module 508 receives bid amount A from merchant A.


At operation 1004, the merchant-based content upload module 502 receives content B from merchant B. The merchant-based geolocation selection module 504 receives geolocation X from merchant B. The merchant-based bidding module 508 receives bid amount B from merchant B.


At operation 1006, the highest bid amount is determined. In one example embodiment, operation 1006 may be implemented with the merchant-based bidding module 508 of FIG. 5A. If bid amount A is greater than bid amount B, the merchant-based publication engine 510 generates a merchant-based media filter A based on content A and geolocation X at operation 1008. At operation 1010, the merchant-based media filter module 322 supplies merchant-based media filter A to client devices that are located at geolocation X.


If bid amount B is greater than bid amount A, the merchant-based publication engine 510 generates a merchant-based media filter B based on content B and geolocation X at operation 1014. At operation 1016, the merchant-based media filter module 322 supplies merchant-based media filter B to client devices that are located at geolocation X.



FIG. 11 shows a flow diagram illustrating one example embodiment of a method 1100 of operation for the live event module 602. At operation 1104, the live event module 602 receives geolocation information from a client device. At operation 1106, the live event module 602 identifies a live event associated with the geolocation. At operation 1108, the live event module 602 accesses live event data related to the live event. At operation 1110, the live event module 602 generates a live event media filter based on the live event data. At operation 1112, the live event module 602 supplies the live event media filter to the client device.
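Operations 1106-1110 amount to looking up a live data feed by geolocation and folding it into an overlay template. The Python sketch below assumes a hypothetical in-memory feed (events_by_geo) and renders the filter as a text layer for brevity; a production implementation would composite graphics instead.

from dataclasses import dataclass

@dataclass
class LiveEvent:
    name: str
    home: str
    away: str
    score: tuple  # (home points, away points), updated live

def render_live_event_filter(event):
    # Operation 1110: generate the media filter from live event data.
    return f"{event.name}: {event.home} {event.score[0]} - {event.score[1]} {event.away}"

# Hypothetical feed keyed by the geolocation identified at operation 1106.
events_by_geo = {"stadium-7": LiveEvent("Championship", "Lions", "Bears", (21, 14))}

def live_event_filter_for(geolocation):
    event = events_by_geo.get(geolocation)  # operations 1106/1108
    return render_live_event_filter(event) if event else None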



FIG. 12 shows a flow diagram illustrating one example embodiment of a method 1200 of operation for the social network module 604. At operation 1202, the social network module 604 receives social network information and geolocation information from a client device. At operation 1204, the social network module 604 accesses social network data from social network service providers based on the social network information from the client device. At operation 1206, the social network module 604 identifies a geolocation from the geolocation information of the client device. At operation 1208, the social network module 604 generates a social network-based media filter based on the social network data and the geolocation of the client device. At operation 1210, the social network module 604 supplies the social network-based media filter to the client device.
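As a concrete reading of operations 1204-1208, the module can merge the fetched social data with the device's geolocation, for example by surfacing friends last seen at the same place. The sketch below is a toy Python illustration; the profile layout and field names are invented for the example.

def social_network_filter(profile, geolocation):
    # Operation 1208: combine social network data with the geolocation.
    friends_here = [f for f in profile["friends"]
                    if f["last_geolocation"] == geolocation]
    names = ", ".join(f["name"] for f in friends_here) or "no friends yet"
    return f"At {geolocation} with {names}"

profile = {"friends": [
    {"name": "Ana", "last_geolocation": "beach-3"},
    {"name": "Bo",  "last_geolocation": "mall-1"},
]}
print(social_network_filter(profile, "beach-3"))  # At beach-3 with Ana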



FIG. 13 shows a flow diagram illustrating one example embodiment of a method 1300 of operation for the promotion module 606. At operation 1302, the promotion module 606 generates a set of media filters for a merchant for a predefined geolocation. At operation 1304, the promotion module 606 receives geolocation information from a client device. At operation 1306, the promotion module 606 identifies the geolocation of the client device from the geolocation information. At operation 1308, the promotion module 606 accesses the set of media filters for the merchant associated with the geolocation. At operation 1310, the promotion module 606 randomly selects at least one media filter from the set of media filters. At operation 1312, the promotion module 606 supplies the randomly selected media filter(s) to the client device.
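The selection step at operation 1310 is a uniform random draw from the merchant's set. A minimal Python sketch, with hypothetical names:

import random

def select_promotional_filter(filters_by_geo, device_geolocation):
    # Operations 1306-1310: look up the merchant's set for the device's
    # geolocation and pick one member at random.
    candidates = filters_by_geo.get(device_geolocation, [])
    return random.choice(candidates) if candidates else None

filters_by_geo = {"store-12": ["10%-off overlay", "mascot overlay", "logo overlay"]}
print(select_promotional_filter(filters_by_geo, "store-12"))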



FIG. 14 shows a flow diagram illustrating one example embodiment of a method 1400 of operation for the collection module 608. At operation 1402, the collection module 608 receives geolocation information from a client device. At operation 1404, the collection module 608 determines the geolocation of the client device from the geolocation information. At operation 1406, the collection module 608 accesses media filters associated with the geolocation of the client device. At operation 1408, the collection module 608 stores the media filters in a media filter collection associated with the client device. At operation 1410, the collection module 608 presents the media filters in the media filter collection to the client device for use.
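Operations 1408 and 1410 describe a per-device history of encountered filters. One minimal way to sketch it in Python (class and method names assumed for illustration):

from collections import defaultdict

class FilterCollection:
    def __init__(self):
        self._by_device = defaultdict(list)

    def record(self, device_id, filters):
        # Operation 1408: remember the filters encountered by this device,
        # without duplicates.
        for f in filters:
            if f not in self._by_device[device_id]:
                self._by_device[device_id].append(f)

    def present(self, device_id):
        # Operation 1410: surface the accumulated collection for reuse.
        return list(self._by_device[device_id])

coll = FilterCollection()
coll.record("device-9", ["autumn overlay", "park-2 overlay"])
print(coll.present("device-9"))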



FIG. 15 shows a flow diagram illustrating one example embodiment of a method 1500 of operation for the progressive use module 610. At operation 1502, the progressive use module 610 generates a progressive use media filter for a geolocation. At operation 1504, the progressive use module 610 receives geolocation information from a first client device at the geolocation. At operation 1506, the progressive use module 610 supplies the progressive use media filter to the first client device, and generates a first modified media filter based on the progressive use media filter. At operation 1508, the progressive use module 610 receives geolocation information from a second client device at the geolocation. At operation 1510, the progressive use module 610 supplies the first modified media filter to the second client device, and generates a second modified media filter based on the first modified media filter.
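The progression in operations 1506 and 1510 can be modeled as a filter that advances through ordered content variants on each use, optionally disabling itself after a use limit (as in the claimed progressive use limit). A minimal Python sketch with assumed names:

class ProgressiveFilter:
    def __init__(self, frames, use_limit=None):
        self.frames = frames        # ordered content variants
        self.uses = 0
        self.use_limit = use_limit  # optional cap, after which it disables

    def supply(self):
        # Each supply returns the current variant and leaves the next
        # modified filter in place for the following device.
        if self.use_limit is not None and self.uses >= self.use_limit:
            return None  # disabled after the progressive use limit
        frame = self.frames[min(self.uses, len(self.frames) - 1)]
        self.uses += 1
        return frame

pf = ProgressiveFilter(["sprout", "sapling", "tree"], use_limit=3)
print(pf.supply(), pf.supply(), pf.supply(), pf.supply())  # sprout sapling tree None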



FIG. 16 shows a flow diagram illustrating one example embodiment of a method 1600 of operation for the viral use module 612. At operation 1602, the viral use module 612 generates a media filter for a geolocation. At operation 1604, the viral use module 612 receives geolocation information from a first client device at the geolocation. At operation 1606, the viral use module 612 supplies the media filter to the first client device at the geolocation. At operation 1608, the viral use module 612 receives a request from the first client device to forward the media filter to a second client device outside the geolocation. At operation 1610, the viral use module 612 provides the media filter for a limited time to the second client device outside the geolocation.
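The limited-time provision of operation 1610 can be sketched as a grant that carries its own expiry, handed to the device outside the geofence. The names below (ViralGrant, forward) are hypothetical:

import time

class ViralGrant:
    # A forwarded filter usable outside the geofence only for a window.
    def __init__(self, media_filter, ttl_s):
        self.media_filter = media_filter
        self.expires_at = time.time() + ttl_s

    def fetch(self):
        # Operation 1610: the filter is provided only while the window is open.
        return self.media_filter if time.time() < self.expires_at else None

def forward(media_filter, ttl_s=3600):
    # Operation 1608: triggered by the first device's forwarding request.
    return ViralGrant(media_filter, ttl_s)

grant = forward("concert overlay", ttl_s=3600)
print(grant.fetch())  # "concert overlay" until the hour elapses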



FIG. 17 shows a flow diagram illustrating one example embodiment of a method 1700 of operation for the actionable module 614. At operation 1702, the actionable module 614 generates an actionable media filter having an actionable portion associated with a function. At operation 1704, the actionable module 614 provides the actionable media filter to a first client device. At operation 1706, the actionable module 614 receives a media item (e.g., a photo) with the media filter from the first client device. At operation 1708, the actionable module 614 supplies the media item with the media filter to a second client device. At operation 1710, the actionable module 614 identifies a selection of the actionable portion from the second client device. At operation 1712, the actionable module 614 executes a function associated with the actionable portion at the second client device.
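Operations 1710 and 1712 are, in effect, a hit test over the actionable portion followed by dispatch of its bound function. The Python sketch below is illustrative; the region layout and the example action are invented:

from dataclasses import dataclass
from typing import Callable

@dataclass
class ActionablePortion:
    x: int
    y: int
    w: int
    h: int
    action: Callable[[], None]  # function bound at operation 1702

    def contains(self, px, py):
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def on_tap(portions, px, py):
    # Operations 1710/1712: identify the selected actionable portion and
    # execute its associated function.
    for p in portions:
        if p.contains(px, py):
            p.action()
            return True
    return False

portions = [ActionablePortion(0, 0, 100, 40, lambda: print("opening coupon"))]
on_tap(portions, 20, 10)  # prints "opening coupon"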


Modules, Components and Logic


Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied (1) on a non-transitory machine-readable medium or (2) in a transmission signal) or hardware-implemented modules. A hardware-implemented module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner. In example embodiments, one or more computer systems (e.g., a standalone, client, or server computer system) or one or more processors may be configured by software (e.g., an application or application portion) as a hardware-implemented module that operates to perform certain operations as described herein.


In various embodiments, a hardware-implemented module may be implemented mechanically or electronically. For example, a hardware-implemented module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations. A hardware-implemented module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware-implemented module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.


Accordingly, the term “hardware-implemented module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily or transitorily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. Considering embodiments in which hardware-implemented modules are temporarily configured (e.g., programmed), each of the hardware-implemented modules need not be configured or instantiated at any one instance in time. For example, where the hardware-implemented modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respectively different hardware-implemented modules at different times. Software may, accordingly, configure a processor, for example, to constitute a particular hardware-implemented module at one instance of time and to constitute a different hardware-implemented module at a different instance of time.


Hardware-implemented modules can provide information to, and receive information from, other hardware-implemented modules. Accordingly, the described hardware-implemented modules may be regarded as being communicatively coupled. Where multiples of such hardware-implemented modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses that connect the hardware-implemented modules). In embodiments in which multiple hardware-implemented modules are configured or instantiated at different times, communications between such hardware-implemented modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware-implemented modules have access. For example, one hardware-implemented module may perform an operation, and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware-implemented module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware-implemented modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).


The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example embodiments, comprise processor-implemented modules.


Similarly, the methods described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment, or a server farm), while in other embodiments the processors may be distributed across a number of locations.


The one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via the network 104 (e.g., the Internet) and via one or more appropriate interfaces (e.g., APIs).


Electronic Apparatus and System


Example embodiments may be implemented in digital electronic circuitry, or in computer hardware, firmware, or software, or in combinations of them. Example embodiments may be implemented using a computer program product (e.g., a computer program tangibly embodied in an information carrier, e.g., in a machine-readable medium for execution by, or to control the operation of, data processing apparatus, e.g., a programmable processor, a computer, or multiple computers).


A computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a standalone program or as a module, subroutine, or other unit suitable for use in a computing environment. A computer program can be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by a communication network.


In example embodiments, operations may be performed by one or more programmable processors executing a computer program to perform functions by operating on input data and generating output. Method operations can also be performed by, and apparatus of example embodiments may be implemented as, special purpose logic circuitry (e.g., an FPGA or an ASIC).


The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. In embodiments deploying a programmable computing system, it will be appreciated that both hardware and software architectures merit consideration. Specifically, it will be appreciated that the choice of whether to implement certain functionality in permanently configured hardware (e.g., an ASIC), in temporarily configured hardware (e.g., a combination of software and a programmable processor), or in a combination of permanently and temporarily configured hardware may be a design choice. Below are set out hardware (e.g., machine) and software architectures that may be deployed in various example embodiments.


Example Computer System



FIG. 18 shows a diagrammatic representation of a machine in the example form of a computer system 1800 within which a set of instructions 1824 may be executed causing the machine to perform any one or more of the methodologies discussed herein. In alternative embodiments, the machine operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine (e.g., client device 110 or 112) in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions 1824 (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions 1824 to perform any one or more of the methodologies discussed herein.


The example computer system 1800 includes a processor 1802 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 1804, and a static memory 1806, which communicate with each other via a bus 1808. The computer system 1800 may further include a video display unit 1810 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). The computer system 1800 also includes an alphanumeric input device 1812 (e.g., a keyboard), a UI navigation device 1814 (e.g., a mouse), a drive unit 1816, a signal generation device 1818 (e.g., a speaker), and a network interface device 1820.


The drive unit 1816 includes a computer-readable medium 1822 on which is stored one or more sets of data structures and instructions 1824 (e.g., software) embodying or utilized by any one or more of the methodologies or functions described herein. The instructions 1824 may also reside, completely or at least partially, within the main memory 1804 or within the processor 1802 during execution thereof by the computer system 1800, with the main memory 1804 and the processor 1802 also constituting machine-readable media.


The instructions 1824 may further be transmitted or received over a network 1826 via the network interface device 1820 utilizing any one of a number of well-known transfer protocols (e.g., HTTP).


While the computer-readable medium 1822 is shown in an example embodiment to be a single medium, the term “computer-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions 1824. The term “computer-readable medium” shall also be taken to include any medium that is capable of storing, encoding, or carrying a set of instructions 1824 for execution by the machine that cause the machine to perform any one or more of the methodologies of the present disclosure, or that is capable of storing, encoding, or carrying data structures utilized by or associated with such a set of instructions 1824. The term “computer-readable medium” shall, accordingly, be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.


Furthermore, the machine-readable medium is non-transitory in that it does not embody a propagating signal. However, labeling the tangible machine-readable medium “non-transitory” should not be construed to mean that the medium is incapable of movement—the medium should be considered as being transportable from one physical location to another. Additionally, since the machine-readable medium is tangible, the medium may be considered to be a machine-readable device.


Example Mobile Device



FIG. 19 is a block diagram illustrating a mobile device 1900, according to an example embodiment. The mobile device 1900 may include a processor 1902. The processor 1902 may be any of a variety of different types of commercially available processors 1902 suitable for mobile devices 1900 (for example, an XScale architecture microprocessor, a microprocessor without interlocked pipeline stages (MIPS) architecture processor, or another type of processor 1902). A memory 1904, such as a random access memory (RAM), a flash memory, or another type of memory, is typically accessible to the processor 1902. The memory 1904 may be adapted to store an operating system (OS) 1906, as well as applications 1908, such as a mobile location enabled application that may provide location-based services (LBSs) to a user. The processor 1902 may be coupled, either directly or via appropriate intermediary hardware, to a display 1910 and to one or more input/output (I/O) devices 1912, such as a keypad, a touch panel sensor, a microphone, and the like. Similarly, in some embodiments, the processor 1902 may be coupled to a transceiver 1914 that interfaces with an antenna 1916. The transceiver 1914 may be configured to both transmit and receive cellular network signals, wireless data signals, or other types of signals via the antenna 1916, depending on the nature of the mobile device 1900. Further, in some configurations, a GPS receiver 1918 may also make use of the antenna 1916 to receive GPS signals.


Although an embodiment has been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the present disclosure. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense. The accompanying drawings that form a part hereof show by way of illustration, and not of limitation, specific embodiments in which the subject matter may be practiced. The embodiments illustrated are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed herein. Other embodiments may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. This Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.


As used herein, the term “or” may be construed in either an inclusive or exclusive sense. Moreover, plural instances may be provided for resources, operations, or structures described herein as a single instance. Additionally, boundaries between various resources, operations, modules, engines, and data stores are somewhat arbitrary, and particular operations are illustrated in a context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within a scope of various embodiments of the present invention. In general, structures and functionality presented as separate resources in the example configurations may be implemented as a combined structure or resource. Similarly, structures and functionality presented as a single resource may be implemented as separate resources. These and other variations, modifications, additions, and improvements fall within a scope of embodiments of the present invention as represented by the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.


Such embodiments of the inventive subject matter may be referred to herein, individually or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed. Thus, although specific embodiments have been illustrated and described herein, it should be appreciated that any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.


The Abstract of the Disclosure is provided to comply with 37 C.F.R. § 1.72(b), requiring an abstract that will allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment.

Claims
  • 1. A server comprising: one or more hardware processors comprising a media filter publication module, a messaging module, and a media filter engine, the media filter publication module configured to receive a content item and a selected geolocation from a first device, and to generate a media filter from the content item, the media filter associated with the selected geolocation; the media filter engine configured to process a geolocation of a client device, to identify a plurality of filters comprising at least the media filter based at least in part on the geolocation of the client device, and to provide the plurality of filters comprising the media filter to the client device for display of the media filter on a user interface of the client device; and the messaging module configured to receive, from the client device, a message comprising media content overlaid by the media filter, wherein the first device is different from the client device.
  • 2. The server of claim 1, wherein the media filter publication module comprises: a user-based content upload module configured to receive the content item; a user-based geolocation selection module configured to receive the selected geolocation; and a user-based media filter publication engine configured to generate a user-based media filter based on the content item and the selected geolocation, the media filter engine configured to supply the client device with the user-based media filter in response to the geolocation of the client device being within the selected geolocation.
  • 3. The server of claim 2, wherein the media filter publication module further comprises: a user-based duration selection module configured to receive an identification of a period of time associated with the content item and the selected geolocation, wherein the media filter engine is configured to supply the client device with the user-based media filter within the selected geolocation during the period of time.
  • 4. The server of claim 1, wherein the media filter publication module comprises: a merchant-based media content upload module configured to receive a first content item from a first merchant and a second content item from a second merchant; a merchant-based geolocation selection module configured to receive first geolocation information from the first merchant and second geolocation information from the second merchant, and to identify a common geolocation based on the first geolocation information and the second geolocation information; a merchant-based bidding module configured to receive a first bid amount from the first merchant and a second bid amount from the second merchant, and to identify a highest bid amount; and a merchant-based publication engine configured to generate a merchant-based media filter based on the content item of the merchant with the highest bid amount and the common geolocation, the media filter engine configured to supply the merchant-based media filter to the client device within the common geolocation; wherein the media filter publication module further comprises: a merchant-based duration selection module configured to disable the merchant-based media filter after a predetermined duration has elapsed.
  • 5. The server of claim 4, wherein the common geolocation includes a common region formed between a first geolocation from the first merchant and a second geolocation from the second merchant.
  • 6. The server of claim 1, wherein the media filter engine further comprises: a live event module configured to: identify a live event associated with the geolocation of the client device; access live event data related to the live event; and generate a live event media filter based on the live event data and the geolocation of the client device.
  • 7. The server of claim 1, wherein the media filter engine further comprises: a social network module configured to: access social network data based on social network information from the client device; and generate a social network media filter based on the social network data and the social network information from the client device.
  • 8. The server of claim 1, wherein the media filter engine further comprises: a promotion module configured to: generate a set of media filters including the media filter for a merchant for a predefined geolocation of the merchant; randomly select one media filter from the set of media filters; and provide the randomly selected media filter to the client device in response to the geolocation of the client device corresponding to the predefined geolocation of the merchant.
  • 9. The server of claim 1, wherein the media filter engine further comprises: a collection module configured to: store previously provided media filters in a media filter collection associated with the client device; and present media filters from the media filter collection associated with the client device in response to receiving a geolocation associated with the media filters.
  • 10. The server of claim 1, wherein the media filter engine further comprises: a progressive module configured to: generate a progressive use media filter for a predefined geolocation; and adjust a content of the progressive use media filter in response to a number of prior uses of the progressive use media filter.
  • 11. The server of claim 10, wherein the progressive module is further configured to: disable the progressive use media filter after the number of prior uses of the progressive use media filter reaches a predefined progressive use limit.
  • 12. The server of claim 1, wherein the media filter engine further comprises: a viral use module configured to: generate a viral use media filter for a predefined geolocation; provide the viral use media filter to a first client device located at the predefined geolocation; receive a request from the first client device located at the predefined geolocation to provide the viral use media filter to a second client device located outside the predefined geolocation; and provide the viral use media filter to the second client device located outside the predefined geolocation.
  • 13. The server of claim 1, wherein the media filter engine further comprises: an actionable module configured to: execute a programmable function associated with an actionable area in response to detecting a selection of the actionable area from a user of the client device.
  • 14. The server of claim 1, wherein the media filter publication module is configured to generate a graphical user interface for displaying a map, receiving a selection of boundaries in the map, and including a geographic region formed with the selection of boundaries in the selected geolocation.
  • 15. A method comprising: receiving a content item and a selected geolocation from a first device; generating, by one or more hardware processors, a media filter from the content item, the media filter associated with the selected geolocation; receiving, from a client device, a geolocation of the client device; identifying the media filter based on the geolocation of the client device; communicating a plurality of media filters comprising the media filter to the client device for display of the media filter on a user interface of the client device by causing display of the media filter over media content on the user interface of the client device; and receiving, from the client device, a message comprising the media content overlaid by the media filter.
  • 16. The method of claim 15, further comprising: receiving an identification of a period of time associated with the content item and the selected geolocation, the media filter displayed on the user interface of the client device in response to the client device being located within the selected geolocation during the period of time.
  • 17. The method of claim 15, further comprising: receiving a first content item and first geolocation information from a first merchant and a second content item and second geolocation information from a second merchant; identifying a common geolocation between the first geolocation information and the second geolocation information; receiving a first bid amount from the first merchant and a second bid amount from the second merchant; identifying a highest bid amount; generating a merchant-based media filter based on the content item of the merchant with the highest bid amount and the common geolocation; and supplying the merchant-based media filter to the client device within the common geolocation.
  • 18. The method of claim 17, further comprising: disabling the merchant-based media filter after a predetermined duration has elapsed.
  • 19. A non-transitory computer-readable storage medium storing a set of instructions that, when executed by a processor of a machine, cause the machine to perform operations comprising: receiving a content item and a selected geolocation from a first device; generating, by one or more hardware processors, a media filter from the content item, the media filter associated with the selected geolocation; receiving, from a client device, a geolocation of the client device; identifying the media filter based on the geolocation of the client device; communicating a plurality of media filters comprising the media filter to the client device for display of the media filter on a user interface of the client device by causing display of the media filter over media content on the user interface of the client device; and receiving, from the client device, a message comprising the media content overlaid by the media filter.
  • 20. The server of claim 1, wherein the selected geolocation is determined by a drawing input received via a graphical user interface of the first device, the drawing input generating a geometric shape drawn on a map by the first device; and wherein the geolocation of the client device is determined by a global positioning system (GPS) measurement taken by the client device.
20150020086 Chen et al. Jan 2015 A1
20150046278 Pei et al. Feb 2015 A1
20150071619 Brough Mar 2015 A1
20150087263 Branscomb et al. Mar 2015 A1
20150088622 Ganschow et al. Mar 2015 A1
20150094093 Pierce et al. Apr 2015 A1
20150095020 Leydon Apr 2015 A1
20150096042 Mizrachi Apr 2015 A1
20150116529 Wu et al. Apr 2015 A1
20150130178 Clements May 2015 A1
20150142753 Soon-shiong May 2015 A1
20150149091 Milton et al. May 2015 A1
20150154650 Umeda Jun 2015 A1
20150163629 Cheung Jun 2015 A1
20150169827 Laborde Jun 2015 A1
20150172534 Miyakawa et al. Jun 2015 A1
20150178260 Brunson Jun 2015 A1
20150186497 Patton et al. Jul 2015 A1
20150222814 Li et al. Aug 2015 A1
20150237472 Alsina et al. Aug 2015 A1
20150237473 Koepke Aug 2015 A1
20150024971 Stefansson et al. Sep 2015 A1
20150254704 Kothe et al. Sep 2015 A1
20150261917 Smith Sep 2015 A1
20150262208 Bjontegard Sep 2015 A1
20150269624 Cheng et al. Sep 2015 A1
20150271779 Alavudin Sep 2015 A1
20150287072 Golden et al. Oct 2015 A1
20150294367 Oberbrunner et al. Oct 2015 A1
20150312184 Langholz et al. Oct 2015 A1
20150332231 Cui et al. Nov 2015 A1
20150332317 Cui et al. Nov 2015 A1
20150332325 Sharma et al. Nov 2015 A1
20150332329 Luo et al. Nov 2015 A1
20150341747 Barrand et al. Nov 2015 A1
20150350136 Flynn, III et al. Dec 2015 A1
20150358806 Salqvist Dec 2015 A1
20150365795 Allen et al. Dec 2015 A1
20150378502 Hu et al. Dec 2015 A1
20160006927 Sehn Jan 2016 A1
20160014063 Hogeg et al. Jan 2016 A1
20160019592 Muttineni et al. Jan 2016 A1
20160034712 Patton et al. Feb 2016 A1
20160085773 Chang et al. Mar 2016 A1
20160098742 Minicucci et al. Apr 2016 A1
20160099901 Allen et al. Apr 2016 A1
20160127871 Smith et al. May 2016 A1
20160180887 Sehn Jun 2016 A1
20160182422 Sehn et al. Jun 2016 A1
20160182875 Sehn Jun 2016 A1
20160210657 Chittilappilly et al. Jul 2016 A1
20160239248 Sehn Aug 2016 A1
20160277419 Allen et al. Sep 2016 A1
20160292735 Kim Oct 2016 A1
20160321708 Sehn Nov 2016 A1
20170006094 Abou Mahmoud et al. Jan 2017 A1
20170026786 Barron et al. Jan 2017 A1
20170061308 Chen et al. Mar 2017 A1
20170078760 Christoph et al. Mar 2017 A1
20170091795 Mansour et al. Mar 2017 A1
20170127233 Liang et al. May 2017 A1
20170132647 Bostick et al. May 2017 A1
20170164161 Gupta et al. Jun 2017 A1
20170186038 Glover et al. Jun 2017 A1
20170222962 Gauglitz et al. Aug 2017 A1
20170230315 Zubas et al. Aug 2017 A1
20170287006 Azmoodeh et al. Oct 2017 A1
20170339521 Colonna et al. Nov 2017 A1
20170359686 Colonna et al. Dec 2017 A1
20180121957 Cornwall et al. May 2018 A1
20180189835 Deluca et al. Jul 2018 A1
20180225687 Ahmed et al. Aug 2018 A1
20190372991 Allen et al. Dec 2019 A1
20200204726 Ebsen et al. Jun 2020 A1
20200288270 Allen et al. Sep 2020 A1
20200359166 Noeth et al. Nov 2020 A1
20200359167 Noeth et al. Nov 2020 A1
20210014238 Allen et al. Jan 2021 A1
20210073249 Chang et al. Mar 2021 A1
Foreign Referenced Citations (44)
Number Date Country
2887596 Jul 2015 CA
102930107 Feb 2013 CN
103200238 Jul 2013 CN
105760466 Jul 2016 CN
107637099 Jan 2018 CN
110249359 Sep 2019 CN
107637099 Oct 2020 CN
112040410 Dec 2020 CN
3062590 Apr 2009 EP
2151797 Feb 2010 EP
2399928 Sep 2004 GB
19990073076 Oct 1999 KR
20010078417 Aug 2001 KR
101457964 Nov 2014 KR
20160019900 Feb 2016 KR
102035405 Oct 2019 KR
102163528 Sep 2020 KR
WO-1996024213 Aug 1996 WO
WO-1999063453 Dec 1999 WO
WO-2000058882 Oct 2000 WO
WO-2001029642 Apr 2001 WO
WO-2001050703 Jul 2001 WO
WO-2006118755 Nov 2006 WO
WO-2007092668 Aug 2007 WO
WO-2009043020 Apr 2009 WO
WO-2011040821 Apr 2011 WO
WO-2011119407 Sep 2011 WO
WO-2013008238 Jan 2013 WO
WO-2013045753 Apr 2013 WO
WO-2014068573 May 2014 WO
WO-2014115136 May 2014 WO
WO-2014172388 Oct 2014 WO
WO-2014194262 Dec 2014 WO
WO-2015192026 Dec 2015 WO
WO-2016044424 Mar 2016 WO
WO-2016054562 Apr 2016 WO
WO-2016065131 Apr 2016 WO
WO-2016100318 Jun 2016 WO
WO-2016100318 Jun 2016 WO
WO-2016100342 Jun 2016 WO
WO-2016123381 Aug 2016 WO
WO-2016149594 Sep 2016 WO
WO-2016179166 Nov 2016 WO
WO-2018144931 Aug 2018 WO
Non-Patent Literature Citations (333)
Entry
US 10,484,394 B2, 11/2019, Allen et al. (withdrawn)
US 10,542,011 B2, 01/2020, Allen et al. (withdrawn)
“How Snaps Are Stored And Deleted”, Snapchat, [Online]. Retrieved from the Internet: <URL: https://web.archive.org/web/20130607042322/http://blog.snapchat.com/post/50060403002/how-snaps-are-stored-and-deleted>, (May 9, 2013), 2 pgs.
“International Application Serial No. PCT/US2014/040346, International Search Report dated Mar. 23, 2015”, 2 pgs.
“International Application Serial No. PCT/US2014/040346, Written Opinion dated Mar. 23, 2015”, 6 pgs.
“iVisit Mobile Getting Started”, iVisit, (Dec. 4, 2013), 1-16.
Melanson, Mike, “This text message will self destruct in 60 seconds”, readwrite.com, [Online]. Retrieved from the Internet: <http://readwrite.com/2011/02/11/this_text_message_will_self_destruct_in_60_seconds>, (Feb. 18, 2015).
Sawers, Paul, “Snapchat for iOS Lets You Send Photos to Friends and Set How Long They're Visible For”, [Online]. Retrieved from the Internet: <http://thenextweb.com/apps/2012/05/07/snapchat-for-ios-lets-you-send-photos-to-friends-and-set-how-long-theyre-visible-for/#!xCjrp>, (May 7, 2012), 1-5.
Shein, Esther, “Ephemeral Data”, Communications of the ACM, vol. 56, No. 9, (Sep. 2013), 20-22.
“A Whole New Story”, [Online]. Retrieved from the Internet: <https://www.snap.com/en-US/news/>, (2017), 13 pgs.
“Adding a watermark to your photos”, eBay, [Online]. Retrieved from the Internet: <URL: https://pages.ebay.com/help/sell/pictures.html>, (accessed May 24, 2017), 4 pgs.
“U.S. Appl. No. 14/304,855, Corrected Notice of Allowance dated Jun. 26, 2015”, 8 pgs.
“U.S. Appl. No. 14/304,855, Final Office Action dated Feb. 18, 2015”, 10 pgs.
“U.S. Appl. No. 14/304,855, Non Final Office Action dated Mar. 18, 2015”, 9 pgs.
“U.S. Appl. No. 14/304,855, Non Final Office Action dated Oct. 22, 2014”, 11 pgs.
“U.S. Appl. No. 14/304,855, Notice of Allowance dated Jun. 1, 2015”, 11 pgs.
“U.S. Appl. No. 14/304,855, Response filed Feb. 25, 2015 to Final Office Action dated Feb. 18, 2015”, 5 pgs.
“U.S. Appl. No. 14/304,855, Response filed Apr. 1, 2015 to Non Final Office Action dated Mar. 18, 2015”, 4 pgs.
“U.S. Appl. No. 14/304,855, Response filed Nov. 7, 2014 to Non Final Office Action dated Oct. 22, 2014”, 5 pgs.
“U.S. Appl. No. 14/505,478, Advisory Action dated Apr. 14, 2015”, 3 pgs.
“U.S. Appl. No. 14/505,478, Corrected Notice of Allowance dated May 18, 2016”, 2 pgs.
“U.S. Appl. No. 14/505,478, Corrected Notice of Allowance dated Jul. 22, 2016”, 2 pgs.
“U.S. Appl. No. 14/505,478, Final Office Action dated Mar. 17, 2015”, 16 pgs.
“U.S. Appl. No. 14/505,478, Non Final Office Action dated Jan. 27, 2015”, 13 pgs.
“U.S. Appl. No. 14/505,478, Non Final Office Action dated Sep. 4, 2015”, 19 pgs.
“U.S. Appl. No. 14/505,478, Notice of Allowance dated Apr. 28, 2016”, 11 pgs.
“U.S. Appl. No. 14/505,478, Notice of Allowance dated Aug. 26, 2016”, 11 pgs.
“U.S. Appl. No. 14/505,478, Response filed Jan. 30, 2015 to Non Final Office Action dated Jan. 27, 2015”, 10 pgs.
“U.S. Appl. No. 14/505,478, Response filed Mar. 4, 2016 to Non Final Office Action dated Sep. 4, 2015”, 12 pgs.
“U.S. Appl. No. 14/505,478, Response filed Apr. 1, 2015 to Final Office Action dated Mar. 17, 2015”, 6 pgs.
“U.S. Appl. No. 14/505,478, Response filed Aug. 17, 2015 to Advisory Action dated Apr. 14, 2015”, 10 pgs.
“U.S. Appl. No. 14/523,728, Non Final Office Action dated Dec. 12, 2014”, 10 pgs.
“U.S. Appl. No. 14/523,728, Notice of Allowance dated Mar. 24, 2015”, 8 pgs.
“U.S. Appl. No. 14/523,728, Notice of Allowance dated Apr. 15, 2015”, 8 pgs.
“U.S. Appl. No. 14/523,728, Notice of Allowance dated Jun. 5, 2015”, 8 pgs.
“U.S. Appl. No. 14/523,728, Response filed Jan. 16, 2015 to Non Final Office Action dated Dec. 12, 2014”, 5 pgs.
“U.S. Appl. No. 14/529,064, Final Office Action dated Aug. 11, 2015”, 23 pgs.
“U.S. Appl. No. 14/529,064, Final Office Action dated Aug. 24, 2016”, 23 pgs.
“U.S. Appl. No. 14/529,064, Non Final Office Action dated Mar. 12, 2015”, 20 pgs.
“U.S. Appl. No. 14/529,064, Non Final Office Action dated Apr. 6, 2017”, 25 pgs.
“U.S. Appl. No. 14/529,064, Non Final Office Action dated Apr. 18, 2016”, 21 pgs.
“U.S. Appl. No. 14/529,064, Response filed Feb. 5, 2015 to Restriction Requirement dated Feb. 2, 2015”, 6 pgs.
“U.S. Appl. No. 14/529,064, Response filed Mar. 26, 2015 to Non Final Office Action dated Mar. 12, 2015”, 8 pgs.
“U.S. Appl. No. 14/529,064, Response filed Jul. 18, 2016 to Non Final Office Action dated Apr. 18, 2016”, 20 pgs.
“U.S. Appl. No. 14/529,064, Restriction Requirement dated Feb. 2, 2015”, 5 pgs.
“U.S. Appl. No. 14/529,064, Response filed Oct. 12, 2015 to Final Office Action dated Aug. 11, 2015”, 19 pgs.
“U.S. Appl. No. 14/539,391, Notice of Allowance dated Mar. 5, 2015”, 17 pgs.
“U.S. Appl. No. 14/548,590, Advisory Action dated Nov. 18, 2016”, 3 pgs.
“U.S. Appl. No. 14/548,590, Final Office Action dated Jul. 5, 2016”, 16 pgs.
“U.S. Appl. No. 14/548,590, Final Office Action dated Sep. 16, 2015”, 15 pgs.
“U.S. Appl. No. 14/548,590, Non Final Office Action dated Jan. 9, 2017”, 14 pgs.
“U.S. Appl. No. 14/548,590, Non Final Office Action dated Feb. 11, 2016”, 16 pgs.
“U.S. Appl. No. 14/548,590, Non Final Office Action dated Apr. 20, 2015”, 14 pgs.
“U.S. Appl. No. 14/548,590, Response filed May 9, 2017 to Non Final Office Action dated Jan. 9, 2017”, 17 pgs.
“U.S. Appl. No. 14/548,590, Response filed May 10, 2016 to Non Final Office Action dated Feb. 11, 2016”, 14 pgs.
“U.S. Appl. No. 14/548,590, Response filed Nov. 7, 2016 to Final Office Action dated Jul. 5, 2016”, 14 pgs.
“U.S. Appl. No. 14/548,590, Response filed Dec. 16, 2015 to Final Office Action dated Sep. 16, 2015”, 13 pgs.
“U.S. Appl. No. 14/548,590, Response filed Jun. 16, 2015 to Non Final Office Action dated Apr. 20, 2015”, 19 pgs.
“U.S. Appl. No. 14/578,258, Examiner Interview Summary dated Nov. 25, 2015”, 3 pgs.
“U.S. Appl. No. 14/578,258, Non Final Office Action dated Jun. 10, 2015”, 12 pgs.
“U.S. Appl. No. 14/578,258, Notice of Allowance dated Feb. 26, 2016”, 5 pgs.
“U.S. Appl. No. 14/578,258, Response filed Dec. 10, 2015 to Non Final Office Action dated Jun. 10, 2015”, 11 pgs.
“U.S. Appl. No. 14/578,271, Final Office Action dated Dec. 3, 2015”, 15 pgs.
“U.S. Appl. No. 14/578,271, Non Final Office Action dated Aug. 7, 2015”, 12 pgs.
“U.S. Appl. No. 14/578,271, Notice of Allowance dated Dec. 7, 2016”, 7 pgs.
“U.S. Appl. No. 14/578,271, Response filed Feb. 9, 2016 to Final Office Action dated Dec. 3, 2015”, 10 pgs.
“U.S. Appl. No. 14/578,271, Response filed Jun. 19, 2015 to Restriction Requirement dated Apr. 23, 2015”, 6 pgs.
“U.S. Appl. No. 14/578,271, Response filed Oct. 28, 2015 to Non Final Office Action dated Aug. 7, 2015”, 9 pgs.
“U.S. Appl. No. 14/578,271, Restriction Requirement dated Apr. 23, 2015”, 8 pgs.
“U.S. Appl. No. 14/594,410, Non Final Office Action dated Jan. 4, 2016”, 10 pgs.
“U.S. Appl. No. 14/594,410, Notice of Allowance dated Aug. 2, 2016”, 5 pgs.
“U.S. Appl. No. 14/594,410, Notice of Allowance dated Dec. 15, 2016”.
“U.S. Appl. No. 14/594,410, Response filed Jul. 1, 2016 to Non Final Office Action dated Jan. 4, 2016”, 10 pgs.
“U.S. Appl. No. 14/612,692, Examiner Interview Summary dated Jan. 29, 2016”, 5 pgs.
“U.S. Appl. No. 14/612,692, Examiner Interview Summary dated Jul. 6, 2016”, 4 pgs.
“U.S. Appl. No. 14/612,692, Examiner Interview Summary dated Aug. 14, 2015”, 3 pgs.
“U.S. Appl. No. 14/612,692, Examiner Interview Summary dated Sep. 8, 2016”, 3 pgs.
“U.S. Appl. No. 14/612,692, Final Office Action dated Aug. 15, 2016”, 18 pgs.
“U.S. Appl. No. 14/612,692, Final Office Action dated Nov. 23, 2015”, 15 pgs.
“U.S. Appl. No. 14/612,692, Non Final Office Action dated Jan. 3, 2017”, 17 pgs.
“U.S. Appl. No. 14/612,692, Non Final Office Action dated Mar. 28, 2016”, 15 pgs.
“U.S. Appl. No. 14/612,692, Non Final Office Action dated Jul. 20, 2015”, 25 pgs.
“U.S. Appl. No. 14/612,692, Response filed Feb. 23, 2016 to Final Office Action dated Nov. 23, 2015”, 10 pgs.
“U.S. Appl. No. 14/612,692, Response filed May 3, 2017 to Non Final Office Action dated Jan. 3, 2017”, 18 pgs.
“U.S. Appl. No. 14/612,692, Response filed Nov. 14, 2016 to Final Office Action dated Aug. 15, 2016”, 15 pgs.
“U.S. Appl. No. 14/612,692, Response filed Jun. 28, 2016 to Non Final Office Action dated Mar. 28, 2016”, 14 pgs.
“U.S. Appl. No. 14/612,692, Response filed Oct. 19, 2015 to Non Final Office Action dated Jul. 20, 2015”, 11 pgs.
“U.S. Appl. No. 14/634,417, Advisory Action dated Mar. 14, 2017”, 3 pgs.
“U.S. Appl. No. 14/634,417, Final Office Action dated Jan. 31, 2017”, 27 pgs.
“U.S. Appl. No. 14/634,417, Non Final Office Action dated Aug. 30, 2016”, 23 pgs.
“U.S. Appl. No. 14/634,417, Response filed Mar. 2, 2017 to Final Office Action dated Jan. 31, 2017”, 23 pgs.
“U.S. Appl. No. 14/634,417, Response filed Nov. 30, 2016 to Non Final Office Action dated Aug. 30, 2016”, 18 pgs.
“U.S. Appl. No. 14/682,259, Notice of Allowance dated Jul. 27, 2015”, 17 pgs.
“U.S. Appl. No. 14/704,212, Final Office Action dated Jun. 17, 2016”, 12 pgs.
“U.S. Appl. No. 14/704,212, Non Final Office Action dated Dec. 4, 2015”, 17 pgs.
“U.S. Appl. No. 14/704,212, Response filed Mar. 4, 2016 to Non Final Office Action dated Dec. 4, 2015”, 11 pgs.
“U.S. Appl. No. 14/738,069, Non Final Office Action dated Mar. 21, 2016”, 12 pgs.
“U.S. Appl. No. 14/738,069, Notice of Allowance dated Aug. 17, 2016”, 6 pgs.
“U.S. Appl. No. 14/738,069, Response filed Jun. 10, 2016 to Non Final Office Action dated Mar. 21, 2016”, 10 pgs.
“U.S. Appl. No. 14/808,283, Notice of Allowance dated Apr. 12, 2016”, 9 pgs.
“U.S. Appl. No. 14/808,283, Notice of Allowance dated Jul. 14, 2016”, 8 pgs.
“U.S. Appl. No. 14/808,283, Preliminary Amendment filed Jul. 24, 2015”, 8 pgs.
“U.S. Appl. No. 14/841,987, Notice of Allowance dated Mar. 29, 2017”, 17 pgs.
“U.S. Appl. No. 14/967,472, Final Office Action dated Mar. 10, 2017”, 15 pgs.
“U.S. Appl. No. 14/967,472, Non Final Office Action dated Sep. 8, 2016”, 11 pgs.
“U.S. Appl. No. 14/967,472, Preliminary Amendment filed Dec. 15, 2015”, 6 pgs.
“U.S. Appl. No. 14/967,472, Response filed Dec. 5, 2016 to Non Final Office Action dated Sep. 8, 2016”, 11 pgs.
“U.S. Appl. No. 15/137,608, Preliminary Amendment filed Apr. 26, 2016”, 6 pgs.
“U.S. Appl. No. 15/152,975, Non Final Office Action dated Jan. 12, 2017”, 36 pgs.
“U.S. Appl. No. 15/152,975, Preliminary Amendment filed May 19, 2016”, 8 pgs.
“U.S. Appl. No. 15/208,460, Notice of Allowance dated Feb. 27, 2017”, 8 pgs.
“U.S. Appl. No. 15/208,460, Notice of Allowance dated Dec. 30, 2016”, 9 pgs.
“U.S. Appl. No. 15/208,460, Supplemental Preliminary Amendment filed Jul. 18, 2016”, 8 pgs.
“U.S. Appl. No. 15/224,312, Preliminary Amendment filed Feb. 1, 2017”, 11 pgs.
“U.S. Appl. No. 15/224,343, Preliminary Amendment filed Jan. 31, 2017”, 10 pgs.
“U.S. Appl. No. 15/224,355, Preliminary Amendment filed Apr. 3, 2017”, 12 pgs.
“U.S. Appl. No. 15/224,372, Preliminary Amendment filed May 5, 2017”, 10 pgs.
“U.S. Appl. No. 15/224,359, Preliminary Amendment filed Apr. 19, 2017”, 8 pgs.
“U.S. Appl. No. 15/298,806, Non Final Office Action dated Jun. 12, 2017”, 26 pgs.
“U.S. Appl. No. 15/298,806, Preliminary Amendment filed Oct. 21, 2016”, 8 pgs.
“U.S. Appl. No. 15/416,846, Preliminary Amendment filed Feb. 18, 2017”, 10 pgs.
“U.S. Appl. No. 15/486,111, Non Final Office Action dated May 9, 2017”, 17 pgs.
“BlogStomp”, [Online]. Retrieved from the Internet: <URL: http://stompsoftware.com/blogstomp>, (accessed May 24, 2017), 12 pgs.
“Canadian Application Serial No. 2,894,332, Response filed Jan. 24, 2017 to Office Action dated Aug. 16, 2016”, 15 pgs.
“Canadian Application Serial No. 2,894,332, Office Action dated Aug. 16, 2016”, 4 pgs.
“Canadian Application Serial No. 2,910,158, Office Action dated Dec. 15, 2016”, 5 pgs.
“Canadian Application Serial No. 2,910,158, Response filed Apr. 11, 2017 to Office Action dated Dec. 15, 2016”, 21 pgs.
“Cup Magic Starbucks Holiday Red Cups come to life with AR app”, [Online]. Retrieved from the Internet: <http://www.blastradius.com/work/cup-magic>, (2016), 7 pgs.
“Daily App: InstaPlace (iOS/Android): Give Pictures a Sense of Place”, TechPP, [Online]. Retrieved from the Internet: <URL: http://techpp.com/2013/02/15/instaplace-app-review>, (2013), 13 pgs.
“InstaPlace Photo App Tell The Whole Story”, [Online]. Retrieved from the Internet: <https://youtu.be/uF_gFkg1hBM>, (Nov. 8, 2013), 113 pgs.
“International Application Serial No. PCT/EP2008/063682, International Search Report dated Nov. 24, 2008”, 3 pgs.
“International Application Serial No. PCT/US2015/035591, International Preliminary Report on Patentability dated Dec. 22, 2016”, 7 pgs.
“International Application Serial No. PCT/US2015/035591, International Search Report dated Aug. 11, 2015”, 5 pgs.
“International Application Serial No. PCT/US2015/035591, International Written Opinion dated Aug. 11, 2015”, 5 pgs.
“International Application Serial No. PCT/US2015/050424, International Search Report dated Dec. 4, 2015”, 2 pgs.
“International Application Serial No. PCT/US2015/050424, Written Opinion dated Dec. 4, 2015”, 10 pgs.
“International Application Serial No. PCT/US2015/053811, International Preliminary Report on Patentability dated Apr. 13, 2017”, 9 pgs.
“International Application Serial No. PCT/US2015/053811, International Search Report dated Nov. 23, 2015”, 5 pgs.
“International Application Serial No. PCT/US2015/053811, Written Opinion dated Nov. 23, 2015”, 8 pgs.
“International Application Serial No. PCT/US2015/056884, International Preliminary Report on Patentability dated May 4, 2017”, 8 pgs.
“International Application Serial No. PCT/US2015/056884, International Search Report dated Dec. 22, 2015”, 5 pgs.
“International Application Serial No. PCT/US2015/056884, Written Opinion dated Dec. 22, 2015”, 6 pgs.
“International Application Serial No. PCT/US2015/065785, International Search Report dated Jul. 21, 2016”, 5 pgs.
“International Application Serial No. PCT/US2015/065785, Written Opinion dated Jul. 21, 2016”, 5 pgs.
“International Application Serial No. PCT/US2015/065821, International Search Report dated Mar. 3, 2016”, 2 pgs.
“International Application Serial No. PCT/US2015/065821, Written Opinion dated Mar. 3, 2016”, 3 pgs.
“International Application Serial No. PCT/US2016/023085, International Search Report dated Jun. 17, 2016”, 5 pgs.
“International Application Serial No. PCT/US2016/023085, Written Opinion dated Jun. 17, 2016”, 6 pgs.
“International Application Serial No. PCT/US2015/037251, International Search Report dated Sep. 29, 2015”, 7 pgs.
“Introducing Snapchat Stories”, [Online]. Retrieved from the Internet: <https://www.youtube.com/watch?v=88Cu3yN-LIM>, (Oct. 3, 2013), 92 pgs.
“Macy's Believe-o-Magic”, [Online]. Retrieved from the Internet: <https://www.youtube.com/watch?v=xvzRXy3J0Z0>, (Nov. 7, 2011), 102 pgs.
“Macy's Introduces Augmented Reality Experience in Stores across Country as Part of Its 2011 “Believe” Campaign”, [Online]. Retrieved from the Internet: <http://www.businesswire.com/news/home/20111102006759/en/Macy%E2%80%99s-lntroduces-Augmented-Reality-Experience-Stores-Country>, (Nov. 2, 2011), 6 pgs.
“Pluraleyes by Red Giant”, © 2002-2015 Red Giant LLC, [Online]. Retrieved from the Internet: <URL: http://www.redgiant.com/products/pluraleyes/>, (Accessed Nov. 11, 2015), 5 pgs.
“Starbucks Cup Magic”, [Online]. Retrieved from the Internet: <https://www.youtube.com/watch?v=RWwQXi9RG0w>, (Nov. 8, 2011), 87 pgs.
“Starbucks Cup Magic for Valentine's Day”, [Online]. Retrieved from the Internet: <https://www.youtube.com/watch?v=8nvqOzjq10w>, (Feb. 6, 2012), 88 pgs.
“Starbucks Holiday Red Cups Come to Life, Signaling the Return of the Merriest Season”, [Online]. Retrieved from the Internet: <http://www.businesswire.com/news/home/20111115005744/en/2479513/Starbucks-Holiday-Red-Cups-Life-Signaling-Return>, (Nov. 15, 2011), 5 pgs.
Carthy, Roi, “Dear All Photo Apps: Mobli Just Won Filters”, [Online]. Retrieved from the Internet: <URL: https://techcrunch.com/2011/09/08/mobil-filters>, (Sep. 8, 2011), 10 pgs.
Castelluccia, Claude, et al., “EphPub: Toward robust Ephemeral Publishing”, Network Protocols (ICNP), 2011 19th IEEE International Conference on, IEEE, (Oct. 17, 2011), 18 pgs.
Clarke, Tangier, “Automatically syncing multiple clips and lots of audio like PluralEyes possible?”, [Online]. Retrieved from the Internet: <URL: https://forums.creativecow.net/thread/344/20553>, (May 21, 2013), 8 pgs.
Janthong, Isaranu, “Android App Review Thailand”, [Online]. Retrieved from the Internet:<http://www.android-free-app-review.com/2013/01/instaplace-android-google-play-store.html>, (Jan. 23, 2013), 9 pgs.
Leyden, John, “This SMS will self-destruct in 40 seconds”, [Online]. Retrieved from the Internet: <URL: http://www.theregister.co.uk/2005/12/12/stealthtext/>, (Dec. 12, 2005), 1 pg.
Macleod, Duncan, “Macys Believe-o-Magic App”, [Online]. Retrieved from the Internet: <URL:http://theinspirationroom.com/daily/2011/macys-believe-o-magic-app>, (Nov. 14, 2011), 10 pgs.
Macleod, Duncan, “Starbucks Cup Magic—Let's Merry”, [Online]. Retrieved from the Internet: <URL: http://theinspirationroom.com/daily/2011/starbucks-cup-magic>, (Nov. 12, 2011), 8 pgs.
Notopoulos, Katie, “A Guide To The New Snapchat Filters And Big Fonts”, [Online]. Retrieved from the Internet:<https://www.buzzfeed.com/katienotopoulos/a-guide-to-the-new-snapchat-filters-and-big-fonts?utm_term=.bkQ9qVZWe#.nv58YXpkV>, (Dec. 22, 2013), 13 pgs.
Panzarino, Matthew, “Snapchat Adds Filters, A Replay Function And For Whatever Reason, Time, Temperature And Speed Overlays”, [Online], Retrieved from the Internet: <https://techcrunch.com/2013/12/20/snapchat-adds-filters-new-font-and-for-some-reason-time-temperature-and-speed-overlays/>, (Dec. 20, 2013), 12 pgs.
Sawers, Paul, “Snapchat for iOS lets you send photos to friends and set how long they're visible for”, <http://thenextweb.com/apps/2012/05/07/snapchat-for-ios-lets-you-send-photos-to-friends-and-set-how-long-theyre-visible-for>, (May 2012), 1-3 pgs.
Trice, Andrew, “My Favorite New Feature: Multi-Clip Sync in Premiere Pro CC”, [Online]. Retrieved from the Internet: <URL: http://www.tricedesigns.com/2013/06/18/my-favorite-new-feature-multi-cam-synch-in-premiere-pro-cc/>, (Jun. 18, 2013), 5 pgs.
Tripathi, Rohit, “Watermark Images in PHP And Save File on Server”, [Online]. Retrieved from the Internet: <URL: http://code.rohitink.com/2012/12/28/watermark-images-in-php-and-save-file-on-server/>, (Dec. 28, 2012), 4 pgs.
“U.S. Appl. No. 14/529,064, Examiner Interview Summary dated May 23, 2016”, 3 pgs.
“U.S. Appl. No. 14/529,064, Examiner Interview Summary dated Nov. 17, 2016”, 3 pgs.
“U.S. Appl. No. 14/529,064, Response filed Sep. 6, 2017 to Non Final Office Action dated Apr. 6, 2017”, 19 pgs.
“U.S. Appl. No. 14/529,064, Response filed Dec. 21, 2016 to Final Office Action dated Aug. 24, 2016”, 17 pgs.
“U.S. Appl. No. 14/548,590, Final Office Action dated Jul. 18, 2017”, 20 pgs.
“U.S. Appl. No. 14/841,987, Notice of Allowance dated Aug. 7, 2017”, 8 pgs.
“U.S. Appl. No. 15/298,806, Final Office Action dated Oct. 24, 2017”, 15 pgs.
“U.S. Appl. No. 15/298,806, Response filed Sep. 12, 2017 to Non Final Office Action dated Jun. 12, 2017”, 12 pgs.
“U.S. Appl. No. 15/486,111, Corrected Notice of Allowance dated Sep. 7, 2017”.
“U.S. Appl. No. 15/486,111, Notice of Allowance dated Aug. 30, 2017”, 5 pgs.
“U.S. Appl. No. 15/486,111, Response filed Aug. 9, 2017 to Non Final Office Action dated May 9, 2017”, 11 pgs.
“International Application Serial No. PCT/US2016/023085, International Preliminary Report on Patentability dated Sep. 28, 2017”, 8 pgs.
“U.S. Appl. No. 14/529,064, Final Office Action dated Jan. 25, 2018”, 39 pgs.
“U.S. Appl. No. 15/074,029, Response filed Feb. 28, 2018 to Non Final Office Action dated Nov. 30, 2017”, 12 pgs.
“U.S. Appl. No. 15/298,806, Advisory Action dated Jan. 29, 2018”, 4 pgs.
“U.S. Appl. No. 15/298,806, Examiner Interview Summary dated Jan. 12, 2018”, 3 pgs.
“U.S. Appl. No. 15/298,806, Response filed Jan. 9, 2018 to Final Office Action dated Oct. 24, 2017”, 17 pgs.
“U.S. Appl. No. 15/835,100, Non Final Office Action dated Jan. 23, 2018”, 18 pgs.
“International Application Serial No. PCT/US2018/016723, International Search Report dated Apr. 5, 2018”, 2 pgs.
“International Application Serial No. PCT/US2018/016723, Written Opinion dated Apr. 5, 2018”, 17 pgs.
“U.S. Appl. No. 14/548,590, Advisory Action dated Apr. 19, 2018”, 2 pgs.
“U.S. Appl. No. 14/548,590, Appeal Brief filed Apr. 20, 2018”, 28 pgs.
“U.S. Appl. No. 15/298,806, Non Final Office Action dated May 17, 2018”, 16 pgs.
“U.S. Appl. No. 15/835,100, Response filed Apr. 23, 2018 to Non Final Office Action dated Jan. 23, 2018”, 11 pgs.
“European Application Serial No. 16716090.2, Response filed May 21, 2018 to Communication pursuant to Rules 161(1) and 162 EPC dated Nov. 10, 2017”, w/ English Claims, 89 pgs.
“U.S. Appl. No. 15/074,029, Non Final Office Action dated Nov. 30, 2017”, 16 pgs.
“U.S. Appl. No. 14/529,064, Non Final Office Action dated Jul. 13, 2018”, 38 pgs.
“U.S. Appl. No. 14/529,064, Response filed May 25, 2018 to Final Office Action dated Jan. 25, 2018”, 20 pgs.
“U.S. Appl. No. 14/548,590, Appeal Decision dated Mar. 26, 2020”, 13 pgs.
“U.S. Appl. No. 14/548,590, Notice of Allowance dated Jun. 17, 2020”, 9 pgs.
“U.S. Appl. No. 15/074,029, Advisory Action dated Oct. 11, 2018”, 3 pgs.
“U.S. Appl. No. 15/074,029, Corrected Notice of Allowability dated Feb. 5, 2020”, 4 pgs.
“U.S. Appl. No. 15/074,029, Corrected Notice of Allowability dated Aug. 20, 2019”, 10 pgs.
“U.S. Appl. No. 15/074,029, Final Office Action dated Jun. 28, 2018”, 22 pgs.
“U.S. Appl. No. 15/074,029, Non Final Office Action dated Jan. 23, 2019”, 19 pgs.
“U.S. Appl. No. 15/074,029, Notice of Allowance dated Jun. 19, 2019”, 14 pgs.
“U.S. Appl. No. 15/074,029, Response filed Aug. 28, 2018 to Final Office Action dated Jun. 28, 2018”, 21pgs.
“U.S. Appl. No. 15/074,029, Response filed Apr. 23, 2019 to Non Final Office Action dated Jan. 23, 2019”, 15 pgs.
“U.S. Appl. No. 15/298,806, Examiner Interview Summary dated Aug. 13, 2018”, 3 pgs.
“U.S. Appl. No. 15/298,806, Notice of Allowance dated Sep. 19, 2018”, 5 pgs.
“U.S. Appl. No. 15/298,806, Response filed Aug. 10, 2018 to Non Final Office Action dated May 17, 2018”, 15 pgs.
“U.S. Appl. No. 15/424,184, Advisory Action dated May 26, 2020”, 6 pgs.
“U.S. Appl. No. 15/424,184, Advisory Action dated Aug. 25, 2020”, 5 pgs.
“U.S. Appl. No. 15/424,184, Examiner Interview Summary dated Jan. 10, 2019”, 3 pgs.
“U.S. Appl. No. 15/424,184, Examiner Interview Summary dated Jul. 30, 2019”, 2 pgs.
“U.S. Appl. No. 15/424,184, Final Office Action dated Jan. 29, 2019”, 14 pgs.
“U.S. Appl. No. 15/424,184, Final Office Action dated Mar. 9, 2020”, 19 pgs.
“U.S. Appl. No. 15/424,184, Final Office Action dated Jul. 27, 2020”, 18 pgs.
“U.S. Appl. No. 15/424,184, Final Office Action dated Sep. 9, 2019”, 13 pgs.
“U.S. Appl. No. 15/424,184, Non Final Office Action dated May 21, 2019”, 16 pgs.
“U.S. Appl. No. 15/424,184, Non Final Office Action dated Jun. 29, 2020”, 19 pgs.
“U.S. Appl. No. 15/424,184, Non Final Office Action dated Nov. 30, 2018”, 22 pgs.
“U.S. Appl. No. 15/424,184, Non Final Office Action dated Dec. 2, 2019”, 16 pgs.
“U.S. Appl. No. 15/424,184, Notice of Allowance dated Sep. 25, 2020”, 10 pgs.
“U.S. Appl. No. 15/424,184, Response filed Mar. 2, 2020 to Non Final Office Action dated Dec. 2, 2019”, 11 pgs.
“U.S. Appl. No. 15/424,184, Response filed May 11, 2020 to Final Office Action dated Mar. 9, 2020”, 14 pgs.
“U.S. Appl. No. 15/424,184, Response filed Jul. 13, 2020 to Non Final Office Action dated May 5, 2020”, 11 pgs.
“U.S. Appl. No. 15/424,184, Response filed Aug. 5, 2020 to Final Office Action dated Jul. 27, 2020”, 12 pgs.
“U.S. Appl. No. 15/424,184, Response filed Aug. 21, 2019 to Non Final Office Action dated May 21, 2019”, 12 pgs.
“U.S. Appl. No. 15/424,184, Response filed Sep. 1, 2020 to Advisory Action dated Aug. 25, 2020”, 9 pgs.
“U.S. Appl. No. 15/424,184, Response filed Nov. 11, 2019 to Final Office Action dated Sep. 9, 2019”, 12 pgs.
“U.S. Appl. No. 15/424,184, Response filed Apr. 29, 2019 to Final Office Action dated Jan. 29, 2019”, 11 pgs.
“U.S. Appl. No. 15/424,184, Response filed Jan. 4, 2019 to Non Final Office Action dated Nov. 30, 2018”, 17 pgs.
“U.S. Appl. No. 15/474,821, Advisory Action dated Dec. 19, 2019”, 3 pgs.
“U.S. Appl. No. 15/474,821, Final Office Action dated Sep. 3, 2019”, 19 pgs.
“U.S. Appl. No. 15/474,821, Non Final Office Action dated Jan. 25, 2019”, 17 pgs.
“U.S. Appl. No. 15/474,821, Non Final Office Action dated Mar. 18, 2021”, 17 pgs.
“U.S. Appl. No. 15/474,821, Notice of Non-Compliant Amendment dated Sep. 8, 2020”, 6 pgs.
“U.S. Appl. No. 15/474,821, Response filed Jan. 7, 2021 to Notice of Non-Compliant Amendment dated Sep. 8, 2020”, 9 pgs.
“U.S. Appl. No. 15/474,821, Response filed May 11, 2021 to Non Final Office Action dated Mar. 18, 2021”, 10 pgs.
“U.S. Appl. No. 15/474,821, Response filed Apr. 25, 2019 to Non Final Office Action dated Jan. 25, 2019”, 16 pgs.
“U.S. Appl. No. 15/474,821, Response filed on Dec. 2, 2019 to Final Office Action dated Sep. 3, 2019”, 10 pgs.
“U.S. Appl. No. 15/835,100, Notice of Allowance dated May 22, 2018”, 5 pgs.
“U.S. Appl. No. 15/837,935, Notice of Allowance dated Nov. 25, 2019”, 18 pgs.
“U.S. Appl. No. 15/946,990, Final Office Action dated May 9, 2019”, 11 pgs.
“U.S. Appl. No. 15/946,990, Non Final Office Action dated Dec. 3, 2018”, 10 pgs.
“U.S. Appl. No. 15/946,990, Notice of Allowance dated Sep. 24, 2019”, 5 pgs.
“U.S. Appl. No. 15/946,990, Response filed Feb. 20, 2019 to Non Final Office Action dated Dec. 3, 2018”, 11 pgs.
“U.S. Appl. No. 15/946,990, Response filed Jul. 9, 2019 to Final Office Action dated May 9, 2019”, 12 pgs.
“U.S. Appl. No. 16/105,687, Non Final Office Action dated Sep. 14, 2018”, 11 pgs.
“U.S. Appl. No. 16/105,687, Notice of Allowance dated Feb. 25, 2019”, 8 pgs.
“U.S. Appl. No. 16/105,687, Response filed Dec. 14, 2018 to Non Final Office Action dated Sep. 14, 2018”, 12 pgs.
“U.S. Appl. No. 16/219,577, Non Final Office Action dated Oct. 29, 2019”, 7 pgs.
“U.S. Appl. No. 16/219,577, Notice of Allowance dated Jan. 15, 2020”, 7 pgs.
“U.S. Appl. No. 16/219,577, Response filed Oct. 3, 2019 to Restriction Requirement dated Aug. 7, 2019”, 6 pgs.
“U.S. Appl. No. 16/219,577, Response filed Dec. 5, 2019 to Non Final Office Action dated Oct. 29, 2019”, 6 pgs.
“U.S. Appl. No. 16/219,577, Restriction Requirement dated Aug. 7, 2019”, 6 pgs.
“U.S. Appl. No. 16/428,210, Advisory Action dated Sep. 9, 2020”, 3 pgs.
“U.S. Appl. No. 16/428,210, Examiner Interview Summary dated Aug. 28, 2020”, 3 pgs.
“U.S. Appl. No. 16/428,210, Final Office Action dated Jun. 29, 2020”, 16 pgs.
“U.S. Appl. No. 16/428,210, Non Final Office Action dated Apr. 6, 2020”, 16 pgs.
“U.S. Appl. No. 16/428,210, Non Final Office Action dated Nov. 27, 2020”, 17 pgs.
“U.S. Appl. No. 16/428,210, Preliminary Amendment filed Aug. 8, 2019”, 8 pgs.
“U.S. Appl. No. 16/428,210, Response filed Apr. 27, 2021 to Non Final Office Action dated Nov. 27, 2020”, 11 pgs.
“U.S. Appl. No. 16/428,210, Response filed Jun. 3, 2020 to Non Final Office Action dated Apr. 6, 2020”, 10 pgs.
“U.S. Appl. No. 16/428,210, Response filed Aug. 27, 2020 to Final Office Action dated Jun. 29, 2020”, 12 pgs.
“U.S. Appl. No. 16/541,919, Non Final Office Action dated Apr. 14, 2020”, 18 pgs.
“U.S. Appl. No. 16/541,919, Notice of Allowance dated Jun. 30, 2020”, 8 pgs.
“U.S. Appl. No. 16/541,919, Notice of Allowance dated Oct. 15, 2020”, 8 pgs.
“U.S. Appl. No. 16/541,919, Response filed Jun. 12, 2020 to Non Final Office Action dated Apr. 14, 2020”, 8 pgs.
“U.S. Appl. No. 16/808,101, Preliminary Amendment filed Mar. 10, 2020”, 8 pgs.
“U.S. Appl. No. 16/841,817, Non Final Office Action dated May 26, 2021”, 7 pgs.
“U.S. Appl. No. 16/943,706, Examiner Interview Summary dated Mar. 31, 2021”, 2 pgs.
“U.S. Appl. No. 16/943,706, Final Office Action dated Feb. 24, 2021”, 17 pgs.
“U.S. Appl. No. 16/943,706, Non Final Office Action dated Sep. 8, 2020”, 16 pgs.
“U.S. Appl. No. 16/943,706, Response filed Feb. 8, 2021 to Non Final Office Action dated Sep. 8, 2020”, 9 pgs.
“U.S. Appl. No. 16/943,804, Examiner Interview Summary dated Mar. 31, 2021”, 2 pgs.
“U.S. Appl. No. 16/943,804, Final Office Action dated Feb. 24, 2021”, 15 pgs.
“U.S. Appl. No. 16/943,804, Non Final Office Action dated Sep. 8, 2020”, 14 pgs.
“U.S. Appl. No. 16/943,804, Response filed Feb. 8, 2021 to Non Final Office Action dated Sep. 8, 2020”, 7 pgs.
“U.S. Appl. No. 17/031,310, Preliminary Amendment filed Jan. 22, 2021”, 8 pgs.
“Chinese Application Serial No. 201680027177.8, Office Action dated Oct. 28, 2019”, w/English Translation, 15 pgs.
“Chinese Application Serial No. 201680027177.8, Response filed Mar. 5, 2020 to Office Action dated Oct. 28, 2019”, w/ English Claims, 11 pgs.
“Connecting To Your Customers In the Triangle and Beyond”, Newsobserver.com, (2013), 16 pgs.
“Demystifying Location Data Accuracy”, Mobile Marketing Association, (Nov. 2015), 18 pgs.
“European Application Serial No. 16716090.2, Communication Pursuant to Article 94(3) EPC dated Jan. 15, 2020”, 6 pgs.
“European Application Serial No. 16716090.2, Response filed Apr. 15, 2020 to Communication Pursuant to Article 94(3) EPC dated Jan. 15, 2020”, 10 pgs.
“European Application Serial No. 18747246.9, Communication Pursuant to Article 94(3) EPC dated Jun. 25, 2020”, 10 pgs.
“European Application Serial No. 18747246.9, Extended European Search Report dated Nov. 7, 2019”, 7 pgs.
“European Application Serial No. 18747246.9, Response filed Jun. 3, 2020 to Extended European Search Report dated Nov. 7, 2019”, 15 pgs.
“European Application Serial No. 18747246.9, Response filed Oct. 15, 2020 to Communication Pursuant to Article 94(3) EPC dated Jun. 25, 2020”, 16 pgs.
“Geofencing and the event industry”, Goodbarber Blog, [Online] Retrieved from the internet by the examiner on May 16, 2019: <URL: https://www.goodbarber.com/blog/geofencing-and-the-event-industry-a699/>, (Nov. 9, 2015), 7 pgs.
“IAB Platform Status Report: A Mobile Advertising Review”, Interactive Advertising Bureau, (Jul. 2008), 24 pgs.
“International Application Serial No. PCT/US2018/016723, International Preliminary Report on Patentability dated Aug. 15, 2019”, 19 pgs.
“Korean Application Serial No. 10-2017-7029861, Notice of Preliminary Rejection dated Jan. 17, 2019”, w/ English Translation, 9 pgs.
“Korean Application Serial No. 10-2017-7029861, Response filed Mar. 15, 2019 to Notice of Preliminary Rejection dated Jan. 17, 2019”, w/ English Claims, 20 pgs.
“Korean Application Serial No. 10-2019-7025443, Notice of Preliminary Rejection dated Feb. 2, 2021”, w/ English Translation, 11 pgs.
“Korean Application Serial No. 10-2019-7030235, Final Office Action dated May 20, 2020”, w/English Translation, 5 pgs.
“Korean Application Serial No. 10-2019-7030235, Notice of Preliminary Rejection dated Nov. 28, 2019”, w/ English Translation, 10 pgs.
“Korean Application Serial No. 10-2019-7030235, Response filed Jan. 28, 2020 to Notice of Preliminary Rejection dated Nov. 28, 2019”, w/ English Claims, 12 pgs.
“Korean Application Serial No. 10-2019-7030235, Response filed Jun. 22, 2020 to Final Office Action dated May 20, 2020”, w/ English Claims, 16 pgs.
“Korean Application Serial No. 10-2021-7004376, Notice of Preliminary Rejection dated May 31, 2021”, w/ English Translation, 9 pgs.
“Mobile Location User Cases and Case Studies”, Interactive Advertising Bureau, (Mar. 2014), 25 pgs.
“WIPO; International Preliminary Report; WO201776739”, (dated Sep. 10, 2018), 5 pgs.
“WIPO; Search Strategy; WO201776739”, (dated Dec. 10, 2017), 6 pgs.
Carr, Dale, “Mobile Ad Targeting: A Labor of Love”, Ad Week, [Online] Retrieved from the Internet on Feb. 11, 2019: <URL: https://www.adweek.com/digital/mobile-ad-targeting-a-labor-of-love/>, (Feb. 12, 2016), 7 pgs.
Kumar, S, “Optimization Issues in Web and Mobile Advertising”, Chapter 2—Pricing Models in Web Advertising, SpringerBriefs in Operations Management, (2016), 6 pgs.
Naylor, Joseph, “Geo-Precise Targeting: It's time to Get off the Fence”, Be In The Know Blog, [Online] Retrieved from the internet by the examiner on May 16, 2019: <URL: http://blog.cmglocalsolutions.com/geo-precise-targeting-its-time-to-get-off-the-fence>, (May 15, 2015), 6 pgs.
Palmer, Alex, “Geofencing at events: how to reach potential customers live and on-site”, Streetfight Mag, [Online] Retrieved from the internet by the examiner on May 16, 2019: <URL: http://streetfightmag.com/2015/08/20/geofencing-at-events-how-to-reach-potential-customers-live-and-on-site>, (Aug. 20, 2015), 6 pgs.
Peterson, Lisa, et al., “Location-Based Advertising”, Peterson Mobility Solutions, (Dec. 2009), 39 pgs.
Quercia, Daniele, et al., “Mobile Phones and Outdoor Advertising: Measurable Advertising”, IEEE Pervasive Computing, (2011), 9 pgs.
Simonite, Tom, “Mobile Data: A Gold Mine for Telcos”, MIT Technology Review, (May 27, 2010), 6 pgs.
Virgillito, Dan, “Facebook Introduces Mobile Geo-Fencing With Local Awareness Ads”, Adespresso, [Online] Retrieved from the internet by the examiner on May 16, 2019: <URL: https://adespresso.com/blog/facebook-local-business-ads-geo-fencing/>, (Oct. 8, 2014), 14 pgs.
“U.S. Appl. No. 16/943,706, Response filed Jun. 24, 2021 to Final Office Action dated Feb. 24, 2021”, 11 pgs.
“U.S. Appl. No. 16/943,804, Response filed Jun. 24, 2021 to Final Office Action dated Feb. 24, 2021”, 8 pgs.
“U.S. Appl. No. 16/428,210, Final Office Action dated Jul. 9, 2021”, 18 pgs.
“U.S. Appl. No. 16/943,706, Non Final Office Action dated Jul. 9, 2021”, 17 pgs.
“U.S. Appl. No. 16/943,804, Non Final Office Action dated Jul. 21, 2021”, 16 pgs.
“U.S. Appl. No. 16/808,101, Notice of Allowance dated Jul. 27, 2021”, 16 pgs.
“U.S. Appl. No. 16/808,101, Supplemental Notice of Allowability dated Aug. 9, 2021”, 3 pgs.
“U.S. Appl. No. 15/474,821, Final Office Action dated Aug. 19, 2021”, 18 pgs.
“U.S. Appl. No. 16/841,817, Response filed Aug. 26, 2021 to Non Final Office Action dated May 26, 2021”, 6 pgs.
“U.S. Appl. No. 16/841,817, Notice of Allowance dated Sep. 3, 2021”, 7 pgs.
“Korean Application Serial No. 10-2021-7004376, Response filed Aug. 12, 2021 to Notice of Preliminary Rejection dated May 31, 2021”, w/ English Translation, 47 pgs.
“European Application Serial No. 18747246.9, Summons to Attend Oral Proceedings dated Jun. 29, 2021”, 12 pgs.
“U.S. Appl. No. 16/841,817, Corrected Notice of Allowability dated Sep. 16, 2021”, 2 pgs.
“U.S. Appl. No. 17/112,676, Non Final Office Action dated Sep. 23, 2021”, 26 pgs.
“U.S. Appl. No. 16/943,804, Examiner Interview Summary dated Oct. 21, 2021”, 2 pgs.
“U.S. Appl. No. 15/474,821, Response filed Oct. 20, 2021 to Final Office Action dated Aug. 19, 2021”, 10 pgs.
“U.S. Appl. No. 16/428,210, Examiner Interview Summary dated Nov. 5, 2021”, 2 pgs.
“U.S. Appl. No. 16/943,706, Examiner Interview Summary dated Nov. 5, 2021”, 2 pgs.
“U.S. Appl. No. 16/943,804, Response filed Nov. 4, 2021 to Non Final Office Action dated Jul. 21, 2021”, 9 pgs.
“U.S. Appl. No. 16/943,706, Response filed Nov. 8, 2021 to Non Final Office Action dated Jul. 9, 2021”, 11 pgs.
“U.S. Appl. No. 16/428,210, Response filed Nov. 9, 2021 to Final Office Action dated Jul. 9, 2021”, 12 pgs.
“U.S. Appl. No. 17/031,310, Notice of Allowance dated Nov. 15, 2021”, 9 pgs.
Related Publications (1)
Number Date Country
20160085863 A1 Mar 2016 US