Map interface with icon for location-based messages

Information

  • Patent Grant
  • Patent Number
    12,127,068
  • Date Filed
    Thursday, July 30, 2020
  • Date Issued
    Tuesday, October 22, 2024
Abstract
A method comprising receiving, from a sending device, a location-based message. A location input is received, from the sending device, the location input being associated with the location-based message and defining a delivery area. A map interface is presented on a receiving device, together with an icon representative of the location-based message, the icon being presented responsive to the receiving device being located in the delivery area.
Description
TECHNICAL FIELD

This application relates generally to mobile messaging systems, and more specifically to methods, systems, and devices to enable location based messages to be generated by a user for receipt by a selected recipient when the selected recipient is within a location associated with a message.


BACKGROUND

The ever-increasing use of smartphones and other mobile devices with data connections and location determination capabilities is slowly changing the way people interact. Such mobile devices can provide users with nearly universal connections to a network. Such mobile devices also commonly include mechanisms, such as global positioning system (GPS) receivers and network-assisted location services, that allow the devices to determine location information. Embodiments described herein relate to the use of such mobile devices for location based messaging.





BRIEF DESCRIPTION OF THE DRAWINGS

Some embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings in which:



FIG. 1 is a block diagram depicting a system for location based messaging in accordance with certain embodiments.



FIG. 2 is a block diagram depicting a system for location based messaging in accordance with certain embodiments.



FIG. 3 illustrates a method for location based messaging according to certain embodiments.



FIG. 4 illustrates a method for location based messaging according to certain embodiments.



FIG. 5 illustrates aspects of certain example embodiments of a location based messaging system.



FIG. 6 illustrates one implementation of a mobile device that may be used with certain example embodiments.



FIG. 7 illustrates aspects of a user interface for location based messaging according to certain example embodiments.



FIG. 8 illustrates aspects of a user interface for location based messaging according to certain example embodiments.



FIG. 9 illustrates aspects of a user interface for location based messaging according to certain example embodiments.



FIG. 10 illustrates aspects of a user interface for location based messaging according to certain example embodiments.



FIG. 11 illustrates aspects of a user interface for location based messaging according to certain example embodiments.



FIG. 12 illustrates aspects of a user interface for location based messaging according to certain example embodiments.



FIG. 13 illustrates aspects of a user interface for location based messaging according to certain example embodiments.



FIG. 14 illustrates aspects of a user interface for location based messaging according to certain example embodiments.



FIG. 15 illustrates aspects of a user interface for location based messaging according to certain example embodiments.



FIG. 16 illustrates aspects of a user interface for location based messaging according to certain example embodiments.



FIG. 17 illustrates aspects of a user interface for location based messaging according to certain example embodiments.



FIG. 18 illustrates aspects of a user interface for location based messaging according to certain example embodiments.



FIG. 19 illustrates aspects of a user interface for location based messaging according to certain example embodiments.



FIG. 20 illustrates aspects of a user interface for location based messaging according to certain example embodiments.



FIG. 21 illustrates aspects of a user interface for location based messaging according to certain example embodiments.



FIG. 22 is a diagrammatic representation of a machine in the example form of a computer system within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed herein may be executed.





DETAILED DESCRIPTION

Example systems and methods are described for location based communications. In particular, certain example embodiments show communications generated by a user on a mobile device that are configured to be made available to a selected recipient when that recipient is within a target location. Certain embodiments provide systems for both sending location based content to a user-defined audience and receiving such content, with individual recipients identified by a sender using the sender's contacts.


For example, in one embodiment, a user may want to send a location based message to a roommate reminding the roommate to pick up pet food when the roommate is at the store. Prior to the message being generated, the user and the roommate that is to receive the message each separately register devices with separate location based messaging accounts. As part of each registration process, the user and the message recipient may each download an application for location based messaging to their respective registered devices. Additionally, the roommate enables the sender to target the recipient's device or devices with location based messages. The user is then able to generate a location based message from the user's device targeting the roommate's device.


As part of generation of a location based message, the user then inputs an identifier for the roommate to the user's device. The identifier is used to identify the roommate as the message recipient. The user also inputs information to select a delivery area as part of the location based message generation. Message content is then associated with the identifier and the delivery area. The message content may be a text message saying “remember the pet food,” an image such as a picture of an empty pet food bag, or a video of a pet in front of an empty food bowl. In various embodiments, the delivery of the message may be performed in a variety of different ways as described below. In one example embodiment, the roommate's device may notify the roommate that a message will be available when the roommate is at the store, without providing the message content until the roommate is within the delivery area. When the roommate's device uses location management features of the device to identify that the device is within the delivery area, the message content is made available for display on the roommate's device.
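The example above associates three pieces of data: a recipient identifier, message content, and a delivery area. A minimal sketch of that association in Python follows; the class names, field names, and coordinates are illustrative assumptions, not drawn from the described embodiments:

```python
from dataclasses import dataclass, field

@dataclass
class DeliveryArea:
    """Circular delivery area: center coordinates plus a radius in meters."""
    latitude: float
    longitude: float
    radius_m: float

@dataclass
class LocationBasedMessage:
    """Associates message content with a recipient identifier and delivery areas."""
    recipient_id: str                        # identifies the recipient (the roommate)
    content: str                             # text, or a reference to image/video content
    delivery_areas: list = field(default_factory=list)  # one or more DeliveryArea entries

# The reminder from the example, targeting a store location (coordinates hypothetical)
msg = LocationBasedMessage(
    recipient_id="roommate-device",
    content="remember the pet food",
    delivery_areas=[DeliveryArea(latitude=37.7749, longitude=-122.4194, radius_m=150.0)],
)
```

A single message may carry several `DeliveryArea` entries, matching the note below that one message can be associated with multiple delivery areas.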


A location based message provides the benefit of presenting the message when the recipient is in the most relevant context to receive it, rather than presenting the message immediately and relying on the recipient to remember it when the recipient is later in the appropriate context related to the message. There are many additional use cases for such location based messages in both personal and business settings. Non-limiting additional examples include leaving reminders for contacts at specific locations, reinforcing network contacts with communications when they are in relevant locations, delivery of location-specific content to interested individuals, reaching out to interested consumers in a given location, and many other communication types.



FIGS. 1 and 2 describe network communication systems in which location based messages may be implemented in accordance with various embodiments described herein. FIG. 1 illustrates a wireless network 100 in accordance with some embodiments. The wireless network 100 includes mobile devices 110, 115, and 120. The mobile devices 110, 115, and 120 may be, for example, laptop computers, smart phones, tablet computers, phablet devices, or any other network enabled wireless device. FIG. 22 provides additional details of one embodiment of a computing device which may be used to implement any of mobile devices 110, 115, and 120.


Wireless devices 110, 115, and 120 are configured to communicate with each other and with other devices via a wireless connection 125 to a communication node 130. Communication node 130 may be any access point, cellular node, or other source of wireless access to network 135. Wireless connection 125 may be enabled via any communication standard or protocol, such as IEEE 802.11, 3GPP LTE, Bluetooth, mesh networks, beacons, or any other such communication method.


Network 135 then provides access to cloud server computer 140. Network 135 may include a local area network, a wide area network, or the Internet, and may comprise any number of communication devices. In certain embodiments cloud server computer 140 may manage system accounts and identifiers used for location based communications. Cloud server computer 140 may also provide map information and network based location assistance that may be used as part of location based communications described herein.



FIG. 2 then describes one embodiment of application level communications between multiple mobile devices such as mobile devices 110, 115, and 120 illustrated as system 200. FIG. 2 includes location based messaging modules, illustrated as messaging module 210 and messaging module 220. Messaging module 210 is implemented on a mobile device such as mobile device 110, and messaging module 220 is implemented on a separate mobile device such as mobile device 120. The messaging modules are communicatively coupled with each other via network 235 to enable location based messages to be sent between messaging modules 210 and 220. Messaging modules are also communicatively coupled to cloud based server computer 240, which may manage aspects of location based communications including accounts, identifiers, message routing, or other such aspects of a location based communication.


Each messaging module manages both outbound and inbound location based communications. In the embodiment of FIG. 2, each messaging module includes a data service module, a data storage module, a location manager module, and a user interface module. As illustrated, messaging module 210 includes data service module 212, data storage module 214, user interface (UI) module 216, and location manager module 218. Similarly, messaging module 220 includes data service module 222, data storage module 224, UI module 226, and location manager module 228.


UI modules 216 and 226 manage the receipt of inputs from input devices such as input device 2212, UI navigation device 2214, or any sources of user input data generated by a user interacting with the mobile device on which the corresponding messaging module for each UI module 216 and 226 is operating. This includes user inputs selecting a message recipient identifier, user inputs identifying a geofence or boundary area which is used to delineate a delivery area, and input commands used to generate or select elements of a content message that is part of a location based communication, such as text inputs, image selection, video selection, generation of graphics or illustrations as part of a user input, or any other such data that may be part of a content message. UI modules 216 and 226 may also manage output of notifications on an output of a mobile device, including audio alerts, display of content messages when a mobile device is within a delivery area, display of interface options, mobile device vibrations, and any other such output elements described herein.


Location manager modules 218 and 228 manage location measurements of a mobile device based on default settings and user selections received from a corresponding UI module. Location manager modules 218 and 228 may thus initiate location measurements using global navigation satellite system (GNSS) elements and measurements of a mobile device, network based location measurements, or any other such location measurement or combination of location measurements. Additionally, in certain embodiments, location manager modules 218 and 228 may use geofence information to determine location states as inside or outside of a geofence or delivery area. A geofence as described herein refers to data that defines a spatial boundary. A geofence may define a delivery area, such that when a location manager module determines that a mobile device is within a geofence that defines the delivery area, a content message associated with that delivery area is made available on the mobile device. Location manager modules 218 and 228 may thus both determine a location of a mobile device and determine whether the location is within a geofence as part of receipt of a message for display on a receiving mobile device. Such geofence areas may be defined by user inputs received at a UI module, or may be stored as previously defined locations in data storage modules.
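The inside/outside determination that a location manager module performs against a circular geofence can be sketched with a great-circle distance test. This is one common way to implement such a check, not the only one contemplated by the embodiments:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two points given in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def inside_geofence(device_lat, device_lon, fence_lat, fence_lon, radius_m):
    """True when the device location falls within the circular geofence."""
    return haversine_m(device_lat, device_lon, fence_lat, fence_lon) <= radius_m
```

In practice a mobile platform's own region-monitoring APIs would typically supply this determination, but the underlying comparison is the same: distance from the geofence center against its radius.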


Data storage modules 214 and 224 manage local storage of any data that is associated with location based communications described herein. This may include any identifier information, message content information, location information, system setting information, or any other such information associated with any embodiment described herein.


Data service modules 212 and 222 manage transmission and receipt of information associated with location based communication on a corresponding mobile device. This includes communication of geofence or delivery area information, communication of message content information, and interaction with other mobile devices and/or cloud based server computers that are associated with a communication process. Data service modules 212 and 222 may additionally manage registration information, and communications with other users to establish contact settings that enable a user of a device to send and/or receive location based messages involving another user.



FIG. 3 then illustrates an example embodiment of a method for location based communication, shown as method 300. For the purposes of illustration, method 300 is described below with respect to the elements of systems 100 and 200 described above. In alternative embodiments, other combinations of systems or completely different systems may be used to implement such a method. Additionally, it will be clear that other methods of location based communication are also possible in other alternative embodiments.


Method 300 is performed by a mobile device such as mobile device 110, having one or more processors such as processor 2202 and memory such as main memory 2204 and static memory 2206. Method 300, which is a location based method for communication, begins with operation 302 involving receiving, at an input device of a first mobile device 110, a first recipient selection input identifying a second mobile device 120, where the first mobile device 110 is different than the second mobile device 120 and the second mobile device is associated with a first identifier value. The input device may be a touch screen interface, a microphone with a voice recognition processing element, a physical keyboard, or any other such input device that may be used with a mobile device. The first identifier value may be a previously generated identifier value as described in more detail below. The first mobile device 110 executes messaging module 210 and the second mobile device 120 executes messaging module 220. The first identifier may be stored in data storage module 214 and presented as part of UI module 216, or may be received at data service 212 from cloud based server computer 240.


Operation 304 then involves receiving at the input device of the first mobile device 110, a first content message associated with the first recipient selection input. This may involve receipt of a keyboard text input, a drawing input on a touch screen, an image or video input from a camera or a file of data storage 214, a sound input, or any other such input that may be used to generate the content message.


Operation 306 then involves receiving, at the input device of the first mobile device 110, a first geofence input associated with the first content message, wherein the first geofence input identifies a first delivery area. The first geofence input may be input as text describing latitude and longitude coordinates with an associated radius. In certain embodiments, location manager module 218 may be used to generate a geofence associated with a current or previous location of the mobile device. In other embodiments, data storage 214 may store data for a geofence that may be selected by a user input via UI module 216. In other embodiments, geofence information may be received via data service 212 from cloud based server computer 240. The first geofence input is associated with the first content message based on selections entered by a user via UI module 216 and the mobile device 110 input device or devices. Such inputs also select a geofence associated with the first geofence input as the first delivery area. In certain embodiments a single message may be associated with multiple delivery areas.


Operation 308 then involves initiating communication of the first content message from the first mobile device 110 to the second mobile device 120 via network 235, wherein the first content message is configured for presentation on the second mobile device 120 when the second mobile device 120 is within the first delivery area. In various alternative embodiments, this may initiate a communication directly from the first mobile device 110 to the second mobile device 120, or this may initiate a communication from the first mobile device 110 to a cloud server computer 140 or 240. In one potential embodiment, this may initiate a short message service (SMS) text message to the second mobile device 120 from the first mobile device 110, wherein the SMS text message indicates the availability of the first content message in the first delivery area. This SMS text message may then be used by the second mobile device 120 to download or otherwise access messaging module 220. Such an embodiment may particularly be used where the second mobile device 120 has not registered with a location based messaging service at the time that the content message is generated on the first mobile device 110.
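Where the communication of operation 308 is routed through a cloud server computer, the initiated message must carry at least the identifier value, the content message, and the delivery area. A sketch of such an upload payload follows; the patent does not define a wire format, so the JSON field names here are assumptions:

```python
import json

def build_drop_payload(sender_id, recipient_id, content, latitude, longitude, radius_m):
    """Serialize a location based 'drop' message for upload to a server.
    All field names are illustrative; no wire format is specified."""
    return json.dumps({
        "sender": sender_id,          # identifier for the sending device/account
        "recipient": recipient_id,    # first identifier value for the target device
        "content": content,           # first content message
        "delivery_area": {            # first delivery area from the geofence input
            "latitude": latitude,
            "longitude": longitude,
            "radius_m": radius_m,
        },
    })
```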


In addition to the above described selections for generating a location based message, in certain embodiments a message sender may mark a message for future delivery in the event the sender does not want the recipient to view the message until a certain date. Users can designate messages as recurring in order for the intended recipient to receive the content on an ongoing basis once arriving at a location. Finally, a user can stipulate whether a message is sent to recipients once the recipient gets to the desired location, or conversely, when the sender gets to the intended location (e.g. send a message to pre-selected individuals when the sender arrives at the airport, etcetera). Each of these may be presented as options by a messaging module during generation of a location based communication.


Further, in addition to user-generated content, users can purchase premium content to be delivered through the messaging module. This content can be in the form of themed messages (e.g. new map icons, custom drawings, etcetera) or pre-packaged location-based content (e.g. NYC tourist pack, US history pack, celebrity sightings pack, etcetera). Users may also download checklists that are accomplished upon reaching the designated locations (e.g. airports of the USA, natural wonders of the world, Michelin-starred restaurants, etcetera). The Application can also be used to facilitate peer-to-peer payments (e.g. buying a gift card for a recipient at a given location, purchasing a dessert for a contact at a given restaurant, transferring funds to a recipient once they reach a location, etcetera). Users may receive coupons and offers from businesses and service providers of potential interest once the user is in the vicinity of the participating location. Finally, the Application may have an API that will allow third parties to use the Application's technology to deliver content on a location-enabled basis.


As mentioned above, multiple different methods may be used to communicate and present a content message on a second recipient mobile device after the communication is initiated by a first mobile device. FIG. 4 describes one embodiment for such a communication. For the purposes of illustration, method 400 of FIG. 4 is described as a continuation of method 300 above, and is described in the context of FIGS. 1 and 2. Method 400 may be used with method 300, and may also be used with additional methods. Additionally, method 300 may be used with other different communication methods than method 400.


Method 400 begins with operation 402 involving a location based “drop” message being created on first mobile device 110. In one embodiment, operation 402 may be the same as method 300. In other embodiments, different methods may be used to generate a location based message.


Operation 404 then involves first mobile device 110 uploading the content message to a cloud server computer 240 using data service module 212. The cloud server computer 240 will receive, from the first mobile device 110, a location based message comprising the first identifier value, the first content message, and the first delivery area. This location based message is received in response to the communication initiated in operation 308.


Operation 406 then involves any recipient devices being notified of the available content message, including notifying the second mobile device 120, using a push message, of the location based message. In an alternative embodiment, rather than cloud server computer 240 pushing a notification of the message, the cloud server computer 240 may simply automatically push a copy of the content message with information about the first delivery area to second mobile device 120.


In operation 408, second mobile device 120 synchronizes an “unfound” message list with cloud server computer 240 using data service module 222. This synchronization may be performed in response to the push notification from operation 406, in response to a periodic message check, in response to establishment of a new network connection (for example, wireless connection 125), or based on any other such trigger to perform a message synchronization with cloud server computer 240. This synchronization verifies that the second mobile device 120 has information about the first delivery area, which may be used to determine when to move the content message that is part of the location based communication to a “found” list. In one embodiment, this may involve receiving, at the second mobile device, the push message, and synchronizing, by the second mobile device, an unfound message list with the cloud server computer to receive the location based message from the cloud server computer.


In operation 410, location manager 228 receives information about a location change of second mobile device 120 that is operating messaging module 220. Such a callback on location change may involve GNSS measurements, network based location measurements, device sensors 2228, or any other such means of determining a location change.


Operation 412 may further involve location manager 228 determining a location of second mobile device 120, and comparing that location with the first delivery area identified by a geofencing input of first mobile device 110 when the location based message was created. If the second mobile device 120 is outside the first delivery area, the process repeats operations 410 and 412 until the second mobile device 120 is within the first delivery area, or until the message expires or is terminated.


In operation 414, after the location manager module 228 determines that the second mobile device 120 is within the first delivery area, the messaging module 220 adds the content message to a local found list within data storage module 224. In various embodiments, a single mobile device may have multiple different “unfound” messages within a list at any given time. Operations 410 through 414 may check each delivery area associated with each unfound message as part of a single operation, and may repeat this process until every unfound message is terminated or found.
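Operations 410 through 414 amount to a loop that, on each location-change callback, tests every unfound message's delivery area and promotes matches to the found list. A minimal sketch, assuming circular delivery areas stored as (latitude, longitude, radius) tuples and messages kept as dicts:

```python
import math

def _distance_m(lat1, lon1, lat2, lon2):
    """Haversine great-circle distance in meters (points in degrees)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * 6_371_000 * math.asin(math.sqrt(a))

def on_location_change(device_lat, device_lon, unfound, found):
    """On a location-change callback, move every unfound message whose
    delivery area now contains the device into the found list, and return
    the newly found messages. Each message is a dict whose 'delivery_area'
    is a (latitude, longitude, radius_m) tuple."""
    newly_found = []
    for message in list(unfound):      # iterate over a copy while mutating the list
        lat, lon, radius_m = message["delivery_area"]
        if _distance_m(device_lat, device_lon, lat, lon) <= radius_m:
            unfound.remove(message)    # promote to the local found list (operation 414)
            found.append(message)
            newly_found.append(message)
    return newly_found
```

A real implementation would also honor message expiration or termination, and would report each promotion back to the server, as operation 416 describes.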


In operation 416, after messaging module 220 adds the “found” message to a local found list, a message status is updated at cloud server computer 240. In operation 418, a notification is provided on second mobile device 120 that a message has been found and the content message is available for output on the second mobile device 120.


In another example embodiment, the message is created on the user's device. Then, the message is uploaded to the cloud. Once the message is in the cloud, the intended recipient device(s) are notified of available data via the Apple™ (or similar) push notification service (“APNS”). After receiving the notification, the recipient device(s) sync their “unfound” message list with the data available in the cloud. Once the message is stored on a user's unfound list, the messaging module on the user's device will receive feedback as the user's location changes. As soon as the user gets within the intended delivery radius as determined by a location manager module of the messaging module, the user's device outputs a notification that the user “found” the message and a message status is updated in the cloud. In an alternative embodiment, the back-end configuration may be modified such that messages are “pulled” by the intended user's device rather than “pushed.” Additionally, in another embodiment, the message including the location data may be immediately provided to the intended recipient's mobile device, only becoming visible or available to the intended recipient device once it enters the specified location radius. Thus, in addition to hidden messages that are only received from a cloud server computer once a recipient enters the stipulated delivery radius of a given location, users may also choose to leave location-specific content on a recipient's map that appears irrespective of the recipient's location. In the event the sender leaves the message on this visible basis, the sender can further choose between showing the full message and marker or simply showing the marker, without the message, to encourage the recipient to move to the location in order to view the content. If the marker is shown without the content message, the content message may be stored locally, or may be stored on a cloud server computer without a copy of the content message being stored at the mobile device until the message is “found” by the mobile device entering the delivery area around the marker.



FIG. 5 then illustrates one example implementation of data structures in a system 500 that may be used by a server such as cloud server computer 140 or cloud server computer 240 as part of location based communication embodiments. Each element of system 500 represents part of a structure, and specific instances of such structures in a system may represent data within the structure format. System 500 includes installation 510 elements, user 530 elements, and location based message “drop” 550 elements, which may also be referred to as “drop” message elements or simply as location based message elements. System 500 also includes location based comment 570 elements, user friend 580 elements, user request 590 elements, and location based communication share 5000 elements. Each of the structure elements listed above includes entry elements as part of the structure element. Additionally, each of the various structure elements listed above may include shared entry elements as described below. Further, while system 500 includes specific structures, in various alternative implementations, other combinations of structure elements and entry elements may be combined in different ways, or may use other structure or entry elements not specifically described in system 500. In various embodiments, each of these elements may be associated with database records and communication modules, each of which may be implemented on a single cloud server computer such as cloud server computer 140. In other embodiments, each of these elements may be implemented in whole or in part on separate cloud server computers.


Installation 510 illustrates aspects of a location based messaging system related to placement of a messaging module such as messaging module 210 on a mobile device such as mobile device 110. Installation 510 includes a number of different value elements. Identifier (ID) 511 is associated with a unique identifier for each system element. Each instance of system element 510, 530, 550, 570, 580, 590, and 5000 may have an identifier assigned by a system to enable management of each record.


Additionally, each user may have an associated user element such as user 552, user 5007, user 574, user 583, and user 592 that enables the system to track which user is associated with various communications or data instances in a system. Such a user identifier may be assigned by a system upon user registration. For example, a cloud server computer such as cloud server computer 140 may assign a user identifier when a user interacts with the cloud server computer as part of a registration process. Each instance of a system element including installation 510, user 530, drop 550, drop share 5000, drop comments 570, user friends 580, and user request 590 which is generated by the user actions may include the user's assigned identifier whenever a data record using a particular structure element format is created. Such a user identifier is present in the various structure elements as owner 520, user 552, user 5007, user 574, user 583, and user 592. Such user identifiers may be user generated values created as part of registration, e-mail values, phone number values, or values associated with other systems or networks such as Facebook™, Twitter™, or other such social network identifiers uniquely associated with an individual user. Each of these user elements associates the corresponding system element with a particular user based on the value in the user element. The use of element IDs and user identifying elements enables system 500 to manage data by having a unique ID associated with each data record in a structure element format.


Additionally, each system element may include security or control features. The embodiment of system 500 includes access control list (ACL) value elements for each system element. These ACL value elements determine which sources have read and write access to the information in a particular data instance of any system element described herein. As shown, installation 510 includes ACL 525, user 530 includes ACL 548, drop 550 includes ACL 562, drop comments 570 includes ACL 577, user friends 580 includes ACL 588, user requests 590 includes ACL 598, and drop share 5000 includes ACL 5016.


Application identifier 512, application name 513, and application version 514 are identifier values associated with an installation of a messaging module, and may be used for system management and updates to a messaging module. Badge 515 is a value that may be used for security and integrity verification of a messaging module. Channels 516 may be used to store information about channels in a communication system that may be used by a mobile device on which a messaging module installation may operate. Device token 517 and device type 518 may be associated with descriptions or details of a mobile device on which a messaging module is installed. Installation 519 and parse version 521 may provide identifiers for a particular installation session or version that was used to place a messaging module on a mobile device, and may be used for system troubleshooting. Owner 520 may be an identifier associated particularly with a mobile device on which a messaging module is placed, and may further be used for situations in which multiple user accounts or installations may be placed on a single mobile device. Createdate 523 and updatedate 524 may store timing details associated with placement of a messaging module or update of a messaging module on a mobile device.
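The installation 510 entry elements described above can be summarized as a record structure. The sketch below mirrors the listed elements (noted in comments by their FIG. 5 numerals); the types and defaults are assumptions, since the description names the elements but not their encodings:

```python
from dataclasses import dataclass, field

@dataclass
class Installation:
    """Sketch of an installation 510 record; types/defaults are assumptions."""
    record_id: str                  # ID 511: unique identifier for the record
    app_identifier: str             # application identifier 512
    app_name: str                   # application name 513
    app_version: str                # application version 514
    badge: int = 0                  # badge 515: security/integrity verification value
    channels: list = field(default_factory=list)  # channels 516
    device_token: str = ""          # device token 517
    device_type: str = ""           # device type 518
    installation_id: str = ""       # installation 519: install session/version
    owner: str = ""                 # owner 520: identifier tied to the device
    parse_version: str = ""         # parse version 521
    created_at: str = ""            # createdate 523
    updated_at: str = ""            # updatedate 524
    acl: dict = field(default_factory=dict)       # ACL 525: read/write access control
```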


In various embodiments each of these element values may be stored as part of a system element structure either in a local data storage such as data storage 214 as part of messaging module 210, or they may be stored on a cloud server computer such as cloud server computer 240. In other embodiments, multiple copies of such elements may be stored in multiple locations.


User 530 illustrates aspects of a user record that may be generated upon the user registering with the system to be able to send location based communications as described above, for example, in methods 300 and 400. As shown in system 500, the example embodiment user 530 includes ID 531, which may be similar to ID 511. Username 532 may be a login username generated by a user to provide the user access to the system. This may be part of a login used to gain access to a messaging module such as messaging module 210. Password 533 may similarly be a password selected by a user on registration as part of security and identity protection for location-based communications. Additional elements of user 530 may include details associated with the user's identity, including authorization data 534, email verification 535, birthday 536, first name 537, last name 538, normalized first name 539, normalized last name 540, email 541, and phone 542. In other embodiments, additional information about a user, a user's contacts, and other profile information may be stored as part of a user 530 system element. Additionally, user 530 elements may store location preferences for a user. For example, as described in more detail below with respect to FIG. 15, a system record such as user 530 may include record elements for multiple locations. Each location stored in the user record may have a location 543 element used to identify the location, as well as associated latitude 544 and longitude 545 elements that identify coordinates for the physical location associated with the location 543 element. Created at element 546 and updated at element 547 may store information identifying a date and time when a user 530 element was initially generated and when any element value of user 530 was updated.
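A minimal sketch of such a user record with multiple saved locations follows, assuming a plain dictionary representation. Field names mirror the user 530 elements above; the helper functions themselves are hypothetical.

```python
from datetime import datetime, timezone

def new_user_record(user_id, username, email):
    """Build a user record (cf. user 530) at registration time."""
    now = datetime.now(timezone.utc).isoformat()
    return {
        "id": user_id,          # ID 531
        "username": username,   # username 532
        "email": email,         # email 541
        "locations": [],        # saved location preferences for this user
        "created_at": now,      # created at 546
        "updated_at": now,      # updated at 547
    }

def add_saved_location(user, name, lat, lon):
    """Store a named location (location 543) with coordinates
    (latitude 544, longitude 545) and refresh the update timestamp."""
    user["locations"].append({"location": name, "latitude": lat, "longitude": lon})
    user["updated_at"] = datetime.now(timezone.utc).isoformat()
```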


Drop 550 is a system element associated with a particular location based message. Each location based message sent via a communication system using system 500 includes a drop 550 element. An instance of drop 550 may, for example, be generated using method 300 described above. ID 551 elements comprise an identifier associated with the location based message, and user 552 elements include additional information identifying a user that generates the location based message associated with drop 550. Message 553 elements are element values that store text or other portions of a content message that are part of a location based message. Image data 554 may be a separate record for a portion of the content message that includes picture images or video images as part of a content message in a location based communication. Friend list 555 may include a recipient or a set of recipients targeted as recipients of a location based communication associated with drop 550. Radius 556, location 557, latitude 558, and longitude 559 are each aspects or details of a delivery area associated with a location based message as described herein. Thus, as described above, when a user generates a location based message, a messaging module may automatically generate a drop 550 element. This may automatically include generating a message ID value in ID 551 as well as a sender ID in user 552. The user will also identify a recipient from a friends list using the friend list 555. The user will input a message that will be stored in a message 553 element, an image data 554 element, or both. The user will also input a geofence input that generates information that may be stored in location 557, or in radius 556, latitude 558, and longitude 559. Each of these element values is received via an input device of a mobile device, and passed to a messaging module for storage as part of a location based message data record.
In certain embodiments, the geofencing input may include a combination of a selection of a previously known location as well as user input radius or coordinate information. All of this information may be aggregated in a drop 550 element as part of a location based communication. Additionally, in certain embodiments, drop 550 elements may store a date and time of creation of the message in created at 560. In certain embodiments, a location based message may be updated either before or after it is “found” by a recipient. When a location-based message is updated by a sender, an updated at 561 element may store data on the times of updates as well as details of changes to the location based message of a particular instance of drop 550.
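The aggregation of sender, recipients, content, and geofence details into a single drop record can be sketched as follows. This is an illustrative sketch with dictionary records; the field names mirror the drop 550 elements, but the create_drop function itself is an assumption.

```python
import uuid
from datetime import datetime, timezone

def create_drop(sender_id, friend_list, message,
                latitude, longitude, radius_m, image_data=None):
    """Assemble one location based message record (cf. drop 550)."""
    now = datetime.now(timezone.utc).isoformat()
    return {
        "id": str(uuid.uuid4()),     # ID 551, generated automatically
        "user": sender_id,           # user 552, the sending user
        "friend_list": friend_list,  # friend list 555, targeted recipients
        "message": message,          # message 553, text portion of the content
        "image_data": image_data,    # image data 554, optional picture/video data
        "radius": radius_m,          # radius 556, delivery-area boundary
        "latitude": latitude,        # latitude 558
        "longitude": longitude,      # longitude 559
        "created_at": now,          # created at 560
        "updated_at": now,          # updated at 561
    }
```

A messaging module would then pass such a record to a cloud server computer for delivery.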


Drop comments 570 is an example of an implementation of location based communications that uses records to store and share social comments about previously sent location based messages. Each instance of a drop comment 570 may have a unique ID 571. Each drop comment 570 may also include the details of a comment in comment 572, which may be similar to a content message described herein such as a text message, an image, or a video. Drop 573 may include an identifier which details which drop 550 is associated with a particular drop comment 570. For example, drop 573 may include an ID from an ID 551 element of the particular drop 550 location based message. Drop comments 570 includes a user 574 element which identifies a user that generates a drop comment 570. Drop comments 570 additionally includes created at 575 and updated at 576 elements for recording details of creation and update times and details.


User friends 580 includes an identifier 581 which may be a unique value for each instance of user friends 580. Friends user 582 may include a list of contacts for the user identified by a user 583 element, which identifies the user associated with the user friends 580 instance. User friends 580 elements may have additional details about the connection between users or about either user individually, including details about the activity state of either user in is active 584. User friends 580 elements may also include details about which user initiated the connection associated with user friends 580 in user initiated 585, as well as details about when the connection was established or updated in created at 586 and updated at 587.


User request 590 elements illustrate an implementation of system records in certain embodiments that are associated with user initiated requests for connections to other users. Examples of such requests are described in more detail below with respect to FIGS. 13 and 19. The embodiment described in system 500 of user requests 590 includes an ID 591 for each user requests instance, a user 592 element that identifies a user that initiates a connection request, a user request 593 element identifying the target or recipient of the user request, request email 594 element detailing connection information about users, a status 595 that may detail whether a request has been accepted or not, and created at 596 and updated at 597 elements detailing when a user request 590 was initiated or updated.


Drop share 5000 illustrates aspects of an embodiment where a recipient of a location-based message may “share” a found location based message. In such embodiments, when a location based message is presented to a recipient, the recipient may use an interface element to relay the location based message to additional users. Drop share 5000 is thus similar to drop 550, but is for a location based message that was previously generated, and is being shared by a recipient of the previously generated location-based message. ID 5001 is an identifier element for each instance of a drop share 5000 element. Created by 5002 provides information about the recipient that is sharing the received location based message. Drop 5003, drop status 5004, drop updated on 5005, drop name 5006, and other elements 5007 through 5015 include details about the previously created location-based message that is being shared by the recipient user as well as some details about the originally generating user, the sharing user, and details about the location based message status related to the users.


While system 500 illustrates one implementation of aspects of location based communications, in various other embodiments, additional combinations of elements, different structural elements, or different element values may be used as part of a system for location based communications.



FIG. 6 illustrates one example of a mobile device 600. Any mobile device described herein may be implemented as an embodiment of mobile device 600. For example, first mobile device 110 and second mobile device 120 may be similar to mobile device 600. The mobile device 600 can be used as an implementation of any mobile device described herein and can also be any mobile device, a mobile station (MS), a mobile wireless device, a mobile communication device, a tablet, a handset, or other type of mobile wireless computing device. The mobile device 600 can include one or more antennas 608 within housing 602 that are configured to communicate with a hotspot, base station (BS), or other type of wireless local area network (WLAN) or wireless wide area network (WWAN) access point. Mobile device 600 may thus communicate with a WAN such as the Internet via a base station transceiver such as communication node 130 described above. Mobile device 600 can be configured to communicate using multiple wireless communication standards, including standards selected from 3GPP LTE, WiMAX, High Speed Packet Access (HSPA), Bluetooth, and Wi-Fi standard definitions. The mobile device 600 can communicate using separate antennas for each wireless communication standard or shared antennas for multiple wireless communication standards. The mobile device 600 can communicate in a WLAN, a WPAN, and/or a WWAN.



FIG. 6 also shows a microphone 620 and one or more speakers 612 that can be used for audio input and output from the mobile device 600. A display screen 604 can be a liquid crystal display (LCD) screen, or other type of display screen such as an organic light emitting diode (OLED) display. The display screen 604 can be configured as a touch screen. The touch screen can use capacitive, resistive, or another type of touch screen technology. An application processor 614 and a graphics processor 618 can be coupled to internal memory 616 to provide processing and display capabilities. A non-volatile memory port 610 can also be used to provide data input/output options to a user. The non-volatile memory port 610 can also be used to expand the memory capabilities of the mobile device 600. A keyboard 606 can be integrated with the mobile device 600 or wirelessly connected to the mobile device 600 to provide additional user input. A virtual keyboard can also be provided using the touch screen 604. A camera 622 located on the front (display screen) side or the rear side of the mobile device 600 can also be integrated into the housing 602 of the mobile device 600. Any such elements may be used to generate information that may be communicated as content in a location based communication described herein.


While certain example devices are described herein, it will be understood that various embodiments can be used on all mobile technology platforms, as well as wearable devices, and may be implemented using any combination of hardware and/or software. Further, in addition to being implemented using a mobile technology platform and wearable devices, the various embodiments described herein are not limited as such. For example, certain embodiments may be hard-wired in an automobile or any other like implementation. Further, rather than being implemented on a mobile device, various embodiments may also be implemented using stand-alone hardware and/or software. Additionally, certain embodiments may be implemented as computer readable instructions stored within non-transitory computer readable media. This includes any non-transitory memory described herein.



FIGS. 7-21 then illustrate display outputs that are part of various aspects of certain embodiments of location based communications described herein. For illustrative purposes, certain embodiments are described with respect to the interfaces and images of these figures. It will be apparent that other embodiments are possible within the scope of the innovations presented herein.


In certain embodiments, when a user first downloads a messaging module such as messaging module 210 to a user's mobile device, the user will be required to set up an account. FIG. 7 depicts user interface 700 of an initial sign-up screen when a user creates an account according to one embodiment. Such account generation may be performed using a messaging module communicating with a cloud server computer. As shown, a user is prompted to enter their first and last name as entry 702 on an input device of a mobile device. Additional information may be entered as entries 706. Such entries may include current city, which may be configured to populate automatically, as well as birthday. Other such entries 706 may also be included, such as a profile picture. Additionally, in one embodiment account creation involves selection of login information 704 which will allow the user to access the account via a messaging module from any applicable device. In certain embodiments, certain elements of this information may also be used as an identifier for generating a location based communication. For example, a first or last name or an e-mail address may be used by senders to select a particular user as a recipient of a location based communication. In other embodiments, other values, such as a phone number, a third party account identifier, a user generated image or picture, a photograph, or a user generated name may be used as identifiers.


In certain embodiments, users have the ability to add saved addresses to their profiles for other users to view in order to optimize the process of sending and receiving messages. One embodiment of this may be storing a location within location 543 as described above for implementations using system 500. These saved locations can be any location such as a home, work, favorite bar, nearest airport, or any other such location, and users are able to designate and save locations they will be visiting in the future (e.g. vacation destinations, football games, etc.). Once a user adds an upcoming location in his or her profile via a mobile device UI such as UI 700, the user's profile will appear as updated at the top of his or her contacts' friend screens in order to allow efficient message creation with minimal clicks. In certain implementations, such information may be sent to data storage modules or location manager modules of messaging modules for use in generating geofence inputs to identify delivery areas for messages. Users can control which other users have access to saved addresses via system settings.


In addition to the inclusion of addresses, in certain embodiments, a registration process may allow a user to determine different types and sources of location based communications that the user wants to receive on their account. For example, a user may select individuals that they wish to receive location based messages from, either as part of registration or in response to a request from the other user, as described above and below. Additionally, a user may wish to subscribe to curated groups, or to location based message groups, where location based messages will be propagated to any user that has opted into the group.


For example, one embodiment may involve receiving, at a cloud server computer from a first mobile device, a first add group request, wherein the first add group request is associated with a location based message group; receiving, at the cloud server computer from a third mobile device, a location based message directed to the location based message group; and communicating, from the cloud server computer to the first mobile device, the location based message in response to a determination that the first mobile device is associated with the location based message group. Subsequent messages from third and fourth devices received at the server that are directed to the location based message group will be directed to the first device. Similarly, a mobile device may receive a user input including an identifier associated with a location based message group, communicate an add group request using the identifier to a cloud server computer, and then receive location based messages in response to communication of the add group request to the cloud server computer. Location based messages may be delivered from the location based server to the first mobile device as part of group messages in any manner described herein, including message delivery upon the device “finding” the message, location or delivery area information delivered without the content message until the device is within the delivery area, or full message delivery stored on the device prior to the device being within the delivery area, with the message not being available in a “found” message UI of a messaging module until the device is within the delivery area.
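The server-side portion of this group flow can be sketched as below, assuming an in-memory membership table standing in for cloud server state. The GroupRouter class and its method names are illustrative, not from the source.

```python
from collections import defaultdict

# Hypothetical sketch: devices that opt into a location based message group
# via an add group request receive subsequent messages directed to that group.

class GroupRouter:
    def __init__(self):
        self.members = defaultdict(set)  # group id -> set of opted-in device ids

    def add_group_request(self, device_id, group_id):
        """Handle an add group request from a mobile device."""
        self.members[group_id].add(device_id)

    def route(self, group_id, drop):
        """Direct a location based message to every device in the group."""
        return [(device_id, drop) for device_id in sorted(self.members[group_id])]
```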


Such group messaging may be integrated with any other embodiment described herein as a messaging option, both for location based message transmission, and for receipt of location based messages.


In one embodiment, registration may additionally involve user interface 800 of FIG. 8 where a user is asked to input their phone number 802, which enhances the user's ability to be found by other contacts using the system. If a user opts into submitting their phone number, the messaging module or other module involved in registration may auto-generate a text message the user needs to send in order to authenticate the phone number. User interface 900 of FIG. 9 illustrates an example of an auto-generated message 902 that may be used for authentication of a phone number by a system.


Once a user has signed up for an account, they will have the ability to leave messages and content using location based messages. FIG. 10 then illustrates one example interface 1000 including a user's map screen displaying a map 1008 and a center element 1006. Using interface 1000, a user can search for an address 1002 at which to leave a message. Alternatively, a user may use an input 1004 to center the map on his or her current location 1010. Additionally, a user can drag and pinch the map to manually select a location using map information within a messaging module or an associated map module. In one embodiment, interface 1000 may enable users to move between the various tabs by swiping the screen in the designated direction. Swipes from the map screen may, in certain embodiments, be on the bottom of the screen in order to avoid interfering with the map interface.


User interface 1100 of FIG. 11 illustrates a search interface. From this interface, a user has the ability to search for specific locations anywhere in the world using search 1102 or to select previously used locations 1104. Alternatively, a user is able to select addresses directly from the intended recipient's profile if the selected recipient has saved locations. Once a user has selected a location the user can mark the location by selecting the center element 1006. Holding the center element 1006 will either allow the user to quickly save the highlighted location or to define the message delivery radius with a visual indicator on the map as part of a geofence input.


In the embodiments described in FIGS. 7-20, once the center element 1006 is selected then user interface 1200, shown in FIG. 12, will be displayed. Here user inputs to an input device of a mobile device can modify the name 1202 of the delivery area identified by the selected geofence, add text or other data to the content message 1204 as well as include a picture, video or other custom content 1206. Users are able to input commands via UI 1200 to a mobile device to update or edit content messages and edit, draw and add text to pictures and videos as part of a location based communication using custom content 1206. In addition to user-generated content, various embodiments may include pre-populated pictures and content that users may include in messages.


The embodiment of user interface 1200 includes a boundary adjustment input 1208. In certain embodiments, after selecting a location as described above, users have the ability to select the delivery radius 1210 with a visual indicator for which the delivery of the message will be triggered once a selected recipient is in range.


User interface 1300 illustrates an embodiment where a messaging module is configured to allow users to select one or multiple recipients 1301 using a recipient selection checkbox 1304, in addition to having the option to leave messages for themselves with a self-selection checkbox 1302. In certain embodiments, this includes the ability for a sender of the message to stipulate whether to leave the message at every type of a certain location (e.g. every Starbucks identified by map data that the messaging module has access to, at each of the selected recipients' specific homes, etcetera). The sender is also able to state whether the message expires after a specified period of time either before or after opening. In certain embodiments, in the event an intended recipient is not registered with the system, the messaging module includes a configuration to allow a user to include an unregistered recipient on messages, at which point the unregistered recipient will receive either an SMS or email indicating he or she has received a message from the user at the selected location. A messaging module may also include a configuration option to leave messages to followers as well as to the broader public. In certain embodiments, public messages will only be received by users that have opted into receiving public content.
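The unregistered-recipient fallback described above can be sketched as a simple channel selection. The registered-user lookup and notification text are stubbed; all names here are assumptions for illustration only.

```python
# Hypothetical sketch: choose how to reach a recipient. Registered users get
# normal in-app location based delivery; unregistered users get an SMS or
# email indicating a message awaits them at the selected location.

def notify_recipient(recipient, registered_ids, sender_name, location_name):
    """Return (channel, text) describing how the recipient is reached."""
    text = f"{sender_name} left you a message at {location_name}"
    if recipient["id"] in registered_ids:
        return ("in_app", text)   # normal location based delivery
    if recipient.get("phone"):
        return ("sms", text)      # SMS fallback for unregistered recipients
    if recipient.get("email"):
        return ("email", text)    # email fallback
    raise ValueError("no way to reach recipient")
```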


User interface 1400 of FIG. 14 illustrates push notification 1402 which may be used in certain embodiments. As described above, in certain embodiments, once a user is in the delivery radius of a message that has been left for the user, the user will receive a push notification from a cloud server computer. Such a push notification 1402 may have data indicating both the sender of the content and the location, geofence, or delivery area associated with the content. Messages may also be left for users that are delivered on a time limited basis. If the sender sends a time-based message to a recipient, the recipient will receive a pin on his or her map wherever the user is when the content delivery time is met. The pin indicator in a recipient mobile device's map UI may appear when a sender selected time period begins, and may disappear when the sender selected time period ends. Push notification 1402 may include details on a time limit for the message in certain embodiments.
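The delivery-radius check implied above can be sketched using the haversine great-circle distance between the drop's coordinates and the device's reported position. The source does not specify a distance formula, so the haversine choice and the function names are assumptions.

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points (degrees)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def in_delivery_area(drop, device_lat, device_lon):
    """True when the device is inside the drop's delivery radius, which is
    the condition that triggers a push notification such as 1402."""
    return haversine_m(drop["latitude"], drop["longitude"],
                       device_lat, device_lon) <= drop["radius"]
```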


User interface 1500 of FIG. 15 illustrates one embodiment of a messaging module user interface that may be displayed when a user opens the messaging module on the user's device after receiving a notification such as push notification 1402. In such an embodiment, the user will be taken to the log screen where the user will be able to see new messages 1504. New messages may be presented in bold or emphasized formatting. A messaging module may utilize separate colors and symbols to differentiate between sent and received messages, in addition to indicating which messages have been responded to. Additionally, messages sent from a location where the sender is currently located, as opposed to a remote location, may appear as a different icon on the map. New actions may be presented on user interface 1500 by a badge number on a lower tab. A messaging module may be configurable to enable users to sort and filter messages by time, location, sender and receiver amongst other metrics. User interface 1500 includes an edit element 1502 to quickly remove log entries. In other embodiments, such an input may be performed by either swiping left as an input or selecting the edit element 1502.


User interface 1600 of FIG. 16 illustrates a screen where the user has the ability to see the full message and content message 1604. This includes text content, image content, and video content. In certain embodiments, an interface may also enable output of other content such as audio content. From here, a messaging module may be configured for a user to reply to the message using reply element 1606, which will send a notification to the sender as well as show up in the sender's log illustrated by user interface 1500. The reply element 1606 may be associated with a pre-populated reply. The messaging module may also be configured with the ability to customize the response at the user's choosing. Once the sender of a message receives a response, the messaging module of the sender's device may be configured with the ability to continue the conversation with the message recipient. For messages sent to more than one recipient, messaging modules may be configured such that users are able to choose whether they want to reply directly to the sender, or to all recipients. Messaging modules include interfaces such that users included in group messages can collaborate to modify and edit the original message content, and so that users have the ability to save messages to a favorites folder for easier access as well as the ability to save the message location to the user's saved locations area in the user's profile. As described above, this profile information may both be stored locally as well as shared with the messaging modules of other users.


User interface 1700 of FIG. 17 illustrates an embodiment where a messaging module enables access to messages through a map display screen showing a user location 1706. In certain embodiments, icons on such a user interface 1700 are color-coded based on whether they were sent, such as message 1704 or received such as message 1702. In certain embodiments, new messages contain blinking icons. Selecting a specific icon pops-up the sender and location of the message 1702 and selecting the message again will take the user to a message interface. On the map screen, users can send a pulse in a pre-selected range that allows them to see how many messages are waiting for them in the selected vicinity.


In certain embodiments, in order for a user to send and receive messages from specific individuals, the user must first add other users to the user's contact list. This may include embodiments where a user may initiate a message by sending an SMS text message or e-mail, but where the actual message may not be sent until the recipient of the SMS text message or e-mail joins the system and adds the sender to the recipient's contact list. User interface 1800 of FIG. 18 illustrates one implementation of a contacts list, where a user can both see the user's existing contacts 1804 as well as an add new contact interface 1802. A messaging module may be configured such that users can create groups within the friends screen in order to simplify the message sending process. In such an embodiment, responses to messages left to groups will be visible by all members of the group even if certain group members are not "friends" on the messaging module.


User interface 1900 of FIG. 19 then illustrates an interface providing the ability to search for other users by name or other unique identifier. The search may be input via search element 1902, and may show a list 1904 of contacts 1901 that are registered on the system so long as the user has the contact's phone number or email address saved in the user's phone contact list. In order to add contacts that are already registered with the location based messaging system, a user simply needs to select the add element 1906 next to the contact's name. From there, a push request will be sent to the selected contact, who has the option of either accepting or declining. For a user's phone contacts that are not on the system, the user has the option of inviting phone contacts to join the system using invite element 1910 from a list of contacts 1908.


User interface 2000 then illustrates one embodiment of an interface that may be presented once a phone contact has been invited to join the system. In such an embodiment, the user will be directed to user interface 2000 where an auto-generated message 2002 and invitation to join the system can be sent to contact 2004.


Certain embodiments may additionally include a settings interface. Settings interface 2100 of FIG. 21 includes messaging module configurations to enable users to modify account information on the settings tab. As part of settings interface 2100, a user can change the user's profile as well as customize the map display using places icon selection 2104, received communications display selection 2106, and sent communications display 2108 as part of map customization 2102. In certain embodiments, in addition to choosing hidden messages that are only received once a recipient enters the stipulated delivery radius of a given location, users may also select settings to leave location-specific content on a recipient's map that appears irrespective of the recipient's location. In such embodiments, in the event the sender leaves the message on this visible basis, the sender can choose between showing the full message and marker or simply showing the marker without the message to encourage the recipient to move to the location in order to view the content.


Thus, one embodiment may be a method for location based communications comprising receiving, at an input device of a first mobile device, a first recipient selection input identifying a second mobile device, where the first mobile device is different than the second mobile device and the second mobile device is associated with a first identifier value. Such a method may involve receiving, at the input device of the first mobile device, a first content message associated with the first recipient selection input; receiving, at the input device of the first mobile device, a first geofence input associated with the first content message, wherein the first geofence input identifies a first delivery area; and initiating communication of the first content message from the first mobile device to the second mobile device via a network, wherein the first content message is configured for presentation on the second mobile device when the second mobile device is within the first delivery area.


Additional implementations of such an embodiment may operate where the first identifier is a mobile telephone number or where the first identifier is an e-mail address.


Further implementations of such an embodiment may operate where initiating communication of the first content message from the first mobile device to the second mobile device comprises communicating a short message service (SMS) text message to the second mobile device, wherein the SMS text message indicates the availability of the first content message in the first delivery area.


Further implementations of such an embodiment may operate where receiving, at the input device of the first mobile device, the first geofence input associated with the first content message comprises: displaying a map interface on a touch screen of the first mobile device; receiving a location selection via the touch screen displaying the map interface, the location selection identifying a point within the first delivery area; receiving a delivery radius input on the touch screen displaying the map interface, wherein the delivery radius defines, at least in part, a boundary of the first delivery area; and displaying a visual indicator of the location selection and the delivery radius on the touch screen display.


Further implementations of such an embodiment may include receiving, at the first mobile device, a first visible message setting prior to initiating communication of the first content message; receiving, at the second mobile device, a notification identifying the point within the first delivery area and the boundary of the first delivery area; displaying, at the second mobile device, in response to the first visible message setting and a determination that the second mobile device is outside the first delivery area, a second map interface comprising a first delivery area representation without displaying the first content message; determining, by the second mobile device, that the second mobile device has moved within the first delivery area; and displaying the first content message in response to the determining that the second mobile device has moved within the first delivery area.


Further implementations of such an embodiment may operate by receiving, at the input device of the first mobile device, an expiration input identifying an expiration time period; wherein the expiration input is received at the first mobile device prior to a beginning of the expiration time period, and wherein the second map interface comprising the first delivery area representation is only displayed by the second mobile device during the expiration time period.
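The expiration behavior described above can be sketched as a simple time-window check, assuming (as a hypothetical representation) that the expiration input is stored as a duration measured from the time the message is sent:

```python
def delivery_area_visible(now, sent_at, expiration_period):
    """Return True only while the expiration time period is running.

    `sent_at` marks the beginning of the expiration time period and
    `expiration_period` is its duration; outside this window the second
    mobile device does not display the delivery-area representation.
    All values are timestamps/durations in the same unit (e.g. seconds).
    """
    return sent_at <= now < sent_at + expiration_period
```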


Further implementations of such an embodiment may operate where receiving, at the input device of the first mobile device, the first geofence input associated with the first content message comprises: receiving a location search input; displaying a list of locations in response to the location search input; and receiving a location selection input selecting a first location associated with the first delivery area from the list of locations.


Further implementations of such an embodiment may operate where receiving, at the input device of the first mobile device, the first geofence input associated with the first content message comprises: displaying, in response to the first recipient selection input, a list of saved profile locations associated with the first identifier value; and receiving a location selection input selecting a first location associated with the first delivery area from the list of saved profile locations.


Further implementations of such an embodiment may operate by receiving, at the input device of the first mobile device, a second geofence input associated with the first content message, the second geofence input comprising a first sender area; and determining that the first mobile device is within the first sender area; wherein initiating communication of the first content message from the first mobile device to the second mobile device is performed in response to the determining that the first mobile device is within the first sender area.


Further implementations of such an embodiment may operate by, prior to receipt of the first recipient selection input at the first mobile device: receiving, at a cloud server computer from the second mobile device, a first registration communication, wherein the first registration communication comprises the first identifier value; receiving, at the cloud server computer from the first mobile device, a first add contact request; communicating, from the cloud server computer to the second mobile device, an add contact request associated with the first mobile device; receiving, at the cloud server computer from the second mobile device, a first add contact acceptance; and communicating, from the cloud server computer to the first mobile device, an add contact acceptance associated with the second mobile device.
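The registration and add-contact exchange described above can be sketched as a minimal in-memory server. The class and method names are hypothetical illustrations; a production service would persist this state and authenticate the devices involved.

```python
class ContactServer:
    """Sketch of the cloud server's add-contact handshake, assuming an
    in-memory registry keyed by identifier value (e.g. phone number or e-mail)."""

    def __init__(self):
        self.registered = set()   # identifier values from registration communications
        self.pending = {}         # requester identifier -> target identifier
        self.contacts = set()     # accepted pairs, stored as frozensets

    def register(self, identifier):
        # Handle a registration communication carrying an identifier value.
        self.registered.add(identifier)

    def request_contact(self, requester, target):
        # Forward an add-contact request only between registered devices.
        if requester in self.registered and target in self.registered:
            self.pending[requester] = target
            return True
        return False

    def accept_contact(self, target, requester):
        # Record the acceptance and establish the contact relationship.
        if self.pending.get(requester) == target:
            del self.pending[requester]
            self.contacts.add(frozenset((requester, target)))
            return True
        return False
```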


Another embodiment may be a mobile device for location based communications. Such a mobile device may include a user input module configured to: receive a first recipient selection input identifying a second mobile device, wherein the mobile device is different from the second mobile device and the second mobile device is associated with a first identifier value; receive a first content message associated with the first recipient selection input; and receive a first geofence input associated with the first content message, wherein the first geofence input identifies a first delivery area. Such a mobile device may also include a messaging module communicatively coupled to the user input module, wherein the messaging module is configured with at least one processor and a memory of the mobile device, and the messaging module is configured to: access the first identifier value in response to the first recipient selection input; generate a location based message from the first identifier value, the first content message, and the first geofence input; and initiate communication of the first content message as part of the location based message from the mobile device to the second mobile device via a network, wherein the first content message is configured for presentation on the second mobile device when the second mobile device is within the first delivery area.


Further implementations of such an embodiment may include an antenna coupled to the at least one processor, with the antenna configured to communicate the location based message via the network; and a location manager module configured with the at least one processor and the memory, wherein the location manager module is configured to receive position communications and determine a position of the mobile device.


Further implementations of such an embodiment may include an output display coupled to the at least one processor and the memory, where the messaging module is further configured to: receive a notification via the antenna and the network that a second message associated with a second delivery area has been directed to the mobile device by a second mobile device; determine when the mobile device is within the second delivery area; and initiate display of the second message on the output display in response to a determination that the mobile device is within the second delivery area.


Another embodiment may be a non-transitory computer readable medium comprising computer readable instructions that, when executed by at least one processor of a mobile device, cause the mobile device to: receive, at an input device of the mobile device, a first recipient selection input identifying a second mobile device, wherein the mobile device is different from the second mobile device and the second mobile device is associated with a first identifier value; receive, at the input device, a first content message associated with the first recipient selection input; receive, at the input device, a first geofence input associated with the first content message, wherein the first geofence input identifies a first delivery area; and initiate communication of the first content message from the mobile device to the second mobile device via a network, wherein the first content message is configured for presentation on the second mobile device when the second mobile device is within the first delivery area.


Additional implementations of such an embodiment may operate where the instructions further cause the mobile device to: send a registration communication to a cloud server computer; receive a second location based message from a third mobile device via the cloud server computer; in response to a first visible message setting of the second location based message, store the second location based message without generating a user notification to notify a user of the mobile device of the second location based message; determine a second delivery area associated with the second location based message; determine a current location of the mobile device; and generate and output a second location based message notification in response to a determination that the current location of the mobile device is within the second delivery area.
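The receiving-side behavior described above — storing a location based message without notifying the user, then surfacing it once the device enters the delivery area — can be sketched as follows. The data structures and the pluggable distance function are illustrative assumptions rather than the claimed implementation.

```python
from dataclasses import dataclass, field

@dataclass
class HiddenMessage:
    content: str
    center: tuple      # (lat, lon) of the delivery-area center
    radius: float      # delivery radius, in the same units as distance_fn's output

@dataclass
class MessageStore:
    pending: list = field(default_factory=list)

    def store_hidden(self, msg):
        # Store without generating a user notification (visible message setting).
        self.pending.append(msg)

    def check_location(self, lat, lon, distance_fn):
        # Surface only messages whose delivery area now contains the device's
        # current location; the rest remain pending.
        ready = [m for m in self.pending
                 if distance_fn(lat, lon, *m.center) <= m.radius]
        self.pending = [m for m in self.pending if m not in ready]
        return ready
```

The device would call `check_location` whenever its position updates; `distance_fn` could be the great-circle distance or any other metric appropriate to the coordinate system in use.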



FIG. 22 is a block diagram illustrating an example computer system machine 2200 upon which any one or more of the methodologies discussed herein can be run. Computer system machine 2200 or elements of computer system machine 2200 can be embodied as a mobile device 110, 115, or 120, or any other computing platform or element described or referred to herein. In various alternative embodiments, the machine operates as a standalone device or can be connected (e.g., networked) to other machines. In a networked deployment, the machine can operate in the capacity of either a server or a client machine in server-client network environments, or it can act as a peer machine in peer-to-peer (or distributed) network environments. The machine can be a personal computer (PC) that may or may not be portable (e.g., a notebook or a netbook), a tablet, a set-top box (STB), a gaming console, a Personal Digital Assistant (PDA), a mobile telephone or smartphone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.


Example computer system machine 2200 includes a processor 2202 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 2204, and a static memory 2206, which communicate with each other via an interconnect 2208 (e.g., a link, a bus, etc.). The computer system machine 2200 can further include a video display unit 2210, an alphanumeric input device 2212 (e.g., a keyboard), and a user interface (UI) navigation device 2214 (e.g., a mouse). In one embodiment, the video display unit 2210, input device 2212, and UI navigation device 2214 are a touch screen display. The computer system machine 2200 can additionally include a storage device 2216 (e.g., a drive unit), a signal generation device 2218 (e.g., a speaker), an output controller 2232, a power management controller 2234, a network interface device 2220 (which can include or operably communicate with one or more antennas 2230, transceivers, or other wireless communications hardware), and one or more sensors 2228, such as a Global Positioning System (GPS) sensor, compass, location sensor, accelerometer, or other sensor.


The storage device 2216 includes a machine-readable medium 2222 on which is stored one or more sets of data structures and instructions 2224 (e.g., software) embodying or utilized by any one or more of the methodologies or functions described herein. The instructions 2224 can also reside, completely or at least partially, within the main memory 2204, static memory 2206, and/or within the processor 2202 during execution thereof by the computer system machine 2200, with the main memory 2204, static memory 2206, and the processor 2202 also constituting machine-readable media.


While the machine-readable medium 2222 is illustrated in an example embodiment to be a single medium, the term “machine-readable medium” can include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more instructions 2224. The term “machine-readable medium” shall also be taken to include any tangible medium that is capable of storing, encoding or carrying instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure or that is capable of storing, encoding or carrying data structures utilized by or associated with such instructions.


A machine-readable storage medium or other storage device can include any non-transitory mechanism for storing information in a form readable by a machine (e.g., a computer). In the case of program code executing on programmable computers, the computing device can include a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. One or more programs that can implement or utilize the various techniques described herein can use an application programming interface (API), reusable controls, and the like. Such programs can be implemented in a high level procedural or object oriented programming language to communicate with a computer system. However, the program(s) can be implemented in assembly or machine language, if desired. In any case, the language can be a compiled or interpreted language, and combined with hardware implementations.


The instructions 2224 can further be transmitted or received over a communications network 2226 using a transmission medium via the network interface device 2220 utilizing any one of a number of well-known transfer protocols (e.g., HTTP). The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.


Although embodiments have been described with reference to specific implementations, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the embodiments described herein. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense. The accompanying drawings that form a part hereof, show by way of illustration, and not of limitation, specific embodiments in which the subject matter may be practiced. The embodiments illustrated are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed herein. Other embodiments may be used and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. This detailed description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.


Such embodiments of the inventive subject matter may be referred to herein, individually and/or collectively merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed. Thus, although specific embodiments have been illustrated and described herein, it should be appreciated that any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.

Claims
  • 1. A method comprising: receiving, from a first computing device associated with a first account, a selection of a geographical area defined by spatial boundaries for the geographical area to create a custom user map configured to apply to trigger criterion for location-based messages; receiving location-specific information related to the custom user map; identifying a request from a second computing device and a third computing device to access the custom user map; determining that the second computing device has not registered for a second account with second computing device registration information and the third computing device has registered for a third account; determining that a location of the second and third computing device is within the geographical area; receiving business-related information directed to a plurality of recipients in the geographical area, wherein the business-related information directed to the plurality of recipients includes options or coupons for transaction activities with a business entity located within the geographical area; in response to determining that the second computing device has not registered for the second account and the third computing device has registered for the third account and that the location of the second and third computing devices are within the geographical area: enabling access to the custom user map and the location-specific information to the second computing device, the enabling of access enabling the second computing device to display on a map interface of the second computing device associated with a recipient, the custom user map and the location-specific information, the enabling of access to the custom user map and the location-specific information by initiating the first computing device to transmit the custom user map and the location-specific information to the second computing device that has not registered for the second account and by disabling the second computing device access to the options or
coupons for transaction activities with a business entity located within the geographical area; and enabling access to the custom user map and the location-specific information to the third computing device, the enabling of access enabling the third computing device to display on a map interface of the third computing device associated with a recipient, the custom user map and the location-specific information, the enabling of access to the custom user map and the location-specific information by initiating the first computing device to transmit the custom user map and the location-specific information to the third computing device that has registered for the third account and by enabling the third computing device access to the options or coupons for transaction activities with a business entity located within the geographical area.
  • 2. The method of claim 1 further comprising: responsive to selection of an icon on the map interface of the second computing device, transmitting the location-specific information to the second computing device, wherein the location-specific information comprises a sending identifier value identifying a sending user and sending message content.
  • 3. The method of claim 1, further comprising: receiving a selection of a second geographical area defined by spatial boundaries on the map to create a second custom user map; displaying the second custom user map with a second marker; notifying a group of followers to access the second custom user map; enabling access to the second custom user map to the group of followers; and sending to the group of followers a location-based communication related to the second custom user map.
  • 4. The method of claim 1, wherein the business-related information directed to the plurality of recipients includes a coupon for use at a business entity, wherein the plurality of recipients comprise followers of a user associated with the custom user map, the followers having registered a follow relationship with the user on a messaging service.
  • 5. The method of claim 1, further comprising: receiving an expiration time period for expiration of the location-specific information, wherein the expiration time period of the location-specific information is a determinable time after an opening of the location-specific information by a receiving user.
  • 6. The method of claim 1, comprising: receiving a pulse message indicating a predetermined range associated with a delivery area; determining a collection of location-specific information associated with the delivery area; and causing presentation, on the map interface of the receiving device, of a digital marker representative of the collection.
  • 7. The method of claim 1, wherein setting options that enable a receiving user to generate setting information related to the map interface are presented on the second computing device, the setting information specifying that icons representative of location-specific information that have been opened by the receiving user be displayed on the map interface on the second computing device.
  • 8. The method of claim 1, wherein setting options that enable a receiving user to generate setting information related to the map interface are displayed on the second computing device, the setting options specifying that icons representative of location-specific information that have been sent by the receiving user be displayed on the map interface.
  • 9. The method of claim 3, further comprising: presenting, on the map on the display of the first computing device, a radius visual indicator, the radius visual indicator presenting a plurality of radii relative to the geographic location, and enabling of one of the plurality of radii as a message radius associated with the location-specific information.
  • 10. The method of claim 1, wherein the location-specific information includes points of interest in the geographical area.
  • 11. The method of claim 1, further comprising: receiving location-based comments identifying businesses and service providers in the geographical area; and adding the location-based comments to the location-specific information.
  • 12. The method of claim 1, further comprising: receiving a comment from the second computing device, the comment related to the location-specific information; and adding the comment to the location-specific information.
  • 13. The method of claim 1, wherein the location-specific information includes points of interest in the geographical area.
  • 14. The method of claim 1, further comprising: receiving a request from a third computing device to receive the location-specific information including comments to the custom user map added by a creator of the custom user map; enabling access to the custom user map and the location-specific information including the comments to the third computing device, and including a third user of the third computing device in a group of followers of the creator of the custom user map; and sending a new location-based communication to the group of followers.
  • 15. The method of claim 1, wherein the geographical area is based on a current location of the first computing device.
  • 16. The method of claim 1, wherein the geographical area is based on a previous location of the first computing device, the previous location being the location of the first computing device at a previous time period.
  • 17. The method of claim 1, wherein enabling the second computing device to display the custom user map and the location-specific information is further in response to a lapse of a time frame established by the first computing device, the time frame indicating the enablement of the viewing of the custom user map and the location-specific information after the time frame has elapsed.
  • 18. The method of claim 1, wherein the method further comprises: displaying a user interface element, the user interface element configured to toggle between (1) enabling display of the custom user map and the location-specific information to the second computing device regardless of a current location of the second computing device and (2) enabling display of the custom user map and the location-specific information to the second computing device based on the current location of the second computing device.
  • 19. A computing apparatus comprising: a processor; and a memory storing instructions that, when executed by the processor, configure the apparatus to perform operations comprising: receiving, from a first computing device associated with a first account, a selection of a geographical area defined by spatial boundaries for the geographical area to create a custom user map configured to apply to trigger criterion for location-based messages; receiving location-specific information related to the custom user map; identifying a request from a second computing device and a third computing device to access the custom user map; determining that the second computing device has not registered for a second account with second computing device registration information and the third computing device has registered for a third account; determining that a location of the second and third computing device is within the geographical area; receiving business-related information directed to a plurality of recipients in the geographical area, wherein the business-related information directed to the plurality of recipients includes options or coupons for transaction activities with a business entity located within the geographical area; in response to determining that the second computing device has not registered for the second account and the third computing device has registered for the third account and that the location of the second and third computing devices are within the geographical area: enabling access to the custom user map and the location-specific information to the second computing device, the enabling of access enabling the second computing device to display on a map interface of the second computing device associated with a recipient, the custom user map and the location-specific information, the enabling of access to the custom user map and the location-specific information by initiating the first computing device to transmit the custom user map and the location-specific
information to the second computing device that has not registered for the second account and by disabling the second computing device access to the options or coupons for transaction activities with a business entity located within the geographical area; and enabling access to the custom user map and the location-specific information to the third computing device, the enabling of access enabling the third computing device to display on a map interface of the third computing device associated with a recipient, the custom user map and the location-specific information, the enabling of access to the custom user map and the location-specific information by initiating the first computing device to transmit the custom user map and the location-specific information to the third computing device that has registered for the third account and by enabling the third computing device access to the options or coupons for transaction activities with a business entity located within the geographical area.
  • 20. The computing apparatus of claim 19, the operations further comprising: presenting, on the map on the display of the first computing device, a radius visual indicator, the radius visual indicator presenting a plurality of radii relative to a geographic location, and enabling of one of the plurality of radii as a message radius associated with the location-specific information.
  • 21. A non-transitory computer-readable storage medium, the computer-readable storage medium including instructions that, when executed by a computer, cause the computer to perform operations comprising: receiving, from a first computing device associated with a first account, a selection of a geographical area defined by spatial boundaries for the geographical area to create a custom user map configured to apply to trigger criterion for location-based messages; receiving location-specific information related to the custom user map; identifying a request from a second computing device and a third computing device to access the custom user map; determining that the second computing device has not registered for a second account with second computing device registration information; determining that a location of the second and third computing device is within the geographical area; receiving business-related information directed to a plurality of recipients in the geographical area, wherein the business-related information directed to the plurality of recipients includes options or coupons for transaction activities with a business entity located within the geographical area; in response to determining that the second computing device has not registered for the second account and the third computing device has registered for the third account and that the location of the second and third computing devices are within the geographical area: enabling access to the custom user map and the location-specific information to the second computing device, the enabling of access enabling the second computing device to display on a map interface of the second computing device associated with a recipient, the custom user map and the location-specific information, the enabling of access to the custom user map and the location-specific information by initiating the first computing device to transmit the custom user map and the location-specific information to the second computing
device that has not registered for the second account and by disabling the second computing device access to the options or coupons for transaction activities with a business entity located within the geographical area; and enabling access to the custom user map and the location-specific information to the third computing device, the enabling of access enabling the third computing device to display on a map interface of the third computing device associated with a recipient, the custom user map and the location-specific information, the enabling of access to the custom user map and the location-specific information by initiating the first computing device to transmit the custom user map and the location-specific information to the third computing device that has registered for the third account and by enabling the third computing device access to the options or coupons for transaction activities with a business entity located within the geographical area.
CLAIM OF PRIORITY

This application is a continuation of and claims priority to U.S. patent application Ser. No. 16/428,210, filed on May 31, 2019, which is a continuation of and claims priority to U.S. patent application Ser. No. 16/105,687, filed on Aug. 20, 2018, which is a continuation of and claims priority to U.S. patent application Ser. No. 15/835,100, filed on Dec. 7, 2017, which is a continuation of and claims priority to U.S. patent application Ser. No. 15/486,111, filed on Apr. 12, 2017, which is a continuation of and claims priority to U.S. patent application Ser. No. 14/594,410, filed on Jan. 12, 2015, which claims priority to U.S. Provisional Ser. No. 61/926,324 filed on Jan. 12, 2014, each of which are hereby incorporated by reference in their entirety for all purposes.

7856360 Kramer et al. Dec 2010 B2
7991638 House et al. Aug 2011 B1
8001204 Burtner et al. Aug 2011 B2
8014762 Chmaytelli et al. Sep 2011 B2
8032586 Challenger et al. Oct 2011 B2
8082255 Carlson, Jr. et al. Dec 2011 B1
8090351 Klein Jan 2012 B2
8098904 Ioffe et al. Jan 2012 B2
8099109 Altman et al. Jan 2012 B2
8112716 Kobayashi Feb 2012 B2
8131597 Hudetz Mar 2012 B2
8135166 Rhoads Mar 2012 B2
8136028 Loeb et al. Mar 2012 B1
8146001 Reese Mar 2012 B1
8161115 Yamamoto Apr 2012 B2
8161417 Lee Apr 2012 B1
8195203 Tseng Jun 2012 B1
8199747 Rojas et al. Jun 2012 B2
8208943 Petersen Jun 2012 B2
8214443 Hamburg Jul 2012 B2
8234350 Gu et al. Jul 2012 B1
8276092 Narayanan et al. Sep 2012 B1
8279319 Date Oct 2012 B2
8280406 Ziskind et al. Oct 2012 B2
8285199 Hsu et al. Oct 2012 B2
8287380 Nguyen et al. Oct 2012 B2
8301159 Hamynen et al. Oct 2012 B2
8306922 Kunal et al. Nov 2012 B1
8312086 Velusamy et al. Nov 2012 B2
8312097 Siegel et al. Nov 2012 B1
8312380 Churchill et al. Nov 2012 B2
8326315 Phillips et al. Dec 2012 B2
8326327 Hymel et al. Dec 2012 B2
8332475 Rosen et al. Dec 2012 B2
8352546 Dollard Jan 2013 B1
8369866 Ashley, Jr. et al. Feb 2013 B2
8379130 Forutanpour et al. Feb 2013 B2
8385950 Wagner et al. Feb 2013 B1
8402097 Szeto Mar 2013 B2
8405773 Hayashi et al. Mar 2013 B2
8418067 Cheng et al. Apr 2013 B2
8423409 Rao Apr 2013 B2
8433296 Hardin et al. Apr 2013 B2
8471914 Sakiyama et al. Jun 2013 B2
8472935 Fujisaki Jun 2013 B1
8494481 Bacco et al. Jul 2013 B1
8510383 Hurley et al. Aug 2013 B2
8527345 Rothschild et al. Sep 2013 B2
8554627 Svendsen et al. Oct 2013 B2
8559980 Pujol Oct 2013 B2
8560612 Kilmer et al. Oct 2013 B2
8594680 Ledlie et al. Nov 2013 B2
8606792 Jackson et al. Dec 2013 B1
8613089 Holloway et al. Dec 2013 B1
8626187 Grosman Jan 2014 B2
8649803 Hamill Feb 2014 B1
8660358 Bergboer et al. Feb 2014 B1
8660369 Llano et al. Feb 2014 B2
8660793 Ngo et al. Feb 2014 B2
8682350 Altman et al. Mar 2014 B2
8688519 Lin et al. Apr 2014 B1
8718333 Wolf et al. May 2014 B2
8724622 Rojas May 2014 B2
8732168 Johnson May 2014 B2
8744523 Fan et al. Jun 2014 B2
8745132 Obradovich Jun 2014 B2
8751310 Van Datta et al. Jun 2014 B2
8761800 Kuwahara Jun 2014 B2
8762201 Noonan Jun 2014 B1
8768876 Shim et al. Jul 2014 B2
8775972 Spiegel Jul 2014 B2
8788680 Naik Jul 2014 B1
8790187 Walker et al. Jul 2014 B2
8797415 Arnold Aug 2014 B2
8798646 Wang et al. Aug 2014 B1
8856349 Jain et al. Oct 2014 B2
8874677 Rosen et al. Oct 2014 B2
8886227 Schmidt et al. Nov 2014 B2
8909679 Root et al. Dec 2014 B2
8909725 Sehn Dec 2014 B1
8923823 Wilde Dec 2014 B1
8972357 Shim et al. Mar 2015 B2
8977296 Briggs et al. Mar 2015 B1
8995433 Rojas Mar 2015 B2
9015285 Ebsen et al. Apr 2015 B1
9020745 Johnston et al. Apr 2015 B2
9040574 Wang et al. May 2015 B2
9043329 Patton et al. May 2015 B1
9055416 Rosen et al. Jun 2015 B2
9094137 Sehn et al. Jul 2015 B1
9100806 Rosen et al. Aug 2015 B2
9100807 Rosen et al. Aug 2015 B2
9113301 Spiegel et al. Aug 2015 B1
9119027 Sharon et al. Aug 2015 B2
9123074 Jacobs et al. Sep 2015 B2
9137700 Elefant et al. Sep 2015 B2
9143382 Bhogal et al. Sep 2015 B2
9143681 Ebsen et al. Sep 2015 B1
9152477 Campbell et al. Oct 2015 B1
9191776 Root et al. Nov 2015 B2
9204252 Root Dec 2015 B2
9225897 Sehn et al. Dec 2015 B1
9258459 Hartley Feb 2016 B2
9344606 Hartley et al. May 2016 B2
9385983 Sehn Jul 2016 B1
9396354 Murphy et al. Jul 2016 B1
9407712 Sehn Aug 2016 B1
9407816 Sehn Aug 2016 B1
9430783 Sehn Aug 2016 B1
9439041 Parvizi et al. Sep 2016 B2
9443227 Evans et al. Sep 2016 B2
9450907 Pridmore et al. Sep 2016 B2
9459778 Hogeg et al. Oct 2016 B2
9489661 Evans et al. Nov 2016 B2
9491134 Rosen et al. Nov 2016 B2
9532171 Allen et al. Dec 2016 B2
9537811 Allen et al. Jan 2017 B2
9544379 Gauglitz et al. Jan 2017 B2
9591445 Zises Mar 2017 B2
9628950 Noeth et al. Apr 2017 B1
9648581 Vaynblat et al. May 2017 B1
9672538 Vaynblat et al. Jun 2017 B1
9674660 Vaynblat et al. Jun 2017 B1
9706355 Cali et al. Jul 2017 B1
9710821 Heath Jul 2017 B2
9710969 Malamud Jul 2017 B2
9802121 Ackley et al. Oct 2017 B2
9843720 Ebsen et al. Dec 2017 B1
9854219 Sehn Dec 2017 B2
9866999 Noeth Jan 2018 B1
9894478 Deluca et al. Feb 2018 B1
9961535 Bucchieri May 2018 B2
10080102 Noeth et al. Sep 2018 B1
10176195 Patel Jan 2019 B2
10200813 Allen et al. Feb 2019 B1
10282753 Cheung May 2019 B2
10285002 Colonna et al. May 2019 B2
10285006 Colonna et al. May 2019 B2
10349209 Noeth et al. Jul 2019 B1
10395519 Colonna et al. Aug 2019 B2
10445777 Mcdevitt et al. Oct 2019 B2
10524087 Allen et al. Dec 2019 B1
10616239 Allen et al. Apr 2020 B2
10616476 Ebsen et al. Apr 2020 B1
10616727 Constantinides Apr 2020 B2
10659914 Allen et al. May 2020 B1
10694317 Cheung Jun 2020 B2
11216869 Allen et al. Jan 2022 B2
20020032771 Gledje Mar 2002 A1
20020047868 Miyazawa Apr 2002 A1
20020078456 Hudson et al. Jun 2002 A1
20020087631 Sharma Jul 2002 A1
20020097257 Miller et al. Jul 2002 A1
20020098850 Akhteruzzaman et al. Jul 2002 A1
20020122659 Mcgrath et al. Sep 2002 A1
20020123327 Vataja Sep 2002 A1
20020128047 Gates Sep 2002 A1
20020144154 Tomkow Oct 2002 A1
20030001846 Davis et al. Jan 2003 A1
20030016247 Lai et al. Jan 2003 A1
20030017823 Mager et al. Jan 2003 A1
20030020623 Cao et al. Jan 2003 A1
20030023874 Prokupets et al. Jan 2003 A1
20030037124 Yamaura et al. Feb 2003 A1
20030052925 Daimon et al. Mar 2003 A1
20030083929 Springer et al. May 2003 A1
20030101230 Benschoter et al. May 2003 A1
20030110503 Perkes Jun 2003 A1
20030126215 Udell Jul 2003 A1
20030148773 Spriestersbach et al. Aug 2003 A1
20030164856 Prager et al. Sep 2003 A1
20030229607 Zellweger et al. Dec 2003 A1
20040027371 Jaeger Feb 2004 A1
20040064429 Hirstius et al. Apr 2004 A1
20040078367 Anderson et al. Apr 2004 A1
20040091116 Staddon et al. May 2004 A1
20040111467 Willis Jun 2004 A1
20040158739 Wakai et al. Aug 2004 A1
20040185877 Asthana et al. Sep 2004 A1
20040189465 Capobianco et al. Sep 2004 A1
20040193488 Khoo et al. Sep 2004 A1
20040203959 Coombes Oct 2004 A1
20040215625 Svendsen et al. Oct 2004 A1
20040243531 Dean Dec 2004 A1
20040243688 Wugofski Dec 2004 A1
20040243704 Botelho et al. Dec 2004 A1
20050021444 Bauer et al. Jan 2005 A1
20050022211 Veselov et al. Jan 2005 A1
20050032527 Sheha et al. Feb 2005 A1
20050048989 Jung Mar 2005 A1
20050078804 Yomoda Apr 2005 A1
20050097176 Schatz et al. May 2005 A1
20050102180 Gailey et al. May 2005 A1
20050102381 Jiang et al. May 2005 A1
20050104976 Currans May 2005 A1
20050114783 Szeto May 2005 A1
20050119936 Buchanan et al. Jun 2005 A1
20050122405 Voss et al. Jun 2005 A1
20050193340 Amburgey et al. Sep 2005 A1
20050193345 Klassen et al. Sep 2005 A1
20050198128 Anderson Sep 2005 A1
20050223066 Buchheit et al. Oct 2005 A1
20050288954 McCarthy et al. Dec 2005 A1
20060026067 Nicholas et al. Feb 2006 A1
20060107297 Toyama et al. May 2006 A1
20060114338 Rothschild Jun 2006 A1
20060119882 Harris et al. Jun 2006 A1
20060136297 Willis et al. Jun 2006 A1
20060199612 Beyer, Jr. Sep 2006 A1
20060242239 Morishima et al. Oct 2006 A1
20060252438 Ansamaa et al. Nov 2006 A1
20060259359 Gogel Nov 2006 A1
20060265417 Amato et al. Nov 2006 A1
20060270419 Crowley et al. Nov 2006 A1
20060276184 Tretyak Dec 2006 A1
20060287878 Wadhwa et al. Dec 2006 A1
20070004426 Pfleging et al. Jan 2007 A1
20070032225 Konicek Feb 2007 A1
20070038715 Collins et al. Feb 2007 A1
20070040931 Nishizawa Feb 2007 A1
20070073517 Panje Mar 2007 A1
20070073823 Cohen et al. Mar 2007 A1
20070075898 Markhovsky et al. Apr 2007 A1
20070082707 Flynt et al. Apr 2007 A1
20070136228 Petersen Jun 2007 A1
20070192128 Celestini Aug 2007 A1
20070198340 Lucovsky et al. Aug 2007 A1
20070198495 Buron et al. Aug 2007 A1
20070208751 Cowan et al. Sep 2007 A1
20070210936 Nicholson Sep 2007 A1
20070214180 Crawford Sep 2007 A1
20070214216 Carrer et al. Sep 2007 A1
20070233556 Koningstein Oct 2007 A1
20070233801 Eren et al. Oct 2007 A1
20070233859 Zhao et al. Oct 2007 A1
20070243887 Bandhole et al. Oct 2007 A1
20070244750 Grannan et al. Oct 2007 A1
20070255456 Funayama Nov 2007 A1
20070268988 Hedayat et al. Nov 2007 A1
20070281690 Altman et al. Dec 2007 A1
20080012987 Hirata et al. Jan 2008 A1
20080022329 Glad Jan 2008 A1
20080025701 Ikeda Jan 2008 A1
20080032703 Krumm et al. Feb 2008 A1
20080033795 Wishnow et al. Feb 2008 A1
20080033930 Warren Feb 2008 A1
20080043041 Hedenstroem et al. Feb 2008 A2
20080049704 Witteman et al. Feb 2008 A1
20080062141 Chandhri Mar 2008 A1
20080076505 Ngyen et al. Mar 2008 A1
20080092233 Tian et al. Apr 2008 A1
20080094387 Chen Apr 2008 A1
20080104503 Beall et al. May 2008 A1
20080109844 Baldeschweiler et al. May 2008 A1
20080120409 Sun et al. May 2008 A1
20080133336 Altman Jun 2008 A1
20080147730 Lee et al. Jun 2008 A1
20080148150 Mall Jun 2008 A1
20080158230 Sharma et al. Jul 2008 A1
20080160956 Jackson et al. Jul 2008 A1
20080167106 Lutnick Jul 2008 A1
20080168033 Ott et al. Jul 2008 A1
20080168489 Schraga Jul 2008 A1
20080189177 Anderton et al. Aug 2008 A1
20080200189 Lagerstedt Aug 2008 A1
20080207176 Brackbill et al. Aug 2008 A1
20080208692 Garaventi et al. Aug 2008 A1
20080214210 Rasanen et al. Sep 2008 A1
20080222545 Lemay Sep 2008 A1
20080255976 Altberg et al. Oct 2008 A1
20080256446 Yamamoto Oct 2008 A1
20080256577 Funaki et al. Oct 2008 A1
20080266421 Takahata et al. Oct 2008 A1
20080270938 Carlson Oct 2008 A1
20080284587 Saigh et al. Nov 2008 A1
20080288338 Wiseman et al. Nov 2008 A1
20080306826 Kramer et al. Dec 2008 A1
20080313329 Wang et al. Dec 2008 A1
20080313346 Kujawa et al. Dec 2008 A1
20080318616 Chipalkatti et al. Dec 2008 A1
20090006191 Arankalle et al. Jan 2009 A1
20090006565 Velusamy et al. Jan 2009 A1
20090015703 Kim et al. Jan 2009 A1
20090019472 Cleland et al. Jan 2009 A1
20090024956 Kobayashi Jan 2009 A1
20090030774 Rothschild et al. Jan 2009 A1
20090030999 Gatzke et al. Jan 2009 A1
20090040324 Nonaka Feb 2009 A1
20090042588 Lottin et al. Feb 2009 A1
20090058822 Chaudhri Mar 2009 A1
20090079846 Chou Mar 2009 A1
20090089169 Gupta et al. Apr 2009 A1
20090089678 Sacco et al. Apr 2009 A1
20090089710 Wood et al. Apr 2009 A1
20090093261 Ziskind Apr 2009 A1
20090098859 Kamdar et al. Apr 2009 A1
20090132341 Klinger May 2009 A1
20090132453 Hangartner et al. May 2009 A1
20090132665 Thomsen et al. May 2009 A1
20090148045 Lee et al. Jun 2009 A1
20090153492 Popp Jun 2009 A1
20090157450 Athsani et al. Jun 2009 A1
20090157752 Gonzalez Jun 2009 A1
20090160970 Fredlund et al. Jun 2009 A1
20090163182 Gatti et al. Jun 2009 A1
20090177299 Van De Sluis Jul 2009 A1
20090177588 Marchese Jul 2009 A1
20090177730 Annamalai et al. Jul 2009 A1
20090192900 Collision Jul 2009 A1
20090197582 Lewis et al. Aug 2009 A1
20090197616 Lewis et al. Aug 2009 A1
20090199242 Johnson et al. Aug 2009 A1
20090215469 Fisher et al. Aug 2009 A1
20090232354 Camp, Jr. et al. Sep 2009 A1
20090234815 Boerries et al. Sep 2009 A1
20090239552 Churchill et al. Sep 2009 A1
20090249222 Schmidt et al. Oct 2009 A1
20090249244 Robinson et al. Oct 2009 A1
20090254840 Churchill et al. Oct 2009 A1
20090265647 Martin et al. Oct 2009 A1
20090288022 Almstrand et al. Nov 2009 A1
20090291672 Treves et al. Nov 2009 A1
20090292608 Polachek Nov 2009 A1
20090319607 Belz et al. Dec 2009 A1
20090327073 Li Dec 2009 A1
20100004003 Duggal et al. Jan 2010 A1
20100041378 Aceves et al. Feb 2010 A1
20100062794 Han Mar 2010 A1
20100082427 Burgener et al. Apr 2010 A1
20100082693 Hugg et al. Apr 2010 A1
20100100568 Papin et al. Apr 2010 A1
20100113065 Narayan et al. May 2010 A1
20100113066 Dingler May 2010 A1
20100115281 Camenisch et al. May 2010 A1
20100130233 Parker May 2010 A1
20100131880 Lee et al. May 2010 A1
20100131895 Wohlert May 2010 A1
20100153144 Miller et al. Jun 2010 A1
20100153197 Byon Jun 2010 A1
20100159944 Pascal et al. Jun 2010 A1
20100161658 Hamynen et al. Jun 2010 A1
20100161831 Haas et al. Jun 2010 A1
20100162149 Sheleheda et al. Jun 2010 A1
20100178939 Kang et al. Jul 2010 A1
20100183280 Beauregard et al. Jul 2010 A1
20100185552 Deluca et al. Jul 2010 A1
20100185665 Horn et al. Jul 2010 A1
20100191631 Weidmann Jul 2010 A1
20100197318 Petersen et al. Aug 2010 A1
20100197319 Petersen et al. Aug 2010 A1
20100198683 Aarabi Aug 2010 A1
20100198694 Muthukrishnan Aug 2010 A1
20100198826 Petersen et al. Aug 2010 A1
20100198828 Petersen et al. Aug 2010 A1
20100198862 Jennings et al. Aug 2010 A1
20100198870 Petersen et al. Aug 2010 A1
20100198917 Petersen et al. Aug 2010 A1
20100201482 Robertson et al. Aug 2010 A1
20100201536 Robertson et al. Aug 2010 A1
20100211431 Lutnick et al. Aug 2010 A1
20100214436 Kim et al. Aug 2010 A1
20100216491 Winkler et al. Aug 2010 A1
20100223128 Dukellis et al. Sep 2010 A1
20100223343 Bosan et al. Sep 2010 A1
20100250109 Johnston et al. Sep 2010 A1
20100255865 Karmarkar Oct 2010 A1
20100257196 Waters et al. Oct 2010 A1
20100259386 Holley et al. Oct 2010 A1
20100262461 Bohannon Oct 2010 A1
20100273509 Sweeney et al. Oct 2010 A1
20100281045 Dean Nov 2010 A1
20100306669 Della Pasqua Dec 2010 A1
20100323666 Cai Dec 2010 A1
20110004071 Faiola et al. Jan 2011 A1
20110010205 Richards Jan 2011 A1
20110029512 Folgner et al. Feb 2011 A1
20110040783 Uemichi et al. Feb 2011 A1
20110040804 Peirce et al. Feb 2011 A1
20110044563 Blose et al. Feb 2011 A1
20110050909 Ellenby et al. Mar 2011 A1
20110050915 Wang et al. Mar 2011 A1
20110064388 Brown et al. Mar 2011 A1
20110066743 Hurley et al. Mar 2011 A1
20110083101 Sharon et al. Apr 2011 A1
20110098061 Yoon Apr 2011 A1
20110102630 Rukes May 2011 A1
20110119133 Igelman et al. May 2011 A1
20110131633 Macaskill et al. Jun 2011 A1
20110137881 Cheng et al. Jun 2011 A1
20110145564 Moshir et al. Jun 2011 A1
20110159890 Fortescue et al. Jun 2011 A1
20110164163 Bilbrey et al. Jul 2011 A1
20110170838 Rosengart et al. Jul 2011 A1
20110197194 D'Angelo et al. Aug 2011 A1
20110202598 Evans et al. Aug 2011 A1
20110202968 Nurmi Aug 2011 A1
20110211534 Schmidt et al. Sep 2011 A1
20110213845 Logan et al. Sep 2011 A1
20110215966 Kim et al. Sep 2011 A1
20110225048 Nair Sep 2011 A1
20110238300 Schenken Sep 2011 A1
20110238762 Soni Sep 2011 A1
20110238763 Shin et al. Sep 2011 A1
20110251790 Liotopoulos et al. Oct 2011 A1
20110255736 Thompson et al. Oct 2011 A1
20110256881 Huang et al. Oct 2011 A1
20110258260 Isaacson Oct 2011 A1
20110269479 Ledlie Nov 2011 A1
20110273575 Lee Nov 2011 A1
20110282799 Huston Nov 2011 A1
20110283188 Farrenkopf Nov 2011 A1
20110294541 Zheng et al. Dec 2011 A1
20110295577 Ramachandran Dec 2011 A1
20110295677 Dhingra et al. Dec 2011 A1
20110295719 Chen Dec 2011 A1
20110314419 Dunn et al. Dec 2011 A1
20110320373 Lee et al. Dec 2011 A1
20120023522 Anderson et al. Jan 2012 A1
20120028659 Whitney et al. Feb 2012 A1
20120033718 Kauffman et al. Feb 2012 A1
20120036015 Sheikh Feb 2012 A1
20120036443 Ohmori et al. Feb 2012 A1
20120054001 Zivkovic et al. Mar 2012 A1
20120054797 Skog et al. Mar 2012 A1
20120059722 Rao Mar 2012 A1
20120062805 Candelore Mar 2012 A1
20120084731 Filman et al. Apr 2012 A1
20120084835 Thomas et al. Apr 2012 A1
20120099800 Lano et al. Apr 2012 A1
20120108293 Law et al. May 2012 A1
20120110096 Smarr et al. May 2012 A1
20120113143 Adhikari et al. May 2012 A1
20120113272 Hata May 2012 A1
20120123830 Svendsen et al. May 2012 A1
20120123867 Hannan May 2012 A1
20120123871 Svendsen et al. May 2012 A1
20120123875 Svendsen et al. May 2012 A1
20120124126 Alcazar et al. May 2012 A1
20120124176 Curtis et al. May 2012 A1
20120124458 Cruzada May 2012 A1
20120129548 Rao et al. May 2012 A1
20120131507 Sparandara et al. May 2012 A1
20120131512 Takeuchi et al. May 2012 A1
20120143760 Abulafia et al. Jun 2012 A1
20120143963 Kennberg et al. Jun 2012 A1
20120150978 Monaco Jun 2012 A1
20120165100 Lalancette et al. Jun 2012 A1
20120166468 Gupta et al. Jun 2012 A1
20120166971 Sachson et al. Jun 2012 A1
20120169855 Oh Jul 2012 A1
20120172062 Altman et al. Jul 2012 A1
20120173991 Roberts et al. Jul 2012 A1
20120176401 Hayward et al. Jul 2012 A1
20120184248 Speede Jul 2012 A1
20120197690 Agulnek Aug 2012 A1
20120197724 Kendall Aug 2012 A1
20120200743 Blanchflower et al. Aug 2012 A1
20120202525 Pettini Aug 2012 A1
20120208564 Clark et al. Aug 2012 A1
20120209892 Macaskill et al. Aug 2012 A1
20120209924 Evans et al. Aug 2012 A1
20120210244 De Francisco et al. Aug 2012 A1
20120212632 Mate et al. Aug 2012 A1
20120214568 Herrmann Aug 2012 A1
20120220264 Kawabata Aug 2012 A1
20120226748 Bosworth et al. Sep 2012 A1
20120233000 Fisher et al. Sep 2012 A1
20120236162 Imamura Sep 2012 A1
20120239761 Linner et al. Sep 2012 A1
20120250951 Chen Oct 2012 A1
20120252418 Kandekar et al. Oct 2012 A1
20120254325 Majeti et al. Oct 2012 A1
20120270563 Sayed Oct 2012 A1
20120271684 Shutter Oct 2012 A1
20120278387 Garcia et al. Nov 2012 A1
20120278692 Shi Nov 2012 A1
20120290637 Perantatos et al. Nov 2012 A1
20120299954 Wada et al. Nov 2012 A1
20120304052 Tanaka et al. Nov 2012 A1
20120304080 Wormald et al. Nov 2012 A1
20120307096 Ford et al. Dec 2012 A1
20120307112 Kunishige et al. Dec 2012 A1
20120319904 Lee et al. Dec 2012 A1
20120323933 He et al. Dec 2012 A1
20120324018 Metcalf et al. Dec 2012 A1
20130006759 Srivastava et al. Jan 2013 A1
20130006777 Krishnareddy et al. Jan 2013 A1
20130017802 Adibi et al. Jan 2013 A1
20130024757 Doll et al. Jan 2013 A1
20130035114 Holden Feb 2013 A1
20130036364 Johnson Feb 2013 A1
20130045753 Obermeyer et al. Feb 2013 A1
20130050260 Reitan Feb 2013 A1
20130055083 Fino Feb 2013 A1
20130057587 Leonard et al. Mar 2013 A1
20130059607 Herz et al. Mar 2013 A1
20130060690 Oskolkov et al. Mar 2013 A1
20130063369 Malhotra et al. Mar 2013 A1
20130067027 Song et al. Mar 2013 A1
20130071093 Hanks et al. Mar 2013 A1
20130080254 Thramann Mar 2013 A1
20130085790 Palmer et al. Apr 2013 A1
20130086072 Peng et al. Apr 2013 A1
20130090171 Holton et al. Apr 2013 A1
20130095857 Garcia et al. Apr 2013 A1
20130099977 Sheshadri et al. Apr 2013 A1
20130104053 Thornton et al. Apr 2013 A1
20130110885 Brundrett, III May 2013 A1
20130111514 Slavin et al. May 2013 A1
20130115872 Huang et al. May 2013 A1
20130122862 Horn et al. May 2013 A1
20130122929 Al-Mufti et al. May 2013 A1
20130124297 Hegeman et al. May 2013 A1
20130128059 Kristensson May 2013 A1
20130129252 Lauper May 2013 A1
20130132194 Rajaram May 2013 A1
20130132477 Bosworth et al. May 2013 A1
20130145286 Feng et al. Jun 2013 A1
20130157684 Moser Jun 2013 A1
20130159110 Rajaram et al. Jun 2013 A1
20130159919 Leydon Jun 2013 A1
20130169822 Zhu et al. Jul 2013 A1
20130173380 Akbari et al. Jul 2013 A1
20130173729 Starenky et al. Jul 2013 A1
20130182133 Tanabe Jul 2013 A1
20130185131 Sinha et al. Jul 2013 A1
20130191198 Carlson et al. Jul 2013 A1
20130194301 Robbins et al. Aug 2013 A1
20130198176 Kim Aug 2013 A1
20130203373 Edge Aug 2013 A1
20130217366 Kolodziej Aug 2013 A1
20130218965 Abrol et al. Aug 2013 A1
20130218968 Mcevilly et al. Aug 2013 A1
20130222323 Mckenzie Aug 2013 A1
20130226453 Trussel et al. Aug 2013 A1
20130227476 Frey Aug 2013 A1
20130231144 Daniel et al. Sep 2013 A1
20130232194 Knapp et al. Sep 2013 A1
20130254227 Shim et al. Sep 2013 A1
20130263031 Oshiro et al. Oct 2013 A1
20130265450 Barnes, Jr. Oct 2013 A1
20130267253 Case et al. Oct 2013 A1
20130275505 Gauglitz et al. Oct 2013 A1
20130290443 Collins et al. Oct 2013 A1
20130304527 Santos, III Nov 2013 A1
20130304646 De Geer Nov 2013 A1
20130311255 Cummins et al. Nov 2013 A1
20130325964 Berberat Dec 2013 A1
20130339489 Katara et al. Dec 2013 A1
20130344896 Kirmse et al. Dec 2013 A1
20130346205 Hogg et al. Dec 2013 A1
20130346869 Asver et al. Dec 2013 A1
20130346877 Borovoy et al. Dec 2013 A1
20140006129 Heath Jan 2014 A1
20140011538 Mulcahy et al. Jan 2014 A1
20140019246 Fraccaroli Jan 2014 A1
20140019264 Wachman et al. Jan 2014 A1
20140032682 Prado et al. Jan 2014 A1
20140043204 Basnayake et al. Feb 2014 A1
20140045530 Gordon et al. Feb 2014 A1
20140047016 Rao Feb 2014 A1
20140047045 Baldwin et al. Feb 2014 A1
20140047335 Lewis et al. Feb 2014 A1
20140049652 Moon et al. Feb 2014 A1
20140051436 Yan Feb 2014 A1
20140052485 Shidfar Feb 2014 A1
20140052633 Gandhi Feb 2014 A1
20140057648 Lyman et al. Feb 2014 A1
20140057660 Wager Feb 2014 A1
20140066106 Ngo et al. Mar 2014 A1
20140082651 Sharifi Mar 2014 A1
20140092130 Anderson et al. Apr 2014 A1
20140095296 Angell et al. Apr 2014 A1
20140096029 Schultz Apr 2014 A1
20140114565 Aziz et al. Apr 2014 A1
20140122658 Haeger et al. May 2014 A1
20140122787 Shalvi et al. May 2014 A1
20140129627 Baldwin et al. May 2014 A1
20140129953 Spiegel May 2014 A1
20140143143 Fasoli et al. May 2014 A1
20140149519 Redfern et al. May 2014 A1
20140155102 Cooper et al. Jun 2014 A1
20140156410 Wuersch et al. Jun 2014 A1
20140164118 Polachi Jun 2014 A1
20140164557 Keskitalo Jun 2014 A1
20140172542 Poncz et al. Jun 2014 A1
20140173003 Van Jun 2014 A1
20140173424 Hogeg et al. Jun 2014 A1
20140173457 Wang et al. Jun 2014 A1
20140173460 Kim Jun 2014 A1
20140180829 Umeda Jun 2014 A1
20140181193 Narasimhan et al. Jun 2014 A1
20140181934 Mayblum et al. Jun 2014 A1
20140189592 Benchenaa et al. Jul 2014 A1
20140207679 Cho Jul 2014 A1
20140214471 Schreiner, III Jul 2014 A1
20140222564 Kranendonk et al. Aug 2014 A1
20140240125 Burch Aug 2014 A1
20140244765 Smith et al. Aug 2014 A1
20140258405 Perkin Sep 2014 A1
20140265359 Cheng et al. Sep 2014 A1
20140266703 Dalley, Jr. et al. Sep 2014 A1
20140279040 Kuboyama Sep 2014 A1
20140279061 Elimeliah et al. Sep 2014 A1
20140279436 Dorsey et al. Sep 2014 A1
20140279540 Jackson Sep 2014 A1
20140280537 Pridmore et al. Sep 2014 A1
20140282068 Levkovitz et al. Sep 2014 A1
20140282096 Rubinstein et al. Sep 2014 A1
20140287779 O'Keefe et al. Sep 2014 A1
20140289833 Briceno Sep 2014 A1
20140306986 Gottesman et al. Oct 2014 A1
20140317302 Naik Oct 2014 A1
20140324627 Haver et al. Oct 2014 A1
20140324629 Jacobs Oct 2014 A1
20140325383 Brown et al. Oct 2014 A1
20140337123 Nuernberg et al. Nov 2014 A1
20150020086 Chen et al. Jan 2015 A1
20150046278 Pei et al. Feb 2015 A1
20150071619 Brough Mar 2015 A1
20150087263 Branscomb et al. Mar 2015 A1
20150088622 Ganschow et al. Mar 2015 A1
20150094083 Ngo Apr 2015 A1
20150094093 Pierce et al. Apr 2015 A1
20150095020 Leydon Apr 2015 A1
20150096042 Mizrachi Apr 2015 A1
20150116529 Wu et al. Apr 2015 A1
20150130178 Clements May 2015 A1
20150142753 Soon-Shiong May 2015 A1
20150154650 Umeda Jun 2015 A1
20150163629 Cheung Jun 2015 A1
20150169827 Laborde Jun 2015 A1
20150172534 Miyakawa et al. Jun 2015 A1
20150178260 Brunson Jun 2015 A1
20150186497 Patton et al. Jul 2015 A1
20150220492 Simeonov et al. Aug 2015 A1
20150222814 Li et al. Aug 2015 A1
20150237472 Alsina et al. Aug 2015 A1
20150237473 Koepke Aug 2015 A1
20150249710 Stefansson et al. Sep 2015 A1
20150254704 Kothe et al. Sep 2015 A1
20150261917 Smith Sep 2015 A1
20150262208 Bjontegard Sep 2015 A1
20150269624 Cheng et al. Sep 2015 A1
20150271779 Alavudin Sep 2015 A1
20150287072 Golden et al. Oct 2015 A1
20150294367 Oberbrunner et al. Oct 2015 A1
20150312184 Langholz et al. Oct 2015 A1
20150332310 Cui et al. Nov 2015 A1
20150332317 Cui et al. Nov 2015 A1
20150332325 Sharma et al. Nov 2015 A1
20150332329 Luo et al. Nov 2015 A1
20150334077 Feldman Nov 2015 A1
20150341747 Barrand et al. Nov 2015 A1
20150350136 Flynn, III et al. Dec 2015 A1
20150358806 Salqvist Dec 2015 A1
20150365795 Allen et al. Dec 2015 A1
20150378502 Hu et al. Dec 2015 A1
20160006927 Sehn Jan 2016 A1
20160014063 Hogeg et al. Jan 2016 A1
20160019592 Muttineni et al. Jan 2016 A1
20160034712 Patton et al. Feb 2016 A1
20160055250 Rush Feb 2016 A1
20160085773 Chang et al. Mar 2016 A1
20160085863 Allen et al. Mar 2016 A1
20160098742 Minicucci et al. Apr 2016 A1
20160099901 Allen et al. Apr 2016 A1
20160180887 Sehn Jun 2016 A1
20160182422 Sehn et al. Jun 2016 A1
20160182875 Sehn Jun 2016 A1
20160210657 Chittilappilly et al. Jul 2016 A1
20160239248 Sehn Aug 2016 A1
20160277419 Allen et al. Sep 2016 A1
20160292735 Kim Oct 2016 A1
20160321708 Sehn Nov 2016 A1
20170006094 Abou Mahmoud et al. Jan 2017 A1
20170026786 Barron et al. Jan 2017 A1
20170061308 Chen et al. Mar 2017 A1
20170078760 Christoph et al. Mar 2017 A1
20170091795 Mansour et al. Mar 2017 A1
20170127233 Liang et al. May 2017 A1
20170132647 Bostick et al. May 2017 A1
20170164161 Gupta et al. Jun 2017 A1
20170186038 Glover et al. Jun 2017 A1
20170222962 Gauglitz et al. Aug 2017 A1
20170230315 Zubas et al. Aug 2017 A1
20170287006 Azmoodeh et al. Oct 2017 A1
20170339521 Colonna et al. Nov 2017 A1
20170359686 Colonna et al. Dec 2017 A1
20180069817 Constantinides Mar 2018 A1
20180121957 Cornwall et al. May 2018 A1
20180189835 Deluca et al. Jul 2018 A1
20180225687 Ahmed et al. Aug 2018 A1
20190372991 Allen et al. Dec 2019 A1
20200204726 Ebsen et al. Jun 2020 A1
20200359167 Noeth et al. Nov 2020 A1
20220237691 Allen et al. Jul 2022 A1
Foreign Referenced Citations (41)
Number Date Country
2887596 Jul 2015 CA
102930107 Feb 2013 CN
103200238 Jul 2013 CN
105760466 Jul 2016 CN
107637099 Jan 2018 CN
110249359 Sep 2019 CN
2051480 Apr 2009 EP
2151797 Feb 2010 EP
2399928 Sep 2004 GB
19990073076 Oct 1999 KR
20010078417 Aug 2001 KR
20130091878 Aug 2013 KR
102035405 Oct 2019 KR
WO-1996024213 Aug 1996 WO
WO-1999063453 Dec 1999 WO
WO-2000058882 Oct 2000 WO
WO-2001029642 Apr 2001 WO
WO-2001050703 Jul 2001 WO
WO-2006118755 Nov 2006 WO
WO-2007092668 Aug 2007 WO
WO-2009043020 Apr 2009 WO
WO-2011040821 Apr 2011 WO
WO-2011119407 Sep 2011 WO
WO-2013008238 Jan 2013 WO
WO-2013045753 Apr 2013 WO
WO-2014006129 Jan 2014 WO
WO-2014068573 May 2014 WO
WO-2014115136 Jul 2014 WO
WO-2014172388 Oct 2014 WO
WO-2014194262 Dec 2014 WO
WO-2015192026 Dec 2015 WO
WO-2016044424 Mar 2016 WO
WO-2016054562 Apr 2016 WO
WO-2016065131 Apr 2016 WO
WO-2016100318 Jun 2016 WO
WO-2016100342 Jun 2016 WO
WO-2016123381 Aug 2016 WO
WO-2016149594 Sep 2016 WO
WO-2016179166 Nov 2016 WO
WO-2018144931 Aug 2018 WO
Non-Patent Literature Citations (356)
Entry
US 10,484,394 B2, 11/2019, Allen et al. (withdrawn)
US 10,542,011 B2, 01/2020, Allen et al. (withdrawn)
“A Whole New Story”, Snap, Inc., [Online] Retrieved from the Internet: <URL: https://www.snap.com/en-US/news/>, (2017), 13 pgs.
“Adding photos to your listing”, eBay, [Online] Retrieved from the Internet: <URL: http://pages.ebay.com/help/sell/pictures.html>, (accessed May 24, 2017), 4 pgs.
“U.S. Appl. No. 14/304,855, Corrected Notice of Allowance mailed Jun. 26, 2015”, 8 pgs.
“U.S. Appl. No. 14/304,855, Final Office Action mailed Feb. 18, 2015”, 10 pgs.
“U.S. Appl. No. 14/304,855, Non Final Office Action mailed Mar. 18, 2015”, 9 pgs.
“U.S. Appl. No. 14/304,855, Non Final Office Action mailed Oct. 22, 2014”, 11 pgs.
“U.S. Appl. No. 14/304,855, Notice of Allowance mailed Jun. 1, 2015”, 11 pgs.
“U.S. Appl. No. 14/304,855, Response filed Feb. 25, 2015 to Final Office Action mailed Feb. 18, 2015”, 5 pgs.
“U.S. Appl. No. 14/304,855, Response filed Apr. 1, 2015 to Non Final Office Action mailed Mar. 18, 2015”, 4 pgs.
“U.S. Appl. No. 14/304,855, Response filed Nov. 7, 2014 to Non Final Office Action mailed Oct. 22, 2014”, 5 pgs.
“U.S. Appl. No. 14/494,226, Appeal Brief filed Mar. 1, 2019 in response to Final Office Action mailed Jun. 1, 2018”, 29 pgs.
“U.S. Appl. No. 14/494,226, Examiner Interview Summary mailed Oct. 27, 2016”, 3 pgs.
“U.S. Appl. No. 14/494,226, Examiner Interview Summary mailed Dec. 20, 2017”, 2 pgs.
“U.S. Appl. No. 14/494,226, Final Office Action mailed Mar. 7, 2017”, 34 pgs.
“U.S. Appl. No. 14/494,226, Final Office Action mailed Jun. 1, 2018”, 33 pgs.
“U.S. Appl. No. 14/494,226, Non Final Office Action mailed Sep. 7, 2017”, 36 pgs.
“U.S. Appl. No. 14/494,226, Non Final Office Action mailed Sep. 12, 2016”, 32 pgs.
“U.S. Appl. No. 14/494,226, Response filed Jan. 8, 2018 to Non Final Office Action mailed Sep. 7, 2017”, 15 pgs.
“U.S. Appl. No. 14/494,226, Response filed Jul. 7, 2017 to Final Office Action mailed Mar. 7, 2017”, 13 pgs.
“U.S. Appl. No. 14/494,226, Response filed Dec. 12, 2016 to Non Final Office Action mailed Sep. 12, 2016”, 16 pgs.
“U.S. Appl. No. 14/505,478, Advisory Action mailed Apr. 14, 2015”, 3 pgs.
“U.S. Appl. No. 14/505,478, Corrected Notice of Allowance mailed May 18, 2016”, 2 pgs.
“U.S. Appl. No. 14/505,478, Corrected Notice of Allowance mailed Jul. 22, 2016”, 2 pgs.
“U.S. Appl. No. 14/505,478, Final Office Action mailed Mar. 17, 2015”, 16 pgs.
“U.S. Appl. No. 14/505,478, Non Final Office Action mailed Jan. 27, 2015”, 13 pgs.
“U.S. Appl. No. 14/505,478, Non Final Office Action mailed Sep. 4, 2015”, 19 pgs.
“U.S. Appl. No. 14/505,478, Notice of Allowance mailed Apr. 28, 2016”, 11 pgs.
“U.S. Appl. No. 14/505,478, Notice of Allowance mailed Aug. 26, 2016”, 11 pgs.
“U.S. Appl. No. 14/505,478, Response filed Jan. 30, 2015 to Non Final Office Action mailed Jan. 27, 2015”, 10 pgs.
“U.S. Appl. No. 14/505,478, Response filed Mar. 4, 2016 to Non Final Office Action mailed Sep. 4, 2015”, 12 pgs.
“U.S. Appl. No. 14/505,478, Response filed Apr. 1, 2015 to Final Office Action mailed Mar. 17, 2015”, 6 pgs.
“U.S. Appl. No. 14/506,478, Response filed Aug. 17, 2015 to Advisory Action mailed Apr. 14, 2015”, 10 pgs.
“U.S. Appl. No. 14/523,728, Non Final Office Action mailed Dec. 12, 2014”, 10 pgs.
“U.S. Appl. No. 14/523,728, Notice of Allowance mailed Mar. 24, 2015”, 8 pgs.
“U.S. Appl. No. 14/523,728, Notice of Allowance mailed Apr. 15, 2015”, 8 pgs.
“U.S. Appl. No. 14/523,728, Notice of Allowance mailed Jun. 5, 2015”, 8 pgs.
“U.S. Appl. No. 14/523,728, Response filed Aug. 25, 2014 to Non Final Office Action mailed Jan. 16, 2015”, 5 pgs.
“U.S. Appl. No. 14/529,064, Examiner Interview Summary mailed May 23, 2016”, 3 pgs.
“U.S. Appl. No. 14/529,064, Examiner Interview Summary mailed Nov. 17, 2016”, 3 pgs.
“U.S. Appl. No. 14/529,064, Final Office Action mailed Jan. 25, 2018”, 39 pgs.
“U.S. Appl. No. 14/529,064, Final Office Action mailed Aug. 11, 2015”, 23 pgs.
“U.S. Appl. No. 14/529,064, Final Office Action mailed Aug. 24, 2016”, 23 pgs.
“U.S. Appl. No. 14/529,064, Non Final Office Action mailed Mar. 12, 2015”, 20 pgs.
“U.S. Appl. No. 14/529,064, Non Final Office Action mailed Apr. 6, 2017”, 25 pgs.
“U.S. Appl. No. 14/529,064, Non Final Office Action mailed Apr. 18, 2016”, 21 pgs.
“U.S. Appl. No. 14/529,064, Non Final Office Action mailed Jul. 13, 2018”, 38 pgs.
“U.S. Appl. No. 14/529,064, Response filed Feb. 5, 2015 to Restriction Requirement mailed Feb. 2, 2015”, 6 pgs.
“U.S. Appl. No. 14/529,064, Response filed Mar. 26, 2015 to Non Final Office Action mailed Mar. 12, 2015”, 8 pgs.
“U.S. Appl. No. 14/529,064, Response filed May 25, 2018 to Final Office Action mailed Jan. 25, 2018”, 20 pgs.
“U.S. Appl. No. 14/529,064, Response filed Jul. 18, 2016 to Non Final Office Action mailed Apr. 18, 2016”, 20 pgs.
“U.S. Appl. No. 14/529,064, Response filed Sep. 6, 2017 to Non Final Office Action mailed Apr. 6, 2017”, 19 pgs.
“U.S. Appl. No. 14/529,064, Response filed Oct. 12, 2015 to Final Office Action mailed Aug. 11, 2015”, 19 pgs.
“U.S. Appl. No. 14/529,064, Response filed Dec. 21, 2016 to Final Office Action mailed Aug. 24, 2016”, 17 pgs.
“U.S. Appl. No. 14/529,064, Restriction Requirement mailed Feb. 2, 2015”, 5 pgs.
“U.S. Appl. No. 14/539,391, Notice of Allowance mailed Mar. 5, 2015”, 17 pgs.
“U.S. Appl. No. 14/548,590, Advisory Action mailed Apr. 19, 2018”, 2 pgs.
“U.S. Appl. No. 14/548,590, Advisory Action mailed Nov. 18, 2016”, 3 pgs.
“U.S. Appl. No. 14/548,590, Appeal Brief filed Apr. 20, 2018”, 28 pgs.
“U.S. Appl. No. 14/548,590, Final Office Action mailed Jul. 5, 2016”, 16 pgs.
“U.S. Appl. No. 14/548,590, Final Office Action mailed Jul. 18, 2017”, 20 pgs.
“U.S. Appl. No. 14/548,590, Final Office Action mailed Sep. 16, 2015”, 15 pgs.
“U.S. Appl. No. 14/548,590, Non Final Office Action mailed Jan. 9, 2017”, 14 pgs.
“U.S. Appl. No. 14/548,590, Non Final Office Action mailed Feb. 11, 2016”, 16 pgs.
“U.S. Appl. No. 14/548,590, Non Final Office Action mailed Apr. 20, 2015”, 14 pgs.
“U.S. Appl. No. 14/548,590, Response filed May 9, 2017 to Non Final Office Action mailed Jan. 9, 2017”, 17 pgs.
“U.S. Appl. No. 14/548,590, Response filed May 10, 2016 to Non Final Office Action mailed Feb. 11, 2016”, 14 pgs.
“U.S. Appl. No. 14/548,590, Response filed Nov. 7, 2016 to Final Office Action mailed Jul. 5, 2016”, 14 pgs.
“U.S. Appl. No. 14/548,590, Response filed Dec. 16, 2015 to Final Office Action mailed Sep. 16, 2015”, 13 pgs.
“U.S. Appl. No. 14/548,590, Response filed Jun. 16, 2015 to Non Final Office Action mailed Apr. 20, 2015”, 19 pgs.
“U.S. Appl. No. 14/578,258, Examiner Interview Summary mailed Nov. 25, 2015”, 3 pgs.
“U.S. Appl. No. 14/578,258, Non Final Office Action mailed Jun. 10, 2015”, 12 pgs.
“U.S. Appl. No. 14/578,258, Notice of Allowance mailed Feb. 26, 2016”, 5 pgs.
“U.S. Appl. No. 14/578,258, Response filed Dec. 10, 2015 to Non Final Office Action mailed Jun. 10, 2015”, 11 pgs.
“U.S. Appl. No. 14/578,271, Final Office Action mailed Dec. 3, 2015”, 15 pgs.
“U.S. Appl. No. 14/578,271, Non Final Office Action mailed Aug. 7, 2015”, 12 pgs.
“U.S. Appl. No. 14/578,271, Notice of Allowance mailed Dec. 7, 2016”, 7 pgs.
“U.S. Appl. No. 14/578,271, Response filed Feb. 9, 2016 to Final Office Action mailed Dec. 3, 2015”, 10 pgs.
“U.S. Appl. No. 14/578,271, Response filed Jun. 19, 2015 to Restriction Requirement mailed Apr. 23, 2015”, 6 pgs.
“U.S. Appl. No. 14/578,271, Response filed Oct. 28, 2015 to Non Final Office Action mailed Aug. 7, 2015”, 9 pgs.
“U.S. Appl. No. 14/578,271, Restriction Requirement mailed Apr. 23, 2015”, 8 pgs.
“U.S. Appl. No. 14/594,410, Non Final Office Action mailed Jan. 4, 2016”, 10 pgs.
“U.S. Appl. No. 14/594,410, Notice of Allowance mailed Aug. 2, 2016”, 5 pgs.
“U.S. Appl. No. 14/594,410, Notice of Allowance mailed Dec. 15, 2016”, 6 pgs.
“U.S. Appl. No. 14/594,410, Response filed Jul. 1, 2016 to Non Final Office Action mailed Jan. 4, 2016”, 10 pgs.
“U.S. Appl. No. 14/612,692, Examiner Interview Summary mailed Jan. 29, 2016”, 5 pgs.
“U.S. Appl. No. 14/612,692, Examiner Interview Summary mailed Jul. 6, 2016”, 4 pgs.
“U.S. Appl. No. 14/612,692, Examiner Interview Summary mailed Aug. 14, 2015”, 3 pgs.
“U.S. Appl. No. 14/612,692, Examiner Interview Summary mailed Sep. 8, 2016”, 3 pgs.
“U.S. Appl. No. 14/612,692, Final Office Action mailed Aug. 15, 2016”, 18 pgs.
“U.S. Appl. No. 14/612,692, Final Office Action mailed Nov. 23, 2015”, 15 pgs.
“U.S. Appl. No. 14/612,692, Non Final Office Action mailed Jan. 3, 2017”, 17 pgs.
“U.S. Appl. No. 14/612,692, Non Final Office Action mailed Mar. 28, 2016”, 15 pgs.
“U.S. Appl. No. 14/612,692, Non Final Office Action mailed Jul. 20, 2015”, 25 pgs.
“U.S. Appl. No. 14/612,692, Response filed Feb. 23, 2016 to Final Office Action mailed Nov. 23, 2015”, 10 pgs.
“U.S. Appl. No. 14/612,692, Response filed May 3, 2017 to Non Final Office Action mailed Jan. 3, 2017”, 18 pgs.
“U.S. Appl. No. 14/612,692, Response filed Nov. 14, 2016 to Final Office Action mailed Aug. 15, 2016”, 15 pgs.
“U.S. Appl. No. 14/612,692, Response filed Jun. 28, 2016 to Non Final Office Action mailed Mar. 28, 2016”, 14 pgs.
“U.S. Appl. No. 14/612,692, Response filed Oct. 19, 2015 to Non Final Office Action mailed Jul. 20, 2015”, 11 pgs.
“U.S. Appl. No. 14/634,417, Advisory Action mailed Mar. 14, 2017”, 3 pgs.
“U.S. Appl. No. 14/634,417, Final Office Action mailed Jan. 31, 2017”, 27 pgs.
“U.S. Appl. No. 14/634,417, Non Final Office Action mailed Aug. 30, 2016”, 23 pgs.
“U.S. Appl. No. 14/634,417, Response filed Mar. 2, 2017 to Final Office Action mailed Jan. 31, 2017”, 23 pgs.
“U.S. Appl. No. 14/634,417, Response filed Nov. 30, 2016 to Non Final Office Action mailed Aug. 30, 2016”, 18 pgs.
“U.S. Appl. No. 14/682,259, Notice of Allowance mailed Jul. 27, 2015”, 17 pgs.
“U.S. Appl. No. 14/704,212, Final Office Action mailed Jun. 17, 2016”, 12 pgs.
“U.S. Appl. No. 14/704,212, Non Final Office Action mailed Dec. 4, 2015”, 17 pgs.
“U.S. Appl. No. 14/704,212, Response filed Mar. 4, 2016 to Non Final Office Action mailed Dec. 4, 2015”, 11 pgs.
“U.S. Appl. No. 14/738,069, Non Final Office Action mailed Mar. 21, 2016”, 12 pgs.
“U.S. Appl. No. 14/738,069, Notice of Allowance mailed Aug. 17, 2016”, 6 pgs.
“U.S. Appl. No. 14/738,069, Response filed Jun. 10, 2016 to Non Final Office Action mailed Mar. 21, 2016”, 10 pgs.
“U.S. Appl. No. 14/808,283, Notice of Allowance mailed Apr. 12, 2016”, 9 pgs.
“U.S. Appl. No. 14/808,283, Notice of Allowance mailed Jul. 14, 2016”, 8 pgs.
“U.S. Appl. No. 14/808,283, Preliminary Amendment filed Jul. 24, 2015”, 8 pgs.
“U.S. Appl. No. 14/841,987, Notice of Allowance mailed Mar. 29, 2017”, 17 pgs.
“U.S. Appl. No. 14/841,987, Notice of Allowance mailed Aug. 7, 2017”, 8 pgs.
“U.S. Appl. No. 14/967,472, Final Office Action mailed Mar. 10, 2017”, 15 pgs.
“U.S. Appl. No. 14/967,472, Non Final Office Action mailed Sep. 8, 2016”, 11 pgs.
“U.S. Appl. No. 14/967,472, Preliminary Amendment filed Dec. 15, 2015”, 6 pgs.
“U.S. Appl. No. 14/967,472, Response filed Dec. 5, 2016 to Non Final Office Action mailed Sep. 8, 2016”, 11 pgs.
“U.S. Appl. No. 15/074,029, Advisory Action mailed Oct. 11, 2018”, 3 pgs.
“U.S. Appl. No. 15/074,029, Final Office Action mailed Jun. 28, 2018”, 22 pgs.
“U.S. Appl. No. 15/074,029, Non Final Office Action mailed Jan. 23, 2019”, 19 pgs.
“U.S. Appl. No. 15/074,029, Non Final Office Action mailed Nov. 30, 2017”, 16 pgs.
“U.S. Appl. No. 15/074,029, Response filed Feb. 28, 2018 to Non Final Office Action mailed Nov. 30, 2017”, 12 pgs.
“U.S. Appl. No. 15/074,029, Response filed Aug. 28, 2018 to Final Office Action mailed Jun. 28, 2018”, 21 pgs.
“U.S. Appl. No. 15/074,029, Response filed Apr. 23, 2019 to Non Final Office Action mailed Jan. 23, 2019”, 15 pgs.
“U.S. Appl. No. 15/137,608, Preliminary Amendment filed Apr. 26, 2016”, 6 pgs.
“U.S. Appl. No. 15/152,975, Non Final Office Action mailed Jan. 12, 2017”, 36 pgs.
“U.S. Appl. No. 15/152,975, Preliminary Amendment filed May 19, 2016”, 8 pgs.
“U.S. Appl. No. 15/208,460, Notice of Allowance mailed Feb. 27, 2017”, 8 pgs.
“U.S. Appl. No. 15/208,460, Notice of Allowance mailed Dec. 30, 2016”, 9 pgs.
“U.S. Appl. No. 15/208,460, Supplemental Preliminary Amendment filed Jul. 18, 2016”, 8 pgs.
“U.S. Appl. No. 15/224,312, Preliminary Amendment filed Feb. 1, 2017”, 11 pgs.
“U.S. Appl. No. 15/224,343, Preliminary Amendment filed Jan. 31, 2017”, 10 pgs.
“U.S. Appl. No. 15/224,355, Preliminary Amendment filed Apr. 3, 2017”, 12 pgs.
“U.S. Appl. No. 15/224,372, Preliminary Amendment filed May 5, 2017”, 10 pgs.
“U.S. Appl. No. 15/224,359, Preliminary Amendment filed Apr. 19, 2017”, 8 pgs.
“U.S. Appl. No. 15/298,806, Advisory Action mailed Jan. 29, 2018”, 4 pgs.
“U.S. Appl. No. 15/298,806, Examiner Interview Summary mailed Jan. 12, 2018”, 3 pgs.
“U.S. Appl. No. 15/298,806, Examiner Interview Summary mailed Aug. 13, 2018”, 3 pgs.
“U.S. Appl. No. 15/298,806, Final Office Action mailed Oct. 24, 2017”, 15 pgs.
“U.S. Appl. No. 15/298,806, Non Final Office Action mailed May 17, 2018”, 16 pgs.
“U.S. Appl. No. 15/298,806, Non Final Office Action mailed Jun. 12, 2017”, 26 pgs.
“U.S. Appl. No. 15/298,806, Notice of Allowance mailed Sep. 19, 2018”, 5 pgs.
“U.S. Appl. No. 15/298,806, Preliminary Amendment filed Oct. 21, 2016”, 8 pgs.
“U.S. Appl. No. 15/298,806, Response filed Jan. 9, 2018 to Final Office Action mailed Oct. 24, 2017”, 17 pgs.
“U.S. Appl. No. 15/298,806, Response filed Aug. 10, 2018 to Non Final Office Action mailed May 17, 2018”, 15 pgs.
“U.S. Appl. No. 15/298,806, Response filed Sep. 12, 2017 to Non Final Office Action mailed Jun. 12, 2017”, 12 pgs.
“U.S. Appl. No. 15/416,846, Preliminary Amendment filed Feb. 18, 2017”, 10 pgs.
“U.S. Appl. No. 15/424,184, Examiner Interview Summary mailed Jan. 10, 2019”, 3 pgs.
“U.S. Appl. No. 15/424,184, Final Office Action mailed Jan. 29, 2019”, 14 pgs.
“U.S. Appl. No. 15/424,184, Non Final Office Action mailed May 21, 2019”, 16 pgs.
“U.S. Appl. No. 15/424,184, Non Final Office Action mailed Nov. 30, 2018”, 22 pgs.
“U.S. Appl. No. 15/424,184, Response filed Apr. 29, 2019 to Final Office Action mailed Jan. 29, 2019”, 11 pgs.
“U.S. Appl. No. 15/424,184, Response filed Jan. 4, 2019 to Non Final Office Action mailed Nov. 30, 2018”, 17 pgs.
“U.S. Appl. No. 15/474,821, Non Final Office Action mailed Jan. 25, 2019”, 17 pgs.
“U.S. Appl. No. 15/474,821, Response filed Apr. 25, 2019 to Non Final Office Action mailed Jan. 25, 2019”, 16 pgs.
“U.S. Appl. No. 15/486,111, Corrected Notice of Allowance mailed Sep. 7, 2017”, 3 pgs.
“U.S. Appl. No. 15/486,111, Non Final Office Action mailed May 9, 2017”, 17 pgs.
“U.S. Appl. No. 15/486,111, Notice of Allowance mailed Aug. 30, 2017”, 5 pgs.
“U.S. Appl. No. 15/486,111, Response filed Aug. 9, 2017 to Non Final Office Action mailed May 9, 2017”, 11 pgs.
“U.S. Appl. No. 15/835,100, Non Final Office Action mailed Jan. 23, 2018”, 18 pgs.
“U.S. Appl. No. 15/835,100, Notice of Allowance mailed May 22, 2018”, 5 pgs.
“U.S. Appl. No. 15/835,100, Response filed Apr. 23, 2018 to Non Final Office Action mailed Jan. 23, 2018”, 11 pgs.
“U.S. Appl. No. 15/946,990, Final Office Action mailed May 9, 2019”, 11 pgs.
“U.S. Appl. No. 15/946,990, Non Final Office Action mailed Dec. 3, 2018”, 10 pgs.
“U.S. Appl. No. 15/946,990, Response filed Feb. 20, 2019 to Non Final Office Action mailed Dec. 3, 2018”, 11 pgs.
“U.S. Appl. No. 16/105,687, Non Final Office Action mailed Sep. 14, 2018”, 11 pgs.
“U.S. Appl. No. 16/105,687, Notice of Allowance mailed Feb. 25, 2019”, 8 pgs.
“U.S. Appl. No. 16/105,687, Response filed Dec. 14, 2018 to Non Final Office Action mailed Sep. 14, 2018”, 12 pgs.
“U.S. Appl. No. 16/428,210, Final Office Action mailed Jun. 29, 2020”, 16 pgs.
“U.S. Appl. No. 16/428,210, Non Final Office Action mailed Apr. 6, 2020”, 16 pgs.
“U.S. Appl. No. 16/428,210, Preliminary Amendment filed Aug. 8, 2019”, 8 pgs.
“U.S. Appl. No. 16/428,210, Response filed Jun. 3, 2020 to Non Final Office Action mailed Apr. 6, 2020”, 10 pgs.
“BlogStomp”, StompSoftware, [Online] Retrieved from the Internet: <URL: http://stompsoftware.com/blogstomp>, (accessed May 24, 2017), 12 pgs.
“Canadian Application Serial No. 2,894,332 Response filed Jan. 24, 2017 to Office Action mailed Aug. 16, 2016”, 15 pgs.
“Canadian Application Serial No. 2,894,332, Office Action mailed Aug. 16, 2016”, 4 pgs.
“Canadian Application Serial No. 2,910,158, Office Action mailed Dec. 15, 2016”, 5 pgs.
“Canadian Application Serial No. 2,910,158, Response filed Apr. 11, 2017 to Office Action mailed Dec. 15, 2016”, 21 pgs.
“Connecting To Your Customers In the Triangle and Beyond”, Newsobserver.com, (2013), 16 pgs.
“Cup Magic Starbucks Holiday Red Cups come to life with AR app”, Blast Radius, [Online] Retrieved from the Internet: <URL: https://web.archive.org/web/20160711202454/http://www.blastradius.com/work/cup-magic>, (2016), 7 pgs.
“Daily App: InstaPlace (iOS/Android): Give Pictures a Sense of Place”, TechPP, [Online] Retrieved from the Internet: <URL: http://techpp.com/2013/02/15/instaplace-app-review>, (2013), 13 pgs.
“Demystifying Location Data Accuracy”, Mobile Marketing Association, (Nov. 2015), 18 pgs.
“European Application Serial No. 16716090.2, Response filed May 21, 2018 to Communication pursuant to Rules 161(1) and 162 EPC mailed Nov. 10, 2017”, w/ English Claims, 89 pgs.
“Geofencing and the event industry”, Goodbarber Blog, [Online] Retrieved from the internet by the examiner on May 16, 2019: <URL: https://www.goodbarber.com/blog/geofencing-and-the-event-industry-a699/>, (Nov. 9, 2015), 7 pgs.
“How Snaps Are Stored And Deleted”, Snapchat, [Online] Retrieved from the Internet: <URL: https://www.snap.com/en-US/news/post/how-snaps-are-stored-and-deleted/>, (May 9, 2013), 2 pgs.
“IAB Platform Status Report: A Mobile Advertising Review”, Interactive Advertising Bureau, (Jul. 2008), 24 pgs.
“InstaPlace Photo App Tell The Whole Story”, [Online] Retrieved from the Internet: <URL: youtu.be/uF_gFkg1hBM>, (Nov. 8, 2013), 113 pgs., 1:02 min.
“International Application Serial No. PCT/EP2008/063682, International Search Report mailed Nov. 24, 2008”, 3 pgs.
“International Application Serial No. PCT/US2014/040346, International Search Report mailed Mar. 23, 2015”, 2 pgs.
“International Application Serial No. PCT/US2014/040346, Written Opinion mailed Mar. 23, 2015”, 6 pgs.
“International Application Serial No. PCT/US2015/035591, International Preliminary Report on Patentability mailed Dec. 22, 2016”, 7 pgs.
“International Application Serial No. PCT/US2015/035591, International Search Report mailed Aug. 11, 2015”, 5 pgs.
“International Application Serial No. PCT/US2015/035591, International Written Opinion mailed Aug. 11, 2015”, 5 pgs.
“International Application Serial No. PCT/US2015/037251, International Search Report mailed Sep. 29, 2015”, 2 pgs.
“International Application Serial No. PCT/US2015/050424, International Search Report mailed Dec. 4, 2015”, 2 pgs.
“International Application Serial No. PCT/US2015/050424, Written Opinion mailed Dec. 4, 2015”, 10 pgs.
“International Application Serial No. PCT/US2015/053811, International Preliminary Report on Patentability mailed Apr. 13, 2017”, 9 pgs.
“International Application Serial No. PCT/US2015/053811, International Search Report mailed Nov. 23, 2015”, 5 pgs.
“International Application Serial No. PCT/US2015/053811, Written Opinion mailed Nov. 23, 2015”, 8 pgs.
“International Application Serial No. PCT/US2015/056884, International Preliminary Report on Patentability mailed May 4, 2017”, 8 pgs.
“International Application Serial No. PCT/US2015/056884, International Search Report mailed Dec. 22, 2015”, 5 pgs.
“International Application Serial No. PCT/US2015/056884, Written Opinion mailed Dec. 22, 2015”, 6 pgs.
“International Application Serial No. PCT/US2015/065785, International Search Report mailed Jul. 21, 2016”, 5 pgs.
“International Application Serial No. PCT/US2015/065785, Written Opinion mailed Jul. 21, 2016”, 5 pgs.
“International Application Serial No. PCT/US2015/065821, International Search Report mailed Mar. 3, 2016”, 2 pgs.
“International Application Serial No. PCT/US2015/065821, Written Opinion mailed Mar. 3, 2016”, 3 pgs.
“International Application Serial No. PCT/US2016/023085, International Preliminary Report on Patentability mailed Sep. 28, 2017”, 8 pgs.
“International Application Serial No. PCT/US2016/023085, International Search Report mailed Jun. 17, 2016”, 5 pgs.
“International Application Serial No. PCT/US2016/023085, Written Opinion mailed Jun. 17, 2016”, 6 pgs.
“International Application Serial No. PCT/US2018/016723, International Search Report mailed Apr. 5, 2018”, 2 pgs.
“International Application Serial No. PCT/US2018/016723, Written Opinion mailed Apr. 5, 2018”, 17 pgs.
“Introducing Snapchat Stories”, [Online] Retrieved from the Internet: <URL: https://web.archive.org/web/20131026084921/https://www.youtube.com/watch?v=88Cu3yN-LIM>, (Oct. 3, 2013), 92 pgs.; 00:47 min.
“iVisit Mobile: Getting Started”, iVisit, [Online] Retrieved from the Internet: <URL: http://web.archive.org/web/20140830174355/http://ivisit.com/support_mobile>, (Dec. 4, 2013), 16 pgs.
“Korean Application Serial No. 10-2017-7029861, Notice of Preliminary Rejection mailed Jan. 17, 2019”, w/ English Translation, 9 pgs.
“Korean Application Serial No. 10-2017-7029861, Response filed Mar. 15, 2019 to Notice of Preliminary Rejection mailed Jan. 17, 2019”, w/ English Claims, 20 pgs.
“Macy's Believe-o-Magic”, [Online] Retrieved from the Internet: <URL: https://web.archive.org/web/20190422101854/https://www.youtube.com/watch?v=xvzRXy3J0Z0&feature=youtu.be>, (Nov. 7, 2011), 102 pgs.; 00:51 min.
“Macy's Introduces Augmented Reality Experience in Stores across Country as Part of Its 2011 Believe Campaign”, Business Wire, [Online] Retrieved from the Internet: <URL: https://www.businesswire.com/news/home/20111102006759/en/Macys-Introduces-Augmented-Reality-Experience-Stores-Country>, (Nov. 2, 2011), 6 pgs.
“Mobile Location User Cases and Case Studies”, Interactive Advertising Bureau, (Mar. 2014), 25 pgs.
“Pluraleyes by Red Giant”, © 2002-2015 Red Giant LLC, [Online] Retrieved from the Internet: <URL: http://www.redgiant.com/products/pluraleyes/>, (accessed Nov. 11, 2015), 5 pgs.
“Starbucks Cup Magic”, [Online] Retrieved from the Internet: <URL: https://www.youtube.com/watch?v=RWwQXi9RGOw>, (Nov. 8, 2011), 87 pgs.; 00:47 min.
“Starbucks Cup Magic for Valentine's Day”, [Online] Retrieved from the Internet: <URL: https://www.youtube.com/watch?v=8nvqOzjq10w>, (Feb. 6, 2012), 88 pgs.; 00:45 min.
“Starbucks Holiday Red Cups Come to Life, Signaling the Return of the Merriest Season”, Business Wire, [Online] Retrieved from the Internet: <URL: http://www.businesswire.com/news/home/20111115005744/en/2479513/Starbucks-Holiday-Red-Cups-Life-Signaling-Return>, (Nov. 15, 2011), 5 pgs.
“WIPO; International Preliminary Report; WO201776739”, (Sep. 10, 2018), 5 pgs.
“WIPO; Search Strategy; WO201776739”, (Dec. 10, 2017), 6 pgs.
Carr, Dale, “Mobile Ad Targeting: A Labor of Love”, Ad Week, [Online] Retrieved from the Internet on Feb. 11, 2019: <URL: https://www.adweek.com/digital/mobile-ad-targeting-a-labor-of-love/>, (Feb. 12, 2016), 7 pgs.
Carthy, Roi, “Dear All Photo Apps: Mobli Just Won Filters”, TechCrunch, [Online] Retrieved from the Internet: <URL: https://techcrunch.com/2011/09/08/mobli-filters>, (Sep. 8, 2011), 10 pgs.
Castelluccia, Claude, et al., “EphPub: Toward robust Ephemeral Publishing”, 19th IEEE International Conference on Network Protocols (ICNP), (Oct. 17, 2011), 18 pgs.
Clarke, Tangier, “Automatically syncing multiple clips and lots of audio like PluralEyes possible?”, [Online] Retrieved from the Internet: <URL: https://forums.creativecow.net/thread/344/20553>, (May 21, 2013), 8 pgs.
Janthong, Isaranu, “Instaplace ready on Android Google Play store”, Android App Review Thailand, [Online] Retrieved from the Internet: <URL: http://www.android-free-app-review.com/2013/01/instaplace-android-google-play-store.html>, (Jan. 23, 2013), 9 pgs.
Kumar, S, “Optimization Issues in Web and Mobile Advertising”, Chapter 2—Pricing Models in Web Advertising, SpringerBriefs in Operations Management, (2016), 6 pgs.
Leyden, John, “This SMS will self-destruct in 40 seconds”, [Online] Retrieved from the Internet: <URL: http://www.theregister.co.uk/2005/12/12/stealthtext/>, (Dec. 12, 2005), 1 pg.
MacLeod, Duncan, “Macys Believe-o-Magic App”, [Online] Retrieved from the Internet: <URL: http://theinspirationroom.com/daily/2011/macys-believe-o-magic-app>, (Nov. 14, 2011), 10 pgs.
MacLeod, Duncan, “Starbucks Cup Magic Lets Merry”, [Online] Retrieved from the Internet: <URL: http://theinspirationroom.com/daily/2011/starbucks-cup-magic>, (Nov. 12, 2011), 8 pgs.
Melanson, Mike, “This text message will self destruct in 60 seconds”, [Online] Retrieved from the Internet: <URL: http://readwrite.com/2011/02/11/this_text_message_will_self_destruct_in_60_seconds>, (Feb. 18, 2015), 4 pgs.
Naylor, Joseph, “Geo-Precise Targeting: It's time to Get off the Fence”, Be In The Know Blog, [Online] Retrieved from the internet by the examiner on May 16, 2019: <URL: http://blog.cmglocalsolutions.com/geo-precise-targeting-its-time-to-get-off-the-fence>, (May 15, 2015), 6 pgs.
Notopoulos, Katie, “A Guide To The New Snapchat Filters And Big Fonts”, [Online] Retrieved from the Internet: <URL: https://www.buzzfeed.com/katienotopoulos/a-guide-to-the-new-snapchat-filters-and-big-fonts?utm_term=.bkQ9qVZWe#.nv58YXpkV>, (Dec. 22, 2013), 13 pgs.
Palmer, Alex, “Geofencing at events: how to reach potential customers live and on-site”, Streetfight Mag, [Online] Retrieved from the internet by the examiner on May 16, 2019: <URL: http://streetfightmag.com/2015/08/20/geofencing-at-events-how-to-reach-potential-customers-live-and-on-site>, (Aug. 20, 2015), 6 pgs.
Panzarino, Matthew, “Snapchat Adds Filters, A Replay Function And For Whatever Reason, Time, Temperature And Speed Overlays”, TechCrunch, [Online] Retrieved from the Internet: <URL: https://techcrunch.com/2013/12/20/snapchat-adds-filters-new-font-and-for-some-reason-time-temperature-and-speed-overlays/>, (Dec. 20, 2013), 12 pgs.
Peterson, Lisa, et al., “Location-Based Advertising”, Peterson Mobility Solutions, (Dec. 2009), 39 pgs.
Quercia, Daniele, et al., “Mobile Phones and Outdoor Advertising: Measurable Advertising”, IEEE Persuasive Computing, (2011), 9 pgs.
Sawers, Paul, “Snapchat for IOS Lets You Send Photos to Friends and Set How long They're Visible For”, [Online] Retrieved from the Internet: <URL: https://thenextweb.com/apps/2012/05/07/snapchat-for-ios-lets-you-send-photos-to-friends-and-set-how-long-theyre-visible-for/>, (May 7, 2012), 5 pgs.
Shein, Esther, “Ephemeral Data”, Communications of the ACM, vol. 56, No. 9, (Sep. 2013), 3 pgs.
Simonite, Tom, “Mobile Data: A Gold Mine for Telcos”, MIT Technology Review, (May 27, 2010), 6 pgs.
Trice, Andrew, “My Favorite New Feature: Multi-Clip Sync in Premiere Pro CC”, [Online] Retrieved from the Internet: <URL: http://www.tricedesigns.com/2013/06/18/my-favorite-new-feature-multi-cam-synch-in-premiere-pro-cc/>, (Jun. 18, 2013), 5 pgs.
Tripathi, Rohit, “Watermark Images in PHP And Save File on Server”, [Online] Retrieved from the Internet: <URL: http://code.rohitink.com/2012/12/28/watermark-images-in-php-and-save-file-on-server>, (Dec. 28, 2012), 4 pgs.
Virgillito, Dan, “Facebook Introduces Mobile Geo-Fencing With Local Awareness Ads”, Adespresso, [Online] Retrieved from the internet by the examiner on May 16, 2019: <URL: https://adespresso.com/blog/facebook-local-business-ads-geo-fencing/>, (Oct. 8, 2014), 14 pgs.
“U.S. Appl. No. 16/428,210, Examiner Interview Summary mailed Aug. 28, 2020”, 3 pgs.
“U.S. Appl. No. 16/428,210, Response filed Aug. 27, 2020 to Final Office Action mailed Jun. 29, 2020”, 12 pgs.
“U.S. Appl. No. 16/943,804, Non Final Office Action mailed Sep. 8, 2020”, 14 pgs.
“U.S. Appl. No. 16/428,210, Advisory Action mailed Sep. 9, 2020”, 3 pgs.
“U.S. Appl. No. 16/428,210, Non Final Office Action mailed Nov. 27, 2020”, 17 pgs.
“U.S. Appl. No. 16/943,804, Response filed Feb. 8, 2021 to Non Final Office Action mailed Sep. 8, 2020”, 7 pgs.
“U.S. Appl. No. 16/943,804, Final Office Action mailed Feb. 24, 2021”, 15 pgs.
“U.S. Appl. No. 14/494,226, Appeal Decision mailed Feb. 26, 2021”, 8 pgs.
“U.S. Appl. No. 16/943,804, Examiner Interview Summary mailed Mar. 31, 2021”, 2 pgs.
“U.S. Appl. No. 16/428,210, Response filed Apr. 27, 2021 to Non Final Office Action mailed Nov. 27, 2020”, 11 pgs.
“U.S. Appl. No. 14/494,226, Notice of Allowance mailed Jun. 9, 2021”, 7 pgs.
“U.S. Appl. No. 16/943,804, Response filed Jun. 24, 2021 to Final Office Action mailed Feb. 24, 2021”, 8 pgs.
“U.S. Appl. No. 14/494,226, Corrected Notice of Allowability mailed Sep. 28, 2021”, 2 pgs.
“U.S. Appl. No. 14/494,226, Notice of Allowance mailed Aug. 25, 2021”, 5 pgs.
“U.S. Appl. No. 16/428,210, Examiner Interview Summary mailed Nov. 5, 2021”, 2 pgs.
“U.S. Appl. No. 16/428,210, Final Office Action mailed Jul. 9, 2021”, 18 pgs.
“U.S. Appl. No. 16/943,804, Examiner Interview Summary mailed Oct. 21, 2021”, 2 pgs.
“U.S. Appl. No. 16/943,804, Non Final Office Action mailed Jul. 21, 2021”, 16 pgs.
“U.S. Appl. No. 16/943,804, Response filed Nov. 4, 2021 to Non Final Office Action mailed Jul. 21, 2021”, 9 pgs.
“U.S. Appl. No. 16/428,210, Response filed Nov. 9, 2021 to Final Office Action mailed Jul. 9, 2021”, 12 pgs.
“U.S. Appl. No. 16/943,804, Final Office Action mailed Nov. 29, 2021”, 17 pgs.
“U.S. Appl. No. 16/428,210, Non Final Office Action mailed Nov. 29, 2021”, 14 pgs.
“U.S. Appl. No. 14/494,226, Corrected Notice of Allowability mailed Dec. 6, 2021”, 2 pgs.
“U.S. Appl. No. 16/428,210, Examiner Interview Summary mailed Feb. 15, 2022”, 2 pgs.
“U.S. Appl. No. 16/428,210, Examiner Interview Summary mailed Jun. 23, 2022”, 2 pgs.
“U.S. Appl. No. 16/428,210, Final Office Action mailed Apr. 1, 2022”, 16 pgs.
“U.S. Appl. No. 16/428,210, Response filed Feb. 28, 2022 to Non Final Office Action mailed Nov. 29, 2021”, 11 pgs.
“U.S. Appl. No. 16/943,804, Examiner Interview Summary mailed Feb. 15, 2022”, 2 pgs.
“U.S. Appl. No. 16/943,804, Examiner Interview Summary mailed Jun. 23, 2022”, 2 pgs.
“U.S. Appl. No. 16/943,804, Non Final Office Action mailed Apr. 1, 2022”, 17 pgs.
“U.S. Appl. No. 16/943,804, Response filed Feb. 28, 2022 to Final Office Action mailed Nov. 29, 2021”, 8 pgs.
“U.S. Appl. No. 16/428,210, Notice of Non-Compliant Amendment mailed Dec. 28, 2022”, 2 pgs.
“U.S. Appl. No. 16/428,210, Response filed Nov. 21, 2022 to Non Final Office Action mailed Sep. 9, 2022”, 8 pgs.
“U.S. Appl. No. 16/943,804, Final Office Action mailed Apr. 5, 2023”, 16 pgs.
“U.S. Appl. No. 16/943,804, Non Final Office Action mailed Dec. 28, 2022”, 18 pgs.
“U.S. Appl. No. 16/943,804, Response filed Mar. 22, 2023 to Non Final Office Action mailed Dec. 28, 2022”, 10 pgs.
“U.S. Appl. No. 16/428,210, Non Final Office Action mailed Sep. 9, 2022”, 15 pgs.
“U.S. Appl. No. 16/428,210, Response filed Jul. 29, 2022 to Final Office Action mailed Apr. 1, 2022”, 13 pgs.
“U.S. Appl. No. 16/943,804, Final Office Action mailed Aug. 12, 2022”, 17 pgs.
“U.S. Appl. No. 16/943,804, Response filed Jul. 29, 2022 to Non Final Office Action mailed Apr. 1, 2022”, 10 pgs.
“U.S. Appl. No. 16/943,804, Response filed Oct. 25, 2022 to Final Office Action mailed Aug. 12, 2022”, 10 pgs.
“U.S. Appl. No. 17/567,624, Preliminary Amendment filed Sep. 20, 2022”, 7 pgs.
Constantinides, Stephen, “Real time geo-social visualization platform”, U.S. Appl. No. 15/189,691 filed Jun. 22, 2016, 57 pgs.
Feldman, Douglas E, “Map-based remarks”, U.S. Appl. No. 61/994,591 filed May 16, 2014, 43 pgs.
Rush, David, “Real Time Relevancy Scoring System for Social Media Posts”, U.S. Appl. No. 62/038,837 filed Aug. 19, 2014, 7 pgs.
“U.S. Appl. No. 16/428,210, Final Office Action mailed Jun. 7, 2023”, 16 pgs.
“U.S. Appl. No. 16/428,210, Response filed May 5, 2023 to Notice of Non-Compliant Amendment mailed Dec. 28, 2022”, 12 pgs.
“U.S. Appl. No. 16/943,804, Examiner Interview Summary mailed Jul. 10, 2023”, 2 pgs.
“U.S. Appl. No. 16/943,804, Non Final Office Action mailed Aug. 18, 2023”, 16 pgs.
“U.S. Appl. No. 16/943,804, Response filed Jul. 3, 2023 to Final Office Action mailed Apr. 5, 2023”, 9 pgs.
“U.S. Appl. No. 16/428,210, Examiner Interview Summary mailed Jan. 26, 2024”, 2 pgs.
“U.S. Appl. No. 16/428,210, Examiner Interview Summary mailed Sep. 7, 2023”, 2 pgs.
“U.S. Appl. No. 16/428,210, Non Final Office Action mailed Oct. 26, 2023”, 14 pgs.
“U.S. Appl. No. 16/428,210, Response filed Jan. 26, 2024 to Non Final Office Action mailed Oct. 26, 2023”, 12 pgs.
“U.S. Appl. No. 16/428,210, Response filed Sep. 7, 2023 to Final Office Action mailed Jun. 7, 2023”, 9 pgs.
“U.S. Appl. No. 16/943,804, Examiner Interview Summary mailed Nov. 14, 2023”, 2 pgs.
“U.S. Appl. No. 16/943,804, Response filed Nov. 17, 2023 to Non Final Office Action mailed Aug. 18, 2023”, 10 pgs.
“U.S. Appl. No. 17/567,624, Non Final Office Action mailed Sep. 29, 2023”, 33 pgs.
“U.S. Appl. No. 17/567,624, Response filed Dec. 19, 2023 to Non Final Office Action mailed Sep. 29, 2023”, 10 pgs.
“U.S. Appl. No. 14/548,590, Appeal Decision mailed Mar. 26, 2020”, 13 pgs.
“U.S. Appl. No. 14/548,590, Notice of Allowance mailed Jun. 17, 2020”, 9 pgs.
“U.S. Appl. No. 15/074,029, Corrected Notice of Allowability mailed Feb. 5, 2020”, 4 pgs.
“U.S. Appl. No. 15/074,029, Corrected Notice of Allowability mailed Aug. 20, 2019”, 10 pgs.
“U.S. Appl. No. 15/074,029, Notice of Allowance mailed Jun. 19, 2019”, 14 pgs.
“U.S. Appl. No. 15/424,184, Advisory Action mailed May 26, 2020”, 6 pgs.
“U.S. Appl. No. 15/424,184, Examiner Interview Summary mailed Jul. 30, 2019”, 2 pgs.
“U.S. Appl. No. 15/424,184, Final Office Action mailed Mar. 9, 2020”, 19 pgs.
“U.S. Appl. No. 15/424,184, Final Office Action mailed Sep. 9, 2019”, 13 pgs.
“U.S. Appl. No. 15/424,184, Non Final Office Action mailed Jun. 29, 2020”, 19 pgs.
“U.S. Appl. No. 15/424,184, Non Final Office Action mailed Dec. 2, 2019”, 16 pgs.
“U.S. Appl. No. 15/424,184, Response filed Mar. 2, 2020 to Non Final Office Action mailed Dec. 2, 2019”, 11 pgs.
“U.S. Appl. No. 15/424,184, Response filed May 11, 2020 to Final Office Action mailed Mar. 9, 2020”, 14 pgs.
“U.S. Appl. No. 15/424,184, Response filed Aug. 21, 2019 to Non Final Office Action mailed May 21, 2019”, 12 pgs.
“U.S. Appl. No. 15/424,184, Response filed Nov. 11, 2019 to Final Office Action mailed Sep. 9, 2019”, 12 pgs.
“U.S. Appl. No. 15/474,821, Advisory Action mailed Dec. 19, 2019”, 3 pgs.
“U.S. Appl. No. 15/474,821, Final Office Action mailed Sep. 3, 2019”, 19 pgs.
“U.S. Appl. No. 15/474,821, Response filed Dec. 2, 2019 to Final Office Action mailed Sep. 3, 2019”, 10 pgs.
“U.S. Appl. No. 15/837,935, Notice of Allowance mailed Nov. 25, 2019”, 18 pgs.
“U.S. Appl. No. 15/946,990, Notice of Allowance mailed Sep. 24, 2019”, 5 pgs.
“U.S. Appl. No. 15/946,990, Response filed Jul. 9, 2019 to Final Office Action mailed May 9, 2019”, 12 pgs.
“U.S. Appl. No. 16/219,577, Non Final Office Action mailed Oct. 29, 2019”, 7 pgs.
“U.S. Appl. No. 16/219,577, Notice of Allowance mailed Jan. 15, 2020”, 7 pgs.
“U.S. Appl. No. 16/219,577, Response filed Oct. 3, 2019 to Restriction Requirement mailed Aug. 7, 2019”, 6 pgs.
“U.S. Appl. No. 16/219,577, Response filed Dec. 5, 2019 to Non Final Office Action mailed Oct. 29, 2019”, 6 pgs.
“U.S. Appl. No. 16/219,577, Restriction Requirement mailed Aug. 7, 2019”, 6 pgs.
“U.S. Appl. No. 16/541,919, Non Final Office Action mailed Apr. 14, 2020”, 18 pgs.
“U.S. Appl. No. 16/541,919, Notice of Allowance mailed Jun. 30, 2020”, 8 pgs.
“U.S. Appl. No. 16/541,919, Response filed Jun. 12, 2020 to Non Final Office Action mailed Apr. 14, 2020”, 8 pgs.
“U.S. Appl. No. 16/808,101, Preliminary Amendment filed Mar. 10, 2020”, 8 pgs.
“Chinese Application Serial No. 201680027177.8, Office Action mailed Oct. 28, 2019”, w/ English Translation, 15 pgs.
“Chinese Application Serial No. 201680027177.8, Response filed Mar. 5, 2020 to Office Action mailed Oct. 28, 2019”, w/ English Claims, 11 pgs.
“European Application Serial No. 16716090.2, Communication Pursuant to Article 94(3) EPC mailed Jan. 15, 2020”, 6 pgs.
“European Application Serial No. 16716090.2, Response filed Apr. 15, 2020 to Communication Pursuant to Article 94(3) EPC mailed Jan. 15, 2020”, 10 pgs.
“European Application Serial No. 18747246.9, Communication Pursuant to Article 94(3) EPC mailed Jun. 25, 2020”, 10 pgs.
“European Application Serial No. 18747246.9, Extended European Search Report mailed Nov. 7, 2019”, 7 pgs.
“European Application Serial No. 18747246.9, Response Filed Jun. 3, 2020 to Extended European Search Report mailed Nov. 7, 2019”, 15 pgs.
“International Application Serial No. PCT/US2018/016723, International Preliminary Report on Patentability mailed Aug. 15, 2019”, 19 pgs.
“Korean Application Serial No. 10-2019-7030235, Final Office Action mailed May 20, 2020”, w/ English Translation, 5 pgs.
“Korean Application Serial No. 10-2019-7030235, Notice of Preliminary Rejection mailed Nov. 28, 2019”, w/ English Translation, 10 pgs.
“Korean Application Serial No. 10-2019-7030235, Response filed Jan. 28, 2020 to Notice of Preliminary Rejection mailed Nov. 28, 2019”, w/ English Claims, 12 pgs.
“Korean Application Serial No. 10-2019-7030235, Response filed Jun. 22, 2020 to Final Office Action mailed May 20, 2020”, w/ English Claims, 16 pgs.
“U.S. Appl. No. 16/943,804, Final Office Action mailed Feb. 22, 2024”, 18 pgs.
“U.S. Appl. No. 16/428,210, Notice of Allowance mailed Mar. 6, 2024”, 14 pgs.
“U.S. Appl. No. 17/567,624, Final Office Action mailed Mar. 27, 2024”, 38 pgs.
“U.S. Appl. No. 16/428,210, Supplemental Notice of Allowability mailed Apr. 11, 2024”, 3 pgs.
Related Publications (1)
Number            Date        Country
20200359166 A1    Nov 2020    US

Provisional Applications (1)
Number      Date        Country
61926324    Jan 2014    US

Continuations (5)
Number             Date        Country
Parent 16428210    May 2019    US
Child 16943706                 US
Parent 16105687    Aug 2018    US
Child 16428210                 US
Parent 15835100    Dec 2017    US
Child 16105687                 US
Parent 15486111    Apr 2017    US
Child 15835100                 US
Parent 14594410    Jan 2015    US
Child 15486111                 US