User interface for accessing media at a geographic location

Information

  • Patent Grant
  • Patent Number
    10,616,476
  • Date Filed
    Monday, December 11, 2017
  • Date Issued
    Tuesday, April 7, 2020
Abstract
A system and method for accessing a media item on a mobile device are described. The mobile device includes a media placement application and a media display application. The media placement application receives a selection of a media item generated by the mobile device. The media placement application generates access conditions for the media item based on geolocation and position information of the mobile device associated with the selected media item. The media display application monitors the geolocation and position of the mobile device and determines whether the geolocation and position of the mobile device meet the access conditions of the selected media item. The media display application generates a notification that the selected media item is available to view in a display of the mobile device in response to determining that the geolocation and position of the mobile device meet the access conditions of the selected media item.
Description
TECHNICAL FIELD

The subject matter disclosed herein generally relates to user interface technology. Specifically, the present disclosure addresses systems and methods that provide a user interface for accessing media on a mobile device.


BACKGROUND

There has been an unprecedented boom in the popularity of amateur photography sparked by the widespread adoption of mobile technology, mobile phones in particular, that incorporates cameras. In fact, mobile phone manufacturers have supplanted traditional camera companies as the world's largest producers of cameras. Software development companies have responded to this boom by creating photographic applications that allow users of mobile phones to manipulate and view photographs in creative ways. Such photographic applications may allow a user to view digital photographs taken within a specific time period (e.g., recently taken photographs, or photographs taken in a specific month or year). However, if the user wishes to view a photograph previously taken at a particular geographic location (e.g., Venice Beach, Calif.), the user may be required to tediously scroll through a large number of photographs.





BRIEF DESCRIPTION OF THE DRAWINGS

The present disclosure is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings, in which:



FIG. 1 shows a block diagram illustrating one example embodiment of a mobile device;



FIG. 2 shows a block diagram illustrating one example embodiment of a media placement application;



FIG. 3 shows a block diagram illustrating one example embodiment of a media display application;



FIG. 4 shows a block diagram illustrating one example embodiment of a media sharing application;



FIG. 5 is a network diagram depicting a network system having a client-server architecture configured for exchanging data over a network, according to one embodiment;



FIG. 6 shows an example of a graphical user interface to select media items;



FIG. 7A shows an example of a graphical user interface illustrating geolocations of media items on a map;



FIG. 7B shows another example of a graphical user interface illustrating media items placed at their respective geolocations on a map;



FIG. 8A shows an example of a notification to view a media item based on a geolocation of the mobile device;



FIG. 8B shows an example of a notification to view a shared media item based on a geolocation of the mobile device;



FIG. 8C shows another example of a notification to view a media item based on a geolocation of the mobile device;



FIG. 8D shows an example of a notification including instructions on how to view a shared media item at the mobile device;



FIG. 8E shows an example of a display of a previously selected picture in a mobile device;



FIG. 9A is a diagram illustrating an example of a visual guide in a graphical user interface for accessing a media item at a geolocation of the mobile device;



FIG. 9B is a diagram illustrating another example of a visual guide in a graphical user interface for accessing a media item at a geolocation of the mobile device;



FIG. 9C is a diagram illustrating an example of a graphical user interface for accessing a media item at a geolocation of the mobile device;



FIG. 10A shows an interaction diagram illustrating one example embodiment of a process for sharing a media item;



FIG. 10B shows an interaction diagram illustrating another example embodiment of a process for sharing a media item;



FIG. 10C shows an interaction diagram illustrating yet another example embodiment of a process for sharing a media item;



FIG. 11 shows a flow diagram illustrating one example embodiment of a method for generating access conditions for a selected media item;



FIG. 12 shows a flow diagram illustrating another example embodiment of a method for accessing a selected media item;



FIG. 13 shows a flow diagram illustrating one example embodiment of a method for generating a visual guide to access a selected media item;



FIG. 14 shows a flow diagram illustrating one example embodiment of a method for sharing a media item;



FIG. 15 shows a flow diagram illustrating one example embodiment of a method for accessing a shared media item; and



FIG. 16 shows a diagrammatic representation of a machine, in the example form of a computer system, within which a set of instructions may be executed to cause the machine to perform any one or more of the methodologies discussed herein.





DETAILED DESCRIPTION

Although the present disclosure is described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the disclosure. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.


In one example embodiment, a user of a mobile device (e.g., mobile phone) selects pictures that have already been taken with the mobile device. The mobile device places each selected picture on a geographic map to identify the geographic locations (also referred to as geolocations) where the pictures were taken. The mobile device monitors its own geolocation. When the mobile device detects that its location corresponds to a spot where a stored picture was previously captured, the mobile device displays the stored picture. For example, the user may take a picture of a sunset at Venice Beach on his mobile device and may select the picture of the sunset from a photographic application for placement. When the user later visits the same location in Venice Beach where the previously taken picture of the sunset was captured, that previously taken picture appears on the display of the mobile device (e.g., when the mobile device is held up by the user).


In another example embodiment, the mobile device generates a notification to the user when the mobile device detects that it is located at a spot where a stored picture was previously taken. The notification informs the user that the stored picture previously taken from the same spot is available for viewing. The notification further includes instructions to raise the mobile device to view the stored picture. The mobile device detects its position in space and displays the stored picture when the position of the mobile device coincides (e.g., within a threshold tolerance) with the position of the mobile device when the stored picture was originally captured. For example, the mobile device displays the stored picture when the mobile device is raised and pointed in about the same direction that the mobile device was pointed when the stored picture was captured. In another example, the mobile device displays visual guides, such as an arrow, to instruct the user of the mobile device in which direction to point the mobile device to access and view the stored picture.


In another example embodiment, the mobile device can share a picture with another mobile device (e.g., a receiving mobile device). The receiving mobile device displays the shared picture when the receiving mobile device is located at the point at which the shared picture was captured. Similarly, the receiving mobile device generates a notification when it detects that it is located at that same point at which the shared picture was captured. The notification informs the user of the receiving mobile device that the shared picture is available for viewing. The notification instructs the user to raise the receiving mobile device to view the shared picture. In another example, the receiving mobile device instructs the user to go to the spot where the shared picture was captured to access and view the shared picture.


In another example embodiment, the user of the mobile device selects a picture previously taken with the mobile device and associates it with other geolocations, in addition to the geolocation at which the picture was captured, by virtually placing the picture at these other geolocations. Accordingly, when the mobile device detects that it is located at one of the other geolocations, the mobile device displays the picture. For example, the user may select a picture of the Santa Monica Pier taken in Santa Monica, and the user may virtually place the selected picture on Huntington Beach Pier on a map. When the user of the mobile device later visits Huntington Beach Pier and holds his mobile device up, a picture of Santa Monica Pier appears on the display of the mobile device. The user can then view and compare the actual Huntington Beach Pier with the previously taken picture of Santa Monica Pier. In another example, the user may select pictures taken at Yosemite National Park and virtually place them in other National Parks on a map. When the user later visits these other National Parks, the pictures taken at Yosemite National Park may be presented in the mobile device of the user. The user can then easily access and view his old pictures that are relevant to National Parks without having to browse through a multitude of pictures in the photo viewing application of his mobile device when the user is at one of the other National Parks.


Various examples of a media placement application, a media display application, and a media sharing application in the mobile device are described. The media placement application operates at the mobile device and associates pictures and videos with corresponding geolocations where the media was generated. The media display application operates at the mobile device and generates a display of the picture or video corresponding to a geolocation where the mobile device is currently located. The media sharing application operates at the mobile device and generates a message to another mobile device to enable viewing of a shared picture when the other mobile device is located at a geolocation associated with the shared picture. The message may include a shared picture and corresponding geolocation information. In another embodiment, the media placement application, media display application, and media sharing application operate with a server.



FIG. 1 shows a block diagram illustrating one example embodiment of a mobile device 100. The mobile device 100 includes an optical sensor 102, a Global Positioning System (GPS) sensor 104, a position sensor 106, a processor 107, a storage device 116, and a display 118.


The optical sensor 102 includes an image sensor, such as a charge-coupled device. The optical sensor 102 captures visual media. The optical sensor 102 can be used to generate media items such as pictures and videos.


The GPS sensor 104 determines the geolocation of the mobile device 100 and generates geolocation information (e.g., coordinates). In another embodiment, other sensors may be used to detect a geolocation of the mobile device 100. For example, a WiFi sensor or Bluetooth sensor can be used to determine the geolocation of the mobile device 100.


The position sensor 106 measures a physical position of the mobile device 100 relative to a frame of reference. The frame of reference may be the magnetic North Pole. For example, the position sensor 106 may include a geomagnetic field sensor to determine the direction in which the optical sensor 102 of the mobile device 100 is pointed (e.g., West) and an orientation sensor to determine the orientation of the mobile device 100 (e.g., raised, lowered, or leveled). In another example, the position sensor 106 generates an azimuth angle and an elevation angle to identify the relative position of the mobile device 100 in space. For example, an azimuth angle of 0 degrees indicates that the optical sensor 102 is pointed at Magnetic North. An elevation angle of 0 degrees indicates that the optical sensor 102 is pointed at the horizon.
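By way of illustration only, the following Python sketch models a device position as the azimuth/elevation pair described above (an azimuth of 0 degrees pointing at Magnetic North, an elevation of 0 degrees pointing at the horizon). The class name, the normalization rules, and the example values are assumptions made for the sketch, not elements of the disclosure.

```python
# Illustrative sketch only: a position reading as an azimuth/elevation pair.
from dataclasses import dataclass


@dataclass(frozen=True)
class DevicePosition:
    azimuth_deg: float    # compass direction the optical sensor points toward (0 = Magnetic North)
    elevation_deg: float  # tilt above (+) or below (-) the horizon (0 = horizon)

    @staticmethod
    def normalized(azimuth_deg: float, elevation_deg: float) -> "DevicePosition":
        # Wrap azimuth into [0, 360) and clamp elevation into [-90, 90].
        azimuth = azimuth_deg % 360.0
        elevation = max(-90.0, min(90.0, elevation_deg))
        return DevicePosition(azimuth, elevation)


# Example: the device held level and pointed due West.
west_level = DevicePosition.normalized(270.0, 0.0)
```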


The processor 107 may be a central processing unit that includes a media capture application 108, a media placement application 110, a media display application 112, and a media sharing application 114.


The media capture application 108 includes executable instructions to generate media items such as pictures and videos using the optical sensor 102. The media capture application 108 also associates a media item with the geolocation and the position of the mobile device 100 at the time the media item is generated using the GPS sensor 104 and the position sensor 106.


The media placement application 110 includes executable instructions to enable a user of the mobile device 100 to select and place the media items on a geographic map.


The media display application 112 includes executable instructions to determine whether the geolocation of the mobile device 100 corresponds to the geolocation of one of the media items selected with the media placement application 110. The media display application 112 displays the corresponding media item in the display 118 when the mobile device 100 is at the geolocation where the media item was previously generated.


The media sharing application 114 includes executable instructions to generate a message to another mobile device to share a media item. The mobile device of the recipient can view the shared media item when the mobile device of the recipient is at the geolocation where the shared media item was previously generated with the mobile device of the sender.


The storage device 116 includes a memory that may be or include flash memory, random access memory, any other type of memory accessible by the processor 107, or any suitable combination thereof. The storage device 116 stores the media items selected by the user for placement and also stores the corresponding geolocation information. The storage device 116 also stores executable instructions corresponding to the media capture application 108, the media placement application 110, the media display application 112, and the media sharing application 114.


The display 118 includes, for example, a touch screen display. The display 118 displays the media items generated by the media capture application 108. A user selects media items for placement by touching the corresponding media items on the display 118. A touch controller monitors signals applied to the display 118 to coordinate the selection of the media items.


The mobile device 100 also includes a transceiver that interfaces with an antenna. The transceiver may be configured to both transmit and receive cellular network signals, wireless data signals, or other types of signals via the antenna, depending on the nature of the mobile device 100. Further, in some configurations, the GPS sensor 104 may also make use of the antenna to receive GPS signals.



FIG. 2 shows a block diagram illustrating one example embodiment of a media placement application 110. The media placement application 110 includes a camera module 202, a geolocation module 204, a position module 206, a placement parameters module 208, and a media placement engine 210.


The camera module 202 communicates with the media capture application 108 to access the media items generated at the mobile device 100. In one example, the camera module 202 accesses the media items selected for placement by the user of the mobile device 100. In another example, the camera module 202 accesses media items generated from other mobile devices.


The geolocation module 204 communicates with the GPS sensor 104 to access geolocation information of the corresponding media items selected by the user. The geolocation information may include GPS coordinates of the mobile device 100 when the mobile device 100 generated the media items. In another example, the geolocation information may include GPS coordinates corresponding to the geolocation where the media items were generated using other mobile devices.


The position module 206 communicates with the position sensor 106 to access direction information and position information of the mobile device 100 at the time the mobile device 100 generated the media item. The direction information may include a direction (e.g., North, South, East, West, or other azimuth angle) in which the mobile device 100 was pointed when the mobile device 100 generated the media item. The position information may identify an orientation (e.g., high above the horizon, low towards the ground, or other elevation angle) at which the mobile device 100 was pointed when the mobile device 100 generated the media item.


The placement parameters module 208 accesses predefined ranges for the geolocation information, direction information, and position information. The predefined ranges identify a range for each parameter (e.g., geolocation, direction, and position). For example, a geolocation range for the geolocation information may be 100 feet. A direction range for the direction information may be 45 degrees. A position range for the position information may be 30 degrees. These ranges may be set by default or may be adjusted by the user of the mobile device 100.
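By way of illustration only, the predefined ranges could be carried in a small parameters object such as the Python sketch below. The class and field names, and the override helper, are assumptions; only the example defaults of 100 feet, 45 degrees, and 30 degrees come from the description above.

```python
# Illustrative sketch only: default placement ranges with optional user overrides.
from dataclasses import dataclass


@dataclass
class PlacementParameters:
    geolocation_range_ft: float = 100.0  # geolocation range around the capture point
    direction_range_deg: float = 45.0    # direction range (azimuth)
    position_range_deg: float = 30.0     # position range (elevation)

    def with_overrides(self, **overrides: float) -> "PlacementParameters":
        # User-adjusted ranges replace the defaults.
        return PlacementParameters(**{**self.__dict__, **overrides})


defaults = PlacementParameters()
wider = defaults.with_overrides(geolocation_range_ft=250.0)
```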


The media placement engine 210 accesses the predefined ranges from the placement parameters module 208 to define access conditions for the media items. The user of the mobile device 100 can view the selected media items when the access conditions are met. For example, the media placement engine 210 receives a selection of media items from the user of the mobile device 100. The user may use the touchscreen of the display 118 to select media items for placement. The media placement engine 210 accesses the selected media items from the camera module 202, the geolocation information for the selected media items from the geolocation module 204, the direction and position information associated with the selected media items from the position module 206, and the predefined ranges from the placement parameters module 208. The media placement engine 210 applies the predefined ranges to the geolocation information, the direction information, and the position information for the selected media items to generate corresponding boundaries for the selected media items. The access conditions are based on the boundaries for the selected media items.


The boundaries may include a geolocation boundary, a direction boundary, and a position boundary for a corresponding selected media item. For example, the geolocation boundary may include an area within 100 feet of the GPS coordinates of the mobile device 100 when the selected media item was generated at the mobile device 100. The direction boundary may include a direction between South East and South West based on the mobile device 100 being pointed South when the selected media item was generated at the mobile device 100 and based on a predefined direction range of 45 degrees. The position boundary may identify a position range from −30 degrees to 30 degrees based on the mobile device 100 being held up at a horizontal level when the selected media item was generated at the mobile device 100. The boundaries may be used later to determine which selected media item to display based on a current geolocation and position of the mobile device 100.
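By way of illustration only, the Python sketch below turns capture-time metadata and the predefined ranges into per-item boundaries of the kind described above: a radius around the capture point, an azimuth window, and an elevation window. The class, field, and function names are assumptions; the 100-foot, 45-degree, and 30-degree figures mirror the worked example in the text.

```python
# Illustrative sketch only: building access conditions (boundaries) for one media item.
from dataclasses import dataclass


@dataclass(frozen=True)
class AccessConditions:
    center_lat: float            # GPS latitude at capture time
    center_lon: float            # GPS longitude at capture time
    radius_ft: float             # geolocation boundary (e.g., 100 feet)
    azimuth_center_deg: float    # direction the device pointed at capture time
    azimuth_range_deg: float     # direction boundary half-width (e.g., 45 degrees)
    elevation_center_deg: float  # orientation of the device at capture time
    elevation_range_deg: float   # position boundary half-width (e.g., 30 degrees)


def build_access_conditions(lat, lon, azimuth_deg, elevation_deg,
                            radius_ft=100.0, direction_range_deg=45.0,
                            position_range_deg=30.0) -> AccessConditions:
    # Apply the predefined ranges to the capture-time geolocation, direction,
    # and position to produce the boundaries for the selected media item.
    return AccessConditions(
        center_lat=lat,
        center_lon=lon,
        radius_ft=radius_ft,
        azimuth_center_deg=azimuth_deg % 360.0,
        azimuth_range_deg=direction_range_deg,
        elevation_center_deg=elevation_deg,
        elevation_range_deg=position_range_deg,
    )


# Example: a picture taken while pointing South (180 degrees) with the device held level.
conditions = build_access_conditions(33.985, -118.472, 180.0, 0.0)
```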



FIG. 3 shows a block diagram illustrating one example embodiment of the media display application 112. The media display application 112 includes a geolocation module 302, a position module 304, a media placement module 306, and a media display engine 308.


The geolocation module 302 communicates with the GPS sensor 104 to access an updated or a current geolocation of the mobile device 100. The geolocation information may include updated GPS coordinates of the mobile device 100. In one example, the geolocation module 302 periodically accesses the geolocation information every minute. In another example, the geolocation module 302 may dynamically access the geolocation information based on other usage (e.g., every time the mobile device 100 is held up or is used by the user).


The position module 304 communicates with the position sensor 106 to access direction information and position information of the mobile device 100. The direction information may include a direction in which the mobile device 100 is currently pointed. The position information may identify an orientation in which the mobile device 100 is currently held.


The media placement module 306 communicates with the media placement engine 210 to determine the boundaries corresponding to the selected media items. For example, the boundaries for a media item may include a zip code boundary and a direction boundary (e.g., North).


The media display engine 308 accesses the current geolocation of the mobile device 100, the current direction and position of the mobile device 100, and the corresponding boundaries for the selected media items. The media display engine 308 compares the current geolocation, direction, and position of the mobile device 100 with the corresponding boundaries for the selected media items. If the media display engine 308 determines that the current geolocation, direction, and position of the mobile device 100 are within the boundaries of a selected media item, the media display engine 308 displays the selected media item in the display 118.


In another example, if the media display engine 308 determines that any combination of the current geolocation, direction, and position of the mobile device 100 is within a corresponding boundary of a selected media item, the media display engine 308 displays the selected media item in the display 118. For example, the media display engine 308 displays the selected media item when the media display engine 308 determines that a current geolocation of the mobile device 100 is within a geolocation boundary of a selected media item regardless of a current direction and position of the mobile device 100.


In another example, once the media display engine 308 determines that a current geolocation of the mobile device 100 is within a geolocation boundary of a selected media item regardless of a current direction and position of the mobile device 100, the media display engine 308 generates a notification. The media display engine 308 causes the notification to be displayed in the display 118. The notification informs the user of the mobile device 100 that the selected media item is available for viewing at the current geolocation of the mobile device 100. The media display engine 308 then determines whether the direction and position of the mobile device 100 are within corresponding boundaries of the selected media item. The media display engine 308 displays the selected media item in the display 118 once the direction and position of the mobile device 100 are within the direction and position boundaries of the selected media item.
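By way of illustration only, the two-stage check described above could be implemented as in the Python sketch below: the current geolocation alone gates the notification, and direction plus position then gate the display. The function names are assumptions, and the haversine formula is simply one common way to compute the distance between two GPS coordinates; the disclosure does not prescribe it.

```python
# Illustrative sketch only: notification and display checks against a media item's boundaries.
import math
from types import SimpleNamespace

FEET_PER_METER = 3.28084


def haversine_ft(lat1, lon1, lat2, lon2):
    # Great-circle distance between two latitude/longitude points, in feet.
    earth_radius_m = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * earth_radius_m * math.asin(math.sqrt(a)) * FEET_PER_METER


def angle_diff_deg(a, b):
    # Smallest absolute difference between two compass angles (wrap-aware).
    return abs((a - b + 180.0) % 360.0 - 180.0)


def should_notify(cond, lat, lon):
    # Stage 1: the device is inside the geolocation boundary.
    return haversine_ft(lat, lon, cond.center_lat, cond.center_lon) <= cond.radius_ft


def should_display(cond, lat, lon, azimuth_deg, elevation_deg):
    # Stage 2: geolocation, direction, and position are all within the boundaries.
    return (should_notify(cond, lat, lon)
            and angle_diff_deg(azimuth_deg, cond.azimuth_center_deg) <= cond.azimuth_range_deg
            and abs(elevation_deg - cond.elevation_center_deg) <= cond.elevation_range_deg)


# Example using boundaries like those in the earlier sketch (values are illustrative).
cond = SimpleNamespace(center_lat=33.985, center_lon=-118.472, radius_ft=100.0,
                       azimuth_center_deg=180.0, azimuth_range_deg=45.0,
                       elevation_center_deg=0.0, elevation_range_deg=30.0)
print(should_display(cond, 33.9851, -118.4721, 170.0, 5.0))  # True: all boundaries satisfied
```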


In yet another example, the media display engine 308 generates a visual guide, such as an arrow, in the display of the mobile device 100 to guide and direct the user of the mobile device 100 to position the mobile device 100 in the direction and position associated with the selected media item. For example, the mobile device 100 may display a right arrow to instruct the user to move and point the mobile device 100 further to the right. The media display engine 308 adjusts the display position of the selected media item relative to the display of the mobile device 100 based on the position of the mobile device 100 relative to the position boundary for the selected media item.
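By way of illustration only, the Python sketch below derives a left/right guide arrow from the signed difference between the item's direction boundary center and the device's current azimuth, and shifts the item's on-screen position by the same difference. The function names, the pixels-per-degree factor, and the example values are assumptions.

```python
# Illustrative sketch only: visual guide arrow and horizontal display offset.
def signed_angle_delta_deg(target_deg, current_deg):
    # Positive result: the target direction lies clockwise (to the right) of
    # where the device currently points; negative: to the left.
    return (target_deg - current_deg + 180.0) % 360.0 - 180.0


def visual_guide(target_azimuth_deg, current_azimuth_deg, px_per_deg=12.0):
    delta = signed_angle_delta_deg(target_azimuth_deg, current_azimuth_deg)
    arrow = "right" if delta > 0 else "left" if delta < 0 else None
    # Offset the media item horizontally; it slides into view as the device
    # turns toward the direction boundary of the item.
    offset_px = int(delta * px_per_deg)
    return arrow, offset_px


# Example: the item was captured facing South (180), the device currently faces East (90).
print(visual_guide(180.0, 90.0))  # ('right', 1080): point the device further to the right
```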



FIG. 4 shows a block diagram illustrating one example embodiment of the media sharing application 114. The media sharing application 114 includes a social network interface module 402, a media placement module 404, and a sharing module 406.


The social network interface module 402 accesses a server of a social network service provider to access contact information of social network contacts of the user of the mobile device 100. In another example, the social network interface module 402 accesses an address book stored in the mobile device 100.


The media placement module 404 communicates with the media placement engine 210 to determine the boundaries corresponding to the selected media items. The media placement module 404 retrieves the media items selected by the user of the mobile device 100. The media placement module 404 also retrieves access conditions (e.g., boundaries) for the media items selected by the user of the mobile device 100. By way of example, the sharing module 406 communicates the selected media item and the access conditions of the selected media item to a second mobile device. The second mobile device monitors a combination of its geolocation and position. The second mobile device determines whether its geolocation or position meet the access conditions of the selected media item. If the second mobile device meets the access conditions, the second mobile device generates a notification of the availability to view the selected media item in a display of the second mobile device.


In another example, the second mobile device generates a first notification identifying a requested geolocation to access the selected media item. For example, the notification may be “You have received a photo. Please go to Venice Beach to view it.” The second mobile device monitors its geolocation and determines whether its geolocation meets the access conditions of the selected media item. In this example, the second mobile device determines whether it is located in Venice Beach. If the access conditions are met, the second mobile device generates a second notification of the availability to view the selected media item in the display of the second mobile device.
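By way of illustration only, the Python sketch below shows one possible shape for the share message (a media item reference plus its access conditions) and the two notifications on the receiving side. The JSON field names, the URL, and the notification wording are assumptions, not a format defined by the disclosure.

```python
# Illustrative sketch only: a share message and the receiver-side notifications.
import json


def build_share_message(media_id, media_url, access_conditions, place_hint):
    return json.dumps({
        "media_id": media_id,
        "media_url": media_url,                  # where the receiver fetches the picture
        "access_conditions": access_conditions,  # geolocation/direction/position boundaries
        "place_hint": place_hint,                # human-readable spot, e.g. "Venice Beach"
    })


def first_notification(message):
    # Sent when the shared item arrives: tells the user where to go.
    hint = json.loads(message)["place_hint"]
    return f"You have received a photo. Please go to {hint} to view it."


def second_notification():
    # Sent once the geolocation condition is met: tells the user how to view the item.
    return "You are at the right spot. Raise your phone to view the shared photo."


message = build_share_message(
    "img_001",
    "https://example.invalid/img_001.jpg",
    {"lat": 33.985, "lon": -118.472, "radius_ft": 100.0,
     "azimuth_deg": 180.0, "elevation_deg": 0.0},
    "Venice Beach",
)
print(first_notification(message))
```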



FIG. 5 is a network diagram depicting a network system 500 having a client-server architecture configured for exchanging data over a network, according to one embodiment. For example, the network system 500 may be a messaging system where clients may communicate and exchange data within the network system 500. The data may pertain to various functions (e.g., sending and receiving text and media communication, media items, and access conditions) and aspects (e.g., placement of media items, identification and retrieval of media items) associated with the network system 500 and its users. Although illustrated herein as a client-server architecture, other embodiments may include other network architectures, such as peer-to-peer or distributed network environments.


A data exchange platform, in an example, includes a server messaging application 520 and a server media placement application 522, and may provide server-side functionality via a network 504 (e.g., the Internet) to one or more clients. The one or more clients may include users that utilize the network system 500 and, more specifically, the server messaging application 520 and the server media placement application 522, to exchange data over the network 504. These operations may include transmitting, receiving (communicating), and processing data to, from, and regarding content and users of the network system 500. The data may include, but is not limited to, content and user data such as user profiles, messaging content, messaging attributes, media attributes, client device information, geolocation information, placement parameters, access conditions, and social network information, among others.


In various embodiments, the data exchanges within the network system 500 may be dependent upon user-selected functions available through one or more client or user interfaces (UIs). The UIs may be associated with a client machine, such as mobile devices 510, 512. The mobile devices 510 and 512 may be in communication with the server messaging application 520 and server media placement application 522 via an application server 518. The mobile devices 510, 512 include wireless communication components, and audio and optical components for capturing various forms of media including photos and videos as previously described with respect to FIG. 1.


Turning specifically to the server messaging application 520 and the server media placement application 522, an application program interface (API) server 514 is coupled to, and provides a programmatic interface to, the application server 518. The application server 518 hosts the server messaging application 520 and the server media placement application 522. The application server 518 is, in turn, shown to be coupled to one or more database servers 524 that facilitate access to one or more databases 526.


The API server 514 communicates and receives data pertaining to messages, media items, and access conditions, among other things, via various user input tools. For example, the API server 514 may send and receive data to and from an application running on another client machine (e.g., mobile devices 510, 512 or a third party server).


The server messaging application 520 provides messaging mechanisms for users of the mobile devices 510, 512 to send messages that include text, media content such as pictures and video, and access conditions. The mobile devices 510, 512 can access and view the messages from the server messaging application 520. The server messaging application 520 may utilize any one of a number of message delivery networks and platforms to deliver messages to users. For example, the server messaging application 520 may deliver messages using electronic mail (e-mail), instant message (IM), Short Message Service (SMS), text, facsimile, or voice (e.g., Voice over IP (VoIP)) messages via wired (e.g., the Internet), plain old telephone service (POTS), or wireless networks (e.g., mobile, cellular, WiFi, Long Term Evolution (LTE), Bluetooth).


The server media placement application 522 provides a system and a method for placing media items at select geolocations and enabling access to the media items when the mobile devices 510, 512 are located at the select geolocations. The server media placement application 522 provides a Graphical User Interface (GUI) that accesses pictures taken with the mobile device 510, and receives a selection of the pictures from the user of the mobile device 510. The server media placement application 522 associates the selected pictures with access conditions. When the server media placement application 522 detects that the geolocation and position of the mobile device 510 meet the access conditions of the selected pictures, the server media placement application 522 causes the mobile device 510 to display the selected pictures. In one example embodiment, the server media placement application 522 may include components similar to the media placement application 110, the media display application 112, and the media sharing application 114.



FIG. 6 shows an example of a graphical user interface (GUI) for a user to select pictures for placement. A mobile device 602 includes a touchscreen display 604 that shows a gallery of pictures. The user selects pictures 606, 608, and 610 for placement and notification by tapping the touchscreen display 604 on top of the pictures 606, 608, and 610. The mobile device 602 displays the corresponding selected pictures in the display 604 when the geolocation of the mobile device 602 later satisfies the access conditions.



FIG. 7A shows an example of a GUI for a user to select pictures for placement in a map 700. The map 700 illustrates geolocation boundaries of selected media items. The map 700 includes geolocation boundaries 702 and 704. Each geolocation boundary is associated with one or more media items previously selected by the user of the mobile device 602. For example, pictures 606 and 608 may have been generated in geolocation boundary 702. Picture 610 may have been generated in geolocation boundary 704.



FIG. 7B shows another example of the map 700 illustrating thumbnails of selected pictures 606, 608, and 610 placed at their respective geolocation boundaries 702 and 704 on the map 700.



FIG. 8A shows an example of a notification to view a media item based on a geolocation of a mobile device. A mobile device 802 determines that it is located within a geolocation boundary 801 of a previously selected picture. The mobile device 802 generates a notification 806 in a display 804. The notification 806 informs the user of the mobile device 802 to hold the mobile device 802 up (or raise the mobile device to an eye level position) to view the selected picture that was previously captured from the same spot. The previously selected picture may have been captured with the mobile device 802 or using another mobile device.



FIG. 8B shows an example of a notification to view a shared media item based on a geolocation of the mobile device 802. The mobile device 802 determines that it is located within a geolocation boundary 803 of a shared picture. The mobile device 802 generates a notification 808 in the display 804. The notification 808 informs the user of the mobile device 802 to hold the mobile device 802 up to view the shared picture that was captured from the same spot.



FIG. 8C shows another example of a notification to view a media item based on a geolocation of the mobile device 802. The mobile device 802 determines that it is located within a geolocation boundary 805 of a previously selected picture. The picture was captured at a different geolocation from the geolocation boundary 805. The picture was placed on a map corresponding to the geolocation boundary 805. The mobile device 802 generates a notification 810 in the display 804. The notification 810 informs the user of the mobile device 802 to hold the mobile device 802 up to view the selected picture that was captured at another geolocation. In another example embodiment, pictures taken from similar places may share their geolocation boundaries. For example, the mobile device 802 may associate a picture taken at a sushi restaurant with a geolocation of other known sushi restaurants. So when the user of the mobile device 802 walks into any sushi restaurant, the user will have immediate access to all the pictures the user previously took at other sushi restaurants. The user can thus compare the sushi from the current restaurant with pictures of sushi from sushi restaurants the user previously visited.


In another similar example, the user of the mobile device 802 takes a picture of a citrus tree at the Home Depot. The user then proceeds with taking many pictures from other events such as birthday parties. When the user later walks into a different hardware store, such as Lowes, to compare the price of the same type of citrus tree from the Home Depot, the user would typically browse through many pictures previously taken at the birthday parties before reaching the picture of the citrus tree taken at the Home Depot. The media display application 112 avoids this problem by placing the picture of the citrus tree at geolocations of other hardware stores in the area. As such, the mobile device 802 displays the picture of the citrus tree taken at the Home Depot as soon as the user walks into any hardware store.



FIG. 8D shows an example of a notification including instructions on how to access a shared media item at a mobile device. The mobile device 802 receives a message that includes a shared picture and access conditions. The mobile device 802 can only view the shared picture when the mobile device 802 meets the access conditions. The mobile device 802 generates a notification 812 in the display 804. For example, the notification 812 informs the user of the mobile device 802 to go to a specific geolocation based on the access conditions to access and view the shared picture.



FIG. 8E shows an example of a display of a previously selected picture in a mobile device. The mobile device 802 is at a spot where the previously selected picture was captured. The mobile device 802 is also pointed in about the same direction that the mobile device 802 was pointed when the previously selected picture was captured. For example, the mobile device 802 is pointed towards a landscape 816 including a landmark 818. The user of the mobile device 802 previously captured a picture of the landmark 818. When the user holds up the mobile device 802, the previously captured picture 814 is overlaid on top of a current view of the landscape 816 in the display 804. In another example, a time filter may be applied to the previously selected pictures to display only pictures taken within the last year or any other predefined period of time.
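By way of illustration only, the time filter mentioned above could be applied as in the Python sketch below, which keeps only items captured within a predefined window (one year by default). The field names and the example data are assumptions.

```python
# Illustrative sketch only: filter previously selected pictures by capture time.
from datetime import datetime, timedelta


def within_period(captured_at: datetime, max_age: timedelta = timedelta(days=365)) -> bool:
    return datetime.now() - captured_at <= max_age


items = [
    {"id": "img_001", "captured_at": datetime(2019, 6, 1)},                 # too old, filtered out
    {"id": "img_002", "captured_at": datetime.now() - timedelta(days=30)},  # kept
]
recent = [item for item in items if within_period(item["captured_at"])]
```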



FIG. 9A is a diagram illustrating an example of a visual guide in a graphical user interface for accessing a media item at a geolocation of a mobile device. Once the mobile device 802 is at a geolocation specified in the access conditions, the display 804 includes a visual guide 902 to help the user orient the mobile device 802 to view the media items. For example, the display 804 displays the visual guide 902 in the form of an arrow to guide the user to move the mobile device 802 in the direction indicated by the arrow.



FIG. 9B is a diagram illustrating another example of a visual guide in a graphical user interface for accessing a media item at a geolocation of a mobile device. The display 804 shows a portion of the media items 904 as the position of the mobile device 802 gets closer to the position boundary defined by the access conditions of the media items.



FIG. 9C is a diagram illustrating an example of a graphical user interface for accessing a media item at a geolocation of the mobile device. The display position of the media items 904 is adjusted relative to the display 804 of the mobile device 802 based on the position of the mobile device 802 relative to the position boundary for the media items.



FIG. 10A shows an interaction diagram illustrating one example embodiment of a process for sharing a media item. A user of the mobile device 510 selects a media item for sharing at operation 1002. In one example, operation 1002 may be implemented using the media capture application 108 of FIG. 1. The mobile device 510 generates access conditions for the shared media item at operation 1004. In one example, operation 1004 may be implemented using the media placement application 110 of FIG. 1. The mobile device 510 sends the shared media item and corresponding access conditions to the mobile device 512 at operation 1006. In one example, operation 1006 may be implemented using the media sharing application 114 of FIG. 1. The mobile device 512 determines whether the geolocation (and optionally the physical position) of the mobile device 512 meet the access conditions of the shared media item at operation 1008. In one example, operation 1008 may be implemented using the media display application 112 of FIG. 1. The mobile device 512 generates a display of the shared media item after the access conditions of the shared media item are met at operation 1010. In one example, operation 1010 may be implemented using the media display application 112 of FIG. 1.



FIG. 10B shows an interaction diagram illustrating another example embodiment of a process for sharing a media item. A user of the mobile device 510 selects a media item for sharing at operation 1002. The mobile device 510 generates access conditions for the shared media item at operation 1004. The mobile device 510 sends the shared media item and corresponding access conditions to a mobile device 512 at operation 1006. The mobile device 512 generates a first notification to instruct the user of the mobile device 512 to go to a specific geolocation to access and view the shared media item at operation 1012. The mobile device 512 monitors the geolocation of the mobile device 512 at operation 1014. The mobile device 512 generates a second notification to instruct the user of the mobile device 512 to raise the mobile device 512 in a particular direction to view the shared media item at operation 1016. The mobile device 512 displays the shared media item after the mobile device 512 meets the access conditions at operation 1018. In one example, operations 1012, 1014, 1016, and 1018 may be implemented using the media display application 112 of FIG. 1.



FIG. 10C shows an interaction diagram illustrating yet another example embodiment of a process for sharing a media item. A user of the mobile device 510 selects a media item for sharing at operation 1020. The mobile device 510 communicates the shared media item to the server media placement application 522 of FIG. 5 at operation 1021. The server media placement application 522 generates access conditions for the shared media item at operation 1022. The server messaging application 520 generates a message including access instructions to the shared media item at operation 1024. The server messaging application 520 sends the message to the mobile device 512 at operation 1026. The mobile device 512 determines its geolocation at operation 1028. The mobile device 512 sends its geolocation information to the server media placement application 522 at operation 1030. The server media placement application 522 determines whether the geolocation information satisfies the access conditions for the shared media item at operation 1032. The server media placement application 522 sends the shared media item after access conditions for the shared media item are met at operation 1034. The mobile device 512 generates a display of the shared media item at operation 1036.
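By way of illustration only, the server-side portion of this flow (operations 1032 and 1034) could look like the Python sketch below: the server keeps the shared item together with its access conditions and releases the item only when the geolocation reported by the receiving device satisfies them. The in-memory store, the URL, and the function names are assumptions; a real deployment would also authenticate the reporting device.

```python
# Illustrative sketch only: server-side geolocation check before releasing a shared item.
import math

SHARED_ITEMS = {
    "img_001": {
        "media_url": "https://example.invalid/img_001.jpg",
        "lat": 33.985, "lon": -118.472, "radius_ft": 100.0,
    },
}


def distance_ft(lat1, lon1, lat2, lon2):
    # Haversine distance between two latitude/longitude points, in feet.
    earth_radius_m = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * earth_radius_m * math.asin(math.sqrt(a)) * 3.28084


def handle_geolocation_report(media_id, lat, lon):
    # Check the reported geolocation against the stored access conditions and
    # return the media URL only if the conditions are met (otherwise None).
    item = SHARED_ITEMS.get(media_id)
    if item is None:
        return None
    if distance_ft(lat, lon, item["lat"], item["lon"]) <= item["radius_ft"]:
        return item["media_url"]
    return None


print(handle_geolocation_report("img_001", 33.9851, -118.4721))  # within 100 ft: URL returned
```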



FIG. 11 shows a flow diagram illustrating one example embodiment of a method 1100 for generating access conditions for a selected media item. At operation 1102, a mobile device receives a selection of a media item for placement. At operation 1104, the mobile device generates access conditions for the selected media item. At operation 1106, the mobile device stores the access conditions for the selected media item at the mobile device. In one embodiment, operations 1102, 1104, and 1106 may be implemented using the media placement application 110 of FIG. 1.



FIG. 12 shows a flow diagram illustrating another example embodiment of a method 1200 for accessing the selected media item. At operation 1202, a mobile device determines its current geolocation and position. At operation 1204, the mobile device determines whether a combination of its current geolocation and position match or satisfy the access conditions generated at operation 1104 of FIG. 11. At operation 1206, the mobile device generates a notification of the availability to view the selected media item associated with the access conditions. In one embodiment, operations 1202, 1204, and 1206 may be implemented using the media display application 112 of FIG. 1.



FIG. 13 shows a flow diagram illustrating one example embodiment of a method 1300 for generating a visual guide to access the selected media item. At operation 1302, a mobile device determines its geolocation and position. At operation 1304, the mobile device determines access conditions for a selected media item. At operation 1306, the mobile device generates a visual guide in a display of the mobile device to access the selected media item. At operation 1308, the mobile device adjusts a display position of the selected media item relative to the display of the mobile device based on the position of the mobile device. In one embodiment, operations 1302, 1304, and 1306 may be implemented using the media display application 112 of FIG. 1.



FIG. 14 shows a flow diagram illustrating one example embodiment of a method 1400 for sharing a media item. At operation 1402, a user selects, at a mobile device, a media item for sharing with another mobile device. At operation 1404, the mobile device generates access conditions for the shared media item. At operation 1406, the mobile device sends the shared media item and the access conditions to the other mobile device. In one embodiment, operations 1402 and 1404 may be implemented using the media placement application 110 of FIG. 1. Operation 1406 may be implemented using the media sharing application 114 of FIG. 1.



FIG. 15 shows a flow diagram illustrating one example embodiment of a method 1500 for accessing a shared media item. At operation 1502, a mobile device receives a shared media item and corresponding access conditions. At operation 1504, the mobile device determines its current geolocation and position. At operation 1506, the mobile device determines whether a combination of its geolocation and position match the received access conditions. At operation 1508, the mobile device displays the shared media item when the access conditions are met. Operations 1502, 1504, 1506, and 1508 may be implemented using the media display application 112.


Modules, Components and Logic


Certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied (1) on a non-transitory machine-readable medium or (2) in a transmission signal) or hardware-implemented modules. A hardware-implemented module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner. In example embodiments, one or more computer systems (e.g., a standalone, client, or server computer system) or one or more processors may be configured by software (e.g., an application or application portion) as a hardware-implemented module that operates to perform certain operations as described herein.


In various embodiments, a hardware-implemented module may be implemented mechanically or electronically. For example, a hardware-implemented module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations. A hardware-implemented module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware-implemented module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.


Accordingly, the term “hardware-implemented module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily or transitorily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. Considering embodiments in which hardware-implemented modules are temporarily configured (e.g., programmed), each of the hardware-implemented modules need not be configured or instantiated at any one instance in time. For example, where the hardware-implemented modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respectively different hardware-implemented modules at different times. Software may, accordingly, configure a processor, for example, to constitute a particular hardware-implemented module at one instance of time and to constitute a different hardware-implemented module at a different instance of time.


Hardware-implemented modules can provide information to, and receive information from, other hardware-implemented modules. Accordingly, the described hardware-implemented modules may be regarded as being communicatively coupled. Where multiples of such hardware-implemented modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses that connect the hardware-implemented modules). In embodiments in which multiple hardware-implemented modules are configured or instantiated at different times, communications between such hardware-implemented modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware-implemented modules have access. For example, one hardware-implemented module may perform an operation, and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware-implemented module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware-implemented modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).


The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example embodiments, comprise processor-implemented modules.


Similarly, the methods described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment, or a server farm), while in other embodiments the processors may be distributed across a number of locations.


The one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via the network 504 (e.g., the Internet) and via one or more appropriate interfaces (e.g., APIs).


Electronic Apparatus and System


Example embodiments may be implemented in digital electronic circuitry, or in computer hardware, firmware, or software, or in combinations of them. Example embodiments may be implemented using a computer program product (e.g., a computer program tangibly embodied in an information carrier, e.g., in a machine-readable medium for execution by, or to control the operation of, data processing apparatus, e.g., a programmable processor, a computer, or multiple computers).


A computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a standalone program or as a module, subroutine, or other unit suitable for use in a computing environment. A computer program can be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by a communication network.


In example embodiments, operations may be performed by one or more programmable processors executing a computer program to perform functions by operating on input data and generating output. Method operations can also be performed by, and apparatus of example embodiments may be implemented as, special purpose logic circuitry (e.g., an FPGA or an ASIC).


The computing system can include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. In embodiments deploying a programmable computing system, it will be appreciated that both hardware and software architectures merit consideration. Specifically, it will be appreciated that the choice of whether to implement certain functionality in permanently configured hardware (e.g., an ASIC), in temporarily configured hardware (e.g., a combination of software and a programmable processor), or in a combination of permanently and temporarily configured hardware may be a design choice. Below are set out hardware (e.g., machine) and software architectures that may be deployed in various example embodiments.


Example Computer System



FIG. 16 shows a diagrammatic representation of a machine in the example form of a computer system 1600 within which a set of instructions 1624 may be executed to cause the machine to perform any one or more of the methodologies discussed herein. In alternative embodiments, the machine may operate as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine may be a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing a set of instructions 1624 (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions 1624 to perform any one or more of the methodologies discussed herein.


The example computer system 1600 includes a processor 1602 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 1604, and a static memory 1606, which communicate with each other via a bus 1608. The computer system 1600 may further include a video display unit 1610 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)). The computer system 1600 also includes an alphanumeric input device 1612 (e.g., a keyboard), a UI navigation device 1614 (e.g., a mouse), a drive unit 1616, a signal generation device 1618 (e.g., a speaker), and a network interface device 1620.


The drive unit 1616 includes a computer-readable medium 1622 on which is stored one or more sets of data structures and instructions 1624 (e.g., software) embodying or utilized by any one or more of the methodologies or functions described herein. The instructions 1624 may also reside, completely or at least partially, within the main memory 1604 or within the processor 1602 during execution thereof by the computer system 1600, with the main memory 1604 and the processor 1602 also constituting machine-readable media.


The instructions 1624 may further be transmitted or received over a network 1626 via the network interface device 1620 utilizing any one of a number of well-known transfer protocols (e.g., HTTP).


While the computer-readable medium 1622 is shown in an example embodiment to be a single medium, the term “computer-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions 1624. The term “computer-readable medium” shall also be taken to include any medium that is capable of storing, encoding, or carrying a set of instructions 1624 for execution by the machine that cause the machine to perform any one or more of the methodologies of the present disclosure, or that is capable of storing, encoding, or carrying data structures utilized by or associated with such a set of instructions 1624. The term “computer-readable medium” shall, accordingly, be taken to include, but not be limited to, solid-state memories, optical media, and magnetic media.


Furthermore, the machine-readable medium is non-transitory in that it does not embody a propagating signal. However, labeling the tangible machine-readable medium “non-transitory” should not be construed to mean that the medium is incapable of movement—the medium should be considered as being transportable from one physical location to another. Additionally, since the machine-readable medium is tangible, the medium may be considered to be a machine-readable device.


Although embodiments have been described with reference to specific examples, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the present disclosure. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense. The accompanying drawings that form a part hereof show by way of illustration, and not of limitation, specific embodiments in which the subject matter may be practiced. The embodiments illustrated are described in sufficient detail to enable those skilled in the art to practice the teachings disclosed herein. Other embodiments may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. This Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of various embodiments is defined only by the appended claims, along with the full range of equivalents to which such claims are entitled.


As used herein, the term “or” may be construed in either an inclusive or exclusive sense. Moreover, plural instances may be provided for resources, operations, or structures described herein as a single instance. Additionally, boundaries between various resources, operations, modules, engines, and data stores are somewhat arbitrary, and particular operations are illustrated in a context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within the scope of various embodiments of the present invention. In general, structures and functionality presented as separate resources in the example configurations may be implemented as a combined structure or resource. Similarly, structures and functionality presented as a single resource may be implemented as separate resources. These and other variations, modifications, additions, and improvements fall within the scope of embodiments of the present invention as represented by the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.


Such embodiments of the inventive subject matter may be referred to herein, individually or collectively, by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed. Thus, although specific embodiments have been illustrated and described herein, it should be appreciated that any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.


The Abstract of the Disclosure is provided to comply with 37 C.F.R. § 1.72(b), requiring an abstract that will allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment.

Claims
  • 1. A mobile device comprising: a processor; and a memory storing instructions that, when executed by the processor, configure the mobile device to: receive an identification of a geographic boundary related to a media item; detect that the mobile device is located within the geographic boundary; generate a notification in the mobile device in response to determining that the mobile device is located within the geographic boundary, the notification comprising an instruction to orient the mobile device in a direction associated with the media item; and cause the media item to be displayed in a display of the mobile device in response to detecting that the mobile device is oriented in the direction associated with the media item.
  • 2. The mobile device of claim 1, further comprising: a Global Positioning System (GPS) sensor configured to identify a geolocation of the mobile device; an orientation sensor configured to identify the orientation of the mobile device at a time the mobile device generated the media item; and a camera configured to generate the media item, wherein the instructions further configure the mobile device to: associate the media item with the geolocation and the orientation of the mobile device at the time the mobile device generated the media item.
  • 3. The mobile device of claim 1, wherein the instructions further configure the mobile device to: detect that the mobile device is located within the second geographic boundary associated with the second media item; generate a second notification in response to determining that the mobile device is located within the second geographic boundary related to the second media item, the second notification comprising a second instruction to raise the mobile device in the orientation associated with the second media item; detect that the orientation of the mobile device is within the second orientation range; and cause the second media item to be displayed in the display of the mobile device in response to detecting that the orientation of the mobile device is within the second orientation range.
  • 4. The mobile device of claim 1, wherein the second geographic boundary includes an area based on a preset distance radius from a center associated with the geolocation of the second media item, and the second orientation range includes an angular range based on a preset angle from an angle associated with the orientation of the mobile device at the time the mobile device generated the second media item.
  • 5. The mobile device of claim 1, wherein the instructions further configure the mobile device to: communicate the second media item and the access conditions of the second media item to a second mobile device, the second mobile device configured to: identify a geolocation and an orientation of the second mobile device, determine whether the geolocation and the orientation of the second mobile device meet the access conditions of the second media item, and generate a notification in the second mobile device that the second media item is available to view in a display of the second mobile device.
  • 6. The mobile device of claim 1, wherein the instructions further configure the mobile device to: display a real time image captured by the mobile device in the display in response to detecting that the mobile device is located within the geographic boundary related to the media item; and cause the media item to be displayed as a layer on top of the real time image in the display in response to detecting that the mobile device is oriented in the direction associated with the media item, a displayed size of the media item being smaller than a displayed size of the real time image, a position of the media item in the display based on the orientation of the mobile device relative to the direction associated with the media item.
  • 7. The mobile device of claim 1, wherein the instructions further configure the mobile device to: display a real time image captured by the mobile device in the display in response to detecting that the mobile device is located within the geographic boundary related to the media item; generate a visual guide including a directional indicator towards the direction associated with the media item; and cause a display of the visual guide as a layer on top of the real time image in the display of the mobile device.
  • 8. The mobile device of claim 1, wherein the instructions further configure the mobile device to: receive a selection of the media item and a second media item generated by the mobile device as first and second media items; access a first geolocation associated with the first media item; access a second geolocation associated with the second media item, the first geolocation being different from the second geolocation; generate a first notification in the mobile device in response to determining that the mobile device is located within a first geographic boundary related to the first media item; and cause the first and second media items to be displayed in the display of the mobile device in response to the first notification.
  • 9. The mobile device of claim 8, wherein the instructions further configure the mobile device to: generate a second notification in the mobile device in response to determining that the mobile device is located within a second geographic boundary related to the second media item; and cause the first and second media items to be displayed in the display of the mobile device in response to the second notification.
  • 10. A method comprising: receiving an identification of a geographic boundary related to a media item at a mobile device; detecting that the mobile device is located within the geographic boundary; generating a notification in the mobile device in response to determining that the mobile device is located within the geographic boundary, the notification comprising an instruction to orient the mobile device in a direction associated with the media item; causing the media item to be displayed in a display of the mobile device in response to detecting that the mobile device is oriented in the direction associated with the media item; receiving a selection of a second media item generated by the mobile device; accessing a geolocation associated with the second media item; accessing an orientation of the mobile device at a time the mobile device generated the second media item; and defining access conditions for the second media item, the access conditions identifying a second geographic boundary and a second orientation range for the second media item, the second geographic boundary based on the geolocation associated with the second media item, the second orientation range based on the orientation of the mobile device associated with the second media item.
  • 11. The method of claim 10, further comprising: identifying a geolocation of the mobile device with a GPS sensor in the mobile device; identifying the orientation of the mobile device with an orientation sensor in the mobile device; and generating the media item with a camera of the mobile device; associating the media item with the geolocation and the orientation of the mobile device at a time the mobile device generated the media item.
  • 12. The method of claim 10, further comprising: detecting that the mobile device is located within the second geographic boundary associated with the second media item; generating a second notification in response to determining that the mobile device is located within the second geographic boundary related to the second media item, the second notification comprising a second instruction to raise the mobile device in the orientation associated with the second media item; detecting that the orientation of the mobile device is within the second orientation range; and causing the second media item to be displayed in the display of the mobile device in response to detecting that the orientation of the mobile device is within the second orientation range.
  • 13. The method of claim 10, wherein the second geographic boundary includes an area based on a preset distance radius from a center associated with the geolocation of the second media item, and the second orientation range includes an angular range based on a preset angle from an angle associated with the orientation of the mobile device at the time the mobile device generated the second media item.
  • 14. The method of claim 10, further comprising: communicating the second media item and the access conditions of the second media item to a second mobile device, the second mobile device configured to: identify a geolocation and an orientation of the second mobile device, determine whether the geolocation and the orientation of the second mobile device meet the access conditions of the second media item, and generate a notification in the second mobile device that the second media item is available to view in a display of the second mobile device.
  • 15. The method of claim 10, further comprising: displaying a real time image captured by the mobile device in the display in response to detecting that the mobile device is located within the geographic boundary related to the media item; and causing the media item to be displayed as a layer on top of the real time image in the display in response to detecting that the mobile device is oriented in the direction associated with the media item, a displayed size of the media item being smaller than a displayed size of the real time image, a position of the media item in the display based on the orientation of the mobile device relative to the direction associated with the media item.
  • 16. The method of claim 10, further comprising: displaying a real time image captured by the mobile device in the display in response to detecting that the mobile device is located within the geographic boundary related to the media item; generating a visual guide including a directional indicator towards the direction associated with the media item; and causing a display of the visual guide as a layer on top of the real time image in the display of the mobile device.
  • 17. The method of claim 10, further comprising: receiving a selection of a first media item and a second media item generated by the mobile device; accessing a first geolocation associated with the first media item; accessing a second geolocation associated with the second media item, the first geolocation being different from the second geolocation; generating a first notification in the mobile device in response to determining that the mobile device is located within a first geographic boundary related to the first media item; and causing the first and second media items to be displayed in the display of the mobile device in response to the first notification.
  • 18. A computer-readable storage medium having no transitory signals and storing a set of instructions that, when executed by a processor of a machine, cause the machine to perform operations comprising: receiving an identification of a geographic boundary related to a media item at a mobile device; detecting that the mobile device is located within the geographic boundary; generating a notification in the mobile device in response to determining that the mobile device is located within the geographic boundary, the notification comprising an instruction to orient the mobile device in a direction associated with the media item; and causing the media item to be displayed in a display of the mobile device in response to detecting that the mobile device is oriented in the direction associated with the media item; receiving a selection of a second media item generated by the mobile device; accessing a geolocation associated with the second media item; accessing an orientation of the mobile device at a time the mobile device generated the second media item; and defining access conditions for the second media item, the access conditions identifying a second geographic boundary and a second orientation range for the second media item, the second geographic boundary based on the geolocation associated with the second media item, the second orientation range based on the orientation of the mobile device associated with the second media item.
  • 19. The computer-readable storage medium of claim 18 wherein the set of instructions further cause the machine to perform operations comprising: identifying a geolocation of the mobile device with a GPS sensor in the mobile device; identifying the orientation of the mobile device with an orientation sensor in the mobile device; generating the media item with a camera of the mobile device; associating the media item with the geolocation and the orientation of the mobile device at a time the mobile device generated the media item.
  • 20. The computer-readable storage medium of claim 18 wherein the set of instructions further cause the machine to perform operations comprising: detecting that the mobile device is located within the second geographic boundary associated with the second media item; generating a second notification in response to determining that the mobile device is located within the second geographic boundary related to the second media item, the second notification comprising a second instruction to raise the mobile device in the orientation associated with the second media item; detecting that the orientation of the mobile device is within the second orientation range; and causing the second media item to be displayed in the display of the mobile device in response to detecting that the orientation of the mobile device is within the second orientation range.
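For illustration only, and not as part of the claimed subject matter: the access conditions recited in claims 4 and 13 (a geographic boundary based on a preset distance radius around the geolocation of the media item, and an orientation range based on a preset angle around the orientation at capture time) and the orientation-dependent overlay position recited in claim 6 can be sketched roughly as follows. The Kotlin below is a minimal, hypothetical sketch; the names (AccessConditions, meetsAccessConditions, haversineMeters, bearingDelta, overlayOffsetPx), the haversine distance computation, and all numeric values are assumptions of the sketch and do not appear in the specification.

```kotlin
import kotlin.math.*

// Illustrative sketch only; all names and numeric values are hypothetical.

// Access conditions in the sense of claims 4 and 13: a geographic boundary given
// by a preset distance radius around the geolocation of the media item, and an
// orientation range given by a preset angle around the capture orientation.
data class AccessConditions(
    val centerLat: Double,       // latitude of the media item's geolocation (degrees)
    val centerLon: Double,       // longitude of the media item's geolocation (degrees)
    val radiusMeters: Double,    // preset distance radius defining the geographic boundary
    val captureBearing: Double,  // device orientation when it generated the media item (degrees)
    val bearingTolerance: Double // preset angle defining the orientation range (degrees)
)

// Great-circle distance between two points in meters (haversine formula).
fun haversineMeters(lat1: Double, lon1: Double, lat2: Double, lon2: Double): Double {
    val earthRadius = 6_371_000.0
    val dLat = Math.toRadians(lat2 - lat1)
    val dLon = Math.toRadians(lon2 - lon1)
    val a = sin(dLat / 2).pow(2) +
            cos(Math.toRadians(lat1)) * cos(Math.toRadians(lat2)) * sin(dLon / 2).pow(2)
    return 2 * earthRadius * asin(sqrt(a))
}

// Smallest angular difference between two compass bearings, in degrees (0..180).
fun bearingDelta(a: Double, b: Double): Double {
    val d = abs(a - b) % 360.0
    return if (d > 180.0) 360.0 - d else d
}

// True when the monitored geolocation and orientation of the device meet the access
// conditions: inside the geographic boundary AND within the orientation range.
fun meetsAccessConditions(
    deviceLat: Double,
    deviceLon: Double,
    deviceBearing: Double,
    c: AccessConditions
): Boolean {
    val insideBoundary =
        haversineMeters(deviceLat, deviceLon, c.centerLat, c.centerLon) <= c.radiusMeters
    val withinOrientationRange =
        bearingDelta(deviceBearing, c.captureBearing) <= c.bearingTolerance
    return insideBoundary && withinOrientationRange
}

// Horizontal offset (pixels from screen center) at which the media item could be
// layered over the real time camera image, based on the device orientation relative
// to the direction associated with the media item (compare claim 6).
fun overlayOffsetPx(
    deviceBearing: Double,
    captureBearing: Double,
    screenWidthPx: Int,
    cameraFovDegrees: Double
): Int {
    val delta = (captureBearing - deviceBearing + 540.0) % 360.0 - 180.0 // signed, -180..180
    return ((delta / cameraFovDegrees) * screenWidthPx).roundToInt()
}

fun main() {
    // Hypothetical media item viewable within 100 m of its capture point and
    // within 30 degrees of the orientation at which it was generated.
    val conditions = AccessConditions(37.4220, -122.0841, 100.0, 270.0, 30.0)
    val available = meetsAccessConditions(37.4222, -122.0843, 282.0, conditions)
    println(if (available) "Notify: media item available to view" else "Keep monitoring")
    println("Overlay x-offset: ${overlayOffsetPx(282.0, 270.0, 1080, 60.0)} px")
}
```

In practice, the device geolocation and bearing would be supplied by the GPS and orientation sensors recited in claims 2, 11, and 19, and a check of this kind would run as those readings are monitored.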
REFERENCE TO RELATED APPLICATION

This application is a continuation of and claims the benefit of priority to U.S. patent application Ser. No. 14/841,987, filed Sep. 1, 2015, which is a continuation of and claims the benefit of priority to U.S. patent application Ser. No. 14/682,259, filed Apr. 9, 2015, which is a continuation of U.S. patent application Ser. No. 14/539,391, filed Nov. 12, 2014, each of which is hereby incorporated herein by reference in its entirety.

US Referenced Citations (595)
Number Name Date Kind
666223 Shedlock Jan 1901 A
4581634 Williams Apr 1986 A
4975690 Torres Dec 1990 A
5493692 Theimer et al. Feb 1996 A
5702412 Henderson, Jr. et al. Dec 1997 A
5713073 Warsta Jan 1998 A
5754939 Herz et al. May 1998 A
5855008 Goldhaber et al. Dec 1998 A
5883639 Walton et al. Mar 1999 A
5999932 Paul Dec 1999 A
6012098 Bayeh et al. Jan 2000 A
6014090 Rosen et al. Jan 2000 A
6029141 Bezos et al. Feb 2000 A
6038295 Mattes Mar 2000 A
6049711 Yehezkel et al. Apr 2000 A
6154764 Nitta et al. Nov 2000 A
6167435 Druckenmiller et al. Dec 2000 A
6204840 Petelycky et al. Mar 2001 B1
6205432 Gabbard et al. Mar 2001 B1
6216141 Straub et al. Apr 2001 B1
6285381 Sawano et al. Sep 2001 B1
6285987 Roth et al. Sep 2001 B1
6310694 Okimoto et al. Oct 2001 B1
6317789 Rakavy et al. Nov 2001 B1
6334149 Davis, Jr. et al. Dec 2001 B1
6349203 Asaoka et al. Feb 2002 B1
6353170 Eyzaguirre et al. Mar 2002 B1
6446004 Cao et al. Sep 2002 B1
6449657 Stanbach et al. Sep 2002 B2
6456852 Bar et al. Sep 2002 B2
6484196 Maurille Nov 2002 B1
6487601 Hubacher et al. Nov 2002 B1
6523008 Avrunin Feb 2003 B1
6542749 Tanaka et al. Apr 2003 B2
6549768 Fraccaroli Apr 2003 B1
6618593 Drutman et al. Sep 2003 B1
6622174 Ukita et al. Sep 2003 B1
6631463 Floyd et al. Oct 2003 B1
6636247 Hamzy et al. Oct 2003 B1
6636855 Holloway et al. Oct 2003 B2
6643684 Malkin et al. Nov 2003 B1
6658095 Yoakum et al. Dec 2003 B1
6665531 Soderbacka et al. Dec 2003 B1
6668173 Greene Dec 2003 B2
6684238 Dutta Jan 2004 B1
6684257 Camut et al. Jan 2004 B1
6698020 Zigmond et al. Feb 2004 B1
6700506 Winkler Mar 2004 B1
6720860 Narayanaswami Apr 2004 B1
6724403 Santoro et al. Apr 2004 B1
6757713 Ogilvie et al. Jun 2004 B1
6832222 Zimowski Dec 2004 B1
6834195 Brandenberg et al. Dec 2004 B2
6836792 Chen Dec 2004 B1
6898626 Ohashi May 2005 B2
6959324 Kubik et al. Oct 2005 B1
6970088 Kovach Nov 2005 B2
6970907 Ullmann et al. Nov 2005 B1
6980909 Root et al. Dec 2005 B2
6981040 Konig et al. Dec 2005 B1
7020494 Spriestersbach et al. Mar 2006 B2
7027124 Foote et al. Apr 2006 B2
7072963 Anderson et al. Jul 2006 B2
7085571 Kalhan et al. Aug 2006 B2
7110744 Freeny, Jr. Sep 2006 B2
7124164 Chemtob Oct 2006 B1
7149893 Leonard et al. Dec 2006 B1
7173651 Knowles Feb 2007 B1
7188143 Szeto Mar 2007 B2
7203380 Chiu et al. Apr 2007 B2
7206568 Sudit Apr 2007 B2
7227937 Yoakum et al. Jun 2007 B1
7237002 Estrada et al. Jun 2007 B1
7240089 Boudreau Jul 2007 B2
7269426 Kokkonen et al. Sep 2007 B2
7280658 Amini et al. Oct 2007 B2
7315823 Bröndrup Jan 2008 B2
7349768 Bruce et al. Mar 2008 B2
7356564 Hartselle et al. Apr 2008 B2
7394345 Ehlinger et al. Jul 2008 B1
7411493 Smith Aug 2008 B2
7423580 Markhovsky et al. Sep 2008 B2
7454442 Cobleigh et al. Nov 2008 B2
7508419 Toyama et al. Mar 2009 B2
7512649 Faybishenko et al. Mar 2009 B2
7519670 Hagale et al. Apr 2009 B2
7535890 Rojas May 2009 B2
7546554 Chiu et al. Jun 2009 B2
7607096 Oreizy et al. Oct 2009 B2
7639943 Kalajan Dec 2009 B1
7650231 Gadler Jan 2010 B2
7668537 DeVries Feb 2010 B2
7770137 Forbes et al. Aug 2010 B2
7778973 Choi Aug 2010 B2
7779444 Glad Aug 2010 B2
7787886 Markhovsky et al. Aug 2010 B2
7796946 Eisenbach Sep 2010 B2
7801954 Cadiz et al. Sep 2010 B2
7856360 Kramer et al. Dec 2010 B2
8001204 Burtner et al. Aug 2011 B2
8032586 Challenger et al. Oct 2011 B2
8082255 Carlson, Jr. et al. Dec 2011 B1
8090351 Klein Jan 2012 B2
8098904 Ioffe et al. Jan 2012 B2
8099109 Altman et al. Jan 2012 B2
8112716 Kobayashi Feb 2012 B2
8131597 Hudetz Mar 2012 B2
8135166 Rhoads Mar 2012 B2
8136028 Loeb et al. Mar 2012 B1
8146001 Reese Mar 2012 B1
8161115 Yamamoto Apr 2012 B2
8161417 Lee Apr 2012 B1
8195203 Tseng Jun 2012 B1
8199747 Rojas et al. Jun 2012 B2
8208943 Petersen Jun 2012 B2
8214443 Hamburg Jul 2012 B2
8234350 Gu et al. Jul 2012 B1
8276092 Narayanan et al. Sep 2012 B1
8279319 Date Oct 2012 B2
8280406 Ziskind et al. Oct 2012 B2
8285199 Hsu et al. Oct 2012 B2
8287380 Nguyen et al. Oct 2012 B2
8301159 Hamynen et al. Oct 2012 B2
8306922 Kunal et al. Nov 2012 B1
8312086 Velusamy et al. Nov 2012 B2
8312097 Siegel et al. Nov 2012 B1
8326315 Phillips et al. Dec 2012 B2
8326327 Hymel et al. Dec 2012 B2
8332475 Rosen et al. Dec 2012 B2
8352546 Dollard Jan 2013 B1
8379130 Forutanpour et al. Feb 2013 B2
8385950 Wagner et al. Feb 2013 B1
8402097 Szeto Mar 2013 B2
8405773 Hayashi et al. Mar 2013 B2
8418067 Cheng et al. Apr 2013 B2
8423409 Rao Apr 2013 B2
8471914 Sakiyama et al. Jun 2013 B2
8472935 Fujisaki Jun 2013 B1
8510383 Hurley et al. Aug 2013 B2
8527345 Rothschild et al. Sep 2013 B2
8554627 Svendsen et al. Oct 2013 B2
8560612 Kilmer et al. Oct 2013 B2
8594680 Ledlie et al. Nov 2013 B2
8613089 Holloway et al. Dec 2013 B1
8660358 Bergboer et al. Feb 2014 B1
8660369 Llano et al. Feb 2014 B2
8660793 Ngo et al. Feb 2014 B2
8682350 Altman et al. Mar 2014 B2
8718333 Wolf et al. May 2014 B2
8724622 Rojas May 2014 B2
8732168 Johnson May 2014 B2
8744523 Fan et al. Jun 2014 B2
8745132 Obradovich Jun 2014 B2
8761800 Kuwahara Jun 2014 B2
8768876 Shim et al. Jul 2014 B2
8775972 Spiegel Jul 2014 B2
8788680 Naik Jul 2014 B1
8790187 Walker et al. Jul 2014 B2
8797415 Arnold Aug 2014 B2
8798646 Wang et al. Aug 2014 B1
8856349 Jain et al. Oct 2014 B2
8874677 Rosen et al. Oct 2014 B2
8886227 Schmidt et al. Nov 2014 B2
8909679 Root et al. Dec 2014 B2
8909725 Sehn Dec 2014 B1
8972357 Shim et al. Mar 2015 B2
8995433 Rojas Mar 2015 B2
9015285 Ebsen et al. Apr 2015 B1
9020745 Johnston et al. Apr 2015 B2
9040574 Wang et al. May 2015 B2
9055416 Rosen et al. Jun 2015 B2
9094137 Sehn et al. Jul 2015 B1
9100806 Rosen et al. Aug 2015 B2
9100807 Rosen et al. Aug 2015 B2
9113301 Spiegel et al. Aug 2015 B1
9119027 Sharon et al. Aug 2015 B2
9123074 Jacobs Sep 2015 B2
9143382 Bhogal et al. Sep 2015 B2
9143681 Ebsen et al. Sep 2015 B1
9152477 Campbell et al. Oct 2015 B1
9191776 Root et al. Nov 2015 B2
9204252 Root Dec 2015 B2
9225897 Sehn et al. Dec 2015 B1
9258459 Hartley Feb 2016 B2
9344606 Hartley et al. May 2016 B2
9385983 Sehn Jul 2016 B1
9396354 Murphy et al. Jul 2016 B1
9407712 Sehn Aug 2016 B1
9407816 Sehn Aug 2016 B1
9430783 Sehn Aug 2016 B1
9439041 Parvizi et al. Sep 2016 B2
9443227 Evans et al. Sep 2016 B2
9450907 Pridmore et al. Sep 2016 B2
9459778 Hogeg et al. Oct 2016 B2
9489661 Evans et al. Nov 2016 B2
9491134 Rosen et al. Nov 2016 B2
9532171 Allen et al. Dec 2016 B2
9537811 Allen et al. Jan 2017 B2
9628950 Noeth et al. Apr 2017 B1
9710821 Heath Jul 2017 B2
9843720 Ebsen et al. Dec 2017 B1
9854219 Sehn Dec 2017 B2
20020047868 Miyazawa Apr 2002 A1
20020078456 Hudson et al. Jun 2002 A1
20020087631 Sharma Jul 2002 A1
20020097257 Miller et al. Jul 2002 A1
20020122659 Mcgrath et al. Sep 2002 A1
20020128047 Gates Sep 2002 A1
20020144154 Tomkow Oct 2002 A1
20030001846 Davis et al. Jan 2003 A1
20030016247 Lai et al. Jan 2003 A1
20030017823 Mager et al. Jan 2003 A1
20030020623 Cao et al. Jan 2003 A1
20030023874 Prokupets et al. Jan 2003 A1
20030037124 Yamaura et al. Feb 2003 A1
20030052925 Daimon et al. Mar 2003 A1
20030101230 Benschoter et al. May 2003 A1
20030110503 Perkes Jun 2003 A1
20030126215 Udell Jul 2003 A1
20030148773 Spriestersbach et al. Aug 2003 A1
20030164856 Prager et al. Sep 2003 A1
20030229607 Zellweger et al. Dec 2003 A1
20040027371 Jaeger Feb 2004 A1
20040064429 Hirstius et al. Apr 2004 A1
20040078367 Anderson et al. Apr 2004 A1
20040111467 Willis Jun 2004 A1
20040158739 Wakai et al. Aug 2004 A1
20040189465 Capobianco et al. Sep 2004 A1
20040203959 Coombes Oct 2004 A1
20040215625 Svendsen et al. Oct 2004 A1
20040243531 Dean Dec 2004 A1
20040243688 Wugofski Dec 2004 A1
20050021444 Bauer et al. Jan 2005 A1
20050022211 Veselov et al. Jan 2005 A1
20050048989 Jung Mar 2005 A1
20050078804 Yomoda Apr 2005 A1
20050097176 Schatz et al. May 2005 A1
20050102381 Jiang et al. May 2005 A1
20050104976 Currans May 2005 A1
20050114783 Szeto May 2005 A1
20050119936 Buchanan et al. Jun 2005 A1
20050122405 Voss et al. Jun 2005 A1
20050193340 Amburgey et al. Sep 2005 A1
20050193345 Klassen et al. Sep 2005 A1
20050198128 Anderson Sep 2005 A1
20050223066 Buchheit et al. Oct 2005 A1
20050288954 McCarthy et al. Dec 2005 A1
20060026067 Nicholas et al. Feb 2006 A1
20060107297 Toyama et al. May 2006 A1
20060114338 Rothschild Jun 2006 A1
20060119882 Harris et al. Jun 2006 A1
20060242239 Morishima et al. Oct 2006 A1
20060252438 Ansamaa et al. Nov 2006 A1
20060265417 Amato et al. Nov 2006 A1
20060270419 Crowley et al. Nov 2006 A1
20060287878 Wadhwa et al. Dec 2006 A1
20070004426 Pfleging et al. Jan 2007 A1
20070038715 Collins et al. Feb 2007 A1
20070040931 Nishizawa Feb 2007 A1
20070073517 Panje Mar 2007 A1
20070073823 Cohen et al. Mar 2007 A1
20070075898 Markhovsky et al. Apr 2007 A1
20070082707 Flynt et al. Apr 2007 A1
20070136228 Petersen Jun 2007 A1
20070192128 Celestini Aug 2007 A1
20070198340 Lucovsky et al. Aug 2007 A1
20070198495 Buron et al. Aug 2007 A1
20070208751 Cowan et al. Sep 2007 A1
20070210936 Nicholson Sep 2007 A1
20070214180 Crawford Sep 2007 A1
20070214216 Carrer et al. Sep 2007 A1
20070233556 Koningstein Oct 2007 A1
20070233801 Eren et al. Oct 2007 A1
20070233859 Zhao et al. Oct 2007 A1
20070243887 Bandhole et al. Oct 2007 A1
20070244750 Grannan et al. Oct 2007 A1
20070255456 Funayama Nov 2007 A1
20070281690 Altman et al. Dec 2007 A1
20080022329 Glad Jan 2008 A1
20080025701 Ikeda Jan 2008 A1
20080032703 Krumm et al. Feb 2008 A1
20080033930 Warren Feb 2008 A1
20080043041 Hedenstroem et al. Feb 2008 A2
20080049704 Witteman et al. Feb 2008 A1
20080062141 Chandhri Mar 2008 A1
20080076505 Ngyen et al. Mar 2008 A1
20080092233 Tian et al. Apr 2008 A1
20080094387 Chen Apr 2008 A1
20080104503 Beall et al. May 2008 A1
20080109844 Baldeschweiler et al. May 2008 A1
20080120409 Sun et al. May 2008 A1
20080147730 Lee et al. Jun 2008 A1
20080148150 Mall Jun 2008 A1
20080158230 Sharma et al. Jul 2008 A1
20080168033 Ott et al. Jul 2008 A1
20080168489 Schraga Jul 2008 A1
20080189177 Anderton et al. Aug 2008 A1
20080207176 Brackbill et al. Aug 2008 A1
20080208692 Garaventi et al. Aug 2008 A1
20080214210 Rasanen et al. Sep 2008 A1
20080222545 Lemay Sep 2008 A1
20080255976 Altberg et al. Oct 2008 A1
20080256446 Yamamoto Oct 2008 A1
20080256577 Funaki et al. Oct 2008 A1
20080266421 Takahata et al. Oct 2008 A1
20080270938 Carlson Oct 2008 A1
20080288338 Wiseman et al. Nov 2008 A1
20080306826 Kramer et al. Dec 2008 A1
20080313329 Wang et al. Dec 2008 A1
20080313346 Kujawa et al. Dec 2008 A1
20080318616 Chipalkatti et al. Dec 2008 A1
20090006191 Arankalle et al. Jan 2009 A1
20090006565 Velusamy et al. Jan 2009 A1
20090015703 Kim et al. Jan 2009 A1
20090024956 Kobayashi Jan 2009 A1
20090030774 Rothschild et al. Jan 2009 A1
20090030999 Gatzke et al. Jan 2009 A1
20090040324 Nonaka Feb 2009 A1
20090042588 Lottin et al. Feb 2009 A1
20090058822 Chaudhri Mar 2009 A1
20090079846 Chou Mar 2009 A1
20090089678 Sacco et al. Apr 2009 A1
20090089710 Wood et al. Apr 2009 A1
20090093261 Ziskind Apr 2009 A1
20090132341 Klinger May 2009 A1
20090132453 Hangartner et al. May 2009 A1
20090132665 Thomsen et al. May 2009 A1
20090148045 Lee et al. Jun 2009 A1
20090153492 Popp Jun 2009 A1
20090157450 Athsani et al. Jun 2009 A1
20090157752 Gonzalez Jun 2009 A1
20090160970 Fredlund et al. Jun 2009 A1
20090163182 Gatti et al. Jun 2009 A1
20090177299 Van De Sluis Jul 2009 A1
20090192900 Collision Jul 2009 A1
20090199242 Johnson et al. Aug 2009 A1
20090215469 Fisher et al. Aug 2009 A1
20090232354 Camp, Jr. et al. Sep 2009 A1
20090234815 Boerries et al. Sep 2009 A1
20090239552 Churchill et al. Sep 2009 A1
20090249222 Schmidt et al. Oct 2009 A1
20090249244 Robinson et al. Oct 2009 A1
20090265647 Martin et al. Oct 2009 A1
20090288022 Almstrand et al. Nov 2009 A1
20090291672 Treves et al. Nov 2009 A1
20090292608 Polachek Nov 2009 A1
20090319607 Belz et al. Dec 2009 A1
20090327073 Li Dec 2009 A1
20100062794 Han Mar 2010 A1
20100082427 Burgener et al. Apr 2010 A1
20100082693 Hugg et al. Apr 2010 A1
20100100568 Papin et al. Apr 2010 A1
20100113065 Narayan et al. May 2010 A1
20100130233 Parker May 2010 A1
20100131880 Lee et al. May 2010 A1
20100131895 Wohlert May 2010 A1
20100153144 Miller et al. Jun 2010 A1
20100159944 Pascal et al. Jun 2010 A1
20100161658 Hamynen et al. Jun 2010 A1
20100161831 Haas et al. Jun 2010 A1
20100162149 Sheleheda et al. Jun 2010 A1
20100183280 Beauregard et al. Jul 2010 A1
20100185552 Deluca et al. Jul 2010 A1
20100185665 Horn et al. Jul 2010 A1
20100191631 Weidmann Jul 2010 A1
20100197318 Petersen et al. Aug 2010 A1
20100197319 Petersen et al. Aug 2010 A1
20100198683 Aarabi Aug 2010 A1
20100198694 Muthukrishnan Aug 2010 A1
20100198826 Petersen Aug 2010 A1
20100198828 Petersen et al. Aug 2010 A1
20100198862 Jennings et al. Aug 2010 A1
20100198870 Petersen et al. Aug 2010 A1
20100198917 Petersen et al. Aug 2010 A1
20100201482 Robertson et al. Aug 2010 A1
20100201536 Robertson et al. Aug 2010 A1
20100214436 Kim et al. Aug 2010 A1
20100223128 Dukellis et al. Sep 2010 A1
20100223343 Bosan et al. Sep 2010 A1
20100250109 Johnston et al. Sep 2010 A1
20100257196 Waters et al. Oct 2010 A1
20100259386 Holley et al. Oct 2010 A1
20100273509 Sweeney et al. Oct 2010 A1
20100281045 Dean Nov 2010 A1
20100306669 Della Pasqua Dec 2010 A1
20110004071 Faiola et al. Jan 2011 A1
20110010205 Richards Jan 2011 A1
20110029512 Folgner et al. Feb 2011 A1
20110040783 Uemichi et al. Feb 2011 A1
20110040804 Peirce et al. Feb 2011 A1
20110050909 Ellenby et al. Mar 2011 A1
20110050915 Wang et al. Mar 2011 A1
20110064388 Brown et al. Mar 2011 A1
20110066743 Hurley et al. Mar 2011 A1
20110083101 Sharon et al. Apr 2011 A1
20110102630 Rukes May 2011 A1
20110119133 Igelman et al. May 2011 A1
20110137881 Cheng et al. Jun 2011 A1
20110145564 Moshir et al. Jun 2011 A1
20110159890 Fortescue et al. Jun 2011 A1
20110164163 Bilbrey et al. Jul 2011 A1
20110197194 D'Angelo et al. Aug 2011 A1
20110202598 Evans et al. Aug 2011 A1
20110202968 Nurmi Aug 2011 A1
20110211534 Schmidt et al. Sep 2011 A1
20110213845 Logan et al. Sep 2011 A1
20110215966 Kim et al. Sep 2011 A1
20110225048 Nair Sep 2011 A1
20110238763 Shin et al. Sep 2011 A1
20110255736 Thompson et al. Oct 2011 A1
20110273575 Lee Nov 2011 A1
20110282799 Huston Nov 2011 A1
20110283188 Farrenkopf Nov 2011 A1
20110314419 Dunn et al. Dec 2011 A1
20110320373 Lee et al. Dec 2011 A1
20120150978 Monaco et al. Jan 2012 A1
20120028659 Whitney et al. Feb 2012 A1
20120033718 Kauffman et al. Feb 2012 A1
20120036015 Sheikh Feb 2012 A1
20120036443 Ohmori et al. Feb 2012 A1
20120054797 Skog et al. Mar 2012 A1
20120059722 Rao Mar 2012 A1
20120062805 Candelore Mar 2012 A1
20120084731 Filman et al. Apr 2012 A1
20120084835 Thomas et al. Apr 2012 A1
20120099800 Llano et al. Apr 2012 A1
20120108293 Law et al. May 2012 A1
20120110096 Smarr et al. May 2012 A1
20120113143 Adhikari et al. May 2012 A1
20120113272 Hata May 2012 A1
20120123830 Svendsen et al. May 2012 A1
20120123871 Svendsen et al. May 2012 A1
20120123875 Svendsen et al. May 2012 A1
20120124126 Alcazar et al. May 2012 A1
20120124176 Curtis et al. May 2012 A1
20120124458 Cruzada May 2012 A1
20120131507 Sparandara et al. May 2012 A1
20120131512 Takeuchi et al. May 2012 A1
20120143760 Abulafia et al. Jun 2012 A1
20120165100 Lalancette et al. Jun 2012 A1
20120166971 Sachson et al. Jun 2012 A1
20120169855 Oh Jul 2012 A1
20120172062 Altman et al. Jul 2012 A1
20120173991 Roberts et al. Jul 2012 A1
20120176401 Hayward et al. Jul 2012 A1
20120184248 Speede Jul 2012 A1
20120197724 Kendall Aug 2012 A1
20120200743 Blanchflower et al. Aug 2012 A1
20120209924 Evans et al. Aug 2012 A1
20120210244 De Francisco Lopez et al. Aug 2012 A1
20120212632 Mate et al. Aug 2012 A1
20120220264 Kawabata Aug 2012 A1
20120226748 Bosworth et al. Sep 2012 A1
20120233000 Fisher et al. Sep 2012 A1
20120236162 Imamura Sep 2012 A1
20120239761 Linner et al. Sep 2012 A1
20120250951 Chen Oct 2012 A1
20120252418 Kandekar et al. Oct 2012 A1
20120254325 Majeti et al. Oct 2012 A1
20120278387 Garcia et al. Nov 2012 A1
20120278692 Shi Nov 2012 A1
20120290637 Perantatos et al. Nov 2012 A1
20120299954 Wada et al. Nov 2012 A1
20120304052 Tanaka et al. Nov 2012 A1
20120304080 Wormald et al. Nov 2012 A1
20120307096 Bray et al. Dec 2012 A1
20120307112 Kunishige et al. Dec 2012 A1
20120319904 Lee et al. Dec 2012 A1
20120323933 He et al. Dec 2012 A1
20120324018 Metcalf et al. Dec 2012 A1
20130006759 Srivastava et al. Jan 2013 A1
20130008238 Hogeg et al. Jan 2013 A1
20130024757 Doll et al. Jan 2013 A1
20130036364 Johnson Feb 2013 A1
20130045753 Obermeyer et al. Feb 2013 A1
20130050260 Reitan Feb 2013 A1
20130055083 Fino Feb 2013 A1
20130057587 Leonard et al. Mar 2013 A1
20130059607 Herz et al. Mar 2013 A1
20130060690 Oskolkov et al. Mar 2013 A1
20130063369 Malhotra et al. Mar 2013 A1
20130067027 Song et al. Mar 2013 A1
20130071093 Hanks et al. Mar 2013 A1
20130080254 Thramann Mar 2013 A1
20130085790 Palmer et al. Apr 2013 A1
20130086072 Peng et al. Apr 2013 A1
20130090171 Holton et al. Apr 2013 A1
20130095857 Garcia et al. Apr 2013 A1
20130104053 Thornton et al. Apr 2013 A1
20130110885 Brundrett, III May 2013 A1
20130111514 Slavin et al. May 2013 A1
20130128059 Kristensson May 2013 A1
20130129252 Lauper May 2013 A1
20130132477 Bosworth et al. May 2013 A1
20130145286 Feng et al. Jun 2013 A1
20130159110 Rajaram et al. Jun 2013 A1
20130159919 Leydon Jun 2013 A1
20130169822 Zhu et al. Jul 2013 A1
20130173729 Starenky et al. Jul 2013 A1
20130182133 Tanabe Jul 2013 A1
20130185131 Sinha et al. Jul 2013 A1
20130191198 Carlson et al. Jul 2013 A1
20130194301 Robbins et al. Aug 2013 A1
20130198176 Kim Aug 2013 A1
20130218965 Abrol et al. Aug 2013 A1
20130218968 Mcevilly et al. Aug 2013 A1
20130222323 Mckenzie Aug 2013 A1
20130227476 Frey Aug 2013 A1
20130232194 Knapp et al. Sep 2013 A1
20130263031 Oshiro et al. Oct 2013 A1
20130265450 Barnes, Jr. Oct 2013 A1
20130267253 Case et al. Oct 2013 A1
20130275505 Gauglitz et al. Oct 2013 A1
20130290443 Collins et al. Oct 2013 A1
20130304646 De Geer Nov 2013 A1
20130311255 Cummins et al. Nov 2013 A1
20130325964 Berberat Dec 2013 A1
20130344896 Kirmse et al. Dec 2013 A1
20130346869 Asver et al. Dec 2013 A1
20130346877 Borovoy et al. Dec 2013 A1
20140006129 Heath Jan 2014 A1
20140011538 Mulcahy et al. Jan 2014 A1
20140019264 Wachman et al. Jan 2014 A1
20140032682 Prado et al. Jan 2014 A1
20140043204 Basnayake et al. Feb 2014 A1
20140045530 Gordon et al. Feb 2014 A1
20140047016 Rao Feb 2014 A1
20140047045 Baldwin et al. Feb 2014 A1
20140047335 Lewis et al. Feb 2014 A1
20140049652 Moon et al. Feb 2014 A1
20140052485 Shidfar Feb 2014 A1
20140052633 Gandhi Feb 2014 A1
20140057660 Wager Feb 2014 A1
20140082651 Sharifi Mar 2014 A1
20140092130 Anderson et al. Apr 2014 A1
20140096029 Schultz Apr 2014 A1
20140114565 Aziz et al. Apr 2014 A1
20140122658 Haeger et al. May 2014 A1
20140122787 Shalvi et al. May 2014 A1
20140129953 Spiegel May 2014 A1
20140143143 Fasoli et al. May 2014 A1
20140149519 Redfern et al. May 2014 A1
20140155102 Cooper et al. Jun 2014 A1
20140173424 Hogeg et al. Jun 2014 A1
20140173457 Wang et al. Jun 2014 A1
20140189592 Benchenaa et al. Jul 2014 A1
20140207679 Cho Jul 2014 A1
20140214471 Schreiner, III Jul 2014 A1
20140222564 Kranendonk et al. Aug 2014 A1
20140258405 Perkin Sep 2014 A1
20140265359 Cheng et al. Sep 2014 A1
20140266703 Dalley, Jr. et al. Sep 2014 A1
20140279061 Elimeliah et al. Sep 2014 A1
20140279436 Dorsey et al. Sep 2014 A1
20140279540 Jackson Sep 2014 A1
20140280537 Pridmore et al. Sep 2014 A1
20140282096 Rubinstein et al. Sep 2014 A1
20140287779 O'keefe et al. Sep 2014 A1
20140289833 Briceno Sep 2014 A1
20140306986 Gottesman et al. Oct 2014 A1
20140317302 Naik Oct 2014 A1
20140324627 Haver et al. Oct 2014 A1
20140324629 Jacobs Oct 2014 A1
20140325383 Brown et al. Oct 2014 A1
20150020086 Chen et al. Jan 2015 A1
20150046278 Pei et al. Feb 2015 A1
20150071619 Brough Mar 2015 A1
20150087263 Branscomb et al. Mar 2015 A1
20150088622 Ganschow et al. Mar 2015 A1
20150095020 Leydon Apr 2015 A1
20150096042 Mizrachi Apr 2015 A1
20150116529 Wu et al. Apr 2015 A1
20150169827 Laborde Jun 2015 A1
20150172534 Miyakawaa et al. Jun 2015 A1
20150178260 Brunson Jun 2015 A1
20150222814 Li et al. Aug 2015 A1
20150261917 Smith Sep 2015 A1
20150312184 Langholz et al. Oct 2015 A1
20150350136 Flynn, III et al. Dec 2015 A1
20150365795 Allen et al. Dec 2015 A1
20150378502 Hu et al. Dec 2015 A1
20160006927 Sehn Jan 2016 A1
20160014063 Hogeg et al. Jan 2016 A1
20160085773 Chang et al. Mar 2016 A1
20160085863 Allen et al. Mar 2016 A1
20160099901 Allen et al. Apr 2016 A1
20160180887 Sehn Jun 2016 A1
20160182422 Sehn Jun 2016 A1
20160182875 Sehn Jun 2016 A1
20160239248 Sehn Aug 2016 A1
20160277419 Allen et al. Sep 2016 A1
20160321708 Sehn Nov 2016 A1
20170006094 Abou Mahmoud et al. Jan 2017 A1
20170061308 Chen et al. Mar 2017 A1
20170287006 Azmoodeh et al. Oct 2017 A1
Foreign Referenced Citations (31)
Number Date Country
2887596 Jul 2015 CA
2051480 Apr 2009 EP
2151797 Dec 2010 EP
2399928 Sep 2004 GB
19990073076 Oct 1999 KR
20010078417 Aug 2001 KR
WO-1996024213 Aug 1996 WO
WO-1999063453 Dec 1999 WO
WO-2000058882 Oct 2000 WO
WO-2001029642 Apr 2001 WO
WO-2001050703 Jul 2001 WO
WO-2006118755 Nov 2006 WO
2007092668 Aug 2007 WO
WO-2007092668 Aug 2007 WO
WO-2009043020 Apr 2009 WO
WO-2011040821 Apr 2011 WO
WO-2011119407 Sep 2011 WO
WO-2013045753 Apr 2013 WO
WO-2013008238 Nov 2013 WO
WO-2014068573 May 2014 WO
WO-2014115136 Jul 2014 WO
WO-2014194262 Dec 2014 WO
WO-2015192026 Dec 2015 WO
WO-2016044424 Mar 2016 WO
WO-2016054562 Apr 2016 WO
WO-2016065131 Apr 2016 WO
WO-2016100318 Jun 2016 WO
WO-2016100318 Jun 2016 WO
WO-2016100342 Jun 2016 WO
WO-2016149594 Sep 2016 WO
WO-2016179166 Nov 2016 WO
Non-Patent Literature Citations (187)
Entry
“A Whole New Story”, [Online]. Retrieved from the Internet: <https://www.snap.com/en-US/news/>, (2017), 13 pgs.
“Adding a watermark to your photos”, eBay, [Online]. Retrieved from the Internet:<URL:http://pages.ebay.com/help/sell/pictures.html>, (accessed May 24, 2017), 4 pgs.
“U.S. Appl. No. 14/304,855, Corrected Notice of Allowance dated Jun. 26, 2015”, 8 pgs.
“U.S. Appl. No. 14/304,855, Final Office Action dated Feb. 18, 2015”, 10 pgs.
“U.S. Appl. No. 14/304,855, Non Final Office Action dated Mar. 18, 2015”, 9 pgs.
“U.S. Appl. No. 14/304,855, Non Final Office Action dated Oct. 22, 2014”, 11 pgs.
“U.S. Appl. No. 14/304,855, Notice of Allowance dated Jun. 1, 2015”, 11 pgs.
“U.S. Appl. No. 14/304,855, Response filed Feb. 25, 2015 to Final Office Action dated Feb. 18, 2015”, 5 pgs.
“U.S. Appl. No. 14/304,855, Response filed Apr. 1, 2015 to Non Final Office Action dated Mar. 18, 2015”, 4 pgs.
“U.S. Appl. No. 14/304,855, Response filed Nov. 7, 2014 to Non Final Office Action dated Oct. 22, 2014”, 5 pgs.
“U.S. Appl. No. 14/494,226, Examiner Interview Summary dated Oct. 27, 2016”, 3 pgs.
“U.S. Appl. No. 14/494,226, Examiner Interview Summary dated Dec. 20, 2017”, 2 pgs.
“U.S. Appl. No. 14/494,226, Final Office Action dated Mar. 7, 2017”, 34 pgs.
“U.S. Appl. No. 14/494,226, Non Final Office Action dated Sep. 7, 2017”, 36 pgs.
“U.S. Appl. No. 14/494,226, Non Final Office Action dated Sep. 12, 2016”, 32 pgs.
“U.S. Appl. No. 14/494,226, Response filed Jan. 8, 2018 to Non Final Office Action dated Sep. 7, 2017”, 15 pgs.
“U.S. Appl. No. 14/494,226, Response filed Jul. 7, 2017 to Final Office Action dated Mar. 7, 2017”, 13 pgs.
“U.S. Appl. No. 14/494,226, Response filed Dec. 12, 2016 to Non Final Office Action dated Sep. 12, 2016”, 16 pgs.
“U.S. Appl. No. 14/505,478, Advisory Action dated Apr. 14, 2015”, 3 pgs.
“U.S. Appl. No. 14/505,478, Corrected Notice of Allowance dated May 18, 2016”, 2 pgs.
“U.S. Appl. No. 14/505,478, Corrected Notice of Allowance dated Jul. 22, 2016”, 2 pgs.
“U.S. Appl. No. 14/505,478, Final Office Action dated Mar. 17, 2015”, 16 pgs.
“U.S. Appl. No. 14/505,478, Non Final Office Action dated Jan. 27, 2015”, 13 pgs.
“U.S. Appl. No. 14/505,478, Non Final Office Action dated Sep. 4, 2015”, 19 pgs.
“U.S. Appl. No. 14/505,478, Notice of Allowance dated Apr. 28, 2016”, 11 pgs.
“U.S. Appl. No. 14/505,478, Notice of Allowance dated Aug. 26, 2016”, 11 pgs.
“U.S. Appl. No. 14/505,478, Response filed Jan. 30, 2015 to Non Final Office Action dated Jan. 27, 2015”, 10 pgs.
“U.S. Appl. No. 14/505,478, Response filed Mar. 4, 2016 to Non Final Office Action dated Sep. 4, 2015”, 12 pgs.
“U.S. Appl. No. 14/505,478, Response filed Apr. 1, 2015 to Final Office Action dated Mar. 17, 2015”, 6 pgs.
“U.S. Appl. No. 14/506,478, Response filed Aug. 17, 2015 to Advisory Action dated Apr. 14, 2015”, 10 pgs.
“U.S. Appl. No. 14/523,728, Non Final Office Action dated Dec. 12, 2014”, 10 pgs.
“U.S. Appl. No. 14/523,728, Notice of Allowance dated Mar. 24, 2015”, 8 pgs.
“U.S. Appl. No. 14/523,728, Notice of Allowance dated Apr. 15, 2015”, 8 pgs.
“U.S. Appl. No. 14/523,728, Notice of Allowance dated Jun. 5, 2015”, 8 pgs.
“U.S. Appl. No. 14/523,728, Response filed Aug. 25, 2014 to Non Final Office Action dated Jan. 16, 2015”, 5 pgs.
“U.S. Appl. No. 14/529,064, Examiner Interview Summary dated May 23, 2016”, 3 pgs.
“U.S. Appl. No. 14/529,064, Examiner Interview Summary dated Nov. 17, 2016”, 3 pgs.
“U.S. Appl. No. 14/529,064, Final Office Action dated Aug. 11, 2015”, 23 pgs.
“U.S. Appl. No. 14/529,064, Final Office Action dated Aug. 24, 2016”, 23 pgs.
“U.S. Appl. No. 14/529,064, Non Final Office Action dated Mar. 12, 2015”, 20 pgs.
“U.S. Appl. No. 14/529,064, Non Final Office Action dated Apr. 6, 2017”, 25 pgs.
“U.S. Appl. No. 14/529,064, Non Final Office Action dated Apr. 18, 2016”, 21 pgs.
“U.S. Appl. No. 14/529,064, Response filed Feb. 5, 2015 to Restriction Requirement dated Feb. 2, 2015”, 6 pgs.
“U.S. Appl. No. 14/529,064, Response filed Mar. 26, 2015 to Non Final Office Action dated Mar. 12, 2015”, 8 pgs.
“U.S. Appl. No. 14/529,064, Response filed Jul. 18, 2016 to Non Final Office Action dated Apr. 18, 2016”, 20 pgs.
“U.S. Appl. No. 14/529,064, Response filed Sep. 6, 2017 to Non Final Office Action dated Apr. 6, 2017”, 19 pgs.
“U.S. Appl. No. 14/529,064, Response filed Oct. 12, 2015 to Final Office Action dated Aug. 11, 2015”, 19 pgs.
“U.S. Appl. No. 14/529,064, Response filed Dec. 21, 2016 to Final Office Action dated Aug. 24, 2016”, 17 pgs.
“U.S. Appl. No. 14/529,064, Restriction Requirement dated Feb. 2, 2015”, 5 pgs.
“U.S. Appl. No. 14/539,391, Notice of Allowance dated Mar. 5, 2015”, 17 pgs.
“U.S. Appl. No. 14/548,590, Advisory Action dated Nov. 18, 2016”, 3 pgs.
“U.S. Appl. No. 14/548,590, Final Office Action dated Jul. 5, 2016”, 16 pgs.
“U.S. Appl. No. 14/548,590, Final Office Action dated Jul. 18, 2017”, 20 pgs.
“U.S. Appl. No. 14/548,590, Final Office Action dated Sep. 16, 2015”, 15 pgs.
“U.S. Appl. No. 14/548,590, Non Final Office Action dated Jan. 9, 2017”, 14 pgs.
“U.S. Appl. No. 14/548,590, Non Final Office Action dated Feb. 11, 2016”, 16 pgs.
“U.S. Appl. No. 14/548,590, Non Final Office Action dated Apr. 20, 2015”, 14 pgs.
“U.S. Appl. No. 14/548,590, Response filed May 9, 2017 to Non Final Office Action dated Jan. 9, 2017”, 17 pgs.
“U.S. Appl. No. 14/548,590, Response filed May 10, 2016 to Non Final Office Action dated Feb. 11, 2016”, 14 pgs.
“U.S. Appl. No. 14/548,590, Response filed Nov. 7, 2016 to Final Office Action dated Jul. 5, 2016”, 14 pgs.
“U.S. Appl. No. 14/548,590, Response filed Dec. 16, 2015 to Final Office Action dated Sep. 16, 2015”, 13 pgs.
“U.S. Appl. No. 14/548,590, Response filed Jun. 16, 2015 to Non Final Office Action dated Apr. 20, 2015”, 19 pgs.
“U.S. Appl. No. 14/578,258, Examiner Interview Summary dated Nov. 25, 2015”, 3 pgs.
“U.S. Appl. No. 14/578,258, Non Final Office Action dated Jun. 10, 2015”, 12 pgs.
“U.S. Appl. No. 14/578,258, Notice of Allowance dated Feb. 26, 2016”, 5 pgs.
“U.S. Appl. No. 14/578,258, Response filed Dec. 10, 2015 to Non Final Office Action dated Jun. 10, 2015”, 11 pgs.
“U.S. Appl. No. 14/578,271, Final Office Action dated Dec. 3, 2015”, 15 pgs.
“U.S. Appl. No. 14/578,271, Non Final Office Action dated Aug. 7, 2015”, 12 pgs.
“U.S. Appl. No. 14/578,271, Notice of Allowance dated Dec. 7, 2016”, 7 pgs.
“U.S. Appl. No. 14/578,271, Response filed Feb. 9, 2016 to Final Office Action dated Dec. 3, 2015”, 10 pgs.
“U.S. Appl. No. 14/578,271, Response filed Jun. 19, 2015 to Restriction Requirement dated Apr. 23, 2015”, 6 pgs.
“U.S. Appl. No. 14/578,271, Response filed Oct. 28, 2015 to Non Final Office Action dated Aug. 7, 2015”, 9 pgs.
“U.S. Appl. No. 14/578,271, Restriction Requirement dated Apr. 23, 2015”, 8 pgs.
“U.S. Appl. No. 14/594,410, Non Final Office Action dated Jan. 4, 2016”, 10 pgs.
“U.S. Appl. No. 14/594,410, Notice of Allowance dated Aug. 2, 2016”, 5 pgs.
“U.S. Appl. No. 14/594,410, Notice of Allowance dated Dec. 15, 2016”, 6 pgs.
“U.S. Appl. No. 14/594,410, Response filed Jul. 1, 2016 to Non Final Office Action dated Jan. 4, 2016”, 10 pgs.
“U.S. Appl. No. 14/612,692, Examiner Interview Summary dated Jan. 29, 2016”, 5 pgs.
“U.S. Appl. No. 14/612,692, Examiner Interview Summary dated Jul. 6, 2016”, 4 pgs.
“U.S. Appl. No. 14/612,692, Examiner Interview Summary dated Aug. 14, 2015”, 3 pgs.
“U.S. Appl. No. 14/612,692, Examiner Interview Summary dated Sep. 8, 2016”, 3 pgs.
“U.S. Appl. No. 14/612,692, Final Office Action dated Aug. 15, 2016”, 18 pgs.
“U.S. Appl. No. 14/612,692, Final Office Action dated Nov. 23, 2015”, 15 pgs.
“U.S. Appl. No. 14/612,692, Non Final Office Action dated Jan. 3, 2017”, 17 pgs.
“U.S. Appl. No. 14/612,692, Non Final Office Action dated Mar. 28, 2016”, 15 pgs.
“U.S. Appl. No. 14/612,692, Non Final Office Action dated Jul. 20, 2015”, 25 pgs.
“U.S. Appl. No. 14/612,692, Response filed Feb. 23, 2016 to Final Office Action dated Nov. 23, 2015”, 10 pgs.
“U.S. Appl. No. 14/612,692, Response filed May 3, 2017 to Non Final Office Action dated Jan. 3, 2017”, 18 pgs.
“U.S. Appl. No. 14/612,692, Response filed Nov. 14, 2016 to Final Office Action dated Aug. 15, 2016”, 15 pgs.
“U.S. Appl. No. 14/612,692, Response filed Jun. 28, 2016 to Non Final Office Action dated Mar. 28, 2016”, 14 pgs.
“U.S. Appl. No. 14/612,692, Response filed Oct. 19, 2015 to Non Final Office Action dated Jul. 20, 2015”, 11 pgs.
“U.S. Appl. No. 14/634,417, Advisory Action dated Mar. 14, 2017”, 3 pgs.
“U.S. Appl. No. 14/634,417, Final Office Action dated Jan. 31, 2017”, 27 pgs.
“U.S. Appl. No. 14/634,417, Non Final Office Action dated Aug. 30, 2016”, 23 pgs.
“U.S. Appl. No. 14/634,417, Response filed Mar. 2, 2017 to Final Office Action dated Jan. 31, 2017”, 23 pgs.
“U.S. Appl. No. 14/634,417, Response filed Nov. 30, 2016 to Non Final Office Action dated Aug. 30, 2016”, 18 pgs.
“U.S. Appl. No. 14/634,417, Notice of Allowance dated Jul. 27, 2015”, 17 pgs.
“U.S. Appl. No. 14/704,212, Final Office Action dated Jun. 17, 2016”, 12 pgs.
“U.S. Appl. No. 14/704,212, Non Final Office Action dated Dec. 4, 2015”, 17 pgs.
“U.S. Appl. No. 14/704,212, Response filed Mar. 4, 2016 to Non Final Office Action dated Dec. 4, 2015”, 11 pgs.
“U.S. Appl. No. 14/738,069, Non Final Office Action dated Mar. 21, 2016”, 12 pgs.
“U.S. Appl. No. 14/738,069, Notice of Allowance dated Aug. 17, 2016”, 6 pgs.
“U.S. Appl. No. 14/738,069, Response filed Jun. 10, 2016 to Non Final Office Action dated Mar. 21, 2016”, 10 pgs.
“U.S. Appl. No. 14/808,283, Notice of Allowance dated Apr. 12, 2016”, 9 pgs.
“U.S. Appl. No. 14/808,283, Notice of Allowance dated Jul. 14, 2016”, 8 pgs.
“U.S. Appl. No. 14/808,283, Preliminary Amendment filed Jul. 24, 2015”, 8 pgs.
“U.S. Appl. No. 14/841,987, Notice of Allowance dated Mar. 29, 2017”, 17 pgs.
“U.S. Appl. No. 14/841,987, Notice of Allowance dated Aug. 7, 2017”, 8 pgs.
“U.S. Appl. No. 14/967,472, Final Office Action dated Mar. 10, 2017”, 15 pgs.
“U.S. Appl. No. 14/967,472, Non Final Office Action dated Sep. 8, 2016”, 11 pgs.
“U.S. Appl. No. 14/967,472, Preliminary Amendment filed Dec. 15, 2015”, 6 pgs.
“U.S. Appl. No. 14/967,472, Response filed Dec. 5, 2016 to Non Final Office Action dated Sep. 8, 2016”, 11 pgs.
“U.S. Appl. No. 15/137,608, Preliminary Amendment filed Apr. 26, 2016”, 6 pgs.
“U.S. Appl. No. 15/152,975, Non Final Office Action dated Jan. 12, 2017”, 36 pgs.
“U.S. Appl. No. 15/152,975, Preliminary Amendment filed May 19, 2016”, 8 pgs.
“U.S. Appl. No. 15/208,460, Notice of Allowance dated Feb. 27, 2017”, 8 pgs.
“U.S. Appl. No. 15/208,460, Notice of Allowance dated Dec. 30, 2016”, 9 pgs.
“U.S. Appl. No. 15/208,460, Supplemental Preliminary Amendment filed Jul. 18, 2016”, 8 pgs.
“U.S. Appl. No. 15/224,262, Notice of Allowance dated Mar. 2, 2017”, 14 pgs.
“U.S. Appl. No. 15/224,312, Preliminary Amendment filed Feb. 1, 2017”, 11 pgs.
“U.S. Appl. No. 15/224,343, Preliminary Amendment filed Jan. 31, 2017”, 10 pgs.
“U.S. Appl. No. 15/224,355, Preliminary Amendment filed Apr. 3, 2017”, 12 pgs.
“U.S. Appl. No. 15/224,372, Preliminary Amendment filed May 5, 2017”, 10 pgs.
“U.S. Appl. No. 15/224,359, Preliminary Amendment filed Apr. 19, 2017”, 8 pgs.
“U.S. Appl. No. 15/298,806, Non Final Office Action dated Jun. 12, 2017”, 26 pgs.
“U.S. Appl. No. 15/298,806, Preliminary Amendment filed Oct. 21, 2016”, 8 pgs.
“U.S. Appl. No. 15/298,806, Response filed Sep. 12, 2017 to Non Final Office Action dated Jun. 12, 2017”, 12 pgs.
“U.S. Appl. No. 15/416,846, Preliminary Amendment filed Feb. 18, 2017”, 10 pgs.
“U.S. Appl. No. 15/486,111, Corrected Notice of Allowance dated Sep. 7, 2017”, 3 pgs.
“U.S. Appl. No. 15/486,111, Non Final Office Action dated May 9, 2017”, 17 pgs.
“U.S. Appl. No. 15/486,111, Notice of Allowance dated Aug. 30, 2017”, 5 pgs.
“U.S. Appl. No. 15/486,111, Response filed Aug. 9, 2017 to Non Final Office Action dated May 9, 2017”, 11 pgs.
“BlogStomp”, [Online]. Retrieved from the Internet: <URL:http://stompsoftware.com/blogstomp>, (accessed May 24, 2017), 12 pgs.
“Canadian Application Serial No. 2,894,332 Response filed Jan. 24, 2017 to Office Action dated Aug. 16, 2016”, 15 pgs.
“Canadian Application Serial No. 2,894,332, Office Action dated Aug. 16, 2016”, 4 pgs.
“Canadian Application Serial No. 2,910,158, Office Action dated Dec. 15, 2016”, 5 pgs.
“Canadian Application Serial No. 2,910,158, Response filed Apr. 11, 2017 to Office Action dated Dec. 15, 2016”, 21 pgs.
“Cup Magic Starbucks Holiday Red Cups come to life with AR app”, [Online]. Retrieved from the Internet: <http://www.blastradius.com/work/cup-magic>, (2016), 7 pgs.
“Daily App: InstaPlace (iOS/Android): Give Pictures a Sense of Place”, TechPP, [Online]. Retrieved from the Internet: <URL;http://techpp.com/2013/02/15/instaplace-app-review>, (2013), 13 pgs.
“How Snaps Are Stored and Deleted”, Snapchat, [Online]. Retrieved from the Internet: <URL: https://web.archive.org/web/20130607042322/http://blog.snapchat.com/post/50060403002/how-snaps-are-stored-and-deleted, (May 9, 2013), 2 pgs.
“InstaPlace Photo App Tell the Whole Story”, [Online]. Retrieved from the Internet; <https://youtu.be/uF_gFkg1hBM>, (Nov. 8, 2013), 113 pgs.
“International Application Serial No. PCT/EP2008/063682, International Search Report dated Nov. 24, 2008”, 3 pgs.
“International Application Serial No. PCT/US2014/040346, International Search Report dated Mar. 23, 2015”, 2 pgs.
“International Application Serial No. PCT/US2014/040346, Written Opinion dated Mar. 23, 2015”, 6 pgs.
“International Application Serial No. PCT/US2015/035591, International Preliminary Report on Patentability dated Dec. 22, 2016”, 7 pgs.
“International Application Serial No. PCT/US2015/035591, International Search Report dated Aug. 11, 2015”, 5 pgs.
“International Application Serial No. PCT/US2015/035591, International Written Opinion dated Aug. 11, 2015”, 5 pgs.
“International Application Serial No. PCT/US2015/037251, International Search Report dated Sep. 29, 2015”, 2 pgs.
“International Application Serial No. PCT/US2015/050424, International Search Report dated Dec. 4, 2015”, 2 pgs.
“International Application Serial No. PCT/US2015/050424, Written Opinion dated Dec. 4, 2015”, 10 pgs.
“International Application Serial No. PCT/US2015/053811, International Preliminary Report on Patentability dated Apr. 13, 2017”, 9 pgs.
“International Application Serial No. PCT/US2015/053811, International Search Report dated Nov. 23, 2015”, 5 pgs.
“International Application Serial No. PCT/US2015/053811, Written Opinion dated Nov. 23, 2015”, 8 pgs.
“International Application Serial No. PCT/US2015/056884, International Preliminary Report on Patentability dated May 4, 2017”, 8 pgs.
“International Application Serial No. PCT/US2015/056884, International Search Report dated Dec. 22, 2015”, 5 pgs.
“International Application Serial No. PCT/US2015/056884, Written Opinion dated Dec. 22, 2015”, 6 pgs.
“International Application Serial No. PCT/US2015/065785, International Search Report dated Jul. 21, 2016”, 5 pgs.
“International Application Serial No. PCT/US2015/065785, Written Opinion dated Jul. 21, 2016”, 5 pgs.
“International Application Serial No. PCT/US2015/065821, International Search Report dated Mar. 3, 2016”, 2 pgs.
“International Application Serial No. PCT/US2015/065821, Written Opinion dated Mar. 3, 2016”, 3 pgs.
“International Application Serial No. PCT/US2016/023085, International Preliminary Report on Patentability dated Sep. 28, 2017”, 8 pgs.
“International Application Serial No. PCT/US2016/023085, International Search Report dated Jun. 17, 2016”, 5 pgs.
“International Application Serial No. PCT/US2016/023085, Written Opinion dated Jun. 17, 2016”, 6 Pgs.
“Introducing Snapchat Stories”, [Online]. Retrieved from the Internet<https://www.youtube.com/watch?v=88Cu3yN-LIM>, (Oct. 3, 2013), 92 pgs.
“iVisit Mobile: Getting Started”, IVISIT, (Dec. 4, 2013), 1-16.
“Macy's Believe-o-Magic”, [Online]. Retrieved from the Internet: <https://www.youtube.com/watch?v=xvzRXy3J0Z0>, (Nov. 7, 2011), 102 pgs.
“Macy's Introduces Augmented Reality Experience in Stores across Country as Part of Its 2011 “Believe” Campaign”, [Online]. Retrieved from the Internet: <http://www.businesswire.com/news/home/20111102006759/en/Macy%E2%80%99s-Introduces-Augmented-Reality-Experience-Stores-Country>., (Nov. 2, 2011), 6 pgs.
“PluralEyes by Red Giant”, © 2002-2015 Red Giant LLC, [Online]. Retrieved from the Internet: <URL: http://www.redgiant.com/products/pluraleyes/, (Accessed Nov. 11, 2015), 5 pgs.
“Starbucks Cup Magic”, [Online]. Retrieved from the Internet: <https://www.youtube.com/watch?v=RWwQXi9RG0w>, (Nov. 8, 2011), 87 pgs.
“Starbucks Cup Magic for Valentine's Day”, [Online]. Retrieved from the Internet: <https://www.youtube.com/watch?v=8nvqOzjq10w>, (Feb. 6, 2012), 88 pgs.
“Starbucks Holiday Red Cups Come to Life, Signaling the Return of the Merriest Season”, [Online]. Retrieved from the Internet: <http://www.businesswire.com/news/home/20111115005744/en/2479513/Starbucks-Holiday-Red-Cups-Life-Signaling-Return>, (Nov. 15, 2011), 5 pgs.
Carthy, Roi, “Dear All Photo Apps: Mobli Just Won Filters”, [Online]. Retrieved from the Internet: <https://techcrunch.com/2011/09/08/mobli-filters>, (Sep. 8, 2011), 10 pgs.
Castelluccia, Claude, et al., “EphPub: Toward robust Ephemeral Publishing”, Network Protocols (ICNP), 2011 19th IEEE International Conference on, IEEE, (Oct. 17, 2011), 18 pgs.
Clarke, Tangier, “Automatically syncing multiple clips and lots of audio like PluralEyes possible?”, [Online]. Retrieved from the Internet: <https://forums.creativecow.net/thread/344/20553>, (May 21, 2013), 8 pgs.
Janthong, Isaranu, “Android App Review Thailand”, [Online]. Retrieved from the Internet: <http://www.android-free-app-review.com/2013/01/instaplace-android-google-play-store.html>, (Jan. 23, 2013), 9 pgs.
Leyden, John, “This SMS will self-destruct in 40 seconds”, [Online]. Retrieved from the Internet: <http://www.theregister.co.uk/2005/12/12/stealthtext/>, (Dec. 12, 2005), 1 pg.
MacLeod, Duncan, “Macys Believe-o-Magic App”, [Online]. Retrieved from the Internet: <http://theinspirationroom.com/daily/2011/macys-believe-o-magic-app>, (Nov. 14, 2011), 10 pgs.
MacLeod, Duncan, “Starbucks Cup Magic—Let's Merry”, [Online]. Retrieved from the Internet: <http://theinspirationroom.com/daily/2011/starbucks-cup-magic>, (Nov. 12, 2011), 8 pgs.
Melanson, Mike, “This text message will self destruct in 60 seconds”, readwrite.com, [Online]. Retrieved from the Internet: <http://readwrite.com/2011/02/11/this_text_message_will_self_destruct_in_60_seconds>, (Feb. 18, 2015), 4 pgs.
Notopoulos, Katie, “A Guide to the New Snapchat Filters and Big Fonts”, [Online]. Retrieved from the Internet: <https://www.buzzfeed.com/katienotopoulos/a-guide-to-the-new-snapchat-filters-and-big-fonts?utm_term=.bkQ9qVZWe#.nv58YXpkV>, (Dec. 22, 2013), 13 pgs.
Panzarino, Matthew, “Snapchat Adds Filters, a Replay Function and for Whatever Reason, Time, Temperature and Speed Overlays”, [Online]. Retrieved from the Internet: <https://techcrunch.com/2013/12/20/snapchat-adds-filters-new-font-and-for-some-reason-time-temperature-and-speed-overlays/>, (Dec. 20, 2013), 12 pgs.
Sawers, Paul, “Snapchat for iOS Lets You Send Photos to Friends and Set How long They're Visible for”, [Online]. Retrieved from the Internet: <http://thenextweb.com/apps/2012/05/07/Snapchat-for-ios-lets-you-send-photos-to-friends-and-set-how-long-theyre-visiblefor/#!xCjrp>, (May 7, 2012), 1-5.
Shein, Esther, “Ephemeral Data”, Communications of the ACM, vol. 56, no. 9, (Sep. 2013), 20-22.
Trice, Andrew, “My Favorite New Feature: Multi-Clip Sync in Premiere Pro CC”, [Online]. Retrieved from the Internet: <http://www.tricedesigns.com/2013/06/18/my-favorite-new-feature-multi-cam-synch-in-premiere-pro-cc/>, (Jun. 18, 2013), 5 pgs.
Tripathi, Rohit, “Watermark Images in PHP and Save File on Server”, [Online]. Retrieved from the Internet: <http://code.rohitink.com/2012/12/28/watermark-images-in-php-and-save-file-on-server/>, (Dec. 28, 2012), 4 pgs.
“U.S. Appl. No. 14/494,226, Final Office Action dated Jun. 1, 2018”, 33 pgs.
Continuations (3)
Parent         Date        Country    Child
14841987       Sep 2015    US         15837935 (US)
14682259       Apr 2015    US         14841987 (US)
14539391       Nov 2014    US         14682259 (US)