Augmented property system of curated augmented reality media elements

Information

  • Patent Grant
  • Patent Number
    10,992,836
  • Date Filed
    Monday, June 24, 2019
  • Date Issued
    Tuesday, April 27, 2021
Abstract
A device-based system for interacting around a collection of user-definable moments portraying a theme readable by others via a filter. Users may participate actively, voyeuristically, passively, or vicariously, to share propaganda, graffiti, news, parody, satire, opinions, information, commentary, entertainment, contests, and amusement tied to a location or an object.
Description
TECHNICAL FIELD

The present invention is generally related to apparatus and systems for sending messages and more particularly to a system and apparatus for constructing and sharing an augmented interactive landscape or the like.


SUMMARY OF THE INVENTION

The present disclosure teaches a device allowing users to augment the environment with media files accessible and identified by an icon tagged to a particular location or object and accessible by users proximate to the tagged location.


Individuals interact with their environment on a continual basis. Certain moments may occur where an interaction with a place, thing, person, article, thought, feeling, or the like may occur. Such moments, indeed all moments, are multidimensional and/or multisensory. Each moment, whether ignored or passed without notice, or contemplated, generally includes all of a person's senses, a time and date, a location, and a set of things involved in the moment, e.g., a sound, a song, a video, some text, a conversation, a three-dimensional object, a place, a person or group of people, a landscape, a view, or the like. Such moments produce thoughts and/or feelings. Recording such moments for sharing, and for hermeneutics (context) of a particular set of circumstances, is desirable. A moment may be a simple reminder, a multidimensional (multisensory) reminder (one which generates a mood or feeling), or a means for communicating the feelings attached to the context of experiencing a particular moment to a select group of friends, to a filtered audience, or broadcast unfiltered to complete strangers.


In one embodiment of the present invention a recording of a moment may be shared with a party that has moved near or to the location from which a particular moment (encounter) was recorded. Likewise, a time, thing, person, object, or position may recall a recorded moment to another. In operation, a HANDY 104 or the like (smart device, iPhone, iPad, tablet, Android device, Surface) may be utilized to record and read/view/experience a moment.


A person carrying a HANDY or the like while traveling, eating, walking, working, driving (as a passenger), and otherwise living may record the embodied experiences of a moment (or interaction) with a video, song, menu, image, conversation, story, interactive moment element, or the like, tagged to a particular location and time. Interesting (suitable/desired) moment files may be located via both tangible and intangible aspects of a recorded moment (experienced/shared) by an in situ user by location, tagged object, and the like. Additionally, the context of a recorded moment may be searchable by time, location, type, mood (humorous, informative, poignant, opinion, historical, idiohistoric, and others) and filtered by an in situ user (or by remote users in special embodiments of the present invention).


In a presently preferred embodiment, the invention may work with and employ virtual reality standards as they develop and are deployed, such that objects/things and the like may be paired with a tagged location (message).





BRIEF DESCRIPTION OF THE DRAWINGS

The numerous objects and advantages of the present invention may be better understood by those skilled in the art by reference to the accompanying figures in which:



FIG. 1 is a highly diagrammatic environmental view of the moment recorder and reader network of an embodiment of the present invention;



FIG. 2 is an environmental diagram illustrating an embodiment of a recording apparatus of the present invention;



FIG. 3 is an environmental diagram illustrating an embodiment of a reader apparatus and associated presently preferred moment selection and filtration means of an embodiment of the present invention;



FIG. 4 is a highly schematic diagram of a location determination module of an embodiment of the present invention;



FIG. 5 is an environmental diagram of an embodiment of an example of a moment location tagging aspect of an embodiment of the present invention;



FIG. 6 is an environmental diagram illustrating an embodiment of a locomotion-based embodiment of an aspect of the present invention;



FIG. 7 is an environmental diagram of various aspects of an exterior and interior utilization of an embodiment of the present invention;



FIG. 8 is an environmental diagram of an embodiment of the present invention utilized in a museum or the like;



FIG. 9 is an environmental diagram of an embodiment of the present invention utilized in a retail store or the like;



FIG. 10 is a highly schematic representation of an augmented property ownership (control) system for providing a rule of law based augmented property environment;



FIG. 11 is an augmented property purchasing flow diagram illustrating means for hypothecating, deeding, owning, obtaining, and divesting augmented property according to a rule of law based system;



FIG. 12 is an augmented property auction flow diagram illustrating a means of monetizing an embodiment of the present disclosure;



FIG. 13 is an environmental diagram of an augmented estate geo-fencing system of an embodiment of the present disclosure;



FIG. 14 is an environmental diagram illustrating an embodiment of the present disclosure capable of tagging moment files to a personal object (HANDY or the like) periodically moving with a user;



FIG. 15 is a highly diagrammatic illustration of a multidimensional moment file reader/recorder system capable of operation in real, virtual, and augmented states, where moment files may be categorized, accessed, and appended to real, augmented, and virtual objects;



FIG. 16 is an environmental diagram illustrating a moment file based game for play on an unbounded or bounded augmented playing area based upon at least the real world and its real estate, the chattels distributed thereon, and a virtual space with or without defined boundaries; and



FIG. 17 is an environmental diagram of an awards system based upon a user characteristic such as participation as an embodiment of the present disclosure.





DETAILED DESCRIPTION OF THE INVENTION

Reference will now be made in detail to exemplary embodiments of the disclosure, examples of which are illustrated in the accompanying drawings.


The instant disclosure describes an apparatus, method, and system for recording moments 10 via a moment system 100. The moment system 100 includes a plurality of moment recorders 200 for recording moment files 10 to a server 108 (or the like). Each moment file 10 may include media 212 tagged with a time 204 and location 202. An embodiment may also include a locomotion source 208 and a theme 210.
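By way of a non-limiting illustrative sketch (not part of the original disclosure), a moment file 10 of this kind may be modeled as a record pairing media 212 with a time 204 and a location 202, plus optional locomotion 208 and theme 210 fields; all class, field, and value names below are hypothetical:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class MomentFile:
    """Hypothetical moment file 10: media 212 tagged with a time 204
    and a location 202, plus optional locomotion 208 and theme 210."""
    latitude: float                              # location 202 (WGS 84)
    longitude: float
    recorded_at: datetime                        # time 204
    media: list = field(default_factory=list)    # media 212: photos, audio, text
    locomotion: Optional[str] = None             # locomotion source 208
    theme: Optional[str] = None                  # theme/context 210

# Example: a scenic photo moment recorded while walking (ambulating 262).
moment = MomentFile(
    latitude=44.4280, longitude=-110.5885,
    recorded_at=datetime(2019, 6, 24, tzinfo=timezone.utc),
    media=[{"type": "photo", "uri": "img_0001.jpg"}],
    locomotion="ambulating", theme="scenic",
)
```

A server-side store would then index such records by location and attributes for the trait-based retrieval the disclosure describes.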


Moment files are preferably associated with an object 122 or location 202. Users 12 may tag objects 122 and locations 202 to leave media 212 and the like for other users 12. The present invention allows users to filter, follow, share, inform, opine, and exchange ideas and moments 10 interesting to themselves and others.


Turning first to FIG. 1, a plurality of handies 104 (or the like) may be networked in an embodiment of the present invention for recording 200 and reading 300 moments 10 by subscribed users 12. Moments 10 are recorded 200 in a file 102 (on a server 108 or the like) tagged to a location 202 (object 122, place 206 or the like). Each moment 10 is created via a HANDY 104 or the like by directing the HANDY 104 to a place 206 (object 122, location 202) to record the coordinates and time 204. A user 12 may then associate the moment 10 with media 212. Moments 10 may additionally include tokens, games, instructions, memories, memorabilia, advertisements, and the like.



FIG. 2 illustrates an embodiment of the system 100 of a moment recording system and apparatus 200 of the present invention. When activated, the moment recorder 200 records a location 206, or a series of locations seriatim, for a moment 10 (or a series of moments) in, for example, a geographic coordinate system geodetic datum (WGS 84 or the like). The moment recorder 200 also records the date and time 204 for each location 206 in a moment file 10. Associated with the moment file 10 are additional classes of information (206, 210, 212, & 208) for providing multiple-dimensional-information 200 tagged and associated with and about a moment 10 (FIG. 15). For example, where the moment 10 was recorded while traversing a location in some means of transportation 208 such as a ship 252, airplane 254, automobile 256, public transportation 258, bicycle 260, or while ambulating 262, the method of transport is preferably associated with the moment 10. Likewise, where the moment takes place in an area 206, e.g., a national park 230, on a road 604 or sidewalk (trail 602), a campground 232, building 234, house 236, museum 238, school 240, restaurant 242, scenic area 244, city park 246, zoo 248, store 250, or the like, such information will be recorded 200 in the moment file 10. In a preferred embodiment of the recorder 200 media may also be associated (tagged) to a moment 10. For example, a picture 214, a sound or audio recording 216, a 360° video 218 or video 220, a text 222 or an image, a screen shot, a calendar entry, reminder 224, or the like. Also preferably associated with the moment 10 is context 210, or mood, or the like. For example, an embodiment may also record as part of a moment 10 a diary entry 302, a history 304, a feeling or mood 306, information 308, an opinion 310, or poignant anecdotes 312 or the like.



FIG. 3 illustrates a presently preferred method and apparatus for reading a tagged moment 10 (from a location 206 or the like). A HANDY 104 (camera) may be directed to, near, or toward an object 122 (e.g., a table lamp). A user 12 may then use the present invention 100 to tag the object and add content (206, 208, 210, 212) to be written with and associated with the object (and its location and/or time) to a moment file 10. The moment file 10 is, in a presently preferred embodiment, written to a server 108, via a network connection 110 or the like (the file may be restricted to a particular user 12 or user group). The moment file 10 may be stored and searched by an in situ user (and in some embodiments also a remote user) location and at least one of media 212, locomotion 208, location 206, and theme 210. Another user 12 with a HANDY 104 or the like may utilize a filter 106 or the like to restrict availability or reviewability of a moment file 10 in accordance with user selectable traits or preferences. Thus, a user 12 might select to have available moments 10 by location 206 and/or context. For example, a particular mood 306 or feeling 312, a type of object 122, a location 206, and/or media type 212. As a user 12 encounters an object they may point their HANDY 104 at an object 122, at a location 206, at a set of coordinates to review available (readable) moments 10.
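The trait-based filtering just described can be sketched as a simple attribute match: a reader's filter 106 keeps only those moment files whose attributes satisfy every preference the user has set. This is an illustrative sketch under assumed attribute names, not the disclosed implementation:

```python
def matches(moment: dict, prefs: dict) -> bool:
    """Illustrative filter 106: keep a moment only if every preference
    the reader sets (theme, media type, mood) matches the moment."""
    for key in ("theme", "media_type", "mood"):
        want = prefs.get(key)
        if want is not None and moment.get(key) != want:
            return False
    return True

def readable(moments, prefs):
    """Moments 10 available (readable 300) under the user's filter."""
    return [m for m in moments if matches(m, prefs)]

# Hypothetical moments near one tagged object 122:
moments = [
    {"id": "10a", "theme": "opinion", "media_type": "photo", "mood": "humorous"},
    {"id": "10b", "theme": "history", "media_type": "audio", "mood": "poignant"},
]
print([m["id"] for m in readable(moments, {"mood": "poignant"})])  # ['10b']
```

An empty preference set leaves all moments readable; each added trait narrows what the in situ user is shown.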



FIG. 3, by way of example, illustrates three moment files 10a, 10b, and 10c. Moment 10a contains a photograph and describes a mood 306 (and time 204). Moment 10b contains a reminder 224, a time 204, a feeling 312, and a mood. Moment 10c contains a text 222 (description), a sound recording (song) 216, a time 204, and a feeling 312. All of the example moments (10a-c) may be associated with a single object 122 (lamp) at a particular location (e.g., a hotel lobby or the like), each created by different users 12, at different times, and readable 300 by one or more users 12.


A server 108 may comprise an individual server, a universal server, a group server, and/or multiple servers providing connectivity to users 12 recording 200 and reading 300 via a network connection 110. The system 100 may provide users 12 access via a network connection 110 connected to a server 108 via a filter 106 (user selectable and controllable via, e.g., an application driven menu or the like) associated with a reader 300 (HANDY 104).



FIG. 4 illustrates a presently preferred network connection 110 schema for allowing recorders 200 and readers 300 of the system 100 to operatively connect with the system to record 200 and read 300 moment files 10. Preferably the system 100 may be utilized both indoors and outdoors. By way of illustration, a plurality of handies 104 may be connected to a server 108 via a cellular network 116 (when available) and have consistent and reliable location information 114 via a GNSS system or the like. When a cellular connection 116 is unavailable, WiFi or Bluetooth 118 may be utilized to provide both connectivity 110 and user 12 location information 114 (triangulation, multilateration, or the like). LiFi 120 and other indoor location and connectivity systems may also be utilized (Eddystone, iBeacon) to provide robust system 100 connectivity 110 for both recording 200 and reading 300.
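The fallback order described for FIG. 4 (GNSS when available, then WiFi/Bluetooth multilateration, then indoor beacon systems) can be sketched as a simple priority chain; the function and source names are assumptions standing in for platform-specific positioning APIs:

```python
def best_location(gnss_fix=None, wifi_fix=None, beacon_fix=None):
    """Return the highest-priority available fix, tagged with its source.
    Each fix is an assumed (lat, lon) tuple from a platform API."""
    for source, fix in (("gnss", gnss_fix),        # GNSS 114, outdoors
                        ("wifi_bt", wifi_fix),     # WiFi/Bluetooth 118
                        ("beacon", beacon_fix)):   # LiFi 120, Eddystone, iBeacon
        if fix is not None:
            return {"source": source, "lat": fix[0], "lon": fix[1]}
    return None  # no fix: recording 200 / reading 300 unavailable

# Indoors with no GNSS fix, the sketch falls back to WiFi multilateration:
loc = best_location(gnss_fix=None, wifi_fix=(40.7484, -73.9857))
print(loc["source"])  # wifi_bt
```

The same chain covers connectivity: a moment file recorded offline could be queued and uploaded once any of the links 116/118/120 returns.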



FIG. 5 illustrates a presently preferred means of utilizing an embodiment of the present invention. Users 12 may record 200 a moment 10 memorializing a heart and initials, virtually or actually carved into a tree, and may decide to capture the moment using aspects of the present invention 100. Using their handies 104, the users 12 tag the particular location (longitude and latitude) of the tree and a reference object, i.e., the heart carving. The users may then select to leave a picture 214 and a text 222 in the moment file 102 attached to the paired location-object (tree-heart). A message type may also be selected, e.g., a feeling 312 and/or diary 302. After the passage of time, another user 12 nearing the tagged location (tree) with appropriate filter settings (appropriately selected filter preferences or viewing authorization) may be informed of a nearby moment 10. The moment may be read 300 or ignored. If it is to be read (shared), an embodiment may tell the user 12 how many of their steps (and in what direction) away the recorded moment 10 resides. Upon following the instructions the tagged object and moment 10 may be read 300. This and subsequent users 12 may comment on the original and subsequent moments with a new moment 606. A different medium may be utilized, and a reply may be sent to the original recording HANDY 104.
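The "steps and direction" guidance mentioned above reduces to the standard great-circle distance and initial-bearing formulas; this sketch assumes an average step length of 0.75 m, which is an illustrative value, not a figure from the disclosure:

```python
from math import radians, degrees, sin, cos, asin, sqrt, atan2

EARTH_RADIUS_M = 6371000
STEP_M = 0.75  # assumed average step length

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance in meters between two fixes."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing (0° = north) from the user to the moment."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlon = lon2 - lon1
    x = sin(dlon) * cos(lat2)
    y = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dlon)
    return (degrees(atan2(x, y)) + 360) % 360

def steps_to(user, moment):
    """How many of the user's steps, and in what direction, a moment lies."""
    return round(distance_m(*user, *moment) / STEP_M), bearing_deg(*user, *moment)

# A moment roughly 100 m due north of the user:
steps, brg = steps_to((40.0, -105.0), (40.0009, -105.0))
```

A reader 300 would refresh this as the user walks, counting down steps toward the tagged object.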



FIG. 6 illustrates an embodiment of the system 100 for utilization while moving 208. In operation a user 12 may leave a string of moments 102k-102t along a travel way 602, 604. A user 12 in a vehicle (or walking 262) may both record and read moment files along the path. For example, a plurality of photographs 214 (album covers) and songs 216 might be left as a playlist for reading (watching/listening) by a user traveling (in a vehicle 260, 256 or the like). Member users 12 and the like may subscribe to a single recorder or various recorders for listening to and viewing the travels and travel interests of a person (recorder) they follow via their filter 106 (blogger/disc jockey). Likewise, a plurality of photographs or video snippets may be left showing scenic areas along a route during different seasons or conditions. Additionally, a recorder may record commentary or opinions as in a travelogue or the like. Others, following a particular author (travel writer/blogger), may obtain a more complete and fulfilling travel experience. Furthermore, children and the like may experience the commentary of past recorded travel moments 10 (e.g., a travelogue) of a parent (family member or friend) along a particular route. Moment archeologists of the system 100 may track, write histories, study, promote policies, predict future interest, and the like.


Turning now to FIG. 7, a moment 102a may be recorded at an outdoor table at a restaurant or café memorializing a moment via a particular medium or collection of media such that another user 12 may experience or enjoy a particular aspect saved 200 by another user 12. At an indoor table a user 12 might read 300 (or record 200) a moment 10 regarding an object such as a painting 102c. The user's 12 HANDY 104 (or the like) may provide location and connectivity via a wireless network 118. Additionally, a user 12 may opine 310 regarding a menu item 102d, or a menu, or a meal, or the like. Information 308 regarding a particular locus in quo may also be shared via an embodiment of the system 100 of the present invention. Some locations include interesting objects, such as a sculpture, thing, or the like 102h, which may warrant a comment or moment 10 of interest to other users. Outdoor venues may also include objects to be tagged with a moment 10 such as an outdoor sculpture 102i, bench 102b, hydrant 102e, sign 102g, or the like. Location data may be derived via a GNSS network 114, a wireless network 118, or the like.



FIG. 8 illustrates, by example, the utilization of an embodiment of the present invention 100 in a museum. Users 12 may leave tagged moments 10 associated with art objects 122 containing editorial, opinion, and informational media or the like. WIKIPEDIA® like articles, encyclopedia entries, and the like may be appended to or be part of a moment file 10. Likewise, content created by the system 100 may blend moment file 10 content to form moment file 10 abstracts of a particular location or thing of interest. Additionally, a professional, such as a curator, may leave moments 10 near objects 122. These professional comments (moments 10) may be commented on by other users 12 and shared within a small group or the like. In a preferred embodiment an administrator may first approve or reject moments 10 left within a geo-fenced area (around an object, within a facility) or the like. In this fashion, an authority may control the types of moments readable/recordable. Likewise, paid moments 10 may be left on or about a facility tied to a particular activity 208 or object 122. Other monetization schema may also be employed, e.g., a subscription to the recordings of a particular recorder 12. A filter for filtering all commercial moments 10 may also be available for a subscription requiring a set number of recorded moments 10 over a particular period of time (or a small pecuniary fee). Subscription revenue (in at least one embodiment) may be wholly or partially distributed to an appropriate holder 1006 in the form of reduced fees or the like. Highly desirable moment content 10 may be associated with a brief, a paid announcement, or the like.



FIG. 9 illustrates an embodiment 100 of the invention in a retail environment. A user 12 might leave a plurality of moments 10 near and associated with grocery items and the like. A user desiring to make a particular recipe or the like might follow a plurality of moments within a grocery store or the like to purchase the correct or desired items. A virtual shopping list may be created containing other suggested items necessary to fabricate a particular dish or the like. A significant other might leave shopping instructions for their partner. Likewise, coupons and discount related information might be interactively obtained by users 12 through moment files 102 of an embodiment of the invention 100. This allows vendors, product placement managers, marketing/advertising professionals, manufacturers, and storeowners to require proximity in space and/or time to obtain a desirable moment 10. For example, a store might be in need of shoppers at 8:00 AM on any given day. In order to drive traffic into a store (venue), a scavenger hunt (breadcrumbs, spoor) contest or the like might be utilized to provide discounts or prize type interests for shoppers.



FIG. 10 illustrates an augmented property map 1002 based upon real property boundaries or virtual boundaries 1004 in accordance with at least one embodiment of the present invention 100. Users 12 may identify, price, bid on, purchase, negotiate, trade, rent/lease, borrow, and the like a parcel of augmented property 1000. Additionally, an owner/holder 1006 of a parcel of augmented property 1000 may restrict use and/or prevent trespassing by users 12 and their associated moment files 10. Moments 10 may only, for example, be left, accessed/enjoyed, and/or seen (visualized by a particular user 12) as provided by the system 100 (in at least one embodiment).


In one embodiment users 12 gaining access to a particular location 202 by being physically present in the location may receive some haptic response (ping) originating from the system 100 to a user's 12 HANDY 104, or from a holder 1006 interested in separately interacting with a particular user 12 reading/recording 300/200 a moment file. A virtual property ownership system 1000 may include an augmented (virtual) map 1002 augmenting real or contrived boundaries 1004 such that an owner 1006 of augmented property may monetize system 100 users' 12 moment file 10 recording/reading 200/300. Augmented property holder 1006 identification may be designated with a holder 1006 moment file 1008 which must be accessed/played or the like in order for a user 12 to record or read a moment file 10. In one embodiment a user moving 1010 across a boundary 1004 into another holder's 1006 augmented property may be (or may not be) required to access the crossed-into holder's augmented property identification moment file 1008. A user's 12 time within an augmented estate, and/or the number of previously viewed holder-based content moment files 1008, may modify the content of a holder's moment file 1008 so as to present either full, abbreviated, or no moment file content to said user.
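Detecting the boundary crossing 1010 described above amounts to a point-in-polygon test on the parcel boundary 1004, gated by the holder's identification moment file 1008. The sketch below uses standard ray casting; the parcel coordinates and gate value are illustrative assumptions:

```python
def inside(parcel, lat, lon):
    """Ray-casting point-in-polygon test against a parcel boundary 1004.
    parcel is a list of (lat, lon) vertices."""
    hit = False
    n = len(parcel)
    for i in range(n):
        (y1, x1), (y2, x2) = parcel[i], parcel[(i + 1) % n]
        if (y1 > lat) != (y2 > lat):  # edge straddles this latitude
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                hit = not hit
    return hit

def on_enter(prev, curr, parcel, holder_gate):
    """When a moving user 1010 crosses into a holder's parcel, return the
    holder's identification moment file 1008 to be played before read/record."""
    if not inside(parcel, *prev) and inside(parcel, *curr):
        return holder_gate
    return None

# Hypothetical rectangular parcel and a user walking north across its edge:
parcel = [(40.0, -105.0), (40.0, -104.9), (40.1, -104.9), (40.1, -105.0)]
gate = on_enter((39.99, -104.95), (40.05, -104.95), parcel, "holder-1008")
print(gate)  # holder-1008
```

The same `inside` test supports the geo-fenced restrictions of FIG. 13, where recording/reading is simply refused within the fenced boundary.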



FIG. 11 illustrates a currently preferred process for transferring augmented property in accordance with the system 100 of the present invention. The purchasing process 1100 includes an augmented property 1002 divisible temporally, by user population, by clicks, acreage (square meter), onerousness of holder moment 1008 content, by value and frequency of chits or coupons provided to users, coupon downloads, user traffic, and user feedback. Holder 1006 control over augmented property may be limited to actual real property ownership, fee simple, fee tail, temporal estate, lease, or license. An agreement 1102 may be utilized to describe terms and conditions incumbent on a purchasing holder's utilization of the augmented property 1104. Augmented property deeds 1106 may be freely or restrictedly hypothecated or traded in accordance with the agreement 1102.


Turning now to FIG. 12, an auctioning system 1200 may offer prized augmented property 1202 or a plot of purchased augmented property 1210 in an auction 1204 facilitated by the system 100 in accordance with auction standards (minimum bid, absolute, reserve, or the like). Competing bidders 1206 may combine interests, divide interests, and otherwise negotiate terms in accordance with the system 100 auction system 1200 rules. Rules may be set forth in a system moment file 10 accessible to parties interested in the property 1202. Disputes may be decided via arbitration, a rating system, or the like. Funds 1208 may be distributed partially or fully to users providing moment file 10 content based upon user recorded moment file 10 ratings, views, or the like. The funds 1208 may also be distributed by the system 100 to users who access/read moment files 10 located in augmented property 1202 in the form of coupons or chits. These coupons or chits may be underwritten back to the holder 1006 by the system 100 in the form of reduced lease, rent, click, or property holder maintenance fee payments (or the like) to the system 100.



FIG. 13 illustrates a feature of an embodiment of the present invention restricting moment file 10 content recording or reading (viewing) within an augmented geo-fenced area 1302 (churches, temples, cemetery, schools, and the like). Holders 1006 may also purchase and then prevent all moment file 10 recording/reading 200/300 within the boundaries 1004 of their augmented property. Real property holders may seek condemnation (eviction) from the system 100 of an augmented property holder's 1006 interest, which is within the boundaries of the real property holder's estate.


Turning next to FIG. 14, the system 100 may allow users to tag other users with moment file 10 content. The system may allow users to restrict moment file 10 content recorded on the HANDY 104 location of another user (or the like) by group, content type, or the like. Generally, it is an object of the present invention to allow system 100 users 12 to control (restrict) moment files 10 posted about them, around them, on their property, or by a particular user or group of users, and to otherwise restrict their participation with the system 100 and its users. Such restrictions may be free to users by moment file category, other user identity, moment file content, or the like (hate speech, speech designed to hurt a user or non-user, bullying, unwanted interactions, stalking, and the like are preferably controlled via the system 100 filter 106). Other acceptable but undesirable moment file 10 content may be restricted by (1) user participation level (higher utilization, measured, e.g., by moment file quality and content), or (2) by subscription. Users 12 may also control and designate members within their group, and when and where they wish to be reminded of an available moment file 10 (do not notify [ping] while driving, at night, at work, in the theater, or the like). Users 12 may limit the radius of their interest to certain types of locations, users, geography, and the like.
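The notification restrictions just listed (do not ping while driving, at night, in the theater, plus a radius of interest) can be sketched as a per-user rule check run before any ping is sent; the context labels and default radius are illustrative assumptions:

```python
def should_ping(user, distance_to_moment_m):
    """Decide whether to notify (ping) a user 12 of an available moment 10.
    `user` carries a radius of interest and a set of suppressed contexts."""
    if distance_to_moment_m > user.get("radius_m", 500):  # assumed default radius
        return False  # moment lies outside the user's radius of interest
    current = user.get("context", set())       # e.g. {"driving"}, {"at_work"}
    blocked = user.get("do_not_ping", set())   # contexts the user suppressed
    return not (current & blocked)

# A user who suppressed pings while driving, at night, and in the theater:
user = {"radius_m": 200,
        "do_not_ping": {"driving", "night", "theater"},
        "context": {"driving"}}
print(should_ping(user, 50))  # False: suppressed while driving
```

The same check naturally extends to group membership and moment-category restrictions by adding further rule fields.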


Turning now to FIG. 15, a user interface for a multidimensional platform of the invention 100 (or the like) is illustrated. User interface icons 1500 may be displayed on a device such as a HANDY 104 (herein various HANDY 104 icons throughout the appended figures also signify fixed vehicle displays or heads-up-display (HUD) or the like), capable of operation in the system 100 as at least one or both of a moment recorder 200 and/or a moment reader 300.


Sample user interface icons 1500 for display 104 are preferably representative of moment content or subject. Users 12 may selectively sort, arrange, and categorize moments 10 (FIG. 2) they have read 300 or recorded 200. Additionally, the system 100 may provide sorting and categorization (e.g., FIG. 2 or the like) according to desired system 100 outcomes. For example, increasing usability, user participation and interest, according to a particular property holder's 1006 interest, or in accordance with some useful social goal (e.g., awareness of laws, dangers, and restrictions or the like). FIG. 15 illustrates an example sample set of moment icons 1500 for an embodiment of the present invention. For example, temporary moments 1502 may be available to be read 300 (displayed and accessible) for a particular singular period of time or for a brief period of time on a reoccurring or randomly reoccurring basis according to a user 12, system 100, or holder 1006 goal. Additionally, moments 10 may signify an opportunity to chat 1504 with another user 12, the system, or a holder 1006. Moments 10 may also be grouped into families or by category as signified by a single moment 10 user interface icon 1500. Such groupings may be established according to a particular user's 12 interests, by age, or by game playing field (serial or intersecting game board tile/space) in an augmented reality game designed and/or administered by a user 12, holder 1006, or the system 100. For example, a requirement that a user 12 follow a particular path solving or achieving certain physical, mental, advertiser, or social tasks in order to achieve a particular goal (see, for example, FIG. 16). In another embodiment a key/password 1508 accessible moment 10 category may require an accomplishment or the like to obtain access to a moment 10. In such an embodiment or the like, sound 1510 moments 10 may be identified and characterized as containing a voice message, musical recording, or the like.
Video or movie based moments 1512 (see also 218, 220, FIG. 2) and photograph based moments 1514 (see 214, FIG. 2) may also include a special interface icon 1500 moment file 10 designation. Users 12 may also design and utilize customized icons to designate their moment 10 content (an avatar or the like). Such customized icons may be available according to specified system 100 rules and/or parameters.



FIG. 15 also illustrates other categories of sample moment 10 content which may or may not be available to all users 12. For example, a user 12 may be tagged with a moment 10 icon 1500 representing a personal message 1516 relevant to other users 12 or the like. In an operating embodiment of such a system 100, a user's HANDY 104 (vehicle or the like) might include an icon signifying some aspect or reputational value of such a user 12. Such a tagged user 12 might be tagged with a moniker or representation either positive or negative. Perhaps a particular user is a poor driver or doesn't obey traffic laws and/or etiquette. In such a case they may be visible via a reader 300 of the system 100 wearing (designated by) a particularly designed icon 1500 representing a negative characteristic, e.g., litterer, speeder, thrasher or flamer, and the like; or a positive characteristic, e.g., expert, arbitrator, banker, employee, friend, redeemer, repairperson, troubleshooter, or the like. In one embodiment such a tagged user 12 could remove the tag only after demonstrating to the system 100 ameliorating conduct or the like (e.g., consistently obeying traffic rules, or system 100 verified walking of an area full of litter and then depositing the litter at a known refuse container or location). Likewise, positive monikers (tags) might be earned via ratings, moment recordings, training, and/or other system 100 designations or assignments. User location data may be required by the system 100 in order for a user to participate. Network-based, GNSS-based, handset-based, SIM-based, WiFi-based, Vehicle-to-Vehicle (V2V), Automatic Vehicle Location (AVL) (e.g., as found at local.iteris.com/cvria/ under “Applications”), or other and/or hybrid-based HANDY (vehicle) 104 location tools may be employed.


As previously described in the description of FIG. 2, opinions 310 may include a negative content 1518 moment file 10, and/or a positive content 1520 moment file 10. Users 12 of the system 100 may also receive a moment file 10 generated by another user 12, a holder 1006, or the system in a location a user is known to frequent, which awards the user 12 with a designation or chit or the like.


In other embodiments of the system 100 (FIG. 15) a user 12 may leave directions 1524 or allow a user to turn system 100 features ON/OFF by accessing (reading 300) a system menu 1526 moment file 10. A user's reputation 1530 (biography or the like) may be designated via an icon 1530 worn about a user in the augmented reality of the system 100. Some moments 10 may be time-sensitive 1528 or recorded as a reminder of an appointment, road work, weather hazard, or the like. Notes and/or instructions 1532 moment files 10 may be categorized and represented by a special icon 1500. Likewise, a user 12 may leave a love note 1534 moment file 10 for a particular user at a special place (accessible at any time or at certain times). Dashboard moment files 1536 may be dispersed geographically providing users 12 with information about new features, changes, statistics, offers, and the like. Likewise, dashboard moments 1536 may provide a moment locator (moment radar) or clues relevant to a particular user/moment, class of user/moment, or the user/moment population.


So as to provide an enhanced social experience for users, at least one embodiment may include drifting moments 10 designated by an icon 1538. Such moments may change location by time, user activity, or holder 1006 requirements, or according to a pseudo-random operation of the system 100. In other embodiments users 12 may leave moment files 10 asking questions of an unknown but more knowledgeable user, a known user, or a user with special characteristics. Such question moments 10 may be designated with a special moment icon 1542 ("Did anyone witness —————— on ——————?"). Also available in a preferred embodiment of the system 100 are "easter egg" moments 10 (treasures), designated by an icon 1544, which provide items of user interest available from a holder 1006, another user 12, or the system 100 (e.g., specifically tailored for a particular user or the like). Other embodiments may include game or puzzle moments 10 designated by an icon 1546, where reading 300 such a moment may entitle a user to puzzle or game play (relevant to the geography, place, or the like) in which success earns chits or the like.
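A pseudo-random drift operation for such moments might look like the following sketch; the per-cycle step bound and the (lat, lon) coordinate representation are assumptions for illustration:

```python
import random

def drift(location, max_step=0.001, rng=None):
    """Pseudo-randomly perturb a (lat, lon) pair for a drifting moment 10.

    max_step is a hypothetical per-cycle bound, in degrees, on how far
    a moment may wander; rng may be seeded for reproducible drift.
    """
    rng = rng or random.Random()
    lat, lon = location
    return (lat + rng.uniform(-max_step, max_step),
            lon + rng.uniform(-max_step, max_step))
```

The system 100 could invoke such a function on a timer, on user activity, or per holder 1006 requirements, re-tagging the moment to the returned location each cycle.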


Cross-platform access may be provided by special moment content 10 allowing integration with users of other platforms or groups providing entertainment, tools, skills, or items valuable for trade in the system 100 or another platform.


As designated by an X 1550 representing a thing or object of real property 1552, personal property 1554 (stationary, e.g., a fixture 1554A, or movable/portable 1554B), or virtual property 1556, the system 100 may augment any of these forms of property with a user recordable/readable moment file 10.
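The property taxonomy above, together with an augmented property filter of the kind recited in the claims, might be sketched as follows. The parcel structure and authorization rule are hypothetical:

```python
from enum import Enum, auto

class PropertyKind(Enum):
    REAL = auto()      # real property 1552
    FIXTURE = auto()   # stationary personal property 1554A
    PORTABLE = auto()  # movable/portable personal property 1554B
    VIRTUAL = auto()   # virtual property 1556

def may_record(user_id, parcel):
    # Hypothetical augmented-property filter: recording a moment on a
    # parcel is allowed when the parcel has no holder, the user is the
    # holder, or the holder has authorized the user.
    holder = parcel.get("holder")
    return (holder is None
            or user_id == holder
            or user_id in parcel.get("authorized", ()))
```

An analogous check on the reading side would restrict availability of a moment file 10 to authorized readers within a held parcel.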


Turning now to FIG. 16, a game player (user 12), a property holder 1006, the system 100, or the like may design a game 1600 with static rules, or with rules which change according to time of day, day of week, player accomplishment, detours, player misdirection, or the like. A player 12 may be required to access a series of moments 10, placed randomly or intelligently across a bounded 1602A or unbounded 1602B real (or real and virtual, or real and augmented) area, in a particular order, with or without solving a physical, mental, or social problem, characterized by recording a moment 10 or the like at a particular location within a given time period or at a particular step. A user 12 may use a game play HANDY 1614 or the like to participate in reading/recording (300/200) moment files 10 in accordance with game rules/instructions 1604 represented by a game play instruction icon 1604 moment file 10.
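Verifying that a player accessed the required moments in order and within a time limit might be sketched as below; the moment ids, visit log format, and time-limit rule are illustrative assumptions:

```python
from datetime import datetime, timedelta

def sequence_completed(required, visits, time_limit):
    """Check that a player read the required moments 10 in order and
    within time_limit.

    required: moment ids in the order the game 1600 demands.
    visits: list of (moment_id, timestamp) pairs as logged by the
        player's game play HANDY 1614.
    """
    if not visits:
        return False
    # Subsequence test: required ids must appear in order among the visits.
    it = iter(m for m, _ in visits)
    if not all(r in it for r in required):
        return False
    elapsed = visits[-1][1] - visits[0][1]
    return elapsed <= time_limit
```

Rules which change by time of day or player accomplishment could be expressed by recomputing `required` and `time_limit` before each check.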



FIG. 17 illustrates a method of the system 100 for rewarding users 12 based on how the user community rates a user's recorded content.
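One simple realization of such a rating-based reward is sketched below; the averaging rule and chit conversion rate are assumptions for illustration, and FIG. 17 itself is not reproduced here:

```python
def rating_reward(ratings, chits_per_point=1):
    # Convert community ratings of a user's recorded content into chits:
    # here, the rounded mean rating times a hypothetical conversion rate.
    if not ratings:
        return 0
    return round(sum(ratings) / len(ratings)) * chits_per_point
```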


It is believed that the present invention and many of its attendant advantages will be understood by the foregoing description, and it will be apparent that various changes may be made in the form, construction, and arrangement of the components thereof without departing from the scope and spirit of the invention or sacrificing all of its material advantages. The form hereinbefore described being merely an explanatory embodiment thereof, it is the intention of the following claims to encompass and include such changes.

Claims
  • 1. A device for recording and sharing moments with one or more readers, the device including a processor and comprising a moment recorder configured to record at least one moment file represented by at least one user selectable icon, the at least one moment file comprising at least: 1) a location; 2) an identifier corresponding to the moment recorder; 3) a time; 4) at least one recorder filter configured to restrict an availability of the at least one moment file to one or more authorized readers based on at least one filter preference selected from a group including the time, a medium associated with the moment, a subject associated with the moment, an interest associated with the moment, and a theme associated with the moment; and 5) an augmented property filter configured to restrict one or more of: (a) the ability of the moment recorder to record the at least one moment file; and (b) the availability of the at least one moment file to the one or more authorized readers; based on a virtual property ownership system wherein the location is associated with at least one augmented property.
  • 2. The device of claim 1, wherein the at least one moment file further comprises: at least one moment file content selected from a group including: a picture, a sound, an audio recording, a video, a text, an image, a calendar entry, a reminder, a context, a mood, a diary entry, a history, a feeling or mood, an information, an opinion, an anecdote, an award, a coupon, a discount, an instruction, a menu, a conversation, a price, or a chit.
  • 3. The device of claim 1, wherein the moment recorder further comprises: at least one marker capable of marking the location for a period of time to designate the location for moment file content to be prepared.
  • 4. The device of claim 3, wherein: the period of time is not more than 24 hours from a marking time corresponding to the marking of the location; and the marking of the location expires when the period of time expires.
  • 5. The device of claim 1, wherein the at least one moment file includes a coupon.
  • 6. The device of claim 1, wherein the at least one moment file includes at least one game element selected from a group including a game token, a puzzle, a game instruction, a game challenge, a game play element, and a pseudo-random game element generator.
  • 7. The device of claim 1, wherein the at least one moment file includes at least one element required to access the at least one moment file, the element selected from a group including a tool, a key, and a password.
  • 8. The device of claim 1, wherein: the augmented property filter is controlled by at least one augmented property holder.
  • 9. The device of claim 8, wherein the at least one augmented property holder is represented by the at least one user selectable icon.
  • 10. The device of claim 8, wherein the at least one augmented property holder is an administrator of a geo-fenced area including the location.
  • 11. The device of claim 8, wherein: the at least one augmented property corresponds to at least one real property; and the at least one augmented property holder is an owner of the at least one real property.
  • 12. The device of claim 8, wherein: the at least one augmented property corresponds to a real property; and the at least one augmented property holder is a purchasing holder.
  • 13. The device of claim 8, wherein the augmented property filter requires the moment recorder and the one or more authorized readers to access at least one holder moment file identifying the at least one augmented property holder.
  • 14. The device of claim 13, wherein the moment file content of the at least one holder moment file may be modified based on at least one of: a time spent within the at least one augmented property; and at least one previously accessed holder moment file.
  • 15. The device of claim 1, wherein the moment recorder includes one or more menu access moment files providing user menu access to user settings.
  • 16. The device of claim 15, wherein the moment recorder includes at least one menu access moment icon representative of the menu access moment file.
  • 17. The device of claim 1, wherein the at least one moment file further comprises: at least one moment file notification for notifying the one or more authorized readers proximate to a matched filter preference moment file.
  • 18. The device of claim 1, wherein the at least one user selectable icon is customizable.
  • 19. The device of claim 18, wherein said customizable icon is selected from a group including: an earned customizable icon, and at least one customizable icon selected from a group including: an activity, an award, and a customizable icon obtained by a payment or a series of payments.
  • 20. The device of claim 1, wherein the at least one moment file includes at least one dashboard moment file configured to selectively personalize the moment recorder.
  • 21. The device of claim 20, wherein the dashboard moment file is selected from a group including a new feature, a change, a statistic, an offer, an instruction, and a message.
  • 22. The device of claim 1, wherein the at least one moment file is selected from a group including: an ephemeral moment file configured to expire within a set period of time; a question moment file capable of asking at least one question to the one or more readers; and a moment file capable of at least one of: warning the one or more readers; awarding the one or more readers; and sharing information with the one or more readers.
  • 23. The device of claim 1, wherein the at least one moment file is a moment file radar for at least one of categorizing moment files or locating moment files not within a defined locus to the moment reader.
  • 24. The device of claim 1, wherein: the virtual property ownership system is associated with an augmented map including the at least one augmented property selected from a group including a real boundary, a virtual boundary, and an augmented estate comprising at least one plot of augmented property.
  • 25. The device of claim 1, wherein the at least one moment file includes at least one drifting moment file capable of moving from the first location to at least one additional location.
  • 26. The device of claim 1, wherein the at least one moment file includes at least one moment file notification for informing the one or more authorized readers of the moment file.
CROSS-REFERENCE TO RELATED APPLICATIONS

The instant application claims priority under 35 U.S.C. § 120 as a continuation of U.S. patent application Ser. No. 15/231,241, filed Aug. 8, 2016. The present application also claims priority under 35 U.S.C. § 119 to provisional U.S. Patent Application 62/352,433, filed on Jun. 20, 2016. Said U.S. patent application Ser. No. 15/231,241 and provisional application 62/352,433 are herein incorporated by reference in their entirety.

US Referenced Citations (525)
Number Name Date Kind
4581634 Williams Apr 1986 A
4975690 Torres Dec 1990 A
5072412 Henderson, Jr. et al. Dec 1991 A
5493692 Theimer et al. Feb 1996 A
5713073 Warsta Jan 1998 A
5754939 Herz et al. May 1998 A
5855008 Goldhaber et al. Dec 1998 A
5883639 Walton et al. Mar 1999 A
5999932 Paul Dec 1999 A
6012098 Bayeh et al. Jan 2000 A
6014090 Rosen et al. Jan 2000 A
6029141 Bezos et al. Feb 2000 A
6038295 Mattes Mar 2000 A
6049711 Ben-Yehezkel et al. Apr 2000 A
6154764 Nitta et al. Nov 2000 A
6167435 Druckenmiller et al. Dec 2000 A
6204840 Petelycky et al. Mar 2001 B1
6205432 Gabbard et al. Mar 2001 B1
6216141 Straub et al. Apr 2001 B1
6285381 Sawano et al. Sep 2001 B1
6285987 Roth et al. Sep 2001 B1
6310694 Okimoto et al. Oct 2001 B1
6317789 Rakavy et al. Nov 2001 B1
6349203 Asaoka et al. Feb 2002 B1
6446004 Cao et al. Sep 2002 B1
6449657 Stanbach, Jr. et al. Sep 2002 B2
6456852 Bar et al. Sep 2002 B2
6484196 Maurille Nov 2002 B1
6523008 Avrunin et al. Feb 2003 B1
6542749 Tanaka et al. Apr 2003 B2
6549768 Fraccaroli Apr 2003 B1
6618593 Drutman et al. Sep 2003 B1
6622174 Ukita et al. Sep 2003 B1
6658095 Yoakum et al. Dec 2003 B1
6665531 Soderbacka et al. Dec 2003 B1
6668173 Greene Dec 2003 B2
6684257 Camut et al. Jan 2004 B1
6698020 Zigmond et al. Feb 2004 B1
6700506 Winkler et al. Mar 2004 B1
6724403 Santoro et al. Apr 2004 B1
6757713 Ogilvie et al. Jun 2004 B1
6834195 Brandenberg et al. Dec 2004 B2
6836792 Chen Dec 2004 B1
6898626 Ohashi May 2005 B2
6970088 Kovach Nov 2005 B2
6980909 Root et al. Dec 2005 B2
7085571 Kalhan et al. Aug 2006 B2
7110744 Freeny, Jr. Sep 2006 B2
7124164 Chemtob Oct 2006 B1
7149893 Leonard et al. Dec 2006 B1
7173651 Knowles Feb 2007 B1
7203380 Chiu et al. Apr 2007 B2
7206568 Sudit Apr 2007 B2
7227937 Yoakum et al. Jun 2007 B1
7269426 Kokkonen et al. Sep 2007 B2
7315823 Brondrup Jan 2008 B2
7356564 Hartselle et al. Apr 2008 B2
7394345 Ehlinger et al. Jul 2008 B1
7411493 Smith Aug 2008 B2
7423580 Markhovsky et al. Sep 2008 B2
7512649 Faybishenko et al. Mar 2009 B2
7519670 Hagale et al. Apr 2009 B2
7535890 Rojas May 2009 B2
7607096 Oreizy et al. Oct 2009 B2
7639943 Kalajan Dec 2009 B1
7668537 De Vries Feb 2010 B2
7770137 Forbes et al. Aug 2010 B2
7787886 Markhovsky et al. Aug 2010 B2
7796946 Eisenbach Sep 2010 B2
7801954 Cadiz et al. Sep 2010 B2
8001204 Burtner et al. Aug 2011 B2
8082255 Carlson, Jr. et al. Dec 2011 B1
8098904 Ioffe et al. Jan 2012 B2
8099109 Altman et al. Jan 2012 B2
8131597 Hudetz et al. Mar 2012 B2
8135166 Rhoads et al. Mar 2012 B2
8136028 Loeb et al. Mar 2012 B1
8146001 Reese Mar 2012 B1
8161417 Lee Apr 2012 B1
8195203 Tseng Jun 2012 B1
8214443 Hamburg Jul 2012 B2
8276092 Narayanan et al. Sep 2012 B1
8279319 Date Oct 2012 B2
8285199 Hsu et al. Oct 2012 B2
8306922 Kunal et al. Nov 2012 B1
8312097 Siegel et al. Nov 2012 B1
8326327 Hymel et al. Dec 2012 B2
8332475 Rosen et al. Dec 2012 B2
8352546 Dollard Jan 2013 B1
8379130 Forutanpour et al. Feb 2013 B2
8385950 Wagner et al. Feb 2013 B1
8405773 Hayashi et al. Mar 2013 B2
8418067 Cheng et al. Apr 2013 B2
8471914 Sakiyama et al. Jun 2013 B2
8472935 Fujisaki Jun 2013 B1
8560612 Kilmer et al. Oct 2013 B2
8639803 Moritz et al. Jan 2014 B2
8660358 Bergboer et al. Feb 2014 B1
8660793 Ngo et al. Feb 2014 B2
8694026 Forstall et al. Apr 2014 B2
8718333 Wolf et al. May 2014 B2
8744523 Fan et al. Jun 2014 B2
8761800 Kuwahara Jun 2014 B2
8788680 Naik Jul 2014 B1
8790187 Walker et al. Jul 2014 B2
8797415 Arnold Aug 2014 B2
8798646 Wang et al. Aug 2014 B1
8856349 Jain et al. Oct 2014 B2
8909725 Sehn Dec 2014 B1
8910081 Fennel Dec 2014 B2
9015285 Ebsen et al. Apr 2015 B1
9040574 Wang et al. May 2015 B2
9104293 Kornfeld et al. Aug 2015 B1
9131342 Forstall et al. Sep 2015 B2
9152477 Campbell et al. Oct 2015 B1
9225897 Sehn Dec 2015 B1
9258373 Harris et al. Feb 2016 B2
9258459 Hartley Feb 2016 B2
9396354 Murphy et al. Jul 2016 B1
9430783 Sehn Aug 2016 B1
9459778 Hogeg et al. Oct 2016 B2
9521515 Zimerman et al. Dec 2016 B2
9537811 Allen et al. Jan 2017 B2
9584694 Ito et al. Feb 2017 B2
9626070 Cowles et al. Apr 2017 B2
9628950 Noeth et al. Apr 2017 B1
9652896 Jurgenson et al. May 2017 B1
9681265 Davis et al. Jun 2017 B1
9710554 Sandberg Jul 2017 B2
9736371 Taneichi et al. Aug 2017 B2
9736518 Houston et al. Aug 2017 B2
9754355 Chang et al. Sep 2017 B2
9756373 Houston et al. Sep 2017 B2
9792876 Xie et al. Oct 2017 B2
9823803 Tseng Nov 2017 B2
9843720 Ebsen et al. Dec 2017 B1
9852543 Hare et al. Dec 2017 B2
9854219 Sehn Dec 2017 B2
9881094 Pavlovskaia et al. Jan 2018 B2
9936333 Lau et al. Apr 2018 B2
9978125 Chang et al. May 2018 B1
9984499 Jurgenson et al. May 2018 B1
10055895 Li et al. Aug 2018 B2
10078863 Loganathan Sep 2018 B2
10083245 Jezewski Sep 2018 B1
10102423 Shaburov et al. Oct 2018 B2
10102447 Gusarov Oct 2018 B1
10108859 Suiter et al. Oct 2018 B1
10123166 Zimerman et al. Nov 2018 B2
10135949 Pavlovskaia et al. Nov 2018 B1
10157333 Wang et al. Dec 2018 B1
10203855 Al Majid et al. Feb 2019 B2
10206059 Tseng Feb 2019 B1
10219111 Chen et al. Feb 2019 B1
10223397 Sehn et al. Mar 2019 B1
10229717 Davis Mar 2019 B1
10244186 Chen et al. Mar 2019 B1
10270839 Andreou et al. Apr 2019 B2
10285001 Allen et al. May 2019 B2
10311916 Sehn Jun 2019 B2
10318574 Bonechi et al. Jun 2019 B1
10319149 Cowburn et al. Jun 2019 B1
10327096 Ahmed et al. Jun 2019 B1
10338773 Murarka et al. Jul 2019 B2
10339365 Gusarov et al. Jul 2019 B2
10349209 Noeth et al. Jul 2019 B1
10354425 Yan et al. Jul 2019 B2
10360708 Bondich et al. Jul 2019 B2
10366543 Jurgenson et al. Jul 2019 B1
10382373 Yang et al. Aug 2019 B1
10387514 Yang et al. Aug 2019 B1
10387730 Cowburn et al. Aug 2019 B1
10397469 Yan et al. Aug 2019 B1
10402650 Suiter et al. Sep 2019 B1
10423983 Shim et al. Sep 2019 B2
10430838 Andreou Oct 2019 B1
10448201 Sehn et al. Oct 2019 B1
20020047868 Miyazawa Apr 2002 A1
20020087631 Sharma Jul 2002 A1
20020097257 Miller et al. Jul 2002 A1
20020122659 McGrath et al. Sep 2002 A1
20020144154 Tomkow Oct 2002 A1
20030001846 Davis et al. Jan 2003 A1
20030016247 Lai et al. Jan 2003 A1
20030017823 Mager et al. Jan 2003 A1
20030020623 Cao et al. Jan 2003 A1
20030023874 Prokupets et al. Jan 2003 A1
20030052925 Daimon et al. Mar 2003 A1
20030126215 Udell et al. Jul 2003 A1
20030148773 Spriestersbach et al. Aug 2003 A1
20030164856 Prager et al. Sep 2003 A1
20030229607 Zellweger et al. Dec 2003 A1
20040027371 Jaeger Feb 2004 A1
20040078367 Anderson et al. Apr 2004 A1
20040111467 Willis Jun 2004 A1
20040158739 Wakai et al. Aug 2004 A1
20040189465 Capobianco et al. Sep 2004 A1
20040203959 Coombes Oct 2004 A1
20040215625 Svendsen et al. Oct 2004 A1
20040243531 Dean Dec 2004 A1
20050022211 Veselov et al. Jan 2005 A1
20050048989 Jung Mar 2005 A1
20050078804 Yomoda Apr 2005 A1
20050097176 Schatz et al. May 2005 A1
20050104976 Currans May 2005 A1
20050114783 Szeto May 2005 A1
20050119936 Buchanan et al. Jun 2005 A1
20050122405 Voss et al. Jun 2005 A1
20050193340 Amburgey et al. Sep 2005 A1
20050193345 Klassen et al. Sep 2005 A1
20050198128 Anderson et al. Sep 2005 A1
20050223066 Buchheit et al. Oct 2005 A1
20050288954 McCarthy et al. Dec 2005 A1
20060026067 Nicholas et al. Feb 2006 A1
20060107297 Toyama et al. May 2006 A1
20060114338 Rothschild Jun 2006 A1
20060119882 Harris et al. Jun 2006 A1
20060242239 Morishima et al. Oct 2006 A1
20060252438 Ansamaa et al. Nov 2006 A1
20060270419 Crowley et al. Nov 2006 A1
20060287878 Wadhwa et al. Dec 2006 A1
20070004426 Pfleging et al. Jan 2007 A1
20070032244 Counts et al. Feb 2007 A1
20070040931 Nishizawa Feb 2007 A1
20070073517 Panje Mar 2007 A1
20070073823 Cohen et al. Mar 2007 A1
20070075898 Markhovsky et al. Apr 2007 A1
20070082707 Flynt et al. Apr 2007 A1
20070136228 Petersen Jun 2007 A1
20070192128 Celestini Aug 2007 A1
20070198340 Lucovsky et al. Aug 2007 A1
20070198495 Buron et al. Aug 2007 A1
20070208751 Cowan et al. Sep 2007 A1
20070210936 Nicholson Sep 2007 A1
20070214180 Crawford Sep 2007 A1
20070214216 Carrer et al. Sep 2007 A1
20070233556 Koningstein Oct 2007 A1
20070233801 Eren et al. Oct 2007 A1
20070233859 Zhao et al. Oct 2007 A1
20070243887 Bandhole et al. Oct 2007 A1
20070244750 Grannan et al. Oct 2007 A1
20070255456 Funayama Nov 2007 A1
20070281690 Altman et al. Dec 2007 A1
20080022329 Glad Jan 2008 A1
20080025701 Ikeda Jan 2008 A1
20080032703 Krumm et al. Feb 2008 A1
20080033930 Warren Feb 2008 A1
20080043041 Hedenstroem et al. Feb 2008 A2
20080076505 Nguyen et al. Mar 2008 A1
20080092233 Tian et al. Apr 2008 A1
20080104503 Beall et al. May 2008 A1
20080109844 Baldeschwieler et al. May 2008 A1
20080120409 Sun et al. May 2008 A1
20080147730 Lee et al. Jun 2008 A1
20080148150 Mall Jun 2008 A1
20080158230 Sharma et al. Jul 2008 A1
20080168033 Ott et al. Jul 2008 A1
20080168489 Schraga Jul 2008 A1
20080207176 Brackbill et al. Aug 2008 A1
20080214210 Rasanen et al. Sep 2008 A1
20080222545 Lemay et al. Sep 2008 A1
20080256446 Yamamoto Oct 2008 A1
20080266421 Takahata et al. Oct 2008 A1
20080270938 Carlson Oct 2008 A1
20080288338 Wiseman et al. Nov 2008 A1
20080306826 Kramer et al. Dec 2008 A1
20080313329 Wang et al. Dec 2008 A1
20080313346 Kujawa et al. Dec 2008 A1
20080318616 Chipalkatti et al. Dec 2008 A1
20090006191 Arankalle et al. Jan 2009 A1
20090006565 Velusamy et al. Jan 2009 A1
20090015703 Kim et al. Jan 2009 A1
20090024956 Kobayashi Jan 2009 A1
20090030774 Rothschild et al. Jan 2009 A1
20090030999 Gatzke et al. Jan 2009 A1
20090040324 Nonaka Feb 2009 A1
20090042588 Lottin et al. Feb 2009 A1
20090058822 Chaudhri Mar 2009 A1
20090061901 Arrasvuori Mar 2009 A1
20090079846 Chou Mar 2009 A1
20090089710 Wood et al. Apr 2009 A1
20090093261 Ziskind et al. Apr 2009 A1
20090132341 Klinger et al. May 2009 A1
20090132453 Hangartner et al. May 2009 A1
20090132665 Thomsen et al. May 2009 A1
20090153492 Popp Jun 2009 A1
20090157450 Athsani et al. Jun 2009 A1
20090160970 Fredlund et al. Jun 2009 A1
20090163182 Gatti et al. Jun 2009 A1
20090192900 Collison et al. Jul 2009 A1
20090199242 Johnson et al. Aug 2009 A1
20090215469 Fisher et al. Aug 2009 A1
20090232354 Camp, Jr. et al. Sep 2009 A1
20090234815 Boerries et al. Sep 2009 A1
20090239552 Churchill et al. Sep 2009 A1
20090249244 Robinson et al. Oct 2009 A1
20090265647 Martin et al. Oct 2009 A1
20090288022 Almstrand et al. Nov 2009 A1
20090291672 Treves et al. Nov 2009 A1
20090292608 Polachek Nov 2009 A1
20090319607 Belz et al. Dec 2009 A1
20090327073 Li et al. Dec 2009 A1
20100062794 Han Mar 2010 A1
20100082427 Burgener et al. Apr 2010 A1
20100082693 Hugg et al. Apr 2010 A1
20100113065 Narayan et al. May 2010 A1
20100130233 Parker May 2010 A1
20100131880 Lee et al. May 2010 A1
20100131895 Wohlert May 2010 A1
20100153144 Miller et al. Jun 2010 A1
20100159944 Pascal et al. Jun 2010 A1
20100161658 Hamynen Jun 2010 A1
20100161831 Haas et al. Jun 2010 A1
20100162149 Sheleheda et al. Jun 2010 A1
20100185552 DeLuca et al. Jul 2010 A1
20100185665 Horn et al. Jul 2010 A1
20100198683 Aarabi Aug 2010 A1
20100198694 Muthukrishnan Aug 2010 A1
20100198828 Petersen et al. Aug 2010 A1
20100201536 Robertson et al. Aug 2010 A1
20100214436 Kim et al. Aug 2010 A1
20100223128 Dukellis et al. Sep 2010 A1
20100223343 Bosan et al. Sep 2010 A1
20100250109 Johnston et al. Sep 2010 A1
20100257196 Waters et al. Oct 2010 A1
20100259386 Holley et al. Oct 2010 A1
20100273509 Sweeney et al. Oct 2010 A1
20100306669 Della Pasqua Dec 2010 A1
20110004071 Faiola et al. Jan 2011 A1
20110010205 Richards Jan 2011 A1
20110029512 Folgner et al. Feb 2011 A1
20110040783 Uemichi et al. Feb 2011 A1
20110040804 Peirce et al. Feb 2011 A1
20110050909 Ellenby et al. Mar 2011 A1
20110050915 Wang et al. Mar 2011 A1
20110064388 Brown et al. Mar 2011 A1
20110066743 Hurley et al. Mar 2011 A1
20110083101 Sharon et al. Apr 2011 A1
20110102630 Rukes May 2011 A1
20110119133 Igelman et al. May 2011 A1
20110137881 Cheng et al. Jun 2011 A1
20110145564 Moshir et al. Jun 2011 A1
20110159890 Fortescue et al. Jun 2011 A1
20110164163 Bilbrey et al. Jul 2011 A1
20110197194 DAngelo et al. Aug 2011 A1
20110202598 Evans et al. Aug 2011 A1
20110202968 Nurmi Aug 2011 A1
20110211534 Schmidt et al. Sep 2011 A1
20110213845 Logan et al. Sep 2011 A1
20110215966 Kim et al. Sep 2011 A1
20110225048 Nair Sep 2011 A1
20110238763 Shin et al. Sep 2011 A1
20110255736 Thompson et al. Oct 2011 A1
20110273575 Lee Nov 2011 A1
20110282799 Huston Nov 2011 A1
20110283188 Farrenkopf et al. Nov 2011 A1
20110320373 Lee et al. Dec 2011 A1
20120001938 Sandberg Jan 2012 A1
20120028659 Whitney et al. Feb 2012 A1
20120036015 Sheikh Feb 2012 A1
20120059722 Rao Mar 2012 A1
20120062805 Candelore Mar 2012 A1
20120099800 Llano et al. Apr 2012 A1
20120108293 Law et al. May 2012 A1
20120110096 Smarr et al. May 2012 A1
20120113143 Adhikari et al. May 2012 A1
20120113272 Hata May 2012 A1
20120123830 Svendsen et al. May 2012 A1
20120124126 Alcazar et al. May 2012 A1
20120124176 Curtis et al. May 2012 A1
20120124458 Cruzada May 2012 A1
20120131507 Sparandara et al. May 2012 A1
20120131512 Takeuchi et al. May 2012 A1
20120143760 Abulafia et al. Jun 2012 A1
20120150978 Monaco et al. Jun 2012 A1
20120165100 Lalancette et al. Jun 2012 A1
20120166971 Sachson et al. Jun 2012 A1
20120169855 Oh Jul 2012 A1
20120173991 Roberts et al. Jul 2012 A1
20120176401 Hayward et al. Jul 2012 A1
20120184248 Speede Jul 2012 A1
20120197724 Kendall Aug 2012 A1
20120200743 Blanchflower et al. Aug 2012 A1
20120210244 de Francisco Lopez et al. Aug 2012 A1
20120212632 Mate et al. Aug 2012 A1
20120220264 Kawabata Aug 2012 A1
20120231814 Calman et al. Sep 2012 A1
20120233000 Fisher et al. Sep 2012 A1
20120236162 Imamura Sep 2012 A1
20120239761 Linner et al. Sep 2012 A1
20120250951 Chen Oct 2012 A1
20120252418 Kandekar et al. Oct 2012 A1
20120268490 Sugden Oct 2012 A1
20120278387 Garcia et al. Nov 2012 A1
20120278692 Shi Nov 2012 A1
20120299954 Wada et al. Nov 2012 A1
20120304052 Tanaka et al. Nov 2012 A1
20120304080 Wormald et al. Nov 2012 A1
20120307096 Ford et al. Dec 2012 A1
20120307112 Kunishige et al. Dec 2012 A1
20120323933 He et al. Dec 2012 A1
20130006759 Srivastava et al. Jan 2013 A1
20130036364 Johnson Feb 2013 A1
20130045753 Obermeyer et al. Feb 2013 A1
20130050260 Reitan Feb 2013 A1
20130055083 Fino Feb 2013 A1
20130057587 Leonard et al. Mar 2013 A1
20130059607 Herz et al. Mar 2013 A1
20130060690 Oskolkov et al. Mar 2013 A1
20130063369 Malhotra et al. Mar 2013 A1
20130067027 Song et al. Mar 2013 A1
20130071093 Hanks et al. Mar 2013 A1
20130080254 Thramann Mar 2013 A1
20130085790 Palmer et al. Apr 2013 A1
20130086072 Peng et al. Apr 2013 A1
20130095857 Garcia et al. Apr 2013 A1
20130128059 Kristensson May 2013 A1
20130129252 Lauper et al. May 2013 A1
20130132477 Bosworth et al. May 2013 A1
20130145286 Feng et al. Jun 2013 A1
20130159110 Rajaram et al. Jun 2013 A1
20130169822 Zhu et al. Jul 2013 A1
20130173729 Starenky et al. Jul 2013 A1
20130182133 Tanabe Jul 2013 A1
20130185131 Sinha et al. Jul 2013 A1
20130191198 Carlson et al. Jul 2013 A1
20130194301 Robbins et al. Aug 2013 A1
20130198176 Kim Aug 2013 A1
20130201182 Kuroki et al. Aug 2013 A1
20130218965 Abrol et al. Aug 2013 A1
20130218968 McEvilly et al. Aug 2013 A1
20130222323 McKenzie Aug 2013 A1
20130227476 Frey Aug 2013 A1
20130232194 Knapp et al. Sep 2013 A1
20130263031 Oshiro et al. Oct 2013 A1
20130265450 Barnes, Jr. Oct 2013 A1
20130267253 Case et al. Oct 2013 A1
20130290443 Collins et al. Oct 2013 A1
20130311255 Cummins et al. Nov 2013 A1
20130325964 Berberat Dec 2013 A1
20130339864 Uusitalo Dec 2013 A1
20130339868 Sharpe Dec 2013 A1
20130344896 Kirmse et al. Dec 2013 A1
20130346877 Borovoy et al. Dec 2013 A1
20140006129 Heath Jan 2014 A1
20140011538 Mulcahy et al. Jan 2014 A1
20140019264 Wachman et al. Jan 2014 A1
20140032682 Prado et al. Jan 2014 A1
20140045530 Gordon et al. Feb 2014 A1
20140046923 Ruble et al. Feb 2014 A1
20140047016 Rao Feb 2014 A1
20140047045 Baldwin et al. Feb 2014 A1
20140047335 Lewis et al. Feb 2014 A1
20140049652 Moon et al. Feb 2014 A1
20140052485 Shidfar Feb 2014 A1
20140052633 Gandhi Feb 2014 A1
20140057660 Wager Feb 2014 A1
20140096029 Schultz Apr 2014 A1
20140114565 Aziz et al. Apr 2014 A1
20140118483 Rapoport et al. May 2014 A1
20140122658 Haeger et al. May 2014 A1
20140122787 Shalvi et al. May 2014 A1
20140129953 Spiegel May 2014 A1
20140139519 Mit May 2014 A1
20140143143 Fasoli et al. May 2014 A1
20140149519 Redfern et al. May 2014 A1
20140155102 Cooper et al. Jun 2014 A1
20140173457 Wang et al. Jun 2014 A1
20140204117 Kinnebrew et al. Jun 2014 A1
20140189592 Benchenaa et al. Jul 2014 A1
20140207679 Cho Jul 2014 A1
20140214471 Schreiner Jul 2014 A1
20140237578 Bryant Aug 2014 A1
20140253743 Loxam Sep 2014 A1
20140258405 Perkin Sep 2014 A1
20140266703 Dailey, Jr. et al. Sep 2014 A1
20140279436 Dorsey et al. Sep 2014 A1
20140279540 Jackson Sep 2014 A1
20140280537 Pridmore et al. Sep 2014 A1
20140282096 Rubinstein et al. Sep 2014 A1
20140287779 OKeefe et al. Sep 2014 A1
20140289833 Briceno et al. Sep 2014 A1
20140304646 Rossmann Oct 2014 A1
20140324627 Haver et al. Oct 2014 A1
20140325383 Brown et al. Oct 2014 A1
20140372945 Fan et al. Dec 2014 A1
20150029180 Komatsu Jan 2015 A1
20150046278 Pei et al. Feb 2015 A1
20150087263 Branscomb Mar 2015 A1
20150088622 Ganschow Mar 2015 A1
20150116529 Wu et al. Apr 2015 A1
20150169827 LaBorde Jun 2015 A1
20150172534 Miyakawa et al. Jun 2015 A1
20150222814 Li et al. Aug 2015 A1
20150261917 Smith Sep 2015 A1
20150350136 Flynn et al. Dec 2015 A1
20150356063 Jiang Dec 2015 A1
20160014063 Hogeg et al. Jan 2016 A1
20160085773 Chang et al. Mar 2016 A1
20160085863 Allen et al. Mar 2016 A1
20160132231 Rathod May 2016 A1
20160180602 Fuchs Jun 2016 A1
20160180887 Sehn Jun 2016 A1
20160182422 Sehn Jun 2016 A1
20160182875 Sehn Jun 2016 A1
20160212538 Fullam et al. Jul 2016 A1
20160223335 Tabata Aug 2016 A1
20160277419 Allen et al. Sep 2016 A1
20160283595 Folkens et al. Sep 2016 A1
20160335289 Andrews Nov 2016 A1
20170006094 Abou Mahmoud et al. Jan 2017 A1
20170021273 Rios Jan 2017 A1
20170132267 Zhou et al. May 2017 A1
20170150037 Rathod May 2017 A1
20170185869 Dua et al. Jun 2017 A1
20170193300 Shatz et al. Jul 2017 A1
20170195554 Shatz et al. Jul 2017 A1
20170201803 Wald et al. Jul 2017 A1
20170256097 Finn et al. Sep 2017 A1
20180026925 Kennedy Jan 2018 A1
20180072833 Chang et al. Mar 2018 A1
20180116346 Hertlein May 2018 A1
20180129905 Soundararajan et al. May 2018 A1
20190244436 Stansell Aug 2019 A1
20200234502 Anderlecht Jul 2020 A1
Foreign Referenced Citations (27)
Number Date Country
2804096 Jan 2012 CA
2863124 Jul 2015 CA
2887596 Jul 2015 CA
102238275 Nov 2011 CN
102447779 May 2012 CN
103595911 Feb 2014 CN
2051480 Apr 2009 EP
2151797 Feb 2010 EP
2732383 Apr 2018 EP
2602729 Oct 2018 EP
2589024 Aug 2019 EP
2399928 Sep 2004 GB
6082005 Feb 2017 JP
19990073076 Oct 1999 KR
1996024213 Oct 1999 WO
1999063453 Oct 1999 WO
2001029642 Oct 1999 WO
2001050703 Oct 1999 WO
2006118755 Nov 2006 WO
2009043020 Apr 2009 WO
2011080385 Jul 2011 WO
2011119407 Sep 2011 WO
2013008238 Jan 2013 WO
2013045753 Apr 2013 WO
2014115136 Jul 2014 WO
2014194262 Dec 2014 WO
2016065131 Apr 2016 WO
Non-Patent Literature Citations (39)
Entry
Collins, Katie, “Leave digital gifts in physical locations with traces app”, http://www.wired.co.uk/article/traces-messaging-app, Aug. 5, 2014, 11 pages.
Jardin, Xeni, “Pip, new minimalist messaging app, promises easy update to friends”, https://boingboing.net/2014/12/04/pip-new-minimalist-messaging.- html, Dec. 4, 2014, 2 pages.
Knibbs, Kate, “Is this the first annonymous app that understands the power of secrets?”, https://www.dailydot.com/debug/yik-yak-app/, Mar. 21, 2014, 4 pages.
Lawler, Ryan, “Whisper Confirms $36M in New Funding, Adds Related Posts, Categories, and Explore Feature to App”, https://techcrunch.com/2014/05/19/whisper-v4, May 19, 2014, 2 pages.
Martellaro, John, “Spyglass for iOS: Powerful Navigational Instrument”, https://www.macobserver.com/tmo/review/spyglass_for_ios_powerful_navigati- onal_instruction, Jun. 27, 2011, 5 pages.
A Whole New Story, [Online]. Retrieved from the Internet: <https://www.snap.com/en-US/news/>, (2017), 13 pgs.
Adding a watermark to your photos, eBay, [Online]. Retrieved from the Intenet URL:http://pages.ebay.com/help/sell/pictures.html , (accessed May 24, 2017), 4 pgs.
Tripathi, Rohit, “Watermark Images in PHP and Save File on Server”, [Online]. Retrieved from the Internet: URL:http://code.rohitink.com/2012/12/28/watermark-images-in-php-and-save-file-on-served, (Dec. 28, 2012), 4 pgs.
BlogStomp, [Online]. Retrieved from the Internet: URL:http://stompsoftware.com/blogstomp , (accessed May 24, 2017), 12 pgs.
Cup Magic Starbucks Holiday Red Cups come to life with AR app, [Online]. Retrieved from the Internet: http://www.plastradius.com/work/cup-magic , (2016), 7 pgs.
Daily App: InstaPlace (iOS/Android): Give Pictures a Sense of Place, TechPP, [Online]. Retrieved from the Internet: URL;http://techpp.com/2013/02/15/instaplace-app-review , (2013), 13 pgs.
How Snaps Are Stored and Deleted, Snapchat, [Online]. Retrieved from the Internet: URL: https://web.archive.org/web/20130607042322/http://blog.snapchat.com/post/50060403002/how-snaps-are-stored-and-deleted, (May 9, 2013), 2 pgs.
Shein, Esther, “Ephemeral Data”, Communications of the ACM vol. 56 | No. 9, (Sep. 2013), 20-22.
International Application Serial No. PCT/US2014/040346, International Search Report dated Mar. 23, 2015, 2 pgs.
International Application Serial No. PCT/US2014/040346, Written Opinion dated Mar. 23, 2015, 6 pgs.
International Application Serial No. PCT/US2015/037251, International Search Report dated Sep. 29, 2015, 7 pgs.
Introducing Snapchat Stories, [Online]. Retrieved from the Internet:https://www.youtube.com/watch?v=-ie5_aaHOhE , (Oct. 1, 2013), 92 pgs.
Visit Mobile Getting Started, IVISIT, (Dec. 4, 2013), 1-16.
Macys Believe-o-Magic, [Online]. Retrieved from the Internet: https://www.youtube.com/watch?v=5p7-y5eO6X4, Nov. 27, 2011), 102 pgs.
Macy's Introduces Augmented Reality Experience in Stores across Country as Part of Its 2011 "Believe" Campaign, [Online]. Retrieved from the Internet: http://www.businesswire.com/news/home/20111102006759/en/Macy%E2%80%99s-Introduces-Augmented-Reality-Experience-Stores-Country, (Nov. 2, 2011), 6 pgs.
Starbucks Cup Magic for Valentine's Day, [Online]. Retrieved from the Internet: https://www.youtube.com/watch?v=8nvgOzjgl0w, (Feb. 6, 2012), 88 pgs.
Starbucks Cup Magic, [Online]. Retrieved from the Internet: https://www.youtube.com/watch?v=RWwQXi9RG0w, (Nov. 8, 2011), 87 pgs.
Starbucks Holiday Red Cups Come to Life, Signaling the Return of the Merriest Season, [Online]. Retrieved from the Internet: http://www.businesswire.com/news/home/20111115005744/en/2479513/Starbucks-Holiday-Red-Cups-Life-Signaling-Return, (Nov. 15, 2011), 5 pgs.
U.S. Appl. No. 14/494,226, Examiner Interview Summary dated Oct. 27, 2016, 3 pgs.
U.S. Appl. No. 14/494,226, Final Office Action dated Mar. 7, 2017, 33 pgs.
U.S. Appl. No. 14/494,226, Non Final Office Action dated Sep. 12, 2016, 32 pgs.
U.S. Appl. No. 14/494,226, Response filed Dec. 12, 2016 to Non Final Office Action dated Sep. 12, 2016, 16 pgs.
U.S. Appl. No. 14/539,391, Notice of Allowance dated Mar. 5, 2015, 16 pgs.
U.S. Appl. No. 14/682,259, Notice of Allowance dated Jul. 27, 2015, 17 pgs.
InstaPlace Photo App Tell the Whole Story, [Online]. Retrieved from the Internet: https://www.youtube.com/watch?v=uF_gFkg1hBM, (Nov. 8, 2013), 113 pgs.
Carthy, Roi, "Dear All Photo Apps: Mobli Just Won Filters", [Online]. Retrieved from the Internet: URL: https://techcrunch.com/2011/09/08/mobli-filters, (Sep. 8, 2011), 10 pgs.
Janthong, Isaranu, "Android App Review Thailand", [Online]. Retrieved from the Internet: http://www.android-free-app-review.com/2013/01/instaplace-android-google-play-store.html, (Jan. 23, 2013), 9 pgs.
Leyden, John, “This SMS will self-destruct in 40 seconds”, [Online]. Retrieved from the Internet: URL: http://www.theregister.co.uk/2005/12/12/stealthtext/, (Dec. 12, 2005), 1 pg.
Macleod, Duncan, "Macy's Believe-o-Magic App", [Online]. Retrieved from the Internet: URL: http://theinspirationroom.com/daily/2011/macys-believe-o-magic-app, (Nov. 14, 2011), 10 pgs.
Macleod, Duncan, "Starbucks Cup Magic—Let's Merry", [Online]. Retrieved from the Internet: URL: http://theinspirationroom.com/daily/2011/starbucks-cup-magic, (Nov. 12, 2011), 8 pgs.
Melanson, Mike, "This text message will self destruct in 60 seconds", readwrite.com, [Online]. Retrieved from the Internet: http://readwrite.com/2011/02/11/this-text-message-will-self-destruct-in-60-seconds, (Feb. 18, 2015).
Notopoulos, Katie, "A Guide to the New Snapchat Filters and Big Fonts", [Online]. Retrieved from the Internet: https://www.buzzfeed.com/katienotopoulos/a-guide-to-the-new-snapchat-filters-and-big-fonts?utm_term=.bkQ9qVZWe#.nv58YXpkV, (Dec. 22, 2013), 13 pgs.
Panzarino, Matthew, "Snapchat Adds Filters, a Replay Function and for Whatever Reason, Time, Temperature and Speed Overlays", [Online]. Retrieved from the Internet: https://techcrunch.com/2013/12/20/snapchat-adds-filters-new-font-and-for-some-reason-time-temperature-and-speed-overlays/, (Dec. 20, 2013), 12 pgs.
Sawers, Paul, "Snapchat for iOS Lets You Send Photos to Friends and Set How Long They're Visible for", [Online]. Retrieved from the Internet: http://thenextweb.com/apps/2012/05/07/Snapchat-for-ios-lets-you-send-photos-to-friends-and-set-how-long-theyre-visiblefor/#xCjrp, (May 7, 2012), 1-5.
Related Publications (1)
Number            Date        Country
20190312990 A1    Oct 2019    US
Provisional Applications (1)
Number      Date        Country
62352433    Jun 2016    US
Continuations (1)
Number             Date        Country
Parent 15231241    Aug 2016    US
Child 16449986                 US