Asynchronous messaging tags

Information

  • Patent Grant
  • 8959165
  • Patent Number
    8,959,165
  • Date Filed
    Monday, September 10, 2012
  • Date Issued
    Tuesday, February 17, 2015
Abstract
A method includes receiving a message from a user device, determining whether the message includes a tag, identifying at least one interaction the user device performed with an application responsive to determining that the message includes the tag, calculating a difference between a time the message was received and a time associated with the at least one identified interaction, determining whether the difference between the time the message was received and the time associated with the at least one identified interaction is within a threshold value, and associating the tag with the application associated with the at least one interaction responsive to determining that the difference between the time the message was received and the time associated with the at least one identified interaction is within the threshold value.
Description
BACKGROUND

The present invention relates to asynchronous messaging, and more specifically, to tags associated with asynchronous messaging.


Though the functions of available mobile telephone devices continue to advance, there are significant numbers of users who utilize mobile telephone devices with limited functions or features. For example, millions of mobile telephone device users use simple mobile telephone devices with voice and texting/short message service (SMS), but with limited alternative communications features.


BRIEF SUMMARY

According to one embodiment of the present invention, a method includes receiving a message from a user device, determining whether the message includes a tag, identifying at least one interaction the user device performed with an application responsive to determining that the message includes the tag, calculating a difference between a time the message was received and a time associated with the at least one identified interaction, determining whether the difference between the time the message was received and the time associated with the at least one identified interaction is within a threshold value, and associating the tag with the application associated with the at least one interaction responsive to determining that the difference between the time the message was received and the time associated with the at least one identified interaction is within the threshold value.


According to another embodiment of the present invention, a method includes receiving a message from a user device, determining whether the message includes a tag, identifying at least one interaction the user device performed with an application responsive to determining that the message includes the tag, calculating a distance between a location the message was sent from and a location associated with the at least one identified interaction, determining whether the distance between the location the message was sent from and the location associated with the at least one identified interaction is within a threshold value, and associating the tag with the application associated with the at least one interaction responsive to determining that the distance between the location the message was sent from and the location associated with the at least one identified interaction is within the threshold value.


Additional features and advantages are realized through the techniques of the present invention. Other embodiments and aspects of the invention are described in detail herein and are considered a part of the claimed invention. For a better understanding of the invention with the advantages and the features, refer to the description and to the drawings.





BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS

The subject matter which is regarded as the invention is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other features and advantages of the invention are apparent from the following detailed description taken in conjunction with the accompanying drawings in which:



FIG. 1 illustrates an exemplary embodiment of a system.



FIG. 2 illustrates a block diagram of an exemplary method for processing tags.



FIG. 3 illustrates an application usage table.



FIG. 4 illustrates an application table.



FIG. 5 illustrates a tag table.



FIG. 6 illustrates an application phone number table.



FIG. 7 illustrates a block diagram of another exemplary method for processing tags.





DETAILED DESCRIPTION

In many mobile telephone service areas, some users may utilize voice features and texting or SMS features, but may not use other data services due to device or network limitations, or the cost of wireless data transmission over the networks. The illustrated embodiments described below allow a user to interact with a server in a session to perform a variety of tasks using text messaging or voice messaging features. The interaction is asynchronous because the session includes messages sent from a user device to a server or from a server to a user device without a constant connection (i.e., the communicative connection between the server and the device is defined by individual discrete messages sent between the server and the device). Though the illustrated embodiments describe texting or SMS services, one of ordinary skill in the art would understand that any similar type of messaging service, including e-mail, instant messaging, or voice messaging (which may, in some embodiments, be converted to textual messages or processed with voice recognition methods), may be used in a similar manner as described below.


Tagging is a term used to describe assigning metadata to a data object, session, content or service. Tags often include, for example, a word or phrase that is associated with data to describe the data. Thus, for example, tags such as “apples,” “produce,” or “fruit” may be associated with a session that is associated with a recipe for a salad. In practice, the tags may be used to search for data or identify data having particular tags. Often, tags may be entered and retrieved by users of a tagging system.


In this regard, FIG. 1 illustrates an exemplary embodiment of a system 100. The system 100 includes a server or processor 102 that is communicatively connected to a memory or database 104, a display device 106, and an input device 108. The server 102 is communicatively connected to a user device 110, which may include, for example, a mobile phone or other type of user device, via a communications network 101 that may include, for example, any suitable communications network that is capable of transmitting messages. The network 101 may also include conversion or gateway devices that are operative to convert messages and data into formats that may be processed by the user device 110 and the server 102. Though the illustrated embodiment includes one user device 110, the system 100 may include any number of user devices 110. The server 102 is operative to receive messages sent from the user device 110 that may be addressed to one or more phone numbers associated with the server 102. For example, the user device may send a message to the server 102 by addressing the message to a particular phone number. The server 102 receives the message, which includes the phone number of the sender, the phone number that the message was sent to, and the content of the message. The message may also include the time the message was sent, the location of the user device when the message was sent, or other information associated with the user device or message content. Though the illustrated embodiment includes the use of SMS messages and associated phone numbers, one of ordinary skill in the art would understand that a similar scheme may be implemented using other types of asynchronous messaging such as, for example, email or other messaging formats.
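
The patent does not specify a concrete data format for these messages; purely as an illustrative sketch, the fields the server 102 extracts from an incoming message might be modeled as follows (the structure, field names, and sample phone numbers are hypothetical):

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class InboundMessage:
    """Illustrative fields the server 102 might extract from an incoming SMS."""
    sender_number: str        # phone number of the sending user device 110
    destination_number: str   # number the message was addressed to
    body: str                 # text content of the message
    sent_at: Optional[datetime] = None   # time the message was sent, if available
    sent_from_zip: Optional[str] = None  # device location when sent, e.g., a zip code

# Example: a tag message addressed to a number associated with tagging
msg = InboundMessage("555-0101", "555-0199", "Tag: Market",
                     datetime(2011, 1, 1, 14, 5), "30040")
```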



FIG. 2 illustrates a block diagram of an exemplary method for processing tags in an asynchronous data session. FIGS. 3-5 illustrate exemplary embodiments of tables that may be used in the illustrated method. The tables 300, 400, and 500 may be stored in the database 104 and maintained by the server 102. Referring to FIG. 2, in block 202 an asynchronous data session is conducted between the user device 110 (of FIG. 1) and the server 102. The session may include, for example, a series of messages sent between the user device 110 and the server 102. In the illustrated embodiment, the sessions include the use of a recipe application (A1) and a coupon application (A2). In this regard, in an exemplary session, a user (U1) looks up a recipe for pancakes from the recipe application by sending a message to the server 102 from the user device 110. The server receives the message and sends a message having a recipe for pancakes to the user. Later, the user interacts with the coupon application to retrieve a coupon for an item the user is purchasing. The coupon application sends a message via the server 102 to the user device 110 that includes the relevant coupon information. During the sessions described above, the tables in FIGS. 3 and 4 are populated.


Referring to FIG. 4, table 400 includes an application table having an application ID field 402 that includes unique identifiers of applications and an application description field 404 that includes descriptions of the applications associated with the application identifiers. Referring to FIG. 3, table 300 includes an application usage table having an application ID field 302 that includes the unique identifier of an application, a user ID field 304 that includes a unique identifier of the user and/or user device 110, a last user interaction field 306 that includes a time stamp of the last interaction the user had with the associated application, and a user location field 308 that may include a location of the user or user device 110 during the associated interaction. Referring back to FIG. 2, in block 202, as the session proceeds, the table 300 is populated with the user interactions; for example, the interaction described above with the recipe application is entered into the table 300 as entry 301, while the interaction described above with the coupon application is entered into the table 300 as entry 303. The entries in the table 300 may include a user location that may be, for example, based on a global positioning system (GPS) or other type of location scheme used by the user device 110 to locate a position of the device. Thus, the messages processed by the server 102 may include location data that identifies a location where the user device 110 sent a message. In the illustrated embodiment, the table 300 includes zip codes; however, other location data including, for example, GPS coordinates or other similar location data may also be used.
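
As an illustration only, tables 300 and 400 might be held in the database 104 along the lines of the following sketch; the SQL table and column names are invented here, and the sample rows mirror the recipe (A1) and coupon (A2) interactions described above:

```python
import sqlite3

# In-memory stand-in for database 104; schema names are hypothetical.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE application (            -- table 400
    application_id TEXT PRIMARY KEY,  -- field 402
    description    TEXT               -- field 404
);
CREATE TABLE application_usage (      -- table 300
    application_id   TEXT,            -- field 302
    user_id          TEXT,            -- field 304
    last_interaction TEXT,            -- field 306: time stamp of last interaction
    user_location    TEXT             -- field 308: e.g., a zip code
);
""")
db.execute("INSERT INTO application VALUES ('A1', 'Recipe application')")
db.execute("INSERT INTO application VALUES ('A2', 'Coupon application')")
db.execute("INSERT INTO application_usage VALUES ('A1', 'U1', '2011-01-01 13:45', '30040')")  # entry 301
db.execute("INSERT INTO application_usage VALUES ('A2', 'U1', '2011-01-01 13:50', '30040')")  # entry 303
```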


In block 204, the server 102 receives a message from the user device 110. In block 206, the server determines whether the message includes a tag. The server 102 may make the determination by, for example, determining whether the message includes an indicator such as a word or phrase (e.g., “Tag:”). Alternatively, a phone number may be associated with tagging, such that the user enters tags into a message and sends the message to the phone number associated with tagging. Other alternatives are possible, such as, for example, text analysis of the message to extract words or phrases. Tags may be part of reviews or comments added during or after an activity. The server 102 receives the message addressed to the phone number associated with tagging and thus identifies the data in the message as being tags. If the message includes a tag, a tagging service that may be implemented by the server 102 determines whether the message meets time and/or location criteria. For example, the time that the tag message was sent may be determined from header or other information in the message. The location that the tag message was sent from may be determined from software installed on the phone (e.g., a GPS receiver) or from header information or other information in the message. The server 102 may use the message destination (phone number) from the SMS header to determine the application ID in the application phone number table 600. FIG. 6 illustrates an application phone number table 600 that includes a phone number field 602 and an application ID field 604, and is populated with entries that associate a phone number with an application ID.
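
A minimal sketch of the tag-detection step described above, assuming a hypothetical phone number dedicated to the tagging service and the “Tag:” indicator; the constants are illustrative, not taken from the patent:

```python
TAG_PREFIX = "tag:"            # indicator phrase, per the "Tag:" example above
TAGGING_NUMBER = "555-0199"    # hypothetical number dedicated to the tagging service

def extract_tag(body: str, destination_number: str):
    """Return the tag text if the message looks like a tag message, else None."""
    text = body.strip()
    if destination_number == TAGGING_NUMBER:
        return text                          # whole body is treated as tag content
    if text.lower().startswith(TAG_PREFIX):  # indicator word/phrase variant
        return text[len(TAG_PREFIX):].strip()
    return None

print(extract_tag("Tag: Market", "555-0110"))     # -> "Market"
print(extract_tag("pancake recipe", "555-0110"))  # -> None (no tag; normal session message)
```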


The server 102 uses the user ID to determine whether table 300 (of FIG. 3) includes any application usage entries associated with the user ID. If there are application usage entries associated with the user ID (for example, A1 and A2 are associated with user ID U1 in table 300), then the server 102 may calculate, for each retrieved application usage entry, a difference between the time the tag message was sent from the user device and the last user interaction time for the application ID retrieved from table 300, and determine whether the difference is within a time threshold (e.g., 30 minutes). Alternate embodiments may include several methods for setting the time threshold. For example, one method may determine the likely maximum time for users to tag after starting an activity (e.g., 90% of users complete shopping and tag within 30 minutes). The location of the user device that the tag message was sent from may be used, alternatively or in conjunction with the time the message was received, to determine whether the tag message meets the location criteria. For example, the server 102 may calculate a distance between the location the tag message was sent from and the location in the table 300 and compare the distance with a threshold distance (e.g., in a same zip code). Alternate embodiments may include a variety of ways of setting the location threshold. For example, the likely maximum distance across which applications for an activity might be used may be determined (e.g., 90% of users are likely to tag within the same zip code as the store). Thus, the user ID, time, and/or location associated with the tag message and relevant time and/or distance thresholds may be used to identify applications (i.e., application IDs) in the table 300. In block 208, the identified application(s) are associated with the tag received in the tag message. In the illustrated example, the user U1 sends a tag message to the server 102 that includes the tag “Market.” If the tag message is sent from U1 at 1405 on 1-1-2011 from the location 30040, the tag message would meet the criteria (i.e., it is within an exemplary thirty-minute time threshold and was sent from the same zip code, 30040). In block 210, the applications A1 and A2 are identified and associated with the tag “Market,” and the associated tag is saved. FIG. 5 illustrates a table 500 that includes an application ID field 502 and a tag field 504, which is populated in block 210 (of FIG. 2).
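
The time and location checks described above might be sketched as follows, using the exemplary thirty-minute and same-zip-code thresholds and the U1/A1/A2 values from the example; the in-memory data layout is invented for the illustration:

```python
from datetime import datetime, timedelta

TIME_THRESHOLD = timedelta(minutes=30)   # exemplary thirty-minute threshold

# Illustrative table 300 entries for user U1
usage_entries = [
    {"application_id": "A1", "user_id": "U1",
     "last_interaction": datetime(2011, 1, 1, 13, 45), "user_location": "30040"},
    {"application_id": "A2", "user_id": "U1",
     "last_interaction": datetime(2011, 1, 1, 13, 50), "user_location": "30040"},
]

tag_table = []   # stand-in for table 500: (application_id, tag) pairs

def associate_tag(user_id, tag, sent_at, sent_from_zip):
    """Associate the tag with each application whose last interaction meets the criteria."""
    for entry in usage_entries:
        if entry["user_id"] != user_id:
            continue
        time_ok = abs(sent_at - entry["last_interaction"]) <= TIME_THRESHOLD
        location_ok = sent_from_zip == entry["user_location"]  # same-zip-code threshold
        if time_ok and location_ok:   # the patent allows time and/or location criteria
            tag_table.append((entry["application_id"], tag))

associate_tag("U1", "Market", datetime(2011, 1, 1, 14, 5), "30040")
print(tag_table)   # [('A1', 'Market'), ('A2', 'Market')]
```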


If the message does not contain a tag, then the user and application data are tracked in table 300. As above, the time and/or location are determined and are then set into the last user interaction field 306 and user location field 308 of the corresponding entry. Alternatively, if this is the first time a message was sent to this application from this user (i.e., the user is subscribing), a new row is added.
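
A sketch of this tracking step (update an existing row in table 300, or add a new one on first contact), with an invented in-memory layout standing in for the database table:

```python
from datetime import datetime

usage_entries = []   # in-memory stand-in for table 300

def record_interaction(application_id, user_id, when, location):
    """Update the user's row for this application, or add one on first contact (subscription)."""
    for entry in usage_entries:
        if entry["application_id"] == application_id and entry["user_id"] == user_id:
            entry["last_interaction"] = when       # field 306
            entry["user_location"] = location      # field 308
            return
    usage_entries.append({"application_id": application_id, "user_id": user_id,
                          "last_interaction": when, "user_location": location})

record_interaction("A1", "U1", datetime(2011, 1, 1, 13, 45), "30040")
record_interaction("A1", "U1", datetime(2011, 1, 1, 14, 30), "30041")  # updates the row, no new entry
```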



FIG. 7 illustrates a block diagram of an exemplary method for retrieving tags performed by the system 100. In this regard, in block 702, the server 102 receives a message from a user device 110. The message may be received from any user device 110. In block 704, the server 102 determines whether the message includes a request for data that includes a tag. For example, the user device of a user (U2) may send a message to the server 102 that includes a request for data or applications associated with the tag “Market.” If the message does not include a tag, the message may be processed for other relevant session activities in block 706. If the message does include a request for data that includes a tag, the server 102 may use the table 500 (of FIG. 5) to identify the application ID(s) associated with the tag. In the illustrated embodiment, the application IDs A1 and A2 are associated with the tag “Market.” In cases where user tags do not match stored tags, various text processing techniques may be used to determine a degree of match. For example, “Market” may match the tag “Markets” or “Supermarket.” In block 710, the server 102 sends a message associated with the applications A1 and A2 to the user device 110 of user U2. In the illustrated embodiment, the message may include the tagged recipe for pancakes and the coupon tagged by the user U1.
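
A sketch of the retrieval step is shown below; the patent does not name a particular text processing technique for approximate matches, so this example uses a simple substring check plus a difflib similarity ratio purely as a stand-in:

```python
import difflib

# Illustrative table 500 contents
tag_table = [("A1", "Market"), ("A2", "Market"), ("A3", "Supermarkets")]

def applications_for_tag(requested_tag, cutoff=0.6):
    """Return application IDs whose stored tags exactly or approximately match the request."""
    requested = requested_tag.lower()
    matches = set()
    for app_id, tag in tag_table:
        stored = tag.lower()
        exact = stored == requested
        approx = (requested in stored or stored in requested or
                  difflib.SequenceMatcher(None, requested, stored).ratio() >= cutoff)
        if exact or approx:
            matches.add(app_id)
    return sorted(matches)

print(applications_for_tag("Market"))   # ['A1', 'A2', 'A3']
```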


The system also allows the user to request the tags associated with a given application (the ‘tag cloud’). It does this by processing a message containing this request (e.g., “tags”) sent to the application's phone number. The server 102 then redirects this request to the tagging service. The server 102 uses table 500 to retrieve the tags associated with the application and returns them to the requesting user's phone number. The tags may be ordered alphabetically. Alternatively, tags may be ordered by popularity, recency, and/or proximity.
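
As an illustration, a tag-cloud request with alphabetical and popularity ordering might look like the following sketch; the stored tags are invented sample values:

```python
from collections import Counter

# Illustrative table 500 contents, with a repeated tag to show popularity ordering
tag_table = [("A1", "Market"), ("A1", "Breakfast"), ("A1", "Market"), ("A1", "Pancakes")]

def tag_cloud(application_id, order="alphabetical"):
    """Return the tags associated with an application, ordered as requested."""
    counts = Counter(tag for app, tag in tag_table if app == application_id)
    if order == "popularity":
        return [tag for tag, _ in counts.most_common()]
    return sorted(counts)   # default: alphabetical

print(tag_cloud("A1"))                        # ['Breakfast', 'Market', 'Pancakes']
print(tag_cloud("A1", order="popularity"))    # ['Market', 'Breakfast', 'Pancakes']
```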


The technical effects and benefits of the methods and systems described above allow a user to associate tags with sessions and applications in an asynchronous messaging system. The system also allows a user to retrieve data or initiate sessions using the asynchronous messaging system.


As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.


Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.


A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.


Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.


Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).


Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.


These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks. The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.


The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.


The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, element components, and/or groups thereof.


The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present invention has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the invention. The embodiment was chosen and described in order to best explain the principles of the invention and the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.


The flow diagrams depicted herein are just one example. There may be many variations to this diagram or the steps (or operations) described therein without departing from the spirit of the invention. For instance, the steps may be performed in a differing order or steps may be added, deleted or modified. All of these variations are considered a part of the claimed invention.


While the preferred embodiment of the invention has been described, it will be understood that those skilled in the art, both now and in the future, may make various improvements and enhancements which fall within the scope of the claims which follow. These claims should be construed to maintain the proper protection for the invention first described.

Claims
  • 1. A method comprising: receiving a message from a user device at a server; determining that the message includes a tag; identifying at the server at least one prior interaction the user device performed with an application, the at least one prior interaction occurring before the message is received at the server; calculating a difference between a time the message was received and a time associated with the identified at least one prior interaction with the application; determining whether the difference between the time the message was received and the time associated with the identified at least one prior interaction with the application is within a threshold value; associating the tag with the application associated with the identified at least one prior interaction responsive to determining that the difference between the time the message was received and the time associated with the identified at least one prior interaction is within the threshold value; identifying a second interaction the user device performed with a second application responsive to determining that the message includes the tag; calculating a distance between a location the message was sent from and a second location associated with the second identified interaction, the second location being different than the first; determining whether the distance between the location the message was sent from and the second location is within a threshold value; and associating the tag with the second application associated with the second interaction responsive to determining that the distance between the location the message was sent from and the second location is within the threshold value.
  • 2. The method of claim 1, wherein the method further includes: calculating a difference between a time the message was received and a time associated with the second identified interaction; determining whether the difference between the time the message was received and the time associated with the second identified interaction is within a threshold value; and associating the tag with the second application associated with the second interaction responsive to determining that the difference between the time the message was received and the time associated with the second identified interaction is within the threshold value.
  • 3. The method of claim 1, wherein the tag includes a textual word entry.
  • 4. The method of claim 1, wherein the method further includes: receiving a message from at least one user device; determining whether the message includes a request for data associated with the tag; identifying the application associated with the tag responsive to determining that the message includes a request for data associated with the tag; and sending service data associated with the application to the user device.
  • 5. The method of claim 1, wherein the method further includes: receiving a message from at least one user device; determining whether the message includes a request for data associated with the tag; identifying the applications associated with the tag responsive to determining that the message includes a request for data associated with the tag; and sending service data associated with the applications to the user device.
CROSS-REFERENCE TO RELATED APPLICATIONS

This is a continuation application of and claims priority from U.S. application Ser. No. 13/052,501, filed on Mar. 21, 2011, the entire contents of which are incorporated herein by reference.

Related Publications (1)
Number Date Country
20130005366 A1 Jan 2013 US
Continuations (1)
Number Date Country
Parent 13052501 Mar 2011 US
Child 13608078 US