Apparatus and method for alternate channel communication initiated through a common message thread

Information

  • Patent Grant
  • Patent Number
    11,463,393
  • Date Filed
    Friday, March 5, 2021
  • Date Issued
    Tuesday, October 4, 2022
Abstract
A server has a processor and a memory storing a multiple channel message thread module with instructions executed by the processor to identify when participants at client devices are actively viewing a common message thread at the same time to establish a participant viewing state. An alternate channel communication lock prompt is supplied to the client devices in response to the participant viewing state. An alternate channel communication is delivered to the client devices in response to activation of the alternate channel communication lock prompt by at least one participant.
Description
FIELD OF THE INVENTION

This invention relates generally to communications in computer networks. More particularly, this invention relates to techniques for initiating an additional communication channel from a common text message communication channel.


BACKGROUND OF THE INVENTION

Currently, when a first individual desires to initiate a telephone call or video call with a second individual, the first individual uses his or her communication equipment to initiate a ringtone at the communication equipment of the second individual. Acknowledgement of the ringtone is required to establish the presence of the second individual. The ringtone notification technique has been relied upon since the inception of telephones, well over one hundred years ago. It would be desirable to find an alternate methodology to establish the presence of an individual and to initiate conversations, without reliance upon a technique over one hundred years old that is a relic of the technical constraints of analog telephony.


SUMMARY OF THE INVENTION

A server has a processor and a memory storing a multiple channel message thread module with instructions executed by the processor to identify when participants at client devices are actively viewing a common message thread at the same time to establish a participant viewing state. An alternate channel communication lock prompt is supplied to the client devices in response to the participant viewing state. An alternate channel communication is delivered to the client devices in response to activation of the alternate channel communication lock prompt by at least one participant.





BRIEF DESCRIPTION OF THE FIGURES

The invention is more fully appreciated in connection with the following detailed description taken in conjunction with the accompanying drawings, in which:



FIG. 1 illustrates a system configured in accordance with an embodiment of the invention.



FIG. 2 illustrates processing operations associated with an embodiment of the invention.



FIG. 3 illustrates user interface attributes associated with an embodiment of the invention.



FIG. 4 illustrates user interface attributes, including an alternate channel communication indicator, associated with an embodiment of the invention.



FIG. 5 illustrates an alternate user interface utilized in accordance with an embodiment of the invention.





Like reference numerals refer to corresponding parts throughout the several views of the drawings.


DETAILED DESCRIPTION OF THE INVENTION


FIG. 1 illustrates a system 100 configured in accordance with an embodiment of the invention. The system 100 includes a set of client devices 102_1 through 102_N connected to a server 104 via a network 106, which may be any combination of wired and wireless networks.


Each client device 102 includes standard components, such as a central processing unit 110 and input/output devices 112 connected via a bus 114. The input/output devices 112 may include a touch display, keyboard, trackball and the like. A network interface circuit 116 is also connected to the bus 114 to provide connectivity to network 106. A memory 120 is also connected to the bus 114. The memory 120 stores a communication module 122, which may be a browser or an application to support communications with server 104. The client device 102 is typically a mobile device, such as a Smartphone or Tablet.


Server 104 also includes standard components, such as a central processing unit 160, a bus 162, input/output devices 164 and a network interface circuit 166. A memory 170 is also connected to the bus 162. The memory 170 stores a multiple channel message module 172, which includes executable instructions to implement operations of the invention. In short, the executable instructions identify the common presence of participants in a text-based thread. In response to such common presence, participants are afforded the opportunity to initiate an alternate communication channel, such as an audio channel or a video channel, without the prerequisite of a ringtone.
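As a concrete illustration (not part of the patent disclosure), the common-presence determination could be modeled as a small server-side registry in which each client reports when a thread gains or loses focus; the class and method names below are hypothetical.

```python
# Minimal sketch, assuming clients report thread focus changes to the server.
# ViewingRegistry and report_viewing are illustrative names, not the patent's.
from collections import defaultdict


class ViewingRegistry:
    """Tracks, per thread, which participants currently have it on screen."""

    def __init__(self):
        self._viewers = defaultdict(set)  # thread_id -> set of participant ids

    def report_viewing(self, thread_id: str, participant: str, viewing: bool) -> bool:
        """Record a focus change and return True when the common viewing state holds."""
        if viewing:
            self._viewers[thread_id].add(participant)
        else:
            self._viewers[thread_id].discard(participant)
        return self.all_viewing(thread_id)

    def all_viewing(self, thread_id: str, participants=None) -> bool:
        """True when the given participants (default: at least two) are viewing."""
        viewers = self._viewers[thread_id]
        if participants is None:
            return len(viewers) >= 2
        return set(participants) <= viewers


if __name__ == "__main__":
    registry = ViewingRegistry()
    registry.report_viewing("thread-1", "alice", True)
    print(registry.report_viewing("thread-1", "bob", True))  # True: common viewing state
```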



FIG. 2 illustrates operations associated with an embodiment of the invention. In particular, the figure illustrates operations performed by two client devices 102_1 and 102_2, as coordinated by the message server 104. Initially, the message server 104 serves a thread 200. By way of example, the thread is a text communication sequence between a first participant associated with client 102_1 and a second participant associated with client 102_2. Additional participants may be involved, but are omitted for simplicity of presentation.


The client device 102_1 collects a new text entry 202, which is routed by the message server 104 to client 102_2. The client 102_2 displays the text 206. The message server 104 determines if the participants are viewing the same thread 208. That is, the message server 104 evaluates whether each client device is actively displaying the same thread on the display of the client device. If so, the message server proactively activates an alternate channel communication 210. This can be thought of as a digital call set-up, where the telephone call is muted but active. The server also supplies each device with an alternate channel communication prompt 211. The alternate channel communication prompt is displayed 212 on client 102_1 and is displayed 214 on client 102_2. The alternate channel communication prompt signals to the participants that the message thread is being actively viewed by each participant.
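The ordering described above, where the channel is set up muted before any prompt is touched, could be sketched as follows; MediaSession, MessageServer and push_to_client are illustrative stand-ins rather than the patent's implementation.

```python
# Sketch of the proactive set-up order described for FIG. 2: when both
# participants view the same thread, the server first creates a muted alternate
# channel (an active but silent "call"), then pushes the prompt to each client.
from dataclasses import dataclass, field


@dataclass
class MediaSession:
    thread_id: str
    participants: list
    muted: bool = True  # "digital call set-up": active but muted


@dataclass
class MessageServer:
    sessions: dict = field(default_factory=dict)  # thread_id -> MediaSession

    def on_thread_viewed_by_all(self, thread_id: str, participants: list) -> None:
        """Pre-activate the channel, then supply the prompt to every client."""
        if thread_id not in self.sessions:
            # Proactive activation: the channel exists before anyone taps a prompt.
            self.sessions[thread_id] = MediaSession(thread_id, participants, muted=True)
        for participant in participants:
            self.push_to_client(participant, {"type": "alt_channel_prompt",
                                              "thread": thread_id})

    def push_to_client(self, participant: str, payload: dict) -> None:
        # Placeholder for the real transport (e.g., a push notification or socket).
        print(f"-> {participant}: {payload}")


if __name__ == "__main__":
    server = MessageServer()
    server.on_thread_viewed_by_all("thread-1", ["alice", "bob"])
```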



FIG. 3 illustrates client device 102_2 with a display 300 and keyboard 302. The display shows a thread entry 304 and an associated prompt 306 indicating common viewing of the thread. The prompt 306 may be in the form of text, an icon, an image and the like.


Returning to FIG. 2, the message server 104 determines whether the prompt is activated 216. For example, activation may be in the form of haptic contact by the participant on the prompt as displayed on a screen. If a prompt is activated, then the message server 104 delivers (e.g., un-mutes the set-up call) the alternate channel communication 218. The alternate channel communication may be presented 220 on client 102_1 and/or be presented 222 on client 102_2. The processing of FIG. 2 is advantageous because of the proactive activation of the alternate channel communication before a prompt is activated. An alternate embodiment may activate the alternate channel communication in response to prompt activation.
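A matching sketch of the activation step, again using hypothetical names: once any participant activates the prompt, the server simply un-mutes the session it pre-established, or, per the alternate embodiment, creates it on demand.

```python
# Illustrative only: "delivering" the alternate channel is represented here as
# un-muting a previously established muted session.
from dataclasses import dataclass


@dataclass
class MediaSession:
    thread_id: str
    muted: bool = True


def on_prompt_activated(sessions: dict, thread_id: str, participant: str) -> MediaSession:
    """Un-mute the pre-established channel once any participant taps the prompt."""
    session = sessions.get(thread_id)
    if session is None:
        # Alternate embodiment: no proactive set-up, so activate on demand instead.
        session = sessions[thread_id] = MediaSession(thread_id)
    session.muted = False  # the "call" was already live; it simply becomes audible
    print(f"{participant} activated the alternate channel for {thread_id}")
    return session


if __name__ == "__main__":
    sessions = {"thread-1": MediaSession("thread-1")}
    print(on_prompt_activated(sessions, "thread-1", "bob").muted)  # False
```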



FIG. 4 illustrates client 102_1 with a display 400 and keyboard 402. The display 400 has a thread entry 404, a prompt 406 and an alternate channel indicator 408. In this example, the alternate channel indicator 408 may be a speaker icon indicative of an audio channel being initiated by the participant using client device 102_2. The alternate channel indicator 408 may be a block of video initiated by the participant using client device 102_2. In the example of FIG. 4, the message thread is shown with the alternate channel indicator. Alternately, the alternate channel indicator may occupy the entire display, which would be desirable in the case of a video session.


In one embodiment, the alternate channel communication persists during haptic contact with the alternate channel communication prompt by at least one participant. The alternate channel communication may include input from each participant making haptic contact with the alternate channel communication prompt.


The alternate channel communication may be a unidirectional audio session initiated through haptic contact of a prompt by a first participant that is broadcast to the remaining participants. The alternate channel communication may be a bidirectional audio session between participants making haptic contact with the alternate channel communication prompt.


The alternate channel communication may be a unidirectional video session initiated through haptic contact of a prompt by a first participant that is broadcast to the remaining participants. The alternate channel communication may be a bidirectional video session between participants making haptic contact with the alternate channel communication prompt.


The prompt may include channel selection options, such as audio and/or video. The alternate channel communications may be coordinated through the message server using Internet Protocol packet switching techniques.
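To make the variants described above concrete, a prompt carrying channel-selection options might be modeled as a small data structure enumerating media type and direction; the field names here are assumptions for illustration only.

```python
# Illustrative enumeration of the channel variants discussed above
# (audio vs. video, unidirectional broadcast vs. bidirectional session).
from dataclasses import dataclass
from enum import Enum


class Media(Enum):
    AUDIO = "audio"
    VIDEO = "video"


class Direction(Enum):
    UNIDIRECTIONAL = "broadcast"      # one participant sends, the rest receive
    BIDIRECTIONAL = "conversation"    # every participant touching the prompt sends


@dataclass
class ChannelOffer:
    """What a prompt with channel-selection options might carry to a client."""
    thread_id: str
    media_choices: tuple = (Media.AUDIO, Media.VIDEO)
    direction: Direction = Direction.BIDIRECTIONAL


if __name__ == "__main__":
    offer = ChannelOffer("thread-1")
    print([m.value for m in offer.media_choices], offer.direction.value)
```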



FIG. 5 illustrates a client device 102_1 with an alternate user interface on display 500. In this instance, a prompt 502 for an alternate channel communication is accompanied by a lock prompt 504. The lock prompt 504 may be text or other indicia (e.g., a lock symbol) of an available lock state. Haptic engagement with the prompt 502 initiates the alternate channel communication. The user may then use a haptic gesture to engage the lock prompt 504. For example, a slide gesture from the prompt position 502 to the lock position 504 may be used. Alternately, the lock prompt 504 may be tapped. Alternately, prompt 502 may be eliminated, in which case a permanent alternate channel communication state may be invoked by haptic contact with the lock prompt 504. The alternate channel communication state may be terminated through additional haptic contact with the lock prompt 504.
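A client-side sketch of the lock behavior, with hypothetical event-handler names: pressing the prompt starts the session, the lock gesture makes it persist after contact ends, and a further lock gesture terminates it.

```python
# Minimal sketch of the FIG. 5 lock interaction; not production UI code.
class LockablePrompt:
    def __init__(self):
        self.session_active = False
        self.locked = False

    def on_prompt_press(self):
        self.session_active = True          # haptic engagement starts the channel

    def on_prompt_release(self):
        if not self.locked:
            self.session_active = False     # unlocked sessions end when contact ends

    def on_lock_gesture(self):
        """Slide-to-lock or a tap on the lock prompt; a further gesture terminates."""
        if self.session_active and not self.locked:
            self.locked = True
        elif self.locked:
            self.locked = False
            self.session_active = False


if __name__ == "__main__":
    ui = LockablePrompt()
    ui.on_prompt_press()
    ui.on_lock_gesture()
    ui.on_prompt_release()
    print(ui.session_active)  # True: the locked session persists without contact
    ui.on_lock_gesture()
    print(ui.session_active)  # False: the next lock gesture terminates it
```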


Once the lock prompt is engaged, haptic engagement with the prompt 502 is no longer necessary. Consider the case of an alternate channel communication in the form of video: haptic engagement with the prompt 502 followed by haptic engagement with the lock prompt 504 results in video being persistently displayed on the display 500. The video session may be terminated by haptic contact with the lock prompt 504. During the video session the display 500 may receive gestures to control whether a front-facing camera or a back-facing camera is utilized. For example, a double tap on the display 500 may toggle between the front-facing camera and the back-facing camera. Alternately, haptic contact with one section of the display 500 may invoke the front-facing camera, while haptic contact with another section of the display 500 may invoke the back-facing camera. For example, a left-to-right gesture applied to the display 500 may toggle from the front-facing camera to the back-facing camera, while a right-to-left gesture applied to the display 500 may toggle from the back-facing camera to the front-facing camera. Other possibilities include swiping from bottom-to-top (or vice versa) or from one portion of the display to another (e.g., from one area to a section outside of the area).
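The camera-selection gestures could be mapped to a camera state roughly as follows; the gesture strings and Camera enum are illustrative assumptions rather than the patent's terminology.

```python
# Sketch of the camera-selection gestures: a double tap toggles cameras, a
# left-to-right swipe selects the back-facing camera, and a right-to-left swipe
# selects the front-facing camera.
from enum import Enum


class Camera(Enum):
    FRONT = "front-facing"
    BACK = "back-facing"


class VideoSessionControls:
    def __init__(self, camera: Camera = Camera.FRONT):
        self.camera = camera

    def on_gesture(self, gesture: str) -> Camera:
        if gesture == "double_tap":
            # Toggle whichever camera is currently in use.
            self.camera = Camera.BACK if self.camera is Camera.FRONT else Camera.FRONT
        elif gesture == "swipe_left_to_right":
            self.camera = Camera.BACK
        elif gesture == "swipe_right_to_left":
            self.camera = Camera.FRONT
        return self.camera


if __name__ == "__main__":
    controls = VideoSessionControls()
    print(controls.on_gesture("double_tap").value)            # back-facing
    print(controls.on_gesture("swipe_right_to_left").value)   # front-facing
```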


Another feature contemplated by embodiments of the invention is the utilization of a multifunctional prompt. For example, prompt 502 may be utilized for multiple functions in addition to activation of the alternate channel. In one embodiment, prompt 502 may be used to ‘send’ a text message or activate a camera, etc. During situations where participants are viewing the same thread, prompt 502 may be altered to allow use for other purposes (e.g., to activate an alternate channel). In one example, prompt 502 may normally appear as a first color and turn a second color when it is available for alternate functions.
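A minimal sketch of such a multifunctional prompt, assuming a hypothetical co-viewing flag and color scheme: the same button sends a text normally and starts the alternate channel (with a color change) when both participants are viewing the thread.

```python
# Illustrative only: one button whose function and color depend on whether the
# common viewing state is in effect.
class MultifunctionPrompt:
    NORMAL_COLOR = "blue"        # assumed first color
    CO_VIEWING_COLOR = "green"   # assumed second color

    def __init__(self):
        self.co_viewing = False

    @property
    def color(self) -> str:
        return self.CO_VIEWING_COLOR if self.co_viewing else self.NORMAL_COLOR

    def on_tap(self) -> str:
        # The action depends on whether participants are viewing the same thread.
        return "start_alternate_channel" if self.co_viewing else "send_text_message"


if __name__ == "__main__":
    prompt = MultifunctionPrompt()
    print(prompt.color, prompt.on_tap())   # blue send_text_message
    prompt.co_viewing = True
    print(prompt.color, prompt.on_tap())   # green start_alternate_channel
```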


An embodiment of the present invention relates to a computer storage product with a non-transitory computer readable storage medium having computer code thereon for performing various computer-implemented operations. The media and computer code may be those specially designed and constructed for the purposes of the present invention, or they may be of the kind well known and available to those having skill in the computer software arts. Examples of computer-readable media include, but are not limited to: magnetic media, optical media, magneto-optical media and hardware devices that are specially configured to store and execute program code, such as application-specific integrated circuits (“ASICs”), programmable logic devices (“PLDs”) and ROM and RAM devices. Examples of computer code include machine code, such as produced by a compiler, and files containing higher-level code that are executed by a computer using an interpreter. For example, an embodiment of the invention may be implemented using JAVA®, C++, or other object-oriented programming language and development tools. Another embodiment of the invention may be implemented in hardwired circuitry in place of, or in combination with, machine-executable software instructions.


The foregoing description, for purposes of explanation, used specific nomenclature to provide a thorough understanding of the invention. However, it will be apparent to one skilled in the art that specific details are not required in order to practice the invention. Thus, the foregoing descriptions of specific embodiments of the invention are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the invention to the precise forms disclosed; obviously, many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated. It is intended that the following claims and their equivalents define the scope of the invention.

Claims
  • 1. A server, comprising: a processor; and a memory storing instructions executed by the processor to: identify when a first participant at a first client device and a second participant at a second client device are actively viewing a common message thread at a same time; provide, in response to the identifying and without input from the first or second participants, for an alternate channel communication lock prompt to be displayed on the first client device and on the second client device, the alternate channel communication lock prompt being user-selectable for alternate channel communication between the first and second client devices; receive, from the first client device, an indication of user selection of the alternate channel communication lock prompt by the first participant; and in response to receiving the indication, provide for the alternate channel communication between the first and second client devices in a locked state, and provide for an alternate channel communication indicator to be displayed on the second client device, the alternate channel communication indicator indicating that the alternate channel communication was initiated by the first participant using the first client device.
  • 2. The server of claim 1, wherein the alternate channel communication indicator is separate from the alternate channel communication lock prompt, wherein the alternate channel communication indicator comprises a speaker icon when the alternate channel communication is audio-based, and wherein the alternate channel communication indicator comprises a video icon when the alternate channel communication is video-based.
  • 3. The server of claim 1, wherein the memory stores instructions executed by the processor to: toggle between a front-facing camera and a back-facing camera in response to a gesture applied to a display of a client device.
  • 4. The server of claim 3, wherein the gesture is a tap to the display of the client device.
  • 5. The server of claim 3, wherein the gesture is haptic contact with a designated area of the display of the client device.
  • 6. The server of claim 3, wherein the gesture is a lateral gesture across the display of the client device.
  • 7. The server of claim 1, wherein the alternate channel communication lock prompt is a multifunctional button capable of providing a function other than activation of the alternate channel communication.
  • 8. The server of claim 1, wherein the alternate channel communication is a broadcast audio session from the first participant to the second participant.
  • 9. The server of claim 1, wherein the alternate channel communication is a bidirectional audio session between the first and second participants.
  • 10. The server of claim 1, wherein the alternate channel communication is a broadcast video session from the first participant to the second participant.
  • 11. The server of claim 1, wherein the alternate channel communication is a bidirectional video session between the first and second participants.
  • 12. A method, comprising: identifying when a first participant at a first client device and a second participant at a second client device are actively viewing a common message thread at a same time; providing, in response to the identifying and without input from the first or second participants, for an alternate channel communication lock prompt to be displayed on the first client device and on the second client device, the alternate channel communication lock prompt being user-selectable for alternate channel communication between the first and second client devices; receiving, from the first client device, an indication of user selection of the alternate channel communication lock prompt by the first participant; and in response to receiving the indication, providing for the alternate channel communication between the first and second client devices in a locked state, and providing for an alternate channel communication indicator to be displayed on the second client device, the alternate channel communication indicator indicating that the alternate channel communication was initiated by the first participant using the first client device.
  • 13. The method of claim 12, wherein the alternate channel communication indicator is separate from the alternate channel communication lock prompt, wherein the alternate channel communication indicator comprises a speaker icon when the alternate channel communication is audio-based, and wherein the alternate channel communication indicator comprises a video icon when the alternate channel communication is video-based.
  • 14. The method of claim 12, further comprising toggling between a front-facing camera and a back-facing camera in response to a gesture applied to a display of a client device.
  • 15. The method of claim 14, wherein the gesture is a tap to the display of the client device.
  • 16. The method of claim 14, wherein the gesture is haptic contact with a designated area of the display of the client device.
  • 17. The method of claim 12, wherein the alternate channel communication is a broadcast audio session from the first participant to the second participant.
  • 18. The method of claim 12, wherein the alternate channel communication is a bidirectional audio session between the first and second participants.
  • 19. The method of claim 12, wherein the alternate channel communication is a broadcast video session from the first participant to the second participant.
  • 20. A non-transitory computer-readable medium comprising instructions, which when executed by a computing device, cause the computing device to perform operations comprising: identifying when a first participant at a first client device and a second participant at a second client device are actively viewing a common message thread at a same time; providing, in response to the identifying and without input from the first or second participants, for an alternate channel communication lock prompt to be displayed on the first client device and on the second client device, the alternate channel communication lock prompt being user-selectable for alternate channel communication between the first and second client devices; receiving, from the first client device, an indication of user selection of the alternate channel communication lock prompt by the first participant; and in response to receiving the indication, providing for the alternate channel communication between the first and second client devices in a locked state, and providing for an alternate channel communication indicator to be displayed on the second client device, the alternate channel communication indicator indicating that the alternate channel communication was initiated by the first participant using the first client device.
CROSS-REFERENCE TO RELATED APPLICATION

This application is a continuation-in-part of U.S. Ser. No. 14/187,005, filed Feb. 21, 2014.

US Referenced Citations (244)
Number Name Date Kind
5999932 Paul Dec 1999 A
6154764 Nitta et al. Nov 2000 A
6167435 Druckenmiller et al. Dec 2000 A
6204840 Petelycky et al. Mar 2001 B1
6216141 Straub et al. Apr 2001 B1
6310694 Okimoto et al. Oct 2001 B1
6442590 Inala et al. Aug 2002 B1
6484196 Maurille Nov 2002 B1
6606657 Zilberstein et al. Aug 2003 B1
6665531 Soderbacka et al. Dec 2003 B1
6724403 Santoro et al. Apr 2004 B1
6757713 Ogilvie et al. Jun 2004 B1
6879994 Matsliach et al. Apr 2005 B1
6898626 Ohashi May 2005 B2
7004394 Kim Feb 2006 B2
7124164 Chemtob Oct 2006 B1
7149893 Leonard et al. Dec 2006 B1
7203380 Chiu et al. Apr 2007 B2
7243163 Friend et al. Jul 2007 B1
7356564 Hartselle et al. Apr 2008 B2
7519670 Hagale et al. Apr 2009 B2
7856449 Martino et al. Dec 2010 B1
8001204 Burtner et al. Aug 2011 B2
8098904 Ioffe et al. Jan 2012 B2
8112716 Kobayashi Feb 2012 B2
8276092 Narayanan et al. Sep 2012 B1
8279319 Date Oct 2012 B2
8312086 Velusamy et al. Nov 2012 B2
8312097 Siegel et al. Nov 2012 B1
8379130 Forutanpour et al. Feb 2013 B2
8405773 Hayashi et al. Mar 2013 B2
8418067 Cheng et al. Apr 2013 B2
8428453 Spiegel et al. Apr 2013 B1
8471914 Sakiyama et al. Jun 2013 B2
8560612 Kilmer et al. Oct 2013 B2
8687021 Bathiche et al. Apr 2014 B2
8744523 Fan et al. Jun 2014 B2
8775407 Huang Jul 2014 B1
8775972 Spiegel Jul 2014 B2
8788680 Naik Jul 2014 B1
8797415 Arnold Aug 2014 B2
8856349 Jain et al. Oct 2014 B2
8914752 Spiegel Dec 2014 B1
9026943 Spiegel May 2015 B1
9037577 Saylor et al. May 2015 B1
9083770 Drose et al. Jul 2015 B1
9098832 Scardino Aug 2015 B1
9225897 Sehn et al. Dec 2015 B1
9237202 Sehn Jan 2016 B1
9276886 Samaranayake Mar 2016 B1
9396354 Murphy et al. Jul 2016 B1
9407712 Sehn Aug 2016 B1
9407816 Sehn Aug 2016 B1
9660950 Archibong et al. May 2017 B2
9785796 Murphy et al. Oct 2017 B1
10082926 Spiegel et al. Sep 2018 B1
10084735 Spiegel et al. Sep 2018 B1
10949049 Spiegel et al. Mar 2021 B1
10958605 Spiegel Mar 2021 B1
20020047868 Miyazawa Apr 2002 A1
20020122659 McGrath et al. Sep 2002 A1
20020144154 Tomkow Oct 2002 A1
20030016247 Lai et al. Jan 2003 A1
20030052925 Daimon et al. Mar 2003 A1
20030126215 Udell Jul 2003 A1
20030164856 Prager et al. Sep 2003 A1
20040027371 Jaeger Feb 2004 A1
20040111467 Willis Jun 2004 A1
20040203959 Coombes Oct 2004 A1
20040243531 Dean Dec 2004 A1
20050078804 Yomoda Apr 2005 A1
20050097176 Schatz et al. May 2005 A1
20050104976 Currans May 2005 A1
20050114783 Szeto May 2005 A1
20050122405 Voss et al. Jun 2005 A1
20050193340 Amburgey et al. Sep 2005 A1
20050193345 Klassen et al. Sep 2005 A1
20050198128 Anderson Sep 2005 A1
20050223066 Buchheit et al. Oct 2005 A1
20060114338 Rothschild Jun 2006 A1
20060270419 Crowley et al. Nov 2006 A1
20070040931 Nishizawa Feb 2007 A1
20070064899 Boss et al. Mar 2007 A1
20070073823 Cohen et al. Mar 2007 A1
20070082707 Flynt et al. Apr 2007 A1
20070192128 Celestini Aug 2007 A1
20070214216 Carrer et al. Sep 2007 A1
20070229651 Nakajima Oct 2007 A1
20070233801 Eren et al. Oct 2007 A1
20070243887 Bandhole et al. Oct 2007 A1
20070255456 Funayama Nov 2007 A1
20080010266 Brunn et al. Jan 2008 A1
20080025701 Ikeda Jan 2008 A1
20080033930 Warren Feb 2008 A1
20080055269 Lemay et al. Mar 2008 A1
20080104169 Combel et al. May 2008 A1
20080104503 Beall et al. May 2008 A1
20080132275 Eastwood Jun 2008 A1
20080207176 Brackbill et al. Aug 2008 A1
20080222545 Lemay Sep 2008 A1
20080256446 Yamamoto Oct 2008 A1
20080266421 Takahata et al. Oct 2008 A1
20080270938 Carlson Oct 2008 A1
20080313346 Kujawa et al. Dec 2008 A1
20090006565 Velusamy et al. Jan 2009 A1
20090015703 Kim et al. Jan 2009 A1
20090024956 Kobayashi Jan 2009 A1
20090040324 Nonaka Feb 2009 A1
20090042588 Lottin et al. Feb 2009 A1
20090058822 Chaudhri Mar 2009 A1
20090079846 Chou Mar 2009 A1
20090132453 Hangartner et al. May 2009 A1
20090132665 Thomsen et al. May 2009 A1
20090160970 Fredlund et al. Jun 2009 A1
20090265647 Martin et al. Oct 2009 A1
20100082693 Hugg et al. Apr 2010 A1
20100131880 Lee et al. May 2010 A1
20100131895 Wohlert May 2010 A1
20100156933 Jones et al. Jun 2010 A1
20100159944 Pascal et al. Jun 2010 A1
20100161831 Haas et al. Jun 2010 A1
20100185665 Horn et al. Jul 2010 A1
20100214436 Kim et al. Aug 2010 A1
20100223128 Dukellis et al. Sep 2010 A1
20100223343 Bosan et al. Sep 2010 A1
20100257196 Waters et al. Oct 2010 A1
20100281045 Dean Nov 2010 A1
20100306669 Della Pasqua Dec 2010 A1
20110004071 Faiola et al. Jan 2011 A1
20110040783 Uemichi et al. Feb 2011 A1
20110040804 Peirce et al. Feb 2011 A1
20110050909 Ellenby et al. Mar 2011 A1
20110050915 Wang et al. Mar 2011 A1
20110102605 Hannaford May 2011 A1
20110102630 Rukes May 2011 A1
20110141025 Tsai Jun 2011 A1
20110145564 Moshir et al. Jun 2011 A1
20110184980 Jeong et al. Jul 2011 A1
20110197194 D'Angelo et al. Aug 2011 A1
20110202968 Nurmi Aug 2011 A1
20110211534 Schmidt et al. Sep 2011 A1
20110213845 Logan et al. Sep 2011 A1
20110214066 Chitturi et al. Sep 2011 A1
20110273575 Lee Nov 2011 A1
20110283188 Farrenkopf Nov 2011 A1
20110286586 Saylor et al. Nov 2011 A1
20110320373 Lee et al. Dec 2011 A1
20120028659 Whitney et al. Feb 2012 A1
20120062805 Candelore Mar 2012 A1
20120108293 Law et al. May 2012 A1
20120110096 Smarr et al. May 2012 A1
20120113143 Adhikari et al. May 2012 A1
20120113272 Hata May 2012 A1
20120131507 Sparandara et al. May 2012 A1
20120131512 Takeuchi et al. May 2012 A1
20120143760 Abulafia et al. Jun 2012 A1
20120150978 Monaco Jun 2012 A1
20120163664 Zhu Jun 2012 A1
20120166971 Sachson et al. Jun 2012 A1
20120169855 Oh Jul 2012 A1
20120173991 Roberts et al. Jul 2012 A1
20120176401 Hayward et al. Jul 2012 A1
20120184248 Speede Jul 2012 A1
20120200743 Blanchflower et al. Aug 2012 A1
20120210244 De Francisco Lopez et al. Aug 2012 A1
20120212632 Mate et al. Aug 2012 A1
20120220264 Kawabata Aug 2012 A1
20120233000 Fisher et al. Sep 2012 A1
20120236162 Imamura Sep 2012 A1
20120239761 Linner et al. Sep 2012 A1
20120250951 Chen Oct 2012 A1
20120278387 Garcia et al. Nov 2012 A1
20120278692 Shi Nov 2012 A1
20120281129 Wang et al. Nov 2012 A1
20120299954 Wada et al. Nov 2012 A1
20120304080 Wormald et al. Nov 2012 A1
20120307096 Ford et al. Dec 2012 A1
20120307112 Kunishige et al. Dec 2012 A1
20120308044 Vander Mey et al. Dec 2012 A1
20120309542 Nogami et al. Dec 2012 A1
20120323933 He et al. Dec 2012 A1
20130050260 Reitan Feb 2013 A1
20130057587 Leonard et al. Mar 2013 A1
20130059607 Herz et al. Mar 2013 A1
20130060690 Oskolkov et al. Mar 2013 A1
20130063369 Malhotra et al. Mar 2013 A1
20130067027 Song et al. Mar 2013 A1
20130071093 Hanks et al. Mar 2013 A1
20130085790 Palmer et al. Apr 2013 A1
20130128059 Kristensson May 2013 A1
20130145286 Feng et al. Jun 2013 A1
20130156175 Bekiares et al. Jun 2013 A1
20130169822 Zhu et al. Jul 2013 A1
20130173729 Starenky et al. Jul 2013 A1
20130182133 Tanabe Jul 2013 A1
20130185131 Sinha et al. Jul 2013 A1
20130194301 Robbins et al. Aug 2013 A1
20130198176 Kim Aug 2013 A1
20130222323 McKenzie Aug 2013 A1
20130227476 Frey Aug 2013 A1
20130232194 Knapp et al. Sep 2013 A1
20130263031 Oshiro et al. Oct 2013 A1
20130265450 Barnes, Jr. Oct 2013 A1
20130290443 Collins et al. Oct 2013 A1
20130344896 Kirmse et al. Dec 2013 A1
20130346877 Borovoy et al. Dec 2013 A1
20140011538 Mulcahy et al. Jan 2014 A1
20140032682 Prado et al. Jan 2014 A1
20140047045 Baldwin et al. Feb 2014 A1
20140047335 Lewis et al. Feb 2014 A1
20140049652 Moon et al. Feb 2014 A1
20140052485 Shidfar Feb 2014 A1
20140052633 Gandhi Feb 2014 A1
20140057660 Wager Feb 2014 A1
20140085334 Payne Mar 2014 A1
20140089314 Iizuka et al. Mar 2014 A1
20140100997 Mayerle et al. Apr 2014 A1
20140122658 Haeger et al. May 2014 A1
20140122787 Shalvi et al. May 2014 A1
20140129953 Spiegel May 2014 A1
20140143143 Fasoli et al. May 2014 A1
20140149519 Redfern et al. May 2014 A1
20140155102 Cooper et al. Jun 2014 A1
20140173457 Wang et al. Jun 2014 A1
20140189592 Benchenaa et al. Jul 2014 A1
20140201527 Krivorot Jul 2014 A1
20140207679 Cho Jul 2014 A1
20140214471 Schreiner, III Jul 2014 A1
20140279436 Dorsey et al. Sep 2014 A1
20140280537 Pridmore et al. Sep 2014 A1
20140282096 Rubinstein et al. Sep 2014 A1
20140029821 Park et al. Oct 2014 A1
20140317302 Naik Oct 2014 A1
20140325383 Brown et al. Oct 2014 A1
20140359024 Spiegel Dec 2014 A1
20140359032 Spiegel et al. Dec 2014 A1
20150046278 Pei et al. Feb 2015 A1
20150094106 Grossman et al. Apr 2015 A1
20150116529 Wu et al. Apr 2015 A1
20150172534 Miyakawa et al. Jun 2015 A1
20150222814 Li et al. Aug 2015 A1
20150350450 Rose et al. Dec 2015 A1
20160006927 Sehn Jan 2016 A1
20210336914 Spiegel et al. Oct 2021 A1
Foreign Referenced Citations (7)
Number Date Country
2887596 Jul 2015 CA
2418606 Feb 2012 EP
2482537 Aug 2012 EP
20120003323 Jan 2012 KR
WO-2013008238 Jan 2013 WO
WO-2014194262 Dec 2014 WO
WO-2016007285 Jan 2016 WO
Non-Patent Literature Citations (66)
Entry
“Android Getting Started Guide”, Voxer Business, [Online] Retrieved from the Internet: <URL: https://voxer.com/assets/AndroidGuide.pdf>, (Feb. 1, 2014), 18 pgs.
“U.S. Appl. No. 14/187,005, Advisory Action dated Dec. 5, 2014”, 3 pgs.
“U.S. Appl. No. 14/187,005, Decision on Pre-Appeal Brief Request dated Jan. 23, 2015”, 2 pgs.
“U.S. Appl. No. 14/187,005, Examiner Interview Summary dated Feb. 12, 2015”, 3 pgs.
“U.S. Appl. No. 14/187,005, Final Office Action dated Nov. 6, 2014”, 13 pgs.
“U.S. Appl. No. 14/187,005, Non Final Office Action dated Jul. 2, 2014”, 10 pgs.
“U.S. Appl. No. 14/187,005, Pre-Appeal Brief Conference Request filed Dec. 16, 2014”, 5 pgs.
“U.S. Appl. No. 14/187,005, Response filed Aug. 6, 2014 to Non Final Office Action dated Jul. 2, 2014”, 5 pgs.
“U.S. Appl. No. 14/187,005, Response filed Nov. 25, 2014 to Final Office Action dated Nov. 6, 2014”, 6 pgs.
“U.S. Appl. No. 14/510,051, Advisory Action dated Jun. 7, 2017”, 3 pgs.
“U.S. Appl. No. 14/510,051, Examiner Interview Summary dated Jul. 21, 2017”, 5 pgs.
“U.S. Appl. No. 14/510,051, Final Office Action dated Apr. 4, 2017”, 22 pgs.
“U.S. Appl. No. 14/510,051, Final Office Action dated Dec. 13, 2017”, 16 pgs.
“U.S. Appl. No. 14/510,051, Non Final Office Action dated Jul. 12, 2017”, 20 pgs.
“U.S. Appl. No. 14/510,051, Non Final Office Action dated Nov. 4, 2016”, 10 pgs.
“U.S. Appl. No. 14/510,051, Notice of Allowability dated May 10, 2018”, 2 pgs.
“U.S. Appl. No. 14/510,051, Notice of Allowance dated May 2, 2018”, 9 pgs.
“U.S. Appl. No. 14/510,051, Response filed Mar. 13, 2018 to Final Office Action dated Dec. 13, 2017”, 6 pgs.
“U.S. Appl. No. 14/616,618, Advisory Action dated Jun. 9, 2017”, 4 pgs.
“U.S. Appl. No. 14/616,618, Examiner Interview Summary dated Feb. 6, 2015”, 3 pgs.
“U.S. Appl. No. 14/616,618, Final Office Action dated Apr. 7, 2017”, 29 pgs.
“U.S. Appl. No. 14/616,618, Final Office Action dated Dec. 13, 2017”, 14 pgs.
“U.S. Appl. No. 14/616,618, Non Final Office Action dated Jul. 17, 2017”, 28 pgs.
“U.S. Appl. No. 14/616,618, Non Final Office Action dated Nov. 4, 2016”, 13 pgs.
“U.S. Appl. No. 14/616,618, Notice of Allowance dated May 1, 2018”, 9 pgs.
“U.S. Appl. No. 14/616,618, Response filed Mar. 13, 2018 to Final Office Action dated Dec. 13, 2017”, 6 pgs.
“U.S. Appl. No. 14/616,618, Response filed May 25, 2017 to Final Office Action dated Apr. 7, 2017”, 3 pgs.
“U.S. Appl. No. 14/616,618, Response filed Jul. 7, 2017 to Advisory Action dated Jun. 9, 2017”, 5 pgs.
“U.S. Appl. No. 14/616,618, Response filed Sep. 14, 2017 to Non Final Office Action dated Jul. 17, 2017”, 4 pgs.
“U.S. Appl. No. 14/616,618, Response filed Dec. 20, 2016 to Non Final Office Action dated Nov. 4, 2016”, 5 pgs.
“U.S. Appl. No. 16/053,519, Examiner Interview Summary dated Apr. 23, 2020”, 3 pgs.
“U.S. Appl. No. 16/053,519, Final Office Action dated Feb. 18, 2020”, 14 pgs.
“U.S. Appl. No. 16/053,519, Non Final Office Action dated May 11, 2020”, 18 pgs.
“U.S. Appl. No. 16/053,519, Non Final Office Action dated Sep. 27, 2019”, 13 pgs.
“U.S. Appl. No. 16/053,519, Response filed Jan. 27, 2020 to Non Final Office Action dated Sep. 27, 2019”, 8 pgs.
“U.S. Appl. No. 16/053,519, Response filed Apr. 22, 2020 to Final Office Action dated Feb. 18, 2020”, 10 pgs.
“U.S. Appl. No. 16/053,519, Response filed Aug. 11, 2020 to Non Final Office Action dated May 11, 2020”, 9 pgs.
“U.S. Appl. No. 16/059,834, Examiner Interview Summary dated Apr. 23, 2020”, 3 pgs.
“U.S. Appl. No. 16/059,834, Final Office Action dated Feb. 19, 2020”, 15 pgs.
“U.S. Appl. No. 16/059,834, Final Office Action dated Sep. 4, 2020”, 16 pgs.
“U.S. Appl. No. 16/059,834, Non Final Office Action dated May 11, 2020”, 18 pgs.
“U.S. Appl. No. 16/059,834, Non Final Office Action dated Sep. 30, 2019”, 11 pgs.
“U.S. Appl. No. 16/059,834, Notice of Allowance dated Nov. 12, 2020”, 10 pgs.
“U.S. Appl. No. 16/059,834, Response filed Jan. 30, 2020 to Non-Final Office Action dated Sep. 30, 2019”, 8 pgs.
“U.S. Appl. No. 16/059,834, Response filed Apr. 22, 2020 to Final Office Action dated Feb. 19, 2020”, 9 pgs.
“U.S. Appl. No. 16/059,834, Response filed Aug. 11, 2020 to Non Final Office Action dated May 11, 2020”, 9 pgs.
“U.S. Appl. No. 16/059,834, Response filed Oct. 23, 2020 to Final Office Action dated Sep. 4, 2020”, 7 pgs.
“European Application Serial No. 14804343.3, European Search Opinion dated Sep. 29, 2016”, 7 pgs.
“European Application Serial No. 14804343.3, Supplementary European Search Report dated Sep. 29, 2016”, 2 pgs.
“European Application Serial No. 15819676.6, European Search Opinion dated Oct. 12, 2017”, 4 pgs.
“European Application Serial No. 15819676.6, Supplementary European Search Report dated Oct. 12, 2017”, 2 pgs.
“How Snaps Are Stored And Deleted”, Snapchat, [Online] Retrieved from the Internet: <URL: https://www.snap.com/en-us/news/post/how-snaps-are-stored-and-deleted/>, (May 9, 2013), 2 pgs.
“International Application Serial No. PCT/US2014/040346, International Search Report dated Mar. 23, 2015”, 2 pgs.
“International Application Serial No. PCT/US2014/040346, Written Opinion dated Mar. 23, 2015”, 6 pgs.
“International Application Serial No. PCT/US2015/037251, International Search Report dated Sep. 29, 2015”, 2 pgs.
“International Application Serial No. PCT/US2015/037251, Written Opinion dated Sep. 29, 2015”, 4 pgs.
“iVisit Mobile: Getting Started”, IVISIT, [Online] Retrieved from the Internet: <URL: http://web.archive.org/web/20140830174355/http://ivisit.com/support_mobile>, (Dec. 4, 2013), 16 pgs.
Fajman, “An Extensible Message Format for Message Disposition Notifications”, Request for Comments: 2298, National Institutes of Health, (Mar. 1998), 28 pgs.
Melanson, Mike, “This text message will self destruct in 60 seconds”, [Online] Retrieved from the Internet: <URL: http://readwrite.com/2011/02/11/this_text_message_will_self_destruct_in_60_seconds>, (Feb. 18, 2015), 4 pgs.
Sawers, Paul, “Snapchat for iOS Lets You Send Photos to Friends and Set How long They're Visible For”, [Online] Retrieved from the Internet: <URL: https://thenextweb.com/apps/2012/05/07/snapchat-for-ios-lets-you-send-photos-to-friends-and-set-how-long-theyre-visible-for/>, (May 7, 2012), 5 pgs.
Shein, Esther, “Ephemeral Data”, Communications of the ACM, vol. 56, No. 9, (Sep. 2013), 3 pgs.
“U.S. Appl. No. 16/053,519, Final Office Action dated Sep. 4, 2020”, 16 pgs.
“U.S. Appl. No. 16/053,519, Response filed Oct. 23, 2020 to Final Office Action dated Sep. 4, 2020”, 8 pgs.
“U.S. Appl. No. 16/053,519, Notice of Allowance dated Nov. 12, 2020”, 10 pgs.
“U.S. Appl. No. 17/193,803, Response filed Feb. 18, 2022 to Non Final Office Action dated Nov. 18, 2021”, 9 pgs.
“U.S. Appl. No. 17/193,803, Non Final Office Action dated Nov. 18, 2021”, 8 pgs.
Related Publications (1)
Number Date Country
20210306291 A1 Sep 2021 US
Continuations (2)
Number Date Country
Parent 16059834 Aug 2018 US
Child 17193667 US
Parent 14510051 Oct 2014 US
Child 16059834 US
Continuation in Parts (1)
Number Date Country
Parent 14187005 Feb 2014 US
Child 14510051 US