The present invention is directed to Voice and Video conversations over IP, and more particularly to a method and system for conducting such communications using a decentralized architecture.
There are numerous devices and applications based on transferring Voice and/or Video over IP (Internet Protocol). These include VoIP clients installed on a PC or on any other communication device, such as a mobile phone. There are also dedicated Wi-Fi telephones which support VoIP applications; VoIP telephones can be either wired, such as the Cisco CP-7941G, or wireless, such as the Polycom SPECTRALINK® 8002 2200-37010-020 or the Linksys WIP310-G1.
Client devices for making and receiving voice and video calls over the IP network with the standard functionality of most “original” telephones are also referred to as “softphones”. Softphones usually allow integration with IP phones and USB phones, rather than relying on a computer's microphone and speakers (or headset). Often a softphone is designed to behave like a traditional telephone, sometimes appearing as an image of a phone with a display panel and buttons with which the user can interact.
A typical application of a softphone is to make calls via an Internet telephony service provider to other softphones or to telephones. Popular Internet telephony service providers include SKYPE®, GOOGLE TALK™, and VONAGE®, which have their own softphones that a user may install on his computer or the like. Most service providers use a communication protocol called SIP (Session Initiation Protocol), whereas SKYPE® has a closed proprietary system. In order to make a voice call over the Internet, one needs a computing device with audio input means (e.g. a microphone) and audio output means (e.g. a speaker or headset), Internet connectivity such as DSL, Wi-Fi, cable or LAN, and an account with an Internet telephony service provider or IP PBX.
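By way of illustration, the following Python sketch composes and sends the SIP INVITE request with which such a prior-art softphone begins a call. The proxy address and user URIs are hypothetical placeholders, and a real client would add an SDP body, authentication, and retransmission handling on top of this minimal message.

```python
import socket
import uuid

# Hypothetical endpoints; a real softphone discovers the proxy via DNS SRV.
PROXY = ("sip.example-provider.com", 5060)
CALLER = "alice@example-provider.com"
CALLEE = "bob@example-provider.com"

def build_invite(caller: str, callee: str, local_ip: str, local_port: int) -> bytes:
    """Compose a bare-bones SIP INVITE (offerless, i.e. no SDP body)."""
    return (
        f"INVITE sip:{callee} SIP/2.0\r\n"
        f"Via: SIP/2.0/UDP {local_ip}:{local_port};branch=z9hG4bK{uuid.uuid4().hex[:8]}\r\n"
        f"Max-Forwards: 70\r\n"
        f"From: <sip:{caller}>;tag={uuid.uuid4().hex[:8]}\r\n"
        f"To: <sip:{callee}>\r\n"
        f"Call-ID: {uuid.uuid4().hex}\r\n"
        f"CSeq: 1 INVITE\r\n"
        f"Contact: <sip:{caller.split('@')[0]}@{local_ip}:{local_port}>\r\n"
        f"Content-Length: 0\r\n\r\n"
    ).encode()

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 5060))
sock.settimeout(5)
local_ip = socket.gethostbyname(socket.gethostname())
sock.sendto(build_invite(CALLER, CALLEE, local_ip, 5060), PROXY)
response, _ = sock.recvfrom(4096)  # expect e.g. "SIP/2.0 100 Trying"
print(response.decode(errors="replace").splitlines()[0])
```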
Such prior art devices establish a single connection to the Internet telephony service provider.
The present invention will be more fully understood and appreciated from the following detailed description, taken in conjunction with the drawings.
Aspects of the present invention relate generally to Voice and Video over IP. In particular, aspects of the present invention relate to a system and method for communication where there is more than one terminal for conducting and controlling the VoIP sessions.
Reference is now made to FIG. 1.
According to an embodiment of the present invention, a softphone application is installed and operates on both Display 110 and Handset 120, which are synced so as to provide a seamless experience to the user. A typical scenario includes a user having the Handset 120 close to him, possibly in his hands, whereas Display 110 is a few meters away, such as when Display 110 is an LCD display in the living room. Therefore such a scenario or similar scenarios would benefit from a decentralized system and method for providing Voice over IP or Voice and Video over IP sessions. According to an embodiment of the present invention, the controlling of such VoIP sessions is done from Handset 120; e.g. initiating calls to contacts, answering calls, ending calls, etc.
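By way of illustration, the following Python sketch shows how such call control from Handset 120 might be dispatched. The JSON command format and the invite/accept/bye method names are illustrative assumptions, standing in for whatever signaling stack connects the handset to internet telephony service provider 130.

```python
import json

class CallController:
    """Dispatches user actions on Handset 120 to the VoIP signaling layer."""

    def __init__(self, signaling):
        self.signaling = signaling  # assumed to expose invite/accept/bye

    def handle(self, raw: str) -> None:
        msg = json.loads(raw)
        action = msg["action"]
        if action == "dial":
            self.signaling.invite(msg["contact"])     # initiate a call
        elif action == "answer":
            self.signaling.accept(msg["session_id"])  # answer an incoming call
        elif action == "hangup":
            self.signaling.bye(msg["session_id"])     # end the call
        else:
            raise ValueError(f"unknown action: {action!r}")

# Example: controller.handle('{"action": "dial", "contact": "bob@provider"}')
```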
According to an embodiment of the present invention, both Display 110 and Handset 120 have audio and video capabilities, as shown in Table 1 below. Display 110 includes audio input means (e.g. a built-in microphone, or one connected to the processing unit and modem 140), audio output means (speakers), video input means (a built-in camera, or one connected to the processing unit and modem 140) and video output means (the display). Handset 120 includes audio input means (microphone), audio output means (earpiece, speakers and/or headset), and video output means (display), and according to an embodiment of the present invention may also include video input means (a built-in or attachable camera).
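TABLE 1

Device | Audio input | Audio output | Video input | Video output
---|---|---|---|---
Display 110 | Built-in microphone, or microphone connected to processing unit and modem 140 | Speakers | Built-in camera, or camera connected to processing unit and modem 140 | Display
Handset 120 | Microphone | Earpiece, speakers and/or headset | Optional built-in or attachable camera | Display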
Reference is now made to FIG. 2.
Reference is now made to FIG. 3.
According to a preferred embodiment of the present invention, since Handset 120 is typically in close vicinity to the user, it is used as the audio input of a conversation. The audio output may be either the audio output means of Display 110 or that of Handset 120; video output is preferably the display of Display 110, and video input may be the video input means of either Display 110 or Handset 120.
According to another embodiment of the present invention, audio and video input and output means may be used concurrently. For example, both the audio output of Display 110 and that of Handset 120 may play the audio of the active conversation.
Reference is now made to FIG. 4.
Step 430 occurs concurrently with step 420 or immediately follows it. At step 430, a second connection is established between internet telephony service provider 130 and Display 110. According to an embodiment of the present invention, said second connection is initiated by Display 110, which contacts internet telephony service provider 130 and signals that it should join the initiated or ongoing session between internet telephony service provider 130 and Handset 120. Display 110 may transfer any detail about Handset 120 and/or the session so that internet telephony service provider 130 identifies the session and can transfer the session or parts of it (e.g. video) to Display 110. According to another embodiment of the present invention, Handset 120 signals to internet telephony service provider 130 that it is paired with Display 110, and therefore the initiated VoIP session should be shared with Display 110.
Said second connection between Display 110 and internet telephony service provider 130 is typically established for video purposes: according to an embodiment of the present invention, the video of said VoIP session is transferred between internet telephony service provider 130 and Display 110, while audio is transferred between internet telephony service provider 130 and both Handset 120 and Display 110.
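By way of illustration, the following Python sketch shows how Display 110 might signal internet telephony service provider 130 to join the ongoing session and receive only selected media legs. The wire format, port number and field names are illustrative assumptions.

```python
import json
import socket

def join_session(provider_host: str, session_id: str, display_id: str,
                 media: list[str]) -> dict:
    """Ask the provider to add this Display to an ongoing session,
    requesting only the media legs it should carry (e.g. video)."""
    request = {
        "type": "join",
        "session_id": session_id,  # identifies the Handset's session
        "device_id": display_id,
        "media": media,            # e.g. ["video"] or ["video", "audio"]
    }
    with socket.create_connection((provider_host, 5070), timeout=5) as s:
        s.sendall(json.dumps(request).encode() + b"\n")
        return json.loads(s.makefile().readline())

# e.g. join_session("provider.example.com", "abc123", "display-110", ["video"])
```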
Display 110 typically has a larger display than Handset 120, more convenient for user viewing, and possibly better speakers; in addition, the connection between Display 110 and internet telephony service provider 130 is of higher bandwidth than the connection between Handset 120 and internet telephony service provider 130. Therefore, at step 440, the video part of the VoIP session, which consumes more bandwidth, is transferred between Display 110 and internet telephony service provider 130, and the audio part is transferred between internet telephony service provider 130 and both Display 110 and Handset 120. Handset 120 is used especially for the audio input of the VoIP session, whereas the audio output may be played through the audio output means of Handset 120, Display 110, or both. At step 450, the user terminates the VoIP session via Handset 120.
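By way of illustration, the following Python sketch captures the media-leg assignment of step 440: the bandwidth-heavy video legs are routed to Display 110, audio capture stays on Handset 120, and audio rendering may go to either device or both. The routing table itself is an illustrative assumption.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MediaRoute:
    video_out: str            # device rendering received video
    video_in: str             # device capturing outgoing video
    audio_in: str             # device capturing outgoing audio
    audio_out: tuple          # one or both devices may render audio

def plan_routes(audio_on_both: bool = True) -> MediaRoute:
    """Assign the bandwidth-heavy video legs to the Display and keep
    audio capture on the Handset, which is nearest to the user."""
    return MediaRoute(
        video_out="display-110",   # large screen for viewing
        video_in="display-110",    # built-in or connected camera
        audio_in="handset-120",    # user speaks into the handset
        audio_out=("display-110", "handset-120") if audio_on_both
                  else ("handset-120",),
    )
```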
According to an aspect of the present invention, even though the VoIP session is divided between two connections, the session is seamless and the user carries out the conversation as if it were conducted over a single connection. Accordingly, Display 110 and Handset 120 are synchronized. Said synchronization serves to diminish latencies between audio and video, and between Display 110 and Handset 120: the video received and displayed on Display 110 is synchronized with the audio received and played on either speaker/s 113 of Display 110 or audio output means 123 of Handset 120. Synchronization between Display 110 and Handset 120 is performed either via internet telephony service provider 130 or directly. In a first embodiment of the present invention, a time stamp is sent from both Display 110 and Handset 120 to internet telephony service provider 130, which in turn alters the transmission time of data, audio and/or video, to Display 110 and Handset 120. According to a second embodiment of the present invention, Display 110 and Handset 120 may have buffers which allow them to synchronize with one another by delaying the playing of received data and/or the sending of data to internet telephony service provider 130.
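By way of illustration, the following Python sketch shows the buffer-based variant of such synchronization: each device holds received frames and renders a frame only once a shared playout delay has elapsed since its capture timestamp, so that audio and video bearing equal timestamps are played at the same instant on both devices. The 150 ms delay, and the assumption that sender timestamps are on a clock both receivers share (e.g. via the provider), are illustrative.

```python
import heapq

PLAYOUT_DELAY = 0.150  # seconds; assumed shared value, negotiated in practice

class PlayoutBuffer:
    """Delays rendering so frames play at capture_ts + PLAYOUT_DELAY."""

    def __init__(self, render):
        self.render = render   # callback that actually plays the frame
        self.heap = []         # min-heap of (capture timestamp, payload)

    def push(self, capture_ts: float, payload: bytes) -> None:
        heapq.heappush(self.heap, (capture_ts, payload))

    def poll(self, now: float) -> None:
        """Render every frame whose scheduled playout time has arrived."""
        while self.heap and self.heap[0][0] + PLAYOUT_DELAY <= now:
            _, payload = heapq.heappop(self.heap)
            self.render(payload)

# Both Display 110 and Handset 120 run the same buffer with the same delay,
# so a video frame and an audio frame stamped at t render together at
# t + PLAYOUT_DELAY on their respective devices.
```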
According to an embodiment of the present invention, Handset 120 is also used to control other elements of Display 110, using the local communication channel established by local wireless modules 112 and 122. For example, Handset 120 may act as a remote control for Display 110.
According to yet another embodiment of the present invention, the system includes multiple Displays, similar to Display 110. Handset 120 communicates with all of the Displays and may control a session which is divided between Handset 120 and a first Display; upon manual selection by the user, or automatically, the session may be handed over so as to be divided between Handset 120 and a second Display. In one embodiment, hand-over between the first Display and the second Display is initiated manually by the user, who may be able to see which other Displays are in his vicinity. In another embodiment of the present invention, the session may be handed over automatically, possibly when the user moves from a first location to a second location, and/or when the local communication between Handset 120 and said second Display is of better quality than that between Handset 120 and said first Display. Such handover mechanisms are described in the art; for example, U.S. Pat. No. 6,834,192 to Watanabe et al. describes handover of communications in a Bluetooth, or other, radio communication system, and U.S. patent application Ser. No. 11/680,911 to Jougit describes a method and system for a distributed Bluetooth host architecture.
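By way of illustration, the following Python sketch shows one possible automatic handover criterion: the session is moved to another Display only when its local-link quality exceeds that of the current Display by a hysteresis margin, which avoids oscillation between two comparable links. The RSSI-based metric and the 6 dB margin are illustrative assumptions.

```python
HYSTERESIS_DB = 6.0  # assumed tuning value; prevents rapid back-and-forth

def pick_display(current: str, rssi: dict[str, float]) -> str:
    """Return the Display the session should be anchored to, given
    RSSI readings (in dBm) for the local link to each Display."""
    best = max(rssi, key=rssi.get)
    if best != current and rssi[best] >= rssi[current] + HYSTERESIS_DB:
        return best  # hand the session over to the better-placed Display
    return current

# e.g. pick_display("living-room", {"living-room": -70.0, "bedroom": -58.0})
# returns "bedroom", since -58 dBm beats -70 dBm by more than 6 dB.
```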
Reference is now made to FIG. 5.
This application claims the benefit of, and is a continuation of, U.S. patent application Ser. No. 13/895,396, entitled DECENTRALIZED SYSTEM AND METHOD FOR VOICE AND VIDEO SESSIONS, filed on May 16, 2013 by inventor Eyal Bychkov. U.S. patent application Ser. No. 13/895,396 is a continuation of U.S. patent application Ser. No. 13/101,358, now U.S. Pat. No. 8,457,118, entitled DECENTRALIZED SYSTEM AND METHOD FOR VOICE AND VIDEO SESSIONS, filed on May 5, 2011 by inventor Eyal Bychkov. U.S. patent application Ser. No. 13/101,358 is a non-provisional of U.S. Provisional Application No. 61/345,318, entitled DECENTRALIZED SYSTEM AND METHOD FOR VOICE AND VIDEO SESSIONS, filed on May 17, 2010 by inventor Eyal Bychkov.
Number | Name | Date | Kind |
---|---|---|---|
5625673 | Grewe et al. | Apr 1997 | A |
5628055 | Stein | May 1997 | A |
5809115 | Inkinen | Sep 1998 | A |
5893037 | Reele et al. | Apr 1999 | A |
5907815 | Grimm et al. | May 1999 | A |
6188917 | Laureanti | Feb 2001 | B1 |
6201867 | Koike | Mar 2001 | B1 |
6243578 | Koike | Jun 2001 | B1 |
6285823 | Saeki et al. | Sep 2001 | B1 |
6300947 | Kanevsky | Oct 2001 | B1 |
6477357 | Cook | Nov 2002 | B1 |
6516202 | Hawkins et al. | Feb 2003 | B1 |
6640113 | Shim et al. | Oct 2003 | B1 |
6690947 | Tom | Feb 2004 | B1 |
6760415 | Beecroft | Jul 2004 | B2 |
6834192 | Watanabe | Dec 2004 | B1 |
6898283 | Wycherley et al. | May 2005 | B2 |
6907264 | Sterkel | Jun 2005 | B1 |
6999792 | Warren | Feb 2006 | B2 |
7020704 | Lipscomb et al. | Mar 2006 | B1 |
7085542 | Dietrich et al. | Aug 2006 | B2 |
7194285 | Tom | Mar 2007 | B2 |
7194752 | Kenyon et al. | Mar 2007 | B1 |
7266391 | Warren | Sep 2007 | B2 |
7275244 | Charles et al. | Sep 2007 | B1 |
7477919 | Warren | Jan 2009 | B2 |
7515937 | Lee | Apr 2009 | B2 |
7571014 | Lambourne et al. | Aug 2009 | B1 |
7747338 | Korhonen et al. | Jun 2010 | B2 |
7784065 | Polivy et al. | Aug 2010 | B2 |
8316308 | Sherman | Nov 2012 | B2 |
8457118 | Bychkov | Jun 2013 | B2 |
8463875 | Katz et al. | Jun 2013 | B2 |
9083846 | Bychkov | Jul 2015 | B2 |
9448814 | Sherman et al. | Sep 2016 | B2 |
9686145 | Sherman et al. | Jun 2017 | B2 |
20010055951 | Slotznick | Dec 2001 | A1 |
20020090980 | Wilcox et al. | Jul 2002 | A1 |
20020151327 | Levitt | Oct 2002 | A1 |
20030008563 | Nishio et al. | Jan 2003 | A1 |
20030107529 | Hayhurst et al. | Jun 2003 | A1 |
20030200001 | Goddard | Oct 2003 | A1 |
20040042601 | Miao | Mar 2004 | A1 |
20040052501 | Tam | Mar 2004 | A1 |
20040156616 | Strub et al. | Aug 2004 | A1 |
20040233930 | Colby | Nov 2004 | A1 |
20050064860 | DeLine | Mar 2005 | A1 |
20050070225 | Lee | Mar 2005 | A1 |
20050091359 | Soin et al. | Apr 2005 | A1 |
20050159184 | Kerner et al. | Jul 2005 | A1 |
20050231392 | Meehan et al. | Oct 2005 | A1 |
20050276570 | Reed et al. | Dec 2005 | A1 |
20050276750 | Ziv et al. | Dec 2005 | A1 |
20060003804 | Liu | Jan 2006 | A1 |
20060026652 | Pulitzer | Feb 2006 | A1 |
20060033809 | Farley | Feb 2006 | A1 |
20060072694 | Dai et al. | Apr 2006 | A1 |
20060075439 | Vance | Apr 2006 | A1 |
20060105722 | Kumar | May 2006 | A1 |
20060123053 | Scannell | Jun 2006 | A1 |
20060130075 | Rhoten et al. | Jun 2006 | A1 |
20060190321 | Martins Nicho et al. | Aug 2006 | A1 |
20060235872 | Kline et al. | Oct 2006 | A1 |
20060241353 | Makino et al. | Oct 2006 | A1 |
20060242590 | Polivy et al. | Oct 2006 | A1 |
20070004450 | Parikh | Jan 2007 | A1 |
20070018957 | Seo | Jan 2007 | A1 |
20070053653 | Huntington | Mar 2007 | A1 |
20070072589 | Clarke | Mar 2007 | A1 |
20070079030 | Okuley et al. | Apr 2007 | A1 |
20070139514 | Marley | Jun 2007 | A1 |
20070161404 | Yasujima et al. | Jul 2007 | A1 |
20070195158 | Kies | Aug 2007 | A1 |
20070211907 | Eo et al. | Sep 2007 | A1 |
20070226734 | Lin | Sep 2007 | A1 |
20070288583 | Rensin et al. | Dec 2007 | A1 |
20080009325 | Zinn et al. | Jan 2008 | A1 |
20080013659 | Kim | Jan 2008 | A1 |
20080013802 | Lee et al. | Jan 2008 | A1 |
20080019522 | Proctor | Jan 2008 | A1 |
20080026794 | Warren | Jan 2008 | A1 |
20080030304 | Doan et al. | Feb 2008 | A1 |
20080037674 | Zurek et al. | Feb 2008 | A1 |
20080040354 | Ray et al. | Feb 2008 | A1 |
20080045140 | Korhonen | Feb 2008 | A1 |
20080056285 | Quinn et al. | Mar 2008 | A1 |
20080120401 | Panabaker et al. | May 2008 | A1 |
20080140886 | Izutsu | Jun 2008 | A1 |
20080152165 | Zacchi | Jun 2008 | A1 |
20080162665 | Kali | Jul 2008 | A1 |
20080168368 | Louch et al. | Jul 2008 | A1 |
20080212649 | Jougit | Sep 2008 | A1 |
20080307315 | Sherman et al. | Dec 2008 | A1 |
20090002191 | Kitaura | Jan 2009 | A1 |
20090010485 | Lamb | Jan 2009 | A1 |
20090158382 | Shaffer et al. | Jun 2009 | A1 |
20090207097 | Sherman et al. | Aug 2009 | A1 |
20090210491 | Thakkar | Aug 2009 | A1 |
20090286570 | Pierce | Nov 2009 | A1 |
20100003921 | Godlewski | Jan 2010 | A1 |
20100041330 | Elg | Feb 2010 | A1 |
20100093401 | Moran et al. | Apr 2010 | A1 |
20100305729 | Glitsch et al. | Dec 2010 | A1 |
20110047247 | Katz et al. | Feb 2011 | A1 |
20110164105 | Lee | Jul 2011 | A1 |
20110208807 | Shaffer | Aug 2011 | A1 |
20110280142 | Bychkov | Nov 2011 | A1 |
20120314777 | Zhang | Dec 2012 | A1 |
20130036366 | Sherman et al. | Feb 2013 | A1 |
20130258038 | Bychkov | Oct 2013 | A1 |
20170235477 | Sherman et al. | Aug 2017 | A1 |
20170315775 | Katz et al. | Nov 2017 | A1 |
Number | Date | Country |
---|---|---|
1871075 | Dec 2007 | EP |
WO-9421058 | Sep 1994 | WO |
WO-0059247 | Oct 2000 | WO |
WO-0186922 | Oct 2001 | WO |
WO-03103174 | Dec 2003 | WO |
WO-2008011230 | Jan 2008 | WO |
Entry |
---|
“Non-Final Office Action”, U.S. Appl. No. 13/101,358, Jan. 9, 2013, 9 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/895,396, Nov. 20, 2014, 11 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/895,396, Dec. 16, 2014, 8 pages. |
“Notice of Allowance”, U.S. Appl. No. 13/101,358, Feb. 19, 2013, 8 pages. |
“Notice of Allowance”, U.S. Appl. No. 13/895,396, Mar. 19, 2015, 10 pages. |
“Advisory Action”, U.S. Appl. No. 12/372,812, Nov. 29, 2012, 3 pages. |
“Final Office Action”, U.S. Appl. No. 12/372,812, Feb. 10, 2015, 14 pages. |
“Final Office Action”, U.S. Appl. No. 12/372,812, Aug. 28, 2012, 14 pages. |
“Final Office Action”, U.S. Appl. No. 12/850,804, Jan. 10, 2013, 13 pages. |
“Non-Final Office Action”, U.S. Appl. No. 12/134,221, Nov. 15, 2011, 8 pages. |
“Non-Final Office Action”, U.S. Appl. No. 12/372,812, Jun. 13, 2014, 11 pages. |
“Non-Final Office Action”, U.S. Appl. No. 12/372,812, Dec. 22, 2011, 9 pages. |
“Non-Final Office Action”, U.S. Appl. No. 12/850,804, Oct. 26, 2012, 12 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/612,879, Oct. 22, 2015, 9 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/887,450, Jul. 8, 2015, 10 pages. |
“Notice of Allowance”, U.S. Appl. No. 12/134,221, Jul. 25, 2012, 7 pages. |
“Notice of Allowance”, U.S. Appl. No. 12/850,804, Feb. 6, 2013, 12 pages. |
“Restriction Requirement”, U.S. Appl. No. 12/134,221, Aug. 2, 2011, 6 pages. |
“Restriction Requirement”, U.S. Appl. No. 12/372,812, Sep. 30, 2011, 9 pages. |
“Restriction Requirement”, U.S. Appl. No. 12/850,804, Oct. 3, 2012, 5 pages. |
“Restriction Requirement”, U.S. Appl. No. 13/612,879, Sep. 9, 2015, 6 pages. |
“Final Office Action”, U.S. Appl. No. 13/887,450, Feb. 25, 2016, 10 pages. |
“Notice of Allowance”, U.S. Appl. No. 12/372,812, Jun. 13, 2016, 5 pages. |
“Final Office Action”, U.S. Appl. No. 13/612,879, Apr. 26, 2016, 9 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/612,879, Sep. 20, 2016, 11 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/887,450, Jan. 13, 2017, 11 pages. |
“Foreign Office Action”, EP Application No. 11783165.1, Jan. 13, 2017, 5 pages. |
“Notice of Allowance”, U.S. Appl. No. 13/612,879, Feb. 10, 2017, 7 pages. |
“Supplementary European Search Report”, EP Application No. 10809633.0, dated Apr. 24, 2017, 7 pages. |
Number | Date | Country | |
---|---|---|---|
20150288922 A1 | Oct 2015 | US |
Number | Date | Country | |
---|---|---|---|
61345318 | May 2010 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13895396 | May 2013 | US |
Child | 14745405 | US | |
Parent | 13101358 | May 2011 | US |
Child | 13895396 | US |