Decentralized system and method for voice and video sessions

Information

  • Patent Grant
  • Patent Number
    9,894,319
  • Date Filed
    Saturday, June 20, 2015
  • Date Issued
    Tuesday, February 13, 2018
  • Inventors
    • Bychkov; Eyal
  • Original Assignees
  • Examiners
    • Nguyen; Joseph J
    • Nguyen; Phung-Hoang J.
  • Agents
    • Colby Nipper
Abstract
A video communication apparatus is described, which includes a receiver for receiving video data from an internet telephony service over a communication channel, a display screen for displaying the video data received by the receiver, a wireless module for communication with a handset, and a processor configured to coordinate display of the received video data on the display screen in synchronization with playing, by the handset, of audio data received by the handset from the internet telephony service. A handset is also described, which includes a receiver to receive audio data from an internet telephony service over a communication channel, an audio output to play the audio data received by the receiver, a wireless module to communicate with a display device, and a processor. The processor can synchronize, using the wireless module, play of the received audio data on the audio output with display, by the display device, of video data received by the display device from the internet telephony service.
Description
FIELD OF THE INVENTION

The present invention is directed to voice and video conversations over IP, and more particularly to a method and system for conducting such communications using a decentralized architecture.


BACKGROUND OF THE INVENTION

There are numerous devices and applications which are based on transferring Voice and/or Video over IP (Internet Protocol). These include VOIP clients installed on a PC or on any other communication device, such as a mobile phone. There are also dedicated Wi-Fi telephones which support VOIP applications; VOIP telephones can be either wired, such as the Cisco CP-7941G, or wireless, such as the Polycom SPECTRALINK® 8002 2200-37010-020 or the Linksys WIP310-G1.


Client devices for making and receiving voice and video calls over the IP network, with the standard functionality of most “original” telephones, are also referred to as “softphones”. Softphones usually allow integration with IP phones and USB phones instead of utilizing a computer's microphone and speakers (or headset). Often a softphone is designed to behave like a traditional telephone, sometimes appearing as an image of a phone with a display panel and buttons with which the user can interact.


A typical application of a softphone is to make calls via an Internet telephony service provider to other softphones or to telephones. Popular Internet telephony service providers include SKYPE®, GOOGLE TALK™, and VONAGE®, which have their own softphones that a user may install on his computer or the like. Most service providers use a communication protocol called SIP (Session Initiation Protocol), whereas SKYPE® has a closed proprietary system. In order to make a voice call over the internet, one needs a computing device with audio input (e.g. a microphone) and output means (e.g. a speaker or headset), Internet connectivity such as DSL, Wi-Fi, cable or LAN, and an account with an Internet telephony service provider or IP PBX.


Such prior art devices establish a single connection to the Internet telephony service provider.





BRIEF DESCRIPTION OF THE DRAWINGS

The present invention will be more fully understood and appreciated from the following detailed description, taken in conjunction with the drawings in which:



FIG. 1 is a simplified block diagram of the VOIP system, in accordance with an embodiment of the present invention;



FIG. 2 is a simplified block diagram of the Display, in accordance with an embodiment of the present invention;



FIG. 3 is a simplified block diagram of the Handset, in accordance with an embodiment of the present invention;



FIG. 4 is a simplified flowchart of a method for conducting a VOIP conversation, in accordance with an embodiment of the present invention; and



FIG. 5 is a simplified block diagram of the VOIP system, in accordance with an embodiment of the present invention.





DETAILED DESCRIPTION

Aspects of the present invention relate generally to Voice and Video over IP. In particular, aspects of the present invention relate to a system and method for communication where there is more than one terminal for conducting and controlling the VoIP sessions.


Reference is now made to FIG. 1, which is a simplified block diagram of the system according to an embodiment of the present invention. The system includes a display 110 (referred to hereinafter as the Display) and a handset 120 (referred to hereinafter as the Handset). The Display may be, inter alia, a PC display, an LCD screen, a Plasma Display Panel, an LED panel, a Digital Picture Frame (DPF) device or any other display or device having a display. The Handset is typically a mobile electronic device, which communicates with the Display via local communication means. The local communication may be, inter alia, BLUETOOTH®, Infra-Red (IR), Wi-Fi or any other Near Field Communication means. Both Display 110 and Handset 120 are capable of communicating with Internet Telephony Service Provider 130 via the internet. Display 110 may include a modem and processing means (such as a PC or laptop computer which includes the display, the CPU and the modem) or may be connected to a separate processing unit and modem 140. Display 110 may also include or be connected to a processing unit which in turn is connected wirelessly to a router and a modem, for connecting to the internet. Handset 120 may be connected wirelessly to a router which is connected to the internet, or may be connected to the internet via any network (3G, GPRS, etc.) using any standard known in the art (e.g. WAP). According to another embodiment of the present invention, Handset 120 may even be connected to the internet via a local connection to Display 110, which is itself connected to the internet in any of the ways described hereinabove.


According to an embodiment of the present invention, a softphone application is installed and operates on both Display 110 and Handset 120, which are synced so as to provide a seamless experience to the user. A typical scenario includes a user having the Handset 120 close to him, possibly in his hands, whereas Display 110 is a few meters away, such as when Display 110 is an LCD display in the living room. Such scenarios would benefit from a decentralized system and method for providing Voice over IP or Voice and Video over IP sessions. According to an embodiment of the present invention, such VoIP sessions are controlled from Handset 120, e.g. initiating calls to contacts, answering calls, ending calls, etc.
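The patent does not prescribe how these control operations are encoded; the following is a minimal sketch, assuming a hypothetical JSON message format and hypothetical action names (INITIATE_CALL, ANSWER_CALL, END_CALL), of how Handset 120 might express the call control described above:

```python
# Minimal sketch of handset-side call control. The message format, action names,
# and field names are illustrative assumptions, not taken from the patent.
import json
import time


def make_control_message(action, session_id=None, callee=None):
    """Build a call-control message Handset 120 could send to internet
    telephony service provider 130 (directly, or relayed via Display 110)."""
    return json.dumps({
        "action": action,          # e.g. "INITIATE_CALL", "ANSWER_CALL", "END_CALL"
        "session_id": session_id,  # None when initiating a new session
        "callee": callee,          # contact identifier when initiating a call
        "timestamp": time.time(),  # also useful for the synchronization discussed later
    })


if __name__ == "__main__":
    # The user initiates a video call to a contact from the Handset ...
    print(make_control_message("INITIATE_CALL", callee="sip:contact@example.com"))
    # ... and later ends the same call from the Handset.
    print(make_control_message("END_CALL", session_id="session-42"))
```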


According to an embodiment of the present invention, both Display 110 and Handset 120 have audio and video capabilities, as shown in Table 1 below. Display 110 includes audio input means (e.g. a built-in microphone, or one connected to the processing unit and modem 140), audio output means (speakers), video input means (a built-in camera, or one connected to the processing unit and modem 140) and video output means (the display). Handset 120 includes audio input means (a microphone), audio output means (an earpiece and/or speakers and/or headsets), and video output means (a display), and according to an embodiment of the present invention may also include video input means (a built-in or attachable camera).












TABLE 1

                Display 110                                  Handset 120

Audio input     Built-in microphone, or a microphone         Built-in microphone
                attached to processing unit and modem 140
Audio output    Speakers                                     Earpiece and/or speakers
                                                             and/or headsets
Video input     Built-in camera, or a camera attached to     Built-in or attachable
                processing unit and modem 140                camera
Video output    Display                                      Display

Reference is now made to FIG. 2, which is a simplified block diagram of Display 110, according to an embodiment of the present invention. Display 110 includes display module 111, which may be based on any display technology, such as LCD, Plasma, LED, OLED, Bi-Stable or any other. For example, for an LCD module, display module 111 typically includes an LCD controller, an LCD driver and LCD glass. Display 110 also includes local wireless module 112 for communicating with Handset 120. Display 110 further includes speaker or speakers 113 and power module 114, which may consist of a power outlet and/or, optionally, battery 117. Display 110 optionally includes processing unit and modem 140, which may alternatively be connected externally to Display 110. Display 110 optionally includes camera module 116, which functions as video input to Display 110. Display 110 may optionally also include a microphone 118, which serves as audio input; the microphone may also be an external component. Display 110 also includes UI (user-interface) means 119 for operating Display 110. UI means 119 may be a keypad, button(s), a touchscreen, or any other UI method known in the art.


Reference is now made to FIG. 3, which is a simplified block diagram of Handset 120, according to an embodiment of the present invention. Handset 120 includes display module 121, which may be based on any display technology used for small mobile devices, such as LCD, LED, OLED, Bi-Stable or any other. Handset 120 also includes local wireless module 122 for communicating with Display 110. Said module 122 may be Bluetooth, IR, Wi-Fi, NFC or any other wireless module. Handset 120 further includes audio output means 123, which may include an earpiece and/or speakers and/or headsets. Handset 120 also includes power module 124, which typically consists of a power outlet and battery 127. Handset 120 optionally includes processing unit and baseband 125, in which case Handset 120 serves as a mobile communication device, such as a mobile cellular phone. If Handset 120 is a mobile cellular phone, it may include further components known in the art, such as an antenna. Handset 120 optionally includes camera module 126, which functions as video input to Handset 120. Handset 120 also includes a microphone 128, which serves as audio input. Handset 120 also includes UI (user-interface) means 129 for operating Handset 120. UI means 129 may be a keypad, button(s), a touchscreen, or any other UI method known in the art.


According to a preferred embodiment of the present invention, since Handset 120 is typically in close vicinity to the user, it is used as the audio input of a conversation. The audio output may be either the audio output means of Display 110 or that of Handset 120; the video output is preferably the display of Display 110, and the video input may be either the video input means of Display 110 or that of Handset 120.


According to another embodiment of the present invention, audio and video input and output means may be used concurrently. For example, both the audio output of Display 110 and that of Handset 120 may play the audio of the active conversation.


Reference is now made to FIG. 4, which is a simplified flowchart of a method for conducting a VOIP conversation, in accordance with an embodiment of the present invention. At step 410, Display 110 and Handset 120 of the user are in close proximity and are paired. Such pairing may be any pairing procedure known in the art, for example Bluetooth pairing. In addition, said pairing may be initiated manually by the user, via Handset 120. At step 420, the user initiates a VoIP session through Handset 120, which is typically carried by hand. The VoIP session may be, for example, a video call. Handset 120 connects to internet telephony service provider 130 to conduct the conversation.
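As an illustration of steps 410 and 420, the sketch below models the pairing of Handset 120 with Display 110 and the opening of a video session with the provider; the class and method names (LocalLink, TelephonyClient, start_video_call) are hypothetical stand-ins, since the patent does not define an API:

```python
# Minimal sketch of steps 410-420. LocalLink stands in for the Bluetooth/Wi-Fi/
# IR/NFC channel between Handset and Display; TelephonyClient stands in for the
# connection to internet telephony service provider 130. All names are assumptions.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class LocalLink:
    paired_device: Optional[str] = None

    def pair(self, device_id: str) -> None:
        # Step 410: Handset 120 and Display 110 are in close proximity and pair,
        # e.g. via a Bluetooth pairing procedure initiated from the Handset.
        self.paired_device = device_id


@dataclass
class TelephonyClient:
    active_sessions: dict = field(default_factory=dict)

    def start_video_call(self, caller: str, callee: str) -> str:
        # Step 420: the user initiates a VoIP (video) session through the Handset.
        session_id = f"{caller}->{callee}"
        self.active_sessions[session_id] = {"caller": caller, "callee": callee}
        return session_id


if __name__ == "__main__":
    link = LocalLink()
    link.pair("display-110")
    provider = TelephonyClient()
    session = provider.start_video_call("handset-120", "sip:friend@example.com")
    print("paired with:", link.paired_device, "| session:", session)
```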


Step 430 occurs concurrently with, or immediately follows, step 420. At step 430, a second connection is established between internet telephony service provider 130 and Display 110. According to an embodiment of the present invention, said second connection is initiated by Display 110, which contacts internet telephony service provider 130 and signals that it should join the initiated or ongoing session between internet telephony service provider 130 and Handset 120. Display 110 may transfer any detail about the Handset and/or the session so that internet telephony service provider 130 identifies the session and can transfer the session, or parts of it (e.g. the video), to Display 110. According to another embodiment of the present invention, Handset 120 signals to internet telephony service provider 130 that it is paired with Display 110, and therefore the initiated VoIP session should be shared with Display 110.
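The two signaling variants of step 430 could, for instance, be expressed as the two request shapes sketched below; the request types and field names are purely illustrative assumptions, since the patent describes the variants only in prose:

```python
# Minimal sketch of step 430. The request "types" and field names are
# illustrative assumptions added here for clarity.

def display_join_request(handset_id, session_hint=None):
    """Variant 1: Display 110 contacts the provider with details about the
    paired Handset so the provider can identify the ongoing session and
    route part of it (e.g. the video) to the Display."""
    return {
        "type": "JOIN_EXISTING_SESSION",
        "requesting_device": "display-110",
        "paired_handset": handset_id,
        "session_hint": session_hint,   # e.g. a session id shared over the local link
    }


def handset_share_request(display_id, session_id):
    """Variant 2: Handset 120 signals the provider that it is paired with
    Display 110, so the initiated VoIP session should be shared with it."""
    return {
        "type": "SHARE_SESSION_WITH_DEVICE",
        "session_id": session_id,
        "share_with": display_id,
        "media": ["video"],             # the Display typically takes the video leg
    }


if __name__ == "__main__":
    print(display_join_request("handset-120", session_hint="session-42"))
    print(handset_share_request("display-110", "session-42"))
```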


Said second connection between Display 110 and internet telephony service provider 130 is typically established for video purposes; according to an embodiment of the present invention, the video of said VoIP session is transferred between internet telephony service provider 130 and Display 110, and the audio is transferred between internet telephony service provider 130 and both Handset 120 and Display 110.


Display 110 typically has a larger display than Handset 120, more convenient for user viewing, and possibly better speakers; in addition, the connection between Display 110 and internet telephony service provider 130 is of higher bandwidth than the connection between Handset 120 and internet telephony service provider 130. Therefore, at step 440, the video part of the VoIP session, which is the more bandwidth-consuming, is transferred between Display 110 and internet telephony service provider 130, and the audio part is transferred between internet telephony service provider 130 and both Display 110 and Handset 120. Handset 120 is especially used for the audio input of the VoIP session, whereas the audio output may be played through the audio output means of Handset 120, Display 110, or both. At step 450, the user terminates the VoIP session via Handset 120.
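Under the assumption of illustrative bandwidth figures (the patent only states that the Display link is typically of higher bandwidth), the media split of step 440 could be sketched as follows:

```python
# Minimal sketch of the step-440 media split. The fallback logic and the numeric
# bandwidth values are assumptions added for illustration only.

def route_media(display_bandwidth_kbps, handset_bandwidth_kbps):
    """Decide which device(s) carry each component of the video call."""
    routes = {
        # Video is the bandwidth-hungry part, so it rides the Display connection.
        "video_out": ["display-110"],
        # Audio is sent to both devices, so it can be played on either or both.
        "audio_out": ["display-110", "handset-120"],
        # The user's voice is captured on the Handset, which is closest to the user.
        "audio_in": ["handset-120"],
    }
    if handset_bandwidth_kbps > display_bandwidth_kbps:
        # Not the typical case described here, but a router could fall back to
        # carrying the video over the Handset link instead.
        routes["video_out"] = ["handset-120"]
    return routes


if __name__ == "__main__":
    print(route_media(display_bandwidth_kbps=5000, handset_bandwidth_kbps=500))
```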


According to an aspect of the current invention, even though the VoIP session is divided between two connections, the session is seamless and the user carries out the conversation as if it were a single connection. Accordingly, there is synchronization between Display 110 and Handset 120. Said synchronization is used to diminish latencies between audio and video, and between Display 110 and Handset 120. The video received and displayed on Display 110 is synchronized with the audio received and played on either speaker/s 113 of Display 110 or audio output means 123 of Handset 120. Synchronization between Display 110 and Handset 120 is performed either via internet telephony service provider 130 or directly. In a first embodiment of the present invention, a time stamp is sent from both Display 110 and Handset 120 to internet telephony service provider 130. Internet telephony service provider 130 in turn alters the transmission time of data, audio and/or video, to Display 110 and Handset 120. According to a second embodiment of the present invention, Display 110 and Handset 120 may have buffers which allow them to synchronize with each other, by delaying the playing of received data or the sending of data to internet telephony service provider 130.
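As a concrete illustration of the second (buffer-based) embodiment, the sketch below computes how long the earlier-arriving stream should be held in its device's buffer, assuming the two devices can exchange arrival timestamps over the local link; the function name and the millisecond figures are assumptions:

```python
# Minimal sketch of buffer-based synchronization between Display 110 and
# Handset 120. Assumes both devices timestamp media belonging to the same
# instant of the call; names and numbers are illustrative only.

def compute_playout_delays(video_arrival_ms, audio_arrival_ms):
    """Delay whichever stream arrived earlier, so that video on Display 110 and
    audio on Handset 120 (or Display 110) are played out at the same time."""
    skew = video_arrival_ms - audio_arrival_ms
    if skew > 0:
        # Video is late: hold the audio in the Handset's buffer.
        return {"delay_video_ms": 0, "delay_audio_ms": skew}
    # Audio is late (or the streams are aligned): hold the video in the Display's buffer.
    return {"delay_video_ms": -skew, "delay_audio_ms": 0}


if __name__ == "__main__":
    # The video for a given timestamp arrived 80 ms after the matching audio,
    # so the Handset buffers its audio by 80 ms before playing it.
    print(compute_playout_delays(video_arrival_ms=1080, audio_arrival_ms=1000))
```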


According to an embodiment of the present invention, Handset 120 is also used to control other elements of Display 110, using the local communication channel which is established by local wireless modules 112 and 122. For example, Handset 120 may act as a remote control for Display 110.


According to yet another embodiment of the present invention, the system includes multiple Displays, similar to Display 110. Handset 120 communicates with all the Displays and may control a session which is divided between Handset 120 and a first Display 110; upon manual selection by the user, or automatically, the session may be handed over to be divided between Handset 120 and a second Display 110. Hand-over between the first Display 110 and the second Display 110 may be initiated manually by the user, who may be able to see which other Displays are in his vicinity. In another embodiment of the present invention, the session may be handed over automatically, possibly when the user moves from a first location to a second location, and/or when the local communication between Handset 120 and said second Display 110 is of better quality than that between Handset 120 and said first Display 110. Such handover mechanisms are described in the art, for example in U.S. Pat. No. 6,834,192 to Watanabe et al., which describes handover of communications in a Bluetooth, or other, radio communication system, or U.S. patent application Ser. No. 11/680,911 to Jougit, describing a method and system for a distributed Bluetooth host architecture.
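An automatic hand-over decision of the kind described above might, as a rough sketch, compare local-link quality between the Handset and each Display; the quality scores and the hysteresis margin below are assumptions introduced for illustration and are not part of the patent:

```python
# Minimal sketch of an automatic Display hand-over decision. Quality scores,
# the hysteresis margin, and device names are illustrative assumptions.

def pick_display(current_display, link_quality, hysteresis=10):
    """Return the Display that should carry the video leg of the session.

    link_quality maps display id -> link quality score (higher is better),
    e.g. derived from the local channel between Handset 120 and each Display.
    A hysteresis margin avoids bouncing between two Displays of similar quality.
    """
    best = max(link_quality, key=link_quality.get)
    if best != current_display and \
            link_quality[best] > link_quality.get(current_display, 0) + hysteresis:
        return best          # hand the session over to the better Display
    return current_display   # otherwise keep the session where it is


if __name__ == "__main__":
    # The user walks away from first Display 510 toward second Display 520;
    # once the 520 link is clearly better, the session is handed over.
    print(pick_display("display-510", {"display-510": 40, "display-520": 75}))
```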


Reference is now made to FIG. 5, which is a simplified block diagram of the system according to an embodiment of the present invention. Handset 120 may be paired with first Display 510 or second Display 520, each of which may be a Display 110 as described above.

Claims
  • 1. A video communication apparatus, comprising: a receiver to receive video data for a video call from an internet telephony service over a first communication channel, the video call being divided into the received video data and audio data, the received video data comprising a video component of the video call and the audio data comprising an audio component of the video call;a display screen to display the received video data;a wireless module for communication with a mobile electronic device, the mobile electronic device being separate from the receiver and configured to receive the audio data of the video call from the internet telephony service over a second communication channel;a buffer; anda processor to synchronize, using the wireless module, display of the received video data on the display screen with play, by the mobile electronic device, of the received audio data, the received video data and the received audio data for a same video call, the processor configured to transmit information about the mobile electronic device to the internet telephony service in order for the internet telephony service to identify the same video call and enable the video communication apparatus and the mobile electronic device to join the same video call, the processor configured to delay the display of the received video data using the buffer or delay the play of the received audio data at the mobile electronic device.
  • 2. The apparatus of claim 1 further comprising an audio output component, and wherein the processor is further configured to also receive the audio data over the first communication channel and to play the received audio data on the audio output component.
  • 3. The apparatus of claim 1, wherein the wireless module communicates with the mobile electronic device via a local communication channel, and wherein the local communication channel comprises WiFi, Bluetooth, infrared, or near field communication.
  • 4. The apparatus of claim 3, wherein the mobile electronic device is further configured to control elements of the video communication apparatus via the local communication channel.
  • 5. The apparatus of claim 1, further comprising an audio input means, wherein the audio input means is configured to receive another audio data, the other audio data being returned to the internet telephony service as part of the same video call.
  • 6. The apparatus of claim 1, further comprising a video input means, wherein the video input means is configured to receive another video data, the other video data being returned to the internet telephony service as part of the same video call.
  • 7. The apparatus of claim 1, wherein the second communication channel comprises a cellular telephone network.
  • 8. The apparatus of claim 1 further comprising an additional display screen, wherein the processor is further configured to hand-over the video data from the display screen to the additional display screen.
  • 9. The apparatus of claim 8, wherein the hand-over of the video data from the display screen to the additional display screen coincides with a move of the mobile electronic device from a first location to a second location.
  • 10. The apparatus of claim 1, wherein the first communication channel is of a higher band width than the second communication channel.
  • 11. A mobile electronic device, comprising: a receiver to receive audio data for a video call from an internet telephony service over a first communication channel, the video call being divided into video data and the received audio data, the video data comprising a video component of the video call and the received audio data comprising an audio component of the video call;an audio output to play the received audio data;a wireless module for communication with a display device, the display device being separate from the receiver and configured to receive the video data of the video call from the internet telephony service over a second communication channel;a buffer; anda processor to synchronize, using the wireless module, play of the received audio data on the audio output with display, by the display device, of the received video data, the received video data and the received audio data for a same video call, the processor configured to transmit information about the display device to the internet telephony service in order for the internet telephony service to identify the same video call and enable the mobile electronic device and the display device to join the same video call, the processor configured to delay play of the received audio data using the buffer or delay the display of the received video data at the display device.
  • 12. The mobile electronic device of claim 11, wherein the wireless module communicates with the display device via a local communication channel, and wherein the local communication channel comprises WiFi, Bluetooth, infrared, or near field communication.
  • 13. The mobile electronic device of claim 12, wherein the mobile electronic device is further configured to communicate with the internet telephony service via the local communication channel and the display device.
  • 14. The mobile electronic device of claim 11, further comprising a display, and wherein the processor is further configured to also receive the video data over the first communication channel and to display the received video data on the display.
  • 15. The mobile electronic device of claim 11, further comprising an audio input means, wherein the audio input means is configured to receive another audio data, the other audio data being returned to the internet telephony service as part of the same video call.
  • 16. The mobile electronic device of claim 11, further comprising a video input means, wherein the video input means is configured to receive another video data, the other video data being returned to the internet telephony service as part of the same video call.
  • 17. The mobile electronic device of claim 11, wherein the first communication channel comprises a cellular telephone network.
  • 18. The mobile electronic device of claim 11, further comprising an additional display device, wherein the processor is further configured to hand-over the video data from the display device to the additional display device.
  • 19. The mobile electronic device of claim 18, wherein the hand-over of the video data from the display device to the additional display device coincides with a move of the mobile electronic device from a first location to a second location.
  • 20. The mobile electronic device of claim 11, wherein the mobile electronic device is further configured to initiate the video call.
CROSS REFERENCES TO RELATED APPLICATIONS

This application claims benefit to, and is a continuation of, U.S. patent application Ser. No. 13/895,396, entitled DECENTRALIZED SYSTEM AND METHOD FOR VOICE AND VIDEO SESSIONS, filed on May 16, 2013 by inventor Eyal Bychkov. U.S. patent application Ser. No. 13/895,396 is a continuation of U.S. patent application Ser. No. 13/101,358, now U.S. Pat. No. 8,457,118, entitled DECENTRALIZED SYSTEM AND METHOD FOR VOICE AND VIDEO SESSIONS, filed on May 5, 2011 by inventor Eyal Bychkov. U.S. patent application Ser. No. 13/101,358 is a non-provisional of U.S. Provisional Application No. 61/345,318, entitled DECENTRALIZED SYSTEM AND METHOD FOR VOICE AND VIDEO SESSIONS, filed on May 17, 2010 by inventor Eyal Bychkov.

US Referenced Citations (112)
Number Name Date Kind
5625673 Grewe et al. Apr 1997 A
5628055 Stein May 1997 A
5809115 Inkinen Sep 1998 A
5893037 Reele et al. Apr 1999 A
5907815 Grimm et al. May 1999 A
6188917 Laureanti Feb 2001 B1
6201867 Koike Mar 2001 B1
6243578 Koike Jun 2001 B1
6285823 Saeki et al. Sep 2001 B1
6300947 Kanevsky Oct 2001 B1
6477357 Cook Nov 2002 B1
6516202 Hawkins et al. Feb 2003 B1
6640113 Shim et al. Oct 2003 B1
6690947 Tom Feb 2004 B1
6760415 Beecroft Jul 2004 B2
6834192 Watanabe Dec 2004 B1
6898283 Wycherley et al. May 2005 B2
6907264 Sterkel Jun 2005 B1
6999792 Warren Feb 2006 B2
7020704 Lipscomb et al. Mar 2006 B1
7085542 Dietrich et al. Aug 2006 B2
7194285 Tom Mar 2007 B2
7194752 Kenyon et al. Mar 2007 B1
7266391 Warren Sep 2007 B2
7275244 Charles et al. Sep 2007 B1
7477919 Warren Jan 2009 B2
7515937 Lee Apr 2009 B2
7571014 Lambourne et al. Aug 2009 B1
7747338 Korhonen et al. Jun 2010 B2
7784065 Polivy et al. Aug 2010 B2
8316308 Sherman Nov 2012 B2
8457118 Bychkov Jun 2013 B2
8463875 Katz et al. Jun 2013 B2
9083846 Bychkov Jul 2015 B2
9448814 Sherman et al. Sep 2016 B2
9686145 Sherman et al. Jun 2017 B2
20010055951 Slotznick Dec 2001 A1
20020090980 Wilcox et al. Jul 2002 A1
20020151327 Levitt Oct 2002 A1
20030008563 Nishio et al. Jan 2003 A1
20030107529 Hayhurst et al. Jun 2003 A1
20030200001 Goddard Oct 2003 A1
20040042601 Miao Mar 2004 A1
20040052501 Tam Mar 2004 A1
20040156616 Strub et al. Aug 2004 A1
20040233930 Colby Nov 2004 A1
20050064860 DeLine Mar 2005 A1
20050070225 Lee Mar 2005 A1
20050091359 Soin et al. Apr 2005 A1
20050159184 Kerner et al. Jul 2005 A1
20050231392 Meehan et al. Oct 2005 A1
20050276570 Reed et al. Dec 2005 A1
20050276750 Ziv et al. Dec 2005 A1
20060003804 Liu Jan 2006 A1
20060026652 Pulitzer Feb 2006 A1
20060033809 Farley Feb 2006 A1
20060072694 Dai et al. Apr 2006 A1
20060075439 Vance Apr 2006 A1
20060105722 Kumar May 2006 A1
20060123053 Scannell Jun 2006 A1
20060130075 Rhoten et al. Jun 2006 A1
20060190321 Martins Nicho et al. Aug 2006 A1
20060235872 Kline et al. Oct 2006 A1
20060241353 Makino et al. Oct 2006 A1
20060242590 Polivy et al. Oct 2006 A1
20070004450 Parikh Jan 2007 A1
20070018957 Seo Jan 2007 A1
20070053653 Huntington Mar 2007 A1
20070072589 Clarke Mar 2007 A1
20070079030 Okuley et al. Apr 2007 A1
20070139514 Marley Jun 2007 A1
20070161404 Yasujima et al. Jul 2007 A1
20070195158 Kies Aug 2007 A1
20070211907 Eo et al. Sep 2007 A1
20070226734 Lin Sep 2007 A1
20070288583 Rensin et al. Dec 2007 A1
20080009325 Zinn et al. Jan 2008 A1
20080013659 Kim Jan 2008 A1
20080013802 Lee et al. Jan 2008 A1
20080019522 Proctor Jan 2008 A1
20080026794 Warren Jan 2008 A1
20080030304 Doan et al. Feb 2008 A1
20080037674 Zurek et al. Feb 2008 A1
20080040354 Ray et al. Feb 2008 A1
20080045140 Korhonen Feb 2008 A1
20080056285 Quinn et al. Mar 2008 A1
20080120401 Panabaker et al. May 2008 A1
20080140886 Izutsu Jun 2008 A1
20080152165 Zacchi Jun 2008 A1
20080162665 Kali Jul 2008 A1
20080168368 Louch et al. Jul 2008 A1
20080212649 Jougit Sep 2008 A1
20080307315 Sherman et al. Dec 2008 A1
20090002191 Kitaura Jan 2009 A1
20090010485 Lamb Jan 2009 A1
20090158382 Shaffer et al. Jun 2009 A1
20090207097 Sherman et al. Aug 2009 A1
20090210491 Thakkar Aug 2009 A1
20090286570 Pierce Nov 2009 A1
20100003921 Godlewski Jan 2010 A1
20100041330 Elg Feb 2010 A1
20100093401 Moran et al. Apr 2010 A1
20100305729 Glitsch et al. Dec 2010 A1
20110047247 Katz et al. Feb 2011 A1
20110164105 Lee Jul 2011 A1
20110208807 Shaffer Aug 2011 A1
20110280142 Bychkov Nov 2011 A1
20120314777 Zhang Dec 2012 A1
20130036366 Sherman et al. Feb 2013 A1
20130258038 Bychkov Oct 2013 A1
20170235477 Sherman et al. Aug 2017 A1
20170315775 Katz et al. Nov 2017 A1
Foreign Referenced Citations (6)
Number Date Country
1871075 Dec 2007 EP
WO-9421058 Sep 1994 WO
WO-0059247 Oct 2000 WO
WO-0186922 Oct 2001 WO
WO-03103174 Dec 2003 WO
WO-2008011230 Jan 2008 WO
Non-Patent Literature Citations (29)
Entry
“Non-Final Office Action”, U.S. Appl. No. 13/101,358, Jan. 9, 2013, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/895,396, Nov. 20, 2014, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/895,396, Dec. 16, 2014, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 13/101,358, Feb. 19, 2013, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 13/895,396, Mar. 19, 2015, 10 pages.
“Advisory Action”, U.S. Appl. No. 12/372,812, Nov. 29, 2012, 3 pages.
“Final Office Action”, U.S. Appl. No. 12/372,812, Feb. 10, 2015, 14 pages.
“Final Office Action”, U.S. Appl. No. 12/372,812, Aug. 28, 2012, 14 pages.
“Final Office Action”, U.S. Appl. No. 12/850,804, Jan. 10, 2013, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/134,221, Nov. 15, 2011, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/372,812, Jun. 13, 2014, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/372,812, Dec. 22, 2011, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/850,804, Oct. 26, 2012, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/612,879, Oct. 22, 2015, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/887,450, Jul. 8, 2015, 10 pages.
“Notice of Allowance”, U.S. Appl. No. 12/134,221, Jul. 25, 2012, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 12/850,804, Feb. 6, 2013, 12 pages.
“Restriction Requirement”, U.S. Appl. No. 12/134,221, Aug. 2, 2011, 6 pages.
“Restriction Requirement”, U.S. Appl. No. 12/372,812, Sep. 30, 2011, 9 pages.
“Restriction Requirement”, U.S. Appl. No. 12/850,804, Oct. 3, 2012, 5 pages.
“Restriction Requirement”, U.S. Appl. No. 13/612,879, Sep. 9, 2015, 6 pages.
“Final Office Action”, U.S. Appl. No. 13/887,450, Feb. 25, 2016, 10 pages.
“Notice of Allowance”, U.S. Appl. No. 12/372,812, Jun. 13, 2016, 5 pages.
“Final Office Action”, U.S. Appl. No. 13/612,879, Apr. 26, 2016, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/612,879, Sep. 20, 2016, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/887,450, Jan. 13, 2017, 11 pages.
“Foreign Office Action”, EP Application No. 11783165.1, Jan. 13, 2017, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/612,879, Feb. 10, 2017, 7 pages.
“Supplementary European Search Report”, EP Application No. 10809633.0, dated Apr. 24, 2017, 7 pages.
Related Publications (1)
Number Date Country
20150288922 A1 Oct 2015 US
Provisional Applications (1)
Number Date Country
61345318 May 2010 US
Continuations (2)
Number Date Country
Parent 13895396 May 2013 US
Child 14745405 US
Parent 13101358 May 2011 US
Child 13895396 US