Embodiments of the disclosure relate generally to text captioning and more specifically to correction of errors within a text caption.
Modern telecommunication services provide features to assist those who are deaf or hearing-impaired. One such feature is a telecommunication device for the deaf (TDD). Hearing-capable users communicate with hearing-impaired users who are users of TDD devices through so-called “relays.” A relay is a telecommunication intermediary service that is intended to permit a deaf or hearing-impaired person to utilize a normal telephone network. A relay service may include an operator, referred to as a “call assistant,” who serves as a human intermediary between a hearing user and a hearing-impaired user. The call assistant communicates with the hearing-impaired user using a TDD and communicates with the hearing user by voice over a standard telephone line.
A text captioned telephone system employs a relay service in a mode where the relay transmits both the voice of the hearing user and a text stream of the words spoken by the hearing user. A hearing-impaired user using a text captioned telephone, or telephone enabled to do text enhanced telephony, may carry on a normal telephone conversation with a hearing user while a text transcription of the words spoken by the hearing user is displayed on the text captioned telephone. The text transcription may allow the hearing-impaired user to confirm his or her understanding of the words spoken by the hearing user.
More specifically, during a communication session, a call assistant may listen to the voice signal of a hearing user and “revoice” the words to a speech recognition computer program tuned to that call assistant's voice. A text transcription output from the speech recognition computer is then transmitted to the text captioned telephone being used by the hearing-impaired user. Even with revoicing provided by a trained call assistant, the text transcription received by the hearing-impaired user may include errors. Therefore, correction of the errors within the text transcription may be required.
According to various conventional methods, errors within a text caption are corrected by either backspacing an error in a text caption and displaying corrected text or providing a corrected portion (e.g., a word or a sentence) at the end of a previously provided text caption. Although backspacing an error in a text caption and displaying corrected text may provide a hearing-impaired user with a context for the correction, this method is distracting to a hearing-impaired user and interrupts the continuity of a conversation between a hearing-impaired user and a hearing user. Further, providing a corrected portion at the end of a previously provided text caption not only interrupts the continuity of a conversation but also fails to provide context of the correction to the hearing-impaired user. Therefore, a hearing-impaired user must determine where the corrected text should be inserted into the previously provided text caption.
A need exists to improve text correction of a text captioning system. Specifically, there is a need for methods of providing text caption correction while providing a user with context of a correction and without distracting the user or interrupting the continuity of a conversation between a hearing-impaired user and a hearing user.
In one embodiment of the disclosure, a method of providing error correction in a text caption is disclosed. The method may comprise displaying a text caption including one or more blocks of text on each of a first device and a second device remote from the first device. The method may also include generating another block of text and replacing a block of text of the text caption with the another block of text. Furthermore, the method may include displaying the text caption on the second device having the block of text of the text caption replaced by the another block of text.
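The method above treats a text caption as an ordered sequence of text blocks, so a correction replaces one block in place rather than appending text at the end. The following is a minimal, hypothetical sketch of that data structure (the class and method names are illustrative and do not appear in the disclosure):

```python
# Illustrative sketch only: a text caption modeled as an ordered list of
# text blocks, where a correction replaces one block in place so that the
# corrected text keeps its original position (and therefore its context).

class TextCaption:
    def __init__(self):
        self.blocks = []  # ordered blocks of text

    def append_block(self, text):
        """Add a new block and return an index identifying it."""
        self.blocks.append(text)
        return len(self.blocks) - 1

    def replace_block(self, index, corrected_text):
        """In-line correction: overwrite the erroneous block in place."""
        self.blocks[index] = corrected_text

    def render(self):
        """Join the blocks into the caption as it would be displayed."""
        return " ".join(self.blocks)


caption = TextCaption()
caption.append_block("hello there")
idx = caption.append_block("the whether is nice")
caption.replace_block(idx, "the weather is nice")
print(caption.render())  # → hello there the weather is nice
```

Because the replacement happens at the block's original index, the reader sees the correction in context instead of hunting for where an appended correction belongs.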
In another embodiment of the disclosure, a communication system is provided. The communication system may include a plurality of devices, wherein each device of the plurality includes a processor and a computer-readable medium coupled to the processor. The communication system may further include a plurality of application programs, wherein each application program is stored in an associated computer-readable medium. When executed by the processor, one or more application programs are configured to display a text caption including one or more blocks of text on a display device of each of a first device of the plurality of devices and a second device of the plurality of devices. One or more application programs may be configured to generate another block of text and replace at least one block of text of the text caption with the another block of text. Furthermore, one or more application programs may be configured to display the corrected text caption on the display device of the second device, wherein the at least one block of text of the text caption is replaced by the another block of text.
Another embodiment of the disclosure may include a computer-readable storage medium storing instructions that, when executed by a processor, cause the processor to perform operations in accordance with one or more embodiments of the disclosure.
In the following detailed description, reference is made to the accompanying drawings, which form a part hereof and in which are shown, by way of illustration, specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those of ordinary skill in the art to practice the invention, and it is to be understood that other embodiments may be utilized, and that structural, logical, and electrical changes may be made within the scope of the disclosure.
In this description, functions may be shown in block diagram form in order not to obscure the disclosure in unnecessary detail. Furthermore, specific implementations shown and described are only examples and should not be construed as the only way to implement the disclosure unless specified otherwise herein. Block definitions and partitioning of logic between various blocks represent a specific implementation. It will be readily apparent to one of ordinary skill in the art that the various embodiments of the disclosure may be practiced by numerous other partitioning solutions. For the most part, details concerning timing considerations, and the like, have been omitted where such details are not necessary to obtain a complete understanding of the disclosure in its various embodiments and are within the abilities of persons of ordinary skill in the relevant art.
Referring in general to the following description and accompanying drawings, various aspects of the disclosure are illustrated to show its structure and method of operation. Common elements of the illustrated embodiments are designated with like numerals. It should be understood that the figures presented are not meant to be illustrative of actual views of any particular portion of the actual structure or method, but are merely idealized representations employed to more clearly and fully depict the disclosure.
When executed as firmware or software, the instructions for performing the methods and processes described herein may be stored on a computer-readable medium. A computer-readable medium includes, but is not limited to, magnetic and optical storage devices such as disk drives, magnetic tape, CDs (compact disks), DVDs (digital versatile discs or digital video discs), and semiconductor devices such as RAM, DRAM, ROM, EPROM, and Flash memory.
As described more fully below, relay service 110 may be configured to provide interpretive services to hearing-impaired user 140. More specifically, a human “call assistant” within relay service 110 may be employed to facilitate a communication session between a hearing-impaired user 140 and a hearing-capable user 160. By way of example only, communication device 190 may comprise a conventional voice phone. As such, hearing-capable user 160 may interact in a conventional manner with communication device 120 through the use of a voice-based dialogue conveyed over communication device 190. The voice of hearing-capable user 160 may be conveyed over communication device 190 and may be transmitted over network 180 to communication device 120. Furthermore, the voice conveyed over communication device 190 may be transmitted through communication device 120, over network 170, and to relay service 110.
By way of example, communication device 120 may include a captioned telephone, a telephone enabled for text enhanced telephony, or any other suitable communication device configured to receive and display text. Hearing-impaired user 140 may interact in a conventional manner with communication device 190 through the use of a voice-based dialogue conveyed over communication device 120. Furthermore, as described more fully below, communication device 120 may be configured to receive and display a text transcription of a voice signal sent from relay service 110 via network 170.
In various embodiments of the disclosure, instructions implementing an “application program” may be tangibly embodied in a computer-readable medium which may include one or more fixed or removable data storage devices, such as a zip drive, floppy disc drive, hard drive, CD-ROM drive, tape drive, flash memory device, etc. Further, an application program may include instructions that, when read and executed by a processor, may cause the processor to perform the steps necessary to implement and/or use embodiments of the disclosure. An application program and/or operating instructions may also be tangibly embodied in a memory and/or data communications devices, thereby making a computer program product or article of manufacture according to an embodiment of the disclosure. As such, the term “application program” as used herein is intended to encompass a computer program accessible from any computer-readable device or medium. Furthermore, portions of an application program may be distributed such that some of the application program may be included on a computer-readable medium within a processor-based device (e.g., device 102 or device 152) and some of the application program may be included in a removable data storage device.
Communication device 120 may include a display device 134 and a processor-based device 152 comprising a processor 154 and a memory 156, such as random access memory (RAM). Device 152 may also implement a compiler (not shown) that allows an application program 118 to be translated into processor 154 readable code. Application program 118 may be configured to access and manipulate data stored in memory 156 of device 152 using relationships and logic that are generated using the compiler.
During a contemplated operation of communication system 100, hearing-capable user 160 (see
Hearing-impaired user 140 (see
Furthermore, it should be noted that a block of text, as output from the speech recognition program and as transmitted to and displayed within a text caption on display device 134, may also be displayed within a text caption displayed on display device 132. As a result, the call assistant may view the text caption, including one or more blocks of text, as displayed on display device 134 and as viewed by hearing-impaired user 140. Therefore, any errors that may exist in the text caption displayed on display device 134 may also be visible to the call assistant on display device 132.
In accordance with various embodiments of the disclosure, application program 108 may be configured to provide in-line correction of any errors that may be detected within a text caption displayed on display device 134. Stated another way, application program 108 may be configured to replace any detected errors within a displayed text caption with corrected text. More specifically, in the event the call assistant notices one or more errors within a block of text of the text caption as displayed on each of display device 134 and display device 132, the call assistant may edit the block of text including the one or more errors through input into device 102 to correct the one or more errors. For example only, the call assistant may edit a block of text through input into a keyboard (e.g., selecting and replacing text, inserting text, and/or deleting text). As a result, a corrected block of text including one or more corrected words may be generated. Thereafter, the corrected block of text including the one or more corrected words may be sent to communication device 120. Upon receipt of the corrected block of text at communication device 120, application program 118 may be configured to replace the block of text including the one or more errors with the associated corrected block of text. Furthermore, substantially simultaneously upon replacing the block of text including the one or more errors with the associated corrected block of text, application program 118 may be configured to display the corrected text caption on display device 134.
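The exchange described above can be thought of as two message types flowing from the call assistant's device to the captioned telephone: an ordinary caption block, and a corrected block that carries the identifier of the block it replaces. The sketch below is purely illustrative (the message format and function names are assumptions, not part of the disclosure):

```python
# Illustrative sketch only: the call assistant's device (device 102 /
# application program 108) sends a corrected block tagged with the
# identifier of the erroneous block; the captioned telephone (device 120 /
# application program 118) swaps the corrected block in and redisplays.

def make_caption_message(block_id, text):
    """A new block of text as output from speech recognition."""
    return {"type": "caption", "id": block_id, "text": text}


def make_correction_message(block_id, corrected_text):
    """A corrected block targeting a previously sent block."""
    return {"type": "correction", "id": block_id, "text": corrected_text}


def apply_message(display_blocks, message):
    """Update the receiver's displayed blocks (a dict of id -> text).

    A correction reuses the identifier of the original block, so the
    corrected text lands exactly where the error was displayed.
    """
    display_blocks[message["id"]] = message["text"]
    return display_blocks


display = {}
apply_message(display, make_caption_message(1, "the whether is nice"))
apply_message(display, make_correction_message(1, "the weather is nice"))
print(display[1])  # → the weather is nice
```

Note that a caption message and a correction message are applied identically on the receiver; the distinction matters only for presentation (e.g., highlighting), which is addressed next.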
Furthermore, in order to make a correction more obvious to a hearing-impaired user reading the text caption, application program 108 may further be configured to identify each corrected word within the text caption with an identifier (e.g., a mark or a tag). By way of example and not by way of limitation, application program 108 may be configured to identify each corrected word by highlighting each corrected word displayed in a text caption on display device 134.
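One simple way to identify corrected words, sketched below, is to compare the original and corrected blocks word by word and tag each word that differs. This is an assumption about how such an identifier might be computed; the disclosure does not specify a comparison algorithm, and the `**` marker simply stands in for whatever highlighting the display device applies:

```python
# Illustrative sketch only: tag each word that changed between the
# original block and the corrected block so the display can highlight it.

def mark_corrected_words(original, corrected, tag="**"):
    orig_words = original.split()
    new_words = corrected.split()
    marked = []
    for i, word in enumerate(new_words):
        if i >= len(orig_words) or word != orig_words[i]:
            # this word was corrected (or inserted): wrap it in the marker
            marked.append(f"{tag}{word}{tag}")
        else:
            marked.append(word)
    return " ".join(marked)


print(mark_corrected_words("the whether is nice", "the weather is nice"))
# → the **weather** is nice
```

A positional comparison like this is the simplest choice; a real implementation might instead use a word-level diff so that an inserted or deleted word does not cause every following word to be flagged.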
With reference to
It should be noted that a block of text including one or more corrected words may be sent at any suitable time after the call assistant has identified one or more errors and made associated corrections. For example, the call assistant may notice one or more errors in the most recently sent block of text, correct the one or more errors, and transmit a corrected block of text including one or more corrected words prior to sending another block of text. On the other hand, the call assistant may not notice one or more errors in a block of text until after one or more blocks have subsequently been sent. In any event, any block of text displayed on a display device 134 (see
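Because each displayed block carries an identifier, a correction can arrive several blocks late and still replace the right block, as the following hypothetical sketch shows (the block numbering scheme is an assumption for illustration):

```python
# Illustrative sketch only: blocks keyed by identifier, so a correction
# that arrives after later blocks have been sent still replaces the
# block it targets, wherever that block sits in the caption.

blocks = {}  # block id -> displayed text

def receive(block_id, text):
    """Store (or overwrite) the block with the given identifier."""
    blocks[block_id] = text


receive(1, "good morning")
receive(2, "the male has arrived")   # error: "male" should be "mail"
receive(3, "see you at noon")
receive(2, "the mail has arrived")   # late correction still lands in place

print([blocks[i] for i in sorted(blocks)])
# → ['good morning', 'the mail has arrived', 'see you at noon']
```

The correction for block 2 arrives after block 3, yet the caption reads correctly in order because replacement is by identifier, not by position in the transmission stream.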
As described above in various embodiments of the disclosure, an error detected in a text caption may be replaced with corrected text (i.e., “in-line” correction) and, therefore, in comparison to conventional text caption correction methods, a user viewing the text caption may be provided with a context of the error correction. Stated another way, upon completion of an error correction within a text caption, a reader may understand how the correction relates to the text caption as a whole and will not be required to guess as to where the correction belongs within the text caption or how the correction applies to the text caption. Furthermore, in comparison to conventional text caption correction methods, in-line correction of a text caption may reduce distractions to a hearing-impaired user and, therefore, interruptions in the flow of conversation between a hearing-capable user and a hearing-impaired user may be reduced.
While the disclosure has been described herein with respect to certain preferred embodiments, those of ordinary skill in the art will recognize and appreciate that it is not so limited. Rather, many additions, deletions, and modifications to the preferred embodiments may be made without departing from the scope of the invention as hereinafter claimed. In addition, features from one embodiment may be combined with features of another embodiment while still being encompassed within the scope of the invention as contemplated by the inventors.
This application is a continuation of U.S. patent application Ser. No. 14/530,407, filed Oct. 14, 2014, which application is a continuation of U.S. patent application Ser. No. 13/768,918, filed Feb. 15, 2013, which application is a continuation of U.S. patent application Ser. No. 12/624,973, filed Nov. 24, 2009, now U.S. Pat. No. 8,379,801, issued Feb. 19, 2013, the disclosure of each of which is hereby incorporated herein by this reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
4777469 | Engelke et al. | Oct 1988 | A |
4959847 | Engelke et al. | Sep 1990 | A |
5081673 | Engelke et al. | Jun 1992 | A |
5146502 | Davis | Sep 1992 | A |
5163081 | Wycherley | Nov 1992 | A |
5325417 | Engelke et al. | Jun 1994 | A |
5327479 | Engelke et al. | Jul 1994 | A |
5351288 | Engelke et al. | Sep 1994 | A |
5432837 | Engelke et al. | Jul 1995 | A |
D364865 | Engelke et al. | Dec 1995 | S |
5572423 | Church et al. | Nov 1996 | A |
5574784 | LaPadula | Nov 1996 | A |
5576955 | Newbold et al. | Nov 1996 | A |
5581593 | Engelke et al. | Dec 1996 | A |
5604786 | Engelke et al. | Feb 1997 | A |
5636340 | Bonneau et al. | Jun 1997 | A |
5680443 | Kasday | Oct 1997 | A |
5687222 | McLaughlin et al. | Nov 1997 | A |
5712901 | Meermans | Jan 1998 | A |
5715370 | Luther et al. | Feb 1998 | A |
5724405 | Engelke | Mar 1998 | A |
5754737 | Gipson | May 1998 | A |
5809112 | Ryan | Sep 1998 | A |
5809425 | Colwell et al. | Sep 1998 | A |
5855000 | Waibel et al. | Dec 1998 | A |
5909482 | Engelke | Jun 1999 | A |
5974116 | Engelke et al. | Oct 1999 | A |
5978654 | Colwell et al. | Nov 1999 | A |
6075841 | Engelke et al. | Jun 2000 | A |
6075842 | Engelke et al. | Jun 2000 | A |
6101467 | Bartosik | Aug 2000 | A |
6174170 | Olmedo | Jan 2001 | B1 |
6175819 | Van Alstine | Jan 2001 | B1 |
6233314 | Engelke | May 2001 | B1 |
6243445 | Begeja et al. | Jun 2001 | B1 |
6260011 | Heckerman et al. | Jul 2001 | B1 |
6307921 | Engelke et al. | Oct 2001 | B1 |
6314397 | Lewis et al. | Nov 2001 | B1 |
6360010 | Hu et al. | Mar 2002 | B1 |
6473778 | Gibbon | Oct 2002 | B1 |
6493426 | Engelke et al. | Dec 2002 | B2 |
6504910 | Engelke et al. | Jan 2003 | B1 |
6510206 | Engelke et al. | Jan 2003 | B2 |
6513003 | Angell et al. | Jan 2003 | B1 |
6549611 | Engelke et al. | Apr 2003 | B2 |
6567503 | Engelke | May 2003 | B2 |
6568939 | Edgar | May 2003 | B1 |
6594346 | Engelke | Jul 2003 | B2 |
6603835 | Engelke et al. | Aug 2003 | B2 |
6748053 | Engelke et al. | Jun 2004 | B2 |
6816834 | Jaroker | Nov 2004 | B2 |
6820055 | Saindon et al. | Nov 2004 | B2 |
6882707 | Engelke et al. | Apr 2005 | B2 |
6885731 | Engelke et al. | Apr 2005 | B2 |
6934366 | Engelke et al. | Aug 2005 | B2 |
6940617 | Ma et al. | Sep 2005 | B2 |
6941345 | Kapil | Sep 2005 | B1 |
6944593 | Kuzunuki et al. | Sep 2005 | B2 |
6990335 | Shamoon et al. | Jan 2006 | B1 |
6999932 | Zhou | Feb 2006 | B1 |
7003082 | Engelke et al. | Feb 2006 | B2 |
7006604 | Engelke | Feb 2006 | B2 |
7035804 | Saindon et al. | Apr 2006 | B2 |
7130401 | Rampey et al. | Oct 2006 | B2 |
7130790 | Flanagan et al. | Oct 2006 | B1 |
7164753 | Engelke et al. | Jan 2007 | B2 |
7197459 | Harinarayan et al. | Mar 2007 | B1 |
7269561 | Mukhtar et al. | Sep 2007 | B2 |
7295882 | Champion | Nov 2007 | B2 |
7305627 | Tannenbaum | Dec 2007 | B2 |
7315612 | McClelland | Jan 2008 | B2 |
7319740 | Engelke et al. | Jan 2008 | B2 |
7376561 | Rennillo et al. | May 2008 | B2 |
7428702 | Cervantes et al. | Sep 2008 | B1 |
7444285 | Forbes | Oct 2008 | B2 |
7483833 | Peters | Jan 2009 | B2 |
7511761 | Lynch | Mar 2009 | B2 |
7516404 | Colby | Apr 2009 | B1 |
7539086 | Jaroker | May 2009 | B2 |
7543033 | Vincent | Jun 2009 | B2 |
7555104 | Engelke | Jun 2009 | B2 |
7660398 | Engelke | Feb 2010 | B2 |
7698140 | Bhardwaj et al. | Apr 2010 | B2 |
7734702 | Kim | Jun 2010 | B2 |
7769586 | Bennett et al. | Aug 2010 | B2 |
7774694 | Watson et al. | Aug 2010 | B2 |
7792675 | Ramaswamy et al. | Sep 2010 | B2 |
7826635 | Barbara | Nov 2010 | B2 |
7881441 | Engelke | Feb 2011 | B2 |
7908145 | Bennett et al. | Mar 2011 | B2 |
8140337 | Nakazawa et al. | Mar 2012 | B2 |
8150689 | Beach et al. | Apr 2012 | B2 |
8195459 | Brand | Jun 2012 | B1 |
8213578 | Engleke | Jul 2012 | B2 |
8296811 | Begeja et al. | Oct 2012 | B1 |
8355914 | Joh et al. | Jan 2013 | B2 |
8379801 | Romriell | Feb 2013 | B2 |
8408913 | Palacios | Apr 2013 | B2 |
8416925 | Engelke | Apr 2013 | B2 |
8515024 | Engelke et al. | Aug 2013 | B2 |
8515748 | Gangemi et al. | Aug 2013 | B2 |
8626486 | Och | Jan 2014 | B2 |
8775175 | Nagel et al. | Jul 2014 | B1 |
8898633 | Bryant | Nov 2014 | B2 |
9336689 | Romriell et al. | May 2016 | B2 |
9350857 | Engelke et al. | May 2016 | B1 |
20010005825 | Engelke et al. | Jun 2001 | A1 |
20010047258 | Rodrigo | Nov 2001 | A1 |
20010047270 | Gusick et al. | Nov 2001 | A1 |
20020055974 | Hawkes et al. | May 2002 | A1 |
20020091832 | Low et al. | Jul 2002 | A1 |
20020120647 | Amano | Aug 2002 | A1 |
20020194278 | Golan | Dec 2002 | A1 |
20030023689 | Brown et al. | Jan 2003 | A1 |
20030028448 | Joseph et al. | Feb 2003 | A1 |
20030046350 | Chintalapati et al. | Mar 2003 | A1 |
20030125950 | Avila et al. | Jul 2003 | A1 |
20030161447 | Kind | Aug 2003 | A1 |
20030177009 | Odinak et al. | Sep 2003 | A1 |
20030187659 | Cho et al. | Oct 2003 | A1 |
20040019638 | Makagon et al. | Jan 2004 | A1 |
20040064317 | Othmer et al. | Apr 2004 | A1 |
20040204941 | Israch et al. | Oct 2004 | A1 |
20050069107 | Tanaka et al. | Mar 2005 | A1 |
20050086702 | Cormack | Apr 2005 | A1 |
20050094777 | McClelland | May 2005 | A1 |
20050096910 | Watson et al. | May 2005 | A1 |
20050102140 | Davne et al. | May 2005 | A1 |
20050131840 | Pintsov et al. | Jun 2005 | A1 |
20050226394 | Engelke et al. | Oct 2005 | A1 |
20050226398 | Bojeun | Oct 2005 | A1 |
20050283726 | Lunati | Dec 2005 | A1 |
20050289130 | Cohen et al. | Dec 2005 | A1 |
20060047767 | Dodrill et al. | Mar 2006 | A1 |
20060064631 | Parker | Mar 2006 | A1 |
20060092291 | Bodie | May 2006 | A1 |
20060095550 | Nemmaier et al. | May 2006 | A1 |
20060101406 | Goenka | May 2006 | A1 |
20060161631 | Lira | Jul 2006 | A1 |
20070118373 | Wise et al. | May 2007 | A1 |
20070118374 | Wise et al. | May 2007 | A1 |
20070124387 | Galloway | May 2007 | A1 |
20070271510 | Grigoriu et al. | Nov 2007 | A1 |
20070280463 | Kouchri | Dec 2007 | A1 |
20080097743 | Hong | Apr 2008 | A1 |
20080155411 | Christensen | Jun 2008 | A1 |
20080160491 | Allen | Jul 2008 | A1 |
20080177623 | Fritsch et al. | Jul 2008 | A1 |
20080187108 | Engelke et al. | Aug 2008 | A1 |
20080260032 | Hu et al. | Oct 2008 | A1 |
20080288250 | Rennillo et al. | Nov 2008 | A1 |
20090037171 | McFarland et al. | Feb 2009 | A1 |
20090089086 | Schoenberg | Apr 2009 | A1 |
20090112623 | Schoenberg | Apr 2009 | A1 |
20090112852 | Kim et al. | Apr 2009 | A1 |
20090192782 | Drewes | Jul 2009 | A1 |
20090263098 | Hyun et al. | Oct 2009 | A1 |
20090286210 | Spreen | Nov 2009 | A1 |
20090319927 | Beeman | Dec 2009 | A1 |
20090326938 | Marila et al. | Dec 2009 | A1 |
20100063815 | Cloran et al. | Mar 2010 | A1 |
20100135486 | Schneider | Jun 2010 | A1 |
20100138221 | Boys | Jun 2010 | A1 |
20100169073 | Almagro | Jul 2010 | A1 |
20100222649 | Schoenberg | Sep 2010 | A1 |
20100287486 | Coddington | Nov 2010 | A1 |
20100332217 | Wintner et al. | Dec 2010 | A1 |
20110018812 | Baird | Jan 2011 | A1 |
20110040754 | Peto et al. | Feb 2011 | A1 |
20110081007 | Bar-Yoav | Apr 2011 | A1 |
20110123003 | Romriell et al. | May 2011 | A1 |
20120130706 | Qiu et al. | May 2012 | A1 |
20120178073 | Wasmund | Jul 2012 | A1 |
20120250836 | Engleke et al. | Oct 2012 | A1 |
20120250837 | Engleke et al. | Oct 2012 | A1 |
20130158995 | Romriell et al. | Jun 2013 | A1 |
20150051908 | Romriell et al. | Feb 2015 | A1 |
Number | Date | Country |
---|---|---|
1091303 | Apr 2001 | EP |
Entry |
---|
Ultratec, Inc. and Captel, Inc., Plaintiffs v. Sorenson Communications, Inc. and Captioncall, LLC., Defendants, Opinion and Order of the United States District Court for the Western District of Wisconsin, dated Aug. 28, 2014. |
United States Patent and Trademark Office, Before the Patent Trial and Appeal Board pleading, Ultratec, Inc. v. CaptionCall, LLC, Patent Owner's Response to Decision in Institute Inter Partes Review dated Jan. 30, 2014, 60 pages. |
United States Patent and Trademark Office, Before the Patent Trial and Appeal Board pleading, Ultratec, Inc. v. Sorenson Communications, Inc., CaptionCall, LLC, and Wilmington Trust, National Association, Petitioner's Reply Under 37 C.F.R. 42.23 dated Apr. 16, 2014, 19 pages. |
United States Patent and Trademark Office, Before the Patent Trial and Appeal Board pleading, Ultratec, Inc. v. CaptionCall, LLC, Notice of Disclaimer of Claims 1, 2, 7 and 9 of U.S. Pat. No. 8,379,801 dated Jul. 1, 2014, 5 pages. |
United States Patent and Trademark Office, Before the Patent Trial and Appeal Board pleading, Ultratec, Inc. v. Sorenson Communications, Inc., CaptionCall, LLC, and Wilmington Trust, National Association, Decision re Petitioner's Request for Rehearing, dated Jan. 10, 2014, 6 pages. |
United States Patent and Trademark Office, Before the Patent Trial and Appeal Board pleading, Ultratec, Inc. v. Sorenson Communications, Inc., CaptionCall, LLC, and Wilmington Trust, National Association, Request for Rehearing Under 37 C.F.R. 42.71(D) dated Nov. 27, 2013, 9 pages. |
United States Patent and Trademark Office, Before the Patent Trial and Appeal Board pleading, Ultratec, Inc. v. CaptionCall, LLC, Decision re Institution of Inter Partes Review, dated Nov. 13, 2013, 25 pages. |
Disclaimer in Patent Under 37 C.F.R. 1.321(a) for U.S. Pat. No. 8,379,801 dated Jun. 30, 2014. |
Petition for Inter Partes Review of U.S. Pat. No. 8,379,801 (including exhibits) dated May 17, 2013. |
Advantage Software, About Us webshot; www.eclipsecat.com/contentfabout-us; May 15, 2013, 1 page. |
E-Tips Newsletter, Norton 360: Outstanding Protection; Nov. 2007; Issue 71, 6 pages. |
CapTel News, News & Helpful Tips for People Who Use CapTel, Summer 2007, newsletter. |
User Guide to Sprint CapTel Services, as early as Mar. 2009. |
All CapTel 800 Phones are being updated, Press Release, <http://www.kcdhh.org/bulletin/CapTel800Update.pdf>, as early as Oct. 2011. |
CapTel News from Ultratec—Jul. 2005. |
Ultratec, Inc., v. CaptionCall, LLC., IPR2013-00288, Final Written Decision dated Oct. 30, 2014. |
Inter Partes Review U.S. Pat. No. 9,336,689, Exhibit 1001 “Declaration of Ivan Zatkovich”, filed May 9, 2017, 106 pages. |
Inter Partes Review U.S. Pat. No. 9,336,689, Exhibit 1002 “Declaration of Meme Hilley”, filed May 9, 2017, 6 pages. |
Petition for Inter Partes Review of U.S. Pat. No. 8,379,801, Exhibit 1012, filed May 17, 2013, 68 pages. |
Edmund A. Williams, et al., “National Association of Broadcasters: Engineering Handbook, 10th Edition”, 2007, 21 pages. |
Inter Partes Review Case No. IPR2013-00288, Exhibit 1011, “Final Written Decision”, dated Oct. 30, 2014, 29 pages. |
“Notice of Allowance” dated Oct. 7, 2014, in U.S. Appl. No. 13/768,918, 7 pages. |
“Terminal Disclaimer to Obviate a Double Patenting Rejection Over a “Prior” Patent” filed on Sep. 9, 2014, in U.S. Appl. No. 13/768,918, 1 page. |
“Decision on Petition Under CFR 1.313(c)(2)” dated Nov. 4, 2014, in U.S. Appl. No. 13/768,918, 1 page. |
“Notice of Abandonment” dated Jun. 17, 2015, in U.S. Appl. No. 13/768,918, 2 pages. |
Office Action Response filed on Sep. 25, 2014, in U.S. Appl. No. 13/768,918, 8 pages. |
“Notice of Allowance” dated Feb. 26, 2016, in U.S. Appl. No. 14/530,407, 8 pages. |
Office Action Response filed on Jan. 19, 2016, in U.S. Appl. No. 14/530,407, 11 pages. |
Non-Final Office Action dated Oct. 22, 2015, in U.S. Appl. No. 14/530,407, 7 pages. |
James Martin, “Design of Man-Computer Dialogues”, 1973, 17 pages. |
Dorothy Smith, “Communication in the Courtroom: Technology Is Helping Provide Equal Access to the Law”, 1989, 3 pages. |
Joseph Shapiro, “Technology No Longer Distances Deaf Culture”, May 1, 2006, 4 pages. |
Lloyd Vries, “Pagers Become Lifeline for Deaf”, www.cbsnews.com, Nov. 1, 2006, 2 pages. |
Susan Donaldson James, “Deaf and Proud to Use Sign Language”, Dec. 12, 2006, 3 pages. |
Petition for Inter Partes Review of U.S. Pat. No. 9,336,689, dated May 9, 2017, 79 pages. |
“Americans with Disabilities Act of 1990, the ADA Amendments Act of 2008, and 28 CFR Part 35: Title II Guidelines for the State Courts System of Florida”, Jan. 2009, 43 pages. |
“Settlement Agreement Between the United States of America and the Florida State Courts System”, dated May 31, 1996, 8 pages. |
“Florida State Courts System Provision of Real-Time Court Reporting Services for Attorneys with Disabilities”, Dated Oct. 30, 2007, 4 pages. |
Marcele M. Soviero, “Captioning Could be a Boon to Many Viewers”, “Popular Science”, Oct. 1993, 3 pages. |
“Report, In the Matter of Closed Captioning and Video Description of Video Programming”, Dated Jul. 25, 1996, 62 pages. |
Decision Institution of Inter Partes Review received in IPR2017-01394, U.S. Pat. No. 9,336,689 B2, Nov. 30, 2017. |
Judgment received in IPR2017-01394, U.S. Pat. No. 9,336,689 B2, Feb. 27, 2018. |
Number | Date | Country | |
---|---|---|---|
Parent | 14530407 | Oct 2014 | US |
Child | 15096087 | US | |
Parent | 13768918 | Feb 2013 | US |
Child | 14530407 | US | |
Parent | 12624973 | Nov 2009 | US |
Child | 13768918 | US |