System and method for third party monitoring of voice and video calls

Information

  • Patent Grant
  • Patent Number
    11,271,976
  • Date Filed
    Tuesday, June 16, 2020
  • Date Issued
    Tuesday, March 8, 2022
Abstract
A system is described herein that facilitates the monitoring of inmate communications. The system provides a remotely-accessible means for a reviewer to monitor a call between an inmate and another person. The system includes a monitoring server and a monitoring station. The monitoring server is configured to receive a call and call information from a communication center, process the call for monitoring, schedule a review of the call, and store the call, the call information, and scheduling data. The monitoring station is configured to receive the call and the call information from the monitoring server based on the scheduling data, to display identifying information, and to facilitate the review of the call.
Description
BACKGROUND
Field

The disclosure relates to a monitoring system that facilitates third party monitoring of inmate audio and video communications.


Background

Correctional facilities provide inmates with the ability to communicate with friends, family, and visitors because such contact reduces recidivism and provides incentives for inmates to follow the rules and policies of the facility. In addition to traditional telephone calls and telephone visitations, correctional facilities seek to offer a wide variety of communication services to inmates, such as video visitation and video calls, among others. However, as the number of communication options available to inmates increases, an increased amount of monitoring is required for these communications.





BRIEF DESCRIPTION OF THE DRAWINGS/FIGURES

The accompanying drawings, which are incorporated herein and form a part of the specification, illustrate embodiments of the present disclosure and, together with the description, further serve to explain the principles of the disclosure and to enable a person skilled in the pertinent art to make and use the embodiments.



FIG. 1 illustrates a block diagram of a monitoring system, according to exemplary embodiments of the present disclosure;



FIG. 2 illustrates a block diagram of a monitoring center, according to exemplary embodiments of the present disclosure;



FIG. 3 illustrates a block diagram of a monitoring server, according to exemplary embodiments of the present disclosure;



FIG. 4 illustrates an application server, according to exemplary embodiments of the present disclosure;



FIG. 5 illustrates a block diagram of storage devices, according to exemplary embodiments of the present disclosure;



FIG. 6 illustrates a monitoring station, according to exemplary embodiments of the present disclosure;



FIG. 7 illustrates a user interface of a monitoring station, according to exemplary embodiments of the present disclosure;



FIG. 8 illustrates a flowchart diagram of a method for monitoring a call, according to exemplary embodiments of the present disclosure;



FIG. 9 illustrates a flowchart diagram of a method for reviewing a call and monitoring a reviewer, according to exemplary embodiments of the present disclosure; and



FIG. 10 illustrates a computer system, according to exemplary embodiments of the present disclosure.





The present disclosure will be described with reference to the accompanying drawings. In the drawings, like reference numbers indicate identical or functionally similar elements. Additionally, the leftmost digit(s) of a reference number identifies the drawing in which the reference number first appears.


DETAILED DESCRIPTION

The following Detailed Description refers to accompanying drawings to illustrate exemplary embodiments consistent with the disclosure. References in the Detailed Description to “one exemplary embodiment,” “an exemplary embodiment,” “an example exemplary embodiment,” etc., indicate that the exemplary embodiment described may include a particular feature, structure, or characteristic, but every exemplary embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same exemplary embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an exemplary embodiment, it is within the knowledge of those skilled in the relevant art(s) to effect such feature, structure, or characteristic in connection with other exemplary embodiments whether or not explicitly described.


The exemplary embodiments described herein are provided for illustrative purposes, and are not limiting. Other exemplary embodiments are possible, and modifications may be made to the exemplary embodiments within the spirit and scope of the disclosure. Therefore, the Detailed Description is not meant to limit the invention. Rather, the scope of the invention is defined only in accordance with the following claims and their equivalents.


Embodiments may be implemented in hardware (e.g., circuits), firmware, software, or any combination thereof. Embodiments may also be implemented as instructions stored on a machine-readable medium, which may be read and executed by one or more processors. A machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computing device). For example, a machine-readable medium may include read only memory (ROM); random access memory (RAM); magnetic disk storage media; optical storage media; flash memory devices; electrical, optical, acoustical or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.), and others. Further, firmware, software, routines, instructions may be described herein as performing certain actions. However, it should be appreciated that such descriptions are merely for convenience and that such actions in fact result from computing devices, processors, controllers, or other devices executing the firmware, software, routines, instructions, etc. Further, any of the implementation variations may be carried out by a general purpose computer, as described below.


For purposes of this discussion, any reference to the term “module” shall be understood to include at least one of software, firmware, and hardware (such as one or more circuit, microchip, or device, or any combination thereof), and any combination thereof. In addition, it will be understood that each module may include one, or more than one, component within an actual device, and each component that forms a part of the described module may function either cooperatively or independently of any other component forming a part of the module. Conversely, multiple modules described herein may represent a single component within an actual device. Further, components within a module may be in a single device or distributed among multiple devices in a wired or wireless manner.


The following Detailed Description of the exemplary embodiments will so fully reveal the general nature of the invention that others can, by applying knowledge of those skilled in relevant art(s), readily modify and/or adapt for various applications such exemplary embodiments, without undue experimentation, without departing from the spirit and scope of the disclosure. Therefore, such adaptations and modifications are intended to be within the meaning and range of equivalents of the exemplary embodiments based upon the teaching and guidance presented herein. It is to be understood that the phraseology or terminology herein is for the purpose of description and not of limitation, such that the terminology or phraseology of the present specification is to be interpreted by those skilled in relevant art(s) in light of the teachings herein.


Overview


Communication between inmates and outsiders has been shown to be an extremely important part of rehabilitation. Allowing an inmate to keep in contact with friends and family significantly helps to reduce recidivism as well as to prepare the inmate for life after prison. Because most inmates are eventually released back into the public, any steps that minimize problems with their reassimilation into society are highly beneficial.


Traditionally, communications between inmates and outsiders included only telephone calls and letter writing. Over the years, however, newer technologies such as email, texting, and video calling have been adopted by correctional facilities. Typically, some form of monitoring has been applied by correctional facilities to all of these types of communications.


In the case of telephone and video calling, a large amount of time is required to monitor the audio and video communications. Specifically, in typical monitoring systems, a reviewer, such as an officer, listens to an audio communication or watches a video communication of an inmate communicating with an outsider. In general, monitoring systems allow the reviewer to listen to or watch a communication live, and/or the communication is recorded for later review. However, most correctional facilities do not have sufficient personnel available to monitor all of the audio or video communications of inmates. Therefore, many correctional facilities monitor only a selection of live communications and record all remaining communications for later review if an investigation points to a particular inmate, a phone number, or the time of day of an event. Because of this limited personnel, many correctional facilities are forced to limit the number of calls that each inmate may have during a time period (e.g., 1 call per day) and/or the amount of time that each inmate may have during a time period (e.g., 300 minutes a month).


In addition to recording communications, typical monitoring systems also have capabilities to convert speech to text, which allows a reviewer to search for key words or phrases. However, many of these technologies do not go far enough to detect covert communications such as slang, jargon, or body gestures that are not typically used in describing criminal activity.


In light of the above, the present disclosure provides a monitoring system that allows a third party to monitor voice and video calls. The monitoring system allows a reviewer to remotely listen to an audio call or watch a video call and to provide alerts to officers or administrators regarding any suspicious behavior or communication between an inmate and an outsider such as family, friends, or another inmate. By providing a monitoring system for managing the different communications, a significant burden can be removed from officers and administrators while simultaneously increasing inmates' communication time.


Monitoring System



FIG. 1 illustrates a block diagram of a monitoring system 100, according to exemplary embodiments of the present disclosure. The monitoring system 100 includes a monitoring center 110 configured to receive communication session data from a communication center 120. In this disclosure, a session refers to a communication between two parties, in particular, an audio or video communication between an inmate using an inmate communication device 130 and an outsider using a communication device 140. The session data may include the session itself and data related to the session, as will be described in further detail below. The monitoring center 110 connects to the communication center 120 via a network 101. The network 101 may include a Local-Area Network (LAN), a Wide-Area Network (WAN), or the Internet.


In an embodiment, the communication center 120 connects the inmate communication device 130 to the communication device 140. The inmate communication device 130 includes any or all of devices such as an audio and video communication device 132, wireless devices 135a and 135b, and/or a control station 136. The communication center 120 also connects to a wireless access point 134 (e.g., a router), which may provide connectivity to the wireless devices 135a and 135b. The communication center 120 connects to the inmate communication device 130 via a LAN 103.


The communication center 120 connects to the communication device 140 via any or all of a WAN 105, the Internet 107, and the Public Switched Telephone Network (PSTN) 109. The WAN 105 may facilitate communications with other nearby prisons, such as those within the same county, state, etc. Audio and/or video devices 142 located at those other facilities may provide cross-facility visitations between inmates. In an embodiment, WebRTC may be utilized in place of a session initiation protocol (SIP) over a WAN or the Internet, each of which provides a dedicated, private link between the inmate communication device 130 and the communication device 140.


The Internet 107 is utilized to provide access to remote stations 144 such as remotely distributed control stations, scheduling clients, and home visitation devices. The PSTN 109 can be used to provide connectivity to basic telephones and other communication devices (not shown) over traditional data networks.


Monitoring Center



FIG. 2 illustrates a block diagram of a monitoring center 200, according to exemplary embodiments of the present disclosure. The monitoring center 200 may be an exemplary embodiment of the monitoring center 110 of FIG. 1. The monitoring center 200 includes a monitoring server 202 which communicates with monitoring stations 204a and 204b and with an administrator device(s) 206. For example, the monitoring server 202 interfaces with the monitoring stations 204a and 204b and/or the administrator device(s) 206 via a network 201 when receiving session data from the communication center 120 and can transmit the session data to at least one of the monitoring stations 204a and 204b or the administrator device(s) 206.


The monitoring center 200 also allows communication between the monitoring stations 204a and 204b and the administrator device(s) 206. For example, the monitoring station 204a may communicate with the administrator device 206 to provide information or an alert regarding a particular session received from the communication center 120.


In an embodiment, the monitoring center 200 is further configured to interrupt and/or disconnect a communication between parties of a live communication session. For example, the monitoring station 204a or 204b is configured to allow a reviewer to interrupt a communication session and issue a warning to either or both parties of the communication session. The warning may be in the same format as the monitored communication session—voice, text (chat), or video. The occurrence of an interrupting event may be clearly flagged in session logs. As another example, the monitoring station 204a or 204b may be configured to disconnect a communication session between the parties, and the monitoring center may superimpose an audible or visual message that is played to the parties of the session describing the reason for the disconnection of service. A disconnection event can be clearly flagged in session logs. Further, the capability of interrupting or disconnecting a communication session may be allowed only if configured in the system on a user-by-user profile basis.
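
As one hedged illustration of how such interruption and disconnection controls might be gated by a per-user profile, the following Python sketch checks a reviewer's permissions before acting on a live session and flags the event in a session log. The class and attribute names (ReviewerProfile, SessionControl, may_interrupt, and so on) are assumptions made for this example, not part of the disclosed system.

```python
from dataclasses import dataclass

@dataclass
class ReviewerProfile:
    """Hypothetical per-reviewer permissions, configured on a user-by-user basis."""
    reviewer_id: str
    may_interrupt: bool = False
    may_disconnect: bool = False

class SessionControl:
    """Illustrative wrapper around a live communication session."""

    def __init__(self, session_id, log):
        self.session_id = session_id
        self.log = log  # append-only session log (a plain list in this sketch)

    def interrupt(self, profile: ReviewerProfile, warning: str):
        # Interrupting is only allowed if enabled in the reviewer's profile.
        if not profile.may_interrupt:
            raise PermissionError("profile does not permit interrupting sessions")
        self._superimpose(warning)  # warning delivered in-session
        self.log.append(("INTERRUPT", self.session_id, profile.reviewer_id, warning))

    def disconnect(self, profile: ReviewerProfile, reason: str):
        # Disconnection events are flagged in the session log with the stated reason.
        if not profile.may_disconnect:
            raise PermissionError("profile does not permit disconnecting sessions")
        self._superimpose(f"Session terminated: {reason}")
        self.log.append(("DISCONNECT", self.session_id, profile.reviewer_id, reason))

    def _superimpose(self, message: str):
        # Placeholder for mixing an audible or visual notice into the live media stream.
        print(f"[session {self.session_id}] {message}")
```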


In an embodiment, the administrator device(s) 206 can include one or more of a phone, computer, tablet, fax machine, or pager having a capability of receiving a communication from monitoring station 204a or 204b. For example, a reviewer using the monitoring station 204a can send an email alert to an administrator, and the administrator can view the email alert by way of a phone, a computer, and/or a tablet that represent the administrator devices 206.


The network 201 can include a LAN, a WAN, or the Internet. Accordingly, the monitoring center 200 can be located on-site or at a remote monitoring location, and allows monitors, corrections officers, or others to monitor a session between an inmate and an outsider in real-time, on delay, or in the form of an audio or video recording.


Monitoring Server



FIG. 3 illustrates a block diagram of a monitoring server 300, according to exemplary embodiments of the present disclosure. The monitoring server 300 may represent an exemplary embodiment of the monitoring server 202 of FIG. 2. The monitoring server 300 includes an audio server 302, a video server 304, an application server 308, and data storage 310, that are all connected to each other via a network bus 312.


Each of the servers 302, 304, and 308 can be constructed as individual physical hardware devices, or as virtual servers. The number of physical hardware machines can be scaled to match the number of simultaneous user connections desired to be supported in the monitoring system 100.


The audio server 302 can consist of any number of servers, and is configured to receive audio session data via the communication center 120. The audio server 302 supports sessions between inmates and outsiders that use audio devices, such as an audio session between an inmate using the audio communication device 132 and an outsider using a telephone connected to the PSTN 109. The audio server 302 facilitates the real-time recording and delayed monitoring of audio sessions. The audio server 302 is configured to simultaneously record and transmit audio session data to the monitoring station 204a and/or 204b. For example, when an inmate uses the audio communication device 132 to have an audio session with another inmate using the audio communication device 142, the audio server 302 receives audio session data from the communication center 120, records the audio session data, and transmits the audio session data to the monitoring station(s) 204a and/or 204b. The audio server 302 may store audio sessions as audio files on an internal storage or an external storage, as will be explained in more detail below.
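
A minimal sketch, under assumed interfaces, of the simultaneous record-and-transmit behavior described above: each received audio chunk is appended to a recording file and also forwarded to the connected monitoring stations. The chunk iterable and the station objects with a send() method are hypothetical stand-ins for the actual interfaces to the communication center 120 and the monitoring stations 204a and 204b.

```python
def relay_and_record(chunks, recording_path, stations):
    """Record an audio session while simultaneously forwarding it to
    monitoring stations (a sketch; interfaces are assumed)."""
    with open(recording_path, "ab") as recording:
        for chunk in chunks:          # raw audio frames from the communication center
            recording.write(chunk)    # persist for delayed review
            for station in stations:
                station.send(chunk)   # forward for real-time review
```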


The video server 304 can consist of any number of servers, and is configured to receive video session data via the communication center 120. The video server 304 supports video sessions between inmates and outsiders that use video devices such as a video session between an inmate using the video communication device 132 and an outsider using the video communication device 142. The video server 304 facilitates the real-time and delayed monitoring of video sessions. The video server 304 is configured to simultaneously record and transmit video session data to a monitoring station. For example, when an inmate uses the wireless device 135a to have a video session with a family member using the remote station 144, the video server 304 can receive video session data via the communication center 120, record the video session data, and transmit the video session data to the monitoring station(s) 204a and/or 204b. The video server 304 may store the video sessions as video files on an internal storage or an external storage, as will be explained in more detail below.


Because there may be a variety of different video communication standards employed by different video devices that wish to participate in video calls, in an embodiment, the video server 304 may also perform real-time format conversion. The conversion may convert incoming signals as needed, or may convert outgoing signals to be compatible with monitoring stations 204a and 204b.
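
The disclosure does not name a particular codec or conversion tool; purely as an illustration, a recorded video session could be re-encoded with an external transcoder such as ffmpeg into a widely supported format before delivery to a monitoring station. The sketch below rests on that assumption and is not the disclosed conversion mechanism.

```python
import subprocess

def convert_for_station(in_path: str, out_path: str) -> None:
    """Re-encode a recorded video session to H.264 video and AAC audio in an
    MP4 container, a combination most playback clients accept (an assumption
    made for illustration only)."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", in_path,
         "-c:v", "libx264", "-preset", "veryfast",
         "-c:a", "aac",
         out_path],
        check=True,
    )
```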


Because the audio server 302 and the video server 304 receive and transmit session data by way of a network, in an exemplary embodiment, both the audio server 302 and the video server 304 can decrypt received session data and encrypt session data prior to transmitting the session data, for security purposes. Further, the audio server 302 and the video server 304 may record or store audio and video files on either internal storage or the data storage 310.
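
The disclosure does not specify a cipher; as one hedged example, symmetric encryption such as that provided by the Python cryptography package's Fernet primitive could be used to protect session data in transit between the servers and the monitoring stations. The sketch below is illustrative only, and its key handling is an assumption.

```python
from cryptography.fernet import Fernet

# A shared symmetric key; how the disclosed system manages keys is not
# specified, so generating one in place here is purely for illustration.
key = Fernet.generate_key()
cipher = Fernet(key)

def encrypt_session_data(plaintext: bytes) -> bytes:
    """Encrypt session data before transmitting it over the network."""
    return cipher.encrypt(plaintext)

def decrypt_session_data(token: bytes) -> bytes:
    """Decrypt session data received from the communication center."""
    return cipher.decrypt(token)
```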



FIG. 4 illustrates an application server 400, according to exemplary embodiments of the present disclosure. The application server 400 may represent an exemplary embodiment of the application server 308 depicted in FIG. 3. The application server 400 functions as the primary logic processing center in the monitoring system 100. The application server 400 includes one or more central processing units (CPUs) 410 connected via a bus 412 to several other peripherals. Such peripherals include an input device, such as a keyboard and/or mouse 420, a monitor 422 for displaying information, and a network interface card 424 and/or a modem 426 that provide network connectivity and communication.


The application server 400 also includes internal data storage 430. This data storage 430 is non-volatile storage, such as one or more magnetic hard disk drives (HDDs) and/or one or more solid state drives (SSDs). The data storage 430 is used to store a variety of important files, documents, or other digital information, such as the operating system files, application files, user data, and/or temporary recording space.


The application server 400 also includes system memory 440. The system memory 440 is preferably faster and more efficient than the data storage 430, and is configured as random access memory (RAM) in an embodiment. The system memory 440 contains the runtime environment of the application server, storing temporary data for any of the operating system 442, Java virtual machine 444, Java application server 446, and monitoring control logic 448.



FIG. 5 illustrates a block diagram of storage devices 500, according to exemplary embodiments of the present disclosure. The storage devices 500 may represent an exemplary embodiment of the data storage 310 of FIG. 3.


As shown in FIG. 5, the storage devices 500 provide access to a wide variety of data. The configuration data store 510 allows the system to be configured with a variety of different hardware types and manufacturers, and allows for more error-free interfacing and monitoring at the monitoring stations 204a and 204b. The configuration data store 510 may also include the connection details of one or more hardware devices anticipated to be used for audio or video sessions received from the communication center 120, such as the video and audio servers, web servers, application servers, and remote devices. The inmate data store 520 includes information about individual inmates, such as name, address, commitment information, etc. The inmate data store 520 may also include information relating to the case workers or correctional officers assigned to the inmate. These records may be directly entered, or may be obtained from an Inmate Telephone System, Jail Management System, or the communication center 120.


Although information regarding an outsider may be directly stored in the respective databases, an outsider data store 530 may be provided in an embodiment to separately store outsider information. The outsider information may include a name or identity and/or contact information, such as phone or address, of the outsider that is communicating with the inmate. The outsider information may also be identified in various ones of the data stores 510-580 by name or identifier only and this name/identifier may include a link to full biographical information of the outsider in the outsider data store 530.


Real-time communication data store 540 receives and temporarily stores information regarding a current ongoing session. The real-time communication information is received from the communication center 120 and may include session annotations, bookmarks, or alerts from a reviewer and/or connection data regarding a currently reviewed audio or video session. For example, the real-time communication data store 540 can receive session data regarding a detected three-way call in an audio session, such as by the detection of a hook-flash or other indicating event, and/or any other event that can be generated by a telephone.


Historical communication data store 550 stores information relating to prior audio and video sessions. The information included within these records can consist of prior session data of the inmate involved in the session, the outsider, the resources used for prior calls, including the type of devices used by each of the parties, the date/time of the audio or video sessions, the duration, etc. This information can be populated by the application server 400 by tracking and monitoring visits, and recording the data associated therewith.


Scheduling data store 560 stores session events that have not yet occurred as well as historical session information (e.g., session records). In this regard, the scheduling data store 560 stores a calendar of scheduled sessions, as well as information relating to those scheduled sessions, such as the parties to be involved, their contact information, and the communication devices to be used by those individuals. The session schedule records may include links to data relating to any of the involved parties, including inmates, visitors, and correctional staff (if needed). In an embodiment, a separate data store can be used to store the session records. Further, in an embodiment, the session records may include a link to the original session reservation, any recordings or transcripts of the session, and/or a list of the actual resources used, if they differ from the scheduled resources.


The scheduling data store 560 also stores monitoring station assignments including historical, present, and future assignments for reviewers. In particular, the monitoring station assignments can include information indicating audio or video sessions that have or will be reviewed, the identity of reviewer(s) that have been or are scheduled to review the audio or video sessions, monitoring station identification, date/time and duration of a review, etc.
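
The following sketch shows one possible shape for the scheduling records described above: a scheduled-session record and a monitoring-station assignment record. The field names and types are assumptions made for illustration; the actual record layouts are not specified in the disclosure.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class ScheduledSession:
    """Illustrative schedule record for a session that has not yet occurred."""
    session_id: str
    inmate_id: str
    outsider_id: str
    start_time: datetime
    devices: List[str] = field(default_factory=list)   # communication devices to be used
    recording_path: Optional[str] = None                # filled in once the session is recorded

@dataclass
class MonitoringAssignment:
    """Links a reviewer and a monitoring station to a scheduled session."""
    session_id: str
    reviewer_id: str
    station_id: str
    review_start: datetime
    review_duration_minutes: int = 0
```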


Audio data store 570 can store audio files created from the audio session data received from the communication center 120. The audio data store 570 can also store modified audio files such as those that have been reviewed and annotated. The audio data store 570 may function as a temporary storage for an audio file in situations such as during creation of a non-modified audio file or modified audio file. The audio data store 570 may be used as the primary storage for the audio server 302 or used as a backup for the audio server 302.


Video data store 580 can store video files created from the video session data received from the communication center 120. The video data store 580 can also store modified video files such as those that have been reviewed and annotated. The video data store 580 may function as a temporary storage for a video file in situations such as during creation of a non-modified video file or modified video file. The video data store 580 may be used as the primary storage for the video server 304 or used as a backup for the video server 304.


Because the data stored on the data stores 510-580, especially audio and video files, consumes significant amounts of storage space, this data can be stored on a Network Attached Storage (NAS) device 590 configured as a mass storage device. The data stores 510-580 may include links and/or pointers to recording data located on the NAS device 590. In order to reduce the required size of the NAS device 590, the NAS device preferably includes a backup routine to transfer recording data to permanent storage devices, such as archival permanent storage or optical disks, after a predetermined time has elapsed since the initial recording of that data. The NAS device 590 is connected to the data stores by way of the network 501.
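
A minimal sketch, under assumed paths and an assumed retention period, of the backup routine described above, which moves recordings from the NAS device 590 to permanent archival storage once a predetermined time has elapsed since their initial recording.

```python
import shutil
import time
from pathlib import Path

def archive_old_recordings(nas_dir: str, archive_dir: str, max_age_days: int = 90) -> None:
    """Move recordings older than max_age_days from the NAS to permanent storage.
    The directory layout and the 90-day threshold are illustrative assumptions."""
    cutoff = time.time() - max_age_days * 24 * 3600
    archive = Path(archive_dir)
    archive.mkdir(parents=True, exist_ok=True)
    for recording in Path(nas_dir).glob("*.*"):
        if recording.stat().st_mtime < cutoff:      # older than the retention window
            shutil.move(str(recording), str(archive / recording.name))
```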


Monitoring Station



FIG. 6 illustrates a monitoring station 600, according to exemplary embodiments of the present disclosure. The monitoring station 600 may be an exemplary embodiment of the monitoring station 204a or 204b of FIG. 2. The monitoring station 600 functions as a remote reviewing center in the monitoring system 100. The monitoring station 600 may include a computer, tablet, or phone capable of viewing a user interface and includes one or more central processing units (CPUs) 610 connected via a bus 601 to several other peripherals. Such peripherals include an input device, such as a keyboard and/or mouse 620 and a camera and/or microphone 628, a monitor 622 for displaying a user interface (e.g., FIG. 7), and a network interface card 624 and/or a modem 626 that provide network connectivity and communication.


The monitoring station 600 also includes internal data storage 630. This data storage 630 is non-volatile storage, such as one or more magnetic hard disk drives (HDDs) or solid state drives (SSDs). The data storage 630 is used to store a variety of important files, documents, or other digital information, such as the operating system files, application files, user data, and/or temporary recording space.


The monitoring station 600 also includes system memory 640. The system memory 640 is preferably faster and more efficient than the data storage 630, and is configured as random access memory (RAM) in an embodiment. The system memory 640 contains the runtime environment of the monitoring station, storing temporary data for any of the operating system 642, Java virtual machine 644, Java application server 646, and monitoring station control logic 648.



FIG. 7 illustrates a user interface 700 of a monitoring station, according to exemplary embodiments of the present disclosure. The user interface 700 may represent an exemplary embodiment of a user interface for the monitoring station 600 of FIG. 6. The user interface 700 includes an interface window 702 having different panels to facilitate review of a session between an inmate and an outsider. The interface window 702 includes a communication panel 710, a message panel 720, a data panel 730, and a control panel 740. The communication panel 710 includes a recordation panel 712 used to play, pause, rewind, and fast forward a communication. The recordation panel 712 includes time indicators such as a current time or a duration of the communication. For video calls, the communication panel 710 includes multiple screens 714 and 716 to show the inmate and the outsider. The screens 714 and 716 may include indicators of the inmate's name and the outsider's name and/or other identifying information of the parties involved.


The message panel 720 facilitates a reviewer in taking notes during a communication. The message panel 720 may include an outgoing message area 722 where a reviewer inserts a message by way of an input device, such as the keyboard 620 of FIG. 6. The outgoing message area 722 facilitates a reviewer in performing tasks such as entering a message into a record or sending a message to an administrator when the reviewer presses the enter button 724. The message panel 720 also includes an incoming message window 726 that displays messages received from the monitoring server 202. For example, the incoming message window 726 can display a code that the reviewer has to enter into the message area 722 to verify that the reviewer is performing monitoring duties. The incoming message window 726 may also display annotations/bookmarks or alerts created by a previous reviewer or that were automatically created by the audio server 302 or the video server 304 so as to alert the reviewer to suspicious behavior occurring at a particular time in the call. For example, while a reviewer is reviewing a session, the incoming message window 726 can display an annotation at the time during the session that a previous reviewer made the annotation.


The data panel 730 provides a reviewer with access to data that may be necessary for monitoring a communication. The data panel 730 is populated by data from the data stores 510-590. The data panel 730 can include an inmate data button 732, an outsider data button 734, a real-time communication data button 736, and a historical communications data button 738. When a reviewer pushes one of the buttons, data pertaining to the specific button is viewed in a data window. For example, as shown by FIG. 7, a reviewer may view historical communications of the inmate by clicking on the historical communications data button 738 to view data in the viewing window 739 pertaining to prior communications. The data viewed in the data window, for example viewing window 739, is populated from data stored on a data storage used by the monitoring server 202.


The control panel 740 includes controls for performing tasks while a reviewer monitors a session. The control panel 740 can include buttons for searching a record of the inmate, creating a bookmark at a specific time of the communication, creating an annotation at a specific time of the communication, contacting an administrator of the inmate, completing the review, and/or changing settings.


Monitoring Center Operation


The monitoring center 200 provides monitoring capabilities for a third-party reviewer and capabilities for supervising the reviewer. An operation of the monitoring center 200 will be described with respect to FIG. 8, which illustrates a flowchart diagram of a method 800 for monitoring a call, according to exemplary embodiments of the present disclosure. Although the physical devices and components that form the system have largely already been described, additional details regarding their more nuanced operation will be described below with respect to FIGS. 1-7.


The monitoring center 200 begins a monitoring procedure (802). The monitoring server 202 receives session data associated with a scheduled session, either a voice or video session, from the communication center 120 (804). In an embodiment, the session data may be received prior to the session or simultaneously with the session. However, as described for this exemplary embodiment, the session data is received prior to receipt of the associated session. Upon receiving the session data, the monitoring center may assign an identification to the session data. The identification is used for scheduling purposes and associates the scheduled session with inmate records.


The monitoring server 202 then schedules a reviewer to the scheduled session (806). In scheduling a reviewer, the monitoring server 202 coordinates between such stored data as the inmate data, reviewer availability, outsider data, or historical communication data. For example, the monitoring server 202 can schedule a reviewer based on whether a real-time monitoring is required for a particular inmate or outsider per the data stored in the inmate data store 520, the outsider data store 530, and the historical communication data store 550.
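
As an illustrative sketch of the scheduling step (806), the function below selects the first available reviewer, requiring real-time availability when the inmate's stored record calls for live monitoring. The record layouts and field names are assumptions, not the disclosed data format.

```python
from typing import Iterable, Optional

def schedule_reviewer(reviewers: Iterable[dict], inmate_record: dict) -> Optional[dict]:
    """Pick a reviewer for a scheduled session (step 806), preferring live
    monitoring when the inmate's stored record requires it (record layouts
    are assumptions made for illustration)."""
    needs_realtime = inmate_record.get("requires_realtime_monitoring", False)
    for reviewer in reviewers:
        if not reviewer.get("available", False):
            continue
        if needs_realtime and not reviewer.get("available_for_realtime", False):
            continue
        return reviewer   # first suitable reviewer wins in this sketch
    return None           # no reviewer free; the session is recorded for later review
```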


The monitoring server 202 receives the scheduled session (808) and begins recording and processing the session and session data (810). For example, the monitoring server 202 can process the session to link received communication data such that the inmate's phone number and the outsider's phone number are linked to the session. Further, the monitoring server 202 can process the session by inserting timestamps within, or retrieving timestamps from, a session file. For example, the monitoring server 202 can insert a timestamp at predetermined segment lengths (e.g., every 10 seconds) of the session. The timestamps facilitate synchronization with annotations, bookmarks, alerts, etc. received from monitoring stations such that annotations/bookmarks/alerts from multiple reviewers may be viewed according to the time created in relation to the session. During the recording and processing of the session, the monitoring server 202 can simultaneously transmit the session data to at least one of the monitoring stations 204a and/or 204b (812).
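
A short sketch of the timestamping described above, assuming a fixed 10-second segment length: one helper yields the timestamps inserted into a session file, and another snaps a reviewer's annotation, bookmark, or alert to the nearest inserted timestamp so that input from multiple reviewers lines up against the same session clock. Both helpers are illustrative assumptions.

```python
def segment_timestamps(duration_seconds: float, interval: float = 10.0):
    """Yield the timestamps (in seconds) inserted at predetermined segment
    lengths, e.g. every 10 seconds."""
    n = 0
    while n * interval <= duration_seconds:
        yield n * interval
        n += 1

def nearest_segment(offset_seconds: float, interval: float = 10.0) -> float:
    """Snap an annotation/bookmark/alert to the nearest inserted timestamp so
    that input from multiple reviewers is aligned to the same session clock."""
    return round(offset_seconds / interval) * interval
```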


Next, the session is reviewed at the monitoring station(s) 204a and/or 204b by a reviewer at 814. During the review of the session, which may be real-time or pre-recorded, the reviewer, by way of the user interface 700, can pause, rewind, and fast forward the session. Using the monitoring station, the reviewer can also view inmate data, outsider data, real-time communication data, and/or historical communication data if needed on the user interface.


As will be explained in more detail below, the monitoring server 202 and the monitoring stations 204a and 204b may further monitor the reviewer's progress on reviewing the session (814). After the session has been reviewed at (814), the monitoring server 202 ends the monitoring procedure (816).



FIG. 9 illustrates a flowchart diagram of a method 900 for reviewing a session and monitoring a reviewer, according to exemplary embodiments of the present disclosure. The method 900 may represent an exemplary embodiment of reviewing of a session (814) of FIG. 8. A review procedure can be performed by the monitoring server 202 and a monitoring station or by a monitoring station alone, as will be described in detail below. The following description will discuss a review procedure performed by the monitoring server 202 in conjunction with a monitoring station. When the monitoring server 202 transmits a session, either real-time or pre-recorded, and session data to a monitoring station, such as monitoring stations 204a and/or 204b, the review procedure begins (902).


The monitoring server 202 determines whether input data, such as an annotation, a flag, or a bookmark, has been received from the monitoring station 204a or 204b (904). If it is determined that an annotation/bookmark has been received, the monitoring server 202 stores the annotation/bookmark, tags the annotation/bookmark with an identifier, and links the annotation/bookmark with the session (906) such that the annotation/bookmark can be viewed separately from the session or viewed in conjunction with the session at the associated time of the session when the annotation/bookmark was created. The monitoring server 202 can match a timestamp of the session with a timestamp of when the annotation/bookmark was made by the reviewer. Synchronizing the annotation/bookmark facilitates future review of the session along with the annotation/bookmark. For example, if a first reviewer using the monitoring station 204a makes an annotation at a timestamp of 3 minutes into the call, a second reviewer can either view the annotation as a separate note or view the annotation in the incoming message window 726 of FIG. 7 at the 3-minute mark of the session.
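
The sketch below illustrates one way the annotation/bookmark handling of steps 904 and 906 might be represented: each note is tagged with an identifier, linked to the session and to an offset within the session, and can later be retrieved when playback reaches that offset (for example, the 3-minute mark in the example above). The SessionNote record and the helper functions are assumptions made for illustration.

```python
import uuid
from dataclasses import dataclass

@dataclass
class SessionNote:
    """Illustrative annotation/bookmark record; field names are assumptions."""
    note_id: str
    session_id: str
    reviewer_id: str
    offset_seconds: float     # position in the session the note refers to
    text: str

def store_note(notes: list, session_id: str, reviewer_id: str,
               offset_seconds: float, text: str) -> SessionNote:
    """Tag the note with an identifier and link it to the session (step 906)."""
    note = SessionNote(str(uuid.uuid4()), session_id, reviewer_id, offset_seconds, text)
    notes.append(note)
    return note

def notes_at(notes: list, session_id: str, offset_seconds: float, window: float = 1.0):
    """Return notes to surface in the incoming message window when playback of
    the session reaches offset_seconds (e.g. the 3-minute mark in the example)."""
    return [n for n in notes
            if n.session_id == session_id
            and abs(n.offset_seconds - offset_seconds) <= window]
```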


After storing and linking the annotation/bookmark, the monitoring server 202 determines whether the review procedure is finished (914). The determination of whether the review is finished can be based on multiple factors. Such factors may include whether the communication has ended and/or whether the reviewer has ended the monitoring session.


If no annotation/bookmark has been received, the monitoring server 202 determines whether additional input data, such as an alert, has been received from the monitoring station 204a or 204b (908). If it is determined that an alert has been received, the monitoring server 202 stores the alert and links the alert with the call (910). Similar to a received annotation/bookmark, the monitoring server 202 synchronizes the alert such that, when the session is viewed at a later time, the alert can be viewed at the particular time of the session at which it was created. After storing and linking the alert, the monitoring server 202 determines whether the review procedure is finished (914).


Aside from providing recorded notes of a session, the annotations, bookmarks, and alerts also facilitate the monitoring of a reviewer. For example, when an annotation is received from a monitoring station, the monitoring server 202 determines that the reviewer is attentive and reviewing the session. However, when no annotation, bookmark, or alert is received from a monitoring station within a predetermined time, the monitoring server 202 presumes that the reviewer is not properly monitoring the session. Accordingly, the monitoring server 202 can perform a monitor verification (912). The verification may be performed by the monitoring server 202 transmitting a verification command to the monitoring station that requires the reviewer at the monitoring station 204a or 204b to perform a simple task. For example, the monitoring server 202 may transmit a message, such as shown in the incoming message window 726 of FIG. 7, that requires the reviewer to enter a code. As another example, the reviewer may be required to perform a biometric verification such as looking or speaking into a camera or microphone (e.g., the camera/microphone 628 of FIG. 6) for retinal, facial, and/or voice verification. The monitoring station 204a or 204b transmits a response to the verification command and, based on whether the task is completed, the monitoring server 202 can determine whether or not to finish the review procedure (914).
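
A minimal sketch of the monitor verification (912), assuming a timeout threshold and a simple code-entry task; the 10-minute threshold and the send_message/read_reply callables standing in for the monitoring-station interface are assumptions made for illustration.

```python
import random
import time

def verification_needed(last_input_time: float, timeout_seconds: float = 600.0) -> bool:
    """True when no annotation, bookmark, or alert has arrived within the
    predetermined window (the 10-minute value is an assumption)."""
    return time.time() - last_input_time > timeout_seconds

def issue_code_challenge(send_message, read_reply) -> bool:
    """Send a code to the monitoring station and check the reviewer's reply
    (step 912); send_message/read_reply stand in for the station interface."""
    code = f"{random.randint(0, 9999):04d}"
    send_message(f"Attention check: enter code {code} to continue monitoring.")
    return read_reply().strip() == code
```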


In an embodiment, the review process may be performed entirely by a monitoring station. For example, the monitoring station 204a may receive a session, either real-time or pre-recorded, and begin the review procedure (902). In receiving the session, the monitoring station 204a can download the session onto a local storage of the monitoring station 204a. Next, the monitoring station 204a can determine whether an annotation/bookmark is input by a reviewer (904). When it is determined that an annotation/bookmark has been input, the monitoring station 204a can store the annotation/bookmark on a local storage, tag the annotation/bookmark with an identifier, and synchronize the annotation/bookmark, by timestamp, with the session. Similarly, the monitoring station 204a can determine whether the reviewer wants to send an alert to an administrator (908). If it is determined that an alert is to be sent, the monitoring station 204a stores the alert on a local storage, tags the alert with an identifier, links the alert to a timestamp of the session, and sends the alert to the administrator.


Similar to the monitoring server 202, if no annotation/bookmark has been input or no alert has been submitted, the monitoring station 204a can perform a monitor verification (912) to determine whether the reviewer is attentive. The monitoring station 204a can then determine whether the reviewer is being attentive based on the results of the monitor verification.


When the monitoring station 204a has determined that the review is finished (914), the monitoring station 204a uploads the annotations, bookmarks, and alerts to the monitoring server 202, and the review procedure is finished (916).


Exemplary Computer Implementation


It will be apparent to persons skilled in the relevant art(s) that various elements and features of the present disclosure, as described herein, can be implemented in hardware using analog and/or digital circuits, in software, through the execution of computer instructions by one or more general purpose or special-purpose processors, or as a combination of hardware and software.


The following description of a general purpose computer system is provided for the sake of completeness. Embodiments of the present disclosure can be implemented in hardware, or as a combination of software and hardware. Consequently, embodiments of the disclosure may be implemented in the environment of a computer system or other processing system. For example, the methods of FIGS. 8 and 9 can be implemented in the environment of one or more computer systems or other processing systems. An example of such a computer system 1000 is shown in FIG. 10. One or more of the modules depicted in the previous figures can be at least partially implemented on one or more distinct computer systems 1000.


Computer system 1000 includes one or more processors, such as processor 1004. Processor 1004 can be a special purpose or a general purpose digital signal processor. Processor 1004 is connected to a communication infrastructure 1002 (for example, a bus or network). Various software implementations are described in terms of this exemplary computer system. After reading this description, it will become apparent to a person skilled in the relevant art(s) how to implement the disclosure using other computer systems and/or computer architectures.


Computer system 1000 also includes a main memory 1006, preferably random access memory (RAM), and may also include a secondary memory 1008. Secondary memory 1008 may include, for example, a hard disk drive 1010 and/or a removable storage drive 1012, representing a floppy disk drive, a magnetic tape drive, an optical disk drive, or the like. Removable storage drive 1012 reads from and/or writes to a removable storage unit 1016 in a well-known manner. Removable storage unit 1016 represents a floppy disk, magnetic tape, optical disk, or the like, which is read by and written to by removable storage drive 1012. As will be appreciated by persons skilled in the relevant art(s), removable storage unit 1016 includes a computer usable storage medium having stored therein computer software and/or data.


In alternative implementations, secondary memory 1008 may include other similar means for allowing computer programs or other instructions to be loaded into computer system 1000. Such means may include, for example, a removable storage unit 1018 and an interface 1014. Examples of such means may include a program cartridge and cartridge interface (such as that found in video game devices), a removable memory chip (such as an EPROM, or PROM) and associated socket, a thumb drive and USB port, and other removable storage units 1018 and interfaces 1014 which allow software and data to be transferred from removable storage unit 1018 to computer system 1000.


Computer system 1000 may also include a communications interface 1020.


Communications interface 1020 allows software and data to be transferred between computer system 1000 and external devices. Examples of communications interface 1020 may include a modem, a network interface (such as an Ethernet card), a communications port, a PCMCIA slot and card, etc. Software and data transferred via communications interface 1020 are in the form of signals which may be electronic, electromagnetic, optical, or other signals capable of being received by communications interface 1020. These signals are provided to communications interface 1020 via a communications path 1022. Communications path 1022 carries signals and may be implemented using wire or cable, fiber optics, a phone line, a cellular phone link, an RF link and other communications channels.


As used herein, the terms “computer program medium” and “computer readable medium” are used to generally refer to tangible storage media such as removable storage units 1016 and 1018 or a hard disk installed in hard disk drive 1010. These computer program products are means for providing software to computer system 1000.


Computer programs (also called computer control logic) are stored in main memory 1006 and/or secondary memory 1008. Computer programs may also be received via communications interface 1020. Such computer programs, when executed, enable the computer system 1000 to implement the present disclosure as discussed herein. In particular, the computer programs, when executed, enable processor 1004 to implement the processes of the present disclosure, such as any of the methods described herein. Accordingly, such computer programs represent controllers of the computer system 1000. Where the disclosure is implemented using software, the software may be stored in a computer program product and loaded into computer system 1000 using removable storage drive 1012, interface 1014, or communications interface 1020.


In another embodiment, features of the disclosure are implemented primarily in hardware using, for example, hardware components such as application-specific integrated circuits (ASICs) and gate arrays. Implementation of a hardware state machine so as to perform the functions described herein will also be apparent to persons skilled in the relevant art(s).


CONCLUSION

It is to be appreciated that the Detailed Description section, and not the Abstract section, is intended to be used to interpret the claims. The Abstract section may set forth one or more, but not all exemplary embodiments, and thus, is not intended to limit the disclosure and the appended claims in any way.


The invention has been described above with the aid of functional building blocks illustrating the implementation of specified functions and relationships thereof. The boundaries of these functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternate boundaries may be defined so long as the specified functions and relationships thereof are appropriately performed.


It will be apparent to those skilled in the relevant art(s) that various changes in form and detail can be made therein without departing from the spirit and scope of the disclosure. Thus, the invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.

Claims
  • 1. A monitoring server for monitoring an inmate communication session, the monitoring server comprising: at least one processor configured to: receive the inmate communication session; transmit communication session data associated with the inmate communication session to a monitoring station selected from among a plurality of monitoring stations to monitor the inmate communication session; schedule a reviewer to monitor the inmate communication session based on a user profile of a user of the inmate communication session and reviewer availability data of the reviewer; receive input data associated with the inmate communication session from the monitoring station; interrupt or disconnect the inmate communication session based on the input data; and superimpose an audible or visual message that is played to parties of the inmate communication session on the inmate communication session describing a reason for interrupting or disconnecting the inmate communication session.
  • 2. The monitoring server of claim 1, wherein the at least one processor is further configured to configure the user profile.
  • 3. The monitoring server of claim 2, wherein the user profile indicates that interrupting and disconnecting the inmate communication session is allowed.
  • 4. The monitoring server of claim 2, wherein a memory of the monitoring server stores the user profile.
  • 5. The monitoring server of claim 1, wherein the user profile indicates that the inmate communication session requires real-time monitoring.
  • 6. The monitoring server of claim 1, wherein the reviewer availability data indicates that the reviewer is available for real-time monitoring.
  • 7. A monitoring center for monitoring an inmate communication session, the monitoring center comprising: a monitoring server configured to: receive the inmate communication session; schedule a reviewer to monitor the inmate communication session based on a user profile of a user of the inmate communication session and reviewer availability data of the reviewer, and a monitoring station configured to: monitor communication session data associated with the inmate communication session for suspicious behavior occurring; and provide input data to the monitoring server notifying of the suspicious behavior in response to detecting the suspicious behavior occurring within the inmate communication session, wherein the monitoring server is further configured to interrupt or disconnect the inmate communication session based on the input data, and superimpose an audible or visual message that is played to parties of the inmate communication session on the inmate communication session describing a reason for interrupting or disconnecting the inmate communication session.
  • 8. The monitoring center of claim 7, wherein the monitoring server is further configured to configure the user profile.
  • 9. The monitoring center of claim 8, wherein the user profile indicates that interrupting and disconnecting the inmate communication session is allowed.
  • 10. The monitoring center of claim 8, wherein the monitoring server is further configured to store the user profile in a memory of the monitoring server.
  • 11. The monitoring center of claim 7, wherein the user profile indicates that the inmate communication session requires real-time monitoring.
  • 12. The monitoring center of claim 7, wherein the reviewer availability data indicates that the reviewer is available for real-time monitoring.
  • 13. A monitoring server for monitoring an inmate communication session, the monitoring server comprising: a storage device; a video server configured to: decrypt and record the inmate communication session when the inmate communication session corresponds to a video communication session, convert the inmate communication session in real time to a format compatible with at least one monitoring station, and encrypt and transmit the video communication session to the at least one monitoring station to monitor the video communication session; and an application server configured to: schedule a reviewer to monitor the inmate communication session based on a user profile of a user of the inmate communication session and reviewer availability data of the reviewer; receive input data associated with the inmate communication session from the at least one monitoring station, interrupt or disconnect the inmate communication session based on the input data, and superimpose an audible or visual message that is played to parties of the inmate communication session on the inmate communication session describing a reason for interrupting or disconnecting the inmate communication session.
  • 14. The monitoring server of claim 13, wherein the application server is further configured to: configure the user profile; and store the user profile in the storage device.
  • 15. The monitoring server of claim 14, wherein the user profile indicates that interrupting and disconnecting the inmate communication session is allowed.
  • 16. The monitoring server of claim 7, wherein the user profile indicates that the inmate communication session requires real-time monitoring.
  • 17. The monitoring server of claim 7, wherein the reviewer availability data indicates that the reviewer is available for real-time monitoring.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation application of U.S. application Ser. No. 16/391,954, filed Apr. 23, 2019, which is a continuation application of U.S. application Ser. No. 15/882,321, filed Jan. 29, 2018, now U.S. Pat. No. 10,277,640, which is a continuation application of U.S. application Ser. No. 15/594,150, filed Mar. 30, 2018, now U.S. Pat. No. 9,923,936, which is a continuation application of U.S. application Ser. No. 15/331,414, filed Oct. 21, 2016, now U.S. Pat. No. 9,674,340, which is a continuation application of U.S. application Ser. No. 15/093,300 filed Apr. 7, 2016, now U.S. Pat. No. 9,609,121, each of which is incorporated herein by reference in its entirety.

6898612 Parra et al. May 2005 B1
6907387 Reardon Jun 2005 B1
6920209 Gainsboro Jul 2005 B1
6947525 Benco Sep 2005 B2
6970554 Peterson et al. Nov 2005 B1
7032007 Fellenstein et al. Apr 2006 B2
7035386 Susen et al. Apr 2006 B1
7039171 Gickler May 2006 B2
7039585 Wilmot et al. May 2006 B2
7046779 Hesse May 2006 B2
7050918 Pupalaikis et al. May 2006 B2
7062286 Grivas et al. Jun 2006 B2
7075919 Wendt et al. Jul 2006 B1
7079636 McNitt et al. Jul 2006 B1
7079637 McNitt et al. Jul 2006 B1
7092494 Anders et al. Aug 2006 B1
7103549 Bennett et al. Sep 2006 B2
7106843 Gainsboro Sep 2006 B1
7123704 Martin Oct 2006 B2
7133511 Buntin et al. Nov 2006 B2
7133828 Scarano et al. Nov 2006 B2
7133845 Ginter et al. Nov 2006 B1
7149788 Gundla et al. Dec 2006 B1
7191133 Pettay Mar 2007 B1
7197560 Caslin et al. Mar 2007 B2
7236932 Grajski Jun 2007 B1
7248685 Martin Jul 2007 B2
7256816 Profanchik et al. Aug 2007 B2
7277468 Tian et al. Oct 2007 B2
7280816 Fratti et al. Oct 2007 B2
7324637 Brown et al. Jan 2008 B2
7333798 Hodge Feb 2008 B2
7366782 Chong et al. Apr 2008 B2
7406039 Cherian et al. Jul 2008 B2
7417983 He et al. Aug 2008 B2
7424715 Dutton Sep 2008 B1
7466816 Blair Dec 2008 B2
7494061 Reinhold Feb 2009 B2
7496345 Rae et al. Feb 2009 B1
7505406 Spadaro et al. Mar 2009 B1
7519169 Hingoranee et al. Apr 2009 B1
7529357 Rae et al. May 2009 B1
7551732 Anders Jun 2009 B2
7596498 Basu et al. Sep 2009 B2
7639791 Hodge Dec 2009 B2
7664243 Martin Feb 2010 B2
7672845 Beranek et al. Mar 2010 B2
RE41190 Darling Apr 2010 E
7698182 Falcone et al. Apr 2010 B2
7742581 Hodge et al. Jun 2010 B2
7742582 Harper Jun 2010 B2
7783021 Hodge Aug 2010 B2
7804941 Keiser et al. Sep 2010 B2
7826604 Martin Dec 2010 B2
7848510 Shaffer et al. Dec 2010 B2
7853243 Hodge Dec 2010 B2
7860222 Sidler et al. Dec 2010 B1
7881446 Apple et al. Feb 2011 B1
7899167 Rae Mar 2011 B1
7961860 McFarlen Jun 2011 B1
7973043 Migaly Jul 2011 B2
8031052 Polozola Oct 2011 B2
8135115 Hogg, Jr. et al. Mar 2012 B1
8204177 Harper Jun 2012 B2
8295446 Apple et al. Oct 2012 B1
8458732 Hanna et al. Jun 2013 B2
8488756 Hodge et al. Jul 2013 B2
8498937 Shipman, Jr. et al. Jul 2013 B1
8509390 Harper Aug 2013 B2
8577003 Rae Nov 2013 B2
8630726 Hodge et al. Jan 2014 B2
8731934 Olligschlaeger et al. May 2014 B2
8756065 Melamed et al. Jun 2014 B2
8806020 Lewis et al. Aug 2014 B1
8886663 Gainsboro et al. Nov 2014 B2
8917848 Torgersrud et al. Dec 2014 B2
8929525 Edwards Jan 2015 B1
9020115 Hangsleben Apr 2015 B2
9043813 Hanna et al. May 2015 B2
9077680 Harper Jul 2015 B2
9094500 Edwards Jul 2015 B1
9143609 Hodge Sep 2015 B2
9232051 Torgersrud et al. Jan 2016 B2
9307386 Hodge et al. Apr 2016 B2
9396320 Lindemann Jul 2016 B2
9552417 Olligschlaeger et al. Jan 2017 B2
9609121 Hodge Mar 2017 B1
9615060 Hodge Apr 2017 B1
9621504 Torgersrud et al. Apr 2017 B2
9674340 Hodge Jun 2017 B1
9800830 Humpries Oct 2017 B2
9923936 Hodge Mar 2018 B2
10027797 Hodge et al. Jul 2018 B1
10120919 Olligschlaeger et al. Nov 2018 B2
10225396 Hodge Mar 2019 B2
10277640 Hodge Apr 2019 B2
10715565 Hodge Jul 2020 B2
20010009547 Jinzaki et al. Jul 2001 A1
20010036821 Gainsboro et al. Nov 2001 A1
20010043697 Cox et al. Nov 2001 A1
20010056349 St. John Dec 2001 A1
20010056461 Kampe et al. Dec 2001 A1
20020002464 Pertrushin Jan 2002 A1
20020010587 Pertrushin Jan 2002 A1
20020032566 Tzirkel-Hancock et al. Mar 2002 A1
20020046057 Ross Apr 2002 A1
20020067272 Lemelson et al. Jun 2002 A1
20020069084 Donovan Jun 2002 A1
20020076014 Holtzberg Jun 2002 A1
20020107871 Wyzga et al. Aug 2002 A1
20020147707 Kraay et al. Oct 2002 A1
20020174183 Saeidi Nov 2002 A1
20030002639 Huie Jan 2003 A1
20030023444 St. John Jan 2003 A1
20030023874 Prokupets et al. Jan 2003 A1
20030035514 Jang Feb 2003 A1
20030040326 Levy et al. Feb 2003 A1
20030070076 Michael Apr 2003 A1
20030086546 Falcone et al. May 2003 A1
20030093533 Ezerzer et al. May 2003 A1
20030099337 Lord May 2003 A1
20030117280 Prehn Jun 2003 A1
20030126470 Crites et al. Jul 2003 A1
20030174826 Hesse Sep 2003 A1
20030190045 Huberman et al. Oct 2003 A1
20040008828 Coles et al. Jan 2004 A1
20040029564 Hodge Feb 2004 A1
20040081296 Brown et al. Apr 2004 A1
20040161086 Buntin et al. Aug 2004 A1
20040169683 Chiu et al. Sep 2004 A1
20040249650 Freedman et al. Dec 2004 A1
20040252184 Hesse et al. Dec 2004 A1
20040252447 Hesse et al. Dec 2004 A1
20050010411 Rigazio et al. Jan 2005 A1
20050027723 Jones et al. Feb 2005 A1
20050080625 Bennett et al. Apr 2005 A1
20050094794 Creamer et al. May 2005 A1
20050102371 Aksu May 2005 A1
20050114192 Tor et al. May 2005 A1
20050125226 Magee Jun 2005 A1
20050128283 Bulriss et al. Jun 2005 A1
20050141678 Anders et al. Jun 2005 A1
20050144004 Bennett et al. Jun 2005 A1
20050170818 Netanel et al. Aug 2005 A1
20050182628 Choi Aug 2005 A1
20050207357 Koga Sep 2005 A1
20060064037 Shalon et al. Mar 2006 A1
20060087554 Boyd et al. Apr 2006 A1
20060087555 Boyd et al. Apr 2006 A1
20060093099 Cho May 2006 A1
20060198504 Shemisa et al. Sep 2006 A1
20060200353 Bennett Sep 2006 A1
20060239296 Jinzaki et al. Oct 2006 A1
20060285650 Hodge Dec 2006 A1
20060285665 Wasserblat et al. Dec 2006 A1
20070003026 Hodge et al. Jan 2007 A1
20070011008 Scarano et al. Jan 2007 A1
20070041545 Gainsboro Feb 2007 A1
20070047734 Frost Mar 2007 A1
20070071206 Gainsboro et al. Mar 2007 A1
20070133437 Wengrovitz et al. Jun 2007 A1
20070185717 Bennett Aug 2007 A1
20070192174 Bischoff Aug 2007 A1
20070195703 Boyajian et al. Aug 2007 A1
20070237099 He et al. Oct 2007 A1
20070244690 Peters Oct 2007 A1
20080000966 Keiser Jan 2008 A1
20080021708 Bennett et al. Jan 2008 A1
20080046241 Osburn et al. Feb 2008 A1
20080096178 Rogers et al. Apr 2008 A1
20080106370 Perez et al. May 2008 A1
20080118045 Polozola et al. May 2008 A1
20080195387 Zigel et al. Aug 2008 A1
20080198978 Olligschlaeger Aug 2008 A1
20080201143 Olligschlaeger et al. Aug 2008 A1
20080201158 Johnson et al. Aug 2008 A1
20080260133 Hodge et al. Oct 2008 A1
20080300878 Bennett Dec 2008 A1
20090306981 Cromack et al. Dec 2009 A1
20100177881 Hodge Jul 2010 A1
20100194563 Berner et al. Aug 2010 A1
20100202595 Hodge et al. Aug 2010 A1
20100299761 Shapiro Nov 2010 A1
20110055256 Phillips et al. Mar 2011 A1
20110244440 Saxon et al. Oct 2011 A1
20110260870 Bailey Oct 2011 A1
20110279228 Kumar et al. Nov 2011 A1
20120041911 Pestian et al. Feb 2012 A1
20120262271 Torgersrud et al. Oct 2012 A1
20130104246 Bear et al. Apr 2013 A1
20130124192 Lindmark et al. May 2013 A1
20130179949 Shapiro Jul 2013 A1
20130263227 Gongaware Oct 2013 A1
20130336469 Olligschlaeger Dec 2013 A1
20140247926 Gainsboro et al. Sep 2014 A1
20140273929 Torgersrud Sep 2014 A1
20140287715 Hodge et al. Sep 2014 A1
20140313275 Gupta et al. Oct 2014 A1
20140334610 Hangsleben Nov 2014 A1
20150051893 Ratcliffe, III et al. Feb 2015 A1
20150077245 Kaufman et al. Mar 2015 A1
20150206417 Bush Jul 2015 A1
20150215254 Bennett Jul 2015 A1
20150221151 Bacco et al. Aug 2015 A1
20150281431 Gainsboro et al. Oct 2015 A1
20150281433 Gainsboro et al. Oct 2015 A1
20150332186 Torgersrud Nov 2015 A1
20160191484 Gongaware Jun 2016 A1
20160224538 Chandrasekar et al. Aug 2016 A1
20160239932 Sidler et al. Aug 2016 A1
20160277302 Olsen Sep 2016 A1
20160301728 Keiser et al. Oct 2016 A1
20160371756 Yokel et al. Dec 2016 A1
20160373909 Rasmussen et al. Dec 2016 A1
20170034355 Kamat et al. Feb 2017 A1
20170270627 Hodge Sep 2017 A1
20170295212 Hodge Oct 2017 A1
20180167421 Hodge Jun 2018 A1
20180227625 Yoshizawa et al. Aug 2018 A1
20180293682 Hodge Oct 2018 A1
20180338036 Hodge Nov 2018 A1
20180349335 Hodge Dec 2018 A1
Foreign Referenced Citations (11)
Number Date Country
1280137 Dec 2004 EP
2579676 Apr 2013 EP
2075313 Nov 1981 GB
59225626 Dec 1984 JP
60010821 Jan 1985 JP
61135239 Jun 1986 JP
3065826 Mar 1991 JP
WO 9614703 Nov 1995 WO
WO 9813993 Apr 1998 WO
WO 2001074042 Oct 2001 WO
WO 2016028864 Feb 2016 WO
Non-Patent Literature Citations (120)
Entry
“Cisco IAD2400 Series Business-Class Integrated Access Device”, Cisco Systems Datasheet, 2003; 8 pages.
“Cisco IAD2420 Series Integrated Access Devices Software Configuration Guide—Initial Configuration,” Cisco Systems, accessed Sep. 23, 2014, accessible at http://www.cisco.com/en/US/docs/routers/access/2400/2420/software/configuration/guide/init_cf.html; 5 pages.
“Hong Kong: Prison Conditions in 1997,” Human Rights Watch, Mar. 1, 1997, C905, available at http://www.refworld.org/docid/3ae6a7d014.html, accessed May 29, 2014; 48 pages.
“PacketCableTM 1.0 Architecture Framework Technical Report”, PKT-TR-ARCH-V01-001201 (Cable Television Laboratories, Inc. 1999).
“PacketCableTM Audio/Video Codecs Specification,” Cable Television Laboratories, Inc., Ser. No. PKT-SP-CODEC-I05-040113 (2004).
“Service-Observing Arrangements Using Key Equipment for Telephone Company Business Offices, Description and Use,” Pac. Tel. & Tel. Co., Bell System Practices, Station Operations Manual, Section C71.090, Issue A, 1-1-57-N, 1957; 8 pages.
“SIP and IPLinkTM in the Next Generation Network: An Overview,” Intel, 2001; 6 pages.
“The AutoEDMS Document Management and Workflow System: An Overview of Key Features, Functions and Capabilities,” ACS Software, May 2003; 32 pages.
“Voice Over Packet in Next Generation Networks: An Architectural Framework,” Bellcore, Special Report SR-4717, Issue 1, Jan. 1999; 288 pages.
“Cool Edit Pro, Version 1.2 User Guide,” Syntrillium Software Corporation, 1998; 226 pages.
“Criminal Calls: A Review of the Bureau of Prisons' Management of Inmate Telephone Privileges,” U.S. Department of Justice, Office of the Inspector General, Aug. 1999; 166 pages.
“Global Call API for Linux and Windows Operating Systems,” Intel Dialogic Library Reference, Dec. 2005; 484 pages.
“The NIST Year 2002 Speaker Recognition Evaluation Plan,” NIST, Feb. 27, 2002, accessible at http://www.itl.nist.gov/iad/mig/tests/spk/2002/2002-spkrecevalplan-v60.pdf; 9 pages.
Aggarwal, et al., “An Environment for Studying Switching System Software Architecture,” IEEE, Global Telecommunications Conference, 1988; 7 pages.
Auckenthaler, et al., “Speaker-Centric Score Normalization and Time Pattern Analysis for Continuous Speaker Verification,” International Conference on Acoustics, Speech, and Signal Processing (ICASSP), vol. 2, Jun. 2000, pp. 1065-1068.
Audacity Team, “About Audacity,” World Wide Web, 2014, accessible at http://wiki.audacity.team.org/wiki/About_Audacity; 3 pages.
Beek et al., “An Assessment of the Technology of Automatic Speech Recognition for Military Applications,” IEEE Trans. Acoustics, Speech, and Signal Processing, vol. ASSP-25, No. 4, 1977; pp. 310-322.
Beigi, et al., “A Hierarchical Approach to Large-Scale Speaker Recognition,” EuroSpeech 1999, Sep. 1999, vol. 5; pp. 2203-2206.
Beigi, et al., “IBM Model-Based and Frame-By-Frame Speaker-Recognition,” Speaker Recognition and its Commercial and Forensic Applications, Apr. 1998; pp. 1-4.
Beigi, H., “Challenges of Large-Scale Speaker Recognition,” 3rd European Cooperation in the Field of Scientific and Technical Research Conference, Nov. 4, 2005; 33 pages.
Beigi, H., “Decision Theory,” Fundamentals of Speaker Recognition, Ch. 9, Springer, US 2011; pp. 313-339.
Bender, et al., “Techniques For Data Hiding,” IBM Systems Journal, vol. 35, Nos. 3&4, 1996; 24 pages.
Boersma, et al., “Praat: Doing Phonetics by computer,” World Wide Web, 2015, accessible at http://www.fon.hum.uva.nl/praat; 2 pages.
Bolton, et al., “Statistical Fraud Detection: A Review,” Statistical Science, vol. 17, No. 3 (2002), pp. 235-255.
Boney, L., et al., "Digital Watermarks for Audio Signals," Proceedings of EUSIPCO-96, Eighth European Signal Processing Conference, Trieste, Italy, 10-13 (1996).
Boney, L., et al., “Digital Watermarks for Audio Signals” Proceedings of the International Conference on Multimedia Computing Systems, p. 473-480, IEEE Computer Society Press, United States (1996).
Bur Goode, Voice Over Internet Protocol (VoIP), Proceedings of the IEEE, vol. 90, No. 9, Sep. 2002; pp. 1495-1517.
Carey, et al., “User Validation for Mobile Telephones,” International Conference on Acoustics, Speech, and Signal Processing (ICASSP), vol. 2, Jun. 2000, pp. 1093-1096.
Chau, et al., “Building an Infrastructure for Law Enforcement Information Sharing and Collaboration: Design Issues and Challenges,” National Conference on Digital Government, 2001; 6 pages.
Chaudhari, et al., “Transformation enhanced multi-grained modeling for text-independent speaker recognition,” International Conference on Spoken Language Processing, 2000, pp. 298-301.
Christel, et al., “Interactive Maps for a Digital Video Library,” IEEE Special Edition on Multimedia Computing, Jan.-Mar. 2000, IEEE, United States; pp. 60-67.
Clavel, et al., “Events Detection for an Audio-Based Surveillance System,” IEEE International Conference on Multimedia and Expo (ICME2005), Jul. 6-8, 2005, pp. 1306-1309.
Coden, et al., “Speech Transcript Analysis for Automatic Search,” Proceedings of the 34th Hawaii International Conference on System Sciences, IEEE, 2001; 9 pages.
Coherent Announces Industry's First Remote Management System for Echo Canceller, Business Wire, Mar. 3, 1997; 3 pages.
Corbato, et al., “Introduction and Overview of the MULTICS System,” Proceedings—Fall Joint Computer Conference, 1965; 12 pages.
Cox, et al.; “Secure Spread Spectrum Watermarking for Multimedia,” NEC Research Institute, Technical Report 95-10, Dec. 1997; 34 pages.
Digital Copy of “Bellcore Notes on the Networks,” Bellcore, Special Report SR-2275, Issue 3, Dec. 1997.
Doddington, G., “Speaker Recognition based on Idiolectal Differences between Speakers,” 7th European Conference on Speech Communication and Technology, Sep. 3-7, 2001; 4 pages.
Dunn, et al., “Approaches to speaker detection and tracking in conversational speech,” Digital Signal Processing, vol. 10, 2000; pp. 92-112.
Dye, Charles, “Oracle Distributed Systems,” O'Reilly Media, Inc., Apr. 1, 1999; 29 pages.
Fischer, Alan D., "COPLINK nabs criminals faster," Arizona Daily Star, Jan. 7, 2001; 5 pages.
Fleischman, E., “Advanced Streaming Format (ASF) Specification,” Microsoft Corporation, Jan. 9, 1998; 78 pages.
Fox, B., “The First Amendment Rights of Prisoners,” 63 J. Crim. L. Criminology & Police Sci. 162, 1972; 24 pages.
Frankel, E., Audioconferencing Options (Teleconferencing Units, Conference Bridges and Service Bureaus), Teleconnect, vol. 4, No. 5, p. 131(3), May 1996; 6 pages.
Furui, et al., “Experimental studies in a new automatic speaker verification system using telephone speech,” Acoustics, Speech, and Signal Processing, IEEE International Conference on ICASSP '80, vol. 5, Apr. 1980, pp. 1060-1062.
Furui, S., “50 Years of Progress in Speech and Speaker Recognition Research,” ECTI Transactions on Computer and Information Technology, vol. 1, No. 2, Nov. 2005, pp. 64-74.
Hansen, et al., “Speaker recognition using phoneme-specific gmms,” The Speaker and Language Recognition Workshop, May-Jun. 2004; 6 pages.
Hauck, et al., “Coplink: A Case of Intelligent Analysis and Knowledge Management,” University of Arizona, 1999; 20 pages.
Hewett, et al., Signaling System No. 7 (SS7/C7): Protocol, Architecture, and Services (Networking Technology), Cisco Press, Jun. 2005; 8 pages.
I2 Investigative Analysis Software; “Chart Reader”, URL: http://www.i2.eo.uk/Products/Chart Reader. Jun. 13, 2005.
I2 Investigative Analysis Software; “i2 TextChart—Text Visualized”, URL: http://www.i2.co.uk/Products/i2TextChart/. Jun. 13, 2005.
I2 Investigative Analysis Software; “iBase-Information Captured”, URL: http://www.i2.co.uk/Products/iBase/. Jun. 13, 2005.
I2 Investigative Analysis Software; “iBridge”, URL: http://www.i2.eo.uk/Products/iBridge/. Jun. 13, 2005.
I2 Investigative Analysis Software; “Pattern Tracer”, URL: http://www.i2.co.uk/Products/Pattern Tracer/. Jun. 13, 2005.
I2 Investigative Analysis Software; “Prisons”, URL: http://www.i2.co.uk/Solutions/Prisons/default.aso. Jun. 13, 2005.
I2 Investigative Analysis Software; “Setting International Standards for Investigative Analysis”, URL: htto://www.i2.co.uk/Products/index.htm. Jun. 13, 2005.
Imagis Technologies, Inc. “Computer Arrest and Booking System”, [retrieved from http://www.imagistechnologies.com/Product/CABS.htm] (Nov. 5, 2002) 5 pages.
Imagis Technologies, Inc. “Integrated Justice System—Web-based Image and Data Sharing” [retrieved from http://www.imagistechnologies.com/Product/IJISFramework.htm>] (Nov. 5, 2002) 4 pages.
Inmate Telephone Services: Large Business: Voice, Oct. 2, 2001; 3 pages.
Intel® NetStructure High-Density Station Interface (HDSI) Boards Archived Webpage, Intel Corporation, 2003; 2 pages.
International Search Report and Written Opinion directed to related International Application No. PCT/US2017/022169, dated May 29, 2017; 57 pages.
International Search Report for International Application No. PCT/US04/025029, European Patent Office, Netherlands, dated Mar. 14, 2006.
Isobe, et al., “A new cohort normalization using local acoustic information for speaker verification,” Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing, vol. 2, Mar. 1999; pp. 841-844.
Juang, et al., “Automatic Speech Recognition—A Brief History of the Technology Development,” Oct. 8, 2014; 24 pages.
Kinnunen, et al., “Real-Time Speaker Identification and Verification,” IEEE Transactions on Audio, Speech, and Language Processing, vol. 14, No. 1, Jan. 2006, pp. 277-288.
Knox, “The Problem of Gangs and Security Threat Groups (STG's) in American Prisons Today: Recent Research Findings From the 2004 Prison Gang Survey,” National Gang Crime Research Center, 2005; 67 pages.
Kozamernik, F., “Media Streaming over the Internet—an overview of delivery technologies,” EBU Technical Review, Oct. 2002; 15 pages.
Lane, et al., Language Model Switching Based on Topic Detection for Dialog Speech Recognition, Proceedings of the IEEE-ICASSP, vol. 1, 2003, IEEE; pp. 616-619.
Maes, et al., "Conversational speech biometrics," E-Commerce Agents, Marketplace Solutions, Security Issues, and Supply and Demand, Springer-Verlag, London, UK, 2001, pp. 166-179.
Maes, et al., “Open SESAME! Speech, Password or Key to Secure Your Door?,” Asian Conference on Computer Vision, Jan. 1998; pp. 1-3.
Matsui, et al., “Concatenated Phoneme Models for Text-Variable Speaker Recognition,” International Conference on Acoustics, Speech, and Signal Processing (ICASSP), vol. 2, Apr. 1993; pp. 391-394.
McCollum, “Federal Prisoner Health Care Copayment Act of 2000,” House of Representatives Report 106-851, 106th Congress 2d Session, Sep. 14, 2000; 22 pages.
Microsoft White Paper: “Integrated Justice Information Systems”, retrieved from Microsoft Justice & Public Safety Solutions (Nov. 5, 2002) [http://jps.directtaps.net_vtibin/owssvr.dll?Using=Default%2ehtm]; 22 pages.
Moattar, et al., “Speech Overlap Detection using Spectral Features and its Application in Speech Indexing,” 2nd International Conference on Information & Communication Technologies, 2006; pp. 1270-1274.
National Alliance of Gang Investigators Associations, 2005 National Gang Threat Assessment, 2005, Bureau of Justice Assistance, Office of Justice Programs, U.S. Department of Justice; 73 pages.
National Major Gang Taskforce, “A Study of Gangs and Security Threat Groups in America's Adult Prisons and Jails,” 2002; 38 pages.
Navratil, et al., “A Speech Biometrics System With MultiGrained Speaker Modeling,” 2000; 5 pages.
Navratil, et al., “Phonetic speaker recognition using maximum-likelihood binary-decision tree models,” Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing, Apr. 6-10, 2003; 4 pages.
O'Harrow, R. “U.S. Backs Florida's New Counterterrorism Database; ‘Matrix’ Offers Law Agencies Faster Access to Americans' Personal Records”; The Washington Post. Washington, D.C., Aug. 6, 2003; p. A 01.
O'Harrow, R., "Database will make tracking suspected terrorists easier", The Dallas Morning News. Dallas, TX, Aug. 6, 2003; p. 7A.
Olligschlaeger, A. M., “Criminal Intelligence Databases and Applications,” in Marilyn B. Peterson, Bob Morehouse, and Richard Wright, Intelligence 2000: Revising the Basic Elements—A Guide for Intelligence Professionals, Mar. 30, 2000 a joint publication of IALEIA and LEIU; 53 pages.
Osifchin, N., “A Telecommunications Buildings/Power Infrastructure in a New Era of Public Networking,” IEEE 2000; 7 pages.
Pages from http://www.corp.att.com/history, archived by web.archive.org on Nov. 4, 2013.
Pelecanos, J. “Conversational biometrics,” in Biometric Consortium Meeting, Baltimore, MD, Sep. 2006, accessible at http://www.biometrics.org/bc2006/presentations/Thu_Sep_21/Session_I/Pelecanos_Conversational_Biometrics.pdf; 14 pages.
Pollack, et al., “On the Identification of Speakers by Voice,” The Journal of the Acoustical Society of America, vol. 26, No. 3, May 1954; 4 pages.
Prosecution History of International Patent Application No. PCT/US99/09493 by Brown et al., filed Apr. 29, 1999.
Prosecution History of U.S. Appl. No. 11/182,625, filed Jul. 15, 2005.
Rey, R.F., ed., “Engineering and Operations in the Bell System,” 2nd Edition, AT&T Bell Laboratories: Murray Hill, NJ, 1983; 884 pages.
Reynolds, D., “Automatic Speaker Recognition Using Gaussian Mixture Speaker Models,” The Lincoln Laboratory Journal, vol. 8, No. 2, 1995; pp. 173-192.
Rosenberg, et al., "SIP: Session Initiation Protocol," Network Working Group, Standards Track, Jun. 2002; 269 pages.
Rosenberg, et al., “The Use of Cohort Normalized Scores for Speaker Verification,” Speech Research Department, AT&T Bell Laboratories, 2nd International Conference on Spoken Language Processing, Oct. 12-16, 1992; 4 pages.
Ross, et al., “Multimodal Biometrics: An Overview,” Proc. of 12th European Signal Processing Conference (EUSIPCO), Sep. 2004; pp. 1221-1224.
Science Dynamics, BubbleLINK Software Architecture, 2003; 10 pages.
Science Dynamics, Commander Call Control System, Rev. 1.04, 2002; 16 pages.
Science Dynamics, Inmate Telephone Control Systems, http://scidyn.com/fraudprev_main.htm (archived by web.archive.org on Jan. 12, 2001).
Science Dynamics, SciDyn BubbleLINK, http://www.scidyn.com/products/bubble.html (archived by web.archive.org on Jun. 18, 2006).
Science Dynamics, SciDyn Call Control Solutions: Commander II, http://www.scidyn.com/products/commander2.html (archived by web.archive.org on Jun. 18, 2006).
Science Dynamics, SciDyn IP Gateways, http://scidyn.com/products/ipgateways.html (archived by web.archive.org on Aug. 15, 2001).
Science Dynamics, Science Dynamics—IP Telephony, http://www.scidyn.com/iptelephony_main.htm (archived by web.archive.org on Oct. 12, 2000).
Shearme, et al., “An Experiment Concerning the Recognition of Voices,” Language and Speech, vol. 2, No. 3, Jul./Sep. 1959; 10 pages.
Silberg, L., Digital on Call, HFN The Weekly Newspaper for the Home Furnishing Network, Mar. 17, 1997; 4 pages.
Silberschatz, et al., Operating System Concepts, Third Edition, Addison-Wesley: Reading, MA, Sep. 1991; 700 pages.
Simmons, R., “Why 2007 is Not Like 1984: A Broader Perspective on Technology's Effect on Privacy and Fourth Amendment Jurisprudence,” J. Crim. L. & Criminology vol. 97, No. 2, Winter 2007; 39 pages.
Smith, M., “Corrections Turns Over a New LEAF: Correctional Agencies Receive Assistance From the Law Enforcement Analysis Facility,” Corrections Today, Oct. 1, 2001; 4 pages.
Specification of U.S. Appl. No. 10/720,848, “Information Management and Movement System and Method,” to Viola, et al., filed Nov. 24, 2003. (Abandoned).
State of North Carolina Department of Correction RFP #ITS-000938A, issued May 25, 2004; 8 pages.
Statement for the Record of John S. Pistole, Assistant Director, Counterterrorism Division, Federal Bureau of Investigation, Before the Senate Judiciary Committee, Subcommittee on Terrorism, Technology, and Homeland Security, Oct. 14, 2003.
Sundstrom, K., “Voice over IP: An Engineering Analysis,” Master's Thesis, Department of Electrical and Computer Engineering, University of Manitoba, Sep. 1999; 140 pages.
Supplementary European Search Report for EP Application No. EP 04 80 9530, Munich, Germany, completed on Mar. 25, 2009.
Tanenbaum, A., Modern Operating Systems, Third Edition, Pearson Prentice Hall: London, 2009; 552 pages.
Tirkel, A., et al.; “Image Watermarking—A Spread Spectrum Application,” Sep. 22-25, 1996; 7 pages.
Viswanathan, et al., “Multimedia Document Retrieval using Speech and Speaker Recognition,” International Journal on Document Analysis and Recognition, Jun. 2000, vol. 2; pp. 1-24.
Walden, R., “Performance Trends for Analog-to-Digital Converters,” IEEE Communications Magazine, Feb. 1999.
Weinstein, C., MIT, The Experimental Integrated Switched Network—A System-Level Network Test Facility, IEEE 1983; 8 pages.
Wilkinson, Reginald A., “Visiting in Prison,” Prison and Jail Administration's Practices and Theory, 1999; 7 pages.
Winterdyk et al., “Managing Prison Gangs,” Journal of Criminal Justice, vol. 38, 2010; pp. 730-736.
Zajic, et al., “A Cohort Methods for Score Normalization in Speaker Verification System, Acceleration of On-Line Cohort Methods,” Proceedings of the 12th International Conference “Speech and Computer,” Oct. 15-18, 2007; 6 pages.
U.S. Appl. No. 60/607,447, “IP-based telephony system and method,” to Apple, et al., filed Sep. 3, 2004; 665 pages.
International Search Report and Written Opinion directed to International Application No. PCT/US2017/026570, dated May 8, 2017; 7 pages.
Supplementary European Search Report for Application No. EP 17 77 9902, Munich, Germany, completed on Aug. 5, 2019; 7 pages.
Related Publications (1)
Number Date Country
20200314157 A1 Oct 2020 US
Continuations (5)
Number Date Country
Parent 16391954 Apr 2019 US
Child 16902565 US
Parent 15882321 Jan 2018 US
Child 16391954 US
Parent 15594150 May 2017 US
Child 15882321 US
Parent 15331414 Oct 2016 US
Child 15594150 US
Parent 15093300 Apr 2016 US
Child 15331414 US