Third party monitoring of activity within a monitoring platform

Information

  • Patent Grant
  • Patent Number
    12,095,943
  • Date Filed
    Thursday, January 19, 2023
  • Date Issued
    Tuesday, September 17, 2024
Abstract
The present disclosure describes a monitoring environment that monitors activity for activity that may be indicative of being prohibited by the local, the state, and/or the national governing authorities, namely suspicious activity, or activity that is prohibited by the local, the state, and/or the national governing authorities, namely prohibited activity. The monitoring environment verifies the activity is actually being monitored within the monitoring environment. The verification can require one or more monitoring persons monitoring the activity to perform one or more tasks to verify their attentiveness in monitoring the activity. The one or more tasks can be as simple as activating a checkbox or providing a code or an electronic signature to provide some examples, although more complicated tasks, such as a biometric verification, for example, a retinal, a facial, and/or a voice verification, are possible as will be recognized by those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure.
Description
BACKGROUND
Related Art

Correctional facilities provide inmates with the ability to communicate with friends, families, and visitors as it improves recidivism and provides incentives for inmates to follow rules and policies of the facility. In addition to traditional telephone calls and telephone visitations, correctional facilities seek to offer a wide variety of communication services to inmates, such as video visitation and video calls, among others. However, as the amount of communication options available to inmates increases, an increased amount of monitoring is required for these communications.





BRIEF DESCRIPTION OF THE DRAWINGS/FIGURES

Embodiments of the disclosure are described with reference to the accompanying drawings. In the drawings, like reference numbers indicate identical or functionally similar elements. Additionally, the leftmost digit(s) of a reference number identifies the drawing in which the reference number first appears. In the accompanying drawings:



FIG. 1 illustrates an exemplary monitoring platform according to an exemplary embodiment of the present disclosure;



FIG. 2 illustrates an exemplary monitoring center of the exemplary monitoring platform according to an exemplary embodiment of the present disclosure;



FIG. 3 is a flowchart of exemplary operational steps for the exemplary monitoring center according to an embodiment of the present disclosure; and



FIG. 4 illustrates a block diagram of an exemplary computer system for implementing the exemplary monitoring platform according to an exemplary embodiment of the present disclosure.





The disclosure will now be described with reference to the accompanying drawings. In the drawings, like reference numbers generally indicate identical, functionally similar, and/or structurally similar elements. The drawing in which an element first appears is indicated by the leftmost digit(s) in the reference number.


DETAILED DESCRIPTION OF THE DISCLOSURE

Overview


The present disclosure describes a monitoring environment that monitors activity for activity that may be indicative of being prohibited by the local, the state, and/or the national governing authorities, namely suspicious activity, or activity that is prohibited by the local, the state, and/or the national governing authorities, namely prohibited activity. The monitoring environment verifies the activity is actually being monitored within the monitoring environment. The verification can require one or more monitoring persons monitoring the activity to perform one or more tasks to verify their attentiveness in monitoring the activity. The one or more tasks can be as simple as activating a checkbox or providing a code or an electronic signature to provide some examples, although more complicated tasks, such as a biometric verification, for example, a retinal, a facial, and/or a voice verification, are possible as will be recognized by those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure.


Exemplary Monitoring Platform



FIG. 1 illustrates an exemplary monitoring platform according to an exemplary embodiment of the present disclosure. As illustrated in FIG. 1, a monitoring environment 100 monitors an activity for activity that may be indicative of being prohibited by the local, the state, and/or the national governing authorities, namely suspicious activity, or activity that is prohibited by the local, the state, and/or the national governing authorities, namely prohibited activity. Moreover, the monitoring environment 100 verifies the activity is actually being monitored within the monitoring environment 100. In the exemplary embodiment illustrated in FIG. 1, the monitoring environment 100 includes a controlled environment 102, an uncontrolled environment 104, and a monitoring platform 106 that are communicatively coupled to each other via a communication network 108. Although the controlled environment 102 is to be described in terms of an institutional environment, such as a local, a state, and/or a national prison, correctional facility, detention center, jail, penitentiary or remand center to provide some examples, those skilled in the relevant art(s) will recognize that the teachings described herein are equally applicable to any other suitable environment that is prescribed by the local, the state, and/or the national governing authorities without departing from the spirit and scope of the present disclosure. This other suitable environment can include a military environment, a hospital environment, an educational environment, a business environment, or a governmental environment to provide some examples.


The controlled environment 102 can be characterized as including one or more access restriction controls to control access to the controlled environment 102, access from the controlled environment 102, and/or access within the controlled environment 102. For example, the one or more access restriction controls can restrict entry by the general public into the controlled environment 102, can restrict those within the controlled environment 102 from leaving the controlled environment 102, and/or can restrict activity of those within the controlled environment 102. In some situations, the access restriction controls can be prescribed by the local, the state, and/or the national governing authorities.


As illustrated in FIG. 1, the controlled environment 102 can include one or more insider persons 110. The one or more insider persons 110 can include one or more persons occupying the controlled environment 102, such as one or more prisoners, inmates, or detainees to provide some examples, and/or one or more persons needed for operation of the controlled environment 102, such as one or more prison officers, corrections officers, correctional officers, detention officers, or penal officers to provide some examples. In some situations, the one or more insider persons 110 can access one or more insider communication devices 112 such as one or more mobile telephony devices, such as one or more mobile phones, one or more mobile computing devices; one or more mobile internet devices, such as one or more tablet computers and/or one or more laptop computers; one or more personal digital assistants; one or more handheld game consoles; one or more portable media players; one or more digital cameras; one or more pagers; one or more personal navigation devices; and/or other suitable communication devices that are capable of communication within the monitoring environment 100 that will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure. These other suitable communication devices can include one or more wired communication devices, such as one or more telephones, and/or one or more personal computing devices to provide some examples. In these situations, the one or more insider persons 110 can utilize the one or more insider communication devices 112 for communication, such as audio communication, video communication, and/or data communication to provide some examples, within the controlled environment 102 and/or between the controlled environment 102 and the uncontrolled environment 104.


Preferably, the uncontrolled environment 104 is situated in a location that is remote from the controlled environment 102, namely outside of the controlled environment 102, such that the uncontrolled environment 104 does not include the one or more access restriction controls of the controlled environment 102. For example, the general public is free to enter and/or to exit the uncontrolled environment 104 without being subject to the one or more access restriction controls of the controlled environment 102. However, those skilled in the relevant art(s) will recognize that the uncontrolled environment 104 can include other access restriction controls that can be prescribed by the local, the state, and/or the national governing authorities without departing from the spirit and scope of the present disclosure.


As additionally illustrated in FIG. 1, the uncontrolled environment 104 can include one or more outsider persons 114. In some situations, the one or more outsider persons 114 have access to one or more outsider communication devices 116 such as one or more mobile telephony devices, such as one or more mobile phones; one or more mobile computing devices; one or more mobile internet devices, such as one or more tablet computers and/or one or more laptop computers; one or more personal digital assistants; one or more handheld game consoles; one or more portable media players; one or more digital cameras; one or more pagers; one or more personal navigation devices; and/or other suitable communication devices that are capable of communication within the monitoring environment 100 that will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure. These other suitable communication devices can include one or more wired communication devices, such as one or more telephones, and/or one or more personal computing devices to provide some examples. In these situations, the one or more outsider persons 114 can utilize the one or more outsider communication devices 116 for communication, such as audio communication, video communication, and/or data communication to provide some examples, between the controlled environment 102 and the uncontrolled environment 104.


The monitoring platform 106 can operate in a monitor mode of operation or in a verification mode of operation. In an exemplary embodiment, the monitoring platform 106 can simultaneously operate in the monitor mode of operation and in the verification mode of operation; however, in some situations, the monitoring platform 106 can operate in the monitor mode of operation or in the verification mode of operation and can switch between the monitor mode of operation and the verification mode of operation when appropriate. In the monitoring mode of operation, the monitoring platform 106 monitors the activity within the monitoring environment 100. In the exemplary embodiment as illustrated in FIG. 1, the monitoring platform 106 is situated in a location that is remote from the controlled environment 102 and the uncontrolled environment 104, namely outside of the controlled environment 102 and the uncontrolled environment 104. However, in some situations, the monitoring platform 106 can be integrated within the controlled environment 102 or the uncontrolled environment 104.


In the monitoring mode of operation, the monitoring platform 106 monitors the activity within the monitoring environment 100 as it occurs in real-time or near real-time, or the activity within the monitoring environment 100 can be stored and monitored by the monitoring platform 106 after its occurrence. In an exemplary embodiment, the monitoring platform 106 can monitor less than all of the activity within the monitoring environment 100. In this exemplary embodiment, the specific number of activities to be monitored, for example a percentage of the activity within the monitoring environment 100, can be prescribed by the local, the state, and/or the national governing authorities, or a contract between the monitoring platform 106 and the local, the state, and/or the national governing authorities can specify the specific number of activities. In this exemplary embodiment, the monitoring platform 106 can measure the specific number of activities monitored and/or verified to be monitored by the monitoring platform 106 to ensure the monitoring platform 106 is performing in accordance with regulations of the local, the state, and/or the national governing authorities and/or the contract between the monitoring platform 106 and the local, the state, and/or the national governing authorities. In some situations, payment to the monitoring platform 106 for monitoring the activity within the monitoring environment 100 can be based upon whether the monitoring platform 106 is performing in accordance with the regulations of the local, the state, and/or the national governing authorities and/or the contract between the monitoring platform 106 and the local, the state, and/or the national governing authorities.
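The compliance measurement described above can be sketched as a simple quota check. This is an illustrative assumption only; the class and field names (`MonitoringQuota`, `required_fraction`) are invented for this sketch and do not appear in the disclosure.

```python
from dataclasses import dataclass

@dataclass
class MonitoringQuota:
    # Fraction of activity that must be monitored, e.g. as prescribed
    # by regulation or by contract (name is an invented example).
    required_fraction: float

    def is_compliant(self, monitored_count: int, total_count: int) -> bool:
        """Return True when the monitored share meets the prescribed quota."""
        if total_count == 0:
            return True  # no activity occurred, so nothing was missed
        return monitored_count / total_count >= self.required_fraction

quota = MonitoringQuota(required_fraction=0.25)
print(quota.is_compliant(monitored_count=30, total_count=100))  # True: 30% >= 25%
```

A measurement like this could feed the payment determination mentioned above, since it yields a yes/no answer against the prescribed percentage.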


Generally, the activity can include activity within the controlled environment 102 and/or activity between the controlled environment 102 and the uncontrolled environment 104. In an exemplary embodiment, the activity within the controlled environment 102 can include communication-related activity which relates to communication, such as audio communication, video communication, and/or data communication to provide some examples, among the one or more insider persons 110 within the controlled environment 102. This communication-related activity can include incoming telephone calls to the controlled environment 102; calls from informants to the controlled environment 102; inmate to inmate calling within the controlled environment 102; inmate to inmate electronic message relay within the controlled environment 102; electronic messaging, text messaging, video conferencing, or other real-time communication within the controlled environment 102; voicemail messaging, or other non-real-time communication, within the controlled environment 102; visitation within the controlled environment 102; teletypewriter (TTY), or other types of signaling, within the controlled environment 102; Internet browsing within the controlled environment 102; pre-incarceration communication within the controlled environment 102; post-incarceration communication within the controlled environment 102; chatbot interaction within the controlled environment 102; non-scheduled video communication within the controlled environment 102; covert audio and/or video communication within the controlled environment 102; and/or transcriptions of communication within the controlled environment 102 to provide some examples.


In another exemplary embodiment, the activity within the controlled environment 102 can additionally, or alternatively, include non-communication activity which typically involves observation of the one or more insider persons 110 within the controlled environment 102. For example, the non-communication activity within the controlled environment 102 can include movement of the one or more insider persons 110 within the controlled environment 102. In this example, the controlled environment 102 can include one or more security cameras to view and/or to record the movement of the one or more insider persons 110 within the controlled environment 102. Other non-communication activity within the controlled environment 102 that can be monitored by the monitoring platform 106 can include visitations between the one or more insider persons 110 and the one or more outsider persons 114 occurring within the controlled environment 102, exercising activity of the one or more insider persons 110 within the controlled environment 102, leisure time activity of the one or more insider persons 110 within the controlled environment 102, mealtime activity of the one or more insider persons 110 within the controlled environment 102, educational activity of the one or more insider persons 110 within the controlled environment 102, and/or employment activity of the one or more insider persons 110 within the controlled environment 102.


In a further exemplary embodiment, the activity between the controlled environment 102 and the uncontrolled environment 104 can further, or alternatively, include communication-related activity which relates to communication, such as audio communication, video communication, and/or data communication to provide some examples, between the one or more insider persons 110 and the one or more outsider persons 114. This communication-related activity can include telephone calls between the controlled environment 102 and the uncontrolled environment 104; electronic messaging, text messaging, video conferencing, or other real-time communication between the controlled environment 102 and the uncontrolled environment 104; voicemail messaging, or other non-real-time communication, between the controlled environment 102 and the uncontrolled environment 104; teletypewriter (TTY), or other types of signaling, between the controlled environment 102 and the uncontrolled environment 104; non-scheduled video communication between the controlled environment 102 and the uncontrolled environment 104; covert audio and/or video communication between the controlled environment 102 and the uncontrolled environment 104; and/or transcriptions of communication between the controlled environment 102 and the uncontrolled environment 104 to provide some examples.


As additionally illustrated in FIG. 1, the monitoring platform 106 can be utilized by one or more monitoring persons 118 to monitor the activity within the monitoring environment 100. Generally, the one or more monitoring persons 118 review the activity for the presence of the suspicious activity and/or the prohibited activity. Typically, the one or more monitoring persons 118 generate one or more warning alerts, such as an annotation, a flag, a bookmark, an audible alert, and/or a video alert to provide some examples, when the suspicious activity and/or the prohibited activity is present within the activity within the monitoring environment 100. The simplest warning alerts can include one or more annotations of the suspicious activity and/or the prohibited activity, although warning alerts of much greater complexity can be used, such as notifying the local, the state, and/or the national governing authorities of the suspicious activity and/or the prohibited activity to provide an example, without departing from the spirit and scope of the present disclosure. In some situations, the generating of the one or more warning alerts can affect the activity within the monitoring environment 100. For example, communication between the one or more insider persons 110 and the one or more outsider persons 114 can be interrupted and/or disconnected upon the one or more monitoring persons 118 generating the one or more warning alerts. In an exemplary embodiment, the monitoring platform 106 can store a listing of warning alerts indexed to various actions to be performed by the monitoring platform 106.
In this exemplary embodiment, the monitoring platform 106 can proceed with the action, such as annotating the suspicious activity and/or the prohibited activity, interrupting and/or disconnecting the communication between the one or more insider persons 110 and the one or more outsider persons 114, and/or notifying the local, the state, and/or the national governing authorities of the suspicious activity and/or the prohibited activity to provide some examples, corresponding with the one or more warning alerts as prescribed in the listing of warning alerts.
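A listing of warning alerts indexed to actions, as described above, can be sketched as a simple lookup table. The alert names and handler functions below are hypothetical examples invented for illustration, not part of the disclosure.

```python
# Hypothetical actions; each mutates a dict standing in for an activity record.
def annotate(activity):
    activity["annotations"].append("flagged for review")

def disconnect(activity):
    activity["connected"] = False

def notify_authorities(activity):
    activity["notified"] = True

# The "listing of warning alerts indexed to various actions":
# each alert maps to the ordered actions the platform proceeds with.
ALERT_ACTIONS = {
    "annotation": [annotate],
    "flag": [annotate, notify_authorities],
    "disconnect": [annotate, disconnect, notify_authorities],
}

def handle_alert(alert: str, activity: dict) -> None:
    """Perform every action prescribed for this alert in the listing."""
    for action in ALERT_ACTIONS.get(alert, []):
        action(activity)

call = {"annotations": [], "connected": True, "notified": False}
handle_alert("disconnect", call)
print(call["connected"])  # False: the communication was disconnected
```

The table-driven shape keeps the alert-to-action mapping in one place, so new alert types can be added without changing the dispatch logic.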


In the verification mode of operation, the monitoring platform 106 verifies the one or more monitoring persons 118 are monitoring the activity within the monitoring environment 100. The monitoring platform 106 determines whether the one or more monitoring persons 118 are attentive and reviewing the activity. For example, when the one or more warning alerts are generated by the one or more monitoring persons 118 within a predetermined amount of time, such as once every minute, once every couple of minutes, once every hour, or once every couple of hours to provide some examples, the monitoring platform 106 presumes the one or more monitoring persons 118 are attentive and reviewing the activity. However, in some situations, no warning alerts may be generated by the one or more monitoring persons 118 when reviewing the activity within the predetermined amount of time. In these situations, the monitoring platform 106 can require the one or more monitoring persons 118 to perform one or more tasks to verify the one or more monitoring persons 118 are attentive and reviewing the activity. The one or more tasks can be as simple as activating a checkbox, entering a response to a question, or providing a code or an electronic signature to provide some examples, although more complicated tasks, such as a biometric verification such as a retinal, a facial, and/or a voice verification to provide some examples, are possible as will be recognized by those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure. In some situations, one or more pop-up windows or dialog boxes can appear to the one or more monitoring persons 118 notifying the one or more monitoring persons 118 to perform the one or more tasks and/or to enter information requested by the one or more tasks during the verification mode of operation.
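The attentiveness check described above amounts to a timer that falls due when no warning alert arrives within the predetermined window. The sketch below is an illustrative assumption; the class name and the one-minute window are examples taken from the text, not a prescribed implementation.

```python
import time
from typing import Optional

VERIFICATION_WINDOW_S = 60.0  # "once every minute" from the examples above

class AttentivenessMonitor:
    """Tracks whether a monitoring person has recently generated an alert."""

    def __init__(self, window_s: float = VERIFICATION_WINDOW_S):
        self.window_s = window_s
        self.last_alert_time = time.monotonic()

    def record_alert(self) -> None:
        # A generated warning alert is presumed evidence of attentiveness.
        self.last_alert_time = time.monotonic()

    def verification_due(self, now: Optional[float] = None) -> bool:
        # When no alert arrives within the window, a verification task
        # (checkbox, code, signature, or biometric check) would be required.
        if now is None:
            now = time.monotonic()
        return (now - self.last_alert_time) > self.window_s
```

In use, the platform would call `record_alert()` whenever a person generates a warning alert and poll `verification_due()` to decide when to display a verification pop-up.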


The communication network 108 includes one or more wireless communication networks and/or one or more wired communication networks for communicatively coupling the controlled environment 102, the uncontrolled environment 104, and the monitoring platform 106. The one or more wireless communication networks can include one or more cellular phone networks, wireless local area networks (WLANs), wireless sensor networks, satellite communication networks, terrestrial microwave networks, and/or other suitable networks that transmit data over a wireless-based communication technology that will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure. The one or more wired communication networks include one or more telephone networks, cable television networks, internet access networks, fiber-optic communication networks and/or other suitable networks that transmit data over a wire-based communication technology that will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure.


Exemplary Monitoring Center of the Exemplary Monitoring Platform



FIG. 2 illustrates an exemplary monitoring center of the exemplary monitoring platform according to an exemplary embodiment of the present disclosure. A monitoring center 200 monitors the activity, as described above in FIG. 1, for the presence of the suspicious activity and/or the prohibited activity. Moreover, the monitoring center 200 verifies the activity is actually being monitored within the monitoring center 200. In the exemplary embodiment illustrated in FIG. 2, the monitoring center 200 includes a monitoring server 202, a monitoring storage 204, monitoring stations 206.1 through 206.n, and an administrative station 208 that are communicatively coupled to each other via a communication network 210. The monitoring center 200 can represent an exemplary embodiment of the monitoring platform 106.


Generally, the monitoring server 202 controls distribution of text, audio, and/or video information relating to the activity and/or scheduling of the monitoring stations 206.1 through 206.n, the administrative station 208, monitoring persons 212.1 through 212.n associated with the monitoring stations 206.1 through 206.n, and/or an administrative person 214 associated with the administrative station 208 to monitor the text, the audio, and/or the video information. In the exemplary embodiment illustrated in FIG. 2, the monitoring server 202 receives the text, the audio, and/or the video information. In some situations, the text, the audio, and/or the video information can include metadata describing the activity. This metadata can include the identification of persons within the activity and/or the date, the time, the duration, and/or the location of the activity to provide some examples. In an exemplary embodiment, this metadata can include a unique identifier for the text, the audio, and/or the video information corresponding to the activity to allow the activity to be easily tracked throughout the monitoring center 200.
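The metadata described above (identification of persons, date, time, duration, location, and a unique identifier for tracking) might be modeled as follows. The field names are assumptions chosen for this sketch, not terms from the disclosure.

```python
from dataclasses import dataclass, field
import uuid

@dataclass
class ActivityMetadata:
    # Field names below are illustrative assumptions.
    persons: list        # identification of persons within the activity
    date: str
    time: str
    duration_s: int
    location: str
    # Unique identifier allowing the activity to be easily tracked
    # throughout the monitoring center.
    activity_id: str = field(default_factory=lambda: uuid.uuid4().hex)
```

Generating the identifier at creation time guarantees every activity record can be tracked independently, even when two activities share the same persons, date, and location.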


The monitoring server 202 schedules the monitoring stations 206.1 through 206.n, the administrative station 208, the monitoring persons 212.1 through 212.n, and/or the administrative person 214 to review the text, the audio, and/or the video information. For example, the monitoring server 202 schedules the monitoring stations 206.1 through 206.n, the administrative station 208, the monitoring persons 212.1 through 212.n, and/or the administrative person 214 to review the text, the audio, and/or the video information in a round-robin manner. Typically, the round-robin manner sequentially cycles through the monitoring stations 206.1 through 206.n, the administrative station 208, the monitoring persons 212.1 through 212.n, and/or the administrative person 214 one after another; however, those skilled in the relevant art(s) will recognize that the round-robin manner may cycle through the monitoring stations 206.1 through 206.n, the administrative station 208, the monitoring persons 212.1 through 212.n, and/or the administrative person 214 in any suitable order without departing from the spirit and scope of the present disclosure.


As another example, the monitoring server 202 schedules the monitoring stations 206.1 through 206.n, the administrative station 208, the monitoring persons 212.1 through 212.n, and/or the administrative person 214 to review the text, the audio, and/or the video information based upon abilities of the monitoring stations 206.1 through 206.n, the administrative station 208, the monitoring persons 212.1 through 212.n, and/or the administrative person 214. In an exemplary embodiment, the abilities of the monitoring persons 212.1 through 212.n, and/or the administrative person 214 can be stored by the monitoring server 202 in one or more monitoring profiles which can be maintained by the administrative station 208. In this exemplary embodiment, the one or more monitoring profiles can also include the type of information, such as the text, the audio, and/or the video information, that the monitoring persons 212.1 through 212.n, and/or the administrative person 214 is permitted and/or certified to review. In this example, the monitoring server 202 can schedule one or more monitoring persons from among the monitoring persons 212.1 through 212.n and/or the administrative person 214 familiar with a particular language and/or custom when the text, the audio, and/or the video information can be characterized as being in the particular language and/or custom. Otherwise, or additionally, in this example, the monitoring server 202 can schedule the monitoring persons 212.1 through 212.n and/or the administrative person 214 based on scores or grades, which are to be described in further detail below, of the monitoring persons 212.1 through 212.n and/or the administrative person 214. In an exemplary embodiment, those monitoring persons from among the monitoring persons 212.1 through 212.n having higher scores or grades can be scheduled more often than those monitoring persons from among the monitoring persons 212.1 through 212.n having lower scores or grades. 
As a further example, the monitoring server 202 can schedule the monitoring stations 206.1 through 206.n, the administrative station 208, the monitoring persons 212.1 through 212.n, and/or the administrative person 214 based upon workloads of the monitoring stations 206.1 through 206.n, the administrative station 208, the monitoring persons 212.1 through 212.n, and/or the administrative person 214. In another exemplary embodiment, those monitoring stations and/or persons having higher workloads, for example, scheduled more frequently, can be scheduled less often than those monitoring stations and/or persons having lower workloads. In some situations, the monitoring server 202 can use a priority-based scheduling when scheduling the monitoring stations 206.1 through 206.n, the administrative station 208, the monitoring persons 212.1 through 212.n, and/or the administrative person 214 to review the text, the audio, and/or the video information. In these situations, the monitoring server 202 schedules higher priority activity, such as real-time activity like electronic or text messages to provide some examples, to be reviewed sooner than lower priority activity such as non-real-time activity like voicemail messages to provide some examples.
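The priority-based scheduling described above can be sketched with a priority queue in which real-time activity outranks non-real-time activity. The numeric priorities and activity labels are invented for illustration.

```python
import heapq

queue = []
seq = 0  # tie-breaker preserving arrival order within a priority level

def enqueue(priority: int, activity: str) -> None:
    """Lower number = higher priority (reviewed sooner)."""
    global seq
    heapq.heappush(queue, (priority, seq, activity))
    seq += 1

def next_to_review() -> str:
    """Pop the highest-priority (then oldest) activity for review."""
    return heapq.heappop(queue)[2]

enqueue(1, "voicemail-1")     # lower priority: non-real-time activity
enqueue(0, "text-message-1")  # higher priority: real-time activity
enqueue(0, "text-message-2")

print(next_to_review())  # text-message-1
print(next_to_review())  # text-message-2
print(next_to_review())  # voicemail-1
```

The sequence counter matters: without it, two activities at the same priority would be ordered by comparing their labels rather than their arrival times.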


In some situations, the monitoring server 202 can process the text, the audio, and/or the video information for automated review. In these situations, the monitoring server 202 automatically reviews the text, the audio, and/or the video information for the presence of the suspicious activity and/or the prohibited activity, such as call forwarding, three way calling, and/or dialing during communication to provide some examples, as described above in FIG. 1. Typically, the monitoring server 202 generates one or more automatic warning alerts, such as an annotation, a flag, a bookmark, an audible alert, and/or a video alert to provide some examples, when the text, the audio, and/or the video information include the suspicious activity and/or the prohibited activity as described above in FIG. 1. In an exemplary embodiment, the monitoring server 202 provides an indication to the monitoring stations 206.1 through 206.n, the administrative station 208, the monitoring persons 212.1 through 212.n, and/or the administrative person 214 when the monitoring server 202 generates the one or more automatic warning alerts. In another exemplary embodiment, the monitoring server 202 can send the text, the audio, and/or the video information and the one or more automatic warning alerts, if generated, to the monitoring storage 204. In this other exemplary embodiment, the monitoring server 202 can time-stamp the one or more automatic warning alerts to correspond with the text, the audio, and/or the video information to allow the text, the audio, and/or the video information that caused the one or more automatic warning alerts to be easily reviewed at a later time.


In some situations, the monitoring server 202 converts the text, the audio, and/or the video information, into a format that is suitable for monitoring by the monitoring stations 206.1 through 206.n and/or the administrative station 208. For example, the monitoring server 202 can transcribe the text, the audio, and/or the video information into other text, audio, and/or video information that is suitable for monitoring by the monitoring stations 206.1 through 206.n and/or the administrative station 208. As an example of this transcription, the monitoring server 202 can transcribe audio information corresponding to an audio communication and/or video information corresponding to a video communication into text information. In this example, the transcription of the audio information and/or the video information into the text information allows the monitoring server 202 to automatically search the text information for one or more key words and/or key phrases from among a dictionary of key words and/or key phrases and to generate the one or more automatic warning alerts when the text information includes the one or more key words and/or key phrases.
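The automated search over transcribed text can be sketched as follows. The dictionary of key words and phrases below is a hypothetical example; the disclosure does not specify its contents.

```python
import re

# Hypothetical dictionary of key words and/or key phrases.
KEY_PHRASES = ["three way call", "forward the call", "contraband"]

def scan_transcript(text: str) -> list:
    """Return an automatic warning alert for each key phrase found."""
    alerts = []
    for phrase in KEY_PHRASES:
        # Case-insensitive literal match; re.escape guards phrase punctuation.
        if re.search(re.escape(phrase), text, flags=re.IGNORECASE):
            alerts.append({"alert": "keyword", "phrase": phrase})
    return alerts

alerts = scan_transcript("Can you forward the call to my cousin?")
print(alerts)  # [{'alert': 'keyword', 'phrase': 'forward the call'}]
```

Each returned alert could then be time-stamped against the transcript position, supporting the later review described above.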


In some situations, the monitoring server 202 can compress, decompress, encrypt and/or decrypt the text, the audio, and/or the video information in accordance with one or more compression algorithms and/or one or more encryption algorithms. The one or more compression algorithms can include any suitable lossless data compression algorithm and/or any suitable lossy data compression algorithm that will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure. The one or more encryption algorithms can include any suitable symmetric key algorithm, any suitable private key algorithm and/or any suitable public key algorithm that will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure.
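As a minimal sketch of the lossless compression step described above, the standard-library `zlib` codec suffices; a real deployment would pair this with a vetted symmetric-key encryption library, which is deliberately omitted here rather than hand-rolled. The function names are assumptions for illustration.

```python
# Hypothetical sketch of lossless compression of stored monitoring records.

import zlib

def compress_record(data: bytes, level: int = 6) -> bytes:
    """Compress a text/audio/video record for storage; level 0-9."""
    return zlib.compress(data, level)

def decompress_record(blob: bytes) -> bytes:
    """Recover the original record bytes."""
    return zlib.decompress(blob)

original = b"text, audio, and/or video information " * 50
blob = compress_record(original)
assert decompress_record(blob) == original   # lossless round trip
assert len(blob) < len(original)             # repetitive data compresses well
```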


The monitoring storage 204 stores the text, the audio, and/or the video information for review by the monitoring stations 206.1 through 206.n as well as other data as to be described in further detail below. The monitoring storage 204 can include a read only memory (ROM), a random-access memory (RAM), a magnetic disk storage medium, a solid-state storage medium, an optical storage medium, and/or a flash memory device to provide some examples for storing the text, the audio, and/or the video information.


The monitoring stations 206.1 through 206.n can operate in the monitoring mode of operation or in the verification mode of operation as described above in FIG. 1. Each of the monitoring stations 206.1 through 206.n can be implemented as a mobile communication device such as a smartphone to provide an example, a desktop computer, a tablet computer, a personal digital assistant (PDA), or any other suitable electronic device capable of performing the functions of the monitoring stations 206.1 through 206.n as described herein that will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure. In the monitoring mode of operation, the monitoring stations 206.1 through 206.n retrieve the text, the audio, and/or the video information from the monitoring storage 204. However, in some situations, the monitoring stations 206.1 through 206.n can retrieve the text, the audio, and/or the video information from the monitoring server 202. For example, the text, the audio, and/or the video information for higher priority activity, such as real-time activity like electronic or text messages to provide some examples, can be retrieved from the monitoring server 202, as opposed to the text, the audio, and/or the video information for lower priority activity, such as non-real-time activity like voicemail messages to provide some examples, which can be retrieved from the monitoring storage 204. In an exemplary embodiment, each of the monitoring stations 206.1 through 206.n can receive a monitoring schedule from the monitoring server 202 which includes a listing of the text, the audio, and/or the video information that the monitoring stations 206.1 through 206.n and/or the monitoring persons 212.1 through 212.n are scheduled to review.
In this exemplary embodiment, the monitoring stations 206.1 through 206.n retrieve the text, the audio, and/or the video information that the monitoring stations 206.1 through 206.n and/or the monitoring persons 212.1 through 212.n are scheduled to review in accordance with the monitoring schedule.
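The priority-based retrieval rule above can be sketched as a simple dispatch: higher-priority (real-time) activity is fetched from the monitoring server, lower-priority (non-real-time) activity from the monitoring storage. The type names and return strings are illustrative assumptions, not terms from the disclosure.

```python
# Hypothetical sketch: route a retrieval request to the appropriate
# component based on activity priority.

REAL_TIME_TYPES = {"text_message", "electronic_message"}  # higher priority

def retrieval_source(activity_type: str) -> str:
    """Return which component a monitoring station should query."""
    if activity_type in REAL_TIME_TYPES:
        return "monitoring_server"     # e.g. the monitoring server 202
    return "monitoring_storage"        # e.g. the monitoring storage 204

assert retrieval_source("text_message") == "monitoring_server"
assert retrieval_source("voicemail") == "monitoring_storage"
```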


Thereafter, the monitoring stations 206.1 through 206.n process the text, the audio, and/or the video information for review by the monitoring persons 212.1 through 212.n. Generally, the monitoring persons 212.1 through 212.n review the text, the audio, and/or the video information that they are scheduled to review for the presence of the suspicious activity and/or the prohibited activity as described above in FIG. 1. In an exemplary embodiment, the monitoring stations 206.1 through 206.n authenticate credentials of the monitoring persons 212.1 through 212.n, such as a username, a password, and/or an authentication code to provide some examples, before the monitoring persons 212.1 through 212.n can access the monitoring stations 206.1 through 206.n to review the text, the audio, and/or the video information that they are scheduled to review. In another exemplary embodiment, the monitoring stations 206.1 through 206.n and/or the administrative station 208 can track review of the text, the audio, and/or the video information by the monitoring persons 212.1 through 212.n. For example, the monitoring stations 206.1 through 206.n and/or the administrative station 208 can track portions of the text, the audio, and/or the video information that have been reviewed by the monitoring persons 212.1 through 212.n and/or that are yet to be reviewed by the monitoring persons 212.1 through 212.n. Also, in this example, the monitoring stations 206.1 through 206.n and/or the administrative station 208 can track which reviewer from among the monitoring persons 212.1 through 212.n has reviewed the text, the audio, and/or the video information. Typically, the monitoring persons 212.1 through 212.n generate one or more manual warning alerts, such as an annotation, a flag, a bookmark, an audible alert, and/or a video alert to provide some examples, when the text, the audio, and/or the video information includes the suspicious activity and/or the prohibited activity as described above in FIG. 1.
In a further exemplary embodiment, the monitoring stations 206.1 through 206.n provide an indication to the monitoring server 202 and/or the administrative station 208 when the monitoring persons 212.1 through 212.n generate the one or more manual warning alerts. In a yet further exemplary embodiment, the monitoring stations 206.1 through 206.n can send the text, the audio, and/or the video and/or the one or more manual warning alerts, if generated, to the monitoring storage 204. In this yet further exemplary embodiment, the monitoring stations 206.1 through 206.n can time-stamp the one or more manual warning alerts to correspond with the text, the audio, and/or the video information to allow the text, the audio, and/or the video information that caused the one or more manual warning alerts to be easily reviewed at a later time. In some situations, the monitoring stations 206.1 through 206.n review the text, the audio, and/or the video information having the one or more automatic warning alerts generated by the monitoring server 202 to verify the presence of the suspicious activity and/or the prohibited activity within the text, the audio, and/or the video information.


Oftentimes, the monitoring stations 206.1 through 206.n include one or more graphical user interfaces (GUIs) for interfacing with the monitoring persons 212.1 through 212.n. In an exemplary embodiment, the monitoring stations 206.1 through 206.n can display the text, the audio, and/or the video information using one or more graphical user interfaces (GUIs) to allow the text, the audio, and/or the video information to be monitored by the monitoring persons 212.1 through 212.n. In this exemplary embodiment, a first portion of the one or more GUIs includes one or more activity display areas for displaying the text, the audio, and/or the video information. This activity display area can include one or more activity display area controls allowing the monitoring persons 212.1 through 212.n to pause, stop, fast forward, rewind, and/or play the text, the audio, and/or the video information. The activity display area can also include one or more metadata display areas for displaying the metadata included within the text, the audio, and/or the video information and/or other data derived from the metadata included within the text, the audio, and/or the video information. Additionally, or alternatively, in this exemplary embodiment, a second portion of the one or more GUIs includes a warning alert area for generating the one or more manual warning alerts and/or for verifying the one or more automatic warning alerts. Further, or in the alternative, in this exemplary embodiment, one or more pop-up windows or dialog boxes can appear to the monitoring persons 212.1 through 212.n notifying the monitoring persons 212.1 through 212.n to perform the one or more tasks and/or to enter information requested by the one or more tasks to be performed by the monitoring persons 212.1 through 212.n during the verification mode of operation as to be described in further detail below.


In the exemplary embodiment illustrated in FIG. 2, the monitoring server 202, the monitoring stations 206.1 through 206.n, and/or the administrative station 208 verifies the monitoring persons 212.1 through 212.n are monitoring the text, the audio, and/or the video information during the verification mode of operation. The monitoring server 202, the monitoring stations 206.1 through 206.n, and/or the administrative station 208 determines whether the monitoring persons 212.1 through 212.n are attentive and reviewing the activity. For example, when the one or more manual warning alerts are generated by the monitoring persons 212.1 through 212.n within a predetermined amount of time, such as once every minute, once every couple of minutes, once every hour, or once every couple of hours to provide some examples, the monitoring server 202, the monitoring stations 206.1 through 206.n, and/or the administrative station 208 presumes the monitoring persons 212.1 through 212.n are attentive and reviewing the activity. However, in some situations, no warning alerts may be generated by the monitoring persons 212.1 through 212.n when reviewing the activity within the predetermined amount of time. In these situations, the monitoring server 202, the monitoring stations 206.1 through 206.n, and/or the administrative station 208 can require the monitoring persons 212.1 through 212.n to perform the one or more tasks to verify the monitoring persons 212.1 through 212.n are attentive and reviewing the activity as described above in FIG. 1. As another example, the monitoring persons 212.1 through 212.n can be required to electronically certify, for example, by electronically signing, their monitoring of the text, the audio, and/or the video information after their review of the text, the audio, and/or the video information is complete.
In this other example, the one or more GUIs, as described above, can include a text box for entry of the electronic signatures of the monitoring persons 212.1 through 212.n. As a further example, the monitoring stations 206.1 through 206.n can include microphones and/or video cameras for recording audio and/or video of the monitoring persons 212.1 through 212.n while they are reviewing the text, the audio, and/or the video information. In this further example, the audio and/or the video of the monitoring persons 212.1 through 212.n can be stored in the monitoring storage 204 for access by the administrative station 208 to verify the monitoring persons 212.1 through 212.n are monitoring the text, the audio, and/or the video information.
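The attentiveness check described above reduces to a simple inactivity test: if no warning alert has been generated within the predetermined window, a verification task is required. The following is a hedged sketch under assumed names; the 60-second default is an illustrative choice, not a value from the disclosure.

```python
# Hypothetical sketch: decide whether a monitoring person must perform a
# verification task (checkbox, code, signature, biometric check, etc.).

def needs_verification(last_alert_time: float, now: float,
                       window_seconds: float = 60.0) -> bool:
    """True when no warning alert has been generated within the window."""
    return (now - last_alert_time) > window_seconds

# An alert 50 seconds ago keeps the reviewer presumed attentive...
assert not needs_verification(last_alert_time=100.0, now=150.0)
# ...but 100 seconds of silence triggers a verification task.
assert needs_verification(last_alert_time=100.0, now=200.0)
```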


The administrative station 208 and/or the administrative person 214 oversee operation of the monitoring center 200. Generally, the administrative person 214 can be characterized as being a root user, an administrator, an administrative user, or a supervisor user having more privileges than the monitoring persons 212.1 through 212.n. For example, the administrative person 214 can edit the one or more monitoring profiles of the monitoring persons 212.1 through 212.n, edit the one or more manual warning alerts, such as the annotation, the flag, the bookmark, the audible alert, and/or the video alert to provide some examples, generated by the monitoring persons 212.1 through 212.n, edit the monitoring schedule from the monitoring server 202, and/or edit the scores or grades of the monitoring persons 212.1 through 212.n.


The administrative station 208 can be implemented as a mobile communication device such as a smartphone to provide an example, a desktop computer, a tablet computer, a personal digital assistant (PDA), or any other suitable electronic device capable of performing the functions of the administrative station 208 as described herein that will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure. In an exemplary embodiment, the administrative station 208 can review the text, the audio, and/or the video information in a substantially similar manner as the monitoring stations 206.1 through 206.n.


In some situations, the administrative station 208 and/or the administrative person 214 receives a notification, such as a text message, an electronic mail message or other electronic message, an audible alert, and/or a video alert, from the monitoring stations 206.1 through 206.n when the one or more manual warning alerts have been generated by the monitoring persons 212.1 through 212.n indicating further review of the text, the audio, and/or the video information is warranted. In these situations, the administrative station 208 can retrieve the text, the audio, and/or the video information and the one or more manual warning alerts from the monitoring storage 204.


Preferably in these situations, the administrative person 214 reviews the text, the audio, and/or the video information having the one or more manual warning alerts and/or the one or more automatic warning alerts to verify the presence of the suspicious activity and/or the prohibited activity within the text, the audio, and/or the video information. In an exemplary embodiment, the administrative station 208 authenticates credentials of the administrative person 214, such as a username, a password, and/or an authentication code to provide some examples, before the administrative person 214 can access the administrative station 208 to review the text, the audio, and/or the video information. In another exemplary embodiment, the administrative station 208 includes a substantially similar GUI as the monitoring stations 206.1 through 206.n to allow the administrative person 214 to verify the suspicious activity and/or the prohibited activity within the text, the audio, and/or the video information. In some situations, the administrative station 208, as well as the monitoring stations 206.1 through 206.n, can affect the activity when the presence of the suspicious activity and/or the prohibited activity within the text, the audio, and/or the video information has been verified. For example, communication within the monitoring environment can be interrupted and/or disconnected upon verification of the presence of the suspicious activity and/or the prohibited activity. In an exemplary embodiment, the administrative station 208, as well as the monitoring stations 206.1 through 206.n, can store a listing of automatic and/or manual warning alerts which is indexed to various actions to be performed by the monitoring stations 206.1 through 206.n and/or the administrative station 208.
In this exemplary embodiment, the monitoring stations 206.1 through 206.n and/or the administrative station 208 can proceed with the action, such as annotating the suspicious activity and/or the prohibited activity, interrupting and/or disconnecting the communication within the monitoring environment, and/or notifying the local, the state, and/or the national governing authorities of the suspicious activity and/or the prohibited activity to provide some examples, corresponding with the one or more automatic warning alerts and/or the one or more manual warning alerts as prescribed in the listing of warning alerts.
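The alert-indexed listing of actions described above can be modeled as a lookup table from alert type to prescribed actions. This is an illustrative sketch only; the table contents, alert type names, and the annotate-by-default fallback are assumptions, not values taken from the disclosure.

```python
# Hypothetical sketch: a listing of warning alerts indexed to the actions
# a monitoring station or administrative station performs for each.

ALERT_ACTIONS = {
    "three_way_call":  ["annotate", "disconnect"],
    "call_forwarding": ["annotate", "interrupt"],
    "key_phrase":      ["annotate"],
    "contraband_plan": ["annotate", "disconnect", "notify_authorities"],
}

def actions_for(alert_type: str) -> list[str]:
    # Fall back to annotation when an alert type is not in the listing.
    return ALERT_ACTIONS.get(alert_type, ["annotate"])

assert actions_for("three_way_call") == ["annotate", "disconnect"]
assert actions_for("unlisted_alert") == ["annotate"]
```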


In the exemplary embodiment illustrated in FIG. 2, the administrative station 208 can utilize various statistical indicators to evaluate performance of the monitoring persons 212.1 through 212.n and/or the administrative person 214. For example, these statistical indicators can include the number of the one or more manual warning alerts generated by the monitoring persons 212.1 through 212.n and/or the administrative person 214 over a predetermined amount of time, for example, per activity, a day, a week, or a month, the number of the one or more manual warning alerts generated by the monitoring persons 212.1 through 212.n and/or the administrative person 214 and verified by the administrative station 208 over a predetermined amount of time, for example, per activity, a day, a week, or a month, the number of activities that have been monitored by the monitoring persons 212.1 through 212.n and/or the administrative person 214 over a predetermined amount of time, for example, a day, a week, or a month, and/or the number of times the monitoring persons 212.1 through 212.n and/or the administrative person 214 have been required to perform the one or more tasks over a predetermined amount of time, for example, per activity, a day, a week, or a month. In this exemplary embodiment, the administrative station 208 can score or grade the monitoring persons 212.1 through 212.n and/or the administrative person 214 based upon the various statistical indicators. 
For example, a first monitoring person from among the monitoring persons 212.1 through 212.n can be scored or graded higher than a second monitoring person from among the monitoring persons 212.1 through 212.n when the statistical indicators indicate the first monitoring person is more efficient at monitoring the activity than the second monitoring person, for example, more warning alerts generated by the first monitoring person over the predetermined amount of time, more warning alerts generated by the first monitoring person verified by the administrative station 208 over the predetermined amount of time, more activities monitored by the first monitoring person over the predetermined amount of time, and/or fewer tasks required to be performed by the first monitoring person. Additionally, the administrative station 208 can measure the specific number of activities monitored and/or verified to be monitored by the monitoring persons 212.1 through 212.n and/or the administrative person 214 to ensure the monitoring center 200 is performing in accordance with the regulations of the local, the state, and/or the national governing authorities and/or the contract between the monitoring center 200 and the local, the state, and/or the national governing authorities as described above in FIG. 1. Further, the administrative station 208 can store a communication log indicating the activity that has been affected by the monitoring stations 206.1 through 206.n and/or the administrative station 208 for including the suspicious activity and/or the prohibited activity.
This communication log can be indexed to the persons within the activity and/or the date, the time, the duration, and/or the location of the activity to provide some examples to allow the text, the audio, and/or the video information to be correlated with other text, audio, and/or video information of other activities by the same persons, the same date, the same time, the same duration, and/or the same location.
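The statistical scoring described above can be sketched as a weighted combination of the four indicators: alerts generated, alerts verified, activities monitored, and verification tasks required. The weights below are illustrative assumptions; the disclosure does not prescribe a formula, only that more efficient reviewers score higher and that required tasks (a sign of inattentiveness) count against the score.

```python
# Hypothetical sketch: score a reviewer from the statistical indicators.

def score_reviewer(alerts: int, verified: int, monitored: int,
                   tasks_required: int) -> float:
    """Higher is better; required verification tasks lower the score."""
    return (2.0 * verified          # verified alerts weigh most
            + 1.0 * alerts          # alerts generated
            + 0.5 * monitored       # activities monitored
            - 1.5 * tasks_required) # inattentiveness penalty

first = score_reviewer(alerts=10, verified=8, monitored=40, tasks_required=1)
second = score_reviewer(alerts=4, verified=2, monitored=25, tasks_required=6)
assert first > second   # the more efficient reviewer scores higher
```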


The communication network 210 includes one or more wireless communication networks and/or one or more wired communication networks for communicatively coupling the monitoring server 202, the monitoring storage 204, the monitoring stations 206.1 through 206.n, and the administrative station 208. The one or more wireless communication networks can include one or more cellular phone networks, wireless local area networks (WLANs), wireless sensor networks, satellite communication networks, terrestrial microwave networks, and/or other suitable networks that transmit data over a wireless-based communication technology that will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure. The one or more wired communication networks include one or more telephone networks, cable television networks, internet access networks, fiber-optic communication networks and/or other suitable networks that transmit data over a wire-based communication technology that will be apparent to those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure.


Exemplary Operation of the Monitoring Center



FIG. 3 is a flowchart of exemplary operational steps for the exemplary monitoring center according to an embodiment of the present disclosure. The disclosure is not limited to this operational description. Rather, it will be apparent to ordinary persons skilled in the relevant art(s) that other operational control flows are within the scope and spirit of the present disclosure. The following discussion describes an exemplary operational control flow 300 for a monitoring center, such as monitoring center 200 to provide an example, in monitoring the text, the audio, and/or the video information of the activity.


At step 302, the operational control flow 300 reviews text, audio, and/or video information relating to the activity for the suspicious activity and/or the prohibited activity. Typically, the operational control flow 300 generates one or more manual warning alerts, such as an annotation, a flag, a bookmark, an audible alert, and/or a video alert to provide some examples, when the suspicious activity and/or the prohibited activity is present within the activity. The simplest warning alerts can include one or more annotations of the suspicious activity and/or the prohibited activity, although warning alerts of much greater complexity can be used, such as notifying the local, the state, and/or the national governing authorities of the suspicious activity and/or the prohibited activity to provide an example, without departing from the spirit and scope of the present disclosure.


At step 304, the operational control flow 300 determines whether the one or more manual warning alerts have been generated in step 302 within a predetermined amount of time. The operational control flow 300 proceeds to step 306 to verify a monitoring person is attentive and reviewing the text, the audio, and/or the video information when the one or more manual warning alerts have not been generated in step 302 within the predetermined amount of time. Otherwise, the operational control flow 300 reverts to step 302 to continue reviewing the text, the audio, and/or the video information.


At step 306, the operational control flow 300 requires the monitoring person to perform one or more tasks to verify the monitoring person is attentive and reviewing the activity. The one or more tasks can be as simple as activating a checkbox or providing a code or an electronic signature to provide some examples, although more complicated tasks, such as a biometric verification such as a retinal, a facial, and/or a voice verification to provide some examples, are possible as will be recognized by those skilled in the relevant art(s) without departing from the spirit and scope of the present disclosure.
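Steps 302 through 306 can be sketched as a review loop with an injected clock, which keeps the sketch testable. The function names, the fake clock, and the loop structure are assumptions for illustration; the disclosure describes only the flowchart logic, not an implementation.

```python
# Hypothetical sketch of operational control flow 300: review the activity
# (step 302); if no manual warning alert arrives within the predetermined
# window (step 304), require a verification task (step 306).

from itertools import count

def control_flow_300(review_step, clock, window_seconds: float,
                     verify_task, iterations: int) -> int:
    """Run the review loop; return how many verification tasks were required."""
    verifications = 0
    last_alert = clock()
    for _ in range(iterations):
        alert_generated = review_step()              # step 302
        now = clock()
        if alert_generated:
            last_alert = now                         # attentive: reset window
        elif now - last_alert > window_seconds:      # step 304: window expired
            verify_task()                            # step 306
            verifications += 1
            last_alert = now
    return verifications

ticks = count()                                      # fake clock: 0, 1, 2, ...
n = control_flow_300(review_step=lambda: False,      # reviewer never alerts
                     clock=lambda: next(ticks),
                     window_seconds=2.0,
                     verify_task=lambda: None,
                     iterations=6)
# With one tick per iteration and a 2-tick window, the silent reviewer is
# asked to verify twice over 6 iterations (n == 2).
```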


Exemplary Computer System for Implementing the Exemplary Design Environment



FIG. 4 illustrates a block diagram of an exemplary computer system for implementing the exemplary design environment according to an exemplary embodiment of the present disclosure. A computer system 400 can be used to implement the monitoring server 202, the monitoring storage 204, the monitoring stations 206.1 through 206.n, and the administrative station 208 as described above in FIG. 2. After reading this description, it will become apparent to a person skilled in the relevant art how to implement embodiments using other computer systems and/or computer architectures.


The computer system 400 includes one or more processors 404, also referred to as central processing units, or CPUs, to execute operations of the monitoring server 202, the monitoring storage 204, the monitoring stations 206.1 through 206.n, and the administrative station 208 as described above in FIG. 2. The one or more processors 404 can be connected to a communication infrastructure or bus 406. In an exemplary embodiment, one or more of the one or more processors 404 can be implemented as a graphics processing unit (GPU). The GPU represents a specialized electronic circuit designed to rapidly process mathematically intensive applications on electronic devices. The GPU may have a highly parallel structure that is efficient for parallel processing of large blocks of data, such as mathematically intensive data common to computer graphics applications, images and videos.


The computer system 400 also includes user input/output device(s) 403, such as monitors, keyboards, pointing devices, etc., which communicate with communication infrastructure 406 through user input/output interface(s) 402.


The computer system 400 also includes a main or primary memory 408, such as a random-access memory (RAM) to provide an example. The main memory 408 can include one or more levels of cache. The main memory 408 has stored therein control logic (i.e., computer software) and/or data, to perform operations of the monitoring server 202, the monitoring storage 204, the monitoring stations 206.1 through 206.n, and the administrative station 208 as described above in FIG. 2.


The computer system 400 can also include one or more secondary storage devices or memory 410 to store data for performing operations of the monitoring server 202, the monitoring storage 204, the monitoring stations 206.1 through 206.n, and the administrative station 208 as described above in FIG. 2. The one or more secondary storage devices or memory 410 can include, for example, a hard disk drive 412 and/or a removable storage device or drive 414. The removable storage drive 414 may be a floppy disk drive, a magnetic tape drive, a compact disk drive, an optical storage device, a tape backup device, and/or any other storage device/drive. The removable storage drive 414 may interact with a removable storage unit 418. The removable storage unit 418 includes a computer usable or readable storage device having stored thereon computer software (control logic) and/or data. The removable storage unit 418 may be a floppy disk, magnetic tape, compact disk, DVD, optical storage disk, and/or any other computer data storage device. The removable storage drive 414 reads from and/or writes to removable storage unit 418 in a well-known manner.


According to an exemplary embodiment, the one or more secondary storage devices or memory 410 may include other means, instrumentalities or other approaches for allowing computer programs and/or other instructions and/or data to be accessed by computer system 400. Such means, instrumentalities or other approaches may include, for example, a removable storage unit 422 and an interface 420. Examples of the removable storage unit 422 and the interface 420 may include a program cartridge and cartridge interface (such as that found in video game devices), a removable memory chip (such as an EPROM or PROM) and associated socket, a memory stick and USB port, a memory card and associated memory card slot, and/or any other removable storage unit and associated interface.


The computer system 400 may further include a communication or network interface 424. The communication or network interface 424 enables the computer system 400 to communicate and interact with any combination of remote devices, remote networks, remote entities, etc. (individually and collectively referenced by reference number 428). For example, the communication or network interface 424 may allow the computer system 400 to communicate with the remote devices 428 over a communication path 426, which may be wired and/or wireless, and which may include any combination of LANs, WANs, the Internet, etc. Control logic and/or data may be transmitted to and from the computer system 400 via communication path 426.


In an embodiment, a tangible apparatus or article of manufacture comprising a tangible computer useable or readable medium having control logic (software) stored thereon is also referred to herein as a computer program product or program storage device. This includes, but is not limited to, the computer system 400, the main memory 408, the secondary memory 410, and the removable storage units 418 and 422, as well as tangible articles of manufacture embodying any combination of the foregoing. Such control logic, when executed by one or more data processing devices (such as computer system 400), causes such data processing devices to operate as described herein.


Based on the teachings contained in this disclosure, it will be apparent to persons skilled in the relevant art(s) how to make and use the invention using data processing devices, computer systems and/or computer architectures other than that illustrated in FIG. 4. In particular, embodiments may operate with software, hardware, and/or operating system implementations other than those described herein.


CONCLUSION

The Detailed Description referred to accompanying figures to illustrate exemplary embodiments consistent with the disclosure. References in the disclosure to "an exemplary embodiment" indicate that the exemplary embodiment described can include a particular feature, structure, or characteristic, but every exemplary embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same exemplary embodiment. Further, any feature, structure, or characteristic described in connection with an exemplary embodiment can be included, independently or in any combination, with features, structures, or characteristics of other exemplary embodiments whether or not explicitly described.


The exemplary embodiments described within the disclosure have been provided for illustrative purposes, and are not intended to be limiting. Other exemplary embodiments are possible, and modifications can be made to the exemplary embodiments while remaining within the spirit and scope of the disclosure. The disclosure has been described with the aid of functional building blocks illustrating the implementation of specified functions and relationships thereof. The boundaries of these functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternate boundaries can be defined so long as the specified functions and relationships thereof are appropriately performed.


The Detailed Description of the exemplary embodiments fully revealed the general nature of the disclosure such that others can, by applying knowledge of those skilled in relevant art(s), readily modify and/or adapt for various applications such exemplary embodiments, without undue experimentation, without departing from the spirit and scope of the disclosure. Therefore, such adaptations and modifications are intended to be within the meaning and range of equivalents of the exemplary embodiments based on the teaching and guidance presented herein. It is to be understood that the phraseology or terminology herein is for the purpose of description and not of limitation, such that the terminology or phraseology of the present specification is to be interpreted by those skilled in relevant art(s) in light of the teachings herein.

Claims
  • 1. A monitoring platform for monitoring activity within a controlled environment facility, comprising: a memory; and one or more processors configured to: receive a monitoring directive defining one or more activities and levels to be monitored; monitor inmate activity according to the monitoring directive, the monitoring including reviewing content of an inmate communication; determine, based on the monitoring, a degree to which the monitoring directive has been met; receive billing guidelines; and generate a bill of payment based on the degree and the billing guidelines.
  • 2. The monitoring platform of claim 1, wherein the activities include one or more of telephone communications, messaging communications, video communications, inter-inmate communications, Internet browsing activity, visitations, and visitation communication.
  • 3. The monitoring platform of claim 1, wherein the levels to be monitored define a percentage of a total amount of corresponding activities.
  • 4. The monitoring platform of claim 1, wherein the monitoring directive corresponds to review parameters set forth in a services contract.
  • 5. The monitoring platform of claim 1, wherein the monitoring directive corresponds to local, state, or national prison monitoring regulations.
  • 6. The monitoring platform of claim 1, wherein the degree is determined based on a comparison of an amount of an activity monitored to the level associated with the activity.
  • 7. The monitoring platform of claim 1, wherein the one or more processors are further configured to: obtain a baseline billing price from the billing guidelines; calculate a billing adjustment based on the degree and the billing guidelines; and adjust the baseline billing price according to the billing adjustment to generate a final billing amount.
  • 8. The monitoring platform of claim 7, wherein the bill of payment assesses a cost in accordance with the final billing amount.
  • 9. The monitoring platform of claim 1, wherein the one or more processors are further configured to: monitor activity of a reviewer; and instruct the reviewer to provide an input after a predetermined period of inactivity by the reviewer, wherein the degree is based on the monitoring of the activity of the reviewer.
  • 10. A monitoring method for monitoring activity within a controlled environment facility, comprising: receiving a monitoring directive defining one or more activities and levels to be monitored; monitoring inmate activity according to the monitoring directive, the monitoring including reviewing content of an inmate communication; determining, based on the monitoring, a degree to which the monitoring directive has been met; receiving billing guidelines; and generating a bill of payment based on the degree and the billing guidelines.
  • 11. The method of claim 10, wherein the activities include one or more of telephone communications, messaging communications, video communications, inter-inmate communications, Internet browsing activity, visitations, and visitation communication.
  • 12. The method of claim 10, wherein the levels to be monitored define a percentage of a total amount of corresponding activities.
  • 13. The method of claim 10, wherein the monitoring directive corresponds to review parameters set forth in a services contract.
  • 14. The method of claim 10, wherein the monitoring directive corresponds to local, state, or national prison monitoring regulations.
  • 15. The method of claim 10, wherein the degree is determined based on a comparison of an amount of an activity monitored to the level associated with the activity.
  • 16. The method of claim 10, further comprising: obtaining a baseline billing price from the billing guidelines; calculating a billing adjustment based on the degree and the billing guidelines; and adjusting the baseline billing price according to the billing adjustment to generate a final billing amount.
  • 17. The method of claim 16, wherein the bill of payment assesses a cost in accordance with the final billing amount.
  • 18. The method of claim 10, further comprising: monitoring activity of a reviewer; and instructing the reviewer to provide an input after a predetermined period of inactivity by the reviewer, wherein the degree is based on the monitoring of the activity of the reviewer.
  • 19. A monitoring platform for monitoring activity within a controlled environment facility, comprising: a memory; and one or more processors configured to: receive a monitoring directive defining one or more activities and levels to be monitored; monitor inmate activity according to the monitoring directive, the monitoring including reviewing content of an inmate communication; monitor activity of a reviewer; instruct the reviewer to provide an input after a predetermined period of inactivity by the reviewer; determine, based on the monitoring of the activity of the reviewer, a degree to which the monitoring directive has been met; receive billing guidelines; and generate a bill of payment based on the degree and the billing guidelines.
  • 20. The monitoring platform of claim 19, wherein the one or more processors are further configured to: obtain a baseline billing price from the billing guidelines; calculate a billing adjustment based on the degree and the billing guidelines; and adjust the baseline billing price according to the billing adjustment to generate a final billing amount.
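The degree and billing computations recited in claims 6, 7, 16, and 20 can be illustrated as a purely hypothetical, non-limiting sketch. The function names, the per-activity averaging, the linear penalty model, and the guideline fields below are illustrative assumptions introduced here; they are not recited in, and do not limit, the claims:

```python
def monitoring_degree(monitored, directive):
    """Claim 6 sketch: compare the amount of each activity actually
    reviewed against the level set for that activity in the monitoring
    directive, and average the per-activity ratios (capped at 100%)."""
    ratios = [
        min(monitored.get(activity, 0.0) / level, 1.0)
        for activity, level in directive.items()
    ]
    return sum(ratios) / len(ratios)


def final_billing_amount(degree, guidelines):
    """Claim 7 sketch: obtain a baseline billing price from the billing
    guidelines, calculate a billing adjustment from the degree (here, a
    linear penalty for any unmet portion), and apply the adjustment."""
    baseline = guidelines["baseline_price"]
    adjustment = baseline * (1.0 - degree) * guidelines["penalty_rate"]
    return baseline - adjustment
```

For example, if the directive requires 50% of telephone activity and 100% of video activity to be reviewed, but only 25% of telephone activity was reviewed, the degree is 0.75, and a $1,000 baseline with a 0.5 penalty rate yields a $875 bill under this hypothetical model.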
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application is a continuation of U.S. patent application Ser. No. 17/349,572, filed Jun. 16, 2021, which is a continuation of U.S. patent application Ser. No. 16/827,403, filed Mar. 23, 2020, issued as U.S. Pat. No. 11,044,361, which is a continuation of U.S. patent application Ser. No. 16/282,886, filed Feb. 22, 2019, issued as U.S. Pat. No. 10,601,982, which is a continuation of U.S. patent application Ser. No. 15/611,598, filed Jun. 1, 2017, issued as U.S. Pat. No. 10,225,396, which claims the benefit of U.S. Provisional Patent Appl. No. 62/508,106, filed May 18, 2017, each of which is incorporated herein by reference in its entirety.

US Referenced Citations (390)
Number Name Date Kind
4054756 Comella et al. Oct 1977 A
4191860 Weber Mar 1980 A
4670628 Boratgis et al. Jun 1987 A
4691347 Stanley et al. Sep 1987 A
4737982 Boratgis et al. Apr 1988 A
4813070 Humphreys et al. Mar 1989 A
4907221 Pariani et al. Mar 1990 A
4918719 Daudelin Apr 1990 A
4935956 Hellwarth et al. Jun 1990 A
4943973 Werner Jul 1990 A
4995030 Helf Feb 1991 A
5185781 Dowden et al. Feb 1993 A
5210789 Jeffus et al. May 1993 A
5229764 Matchett et al. Jul 1993 A
5291548 Tsumura et al. Mar 1994 A
5319702 Kitchin et al. Jun 1994 A
5319735 Preuss et al. Jun 1994 A
5345501 Shelton Sep 1994 A
5345595 Johnson et al. Sep 1994 A
5379345 Greenberg Jan 1995 A
5425091 Josephs Jun 1995 A
5438616 Peoples Aug 1995 A
5469370 Ostrover et al. Nov 1995 A
5485507 Brown et al. Jan 1996 A
5502762 Andrew et al. Mar 1996 A
5517555 Amadon et al. May 1996 A
5535194 Brown et al. Jul 1996 A
5535261 Brown et al. Jul 1996 A
5539731 Haneda et al. Jul 1996 A
5539812 Kitchin et al. Jul 1996 A
5544649 David et al. Aug 1996 A
5555551 Rudokas et al. Sep 1996 A
5583925 Bernstein Dec 1996 A
5590171 Howe et al. Dec 1996 A
5592548 Sih Jan 1997 A
5613004 Cooperman Mar 1997 A
5619561 Reese Apr 1997 A
5627887 Freedman May 1997 A
5634086 Rtischev et al. May 1997 A
5634126 Norell May 1997 A
5636292 Rhoads Jun 1997 A
5640490 Hansen et al. Jun 1997 A
5646940 Hotto Jul 1997 A
5649060 Ellozy et al. Jul 1997 A
5655013 Gainsboro Aug 1997 A
5675704 Juang et al. Oct 1997 A
5687236 Moskowitz Nov 1997 A
5710834 Rhoads Jan 1998 A
5719937 Warren et al. Feb 1998 A
5745558 Richardson, Jr. et al. Apr 1998 A
5745569 Moskowitz Apr 1998 A
5745604 Rhoads Apr 1998 A
5748726 Unno May 1998 A
5748763 Rhoads May 1998 A
5748783 Rhoads May 1998 A
5757889 Ohtake May 1998 A
5768355 Salibrici et al. Jun 1998 A
5768426 Rhoads Jun 1998 A
5774452 Greenberg Jun 1998 A
5793415 Gregory, III et al. Aug 1998 A
5796811 McFarlen Aug 1998 A
5805685 McFarlen Sep 1998 A
5809462 Nussbaum Sep 1998 A
5822432 Moskowitz Oct 1998 A
5822436 Rhoads Oct 1998 A
5832068 Smith Nov 1998 A
5832119 Rhoads Nov 1998 A
5835486 Davis et al. Nov 1998 A
5841886 Rhoads Nov 1998 A
5841978 Rhoads Nov 1998 A
5850481 Rhoads Dec 1998 A
5861810 Nguyen Jan 1999 A
5862260 Rhoads Jan 1999 A
5867562 Scherer Feb 1999 A
5883945 Richardson et al. Mar 1999 A
5889868 Seraphim et al. Mar 1999 A
5899972 Miyazawa et al. May 1999 A
5907602 Peel et al. May 1999 A
5915001 Uppaluru Jun 1999 A
5920834 Sih et al. Jul 1999 A
5923746 Baker et al. Jul 1999 A
5926533 Gainsboro Jul 1999 A
5930369 Cox et al. Jul 1999 A
5930377 Powell et al. Jul 1999 A
5937035 Andruska et al. Aug 1999 A
5953049 Horn et al. Sep 1999 A
5960080 Fahlman et al. Sep 1999 A
5963909 Warren et al. Oct 1999 A
5982891 Ginter et al. Nov 1999 A
5991373 Pattison et al. Nov 1999 A
5999828 Sih et al. Dec 1999 A
6011849 Orrin Jan 2000 A
6026193 Rhoads Feb 2000 A
6035034 Trump Mar 2000 A
6038315 Strait et al. Mar 2000 A
6052454 Kek et al. Apr 2000 A
6052462 Lu Apr 2000 A
6058163 Pattison et al. May 2000 A
6064963 Gainsboro May 2000 A
6072860 Kek et al. Jun 2000 A
6078567 Traill et al. Jun 2000 A
6078645 Cai et al. Jun 2000 A
6078807 Dunn et al. Jun 2000 A
6111954 Rhoads Aug 2000 A
6118860 Hillson et al. Sep 2000 A
6122392 Rhoads Sep 2000 A
6122403 Rhoads Sep 2000 A
6138119 Hall et al. Oct 2000 A
6141406 Johnson Oct 2000 A
6160903 Hamid et al. Dec 2000 A
6173284 Brown Jan 2001 B1
6175831 Weinreich et al. Jan 2001 B1
6185416 Rudokas et al. Feb 2001 B1
6185683 Ginter et al. Feb 2001 B1
6205249 Moskowitz Mar 2001 B1
6211783 Wang Apr 2001 B1
6219640 Basu et al. Apr 2001 B1
6233347 Chen et al. May 2001 B1
6237786 Ginter et al. May 2001 B1
6243480 Zhao et al. Jun 2001 B1
6243676 Witteman Jun 2001 B1
6253193 Ginter et al. Jun 2001 B1
6263507 Ahmad et al. Jul 2001 B1
6266430 Rhoads Jul 2001 B1
6278772 Bowater et al. Aug 2001 B1
6278781 Rhoads Aug 2001 B1
6289108 Rhoads Sep 2001 B1
6301360 Bocionek et al. Oct 2001 B1
6308171 De La Huerga Oct 2001 B1
6312911 Bancroft Nov 2001 B1
6314192 Chen et al. Nov 2001 B1
6324573 Rhoads Nov 2001 B1
6324650 Ogilvie Nov 2001 B1
6330335 Rhoads Dec 2001 B1
6343138 Rhoads Jan 2002 B1
6343738 Ogilvie Feb 2002 B1
6345252 Beigi et al. Feb 2002 B1
6381321 Brown et al. Apr 2002 B1
6389293 Clore et al. May 2002 B1
6421645 Beigi et al. Jul 2002 B1
6526380 Thelen et al. Feb 2003 B1
6542602 Elazar Apr 2003 B1
6611583 Gainsboro Aug 2003 B1
6625261 Holtzberg Sep 2003 B2
6625587 Erten et al. Sep 2003 B1
6633846 Bennett et al. Oct 2003 B1
6636591 Swope et al. Oct 2003 B1
6639977 Swope et al. Oct 2003 B1
6639978 Draizin et al. Oct 2003 B2
6647096 Milliorn et al. Nov 2003 B1
6665376 Brown Dec 2003 B1
6665644 Kanevsky et al. Dec 2003 B1
6668044 Schwartz et al. Dec 2003 B1
6668045 Mow Dec 2003 B1
6671292 Haartsen Dec 2003 B1
6688518 Valencia et al. Feb 2004 B1
6728345 Glowny et al. Apr 2004 B2
6728682 Fasciano Apr 2004 B2
6748356 Beigi et al. Jun 2004 B1
6760697 Neumeyer et al. Jul 2004 B1
6763099 Blink Jul 2004 B1
6782370 Stack Aug 2004 B1
6788772 Barak et al. Sep 2004 B2
6795540 Mow Sep 2004 B1
6810480 Parker et al. Oct 2004 B1
6850609 Schrage Feb 2005 B1
6880171 Ahmad et al. Apr 2005 B1
6895086 Martin May 2005 B2
6898612 Parra et al. May 2005 B1
6907387 Reardon Jun 2005 B1
6920209 Gainsboro Jul 2005 B1
6947525 Benco Sep 2005 B2
6970554 Peterson et al. Nov 2005 B1
7032007 Fellenstein et al. Apr 2006 B2
7035386 Susen et al. Apr 2006 B1
7039171 Gickler May 2006 B2
7039585 Wilmot et al. May 2006 B2
7046779 Hesse May 2006 B2
7050918 Pupalaikis et al. May 2006 B2
7062286 Grivas et al. Jun 2006 B2
7075919 Wendt et al. Jul 2006 B1
7079636 McNitt et al. Jul 2006 B1
7079637 McNitt et al. Jul 2006 B1
7092494 Anders et al. Aug 2006 B1
7103549 Bennett et al. Sep 2006 B2
7106843 Gainsboro et al. Sep 2006 B1
7123704 Martin Oct 2006 B2
7133511 Buntin et al. Nov 2006 B2
7133828 Scarano et al. Nov 2006 B2
7133845 Ginter et al. Nov 2006 B1
7149788 Gundla et al. Dec 2006 B1
7191133 Pettay Mar 2007 B1
7197560 Caslin et al. Mar 2007 B2
7236932 Grajski Jun 2007 B1
7248685 Martin Jul 2007 B2
7256816 Profanchik et al. Aug 2007 B2
7277468 Tian et al. Oct 2007 B2
7280816 Fratti et al. Oct 2007 B2
7324637 Brown et al. Jan 2008 B2
7333798 Hodge Feb 2008 B2
7366782 Chong et al. Apr 2008 B2
7406039 Cherian et al. Jul 2008 B2
7417983 He et al. Aug 2008 B2
7424715 Dutton Sep 2008 B1
7466816 Blair Dec 2008 B2
7494061 Reinhold Feb 2009 B2
7496345 Rae et al. Feb 2009 B1
7505406 Spadaro et al. Mar 2009 B1
7519169 Hingoranee et al. Apr 2009 B1
7529357 Rae et al. May 2009 B1
7551732 Anders Jun 2009 B2
7596498 Basu et al. Sep 2009 B2
7639791 Hodge Dec 2009 B2
7664243 Martin Feb 2010 B2
7672845 Beranek et al. Mar 2010 B2
RE41190 Darling Apr 2010 E
7698182 Falcone et al. Apr 2010 B2
7742581 Hodge et al. Jun 2010 B2
7742582 Harper Jun 2010 B2
7783021 Hodge Aug 2010 B2
7804941 Keiser et al. Sep 2010 B2
7826604 Martin Dec 2010 B2
7848510 Shaffer et al. Dec 2010 B2
7853243 Hodge Dec 2010 B2
7860222 Sidler et al. Dec 2010 B1
7881446 Apple et al. Feb 2011 B1
7899167 Rae Mar 2011 B1
7961860 McFarlen Jun 2011 B1
8031052 Polozola Oct 2011 B2
8099080 Rae et al. Jan 2012 B1
8135115 Hogg, Jr. et al. Mar 2012 B1
8204177 Harper Jun 2012 B2
8295446 Apple et al. Oct 2012 B1
8458732 Hanna et al. Jun 2013 B2
8467381 Keiser et al. Jun 2013 B1
8488756 Hodge et al. Jul 2013 B2
8498937 Shipman, Jr. et al. Jul 2013 B1
8509390 Harper Aug 2013 B2
8577003 Rae Nov 2013 B2
8630726 Hodge et al. Jan 2014 B2
8731934 Olligschlaeger et al. May 2014 B2
8886663 Gainsboro et al. Nov 2014 B2
8894417 Mandel Nov 2014 B2
8917848 Torgersrud et al. Dec 2014 B2
8929525 Edwards Jan 2015 B1
9020115 Hangsleben Apr 2015 B2
9043813 Hanna et al. May 2015 B2
9077680 Harper Jul 2015 B2
9094500 Edwards Jul 2015 B1
9143609 Hodge Sep 2015 B2
9232051 Torgersrud et al. Jan 2016 B2
9307386 Hodge et al. Apr 2016 B2
9396320 Lindemann Jul 2016 B2
9552417 Olligschlaeger et al. Jan 2017 B2
9609121 Hodge Mar 2017 B1
9615060 Hodge Apr 2017 B1
9621504 Torgersrud et al. Apr 2017 B2
9621714 Seyfetdinov Apr 2017 B2
9674340 Hodge Jun 2017 B1
9800830 Humpries Oct 2017 B2
9923936 Hodge Mar 2018 B2
10225396 Hodge Mar 2019 B2
10394900 Edwards Aug 2019 B1
10440175 Dover Oct 2019 B1
10601982 Hodge Mar 2020 B2
10628571 Kursun et al. Apr 2020 B2
10686935 Parampottil et al. Jun 2020 B1
11044361 Hodge Jun 2021 B2
11563845 Hodge Jan 2023 B2
20010036821 Gainsboro et al. Nov 2001 A1
20010043697 Cox et al. Nov 2001 A1
20010056349 St. John Dec 2001 A1
20010056461 Kampe et al. Dec 2001 A1
20020002464 Pertrushin Jan 2002 A1
20020010587 Pertrushin Jan 2002 A1
20020032566 Tzirkel-Hancock et al. Mar 2002 A1
20020046057 Ross Apr 2002 A1
20020067272 Lemelson et al. Jun 2002 A1
20020069084 Donovan Jun 2002 A1
20020076014 Holtzberg Jun 2002 A1
20020107871 Wyzga et al. Aug 2002 A1
20020147707 Kraay et al. Oct 2002 A1
20020174183 Saeidi Nov 2002 A1
20030002639 Huie Jan 2003 A1
20030023444 St. John Jan 2003 A1
20030023874 Prokupets et al. Jan 2003 A1
20030035514 Jang Feb 2003 A1
20030040326 Levy et al. Feb 2003 A1
20030070076 Michael Apr 2003 A1
20030086546 Falcone et al. May 2003 A1
20030093533 Ezerzer et al. May 2003 A1
20030099337 Lord May 2003 A1
20030117280 Prehn Jun 2003 A1
20030126470 Crites et al. Jul 2003 A1
20030174826 Hesse Sep 2003 A1
20030190045 Huberman et al. Oct 2003 A1
20040008828 Coles et al. Jan 2004 A1
20040029564 Hodge Feb 2004 A1
20040081296 Brown et al. Apr 2004 A1
20040161086 Buntin et al. Aug 2004 A1
20040169683 Chiu et al. Sep 2004 A1
20040249650 Freedman et al. Dec 2004 A1
20040252184 Hesse et al. Dec 2004 A1
20040252447 Hesse et al. Dec 2004 A1
20050010411 Rigazio et al. Jan 2005 A1
20050027723 Jones et al. Feb 2005 A1
20050078006 Hutchins et al. Apr 2005 A1
20050080625 Bennett et al. Apr 2005 A1
20050094794 Creamer et al. May 2005 A1
20050102371 Aksu May 2005 A1
20050114192 Tor et al. May 2005 A1
20050125226 Magee Jun 2005 A1
20050128283 Bulriss et al. Jun 2005 A1
20050141678 Anders et al. Jun 2005 A1
20050144004 Bennett et al. Jun 2005 A1
20050170818 Netanel et al. Aug 2005 A1
20050182628 Choi Aug 2005 A1
20050207357 Koga Sep 2005 A1
20060064037 Shalon et al. Mar 2006 A1
20060087554 Boyd et al. Apr 2006 A1
20060087555 Boyd et al. Apr 2006 A1
20060089837 Adar et al. Apr 2006 A1
20060093099 Cho May 2006 A1
20060198504 Shemisa et al. Sep 2006 A1
20060200353 Bennett Sep 2006 A1
20060285650 Hodge Dec 2006 A1
20060285665 Wasserblat et al. Dec 2006 A1
20070003026 Hodge et al. Jan 2007 A1
20070011008 Scarano et al. Jan 2007 A1
20070041545 Gainsboro Feb 2007 A1
20070047734 Frost Mar 2007 A1
20070071206 Gainsboro et al. Mar 2007 A1
20070133437 Wengrovitz et al. Jun 2007 A1
20070185717 Bennett Aug 2007 A1
20070192174 Bischoff Aug 2007 A1
20070195703 Boyajian et al. Aug 2007 A1
20070237099 He et al. Oct 2007 A1
20070244690 Peters Oct 2007 A1
20080000966 Keiser Jan 2008 A1
20080021708 Bennett et al. Jan 2008 A1
20080046241 Osburn et al. Feb 2008 A1
20080096178 Rogers et al. Apr 2008 A1
20080106370 Perez et al. May 2008 A1
20080118045 Polozola et al. May 2008 A1
20080195387 Zigel et al. Aug 2008 A1
20080198978 Olligschlaeger Aug 2008 A1
20080201143 Olligschlaeger et al. Aug 2008 A1
20080201158 Johnson et al. Aug 2008 A1
20080260133 Hodge et al. Oct 2008 A1
20080300878 Bennett Dec 2008 A1
20090178144 Redlich et al. Jul 2009 A1
20100177881 Hodge Jul 2010 A1
20100202595 Hodge et al. Aug 2010 A1
20100299761 Shapiro Nov 2010 A1
20110055256 Phillips et al. Mar 2011 A1
20110206038 Hodge Aug 2011 A1
20110244440 Saxon et al. Oct 2011 A1
20110279228 Kumar et al. Nov 2011 A1
20120173528 Kreindler Jul 2012 A1
20120262271 Torgersrud et al. Oct 2012 A1
20130104246 Bear et al. Apr 2013 A1
20130124192 Lindmark et al. May 2013 A1
20130179949 Shapiro Jul 2013 A1
20130279686 Keiser et al. Oct 2013 A1
20140247926 Gainsboro et al. Sep 2014 A1
20140273929 Torgersrud Sep 2014 A1
20140287715 Hodge et al. Sep 2014 A1
20140313275 Gupta et al. Oct 2014 A1
20140334610 Hangsleben Nov 2014 A1
20150051893 Ratcliffe, III et al. Feb 2015 A1
20150206417 Bush Jul 2015 A1
20150215254 Bennett Jul 2015 A1
20150221151 Bacco et al. Aug 2015 A1
20150281431 Gainsboro et al. Oct 2015 A1
20150281433 Gainsboro et al. Oct 2015 A1
20160027278 McIntosh et al. Jan 2016 A1
20160191484 Gongaware Jun 2016 A1
20160217807 Gainsboro et al. Jul 2016 A1
20160224538 Chandrasekar et al. Aug 2016 A1
20160239932 Sidler Aug 2016 A1
20160301728 Keiser et al. Oct 2016 A1
20160371756 Yokel et al. Dec 2016 A1
20160373909 Rasmussen et al. Dec 2016 A1
20170270115 Cormack et al. Sep 2017 A1
20170270627 Hodge Sep 2017 A1
20170295212 Hodge Oct 2017 A1
20170323410 Donovan Nov 2017 A1
20180278464 Donovan et al. Sep 2018 A1
20180338036 Hodge Nov 2018 A1
20190163891 Kursun et al. May 2019 A1
Foreign Referenced Citations (11)
Number Date Country
1280137 Dec 2004 EP
2579676 Apr 2013 EP
2075313 Nov 1981 GB
59225626 Dec 1984 JP
60010821 Jan 1985 JP
61135239 Jun 1986 JP
3065826 Mar 1991 JP
WO 9614703 Nov 1995 WO
WO 9813993 Apr 1998 WO
WO 2001074042 Oct 2001 WO
WO 2016028864 Feb 2016 WO
Non-Patent Literature Citations (118)
Entry
“Cisco IAD2400 Series Business-Class Integrated Access Device”, Cisco Systems Datasheet, 2003; 8 pages.
“Cisco IAD2420 Series Integrated Access Devices Software Configuration Guide—Initial Configuration,” Cisco Systems, accessed Sep. 23, 2014, accessible at http://www.cisco.com/en/US/docs/routers/access/2400/2420/software/configuration/guide/init_cf.html; 5 pages.
“Hong Kong: Prison Conditions in 1997,” Human Rights Watch, Mar. 1, 1997, C905, available at http://www.refworld.org/docid/3ae6a7d014.html, accessed May 29, 2014; 48 pages.
“PacketCable™ 1.0 Architecture Framework Technical Report,” PKT-TR-ARCH-V01-001201 (Cable Television Laboratories, Inc. 1999).
“PacketCable™ Audio/Video Codecs Specification,” Cable Television Laboratories, Inc., Ser. No. PKT-SP-CODEC-105-040113 (2004).
“Service-Observing Arrangements Using Key Equipment for Telephone Company Business Offices, Description and Use,” Pac. Tel. & Tel. Co., Bell System Practices, Station Operations Manual, Section C71.090, Issue A, 1-1-57-N, 1957; 8 pages.
“SIP and IPLink™ in the Next Generation Network: An Overview,” Intel, 2001; 6 pages.
“The AutoEDMS Document Management and Workflow System: An Overview of Key Features, Functions and Capabilities,” ACS Software, May 2003; 32 pages.
“Voice Over Packet in Next Generation Networks: An Architectural Framework,” Bellcore, Special Report SR-4717, Issue 1, Jan. 1999; 288 pages.
“Cool Edit Pro, Version 1.2 User Guide,” Syntrillium Software Corporation, 1998; 226 pages.
“Criminal Calls: A Review of the Bureau of Prisons' Management of Inmate Telephone Privileges,” U.S. Department of Justice, Office of the Inspector General, Aug. 1999; 166 pages.
“Global Call API for Linux and Windows Operating Systems,” Intel Dialogic Library Reference, Dec. 2005; 484 pages.
“The NIST Year 2002 Speaker Recognition Evaluation Plan,” NIST, Feb. 27, 2002, accessible at http://www.itl.nist.gov/iad/mig/tests/spk/2002/2002-spkrecevalplan-v60.pdf; 9 pages.
Aggarwal, et al., “An Environment for Studying Switching System Software Architecture,” IEEE, Global Telecommunications Conference, 1988; 7 pages.
Auckenthaler, et al., “Speaker—Centric Score Normalization and Time Pattern Analysis for Continuous Speaker Verification,” International Conference on Acoustics, Speech, and Signal Processing (ICASSP), vol. 2, Jun. 2000, pp. 1065-1068.
Audacity Team, “About Audacity,” World Wide Web, 2014, accessible at http://wiki.audacity.team.org/wiki/About_Audacity; 3 pages.
Beek et al., “An Assessment of the Technology of Automatic Speech Recognition for Military Applications,” IEEE Trans. Acoustics, Speech, and Signal Processing, vol. ASSP-25, No. 4, 1977; pp. 310-322.
Beigi, et al., “A Hierarchical Approach to Large-Scale Speaker Recognition,” Euro Speech 1999, Sep. 1999, vol. 5; pp. 2203-2206.
Beigi, et al., “IBM Model-Based and Frame-By-Frame Speaker-Recognition,” Speaker Recognition and its Commercial and Forensic Applications, Apr. 1998; pp. 1-4.
Beigi, H., “Challenges of Large-Scale Speaker Recognition,” 3rd European Cooperation in the Field of Scientific and Technical Research Conference, Nov. 4, 2005; 33 pages.
Beigi, H., “Decision Theory,” Fundamentals of Speaker Recognition, Ch. 9, Springer, US 2011; pp. 313-339.
Bender, et al., “Techniques For Data Hiding,” IBM Systems Journal, vol. 35, Nos. 3&4, 1996; 24 pages.
Boersma, et al., “Praat: Doing Phonetics by computer,” World Wide Web, 2015, accessible at http://www.fon.hum.uva.nl/praat; 2 pages.
Bolton, et al., “Statistical Fraud Detection: A Review,” Statistical Science, vol. 17, No. 3 (2002), pp. 235-255.
Boney, L., et al., “Digital Watermarks for Audio Signals,” Proceedings of EUSIPCO-96, Eighth European Signal Processing Conference, Trieste, Italy, 10-13 (1996).
Boney, L., et al., “Digital Watermarks for Audio Signals” Proceedings of the International Conference on Multimedia Computing Systems, p. 473-480, IEEE Computer Society Press, United States (1996).
Bur Goode, Voice Over Internet Protocol (VOIP), Proceedings of the IEEE, vol. 90, No. 9, Sep. 2002; pp. 1495-1517.
Carey, et al., “User Validation for Mobile Telephones,” International Conference on Acoustics, Speech, and Signal Processing (ICASSP), vol. 2, Jun. 2000, pp. 1093-1096.
Chau, et al., “Building an Infrastructure for Law Enforcement Information Sharing and Collaboration: Design Issues and Challenges,” National Conference on Digital Government, 2001; 6 pages.
Chaudhari, et al., “Transformation enhanced multi-grained modeling for text—independent speaker recognition,” International Conference on Spoken Language Processing, 2000, pp. 298-301.
Christel, et al., “Interactive Maps for a Digital Video Library,” IEEE Special Edition on Multimedia Computing, January-Mar. 2000, IEEE, United States; pp. 60-67.
Clavel, et al., “Events Detection for an Audio-Based Surveillance System,” IEEE International Conference on Multimedia and Expo (ICME2005), Jul. 6-8, 2005, pp. 1306-1309.
Coden, et al., “Speech Transcript Analysis for Automatic Search,” Proceedings of the 34th Hawaii International Conference on System Sciences, IEEE, 2001; 9 pages.
Coherent Announces Industry's First Remote Management System for Echo Canceller, Business Wire, Mar. 3, 1997; 3 pages.
Corbato, et al., “Introduction and Overview of the MULTICS System,” Proceedings—Fall Joint Computer Conference, 1965; 12 pages.
Cox, et al.; “Secure Spread Spectrum Watermarking for Multimedia,” NEC Research Institute, Technical Report 95-10, Dec. 1997; 34 pages.
Digital “Bellcore Notes on the Networks,” Bellcore, Special Report SR-2275, Issue 3, Dec. 1997.
Doddington, G., “Speaker Recognition based on Idiolectal Differences between Speakers,” 7th European Conference on Speech Communication and Technology, Sep. 3-7, 2001; 4 pages.
Dunn, et al., “Approaches to speaker detection and tracking in conversational speech,” Digital Signal Processing, vol. 10, 2000; pp. 92-112.
Dye, Charles, “Oracle Distributed Systems,” O'Reilly Media, Inc., Apr. 1, 1999; 29 pages.
Fischer, Alan D., “Coplink nabs criminals faster,” Arizona Daily Star, Jan. 7, 2001; 5 pages.
Fleischman, E., “Advanced Streaming Format (ASF) Specification,” Microsoft Corporation, Jan. 9, 1998; 78 pages.
Fox, B., “The First Amendment Rights of Prisoners,” 63 J. Crim. L. Criminology & Police Sci. 162, 1972; 24 pages.
Frankel, E., Audioconferencing Options (Teleconferencing Units, Conference Bridges and Service Bureaus), Teleconnect, vol. 4, No. 5, p. 131(3), May 1996; 6 pages.
Furui, et al., “Experimental studies in a new automatic speaker verification system using telephone speech,” Acoustics, Speech, and Signal Processing, IEEE International Conference on ICASSP '80, vol. 5, Apr. 1980, pp. 1060-1062.
Furui, S., “50 Years of Progress in Speech and Speaker Recognition Research,” ECTI Transactions on Computer and Information Technology, vol. 1, No. 2, Nov. 2005, pp. 64-74.
Hansen, et al., “Speaker recognition using phoneme-specific gmms,” The Speaker and Language Recognition Workshop, May-Jun. 2004; 6 pages.
Hauck, et al., “Coplink: A Case of Intelligent Analysis and Knowledge Management,” University of Arizona, 1999; 20 pages.
Hewett, et al., Signaling System No. 7 (SS7/C7): Protocol, Architecture, and Services (Networking Technology), Cisco Press, Jun. 2005; 8 pages.
I2 Investigative Analysis Software; “Chart Reader”, URL: http://www.i2.co.uk/Products/Chart Reader. Jun. 13, 2005.
I2 Investigative Analysis Software; “i2 TextChart—Text Visualized”, URL: http://www.i2.co.uk/Products/i2TextChart/. Jun. 13, 2005.
I2 Investigative Analysis Software; “iBase-Information Captured”, URL: http://www.i2.co.uk/Products/iBase/. Jun. 13, 2005.
I2 Investigative Analysis Software; “iBridge”, URL: http://www.i2.co.uk/Products/iBridge/. Jun. 13, 2005.
I2 Investigative Analysis Software; “Pattern Tracer”, URL: http://www.i2.co.uk/Products/Pattern Tracer/. Jun. 13, 2005.
I2 Investigative Analysis Software; “Prisons”, URL: http://www.i2.co.uk/Solutions/Prisons/default.aso. Jun. 13, 2005.
I2 Investigative Analysis Software; “Setting International Standards for Investigative Analysis”, URL: http://www.i2.co.uk/Products/index.htm. Jun. 13, 2005.
IMAGIS Technologies, Inc. “Computer Arrest and Booking System”, [retrieved from http://www.imagistechnologies.com/Product/CABS.htm] (Nov. 5, 2002) 5 pages.
IMAGIS Technologies, Inc. “Integrated Justice System- Web-based Image and Data Sharing” [retrieved from http://www.imagistechnologies.com/Product/IJISFramework.htm>] (Nov. 5, 2002) 4 pages.
Inmate Telephone Services: Large Business: Voice, Oct. 2, 2001; 3 pages.
Intel® NetStructure High-Density Station Interface (HDSI) Boards Archived Webpage, Intel Corporation, 2003; 2 pages.
International Search Report and Written Opinion directed to International Application No. PCT/US2017/022169, mailed on May 29, 2017; 57 pages.
International Search Report for International Application No. PCT/US04/025029, European Patent Office, Netherlands, mailed on Mar. 14, 2006.
Isobe, et al., “A new cohort normalization using local acoustic information for speaker verification,” Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing, vol. 2, Mar. 1999; pp. 841-844.
Juang, et al., “Automatic Speech Recognition—A Brief History of the Technology Development,” Oct. 8, 2014; 24 pages.
Kinnunen, et al., “Real-Time Speaker Identification and Verification,” IEEE Transactions on Audio, Speech, and Language Processing, vol. 14, No. 1, Jan. 2006, pp. 277-288.
Knox, “The Problem of Gangs and Security Threat Groups (STG's) in American Prisons Today: Recent Research Findings From the 2004 Prison Gang Survey,” National Gang Crime Research Center, 2005; 67 pages.
Kozamernik, F., “Media Streaming over the Internet—an overview of delivery technologies,” EBU Technical Review, Oct. 2002; 15 pages.
Lane, et al., Language Model Switching Based on Topic Detection for Dialog Speech Recognition, Proceedings of the IEEE-ICASSP, vol. 1, 2003, IEEE; pp. 616-619.
Maes, et al., “Conversational speech biometrics,” E-Commerce Agents, Marketplace Solutions, Security Issues, and Supply and Demand, Springer-Verlang, London, UK, 2001, pp. 166-179.
Maes, et al., “Open SESAME! Speech, Password or Key to Secure Your Door?,” Asian Conference on Computer Vision, Jan. 1998; pp. 1-3.
Matsui, et al., “Concatenated Phoneme Models for Text-Variable Speaker Recognition,” International Conference on Acoustics, Speech, and Signal Processing (ICASSP), vol. 2, Apr. 1993; pp. 391-394.
McCollum, “Federal Prisoner Health Care Copayment Act of 2000,” House of Representatives Report 106-851, 106th Congress 2d Session, Sep. 14, 2000; 22 pages.
Microsoft White Paper: “Integrated Justice Information Systems”, retrieved from Microsoft Justice & Public Safety Solutions (Nov. 5, 2002) [http://jps.directtaps.net_vtibin/owssvr.dll?Using=Default%2ehtm]; 22 pages.
Moattar, et al., “Speech Overlap Detection using Spectral Features and its Application in Speech Indexing,” 2nd International Conference on Information & Communication Technologies, 2006; pp. 1270-1274.
National Alliance of Gang Investigators Associations, 2005 National Gang Threat Assessment, 2005, Bureau of Justice Assistance, Office of Justice Programs, U.S. Department of Justice; 73 pages.
National Major Gang Taskforce, “A Study of Gangs and Security Threat Groups in America's Adult Prisons and Jails,” 2002; 38 pages.
Navratil, et al., “A Speech Biometrics System With MultiGrained Speaker Modeling,” 2000; 5 pages.
Navratil, et al., “Phonetic speaker recognition using maximum-likelihood binary—decision tree models,” Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing, Apr. 6-10, 2003; 4 pages.
O'Harrow, R., “U.S. Backs Florida's New Counterterrorism Database; ‘Matrix’ Offers Law Agencies Faster Access to Americans' Personal Records”; The Washington Post. Washington, D.C., Aug. 6, 2003; p. A01.
O'Harrow, R., “Database will make tracking suspected terrorists easier,” The Dallas Morning News. Dallas, TX, Aug. 6, 2003; p. 7A.
Olligschlaeger, A. M., “Criminal Intelligence Databases and Applications, ” in Marilyn B. Peterson, Bob Morehouse, and Richard Wright, Intelligence 2000: Revising the Basic Elements—A Guide for Intelligence Professionals, Mar. 30, 2000 a joint publication of IALEIA and LEIU; 53 pages.
Osifchin, N., “A Telecommunications Buildings/Power Infrastructure in a New Era of Public Networking,” IEEE 2000; 7 pages.
Pages from http://www.corp.att.com/history, archived by web.archive.org on Nov. 4, 2013.
Pelecanos, J. “Conversational biometrics,” in Biometric Consortium Meeting, Baltimore, MD, Sep. 2006, accessible at http://www.biometrics.org/bc2006/presentations/Thu_Sep_21/Session_I/Pelecanos_Conversational_Biometrics.pdf; 14 pages.
Pollack, et al., “On the Identification of Speakers by Voice,” The Journal of the Acoustical Society of America, vol. 26, No. 3, May 1954; 4 pages.
Prosecution History of International Patent Application No. PCT/US99/09493 by Brown et al., filed Apr. 29, 1999.
Prosecution History of U.S. Appl. No. 11/182,625, filed Jul. 15, 2005.
Rey, R.F., ed., “Engineering and Operations in the Bell System, ” 2nd Edition, AT&T Bell Laboratories: Murray Hill, NJ, 1983; 884 pages.
Reynolds, D., “Automatic Speaker Recognition Using Gaussian Mixture Speaker Models, ” The Lincoln Laboratory Journal, vol. 8, No. 2, 1995; pp. 173-192.
Rosenberg, et al., “SIP: Session Initial Protocol,” Network Working Group, Standard Track, Jun. 2002; 269 pages.
Rosenberg, et al., “The Use of Cohort Normalized Scores for Speaker Verification,” Speech Research Department, AT&T Bell Laboratories, 2nd International Conference on Spoken Language Processing, Oct. 12-16, 1992; 4 pages.
Ross, et al., “Multimodal Biometrics: An Overview,” Proc. of 12th European Signal Processing Conference (EUSIPCO), Sep. 2004; pp. 1221-1224.
Science Dynamics, BubbleLINK Software Architecture, 2003; 10 pages.
Science Dynamics, Commander Call Control System, Rev. 1.04, 2002; 16 pages.
Science Dynamics, Inmate Telephone Control Systems, http://scidyn.com/fraudprev_main.htm (archived by web.archive.org on Jan. 12, 2001).
Science Dynamics, SciDyn BubbleLINK, http://www.scidyn.com/products/bubble.html (archived by web.archive.org on Jun. 18, 2006).
Science Dynamics, SciDyn Call Control Solutions: Commander II, http://www.scidyn.com/products/commander2.html (archived by web.archive.org on Jun. 18, 2006).
Science Dynamics, SciDyn IP Gateways, http://scidyn.com/products/ipgateways.html (archived by web.archive.org on Aug. 15, 2001).
Science Dynamics, Science Dynamics—IP Telephony, http://www.scidyn.com/iptelephony_main.htm (archived by web.archive.org on Oct. 12, 2000).
Shearme, et al., “An Experiment Concerning the Recognition of Voices,” Language and Speech, vol. 2, No. 3, Jul./Sep. 1959; 10 pages.
Silberg, L., “Digital on Call,” HFN The Weekly Newspaper for the Home Furnishing Network, Mar. 17, 1997; 4 pages.
Silberschatz, et al., Operating System Concepts, Third Edition, Addison-Wesley: Reading, MA, Sep. 1991; 700 pages.
Simmons, R., “Why 2007 is Not Like 1984: A Broader Perspective on Technology's Effect on Privacy and Fourth Amendment Jurisprudence,” J. Crim. L. & Criminology vol. 97, No. 2, Winter 2007; 39 pages.
Smith, M., “Corrections Turns Over a New LEAF: Correctional Agencies Receive Assistance From the Law Enforcement Analysis Facility,” Corrections Today, Oct. 1, 2001; 4 pages.
Specification of U.S. Appl. No. 10/720,848, “Information Management and Movement System and Method,” to Viola, et al., filed Nov. 24, 2003. (Abandoned).
State of North Carolina Department of Correction RFP #ITS-000938A, issued May 25, 2004; 8 pages.
Statement for the Record of John S. Pistole, Assistant Director, Counterterrorism Division, Federal Bureau of Investigation, Before the Senate Judiciary Committee, Subcommittee on Terrorism, Technology, and Homeland Security, Oct. 14, 2003.
Sundstrom, K., “Voice over IP: An Engineering Analysis,” Master's Thesis, Department of Electrical and Computer Engineering, University of Manitoba, Sep. 1999; 140 pages.
Supplementary European Search Report for EP Application No. EP 04 80 9530, Munich, Germany, completed on Mar. 25, 2009.
Tanenbaum, A., Modern Operating Systems, Third Edition, Pearson Prentice Hall: London, 2009; 552 pages.
Tirkel, A., et al., “Image Watermarking—A Spread Spectrum Application,” Sep. 22-25, 1996; 7 pages.
U.S. Appl. No. 60/607,447, “IP-based telephony system and method,” to Apple, et al., filed Sep. 3, 2004.
Viswanathan, et al., “Multimedia Document Retrieval using Speech and Speaker Recognition,” International Journal on Document Analysis and Recognition, Jun. 2000, vol. 2; pp. 1-24.
Walden, R., “Performance Trends for Analog-to-Digital Converters,” IEEE Communications Magazine, Feb. 1999.
Weinstein, C., MIT, The Experimental Integrated Switched Network—A System-Level Network Test Facility, IEEE 1983; 8 pages.
Wilkinson, Reginald A., “Visiting in Prison,” Prison and Jail Administration's Practices and Theory, 1999; 7 pages.
Winterdyk et al., “Managing Prison Gangs,” Journal of Criminal Justice, vol. 38, 2010; pp. 730-736.
Zajic, et al., “A Cohort Methods for Score Normalization in Speaker Verification System, Acceleration of On-Line Cohort Methods,” Proceedings of the 12th International Conference “Speech and Computer,” Oct. 15-18, 2007; 6 pages.
Related Publications (1)
  Number          Date      Country
  20230231948 A1  Jul 2023  US
Provisional Applications (1)
  Number    Date      Country
  62508106  May 2017  US
Continuations (4)
  Number           Date      Country
  Parent 17349572  Jun 2021  US
  Child 18098858             US
  Parent 16827403  Mar 2020  US
  Child 17349572             US
  Parent 16282886  Feb 2019  US
  Child 16827403             US
  Parent 15611598  Jun 2017  US
  Child 16282886             US