System and method for providing channel configurations in a communications environment

Information

  • Patent Grant
  • Patent Number
    8,831,664
  • Date Filed
    Tuesday, August 16, 2011
  • Date Issued
    Tuesday, September 9, 2014
Abstract
A method is provided in one example embodiment and includes monitoring media streams associated with channels in a communication environment. The method also includes receiving an alert in one of the media streams and elevating a priority associated with the media stream receiving the alert. Elevating the priority can include increasing the volume of the media stream receiving the alert, changing a spatial direction for rendering the media stream receiving the alert, rendering the media stream receiving the alert to a priority spatial direction, identifying a primary channel and rendering the alert message to the primary channel, turning a volume down on media streams not receiving the alert, or a combination thereof.
Description
TECHNICAL FIELD OF THE INVENTION

This invention relates in general to the field of communications and, more particularly, to a system and a method for providing channel configurations in a communications environment.


BACKGROUND OF THE INVENTION

Communications architectures have grown increasingly complex in communication environments. This complexity has resulted in numerous protocols being implemented to satisfy endpoints and to intelligently respond to challenging situations. Push-to-talk (PTT) devices are commonly used in safety and security environments. PTT devices carried by emergency response personnel (ERT) can support a single channel or a plurality of channels.


Typically, a management center supports a plurality of channels, where less important channels are rendered to one ear, and channels that are more important are rendered to the other ear. Generally, channels are statically configured, and the volume of each channel can be manually adjusted by an end user. When important information needs to be broadcast, channels should be configured optimally. If the channels are not configured and utilized effectively, messages may not reach their intended recipients.


Thus, the ability to offer a system or a protocol that provides effective coordination of channels in a communications environment presents a significant challenge to network designers, component manufacturers, service providers, and system administrators alike.





BRIEF DESCRIPTION OF THE DRAWINGS

To provide a more complete understanding of the present invention and features and advantages thereof, reference is made to the following description, taken in conjunction with the accompanying figures, wherein like reference numerals represent like parts, in which:



FIG. 1 is a simplified block diagram of a communication system for providing channel configurations in a communications environment in accordance with one embodiment of the present invention;



FIG. 2A is a simplified schematic diagram of a management center that may be part of the communication system in accordance with one embodiment of the present invention;



FIG. 2B is a simplified schematic diagram of an aspect of the management center in accordance with one embodiment of the present invention;



FIG. 3 is a simplified schematic diagram of an element that may be part of the management center in accordance with one embodiment of the present invention;



FIG. 4 is a simplified schematic diagram illustrating another example implementation of the management center; and



FIG. 5 is a simplified flowchart illustrating a scenario implicating the communication system.





DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS

Overview


A method is provided in one example embodiment and includes monitoring a plurality of channels provisioned for an endpoint. The method also includes reacting to an alert message by adjusting one of the channels for the endpoint. The adjustment is a selected one of increasing a volume associated with the channel receiving the alert message, or changing a spatial direction from which the alert message and an associated media stream are conveyed to the endpoint. In other embodiments, the adjustment moves the alert message from a left ear to a right ear of an end user of the endpoint. In still other embodiments, the adjustment includes determining which of the channels is a primary channel and provisioning the alert message to the primary channel of the endpoint. The media stream associated with the alert message is also moved to the same spatial direction as the alert message.


In still other embodiments, the adjustment includes turning a volume down on every other channel besides a channel being used for the alert message. In yet other embodiments, the adjustment includes using out-of-band signaling. In other embodiments, the method includes sending the alert message until feedback has been received, which indicates that the alert message has been acknowledged by the endpoint.
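By way of a non-limiting illustration only, the following Python sketch models the adjustments enumerated above. The Channel class, the PRIORITY_SPATIAL constant, the elevate_priority function, and the gain values are hypothetical names and parameters chosen for this example; they do not appear in any claimed embodiment.

    from dataclasses import dataclass

    @dataclass
    class Channel:
        name: str
        volume: float    # 0.0 (muted) .. 1.0 (full)
        spatial: str     # "left", "right", or "stereo"

    PRIORITY_SPATIAL = "right"   # assumed priority rendering direction

    def elevate_priority(channels, alerted_name,
                         boost_to=1.0, dim_others_to=0.2):
        """Apply the adjustments described in the overview to the channel
        that received an alert: raise its volume, move it to the priority
        spatial direction, and turn the remaining channels down."""
        for ch in channels:
            if ch.name == alerted_name:
                ch.volume = boost_to            # increase volume
                ch.spatial = PRIORITY_SPATIAL   # change arrival direction
            else:
                ch.volume = min(ch.volume, dim_others_to)  # dim the others
        return channels

    if __name__ == "__main__":
        chans = [Channel("UHF-Dispatch", 0.9, "right"),
                 Channel("VHF-CityWorks", 0.1, "left")]
        for ch in elevate_priority(chans, "VHF-CityWorks"):
            print(ch)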



FIG. 1 is a simplified block diagram of a communication system 10 for coordinating communications received by one or more endpoints. Communication system 10 includes a server 14, which could include a server administration console, an operational view, and a policy engine, all of which are depicted as element 18. FIG. 1 also includes an IP network 20, which is coupled to a router media service (RMS) element 30 and a universal media service (UMS) element 34. UMS 34 is coupled to a public switched telephone network (PSTN) and can effectively coordinate sessions for a multitude of end-user devices, as is shown. RMS 30 is coupled to a management center 38 and a phone client 36. RMS 30 is also coupled to a series of end-user devices for which it may facilitate the coordination of audio and/or data exchanges in the network.


Note that the end-user devices of FIG. 1, as generally depicted at 40 and 50, provide just some of the many examples that could be used in conjunction with the present invention. In a general sense, elements 40 and 50 are examples of field equipment, while element 24 is an example of end-user equipment, which resides in the control center. These devices (referred to herein as ‘endpoints’) may include VHF radios, UHF radios, PSTN telephones, IP phones, push-to-talk telephones, cellular telephones, laptops, desktop computers, personal digital assistants (PDAs), or any other suitable end-user device capable of exchanging audio or other data in the architecture. Before turning to the operational aspects of the present invention, FIG. 2A is briefly introduced. FIG. 2A is a simplified schematic diagram of a display 60, which can be part of management center 38. FIG. 2A includes one or more groups to which a particular end-user is subscribed. The groups include a DevTest 70, a SysTestTeam 72, a server team 74, a management group 76, an EFT team 78, and a dispatcher 79. Also provided are several pages on the left-hand side of the diagram, which help illustrate some of the end-users and some of the groups for this particular end-user. A tab 80 is also provided and can be used to expand the pages from the left-hand side of the user's screen. FIG. 2A also includes a volume control 62, and each of the channels can have its own volume adjustment.


For purposes of illustrating the techniques of communication system 10, it is important to understand the typical communications that may be traversing the network. The following foundational information may be viewed as a basis from which the present invention may be properly explained. Such information is offered earnestly for purposes of explanation only and, accordingly, should not be construed in any way to limit the broad scope of the present invention and its potential applications.


PTT devices, such as those shown in FIG. 1 (generally at 40 and 50), are commonly used in safety and security environments. PTT devices such as those carried by emergency response personnel (ERT) can support either a single channel or a plurality of channels. Management center 38 is an example of a PTT device that supports a plurality of channels, where important channels are rendered to one ear (e.g., the right) and less important channels are rendered to the other ear (e.g., the left). It should be noted that some systems may provide users with the ability to position media streams at a variety of spatial arrival directions; describing the system by way of example with respect to the left and right ears should not be viewed as limiting it to only these two arrival directions.


A problem occurs when a user turns down the volume of a given channel and, as a result, cannot hear an alert tone or the conversation that follows on that channel. This may result in a user not responding to an important alert signal sent by another user, or by the system. Consider an example involving a police officer, who is on a UHF radio. Management center 38 can listen in on a multitude of channels. For the police officer in this example, audio channels can be configured on his device such that certain types of audio data will be directed to either the left ear or the right ear. For example, one typical application could involve configuring management center 38 to render communication over a UHF radio channel to the right ear of the dispatcher and turning the volume up on that channel. The left ear could be configured to receive passive communications that may not be critical to the police officer's routine activities (e.g., a VHF channel).
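Purely as a hypothetical sketch of such a static configuration (the channel names, ear assignments, and volume values below are illustrative assumptions, not part of the invention), the per-channel rendering targets for this officer might be represented as follows:

    # Hypothetical static configuration for the police-officer example:
    # the active UHF dispatch channel is rendered loudly to the right ear,
    # while the passive VHF channel is rendered quietly to the left ear.
    CHANNEL_CONFIG = {
        "UHF-Dispatch":      {"ear": "right", "volume": 0.9},
        "VHF-NorthDistrict": {"ear": "left",  "volume": 0.2},
    }

    def render_target(channel_name):
        """Return (ear, volume) for a channel, defaulting to quiet stereo."""
        cfg = CHANNEL_CONFIG.get(channel_name, {"ear": "stereo", "volume": 0.3})
        return cfg["ear"], cfg["volume"]

    print(render_target("VHF-NorthDistrict"))   # ('left', 0.2)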


Following along with this example of the police officer, an urgent message is broadcast over the VHF radio. The channel could be called “North District” or “City Works” in this example. Suppose an officer has been shot or has been injured such that he requires assistance. In this instance, the alert came in on the low-priority VHF channel. This alert message may or may not have been heard by other officers and, because the volume on that specific channel may have been turned down, the officer may not hear the communication that follows on the low-priority VHF channel.


In accordance with the techniques and teachings of the present invention, communication system 10 can readily identify this alert message and respond in any number of ways to ensure that the message is properly received. One intelligent response mechanism of the present invention provisions communications from the left ear to the right ear or vice versa. Another intelligent response mechanism simply turns up the volume on the channel that received the alert. Note that these responses could easily be combined into an automatic reconfiguration.


Another intelligent response for this scenario is for management center 38 to identify which channel is the primary channel for one or more end-users. Due to the urgency of this event alert, the system responds by moving the alert message to the primary channel and marginalizing all the other channels for a configurable period. Part of this response also includes turning the volume down on every other channel. Thus, management center 38 can promptly identify an event alert and coordinate a suitable response autonomously, or in conjunction with a server that would assist in provisioning the appropriate channels.
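A minimal sketch of this primary-channel response is shown below, assuming a simple dictionary representation of each channel and a timer-based restoration after the configurable period; the function name route_alert_to_primary and the specific volume levels are illustrative assumptions only.

    import copy
    import threading

    def route_alert_to_primary(channels, restore_after_s=30.0):
        """Hypothetical sketch: raise the primary channel, turn the volume
        down on every other channel, and restore the previous settings
        after a configurable period (restore_after_s)."""
        snapshot = copy.deepcopy(channels)               # remember prior state
        for ch in channels:
            ch["volume"] = 1.0 if ch["primary"] else 0.1 # marginalize the rest

        def restore():
            for ch, old in zip(channels, snapshot):
                ch["volume"] = old["volume"]

        timer = threading.Timer(restore_after_s, restore)
        timer.start()
        return timer                                     # caller may cancel

    channels = [{"name": "Dispatch",  "primary": True,  "volume": 0.6},
                {"name": "CityWorks", "primary": False, "volume": 0.6}]
    route_alert_to_primary(channels, restore_after_s=5.0)
    print(channels)   # primary boosted, others marginalized until restore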


In other embodiments, a resource management server (capable of processing media services) can coordinate an effective response that includes either an audio tone or a message being sent out-of-band. This server could be UMS 34 or RMS 30. For the out-of-band signaling, the identification of the alert is provided in the header of a packet, in the payload of the packet, in control packets, or in any other suitable location. Furthermore, this out-of-band signaling is sent to management center 38, which can assist in processing this data. Management center 38 promptly identifies the information included in these packets. In the case of out-of-band signaling, if an end user has stepped away (for whatever reason), the actions taken by the system in response to the alert, such as the specific channel having been moved, can signify to him what occurred while he was away.
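The exact packet layout is not prescribed here; purely for illustration, the following sketch assumes a hypothetical out-of-band control packet whose header carries an alert flag and a channel identifier, followed by an optional text payload. The MAGIC value and field sizes are arbitrary assumptions made for this example.

    import struct

    # Hypothetical out-of-band packet layout (not defined by this description):
    #   2 bytes magic, 1 byte flags (bit 0 = alert), 1 byte channel id,
    #   then an optional UTF-8 text payload describing the event.
    MAGIC = 0xA15E
    HEADER = struct.Struct("!HBB")

    def build_alert(channel_id, text=""):
        return HEADER.pack(MAGIC, 0x01, channel_id) + text.encode("utf-8")

    def parse_packet(data):
        """Return (is_alert, channel_id, message) for a control packet."""
        magic, flags, channel_id = HEADER.unpack_from(data)
        if magic != MAGIC:
            raise ValueError("not an out-of-band control packet")
        message = data[HEADER.size:].decode("utf-8", errors="replace")
        return bool(flags & 0x01), channel_id, message

    pkt = build_alert(7, "Officer needs assistance")
    print(parse_packet(pkt))   # (True, 7, 'Officer needs assistance')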


As can be appreciated by the foregoing description and the discussions herein, the coordination of the responses for these alert messages can occur at the edge of the network, at a server, or be performed as a hybrid of these functions. The server is predominantly part of the network in the context of some of these discussions. Thus, an alert event can be captured and subsequent responses properly coordinated by management center 38, server 14, RMS 30, or UMS 34. All of these elements can have the intelligence to dictate an effective response mechanism, as prescribed herein.


Before turning to some of the operations of this architecture, a brief discussion is provided about some of the infrastructure of FIG. 1. Software (which manages the operations of the system in accordance with some embodiments of the invention) can reside in the network. By “reside in the network,” such terminology is meant to connote that the software can be hosted on any suitable device or component, for example, in management center 38, server 14, RMS 30, or UMS 34. These elements may include any suitable combination of software or hardware to execute the response mechanisms and/or perform the coordination of channels as outlined herein. All of these elements may be referred to as ‘management elements’ as used herein in this Specification, which could be routers, switches, gateways, bridges, load balancers, firewalls, servers, or any other suitable device, component, element, or object operable to assist in these activities. Moreover, these management elements may include any suitable hardware, software, components, modules, interfaces, or objects that facilitate the operations thereof. This may be inclusive of appropriate algorithms and communication protocols that allow for the effective exchange of data or information in the architecture.


Each of these management elements can also include memory elements for storing information to be used in achieving the channel management and coordination operations as outlined herein. Additionally, each of these devices may include a processor that can execute software or an algorithm to perform the management activities as discussed in this specification. Memory elements and processors (which facilitate these outlined operations) may be included in these management elements or provided externally to these elements, or consolidated in any suitable fashion. The processors can readily execute code (software) for effectuating the activities described. These devices may further keep information in any suitable random access memory (RAM), read only memory (ROM), erasable programmable ROM (EPROM), electronically erasable PROM (EEPROM), application specific integrated circuit (ASIC), software, hardware, or in any other suitable component, device, element, or object where appropriate and based on particular needs.


Note also that the specific coordination and response mechanisms may be provided external to any of the management elements, as opposed to provided internally. In addition, some, all, or none of these operations may be suitably combined across these elements, or provided in just one element to accomplish the operations as outlined herein in this document.


Turning to a particular component that may be entrusted with some of these response obligations, the UMS is the component that receives the audio streams in the architecture in certain embodiments of the present invention. It is not necessarily the only component that receives this audio information. The UMS can also receive out-of-band messages, and it too can have information regarding who is registered and who is listening to messages in the architecture. The UMS can simply re-stream media that it receives in certain embodiments. Because the UMS has information regarding who is listening on given channels, it too can take the initiative in provisioning channels for endpoint devices. It has the intelligence to understand which streams should be given priority over other streams. Thus, the UMS can shift the streams, or it can elect which streams should propagate to end-user devices and at which volume.
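One possible way to model this UMS-style stream selection, offered only as an illustrative sketch with hypothetical names (plan_restreaming and the listener and priority tables), is shown below:

    def plan_restreaming(listeners, priorities, max_streams=2):
        """Hypothetical sketch: for each registered user, pick the
        highest-priority channels he is listening to (up to max_streams)
        and assign a gain that falls off with priority rank."""
        plan = {}
        for user, channels in listeners.items():
            ranked = sorted(channels, key=lambda c: priorities.get(c, 0),
                            reverse=True)[:max_streams]
            plan[user] = {ch: 1.0 / (rank + 1)           # 1.0, 0.5, ...
                          for rank, ch in enumerate(ranked)}
        return plan

    listeners  = {"dispatcher": ["UHF", "VHF", "Admin"],
                  "officer-12": ["UHF"]}
    priorities = {"UHF": 10, "VHF": 5, "Admin": 1}
    print(plan_restreaming(listeners, priorities))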


In typical scenarios, users may be listening to multiple channels simultaneously. For a user to be able to listen to multiple channels, the media streams of the channels need to be mixed in a two-dimensional mixer. A two-dimensional mixer is a mixer that facilitates assigning media streams to the left and/or right ear. This mixer can also control the relative volume of each media stream played to the right and left ear. In accordance with one example implementation, multiple media streams arrive at management center 38 and the mixing of these media streams takes place in management center 38.
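A minimal, non-limiting sketch of such a two-dimensional mixer is shown below; it assumes mono sample lists in the range -1.0 to 1.0 and per-channel (left, right) gain pairs, which are illustrative conventions rather than requirements of the architecture.

    def mix_two_dimensional(frames, gains):
        """Minimal two-dimensional mixer sketch: 'frames' maps a channel
        name to a list of mono samples (floats in -1.0..1.0); 'gains'
        maps the same name to (left_gain, right_gain). Returns a list of
        (left, right) sample pairs, clipped to the valid range."""
        length = max(len(samples) for samples in frames.values())
        clip = lambda x: max(-1.0, min(1.0, x))
        mixed = []
        for i in range(length):
            left = right = 0.0
            for name, samples in frames.items():
                s = samples[i] if i < len(samples) else 0.0
                gl, gr = gains[name]
                left += s * gl
                right += s * gr
            mixed.append((clip(left), clip(right)))
        return mixed

    frames = {"UHF": [0.5, 0.5, 0.5], "VHF": [0.2, 0.2, 0.2]}
    gains  = {"UHF": (0.0, 1.0),      # right ear only, full volume
              "VHF": (0.3, 0.0)}      # left ear only, quiet
    print(mix_two_dimensional(frames, gains))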


In accordance with another example aspect of the present architecture, multiple media streams arrive at UMS 34 or RMS 30 and are mixed there. The resulting two media streams (per user) are then streamed from the RMS/UMS to management center 38. Such an architecture has the advantage that the bandwidth needed between management center 38 and IP network 20 only has to accommodate two media streams. A hybrid architecture, in which some mixing takes place on the UMS/RMS and some at management center 38, is also covered by our invention. In this sense, the provisioning is being done externally, and the individual endpoints do not have to be capable of managing these alert messages.


Note that in previous embodiments, management center 38 would perform a certain amount of mixing functions for the endpoints. For example, ten different media streams could be propagating down to management center 38, which is relegated to the task of managing the streams from multiple endpoints. Management center 38 assigns each of those streams to the left ear, the right ear, or both. By properly managing the volume of the mixed channels, any channel can be configured to appear to the user from any designated arrival direction. Furthermore, each channel receives its stream at a prescribed volume.


This mixing functionality (which could be provided in software or in hardware) could effectively be moved to the UMS such that these activities are performed by one or more alternative components, and not necessarily performed by management center 38. In this sense, this functionality has been moved into the network (potentially residing on a server that includes a processor and a memory element for storing code to be executed by the processor) and this embodiment is certainly within the broad scope of the present invention.


In other scenarios, management center 38 monitors all of the received media streams, where it can detect that media should be rendered to an active channel whose volume was set below a given threshold, and then determine that the alert may have to be duplicated or a volume adjustment should be made. One reaction to this scenario would be to increase the volume of the channel where the tone is played and relatively decrease the volume of chatter on other channels. Please note that the teachings of the present invention do not solely accommodate alert tones. Such an architecture can also handle events where the priority of a given channel is elevated by the dispatcher, or where a dispatcher adds a high-priority virtual talk group (VTG) to the channel. Note that the priority of the channel is elevated not only for the alert tone, but also for the media stream that follows. This is done by increasing the volume of the channel and by moving it to the priority direction (e.g., the right ear).
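A simplified sketch of this threshold-based reaction, using an assumed threshold value and dictionary-based channel records (illustrative choices only), might look as follows:

    VOLUME_THRESHOLD = 0.3    # assumed threshold below which media on an
                              # active channel triggers a reaction

    def on_media_detected(channels, active_name,
                          threshold=VOLUME_THRESHOLD,
                          priority_direction="right"):
        """If media arrives on a channel whose volume is set below the
        threshold, elevate that channel (volume and spatial direction)
        and relatively decrease the chatter on the other channels. The
        elevation applies to the tone and the media that follows."""
        active = next(ch for ch in channels if ch["name"] == active_name)
        if active["volume"] >= threshold:
            return channels                       # nothing to do
        active["volume"] = 1.0
        active["spatial"] = priority_direction
        for ch in channels:
            if ch is not active:
                ch["volume"] *= 0.5               # relative decrease
        return channels

    channels = [{"name": "VHF", "volume": 0.1, "spatial": "left"},
                {"name": "UHF", "volume": 0.8, "spatial": "right"}]
    print(on_media_detected(channels, "VHF"))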


In still other scenarios, the present architecture can continuously signal the alert until feedback has been received that indicates that the alert has been acknowledged. In other embodiments, the alert could be continuously increased in volume and/or duration until properly acknowledged by an endpoint (either vocally, through software, through out-of-band signaling, etc.).
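The following sketch illustrates one possible realization of this acknowledge-or-repeat behavior; the callback names, the escalation step, and the attempt budget are hypothetical parameters introduced solely for this example.

    import time

    def signal_until_acknowledged(play_alert, is_acknowledged,
                                  start_volume=0.5, step=0.1,
                                  max_volume=1.0, interval_s=2.0,
                                  max_attempts=10):
        """Hypothetical sketch: replay the alert, escalating its volume,
        until the endpoint acknowledges it (vocally, via software, or
        through out-of-band signaling) or the attempt budget runs out."""
        volume = start_volume
        for attempt in range(max_attempts):
            play_alert(volume)
            if is_acknowledged():
                return True
            volume = min(max_volume, volume + step)   # escalate
            time.sleep(interval_s)
        return False

    # Toy usage: acknowledge after the third replay.
    state = {"count": 0}
    acked = lambda: state["count"] >= 3
    def play(vol):
        state["count"] += 1
        print(f"alert #{state['count']} at volume {vol:.1f}")
    signal_until_acknowledged(play, acked, interval_s=0.0)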


Turning now to some of the potential infrastructure of example embodiments of the present architecture, FIG. 2B is a simplified schematic diagram of the dispatcher channel controls and includes volume control 62. Various components of the architecture can control volume parameters on individual channels. Thus, channel controls can be independent of one another. Note also that management center 38 can be implemented in either software or hardware. Some of the FIGURES show management center 38 being implemented in software and, therefore, the buttons in FIGS. 2A and 2B are soft buttons. The present invention covers implementations of the system in hardware, software or any hybrid implementation of these elements.



FIG. 3 is a simplified schematic diagram of a settings element within the management center. The settings include a status 82, a series of groups 90, and a spatial position 92. The groups 90 are depicted having certain positions, which may signify their respective priorities. The configuration of a specific position for any given communication group controls the location of this specific channel on the management center screen, as shown in FIG. 2A. In this example, the management team is relegated to position number five, whereas the DevTest group, which was configured to position 01, has the highest priority and, therefore, is positioned at the top left of the management center screen. These groups may be configured by an end-user, a manager, or by an administrator of the system based on particular needs. Spatial position 92 includes a drop-down menu in which the user may select stereo, left, right, or another configuration for channels to be heard by an end-user. Also provided in FIG. 3 is a skin tab, which can be selected in order to change the appearance of management center 38.
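Purely as an illustrative sketch, such settings could be represented as a mapping from group name to screen position and spatial selection; the values below (other than the group names taken from the figure description) are assumptions made for this example.

    # Hypothetical representation of the settings shown in FIG. 3: each
    # group has a screen position (lower numbers indicate higher priority
    # and appear nearer the top left) and a spatial rendering selection.
    SETTINGS = {
        "DevTest":     {"position": 1, "spatial": "right"},
        "SysTestTeam": {"position": 2, "spatial": "stereo"},
        "Management":  {"position": 5, "spatial": "left"},
    }

    def by_priority(settings):
        """Return group names ordered by configured position (priority)."""
        return sorted(settings, key=lambda g: settings[g]["position"])

    print(by_priority(SETTINGS))   # ['DevTest', 'SysTestTeam', 'Management']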



FIG. 4 is a simplified schematic diagram of an example management center 96. In this example case, the layout has changed and a number of important groups are highlighted for this particular end-user, who will receive alert messages intended for the groups to which he is subscribed (or configured). It is important to note that any suitable configuration or selection of end-user groups may be accommodated by the present invention.


Note that with the example provided above, as well as numerous other examples provided herein, interaction may be described in terms of two, three, or four network elements. However, this has been done for purposes of clarity and example only. In certain cases, it may be easier to describe one or more of the functionalities of a given set of flows by only referencing a limited number of network elements. It should be appreciated that communication system 10 (and its teachings) are readily scalable and can accommodate a large number of components, as well as more complicated/sophisticated arrangements and configurations. Accordingly, the examples provided should not limit the scope or inhibit the broad teachings of communication system 10 as potentially applied to a myriad of other architectures.


It is also important to note that the steps in the preceding flows illustrate only some of the possible scenarios that may be executed by, or within, communication system 10. Some of these steps may be deleted or removed where appropriate, or these steps may be modified or changed considerably without departing from the scope of the present invention. In addition, a number of these operations have been described as being executed concurrently with, or in parallel to, one or more additional operations. However, the timing of these operations may be altered considerably. The preceding operational flows have been offered for purposes of example and discussion. Substantial flexibility is provided by communication system 10 in that any suitable arrangements, chronologies, configurations, and timing mechanisms may be provided without departing from the teachings of the present invention.


Although the present invention has been described in detail with reference to particular arrangements and configurations, these example configurations and arrangements may be changed significantly without departing from the scope of the present invention. For example, although the present invention has been described with reference to particular communication exchanges involving channels, communication system 10 can also manage alerts and events on other pathways. Note also that the term ‘alert message’ as used herein in this Specification is meant to connote any type of message or signal that was intended for one or more end users. The message may inform the end users or apprise them of some situation, or alternatively simply convey some information from a sender. This may include simple conversational exchanges and/or simple correspondence between two end users operating their endpoints.


The present architecture can also resolve any possible conflict between one or more alerts. This may involve a simple priority designation such that one alert is sounded or sent over (or in place of) another, or it may involve delay mechanisms, where one alert is sent along at a later time. Additionally, although described with reference to emergency, police, and fire fighter type applications, the present invention can certainly be used in operational environments where there are communication flows propagating amongst users within a given group. These communication environments could involve non-emergency scenarios.



FIG. 5 is a simplified flowchart illustrating an example scenario involving communication system 10. The flow begins at step 100, where a channel of an end-user device is set at a low priority and at a low volume. In this example, the end user is part of some group for which alert messages are sent. The name of this group is Eastern District Fire in this hypothetical example. At step 102, the system checks to identify if an alert indication has been received.


If the answer to this query is NO, then the process circles back to the decision block of the flow. If, however, the answer to this query is YES, then the process moves to step 104, where the volume of the channel is increased and/or the system moves the channel to the high-priority spatial position.


At step 106, the volume of other channels is reduced in this example. Note also that the system is designed to check to see if multiple alerts were received at the same time. When this occurs, the system may delay one of the alert tones (but not the other actions, e.g., the volume increase and the move to the high-priority spatial position) to ensure that the user is made aware that multiple alerts for multiple events were sent. Any suitable combination of these response reactions is certainly within the broad scope of the present invention.
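The flow of FIG. 5, together with the delayed handling of simultaneous alerts described above, can be sketched as follows; the function handle_alerts, the gain values, and the dictionary channel records are illustrative assumptions rather than a definitive implementation.

    def handle_alerts(channels, alerts, priority_direction="right"):
        """Sketch of the FIG. 5 flow: for an alert on a low-priority,
        low-volume channel, raise that channel's volume, move it to the
        high-priority spatial position, and reduce the volume of the
        other channels. If multiple alerts arrive at the same time, the
        tones of all but the first are deferred (returned for later
        replay) so the user notices that several events occurred."""
        if not alerts:
            return []                       # step 102: no alert received
        deferred_tones = alerts[1:]         # delay extra tones, not actions
        for name in alerts:                 # step 104: elevate each channel
            ch = next(c for c in channels if c["name"] == name)
            ch["volume"] = 1.0
            ch["spatial"] = priority_direction
        for ch in channels:                 # step 106: dim the others
            if ch["name"] not in alerts:
                ch["volume"] = min(ch["volume"], 0.2)
        return deferred_tones

    channels = [{"name": "EasternDistrictFire", "volume": 0.1, "spatial": "left"},
                {"name": "Dispatch", "volume": 0.8, "spatial": "right"}]
    print(handle_alerts(channels, ["EasternDistrictFire"]), channels)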


Numerous other changes, substitutions, variations, alterations, and modifications may be ascertained to one skilled in the art and it is intended that the present invention encompass all such changes, substitutions, variations, alterations, and modifications as falling within the scope of the appended claims. In order to assist the United States Patent and Trademark Office (USPTO) and, additionally, any readers of any patent issued on this application in interpreting the claims appended hereto, Applicant wishes to note that the Applicant: (a) does not intend any of the appended claims to invoke paragraph six (6) of 35 U.S.C. section 112 as it exists on the date of the filing hereof unless the words “means for” or “step for” are specifically used in the particular claims; and (b) does not intend, by any statement in the specification, to limit this invention in any way that is not otherwise reflected in the appended claims.

Claims
  • 1. A method, comprising: monitoring media streams associated with channels in a communication environment; receiving an alert in one of the media streams; and elevating a priority associated with the media stream receiving the alert, such that an adjustment is made to how the media stream is provisioned to an endpoint in the communication environment.
  • 2. The method of claim 1, wherein elevating the priority comprises increasing the volume of the media stream receiving the alert.
  • 3. The method of claim 1, wherein elevating the priority comprises changing a spatial direction for rendering the media stream receiving the alert.
  • 4. The method of claim 1, wherein elevating the priority comprises increasing the volume of the media stream receiving the alert and changing a spatial direction for rendering the media stream receiving the alert.
  • 5. The method of claim 1, wherein elevating the priority comprises rendering the media stream receiving the alert to a priority spatial direction.
  • 6. The method of claim 1, wherein elevating the priority includes identifying a primary channel and rendering the alert message to the primary channel.
  • 7. The method of claim 1, wherein elevating the priority includes turning a volume down on media streams not receiving the alert.
  • 8. The method of claim 1, wherein the alert is received in a header of a packet of the media stream.
  • 9. The method of claim 1, wherein the alert is received in a control packet of the media stream.
  • 10. The method of claim 1, wherein the alert is duplicated and rendered at a higher volume.
  • 11. Non-transitory readable media that includes code for execution and when executed by a processor operable to perform operations comprising: monitoring media streams associated with channels in a communication environment; receiving an alert in one of the media streams; and elevating a priority associated with the media stream receiving the alert.
  • 12. The non-transitory readable media of claim 11, wherein elevating the priority comprises increasing the volume of the media stream receiving the alert.
  • 13. The non-transitory readable media of claim 11, wherein elevating the priority comprises changing a spatial direction for rendering the media stream receiving the alert.
  • 14. The non-transitory readable media of claim 11, wherein elevating the priority comprises increasing the volume of the media stream receiving the alert and changing a spatial direction for rendering the media stream receiving the alert.
  • 15. The non-transitory readable media of claim 11, wherein elevating the priority includes identifying a primary channel and rendering the alert message to the primary channel.
  • 16. An apparatus, comprising: a memory element; a processor; and a management element stored in the memory element and configured to interface with the processor such that the apparatus is configured for: monitoring media streams associated with channels in a communication environment; receiving an alert in one of the media streams; and elevating a priority associated with the media stream receiving the alert.
  • 17. The apparatus of claim 16, wherein elevating the priority comprises increasing the volume of the media stream receiving the alert.
  • 18. The apparatus of claim 16, wherein elevating the priority comprises changing a spatial direction for rendering the media stream receiving the alert.
  • 19. The apparatus of claim 16, wherein elevating the priority comprises increasing the volume of the media stream receiving the alert and changing a spatial direction for rendering the media stream receiving the alert.
  • 20. The apparatus of claim 16, wherein elevating the priority includes identifying a primary channel and rendering the alert message to the primary channel.
RELATED APPLICATION

This Application is a continuation (and claims the benefit of priority under 35 U.S.C. §120) of U.S. application Ser. No. 12/340,417, filed Dec. 19, 2008, entitled “SYSTEM AND METHOD FOR PROVIDING CHANNEL CONFIGURATIONS IN A COMMUNICATIONS ENVIRONMENT,” Inventor(s) Shmuel Shaffer, et al. The disclosure of the prior application is considered part of (and is incorporated by reference in) the disclosure of this application.

US Referenced Citations (132)
Number Name Date Kind
4359603 Heaton Nov 1982 A
4730306 Uchida Mar 1988 A
5048082 Krafft et al. Sep 1991 A
5099510 Blinken et al. Mar 1992 A
5436896 Anderson et al. Jul 1995 A
5539741 Barraclough et al. Jul 1996 A
5539811 Nakamura et al. Jul 1996 A
5625407 Biggs et al. Apr 1997 A
5689245 Noreen et al. Nov 1997 A
5889769 Mizuno Mar 1999 A
6006848 Grube et al. Dec 1999 A
6011841 Isono Jan 2000 A
6011851 O'Connor et al. Jan 2000 A
6094578 Purcell et al. Jul 2000 A
6094579 Olvera-Hernandez et al. Jul 2000 A
6151309 Busuioc et al. Nov 2000 A
6178237 Horn Jan 2001 B1
6185205 Sharrit et al. Feb 2001 B1
6233315 Reformato et al. May 2001 B1
6327567 Willehadson et al. Dec 2001 B1
6374100 Smith et al. Apr 2002 B1
6385461 Raith May 2002 B1
6400816 Hjalmtysson et al. Jun 2002 B1
6404873 Beyda et al. Jun 2002 B1
6408327 McClennon et al. Jun 2002 B1
6418214 Smythe et al. Jul 2002 B1
6421002 Krasner Jul 2002 B2
6453022 Weinman, Jr. Sep 2002 B1
6501739 Cohen Dec 2002 B1
6501740 Sun et al. Dec 2002 B1
6608820 Bradshaw, Jr. Aug 2003 B1
6708030 Horikawa Mar 2004 B1
6721284 Mottishaw et al. Apr 2004 B1
6792092 Michalewicz Sep 2004 B1
6813250 Fine et al. Nov 2004 B1
6850496 Knappe et al. Feb 2005 B1
6873854 Crockett et al. Mar 2005 B2
6882856 Alterman et al. Apr 2005 B1
6885874 Grube et al. Apr 2005 B2
6912389 Bright et al. Jun 2005 B2
6944137 Pan et al. Sep 2005 B1
6965767 Maggenti et al. Nov 2005 B2
6982961 Refai et al. Jan 2006 B2
6987480 Kotick et al. Jan 2006 B1
6987841 Byers et al. Jan 2006 B1
6993120 Brown et al. Jan 2006 B2
6996088 Kroon et al. Feb 2006 B1
6996406 Lection et al. Feb 2006 B2
6999782 Shaughnessy et al. Feb 2006 B2
6999783 Toyryla et al. Feb 2006 B2
7003286 Brown et al. Feb 2006 B2
7006607 Garcia Feb 2006 B2
7010106 Gritzer et al. Mar 2006 B2
7010109 Gritzer et al. Mar 2006 B2
7010275 Davies Mar 2006 B2
7013279 Nelson Mar 2006 B1
7031700 Weaver et al. Apr 2006 B1
7034678 Burkley et al. Apr 2006 B2
7035385 Levine et al. Apr 2006 B2
7058168 Knappe et al. Jun 2006 B1
7062286 Grivas et al. Jun 2006 B2
7072952 Takehiro et al. Jul 2006 B2
7079857 Maggenti et al. Jul 2006 B2
7091851 Mason et al. Aug 2006 B2
7149477 Ogami Dec 2006 B2
7237117 Weiss Jun 2007 B2
7271742 Sheha et al. Sep 2007 B2
7292544 Roher et al. Nov 2007 B2
7328036 Hart et al. Feb 2008 B2
7339900 Perlman Mar 2008 B2
7352707 Ho et al. Apr 2008 B2
7369513 Sankaran May 2008 B1
7369530 Keagy May 2008 B2
7379961 Matsuoka May 2008 B2
7418090 Reding et al. Aug 2008 B2
7426192 Amano et al. Sep 2008 B2
7460492 Portolani et al. Dec 2008 B2
7463597 Kompella Dec 2008 B1
7466812 Mahy et al. Dec 2008 B1
7508840 Delaney Mar 2009 B2
7526306 Brems et al. Apr 2009 B2
7558221 Nelson et al. Jul 2009 B2
7606256 Vitebsky et al. Oct 2009 B2
7633914 Shaffer et al. Dec 2009 B2
7636339 Shaffer et al. Dec 2009 B2
7639634 Shaffer et al. Dec 2009 B2
7706339 Shaffer et al. Apr 2010 B2
7710983 Zheng et al. May 2010 B2
7821978 Staack Oct 2010 B2
7831270 Kalley et al. Nov 2010 B2
7860070 Shaffer et al. Dec 2010 B2
7860958 Yoon et al. Dec 2010 B2
7869386 Shaffer et al. Jan 2011 B2
7933286 Vitebsky et al. Apr 2011 B2
7953859 Kiefhaber et al. May 2011 B1
8041378 Shaffer et al. Oct 2011 B2
8619586 Hanes et al. Dec 2013 B2
20020152305 Jackson et al. Oct 2002 A1
20030055711 Doherty Mar 2003 A1
20040139320 Shinohara Jul 2004 A1
20040249949 Gourraud et al. Dec 2004 A1
20050015444 Rambo Jan 2005 A1
20050171954 Hull et al. Aug 2005 A1
20050232207 Antoniadis et al. Oct 2005 A1
20060114847 Dssouli et al. Jun 2006 A1
20060118636 Miles et al. Jun 2006 A1
20060165060 Dua Jul 2006 A1
20060281471 Shaffer et al. Dec 2006 A1
20070030144 Titus et al. Feb 2007 A1
20070060144 Mills et al. Mar 2007 A1
20070104121 Shaffer et al. May 2007 A1
20070115848 Chean et al. May 2007 A1
20070202907 Shaffer et al. Aug 2007 A1
20070202908 Shaffer et al. Aug 2007 A1
20070203996 Davitz et al. Aug 2007 A1
20070239824 Shaffer et al. Oct 2007 A1
20070266097 Harik et al. Nov 2007 A1
20070271336 Ramaswamy Nov 2007 A1
20070280203 Shaffer et al. Dec 2007 A1
20070282621 Altman et al. Dec 2007 A1
20070293240 Drennan Dec 2007 A1
20080005249 Hart Jan 2008 A1
20080037461 Biltz et al. Feb 2008 A1
20080045236 Nahon et al. Feb 2008 A1
20080159128 Shaffer et al. Jul 2008 A1
20080167049 Karr et al. Jul 2008 A1
20080280637 Shaffer et al. Nov 2008 A1
20090144377 Kim et al. Jun 2009 A1
20100159975 Shaffer Jun 2010 A1
20100159977 Shaffer Jun 2010 A1
20100161727 Shaffer Jun 2010 A1
20110239252 Kazama et al. Sep 2011 A1
Foreign Referenced Citations (14)
Number Date Country
101461152 Jun 2009 CN
0849964 Jun 1998 EP
1622044 Feb 2006 EP
1920568 Aug 2006 EP
1889495 Dec 2006 EP
1920615 Feb 2007 EP
2030338 Dec 2007 EP
2674355 Sep 1992 FR
WO0191485 Nov 2001 WO
WO02074051 Sep 2002 WO
WO2006135533 Dec 2006 WO
WO2007021586 Feb 2007 WO
WO2007027356 Mar 2007 WO
WO2007142961 Dec 2007 WO
Non-Patent Literature Citations (41)
Entry
U.S. Appl. No. 12/722,389, entitled “System and Method for Providing Data Channel Management in a Network Environment,” filed Mar. 11, 2010; Inventors: Shmuel Shaffer, et al.
PCT Feb. 10, 2001 International Search Report from PCT/EP01/04570; 1 page.
PCT Feb. 27, 2007 Notification of Transmittal of the International Search Report for PCT/US06/30447; 2 pages.
PCT Apr. 9, 2007 Notification of Transmittal of the International Search Report from PCT/US06/30294; 1 page.
PCT Sep. 4, 2007 International Search Report from PCT/US02/08419; 2 pages.
PCT Sep. 4, 2007 International Search Report and Written Opinion from PCT/US06/19227; 7 pages.
PCT Dec. 11, 2007 International Preliminary Report on Patentability ( 1 page) and the Written Opinion of the International Searching Authority (3 pages), for PCT/US06/19227.
PCT Feb. 12, 2008 International Preliminary Report on Patentability ( 1 page) and the Written Opinion of the International Searching Authority (14 pages), from PCT/US06/30447.
PCT Mar. 14, 2008 International Preliminary Report on Patentability (1 page) and the Written Opinion of the International Searching Authority or the Declaration (4 pages) for PCT/US06/30294.
PCT Sep. 17, 2008 Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority for PCT/US07/12728; 6 pages.
PCT Dec. 3, 2008 International Preliminary Report on Patentability (1 page) and the Written Opinion of the International Searching Authority (5 pages) from PCT/US07/12728.
USPTO Jun. 27, 2011 Notice of Allowance from U.S. Appl. No. 12/340,417.
USPTO Jul. 11, 2011 Nonfinal Office Action from U.S. Appl. No. 12/340,451.
USPTO Jan. 17, 2012 Response to Oct. 14, 2011 Nonfinal Office Action from U.S. Appl. No. 12/340,468.
USPTO Sep. 20, 2012 Non-Final Office Action from U.S. Appl. No. 12/722,389.
USPTO Oct. 4, 2013 Non-Final Rejection from U.S. Appl. No. 12/340,468.
USPTO Apr. 5, 2013 Notice of Allowance from U.S. Appl. No. 12/722,389.
Herther, Jay, et al., “Voice-Over-interintranet Protocol for Critical Communications,” Urgent Communications, The Official Publication of IWCE, Aug. 2001, 3 pages http://urgentcomm.com/mag/radio—voiceoverintranet—protocol—critical/.
Interoperability Gateway, OpenSky, NetworkFirst, P25IP, Data Sheet, M/A-COM, Inc. ECR-7054, [Printed on Jun. 28, 2011] http://www.pspc.harris.com/media/ECR-7054E—tcm27-10470.pdf.
McKay, Jim, “Intact Amid Chaos,” Government Technology, Mar. 2005, 3 pages http://www.bultronic.net/documents/articles/intact%20amid%20chaos.pdf.
Urgent Communications: Service, Safety, Security, “5 for '05 Technologies,” Dec. 1, 2004; 6 pages; http://urgentcomm.com/mag/publicsafety/radio—technologies/.
Urgent Communications: Service, Safety, Security, “M/A-Com Emphasizes Power of IP in Seven Announcents at APCO,” Aug. 18, 2002, 5 pages http://urgentcomm.com/products/new/radio—macom—emphasizes—power/index.html.
NetworkFirst Interoperability Solved, Data Sheet, M/A-COM, Inc., ECR-7018A, [Printed on Jun. 28, 2011] 6 pages; http://www.ka-omminc.com/PDF/NetworkFirstRevised5—03.pdf.
Polk, J., et al., “Dynamic Host Configuration Protocol Option for Coordinate-based Location Configuration Information,” Network Working Group, Internet RFC 3825, Jul. 2004, 15 pages.
Rivero-Angeles, Mario et al., “Random-Access Control Mechanism Using Adaptive Traffic Load in ALOHA and CSMA Strategies for EDGE,” IEEE Transactions on Vehicular Technology, vol. 54, No. 3, May 2005; Abstract Only http://ieeexplore.ieee.org/xpl/freeabs—all.jsp?arnumber=1433259.
Trimeche, M., “Digital Rights Management for Visual Content in Mobile Applications,” published Sep. 27, 2004; ISBN )-7803-8379-6; © 2004 IEEE; http://ieeexplore.ieee.org; pp. 95-98.
VIP Dispatch Console, OpenSky, NetworkFirst, P25IP, Data Sheet, M/A-COM, Inc. ECR-7241C, © 2006 M/A-COM, Inc., 2 pages http://www.bultronic.net/documents/NetworkFirst/VIP%20Dispatch%20Console.pdf.
USPTO Nov. 9, 2011 Notice of Allowance from U.S. Appl. No. 12/340,451.
USPTO Dec. 18, 2012 Response to Sep. 20, 2012 Non-Final Office Action from U.S. Appl. No. 12/722,389.
USPTO Mar. 9, 2012 Final Rejection from U.S. Appl. No. 12/340,468.
USPTO May 24, 2012 RCE Response to Mar. 9, 2012 Final Rejection from U.S. Appl. No. 12/340,468.
USPTO Oct. 11, 2011 Response to Nonfinal Office Action mailed Jul. 11, 2011 from U.S. Appl. No. 12/340,451.
USPTO Oct. 14, 2011 Nonfinal Office Action from U.S. Appl. No. 12/340,468.
Wikipedia, “Minimum spanning tree,” http://en.wikipedia.org/wiki/Minimum—spanning—tree, Dec. 18, 2008, 5 pages.
Wikipedia, “Distributed minimum spanning tree,” http://en.wikipedia.org/wiki/Distributed—minimum—spanning—tree, Dec. 18, 2008, 2 pages.
Bender, Ron, et al., “Multiple Choices for Critical Communications,” Urgent Communications, The Official Publication of IWCE, Oct. 2001, 4 pages; http://urgentcomm.com/mag/radio—multiple—choices—critical—2/.
C3 MaestroIP Dispatch Console, OpenSKY, NetworkFirst, P25IP, Data Sheet, M/A-COM, Inc. ECR-7160A, [Printed on Jun. 28, 2011] 2 pages http://www.pspc.harris.com/media/ECR-7160E—tcm27-10458.pdf.
Cisco IPICS, Solution Overview, “Collaborative Incident Response,” © Cisco Systems, Inc. 2010, 2 pages, www.cisco.com/go/ipics.
Wikipedia, “Plectron,” http://en.wikipedia.org/wiki/Plectron, Dec. 18, 2008, 2 pages.
Thunder Eagle, Inc.—Radio Wireless Alerting Systems, “MRI-100™: Multi Radio Interface,” http://www.thuneagle.com/mri100.htm, Dec. 18, 2008, 2 pages.
Positron Public Safety Systems, “Product Specifications: Power RADIO,” http://www.positron911.com/products/powerRADIO/powerRADIO—specs.asp, Dec. 18, 2008, 2 pages.
Related Publications (1)
Number Date Country
20110299404 A1 Dec 2011 US
Continuations (1)
Number Date Country
Parent 12340417 Dec 2008 US
Child 13210967 US