Session-based device configuration

Information

  • Patent Grant
  • Patent Number
    9,614,724
  • Date Filed
    Monday, April 21, 2014
  • Date Issued
    Tuesday, April 4, 2017
Abstract
Techniques for session-based device configuration are described. According to one or more implementations, various settings of a wireless device are configured to optimize device performance while participating in a communication session via a wireless network. The settings, for instance, are configured dynamically and on a per-session basis.
Description
BACKGROUND

Mobile computing devices have been developed to increase the functionality that is made available to users in a mobile setting. For example, a user may interact with a mobile phone, tablet computer, or other mobile computing device to check email, surf the web, write texts, interact with applications, and so on. In an enterprise setting, a user may utilize a personal mobile device to engage in enterprise-related activities, such as online meetings, content creation and/or sharing, and so forth.


While allowing a user to utilize their personal device in an enterprise setting is advantageous in terms of cost savings and convenience, it presents a number of implementation challenges. For instance, to leverage an enterprise wireless network to transmit and receive data wirelessly, a personal device typically needs to be configured with particular settings to connect and transmit data over the wireless network. Since a wide variety of different mobile devices exist with a varied assortment of capabilities and operating environments, configuring different devices with the appropriate settings can complicate users' ability to leverage their devices in an enterprise wireless network.


SUMMARY

This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.


Techniques for session-based device configuration are described. According to one or more implementations, various settings of a wireless device are configured to optimize device performance while participating in a communication session via a wireless network. The settings, for instance, are configured dynamically and on a per-session basis.





BRIEF DESCRIPTION OF THE DRAWINGS

The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items.



FIG. 1 is an illustration of an environment in an example implementation that is operable to employ techniques discussed herein.



FIG. 2 illustrates an example implementation scenario for initiating a communication session in accordance with one or more embodiments.



FIG. 3 illustrates an example implementation scenario for updating session awareness in accordance with one or more embodiments.



FIG. 4 illustrates an example implementation scenario for session termination in accordance with one or more embodiments.



FIG. 5 is a flow diagram that describes steps in a method for applying network policies to a communication session in accordance with one or more embodiments.



FIG. 6 is a flow diagram that describes steps in a method for notifying an entity of communication session attributes in accordance with one or more embodiments.



FIG. 7 is a flow diagram that describes steps in a method for notifying a device of a change in communication session attributes in accordance with one or more embodiments.



FIG. 8 is a flow diagram that describes steps in a method for configuring a device to participate in a communication session in accordance with one or more embodiments.



FIG. 9 illustrates an example system and computing device as described with reference to FIG. 1, which are configured to implement embodiments of techniques described herein.





DETAILED DESCRIPTION

Overview


Techniques for session-based device configuration are described. In at least some embodiments, a communication session refers to an exchange of communication data between different nodes in a network. Examples of a communication session include a Voice over Internet Protocol (VoIP) call, a video call, text messaging, a file transfer, and/or combinations thereof. A communication session, for instance, represents a Unified Communication and Collaboration (UC&C) session.


According to one or more implementations, various settings of a wireless device are configured to optimize device performance while participating in a communication session via an enterprise wireless network. The settings, for instance, are configured dynamically and on a per-session basis.


For instance, consider a scenario where a user device (e.g., a user's personal mobile device) connects to a wireless enterprise network managed by an enterprise entity, such as a business entity, an educational entity, a government entity, and so forth. The enterprise entity establishes various network policies that specify rules and parameters for wireless connections to the enterprise network and/or for participating in communication sessions via the enterprise network.


Further to the example scenario, while connected to the enterprise network, the user's device engages in a communication session with a different device. The different device may be connected to the enterprise network, or may be connected to a different network that communicates with the enterprise network. In response to detecting that the user device is engaging in a communication session, a network controller for the enterprise network ascertains various attributes of the user device and/or the communication session. For instance, the network controller may ascertain the attributes directly from the user device, from network elements of the enterprise network (e.g., wireless access points), and/or via a notification received from an external service.


The network controller applies the attributes to the network policies to specify different configuration parameters for the user device. The configuration parameters, for instance, specify different device settings for the user device. The network controller then generates a notification that includes the configuration parameters. As detailed below, the notification may include an application programming interface (API) that is configured with the parameters.


Further to the example scenario, the network controller communicates the notification to the user device. The user device receives the notification and processes the notification (e.g., the API) to ascertain the configuration parameters. The user device utilizes the configuration parameters to configure various settings and/or attributes of the user device. For instance, the configuration parameters are used to control various wireless-related behaviors, such as off-channel scanning, power saving procedures, wireless access point connections, and so forth.


As referenced above, a device may be configured on a per-session basis, e.g., each time a new communication session is initiated that involves the device. Thus, custom device configurations can be defined (e.g., dynamically and based on network policies) that enable devices to adapt to various network and/or device states, and to dynamically reconfigure themselves based on changes in network policies, network state, device state, and so forth.


In the following discussion, an example environment is first described that is operable to employ techniques described herein. Next, a section entitled “Propagating Session Awareness for Communication Sessions” discusses some example ways for notifying different entities of attributes of communication sessions. Following this, a section entitled “Example Network Policies” describes some example network policies in accordance with one or more embodiments. Next, a section entitled “Example Implementation Scenarios” describes some example implementation scenarios in accordance with one or more embodiments. Following this, a section entitled “Example Procedures” describes some example procedures in accordance with one or more embodiments. Finally, a section entitled “Example System and Device” describes an example system and device that are operable to employ techniques discussed herein in accordance with one or more embodiments.


Having presented an overview of example implementations in accordance with one or more embodiments, consider now an example environment in which example implementations may be employed.


Example Environment



FIG. 1 is an illustration of an environment 100 in an example implementation that is operable to employ techniques for session-based device configuration described herein. Generally, the environment 100 includes various devices, services, and networks that enable communication via a variety of different modalities. For instance, the environment 100 includes a client device 102 connected to a wireless enterprise network (WEN) 104. The client device 102 may be configured in a variety of ways, such as a traditional computer (e.g., a desktop personal computer, laptop computer, and so on), a mobile station, an entertainment appliance, a smartphone, a netbook, a game console, a handheld device (e.g., a tablet), a wearable computing device, and so forth.


The WEN 104 is representative of a network that provides the client device 102 with connectivity to various networks and/or services, such as the Internet. The WEN 104 may be provided and/or managed by a particular enterprise entity, such as a business entity, an educational institution (e.g., a university), a government institution, and so forth. As used herein, the term “enterprise” generally refers to an entity or group of entities that may maintain a wireless data network for various purposes. The WEN 104 may provide the client device 102 with wireless connectivity via a variety of different connectivity technologies, such as broadband cable, digital subscriber line (DSL), wireless data connectivity (e.g., WiFi™), T-carrier (e.g., T1), Ethernet, and so forth.


The WEN 104 is implemented at least in part via wireless access points (WAP) 106, which are representative of functionality to transmit and receive wireless data as part of the WEN 104. The WAP 106, for instance, provide wireless connectivity for the client device 102 and other wireless-enabled devices. The client device 102 further includes wireless devices 108, which are representative of functionalities to enable the client device 102 to transmit and receive wireless data. Example implementations of the wireless devices 108 include different types of antennas, radios, filters, receivers, transmitters, and so forth.


The wireless devices 108 are generally associated with wireless drivers 110, which are representative of functionality to enable interaction between components of the client device 102 and the wireless devices 108, and vice-versa. For instance, a communication application 112 may leverage the wireless drivers 110 to enable communication data to be transmitted and received via the wireless devices 108.


Generally, the communication application 112 is representative of functionality to enable different forms of communication via the client device 102. Examples of the communication application 112 include a voice communication application (e.g., a VoIP client), a video communication application, a messaging application, a content sharing application, and combinations thereof. The communication application 112, for instance, enables different communication modalities to be combined to provide diverse communication scenarios. According to one or more embodiments, the communication application 112 represents an application that is installed on the client device 102. Additionally or alternatively, the communication application 112 can be implemented as a remote application that is accessible via a web browser, a web application, and so forth.


The environment 100 further includes a network infrastructure 114, which is representative of different connected components that exchange, process, and/or route data among various entities. The network infrastructure 114, for instance, represents different networks and/or sub-networks that can be provided and managed by different entities, such as Internet service providers (ISP). For example, the WAP 106 are connected to the network infrastructure 114 (e.g., by a wired and/or wireless connection) to provide the WAP 106 with network connectivity, such as to the Internet, the web, other enterprise networks, and so forth.


In at least some embodiments, the network infrastructure 114 enables different forms of communication. The network infrastructure 114, for example, enables transmission and receipt of voice data, video data, content data, and so forth. In at least some embodiments, the network infrastructure 114 represents a Unified Communication and Collaboration (UC&C)-enabled network.


Connected to and/or implemented as part of the network infrastructure 114 is a communication service 116, which is representative of a service to perform various tasks for management of communication between the client device 102 and user devices 118. The communication service 116, for instance, can manage initiation, moderation, and termination of communication sessions. Examples of the communication service 116 include a VoIP service, an online conferencing service, a UC&C service, and so forth. In at least some embodiments, the communication service 116 may be implemented as or be connected to a private branch exchange (PBX) in communication with a Public Switched Telephone Network (“PSTN”) to enable voice communication between the client device 102 and user devices 118.


According to one or more implementations, the client device 102 is configured to interface with the communication service 116 via the communication application 112 to enable communication between the client device 102 and the user devices 118. The communication application 112, for instance, represents a communication portal that is implemented and managed by the communication service 116 to enable various types of communication.


The environment 100 further includes a network controller 120, which is representative of functionality to manage various aspects of the WEN 104. The network controller 120, for instance, is connected to the WEN 104 and maintains state awareness of different components of the WEN 104. For example, the network controller 120 maintains a mapping of the WAP 106 (e.g., in terms of location) and performance attributes of the WAP 106, such as signal quality for the different WAP 106, quality of service (QoS) attributes of the WAP 106, and so forth. The network controller 120, for instance, may be implemented as a software-defined networking (SDN) controller for managing various aspects of the WEN 104.


According to one or more embodiments, the network controller 120 includes connectivity and logic that accesses routing information for the WEN 104. For instance, the network controller 120 can access an Interior Gateway Protocol (IGP) and/or spanning tree switching topology for the WEN 104. This enables the network controller 120 to identify different data routing paths within the WEN 104, and to map and remap the different routing paths. The network controller 120 stores this information as part of a network database 122, which is representative of functionality to track and store state information for components of the WEN 104.


The network controller 120 may augment the network database 122 with performance data from the WAP 106, such as indications of data flow quality across the individual WAP 106. As further detailed herein, this enables the network controller 120 to make decisions based on quality metrics, and to notify various entities (e.g., the client device 102) of quality metrics for the WAP 106 to enable the entities to make network connectivity decisions.
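As an illustrative aside, the kind of per-WAP record the network database 122 might maintain can be sketched as follows. This is a simplified, hypothetical structure; the field names and the choice of metrics are assumptions rather than elements of the described embodiments.

    # Hypothetical sketch of a per-WAP record in the network database 122.
    # Field names and metrics are illustrative assumptions.
    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class WapRecord:
        wap_id: str                          # identifier for the wireless access point
        location: Tuple[float, float]        # position within the enterprise facility
        snr_db: float                        # signal-to-noise ratio
        rssi_dbm: float                      # received signal strength indicator
        jitter_ms: float                     # observed jitter for flows through this WAP
        packet_loss_rate: float              # fraction of packets lost
        routing_paths: List[str] = field(default_factory=list)  # paths learned via IGP/spanning tree

    def update_quality(record: WapRecord, snr_db: float, rssi_dbm: float,
                       jitter_ms: float, packet_loss_rate: float) -> None:
        # The controller refreshes a record as new performance data arrives from the WAP.
        record.snr_db = snr_db
        record.rssi_dbm = rssi_dbm
        record.jitter_ms = jitter_ms
        record.packet_loss_rate = packet_loss_rate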


The network controller 120 further maintains network policies 124, which are representative of different rules and parameters for the WEN 104. The network policies 124, for instance, specify particular behaviors and/or settings for devices that connect to the WEN 104. Examples of different implementations of the network policies 124 are discussed below.


The network controller 120 is configured to propagate the network policies 124 to different entities via a configuration broker 126. Generally, the configuration broker 126 is representative of functionality to interact with different wireless devices (e.g., the client device 102) to enable the devices to be configured based on the network policies 124. The client device 102, for instance, includes a configuration module 128 which is representative of functionality to interact with the configuration broker 126 and/or other functionalities to enable configuration of the client device 102 for wireless communication via the WEN 104.


For example, the configuration broker 126 can communicate various attributes of the network policies 124 to the configuration module 128. The configuration module 128 can cause the client device 102 to be configured according to the attributes, such as to optimize wireless performance of the client device 102. The configuration module 128 may be implemented in a variety of ways, such as via software, firmware, hardware, and/or combinations thereof. According to one or more implementations, the configuration module 128 can be implemented as a physical layer (PHY) and/or media access control (MAC) layer component of the client device 102. Thus, various techniques discussed herein may be implemented at the PHY and/or MAC layer to configure the client device 102 for a communication session.


The network controller 120 may also enable the WAP 106 to be configured for different communication sessions. For instance, various notifications and operations discussed herein with reference to the client device 102 may also be utilized to notify the WAP 106 of communication session attributes and policies to enable the WAP 106 to be configured for particular communication sessions.


In at least some embodiments, configuration of the client device 102 according to the network policies 124 can occur on a per-session basis, e.g., each time the client device 102 participates in a communication session with another device. Further details concerning configuration of the client device 102 according to different network policies 124 and/or session attributes are discussed below.


According to one or more implementations, the network controller 120 maintains active state awareness of various devices connected to the WEN 104, state conditions of the WEN 104, and of communication sessions that involve the WEN 104. For instance, the network database 122 tracks connectivity attributes of different devices and components within the WEN 104. The network database 122, for example, includes records for active communication sessions and dynamically updates the records, such as based on changes in routing path, changes in connection quality, and so forth. In at least some embodiments, quality metrics from the network database 122 can be used to issue notifications to the client device 102 that enable the client device 102 to adjust to various state changes. Further details and implementations of the various entities of the environment 100 are discussed below.


Having described an example environment in which the techniques described herein may operate, consider now a discussion of example ways of propagating various attributes of communication sessions and network policies in accordance with one or more embodiments.


Propagating Session Awareness for Communication Sessions


According to various embodiments, techniques can be employed to dynamically enlighten various network components with information about communication sessions. For instance, notification events can be generated that include various attributes of communication sessions. The notification events can be propagated to different entities further to techniques for session-based device configuration discussed herein.


In at least some embodiments, notification events can be configured using a communication application programming interface (API) that can be leveraged to configure and communicate session information to various network components involved in a communication session. For example, the communication API can identify dialogue events and session events which can be populated with respective values for a particular communication session. Consider, for instance, the following events and attributes that may be conveyed via a notification event generated by the communication API:


Dialogue Events—


These events apply to various portions of a communication session, such as the start, update, and end of a communication session. A dialogue event can include one or more of the following example attributes.


(1) Timestamp: This attribute can be leveraged to specify timestamps for a start of a communication session, updates that occur during a communication session, and an end (e.g., termination) of a communication session.


(2) Source IP Address: This attribute can be leveraged to specify an IP address for a device that is a source of media during a communication session, e.g., a device that initiates a communication session.


(3) Destination IP Address: This attribute can be leveraged to specify an IP address for a device that is to receive media as part of a communication session.


(4) Transport Type: This attribute can be leveraged to specify a transport type or combination of transport types for a communication session. Examples of transport types include Transmission Control Protocol (TCP), User Datagram Protocol (UDP), and so forth.


(5) Source Port: this attribute can be leveraged to specify an identifier for a port at a source device, e.g., a source device identified by the Source IP Address referenced above.


(6) Destination Port: This attribute can be leveraged to specify an identifier for a port at a destination device, e.g., a destination device identified by the Destination IP Address referenced above.


(7) Media Type: This attribute can be leveraged to specify a media type and/or types that are to be transmitted and/or are being transmitted as part of a communication session. As discussed elsewhere herein, the communication session can involve multiple different types of media. Thus, the Media Type attribute can be employed to identify media types in a communication session, such as for applying network policies discussed herein.


(8) Bandwidth Estimation: This attribute can be leveraged to specify an estimated bandwidth that is to be allocated for a communication session. The estimated bandwidth, for instance, can be based on various factors, such as a privilege level associated with a user, type and/or types of media included in a communication session, a network policy applied to the communication session, and so forth.


(9) To: This attribute can be leveraged to identify a user to which media in a communication session is to be transmitted.


(10) From: This attribute can be leveraged to identify a user from which media in a communication session is transmitted.


(11) Error Code: This attribute can be leveraged to specify various error codes for errors that may occur as part of a communication session. For example, errors can include errors that occur during initiation of a communication session, errors that occur while a communication session is in progress, errors that occur when a communication session is terminated, and so forth.


Session Problem Events—


These events can be generated and applied when a communication session experiences errors, performance degradation, and so forth. A session problem event may include one or more of the attributes discussed above with reference to Dialogue Events, and may also include one or more of the following attributes.


(1) Mean Opinion Score (MOS) Degradation: This attribute can be leveraged to specify a MOS for a communication session. The attribute, for instance, can be used to indicate that an overall quality of a communication session has decreased.


(2) Jitter Inter-Arrival Time: This attribute can be leveraged to specify jitter values for a communication session. The attribute, for instance, can be used to indicate that a jitter value or values have increased, e.g., have exceeded a specified jitter value threshold.


(3) Packet Loss Rate: This attribute can be leveraged to specify a packet loss rate for a communication session. The attribute, for instance, can be used to indicate that a packet loss rate has increased, e.g., has exceeded a specified packet loss rate value threshold.


(4) Round Trip Delay (RTD): This attribute can be leveraged to specify RTD values for packets in communication sessions. The attribute, for instance, can be used to indicate that RTD values for packets have increased, e.g., have exceeded a specified RTD value threshold.


(5) Concealment Ratio: This attribute can be leveraged to specify a cumulative ratio of concealment time over speech time observed after starting a communication session. The attribute, for instance, can be used to specify that a concealment ratio has increased, e.g., has exceeded a specified concealment ratio value threshold.


Thus, various notifications discussed herein can include one or more of the attributes discussed above and can be used to propagate the attributes to various entities. Elements from the communication API discussed above, for example, can be configured based on network policies and attributes of a communication session. For instance, attributes of a particular communication session can be applied to network policies to configure elements of the communication API. The configured elements can be communicated to a device (e.g., the client device 102) to enable the device to be configured based on values from the communication API elements.
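As an illustration only, a notification event populated with the dialogue and session-problem attributes above might resemble the following. The structure and field names are hypothetical and do not represent the literal form of the communication API.

    # Hypothetical notification event carrying dialogue and session-problem attributes.
    # Field names and values are illustrative assumptions.
    notification_event = {
        "dialogue": {
            "timestamp": "2017-04-04T10:15:00Z",      # session start time
            "source_ip": "10.1.4.27",                 # device sourcing the media
            "destination_ip": "10.1.9.112",           # device receiving the media
            "transport_type": "UDP",
            "source_port": 50024,
            "destination_port": 50036,
            "media_type": ["audio", "video"],         # a session may carry multiple media types
            "bandwidth_estimation_kbps": 1500,
            "to": "user_b@example.com",
            "from": "user_a@example.com",
            "error_code": 0,
        },
        "session_problem": {                          # present when quality degrades
            "mos_degradation": 1.2,
            "jitter_interarrival_ms": 45,
            "packet_loss_rate": 0.04,
            "round_trip_delay_ms": 310,
            "concealment_ratio": 0.08,
        },
    }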


Having described some example ways of propagating session awareness for communication sessions, consider now some example network policies in accordance with one or more embodiments.


Example Network Policies


The following section describes example network policies (e.g., network policies 124) in accordance with one or more embodiments. As referenced above, network policies generally specify various rules and parameters for connecting to a wireless network, and for transmitting and receiving data via the wireless network.


Off-Channel Scanning


Generally, off-channel scanning refers to scanning for available wireless network channels. For instance, a device may scan for available wireless channels in an attempt to maintain channel awareness in the event that a wireless channel is required.


An example network policy can specify that when a communication session is in progress, off-channel scanning is to be halted and/or minimized. For instance, a network policy may specify that off-channel scanning is not to be performed while a communication session is in progress. Alternatively, a network policy may specify a maximum amount of time during which off-channel scanning may be performed while a communication session is in progress, e.g., 30 milliseconds, 60 milliseconds, and so forth.


In at least some embodiments, a notification event can be sent to a client device notifying the device that the device is currently participating in a communication session, and thus off-channel scanning is to be halted or minimized. The notification event, for instance, can include attributes of the communication API introduced above. When the communication session is terminated, a notification event (e.g., based on the communication API) can be sent to the client device notifying the device that the communication session is terminated, and thus off-channel scanning may resume according to default settings.
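To make the policy concrete, the sketch below shows one way a client might toggle off-channel scanning when session start and termination notifications arrive. The event fields and the apply_scan_settings interface are hypothetical stand-ins for whatever driver interface a particular device exposes.

    # Hypothetical handling of the off-channel scanning policy on a client device.
    # apply_scan_settings() stands in for a device-specific driver interface.
    def on_session_notification(event, wireless_driver):
        if event["type"] == "session_started":
            # Halt scanning entirely, or cap the dwell time if the policy allows brief scans.
            max_scan_ms = event.get("max_off_channel_scan_ms", 0)
            wireless_driver.apply_scan_settings(enabled=max_scan_ms > 0,
                                                max_dwell_ms=max_scan_ms)
        elif event["type"] == "session_terminated":
            # Resume default scanning behavior once the session ends.
            wireless_driver.apply_scan_settings(enabled=True, max_dwell_ms=None)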


Wireless Mobility


Mobile devices often move between different locations. When a mobile device moves while connected to a wireless network, the mobile device may transfer its network connection between different WAP. For instance, if a user is participating in a communication session with a mobile device while walking between areas of an enterprise facility, handoffs may occur between different WAP to enable the communication session to continue and to maintain an acceptable signal quality.


According to various implementations, network policies can be employed to optimize connection handoff between different WAP. For instance, the network controller 120 can maintain various state information for components of the WEN 104. Examples of such state information include:


(1) An identifier for a current WAP to which the client device 102 is connected.


(2) A location of the client device 102. The location, for instance, can be determined relative to a WAP to which the client device 102 is connected.


(3) Direction of movement of the client device 102. For instance, the network controller 120 can determine that the client device 102 is moving in a particular direction, such as relative to an associated WAP. In at least some embodiments, this information can be received from a WAP that detects movement of the client device 102 in a general direction.


(4) Signal quality attributes of a current connection of the client device 102 to a WAP. Examples of signal quality attributes include signal-to-noise ratio (SNR), received signal strength indicator (RSSI), jitter, packet delay, wireless congestion, and so forth.


(5) Signal quality attributes of other WAP of the WEN 104. The attributes, for instance, can be determined from the WAP themselves, and/or from connected devices.


(6) Locations of other WAP. The network controller 120, for instance, may maintain a map of WAP locations. Further, the map may be augmented with signal quality attributes of the individual WAP such that the network controller 120 maintains a mapping of wireless availability and quality in different locations.


The network controller 120 can utilize this information to enable intelligent decisions to be made regarding access point candidates. For instance, the network controller 120 can identify a best-candidate WAP for the client device 102, e.g., based on location proximity to the client device 102 and signal quality. The network controller 120 can then send a notification event (e.g., using the communication API) to the client device 102 instructing the client device 102 to establish a connection with the WAP.


Alternatively or additionally, the network controller 120 can provide a list of best-candidate WAP to the client device 102, and the client device 102 can employ internal decision-making logic to select a WAP from the list with which to connect.


According to various implementations, this process can occur dynamically and continuously. For instance, the network controller 120 can periodically and/or continuously update its WAP state awareness. Further, the network controller 120 can periodically and/or continuously update the client device 102 regarding best-candidate WAP for wireless data transmission.
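As a rough illustration of the kind of decision-making this state information enables, the sketch below scores candidate WAP by proximity and signal quality and picks the best one. The scoring weights and field names are assumptions made for the example and are not part of the described embodiments.

    import math

    # Hypothetical selection of a best-candidate WAP from controller-maintained state.
    # Candidates are dicts with location and signal-quality fields; weights are illustrative.
    def best_candidate_wap(client_location, candidates):
        def score(wap):
            dx = wap["location"][0] - client_location[0]
            dy = wap["location"][1] - client_location[1]
            distance = math.hypot(dx, dy)
            # Favor strong signal quality; penalize distance and congestion.
            return wap["snr_db"] - 0.5 * distance - 10.0 * wap["congestion"]
        return max(candidates, key=score)

    # Example: pick a WAP for a client at a known location.
    candidates = [
        {"wap_id": "wap-3", "location": (10.0, 4.0), "snr_db": 32.0, "congestion": 0.2},
        {"wap_id": "wap-7", "location": (2.0, 1.0), "snr_db": 27.0, "congestion": 0.1},
    ]
    chosen = best_candidate_wap(client_location=(3.0, 2.0), candidates=candidates)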


Battery Power and Wireless Performance


Mobile devices often implement battery-saving procedures when operating under battery power. For instance, when disconnected from an alternating current (AC) source, to conserve battery life a mobile device may lower the amount of power used to transmit wireless data. However, reducing the amount of power to a wireless functionality (e.g., the wireless devices 108) may adversely affect wireless signal quality.


Accordingly, a network policy 124 may specify that while a communication session is in progress, power supplied to wireless functionalities is not to be reduced. In at least some implementations, this network policy can override a default device setting that attempts to reduce power for wireless data transmission when a device is operating on battery power.


The network controller 120, for example, can send a notification event to the client device 102 (e.g., using the communication API) indicating that a communication session is in progress, and thus power supplied to wireless functionality is not to be reduced. When the communication session terminates, the network controller 120 can send a notification event to the client device 102 indicating that the communication session has terminated. Thus, the client device may resume default power saving procedures, such as reducing power supplied to wireless functionality.


Wireless Rate Adaption


Mobile devices may implement rate adaption procedures to compensate for problems in signal quality, such as may occur in areas with noise sources that generate RF interference. Generally, rate adaption refers to a process of reducing a transmission bit rate while increasing transmission power for data transmission. However, typical rate adaption algorithms may adversely affect wireless signal quality. For instance, some rate adaption algorithms cause increases in packet transmission retries and retransmissions, which may cause a receiving device to drop packets as the time sequence to play out media from a communication session expires.


Accordingly, a network policy 124 may specify that while a communication session is in progress, a default rate adaption algorithm is to be overridden with a custom rate adaption algorithm. The custom rate adaption algorithm, for instance, may specify that packet retransmissions and transmission retries are to be reduced from default levels. Implementation of the custom rate adaption algorithm may reduce the likelihood that unnecessary packet retransmissions and transmission retries are performed by a transmitting device.
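The following sketch illustrates one way such a custom profile could reduce retry and retransmission limits relative to defaults while a session is active. The parameter names and values are hypothetical; actual limits live in the wireless driver or firmware of a given device.

    # Hypothetical rate adaption profiles; parameter names and values are illustrative.
    DEFAULT_RATE_ADAPTION = {"max_retries": 7, "max_retransmissions": 4}
    SESSION_RATE_ADAPTION = {"max_retries": 2, "max_retransmissions": 1}

    def select_rate_adaption(session_active):
        # While a communication session is in progress, apply reduced limits so that late
        # retransmissions are not attempted after the media playout deadline has passed.
        return SESSION_RATE_ADAPTION if session_active else DEFAULT_RATE_ADAPTION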


The network controller 120, for example, can send a notification event to the client device 102 (e.g., using the communication API) indicating that a communication session is in progress, and thus a custom rate adaption algorithm is to be implemented if rate adaption is to be performed. When the communication session terminates, the network controller 120 can send a notification event to the client device 102 indicating that the communication session has terminated. Thus, the client device may resume default rate adaption procedures.


Quality of Service


According to various implementations, wireless packets that are transmitted may be associated with quality of service (QoS) markings that specify how the packets are to be treated by various network elements. Examples of QoS markings include expedited forwarding, assured forwarding, best effort, and so forth. For instance, a differentiated services code point (DSCP) field in an IP packet can be configured based on different QoS levels to enable different levels of service to be assigned to network traffic. Typical solutions for QoS markings, however, rely on per-packet QoS marking.


Accordingly, a network policy 124 may specify particular QoS levels that are to be applied to transmission of different data packets. The network controller 120, for example, can send a notification event to the client device 102 (e.g., using the communication API) indicating that a communication session is in progress, and thus a particular QoS level is to be applied to packets that are transmitted by the client device 102. The notification event, for instance, is out-of-band from the actual media packets of the communication session. The notification may include actual tags to be applied to the data packets, regardless of how the data packets may be tagged when they are received for transmission. Thus, a QoS level specified by the notification event for packets of a communication session may override a QoS marking attached to the packets. In this way, embodiments discussed herein provide ways of dynamically configuring QoS for communication sessions, such as on a per-session basis.
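For example, a QoS level delivered in a notification event could be applied by setting the DSCP bits on the media socket, so that every packet sent for the session carries the specified marking. The sketch below assumes a UDP media socket and an expedited-forwarding code point (DSCP 46); support for the IP_TOS socket option varies by platform.

    import socket

    # Apply a DSCP value from a notification event to all packets sent on a media socket.
    # DSCP occupies the upper six bits of the IP TOS byte, hence the shift by two.
    def apply_qos_marking(sock, dscp):
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, dscp << 2)

    media_socket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    apply_qos_marking(media_socket, dscp=46)    # 46 = expedited forwarding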


Channel Quality


As discussed above, state information regarding different WAP can be maintained, such as location and signal quality for different WAP. Thus, if the client device 102 experiences signal quality degradation with a current WAP, the client device 102 can be informed of candidate replacement WAP. The network controller 120, for example, can send a notification event to the client device 102 (e.g., using the communication API) identifying a candidate WAP and/or wireless channels that the client device 102 may associate with to increase signal quality. In at least some implementations, this can circumvent the need for the client device to perform channel search procedures, such as off-channel scanning.


Having described some example network policies, consider now some example implementation scenarios for session-based device configuration in accordance with one or more embodiments.


Example Implementation Scenarios


The following section describes example implementation scenarios for session-based device configuration in accordance with one or more embodiments. The implementation scenarios may be implemented in the environment 100 discussed above, and/or any other suitable environment.



FIG. 2 illustrates an example implementation scenario for initiating a communication session generally at 200. The scenario 200 includes various entities and components introduced above with reference to the environment 100.


In the scenario 200, a communication session 202 is initiated between the client device 102 and the user device 118 via the communication service 116. The communication service 116, for instance, serves as an intermediary between the communication application 112 of the client device 102, and the user device 118. For example, the communication service 116 may manage various aspects of initiation, moderation, and termination of the communication session 202.


The communication session 202 may include various types of communication media, such as voice, video, and/or combinations thereof. While the user device 118 is illustrated as being connected outside of the WEN 104, in alternative implementations the client device 102 and the user device 118 may be connected directly to the WEN 104.


In response to initiation of the communication session 202, the communication service 116 generates a notification event 204 and sends the notification event 204 to the network controller 120. The notification event 204 notifies the network controller 120 that the communication session 202 is initiated. The notification event 204 includes a session notification API 206, which represents an implementation of the communication API detailed above.


Further to the scenario 200, the session notification API 206 includes values for various attributes of the communication session 202. Examples of such attributes include identifiers for the client device 102 and the user device 118, such as IP addresses, media access control (MAC) addresses, and so forth. The attributes may further include attributes of the communication session itself, such as a type or types of media being transferred during the communication session, a start time of the communication session, an application ID for the communication application 112, and so forth. Examples of other attributes that may be communicated with the session notification API 206 are detailed above, such as in the discussion of the example communication API and the example network policies.


Thus, based on information from the session notification API (e.g., an ID for the client device 102), the network controller 120 ascertains that the client device 102 is connected to a network domain of the network controller 120. Accordingly, the network controller 120 generates a configuration event 208 that includes a session configuration API 210. The session configuration API 210, for instance, is configured by applying values from the session notification API 206 to the network policies 124.
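One way to picture this step is as a mapping from notification values onto the network policies 124, producing the configuration parameters carried by the session configuration API 210. The sketch below is a simplified, hypothetical rendering; the policy names, parameter names, and DSCP values are assumptions made for the example.

    # Hypothetical policy application: session notification attributes in,
    # configuration parameters out. Names and values are illustrative.
    NETWORK_POLICIES = {
        "off_channel_scan_ms_during_session": 0,    # halt scanning during a session
        "allow_tx_power_reduction_in_session": False,
        "qos_dscp_by_media": {"audio": 46, "video": 34, "data": 0},
    }

    def build_session_configuration(session_notification, policies=NETWORK_POLICIES):
        media_types = session_notification.get("media_type", [])
        dscp = max((policies["qos_dscp_by_media"].get(m, 0) for m in media_types), default=0)
        return {
            "session_id": session_notification.get("session_id"),
            "max_off_channel_scan_ms": policies["off_channel_scan_ms_during_session"],
            "allow_tx_power_reduction": policies["allow_tx_power_reduction_in_session"],
            "qos_dscp": dscp,
        }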


Further to the scenario 200, the network controller 120 communicates the configuration event 208 to the client device 102 via the WEN 104. For instance, the configuration broker 126 interacts with the configuration module 128 to communicate the configuration event 208. The configuration module 128 includes functionality to consume the session configuration API 210, extract information from the API, and to configure various attributes of the client device 102 based on attributes and values included in the session configuration API 210. For instance, the configuration module 128 can propagate information from the session configuration API 210 to different functionalities of the client device 102 to enable the client device 102 to operate according to the network policies 124, e.g., while engaging in the communication session 202.


As an example, consider that the wireless driver 110 is configured by default to perform periodic off-channel scanning to identify available wireless channels. According to the scenario 200, the session configuration API 210 includes an indication that the client device is either to halt off-channel scanning during the communication session 202, or is to limit the amount of time during which off-channel scanning is performed. The configuration module 128 can read this information from the session configuration API 210, and communicate the information to the wireless driver 110. Thus, the wireless driver 110 may operate according to this policy to limit or stop off-channel scanning while the communication session 202 is active.


This example policy is presented for purpose of example only, and it is to be appreciated that a wide variety of different policies and behaviors can be enforced utilizing techniques discussed herein. Examples of other policies and behaviors that may be utilized are discussed above.



FIG. 3 illustrates an example implementation scenario for updating session awareness, generally at 300. The scenario 300 includes various entities and components introduced above with reference to the environment 100. In at least some embodiments, the scenario 300 represents a continuation of the scenario 200, discussed above.


In the scenario 300, the communication service 116 detects one or more changes in the communication session 202. For instance, the communication service 116 may receive an indication from the client device 102 and/or the user device 118 of a problem with session quality of the communication session 202. Examples of session quality problems include lower than acceptable S/N ratio, low signal strength, too much jitter, too many dropped packets, and so forth.


In response to the indication of session quality problems, the communication service 116 generates an update event 302 that includes a session update API 304. The session update API 304, for instance, represents an implementation of the communication API detailed above. The communication service 116 sends the update event 302 to the network controller 120. The update event 302 notifies the network controller 120 of a change in the communication session 202, e.g., of signal problems with the communication session.


Further to the scenario 300, the session update API 304 includes values for various attributes of the communication session 202. Examples of such attributes include identifiers for the client device 102 and the user device 118, such as IP addresses, media access control (MAC) addresses, and so forth. The attributes may further include a session ID for the communication session and an indication of the change to the communication session. Examples of other attributes that may be communicated with the session update API 304 are detailed above, such as in the discussion of the example communication API and the example network policies.


Thus, based on information from the session update API 304, the network controller 120 ascertains that a problem is occurring with the communication session 202. The session update API 304, for instance, may indicate that signal quality for a WAP 106 to which the client device 102 is connected is poor.


Accordingly, the network controller 120 generates a reconfiguration event 306 that includes a reconfiguration API 308. The reconfiguration API 308, for instance, is configured by applying values from the session update API 304 to the network policies 124. In at least some embodiments, the reconfiguration API 308 may identify candidate WAP 106 that have better signal quality than a current WAP 106 to which the client device 102 is connected.


Further to the scenario 300, the network controller 120 communicates the reconfiguration event 306 to the client device 102 via the WEN 104. For instance, the configuration broker 126 interacts with the configuration module 128 to communicate the reconfiguration event 306. The configuration module 128 includes functionality to consume the reconfiguration API 308, extract information from the API, and to configure various attributes of the client device 102 based on attributes and values included in the reconfiguration API 308. For instance, the configuration module 128 can propagate information from the reconfiguration API 308 to different functionalities of the client device 102 to enable the client device 102 to operate according to the network policies 124, e.g., while engaging in the communication session 202.


In at least some embodiments, based on a candidate WAP 106 identified in the reconfiguration API 308, the client device 102 initiates a handoff procedure to disconnect from a current WAP 106 and to connect to a different WAP 106. Thus, signal quality for the communication session 202 may be increased by connecting to a WAP 106 with higher signal quality.


While the scenario 300 is discussed with reference to the reconfiguration event 306 being generated in response to the update event 302, this is not intended to be limiting. For instance, in at least some embodiments the network controller 120 maintains its own session and/or network awareness independent of the communication service 116. Thus, the network controller 120 can detect changes in network and/or session attributes, and can generate a reconfiguration event and reconfiguration API to notify the client device 102 of the changes and appropriate configuration settings for the client device 102 based on the changes. The network controller 120, for instance, can generate the reconfiguration event 306 and the reconfiguration API 308 based on its own state awareness and independent of a notification from an external entity such as the communication service 116.


Accordingly, techniques discussed herein can be employed to dynamically update communication session awareness while a communication session is in progress. Further, update events and reconfiguration events may be issued multiple times during a particular communication session, thus enabling participating devices to be dynamically reconfigured to adapt to changes in session quality and/or session attributes.



FIG. 4 illustrates an example implementation scenario for session termination, generally at 400. The scenario 400 includes various entities and components introduced above with reference to the environment 100. In at least some embodiments, the scenario 400 represents a continuation of the scenarios 200 and 300, discussed above.


In the scenario 400, the communication service 116 detects that the communication session 202 has terminated. For instance, the communication service 116 may receive an indication from the client device 102 and/or the user device 118 that the communication session 202 has ended.


In response to the indication of session termination, the communication service 116 generates an update event 402 that includes a session update API 404. The session update API 404, for instance, represents an implementation of the communication API detailed above. The communication service 116 sends the update event 402 to the network controller 120. The update event 402 notifies the network controller 120 that the communication session 202 has ended.


Further to the scenario 400, the session update API 404 includes values for various attributes of the communication session 202. Examples of such attributes include identifiers for the client device 102 and the user device 118. The attributes may further include a session ID for the communication session 202 and a session end timestamp for the communication session 202. Examples of other attributes that may be communicated with the session update API 404 are detailed above in the discussion of the example communication API.


Thus, based on information from the session update API 404, the network controller 120 ascertains that the communication session 202 has ended. Accordingly, the network controller 120 generates a termination event 406 that includes a termination API 408. The termination API 408, for instance, is configured by applying values from the session update API 404 to the network policies 124. In at least some embodiments, the termination API 408 identifies the communication session 202 and specifies that the communication session has ended.


Further to the scenario 400, the network controller 120 communicates the termination event 406 to the client device 102 via the WEN 104. For instance, the configuration broker 126 interacts with the configuration module 128 to communicate the termination event 406. The configuration module 128 includes functionality to consume the termination API 408 and to configure various attributes of the client device 102 based on attributes and values included in the termination API 408. For instance, the configuration module 128 can propagate information from the termination API 408 to different functionalities of the client device 102 to enable the client device 102 to operate according to the network policies 124.


In at least some embodiments, based on an indication that the communication session 202 is terminated, the client device 102 may notify its various components that they may resume default behavior. For instance, the configuration module 128 may notify the wireless drivers 110 that default behaviors may be resumed, such as with reference to off-channel scanning, battery conservation techniques, wireless rate adaption algorithms, and so forth.


Accordingly, techniques discussed herein can be employed to notify devices of session start and stop events, and to dynamically configure device attributes on a per-session basis.


Having discussed some example implementation scenarios, consider now a discussion of some example procedures in accordance with one or more embodiments.


Example Procedures


The following discussion describes some example procedures for session-based device configuration in accordance with one or more embodiments. The example procedures may be employed in the environment 100 of FIG. 1, the system 900 of FIG. 9, and/or any other suitable environment. Further, the example procedures may represent implementations of the example scenarios discussed above. In at least some embodiments, steps described for the various procedures can be implemented automatically and independent of user interaction.



FIG. 5 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method describes an example procedure for applying network policies to a communication session in accordance with one or more embodiments. In at least some implementations, the method can be performed by the network controller 120.


Step 500 receives a notification that a communication session is initiated in a network. The notification, for instance, includes various attributes of the communication session. For example, the notification may be configured via the communication API detailed above. Examples of attributes and information that may be communicated via the notification are described above.


Step 502 ascertains attributes of the communication session from the notification. For example, the network controller 120 can process the notification to identify session attributes, such as from a communication API included with the notification.


Step 504 applies the attributes of the communication session to network policies for the network to specify parameters for the communication session. For instance, different policy-based decisions can be made based on the attributes. Examples of network policies are detailed above.


Step 506 generates a configuration event that includes the parameters for the communication session. The configuration event, for instance, includes a communication API that is populated with various values that represent the parameters for the communication session. Examples of such parameters include behaviors for a device that is participating in the communication session, such as whether to engage in off-channel scanning during the communication session, allowed power conservation techniques during the communication session, QoS marking to be applied to session packets, and so forth.


Step 508 communicates the configuration event to a device that is connected to the network and that is participating in the communication session. In at least some embodiments, information from the configuration event enables the device to configure itself to operate according to the parameters for the communication session.


With reference to the environment 100 and the scenarios discussed above, the network controller 120 can communicate the configuration event to the client device 102. Alternatively or additionally, the network controller 120 can communicate the configuration event to other network elements, such as the WAP 106. For instance, techniques discussed herein may be employed to configure the WAP 106 and/or other network components and network elements, and are not limited to configuration of end-user devices.



FIG. 6 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method describes an example procedure for notifying an entity of communication session attributes in accordance with one or more embodiments.


Step 600 configures a notification event that includes attributes of a communication session that is occurring in a network. The communication service 116, for instance, populates a communication API with attributes of a communication session. Examples of communication API and communication session attributes are detailed above. In at least some embodiments, the attributes may include attributes of a communication session that was recently initiated, and/or changes to attributes of an existing communication session.


Step 602 communicates the notification event to a network controller for the network. The communication service 116, for instance, communicates the populated communication API to the network controller 120. The notification event may include attributes of a new communication session, and/or changes to attributes of an existing communication session. As detailed herein, the network controller 120 can utilize information from the communication API to apply network policies and notify various devices of parameters and behaviors to be applied for a communication session.



FIG. 7 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method describes an example procedure for notifying a device of a change in communication session attributes in accordance with one or more embodiments.


Step 700 receives an indication of a change in communication session attributes for a communication session that is occurring in a network. The network controller 120, for example, receives an indication that one or more attributes of a communication session have changed. Examples of such a change include a change in session quality, a change in device location, a change in device performance (e.g., for the client device 102 and/or a WAP 106), and so forth. The indication of the change may be received from the communication service 116 and/or based on detected state conditions for the network.


Step 702 generates a reconfiguration event based on the change in the communication session attributes. The network controller 120, for instance, applies the changed attributes to the network policies 124 to generate a session update API for the communication session. The session update API, for instance, includes element values that reflect the change in the communication session attributes as applied to the network policies 124.


In at least some embodiments, the reconfiguration event may identify WAPs 106 that are candidates to provide a wireless connection. The candidates may be identified based on signal quality and/or location for the individual WAPs 106. For instance, if the change in the communication session attributes indicates a change in session quality, the reconfiguration event can identify WAPs 106 in a particular region that have a higher signal quality than the currently-connected WAP.


Alternatively or additionally, if the change in the communication session attributes indicates that a device (e.g., the client device 102) is moving from one location to another, the reconfiguration event can identify WAPs 106 that are located in the general direction of movement and that are available to provide wireless connectivity. Thus, a device that receives the reconfiguration event can process data from the event and select a WAP 106 with which to associate, such as to improve signal quality during a communication session and/or enable the communication session to continue when moving between locations.
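

The candidate-selection logic can be pictured as a small filter over hypothetical WAP records, as in the sketch below; the signal-strength values, direction labels, and record layout are assumptions made purely for the example.

    def candidate_waps(waps, current_wap_id, min_signal=None, direction=None):
        # Return IDs of access points that are better candidates than the currently
        # connected WAP, either because their signal quality exceeds a threshold or
        # because they lie in the client's general direction of movement.
        out = []
        for wap in waps:
            if wap["id"] == current_wap_id:
                continue
            if min_signal is not None and wap["signal"] < min_signal:
                continue
            if direction is not None and wap["direction"] != direction:
                continue
            out.append(wap["id"])
        return out

    waps = [
        {"id": "wap-1", "signal": -70, "direction": "east"},
        {"id": "wap-2", "signal": -55, "direction": "east"},
        {"id": "wap-3", "signal": -50, "direction": "west"},
    ]
    # A client connected to wap-1 is moving east and reports degraded quality.
    print(candidate_waps(waps, current_wap_id="wap-1", min_signal=-60, direction="east"))  # ['wap-2']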


Step 704 communicates the reconfiguration event to a device that is connected to the network and that is participating in the communication session. The network controller 120, for instance, communicates the reconfiguration event to the client device 102. Based on information from the reconfiguration event, the client device 102 can change its internal settings, connect to a different WAP 106, and so forth.



FIG. 8 is a flow diagram that describes steps in a method in accordance with one or more embodiments. The method describes an example procedure for configuring a device to participate in a communication session in accordance with one or more embodiments.


Step 800 receives a configuration event that includes parameters to be applied for a communication session. The client device 102, for instance, receives the configuration event from the network controller 120. In at least some embodiments, the configuration event may be an initial configuration event, e.g., a first configuration event that is received after initiation of a communication session. Alternatively, the configuration event may be a reconfiguration event that is received during a communication session and subsequent to a previously-received configuration event for the communication session. According to various implementations, the configuration event is received after the client device 102 has begun participating in the communication session.


Step 802 processes the configuration event to identify the parameters for the communication session. The configuration event, for example, includes a communication API that is populated with different values for different session parameters and/or device settings. The client device 102 may process the communication API to expose the different parameters for the communication session.


Step 804 configures a device for the communication session based on the parameters. The client device 102, for instance, can configure various device settings based on the parameters. For example, the configuration module 128 can communicate various parameters and/or settings to the wireless drivers 110 to enable the wireless drivers 110 to control the wireless devices 108 according to the parameters and settings. Examples of different device settings and attributes that can be configured are discussed above, and include off-channel scan settings, power conservation settings, QoS marking to be applied to communication session packets, and so forth.
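

For illustration, the following sketch applies parameters from a received (re)configuration event to a stand-in wireless driver object; the driver interface and the parameter names are hypothetical and simplified, since real wireless drivers 110 expose vendor-specific controls.

    class WirelessDriver:
        # Stand-in for a wireless driver interface, used only for this sketch.
        def __init__(self):
            self.settings = {}
        def set(self, name, value):
            self.settings[name] = value

    def apply_configuration_event(driver: WirelessDriver, event: dict) -> None:
        # Push each session parameter from the (re)configuration event into driver settings.
        # Unknown parameters are ignored so that later reconfiguration events can add fields.
        known = {"off_channel_scan", "power_save_mode", "qos_marking"}
        for name, value in event.get("parameters", {}).items():
            if name in known:
                driver.set(name, value)

    driver = WirelessDriver()
    apply_configuration_event(driver, {"type": "configuration",
                                       "parameters": {"off_channel_scan": False, "qos_marking": 46}})
    print(driver.settings)  # {'off_channel_scan': False, 'qos_marking': 46}

The same routine can service both an initial configuration event and a subsequent reconfiguration event, with later events simply overwriting earlier values.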


A device may be configured as part of an initial configuration of the device for a communication session and/or as part of a configuration update. For instance, the parameters may include updates to previously configured settings and device attributes, such as updates received as part of a reconfiguration event. Thus, previously-applied settings and attributes for a device participating in a communication session may be updated, such as to reflect changes in the communication session.


As referenced above in the discussion of environment 100, the configuration module 128 can be implemented as a PHY and/or MAC layer component of the client device 102. Aspects of the various procedures discussed above, for instance, may be implemented at the PHY and/or MAC layer to configure a device for a communication session. For example, processing of the communication API may occur at the PHY and/or MAC layer to enable various device parameters and settings to be configured for a communication session.


While the method discussed above is described with reference to configuring a user device (e.g., the client device 102) for a communication session, this is not intended to be limiting. For instance, in at least some embodiments, network components such as wireless access points, network firewalls, and so forth, may be configured utilizing techniques discussed herein. Different events and APIs discussed herein, for example, may be communicated to different network components to enable the components to be configured for particular communication sessions. Configuration of network components may occur additionally or alternatively to configuration of an end-user device, and in at least some embodiments may occur in parallel with configuration of an end-user device. For instance, the various notification events discussed above as being communicated to the client device 102 may additionally or alternatively be communicated to one or more of the WAP 106, a network firewall component, a hub, a switch, a router, and so forth, to enable the different components to be configured according to techniques discussed herein.


As discussed above, the different notification events and APIs referenced herein may be communicated separately from data packets of a communication session. Thus, the notification events may be considered as out-of-band communications with regard to communication sessions. In at least some embodiments, this enables devices to be configured and reconfigured for a communication session without interfering with the communication session itself, e.g., independent of a flow of data packets for the communication session.
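

The out-of-band character of these events can be illustrated with two independent sockets, one carrying session media and one carrying control events; the ports and payloads below are placeholders chosen only for the sketch and do not represent any particular transport used by the techniques described herein.

    import json
    import socket

    # Two independent sockets: one carries the session's media packets, the other carries
    # control-plane events. The port numbers are arbitrary placeholders for this illustration.
    MEDIA_PORT, CONTROL_PORT = 50000, 50001

    media = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    control = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    # Media packets flow continuously on the media socket...
    media.sendto(b"\x00" * 160, ("127.0.0.1", MEDIA_PORT))

    # ...while a reconfiguration event travels separately and never touches the media flow.
    event = json.dumps({"type": "reconfiguration", "parameters": {"qos_marking": 46}}).encode()
    control.sendto(event, ("127.0.0.1", CONTROL_PORT))

    media.close()
    control.close()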


Having discussed some example procedures, consider now a discussion of an example system and device in accordance with one or more embodiments.


Example System and Device



FIG. 9 illustrates an example system generally at 900 that includes an example computing device 902 that is representative of one or more computing systems and/or devices that may implement various techniques described herein. For example, the client device 102, the communication service 116, and/or the network controller 120 discussed above can be embodied as the computing device 902. The computing device 902 may be, for example, a server of a service provider, a device associated with the client (e.g., a client device), an on-chip system, and/or any other suitable computing device or computing system.


The example computing device 902 as illustrated includes a processing system 904, one or more computer-readable media 906, and one or more Input/Output (I/O) Interfaces 908 that are communicatively coupled, one to another. Although not shown, the computing device 902 may further include a system bus or other data and command transfer system that couples the various components, one to another. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures. A variety of other examples are also contemplated, such as control and data lines.


The processing system 904 is representative of functionality to perform one or more operations using hardware. Accordingly, the processing system 904 is illustrated as including hardware elements 910 that may be configured as processors, functional blocks, and so forth. This may include implementation in hardware as an application specific integrated circuit or other logic device formed using one or more semiconductors. The hardware elements 910 are not limited by the materials from which they are formed or the processing mechanisms employed therein. For example, processors may be comprised of semiconductor(s) and/or transistors (e.g., electronic integrated circuits (ICs)). In such a context, processor-executable instructions may be electronically-executable instructions.


The computer-readable media 906 is illustrated as including memory/storage 912. The memory/storage 912 represents memory/storage capacity associated with one or more computer-readable media. The memory/storage 912 may include volatile media (such as random access memory (RAM)) and/or nonvolatile media (such as read only memory (ROM), Flash memory, optical disks, magnetic disks, and so forth). The memory/storage 912 may include fixed media (e.g., RAM, ROM, a fixed hard drive, and so on) as well as removable media (e.g., Flash memory, a removable hard drive, an optical disc, and so forth). The computer-readable media 906 may be configured in a variety of other ways as further described below.


Input/output interface(s) 908 are representative of functionality to allow a user to enter commands and information to computing device 902, and also allow information to be presented to the user and/or other components or devices using various input/output devices. Examples of input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone (e.g., for voice recognition and/or spoken input), a scanner, touch functionality (e.g., capacitive or other sensors that are configured to detect physical touch), a camera (e.g., which may employ visible or non-visible wavelengths such as infrared frequencies to detect movement that does not involve touch as gestures), and so forth. Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, tactile-response device, and so forth. Thus, the computing device 902 may be configured in a variety of ways as further described below to support user interaction.


Various techniques may be described herein in the general context of software, hardware elements, or program modules. Generally, such modules include routines, programs, objects, elements, components, data structures, and so forth that perform particular tasks or implement particular abstract data types. The terms “module,” “functionality,” and “component” as used herein generally represent software, firmware, hardware, or a combination thereof. The features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.


An implementation of the described modules and techniques may be stored on or transmitted across some form of computer-readable media. The computer-readable media may include a variety of media that may be accessed by the computing device 902. By way of example, and not limitation, computer-readable media may include “computer-readable storage media” and “computer-readable signal media.”


“Computer-readable storage media” may refer to media and/or devices that enable persistent storage of information in contrast to mere signal transmission, carrier waves, or signals per se. Computer-readable storage media do not include signals per se. The computer-readable storage media includes hardware such as volatile and non-volatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer readable instructions, data structures, program modules, logic elements/circuits, or other data. Examples of computer-readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage device, tangible media, or article of manufacture suitable to store the desired information and which may be accessed by a computer.


“Computer-readable signal media” may refer to a signal-bearing medium that is configured to transmit instructions to the hardware of the computing device 902, such as via a network. Signal media typically may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as carrier waves, data signals, or other transport mechanism. Signal media also include any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media.


As previously described, hardware elements 910 and computer-readable media 906 are representative of instructions, modules, programmable device logic and/or fixed device logic implemented in a hardware form that may be employed in some embodiments to implement at least some aspects of the techniques described herein. Hardware elements may include components of an integrated circuit or on-chip system, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), and other implementations in silicon or other hardware devices. In this context, a hardware element may operate as a processing device that performs program tasks defined by instructions, modules, and/or logic embodied by the hardware element as well as a hardware device utilized to store instructions for execution, e.g., the computer-readable storage media described previously.


Combinations of the foregoing may also be employed to implement various techniques and modules described herein. Accordingly, software, hardware, or program modules, including the modules described herein, may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage media and/or by one or more hardware elements 910. The computing device 902 may be configured to implement particular instructions and/or functions corresponding to the software and/or hardware modules. Accordingly, implementation of modules that are executable by the computing device 902 as software may be achieved at least partially in hardware, e.g., through use of computer-readable storage media and/or hardware elements 910 of the processing system. The instructions and/or functions may be executable/operable by one or more articles of manufacture (for example, one or more computing devices 902 and/or processing systems 904) to implement techniques, modules, and examples described herein.


As further illustrated in FIG. 9, the example system 900 enables ubiquitous environments for a seamless user experience when running applications on a personal computer (PC), a television device, and/or a mobile device. Services and applications run substantially similarly in all three environments for a common user experience when transitioning from one device to the next while utilizing an application, playing a video game, watching a video, and so on.


In the example system 900, multiple devices are interconnected through a central computing device. The central computing device may be local to the multiple devices or may be located remotely from the multiple devices. In one embodiment, the central computing device may be a cloud of one or more server computers that are connected to the multiple devices through a network, the Internet, or other data communication link.


In one embodiment, this interconnection architecture enables functionality to be delivered across multiple devices to provide a common and seamless experience to a user of the multiple devices. Each of the multiple devices may have different physical requirements and capabilities, and the central computing device uses a platform to enable the delivery of an experience to the device that is both tailored to the device and yet common to all devices. In one embodiment, a class of target devices is created and experiences are tailored to the generic class of devices. A class of devices may be defined by physical features, types of usage, or other common characteristics of the devices.


In various implementations, the computing device 902 may assume a variety of different configurations, such as for computer 914, mobile 916, and television 918 uses. Each of these configurations includes devices that may have generally different constructs and capabilities, and thus the computing device 902 may be configured according to one or more of the different device classes. For instance, the computing device 902 may be implemented as the computer 914 class of device that includes a personal computer, a desktop computer, a multi-screen computer, a laptop computer, a netbook, and so on.


The computing device 902 may also be implemented as the mobile 916 class of device that includes mobile devices, such as a mobile phone, portable music player, portable gaming device, a tablet computer, a multi-screen computer, and so on. The computing device 902 may also be implemented as the television 918 class of device that includes devices having or connected to generally larger screens in casual viewing environments. These devices include televisions, set-top boxes, gaming consoles, and so on.


The techniques described herein may be supported by these various configurations of the computing device 902 and are not limited to the specific examples of the techniques described herein. For example, functionalities discussed with reference to the communication service 116, the communication application 112, and/or the network controller 120 may be implemented all or in part through use of a distributed system, such as over a “cloud” 920 via a platform 922 as described below.


The cloud 920 includes and/or is representative of a platform 922 for resources 924. The platform 922 abstracts underlying functionality of hardware (e.g., servers) and software resources of the cloud 920. The resources 924 may include applications and/or data that can be utilized while computer processing is executed on servers that are remote from the computing device 902. Resources 924 can also include services provided over the Internet and/or through a subscriber network, such as a cellular or Wi-Fi network.


The platform 922 may abstract resources and functions to connect the computing device 902 with other computing devices. The platform 922 may also serve to abstract scaling of resources to provide a corresponding level of scale to encountered demand for the resources 924 that are implemented via the platform 922. Accordingly, in an interconnected device embodiment, implementation of functionality described herein may be distributed throughout the system 900. For example, the functionality may be implemented in part on the computing device 902 as well as via the platform 922 that abstracts the functionality of the cloud 920.


Discussed herein are a number of methods that may be implemented to perform the techniques described herein. Aspects of the methods may be implemented in hardware, firmware, or software, or a combination thereof. The methods are shown as a set of steps that specify operations performed by one or more devices and are not necessarily limited to the orders shown for performing the operations by the respective blocks. Further, an operation shown with respect to a particular method may be combined and/or interchanged with an operation of a different method in accordance with one or more implementations. Aspects of the methods can be implemented via interaction between various entities discussed above with reference to the environment 100.


CONCLUSION

Techniques for session-based device configuration are described. Although embodiments are described in language specific to structural features and/or methodological acts, it is to be understood that the embodiments defined in the appended claims are not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed embodiments.

Claims
  • 1. A system comprising: at least one processor; and one or more computer-readable storage media including instructions stored thereon that, responsive to execution by the at least one processor, cause the system to perform operations including: receiving a notification that a communication session is initiated in a network, the notification including a session notification application programming interface (API) that includes a value for an attribute of the communication session; ascertaining that a client device is connected to the network based on the attribute of the communication session received as part of the session notification API; applying the attribute of the communication session received as part of the session notification API to a network policy for the network to specify a parameter for the communication session; configuring a session configuration application programming interface (API) with the parameter for the communication session by applying the value for the attribute included in the session notification API to the network policy; generating a configuration event that includes the session configuration API configured with the parameter for the communication session; and communicating the configuration event to the client device.
  • 2. The system as recited in claim 1, wherein the notification is separate from data packets of the communication session.
  • 3. The system as recited in claim 1, wherein the attribute includes an identifier for the client device.
  • 4. The system as recited in claim 1, wherein the attribute includes one or more media types for the communication session.
  • 5. The system as recited in claim 1, wherein the session notification API further includes multiple attributes of the communication session.
  • 6. The system as recited in claim 1, wherein the parameter for the communication session specifies a wireless behavior to be applied by the client device.
  • 7. The system as recited in claim 1, wherein the parameter for the communication session specifies a quality of service marking to be applied by the client device to data packets of the communication session.
  • 8. The system as recited in claim 1, the operations further including: receiving an indication of a change in the communication session; generating a reconfiguration event that includes at least one of a change to the parameter for the communication session or a change to a different parameter for the communication session; and communicating the reconfiguration event to the client device.
  • 9. The system as recited in claim 8, wherein the indication of the change includes an indication of a problem with session quality for the communication session, and wherein at least one of the change to the parameter for the communication session or the change to the different parameter is specified to increase the session quality for the communication session.
  • 10. The system as recited in claim 8, wherein the indication of the change includes an indication that the client device is moving to a different location, and wherein at least one of the change to the parameter for the communication session or the change to the different parameter identifies a wireless access point that is available to provide wireless connectivity at the different location.
  • 11. The system as recited in claim 8, wherein the indication of the change includes an indication that the communication session is terminated, and wherein at least one of the change to the parameter for the communication session or the change to the different parameter includes an indication that the communication session is terminated.
  • 12. The system as recited in claim 1, wherein the session notification API is configured to identify dialogue events and session events, and populate the dialogue events and session events with respective values for the communication session.
  • 13. A computer-implemented method comprising: receiving a notification that a communication session is initiated in a network, the notification including a session notification application programming interface (API) that includes a value for an attribute of the communication session; ascertaining that a client device is connected to the network based on the attribute of the communication session received as part of the session notification API; applying the attribute of the communication session received as part of the session notification API to a network policy for the network to specify a parameter for the communication session; configuring a session configuration application programming interface (API) with the parameter for the communication session by applying the value for the attribute included in the session notification API to the network policy; generating a configuration event that includes the session configuration API configured with the parameter for the communication session; and communicating the configuration event to the client device.
  • 14. The method as recited in claim 13, wherein the parameter for the communication session specifies a wireless behavior to be applied by the client device.
  • 15. The method as recited in claim 13, further comprising: receiving an indication of a change in the communication session; generating a reconfiguration event that includes at least one of a change to the parameter for the communication session or a change to a different parameter for the communication session; and communicating the reconfiguration event to the client device.
  • 16. The method as recited in claim 15, wherein the indication of the change includes an indication of a problem with session quality for the communication session, and wherein at least one of the change to the parameter for the communication session or the change to the different parameter is specified to increase the session quality for the communication session.
  • 17. A method comprising: receiving an update event including an indication of a change to a communication session in a network between a client device and at least one other device, the update event further including a session update application programming interface (API) that includes a value for an attribute of the communication session; ascertaining that the change has occurred in the communication session based on the attribute of the communication session received as part of the session update API; applying the attribute of the communication session received as part of the session update API to a network policy for the network to specify a parameter for the communication session; configuring a reconfiguration application programming interface (API) with the parameter for the communication session by applying the value for the attribute included in the session update API to the network policy; generating a reconfiguration event that includes the reconfiguration API configured with the parameter for the communication session; and communicating the reconfiguration event to the client device.
  • 18. The method of claim 17, wherein the indication of the change includes an indication of a problem with session quality for the communication session, and wherein the parameter for the communication session is specified to increase the session quality for the communication session.
  • 19. The method of claim 17, wherein the indication of the change includes an indication that the client device is moving to a different location, and wherein the parameter for the communication session identifies a wireless access point that is available to provide wireless connectivity at the different location.
  • 20. The method of claim 17, wherein the indication of the change includes an indication that the communication session is terminated, and wherein the change to the parameter for the communication session includes an indication that the communication session is terminated.
US Referenced Citations (510)
Number Name Date Kind
4868653 Golin et al. Sep 1989 A
5060170 Bourgeois Oct 1991 A
5149919 Greanias et al. Sep 1992 A
5241682 Bryant et al. Aug 1993 A
5353133 Bernkopf Oct 1994 A
5450586 Kuzara et al. Sep 1995 A
5475425 Przyborski et al. Dec 1995 A
5544258 Levien Aug 1996 A
5687011 Mowry Nov 1997 A
5778404 Capps et al. Jul 1998 A
5831594 Tognazzini et al. Nov 1998 A
5867709 Klencke Feb 1999 A
5903566 Flammer, III May 1999 A
5964879 Dunstan Oct 1999 A
6028960 Graf et al. Feb 2000 A
6151643 Cheng et al. Nov 2000 A
6167377 Gillick et al. Dec 2000 A
6185528 Fissore Feb 2001 B1
6232972 Arcuri et al. May 2001 B1
6263308 Heckerman et al. Jul 2001 B1
6282709 Reha et al. Aug 2001 B1
6283858 Hayes et al. Sep 2001 B1
6297825 Madden et al. Oct 2001 B1
6339437 Nielsen Jan 2002 B1
6349406 Levine et al. Feb 2002 B1
6389181 Shaffer et al. May 2002 B2
6452597 Goldberg et al. Sep 2002 B1
6603491 Lemelson et al. Aug 2003 B2
6757027 Edwards et al. Jun 2004 B1
6847386 Paleiov Jan 2005 B2
6854073 Bates et al. Feb 2005 B2
6879709 Tian et al. Apr 2005 B2
6934370 Leban et al. Aug 2005 B1
6970947 Ebling et al. Nov 2005 B2
7082211 Simon et al. Jul 2006 B2
7146296 Carlbom et al. Dec 2006 B1
7171432 Wildhahen Jan 2007 B2
7194114 Schneiderman Mar 2007 B2
7200561 Moriya et al. Apr 2007 B2
7251812 Jhanwar et al. Jul 2007 B1
7254257 Kim et al. Aug 2007 B2
7337112 Moriya et al. Feb 2008 B2
7370043 Shelton et al. May 2008 B1
7380003 Guo et al. May 2008 B1
7387539 Trenne Jun 2008 B2
7400439 Holman Jul 2008 B2
7443791 Barrett et al. Oct 2008 B2
7443807 Cutler Oct 2008 B2
7458825 Atsmon et al. Dec 2008 B2
7466986 Halcrow et al. Dec 2008 B2
7496910 Voskuil Feb 2009 B2
7525928 Cutler Apr 2009 B2
7551754 Steinberg et al. Jun 2009 B2
7577295 Constantin et al. Aug 2009 B2
7577297 Mori et al. Aug 2009 B2
7580952 Logan et al. Aug 2009 B2
7584285 Hudson et al. Sep 2009 B2
7606375 Bailey et al. Oct 2009 B2
7614046 Daniels et al. Nov 2009 B2
7639877 Shiota et al. Dec 2009 B2
7680327 Weiss Mar 2010 B2
7697557 Segel Apr 2010 B2
7703036 Satterfield Apr 2010 B2
7715598 Li et al. May 2010 B2
7716643 Goldin May 2010 B2
7729902 Gupta Jun 2010 B1
7738870 Howard Jun 2010 B2
7751599 Chen et al. Jul 2010 B2
7756538 Bonta et al. Jul 2010 B2
7765194 Sharma et al. Jul 2010 B1
7766498 Sampsell Aug 2010 B2
7779367 Oshiro et al. Aug 2010 B2
7783629 Li et al. Aug 2010 B2
7783777 Pabla et al. Aug 2010 B1
7835910 Hakkani-Tur et al. Nov 2010 B1
7864967 Takeuchi et al. Jan 2011 B2
7865952 Hopwood et al. Jan 2011 B1
7881479 Asada Feb 2011 B2
7900011 Amundsen et al. Mar 2011 B2
7959308 Freeman et al. Jun 2011 B2
7970350 Sheynman et al. Jun 2011 B2
7970901 Lipscomb et al. Jun 2011 B2
7978925 Souchard Jul 2011 B1
8015006 Kennewick et al. Sep 2011 B2
8026830 Womble et al. Sep 2011 B2
8074213 Holtz Dec 2011 B1
8078623 Chou et al. Dec 2011 B2
8091074 Lyon-Smith Jan 2012 B2
8107243 Guccione et al. Jan 2012 B2
8150098 Gallagher et al. Apr 2012 B2
8154384 Hirai Apr 2012 B2
8155400 Bronstein et al. Apr 2012 B2
8165352 Mohanty Apr 2012 B1
8170298 Li et al. May 2012 B2
8189807 Cutler May 2012 B2
8194177 Jung et al. Jun 2012 B2
8212294 Hoke Jul 2012 B2
8212894 Nozaki et al. Jul 2012 B2
8213333 Greel et al. Jul 2012 B2
8213690 Okada et al. Jul 2012 B2
8224036 Maruyama et al. Jul 2012 B2
8229729 Sarikaya et al. Jul 2012 B2
8232962 Buck Jul 2012 B2
8239446 Navar et al. Aug 2012 B2
8245043 Cutler Aug 2012 B2
8275615 Kozat Sep 2012 B2
8296107 Turner et al. Oct 2012 B2
8296673 Lipstein et al. Oct 2012 B2
8302006 Stanek et al. Oct 2012 B2
8306280 Nozaki et al. Nov 2012 B2
8321220 Chotimongkol et al. Nov 2012 B1
8326634 Di Cristo et al. Dec 2012 B2
8331632 Mohanty et al. Dec 2012 B1
8345934 Obrador et al. Jan 2013 B2
8346563 Hjelm et al. Jan 2013 B1
8358811 Adams et al. Jan 2013 B2
8364717 Delling et al. Jan 2013 B2
8368540 Perkins et al. Feb 2013 B2
8373829 Hara et al. Feb 2013 B2
8374122 Meier et al. Feb 2013 B2
8384694 Powell et al. Feb 2013 B2
8384791 Porter et al. Feb 2013 B2
8392594 Georgis et al. Mar 2013 B2
8397163 Sran Mar 2013 B1
8400332 Szwabowski et al. Mar 2013 B2
8406206 Chiang Mar 2013 B2
8410903 Hirai Apr 2013 B2
8412521 Mathias et al. Apr 2013 B2
8413198 Connor et al. Apr 2013 B2
8448847 Lee May 2013 B2
8468548 Kulkarni et al. Jun 2013 B2
8484314 Luna et al. Jul 2013 B2
8504823 Carpenter Aug 2013 B2
8516471 Bhakta et al. Aug 2013 B2
8522209 Wintergerst et al. Aug 2013 B2
8526683 Maruyama et al. Sep 2013 B2
8527602 Rasmussen et al. Sep 2013 B1
8532347 Bourdev Sep 2013 B2
8535075 Golko et al. Sep 2013 B1
8538091 Kaneda et al. Sep 2013 B2
8539477 Balascio et al. Sep 2013 B2
8555364 Filippi et al. Oct 2013 B2
8559722 Tsuji Oct 2013 B2
8571866 Melamed et al. Oct 2013 B2
8611678 Hanson et al. Dec 2013 B2
8614734 Cutler Dec 2013 B2
8619062 Powell et al. Dec 2013 B2
8620351 Karaoguz Dec 2013 B2
8620649 Gao Dec 2013 B2
8626932 Lydon et al. Jan 2014 B2
8631350 Lepage et al. Jan 2014 B2
8670850 Soulodre Mar 2014 B2
8686600 Terlizzi et al. Apr 2014 B2
8701102 Appiah et al. Apr 2014 B2
8705806 Nakano Apr 2014 B2
8719603 Belesiu May 2014 B2
8761512 Buddemeier Jun 2014 B1
8776166 Erickson et al. Jul 2014 B1
8924315 Archambeau Dec 2014 B2
8935673 Ashkenazi et al. Jan 2015 B1
9239773 Teplitsky et al. Jan 2016 B1
9324323 Bikel et al. Apr 2016 B1
9460493 Suri et al. Oct 2016 B2
9510125 Raghuvanshi et al. Nov 2016 B2
20010000356 Woods Apr 2001 A1
20020041590 Donovan Apr 2002 A1
20020083041 Achlioptas Jun 2002 A1
20020101918 Rodman et al. Aug 2002 A1
20020116171 Russell Aug 2002 A1
20020143855 Traversat et al. Oct 2002 A1
20030064142 Wagner et al. Apr 2003 A1
20030068100 Covell et al. Apr 2003 A1
20030125948 Lyudovyk Jul 2003 A1
20030182414 O'Neill Sep 2003 A1
20030212543 Epstein Nov 2003 A1
20030212544 Acero Nov 2003 A1
20040040021 Bharati et al. Feb 2004 A1
20040088726 Ma et al. May 2004 A1
20040168165 Kokkinen Aug 2004 A1
20040210752 Rao Oct 2004 A1
20040240711 Hamza et al. Dec 2004 A1
20050039169 Hsu et al. Feb 2005 A1
20050052427 Wu et al. Mar 2005 A1
20050058297 Jot et al. Mar 2005 A1
20050065789 Yacoub Mar 2005 A1
20050091057 Phillips et al. Apr 2005 A1
20050114625 Snyder May 2005 A1
20050144013 Fujimoto et al. Jun 2005 A1
20050144616 Hammond et al. Jun 2005 A1
20050163372 Kida et al. Jul 2005 A1
20050165598 Cote et al. Jul 2005 A1
20050165839 Madan et al. Jul 2005 A1
20050177515 Kalavade et al. Aug 2005 A1
20050177624 Oswald et al. Aug 2005 A1
20050245243 Zuniga Nov 2005 A1
20060009996 Lipscomb et al. Jan 2006 A1
20060034542 Aoyama Feb 2006 A1
20060036965 Harris et al. Feb 2006 A1
20060046709 Krumm et al. Mar 2006 A1
20060058009 Vogedes et al. Mar 2006 A1
20060088209 Yu et al. Apr 2006 A1
20060156222 Chi et al. Jul 2006 A1
20060200477 Barrenechea Sep 2006 A1
20060212867 Fields et al. Sep 2006 A1
20060218302 Chia Sep 2006 A1
20060244845 Craig et al. Nov 2006 A1
20060250834 Chinn et al. Nov 2006 A1
20060253491 Gokturk Nov 2006 A1
20060277478 Seraji et al. Dec 2006 A1
20060280341 Koshizen Dec 2006 A1
20060287856 He et al. Dec 2006 A1
20060290705 White Dec 2006 A1
20070002478 Mowry Jan 2007 A1
20070038436 Cristo et al. Feb 2007 A1
20070053607 Mitsunaga Mar 2007 A1
20070055752 Wiegand et al. Mar 2007 A1
20070055936 Dhanjal et al. Mar 2007 A1
20070058878 Gomilla et al. Mar 2007 A1
20070074168 Bates et al. Mar 2007 A1
20070128979 Shackelford Jun 2007 A1
20070147318 Ross et al. Jun 2007 A1
20070150428 Webb Jun 2007 A1
20070156392 Balchandran et al. Jul 2007 A1
20070157313 Denton Jul 2007 A1
20070172099 Park Jul 2007 A1
20070188477 Rehm Aug 2007 A1
20070198950 Dodge et al. Aug 2007 A1
20070203863 Gupta Aug 2007 A1
20070226649 Agmon Sep 2007 A1
20070233879 Woods Oct 2007 A1
20070234048 Ziv Oct 2007 A1
20070271086 Peters et al. Nov 2007 A1
20070294061 Carlbom et al. Dec 2007 A1
20080004877 Tian Jan 2008 A1
20080005114 Li Jan 2008 A1
20080014563 Visani Jan 2008 A1
20080037438 Twiss et al. Feb 2008 A1
20080037442 Bill Feb 2008 A1
20080046425 Perski Feb 2008 A1
20080055278 Locker et al. Mar 2008 A1
20080069364 Itou et al. Mar 2008 A1
20080089299 Lindsley et al. Apr 2008 A1
20080089561 Zhang Apr 2008 A1
20080137875 Zong et al. Jun 2008 A1
20080140981 Kim Jun 2008 A1
20080143674 Molander et al. Jun 2008 A1
20080159232 Thalanany et al. Jul 2008 A1
20080165701 Ananthanarayanan et al. Jul 2008 A1
20080175190 Lee et al. Jul 2008 A1
20080183751 Cazier et al. Jul 2008 A1
20080192820 Brooks et al. Aug 2008 A1
20080195388 Bower et al. Aug 2008 A1
20080204598 Maurer et al. Aug 2008 A1
20080209354 Stanek et al. Aug 2008 A1
20080212894 Demirli et al. Sep 2008 A1
20080215183 Chen Sep 2008 A1
20080235017 Satomura Sep 2008 A1
20080263130 Michalowitz et al. Oct 2008 A1
20080273708 Sandgren et al. Nov 2008 A1
20090028380 Hillebrand et al. Jan 2009 A1
20090030697 Cerra et al. Jan 2009 A1
20090046864 Mahabub et al. Feb 2009 A1
20090055389 Schilit et al. Feb 2009 A1
20090055461 Georgis et al. Feb 2009 A1
20090083148 Hwang et al. Mar 2009 A1
20090087099 Nakamura Apr 2009 A1
20090089801 Jones et al. Apr 2009 A1
20090100384 Louch Apr 2009 A1
20090100459 Riedl et al. Apr 2009 A1
20090100489 Strothmann Apr 2009 A1
20090116749 Cristinacce et al. May 2009 A1
20090180671 Lee Jul 2009 A1
20090185723 Kurtz Jul 2009 A1
20090187593 Chen et al. Jul 2009 A1
20090210328 Fomenko et al. Aug 2009 A1
20090228820 Kim et al. Sep 2009 A1
20090259667 Wang et al. Oct 2009 A1
20090271735 Anderson et al. Oct 2009 A1
20090292687 Fan Nov 2009 A1
20090300596 Tyhurst et al. Dec 2009 A1
20090313546 Katpelly et al. Dec 2009 A1
20100004930 Strope Jan 2010 A1
20100011123 Dantzig et al. Jan 2010 A1
20100023625 Lee Jan 2010 A1
20100027663 Dai et al. Feb 2010 A1
20100054544 Arguelles Mar 2010 A1
20100082478 Van Der Veen et al. Apr 2010 A1
20100103117 Townsend et al. Apr 2010 A1
20100111059 Bappu et al. May 2010 A1
20100114890 Hagar May 2010 A1
20100128863 Krum et al. May 2010 A1
20100135038 Handschy et al. Jun 2010 A1
20100189313 Prokoski Jul 2010 A1
20100191837 Linden Jul 2010 A1
20100205177 Sato Aug 2010 A1
20100211695 Steinmetz et al. Aug 2010 A1
20100211908 Luk et al. Aug 2010 A1
20100229222 Li et al. Sep 2010 A1
20100251206 Horiuchi et al. Sep 2010 A1
20100251230 O'Farrell et al. Sep 2010 A1
20100279653 Poltorak Nov 2010 A1
20100295774 Hennessey Nov 2010 A1
20100312546 Chang et al. Dec 2010 A1
20110007174 Bacivarov et al. Jan 2011 A1
20110009075 Jantunen et al. Jan 2011 A1
20110010171 Talwar et al. Jan 2011 A1
20110010319 Harada Jan 2011 A1
20110010424 Fox et al. Jan 2011 A1
20110016333 Scott et al. Jan 2011 A1
20110043490 Powell et al. Feb 2011 A1
20110052081 Onoe et al. Mar 2011 A1
20110064331 Andres Del Valle Mar 2011 A1
20110071841 Fomenko et al. Mar 2011 A1
20110081023 Raghuvanshi et al. Apr 2011 A1
20110091113 Ito Apr 2011 A1
20110093459 Dong et al. Apr 2011 A1
20110099538 Naidu Pujala et al. Apr 2011 A1
20110129159 Cifarelli Jun 2011 A1
20110135166 Wechsler Jun 2011 A1
20110138064 Rieger et al. Jun 2011 A1
20110144999 Jang et al. Jun 2011 A1
20110153324 Ballinger et al. Jun 2011 A1
20110158536 Nakano Jun 2011 A1
20110173556 Czerwinski et al. Jul 2011 A1
20110176058 Biswas et al. Jul 2011 A1
20110177481 Haff et al. Jul 2011 A1
20110179182 Vadia et al. Jul 2011 A1
20110283266 Gallagher et al. Nov 2011 A1
20110289482 Bentlye Nov 2011 A1
20110321029 Kern et al. Dec 2011 A1
20120014560 Obrador et al. Jan 2012 A1
20120027311 Cok Feb 2012 A1
20120029661 Jones et al. Feb 2012 A1
20120030325 Silverman et al. Feb 2012 A1
20120030682 Shaffer et al. Feb 2012 A1
20120054624 Owens et al. Mar 2012 A1
20120065976 Deng Mar 2012 A1
20120066642 Shi Mar 2012 A1
20120071174 Bao et al. Mar 2012 A1
20120072528 Rimac et al. Mar 2012 A1
20120076427 Hibino et al. Mar 2012 A1
20120079372 Kandekar et al. Mar 2012 A1
20120084086 Gilbert Apr 2012 A1
20120096121 Hao et al. Apr 2012 A1
20120106859 Cheatle May 2012 A1
20120120678 Su May 2012 A1
20120134139 Jang et al. May 2012 A1
20120144288 Caruso et al. Jun 2012 A1
20120169791 Whitehead et al. Jul 2012 A1
20120188382 Morrison et al. Jul 2012 A1
20120224388 Lin Sep 2012 A1
20120225652 Martinez et al. Sep 2012 A1
20120232885 Barbosa et al. Sep 2012 A1
20120235887 Border et al. Sep 2012 A1
20120242598 Won et al. Sep 2012 A1
20120245944 Gruber Sep 2012 A1
20120246458 Jain et al. Sep 2012 A1
20120253799 Bangalore Oct 2012 A1
20120253802 Heck et al. Oct 2012 A1
20120254086 Deng Oct 2012 A1
20120254161 Zhang et al. Oct 2012 A1
20120254227 Heck et al. Oct 2012 A1
20120256967 Baldwin et al. Oct 2012 A1
20120265531 Bennett Oct 2012 A1
20120266140 Bates Oct 2012 A1
20120269355 Chandak et al. Oct 2012 A1
20120271617 Nakajima et al. Oct 2012 A1
20120278430 Lehane et al. Nov 2012 A1
20120290293 Hakkani-Tur et al. Nov 2012 A1
20120293543 Jardine-Skinner Nov 2012 A1
20120303565 Deng et al. Nov 2012 A1
20120308124 Belhumeur et al. Dec 2012 A1
20120310523 Delling et al. Dec 2012 A1
20120313865 Pearce Dec 2012 A1
20120317197 De Foy et al. Dec 2012 A1
20120324069 Nori et al. Dec 2012 A1
20120327040 Simon et al. Dec 2012 A1
20120327042 Harley et al. Dec 2012 A1
20120331102 Ertugrul Dec 2012 A1
20120331111 Wu et al. Dec 2012 A1
20130013936 Lin et al. Jan 2013 A1
20130014050 Queru Jan 2013 A1
20130016055 Chuang Jan 2013 A1
20130019175 Kotler et al. Jan 2013 A1
20130021373 Vaught et al. Jan 2013 A1
20130031476 Coin et al. Jan 2013 A1
20130065576 Basir Mar 2013 A1
20130073725 Bordeleau et al. Mar 2013 A1
20130078869 Golko et al. Mar 2013 A1
20130085756 Chotimongkol et al. Apr 2013 A1
20130086461 Ashley-Rollman et al. Apr 2013 A1
20130086507 Poston et al. Apr 2013 A1
20130091205 Kotler et al. Apr 2013 A1
20130091440 Kotler et al. Apr 2013 A1
20130091453 Kotler Apr 2013 A1
20130091465 Kikin-Gil et al. Apr 2013 A1
20130091534 Gilde et al. Apr 2013 A1
20130094445 De Foy et al. Apr 2013 A1
20130097481 Kotler et al. Apr 2013 A1
20130097490 Kotler et al. Apr 2013 A1
20130106725 Bakken et al. May 2013 A1
20130106740 Yilmaz et al. May 2013 A1
20130106977 Chu et al. May 2013 A1
20130108065 Mullins et al. May 2013 A1
20130117658 Fidler et al. May 2013 A1
20130127982 Zhang et al. May 2013 A1
20130128364 Wheeler et al. May 2013 A1
20130138436 Yu May 2013 A1
20130148864 Dolson et al. Jun 2013 A1
20130151441 Archambeau Jun 2013 A1
20130151681 Dournov et al. Jun 2013 A1
20130151975 Shadi et al. Jun 2013 A1
20130152092 Yadgar Jun 2013 A1
20130156275 Amacker et al. Jun 2013 A1
20130166742 Wiener et al. Jun 2013 A1
20130173604 Li et al. Jul 2013 A1
20130174047 Sivakumar et al. Jul 2013 A1
20130185065 Tzirkel-Hancock et al. Jul 2013 A1
20130188032 Vertegaal Jul 2013 A1
20130191781 Radakovitz et al. Jul 2013 A1
20130212484 Joshi et al. Aug 2013 A1
20130217414 Nagaraj Aug 2013 A1
20130226587 Cheung Aug 2013 A1
20130227398 Bolstad Aug 2013 A1
20130227415 Gregg et al. Aug 2013 A1
20130231130 Cherian et al. Sep 2013 A1
20130231862 Delling et al. Sep 2013 A1
20130234913 Thangadorai et al. Sep 2013 A1
20130238729 Holzman et al. Sep 2013 A1
20130242964 Hassan et al. Sep 2013 A1
20130243328 Irie Sep 2013 A1
20130252636 Chang et al. Sep 2013 A1
20130254412 Menezes et al. Sep 2013 A1
20130266196 Kono Oct 2013 A1
20130275779 He Oct 2013 A1
20130293530 Perez et al. Nov 2013 A1
20130297700 Hayton et al. Nov 2013 A1
20130298185 Koneru et al. Nov 2013 A1
20130315235 Foo Nov 2013 A1
20130318249 McDonough et al. Nov 2013 A1
20130321390 Latta et al. Dec 2013 A1
20130325148 Mustafa et al. Dec 2013 A1
20130335301 Wong et al. Dec 2013 A1
20130339478 Edge Dec 2013 A1
20130342637 Felkai et al. Dec 2013 A1
20140004741 Jol et al. Jan 2014 A1
20140006420 Sparrow et al. Jan 2014 A1
20140007215 Romano et al. Jan 2014 A1
20140019626 Hubler et al. Jan 2014 A1
20140019896 Satterfield Jan 2014 A1
20140025380 Koch et al. Jan 2014 A1
20140029859 Libin Jan 2014 A1
20140046914 Das et al. Feb 2014 A1
20140050419 Lerios et al. Feb 2014 A1
20140072242 Wei et al. Mar 2014 A1
20140098682 Cao Apr 2014 A1
20140107921 Delling et al. Apr 2014 A1
20140108979 Davidson et al. Apr 2014 A1
20140157169 Kikin-gil Jun 2014 A1
20140173602 Kikin-gil et al. Jun 2014 A1
20140181708 Kikin-gil et al. Jun 2014 A1
20140210797 Kreek et al. Jul 2014 A1
20140214410 Jang Jul 2014 A1
20140223334 Jensen Aug 2014 A1
20140253522 Cueto Sep 2014 A1
20140257803 Yu et al. Sep 2014 A1
20140282415 Ovadia et al. Sep 2014 A1
20140317602 Zuo Oct 2014 A1
20140341443 Cao Nov 2014 A1
20140358537 Gilbert Dec 2014 A1
20140359593 Cohen et al. Dec 2014 A1
20140359709 Nassar Dec 2014 A1
20140372112 Xue et al. Dec 2014 A1
20140379326 Sarikaya et al. Dec 2014 A1
20140379353 Boies et al. Dec 2014 A1
20150082291 Thomas et al. Mar 2015 A1
20150082292 Thomas et al. Mar 2015 A1
20150082293 Thomas et al. Mar 2015 A1
20150082296 Thomas et al. Mar 2015 A1
20150100312 Bocchieri Apr 2015 A1
20150161993 Sainath Jun 2015 A1
20150161994 Tang Jun 2015 A1
20150170020 Garimella Jun 2015 A1
20150255061 Xue et al. Sep 2015 A1
20150255069 Adams et al. Sep 2015 A1
20150277682 Kaufthal Oct 2015 A1
20150277708 Rodrig et al. Oct 2015 A1
20150278191 Levit et al. Oct 2015 A1
20150310040 Chan et al. Oct 2015 A1
20150310261 Lee et al. Oct 2015 A1
20150310858 Li et al. Oct 2015 A1
20150317147 Nachimuthu et al. Nov 2015 A1
20150317313 Lv et al. Nov 2015 A1
20150317510 Lee Nov 2015 A1
20150325236 Levit Nov 2015 A1
20150331240 Poulos Nov 2015 A1
20150347120 Garg et al. Dec 2015 A1
20150347274 Taylor Dec 2015 A1
20150347734 Beigi Dec 2015 A1
20150350333 Cutler et al. Dec 2015 A1
20150356759 Delling et al. Dec 2015 A1
20150363919 Suri et al. Dec 2015 A1
20150371409 Negrila et al. Dec 2015 A1
20150373475 Raghuvanshi et al. Dec 2015 A1
20150373546 Haugen et al. Dec 2015 A1
20150378515 Powell Dec 2015 A1
20160203125 Sarikaya et al. Jul 2016 A1
20160210035 Rodrig et al. Jul 2016 A1
20160239987 Negrila et al. Aug 2016 A1
20160379343 Suri et al. Dec 2016 A1
Foreign Referenced Citations (34)
Number Date Country
101753404 Jun 2010 CN
0704655 Apr 1996 EP
0553101 Jul 1997 EP
0816981 Jul 1998 EP
1055872 Nov 2000 EP
1174787 Jan 2002 EP
1331566 Jul 2003 EP
1628197 Feb 2006 EP
1965389 Sep 2008 EP
1970803 Sep 2008 EP
2096577 Sep 2009 EP
2267655 Dec 2010 EP
2312462 Apr 2011 EP
2482572 Aug 2012 EP
2575128 Apr 2013 EP
2431001 Apr 2007 GB
2002091477 Mar 2002 JP
20040076079 Aug 2004 KR
20130022513 Mar 2013 KR
WO-9304468 Mar 1993 WO
WO-0250590 Jun 2002 WO
WO-2005013262 Feb 2005 WO
WO-2005033934 Apr 2005 WO
WO-2008124181 Oct 2008 WO
WO-2009015047 Jan 2009 WO
WO-2009082814 Jul 2009 WO
WO-2009089308 Jul 2009 WO
WO-2010141403 Dec 2010 WO
WO-2011014138 Feb 2011 WO
WO-2012152817 Nov 2012 WO
WO-2013048510 Apr 2013 WO
WO-2013154561 Oct 2013 WO
WO-2013171481 Nov 2013 WO
WO-2013184225 Dec 2013 WO
Non-Patent Literature Citations (420)
Entry
“GPU-Accelerated Route Planning”, https://www.cs.unc.edu/cms/research/summaries/GPUAcceleratedRoutePlanning.pdf, Aug. 2005, 2 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/920,323, Feb. 27, 2015, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/456,679, Jun. 19, 2015, 22 pages.
Abraham,“Hierarchical Hub Labelings for Shortest Paths”, In Technical Report MSR-TR-MSR-TR-2012-46, Apr. 2012, 15 pages.
Bast,“Fast Routing in Road Networks with Transit Nodes”, In Proceedings of Science, vol. 316, No. 5824, Apr. 27, 2007, p. 566.
Bast,“Route Planning in Transportation Networks”, In Technical Report MSR-TR-2014-4, Jan. 8, 2014, 57 pages.
Bleiweiss,“GPU Accelerated Pathfinding”, In Proceedings of the 23rd ACM Siggraph/Eurographics symposium on Graphics hardware, Jun. 20, 2008, pp. 65-74.
Cormen,“Introduction to Algorithms”, The MIT Press, Jul. 31, 2009, 43 pages.
Delling,“Customizable Route Planning in Road Networks”, In Proceedings of the Sixth Annual Symposium on Combinatorial Search, Jul. 2011, pp. 1-31.
Delling,“Customizable Route Planning”, In Proceedings of the 10th International Symposium on Experimental Algorithms, May 2011, pp. 1-12.
Delling,“Faster Customization of Road Networks”, In Proceedings of the 12th International Symposium on Experimental Algorithms, Jun. 5, 2013, pp. 1-12.
Delling,“Graph Partitioning with Natural Cuts”, In Proceedings of the IEEE International Parallel & Distributed Processing Symposium, May 16, 2011, 15 pages.
Delling,“PHAST: Hardware-Accelerated Shortest Path Trees”, In Journal of Parallel and Distributed Computing, vol. 73, No. 7, Jul. 2013, 11 pages.
Delling,“Query Scenarios for Customizable Route Planning”, U.S. Appl. No. 13/649,114, Oct. 11, 2012, 27 pages.
Dong,“Image Retargeting by Content-Aware Synthesis”, IEEE Transactions on Visualization and Computer Graphics, vol. XX, No. XX, June 2014, Mar. 26, 2014, 14 pages.
Efentakis,“Optimizing Landmark-Based Routing and Preprocessing”, In Proceedings of the Sixth ACM SIGSPATIAL International Workshop on Computational Transportation Science, Nov. 5, 2013, 6 pages.
Geisberger,“Efficient Routing in Road Networks with Turn Costs”, In Proceedings of the 10th International Conference on Experimental Algorithms, May 5, 2011, 12 pages.
Gooch,“Color2Gray: Salience-Preserving Color Removal”, In Journal of ACM Transactions on Graphics, vol. 24 Issue 3, Jul. 2006.
Holzer,“Engineering Multilevel Overlay Graphs for Shortest-Path Queries”, In ACM Journal of Experimental Algorithmics, vol. 13, Sep. 2008, 26 pages.
Kohler,“Fast Point-to-Point Shortest Path Computations with Arc-Flags”, In Proceedings of Shortest Path Computations: Ninth DIMACS Challenge, vol. 24 of DIMACS Book. American Mathematical Society, Nov. 13, 2006, pp. 1-27.
Lilly,“Robust Speech Recognition Using Singular Value Decomposition Based Speech Enhancement”, IEEE Tencon, 1997, 4 pages.
Lu,“Context Aware Textures”, In Journal of ACM Transactions on Graphics, vol. 26 Issue 1, Jan. 2007, 31 pages.
Madduri,“Parallel Shortest Path Algorithms for Solving Large-Scale Instances”, In Proceedings of 9th DIMACS Implementation Challenge—The Shortest Path Problem, Aug. 30, 2006, 39 pages.
Meyer,“D-Stepping: A Parallelizable Shortest Path Algorithm”, In Journal of Algorithms, vol. 49, Issue 1, Oct. 2003, pp. 114-152.
Ortega-Arranz,“A New GPU-based Approach to the Shortest Path Problem”, In Proceedings of International Conference on High Performance Computing and Simulation, Jul. 1, 2013, 7 pages.
Perumalla,“GPU-based Real-Time Execution of Vehicular Mobility Models in Large-Scale Road Network Scenarios”, In ACM/IEEE/SCS 23rd Workshop on Principles of Advanced and Distributed Simulation, Jun. 22, 2009, 9 pages.
Shan,“Image Based Surface Detail Transfer”, In IEEE Computer Graphics and Applications, vol. 24 Issue 3, May 2004, 6 pages.
Shen,“Agent-based Traffic Simulation and Traffic Signal Timing Optimization with GPU”, 2011 14th International IEEE Conference on Intelligent Transportation Systems, Oct. 5, 2011, pp. 145-150.
Sommer,“Shortest-Path Queries in Static Networks”, In Proceedings of ACM Computing Surveys, Apr. 7, 2014, 35 pages.
Wodecki,“Multi-GPU Parallel Memetic Algorithm for Capacitated Vehicle Routing Problem”, In Proceedings of Distributed, Parallel, and Cluster Computing, Jan. 21, 2014, pp. 207-214.
“Creating Interactive Virtual Auditory Environments”, IEEE Computer Graphics and Applications, Aug. 2002, 10 pages.
“Final Office Action”, U.S. Appl. No. 13/920,323, Sep. 24, 2015, 24 pages.
“Integrated Vapor Chamber for Thermal Management of Computing Devices”, U.S. Appl. No. 14/294,040, filed Jun. 2, 2014, 27 pages.
“International Search Report and the Written Opinion”, Application No. PCT/US2015/036595, Sep. 24, 2015, 10 pages.
“International Search Report and the Written Opinion”, Application No. PCT/US2015/017872, Jun. 25, 2015, 11 pages.
“International Search Report and the Written Opinion”, Application No. PCT/US2015/033545, Aug. 20, 2015, 11 pages.
“International Search Report and the Written Opinion”, Application No. PCT/US2015/022887, Jun. 26, 2015, 12 pages.
“International Search Report and the Written Opinion”, Application No. PCT/US2015/029334, Jul. 7, 2015, 12 pages.
“International Search Report and the Written Opinion”, Application No. PCT/US2015/033872, Sep. 2, 2015, 12 pages.
“International Search Report and the Written Opinion”, Application No. PCT/US2015/035219, Sep. 29, 2015, 12 pages.
“International Search Report and the Written Opinion”, Application No. PCT/US2015/027689, Jul. 8, 2015, 13 pages.
“International Search Report and the Written Opinion”, Application No. PCT/US2015/026971, Jul. 24, 2015, 15 pages.
“International Search Report and the Written Opinion”, Application No. PCT/US2014/041023, Mar. 6, 2015, 17 pages.
“International Search Report and the Written Opinion”, Application No. PCT/US2015/036767, Sep. 14, 2015, 19 pages.
“International Search Report and the Written Opinion”, Application No. PCT/US2015/027688, Sep. 7, 2015, 9 pages.
“International Search Report and the Written Opinion”, Application No. PCT/US2014/041014, Oct. 2, 2014, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/923,917, May 28, 2015, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/923,969, May 6, 2015, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/201,704, Jul. 1, 2015, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/266,795, Oct. 7, 2015, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/273,100, Oct. 1, 2015, 20 pages.
“Notice of Allowance”, U.S. Appl. No. 14/312,562, Sep. 18, 2015, 13 pages.
“Restriction Requirement”, U.S. Appl. No. 14/279,146, Sep. 3, 2015, 6 pages.
Ajwani,“Breadth First Search on Massive Graphs”, In Proceedings of the 9th DIMACS Implementation Challenge: The Shortest Path Problem, Nov. 2006, 15 pages.
Barrett,“Implementations of Routing Algorithms for Transportation Networks”, In Proceedings of the 9th DIMACS Implementation Challenge: The Shortest Path Problem, Nov. 2006, 19 pages.
Belhumeur,“Eigenfaces vs. Fisherfaces: Recognition Using Class Specific Linear Projection”, IEEE Transactions on Pattern Analysis and Machine Intelligence, Jul. 1997, pp. 711-720.
Bohus,“Olympus: An Open-Source Framework for Conversational Spoken Language Interface Research”, In Proceedings of the Workshop on Bridging the Gap: Academic and Industrial Research in Dialog Technologies, Apr. 2007, 8 pages.
Cao,“Face Recognition with Learning-based Descriptor”, In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Jun. 2010, 8 pages.
Chandrasekaran,“Sparse and Low-Rank Matrix Decompositions”; IFAC Symposium on System Identification, 2009, 6 pages.
Chen,“Bayesian Face Revisited: A Joint Formulation”, In Proceedings of the 12th European Conference on Computer Vision (ECCV), Oct. 2012, 14 pages.
Chen,“Supplemental Material for Bayesian Face Revisited: A Joint Formulation”, Apr. 2013, 5 pages.
Cootes,“Modeling Facial Shape and Appearance”, Handbook of Face Recognition, Springer, New York, US, 2005, pp. 39-63.
Davis,“Information-Theoretic Metric Learning”, In Proceedings of the 24th International Conference on Machine Learning (ICML), Jun. 2007, 8 pages.
Delano,“Integrated Development Environments for Natural Language Processing”, Available at: http://www.textanalysis.com/TAI-IDE-WP.pdf, Oct. 2001, 13 pages.
Delling,“Customizable Route Planning”, U.S. Appl. No. 13/152,313, filed Jun. 3, 2011, 23 pages.
Delling,“Customizable Route Planning”, U.S. Appl. No. 13/868,135, filed Apr. 23, 2013, 33 pages.
Delling,“Customizing Driving Directions With GPUs”, In Proceedings of the 20th Euro-Par International Conference on Parallel Processing, Aug. 2014, 12 pages.
Delling,“High-Performance Multi-Level Graphs”, In Proceedings of the 9th DIMACS Implementation Challenge: The Shortest Path Problem, Nov. 2006, 13 pages.
Delling,“Highway Hierarchies Star”, In Proceedings of the 9th DIMACS Implementation Challenge: The Shortest Path Problem, Nov. 2006, 29 pages.
Demetrescu,“The Shortest Path Problem: Ninth DIMACS Implementation Challenge”, In Proceedings of DIMACS Series in Discrete Mathematics and Theoretical Computer Science, Jul. 28, 2009, 3 pages.
Diez,“Optimization of a Face Verification System Using Bayesian Screening Techniques”, In Proceedings of the 23rd IASTED International Multi-Conference on Artificial Intelligence and Applications, Feb. 2005, pp. 427-432.
Ding,“Handbook of Face Recognition, Chapter 12: Facial Landmark Localization”, Jan. 1, 2011, 19 pages.
Dos“LUP: A Language Understanding Platform”, A Dissertation for the Degree of Master of Information Systems and Computer Engineering, Jul. 2012, 128 pages.
Eagle,“Common Sense Conversations: Understanding Casual Conversation using a Common Sense Database”, In Proceedings of the Artificial Intelligence, Information Access, and Mobile Computing Workshop, Aug. 2003, 6 pages.
Edmonds,“Single-Source Shortest Paths With the Parallel Boost Graph Library”, In Proceedings of the 9th DIMACS Implementation Challenge: The Shortest Path Problem, Nov. 4, 2006, 20 pages.
Geisberger,“Exact Routing in Large Road Networks using Contraction Hierarchies”, In Proceedings of Transportation Science, vol. 46, No. 3, Aug. 2012, 17 pages.
Goldberg,“Better Landmarks within Reach”, In Proceedings of the 6th International Conference on Experimental Algorithms, Jun. 6, 2007, 14 Pages.
Guillaumin,“Is that you? Metric Learning Approaches for Face Identification”, In Proceedings of 12th IEEE International Conference on Computer Vision (ICCV), Sep. 2009, 8 pages.
He,“What is Discriminative Learning”, Discriminative Learning for Speech Recognition Theory and Practice, Achorn International, Jun. 25, 2008, 25 pages.
Hoffmeister,“Log-linear Model Combination with Word-dependent Scaling Factors”, Human Language Technology and Pattern Recognition Computer Science Department, 2009, 4 pages.
Huang,“Labeled Faces in the Wild: A Database for Studying Face Recognition in Unconstrained Environments”, In Proceedings of the 10th European Conference on Computer Vision (ECCV), Oct. 2008, 11 pages.
Huang,“Unified Stochastic Engine (USE) for Speech Recognition”, School of Computer Science, 1993, 4 pages.
Ioffe,“Probabilistic Linear Discriminant Analysis”, International Journal of Computer Vision, Jun. 2001, 12 pages.
Karpinski,“Multi-GPU Parallel Memetic Algorithm for Capacitated Vehicle Routing Problem”, Lecture Notes in Computer Science, May 8, 2014, 12 pages.
Keshtkar,“A Corpus-based Method for Extracting Paraphrases of Emotion Terms”, Proceedings of the NAACL HLT 2010 Workshop on Computational Approaches to Analysis and Generation of Emotion in Text, 2010, 10 pages.
Ko,“Cammia—A Context-Aware Spoken Dialog System for Mobile Environments”, In Automatic Speech Recognition and Understanding, Jul. 29, 2011, 2 pages.
Kumar,“Attribute and Simile Classifiers for Face Verification”, In Proceedings of the 12th IEEE International Conference on Computer Vision (ICCV), Sep. 2009, 8 pages.
Kumar,“Describable Visual Attributes for Face Verification and Image Search”, IEEE Transactions on Pattern Analysis and Machine Intelligence, Oct. 2011, 17 pages.
Kumar,“Face Recognition Using Gabor Wavelets”, In Proceedings of the 40th IEEE Asilomar Conference on Signals, Systems and Computers, Oct. 2006, 5 pages.
Lanitis,“Toward Automatic Simulation of Aging Effects on Face Images”, IEEE Trans. PAMI, vol. 24, No. 4, Apr. 2002, 14 pages.
Lauther,“An Experimental Evaluation for Point-To-Point Shortest Path Calculation on Roadnetworks with Precalculated Edge-Flags”, In Proceedings of the 9th DIMACS Implementation Challenge: The Shortest Path Problem, Nov. 2006, 18 pages.
Lee,“Intention-Based Corrective Feedback Generation using Context-Aware Model”, In Proceedings of the Second International Conference on Computer Supported Education, Apr. 7, 2010, 8 pages.
Lei,“Face Recognition by Exploring Information Jointly in Space, Scale and Orientation”, IEEE Transactions on Image Processing, Jan. 2011, pp. 247-256.
Li,“Bayesian Face Recognition Using Support Vector Machine and Face Clustering”, In Proceedings of the IEEE Computer Society on Computer Vision Pattern and Recognition (CVPR), Jun. 2004, 7 pages.
Li,“Comparison of Discriminative Input and Output Transformations for Speaker Adaptation in the Hybrid NN/HMM Systems”, In Proceedings of 11th Annual Conference of the International Speech Communication Association, Sep. 26, 2010, 4 pages.
Li,“Probabilistic Models for Inference about Identity”, IEEE Transactions on Pattern Recognition and Machine Intelligence, Jan. 2012, 16 pages.
Liang,“Face Alignment via Component-Based Discriminative Search”, Computer Vision, ECCV 2008, Lecture Notes in Computer Science vol. 5303, 2008, 14 pages.
Martin,“CUDA Solutions for the SSSP Problem”, In Proceedings of 9th International Conference Baton Rouge, May 25, 2009, 10 pages.
Moghaddam,“Bayesian Face Recognition”, The Journal of Pattern Recognition, Nov. 2000, pp. 1771-1782.
Moreira,“Towards the Rapid Development of a Natural Language Understanding Module”, In Proceedings of the 10th International Conference on Intelligent Virtual Agents, Jan. 2011, 7 pages.
Nguyen,“Cosine Similarity Metric Learning for Face Verification”, In Proceedings of the 10th Asian Conference on Computer Vision (ACCV), Nov. 2010, 12 pages.
Ojala,“A Generalized Local Binary Pattern Operator for Multiresolution Gray Scale and Rotation Invariant Texture Classification”, In Proceedings of the 2nd International Conference on Advances in Pattern Recognition (ICAPR), Mar. 2001, 10 pages.
Pascoal,“Implementations and Empirical Comparison of K Shortest Loopless Path Algorithms”, In Proceedings of the 9th DIMACS Implementation Challenge: The Shortest Path Problem, Nov. 2006, 16 pages.
Phillips,“The FERET Evaluation Methodology for Face-Recognition Algorithms”, IEEE Transactions on Pattern Analysis and Machine Intelligence, Oct. 2000, pp. 1090-1104.
Powell,“Increased Accuracy Corner Cube Arrays for High Resolution Retro-Reflective Imaging Applications”, U.S. Appl. No. 62/062,732, filed Oct. 10, 2014, 46 pages.
Raghuvanshi,“Parametric Wave Field Coding for Precomputed Sound Propagation”, Jul. 2014, 11 pages.
Ramanan,“Local Distance Functions: A Taxonomy, New Algorithms, and an Evaluation”, IEEE Transactions on Pattern Analysis and Machine Intelligence, Apr. 2011, 8 pages.
Rodrig,“Command User Interface for Displaying and Scaling Selectable Controls and Commands”, U.S. Appl. No. 14/254,681, filed Apr. 16, 2014, 51 pages.
Sanders,“Robust, Almost Constant Time Shortest-Path Queries in Road Networks”, In Proceedings of the 9th DIMACS Implementation Challenge: The Shortest Path Problem, Nov. 2006, 19 pages.
Santos,“K Shortest Path Algorithms”, In Proceedings of the 9th DIMACS Implementation Challenge: The Shortest Path Problem, Nov. 2006, 13 pages.
Sarukkai,“Word Set Probability Boosting for Improved Spontaneous Dialog Recognition”, IEEE Transactions on Speech and Audio Processing, vol. 5, No. 5, Sep. 1997, 13 pages.
Seneff,“Galaxy-II: A Reference Architecture for Conversational System Development”, In Proceedings of the 5th International Conference on Spoken Language Processing, Nov. 2008, 4 pages.
Seo,“Face Verification Using the LARK Representation”, IEEE Transactions on Information Forensics and Security, Dec. 2011, 12 pages.
Sing,“Domain Metric Knowledge Model for Embodied Conversation Agents”, In 5th International Conference on Research, Innovation & Vision for the Future, Mar. 5, 2007, 7 pages.
Susskind,“Modeling the joint density of two images under a variety of transformations”, In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), Jun. 2011, 8 pages.
Taigman,“Leveraging Billions of Faces to Overcome Performance Barriers in Unconstrained Face Recognition”, Aug. 4, 2011, 7 pages.
Taigman,“Multiple One-Shots for Utilizing Class Label Information”, In Proceedings of the British Machine Vision Conference (BMVC), Sep. 2009, 12 pages.
Tian,“Facial Expression Analysis”, Handbook of Face Recognition, Springer, New York, US, 2005, pp. 247-275.
Wang,“A Unified Framework for Subspace Face Recognition”, retrieved at <<http://ieeexplore.ieee.org/Xplore/login.jsp?url=http%3A%2F%2Fieeexplore.ieee.org%2Fiel5%2F34%2F29188%2F01316855.pdf&authDecision=-203>>, Sep. 2004, pp. 1222-1228.
Wang,“Bayesian Face Recognition Using Gabor Features”, In Proceedings of the ACM SIGMM Workshop on Biometrics Methods and Applications (WBMA), Nov. 8, 2003, pp. 70-73.
Wang,“Boosted Multi-Task Learning for Face Verification with Applications to Web Image and Video Search”, In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Jun. 2009, 8 pages.
Wang,“Subspace Analysis Using Random Mixture Models”, In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), Jun. 2005, 7 pages.
Weinberger,“Distance Metric Learning for Large Margin Nearest Neighbor Classification”, In Proceedings of the Conference on Advances in Neural Information Processing Systems (NIPS), Dec. 2008, 8 pages.
Xue,“Singular Value Decomposition Based Low-Footprint Speaker Adaptation and Personalization for Deep Neural Network”, In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing, May 4, 2014, 5 pages.
Ying,“Distance Metric Learning with Eigenvalue Optimization”, Journal of Machine Learning Research, Jan. 3, 2012, 26 pages.
Zhang,“Two-Dimensional Bayesian Subspace Analysis for Face Recognition”, In Proceedings of the 4th International Symposium on Neural Networks (ISNN), Jun. 2007, 7 pages.
Zhu,“A Rank-Order Distance based Clustering Algorithm for Face Tagging”, In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Jun. 2011, pp. 481-488.
“Final Office Action”, U.S. Appl. No. 14/304,911, Nov. 13, 2015, 7 pages.
“Final Office Action”, U.S. Appl. No. 14/456,679, Nov. 2, 2015, 26 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2015/036587, Oct. 8, 2015, 11 Pages.
“International Search Report and Written Opinion”, Application No. PCT/US2015/029805, Oct. 15, 2015, 20 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/279,146, Dec. 8, 2015, 9 pages.
Cvetkovic,“Image enhancement circuit using nonlinear processing curve and constrained histogram range equalization”, Visual Communications and Image Processing 2004, 2004, 12 pages.
Grasset,“Image-Driven View Management for Augmented Reality Browsers”, IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Nov. 5, 2012, 10 pages.
Rosten,“Real-time Video Annotations for Augmented Reality”, Advances in Visual Computing, Lecture Notes in Computer Science, Jan. 1, 2005, 8 pages.
Yin,“An Associate-Predict Model for Face Recognition”, Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Jun. 2011, 8 pages.
“Cisco Bring Your Own Device”, Available at: http://www.cisco.com/c/en/us/td/docs/solutions/Enterprise/Borderless_Networks/Unified_Access/byodwp.html, Mar. 2014, 23 Pages.
“Unified Communications Interoperability Forum and Open Networking Foundation Announce Collaborative Relationship Between Unified Communications and Software-Defined Networks”, Retrieved From: <http://www.businesswire.com/news/home/20131120005275/en/Unified-Communications-Interoperability-Forum-Open-Networking-Foundation> Mar. 7, 2014, Nov. 20, 2013, 2 Pages.
“Unified Communications Managed API 3.0 Core SDK Documentation”, retrieved from: http://msdn.microsoft.com/en-us/library/gg421023.aspx on Feb. 14, 2012, Dec. 1, 2011, 2 pages.
Ferguson, “Five Key Criteria for adaptable SDN Wi-Fi”, Retrieved From: <http://www.extremenetworks.com/five-key-criteria-for-adaptable-sdn-wi-fi/> Mar. 7, 2014, Nov. 25, 2013, 7 Pages.
Van,“Unified Communication and Collaboration from the User's Perspective”, retrieved from: http://www.ucstrategies.com/unified-communications-expert-views/unified-communication-and-collaboration-from-the-users-perspective.aspx, Dec. 8, 2009, 2 pages.
“Corporate Telecommunication Networks—Mobility for Enterprise Communications”, ECMA/TC32-TG17/2010/056, XP050514180, Geneva [retrieved on Nov. 4, 2010], Oct. 2010, 38 pages.
“Final Office Action”, U.S. Appl. No. 12/970,949, Jun. 10, 2015, 25 pages.
“Final Office Action”, U.S. Appl. No. 13/327,794, Nov. 20, 2014, 13 pages.
“Final Office Action”, U.S. Appl. No. 13/530,015, Nov. 19, 2014, 48 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2015/024594, Jul. 24, 2015, 10 Pages.
“International Search Report and Written Opinion”, Application No. PCT/US2015/032089, Jul. 31, 2015, 12 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2015/027409, Jul. 22, 2015, 13 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2015/028383, Jul. 24, 2015, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/970,949, Jan. 2, 2015, 24 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/530,015, Apr. 28, 2015, 32 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/903,944, Mar. 27, 2015, 24 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/264,012, Jul. 31, 2015, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/304,911, Jul. 17, 2015, 6 pages.
“Notice of Allowance”, U.S. Appl. No. 12/970,939, Dec. 19, 2014, 10 pages.
“Notice of Allowance”, U.S. Appl. No. 12/970,943, Dec. 19, 2014, 10 pages.
“Notice of Allowance”, U.S. Appl. No. 13/026,058, Nov. 7, 2014, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 13/367,377, Feb. 7, 2012, 10 pages.
Malony,“Compensation of Measurement Overhead in Parallel Performance Profiling”, The International Journal of High Performance Computing Applications, May 1, 2007, 23 pages.
“Acoustics—Measurement of room acoustic parameters—Part 1: Performance spaces”, In ISO 3382-1:2009, May 6, 2014, 2 pages.
“Debug Navigator Help: Using Debug Gauges”, https://developer.apple.com/library/mac/recipes/xcode_help-debug_navigator/articles/using_debug_gauges.html#//apple_ref/doc/uid/TP40010432-CH8-SW1, May 28, 2014, 3 pages.
“Deployment Planning Tips for Office 365”, http://technet.microsoft.com/en-us/library/hh852435.aspx, Oct. 14, 2012, 7 pages.
“Failover Cluster Step-by-Step Guide: Validating Hardware for a Failover Cluster”, http://technet.microsoft.com/en-us/library/cc732035(v=ws.10).aspx, Mar. 20, 2011, 10 pages.
“Get history and other info about your code”, <<http://msdn.microsoft.com/en-us/library/dn269218.aspx>>, retrieved May 23, 2014, 10 pages.
“Interactive 3D Audio Rendering Guidelines, Level 2.0”, In proceedings of 3D Working Group of the Interactive Audio Special Interest Group, Sep. 20, 1999, 29 pages.
“Interest Point Detection”, Available at: http://en.wikipedia.org/wiki/Interest_point_detection, Apr. 21, 2014, 3 pages.
“Lifecycle Services for Microsoft Dynamics User Guide (LCS) [AX 2012]”, Available at: http://technet.microsoft.com/en-us/library/dn268616.aspx, Aug. 8, 2013, 5 pages.
“Low-Footprint Adaptation and Personalization for a Deep Neural Network”, U.S. Appl. No. 14/201,704, Mar. 7, 2014, 20 pages.
“Microsoft CodeLens Code Health Indicator extension”, <<https://developer.apple.com/library/ios/documentation/ToolsLanguages/Conceptual/Xcode_Overview/DebugYourApp/DebugYourApp.html>>, Mar. 10, 2014, 13 pages.
“New CodeLens Indicator—Incoming Changes”, <<http://msdn.microsoft.com/en-us/library/dn269218.aspx>>, retrieved May 23, 2014, 8 pages.
“Secure Separation in Cisco Unified Data Center Architecture”, Available at: http://www.cisco.com/en//solutions/collateral/ns340/ns414/ns742/ns743/ns1050/white_paper_c11-722425.html, Oct. 1, 2013, 8 pages.
“Shared Hidden Layer Combination for Speech Recognition Systems”, U.S. Appl. No. 14/265,110, Apr. 29, 2014, 22 pages.
“Types of vCloud Hybrid Service”, Available at: http://pubs.vmware.com/vchs/index.jsp?topic=%2FGUID-FD4D5E84-1AB8-4A1B-8C3F-769176FCD154%2FGUID-375065F3-110A-4B84-99FA-FB8467361960.html, Dec. 16, 2012, 2 pages.
“UI Element Guidelines: Menus”, Available at: https://developer.apple.com/library/mac/documentation/userexperience/conceptual/applehiguidelines/Menus/Menus.html, Sep. 26, 2011, 22 pages.
“Xcode OpenGL ES Tools Overview”, Retrieved on: Jun. 5, 2014, Available at: https://developer.apple.com/library/prerelease/ios/documentation/3DDrawing/Conceptual/OpenGLES_ProgrammingGuide/ToolsOverview/ToolsOverview.html, 10 pages.
Abad, et al.,' “Context Dependent Modelling Approaches for Hybrid Speech Recognizers”, In Proceeding of Interspeech, Sep. 26, 2010, 4 pages.
Abdel-Hamid, et al.,' “Fast Speaker Adaptation of Hybrid NN/HMM Model for Speech Recognition Based on Discriminative Learning of Speaker Code”, In IEEE International Conference on Acoustics, Speech and Signal Processing, May 26, 2013, 5 pages.
Abid, et al.,' “A New Neural Network Pruning Method Based on the Singular Value Decomposition and the Weight Initialization”, In Proceedings of 11th European Signal Processing Conference, Sep. 3, 2002, 4 pages.
Ajdler, et al.,' “The Plenacoustic Function and Its Sampling”, In IEEE Transactions on Signal Processing, vol. 54, Issue 10, Oct. 2006, 35 pages.
Ajmani, et al.,' “Scheduling and Simulation: How to Upgrade Distributed Systems”, In Proceedings of the 9th conference on Hot Topics in Operating Systems, vol. 9., May 18, 2013, 6 pages.
Alt, et al.,' “Increasing the User's Attention on the Web: Using Implicit Interaction Based on Gaze Behavior to Tailor Content”, In Proceedings of the 7th Nordic Conference on Human-Computer Interaction—Making Sense through Design, Oct. 14, 2012, 10 pages.
Azizyan, et al.,' “SurroundSense: Mobile Phone Localization via Ambience Fingerprinting”, In Proceedings of the 15th annual international conference on Mobile computing and networking, Sep. 20, 2009, 12 pages.
Barman, et al.,' “Nonnegative Matrix Factorization (NMF) Based Supervised Feature Selection and Adaptation”, In Proceedings of the 9th International Conference on Intelligent Data Engineering and Automated Learning, Nov. 2, 2008, 2 pages.
Beymer, et al.,' “WebGazeAnalyzer: A System for Capturing and Analyzing Web Reading Behavior Using Eye Gaze”, In Proceedings of Extended Abstracts on Human Factors in Computing Systems, Apr. 2, 2005, 10 pages.
Bonzi, et al.,' “The Use of Anaphoric Resolution for Document Description in Information Retrieval”, In Proceedings of Information Processing & Management, vol. 25, Issue 4, Jun. 1989, 14 pages.
Bradley, et al.,' “Accuracy and Reproducibility of Auditorium Acoustics Measures”, In Proceedings of British Institute of Acoustics, vol. 10, May 6, 2014, 2 pages.
Broder, “A Taxonomy of Web Search”, In Proceedings of ACM SIGIR Forum, vol. 36, Issue 2, Sep. 2002, 8 pages.
Burges, “From Ranknet to Lambdarank to Lambdamart: An Overview”, In Microsoft Research Technical Report MSR-TR-2010-82, Jun. 23, 2010, 19 pages.
Burges, “Learning to Rank with Nonsmooth Cost Functions”, In Proceedings of the Advances in Neural Information Processing Systems, Dec. 2006, 8 pages.
Buscher, et al.,' “Generating and Using Gaze-Based Document Annotations”, In Proceedings of Extended Abstracts on Human Factors in Computing Systems, Apr. 5, 2008, 6 pages.
Calamia, “Advances in Edge-Diffraction Modeling for Virtual-Acoustic Simulations”, In Doctoral Dissertation of Princeton University, Jun. 2009, 159 pages.
Calian, “Passage-Level Evidence in Document Retrieval”, In Proceedings of the 17th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, Jul. 3, 1994, 9 Pages.
Castro, et al.,' “A Probabilistic Room Location Service for Wireless Networked Environments”, In Proceedings of the 3rd international conference on Ubiquitous Computing, Sep. 30, 2001, 19 pages.
Chandak, et al.,' “AD-Frustum: Adaptive Frustum Tracing for Interactive Sound Propagation”, In IEEE Transactions on Visualization and Computer Graphics, vol. 14, Issue 6, Nov. 2008, 8 pages.
Chen, “Building Language Model on Continuous Space using Gaussian Mixture Models”, In Proceedings of Research in Language Modeling, Jan. 2007, 66 pages.
Cheng, et al.,' “Entityrank: Searching Entities Directly and Holistically”, In Proceedings of the 33rd International Conference on Very Large Data Bases, Sep. 23, 2007, 12 pages.
Cheng, et al.,' “Heritage and Early History of the Boundary Element Method”, In Proceedings of Engineering Analysis with Boundary Elements, vol. 29, Issue 3, Mar. 2005, 35 pages.
Chi, et al.,' “Visual Foraging of Highlighted Text: An Eye-Tracking Study”, In Proceedings of the 12th International Conference on Human-Computer Interaction—Intelligent Multimodal Interaction Environments, Jul. 22, 2007, 10 pages.
Choi, et al.,' “Face Annotation for Personal Photos Using Collaborative Face Recognition in Online Social Networks”, In 16th International Conference on Digital Signal Processing, Jul. 5, 2009, 8 pages.
Choudhury, et al.,' “A Framework for Robust Online Video Contrast Enhancement Using Modularity Optimization”, In IEEE Transactions on Circuits and Systems for Video Technology, vol. 22 , Issue: 9, Sep. 2012, 14 pages.
Clarke, “Exploiting Redundancy in Question Answering”, In Proceedings of the 24th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, Sep. 9, 2001, 8 pages.
Cucerzan, “Large-Scale Named Entity Disambiguation Based on Wikipedia Data”, In Proceedings of the Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning, Jun. 28, 2007, 9 Pages.
Dahl, et al.,' “Context-Dependent Pre-Trained Deep Neural Networks for Large Vocabulary Speech Recognition”, In IEEE Transactions on Audio, Speech, and Language Processing, vol. 20, Issue 1, Jan. 1, 2012, 13 pages.
Dahl, et al.,' “Large Vocabulary Continuous Speech Recognition with Context-Dependent DBN-HMMs”, In IEEE International Conference on Acoustics, Speech and Signal Processing, May 22, 2011, 4 pages.
Davis, et al.,' “Low-Rank Approximations for Conditional Feedforward Computation in Deep Neural Networks”, In Proceedings of ArXiv preprint arXiv: 1312.4461, Dec. 2013, 10 Pages.
Edens, et al.,' “An Investigation of Broad Coverage Automatic Pronoun Resolution for Information Retrieval”, In Proceedings of the 26th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, Jul. 28, 2003, 2 pages.
Fang, et al.,' “A Formal Study of Information Retrieval Heuristics”, In Proceedings of the 27th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, Jul. 25, 2004, 8 pages.
Finkel, “Incorporating Non-Local Information into Information Extraction Systems by Gibbs Sampling”, In Proceedings of the 43rd Annual Meeting on Association for Computational Linguistics, Jun. 2005, 8 pages.
Funkhouser, et al.,' “A Beam Tracing Method for Interactive Architectural Acoustics”, In Journal of the Acoustical Society of America, Feb. 2004, 18 pages.
Funkhouser, et al.,' “Realtime Acoustic Modeling for Distributed Virtual Environments”, In Proceedings of the 26th annual conference on Computer graphics and interactive techniques, Jul. 1, 1999, 10 pages.
Gade, “Acoustics in Halls for Speech and Music”, In Springer Handbook of Acoustics, May 6, 2014, 8 pages.
Gemello, et al.,' “Adaptation of Hybrid ANN/HMM Models Using Linear Hidden Transformations and Conservative Training”, In IEEE International Conference on Acoustics, Speech and Signal Processing, May 14, 2006, 4 pages.
Goldstein, et al.,' “Summarizing Text Documents: Sentence Selection and Evaluation Metrics”, In Proceedings of the 22nd Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, Aug. 1, 1999, 8 pages.
Gruenstein, et al.,' “Context-Sensitive Language Modeling for Large Sets of Proper Nouns in Multimodal Dialogue Systems”, In Proceedings of IEEE/ACL Workshop on Spoken Language Technology, Dec. 10, 2006, 4 pages.
Gumerov, et al.,' “Fast multipole methods on graphics processors”, In Journal of Computational Physics, vol. 227, Issue 18, Sep. 10, 2008, 4 pages.
Harper, et al.,' “A Language Modelling Approach to Relevance Profiling for Document Browsing”, In Proceedings of the 2nd ACM/IEEE-CS Joint Conference on Digital Libraries, Jul. 13, 2007, 8 pages.
Harper, et al.,' “Within-Document Retrieval: A User-Centered Evaluation of Relevance Profiling”, In Journal of Information Retrieval, vol. 7, Issue 3-4, Sep. 2004, 26 pages.
Harris, “On the use of windows for harmonic analysis with the discrete Fourier transform”, In Proceedings of the IEEE vol. 66, Issue 1, Jan. 1978, 33 pages.
Hawamdeh, et al.,' “Paragraph-based nearest neighbor searching in full-text documents”, In Proceedings of Electronic Publishing, vol. 2, Dec. 1989, 14 pages.
Hearst, “Tilebars: Visualization of Term Distribution Information in Full Text Information Access”, In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, May 7, 1995, 8 pages.
Heck, et al.,' “Robustness to Telephone Handset Distortion in Speaker Recognition by Discriminative Feature Design”, In Journal of Speech Communication—Speaker Recognition and its Commercial and Forensic Applications, vol. 31, Issue 2-3, Jun. 2000, 12 pages.
Hinton, et al.,' “Deep Neural Networks for Acoustic Modeling in Speech Recognition”, In IEEE Signal Processing Magazine, vol. 29, Issue 6, Nov. 2012, 27 pages.
Hodgson, et al.,' “Experimental evaluation of radiosity for room sound-field prediction”, In the Journal of the Acoustical Society of America, Aug. 2006, 12 pages.
Hsu, et al.,' “HBCI: Human-Building-Computer Interaction”, In Proceedings of the 2nd ACM Workshop on Embedded Sensing Systems for Energy-Efficiency in Building, Nov. 2, 2010, 6 pages.
Jacob, “QR Directory App—Overview”, In Blog of Josh Jacob Dev, Apr. 21, 2011.
Jaitly, et al.,' “Application of Pretrained Deep Neural Networks to Large Vocabulary Conversational Speech Recognition”, In Proceedings of 13th Annual Conference of the International Speech Communication Association, Mar. 12, 2012, 11 pages.
Jones, “Automatic Summarising: The state of the Art”, In Journal of Information Processing and Management: an International Journal, vol. 43, Issue 6, Nov. 1, 2007, 52 pages.
Kaszkiel, et al.,' “Effective Ranking with Arbitrary Passages”, In Journal of the American Society for Information Science and Technology, vol. 52, Issue 4, Feb. 15, 2001, 21 pages.
Kaszkiel, et al.,' “Efficient Passage Ranking for Document Databases”, In Journal of ACM Transactions on Information Systems, Oct. 1, 1999, 26 pages.
Kolarik, et al.,' “Perceiving Auditory Distance Using Level and Direct-to-Reverberant Ratio Cues”, In the Journal of the Acoustical Society of America, Oct. 2011, 4 pages.
Konig, et al.,' “Nonlinear Discriminant Feature Extraction for Robust Text-Independent Speaker Recognition”, In Proceeding of the RLA2C, ESCA workshop on Speaker Recognition and its Commercial and Forensic Applications, Apr. 1998, 4 pages.
Koo, et al.,' “Autonomous Construction of a WiFi Access Point Map Using Multidimensional Scaling”, In Proceedings of the 9th international conference on Pervasive computing, Jun. 12, 2011, 18 pages.
Krokstad, “The Hundred Years Cycle in Room Acoustic Research and Design”, In Proceedings of Reflections on sound, Jun. 2008, 30 pages.
Kumar, et al.,' “Gaze-Enhanced Scrolling Techniques”, In Proceedings of the 20th Annual ACM Symposium on User Interface Software and Technology, Oct. 2007, 4 pages.
Kuttruff, “Room Acoustics, Fourth Edition”, Available at: http://www.crcpress.com/product/isbn/9780419245803, Aug. 3, 2000, 1 page.
Laflen, et al.,' “Introducing New Features in the VSTS Database Edition GDR”, http://msdn.microsoft.com/en-us/magazine/dd483214.aspx, Nov. 2008, 16 pages.
Lavrenko, et al.,' “Relevance-Based Language Models”, In Proceedings of the 24th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, Sep. 9, 2001, 8 pages.
Lecouteux, et al.,' “Dynamic Combination of Automatic Speech Recognition Systems by Driven Decoding”, In Journal of IEEE Transactions on Audio, Speech and Language Processing, Jan. 2013, 10 pages.
Li et al.,' “Roles of Pre-Training and Fine-Tuning in Context-Dependent DBN-HMMs for Real-Word Speech Recognition”, In Proceeding of NIPS Workshop on Deep Learning and Unsupervised Feature Learning, Dec. 2010, 8 pages.
Li, et al.,' “Lattice Combination for Improved Speech Recognition”, In Proceedings of the 7th International Conference of Spoken Language Processing, Sep. 16, 2002, 4 pages.
Li, et al.,' “Spatial Sound Rendering Using Measured Room Impulse Responses”, In IEEE International Symposium on Signal Processing and Information Technology, Aug. 27, 2006, 5 pages.
Liao, “Speaker Adaptation of Context Dependent Deep Neural Networks”, In IEEE International Conference on Acoustics, Speech and Signal Processing, May 26, 2013, 5 pages.
Lin, et al.,' “What Makes a Good Answer? The Role of Context in Question Answering”, In Proceedings of the Ninth IFIP TC13 International Conference on Human-Computer Interaction, Sep. 2003, 8 pages.
Liu, et al.,' “Use of Contexts in Language Model Interpolation and Adaptation”, In Journal of Computer Speech and Language vol. 27 Issue 1, Feb. 2009, 23 pages.
Loizides, et al.,' “The Myth of Find: User Behaviour and Attitudes Towards the Basic Search Feature”, In Proceedings of the 8th ACM/IEEE-CS Joint Conference on Digital Libraries, Jun. 16, 2008, 4 pages.
Lv, et al.,' “A Comparative Study of Methods for Estimating Query Language Models with Pseudo Feedback”, In Proceedings of the 18th ACM Conference on Information and Knowledge Management, Nov. 2, 2009, 4 pages.
Lv, et al.,' “Positional Language Models for Information Retrieval”, In Proceedings of the 32nd International ACM SIGIR Conference on Research and Development in Information Retrieval, Jul. 19, 2009, 8 pages.
Machiraju, et al.,' “Designing Multitenant Applications on Windows Azure”, Available at: http://msdn.microsoft.com/en-us/library/windowsazure/hh689716.aspx, Apr. 18, 2013, 20 pages.
Mavridis, et al.,' “Friends with Faces: How Social Networks Can Enhance Face Recognition and Vice Versa”, In Proceedings of Computational Social Networks Analysis: Trends, Tools and Research Advances, May 24, 2010, 30 pages.
Mehra, et al.,' “An efficient GPU-based time domain solver for the acoustic wave equation”, In Proceedings of Applied Acoustics, vol. 73, Issue 2, Feb. 2012, 13 pages.
Mehra, et al.,' “Wave-Based Sound Propagation in Large Open Scenes Using an Equivalent Source Formulation”, In Journal of ACM transactions on Graphics, vol. 32, Issue 2, Apr. 1, 2013, 13 pages.
Mehrotra, et al.,' “Interpolation of Combined Head and Room Impulse Response for Audio Spatialization”, In Proceeding of IEEE 13th International Workshop on Multimedia Signal Processing, Oct. 17, 2011, 6 pages.
Meinedo, et al.,' “Combination of Acoustic Models in Continuous Speech Recognition Hybrid Systems”, In Proceedings of Sixth International Conference on Spoken Language Processing, Oct. 2000, 4 pages.
Mihalcea, et al.,' “Wikify!: Linking Documents to Encyclopedic Knowledge”, In Proceedings of the Sixteenth ACM Conference on Conference on Information and Knowledge Management, Nov. 6, 2007, 9 Pages.
Mohamed, et al.,' “Acoustic Modeling Using Deep Belief Networks”, In IEEE Transactions on Audio, Speech, and Language Processing, vol. 20, Issue 1, Jan. 2012, 10 pages.
Motlicek, et al.,' “Feature and Score Level Combination of Subspace Gaussians in LVCSR Task”, In IEEE International Conference on Acoustics, Speech and Signal Processing, May 26, 2013, 5 pages.
Na, et al.,' “A 2-Poisson Model for Probabilistic Coreference of Named Entities for Improved Text Retrieval”, In Proceedings of the 32nd International ACM SIGIR Conference on Research and Development in Information Retrieval, Jul. 19, 2009, 8 pages.
Neve, et al.,' “Face Recognition for Personal Photos using Online Social Network Context and Collaboration”, In Guest Lecture at KAIST, Dec. 14, 2010, 54 pages.
Novak, et al.,' “Use of Non-Negative Matrix Factorization for Language Model Adaptation in a Lecture Transcription Task”, In Proceedings of IEEE International Conference on Acoustics, Speech, and Signal Processing, vol. 1, May 7, 2001, 4 pages.
Papadopoulos, et al.,' “Image Clustering Through Community Detection on Hybrid Image Similarity Graphs”, In 17th IEEE International Conference on Image Processing, Sep. 26, 2014, 4 pages.
Perenson, “In-depth Look at Google+ Photo Update with the Team that Designed it”, Available at: http://connect.dpreview.com/post/1400574775/hands-on-with-google-plus-photo-update, May 17, 2013, 10 pages.
Peter, et al.,' “Frequency-domain edge diffraction for finite and infinite edges”, In Proceedings of Acta acustica united with acustica, vol. 95, No. 3, May 6, 2014, 2 pages.
Petkova, et al.,' “Proximity-Based Document Representation for Named Entity Retrieval”, In Proceedings of the Sixteenth ACM Conference on Conference on Information and Knowledge Management, Nov. 6, 2007, 10 pages.
Pierce, “An Introduction to Its Physical Principles and Applications”, In Acoustical Society of America, Jun. 1989, 1 page.
Ponte, et al.,' “A Language Modelling Approach to Information Retrieval”, In Proceedings of the 21st Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, Aug. 1, 1998, 7 pages.
Poulos, et al.,' “Assisted Viewing of Web-based Resources”, U.S. Appl. No. 14/161,693, Jan. 23, 2014, 48 pages.
Raghuvanshi, “Interactive Physically-based Sound Simulation”, In UMI Dissertation, Sep. 9, 2011, 187 Pages.
Raghuvanshi, et al.,' “Efficient and Accurate Sound Propagation Using Adaptive Rectangular Decomposition”, In IEEE Transactions on Visualization and Computer Graphics, vol. 15, Issue 99, Feb. 13, 2009, 13 pages.
Raghuvanshi, et al.,' “Precomputed wave simulation for real-time sound propagation of dynamic sources in complex scenes”, In Journal of ACM Transactions on Graphics, vol. 29, Issue 4, Jul. 26, 2010, 11 pages.
Rindel, et al.,' “The Use of Colors, Animations and Auralizations in Room Acoustics”, In Internoise, Sep. 15, 2013, 9 Pages.
Roberts, et al.,' “Evaluating Passage Retrieval Approaches for Question Answering”, In Proceedings of 26th European Conference on Information Retrieval, Apr. 14, 2003, 8 pages.
Robertson, et al.,' “Okapi at TREC-3”, In Proceedings of Text Retrieval Conference, Jan. 24, 2014, 19 pages.
Rouillard, “Contextual QR Codes”, In Proceedings of the Third International Multi-Conference on Computing in the Global Information Technology, Jul. 27, 2008, 6 pages.
Sabine, “Room acoustics”, In Transactions of the IRE Professional Group on Audio, vol. 1, Issue 4, Jul. 1953, 9 pages.
Sainath, et al.,' “Auto-Encoder Bottleneck Features Using Deep Belief Networks”, In Proceedings of IEEE International Conference on Acoustics, Speech and Signal Processing, Mar. 25, 2012, 4 pages.
Sainath, et al.,' “Low-Rank Matrix Factorization for Deep Neural Network Training with High-Dimensional Output Targets”, In proceedings of IEEE International Conference on Acoustics, Speech and Signal Processing, May 26, 2013, 5 pages.
Sainath, et al.,' “Making Deep Belief Networks Effective for Large Vocabulary Continuous Speech Recognition”, In Proceedings of IEEE Workshop on Automatic Speech Recognition and Understanding, Dec. 11, 2011, 6 pages.
Sakamoto, et al.,' “Calculation of impulse responses and acoustic parameters in a hall by the finite-difference time-domain method”, In Proceedings of Acoustical Science and Technology, vol. 29, Issue 4, Feb. 2008, 10 pages.
Saluja, et al.,' “Context-aware Language Modeling for Conversational Speech Translation”, In Proceedings of Machine Translation Summit XIII, Sep. 19, 2011, 8 pages.
Sarukkai, et al.,' “Improved Spontaneous Dialogue Recognition Using Dialogue and Utterance Triggers by Adaptive Probability Boosting”, In Fourth International Conference on Spoken Language, vol. 1, Oct. 3, 1996, 4 pages.
Satoh, et al.,' “Poster Abstract: Ambient Sound-based Proximity Detection with Smartphones”, In Proceedings of the 11th ACM Conference on Embedded Networked Sensor Systems, Nov. 11, 2013, 2 pages.
Savioja, “Real-Time 3D Finite-Difference Time-Domain Simulation of Mid-Frequency Room Acoustics”, In Proceedings of the 13th International Conference on Digital Audio Effects, Sep. 6, 2010, 8 pages.
Savioja, et al.,' “Simulation of room acoustics with a 3-D finite difference mesh”, In Proceedings of the International Computer Music Conference, Sep. 1994, 4 pages.
Seide, et al.,' “Conversational Speech Transcription using Context-Dependent Deep Neural Networks”, In Proceeding of 12th Annual Conference of the International Speech Communication Association, Aug. 28, 2011, 4 pages.
Shah, et al.,' “All Smiles: Automatic Photo Enhancement by Facial Expression Analysis”, In Proceedings of Conference on Visual Media Production, Dec. 5, 2012, 10 pages.
Shanklin, “Samsung Galaxy S4 to Feature Eye-Tracking Technology”, Available at: http://www.gizmag.com/galaxy-s4-eye-tracking-technology/26503/, Mar. 4, 2013, 5 pages.
Shieh, et al.,' “Seawall: Performance Isolation for Cloud Datacenter Networks”, In Proceedings of the 2nd USENIX Conference on Hot Topics in Cloud Computing, Jun. 22, 2010, 7 pages.
Singhal, et al.,' “Pivoted Document Length Normalization”, In Proceedings of the 19th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, Aug. 18, 1996, 12 pages.
Singh-Miller, et al.,' “Dimensionality Reduction for Speech Recognition Using Neighborhood Components Analysis”, In Proceedings of 8th Annual Conference of the International Speech Communication Association, Antwerp, Dec. 27, 2007, 4 pages.
Siniscalchi, et al.,' “Hermitian Based Hidden Activation Functions for Adaptation of Hybrid HMM/ANN Models”, In Proceedings of 13th Annual Conference of the International Speech Communication Association, Sep. 9, 2012, 4 pages.
Starr, “Facial recognition app matches strangers to online profiles”, Available at: http://www.cnet.com.au/facial-recognition-app-matches-strangers-to-online-profiles-339346355.htm, Jan. 7, 2014, 10 pages.
Stettner, et al.,' “Computer Graphics Visualization for Acoustic Simulation”, In Proceedings of the 16th annual conference on Computer graphics and interactive techniques, vol. 23, No. 3, Jul. 1989, 12 pages.
Su, et al.,' “Error Back Propagation for Sequence Training of Context-Dependent Deep Networks for Conversational Speech Transcription”, In IEEE International Conference on Acoustics, Speech, and Signal Processing, May 26, 2013, 5 pages.
Svensson, et al.,' “The use of Ambisonics in describing room impulse responses”, In Proceedings of the International Congress on Acoustics, Apr. 2004, 4 pages.
Swietojanski, et al.,' “Revisiting Hybrid and GMM-HMM System Combination Techniques”, In Proceeding of the IEEE International Conference on Acoustics, Speech and Signal Processing, May 26, 2013, 5 pages.
Takala, et al.,' “Sound rendering”, In Proceedings of Siggraph Computer Graphics, Jul. 1992, 11 pages.
Taylor, et al.,' “RESound: interactive sound rendering for dynamic virtual environments”, In Proceedings of the 17th ACM international conference on Multimedia, Oct. 19, 2009, 10 pages.
Tellex, et al.,' “Quantitative Evaluation of Passage Retrieval Algorithms for Question Answering”, In Proceedings of the 26th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, Jul. 28, 2003, 7 pages.
Thompson, “A review of finite-element methods for time-harmonic acoustics”, In Journal of Acoustical Society of America, vol. 119, Issue 3, Mar. 2006, 16 pages.
Tombros, et al.,' “Advantages of Query Biased Summaries in Information Retrieval”, In Proceedings of the 21st Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, Aug. 1, 1998, 9 Pages.
Trmal, et al.,' “Adaptation of a Feedforward Artificial Neural Network Using a Linear Transform”, In Proceedings of In Text, Speech and Dialogue, Sep. 10, 2010, 8 pages.
Tsay, et al.,' “Personal Photo Organizer based on Automated Annotation Framework”, In 5th International Conference on Intelligent Information Hiding and Multimedia Signal Processing, Sep. 12, 2009, 4 pages.
Valimaki, et al.,' “Fifty Years of Artificial Reverberation. Audio, Speech, and Language Processing”, In IEEE Transactions on Audio, Speech, and Language Processing, vol. 20, Issue 5, Jul. 2012, 28 pages.
Van,“Transform Coding of Audio Impulse Responses”, In Master's Thesis of Delft University of Technology, Aug. 2003, 109 pages.
Vanhoucke, et al.,' “Improving the Speed of Neural Networks on CPUs”, In Proceedings of NIPS Workshop on Deep Learning and Unsupervised Feature Learning, Dec. 16, 2011, 8 pages.
Wu, et al.,' “Adapting Boosting for Information Retrieval Measures”, In Journal of Information Retrieval, vol. 13, Issue 3, Jun. 1, 2010, 17 pages.
Xu, et al.,' “User-Oriented Document Summarization through Vision-Based Eye-Tracking”, In Proceedings of the 14th International Conference on Intelligent User Interfaces, Feb. 8, 2009, 10 pages.
Xue, et al.,' “Restructuring Deep Neural Network Acoustic Models”, U.S. Appl. No. 13/920,323, Jun. 18, 2013, 30 pages.
Xue, et al.,' “Restructuring of Deep Neural Network Acoustic Models with Singular Value Decomposition”, In Proceedings of 14th Annual Conference of the International Speech Communication Association, Aug. 25, 2013, 5 pages.
Yan, et al.,' “A Scalable Approach to Using DSS-Derived Features in GMM-HMM Based Acoustic Modeling for LVCSR”, In Proceeding of the 14th Annual Conference of the International Speech Communication Association, Aug. 25, 2013, 5 pages.
Yang, et al.,' “Qualifier in TREC-12 QA Main Task”, In Proceedings of the Twelfth Text Retrieval Conference, Nov. 2003, 9 Pages.
Yao, et al.,' “Adaptation of Context-Dependent Deep Neural Networks for Automatic Speech Recognition”, In IEEE Spoken Language Technology Workshop, Dec. 2, 2012, 4 pages.
Yeh, et al.,' “Wave-ray Coupling for Interactive Sound Propagation in Large Complex Scenes”, In Journal of ACM Transactions on Graphics, vol. 32 Issue 6, Nov. 2013, 10 pages.
Yu, et al.,' “Exploiting Sparseness in Deep Neural Networks for Large Vocabulary Speech Recognition”, In Proceeding of IEEE International Conference on Acoustics, Speech and Signal Processing, Mar. 25, 2012, 4 pages.
Yu, et al.,' “Improved Bottleneck Features Using Pretrained Deep Neural Networks”, In Proceedings of 12th Annual Conference of the International Speech Communication Association, Aug. 28, 2011, 4 pages.
Yu, et al.,' “KL-Divergence Regularized Deep Neural Network Adaptation for Improved Large Vocabulary Speech Recognition”, In IEEE International Conference on Acoustics, Speech and Signal Processing, May 26, 2013, 5 pages.
Zhai, et al.,' “A Study of Smoothing Methods for Language Models Applied to Ad Hoc Information Retrieval”, In Proceedings of the 24th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval, Sep. 9, 2009, 9 Pages.
Zwol, et al.,' “Prediction of Favorite Photos using Social, Visual, and Textual Signals”, In Proceedings of the International Conference on Multimedia, Oct. 25, 2010, 4 pages.
“Advisory Action”, U.S. Appl. No. 14/304,911, Jan. 14, 2016, 2 pages.
“Final Office Action”, U.S. Appl. No. 13/923,917, Sep. 29, 2015, 6 pages.
“Final Office Action”, U.S. Appl. No. 14/266,795, Apr. 11, 2016, 12 pages.
“Final Office Action”, U.S. Appl. No. 14/273,100, Mar. 3, 2016, 18 pages.
“Final Office Action”, U.S. Appl. No. 14/279,146, Apr. 13, 2016, 16 pages.
“Flexible Schema for Language Model Customization”, U.S. Appl. No. 14/227,492, filed Mar. 27, 2014, 20 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2014/041014, Sep. 15, 2015, 6 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/024594, Mar. 24, 2016, 7 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/027688, Apr. 26, 2016, 7 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/022887, Apr. 7, 2016, 8 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2015/022886, Aug. 31, 2015, 17 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2015/036859, Dec. 22, 2015, 17 pages.
“Invitation to Pay Additional Fees/Partial International Search Report”, Application No. PCT/US2015/033950, Feb. 23, 2016, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/227,492, Aug. 13, 2015, 36 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/264,012, Mar. 10, 2016, 7 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/264,619, Apr. 19, 2016, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/268,953, Apr. 19, 2016, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/311,208, Jan. 7, 2016, 6 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/312,501, Dec. 16, 2015, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/465,679, May 10, 2016, 31 pages.
“Notice of Allowance”, U.S. Appl. No. 13/923,969, Oct. 1, 2015, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 13/923,969, Nov. 30, 2015, 5 pages.
“Notice of Allowance”, U.S. Appl. No. 14/254,681, Dec. 4, 2015, 9 pages.
“Notice of Allowance”, U.S. Appl. No. 14/264,012, Dec. 18, 2015, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 14/304,911, Feb. 19, 2016, 7 pages.
“Notice of Allowance”, U.S. Appl. No. 14/311,208, Mar. 30, 2016, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 14/312,562, Jan. 7, 2016, 9 pages.
“Preinterview First Office Action”, U.S. Appl. No. 14/444,987, Mar. 3, 2016, 4 pages.
“Ribbon Layout and Resizing”, Retrieved on Mar. 12, 2014 at: https://msdn.microsoft.com/en-us/library/ff701790, 6 pages.
“Second Written Opinion”, Application No. PCT/US2015/022887, Jan. 7, 2016, 5 pages.
“Second Written Opinion”, Application No. PCT/US2015/026971, Mar. 30, 2016, 7 pages.
“Second Written Opinion”, Application No. PCT/US2015/027409, Mar. 18, 2016, 8 pages.
“Second Written Opinion”, Application No. PCT/US2015/027688, Feb. 9, 2016, 6 pages.
“Second Written Opinion”, Application No. PCT/US2015/027689, Apr. 1, 2016, 8 pages.
“Second Written Opinion”, Application No. PCT/US2015/028383, Apr. 18, 2016, 9 pages.
“Second Written Opinion”, Application No. PCT/US2015/029334, Mar. 31, 2016, 5 pages.
“Second Written Opinion”, Application No. PCT/US2015/029805, May 6, 2016, 9 pages.
“Second Written Opinion”, Application No. PCT/US2015/032089, Apr. 12, 2016, 8 pages.
“Second Written Opinion”, Application No. PCT/US2015/033872, Apr. 21, 2016, 6 pages.
“Second Written Opinion”, Application No. PCT/US2015/036859, May 6, 2016, 7 pages.
“Step by Step Microsoft Word 2013”, Available at: https://dbgyan.files.wordpress.com/2013/02/0735669120_wor.pdf, Mar. 1, 2013, 576 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 14/312,562, Apr. 28, 2016, 2 pages.
“The Ribbon Bar”, Available at: http://bioinf.scri.ac.uk/tablet/help/ribbon.shtml, Dec. 1, 2012, 36 pages.
Gajos,“Automatically Generating Personalized User Interfaces with Supple”, In Proceedings of Artificial Intelligence, vol. 174, Issue, Aug. 1, 2010, 49 pages.
Gajos,“Exploring the Design Space for Adaptive Graphical User Interfaces”, In Proceedings of the Working Conference on Advanced Visual Interfaces, May 6, 2006, 8 pages.
Liu,“Language Model Combination and Adaptation using Weighted Finite State Transducers”, In Proceedings of IEEE International Conference on Acoustics Speech and Signal Processing, Mar. 14, 2010, 4 pages.
Peng,“Joint and Implicit Registration for Face Recognition”, Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'09), Jun. 2009, 8 pages.
Scarr,“Improving Command Selection with Command Maps”, In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, May 2012, 10 pages.
“Final Office Action”, U.S. Appl. No. 12/573,157, Feb. 17, 2015, 18 pages.
“Final Office Action”, U.S. Appl. No. 12/573,157, Jul. 5, 2013, 18 pages.
“Final Office Action”, U.S. Appl. No. 14/264,619, Aug. 12, 2016, 12 pages.
“Final Office Action”, U.S. Appl. No. 14/456,679, Aug. 31, 2016, 33 pages.
“First Action Interview Office Action”, U.S. Appl. No. 14/444,987, Aug. 24, 2016, 9 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/026971, Aug. 10, 2016, 8 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/035219, Jun. 23, 2016, 9 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/032089, Jun. 29, 2016, 9 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/027689, Jul. 18, 2016, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/573,157, Apr. 23, 2014, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/573,157, Aug. 20, 2015, 17 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/573,157, Nov. 28, 2012, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/923,917, Jun. 30, 2016, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/264,012, Aug. 10, 2016, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/266,795, Jul. 19, 2016, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/529,636, Jul. 19, 2016, 12 pages.
“Notice of Allowance”, U.S. Appl. No. 12/573,157, Jun. 6, 2016, 10 pages.
“Notice of Allowance”, U.S. Appl. No. 14/227,492, Aug. 4, 2016, 11 pages.
“Notice of Allowance”, U.S. Appl. No. 14/265,110, Aug. 3, 2016, 22 pages.
“Notice of Allowance”, U.S. Appl. No. 14/311,208, Jul. 19, 2016, 8 pages.
Astheimer,“What you see is what you hear—Acoustics applied in Virtual Worlds”, In Proceedings of IEEE Symposium on Research Frontiers in Virtual Reality, Oct. 25, 1993, pp. 100-107.
Funkhouser,“Survey of Methods for Modeling Sound Propagation in Interactive Virtual Environment Systems”, Retrieved from <<http://www-sop.inria.fr/reves/Nicolas.Tsingos/publis/presence03.pdf>>, Jan. 2003, 53 pages.
Lauterbach,“Interactive Sound Rendering in Complex and Dynamic Scenes Using Frustum Tracing”, In Proceedings of IEEE Transactions on Visualization and Computer Graphics (vol. 13, Issue 6), Nov. 2007, pp. 1672-1679.
Lentz,“Virtual Reality System with Integrated Sound Field Simulation and Reproduction”, In EURASIP Journal on Applied Signal Processing, Issue 1, Jan. 2007, 22 pages.
Wand,“A Real-Time Sound Rendering Algorithm for Complex Scenes”, Retrieved at: <<http://web.archive.org/web/20090605124135/http://www.mpi-de/˜mwand/papers/tr03.pdf>>, Jul. 2003, 13 pages.
“Final Office Action”, U.S. Appl. No. 14/312,501, May 27, 2016, 13 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/027409, Jun. 16, 2016, 9 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/338,078, Jun. 16, 2016, 19 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/456,679, May 10, 2016, 31 pages.
“Notice of Allowance”, U.S. Appl. No. 14/304,911, May 23, 2016, 7 pages.
“Second Written Opinion”, Application No. PCT/US2015/036587, May 18, 2016, 7 pages.
“Second Written Opinion”, Application No. PCT/US2015/036595, May 31, 2016, 6 pages.
“Final Office Action”, U.S. Appl. No. 14/268,953, Sep. 14, 2016, 16 pages.
“Final Office Action”, U.S. Appl. No. 14/316,802, Sep. 6, 2016, 20 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/036587, Sep. 12, 2016, 7 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2015/033633, Sep. 18, 2015, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/264,619, Nov. 2, 2016, 10 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/036595, Oct. 7, 2016, 8 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/036859, Oct. 7, 2016, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/316,802, Dec. 2, 2016, 22 pages.
“Notice of Allowance”, U.S. Appl. No. 14/456,679, Nov. 30, 2016, 15 pages.
“Final Office Action”, U.S. Appl. No. 14/266,795, Jan. 17, 2017, 11 pages.
“Final Office Action”, U.S. Appl. No. 14/296,644, Jan. 12, 2017, 30 pages.
“Final Office Action”, U.S. Appl. No. 14/338,078, Dec. 30, 2016, 32 pages.
“Final Office Action”, U.S. Appl. No. 14/529,636, Jan. 31, 2017, 15 pages.
“Final Office Action”, U.S. Appl. No. 15/076,125, Dec. 8, 2016, 6 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/033950, Dec. 15, 2016, 14 pages.
“International Preliminary Report on Patentability”, Application No. PCT/US2015/033545, Dec. 15, 2016, 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/268,953, Jan. 26, 2017, 15 pages.
“Non-Final Office Action”, U.S. Appl. No. 14/273,100, Jan. 30, 2017, 14 pages.
“Notice of Allowance”, U.S. Appl. No. 14/264,012, Jan. 5, 2017, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 14/312,501, Feb. 15, 2017, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 14/456,679, Nov. 30, 2016, 15 pages.
Related Publications (1)
Number Date Country
20150304165 A1 Oct 2015 US