This relates generally to controlling television user interactions and, more specifically, to processing speech for a virtual assistant to control television user interactions.
Intelligent automated assistants (or virtual assistants) provide an intuitive interface between users and electronic devices. These assistants can allow users to interact with devices or systems using natural language in spoken and/or text forms. For example, a user can access the services of an electronic device by providing a spoken user input in natural language form to a virtual assistant associated with the electronic device. The virtual assistant can perform natural language processing on the spoken user input to infer the user's intent and operationalize the user's intent into tasks. The tasks can then be performed by executing one or more functions of the electronic device, and, in some examples, a relevant output can be returned to the user in natural language form.
While mobile telephones (e.g., smartphones), tablet computers, and the like have benefitted from virtual assistant control, many other user devices lack such convenient control mechanisms. For example, user interactions with media control devices (e.g., televisions, television set-top boxes, cable boxes, gaming devices, streaming media devices, digital video recorders, etc.) can be complicated and difficult to learn. Moreover, with the growing sources of media available through such devices (e.g., over-the-air TV, subscription TV service, streaming video services, cable on-demand video services, web-based video services, etc.), it can be cumbersome or even overwhelming for some users to find desired media content to consume. As a result, many media control devices can provide an inferior user experience that can be frustrating for many users.
Systems and processes are disclosed for controlling television interactions using a virtual assistant. In one example, speech input can be received from a user. Media content can be determined based on the speech input. A first user interface having a first size can be displayed, and the first user interface can include selectable links to the media content. A selection of one of the selectable links can be received. In response to the selection, a second user interface can be displayed having a second size larger than the first size, and the second user interface can include the media content associated with the selection.
In another example, speech input can be received from a user at a first device having a first display. A user intent of the speech input can be determined based on content displayed on the first display. Media content can be determined based on the user intent. The media content can be played on a second device associated with a second display.
In another example, speech input can be received from a user, and the speech input can include a query associated with content shown on a television display. A user intent of the query can be determined based on the content shown on the television display and/or a viewing history of media content. A result of the query can be displayed based on the determined user intent.
In another example, media content can be displayed on a display. An input can be received from a user. Virtual assistant queries can be determined based on the media content and/or a viewing history of media content. The virtual assistant queries can be displayed on the display.
In the following description of examples, reference is made to the accompanying drawings in which it is shown by way of illustration specific examples that can be practiced. It is to be understood that other examples can be used and structural changes can be made without departing from the scope of the various examples.
This relates to systems and processes for controlling television user interactions using a virtual assistant. In one example, a virtual assistant can be used to interact with a media control device, such as a television set-top box controlling content shown on a television display. A mobile user device or a remote control with a microphone can be used to receive speech input for the virtual assistant. The user's intent can be determined from the speech input, and the virtual assistant can execute tasks according to the user's intent, including causing playback of media on a connected television and controlling any other functions of a television set-top box or like device (e.g., managing video recordings, searching for media content, navigating menus, etc.).
Virtual assistant interactions can be shown on a connected television or other display. In one example, media content can be determined based on speech input received from a user. A first user interface with a first small size can be displayed, including selectable links to the determined media content. After receiving a selection of a media link, a second user interface with a second larger size can be displayed, including the media content associated with the selection. In other examples, the interface used to convey virtual assistant interactions can expand or contract to occupy a minimal amount of space while conveying desired information.
In some examples, multiple devices associated with multiple displays can be used to determine user intent from speech input as well as to convey information to users in different ways. For example, speech input can be received from a user at a first device having a first display. The user's intent can be determined from the speech input based on content displayed on the first display. Media content can be determined based on the user intent, and the media content can be played on a second device associated with a second display.
Television display content can also be used as contextual input for determining user intent from speech input. For example, speech input can be received from a user, including a query associated with content shown on a television display. The user intent of the query can be determined based on the content shown on the television display as well as a viewing history of media content on the television display (e.g., disambiguating the query based on characters in a playing TV show). The results of the query can then be displayed based on the determined user intent.
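By way of a purely illustrative sketch (not part of the examples above; the names OnScreenContext and resolve_entity, and all sample data, are hypothetical), such contextual disambiguation of a spoken query against on-screen content and viewing history could be approximated as follows:

```python
# Illustrative sketch only: resolving a spoken query against on-screen context.
# All names (OnScreenContext, resolve_entity) and data are hypothetical.
from dataclasses import dataclass, field


@dataclass
class OnScreenContext:
    """Context captured from the television at the time of the query."""
    now_playing: str                      # title of the currently playing program
    characters_on_screen: list[str]       # e.g., cast metadata for the current scene
    viewing_history: list[str] = field(default_factory=list)


def resolve_entity(query: str, context: OnScreenContext) -> str | None:
    """Pick the contextual entity a vague query most likely refers to."""
    candidates = context.characters_on_screen + context.viewing_history + [context.now_playing]
    words = set(query.lower().split())
    # Prefer a candidate whose name overlaps the query words; otherwise fall back
    # to what is on screen right now, since that is the most salient referent.
    for candidate in candidates:
        if words & set(candidate.lower().split()):
            return candidate
    return context.characters_on_screen[0] if context.characters_on_screen else None


context = OnScreenContext(
    now_playing="Courtroom Drama",
    characters_on_screen=["Judge Reynolds", "Detective Blake"],
    viewing_history=["Evening News", "Soccer Highlights"],
)
print(resolve_entity("who plays blake in this show", context))  # -> "Detective Blake"
```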
In some examples, virtual assistant query suggestions can be provided to the user (e.g., to acquaint the user with available commands, suggest interesting content, etc.). For example, media content can be shown on a display, and an input can be received from the user requesting virtual assistant query suggestions. Virtual assistant query suggestions can be determined based on the media content shown on the display and a viewing history of media content shown on the display (e.g., suggesting queries related to a playing TV show). The suggested virtual assistant queries can then be shown on the display.
Controlling television user interactions using a virtual assistant according to the various examples discussed herein can provide an efficient and enjoyable user experience. User interactions with media control devices can be intuitive and simple using a virtual assistant capable of receiving natural language queries or commands. Available functions can be suggested to users as desired, including meaningful query suggestions based on playing content, which can help users learn control capabilities. In addition, available media can be made easily accessible using intuitive spoken commands. It should be understood, however, that still other advantages can be achieved according to the various examples discussed herein.
A virtual assistant can be capable of accepting a user request at least partially in the form of a natural language command, request, statement, narrative, and/or inquiry. Typically, the user request seeks either an informational answer or performance of a task by the virtual assistant (e.g., causing display of particular media). A satisfactory response to the user request can include provision of the requested informational answer, performance of the requested task, or a combination of the two. For example, a user can ask the virtual assistant a question, such as “Where am I right now?” Based on the user's current location, the virtual assistant can answer, “You are in Central Park.” The user can also request the performance of a task, for example, “Please remind me to call Mom at 4 p.m. today.” In response, the virtual assistant can acknowledge the request and then create an appropriate reminder item in the user's electronic schedule. During the performance of a requested task, the virtual assistant can sometimes interact with the user in a continuous dialogue involving multiple exchanges of information over an extended period of time. There are numerous other ways of interacting with a virtual assistant to request information or performance of various tasks. In addition to providing verbal responses and taking programmed actions, the virtual assistant can also provide responses in other visual or audio forms (e.g., as text, alerts, music, videos, animations, etc.). Moreover, as discussed herein, an exemplary virtual assistant can control playback of media content (e.g., playing video on a television) and cause information to be displayed on a display.
An example of a virtual assistant is described in Applicants' U.S. Utility application Ser. No. 12/987,982 for “Intelligent Automated Assistant,” filed Jan. 10, 2011, the entire disclosure of which is incorporated herein by reference.
System 100 can include user device 102, television set-top box 104 coupled to display 112 and speakers 111, remote control 106, and server system 110, which can communicate through one or more networks 108.
In some examples, television set-top box 104 can function as a media control center for multiple types and sources of media content. For example, television set-top box 104 can facilitate user access to live television (e.g., over-the-air, satellite, or cable television). As such, television set-top box 104 can include cable tuners, satellite tuners, or the like. In some examples, television set-top box 104 can also record television programs for later time-shifted viewing. In other examples, television set-top box 104 can provide access to one or more streaming media services, such as cable-delivered on-demand television shows, videos, and music as well as Internet-delivered television shows, videos, and music (e.g., from various free, paid, and subscription-based streaming services). In still other examples, television set-top box 104 can facilitate playback or display of media content from any other source, such as displaying photos from a mobile user device, playing videos from a coupled storage device, playing music from a coupled music player, or the like. Television set-top box 104 can also include various other combinations of the media control features discussed herein, as desired.
User device 102 and television set-top box 104 can communicate with server system 110 through one or more networks 108, which can include the Internet, an intranet, or any other wired or wireless public or private network. In addition, user device 102 can communicate with television set-top box 104 through network 108 or directly through any other wired or wireless communication mechanisms (e.g., Bluetooth, Wi-Fi, radio frequency, infrared transmission, etc.). As illustrated, remote control 106 can communicate with television set-top box 104 using any type of communication, such as a wired connection or any type of wireless communication (e.g., Bluetooth, Wi-Fi, radio frequency, infrared transmission, etc.), including via network 108. In some examples, users can interact with television set-top box 104 through user device 102, remote control 106, or interface elements integrated within television set-top box 104 (e.g., buttons, a microphone, a camera, a joystick, etc.). For example, speech input including media-related queries or commands for the virtual assistant can be received at user device 102 and/or remote control 106, and the speech input can be used to cause media-related tasks to be executed on television set-top box 104. Likewise, tactile commands for controlling media on television set-top box 104 can be received at user device 102 and/or remote control 106 (as well as from other devices not shown). The various functions of television set-top box 104 can thus be controlled in a variety of ways, giving users multiple options for controlling media content from multiple devices.
The client-side portion of the exemplary virtual assistant executed on user device 102 and/or television set-top box 104 with remote control 106 can provide client-side functionalities, such as user-facing input and output processing and communications with server system 110. Server system 110 can provide server-side functionalities for any number of clients residing on a respective user device 102 or respective television set-top box 104.
Server system 110 can include one or more virtual assistant servers 114 that can include a client-facing I/O interface 122, one or more processing modules 118, data and model storage 120, and an I/O interface to external services 116. The client-facing I/O interface 122 can facilitate the client-facing input and output processing for virtual assistant server 114. The one or more processing modules 118 can utilize data and model storage 120 to determine the user's intent based on natural language input, and can perform task execution based on inferred user intent. In some examples, virtual assistant server 114 can communicate with external services 124, such as telephony services, calendar services, information services, messaging services, navigation services, television programming services, streaming media services, and the like, through network(s) 108 for task completion or information acquisition. The I/O interface to external services 116 can facilitate such communications.
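As a minimal, non-limiting sketch of how processing modules and stored models could map an utterance to a task (all names and the keyword-based "model" below are hypothetical simplifications, not the actual behavior of processing modules 118):

```python
# Minimal sketch of a server-side intent pipeline (hypothetical names throughout).
from typing import Callable

# A trivial "model": map intent keywords to task handlers.
TASK_HANDLERS: dict[str, Callable[[str], str]] = {
    "play":   lambda utterance: f"PLAY_MEDIA: {utterance}",
    "record": lambda utterance: f"SCHEDULE_RECORDING: {utterance}",
    "search": lambda utterance: f"SEARCH_CATALOG: {utterance}",
}


def infer_intent(utterance: str) -> str:
    """Very rough intent classification standing in for natural language models."""
    lowered = utterance.lower()
    for intent in TASK_HANDLERS:
        if intent in lowered:
            return intent
    return "search"   # default to a search when no known verb matches


def handle_request(utterance: str) -> str:
    """Client-facing entry point: infer intent, then execute the matching task."""
    intent = infer_intent(utterance)
    return TASK_HANDLERS[intent](utterance)


print(handle_request("Play the soccer game on channel five"))
```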
Server system 110 can be implemented on one or more standalone data processing devices or a distributed network of computers. In some examples, server system 110 can employ various virtual devices and/or services of third-party service providers (e.g., third-party cloud service providers) to provide the underlying computing resources and/or infrastructure resources of server system 110.
Although the functionality of the virtual assistant is shown as including both a client-side portion and a server-side portion, the division of functionalities between the client and server portions can vary in different examples.
For example, user device 102 can include a motion sensor 210, a light sensor 212, and a proximity sensor 214 coupled to peripherals interface 206 to facilitate orientation, light, and proximity sensing functions. One or more other sensors 216, such as a positioning system (e.g., a GPS receiver), a temperature sensor, a biometric sensor, a gyroscope, a compass, an accelerometer, and the like, can also be connected to peripherals interface 206, to facilitate related functionalities.
In some examples, a camera subsystem 220 and an optical sensor 222 can be utilized to facilitate camera functions, such as taking photographs and recording video clips. Communication functions can be facilitated through one or more wired and/or wireless communication subsystems 224, which can include various communication ports, radio frequency receivers and transmitters, and/or optical (e.g., infrared) receivers and transmitters. An audio subsystem 226 can be coupled to speakers 228 and microphone 230 to facilitate voice-enabled functions, such as voice recognition, voice replication, digital recording, and telephony functions.
In some examples, user device 102 can further include an I/O subsystem 240 coupled to peripherals interface 206. I/O subsystem 240 can include a touchscreen controller 242 and/or other input controller(s) 244. Touchscreen controller 242 can be coupled to a touchscreen 246. Touchscreen 246 and the touchscreen controller 242 can, for example, detect contact and movement or break thereof using any of a plurality of touch sensitivity technologies, such as capacitive, resistive, infrared, and surface acoustic wave technologies; proximity sensor arrays; and the like. Other input controller(s) 244 can be coupled to other input/control devices 248, such as one or more buttons, rocker switches, a thumb-wheel, an infrared port, a USB port, and/or a pointer device, such as a stylus.
In some examples, user device 102 can further include a memory interface 202 coupled to memory 250. Memory 250 can include any electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device; a portable computer diskette (magnetic); a random access memory (RAM) (magnetic); a read-only memory (ROM) (magnetic); an erasable programmable read-only memory (EPROM) (magnetic); a portable optical disc such as CD, CD-R, CD-RW, DVD, DVD-R, or DVD-RW; or flash memory such as compact flash cards, secured digital cards, USB memory devices, memory sticks, and the like. In some examples, a non-transitory computer-readable storage medium of memory 250 can be used to store instructions (e.g., for performing portions or all of the various processes described herein) for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device, and can execute the instructions. In other examples, the instructions (e.g., for performing portions or all of the various processes described herein) can be stored on a non-transitory computer-readable storage medium of server system 110, or can be divided between the non-transitory computer-readable storage medium of memory 250 and the non-transitory computer-readable storage medium of server system 110. In the context of this document, a “non-transitory computer-readable storage medium” can be any medium that can contain or store the program for use by or in connection with the instruction execution system, apparatus, or device.
In some examples, memory 250 can store an operating system 252, a communication module 254, a graphical user interface module 256, a sensor processing module 258, a phone module 260, and applications 262. Operating system 252 can include instructions for handling basic system services and for performing hardware-dependent tasks. Communication module 254 can facilitate communicating with one or more additional devices, one or more computers, and/or one or more servers. Graphical user interface module 256 can facilitate graphical user interface processing. Sensor processing module 258 can facilitate sensor-related processing and functions. Phone module 260 can facilitate phone-related processes and functions. Applications 262 can facilitate various functionalities of user applications, such as electronic messaging, web browsing, media processing, navigation, imaging, and/or other processes and functions.
As described herein, memory 250 can also store client-side virtual assistant instructions (e.g., in a virtual assistant client module 264) and various user data 266 (e.g., user-specific vocabulary data, preference data, and/or other data such as the user's electronic address book, to-do lists, shopping lists, television program favorites, etc.) to, for example, provide the client-side functionalities of the virtual assistant. User data 266 can also be used in performing speech recognition in support of the virtual assistant or for any other application.
In various examples, virtual assistant client module 264 can be capable of accepting voice input (e.g., speech input), text input, touch input, and/or gestural input through various user interfaces (e.g., I/O subsystem 240, audio subsystem 226, or the like) of user device 102. Virtual assistant client module 264 can also be capable of providing output in audio (e.g., speech output), visual, and/or tactile forms. For example, output can be provided as voice, sound, alerts, text messages, menus, graphics, videos, animations, vibrations, and/or combinations of two or more of the above. During operation, virtual assistant client module 264 can communicate with the virtual assistant server using communication subsystem 224.
In some examples, virtual assistant client module 264 can utilize the various sensors, subsystems, and peripheral devices to gather additional information from the surrounding environment of user device 102 to establish a context associated with a user, the current user interaction, and/or the current user input. Such context can also include information from other devices, such as from television set-top box 104. In some examples, virtual assistant client module 264 can provide the contextual information or a subset thereof with the user input to the virtual assistant server to help infer the user's intent. The virtual assistant can also use the contextual information to determine how to prepare and deliver outputs to the user. The contextual information can further be used by user device 102 or server system 110 to support accurate speech recognition.
In some examples, the contextual information that accompanies the user input can include sensor information, such as lighting, ambient noise, ambient temperature, images or videos of the surrounding environment, distance to another object, and the like. The contextual information can further include information associated with the physical state of user device 102 (e.g., device orientation, device location, device temperature, power level, speed, acceleration, motion patterns, cellular signal strength, etc.) or the software state of user device 102 (e.g., running processes, installed programs, past and present network activities, background services, error logs, resource usage, etc.). The contextual information can further include information associated with the state of connected devices or other devices associated with the user (e.g., media content displayed by television set-top box 104, media content available to television set-top box 104, etc.). Any of these types of contextual information can be provided to virtual assistant server 114 (or used on user device 102 itself) as contextual information associated with a user input.
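A hypothetical sketch of the kind of contextual payload that could accompany a user input is shown below; all field names and values are illustrative assumptions rather than an actual format used by virtual assistant server 114:

```python
# Hypothetical sketch of contextual information accompanying a user input
# (field names are illustrative only).
import json

context_payload = {
    "sensor": {"ambient_noise_db": 42.0, "lighting": "dim"},
    "device_physical": {"orientation": "portrait", "battery_pct": 76, "moving": False},
    "device_software": {"foreground_app": "tv_remote", "network": "wifi"},
    "connected_devices": {
        "set_top_box": {
            "now_playing": {"title": "Soccer: Team Alpha vs Team Beta", "channel": 5},
            "available_inputs": ["live_tv", "recordings", "streaming"],
        }
    },
}

# The payload can be serialized alongside the speech input for intent inference.
print(json.dumps(context_payload, indent=2))
```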
In some examples, virtual assistant client module 264 can selectively provide information (e.g., user data 266) stored on user device 102 in response to requests from virtual assistant server 114 (or it can be used on user device 102 itself in executing speech recognition and/or virtual assistant functions). Virtual assistant client module 264 can also elicit additional input from the user via a natural language dialogue or other user interfaces upon request by virtual assistant server 114. Virtual assistant client module 264 can pass the additional input to virtual assistant server 114 to help virtual assistant server 114 in intent inference and/or fulfillment of the user's intent expressed in the user request.
In various examples, memory 250 can include additional instructions or fewer instructions. Furthermore, various functions of user device 102 can be implemented in hardware and/or in firmware, including in one or more signal processing and/or application specific integrated circuits.
Television set-top box 104 in system 300 can similarly include various components and subsystems for providing its media control functions, examples of which are described below.
For example, television set-top box 104 can include one or more wired and/or wireless communication subsystems 324 to facilitate communication functions, and communication subsystems 324 can include various communication ports, radio frequency receivers and transmitters, and/or optical (e.g., infrared) receivers and transmitters.
In some examples, television set-top box 104 can further include an I/O subsystem 340 coupled to peripherals interface 306. I/O subsystem 340 can include an audio/video output controller 370. Audio/video output controller 370 can be coupled to a display 112 and speakers 111 or can otherwise provide audio and video output (e.g., via audio/video ports, wireless transmission, etc.). I/O subsystem 340 can further include remote controller 342. Remote controller 342 can be communicatively coupled to remote control 106 (e.g., via a wired connection, Bluetooth, Wi-Fi, etc.). Remote control 106 can include microphone 372 for capturing audio input (e.g., speech input from a user), button(s) 374 for capturing tactile input, and transceiver 376 for facilitating communication with television set-top box 104 via remote controller 342. Remote control 106 can also include other input mechanisms, such as a keyboard, joystick, touchpad, or the like. Remote control 106 can further include output mechanisms, such as lights, a display, a speaker, or the like. Input received at remote control 106 (e.g., user speech, button presses, etc.) can be communicated to television set-top box 104 via remote controller 342. I/O subsystem 340 can also include other input controller(s) 344. Other input controller(s) 344 can be coupled to other input/control devices 348, such as one or more buttons, rocker switches, a thumb-wheel, an infrared port, a USB port, and/or a pointer device, such as a stylus.
In some examples, television set-top box 104 can further include a memory interface 302 coupled to memory 350. Memory 350 can include any electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device; a portable computer diskette (magnetic); a random access memory (RAM) (magnetic); a read-only memory (ROM) (magnetic); an erasable programmable read-only memory (EPROM) (magnetic); a portable optical disc such as CD, CD-R, CD-RW, DVD, DVD-R, or DVD-RW; or flash memory such as compact flash cards, secured digital cards, USB memory devices, memory sticks, and the like. In some examples, a non-transitory computer-readable storage medium of memory 350 can be used to store instructions (e.g., for performing portions or all of the various processes described herein) for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device, and can execute the instructions. In other examples, the instructions (e.g., for performing portions or all of the various processes described herein) can be stored on a non-transitory computer-readable storage medium of server system 110, or can be divided between the non-transitory computer-readable storage medium of memory 350 and the non-transitory computer-readable storage medium of server system 110. In the context of this document, a “non-transitory computer-readable storage medium” can be any medium that can contain or store the program for use by or in connection with the instruction execution system, apparatus, or device.
In some examples, memory 350 can store an operating system 352, a communication module 354, a graphical user interface module 356, an on-device media module 358, an off-device media module 360, and applications 362. Operating system 352 can include instructions for handling basic system services and for performing hardware-dependent tasks. Communication module 354 can facilitate communicating with one or more additional devices, one or more computers, and/or one or more servers. Graphical user interface module 356 can facilitate graphical user interface processing. On-device media module 358 can facilitate storage and playback of media content stored locally on television set-top box 104 and other media content available locally (e.g., cable channel tuning). Off-device media module 360 can facilitate streaming playback or download of media content stored remotely (e.g., on a remote server, on user device 102, etc.). Applications 362 can facilitate various functionalities of user applications, such as electronic messaging, web browsing, media processing, gaming, and/or other processes and functions.
As described herein, memory 350 can also store client-side virtual assistant instructions (e.g., in a virtual assistant client module 364) and various user data 366 (e.g., user-specific vocabulary data, preference data, and/or other data such as the user's electronic address book, to-do lists, shopping lists, television program favorites, etc.) to, for example, provide the client-side functionalities of the virtual assistant. User data 366 can also be used in performing speech recognition in support of the virtual assistant or for any other application.
In various examples, virtual assistant client module 364 can be capable of accepting voice input (e.g., speech input), text input, touch input, and/or gestural input through various user interfaces (e.g., I/O subsystem 340 or the like) of television set-top box 104. Virtual assistant client module 364 can also be capable of providing output in audio (e.g., speech output), visual, and/or tactile forms. For example, output can be provided as voice, sound, alerts, text messages, menus, graphics, videos, animations, vibrations, and/or combinations of two or more of the above. During operation, virtual assistant client module 364 can communicate with the virtual assistant server using communication subsystem 324.
In some examples, virtual assistant client module 364 can utilize the various subsystems and peripheral devices to gather additional information from the surrounding environment of television set-top box 104 to establish a context associated with a user, the current user interaction, and/or the current user input. Such context can also include information from other devices, such as from user device 102. In some examples, virtual assistant client module 364 can provide the contextual information or a subset thereof with the user input to the virtual assistant server to help infer the user's intent. The virtual assistant can also use the contextual information to determine how to prepare and deliver outputs to the user. The contextual information can further be used by television set-top box 104 or server system 110 to support accurate speech recognition.
In some examples, the contextual information that accompanies the user input can include sensor information, such as lighting, ambient noise, ambient temperature, distance to another object, and the like. The contextual information can further include information associated with the physical state of television set-top box 104 (e.g., device location, device temperature, power level, etc.) or the software state of television set-top box 104 (e.g., running processes, installed applications, past and present network activities, background services, error logs, resource usage, etc.). The contextual information can further include information associated with the state of connected devices or other devices associated with the user (e.g., content displayed on user device 102, playable content on user device 102, etc.). Any of these types of contextual information can be provided to virtual assistant server 114 (or used on television set-top box 104 itself) as contextual information associated with a user input.
In some examples, virtual assistant client module 364 can selectively provide information (e.g., user data 366) stored on television set-top box 104 in response to requests from virtual assistant server 114 (or it can be used on television set-top box 104 itself in executing speech recognition and/or virtual assistant functions). Virtual assistant client module 364 can also elicit additional input from the user via a natural language dialogue or other user interfaces upon request by virtual assistant server 114. Virtual assistant client module 364 can pass the additional input to virtual assistant server 114 to help virtual assistant server 114 in intent inference and/or fulfillment of the user's intent expressed in the user request.
In various examples, memory 350 can include additional instructions or fewer instructions. Furthermore, various functions of television set-top box 104 can be implemented in hardware and/or in firmware, including in one or more signal processing and/or application specific integrated circuits.
It should be understood that system 100 and system 300 are not limited to the components and configurations described above.
Throughout this disclosure, references to “the system” can include system 100, system 300, or one or more elements of either system 100 or system 300. For example, a typical system referred to herein can include at least television set-top box 104 receiving user input from remote control 106 and/or user device 102.
In one example, a virtual assistant can be triggered to listen for speech input containing a command or query (or to commence recording of speech input for subsequent processing or commence processing in real-time of speech input). Listening can be triggered in a variety of ways, including indications such as a user pressing a physical button on remote control 106, a user pressing a physical button on user device 102, a user pressing a virtual button on user device 102, a user uttering a trigger phrase that is recognizable by an always-listening device (e.g., uttering “Hey Assistant” to commence listening for a command), a user performing a gesture detectable by a sensor (e.g., motioning in front of a camera), or the like. In another example, a user can press and hold a physical button on remote control 106 or user device 102 to initiate listening. In still other examples, a user can press and hold a physical button on remote control 106 or user device 102 while speaking a query or command, and can release the button when finished. Various other indications can likewise be received to initiate receipt of speech input from the user.
In response to receiving an indication to listen for speech input, speech input interface 484 can be displayed.
As the user begins to speak, listening confirmation 487 can be displayed within speech input interface 484.
Upon detecting that the user has finished speaking (e.g., based on a pause, speech interpretation indicating the end of a query, or any other endpoint detection method), processing confirmation 488 can be displayed.
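The following sketch illustrates one simple pause-based approach to endpoint detection of the kind mentioned above; the energy threshold, frame counts, and function name are hypothetical:

```python
# Illustrative sketch of a simple pause-based endpoint detector; thresholds
# and frame counts are hypothetical example values.
def detect_endpoint(frame_energies: list[float],
                    silence_threshold: float = 0.05,
                    trailing_silence_frames: int = 30) -> int | None:
    """Return the index of the frame where speech is considered finished,
    i.e., the start of a sufficiently long run of low-energy frames."""
    silent_run = 0
    for i, energy in enumerate(frame_energies):
        if energy < silence_threshold:
            silent_run += 1
            if silent_run >= trailing_silence_frames:
                return i - trailing_silence_frames + 1
        else:
            silent_run = 0
    return None  # still speaking; keep listening


# 40 loud frames followed by 35 quiet frames -> endpoint near frame 40.
print(detect_endpoint([0.4] * 40 + [0.01] * 35))
```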
After the captured speech input is interpreted as text (or in response to successfully converting the speech input to text), command receipt confirmation 490 and/or transcription 492 can be displayed.
In other examples, speech transcription can be performed in real-time as a user speaks. As words are transcribed, they can be displayed in speech input interface 484. For example, the words can be displayed alongside listening confirmation 487. After the user finishes speaking, command receipt confirmation 490 can be displayed briefly before executing the tasks associated with the user's command.
Moreover, in other examples, command receipt confirmation 490 can convey information about received and understood commands. For example, for a simple request to change to another channel, a logo or number associated with the channel can briefly be displayed as command receipt confirmation 490 (e.g., for a few seconds) as the channel is changed. In another example, for a request to pause a video (e.g., video 480), a pause symbol (e.g., two vertical, parallel bars) can be displayed as command receipt confirmation 490. The pause symbol can remain on the display until, for example, the user performs another action (e.g., issuing a play command to resume playback). Symbols, logos, animations, or the like can likewise be displayed for any other command (e.g., symbols for rewind, fast forward, stop, play, etc.). Command receipt confirmation 490 can thus be used to convey command-specific information.
In some examples, speech input interface 484 can be hidden after receipt of a user query or command. For example, speech input interface 484 can be animated as sliding downward until it is out of view of the bottom of display 112. Speech input interface 484 can be hidden in instances where further information need not be displayed to the user. For example, for common or straightforward commands (e.g., change to channel ten, change to the sports channel, play, pause, fast forward, rewind, etc.), speech input interface 484 can be hidden immediately after confirming command receipt, and the associated task or tasks can be performed immediately. Although various examples herein illustrate and describe an interface at a bottom or top edge of a display, it should be appreciated that any of the various interfaces can be positioned in other locations around a display. For example, speech input interface 484 can emerge from a side edge of display 112, in the center of display 112, in a corner of display 112, or the like. Similarly, the various other interface examples described herein can be arranged in a variety of different orientations in a variety of different locations on a display. Moreover, although various interfaces described herein are illustrated as opaque, any of the various interfaces can be transparent or otherwise allow an image (blurred or whole) to be viewed through the interface (e.g., overlaying interface content on media content without completely obscuring the underlying media content).
In other examples, the result of a query can be displayed within speech input interface 484 or in a different interface.
As shown, media content interface 510 can be a larger size than speech input interface 484. In one example, speech input interface 484 can be of a smaller first size to accommodate speech input information, while media content interface 510 can be of a larger second size to accommodate query results, which can include text, still images, and moving images. In this manner, interfaces for conveying virtual assistant information can scale in size according to the content that is to be conveyed, thereby limiting screen real estate intrusion (e.g., minimally blocking other content, such as video 480).
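As an illustrative (hypothetical) sketch, sizing an overlay to the content it must convey could be as simple as the following; the pixel values and function name are assumptions for demonstration only:

```python
# Hypothetical sketch of sizing an overlay interface to its content,
# so results use only as much of the screen as they need.
def choose_interface_height(has_video_results: bool, text_lines: int,
                            screen_height: int = 1080) -> int:
    """Return an overlay height in pixels: small for speech feedback only,
    larger when textual or video results must fit."""
    if has_video_results:
        return screen_height // 3          # room for thumbnails and captions
    if text_lines > 0:
        return min(screen_height // 3, 80 + 30 * text_lines)
    return 80                              # slim bar for listening/confirmation only


print(choose_interface_height(has_video_results=False, text_lines=0))   # 80
print(choose_interface_height(has_video_results=True, text_lines=4))    # 360
```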
As illustrated, media content interface 510 can include (as a result of a virtual assistant query) selectable video links 512, selectable text links 514, and additional content link 513. In some examples, links can be selected by navigating focus, a cursor, or the like to a particular element and selecting it using a remote control (e.g., remote control 106). In other examples, links can be selected using voice commands to the virtual assistant (e.g., watch that soccer game, show details about the basketball game, etc.). Selectable video links 512 can include still or moving images and can be selectable to cause playback of the associated video. In one example, selectable video link 512 can include a playing video of the associated video content. In another example, selectable video link 512 can include a live feed of a television channel. For example, selectable video link 512 can include a live feed of a soccer game on a sports channel as a result of a virtual assistant query about sporting events currently on television. Selectable video link 512 can also include any other video, animation, image, or the like (e.g., a triangular play symbol). Moreover, link 512 can link to any type of media content, such as a movie, television show, sporting event, music, or the like.
Selectable text links 514 can include textual content associated with selectable video links 512 or can include textual representations of results of a virtual assistant query. In one example, selectable text links 514 can include a description of media resulting from a virtual assistant query. For instance, selectable text link 514 can include the name of a television program, title of a movie, description of a sporting event, television channel name or number, or the like. In one example, selection of text link 514 can cause playback of the associated media content. In another example, selection of text link 514 can provide additional detailed information about the media content or other virtual assistant query result. Additional content link 513 can link to and cause display of additional results of a virtual assistant query.
In other examples, media content interface 510 can include a paraphrase of a query in addition to media content results. For example, a paraphrase of the user's query can be displayed above the media content results (above selectable video links 512 and selectable text links 514).
In some examples, after displaying any interface, including interface 510, a user can initiate capture of additional speech input with a new query (that may or may not be related to previous queries). User queries can include commands to act on interface elements, such as a command to select a video link 512. In another example, user speech can include a query associated with displayed content, such as displayed menu information, a playing video (e.g., video 480), or the like. A response can be determined for such a query based on the information shown (e.g., displayed text) and/or metadata associated with displayed content (e.g., metadata associated with a playing video). For example, a user can ask about a media result shown in an interface (e.g., interface 510), and metadata associated with that media can be searched to provide an answer or result. Such an answer or result can then be provided in another interface or within the same interface (e.g., in any of the interfaces discussed herein).
As noted above, in one example, additional detailed information about media content can be displayed in response to selection of a text link 514.
In one example, detail interface 618 can include selectable video link 620 (or another link to play media content), which can include a larger version of a corresponding selectable video link 512. As such, selectable video link 620 can include still or moving images and can be selectable to cause playback of the associated video. Selectable video link 620 can include a playing video of the associated video content, a live feed of a television channel (e.g., a live feed of a soccer game on a sports channel), or the like. Selectable video link 620 can also include any other video, animation, image, or the like (e.g., a triangular play symbol).
As noted above, a video can be played in response to selection of a video link, such as video link 620 or video links 512.
As discussed above, a virtual assistant can be triggered to listen for speech input containing a command or query (or to commence recording of speech input for subsequent or real-time processing). Listening can be triggered in any of the ways described previously, such as a user pressing a physical button on remote control 106 or user device 102, pressing a virtual button on user device 102, uttering a trigger phrase recognizable by an always-listening device (e.g., uttering “Hey Assistant”), performing a gesture detectable by a sensor, or pressing and holding a button while speaking a query or command and releasing the button when finished.
In response to receiving an indication to listen for speech input, speech input interface 836 can be displayed over menu 830.
In other examples, speech transcription can be performed in real-time as a user speaks. As words are transcribed, they can be displayed in speech input interface 836. For example, the words can be displayed alongside a larger version of listening confirmation 487 discussed above. After the user finishes speaking, command receipt confirmation 838 can be displayed briefly before executing the tasks associated with the user's command.
Moreover, in other examples, command receipt confirmation 838 can convey information about received and understood commands. For example, for a simple request to tune to a particular channel, a logo or number associated with the channel can briefly be displayed as command receipt confirmation 838 (e.g., for a few seconds) as the channel is tuned. In another example, for a request to select a displayed menu item (e.g., one of media options 832), an image associated with the selected menu item can be displayed as command receipt confirmation 838. Command receipt confirmation 838 can thus be used to convey command-specific information.
In some examples, speech input interface 836 can be hidden after receipt of a user query or command. For example, speech input interface 836 can be animated as sliding downward until it is out of view of the bottom of display 112. Speech input interface 836 can be hidden in instances where further information need not be displayed to the user. For example, for common or straightforward commands (e.g., change to channel ten, change to the sports channel, play that movie, etc.), speech input interface 836 can be hidden immediately after confirming command receipt, and the associated task or tasks can be performed immediately.
In other examples, the result of a query can be displayed within speech input interface 836 or in a different interface.
In one example, the clip addressing the user's query can include a time-cued portion of previously-aired content (that may be available from a recording or from a streaming service). The virtual assistant can, in one example, identify such content based on the user intent associated with the speech input and by searching detailed information about available media content (e.g., including metadata for recorded programs along with detailed timing information or detailed information about streaming content). In some examples, a user may not have access to or may not have a subscription for certain content. In such instances, content can be offered for purchase, such as via purchase link 948. The cost of the content can be automatically withdrawn from a user account or charged to a user account upon selection of purchase link 948 or video link 946.
Referring again to process 1000, speech input can first be received from a user, and media content can be determined based on the speech input, as in the examples discussed above.
At block 1006, a first user interface of a first size with selectable media links can be displayed. For example, media content interface 510 with selectable video links 512 and selectable text links 514 can be displayed on display 112.
At block 1008, a selection of one of the links can be received. For example, selection of one of links 512 and/or links 514 can be received. At block 1010, a second user interface of a larger second size with media content associated with the selection can be displayed. For example, detail interface 618 with selectable video link 620 and detailed media information 622 can be displayed.
In another example, a larger interface can be displayed over a control menu than over background video content. For example, speech input interface 836, displayed over menu 830, can be larger than speech input interface 484 displayed over background video.
In one example, a user request to play content via television set-top box 104 (e.g., on display 112 and speakers 111) can include an ambiguous reference to something shown on user device 102. Transcribed user speech 1258, for example, includes a reference to “that” soccer game (“Put on that soccer game.”). The particular soccer game desired can be unclear from the speech input alone. In some examples, however, the content shown on user device 102 can be used to disambiguate user requests and determine user intent. In one example, content shown on user device 102 prior to the user making the request (e.g., prior to interface 1254 appearing on touchscreen 246) can be used to determine user intent (as can content appearing within interface 1254, such as previous queries and results). In the illustrated example, the content shown in interface 1150 can be used to identify the particular soccer game the user intends, and the user intent can include playing that game via television set-top box 104.
In other examples, a user can reference television programs shown in interface 1150 in a variety of other ways (e.g., the show on channel eight, the news, the drama show, the advertisement, the first show, etc.), and user intent can similarly be determined based on displayed content. It should be appreciated that metadata associated with displayed content (e.g., TV program descriptions), fuzzy matching techniques, synonym matching, and the like can further be used in conjunction with displayed content to determine user intent. For example, the term “advertisement” can be matched to the description “paid programming” (e.g., using synonyms and/or fuzzy matching techniques) to determine user intent from a request to show “the advertisement.” Likewise, the description of a particular TV program can be analyzed in determining user intent. For example, the term “law” could be identified in the detailed description of a courtroom drama, and the user intent can be determined from a user request to watch the “law” show based on the detailed description associated with the content shown in interface 1150. Displayed content and data associated with it can thus be used to disambiguate user requests and determine user intent.
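A rough, non-limiting sketch of such synonym and fuzzy matching over displayed listings follows; the synonym table, listings, and scoring heuristic are hypothetical, and only the standard-library difflib module is used:

```python
# Illustrative sketch of matching a spoken reference ("the advertisement") against
# displayed program listings using synonyms and fuzzy string matching.
from difflib import SequenceMatcher

SYNONYMS = {"advertisement": ["paid programming", "infomercial"],
            "law": ["courtroom", "legal"]}

listings = [
    {"channel": 8, "title": "Evening News", "description": "local news and weather"},
    {"channel": 9, "title": "Paid Programming", "description": "paid programming"},
    {"channel": 10, "title": "Verdict", "description": "courtroom drama about a young lawyer"},
]


def score(reference: str, listing: dict) -> float:
    """Score a listing against the spoken reference, expanding synonyms first."""
    terms = [reference] + SYNONYMS.get(reference, [])
    text = f"{listing['title']} {listing['description']}".lower()
    return max(SequenceMatcher(None, term, text).ratio() +
               (1.0 if term in text else 0.0) for term in terms)


def resolve_listing(reference: str) -> dict:
    return max(listings, key=lambda item: score(reference.lower(), item))


print(resolve_listing("advertisement")["channel"])   # -> 9
print(resolve_listing("law")["channel"])              # -> 10
```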
In one example, a user request to play media content or display media via television set-top box 104 (e.g., on display 112 and speakers 111) can include an ambiguous reference to something shown on user device 102. Transcribed user speech 1468, for example, includes a reference to “that” video (“Show that video.”). The particular video referenced can be unclear from the speech input alone. In some examples, however, the content shown on user device 102 can be used to disambiguate user requests and determine user intent. In one example, content shown on user device 102 prior to the user making the request (e.g., prior to interface 1254 appearing on touchscreen 246) can be used to determine user intent (as can content appearing within interface 1254, such as previous queries and results). In the example of user speech 1468, the content shown in interface 1360 can be used to identify the particular video the user intends, and that video can then be shown via television set-top box 104.
In another example, transcribed user speech 1470 includes a reference to “that” album (“Play a slideshow of that album.”). The particular album referenced can be unclear from the speech input alone. The content shown on user device 102 can again be used to disambiguate the user request. In particular, the content shown in interface 1360 can be used to identify the particular album the user intends, and the user intent can include playing a slideshow of the photos in that album via television set-top box 104.
In yet another example, transcribed user speech 1472 includes a reference to the “last” photo (“Display the last photo on the kitchen television.”). The particular photo referenced can be unclear from the speech input alone. The content shown on user device 102 can again be used to disambiguate the user request. In particular, the content shown in interface 1360 can be used to identify the most recent photo appearing in that interface (e.g., based on associated timestamps), and that photo can be displayed on the referenced television.
In other examples, a user can reference media content shown in interface 1360 in a variety of other ways (e.g., the last couple of photos, all of the videos, all of the photos, the graduation album, the graduation video, the photo from June 21st, etc.), and user intent can similarly be determined based on displayed content. It should be appreciated that metadata associated with displayed content (e.g., timestamps, location information, titles, descriptions, etc.), fuzzy matching techniques, synonym matching, and the like can further be used in conjunction with displayed content to determine user intent. Displayed content and data associated with it can thus be used to disambiguate user requests and determine user intent.
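The following hypothetical sketch shows one way references such as “the last photo” or “the photo from June 21st” could be resolved against metadata for displayed items; the data and matching rules are illustrative assumptions only:

```python
# Hypothetical sketch of resolving references like "the last photo" or
# "the photo from June 21st" against metadata for displayed media items.
from datetime import date

displayed_items = [
    {"kind": "photo", "title": "Graduation 1", "taken": date(2014, 6, 21)},
    {"kind": "photo", "title": "Graduation 2", "taken": date(2014, 6, 21)},
    {"kind": "video", "title": "Graduation video", "taken": date(2014, 6, 21)},
    {"kind": "photo", "title": "Beach day", "taken": date(2014, 7, 4)},
]


def resolve_reference(reference: str) -> list[dict]:
    """Map a spoken reference to the displayed photo item(s) it describes."""
    photos = [item for item in displayed_items if item["kind"] == "photo"]
    if "last" in reference:
        return [max(photos, key=lambda item: item["taken"])]
    if "june 21" in reference:
        return [item for item in photos if item["taken"] == date(2014, 6, 21)]
    if "all" in reference:
        return photos
    return []


print([i["title"] for i in resolve_reference("the last photo")])          # ['Beach day']
print([i["title"] for i in resolve_reference("the photo from june 21st")])
```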
It should be understood that any type of displayed content in any application interface of any application can be used in determining user intent. For example, images displayed on a webpage in an Internet browser application can be referenced in speech input, and the displayed webpage content can be analyzed to identify the desired images. Similarly, a music track in a list of music in a music application can be referenced in speech input by title, genre, artist, band name, or the like, and the displayed content in the music application (and associated metadata in some examples) can be used to determine user intent from the speech input. As discussed above, the determined user intent can then be used to cause media display or playback via another device, such as via television set-top box 104.
In some examples, user identification, user authentication, and/or device authentication can be employed to determine whether media control can be permitted, determine media content available for display, determine access permissions, and the like. For example, it can be determined whether a particular user device (e.g., user device 102) is authorized to control media on, for example, television set-top box 104. A user device can be authorized based on a registration, pairing, trust determination, passcode, security question, system setup, or the like. In response to determining that a particular user device is authorized, attempts to control television set-top box 104 can be permitted (e.g., media content can be played in response to determining that a requesting device is authorized to control media). In contrast, media control commands or requests from unauthorized devices can be ignored, and/or users of such devices can be prompted to register their devices for use in controlling a particular television set-top box 104.
In another example, a particular user can be identified, and personal data associated with the user can be used to determine user intent of requests. For example, a user can be identified based on speech input, such as by voice recognition using a voiceprint of the user. In some examples, users can utter a particular phrase that is analyzed for voice recognition. In other examples, speech input requests directed to the virtual assistant can be analyzed using voice recognition to identify the speaker. A user can also be identified based on the source of the speech input sample (e.g., on a user's personal device 102). A user can also be identified based on passwords, passcodes, menu selection, or the like. Speech input received from the user can then be interpreted based on personal data of the identified user. For example, user intent of speech input can be determined based on previous requests from the user, media content owned by the user, media content stored on the user's device, user preferences, user settings, user demographics (e.g., languages spoken, etc.), user profile information, user payment methods, or a variety of other personal information associated with a particular identified user. For instance, speech input referencing a favorites list or the like can be disambiguated based on personal data, and the user's personal favorites list can be identified. Speech input referencing “my” photos, “my” videos, “my” shows, or the like can likewise be disambiguated based on user identification to correctly identify photos, videos, and shows associated with the identified user (e.g., photos stored on a personal user device or the like). Similarly, speech input requesting purchase of content can be disambiguated to determine that the identified user's payment method should be charged for the purchase (as opposed to another user's payment method).
In some examples, user authentication can be used to determine whether a user is allowed to access media content, purchase media content, or the like. For example, voice recognition can be used to verify the identity of a particular user (e.g., using their voiceprint) to permit the user to make purchases using the user's payment method. Similarly, passwords or the like can be used to authenticate the user to permit purchases. In another example, voice recognition can be used to verify the identity of a particular user to determine whether the user is allowed to watch a particular program (e.g., a program having a particular parental guideline rating, a movie having a particular age suitability rating, or the like). For instance, a child's request for a particular program can be denied based on voice recognition indicating that the requester is not an authorized user able to view such content (e.g., a parent). In other examples, voice recognition can be used to determine whether users have access to particular subscription content (e.g., restricting access to premium channel content based on voice recognition). In some examples, users can utter a particular phrase that is analyzed for voice recognition. In other examples, speech input requests directed to the virtual assistant can be analyzed using voice recognition to identify the speaker. Certain media content can thus be played in response to first determining that a user is authorized in any of a variety of ways.
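As a purely illustrative sketch (the voiceprint lookup, rating order, and per-user limits are hypothetical stand-ins for actual voice recognition and parental controls), such rating-based gating could look like the following:

```python
# Illustrative sketch of gating playback on speaker identity and a content rating.
RATING_ORDER = ["G", "PG", "PG-13", "R"]

USER_MAX_RATING = {"parent": "R", "child": "PG"}      # per-user viewing limits


def identify_speaker(voiceprint_id: str) -> str:
    """Stand-in for voice recognition: map a matched voiceprint to a user."""
    return {"vp-001": "parent", "vp-002": "child"}.get(voiceprint_id, "guest")


def may_play(voiceprint_id: str, content_rating: str) -> bool:
    """Allow playback only if the content rating is within the speaker's limit."""
    user = identify_speaker(voiceprint_id)
    allowed = USER_MAX_RATING.get(user, "G")
    return RATING_ORDER.index(content_rating) <= RATING_ORDER.index(allowed)


print(may_play("vp-002", "R"))    # child requesting R-rated content -> False
print(may_play("vp-001", "R"))    # parent -> True
```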
In some examples, a determination can be made as to whether results of a virtual assistant query should be displayed on user device 102 directly or on display 112 associated with television set-top box 104. In one example, in response to determining that the user intent of a query includes a request for information, an informational response can be displayed on user device 102. In another example, in response to determining that the user intent of a query includes a request to play media content, media content responsive to the query can be played via television set-top box 104.
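A minimal sketch of this routing decision is shown below; the keyword-based intent check is a hypothetical stand-in for full natural language processing:

```python
# Minimal sketch of routing a result to the requesting phone or to the television,
# depending on whether the intent is informational or a media-playback request.
def route_response(utterance: str) -> str:
    lowered = utterance.lower()
    media_verbs = ("play", "put on", "watch", "show the game", "change to")
    if any(verb in lowered for verb in media_verbs):
        return "set_top_box"      # play responsive media on the television
    return "user_device"          # display an informational answer on the phone


print(route_response("Who is winning the soccer game?"))   # user_device
print(route_response("Put on the game"))                   # set_top_box
```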
The second query in interface 1254, however, includes a media request. In particular, transcribed user speech 1578 requests changing displayed media content to “the game.” The user intent of transcribed user speech 1578 can be determined based on displayed content (e.g., to identify which game the user desires), such as a game listed in interface 510 of
In some examples, a determination can be made as to whether to display media on device 102 or on display 112 based on media result format, user preference, default settings, an express command in the request itself, or the like. For example, the format of a media result to a query can be used to determine on which device to display the media result by default (e.g., without specific instructions). A television program can be better suited for display on a television, a large format video can be better suited for display on a television, thumbnail photos can be better suited for display on a user device, small format web videos can be better suited for display on a user device, and various other media formats can be better suited for display on either a relatively large television screen or a relatively small user device display. Thus, in response to a determination that media content should be displayed on a particular display (e.g., based on media format), the media content can be displayed on that particular display by default.
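The format-based routing described above might be sketched, purely illustratively, as a lookup with explicit-command and user-preference overrides; the format names and default assignments below are assumptions.

    # Assumed mapping from media format to a default display device.
    FORMAT_DEFAULTS = {
        "television_program": "television",
        "large_format_video": "television",
        "thumbnail_photo": "user_device",
        "web_video": "user_device",
    }

    def default_display(media_format, explicit_target=None, user_preference=None):
        # An express command in the request overrides everything; a stored user
        # preference overrides the format-based default; otherwise fall back to
        # the format table (or the television if the format is unknown).
        if explicit_target:
            return explicit_target
        if user_preference:
            return user_preference
        return FORMAT_DEFAULTS.get(media_format, "television")

    print(default_display("thumbnail_photo"))                          # user_device
    print(default_display("web_video", explicit_target="television"))  # television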
In the second query, however, transcribed user speech 1682 includes a request to show pictures of players of a team (e.g., pictures of “Team Alpha”). As in the examples discussed above, the user intent of transcribed user speech 1682 can be determined. The user intent of transcribed user speech 1682 can include performing a search (e.g., a web search) for pictures associated with “Team Alpha,” and displaying the resulting pictures. In response to determining that the user intent includes a request to display media that may be presented in thumbnail format, or media associated with a web search, or other non-specific media without a particular format, the system can automatically determine to display the desired media result on touchscreen 246 in interface 1254 of user device 102 (as opposed to displaying the resulting pictures on display 112 via television set-top box 104). For example, as shown, thumbnail photos 1684 can be displayed within interface 1254 on user device 102 in response to the user's query. The virtual assistant system can thus cause media of a certain format, or media that might be presented in a certain format (e.g., in a group of thumbnails), to be displayed on user device 102 by default.
It should be appreciated that, in some examples, the soccer game referenced in user speech 1680 can be shown on user device 102, and photos 1684 can be shown on display 112 via television set-top box 104. The default device for display, however, can be determined automatically based on media format, thereby simplifying media commands for the user. In other examples, the default device for displaying requested media content can be determined based on user preferences, default settings, the device used most recently to display content, voice recognition to identify a user and a device associated with that user, or the like. For example, a user can set a preference or a default configuration can be set to display certain types of content (e.g., videos, slideshows, television programs, etc.) on display 112 via television set-top box 104 and other types of content (e.g., thumbnails, photos, web videos, etc.) on touchscreen 246 of user device 102. Similarly, preferences or default configurations can be set to respond to certain queries by displaying content on one device or the other. In another example, all content can be displayed on user device 102 unless the user instructs otherwise.
In still other examples, a user query can include a command to display content on a particular display. For example, user speech 1472 of
In one example, proximity of devices can be used to determine to which of multiple set-top boxes to send commands (or on which display to show requested media content). Proximity can be determined between user device 102 or remote control 106 and each of multiple set-top boxes. Issued commands can then be sent to the nearest set-top box (or requested media content can be displayed on the nearest display). Proximity can be determined (or at least approximated) in any of a variety of ways, such as time-of-flight measurements (e.g., using radio frequency), Bluetooth LE, electronic ping signals, proximity sensors, sound travel measurements, or the like. Measured or approximated distances can then be compared, and the command can be issued to the device with the shortest distance (e.g., the nearest set-top box).
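As one hedged illustration of the distance comparison (using a received-signal-strength approximation in place of true time-of-flight measurement, with assumed constants), the nearest set-top box could be chosen as follows.

    def estimate_distance_m(rssi_dbm, tx_power_dbm=-59, path_loss_exp=2.0):
        # Rough log-distance path-loss estimate from a Bluetooth LE RSSI reading;
        # tx_power_dbm is the assumed RSSI at one meter.
        return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

    def nearest_set_top_box(rssi_by_box):
        # rssi_by_box: mapping of set-top box identifier -> measured RSSI in dBm.
        distances = {box: estimate_distance_m(rssi) for box, rssi in rssi_by_box.items()}
        return min(distances, key=distances.get)

    print(nearest_set_top_box({"living_room": -48, "bedroom": -71}))    # living_room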
It should be understood that a user can specify a different device for a command, in some cases overriding proximity. For example, a list of available display devices can be displayed on user device 102 (e.g., listing first display 1786 and second display 1788 by setup name, designated room, or the like, or listing first set-top box 1792 and second set-top box 1794 by setup name, designated room, or the like). A user can select one of the devices from the list, and commands can then be sent to the selected device. Requests for media content issued at user device 102 can then be handled by displaying the desired media on the selected device. In other examples, users can speak the desired device as part of a spoken command (e.g., show the game on the kitchen television, change to the cartoon channel in the living room, etc.).
In still other examples, the default device for showing requested media content can be determined based on status information associated with a particular device. For example, it can be determined whether headphones (or a headset) are attached to user device 102. In response to determining that headphones are attached to user device 102 when a request to display media content is received, the requested content can be displayed on user device 102 by default (e.g., assuming the user is consuming content on user device 102 and not on a television). In response to determining that headphones are not attached to user device 102 when a request to display media content is received, the requested content can be displayed on either user device 102 or on a television according to any of the various determination methods discussed herein. Other device status information can similarly be used to determine whether requested media content should be displayed on user device 102 or a set-top box 104, such as ambient lighting around user device 102 or set-top box 104, proximity of other devices to user device 102 or set-top box 104, orientation of user device 102 (e.g., landscape orientation can be more likely to indicate desired viewing on user device 102), display status of set-top box 104 (e.g., in a sleep mode), time since the last interaction on a particular device, or any of a variety of other status indicators for user device 102 and/or set-top box 104.
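A simplified sketch of combining such status cues is shown below; the particular cues, their precedence, and the fallback to a format-based default are assumptions made for illustration.

    def choose_target(headphones_attached, device_orientation, tv_asleep,
                      format_default="television"):
        # Headphones attached suggests the user is consuming content on the
        # handheld device, so default there; otherwise consult other status cues
        # before falling back to the format-based default.
        if headphones_attached:
            return "user_device"
        if tv_asleep and device_orientation == "landscape":
            return "user_device"
        return format_default

    print(choose_target(True, "portrait", False))     # user_device
    print(choose_target(False, "portrait", False))    # television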
At block 1804, user intent can be determined from the speech input based on content displayed on the first display. For example, content such as television programs 1152 in interface 1150 of
Referring again to process 1800 of
Referring again to process 1800 of
In some examples, a determination can be made as to whether responses to speech input directed to a virtual assistant should be displayed on a first display associated with a first device (e.g., user device 102) or a second display associated with a second device (e.g., television set-top box 104). For example, as discussed above with reference to
In some examples, as content shown on user device 102 can be used to inform interpretations of speech input as discussed above, content shown on display 112 can likewise be used to inform interpretations of speech input. In particular, content shown on a display associated with television set-top box 104 can be used along with metadata associated with that content to determine user intent from speech input, disambiguate user queries, respond to content-related queries, or the like.
In one example, a user query directed to a virtual assistant can include an ambiguous reference to something shown on display 112. Transcription 1916, for example, includes a reference to “those” actresses (“Who are those actresses?”). The particular actresses the user is asking about can be unclear from the speech input alone. In some examples, however, the content shown on display 112 and associated metadata can be used to disambiguate user requests and determine user intent. In the illustrated example, the content shown on display 112 can be used to determine the user intent from the reference to “those” actresses. In one example, television set-top box 104 can identify playing content along with details associated with the content. In this instance, television set-top box 104 can identify the title of video 480 along with a variety of descriptive content. In other examples, a television show, sporting event, or other content can be shown that can be used in conjunction with associated metadata to determine user intent. In addition, in any of the various examples discussed herein, speech recognition results and intent determination can weight terms associated with displayed content higher than alternatives. For example, actor names for on-screen characters can be weighted higher while those actors appear on screen (or while a show is playing in which they appear), which can provide for accurate speech recognition and intent determination of likely user requests associated with displayed content.
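One illustrative way to weight displayed-content terms during recognition is to rescore the recognizer's hypotheses with a bonus for terms tied to what is on screen; the hypothesis format and the 0.2 bonus value below are assumptions.

    def rescore_hypotheses(hypotheses, on_screen_terms, boost=0.2):
        # hypotheses: list of (text, acoustic_score) pairs from the recognizer.
        # Each hypothesis containing a term associated with displayed content
        # (e.g., the name of an actor currently on screen) receives a bonus.
        rescored = []
        for text, score in hypotheses:
            bonus = sum(boost for term in on_screen_terms if term.lower() in text.lower())
            rescored.append((text, score + bonus))
        return max(rescored, key=lambda pair: pair[1])

    hyps = [("who is jennifer jones", 0.55), ("who is jennifer june", 0.60)]
    print(rescore_hypotheses(hyps, ["Jennifer Jones"]))
    # ('who is jennifer jones', 0.75) -- the on-screen name outranks the mishearing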
In one example, a character and/or actor list associated with video 480 can be used to identify all or the most prominent actresses appearing in video 480, which might include actresses 1910, 1912, and 1914. The identified actresses can be returned as a possible result (including fewer or additional actresses if the metadata resolution is coarse). In another example, however, metadata associated with video 480 can include an identification of which actors and actresses appear on screen at a given time, and the actresses appearing at the time of the query can be determined from that metadata (e.g., specifically identifying actresses 1910, 1912, and 1914). In yet another example, a facial recognition application can be used to identify actresses 1910, 1912, and 1914 from the images shown on display 112. In still other examples, various other metadata associated with video 480 and various other recognition approaches can be used to identify the user's likely intent in referring to “those” actresses.
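For instance, if the metadata included per-performer appearance intervals, the performers on screen at the moment the query was initiated could be looked up as in the sketch below; the interval values and the fourth performer are hypothetical.

    appearances = [
        # (performer, start_seconds, end_seconds) -- hypothetical time-indexed metadata
        ("Actress 1910", 1200, 1300),
        ("Actress 1912", 1250, 1320),
        ("Actress 1914", 1260, 1310),
        ("Other Actor",   900, 1000),
    ]

    def on_screen_at(query_timestamp, appearance_metadata):
        # Return the performers whose appearance intervals contain the moment the
        # user initiated the query.
        return [name for name, start, end in appearance_metadata
                if start <= query_timestamp <= end]

    print(on_screen_at(1275, appearances))
    # ['Actress 1910', 'Actress 1912', 'Actress 1914']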
In some examples, the content shown on display 112 can change during submission of a query and determination of a response. As such, a viewing history of media content can be used to determine user intent and determine the response to a query. For example, should video 480 move to another view (e.g., with other characters) before a response to the query is generated, the result of the query can be determined based on the user's view at the time the query was spoken (e.g., the characters shown on screen at the time the user initiated the query). In some instances, a user might pause playing media to issue a query, and the content shown when paused can be used with associated metadata to determine user intent and a response to the query.
Given the determined user intent, a result of the query can be provided to the user.
As with other interfaces displayed on display 112, assistant response interface 2018 can occupy a minimal amount of screen real estate while providing sufficient space to convey the desired information. In some examples, as with other text displayed in interfaces on display 112, assistant response 2020 can be scrolled up into the position shown in
In some examples, a user query directed to a virtual assistant can include an ambiguous reference using the name of a character, the name of an actor, the name of a program, the name of a player, or the like. Without the context of the content shown on display 112 and its associated metadata, such references may be difficult to resolve accurately. Transcription 2122, for example, includes a reference to a character named “Blanche” from video 480. The particular actress or other individual the user is asking about can be unclear from the speech input alone. In some examples, however, the content shown on display 112 and associated metadata can be used to disambiguate user requests and determine user intent. In the illustrated example, the content shown on display 112 and associated metadata can be used to determine the user intent from the character name “Blanche.” In this instance, a character list associated with video 480 can be used to determine that “Blanche” likely refers to the character “Blanche” in video 480. In another example, detailed metadata and/or facial recognition can be used to determine that a character with the name “Blanche” appears on the screen (or appeared on the screen at the initiation of the user's query), making the actress associated with that character the likeliest subject of the user's query. For example, it can be determined that characters 1910, 1912, and 1914 appear on display 112 (or appeared on display 112 at the initiation of the user's query), and their associated character names can then be referenced to determine the user intent of the query referencing the character Blanche. An actor list can then be used to identify the actress who plays Blanche, and a search can be conducted to identify other media in which the identified actress appears.
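A minimal sketch of that character-to-performer resolution, assuming simple cast-list and filmography lookups, might look as follows; entries other than the names and titles appearing in this description are hypothetical.

    # Hypothetical cast list and filmography index for the displayed program.
    cast = {"Blanche": "Jennifer Jones", "Julia": "Actress X", "Melissa": "Actress Y"}
    filmography = {
        "Jennifer Jones": ["Movie A", "Movie B"],
        "Actress X": ["Movie C"],
    }

    def other_media_for_character(character_name, cast_list, filmography_index):
        # Map the spoken character name to the performer via the cast list for the
        # displayed program, then look up other media featuring that performer.
        performer = cast_list.get(character_name)
        return filmography_index.get(performer, []) if performer else []

    print(other_media_for_character("Blanche", cast, filmography))    # ['Movie A', 'Movie B']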
Given the determined user intent (e.g., resolution of the character reference “Blanche”) and the determination of the result of the query (e.g., other media associated with the actress who plays “Blanche”), a response can be provided to the user.
Assistant response interface 2224 can also include selectable video links 2228. In some examples, various types of media content can be provided as results to a virtual assistant query, including movies (e.g., Movie A and Movie B of interface 2224). Media content displayed as a result of a query can include media that may be available to the user for consumption (for free, for purchase, or as part of a subscription). A user can select displayed media to view or consume the resulting content. For instance, a user can select one of selectable video links 2228 (e.g., using a remote control, voice command, or the like) to watch one of the other movies in which actress Jennifer Jones appears. In response to selection of one of selectable video links 2228, the video associated with the selection can be played, replacing video 480 on display 112. Thus, displayed media content and associated metadata can be used to determine user intent from speech input, and, in some examples, playable media can be provided as a result.
It should be understood that a user can reference actors, players, characters, locations, teams, sporting event details, movie subjects, or a variety of other information associated with displayed content in forming queries, and the virtual assistant system can similarly disambiguate such requests and determine user intent based on displayed content and associated metadata. Likewise, it should be understood that, in some examples, results can include media suggestions associated with the query, such as a movie, television show, or sporting event associated with a person who is the subject of a query (whether or not the user specifically requests such media content).
Moreover, in some examples, user queries can include requests for information associated with media content itself, such as queries about a character, an episode, a movie plot, a previous scene, or the like. As with the examples discussed above, displayed content and associated metadata can be used to determine user intent from such queries and determine a response. For instance, a user might request a description of a character (e.g., “What does Blanche do in this movie?”). The virtual assistant system can then identify from metadata associated with displayed content the requested information about the character, such as a character description or role (e.g., “Blanche is one of a group of lawyers and is known as a troublemaker in Hartford.”). Similarly, a user might request an episode synopsis (e.g., “What happened in the last episode?”), and the virtual assistant system can search for and provide a description of the episode.
In some examples, content displayed on display 112 can include menu content, and such menu content can similarly be used to determine user intent of speech input and responses to user queries.
In one example, a user request to play content can include an ambiguous reference to something shown on display 112 in menu 830. For example, a user viewing menu 830 can request to watch “that” soccer game, “that” basketball game, the vacuum advertisement, the law show, or the like. The particular program desired can be unclear from the speech input alone. In some examples, however, the content shown on display 112 can be used to disambiguate user requests and determine user intent. In the illustrated example, the media options in menu 830 (along with metadata associated with the media options in some examples) can be used to determine the user intent from commands including ambiguous references. For example, “that” soccer game can be resolved to the soccer game on the sports channel. “That” basketball game can be resolved to the basketball game on the college sports channel. The vacuum advertisement can be resolved to the paid programming show (e.g., based on metadata associated with the show describing a vacuum). The law show can be resolved to the courtroom drama based on metadata associated with the show and/or synonym matching, fuzzy matching, or other matching techniques. The appearance of the various media options 832 in menu 830 on display 112 can thus be used to disambiguate user requests.
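One way such matching could be sketched, combining simple keyword and synonym hits with a light fuzzy-similarity score, is shown here; the menu items, synonym table, and weights are all assumptions made for illustration.

    from difflib import SequenceMatcher

    menu_items = [
        {"title": "Soccer: League Final", "description": "Live soccer on the sports channel"},
        {"title": "College Hoops Tonight", "description": "College basketball game"},
        {"title": "Paid Programming", "description": "Infomercial for a robot vacuum"},
        {"title": "Courtroom Stories", "description": "Legal drama set in Hartford"},
    ]

    # Assumed synonym table used to bridge "law show", "advertisement", etc.
    SYNONYMS = {
        "law": ["legal", "courtroom", "law"],
        "show": ["drama", "show"],
        "advertisement": ["infomercial", "paid programming"],
    }

    def score(request, item):
        text = (item["title"] + " " + item["description"]).lower()
        fuzzy = SequenceMatcher(None, request.lower(), text).ratio()
        bonus = 0.0
        for word in request.lower().split():
            for candidate in SYNONYMS.get(word, [word]):
                if candidate in text:
                    bonus += 0.3
        return 0.5 * fuzzy + bonus            # keyword hits weighted above fuzziness

    def resolve_reference(request, items):
        # Pick the displayed menu item whose title/metadata best matches the request.
        return max(items, key=lambda item: score(request, item))["title"]

    print(resolve_reference("the law show", menu_items))              # Courtroom Stories
    print(resolve_reference("the vacuum advertisement", menu_items))  # Paid Programming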
In some examples, displayed menus can be navigated with a cursor, joystick, arrows, buttons, gestures, or the like. In such instances, a focus can be shown for a selected item. For example, a selected item can be shown in bold, underlined, outlined with a border, in larger size than other menu items, with a shadow, with a reflection, with a glow, and/or with any other features to emphasize which menu item is selected and has focus. For example, selected media option 2330 in
In some examples, a request to play content or select a menu item can include an ambiguous reference to a menu item that has focus. For example, a user viewing menu 830 of
Just as a viewing history of media content can be used to disambiguate a user request (e.g., content that was displayed when the user initiated the request but has since passed), previously displayed menu or search result content can similarly be used to disambiguate a later user request after the user has moved on, for example, to subsequent menu or search result content. For example,
In still other examples, various display cues shown in a menu or results list on display 112 can be used to disambiguate user requests and determine user intent.
In some examples, a request to play content or select a menu item can include an ambiguous reference to a menu item in a group of items (such as a category). For example, a user viewing category interface 2440 can request to play the soccer show (“Play the soccer show.”). The particular menu item or show that is desired can be unclear from the speech input alone. Moreover, the query can resolve to more than one show that is displayed on display 112. For example, the request for the soccer show might refer to either the soccer game listed in the TV programs category or the soccer movie listed in the movies category. The content shown on display 112—including display cues—can be used to disambiguate user requests and determine user intent. In particular, the fact that the movies category has focus in category interface 2440 can be used to identify the particular soccer show that is desired, which is likely the soccer movie given the focus on the movies category. A category of media (or any other grouping of media) having focus as shown on display 112 can thus be used in determining user intent from speech input. It should also be appreciated that users can make various other requests associated with categories, such as requesting display of certain categorical content (e.g., show me comedy movies, show me horror movies, etc.).
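A small sketch of focus-aware resolution is given below; the on-screen categories and titles are hypothetical, and the fallback behavior when the focused category has no match is an assumption.

    categories = {
        "TV Programs": ["Soccer Game (live)", "News Hour"],
        "Movies": ["Soccer Movie", "Space Drama"],
    }
    focused_category = "Movies"               # display cue: this category has focus

    def resolve_with_focus(request_keyword, categories_on_screen, focus):
        # Prefer a match inside the category that has focus; fall back to any
        # on-screen match if the focused category has none.
        focused_matches = [title for title in categories_on_screen[focus]
                           if request_keyword.lower() in title.lower()]
        if focused_matches:
            return focused_matches[0]
        all_titles = [t for titles in categories_on_screen.values() for t in titles]
        matches = [t for t in all_titles if request_keyword.lower() in t.lower()]
        return matches[0] if matches else None

    print(resolve_with_focus("soccer", categories, focused_category))   # Soccer Movie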
In other examples, a user can refer to menu or media items shown on display 112 in a variety of other ways, and user intent can similarly be determined based on displayed content. It should be appreciated that metadata associated with displayed content (e.g., TV program descriptions, movie descriptions, etc.), fuzzy matching techniques, synonym matching, and the like can further be used in conjunction with displayed content to determine user intent from speech input. User requests in a variety of forms—including natural language requests—can thus be accommodated and user intent can be determined according to the various examples discussed herein.
It should be understood that content displayed on display 112 can be used alone or in conjunction with content displayed on user device 102 or on a display associated with remote control 106 in determining user intent. Likewise, it should be understood that virtual assistant queries can be received at any of a variety of devices communicatively coupled to television set-top box 104, and content displayed on display 112 can be used to determine user intent regardless of which device receives the query. Results of queries can likewise be displayed on display 112 or on another display (e.g., on user device 102).
In addition, in any of the various examples discussed herein, the virtual assistant system can navigate menus and select menu options without requiring a user to specifically open menus and navigate to menu items. For example, a menu of options might appear after selecting media content or a menu button, such as selecting a movie option 2444 in
Referring again to process 2500 of
At block 2506, a result of the query can be displayed based on the determined user intent. For example, a result similar to assistant response 2020 in assistant response interface 2018 of
In some examples, virtual assistant query suggestions can be provided to a user to, for example, inform the user of available queries, suggest content that the user may enjoy, teach the user how to use the system, encourage the user to find additional media content for consumption, or the like. In some examples, query suggestions can include generic suggestions of possible commands (e.g., find comedies, show me the TV guide, search for action movies, turn on closed captioning, etc.). In other examples, query suggestions can include targeted suggestions related to displayed content (e.g., add this show to a watch list, share this show via social media, show me the soundtrack of this movie, show me the book that this guest is selling, show me the trailer for the movie that guest is plugging, etc.), user preferences (e.g., closed captioning use, etc.), user-owned content, content stored on a user's device, notifications, alerts, a viewing history of media content (e.g., recently displayed menu items, recently displayed scenes of a show, recent actor appearances, etc.), or the like. Suggestions can be displayed on any device, including on display 112 via television set-top box 104, on user device 102, or on a display associated with remote control 106. In addition, suggestions can be determined based on which devices are nearby and/or in communication with television set-top box 104 at a particular time (e.g., suggesting content from devices of the users in the room watching TV at a particular time). In other examples, suggestions can be determined based on a variety of other contextual information, including the time of day, crowd-sourced information (e.g., popular shows being watched at a given time), shows that are live (e.g., live sporting events), a viewing history of media content (e.g., the last several shows that were watched, a recently viewed set of search results, a recently viewed group of media options, etc.), or any of a variety of other contextual information.
Suggestions interface 2650 can be displayed over a moving image, such as video 480, or over any other background content (e.g., a menu, a still image, a paused video, etc.). As with other interfaces discussed herein, suggestions interface 2650 can be animated to slide up from the bottom of display 112, and can occupy a minimal amount of space while sufficiently conveying the desired information so as to limit interference with video 480 in the background. In other examples, a larger interface of suggestions can be provided when the background content is still (e.g., a paused video, a menu, an image, etc.).
In some examples, virtual assistant query suggestions can be determined based on displayed media content or a viewing history of media content (e.g., a movie, television show, sporting event, recently viewed show, recently viewed menu, recently viewed scene of a movie, recent scene of a playing television episode, etc.). For example,
In another example, an actor or actress appearing on display 112 can be identified (e.g., based on metadata and/or facial recognition), and query suggestions associated with that actor or actress can be provided. Such query suggestions can include role(s) played, acting awards, age, other media in which they appear, history, family members, relationships, or any of a variety of other details about an actor or actress. For example, character 1914 can be played by an actress named Whitney Davidson, and the actress's name Whitney Davidson can be used to formulate a query suggestion to identify other movies, television programs, or other media in which the actress Whitney Davidson appears (e.g., “What else is Whitney Davidson in?”).
In other examples, details about a show can be used to formulate query suggestions. An episode synopsis, plot summary, episode list, episode titles, series titles, or the like can be used to formulate query suggestions. For example, a suggestion can be provided to describe what happened in the last episode of a television program (e.g., “What happened in the last episode?”), to which the virtual assistant system can provide as a response an episode synopsis from the prior episode identified based on the episode currently shown on display 112 (and its associated metadata). In another example, a suggestion can be provided to set up a recording for the next episode, which can be accomplished by the system identifying the next episode based on the currently playing episode shown on display 112. In yet another example, a suggestion can be provided to get information about the current episode or show appearing on display 112, and the title of the show obtained from metadata can be used to formulate the query suggestion (e.g., “What is this episode of ‘Their Show’ about?” or “What is ‘Their Show’ about?”).
In another example, category, genre, rating, awards, descriptions, or the like associated with displayed content can be used to formulate query suggestions. For example, video 480 can correspond to a television program described as a comedy having female lead characters. A query suggestion can be formulated from this information to identify other shows with similar characteristics (e.g., “Find me other comedies with female leads.”). In other examples, suggestions can be determined based on user subscriptions, content available for playback (e.g., content on television set-top box 104, content on user device 102, content available for streaming, etc.), or the like. For example, potential query suggestions can be filtered based on whether informational or media results are available. Query suggestions that might not result in playable media content or informational answers can be excluded, and/or query suggestions with readily available informational answers or playable media content can be provided (or weighted more heavily in determining which suggestions to provide). Displayed content and associated metadata can thus be used in a variety of ways to determine query suggestions.
Answer interface 2862 can include informational answers and/or media results responsive to a selected query suggestion (or responsive to any other query). For example, in response to selected query suggestion 2756, assistant result 2860 can be determined and provided. In particular, in response to a request for a synopsis of a prior episode, the prior episode can be identified based on displayed content, and an associated description or synopsis can be identified and provided to the user. In the illustrated example, assistant result 2860 can describe a previous episode of the program corresponding to video 480 on display 112 (e.g., “In episode 203 of ‘Their Show,’ Blanche gets invited to a college psychology class as a guest speaker. Julia and Melissa show up unannounced and cause a stir.”). Informational answers and media results (e.g., selectable video links) can also be presented in any of the other ways discussed herein, or results can be presented in various other ways (e.g., speaking answers aloud, playing content immediately, showing an animation, displaying an image, etc.).
In another example, a notification or alert can be used to determine virtual assistant query suggestions.
Notifications or alerts can notify the user of a variety of information, such as available alternative media content (e.g., alternatives to what may be shown currently on display 112), available live television programs, newly downloaded media content, recently added subscription content, suggestions received from friends, receipt of media sent from another device, or the like. Notifications can also be personalized based on a household or an identified user watching media (e.g., identified based on user authentication using account selections, voice recognition, passwords, etc.). In one example, the system can interrupt a show and display a notification based on likely desired content, such as displaying notification 2964 for a user who, based on a user profile, favorite team(s), preferred sport(s), viewing history, and the like, can be likely to desire the content of the notification. For example, sporting event scores, game status, time remaining, and the like can be obtained from a sports data feed, news outlet, social media discussions, or the like, and can be used to identify possible alternative media content for notifying the user.
In other examples, popular media content (e.g., across many users) can be provided via alerts or notifications to suggest alternatives to currently viewed content (e.g., notifying a user that a popular show or a show in a genre the user likes just started or is otherwise available for viewing). In the illustrated example, the user might follow one or both of Team Zeta and Team Alpha (or might follow soccer or a particular sport, league, etc.). The system can determine that available live content matches the user's preferences (e.g., a game on another channel matches a user's preferences, the game has little time remaining, and the score is close). The system can then determine to alert the user via notification 2964 of the likely desired content. In some examples, a user can select notification 2964 (or a link within notification 2964) to switch to the suggested content (e.g., using a remote control button, cursor, spoken request, etc.).
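As a purely illustrative sketch of that decision, a notification could be triggered when a followed team or sport is in a close, nearly finished game; the time-remaining and score-margin thresholds below, along with the record layouts, are assumptions.

    def should_notify(event, user, max_minutes_left=5, max_margin=1):
        # event: live game status; user: stored viewing preferences.
        follows_team = bool(set(event["teams"]) & set(user["followed_teams"]))
        follows_sport = event["sport"] in user["followed_sports"]
        close_finish = (event["minutes_remaining"] <= max_minutes_left
                        and abs(event["score_margin"]) <= max_margin)
        return (follows_team or follows_sport) and close_finish

    game = {"teams": ["Team Zeta", "Team Alpha"], "sport": "soccer",
            "minutes_remaining": 4, "score_margin": 1}
    viewer = {"followed_teams": ["Team Zeta"], "followed_sports": ["soccer"]}
    print(should_notify(game, viewer))        # True -> display notification 2964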
Virtual assistant query suggestions can be determined based on notifications by analyzing notification content to identify relevant media related terms, names, titles, subjects, actions, or the like. The identified information can then be used to formulate appropriate virtual assistant query suggestions, such as notification-based suggestions 2966 based on notification 2964. For example, a notification about an exciting end of a live sporting event can be displayed. Should the user then request query suggestions, suggestions interface 2650 can be displayed, including query suggestions to view the sporting event, inquire about team statistics, or find content related to the notification (e.g., change to the Zeta/Alpha game, what are team Zeta's stats, what other soccer games are on, etc.). Based on the particular terms of interest identified in the notification, various other query suggestions can likewise be determined and provided to the user.
Virtual assistant query suggestions related to media content (e.g., for consumption via television set-top box 104) can also be determined from content on a user device, and suggestions can also be provided on a user device. In some examples, playable device content can be identified on user devices that are connected to or in communication with television set-top box 104.
With playable media 3068 identified, virtual assistant query suggestions can be determined and provided to the user.
Virtual assistant query suggestions provided on user device 102 can include suggestions based on a variety of source devices as well as general suggestions. For example, device-based suggestions 3174 can include query suggestions based on content stored on user device 102 (including content displayed on user device 102). Content-based suggestions 2652 can be based on content displayed on display 112 associated with television set-top box 104. General suggestions 3176 can include general suggestions that may not be associated with particular media content or a particular device with media content.
Device-based suggestions 3174 can be determined, for example, based on playable content identified on user device 102 (e.g., videos, music, photographs, game interfaces, application interfaces, etc.). In the illustrated example, device-based suggestions 3174 can be determined based on playable media 3068 shown in
In other examples, content available on other connected devices can be identified and used to formulate virtual assistant query suggestions. For example, content from each of two user devices 102 connected to a common television set-top box 104 can be identified and used in formulating virtual assistant query suggestions. In some examples, users can select which content to make visible to the system for sharing, and can hide other content from the system so as not to include it in query suggestions or otherwise make it available for playback.
Content-based suggestions 2652 shown in interface 3170 of
It should be understood that any combination of virtual assistant query suggestions from various sources can be provided in response to a request for suggestions. For example, suggestions from various sources can be combined randomly, or can be presented based on popularity, user preference, selection history, or the like. Moreover, queries can be determined in a variety of other ways and presented based on a variety of other factors, such as a query history, a user preference, a query popularity, or the like. In addition, in some examples, query suggestions can be cycled automatically by replacing displayed suggestions with new alternative suggestions after a delay. It should further be understood that users can select displayed suggestions on any interface by, for example, tapping on a touchscreen, speaking the query, selecting a query with navigation keys, selecting a query with a button, selecting a query with a cursor, or the like, and an associated response can then be provided (e.g., an informational and/or media response).
In any of the various examples, virtual assistant query suggestions can also be filtered based on available content. For example, potential query suggestions that would result in unavailable media content (e.g., no cable subscription) or that may not have an associated informational answer can be disqualified as suggestions and held back from being displayed. On the other hand, potential query suggestions that would result in immediately playable media content to which the user has access can be weighted over other potential suggestions or otherwise biased for display. In this manner, the availability of media content for user viewing can also be used in determining virtual assistant query suggestions for display.
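An illustrative filter-and-rank step of this kind is sketched below; the suggestion records and the simple Boolean weighting are assumptions, and suggestion texts other than those drawn from this description are hypothetical.

    suggestions = [
        {"text": "Change to the Zeta/Alpha game", "media_available": True,  "has_answer": True},
        {"text": "Play Movie Z",                  "media_available": False, "has_answer": False},
        {"text": "Who plays Blanche?",            "media_available": False, "has_answer": True},
    ]

    def rank_suggestions(candidates):
        # Drop suggestions with neither playable media nor an informational answer,
        # and bias immediately playable content toward the top of the list.
        viable = [s for s in candidates if s["media_available"] or s["has_answer"]]
        return sorted(viable,
                      key=lambda s: (s["media_available"], s["has_answer"]),
                      reverse=True)

    for suggestion in rank_suggestions(suggestions):
        print(suggestion["text"])
    # Change to the Zeta/Alpha game
    # Who plays Blanche?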
In addition, in any of the various examples, pre-loaded query answers can be provided instead of or in addition to suggestions (e.g., in suggestions interface 2650). Such pre-loaded query answers can be selected and provided based on personal use and/or current context. For example, a user watching a particular program can tap a button, double-click a button, long-press a button, or the like to receive suggestions. Instead of or in addition to query suggestions, context-based information can be provided automatically, such as identifying a playing song or soundtrack (e.g., “This song is Performance Piece”), identifying cast members of a currently playing episode (e.g., “Actress Janet Quinn plays Genevieve”), identifying similar media (e.g., “Show Q is similar to this”), or providing results of any of the other queries discussed herein.
Moreover, affordances can be provided in any of the various interfaces for users to rate media content to inform the virtual assistant of user preferences (e.g., a selectable rating scale). In other examples, users can speak rating information as a natural language command (e.g., “I love this,” “I hate this,” “I don't like this show,” etc.). In still other examples, in any of the various interfaces illustrated and described herein, a variety of other functional and informational elements can be provided. For example, interfaces can further include links to important functions and places, such as search links, purchase links, media links, and the like. In another example, interfaces can further include recommendations of what else to watch next based on currently playing content (e.g., selecting similar content). In yet another example, interfaces can further include recommendations of what else to watch next based on personalized taste and/or recent activity (e.g., selecting content based on user ratings, user-entered preferences, recently watched programs, etc.). In still other examples, interfaces can further include instructions for user interactions (e.g., “Press and hold to talk to the Virtual Assistant,” “Tap once to get suggestions,” etc.). In some examples, providing pre-loaded answers, suggestions, or the like can provide an enjoyable user experience while also making content readily available to a wide variety of users (e.g., to users of various skill levels irrespective of language or other control barriers).
At block 3306, virtual assistant queries can be determined based on the media content and/or a viewing history of media content. For example, virtual assistant queries can be determined based on a displayed program, menu, application, list of media content, notification, or the like. In one example, content-based suggestions 2652 can be determined based on video 480 and associated metadata as described with reference to
Referring again to process 3300 of
In addition, in any of the various examples discussed herein, various aspects can be personalized for a particular user. User data, including contacts, preferences, location, favorite media, and the like, can be used to interpret voice commands and facilitate user interaction with the various devices discussed herein. The various processes discussed herein can also be modified in various other ways according to user preferences, contacts, text, usage history, profile data, demographics, or the like. In addition, such preferences and settings can be updated over time based on user interactions (e.g., frequently uttered commands, frequently selected applications, etc.). The gathering and use of user data available from various sources can improve the delivery to users of invitational content or any other content that may be of interest to them. The present disclosure contemplates that, in some instances, this gathered data can include personal information data that uniquely identifies or can be used to contact or locate a specific person. Such personal information data can include demographic data, location-based data, telephone numbers, email addresses, home addresses, or any other identifying information.
The present disclosure recognizes that the use of such personal information data, in the present technology, can be used to the benefit of users. For example, the personal information data can be used to deliver targeted content that is of greater interest to the user. Accordingly, use of such personal information data enables calculated control of the delivered content. Further, other uses for personal information data that benefit the user are also contemplated by the present disclosure.
The present disclosure further contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining personal information data as private and secure. For example, personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection should occur only after receiving the informed consent of the users. Additionally, such entities would take any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices.
Despite the foregoing, the present disclosure also contemplates examples in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, in the case of advertisement delivery services, the present technology can be configured to allow users to select to “opt in” or “opt out” of participation in the collection of personal information data during registration for services. In another example, users can select not to provide location information for targeted content delivery services. In yet another example, users can select not to provide precise location information, but permit the transfer of location zone information.
Therefore, although the present disclosure broadly covers use of personal information data to implement one or more various disclosed examples, the present disclosure also contemplates that the various examples can also be implemented without the need for accessing such personal information data. That is, the various examples of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data. For example, content can be selected and delivered to users by inferring preferences based on non-personal information data or a bare minimum amount of personal information, such as the content being requested by the device associated with a user, other non-personal information available to the content delivery services, or publicly available information.
In accordance with some examples,
As shown in
Processing unit 3406 can be configured to receive speech input from a user (e.g., via input unit 3404). Processing unit 3406 can be further configured to determine (e.g., using media content determining unit 3410) media content based on the speech input. Processing unit 3406 can be further configured to display (e.g., on display unit 3402 using first user interface displaying unit 3412) a first user interface having a first size, wherein the first user interface comprises one or more selectable links to the media content. Processing unit 3406 can be further configured to receive (e.g., from input unit 3404 using selection receiving unit 3414) a selection of one of the one or more selectable links. Processing unit 3406 can be further configured to, in response to the selection, display (e.g., on display unit 3402 using second user interface displaying unit 3416) a second user interface having a second size larger than the first size, wherein the second user interface comprises the media content associated with the selection.
In some examples, the first user interface (e.g., of first user interface displaying unit 3412) expands into the second user interface (e.g., of second user interface displaying unit 3416) in response to the selection (e.g., of selection receiving unit 3414). In other examples, the first user interface is overlaid on playing media content. In one example, the second user interface is overlaid on playing media content. In another example, the speech input (e.g., of speech input receiving unit 3408 from input unit 3404) comprises a query, and the media content (e.g., of media content determining unit 3410) comprises a result of the query. In still another example, the first user interface comprises a link to results of the query beyond the one or more selectable links to the media content. In other examples, the query comprises a query about weather, and the first user interface comprises a link to media content associated with the query about the weather. In another example, the query comprises a location, and the link to the media content associated with the query about the weather comprises a link to a portion of media content associated with weather at the location.
In some examples, in response to the selection, processing unit 3406 can be configured to play the media content associated with the selection. In one example, the media content comprises a movie. In another example, the media content comprises a television show. In another example, the media content comprises a sporting event. In some examples, the second user interface (e.g., of second user interface displaying unit 3416) comprises a description of the media content associated with the selection. In other examples, the first user interface comprises a link to purchase media content.
Processing unit 3406 can be further configured to receive additional speech input from the user (e.g., via input unit 3404), wherein the additional speech input comprises a query associated with displayed content. Processing unit 3406 can be further configured to determine a response to the query associated with the displayed content based on metadata associated with the displayed content. Processing unit 3406 can be further configured to, in response to receiving the additional speech input, display (e.g., on display unit 3402) a third user interface, wherein the third user interface comprises the determined response to the query associated with the displayed content.
Processing unit 3406 can be further configured to receive an indication to initiate receipt of speech input (e.g., via input unit 3404). Processing unit 3406 can be further configured to, in response to receiving the indication, display a readiness confirmation (e.g., on display unit 3402). Processing unit 3406 can be further configured to, in response to receiving the speech input, display a listening confirmation. Processing unit 3406 can be further configured to detect the end of the speech input, and, in response to detecting the end of the speech input, display a processing confirmation. In some examples, processing unit 3406 can be further configured to display a transcription of the speech input.
In some examples, electronic device 3400 comprises a television. In other examples, electronic device 3400 comprises a television set-top box. In other examples, electronic device 3400 comprises a remote control. In still other examples, electronic device 3400 comprises a mobile telephone.
In one example, the one or more selectable links in the first user interface (e.g., of first user interface displaying unit 3412) comprise moving images associated with the media content. In some examples, the moving images associated with the media content comprise live feeds of the media content. In other examples, the one or more selectable links in the first user interface comprise still images associated with the media content.
In some examples, processing unit 3406 can be further configured to determine whether currently displayed content comprises a moving image or a control menu; in response to a determination that currently displayed content comprises a moving image, select a small size as the first size for the first user interface (e.g., of first user interface displaying unit 3412); and, in response to a determination that currently displayed content comprises a control menu, select a large size, larger than the small size, as the first size for the first user interface (e.g., of first user interface displaying unit 3412). In other examples, processing unit 3406 can be further configured to determine alternative media content for display based on one or more of a user preference, a show popularity, and a status of a live sporting event, and to display a notification comprising the determined alternative media content.
In accordance with some examples,
As shown in
Processing unit 3506 can be configured to receive (e.g., from input unit 3504 using speech input receiving unit 3508) speech input from a user at a first device (e.g., device 3500) having a first display (e.g., display unit 3502 in some examples). Processing unit 3506 can be further configured to determine (e.g., using user intent determining unit 3510) a user intent of the speech input based on content displayed on the first display. Processing unit 3506 can be further configured to determine (e.g., using media content determining unit 3512) media content based on the user intent. Processing unit 3506 can be further configured to play (e.g., using media content playing unit 3514) the media content on a second device associated with a second display (e.g., display unit 3502 in some examples).
In one example, the first device comprises a remote control. In another example, the first device comprises a mobile telephone. In another example, the first device comprises a tablet computer. In some examples, the second device comprises a television set-top box. In other examples, the second display comprises a television.
In some examples, the content displayed on the first display comprises an application interface. In one example, the speech input (e.g., of speech input receiving unit 3508 from input unit 3504) comprises a request to display media associated with the application interface. In one example, the media content comprises the media associated with the application interface. In another example, the application interface comprises a photo album, and the media comprises one or more photos in the photo album. In yet another example, the application interface comprises a list of one or more videos, and the media comprises one of the one or more videos. In still other examples, the application interface comprises a television program listing, and the media comprises a television program in the television program listing.
In some examples, processing unit 3506 can be further configured to determine whether the first device is authorized; wherein the media content is played on the second device in response to a determination that the first device is authorized. Processing unit 3506 can be further configured to identify the user based on the speech input, and determine (e.g., using user intent determining unit 3510) the user intent of the speech input based on data associated with the identified user. Processing unit 3506 can be further configured to determine whether the user is authorized based on the speech input; wherein the media content is played on the second device in response to a determination that the user is an authorized user. In one example, determining whether the user is authorized comprises analyzing the speech input using voice recognition.
In other examples, processing unit 3506 can be further configured to, in response to determining that the user intent comprises a request for information, display information associated with the media content on the first display of the first device. Processing unit 3506 can be further configured to, in response to determining that the user intent comprises a request to play the media content, play the media content on the second device.
In some examples, the speech input comprises a request to play content on the second device, and the media content is played on the second device in response to the request to play content on the second device. Processing unit 3506 can be further configured to determine whether the determined media content should be displayed on the first display or the second display based on a media format, a user preference, or a default setting. In some examples, the media content is displayed on the second display in response to a determination that the determined media content should be displayed on the second display. In other examples, the media content is displayed on the first display in response to a determination that the determined media content should be displayed on the first display.
In other examples, processing unit 3506 can be further configured to determine a proximity of each of two or more devices, including the second device and a third device. In some examples, the media content is played on the second device associated with the second display based on the proximity of the second device relative to the proximity of the third device. In some examples, determining the proximity of each of the two or more devices comprises determining the proximity based on Bluetooth LE.
In some examples, processing unit 3506 can be further configured to display a list of display devices, including the second device associated with the second display, and receive a selection of the second device in the list of display devices. In one example, the media content is displayed on the second display in response to receiving the selection of the second device. Processing unit 3506 can be further configured to determine whether headphones are attached to the first device. Processing unit 3506 can be further configured to, in response to a determination that headphones are attached to the first device, display the media content on the first display. Processing unit 3506 can be further configured to, in response to a determination that headphones are not attached to the first device, display the media content on the second display. In other examples, processing unit 3506 can be further configured to determine alternative media content for display based on one or more of a user preference, a show popularity, and a status of a live sporting event, and to display a notification comprising the determined alternative media content.
In accordance with some examples,
As shown in
Processing unit 3606 can be configured to receive (e.g., from input unit 3604 using speech input receiving unit 3608) speech input from a user, wherein the speech input comprises a query associated with content shown on a television display (e.g., display unit 3602 in some examples). Processing unit 3606 can be further configured to determine (e.g., using user intent determining unit 3610) a user intent of the query based on one or more of the content shown on the television display and a viewing history of media content. Processing unit 3606 can be further configured to display (e.g., using query result displaying unit 3612) a result of the query based on the determined user intent.
In one example, the speech input is received at a remote control. In another example, the speech input is received at a mobile telephone. In some examples, the result of the query is displayed on the television display. In another example, the content shown on the television display comprises a movie. In yet another example, the content shown on the television display comprises a television show. In still another example, the content shown on the television display comprises a sporting event.
In some examples, the query comprises a request for information about a person associated with the content shown on the television display, and the result (e.g., of query result displaying unit 3612) of the query comprises information about the person. In one example, the result of the query comprises media content associated with the person. In another example, the media content comprises one or more of a movie, a television show, or a sporting event associated with the person. In some examples, the query comprises a request for information about a character in the content shown on the television display, and the result of the query comprises information about the character or information about the actor who plays the character. In one example, the result of the query comprises media content associated with the actor who plays the character. In another example, the media content comprises one or more of a movie, a television show, or a sporting event associated with the actor who plays the character.
In some examples, processing unit 3606 can be further configured to determine the result of the query based on metadata associated with the content shown on the television display or the viewing history of media content. In one example, the metadata comprises one or more of a title, a description, a list of characters, a list of actors, a list of players, a genre, or a display schedule associated with the content shown on the television display or the viewing history of media content. In another example, the content shown on the television display comprises a list of media content, and the query comprises a request to display one of the items in the list. In yet another example, the content shown on the television display further comprises an item in the list of media content having focus, and determining (e.g., using user intent determining unit 3610) the user intent of the query comprises identifying the item having focus. In some examples, processing unit 3606 can be further configured to determine (e.g., using user intent determining unit 3610) the user intent of the query based on menu or search content recently displayed on the television display. In one example, the content shown on the television display comprises a page of listed media, and the recently displayed menu or search content comprises a previous page of listed media. In another example, the content shown on the television display comprises one or more categories of media, and one of the one or more categories of media has focus. In one example, processing unit 3606 can be further configured to determine (e.g., using user intent determining unit 3610) the user intent of the query based on the one of the one or more categories of media having focus. In another example, the categories of media comprise movies, television programs, and music. In other examples, processing unit 3606 can be further configured to determine alternative media content for display based on one or more of a user preference, a show popularity, and a status of a live sporting event, and to display a notification comprising the determined alternative media content.
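The focus-based disambiguation described above can be sketched as follows; the screen-state fields and the simple keyword check for "that"/"this" are illustrative assumptions rather than the disclosed method.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ScreenState:
    listed_media: List[str]       # items in the list currently shown on the display
    focused_index: Optional[int]  # index of the item having focus, if any

def resolve_play_request(query: str, screen: ScreenState) -> Optional[str]:
    """Resolve an ambiguous request such as 'play that one' using the item
    that currently has focus, falling back to matching the visible list."""
    lowered = query.lower()
    if ("that" in lowered or "this" in lowered) and screen.focused_index is not None:
        return screen.listed_media[screen.focused_index]
    for item in screen.listed_media:
        if item.lower() in lowered:
            return item
    return None

screen = ScreenState(["Movie A", "Movie B", "Movie C"], focused_index=1)
print(resolve_play_request("Play that one", screen))  # -> Movie B
```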
In accordance with some examples, an electronic device can include a display unit (e.g., display unit 3702) configured to display media content, an input unit (e.g., input unit 3704) configured to receive user input, and a processing unit 3706 coupled to the display unit and the input unit. In some examples, processing unit 3706 can include a media content displaying unit 3708, an input receiving unit 3710, a query determining unit 3712, and a query displaying unit 3714.
Processing unit 3706 can be configured to display (e.g., using media content displaying unit 3708) media content on a display (e.g., display unit 3702). Processing unit 3706 can be further configured to receive (e.g., from input unit 3704 using input receiving unit 3710) an input from a user. Processing unit 3706 can be further configured to determine (e.g., using query determining unit 3712) one or more virtual assistant queries based on one or more of the media content and a viewing history of media content. Processing unit 3706 can be further configured to display (e.g., using query displaying unit 3714) the one or more virtual assistant queries on the display.
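As a minimal sketch of how candidate virtual assistant queries might be derived from the displayed media content and the viewing history, consider the fragment below; the metadata fields and the particular suggestion templates are assumptions chosen for illustration.

```python
from typing import Dict, List

def suggest_queries(now_playing: Dict, viewing_history: List[Dict]) -> List[str]:
    """Build candidate virtual assistant queries from the content being shown
    and from recently watched media."""
    suggestions: List[str] = []
    title = now_playing.get("title", "this program")
    if now_playing.get("cast"):
        actor = now_playing["cast"][0]["actor"]
        suggestions.append(f"Who is {actor}?")
        suggestions.append(f"What else has {actor} been in?")
    suggestions.append(f"What is {title} about?")
    for item in viewing_history:
        if item.get("type") == "episode":
            suggestions.append(f"Remind me to watch the next episode of {item['series']}.")
    return suggestions

history = [{"type": "episode", "series": "Example Show"}]
content = {"title": "Example Movie", "cast": [{"actor": "Jane Doe", "character": "Lead"}]}
print(suggest_queries(content, history))
```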
In one example, the input is received from the user on a remote control. In another example, the input is received from the user on a mobile telephone. In some examples, the one or more virtual assistant queries are overlaid on a moving image. In another example, the input comprises a double click of a button. In one example, the media content comprises a movie. In another example, the media content comprises a television show. In yet another example, the media content comprises a sporting event.
In some examples, the one or more virtual assistant queries comprise a query about a person appearing in the media content. In other examples, the one or more virtual assistant queries comprise a query about a character appearing in the media content. In another example, the one or more virtual assistant queries comprise a query for media content associated with a person appearing in the media content. In some examples, the media content or the viewing history of media content comprises an episode of a television show, and the one or more virtual assistant queries comprise a query about another episode of the television show. In another example, the media content or the viewing history of media content comprises an episode of a television show, and the one or more virtual assistant queries comprise a request to set a reminder to watch or record a subsequent episode of the media content. In still another example, the one or more virtual assistant queries comprise a query for descriptive details of the media content. In one example, the descriptive details comprise one or more of a show title, a character list, an actor list, an episode description, a team roster, a team ranking, or a show synopsis.
In some examples, processing unit 3706 can be further configured to receive a selection of one of the one or more virtual assistant queries. Processing unit 3706 can be further configured to display a result of the selected one of the one or more virtual assistant queries. In one example, determining the one or more virtual assistant queries comprises determining the one or more virtual assistant queries based on one or more of a query history, a user preference, or a query popularity. In another example, determining the one or more virtual assistant queries comprises determining the one or more virtual assistant queries based on media content available to the user for viewing. In yet another example, determining the one or more virtual assistant queries comprises determining the one or more virtual assistant queries based on a received notification. In still another example, determining the one or more virtual assistant queries comprises determining the one or more virtual assistant queries based on an active application. In other examples, processing unit 3706 can be further configured to determine alternative media content for display based on one or more of a user preference, a show popularity, and a status of a live sporting event, and to display a notification comprising the determined alternative media content.
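A minimal sketch of ranking and filtering the candidate queries along the lines described above is given below; the weighting of popularity against user preference and the "Play ..." availability check are illustrative assumptions.

```python
from typing import Dict, List

def rank_suggestions(candidates: List[str],
                     popularity: Dict[str, float],
                     user_preference: Dict[str, float],
                     available_content: List[str]) -> List[str]:
    """Order candidate virtual assistant queries by a weighted score and drop
    suggestions that ask to play content the user cannot currently view."""
    def score(query: str) -> float:
        return 0.6 * popularity.get(query, 0.0) + 0.4 * user_preference.get(query, 0.0)

    viewable = [q for q in candidates
                if not q.startswith("Play ") or q.removeprefix("Play ") in available_content]
    return sorted(viewable, key=score, reverse=True)

print(rank_suggestions(
    ["Who is Jane Doe?", "Play Example Movie", "Play Unavailable Movie"],
    popularity={"Who is Jane Doe?": 0.9, "Play Example Movie": 0.7},
    user_preference={"Play Example Movie": 0.8},
    available_content=["Example Movie"],
))  # -> ['Play Example Movie', 'Who is Jane Doe?']
```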
Although examples have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art (e.g., modifying any of the systems or processes discussed herein according to the concepts described in relation to any other system or process discussed herein). Such changes and modifications are to be understood as being included within the scope of the various examples as defined by the appended claims.
This application is a continuation of U.S. patent application Ser. No. 17/125,876, filed Dec. 17, 2020, which is a continuation of U.S. patent application Ser. No. 15/495,861, filed Apr. 24, 2017, which is a continuation of U.S. patent application Ser. No. 15/085,465, filed Mar. 30, 2016, now U.S. Pat. No. 9,668,024, issued May 30, 2017, which is a continuation of U.S. patent application Ser. No. 14/498,503, filed Sep. 26, 2014, now U.S. Pat. No. 9,338,493, issued May 10, 2016, which claims priority from U.S. Provisional Patent Application Ser. No. 62/019,312, filed Jun. 30, 2014, each of which is hereby incorporated by reference in its entirety for all purposes. This application also relates to the following provisional application: U.S. Provisional Patent Application Ser. No. 62/019,292, "Real-time Digital Assistant Knowledge Updates," filed Jun. 30, 2014, which is hereby incorporated by reference in its entirety.
20130185074 | Gruber et al. | Jul 2013 | A1 |
20130185081 | Cheyer et al. | Jul 2013 | A1 |
20130185336 | Singh et al. | Jul 2013 | A1 |
20130187850 | Schulz et al. | Jul 2013 | A1 |
20130187857 | Griffin et al. | Jul 2013 | A1 |
20130190021 | Vieri et al. | Jul 2013 | A1 |
20130191117 | Atti et al. | Jul 2013 | A1 |
20130191408 | Volkert | Jul 2013 | A1 |
20130197911 | Wei et al. | Aug 2013 | A1 |
20130197914 | Yelvington et al. | Aug 2013 | A1 |
20130198159 | Hendry | Aug 2013 | A1 |
20130198841 | Poulson | Aug 2013 | A1 |
20130204813 | Master et al. | Aug 2013 | A1 |
20130204897 | McDougall | Aug 2013 | A1 |
20130204967 | Seo et al. | Aug 2013 | A1 |
20130207898 | Sullivan et al. | Aug 2013 | A1 |
20130210410 | Xu | Aug 2013 | A1 |
20130210492 | You et al. | Aug 2013 | A1 |
20130212501 | Anderson et al. | Aug 2013 | A1 |
20130218553 | Fujii et al. | Aug 2013 | A1 |
20130218560 | Hsiao et al. | Aug 2013 | A1 |
20130218574 | Falcon et al. | Aug 2013 | A1 |
20130218899 | Raghavan et al. | Aug 2013 | A1 |
20130219333 | Palwe et al. | Aug 2013 | A1 |
20130222249 | Pasquero et al. | Aug 2013 | A1 |
20130223279 | Tinnakornsrisuphap et al. | Aug 2013 | A1 |
20130225128 | Gomar | Aug 2013 | A1 |
20130226580 | Witt-ehsani | Aug 2013 | A1 |
20130226935 | Bai et al. | Aug 2013 | A1 |
20130226996 | Itagaki et al. | Aug 2013 | A1 |
20130231917 | Naik | Sep 2013 | A1 |
20130234947 | Kristensson et al. | Sep 2013 | A1 |
20130235987 | Arroniz-Escobar | Sep 2013 | A1 |
20130238312 | Waibel | Sep 2013 | A1 |
20130238326 | Kim et al. | Sep 2013 | A1 |
20130238334 | Ma et al. | Sep 2013 | A1 |
20130238540 | O'donoghue et al. | Sep 2013 | A1 |
20130238647 | Thompson | Sep 2013 | A1 |
20130238729 | Holzman et al. | Sep 2013 | A1 |
20130244615 | Miller | Sep 2013 | A1 |
20130244633 | Jacobs et al. | Sep 2013 | A1 |
20130246048 | Nagase et al. | Sep 2013 | A1 |
20130246050 | Yu et al. | Sep 2013 | A1 |
20130246329 | Pasquero et al. | Sep 2013 | A1 |
20130246920 | Fields et al. | Sep 2013 | A1 |
20130253911 | Petri et al. | Sep 2013 | A1 |
20130253912 | Medlock et al. | Sep 2013 | A1 |
20130260739 | Saino | Oct 2013 | A1 |
20130262168 | Makanawala et al. | Oct 2013 | A1 |
20130268263 | Park et al. | Oct 2013 | A1 |
20130268956 | Recco | Oct 2013 | A1 |
20130275117 | Winer | Oct 2013 | A1 |
20130275136 | Czahor | Oct 2013 | A1 |
20130275138 | Gruber et al. | Oct 2013 | A1 |
20130275164 | Gruber et al. | Oct 2013 | A1 |
20130275199 | Proctor, Jr. et al. | Oct 2013 | A1 |
20130275625 | Taivalsaari et al. | Oct 2013 | A1 |
20130275875 | Gruber et al. | Oct 2013 | A1 |
20130275899 | Schubert et al. | Oct 2013 | A1 |
20130279724 | Stafford et al. | Oct 2013 | A1 |
20130282709 | Zhu et al. | Oct 2013 | A1 |
20130283168 | Brown et al. | Oct 2013 | A1 |
20130283199 | Selig et al. | Oct 2013 | A1 |
20130283283 | Wang et al. | Oct 2013 | A1 |
20130285913 | Griffin et al. | Oct 2013 | A1 |
20130288722 | Ramanujam et al. | Oct 2013 | A1 |
20130289991 | Eshwar et al. | Oct 2013 | A1 |
20130289993 | Rao | Oct 2013 | A1 |
20130289994 | Newman et al. | Oct 2013 | A1 |
20130290001 | Yun et al. | Oct 2013 | A1 |
20130290222 | Gordo et al. | Oct 2013 | A1 |
20130290905 | Luvogt et al. | Oct 2013 | A1 |
20130291015 | Pan | Oct 2013 | A1 |
20130297078 | Kolavennu | Nov 2013 | A1 |
20130297198 | Velde et al. | Nov 2013 | A1 |
20130297317 | Lee et al. | Nov 2013 | A1 |
20130297319 | Kim | Nov 2013 | A1 |
20130297348 | Cardoza et al. | Nov 2013 | A1 |
20130298139 | Resnick et al. | Nov 2013 | A1 |
20130300645 | Fedorov | Nov 2013 | A1 |
20130300648 | Kim et al. | Nov 2013 | A1 |
20130303106 | Martin | Nov 2013 | A1 |
20130304476 | Kim et al. | Nov 2013 | A1 |
20130304479 | Teller et al. | Nov 2013 | A1 |
20130304758 | Gruber et al. | Nov 2013 | A1 |
20130304815 | Puente et al. | Nov 2013 | A1 |
20130305119 | Kern et al. | Nov 2013 | A1 |
20130307855 | Lamb et al. | Nov 2013 | A1 |
20130307997 | O'Keefe et al. | Nov 2013 | A1 |
20130308922 | Sano et al. | Nov 2013 | A1 |
20130311179 | Wagner | Nov 2013 | A1 |
20130311184 | Badavne et al. | Nov 2013 | A1 |
20130311487 | Moore et al. | Nov 2013 | A1 |
20130311997 | Gruber et al. | Nov 2013 | A1 |
20130315038 | Ferren et al. | Nov 2013 | A1 |
20130316679 | Miller et al. | Nov 2013 | A1 |
20130316746 | Miller et al. | Nov 2013 | A1 |
20130317921 | Havas | Nov 2013 | A1 |
20130318478 | Ogura | Nov 2013 | A1 |
20130321267 | Bhatti et al. | Dec 2013 | A1 |
20130322634 | Bennett et al. | Dec 2013 | A1 |
20130322665 | Bennett et al. | Dec 2013 | A1 |
20130325340 | Forstall et al. | Dec 2013 | A1 |
20130325436 | Wang et al. | Dec 2013 | A1 |
20130325443 | Begeja et al. | Dec 2013 | A1 |
20130325447 | Levien et al. | Dec 2013 | A1 |
20130325448 | Levien et al. | Dec 2013 | A1 |
20130325460 | Kim et al. | Dec 2013 | A1 |
20130325473 | Larcher et al. | Dec 2013 | A1 |
20130325480 | Lee et al. | Dec 2013 | A1 |
20130325481 | Van Os et al. | Dec 2013 | A1 |
20130325484 | Chakladar et al. | Dec 2013 | A1 |
20130325844 | Plaisant | Dec 2013 | A1 |
20130325967 | Parks et al. | Dec 2013 | A1 |
20130325970 | Roberts et al. | Dec 2013 | A1 |
20130325979 | Mansfield et al. | Dec 2013 | A1 |
20130326576 | Zhang et al. | Dec 2013 | A1 |
20130328809 | Smith | Dec 2013 | A1 |
20130329023 | Suplee, III et al. | Dec 2013 | A1 |
20130331127 | Sabatelli et al. | Dec 2013 | A1 |
20130332113 | Piemonte et al. | Dec 2013 | A1 |
20130332159 | Federighi et al. | Dec 2013 | A1 |
20130332162 | Keen | Dec 2013 | A1 |
20130332164 | Nalk | Dec 2013 | A1 |
20130332168 | Kim et al. | Dec 2013 | A1 |
20130332172 | Prakash et al. | Dec 2013 | A1 |
20130332400 | González | Dec 2013 | A1 |
20130332538 | Clark et al. | Dec 2013 | A1 |
20130332721 | Chaudhri et al. | Dec 2013 | A1 |
20130339028 | Rosner et al. | Dec 2013 | A1 |
20130339256 | Shroff | Dec 2013 | A1 |
20130339454 | Walker et al. | Dec 2013 | A1 |
20130339991 | Ricci | Dec 2013 | A1 |
20130342487 | Jeon | Dec 2013 | A1 |
20130342672 | Gray et al. | Dec 2013 | A1 |
20130343584 | Bennett et al. | Dec 2013 | A1 |
20130343721 | Abecassis | Dec 2013 | A1 |
20130346016 | Suzuki et al. | Dec 2013 | A1 |
20130346065 | Davidson et al. | Dec 2013 | A1 |
20130346068 | Solem et al. | Dec 2013 | A1 |
20130346347 | Patterson et al. | Dec 2013 | A1 |
20130346488 | Lunt et al. | Dec 2013 | A1 |
20130347018 | Limp et al. | Dec 2013 | A1 |
20130347029 | Tang et al. | Dec 2013 | A1 |
20130347102 | Shi | Dec 2013 | A1 |
20130347117 | Parks et al. | Dec 2013 | A1 |
20140001255 | Anthoine | Jan 2014 | A1 |
20140002338 | Raffa et al. | Jan 2014 | A1 |
20140006012 | Zhou et al. | Jan 2014 | A1 |
20140006025 | Krishnan et al. | Jan 2014 | A1 |
20140006027 | Kim et al. | Jan 2014 | A1 |
20140006028 | Hu | Jan 2014 | A1 |
20140006030 | Fleizach et al. | Jan 2014 | A1 |
20140006153 | Thangam et al. | Jan 2014 | A1 |
20140006191 | Shankar et al. | Jan 2014 | A1 |
20140006483 | Garmark et al. | Jan 2014 | A1 |
20140006496 | Dearman et al. | Jan 2014 | A1 |
20140006562 | Handa et al. | Jan 2014 | A1 |
20140006947 | Garmark et al. | Jan 2014 | A1 |
20140006951 | Hunter | Jan 2014 | A1 |
20140006955 | Greenzeiger et al. | Jan 2014 | A1 |
20140008163 | Mikonaho et al. | Jan 2014 | A1 |
20140012574 | Pasupalak et al. | Jan 2014 | A1 |
20140012575 | Ganong et al. | Jan 2014 | A1 |
20140012580 | Ganong, III et al. | Jan 2014 | A1 |
20140012586 | Rubin et al. | Jan 2014 | A1 |
20140012587 | Park | Jan 2014 | A1 |
20140013336 | Yang | Jan 2014 | A1 |
20140019116 | Lundberg et al. | Jan 2014 | A1 |
20140019133 | Bao et al. | Jan 2014 | A1 |
20140019460 | Sambrani et al. | Jan 2014 | A1 |
20140026037 | Garb et al. | Jan 2014 | A1 |
20140028029 | Jochman | Jan 2014 | A1 |
20140028477 | Michalske | Jan 2014 | A1 |
20140028603 | Xie et al. | Jan 2014 | A1 |
20140028735 | Williams et al. | Jan 2014 | A1 |
20140032453 | Eustice et al. | Jan 2014 | A1 |
20140032678 | Koukoumidis et al. | Jan 2014 | A1 |
20140033071 | Gruber et al. | Jan 2014 | A1 |
20140035823 | Khoe et al. | Feb 2014 | A1 |
20140037075 | Bouzid et al. | Feb 2014 | A1 |
20140039888 | Taubman et al. | Feb 2014 | A1 |
20140039893 | Weiner et al. | Feb 2014 | A1 |
20140039894 | Shostak | Feb 2014 | A1 |
20140040274 | Aravamudan et al. | Feb 2014 | A1 |
20140040748 | Lemay et al. | Feb 2014 | A1 |
20140040754 | Donelli | Feb 2014 | A1 |
20140040801 | Patel et al. | Feb 2014 | A1 |
20140040905 | Tsunoda et al. | Feb 2014 | A1 |
20140040918 | Li | Feb 2014 | A1 |
20140040961 | Green et al. | Feb 2014 | A1 |
20140046934 | Zhou et al. | Feb 2014 | A1 |
20140047001 | Phillips et al. | Feb 2014 | A1 |
20140051399 | Walker | Feb 2014 | A1 |
20140052451 | Cheong et al. | Feb 2014 | A1 |
20140052680 | Nitz et al. | Feb 2014 | A1 |
20140052791 | Chakra et al. | Feb 2014 | A1 |
20140053082 | Park | Feb 2014 | A1 |
20140053101 | Buehler et al. | Feb 2014 | A1 |
20140053210 | Cheong et al. | Feb 2014 | A1 |
20140057610 | Olincy et al. | Feb 2014 | A1 |
20140059030 | Hakkani-Tur et al. | Feb 2014 | A1 |
20140059423 | Gorga et al. | Feb 2014 | A1 |
20140067361 | Nikoulina et al. | Mar 2014 | A1 |
20140067371 | Liensberger | Mar 2014 | A1 |
20140067402 | Kim | Mar 2014 | A1 |
20140067738 | Kingsbury | Mar 2014 | A1 |
20140067740 | Solari | Mar 2014 | A1 |
20140068751 | Last | Mar 2014 | A1 |
20140071241 | Yang et al. | Mar 2014 | A1 |
20140074454 | Brown et al. | Mar 2014 | A1 |
20140074466 | Sharifi et al. | Mar 2014 | A1 |
20140074470 | Jansche et al. | Mar 2014 | A1 |
20140074472 | Lin et al. | Mar 2014 | A1 |
20140074482 | Ohno | Mar 2014 | A1 |
20140074483 | Van Os | Mar 2014 | A1 |
20140074589 | Nielsen et al. | Mar 2014 | A1 |
20140074815 | Plimton | Mar 2014 | A1 |
20140074846 | Moss et al. | Mar 2014 | A1 |
20140075453 | Bellessort et al. | Mar 2014 | A1 |
20140078065 | Akkok | Mar 2014 | A1 |
20140079195 | Srivastava et al. | Mar 2014 | A1 |
20140080410 | Jung et al. | Mar 2014 | A1 |
20140080428 | Rhoads et al. | Mar 2014 | A1 |
20140081619 | Solntseva et al. | Mar 2014 | A1 |
20140081633 | Badaskar | Mar 2014 | A1 |
20140081635 | Yanagihara | Mar 2014 | A1 |
20140081829 | Milne | Mar 2014 | A1 |
20140081941 | Bai et al. | Mar 2014 | A1 |
20140082500 | Wilensky et al. | Mar 2014 | A1 |
20140082501 | Bae et al. | Mar 2014 | A1 |
20140082545 | Zhai et al. | Mar 2014 | A1 |
20140082715 | Grajek et al. | Mar 2014 | A1 |
20140086458 | Rogers | Mar 2014 | A1 |
20140087711 | Geyer et al. | Mar 2014 | A1 |
20140088952 | Fife et al. | Mar 2014 | A1 |
20140088961 | Woodward et al. | Mar 2014 | A1 |
20140088964 | Bellegarda | Mar 2014 | A1 |
20140088970 | Kang | Mar 2014 | A1 |
20140092007 | Kim et al. | Apr 2014 | A1 |
20140095171 | Lynch et al. | Apr 2014 | A1 |
20140095172 | Cabaco et al. | Apr 2014 | A1 |
20140095173 | Lynch et al. | Apr 2014 | A1 |
20140095432 | Trumbull et al. | Apr 2014 | A1 |
20140095601 | Abuelsaad et al. | Apr 2014 | A1 |
20140095965 | Li | Apr 2014 | A1 |
20140096077 | Jacob et al. | Apr 2014 | A1 |
20140096209 | Saraf et al. | Apr 2014 | A1 |
20140098247 | Rao et al. | Apr 2014 | A1 |
20140100847 | Ishii et al. | Apr 2014 | A1 |
20140101127 | Simhon et al. | Apr 2014 | A1 |
20140104175 | Ouyang et al. | Apr 2014 | A1 |
20140108017 | Mason et al. | Apr 2014 | A1 |
20140108357 | Procops et al. | Apr 2014 | A1 |
20140108391 | Volkert | Apr 2014 | A1 |
20140108792 | Borzycki et al. | Apr 2014 | A1 |
20140112556 | Kalinli-akbacak | Apr 2014 | A1 |
20140114554 | Lagassey | Apr 2014 | A1 |
20140115062 | Liu et al. | Apr 2014 | A1 |
20140115114 | Garmark et al. | Apr 2014 | A1 |
20140118155 | Bowers et al. | May 2014 | A1 |
20140118624 | Jang et al. | May 2014 | A1 |
20140120961 | Buck | May 2014 | A1 |
20140122057 | Chelba et al. | May 2014 | A1 |
20140122059 | Patel et al. | May 2014 | A1 |
20140122085 | Piety et al. | May 2014 | A1 |
20140122086 | Kapur et al. | May 2014 | A1 |
20140122136 | Jayanthi | May 2014 | A1 |
20140122153 | Truitt | May 2014 | A1 |
20140123022 | Lee et al. | May 2014 | A1 |
20140128021 | Walker et al. | May 2014 | A1 |
20140129006 | Chen et al. | May 2014 | A1 |
20140129226 | Lee et al. | May 2014 | A1 |
20140132935 | Kim et al. | May 2014 | A1 |
20140134983 | Jung et al. | May 2014 | A1 |
20140135036 | Bonanni et al. | May 2014 | A1 |
20140136013 | Wolverton et al. | May 2014 | A1 |
20140136187 | Wolverton et al. | May 2014 | A1 |
20140136195 | Abdossalami et al. | May 2014 | A1 |
20140136212 | Kwon et al. | May 2014 | A1 |
20140136946 | Matas | May 2014 | A1 |
20140136987 | Rodriguez | May 2014 | A1 |
20140142922 | Liang et al. | May 2014 | A1 |
20140142923 | Jones et al. | May 2014 | A1 |
20140142934 | Kim | May 2014 | A1 |
20140142935 | Lindahl et al. | May 2014 | A1 |
20140142953 | Kim et al. | May 2014 | A1 |
20140143550 | Ganong, III et al. | May 2014 | A1 |
20140143721 | Suzuki et al. | May 2014 | A1 |
20140143784 | Mistry et al. | May 2014 | A1 |
20140146200 | Scott et al. | May 2014 | A1 |
20140148209 | Weng et al. | May 2014 | A1 |
20140149118 | Lee et al. | May 2014 | A1 |
20140152577 | Yuen et al. | Jun 2014 | A1 |
20140153709 | Byrd et al. | Jun 2014 | A1 |
20140155031 | Lee et al. | Jun 2014 | A1 |
20140156262 | Yuen et al. | Jun 2014 | A1 |
20140156269 | Lee et al. | Jun 2014 | A1 |
20140156279 | Okamoto et al. | Jun 2014 | A1 |
20140156564 | Knight et al. | Jun 2014 | A1 |
20140157319 | Kimura et al. | Jun 2014 | A1 |
20140157422 | Livshits et al. | Jun 2014 | A1 |
20140163751 | Davis et al. | Jun 2014 | A1 |
20140163951 | Nikoulina et al. | Jun 2014 | A1 |
20140163953 | Parikh | Jun 2014 | A1 |
20140163954 | Joshi et al. | Jun 2014 | A1 |
20140163962 | Castelli et al. | Jun 2014 | A1 |
20140163976 | Park et al. | Jun 2014 | A1 |
20140163977 | Hoffmeister et al. | Jun 2014 | A1 |
20140163978 | Basye et al. | Jun 2014 | A1 |
20140163981 | Cook et al. | Jun 2014 | A1 |
20140163995 | Burns et al. | Jun 2014 | A1 |
20140164305 | Lynch et al. | Jun 2014 | A1 |
20140164312 | Lynch et al. | Jun 2014 | A1 |
20140164476 | Thomson | Jun 2014 | A1 |
20140164508 | Lynch et al. | Jun 2014 | A1 |
20140164532 | Lynch et al. | Jun 2014 | A1 |
20140164533 | Lynch et al. | Jun 2014 | A1 |
20140164953 | Lynch et al. | Jun 2014 | A1 |
20140169795 | Clough | Jun 2014 | A1 |
20140171064 | Das | Jun 2014 | A1 |
20140172412 | Viegas et al. | Jun 2014 | A1 |
20140172878 | Clark et al. | Jun 2014 | A1 |
20140173445 | Grassiotto | Jun 2014 | A1 |
20140173460 | Kim | Jun 2014 | A1 |
20140176814 | Ahn | Jun 2014 | A1 |
20140179295 | Luebbers et al. | Jun 2014 | A1 |
20140180499 | Cooper et al. | Jun 2014 | A1 |
20140180689 | Kim | Jun 2014 | A1 |
20140180697 | Torok et al. | Jun 2014 | A1 |
20140181123 | Tuffet Blaise et al. | Jun 2014 | A1 |
20140181741 | Apacible et al. | Jun 2014 | A1 |
20140181865 | Koganei | Jun 2014 | A1 |
20140188335 | Madhok et al. | Jul 2014 | A1 |
20140188460 | Ouyang et al. | Jul 2014 | A1 |
20140188477 | Zhang | Jul 2014 | A1 |
20140188478 | Zhang | Jul 2014 | A1 |
20140188485 | Kim et al. | Jul 2014 | A1 |
20140188835 | Zhang et al. | Jul 2014 | A1 |
20140195226 | Yun et al. | Jul 2014 | A1 |
20140195230 | Han et al. | Jul 2014 | A1 |
20140195233 | Bapat et al. | Jul 2014 | A1 |
20140195244 | Cha et al. | Jul 2014 | A1 |
20140195251 | Zeinstra et al. | Jul 2014 | A1 |
20140195252 | Gruber et al. | Jul 2014 | A1 |
20140198048 | Unruh et al. | Jul 2014 | A1 |
20140200891 | Larcheveque et al. | Jul 2014 | A1 |
20140203939 | Harrington et al. | Jul 2014 | A1 |
20140205076 | Kumar et al. | Jul 2014 | A1 |
20140207439 | Venkatapathy et al. | Jul 2014 | A1 |
20140207446 | Klein et al. | Jul 2014 | A1 |
20140207447 | Jiang et al. | Jul 2014 | A1 |
20140207466 | Smadi | Jul 2014 | A1 |
20140207468 | Bartnik | Jul 2014 | A1 |
20140207582 | Flinn et al. | Jul 2014 | A1 |
20140211944 | Hayward et al. | Jul 2014 | A1 |
20140214429 | Pantel | Jul 2014 | A1 |
20140214537 | Yoo et al. | Jul 2014 | A1 |
20140215367 | Kim et al. | Jul 2014 | A1 |
20140215513 | Ramer et al. | Jul 2014 | A1 |
20140218372 | Missig et al. | Aug 2014 | A1 |
20140222422 | Sarikaya et al. | Aug 2014 | A1 |
20140222435 | Li et al. | Aug 2014 | A1 |
20140222436 | Binder et al. | Aug 2014 | A1 |
20140222678 | Sheets et al. | Aug 2014 | A1 |
20140222967 | Harrang et al. | Aug 2014 | A1 |
20140223377 | Shaw et al. | Aug 2014 | A1 |
20140223481 | Fundament | Aug 2014 | A1 |
20140226503 | Cooper et al. | Aug 2014 | A1 |
20140229158 | Zweig et al. | Aug 2014 | A1 |
20140229184 | Shires | Aug 2014 | A1 |
20140230055 | Boehl | Aug 2014 | A1 |
20140232570 | Skinder et al. | Aug 2014 | A1 |
20140232656 | Pasquero et al. | Aug 2014 | A1 |
20140236595 | Gray | Aug 2014 | A1 |
20140236986 | Guzman | Aug 2014 | A1 |
20140237042 | Ahmed et al. | Aug 2014 | A1 |
20140237366 | Poulos et al. | Aug 2014 | A1 |
20140244248 | Arisoy et al. | Aug 2014 | A1 |
20140244249 | Mohamed et al. | Aug 2014 | A1 |
20140244254 | Ju et al. | Aug 2014 | A1 |
20140244257 | Colibro et al. | Aug 2014 | A1 |
20140244258 | Song et al. | Aug 2014 | A1 |
20140244263 | Pontual et al. | Aug 2014 | A1 |
20140244266 | Brown et al. | Aug 2014 | A1 |
20140244268 | Abdelsamie et al. | Aug 2014 | A1 |
20140244270 | Han et al. | Aug 2014 | A1 |
20140244271 | Lindahl | Aug 2014 | A1 |
20140244712 | Walters et al. | Aug 2014 | A1 |
20140245140 | Brown et al. | Aug 2014 | A1 |
20140247383 | Dave et al. | Sep 2014 | A1 |
20140247926 | Gainsboro et al. | Sep 2014 | A1 |
20140249812 | Bou-Ghazale et al. | Sep 2014 | A1 |
20140249816 | Pickering et al. | Sep 2014 | A1 |
20140249817 | Hart et al. | Sep 2014 | A1 |
20140249820 | Hsu et al. | Sep 2014 | A1 |
20140249821 | Kennewick et al. | Sep 2014 | A1 |
20140250046 | Winn et al. | Sep 2014 | A1 |
20140253455 | Mauro et al. | Sep 2014 | A1 |
20140257809 | Goel et al. | Sep 2014 | A1 |
20140257815 | Zhao et al. | Sep 2014 | A1 |
20140257902 | Moore et al. | Sep 2014 | A1 |
20140258324 | Mauro et al. | Sep 2014 | A1 |
20140258357 | Singh et al. | Sep 2014 | A1 |
20140258857 | Dykstra-Erickson et al. | Sep 2014 | A1 |
20140258905 | Lee et al. | Sep 2014 | A1 |
20140267022 | Kim | Sep 2014 | A1 |
20140267599 | Drouin et al. | Sep 2014 | A1 |
20140267933 | Young | Sep 2014 | A1 |
20140272821 | Pitschel et al. | Sep 2014 | A1 |
20140273979 | Van Os et al. | Sep 2014 | A1 |
20140274005 | Luna et al. | Sep 2014 | A1 |
20140274203 | Ganong, III et al. | Sep 2014 | A1 |
20140274211 | Sejnoha et al. | Sep 2014 | A1 |
20140278051 | Mcgavran et al. | Sep 2014 | A1 |
20140278343 | Tran | Sep 2014 | A1 |
20140278349 | Grieves et al. | Sep 2014 | A1 |
20140278379 | Coccaro et al. | Sep 2014 | A1 |
20140278390 | Kingsbury et al. | Sep 2014 | A1 |
20140278391 | Braho et al. | Sep 2014 | A1 |
20140278394 | Bastyr et al. | Sep 2014 | A1 |
20140278406 | Tsumura et al. | Sep 2014 | A1 |
20140278413 | Pitschel et al. | Sep 2014 | A1 |
20140278426 | Jost et al. | Sep 2014 | A1 |
20140278429 | Ganong, III | Sep 2014 | A1 |
20140278435 | Ganong, III et al. | Sep 2014 | A1 |
20140278436 | Khanna et al. | Sep 2014 | A1 |
20140278438 | Hart et al. | Sep 2014 | A1 |
20140278443 | Gunn et al. | Sep 2014 | A1 |
20140278444 | Larson et al. | Sep 2014 | A1 |
20140278513 | Prakash et al. | Sep 2014 | A1 |
20140279622 | Lamoureux et al. | Sep 2014 | A1 |
20140279739 | Elkington et al. | Sep 2014 | A1 |
20140279787 | Cheng et al. | Sep 2014 | A1 |
20140280072 | Coleman | Sep 2014 | A1 |
20140280107 | Heymans et al. | Sep 2014 | A1 |
20140280138 | Li et al. | Sep 2014 | A1 |
20140280292 | Skinder | Sep 2014 | A1 |
20140280353 | Delaney et al. | Sep 2014 | A1 |
20140280450 | Luna | Sep 2014 | A1 |
20140280757 | Tran | Sep 2014 | A1 |
20140281944 | Winer | Sep 2014 | A1 |
20140281983 | Xian et al. | Sep 2014 | A1 |
20140281997 | Fleizach et al. | Sep 2014 | A1 |
20140282003 | Gruber et al. | Sep 2014 | A1 |
20140282007 | Fleizach | Sep 2014 | A1 |
20140282045 | Ayanam et al. | Sep 2014 | A1 |
20140282178 | Borzello et al. | Sep 2014 | A1 |
20140282201 | Pasquero et al. | Sep 2014 | A1 |
20140282203 | Pasquero et al. | Sep 2014 | A1 |
20140282559 | Verduzco et al. | Sep 2014 | A1 |
20140282586 | Shear et al. | Sep 2014 | A1 |
20140282743 | Howard et al. | Sep 2014 | A1 |
20140288990 | Moore et al. | Sep 2014 | A1 |
20140289508 | Wang | Sep 2014 | A1 |
20140297267 | Spencer et al. | Oct 2014 | A1 |
20140297281 | Togawa et al. | Oct 2014 | A1 |
20140297284 | Gruber et al. | Oct 2014 | A1 |
20140297288 | Yu et al. | Oct 2014 | A1 |
20140298395 | Yang et al. | Oct 2014 | A1 |
20140304086 | Dasdan et al. | Oct 2014 | A1 |
20140304605 | Ohmura et al. | Oct 2014 | A1 |
20140309990 | Gandrabur et al. | Oct 2014 | A1 |
20140309996 | Zhang | Oct 2014 | A1 |
20140310001 | Kalns et al. | Oct 2014 | A1 |
20140310002 | Nitz et al. | Oct 2014 | A1 |
20140310348 | Keskitalo et al. | Oct 2014 | A1 |
20140310365 | Sample et al. | Oct 2014 | A1 |
20140310595 | Acharya et al. | Oct 2014 | A1 |
20140313007 | Harding | Oct 2014 | A1 |
20140315492 | Woods | Oct 2014 | A1 |
20140316585 | Boesveld et al. | Oct 2014 | A1 |
20140317030 | Shen et al. | Oct 2014 | A1 |
20140317502 | Brown et al. | Oct 2014 | A1 |
20140324429 | Weilhammer et al. | Oct 2014 | A1 |
20140324884 | Lindahl et al. | Oct 2014 | A1 |
20140330560 | Venkatesha et al. | Nov 2014 | A1 |
20140330569 | Kolavennu et al. | Nov 2014 | A1 |
20140330951 | Sukoff et al. | Nov 2014 | A1 |
20140335823 | Heredia et al. | Nov 2014 | A1 |
20140337037 | Chi | Nov 2014 | A1 |
20140337048 | Brown et al. | Nov 2014 | A1 |
20140337266 | Wolverton et al. | Nov 2014 | A1 |
20140337370 | Aravamudan et al. | Nov 2014 | A1 |
20140337371 | Li | Nov 2014 | A1 |
20140337438 | Govande et al. | Nov 2014 | A1 |
20140337621 | Nakhimov | Nov 2014 | A1 |
20140337751 | Lim et al. | Nov 2014 | A1 |
20140337814 | Kalns et al. | Nov 2014 | A1 |
20140342762 | Hajdu et al. | Nov 2014 | A1 |
20140343834 | Demerchant et al. | Nov 2014 | A1 |
20140343943 | Al-telmissani | Nov 2014 | A1 |
20140343946 | Torok et al. | Nov 2014 | A1 |
20140344205 | Luna et al. | Nov 2014 | A1 |
20140344627 | Schaub et al. | Nov 2014 | A1 |
20140344687 | Durham et al. | Nov 2014 | A1 |
20140347181 | Luna et al. | Nov 2014 | A1 |
20140350847 | Ichinokawa | Nov 2014 | A1 |
20140350924 | Zurek et al. | Nov 2014 | A1 |
20140350933 | Bak et al. | Nov 2014 | A1 |
20140351741 | Medlock et al. | Nov 2014 | A1 |
20140351760 | Skory et al. | Nov 2014 | A1 |
20140358519 | Mirkin et al. | Dec 2014 | A1 |
20140358521 | Mikutel et al. | Dec 2014 | A1 |
20140358523 | Sheth et al. | Dec 2014 | A1 |
20140358549 | O'connor et al. | Dec 2014 | A1 |
20140359637 | Yan | Dec 2014 | A1 |
20140359709 | Nassar et al. | Dec 2014 | A1 |
20140361973 | Raux et al. | Dec 2014 | A1 |
20140363074 | Dolfing et al. | Dec 2014 | A1 |
20140364149 | Marti et al. | Dec 2014 | A1 |
20140365209 | Evermann | Dec 2014 | A1 |
20140365214 | Bayley | Dec 2014 | A1 |
20140365216 | Gruber et al. | Dec 2014 | A1 |
20140365226 | Sinha | Dec 2014 | A1 |
20140365227 | Cash et al. | Dec 2014 | A1 |
20140365407 | Brown et al. | Dec 2014 | A1 |
20140365505 | Clark et al. | Dec 2014 | A1 |
20140365880 | Bellegarda | Dec 2014 | A1 |
20140365885 | Carson et al. | Dec 2014 | A1 |
20140365895 | Magahern et al. | Dec 2014 | A1 |
20140365922 | Yang | Dec 2014 | A1 |
20140365945 | Karunamuni et al. | Dec 2014 | A1 |
20140370817 | Luna | Dec 2014 | A1 |
20140370841 | Roberts et al. | Dec 2014 | A1 |
20140372112 | Xue et al. | Dec 2014 | A1 |
20140372356 | Bilal et al. | Dec 2014 | A1 |
20140372468 | Collins et al. | Dec 2014 | A1 |
20140372931 | Zhai et al. | Dec 2014 | A1 |
20140379326 | Sarikaya et al. | Dec 2014 | A1 |
20140379334 | Fry | Dec 2014 | A1 |
20140379338 | Fry | Dec 2014 | A1 |
20140379341 | Seo et al. | Dec 2014 | A1 |
20140379798 | Bunner et al. | Dec 2014 | A1 |
20140380285 | Gabel et al. | Dec 2014 | A1 |
20150003797 | Schmidt | Jan 2015 | A1 |
20150004958 | Wang et al. | Jan 2015 | A1 |
20150005009 | Tomkins et al. | Jan 2015 | A1 |
20150006147 | Schmidt | Jan 2015 | A1 |
20150006148 | Goldszmit et al. | Jan 2015 | A1 |
20150006157 | Silva et al. | Jan 2015 | A1 |
20150006167 | Kato et al. | Jan 2015 | A1 |
20150006176 | Pogue et al. | Jan 2015 | A1 |
20150006178 | Peng et al. | Jan 2015 | A1 |
20150006184 | Marti et al. | Jan 2015 | A1 |
20150006199 | Snider et al. | Jan 2015 | A1 |
20150012271 | Peng et al. | Jan 2015 | A1 |
20150012862 | Ikeda et al. | Jan 2015 | A1 |
20150019219 | Tzirkel-Hancock et al. | Jan 2015 | A1 |
20150019221 | Lee et al. | Jan 2015 | A1 |
20150019445 | Glass et al. | Jan 2015 | A1 |
20150019944 | Kalgi | Jan 2015 | A1 |
20150019954 | Dalal et al. | Jan 2015 | A1 |
20150019974 | Doi et al. | Jan 2015 | A1 |
20150025405 | Vairavan et al. | Jan 2015 | A1 |
20150025890 | Jagatheesan et al. | Jan 2015 | A1 |
20150026620 | Kwon et al. | Jan 2015 | A1 |
20150027178 | Scalisi | Jan 2015 | A1 |
20150031416 | Labowicz et al. | Jan 2015 | A1 |
20150032443 | Karov et al. | Jan 2015 | A1 |
20150032457 | Koo et al. | Jan 2015 | A1 |
20150033130 | Scheessele | Jan 2015 | A1 |
20150033219 | Breiner et al. | Jan 2015 | A1 |
20150033275 | Natani et al. | Jan 2015 | A1 |
20150034855 | Shen | Feb 2015 | A1 |
20150038161 | Jakobson et al. | Feb 2015 | A1 |
20150039292 | Suleman et al. | Feb 2015 | A1 |
20150039295 | Soschen | Feb 2015 | A1 |
20150039299 | Weinstein et al. | Feb 2015 | A1 |
20150039305 | Huang | Feb 2015 | A1 |
20150039606 | Salaka et al. | Feb 2015 | A1 |
20150040012 | Faaborg et al. | Feb 2015 | A1 |
20150042640 | Algreatly | Feb 2015 | A1 |
20150045003 | Vora et al. | Feb 2015 | A1 |
20150045007 | Cash | Feb 2015 | A1 |
20150045068 | Soffer et al. | Feb 2015 | A1 |
20150046375 | Mandel et al. | Feb 2015 | A1 |
20150046434 | Lim et al. | Feb 2015 | A1 |
20150046537 | Rakib | Feb 2015 | A1 |
20150046828 | Desai et al. | Feb 2015 | A1 |
20150050633 | Christmas et al. | Feb 2015 | A1 |
20150050923 | Tu et al. | Feb 2015 | A1 |
20150051754 | Kwon et al. | Feb 2015 | A1 |
20150051901 | Stonehouse et al. | Feb 2015 | A1 |
20150052128 | Sharifi | Feb 2015 | A1 |
20150053779 | Adamek et al. | Feb 2015 | A1 |
20150053781 | Nelson et al. | Feb 2015 | A1 |
20150055879 | Yang | Feb 2015 | A1 |
20150058013 | Pakhomov et al. | Feb 2015 | A1 |
20150058018 | Georges et al. | Feb 2015 | A1 |
20150058720 | Smadja et al. | Feb 2015 | A1 |
20150058785 | Ookawara | Feb 2015 | A1 |
20150065149 | Russell et al. | Mar 2015 | A1 |
20150065200 | Namgung et al. | Mar 2015 | A1 |
20150066473 | Jeong et al. | Mar 2015 | A1 |
20150066479 | Pasupalak et al. | Mar 2015 | A1 |
20150066494 | Salvador et al. | Mar 2015 | A1 |
20150066496 | Deoras et al. | Mar 2015 | A1 |
20150066506 | Romano et al. | Mar 2015 | A1 |
20150066516 | Nishikawa et al. | Mar 2015 | A1 |
20150066817 | Slayton et al. | Mar 2015 | A1 |
20150067485 | Kim et al. | Mar 2015 | A1 |
20150067819 | Shribman et al. | Mar 2015 | A1 |
20150067822 | Randall | Mar 2015 | A1 |
20150071121 | Patil et al. | Mar 2015 | A1 |
20150073788 | Sak et al. | Mar 2015 | A1 |
20150073804 | Senior et al. | Mar 2015 | A1 |
20150074524 | Nicholson et al. | Mar 2015 | A1 |
20150074615 | Han et al. | Mar 2015 | A1 |
20150081295 | Yun et al. | Mar 2015 | A1 |
20150082180 | Ames et al. | Mar 2015 | A1 |
20150082229 | Ouyang et al. | Mar 2015 | A1 |
20150086174 | Abecassis et al. | Mar 2015 | A1 |
20150088511 | Bharadwaj et al. | Mar 2015 | A1 |
20150088514 | Typrin | Mar 2015 | A1 |
20150088518 | Kim et al. | Mar 2015 | A1 |
20150088522 | Hendrickson et al. | Mar 2015 | A1 |
20150088523 | Schuster | Mar 2015 | A1 |
20150088998 | Isensee et al. | Mar 2015 | A1 |
20150092520 | Robison et al. | Apr 2015 | A1 |
20150094834 | Vega et al. | Apr 2015 | A1 |
20150095031 | Conkie et al. | Apr 2015 | A1 |
20150095159 | Kennewick et al. | Apr 2015 | A1 |
20150095268 | Greenzeiger et al. | Apr 2015 | A1 |
20150095278 | Flinn et al. | Apr 2015 | A1 |
20150100144 | Lee et al. | Apr 2015 | A1 |
20150100313 | Sharma | Apr 2015 | A1 |
20150100316 | Williams et al. | Apr 2015 | A1 |
20150100537 | Grieves et al. | Apr 2015 | A1 |
20150100983 | Pan | Apr 2015 | A1 |
20150106061 | Yang et al. | Apr 2015 | A1 |
20150106085 | Lindahl | Apr 2015 | A1 |
20150106093 | Weeks et al. | Apr 2015 | A1 |
20150106737 | Montoy-Wilson et al. | Apr 2015 | A1 |
20150112684 | Scheffer et al. | Apr 2015 | A1 |
20150113407 | Hoffert et al. | Apr 2015 | A1 |
20150113435 | Phillips | Apr 2015 | A1 |
20150113454 | McLaughlin | Apr 2015 | A1 |
20150120296 | Stern et al. | Apr 2015 | A1 |
20150120641 | Soon-Shiong et al. | Apr 2015 | A1 |
20150120723 | Deshmukh et al. | Apr 2015 | A1 |
20150121216 | Brown et al. | Apr 2015 | A1 |
20150121227 | Peng | Apr 2015 | A1 |
20150123898 | Kim et al. | May 2015 | A1 |
20150127336 | Lei et al. | May 2015 | A1 |
20150127337 | Heigold et al. | May 2015 | A1 |
20150127348 | Follis | May 2015 | A1 |
20150127350 | Agiomyrgiannakis | May 2015 | A1 |
20150128058 | Anajwala | May 2015 | A1 |
20150133049 | Lee et al. | May 2015 | A1 |
20150133109 | Freeman et al. | May 2015 | A1 |
20150134318 | Cuthbert et al. | May 2015 | A1 |
20150134322 | Cuthbert et al. | May 2015 | A1 |
20150134323 | Cuthbert et al. | May 2015 | A1 |
20150134334 | Sachidanandam et al. | May 2015 | A1 |
20150135085 | Shoham et al. | May 2015 | A1 |
20150135123 | Carr et al. | May 2015 | A1 |
20150140934 | Abdurrahman et al. | May 2015 | A1 |
20150140990 | Kim et al. | May 2015 | A1 |
20150141150 | Zha | May 2015 | A1 |
20150142420 | Sarikaya et al. | May 2015 | A1 |
20150142438 | Dai et al. | May 2015 | A1 |
20150142440 | Parkinson et al. | May 2015 | A1 |
20150142447 | Kennewick et al. | May 2015 | A1 |
20150142851 | Gupta et al. | May 2015 | A1 |
20150143419 | Bhagwat et al. | May 2015 | A1 |
20150148013 | Baldwin et al. | May 2015 | A1 |
20150149146 | Abramovitz et al. | May 2015 | A1 |
20150149177 | Kalns et al. | May 2015 | A1 |
20150149182 | Kalns et al. | May 2015 | A1 |
20150149354 | McCoy | May 2015 | A1 |
20150149469 | Xu et al. | May 2015 | A1 |
20150149899 | Bernstein et al. | May 2015 | A1 |
20150149964 | Bernstein et al. | May 2015 | A1 |
20150154001 | Knox et al. | Jun 2015 | A1 |
20150154185 | Waibel | Jun 2015 | A1 |
20150154976 | Mutagi | Jun 2015 | A1 |
20150160855 | Bi | Jun 2015 | A1 |
20150161291 | Gur et al. | Jun 2015 | A1 |
20150161370 | North et al. | Jun 2015 | A1 |
20150161521 | Shah et al. | Jun 2015 | A1 |
20150161989 | Hsu et al. | Jun 2015 | A1 |
20150162000 | Di Censo et al. | Jun 2015 | A1 |
20150162001 | Kar et al. | Jun 2015 | A1 |
20150162006 | Kummer | Jun 2015 | A1 |
20150163558 | Wheatley | Jun 2015 | A1 |
20150169081 | Neels et al. | Jun 2015 | A1 |
20150169195 | Choi | Jun 2015 | A1 |
20150169284 | Quast et al. | Jun 2015 | A1 |
20150169336 | Harper et al. | Jun 2015 | A1 |
20150169696 | Krishnappa et al. | Jun 2015 | A1 |
20150170073 | Baker | Jun 2015 | A1 |
20150170664 | Doherty et al. | Jun 2015 | A1 |
20150172262 | Ortiz, Jr. et al. | Jun 2015 | A1 |
20150172463 | Quast et al. | Jun 2015 | A1 |
20150178388 | Winnemoeller et al. | Jun 2015 | A1 |
20150178785 | Salonen | Jun 2015 | A1 |
20150179168 | Hakkani-tur et al. | Jun 2015 | A1 |
20150179176 | Ryu et al. | Jun 2015 | A1 |
20150181285 | Zhang et al. | Jun 2015 | A1 |
20150185718 | Tappan et al. | Jul 2015 | A1 |
20150185964 | Stout | Jul 2015 | A1 |
20150185993 | Wheatley et al. | Jul 2015 | A1 |
20150185996 | Brown et al. | Jul 2015 | A1 |
20150186012 | Coleman et al. | Jul 2015 | A1 |
20150186110 | Kannan | Jul 2015 | A1 |
20150186154 | Brown et al. | Jul 2015 | A1 |
20150186155 | Brown et al. | Jul 2015 | A1 |
20150186156 | Brown et al. | Jul 2015 | A1 |
20150186351 | Hicks et al. | Jul 2015 | A1 |
20150186538 | Yan et al. | Jul 2015 | A1 |
20150186783 | Byrne et al. | Jul 2015 | A1 |
20150186892 | Zhang et al. | Jul 2015 | A1 |
20150187355 | Parkinson et al. | Jul 2015 | A1 |
20150187369 | Dadu et al. | Jul 2015 | A1 |
20150189362 | Lee et al. | Jul 2015 | A1 |
20150193379 | Mehta | Jul 2015 | A1 |
20150193391 | Khvostichenko et al. | Jul 2015 | A1 |
20150193392 | Greenblatt et al. | Jul 2015 | A1 |
20150194152 | Katuri et al. | Jul 2015 | A1 |
20150194165 | Faaborg et al. | Jul 2015 | A1 |
20150195379 | Zhang et al. | Jul 2015 | A1 |
20150195606 | McDevitt | Jul 2015 | A1 |
20150199077 | Zuger et al. | Jul 2015 | A1 |
20150199960 | Huo et al. | Jul 2015 | A1 |
20150199965 | Leak et al. | Jul 2015 | A1 |
20150199967 | Reddy et al. | Jul 2015 | A1 |
20150200879 | Wu et al. | Jul 2015 | A1 |
20150201064 | Bells et al. | Jul 2015 | A1 |
20150201077 | Konig et al. | Jul 2015 | A1 |
20150205425 | Kuscher et al. | Jul 2015 | A1 |
20150205568 | Matsuoka | Jul 2015 | A1 |
20150205632 | Gaster | Jul 2015 | A1 |
20150205858 | Xie et al. | Jul 2015 | A1 |
20150206529 | Kwon et al. | Jul 2015 | A1 |
20150208226 | Kuusilinna et al. | Jul 2015 | A1 |
20150212791 | Kumar et al. | Jul 2015 | A1 |
20150213140 | Volkert | Jul 2015 | A1 |
20150213796 | Waltermann et al. | Jul 2015 | A1 |
20150215258 | Nowakowski et al. | Jul 2015 | A1 |
20150215350 | Slayton et al. | Jul 2015 | A1 |
20150217870 | Mccullough et al. | Aug 2015 | A1 |
20150220264 | Lewis et al. | Aug 2015 | A1 |
20150220507 | Mohajer et al. | Aug 2015 | A1 |
20150220715 | Kim et al. | Aug 2015 | A1 |
20150220972 | Subramanya et al. | Aug 2015 | A1 |
20150221302 | Han et al. | Aug 2015 | A1 |
20150221304 | Stewart | Aug 2015 | A1 |
20150221307 | Shah et al. | Aug 2015 | A1 |
20150222586 | Ebersman et al. | Aug 2015 | A1 |
20150224848 | Eisenhour | Aug 2015 | A1 |
20150227505 | Morimoto | Aug 2015 | A1 |
20150227633 | Shapira | Aug 2015 | A1 |
20150228274 | Leppanen et al. | Aug 2015 | A1 |
20150228275 | Watanabe et al. | Aug 2015 | A1 |
20150228281 | Raniere | Aug 2015 | A1 |
20150228282 | Evrard | Aug 2015 | A1 |
20150228283 | Ehsani et al. | Aug 2015 | A1 |
20150228292 | Goldstein et al. | Aug 2015 | A1 |
20150230095 | Smith et al. | Aug 2015 | A1 |
20150234556 | Shaofeng et al. | Aug 2015 | A1 |
20150234636 | Barnes, Jr. | Aug 2015 | A1 |
20150234800 | Patrick et al. | Aug 2015 | A1 |
20150235434 | Miller et al. | Aug 2015 | A1 |
20150235540 | Verna et al. | Aug 2015 | A1 |
20150237301 | Shi et al. | Aug 2015 | A1 |
20150242091 | Lu et al. | Aug 2015 | A1 |
20150242385 | Bao et al. | Aug 2015 | A1 |
20150243278 | Kibre et al. | Aug 2015 | A1 |
20150243279 | Morse et al. | Aug 2015 | A1 |
20150243283 | Halash et al. | Aug 2015 | A1 |
20150244665 | Choi et al. | Aug 2015 | A1 |
20150245154 | Dadu et al. | Aug 2015 | A1 |
20150248494 | Mital | Sep 2015 | A1 |
20150248651 | Akutagawa et al. | Sep 2015 | A1 |
20150248886 | Sarikaya et al. | Sep 2015 | A1 |
20150249715 | Helvik et al. | Sep 2015 | A1 |
20150253146 | Annapureddy et al. | Sep 2015 | A1 |
20150253885 | Kagan et al. | Sep 2015 | A1 |
20150254057 | Klein et al. | Sep 2015 | A1 |
20150254058 | Klein et al. | Sep 2015 | A1 |
20150254333 | Fife et al. | Sep 2015 | A1 |
20150255068 | Kim et al. | Sep 2015 | A1 |
20150255071 | Chiba | Sep 2015 | A1 |
20150256873 | Klein et al. | Sep 2015 | A1 |
20150261298 | Li | Sep 2015 | A1 |
20150261496 | Faaborg et al. | Sep 2015 | A1 |
20150261850 | Mittal | Sep 2015 | A1 |
20150261944 | Hosom et al. | Sep 2015 | A1 |
20150262443 | Chong | Sep 2015 | A1 |
20150262573 | Brooks et al. | Sep 2015 | A1 |
20150262583 | Kanda et al. | Sep 2015 | A1 |
20150269139 | McAteer et al. | Sep 2015 | A1 |
20150269617 | Mikurak | Sep 2015 | A1 |
20150269677 | Milne | Sep 2015 | A1 |
20150269943 | VanBlon et al. | Sep 2015 | A1 |
20150277574 | Jain et al. | Oct 2015 | A1 |
20150278199 | Hazen et al. | Oct 2015 | A1 |
20150278348 | Paruchuri et al. | Oct 2015 | A1 |
20150278370 | Stratvert et al. | Oct 2015 | A1 |
20150278737 | Chen Huebscher et al. | Oct 2015 | A1 |
20150279354 | Gruenstein et al. | Oct 2015 | A1 |
20150279358 | Kingsbury et al. | Oct 2015 | A1 |
20150279360 | Mengibar et al. | Oct 2015 | A1 |
20150279366 | Krestnikov et al. | Oct 2015 | A1 |
20150281380 | Wang et al. | Oct 2015 | A1 |
20150281401 | Le et al. | Oct 2015 | A1 |
20150286627 | Chang et al. | Oct 2015 | A1 |
20150286716 | Snibbe et al. | Oct 2015 | A1 |
20150286937 | Hildebrand | Oct 2015 | A1 |
20150287401 | Lee et al. | Oct 2015 | A1 |
20150287408 | Svendsen et al. | Oct 2015 | A1 |
20150287409 | Jang | Oct 2015 | A1 |
20150287411 | Kojima et al. | Oct 2015 | A1 |
20150288629 | Choi et al. | Oct 2015 | A1 |
20150294086 | Kare et al. | Oct 2015 | A1 |
20150294377 | Chow | Oct 2015 | A1 |
20150294516 | Chiang | Oct 2015 | A1 |
20150294670 | Roblek et al. | Oct 2015 | A1 |
20150295915 | Xiu | Oct 2015 | A1 |
20150296065 | Narita et al. | Oct 2015 | A1 |
20150301796 | Visser et al. | Oct 2015 | A1 |
20150302316 | Buryak et al. | Oct 2015 | A1 |
20150302855 | Kim et al. | Oct 2015 | A1 |
20150302856 | Kim et al. | Oct 2015 | A1 |
20150302857 | Yamada | Oct 2015 | A1 |
20150302870 | Burke et al. | Oct 2015 | A1 |
20150308470 | Graham et al. | Oct 2015 | A1 |
20150309691 | Seo et al. | Oct 2015 | A1 |
20150309997 | Lee et al. | Oct 2015 | A1 |
20150310114 | Ryger et al. | Oct 2015 | A1 |
20150310852 | Spizzo et al. | Oct 2015 | A1 |
20150310858 | Li et al. | Oct 2015 | A1 |
20150310862 | Dauphin et al. | Oct 2015 | A1 |
20150310879 | Buchanan et al. | Oct 2015 | A1 |
20150310888 | Chen | Oct 2015 | A1 |
20150312182 | Langholz | Oct 2015 | A1 |
20150312409 | Czarnecki et al. | Oct 2015 | A1 |
20150314454 | Breazeal et al. | Nov 2015 | A1 |
20150317069 | Clements et al. | Nov 2015 | A1 |
20150317310 | Eiche et al. | Nov 2015 | A1 |
20150319264 | Allen et al. | Nov 2015 | A1 |
20150319411 | Kasmir et al. | Nov 2015 | A1 |
20150324041 | Varley et al. | Nov 2015 | A1 |
20150324334 | Lee et al. | Nov 2015 | A1 |
20150324362 | Glass et al. | Nov 2015 | A1 |
20150331664 | Osawa et al. | Nov 2015 | A1 |
20150331711 | Huang et al. | Nov 2015 | A1 |
20150332667 | Mason | Nov 2015 | A1 |
20150334346 | Cheatham, III et al. | Nov 2015 | A1 |
20150339049 | Kasemset et al. | Nov 2015 | A1 |
20150339391 | Kang et al. | Nov 2015 | A1 |
20150340033 | Di Fabbrizio et al. | Nov 2015 | A1 |
20150340034 | Schalkwyk et al. | Nov 2015 | A1 |
20150340040 | Mun et al. | Nov 2015 | A1 |
20150340042 | Sejnoha et al. | Nov 2015 | A1 |
20150341717 | Song et al. | Nov 2015 | A1 |
20150346845 | Di Censo et al. | Dec 2015 | A1 |
20150347086 | Liedholm et al. | Dec 2015 | A1 |
20150347381 | Bellegarda | Dec 2015 | A1 |
20150347382 | Dolfing et al. | Dec 2015 | A1 |
20150347383 | Willmore et al. | Dec 2015 | A1 |
20150347385 | Flor et al. | Dec 2015 | A1 |
20150347393 | Futrell et al. | Dec 2015 | A1 |
20150347552 | Habouzit et al. | Dec 2015 | A1 |
20150347733 | Tsou et al. | Dec 2015 | A1 |
20150347985 | Gross et al. | Dec 2015 | A1 |
20150348533 | Saddler et al. | Dec 2015 | A1 |
20150348547 | Paulik et al. | Dec 2015 | A1 |
20150348548 | Piernot et al. | Dec 2015 | A1 |
20150348549 | Giuli et al. | Dec 2015 | A1 |
20150348551 | Gruber et al. | Dec 2015 | A1 |
20150348554 | Orr et al. | Dec 2015 | A1 |
20150348555 | Sugita | Dec 2015 | A1 |
20150348565 | Rhoten et al. | Dec 2015 | A1 |
20150349934 | Pollack et al. | Dec 2015 | A1 |
20150350031 | Burks et al. | Dec 2015 | A1 |
20150350147 | Shepherd et al. | Dec 2015 | A1 |
20150350342 | Thorpe et al. | Dec 2015 | A1 |
20150350594 | Mate et al. | Dec 2015 | A1 |
20150352999 | Bando et al. | Dec 2015 | A1 |
20150355879 | Beckhardt et al. | Dec 2015 | A1 |
20150356410 | Faith et al. | Dec 2015 | A1 |
20150363587 | Ahn et al. | Dec 2015 | A1 |
20150364128 | Zhao et al. | Dec 2015 | A1 |
20150364140 | Thörn | Dec 2015 | A1 |
20150365251 | Kinoshita et al. | Dec 2015 | A1 |
20150370455 | Van Os et al. | Dec 2015 | A1 |
20150370531 | Faaborg | Dec 2015 | A1 |
20150370780 | Wang et al. | Dec 2015 | A1 |
20150370787 | Akbacak et al. | Dec 2015 | A1 |
20150370884 | Hurley et al. | Dec 2015 | A1 |
20150371215 | Zhou et al. | Dec 2015 | A1 |
20150371529 | Dolecki | Dec 2015 | A1 |
20150371639 | Foerster et al. | Dec 2015 | A1 |
20150371663 | Gustafson et al. | Dec 2015 | A1 |
20150371664 | Bar-or et al. | Dec 2015 | A1 |
20150371665 | Naik et al. | Dec 2015 | A1 |
20150373183 | Woolsey et al. | Dec 2015 | A1 |
20150379118 | Wickenkamp et al. | Dec 2015 | A1 |
20150379414 | Yeh et al. | Dec 2015 | A1 |
20150379993 | Subhojit et al. | Dec 2015 | A1 |
20150381923 | Wickenkamp et al. | Dec 2015 | A1 |
20150382047 | Van Os et al. | Dec 2015 | A1 |
20150382079 | Lister et al. | Dec 2015 | A1 |
20150382147 | Clark et al. | Dec 2015 | A1 |
20160004499 | Kim et al. | Jan 2016 | A1 |
20160004690 | Bangalore et al. | Jan 2016 | A1 |
20160005320 | deCharms et al. | Jan 2016 | A1 |
20160006795 | Yunten | Jan 2016 | A1 |
20160012038 | Edwards et al. | Jan 2016 | A1 |
20160014476 | Caliendo, Jr. et al. | Jan 2016 | A1 |
20160018872 | Tu et al. | Jan 2016 | A1 |
20160018900 | Tu et al. | Jan 2016 | A1 |
20160018959 | Yamashita et al. | Jan 2016 | A1 |
20160019886 | Hong | Jan 2016 | A1 |
20160019896 | Alvarez Guevara et al. | Jan 2016 | A1 |
20160021414 | Padi et al. | Jan 2016 | A1 |
20160026242 | Burns et al. | Jan 2016 | A1 |
20160026258 | Ou et al. | Jan 2016 | A1 |
20160027431 | Kurzweil et al. | Jan 2016 | A1 |
20160028666 | Li | Jan 2016 | A1 |
20160028802 | Balasingh et al. | Jan 2016 | A1 |
20160029316 | Mohan et al. | Jan 2016 | A1 |
20160034042 | Joo | Feb 2016 | A1 |
20160034447 | Shin et al. | Feb 2016 | A1 |
20160034811 | Paulik et al. | Feb 2016 | A1 |
20160036750 | Yuan et al. | Feb 2016 | A1 |
20160036953 | Lee et al. | Feb 2016 | A1 |
20160041809 | Clayton et al. | Feb 2016 | A1 |
20160042735 | Vibbert et al. | Feb 2016 | A1 |
20160042748 | Jain et al. | Feb 2016 | A1 |
20160043905 | Fiedler | Feb 2016 | A1 |
20160048666 | Dey et al. | Feb 2016 | A1 |
20160050254 | Rao et al. | Feb 2016 | A1 |
20160055422 | Li | Feb 2016 | A1 |
20160057203 | Gärdenfors et al. | Feb 2016 | A1 |
20160057475 | Liu | Feb 2016 | A1 |
20160061623 | Pahwa et al. | Mar 2016 | A1 |
20160062459 | Publicover et al. | Mar 2016 | A1 |
20160062605 | Agarwal et al. | Mar 2016 | A1 |
20160063094 | Udupa et al. | Mar 2016 | A1 |
20160063095 | Nassar et al. | Mar 2016 | A1 |
20160063998 | Krishnamoorthy et al. | Mar 2016 | A1 |
20160065155 | Bharj et al. | Mar 2016 | A1 |
20160065626 | Jain et al. | Mar 2016 | A1 |
20160066020 | Mountain | Mar 2016 | A1 |
20160070581 | Soon-Shiong | Mar 2016 | A1 |
20160071516 | Lee et al. | Mar 2016 | A1 |
20160071517 | Beaver et al. | Mar 2016 | A1 |
20160071520 | Hayakawa | Mar 2016 | A1 |
20160071521 | Haughay | Mar 2016 | A1 |
20160072940 | Cronin | Mar 2016 | A1 |
20160077794 | Kim et al. | Mar 2016 | A1 |
20160078359 | Csurka et al. | Mar 2016 | A1 |
20160078860 | Paulik et al. | Mar 2016 | A1 |
20160080165 | Ehsani et al. | Mar 2016 | A1 |
20160080475 | Singh et al. | Mar 2016 | A1 |
20160085295 | Shimy et al. | Mar 2016 | A1 |
20160085827 | Chadha et al. | Mar 2016 | A1 |
20160086116 | Rao et al. | Mar 2016 | A1 |
20160086599 | Kurata et al. | Mar 2016 | A1 |
20160088335 | Zucchetta | Mar 2016 | A1 |
20160091871 | Marti et al. | Mar 2016 | A1 |
20160091967 | Prokofieva et al. | Mar 2016 | A1 |
20160092046 | Hong et al. | Mar 2016 | A1 |
20160092434 | Bellegarda | Mar 2016 | A1 |
20160092447 | Pathurudeen et al. | Mar 2016 | A1 |
20160092766 | Sainath et al. | Mar 2016 | A1 |
20160093291 | Kim | Mar 2016 | A1 |
20160093298 | Naik et al. | Mar 2016 | A1 |
20160093301 | Bellegarda et al. | Mar 2016 | A1 |
20160093304 | Kim et al. | Mar 2016 | A1 |
20160094700 | Lee et al. | Mar 2016 | A1 |
20160094889 | Venkataraman et al. | Mar 2016 | A1 |
20160094979 | Naik et al. | Mar 2016 | A1 |
20160098991 | Luo et al. | Apr 2016 | A1 |
20160098992 | Renard et al. | Apr 2016 | A1 |
20160099892 | Palakovich et al. | Apr 2016 | A1 |
20160099984 | Karagiannis et al. | Apr 2016 | A1 |
20160104480 | Sharifi | Apr 2016 | A1 |
20160104486 | Penilla et al. | Apr 2016 | A1 |
20160105308 | Dutt | Apr 2016 | A1 |
20160111091 | Bakish | Apr 2016 | A1 |
20160112746 | Zhang et al. | Apr 2016 | A1 |
20160112792 | Lee et al. | Apr 2016 | A1 |
20160117386 | Ajmera et al. | Apr 2016 | A1 |
20160118048 | Heide | Apr 2016 | A1 |
20160119338 | Cheyer | Apr 2016 | A1 |
20160125048 | Hamada | May 2016 | A1 |
20160125071 | Gabbai | May 2016 | A1 |
20160132046 | Beoughter et al. | May 2016 | A1 |
20160132290 | Raux | May 2016 | A1 |
20160132484 | Nauze et al. | May 2016 | A1 |
20160132488 | Clark et al. | May 2016 | A1 |
20160133254 | Vogel et al. | May 2016 | A1 |
20160139662 | Dabhade | May 2016 | A1 |
20160140951 | Agiomyrgiannakis et al. | May 2016 | A1 |
20160140962 | Sharifi | May 2016 | A1 |
20160147725 | Patten et al. | May 2016 | A1 |
20160147739 | Lim et al. | May 2016 | A1 |
20160148610 | Kennewick, Jr. et al. | May 2016 | A1 |
20160148612 | Guo et al. | May 2016 | A1 |
20160148613 | Kwon et al. | May 2016 | A1 |
20160149966 | Remash et al. | May 2016 | A1 |
20160150020 | Farmer et al. | May 2016 | A1 |
20160151668 | Barnes et al. | Jun 2016 | A1 |
20160154624 | Son et al. | Jun 2016 | A1 |
20160154880 | Hoarty | Jun 2016 | A1 |
20160155442 | Kannan et al. | Jun 2016 | A1 |
20160155443 | Khan et al. | Jun 2016 | A1 |
20160156574 | Hum et al. | Jun 2016 | A1 |
20160156990 | Miccoy et al. | Jun 2016 | A1 |
20160162456 | Munro et al. | Jun 2016 | A1 |
20160163311 | Crook et al. | Jun 2016 | A1 |
20160163312 | Naik et al. | Jun 2016 | A1 |
20160170710 | Kim et al. | Jun 2016 | A1 |
20160170966 | Kolo | Jun 2016 | A1 |
20160171980 | Liddell et al. | Jun 2016 | A1 |
20160173578 | Sharma et al. | Jun 2016 | A1 |
20160173617 | Allinson | Jun 2016 | A1 |
20160173929 | Klappert | Jun 2016 | A1 |
20160173960 | Snibbe et al. | Jun 2016 | A1 |
20160179462 | Bjorkengren | Jun 2016 | A1 |
20160179464 | Reddy et al. | Jun 2016 | A1 |
20160179787 | Deleeuw | Jun 2016 | A1 |
20160180840 | Siddiq et al. | Jun 2016 | A1 |
20160180844 | Vanblon et al. | Jun 2016 | A1 |
20160182410 | Janakiraman et al. | Jun 2016 | A1 |
20160182709 | Kim et al. | Jun 2016 | A1 |
20160188181 | Smith | Jun 2016 | A1 |
20160188738 | Gruber et al. | Jun 2016 | A1 |
20160189198 | Daniel et al. | Jun 2016 | A1 |
20160189715 | Nishikawa | Jun 2016 | A1 |
20160189717 | Kannan et al. | Jun 2016 | A1 |
20160196110 | Yehoshua et al. | Jul 2016 | A1 |
20160198319 | Huang et al. | Jul 2016 | A1 |
20160203002 | Kannan et al. | Jul 2016 | A1 |
20160203193 | Kevin et al. | Jul 2016 | A1 |
20160210551 | Lee et al. | Jul 2016 | A1 |
20160210981 | Lee | Jul 2016 | A1 |
20160212206 | Wu et al. | Jul 2016 | A1 |
20160212208 | Kulkarni et al. | Jul 2016 | A1 |
20160212488 | Os et al. | Jul 2016 | A1 |
20160217784 | Gelfenbeyn et al. | Jul 2016 | A1 |
20160217794 | Imoto et al. | Jul 2016 | A1 |
20160224540 | Stewart et al. | Aug 2016 | A1 |
20160224559 | Hicks et al. | Aug 2016 | A1 |
20160224774 | Pender | Aug 2016 | A1 |
20160225372 | Cheung et al. | Aug 2016 | A1 |
20160226956 | Hong et al. | Aug 2016 | A1 |
20160227107 | Beaumont | Aug 2016 | A1 |
20160227633 | Sun et al. | Aug 2016 | A1 |
20160232500 | Wang et al. | Aug 2016 | A1 |
20160234206 | Tunnell et al. | Aug 2016 | A1 |
20160239480 | Larcheveque et al. | Aug 2016 | A1 |
20160239568 | Packer et al. | Aug 2016 | A1 |
20160239645 | Heo et al. | Aug 2016 | A1 |
20160239848 | Chang et al. | Aug 2016 | A1 |
20160240187 | Fleizach et al. | Aug 2016 | A1 |
20160240189 | Lee et al. | Aug 2016 | A1 |
20160240192 | Raghuvir | Aug 2016 | A1 |
20160242148 | Reed | Aug 2016 | A1 |
20160247061 | Trask et al. | Aug 2016 | A1 |
20160249319 | Dotan-Cohen et al. | Aug 2016 | A1 |
20160253312 | Rhodes | Sep 2016 | A1 |
20160253528 | Gao et al. | Sep 2016 | A1 |
20160259623 | Sumner et al. | Sep 2016 | A1 |
20160259656 | Sumner et al. | Sep 2016 | A1 |
20160259779 | Labský et al. | Sep 2016 | A1 |
20160260431 | Newendorp et al. | Sep 2016 | A1 |
20160260433 | Sumner et al. | Sep 2016 | A1 |
20160260434 | Gelfenbeyn et al. | Sep 2016 | A1 |
20160260436 | Lemay et al. | Sep 2016 | A1 |
20160262442 | Davila et al. | Sep 2016 | A1 |
20160266871 | Schmid et al. | Sep 2016 | A1 |
20160267904 | Biadsy et al. | Sep 2016 | A1 |
20160269540 | Butcher et al. | Sep 2016 | A1 |
20160274938 | Strinati et al. | Sep 2016 | A1 |
20160275941 | Bellegarda et al. | Sep 2016 | A1 |
20160275947 | Li et al. | Sep 2016 | A1 |
20160282824 | Smallwood et al. | Sep 2016 | A1 |
20160282956 | Ouyang et al. | Sep 2016 | A1 |
20160283185 | Mclaren et al. | Sep 2016 | A1 |
20160284005 | Daniel et al. | Sep 2016 | A1 |
20160284199 | Dotan-Cohen et al. | Sep 2016 | A1 |
20160285808 | Franklin et al. | Sep 2016 | A1 |
20160286045 | Shaltiel et al. | Sep 2016 | A1 |
20160291831 | Baek | Oct 2016 | A1 |
20160293157 | Chen et al. | Oct 2016 | A1 |
20160293167 | Chen et al. | Oct 2016 | A1 |
20160293168 | Chen | Oct 2016 | A1 |
20160294755 | Prabhu | Oct 2016 | A1 |
20160294813 | Zou | Oct 2016 | A1 |
20160299685 | Zhai et al. | Oct 2016 | A1 |
20160299882 | Hegerty et al. | Oct 2016 | A1 |
20160299883 | Zhu et al. | Oct 2016 | A1 |
20160299977 | Hreha | Oct 2016 | A1 |
20160300571 | Foerster et al. | Oct 2016 | A1 |
20160301639 | Liu et al. | Oct 2016 | A1 |
20160306683 | Standley et al. | Oct 2016 | A1 |
20160307566 | Bellegarda | Oct 2016 | A1 |
20160308799 | Schubert et al. | Oct 2016 | A1 |
20160309035 | Li | Oct 2016 | A1 |
20160313906 | Kilchenko et al. | Oct 2016 | A1 |
20160314788 | Jitkoff et al. | Oct 2016 | A1 |
20160314789 | Marcheret et al. | Oct 2016 | A1 |
20160314792 | Alvarez et al. | Oct 2016 | A1 |
20160315996 | Ha et al. | Oct 2016 | A1 |
20160316349 | Lee et al. | Oct 2016 | A1 |
20160317924 | Tanaka et al. | Nov 2016 | A1 |
20160320838 | Teller et al. | Nov 2016 | A1 |
20160321239 | Iso-Sipilä et al. | Nov 2016 | A1 |
20160321243 | Walia et al. | Nov 2016 | A1 |
20160321261 | Spasojevic et al. | Nov 2016 | A1 |
20160321358 | Kanani et al. | Nov 2016 | A1 |
20160322043 | Bellegarda | Nov 2016 | A1 |
20160322044 | Jung et al. | Nov 2016 | A1 |
20160322045 | Hatfield et al. | Nov 2016 | A1 |
20160322048 | Amano et al. | Nov 2016 | A1 |
20160322050 | Wang et al. | Nov 2016 | A1 |
20160322055 | Sainath et al. | Nov 2016 | A1 |
20160328134 | Xu | Nov 2016 | A1 |
20160328147 | Zhang et al. | Nov 2016 | A1 |
20160328205 | Agrawal et al. | Nov 2016 | A1 |
20160328893 | Cordova et al. | Nov 2016 | A1 |
20160329060 | Ito et al. | Nov 2016 | A1 |
20160334973 | Reckhow et al. | Nov 2016 | A1 |
20160335138 | Surti et al. | Nov 2016 | A1 |
20160335139 | Hurley et al. | Nov 2016 | A1 |
20160335532 | Sanghavi et al. | Nov 2016 | A1 |
20160336007 | Hanazawa et al. | Nov 2016 | A1 |
20160336010 | Lindahl | Nov 2016 | A1 |
20160336011 | Koll et al. | Nov 2016 | A1 |
20160336024 | Choi et al. | Nov 2016 | A1 |
20160337299 | Lane et al. | Nov 2016 | A1 |
20160337301 | Rollins et al. | Nov 2016 | A1 |
20160342317 | Lim et al. | Nov 2016 | A1 |
20160342685 | Basu et al. | Nov 2016 | A1 |
20160342781 | Jeon | Nov 2016 | A1 |
20160350650 | Leeman-Munk et al. | Dec 2016 | A1 |
20160350812 | Priness et al. | Dec 2016 | A1 |
20160351190 | Piernot et al. | Dec 2016 | A1 |
20160352567 | Robbins et al. | Dec 2016 | A1 |
20160352924 | Senarath et al. | Dec 2016 | A1 |
20160357304 | Hatori et al. | Dec 2016 | A1 |
20160357728 | Bellegarda et al. | Dec 2016 | A1 |
20160357790 | Elkington et al. | Dec 2016 | A1 |
20160357861 | Carlhian et al. | Dec 2016 | A1 |
20160357870 | Hentschel et al. | Dec 2016 | A1 |
20160358598 | Williams et al. | Dec 2016 | A1 |
20160358600 | Nallasamy et al. | Dec 2016 | A1 |
20160358619 | Ramprashad et al. | Dec 2016 | A1 |
20160359771 | Sridhar | Dec 2016 | A1 |
20160360039 | Sanghavi et al. | Dec 2016 | A1 |
20160360336 | Gross et al. | Dec 2016 | A1 |
20160360382 | Gross et al. | Dec 2016 | A1 |
20160364378 | Futrell et al. | Dec 2016 | A1 |
20160365101 | Foy et al. | Dec 2016 | A1 |
20160371250 | Rhodes | Dec 2016 | A1 |
20160372112 | Miller et al. | Dec 2016 | A1 |
20160372119 | Sak et al. | Dec 2016 | A1 |
20160378747 | Orr et al. | Dec 2016 | A1 |
20160379091 | Lin et al. | Dec 2016 | A1 |
20160379105 | Moore, Jr. | Dec 2016 | A1 |
20160379626 | Deisher et al. | Dec 2016 | A1 |
20160379632 | Hoffmeister et al. | Dec 2016 | A1 |
20160379633 | Lehman et al. | Dec 2016 | A1 |
20160379639 | Weinstein et al. | Dec 2016 | A1 |
20160379641 | Liu et al. | Dec 2016 | A1 |
20170000348 | Karsten et al. | Jan 2017 | A1 |
20170003931 | Dvortsov et al. | Jan 2017 | A1 |
20170004209 | Johl et al. | Jan 2017 | A1 |
20170004824 | Yoo et al. | Jan 2017 | A1 |
20170005818 | Gould | Jan 2017 | A1 |
20170006329 | Jang et al. | Jan 2017 | A1 |
20170011091 | Chehreghani | Jan 2017 | A1 |
20170011279 | Soldevila et al. | Jan 2017 | A1 |
20170011303 | Annapureddy et al. | Jan 2017 | A1 |
20170011742 | Jing et al. | Jan 2017 | A1 |
20170013124 | Havelka et al. | Jan 2017 | A1 |
20170013331 | Watanabe et al. | Jan 2017 | A1 |
20170018271 | Khan et al. | Jan 2017 | A1 |
20170019987 | Dragone et al. | Jan 2017 | A1 |
20170023963 | Davis et al. | Jan 2017 | A1 |
20170025124 | Mixter et al. | Jan 2017 | A1 |
20170026318 | Daniel et al. | Jan 2017 | A1 |
20170026509 | Rand | Jan 2017 | A1 |
20170027522 | Van Hasselt et al. | Feb 2017 | A1 |
20170031576 | Saoji et al. | Feb 2017 | A1 |
20170032783 | Lord et al. | Feb 2017 | A1 |
20170032787 | Dayal | Feb 2017 | A1 |
20170032791 | Elson et al. | Feb 2017 | A1 |
20170039283 | Bennett et al. | Feb 2017 | A1 |
20170039475 | Cheyer et al. | Feb 2017 | A1 |
20170040002 | Basson et al. | Feb 2017 | A1 |
20170041388 | Tal et al. | Feb 2017 | A1 |
20170046025 | Dascola et al. | Feb 2017 | A1 |
20170047063 | Ohmura et al. | Feb 2017 | A1 |
20170052760 | Johnson et al. | Feb 2017 | A1 |
20170053652 | Choi et al. | Feb 2017 | A1 |
20170055895 | Jardins et al. | Mar 2017 | A1 |
20170060853 | Lee et al. | Mar 2017 | A1 |
20170061423 | Bryant et al. | Mar 2017 | A1 |
20170068423 | Napolitano et al. | Mar 2017 | A1 |
20170068513 | Stasior et al. | Mar 2017 | A1 |
20170068550 | Zeitlin | Mar 2017 | A1 |
20170068670 | Orr et al. | Mar 2017 | A1 |
20170069308 | Aleksic et al. | Mar 2017 | A1 |
20170069321 | Toiyama | Mar 2017 | A1 |
20170069327 | Heigold et al. | Mar 2017 | A1 |
20170075653 | Dawidowsky et al. | Mar 2017 | A1 |
20170076518 | Patterson et al. | Mar 2017 | A1 |
20170076720 | Gopalan et al. | Mar 2017 | A1 |
20170076721 | Bargetzi et al. | Mar 2017 | A1 |
20170078490 | Kaminsky et al. | Mar 2017 | A1 |
20170083179 | Gruber et al. | Mar 2017 | A1 |
20170083285 | Meyers et al. | Mar 2017 | A1 |
20170083504 | Huang | Mar 2017 | A1 |
20170083506 | Liu et al. | Mar 2017 | A1 |
20170084277 | Sharifi | Mar 2017 | A1 |
20170085547 | De Aguiar et al. | Mar 2017 | A1 |
20170085696 | Abkairov | Mar 2017 | A1 |
20170090428 | Oohara | Mar 2017 | A1 |
20170090569 | Levesque | Mar 2017 | A1 |
20170091168 | Bellegarda et al. | Mar 2017 | A1 |
20170091169 | Bellegarda et al. | Mar 2017 | A1 |
20170091612 | Gruber et al. | Mar 2017 | A1 |
20170092259 | Jeon | Mar 2017 | A1 |
20170092270 | Newendorp et al. | Mar 2017 | A1 |
20170092278 | Evermann et al. | Mar 2017 | A1 |
20170093356 | Cudak et al. | Mar 2017 | A1 |
20170097743 | Hameed et al. | Apr 2017 | A1 |
20170102837 | Toumpelis | Apr 2017 | A1 |
20170102915 | Kuscher et al. | Apr 2017 | A1 |
20170103749 | Zhao et al. | Apr 2017 | A1 |
20170103752 | Senior et al. | Apr 2017 | A1 |
20170105190 | Logan et al. | Apr 2017 | A1 |
20170108236 | Guan et al. | Apr 2017 | A1 |
20170110117 | Chakladar et al. | Apr 2017 | A1 |
20170110125 | Xu et al. | Apr 2017 | A1 |
20170116177 | Walia | Apr 2017 | A1 |
20170116982 | Gelfenbeyn et al. | Apr 2017 | A1 |
20170116987 | Kang et al. | Apr 2017 | A1 |
20170116989 | Yadgar et al. | Apr 2017 | A1 |
20170124190 | Wang et al. | May 2017 | A1 |
20170124311 | Li et al. | May 2017 | A1 |
20170124531 | McCormack | May 2017 | A1 |
20170125016 | Wang | May 2017 | A1 |
20170127124 | Wilson et al. | May 2017 | A9 |
20170131778 | Tyer | May 2017 | A1 |
20170132019 | Karashchuk et al. | May 2017 | A1 |
20170132199 | Vescovi et al. | May 2017 | A1 |
20170133007 | Drewes | May 2017 | A1 |
20170140041 | Dotan-Cohen et al. | May 2017 | A1 |
20170140052 | Bufe, III et al. | May 2017 | A1 |
20170140644 | Hwang et al. | May 2017 | A1 |
20170140760 | Sachdev | May 2017 | A1 |
20170147722 | Greenwood | May 2017 | A1 |
20170147841 | Stagg et al. | May 2017 | A1 |
20170148044 | Fukuda et al. | May 2017 | A1 |
20170154033 | Lee | Jun 2017 | A1 |
20170154055 | Dimson et al. | Jun 2017 | A1 |
20170154628 | Mohajer et al. | Jun 2017 | A1 |
20170155940 | Jin et al. | Jun 2017 | A1 |
20170155965 | Ward | Jun 2017 | A1 |
20170161018 | Lemay et al. | Jun 2017 | A1 |
20170161268 | Badaskar | Jun 2017 | A1 |
20170161293 | Ionescu et al. | Jun 2017 | A1 |
20170161393 | Oh et al. | Jun 2017 | A1 |
20170161439 | Raduchel et al. | Jun 2017 | A1 |
20170161500 | Yang | Jun 2017 | A1 |
20170162191 | Grost et al. | Jun 2017 | A1 |
20170162202 | Anthony et al. | Jun 2017 | A1 |
20170162203 | Huang et al. | Jun 2017 | A1 |
20170169506 | Wishne et al. | Jun 2017 | A1 |
20170169818 | Vanblon et al. | Jun 2017 | A1 |
20170169819 | Mese et al. | Jun 2017 | A1 |
20170177080 | Deleeuw | Jun 2017 | A1 |
20170177547 | Ciereszko et al. | Jun 2017 | A1 |
20170178619 | Naik et al. | Jun 2017 | A1 |
20170178620 | Fleizach et al. | Jun 2017 | A1 |
20170178626 | Gruber et al. | Jun 2017 | A1 |
20170178666 | Yu | Jun 2017 | A1 |
20170180499 | Gelfenbeyn et al. | Jun 2017 | A1 |
20170185375 | Martel et al. | Jun 2017 | A1 |
20170185581 | Bojja et al. | Jun 2017 | A1 |
20170186429 | Giuli et al. | Jun 2017 | A1 |
20170187711 | Joo et al. | Jun 2017 | A1 |
20170193083 | Bhatt et al. | Jul 2017 | A1 |
20170195493 | Sudarsan et al. | Jul 2017 | A1 |
20170195495 | Deora et al. | Jul 2017 | A1 |
20170195636 | Child et al. | Jul 2017 | A1 |
20170195856 | Snyder et al. | Jul 2017 | A1 |
20170199870 | Zheng et al. | Jul 2017 | A1 |
20170199874 | Patel et al. | Jul 2017 | A1 |
20170200066 | Wang et al. | Jul 2017 | A1 |
20170201609 | Salmenkaita et al. | Jul 2017 | A1 |
20170201613 | Engelke et al. | Jul 2017 | A1 |
20170201846 | Katayama et al. | Jul 2017 | A1 |
20170206899 | Bryant et al. | Jul 2017 | A1 |
20170215052 | Koum et al. | Jul 2017 | A1 |
20170220212 | Yang et al. | Aug 2017 | A1 |
20170221486 | Kurata et al. | Aug 2017 | A1 |
20170223189 | Meredith et al. | Aug 2017 | A1 |
20170227935 | Su et al. | Aug 2017 | A1 |
20170228367 | Pasupalak et al. | Aug 2017 | A1 |
20170228382 | Haviv et al. | Aug 2017 | A1 |
20170229121 | Taki et al. | Aug 2017 | A1 |
20170230429 | Garmark et al. | Aug 2017 | A1 |
20170230497 | Kim et al. | Aug 2017 | A1 |
20170230709 | Van Os et al. | Aug 2017 | A1 |
20170235361 | Rigazio et al. | Aug 2017 | A1 |
20170235618 | Lin et al. | Aug 2017 | A1 |
20170235721 | Almosallam et al. | Aug 2017 | A1 |
20170236512 | Williams et al. | Aug 2017 | A1 |
20170236514 | Nelson | Aug 2017 | A1 |
20170236517 | Yu et al. | Aug 2017 | A1 |
20170238039 | Sabattini | Aug 2017 | A1 |
20170242478 | Ma | Aug 2017 | A1 |
20170242653 | Lang et al. | Aug 2017 | A1 |
20170242657 | Jarvis et al. | Aug 2017 | A1 |
20170242840 | Lu et al. | Aug 2017 | A1 |
20170243468 | Dotan-Cohen et al. | Aug 2017 | A1 |
20170243576 | Millington et al. | Aug 2017 | A1 |
20170243583 | Raichelgauz et al. | Aug 2017 | A1 |
20170243586 | Civelli et al. | Aug 2017 | A1 |
20170249309 | Sarikaya | Aug 2017 | A1 |
20170256256 | Wang et al. | Sep 2017 | A1 |
20170257723 | Morishita et al. | Sep 2017 | A1 |
20170262051 | Tall et al. | Sep 2017 | A1 |
20170263247 | Kang et al. | Sep 2017 | A1 |
20170263248 | Gruber et al. | Sep 2017 | A1 |
20170263249 | Akbacak et al. | Sep 2017 | A1 |
20170263254 | Dewan et al. | Sep 2017 | A1 |
20170264451 | Yu et al. | Sep 2017 | A1 |
20170264711 | Natarajan et al. | Sep 2017 | A1 |
20170270715 | Lindsay et al. | Sep 2017 | A1 |
20170270822 | Cohen | Sep 2017 | A1 |
20170270912 | Levit et al. | Sep 2017 | A1 |
20170273044 | Alsina | Sep 2017 | A1 |
20170278513 | Li et al. | Sep 2017 | A1 |
20170278514 | Mathias et al. | Sep 2017 | A1 |
20170285915 | Napolitano et al. | Oct 2017 | A1 |
20170286397 | Gonzalez | Oct 2017 | A1 |
20170286407 | Chochowski et al. | Oct 2017 | A1 |
20170287218 | Nuernberger et al. | Oct 2017 | A1 |
20170287472 | Ogawa et al. | Oct 2017 | A1 |
20170289305 | Liensberger et al. | Oct 2017 | A1 |
20170295446 | Shivappa | Oct 2017 | A1 |
20170301348 | Chen et al. | Oct 2017 | A1 |
20170308552 | Soni et al. | Oct 2017 | A1 |
20170308609 | Berkhin et al. | Oct 2017 | A1 |
20170311005 | Lin | Oct 2017 | A1 |
20170316775 | Le et al. | Nov 2017 | A1 |
20170316782 | Haughay | Nov 2017 | A1 |
20170319123 | Voss et al. | Nov 2017 | A1 |
20170323637 | Naik | Nov 2017 | A1 |
20170329466 | Krenkler et al. | Nov 2017 | A1 |
20170329490 | Esinovskaya et al. | Nov 2017 | A1 |
20170329572 | Shah et al. | Nov 2017 | A1 |
20170329630 | Jann et al. | Nov 2017 | A1 |
20170330567 | Van Wissen et al. | Nov 2017 | A1 |
20170336920 | Chan et al. | Nov 2017 | A1 |
20170337035 | Choudhary et al. | Nov 2017 | A1 |
20170337478 | Sarikaya et al. | Nov 2017 | A1 |
20170345411 | Raitio et al. | Nov 2017 | A1 |
20170345420 | Barnett, Jr. | Nov 2017 | A1 |
20170345429 | Hardee et al. | Nov 2017 | A1 |
20170346949 | Sanghavi et al. | Nov 2017 | A1 |
20170347180 | Petrank | Nov 2017 | A1 |
20170351487 | Avilés-Casco et al. | Dec 2017 | A1 |
20170352346 | Paulik et al. | Dec 2017 | A1 |
20170352350 | Booker et al. | Dec 2017 | A1 |
20170357478 | Piersol et al. | Dec 2017 | A1 |
20170357529 | Venkatraman et al. | Dec 2017 | A1 |
20170357632 | Pagallo et al. | Dec 2017 | A1 |
20170357633 | Wang et al. | Dec 2017 | A1 |
20170357637 | Nell et al. | Dec 2017 | A1 |
20170357640 | Bellegarda et al. | Dec 2017 | A1 |
20170357716 | Bellegarda et al. | Dec 2017 | A1 |
20170358300 | Laurens et al. | Dec 2017 | A1 |
20170358301 | Raitio et al. | Dec 2017 | A1 |
20170358302 | Orr et al. | Dec 2017 | A1 |
20170358303 | Walker, II et al. | Dec 2017 | A1 |
20170358304 | Castillo et al. | Dec 2017 | A1 |
20170358305 | Kudurshian et al. | Dec 2017 | A1 |
20170358317 | James | Dec 2017 | A1 |
20170359680 | Ledvina et al. | Dec 2017 | A1 |
20170365251 | Park et al. | Dec 2017 | A1 |
20170371509 | Jung et al. | Dec 2017 | A1 |
20170371885 | Aggarwal et al. | Dec 2017 | A1 |
20170374093 | Dhar et al. | Dec 2017 | A1 |
20170374176 | Agrawal et al. | Dec 2017 | A1 |
20180004372 | Zurek et al. | Jan 2018 | A1 |
20180004396 | Ying | Jan 2018 | A1 |
20180005112 | Iso-Sipila et al. | Jan 2018 | A1 |
20180007060 | Leblang et al. | Jan 2018 | A1 |
20180007096 | Levin et al. | Jan 2018 | A1 |
20180007538 | Naik et al. | Jan 2018 | A1 |
20180012596 | Piernot et al. | Jan 2018 | A1 |
20180018248 | Bhargava et al. | Jan 2018 | A1 |
20180018590 | Szeto et al. | Jan 2018 | A1 |
20180018814 | Patrik et al. | Jan 2018 | A1 |
20180018959 | Des Jardins et al. | Jan 2018 | A1 |
20180018973 | Moreno et al. | Jan 2018 | A1 |
20180024985 | Asano | Jan 2018 | A1 |
20180025124 | Mohr et al. | Jan 2018 | A1 |
20180025287 | Mathew et al. | Jan 2018 | A1 |
20180028918 | Tang et al. | Feb 2018 | A1 |
20180033431 | Newendorp et al. | Feb 2018 | A1 |
20180033435 | Jacobs, II | Feb 2018 | A1 |
20180033436 | Zhou | Feb 2018 | A1 |
20180045963 | Hoover et al. | Feb 2018 | A1 |
20180046340 | Mall | Feb 2018 | A1 |
20180047201 | Filev et al. | Feb 2018 | A1 |
20180047391 | Baik et al. | Feb 2018 | A1 |
20180047393 | Tian et al. | Feb 2018 | A1 |
20180047406 | Park | Feb 2018 | A1 |
20180052909 | Sharifi et al. | Feb 2018 | A1 |
20180054505 | Hart et al. | Feb 2018 | A1 |
20180060032 | Boesen | Mar 2018 | A1 |
20180060301 | Li et al. | Mar 2018 | A1 |
20180060312 | Won | Mar 2018 | A1 |
20180060555 | Boesen | Mar 2018 | A1 |
20180061400 | Carbune et al. | Mar 2018 | A1 |
20180061401 | Sarikaya et al. | Mar 2018 | A1 |
20180062691 | Barnett, Jr. | Mar 2018 | A1 |
20180063308 | Crystal et al. | Mar 2018 | A1 |
20180063324 | Van Meter, II | Mar 2018 | A1 |
20180063624 | Boesen | Mar 2018 | A1 |
20180067904 | Li | Mar 2018 | A1 |
20180067914 | Chen et al. | Mar 2018 | A1 |
20180067918 | Bellegarda et al. | Mar 2018 | A1 |
20180067929 | Ahn | Mar 2018 | A1 |
20180068074 | Shen | Mar 2018 | A1 |
20180068194 | Matsuda | Mar 2018 | A1 |
20180069743 | Bakken et al. | Mar 2018 | A1 |
20180075847 | Lee et al. | Mar 2018 | A1 |
20180075849 | Khoury et al. | Mar 2018 | A1 |
20180077095 | Deyle et al. | Mar 2018 | A1 |
20180077648 | Nguyen | Mar 2018 | A1 |
20180082692 | Khoury et al. | Mar 2018 | A1 |
20180088788 | Cheung et al. | Mar 2018 | A1 |
20180088969 | Vanblon et al. | Mar 2018 | A1 |
20180089166 | Meyer et al. | Mar 2018 | A1 |
20180089588 | Ravi et al. | Mar 2018 | A1 |
20180090143 | Saddler et al. | Mar 2018 | A1 |
20180091604 | Yamashita et al. | Mar 2018 | A1 |
20180091732 | Wilson et al. | Mar 2018 | A1 |
20180091847 | Wu et al. | Mar 2018 | A1 |
20180096683 | James et al. | Apr 2018 | A1 |
20180096690 | Mixter et al. | Apr 2018 | A1 |
20180101599 | Kenneth et al. | Apr 2018 | A1 |
20180101925 | Brinig et al. | Apr 2018 | A1 |
20180102914 | Kawachi et al. | Apr 2018 | A1 |
20180103209 | Fischler et al. | Apr 2018 | A1 |
20180107917 | Hewavitharana et al. | Apr 2018 | A1 |
20180107945 | Gao et al. | Apr 2018 | A1 |
20180108346 | Paulik et al. | Apr 2018 | A1 |
20180108351 | Beckhardt et al. | Apr 2018 | A1 |
20180108357 | Liu | Apr 2018 | A1 |
20180109920 | Aggarwal et al. | Apr 2018 | A1 |
20180113673 | Sheynblat | Apr 2018 | A1 |
20180314362 | Kim et al. | Apr 2018 | A1 |
20180121430 | Kagoshima et al. | May 2018 | A1 |
20180121432 | Parson et al. | May 2018 | A1 |
20180122376 | Kojima | May 2018 | A1 |
20180122378 | Mixter et al. | May 2018 | A1 |
20180126260 | Chansoriya et al. | May 2018 | A1 |
20180129967 | Herreshoff | May 2018 | A1 |
20180130470 | Lemay et al. | May 2018 | A1 |
20180130471 | Trufinescu et al. | May 2018 | A1 |
20180137856 | Gilbert | May 2018 | A1 |
20180137857 | Zhou et al. | May 2018 | A1 |
20180137865 | Ling | May 2018 | A1 |
20180143857 | Anbazhagan et al. | May 2018 | A1 |
20180143967 | Anbazhagan et al. | May 2018 | A1 |
20180144465 | Hsieh et al. | May 2018 | A1 |
20180144615 | Kinney et al. | May 2018 | A1 |
20180144746 | Mishra et al. | May 2018 | A1 |
20180144748 | Leong | May 2018 | A1 |
20180146089 | Rauenbuehler et al. | May 2018 | A1 |
20180150744 | Orr et al. | May 2018 | A1 |
20180152557 | White et al. | May 2018 | A1 |
20180152803 | Seefeldt et al. | May 2018 | A1 |
20180157372 | Kurabayashi | Jun 2018 | A1 |
20180157408 | Yu et al. | Jun 2018 | A1 |
20180157992 | Susskind et al. | Jun 2018 | A1 |
20180158548 | Taheri et al. | Jun 2018 | A1 |
20180158552 | Liu et al. | Jun 2018 | A1 |
20180165857 | Lee et al. | Jun 2018 | A1 |
20180166076 | Higuchi et al. | Jun 2018 | A1 |
20180167884 | Dawid et al. | Jun 2018 | A1 |
20180173403 | Carbune et al. | Jun 2018 | A1 |
20180173542 | Chan et al. | Jun 2018 | A1 |
20180174406 | Arashi et al. | Jun 2018 | A1 |
20180174576 | Soltau et al. | Jun 2018 | A1 |
20180174597 | Lee et al. | Jun 2018 | A1 |
20180181370 | Parkinson | Jun 2018 | A1 |
20180182376 | Gysel et al. | Jun 2018 | A1 |
20180188840 | Tamura et al. | Jul 2018 | A1 |
20180188948 | Ouyang et al. | Jul 2018 | A1 |
20180189267 | Takiel | Jul 2018 | A1 |
20180190263 | Calef, III | Jul 2018 | A1 |
20180190273 | Karimli et al. | Jul 2018 | A1 |
20180190279 | Anderson et al. | Jul 2018 | A1 |
20180191670 | Suyama | Jul 2018 | A1 |
20180196683 | Radebaugh et al. | Jul 2018 | A1 |
20180205983 | Lee et al. | Jul 2018 | A1 |
20180210874 | Fuxman et al. | Jul 2018 | A1 |
20180213448 | Segal et al. | Jul 2018 | A1 |
20180214061 | Knoth et al. | Aug 2018 | A1 |
20180217810 | Agrawal | Aug 2018 | A1 |
20180218735 | Hunt et al. | Aug 2018 | A1 |
20180221783 | Gamero | Aug 2018 | A1 |
20180225131 | Tommy et al. | Aug 2018 | A1 |
20180225274 | Tommy et al. | Aug 2018 | A1 |
20180232203 | Gelfenbeyn et al. | Aug 2018 | A1 |
20180232608 | Pradeep et al. | Aug 2018 | A1 |
20180232688 | Pike et al. | Aug 2018 | A1 |
20180233132 | Herold et al. | Aug 2018 | A1 |
20180233140 | Koishida et al. | Aug 2018 | A1 |
20180247065 | Rhee et al. | Aug 2018 | A1 |
20180253209 | Jaygarl et al. | Sep 2018 | A1 |
20180253652 | Palzer et al. | Sep 2018 | A1 |
20180260680 | Finkelstein et al. | Sep 2018 | A1 |
20180268023 | Korpusik et al. | Sep 2018 | A1 |
20180268106 | Velaga | Sep 2018 | A1 |
20180268337 | Miller et al. | Sep 2018 | A1 |
20180270343 | Rout et al. | Sep 2018 | A1 |
20180275839 | Kocienda et al. | Sep 2018 | A1 |
20180276197 | Nell et al. | Sep 2018 | A1 |
20180277113 | Hartung et al. | Sep 2018 | A1 |
20180278740 | Choi et al. | Sep 2018 | A1 |
20180285056 | Cutler et al. | Oct 2018 | A1 |
20180293984 | Lindahl | Oct 2018 | A1 |
20180293988 | Huang et al. | Oct 2018 | A1 |
20180293989 | De et al. | Oct 2018 | A1 |
20180299878 | Cella et al. | Oct 2018 | A1 |
20180300317 | Bradbury | Oct 2018 | A1 |
20180300400 | Paulus | Oct 2018 | A1 |
20180300608 | Sevrens et al. | Oct 2018 | A1 |
20180300952 | Evans et al. | Oct 2018 | A1 |
20180307216 | Ypma et al. | Oct 2018 | A1 |
20180308470 | Park et al. | Oct 2018 | A1 |
20180308477 | Nagasaka | Oct 2018 | A1 |
20180308480 | Jang et al. | Oct 2018 | A1 |
20180308485 | Kudurshian et al. | Oct 2018 | A1 |
20180308486 | Saddler et al. | Oct 2018 | A1 |
20180308491 | Oktem et al. | Oct 2018 | A1 |
20180314552 | Kim et al. | Nov 2018 | A1 |
20180314689 | Wang et al. | Nov 2018 | A1 |
20180315415 | Mosley et al. | Nov 2018 | A1 |
20180315416 | Berthelsen et al. | Nov 2018 | A1 |
20180322112 | Bellegarda et al. | Nov 2018 | A1 |
20180322881 | Min et al. | Nov 2018 | A1 |
20180324518 | Dusan et al. | Nov 2018 | A1 |
20180329508 | Klein et al. | Nov 2018 | A1 |
20180329677 | Gruber et al. | Nov 2018 | A1 |
20180329957 | Frazzingaro et al. | Nov 2018 | A1 |
20180329982 | Patel et al. | Nov 2018 | A1 |
20180329998 | Thomson et al. | Nov 2018 | A1 |
20180330714 | Paulik et al. | Nov 2018 | A1 |
20180330721 | Thomson et al. | Nov 2018 | A1 |
20180330722 | Newendorp et al. | Nov 2018 | A1 |
20180330723 | Acero et al. | Nov 2018 | A1 |
20180330729 | Golipour et al. | Nov 2018 | A1 |
20180330730 | Garg et al. | Nov 2018 | A1 |
20180330731 | Zeitlin et al. | Nov 2018 | A1 |
20180330733 | Orr et al. | Nov 2018 | A1 |
20180330737 | Paulik et al. | Nov 2018 | A1 |
20180332118 | Phipps et al. | Nov 2018 | A1 |
20180332389 | Ekkizogloy et al. | Nov 2018 | A1 |
20180335903 | Coffman et al. | Nov 2018 | A1 |
20180336006 | Chakraborty et al. | Nov 2018 | A1 |
20180336049 | Mukherjee et al. | Nov 2018 | A1 |
20180336184 | Bellegarda et al. | Nov 2018 | A1 |
20180336197 | Skilling et al. | Nov 2018 | A1 |
20180336275 | Graham et al. | Nov 2018 | A1 |
20180336439 | Kliger et al. | Nov 2018 | A1 |
20180336449 | Adan et al. | Nov 2018 | A1 |
20180336880 | Arik et al. | Nov 2018 | A1 |
20180336885 | Mukherjee et al. | Nov 2018 | A1 |
20180336892 | Kim et al. | Nov 2018 | A1 |
20180336894 | Graham et al. | Nov 2018 | A1 |
20180336904 | Piercy et al. | Nov 2018 | A1 |
20180336905 | Kim et al. | Nov 2018 | A1 |
20180336911 | Dahl et al. | Nov 2018 | A1 |
20180336920 | Bastian et al. | Nov 2018 | A1 |
20180338191 | Van Scheltinga et al. | Nov 2018 | A1 |
20180341643 | Alders et al. | Nov 2018 | A1 |
20180343557 | Naik et al. | Nov 2018 | A1 |
20180349084 | Nagasaka et al. | Dec 2018 | A1 |
20180349346 | Hatori et al. | Dec 2018 | A1 |
20180349349 | Bellegarda et al. | Dec 2018 | A1 |
20180349447 | Maccartney et al. | Dec 2018 | A1 |
20180349472 | Kohlschuetter et al. | Dec 2018 | A1 |
20180349728 | Wang et al. | Dec 2018 | A1 |
20180350345 | Naik | Dec 2018 | A1 |
20180350353 | Gruber et al. | Dec 2018 | A1 |
20180357073 | Johnson et al. | Dec 2018 | A1 |
20180357308 | Cheyer | Dec 2018 | A1 |
20180358015 | Cash et al. | Dec 2018 | A1 |
20180358019 | Mont-Reynaud | Dec 2018 | A1 |
20180365653 | Cleaver et al. | Dec 2018 | A1 |
20180366105 | Kim | Dec 2018 | A1 |
20180366110 | Hashem et al. | Dec 2018 | A1 |
20180366116 | Nicholson et al. | Dec 2018 | A1 |
20180373487 | Gruber et al. | Dec 2018 | A1 |
20180373493 | Watson et al. | Dec 2018 | A1 |
20180373796 | Rathod | Dec 2018 | A1 |
20180374484 | Huang et al. | Dec 2018 | A1 |
20190005024 | Somech et al. | Jan 2019 | A1 |
20190012141 | Piersol et al. | Jan 2019 | A1 |
20190012445 | Lesso et al. | Jan 2019 | A1 |
20190012449 | Cheyer | Jan 2019 | A1 |
20190012599 | El Kaliouby et al. | Jan 2019 | A1 |
20190013018 | Rekstad | Jan 2019 | A1 |
20190013025 | Alcorn et al. | Jan 2019 | A1 |
20190014450 | Gruber et al. | Jan 2019 | A1 |
20190019077 | Griffin et al. | Jan 2019 | A1 |
20190020482 | Gupta et al. | Jan 2019 | A1 |
20190027152 | Huang et al. | Jan 2019 | A1 |
20190034040 | Shah et al. | Jan 2019 | A1 |
20190034826 | Ahmad et al. | Jan 2019 | A1 |
20190035385 | Lawson et al. | Jan 2019 | A1 |
20190035405 | Haughay | Jan 2019 | A1 |
20190037258 | Justin et al. | Jan 2019 | A1 |
20190042059 | Baer | Feb 2019 | A1 |
20190042627 | Osotio et al. | Feb 2019 | A1 |
20190043507 | Huang et al. | Feb 2019 | A1 |
20190044854 | Yang et al. | Feb 2019 | A1 |
20190045040 | Lee et al. | Feb 2019 | A1 |
20190051306 | Torama et al. | Feb 2019 | A1 |
20190051309 | Kim et al. | Feb 2019 | A1 |
20190057697 | Giuli et al. | Feb 2019 | A1 |
20190065144 | Sumner et al. | Feb 2019 | A1 |
20190065993 | Srinivasan et al. | Feb 2019 | A1 |
20190066674 | Jaygarl et al. | Feb 2019 | A1 |
20190068810 | Okamoto et al. | Feb 2019 | A1 |
20190173996 | Butcher et al. | Feb 2019 | A1 |
20190073607 | Jia et al. | Mar 2019 | A1 |
20190073998 | Leblang et al. | Mar 2019 | A1 |
20190074009 | Kim et al. | Mar 2019 | A1 |
20190074015 | Orr et al. | Mar 2019 | A1 |
20190074016 | Orr et al. | Mar 2019 | A1 |
20190079476 | Funes | Mar 2019 | A1 |
20190079724 | Feuz et al. | Mar 2019 | A1 |
20190080685 | Johnson, Jr. | Mar 2019 | A1 |
20190080698 | Miller | Mar 2019 | A1 |
20190082044 | Olivia et al. | Mar 2019 | A1 |
20190087412 | Seyed Ibrahim et al. | Mar 2019 | A1 |
20190087455 | He et al. | Mar 2019 | A1 |
20190095050 | Gruber et al. | Mar 2019 | A1 |
20190095069 | Proctor et al. | Mar 2019 | A1 |
20190095171 | Carson et al. | Mar 2019 | A1 |
20190102145 | Wilberding et al. | Apr 2019 | A1 |
20190102378 | Piernot et al. | Apr 2019 | A1 |
20190102381 | Futrell et al. | Apr 2019 | A1 |
20190103103 | Ni et al. | Apr 2019 | A1 |
20190103112 | Walker et al. | Apr 2019 | A1 |
20190108834 | Nelson et al. | Apr 2019 | A1 |
20190114320 | Patwardhan et al. | Apr 2019 | A1 |
20190116264 | Sanghavi et al. | Apr 2019 | A1 |
20190122666 | Raitio et al. | Apr 2019 | A1 |
20190122692 | Binder et al. | Apr 2019 | A1 |
20190124019 | Leon et al. | Apr 2019 | A1 |
20190129499 | Li | May 2019 | A1 |
20190129615 | Sundar et al. | May 2019 | A1 |
20190132694 | Hanes et al. | May 2019 | A1 |
20190134501 | Feder et al. | May 2019 | A1 |
20190138704 | Shrivastava et al. | May 2019 | A1 |
20190139541 | Andersen et al. | May 2019 | A1 |
20190139563 | Chen et al. | May 2019 | A1 |
20190141494 | Gross et al. | May 2019 | A1 |
20190147052 | Lu et al. | May 2019 | A1 |
20190147369 | Gupta et al. | May 2019 | A1 |
20190147880 | Booker et al. | May 2019 | A1 |
20190147883 | Mellenthin et al. | May 2019 | A1 |
20190149972 | Parks et al. | May 2019 | A1 |
20190156830 | Devaraj et al. | May 2019 | A1 |
20190158994 | Gross et al. | May 2019 | A1 |
20190163667 | Feuz et al. | May 2019 | A1 |
20190164546 | Piernot et al. | May 2019 | A1 |
20190172243 | Mishra et al. | Jun 2019 | A1 |
20190172458 | Mishra et al. | Jun 2019 | A1 |
20190172467 | Kim et al. | Jun 2019 | A1 |
20190179607 | Thangarathnam et al. | Jun 2019 | A1 |
20190179890 | Evermann | Jun 2019 | A1 |
20190180770 | Kothari et al. | Jun 2019 | A1 |
20190182176 | Niewczas | Jun 2019 | A1 |
20190187787 | White et al. | Jun 2019 | A1 |
20190188326 | Daianu et al. | Jun 2019 | A1 |
20190188328 | Oyenan et al. | Jun 2019 | A1 |
20190189118 | Piernot et al. | Jun 2019 | A1 |
20190189125 | Van Os et al. | Jun 2019 | A1 |
20190190898 | Cui | Jun 2019 | A1 |
20190197053 | Graham et al. | Jun 2019 | A1 |
20190213498 | Adjaoute | Jul 2019 | A1 |
20190213601 | Hackman et al. | Jul 2019 | A1 |
20190213774 | Jiao et al. | Jul 2019 | A1 |
20190213999 | Grupen et al. | Jul 2019 | A1 |
20190214024 | Gruber et al. | Jul 2019 | A1 |
20190220245 | Martel et al. | Jul 2019 | A1 |
20190220246 | Orr et al. | Jul 2019 | A1 |
20190220247 | Lemay et al. | Jul 2019 | A1 |
20190220704 | Schulz-Trieglaff et al. | Jul 2019 | A1 |
20190220727 | Dohrmann et al. | Jul 2019 | A1 |
20190222684 | Li et al. | Jul 2019 | A1 |
20190224049 | Creasy et al. | Jul 2019 | A1 |
20190230215 | Zhu et al. | Jul 2019 | A1 |
20190230426 | Chun | Jul 2019 | A1 |
20190236130 | Li et al. | Aug 2019 | A1 |
20190236459 | Cheyer et al. | Aug 2019 | A1 |
20190237061 | Rusak et al. | Aug 2019 | A1 |
20190243902 | Saeki et al. | Aug 2019 | A1 |
20190244618 | Newendorp et al. | Aug 2019 | A1 |
20190251167 | Krishnapura Subbaraya et al. | Aug 2019 | A1 |
20190251339 | Hawker | Aug 2019 | A1 |
20190251960 | Maker et al. | Aug 2019 | A1 |
20190259386 | Kudurshian et al. | Aug 2019 | A1 |
20190266246 | Wang et al. | Aug 2019 | A1 |
20190272318 | Suzuki et al. | Sep 2019 | A1 |
20190272818 | Fernandez et al. | Sep 2019 | A1 |
20190272825 | O'Malley et al. | Sep 2019 | A1 |
20190272831 | Kajarekar | Sep 2019 | A1 |
20190273963 | Jobanputra et al. | Sep 2019 | A1 |
20190278841 | Pusateri et al. | Sep 2019 | A1 |
20190279622 | Liu et al. | Sep 2019 | A1 |
20190281387 | Woo et al. | Sep 2019 | A1 |
20190287012 | Asli et al. | Sep 2019 | A1 |
20190287522 | Lambourne et al. | Sep 2019 | A1 |
20190294769 | Lesso | Sep 2019 | A1 |
20190294962 | Vezer et al. | Sep 2019 | A1 |
20190295529 | Tomita | Sep 2019 | A1 |
20190295540 | Grima | Sep 2019 | A1 |
20190295544 | Garcia et al. | Sep 2019 | A1 |
20190303442 | Peitz et al. | Oct 2019 | A1 |
20190303504 | Pasumarthy | Oct 2019 | A1 |
20190304438 | Qian et al. | Oct 2019 | A1 |
20190310765 | Napolitano et al. | Oct 2019 | A1 |
20190311708 | Bengio et al. | Oct 2019 | A1 |
20190311720 | Pasko | Oct 2019 | A1 |
20190318722 | Bromand | Oct 2019 | A1 |
20190318724 | Chao et al. | Oct 2019 | A1 |
20190318725 | Le Roux et al. | Oct 2019 | A1 |
20190318732 | Huang et al. | Oct 2019 | A1 |
20190318735 | Chao et al. | Oct 2019 | A1 |
20190318739 | Garg et al. | Oct 2019 | A1 |
20190325866 | Bromand et al. | Oct 2019 | A1 |
20190333523 | Kim et al. | Oct 2019 | A1 |
20190339784 | Lemay et al. | Nov 2019 | A1 |
20190340252 | Huyghe | Nov 2019 | A1 |
20190341027 | Vescovi et al. | Nov 2019 | A1 |
20190341056 | Paulik et al. | Nov 2019 | A1 |
20190347063 | Liu et al. | Nov 2019 | A1 |
20190347525 | Liem et al. | Nov 2019 | A1 |
20190348022 | Park et al. | Nov 2019 | A1 |
20190349333 | Pickover et al. | Nov 2019 | A1 |
20190349622 | Kim et al. | Nov 2019 | A1 |
20190354548 | Orr et al. | Nov 2019 | A1 |
20190355346 | Bellegarda | Nov 2019 | A1 |
20190355384 | Sereshki et al. | Nov 2019 | A1 |
20190361729 | Gruber et al. | Nov 2019 | A1 |
20190361978 | Ray et al. | Nov 2019 | A1 |
20190362557 | Lacey et al. | Nov 2019 | A1 |
20190369748 | Hindi et al. | Dec 2019 | A1 |
20190369842 | Dolbakian et al. | Dec 2019 | A1 |
20190369868 | Jin et al. | Dec 2019 | A1 |
20190370292 | Irani et al. | Dec 2019 | A1 |
20190370323 | Davidson et al. | Dec 2019 | A1 |
20190370443 | Lesso | Dec 2019 | A1 |
20190371315 | Newendorp et al. | Dec 2019 | A1 |
20190371316 | Weinstein et al. | Dec 2019 | A1 |
20190371317 | Irani et al. | Dec 2019 | A1 |
20190371331 | Schramm et al. | Dec 2019 | A1 |
20190372902 | Piersol | Dec 2019 | A1 |
20190373102 | Weinstein et al. | Dec 2019 | A1 |
20190377955 | Swaminathan et al. | Dec 2019 | A1 |
20190385418 | Mixter et al. | Dec 2019 | A1 |
20190387352 | Jot et al. | Dec 2019 | A1 |
20200019609 | Yu et al. | Jan 2020 | A1 |
20200020326 | Srinivasan et al. | Jan 2020 | A1 |
20200034421 | Ferrucci et al. | Jan 2020 | A1 |
20200035224 | Ward et al. | Jan 2020 | A1 |
20200042334 | Radebaugh et al. | Feb 2020 | A1 |
20200043467 | Qian et al. | Feb 2020 | A1 |
20200043471 | Ma et al. | Feb 2020 | A1 |
20200043482 | Gruber et al. | Feb 2020 | A1 |
20200043489 | Bradley et al. | Feb 2020 | A1 |
20200044485 | Smith et al. | Feb 2020 | A1 |
20200051565 | Singh | Feb 2020 | A1 |
20200051583 | Wu et al. | Feb 2020 | A1 |
20200053218 | Gray | Feb 2020 | A1 |
20200058299 | Lee et al. | Feb 2020 | A1 |
20200065601 | Andreassen | Feb 2020 | A1 |
20200073629 | Lee et al. | Mar 2020 | A1 |
20200075018 | Chen | Mar 2020 | A1 |
20200075040 | Provost et al. | Mar 2020 | A1 |
20200076538 | Soultan et al. | Mar 2020 | A1 |
20200081615 | Yi et al. | Mar 2020 | A1 |
20200082807 | Kim et al. | Mar 2020 | A1 |
20200084572 | Jadav et al. | Mar 2020 | A1 |
20200090393 | Shin et al. | Mar 2020 | A1 |
20200091958 | Curtis et al. | Mar 2020 | A1 |
20200092625 | Raffle | Mar 2020 | A1 |
20200098352 | Feinstein et al. | Mar 2020 | A1 |
20200098362 | Piernot et al. | Mar 2020 | A1 |
20200098368 | Lemay et al. | Mar 2020 | A1 |
20200103963 | Kelly et al. | Apr 2020 | A1 |
20200104357 | Bellegarda et al. | Apr 2020 | A1 |
20200104362 | Yang et al. | Apr 2020 | A1 |
20200104369 | Bellegarda | Apr 2020 | A1 |
20200104668 | Sanghavi et al. | Apr 2020 | A1 |
20200105260 | Piernot et al. | Apr 2020 | A1 |
20200112454 | Brown et al. | Apr 2020 | A1 |
20200117717 | Ramamurti et al. | Apr 2020 | A1 |
20200118566 | Zhou | Apr 2020 | A1 |
20200118568 | Kudurshian et al. | Apr 2020 | A1 |
20200125820 | Kim et al. | Apr 2020 | A1 |
20200127988 | Bradley et al. | Apr 2020 | A1 |
20200134316 | Krishnamurthy et al. | Apr 2020 | A1 |
20200135180 | Mukherjee et al. | Apr 2020 | A1 |
20200135209 | Delfarah et al. | Apr 2020 | A1 |
20200135226 | Mittal et al. | Apr 2020 | A1 |
20200137230 | Spohrer | Apr 2020 | A1 |
20200143812 | Walker, II et al. | May 2020 | A1 |
20200143819 | Delcroix et al. | May 2020 | A1 |
20200152186 | Koh et al. | May 2020 | A1 |
20200159579 | Shear et al. | May 2020 | A1 |
20200159651 | Myers | May 2020 | A1 |
20200159801 | Sekine | May 2020 | A1 |
20200160179 | Chien et al. | May 2020 | A1 |
20200160838 | Lee | May 2020 | A1 |
20200168120 | Rodriguez Bravo | May 2020 | A1 |
20200169637 | Sanghavi et al. | May 2020 | A1 |
20200175566 | Bender et al. | Jun 2020 | A1 |
20200176004 | Kleijn et al. | Jun 2020 | A1 |
20200176018 | Feinauer et al. | Jun 2020 | A1 |
20200184057 | Mukund | Jun 2020 | A1 |
20200184964 | Myers et al. | Jun 2020 | A1 |
20200184966 | Yavagal | Jun 2020 | A1 |
20200193997 | Piernot et al. | Jun 2020 | A1 |
20200210142 | Mu et al. | Jul 2020 | A1 |
20200211566 | Kang et al. | Jul 2020 | A1 |
20200218074 | Hoover et al. | Jul 2020 | A1 |
20200218780 | Jun et al. | Jul 2020 | A1 |
20200219517 | Wang et al. | Jul 2020 | A1 |
20200221155 | Hansen et al. | Jul 2020 | A1 |
20200226823 | Stachniak et al. | Jul 2020 | A1 |
20200227034 | Summa et al. | Jul 2020 | A1 |
20200227044 | Lindahl | Jul 2020 | A1 |
20200228774 | Kar et al. | Jul 2020 | A1 |
20200243069 | Amores et al. | Jul 2020 | A1 |
20200243094 | Thomson et al. | Jul 2020 | A1 |
20200249985 | Zeitlin | Aug 2020 | A1 |
20200252508 | Gray | Aug 2020 | A1 |
20200258508 | Aggarwal et al. | Aug 2020 | A1 |
20200267222 | Phipps et al. | Aug 2020 | A1 |
20200272485 | Karashchuk et al. | Aug 2020 | A1 |
20200279556 | Gruber et al. | Sep 2020 | A1 |
20200279576 | Binder et al. | Sep 2020 | A1 |
20200279627 | Nida et al. | Sep 2020 | A1 |
20200285327 | Hindi et al. | Sep 2020 | A1 |
20200286472 | Newendorp et al. | Sep 2020 | A1 |
20200286493 | Orr et al. | Sep 2020 | A1 |
20200294487 | Donohoe et al. | Sep 2020 | A1 |
20200294494 | Suyama et al. | Sep 2020 | A1 |
20200298394 | Han et al. | Sep 2020 | A1 |
20200301950 | Theo et al. | Sep 2020 | A1 |
20200302356 | Gruber et al. | Sep 2020 | A1 |
20200302919 | Greborio et al. | Sep 2020 | A1 |
20200302925 | Shah et al. | Sep 2020 | A1 |
20200302930 | Chen et al. | Sep 2020 | A1 |
20200302932 | Schramm et al. | Sep 2020 | A1 |
20200304955 | Gross et al. | Sep 2020 | A1 |
20200304972 | Gross et al. | Sep 2020 | A1 |
20200305084 | Freeman et al. | Sep 2020 | A1 |
20200310513 | Nicholson et al. | Oct 2020 | A1 |
20200312315 | Li et al. | Oct 2020 | A1 |
20200312317 | Kothari et al. | Oct 2020 | A1 |
20200314191 | Madhavan et al. | Oct 2020 | A1 |
20200319850 | Stasior et al. | Oct 2020 | A1 |
20200320592 | Soule et al. | Oct 2020 | A1 |
20200327895 | Gruber et al. | Oct 2020 | A1 |
20200333875 | Bansal et al. | Oct 2020 | A1 |
20200334492 | Zheng et al. | Oct 2020 | A1 |
20200335121 | Mosseri et al. | Oct 2020 | A1 |
20200342082 | Sapozhnykov et al. | Oct 2020 | A1 |
20200342849 | Yu et al. | Oct 2020 | A1 |
20200342863 | Aggarwal et al. | Oct 2020 | A1 |
20200356243 | Meyer et al. | Nov 2020 | A1 |
20200356589 | Rekik et al. | Nov 2020 | A1 |
20200356634 | Srinivasan et al. | Nov 2020 | A1 |
20200357391 | Ghoshal et al. | Nov 2020 | A1 |
20200357406 | York et al. | Nov 2020 | A1 |
20200357409 | Sun et al. | Nov 2020 | A1 |
20200364411 | Evermann | Nov 2020 | A1 |
20200364858 | Kaethner et al. | Nov 2020 | A1 |
20200365155 | Milden | Nov 2020 | A1 |
20200367006 | Beckhardt | Nov 2020 | A1 |
20200372633 | Lee et al. | Nov 2020 | A1 |
20200372904 | Vescovi et al. | Nov 2020 | A1 |
20200372905 | Wang et al. | Nov 2020 | A1 |
20200374243 | Jina et al. | Nov 2020 | A1 |
20200379610 | Ford et al. | Dec 2020 | A1 |
20200379640 | Bellegarda et al. | Dec 2020 | A1 |
20200379726 | Blatz et al. | Dec 2020 | A1 |
20200379727 | Blatz et al. | Dec 2020 | A1 |
20200379728 | Gada et al. | Dec 2020 | A1 |
20200380389 | Eldeeb et al. | Dec 2020 | A1 |
20200380956 | Rossi et al. | Dec 2020 | A1 |
20200380963 | Chappidi et al. | Dec 2020 | A1 |
20200380966 | Acero et al. | Dec 2020 | A1 |
20200380973 | Novitchenko et al. | Dec 2020 | A1 |
20200380980 | Shum et al. | Dec 2020 | A1 |
20200380985 | Gada et al. | Dec 2020 | A1 |
20200382616 | Vaishampayan et al. | Dec 2020 | A1 |
20200382635 | Vora et al. | Dec 2020 | A1 |
20210006943 | Gross et al. | Jan 2021 | A1 |
20210011557 | Lemay et al. | Jan 2021 | A1 |
20210012113 | Petill et al. | Jan 2021 | A1 |
20210012775 | Kang et al. | Jan 2021 | A1 |
20210012776 | Peterson et al. | Jan 2021 | A1 |
20210043190 | Wang et al. | Feb 2021 | A1 |
20210065698 | Topcu et al. | Mar 2021 | A1 |
20210067631 | Van Os et al. | Mar 2021 | A1 |
20210072953 | Amarilio et al. | Mar 2021 | A1 |
20210074264 | Liang et al. | Mar 2021 | A1 |
20210074295 | Moreno et al. | Mar 2021 | A1 |
20210082400 | Vishnoi et al. | Mar 2021 | A1 |
20210090314 | Hussen et al. | Mar 2021 | A1 |
20210092128 | Leblang | Mar 2021 | A1 |
20210097998 | Kim et al. | Apr 2021 | A1 |
20210104232 | Lee et al. | Apr 2021 | A1 |
20210105528 | Van Os et al. | Apr 2021 | A1 |
20210110106 | Vescovi et al. | Apr 2021 | A1 |
20210110115 | Moritz et al. | Apr 2021 | A1 |
20210110254 | Duy et al. | Apr 2021 | A1 |
20210124597 | Ramakrishnan et al. | Apr 2021 | A1 |
20210127220 | Mathieu et al. | Apr 2021 | A1 |
20210134318 | Harvey et al. | May 2021 | A1 |
20210141839 | Tang et al. | May 2021 | A1 |
20210143987 | Xu et al. | May 2021 | A1 |
20210149629 | Martel et al. | May 2021 | A1 |
20210149996 | Bellegarda | May 2021 | A1 |
20210150151 | Jiaming et al. | May 2021 | A1 |
20210151041 | Gruber et al. | May 2021 | A1 |
20210151070 | Binder et al. | May 2021 | A1 |
20210152684 | Weinstein et al. | May 2021 | A1 |
20210165826 | Graham et al. | Jun 2021 | A1 |
20210176521 | Matthews | Jun 2021 | A1 |
20210182716 | Muramoto et al. | Jun 2021 | A1 |
20210191603 | Napolitano et al. | Jun 2021 | A1 |
20210191968 | Orr et al. | Jun 2021 | A1 |
20210208752 | Hwang | Jul 2021 | A1 |
20210208841 | Wilberding | Jul 2021 | A1 |
20210216134 | Fukunaga et al. | Jul 2021 | A1 |
20210216760 | Dominic et al. | Jul 2021 | A1 |
20210224032 | Ryan et al. | Jul 2021 | A1 |
20210224474 | Jerome et al. | Jul 2021 | A1 |
20210233532 | Aram et al. | Jul 2021 | A1 |
20210248804 | Hussen Abdelaziz et al. | Aug 2021 | A1 |
20210249009 | Manjunath et al. | Aug 2021 | A1 |
20210258881 | Freeman et al. | Aug 2021 | A1 |
20210264913 | Schramm et al. | Aug 2021 | A1 |
20210264916 | Kim et al. | Aug 2021 | A1 |
20210271333 | Hindi et al. | Sep 2021 | A1 |
20210273894 | Tian et al. | Sep 2021 | A1 |
20210278956 | Dolbakian et al. | Sep 2021 | A1 |
20210281965 | Malik et al. | Sep 2021 | A1 |
20210294569 | Piersol et al. | Sep 2021 | A1 |
20210294571 | Carson et al. | Sep 2021 | A1 |
20210295602 | Scapel et al. | Sep 2021 | A1 |
20210303116 | Barlow | Sep 2021 | A1 |
20210303342 | Dunn et al. | Sep 2021 | A1 |
20210304075 | Duong et al. | Sep 2021 | A1 |
20210306812 | Gross et al. | Sep 2021 | A1 |
20210312930 | Sugaya | Oct 2021 | A1 |
20210312931 | Paulik et al. | Oct 2021 | A1 |
20210314440 | Matias et al. | Oct 2021 | A1 |
20210318901 | Gruber et al. | Oct 2021 | A1 |
20210327409 | Naik | Oct 2021 | A1 |
20210327410 | Beaufays et al. | Oct 2021 | A1 |
20210334528 | Bray et al. | Oct 2021 | A1 |
20210335342 | Yuan et al. | Oct 2021 | A1 |
20210349605 | Nonaka et al. | Nov 2021 | A1 |
20210349608 | Blatz et al. | Nov 2021 | A1 |
20210350799 | Hansen et al. | Nov 2021 | A1 |
20210350803 | Hansen et al. | Nov 2021 | A1 |
20210350810 | Phipps et al. | Nov 2021 | A1 |
20210352115 | Hansen et al. | Nov 2021 | A1 |
20210357172 | Sinesio et al. | Nov 2021 | A1 |
20210365161 | Ellis et al. | Nov 2021 | A1 |
20210365174 | Ellis et al. | Nov 2021 | A1 |
20210365641 | Zhang et al. | Nov 2021 | A1 |
20210366473 | Maeng | Nov 2021 | A1 |
20210366480 | Lemay et al. | Nov 2021 | A1 |
20210373851 | Stasior et al. | Dec 2021 | A1 |
20210375290 | Hu et al. | Dec 2021 | A1 |
20210377381 | Aggarwal et al. | Dec 2021 | A1 |
20210390259 | Hildick-Smith et al. | Dec 2021 | A1 |
20210390955 | Piernot et al. | Dec 2021 | A1 |
20210393168 | Santarelli et al. | Dec 2021 | A1 |
20210402306 | Huang | Dec 2021 | A1 |
20210407318 | Pitschel et al. | Dec 2021 | A1 |
20210407502 | Vescovi et al. | Dec 2021 | A1 |
20220004825 | Xie et al. | Jan 2022 | A1 |
20220013106 | Deng et al. | Jan 2022 | A1 |
20220019292 | Lemay et al. | Jan 2022 | A1 |
20220021631 | Jina et al. | Jan 2022 | A1 |
20220021978 | Gui et al. | Jan 2022 | A1 |
20220028387 | Walker et al. | Jan 2022 | A1 |
20220030345 | Gong et al. | Jan 2022 | A1 |
20220035999 | Pawelec | Feb 2022 | A1 |
20220043986 | Nell et al. | Feb 2022 | A1 |
20220067283 | Bellegarda et al. | Mar 2022 | A1 |
20220068278 | York et al. | Mar 2022 | A1 |
20220083986 | Duffy et al. | Mar 2022 | A1 |
20220084511 | Nickson et al. | Mar 2022 | A1 |
20220093088 | Sridhar et al. | Mar 2022 | A1 |
20220093095 | Dighe et al. | Mar 2022 | A1 |
20220093101 | Krishnan et al. | Mar 2022 | A1 |
20220093109 | Orr et al. | Mar 2022 | A1 |
20220093110 | Kim et al. | Mar 2022 | A1 |
20220094765 | Niewczas | Mar 2022 | A1 |
20220107780 | Gruber et al. | Apr 2022 | A1 |
20220122615 | Chen et al. | Apr 2022 | A1 |
20220130126 | Delgado et al. | Apr 2022 | A1 |
20220139396 | Gada et al. | May 2022 | A1 |
20220148587 | Drummie et al. | May 2022 | A1 |
20220156041 | Newendorp et al. | May 2022 | A1 |
20220157310 | Newendorp et al. | May 2022 | A1 |
20220157315 | Raux et al. | May 2022 | A1 |
20220197491 | Meyer et al. | Jun 2022 | A1 |
20220206298 | Goodman | Jun 2022 | A1 |
20220214775 | Shah et al. | Jul 2022 | A1 |
20220229985 | Bellegarda et al. | Jul 2022 | A1 |
20220230653 | Binder et al. | Jul 2022 | A1 |
20220253969 | Kamenetskaya et al. | Aug 2022 | A1 |
20220254338 | Gruber et al. | Aug 2022 | A1 |
20220254339 | Acero et al. | Aug 2022 | A1 |
20220254347 | Lindahl | Aug 2022 | A1 |
20220262354 | Greborio et al. | Aug 2022 | A1 |
20220264262 | Gruber et al. | Aug 2022 | A1 |
20220284901 | Novitchenko et al. | Sep 2022 | A1 |
20220293124 | Weinberg et al. | Sep 2022 | A1 |
20220293125 | Maddika et al. | Sep 2022 | A1 |
20220300094 | Hindi et al. | Sep 2022 | A1 |
20220301566 | Van Os et al. | Sep 2022 | A1 |
20220329691 | Chinthakunta et al. | Oct 2022 | A1 |
20220343066 | Kwong et al. | Oct 2022 | A1 |
Number | Date | Country |
---|---|---|
2014100581 | Sep 2014 | AU |
2015203483 | Jul 2015 | AU |
2015101171 | Oct 2015 | AU |
2017203668 | Jan 2018 | AU |
2018100187 | Mar 2018 | AU |
2017222436 | Oct 2018 | AU |
2666438 | Jun 2013 | CA |
709795 | Dec 2015 | CH |
1585479 | Feb 2005 | CN |
101156430 | Apr 2008 | CN |
101771691 | Jul 2010 | CN |
102088421 | Jun 2011 | CN |
102324233 | Jan 2012 | CN |
102340590 | Feb 2012 | CN |
102346557 | Feb 2012 | CN |
102346719 | Feb 2012 | CN |
102368256 | Mar 2012 | CN |
102402985 | Apr 2012 | CN |
102405463 | Apr 2012 | CN |
102449438 | May 2012 | CN |
102483915 | May 2012 | CN |
102495406 | Jun 2012 | CN |
102498457 | Jun 2012 | CN |
102510426 | Jun 2012 | CN |
102520789 | Jun 2012 | CN |
101661754 | Jul 2012 | CN |
102629246 | Aug 2012 | CN |
102647628 | Aug 2012 | CN |
102651217 | Aug 2012 | CN |
102663016 | Sep 2012 | CN |
102681761 | Sep 2012 | CN |
102681896 | Sep 2012 | CN |
102682769 | Sep 2012 | CN |
102682771 | Sep 2012 | CN |
102685295 | Sep 2012 | CN |
102693725 | Sep 2012 | CN |
102693729 | Sep 2012 | CN |
102694909 | Sep 2012 | CN |
202453859 | Sep 2012 | CN |
102708867 | Oct 2012 | CN |
102710976 | Oct 2012 | CN |
102722478 | Oct 2012 | CN |
102737104 | Oct 2012 | CN |
102750087 | Oct 2012 | CN |
102792320 | Nov 2012 | CN |
102801853 | Nov 2012 | CN |
102820033 | Dec 2012 | CN |
102844738 | Dec 2012 | CN |
102866828 | Jan 2013 | CN |
102870065 | Jan 2013 | CN |
102882752 | Jan 2013 | CN |
102890936 | Jan 2013 | CN |
102915221 | Feb 2013 | CN |
102915731 | Feb 2013 | CN |
102917004 | Feb 2013 | CN |
102917271 | Feb 2013 | CN |
102918493 | Feb 2013 | CN |
102955652 | Mar 2013 | CN |
103035240 | Apr 2013 | CN |
103035251 | Apr 2013 | CN |
103038718 | Apr 2013 | CN |
103064956 | Apr 2013 | CN |
103093334 | May 2013 | CN |
103093755 | May 2013 | CN |
103109249 | May 2013 | CN |
103135916 | Jun 2013 | CN |
103187053 | Jul 2013 | CN |
103197963 | Jul 2013 | CN |
103198831 | Jul 2013 | CN |
103209369 | Jul 2013 | CN |
103217892 | Jul 2013 | CN |
103226949 | Jul 2013 | CN |
103236260 | Aug 2013 | CN |
103246638 | Aug 2013 | CN |
103268315 | Aug 2013 | CN |
103280218 | Sep 2013 | CN |
103282957 | Sep 2013 | CN |
103292437 | Sep 2013 | CN |
103324100 | Sep 2013 | CN |
103327063 | Sep 2013 | CN |
103365279 | Oct 2013 | CN |
103366741 | Oct 2013 | CN |
203249629 | Oct 2013 | CN |
103390016 | Nov 2013 | CN |
103412789 | Nov 2013 | CN |
103414949 | Nov 2013 | CN |
103426428 | Dec 2013 | CN |
103455234 | Dec 2013 | CN |
103456303 | Dec 2013 | CN |
103456304 | Dec 2013 | CN |
103456306 | Dec 2013 | CN |
103457837 | Dec 2013 | CN |
103475551 | Dec 2013 | CN |
103477592 | Dec 2013 | CN |
103533143 | Jan 2014 | CN |
103533154 | Jan 2014 | CN |
103543902 | Jan 2014 | CN |
103546453 | Jan 2014 | CN |
103562863 | Feb 2014 | CN |
103582896 | Feb 2014 | CN |
103593054 | Feb 2014 | CN |
103608859 | Feb 2014 | CN |
103620605 | Mar 2014 | CN |
103645876 | Mar 2014 | CN |
103677261 | Mar 2014 | CN |
103686723 | Mar 2014 | CN |
103714816 | Apr 2014 | CN |
103716454 | Apr 2014 | CN |
103727948 | Apr 2014 | CN |
103730120 | Apr 2014 | CN |
103744761 | Apr 2014 | CN |
103760984 | Apr 2014 | CN |
103761104 | Apr 2014 | CN |
103765385 | Apr 2014 | CN |
103778527 | May 2014 | CN |
103780758 | May 2014 | CN |
103792985 | May 2014 | CN |
103794212 | May 2014 | CN |
103795850 | May 2014 | CN |
103809548 | May 2014 | CN |
103841268 | Jun 2014 | CN |
103885663 | Jun 2014 | CN |
103902373 | Jul 2014 | CN |
103930945 | Jul 2014 | CN |
103942932 | Jul 2014 | CN |
103959751 | Jul 2014 | CN |
203721183 | Jul 2014 | CN |
103971680 | Aug 2014 | CN |
104007832 | Aug 2014 | CN |
102693729 | Sep 2014 | CN |
104036774 | Sep 2014 | CN |
104038621 | Sep 2014 | CN |
104050153 | Sep 2014 | CN |
104090652 | Oct 2014 | CN |
104092829 | Oct 2014 | CN |
104113471 | Oct 2014 | CN |
104125322 | Oct 2014 | CN |
104144377 | Nov 2014 | CN |
104145304 | Nov 2014 | CN |
104169837 | Nov 2014 | CN |
104180815 | Dec 2014 | CN |
104185868 | Dec 2014 | CN |
104219785 | Dec 2014 | CN |
104240701 | Dec 2014 | CN |
104243699 | Dec 2014 | CN |
104281259 | Jan 2015 | CN |
104281390 | Jan 2015 | CN |
104284257 | Jan 2015 | CN |
104284486 | Jan 2015 | CN |
104335207 | Feb 2015 | CN |
104335234 | Feb 2015 | CN |
104350454 | Feb 2015 | CN |
104360990 | Feb 2015 | CN |
104374399 | Feb 2015 | CN |
104423625 | Mar 2015 | CN |
104423780 | Mar 2015 | CN |
104427104 | Mar 2015 | CN |
104463552 | Mar 2015 | CN |
104464733 | Mar 2015 | CN |
104487929 | Apr 2015 | CN |
104516522 | Apr 2015 | CN |
104573472 | Apr 2015 | CN |
104575493 | Apr 2015 | CN |
104575501 | Apr 2015 | CN |
104575504 | Apr 2015 | CN |
104584010 | Apr 2015 | CN |
104584096 | Apr 2015 | CN |
104584601 | Apr 2015 | CN |
104604274 | May 2015 | CN |
104679472 | Jun 2015 | CN |
104685898 | Jun 2015 | CN |
104699746 | Jun 2015 | CN |
104731441 | Jun 2015 | CN |
104769584 | Jul 2015 | CN |
104769670 | Jul 2015 | CN |
104798012 | Jul 2015 | CN |
104821167 | Aug 2015 | CN |
104821934 | Aug 2015 | CN |
104836909 | Aug 2015 | CN |
104854583 | Aug 2015 | CN |
104867492 | Aug 2015 | CN |
104869342 | Aug 2015 | CN |
104951077 | Sep 2015 | CN |
104967748 | Oct 2015 | CN |
104969289 | Oct 2015 | CN |
104978963 | Oct 2015 | CN |
105025051 | Nov 2015 | CN |
105027197 | Nov 2015 | CN |
105093526 | Nov 2015 | CN |
105100356 | Nov 2015 | CN |
105144136 | Dec 2015 | CN |
105164678 | Dec 2015 | CN |
105164719 | Dec 2015 | CN |
105190607 | Dec 2015 | CN |
105247511 | Jan 2016 | CN |
105247551 | Jan 2016 | CN |
105264524 | Jan 2016 | CN |
105278681 | Jan 2016 | CN |
105320251 | Feb 2016 | CN |
105320726 | Feb 2016 | CN |
105338425 | Feb 2016 | CN |
105379234 | Mar 2016 | CN |
105430186 | Mar 2016 | CN |
105471705 | Apr 2016 | CN |
105472587 | Apr 2016 | CN |
105516441 | Apr 2016 | CN |
105554217 | May 2016 | CN |
105556592 | May 2016 | CN |
105808200 | Jul 2016 | CN |
105830048 | Aug 2016 | CN |
105869641 | Aug 2016 | CN |
105872222 | Aug 2016 | CN |
105917311 | Aug 2016 | CN |
106030699 | Oct 2016 | CN |
106062734 | Oct 2016 | CN |
106062790 | Oct 2016 | CN |
106415412 | Feb 2017 | CN |
106462383 | Feb 2017 | CN |
106463114 | Feb 2017 | CN |
106465074 | Feb 2017 | CN |
106471570 | Mar 2017 | CN |
106534469 | Mar 2017 | CN |
106558310 | Apr 2017 | CN |
106773742 | May 2017 | CN |
106776581 | May 2017 | CN |
107004412 | Aug 2017 | CN |
107123417 | Sep 2017 | CN |
107450800 | Dec 2017 | CN |
107480161 | Dec 2017 | CN |
107491285 | Dec 2017 | CN |
107491468 | Dec 2017 | CN |
107491469 | Dec 2017 | CN |
107506037 | Dec 2017 | CN |
107545262 | Jan 2018 | CN |
107608998 | Jan 2018 | CN |
107615378 | Jan 2018 | CN |
107623616 | Jan 2018 | CN |
107786730 | Mar 2018 | CN |
107852436 | Mar 2018 | CN |
107871500 | Apr 2018 | CN |
107919123 | Apr 2018 | CN |
107924313 | Apr 2018 | CN |
107978313 | May 2018 | CN |
108268187 | Jul 2018 | CN |
108647681 | Oct 2018 | CN |
109447234 | Mar 2019 | CN |
109657629 | Apr 2019 | CN |
110135411 | Aug 2019 | CN |
110263144 | Sep 2019 | CN |
105164719 | Nov 2019 | CN |
110531860 | Dec 2019 | CN |
110598671 | Dec 2019 | CN |
110647274 | Jan 2020 | CN |
110825469 | Feb 2020 | CN |
110945840 | Mar 2020 | CN |
111124224 | May 2020 | CN |
107123417 | Jun 2020 | CN |
111316203 | Jun 2020 | CN |
112204507 | Jan 2021 | CN |
202016008226 | May 2017 | DE |
1094406 | Apr 2001 | EP |
2431842 | Mar 2012 | EP |
2523109 | Nov 2012 | EP |
2523188 | Nov 2012 | EP |
2551784 | Jan 2013 | EP |
2555536 | Feb 2013 | EP |
2575128 | Apr 2013 | EP |
2632129 | Aug 2013 | EP |
2639792 | Sep 2013 | EP |
2669889 | Dec 2013 | EP |
2672229 | Dec 2013 | EP |
2672231 | Dec 2013 | EP |
2675147 | Dec 2013 | EP |
2680257 | Jan 2014 | EP |
2683147 | Jan 2014 | EP |
2683175 | Jan 2014 | EP |
2672231 | Apr 2014 | EP |
2717259 | Apr 2014 | EP |
2725577 | Apr 2014 | EP |
2733598 | May 2014 | EP |
2733896 | May 2014 | EP |
2743846 | Jun 2014 | EP |
2760015 | Jul 2014 | EP |
2779160 | Sep 2014 | EP |
2781883 | Sep 2014 | EP |
2787683 | Oct 2014 | EP |
2801890 | Nov 2014 | EP |
2801972 | Nov 2014 | EP |
2801974 | Nov 2014 | EP |
2824564 | Jan 2015 | EP |
2849177 | Mar 2015 | EP |
2879402 | Jun 2015 | EP |
2881939 | Jun 2015 | EP |
2891049 | Jul 2015 | EP |
2915021 | Sep 2015 | EP |
2930715 | Oct 2015 | EP |
2938022 | Oct 2015 | EP |
2940556 | Nov 2015 | EP |
2947859 | Nov 2015 | EP |
2950307 | Dec 2015 | EP |
2957986 | Dec 2015 | EP |
2973380 | Jan 2016 | EP |
2985984 | Feb 2016 | EP |
2988513 | Feb 2016 | EP |
2891049 | Mar 2016 | EP |
3032532 | Jun 2016 | EP |
3035329 | Jun 2016 | EP |
3038333 | Jun 2016 | EP |
3107101 | Dec 2016 | EP |
3115905 | Jan 2017 | EP |
3125097 | Feb 2017 | EP |
2672231 | May 2017 | EP |
3161612 | May 2017 | EP |
3200185 | Aug 2017 | EP |
3224708 | Oct 2017 | EP |
3227771 | Oct 2017 | EP |
3246916 | Nov 2017 | EP |
3270658 | Jan 2018 | EP |
3300074 | Mar 2018 | EP |
3336805 | Jun 2018 | EP |
2973380 | Aug 2018 | EP |
2983065 | Aug 2018 | EP |
3382530 | Oct 2018 | EP |
3392876 | Oct 2018 | EP |
3401773 | Nov 2018 | EP |
2973002 | Jun 2019 | EP |
3506151 | Jul 2019 | EP |
3550483 | Oct 2019 | EP |
3567584 | Nov 2019 | EP |
3323058 | Feb 2020 | EP |
3321928 | Apr 2020 | EP |
2011MU03716 | Feb 2012 | IN |
2012MU01227 | Jun 2012 | IN |
2001-325052 | Nov 2001 | JP |
2002-41276 | Feb 2002 | JP |
2002-251235 | Sep 2002 | JP |
2007-34960 | Feb 2007 | JP |
2007-235912 | Sep 2007 | JP |
2007-328635 | Dec 2007 | JP |
2008-11021 | Jan 2008 | JP |
2012-14394 | Jan 2012 | JP |
2012-502377 | Jan 2012 | JP |
2012-22478 | Feb 2012 | JP |
2012-33997 | Feb 2012 | JP |
2012-37619 | Feb 2012 | JP |
2012-40655 | Mar 2012 | JP |
2012-63536 | Mar 2012 | JP |
2012-508530 | Apr 2012 | JP |
2012-89020 | May 2012 | JP |
2012-511774 | May 2012 | JP |
2012-116442 | Jun 2012 | JP |
2012-142744 | Jul 2012 | JP |
2012-147063 | Aug 2012 | JP |
2012-150804 | Aug 2012 | JP |
2012-164070 | Aug 2012 | JP |
2012-165084 | Aug 2012 | JP |
2012-518847 | Aug 2012 | JP |
2012-211932 | Nov 2012 | JP |
2012-220959 | Nov 2012 | JP |
2012-253573 | Dec 2012 | JP |
2013-37688 | Feb 2013 | JP |
2013-46171 | Mar 2013 | JP |
2013-511214 | Mar 2013 | JP |
2013-65284 | Apr 2013 | JP |
2013-73240 | Apr 2013 | JP |
2013-513315 | Apr 2013 | JP |
2013-80476 | May 2013 | JP |
2013-517566 | May 2013 | JP |
2013-131087 | Jul 2013 | JP |
2013-134430 | Jul 2013 | JP |
2013-134729 | Jul 2013 | JP |
2013-140520 | Jul 2013 | JP |
2013-527947 | Jul 2013 | JP |
2013-528012 | Jul 2013 | JP |
2013-148419 | Aug 2013 | JP |
2013-156349 | Aug 2013 | JP |
2013-174987 | Sep 2013 | JP |
2013-535059 | Sep 2013 | JP |
2013-200265 | Oct 2013 | JP |
2013-200423 | Oct 2013 | JP |
2013-205999 | Oct 2013 | JP |
2013-238935 | Nov 2013 | JP |
2013-238936 | Nov 2013 | JP |
2013-248292 | Dec 2013 | JP |
2013-257694 | Dec 2013 | JP |
2013-258600 | Dec 2013 | JP |
2014-2586 | Jan 2014 | JP |
2014-10688 | Jan 2014 | JP |
2014-502445 | Jan 2014 | JP |
2014-26629 | Feb 2014 | JP |
2014-45449 | Mar 2014 | JP |
2014-507903 | Mar 2014 | JP |
2014-60600 | Apr 2014 | JP |
2014-72586 | Apr 2014 | JP |
2014-77969 | May 2014 | JP |
2014-89711 | May 2014 | JP |
2014-109889 | Jun 2014 | JP |
2014-124332 | Jul 2014 | JP |
2014-126600 | Jul 2014 | JP |
2014-127754 | Jul 2014 | JP |
2014-140121 | Jul 2014 | JP |
2014-518409 | Jul 2014 | JP |
2014-142566 | Aug 2014 | JP |
2014-145842 | Aug 2014 | JP |
2014-146940 | Aug 2014 | JP |
2014-150323 | Aug 2014 | JP |
2014-519648 | Aug 2014 | JP |
2014-524627 | Sep 2014 | JP |
2014-191272 | Oct 2014 | JP |
2014-219614 | Nov 2014 | JP |
2014-222514 | Nov 2014 | JP |
2015-1931 | Jan 2015 | JP |
2015-4928 | Jan 2015 | JP |
2015-8001 | Jan 2015 | JP |
2015-12301 | Jan 2015 | JP |
2015-18365 | Jan 2015 | JP |
2015-501022 | Jan 2015 | JP |
2015-501034 | Jan 2015 | JP |
2015-504619 | Feb 2015 | JP |
2015-41845 | Mar 2015 | JP |
2015-52500 | Mar 2015 | JP |
2015-60423 | Mar 2015 | JP |
2015-81971 | Apr 2015 | JP |
2015-83938 | Apr 2015 | JP |
2015-94848 | May 2015 | JP |
2015-514254 | May 2015 | JP |
2015-519675 | Jul 2015 | JP |
2015-520409 | Jul 2015 | JP |
2015-524974 | Aug 2015 | JP |
2015-526776 | Sep 2015 | JP |
2015-527683 | Sep 2015 | JP |
2015-528140 | Sep 2015 | JP |
2015-528918 | Oct 2015 | JP |
2015-531909 | Nov 2015 | JP |
2016-504651 | Feb 2016 | JP |
2016-35614 | Mar 2016 | JP |
2016-508007 | Mar 2016 | JP |
2016-71247 | May 2016 | JP |
2016-119615 | Jun 2016 | JP |
2016-151928 | Aug 2016 | JP |
2016-524193 | Aug 2016 | JP |
2016-156845 | Sep 2016 | JP |
2016-536648 | Nov 2016 | JP |
2017-11608 | Jan 2017 | JP |
2017-19331 | Jan 2017 | JP |
2017-516153 | Jun 2017 | JP |
2017-123187 | Jul 2017 | JP |
2017-211608 | Nov 2017 | JP |
2017-537361 | Dec 2017 | JP |
6291147 | Feb 2018 | JP |
2018-64297 | Apr 2018 | JP |
2018-511095 | Apr 2018 | JP |
2018-101242 | Jun 2018 | JP |
2018-113035 | Jul 2018 | JP |
2018-525950 | Sep 2018 | JP |
2018-536889 | Dec 2018 | JP |
10-2012-0020164 | Mar 2012 | KR |
10-2012-0031722 | Apr 2012 | KR |
10-2012-0066523 | Jun 2012 | KR |
10-2012-0082371 | Jul 2012 | KR |
10-2012-0084472 | Jul 2012 | KR |
10-1178310 | Aug 2012 | KR |
10-2012-0120316 | Nov 2012 | KR |
10-2012-0137424 | Dec 2012 | KR |
10-2012-0137435 | Dec 2012 | KR |
10-2012-0137440 | Dec 2012 | KR |
10-2012-0138826 | Dec 2012 | KR |
10-2012-0139827 | Dec 2012 | KR |
10-1193668 | Dec 2012 | KR |
10-2013-0035983 | Apr 2013 | KR |
10-2013-0086750 | Aug 2013 | KR |
10-2013-0090947 | Aug 2013 | KR |
10-2013-0108563 | Oct 2013 | KR |
10-1334342 | Nov 2013 | KR |
10-2013-0131252 | Dec 2013 | KR |
10-2013-0132200 | Dec 2013 | KR |
10-2013-133629 | Dec 2013 | KR |
10-2014-0007282 | Jan 2014 | KR |
10-2014-24271 | Feb 2014 | KR |
10-2014-0025996 | Mar 2014 | KR |
10-2014-0031283 | Mar 2014 | KR |
10-2014-0033574 | Mar 2014 | KR |
10-2014-42994 | Apr 2014 | KR |
10-2014-55204 | May 2014 | KR |
10-2014-0059697 | May 2014 | KR |
10-2014-68752 | Jun 2014 | KR |
10-2014-0071208 | Jun 2014 | KR |
10-2014-88449 | Jul 2014 | KR |
10-2014-0093949 | Jul 2014 | KR |
10-2014-106715 | Sep 2014 | KR |
10-2014-0107253 | Sep 2014 | KR |
10-2014-147557 | Dec 2014 | KR |
10-2015-0006454 | Jan 2015 | KR |
10-2015-13631 | Feb 2015 | KR |
10-1506510 | Mar 2015 | KR |
10-2015-38375 | Apr 2015 | KR |
10-2015-39380 | Apr 2015 | KR |
10-2015-41974 | Apr 2015 | KR |
10-2015-0043512 | Apr 2015 | KR |
10-1510013 | Apr 2015 | KR |
10-2015-0062811 | Jun 2015 | KR |
10-2015-95624 | Aug 2015 | KR |
10-1555742 | Sep 2015 | KR |
10-2015-113127 | Oct 2015 | KR |
10-2015-0131262 | Nov 2015 | KR |
10-2015-138109 | Dec 2015 | KR |
10-2016-0004351 | Jan 2016 | KR |
10-2016-0010523 | Jan 2016 | KR |
10-2016-0040279 | Apr 2016 | KR |
10-2016-55839 | May 2016 | KR |
10-2016-65503 | Jun 2016 | KR |
10-2016-0101079 | Aug 2016 | KR |
10-2016-101198 | Aug 2016 | KR |
10-2016-105847 | Sep 2016 | KR |
10-2016-121585 | Oct 2016 | KR |
10-2016-0127165 | Nov 2016 | KR |
10-2016-140694 | Dec 2016 | KR |
10-2016-0147854 | Dec 2016 | KR |
10-2017-0004482 | Jan 2017 | KR |
10-2017-36805 | Apr 2017 | KR |
10-2017-0104006 | Sep 2017 | KR |
10-2017-107058 | Sep 2017 | KR |
10-1776673 | Sep 2017 | KR |
10-2018-32632 | Mar 2018 | KR |
10-2018-34637 | Apr 2018 | KR |
10-2018-0135877 | Dec 2018 | KR |
10-1959328 | Mar 2019 | KR |
10-2020-0105519 | Sep 2020 | KR |
2012141604 | Apr 2014 | RU |
201227715 | Jul 2012 | TW |
201245989 | Nov 2012 | TW |
201312548 | Mar 2013 | TW |
201407184 | Feb 2014 | TW |
201610982 | Mar 2016 | TW |
201629750 | Aug 2016 | TW |
2007009225 | Jan 2007 | WO |
2008142472 | Nov 2008 | WO |
2010109358 | Sep 2010 | WO |
2011069035 | Jun 2011 | WO |
2011088053 | Jul 2011 | WO |
2011133573 | Oct 2011 | WO |
2011097309 | Dec 2011 | WO |
2011088053 | Jan 2012 | WO |
2012008434 | Jan 2012 | WO |
201219020 | Feb 2012 | WO |
2012019637 | Feb 2012 | WO |
2012033312 | Mar 2012 | WO |
2012056463 | May 2012 | WO |
201263260 | May 2012 | WO |
2012084965 | Jun 2012 | WO |
2012092562 | Jul 2012 | WO |
2012112331 | Aug 2012 | WO |
2012129231 | Sep 2012 | WO |
2012063260 | Oct 2012 | WO |
2012135157 | Oct 2012 | WO |
2012154317 | Nov 2012 | WO |
2012154748 | Nov 2012 | WO |
2012155079 | Nov 2012 | WO |
2012158407 | Nov 2012 | WO |
2012160567 | Nov 2012 | WO |
2012167168 | Dec 2012 | WO |
2012173902 | Dec 2012 | WO |
2013009578 | Jan 2013 | WO |
201322135 | Feb 2013 | WO |
201322223 | Feb 2013 | WO |
201348880 | Apr 2013 | WO |
201349358 | Apr 2013 | WO |
201357153 | Apr 2013 | WO |
2013101489 | Jul 2013 | WO |
2013118988 | Aug 2013 | WO |
2013122310 | Aug 2013 | WO |
2013128999 | Sep 2013 | WO |
2013133533 | Sep 2013 | WO |
2013137660 | Sep 2013 | WO |
2013163113 | Oct 2013 | WO |
2013163857 | Nov 2013 | WO |
2013169842 | Nov 2013 | WO |
2013173504 | Nov 2013 | WO |
2013173511 | Nov 2013 | WO |
2013176847 | Nov 2013 | WO |
2013184953 | Dec 2013 | WO |
2013184990 | Dec 2013 | WO |
20143138 | Jan 2014 | WO |
20144544 | Jan 2014 | WO |
2014018580 | Jan 2014 | WO |
201421967 | Feb 2014 | WO |
201422148 | Feb 2014 | WO |
201428735 | Feb 2014 | WO |
2014028797 | Feb 2014 | WO |
201431505 | Feb 2014 | WO |
201432461 | Mar 2014 | WO |
2014040022 | Mar 2014 | WO |
2014046475 | Mar 2014 | WO |
201447047 | Mar 2014 | WO |
2014048855 | Apr 2014 | WO |
201466352 | May 2014 | WO |
201470872 | May 2014 | WO |
2014073825 | May 2014 | WO |
201478965 | May 2014 | WO |
201493339 | Jun 2014 | WO |
2014093911 | Jun 2014 | WO |
201496506 | Jun 2014 | WO |
2014124332 | Aug 2014 | WO |
2014137074 | Sep 2014 | WO |
2014138604 | Sep 2014 | WO |
2014143959 | Sep 2014 | WO |
2014144395 | Sep 2014 | WO |
2014144579 | Sep 2014 | WO |
2014144949 | Sep 2014 | WO |
2014149473 | Sep 2014 | WO |
2014151153 | Sep 2014 | WO |
2014124332 | Oct 2014 | WO |
2014159578 | Oct 2014 | WO |
2014159581 | Oct 2014 | WO |
2014162570 | Oct 2014 | WO |
2014169269 | Oct 2014 | WO |
2014173189 | Oct 2014 | WO |
2013173504 | Dec 2014 | WO |
2014197336 | Dec 2014 | WO |
2014197339 | Dec 2014 | WO |
2014197635 | Dec 2014 | WO |
2014197730 | Dec 2014 | WO |
2014200728 | Dec 2014 | WO |
2014204659 | Dec 2014 | WO |
2014210392 | Dec 2014 | WO |
201518440 | Feb 2015 | WO |
201520942 | Feb 2015 | WO |
201529379 | Mar 2015 | WO |
201530796 | Mar 2015 | WO |
2015036817 | Mar 2015 | WO |
201541882 | Mar 2015 | WO |
201541892 | Mar 2015 | WO |
201547932 | Apr 2015 | WO |
201553485 | Apr 2015 | WO |
2015054141 | Apr 2015 | WO |
2015080530 | Jun 2015 | WO |
201584659 | Jun 2015 | WO |
201592943 | Jun 2015 | WO |
201594169 | Jun 2015 | WO |
201594369 | Jun 2015 | WO |
201598306 | Jul 2015 | WO |
201599939 | Jul 2015 | WO |
2015112625 | Jul 2015 | WO |
2015116151 | Aug 2015 | WO |
2015121449 | Aug 2015 | WO |
2015127404 | Aug 2015 | WO |
2015151133 | Oct 2015 | WO |
2015153310 | Oct 2015 | WO |
2015157013 | Oct 2015 | WO |
2015183368 | Dec 2015 | WO |
2015183401 | Dec 2015 | WO |
2015183699 | Dec 2015 | WO |
2015184186 | Dec 2015 | WO |
2015184387 | Dec 2015 | WO |
2015200207 | Dec 2015 | WO |
201627933 | Feb 2016 | WO |
201628946 | Feb 2016 | WO |
201633257 | Mar 2016 | WO |
201639992 | Mar 2016 | WO |
2016040721 | Mar 2016 | WO |
2016048789 | Mar 2016 | WO |
2016051519 | Apr 2016 | WO |
201652164 | Apr 2016 | WO |
201654230 | Apr 2016 | WO |
201657268 | Apr 2016 | WO |
201675081 | May 2016 | WO |
201685775 | Jun 2016 | WO |
201685776 | Jun 2016 | WO |
2016089029 | Jun 2016 | WO |
2016100139 | Jun 2016 | WO |
2016111881 | Jul 2016 | WO |
2016144840 | Sep 2016 | WO |
2016144982 | Sep 2016 | WO |
2016144983 | Sep 2016 | WO |
2016175354 | Nov 2016 | WO |
2016187149 | Nov 2016 | WO |
2016190950 | Dec 2016 | WO |
2016209444 | Dec 2016 | WO |
2016209924 | Dec 2016 | WO |
201744160 | Mar 2017 | WO |
201744257 | Mar 2017 | WO |
201744260 | Mar 2017 | WO |
201744629 | Mar 2017 | WO |
201753311 | Mar 2017 | WO |
201758293 | Apr 2017 | WO |
201759388 | Apr 2017 | WO |
201771420 | May 2017 | WO |
2017142116 | Aug 2017 | WO |
2017160487 | Sep 2017 | WO |
2017200777 | Nov 2017 | WO |
2017203484 | Nov 2017 | WO |
2017213678 | Dec 2017 | WO |
2017213682 | Dec 2017 | WO |
2017218194 | Dec 2017 | WO |
20189397 | Jan 2018 | WO |
2018044633 | Mar 2018 | WO |
2018057269 | Mar 2018 | WO |
2018067528 | Apr 2018 | WO |
2018081833 | May 2018 | WO |
2018176053 | Sep 2018 | WO |
2018209152 | Nov 2018 | WO |
2018213401 | Nov 2018 | WO |
2018213415 | Nov 2018 | WO |
2018213481 | Nov 2018 | WO |
2018217014 | Nov 2018 | WO |
2018231307 | Dec 2018 | WO |
201967930 | Apr 2019 | WO |
201978576 | Apr 2019 | WO |
201979017 | Apr 2019 | WO |
2019143397 | Jul 2019 | WO |
2019147429 | Aug 2019 | WO |
2019190646 | Oct 2019 | WO |
2019236217 | Dec 2019 | WO |
202010530 | Jan 2020 | WO |
2020022572 | Jan 2020 | WO |
2020109074 | Jun 2020 | WO |
2021054565 | Mar 2021 | WO |
2021252230 | Dec 2021 | WO |
2022047214 | Mar 2022 | WO |
Entry |
---|
102324233, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201580046330.7 dated Aug. 23, 2021. |
102346719, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201810019395.8 dated Oct. 29, 2021. |
102495406, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202110571137.2 dated Sep. 30, 2021. |
102520789, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202010735884.0 dated Mar. 10, 2021. |
102663016, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201910330895.8 dated Dec. 15, 2020. |
102681761, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201710551469.8 dated Nov. 10, 2021. |
102708867, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201680049825.X dated Jun. 17, 2022. |
102890936, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201710386932.8 dated Apr. 6, 2021. |
102915731, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201580080518.3 dated Dec. 18, 2020. |
103093755, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202010736257.9 dated Aug. 30, 2021. |
103187053, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201580080518.3 dated Oct. 18, 2021. |
103197963, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201910115436.8 dated Mar. 14, 2022. |
103217892, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202110557428.6 dated Dec. 2, 2021. |
103324100, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201680065149.5 dated Dec. 15, 2021. |
203249629, CN, U, Chinese Patent Office in an Office Action for related Patent Application No. 201680049880.9 dated Apr. 6, 2021. |
103414949, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201680003291.7 dated Mar. 24, 2021. |
103456303, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202010736257.9 dated Aug. 30, 2021. |
103457837, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202010997038.6 dated Sep. 9, 2021. |
103475551, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202010735884.0 dated Mar. 10, 2021. |
103546453, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202110689193.6 dated Aug. 1, 2022. |
103593054, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201710551469.8 dated Jul. 15, 2021. |
103686723, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201680065149.5 dated Dec. 15, 2021. |
103730120, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201680049825.X dated Jun. 17, 2022. |
103761104, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202110943177.5 dated Mar. 8, 2022. |
103778527, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202110571137.2 dated Sep. 30, 2021. |
103780758, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202110557428.6 dated Dec. 2, 2021. |
103809548, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202010735884.0 dated Mar. 10, 2021. |
103885663, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202010356666.6 dated Dec. 31, 2020. |
103942932, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202110571137.2 dated Sep. 30, 2021. |
104036774, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201580080518.3 dated Dec. 18, 2020. |
104092829, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201680065149.5 dated Dec. 15, 2021. |
104185868, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201580080518.3 dated Oct. 18, 2021. |
104240701, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201710551469.8 dated Nov. 10, 2021. |
104360990, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201810998619.4 dated Dec. 28, 2020. |
104423780, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201810998574.0 dated Dec. 25, 2020. |
104464733, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201680049415.5 dated Dec. 28, 2020. |
104575504, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201680049825.X dated Jun. 17, 2022. |
104731441, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201810019395.8 dated Oct. 29, 2021. |
104798012, CN, A, Japanese Patent Office in an Office Action for related Patent Application No. 2020-205151 dated Nov. 26, 2021. |
104836909, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202011041038.5 dated Feb. 26, 2021. |
105338425, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202110689193.6 dated Aug. 1, 2022. |
105516441, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201810019395.8 dated Oct. 29, 2021. |
105554217, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202011041038.5 dated Jan. 26, 2022. |
105872222, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 201980033273.7 dated Jul. 5, 2021. |
106773742, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202011041038.5 dated Feb. 26, 2021. |
107623616, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202010356666.6 dated Dec. 31, 2020. |
107786730, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202010356666.6 dated Dec. 31, 2020. |
108268187, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202110513252.4 dated Mar. 14, 2022. |
110263144, CN, A, WIPO in an Office Action for related Patent Application No. PCT/US2021/036910 dated Sep. 29, 2021. |
111124224, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202110513252.4 dated Mar. 14, 2022. |
2012-40655, JP, A, Japanese Patent Office in an Office Action for related Patent Application No. 2018-087328 dated Nov. 17, 2020. |
2012-511774, JP, A, Japanese Patent Office in an Office Action for related Patent Application No. 2020-172654 dated Oct. 1, 2021. |
2012-165084, JP, A, Japanese Patent Office in an Office Action for related Patent Application No. 2020-205151 dated Nov. 26, 2021. |
2012-220959, JP, A, Japanese Patent Office in an Office Action for related Patent Application No. 2018-087328 dated Nov. 17, 2020. |
2013-131087, JP, A, Japanese Patent Office in an Office Action for related Patent Application No. 2020-172654 dated Oct. 1, 2021. |
2013-174987, JP, A, Japanese Patent Office in an Office Action for related Patent Application No. 2020-172654 dated Oct. 1, 2021. |
2013-200265, JP, A, Japanese Patent Office in an Office Action for related Patent Application No. 2020-217267 dated Nov. 15, 2021. |
2013-238935, JP, A, Japanese Patent Office in an Office Action for related Patent Application No. 2020-172654 dated Oct. 1, 2021. |
2013-248292, JP, A, Japanese Patent Office in an Office Action for related Patent Application No. 2020-217267 dated Nov. 15, 2021. |
2013-257694, JP, A, Japanese Patent Office in an Office Action for related Patent Application No. 2018-192102 dated Mar. 11, 2021. |
2015-1931, JP, A, Japanese Patent Office in an Office Action for related Patent Application No. 2020-217267 dated Nov. 15, 2021. |
2015-520409, JP, A, Japanese Patent Office in an Office Action for related Patent Application No. 2022-054176 dated Jul. 22, 2022. |
2016-35614, JP, A, Japanese Patent Office in an Office Action for related Patent Application No. 2020-123111 dated Jun. 25, 2021. |
2016-156845, JP, A, Japanese Patent Office in an Office Action for related Patent Application No. 2021-131662 dated Jul. 27, 2022. |
2017-11608, JP, A, Korean Patent Office in an Office Action for related Patent Application No. 10-2022-7002780 dated Feb. 22, 2022. |
2017-211608, JP, A, Korean Patent Office in an Office Action for related Patent Application No. 10-2020-7037527 dated Oct. 25, 2021. |
2018-511095, JP, A, Japanese Patent Office in an Office Action for related Patent Application No. 2018-184477 dated Sep. 22, 2022. |
2018-64297, JP, A, Japanese Patent Office in an Office Action for related Patent Application No. 2018-184477 dated Sep. 22, 2022. |
2018-101242, JP, A, Danish Patent Office in an Office Action for related Patent Application No. PA202070658 dated Jan. 22, 2021. |
10-2014-0007282, KR, A, Korean Patent Office in an Office Action for related Patent Application No. 10-2020-7037527 dated Oct. 25, 2021. |
10-2014-0071208, KR, A, Japanese Patent Office in an Office Action for related Patent Application No. 2020-205151 dated Nov. 26, 2021. |
Chenghao, Yuan, “MacroDroid”, Online available at: https://www.ifanr.com/weizhizao/612531, Jan. 25, 2016, 7 pages, Chinese Patent Office in an Office Action for related Patent Application No. 202010167391.1 dated Apr. 20, 2021. |
“How to adjust the order of control center buttons on iPhone iOS12 version after buying a mobile phone”, Available online at: https://jingyan.baidu.com/article/5bbb5albbe5a9713eba1791b.html? Jun. 14, 2019, 4 pages, Chinese Patent Office in an Office Action for related Patent Application No. 202110513252.4 dated Mar. 14, 2022. |
Song, Yang, “Research of Chinese Continuous Digital Speech Input System Based on HTK”, Computer and Digital Engineering, vol. 40, No. 4, Dec. 31, 2012, 5 pages, Chinese Patent Office in an Office Action for related Patent Application No. 201710109781.1 dated Feb. 22, 2021. |
“Use Macrodroid skillfully to automatically clock in with Ding Talk”, Online available at: https://blog.csdn.net/qq_26614295/article/details/84304541, Nov. 20, 2018, 11 pages, Chinese Patent Office in an Office Action for related Patent Application No. 202010167391.1 dated Apr. 20, 2021. |
Zhao et al., “Big Data Analysis and Application”, Aviation Industry Press, Dec. 2015, pp. 236-241, Chinese Patent Office in an Office Action for related Patent Application No. 202010356666.6 dated Jun. 23, 2021. |
Abdelaziz et al., “Speaker-Independent Speech-Driven Visual Speech Synthesis using Domain-Adapted Acoustic Models”, May 15, 2019, 9 pages. |
Accessibility on iOS, Apple Inc., online available at: https://developer.apple.com/accessibility/ios/, Retrieved on Jul. 26, 2021, 2 pages. |
Alsharif et al., “Long Short-Term Memory Neural Network for Keyboard Gesture Decoding”, IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Brisbane, Australia, Sep. 2015, 5 pages. |
Apple Differential Privacy Team, “Learning with Privacy at Scale”, Apple Machine Learning Blog, vol. 1, No. 8, Online available at: <https://machinelearning.apple.com/2017/12/06/learning-with-privacy-at-scale.html>, Dec. 2017, 9 pages. |
Apple, “Apple previews innovative accessibility features combining the power of hardware, software, and machine learning”, Available online at: https://www.apple.com/newsroom/2022/05/apple-previews-innovative-accessibility-features/, May 17, 2022, 10 pages. |
Badshah, et al., “Deep Features-based Speech Emotion Recognition For Smart Affective Services”, Multimedia Tools and Applications, Oct. 31, 2017, pp. 5571-5589. |
Bodapati et al., “Neural Word Decomposition Models for Abusive Language Detection”, Proceedings of the Third Workshop on Abusive Language Online, Aug. 1, 2019, pp. 135-145. |
Büttner et al., “The Design Space of Augmented and Virtual Reality Applications for Assistive Environments in Manufacturing: A Visual Approach”, In Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments (PETRA '17), Island of Rhodes, Greece, online available at: https://dl.acm.org/doi/pdf/10.1145/3056540.3076193, Jun. 21-23, 2017, pp. 433-440. |
Chen, Angela, “Amazon's Alexa now handles patient health information”, Available online at: <https://www.theverge.com/2019/4/4/18295260/amazon-hipaa-alexa-echo-patient-health-information-privacy-voice-assistant>, Apr. 4, 2019, 2 pages. |
Chenghao, Yuan, “MacroDroid”, Online available at: https://www.ifanr.com/weizhizao/612531, Jan. 25, 2016, 7 pages (Official Copy Only). {See communication under 37 CFR § 1.98(a) (3)}. |
“Context-Sensitive User Interface”, Online available at: https://web.archive.org/web/20190407003349/https://en.wikipedia.org/wiki/Context-sensitive_user_interface, Apr. 7, 2019, 3 pages. |
Creswell et al., “Generative Adversarial Networks”, IEEE Signal Processing Magazine, Jan. 2018, pp. 53-65. |
Dai, et al., “Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context”, Online available at: arXiv:1901.02860v3, Jun. 2, 2019, 20 pages. |
Dighe et al., “Lattice-Based Improvements for Voice Triggering Using Graph Neural Networks”, in 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Jan. 25, 2020, 5 pages. |
Dwork et al., “The Algorithmic Foundations of Differential Privacy”, Foundations and Trends in Theoretical Computer Science: vol. 9: No. 3-4, 211-407, 2014, 281 pages. |
Fitzpatrick, Aidan, “Introducing Camo 1.5: AR modes”, Available Online at: “https://reincubate.com/blog/camo-ar-modes-release/”, Oct. 28, 2021, 8 pages. |
Ganin et al., “Unsupervised Domain Adaptation by Backpropagation”, in Proceedings of the 32nd International Conference on Machine Learning, vol. 37, Jul. 2015, 10 pages. |
Gatys et al., “Image Style Transfer Using Convolutional Neural Networks”, Proceedings of IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2016, pp. 2414-2423. |
Geyer et al., “Differentially Private Federated Learning: A Client Level Perspective”, arXiv:1712.07557v2, Mar. 2018, 7 pages. |
Gomes et al., “Mining Recurring Concepts in a Dynamic Feature Space”, IEEE Transactions on Neural Networks and Learning Systems, vol. 25, No. 1, Jul. 31, 2013, pp. 95-110. |
Goodfellow et al., “Generative Adversarial Networks”, Proceedings of the Neural Information Processing Systems, Dec. 2014, 9 pages. |
Graves, Alex, “Sequence Transduction with Recurrent Neural Networks”, Proceeding of International Conference of Machine Learning (ICML) Representation Learning Workshop, Nov. 14, 2012, 9 pages. |
Gu et al., “BadNets: Evaluating Backdooring Attacks on Deep Neural Networks”, IEEE Access, vol. 7, Mar. 21, 2019, pp. 47230-47244. |
Guo et al., “StateLens: A Reverse Engineering Solution for Making Existing Dynamic Touchscreens Accessible”, In Proceedings of the 32nd Annual Symposium on User Interface Software and Technology (UIST '19), New Orleans, LA, USA, online available at: https://dl.acm.org/doi/pdf/10.1145/3332165.3347873, Oct. 20-23, 2019, pp. 371-385. |
Guo et al., “Time-Delayed Bottleneck Highway Networks Using a DFT Feature for Keyword Spotting”, IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2018, 5 pages. |
Guo et al., “VizLens: A Robust and Interactive Screen Reader for Interfaces in the Real World”, In Proceedings of the 29th Annual Symposium on User Interface Software and Technology (UIST '16), Tokyo, Japan, online available at: https://dl.acm.org/doi/pdf/10.1145/2984511.2984518, Oct. 16-19, 2016, pp. 651-664. |
Huang et al., “A Study for Improving Device-Directed Speech Detection Toward Frictionless Human-Machine Interaction”, in Proc. Interspeech, 2019, 5 pages. |
Hawkeye, “Hawkeye—A better user testing platform”, Online Available at: https://www.youtube.com/watch?v=el0TW0g_76o, Oct. 16, 2019, 3 pages. |
Hawkeye, “Learn where people look in your products”, Online Available at: https://www.usehawkeye.com, 2019, 6 pages. |
Heller et al., “AudioScope: Smartphones as Directional Microphones in Mobile Audio Augmented Reality Systems”, In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI '15), Crossings, Seoul, Korea, Online available at: https://dl.acm.org/doi/pdf/10.1145/2702123.2702159, Apr. 18-23, 2015, pp. 949-952. |
Henderson et al., “Efficient Natural Language Response Suggestion for Smart Reply”, Available Online at: https://static.googleusercontent.com/media/research.google.com/en//pubs/archive/1846e8a466c079eae7e90727e27caf5f98f10e0c.pdf, 2017, 15 pages. |
Hinton et al., “Distilling the Knowledge in A Neural Network”, arXiv preprintarXiv:1503.02531, Mar. 2, 2015, 9 pages. |
Hook et al., “Automatic speech-based emotion recognition using paralinguistics features”, Bulletin of the Polish Academy of Sciences, Technical Sciences, vol. 67, No. 3, 2019, pp. 479-488. |
“How to adjust the order of control center buttons on iPhone iOS12 version after buying a mobile phone”, Available online at: https://jingyan.baidu.com/article/5bbb5albbe5a9713eba1791b.html? Jun. 14, 2019, 4 pages (Official Copy Only). {See communication under 37 CFR § 1.98(a) (3)}. |
Idasallinen, “What's The ‘Like’ Meter Based on?”, Online Available at:—<https://community.spotify.com/t5/Content-Questions/What-s-the-like-meter-based-on/td-p/1209974>, Sep. 22, 2015, 6 pages. |
Jeon et al., “Voice Trigger Detection from LVCSR Hypothesis Lattices Using Bidirectional Lattice Recurrent Neural Networks”, International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, Feb. 29, 2020, 5 pages. |
Jeong et al., “Development Trend of N-Screen Service”, Journal of Broadcasting Engineering, vol. 17, No. 1, Sep. 2012, 18 pages (6 pages of English Translation and 12 pages of Official Copy). |
Kannan et al., “Smart Reply: Automated Response Suggestion for Email”, Available Online at: https://arxiv.org/pdf/1606.04870.pdf, Jun. 15, 2016, 10 pages. |
Kondrat, Tomek, “Automation for Everyone with MacroDroid”, Online available at: https://www.xda-developers.com/automation-for-everyone-with-macrodroid/, Nov. 17, 2013, 6 pages. |
Kruger et al., “Virtual World Accessibility with the Perspective Viewer”, Proceedings of ICEAPVI, Athens, Greece, Feb. 12-14, 2015, 6 pages. |
Kumar, Shiu, “Ubiquitous Smart Home System Using Android Application”, International Journal of Computer Networks & Communications (IJCNC) vol. 6, No. 1, Jan. 2014, pp. 33-43. |
Kumatani et al., “Direct Modeling of Raw Audio with DNNs for Wake Word Detection”, in 2017 IEEE Automatic Speech Recognition and Understanding Workshop (ASRU), 2017, 6 pages. |
Li et al., “Deep neural network for short-text sentiment classification”, International Conference on Database Systems for Advanced Applications, Springer, Cham, 2016, 8 pages. |
Lin, Luyuan, “An Assistive Handwashing System with Emotional Intelligence”, Using Emotional Intelligence in Cognitive Intelligent Assistant Systems, 2014, 101 pages. |
Maas et al., “Combining Acoustic Embeddings and Decoding Features for End-Of-Utterance Detection in Real-Time Far-Field Speech Recognition Systems”, in 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2018, 5 pages. |
Mallidi et al., “Device-Directed Utterance Detection”, Proc. Interspeech, Aug. 7, 2018, 4 pages. |
“Method to Provide Remote Voice Navigation Capability on the Device”, ip.com, Jul. 21, 2016, 4 pages. |
Microsoft Soundscape—A map delivered in 3D sound, Microsoft Research, online available at: https://www.microsoft.com/en-us/research/product/soundscape/, Retrieved on Jul. 26, 2021, 5 pages. |
Mnih et al., “Human-Level Control Through Deep Reinforcement Learning”, Nature, vol. 518, Feb. 26, 2015, pp. 529-533. |
Müller et al., “A Taxonomy for Information Linking in Augmented Reality”, AVR 2016, Part I, LNCS 9768, 2016, pp. 368-387. |
Muller et al., “Control Theoretic Models of Pointing”, ACM Transactions on Computer-Human Interaction, Aug. 2017, 36 pages. |
Norouzian et al., “Exploring Attention Mechanism for Acoustic based Classification of Speech Utterances into System-Directed and Non-System-Directed”, International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, Feb. 1, 2019, 5 pages. |
“Nuance Dragon Naturally Speaking”, Version 13 End-User Workbook, Nuance Communications Inc., Sep. 2014, 125 pages. |
Pavlopoulos et al., “ConvAI at SemEval-2019 Task 6: Offensive Language Identification and Categorization with Perspective and BERT”, Proceedings of the 13th International Workshop on Semantic Evaluation (SemEval-2019), Jun. 6-7, 2019, pp. 571-576. |
Phillips, Chris, “Thumbprint Radio: A Uniquely Personal Station Inspired by All of Your Thumbs Up”, Pandora News, Online Available at:—<https://blog.pandora.com/author/chris-phillips/>, Dec. 14, 2015, 7 pages. |
Ping, et al., “Deep Voice 3: Scaling Text to Speech with Convolutional Sequence Learning”, Available online at: https://arxiv.org/abs/1710.07654, Feb. 22, 2018, 16 pages. |
“Pose, Cambridge Dictionary Definition of Pose”, Available online at: <https://dictionary.cambridge.org/dictionary/english/pose>, 4 pages. |
“Radio Stations Tailored to You Based on the Music You Listen to on iTunes”, Apple Announces iTunes Radio, Press Release, Jun. 10, 2013, 3 pages. |
Raux, Antoine, “High-Density Dialog Management the Topic Stack”, Adventures in High Density, Online available at: https://medium.com/adventures-in-high-density/high-density-dialog-management-23efcf91db1e, Aug. 1, 2018, 10 pages. |
Ravi, Sujith, “Google AI Blog: On-device Machine Intelligence”, Available Online at: https://ai.googleblog.com/2017/02/on-device-machine-intelligence.html, Feb. 9, 2017, 4 pages. |
Robbins, F Mike, “Automatically place an Android Phone on Vibrate at Work”, Available online at: https://mikefrobbins.com/2016/07/21/automatically-place-an-android-phone-on-vibrate-at-work/, Jul. 21, 2016, pp. 1-11. |
Rodrigues et al., “Exploring Mixed Reality in Specialized Surgical Environments”, In Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '17), Denver, CO, USA, online available at: https://dl.acm.org/doi/pdf/10.1145/3027063.3053273, May 6-11, 2017, pp. 2591-2598. |
Ross et al., “Epidemiology as a Framework for Large-Scale Mobile Application Accessibility Assessment”, In Proceedings of the 19th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS '17), Baltimore, MD, USA, online available at: https://dl.acm.org/doi/pdf/10.1145/3132525.3132547, Oct. 29-Nov. 1, 2017, pp. 2-11. |
Schenk et al., “GazeEverywhere: Enabling Gaze-only User Interaction on an Unmodified Desktop PC in Everyday Scenarios”, In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI'17). ACM, New York, NY, 30343044. Online Available at: https://doi.org/10.1145/3025453.3025455, May 6-11, 2017, 11 pages. |
Sigtia et al., “Efficient Voice Trigger Detection for Low Resource Hardware”, in Proc. Interspeech 2018, Sep. 2-6, 2018, pp. 2092-2096. |
Sigtia et al., “Multi-Task Learning for Voice Trigger Detection”, in IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2020, Apr. 20, 2020, 5 pages. |
Song, Yang, “Research of Chinese Continuous Digital Speech Input System Based on HTK”, Computer and Digital Engineering, vol. 40, No. 4, Dec. 31, 2012, 5 pages (Official Copy Only). {See communication under 37 CFR § 1.98(a) (3)}. |
Speicher et al., “What is Mixed Reality?”, In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI '19). ACM, Article 537, Glasgow, Scotland, UK, online available at: https://dl.acm.org/doi/pdf/10.1145/3290605.3300767, May 4-9, 2019, 15 pages. |
Sperber et al., “Self-Attentional Models for Lattice Inputs”, in Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, Florence, Italy, Association for Computational Linguistics, Jun. 4, 2019, 13 pages. |
Sutskever et al., “Sequence to Sequence Learning with Neural Networks”, Proceedings of the 27th International Conference on Neural Information Processing Systems, 2014, 9 pages. |
Tamar et al., “Value Iteration Networks”, Advances in Neural Information Processing Systems, vol. 29, 2016, 16 pages. |
Tech Target Contributor, “AI Accelerator”, Available online at: https://searchenterpriseai.techtarget.com/definition/AI-accelerator, Apr. 2018, 3 pages. |
Tech With Brett, “Everything the Google Nest Hub Can Do”, Available online at: https://www.youtube.com/watch?v=x3vdytgru2E, Nov. 12, 2018, 13 pages. |
Tech With Brett, “Google Home Multiple Users Setup”, Available online at: https://www.youtube.com/watch?v=BQOAbRUeFRo&t=257s, Jun. 29, 2017, 4 pages. |
Tkachenko, Sergey, “Chrome will automatically create Tab Groups”, Available online at : https://winaero.com/chrome-will-automatically-create-tab-groups/, Sep. 18, 2020, 5 pages. |
Tkachenko, Sergey, “Enable Tab Groups Auto Create in Google Chrome”, Available online at: https://winaero.com/enable-tab-groups-auto-create-in-google-chrome/, Nov. 30, 2020, 5 pages. |
“Use Macrodroid skillfully to automatically clock in with Ding Talk”, Online available at: https://blog.csdn.net/qq_26614295/article/details/84304541, Nov. 20, 2018, 11 pages (Official Copy Only). {See communication under 37 CFR § 1.98(a) (3)}. |
Vazquez et al., “An Assisted Photography Framework to Help Visually Impaired Users Properly Aim a Camera”, ACM Transactions on Computer-Human Interaction, vol. 21, No. 5, Article 25, Online available at: https://dl.acm.org/doi/pdf/10.1145/2651380, Nov. 2014, 29 pages. |
Velian Speaks Tech, “10 Google Assistant Tips!”, Available online at: https://www.youtube.com/watch?v=3RNWA3NK9fs, Feb. 24, 2020, 3 pages. |
Walker, Amy, “NHS Gives Amazon Free Use of Health Data Under Alexa Advice Deal”, Available online at: <https://www.theguardian.com/society/2019/dec/08/nhs-gives-amazon-free-use-of-health-data-under-alexa-advice-deal>, 3 pages. |
Wang, et al., “Tacotron: Towards End-to-End Speech Synthesis”, Available online at: https://arxiv.org/abs/1703.10135, Apr. 6, 2017, 10 pages. |
Wang, et al., “Training Deep Neural Networks with 8-bit Floating Point Numbers”, 32nd Conference on Neural Information Processing Systems (NeurIPS 2018), 2018, 10 pages. |
Wei et al., “Design and Implement On Smart Home System”, 2013 Fourth International Conference on Intelligent Systems Design and Engineering Applications, Available online at: https://ieeexplore.ieee.org/document/6843433, 2013, pp. 229-231. |
“What's on Spotify?”, Music for everyone, Online Available at:—<https://web.archive.org/web/20160428115328/https://www.spotify.com/us/>, Apr. 28, 2016, 6 pages. |
Win, et al., “Myanmar Text to Speech System based on Tacotron-2”, International Conference on Information and Communication Technology Convergence (ICTC), Oct. 21-23, 2020, pp. 578-583. |
“Working with the Dragon Bar”, Nuance Communications, Inc, Jun. 27, 2016, 2 pages. |
Wu et al., “Monophone-Based Background Modeling for Two-Stage On-device Wake Word Detection”, in 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Apr. 2018, 5 pages. |
Xu et al., “Show, Attend and Tell: Neural Image Caption Generation with Visual Attention”, Proceedings of the 32nd International Conference on Machine Learning, Lille, France, 2015, 10 pages. |
Young et al., “POMDP-Based Statistical Spoken Dialog Systems: A Review”, Proceedings of the IEEE, vol. 101, No. 5, 2013, 18 pages. |
Zhang et al., “Interaction Proxies for Runtime Repair and Enhancement of Mobile Application Accessibility”, In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI '17). ACM, Denver, CO, USA, online available at: https://dl.acm.org/doi/pdf/10.1145/3025453.3025846, May 6-11, 2017, pp. 6024-6037. |
Zhang et al., “Very Deep Convolutional Networks for End-To-End Speech Recognition”, IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2017, 5 pages. |
Zhao et al., “Big Data Analysis and Application”, Aviation Industry Press, Dec. 2015, pp. 236-241 (Official Copy Only). {See communication under 37 CFR § 1.98(a)(3)}. |
Zhao et al., “CueSee: Exploring Visual Cues for People with Low Vision to Facilitate a Visual Search Task”, In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing. ACM, UbiComp '16, Heidelberg, Germany, online available at: https://dl.acm.org/doi/pdf/10.1145/2971648.2971730, Sep. 12-16, 2016, pp. 73-84. |
Zhao et al., “Enabling People with Visual Impairments to Navigate Virtual Reality with a Haptic and Auditory Cane Simulation”, In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI '18). ACM, Article 116, Montréal, QC, Canada, online available at: https://dl.acm.org/doi/pdf/10.1145/3173574.3173690, Apr. 21-26, 2018, 14 pages. |
Zhao et al., “SeeingVR: A Set of Tools to Make Virtual Reality More Accessible to People with Low Vision”, In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI '19). ACM, Article 111, Glasgow, Scotland, UK, Online available at: https://dl.acm.org/doi/pdf/10.1145/3290605.3300341, May 4-9, 2019, 14 pages. |
Zhao et al., “Transferring Age and Gender Attributes for Dimensional Emotion Prediction from Big Speech Data Using Hierarchical Deep Learning”, 2018 4th IEEE International Conference on Big Data Security on Cloud, 2018, pp. 20-24. |
Zheng, et al., “Intent Detection and Semantic Parsing for Navigation Dialogue Language Processing”, 2017 IEEE 20th International Conference on Intelligent Transportation Systems (ITSC), 2017, 6 pages. |
Zhou et al., “Learning Dense Correspondence via 3D-guided Cycle Consistency”, Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016, 10 pages. |
Intention to Grant received for European Patent Application No. 19160560.9, dated Oct. 20, 2022, 9 pages. |
Ashbrook, Daniel L., “Enabling Mobile Microinteractions”, Retrieved from the Internet: URL: http://danielashbrook.com/wp-content/uploads/2012/06/2009-Ashbrook-Thesis.pdf, May 2010, 186 pages. |
Coulouris, et al., “Distributed Systems: Concepts and Design (Fifth Edition)”, Addison-Wesley, May 7, 2011, 391 pages. |
Jefford, et al., “Professional BizTalk Server 2006”, Wrox, May 7, 2007, 398 pages. |
Navigli, Roberto, “Word Sense Disambiguation: A Survey”, ACM Computing Surveys, vol. 41, No. 2, Article 10, Feb. 2009, 69 pages. |
Phoenix Solutions, Inc., “Declaration of Christopher Schmandt Regarding the MIT Galaxy System”, West Interactive Corp., a Delaware Corporation, Document 40, Jul. 2, 2010, 162 pages. |
Stent, et al., “Geo-Centric Language Models for Local Business Voice Search”, AT&T Labs—Research, 2009, pp. 389-396. |
Tur, et al., “The CALO Meeting Assistant System”, IEEE Transactions on Audio, Speech, and Language Processing, vol. 18, No. 6, Aug. 2010, pp. 1601-1611. |
Wikipedia, “Speech Recognition”, available at <http://en.wikipedia.org/wiki/Speech_recognition>, retrieved on Sep. 14, 2011, 12 pages. |
Xu, et al., “Speech-Based Interactive Games for Language Learning: Reading, Translation, and Question-Answering”, Computational Linguistics and Chinese Language Processing, vol. 14, No. 2, Jun. 2009, pp. 133-160. |
“Alexa, Turn Up the Heat! Smartthings Samsung [online]”, Online available at:—<https://web.archive.org/web/20160329142041/https://blog.smartthings.com/news/smartthingsupdates/alexa-turn-up-the-heat/>, Mar. 3, 2016, 3 pages. |
“Ask Alexa—Things That Are Smart Wiki”, Online available at:—<http://thingsthataresmart.wiki/index.php?title=Ask_Alexa&oldid=4283>, Jun. 8, 2016, pp. 1-31. |
“DIRECTV™ Voice”, Now Part of the DIRECTV Mobile App for Phones, Sep. 18, 2013, 5 pages. |
“Galaxy S7: How to Adjust Screen Timeout & Lock Screen Timeout”, Online available at:—<https://www.youtube.com/watch?v=n6e1WKUS2ww>, Jun. 9, 2016, 1 page. |
“Headset Button Controller v7.3 APK Full APP Download for Andriod, Blackberry, iPhone”, Online available at:—<http://fullappdownload.com/headset-button-controller-v7-3-apk/>, Jan. 27, 2014, 11 pages. |
“Hey Google: How to Create a Shopping List with Your Google Assistant”, Online available at:—<https://www.youtube.com/watch?v=w9NCsElax1Y>, May 25, 2018, 1 page. |
“How to Enable Google Assistant on Galaxy S7 and Other Android Phones (No Root)”, Online available at:—<https://www.youtube.com/watch?v=HekIQbWyksE>, Mar. 20, 2017, 1 page. |
“How to Use Ok Google Assistant Even Phone is Locked”, Online available at:—<https://www.youtube.com/watch?v=9B_gP4j_SP8>, Mar. 12, 2018, 1 page. |
“Interactive Voice”, Online available at:—<http://www.helloivee.com/company/>, retrieved on Feb. 10, 2014, 2 pages. |
“iPhone 6 Smart Guide Full Version for SoftBank”, Gijutsu-Hyohron Co., Ltd., vol. 1, Dec. 1, 2014, 4 pages. |
“Link Your Voice to Your Devices with Voice Match, Google Assistant Help”, Online available at: <https://support.google.com/assistant/answer/9071681?co=GENIE.Platform%3DAndroid&hl=en>, Retrieved on Jul. 1, 2020, 2 pages. |
“Meet Ivee, Your Wi-Fi Voice Activated Assistant”, Available Online at:—<http://www.helloivee.com/>, retrieved on Feb. 10, 2014, 8 pages. |
“Mobile Speech Solutions, Mobile Accessibility”, SVOX AG Product Information Sheet, Online available at:—<http://www.svox.com/site/bra840604/con782768/mob965831936.aSQ?osLang=1>, Sep. 27, 2012, 1 page. |
“Natural Language Interface Using Constrained Intermediate Dictionary of Results”, List of Publications Manually reviewed for the Search of U.S. Pat. No. 7,177,798, Mar. 22, 2013, 1 page. |
“Quick Type Keyboard on iOS 8 Makes Typing Easier”, Online available at:—<https://www.youtube.com/watch?v=0CldLR4fhVU>, Jun. 3, 2014, 3 pages. |
“Skilled at Playing my iPhone 5”, Beijing Hope Electronic Press, Jan. 2013, 6 pages. |
“SmartThings +Amazon Echo”, Smartthings Samsung [online], Online available at:—<https://web.archive.org/web/20160509231428/https://blog.smartthings.com/featured/alexa-turn-on-my-smartthings/>, Aug. 21, 2015, 3 pages. |
AAAAPLAY, “Sony Media Remote for iOS and Android”, Online available at: <https://www.youtube.com/watch?v=W8QoeQhlGok>, Feb. 4, 2012, 3 pages. |
Advisory Action received for U.S. Appl. No. 17/125,876, dated Jun. 28, 2022, 6 pages. |
Alfred App, “Alfred”, Online available at:—<http://www.alfredapp.com/>, retrieved on Feb. 8, 2012, 5 pages. |
Anania Peter, “Amazon Echo with Home Automation (Smartthings)”, Online available at:—<https://www.youtube.com/watch?v=LMW6aXmsWNE>, Dec. 20, 2015, 1 page. |
Android Authority, “How to use Tasker: A Beginner's Guide”, Online available at:—<https://youtube.com/watch?v=rDpdS_YWzFc>, May 1, 2013, 1 page. |
Apple, “VoiceOver for OS X”, Online available at:—<http://www.apple.com/accessibility/voiceover/>, May 19, 2014, pp. 1-3. |
Applicant Initiated Interview Summary Received for U.S. Appl. No. 15/495,861, dated Feb. 10, 2020, 3 pages. |
Applicant-Initiated Interview Summary received for U.S. Appl. No. 15/495,861, dated Aug. 25, 2020, 5 pages. |
Applicant-Initiated Interview Summary received for U.S. Appl. No. 17/125,876, dated Jun. 10, 2022, 2 pages. |
Applicant-Initiated Interview Summary received for U.S. Appl. No. 17/125,876, dated Nov. 29, 2021, 2 pages. |
Asakura et al., “What LG thinks; How the TV should be in the Living Room”, HiVi, vol. 31, No. 7, Stereo Sound Publishing, Inc., Jun. 17, 2013, pp. 68-71. |
ASHINGTONDCTECH & Gaming, “SwipeStatusBar—Reveal the Status Bar in a Fullscreen App”, Online Available at: <https://www.youtube.com/watch?v=wA_tT9lAreQ>, Jul. 1, 2013, 3 pages. |
Automate Your Life, “How to Setup Google Home Routines—A Google Home Routines Walkthrough”, Online Available at: <https://www.youtube.com/watch?v=pXokZHP9kZg>, Aug. 12, 2018, 1 page. |
Bell, Jason, “Machine Learning Hands-On for Developers and Technical Professionals”, Wiley, 2014, 82 pages. |
Bellegarda, Jerome R., “Chapter 1: Spoken Language Understanding for Natural Interaction: The Siri Experience”, Natural Interaction with Robots, Knowbots and Smartphones, 2014, pp. 3-14. |
Bellegarda, Jerome R., “Spoken Language Understanding for Natural Interaction: The Siri Experience”, Slideshow retrieved from: <https://www.uni-ulm.de/fileadmin/website_uni_ulm/iui.iwsds2012/files/Bellegarda.pdf>, International Workshop on Spoken Dialog Systems (IWSDS), May 2012, pp. 1-43. |
beointegration.com, “BeoLink Gateway—Programming Example”, Online Available at: <https:/ /www.youtube.com/watch?v=TXDaJFm5UH4>, Mar. 4, 2015, 3 pages. |
Board Decision received for Chinese Patent Application No. 201580029053.9, dated Aug. 19, 2021, 15 pages. |
Board Opinion received for Chinese Patent Application No. 201580029053.9, dated Apr. 8, 2021, 9 pages. |
Burgess, Brian, “Amazon Echo Tip: Enable the Wake Up Sound”, Online available at:—<https://www.groovypost.com/howto/amazon-echo-tip-enable-wake-up-sound/>, Jun. 30, 2015, 4 pages. |
Butcher, Mike, “EVI Arrives in Town to go Toe-to-Toe with Siri”, TechCrunch, Jan. 23, 2012, pp. 1-2. |
Cambria et al., “Jumping NLP curves: A Review of Natural Language Processing Research.”, IEEE Computational Intelligence magazine, 2014, vol. 9, May 2014, pp. 48-57. |
Caraballo et al., “Language Identification Based on a Discriminative Text Categorization Technique”, Iberspeech 2012—VII Jornadas En Tecnologia Del Habla And III Iberian Sltech Workshop, Nov. 21, 2012, pp. 1-10. |
Castleos, “Whole House Voice Control Demonstration”, Online available at:—<https://www.youtube.com/watch?v=9SRCoxrZ_W4>, Jun. 2, 2012, 1 page. |
Chang et al., “Monaural Multi-Talker Speech Recognition with Attention Mechanism and Gated Convolutional Networks”, Interspeech 2018, Sep. 2-6, 2018, pp. 1586-1590. |
Chen et al., “A Convolutional Neural Network with Dynamic Correlation Pooling”, 13th International Conference on Computational Intelligence and Security, IEEE, 2017, pp. 496-499. |
Chen et al., “Progressive Joint Modeling in Unsupervised Single-Channel Overlapped Speech Recognition”, IEEE/ACM Transactions on Audio, Speech, And Language Processing, vol. 26, No. 1, Jan. 2018, pp. 184-196. |
Chen, Yi, “Multimedia Siri Finds and Plays Whatever You Ask For”, PSFK Report, Feb. 9, 2012, pp. 1-9. |
Cheyer, Adam, “Adam Cheyer—About”, Online available at:—<http://www.adam.cheyer.com/about.html>, retrieved on Sep. 17, 2012, pp. 1-2. |
Colt, Sam, “Here's One Way Apple's Smartwatch Could Be Better Than Anything Else”, Business Insider, Aug. 21, 2014, pp. 1-4. |
Conneau et al., “Supervised Learning of Universal Sentence Representations from Natural Language Inference Data”, Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, Copenhagen, Denmark, Sep. 7-11, 2017, pp. 670-680. |
Czech Lucas, “A System for Recognizing Natural Spelling of English Words”, Diploma Thesis, Karlsruhe Institute of Technology, May 7, 2014, 107 pages. |
Decision to Refuse received for European Patent Application No. 15717712.2, dated Jan. 4, 2019, 11 pages. |
Deedeevuu, “Amazon Echo Alarm Feature”, Online available at:—<https://www.youtube.com/watch?v=fdjU8eRLk7c>, Feb. 16, 2015, 1 page. |
Delcroix et al., “Context Adaptive Deep Neural Networks for Fast Acoustic Model Adaptation”, ICASSP, 2015, pp. 4535-4539. |
Delcroix et al., “Context Adaptive Neural Network for Rapid Adaptation of Deep CNN Based Acoustic Models”, Interspeech 2016, Sep. 8-12, 2016, pp. 1573-1577. |
Derrick, Amanda, “How to Set Up Google Home for Multiple Users”, Lifewire, Online available at:—<https://www.lifewire.com/set-up-google-home-multiple-users-4685691>, Jun. 8, 2020, 9 pages. |
Dihelson, “How Can I Use Voice or Phrases as Triggers to Macrodroid?”, Macrodroid Forums, Online Available at:—<https://www.tapatalk.com/groups/macrodroid/how-can-i-use-voice-or-phrases-as-triggers-to-macr-t4845.html>, May 9, 2018, 5 pages. |
Earthling1984, “Samsung Galaxy Smart Stay Feature Explained”, Online available at:—<https://www.youtube.com/watch?v=RpjBNtSjupl>, May 29, 2013, 1 page. |
Eder et al., “At the Lower End of Language—Exploring the Vulgar and Obscene Side of German”, Proceedings of the Third Workshop on Abusive Language Online, Florence, Italy, Aug. 1, 2019, pp. 119-128. |
Edim, et al., “A Multi-Agent Based Virtual Personal Assistant for E-Health Service”, Journal of Information Engineering and Applications, vol. 3, No. 11, 2013, 9 pages. |
Evi, “Meet Evi: The One Mobile Application that Provides Solutions for your Everyday Problems”, Feb. 2012, 3 pages. |
Extended European Search Report received for European Patent Application No. 19160560.9, dated May 17, 2019, 8 pages. |
Filipowicz, Luke, “How to use the QuickType keyboard in iOS 8”, Online available at:—<https://www.imore.com/comment/568232>, Oct. 11, 2014, pp. 1-17. |
Final Office Action received for U.S. Appl. No. 15/495,861, dated Nov. 18, 2019, 16 pages. |
Final Office Action received for U.S. Appl. No. 17/125,876, dated Feb. 1, 2022, 19 pages. |
Findlater et al., “Beyond QWERTY: Augmenting Touch-Screen Keyboards with Multi-Touch Gestures for Non-Alphanumeric Input”, CHI '12, May 5-10, 2012, 4 pages. |
Gadget Hacks, “Tasker Too Complicated? Give MacroDroid a Try [How-To]”, Online available at: <https://www.youtube.com/watch?v=8YL9cWCykKc>, May 27, 2016, 1 page. |
Ghauth et al., “Text Censoring System for Filtering Malicious Content Using Approximate String Matching and Bayesian Filtering”, Proc. 4th INNS Symposia Series on Computational Intelligence in Information Systems, Bandar Seri Begawan, Brunei, 2015, pp. 149-158. |
Google Developers, “Voice search in your app”, Online available at:—<https://www.youtube.com/watch?v=PS1FbB5qWEI>, Nov. 12, 2014, 1 page. |
Guim, Mark, “How to Set a Person-Based Reminder with Cortana”, Online available at:—<http://www.wpcentral.com/how-to-person-based-reminder-cortana>, Apr. 26, 2014, 15 pages. |
Gupta et al., “I-vector-based Speaker Adaptation Of Deep Neural Networks For French Broadcast Audio Transcription”, ICASSP, 2014, 2014, pp. 6334-6338. |
Gupta, Naresh, “Inside Bluetooth Low Energy”, Artech House, 2013, 274 pages. |
Hardawar, Devindra, “Driving App Waze Builds its own Siri for Hands-Free Voice Control”, Online available at:—<http://venturebeat.com/2012/02/09/driving-app-waze-builds-its-own-siri-for-hands-free-voice-control/>, retrieved on Feb. 9, 2012, 4 pages. |
Hashimoto, Yoshiyuki, “Simple Guide for iPhone Siri, which can be Operated with your Voice”, Shuwa System Co., Ltd., vol. 1, Jul. 5, 2012, pp. 8, 130, 131. |
Hershey et al., “Deep Clustering: Discriminative Embeddings For Segmentation and Separation”, Proc. ICASSP, Mar. 2016, 6 pages. |
Hutsko et al., “iPhone All-in-One for Dummies”, 3rd Edition, 2013, 98 pages. |
id3.org, “id3v2.4.0-Frames”, Online available at:—<http://id3.org/id3v2.4.0-frames?action=print>, retrieved on Jan. 22, 2015, pp. 1-41. |
Ikeda, Masaru, “beGlobal Seoul 2015 Startup Battle: Talkey”, YouTube Publisher, Online Available at:—<https://www.youtube.com/watch?v=4Wkp7sAAldg>, May 14, 2015, 1 page. |
Inews and Tech,“How To Use The QuickType Keyboard In IOS 8”, Online available at:—<http://www.inewsandtech.com/how-to-use-the-quicktype-keyboard-in-ios-8/>, Sep. 17, 2014, 6 pages. |
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2015/023089, dated Jan. 12, 2017, 12 pages. |
International Search Report and Written Opinion received for PCT Application No. PCT/US2015/023089, dated Aug. 20, 2015, 16 pages. |
Internet Services and Social Net, “How to Search for Similar Websites”, Online available at:—<https://www.youtube.com/watch?v=nLf2uirpt5s>, see from 0:17 to 1:06, Jul. 4, 2013, 1 page. |
Invitation to Pay Additional Fees and Partial Search Report received for PCT Patent Application No. PCT/US2015/023089, dated Jun. 17, 2015, 7 pages. |
Isik et al., “Single-Channel Multi-Speaker Separation using Deep Clustering”, Interspeech 2016, Sep. 8-12, 2016, pp. 545-549. |
Jonsson et al., “Proximity-based Reminders Using Bluetooth”, 2014 IEEE International Conference on Pervasive Computing and Communications Demonstrations, 2014, pp. 151-153. |
Jouvet et al., “Evaluating Grapheme-to-phoneme Converters in Automatic Speech Recognition Context”, IEEE, 2012, pp. 4821-4824. |
Karn, Ujjwal, “An Intuitive Explanation of Convolutional Neural Networks”, The Data Science Blog, Aug. 11, 2016, 23 pages. |
Kastrenakes, Jacob, “Siri's creators will unveil their new AI bot on Monday”, The Verge, Online available at:—<https://web.archive.org/web/20160505090418/https://www.theverge.com/2016/5/4/11593564/viv-labs-unveiling-monday-new-ai-from-siri-creators>, May 4, 2016, 3 pages. |
Kazmucha Allyson, “How to Send Map Locations Using iMessage”, iMore.com, Online available at:—<http://www.imore.com/how-use-imessage-share-your-location-your-iphone>, Aug. 2, 2012, 6 pages. |
Kickstarter, “Ivee Sleek: Wi-Fi Voice-Activated Assistant”, Online available at:—<https://www.kickstarter.com/projects/ivee/ivee-sleek-wi-fi-voice-activated-assistant>, retrieved on Feb. 10, 2014, pp. 1-13. |
King et al., “Robust Speech Recognition Via Anchor Word Representations”, Interspeech 2017, Aug. 20-24, 2017, pp. 2471-2475. |
Lee, Sungjin, “Structured Discriminative Model for Dialog State Tracking”, Proceedings of the SIGDIAL 2013 Conference, Aug. 22-24, 2013, pp. 442-451. |
Liou et al., “Autoencoder for Words”, Neurocomputing, vol. 139, Sep. 2014, pp. 84-96. |
Liu et al., “Accurate Endpointing with Expected Pause Duration”, Sep. 6-10, 2015, pp. 2912-2916. |
Loukides et al., “What Is the Internet of Things?”, O'Reilly Media, Inc., Online Available at: <https://www.oreilly.com/library/view/what-is-the/9781491975633/>, 2015, 31 pages. |
Luo et al., “Speaker-Independent Speech Separation with Deep Attractor Network”, IEEE/ACM Transactions On Audio, Speech, And Language Processing, vol. 26, No. 4, Apr. 2018, pp. 787-796. |
Marketing Land,“Amazon Echo: Play music”, Online Available at:—<https://www.youtube.com/watch?v=A7V5NPbsXi4>, Apr. 27, 2015, 3 pages. |
Mhatre et al., “Donna Interactive Chat-bot acting as a Personal Assistant”, International Journal of Computer Applications (0975-8887), vol. 140, No. 10, Apr. 2016, 6 pages. |
Mikolov et al., “Linguistic Regularities in Continuous Space Word Representations”, Proceedings of NAACL-HLT, Jun. 9-14, 2013, pp. 746-751. |
Miller Chance, “Google Keyboard Updated with New Personalized Suggestions Feature”, Online available at:—<http://9to5google.com/2014/03/19/google-keyboard-updated-with-new-personalized-suggestions-feature/>, Mar. 19, 2014, 4 pages. |
Modern Techies, “Braina-Artificial Personal Assistant for PC (like Cortana, Siri) !!!!”, Online available at: <https://www.youtube.com/watch?v=_Coo2P8ilqQ>, Feb. 24, 2017, 3 pages. |
Morrison Jonathan, “iPhone 5 Siri Demo”, Online Available at:—<https://www.youtube.com/watch?v=_wHWwG5lhWc>, Sep. 21, 2012, 3 pages. |
My Cool Aids, “What's New”, Online available at:—<http://www.mycoolaids.com/>, 2012, 1 page. |
Myers, Brad A., “Shortcutter for Palm”, Available at: <http://www.cs.cmu.edu/˜pebbles/v5/shortcutter/palm/index.html>, retrieved on Jun. 18, 2014, 10 pages. |
Nakamura et al., “Study of Information Clouding Methods to Prevent Spoilers of Sports Match”, Proceedings of the International Working Conference on Advanced Visual Interfaces (AVI' 12), ISBN: 978-1-4503-1287-5, May 2012, pp. 661-664. |
Nakamura et al., “Study of Methods to Diminish Spoilers of Sports Match: Potential of a Novel Concept “Information Clouding””, vol. 54, No. 4, ISSN: 1882-7764. Online available at: <https://ipsj.ixsq.nii.ac.jp/ej/index.php?active_action=repository_view_main_item_detail&page_id=13&block_id=8&item_id=91589&item_no=1>, Apr. 2013, pp. 1402-1412. |
NDTV, “Sony SmartWatch 2 Launched in India for Rs. 14,990”, available at <http://gadgets.ndtv.com/others/news/sony-smartwatch-2-launched-in-india-for-rs-14990-420319>, Sep. 18, 2013, 4 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/085,465, dated Jul. 28, 2016., 5 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/495,861, dated Apr. 30, 2020, 15 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/495,861, dated Jun. 14, 2018, 11 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/495,861, dated Mar. 1, 2019, 26 pages. |
Non-Final Office Action received for U.S. Appl. No. 17/125,876, dated Oct. 6, 2021, 17 pages. |
Notice of Acceptance received for Australian Patent Application No. 2015284755, dated Oct. 19, 2017, 3 pages. |
Notice of Acceptance received for Australian Patent Application No. 2018200679, dated Jan. 29, 2020, 3 pages. |
Notice of Acceptance received for Australian Patent Application No. 2020203023, dated Apr. 8, 2022, 3 pages. |
Notice of Allowance received for Japanese Patent Application No. 2018-136037, dated Dec. 4, 2020, 4 pages. |
Notice of Allowance received for Japanese Patent Application No. 2020-215571, dated Mar. 18, 2022, 4 pages. |
Notice of Allowance received for Taiwan Patent Application No. 104113312, dated Jan. 18, 2017, 3 pages. |
Notice of Allowance received for U.S. Appl. No. 14/498,503, dated Dec. 18, 2015, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 14/498,503, dated Feb. 26, 2016, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 15/085,465, dated Feb. 14, 2017, 9 pages. |
Notice of Allowance received for U.S. Appl. No. 15/495,861, dated Sep. 18, 2020, 8 pages. |
Notice of Allowance received for U.S. Appl. No. 17/125,876, dated Jul. 27, 2022, 7 pages. |
Office Action received for Australian Patent Application No. 2015284755, dated Oct. 21, 2016, 2 pages. |
Office Action received for Australian Patent Application No. 2018200679, dated Jan. 29, 2019, 3 pages. |
Office Action received for Australian Patent Application No. 2018200679, dated Nov. 14, 2019, 3 pages. |
Office Action received for Australian Patent Application No. 2020203023, dated Apr. 22, 2021, 3 pages. |
Office Action received for Australian Patent Application No. 2020203023, dated Mar. 22, 2022, 4 pages. |
Office Action received for Australian Patent Application No. 2020203023, dated Sep. 16, 2021, 5 pages. |
Office Action received for Chinese Patent Application No. 201580029053.9, dated Aug. 22, 2019, 18 pages. |
Office Action received for Chinese Patent Application No. 201580029053.9, dated Dec. 2, 2019, 14 pages. |
Office Action received for Chinese Patent Application No. 201580029053.9, dated Jan. 29, 2019, 16 pages. |
Office Action received for European Patent Application No. 15717712.2, dated Sep. 11, 2017, 10 pages. |
Office Action received for European Patent Application No. 19160560.9, dated Jul. 16, 2021, 5 pages. |
Office Action received for European Patent Application No. 19160560.9, dated Jun. 9, 2020, 6 pages. |
Office Action received for Japanese Patent Application No. 2016-568608, dated Jun. 20, 2017, 7 pages. |
Office Action received for Japanese Patent Application No. 2016-568608, dated Mar. 19, 2018, 6 pages. |
Office Action received for Japanese Patent Application No. 2018-136037, dated Jul. 29, 2019, 5 pages. |
Office Action received for Japanese Patent Application No. 2018-136037, dated Mar. 30, 2020, 6 pages. |
Office Action received for Japanese Patent Application No. 2020-215571, dated Apr. 23, 2021, 4 pages. |
Office Action received for Japanese Patent Application No. 2020-215571, dated Oct. 15, 2021, 3 pages. |
Office Action received for Taiwan Patent Application No. 104113312, dated Jan. 25, 2016, 10 pages. |
OSXDAILY, “Get a List of Siri Commands Directly from Siri”, Online available at:—<http://osxdaily.com/2013/02/05/list-siri-commands/>, Feb. 5, 2013, 15 pages. |
Pak, Gamerz, “Braina: Artificially Intelligent Assistant Software for Windows PC in (urdu / hindhi)”, Online available at: <https://www.youtube.com/watch?v=JH_rMjw8lqc>, Jul. 24, 2018, 3 pages. |
Pathak et al., “Privacy-preserving Speech Processing: Cryptographic and String-matching Frameworks Show Promise”, In: IEEE signal processing magazine, Online available at:—<http://www.merl.com/publications/docs/TR2013-063.pdf>, Feb. 13, 2013, 16 pages. |
Patra et al., “A Kernel-Based Approach for Biomedical Named Entity Recognition”, Scientific World Journal, vol. 2013, 2013, pp. 1-7. |
PC Mag, “How to Voice Train Your Google Home Smart Speaker”, Online available at: <https://in.pcmag.com/google-home/126520/how-to-voice-train-your-google-home-smart-speaker>, Oct. 25, 2018, 12 pages. |
Pennington et al., “GloVe: Global Vectors for Word Representation”, Proceedings of the Conference on Empirical Methods Natural Language Processing (EMNLP), Doha, Qatar, Oct. 25-29, 2014, pp. 1532-1543. |
Perlow, Jason, “Alexa Loop Mode with Playlist for Sleep Noise”, Online Available at: <https://www.youtube.com/watch?v=nSkSuXziJSg>, Apr. 11, 2016, 3 pages. |
pocketables.com, “AutoRemote example profile”, Online available at: https://www.youtube.com/watch?v=kC_zhUnNZj8, Jun. 25, 2013, 1 page. |
Qian et al., “Single-channel Multi-Talker Speech Recognition With Permutation Invariant Training”, Speech Communication, Issue 104, 2018, pp. 1-11. |
Rasch, Katharina, “Smart Assistants for Smart Homes”, Doctoral Thesis in Electronic and Computer Systems, 2013, 150 pages. |
Rios Mafe, “New Bar Search for Facebook”, YouTube, available at:—<https://www.youtube.com/watch?v=vwgN1WbvCas>, Jul. 19, 2013, 2 pages. |
Ritchie, Rene, “QuickType keyboard in iOS 8: Explained”, Online Available at:—<https://www.imore.com/quicktype-keyboards-ios-8-explained>, Jun. 21, 2014, pp. 1-19. |
Routines, “SmartThings Support”, Online available at:—<https://web.archive.org/web/20151207165701/https://support.smartthings.com/hc/en-us/articles/205380034-Routines>, 2015, 3 pages. |
Rowland et al., “Designing Connected Products: UX for the Consumer Internet of Things”, O'Reilly, May 2015, 452 pages. |
Samsung Support, “Create a Quick Command in Bixby to Launch Custom Settings by at Your Command”, Online Available at:—<https://www.facebook.com/samsungsupport/videos/10154746303151213>, Nov. 13, 2017, 1 page. |
Santos et al., “Fighting Offensive Language on Social Media with Unsupervised Text Style Transfer”, Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (vol. 2: Short Papers), May 20, 2018, 6 pages. |
Seehafer Brent, “Activate Google Assistant on Galaxy S7 with Screen off”, Online available at:—<https://productforums.google.com/forum/#!topic/websearch/Ip3qIGBHLVI>, Mar. 8, 2017, 4 pages. |
Selfridge et al., “Interact: Tightly-coupling Multimodal Dialog with an Interactive Virtual Assistant”, International Conference on Multimodal Interaction, ACM, Nov. 9, 2015, pp. 381-382. |
Senior et al., “Improving DNN Speaker Independence With I-Vector Inputs”, ICASSP, 2014, pp. 225-229. |
Seroter et al., “SOA Patterns with BizTalk Server 2013 and Microsoft Azure”, Packt Publishing, Jun. 2015, 454 pages. |
Settle et al., “End-to-End Multi-Speaker Speech Recognition”, Proc. ICASSP, Apr. 2018, 6 pages. |
Shen et al., “Style Transfer from Non-Parallel Text by Cross-Alignment”, 31st Conference on Neural Information Processing Systems (NIPS 2017), 2017, 12 pages. |
Simonite, Tom, “Confronting Siri: Microsoft Launches Digital Assistant Cortana”, 2014, 2 pages. |
Siou, Serge, “How to Control Apple TV 3rd Generation Using Remote app”, Online available at: <https://www.youtube.com/watch?v=PhyKftZ0S9M>, May 12, 2014, 3 pages. |
Smith, Jake, “Amazon Alexa Calling: How to Set it up and Use it on Your Echo”, iGeneration, May 30, 2017, 5 pages. |
SRI, “SRI Speech: Products: Software Development Kits: EduSpeak”, Online available at:—<http://web.archive.org/web/20090828084033/http://www.speechatsri.com/products/eduspeak.shtml>, retrieved on Jun. 20, 2013, pp. 1-2. |
Summons to Attend Oral Proceedings received for European Patent Application No. 15717712.2, mailed on Apr. 16, 2018, 2 pages. |
Summons to Attend Oral Proceedings received for European Patent Application No. 15717712.2, mailed on Mar. 19, 2018, 9 pages. |
Sundaram et al., “Latent Perceptual Mapping with Data-Driven Variable-Length Acoustic Units for Template-Based Speech Recognition”, ICASSP 2012, Mar. 2012, pp. 4125-4128. |
Sundermeyer et al., “From Feedforward to Recurrent LSTM Neural Networks for Language Modeling”, IEEE Transactions on Audio, Speech, and Language Processing, vol. 23, No. 3, Mar. 2015, pp. 517-529. |
Sundermeyer et al., “LSTM Neural Networks for Language Modeling”, INTERSPEECH 2012, Sep. 9-13, 2012, pp. 194-197. |
Tan et al., “Knowledge Transfer In Permutation Invariant Training For Single-channel Multi-talker Speech Recognition”, ICASSP 2018, 2018, pp. 5714-5718. |
Tofel et al., “SpeakToit: A Personal Assistant for Older iPhones, iPads”, Apple News, Tips and Reviews, Feb. 9, 2012, 7 pages. |
Vaswani et al., “Attention Is All You Need”, 31st Conference on Neural Information Processing Systems (NIPS 2017), 2017, pp. 1-11. |
Villemure et al., “The Dragon Drive Innovation Showcase: Advancing the State-of-the-art in Automotive Assistants”, 2018, 7 pages. |
Vodafone Deutschland, “Samsung Galaxy S3 Tastatur Spracheingabe”, Online available at—<https://www.youtube.com/watch?v=6kOd6Gr8uFE>, Aug. 22, 2012, 1 page. |
Wang et al., “End-to-end Anchored Speech Recognition”, Proc. ICASSP2019, May 12-17, 2019, 5 pages. |
Weng et al., “Deep Neural Networks for Single-Channel Multi-Talker Speech Recognition”, IEEE/ACM Transactions on Audio, Speech, And Language Processing, vol. 23, No. 10, Oct. 2015, pp. 1670-1679. |
Wikipedia, “Home Automation”, Online Available at:—<https://en.wikipedia.org/w/index.php?title=Home_automation&oldid=686569068>, Oct. 19, 2015, 9 pages. |
Wikipedia, “Siri”, Online Available at:—<https://en.wikipedia.org/w/index.php?title=Siri&oldid=689697795>, Nov. 8, 2015, 13 pages. |
Wikipedia, “Virtual Assistant”, Wikipedia, Online Available at:—<https://en.wikipedia.org/w/index.php?title=Virtual_assistant&oldid=679330666>, Sep. 3, 2015, 4 pages. |
X.AI, “How it Works”, Online available at:—<https://web.archive.org/web/20160531201426/https://x.ai/how-it-works/>, May 31, 2016, 6 pages. |
Xiang et al., “Correcting Phoneme Recognition Errors in Learning Word Pronunciation through Speech Interaction”, Speech Communication, vol. 55, No. 1, Jan. 1, 2013, pp. 190-203. |
Xu et al., “Policy Optimization of Dialogue Management in Spoken Dialogue System for Out-of-Domain Utterances”, 2016 International Conference On Asian Language Processing (IALP), IEEE, Nov. 21, 2016, pp. 10-13. |
Yan et al., “A Scalable Approach to Using DNN-derived Features in GMM-HMM Based Acoustic Modeling for LVCSR”, 14th Annual Conference of the International Speech Communication Association, InterSpeech 2013, Aug. 2013, pp. 104-108. |
Yang Astor, “Control Android TV via Mobile Phone APP RKRemoteControl”, Online Available at: <https://www.youtube.com/watch?v=zpmUeOX_xro>, Mar. 31, 2015, 4 pages. |
Yates Michaelc., “How Can I Exit Google Assistant After I'm Finished with it”, Online available at:—<https://productforums.google.com/forum/#!msg/phone-by-google/faECnR2RJwA/gKNtOkQgAQAJ>, Jan. 11, 2016, 2 pages. |
Ye et al., “iPhone 4S Native Secret”, Jun. 30, 2012, 1 page. |
Yeh Jui-Feng, “Speech Act Identification Using Semantic Dependency Graphs with Probabilistic Context-free Grammars”, ACM Transactions on Asian and Low-Resource Language Information Processing, vol. 15, No. 1, Dec. 2015, pp. 5.1-5.28. |
Yousef, Zulfikara., “Braina (A.I) Artificial Intelligence Virtual Personal Assistant”, Online available at:—<https://www.youtube.com/watch?v=2h6xpB8bPSA>, Feb. 7, 2017, 3 pages. |
Yu et al., “Permutation Invariant Training of Deep Models for Speaker-Independent Multi-talker Speech Separation”, Proc. ICASSP, 2017, 5 pages. |
Yu et al., “Recognizing Multi-Talker Speech with Permutation Invariant Training”, Interspeech 2017, Aug. 20-24, 2017, pp. 2456-2460. |
Zainab, “Google Input Tools Shows Onscreen Keyboard in Multiple Languages [Chrome]”, Online available at:—<http://www.addictivetips.com/internet-tips/google-input-tools-shows-multiple-language-onscreen-keyboards-chrome/>, Jan. 3, 2012, 3 pages. |
Zhan et al., “Play with Android Phones”, Feb. 29, 2012, 1 page. |
Zhong et al., “JustSpeak: Enabling Universal Voice Control on Android”, W4A'14, Proceedings of the 11th Web for All Conference, No. 36, Apr. 7-9, 2014, 8 pages. |
Zmolikova et al., “Speaker-Aware Neural Network Based Beamformer For Speaker Extraction in Speech Mixtures”, Interspeech 2017, Aug. 20-24, 2017, pp. 2655-2659. |
Decision to Grant received for European Patent Application No. 19160560.9, dated Feb. 16, 2023, 2 pages. |
Extended European Search Report received for European Patent Application No. 23157829.5, dated Apr. 21, 2023, 7 pages. |
Office Action received for Japanese Patent Application No. 2022-060413, dated May 22, 2023, 12 pages (6 pages of English Translation and 6 pages of Official Copy). |
101771691, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202111371040.3 dated Jul. 1, 2023. |
102088421, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202111371040.3 dated Jul. 1, 2023. |
102915221, CN, A, Chinese Patent Office in an Office Action for related Patent Application No. 202111371040.3 dated Jul. 1, 2023. |
10-2013-0132200, KR, A, Chinese Patent Office in an Office Action for related Patent Application No. 202111371040.3 dated Jul. 1, 2023. |
Deng Xi, “Research on Intelligent Guide System Based on Virtual Human”, Tianjin University of Technology, Jan. 2008, 60 pages, Chinese Patent Office in an Office Action for related Patent Application No. 202111371040.3 dated Jul. 1, 2023. |
Liyin Liu, “Research and Application of Recommendation Technology Based on Logistic Regression”, University of Electronic Science and Technology of China, 2013, 84 pages, Chinese Patent Office in an Office Action for related Patent Application No. 202111371040.3 dated Jul. 1, 2023. |
Notice of Allowance received for Chinese Patent Application No. 202111371040.3, dated Jul. 1, 2023, 5 pages (1 page of English Translation and 4 pages of Official Copy). |
Deng Xi, “Research on Intelligent Guide System Based on Virtual Human”, Tianjin University of Technology, Jan. 2008, 60 pages (Official Copy only) (See Communication Under 37 CFR § 1.98(a) (3)). |
Liyin Liu, “Research and Application of Recommendation Technology Based on Logistic Regression”, University of Electronic Science and Technology of China, 2013, 84 pages (Official Copy only) (See Communication Under 37 CFR § 1.98(a) (3)). |
Number | Date | Country |
---|---|---|
20230066552 A1 | Mar 2023 | US |
Number | Date | Country |
---|---|---|
62019312 | Jun 2014 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 17125876 | Dec 2020 | US |
Child | 17973398 |  | US |
Parent | 15495861 | Apr 2017 | US |
Child | 17125876 |  | US |
Parent | 15085465 | Mar 2016 | US |
Child | 15495861 |  | US |
Parent | 14498503 | Sep 2014 | US |
Child | 15085465 |  | US |