Method and system for operating a multi-function portable electronic device using voice-activation

Information

  • Patent Grant
  • Patent Number: 11,012,942
  • Date Filed: Friday, January 31, 2020
  • Date Issued: Tuesday, May 18, 2021
Abstract
Methods and systems in which a portable electronic device can be voice activated are disclosed. The portable electronic device can be a multi-function electronic device. The voice activation can be robust and context sensitive. The voice activation can also be utilized without any preparatory user action with respect to the portable electronic device. The portable electronic device can also interact with a media system.
Description
BACKGROUND OF THE INVENTION
Field of the Invention

The present invention relates to a portable electronic device and, more particularly, to a multi-function portable electronic device.


Description of the Related Art

Today, cellular phones primarily require manual interaction by a user to invoke functions or to enter data, etc. However, cellular phones can also support limited voice activation. For example, a user can press a predetermined button, then speak a name of a person in the address book of the cellular phone. If the cellular phone recognizes the spoken name, then the person can be automatically called using the phone number provided in the address book. Cellular phones can also be used inside automobiles in a similar fashion. Some automobiles also support hands-free cellular operation by providing an embedded speaker and microphone internal to the vehicle. Bluetooth car kits are also available to add a speaker and microphone for hands-free operation. In any case, with cellular phones, voice commands are conventionally limited to recognizing names of contacts within an address book and require manual user interaction with the cellular phone or automobile (e.g., a button press) prior to any voice command.


Specialized computer programs also exist which can execute on a personal computer and wirelessly interact with a Bluetooth-enabled cellular phone. For example, a user interface displayed on a personal computer can allow a user to dial, answer, hang up and hold calls with respect to a cellular phone. Users can also be alerted at the personal computer of incoming calls or SMS messages. When a call is received at the cellular phone, media playback in progress at the personal computer can be paused.


SUMMARY OF THE INVENTION

The invention pertains to voice activation for a portable electronic device. The portable electronic device can be a multi-function electronic device. The voice activation can be robust and context sensitive. The voice activation can also be utilized without any preparatory user action with respect to the portable electronic device. The portable electronic device can also interact with a media system.


According to one embodiment, one function that can be supported by the portable electronic device is voice communications. When a voice call is incoming to the portable electronic device, the portable electronic device can automatically control itself or the media system to pause, stop and/or lower its volume so that media playback need not disturb a user while participating in the voice call. After the voice call ends, the portable electronic device can automatically control itself or the media system to resume, start and/or raise its volume so that the user can again participate in media playback.


The invention can be implemented in numerous ways, including as a method, system, device, apparatus (including graphical user interface), or computer readable medium. Several embodiments of the invention are discussed below.


As a method for operating a portable electronic device using voice-activated input, one embodiment of the invention can, for example, include at least: operating the portable electronic device to listen for a user voice command; monitoring an operational state of the portable electronic device; receiving an audio input; determining a set of commands that are authorized for usage with the portable electronic device while in the operational state; determining whether the audio input pertains to at least one of the commands within the set of commands; and executing the at least one of the commands within the set of commands that is determined to pertain to the audio input.


As a portable electronic device, one embodiment of the invention can, for example, include at least: a microphone capable of picking up a voice input from a user; a voice analyzer operatively connected to the microphone; and a processor for controlling operation of the portable electronic device. The voice analyzer can be configured to analyze the voice input to determine if one or more predetermined commands are to be performed. The processor can operate to perform the one or more predetermined commands when the voice analyzer determines that the voice input substantially matches characteristics of the one or more predetermined commands.


As a method for operating an electronic device supporting or coupling to a plurality of functions, where one of the functions can be wireless voice communications and another of the functions can be media playback, one embodiment of the invention can, for example, include at least: determining whether a voice call is incoming; determining when media playback is active; outputting a ringtone if a voice call is incoming and media playback is not active; outputting the ringtone mixed with media output if a voice call is incoming and media playback is active; activating a microphone if the microphone is not already active; determining whether a voice command is received while the call is incoming; answering the call when the voice command received requests that the call be answered; pausing or stopping the media playback if media playback is still active when the call is answered; determining whether the call has ended; and resuming or restarting the media playback after the call has ended.


As a computer readable medium including at least computer program code stored thereon for operating a portable electronic device using voice-activated input, one embodiment of the invention can, for example, include at least: computer program code for operating the portable electronic device to listen for a user voice command; computer program code for monitoring an operational state of the portable electronic device; computer program code for determining a set of commands that are authorized for usage with the portable electronic device while in the operational state; computer program code for determining whether an audio input pertains to at least one of the commands within the set of commands; and computer program code for executing the at least one of the commands within the set of commands that is determined to pertain to the audio input.


As a computer readable medium including at least computer program code stored thereon for operating an electronic device supporting or coupling to a plurality of functions, where one of the functions is wireless voice communications and another of the functions is media playback, another embodiment of the invention can, for example, include at least: computer program code for determining whether a voice call is incoming; computer program code for determining when media playback is active; computer program code for outputting a ringtone if a voice call is incoming and media playback is not active; computer program code for outputting the ringtone mixed with media output if a voice call is incoming and media playback is active; computer program code for determining whether a voice command is received while the call is incoming; computer program code for answering the call when the voice command received requests that the call be answered; computer program code for pausing or stopping the media playback if media playback is still active when the call is answered; computer program code for determining whether the call has ended; and computer program code for resuming or restarting the media playback after the call has ended.


Other aspects and advantages of the invention will become apparent from the following detailed description taken in conjunction with the accompanying drawings which illustrate, by way of example, the principles of the invention.





BRIEF DESCRIPTION OF THE DRAWINGS

The invention will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements, and in which:



FIG. 1 is a block diagram of a portable electronic device according to one embodiment of the invention.



FIG. 2 is a block diagram of an electronic device according to one embodiment of the invention.



FIG. 3 is a block diagram of a voice-to-command analyzer according to one embodiment of the invention.



FIG. 4 is a flow diagram of a voice command process according to one embodiment of the invention.



FIG. 5 is a flow diagram of a voice command process according to another embodiment of the invention.



FIG. 6 is a flow diagram of a voice command recognition process according to one embodiment of the invention.



FIGS. 7A-7C illustrate exemplary graphical user interfaces that can be presented on a display device according to certain embodiments of the invention.



FIGS. 8A-8D illustrate exemplary graphical user interfaces that can be provided on a display device of an electronic device according to certain embodiments of the invention.



FIGS. 9A-9E illustrate certain predetermined system configurations for a portable electronic device and a media system.



FIG. 10 illustrates a process involving interaction between a portable electronic device and a media system according to one embodiment of the invention.



FIGS. 11A and 11B are flow diagrams of a process concerning media playback and voice call handling according to one embodiment of the invention.



FIG. 12 is a block diagram of a media player according to one embodiment of the invention.





DETAILED DESCRIPTION OF THE INVENTION

The invention pertains to voice activation for a portable electronic device. The portable electronic device can be a multi-function electronic device. The voice activation can be robust and context sensitive. The voice activation can also be utilized without any preparatory user action with respect to the portable electronic device. The portable electronic device can also interact with a media system.


According to one embodiment, one function that can be supported by the portable electronic device is voice communications. When a voice call is incoming to the portable electronic device, the portable electronic device can automatically control itself or the media system to pause, stop and/or lower its volume so that media playback need not disturb a user while participating in the voice call. After the voice call ends, the portable electronic device can automatically control itself or the media system to resume, start and/or raise its volume so that the user can again participate in media playback.


The invention is well suited for a portable electronic device that can support multiple functions. In one embodiment, the invention is suitable for use with a portable electronic device having at least wireless voice communication capability and media playback capability. The portable electronic device can, for example, be a portable media device (e.g., digital music player or MP3 player) having wireless voice communications. In another embodiment, the portable electronic device can be a wireless communications device (e.g., cellular phone) having media playback and/or media recording capabilities. In still another embodiment, the portable electronic device can be a portable electronic device having media playback or recording capability and workout support via a workout manager. These portable electronic devices can also have other functions (e.g., applications), such as functions supporting electronic calendars, electronic appointments, network browsers, network data transfers, VoIP applications, etc.


Embodiments of the invention are discussed below with reference to FIGS. 1-12. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes as the invention extends beyond these limited embodiments.



FIG. 1 is a block diagram of portable electronic device 100 according to one embodiment of the invention. Portable electronic device 100 includes voice control module 102. Voice control module 102 can be used to control portable electronic device 100. More particularly, a user of portable electronic device 100 can issue voice commands to portable electronic device 100. Voice control module 102 analyzes a user's voice input to determine whether it corresponds to a command understood by voice control module 102. If a command is recognized by voice control module 102, portable electronic device 100 can process the command. The command can pertain to any of a number of functions or operations supported by portable electronic device 100. Since portable electronic device 100 is able to operate in a voice-activated manner, portable electronic device 100 needs few or no user input devices, such as buttons, dials, touch pads and the like. Portable electronic device 100, however, can utilize such user input devices to replace or supplement voice commands.



FIG. 2 is a block diagram of electronic device 200 according to one embodiment of the invention. Device 200 is typically a portable or mobile electronic device. Device 200 can pertain to a computing device, a media player, a mobile telephone, a portable game player, a portable workout manager, and the like. In one embodiment, device 200 is a multi-function device that supports a plurality of different functions. As one example, device 200 can be portable and operate as a mobile telephone while also operating as a media player. As another example, device 200 can operate as a media player while also operating as a portable workout manager.


Device 200 can include processor 202 that controls the overall operation of device 200. Device 200 can further include a program store 204 that stores a plurality of different software modules. The software modules can provide different functions or operations for the device 200. The software modules can correspond to program code for application programs, operating systems, utility programs, and the like.


Device 200 can also include at least one input device 206. Input device 206 can pertain to one or more input buttons, touch-sensitive surfaces, rotary input mechanisms, etc. Input device 206 enables the user to provide user input, such as user selections.


Device 200 can also include a display 208. As appropriate, graphical user interface (GUI) 210 can be presented on display 208. For example, GUI 210 can present a dialog window on display 208 that assists a user in controlling operation of device 200. GUI 210 can also present information to the user of device 200. Input device 206 can assist a user in providing user input to device 200, such as by interacting with GUI 210.


Device 200 also includes a network interface 212. Network interface 212 can establish a link 214 to a network, thereby facilitating wired or wireless network communications. In the case of a wireless network link, network interface 212 can include or pertain to a wireless transceiver.


In addition, device 200 can be controlled by voice control. In this regard, device 200 includes voice-to-command analyzer 216. Voice-to-command analyzer 216 operates to receive an audio input from a user via a microphone 218. Voice-to-command analyzer 216 can then analyze the audio input to determine whether it is requesting execution of a particular one of a set of predetermined commands or a particular one of a set of predetermined macros. As illustrated in FIG. 2, device 200 can include data store 220. Data store 220 can store a plurality of commands or macros as well as other data. These commands or macros are eligible to be executed by device 200 when requested by a voice input. Similarly, voice-to-command analyzer 216 can determine whether the voice input corresponds to a macro from a set of available macros stored in data store 220. The macros can be considered groups or sets of commands which are arranged in a particular sequence. Macro manager 222 can couple to voice-to-command analyzer 216 so that when the voice input corresponds to a macro, the macro manager 222 can manage the performance of the macro, which involves a plurality of commands operated in a particular sequence.
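As an illustrative sketch of the relationship just described, the following Python models a macro as an ordered sequence of commands that a macro manager runs in order. The class names and the example camera commands are assumptions of this sketch, not elements disclosed for device 200 itself:

```python
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Command:
    """A single named action eligible to be executed on a voice request."""
    name: str
    action: Callable[[], None]


@dataclass
class Macro:
    """A group of commands arranged in a particular sequence."""
    name: str
    commands: List[Command] = field(default_factory=list)


class MacroManager:
    """Manages performance of a macro by running its commands in order."""

    def run(self, macro: Macro) -> None:
        for command in macro.commands:
            command.action()


# Example contents of a data store (illustrative commands only).
take = Command("take picture", lambda: print("picture taken"))
save = Command("save picture", lambda: print("picture saved"))
upload = Command("upload picture", lambda: print("picture uploaded"))

snapshot = Macro("snapshot", [take, save, upload])
MacroManager().run(snapshot)  # executes the three commands in sequence
```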


Device 200 can include battery 224 that provides power to device 200. Typically, battery 224 is rechargeable by coupling battery 224 to an AC outlet to allow a charge circuit (not shown) to charge battery 224. Although device 200 is powered by battery 224, in one embodiment, device 200 can also at times utilize power from AC power supplied via a power cord coupled to an AC plug. The AC power, when available, is also used to charge battery 224.



FIG. 3 is a block diagram of voice-to-command analyzer 300 according to one embodiment of the invention. Voice-to-command analyzer 300 is, for example, one implementation of voice-to-command analyzer 216 illustrated in FIG. 2. Voice-to-command analyzer 300 receives a voice input from a microphone (e.g., microphone 218). Voice-to-command analyzer 300 also receives data pertaining to available commands. For example, the available commands can be stored and accessed in a data store, such as data store 220 illustrated in FIG. 2. In addition, voice-to-command analyzer 300 can receive device state information. The device state information can provide voice-to-command analyzer 300 with information concerning the state of the electronic device (e.g., device 200). The device state information can, for example, pertain to a state, condition, event or the like, which can pertain to hardware or software.


As an example, one state associated with the electronic device having voice-to-command analyzer 300 is a context of a graphical user interface (GUI) utilized by the electronic device. The context of the GUI can then provide state information to the voice-to-command analyzer 300. In one embodiment, depending upon the context of the GUI, different available commands can be utilized and provided to the voice-to-command analyzer 300. In general, as the device state changes, different available commands can be presented to voice-to-command analyzer 300. As a result, the available commands being provided to voice-to-command analyzer 300 can be restricted to those that are appropriate given the current state of the electronic device. Eventually, the voice-to-command analyzer 300 can recognize a command from the voice input. The recognized command is one of the available commands presented to voice-to-command analyzer 300. Of course, the voice input may not correlate to any of the available commands, in which case voice-to-command analyzer 300 would not output a recognized command.



FIG. 4 is a flow diagram of voice command process 400 according to one embodiment of the invention. Voice command process 400 is, for example, performed by an electronic device, such as device 100 illustrated in FIG. 1 or device 200 illustrated in FIG. 2.


Voice command process 400 monitors 402 an operational state of a portable electronic device. For example, the operational state may correspond to a functional mode, usage or program being utilized by the portable electronic device. As another example, the operational state can pertain to a state of a graphical user interface being provided on a display associated with the portable electronic device.


The voice command process 400 also receives 404 an audio input. Here, the portable electronic device includes electrical components that enable the portable electronic device to receive 404 an audio input. Typically, the audio input is a voice input provided by a user of the portable electronic device.


Next, a set of commands that are authorized for usage with the portable electronic device while in the operational state can be determined 406. Then, the voice command process 400 can determine 408 whether the audio input pertains to at least one of the commands within the set of commands. Since the determination 408 is limited, in one embodiment, to those commands within the set of commands that are authorized for usage while in the operational state, the determination 408 can be rapidly performed without excessive computational capability and without excessive power consumption. Thereafter, the at least one of the commands within the set of commands that has been determined 408 to pertain to the audio input can be executed 410. Consequently, voice command process 400 receives an audio input from a user, determines which of the limited set of available commands the user is requesting by the audio input, and then executes the appropriate command. Accordingly, an electronic device using voice command process 400 can be commanded or controlled through voice; namely, the electronic device is voice activated.
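A minimal sketch of blocks 406-410 follows. It assumes the audio input has already been transcribed to text by a speech recognizer; that transcription step, and the state and command names, are assumptions of the sketch rather than details of process 400:

```python
# Commands authorized per operational state (illustrative values only).
AUTHORIZED_COMMANDS = {
    "media_menu": {"play", "next", "previous"},
    "incoming_call": {"answer", "voicemail"},
}


def process_audio(operational_state: str, audio_text: str) -> bool:
    """Determine the authorized set for the monitored state (block 406),
    decide whether the audio pertains to one of those commands (block 408),
    and execute the command on a match (block 410)."""
    authorized = AUTHORIZED_COMMANDS.get(operational_state, set())
    spoken = audio_text.strip().lower()
    if spoken in authorized:
        print(f"executing '{spoken}' in state '{operational_state}'")
        return True
    return False


process_audio("incoming_call", "Answer")  # executed
process_audio("incoming_call", "play")    # ignored: not authorized here
```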



FIG. 5 is a flow diagram of voice command process 500 according to another embodiment of the invention. Voice command process 500 is, for example, performed by an electronic device, such as device 100 illustrated in FIG. 1 or device 200 illustrated in FIG. 2.


The voice command process 500 activates 502 a microphone. The device is also operated 504 in a low power mode if appropriate. For example, if the device is substantially idle and no user input is being received, the electronic device can be placed in a low power mode to conserve battery energy. Decision 506 determines whether an audio pickup has been received. The device can receive an audio pickup even while in the low power mode. When decision 506 determines that an audio pickup has not been received, voice command process 500 waits to receive an audio pickup. Once decision 506 determines that an audio pickup has been received, the audio pickup is analyzed 508. When analyzing the audio pickup, the processing can be made efficient and more robust by taking into consideration the context with which the audio pickup has been received. The context can pertain to the electronic device, such as a state of the electronic device. In other words, the audio pickup can be analyzed 508 in a context-sensitive manner.


Next, decision 510 determines whether a voice command has been recognized. When decision 510 determines that a voice command has not been recognized, voice command process 500 returns to repeat decision 506 to subsequently process another audio pickup. On the other hand, when decision 510 determines that a voice command has been recognized, the electronic device is operated 512 in a normal power mode. Here, if the electronic device was in a low power mode, the electronic device is returned to a normal power mode so that the recognized voice command can be quickly and efficiently processed.


In this embodiment, the recognized command can pertain to a macro. Decision 514 determines whether the recognized command is a macro. When the recognized command is not a macro, the recognized command is executed 516. On the other hand, when decision 514 determines that the command is a macro, the associated macro is retrieved 518. The associated macro is then executed 520. Decision 522 then determines whether there are any more operations (e.g., commands) within the associated macro that are to be executed. When decision 522 determines that there are more operations to be executed, voice command process 500 returns to repeat block 520 so that additional operations of the associated macro can be executed. Once decision 522 determines that there are no more operations within the macro to be executed, as well as directly following block 516, voice command process 500 returns to repeat block 502 and subsequent operations so that a subsequent audio pickup can be processed in a similar manner.
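The power-state and macro handling of process 500 can be sketched as follows. The pickup source, command names, and macro are placeholders, and recognition is reduced to pre-labeled strings (None standing for an unrecognized pickup) to keep the sketch short:

```python
from typing import Iterator, Optional


def audio_pickups() -> Iterator[Optional[str]]:
    """Placeholder source of analyzed pickups; None means the pickup
    did not correspond to any recognized voice command."""
    yield None        # unrecognized pickup: remain in low power
    yield "play"      # recognized plain command
    yield "snapshot"  # recognized macro name


MACROS = {"snapshot": ["take picture", "save picture", "upload picture"]}


def voice_command_loop() -> None:
    """Simplified flow of process 500: idle in low power (block 504),
    return to normal power only on a recognized command (blocks 510-512),
    then execute either the command (block 516) or each operation of the
    associated macro (blocks 518-522)."""
    for recognized in audio_pickups():
        print("low power mode: awaiting audio pickup")
        if recognized is None:
            continue
        print("normal power mode")
        for operation in MACROS.get(recognized, [recognized]):
            print(f"executing: {operation}")


voice_command_loop()
```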



FIG. 6 is a flow diagram of a voice command recognition process 600 according to one embodiment of the invention. The voice command recognition process 600 can, for example, pertain to processing associated with decision 510 illustrated in FIG. 5. In other words, the voice command recognition process 600 operates to determine whether the audio pickup pertains to one of the available commands supported by an electronic device. In particular, the voice command recognition process 600 can determine 602 a device context. Those commands available given the device context can then be identified 604. The audio pickup can be correlated 606 to the identified commands. Thereafter, the voice command recognition process 600 determines 608 whether the audio pickup corresponds to one of the identified commands based on the correlation data.
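One way to realize the correlation of blocks 606-608 is a similarity score with an acceptance threshold, as in the sketch below. The use of Python's standard-library SequenceMatcher, the threshold value, and the camera command list are assumptions of the sketch; the process itself does not prescribe a particular correlation technique:

```python
from difflib import SequenceMatcher
from typing import Optional, Sequence


def recognize(audio_text: str, identified: Sequence[str],
              threshold: float = 0.8) -> Optional[str]:
    """Correlate a transcribed pickup to the commands identified for the
    device context (block 606) and report the best match only when it
    correlates strongly enough (block 608)."""
    best, best_score = None, 0.0
    for command in identified:
        score = SequenceMatcher(None, audio_text.lower(), command).ratio()
        if score > best_score:
            best, best_score = command, score
    return best if best_score >= threshold else None


# Commands identified for an assumed camera context (block 604).
camera_commands = ["take picture", "delete picture", "save picture"]
print(recognize("take a picture", camera_commands))  # -> 'take picture'
print(recognize("play music", camera_commands))      # -> None
```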


One aspect of the invention pertains to restricting available commands based on device context. The device context, in one embodiment, pertains to the state of a graphical user interface (GUI). FIGS. 7A-7C illustrate exemplary graphical user interfaces that can be presented on a display device according to certain embodiments of the invention. These exemplary graphical user interfaces are just a few of the many embodiments that can utilize the state of a GUI to restrict or limit the available voice commands to be recognized.



FIG. 7A illustrates exemplary menu 700 suitable for use on a display device associated with an electronic device according to one embodiment of the invention. While menu 700 is being displayed, a user can provide an audio input that pertains to a voice command. When menu 700 is displayed, the voice commands that can be received can be restricted. Menu 700 can be used to navigate to an appropriate media item or a group of media items to be played by the electronic device. While menu 700 is being displayed, a user can request to play a particular media item. For example, the user might provide an audio input, namely, a voice command, by announcing the phrase “play irreplaceable”. Here, the electronic device would recognize that the first portion “play” is a command that is supported and the second term “irreplaceable” is the name of a song available to be played at the electronic device. As another example, the user could provide an audio input, namely, a voice command, by announcing the phrase “play 06”, which could be the user requesting to play a playlist denoted as “summer '06” and available at the media device. As still another example, the user could provide an audio input, namely, a voice command, by announcing one of the menu items of menu 700 (or perhaps even a nested menu), which could effect a selection of such item. For example, the menu items could be categories, classifications, groupings, media items, device settings, device functions, and the like. Menu 700 can represent one menu of a series of nested or hierarchical menus, which can also be navigated or traversed by voice commands.
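The two-part interpretation of a phrase such as “play irreplaceable” — a supported command word followed by the name of an item — can be sketched as below. The tiny library and exact-match lookup are assumptions of the sketch; matching an abbreviation such as “06” to “summer '06” would require fuzzier title matching than shown here:

```python
from typing import Optional, Tuple

# Illustrative library of playable items (titles assumed for the sketch).
LIBRARY = {"irreplaceable", "summer '06"}
SUPPORTED_VERBS = {"play"}


def parse_phrase(phrase: str) -> Optional[Tuple[str, str]]:
    """Split a spoken phrase into a supported command word and the named
    media item, returning None if either portion is unrecognized."""
    verb, _, item = phrase.lower().partition(" ")
    if verb not in SUPPORTED_VERBS or item not in LIBRARY:
        return None
    return verb, item


print(parse_phrase("play irreplaceable"))    # ('play', 'irreplaceable')
print(parse_phrase("delete irreplaceable"))  # None: verb not supported
```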



FIG. 7B illustrates display region 720 of a display device associated with an electronic device according to one embodiment of the invention. Display region 720 includes meeting reminder notification 722. Meeting reminder notification 722 can be displayed on at least a portion of display region 720. In this example, meeting reminder notification 722 informs the user that a meeting they are scheduled to attend starts in “15 minutes” at building “IL1, Room 1.” In this context, the commands available to the user can pertain to permitted interaction with the electronic device in response to the meeting reminder. For example, the acceptable commands can be “clear” or “close”, which request that the electronic device close meeting reminder notification 722. Another example is the command “tell”, which can respond to the meeting attendees with a message. For example, “tell everyone I will be 10 minutes late” will be understood by the electronic device as a request to send a text message or email to all attendees of the meeting that the user will be ten (10) minutes late to the meeting.



FIG. 7C is an illustration of exemplary camera window 740 of a display device associated with an electronic device according to one embodiment of the invention. Camera window 740 can be presented on a display device associated with the electronic device. Camera window 740 is displayed on the display device when the electronic device has been placed in a camera mode. While in the camera mode, the available commands can be specific to likely camera operations. For example, in the camera mode, likely camera operations include taking pictures, deleting pictures, saving pictures, etc. Available commands in the camera mode can also include macros. As an example, a macro can be triggered when an audio input is a command requesting that a current picture be taken. Such a macro can cause the picture to be taken, cause the picture to be saved in memory, and cause the picture to be uploaded. Although the electronic device is voice-activated, in some embodiments, the electronic device also supports the use of non-voice-activated techniques to provide user input. For example, camera window 740 can include soft buttons 742-746 with which the user can provide user input. Soft buttons 742-746 can be activated using a keypad.



FIGS. 8A-8D illustrate exemplary graphical user interfaces that can be provided on a display device of an electronic device according to certain embodiments of the invention. These graphical user interfaces are associated with an electronic device that supports wireless voice communications. These exemplary graphical user interfaces are just a few of the many embodiments that can be utilized by an electronic device that supports wireless voice communications.



FIG. 8A illustrates exemplary graphical user interface (GUI) 800 for an incoming call. GUI 800 is a representative display screen concerning an incoming call from a caller (“Jim Jones”) as provided in upper portion 802 of GUI 800. Lower portion 804 of GUI 800 can display some or all of the available commands that can be spoken by a user to initiate the corresponding actions at the electronic device. As shown in FIG. 8A, the exemplary available commands with respect to the particular context of GUI 800 can include “Answer” or “Voicemail” (or its abbreviated form “VM”).



FIG. 8B illustrates exemplary GUI 810 for a voicemail notification. GUI 810 is a representative display screen concerning voicemail available for a user of the electronic device. In upper portion 812 of GUI 810, the user can be informed that there are new voice messages awaiting their review. For example, as shown in FIG. 8B, the user is informed that there are two new voice messages. Lower portion 814 of GUI 810 can display some or all of the available commands that can be spoken by a user to initiate the corresponding actions at the electronic device. In FIG. 8B, the exemplary available commands illustrated in lower portion 814 can include “Play voicemail” and “Show details”.



FIG. 8C illustrates exemplary GUI 820 for voicemail review. GUI 820 is a representative display screen for reviewing voicemail at the electronic device. In upper portion 822 of GUI 820, descriptive information pertaining to a voicemail that can be reviewed is provided. In the example illustrated in FIG. 8C, the information concerning the voicemail specifies the caller name, date, time and duration for the voicemail. Lower portion 824 can display some or all exemplary available commands that can be spoken by a user to initiate action at the electronic device. In particular, lower portion 824 indicates that the exemplary available commands can include “Play voicemail”, “Delete voicemail”, “Forward to [contact]”, or “Next”. The forward command can specify to forward the voicemail to another person known to the electronic device or to another device. For example, the user could provide the command “Forward to Bob”, which would be understood by the electronic device as a request to forward the voicemail to Bob, who is a known contact (e.g., address book) of the user. As another example, the user could provide the command “Forward to my computer”, which would be understood by the electronic device as a request to forward the voicemail from the portable electronic device (or its associated supporting server) to the user's personal computer.



FIG. 8D illustrates exemplary GUI 830 for playing of a voicemail. GUI 830 is a representative display screen for playing voicemail at the electronic device. Upper portion 832 of GUI 830 can display descriptive information concerning the voicemail being played. In the example illustrated in FIG. 8D, the information concerning the voicemail specifies the caller name, date, time and duration for the voicemail. Lower portion 834 can display some or all of the available commands while the electronic device is presenting GUI 830. In particular, lower portion 834 indicates that the available commands can include “Delete voicemail”, “Forward to [contact]”, or “Text reply [Msg]”. The text reply command can specify to send a reply text message to another person known to the electronic device or to another device. For example, the spoken phrase could be “Text reply meet you at noon for lunch,” which causes a text message “meet you at noon for lunch” to be sent to Jim, who is the sender of the message being replied to.


According to another embodiment of the invention, a portable electronic device can be used in conjunction with a media system. The media system can pertain to a television system, a home stereo, a personal computer, and the like. The media system can also be referred to as a home entertainment system. FIGS. 9A-9E illustrate certain predetermined system configurations for a portable electronic device and a media system.



FIG. 9A is a block diagram of system configuration 900 according to one embodiment of the invention. System configuration 900 can include media system 902 and portable electronic device 904. Portable electronic device 904 is an electronic device, such as a personal computer, mobile communication device, media player (including portable media player), etc. Portable electronic device 904 can couple to media system 902 and thus be used in conjunction with media system 902. In FIG. 9A, portable electronic device 904 is shown as being apart from media system 902 but connected by way of wired link 906. Wired link 906 may connect media system 902 and portable electronic device 904 directly or through intermediate equipment, such as a network.



FIG. 9B is a block diagram of system configuration 900′ according to another embodiment of the invention. System configuration 900′ is generally similar to system configuration 900 illustrated in FIG. 9A. However, in FIG. 9B, portable electronic device 904 has been physically connected to media system 902. In one embodiment, media system 902 can include receptacle 910 that is capable of receiving portable electronic device 904, thereby providing a direct connection between portable electronic device 904 and media system 902.



FIG. 9C is a block diagram of system configuration 900″ according to another embodiment of the invention. System configuration 900″ is generally similar to system configuration 900 as illustrated in FIG. 9A. However, in FIG. 9C, portable electronic device 904 is brought within proximity to media system 902. When portable electronic device 904 is proximate to media system 902, wireless data link 912 can be provided as a short range wireless data link between portable electronic device 904 and media system 902.



FIG. 9D is a block diagram of system configuration 900″′ according to still another embodiment of the invention. System configuration 900″′ can include portable electronic device 904 and media system 902 as discussed above in FIG. 9A. However, system configuration 900″′ can further include auxiliary device 914 that is electrically connected to media system 902 by way of cable (or wire) 916. In one embodiment, auxiliary device 914 can pertain to a peripheral device for media system 902. One specific example of auxiliary device 914 is a docking station. Auxiliary device 914 can include a receptacle to receive portable electronic device 904 similar to receptacle 910 illustrated in FIG. 9B. Alternatively, auxiliary device 914 could permit a wireless data link to be established between portable electronic device 904 and auxiliary device 914 so long as such devices are in proximity, which is similar to wireless data link 912 illustrated in FIG. 9C. Auxiliary device 914 can also be referred to as an intermediate device. In other words, auxiliary device 914 as shown in FIG. 9D is provided between portable electronic device 904 and media system 902. The intermediate device can pertain to a dock, adapter, media station, media player, personal computer, etc. In one example, an adapter can pertain to a cigarette lighter adapter that can be utilized in a cigarette lighter as typically provided in an automobile.



FIG. 9E is a block diagram of local environment 950 according to one embodiment of the invention. Local environment 950 can pertain to an automobile environment, a home environment, an office environment or other relatively constrained local environment. Within local environment 950, portable electronic device 952 can interact with media system 954. Media system 954 can pertain to a television system, a home stereo, a personal computer, and the like. Media system 954 can also be referred to as a home entertainment system. Accessory device 956 can also be provided in local environment 950. Portable electronic device 952 can include microphone 958 and speaker 960. Speaker 960 can be used to output audio sound (audio output) to the user. For example, the audio output can pertain to a voice call or media output. Microphone 958 can be utilized to pick up voice commands that are used by portable electronic device 952 or media system 954. Accessory device 956 can also include microphone 962 to pick up voice commands. Such voice commands can be supplied to media system 954 which, in turn, can supply them to portable electronic device 952, or the voice commands can be directly provided from accessory device 956 to portable electronic device 952. Accessory device 956 can also include wireless module 964. Wireless module 964 can permit accessory device 956 to wirelessly communicate to wireless headset 966. The wireless protocol being utilized between wireless headset 966 and wireless module 964 can pertain to Bluetooth technology or other short range wireless technology. Headset 966 can receive and/or output audio from/to media system 954 or portable electronic device 952. Accessory device 956 could also include a speaker (not shown) to provide audio output.


According to one aspect of the invention, a portable electronic device can interact with a media system. The interaction can be provided via a direct connection, a wired connection to a network, or a wireless connection to a network.



FIG. 10 illustrates process 1000 involving interaction between a portable electronic device and a media system according to one embodiment of the invention. In this embodiment, the context of the interaction is such that the media system is playing media using media data provided by the portable electronic device, while also answering a telephone call at the portable electronic device.


Process 1000 is a representative process that can be utilized between a portable electronic device and a media system according to one embodiment of the invention. At step 1, media to be played on the media system can be selected. A play command and the media data can then be sent to the media system (step 2). At the media system, the play command and the media data can be received and then the play command executed (step 3). Hence, media corresponding to the media data is output (step 4). Here, in this embodiment, the media data for the media to be played is provided by the portable electronic device to the media system. In another embodiment, the media data could be resident on the media system and, when the play command is executed, the media could be output from the media data resident on the media system.


At some time later, assuming that the media is still being output, an incoming call can be detected (step 5). When an incoming call is detected (step 5), a ringtone command can be sent to the media system (step 6). The media system can subsequently receive and execute the ringtone command (step 7). When the ringtone command is executed, a ringtone is output (step 8). When the ringtone is output (step 8), the user of the portable electronic device understands that there is an incoming call that can be answered. It should be understood that the ringtone could also be output directly at the portable electronic device. However, one advantage of outputting the ringtone by the media system is that the media being output by the media system can also continue to be output in a manner such that the ringtone can still be heard. For example, when outputting the ringtone, the output of the media (step 4) could have its volume lowered. In any case, at step 9, it is determined whether the user desires to answer the call. In this embodiment, it is assumed that the user will signal the portable electronic device using a voice command. Alternatively, the user can signal the portable electronic device to answer the call through a physical selection (e.g., button press). Hence, when the user has signaled to answer the call by a voice command, an answer call command will be executed (step 10). Since the call is being answered, a pause media command can be sent to the media system (step 11). The media system then receives and executes the pause media command (step 12). In doing so, media output is paused (step 13). Then, the user participates in the call and at some time later determines to end the call (step 14). Again, the determination to end the call can be made in a voice-activated manner. Alternatively, the end of the call can be initiated through a physical selection (e.g., button press). In any case, when the determination is made to end the call (step 14), the call is ended (step 15). A resume media command can then be sent to the media system (step 16). At the media system, the resume media command can be received and executed (step 17). The media output is then resumed (step 18).
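The exchange of steps 1-18 can be pictured as two cooperating objects, with the portable electronic device sending commands that the media system executes. The class and method names below are assumptions of this sketch:

```python
class MediaSystem:
    """Receives and executes commands sent by the portable device
    (steps 3, 7, 12 and 17 of process 1000)."""

    def execute(self, command: str) -> None:
        outputs = {
            "play": "media output started",            # step 4
            "ringtone": "ringtone output over media",  # step 8
            "pause": "media output paused",            # step 13
            "resume": "media output resumed",          # step 18
        }
        print(f"media system: {outputs[command]}")


class PortableDevice:
    """Sends commands to the media system as call events occur."""

    def __init__(self, media_system: MediaSystem) -> None:
        self.media_system = media_system

    def play_media(self) -> None:      # steps 1-2
        self.media_system.execute("play")

    def incoming_call(self) -> None:   # steps 5-6
        self.media_system.execute("ringtone")

    def answer_call(self) -> None:     # steps 10-11
        self.media_system.execute("pause")

    def end_call(self) -> None:        # steps 15-16
        self.media_system.execute("resume")


device = PortableDevice(MediaSystem())
device.play_media()
device.incoming_call()
device.answer_call()  # user answered, e.g., by voice command (steps 9-10)
device.end_call()     # user determined to end the call (steps 14-15)
```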



FIGS. 11A and 11B are flow diagrams of process 1100 according to one embodiment of the invention. Process 1100 concerns media playback and voice call handling. In one example, process 1100 can be performed by a portable electronic device supporting wireless voice communications and media playback. In another example, process 1100 can be performed by a portable electronic device supporting wireless voice communications and a media system providing media playback.


Process 1100 can begin with decision 1102 that determines whether a call is incoming. When decision 1102 determines that a call is not incoming, process 1100 waits for an incoming call. On the other hand, when decision 1102 determines that a call is incoming, decision 1104 determines whether media playback is active. When decision 1104 determines that media playback is not active, a ringtone can be output 1106. Alternatively, when decision 1104 determines that media playback is active, the volume of the media output can be limited 1108. Also, a ringtone mixed with the media output can be output 1110. Following block 1106 or block 1110, a microphone can be activated 1112.


Next, decision 1114 determines whether a voice command has been received. When decision 1114 determines that a voice command has not been received, decision 1116 determines whether a time-out has occurred. The time-out refers to a predetermined period of time during which the user of the electronic device can answer the incoming call. During this period of time, the microphone is activated so that a voice command can be received. When decision 1116 determines that a time-out has not yet occurred, process 1100 returns to repeat decision 1114 to continue to determine whether a voice command has been received. When decision 1114 determines that a voice command has been received, decision 1118 can determine whether a “who is it” command has been received. The “who is it” command is one type of voice command that can be received. When decision 1118 determines that a “who is it” command has been received, then caller information can be presented 1120. Presentation 1120 of caller information can be performed using a display device and/or by audio output. Following block 1120, process 1100 returns to repeat decision 1114 and subsequent blocks.


On the other hand, when decision 1118 determines that the voice command received is not a “who is it” command, decision 1122 determines whether the voice command is an answer call command. When decision 1122 determines that the voice command is not an answer call command, decision 1124 determines whether the call is to be manually answered. When decision 1124 determines that the call is not to be manually answered, process 1100 returns to repeat decision 1114. Alternatively, when decision 1122 determines that the voice command received is an answer call command, as well as following decision 1124 when the call is to be manually answered, the media playback is paused 1126. By pausing the media playback, the user of the electronic device is able to participate in the call without being disturbed by the media playback. In another embodiment, the media playback can continue with its volume substantially limited such that it does not materially interfere with the ability of the user to participate in the call. The incoming call is also answered 1128. Audio input/output for the call can then be processed 1130.


As the call continues, audio pertaining to the call will be incoming and outgoing so as to carry out the conversation or communications associated with the call. Decision 1132 can determine during the call whether a voice command has been received. Here, during the call, the electronic device can render certain commands as being available to be voice-activated by a user. When decision 1132 determines that a voice command has been received, decision 1134 determines whether the voice command is an end call command. The end call command is one type of voice command that can be received. When decision 1134 determines that the voice command that has been received is not an end call command, then optionally other commands can be processed 1136. Alternatively, when decision 1132 determines that a voice command has not been received, as well as following block 1136, decision 1138 determines whether a call is to end. Here, the call can be ended by a manual operation with respect to the electronic device. In other words, decision 1138 is a manual operation that is distinct from a voice command. When decision 1138 determines that the call is not to end, process 1100 returns to repeat block 1130 and subsequent blocks. Alternatively, when decision 1138 determines that the call is to end manually, or when decision 1134 determines that the received voice command is an end call command, then the call is closed 1140. Further, the microphone is deactivated 1142. In addition, playback of the media can be resumed 1144. Also, when decision 1116 determines that a time-out has occurred, the microphone can also be deactivated 1148. Following block 1148 or block 1144, the media output can be returned 1146 to its prior volume level. Following block 1146, process 1100 can end.
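Condensed into code, the call-handling flow of FIGS. 11A and 11B resembles the sketch below. Voice recognition is reduced to pre-labeled event strings, and the event names are assumptions of the sketch:

```python
from typing import List


def handle_incoming_call(media_active: bool, events: List[str]) -> None:
    """Condensed sketch of process 1100 for one incoming call."""
    if media_active:
        print("limit media volume; ringtone mixed with media")  # 1108-1110
    else:
        print("output ringtone")                                # 1106
    print("activate microphone")                                # 1112
    answered = False
    for event in events:
        if event == "who is it" and not answered:               # 1118-1120
            print("present caller information")
        elif event == "answer" and not answered:                # 1122-1128
            if media_active:
                print("pause media playback")                   # 1126
            print("call answered; processing call audio")       # 1128-1130
            answered = True
        elif event == "end call" and answered:                  # 1134
            print("close call; deactivate microphone")          # 1140-1142
            if media_active:
                print("resume media playback; restore volume")  # 1144-1146
            return
    if not answered:  # time-out without an answer
        print("deactivate microphone; restore volume")          # 1148, 1146


handle_incoming_call(True, ["who is it", "answer", "end call"])
```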


The media playback and voice call handling discussed above in FIGS. 10, 11A and 11B are examples of control of a media system by way of voice commands provided at a portable electronic device (or an associated accessory device). More generally, according to one embodiment of the invention, a media system can be controlled in any of a number of ways by voice commands provided at a portable electronic device (or an associated accessory device). For example, a user of the portable electronic device can provide voice commands that cause the media system to perform a channel change, a mute operation, a media source change, a track change, a playback operation, a stop of playback, a volume adjustment, etc.


The electronic device as described herein can be a wireless communication device (e.g., portable telephone) capable of communication over a network. The wireless communication device can also include other applications such as a media playback application or a media recording application.


The electronic device as described herein can be a media device (e.g., media player) capable of playing (including displaying) media items. The media items can pertain to audio items (e.g., audio files or songs), videos (e.g., movies) or images (e.g., photos). The media device can also include other applications such as a wireless communication application.


In one embodiment, the electronic device is a portable electronic device. In one implementation, the portable electronic device is a handheld electronic device. Often, portable electronic devices are handheld electronic devices that can be easily held by and within a single hand of a user. The portable electronic device can also pertain to a wearable electronic device or a miniature electronic device. However, the invention can apply to electronic devices whether portable or not.



FIG. 12 is a block diagram of media player 1200 according to one embodiment of the invention. Media player 1200 can include the circuitry of device 100 in FIG. 1, device 200 in FIG. 2, portable electronic device 904 in FIGS. 9A-9D, or device 952 in FIG. 9E, can perform the operations described with reference to FIGS. 4-6, 10, or 11A and 11B, and/or can present a display screen as in FIGS. 7A-7C or FIGS. 8A-8D.


Media player 1200 can include processor 1202 that pertains to a microprocessor or controller for controlling the overall operation of media player 1200. Media player 1200 can store media data pertaining to media items in file system 1204 and cache 1206. File system 1204 is, typically, a storage disk or a plurality of disks. File system 1204 typically provides high capacity storage capability for media player 1200. File system 1204 can store not only media data but also non-media data. However, since the access time to file system 1204 is relatively slow, media player 1200 can also include cache 1206. Cache 1206 is, for example, Random-Access Memory (RAM) provided by semiconductor memory. The relative access time to cache 1206 can be substantially shorter than for file system 1204. However, cache 1206 does not have the large storage capacity of file system 1204. Further, file system 1204, when active, consumes more power than does cache 1206. The power consumption is often a concern when media player 1200 is a portable media player that is powered by battery 1207. Media player 1200 can also include RAM 1220 and Read-Only Memory (ROM) 1222. ROM 1222 can store programs, utilities or processes to be executed in a non-volatile manner. RAM 1220 provides volatile data storage, such as for cache 1206.
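The file system/cache split described above is, in effect, a read-through cache: the small, fast RAM cache is consulted first, and the large, slower (and more power-hungry) file system is accessed only on a miss. A minimal sketch, with invented names and a simple oldest-entry eviction policy:

```python
from collections import OrderedDict


class MediaStore:
    """Read-through cache in front of a large, slow backing store."""

    def __init__(self, file_system: dict, cache_size: int = 2) -> None:
        self.file_system = file_system           # large, slow, power-hungry
        self.cache: OrderedDict = OrderedDict()  # small, fast RAM
        self.cache_size = cache_size

    def read(self, name: str) -> bytes:
        if name in self.cache:         # fast path: no file system access
            return self.cache[name]
        data = self.file_system[name]  # slow path: access the file system
        if len(self.cache) >= self.cache_size:
            self.cache.popitem(last=False)  # evict the oldest entry
        self.cache[name] = data
        return data


store = MediaStore({"song.mp3": b"...", "video.mp4": b"..."})
store.read("song.mp3")  # miss: read from the file system, then cached
store.read("song.mp3")  # hit: served from the cache
```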


Media player 1200 can also include user input device 1208 that allows a user of media player 1200 to interact with media player 1200. For example, user input device 1208 can take a variety of forms, such as a button, keypad, dial, etc. (physical or soft implementations) each of which can be programmed to individually or in combination perform any of a suite of functions. In one implementation, user input device 1208 can be provided by a dial that physically rotates. In another implementation, user input device 1208 can be implemented as a touchpad (i.e., a touch-sensitive surface). In still another implementation, user input device 1208 can be implemented as a combination of one or more physical buttons as well as a touchpad. Still further, media player 1200 can include display 1210 (screen display) that can be controlled by processor 1202 to display information to the user. Data bus 1211 can facilitate data transfer between at least file system 1204, cache 1206, processor 1202, and CODEC 1212.


Media player 1200 can also provide status monitoring of battery 1207. In this regard, media player 1200 can include battery monitor 1213. Battery monitor 1213 can be operatively coupled to battery 1207 to monitor conditions. Battery monitor 1213 can, for example, communicate battery status (or conditions) with processor 1202.


In one embodiment, media player 1200 can serve to store a plurality of media items (e.g., songs, videos, TV shows, podcasts, etc.) in file system 1204. When a user desires to have media player 1200 play a particular media item, a list of available media items can be displayed on display 1210. Then, using user input device 1208 (or voice commands), a user can select one of the available media items. Processor 1202, upon receiving a selection of a particular media item, can supply the media data (e.g., audio file) for the particular media item to coder/decoder (CODEC) 1212. CODEC 1212 can then produce analog output signals for speaker 1214. Speaker 1214 can be a speaker internal to media player 1200 or external to media player 1200. For example, headphones or earphones that connect to media player 1200 could be considered an external speaker. Speaker 1214 can not only be used to output audio sounds pertaining to the media item being played, but also to output audio notifications pertaining to battery status. Notifications of battery status can also be output to display 1210.


In one embodiment, media player 1200 is a portable computing device that can support processing media, such as audio and/or video. For example, media player 1200 can be a music player (e.g., MP3 player), a video player, a game player, and the like. These devices are generally battery operated and highly portable so as to allow a user to listen to music, play games or video, record video or take pictures wherever the user travels.


In one implementation, media player 1200 is a handheld device sized for placement into a pocket or hand of the user. By being handheld, media player 1200 is relatively small and easily handled and utilized by its user. By being pocket sized, the user does not have to directly carry the device and therefore the device can be taken almost anywhere the user travels (e.g., the user is not limited by carrying a large, bulky and often heavy device, as in a portable computer). Furthermore, in one implementation, the device may be operated by the user's hands; no reference surface such as a desktop is needed.


Media player 1200 can also include network/bus interface 1216 that couples to data link 1218. Data link 1218 can allow media player 1200 to couple to a host computer. Data link 1218 can be provided over a wired connection or a wireless connection. In the case of a wireless connection, network/bus interface 1216 can include a wireless transceiver.


To support wireless communications, media player 1200 can also include wireless communications module 1224. Wireless communication module 1224 can be considered to provide voice communications (e.g., calls via a cellular network), whereas network/bus interface 1216 can be considered to provide data communications. A user of media player 1200 can thus make and receive voice calls using the wireless communications module in media player 1200. Wireless communications module 1224 can also couple to data bus 1211 to couple to processor 1202 and other resources. Media player 1200 can also include microphone 1226 for pick up of the user's voice.


The invention is suitable for use with battery-powered electronic devices. However, the invention is particularly well suited for handheld electronic devices, such as a handheld media device. One example of a handheld media device is a portable media player (e.g., music player or MP3 player). Another example of a handheld media device is a mobile telephone (e.g., cell phone) or Personal Digital Assistant (PDA).


Portable media devices can store and play audio sounds pertaining to media assets (media items), such as music, audiobooks, meeting recordings, and other speech or voice recordings. Portable media devices, such as media players, are small and highly portable and have limited processing resources. Often, portable media devices are handheld media devices which can be easily held by and within a single hand of a user.


One example of a media player is the iPod® media player, which is available from Apple Inc. of Cupertino, Calif. Often, a media player acquires its media assets from a host computer that serves to enable a user to manage media assets. As an example, the host computer can execute a media management application to utilize and manage media assets. One example of a media management application is iTunes®, produced by Apple Inc.
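
A speculative sketch of that host relationship follows; all names are invented for illustration, and this is not how iTunes® or the iPod® media player actually synchronize.

```swift
// Illustrative only: the media management application on the host holds the
// managed assets, and the player acquires any asset it does not yet have.
struct MediaAsset: Equatable {
    let identifier: String
}

struct HostLibrary {              // managed by the media management application
    var managedAssets: [MediaAsset]
}

struct PlayerLibrary {            // resident on the media player
    var assets: [MediaAsset] = []

    // Acquire any managed asset the player does not yet hold.
    mutating func sync(from host: HostLibrary) {
        for asset in host.managedAssets where !assets.contains(asset) {
            assets.append(asset)
        }
    }
}
```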


“Media items,” as used herein, are digital data that pertain to at least one of audio, video, or images. Media items are also referred to as digital media assets. The digital data for media items can be referred to as media data or media content. Some examples of specific forms of media items include, but are not limited to, songs, albums, audiobooks, playlists, movies, music videos, photos, computer games, podcasts, audio and/or video presentations, news reports, and sports updates. Video media items include movies, music videos, video presentations, and any other media items having a video characteristic.
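
Rendered as a data type, the definition might look like the following sketch; the MediaKind cases simply mirror the example forms listed above and are assumptions, not an exhaustive or authoritative encoding.

```swift
import Foundation

// Hypothetical encoding of the "media item" definition above.
enum MediaKind {
    case song, album, audiobook, playlist, movie, musicVideo
    case photo, computerGame, podcast, presentation, newsReport, sportsUpdate
}

struct DigitalMediaAsset {
    let kind: MediaKind
    let mediaData: Data        // the "media data" or "media content"

    // Video media items are those having a video characteristic.
    var isVideoMediaItem: Bool {
        switch kind {
        case .movie, .musicVideo: return true
        default: return false
        }
    }
}
```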


U.S. patent application Ser. No. 11/209,367, filed Aug. 22, 2005, and entitled “AUDIO STATUS INFORMATION FOR A PORTABLE ELECTRONIC DEVICE,” is hereby incorporated herein by reference.


U.S. patent application Ser. No. 11/565,890, filed Dec. 1, 2006, and entitled “POWER CONSUMPTION MANAGEMENT FOR FUNCTIONAL PRESERVATION IN A BATTERY-POWERED ELECTRONIC DEVICE,” is hereby incorporated herein by reference.


U.S. patent application Ser. No. 10/981,993, filed Nov. 4, 2004, and entitled “AUDIO USER INTERFACE FOR COMPUTING DEVICE,” is hereby incorporated herein by reference.


The various aspects, embodiments, implementations or features of the invention can be used separately or in any combination.


The invention is preferably implemented by software, hardware or a combination of hardware and software. The invention can also be embodied as computer readable code on a computer readable medium. The computer readable medium is any data storage device that can store data which can thereafter be read by a computer system. Examples of the computer readable medium include read-only memory, random-access memory, CD-ROMs, DVDs, magnetic tape, optical data storage devices, and carrier waves. The computer readable medium can also be distributed over network-coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.


The advantages of the invention are numerous. Different aspects, embodiments or implementations may, but need not, yield one or more of the following advantages. One advantage of the invention is that an electronic device can be user controlled through voice commands. Another advantage of the invention is that available voice commands can be context sensitive for robust and power efficient operation. Yet another advantage of the invention is that an electronic device can intelligently interact with a nearby media system to provide multiple functions (e.g., media playback and wireless voice communications).


The many features and advantages of the present invention are apparent from the written description. Further, since numerous modifications and changes will readily occur to those skilled in the art, the invention should not be limited to the exact construction and operation as illustrated and described. Hence, all suitable modifications and equivalents may be resorted to as falling within the scope of the invention.

Claims
  • 1. An electronic device operable in a plurality of operational states, the electronic device comprising a processor and memory storing one or more programs for execution by the processor, the one or more programs comprising instructions for: detecting, via a microphone of the electronic device, a voice input from a user; determining that at least a portion of the voice input matches characteristics of one or more predetermined commands, wherein the one or more predetermined commands include commands that are authorized for usage while the electronic device is in an operational state; determining that the electronic device is in the operational state; and in accordance with a determination that the at least a portion of the voice input matches characteristics of the one or more predetermined commands and a determination that the electronic device is in the operational state, executing the one or more predetermined commands.
  • 2. The electronic device of claim 1, wherein the one or more predetermined commands correspond to a macro including at least a series of commands to be performed.
  • 3. The electronic device of claim 1, wherein the operational state of the electronic device is one of: a state of a graphical user interface being displayed on the electronic device; a functional mode of the electronic device; and a low power mode.
  • 4. The electronic device of claim 3, wherein the operational state of the electronic device is a low power mode, and wherein detecting the voice input from the user includes monitoring for the voice input while in the low power mode.
  • 5. The electronic device of claim 4, wherein the operational state is dependent on a state of an application program being executed by the processor.
  • 6. The electronic device of claim 1, wherein the electronic device is a multi-function device supporting a plurality of functions, one of the functions being wireless voice communications and another of the functions being media playback or media recording.
  • 7. The electronic device of claim 1, wherein the electronic device is a handheld electronic device.
  • 8. The electronic device of claim 1, wherein executing the one or more predetermined commands includes changing the operational state of the electronic device to another operational state.
  • 9. The electronic device of claim 8, wherein the operational state of the electronic device is a low power mode, and wherein the other operational state of the electronic device is a normal power mode.
  • 10. A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device operable in a plurality of operational states, the one or more programs including instructions for: detecting, via a microphone of the electronic device, a voice input from a user; determining that at least a portion of the voice input matches characteristics of one or more predetermined commands, wherein the one or more predetermined commands include commands that are authorized for usage while the electronic device is in an operational state; determining that the electronic device is in the operational state; and in accordance with a determination that the at least a portion of the voice input matches characteristics of the one or more predetermined commands and a determination that the electronic device is in the operational state, executing the one or more predetermined commands.
  • 11. The non-transitory computer-readable storage medium of claim 10, wherein the one or more predetermined commands correspond to a macro including at least a series of commands to be performed.
  • 12. The non-transitory computer-readable storage medium of claim 10, wherein the operational state of the electronic device is one of: a state of a graphical user interface being displayed on the electronic device; a functional mode of the electronic device; and a low power mode.
  • 13. The non-transitory computer-readable storage medium of claim 12, wherein the operational state of the electronic device is a low power mode, and wherein detecting the voice input from the user includes monitoring for the voice input while in the low power mode.
  • 14. The non-transitory computer-readable storage medium of claim 13, wherein the operational state is dependent on a state of an application program being executed by the one or more processors.
  • 15. The non-transitory computer-readable storage medium of claim 10, wherein the electronic device is a multi-function device supporting a plurality of functions, one of the functions being wireless voice communications and another of the functions being media playback or media recording.
  • 16. The non-transitory computer-readable storage medium of claim 10, wherein the electronic device is a handheld electronic device.
  • 17. The non-transitory computer-readable storage medium of claim 10, wherein executing the one or more predetermined commands includes changing the operational state of the electronic device to another operational state.
  • 18. The non-transitory computer-readable storage medium of claim 17, wherein the operational state of the electronic device is a low power mode, and wherein the other operational state of the electronic device is a normal power mode.
  • 19. A method comprising: detecting, via a microphone of an electronic device operable in a plurality of operational states, a voice input from a user; determining that at least a portion of the voice input matches characteristics of one or more predetermined commands, wherein the one or more predetermined commands include commands that are authorized for usage while the electronic device is in an operational state; determining that the electronic device is in the operational state; and in accordance with a determination that the at least a portion of the voice input matches characteristics of the one or more predetermined commands and a determination that the electronic device is in the operational state, executing the one or more predetermined commands.
  • 20. The method of claim 19, wherein the one or more predetermined commands correspond to a macro including at least a series of commands to be performed.
  • 21. The method of claim 19, wherein the operational state of the electronic device is one of: a state of a graphical user interface being displayed on the electronic device; a functional mode of the electronic device; and a low power mode.
  • 22. The method of claim 21, wherein the operational state of the electronic device is a low power mode, and wherein detecting the voice input from the user includes monitoring for the voice input while in the low power mode.
  • 23. The method of claim 22, wherein the operational state is dependent on a state of an application program being executed by a processor of the electronic device.
  • 24. The method of claim 19, wherein the electronic device is a multi-function device supporting a plurality of functions, one of the functions being wireless voice communications and another of the functions being media playback or media recording.
  • 25. The method of claim 19, wherein the electronic device is a handheld electronic device.
  • 26. The method of claim 19, wherein executing the one or more predetermined commands includes changing the operational state of the electronic device to another operational state.
  • 27. The method of claim 26, wherein the operational state of the electronic device is a low power mode, and wherein the other operational state of the electronic device is a normal power mode.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 14/605,793, entitled “METHOD AND SYSTEM FOR OPERATING A MULTI-FUNCTION PORTABLE ELECTRONIC DEVICE USING VOICE-ACTIVATION,” filed Jan. 26, 2015, which is a continuation of U.S. patent application Ser. No. 11/696,057, entitled “METHOD AND SYSTEM FOR OPERATING A MULTI-FUNCTION PORTABLE ELECTRONIC DEVICE USING VOICE-ACTIVATION,” filed Apr. 3, 2007, the contents of which are hereby incorporated by reference in their entireties.

Related Publications (1)
Number Date Country
20200305084 A1 Sep 2020 US
Continuations (2)
Number Date Country
Parent 14605793 Jan 2015 US
Child 16778826 US
Parent 11696057 Apr 2007 US
Child 14605793 US