Techniques for presenting sound effects on a portable media player

Information

  • Patent Grant
  • Patent Number
    10,750,284
  • Date Filed
    Wednesday, February 15, 2017
  • Date Issued
    Tuesday, August 18, 2020
Abstract
Improved techniques for presenting sound effects at a portable media device are disclosed. The sound effects can be output as audio sounds to an internal speaker, an external speaker, or both. In addition, the audio sounds for the sound effects can be output together with other audio sounds pertaining to media assets (e.g., audio tracks being played). In one embodiment, the sound effects can serve to provide auditory feedback to a user of the portable media device. A user interface can facilitate a user's selection of sound effect usages, types or characteristics.
Description
FIELD OF THE INVENTION

The present invention relates to audio sound effects and, more particularly, to providing audio sound effects on a portable media device.


DESCRIPTION OF THE RELATED ART

Conventionally, portable media players have user input devices (buttons, dials, etc.) and a display screen for user output. Sometimes the display screen updates as user inputs are provided via the user input devices, thereby providing visual feedback to users regarding their user input. However, the display screen does not always provide visual feedback and the user is not always able to view the display screen to receive the visual feedback. Still further, some portable media players do not include a display screen. Portable media players can also provide auditory feedback as user inputs are provided via the user input devices. For example, to provide auditory feedback for a rotation user input, the iPod® media player, which is available from Apple Computer, Inc. of Cupertino, Calif., outputs a “click” sound using a piezoelectric device provided within the media player.


Unfortunately, however, users often interact with media players while wearing earphones or headphones. In such case, the users will likely not be able to hear any auditory feedback, such as “click” sounds from a piezoelectric device. Moreover, the user might also be listening to audio sounds via the earphones or headphones when the user interaction occurs. Consequently, any user's interaction with the media player while wearing earphones or headphones will be without the advantage of auditory feedback. The lack of auditory feedback degrades the user experience and renders the media player less user friendly.


Thus, there is a need for improved techniques to facilitate auditory feedback on portable media players.


SUMMARY OF THE INVENTION

The invention pertains to techniques for presenting sound effects at a portable media device. The sound effects can be output as audio sounds to an internal speaker, an external speaker, or both. In addition, the audio sounds for the sound effects can be output together with other audio sounds pertaining to media assets (e.g., audio tracks being played). In one embodiment, the sound effects can serve to provide auditory feedback to a user of the portable media device. A user interface can facilitate a user's selection of sound effect usages, types or characteristics.


The invention can be implemented in numerous ways, including as a method, system, device, apparatus (including graphical user interface), or computer readable medium. Several embodiments of the invention are discussed below.


As a method for providing auditory feedback to a user of a portable media device, one embodiment of the method includes at least the acts of: outputting first audio data pertaining to a digital media asset to an audio output device associated with the portable media device; detecting an event at the portable media device; and outputting second audio data after the event has been detected, the second audio data pertaining to a sound effect associated with the event that has been detected, the second audio data being output to the audio output device.


As a method for outputting a sound effect from an external speaker associated with a portable media device, one embodiment of the method includes at least the acts of: determining whether a sound effect is to be output to the external speaker; identifying sound effect data for the sound effect to be output; retrieving the identified sound effect data; mixing the identified sound effect data with audio data being output, if any, to produce mixed audio data; and outputting the mixed audio data to the external speaker.


As a method for providing auditory feedback to a user of a portable media device, one embodiment of the method includes at least the acts of: detecting an event at the portable media device; determining whether device feedback is enabled; producing an auditory feedback at the portable media device in response to the event when it is determined that the device feedback is enabled; determining whether earphone feedback is enabled; and producing an auditory feedback at one or more earphones coupled to the portable media device in response to the event when it is determined that the earphone feedback is enabled.


As a portable media device, one embodiment of the invention includes at least: an audio output device; a first memory device for storing a plurality of sound effects; computer program code for determining when to output at least one of the sound effects; and a processor for determining when to output at least one of the sound effects and for processing the at least one of the sound effects to produce output sound effect data for the audio output device.


As a graphical user interface for a media device adapted to provide auditory feedback, one embodiment of the invention includes at least: a list of auditory feedback options; and a visual indicator that indicates a selected one of the auditory feedback options. The media device thereafter provides auditory feedback in accordance with the selected one of the auditory feedback options.


As a computer readable medium including at least computer program code for outputting a sound effect from an external speaker associated with a portable media device, one embodiment of the invention includes at least: computer program code for determining whether a sound effect is to be output to the external speaker; computer program code for identifying sound effect data for the sound effect to be output; computer program code for retrieving the identified sound effect data; computer program code for mixing the identified sound effect data with audio data being output, if any, to produce mixed audio data; and computer program code for outputting the mixed audio data to the external speaker.


Other aspects and advantages of the invention will become apparent from the following detailed description taken in conjunction with the accompanying drawings which illustrate, by way of example, the principles of the invention.





BRIEF DESCRIPTION OF THE DRAWINGS

The invention will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements, and in which:



FIG. 1 is a block diagram of an audio system according to one embodiment of the invention.



FIG. 2 is a flow diagram of an audio output process according to one embodiment of the invention.



FIG. 3 is a block diagram of an audio processing system according to one embodiment of the invention.



FIG. 4 is a flow diagram of an audio mixing process according to one embodiment of the invention.



FIG. 5 is an audio processing system according to one embodiment of the invention.



FIG. 6 is a block diagram of a multi-channel audio mixing system according to one embodiment of the invention.



FIG. 7 is a block diagram of a media player according to one embodiment of the invention.



FIG. 8 illustrates a media player having a particular user input device according to one embodiment.



FIG. 9 is a flow diagram of a sound effect event process according to one embodiment of the invention.



FIG. 10 illustrates a graphical user interface according to one embodiment of the invention.





DETAILED DESCRIPTION OF THE INVENTION

The invention pertains to techniques for presenting sound effects at a portable media device. The sound effects can be output as audio sounds to an internal speaker, an external speaker, or both. In addition, the audio sounds for the sound effects can be output together with other audio sounds pertaining to media assets (e.g., audio tracks being played). In one embodiment, the sound effects can serve to provide auditory feedback to a user of the portable media device. A user interface can facilitate a user's selection of sound effect usages, types or characteristics.


The invention is well suited for audio sounds pertaining to media assets (media items), such as music, audiobooks, meeting recordings, and other speech or voice recordings.


The improved techniques are also resource efficient. Given this resource efficiency, the improved techniques are well suited for use with portable electronic devices having audio playback capabilities, such as portable media devices. Portable media devices, such as media players, are small and highly portable and have limited processing resources. Often, portable media devices are hand-held media devices, such as hand-held audio players, which can be easily held by and within a single hand of a user.


Embodiments of the invention are discussed below with reference to FIGS. 1-10. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes as the invention extends beyond these limited embodiments.



FIG. 1 is a block diagram of an audio system 100 according to one embodiment of the invention. FIG. 1 depicts a data flow for the audio system 100 under the control of an application 102. Typically, the audio system 100 is provided by a computing device. Often, the computing device is a portable computing device especially designed for audio usage. One example of such a portable computing device is a portable media player (e.g., music player or MP3 player). Other examples are mobile telephones (e.g., cell phones) and Personal Digital Assistants (PDAs).


The application 102 is, for example, a software application that operates on the computing device. The application 102 has access to audio data 104 and sound effect data 106. The application 102 can utilize the audio data 104 when the application 102 desires to output the audio data 104. The sound effect data 106 can represent audio sounds pertaining to sound effects that can be utilized by the computing device. For example, the sound effects may correspond to sounds (actual or synthetic) for mouse clicks, button presses, and the like. The sound effect data 106 is audio data and can be stored in a wide variety of formats. For example, the sound effect data 106 can simply be Pulse Code Modulation (PCM) data or can be encoded data, such as MP3 or MPEG-4 format. PCM data is typically either raw data (e.g., a block of samples) or formatted (e.g., WAV or AIFF file formats).
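As a concrete illustration, the stored form of a sound effect could be described by a small in-memory descriptor such as the following C sketch. The type and field names are hypothetical, not taken from the patent; they merely capture the format, rate, depth, and channel attributes discussed above.

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical storage formats for sound effect data 106. */
typedef enum {
    FX_FORMAT_PCM_RAW,  /* raw block of PCM samples            */
    FX_FORMAT_WAV,      /* PCM wrapped in a WAV container      */
    FX_FORMAT_AIFF,     /* PCM wrapped in an AIFF container    */
    FX_FORMAT_MP3,      /* encoded; must be decoded before use */
    FX_FORMAT_MPEG4     /* encoded; must be decoded before use */
} fx_format_t;

/* Hypothetical descriptor for one stored sound effect. */
typedef struct {
    fx_format_t format;
    uint32_t    sample_rate;  /* e.g., 44100 Hz          */
    uint8_t     bit_depth;    /* e.g., 8 or 16 bits      */
    uint8_t     channels;     /* 1 = mono, 2 = stereo    */
    const void *data;         /* stored audio bytes      */
    size_t      size;         /* length of data in bytes */
} sound_effect_t;

int main(void)
{
    static const uint8_t click[4] = { 128, 200, 60, 128 };
    sound_effect_t fx = { FX_FORMAT_PCM_RAW, 44100, 8, 1, click, sizeof click };
    (void)fx;  /* a real player would hand this to its mixer channel */
    return 0;
}
```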


The application 102 controls when a sound effect is to be output by the audio system 100. The application 102 also understands that it may or may not already be outputting audio data 104 at the time at which a sound effect is to be output. In the embodiment shown in FIG. 1, the application 102 can control an audio device 108. The audio device 108 is a hardware component that is capable of producing a sound, such as a sound effect. For example, the audio device 108 can pertain to an audio output device (e.g., speaker or piezoelectric device) that can be briefly activated to provide a sound effect. The sound effect can serve to inform the user of the computing device of a condition, status or event.


In addition, the application 102 produces an audio channel 110 and a mixer channel 112. The audio channel 110 is a virtual channel over which the application 102 can send audio data 104 such that it can be directed to an audio output device. For example, the audio output device can be a speaker that outputs the corresponding audio sounds. In addition, the application 102 can utilize a mixer channel 112 to output sound effects to the audio output device. The mixer channel 112 and the audio channel 110 can be mixed together downstream (see FIG. 3). Hence, the audio system 100 can not only output audio data 104 over the audio channel 110 but can also output sound effects over the mixer channel 112. As discussed in greater detail below, the audio data on the audio channel 110 can be mixed with any sound effect data on the mixer channel 112.



FIG. 2 is a flow diagram of an audio output process 200 according to one embodiment of the invention. The audio output process 200 is performed by an audio system. For example, the audio output process 200 can be performed by the application 102 of the audio system 100 illustrated in FIG. 1.


The audio output process 200 begins with a decision 202 that determines whether an audio play request has been issued. For example, an audio play request can be issued as a result of a system action or a user action with respect to the audio system. When the decision 202 determines that an audio play request has been issued, audio data is output 204 to an audio channel. By outputting the audio data to the audio channel, the audio data is directed to an audio output device, namely, a speaker, wherein audible sound is output.


Following the operation 204, or following the decision 202 when an audio play request has not been issued, a decision 206 determines whether a sound effect request has been issued. When the decision 206 determines that a sound effect request has been issued, then sound effect data is output 208 to a mixer channel. The mixer channel carries other audio data, such as audio data pertaining to sound effects (sound effect data). The mixer channel allows the sound effect data to mix with the audio data on the audio channel. After the sound effect data has been output 208 to the mixer channel, or directly following the decision 206 when a sound effect request has not been issued, the audio output process 200 returns to repeat the decision 202 and subsequent operations so that subsequent requests can be similarly processed.


It should be understood that audio data is often output for a longer duration than sound effect data, which tends to be brief. Hence, during the output of the audio data to the audio channel, sound effect data for one or more sound effects can be output to the mixer channel and thus combined with the audio data.
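The control flow of the audio output process 200 can be sketched in C as a simple polling loop. This is a minimal sketch, assuming the player exposes request checks and channel writes; the request sources are stubbed here, and the loop is bounded only so the demo terminates.

```c
#include <stdbool.h>
#include <stdio.h>

static int tick;

/* Stubbed request sources; a real player would poll its event queue. */
static bool audio_play_requested(void)   { return tick == 0; }
static bool sound_effect_requested(void) { return tick == 2; }

int main(void)
{
    for (tick = 0; tick < 5; tick++) {               /* bounded for demo */
        if (audio_play_requested())                  /* decision 202 */
            puts("audio data -> audio channel");     /* operation 204 */
        if (sound_effect_requested())                /* decision 206 */
            puts("sound effect data -> mixer channel"); /* operation 208 */
    }
    return 0;
}
```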



FIG. 3 is a block diagram of an audio processing system 300 according to one embodiment of the invention. The audio processing system 300 includes an audio channel 302 and a mixer channel 304. The audio channel 302 typically includes a decoder and a buffer. The mixer channel 304 typically includes resolution and/or sample rate converters.


The audio channel 302 receives audio data 306 that is to be output by the audio processing system 300. After the audio data 306 passes through the audio channel 302, it is provided to a mixer 308. The mixer channel 304 receives sound effect data 310. After the sound effect data 310 has passed through the mixer channel 304, it is provided to the mixer 308. The mixer 308 serves to combine the audio data from the audio channel 302 with the sound effect data 310 from the mixer channel 304. The combined data is then supplied to a Digital-to-Analog Converter (DAC) 312. The DAC 312 converts the combined data to an analog audio output. The analog audio output can be supplied to an audio output device, such as a speaker.



FIG. 4 is a flow diagram of an audio mixing process 400 according to one embodiment of the invention. The audio mixing process 400 is, for example, performed by the audio processing system 300 illustrated in FIG. 3.


The audio mixing process 400 begins with a decision 402 that determines whether a sound effect is to be output. When the decision 402 determines that a sound effect is not to be output, then the audio mixing process 400 awaits the need to output a sound effect. For example, the decision 206 of the audio output process 200 illustrated in FIG. 2 indicates that an audio system can make the determination of whether a sound effect is to be output. Accordingly, the audio mixing process 400 is invoked when a sound effect is to be output.


Once the decision 402 determines that a sound effect is to be output, a desired sound effect to be output is determined 404. Here, in one embodiment, the audio system can support a plurality of different sound effects. In such an embodiment, the audio system needs to determine which of the plurality of sound effects is the desired sound effect. The sound effect data for the desired sound effect is then retrieved 406.


A decision 408 then determines whether audio data is also being output. When the decision 408 determines that audio data is also being output, audio characteristics for the audio data being output are obtained 410. In one implementation, the audio characteristics pertain to metadata corresponding to the audio data being output. The sound effect data is then modified 412 based on the audio characteristics. In one embodiment, the audio characteristics can pertain to one or more of: audio resolution (e.g., bit depth), sample rate, and stereo/mono. For example, the audio resolution for the sound effect data can be modified 412 to match the audio resolution (e.g., bit depth) of the audio data. As another example, the sample rate for the sound effect can be modified 412 based on the sample rate of the audio data. In any case, after the sound effect data has been modified 412, the modified sound effect data is then mixed 414 with the audio data. Thereafter, the mixed audio data is output 416. As an example, the mixed audio data can be output 416 to an audio output device (e.g., speaker) associated with the audio system.


On the other hand, when the decision 408 determines that audio data is not being output, sound effect data is output 418. Here, since there is no audio data being output, the sound effect data can be simply output 418. If desired, the sound effect data can be modified before being output 418, such as to change its audio resolution or sample rate. Here, the output 418 of the sound effect data can also be provided to the audio output device. Following the operations 416 and 418, the audio mixing process 400 is complete and ends.
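The two branches of the audio mixing process 400 can be illustrated with a short C sketch. It assumes 16-bit PCM buffers of matching format; match_format() is a hypothetical placeholder for the resolution, sample rate, and channel conversions that FIG. 5 elaborates.

```c
#include <stdint.h>
#include <stdio.h>

#define N 4

static int16_t effect[N] = { 1000, 2000, 3000, 4000 }; /* retrieved (406)  */
static int16_t audio[N]  = { 500, 500, 500, 500 };     /* audio being output */
static int audio_playing = 1;                          /* decision 408     */

/* Placeholder for modification 412; see the FIG. 5 converters. */
static void match_format(int16_t *fx, int n) { (void)fx; (void)n; }

int main(void)
{
    int16_t out[N];

    if (audio_playing) {
        match_format(effect, N);             /* modify 412 */
        for (int i = 0; i < N; i++) {        /* mix 414 */
            int32_t s = (int32_t)audio[i] + effect[i];
            if (s >  32767) s =  32767;      /* saturate to 16-bit range */
            if (s < -32768) s = -32768;
            out[i] = (int16_t)s;
        }
    } else {
        for (int i = 0; i < N; i++)          /* output effect alone (418) */
            out[i] = effect[i];
    }
    for (int i = 0; i < N; i++)              /* output 416/418, printed here */
        printf("%d ", out[i]);
    putchar('\n');
    return 0;
}
```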



FIG. 5 is an audio processing system 500 according to one embodiment of the invention. The audio processing system 500 includes an audio channel 502. The audio channel 502 includes a decoder 504 and a buffer 506. The decoder 504 receives incoming audio data. The decoder 504 decodes the audio data (which was previously encoded). The decoded audio data is then temporarily stored in the buffer 506. As needed for transmission, the decoded audio data is supplied from the buffer 506 to a mixer 508.


The audio processing system 500 also includes a mixer channel 510. The mixer channel 510 receives sound effect data that is to be output. Since the audio processing system 500 can process audio data of various bit depths, sample rates, and other criteria, the mixer channel 510 can serve to modify the sound effect data. One benefit of providing the mixer channel 510 with conversion or adaptation capabilities is the ability to modify the audio characteristics of the sound effect data. By doing so, the sound effect data does not have to be stored by the audio system in a large number of different audio formats. Indeed, for efficient use of storage resources, only a single file for each sound effect need be stored. As needed, sound effect data can have its audio characteristics altered so as to closely match those of the audio data also being output by the audio processing system 500. In this regard, the mixer channel 510 can include a bit depth converter 512, a channel count adapter 514, and a sample rate converter 516. The bit depth converter 512 can convert the bit depth (i.e., resolution) of the sound effect data. As one example, if the sound effect data has a bit depth of eight (8) bits, the bit depth converter 512 could change the bit depth to sixteen (16) bits. The channel count adapter 514 can modify the sound effect data to provide mono or stereo audio components. The sample rate converter 516 converts the sample rate for the sound effect data. To assist the mixer channel 510 in converting or adapting the audio characteristics, the audio characteristics from the audio data provided to the audio channel 502 can be provided to the mixer channel 510, so as to inform the mixer channel 510 of the audio characteristics of the audio data in the audio channel 502.
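Each of the three converters can be illustrated with a textbook conversion. The following C sketches are illustrative only; the patent does not specify these algorithms, and the linear-interpolation resampler in particular is the simplest choice, not necessarily what a production player would use.

```c
#include <stdint.h>
#include <stddef.h>
#include <stdio.h>

/* Bit depth converter 512: widen unsigned 8-bit PCM to signed 16-bit. */
static void widen_8_to_16(const uint8_t *in, int16_t *out, size_t n)
{
    for (size_t i = 0; i < n; i++)
        out[i] = (int16_t)(((int)in[i] - 128) * 256); /* center, then scale */
}

/* Channel count adapter 514: duplicate mono into interleaved stereo. */
static void mono_to_stereo(const int16_t *in, int16_t *out, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        out[2 * i]     = in[i];  /* left  */
        out[2 * i + 1] = in[i];  /* right */
    }
}

/* Sample rate converter 516: naive linear interpolation.
   out must hold n_in * out_rate / in_rate samples. */
static void resample_linear(const int16_t *in, size_t n_in, uint32_t in_rate,
                            int16_t *out, uint32_t out_rate)
{
    size_t n_out = (size_t)((uint64_t)n_in * out_rate / in_rate);
    for (size_t i = 0; i < n_out; i++) {
        double pos  = (double)i * in_rate / out_rate;
        size_t j    = (size_t)pos;
        double frac = pos - (double)j;
        int16_t a = in[j];
        int16_t b = (j + 1 < n_in) ? in[j + 1] : in[j];
        out[i] = (int16_t)(a + frac * (b - a));
    }
}

int main(void)
{
    uint8_t fx8[4] = { 128, 255, 0, 200 };
    int16_t fx16[4], stereo[8], half[2];

    widen_8_to_16(fx8, fx16, 4);
    mono_to_stereo(fx16, stereo, 4);
    resample_linear(fx16, 4, 44100, half, 22050); /* halve the rate */
    printf("%d %d %d %d\n", fx16[0], fx16[1], stereo[2], half[1]);
    return 0;
}
```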


The modified sound effect data output by the mixer channel 510 is supplied to the mixer 508. The mixer 508 adds or sums the decoded audio data from the audio channel 502 with the modified sound effect data from the mixer channel 510. The result of the mixer 508 is mixed audio data that is supplied to a buffer 518. The mixed audio data is digital data stored in the buffer 518. The audio processing system 500 also includes a Digital-to-Analog Converter (DAC) 520. The DAC 520 receives the mixed audio data from the buffer 518, which is digital data, and converts it into an analog audio output. The analog audio output can be supplied to an audio output device, such as a speaker.


Although the audio processing system 500 illustrated in FIG. 5 depicts a single audio channel and a single mixer channel, it should be understood that the audio processing system 500 can include more than one mixer channel. The advantage of having more than one mixer channel is that multiple sound effects can be output concurrently, thereby providing a polyphonic audio effect.



FIG. 6 is a block diagram of a multi-channel audio mixing system 600 according to one embodiment of the invention. The multi-channel audio mixing system 600 includes an audio channel 602 that receives audio data and outputs decoded audio data. The decoded audio data being output by the audio channel 602 is supplied to a mixer 604. The multi-channel audio mixing system 600 also includes a plurality of mixer channels 606-1, 606-2, . . . , 606-N. Each of the mixer channels 606 is capable of receiving a different sound effect. For example, the mixer channel 606-1 can receive a sound effect A, the mixer channel 606-2 can receive a sound effect B, and the mixer channel 606-N can receive a sound effect N. If desired, the mixer channels 606 can each carry a sound effect at the same time, or at least with partial temporal overlap, so that the various sound effects can be output without substantial distortion amongst one another. Regardless of the number of sound effects being processed by the mixer channels 606, the sound effect data output from the mixer channels 606 are provided to the mixer 604. The mixer 604 combines the sound effect data from one or more of the mixer channels 606 with the decoded audio data from the audio channel 602. The result of the mixer 604 is a mixed audio output that can be supplied to an audio output device.
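The summation performed by the mixer 604 might look like the following C sketch: the decoded audio and each active effect channel are accumulated in 32 bits, then saturated back to the 16-bit output range so that concurrent effects clip gracefully rather than wrapping around. The channel counts and sample values are arbitrary demo data.

```c
#include <stdint.h>
#include <stdio.h>

#define FRAMES  4
#define EFFECTS 2   /* active mixer channels 606 */

int main(void)
{
    int16_t audio[FRAMES]       = { 8000, 8000, 8000, 8000 }; /* channel 602 */
    int16_t fx[EFFECTS][FRAMES] = { { 30000, 0,   0, 0 },
                                    { 30000, 100, 0, 0 } };
    int16_t mixed[FRAMES];

    for (int i = 0; i < FRAMES; i++) {
        int32_t acc = audio[i];
        for (int c = 0; c < EFFECTS; c++)
            acc += fx[c][i];
        if (acc >  32767) acc =  32767;  /* saturate instead of wrapping */
        if (acc < -32768) acc = -32768;
        mixed[i] = (int16_t)acc;         /* output of mixer 604 */
    }
    for (int i = 0; i < FRAMES; i++)
        printf("%d ", mixed[i]);
    putchar('\n');
    return 0;
}
```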



FIG. 7 is a block diagram of a media player 700 according to one embodiment of the invention. The media player 700 can implement the audio system 100 of FIG. 1 or the audio processing systems 300, 500 of FIGS. 3 and 5. The media player 700 includes a processor 702 that pertains to a microprocessor or controller for controlling the overall operation of the media player 700. The media player 700 stores media data pertaining to media items in a file system 704 and a cache 706. The file system 704 is, typically, a storage disk or a plurality of disks. The file system 704 typically provides high capacity storage capability for the media player 700. The file system 704 can store not only media data but also non-media data (e.g., when operated in a disk mode). However, since the access time to the file system 704 is relatively slow, the media player 700 can also include a cache 706. The cache 706 is, for example, Random-Access Memory (RAM) provided by semiconductor memory. The relative access time to the cache 706 is substantially shorter than for the file system 704. However, the cache 706 does not have the large storage capacity of the file system 704. Further, the file system 704, when active, consumes more power than does the cache 706. The power consumption is often a concern when the media player 700 is a portable media player that is powered by a battery (not shown). The media player 700 also includes a RAM 722 and a Read-Only Memory (ROM) 720. The ROM 720 can store programs, utilities or processes to be executed in a non-volatile manner. The RAM 722 provides volatile data storage, such as for the cache 706.


The media player 700 also includes a user input device 708 that allows a user of the media player 700 to interact with the media player 700. For example, the user input device 708 can take a variety of forms, such as a button, keypad, dial, etc. In one implementation, the user input device 708 can be provided by a dial that physically rotates. In another implementation, the user input device 708 can be implemented as a touchpad (i.e., a touch-sensitive surface). In still another implementation, the user input device 708 can be implemented as a combination of one or more physical buttons as well as a touchpad. Regardless of how implemented, as the user interacts with the user input device 708, a piezoelectric device 724 can provide auditory feedback to the user. For example, the piezoelectric device 724 can be controlled by the processor 702 to emit a sound in response to a user action (e.g., user selection or button press). Still further, the media player 700 includes a display 710 (screen display) that can be controlled by the processor 702 to display information to the user. A data bus 711 can facilitate data transfer between at least the file system 704, the cache 706, the processor 702, and the CODEC 712.


In one embodiment, the media player 700 serves to store a plurality of media items (e.g., songs) in the file system 704. When a user desires to have the media player play a particular media item, a list of available media items is displayed on the display 710. Then, using the user input device 708, a user can select one of the available media items. The processor 702, upon receiving a selection of a particular media item, supplies the media data (e.g., audio file) for the particular media item to a coder/decoder (CODEC) 712. The CODEC 712 then produces analog output signals for a speaker 714. The speaker 714 can be a speaker internal to the media player 700 or external to the media player 700. For example, headphones or earphones that connect to the media player 700 would be considered an external speaker. The speaker 714 can not only be used to output audio sounds pertaining to the media item being played, but also to output sound effects. The sound effects can be stored as audio data on the media player 700, such as in file system 704, the cache 706, the ROM 720 or the RAM 722. A sound effect can be output in response to a user input or a system request. When a particular sound effect is to be output to the speaker 714, the associated sound effect audio data can be retrieved by the processor 702 and supplied to the CODEC 712 which then supplies audio signals to the speaker 714. In the case where audio data for a media item is also being output, the processor 702 can process the audio data for the media item as well as the sound effect. In such case, the audio data for the sound effect can be mixed with the audio data for the media item. The mixed audio data can then be supplied to the CODEC 712 which supplies audio signals (pertaining to both the media item and the sound effect) to the speaker 714.


The media player 700 also includes a network/bus interface 716 that couples to a data link 718. The data link 718 allows the media player 700 to couple to a host computer. The data link 718 can be provided over a wired connection or a wireless connection. In the case of a wireless connection, the network/bus interface 716 can include a wireless transceiver.


In one embodiment, the media player 700 is a portable computing device dedicated to processing media such as audio. For example, the media player 700 can be a music player (e.g., MP3 player), a game player, and the like. These devices are generally battery operated and highly portable so as to allow a user to listen to music, play games or video, record video or take pictures wherever the user travels. In one implementation, the media player 700 is a handheld device that is sized for placement into a pocket or hand of the user. By being handheld, the media player 700 is relatively small and easily handled and utilized by its user. By being pocket sized, the user does not have to directly carry the device and therefore the device can be taken almost anywhere the user travels (e.g., the user is not limited by carrying a large, bulky and often heavy device, as in a portable computer). Furthermore, since the device may be operated by the user's hands, no reference surface such as a desktop is needed.


The user input device 708 can take a variety of forms, such as a button, keypad, dial, etc. (physical or soft implementations), each of which can be programmed, individually or in combination, to perform any of a suite of functions. FIG. 8 illustrates a media player 800 having a particular user input device 802 according to one embodiment. The media player 800 can also include a display 804. The user input device 802 includes a number of input devices 806, which can be either physical or soft devices. Such input devices 806 can take the form of a rotatable dial 806-1, such as in the form of a wheel, capable of rotation in either a clockwise or counterclockwise direction. A depressible input button 806-2 can be provided at the center of the dial 806-1 and arranged to receive a user input event such as a press event. Other input buttons 806 include input buttons 806-3 through 806-6, each available to receive a user supplied input action.


As noted above, the audio system can be utilized to mix sound effects with audio data being played such that the mixed audio can be output to an audio output device. The audio system can be system or user configurable as to sound effect processing. For example, a user may desire sound effects to be output to a particular audio output device of the audio system. As one example, the audio output device can be an in-device speaker. As another example, a user may desire sound effects to be output to a headphone (earphone) instead of or in addition to any in-device speaker.



FIG. 9 is a flow diagram of a sound effect event process 900 according to one embodiment of the invention. The sound effect event process 900 begins with a decision 902 that determines whether a sound effect event has been initiated. An audio system, or its user, can initiate a sound effect event. When the decision 902 determines that a sound effect event has not been issued, then the sound effect event process 900 awaits such an event. On the other hand, once the decision 902 determines that a sound effect event has been issued, a decision 904 determines whether a device effect is enabled. When the decision 904 determines that the device effect is enabled, then a device effect is activated 906. The device effect corresponds to an audio output device which can be activated to physically produce the sound effect. For example, the device effect can be produced by an in-device speaker. One type of speaker is a loudspeaker. Another type of speaker is a piezoelectric speaker (e.g., piezoelectric device 724).


A user or system can configure the audio system to provide a given sound effect (the device effect) via an audio output device. For example, if the audio output device is a piezoelectric speaker, the system can control the audio output device to provide the device effect that corresponds to the sound effect event that has been issued. For instance, if the sound effect event issued was a “mouse click” event, then the device effect could be a click sound that is physically generated by an electrical control signal supplied to the piezoelectric speaker.


On the other hand, when the decision 904 determines that the device effect is not enabled, or following the activation 906 if the device effect was enabled, a decision 908 determines whether an earphone effect is enabled. Here, the system or user can configure the audio system to provide a sound effect to the user via one or more earphones coupled to the audio system. When the decision 908 determines that the earphone effect is enabled, then an earphone effect is activated 910. By activation 910 of the earphone effect, the appropriate sound effect is output to the user by way of the one or more earphones. As a result, should the user be wearing earphones, the sound effect is able to be perceived in an audio manner by the user. Following the operation 910, or following the decision 908 when the earphone effect is not enabled, the sound effect event process 900 returns to repeat the decision 902 and subsequent operations so that additional sound effect events can be processed.
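Assuming the two feedback settings reduce to simple flags, the routing in FIG. 9 can be sketched in C as follows. The activate_* helpers are hypothetical stand-ins for driving the in-device (e.g., piezoelectric) speaker and for injecting the effect into the earphone mix.

```c
#include <stdbool.h>
#include <stdio.h>

static bool device_feedback_enabled   = true; /* in-device speaker   */
static bool earphone_feedback_enabled = true; /* coupled earphone(s) */

/* Hypothetical output paths. */
static void activate_device_effect(void)   { puts("piezo click"); }
static void activate_earphone_effect(void) { puts("effect mixed to earphones"); }

static void on_sound_effect_event(void)
{
    if (device_feedback_enabled)    /* decision 904, activation 906 */
        activate_device_effect();
    if (earphone_feedback_enabled)  /* decision 908, activation 910 */
        activate_earphone_effect();
}

int main(void)
{
    on_sound_effect_event();  /* e.g., the user turned the dial */
    return 0;
}
```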


In one embodiment, the audio system makes use of a graphical user interface to assist the user with configuring audible sound effects. For example, the user may desire to have little or no sound effects active. On the other hand, when sound effects are at least partially active, the user may desire that the sound effects be provided at an in-device speaker of the audio system. Alternatively, or in addition, the user may also desire sound effects to be provided in an audio manner via an earphone or headphone.



FIG. 10 illustrates a graphical user interface 1000 according to one embodiment of the invention. The graphical user interface 1000 allows a user to configure a portable computing device for auditory feedback. More particularly, the graphical user interface 1000 includes a header or title 1002 designating that the graphical user interface pertains to “Feedback”. The graphical user interface 1000 also displays a menu or list 1004 of user selectable items. In this example, the menu or list 1004 includes four user selectable items, namely, “Speaker”, “Headphone”, “Both” and “Off”. The “Speaker” selection causes the configuration to provide auditory feedback via a speaker (e.g., piezoelectric device 724). The “Headphone” selection causes the configuration to provide auditory feedback via earphone(s) or headphone(s) (e.g., speaker 714 (external)). The “Both” selection causes the configuration to provide auditory feedback via a speaker (e.g., piezoelectric device 724) and earphone(s) or headphone(s) (e.g., speaker 714 (external)). The “Off” selection causes the configuration to provide no auditory feedback. A selector 1006 indicates current selection of the “Headphone” item.
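One plausible way to apply the menu selection is to map the four items onto the two enable flags used in the FIG. 9 sketch above; the enum and function names below are illustrative, not from the patent.

```c
#include <stdbool.h>
#include <stdio.h>

typedef enum { FEEDBACK_SPEAKER, FEEDBACK_HEADPHONE,
               FEEDBACK_BOTH,    FEEDBACK_OFF } feedback_mode_t;

/* Map the selected list item 1004 onto the device/earphone flags. */
static void apply_feedback_mode(feedback_mode_t mode,
                                bool *device_on, bool *earphone_on)
{
    *device_on   = (mode == FEEDBACK_SPEAKER   || mode == FEEDBACK_BOTH);
    *earphone_on = (mode == FEEDBACK_HEADPHONE || mode == FEEDBACK_BOTH);
}

int main(void)
{
    bool device_on, earphone_on;
    apply_feedback_mode(FEEDBACK_HEADPHONE, &device_on, &earphone_on);
    printf("device=%d earphone=%d\n", device_on, earphone_on);
    return 0;
}
```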


One example of a media player is the iPod® media player, which is available from Apple Computer, Inc. of Cupertino, Calif. Often, a media player acquires its media assets from a host computer that serves to enable a user to manage media assets. As an example, the host computer can execute a media management application to utilize and manage media assets. One example of a media management application is iTunes®, produced by Apple Computer, Inc.


The various aspects, embodiments, implementations or features of the invention can be used separately or in any combination.


The invention is preferably implemented by software, hardware or a combination of hardware and software. The invention can also be embodied as computer readable code on a computer readable medium. The computer readable medium is any data storage device that can store data which can thereafter be read by a computer system. Examples of the computer readable medium include read-only memory, random-access memory, CD-ROMs, DVDs, magnetic tape, optical data storage devices, and carrier waves. The computer readable medium can also be distributed over network-coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.


The advantages of the invention are numerous. Different aspects, embodiments or implementations may yield one or more of the following advantages. One advantage of the invention is that the processing resources required to implement audio sound effects can be substantially reduced. Hence, a media device that is highly portable can make use of audio sound effects. Another advantage of the invention is that sound effects can be output even while a media device is outputting other media (e.g., music). Another advantage of the invention is that the audio data for sound effects can be stored in a single format and converted to other formats as appropriate to substantially match the audio data of a media item being played. Still another advantage of the invention is that multiple sound effects can be output concurrently with substantial preservation of their intelligibility.


The many features and advantages of the present invention are apparent from the written description and, thus, it is intended by the appended claims to cover all such features and advantages of the invention. Further, since numerous modifications and changes will readily occur to those skilled in the art, the invention should not be limited to the exact construction and operation as illustrated and described. Hence, all suitable modifications and equivalents may be resorted to as falling within the scope of the invention.

Claims
  • 1. A method for a computing device comprising: determining a desired sound effect to output for the computing device if a sound effect event is detected; retrieving sound effect data for the desired sound effect; determining if audio data is to be output for the computing device with the desired sound effect; obtaining characteristics of the audio data if determined to be output; modifying the sound effect data based on the obtained characteristics of the audio data prior to mixing with the audio data such that the sound effect data have similar characteristics as the audio data; mixing the modified sound effect data and the audio data, wherein the audio data has a longer audio playback duration than the modified sound effect data and the modified sound effect data serves to inform a user of the computing device of a condition being triggered by the computing device; and informing the user of the triggered condition by outputting the mixed modified sound effect data and audio data.
  • 2. The method of claim 1, further comprising outputting the mixed modified sound effect data and audio data to an audio output device.
  • 3. The method of claim 1, wherein if audio data is not determined to be output for the computing device, the sound effect data for the desired sound effect is output for the computing device without any audio data.
  • 4. The method of claim 1, wherein the characteristics of the audio data include meta data.
  • 5. The method of claim 1, wherein modifying the sound effect data includes modifying the sound effect data based on audio resolution, sample rate, or stereo characteristics of the audio data.
  • 6. The method of claim 1, wherein the condition is triggered in response to a system action or a user action.
  • 7. A portable computing device comprising: a memory storing audio data and sound effect data; and a processor running an application to determine a desired sound effect to output for the portable computing device if a sound effect event is detected, retrieve sound effect data for the desired sound effect, determine if audio data is to be output for the portable computing device with the desired sound effect; obtain characteristics of the audio data if determined to be output, modify the sound effect data based on the obtained characteristics of the audio data prior to mixing with the audio data such that the sound effect data have similar characteristics as the audio data, mix the modified sound effect data and the audio data, wherein the audio data has a longer audio playback duration than the modified sound effect data and the modified sound effect data serves to inform a user of the computing device of a condition being triggered by the computing device, and inform the user of the triggered condition by outputting the mixed modified sound effect data and audio data.
  • 8. The portable computing device of claim 7, further comprising an audio output device to output the sound effect data for the desired sound effect without audio data.
  • 9. The portable computing device of claim 8, wherein the audio device is to output the mixed modified sound effect data and audio data.
  • 10. The portable computing device of claim 8, wherein the application produces an audio channel and a mixer channel.
  • 11. The portable computing device of claim 10, wherein the application is to send audio data on the audio channel to the audio output device and mixed modified sound effect data and audio data on the mixer channel to the audio output device.
  • 12. The portable computing device of claim 7, wherein the characteristics of the audio data include resolution, sample rate, or stereo characteristics.
  • 13. A non-transitory machine-readable medium, comprising instructions, which executed by a computing device, cause the computing device to perform a method comprising: determining a desired sound effect to output for the computing device if a sound effect event is detected; retrieving sound effect data for the desired sound effect; determining if audio data is to be output for the computing device with the desired sound effect; obtaining characteristics of the audio data if determined to be output; modifying the sound effect data based on the obtained characteristics of the audio data prior to mixing with the audio data such that the sound effect data have similar characteristics as the audio data; mixing the modified sound effect data and the audio data, wherein the audio data has a longer audio playback duration than the modified sound effect data and the modified sound effect data serves to inform a user of the computing device of a condition being triggered by the computing device; and informing the user of the triggered condition by outputting the mixed modified sound effect data and audio data.
  • 14. The non-transitory machine-readable medium of claim 13, comprising instructions, which executed by the computing device, cause the computing device to perform a method comprising outputting sound effect data for the desired sound effect without any audio data if audio data is not determined to be output for the computing device.
  • 15. The non-transitory machine-readable medium of claim 14, comprising instructions, which executed by the computing device, cause the computing device to perform a method comprising outputting the mixed modified sound effect data and audio data to the audio output device.
  • 16. The non-transitory machine-readable medium of claim 14, comprising instructions, which executed by the computing device, cause the computing device to perform a method comprising modifying the sound effect data based on audio resolution, sample rate, or stereo characteristics of the audio data.
  • 17. A computing device comprising: means for determining a desired sound effect to output for the computing device if a sound effect event is detected; means for retrieving sound effect data for the desired sound effect; means for determining if audio data is to be output for the computing device with the desired sound effect; means for obtaining characteristics of the audio data if determined to be output; means for modifying the sound effect data based on the obtained characteristics of the audio data prior to mixing with the audio data such that the sound effect data have similar characteristics as the audio data; means for mixing the modified sound effect data and the audio data, wherein the audio data has a longer audio playback duration than the modified sound effect data and the modified sound effect data serves to inform a user of the computing device of a condition being triggered by the computing device; and means for outputting the mixed modified sound effect data and audio data to inform the user of the triggered condition.
  • 18. The computing device of claim 17, further comprising means for outputting the sound effect data for the desired sound effect without any audio data if audio data is not determined to be output for the computing device.
  • 19. The computing device of claim 17, wherein the characteristics of the audio data include meta data.
  • 20. The computing device of claim 17, further comprising means for modifying the sound effect data based on audio resolution, sample rate, or stereo characteristics of the audio data.
BACKGROUND OF THE INVENTION

This application is a continuation of co-pending U.S. application Ser. No. 13/660,839 filed Oct. 25, 2012, which is a continuation of U.S. application Ser. No. 11/144,541 filed on Jun. 3, 2005, now issued as U.S. Pat. No. 8,300,841.

US Referenced Citations (270)
Number Name Date Kind
4090216 Constable May 1978 A
4386345 Narveson et al. May 1983 A
4451849 Fuhrer May 1984 A
4589022 Prince et al. May 1986 A
4908523 Snowden et al. Mar 1990 A
4928307 Lynn May 1990 A
4951171 Tran et al. Aug 1990 A
5185906 Brooks Feb 1993 A
5293494 Saito et al. Mar 1994 A
5379057 Clough Jan 1995 A
5406305 Shimomura et al. Apr 1995 A
5559945 Beaudet et al. Sep 1996 A
5566337 Szymanski et al. Oct 1996 A
5570308 Ochi Oct 1996 A
5583993 Foster et al. Dec 1996 A
5596260 Moravec et al. Jan 1997 A
5608698 Yamanoi et al. Mar 1997 A
5616876 Cluts Apr 1997 A
5617386 Choi Apr 1997 A
5670985 Cappels, Sr. et al. Sep 1997 A
5675362 Clough Oct 1997 A
5684513 Decker Nov 1997 A
5710922 Alley et al. Jan 1998 A
5712949 Kato et al. Jan 1998 A
5717422 Fergason Feb 1998 A
5721949 Smith et al. Feb 1998 A
5726672 Hernandez et al. Mar 1998 A
5739451 Winksy et al. Apr 1998 A
5740143 Suetomi Apr 1998 A
5760588 Bailey Jun 1998 A
5778374 Dang et al. Jul 1998 A
5803786 McCormick Sep 1998 A
5815225 Nelson Sep 1998 A
5822288 Shinada Oct 1998 A
5835721 Donahue et al. Nov 1998 A
5835732 Kikinis et al. Nov 1998 A
5838969 Jacklin et al. Nov 1998 A
5864868 Contois Jan 1999 A
5867163 Kurtenbach Feb 1999 A
5870710 Ozawa et al. Feb 1999 A
5890017 Tulkoff Mar 1999 A
5918303 Yamaura et al. Jun 1999 A
5920728 Hallowell et al. Jul 1999 A
5923757 Hocker et al. Jul 1999 A
5936643 Tindell Aug 1999 A
5952992 Helms Sep 1999 A
5982902 Terano Nov 1999 A
5986589 Rosefield Nov 1999 A
5998972 Gong Dec 1999 A
6006274 Hawkins et al. Dec 1999 A
6009237 Hirabayashi et al. Dec 1999 A
6011585 Anderson Jan 2000 A
6018705 Gaudet et al. Jan 2000 A
6041023 Lakhansingh Mar 2000 A
6052654 Gaudet et al. Apr 2000 A
6057789 Lin May 2000 A
6108426 Stortz Aug 2000 A
6122340 Darley et al. Sep 2000 A
6158019 Squibb Dec 2000 A
6161944 Leman Dec 2000 A
6172948 Keller et al. Jan 2001 B1
6179432 Zhang et al. Jan 2001 B1
6185163 Bickford et al. Feb 2001 B1
6191939 Burnett Feb 2001 B1
6208044 Viswanadham et al. Mar 2001 B1
6216131 Liu et al. Apr 2001 B1
6217183 Shipman Apr 2001 B1
6222347 Gong Apr 2001 B1
6248946 Dwek Jun 2001 B1
6295541 Bodnar et al. Sep 2001 B1
6297795 Kato et al. Oct 2001 B1
6298314 Blackadar et al. Oct 2001 B1
6332175 Birrell et al. Dec 2001 B1
6336365 Blackadar et al. Jan 2002 B1
6336727 Kim Jan 2002 B1
6341316 Kloba et al. Jan 2002 B1
6357147 Darley et al. Mar 2002 B1
6377530 Burrows Apr 2002 B1
6452610 Reinhardt et al. Sep 2002 B1
6467924 Shipman Oct 2002 B2
6493652 Ohlenbusch et al. Dec 2002 B1
6536139 Darley et al. Mar 2003 B2
6549497 Miyamoto et al. Apr 2003 B2
6560903 Darley May 2003 B1
6587403 Keller et al. Jul 2003 B1
6587404 Keller Jul 2003 B1
6605038 Teller et al. Aug 2003 B1
6606281 Cowgill et al. Aug 2003 B2
6611607 Davis Aug 2003 B1
6611789 Darley Aug 2003 B1
6617963 Watters et al. Sep 2003 B1
6621768 Keller et al. Sep 2003 B1
6623427 Mandigo Sep 2003 B2
6631101 Chan et al. Oct 2003 B1
6658577 Huppi et al. Dec 2003 B2
6693612 Matsumoto et al. Feb 2004 B1
6728584 Duan Apr 2004 B1
6731312 Robbin May 2004 B2
6760536 Amir et al. Jul 2004 B1
6762741 Weindorf Jul 2004 B2
6781611 Richard Aug 2004 B1
6794566 Pachet Sep 2004 B2
6799226 Robbin et al. Sep 2004 B1
6801964 Mahdavi Oct 2004 B1
6832373 O'Neill Dec 2004 B2
6844511 Hsu et al. Jan 2005 B1
6870529 Davis Mar 2005 B1
6871063 Schiffer Mar 2005 B1
6876947 Darley et al. Apr 2005 B1
6882955 Ohlenbusch et al. Apr 2005 B1
6886749 Chiba et al. May 2005 B2
6898550 Blackadar et al. May 2005 B1
6911971 Suzuki et al. Jun 2005 B2
6918677 Shipman Jul 2005 B2
6931377 Seya Aug 2005 B1
6934812 Robbin et al. Aug 2005 B1
6950087 Knox et al. Sep 2005 B2
6950603 Isozaki et al. Sep 2005 B1
7010365 Maymudes Mar 2006 B2
7028096 Lee Apr 2006 B1
7046230 Zadesky May 2006 B2
7062225 White Jun 2006 B2
7076561 Rosenberg et al. Jul 2006 B1
7084856 Huppi Aug 2006 B2
7084921 Ogawa Aug 2006 B1
7092946 Bodnar Aug 2006 B2
7124125 Cook et al. Oct 2006 B2
7131059 Obrador Oct 2006 B2
7143241 Hull Nov 2006 B2
7146437 Robbin et al. Dec 2006 B2
7171331 Vock et al. Jan 2007 B2
7191244 Jennings et al. Mar 2007 B2
7213228 Putterman et al. May 2007 B2
7216008 Sakata May 2007 B2
7234026 Robbin et al. Jun 2007 B2
7277928 Lennon Oct 2007 B2
7301857 Shah et al. Nov 2007 B2
7356679 Le et al. Apr 2008 B1
7508535 Hart et al. Mar 2009 B2
20010013983 Izawa et al. Aug 2001 A1
20010029178 Criss et al. Oct 2001 A1
20010037367 Iyer Nov 2001 A1
20010041021 Boyle et al. Nov 2001 A1
20010042107 Pahn Nov 2001 A1
20020002413 Tokue Jan 2002 A1
20020013784 Swanson Jan 2002 A1
20020028683 Banatre et al. Mar 2002 A1
20020045961 Gibbs et al. Apr 2002 A1
20020046315 Miller et al. Apr 2002 A1
20020055934 Lipscomb et al. May 2002 A1
20020059440 Hudson et al. May 2002 A1
20020059499 Hudson May 2002 A1
20020090912 Cannon et al. Jul 2002 A1
20020116082 Gudorf Aug 2002 A1
20020116517 Hudson et al. Aug 2002 A1
20020122031 Maglio et al. Sep 2002 A1
20020123359 Wei et al. Sep 2002 A1
20020152045 Dowling et al. Oct 2002 A1
20020156833 Maurya et al. Oct 2002 A1
20020161865 Nguyen Oct 2002 A1
20020173273 Spurgat et al. Nov 2002 A1
20020189426 Hirade et al. Dec 2002 A1
20020189429 Qian et al. Dec 2002 A1
20020199043 Yin Dec 2002 A1
20030002688 Kanevsky et al. Jan 2003 A1
20030007001 Zimmerman Jan 2003 A1
20030018799 Eyal Jan 2003 A1
20030037254 Fischer et al. Feb 2003 A1
20030046434 Flanagin et al. Mar 2003 A1
20030050092 Yun Mar 2003 A1
20030074457 Kluth Apr 2003 A1
20030076301 Tsuk et al. Apr 2003 A1
20030076306 Zadesky Apr 2003 A1
20030079038 Robbin et al. Apr 2003 A1
20030095096 Robbin et al. May 2003 A1
20030097379 Ireton May 2003 A1
20030104835 Douhet Jun 2003 A1
20030127307 Liu et al. Jul 2003 A1
20030128192 van Os Jul 2003 A1
20030133694 Yeo Jul 2003 A1
20030153213 Siddiqui et al. Aug 2003 A1
20030156503 Schilling et al. Aug 2003 A1
20030167318 Robbin et al. Sep 2003 A1
20030176935 Lian et al. Sep 2003 A1
20030182100 Plastina et al. Sep 2003 A1
20030221541 Platt Dec 2003 A1
20030229490 Etter Dec 2003 A1
20030236695 Litwin, Jr. Dec 2003 A1
20040001395 Keller et al. Jan 2004 A1
20040001396 Keller et al. Jan 2004 A1
20040012556 Yong et al. Jan 2004 A1
20040055446 Robbin et al. Mar 2004 A1
20040066363 Yamano et al. Apr 2004 A1
20040069122 Wilson Apr 2004 A1
20040076086 Keller Apr 2004 A1
20040086120 Akins, III et al. May 2004 A1
20040094018 Ueshima et al. May 2004 A1
20040103411 Thayer May 2004 A1
20040125522 Chiu et al. Jul 2004 A1
20040165302 Lu Aug 2004 A1
20040177063 Weber et al. Sep 2004 A1
20040198436 Alden Oct 2004 A1
20040210628 Inkinen et al. Oct 2004 A1
20040216108 Robbin Oct 2004 A1
20040224638 Fadell et al. Nov 2004 A1
20040242224 Janik et al. Dec 2004 A1
20040246275 Yoshihara et al. Dec 2004 A1
20040255135 Kitaya et al. Dec 2004 A1
20040267825 Novak et al. Dec 2004 A1
20050015254 Beaman Jan 2005 A1
20050053365 Adams et al. Mar 2005 A1
20050060240 Popofsky Mar 2005 A1
20050060542 Risan et al. Mar 2005 A1
20050108754 Carhart et al. May 2005 A1
20050111820 Matsumi May 2005 A1
20050122315 Chalk et al. Jun 2005 A1
20050123886 Hua et al. Jun 2005 A1
20050146534 Fong et al. Jul 2005 A1
20050149213 Guzak et al. Jul 2005 A1
20050152294 Yu et al. Jul 2005 A1
20050156047 Chiba et al. Jul 2005 A1
20050160270 Goldberg et al. Jul 2005 A1
20050166153 Eytchison et al. Jul 2005 A1
20050216855 Kopra et al. Sep 2005 A1
20050218303 Poplin Oct 2005 A1
20050234983 Plastina et al. Oct 2005 A1
20050245839 Stivoric et al. Nov 2005 A1
20050246324 Paalasmaa et al. Nov 2005 A1
20050248555 Feng et al. Nov 2005 A1
20050257169 Tu Nov 2005 A1
20050259064 Sugino et al. Nov 2005 A1
20050259524 Yeh Nov 2005 A1
20050259758 Razzell Nov 2005 A1
20060013414 Shih Jan 2006 A1
20060025068 Regan et al. Feb 2006 A1
20060026424 Eto Feb 2006 A1
20060061563 Fleck Mar 2006 A1
20060068760 Hameed et al. Mar 2006 A1
20060071899 Chang et al. Apr 2006 A1
20060088228 Marriott et al. Apr 2006 A1
20060092122 Yoshihara et al. May 2006 A1
20060094409 Inselberg May 2006 A1
20060095502 Lewis et al. May 2006 A1
20060098320 Koga et al. May 2006 A1
20060135883 Jonsson et al. Jun 2006 A1
20060145053 Stevenson et al. Jul 2006 A1
20060152382 Hiltunen Jul 2006 A1
20060155914 Jobs et al. Jul 2006 A1
20060170535 Watters et al. Aug 2006 A1
20060173974 Tang Aug 2006 A1
20060190577 Yamada Aug 2006 A1
20060190980 Kikkoji et al. Aug 2006 A1
20060221057 Fux et al. Oct 2006 A1
20060221788 Lindahl et al. Oct 2006 A1
20060265503 Jones et al. Nov 2006 A1
20060272483 Honeywell Dec 2006 A1
20060277336 Lu et al. Dec 2006 A1
20070014536 Hellman Jan 2007 A1
20070028009 Robbin et al. Feb 2007 A1
20070061759 Klein, Jr. Mar 2007 A1
20070089057 Kindig Apr 2007 A1
20070106660 Stern et al. May 2007 A1
20070124679 Jeong et al. May 2007 A1
20070129062 Pantalone et al. Jun 2007 A1
20070135225 Nieminen et al. Jun 2007 A1
20070248311 Wice et al. Oct 2007 A1
20070255163 Prineppi Nov 2007 A1
20080055228 Glen Mar 2008 A1
20080134287 Gudorf et al. Jun 2008 A1
20100077338 Matthews et al. Mar 2010 A1
Foreign Referenced Citations (71)
Number Date Country
43 34 773 Apr 1994 DE
44 45 023 Jun 1996 DE
0 127 139 May 1984 EP
0578604 Jan 1994 EP
0 757 437 Feb 1997 EP
0 813 138 Dec 1997 EP
0 863 469 Sep 1998 EP
0 917 077 May 1999 EP
0 982 732 Mar 2000 EP
1 028 425 Aug 2000 EP
1028426 Aug 2000 EP
1 076 302 Feb 2001 EP
1 213 643 Jun 2002 EP
1 289 197 Mar 2003 EP
1 503 363 Feb 2005 EP
1536612 Jun 2005 EP
1 566 743 Aug 2005 EP
1566948 Aug 2005 EP
1 372 133 Dec 2005 EP
1 686 496 Aug 2006 EP
2 370 208 Jun 2002 GB
2384399 Jul 2003 GB
2399639 May 2005 GB
59-023610 Feb 1984 JP
03-228490 Oct 1991 JP
04-243386 Aug 1992 JP
6-96520 Apr 1994 JP
8-235774 Sep 1996 JP
9-50676 Feb 1997 JP
9-259532 Oct 1997 JP
2000-90651 Mar 2000 JP
2000-224099 Aug 2000 JP
2000-285643 Oct 2000 JP
2000-299834 Oct 2000 JP
2000-311352 Nov 2000 JP
2000-339864 Dec 2000 JP
2001-236286 Aug 2001 JP
2001-312338 Nov 2001 JP
2002-076977 Mar 2002 JP
2002-175467 Jun 2002 JP
2003-188792 Jul 2003 JP
2003-259333 Sep 2003 JP
2003-319365 Nov 2003 JP
2004-021720 Jan 2004 JP
2004-219731 Aug 2004 JP
2004-220420 Aug 2004 JP
20010076508 Aug 2001 KR
0 127 139 May 1984 WO
WO 0133569 Jun 1995 WO
WO 9516950 Jun 1995 WO
0 757 437 Feb 1997 WO
9817032 Apr 1998 WO
WO 9928813 Jun 1999 WO
WO 0022820 Apr 2000 WO
WO 0165413 Sep 2001 WO
WO 0167753 Sep 2001 WO
WO 0225610 Mar 2002 WO
WO 03023786 Mar 2003 WO
WO 03036457 May 2003 WO
WO 03067202 Aug 2003 WO
2004061850 Jul 2004 WO
WO 2004055637 Jul 2004 WO
WO 2004084413 Sep 2004 WO
WO 2004104815 Dec 2004 WO
WO 2005031737 Apr 2005 WO
05048644 May 2005 WO
WO 05048644 May 2005 WO
WO 2005008505 Jul 2005 WO
2005109781 Nov 2005 WO
WO 2006040737 Apr 2006 WO
2006071364 Jun 2006 WO
Non-Patent Literature Citations (151)
Entry
Office Action dated Mar. 25, 2010 in U.S. Appl. No. 11/297,032.
Office Action dated Mar. 10, 2010 in U.S. Appl. No. 11/583.327.
Office Action dated Mar. 11, 2010 in U.S. Appl. No. 11/830,746.
Office Action dated Mar. 4, 2010 in U.S. Appl. No. 11/324,863.
Office Action dated Feb. 3, 2010 in U.S. Appl. No. 11/439,613.
Office Action dated Dec. 14, 2009 in U.S. Appl. No. 11/535,646.
Office Action dated Sep. 25, 2009 in Chinese Application No. 200610130904.1.
Notice of Allowance dated Feb. 4, 2010 in U.S. Appl. No. 11/535,646.
Office Action dated Apr. 12, 2010 in U.S. Appl. No. 12/397.051.
Office Action dated Apr. 13, 2010 in U.S. Appl. No. 12/406,793.
Office Action dated Apr. 15, 2010 in U.S. Appl. No. 11/373,468.
Office Action dated Sep. 3, 2009 in U.S. Appl. No. 11/324,863.
Office Action dated Jan. 26, 2009 in U.S. Appl. No. 11/373,468.
Office Action dated Jun. 24, 2009 in U.S. Appl. No. 11/373,468.
Kadir et al., “Adaptive Fast Playback-Based Video Skimming Using a Compressed-Domain Visual Complexity Measure”, 2004 IEEE International Conference on Multimedia and Expo, pp. 2055-2058.
Office Action dated Oct. 16, 2008 in U.S. Appl. No. 11/327,544.
Office Action in European Patent Application No. 05 855 368.6 dated Nov. 20, 2008.
Office Action dated Dec. 15, 2008 in U.S. Appl. No. 11/212,313.
Notice of Allowance dated Dec. 18, 2008 in U.S. Appl. No. 11/212,555.
International Search Report dated Oct. 10, 2008 in PCT Application No. PCT/US2007/077160.
Written Opinion dated Oct. 10, 2008 in PCT Application No. PCT/US2007/077160.
Office Action dated Sep. 1, 2008 in EP Application No. 06 256 215.2.
Written Opinion dated Jan. 6, 2009 in Singapore Application No. 200701865-8.
Office Action dated May 27, 2009 in U.S. Appl. No. 11/439,613.
Office Action dated Jun. 2, 2009 in U.S. Appl. No. 11/530,773.
Office Action dated May 11, 2009 in U.S. Appl. No. 11/680,580.
Notice of Allowance dated Apr. 21, 2009 in U.S. Appl. No. 11/327,544.
Office Action dated Mar. 30, 2009 in U.S. Appl. No. 11/515,270.
Office Action dated Apr. 9, 2009 in U.S. Appl. No. 11/583,199.
Notice of Allowance dated Jun. 15, 2009 in U.S. Appl. No. 11/212,313.
Office Action dated Jun. 22, 2009 in U.S. Appl. No. 11/515,270.
Office Action dated Jun. 24, 2009 in U.S. Appl. No. 11/519,352.
Office Action dated Sep. 10, 2009 in U.S. Appl. No. 11/746,548.
Office Action dated Sep. 2, 2009 in U.S. Appl. No. 11/515,270.
Office Action dated Oct. 16, 2009 in U.S. Appl. No. 11/583,199.
Office Action dated Oct. 23, 2009 in Chinese Application No. 200580048143.9.
Office Action dated Nov. 16, 2009 in U.S. Appl. No. 11/439,613.
Office Action dated May 29, 2009 in EP Application No. 06 847 856.9.
Office Action dated Dec. 11, 2009 in U.S. Appl. No. 11/519,352.
Office Action dated Dec. 16, 2009 in U.S. Appl. No. 11/746,548.
Examination Report dated Sep. 1, 2009 in Singapore Application No. 200701865-8.
U.S. Appl. No. 10/125,893, filed Apr. 18, 2002 and titled “Power Adapters for Powering and/or Charging Peripheral Devices.”
International Search Report dated Dec. 5, 2007 in PCT Application PCT/US2007/004810.
Written Opinion dated Dec. 5, 2007 in PCT Application No. PCT/US2007/004810.
Partial Search Report dated Sep. 6, 2007 in PCT Application No. PCT/US2007/004810.
Apple iTunes Smart Playlists, downloaded Apr. 5, 2005 from http://web.archive.org/web/2003100211316/www.apple.com/itunes/smartplaylists . . . pp. 1-2.
International Search Report in Patent Application No. PCT/US2006/048738 dated Jan. 29, 2008.
International Search Report in Patent Application No. PCT/US2007/077020 dated Jan. 28, 2008.
International Search Report in Patent Application No. PCT/US2007/076889 dated Jan. 28, 2008.
iTunes, Wikipedia: The Free Encyclopedia; downloaded on Oct. 5, 2005, pp. 1-6.
Nutzel et al., “Sharing Systems for Future HiFi Systems”, The Computer Society, Jun. 2004.
Written Opinion in Patent Application No. PCT/US2006/048738 dated Jan. 29, 2008.
Written Opinion in Patent Application No. PCT/US2007/076889 dated Jan. 28, 2008.
Written Opinion in Patent Application No. PCT/US2007/077020 dated Jan. 28, 2008.
Office Action dated Feb. 1, 2008 in U.S. Appl. No. 11/327,544.
Hart-Davis, Guy, “How to Do Everything With Your iPod and iPod Mini”, 2004, McGraw-Hill Professional, p. 33.
Office Action dated Feb. 4, 2008 in U.S. Appl. No. 11/566,072.
“Creative liefert erstes Portable Media Center aus” (“Creative Ships First Portable Media Center”) [Online] Sep. 2, 2004. Retrieved from the Internet on Sep. 20, 2007 from http://www.golem.de/0409/33347.html.
International Search Report dated Feb. 18, 2008 in Patent Application No. PCT/US2007/079766.
International Search Report dated Sep. 27, 2007 in Application No. 05824296.7.
Office Action dated Apr. 4, 2008 in U.S. Appl. No. 11/212,555.
Office Action dated Feb. 20, 2008 in Japanese Application No. 2007-538196.
Office Action dated Feb. 25, 2008 in U.S. Appl. No. 11/749,599.
Office Action dated Mar. 4, 2008 from U.S. Appl. No. 10/973,657.
Partial International Search Report dated Feb. 1, 2008 in Patent Application No. PCT/US2007/010630.
Written Opinion dated Feb. 18, 2008 in Patent Application No. PCT/US2007/079766.
Invitation to Pay Additional Fees and Partial Search Report for PCT Application No. PCT/US2007/077160 dated Apr. 1, 2008.
“Combination Belt Clip Leaf Spring and Housing Latch”, Wandt et al.; Motorola Technical Developments, Motorola Inc., Schaumburg, IL, vol. 18, Mar. 1, 1993.
“Creative Zen Vision: M 30GB”, Dec. 21, 2005; downloaded on Jan. 11, 2008 from http://web.archive.org/web/20051221050140/http://www.everythingusb.com/creative_zen_vision:m_30gb.html.
International Search Report dated Jul. 7, 2008 in PCT Application No. PCT/US2007/076793.
International Search Report dated Jun. 10, 2008 in PCT Application No. PCT/US2007/010630.
Notification of Reason for Rejection in Japanese Patent Application No. 2003-539048 dated Nov. 27, 2007.
Office Action dated Jun. 17, 2008 in U.S. Appl. No. 11/212,313.
Office Action dated May 30, 2008 in Chinese Patent Application No. 02825938.6.
Office Action in Chinese Patent Application No. 2008-045351 dated Aug. 5, 2008.
Office Action in U.S. Appl. No. 11/212,555 dated Aug. 14, 2008.
Search Report dated May 15, 2008 in PCT Application No. PCT/US2007/019578.
Written Opinion dated Jul. 7, 2008 in PCT Application No. PCT/US2007/076793.
Written Opinion dated Jun. 10, 2008 in PCT Application No. PCT/US2007/010630.
Written Opinion dated May 15, 2008 in PCT Application No. PCT/US2007/019578.
Yee et al., “Faceted Metadata for Image Search and Browsing.” Association for Computing Machinery, Conference Proceedings, Apr. 5, 2003.
“Apple Announces iTunes 2,” Press Release, Apple Computer, Inc., Oct. 23, 2001.
“Apple Introduces iTunes—World's Best and Easiest to Use Jukebox Software,” Macworld Expo, San Francisco, Jan. 9, 2001.
“Apple's iPod Available in Stores Tomorrow,” Press Release, Apple Computer, Inc., Nov. 9, 2001.
“Nomad Jukebox,” User Guide, Creative Technology Ltd., Version 1, Aug. 2000.
“SoundJam MP Plus Manual, version 2.0”—MP3 Player and Encoder for Macintosh by Jeffrey Robbin, Bill Kincaid and Dave Heller, manual by Tom Negrino, published by Casady & Greene, Inc., 2000.
“12.1″ 925 Candela Mobile PC”, downloaded from LCDHardware.com on Dec. 19, 2002, http://www.lcdhardware.com/panel/12_1_panel/default.asp.
“BL82 Series Backlit Keyboards”, www.tg3electronics.com/products/backlit/backlit.htm, downloaded Dec. 19, 2002.
“Bluetooth PC Headsets—Enjoy Wireless VoIP Conversations: ‘Connecting’ Your Bluetooth Headset With Your Computer”, Bluetooth PC Headsets; downloaded on Apr. 29, 2006 from http://www.bluetoothpcheadsets.com/connect.htm.
“Creative MuVo TX 256 MB,” T3 Magazine, Aug. 17, 2004, http://www.t3.co.uk/reviews/entertainment/mp3_player/creative_muvo_tx_256mb [downloaded Jun. 6, 2006].
“Digital Still Cameras—Downloading Images to a Computer,” Mimi Chakarova et al., Multi-Media Reporting and Convergence, 2 pgs.
“Eluminx Illuminated Keyboard”, downloaded Dec. 19, 2002, http://www.elumix.com/.
“How to Pair a Bluetooth Headset & Cell Phone”, About.com; downloaded on Apr. 29, 2006 from http://mobileoffice.about.com/od/usingyourphone/ht/blueheadset_p.htm.
“Peripherals for Industrial Keyboards & Pointing Devices”, Stealth Computer Corporation, downloaded on Dec. 19, 2002, http://www.stealthcomputer.com/peropherals_ocm.htm.
“Poly-Optical Fiber Optic Membrane Switch Backlighting”, downloaded Dec. 19, 2002, http://www.poly-optical.com/membrane_switches.html.
“Public Safety Technologies Tracer 2000 Computer”, downloaded Dec. 19, 2002, http://www.pst911.com/traver.html.
“QuickTime Movie Playback Programming Guide”, Apple Computer, Inc., Aug. 11, 2005.
“QuickTime Overview”, Apple Computer, Inc., Aug. 11, 2005.
“Rocky Matrix Backlit Keyboard”, downloaded Dec. 19, 2002, http://www.amrel.com/psj_matrixkeyboard.html.
“Sony Ericsson to introduce Auto pairing to improve Bluetooth connectivity between headsets and phones”, Sep. 28, 2005 Press Release, Sony Ericsson Corporate; downloaded on Apr. 29, 2006 from http://www.sonyericsson.com/spg.jsp?cc=global&lc=en&ver=4001&template=pc3_1_1&z . . . .
“TAOS, Inc., Announces Industry's First Ambient Light Sensor to Convert Light Intensity to Digital Signals”, www.taosinc.com/pressrelease_090902.htm, downloaded Jan. 23, 2003.
“Toughbook 28: Powerful, Rugged and Wireless”, Panasonic: Toughbook Models, downloaded Dec. 19, 2002, http://www.panasonic.com/computer/notebook/html/01a_s8.htm.
“When it Comes to Selecting a Projection TV, Toshiba Makes Everything Perfectly Clear, Previews of New Releases”, www.bestbuy.com/HomeAudioVideo/Specials/ToshibaTVFeatures.asp, downloaded Jan. 23, 2003.
“WhyBuy: Think Pad”, IBM ThinkPad Web Page Ease of Use, downloaded on Dec. 19, 2002, http://www.pc.ibm.com/us/thinkpad/easeofuse.html.
512MB Waterproof MP3 Player with FM Radio & Built-in Pedometer, Oregon Scientific, downloaded on Jul. 31, 2006 from http://www2.oregonscientific.com/shop/product.asp?cid=4&scid=11&pid=581.
Adam C. Engst, “SoundJam Keeps on Jammin',” Jun. 19, 2000, http://db.tidbits.com/getbits.acgi?tbart=05988.
Alex Veiga, “AT&T Wireless Launching Music Service,” Yahoo! Finance, Oct. 5, 2004, pp. 1-2.
Andrew Birrell, “Personal Jukebox (PJB),” Oct. 13, 2000, http://birrell.org/andrew/talks/pjb-overview.ppt.
Apple iPod Technical Specifications, iPod 20GB and 60GB Mac + PC, downloaded from http://www.apple.com/ipod/color/specs.html on Aug. 8, 2005.
Bociurkiw, Michael, “Product Guide: Vanessa Matz,” www.forbes.com/asap/2000/1127/vmartz_print.html, Nov. 27, 2000.
Compaq, “Personal Jukebox,” Jan. 24, 2001, http://research.compaq.com/SRC/pjb/.
Creative: “Creative NOMAD MuVo TX,” www.creative.com, Nov. 1, 2004, http://web.archive.org/web/20041024175952/www.creative.com/products/pfriendly.asp?product=9672 [downloaded Jun. 6, 2006].
Creative: “Creative NOMAD MuVo,” www.creative.com, Nov. 1, 2004, http://web.archive.org/web/20041024075901/www.creative.com/products/product.asp?category=213&subcategory=215&product=110 [downloaded Jun. 7, 2006].
Creative: “MP3 Player,” www.creative.com, Nov. 1, 2004, http://web.archive.org/web/20041024074823/www.creative.com/products/product.asp?category=213&subcategory=216&product=4983 [downloaded Jun. 7, 2006].
De Herrera, Chris, “Microsoft ActiveSync 3.1,” Version 1.02, Oct. 13, 2000.
IAP Sports Lingo 0x09 Protocol V1.00, May 1, 2006.
IEEE 1394, Wikipedia: The Free Encyclopedia, 1995, http://www.wikipedia.org/wiki/Firewire.
Written Opinion of the International Searching Authority dated Nov. 24, 2006 in PCT Application No. PCT/US2005/046797.
International Search Report dated Feb. 4, 2003 in corresponding application No. PCT/US2002/033330.
International Search Report dated Jul. 10, 2007 in corresponding application No. PCT/US2006/048738.
International Search Report dated Apr. 5, 2006 from corresponding International Application No. PCT/US2005/038819.
International Search Report dated Jul. 2, 2007 in related case PCT/US2006/048669.
International Search Report dated Jun. 19, 2007 in related Application PCT/US2006/048753.
International Search Report dated May 21, 2007 from corresponding PCT Application No. PCT/US2006/048670.
International Search Report in corresponding European Application No. 06256215.2 dated Feb. 20, 2007.
Invitation to Pay Additional Fees and Partial Search Report for corresponding PCT Application No. PCT/US2005/046797 dated Jul. 3, 2006.
iTunes 2, Playlist Related Help Screens, iTunes v2.0, Apple Computer, Inc., Oct. 23, 2001.
iTunes, Playlist Related Help Screens, iTunes v 1.0, Apple Computer, Inc., Jan. 2001.
Jabra Bluetooth Headset User Manual; GN Netcom A/S, 2005.
Jabra Bluetooth Introduction; GN Netcom A/S, Oct. 2004.
Jabra FreeSpeak BT200 User Manual; Jabra Corporation, 2002.
Kennedy, “Digital Data Storage Using Video Disc,” IBM Technical Disclosure Bulletin, vol. 24, No. 2, Jul. 1981.
Miniman, “Applian Software's Replay Radio and Player v1.02,” Product review, pocketnow.com, http://www.pocketnow.com/reviews/replay/replay.htm, Jul. 31, 2001.
Musicmatch, “Musicmatch and Xing Technology Introduce Musicmatch Jukebox,” May 18, 1998, http://www.musicmatch.com/info/company/press/releases/?year=1998&release=2.
Nonhoff-Arps et al., “Straßenmusik Portable MP3-Spieler mit USB-Anschluss” (“Street Music: Portable MP3 Players with USB Port”), c't Magazin für Computertechnik, Verlag Heinz Heise GmbH, Hannover, DE, No. 25, Dec. 4, 2000.
International Search Report dated Nov. 24, 2006 in PCT Application No. PCT/US2005/046797.
Personal Jukebox (PJB), “Systems Research Center and PAAD,” Compaq Computer Corp., Oct. 13, 2000, http://research.compaq.com/SRC/pjb/.
Peter Lewis, “Two New Ways to Buy Your Bits,” CNN Money, Dec. 31, 2003, pp. 1-4.
Sastry, Ravindra Wadali, “A Need for Speed: A New Speedometer for Runners,” submitted to the Department of Electrical Engineering and Computer Science at the Massachusetts Institute of Technology, May 28, 1999.
Sinitsyn, Alexander, “A Synchronization Framework for Personal Mobile Servers,” Pervasive Computing and Communications Workshops, 2004, Proceedings of the Second IEEE Annual Conference on, Piscataway, NJ, USA, IEEE, Mar. 14, 2004, pp. 208-212.
SoundJam MP Plus, Representative Screens, published by Casady & Greene, Inc., Salinas, CA, 2000.
Specification Sheet. iTunes 2, Apple Computer, Inc., Oct. 31, 2001.
Spiller, Karen, “Low-decibel earbuds keep noise at a reasonable level”, The Telegraph Online, dated Aug. 13, 2006, http://www.nashuatelegraph.com/apps/pbcs.dll/article?Date=20060813&Cate . . . , downloaded Aug. 16, 2006.
Steinberg, “Sonicblue Rio Car,” Product Review, Dec. 12, 2000, http://electronics.cnet.com/electronics/0-6342420-1304-4098389.html.
Travis Butler, “Archos Jukebox 6000 Challenges Nomad Jukebox,” Aug. 13, 2001, http://db.tidbits.com/getbits.acgi?tbart=06521.
Travis Butler, “Portable MP3: The Nomad Jukebox,” Jan. 8, 2001, http://db.tidbits.com/getbits.acgi?tbart=06261.
U.S. Appl. No. 11/621,541, “Personalized Podcasting Podmapping,” filed Jan. 9, 2007.
Waterproof Music Player with FM Radio and Pedometer User Manual, Oregon Scientific, 2005.
Related Publications (1)
Number Date Country
20170251306 A1 Aug 2017 US
Continuations (2)
Number Date Country
Parent 13660839 Oct 2012 US
Child 15433810 US
Parent 11144541 Jun 2005 US
Child 13660839 US