Playback device

Information

  • Patent Grant
  • Patent Number
    11,550,539
  • Date Filed
    Monday, November 22, 2021
  • Date Issued
    Tuesday, January 10, 2023
Abstract
A system is described for maintaining synchrony of operations among a plurality of devices that have independent clocking arrangements. The system includes a task distribution device that distributes tasks to a synchrony group comprising a plurality of devices that are to perform the tasks distributed by the task distribution device in synchrony. The task distribution device distributes each task to the members of the synchrony group over a network. Each task is associated with a time stamp that indicates a time, relative to a clock maintained by the task distribution device, at which the members of the synchrony group are to execute the task. Each member of the synchrony group periodically obtains from the task distribution device an indication of the current time indicated by its clock, determines a time differential between the task distribution device's clock and its respective clock and determines therefrom a time at which, according to its respective clock, the time stamp indicates that it is to execute the task.
Description
FIELD OF THE INVENTION

The present invention relates generally to the field of digital data processing devices, and more particularly to systems and methods for synchronizing operations among a plurality of independently-clocked digital data processing devices. The invention is embodied in a system for synchronizing operations among a plurality of devices, in relation to information that is provided by a common source. One embodiment of the invention enables synchronizing of audio playback as among two or more audio playback devices that receive audio information from a common information source, or channel.


More generally, the invention relates to the field of arrangements that synchronize output generated by a number of output generators, including audio output, video output, combinations of audio and video, as well as other types of output as will be appreciated by those skilled in the art, provided by a common channel. Generally, the invention will find utility in connection with any type of information for which synchrony among independently-clocked devices is desired.


BACKGROUND OF THE INVENTION

There are a number of circumstances under which it is desirable to maintain synchrony of operations among a plurality of independently-clocked digital data processing devices in relation to, for example, information that is provided thereto by a common source. For example, systems are being developed in which one audio information source can distribute audio information in digital form to a number of audio playback devices for playback. The audio playback devices receive the digital information and convert it to analog form for playback. The audio playback devices may be located in the same room or they may be distributed in different rooms in a residence such as a house or an apartment, in different offices in an office building, or the like. For example, in a system installed in a residence, one audio playback device may be located in a living room, while another audio playback device may be located in a kitchen, and yet other audio playback devices may be located in various bedrooms of a house. In such an arrangement, the audio information that is distributed to the various audio playback devices may relate to the same audio program, or the information may relate to different audio programs. If the audio information source provides audio information relating to the same audio program to two or more audio playback devices at the same time, the audio playback devices will generally contemporaneously play the same program. For example, if the audio information source provides audio information to audio playback devices located in the living room and kitchen in a house at the same time, they will generally contemporaneously play the same program.


One problem that can arise is ensuring that, if two or more audio playback devices are contemporaneously attempting to play back the same audio program, they do so simultaneously. Small differences in the audio playback devices' start times and/or playback speeds can be perceived by a listener as an echo effect, and larger differences can be very annoying. Differences can arise for a number of reasons, including delays in the transfer of audio information over the network. Such delays can differ as among the various audio playback devices for a variety of reasons, including where they are connected into the network, message traffic, and other reasons as will be apparent to those skilled in the art.


Another problem arises from the following. When an audio playback device converts the digital audio information from digital to analog form, it does so using a clock that provides timing information. Generally, the audio playback devices that are being developed have independent clocks, and, if they are not clocking at precisely the same rate, the audio playback provided by the various devices can get out of synchronization.


SUMMARY OF THE INVENTION

The invention provides a new and improved system and method for synchronizing operations among a number of digital data processing devices that are regulated by independent clocking devices. Generally, the invention will find utility in connection with any type of information for which synchrony among devices connected to a network is desired. The invention is described in connection with a plurality of audio playback devices that receive digital audio information that is to be played back in synchrony, but it will be appreciated that the invention can find usefulness in connection with any kind of information for which coordination among devices that have independent clocking devices would find utility.


In brief summary, the invention provides, in one aspect, a system for maintaining synchrony of operations among a plurality of devices that have independent clocking arrangements. The system includes a task distribution device that distributes tasks to a synchrony group comprising a plurality of devices that are to perform the tasks distributed by the task distribution device in synchrony. The task distribution device distributes each task to the members of the synchrony group over a network. Each task is associated with a time stamp that indicates a time, relative to a clock maintained by the task distribution device, at which the members of the synchrony group are to execute the task. Each member of the synchrony group periodically obtains from the task distribution device an indication of the current time indicated by its clock, determines a time differential between the task distribution device's clock and its respective clock and determines therefrom a time at which, according to its respective clock, the time stamp indicates that it is to execute the task.
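
The timestamp-and-differential arithmetic described above can be expressed in a few lines. The following Python sketch is purely illustrative; the patent defines no API, and all function and parameter names here are invented:

```python
# Illustrative arithmetic: a synchrony-group member measures the differential
# between the task distribution device's clock and its own clock, then
# converts each task's time stamp into its local clock.

def clock_differential(distributor_time: float, local_time: float) -> float:
    """Offset to add to a distributor-clock value to obtain a local-clock value."""
    return local_time - distributor_time

def local_execution_time(task_timestamp: float, differential: float) -> float:
    """Time, per the member's own clock, at which the task should execute."""
    return task_timestamp + differential

# Example: when the distributor's clock reads 1000.0, the member's clock reads
# 1002.5, so a task stamped 1010.0 (distributor time) runs at local time 1012.5.
diff = clock_differential(distributor_time=1000.0, local_time=1002.5)
print(local_execution_time(task_timestamp=1010.0, differential=diff))  # 1012.5
```

Note that the member never adjusts its own clock; it only learns the offset and translates time stamps, which is why the devices can remain independently clocked.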


In one embodiment, the tasks that are distributed include audio information for an audio track that is to be played by all of the devices comprising the synchrony group synchronously. The audio track is divided into a series of frames, each of which is associated with a time stamp indicating the time, relative to the clock maintained by an audio information channel device, which, in that embodiment, serves as the task distribution device, at which the members of the synchrony group are to play the respective frame. Each member of the synchrony group, using a very accurate protocol, periodically obtains the time indicated by the audio information channel device, and determines a differential between the time as indicated by its local clock and the audio information channel device's clock. The member uses the differential and the time as indicated by the time stamp to determine the time, relative to its local clock, at which it is to play the respective frame. The members of the synchrony group do this for all of the frames, and accordingly are able to play the frames in synchrony.
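
The per-frame scheduling this paragraph describes can be sketched as follows, assuming a clock differential has already been measured between the member's clock and the audio information channel device's clock. The `Frame` fields and all names are invented for illustration:

```python
# Each frame carries a time stamp in the channel device's clock; the member
# converts it with its measured differential and plays frames in time order.

from dataclasses import dataclass

@dataclass
class Frame:
    seq: int
    channel_timestamp: float   # play time per the channel device's clock
    samples: bytes

def playback_schedule(frames, differential):
    """Pair each frame with its play time on the member's local clock."""
    scheduled = [(f.channel_timestamp + differential, f) for f in frames]
    return sorted(scheduled, key=lambda pair: pair[0])

frames = [Frame(seq=1, channel_timestamp=10.000, samples=b".."),
          Frame(seq=0, channel_timestamp=9.975, samples=b"..")]
for local_time, frame in playback_schedule(frames, differential=2.5):
    print(frame.seq, local_time)     # frames come out in play order
```

Because every member applies the same channel-device time stamps, each converted to its own clock, the members play each frame at the same instant even though their local clock readings differ.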





BRIEF DESCRIPTION OF THE DRAWINGS

This invention is pointed out with particularity in the appended claims. The above and further advantages of this invention may be better understood by referring to the following description taken in conjunction with the accompanying drawings, in which:



FIG. 1 schematically depicts an illustrative networked audio system, constructed in accordance with the invention;



FIG. 2 schematically depicts a functional block diagram of a synchrony group utilizing a plurality of zone players formed within the networked audio system depicted in FIG. 1;



FIG. 2A schematically depicts two synchrony groups, illustrating how a member of one synchrony group can provide audio information to the members of another synchrony group;



FIG. 3 depicts a functional block diagram of a zone player for use in the networked audio system depicted in FIG. 1; and



FIG. 4 is useful in understanding a digital audio information framing methodology useful in the network audio system depicted in FIG. 1.





DETAILED DESCRIPTION OF AN ILLUSTRATIVE EMBODIMENT


FIG. 1 depicts an illustrative network audio system 10 constructed in accordance with the invention. With reference to FIG. 1, the network audio system 10 includes a plurality of zone players 11(1) through 11(N) (generally identified by reference numeral 11(n)) interconnected by a local network 12, all of which operate under control of one or more user interface modules generally identified by reference numeral 13. One or more of the zone players 11(n) may also be connected to one or more audio information sources, which will generally be identified herein by reference numeral 14(n)(s), and/or one or more audio reproduction devices, which will generally be identified by reference numeral 15(n)(r). In the reference numeral 14(n)(s), index “n” refers to the index “n” of the zone player 11(n) to which the audio information source is connected, and the index “s” (s=1, . . . , Sn) refers to the “s-th” audio information source connected to that “n-th” zone player 11(n). Thus, if, for example, a zone player 11(n) is connected to four audio information sources 14(n)(1) through 14(n)(4), the audio information sources may be generally identified by reference numeral 14(n)(s), with Sn=4. It will be appreciated that the number of audio information sources Sn may vary as among the various zone players 11(n), and some zone players may not have any audio information sources connected thereto. Similarly, in the reference numeral 15(n)(r), index “n” refers to the index “n” of the zone player 11(n) to which the audio reproduction device is connected, and the index “r” (r=1, . . . , Rn) refers to the “r-th” audio reproduction device connected to that “n-th” zone player 11(n). In addition to the audio information sources 14(n)(s), the network audio system 10 may include one or more audio information sources 16(1) through 16(M) connected through appropriate network interface devices (not separately shown) to the local network 12.
Furthermore, the local network may include one or more network interface devices (also not separately shown) that are configured to connect the local network 12 to other networks, including a wide area network such as the Internet, the public switched telephony network (PSTN) or other networks as will be apparent to those skilled in the art, over which connections to audio information sources may be established.
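
The 14(n)(s)/15(n)(r) numbering scheme can be mirrored in a simple data structure. The dict layout below is invented, purely to make the indexing concrete:

```python
# Zone player n has S_n audio information sources 14(n)(1..S_n) and
# R_n audio reproduction devices 15(n)(1..R_n).

zone_players = {
    1: {"sources": ["14(1)(1)", "14(1)(2)"], "reproduction": ["15(1)(1)"]},
    2: {"sources": [],                       "reproduction": ["15(2)(1)"]},
}

def s_n(n: int) -> int:
    """Number of audio information sources attached to zone player n."""
    return len(zone_players[n]["sources"])

def r_n(n: int) -> int:
    """Number of audio reproduction devices attached to zone player n."""
    return len(zone_players[n]["reproduction"])

# S_n varies per zone player, and some players have no sources at all.
print(s_n(1), r_n(1), s_n(2))  # 2 1 0
```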


The zone players 11(n) associated with system 10 may be distributed throughout an establishment such as a residence, an office complex, a hotel, a conference hall, an amphitheater or auditorium, or other types of establishments as will be apparent to those skilled in the art. For example, if the zone players 11(n) and their associated audio information source(s) and/or audio reproduction device(s) are distributed throughout a residence, one, such as zone player 11(1) and its associated audio information source(s) and audio reproduction device(s), may be located in a living room, another may be located in a kitchen, another may be located in a dining room, and yet others may be located in respective bedrooms, to selectively provide entertainment in those rooms. On the other hand, if the zone players 11(n) and their associated audio information source(s) and/or audio reproduction device(s) are distributed throughout an office complex, one may, for example, be provided in each office to selectively provide entertainment to the employees in the respective offices. Similarly, if the zone players 11(n) and associated audio information source(s) and/or audio reproduction device(s) are used in a hotel, they may be distributed throughout the rooms to provide entertainment to the guests. Similar arrangements may be used with zone players 11(n) and associated audio information source(s) and/or audio reproduction device(s) used in an amphitheater or auditorium. Other arrangements in other types of environments will be apparent to those skilled in the art. In each case, the zone players 11(n) can be used to selectively provide entertainment in the respective locations, as will be described below.


The audio information sources 14(n)(s) and 16(m) may be any of a number of types of conventional sources of audio information, including, for example, compact disc (“CD”) players, AM and/or FM radio receivers, analog or digital tape cassette players, analog record turntables and the like. In addition, the audio information sources 14(n)(s) and 16(m) may comprise digital audio files stored locally on, for example, personal computers (PCs), personal digital assistants (PDAs), or similar devices capable of storing digital information in volatile or non-volatile form. As noted above, the local network 12 may also have an interface (not shown) to a wide area network, over which the network audio system 10 can obtain audio information. Moreover, one or more of the audio information sources 14(n)(s) may also comprise an interface to a wide area network such as the Internet, the public switched telephony network (PSTN) or any other source of audio information. In addition, one or more of the audio information sources 14(n)(s) and 16(m) may comprise interfaces to radio services delivered over, for example, satellite. Audio information obtained over the wide area network may comprise, for example, streaming digital audio information such as Internet radio, digital audio files stored on servers, and other types of audio information and sources as will be appreciated by those skilled in the art. Other arrangements and other types of audio information sources will be apparent to those skilled in the art.


Generally, the audio information sources 14(n)(s) and 16(m) provide audio information associated with audio programs to the zone players for playback. A zone player that receives audio information from an audio information source 14(n)(s) that is connected thereto can provide playback and/or forward the audio information, along with playback timing information, over the local network 12 to other zone players for playback. Similarly, each audio information source 16(m) that is not directly connected to a zone player can transmit audio information over the network 12 to any zone player 11(n) for playback. In addition, as will be explained in detail below, the respective zone player 11(n) can transmit the audio information that it receives either from an audio information source 14(n)(s) connected thereto, or from an audio information source 16(m), to selected ones of the other zone players 11(n′), 11(n″), . . . (n not equal to n′, n″, . . . ) for playback by those other zone players. The other zone players 11(n′), 11(n″), . . . to which the zone player 11(n) transmits the audio information for playback may be selected by a user using the user interface module 13. In that operation, the zone player 11(n) will transmit the audio information to the selected zone players 11(n′), 11(n″), . . . over the network 12. As will be described below in greater detail, the zone players 11(n), 11(n′), 11(n″), . . . operate such that the zone players 11(n′), 11(n″), . . . synchronize their playback of the audio program with the playback by the zone player 11(n), so that the zone players 11(n), 11(n′), 11(n″) provide the same audio program at the same time.


Users, using user interface module 13, may also enable different groupings or sets of zone players to provide audio playback of different audio programs synchronously. For example, a user, using a user interface module 13, may enable zone players 11(1) and 11(2) to play one audio program, audio information for which may be provided by, for example, one audio information source 14(1)(1). The same or a different user may, using the same or a different user interface module 13, enable zone players 11(4) and 11(5) to contemporaneously play another audio program, audio information for which may be provided by a second audio information source, such as audio information source 14(5)(2). Further, a user may enable zone player 11(3) to contemporaneously play yet another audio program, audio information for which may be provided by yet another audio information source, such as audio information source 16(1). As yet another possibility, a user may contemporaneously enable zone player 11(1) to provide audio information from an audio information source connected thereto, such as audio information source 14(1)(2), to another zone player, such as zone player 11(6) for playback.


In the following, the term “synchrony group” will be used to refer to a set of one or more zone players that are to play the same audio program synchronously. Thus, in the above example, zone players 11(1) and 11(2) comprise one synchrony group, zone player 11(3) comprises a second synchrony group, zone players 11(4) and 11(5) comprise a third synchrony group, and zone player 11(6) comprises yet a fourth synchrony group. Thus, while zone players 11(1) and 11(2) are playing the same audio program, they will play the audio program synchronously. Similarly, while zone players 11(4) and 11(5) are playing the same audio program, they will play the audio program synchronously. On the other hand, zone players that are playing different audio programs may do so with unrelated timings. That is, for example, the timing with which zone players 11(1) and 11(2) play their audio program may have no relationship to the timing with which zone player 11(3), zone players 11(4) and 11(5), and zone player 11(6) play their audio programs. It will be appreciated that, since “synchrony group” is used to refer to sets of zone players that are playing the same audio program synchronously, zone player 11(1) will not be part of zone player 11(6)'s synchrony group, even though zone player 11(1) is providing the audio information for the audio program to zone player 11(6).
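
The example grouping above can be modeled as plain data. The representation below is hypothetical, not from the patent; it illustrates that a synchrony group is a set of zone players playing the same program in synchrony, and that a player supplying audio (here 11(1) supplying 11(6)) need not belong to the receiving group:

```python
# The four synchrony groups of the running example, as invented plain data.

synchrony_groups = [
    {"members": {"11(1)", "11(2)"}, "source": "14(1)(1)"},
    {"members": {"11(3)"},          "source": "16(1)"},
    {"members": {"11(4)", "11(5)"}, "source": "14(5)(2)"},
    {"members": {"11(6)"},          "source": "14(1)(2)"},  # provided via 11(1)
]

def group_of(player: str):
    """Return the synchrony group containing the given zone player, if any."""
    for group in synchrony_groups:
        if player in group["members"]:
            return group
    return None

# 11(1) provides the audio for 11(6)'s group but is not one of its members:
print("11(1)" in group_of("11(6)")["members"])  # False
```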


In the network audio system 10, the synchrony groups are not fixed. Users can enable them to be established and modified dynamically. Continuing with the above example, a user may enable the zone player 11(1) to begin providing playback of the audio program provided thereto by audio information source 14(1)(1), and subsequently enable zone player 11(2) to join the synchrony group. Similarly, a user may enable the zone player 11(5) to begin providing playback of the audio program provided thereto by audio information source 14(5)(2), and subsequently enable zone player 11(4) to join that synchrony group. In addition, a user may enable a zone player to leave a synchrony group and possibly join another synchrony group. For example, a user may enable the zone player 11(2) to leave the synchrony group with zone player 11(1), and join the synchrony group with zone player 11(6). As another possibility, the user may enable the zone player 11(1) to leave the synchrony group with zone player 11(2) and join the synchrony group with zone player 11(6). In connection with the last possibility, the zone player 11(1) can continue providing audio information from the audio information source 14(1)(1) to the zone player 11(2) for playback thereby.
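
Dynamic membership, as described above, amounts to moving a zone player between mutable sets, with a player belonging to at most one synchrony group at a time. A toy sketch with invented names:

```python
# Groups are mutable sets; joining one group implies leaving any other.

def leave(groups, player):
    for group in groups:
        group.discard(player)

def join(groups, player, index):
    leave(groups, player)              # a player is in at most one group
    groups[index].add(player)

groups = [{"11(1)", "11(2)"}, {"11(6)"}]
join(groups, "11(2)", 1)               # 11(2) leaves 11(1)'s group, joins 11(6)'s
print([sorted(group) for group in groups])  # [['11(1)'], ['11(2)', '11(6)']]
```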


A user, using the user interface module 13, can enable a zone player 11(n) that is currently not a member of a synchrony group to join a synchrony group, after which it will be enabled to play the audio program that is currently being played by that synchrony group. Similarly, a user, also using the user interface module 13, can enable a zone player 11(n) that is currently a member of one synchrony group, to disengage from that synchrony group and join another synchrony group, after which that zone player will be playing the audio program associated with the other synchrony group. For example, if a zone player 11(6) is currently not a member of any synchrony group, it, under control of the user interface module 13, can become a member of a synchrony group, after which it will play the audio program being played by the other members of the synchrony group, in synchrony with the other members of the synchrony group. In becoming a member of the synchrony group, zone player 11(6) can notify the zone player that is the master device for the synchrony group that it wishes to become a member of its synchrony group, after which that zone player will also transmit audio information associated with the audio program, as well as timing information, to the zone player 11(6). As the zone player 11(6) receives the audio information and the timing information from the master device, it will play the audio information with the timing indicated by the timing information, which will enable the zone player 11(6) to play the audio program in synchrony with the other zone player(s) in the synchrony group.
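
The join handshake described above (the joining zone player notifies the group's master device, which then begins transmitting audio and timing information to it) can be sketched as follows. The class and method names are invented for illustration:

```python
# Minimal join handshake: request_join notifies the master, after which the
# master includes the new member in every distribution of audio + timing info.

class MasterDevice:
    def __init__(self):
        self.members = []

    def handle_join(self, player):
        self.members.append(player)        # start serving the new member

    def distribute(self, frame):
        for player in self.members:        # audio info and timing info together
            player.receive(frame)

class ZonePlayer:
    def __init__(self, name):
        self.name = name
        self.received = []

    def request_join(self, master):
        master.handle_join(self)           # "I wish to become a member"

    def receive(self, frame):
        self.received.append(frame)        # to be played at frame["timestamp"]

master = MasterDevice()
p6 = ZonePlayer("11(6)")
p6.request_join(master)
master.distribute({"timestamp": 10.0, "samples": b".."})
print(len(p6.received))  # 1
```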


Similarly, if a user, using the user interface module 13, enables a zone player 11(n) associated with a synchrony group to disengage from that synchrony group, and if the zone player 11(n) is not the master device of the synchrony group, the zone player 11(n) can notify the master device, after which the master device can terminate transmission of the audio information and timing information to the zone player 11(n). If the user also enables the zone player 11(n) to begin playing another audio program using audio information from an audio information source 14(n)(s) connected thereto, it will acquire the audio information from the audio information source 14(n)(s) and initiate playback thereof. If the user enables another zone player 11(n′) to join the synchrony group associated with zone player 11(n), operations in connection therewith can proceed as described immediately above.


As yet another possibility, if a user, using the user interface module 13, enables a zone player 11(n) associated with a synchrony group to disengage from that synchrony group and join another synchrony group, and if the zone player is not the master device of the synchrony group from which it is disengaging, the zone player 11(n) can notify the master device of the synchrony group from which it is disengaging, after which that zone player will terminate transmission of audio information and timing information to the zone player 11(n) that is disengaging. Contemporaneously, the zone player 11(n) can notify the master device of the synchrony group that it (that is, zone player 11(n)) is joining, after which the master device can begin transmission of audio information and timing information to that zone player 11(n). The zone player 11(n) can thereafter begin playback of the audio program defined by the audio information, in accordance with the timing information so that the zone player 11(n) will play the audio program in synchrony with the master device.


As yet another possibility, a user, using the user interface module 13, may enable a zone player 11(n) that is not associated with a synchrony group, to begin playing an audio program using audio information provided to it by an audio information source 14(n)(s) connected thereto. In that case, the user, also using the user interface module 13 or a user interface device that is specific to the audio information source 14(n)(s), can enable the audio information source 14(n)(s) to provide audio information to the zone player 11(n). After the zone player 11(n) has begun playback, or contemporaneously therewith, the user, using the user interface module 13, can enable other zone players 11(n′), 11(n″), . . . to join zone player 11(n)'s synchrony group and enable that zone player 11(n) to transmit audio information and timing information thereto as described above, to facilitate synchronous playback of the audio program by the other zone players 11(n′), 11(n″) . . . .


A user can use the user interface module 13 to control other aspects of the network audio system 10, including but not limited to the selection of the audio information source 14(n)(s) that a particular zone player 11(n) is to utilize, the volume of the audio playback, and so forth. In addition, a user may use the user interface module 13 to turn audio information source(s) 14(n)(s) on and off and to enable them to provide audio information to the respective zone players 11(n).


Operations performed by the various devices associated with a synchrony group will be described in connection with FIG. 2, which schematically depicts a functional block diagram of a synchrony group in the network audio system 10 described above in connection with FIG. 1. With reference to FIG. 2, a synchrony group 20 includes a master device 21 and zero or more slave devices 22(1) through 22(G) (generally identified by reference numeral 22(g)), all of which synchronously play an audio program provided by an audio information channel device 23. Each of the master device 21, slave devices 22(g) and audio information channel device 23 utilizes a zone player 11(n) depicted in FIG. 1, although it will be clear from the description below that a zone player may be utilized both for the audio information channel device for the synchrony group 20, and the master device 21 or a slave device 22(g) of the synchrony group 20. As will be described below in more detail, the audio information channel device 23 obtains the audio information for the audio program from an audio information source, adds playback timing information, and transmits the combined audio and playback timing information to the master device 21 and slave devices 22(g) over the network 12 for playback. The playback timing information that is provided with the audio information, together with clock timing information provided by the audio information channel device 23 to the various devices 21 and 22(g) as will be described below, enables the master device 21 and slave devices 22(g) of the synchrony group 20 to play the audio information simultaneously.
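
The role structure of FIG. 2 (one master device, zero or more slave devices, and an audio information channel device, where the channel role may share a zone player with a member role) can be captured in a small data type. This is an illustrative sketch, not the patent's representation:

```python
# Invented representation of a FIG. 2 synchrony group and its roles.

from dataclasses import dataclass, field

@dataclass
class SynchronyGroup:
    master: str
    slaves: list = field(default_factory=list)
    channel_device: str = ""

group = SynchronyGroup(master="11(1)", slaves=["11(2)", "11(3)"],
                       channel_device="11(1)")   # master doubles as channel device

def members(g: SynchronyGroup) -> list:
    """Playback members: the master plus the slaves (the channel role is separate)."""
    return [g.master] + g.slaves

print(members(group), group.channel_device == group.master)
```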


The master device 21 and the slave devices 22(g) receive the audio and playback timing information, as well as the clock timing information, that are provided by the audio information channel device 23, and play back the audio program defined by the audio information. The master device 21 is also the member of the synchrony group 20 that communicates with the user interface module 13 and that controls the operations of the slave devices 22(g) in the synchrony group 20. In addition, the master device 21 controls the operations of the audio information channel device 23 that provides the audio and playback timing information for the synchrony group 20. Generally, the initial master device 21 for the synchrony group will be the first zone player 11(n) that a user wishes to have play an audio program. However, as will be described below, the zone player 11(n) that operates as the master device 21 can be migrated from one zone player 11(n) to another zone player 11(n′), which preferably will be a zone player that is currently operating as a slave device 22(g) in the synchrony group.


In addition, under certain circumstances, as will be described below, the zone player 11(n) that operates as the audio information channel device 23 can be migrated from one zone player to another zone player, which also will preferably be a zone player that is currently operating as a member of the synchrony group 20. It will be appreciated that the zone player that operates as the master device 21 can be migrated to another zone player independently of the migration of the audio information channel device 23. For example, if one zone player 11(n) is operating as both the master device 21 and the audio information channel device 23 for a synchrony group 20, the master device 21 can be migrated to another zone player 11(n′) while the zone player 11(n) is still operating as the audio information channel device 23. Similarly, if one zone player 11(n) is operating as both the master device 21 and the audio information channel device 23 for a synchrony group 20, the audio information channel device 23 can be migrated to another zone player 11(n′) while the zone player 11(n) is still operating as the master device 21. In addition, if one zone player 11(n) is operating as both the master device 21 and the audio information channel device 23 for a synchrony group 20, the master device 21 can be migrated to another zone player 11(n′) and the audio information channel device can be migrated to a third zone player 11(n″).
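
The independent migrations described above can be sketched as two separate operations on hypothetical group state. All names are invented, and the detail that the old master stays on as a slave is an assumption made here for illustration:

```python
# Two independent role migrations: the master role moves (preferably to a
# current slave), and the channel-device role moves separately.

group = {"master": "11(1)", "channel_device": "11(1)",
         "slaves": ["11(2)", "11(3)"]}

def migrate_master(g, new_master):
    if new_master in g["slaves"]:          # prefer a current slave, per the text
        g["slaves"].remove(new_master)
        g["slaves"].append(g["master"])    # assumption: old master becomes a slave
        g["master"] = new_master

def migrate_channel_device(g, new_device):
    g["channel_device"] = new_device       # independent of the master role

migrate_master(group, "11(2)")
migrate_channel_device(group, "11(3)")
print(group["master"], group["channel_device"])  # 11(2) 11(3)
```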


The master device 21 receives control information from the user interface module 13 for controlling the synchrony group 20 and provides status information indicating the operational status of the synchrony group to the user interface module 13. Generally, the control information from the user interface module 13 enables the master device 21 to, in turn, enable the audio information channel device 23 to provide audio and playback timing information to the synchrony group to enable the devices 21 and 22(g) that are members of the synchrony group 20 to play the audio program synchronously. In addition, the control information from the user interface module 13 enables the master device 21 to, in turn, enable other zone players to join the synchrony group as slave devices 22(g) and to enable slave devices 22(g) to disengage from the synchrony group. Control information from the user interface module 13 can also enable the zone player 11(n) that is currently operating as the master device 21 to disengage from the synchrony group, but prior to doing so that zone player will enable the master device 21 to transfer from that zone player 11(n) to another zone player 11(n′), preferably to a zone player 11(n′) that is currently a slave device 22(g) in the synchrony group 20. The control information from the user interface module 13 can also enable the master device 21 to adjust its playback volume and to enable individual ones of the various slave devices 22(g) to adjust their playback volumes. In addition, the control information from the user interface module 13 can enable the synchrony group 20 to terminate playing of a current track of the audio program and skip to the next track, and to re-order tracks in a play list of tracks defining the audio program that is to be played by the synchrony group 20.
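
The control operations listed above suggest a simple command dispatch. The command names and state layout below are invented; the patent only enumerates the operations, not their encoding:

```python
# Invented dispatch for a few of the control operations: per-device volume,
# skipping to the next track, and re-ordering the play list.

state = {"volume": {"11(1)": 50, "11(2)": 50},
         "playlist": ["A", "B", "C"], "track": 0}

def set_volume(s, player, level):
    s["volume"][player] = level            # individual playback volume

def skip_track(s):
    s["track"] += 1                        # terminate current track, go to next

def reorder(s, order):
    s["playlist"] = [s["playlist"][i] for i in order]

commands = {"set_volume": set_volume, "skip_track": skip_track,
            "reorder": reorder}

commands["set_volume"](state, "11(2)", 30)
commands["skip_track"](state)
commands["reorder"](state, [2, 0, 1])
print(state["volume"]["11(2)"], state["track"], state["playlist"])
```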


The status information that the master device 21 may provide to the user interface module 13 can include such information as a name or other identifier for the track of the audio work that is currently being played, the names or other identifiers for upcoming tracks, the identifier of the zone player 11(n) that is currently operating as the master device 21, and identifiers of the zone players that are currently operating as slave devices 22(g). In one embodiment, the user interface module 13 includes a display (not separately shown) that can display the status information to the user.


It will be appreciated that the zone player 11(n) that is operating as the audio information channel device 23 for one synchrony group may also comprise the master device 21 or any of the slave devices 22(g) in another synchrony group. This may occur if, for example, the audio information source that is to provide the audio information that is to be played by the one synchrony group is connected to a zone player also being utilized as the master device or a slave device for the other synchrony group. This will be schematically illustrated below in connection with FIG. 2A. Since, as noted above, the zone player 11(n) that is operating as the audio information channel device 23 for the synchrony group 20 may also be operating as a master device 21 or slave device 22(g) for another synchrony group, it can also be connected to one or more audio reproduction devices 15(n)(r), although that is not depicted in FIG. 2. Since the master device 21 and slave devices 22(g) are all to provide playback of the audio program, they will be connected to respective audio reproduction devices 15(n)(r). Furthermore, it will be appreciated that one or more of the zone players 11(n) that operate as the master device 21 and slave devices 22(g) in synchrony group 20 may also operate as an audio information channel device for that synchrony group or for another synchrony group and so they may be connected to one or more audio information sources 14(n)(s), although that is also not depicted in FIG. 2. In addition, it will be appreciated that a zone player 11(n) can also operate as an audio information channel device 23 for multiple synchrony groups.


If the audio information channel device 23 does not utilize the same zone player as the master device 21, the master device 21 controls the audio information channel device by exchanging control information over the network 12 with the audio information channel device 23. The control information is represented in FIG. 2 by the arrow labeled CHAN_DEV_CTRL_INFO. The control information that the master device 21 provides to the audio information channel device 23 will generally depend on the nature of the audio information source that is to provide the audio information for the audio program that is to be played and the operation to be enabled by the control information. If, for example, the audio information source is a conventional compact disc, tape, or record player, broadcast radio receiver, or the like, which is connected to a zone player 11(n), the master device 21 may merely enable the zone player serving as the audio information channel device 23 to receive the audio information for the program from the audio information source. It will be appreciated that, if the audio information is not in digital form, the audio information channel device 23 will convert it to digital form and provide the digitized audio information, along with the playback timing information, to the master device 21 and slave devices 22(g).


On the other hand, if the audio information source is, for example, a digital data storage device, such as may be on a personal computer or similar device, the master device 21 can provide a play list to the audio information channel device 23 that identifies one or more files containing the audio information for the audio program. In that case, the audio information channel device 23 can retrieve the files from the digital data storage device and provide them, along with the playback timing information, to the master device 21 and the slave devices 22(g). It will be appreciated that, in this case, the audio information source may be directly connected to the audio information channel device 23, as, for example, an audio information source 14(n)(s), or it may comprise an audio information source 16(m) connected to the network 12. As a further alternative, if the audio information source is a source available over the wide area network, the master device 21 can provide a play list comprising a list of web addresses identifying the files containing the audio information for the audio program that is to be played, and in that connection the audio information channel device 23 can initiate a retrieval of the files over the wide area network. As yet another alternative, if the audio information source is a source of streaming audio received over the wide area network, the master device 21 can provide a network address from which the streaming audio can be received. Other arrangements by which the master device 21 can control the audio information channel device 23 will be apparent to those skilled in the art.


The master device 21 can also provide control information to the synchrony group's audio information channel device 23 to enable a migration from one zone player 11(n) to another zone player 11(n′). This may occur if, for example, the audio information source is one of audio information sources 16 or a source accessible over the wide area network via the network 12. The master device 21 can enable migration of the audio information channel device 23 for several reasons, including, for example, to reduce the loading of the zone player 11(n), to improve latency of message transmission in the network 12, and other reasons as will be appreciated by those skilled in the art.


As noted above, the audio information channel device 23 provides audio and playback timing information for the synchrony group to enable the master device 21 and slave devices 22(g) to play the audio program synchronously. Details of the audio and playback timing information will be described below in connection with FIGS. 3 and 4, but, in brief, the audio information channel device 23 transmits the audio and playback timing information in messages over the network 12 using a multi-cast message transmission methodology. In that methodology, the audio information channel device 23 will transmit the audio and playback timing information in a series of messages, with each message being received by all of the zone players 11(n) comprising the synchrony group 20, that is, by the master device 21 and the slave devices 22(g). Each of the messages includes a multi-cast address, which the master device 21 and slave devices 22(g) will monitor and, when they detect a message with that address, they will receive and use the contents of the message. The audio and playback timing information is represented in FIG. 2 by the arrow labeled “AUD+PBTIME_INFO,” which has a single tail, representing a source for the information at the audio information channel device 23, and multiple arrowheads representing the destinations of the information, with one arrowhead extending to the master device 21 and other arrowheads extending to each of the slave devices 22(g) in the synchrony group 20. The audio information channel device 23 may make use of any convenient multi-cast message transmission methodology in transmitting the audio and playback timing information to the synchrony group 20. As will be described in detail in connection with FIG. 4, the audio and playback timing information is in the form of a series of frames, with each frame having a time stamp.
The time stamp indicates a time, relative to the time indicated by a clock maintained by the audio information channel device 23, at which the frame is to be played. Depending on the size or sizes of the messages used in the selected multi-cast message transmission methodology and the size or sizes of the frames, a message may contain one frame, or multiple frames, or, alternatively, a frame may extend across several messages.
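The framing arrangement can be sketched as follows. This is a minimal illustration in Python; the frame layout, field names, and millisecond time units are assumptions made for the example rather than details taken from the description above:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    samples: bytes     # digitized audio for this frame
    timestamp_ms: int  # play-at time, in the channel device's clock domain

def build_frames(pcm: bytes, frame_bytes: int, start_ms: int, frame_ms: int):
    """Split a buffer of digitized audio into fixed-size frames, stamping
    each with the time, per the audio information channel device's clock,
    at which it is to be played."""
    return [
        Frame(pcm[off:off + frame_bytes], start_ms + i * frame_ms)
        for i, off in enumerate(range(0, len(pcm), frame_bytes))
    ]
```

A multi-cast message could then carry one or more such frames, or a large frame could be split across several messages, depending on the message sizes of the chosen methodology.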


The audio information channel device 23 also provides clock time information to the master device 21 and each of the slave devices 22(g) individually over network 12 using a highly accurate clock time information transmission methodology. The distribution of the clock time information is represented in FIG. 2 by the arrows labeled “AICD_CLK_INF (M)” (in the case of the clock time information provided to the master device 21) and “AICD_CLK_INF (S1)” through “AICD_CLK_INF (SG)” (in the case of audio information channel device clock information provided to the slave devices 22(g)). In one embodiment, the master device 21 and slave devices 22(g) make use of the well-known SNTP (Simple Network Time Protocol) to obtain current clock time information from the audio information channel device 23. The SNTP makes use of a unicast message transfer methodology, in which one device, such as the audio information channel device 23, provides clock time information to a specific other device, such as the master device 21 or a slave device 22(g), using the other device's network, or unicast, address. Each of the master device 21 and slave devices 22(g) will periodically initiate SNTP transactions with the audio information channel device 23 to obtain the clock time information from the audio information channel device 23. As will be described below in more detail, the master device 21 and each slave device 22(g) make use of the clock time information to determine the time differential between the time indicated by the audio information channel device's clock and the time indicated by its respective clock, and use that time differential value, along with the playback time information associated with the audio information and the respective device's local time as indicated by its clock to determine when the various frames are to be played. This enables the master device 21 and the slave devices 22(g) in the synchrony group 20 to play the respective frames simultaneously.
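The time-differential computation that each member performs can be sketched as follows. This is a simplified midpoint estimate in the spirit of SNTP, using a single request/response pair and hypothetical function names rather than the full protocol:

```python
def clock_differential(local_send, channel_time, local_recv):
    """Estimate (channel device clock - local clock), assuming the channel
    device read its clock at the midpoint of the request/response round trip."""
    midpoint = (local_send + local_recv) / 2.0
    return channel_time - midpoint

def local_play_time(frame_timestamp, differential):
    """Translate a frame's time stamp from the channel device's clock domain
    into the equivalent time on this device's own clock."""
    return frame_timestamp - differential
```

Because every member translates the same time stamp through its own differential, the master device 21 and slave devices 22(g) arrive at the same physical instant and play the frame simultaneously.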


As noted above, the control information provided by the user to the master device 21 through the user interface module 13 can also enable the master device 21 to, in turn, enable another zone player 11(n′) to join the synchrony group as a new slave device 22(g). In that operation, the user interface module 13 will provide control information, including the identification of the zone player 11(n′) that is to join the synchrony group, to the master device 21. After it receives the identification of the zone player 11(n′) that is to join the synchrony group, the master device 21 will exchange control information, which is represented in FIG. 2 by the arrows labeled SLV_DEV_CTRL_INF (S1) through SLV_DEV_CTRL_INF (SG) (group slave control information), over the network 12 with the zone player 11(n′) that is identified in the control information from the user interface module 13. The control information that the master device 21 provides to the new zone player 11(n′) includes the network address of the zone player 11(n) that is operating as the audio information channel device 23 for the synchrony group, as well as the multi-cast address that the audio information channel device 23 is using to broadcast the audio and playback timing information over the network. The zone player that is to operate as the new slave device 22(g′) uses the multi-cast address to begin receiving the multi-cast messages that contain the audio information for the audio program being played by the synchrony group.


It will be appreciated that, if the zone player 11(n) that is operating as the master device 21 for the synchrony group 20 is also operating the audio information channel device 23, and if there are no slave devices 22(g) in the synchrony group 20, the audio information channel device 23 may not be transmitting audio and playback timing information over the network. In that case, if the new slave device 22(g′) is the first slave device in the synchrony group, the zone player 11(n) that is operating as both the master device 21 and audio information channel device 23, can begin transmitting the audio and playback timing information over the network 12 when the slave device 22(g′) is added to the synchrony group 20. The zone player 11(n) can maintain a count of the number of slave devices 22(g) in the synchrony group 20 as they join and disengage, and, if the number drops to zero, it can stop transmitting the audio and playback timing information over the network 12 to reduce the message traffic over the network 12.
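The slave-count bookkeeping described above can be sketched as follows (illustrative only; the actual multi-cast transmission machinery is omitted):

```python
class ChannelDeviceState:
    """Tracks slave membership so that audio and playback timing information
    is multicast only while the synchrony group has at least one slave."""
    def __init__(self):
        self.slave_count = 0
        self.transmitting = False

    def slave_joined(self):
        self.slave_count += 1
        self.transmitting = True   # begin (or continue) multicasting

    def slave_left(self):
        self.slave_count -= 1
        if self.slave_count == 0:
            self.transmitting = False  # stop to reduce network traffic
```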


The new slave device 22(g′) added to the synchrony group 20 uses the network address of the audio information channel device 23 for several purposes. In particular, the new slave device 22(g′) will, like the master device 21 (assuming the zone player 11(n) operating as the master device 21 is not also the audio information channel device 23), engage in SNTP transactions with the audio information channel device 23 to obtain the clock timing information from the audio information channel device 23. In addition, the new slave device 22(g′) can notify the audio information channel device 23 that it is a new slave device 22(g′) for the synchrony group 20 and provide the audio information channel device 23 with its network address. As will be described below, in one embodiment, particularly in connection with audio information obtained from a source, such as a digital data storage device, which can provide audio information at a rate that is faster than the rate at which it will be played, the audio information channel device 23 will buffer audio and timing information and broadcast it over the network 12 to the synchrony group 20 generally at a rate at which it is provided by the source. Accordingly, when a new slave device 22(g′) joins the synchrony group 20, the playback timing information may indicate that the audio information that is currently being broadcast by the audio information channel device 23 using the multi-cast methodology is to be played back some time in the future. To reduce the delay with which the new slave device 22(g′) will begin playback, the audio information channel device 23 can also retransmit previously transmitted audio and timing information that it had buffered to the new slave device 22(g′) using the unicast network address of the slave device 22(g′).
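The catch-up retransmission to a newly joined slave device can be sketched as follows; the buffer capacity, frame representation, and method names are assumptions made for the example:

```python
from collections import deque

class FrameHistory:
    """Keeps recently multicast (timestamp, payload) frames so that a newly
    joined slave can be sent the recent past over its unicast address."""
    def __init__(self, capacity):
        self.frames = deque(maxlen=capacity)

    def record_multicast(self, timestamp, payload):
        self.frames.append((timestamp, payload))

    def catch_up(self, now):
        """Buffered frames whose play time is still in the future, i.e. those
        worth retransmitting by unicast so the new slave can start promptly."""
        return [f for f in self.frames if f[0] > now]
```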


The master device 21 can also use the slave device control information exchanged with the slave devices 22(g) for other purposes. For example, the master device 21 can use the slave device control information to initiate a migration of the master from its zone player 11(n) to another zone player 11(n′). This may occur for any of a number of reasons, including, for example, that the master device 21 is terminating playback by it of the audio program and is leaving the synchrony group 20, but one or more of the other devices in the synchrony group is to continue playing the audio program. The master device 21 may also want to initiate a migration if it is overloaded, which can occur if, for example, the zone player 11(n) that is the master device 21 for its synchrony group is also operating as an audio information channel device 23 for another synchrony group.


The user can also use the user interface module 13 to adjust playback volume by the individual zone players 11(n) comprising the synchrony group. In that operation, the user interface module 13 provides information identifying the particular device whose volume is to be adjusted, and the level at which the volume is to be set to the master device 21. If the device whose volume is to be adjusted is the master device 21, the master device 21 can adjust its volume according to the information that it receives from the user interface module 13. On the other hand, if the device whose volume is to be adjusted is a slave device 22(g), the master device 21 can provide group slave control information to the respective slave device 22(g), to enable it to adjust its volume.
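The master device's routing of a volume adjustment can be sketched as follows; the dispatch, identifiers, and return values are all illustrative assumptions:

```python
def route_volume_command(target, level, master, slaves):
    """Apply the adjustment locally if the target is the master device;
    otherwise forward it as group slave control information."""
    if target == master:
        return ("set_local_volume", level)
    if target in slaves:
        return ("send_slave_ctrl_info", target, level)
    raise ValueError("target is not a member of the synchrony group")
```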


The user can also use the user interface module 13 to enable a synchrony group 20 to cancel playing of the track in an audio program that is currently being played, and to proceed immediately to the next track. This may occur, for example, if the tracks for the program are in the form of a series of digital audio information files, and the user wishes to cancel playback of the track that is defined by one of the files. In that case, when the master device 21 receives the command to cancel playback of the current track, it will provide channel device control information to the audio information channel device 23 so indicating. In response, the audio information channel device 23 inserts control information into the audio and playback timing information, which will be referred to as a “resynchronize” command. In addition, the audio information channel device 23 will begin transmitting audio information for the next track, with timing information to enable it to be played immediately. The resynchronize command can also enable playback of a track to be cancelled before it has been played. Details of these operations will be described below.
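A receiving device's view of the resynchronize command can be sketched as follows; the stream item encoding is an assumption made for the example:

```python
def apply_resynchronize(buffered, incoming):
    """Process a stream of ('frame', ...) and ('resync',) items: a resync
    drops everything queued for the cancelled track, so the next track's
    frames (sent with immediate timing) are played without delay."""
    queue = list(buffered)
    for item in incoming:
        if item[0] == "resync":
            queue.clear()
        else:
            queue.append(item)
    return queue
```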


As noted above, there may be multiple synchrony groups in the network audio system 10, and further that, for example, a zone player 11(n) may operate both as a master device 21 or a slave device 22(g) in one synchrony group, and as the audio information channel device 23 providing audio and playback timing information and clock timing information for another synchrony group. An illustrative arrangement of this will be described in connection with FIG. 2A. With reference to FIG. 2A, that FIG. depicts elements of two synchrony groups, identified by reference numerals 20(1) and 20(2), respectively. For clarity, FIG. 2A does not show a number of elements, the presence of which would be evident from FIGS. 1 and 2 as described above. For example, FIG. 2A does not depict the audio information sources from which audio information is obtained for the synchrony groups or the audio reproduction devices that are used to produce sound for the master and slave devices, which are depicted in both FIGS. 1 and 2. In addition, FIG. 2A does not depict arrows that represent control information provided by the respective master devices to the slave devices in the respective synchrony groups, or to the audio information channel devices that provide audio and timing information for the respective synchrony groups, which are depicted in FIG. 2. In addition, FIG. 2A does not depict the arrows that represent the clock timing information provided by the audio information channel devices to the respective members of the respective synchrony groups, which are also depicted in FIG. 2. As will be noted below, however, FIG. 2A does depict arrows representing the audio and playback timing information provided by the respective audio information channel devices for the respective synchrony groups 20(1), 20(2), to the master and slave devices comprising the respective synchrony groups 20(1), 20(2).


Each synchrony group 20(1), 20(2) comprises elements of a number of zone players. A functional block diagram of a zone player will be described below in connection with FIG. 3. Synchrony group 20(1) includes a master device 21(1) and “K” slave devices 22(1)(1) through 22(K)(1) (the index “1” in reference numeral 21(1) and the last index in reference numerals 22(1)(1) through 22(K)(1) corresponds to the index of the synchrony group 20(1) to which they belong) that utilize zone players 11(1) through 11(K+1), respectively. Similarly, synchrony group 20(2) includes a master device 21(2) and “L” slave devices 22(1)(2) through 22(L)(2) that utilize zone players 11(K+2) through 11(K+L+2). In the illustrative arrangement depicted in FIG. 2A, both synchrony groups 20(1) and 20(2) are controlled by the user interface module 13, which can provide control information to, and receive status information from, the master devices 21(1) and 21(2) independently. It will be appreciated that separate user interface modules may be provided to provide control information to, and receive status information from, the respective master devices 21(1), 21(2).


As noted above, the slave device 22(1)(2) in synchrony group 20(2) utilizes zone player 11(K+3). In the illustrative arrangement depicted in FIG. 2A, the audio information channel device 23(1) that provides audio and playback timing information to the master and slave devices 21(1), 22(1)(1), . . . , 22(K)(1) of synchrony group 20(1) also utilizes zone player 11(K+3). As noted above, this may occur if, for example, the audio information source that is to provide audio information to be played by the synchrony group 20(1) is connected to the zone player 11(K+3). Thus, when the master device 21(1) of synchrony group 20(1) exchanges channel device control information with the audio information channel device 23(1), it is effectively exchanging channel device control information with the zone player 11(K+3). Similarly, when the master and slave devices 21(1), 22(1)(1), . . . , 22(K)(1) of synchrony group 20(1) receive audio and playback timing information, as well as clock timing information, from the audio information channel device 23(1), they are effectively receiving the information from the zone player 11(K+3). FIG. 2A depicts a multi-headed arrow representing audio and playback timing information transmitted by the zone player 11(K+3), as audio information channel device 23(1), to the master and slave devices 21(1), 22(1)(1), . . . , 22(K)(1) comprising synchrony group 20(1).


On the other hand, in the illustrative arrangement depicted in FIG. 2A, the synchrony group 20(2) utilizes a zone player 11(K+L+3) as its audio information channel device 23(2). As with synchrony group 20(1), when the master device 21(2) of synchrony group 20(2) exchanges channel device control information with the audio information channel device 23(2), it is effectively exchanging channel device control information with the zone player 11(K+L+3). Similarly, when the master and slave devices 21(2), 22(1)(2), . . . , 22(L)(2) of synchrony group 20(2) receive audio and playback timing information, as well as clock timing information, from the audio information channel device 23(2), they are effectively receiving the information from the zone player 11(K+L+3). FIG. 2A depicts a multi-headed arrow representing audio and playback timing information transmitted by the zone player 11(K+L+3) as audio information channel device 23(2) to the master and slave devices 21(2), 22(1)(2), . . . , 22(L)(2) comprising synchrony group 20(2).


In the illustrative arrangement depicted in FIG. 2A, zone player 11(K+L+3), which is the audio information channel device 23(2) for synchrony group 20(2), is not shown as being either a master or a slave device in another synchrony group. However, it will be appreciated that zone player 11(K+L+3) could also be utilized as the master device or a slave device for another synchrony group. Indeed, it will be appreciated that the zone player that is utilized as the audio information channel device for synchrony group 20(2) may also be a zone player that is utilized as the master device 21(1) or a slave device 22(1)(1), . . . , 22(K)(1) in the synchrony group 20(1).


A zone player 11(n) that is utilized as a member of one synchrony group may also be utilized as the audio information channel device for another synchrony group if the audio information source that is to supply the audio information that is to be played by the other synchrony group is connected to that zone player 11(n). A zone player 11(n) may also be utilized as the audio information channel device for the other synchrony group if, for example, the audio information source is an audio information source 16(m) (FIG. 1) that is connected to the network 12 or an audio information source that is available over a wide area network such as the Internet. The latter may occur if, for example, the zone player 11(n) has sufficient processing power to operate as the audio information channel device and it is in an optimal location in the network 12, relative to the zone players comprising the other synchrony group (that is, the synchrony group for which it is operating as the audio information channel device), for providing the audio and playback timing information to the members of the other synchrony group. Other circumstances under which the zone player 11(n) that is utilized as a member of one synchrony group may also be utilized as the audio information channel device for another synchrony group will be apparent to those skilled in the art.


As was noted above, the master device 21 for a synchrony group 20 may be migrated from one zone player 11(n) to another zone player 11(n′). As was further noted above, the audio information channel device 23 for a synchrony group 20 may be migrated from one zone player 11(n) to another zone player 11(n′). It will be appreciated that the latter may occur if, for example, the audio information source that provides the audio program for the synchrony group is not connected to the zone player 11(n) that is operating as the audio information channel device 23, but instead is one of the audio information sources 16(m) connected to the network 12 or a source available over a wide area network such as the Internet. Operations performed during a migration of an audio information channel device 23 from one zone player 11(n) to another zone player 11(n′) will generally depend on the nature of the audio information that is being channeled by the audio information channel device 23. For example, if the audio information source provides streaming audio, the zone player 11(n) that is currently operating as the audio information channel device 23 for the synchrony group 20, can provide the following information to the other zone player 11(n′) that is to become the audio information channel device 23 for the synchrony group 20:


(a) the identification of the source of the streaming audio information,


(b) the time stamp associated with the frame that the zone player 11(n) is currently forming, and


(c) the identifications of the zone players that are operating as the master device 21 and slave devices 22(g) comprising the synchrony group 20.


After the zone player 11(n′) receives the information from the zone player 11(n), it will begin receiving the streaming audio from the streaming audio information source identified by the zone player 11(n), assemble the streaming audio information into frames, associate each frame with a time stamp, and transmit the resulting audio and playback timing information over the network 12. The zone player 11(n′) will perform these operations in the same manner as described above, except that, instead of using the time indicated by its digital to analog converter clock 34 directly in generating the time stamps for the frames, the initial time stamp will be related to the value of the time stamp that is provided by the zone player 11(n) (reference item (b) above), with the rate at which the time stamps are incremented corresponding to the rate at which its (that is, the zone player 11(n′)'s) clock increments. In addition, the zone player 11(n′) will notify the zone players that are operating as the master device 21 and slave devices 22(g) of the synchrony group 20 that it is the new audio information channel device 23 for the synchrony group 20, and provide the multi-cast address that it will be using to multi-cast the audio and playback timing information, as well as its unicast network address. After the members of the synchrony group 20 receive the notification from the zone player 11(n′) indicating that it is the new audio information channel device 23 for the synchrony group 20, they will receive the audio and playback timing information from the zone player 11(n′) instead of the zone player 11(n), using the multi-cast address provided by the zone player 11(n′). In addition, they can utilize the zone player 11(n′)'s unicast network address to obtain current time information therefrom. 
It will be appreciated that the zone player 11(n′) will determine its current time in relation to the time stamp that is provided by the zone player 11(n) (reference item (b) above) or the current time information that it received from the zone player 11(n) using the SNTP protocol as described above.
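The hand-off and time-stamp continuity for a streaming-audio migration can be sketched as follows; the field names and millisecond frame cadence are illustrative assumptions:

```python
def handoff_info(stream_source, current_stamp_ms, members):
    """Items (a)-(c) that the outgoing channel device provides to its
    successor for a streaming audio source."""
    return {
        "source": stream_source,        # (a) streaming audio source
        "timestamp": current_stamp_ms,  # (b) stamp of the frame being formed
        "members": list(members),       # (c) master and slave zone players
    }

def successor_stamps(info, frame_ms, n_frames):
    """The successor's first stamp is related to the handed-over stamp, and
    subsequent stamps increment at the successor's own clock rate, so that
    playback timing is continuous across the migration."""
    base = info["timestamp"]
    return [base + (i + 1) * frame_ms for i in range(n_frames)]
```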


Generally similar operations can be performed in connection with migrating the audio information channel device from one zone player 11(n) to another zone player 11(n′) if the audio information is from one or more audio information files, such as may be the case if the audio information comprises MP3 or WAV files that are available from sources such as sources 16(m) connected to the network 12 or from sources available over a wide area network such as the Internet, except for differences to accommodate the fact that the audio information is in files. In that case, the zone player 11(n) that is currently operating as the audio information channel device 23 for the synchrony group 20 can provide the following information to the zone player 11(n′) that is to become the audio information channel device 23 for the synchrony group 20:


(d) a list of the audio information files containing the audio information that is to be played;


(e) the identification of the file for which the zone player 11(n) is currently providing audio and playback timing information, along with the offset into the file for which the current item of audio and playback timing information is being generated and the time stamp that the zone player 11(n) is associating with that frame, and


(f) the identifications of the zone players that comprise the master device 21 and slave devices 22(g) comprising the synchrony group 20.


After the zone player 11(n′) receives the information from the zone player 11(n), it will begin retrieving audio information from the file identified in item (e), starting at the identified offset. In addition, the zone player 11(n′) can assemble the retrieved audio information into frames, associate each frame with a time stamp and transmit the resulting audio and playback timing information over the network 12. The zone player 11(n′) will perform these operations in the same manner as described above, except that, instead of using the time indicated by its digital to analog converter clock 34 directly in generating the time stamps for the frames, the value of the initial time stamp will be related to the time stamp that is provided by the zone player 11(n) (reference item (e) above), with the rate at which the time stamps are incremented corresponding to the rate at which its (that is, the zone player 11(n′)'s) clock increments. In addition, the zone player 11(n′) will notify the zone players that are operating as the master device 21 and slave devices 22(g) of the synchrony group 20 that it is the new audio information channel device 23 for the synchrony group 20, and provide the multi-cast address that it will be using to multi-cast the audio and playback timing information, as well as its unicast network address. After the members of the synchrony group 20 receive the notification from the zone player 11(n′) indicating that it is the new audio information channel device 23 for the synchrony group 20, they will receive the audio and playback timing information from the zone player 11(n′) instead of the zone player 11(n), using the multi-cast address provided by the zone player 11(n′). In addition, they can utilize the zone player 11(n′)'s unicast network address to obtain current time information therefrom. 
It will be appreciated that the zone player 11(n′) will determine its current time in relation to the time stamp that is provided by the zone player 11(n) (reference item (e) above) or the current time information that it received from the zone player 11(n) using the SNTP protocol as described above. The zone player 11(n′) will process successive audio information files in the list that it receives from the zone player 11(n) (reference item (d)).


Operations performed by the zone players 11(n) and 11(n′) in connection with migration of the audio information channel device 23 for other types of audio information will be apparent to those skilled in the art. In any case, preferably, the zone player 11(n) will continue operating as an audio information channel device 23 for the synchrony group 20 for at least a brief time after it notifies the zone player 11(n′) that it is to become audio information channel device for the synchrony group, so that the zone player 11(n′) will have time to notify the zone players in the synchrony group 20 that it is the new audio information channel device 23 for the synchrony group.


Before proceeding further in describing operations performed by the network audio system 10, it would be helpful to provide a detailed description of a zone player 11(n) constructed in accordance with the invention. FIG. 3 depicts a functional block diagram of a zone player 11(n) constructed in accordance with the invention. All of the zone players in the network audio system 10 may have similar construction. With reference to FIG. 3, the zone player 11(n) includes an audio information source interface 30, an audio information buffer 31, a playback scheduler 32, a digital to analog converter 33, an audio amplifier 35, an audio reproduction device interface 36, a network communications manager 40, and a network interface 41, all of which operate under the control of a control module 42. The zone player 11(n) also has a device clock 43 that provides timing signals that control the general operations of the zone player 11(n). In addition, the zone player 11(n) includes a user interface module interface 44 that can receive control signals from the user interface module 13 (FIGS. 1 and 2) for controlling operations of the zone player 11(n), and provide status information to the user interface module 13.


Generally, the audio information buffer 31 buffers audio information, in digital form, along with playback timing information. If the zone player 11(n) is operating as the audio information channel device 23 (FIG. 2) for a synchrony group 20, the information that is buffered in the audio information buffer 31 will include the audio and playback timing information that will be provided to the devices 21 and 22(g) in the synchrony group 20. If the zone player 11(n) is operating as the master device 21 or a slave device 22(g) for a synchrony group, the information that is buffered in the audio information buffer 31 will include the audio and playback timing information that the zone player 11(n) is to play.


The audio information buffer 31 can receive audio and playback timing information from two sources, namely, the audio information source interface 30 and the network communications manager 40. In particular, if the zone player 11(n) is operating as the audio information channel device 23 for a synchrony group 20, and if the audio information source is a source 14(n)(s) connected to the zone player 11(n), the audio information buffer 31 will receive and buffer audio and playback timing information from the audio information source interface 30. On the other hand, if the zone player 11(n) is operating as the audio information channel device 23 for a synchrony group 20, and if the audio information source is a source 16(m) connected to the network 12, or a source available over the wide area network, the audio information buffer 31 will receive and buffer audio and playback timing information from the network communications manager 40. It will be appreciated that, if the zone player 11(n) is not a member of the synchrony group, the zone player 11(n) will not play this buffered audio and playback timing information.


As a third case, if the zone player 11(n) is operating as the master device 21 or a slave device 22(g) in a synchrony group, and if the zone player 11(n) is not also the audio information channel device 23 providing audio and playback timing information for the synchrony group 20, the audio information buffer 31 will receive and buffer audio and playback timing information from the network communications manager 40.


The audio information source interface 30 connects to the audio information source(s) 14(n)(s) associated with the zone player 11(n). While the zone player 11(n) is operating as audio information channel device 23 for a synchrony group 20, and if the audio information is to be provided by a source 14(n)(s) connected to the zone player 11(n), the audio information source interface 30 will selectively receive audio information from one of the audio information source(s) 14(n)(s) to which the zone player is connected and store the audio information in the audio information buffer 31. If the audio information from the selected audio information source 14(n)(s) is in analog form, the audio information source interface 30 will convert it to digital form. The selection of the audio information source 14(n)(s) from which the audio information source interface 30 receives audio information is under control of the control module 42, which, in turn, receives control information from the user interface module through the user interface module interface 44. The audio information source interface 30 adds playback timing information to the digital audio information and buffers the combined audio and playback timing information in the audio information buffer 31.


More specifically, as noted above, the audio information source interface 30 receives audio information from an audio information source 14(n)(s), converts it to digital form if necessary, and buffers it along with playback timing information in the audio information buffer 31. In addition, the audio information source interface 30 will also provide formatting and scheduling information for the digital audio information, whether as received from the selected audio information source 14(n)(s) or as converted from an analog audio information source. As will be made clear below, the formatting and scheduling information will control not only playback by the zone player 11(n) itself, but will also enable other zone players 11(n′), 11(n″), . . . that may be in a synchrony group for which the zone player 11(n) is the master device, to play the audio program associated with the audio information in synchrony with the zone player 11(n).


In one particular embodiment, the audio information source interface 30 divides the audio information associated with an audio work into a series of frames, with each frame comprising digital audio information for a predetermined period of time. As used herein, an audio track may comprise any unit of audio information that is to be played without interruption. On the other hand, an audio program may comprise a series of one or more audio tracks that are to be played in succession. It will be appreciated that the tracks comprising the audio program may also be played without interruption, or alternatively playback between tracks may be interrupted by a selected time interval. FIG. 4 schematically depicts an illustrative framing strategy used in connection with one embodiment of the invention for a digital audio stream comprising an audio work. More specifically, FIG. 4 depicts a framed digital audio stream 50 comprising a sequence of frames 51(1) through 51(F) (generally identified by reference numeral 51(f)). Each frame 51(f), in turn, comprises a series of audio samples 52(f)(1) through 52(f)(S) (generally identified by reference numeral 52(f)(s)) of the audio track. Preferably all of the frames will have the same number “S” of audio samples, although it will be appreciated from the following that this is primarily for convenience. On the other hand, it will be appreciated that the number of audio samples may differ from “S”; this may particularly be the case if the frame 51(f) contains the last audio samples for the digital audio stream for a particular audio work. In that case, the last frame 51(F) will preferably contain samples 52(F)(1) through 52(F)(x), where “x” is less than “S.” Generally, it is desirable that the number of samples be consistent among all frames 51(f), and in that case padding, which will not be played, can be added to the last frame 51(F).
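The framing step described above can be sketched briefly. The following is an illustrative reading of the text, not the patented implementation; the function and field names are hypothetical.

```python
# Split a stream of audio samples into frames of a fixed size "S",
# zero-padding the last frame so that all frames have the same length.
# A toy value of S is used here; real frames hold far more samples.

def frame_stream(samples, samples_per_frame, pad_value=0):
    frames = []
    for start in range(0, len(samples), samples_per_frame):
        frame = list(samples[start:start + samples_per_frame])
        valid = len(frame)                    # non-padding sample count
        frame += [pad_value] * (samples_per_frame - valid)
        frames.append({"samples": frame, "valid": valid})
    return frames

frames = frame_stream(list(range(10)), samples_per_frame=4)
# The last frame holds two valid samples (8, 9) and two padding zeros.
```

Tracking the count of valid samples alongside each frame corresponds to the role of the length flag field 62 described below: a player must know where the padding begins so that it is not played.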


Associated with each frame 51(f) is a header 55(f) that includes a number of fields for storing other information that is useful in controlling playback of the audio samples in the respective frame 51(f). In particular, the header 55(f) associated with a frame 51(f) includes a frame sequence number field 56, an encoding type field 57, a sampling rate information field 58, a time stamp field 60, an end of work flag 61, and a length flag field 62. The header 55(f) may also include fields (not shown) for storing other information that is useful in controlling playback. Generally, the frame sequence number field 56 receives a sequence number “f” that identifies the relative position of the frame 51(f) in the sequence of frames 51(1) . . . 51(f) . . . 51(F) containing the digital audio stream 50. The encoding type field 57 receives a value that identifies the type of encoding and/or compression that has been used in generating the digital audio stream. Conventional encoding or compression schemes include, for example, the well-known MP3 and WAV encoding and/or compression schemes, although it will be appreciated that other schemes may be provided for as well. The sampling rate information field 58 receives sampling rate information that indicates the sampling rate for the audio samples 52(f)(s). As will be apparent to those skilled in the art, the sampling rate determines the rate at which the zone player 11(n) is to play the audio samples 52(f)(s) in the frame, and, as will be described below, determines the period of the digital to analog converter clock 34.


The condition of the end of work flag 61 indicates whether the frame 51(f) contains the last digital audio samples for the audio track associated with the framed digital audio work 50. If the frame 51(f) does not contain the audio samples that are associated with the end of the digital audio stream 50 for a respective audio work, the end of work flag will be clear. On the other hand, if the frame 51(f) does contain the audio samples that are associated with the end of the digital audio stream 50 for a respective audio work, the end of work flag 61 will be set. In addition, since the number of valid audio samples 52(F)(s) in the frame 51(F), that is, the samples that are not padding, may be less than “S,” the default number of audio samples in a frame 51(f), the length flag field 62 will contain a value that identifies the number of audio samples 52(F)(s) in the last frame 51(F) of the audio work 50. If, as noted above, the frames have a consistent number “S” of samples, the samples 52(F)(x+1) through 52(F)(S) will contain padding, which will not be played.
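The header fields described above can be summarized as a record. This sketch uses hypothetical Python names that mirror fields 56 through 62; the patent does not specify any in-memory or wire layout.

```python
from dataclasses import dataclass

# Hypothetical record mirroring the header fields described above.
@dataclass
class FrameHeader:
    sequence_number: int       # field 56: position "f" in the stream
    encoding_type: str         # field 57: e.g. "MP3" or "WAV"
    sampling_rate: int         # field 58: drives the DAC clock period
    time_stamp: float          # field 60: when the frame is to play
    end_of_work: bool = False  # flag 61: set on a track's last frame
    length: int = 0            # field 62: valid samples in the last frame

hdr = FrameHeader(sequence_number=0, encoding_type="WAV",
                  sampling_rate=44100, time_stamp=0.0)
```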


The time stamp field 60 stores a time stamp that identifies the time at which the zone player 11(n) is to play the respective frame. More specifically, for each frame of a framed digital audio stream 50 that is buffered in the audio information buffer 31, the audio information source interface 30, using timing information from the digital to analog converter clock 34, will determine a time at which the zone player 11(n) is to play the respective frame, and stores a time stamp identifying the playback time in the time stamp field 60. The time stamp associated with each frame will later be used by the playback scheduler 32 to determine when the portion of the digital audio stream stored in the frame is to be coupled to the digital to analog converter 33 to initiate playback. It will be appreciated that the time stamps that are associated with frames in sequential frames 51(1), 51(2), . . . , 51(F), will be such that they will be played back in order, and without an interruption between the sequential frames comprising the digital audio stream 50. It will further be appreciated that, after a time stamp has been determined for the first frame, stored in frame 51(1), of a digital audio stream 50, the audio information source interface 30 can determine time stamps for the subsequent frames 51(2), 51(3), . . . , 51(F) in relation to the number of samples “S” in the respective frames and the sample rate. The time stamps will also preferably be such that frames will be played back after some slight time delay after they have been buffered in the audio information buffer 31; the purpose for the time delay will be made clear below.
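The arithmetic implied above — deriving successive time stamps from the first frame's stamp, the frame size “S,” and the sample rate — can be sketched as follows. The names are illustrative, and time is assumed to be in seconds.

```python
# Given the first frame's stamp, each later frame is due one frame
# duration (S samples / sample rate) after its predecessor.

def frame_time_stamps(first_stamp, num_frames, samples_per_frame, sample_rate):
    frame_duration = samples_per_frame / sample_rate   # seconds per frame
    return [first_stamp + f * frame_duration for f in range(num_frames)]

# Three frames of 44100 samples at 44.1 kHz, scheduled to begin 0.5 s
# in the future to allow time for buffering:
stamps = frame_time_stamps(0.5, 3, 44100, 44100)
# → [0.5, 1.5, 2.5]
```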


Returning to FIG. 3, in addition to dividing the digital audio information into frames, the audio information source interface 30 also aggregates and/or divides the frames 51(f) as necessary into packets, each of which will be of a length that would fit into a message for transmission over the network, and associates each packet with a packet sequence number. For example, if a packet will accommodate multiple frames 51(f), 51(f+1), . . . 51(f+y−1), it will aggregate them into a packet and associate them with a packet number, for example p(x). If frames 51(f) through 51(f+y−1) are accommodated in their entirety in packet p(x), where “x” is the sequence number, which will occur if the size of a packet is an exact multiple of the frame size, the next packet, p(x+1), will begin with frame 51(f+y) and will include frames 51(f+y), . . . , 51(f+2y−1). Subsequent packets p(x+2), . . . will be formed in a similar manner. On the other hand, if the packet length will not accommodate an exact multiple of the frame size, the last frame in the packet will be continued at the beginning of the next packet.
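For the exact-multiple case described above, the aggregation of frames into numbered packets can be sketched as follows; the names are illustrative, and the split-frame continuation case is omitted for brevity.

```python
# Aggregate frames into numbered packets, "y" frames per packet.

def packetize(frames, frames_per_packet, first_seq=0):
    packets = []
    for i in range(0, len(frames), frames_per_packet):
        packets.append({"seq": first_seq + i // frames_per_packet,
                        "frames": frames[i:i + frames_per_packet]})
    return packets

pkts = packetize(["f0", "f1", "f2", "f3", "f4"], frames_per_packet=2)
# pkts[0] carries ("f0", "f1"); the final packet carries only "f4".
```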


If the audio information source interface 30 is aware of track boundaries, which may be the case if the tracks are divided into files, the packets will reflect the track boundaries, that is, the packets will not contain frames from two tracks. Thus, if the last frames associated with a track are insufficient to fill a packet, the packet will contain padding from the last frame associated with the track to the end of the packet, and the next packet will begin with the first frames associated with the next track.


In one embodiment, the audio information source interface 30 stores the packets in the audio information buffer 31 in a ring buffer. As is conventional, a ring buffer includes a series of storage locations, each of which is sufficient to store one packet. Four pointers are used in connection with the ring buffer: a first pointer pointing to the beginning of the ring buffer, a second pointer pointing to the end of the ring buffer, a third “write” pointer pointing to the entry into which a packet will be written, and a fourth “read” pointer pointing to the entry from which a packet will be read for use in playback. When a packet is read from the ring buffer for playback, it will be read from the entry pointed to by the read pointer. After the packet has been read, the read pointer will be advanced. If the read pointer points beyond the end of the ring buffer, as indicated by the end pointer, it will be reset to point to the entry pointed to by the beginning pointer, and the operations can be repeated.


On the other hand, when the audio information source interface 30 stores a packet in the ring buffer, it will first determine whether the entry pointed to by the write pointer is the same entry as the entry pointed to by the read pointer. If the write pointer points to the same entry as the read pointer, the entry contains at least a portion of a packet that has not yet been read for playback, and the audio information source interface 30 will delay storage of the packet until the entire packet has been read and the read pointer advanced. After the read pointer has been advanced, the audio information source interface 30 can store the packet in the entry pointed to by the write pointer. After the packet has been stored, the audio information source interface 30 will advance the write pointer. If the write pointer points beyond the end of the ring buffer, as indicated by the end pointer, it will be reset to point to the entry pointed to by the beginning pointer, and the operations can be repeated.
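The ring-buffer behavior described in the two paragraphs above can be sketched minimally. In this sketch the beginning and end pointers are implicit in the bounds of a Python list, and the delay-until-read rule is modelled by refusing a write into an unread entry; this is an assumption-laden illustration, not the patented implementation.

```python
class RingBuffer:
    """Sketch: list indices 0 and capacity-1 stand in for the begin and
    end pointers; the read/write pointers advance and wrap as described."""

    def __init__(self, capacity):
        self.entries = [None] * capacity
        self.read = 0        # next entry to read for playback
        self.write = 0       # next entry to fill with a packet
        self.count = 0       # occupied entries

    def put(self, packet):
        if self.count == len(self.entries):
            return False     # target entry unread: the writer must wait
        self.entries[self.write] = packet
        self.write = (self.write + 1) % len(self.entries)  # wrap at end
        self.count += 1
        return True

    def get(self):
        if self.count == 0:
            return None
        packet = self.entries[self.read]
        self.read = (self.read + 1) % len(self.entries)    # wrap at end
        self.count -= 1
        return packet

rb = RingBuffer(2)
rb.put("p0"); rb.put("p1")
blocked = rb.put("p2")       # False: both entries are still unread
first = rb.get()             # "p0"; the writer may now store "p2"
```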


As noted above, the zone player 11(n) can operate both as the audio information channel device 23 for a synchrony group 20 and as a member of that synchrony group. In that case, the audio information buffer 31 can contain one ring buffer. On the other hand, the zone player 11(n) can operate as an audio information channel device 23 for one synchrony group 20(1) (FIG. 2A) and as a member of another synchrony group 20(2). In that case, the audio information buffer 31 would maintain two ring buffers, one for the audio and timing information associated with synchrony group 20(1), and the other for the audio and timing information associated with synchrony group 20(2). It will be appreciated that, in the latter case, the zone player 11(n) will only use the audio and timing information that is associated with synchrony group 20(2) for playback.


The playback scheduler 32 schedules playback of the audio information that is buffered in the audio information buffer 31 that is to be played by the zone player 11(n). Accordingly, under control of the playback scheduler 32, the digital audio information that is buffered in the audio information buffer 31 that is to be played by the zone player 11(n) is transferred to the digital to analog converter 33 for playback. As noted above, if the zone player 11(n) is operating as an audio information channel device 23 for a synchrony group 20 of which it is not a member, the playback scheduler 32 will not schedule the digital audio information that is to be played by that synchrony group 20 for playback. The playback scheduler 32 only schedules the digital audio information, if any, that is buffered in the audio information buffer 31 that is associated with a synchrony group for which the zone player 11(n) is a member, whether as master device 21 or a slave device 22(g).


Essentially, the playback scheduler 32 makes use of the read pointer associated with the circular buffer that contains the audio and playback timing information that is to be played by the zone player 11(n). The playback scheduler 32 retrieves the packet information from the entry of the ring buffer pointed to by the read pointer, and then advances the read pointer as described above. The playback scheduler 32 determines the boundaries of the frames in the packet and uses the time stamps in the time stamp fields 60 associated with the respective frames 51(f), along with timing information provided by the zone player 11(n)'s digital to analog converter clock 34, to determine when the respective frame is to be transferred to the digital to analog converter 33. Generally, when the time stamp associated with a buffered digital audio information frame corresponds to the current time as indicated by the digital to analog converter clock 34, the playback scheduler 32 will enable the respective frame to be transferred to the digital to analog converter 33.
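The scheduling decision itself is a clock comparison, which can be sketched as follows; `now` stands in for the time indicated by the digital to analog converter clock 34, and the names are hypothetical.

```python
# A frame is released to the digital-to-analog converter once the DAC
# clock reaches the frame's time stamp; later frames remain queued.

def frames_due(buffered_frames, now):
    return [f for f in buffered_frames if f["time_stamp"] <= now]

queue = [{"seq": 0, "time_stamp": 1.0},
         {"seq": 1, "time_stamp": 2.0}]
due = frames_due(queue, now=1.5)    # only frame 0 has come due
```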


The digital to analog converter 33, also under control of the digital to analog converter clock 34, converts the buffered digital audio information to analog form, and provides the analog audio information to the audio amplifier 35 for amplification. The amplified analog information, in turn, is provided to the audio reproduction devices 15(n)(r) through the audio reproduction device interface 36. The audio reproduction devices 15(n)(r) transform the analog audio information signal to sound thereby to provide the audio program to a listener. The amount by which the audio amplifier 35 amplifies the analog signal is controlled by the control module 42, in response to volume control information provided by the user through the user interface module 13.


The network communications manager 40 controls network communications over the network 12, and the network interface 41 transmits and receives message packets over the network 12. The network communications manager 40 generates and receives messages to facilitate the transfer of the various types of information described above in connection with FIG. 2, including the channel device control information, slave device control information, audio and playback timing information and the audio information channel device's clock timing information. In connection with the channel device control information and the slave device control information, the network communications manager 40 will generate messages for transfer over the network 12 in response to control information from the control module 42. Similarly, when the network communications manager 40 receives messages containing channel device control information and slave device control information, the network communications manager will provide the information to the control module 42 for processing.


With regards to the audio information channel device's clock timing information, as noted above, the master device 21 and slave devices 22(g) of the synchrony group 20 obtain the clock timing information from the audio information channel device 23 using the well-known SNTP. If the zone player 11(n) is operating as the audio information channel device 23 for a synchrony group, during the SNTP operation, it will provide its current time, particularly a current time as indicated by its digital to analog converter clock 34. On the other hand, if the zone player 11(n) is operating as the master device 21 or slave device 22(g) of a synchrony group 20, it will receive the clock timing information from the audio information channel device 23. After the respective device 21, 22(g) has obtained the audio information channel device's clock timing information, it will generate a differential time value ΔT representing the difference between the time T indicated by its digital to analog converter clock 34 and the current time information from the audio information channel device 23. The differential time value will be used to update the time stamps for the frames of the digital audio stream 50 (FIG. 4) that are received from the audio information channel device.


With regards to the audio and playback timing information, operations performed by the network communications manager 40 will depend on whether


(i) the audio and playback timing information has been buffered in the audio information buffer 31 for transmission, as audio information channel device 23, over the network 12 to the master device 21 and/or slave devices 22(g) of a synchrony group, or


(ii) the audio and playback timing information has been received from the network 12 to be played by the zone player 11(n) as either the master device 21 for a synchrony group or a slave device in a synchrony group.


It will be appreciated that the network communications manager 40 may be engaged in both (i) and (ii) contemporaneously, since the zone player 11(n) may operate both as the audio information channel device 23(1) for a synchrony group 20(1) (reference FIG. 2A) of which it is not a member, and a member of another synchrony group 20(2) for which another zone player 11(n′) is the audio information channel device 23(2). With reference to item (i) above, after a packet that is to be transmitted has been buffered in the respective ring buffer, the network communications manager 40 retrieves the packet, packages it into a message and enables the network interface 41 to transmit the message over the network 12. If the control module 42 receives control information from the user interface module 13 (if the master device 21 is also the audio information channel device 23 for the synchrony group 20) or from the master device (if the master device 21 is not the audio information channel device 23 for the synchrony group 20) that would require the transmission of a “resynchronize” command as described above, the control module 42 of the audio information channel device 23 enables the network communications manager 40 to insert the command into a message containing the audio and playback timing information. Details of the operations performed in connection with the “resynchronize” command will be described below. As noted above, the “resynchronize” command is used if the user enables a synchrony group to terminate the playback of a track that is currently being played, or cancel playback of a track whose playback has not begun.


On the other hand, with reference to item (ii) above, if the network interface 41 receives a message containing a packet containing frames of audio and playback timing information that the zone player 11(n) is to play either as a master device 21 or a slave device for a synchrony group 20, the network interface 41 provides the audio and playback timing information to the network communications manager 40. The network communications manager 40 will determine whether the packet contains a resynchronize command and, if so, notify the control module 42, which will enable operations to be performed as described below. In any case, the network communications manager 40 will normally buffer the various frames comprising the audio and playback timing information in the audio information buffer 31, and in that operation will generally operate as described above in connection with the audio information source interface 30. Before buffering them, however, the network communications manager 40 will update their time stamps using the time differential value described above. It will be appreciated that the network communications manager 40 will perform similar operations whether the messages that contain the packets were multi-cast messages or unicast messages as described above.


The updating of the time stamps by the master device 21 and the slave devices 22(g) in the synchrony group 20 will ensure that they all play the audio information synchronously. In particular, after the network communications manager 40 has received a frame 51(f) from the network interface 41, it will also obtain, from the digital to analog converter clock 34, the zone player 11(n)'s current time. The network communications manager 40 will determine a time differential value that is the difference between the slave device's current clock time, as indicated by its digital to analog converter clock 34, and the audio information channel device's time as indicated by the audio information channel device's clock timing information. Accordingly, if the master or slave device's current time has a value TS and the audio information channel device's current time, as indicated by the clock timing information, has a value TC, the time differential value ΔT=TS−TC. If the current time of the master or slave device in the synchrony group 20, as indicated by its digital to analog converter clock 34, is ahead of the audio information channel device's clock time as indicated by the clock timing information received during the SNTP operation, the time differential value will have a positive value. On the other hand, if the master or slave device's current time is behind the audio information channel device's clock time, the time differential value ΔT will have a negative value. If the zone player 11(n) obtains clock timing information from the audio information channel device 23 periodically while it is a member of the synchrony group 20, the network communications manager 40 can generate an updated value for the time differential value ΔT when it receives the clock timing information from the audio information channel device 23, and will subsequently use the updated time differential value.
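The differential described above is plain arithmetic, ΔT = TS − TC; a one-line sketch with hypothetical clock values:

```python
# TS: the member device's time per its DAC clock; TC: the channel
# device's time from the SNTP exchange. A positive result means the
# member's clock runs ahead of the channel device's clock.

def time_differential(ts_member, tc_channel):
    return ts_member - tc_channel

ahead = time_differential(100.25, 100.00)    # member 0.25 s ahead
behind = time_differential(99.50, 100.00)    # member 0.5 s behind
```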


The network communications manager 40 uses the time differential value ΔT that it generates from the audio information channel device timing information and the zone player 11(n)'s current time to update the time stamps that will be associated with the digital audio information frames that the zone player 11(n) receives from the audio information channel device. For each digital audio information frame that is received from the audio information channel device, instead of storing the time stamp that is associated with the frame as received in the message in the audio information buffer 31, the network communications manager 40 will store the updated time stamp with the digital audio information frame. The updated time stamp is generated in a manner so that, when the zone player 11(n), as a member of the synchrony group, plays back the digital audio information frame, it will do so in synchrony with the other devices in the synchrony group.


More specifically, after the zone player 11(n)'s network interface 41 receives a message containing a packet that, in turn, contains one or more frames 51(f), it will provide the packet to the network communications manager 40. For each frame 51(f) in the packet that the network communications manager 40 receives from the network interface 41, the network communications manager 40 will add the time differential value ΔT to the frame's time stamp to generate the updated time stamp for the frame 51(f), and store the frame 51(f), along with the header 55(f) with the updated time stamp, in the audio information buffer 31. Thus, for example, if a frame's time stamp has a time value TF, the network communications manager 40 will generate an updated time stamp TUF having a time value TUF=TF+ΔT. Since time value TUF according to the slave device's digital to analog converter clock 34 is simultaneous with the time value TF according to the audio information channel device's digital to analog converter clock 34, the zone player 11(n) will play the digital audio information frame at the time determined by the audio information channel device 23. Since all of the members of the synchrony group 20 will perform the same operations, generating the updated time stamps TUF for the various frames 51(f) in relation to their respective differential time values, all of the zone players 11(n) in the synchrony group 20 will play them synchronously. The network communications manager 40 will generate updated time stamps TUF for all of the time stamps 60 in the packet, and then store the packet in the audio information buffer 31.


It will be appreciated that, before storing a packet in the audio information buffer 31, the network communications manager 40 can compare the updated time stamps TUF associated with the frames in the packet to the slave device's current time as indicated by its digital to analog converter clock 34. If the network communications manager 40 determines that the times indicated by the updated time stamps of the frames 51(f) in the packet are earlier than the zone player's current time, it can discard the packet instead of storing it in the audio information buffer 31, since the zone player 11(n) will not play them. That is, if an updated time stamp has a time value TUF that identifies a time that is earlier than the zone player's current time TS as indicated by the zone player's digital to analog converter clock 34, the network communications manager 40 can discard the packet.
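The two steps described above — rebasing each received frame's stamp by TUF = TF + ΔT, and discarding a packet whose frames are all already past due — can be combined in one sketch. The function and field names are hypothetical.

```python
def rebase_and_filter(packet_frames, delta_t, local_now):
    """Rebase each frame's stamp to the local clock; return None
    (discard) if every updated stamp is already earlier than the
    local clock, since those frames would never be played."""
    updated = [dict(f, time_stamp=f["time_stamp"] + delta_t)
               for f in packet_frames]
    if all(f["time_stamp"] < local_now for f in updated):
        return None
    return updated

frames = [{"seq": 7, "time_stamp": 10.0}]
kept = rebase_and_filter(frames, delta_t=0.25, local_now=5.0)
late = rebase_and_filter(frames, delta_t=0.25, local_now=11.0)  # discarded
```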


If the zone player 11(n) is operating as the master device 21 of a synchrony group 20, when the user, through the user interface module 13, notifies the zone player 11(n) that another zone player 11(n′) is to join the synchrony group 20 as a slave device 22(g), the control module 42 of the zone player 11(n) enables the network communications manager 40 to engage in an exchange of messages, described above in connection with FIG. 2, to enable the other zone player 11(n′) to join the synchrony group 20 as a slave device. As noted above, during the message exchange, the messages generated by the network communications manager 40 of the zone player 11(n) will provide the network communications manager of the zone player 11(n′) that is to join the synchrony group 20 with information such as the multi-cast address being used by the audio information channel device 23 that is providing the audio program to the synchrony group 20, as well as a unicast network address for the audio information channel device 23. After receiving that information, the network communications manager and network interface of the zone player 11(n′) that is to join the synchrony group 20 can begin receiving the multi-cast messages containing the audio program for the synchrony group, engage in SNTP transactions with the audio information channel device 23 to obtain the latter's current time, and also enable the audio information channel device 23 to send the zone player 11(n′) frames 51(f) that it had previously broadcast using the unicast message transmission methodology as described above.


On the other hand, if the network communications manager 40 and network interface 41 of the zone player 11(n) receive a message over the network 12 indicating that it is to become a slave device 22(g) of a synchrony group for which another zone player 11(n′) is the master device, the network communications manager 40 for zone player 11(n) will provide a notification to the control module 42 of zone player 11(n). Thereafter, the control module 42 of zone player 11(n) can enable the network communications manager 40 of zone player 11(n) to perform the operations described above to enable it to join the synchrony group 20.


As noted above, the user, using user interface module 13, can enable the synchrony group to terminate playback of a track of an audio program that is currently being played. After playback of a track that is currently being played has been terminated, playback will continue in a conventional manner with the next track that has been buffered in the audio information buffer 31. It will be appreciated that that could be the next track that is on the original play list, or a previous track. In addition, the user can enable the synchrony group 20 to cancel playback of a track that it has not yet begun to play, but for which buffering of packets has begun in the synchrony group 20. Both of these operations make use of the “resynchronize” command that the master device 21 of the synchrony group 20 can enable the audio information channel device 23 to include in the multi-cast message stream that it transmits to the synchrony group 20. Generally, in response to receipt of the resynchronize command, the members of the synchrony group 20 flush the ring buffer containing the packets that they are to play in the future. In addition, if the members of the synchrony group provide separate buffers for their respective digital to analog converters 33, the members will also flush those buffers as well. After the audio information channel device transmits a packet containing the resynchronize command:


(i) in the case of the use of the resynchronize command to terminate playing of a track that is currently being played, the audio information channel device 23 will begin multi-casting packets for the next track, to begin play immediately, and will continue through the play list in the manner described above; and


(ii) in the case of the use of the resynchronize command to cancel play of a track for which buffering has begun, but which is to be played in the future, the audio information channel device 23 will begin multi-casting packets for the track after the track that has been cancelled, to be played beginning at the time the cancelled track was to begin play, and will also continue through the play list in the manner described above.


It will be appreciated that,


(a) in the first case (item (i) directly above), the resynchronize command can enable the read pointer to be set to point to the entry in the circular buffer into which the first packet for the next track will be written, which will correspond to the entry to which the write pointer points, but


(b) in the second case (item (ii) directly above), the resynchronize command can enable the write pointer for the circular buffer to be set to point to the entry that contains the first packet for the track whose play is being cancelled.


It will further be appreciated that, if a track is cancelled for which buffering has not begun, the resynchronize command will generally not be necessary, since the audio information channel device 23 for the synchrony group 20 need only delete that track from the play list.
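The two pointer adjustments described in items (a) and (b) above can be sketched as follows. This is a minimal illustrative model only; the names (RingBuffer, resync_current_track, resync_future_track) are assumptions for illustration, not identifiers from the system described here:

```python
# Illustrative sketch of the circular-buffer pointer adjustments made in
# response to a resynchronize command. All names are hypothetical.

class RingBuffer:
    def __init__(self, size):
        self.entries = [None] * size
        self.read = 0   # next entry the playback side will consume
        self.write = 0  # next entry the network side will fill

    def push(self, packet):
        self.entries[self.write % len(self.entries)] = packet
        self.write += 1

    def resync_current_track(self):
        # Case (a): terminate the track now playing. Everything buffered so
        # far is discarded; the first packet of the next track will be
        # written at the current write position, so the read pointer is set
        # to catch up with the write pointer.
        self.read = self.write

    def resync_future_track(self, first_cancelled_index):
        # Case (b): cancel a buffered-but-unplayed track. The write pointer
        # is rolled back to the entry holding the cancelled track's first
        # packet, so the replacement packets overwrite it.
        self.write = first_cancelled_index
```

A usage sketch: after buffering packets 0 through 4, a resynchronize for the current track empties the buffer (read equals write), while cancelling a future track whose first packet landed in entry 3 rewinds only the write pointer.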


Operations performed in connection with use of the resynchronize command to cancel playback of a track that is currently being played will be described in connection with Packet Sequence A below, and operations performed in connection with use of the resynchronize command to cancel playback of a track that has not yet begun to play, but for which buffering of packets has begun, will be described in connection with Packet Sequence B below.


Packet Sequence A


(A1.0) [packet 57]


(A1.1) [continuation of frame 99]


(A1.2) [frame 100, time=0:00:01, type=mp3 audio]


(A1.3) [frame 101, time=0:00:02, type=mp3 audio]


(A1.4) [frame 102, time=0:00:03, type=mp3 audio]


(A2.0) [packet 58]


(A2.1) [continuation of frame 102]


(A2.2) [frame 103, time=0:00:04, type=mp3 audio]


(A2.3) [frame 104, time=0:00:05, type=mp3 audio]


(A2.4) [frame 105, time=0:00:06, type=mp3 audio]


(A3.0) [packet 59]


(A3.1) [continuation of frame 105]


(A3.2) [frame 106, time=0:00:07, type=mp3 audio]


(A3.3) [frame 107, time=0:00:08, type=mp3 audio]


(A3.4) [frame 108, time=0:00:09, type=mp3 audio]


(A4.0) [packet 60]


(A4.1) [continuation of frame 108]


(A4.2) [frame 109, time=0:00:10, type=mp3 audio]


(A4.3) [Resynchronize command]


(A4.4) [Padding, if necessary]


(A5.0) [packet 61]


(A5.1) [frame 1, time=0:00:07, type=mp3 audio]


(A5.2) [frame 2, time=0:00:08, type=mp3 audio]


(A5.3) [frame 3, time=0:00:09, type=mp3 audio]


(A5.4) [frame 4, time=0:00:10, type=mp3 audio]




(A6.0) [packet 62]


(A6.1) [continuation of frame 4]


(A6.2) [frame 5, time=0:00:11, type=mp3 audio]


(A6.3) [frame 6, time=0:00:12, type=mp3 audio]


(A6.4) [frame 7, time=0:00:13, type=mp3 audio]


Packet Sequence A comprises a sequence of six packets, identified by packet 57 through packet 62, that the audio information channel device 23 multi-casts in respective messages to the members of a synchrony group 20. It will be appreciated that the series of messages that the audio information channel device 23 may multi-cast to the synchrony group 20 may include messages prior to packet 57, and may also include messages after packet 62. Each packet comprises a packet header, which is symbolized by lines (A1.0), (A2.0), . . . (A6.0) in Packet Sequence A, and will generally also include information associated with at least a portion of a frame. In the packets represented in Packet Sequence A, each packet includes information associated with a plurality of frames. Depending on the lengths of the packets, each packet may contain information associated with a portion of a frame, an entire frame, or multiple frames. In the illustration represented by Packet Sequence A, it is assumed that each packet may contain information associated with multiple frames. In addition, it is assumed that a packet does not necessarily contain information associated with an integral number of frames; in that case, a packet may contain information associated with a portion of a frame, and the next packet will contain the information associated with the rest of that frame.


The frames and associated header playback timing information contained in the various packets are symbolized by lines (A1.1), (A1.2), . . . , (A1.4), (A2.1), . . . (A6.4) of Packet Sequence A. Thus, for example, line (A1.2) of packet 57 represents the one-hundredth frame, that is, frame 51(100) (reference FIG. 4), of the track whose audio information is being transmitted in the sequence of packets that includes packet 57. The frame 51(100) is to be played at an illustrative time, according to the audio information channel device's digital to analog converter clock, of “time=0:00:01,” and the frame is encoded and/or compressed using the well-known MP3 encoding and compression methodology. In that case, the legend “time=0:00:01” represents the time stamp that would be included in field 60 (FIG. 4) of the header associated with the frame 51(100) as multi-cast by the audio information channel device for the synchrony group. It will be appreciated that the playback time and encoding/compression methodology will be reflected in the header 55(100) that is associated with the frame 51(100). It will also be appreciated that the header may contain additional information as described above.


Similarly, line (A1.3) of packet 57 represents the one-hundred and first frame, that is, frame 51(101), of the track whose audio information is being transmitted in the sequence of packets that includes packet 57. The frame 51(101) is to be played at an illustrative time, according to the audio information channel device's digital to analog converter clock, of “0:00:02,” and the frame is also encoded and/or compressed using the MP3 encoding and compression methodology. Line (A1.4) of packet 57 represents similar information, although it will be appreciated that, depending on the length of packet 57, the line may not represent the information for an entire frame 51(102) and/or its associated header. If the length of packet 57 is not sufficient to accommodate the information for the entire frame 51(102) and/or associated header, the information will continue in packet 58, as represented by line (A2.1) in Packet Sequence A. Similarly, if the length of packet 56 was not sufficient to contain the information for the entire frame 51(99) preceding frame 51(100), packet 57 (line (A1.1)) may contain any information from frame 51(99) that packet 56 was unable to accommodate.


As noted above, when the master device 21 or a slave device 22(g) in the synchrony group 20 receives the packet 57, its respective network communications manager 40 will update the time stamps associated with the various frames 51(f) as described above before buffering the respective frames in the respective audio information buffer 31.
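The time-stamp update that a synchrony group member applies before buffering each frame can be sketched as follows. This is a minimal sketch under the description above, in which the member computes a time differential ΔT between its own digital to analog converter clock and the channel device's clock and adds it to each frame's time stamp; the function names are illustrative assumptions:

```python
# Sketch of the per-frame time-stamp update: T_UF = T_F + delta_T, where
# delta_T is the member's estimate of how far its own clock is ahead of the
# audio information channel device's clock. Names are hypothetical.

def time_differential(local_clock_now, channel_device_clock_now):
    """delta_T: the member's clock time minus the channel device's clock time."""
    return local_clock_now - channel_device_clock_now

def update_time_stamp(frame_time_stamp, delta_t):
    """Translate a channel-device time stamp into the member's own timebase."""
    return frame_time_stamp + delta_t
```

For example, if the member's clock reads 105.0 when the channel device's clock reads 100.0, then ΔT is 5.0, and a frame stamped 7.0 in the channel device's timebase is buffered for playback at local time 12.0.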


Packets 58 and 59 contain information that is organized along the lines described above in connection with packet 57.


Packet 60 also contains, as represented by lines (A4.1) and (A4.2), information that is organized along the lines of the information represented by lines (Ax.1) and (Ax.2) (“x” equals an integer) described above in connection with packets 57 through 59. On the other hand, packet 60 contains a resynchronize command, as represented by line (A4.3). Packet 60 also may contain padding, as represented by line (A4.4), following the resynchronize command. As noted above, the master device 21 of a synchrony group 20 will enable the audio information channel device 23 that is providing audio information to the synchrony group 20 to multi-cast a message containing the resynchronize command when it receives notification from the user interface module 13 that the user wishes to cancel playback of a track that is currently being played. In the example depicted in Packet Sequence A, as will be described below, the audio information channel device 23 receives notification from the master device 21 that the user wishes to cancel playback of a track at a time corresponding to “time=0:00:07” according to its digital to analog converter clock 34, and, in line (A4.3) of packet 60, it will provide the resynchronize command, followed by padding, if necessary.


As will be apparent from examining lines (A3.1) through (A3.4) of packet 59 and lines (A4.1) and (A4.2) of packet 60, although the audio information channel device 23 has received the notification from the synchrony group's master device 21 to multi-cast the resynchronize command at a time corresponding to “time=0:00:07” according to the clock time indicated by its digital to analog converter clock 34, it (that is, the audio information channel device 23) has already multi-cast messages containing frames that are to be played at that time and subsequently. That is, the audio information channel device 23 has multi-cast, in packet 59, frames 51(106) through 51(108) that contain time stamps “time=0:00:07,” “time=0:00:08” and “time=0:00:09,” respectively, and, in packet 60, in addition to the continuation of frame 51(108), frame 51(109) that contains time stamp “time=0:00:10.” (It will be appreciated that the times indicated by the illustrative time stamps are for illustration purposes only, and that in an actual embodiment the time stamps may have different values and differentials.)


As noted above, the audio information channel device 23 multi-casts a message containing a packet that, in turn, contains the resynchronize command when it receives the notification from the master device 21 to do so. In the example depicted in Packet Sequence A, the packet will be multi-cast when the audio information channel device's digital to analog converter clock time corresponds to “0:00:07.” Subsequently, two things happen. In one aspect, when the master device 21 and slave devices 22(g) receive the packet that contains the resynchronize command, they will stop playing the audio program that they are playing.


In addition, the audio information channel device 23 will begin transmitting frames containing audio information for the next track, including therewith time stamps immediately following the digital to analog converter clock time at which the packet including the resynchronize command was transmitted. Accordingly, and with further reference to Packet Sequence A, the audio information channel device 23 will multi-cast a message containing packet 61. As indicated above, packet 61 contains, as represented in lines (A5.1) through (A5.3), frames 51(1) through 51(3), which are the first three frames of the next track of the audio program that is to be played. These frames are also compressed and encoded using the MP3 encoding and compression scheme, and they are accompanied by time stamps “time=0:00:07,” “time=0:00:08” and “time=0:00:09.” As noted above, the time stamp “time=0:00:07” corresponds to the clock time at which the audio information channel device 23 multi-casts the resynchronize command, and, when the master device 21 and slave devices 22(g) receive these frames, they would be expected to begin playing them very shortly, if not immediately, after the audio information channel device 23 multi-casts the message containing the packet that, in turn, contains the resynchronize command. Packet 61 also includes at least a portion of the next frame, that is, frame 51(4), for that track. In addition, Packet Sequence A depicted above further includes a subsequent packet, namely, packet 62, that contains any necessary continuation of frame 51(4), as well as three subsequent frames. If any additional packets are required for the track, as well as for subsequent tracks, they can be multi-cast in a similar manner.
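The channel-device side of this re-timing, in which the next track's frames are stamped starting at the clock time at which the resynchronize command was multi-cast (as with frames 1 through 4 of packet 61 above), can be sketched as follows. The function name is an illustrative assumption, and the one-second frame period matches the illustrative time stamps in the packet sequences only; actual frame durations would be much shorter:

```python
# Illustrative sketch: re-stamp the next track's frames so playback begins
# at the clock time at which the resynchronize command was multi-cast.
# Names and the one-second frame period are assumptions for illustration.

def restamp_next_track(frame_count, resync_time, frame_period=1):
    """Return (frame_number, time_stamp) pairs starting at the resync time."""
    return [(n + 1, resync_time + n * frame_period)
            for n in range(frame_count)]
```

With a resynchronize command multi-cast at clock time 7, the first three frames of the next track are stamped 7, 8 and 9, mirroring lines (A5.1) through (A5.3) of packet 61.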


As further noted above, the resynchronize command can also be used to cancel playing of one or more tracks for which playback has begun. This will be illustrated in connection with Packet Sequence B:


Packet Sequence B


(B1.0) [packet 157]


(B1.1) [continuation of frame 99]


(B1.2) [frame 100, time=0:00:01, type=mp3 audio]


(B1.3) [frame 101, time=0:00:02, type=mp3 audio]


(B1.4) [frame 102, time=0:00:03, type=mp3 audio]


(B2.0) [packet 158]


(B2.1) [continuation of frame 102]


(B2.2) [frame 103, time=0:00:04, type=mp3 audio]


(B2.3) [frame 104, time=0:00:05, type=mp3 audio]


(B2.4) [frame 105, time=0:00:06, type=mp3 audio]


(B3.0) [packet 159]


(B3.1) [continuation of frame 105]


(B3.2) [frame 106, time=0:00:07, type=mp3 audio]


(B3.3) [track boundary notification]


(B3.4) [Padding, if necessary]


(B4.0) [packet 160]


(B4.1) [frame 1, time=0:00:08, type=mp3 audio]


(B4.2) [frame 2, time=0:00:09, type=mp3 audio]


(B4.3) [frame 3, time=0:00:10, type=mp3 audio]


(B5.0) [packet 161]


(B5.1) [continuation of frame 3]


(B5.2) [frame 4, time=0:00:11, type=mp3 audio]


(B5.3) [Resynchronize, after packet 159]


(B5.4) [Padding, if necessary]


(B6.0) [packet 162]


(B6.1) [frame 1, time=0:00:08, type=mp3 audio]


(B6.2) [frame 2, time=0:00:09, type=mp3 audio]


(B6.3) [frame 3, time=0:00:10, type=mp3 audio]


(B6.4) [frame 4, time=0:00:11, type=mp3 audio]


(B7.0) [packet 163]


(B7.1) [continuation of frame 4]


(B7.2) [frame 5, time=0:00:12, type=mp3 audio]


(B7.3) [frame 6, time=0:00:13, type=mp3 audio]


(B7.4) [frame 7, time=0:00:14, type=mp3 audio]


Packet Sequence B comprises a series of seven packets, identified by packet 157 through packet 163, that the audio information channel device 23 multi-casts to the members of a synchrony group 20. As with Packet Sequence A, it will be appreciated that the series of packets that the audio information channel device 23 may multi-cast to the synchrony group 20 may include packets prior to packet 157, and may also include packets after packet 163. Each packet comprises a packet header, which is symbolized by lines (B1.0), (B2.0), . . . (B7.0) in Packet Sequence B. As in Packet Sequence A, each packet will also generally include information associated with at least a portion of a frame 51(f) along with its associated header 55(f). As in the packets represented in Packet Sequence A, each packet includes information associated with a plurality of frames. Depending on the lengths of the packets, each packet may contain information associated with a portion of a frame, an entire frame, or multiple frames. Further, as with Packet Sequence A, it is assumed that each packet may contain information associated with multiple frames. In addition, it is assumed that a packet does not necessarily contain information associated with an integral number of frames; in that case, a packet may contain information associated with a portion of a frame, and the next packet will contain the information associated with the rest of that frame.


The structures of the packets represented by Packet Sequence B are similar to those described above in connection with Packet Sequence A, and will not be repeated here. Generally, Packet Sequence B illustratively contains a sequence of packets that represent at least portions of three tracks that may have been selected from, for example, a play list. In particular, packets 157 through 159 represent frames from a portion of one track, packets 160 and 161 represent frames from a second track, and packets 162 and 163 represent frames from a third track. The play list indicated that the first, second and third tracks were to be played in that order. With particular reference to Packet Sequence B, it should be noted that line (B3.3) indicates that packet 159 includes a track boundary notification indicating that the packet contains the last frame for the track, and line (B3.4) provides for padding to the end of the packet. The first frame of the next track begins in packet 160.


In connection with the use of the resynchronize command to cancel playback of a track, at least a portion of which the audio information channel device 23 has multi-cast to the members of the synchrony group, packet 161, in line (B5.3), contains a resynchronize command that indicates that resynchronization is to occur after packet 159, that is, immediately after the packet that contains the last frame of the first of the three tracks represented by the packets in Packet Sequence B. It should be noted that the resynchronize command is in packet 161, while the resynchronization is to occur at packet 160; that is, the synchrony group is not to play the track starting with packet 160, but instead is to begin playing the track whose frames begin with the next packet, that is, packet 162. As with Packet Sequence A, in Packet Sequence B the audio information channel device 23, in packets 162 and 163, multi-casts frames whose time stamps indicate that they are to be played when the frames that were multi-cast in packets 160 and 161 were to be played. By using the resynchronize command and specifying a packet in this manner, the audio information channel device can cancel playback of a track for which playback has not yet begun.
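A member's handling of a “resynchronize after packet N” command of the kind shown in line (B5.3) can be sketched as discarding any buffered packets whose sequence numbers exceed N, after which the re-multi-cast replacement packets (here, packets 162 and 163) are buffered in their place. This is an illustrative sketch only, and the names are assumptions:

```python
# Sketch of applying "resynchronize after packet N": keep only buffered
# packets with sequence numbers up to and including N. Names hypothetical.

def apply_resync_after(buffered, after_packet):
    """Discard buffered packets that follow the resynchronization point."""
    return [p for p in buffered if p["seq"] <= after_packet]
```

Applied to a buffer holding packets 157 through 161 with a resynchronization point after packet 159, packets 160 and 161 (the cancelled track) are dropped.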


It will be appreciated that the resynchronize command is generally not necessary for cancelling playback of a track that the audio information channel device 23 has not started multi-casting to the synchrony group 20, since the audio information channel device 23 itself can re-order the play list to accommodate the cancellation.


The invention provides a number of advantages. In particular, the invention provides a network audio system in which a number of devices sharing information can reproduce audio information synchronously, notwithstanding the fact that packets containing the digital audio information transmitted over the network to the various zone players connected thereto may have differing delays, and that the zone players operate with independent clocks. Moreover, although the invention has been described in connection with audio information, it will be appreciated that the invention will find utility in connection with any type of isochronous information for which synchrony among devices is desired. The system is such that synchrony groups are created and destroyed dynamically, and in such a manner as to avoid requiring a dedicated device as the master device.


It will be appreciated that a number of changes and modifications may be made to the network audio system 10 as described above. For example, although the invention has been described as providing that the audio information channel device 23 provides the members of the synchrony group 20 with digital audio information that has been encoded using particular types of encoding and compression methodologies, it will be appreciated that the audio information channel device 23 can provide various members of the synchrony group 20 with digital audio information that has been encoded and compressed using different types of encoding and compression methodologies, and, moreover, for which different sampling rates have been used. For example, the audio information channel device 23 may provide digital audio information to the master device 21 and slave devices 22(1) through 22(g1) using the MP3 methodology at a specified sampling rate, the digital audio information for the same program to slave devices 22(g1+1) through 22(g2) using the WAV methodology at one specified sampling rate, and to slave devices 22(g2+1) through 22(G) using the WAV methodology at another specified sampling rate. In that case, the audio information channel device 23 can specify the particular encoding and compression methodology that has been used in the encoding type field 57 associated with each frame, and the sampling rate in the sampling rate field 58. Moreover, since the encoding and compression type and sampling rate are specified for each frame, the encoding and compression type and sampling rate can be changed from frame to frame. The audio information channel device 23 may use different multi-cast addresses for the different encoding and compression types and sampling rates, but it will be appreciated that that would not be required.


It will be appreciated that an advantage of providing the encoding and compression methodology and the sampling rate on a frame-by-frame basis, instead of on, for example, a track-by-track basis, is that doing so facilitates a slave device joining the synchrony group 20 mid-track, without requiring, for example, the master device 21 or the audio information channel device 23 to notify it of the encoding and compression methodology or the sampling rate.


Another modification is that, instead of the network communications manager 40 of a member of a synchrony group 20 generating the updated time stamp TUF for a digital audio information frame by adding the time differential value ΔT to the time stamp TF associated with the frame, the network communications manager 40 may instead generate the updated time stamp TUF by subtracting the differential time value ΔT from the member's current time TS as indicated by the member's digital to analog converter clock 34 at the time at which the digital audio information is received. It will be appreciated, however, that there may be variable time delays in the processing of messages by the slave device's network communications manager 40, and so it may be preferable to generate the updated time stamp using the time stamp TF provided by the audio information channel device 23.


In addition, instead of the network communications manager 40 of a member of a synchrony group generating an updated time stamp to reflect the difference between the times indicated by the member's digital to analog converter clock and the audio information channel device's digital to analog converter clock, the network communications manager 40 can generate the time differential value ΔT and provide it to the member's playback scheduler 32. In that case, the member's network communications manager 40 can store each digital audio information frame, along with the time stamp TF as received from the audio information channel device, in the audio information buffer 31. The playback scheduler 32 can utilize the time differential value ΔT, and the time stamps TF associated with the digital audio information frames, to determine when the respective digital audio information frames are to be played. In determining when a digital audio information frame is to be played, the playback scheduler can add the time differential value to the time stamp TF associated with the digital audio frame, and enable the digital audio frame to be coupled to the digital to analog converter 33 when the time indicated by the sum corresponds to the current time as indicated by the member's digital to analog converter clock 34. Alternatively, when the member's digital to analog converter clock 34 updates its current time TS, the playback scheduler can generate an updated current time T′S by subtracting the differential time value ΔT from the current time TS, and use the updated current time T′S to determine when to play a digital audio information frame.
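The scheduling test described in this alternative, in which the playback scheduler applies ΔT at play time rather than rewriting the buffered time stamps, can be sketched as a single comparison. The function name is an illustrative assumption:

```python
# Sketch of the playback scheduler's due-time test for the alternative in
# which frames keep their original channel-device time stamps T_F and the
# scheduler applies delta_T at play time. Name is hypothetical.

def is_due(frame_time_stamp, delta_t, local_clock_now):
    """A frame is due once T_F + delta_T has been reached on the local clock."""
    return local_clock_now >= frame_time_stamp + delta_t
```

For a frame stamped 10.0 with ΔT of 2.0, the frame becomes due when the member's clock reaches 12.0; at 11.9 it is still held in the buffer.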


As described above, the members of a synchrony group 20 periodically obtain the audio information channel device's current time value and use the current time value that they receive from the audio information channel device to periodically update the time differential value ΔT that they use in updating the time stamps associated with the various frames. It will be appreciated that, if the digital to analog converter clock(s) associated with the member(s) of a synchrony group 20 are ensured to have the same rate as the audio information channel device's digital to analog converter clock, a member need only obtain the current time value from the audio information channel device once, at the beginning of playback.


As another alternative, if the zone players are provided with digital to analog converter clock 34 whose time and rate can be set by an element such as the network communications manager 40, when a zone player 11(n) is operating as a member of a synchrony group 20, its network communications manager 40 can use the various types of timing information that it receives from the audio information channel device 23, including the current time information and the playback timing information indicated by the time stamps that are associated with the various frames 51(f) comprising the audio and playback timing information that it receives, to adjust the synchrony group member's digital to analog converter clock's time value and/or the clock rate that it uses for playback. If the clock's time value is to be adjusted, when the synchrony group member's network communications manager 40 initially receives the current time information from the audio information channel device 23 for the synchrony group 20, the network communications manager 40 can set the synchrony group member's digital to analog converter clock 34 to the current time value as indicated by the audio information channel device's current time information. The network communications manager 40 can set the clock 34 to the current time value indicated by the audio information channel device's current time information once, or periodically as it receives the current time information.


Alternatively or in addition, the synchrony group member's network communications manager 40 can use one or both of the current time information and/or the playback timing information in the time stamps associated with the respective frames 51(f) to adjust the clock rate of the clock 34 that it uses for playback. For example, when the synchrony group member's network communications manager 40 receives a frame 51(fX) having a time stamp with a time value TfX, it can generate the updated time value TUfX=TfX+ΔT as described above, and store the frame with the time stamp with the updated time value in the audio information buffer 31. In addition, since both the number of samples in a frame and the sampling rate, which determines the rate at which the frame is to be played, are known to the network communications manager 40, it can use that information, along with the updated time value TUfX that is to be used for frame 51(fX), to generate an expected updated time value TEfX+1 that is expected for the updated time stamp of the next frame 51(fX+1). After the synchrony group member's network communications manager 40 receives the next frame 51(fX+1), it can generate the updated time value TUfX+1 and compare that value to the expected updated time value TEfX+1. If the two time values do not correspond, or if the difference between them is above a selected threshold level, the clock that is used by the audio information channel device 23 to generate the time stamps is advancing at a different rate than the synchrony group member's digital to analog converter clock 34, and so the network communications manager 40 can adjust the rate of the digital to analog converter clock 34 to approach that of the clock used by the audio information channel device 23, so that the differential time value ΔT remains constant.
On the other hand, if the two time values do correspond, or if the difference between them is below the threshold level, then the time differential value ΔT is effectively constant, and the network communications manager 40 need not change the clock rate of the digital to analog converter clock 34. It will be appreciated that, if the clock rate is to be adjusted, the rate adjustment can be fixed, or it can vary based on, for example, the difference between the updated time value TUfX+1 and the expected updated time value TEfX+1.
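The rate check described above can be sketched as follows: the expected time stamp of the next frame is the current frame's updated time stamp plus the frame duration (samples per frame divided by sampling rate), and a mismatch beyond a threshold triggers a rate correction. The function names, the proportional correction, and its gain are illustrative assumptions, not the system's prescribed adjustment:

```python
# Sketch of the clock-rate check: predict the next frame's updated time
# stamp from the frame duration and compare it with the value actually
# computed on arrival. Names and the proportional gain are hypothetical.

def expected_next_stamp(updated_stamp, samples_per_frame, sampling_rate):
    """T_EfX+1 = T_UfX + frame duration in seconds."""
    return updated_stamp + samples_per_frame / sampling_rate

def rate_adjustment(actual_stamp, expected_stamp, threshold, gain=0.1):
    """Return a clock-rate correction factor; 1.0 means no change.

    Within the threshold, the clocks are treated as running at the same
    rate; beyond it, a correction proportional to the error is applied.
    """
    error = actual_stamp - expected_stamp
    if abs(error) <= threshold:
        return 1.0
    return 1.0 + gain * error
```

For instance, an MP3-style frame of 1152 samples at 44100 samples per second predicts the next stamp roughly 26.1 ms later; the adjustment can equally be fixed rather than proportional, as the text notes.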


It will also be appreciated that, if no rate adjustment is performed for frame 51(fX+1), the synchrony group member's network communications manager 40 can generate an expected updated time value TEfX+2 that is expected for the updated time stamp of the next frame 51(fX+2) using the updated time value TUfX determined for frame 51(fX), along with the number of samples in a frame and the sampling rate, and compare the expected updated time value TEfX+2 to the updated time value TUfX+2 that it generates when it receives frame 51(fX+2). At that point, if the network communications manager 40 determines that the two time values do not correspond, or that the difference between them is above a selected threshold level, it can adjust the rate of the digital to analog converter clock 34. Similar operations can be performed if no rate adjustment is performed for several successive frames 51(fX+1), 51(fX+2), . . . . This will accommodate the possibility that the clock 34 and the clock used by the audio information channel device 23 in generating the time stamps have rates that differ by an amount sufficiently small that the difference cannot be detected using the time stamps of only two successive frames.


Instead of or in addition to adjusting the clock rate using the playback timing information as described above, the synchrony group member's network communications manager 40 can perform similar operations to adjust the clock rate using the current time information that it receives from the audio information channel device 23.


Furthermore, although the network audio system 10 has been described such that the master device 21 of a synchrony group 20 can, in response to control information provided thereto by a user through the user interface module 13, provide a notification to a zone player 11(n) that it is to become a member of its synchrony group 20 as a slave device 22(g), it will be appreciated that the user interface module 13 can provide the notification directly to the zone player 11(n) that is to become a member of the synchrony group 20. In that case, the zone player 11(n) can notify the master device 21 that it is to become a slave device 22(g) in the synchrony group 20, after which the master device 21 can provide information regarding the synchrony group 20, including the multi-cast and unicast addresses of the audio information channel device and other information as described above.


Similarly, although the network audio system 10 has been described such that the master device 21 of a synchrony group 20 can, in response to control information provided thereto by a user through the user interface module 13, provide a command to a slave device 22(g) to enable the slave device 22(g) to adjust its volume, it will be appreciated that the user interface module 13 can provide control information directly to the slave device 22(g) to enable the slave device 22(g) to adjust its volume.


In addition, although the network audio system 10 has been described such that each frame 51(f) is associated with a frame sequence number (reference field 56, FIG. 4), it will be appreciated that, if the packets described above in connection with Packet Sequence A and Packet Sequence B are provided with packet sequence numbers, the frame sequence numbers need not be provided, since the packet sequence numbers can suffice for defining the frame sequencing.


Furthermore, although the network audio system 10 has been described such that the zone players 11(n) are provided with an audio amplifier 35 for amplifying the analog signal provided by the respective digital to analog converters 33, it will be appreciated that a zone player may be provided that does not itself include an audio amplifier. In that case, the analog signal may be coupled to an external amplifier for amplification as necessary before being provided to the audio reproduction device(s) 15(n)(r). It will be appreciated that a single zone player 11(n) may be provided with multiple audio amplifiers and audio reproduction device interfaces, and, if necessary, multiple digital to analog converters 33, to provide audio programs for corresponding numbers of synchrony groups.


Similarly, although the zone players 11(n) have been described such that they may be connected to one or more audio information sources, it will be appreciated that an audio information source may form part of and be integrated into a zone player 11(n). For example, a zone player may include a compact disk player, cassette tape player, broadcast radio receiver, or the like, that has been integrated into it. In addition, as noted above, an individual zone player 11(n) may be connected to multiple audio information sources and may contemporaneously operate as the audio information channel device 23 for multiple synchrony groups.


In addition, although FIG. 1 shows the network audio system 10 as including one user interface module 13, it will be appreciated that the system 10 may include a plurality of user interface modules. Each user interface module may be useful for controlling all of the zone players as described above, or alternatively one or more of the user interface modules may be useful for controlling selected subsets of the zone players.


Moreover, although the invention has been described in connection with audio information, it will be appreciated that the invention will find utility in connection with any type of information for which synchrony among devices connected to a network is desired.


As noted above, while a zone player 11(n) is operating as audio information channel device 23 for a synchrony group 20, when the zone player 11(n)'s audio information source interface 30 or network communications manager 40 stores digital audio information frames based on audio information from an audio information source 14(n)(s) in the audio information buffer 31, it will provide time stamps for the respective frames to schedule them for playback after a time delay following their buffering in the audio information buffer 31. The delay is provided so that, for the other zone players 11(n′), 11(n″), . . . that are operating as members of the synchrony group, there will be sufficient time for the audio and playback timing information to be transferred over the network 12 to those other zone players 11(n′), 11(n″), . . . so that it can be processed and played by them at the appropriate time as described above. The time period that is selected for the time delay may be fixed or variable, and in either case may be based on a number of factors. If the time period selected for the time delay is fixed, it may be based on, for example, factors such as an estimate of the maximum latency in the network 12, the estimated maximum loading of the various components comprising the zone players 11(n), and other estimates as will be appreciated by those skilled in the art.
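A rough sketch of the fixed-delay time-stamping scheme described above follows; the delay value, frame duration, and function name are illustrative assumptions, not the patented implementation:

```python
import time

FIXED_DELAY_S = 0.5      # assumed: generous bound on network latency plus processing
FRAME_DURATION_S = 0.02  # assumed: playback time represented by one frame

def stamp_frames(frames, now=None):
    """Assign each buffered frame a playback time stamp: the channel
    device's current clock reading plus the fixed delay, advancing by
    one frame duration per successive frame."""
    if now is None:
        now = time.monotonic()
    return [(now + FIXED_DELAY_S + i * FRAME_DURATION_S, frame)
            for i, frame in enumerate(frames)]
```

Every frame is thus scheduled far enough in the future that the slave devices can receive and process it before its time stamp comes due.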


The time delay may be the same for audio information from all types of audio information sources, and may be constant over the entire period that the synchrony group 20 is playing an audio work. Alternatively, different time delays may be utilized based on various criteria. For example, if the audio information is to be played independently of information associated with other types of media, the time delay may be selected to be relatively long, on the order of a significant fraction of a second, or longer. On the other hand, if the audio information is to be played contemporaneously with, for example, video information, which may be supplied by, for example, a video disk, video tape cassette, over cable, satellite, or broadcast television, which may not be buffered or which may be displayed independently of the network audio system 10, it may be undesirable to provide for such a lengthy delay, since the time delay of the audio playback, in relation to the video display, may be noticeable. In that case, the zone player 11(n) may provide for a much shorter time delay. In one embodiment, the time delay provided for audio information to be played concurrently with video information is selected to be generally on the order of fifty milliseconds, which would barely, if at all, be perceptible to someone viewing the video. Other desirable time delays for information from other types of sources will be apparent to those skilled in the art.
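The choice between the longer audio-only delay and the roughly fifty-millisecond delay for audio accompanying video can be sketched as a simple lookup; the source-type labels and the 0.5-second audio-only figure are assumptions for illustration (only the ~50 ms figure comes from the text above):

```python
# Illustrative delay selection per source type.
DELAY_BY_SOURCE_S = {
    "audio_only": 0.5,         # assumed "significant fraction of a second"
    "audio_with_video": 0.05,  # ~50 ms, barely perceptible against the video
}

def select_time_delay(source_type: str) -> float:
    """Return the playback time delay for the given source type,
    defaulting to the conservative audio-only delay."""
    return DELAY_BY_SOURCE_S.get(source_type, DELAY_BY_SOURCE_S["audio_only"])
```

The short delay keeps lip-sync error with a video display below the threshold of perceptibility, at the cost of less headroom for network latency.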


As yet a further possibility, the zone player 11(n), when operating as an audio information channel device 23 for a synchrony group 20, can dynamically determine the time delay based on a number of conditions in the network audio system 10, including, for example, the message transfer latency in network 12, the loading of microprocessors or other components that are used in the various zone players 11(n′), 11(n″), . . . that may comprise a synchrony group 20, as well as other factors. For example, if the audio information channel device 23 determines that the latency in the network 12 has increased beyond a selected threshold, the audio information channel device 23 can adjust the delay to increase the likelihood that the members of the synchrony group 20 will be able to receive the packets and process the frames so that they will be able to play them at the appropriate times. Similarly, if the audio information channel device 23 is notified that a member of the synchrony group 20 to which it provides audio information requires additional time to receive and process the frames that it transmits, the audio information channel device 23 can adjust the delay accordingly. It will be appreciated that, to reduce or minimize possible discontinuities in the audio playback by the members of the synchrony group, the audio information channel device 23 can, instead of adjusting the time delay during a particular audio track, adjust the time delay between tracks, during silent periods of a track or otherwise as will be appreciated by those skilled in the art. In addition, the audio information channel device 23 can use conventional audio compression methodologies to facilitate a speeding up and/or slowing down of playback of an audio track while it is in the process of providing additional time delay. 
Generally, the members of the synchrony group 20 can provide notifications to the audio information channel device 23 if they determine that they will need an additional time delay, and the audio information channel device 23 can adjust the time delay in accordance with the notifications from the members of the synchrony group 20.
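A minimal sketch of that notification-driven adjustment, deferring any increase to a track boundary to avoid audible discontinuities (the class and method names are hypothetical, not from the patent):

```python
class DelayManager:
    """Tracks the time delay used by the audio information channel device.

    Synchrony-group members report extra delay they need; the largest
    pending request is applied only at a track boundary, never mid-track.
    """

    def __init__(self, base_delay_s: float = 0.5):
        self.current_delay_s = base_delay_s
        self._pending_extra_s = 0.0

    def notify_member_needs(self, extra_s: float) -> None:
        # Remember the largest outstanding request from any group member.
        self._pending_extra_s = max(self._pending_extra_s, extra_s)

    def on_track_boundary(self) -> None:
        # Fold the pending increase into the working delay between tracks.
        self.current_delay_s += self._pending_extra_s
        self._pending_extra_s = 0.0
```

Applying the change only between tracks (or during silent passages) means listeners never hear the timeline shift.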


It will be appreciated that a system in accordance with the invention can be constructed in whole or in part from special purpose hardware or a general purpose computer system, or any combination thereof, any portion of which may be controlled by a suitable program. Any program may in whole or in part comprise part of or be stored on the system in a conventional manner, or it may in whole or in part be provided to the system over a network or other mechanism for transferring information in a conventional manner. In addition, it will be appreciated that the system may be operated and/or otherwise controlled by means of information provided by an operator using operator input elements (not shown) which may be connected directly to the system or which may transfer the information to the system over a network or other mechanism for transferring information in a conventional manner.


The foregoing description has been limited to a specific embodiment of this invention. It will be apparent, however, that various variations and modifications may be made to the invention, with the attainment of some or all of the advantages of the invention. It is the object of the appended claims to cover these and such other variations and modifications as come within the true spirit and scope of the invention.

Claims
  • 1. A method performed by a computing device, the method comprising: operating in a first mode in which the computing device is not serving as an audio information channel device for any synchrony group; receiving an indication that the computing device is to begin serving as an audio information channel device for a synchrony group that comprises at least a first zone player and a second zone player that are communicatively coupled with the computing device over an asynchronous data network configured to exchange digital data packets, wherein the computing device, the first zone player, and the second zone player each have a different respective clock time; after receiving the indication that the computing device is to begin serving as the audio information channel device for the synchrony group, (a) configuring the synchrony group to play back audio in synchrony, wherein configuring the synchrony group comprises sending control information to at least one of the first zone player or second zone player that (i) identifies the computing device as the audio information channel device for the synchrony group, and (ii) instructs the at least one of the first zone player or second zone player to begin operating as part of the synchrony group; and (b) transitioning from operating in the first mode to operating in a second mode in which the computing device serves as the audio information channel device for the synchrony group, wherein, while operating in the second mode, the computing device is configured to: exchange, via the asynchronous data network, clock timing information with each of the first zone player and the second zone player that facilitates time synchronization between the computing device and each of the first zone player and the second zone player, wherein the time synchronization establishes a common notion of clock time for each of the first zone player, the second zone player, and the computing device; obtain given audio content to be played back by the synchrony group; generate playback timing information associated with the given audio content that comprises an indicator of a first future time, relative to the common notion of clock time, at which at least the first and second zone players are to initiate synchronous playback of the given audio content; and transmit, via a network interface to the first and second zone players over the asynchronous data network, both (a) the audio information representing the given audio content and (b) the playback timing information, thereby enabling the first and second zone players to play back the given audio content in synchrony.
  • 2. The method of claim 1, further comprising: initiating establishment of at least one communication channel by which audio information will be transmitted by the audio information channel device to the synchrony group after receiving the indication that the computing device is to begin serving as the audio information channel device for the synchrony group.
  • 3. A computing device comprising: one or more processors; and tangible, non-transitory computer-readable media comprising program instructions stored therein, wherein the program instructions are executable by the one or more processors such that the computing device is configured to: operate in a first mode in which the computing device is not serving as an audio information channel device for any synchrony group; receive an indication that the computing device is to begin serving as an audio information channel device for a synchrony group that comprises at least a first zone player and a second zone player that are communicatively coupled with the computing device over an asynchronous data network configured to exchange digital data packets, wherein the computing device, the first zone player, and the second zone player each have a different respective clock time; after receiving the indication that the computing device is to begin serving as the audio information channel device for the synchrony group, (a) configure the synchrony group to play back audio in synchrony, wherein configuring the synchrony group comprises sending control information to at least one of the first zone player or second zone player that (i) identifies the computing device as the audio information channel device for the synchrony group, and (ii) instructs the at least one of the first zone player or second zone player to begin operating as part of the synchrony group; and (b) transition from operating in the first mode to operating in a second mode in which the computing device serves as the audio information channel device for the synchrony group, wherein, while operating in the second mode, the computing device is configured to: exchange, via the asynchronous data network, clock timing information with each of the first zone player and the second zone player that facilitates time synchronization between the computing device and each of the first zone player and the second zone player, wherein the time synchronization establishes a common notion of clock time for each of the first zone player, the second zone player, and the computing device; obtain given audio content to be played back by the synchrony group; generate playback timing information associated with the given audio content that comprises an indicator of a first future time, relative to the common notion of clock time, at which at least the first and second zone players are to initiate synchronous playback of the given audio content; and transmit, via a network interface to the first and second zone players over the asynchronous data network, both (a) the audio information representing the given audio content and (b) the playback timing information, thereby enabling the first and second zone players to play back the given audio content in synchrony.
  • 4. The computing device of claim 3, wherein: while in the first mode, the computing device is configured to play back audio content.
  • 5. The computing device of claim 3, wherein the program instructions that are executable by the one or more processors such that the computing device is configured to transmit the audio information to the first and second zone players over the asynchronous data network comprise program instructions that are executable by the one or more processors such that the computing device is configured to: transmit a series of frames, wherein individual frames comprise audio information representing a respective portion of the given audio content, wherein the first future time corresponds to a first frame in the series of frames, and wherein the playback timing information further comprises, for an individual subsequent frame after the first frame in the series of frames, an indicator of a respective future time, relative to the clock time of the computing device, at which the respective portion of the given audio content in the individual subsequent frame is to be synchronously played back by the first and second zone players.
  • 6. The computing device of claim 3, wherein the program instructions that are executable by the one or more processors such that the computing device is configured to generate the playback timing information for the audio information comprise program instructions that are executable by the one or more processors such that the computing device is configured to: capture a reading of the clock time of the computing device; calculate the first future time by adding a time delay to the captured reading of the clock time; and generate a timestamp that indicates the first future time.
  • 7. The computing device of claim 6, wherein the given audio content comprises first given audio content, wherein the playback timing information comprises first playback timing information, and wherein the instruction that causes the first zone player to begin serving as the audio information channel device comprises an instruction that causes the first zone player to (a) obtain second given audio content, (b) generate second playback timing information, and (c) transmit, to the second zone player over the asynchronous data network, both (i) audio information representing the second given audio content and (ii) the second playback timing information, thereby enabling the first and second zone players to play back the second given audio content in synchrony.
  • 8. The computing device of claim 6, wherein the time delay comprises a sufficient amount of time to allow at least an initial portion of the audio information to be received over the asynchronous data network and processed by the first and second zone players.
  • 9. The computing device of claim 6, wherein the time delay is determined based on an estimate of latency in the asynchronous data network.
  • 10. The computing device of claim 6, wherein the program instructions that are executable by the one or more processors such that the computing device is configured to generate the playback timing information for the audio information further comprise program instructions that are executable by the one or more processors such that the computing device is configured to: receive from at least one of the first and second zone players an indication of a respective latency associated with the at least one of the first and second zone players, wherein the time delay is determined based on the received indication of the respective latency.
  • 11. The computing device of claim 3, further comprising program instructions stored on the tangible, non-transitory computer-readable medium that are executable by the one or more processors such that the computing device, while operating as the audio information channel device for the synchrony group, is configured to: while the first and second zone players are playing back the given audio content in synchrony, receive a request for the synchrony group to switch from playing back the given audio content to playing back new audio content; and in response to receiving the request for the synchrony group to switch from playing back the given audio content to playing back the new audio content: transmit, to the first and second zone players over the asynchronous data network, a command to terminate playback of the given audio content; obtain the new audio content; generate playback timing information associated with the new audio content that comprises an indicator of a second future time, relative to the clock time of the computing device, at which the first and second zone players are to initiate synchronous playback of the new audio content; and transmit the playback timing information associated with the new audio content and audio information representing the new audio content to the first and second zone players over the asynchronous data network and thereby cause the first and second zone players to play back the new audio content in synchrony.
  • 12. The computing device of claim 11, wherein the command to terminate playback of the given audio content comprises a future time, relative to the clock time of the computing device, at which the first and second zone players are to terminate synchronous playback of the given audio content.
  • 13. The computing device of claim 3, wherein the computing device is a zone player.
  • 14. The computing device of claim 3, wherein the program instructions that are executable by the one or more processors such that the computing device is configured to receive the indication that the computing device is to begin operating as an audio information channel device for the synchrony group comprise program instructions that are executable by the one or more processors such that the computing device is configured to: receive a request to begin operating as the audio information channel device for the synchrony group from (a) a controller device, (b) one of the first zone player or the second zone player, (c) another network device, or (d) a user.
  • 15. The computing device of claim 3, wherein while in the first mode, the computing device is configured to (a) obtain audio content and (b) play back at least a first portion of the obtained audio content, wherein the given audio content transmitted while operating in the second mode comprises at least a second portion of the audio content obtained while operating in the first mode.
  • 16. The computing device of claim 3, further comprising program instructions that are executable by the one or more processors such that the computing device, while operating as the audio information channel device for the synchrony group, is configured to: receive a request for the synchrony group to play back the given audio content, wherein the given audio content is obtained in response to receiving the request for the synchrony group to play back the given audio content.
  • 17. The computing device of claim 3, wherein the program instructions that are executable by the one or more processors such that the computing device is configured to transmit the audio information representing the given audio content via a network interface to the first and second zone players over the asynchronous data network comprise program instructions that are executable by the one or more processors such that the computing device is configured to: transmit a first portion of the audio information to the first and second zone players over the asynchronous data network before synchronous playback of the given audio content begins and transmit a second portion of the audio information to the first and second zone players over the asynchronous data network after synchronous playback of the given audio content begins.
  • 18. The computing device of claim 3, wherein the program instructions that are executable by the one or more processors such that the computing device is configured to transmit, via a network interface to the first and second zone players over the asynchronous data network, both (a) the audio information representing the given audio content and (b) the playback timing information comprise program instructions that are executable by the one or more processors such that the computing device is configured to: transmit the playback timing information in a separate communication from the audio information.
  • 19. The computing device of claim 3, further comprising program instructions that are executable by the one or more processors such that the computing device is configured to, after receiving the indication that the computing device is to begin serving as the audio information channel device for the synchrony group: establish at least one first communication channel by which audio information will be transmitted; and wherein the program instructions that are executable by the one or more processors such that the computing device is configured to send control information to at least one of the first zone player or second zone player comprise program instructions that are executable by the one or more processors such that the computing device is configured to: send control information to at least one of the first zone player or second zone player that initiates establishment of at least one second communication channel by which the clock timing information will be transmitted by the computing device.
  • 20. The computing device of claim 3, further comprising program instructions that are executable by the one or more processors such that the computing device is configured to, while operating as the audio information channel device for the synchrony group: receive, from at least one of the first zone player or the second zone player via the asynchronous data network, an indication that at least one of the first zone player or the second zone player has successfully joined the synchrony group.
  • 21. The computing device of claim 3, further comprising program instructions that are executable by the one or more processors such that the computing device is configured to: receive an indication that the computing device is to instruct the first zone player to begin serving as an audio information channel device for a synchrony group that comprises at least the first zone player and the second zone player; and after receiving the indication that the computing device is to instruct the first zone player to begin serving as the audio information channel device, transmit an instruction to begin serving as the audio information channel device to the first zone player over the asynchronous data network that causes the first zone player to begin serving as the audio information channel device.
  • 22. The computing device of claim 3, further comprising program instructions that are executable by the one or more processors such that the computing device is configured to, while operating as the audio information channel device for the synchrony group: receive a request for the synchrony group to terminate playback of the given audio content; and in response to receiving the request for the synchrony group to terminate playback of the given audio content, transmit, to the first and second zone players over the asynchronous data network, a command to terminate playback of the given audio content.
  • 23. The computing device of claim 22, further comprising program instructions that are executable by the one or more processors such that the computing device is configured to, while operating as the audio information channel device for the synchrony group: if the computing device is still transmitting the audio information representing the given audio content to the first and second zone players over the asynchronous data network at the time of receiving the request for the synchrony group to terminate playback of the given audio content, then in response to receiving the request for the synchrony group to terminate playback of the given audio content, also terminate transmission of the audio information representing the given audio content to the first and second zone players over the asynchronous data network.
  • 24. The computing device of claim 3, wherein the given audio content is obtained from one of (a) a network-accessible audio source that is communicatively coupled to the computing device over one or both of a local area network (LAN) or a wide area network (WAN), (b) a local audio source that is directly connected to the computing device, or (c) a storage medium of the computing device.
  • 25. A system comprising a computing device and a first zone player, wherein the computing device comprises: a network interface; a clock that is configured to provide a clock time of the computing device; at least one first processor; at least one first tangible, non-transitory computer-readable medium; and program instructions stored on the at least one first tangible, non-transitory computer-readable medium that are executable by the at least one first processor such that the computing device is configured to: operate in a first mode in which the computing device is not serving as an audio information channel device for any synchrony group; receive an indication that the computing device is to begin serving as an audio information channel device for a synchrony group that comprises at least the first zone player and a second zone player that are communicatively coupled with the computing device over an asynchronous data network configured to exchange digital data packets, wherein the computing device, the first zone player, and the second zone player each have a different respective clock time; after receiving the indication that the computing device is to begin serving as the audio information channel device for the synchrony group, (a) configure the synchrony group to play back audio in synchrony, wherein configuring the synchrony group comprises sending control information to at least one of the first zone player or second zone player that (i) identifies the computing device as the audio information channel device for the synchrony group, and (ii) instructs the at least one of the first zone player or second zone player to begin operating as part of the synchrony group; and (b) transition from operating in the first mode to operating in a second mode in which the computing device serves as the audio information channel device for the synchrony group, wherein, while operating in the second mode, the computing device is configured to: exchange, via the asynchronous data network, clock timing information with each of the first zone player and the second zone player that facilitates time synchronization between the computing device and each of the first zone player and the second zone player, wherein the time synchronization establishes a common notion of clock time for each of the first zone player, the second zone player, and the computing device; obtain given audio content to be played back by the synchrony group; generate playback timing information associated with the given audio content that comprises an indicator of a first future time, relative to the common notion of clock time, at which at least the first and second zone players are to initiate synchronous playback of the given audio content; and transmit, via the network interface to the first and second zone players over the asynchronous data network, both (a) the audio information representing the given audio content, and (b) the playback timing information, thereby enabling the first and second zone players to play back the given audio content in synchrony.
  • 26. The system of claim 25, wherein: while in the first mode, the computing device is configured to play back audio content.
  • 27. The system of claim 25, wherein the program instructions that are executable by the at least one first processor such that the computing device is configured to receive the indication that the computing device is to begin operating as an audio information channel device for the synchrony group comprise program instructions that are executable by the at least one first processor such that the computing device is configured to: receive a request to begin operating as the audio information channel device for the synchrony group from (a) a controller device, (b) one of the first and second zone players, (c) another network device, or (d) a user.
  • 28. The system of claim 25, wherein: while in the first mode, the computing device is configured to (a) obtain audio content and (b) play back at least a first portion of the obtained audio content, wherein the given audio content transmitted while operating in the second mode comprises at least a second portion of the audio content obtained while operating in the first mode.
  • 29. The system of claim 25, further comprising program instructions stored on the at least one first tangible, non-transitory computer-readable medium that are executable by the at least one first processor such that the computing device, while operating as the audio information channel device for the synchrony group, is configured to: receive a request for the synchrony group to play back the given audio content, wherein the given audio content is obtained in response to receiving the request for the synchrony group to play back the given audio content.
  • 30. The system of claim 25, wherein the program instructions that are executable by the at least one first processor such that the computing device is configured to transmit the audio information to the first and second zone players over the asynchronous data network comprise program instructions that are executable by the at least one first processor such that the computing device is configured to: transmit a first portion of the audio information to the first and second zone players over the asynchronous data network before synchronous playback of the given audio content begins and transmit a second portion of the audio information to the first and second zone players over the asynchronous data network after synchronous playback of the given audio content begins.
  • 31. The system of claim 25, wherein the program instructions that are executable by the at least one first processor such that the computing device is configured to transmit the playback timing information and the audio information to the first and second zone players over the asynchronous data network comprise program instructions that are executable by the at least one first processor such that the computing device is configured to: transmit the playback timing information in a separate communication from the audio information.
  • 32. The system of claim 25, further comprising program instructions stored on the at least one first tangible, non-transitory computer-readable medium that are executable by the at least one first processor such that, after receiving the indication that the computing device is to begin serving as the audio information channel device for the synchrony group, the computing device is further configured to: establish at least one first communication channel by which audio information will be transmitted; and wherein the program instructions that are executable by the at least one first processor such that the computing device is configured to send the control information to at least one of the first zone player or second zone player further comprise program instructions that are executable by the at least one first processor such that the computing device is configured to: send control information to at least one of the first zone player or second zone player that initiates establishment of at least one second communication channel by which the clock timing information will be transmitted by the computing device.
  • 33. The system of claim 25, further comprising program instructions stored on the at least one first tangible, non-transitory computer-readable medium that are executable by the at least one first processor such that the computing device, while operating as the audio information channel device for the synchrony group, is configured to: receive, from at least one of the first zone player or the second zone player via the asynchronous data network, an indication that at least one of the first zone player or the second zone player has successfully joined the synchrony group.
  • 34. The system of claim 25, wherein the program instructions that are executable by the at least one first processor such that the computing device is configured to transmit the audio information to the first and second zone players over the asynchronous data network comprise program instructions that are executable by the at least one first processor such that the computing device is configured to: transmit a series of frames, wherein individual frames comprise audio information representing a respective portion of the given audio content, wherein the first future time corresponds to a first frame in the series of frames, and wherein the playback timing information further comprises, for an individual subsequent frame after the first frame in the series of frames, an indicator of a respective future time, relative to the clock time of the computing device, at which the respective portion of the given audio content in the individual subsequent frame is to be synchronously played back by the first and second zone players.
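The per-frame timing scheme in claim 34 — a series of frames, each carrying a portion of the audio content plus its own future playback time relative to the channel device's clock — can be sketched as follows. This is a minimal illustration, not the patent's implementation; the frame duration, field names, and helper function are all assumptions.

```python
from dataclasses import dataclass

FRAME_DURATION_MS = 20  # assumed fixed frame length; not specified by the claim

@dataclass
class Frame:
    seq: int          # position in the series
    audio: bytes      # audio information for this portion of the content
    play_at_ms: int   # future time, in the channel device's clock domain

def build_frames(audio_chunks, first_future_time_ms):
    """Tag each chunk with the time at which it is to be played back.

    The first frame gets the 'first future time'; each subsequent frame
    gets an indicator offset by one frame duration, mirroring claim 34's
    per-frame future-time indicators.
    """
    return [
        Frame(seq=i, audio=chunk,
              play_at_ms=first_future_time_ms + i * FRAME_DURATION_MS)
        for i, chunk in enumerate(audio_chunks)
    ]

frames = build_frames([b"a", b"b", b"c"], first_future_time_ms=5_000)
print([f.play_at_ms for f in frames])  # [5000, 5020, 5040]
```

Because every frame carries an absolute target time rather than a relative delay, a receiver that joins late or drops a frame can still schedule the next frame correctly against the synchronized clock.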
  • 35. The system of claim 25, further comprising program instructions stored on the at least one first tangible, non-transitory computer-readable medium that are executable by the at least one first processor such that the computing device is configured to: receive an indication that the computing device is to instruct the first zone player to begin serving as an audio information channel device for a synchrony group that comprises at least the first zone player and the second zone player; andafter receiving the indication that the computing device is to instruct the first zone player to begin serving as the audio information channel device, transmit an instruction to begin serving as the audio information channel device to the first zone player over the asynchronous data network that causes the first zone player to begin serving as the audio information channel device.
  • 36. The system of claim 35, wherein the given audio content comprises first given audio content, wherein the playback timing information comprises first playback timing information, and wherein the instruction that causes the first zone player to begin serving as the audio information channel device comprises an instruction that causes the first zone player to (a) obtain second given audio content, (b) generate second playback timing information, and (c) transmit, to the second zone player over the asynchronous data network, both (i) audio information representing the second given audio content and (ii) the second playback timing information, thereby enabling the first and second zone players to play back the second given audio content in synchrony.
  • 37. The system of claim 25, wherein the program instructions that are executable by the at least one first processor such that the computing device is configured to generate the playback timing information for the audio information comprise program instructions that are executable by the at least one first processor such that the computing device is configured to: capture a reading of the clock time of the computing device;calculate the first future time by adding a time delay to the captured reading of the clock time; andgenerate a timestamp that indicates the first future time.
  • 38. The system of claim 37, wherein the time delay comprises a sufficient amount of time to allow at least an initial portion of the audio information to be received over the asynchronous data network and processed by the first and second zone players.
  • 39. The system of claim 37, wherein the time delay is determined based on an estimate of latency in the asynchronous data network.
  • 40. The system of claim 37, wherein the program instructions that are executable by the at least one first processor such that the computing device is configured to generate the playback timing information for the audio information further comprise program instructions that are executable by the at least one first processor such that the computing device is configured to: receive from at least one of the first and second zone players an indication of a respective latency associated with the at least one of the first and second zone players, wherein the time delay is determined based on the received indication of the respective latency.
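Claims 37 through 40 describe generating the first future time by capturing a clock reading and adding a time delay sized to cover network latency and receiver processing. A hedged sketch of that arithmetic, with all latency figures assumed for illustration:

```python
def generate_playback_timestamp(clock_ms, network_latency_ms, receiver_latencies_ms):
    """Capture a clock reading and add a delay, per claims 37-40.

    The delay must be long enough for the slowest receiver to obtain
    and process at least an initial portion of the audio (claim 38);
    here it is modeled as the network latency estimate (claim 39) plus
    the largest latency reported by a zone player (claim 40).
    """
    time_delay = network_latency_ms + max(receiver_latencies_ms)
    return clock_ms + time_delay  # the timestamp indicating the first future time

ts = generate_playback_timestamp(clock_ms=1_000_000,
                                 network_latency_ms=50,
                                 receiver_latencies_ms=[30, 45])
print(ts)  # 1000095
```

In practice the captured clock reading would come from the channel device's running clock rather than a constant, and the latency inputs would be measured or reported as the claims describe.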
  • 41. The system of claim 25, further comprising program instructions stored on the at least one first tangible, non-transitory computer-readable medium that are executable by the at least one first processor such that the computing device, while operating as the audio information channel device for the synchrony group, is configured to: receive a request for the synchrony group to terminate playback of the given audio content; andin response to receiving the request for the synchrony group to terminate playback of the given audio content, transmit, to the first and second zone players over the asynchronous data network, a command to terminate playback of the given audio content.
  • 42. The system of claim 41, further comprising program instructions stored on the at least one first tangible, non-transitory computer-readable medium that are executable by the at least one first processor such that the computing device, while operating as the audio information channel device for the synchrony group, is configured to: if the computing device is still transmitting the audio information representing the given audio content to the first and second zone players over the asynchronous data network at the time of receiving the request for the synchrony group to terminate playback of the given audio content, then in response to receiving the request for the synchrony group to terminate playback of the given audio content, also terminate transmission of the audio information representing the given audio content to the first and second zone players over the asynchronous data network.
  • 43. The system of claim 25, further comprising program instructions stored on the at least one first tangible, non-transitory computer-readable medium that are executable by the at least one first processor such that the computing device, while operating as the audio information channel device for the synchrony group, is configured to: while the first and second zone players are playing back the given audio content in synchrony, receive a request for the synchrony group to switch from playing back the given audio content to playing back new audio content; and in response to receiving the request for the synchrony group to switch from playing back the given audio content to playing back the new audio content: transmit, to the first and second zone players over the asynchronous data network, a command to terminate playback of the given audio content; obtain the new audio content; generate playback timing information associated with the new audio content that comprises an indicator of a second future time, relative to the clock time of the computing device, at which the first and second zone players are to initiate synchronous playback of the new audio content; and transmit the playback timing information associated with the new audio content and audio information representing the new audio content to the first and second zone players over the asynchronous data network and thereby cause the first and second zone players to play back the new audio content in synchrony.
  • 44. The system of claim 43, wherein the command to terminate playback of the given audio content comprises a future time, relative to the clock time of the computing device, at which the first and second zone players are to terminate synchronous playback of the given audio content.
  • 45. The system of claim 25, wherein the given audio content is obtained from one of (a) a network-accessible audio source that is communicatively coupled to the computing device over one or both of a local area network (LAN) or a wide area network (WAN), (b) a local audio source that is directly connected to the computing device, or (c) a storage medium of the computing device.
  • 46. The system of claim 25, wherein the computing device is a zone player.
  • 47. The system of claim 25, wherein the first zone player comprises: at least one second processor;at least one second tangible, non-transitory computer-readable medium; andprogram instructions stored on the at least one second tangible, non-transitory computer-readable medium that are executable by the at least one second processor such that the first zone player is configured to:play back the given audio content in synchrony with the second zone player using the audio information and the playback timing information transmitted by the computing device.
  • 48. The system of claim 47, further comprising program instructions stored on the at least one second tangible, non-transitory computer-readable medium that are executable by the at least one second processor such that the first zone player is configured to: operate in a first mode in which the first zone player is not participating in any synchrony group, wherein while operating in the first mode, the first zone player is configured to play back audio content while (a) the first zone player is not a member of any synchrony group and (b) the computing device is not serving as an audio information channel device for any synchrony group; receive the control information sent by the computing device to the first zone player; and after receiving the control information, transition from the first mode to a second mode in which the first zone player is configured to serve as a group member for the synchrony group.
  • 49. The system of claim 47, wherein: the program instructions stored on the at least one first tangible, non-transitory computer-readable medium comprise further program instructions that are executable by the at least one first processor such that the computing device is configured to (i) receive an indication that the computing device is to instruct the first zone player to begin serving as an audio information channel device for a synchrony group that comprises at least the first zone player and the second zone player, and (ii) after receiving the indication that the computing device is to instruct the first zone player to begin serving as the audio information channel device, transmit an instruction over the asynchronous data network to the first zone player to begin serving as the audio information channel device; and the program instructions stored on the at least one second tangible, non-transitory computer-readable medium comprise further program instructions that are executable by the at least one second processor such that the first zone player is configured to (i) receive the instruction to begin serving as the audio information channel device transmitted by the computing device, and (ii) based on receiving the instruction to begin serving as the audio information channel device, begin to serve as the audio information channel device.
  • 50. The system of claim 47, further comprising program instructions stored on the at least one second tangible, non-transitory computer-readable medium that are executable by the at least one second processor such that the first zone player is configured to: transmit, to the computing device via the asynchronous data network, an indication that the first zone player has successfully joined the synchrony group.
  • 51. Tangible, non-transitory computer-readable media comprising program instructions stored therein, wherein the program instructions are executable by one or more processors such that a computing device is configured to: operate in a first mode in which the computing device is not serving as an audio information channel device for any synchrony group; receive an indication that the computing device is to begin serving as an audio information channel device for a synchrony group that comprises at least a first zone player and a second zone player that are communicatively coupled with the computing device over an asynchronous data network configured to exchange digital data packets, wherein the computing device, the first zone player, and the second zone player each have a different respective clock time; after receiving the indication that the computing device is to begin serving as the audio information channel device for the synchrony group, (a) configure the synchrony group to play back audio in synchrony, wherein configuring the synchrony group comprises sending control information to at least one of the first zone player or second zone player that (i) identifies the computing device as the audio information channel device for the synchrony group, and (ii) instructs the at least one of the first zone player or second zone player to begin operating as part of the synchrony group; and (b) transition from operating in the first mode to operating in a second mode in which the computing device serves as the audio information channel device for the synchrony group, wherein, while operating in the second mode, the computing device is configured to: exchange, via the asynchronous data network, clock timing information with each of the first zone player and the second zone player that facilitates time synchronization between the computing device and each of the first zone player and the second zone player, wherein the time synchronization establishes a common notion of clock time for each of the first zone player, the second zone player, and the computing device; obtain given audio content to be played back by the synchrony group; generate playback timing information associated with the given audio content that comprises an indicator of a first future time, relative to the common notion of clock time, at which at least the first and second zone players are to initiate synchronous playback of the given audio content; and transmit, via a network interface to the first and second zone players over the asynchronous data network, both (a) the audio information representing the given audio content and (b) the playback timing information, thereby enabling the first and second zone players to play back the given audio content in synchrony.
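The time synchronization in claim 51 — exchanging clock timing information so that each zone player shares a common notion of clock time with the channel device — can be sketched with an NTP-style offset estimate. This is an illustrative model under a symmetric one-way-delay assumption, not the patent's protocol; all function names and values are hypothetical.

```python
def estimate_offset(local_send, channel_clock, local_recv):
    """Estimate the channel device's clock offset relative to the local clock.

    A zone player records its local time when sending a request
    (local_send) and when receiving the reply (local_recv); the reply
    carries the channel device's clock reading (channel_clock). Assuming
    the one-way delays are symmetric, the channel reading corresponds to
    the midpoint of the round trip.
    """
    midpoint = (local_send + local_recv) / 2
    return channel_clock - midpoint

def to_local_time(channel_timestamp, offset):
    """Convert a playback timestamp in the channel device's clock domain
    into the zone player's local clock domain."""
    return channel_timestamp - offset

# Example: round trip observed locally from t=100 to t=120 (ms), with the
# channel device reporting its clock as 660 at the midpoint.
off = estimate_offset(local_send=100, channel_clock=660, local_recv=120)
print(off)                         # 550.0
print(to_local_time(5_000, off))   # 4450.0
```

With the offset in hand, a zone player can convert every per-frame timestamp it receives into its own clock domain and schedule playback, which is how independently clocked devices arrive at synchronous output.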
  • 52. The tangible, non-transitory computer-readable media of claim 51, further comprising program instructions stored on the tangible, non-transitory computer-readable medium that are executable by the one or more processors such that, after receiving the indication that the computing device is to begin serving as the audio information channel device for the synchrony group, the computing device is further configured to: initiate establishment of at least one communication channel by which audio information will be transmitted by the audio information channel device to the synchrony group.
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application is a continuation of U.S. application Ser. No. 17/306,016 titled “Playback Device,” filed May 3, 2021, and U.S. application Ser. No. 17/306,016 is a continuation of U.S. application Ser. No. 13/864,249 titled “Concurrent Transmission and Playback of Audio Information,” filed Apr. 17, 2013, and issued as U.S. Pat. No. 11,080,001 on Aug. 3, 2021; U.S. application Ser. No. 13/864,249 is a continuation of U.S. application Ser. No. 13/297,000 titled “System And Method For Synchronizing Operations Among A Plurality Of Independently Clocked Digital Data Processing Devices,” filed Nov. 15, 2011, and issued as U.S. Pat. No. 9,182,777 on Nov. 10, 2015; U.S. application Ser. No. 13/297,000 is a continuation of U.S. application Ser. No. 10/816,217 titled “System And Method For Synchronizing Operations Among A Plurality Of Independently Clocked Digital Data Processing Devices,” filed Apr. 1, 2004, and issued as U.S. Pat. No. 8,234,395 on Jul. 31, 2012; U.S. application Ser. No. 10/816,217 claims priority to U.S. Provisional App. 60/490,768 titled “Method For Synchronizing Audio Playback Between Multiple Networked Devices,” filed Jul. 28, 2003, and now expired. The entire contents of the Ser. No. 17/306,016; 13/864,249; 13/297,000; 10/816,217; and 60/490,768 applications are incorporated herein by reference.

7996588 Subbiah et al. Aug 2011 B2
8014423 Thaler et al. Sep 2011 B2
8015306 Bowman Sep 2011 B2
8020023 Millington et al. Sep 2011 B2
8023663 Goldberg Sep 2011 B2
8028038 Weel Sep 2011 B2
8028323 Weel Sep 2011 B2
8041062 Cohen et al. Oct 2011 B2
8045721 Burgan et al. Oct 2011 B2
8045952 Qureshey et al. Oct 2011 B2
8050203 Jacobsen et al. Nov 2011 B2
8050652 Qureshey et al. Nov 2011 B2
8055364 Champion Nov 2011 B2
8074253 Nathan Dec 2011 B1
8086752 Millington et al. Dec 2011 B2
8090317 Burge et al. Jan 2012 B2
8103009 McCarty et al. Jan 2012 B2
8111132 Allen et al. Feb 2012 B2
8112032 Ko et al. Feb 2012 B2
8116476 Inohara Feb 2012 B2
8126172 Horbach et al. Feb 2012 B2
8131389 Hardwick Mar 2012 B1
8131390 Braithwaite et al. Mar 2012 B2
8134650 Maxson et al. Mar 2012 B2
8144883 Pedersen et al. Mar 2012 B2
8148622 Rothkopf et al. Apr 2012 B2
8150079 Maeda et al. Apr 2012 B2
8156337 Balfanz et al. Apr 2012 B2
8169938 Duchscher et al. May 2012 B2
8170222 Dunko May 2012 B2
8170260 Reining et al. May 2012 B2
8175297 Ho et al. May 2012 B1
8185674 Moore et al. May 2012 B2
8194874 Starobin et al. Jun 2012 B2
8204890 Gogan Jun 2012 B1
8208653 Eo et al. Jun 2012 B2
8214447 Deslippe et al. Jul 2012 B2
8214740 Johnson Jul 2012 B2
8214873 Weel Jul 2012 B2
8218790 Bull et al. Jul 2012 B2
8230099 Weel Jul 2012 B2
8233029 Yoshida et al. Jul 2012 B2
8233648 Sorek et al. Jul 2012 B2
8234305 Seligmann et al. Jul 2012 B2
8234395 Millington Jul 2012 B2
8239748 Moore et al. Aug 2012 B1
8250218 Watanabe et al. Aug 2012 B2
8275910 Hauck Sep 2012 B1
8279709 Choisel et al. Oct 2012 B2
8281001 Busam et al. Oct 2012 B2
8285404 Kekki Oct 2012 B1
8290603 Lambourne Oct 2012 B1
8300845 Zurek et al. Oct 2012 B2
8311226 Lorgeoux et al. Nov 2012 B2
8315555 Ko et al. Nov 2012 B2
8316147 Batson et al. Nov 2012 B2
8325931 Howard et al. Dec 2012 B2
8326951 Millington et al. Dec 2012 B1
8340330 Yoon et al. Dec 2012 B2
8345709 Nitzpon et al. Jan 2013 B2
8364295 Beckmann et al. Jan 2013 B2
8370678 Millington et al. Feb 2013 B2
8374595 Chien et al. Feb 2013 B2
8407623 Kerr et al. Mar 2013 B2
8411883 Matsumoto Apr 2013 B2
8417827 Anttila et al. Apr 2013 B2
8423659 Millington Apr 2013 B2
8423893 Ramsay et al. Apr 2013 B2
8432851 Xu et al. Apr 2013 B2
8433076 Zurek et al. Apr 2013 B2
8442239 Bruelle-Drews et al. May 2013 B2
8457334 Yoon et al. Jun 2013 B2
8463184 Dua Jun 2013 B2
8463875 Katz et al. Jun 2013 B2
8473844 Kreifeldt et al. Jun 2013 B2
8477958 Moeller et al. Jul 2013 B2
8483853 Lambourne Jul 2013 B1
8509211 Trotter et al. Aug 2013 B2
8509463 Goh et al. Aug 2013 B2
8515389 Smetters et al. Aug 2013 B2
8520870 Sato et al. Aug 2013 B2
8565455 Worrell et al. Oct 2013 B2
8577048 Chaikin et al. Nov 2013 B2
8588949 Lambourne et al. Nov 2013 B2
8600084 Garrett Dec 2013 B1
8601394 Sheehan et al. Dec 2013 B2
8611559 Sanders Dec 2013 B2
8615091 Terwal Dec 2013 B2
8639830 Bowman Jan 2014 B2
8654995 Silber et al. Feb 2014 B2
8672744 Gronkowski et al. Mar 2014 B1
8683009 Ng et al. Mar 2014 B2
8688431 Lyons et al. Apr 2014 B2
8689036 Millington et al. Apr 2014 B2
8731206 Park May 2014 B1
8750282 Gelter et al. Jun 2014 B2
8751026 Sato et al. Jun 2014 B2
8762565 Togashi et al. Jun 2014 B2
8768252 Watson et al. Jul 2014 B2
8775546 Millington Jul 2014 B2
8797926 Kearney, III et al. Aug 2014 B2
8818538 Sakata Aug 2014 B2
8819554 Basso et al. Aug 2014 B2
8831761 Kemp et al. Sep 2014 B2
8843586 Pantos et al. Sep 2014 B2
8861739 Ojanpera Oct 2014 B2
8868698 Millington et al. Oct 2014 B2
8885851 Westenbroek Nov 2014 B2
8904066 Moore et al. Dec 2014 B2
8917877 Haaff et al. Dec 2014 B2
8930006 Haatainen Jan 2015 B2
8934647 Joyce et al. Jan 2015 B2
8934655 Breen et al. Jan 2015 B2
8938637 Millington et al. Jan 2015 B2
8942252 Balassanian et al. Jan 2015 B2
8942395 Lissaman et al. Jan 2015 B2
8954177 Sanders Feb 2015 B2
8965544 Ramsay Feb 2015 B2
8966394 Gates et al. Feb 2015 B2
9014833 Goh et al. Apr 2015 B2
9042556 Kallai et al. May 2015 B2
9078281 Matsuda et al. Jul 2015 B2
9130770 Millington et al. Sep 2015 B2
9137602 Mayman et al. Sep 2015 B2
9160965 Redman et al. Oct 2015 B2
9195258 Millington Nov 2015 B2
9407385 Spurgat et al. Aug 2016 B2
9456243 Hughes et al. Sep 2016 B1
9507780 Rothkopf et al. Nov 2016 B2
9560448 Hartung Jan 2017 B2
9628851 Hu et al. Apr 2017 B2
9977561 Bates et al. May 2018 B2
9998321 Cheshire Jun 2018 B2
10063896 Amidei et al. Aug 2018 B2
10133536 Millington Nov 2018 B2
10157033 Millington Dec 2018 B2
10175930 Millington Jan 2019 B2
10185541 Millington Jan 2019 B2
10303432 Millington May 2019 B2
20010001160 Shoff et al. May 2001 A1
20010009604 Ando et al. Jul 2001 A1
20010022823 Renaud Sep 2001 A1
20010027498 Van De Meulenhof et al. Oct 2001 A1
20010032188 Miyabe et al. Oct 2001 A1
20010042107 Palm Nov 2001 A1
20010043456 Atkinson Nov 2001 A1
20010046235 Trevitt et al. Nov 2001 A1
20010047377 Sincaglia et al. Nov 2001 A1
20010050991 Eves Dec 2001 A1
20010055950 Davies et al. Dec 2001 A1
20020002039 Qureshey et al. Jan 2002 A1
20020002562 Moran et al. Jan 2002 A1
20020002565 Ohyama Jan 2002 A1
20020003548 Krusche et al. Jan 2002 A1
20020015003 Kato et al. Feb 2002 A1
20020022453 Balog et al. Feb 2002 A1
20020026442 Lipscomb et al. Feb 2002 A1
20020034374 Barton Mar 2002 A1
20020035621 Zintel et al. Mar 2002 A1
20020042844 Chiazzese Apr 2002 A1
20020049843 Barone et al. Apr 2002 A1
20020056082 Hull et al. May 2002 A1
20020062406 Chang et al. May 2002 A1
20020065926 Hackney et al. May 2002 A1
20020067909 Iivonen Jun 2002 A1
20020072816 Shdema et al. Jun 2002 A1
20020072817 Champion Jun 2002 A1
20020073228 Cognet et al. Jun 2002 A1
20020078293 Kou et al. Jun 2002 A1
20020080783 Fujimori et al. Jun 2002 A1
20020083172 Knowles et al. Jun 2002 A1
20020083342 Webb et al. Jun 2002 A1
20020090914 Kang et al. Jul 2002 A1
20020093478 Yeh Jul 2002 A1
20020095460 Benson Jul 2002 A1
20020098878 Mooney et al. Jul 2002 A1
20020101357 Gharapetian Aug 2002 A1
20020103635 Mesarovic et al. Aug 2002 A1
20020109710 Holtz et al. Aug 2002 A1
20020112084 Deen et al. Aug 2002 A1
20020112244 Liou et al. Aug 2002 A1
20020114354 Sinha et al. Aug 2002 A1
20020114359 Ibaraki et al. Aug 2002 A1
20020124097 Isely Sep 2002 A1
20020124182 Bacso et al. Sep 2002 A1
20020129128 Gold et al. Sep 2002 A1
20020129156 Yoshikawa Sep 2002 A1
20020131398 Taylor Sep 2002 A1
20020131761 Kawasaki et al. Sep 2002 A1
20020136335 Liou et al. Sep 2002 A1
20020137505 Eiche et al. Sep 2002 A1
20020143998 Rajagopal et al. Oct 2002 A1
20020146981 Saint-Hilaire et al. Oct 2002 A1
20020150053 Gray et al. Oct 2002 A1
20020159596 Durand et al. Oct 2002 A1
20020163361 Parkin Nov 2002 A1
20020165721 Chang et al. Nov 2002 A1
20020165921 Sapieyevski Nov 2002 A1
20020168938 Chang Nov 2002 A1
20020173273 Spurgat et al. Nov 2002 A1
20020174243 Spurgat et al. Nov 2002 A1
20020174269 Spurgat et al. Nov 2002 A1
20020177411 Yajima et al. Nov 2002 A1
20020181355 Shikunami et al. Dec 2002 A1
20020184310 Traversat et al. Dec 2002 A1
20020188762 Tomassetti et al. Dec 2002 A1
20020194260 Headley et al. Dec 2002 A1
20020194309 Carter et al. Dec 2002 A1
20030002609 Faller et al. Jan 2003 A1
20030002849 Lord Jan 2003 A1
20030008616 Anderson Jan 2003 A1
20030014486 May Jan 2003 A1
20030018797 Dunning et al. Jan 2003 A1
20030020763 Mayer et al. Jan 2003 A1
20030023411 Witmer et al. Jan 2003 A1
20030023741 Tomassetti et al. Jan 2003 A1
20030035072 Hagg Feb 2003 A1
20030035444 Zwack Feb 2003 A1
20030037125 Luman et al. Feb 2003 A1
20030041173 Hoyle Feb 2003 A1
20030041174 Wen et al. Feb 2003 A1
20030043856 Lakaniemi et al. Mar 2003 A1
20030043924 Haddad et al. Mar 2003 A1
20030046703 Knowles et al. Mar 2003 A1
20030050058 Walsh et al. Mar 2003 A1
20030055892 Huitema et al. Mar 2003 A1
20030056220 Thornton et al. Mar 2003 A1
20030061428 Garney et al. Mar 2003 A1
20030063528 Ogikubo Apr 2003 A1
20030063755 Nourse et al. Apr 2003 A1
20030066094 Van Der Schaar et al. Apr 2003 A1
20030067437 McClintock et al. Apr 2003 A1
20030073432 Meade Apr 2003 A1
20030097478 King May 2003 A1
20030099212 Anjum et al. May 2003 A1
20030099221 Rhee May 2003 A1
20030100335 Gassho et al. May 2003 A1
20030101253 Saito et al. May 2003 A1
20030103088 Dresti et al. Jun 2003 A1
20030103464 Wong et al. Jun 2003 A1
20030109270 Shorty Jun 2003 A1
20030110329 Higaki et al. Jun 2003 A1
20030118158 Hattori Jun 2003 A1
20030123853 Iwahara et al. Jul 2003 A1
20030126211 Anttila et al. Jul 2003 A1
20030135822 Evans Jul 2003 A1
20030157951 Hasty, Jr. Aug 2003 A1
20030164084 Redmann et al. Sep 2003 A1
20030167335 Alexander Sep 2003 A1
20030172123 Polan et al. Sep 2003 A1
20030179780 Walker et al. Sep 2003 A1
20030182254 Plastina et al. Sep 2003 A1
20030185400 Yoshizawa et al. Oct 2003 A1
20030187657 Erhart et al. Oct 2003 A1
20030195964 Mane Oct 2003 A1
20030198254 Sullivan et al. Oct 2003 A1
20030198255 Sullivan et al. Oct 2003 A1
20030198257 Sullivan et al. Oct 2003 A1
20030200001 Goddard Oct 2003 A1
20030204273 Dinker et al. Oct 2003 A1
20030204509 Dinker et al. Oct 2003 A1
20030210347 Kondo Nov 2003 A1
20030210796 McCarty et al. Nov 2003 A1
20030212802 Rector et al. Nov 2003 A1
20030219007 Barrack et al. Nov 2003 A1
20030220705 Ibey Nov 2003 A1
20030225834 Lee et al. Dec 2003 A1
20030227478 Chatfield Dec 2003 A1
20030229900 Reisman Dec 2003 A1
20030231208 Hanon et al. Dec 2003 A1
20030231871 Ushimaru Dec 2003 A1
20030235304 Evans et al. Dec 2003 A1
20040001106 Deutscher et al. Jan 2004 A1
20040001484 Ozguner Jan 2004 A1
20040001591 Mani et al. Jan 2004 A1
20040002938 Deguchi Jan 2004 A1
20040008852 Aiso et al. Jan 2004 A1
20040010727 Fujinami Jan 2004 A1
20040012620 Buhler et al. Jan 2004 A1
20040014426 Moore Jan 2004 A1
20040015252 Aiso et al. Jan 2004 A1
20040019497 Volk et al. Jan 2004 A1
20040019807 Freund et al. Jan 2004 A1
20040019911 Gates et al. Jan 2004 A1
20040023697 Komura Feb 2004 A1
20040024478 Hans et al. Feb 2004 A1
20040024925 Cypher et al. Feb 2004 A1
20040027166 Mangum et al. Feb 2004 A1
20040032348 Lai et al. Feb 2004 A1
20040032421 Williamson et al. Feb 2004 A1
20040032922 Knapp et al. Feb 2004 A1
20040037433 Chen Feb 2004 A1
20040041836 Zaner et al. Mar 2004 A1
20040042629 Mellon et al. Mar 2004 A1
20040044742 Evron Mar 2004 A1
20040048569 Kawamura Mar 2004 A1
20040059842 Hanson et al. Mar 2004 A1
20040059965 Marshall et al. Mar 2004 A1
20040066736 Kroeger Apr 2004 A1
20040075767 Neuman et al. Apr 2004 A1
20040078383 Mercer et al. Apr 2004 A1
20040078828 Parchman et al. Apr 2004 A1
20040080671 Siemens et al. Apr 2004 A1
20040093096 Huang et al. May 2004 A1
20040098754 Vella et al. May 2004 A1
20040111473 Lysenko et al. Jun 2004 A1
20040117462 Bodin et al. Jun 2004 A1
20040117491 Karaoguz et al. Jun 2004 A1
20040117840 Boudreau et al. Jun 2004 A1
20040117858 Boudreau et al. Jun 2004 A1
20040128701 Kaneko et al. Jul 2004 A1
20040131192 Metcalf Jul 2004 A1
20040133689 Vasisht Jul 2004 A1
20040143368 May et al. Jul 2004 A1
20040143675 Aust Jul 2004 A1
20040143852 Meyers Jul 2004 A1
20040148237 Bittmann et al. Jul 2004 A1
20040168081 Ladas et al. Aug 2004 A1
20040170383 Mazur Sep 2004 A1
20040171346 Lin Sep 2004 A1
20040176025 Holm et al. Sep 2004 A1
20040177167 Iwamura et al. Sep 2004 A1
20040179554 Tsao Sep 2004 A1
20040183827 Putterman et al. Sep 2004 A1
20040185773 Gerber et al. Sep 2004 A1
20040189363 Takano Sep 2004 A1
20040195313 Lee Oct 2004 A1
20040203376 Phillipps Oct 2004 A1
20040203378 Powers Oct 2004 A1
20040203590 Shteyn Oct 2004 A1
20040203936 Ogino et al. Oct 2004 A1
20040208158 Fellman et al. Oct 2004 A1
20040213230 Douskalis et al. Oct 2004 A1
20040214524 Noda et al. Oct 2004 A1
20040223622 Lindemann et al. Nov 2004 A1
20040224638 Fadell et al. Nov 2004 A1
20040228367 Mosig et al. Nov 2004 A1
20040248601 Chang Dec 2004 A1
20040249490 Sakai Dec 2004 A1
20040249965 Huggins et al. Dec 2004 A1
20040249982 Arnold et al. Dec 2004 A1
20040252400 Blank et al. Dec 2004 A1
20040253969 Nguyen et al. Dec 2004 A1
20050010691 Oyadomari et al. Jan 2005 A1
20050011388 Kouznetsov Jan 2005 A1
20050013394 Rausch et al. Jan 2005 A1
20050015551 Eames et al. Jan 2005 A1
20050021590 Debique et al. Jan 2005 A1
20050021811 Roelens Jan 2005 A1
20050027821 Alexander et al. Feb 2005 A1
20050047605 Lee et al. Mar 2005 A1
20050058149 Howe Mar 2005 A1
20050060435 Xue et al. Mar 2005 A1
20050062637 El Zabadani et al. Mar 2005 A1
20050081213 Suzuoki et al. Apr 2005 A1
20050100166 Smetters et al. May 2005 A1
20050102699 Kim et al. May 2005 A1
20050105052 McCormick May 2005 A1
20050114538 Rose May 2005 A1
20050120128 Willes et al. Jun 2005 A1
20050125222 Brown et al. Jun 2005 A1
20050125357 Saadat et al. Jun 2005 A1
20050129240 Balfanz et al. Jun 2005 A1
20050131558 Braithwaite et al. Jun 2005 A1
20050149204 Manchester et al. Jul 2005 A1
20050154766 Huang et al. Jul 2005 A1
20050159833 Giaimo Jul 2005 A1
20050160270 Goldberg et al. Jul 2005 A1
20050166135 Burke et al. Jul 2005 A1
20050168630 Yamada et al. Aug 2005 A1
20050170781 Jacobsen et al. Aug 2005 A1
20050177643 Xu Aug 2005 A1
20050181348 Carey et al. Aug 2005 A1
20050195205 Abrams Sep 2005 A1
20050195823 Chen et al. Sep 2005 A1
20050197725 Alexander et al. Sep 2005 A1
20050198574 Lamkin et al. Sep 2005 A1
20050201549 Dedieu et al. Sep 2005 A1
20050215265 Sharma Sep 2005 A1
20050216556 Manion et al. Sep 2005 A1
20050239445 Karaoguz et al. Oct 2005 A1
20050246421 Moore et al. Nov 2005 A1
20050262217 Nonaka Nov 2005 A1
20050281255 Davies et al. Dec 2005 A1
20050283820 Richards et al. Dec 2005 A1
20050288805 Moore et al. Dec 2005 A1
20050289224 Deslippe et al. Dec 2005 A1
20060002681 Spilo et al. Jan 2006 A1
20060041639 Lamkin et al. Feb 2006 A1
20060049966 Ozawa et al. Mar 2006 A1
20060072489 Toyoshima Apr 2006 A1
20060095516 Wijeratne May 2006 A1
20060098936 Ikeda et al. May 2006 A1
20060119497 Miller et al. Jun 2006 A1
20060142034 Wentink et al. Jun 2006 A1
20060143236 Wu Jun 2006 A1
20060149850 Bowman Jul 2006 A1
20060155721 Grunwald et al. Jul 2006 A1
20060156374 Hu et al. Jul 2006 A1
20060161742 Sugimoto et al. Jul 2006 A1
20060173844 Zhang et al. Aug 2006 A1
20060173976 Vincent et al. Aug 2006 A1
20060193454 Abou-Chakra et al. Aug 2006 A1
20060215741 Chin et al. Sep 2006 A1
20060222186 Paige et al. Oct 2006 A1
20060227985 Kawanami Oct 2006 A1
20060259649 Hsieh et al. Nov 2006 A1
20060265571 Bosch et al. Nov 2006 A1
20060270395 Dhawan et al. Nov 2006 A1
20060281409 Levien et al. Dec 2006 A1
20060287746 Braithwaite et al. Dec 2006 A1
20070003067 Gierl et al. Jan 2007 A1
20070022207 Millington Jan 2007 A1
20070038999 Millington Feb 2007 A1
20070043847 Carter et al. Feb 2007 A1
20070047712 Gross et al. Mar 2007 A1
20070048713 Plastina et al. Mar 2007 A1
20070054680 Mo et al. Mar 2007 A1
20070087686 Holm et al. Apr 2007 A1
20070142022 Madonna et al. Jun 2007 A1
20070142944 Goldberg Jun 2007 A1
20070143493 Mullig et al. Jun 2007 A1
20070169115 Ko et al. Jul 2007 A1
20070180137 Rajapakse Aug 2007 A1
20070192156 Gauger Aug 2007 A1
20070220150 Sarg Sep 2007 A1
20070249295 Ukita et al. Oct 2007 A1
20070265031 Koizumi et al. Nov 2007 A1
20070271388 Bowra et al. Nov 2007 A1
20070299778 Haveson et al. Dec 2007 A1
20080002836 Moeller et al. Jan 2008 A1
20080007649 Bennett Jan 2008 A1
20080007650 Bennett Jan 2008 A1
20080007651 Bennett Jan 2008 A1
20080018785 Bennett Jan 2008 A1
20080022320 Ver Steeg Jan 2008 A1
20080025535 Rajapakse Jan 2008 A1
20080060084 Gappa et al. Mar 2008 A1
20080072816 Riess et al. Mar 2008 A1
20080075295 Mayman et al. Mar 2008 A1
20080077619 Gilley et al. Mar 2008 A1
20080077620 Gilley et al. Mar 2008 A1
20080086318 Gilley et al. Apr 2008 A1
20080091771 Allen et al. Apr 2008 A1
20080109852 Kretz et al. May 2008 A1
20080120429 Millington et al. May 2008 A1
20080126943 Parasnis et al. May 2008 A1
20080144861 Melanson et al. Jun 2008 A1
20080144864 Huon et al. Jun 2008 A1
20080146289 Korneluk et al. Jun 2008 A1
20080189272 Powers et al. Aug 2008 A1
20080205070 Osada Aug 2008 A1
20080212729 Shaanan Sep 2008 A1
20080212786 Park Sep 2008 A1
20080215169 Debettencourt et al. Sep 2008 A1
20080263010 Roychoudhuri et al. Oct 2008 A1
20080273714 Hartung Nov 2008 A1
20080303947 Ohnishi et al. Dec 2008 A1
20090011798 Yamada Jan 2009 A1
20090017868 Ueda et al. Jan 2009 A1
20090031336 Chavez et al. Jan 2009 A1
20090060219 Inohara Mar 2009 A1
20090062947 Lydon et al. Mar 2009 A1
20090070434 Himmelstein Mar 2009 A1
20090077610 White et al. Mar 2009 A1
20090087000 Ko Apr 2009 A1
20090089327 Kalaboukis et al. Apr 2009 A1
20090100189 Barren et al. Apr 2009 A1
20090124289 Nishida May 2009 A1
20090157905 Davis Jun 2009 A1
20090164655 Pettersson et al. Jun 2009 A1
20090169030 Inohara Jul 2009 A1
20090193345 Wensley et al. Jul 2009 A1
20090222115 Malcolm et al. Sep 2009 A1
20090222392 Martin et al. Sep 2009 A1
20090228919 Zott et al. Sep 2009 A1
20090251604 Iyer Oct 2009 A1
20100004983 Dickerson et al. Jan 2010 A1
20100031366 Knight et al. Feb 2010 A1
20100049835 Ko et al. Feb 2010 A1
20100087089 Struthers et al. Apr 2010 A1
20100228740 Cannistraro et al. Sep 2010 A1
20100284389 Ramsay et al. Nov 2010 A1
20100299639 Ramsay et al. Nov 2010 A1
20110001632 Hohorst Jan 2011 A1
20110002487 Panther et al. Jan 2011 A1
20110066943 Brillon et al. Mar 2011 A1
20110222701 Donaldson et al. Sep 2011 A1
20110228944 Croghan et al. Sep 2011 A1
20110316768 McRae Dec 2011 A1
20120029671 Millington et al. Feb 2012 A1
20120030366 Collart et al. Feb 2012 A1
20120051567 Castor-Perry Mar 2012 A1
20120060046 Millington Mar 2012 A1
20120129446 Ko et al. May 2012 A1
20120148075 Goh et al. Jun 2012 A1
20120185771 Rothkopf et al. Jul 2012 A1
20120192071 Millington Jul 2012 A1
20120207290 Moyers et al. Aug 2012 A1
20120237054 Eo et al. Sep 2012 A1
20120281058 Laney et al. Nov 2012 A1
20120290621 Heitz, III et al. Nov 2012 A1
20130013757 Millington et al. Jan 2013 A1
20130018960 Knysz et al. Jan 2013 A1
20130031475 Maor et al. Jan 2013 A1
20130038726 Kim Feb 2013 A1
20130041954 Kim et al. Feb 2013 A1
20130047084 Sanders et al. Feb 2013 A1
20130052940 Brillhart et al. Feb 2013 A1
20130070093 Rivera et al. Mar 2013 A1
20130080599 Ko et al. Mar 2013 A1
20130124664 Fonseca, Jr. et al. May 2013 A1
20130129122 Johnson et al. May 2013 A1
20130132837 Mead et al. May 2013 A1
20130159126 Elkady Jun 2013 A1
20130167029 Friesen et al. Jun 2013 A1
20130174100 Seymour et al. Jul 2013 A1
20130174223 Dykeman et al. Jul 2013 A1
20130179163 Herbig et al. Jul 2013 A1
20130191454 Oliver et al. Jul 2013 A1
20130197682 Millington Aug 2013 A1
20130226323 Millington Aug 2013 A1
20130230175 Bech et al. Sep 2013 A1
20130232416 Millington Sep 2013 A1
20130253934 Parekh et al. Sep 2013 A1
20130279706 Marti et al. Oct 2013 A1
20130287186 Quady Oct 2013 A1
20130290504 Quady Oct 2013 A1
20140006483 Garmark et al. Jan 2014 A1
20140037097 Labosco Feb 2014 A1
20140064501 Olsen et al. Mar 2014 A1
20140075308 Sanders et al. Mar 2014 A1
20140075311 Boettcher et al. Mar 2014 A1
20140079242 Nguyen et al. Mar 2014 A1
20140108929 Garmark et al. Apr 2014 A1
20140123005 Forstall et al. May 2014 A1
20140140530 Gomes-Casseres et al. May 2014 A1
20140161265 Chaikin et al. Jun 2014 A1
20140181569 Millington et al. Jun 2014 A1
20140233755 Kim et al. Aug 2014 A1
20140242913 Pang Aug 2014 A1
20140256260 Ueda et al. Sep 2014 A1
20140267148 Luna et al. Sep 2014 A1
20140270202 Ivanov et al. Sep 2014 A1
20140273859 Luna et al. Sep 2014 A1
20140279889 Luna Sep 2014 A1
20140285313 Luna et al. Sep 2014 A1
20140286496 Luna et al. Sep 2014 A1
20140298174 Ikonomov Oct 2014 A1
20140323036 Daley et al. Oct 2014 A1
20140344689 Scott et al. Nov 2014 A1
20140378056 Liu et al. Dec 2014 A1
20150019670 Redmann Jan 2015 A1
20150026613 Kwon et al. Jan 2015 A1
20150032844 Tarr et al. Jan 2015 A1
20150043736 Olsen et al. Feb 2015 A1
20150049248 Wang et al. Feb 2015 A1
20150074527 Sevigny et al. Mar 2015 A1
20150074528 Sakalowsky et al. Mar 2015 A1
20150098576 Sundaresan et al. Apr 2015 A1
20150139210 Marin et al. May 2015 A1
20150256954 Carlsson et al. Sep 2015 A1
20150304288 Balasaygun et al. Oct 2015 A1
20150365987 Weel Dec 2015 A1
20190324713 Millington Oct 2019 A1
20190339934 Millington Nov 2019 A1
20190369954 Millington Dec 2019 A1
Foreign Referenced Citations (78)
Number Date Country
2320451 Mar 2001 CA
2485100 Nov 2003 CA
1598767 Mar 2005 CN
101292500 Oct 2008 CN
0251584 Jan 1988 EP
0672985 Sep 1995 EP
0772374 May 1997 EP
1058985 Dec 2000 EP
1111527 Jun 2001 EP
1122931 Aug 2001 EP
1295420 Mar 2003 EP
1312188 May 2003 EP
1389853 Feb 2004 EP
2713281 Apr 2004 EP
1517464 Mar 2005 EP
0895427 Jan 2006 EP
1416687 Aug 2006 EP
1410686 Mar 2008 EP
2043381 Apr 2009 EP
2161950 Mar 2010 EP
0742674 Apr 2014 EP
2591617 Jun 2014 EP
2284327 May 1995 GB
2338374 Dec 1999 GB
2379533 Mar 2003 GB
2486183 Jun 2012 GB
63269633 Nov 1988 JP
07-210129 Aug 1995 JP
2000149391 May 2000 JP
2001034951 Feb 2001 JP
2001251360 Sep 2001 JP
2002111817 Apr 2002 JP
2002123267 Apr 2002 JP
2002141915 May 2002 JP
2002358241 Dec 2002 JP
2003037585 Feb 2003 JP
2003037585 Feb 2003 JP
2003506765 Feb 2003 JP
2003101958 Apr 2003 JP
2003101958 Apr 2003 JP
2003169089 Jun 2003 JP
2005108427 Apr 2005 JP
2005136457 May 2005 JP
2007241652 Sep 2007 JP
2009506603 Feb 2009 JP
2009075540 Apr 2009 JP
2009135750 Jun 2009 JP
2009535708 Oct 2009 JP
2009538006 Oct 2009 JP
2011130496 Jun 2011 JP
20030011128 Feb 2003 KR
439027 Jun 2001 TW
199525313 Sep 1995 WO
9709756 Mar 1997 WO
1999023560 May 1999 WO
199961985 Dec 1999 WO
0019693 Apr 2000 WO
0110125 Feb 2001 WO
200153994 Jul 2001 WO
02073851 Sep 2002 WO
02073851 Sep 2002 WO
02091596 Nov 2002 WO
03023759 Mar 2003 WO
03058965 Jul 2003 WO
03093950 Nov 2003 WO
03096741 Nov 2003 WO
2003093950 Nov 2003 WO
2005013047 Feb 2005 WO
2007023120 Mar 2007 WO
2007127485 Nov 2007 WO
2007131555 Nov 2007 WO
2007135581 Nov 2007 WO
2008046530 Apr 2008 WO
2008082350 Jul 2008 WO
2008114389 Sep 2008 WO
2012050927 Apr 2012 WO
2014004182 Jan 2014 WO
2014149533 Sep 2014 WO
Non-Patent Literature Citations (1181)
“RVL-6 Modular Multi-Room Controller, Installation & Operation Guide,” Nile Audio Corporations, 1999, 46 pages.
Schertel, Barry. Griffin Evolve Wireless iPod Speakers, Feb. 18, 2008, 4 pages.
Schmandt et al., “Impromptu: Managing Networked Audio Applications for Mobile Users,” 2004, 11 pages.
Schulzrinne et al., “RTP: A Transport Protocol for Real-Time Applications,” Network Working Group, Jan. 1996, pp. 1-75.
Schulzrinne H., et al., “RTP: A Transport Protocol for Real-Time Applications, RFC 3550,” Network Working Group, 2003, pp. 1-89.
Shannon, Victoria. The New York Times, Company supports Apple: Philips sets up a ‘Rendezvous’, Sep. 11, 2002, 2 pages.
Sieborger, D. R., Multiprotocol Control of Networked Home Entertainment Devices, Feb. 2004, 131 pages.
Simple Network Time Protocol (SNTP), RFC 1361 (Aug. 1992) (D+M_0397537-46) (10 pages).
Simple Network Time Protocol (SNTP), RFC 1769 (Mar. 1995) (D+M_0397663-76) (14 pages).
Simple Service Discovery Protocol/1.0 Operating without an Arbiter (Oct. 28, 1999) (24 pages).
Slim Devices. Index of/downloads, slimdevices.com, Apr. 13, 2003, 1 page.
Slim Devices. Index of/downloads/SLIMP3_Server_v4.0. slimdevices.com, Apr. 14, 2003, 1 page.
Slim Devices. Support Downloads, slimdevices.com, Apr. 11, 2003, 1 page.
SMC EZ-Stream Universal Wireless Multimedia Receiver—NextUp, Dec. 5, 2003, 4 pages.
SMC Network. SMCWMR-AG—EZ-Stream Universal Wireless Multimedia Receiver, Dec. 3, 2003, 2 pages.
SMC Networks Consumer Site. About SMC: Press Release Details, Feb. 21, 2004, 2 pages.
SMC Networks Consumer Site. Products: Home Entertainment Networking, Dec. 10, 2003, 1 page.
SMC Networks Consumer Site. Products: Home Entertainment Networking, Feb. 7, 2004, 1 page.
SMC Networks Consumer Site. Support: Support Center Downloads, Feb. 7, 2004, 1 page.
SMC Networks EZ-Stream Universal 2.4GHz/5GHz Wireless Multimedia Receiver. SMCWMR-AG Users Manual, 60 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
SMC Networks. SMCWAA-B EZ-Stream 2.4GHz Wireless Audio Adapter. User Guide, 2004, 51 pages.
SMC Networks. SMCWMR-AG EZ-Stream Universal Wireless Multimedia Receiver. User Guide, 2003, 43 pages.
SMC-GT1255FTX-SC EZ Card. SMC Networks: What's New, Feb. 5, 2004, 7 pages.
Snarfed/p4sync. GitHub: A library and plugins for a few music players that (attempts to) synchronize playback across multiple computers, 2 pages [online]. [retrieved on Mar. 26, 2020]. Retrieved online URL: https://github.com/snarfed/p4sync.
Software & drivers. Micro Audio System MCW770/37. Philips. Copyright 2004-2020, 3 pages [online]. [retrieved on Feb. 24, 2020]. Retrieved from the Internet URL: https://www.usa.philips.com/c-p/MCW770_37/-support.
Solid State Logic G Series Master Studio System: SSL G Series Console, 1988, 113 pages.
Solid State Logic G Series Systems Operator's Shortform Guide: SSL G Series Console, 1994, 49 pages.
Solid State Logic SL 9000 J Series Total Studio System: Console Operator's Manual, 1994, 415 pages.
Sonos, Inc. v. D&M Holdings, D&M Supp Opposition Brief including Exhibits, Mar. 17, 2017, 23 pages.
Sonos, Inc. v. D&M Holdings, Expert Report of Jay P. Kesan including Appendices A-P, Feb. 20, 2017, 776 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Complaint for Patent Infringement, filed Oct. 21, 2014, 20 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions, filed Sep. 14, 2016, 100 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions, filed Apr. 15, 2016, 97 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Preliminary Identification of Indefinite Terms, provided Jul. 29, 2016, 8 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendants' 35 U.S.C. § 282 Notice filed Nov. 2, 2017, 31 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendants' Amended Answer, Defenses, and Counterclaims for Patent Infringement, filed Nov. 30, 2015, 47 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendants' Answer to Plaintiff's Second Amended Complaint, filed Apr. 30, 2015, 19 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendants' First Amended Answer to Plaintiffs' Third Amended Complaint, filed Sep. 7, 2016, 23 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendants' Reply in Support of Partial Motion for Judgment on the Pleadings, filed Jun. 10, 2016, 15 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Exhibit A Defendants' First Amended Answer to Plaintiffs' Third Amended Complaint, provided Aug. 1, 2016, 26 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Exhibit A Defendants' Second Amended Answer to Plaintiffs' Third Amended Complaint, filed Sep. 9, 2016, 43 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Exhibit A Defendants' Second Amended Answer to Plaintiffs' Third Amended Complaint, provided Sep. 9, 2016, 88 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., First Amended Complaint for Patent Infringement, filed Dec. 17, 2014, 26 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Joint Claim Construction Chart, vol. 1 of 3 with Exhibits A-O, filed Aug. 17, 2016, 30 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Opening Brief in Support of Defendants' Partial Motion for Judgment on the Pleadings for Lack of Patent-Eligible Subject Matter, filed May 6, 2016, 27 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Plaintiff Sonos, Inc.'s Opening Claim Construction Brief, filed Sep. 9, 2016, 26 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Plaintiff Sonos, Inc.'s Response in Opposition to Defendants' Partial Motion for Judgment on the Pleadings, filed May 27, 2016, 24 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Reply Brief in Support of Defendants' Motion for Leave to Amend their Answer to Add the Defense of Inequitable Conduct, provided Nov. 10, 2016, 16 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Reply Brief in Support of Defendants' Motion for Leave to Amend their Answer to Add the Defense of Inequitable Conduct, provided Sep. 9, 2016, 16 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Second Amended Complaint for Patent Infringement, filed Feb. 27, 2015, 49 pages.
European Patent Office, Office Action dated Nov. 25, 2016, issued in connection with EP Application No. 13810340.3, 5 pages.
European Patent Office, Summons to Attend Oral Proceedings mailed on Jul. 10, 2018, issued in connection with European Application No. 13184747.7, 10 pages.
Examiner's Answer to Appeal Brief mailed on Nov. 5, 2018, issued in connection with U.S. Reexamination Control No. 90/013959 for U.S. Pat. No. 9,213,357, filed on Jun. 16, 2017, 14 pages.
Exstreamer—The Exstreamer Instruction Manual Version 1.5. Barix Think Further. Sourced from Sonos, Inc. v. Lenbrook Industries Limited et al., Defendants' Answer to Plaintiff's Complaint—Exhibit E, filed Oct. 14, 2019, 21 pages.
Exstreamer—The Exstreamer Technical Description Version 1.5. Barix Think Further. Sourced from Sonos, Inc. v. Lenbrook Industries Limited et al., Defendants' Answer to Plaintiff's Complaint—Exhibit D, filed Oct. 14, 2019, 36 pages.
Exstreamer. Network MP3 player for digital audio streaming in a consumer, home installation and commercial applications. Barix Think Further. Sep. 2002, 2 pages.
Exstreamer. The Exstreamer Instruction Manual. Barix Think Further. Version 1.5, Oct. 2002, 21 pages.
Exstreamer. The Exstreamer Technical Description: Version 1.5. Barix Think Further. Oct. 2002, 36 pages.
Extron System Integrator Speakers. System Integrator Speaker Series. ExtroNews. Issue 16.2, Winter 2005, 32 pages.
EZ-Stream 11 Mbps Wireless Audio Adapter. Model No. SMCWAA-B. Home Entertainment Networking, 2 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Falcone, John, “Sonos BU150 Digital Music System review,” CNET, CNET [online] Jul. 27, 2009 [retrieved on Mar. 16, 2016], 11 pages. Retrieved from the Internet: URL:http://www.cnet.com/products/sonos-bu150-digital-music-system/.
Faller, Christof, “Coding of Spatial Audio Compatible with Different Playback Formats,” Audio Engineering Society Convention Paper (Presented at the 117th Convention), Oct. 28-31, 2004, 12 pages.
Fielding et al. RFC 2616 Hypertext Transfer Protocol—HTTP/1.1, Jun. 1999, 114 pages.
File History of Re-Examination U.S. Appl. No. 90/013,423.
Final Office Action dated Jun. 5, 2014, issued in connection with U.S. Appl. No. 13/907,666, filed May 31, 2013, 12 pages.
Final Office Action dated Jul. 13, 2009, issued in connection with U.S. Appl. No. 10/816,217, filed Apr. 1, 2004, 16 pages.
Final Office Action dated Sep. 13, 2012, issued in connection with U.S. Appl. No. 13/297,000, filed Nov. 15, 2011, 17 pages.
Final Office Action dated Nov. 18, 2015, issued in connection with U.S. Appl. No. 13/533,105, filed Jun. 26, 2012, 56 pages.
Final Office Action dated Oct. 21, 2011, issued in connection with U.S. Appl. No. 10/816,217, filed Apr. 1, 2004, 19 pages.
Final Office Action dated Mar. 27, 2014, issued in connection with U.S. Appl. No. 13/533,105, filed Jun. 26, 2012, 29 pages.
Final Office Action dated Jan. 28, 2011, issued in connection with U.S. Appl. No. 10/816,217, filed Apr. 1, 2004, 21 pages.
Final Office Action dated Jun. 30, 2008, issued in connection with U.S. Appl. No. 10/816,217, filed Apr. 1, 2004, 30 pages.
Final Office Action dated Jun. 2, 2017, issued in connection with U.S. Appl. No. 13/848,932, filed Mar. 22, 2013, 32 pages.
Final Office Action dated Aug. 3, 2015, issued in connection with U.S. Appl. No. 13/848,921, filed Mar. 22, 2013, 13 pages.
Final Office Action dated Dec. 3, 2014, issued in connection with U.S. Appl. No. 14/184,528, filed Feb. 19, 2014, 12 pages.
Final Office Action dated Jul. 3, 2012, issued in connection with U.S. Appl. No. 13/298,090, filed Nov. 16, 2011, 46 pages.
Final Office Action dated Jun. 3, 2016, issued in connection with U.S. Appl. No. 13/705,176, filed Dec. 5, 2012, 24 pages.
Final Office Action dated Mar. 3, 2015, issued in connection with U.S. Appl. No. 13/864,251, filed Apr. 17, 2013, 13 pages.
Final Office Action dated Mar. 4, 2015, issued in connection with U.S. Appl. No. 13/848,904, filed Mar. 22, 2013, 16 pages.
Final Office Action dated Jul. 5, 2013, issued in connection with U.S. Appl. No. 13/618,829, filed Sep. 14, 2012, 22 pages.
Final Office Action dated Mar. 5, 2015, issued in connection with U.S. Appl. No. 13/888,203, filed May 6, 2013, 13 pages.
Final Office Action dated Jun. 6, 2018, issued in connection with U.S. Appl. No. 14/184,522, filed Feb. 19, 2014, 14 pages.
Final Office Action dated Jan. 7, 2015, issued in connection with U.S. Appl. No. 13/848,932, filed Mar. 22, 2013, 14 pages.
Final Office Action dated Mar. 8, 2017, issued in connection with U.S. Appl. No. 14/486,667, filed Sep. 15, 2014, 39 pages.
Final Office Action dated Nov. 8, 2017, issued in connection with U.S. Appl. No. 13/848,921, filed Mar. 22, 2013, 13 pages.
Final Office Action dated Nov. 8, 2017, issued in connection with U.S. Appl. No. 13/871,785, filed Apr. 26, 2013, 12 pages.
Final Office Action dated Mar. 9, 2015, issued in connection with U.S. Appl. No. 14/516,867, filed Oct. 17, 2014, 14 pages.
Final Office Action dated Mar. 1, 2018, issued in connection with U.S. Appl. No. 13/705,176, filed Dec. 5, 2012, 16 pages.
Final Office Action dated Apr. 10, 2017, issued in connection with U.S. Appl. No. 15/243,355, filed Aug. 22, 2016, 15 pages.
Final Office Action dated Aug. 10, 2015, issued in connection with U.S. Appl. No. 14/290,493, filed May 29, 2014, 26 pages.
Final Office Action dated Aug. 11, 2015, issued in connection with U.S. Appl. No. 13/864,247, filed Apr. 17, 2013, 15 pages.
Final Office Action dated Feb. 11, 2015, issued in connection with U.S. Appl. No. 14/184,526, filed Feb. 19, 2014, 13 pages.
Final Office Action dated Feb. 11, 2015, issued in connection with U.S. Appl. No. 14/184,935, filed Feb. 20, 2014, 17 pages.
Final Office Action dated Jul. 11, 2017, issued in connection with U.S. Appl. No. 15/243,186, filed Aug. 22, 2016, 13 pages.
Final Office Action dated Jul. 11, 2018, issued in connection with U.S. Appl. No. 13/864,249, filed Apr. 17, 2013, 13 pages.
Final Office Action dated Jul. 11, 2018, issued in connection with U.S. Appl. No. 13/864,252, filed Apr. 17, 2013, 10 pages.
Final Office Action dated Feb. 12, 2015, issued in connection with U.S. Appl. No. 14/184,522, filed Feb. 19, 2014, 20 pages.
Final Office Action dated Jan. 12, 2017, issued in connection with U.S. Appl. No. 14/504,812, filed Oct. 2, 2014, 25 pages.
Final Office Action dated Dec. 13, 2016, issued in connection with U.S. Appl. No. 13/871,795, filed Apr. 26, 2013, 41 pages.
Final Office Action dated Oct. 13, 2011, issued in connection with U.S. Appl. No. 12/035,112, filed Feb. 21, 2008, 10 pages.
Notice of Allowance dated May 19, 2015, issued in connection with U.S. Appl. No. 13/907,666, filed May 31, 2013, 7 pages.
Notice of Allowance dated Nov. 19, 2018, issued in connection with U.S. Appl. No. 14/184,528, filed Feb. 19, 2014, 19 pages.
Notice of Allowance dated Oct. 19, 2016, issued in connection with U.S. Appl. No. 14/290,493, filed May 29, 2014, 14 pages.
Notice of Allowance dated Mar. 2, 2021, issued in connection with U.S. Appl. No. 13/864,249, filed Apr. 17, 2013, 8 pages.
Notice of Allowance dated Nov. 20, 2018, issued in connection with U.S. Appl. No. 16/119,352, filed Aug. 31, 2018, 9 pages.
Notice of Allowance dated Nov. 20, 2018, issued in connection with U.S. Appl. No. 16/119,638, filed Aug. 31, 2018, 9 pages.
Notice of Allowance dated May 21, 2019, issued in connection with U.S. Appl. No. 14/184,522, filed Feb. 19, 2014, 9 pages.
Notice of Allowance dated Oct. 21, 2021, issued in connection with U.S. Appl. No. 17/306,016, filed May 3, 2021, 10 pages.
Notice of Allowance dated Sep. 21, 2015, issued in connection with U.S. Appl. No. 13/297,000, filed Nov. 15, 2011, 11 pages.
Notice of Allowance dated Dec. 22, 2016, issued in connection with U.S. Appl. No. 15/080,716, filed Mar. 25, 2016, 9 pages.
Notice of Allowance dated Oct. 22, 2018, issued in connection with U.S. Appl. No. 14/058,166, filed Oct. 18, 2013, 7 pages.
Notice of Allowance dated Sep. 22, 2015, issued in connection with U.S. Appl. No. 13/888,203, filed May 6, 2013, 7 pages.
Notice of Allowance dated Sep. 22, 2020, issued in connection with U.S. Appl. No. 16/545,844, filed Aug. 20, 2019, 8 pages.
Notice of Allowance dated Nov. 23, 2016, issued in connection with U.S. Appl. No. 14/803,953, filed Jul. 20, 2015, 21 pages.
Notice of Allowance dated Oct. 23, 2018, issued in connection with U.S. Appl. No. 14/516,883, filed Oct. 17, 2014, 10 pages.
Notice of Allowance dated Aug. 24, 2020, issued in connection with U.S. Appl. No. 16/009,182, filed Jun. 14, 2018, 8 pages.
Notice of Allowance dated Sep. 24, 2015, issued in connection with U.S. Appl. No. 13/705,174, filed Dec. 5, 2012, 7 pages.
Notice of Allowance dated Sep. 24, 2015, issued in connection with U.S. Appl. No. 14/184,935, filed Feb. 20, 2014, 7 pages.
Notice of Allowance dated Apr. 25, 2017, issued in connection with U.S. Appl. No. 15/156,392, filed May 17, 2016, 8 pages.
Notice of Allowance dated Feb. 25, 2022, issued in connection with U.S. Appl. No. 17/540,631, filed Dec. 2, 2021, 27 pages.
Notice of Allowance dated Mar. 25, 2019, issued in connection with U.S. Appl. No. 15/972,383, filed May 7, 2018, 9 pages.
Notice of Allowance dated Nov. 25, 2020, issued in connection with U.S. Appl. No. 16/999,048, filed Aug. 20, 2020, 9 pages.
Notice of Allowance dated Sep. 25, 2014, issued in connection with U.S. Appl. No. 14/176,808, filed Feb. 10, 2014, 5 pages.
Notice of Allowance dated Sep. 25, 2019, issued in connection with U.S. Appl. No. 16/459,565, filed Jul. 1, 2019, 14 pages.
Notice of Allowance dated Dec. 26, 2019, issued in connection with U.S. Appl. No. 16/544,900, filed Aug. 20, 2019, 20 pages.
Notice of Allowance dated Jul. 26, 2019, issued in connection with U.S. Appl. No. 14/184,528, filed Feb. 19, 2014, 7 pages.
Notice of Allowance dated Sep. 26, 2018, issued in connection with U.S. Appl. No. 13/848,932, filed Mar. 22, 2013, 10 pages.
Notice of Allowance dated Sep. 26, 2019, issued in connection with U.S. Appl. No. 16/514,280, filed Jul. 17, 2019, 16 pages.
Notice of Allowance dated Apr. 27, 2015, issued in connection with U.S. Appl. No. 13/932,864, filed Jul. 1, 2013, 20 pages.
Notice of Allowance dated Aug. 27, 2015, issued in connection with U.S. Appl. No. 13/705,177, filed Dec. 5, 2012, 34 pages.
Notice of Allowance dated Aug. 27, 2015, issued in connection with U.S. Appl. No. 14/505,027, filed Oct. 2, 2014, 18 pages.
Notice of Allowance dated Dec. 27, 2011, issued in connection with U.S. Appl. No. 10/816,217, filed Apr. 1, 2004, 15 pages.
Notice of Allowance dated Jun. 27, 2018, issued in connection with U.S. Appl. No. 13/848,921, filed Mar. 22, 2013, 19 pages.
Notice of Allowance dated Mar. 27, 2017, issued in connection with U.S. Appl. No. 15/089,758, filed Apr. 4, 2016, 7 pages.
Notice of Allowance dated Nov. 27, 2018, issued in connection with U.S. Appl. No. 16/052,316, filed Aug. 1, 2018, 8 pages.
Notice of Allowance dated Nov. 27, 2019, issued in connection with U.S. Appl. No. 14/184,528, filed Feb. 19, 2014, 8 pages.
Notice of Allowance dated Apr. 28, 2020, issued in connection with U.S. Appl. No. 16/459,569, filed Jul. 1, 2019, 8 pages.
Notice of Allowance dated Jan. 28, 2019, issued in connection with U.S. Appl. No. 16/181,270, filed Nov. 5, 2018, 10 pages.
Notice of Allowance dated Mar. 28, 2017, issued in connection with U.S. Appl. No. 15/088,906, filed Apr. 1, 2016, 7 pages.
Notice of Allowance dated Mar. 28, 2017, issued in connection with U.S. Appl. No. 15/155,149, filed May 16, 2016, 7 pages.
Notice of Allowance dated Apr. 29, 2015, issued in connection with U.S. Appl. No. 13/863,083, filed Apr. 15, 2013, 19 pages.
Notice of Allowance dated Aug. 29, 2018, issued in connection with U.S. Appl. No. 14/504,812, filed Oct. 2, 2014, 8 pages.
Notice of Allowance dated Jul. 29, 2015, issued in connection with U.S. Appl. No. 13/359,976, filed Jan. 27, 2012, 28 pages.
Notice of Allowance dated Jul. 29, 2015, issued in connection with U.S. Appl. No. 14/186,850, filed Feb. 21, 2014, 9 pages.
Notice of Allowance dated Mar. 29, 2017, issued in connection with U.S. Appl. No. 14/803,953, filed Jul. 20, 2015, 8 pages.
Notice of Allowance dated Mar. 29, 2019, issued in connection with U.S. Appl. No. 15/095,145, filed Apr. 10, 2016, 7 pages.
Notice of Allowance dated May 29, 2020, issued in connection with U.S. Appl. No. 16/514,280, filed Jul. 17, 2019, 8 pages.
Notice of Allowance dated Nov. 29, 2019, issued in connection with U.S. Appl. No. 16/459,569, filed Jul. 1, 2019, 8 pages.
Notice of Allowance dated Sep. 29, 2020, issued in connection with U.S. Appl. No. 16/459,565, filed Jul. 1, 2019, 7 pages.
Notice of Allowance dated Sep. 29, 2020, issued in connection with U.S. Appl. No. 16/516,567, filed Jul. 19, 2019, 7 pages.
Apple. NewsRoom, Developers Rapidly Adopt Apple's Rendezvous Networking Technology, Sep. 10, 2002, 3 pages.
Apple WWDC 2003 Session 105—Rendezvous—YouTube available via https://www.youtube.com/watch?v=Ge5bsDijGWM [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Ashcraft et al. P4 Protocol Specification v0.2, Apr. 6, 2002, 11 pages [online]. [retrieved on Mar. 26, 2020]. Retrieved from the Internet URL: https://snarfed.org/p4protocol.
Atlanta Journal. The big picture. Nov. 9, 2000, 1 page. [produced by Google in Inv. No. 337-TA-1191 on Sep. 4, 2020].
Audio Authority. Access EZ: Demonstration Network. Home Audio and Video System Installation Manual, 60 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Audio Authority: How to Install and Use the Model 1154 Signal Sensing Auto Selector, 2002, 4 pages.
Audio Authority: Model 1154B High Definition AV Auto Selector, 2008, 8 pages.
AudioPoint from Home Director. Play Digital Music on Your Conventional Stereo System, 2002, 2 pages.
AudioPoint, Welcome to the coolest way to listen to digital music over your conventional stereo equipment, Home Director HD00B02, 2002, 2 pages.
AudioSource: AMP 100 User Manual, 2003, 4 pages.
AudioTron Quick Start Guide, Version 1.0, Mar. 2001, 24 pages.
AudioTron Reference Manual, Version 3.0, May 2002, 70 pages.
AudioTron Setup Guide, Version 3.0, May 2002, 38 pages.
Automatic Profile Hunting Functional Description, AVAGO0013, Agere Systems, Feb. 2004, 2 pages.
“A/V Surround Receiver AVR-5800,” Denon Electronics, 2000, 2 pages.
"A/V System Controller, Owner's Manual," B&K Components, Ltd., 1998, 52 pages.
AVTransport:1 Service Template Version 1.01 for UPnP, Version 1.0 (Jun. 25, 2002) (66 pages).
AXIS Communication: AXIS P8221 Network I/O Audio Module, 2009, 41 pages.
Baldwin, Roberto. “How-To: Setup iTunes DJ on Your Max and iPhone”, available at http://www.maclife.com/article/howtos/howto_setup_itunes_dj_your_mac_and_iphone, archived on Mar. 17, 2009, 4 pages.
Balfanz et al., “Network-in-a-Box: How to Set Up a Secure Wireless Network in Under a Minute,” 13th USENIX Security Symposium—Technical Paper, 2002, 23 pages.
Balfanz et al., “Talking to Strangers: Authentication in Ad-Hoc Wireless Networks,” Xerox Palo Alto Research Center, 2002, 13 pages.
Barham et al., “Wide Area Audio Synchronisation,” University of Cambridge Computer Laboratory, 1995, 5 pages.
Barix Download Exstreamer Software. Accessed via WayBack Machine, Apr. 6, 2003. http://www.barix.com/estreamer/software.download.html. 2 pages.
Barix. Exstreamer Datasheet. Accessed via WayBack Machine, Apr. 2, 2003. http://www.barix.com/exstreamer/, 1 page.
Barret, Ryan. P4 Proposal: CS194 Project Proposal. Toward an Application-Independent Distributed Network Platform. Apr. 9, 2002, 4 pages [online]. [retrieved on Mar. 26, 2020]. Retrieved from the Internet URL: https://snarfed.org/p4proposal.
Barrett, Ryan, (no title) Blog on P4Sync network and code, 1 page [online]. [retrieved on Mar. 26, 2020]. Retrieved from the Internet URL: https://snarfed.org.p4.
Baudisch et al., “Flat Volume Control: Improving Usability by Hiding the Volume Control Hierarchy in the User Interface,” 2004, 8 pages.
Beatty et al. Web Services Dynamic Discovery WS-Discovery, Feb. 2004, 35 pages.
Benslimane, Abderrahim, "A Multimedia Synchronization Protocol for Multicast Groups," Proceedings of the 26th Euromicro Conference, 2000, pp. 456-463, vol. 1.
Biersack et al., “Intra- and Inter-Stream Synchronization for Stored Multimedia Streams,” IEEE International Conference on Multimedia Computing and Systems, 1996, pp. 372-381.
Blakowski G. et al., “A Media Synchronization Survey: Reference Model, Specification, and Case Studies,” Jan. 1996, pp. 5-35, vol. 14, No. 1.
Blau, John. News Analysis, Wi-Fi Hotspot Networks Sprout Like Mushrooms, Sep. 2002, 3 pages.
Bluetooth. “Specification of the Bluetooth System: The ad hoc SCATTERNET for affordable and highly functional wireless connectivity,” Core, Version 1.0 A, Jul. 26, 1999, 1068 pages.
Bluetooth. “Specification of the Bluetooth System: Wireless connections made easy,” Core, Version 1.0 B, Dec. 1, 1999, 1076 pages.
Bluetooth Specification. Advanced Audio Distribution Profile (A2DP) Specification, 2007, 73 pages.
Bogen Communications, Inc., ProMatrix Digitally Matrixed Amplifier Model PM3180, Copyright 1996, 2 pages.
BoomBottle MM Blue Hatch 2-Pack. Blue Hatch Waterproof Dual Pairing Wireless Speakers each with Built-in-MagicMount, 4 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Bootcamp. Digital Music on Your Stereo System. Jan. 10, 2003, 1 page.
Bose. Imagine a State-of-the-Art Audio System, copyright 1999, 6 pages, [produced by Google in Inv. No. 337-TA-1191 on Sep. 4, 2020].
Bose. Lifestyle 50 Home Theater System. Dec. 12, 2001, 1 page [produced by Google in Inv. No. 337-TA-1191 on Sep. 4, 2020].
Bose. Lifestyle 50 Home Theater System. Aug. 1999, 4 pages, [produced by Google in Inv. No. 337-TA-1191 on Sep. 4, 2020].
Bose. Lifestyle 50 Home Theater System. Questions and Answers, 21 pages. [produced by Google in Inv. No. 337-TA-1191 on Sep. 4, 2020].
Bose Lifestyle SA-2 and SA-3 Stereo Amplifier Owner's Guide, 2004, 32 pages.
Bose. The Bose Lifestyle 50 System. Owner's Guide, Oct. 17, 2001, 55 pages.
Bose. The Bose Lifestyle Powered Speaker System. Owner's Guide. Dec. 20, 2001, 19 pages.
Brassil et al., “Enhancing Internet Streaming Media with Cueing Protocols,” 2000, 9 pages.
Breebaart et al., “Multi-Channel Goes Mobile: MPEG Surround Binaural Rendering,” AES 29th International Conference, Sep. 2-4, 2006, pp. 1-13.
Bretl, W.E., et al., MPEG2 Tutorial [online], 2000 [retrieved on Jan. 13, 2009]. Retrieved from the Internet: (http://www.bretl.com/mpeghtml/MPEGindex.htm), pp. 1-23.
BridgeCo—Wireless Loudspeaker Product Information Version 1.4, Dec. 16, 2003, 5 pages.
Godber et al. Secure Wireless Gateway. RightsLink. Arizona State University, pp. 41-46 [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Golem, WLAN-MP3-Player zum Anschluss an die Stereoanlage [WLAN MP3 player for connecting to a stereo system], Jun. 1, 2003, 2 pages.
Google's Answer to Complaint and Counterclaims filed with United States District Court Central District of California, Western Division on Mar. 2, 2020, 50 pages.
Google's Counterclaims to Sonos's Complaint filed with United States District Court Central District of California, Western Division on Mar. 11, 2020, 13 pages.
Guttman, Erik. An API for the Zeroconf Multicast Address Allocation Protocol, Jun. 6, 2001, 11 pages.
Guttman, Erik. Autoconfiguration for IP Networking: Enabling Local Communication, Jun. 2001, 6 pages.
Guttman, Erik. Network Working Group, Zeroconf Host Profile Applicability Statement, Internet-Draft, Jul. 20, 2001, 9 pages.
Hans et al., “Interacting with Audio Streams for Entertainment and Communication,” Proceedings of the Eleventh ACM International Conference on Multimedia, ACM, 2003, 7 pages.
Hawn, Andrew. TechTV, First Look: cd3o c300, 2004, 2 pages.
Herre et al., “The Reference Model Architecture for MPEG Spatial Audio Coding,” Audio Engineering Society Convention Paper (Presented at the 118th Convention), May 28-31, 2005, 13 pages.
Herschman, Noah. Sound Advice: Luxury in the Box All-in-one shelf systems deliver terrific sound and sleek styling. Jazz Times, Nov. 2000, 4 pages. [produced by Google in Inv. No. 337-TA-1191 on Sep. 4, 2020].
High Fidelity. New Wave in Speaker Design. Oct. 1980, 130 pages.
Home Networking with Universal Plug and Play, IEEE Communications Magazine, vol. 39 No. 12 (Dec. 2001) (D+M_0402025-40) (16 pages).
“Home Theater Control Systems,” Cinema Source, 2002, 19 pages.
HomePod—Wireless Network Digital Music Player with FM Tuner, User Manual, 2003, 16 pages.
HomePod MP-100, Wireless Network Music Player, with USB Jukebox, Internet Radio, and FM Tuner, Specification, 2003, 2 pages.
HomePod. User Manual, Wireless Network Digital Audio Player with FM Tuner, 2003, 49 pages.
Horwitz, Jeremy, “Logic3 i-Station25,” retrieved from the internet: http://www.ilounge.com/index.php/reviews/entry/logic3-i-station25/, last visited Dec. 17, 2013, 5 pages.
How cd30 Network MP3 Players Work, Feb. 2, 2004, 3 pages.
Howe et al. A Methodological Critique of Local Room Equalization Techniques, 5 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
HP DeskJet 5850 User Guide, copyright 2003, 217 pages.
Huang C.M., et al., “A Synchronization Infrastructure for Multicast Multimedia at the Presentation Layer,” IEEE Transactions on Consumer Electronics, 1997, pp. 370-380, vol. 43, No. 3.
IBM Home Director Installation and Service Manual, Copyright 1998, 124 pages.
IBM Home Director Owner's Manual, Copyright 1999, 67 pages.
IEEE Standards 1588-2002. IEEE Standard for a Precision Clock Synchronization Protocol for Networked Measurement and Control Systems, Nov. 8, 2002, 154 pages.
IEEE Standards 802.3. Part 3: Carrier sense multiple access with collision detection CSMA/CD access method and physical layer specifications, Mar. 8, 2002, 1562 pages.
LIVE. User's Guide IS809B Wireless Speaker System, Copyright 2010, 12 pages.
Implicit, LLC v. Sonos, Inc. (No. 14-1330-RGA), Defendant's Original Complaint (Mar. 3, 2017) (15 pages).
Integra Audio Network Receiver NAC 2.3 Instruction Manual, 68 pages.
Integra Audio Network Server NAS 2.3 Instruction Manual, pp. 1-32.
Integra Service Manual, Audio Network Receiver Model NAC-2.3, Dec. 2002, 44 pages.
Intel Announces WS-Discovery Spec for Joining Devices and Web Services, Intel Developer Forum Spring 2004, Feb. 17, 2004, 4 pages.
Intel Designing a UPnP AV Media Renderer, v. 1.0 (“Intel AV Media Renderer”) (May 20, 2003) (SONDM000115117-62) (46 pages).
Intel Media Renderer Device Interface (“Intel Media Renderer”) (Sep. 6, 2002) (62 pages).
Intel SDK for UPnP Devices Programming Guide, Version 1.2.1, (Nov. 2002) (30 pages).
Intel Sees Unified Platform and Ecosystem as Key to Enabling the Digital Home, Intel Developer Forum, Feb. 17, 2004, 4 pages.
Intel Tools Validate First Solutions that Enable Devices to Work Together in the Digital Home, Intel Developer Forum, Feb. 17, 2004, 2 pages.
Intel. User's Manual, An Intel Socket 478 Processor Based Mainboard. Mar. 27, 2003, 96 pages.
International Bureau, International Preliminary Report on Patentability dated Jan. 8, 2015, issued in connection with International Application No. PCT/US2013/046372, filed on Jun. 18, 2013, 6 pages.
International Bureau, International Preliminary Report on Patentability, dated Jan. 8, 2015, issued in connection with International Application No. PCT/US2013/046386, filed on Jun. 18, 2013, 8 pages.
International Bureau, International Preliminary Report on Patentability dated Jan. 30, 2014, issued in connection with International Application No. PCT/US2012/045894, filed on Jul. 9, 2012, 6 pages.
International Searching Authority, International Search Report dated Aug. 1, 2008, in connection with International Application No. PCT/US2004/023102, 5 pages.
International Searching Authority, International Search Report dated Aug. 26, 2013, issued in connection with International Application No. PCT/US2013/046372, filed on Jun. 18, 2013, 3 pages.
International Searching Authority, International Search Report dated Dec. 26, 2012, issued in connection with International Application No. PCT/US2012/045894, filed on Jul. 9, 2012, 3 pages.
International Searching Authority, International Search Report dated Sep. 30, 2013, issued in connection with International Application No. PCT/US2013/046386, filed on Jun. 18, 2013, 3 pages.
International Searching Authority, Written Opinion dated Aug. 26, 2013, issued in connection with International Application No. PCT/US2013/046372, filed on Jun. 18, 2013, 4 pages.
International Searching Authority, Written Opinion dated Dec. 26, 2012, issued in connection with International Application No. PCT/US2012/045894, filed on Jul. 9, 2012, 4 pages.
International Searching Authority, Written Opinion dated Sep. 30, 2013, issued in connection with International Application No. PCT/US2013/046386, filed on Jun. 18, 2013, 6 pages.
International Trade Commission Remote Hearing for Case 337-TA-1191 Transcripts vols. 1-5, dated Feb. 22, 2021-Feb. 26, 2021, 794 pages.
Introducing ROOMLINK Network Media Receiver—PCNA-MR10, Sony Vaio, 2003, 2 pages.
“884+ Automatic Matrix Mixer Control System,” Ivie Technologies, Inc., 2000, pp. 1-4.
Acoustic Research. 900MHz Wireless Stereo Speakers Model AW871 Installation and Operation Manual, 2003, 15 pages.
Acoustic Research. 900MHz Wireless Stereo Speakers Model AW871 Installation and Operation Manual, 2007, 12 pages.
Acoustic Research. Wireless Stereo Speakers with Auto-Tuning. Model AW877 Installation and Operation Manual, 2007, 13 pages.
Advanced Driver Tab User Interface WaveLan GUI Guide, AVAGO0009, Agere Systems, Feb. 2004, 4 pages.
Advisory Action dated Feb. 2, 2016, issued in connection with U.S. Appl. No. 13/848,921, filed Mar. 22, 2013, 8 pages.
Advisory Action dated Sep. 18, 2008, issued in connection with U.S. Appl. No. 10/816,217, filed Apr. 1, 2004, 8 pages.
Advisory Action dated Feb. 1, 2016, issued in connection with U.S. Appl. No. 13/864,247, filed Apr. 17, 2013, 6 pages.
Advisory Action dated Jun. 1, 2015, issued in connection with U.S. Appl. No. 14/516,867, filed Oct. 17, 2014, 11 pages.
Advisory Action dated Nov. 1, 2013, issued in connection with U.S. Appl. No. 13/618,829, filed Sep. 14, 2012, 3 pages.
Advisory Action dated Mar. 2, 2015, issued in connection with U.S. Appl. No. 13/848,932, filed Mar. 22, 2013, 3 pages.
Advisory Action dated Jan. 5, 2012, issued in connection with U.S. Appl. No. 12/035,112, filed Feb. 21, 2008, 3 pages.
Advisory Action dated Sep. 5, 2014, issued in connection with U.S. Appl. No. 13/907,666, filed May 31, 2013, 3 pages.
Advisory Action dated Jan. 8, 2015, issued in connection with U.S. Appl. No. 13/705,176, filed Dec. 5, 2012, 4 pages.
Advisory Action dated Mar. 8, 2017, issued in connection with U.S. Appl. No. 13/871,795, filed Apr. 26, 2013, 22 pages.
Advisory Action dated Jun. 9, 2016, issued in connection with U.S. Appl. No. 13/871,795, filed Apr. 26, 2013, 3 pages.
Advisory Action dated Aug. 10, 2017, issued in connection with U.S. Appl. No. 13/864,250, filed Apr. 17, 2013, 3 pages.
Advisory Action dated Feb. 10, 2016, issued in connection with U.S. Appl. No. 13/871,795, filed Apr. 26, 2013, 3 pages.
Advisory Action dated Jan. 10, 2018, issued in connection with U.S. Appl. No. 13/871,785, filed Apr. 26, 2013, 3 pages.
Advisory Action dated Jun. 11, 2018, issued in connection with U.S. Reexamination Control No. 90/013,959 for U.S. Pat. No. 9,213,357, filed on Jun. 16, 2017, 3 pages.
Advisory Action dated Nov. 12, 2014, issued in connection with U.S. Appl. No. 13/907,666, filed May 31, 2013, 6 pages.
Advisory Action dated Apr. 15, 2015, issued in connection with U.S. Appl. No. 14/184,526, filed Feb. 19, 2014, 9 pages.
Advisory Action dated Apr. 15, 2015, issued in connection with U.S. Appl. No. 14/184,935, filed Feb. 20, 2014, 9 pages.
Advisory Action dated Aug. 16, 2017, issued in connection with U.S. Appl. No. 13/864,248, filed Apr. 17, 2013, 5 pages.
Advisory Action dated Jun. 20, 2017, issued in connection with U.S. Appl. No. 15/243,355, filed Aug. 22, 2016, 5 pages.
Advisory Action dated Aug. 22, 2017, issued in connection with U.S. Appl. No. 13/864,249, filed Apr. 17, 2013, 3 pages.
Advisory Action dated Mar. 22, 2019, issued in connection with U.S. Appl. No. 16/009,182, filed Jun. 14, 2018, 3 pages.
Advisory Action dated Sep. 22, 2017, issued in connection with U.S. Appl. No. 15/243,186, filed Aug. 22, 2016, 3 pages.
Advisory Action dated Mar. 25, 2015, issued in connection with U.S. Appl. No. 13/533,105, filed Jun. 26, 2012, 5 pages.
Advisory Action dated Feb. 26, 2015, issued in connection with U.S. Appl. No. 14/184,528, filed Feb. 19, 2014, 3 pages.
Advisory Action dated Nov. 26, 2014, issued in connection with U.S. Appl. No. 14/186,850, filed Feb. 21, 2014, 9 pages.
Advisory Action dated Apr. 27, 2016, issued in connection with U.S. Appl. No. 14/486,667, filed Sep. 15, 2014, 7 pages.
Advisory Action dated Dec. 28, 2016, issued in connection with U.S. Appl. No. 13/705,176, filed Dec. 5, 2012, 4 pages.
Advisory Action dated Jul. 28, 2015, issued in connection with U.S. Appl. No. 14/184,522, filed Feb. 19, 2014, 7 pages.
Advisory Action dated Sep. 28, 2009, issued in connection with U.S. Appl. No. 10/816,217, filed Apr. 1, 2004, 4 pages.
Agere Systems' Voice-over-Wireless LAN (VoWLAN) Station Quality of Service, AVAGO0015, Agere Systems, Jan. 2005, 5 pages.
Akyildiz et al., “Multimedia Group Synchronization Protocols for Integrated Services Networks,” IEEE Journal on Selected Areas in Communications, 1996 pp. 162-173, vol. 14, No. 1.
Allen and Heath ML4000 User Guide, 2003, 56 pages.
Amazon: Philips MCW770 WiFi Wireless PC Link AM/FM 5-CD Microsystem (Discontinued by Manufacturer): Home Audio & Theater, 5 pages [online]. [retrieved on Feb. 24, 2020]. Retrieved from the Internet URL: https://www.amazon.com/gp/product/B000278KLC.
Amazon.com: CD30 c300 Wireless Network MP3 Player (Analog/Digital): Home Audio & Theater, 5 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Amazon.com, Cisco-Linksys Wireless-B Music System WMLS11B, 5 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Amazon.com. Creative Labs Sound Blaster Wireless Music: Electronics, 7 pages, [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Anonymous, “Transmission Control Protocol,” RFC: 793, USC/Information Sciences Institute, Sep. 1981, 91 pages.
Appeal Brief and Exhibits A-K filed on Sep. 17, 2018, in connection with U.S. Reexamination Control No. 90/013,959, filed Jun. 16, 2017, 240 pages.
Apple. Airport Express, Setup Guide. May 20, 2004, 51 pages.
Apple. Airport Express, Setup Guide. 2004, 48 pages.
Apple Developer Connection. Browsing for Network Services. Nov. 12, 2002, 5 pages.
Apple. NewsRoom, Apple “Open Sources” Rendezvous. Sep. 25, 2002, 2 pages.
Apple. NewsRoom, Apple Ships New AirPort Express with AirTunes. Jul. 14, 2004, 3 pages.
Apple. NewsRoom, Apple Unveils AirPort Express for Mac & PC Users. Jun. 7, 2004, 3 pages.
Space.com. Tech Today: News about the latest gizmos and gadgets conveniently available on Earth, Feb. 14, 2004, 2 pages.
Squeezebox by Logitech. Owner's Guide, 2007, 32 pages.
Squeezebox Duet Network Music System by Logitech. User Guide English (North America), 2008, 45 pages.
Squeezebox Network Music Player. Owner's Manual, Slim Devices, 2003, 22 pages.
Step-by-step P4 Connection. P4 Poster (without music), 5 pages [online], [retrieved on Mar. 26, 2020]. Retrieved from the Internet URL: https://snarfed.org/p4_poster/index.html.
Stereo Review's Sound & Vision. Bose Lifestyle 50 Review. Oct. 2000, 1 page [produced by Google in Inv. No. 337-TA-1191 on Sep. 4, 2020].
Steve Jobs introduces AirPort Express All Things D2 (2004)—YouTube available via https://www.youtube.com/watch?v=hq5_P90pOqo 3 pages, [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Structured Media Components. Leviton Integrated Networks, last modified Apr. 10, 2006, 28 pages.
Support. Manuals & Documentation. Micro Audio System MCW770/37. Philips. Copyright 2004-2020, 3 pages [online]. [retrieved on Feb. 24, 2020]. Retrieved from the Internet URL: https://www.usa.philips.com/c-p/MCW770_37/-/support.
Synchronizing mp3 playback. 3 pages [online]. [retrieved on Mar. 26, 2020]. Retrieved from the Internet URL: https://snarfed.org/synchronizing_mp3_playback.
Taylor, Marilou, “Long Island Sound,” Audio Video Interiors, Apr. 2000, 8 pages.
Technology. cd30 is developing products which implement NAVOS, allowing consumers to get better utility out of their home media libraries. Nov. 21, 2003, 1 page.
Teirikangas, Jussi. HAVi: Home Audio Video Interoperability. Helsinki University of Technology, 2001, 10 pages.
Thaler et al. Scalability and Synchronization in IEEE 1394-Based Content-Creation Networks. Audio Engineering Society Convention Paper 5461, Sep. 21-24, 2001, 16 pages.
TOA Corporation, Digital Processor DP-0206 DACsys2000 Version 2.00 Software Instruction Manual, Copyright 2001, 67 pages.
TOA Electronics, Inc. DP-0206 Digital Signal Processor. DACsys 2000, 2001, 12 pages.
Tom's Hardware Guide: Nachrichten. Nachrichten vom Jan. 10, 2003 [News of Jan. 10, 2003], 3 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Trask, Simon. NewsRoom, Pro Sound News Europe, Bluetooth to drive wireless speakers, vol. 18; Issue 6, Jun. 1, 2003, 2 pages.
Tsai et al. SIM-based Subscriber Authentication for Wireless Local Area Networks, 2003, 6 pages.
Understanding Universal Plug and Play, Microsoft White Paper (Jun. 2000) (D+M_0402074-118) (45 pages).
U.S. Appl. No. 60/379,313, filed May 9, 2002, entitled “Audio Network Distribution System,” 50 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
U.S. Appl. No. 60/490,768, filed Jul. 28, 2003, entitled “Method for synchronizing audio playback between multiple networked devices,” 13 pages.
United States Patent and Trademark Office, U.S. Appl. No. 60/825,407, filed Sep. 12, 2006, entitled “Controlling and manipulating groupings in a multi-zone music or media system,” 82 pages.
Universal Plug and Play Device Architecture V. 1.0, (Jun. 8, 2000) (54 pages).
Universal Plug and Play in Windows XP, Tom Fout. Microsoft Corporation (Jul. 2001) (D+M_0402041-73) (33 pages).
Universal Plug and Play (“UPnP”) AV Architecture:1 for UPnP, Version 1.0, (Jun. 25, 2002) (D+M_0298151-72) (22 pages).
Universal Plug and Play Vendor's Implementation Guide (Jan. 5, 2000) (7 pages).
“UPnP and Sonos Questions,” Sonos Community, Dec. 2006, 5 pages.
UPnP AV Architecture:0.83 for UPnP Version 1.0, Jun. 12, 2002, copyright 2000, 22 pages.
UPnP AV Architecture:0.83 (Jun. 12, 2002) (SONDM000115483-504) (22 pages).
UPnP Design by Example: A Software Developer's Guide to Universal Plug and Play, Michael Jeronimo and Jack Weast, Intel Press (D+M_0401307-818) (Apr. 2003) (511 pages).
UPnP Forum. UPnP Device Architecture 1.0. Oct. 15, 2008, 80 pages.
UPnP; “Universal Plug and Play Device Architecture,” Jun. 8, 2000; version 1.0; Microsoft Corporation; pp. 1-54.
Urien et al. EAP-TLS Smartcards, from Dream to Reality, 4th Workshop on Applications and Services in Wireless Networks, Aug. 9, 2004, 19 pages.
Valtchev et al. In Home Networking, Service Gateway Architecture for a Smart Home, Apr. 2002, 7 pages.
WANCommonInterfaceConfig:1 Service Template Version 1.01 for UPnP, Ver. 1.0 (Nov. 12, 2001) (D+M_0401820-43) (24 pages).
WANIPConnection:1 Service Template Version 1.01 for UPnP Ver. 1.0 (Nov. 12, 2001) (D+M_0401844-917) (74 pages).
WANPPPConnection:1 Service Template Version 1.01 for UPnP, Version 1.0 (Nov. 12, 2001) (D+M_0401918-2006) (89 pages).
WaveLan High-Speed Multimode Chip Set, AVAGO0003, Agere Systems, Feb. 2003, 4 pages.
WaveLan High-Speed Multimode Chip Set, AVAGO0005, Agere Systems, Feb. 2003, 4 pages.
WaveLAN Wireless Integration Developer Kit (WI-DK) for Access Point Developers, AVAGO0054, Agere Systems, Jul. 2003, 2 pages.
WaveLAN Wireless Integration-Developer Kit (WI-DK) Hardware Control Function (HCF), AVAGO0052, Agere Systems, Jul. 2003, 2 pages.
“Welcome. You're watching Apple TV.” Apple TV 1st Generation Setup Guide, Apr. 8, 2008 http://manuals.info.apple.com/MANUALS/0/MA403/en_US/AppleTV_SetupGuide.pdf Retrieved Oct. 14, 2014, 40 pages.
“Welcome. You're watching Apple TV.” Apple TV 2nd Generation Setup Guide, Mar. 10, 2011 Retrieved Oct. 16, 2014, 36 pages.
“Welcome. You're watching Apple TV.” Apple TV 3rd Generation Setup Guide, Mar. 16, 2012 Retrieved Oct. 16, 2014, 36 pages.
Weverka et al. Windows XP Gigabook for Dummies. Wiley Publishing, Inc. 2004, 915 pages.
WI-DK Release 2 WaveLan Embedded Drivers for VxWorks and Linux, AVAGO0056, Agere Systems, Jul. 2003, 2 pages.
WI-DK Release 2 WaveLan END Reference Driver for VxWorks, AVAGO0044, Agere Systems, Jul. 2003, 4 pages.
WI-DK Release 2 WaveLan LKM Reference Drivers for Linux, AVAGO0048, Agere Systems, Jul. 2003, 4 pages.
Wi-Fi Alliance. Wi-Fi Protected Setup Specification, Version 1.0h, Dec. 2006, 110 pages.
IPR Details—Apple Computer's Statement About IPR Claimed in draft-ietf-zeroconf-ipv4-linklocal, Apr. 26, 2004, 3 pages.
Ishibashi et al., “A Comparison of Media Synchronization Quality Among Reactive Control Schemes,” IEEE Infocom, 2001, pp. 77-84.
Ishibashi et al., “A Group Synchronization Mechanism for Live Media in Multicast Communications,” IEEE Global Telecommunications Conference, 1997, pp. 746-752, vol. 2.
Ishibashi et al., “A Group Synchronization Mechanism for Stored Media in Multicast Communications,” IEEE Information Revolution and Communications, 1997, pp. 692-700, vol. 2.
Issues with Mixed IEEE 802.11b/802.11g Networks, AVAGO0058, Agere Systems, Feb. 2004, 5 pages.
Japanese Patent Office, Decision of Refusal and Translation dated Mar. 30, 2021, issued in connection with Japanese Patent Application No. 2019-104398, 5 pages.
Japanese Patent Office, Decision of Rejection dated Jul. 8, 2014, issued in connection with Japanese Patent Application No. 2012-178711, 3 pages.
Japanese Patent Office, Final Office Action dated Nov. 8, 2016, issued in connection with Japanese Patent Application No. 2015-520286, 5 pages.
Japanese Patent Office, Japanese Office Action dated Oct. 3, 2017, issued in connection with Japanese Application No. 2016-163042, 5 pages.
Japanese Patent Office, Notice of Rejection, dated Feb. 3, 2015, issued in connection with Japanese Patent Application No. 2014-521648, 7 pages.
Japanese Patent Office, Notice of Rejection dated Sep. 15, 2015, issued in connection with Japanese Patent Application No. 2014-220704, 7 pages.
Japanese Patent Office, Office Action and Translation dated Aug. 17, 2020, issued in connection with Japanese Patent Application No. 2019-104398, 5 pages.
Japanese Patent Office, Office Action dated May 15, 2018, issued in connection with Japanese Application No. 2016-163042, 6 pages.
Japanese Patent Office, Office Action dated Nov. 22, 2016, issued in connection with Japanese Application No. 2015-520288, 6 pages.
Japanese Patent Office, Office Action dated May 24, 2016, issued in connection with Japanese Patent Application No. 2014-220704, 7 pages.
Japanese Patent Office, Office Action dated Mar. 29, 2016, issued in connection with Japanese Patent Application No. JP2015-520288, 12 pages.
Japanese Patent Office, Office Action dated Nov. 29, 2016, issued in connection with Japanese Application No. 2015-516169, 4 pages.
Japanese Patent Office, Office Action dated Feb. 5, 2019, issued in connection with Japanese Application No. 2016-163042, 6 pages.
Japanese Patent Office, Office Action Summary dated Feb. 2, 2016, issued in connection with Japanese Patent Application No. 2015-520286, 6 pages.
Japanese Patent Office, Office Action Summary dated Nov. 19, 2013, issued in connection with Japanese Patent Application No. 2012-178711, 5 pages.
Japanese Patent Office, Office Action Translation dated Feb. 5, 2019, issued in connection with Japanese Application No. 2016-163042, 4 pages.
Japanese Patent Office, Translation of Office Action dated May 15, 2018, issued in connection with Japanese Application No. 2016-163042, 4 pages.
Jo et al., “Synchronized One-to-many Media Streaming with Adaptive Playout Control,” Proceedings of SPIE, 2002, pp. 71-82, vol. 4861.
Johnson, Ian. SMC EZ-Stream Universal Wireless Multimedia Receiver—The Globe and Mail, Dec. 3, 2003, 6 pages.
Jones, Stephen, “Dell Digital Audio Receiver: Digital upgrade for your analog stereo,” Analog Stereo, Jun. 24, 2000 http://www.reviewsonline.com/articles/961906864.htm retrieved Jun. 18, 2014, 2 pages.
Kostiainen, K., Intuitive Security Initiation Using Location-Limited Channels. Helsinki University of Technology, Master's Thesis Apr. 14, 2004, 86 pages.
Kou et al., “RenderingControl:1 Service Template Version 1.01,” Contributing Members of the UPnP Forum, Jun. 25, 2002, 63 pages.
Kraemer, Alan. Two Speakers Are Better Than 5.1—IEEE Spectrum, May 1, 2001, 6 pages.
Kumin, Daniel. User's Report. Bose: Lifestyle 50 Home Theater/Multiroom System. Jul./Aug. 2000, 4 pages. [produced by Google in Inv. No. 337-TA-1191 on Sep. 4, 2020].
LA Audio ZX135E 6 Zone Expander. Pro Audio Design, Inc. https://www.proaudiodesign.com/products/la-audio-zx135e-6-zone-expander, accessed Mar. 26, 2020, 6 pages.
Lake Processors: Lake® LM Series Digital Audio Processors Operation Manual, 2011, 71 pages.
Levergood et al., “AudioFile: A Network-Transparent System for Distributed Audio Applications,” Digital Equipment Corporation, 1993, 109 pages.
LG: RJP-201M Remote Jack Pack Installation and Setup Guide, 2010, 24 pages.
Lienhart et al., “On the Importance of Exact Synchronization for Distributed Audio Signal Processing,” Session L: Poster Session II—ICASSP'03 Papers, 2002, 1 page.
Linksys 2.4GHz Wireless-B—User Guide Media Link for Music Model WML11B/WMLS11B, 68 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Linksys 2.4GHz Wireless-B—User Guide V2 Model WMA11B, 68 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
LinkSys by Cisco, Wireless Home Audio Controller, Wireless-N Touchscreen Remote DMRW1000 Datasheet, Copyright 2008, 2 pages.
LinkSys by Cisco, Wireless Home Audio Controller, Wireless-N Touchscreen Remote DMRW1000 User Guide, Copyright 2008, 64 pages.
LinkSys by Cisco, Wireless Home Audio Player, Wireless-N Music Extender DMP100 Quick Installation Guide, Copyright 2009, 32 pages.
LinkSys by Cisco, Wireless Home Audio Player, Wireless-N Music Extender DMP100 User Guide, Copyright 2008, 65 pages.
Linksys. Quick Installation for Windows XP Only. Wireless-B Media Adapter, 2 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Linksys. Wireless Adapters, 2003, 2 pages.
Linksys. Wireless PrintServer, User Guide, Model No. WPS11 Version 3, 2002, 31 pages.
Linksys Wireless-B Media Adapter—User Guide V1 Model WMA11B, 2003, 32 pages.
Linksys. Wireless-B Media Adapter, Product Data, Model No. WMA11B, 2003, 2 pages.
Linksys. Wireless-B Media Adapter, WMA11B, 2003, 2 pages.
Linux SDK for UPnP Devices v. 1.2 (Sep. 6, 2002) (101 pages).
“Linux SDK for UPnP Devices v1.2,” Intel Corporation, Jan. 17, 2003, 102 pages.
Liu et al., “A synchronization control scheme for real-time streaming multimedia applications,” Packet Video, 2003, 10 pages, vol. 2003.
Liu et al., “Adaptive Delay Concealment for Internet Voice Applications with Packet-Based Time-Scale Modification,” Information Technologies 2000, pp. 91-102.
NewsRoom. Business Wire, BridgeCo Adds Wireless Connectivity and Enhances Surround Sound Processing for New Generation Speakers. May 5, 2003, 3 pages.
NewsRoom. Business Wire, BridgeCo Launches Entertainment Network Adapter at CES2003, Jan. 9, 2003, 3 pages.
NewsRoom. Business Wire, BridgeCo Launches Entertainment Network Adapter for Pro Audio at NAMM Show, Jan. 16, 2003, 3 pages.
NewsRoom. Business Wire, BridgeCo Opens USA Business Development HQ in Silicon Valley and Expands Management Team, Mar. 15, 2004, 3 pages.
NewsRoom. Business Wire, BridgeCo Releases Silicon and Firmware Platform Compatible with Microsoft Windows Media Connect and Windows Media DRM Technology. May 3, 2004, 3 pages.
NewsRoom. Business Wire, CSR and BridgeCo Launch Design for New Generation Wireless Speakers; Transforms Traditional Speakers into Portable Internet Radio, May 6, 2003, 3 pages.
NewsRoom. Business Wire, Epson Announces the Epson Stylus Photo 900: The First Photo Printer Under $200 to Print Directly Onto CDs and DVDs; New Printer Offers a Complete Printing Solution for Digital Lifestyles, Apr. 16, 2003, 4 pages.
NewsRoom. Business Wire, Good Guys Unveils Top 10 Holiday Electronics Gifts; Advances in Technology and Lower Prices Across the Industry Make for Great Deals on In-Demand Products This Season, Dec. 3, 2003, 3 pages.
NewsRoom. Bytestechnology Briefing, Feb. 19, 2002, 2 pages.
NewsRoom. CEA Announces 2007 Mark of Excellence Award Winners, Mar. 10, 2007, 3 pages.
NewsRoom. CEDIA Abuzz with Trends—Integrators agree: The hot products at this year's expo are the start of a revolutionary move for the home automation market. Oct. 9, 2006, 4 pages.
NewsRoom. Chicago Sun Times, Wireless stream player hits the right notes, Jan. 17, 2004, 3 pages.
NewsRoom. Computer Shopper, Entertainment geekly: the blueprints have been drawn for a connected home that fuses the PC with entertainment devices. All you have to do is install . . . , Nov. 1, 2003, 6 pages.
NewsRoom. Computer Shopper, Tunesail around, vol. 23; Issue 11, Nov. 1, 2003, 1 page.
NewsRoom. Computer Shopper, What we want: here's the gear our editors are wishing for this year, vol. 23; Issue 12, Dec. 1, 2003, 8 pages.
NewsRoom. Computer Shopper, Wi-Fi meets Hi-Fi: here's how to stream music, still images, and videos to your home entertainment center, Nov. 1, 2003, 5 pages.
NewsRoom. Custom Home, Easy listening: the hard disk is shaping the future of home entertainment. (The Wired House)., May 1, 2003, 3 pages.
NewsRoom. D-Link to Supply Omnifi with Exclusive New Antenna for Streaming Audio Throughout the House, Jan. 8, 2004, 3 pages.
NewsRoom. Easdown, R., System Heaven: Custom House Technofile, Nov. 24, 2003, 5 pages.
NewsRoom. Electronic House Expo Announces 2005 Multi-Room Audio/Video Award Winners. Nov. 18, 2005, 3 pages.
NewsRoom. Electronic House Expo Fall 2003 Exhibitor Profiles. Business Wire. Nov. 11, 2003, 7 pages.
NewsRoom. Electronic House Expo Spring 2004 Exhibitor Profiles. Business Wire. Mar. 10, 2004, 7 pages.
NewsRoom. Evangelista, B., Sound and Fury the Latest in Volume and Video at SF Home Entertainment Show, Jun. 6, 2003, 3 pages.
NewsRoom. Fallon et al. The Goods, Jul. 31, 2003, 2 pages.
NewsRoom. Future shocks—CONNECT: Your ultimate home-entertainment guide, Dec. 4, 2003, 3 pages.
NewsRoom. Greg, T., Rooms with a tune, Jul. 23, 2003, 3 pages.
NewsRoom. Hoffman, A., Computer networks start entertaining, Jun. 1, 2003, 3 pages.
NewsRoom. Home theater systems that are a real blast, New Straits. Jan. 6, 2000, 3 pages.
NewsRoom. IDG's PC World Announces Winners of the 2004 World Class Awards, Jun. 2, 2004, 3 pages.
NewsRoom. InfoComm 2004 Exhibitors vol. 7, Issue 5, May 1, 2004, 24 pages.
NewsRoom. International Herald Tribune, Transmitting media gets easier cheaply, Jan. 31, 2004, 2 pages.
NewsRoom. Latest electronic gadgets unveiled in Las Vegas: Wireless Devices take centre stage, Jan. 13, 2003, 4 pages.
NewsRoom. Linksys Extends Wireless Functionality to the Television, Jul. 14, 2003, 3 pages.
NewsRoom. Linksys Ships Wireless-B Media Link for Streamlined Delivery of Music From PC to Stereo Stream MP3s, Play Lists and Internet Radio to Any Stereo With the Wireless-B Media Link for Music, May 19, 2004, 3 pages.
NewsRoom. Linksys Wireless Home Products Are Hot Tech Gifts for 2003, Nov. 24, 2003, 3 pages.
NewsRoom. Living room expansion—The PC is going from word processor to entertainment hub for many households, Aug. 18, 2003, 4 pages.
NewsRoom. Macy's Returns to Electronics With Home Theater Boutique, Aug. 11, 2003, 2 pages.
NewsRoom. Many different ways to enjoy digital music library, Apr. 29, 2003, 3 pages.
NewsRoom. Marlowe, C., Pad gadgets: home is where the gear is. Oct. 20, 2003, 2 pages.
NewsRoom. Miller II, S. A., Technology gets simpler and smarter, Jan. 14, 2003, 2 pages.
NewsRoom. Miller, M., Adapted for flight: hands-on trial: wireless media adapters send digital entertainment soaring from PC to living room. Sep. 18, 2003, 3 pages.
NewsRoom. Miller, S., Creating Virtual Jukeboxes Gadgets Make Digital Music Portable. Aug. 19, 2003, 3 pages.
NewsRoom. Morning Call, Cutting the cord; Wi-Fi networks connect computers, TVs, DVD players and more without a clutter of wires, Feb. 2, 2003, 5 pages.
NewsRoom. Mossberg, W., PC-stored music sent without wires, Jan. 25, 2004, 2 pages.
NewsRoom. Nadel, B., Beam music, images from PC to stereo, TV: Linksys Wireless-B Media Adapter WMA11B. Nov. 1, 2003, 2 pages.
NewsRoom. Net Briefs, Jul. 21, 2003, 2 pages.
NewsRoom. NetWork World, The Toys of Summer, Sep. 1, 2003, 3 pages.
NewsRoom. Networked C300 Speaks Your Language. Apr. 6, 2003, 3 pages.
NewsRoom. New Camera—Now What? It's easy to go wild printing, sharing your digital photos. Oct. 16, 2003, 2 pages.
NewsRoom. New Products Allow Easier Access to Audio Video on Home Computers, Nov. 9, 2003, 3 pages.
Wildstrom, Stephen. At CES, Cool Tech Still Rules. BusinessWeek Online, Jan. 13, 2003, 3 pages.
Wilkins, N., SMC SMCWMR-AG EZ-Stream (wireless) review. CNET, Feb. 8, 2004, 3 pages.
Wilkins, N., SMC SMCWMR-AG EZ-Stream (wireless) review. CNET, Feb. 8, 2004, 5 pages.
Williams, A. Zero Configuration Networking. Requirements for Automatic Configuration of IP Hosts, Sep. 19, 2002, 19 pages.
Williams, Stephen. NewsRoom, Going Wireless, Oct. 21, 2003, 2 pages.
Williams, Stephen. NewsRoom, Newsday, As Wireless Evolves, Compatibility is Key, Jul. 21, 2003, 3 pages.
Windows Media Connect Device Compatibility Specification (Apr. 12, 2004) (16 pages).
Windows XP: The Complete Reference—Chapter 19 Working with Sound, 6 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Wired. Total Remote Control, Issue 11.06, Jun. 2003, 2 pages.
Wireless Home Audio Director. Wireless-N Music Player with Integrated Amplifier DMC250. Datasheet. Linksys by Cisco. Fill Your Home with Music, 2008, 2 pages.
Wireless USB Adapter 11g CPWUA054, CPWUA054/00, CPWUA054/37, User Manual, Version: 1.0, Dec. 2003, 29 pages.
WPA Reauthentication Rates, AVAGO0063, Agere Systems, Feb. 2004, 3 pages.
Yahoo Finance. BridgeCo Successfully Commercializes its BeBoB Application for the Music Industry: Four Manufacturers Demonstrate BeBoB-enabled Products at NAMM 2004. Jan. 16, 2004, 3 pages.
Yahoo Groups. Exstreamer. Barix Exstreamer. Access via Wayback Machine http://groups.yahoo.com/group/exstreamer/ Dec. 22, 2013, 1 page.
Yamaha. Digital Audio Server, MCX-1000, Owner's Manual, 1996-2002, 148 pages.
Yamaha DME 32 manual: copyright 2001.
Yamaha DME 64 Owner's Manual; copyright 2004, 80 pages.
Yamaha DME Designer 3.0 Owner's Manual; Copyright 2008, 501 pages.
Yamaha DME Designer 3.5 setup manual guide; copyright 2004, 16 pages.
Yamaha DME Designer 3.5 User Manual; Copyright 2004, 507 pages.
Yamaha DME Designer software manual: Copyright 2004, 482 pages.
Yamaha MusicCAST Digital Audio Server MCX-1000 Owner's Manual, Copyright 1996-2002, 148 pages.
Yamaha, MusicCAST: Digital Audio Terminal MCX-A10, Owner's Manual. Jun. 4, 2003, 76 pages.
Yamaha Personal Receiver RP-U200 Operation Manual (“Operation Manual”), Copyright 1992-1997, 57 pages.
“Symantec pcAnywhere User's Guide,” v 10.5.1, 1995-2002, 154 pages.
“Systemline Modular Installation Guide, Multiroom System,” Systemline, 2003, pp. 1-22.
Zero Configuration networking with Bonjour—YouTube available via https://www.youtube.com/watch?v=ZhtZJ6EsCXo 3 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Zeroconf Working Group, Dynamic Configuration of IPv4 Link-Local Addresses, Internet-Draft, Jul. 8, 2004, 62 pages.
Zeroconf Working Group, Dynamic Configuration of IPv4 Link-Local Addresses, Internet-Draft, Jul. 1, 2004, 60 pages.
Zeroconf Working Group, Dynamic Configuration of IPv4 Link-Local Addresses, Internet-Draft, Jun. 7, 2004, 62 pages.
Zeroconf Working Group, Dynamic Configuration of Link-Local IPv4 Addresses, Internet-Draft, Feb. 16, 2004, 60 pages.
Zeroconf Working Group, Dynamic Configuration of Link-Local IPv4 Addresses, Internet-Draft, Mar. 31, 2004, 60 pages.
“ZR-8630AV MultiZone Audio/Video Receiver, Installation and Operation Guide,” Niles Audio Corporation, 2003, 86 pages.
ZX135: Installation Manual, LA Audio, Apr. 2003, 44 pages.
Plugged In. The latest in electronic hardware and incentive offerings. Electronic Awards, Jun. 2001, 4 pages. [produced by Google in Inv. No. 337-TA-1191 on Sep. 4, 2020].
Pohlmann, Ken. Omnifi DMS1 Wi-Fi Media Receiver. Sound & Vision, Oct. 20, 2003, 7 pages.
Polycom Conference Composer User Guide, copyright 2001, 29 pages.
Pre-Brief Conference Decision mailed on May 11, 2017, issued in connection with U.S. Appl. No. 14/504,812, filed Oct. 2, 2014, 2 pages.
Pre-Interview First Office Action dated Mar. 10, 2015, issued in connection with U.S. Appl. No. 14/505,027, filed Oct. 2, 2014, 4 pages.
Presentations at WinHEC 2000, May 2000, 138 pages.
PRISMIQ, Inc., “PRISMIQ Media Player User Guide,” 2003, 44 pages.
Pro Tools Reference Guide Version 5.3 Manual, 2002, 582 pages.
Pro Tools Reference Guide Version 6.1, 2003, 643 pages.
Proficient Audio Systems M6 Quick Start Guide, 2011, 5 pages.
Proficient Audio Systems: Proficient Editor Advanced Programming Guide, 2007, 40 pages.
Programming Interface for WL54040 Dual-Band Wireless Transceiver, AVAGO0066, Agere Systems, May 2004, 16 pages.
Publishing Network Services. Apple Developer Connection. Rendezvous Network Services: Publishing Network Services, Nov. 12, 2002, 6 pages.
Radio Shack, “Auto-Sensing 4-Way Audio/Video Selector Switch,” 2004, 1 page.
RadioShack, Pro-2053 Scanner, 2002 Catalog, part 1, 100 pages.
RadioShack, Pro-2053 Scanner, 2002 Catalog, part 2, 100 pages.
RadioShack, Pro-2053 Scanner, 2002 Catalog, part 3, 100 pages.
RadioShack, Pro-2053 Scanner, 2002 Catalog, part 4, 100 pages.
RadioShack, Pro-2053 Scanner, 2002 Catalog, part 5, 46 pages.
Rangan et al., “Feedback Techniques for Continuity and Synchronization in Multimedia Information Retrieval,” ACM Transactions on Information Systems, 1995, pp. 145-176, vol. 13, No. 2.
Real Time Control Protocol (RTCP) and Realtime Transfer Protocol (RTP), RFC 1889 (Jan. 1996) (D+M_0397810-84) (75 pages).
Realtime Streaming Protocol (RTSP), RFC 2326 (Apr. 1998) (D+M_0397945-8036) (92 pages).
Realtime Transport Protocol (RTP), RFC 3550 (Jul. 2003) (D+M_0398235-323) (89 pages).
Re-Exam Final Office Action dated Aug. 5, 2015, issued in connection with U.S. Appl. No. 90/013,423, filed Jan. 5, 2015, 25 pages.
Reexam Non-Final Office Action dated Nov. 9, 2016, issued in connection with U.S. Appl. No. 90/013,774, filed Jun. 29, 2016, 35 pages.
Re-Exam Non-Final Office Action dated Apr. 22, 2015, issued in connection with U.S. Appl. No. 90/013,423, filed Jan. 5, 2015, 16 pages.
Reid, Mark, “Multimedia conferencing over ISDN and IP networks using ITU-T H-series recommendations architecture, control and coordination,” Computer Networks, 1999, pp. 225-235, vol. 31.
RenderingControl:1 Service Template Version 1.01 For UPnP, Version 1.0, (Jun. 25, 2002) (SONDM000115187-249) (63 pages).
Rendezvous Network Services: Resolving and Using Network Services. Apple Developer Connection, Nov. 12, 2002, 5 pages.
Rendezvous Network Services: About Rendezvous. Apple Developer Connection, Nov. 12, 2002, 5 pages.
Renewed Request for Ex Parte Re-Examination, U.S. Appl. No. 90/013,959, filed Jun. 16, 2017, 126 pages.
Renkus Heinz Manual; available for sale at least as early as 2004, 6 pages.
Request for Ex Parte Reexamination submitted in U.S. Pat. No. 9,213,357 on May 22, 2017, 85 pages.
“Residential Distributed Audio Wiring Practices,” Leviton Network Solutions, 2001, 13 pages.
Reviewer's Choice Awards. Sound & Vision, Dec. 2000, pp. 127-134 [produced by Google in Inv. No. 337-TA-1191 on Sep. 4, 2020].
Ritchie et al., “MediaServer:1 Device Template Version 1.01,” Contributing Members of the UPnP Forum, Jun. 25, 2002, 12 pages.
Ritchie et al., “UPnP AV Architecture:1, Version 1.0,” Contributing Members of the UPnP Forum, Jun. 25, 2002, 22 pages.
Ritchie, John, “MediaRenderer:1 Device Template Version 1.01,” Contributing Members of the UPnP Forum, Jun. 25, 2002, 12 pages.
Rocketfish. Digital Wireless Speakers. RF-WS01/WS01-W/WS02 User Guide, 2008, 28 pages.
Rocketfish. Wireless OutdoorSpeaker. RF-RBWS02 User Guide, 2009, 33 pages.
Roku SoundBridge Network Music Player User Guide v2.5, 2006, 40 pages.
Roland Corporation, “Roland announces BA-55 Portable PA System,” press release, Apr. 6, 2011, 2 pages.
Rose, B., Home Networks: A Standards Perspective. In-Home Networking, IEEE Communications Magazine, Dec. 2001, 8 pages.
Rothermel et al., “An Adaptive Protocol for Synchronizing Media Streams,” Institute of Parallel and Distributed High-Performance Systems (IPVR), 1997, 26 pages.
Rothermel et al., “An Adaptive Stream Synchronization Protocol,” 5th International Workshop on Network and Operating System Support for Digital Audio and Video, 1995, 13 pages.
Rothermel et al., “An Adaptive Stream Synchronization Protocol,” 5th International Workshop on Network and Operating System Support for Digital Audio and Video, Apr. 18-21, 1995, 12 pages.
Rothermel et al., “Clock Hierarchies—An Abstraction for Grouping and Controlling Media Streams,” University of Stuttgart Institute of Parallel and Distributed High-Performance Systems, Jan. 1996, 23 pages.
Rothermel et al., “Synchronization in Joint-Viewing Environments,” University of Stuttgart Institute of Parallel and Distributed High-Performance Systems, 1992, 13 pages.
Rothermel, Kurt, “State-of-the-Art and Future Research in Stream Synchronization,” University of Stuttgart, 3 pages.
Creative Sound Blaster Wireless Music, User's Guide, Version 1.0, Aug. 2003, 61 pages.
Crest Audio Pro Series 8001 Power Amplifier. V. 2.2 Mar. 25, 1997, 2 pages.
Crestron's Adagio Entertainment System with New AMS Processor Wins Awards at CEDIA, Sep. 29, 2006, 3 pages.
Crestron Adagio AMS Media System Operations Guide, 2008, 114 pages.
Crestron. Adagio. Home Entertainment is Just the Beginning . . . 2007, 10 pages.
Crestron. AVS Forum. Dec. 1, 2007, 9 pages.
Crestron, Industry Awards, Crestron's Spirit of Innovation has Resulted in the Most Award-Winning Products in the Industry, 2006, 6 pages.
Crestron, Industry Awards, Crestron's Spirit of Innovation has Resulted in the Most Award-Winning Products in the Industry, 2007, 5 pages.
Crome, Caleb. Logitech Squeezebox Boom Audio Design, 2008, 11 pages.
Crown PIP Manual, available for sale at least as early as 2004, 68 pages.
Dannenberg et al., “A System Supporting Flexible Distributed Real-Time Music Processing,” Proceedings of the 2001 International Computer Music Conference, 2001, 4 pages.
Dannenberg, Roger B., “Remote Access to Interactive Media,” Proceedings of the SPIE 1785, 1993, pp. 230-237.
Davies, Chris. Sony Ericsson MS500 Bluetooth Splashproof Speaker. http://www.slashgear.com/sony-ericsson-ms500-bluetooth-splashproof. Mar. 17, 2009, 2 pages.
Day, Rebecca, “Going Elan!” Primedia Inc., 2003, 4 pages.
Deep-Sleep Implementation in WL60011 for IEEE 802.11b Applications, AVAGO0020, Agere Systems, Jul. 2004, 22 pages.
Dell, Inc. “Dell Digital Audio Receiver: Reference Guide,” Jun. 2000, 70 pages.
Dell, Inc. “Start Here,” Jun. 2000, 2 pages.
“Denon 2003-2004 Product Catalog,” Denon, 2003-2004, 44 pages.
Denon AV Surround Receiver AVR-1604/684 User's Manual, 2004, 128 pages.
Denon AV Surround Receiver AVR-5800 Operating Instructions, Copyright 2000, 67 pages.
Denon AVR-3805 A/V Surround Receiver. Datasheet, last modified Mar. 1, 2004, 2 pages.
Designing a UPnP AV MediaServer, Nelson Kidd (2003) (SONDM000115062-116) (55 pages).
Dhir, Amit, “Wireless Home Networks—DECT, Bluetooth, Home RF, and Wireless LANs,” XILINX, wp135 (v1.0), Mar. 21, 2001, 18 pages.
Dierks et al. RFC 2246 The TLS Protocol, Jan. 1999, 80 pages.
Digigram. EtherSound ES8in/8out Ethernet Audio Bridges. Easy and Cost-Effective Audio Distribution, Nov. 2002, 4 pages.
D-Link. User's Manual, Wireless HD Media Player, Version 1.1, DSM-520, Sep. 28, 2005, 127 pages.
DLNA. Overview and Vision, White Paper, Jun. 2004, 16 pages.
DLNA. Use Case Scenarios, White Paper, Jun. 2004, 15 pages.
“DP-0206 Digital Signal Processor,” TOA Electronics, Inc., 2001, pp. 1-12.
DP-0206 TOA Digital Signal Processor. TOA Corporation, 2001, 4 pages.
Duo Soundolier. Sound & Light: Wireless Speaker Torchiere. Soundolier Integrated Wireless Technologies, 2006, 3 pages.
ECMA. Near Field Communication—White Paper, Ecma/TC32-TG19/2004/1, 9 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
ECMA. Near Field Communication, ECMA/TC32-TG19, Oct. 2002, 15 pages.
ECMA. Standard ECMA-340, Near Field Communication—Interface and Protocol NFCIP-1, Dec. 2002, 66 pages.
Ecma. What is Ecma? 2 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Epson. EpsonNet 802.11B, Convenient Printing Using Wireless Technology, 2002, 2 pages.
Epson. EpsonNet 802.11b, User's Guide, 2002, 68 pages.
Epson Product Support Bulletin. PSB # PSB.2003.05.005, EpsonNet 802.11b Wireless Print Server, Apr. 30, 2003, 30 pages.
Epson Product Support Bulletin. PSB # PSB.2003.05.007, EpsonNet 802.11b Wireless Print Server, Apr. 23, 2003, 10 pages.
Epson Stylus C80WN. Quick Start, 2002, 2 pages.
Epson Stylus C80WN. Setup and Installation, Nov. 2001, 67 pages.
European Patent Office, Communication pursuant to Article 94(3) EPC mailed on Feb. 22, 2022, issued in connection with European Application No. 19194999.9, 4 pages.
European Patent Office, European Extended Search Report dated Mar. 7, 2016, issued in connection with EP Application No. 13810340.3, 9 pages.
European Patent Office, European Extended Search Report dated Feb. 28, 2014, issued in connection with EP Application No. 13184747.7, 8 pages.
European Patent Office, European Extended Search Report dated Mar. 31, 2015, issued in connection with EP Application No. 14181454.1, 9 pages.
European Patent Office, European Extended Search Report dated Mar. 31, 2020, issued in connection with European Application No. 19194999.9, 15 pages.
European Patent Office, European Office Action dated Sep. 1, 2017, issued in connection with European Application No. 13184747.7, 7 pages.
European Patent Office, European Search Report dated Jan. 27, 2020, issued in connection with European Application No. 19194999.9, 16 pages.
European Patent Office, Examination Report dated Mar. 22, 2016, issued in connection with European Patent Application No. EP14181454.1, 6 pages.
European Patent Office, Examination Report dated Oct. 24, 2016, issued in connection with European Patent Application No. 13808623.6, 4 pages.
BridgeCo. BridgeCo Launches UPnP-Compliant Wireless Audio Adapter: Moving More Digital Audio to More Devices in More Locations, Wirelessly. Sep. 16, 2003, 1 page.
BridgeCo. Company Overview. 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
BridgeCo. Networked Loudspeaker Product Information, 4 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
BridgeCo. Professional Loudspeaker—Product Information, 3 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
BridgeCo. User Manual, Wireless Audio Adapter. Sep. 22, 2003, 34 pages.
BridgeCo. Vision. 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
BridgeCo. Vision, 5 Factors, 5 Missing Functionalities. 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
BridgeCo. Vision, 5 Key Functions. 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
BridgeCo. Vision, BridgeCo Solution. 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
BridgeCo. Vision, Consumer Benefits. 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
BridgeCo. Vision, Consumer Demand. 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
BridgeCo. Vision, ENA Applications. 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
BridgeCo. Vision, ENA Deployment. 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
BridgeCo. Vision, ENA Functionality. 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
BridgeCo. Vision, ENA Market. 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
BridgeCo. Vision, Entertainment Continuum. 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
BridgeCo. Vision, Entertainment Network Adapter. 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
BridgeCo. Vision, New Entertainment. 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
BridgeCo. Vision, Technical Problems. 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
BridgeCo. Wireless Audio Adapter, Product Information. 3 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
BridgeCo. Wireless Audio Adapter Reference Design, Product Information. Version 1.3. Oct. 31, 2003, 2 pages.
BridgeCo. Wireless Loudspeaker, Product Information. 4 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Buffalo. Link Theater LT-H90 Media Player v1.0, 2003-2008, 38 pages.
Buffalo. LinkTheater PC-P3LWG/DVD, 59 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Business Wire. BridgeCo Adds Wireless Connectivity and Enhances Surround Sound Processing for New Generation Speakers May 5, 2003, 2 pages.
C200 Wireless Network MP3 Player, Jun. 4, 2003, 1 page.
Canadian Intellectual Property Office, Canadian Office Action dated Apr. 4, 2016, issued in connection with Canadian Patent Application No. 2,842,342, 5 pages.
Canadian Intellectual Property Office, Canadian Office Action dated Sep. 14, 2015, issued in connection with Canadian Patent Application No. 2,842,342, 2 pages.
Canadian Patent Office, Canadian Office Action dated Jun. 11, 2019, issued in connection with Canadian Application No. 2982726, 4 pages.
Canadian Patent Office, Canadian Office Action dated Mar. 3, 2020, issued in connection with Canadian Application No. 3033268, 4 pages.
Canadian Patent Office, Office Action dated Jul. 10, 2018, issued in connection with Canadian Application No. 2982726, 3 pages.
Carnoy, David. Parrot DS1120 Wireless Hi-Fi Speaker System Review, Jul. 15, 2008, 4 pages.
Case et al. RFC 1157—A Simple Network Management Protocol, May 1990, 36 pages.
Cd30. Audio Control Document V4.2 Released! Sep. 18, 2003, 7 pages.
Cd30 Audio Control Protocol. Version 4.2. Sep. 18, 2003, 24 pages.
Cd30. Audio Stream Protocol Released. Mar. 9, 2004, 2 pages.
Cd30. Audio Stream Protocol: Version 18. Mar. 9, 2004, 13 pages.
Cd30 Backgrounder, 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Cd30. c100 Network MP3 Player. Quick Product Summary. 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Cd30. c200 Wireless Network MP3 Player. Quick Product Summary. 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Cd30. c300 Extended-Range Wireless Network MP3 Player. Quick Product Summary, 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Cd30 c300 Reviews. Digital Audio Receivers (DARs) Reviews by CNET, Mar. 30, 2003, 3 pages.
Cd30. Careers, Nov. 21, 2003, 1 page.
Cd30. Contact, Dec. 12, 2003, 1 page.
Cd30. Corporate Fact Sheet, 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Cd30 FAQs. What problem or need does cd30 address with their products? 2 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Cd30 Frequently-Asked Questions About cd30 Network MP3 Players, Dec. 12, 2003, 6 pages.
Cd30 Introduces Family of MP3 Players at this year's Consumer Electronics Show. Jan. 9-12, 2003 Las Vegas Convention Center, Feb. 12, 2004, 2 pages.
Cd30 Introduces Family of MP3 Players at this year's Consumer Electronics Show. Jan. 9-12, 2003 Las Vegas Convention Center, 2 pages.
Notice of Allowance dated May 6, 2011, issued in connection with U.S. Appl. No. 11/801,468, filed May 9, 2007, 10 pages.
Notice of Allowance dated Sep. 6, 2013, issued in connection with U.S. Appl. No. 13/619,237, filed Sep. 14, 2012, 10 pages.
Notice of Allowance dated Apr. 7, 2016, issued in connection with U.S. Appl. No. 13/533,105, filed Jun. 26, 2012, 40 pages.
Notice of Allowance dated Dec. 7, 2016, issued in connection with U.S. Appl. No. 15/156,392, filed May 17, 2016, 9 pages.
Notice of Allowance dated Oct. 7, 2015, issued in connection with U.S. Appl. No. 14/184,526, filed Feb. 19, 2014, 7 pages.
Notice of Allowance dated Mar. 9, 2017, issued in connection with U.S. Appl. No. 15/080,591, filed Mar. 25, 2016, 7 pages.
Notice of Allowance dated Oct. 9, 2015, issued in connection with U.S. Appl. No. 13/435,739, filed Mar. 30, 2012, 4 pages.
Notice of Allowance dated Oct. 1, 2018, issued in connection with U.S. Appl. No. 13/705,176, filed Dec. 5, 2012, 12 pages.
Notice of Allowance dated Aug. 10, 2015, issued in connection with U.S. Appl. No. 13/848,904, filed Mar. 22, 2013, 9 pages.
Notice of Allowance dated Aug. 10, 2018, issued in connection with U.S. Appl. No. 15/081,911, filed Mar. 27, 2016, 5 pages.
Notice of Allowance dated Feb. 10, 2017, issued in connection with U.S. Appl. No. 14/290,493, filed May 29, 2014, 13 pages.
Notice of Allowance dated Jul. 10, 2018, issued in connection with U.S. Appl. No. 14/504,812, filed Oct. 2, 2014, 9 pages.
Notice of Allowance dated Jul. 10, 2020, issued in connection with U.S. Appl. No. 13/864,249, filed Apr. 17, 2013, 26 pages.
Notice of Allowance dated May 10, 2018, issued in connection with U.S. Appl. No. 13/864,248, filed Apr. 17, 2013, 8 pages.
Notice of Allowance dated Nov. 10, 2011, issued in connection with U.S. Appl. No. 11/906,702, filed Oct. 2, 2007, 17 pages.
Notice of Allowance dated Apr. 11, 2016, issued in connection with U.S. Appl. No. 13/864,247, filed Apr. 17, 2013, 21 pages.
Notice of Allowance dated Feb. 11, 2019, issued in connection with U.S. Appl. No. 16/180,920, filed Nov. 5, 2018, 10 pages.
Notice of Allowance dated Feb. 11, 2019, issued in connection with U.S. Appl. No. 16/181,342, filed Nov. 6, 2018, 15 pages.
Notice of Allowance dated Jan. 11, 2016, issued in connection with U.S. Appl. No. 14/564,544, filed Dec. 9, 2014, 5 pages.
Notice of Allowance dated Mar. 11, 2019, issued in connection with U.S. Appl. No. 15/243,355, filed Aug. 22, 2016, 17 pages.
Notice of Allowance dated Aug. 12, 2015, issued in connection with U.S. Appl. No. 13/435,739, filed Mar. 30, 2012, 27 pages.
Notice of Allowance dated Jul. 12, 2017, issued in connection with U.S. Appl. No. 13/894,179, filed May 14, 2013, 10 pages.
Notice of Allowance dated Sep. 12, 2018, issued in connection with U.S. Appl. No. 13/864,250, filed Apr. 17, 2013, 8 pages.
Notice of Allowance dated Sep. 12, 2018, issued in connection with U.S. Appl. No. 15/228,639, filed Aug. 4, 2016, 8 pages.
Notice of Allowance dated Sep. 12, 2018, issued in connection with U.S. Appl. No. 15/243,186, filed Aug. 22, 2016, 8 pages.
Notice of Allowance dated Dec. 13, 2016, issued in connection with U.S. Appl. No. 15/080,591, filed Mar. 25, 2016, 9 pages.
Notice of Allowance dated Jul. 13, 2015, issued in connection with U.S. Appl. No. 14/184,526, filed Feb. 19, 2014, 22 pages.
Notice of Allowance dated Jul. 13, 2017, issued in connection with U.S. Appl. No. 13/895,076, filed May 15, 2013, 10 pages.
Notice of Allowance dated Jul. 13, 2020, issued in connection with U.S. Appl. No. 16/516,567, filed Jul. 19, 2019, 19 pages.
Notice of Allowance dated May 13, 2020, issued in connection with U.S. Appl. No. 16/459,565, filed Jul. 1, 2019, 8 pages.
Notice of Allowance dated May 13, 2020, issued in connection with U.S. Appl. No. 16/544,900, filed Aug. 20, 2019, 8 pages.
Notice of Allowance dated Nov. 13, 2013, issued in connection with U.S. Appl. No. 13/724,048, filed Dec. 21, 2012, 7 pages.
Notice of Allowance dated Nov. 13, 2020, issued in connection with U.S. Appl. No. 13/864,249, filed Apr. 17, 2013, 8 pages.
Notice of Allowance dated Oct. 13, 2015, issued in connection with U.S. Appl. No. 13/864,251, filed Apr. 17, 2013, 7 pages.
Notice of Allowance dated Sep. 13, 2021, issued in connection with U.S. Appl. No. 16/383,910, filed Apr. 15, 2019, 5 pages.
Notice of Allowance dated Aug. 14, 2012, issued in connection with U.S. Appl. No. 11/147,116, filed Jun. 6, 2005, 33 pages.
Notice of Allowance dated Dec. 14, 2016, issued in connection with U.S. Appl. No. 15/088,906, filed Apr. 1, 2016, 9 pages.
Notice of Allowance dated Jun. 14, 2012, issued in connection with U.S. Appl. No. 12/035,112, filed Feb. 21, 2008, 9 pages.
Notice of Allowance dated Mar. 14, 2019, issued in connection with U.S. Appl. No. 14/808,875, filed Jul. 24, 2015, 14 pages.
Notice of Allowance dated Dec. 15, 2020, issued in connection with U.S. Appl. No. 15/946,660, filed Apr. 5, 2018, 5 pages.
Notice of Allowance dated Jul. 15, 2015, issued in connection with U.S. Appl. No. 13/705,174, filed Dec. 5, 2012, 18 pages.
Notice of Allowance dated Mar. 15, 2017, issued in connection with U.S. Appl. No. 15/080,716, filed Mar. 25, 2016, 7 pages.
Notice of Allowance dated Nov. 15, 2019, issued in connection with U.S. Appl. No. 16/544,902, filed Aug. 20, 2019, 9 pages.
Notice of Allowance dated Jun. 16, 2009, issued in connection with U.S. Appl. No. 10/861,653, filed Jun. 5, 2004, 11 pages.
Notice of Allowance dated Sep. 16, 2020, issued in connection with U.S. Appl. No. 15/946,660, filed Apr. 5, 2018, 5 pages.
Notice of Allowance dated Dec. 17, 2018, issued in connection with U.S. Appl. No. 16/128,404, filed Sep. 11, 2018, 24 pages.
Notice of Allowance dated Jul. 17, 2015, issued in connection with U.S. Appl. No. 13/864,251, filed Apr. 17, 2013, 20 pages.
Notice of Allowance dated May 17, 2021, issued in connection with U.S. Appl. No. 16/459,605, filed Jul. 1, 2019, 23 pages.
Notice of Allowance dated Jul. 18, 2014, issued in connection with U.S. Appl. No. 13/618,829, filed Sep. 14, 2012, 8 pages.
Notice of Allowance dated Sep. 18, 2019, issued in connection with U.S. Appl. No. 16/459,569, filed Jul. 1, 2019, 10 pages.
NewsRoom. Washington Post, Ask the Computer Guy, Jan. 11, 2004, 2 pages.
NewsRoom. Yamaha Announces the World's First Wireless Home Music System. Aug. 11, 2003, 2 pages.
NewsRoom. Yamaha Musiccast An easy way to spread music around your home. Dec. 1, 2003, 2 pages.
NewsRoom. Slim Devices Introduces Squeezebox. PR Newswire. Nov. 18, 2003, 2 pages.
“NexSys Software v.3 Manual,” Crest Audio, Inc., 1997, 76 pages.
Niederst, Jennifer “O'Reilly Web Design in a Nutshell,” Second Edition, Sep. 2001, 678 pages.
Niles SI-1230. Systems Integration Amplifier. Installation & Operation Guide, 2009, 32 pages.
Niles SI-1260. Systems Integration Amplifier. Installation & Operation Guide, 2000, 32 pages.
Niles SVL-4 Speaker Selection/Volume Control System Installation & Operation Guide. Copyright 1999. Sourced from Sonos, Inc. v. Lenbrook Industries Limited et al., Defendants' Answer to Plaintiff's Complaint—Exhibit C, filed Oct. 14, 2019, 16 pages.
Nilsson, M., “ID3 Tag Version 2,” Mar. 26, 1998, 28 pages.
Non-Final Office Action dated May 1, 2014, issued in connection with U.S. Appl. No. 14/184,522, filed Feb. 19, 2014, 31 pages.
Non-Final Office Action dated Dec. 5, 2013, issued in connection with U.S. Appl. No. 13/827,653, filed Mar. 14, 2013, 28 pages.
Non-Final Office Action dated Jan. 5, 2012, issued in connection with U.S. Appl. No. 13/298,090, filed Nov. 16, 2011, 40 pages.
Non-Final Office Action dated May 6, 2014, issued in connection with U.S. Appl. No. 13/705,176, filed Dec. 5, 2012, 23 pages.
Non-Final Office Action dated Sep. 7, 2016, issued in connection with U.S. Appl. No. 13/864,248, filed Apr. 17, 2013, 12 pages.
Non-final Office Action dated Apr. 10, 2013, issued in connection with U.S. Appl. No. 13/619,237, filed Sep. 14, 2012, 10 pages.
Non-Final Office Action dated May 12, 2014, issued in connection with U.S. Appl. No. 14/184,528, filed Feb. 19, 2014, 23 pages.
Non-Final Office Action dated May 14, 2014, issued in connection with U.S. Appl. No. 13/848,932, filed Mar. 22, 2013, 14 pages.
Non-Final Office Action dated Jun. 17, 2014, issued in connection with U.S. Appl. No. 14/176,808, filed Feb. 10, 2014, 6 pages.
Non-Final Office Action dated Dec. 18, 2013, issued in connection with U.S. Appl. No. 13/907,666, filed May 31, 2013, 12 pages.
Non-Final Office Action dated Jan. 18, 2008, issued in connection with U.S. Appl. No. 10/816,217, filed Apr. 1, 2004, 38 pages.
Non-Final Office Action dated Apr. 19, 2010, issued in connection with U.S. Appl. No. 11/801,468, filed May 9, 2007, 16 pages.
Non-Final Office Action dated Mar. 19, 2013, issued in connection with U.S. Appl. No. 13/724,048, filed Dec. 21, 2012, 9 pages.
Non-Final Office Action dated Jun. 21, 2011, issued in connection with U.S. Appl. No. 10/816,217, filed Apr. 1, 2004, 13 pages.
Non-Final Office Action dated Jan. 22, 2009, issued in connection with U.S. Appl. No. 10/816,217, filed Apr. 1, 2004, 18 pages.
Non-Final Office Action dated Jul. 22, 2021, issued in connection with U.S. Appl. No. 17/306,016, filed May 3, 2021, 8 pages.
Non-Final Office Action dated Jul. 25, 2014, issued in connection with U.S. Appl. No. 14/184,526, filed Feb. 19, 2014, 9 pages.
Non-Final Office Action dated Jul. 25, 2014, issued in connection with U.S. Appl. No. 14/184,935, filed Feb. 20, 2014, 11 pages.
Non-Final Office Action dated Jun. 25, 2010, issued in connection with U.S. Appl. No. 10/816,217, filed Apr. 1, 2004, 17 pages.
Non-Final Office Action dated Nov. 25, 2013, issued in connection with U.S. Appl. No. 13/533,105, filed Jun. 26, 2012, 19 pages.
Non-Final Office Action dated May 27, 2014, issued in connection with U.S. Appl. No. 14/186,850, filed Feb. 21, 2014, 13 pages.
Non-Final Office Action dated Feb. 29, 2012, issued in connection with U.S. Appl. No. 13/297,000, filed Nov. 15, 2011, 10 pages.
Non-Final Office Action dated Nov. 29, 2010, issued in connection with U.S. Appl. No. 11/801,468, filed May 9, 2007, 17 pages.
Non-Final Office Action dated Jul. 30, 2013, issued in connection with U.S. Appl. No. 13/724,048, filed Dec. 21, 2012, 7 pages.
Non-Final Office Action dated Jul. 31, 2014, issued in connection with U.S. Appl. No. 13/533,105, filed Jun. 26, 2012, 31 pages.
Non-Final Office Action dated Dec. 1, 2014, issued in connection with U.S. Appl. No. 14/516,867, filed Oct. 17, 2014, 11 pages.
Non-Final Office Action dated Jun. 1, 2016, issued in connection with U.S. Appl. No. 14/184,522, filed Feb. 19, 2014, 21 pages.
Non-Final Office Action dated Oct. 1, 2019, issued in connection with U.S. Appl. No. 16/516,567, filed Jul. 19, 2019, 11 pages.
Non-Final Office Action dated Sep. 1, 2010, issued in connection with U.S. Appl. No. 11/147,116, filed Jun. 6, 2005, 36 pages.
Non-Final Office Action dated Nov. 2, 2016, issued in connection with U.S. Appl. No. 14/486,667, filed Sep. 15, 2014, 37 pages.
Non-Final Office Action dated Feb. 3, 2009, issued in connection with U.S. Appl. No. 11/147,116, filed Jun. 6, 2005, 32 pages.
Non-Final Office Action dated Jan. 3, 2017, issued in connection with U.S. Appl. No. 14/808,875, filed Jul. 24, 2015, 10 pages.
Non-Final Office Action dated Jun. 3, 2015, issued in connection with U.S. Appl. No. 14/564,544, filed Dec. 9, 2014, 7 pages.
Non-Final Office Action dated Nov. 3, 2016, issued in connection with U.S. Appl. No. 14/184,528, filed Feb. 19, 2014, 17 pages.
Non-Final Office Action dated Oct. 3, 2014, issued in connection with U.S. Appl. No. 13/863,083, filed Apr. 15, 2013, 22 pages.
Non-Final Office Action dated Jun. 4, 2015, issued in connection with U.S. Appl. No. 13/871,795, filed Apr. 26, 2013, 16 pages.
Non-Final Office Action dated Mar. 4, 2015, issued in connection with U.S. Appl. No. 13/435,776, filed Mar. 30, 2012, 16 pages.
Non-Final Office Action dated Oct. 4, 2016, issued in connection with U.S. Appl. No. 15/089,758, filed Apr. 4, 2016, 9 pages.
Non-Final Office Action dated Oct. 5, 2016, issued in connection with U.S. Appl. No. 13/864,250, filed Apr. 17, 2013, 10 pages.
Non-Final Office Action dated Oct. 5, 2016, issued in connection with U.S. Appl. No. 13/864,252, filed Apr. 17, 2013, 11 pages.
Notice of Allowance dated Apr. 3, 2017, issued in connection with U.S. Appl. No. 15/088,678, filed Apr. 1, 2016, 8 pages.
Notice of Allowance dated Jun. 3, 2019, issued in connection with U.S. Appl. No. 13/864,249, filed Apr. 17, 2013, 8 pages.
Notice of Allowance dated Aug. 30, 2016, issued in connection with U.S. Appl. No. 14/290,493, filed May 29, 2014, 7 pages.
Notice of Allowance dated Jul. 30, 2015, issued in connection with U.S. Appl. No. 13/705,178, filed Dec. 5, 2012, 18 pages.
Notice of Allowance dated Mar. 30, 2017, issued in connection with U.S. Appl. No. 15/088,532, filed Apr. 1, 2016, 7 pages.
Notice of Allowance dated Nov. 30, 2021, issued in connection with U.S. Appl. No. 16/383,910, filed Apr. 15, 2019, 5 pages.
Notice of Allowance dated Jan. 31, 2018, issued in connection with U.S. Appl. No. 13/871,785, filed Apr. 26, 2013, 6 pages.
Notice of Allowance dated Jun. 4, 2021, issued in connection with U.S. Appl. No. 17/102,873, filed Nov. 24, 2020, 8 pages.
Notice of Allowance dated Aug. 5, 2015, issued in connection with U.S. Appl. No. 13/435,776, filed Mar. 30, 2012, 26 pages.
Notice of Allowance dated Apr. 6, 2017, issued in connection with U.S. Appl. No. 15/088,283, filed Apr. 1, 2016, 7 pages.
Notice of Allowance dated Jul. 6, 2015, issued in connection with U.S. Appl. No. 13/297,000, filed Nov. 15, 2011, 24 pages.
Notice of Allowance dated Mar. 6, 2019, issued in connection with U.S. Appl. No. 16/181,327, filed Nov. 5, 2018, 16 pages.
Notice of Allowance dated Jan. 8, 2019, issued in connection with U.S. Appl. No. 13/864,252, filed Apr. 17, 2013, 10 pages.
Notice of Allowance dated Jan. 8, 2021, issued in connection with U.S. Appl. No. 16/009,182, filed Jun. 14, 2018, 8 pages.
Notice of Allowance dated Jun. 2, 2020, issued in connection with U.S. Appl. No. 15/946,660, filed Apr. 5, 2018, 5 pages.
Notice of Appeal and Certificate of Service filed on Jul. 16, 2018, in connection with Reexam U.S. Appl. No. 90/013,959, filed Jun. 16, 2017, 2 pages.
Notice of Incomplete Re-Exam Request dated May 25, 2017, issued in connection with U.S. Appl. No. 90/013,959, filed Apr. 1, 2016, 10 pages.
Notice of Intent to Issue Re-Examination Certificate dated Aug. 3, 2017, issued in connection with U.S. Appl. No. 90/013,882, filed Dec. 27, 2016, 20 pages.
Now Playing. cd30 and Wireless Network Connection 2 Status, 7 pages [produced by Google in Inv. No. 337-TA-1191 on Sep. 4, 2020].
Nutzel et al., “Sharing Systems for Future HiFi Systems,” IEEE, 2004, 9 pages.
Office Action in Ex Parte Reexamination dated Oct. 20, 2017, issued in connection with Reexamination U.S. Appl. No. 90/013,959, filed Jun. 16, 2017, 50 pages.
Ohr, Stephan. Audio IC, equipment houses embrace multichannel, Oct. 23, 2000, 1 page [produced by Google in Inv. No. 337-TA-1191 on Dec. 1, 2020].
Olenick, Doug. Networked MP3 Player Lineup Bows From cd3o. Jan. 9, 2003, 6 pages.
Olenick, Doug. Twice, Networked MP3 Player Lineup Bows from cd3o, Jan. 9, 2003, 2 pages.
Omnifi A Simple Media Experience. DMS1 User Manual, Jul. 2003, 36 pages.
Omnifi DMS1 Wi-Fi Media Receiver p. 2, Sound & Vision, Copyright 2020, 7 pages.
Omnifi DMS1 Wi-Fi Media Receiver p. 3, Sound & Vision, Copyright 2020, 5 pages.
“Sonos Multi-Room Music System User Guide,” Version 090401, Sonos, Inc. Apr. 1, 2009, 256 pages.
P4 0.3.1 software/source code available via link (“Download P4 0.3.1.”) 1 page [online], [retrieved on Mar. 26, 2020], Retrieved from the Internet URL: http://snarfed.org/p4.
P4sync/player.cpp. GitHub. Copyright 2005, 4 pages [online], [retrieved on Mar. 26, 2020]. Retrieved from the Internet URL: http://github.com/snarfed/p4sync/blob/master/player.cpp.
Palm, Inc., “Handbook for the Palm VII Handheld,” May 2000, 311 pages.
Parasound Zpre2 Zone Preamplifier with PIZI Remote Control, 2005, 16 pages.
Park et al., “Group Synchronization in MultiCast Media Communications,” Proceedings of the 5th Research on Multicast Technology Workshop, 2003, 5 pages.
Parrot—All Products—Bluetooth Hands Free Car Kits, Oct. 21, 2008, 3 pages.
Parrot DS1120—Wireless Hi-Fi Stereo Sound System, Nov. 22, 2008, 3 pages.
Parrot DS1120 User Guide, English. Retrieved on Mar. 26, 2020, 11 pages.
Parrot DS1120 User Manual, 2007, 22 pages.
Pascoe, Bob, “Salutation Architectures and the newly defined service discovery protocols from Microsoft® and Sun®,” Salutation Consortium, White Paper, Jun. 6, 1999, 5 pages.
Patent Board Decision mailed on May 31, 2019, issued in connection with U.S. Appl. No. 90/013,959, filed Jun. 16, 2017, 9 pages.
Philips. Installation CD Content, software/source code available via zip file ("Installation CD Content") published Sep. 15, 2004, 3 pages [online], [retrieved on Feb. 24, 2020]. Retrieved from the Internet URL: https://www.usa.philips.com/c-p/MCW770_37/-/support.
Philips Leads Consumer Electronics Industry with 21 CES Innovation Awards. Business Wire. 2004 International CES, Jan. 6, 2004, 3 pages.
Philips. MC W7708. Wireless PC Link Quick Installation. Published Dec. 22, 2004, 8 pages.
Philips. MCW770 Leaflet. Remote Control MP3 Music from Your PC . . . Wirelessly. MP3 Micro Hi-Fi System with 5 CD Tray Changer. Published Mar. 2, 2004, 2 pages.
Philips. MCW770 Quick Use Guide. English version. Published Dec. 22, 2004, 4 pages.
Philips Media Manager 3.3.12.0004 Release Notes, last modified Aug. 29, 2006, 2 pages.
Philips. Media Manager Software—English version: PMM 3.3.12, software/source code available via zip file ("Media Manager Software—English") published Sep. 15, 2004, 3 pages [online], [retrieved on Feb. 24, 2020]. Retrieved from the Internet URL: https://www.usa.philips.com/c-p/MCW770_37/-/support.
Philips. PC Software version: V.12.1, software/source code available via zip file ("PC Software") published Sep. 15, 2004, 3 pages [online], [retrieved on Feb. 24, 2020]. Retrieved from the Internet URL: https://www.usa.philips.com/c-p/MCW770_37/-/support.
Philips. Wireless PC Link Micro MCW770 Custom Installation, User Manual, published Aug. 24, 2004, 61 pages.
Pillai et al., “A Method to Improve the Robustness of MPEG Video Applications over Wireless Networks,” Kent Ridge Digital Labs, 2000, 15 pages.
Pinnacle ShowCenter. Pinnacle Systems, Mar. 2005, 132 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Sonos's Motion to Strike Defendants' New Amended Answer Submitted with their Reply Brief, provided Sep. 15, 2016, 10 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Sonos's Opposition to Defendants' Motion for Leave to Amend their Answer to Add the Defense of Inequitable Conduct, provided Oct. 31, 2016, 26 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Third Amended Complaint for Patent Infringement, filed Jan. 29, 2016, 47 pages.
Sonos, Inc. v. D&M Holdings, Inc. (No. 14-1330-RGA), Defendants' Final Invalidity Contentions (Jan. 18, 2017) (106 pages).
Sonos, Inc. v. D&M Holdings (No. 14-1330-RGA), DI 226, Opinion Denying Inequitable Conduct Defenses, Feb. 6, 2017, updated, 5 pages.
Sonos, Inc. v. D&M Holdings (No. 14-1330-RGA), DI 242, US District Judge Andrews 101 Opinion, Mar. 13, 2017, 16 pages.
Sonos, Inc. v. D&M Holdings, Sonos Supp Opening Markman Brief including Exhibits, Mar. 3, 2017, 17 pages.
Sonos, Inc. v. D&M Holdings, Sonos Supp Reply Markman Brief including Exhibits, Mar. 29, 2017, 36 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Declaration of Steven C. Visser, executed Sep. 9, 2016, 40 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions Exhibit 1: Defendants' Invalidity Contentions for U.S. Pat. No. 7,571,014 filed Sep. 16, 2016, 270 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions Exhibit 10: Defendants' Invalidity Contentions for U.S. Pat. No. 9,219,959 filed Sep. 27, 2016, 236 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions Exhibit 11: Defendants' Invalidity Contentions for Design U.S. Pat. No. D. 559,197 filed Sep. 27, 2016, 52 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions Exhibit 2: Defendants' Invalidity Contentions for U.S. Pat. No. 8,588,949 filed Sep. 27, 2016, 224 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions Exhibit 3: Defendants' Invalidity Contentions for U.S. Pat. No. 8,843,224 filed Sep. 27, 2016, 147 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions Exhibit 4: Defendants' Invalidity Contentions for U.S. Pat. No. 8,938,312 filed Sep. 27, 2016, 229 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions Exhibit 5: Defendants' Invalidity Contentions for U.S. Pat. No. 8,938,637 filed Sep. 27, 2016, 213 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions Exhibit 6: Defendants' Invalidity Contentions for U.S. Pat. No. 9,042,556 filed Sep. 27, 2016, 162 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions Exhibit 7: Defendants' Invalidity Contentions for U.S. Pat. No. 9,195,258 filed Sep. 27, 2016, 418 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions Exhibit 8: Defendants' Invalidity Contentions for U.S. Pat. No. 9,202,509 filed Sep. 27, 2016, 331 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Amended Invalidity Contentions Exhibit 9: Defendants' Invalidity Contentions for U.S. Pat. No. 9,213,357 filed Sep. 27, 2016, 251 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 1: Defendants' Invalidity Contentions for U.S. Pat. No. 7,571,014 filed Apr. 15, 2016, 161 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 10: Defendants' Invalidity Contentions for U.S. Pat. No. 9,213,357 filed Apr. 15, 2016, 244 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 11: Defendants' Invalidity Contentions for U.S. Pat. No. 9,219,959 filed Apr. 15, 2016, 172 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 12: Defendants' Invalidity Contentions for Design U.S. Pat. No. D. 559,197 filed Apr. 15, 2016, 36 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 2: Defendants' Invalidity Contentions for U.S. Pat. No. 8,588,949 filed Apr. 15, 2016, 112 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 3: Defendants' Invalidity Contentions for U.S. Pat. No. 8,843,224 filed Apr. 15, 2016, 118 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 4: Defendants' Invalidity Contentions for U.S. Pat. No. 8,938,312 filed Apr. 15, 2016, 217 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 5: Defendants' Invalidity Contentions for U.S. Pat. No. 8,938,637 filed Apr. 15, 2016, 177 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 6: Defendants' Invalidity Contentions for U.S. Pat. No. 9,042,556 filed Apr. 15, 2016, 86 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 7: Defendants' Invalidity Contentions for U.S. Pat. No. 9,130,771 filed Apr. 15, 2016, 203 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 8: Defendants' Invalidity Contentions for U.S. Pat. No. 9,195,258 filed Apr. 15, 2016, 400 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Initial Invalidity Contentions Exhibit 9: Defendants' Invalidity Contentions for U.S. Pat. No. 9,202,509 filed Apr. 15, 2016, 163 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendant's Preliminary Identification of Prior Art References, provided Jul. 29, 2016, 5 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendants' Brief in Support of their Motion for Leave to Amend their Answer to Add the Defense of Inequitable Conduct, provided Oct. 12, 2016, 24 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Defendants' Opposition to Sonos's Motion to Strike Defendants' New Amended Answer Submitted with their Reply, provided Oct. 3, 2016, 15 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Exhibit A: Defendants' Second Amended Answer to Plaintiffs' Third Amended Complaint, provided Oct. 12, 2016, 43 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Exhibit B: Defendants' Second Amended Answer to Plaintiffs' Third Amended Complaint, provided Oct. 12, 2016, 43 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Opening Brief in Support of Defendants' Motion for Leave to Amend Their Answer to Add the Defense of Inequitable Conduct, provided Aug. 1, 2016, 11 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Order, provided Oct. 7, 2016, 2 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Plaintiff's Opposition to Defendants' Motion for Leave to Amend Their Answer to Add the Defense of Inequitable Conduct, provided Aug. 26, 2016, 25 pages.
Sonos, Inc. v. D&M Holdings Inc. et al., Redlined Exhibit B: Defendants' First Amended Answer to Plaintiffs' Third Amended Complaint, provided Aug. 1, 2016, 27 pages.
Sonos, Inc. v. D&M Holdings (No. 14-1330-RGA), DI 206-1, Transcript of 101 Hearing (Nov. 28, 2016) (28 pages).
Sonos, Inc. v. D&M Holdings (No. 14-1330-RGA), DI 207, Public Joint Claim Construction Brief (Nov. 30, 2016) (88 pages).
Sonos, Inc. v. D&M Holdings (No. 14-1330-RGA), DI 214, D&M Post-Markman Letter (Dec. 22, 2016) (13 pages).
Sonos, Inc. v. D&M Holdings (No. 14-1330-RGA), DI 215, Sonos Post-Markman Letter (Dec. 22, 2016) (15 pages).
Sonos, Inc. v. D&M Holdings (No. 14-1330-RGA), DI 219, Claim Construction Opinion (Jan. 12, 2017) (24 pages).
Sonos, Inc. v. D&M Holdings (No. 14-1330-RGA), DI 221, Claim Construction Order (Jan. 18, 2017) (2 pages).
Sonos, Inc. v. D&M Holdings (No. 14-1330-RGA), Markman Hearing Transcript (Dec. 14, 2016) (69 pages).
Sonos, Inc. v. Google LLC, Appendix A to Respondents' Response to the Complaint and Notice of Investigation, filed Feb. 27, 2020, 2 pages.
Sonos, Inc. v. Google LLC, Appendix B to Respondents' Response to the Complaint and Notice of Investigation, filed Feb. 27, 2020, 176 pages.
Final Office Action dated Aug. 14, 2009, issued in connection with U.S. Appl. No. 11/147,116, filed Jun. 6, 2005, 28 pages.
Final Office Action dated Feb. 15, 2018, issued in connection with U.S. Appl. No. 14/516,883, filed Oct. 17, 2014, 17 pages.
Final Office Action dated Jul. 15, 2015, issued in connection with U.S. Appl. No. 14/504,812, filed Oct. 2, 2014, 18 pages.
Final Office Action dated Jun. 15, 2015, issued in connection with U.S. Appl. No. 14/184,522, filed Feb. 19, 2014, 25 pages.
Final Office Action dated Jun. 15, 2017, issued in connection with U.S. Appl. No. 15/228,639, filed Aug. 4, 2016, 16 pages.
Final Office Action dated May 15, 2017, issued in connection with U.S. Appl. No. 13/864,252, filed Apr. 17, 2013, 13 pages.
Final Office Action dated Mar. 16, 2011, issued in connection with U.S. Appl. No. 11/147,116, filed Jun. 6, 2005, 40 pages.
Final Office Action dated Mar. 16, 2018, issued in connection with U.S. Appl. No. 90/013,959, filed Jun. 16, 2017, 39 pages.
Final Office Action dated May 16, 2017, issued in connection with U.S. Appl. No. 13/864,249, filed Apr. 17, 2013, 14 pages.
Final Office Action dated May 16, 2017, issued in connection with U.S. Appl. No. 13/864,250, filed Apr. 17, 2013, 12 pages.
Final Office Action dated May 16, 2018, issued in connection with U.S. Appl. No. 14/184,528, filed Feb. 19, 2014, 11 pages.
Final Office Action dated Oct. 16, 2018, issued in connection with U.S. Appl. No. 15/243,355, filed Aug. 22, 2016, 15 pages.
Final Office Action dated Dec. 17, 2014, issued in connection with U.S. Appl. No. 13/533,105, filed Jun. 26, 2012, 36 pages.
Final Office Action dated Oct. 19, 2016, issued in connection with U.S. Appl. No. 13/848,921, filed Mar. 22, 2013, 14 pages.
Final Office Action dated Apr. 20, 2017, issued in connection with U.S. Appl. No. 13/864,248, filed Apr. 17, 2013, 14 pages.
Final Office Action dated Jan. 21, 2010, issued in connection with U.S. Appl. No. 11/906,702, filed Oct. 2, 2007, 27 pages.
Final Office Action dated Oct. 22, 2014, issued in connection with U.S. Appl. No. 14/186,850, filed Feb. 21, 2014, 12 pages.
Final Office Action dated Oct. 23, 2014, issued in connection with U.S. Appl. No. 13/705,176, filed Dec. 5, 2012, 23 pages.
Final Office Action dated Dec. 24, 2009, issued in connection with U.S. Appl. No. 11/147,116, filed Jun. 6, 2005, 29 pages.
Final Office Action dated Feb. 24, 2016, issued in connection with U.S. Appl. No. 13/871,795, filed Apr. 26, 2013, 28 pages.
Final Office Action dated May 25, 2016, issued in connection with U.S. Appl. No. 14/290,493, filed May 29, 2014, 33 pages.
Final Office Action dated Oct. 26, 2018, issued in connection with U.S. Appl. No. 15/095,145, filed Apr. 10, 2016, 14 pages.
Final Office Action dated Apr. 28, 2015, issued in connection with U.S. Appl. No. 14/186,850, filed Feb. 21, 2014, 20 pages.
Final Office Action dated Jun. 28, 2017, issued in connection with U.S. Appl. No. 14/808,875, filed Jul. 24, 2015, 14 pages.
Final Office Action dated Oct. 28, 2020, issued in connection with U.S. Appl. No. 16/459,605, filed Jul. 1, 2019, 14 pages.
Final Office Action dated Mar. 29, 2018, issued in connection with U.S. Appl. No. 14/504,812, filed Oct. 2, 2014, 24 pages.
Final Office Action dated Mar. 30, 2021, issued in connection with U.S. Appl. No. 16/383,910, filed Apr. 15, 2019, 15 pages.
Final Office Action dated Nov. 30, 2015, issued in connection with U.S. Appl. No. 13/871,795, filed Apr. 26, 2013, 26 pages.
Final Office Action dated Dec. 31, 2015, issued in connection with U.S. Appl. No. 14/486,667, filed Sep. 15, 2014, 34 pages.
Final Office Action dated Dec. 4, 2018, issued in connection with U.S. Appl. No. 16/009,182, filed Jun. 14, 2018, 24 pages.
FireBall Digital Music Manager E-40 and E-120. Meet FireBall. The Industry's choice for managing your entire music collection. Datasheet. 2003, 2 pages.
Fireball DVD and Music Manager DVDM-100 Installation and User's Guide, Copyright 2003, 185 pages.
Fireball E2 User's Manual. Escient. Gracenote cddb. 2000-2004, 106 pages.
Fireball MP-200 User's Manual, Copyright 2006, 93 pages.
Fireball Remote Control Guide WD006-1-1, Copyright 2003, 19 pages.
Fireball SE-D1 User's Manual, Copyright 2005, 90 pages.
First Action Interview Office Action Summary dated Apr. 15, 2015, issued in connection with U.S. Appl. No. 14/505,027, filed Oct. 2, 2014, 6 pages.
First Action Pre-Interview Office Action dated Jun. 22, 2017, issued in connection with U.S. Appl. No. 14/516,883, filed Oct. 17, 2014, 4 pages.
First Action Pre-Interview Office Action dated Jun. 22, 2017, issued in connection with U.S. Appl. No. 14/516,883, filed Oct. 17, 2014, 5 pages.
First Office Action Interview dated Aug. 30, 2017, issued in connection with U.S. Appl. No. 14/516,883, filed Oct. 17, 2014, 5 pages.
Fober et al., “Clock Skew Compensation over a High Latency Network,” Proceedings of the ICMC, 2002, pp. 548-552.
Fout, Tom, “Universal Plug and Play (UPnP) Client Support,” Microsoft, Aug. 2001, 18 pages.
Fried, John J. NewsRoom, Convergence melds personal computer, TV and stereo, Feb. 20, 2003, 4 pages.
Fries et al. “The MP3 and Internet Audio Handbook: Your Guide to the Digital Music Revolution.” 2000, 320 pages.
Frodigh, Magnus. Wireless ad hoc networking—The art of networking without a network, Ericsson Review No. 4, 2000, 16 pages.
Fulton et al., “The Network Audio System: Make Your Application Sing (as Well as Dance)!” The X Resource, 1994, 14 pages.
Gaston et al., “Methods for Sharing Stereo and Multichannel Recordings Among Planetariums,” Audio Engineering Society Convention Paper 7474, 2008, 15 pages.
Gateway SOLO 5300 User Manual, 305 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
General Event Notification Architecture Base: Client to Arbiter (Apr. 2000) (23 pages).
Getting to know Logitech Squeezebox Touch Wi-Fi Music Player. Features Guide, 2010, 36 pages.
NewsRoom. Newman, H., All-in-one Audio, Video Devices will be next big thing, Jan. 9, 2003, 3 pages.
NewsRoom. Norris, A., Come over to my house. Jan. 23, 2003, 3 pages.
NewsRoom. On the Printer Trail—Ream of new SMB models offers channel a range of sales hooks CRN Test Center finds. Oct. 13, 2003, 5 pages.
NewsRoom. One way to organize and weed Favorites, May 8, 2003, 3 pages.
NewsRoom. Outfitting your personal fortress of solitude, Mar. 14, 2002, 4 pages.
NewsRoom. Philadelphia Inquirer, Wireless solution for stereo sound, Aug. 7, 2003, 3 pages.
NewsRoom. Popular Science, Yamaha Musiccast an easy way to spread music around your home, Dec. 1, 2003, 2 pages.
NewsRoom. PR Newswire, “Home Director Announces Availability of AudioPoint Receiver,” Sep. 27, 2002, 4 pages.
NewsRoom. Preview the New EZ-Stream Wireless Audio Adapter at CES Jan. 8-11, 2004 BridgeCo Booth 19629, Jan. 7, 2004, 3 pages.
NewsRoom. Receiver Lets Stereo Join the Wi-Fi Band, Apr. 10, 2003, 2 pages.
NewsRoom. Rogers, P., Speaker Screech: The End is Near, Apr. 8, 2003, 2 pages.
NewsRoom. San Jose Mercury News, Intel Fund to Invest in Digital Home, Jan. 7, 2004, 2 pages.
NewsRoom. Science & Technology: Wired for sound and video, Jan. 14, 2004, 3 pages.
NewsRoom, Sears reveals plans for new Eatons stores, Oct. 26, 2000, 3 pages.
NewsRoom. Seattle Times, Inventions real stars of the show As speeches predict future 100,000 browse ‘superstore’, Jan. 13, 2003, 4 pages.
NewsRoom, Sensible Sound, Goin' to a show-show, Surveying the Soundscape, Jun. 1, 2003, 8 pages.
NewsRoom. Shaw, K., Cool Tools, Jan. 20, 2003, 2 pages.
NewsRoom. Sheehan, W., More brains, less brawn. Sep. 1, 2003, 3 pages.
NewsRoom. Sidener, J., Everett Roach, Jul. 14, 2003, 2 pages.
NewsRoom. Sirius XM Companies Flood Cedia With New Products. Satellite Week. Sep. 15, 2003, 2 pages.
NewsRoom. Slim Devices Introduces Slimserver, Nov. 18, 2003, 2 pages.
NewsRoom. Slim Devices Introduces Slimserver. PR Newswire. Nov. 18, 2003, 2 pages.
NewsRoom. Slim Devices Introduces Squeezebox, Nov. 18, 2003, 2 pages.
NewsRoom. SMC Sponsors Canada's First Combined ‘LAN Event’ for Gamers: DreamlanSMC, Jan. 15, 2004, 2 pages.
NewsRoom. SMC Sponsors Canada's First Combined ‘LAN Event’ for Gamers: DreamlanSMC, Jan. 15, 2004, 3 pages.
NewsRoom. SMC Sponsors Home by Design Showhouse/Connected by Design Tour, Jan. 6, 2004, 3 pages.
NewsRoom. SMC Teams with Get Digital to Offer Free Music Conversion to Its Wireless Audio Adapter Users, Feb. 23, 2004, 3 pages.
NewsRoom. SMC teams with Get Digital to offer free music conversion to wireless users, Mar. 29, 2004, 1 page.
NewsRoom. SMC to Offer Home Entertainment Networking Bundle With New Intel Desktop Boards, Nov. 3, 2003, 3 pages.
NewsRoom. Sonic divide crumbles, 2001 WLNR 5430795. Sep. 5, 2001, 3 pages.
NewsRoom. Sound and Fury the Latest in Volume and Video at SF Home Entertainment Show Jun. 6, 2003, 3 pages.
NewsRoom. Sound Blaster Goes Wireless, Sep. 30, 2003, 3 pages.
NewsRoom. St. Paul Pioneer Press, Guide to Better Giving You Know These People. Why Is it So Hard to Buy for Them? Maybe It's Not: Everyone Need Technology, From the Littlest Angel to the Most Resistant Grandparent, Nov. 24, 2003, 6 pages.
NewsRoom. Sullivan, A., PluggedIn—Digital music migrates to the home stereo, Oct. 28, 2003, 3 pages.
NewsRoom. Tech along, Jan. 25, 2004, 3 pages.
NewsRoom. Technology Life in the iPad. Mar. 15, 2007, 5 pages.
NewsRoom. Televisions defy hi-tech trend for minimalism, Feb. 19, 2004, 3 pages.
NewsRoom. The 50 Best Music Systems, Dec. 13, 2003, 15 pages.
NewsRoom. The Age (Australia), Fresh Gadgets, 2001 WLNR 13294645, Sep. 7, 2001, 3 pages.
NewsRoom. The Dallas Morning News, Honorable mentions worth a look, Nov. 20, 2003, 2 pages.
NewsRoom. The Dallas Morning News, Innovations Hasten Trend of On-the-Go Music, Video, Technology, Jan. 16, 2003, 4 pages.
NewsRoom. The Dallas Morning News, Wireless Technology Focus of Consumer Electronics Show in Las Vegas, Jan. 9, 2003, 4 pages.
NewsRoom, The Goods Whats' New What's Hot, Nov. 9, 2000, 2 pages.
NewsRoom. The Next Ace in the Hole?—Epson HP set the stage for promising alternatives to wired solutions in vertical markets, Jan. 14, 2002, 3 pages.
NewsRoom. The Orange County Register, Holiday Season Brings Gift Ideas for Tech-Heads, Gadget Groupie, Dec. 8, 2003, 4 pages.
NewsRoom. The personal computer shows its creative side. Technology has discovered its next “killer app.” Aug. 14, 2003, 3 pages.
NewsRoom. The top 25: computer shopper editors handpick this months best desktops notebooks digital audio receivers, handhelds, and software. Nov. 1, 2003, 3 pages.
NewsRoom. The toys of summer: Some cool tools that will get you through the lazy days. Sep. 1, 2003, 3 pages.
NewsRoom. The wide world of Wi-Fi: wherever you are, wireless networking is where it's at. Find out which Wi-Fi components will help you stay connected while . . . May 1, 2004, 7 pages.
NewsRoom. Ticker, Aug. 1, 2003, 2 pages.
Cd30 Introduces Family of Wireless Network MP3 Players. Jan. 9-12, 2003 Las Vegas Convention Center, 2 pages.
Cd30. Logo page, 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Cd30 Management, Dec. 12, 2003, 1 page.
Cd30. Management Team, 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Cd30. Multi-Player Synchronization. Jan. 15, 2004, 4 pages.
Cd30 Network MP3 Player Models, Feb. 1, 2004, 1 page.
Cd30, Network MP3 Player, Product Manual. Copyright 2003, 65 pages.
Cd30 Network MP3 Player. Product Manual for c100, c200, and c300, 2003, 65 pages.
Cd30. Network MP3 Player. Quick Installation Guide, 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Cd30 Network MP3 Player Reviews. Feb. 1, 2004, 2 pages.
Cd30 Network MP3 Player Setup Wizard, 9 pages, [produced by Google in Inv. No. 337-TA-1191 on Sep. 4, 2020].
Cd30 Network MP3 Player Specifications. Feb. 2, 2004, 2 pages.
Cd30 Network MP3 Players, Nov. 18, 2003, 1 page.
Cd30 Network MP3 Players c100, c200, and c300, 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Cd30 Network MP3 Players: Stream music from your PC to your stereo, Nov. 18, 2003, 1 page.
Cd30 Network MP3 Players: Stream your MP3s to your stereo! May 24, 2003, 1 page.
Cd30. News, Reviews, Nov. 21, 2003, 2 pages.
Cd30. Product Support. May 10, 2006, 17 pages.
Cd30 Product Support Forums. Forum Index, Apr. 15, 2003, 1 page.
Cd30 Product Support Forums. Forum Index, Jun. 18, 2003, 1 page.
Cd30 Product Support Forums. Forum Index, Feb. 2, 2004, 1 page.
Cd30. Product Support Forums. Multiple stereos—multiple cd30s—same music? Nov. 3, 2003, 2 pages.
Cd3o. Network MP3 Player, Product Manual, 2003, 65 pages.
Cd3o Product Support Center, Nov. 19, 2003, 1 page.
Cen et al., “A Distributed Real-Time MPEG Video Audio Player,” Department of Computer Science and Engineering, Oregon Graduate Institute of Science and Technology, 1995, 12 pages.
CES: MP3-Player mit Pfiff, Jan. 13, 2003, 4 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Chakrabarti et al., “A Remotely Controlled Bluetooth Enabled Environment,” IEEE, 2004, pp. 77-81.
Change Notification: Agere Systems WaveLan Multimode Reference Design (D2 to D3), AVAGO0042, Agere Systems, Nov. 2004, 2 pages.
Chapter 19: Working with Sound. Configuring Windows to Work with Sound, 6 pages, [produced by Google in Inv. No. 337-TA-1191 on Sep. 4, 2020].
Cheshire et al. RFC 3927—Dynamic Configuration of IPv4 Link-Local Addresses, 2005, 34 pages.
Cheshire et al. Zero Configuration Networking: The Definitive Guide. Dec. 2005, 288 pages.
Chinese Office Action, Office Action dated Dec. 20, 2016, issued in connection with Chinese Application No. 201380044446.8, 16 pages.
Chinese Patent Office, Office Action dated Jul. 5, 2016, issued in connection with Chinese Patent Application No. 201380044380.2, 25 pages.
Chinese Patent Office, Second Office Action dated Feb. 27, 2017, issued in connection with Chinese Patent Application No. 201380044380.2, 22 pages.
Clipsal. Multi Room Audio Amplifier, User's Guide, V1.0, Dec. 2005, 28 pages.
Clipsal. Multi Room Audio Matrix Switcher, User's Guide, 560884, V1.0, Dec. 2005, 20 pages.
C-Media. CM102-A/102S USB 2CH Audio Controller, Data Sheet. Version 1.4. May 21, 2003, 20 pages.
C-Media Electronics Inc. CMI8768/8768+ Advanced Driver Software Architecture. User Manual, Revision: 1.0, May 25, 2004, 29 pages.
C-Media XeaR 3D Sound Solution. CMI8738 4/6-Channel PCI Audio Single Chip. User Manual, Rev. 2.1, May 21, 2002, 44 pages.
CNET. Wireless gizmo for PC music hits home, Sep. 30, 2003, 4 pages.
Compaq et al., Universal Serial Bus Specification, Revision 2.0, Apr. 27, 2000, 650 pages.
Connected, distributed audio solution for your home by barix and Stand-alone, distributed audio solution for your home by barix. Copyright Sep. 2003. Sourced from Sonos, Inc. v. Lenbrook Industries Limited et al.—Defendants' Answer to Plaintiff's Complaint—Exhibit A filed Oct. 14, 2019, 3 pages.
Connected Planet. Using PC Link. Streamium PC Link by Philips. Models MC-i200/250, SL300i, SL400i, MX6000i, last modified Aug. 5, 2004, 2 pages.
Connection Manager: 1 Service Template Version 1.01 for UPnP, Version 1.0 (Jun. 25, 2002) (25 pages).
ContentDirectory:1 Service Template Version 1.01 for UPnP, Version 1.0 (Jun. 25, 2002) (89 pages).
Corrected Notice of Allowability dated Dec. 23, 2016, issued in connection with U.S. Appl. No. 14/803,953, filed Jul. 20, 2015, 18 pages.
Corrected Notice of Allowance dated Aug. 19, 2015, issued in connection with U.S. Appl. No. 13/907,666, filed May 31, 2013, 2 pages.
Creating the Future of Home Entertainment Today. NetStreams Product Catalog 2003/2004, 20 pages.
Creative, “Connecting Bluetooth Devices with Creative D200,” http://support.creative.com/kb/ShowArticle.aspx?url=http://ask.creative.com:80/SRVS/CGI-BIN/WEBCGI.EXE/,/?St=106,E=0000000000396859016,K=9377,Sxi=8,VARSET=ws:http://us.creative.com,case=63350, available on Nov. 28, 2011, 2 pages.
Sonos, Inc. v. Google LLC, Complainant Sonos, Inc.'s Pre-Hearing Brief [Redacted Jan. 29, 2021] dated Jan. 22, 2021, 513 pages.
Sonos, Inc. v. Google LLC, Direct Witness Statement of Dan Schonfeld, PH.D. [Redacted Jan. 29, 2021] dated Dec. 18, 2020, 390 pages.
Sonos, Inc. v. Google LLC, Direct Witness Statement of Martin Rinard [Redacted Jan. 28, 2021] dated Dec. 18, 2020, 217 pages.
Sonos, Inc. v. Google LLC, Expert Report of Dan Schonfeld, PH.D., Regarding Invalidity of Asserted Claims of U.S. Pat. No. 9,195,258 and U.S. Pat. No. 10,209,953 [Redacted Dec. 1, 2020] dated Oct. 23, 2020, 387 pages.
Sonos, Inc. v. Google LLC, Expert Report of Martin Rinard, PH.D., Regarding the Validity of Claims 1-6 of U.S. Pat. No. 8,588,949 [Redacted Dec. 1, 2020] dated Oct. 23, 2020, 261 pages.
Sonos, Inc. v. Google, LLC. Google's Petition for Review of the Initial Determination on Violation of Section 337, filed Sep. 8, 2021, 106 pages.
Sonos, Inc. v. Google, LLC. Google's Response to Sonos's Petition for Review of the Initial Determination on Violation of Section 337, Sep. 7, 2021, 111 pages.
Sonos, Inc. v. Google LLC. Initial Determination on Violation of Section 337 and Recommended Determination on Remedy and Bond, filed Aug. 13, 2021, 199 pages.
Sonos, Inc. v. Google LLC. Order 20: Construing the Terms of the Asserted Claims of the Patents at Issue dated Sep. 25, 2020, 53 pages.
Sonos, Inc. v. Google LLC, Rebuttal Expert Report of Jon B. Weissman, Regarding Validity of U.S. Pat. No. 8,588,949 [Redacted Jan. 29, 2021] dated Nov. 13, 2020, 369 pages.
Sonos, Inc. v. Google LLC, Rebuttal Expert Report of Kevin C. Almeroth [Redacted Jan. 29, 2021] dated Nov. 13, 2020, 547 pages.
Sonos, Inc. v. Google LLC, Rebuttal Witness Statement of Jon B. Weissman Regarding Validity of U.S. Pat. No. 8,588,949 and U.S. Pat. No. 10,439,896 [Redacted Jan. 29, 2021] dated Jan. 8, 2021, 736 pages.
Sonos, Inc. v. Google LLC, Rebuttal Witness Statement of Kevin C. Almeroth [Redacted Jan. 29, 2021] dated Jan. 8, 2021, 735 pages.
Sonos, Inc. v. Google LLC, Respondent Google's Pre-Trial Brief [Redacted Jan. 29, 2021] dated Jan. 22, 2021, 516 pages.
Sonos, Inc. v. Google LLC. Respondents' Final Invalidity Claim Charts for U.S. Pat. No. 10,209,953, Exhibits 1-10 and B, dated Sep. 4, 2020, 406 pages.
Sonos, Inc. v. Google LLC. Respondents' Final Invalidity Claim Charts for U.S. Pat. No. 9,195,258, Exhibits 1-10 and B, dated Sep. 4, 2020, 461 pages.
Sonos, Inc. v. Google LLC. Respondents' Final Invalidity Claims Charts for U.S. Pat. No. 8,588,949, Exhibits 1-20 and B, dated Sep. 4, 2020, 520 pages.
Sonos, Inc. v. Google LLC. Respondents' Final Invalidity Contentions [Redacted] dated Sep. 4, 2020, 261 pages.
Sonos, Inc. v. Google LLC, Respondents' Response to the Complaint and Notice of Investigation, filed Feb. 27, 2020, 46 pages.
Sonos, Inc. v. Google, LLC. Sonos Inc.'s Petition and Contingent Petition for Review of the Initial Determination on Violation of Section 337, Aug. 27, 2021, 122 pages.
Sonos, Inc. v. Google, LLC. Sonos Inc.'s Response to Google's Petition for Review of the Initial Determination on Violation of Section 337, Sep. 7, 2021, 117 pages.
Sonos, Inc. v. Implicit, LLC: Declaration of Roman Chertov in Support of the Inter Partes Review of U.S. Pat. No. 7,391,791 dated Mar. 9, 2018, 92 pages.
Sonos, Inc. v. Implicit, LLC: Declaration of Roman Chertov in Support of the Inter Partes Review of U.S. Pat. No. 8,942,252 dated Mar. 9, 2018, 81 pages.
Sonos, Inc. v. Lenbrook Industries Limited et al., Defendants' Answer to Plaintiff's Complaint, filed Oct. 14, 2019, 66 pages.
Sonos, Inc. v. Lenbrook Industries Limited et al., Defendants' First Amended Answer and Counterclaims to Plaintiff's Complaint, filed Nov. 14, 2019, 66 pages.
Sonos System Overview, Version 1.0, Jul. 2011, 12 pages.
Sonos v. Google. Exhibit A to Respondents' Initial Invalidity Contentions dated Apr. 29, 2020, 194 pages.
Sonos v. Google. Respondents' Initial Invalidity Claim Charts for U.S. Pat. No. 10,439,896, Exhibits 1-16 and B, dated Apr. 29, 2020, 1102 pages.
Sonos v. Google. Respondents' Initial Invalidity Claim Charts for U.S. Pat. No. 10,209,953, Exhibits 1-10 and B, dated Apr. 29, 2020, 288 pages.
Sonos v. Google. Respondents' Initial Invalidity Claim Charts for U.S. Pat. No. 8,588,949, Exhibits 1-19 and B, dated Apr. 29, 2020, 280 pages.
Sonos v. Google. Respondents' Initial Invalidity Claim Charts for U.S. Pat. No. 9,195,258, Exhibits 1-10 and B, dated Apr. 29, 2020, 345 pages.
Sonos v. Google. Respondents' Initial Invalidity Claim Charts for U.S. Pat. No. 9,219,959, Exhibits 1-9 and B, dated Apr. 29, 2020, 344 pages.
Sonos v. Google. Respondents' Initial Invalidity Contentions dated Apr. 29, 2020, 200 pages.
Sony: AIR-SA 50R Wireless Speaker, Copyright 2009, 2 pages.
Sony: Altus Quick Setup Guide ALT-SA32PC, Copyright 2009, 2 pages.
Sony: BD/DVD Home Theatre System Operating Instructions for BDV-E300, E301 and E801, Copyright 2009, 115 pages.
Sony: BD/DVD Home Theatre System Operating Instructions for BDV-IT1000/BDV-IS1000, Copyright 2008, 159 pages.
Sony: Blu-ray Disc/DVD Home Theatre System Operating Instructions for BDV-IZ1000W, Copyright 2010, 88 pages.
Sony: DVD Home Theatre System Operating Instructions for DAV-DZ380W/DZ680W/DZ880W, Copyright 2009, 136 pages.
Sony: DVD Home Theatre System Operating Instructions for DAV-DZ870W, Copyright 2008, 128 pages.
Sony Ericsson MS500 User Guide, Copyright 2009, 2 pages.
Sony. Home Theatre System. HT-DDW790 and HT-DDW685 Operating Instructions, 2007, 64 pages.
Sony: Home Theatre System Operating Instructions for HT-IS100, Copyright 2008, 168 pages.
Sony: HT-IS100, 5.1 Channel Audio System, last updated Nov. 2009, 2 pages.
Sony: Multi Channel AV Receiver Operating Instructions, 2007, 80 pages.
Sony: Multi Channel AV Receiver Operating Instructions for STR-DN1000, Copyright 2009, 136 pages.
Sony Shows Off Range of Home LANs, Dec. 15, 2000, 1 page.
Sony: STR-DN1000, Audio Video Receiver, last updated Aug. 2009, 2 pages.
Sony: Wireless Surround Kit Operating Instructions for WHAT-SA2, Copyright 2010, 56 pages.
Sound Blaster, Wireless Music. User's Guide: Creative Sound Blaster Wireless Music Version 1.0, Aug. 2003, 66 pages.
Non-Final Office Action dated Apr. 24, 2018, issued in connection with U.S. Appl. No. 15/095,145, filed Apr. 10, 2016, 13 pages.
Non-Final Office Action dated Oct. 24, 2014, issued in connection with U.S. Appl. No. 13/435,776, filed Mar. 30, 2012, 14 pages.
Non-Final Office Action dated Feb. 26, 2015, issued in connection with U.S. Appl. No. 14/186,850, filed Feb. 21, 2014, 25 pages.
Non-Final Office Action dated Jul. 26, 2017, issued in connection with U.S. Appl. No. 13/705,176, filed Dec. 5, 2012, 14 pages.
Non-Final Office Action dated Mar. 26, 2015, issued in connection with U.S. Appl. No. 14/184,528, filed Feb. 19, 2014, 18 pages.
Non-Final Office Action dated Jan. 27, 2021, issued in connection with U.S. Appl. No. 17/102,873, filed Nov. 24, 2020, 27 pages.
Non-Final Office Action dated Jun. 27, 2008, issued in connection with U.S. Appl. No. 10/861,653, filed Jun. 5, 2004, 19 pages.
Non-Final Office Action dated Mar. 27, 2015, issued in connection with U.S. Appl. No. 13/705,178, filed Dec. 5, 2012, 14 pages.
Non-Final Office Action dated Sep. 27, 2019, issued in connection with U.S. Appl. No. 13/864,249, filed Apr. 17, 2013, 13 pages.
Non-Final Office Action dated Aug. 28, 2017, issued in connection with U.S. Appl. No. 14/504,812, filed Oct. 2, 2014, 17 pages.
Non-Final Office Action dated Dec. 28, 2015, issued in connection with U.S. Appl. No. 14/290,493, filed May 29, 2014, 29 pages.
Non-Final Office Action dated Nov. 28, 2017, issued in connection with U.S. Appl. No. 13/864,248, filed Apr. 17, 2013, 12 pages.
Non-Final Office Action dated Sep. 28, 2018, issued in connection with U.S. Appl. No. 15/972,383, filed May 7, 2018, 15 pages.
Non-Final Office Action dated Aug. 29, 2017, issued in connection with U.S. Appl. No. 14/058,166, filed Oct. 18, 2013, 12 pages.
Non-Final Office Action dated Nov. 29, 2016, issued in connection with U.S. Appl. No. 13/894,179, filed May 14, 2013, 14 pages.
Non-Final Office Action dated Apr. 30, 2012, issued in connection with U.S. Appl. No. 13/204,511, filed Aug. 5, 2011, 16 pages.
Non-Final Office Action dated Apr. 30, 2020, issued in connection with U.S. Appl. No. 16/459,605, filed Jul. 1, 2019, 25 pages.
Non-Final Office Action dated Jan. 30, 2015, issued in connection with U.S. Appl. No. 14/290,493, filed May 29, 2014, 29 pages.
Non-Final Office Action dated Jan. 30, 2015, issued in connection with U.S. Appl. No. 14/504,812, filed Oct. 2, 2014, 13 pages.
Non-Final Office Action dated Jul. 30, 2018, issued in connection with U.S. Appl. No. 16/009,182, filed Jun. 14, 2018, 22 pages.
Non-Final Office Action dated Nov. 30, 2016, issued in connection with U.S. Appl. No. 15/243,186, filed Aug. 22, 2016, 12 pages.
Non-Final Office Action dated Oct. 30, 2018, issued in connection with U.S. Appl. No. 16/052,316, filed Aug. 1, 2018, 7 pages.
Non-Final Office Action dated Sep. 30, 2016, issued in connection with U.S. Appl. No. 13/864,249, filed Apr. 17, 2013, 12 pages.
Non-Final Office Action dated Dec. 31, 2013, issued in connection with U.S. Appl. No. 13/618,829, filed Sep. 14, 2012, 26 pages.
Non-Final Office Action dated Aug. 5, 2020, issued in connection with U.S. Appl. No. 16/544,905, filed Aug. 20, 2019, 15 pages.
Non-Final Office Action dated Jan. 7, 2019, issued in connection with U.S. Appl. No. 13/864,249, filed Apr. 17, 2013, 15 pages.
Non-Final Office Action dated Jan. 7, 2019, issued in connection with U.S. Appl. No. 14/184,522, filed Feb. 19, 2014, 14 pages.
Non-Final Office Action dated Feb. 8, 2022, issued in connection with U.S. Appl. No. 17/234,442, filed Apr. 19, 2021, 13 pages.
Non-Final Office Action dated Jan. 9, 2018, issued in connection with U.S. Appl. No. 13/864,250, filed Apr. 17, 2013, 13 pages.
North American MPEG-2 Information, “The MPEG-2 Transport Stream,” Retrieved from the Internet: URL: http://www.coolstf.com/mpeg/#ts, 2006, pp. 1-5.
Notice of Allowance dated Jan. 31, 2013, issued in connection with U.S. Appl. No. 13/298,090, filed Nov. 16, 2011, 19 pages.
Notice of Allowance dated Dec. 1, 2016, issued in connection with U.S. Appl. No. 15/088,283, filed Apr. 1, 2016, 9 pages.
Notice of Allowance dated Jun. 1, 2017, issued in connection with U.S. Appl. No. 14/290,493, filed May 29, 2014, 12 pages.
Notice of Allowance dated Oct. 1, 2019, issued in connection with U.S. Appl. No. 16/544,902, filed Aug. 20, 2019, 23 pages.
Notice of Allowance dated Dec. 2, 2016, issued in connection with U.S. Appl. No. 15/088,532, filed Apr. 1, 2016, 9 pages.
Notice of Allowance dated Dec. 2, 2016, issued in connection with U.S. Appl. No. 15/088,678, filed Apr. 1, 2016, 9 pages.
Notice of Allowance dated Dec. 2, 2016, issued in connection with U.S. Appl. No. 15/089,758, filed Apr. 4, 2016, 9 pages.
Notice of Allowance dated Dec. 2, 2016, issued in connection with U.S. Appl. No. 15/155,149, filed May 16, 2016, 9 pages.
Notice of Allowance dated Dec. 2, 2019, issued in connection with U.S. Appl. No. 16/514,280, filed Jul. 17, 2019, 8 pages.
Notice of Allowance dated Jul. 2, 2015, issued in connection with U.S. Appl. No. 13/848,904, filed Mar. 22, 2013, 17 pages.
Notice of Allowance dated Jul. 2, 2015, issued in connection with U.S. Appl. No. 13/888,203, filed May 6, 2013, 19 pages.
Notice of Allowance dated Jul. 2, 2015, issued in connection with U.S. Appl. No. 14/184,935, filed Feb. 20, 2014, 23 pages.
Notice of Allowance dated Jun. 2, 2017, issued in connection with U.S. Appl. No. 14/486,667, filed Sep. 15, 2014, 10 pages.
Notice of Allowance dated Sep. 3, 2015, issued in connection with U.S. Appl. No. 13/705,174, filed Dec. 5, 2012, 4 pages.
Notice of Allowance dated Aug. 4, 2015, issued in connection with U.S. Appl. No. 14/516,867, filed Oct. 17, 2014, 13 pages.
Notice of Allowance dated Oct. 5, 2012, issued in connection with U.S. Appl. No. 13/204,511, filed Aug. 5, 2011, 11 pages.
Notice of Allowance dated Dec. 6, 2019, issued in connection with U.S. Appl. No. 16/459,565, filed Jul. 1, 2019, 8 pages.
Notice of Allowance dated Jul. 6, 2018, issued in connection with U.S. Appl. No. 14/058,166, filed Oct. 18, 2013, 19 pages.
Notice of Allowance dated Mar. 6, 2014, issued in connection with U.S. Appl. No. 13/827,653, filed Mar. 14, 2013, 17 pages.
Non-Final Office Action dated Oct. 6, 2016, issued in connection with U.S. Appl. No. 15/088,678, filed Apr. 1, 2016, 9 pages.
Non-Final Office Action dated Nov. 7, 2011, issued in connection with U.S. Appl. No. 11/147,116, filed Jun. 6, 2005, 48 pages.
Non-Final Office Action dated Oct. 7, 2016, issued in connection with U.S. Appl. No. 15/156,392, filed May 17, 2016, 8 pages.
Non-Final Office Action dated Mar. 8, 2016, issued in connection with U.S. Appl. No. 13/848,921, filed Mar. 22, 2013, 13 pages.
Non-Final Office Action dated Aug. 9, 2016, issued in connection with U.S. Appl. No. 13/871,795, filed Apr. 26, 2013, 31 pages.
Non-Final Office Action dated Apr. 10, 2017, issued in connection with U.S. Appl. No. 13/871,785, filed Apr. 26, 2013, 10 pages.
Non-Final Office Action dated Jan. 10, 2018, issued in connection with U.S. Appl. No. 13/848,932, filed Mar. 22, 2013, 18 pages.
Non-Final Office Action dated Mar. 10, 2011, issued in connection with U.S. Appl. No. 12/035,112, filed Feb. 21, 2008, 12 pages.
Non-Final Office Action dated May 10, 2016, issued in connection with U.S. Appl. No. 14/504,812, filed Oct. 2, 2014, 22 pages.
Non-Final Office Action dated Nov. 10, 2016, issued in connection with U.S. Appl. No. 15/243,355, filed Aug. 22, 2016, 11 pages.
Non-Final Office Action dated Aug. 11, 2020, issued in connection with U.S. Appl. No. 16/383,910, filed Apr. 15, 2019, 23 pages.
Non-Final Office Action dated Jul. 11, 2017, issued in connection with U.S. Appl. No. 13/848,921, filed Mar. 22, 2013, 10 pages.
Non-Final Office Action dated Jan. 12, 2017, issued in connection with U.S. Appl. No. 13/895,076, filed May 15, 2013, 10 pages.
Non-Final Office Action dated Jun. 12, 2015, issued in connection with U.S. Appl. No. 13/848,932, filed Mar. 22, 2013, 16 pages.
Non-Final Office Action dated Mar. 12, 2015, issued in connection with U.S. Appl. No. 13/705,174, filed Dec. 5, 2012, 13 pages.
Non-Final Office Action dated Jan. 13, 2016, issued in connection with U.S. Appl. No. 14/184,528, filed Feb. 19, 2014, 14 pages.
Non-Final Office Action dated Mar. 13, 2015, issued in connection with U.S. Appl. No. 13/705,177, filed Dec. 5, 2012, 15 pages.
Non-Final Office Action dated Nov. 13, 2017, issued in connection with U.S. Appl. No. 13/864,249, filed Apr. 17, 2013, 11 pages.
Non-Final Office Action dated Nov. 13, 2019, issued in connection with U.S. Appl. No. 15/946,660, filed Apr. 5, 2018, 6 pages.
Non-Final Office Action dated Dec. 14, 2017, issued in connection with U.S. Appl. No. 15/081,911, filed Mar. 27, 2016, 17 pages.
Non-Final Office Action dated Nov. 14, 2017, issued in connection with U.S. Appl. No. 13/864,252, filed Apr. 17, 2013, 11 pages.
Non-Final Office Action dated Aug. 15, 2017, issued in connection with U.S. Appl. No. 14/184,522, filed Feb. 19, 2014, 11 pages.
Non-Final Office Action dated Jul. 15, 2016, issued in connection with U.S. Appl. No. 14/803,953, filed Jul. 20, 2015, 20 pages.
Non-Final Office Action dated Nov. 15, 2017, issued in connection with U.S. Appl. No. 15/228,639, filed Aug. 4, 2016, 14 pages.
Non-Final Office Action dated Nov. 15, 2017, issued in connection with U.S. Appl. No. 15/243,186, filed Aug. 22, 2016, 13 pages.
Non-Final Office Action dated Nov. 16, 2016, issued in connection with U.S. Appl. No. 15/228,639, filed Aug. 4, 2016, 15 pages.
Non-Final Office Action dated Aug. 17, 2017, issued in connection with U.S. Appl. No. 14/184,528, filed Feb. 19, 2014, 12 pages.
Non-Final Office Action dated Nov. 17, 2014, issued in connection with U.S. Appl. No. 13/864,247, filed Apr. 17, 2013, 11 pages.
Non-Final Office Action dated Feb. 18, 2009, issued in connection with U.S. Appl. No. 10/861,653, filed Jun. 5, 2004, 18 pages.
Non-Final Office Action dated Jan. 18, 2013, issued in connection with U.S. Appl. No. 13/618,829, filed Sep. 14, 2012, 58 pages.
Non-Final Office Action dated Nov. 18, 2014, issued in connection with U.S. Appl. No. 13/435,739, filed Mar. 30, 2012, 10 pages.
Non-Final Office Action dated Jun. 19, 2015, issued in connection with U.S. Appl. No. 13/533,105, filed Jun. 26, 2012, 38 pages.
Non-Final Office Action dated Nov. 19, 2014, issued in connection with U.S. Appl. No. 13/848,921, filed Mar. 22, 2013, 9 pages.
Non-Final Office Action dated Apr. 2, 2018, issued in connection with U.S. Appl. No. 15/243,355, filed Aug. 22, 2016, 20 pages.
Non-Final Office Action dated Apr. 20, 2017, issued in connection with U.S. Appl. No. 90/013,882, filed Dec. 27, 2016, 197 pages.
Non-Final Office Action dated Aug. 20, 2009, issued in connection with U.S. Appl. No. 11/906,702, filed Oct. 2, 2007, 27 pages.
Non-Final Office Action dated Oct. 20, 2020, issued in connection with U.S. Appl. No. 16/459,661, filed Jul. 2, 2019, 9 pages.
Non-Final Office Action dated Sep. 21, 2016, issued in connection with U.S. Appl. No. 15/080,591, filed Mar. 25, 2016, 9 pages.
Non-Final Office Action dated Sep. 21, 2016, issued in connection with U.S. Appl. No. 15/080,716, filed Mar. 25, 2016, 8 pages.
Non-Final Office Action dated Sep. 21, 2016, issued in connection with U.S. Appl. No. 15/088,283, filed Apr. 1, 2016, 9 pages.
Non-Final Office Action dated Sep. 21, 2016, issued in connection with U.S. Appl. No. 15/088,532, filed Apr. 1, 2016, 9 pages.
Non-Final Office Action dated Jul. 22, 2019, issued in connection with U.S. Appl. No. 16/009,182, filed Jun. 14, 2018, 23 pages.
Non-Final Office Action dated Sep. 22, 2016, issued in connection with U.S. Appl. No. 15/088,906, filed Apr. 1, 2016, 9 pages.
Non-Final Office Action dated Sep. 22, 2016, issued in connection with U.S. Appl. No. 15/155,149, filed May 16, 2016, 7 pages.
Non-Final Office Action dated Jun. 23, 2015, issued in connection with U.S. Appl. No. 13/705,176, filed Dec. 5, 2012, 30 pages.
Non-Final Office Action dated Oct. 23, 2014, issued in connection with U.S. Appl. No. 13/848,904, filed Mar. 22, 2013, 11 pages.
Non-Final Office Action dated Oct. 23, 2014, issued in connection with U.S. Appl. No. 13/864,251, filed Apr. 17, 2013, 11 pages.
Non-Final Office Action dated Oct. 23, 2014, issued in connection with U.S. Appl. No. 13/888,203, filed May 6, 2013, 9 pages.
Non-Final Office Action dated Oct. 23, 2014, issued in connection with U.S. Appl. No. 13/932,864, filed Jul. 1, 2013, 20 pages.
Non-Final Office Action dated Oct. 23, 2018, issued in connection with U.S. Appl. No. 14/808,875, filed Jul. 24, 2015, 16 pages.
Ljungstrand et al. UBICOMP 2002, Adjunct Proceedings, Fourth International Conference on Ubiquitous Computing, 2002, 90 pages.
Logitech Slimserver. Server for Logitech Squeezebox Players. 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Logitech/slimserver. Github. 1 page [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Logitech/Slimserver. Github. Version 2.3 Release. May 19, 2002. 2 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
Louderback, Jim, “Affordable Audio Receiver Furnishes Homes With MP3,” TechTV Vault. Jun. 28, 2000 retrieved Jul. 10, 2014, 2 pages.
Maniactools, “Identify Duplicate Files by Sound,” Sep. 28, 2010, http://www.maniactools.com/soft/music-duplicate-remover/identify-duplicate-files-by-sound.shtml.
Marchetti, Nino. EdgeReview, CES 2003 Home Network Entertainment, Jan. 28, 2003, 2 pages.
McGlaun, Shane. Best Buy unveils new Rocketboost RF-RBKIT whole home audio solution and more. Oct. 22, 2009, 7 pages.
MediaLounge Entertainment Network D-Link DSM-320 Wireless Media Player Manual v 1.0, 59 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
MediaRenderer:1 Device Template Version 1.01 for UPnP, Version 1.0 (Jun. 25, 2002) (12 pages).
MediaServer:1 Device Template Version 1.01 for UPnP, Version 1.0 (Jun. 25, 2002) (12 pages).
Microsoft, Universal Plug and Play (UPnP) Client Support (“Microsoft UPnP”) (Aug. 2001) (D+M_0402007-24) (18 pages).
Microsoft Windows XP Reviewer's Guide (Aug. 2001) (D+M_0402225-85) (61 pages).
"Microsoft Windows XP File and Printer Sharing with Microsoft Windows," Microsoft Windows XP Technical Article, 2003, 65 pages.
Microsoft Windows XP Student Edition Complete. University of Salford. Custom Guide Learn on Demand, 2004, 369 pages.
Micro-Star International. 865PE Neo2. MS-6728 v1.X ATX Mainboard. Version 1.1. Apr. 2003, 118 pages.
Millard, Max. From a remote control house-monitoring system to a brand-new helicopter, there's something for everyone. San Francisco Examiner, Dec. 19, 2001, 1 page [produced by Google in Inv. No. 337-TA-1191 on Sep. 4, 2020].
Miller II, Stanley. Technology gets simpler and smarter. JSOnline Milwaukee Journal Sentinel, Jan. 13, 2003, 3 pages.
Mills, David L., "Network Time Protocol (Version 3) Specification, Implementation and Analysis," Network Working Group, Mar. 1992, 7 pages.
Mills, David L., “Precision Synchronization of Computer Network Clocks,” ACM SIGCOMM Computer Communication Review, 1994, pp. 28-43, vol. 24, No. 2.
“Model MRC44 Four Zone—Four Source Audio/Video Controller/Amplifier System,” Xantech Corporation, 2002, 52 pages.
Model MRC88 Eight Zone—Eight Source Audio/Video Controller/Amplifier System, Xantech Corporation, 2003, 102 pages.
Moses, B., Home Networking Using IEEE 1394 in Combination with Other Networking Technologies. Audio Delivery. The Changing Home Experience—AES 17 UK Conference 2002, 16 pages.
Motorola, “Simplefi, Wireless Digital Audio Receiver, Installation and User Guide,” Dec. 31, 2001, 111 pages.
“SMPTE Made Simple: A Time Code Tutor by Timeline,” 1996, 46 pages.
Muherim et al. On the Performance of Clock Synchronization Algorithms for a Distributed Commodity Audio System. Audio Engineering Society Convention Paper presented at the 114th Convention, Mar. 22-25, 2003, 12 pages.
Multi-Zone Control Systems. ZR-8630AV MultiZone Receiver. Niles, http://www.ampersandcom.com/zr8630av.html accessed Mar. 26, 2020, 5 pages.
Murph, Darren. Rocketfish Wireless Whole Home Audio System Cuts the Cord on All Your Speakers. Engadget. Oct. 23, 2009, 9 pages.
Musica 5000 Series. Multi-Room Audio System, NetStreams, 2005, 7 pages.
Musica MU4602. Audio Distribution System. Data Sheet, 2004, 2 pages.
Musica MUR2E Network Interface. NetStreams Creating the future of home entertainment—today, 2004, 2 pages.
Musica MUR2EM Network Interface. NetStreams the IP Based Distributed Entertainment Company, 2005, 2 pages.
MusicCAST. Interactive Wireless. Home Music Network System, 6 pages [produced by Google in Inv. No. 337-TA-1191 on May 6, 2020].
MusicCAST System—About the Quick Manual, 1999, 7 pages.
NETGEAR. User's Manual for the MP101 Digital Music Player, Version 1.2, May 2004, 48 pages.
NetStreams. Musica MU4602 Audio Distribution System. Data Sheet. Copyright 2004, 2 pages.
NetStreams Musica MU5066. Multi-Room Audio System. Installation and User's Guide, 2005, 44 pages.
NetStreams Musica. NS-MU4602 Audio Distribution System, Integration & Design Guide. The IP-Based Audio Distribution Company, 2004, 22 pages.
NetStreams. Panorama PAN6400 Multi-Room Video & Control System Installation Guide, Jan. 1, 2006, 64 pages.
NetStreams Product Catalog 2003-2004. Creating the Future of Home Entertainment Today, 20 pages.
Network Time Protocol (NTP), RFC 1305 (Mar. 1992) (D+M_0397417-536) (120 pages).
Network Working Group. Zeroconf Multicast Address Allocation Protocol, Internet-Draft, Oct. 22, 2002, 14 pages.
NewsRoom. Sirius, XM Companies Flood Cedia with New Products, Sep. 15, 2003, 2 pages.
NewsRoom. SMC Ships New EZ-Stream Universal 802.11a/g Wireless Router, Jan. 14, 2004, 3 pages.
NewsRoom. AP Datastream, Wall Street Journal Digest, Jan. 15, 2004, 3 pages.
NewsRoom. AP Online, AP Technology NewsBrief. Dec. 26, 2003, 2 pages.
NewsRoom. AP Online, AP Technology NewsBrief. Dec. 27, 2003, 2 pages.
NewsRoom. Belleville News Democrat, Tunes, Pictures From Computer Can be Sent to Your TV, Stereo, Dec. 27, 2003, 2 pages.
NewsRoom. BridgeCo Successfully Concludes Second Financing Round of US $13.3 Million, Business Wire, Jan. 9, 2003, 3 pages.
NewsRoom. Business Line, Cisco arm rolls out products for SOHO. Nov. 5, 2003, 2 pages.
Japanese Patent Office, Decision of Refusal and Translation dated Jul. 5, 2022, issued in connection with Japanese Patent Application No. 2021-124360, 10 pages.
Sonos, Inc.v. Google LLC. Principal Brief of Cross-Appellant Google LLC. United States Court of Appeals for the Federal Circuit. Appeals from the United States International Trade Commission in Investigation No. 337-TA-1191, Jul. 8, 2022, 96 pages.
Related Publications (1)

  Number           Date       Country
  20220188064 A1   Jun. 2022  US

Provisional Applications (1)

  Number     Date       Country
  60490768   Jul. 2003  US

Continuations (4)

  Parent     Date       Country   Child      Country
  17306016   May 2021   US        17532724   US
  13864249   Apr. 2013  US        17306016   US
  13297000   Nov. 2011  US        13864249   US
  10816217   Apr. 2004  US        13297000   US