Calibrating media playback channels for synchronized presentation

Information

  • Patent Grant
    10614857
  • Patent Number
    10,614,857
  • Date Filed
    Monday, July 2, 2018
  • Date Issued
    Tuesday, April 7, 2020
Abstract
In some implementations, a computing device can calibrate media playback channels for presenting media content through a media system by determining the media propagation latency through the media system. For example, the computing device can send calibration content (e.g., audio data, video data, etc.) to various playback devices (e.g., playback channels) of the media system and record a timestamp indicating when the calibration content was sent. When the playback devices present the calibration content, a sensor device (e.g., remote control device, smartphone, etc.) can detect the presentation of the calibration content. The sensor device can send calibration data (e.g., media samples that may include the calibration content and/or a timestamp indicating when the media sample was detected by the sensor device) to the computing device. The computing device can determine the propagation latency (e.g., presentation delay) based on the calibration data received from the sensor device.
Description
TECHNICAL FIELD

The disclosure generally relates to synchronizing playback of audio/video data through multiple channels.


BACKGROUND

Various types of wired and/or wireless media systems are available in the market today. Many of these systems present audio and/or video data through multiple channels (e.g., devices, speakers, displays, earphones, etc.). For example, to play music throughout the whole house, a user may place different speakers in each room of the house. To simulate theater surround sound when watching movies, the user may place different speakers at different locations in a room with a television and/or other media device (e.g., streaming device, set top box, etc.). To avoid a discordant playback experience, the playback of audio and/or video at these various playback devices (e.g., speakers, television, etc.) must be synchronized so that each playback device is presenting the same media content at the same time.


SUMMARY

In some implementations, a computing device can calibrate media playback channels for presenting media content through a media system by determining the media propagation latency through the media system. For example, the computing device can send calibration content (e.g., audio data, video data, etc.) to various playback devices (e.g., playback channels) of the media system and record a timestamp indicating when the calibration content was sent. When the playback devices present the calibration content, a sensor device (e.g., remote control device, smartphone, etc.) can detect the presentation of the calibration content. The sensor device can send calibration data (e.g., media samples that may include the calibration content and/or a timestamp indicating when the media sample was detected by the sensor device) to the computing device. The computing device can determine the propagation latency (e.g., presentation delay) based on the calibration data received from the sensor device.


Particular implementations provide at least the following advantages. A media system can be calibrated for synchronous playback at multiple playback devices using different types of sensor devices (e.g., a dedicated remote-control device, smartphone, tablet computer, etc.). The media system can be calibrated for synchronous playback through third-party playback devices (e.g., Bluetooth speakers, Bluetooth headsets, etc.). The media system can be calibrated with or without explicit user input initiating the calibration process. For example, the calibration process can be performed in the background while the user performs other tasks on or with the sensor device. Thus, the calibration process can be performed automatically, dynamically, and/or frequently without burdening the user with providing explicit input to perform the calibration process.


Details of one or more implementations are set forth in the accompanying drawings and the description below. Other features, aspects, and potential advantages will be apparent from the description and drawings, and from the claims.





DESCRIPTION OF DRAWINGS


FIG. 1 is a block diagram of an example media system for calibrating media playback channels for synchronized playback based on a Bluetooth clock at a sensor device.



FIG. 2 is a block diagram of an example media system for calibrating media playback channels for synchronized playback based on a system clock at a sensor device.



FIG. 3 is a block diagram of an example media system for calibrating media playback channels for synchronized playback based on visual calibration content detected at a sensor device.



FIG. 4 is a block diagram of an example media system for calibrating media playback channels for synchronized playback based on a time when calibration data is received at the sending media device.



FIG. 5 illustrates example calibration content for determining propagation latency on a communication channel of a media system.



FIG. 6 is a flow diagram of an example process for calibrating media playback channels for synchronized presentation based on a detection time determined by a sensor device.



FIG. 7 is a flow diagram of an example process for calibrating media playback channels for synchronized presentation based on a received time determined by a media device and a time in flight for transmitting data between the sensor device and the media device.



FIG. 8 is a block diagram of an example computing device that can implement the features and processes of FIGS. 1-7.





Like reference symbols in the various drawings indicate like elements.


DETAILED DESCRIPTION

The technology described herein provides several mechanisms for calibrating media playback channels for synchronized playback. For example, a media device (e.g., a computing device, laptop computer, set-top-box, streaming media player, etc.) within a media system can determine and/or calculate the propagation latency of media content through various playback channels of the media system so that the media device can adjust the timing of transmission of media content through the playback channels in order to provide synchronized presentation of media content at playback devices (e.g., speakers, displays, televisions, earphones, headsets, etc.).


A playback channel can correspond to a communication path that media content travels from a sending (e.g., originating) media device to and through a playback device (e.g., including the playback device itself) that presents (e.g., audibly and/or visually) the media content to the user or users of the media system. The playback channel can be a wired playback channel (e.g., HDMI, RCA cables, coaxial cables, Ethernet, speaker wires, etc.) or a wireless playback channel (e.g., Bluetooth, Wi-Fi, etc.) to a corresponding playback device (e.g., television, speaker, monitor, display, etc.).


The propagation latency (e.g., presentation delay) can correspond to the amount of time it takes for media content sent by the media device to be perceptibly presented (e.g., audibly or visually) for the user's consumption at a playback device (e.g., presentation time−transmission time=propagation latency). The media device can determine the propagation latency on each of a variety of playback channels (e.g., wired, wireless, Wi-Fi, Bluetooth, etc.) corresponding to a variety of playback devices (e.g., televisions, speakers, headphones, set-top-boxes, computing devices, etc.). The media device can then adjust the timing of when media content is sent to each channel based on the determined propagation latency for each channel so that the media content is presented synchronously (e.g., the same portion of media content is played at the same time) by the playback devices associated with each playback channel.
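For illustration, the latency calculation described above can be expressed as a short sketch; the numeric values and the function name are hypothetical and not taken from any particular implementation:

```python
def propagation_latency(presentation_time, transmission_time):
    """Propagation latency (presentation delay) = presentation time - transmission time."""
    return presentation_time - transmission_time

# Hypothetical example: calibration content sent at t = 10.0 s and first
# presented by the playback device at t = 12.0 s yields a 2.0 s latency
# on that playback channel.
latency = propagation_latency(presentation_time=12.0, transmission_time=10.0)
```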



FIG. 1 is a block diagram of an example media system 100 for calibrating media playback channels for synchronized playback based on a Bluetooth clock at a sensor device. For example, the Bluetooth clock at the sensor device can be used by media system 100 to determine a time at which a playback device presented calibration content so that the propagation latency through media system 100 can be determined.


In some implementations, media system 100 can include media device 110. For example, media device 110 can be a computing device (e.g., laptop computer, set-top-box, streaming media player, smartphone, etc.) capable of streaming media to other playback devices. Media device 110 can stream media to other playback devices using a variety of wired (e.g., HDMI, RCA cables, coaxial cables, Ethernet, speaker wires, etc.) or wireless communication (e.g., Bluetooth, Wi-Fi, etc.) channels.


In some implementations, media device 110 can include media module 112. For example, media module 112 can be a software module configured to perform various media management functions on media device 110. Media module 112 can, for example, send media content (e.g., audio content, video content, etc.) for the user's enjoyment to various playback devices through respective playback channels according to output configurations specified by the user of media device 110. Media module 112 can manage the transmission of media content to the playback devices such that the playback devices present the media content in a synchronous manner. For example, media module 112 can adjust the timing of when media content is sent to the playback devices according to a propagation latency determined for each playback device and/or corresponding playback channel.


In some implementations, media module 112 can perform a calibration process to determine the propagation latency of each playback channel and/or corresponding playback device. For example, media module 112 can have a playback mode and a calibration mode. Media module 112 can normally operate in playback mode when sending media to various playback devices in response to a user request to play music, videos, movies, or some other media content for the user's entertainment. In some implementations, media module 112 can enter calibration mode to determine the propagation latency for the various playback channels and calibrate the various playback channels of media system 100 based on the determined propagation latency to ensure synchronized playback of media content on the playback channels and/or playback devices.


Media module 112 can enter calibration mode in a variety of ways. For example, media module 112 can enter calibration mode in response to the user providing input to media device 110. For example, media device 110 can present a graphical user interface on television 130 and the user can provide input selecting a calibration menu item to cause media module 112 to enter calibration mode.


As another example, media module 112 can enter calibration mode automatically when the user of sensor device 140 and/or media device 110 enables (e.g., turns on) microphone 144. For example, the user of sensor device 140 can press a button on sensor device 140 (e.g., a dedicated remote-control device) to enable voice input for providing input to media device 110. When microphone 144 is enabled, media module 112 can enter calibration mode to determine the propagation latency of the playback channels in media system 100 and calibrate the playback channels to enable synchronized playback of media content based on the sounds detected while microphone 144 is enabled. Thus, media system 100 can be automatically and dynamically (e.g., frequently) calibrated without burdening the user with providing explicit input to calibrate media system 100.


In some implementations, when media module 112 enters calibration mode, media module 112 can cause sensor device 140 to enter calibration mode. For example, media module 112 can send a message through network 150 (e.g., peer-to-peer Bluetooth, peer-to-peer Wi-Fi, local area network, etc.) to cause sensor device 140 and/or remote module 142 to enter calibration mode. When in calibration mode, remote module 142 can sample sounds detected by microphone 144 (e.g., or video captured by a camera of sensor device 140) so that media module 112 can determine when calibration content 118 was presented by a playback device (e.g., speaker 132, speaker 160, television 130, etc.) and determine the propagation latency, as described further below.


To determine propagation latency and perform the calibration process, media module 112 can send calibration content 118 to a playback device (or devices) through a corresponding playback channel (or channels). For example, media module 112 can send calibration content 118 through playback channel 126 (e.g., HDMI channel) to television 130 and speaker 132. Speaker 132 can be, for example, a speaker attached or connected to television 130.


In some implementations, calibration content 118 can be media content that includes audio content and/or video content specifically created for calibrating media system 100. In general, calibration content can be configured to include an initial media segment, followed by a calibration media segment (e.g., an audio or video pattern useful for calibration), followed by an ending media segment. The initial media segment and ending media segment can be configured to be pleasant sounding or visually appealing to the user such that they mask or make more tolerable the calibration media segment, which may be less appealing to the user. For example, the initial media segment and the ending media segment may be of a longer duration than the calibration media segment and, therefore, less noticeable to the user. By configuring the calibration segment in between the initial and ending media segments, media module 112 can determine an offset at which the calibration media segment is presented within the calibration content. This offset may allow for greater precision when determining the propagation latency on a playback channel, as described further below with reference to FIG. 5. In the example of FIG. 1, calibration content 118 can include audio data to be presented by television 130 and/or speaker 132. However, in other implementations (e.g., FIG. 3), calibration content 118 can include video content.
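As a rough illustration of this structure, calibration content could be modeled as three segments whose offsets are known to media module 112; the field names and duration types below are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class CalibrationContent:
    initial_segment: bytes       # pleasant-sounding or visually appealing lead-in
    calibration_segment: bytes   # audio/video pattern used for matching
    ending_segment: bytes        # pleasant-sounding or visually appealing tail
    initial_duration: float      # seconds
    calibration_duration: float  # seconds
    ending_duration: float       # seconds

    @property
    def calibration_offset(self) -> float:
        """Offset from the start of the content to the start of the calibration segment."""
        return self.initial_duration
```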


As described above, the purpose of the calibration process is to determine the amount of time (e.g., propagation latency) it takes for calibration content 118 to be transmitted to and presented by speaker 132 after media module 112 sends the calibration content to television 130 and/or speaker 132 so that the output of media content from media module 112 can be calibrated (e.g., timed) for synchronized playback. For example, a significant source of latency in a playback channel that includes a display device (e.g., television 130) is the video processing performed by the display device. Thus, the sending of media content through playback channels that do not include display devices, or other devices that perform video processing, may need to be delayed to accommodate the delay associated with the video processing performed at the display device in order to provide a synchronized playback experience for the user across all playback channels.


In some implementations, when sending calibration content 118, media module 112 can determine a time (e.g., transmission time) when calibration content 118 is sent based on system clock 120. For example, system clock 120 can be an internal clock used by media device 110 to perform various computing operations. In some implementations, system clock 120 can be synchronized with a network clock using well-known protocols. Media module 112 can record and/or store the system time at which calibration content 118 was sent to a playback device (e.g., television 130 and/or speaker 132) so that the transmission time can be compared to a presentation time when calibration content is presented by the playback device (e.g., as detected by sensor device 140) when calculating the propagation latency on a playback channel (e.g., playback channel 126).


When television 130 and/or speaker 132 receive calibration content 118, speaker 132 can present calibration content 118. For example, speaker 132 can present a sound or sounds corresponding to the audio data in calibration content 118 (e.g., a pleasant sound, followed by an audible test pattern, followed by a pleasant sound).


In some implementations, media system 100 can include sensor device 140. For example, sensor device 140 can be a computing device, such as a remote-control device, a smartphone, a tablet computer, or other device configured with sound and/or image sensors and capable of communicating with media device 110 through network 150 (e.g., a Bluetooth network, a Wi-Fi network, a peer-to-peer network, etc.). In the particular example of media system 100, sensor device 140 can correspond to a dedicated remote-control device for controlling and/or providing input to media device 110 using a Bluetooth connection.


In some implementations, sensor device 140 can include remote module 142. For example, remote module 142 can be a software module that provides the remote-control capabilities of sensor device 140 with respect to media device 110. Remote module 142 can obtain media samples (e.g., audio samples, video samples, etc.) generated by sensor device 140 and provide calibration data, including the media samples, to media module 112 for determining propagation latency through the various playback channels of media system 100.


In some implementations, sensor device 140 can include microphone 144 (e.g., sound sensor) for detecting sounds, such as voice commands for remotely controlling media device 110. Microphone 144 can also be used by remote module 142 to detect calibration content 118 presented by speaker 132, or any other audio playback device (e.g., speaker 160, headphones, etc.) when in calibration mode.


In some implementations, when remote module 142 is in calibration mode, remote module 142 can monitor the sounds detected by microphone 144 and periodically send calibration data to media device 110. For example, the calibration data can include media samples (e.g., sound samples, video samples, etc.) detected and/or generated by sensor device 140 using sound and/or image sensors of sensor device 140. The calibration data can include a timestamp indicating the time when the media sample in the calibration data was detected and/or generated by sensor device 140. While in calibration mode, remote module 142 can generate and send the calibration data on a periodic basis. For example, remote module 142 can periodically sample sensor data generated by sensors (e.g., sound sensor, image sensor, etc.) on sensor device 140 and generate calibration data for each sampling period. Remote module 142 can then send the calibration data for the sampling period, including a newly collected media sample for the current sampling period, to media device 110. For example, the sampling period can be 50 milliseconds, one second, etc., while in calibration mode. Each instance of calibration data may or may not include calibration content, and more importantly, may or may not include the calibration media segment. Thus, media module 112 on media device 110 can analyze each instance of calibration data as it is received to determine whether the calibration data includes the calibration media segment, as described further below.
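A simplified sketch of this sampling loop on the sensor device follows; the callables for reading the sensor, reading the clock, and sending messages are hypothetical placeholders, and the 50 millisecond period is only one of the possible values mentioned above:

```python
SAMPLING_PERIOD_S = 0.05  # e.g., 50 milliseconds per sampling period

def calibration_sampling_loop(in_calibration_mode, read_clock, read_sensor, send_to_media_device):
    """While in calibration mode, capture a media sample each period and send
    calibration data (sample plus timestamp) to the media device."""
    while in_calibration_mode():
        timestamp = read_clock()                       # when this sample begins
        media_sample = read_sensor(SAMPLING_PERIOD_S)  # audio/video captured this period
        calibration_data = {"timestamp": timestamp, "sample": media_sample}
        # The sample may or may not contain the calibration media segment;
        # the media device determines that when it analyzes the data.
        send_to_media_device(calibration_data)
```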


In some implementations, remote module 142 can use Bluetooth clock 146 to determine the timestamps for the calibration data. For example, sensor device 140 may not have a system clock. Remote module 142 can, therefore, obtain a current time (e.g., timestamp) using Bluetooth clock 146 when calibration data is generated. For example, the current time can be obtained from the Bluetooth clock 146 through an API (application programming interface) of Bluetooth controller 148. Remote module 142 can then store the timestamp in the calibration data that includes the media sample for the current sampling period. After generating calibration data for the current sampling period, remote module 142 can send message 149 to media device 110 that includes the calibration data generated by remote module 142 for the current sampling period.


When message 149 is received by media device 110, media module 112 can determine a system time corresponding to the Bluetooth time at which the sound sample included in message 149 was detected by sensor device 140. For example, as part of the Bluetooth communication protocol, Bluetooth clock 146 on sensor device 140 and Bluetooth clock 116 on media device 110 can be synchronized. However, Bluetooth clock 116 and system clock 120 on media device 110 may not be synchronized. Thus, when message 149 is received, media module 112 can obtain the current time of Bluetooth clock 116 (e.g., from Bluetooth controller 114) and system clock 120 to determine a mapping between Bluetooth time and system time on media device 110. For example, media module 112 can determine an amount of time (e.g., 20 milliseconds, 5 seconds, 30 seconds, etc.) that the system time of system clock 120 is ahead (or behind) the Bluetooth time of Bluetooth clock 116. Media module 112 can then add (or subtract) this amount of time to (or from) the Bluetooth timestamp included in the calibration data to determine the system time at which the calibration data was generated and/or when the calibration media sample was detected by sensor device 140.
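A sketch of this clock mapping, assuming the two Bluetooth clocks are already synchronized by the Bluetooth protocol (the variable names are illustrative only):

```python
def bluetooth_time_to_system_time(sensor_bt_timestamp, media_bt_now, media_system_now):
    """Convert a sensor-device Bluetooth timestamp to media-device system time.

    sensor_bt_timestamp : Bluetooth time recorded by the sensor device
    media_bt_now        : current Bluetooth clock reading on the media device
    media_system_now    : current system clock reading on the media device
    """
    # How far the media device's system clock is ahead of (or behind) its
    # Bluetooth clock; because the Bluetooth clocks are synchronized, the
    # same offset applies to the sensor device's Bluetooth timestamp.
    clock_offset = media_system_now - media_bt_now
    return sensor_bt_timestamp + clock_offset
```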


After determining the system time at which sensor device 140 generated the calibration data in message 149, media module 112 can determine a presentation time when the playback device began presenting calibration content 118. For example, media module 112 can determine the presentation time based on the time at which the playback device presented the calibration media segment as described below with reference to FIG. 5.


Media module 112 can then compare the system time (e.g., transmission time) at which media module 112 sent calibration content 118 to the playback device (e.g., television 130 and/or speaker 132) with the time (e.g., presentation time) at which the playback device presented calibration content 118 to determine the propagation latency (e.g., presentation delay) on playback channel 126. For example, media module 112 can subtract the transmission time from the presentation time to determine the propagation latency on playback channel 126.


In some implementations, media module 112 can determine the propagation latency through other (e.g., additional) playback channels in a similar manner as described above with reference to playback channel 126. For example, while media system 100 is in calibration mode, media module 112 can send calibration content 118 to speaker 160 (e.g., smart speaker, Bluetooth speaker, headphones, wireless earbuds, etc.) through playback channel 162. Calibration content 118 sent through playback channel 162 can include the same calibration media segment (e.g., audio pattern, video pattern, etc.) as the calibration content sent through playback channel 126. Alternatively, calibration content 118 sent through playback channel 162 can include a different calibration media segment (e.g., audio calibration pattern, video calibration pattern, etc.) than the calibration content sent through playback channel 126. Sensor device 140 can then detect calibration content 118 presented by speaker 160, generate calibration data, and send the calibration data to media device 110 so that media module 112 can determine the propagation latency through playback channel 162, as described above.


In some implementations, media module 112 can send calibration content 118 to playback channel 126 and playback channel 162 simultaneously. For example, media module 112 can determine all of the playback channels (e.g., playback channel 126, playback channel 162, etc.) or playback devices (e.g., television 130, speaker 132, speaker 160, etc.) through which, or to which, media device 110 is configured to send media content. Media module 112 can then send calibration content 118 through each channel and/or to each playback device so that the playback devices present calibration content 118 when received at the playback devices. As described above, calibration content 118 can include the same calibration media segment for each playback channel or calibration content 118 can include different calibration media segments for each playback channel. For example, by sending different calibration media segments to each playback channel, media module 112 can determine which calibration data (e.g., detected calibration media segment) corresponds to which playback channel by matching the calibration media segment in the calibration data to the calibration media segment sent to each playback channel by media module 112.
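When a distinct calibration media segment is sent on each channel, the matching step could look roughly like the following sketch; the pattern detector itself (for example, audio cross-correlation or video template matching) is an assumption and is left as a placeholder:

```python
def channel_for_sample(media_sample, segment_by_channel, contains_pattern):
    """Return the playback channel whose calibration media segment appears in the sample.

    segment_by_channel maps a channel identifier to the calibration media
    segment sent on that channel; contains_pattern(sample, segment) is a
    hypothetical detector returning True when the segment is found.
    """
    for channel_id, segment in segment_by_channel.items():
        if contains_pattern(media_sample, segment):
            return channel_id
    return None  # this sample did not capture any calibration media segment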


In any case, the playback devices can present calibration content 118 when calibration content 118 is received by the playback devices. Due to the differences in propagation latency on each channel (e.g., channel 126, channel 162, etc.), the calibration content 118 may be presented by each playback device at different times or at the same time. However, sensor device 140 can detect the calibration content 118 presented by each playback device (e.g., television 130, speaker 132, and/or speaker 160), generate calibration data, and send the calibration data to media device 110 so that media module 112 can calculate the propagation latency for each playback channel 126 and/or 162 based on the calibration data for each channel, as described herein.


In some implementations, media module 112 can use the propagation latency calculated for each playback channel to synchronize media content playback across playback channels. For example, video processing (e.g., performed by television 130) is usually the source of the greatest amount of propagation latency within media system 100. So, if media module 112 determines that playback channel 126 (e.g., television 130, speaker 132) has a propagation latency of two (2) seconds, and that playback channel 162 (e.g., speaker 160) has a propagation latency of one (1) second, then, when sending media content to playback devices for presentation, media module 112 can send the media content to television 130 and/or speaker 132 one second before media module 112 sends the media content to speaker 160. Stated differently, media module 112 can delay sending the media content to speaker 160 by one second after sending the media content to television 130 and/or speaker 132 so that television 130, speaker 132, and/or speaker 160 all present the media content at the same time. Thus, media module 112 can calibrate the transmission of media content on the various playback channels based on the propagation latency determined for each playback channel so that the playback devices associated with each playback channel present media content simultaneously and in synchronization with the other playback devices.
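For example, the per-channel transmission delays described above could be derived from the measured latencies as in this sketch; the channel names and latency values are hypothetical:

```python
def transmission_delays(latencies):
    """Delay every channel relative to the slowest one so that all channels
    present the same media content at the same time."""
    slowest = max(latencies.values())
    return {channel: slowest - latency for channel, latency in latencies.items()}

# With a 2-second TV/HDMI channel and a 1-second speaker channel, the speaker
# transmission is delayed by 1 second and the TV transmission is not delayed.
delays = transmission_delays({"channel_126_tv": 2.0, "channel_162_speaker": 1.0})
# delays == {"channel_126_tv": 0.0, "channel_162_speaker": 1.0}
```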



FIG. 2 is a block diagram of an example media system 200 for calibrating media playback channels for synchronized playback based on a system clock at a sensor device. For example, media system 200 can correspond to system 100, described above. In some implementations, a system clock at the sensor device can be used by media system 200 to determine a time at which a playback device presented calibration content so that the propagation latency through media system 200 can be determined. For example, sensor device 140 can be a computing device (e.g., smartphone, tablet computer, etc.) that includes system clock 202. The calibration of media system 200 can be performed similarly to the calibration of media system 100, as described above; however, the timestamp for the calibration data can be determined based on system clock 202 of sensor device 140 rather than Bluetooth clock 146. System clock 202 can be synchronized with system clock 120 of media device 110 using well-known network protocols. Thus, media module 112 can use the calibration data timestamp directly (e.g., without converting Bluetooth clock time to system time) when determining the propagation latency of each playback channel.


To calibrate the playback channels of system 200, media module 112 can send a notification to sensor device 140 to cause sensor device 140 (e.g., a smartphone, tablet computer, smartwatch, other computing device, etc.) to enter calibration mode when media module 112 of media device 110 enters calibration mode. As described above, media module 112 can enter calibration mode when the user selects a calibration menu item presented by media module 112 on television 130. For example, sensor device 140 can present the calibration notification on a display of sensor device 140. The user of sensor device 140 can provide input in response to the notification to cause remote module 142 to enter calibration mode.


Alternatively, the user of sensor device 140 can invoke remote module 142 (e.g., a software application) on sensor device 140 and provide input to remote module 142 to cause remote module 142 and/or media module 112 on media device 110 to enter calibration mode. When in calibration mode, remote module 142 can sample (e.g., record for a period of time) the sensor data generated by microphone 144 and generate calibration data on a periodic basis, as described above.


When remote module 142 generates calibration data, remote module 142 can determine the time at which the media sample was collected by requesting the current time from system clock 202. Remote module 142 can include the sample time in the calibration data and send the calibration data to media module 112 on media device 110 in message 204.


When message 204 is received by media module 112, media module 112 can determine the system time at which sensor device 140 generated the calibration data in message 204 based on the timestamp in the calibration data. After determining the system time at which sensor device 140 generated the calibration data in message 204, media module 112 can determine a presentation time when the playback device began presenting calibration content 118. For example, media module 112 can determine the presentation time based on the time at which the playback device presented the calibration media segment as described below with reference to FIG. 5.


To determine the propagation latency on the playback channels, media module 112 can calculate the difference between the system time when media module 112 sent calibration content 118 to the playback devices and the presentation time (e.g., system time) to determine the propagation latency on each playback channel. After the propagation latency is determined for each playback channel, media module 112 can calibrate the transmission of media content on each playback channel so that each playback device associated with the playback channels presents the media content synchronously, as described above.



FIG. 3 is a block diagram of an example media system 300 for calibrating media playback channels for synchronized playback based on visual calibration content detected at a sensor device. For example, media system 300 can correspond to system 200, described above. However, instead of determining the propagation latency of playback channel 304 based on audio calibration content, media system 300 can determine the propagation latency of playback channel 304 using video calibration content. Like media system 200, media system 300 can use the system clock at the sensor device to determine a time at which a sample of the video calibration content was detected by camera 302 so that the propagation latency through media system 300 (e.g., playback channel 304) can be determined. Since playback channel 162 to speaker 160 does not include a video playback device, the propagation latency through playback channel 162 can be (e.g., simultaneously or separately) determined using audio calibration content, as described above.


In some implementations, media module 112 can determine the type and/or capabilities of playback devices to which media module 112 is configured to send media content. For example, when establishing playback channel 304 to television 130, media module 112 can determine that television 130 is a type of playback device that is capable of presenting audio and video content. When establishing playback channel 162 to speaker 160, media module 112 can determine that speaker 160 is a type of playback device that is capable of presenting only audio content. Thus, when sending calibration content 118 to television 130 and/or speaker 160, media module 112 can select video and/or audio calibration content according to the capabilities of the playback devices associated with each playback channel. Alternatively, calibration content 118 can include both audio and video calibration content and the playback devices can present the audio and/or video content according to the capabilities of the playback devices.


In the example of FIG. 3, media module 112 can select to send video calibration content to television 130 and audio calibration content to speaker 160 and record the system time at which the calibration content was sent to each playback device. When television 130 receives the video calibration content 118, television 130 can present the video calibration content on a display of television 130.


In some implementations, sensor device 140 can be configured with camera 302 and/or microphone 144. Sensor device 140 can use microphone 144 (e.g., a sound sensor) to detect the presentation of audio calibration content 118 by speaker 160, as described above. Sensor device 140 can use camera 302 (e.g., an image sensor) to detect the presentation of video calibration content 118 by television 130. For example, when media module 112 enters calibration mode, media module 112 can send a notification to sensor device 140 (e.g., a smartphone, tablet computer, etc.). The notification can include information indicating that media device 110 has entered calibration mode. The notification can include information indicating the type of calibration content (e.g., video content, audio content, etc.) to be used for the calibration of media system 300. Sensor device 140 can present the calibration notification on the display of sensor device 140.


When the user of sensor device 140 selects or interacts with the calibration notification presented on sensor device 140 to cause sensor device 140 to enter calibration mode, remote module 142 can present instructions for performing the video calibration of media system 300. For example, when sensor device 140 enters calibration mode, remote module 142 can enable (e.g., turn on) microphone 144 and/or camera 302 and instruct the user to orient sensor device 140 so that the lens of camera 302 is directed at television 130. Thus, when television 130 presents the video calibration content (e.g., calibration content 118), camera 302 can detect the presentation of the video calibration content. For example, when in calibration mode, remote module 142 can sample (e.g., record for a period of time) the sensor data generated by camera 302 and generate calibration data on a periodic basis, as described above.


When remote module 142 generates calibration data, remote module 142 can determine the time at which the media sample was collected by requesting the current time from system clock 202. Remote module 142 can include the sample time in the calibration data and send the calibration data to media module 112 on media device 110 in message 306.


When message 306 is received by media module 112, media module 112 can determine the system time at which sensor device 140 generated the calibration data in message 306 based on the timestamp in the calibration data. After determining the system time at which sensor device 140 generated the calibration data (e.g., sample data) in message 306, media module 112 can determine a presentation time when the playback device began presenting calibration content 118. For example, media module 112 can determine the presentation time based on the time at which the playback device presented the calibration media segment as described below with reference to FIG. 5.


To determine the propagation latency on the playback channels, media module 112 can calculate the difference between the system time when media module 112 sent calibration content 118 to the playback devices and the presentation time (e.g., system time) to determine the propagation latency on each playback channel. After the propagation latency is determined for each playback channel, media module 112 can calibrate (e.g., adjust the timing of) the transmission of media content on each playback channel so that each playback device associated with the playback channels presents the media content synchronously, as described above.



FIG. 4 is a block diagram of an example media system 400 for calibrating media playback channels for synchronized playback based on a time when calibration data is received at the sending media device. For example, system 400 can correspond to system 100, described above. However, in system 400, remote module 142 may not have access to Bluetooth clock 146 (or a system clock) to determine when calibration content is presented by playback devices and/or detected by microphone 144. Thus, media system 400 can be configured to determine propagation latency based on the system clock of media device 110 and a time in flight for transmitting calibration content from sensor device 140 to media device 110. For example, media module 112 can calculate the time (e.g., sample time) at which sensor device 140 generated the media sample in the calibration data by subtracting the time in flight (e.g., the amount of time it takes to transmit data from sensor device 140 to media device 110 through communication channel 404) from the time (e.g., received time) at which message 402, including the detected calibration content 118, was received by media device 110.


After determining the system time (e.g., sample time) at which sensor device 140 generated the calibration data in message 402, media module 112 can determine a presentation time when the playback device began presenting calibration content 118. For example, media module 112 can determine the presentation time based on the time at which the playback device presented the calibration media segment as described below with reference to FIG. 5.


In some implementations, media module 112 can send calibration content 118 to playback devices through various playback channels. For example, media module 112 can send calibration content 118 to television 130 and/or speaker 132 through playback channel 126, as described above. Media module 112 can send calibration content 118 to speaker 160 through playback channel 162, as described above.


In some implementations, when remote module 142 is in calibration mode, remote module 142 can monitor the sounds detected by microphone 144 and periodically send calibration data to media device 110. For example, the calibration data can include media samples (e.g., sound samples, video samples, etc.) detected and/or generated by sensor device 140 using sound and/or image sensors of sensor device 140. However, in the example of FIG. 4, remote module 142 may not have access to any clock (e.g., Bluetooth clock 146) on sensor device 140 (e.g., sensor device 140 may just be a remote-control device with no system clock). Thus, remote module 142 can send calibration data, including media samples, without a corresponding timestamp indicating when the calibration data and/or media samples were generated.


While in calibration mode, remote module 142 can generate and send the calibration data on a periodic basis. For example, remote module 142 can periodically sample sensor data generated by sensors (e.g., sound sensor, image sensor, etc.) on sensor device 140 and generate calibration data for each sampling period. Remote module 142 can then send the calibration data for the sampling period, including newly collected media samples, to media device 110. For example, the sampling period can be 50 milliseconds, one second, etc., while in calibration mode. Each instance of calibration data may or may not include calibration content, and more importantly, may or may not include the calibration media segment. Thus, media module 112 can analyze each instance of calibration data as it is received to determine whether the calibration data includes the calibration media segment, as described further below. After generating calibration data for the current sampling period, remote module 142 can send message 402, including the calibration data generated by remote module 142, to media device 110.


When media device 110 receives message 402, media module 112 can calculate the difference (e.g., roundtrip time) between the system time at which media module 112 sent the calibration content to the playback device(s) and the time at which media module 112 received message 402. Media module 112 can then subtract a time in flight value from the roundtrip time to determine (e.g., estimate) when sensor device 140 generated the calibration data and/or media sample included in message 402.


In some implementations, the time in flight value can be determined based on the amount of time it takes for a message transmitted by sensor device 140 to be received by media device 110. For example, the time in flight value can be determined based on the Bluetooth clocks at sensor device 140 and media device 110. For example, although remote module 142 may not have access to Bluetooth clock 146 for determining a time at which calibration content 118 was detected by sensor device 140, Bluetooth controller 148 may include a Bluetooth clock time in message 402 indicating a time at which message 402 was transmitted by sensor device 140 as part of the Bluetooth communication protocol. When message 402 is received at media device 110, Bluetooth controller 114 can determine, based on Bluetooth clock 116, a Bluetooth time at which message 402 was received. Bluetooth controller 114 can determine the time in flight by calculating the difference between the Bluetooth time (e.g., transmission time) in message 402 and the Bluetooth time at which message 402 was received at media device 110. This calculated time in flight can be provided to media module 112. Media module 112 can then subtract the time in flight from the roundtrip time to determine when the calibration data and/or media sample in message 402 was generated.


In some implementations, the time in flight value can be a statistical value (e.g., minimum value) determined from the time in flight values calculated for many messages sent from sensor device 140 to media device 110. For example, over time, sensor device 140 can send many (e.g., hundreds, thousands, etc.) Bluetooth messages to media device 110. Media module 112 can store the time in flight values generated for each of the messages received from sensor device 140 over a period of time (e.g., all time, previous week, previous hour, etc.). In some implementations, media module 112 can determine a minimum time in flight value from among all of the messages and use the minimum time in flight value when calculating the propagation latency based on time in flight between sensor device 140 and media device 110, as described above. In some implementations, media module 112 can calculate other statistical time in flight values, such as median, average, etc., and use one of these other statistical time in flight values when calculating the propagation latency based on time in flight between sensor device 140 and media device 110, as described above.
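A sketch of this estimate, combining the received time with a statistical time-in-flight value, follows; the helper names are hypothetical and the choice of the minimum is only one of the statistics mentioned above:

```python
def representative_time_in_flight(observed_flight_times):
    """Pick a representative value (here, the minimum) from the per-message
    time-in-flight values observed for prior sensor-to-media-device messages."""
    return min(observed_flight_times)

def estimated_sample_time(received_time, time_in_flight):
    """Estimate when the sensor device generated the calibration data by
    subtracting the time in flight from the time the message was received."""
    return received_time - time_in_flight

# Hypothetical example: flight times of 12, 9, and 15 ms and a message
# received at t = 42.000 s give an estimated sample time of t = 41.991 s.
tif = representative_time_in_flight([0.012, 0.009, 0.015])
sample_time = estimated_sample_time(42.000, tif)
```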


After determining the system time at which sensor device 140 generated the calibration data in message 402, media module 112 can determine a presentation time when the playback device began presenting calibration content 118. For example, media module 112 can determine the presentation time based on the time at which remote module 142 generated the calibration data and/or media sample as described below with reference to FIG. 5.


Media module 112 can then compare the system time (e.g., transmission time) at which media module 112 sent calibration content 118 to the playback device (e.g., television 130) with the time (e.g., presentation time) at which the playback device presented calibration content 118 to determine the propagation latency (e.g., delay) on playback channel 126. For example, media module 112 can subtract the transmission time from the presentation time to determine the propagation latency on playback channel 126.


After calculating the propagation latency on each playback channel using the time in flight calculation described above, media module 112 can calibrate each playback channel based on the propagation latency calculated for each playback channel, as described above.



FIG. 5 illustrates example calibration content 500 for determining propagation latency on a communication channel of a media system. For example, calibration content 500 can correspond to calibration content 118 of media systems 100, 200, 300, and/or 400, described above. As described above, calibration content 500 can include video content and/or audio content.


In some implementations, calibration content 500 can include a beginning media segment 502, a calibration segment 504, and an ending media segment 506. For example, media segment 502 and media segment 506 can include some audibly or visually pleasing media. Calibration segment 504 can include an audio or video pattern that can be matched by media device 110 when performing the calibration processes described herein. For example, media module 112 can match calibration segment 504 to audio and/or video sample data to determine whether the sample data includes the calibration segment when determining when calibration segment 504 was presented by a playback device. When presented by a playback device (e.g., television 130, speaker 132, speaker 160, etc.), the playback device can present media segment 502 for a first duration of time (e.g., time 512−time 510), calibration segment 504 for a second duration of time (e.g., time 514−time 512), and media segment 506 for a third duration of time (e.g., time 516−time 514). For example, the duration of calibration segment 504 can be shorter than the durations of media segment 502 and/or 506. Thus, calibration content 500 can be presented for a total duration of time (e.g., time 516−time 510).


In some implementations, sensor device 140 can capture a sample of sensor data that includes a portion of calibration content 500. For example, sensor device 140 can capture sample 520. Sample 520 can be a sample of audio data or video data captured and/or generated by a sound sensor (e.g., microphone) or image sensor (e.g., camera) of sensor device 140. As described above, remote module 142 can sample the sensor data on a periodic basis while in calibration mode. Sample 520 is an example of the sample data generated by remote module 142. Sample 520 can be sent by remote module 142 in calibration data to media device 110. The time indicated in the calibration data can correspond to time 522 when sample 520 was captured or generated by remote module 142.


As illustrated by FIG. 5, sample 520 can be generated at time 522 and end at time 542. Thus, sample 520 may not include all of calibration content 500 (e.g., running from time 510 to time 516). Moreover, the beginning of sample 520 (e.g., time 522) may not coincide with the beginning of calibration content 500 (e.g., time 510). For example, the difference between the time (e.g., time 510) when the playback device started presenting calibration content 500 and the time when media module 112 sent calibration content 500 to the playback device is the propagation latency for the communication channel to the playback device, so media module 112 needs to determine time 510 to calculate the propagation latency.


In some implementations, media module 112 can use calibration segment 504 to determine when the playback device started presenting calibration content 500 even though the beginning of calibration content 500 at time 510 is not part of sample 520. For example, media module 112 can determine a calibration time offset 530 for calibration segment 504. For example, calibration time offset 530 can correspond to the difference between time 512 (e.g., the beginning of calibration segment 504) and time 510 (e.g., the beginning of calibration content 500). Media module 112 can use the calibration time offset 530 for calibration segment 504 to determine when a playback device began presenting calibration content 500. For example, if media module 112 can determine a time when calibration segment 504 was presented, then media module 112 can subtract calibration offset 530 from this time to determine when a playback device started presenting calibration content 500. This “start time” or presentation time can be used by media module 112 to calculate the propagation latency from media device 110 through the playback device that presented calibration content 500.


In some implementations, when media module 112 receives calibration data including sample 520 from sensor device 140, media module 112 can analyze sample 520 to determine a sample time offset 540 corresponding to when calibration segment 504 begins within sample 520 (e.g., the difference between time 522 and time 512). For example, calibration segment 504 may begin at a sample time offset 540 of one (1) second from the beginning of sample 520.


When calibration data is sent to media module 112, media module 112 can determine a time at which sample 520 was generated (e.g., the time at which the beginning of sample 520 was captured). For example, the calibration data time can be obtained from the calibration data itself (e.g., the system or Bluetooth clock time determined at sensor device 140) or can be derived from the time in flight calculations, as described above.


To determine the time when calibration content 500 was first presented (e.g., time 510), media module 112 can add the sample time offset 540 to the calibration data time (e.g., time 522) and subtract the calibration time offset 530. The result of these calculations can correspond to the presentation time for the calibration content. For example, the presentation time corresponds to the time when a playback device began presenting calibration content 500.
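The presentation-time arithmetic described above can be summarized in a short sketch; the numeric values are hypothetical and the names simply mirror the offsets in FIG. 5:

```python
def presentation_time(sample_time, sample_time_offset, calibration_time_offset):
    """Time at which the playback device began presenting the calibration content.

    sample_time             : when the sensor device began capturing the sample (time 522)
    sample_time_offset      : where the calibration segment begins within the sample (offset 540)
    calibration_time_offset : where the calibration segment sits within the content (offset 530)
    """
    return sample_time + sample_time_offset - calibration_time_offset

# Hypothetical example: a sample captured at t = 20.0 s whose calibration
# segment begins 1.0 s into the sample, with the calibration segment placed
# 3.0 s into the calibration content, gives a presentation time of t = 18.0 s.
start_time = presentation_time(sample_time=20.0, sample_time_offset=1.0, calibration_time_offset=3.0)
```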


Example Processes

To enable the reader to obtain a clear understanding of the technological concepts described herein, the following processes describe specific steps performed in a specific order. However, one or more of the steps of a particular process may be rearranged and/or omitted while remaining within the contemplated scope of the technology disclosed herein. Moreover, different processes, and/or steps thereof, may be combined, recombined, rearranged, omitted, and/or executed in parallel to create different process flows that are also within the contemplated scope of the technology disclosed herein. Additionally, while the processes below may omit or briefly summarize some of the details of the technologies disclosed herein for clarity, the details described in the paragraphs above may be combined with the process steps described below to get a more complete and comprehensive understanding of these processes and the technologies disclosed herein.



FIG. 6 is a flow diagram of an example process 600 for calibrating media playback channels for synchronized presentation based on a detection time determined by a sensor device. For example, process 600 can be performed by media systems 100, 200, and/or 300, as described above. Process 600 can be performed to determine the propagation latency for each playback channel (e.g., including playback devices) in media systems 100, 200, and/or 300. The propagation latency determined for each playback channel can then be used by media device 110 to adjust the transmission times of media content through each playback channel so that the media content is presented in a synchronous manner across all playback devices.


At step 602, media device 110 can cause media device 110 and sensor device 140 to enter calibration mode. For example, media device 110 can receive explicit user input indicating that the user wishes to calibrate media system 100, 200, and/or 300. The user input can be received by media device 110 through a remote control (e.g., sensor device 140) associated with media device 110.


In some implementations, media device 110 can detect when the user activates a sensor (e.g., microphone, camera, etc.) on sensor device 140 and take the opportunity (e.g., without explicit user input) to calibrate media system 100, 200, and/or 300. For example, the user may enable the microphone on sensor device 140 to provide voice input to media device 110. Media device 110 can receive a message from sensor device 140 indicating that the microphone is active or turned on and cause media device 110 and sensor device 140 to enter calibration mode. Thus, media device 110 can cause media device 110 and sensor device 140 to enter calibration mode opportunistically when the user enables a calibration sensor (e.g., microphone, camera, etc.) on sensor device 140.


In some implementations, media device 110 can periodically calibrate media system 100, 200, and/or 300. For example, media device 110 can calibrate media system 100, 200, and/or 300 on a recurring, periodic basis (e.g., daily, weekly, etc.). If media device 110 has not recently calibrated media system 100, 200, and/or 300, media device 110 can automatically enter calibration mode at the end of the configured period and send a notification to sensor device 140 to cause sensor device 140 to enter calibration mode. For example, a user of sensor device 140 can interact with the notification to allow sensor device 140 to enter calibration mode and activate the calibration sensors (e.g., microphone, camera, etc.) on sensor device 140, as described above.


At step 604, media device 110 can send calibration content to a playback device through a playback channel and record the transmission time. For example, media device 110 can determine the sensor capabilities (e.g., a sound sensor such as a microphone, an image sensor such as a camera, etc.) of sensor device 140. Media device 110 can determine the media presentation capabilities (e.g., audio only, audio and video, etc.) of the playback devices in media system 100, 200, and/or 300. Media device 110 can select calibration content to send to each playback device on each playback channel based on the determined capabilities of sensor device 140 and the playback devices. For example, when sensor device 140 can only detect sound (e.g., is configured with only a microphone), media device 110 can send audio calibration content to the various playback devices in media system 100, 200, and/or 300. When sensor device 140 can detect sound and images, media device 110 can select audio or video calibration content according to the output capabilities of the playback devices. For example, video calibration content can be sent to playback devices having displays. Audio calibration content can be sent to playback devices having speakers. When sending calibration content to a playback device over a playback channel, media device 110 can record the local system time (e.g., using the system clock of media device 110) at which the calibration content was sent over the playback channel. When the playback device receives the calibration content, the playback device can present the calibration content (e.g., using a display or speakers of the playback device).
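One way to express this selection logic, with hypothetical capability flags standing in for the device queries described above:

```python
def select_calibration_content_type(sensor_has_camera, sensor_has_microphone, playback_device_has_display):
    """Choose video or audio calibration content based on the sensor device's
    sensors and the playback device's output capabilities."""
    if sensor_has_camera and playback_device_has_display:
        return "video"  # visual pattern, detected by the sensor device's camera
    if sensor_has_microphone:
        return "audio"  # audible pattern, detected by the sensor device's microphone
    return None         # no compatible sensor/output pairing for calibration
```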


At step 606, sensor device 140 can detect calibration content presented by a playback device and record the detection time. For example, when in calibration mode, the calibration sensor(s) of sensor device 140 may remain enabled (e.g., active, turned on) such that sounds and/or images corresponding to the calibration content presented by playback devices can be detected by sensor device 140. Remote module 142 can sample (e.g., periodically record) the sounds and/or images detected by sensor device 140 when in calibration mode. Remote module 142 can record the time at which each sample is recorded and/or calibration data is generated. For example, sensor device 140 can record the current Bluetooth clock time or the current system time on sensor device 140 when sampling the sensor data and/or generating the calibration data. Sensor device 140 may record the current Bluetooth time when a system clock is unavailable on sensor device 140.


At step 608, sensor device 140 can send the calibration data to media device 110. For example, remote module 142 can send the calibration data for the current sampling period to media device 140.


In some implementations, steps 606 and 608 can be performed repeatedly while in calibration mode. For example, remote module 142 can sample the data generated by the calibration sensors (e.g., microphone, camera, etc.) on sensor device 140 on a periodic basis (e.g., every 50 milliseconds, every one second, etc.), store the sensor data (e.g., detected calibration data), record the current time for the current sample, and send the current sample and the current time to media device 110 for analysis. Remote module 142 may iterate through many sampling periods while in calibration mode. Thus, remote module 142 may send many instances of calibration data to media module 112 on media device 110.
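The repeated sampling and reporting described above might look like the loop below. It reuses the record_sample() sketch shown after step 606; the in_calibration_mode flag, the send_to_media_device() helper, and the 50 millisecond default period are illustrative assumptions.

```python
import time

# Illustrative sketch only; names are hypothetical.
def calibration_sampling_loop(sensor_device, media_device, period_s=0.05):
    # Sample the calibration sensors on a periodic basis while in calibration
    # mode and forward each timestamped sample to the media device for analysis.
    while sensor_device.in_calibration_mode:
        calibration_data = record_sample(sensor_device)
        sensor_device.send_to_media_device(media_device, calibration_data)
        time.sleep(period_s)
```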


At step 610, media device 110 can calculate the propagation latency based on the calibration data and the transmission time of the calibration content. For example, media module 112 can analyze each instance of calibration data to determine which instance (or instances, when multiple playback channels are calibrated) of calibration data includes a calibration segment (e.g., the audio and/or video calibration pattern). When media module 112 identifies an instance of calibration data that includes the calibration segment, media module 112 can determine the time at which the calibration segment of the calibration data (e.g., sampled sensor data) was presented by a playback device and/or received by sensor device 140. For example, the time at which the calibration segment was presented can be determined by adding the sample offset to the time (e.g., sample time) indicated in the calibration data. Media module 112 can then determine the time (e.g., presentation time) at which the calibration content was presented by the playback device based on the calibration offset of the calibration segment (e.g., by subtracting the calibration offset from the time at which the calibration segment was presented), as described above. Media module 112 can then calculate the propagation latency based on the difference between the transmission time of the calibration content and the presentation time of the calibration segment. For example, media module 112 can subtract the transmission time recorded when calibration content 118 was sent to the playback device through the playback channel from the presentation time determined based on the calibration data.
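Using the terminology above, the arithmetic of step 610 can be expressed as a small function: the segment time is the sample time plus the sample offset, the presentation time backs out the calibration offset, and the propagation latency is the difference from the transmission time. This is a sketch of that relationship only; the function name and signature are illustrative.

```python
# Illustrative sketch of the step 610 arithmetic; not the claimed implementation.
def propagation_latency(sample_time, sample_offset, calibration_offset, transmission_time):
    # Time at which the calibration segment appears within the recorded sample.
    segment_time = sample_time + sample_offset
    # Time at which the calibration content began playing on the playback
    # device, obtained by backing out the segment's offset within the content.
    presentation_time = segment_time - calibration_offset
    # Propagation latency through the playback channel.
    return presentation_time - transmission_time
```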


At step 612, media device 110 can adjust the transmission delay for the playback channel when sending media content for playback based on the playback latency determined for the playback channel. For example, media module 112 can compare the presentation latency for the playback channel to the presentation latencies calculated for other playback channels and adjust the playback delays (e.g., an amount of time for delaying sending the media content) for each playback channel to accommodate the playback channel that has the longest presentation latency. For example, if playback channel 126 has a presentation latency of 5 seconds and playback channel 162 has a playback latency of 2 seconds, then media module 112 can delay sending media content on playback channel 162 for 3 seconds after sending the same media content on playback channel 126 so that the media content will be presented simultaneously by the playback devices associated with each playback channel.
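A sketch of the delay adjustment in step 612: each channel's transmission delay is the difference between the longest measured latency and that channel's own latency, so with the 5-second and 2-second latencies in the example above, playback channel 162 would be delayed by 3 seconds. The function name and the dictionary representation are illustrative assumptions.

```python
# Illustrative sketch only; representation of channels is hypothetical.
def compute_transmission_delays(channel_latencies):
    # channel_latencies: mapping of playback channel id -> measured latency (seconds)
    slowest = max(channel_latencies.values())
    # Delay faster channels so every channel presents the same content at the same time.
    return {channel: slowest - latency for channel, latency in channel_latencies.items()}

# Example from the text: {126: 5.0, 162: 2.0} -> {126: 0.0, 162: 3.0}
```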



FIG. 7 is a flow diagram of an example process 700 for calibrating media playback channels for synchronized presentation based on a received time determined by a media device and a time in flight for transmitting data between the sensor device and the media device. For example, process 700 can be performed by media system 400, as described above. Process 700 can be performed to determine the propagation latency for each playback channel (e.g., including playback devices) in media system 400. The propagation latency determined for each playback channel can then be used by media device 110 to adjust the transmission times of media content through each playback channel so that the media content is presented in a synchronous manner across all playback devices.


At step 702, media device 110 can cause media device 110 and sensor device 140 to enter calibration mode. For example, media module 112 can receive explicit user input indicating that the user wishes to calibrate media system 400. The user input can be received by media module 112 through a remote control (e.g., sensor device 140) associated with media device 110.


In some implementations, media module 112 can detect when the user activates a sensor (e.g., microphone, camera, etc.) on sensor device 140 and take the opportunity (e.g., without explicit user input) to calibrate media system 400. For example, the user may enable the microphone on sensor device 140 to provide voice input to media module 112. Media module 112 can receive a message from sensor device 140 indicating that the microphone is active or turned on and cause media module 112 and remote module 142 to enter calibration mode. Thus, media module 112 can cause media module 112 and remote module 142 to enter calibration mode opportunistically when the user enables a calibration sensor (e.g., microphone, camera, etc.) on sensor device 140.


In some implementations, media module 112 can periodically calibrate media system 400. For example, media module 112 can calibrate media system 400 on a recurring, periodic basis (e.g., daily, weekly, etc.). If media module 112 has not recently calibrated media system 400, media module 112 can automatically enter calibration mode at the end of the configured period and send a notification to sensor device 140 to cause sensor device 140 to enter calibration mode. For example, a user of sensor device 140 can interact with the notification to allow remote module 142 to enter calibration mode and activate the calibration sensors (e.g., microphone, camera, etc.) on sensor device 140, as described above.


At step 704, media device 110 can send calibration content to a playback device through a playback channel and record the transmission time. For example, media module 112 can determine the sensor capabilities (e.g., sound sensor—microphone, image sensor—camera, etc.) of sensor device 140. Media module 112 can determine the media presentation capabilities (e.g., audio only, audio and video, etc.) of the playback devices in media system 400. Media module 112 can select calibration content to send to each playback device on each playback channel based on the determined capabilities of sensor device 140 and the playback devices. For example, when sensor device 140 can only detect sound (e.g., is configured with only a microphone), then media module 112 can send audio calibration data to the various playback devices in media system 400.


When sensor device 140 can detect sound and images, media module 112 can select audio or video calibration data according to the output capabilities of the playback devices. For example, video calibration data can be sent to playback devices having displays. Audio calibration data can be sent to playback devices having speakers. When sending calibration data to a playback device over a playback channel, media module 112 can record the local system time (e.g., using the system clock of media device 110) at which the calibration data was sent over the playback channel. When the playback device receives the calibration data 118, the playback device can present the calibration data 118 using a display or speakers of the playback device.


At step 706, sensor device 140 can detect calibration content presented by a playback device and record the detection time. For example, when in calibration mode, the calibration sensor(s) of sensor device 140 may remain enabled (e.g., active, turned on) such that sounds and/or images corresponding to the calibration content presented by playback devices can be detected by sensor device 140. Remote module 142 can sample (e.g., periodically record) the sounds and/or images detected by sensor device 140 when in calibration mode. In the example of media system 400, remote module 142 will not record the time at which each sample is recorded and/or calibration data is generated because remote module 142 does not have access to a clock on sensor device 140.


At step 708, sensor device 140 can send calibration data to media device 110. For example, remote module 142 can send the calibration data for the current sampling period to media device 110.


In some implementations, steps 706 and 708 can be performed repeatedly while in calibration mode. For example, remote module 142 can sample the data generated by the calibration sensors (e.g., microphone, camera, etc.) on sensor device 140 on a periodic basis (e.g., every 50 milliseconds, every one second, etc.), store the sensor data (e.g., detected calibration data), and send the current sample to media device 110 for analysis. Because remote module 142 does not have access to a clock on sensor device 140 in media system 400, the samples are sent without detection timestamps. Remote module 142 may iterate through many sampling periods while in calibration mode. Thus, remote module 142 may send many instances of calibration data to media module 112 on media device 110.


At step 710, media device 110 can determine a time when media device 110 receives the calibration data from sensor device 140. For example, when media device 110 receives calibration data, media module 112 can obtain the current system time from the system clock on media device 110 and store the current system time as the received time for the calibration data.


At step 712, media device 110 can calculate the propagation latency based on the transmission time, the calibration data, the received time of the calibration data, and the time in flight of transmission between sensor device 140 and media device 110. For example, media module 112 can analyze each instance of calibration data to determine which instance (or instances, when multiple playback channels are calibrated) of calibration data includes a calibration segment (e.g., the audio and/or video calibration pattern). When media module 112 identifies an instance of calibration data that includes the calibration segment, media module 112 can determine the time at which the calibration segment of the calibration data (e.g., sampled sensor data) was presented by a playback device and/or received by sensor device 140. For example, the time at which the calibration segment was presented can be determined by adding the sample offset to a time (e.g., sample time) when the sample in the calibration data was captured. This sample time can be estimated by subtracting a time in flight value (e.g., corresponding to an amount of time estimated for a message to travel from sensor device 140 to media device 110) from the calibration data received time determined at step 710. Media module 112 can then determine the time (e.g., presentation time) at which the calibration content was presented by the playback device based on the calibration offset of the calibration segment (e.g., subtract the calibration offset from the time at which the calibration segment was presented), as described above. Media module 112 can then calculate the propagation latency based on the difference between the transmission time of the calibration content and the presentation time of the calibration segment. For example, media module 112 can subtract the transmission time recorded when calibration content 118 was sent to the playback device through the playback channel from the presentation time determined based on the calibration data.
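For process 700 the sensor device supplies no timestamp, so the sample time is estimated as the received time minus a time-in-flight value; the rest of the arithmetic matches step 610. The function below is an illustrative sketch of that calculation only, with a hypothetical name and signature.

```python
# Illustrative sketch of the step 712 arithmetic; not the claimed implementation.
def propagation_latency_via_time_in_flight(received_time, time_in_flight,
                                           sample_offset, calibration_offset,
                                           transmission_time):
    # Estimate when the sample was captured on the (clock-less) sensor device.
    sample_time = received_time - time_in_flight
    # Time at which the calibration segment was detected within the sample.
    segment_time = sample_time + sample_offset
    # Back out the segment's offset within the calibration content to get the
    # time the content started playing on the playback device.
    presentation_time = segment_time - calibration_offset
    # Propagation latency for the playback channel.
    return presentation_time - transmission_time
```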


At step 714, media device 110 can adjust the transmission delay for the playback channel when sending media content for playback based on the playback latency determined for the playback channel. For example, media module 112 can compare the presentation latency for the playback channel to the presentation latencies calculated for other playback channels and adjust the playback delays (e.g., an amount of time for delaying sending the media content) for each playback channel to accommodate the playback channel that has the longest presentation latency. For example, if playback channel 126 has a presentation latency of 5 seconds and playback channel 162 has a playback latency of 2 seconds, then media module 112 can delay sending media content on playback channel 162 for 3 seconds after sending the same media content on playback channel 126 so that the media content will be presented simultaneously by the playback devices associated with each playback channel.


Graphical User Interfaces


The disclosure above describes various Graphical User Interfaces (GUIs) for implementing various features, processes or workflows. These GUIs can be presented on a variety of electronic devices including but not limited to laptop computers, desktop computers, computer terminals, television systems, tablet computers, e-book readers and smart phones. One or more of these electronic devices can include a touch-sensitive surface. The touch-sensitive surface can process multiple simultaneous points of input, including processing data related to the pressure, degree or position of each point of input. Such processing can facilitate gestures with multiple fingers, including pinching and swiping.


When the disclosure refers to “select” or “selecting” user interface elements in a GUI, these terms are understood to include clicking or “hovering” with a mouse or other input device over a user interface element, or touching, tapping or gesturing with one or more fingers or stylus on a user interface element. User interface elements can be virtual buttons, menus, selectors, switches, sliders, scrubbers, knobs, thumbnails, links, icons, radio buttons, checkboxes and any other mechanism for receiving input from, or providing feedback to a user.


Privacy

The present disclosure recognizes that the use of personal information data, in the present technology, can be used to the benefit of users. For example, the personal information data (e.g., samples of detected audio and/or video calibration data) can be used to calibrate playback devices so that audio and/or video data can be presented in a synchronized manner across different playback devices. Accordingly, use of such personal information data enables calculated control of the presented audio and/or video content. Although in some instances, a user's voice and/or other sounds proximate to the sensor device may be recorded while sampling audio and/or video calibration data, the systems described herein maintain and protect the user's privacy by recording and/or detecting the audio/video data only in response to user input indicating that such audio and/or video detection sensors (e.g., camera, microphone, etc.) should be activated, enabled, or turned on. Thus, outside of the specific calibration processes described herein, the technology described herein is not configured to record audio and/or video data without the user's knowledge and/or consent.


The present disclosure further contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining personal information data private and secure. For example, personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection should occur only after receiving the informed consent of the users. Additionally, such entities would take any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices.


Despite the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, in the case of advertisement delivery services, the present technology can be configured to allow users to select to “opt in” or “opt out” of participation in the collection of personal information data during registration for services. In another example, users can select not to provide location information for targeted content delivery services. In yet another example, users can select to not provide precise location information, but permit the transfer of location zone information.


Example System Architecture


FIG. 8 is a block diagram of an example computing device 800 that can implement the features and processes of FIGS. 1-7. The computing device 800 can include a memory interface 802, one or more data processors, image processors and/or central processing units 804, and a peripherals interface 806. The memory interface 802, the one or more processors 804 and/or the peripherals interface 806 can be separate components or can be integrated in one or more integrated circuits. The various components in the computing device 800 can be coupled by one or more communication buses or signal lines.


Sensors, devices, and subsystems can be coupled to the peripherals interface 806 to facilitate multiple functionalities. For example, a motion sensor 810, a light sensor 812, and a proximity sensor 814 can be coupled to the peripherals interface 806 to facilitate orientation, lighting, and proximity functions. Other sensors 816 can also be connected to the peripherals interface 806, such as a global navigation satellite system (GNSS) (e.g., GPS receiver), a temperature sensor, a biometric sensor, magnetometer or other sensing device, to facilitate related functionalities.


A camera subsystem 820 and an optical sensor 822, e.g., a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) optical sensor, can be utilized to facilitate camera functions, such as recording photographs and video clips. The camera subsystem 820 and the optical sensor 822 can be used to collect images of a user to be used during authentication, e.g., by performing facial recognition analysis.


Communication functions can be facilitated through one or more wireless communication subsystems 824, which can include radio frequency receivers and transmitters and/or optical (e.g., infrared) receivers and transmitters. The specific design and implementation of the communication subsystem 824 can depend on the communication network(s) over which the computing device 800 is intended to operate. For example, the computing device 800 can include communication subsystems 824 designed to operate over a GSM network, a GPRS network, an EDGE network, a Wi-Fi or WiMax network, and a Bluetooth™ network. In particular, the wireless communication subsystems 824 can include hosting protocols such that the computing device 800 can be configured as a base station for other wireless devices.


An audio subsystem 826 can be coupled to a speaker 828 and a microphone 830 to facilitate voice-enabled functions, such as speaker recognition, voice replication, digital recording, and telephony functions. The audio subsystem 826 can be configured to facilitate processing voice commands, voiceprinting and voice authentication, for example.


The I/O subsystem 840 can include a touch-surface controller 842 and/or other input controller(s) 844. The touch-surface controller 842 can be coupled to a touch surface 846. The touch surface 846 and touch-surface controller 842 can, for example, detect contact and movement or break thereof using any of a plurality of touch sensitivity technologies, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touch surface 846.


The other input controller(s) 844 can be coupled to other input/control devices 848, such as one or more buttons, rocker switches, thumb-wheel, infrared port, USB port, and/or a pointer device such as a stylus. The one or more buttons (not shown) can include an up/down button for volume control of the speaker 828 and/or the microphone 830.


In one implementation, a pressing of the button for a first duration can disengage a lock of the touch surface 846; and a pressing of the button for a second duration that is longer than the first duration can turn power to the computing device 800 on or off. Pressing the button for a third duration can activate a voice control, or voice command, module that enables the user to speak commands into the microphone 830 to cause the device to execute the spoken command. The user can customize a functionality of one or more of the buttons. The touch surface 846 can, for example, also be used to implement virtual or soft buttons and/or a keyboard.


In some implementations, the computing device 800 can present recorded audio and/or video files, such as MP3, AAC, and MPEG files. In some implementations, the computing device 800 can include the functionality of an MP3 player, such as an iPod™.


The memory interface 802 can be coupled to memory 850. The memory 850 can include high-speed random-access memory and/or non-volatile memory, such as one or more magnetic disk storage devices, one or more optical storage devices, and/or flash memory (e.g., NAND, NOR). The memory 850 can store an operating system 852, such as Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks.


The operating system 852 can include instructions for handling basic system services and for performing hardware dependent tasks. In some implementations, the operating system 852 can be a kernel (e.g., UNIX kernel). In some implementations, the operating system 852 can include instructions for performing voice authentication. For example, operating system 852 can implement the media system calibration features as described with reference to FIGS. 1-7.


The memory 850 can also store communication instructions 854 to facilitate communicating with one or more additional devices, one or more computers and/or one or more servers. The memory 850 can include graphical user interface instructions 856 to facilitate graphic user interface processing; sensor processing instructions 858 to facilitate sensor-related processing and functions; phone instructions 860 to facilitate phone-related processes and functions; electronic messaging instructions 862 to facilitate electronic-messaging related processes and functions; web browsing instructions 864 to facilitate web browsing-related processes and functions; media processing instructions 866 to facilitate media processing-related processes and functions; GNSS/Navigation instructions 868 to facilitate GNSS and navigation-related processes and functions; and/or camera instructions 870 to facilitate camera-related processes and functions.


The memory 850 can store other software instructions 872 to facilitate other processes and functions, such as the media system calibration processes and functions as described with reference to FIGS. 1-7.


The memory 850 can also store other software instructions 874, such as web video instructions to facilitate web video-related processes and functions; and/or web shopping instructions to facilitate web shopping-related processes and functions. In some implementations, the media processing instructions 866 are divided into audio processing instructions and video processing instructions to facilitate audio processing-related processes and functions and video processing-related processes and functions, respectively.


Each of the above identified instructions and applications can correspond to a set of instructions for performing one or more functions described above. These instructions need not be implemented as separate software programs, procedures, or modules. The memory 850 can include additional instructions or fewer instructions. Furthermore, various functions of the computing device 800 can be implemented in hardware and/or in software, including in one or more signal processing and/or application specific integrated circuits.


Example Embodiments

Some embodiments can include a method comprising: detecting, by a sensor device, a portion of calibration content presented by a first playback device, the portion of calibration content transmitted to the first playback device from a media device at a transmission time determined based on a first clock at the media device; generating, by the sensor device, calibration data, the calibration data including the portion of the detected calibration content and a detection time indicating when the portion of the calibration content was detected by the sensor device, the detection time determined based on a second clock on the sensor device; sending, by the sensor device, the calibration data to the media device, wherein the media device calculates a propagation latency value based on a transmission time, the portion of the detected calibration content, and the detection time indicated in the calibration data.


The method can include embodiments wherein the first clock is a system clock and the second clock is a Bluetooth clock. The method can include embodiments wherein the first clock and the second clock are system clocks of the media device and the sensor device respectively. The method can include embodiments wherein the calibration content includes a first media segment followed by a calibration media segment followed by a second media segment. The method can include embodiments wherein the calibration content is audio content. The method can include embodiments wherein the calibration content is video content. The method can include embodiments wherein the sensor device is a remote-control device for remotely controlling the media device.


Some embodiments can include a system comprising: a plurality of computing devices, including a media device, a sensor device; and a plurality of non-transitory computer-readable medium including one or more sequences of instructions that, when executed by the computing devices, cause the computing devices to perform operations comprising: sending, by the media device, calibration content to a first playback device associated with a first playback channel; storing, by the media device, a transmission time indicating when the calibration content was sent to the first playback device, the transmission time determined based on a first clock on the media device; detecting, by the sensor device, a portion of calibration content presented by the first playback device; generating, by the sensor device, calibration data, the calibration data including the portion of the detected calibration content and a detection time indicating when the portion of the calibration content was detected, the detection time determined based on a second clock on the sensor device; sending, by the sensor device, the calibration data to the media device; and calculating, by the media device, a propagation latency value based on the transmission time, the portion of the detected calibration content, and the detection time indicated in the calibration data.


The system can include embodiments wherein the first clock is a system clock and the second clock is a Bluetooth clock. The system can include embodiments wherein the first clock and the second clock are system clocks of the media device and the sensor device respectively. The system can include embodiments wherein the calibration content includes a first media segment followed by a calibration media segment followed by a second media segment. The system can include embodiments wherein the calibration content is audio content. The system can include embodiments wherein the calibration content is video content. The system can include embodiments wherein the calibration media segment is associated with a time offset, and wherein the instructions cause the computing devices to perform operations comprising: calculating, by the media device, a propagation latency value based on the transmission time, the detection time, and the time offset for the calibration media segment.


Some embodiments can include a media device comprising: one or more processors; and a non-transitory computer readable medium including one or more sequences of instructions that, when executed by the one or more processors, cause the processors to perform operations comprising: sending, by the media device, calibration content to a first playback device associated with a first playback channel; storing, by the media device, a transmission time indicating when the calibration content was sent to the first playback device, the transmission time determined based on a first clock on the media device; receiving, by the media device from a sensor device, the calibration data, the calibration data including a portion of calibration content presented by the first playback device and detected by the sensor device; determining, by the media device, a received time indicating when the calibration data was received by the media device, the received time determined based on the first clock on the media device; and calculating, by the media device, a propagation latency value based on the transmission time, the received time, the portion of the detected calibration content, and a time in flight value representing an amount of time it takes for a message to be received at the media device after being sent by the sensor device.


The media device can include embodiments wherein the calibration content includes a first media segment followed by a calibration media segment followed by a second media segment. The media device can include embodiments wherein the calibration content is audio content. The media device can include embodiments wherein the calibration content is video content. The media device can include embodiments wherein the instructions cause operations comprising: determining a detection time for the portion of calibration content based on the received time and the time in flight value. The media device can include embodiments wherein the calibration media segment is associated with a time offset, and wherein the instructions cause operations comprising: calculating, by the media device, a propagation latency value based on the transmission time, the detection time, and the time offset for the calibration media segment.

Claims
  • 1. A method comprising: sending, by a media device, calibration content to a first playback device associated with a first playback channel; storing, by the media device, a transmission time indicating when the calibration content was sent to the first playback device, the transmission time determined based on a first clock on the media device; detecting, by a sensor device, a portion of calibration content presented by the first playback device; generating, by the sensor device, calibration data, the calibration data including the portion of the detected calibration content and a detection time indicating when the portion of the calibration content was detected, the detection time determined based on a second clock on the sensor device; sending, by the sensor device, the calibration data to the media device; and calculating, by the media device, a propagation latency value based on the transmission time, the portion of the detected calibration content, and the detection time indicated in the calibration data.
  • 2. The method of claim 1, wherein the first clock is a system clock and the second clock is a Bluetooth clock.
  • 3. The method of claim 1, wherein the first clock and the second clock are system clocks of the media device and the sensor device respectively.
  • 4. The method of claim 1, wherein the calibration content includes a first media segment followed by a calibration media segment followed by a second media segment.
  • 5. The method of claim 4, wherein the calibration media segment is associated with a time offset, and further comprising: calculating, by the media device, a propagation latency value based on the transmission time, the detection time, and the time offset for the calibration media segment.
  • 6. The method of claim 1, wherein the calibration content is audio content.
  • 7. The method of claim 1, wherein the calibration content is video content.
  • 8. A method comprising: sending, by a media device, calibration content to a first playback device associated with a first playback channel; storing, by the media device, a transmission time indicating when the calibration content was sent to the first playback device, the transmission time determined based on a first clock on the media device; receiving, by the media device from a sensor device, calibration data, the calibration data including a portion of calibration content presented by the first playback device and detected by the sensor device; determining, by the media device, a received time indicating when the calibration data was received by the media device, the received time determined based on the first clock on the media device; and calculating, by the media device, a propagation latency value based on the transmission time, the received time, the portion of the detected calibration content, and a time in flight value representing an amount of time it takes for a message to be received at the media device after being sent by the sensor device.
  • 9. The method of claim 8, wherein the calibration content includes a first media segment followed by a calibration media segment followed by a second media segment.
  • 10. The method of claim 9, wherein the calibration media segment is associated with a time offset, and further comprising: calculating, by the media device, a propagation latency value based on the transmission time, the detection time, and the time offset for the calibration media segment.
  • 11. The method of claim 8, wherein the calibration content is audio content.
  • 12. The method of claim 8, wherein the calibration content is video content.
  • 13. The method of claim 8, further comprising: determining a detection time for the portion of calibration content based on the received time and the time in flight value.
  • 14. A sensor device comprising: one or more processors; and a non-transitory computer readable medium including one or more sequences of instructions that, when executed by the processors, cause the processors to perform operations comprising: detecting, by the sensor device, a portion of calibration content presented by a first playback device, the portion of calibration content transmitted to the first playback device from a media device at a transmission time determined based on a first clock at the media device; generating, by the sensor device, calibration data, the calibration data including the portion of the detected calibration content and a detection time indicating when the portion of the calibration content was detected by the sensor device, the detection time determined based on a second clock on the sensor device; sending, by the sensor device, the calibration data to the media device, wherein the media device calculates a propagation latency value based on a transmission time, the portion of the detected calibration content, and the detection time indicated in the calibration data.
  • 15. The sensor device of claim 14, wherein the first clock is a system clock and the second clock is a Bluetooth clock.
  • 16. The sensor device of claim 14, wherein the first clock and the second clock are system clocks of the media device and the sensor device respectively.
  • 17. The sensor device of claim 14, wherein the calibration content includes a first media segment followed by a calibration media segment followed by a second media segment.
  • 18. The sensor device of claim 17, wherein the sensor device is a remote-control device for remotely controlling the media device.
  • 19. The sensor device of claim 14, wherein the calibration content is audio content.
  • 20. The sensor device of claim 14, wherein the calibration content is video content.