This disclosure relates to techniques for transmitting data between a wireless source device and other wireless devices and more particularly to the transmission of media data from the wireless source device to a wireless sink device and a wireless client device.
Wireless display (WD) or WI-FI Display (WFD) systems include a wireless source device and one or more wireless sink devices. The source device and each of the sink devices may be either mobile devices or wired devices with wireless communication capabilities. One or more of the source device and the sink devices may, for example, include mobile telephones, portable computers with wireless communication cards, personal digital assistants (PDAs), portable media players, or other such devices with wireless communication capabilities, including so-called “smart” phones and “smart” pads or tablets, or other types of wireless communication devices. One or more of the source device and the sink devices may also include wired devices such as televisions, desktop computers, monitors, projectors, and the like, that include communication capabilities.
The source device sends media data, such as audio video (AV) data, to one or more of the sink devices participating in a particular media share session. The media data may be played back at both a local display of the source device and at each of the displays of the sink devices. More specifically, each of the participating sink devices renders the received media data on its screen and audio equipment.
Server computing devices may also provide media items to client devices using various media sharing protocols. The client devices may issue playback commands to the server computing devices. In response to receiving the playback commands, the server may transmit media items, for example using streaming, to the client devices.
This disclosure generally describes a system where a first wireless computing device configured as a wireless source device can communicate with a second wireless computing device configured as a wireless sink device and a wireless client computing device. As part of a communication session, the wireless source device can transmit audio and video data to the wireless sink device, and the wireless sink device can transmit user inputs received at the wireless sink device back to the wireless source device. The wireless source device may also execute a media sharing application, which enables sharing playlists of media items between the first wireless computing device and the wireless client computing device. The wireless client computing device may determine which media items the wireless client computing device is capable of outputting. The first wireless computing device may receive media playback commands from the wireless client computing device, and may transmit the media items to the wireless client computing device in response to the playback commands.
In one example, the disclosure describes a method of transmitting content of a first wireless computing device to a second wireless computing device that includes initiating, with the first wireless computing device, a WI-FI display (WFD) connection, transmitting, with the first wireless computing device, data from the first wireless computing device via the WFD connection to the second wireless computing device, executing, with the first wireless computing device, a media sharing application that enables the first wireless computing device to share a media item of a playlist with a wireless client computing device, transmitting, with the first wireless computing device, information that describes the media item of the playlist to the wireless client computing device, wherein transmitting the information that describes the media item causes the wireless client computing device to determine if the wireless client computing device is capable of outputting the media item, and transmitting, with the first wireless computing device, the media item to the wireless client computing device.
In another example, the disclosure describes a first wireless computing device that includes a WI-FI display (WFD) module configured to initiate a WI-FI display (WFD) connection with a second wireless computing device, and transmit data from the first wireless computing device via the WFD connection to the second wireless computing device, a media sharing module configured to: execute a media sharing application that enables the first wireless computing device to share a media item of a playlist with a wireless client computing device, transmit information that describes the media item of the playlist to the wireless client computing device, wherein the transmission of the information that describes the media item causes the wireless client computing device to determine if the wireless client computing device is capable of outputting the media item, and transmit the media item to the wireless client computing device.
In another example, the disclosure describes a first wireless computing device that includes means for initiating a WI-FI display (WFD) connection with a second wireless computing device, means for transmitting data via the WFD connection to the second wireless computing device, means for executing a media sharing application that enables the first wireless computing device to share a media item of a playlist with a wireless client computing device, means for transmitting information that describes the media item of the playlist to the wireless client computing device, wherein the means for transmitting the information that describes the media item causes the wireless client computing device to determine if the wireless client computing device is capable of outputting the media item, and means for transmitting the media item to the wireless client computing device.
In another example, the disclosure describes a computer-readable storage medium. The computer-readable storage medium has stored thereon instructions that upon execution cause one or more processors of a first wireless computing device to initiate a WI-FI display (WFD) connection with a second wireless computing device, transmit data from the first wireless computing device via the WFD connection to the second wireless computing device, execute a media sharing application that enables the first wireless computing device to share a media item of a playlist with a wireless client computing device, transmit information that describes the format of the media item of the playlist to the wireless client computing device, wherein the transmission of the information that describes the media item causes the wireless client computing device to determine if the wireless client computing device is capable of outputting the media item, and transmit the media item to the wireless client computing device.
WI-FI Display (WFD) may be used in a variety of applications in order to support wirelessly transmitting content. As an example, a mobile device (referred to as a “source”) may be used to wirelessly transmit video content or other application data from a mobile computing device, such as a phone, tablet, smartphone, or portable digital assistant (PDA), to other devices (referred to as “sink(s)” and “clients”), which are WFD enabled. The video content or other application data transmitted from the source and received by the sink may be output by one or more output devices of the sink.
In this disclosure, the term source device generally refers to a device that is transmitting media data to either a sink device or a client device. As will be described in greater detail below, the term sink device generally refers to a device that is receiving media data from a source device and simultaneously rendering the same media content as the source device. The term client device generally refers to a device that is receiving media data from a source device, but unlike a sink device, a client device is not necessarily simultaneously rendering the same media content as the source device. For example, a source device may stream video data or audio data to a client device even though the source device is not itself rendering that video data or audio data. The terms source device, sink device, and client device generally refer to a state of operation for a particular device. Thus, one device may be capable of being any of a source device, sink device, or client device, and in some instances may even simultaneously function as more than one type of device. For example, a particular device may be a client device for one device but may be a source device for another device.
In an example, a user of a mobile source device may execute a media sharing application on the source device when the user comes within a proximity that supports WI-FI communications. The media sharing application may allow one or more users of WI-FI equipped client devices to select media items shared on the source device to watch, listen to, and/or view via WI-FI streaming. The media sharing application of the source device may also use WFD to connect with a WFD-compatible sink device in order to share contacts or any other application data of the source device with the WFD-compatible sink device.
The media sharing application may present one or more playlists of media items, such as audio, video, pictures, etc., for streaming to the users of the client devices, which may execute a client application to communicate with the media sharing application. Users of the client devices may select media items to play from the playlists. In some examples, the client devices and the media sharing application may negotiate with each other, and only show media items in the playlists that the client devices are capable of outputting. The users of the client devices may select one or more media items for playback.
The source computing device may share the playlists using one or more protocols, such as the protocols of Universal Plug and Play (UPnP). The client devices may request a stream of one or more selected media items of the playlist using a protocol, such as RTSP (Real-Time Streaming Protocol). Responsive to receiving the request for the one or more media items, the source device may stream the requested items using a protocol, such as real-time transport protocol (RTP), to the requesting client devices.
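To make the request step concrete, the sketch below sends a single RTSP PLAY request from a client to the source over TCP. The address, port, media path, and session identifier are hypothetical, and a real client would first complete the usual RTSP DESCRIBE and SETUP exchanges before issuing PLAY.

```python
import socket

# Minimal sketch of a client issuing an RTSP PLAY request to the source
# device. The address, port, media path, and session ID are hypothetical.
SOURCE_ADDR = ("192.168.1.10", 554)  # assumed RTSP port on the source
REQUEST = (
    "PLAY rtsp://192.168.1.10/playlists/1/items/220 RTSP/1.0\r\n"
    "CSeq: 3\r\n"
    "Session: 12345678\r\n"
    "\r\n"
)

with socket.create_connection(SOURCE_ADDR) as sock:
    sock.sendall(REQUEST.encode("ascii"))
    response = sock.recv(4096).decode("ascii", errors="replace")
    print(response)  # e.g., "RTSP/1.0 200 OK" if the source accepts
```

On success, the source would then begin delivering the requested media item over a separate RTP session, as described above.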
When the source device comes within a proximity close enough to support wireless communications via WI-FI, the user of the source device may launch the media sharing application. The application may initiate a WFD session, which may configure the mobile device as a WFD source. The source device may connect with a WFD-compatible device (referred to as a “sink” or “sink device”) by communicating wirelessly with the WFD-compatible sink device. The WFD sink device may use some authentication mechanism, such as a pre-shared key or certificate system, to ensure that the user of the source device is authorized to connect with the sink device.
The source device, client devices, and the sink device may comprise devices such as DVD players, TVs, MP3 players, laptops, tablets, netbooks, and/or other devices that are WI-FI enabled. In some examples, the client devices and/or the sink device may be integrated into an automobile. In other examples, the client devices and/or the sink device may belong to the users and may be portable.
In one example user environment, a smartphone may operate as a wireless source device and transmit media data to passengers in an automobile. The automobile may, for example, include a wireless sink device in the dashboard or control panel that allows a driver to safely view map applications or other such content while driving. The automobile may additionally include one or more client devices. For example, client devices in the back seat may enable passengers to view a movie stored on the smartphone or to listen to music stored on the smartphone. For purposes of explanation and example, certain aspects of this disclosure may be described with respect to a user environment inside an automobile, but the techniques of this disclosure are not limited to any particular user environment.
The WFD connection between the source device and the sink device may allow sharing of application data. In various examples, the application data may include contacts, calendar appointments, music stored on the source device, navigation data, or any other application data that the user of the source device may want access to. In addition to providing application data to the WFD-enabled device of the automobile, the source device may also perform screen mirroring with the sink device in accordance with the WFD draft specification, which is currently under development. When performing mirroring, the display of the source device may be sent to the sink device in real-time, such that the sink device and the source device are in sync.
WFD mirroring refers to a source device transmitting image data to a sink device, which displays the transmitted image data in real-time. In addition to transmitting image data from a source to a sink, WFD also allows the sink device to transmit input commands from the sink back to the source using a user input back channel (UIBC), as described below. In an example, the sink device may also receive playlists of media items via WFD. The sink device may transmit user input commands to the source device that select media items for playback. In response to receiving the media commands, the source device may transmit the requested media items to the sink device.
In the context of the present disclosure, the WFD sink device may comprise one or more processors, memory, one or more storage devices, input and/or output devices, and a wireless module capable of WI-FI communications. As stated above, when the source device connects to the sink device, the sink device may display the interface of the source device. In the example where the sink device comprises a device of an automobile, and the source device comprises a device of a driver of the automobile, the sink device may include a larger screen than the source device. This, in turn, may be beneficial from a safety standpoint for the driver. By using a built-in output device of the automobile as a wireless sink device, the driver may avoid having to look away from the road to see the display of the source device.
In various examples, the user of the sink device may issue user input commands to the source device. In some examples, the input commands may include mouse clicks, scrolling actions, keyboard input, or any other type of user input. The input device of the sink may comprise a touch screen and/or a voice command system, e.g. via BLUETOOTH in some examples.
Responsive to receiving user input, the sink device may transmit the user input back to the source device via a UIBC of the data connection between the source and sink. Responsive to receiving the user input from the sink device, the source device may accept the user input and take an action, such as scrolling, accepting mouse and/or keyboard input, or acting in response to a voice command.
In the automobile user environment, the driver may utilize the sink device in a variety of ways in order to interact with the contents of the source device. In various examples, the user may interact with the sink device, and the interaction may cause the source device to make phone calls, access contact information, change musical selections, access calendar and/or scheduling data, access the internet, access navigation data and/or services, or perform other actions. Additionally, in some examples, various input and output of the source device may be redirected to various devices connected to the sink device. For instance, if the driver is making a phone call, the driver may speak into one or more microphones which may be connected to the sink device in order to facilitate the ease of making the phone call. Additionally, the audio of the phone call may be redirected from the source device through the speakers of the car, which are connected to the sink device, to provide better audio call quality and ease of listening for the driver.
As stated above, users of the client devices may also access content of the source device via the media sharing application. In an example, before a user of a client device (e.g. a backseat passenger) may access content on the source device, the user of the source device (e.g. the driver) may set up a playlist for the user of the client device. The playlist may include a variety of media that the users of the client devices may select from to watch and/or listen to. The playlist may, for example, include various compressed video files, audio files, images, or any other content that may be displayed by the output devices of the client devices. Multiple output devices, e.g. multiple WI-FI-equipped client devices, may be able to simultaneously connect to the media sharing application of the source device, and the source device may transmit multiple simultaneous media streams, one to each of the client devices. In this manner, each user of a client device may be able to access different media items simultaneously according to the preferences of each user.
Using one or more input and output devices of a client device, such as a touch screen display, mouse, keyboard, etc., a user of the wireless client device may connect via WI-FI to the media sharing application running on the source device. In some examples, the WI-FI connection may be established using WI-FI direct. In another example, an automobile may provide a wireless network to which the source device and the client devices may connect, and over which data to and from the source and client devices may be transferred. In some examples, the client devices may include a WI-FI module, processor, memory, storage, and one or more additional input devices, which may allow a user to select media from the playlist of the source device.
Once a user of a client device has selected one or more media items from the playlist, the source device may begin streaming a selected media item via a WI-FI connection to the client device. After the media item has finished playing, the next media item from the playlist may stream to the client device, and so on until the playlist is complete. In some examples, a user of the client device may provide user input to the client device to select a different media item from the media server of the source device. The user may also issue additional playback commands, such as “start,” “stop,” “fast forward,” etc. in order to control the playback of media on the client device.
In the example of
In addition to rendering audio/video data 121 locally via display 122 and speaker 123, audio/video encoder 124 of source device 120 can encode audio/video data 121, and transmitter/receiver unit 126 can transmit the encoded data over communication channel 150 to sink device 160. Sink device 160 may comprise a device that includes a touchscreen display, which may be mounted in a convenient location of an automobile for driver interaction. Transmitter/receiver unit 166 of sink device 160 receives the encoded data, and audio/video decoder 164 decodes the encoded data and outputs the decoded data via display 162 and speaker 163. In this manner, the audio and video data being rendered by display 122 and speaker 123 can be simultaneously rendered by display 162 and speaker 163. The audio data and video data may be arranged in frames, and the audio frames may be time-synchronized with the video frames when rendered.
Audio/video encoder 124 and audio/video decoder 164 may implement any number of audio and video compression standards, such as the ITU-T H.264 standard, alternatively referred to as MPEG-4, Part 10, Advanced Video Coding (AVC), or the newly emerging high efficiency video coding (HEVC) standard, sometimes called the H.265 standard. Generally speaking, audio/video decoder 164 is configured to perform the reciprocal coding operations of audio/video encoder 124. Although not shown in
As will be described in more detail below, A/V encoder 124 may also perform other encoding functions in addition to implementing a video compression standard as described above. For example, A/V encoder 124 may add various types of metadata to A/V data 121 prior to A/V data 121 being transmitted to sink device 160. In some instances, A/V data 121 may be stored on or received at source device 120 in an encoded form and thus not require further compression by A/V encoder 124.
Although,
Display 122 and display 162 may comprise any of a variety of video output devices such as a cathode ray tube (CRT), a liquid crystal display (LCD), a plasma display, an organic light emitting diode (OLED) display, or another type of display device. Speaker 123 may comprise any of a variety of audio output devices such as headphones, a single-speaker system, a multi-speaker system, or a surround sound system. Additionally, although display 122 and speaker 123 are shown as part of source device 120 and display 162 and speaker 163 are shown as part of sink device 160, source device 120 and sink device 160 may in fact be a system of devices. As one example, display 162 may be a television, speaker 163 may be a surround sound system, and decoder 164 may be part of an external box connected, either wired or wirelessly, to display 162 and speaker 163. In other instances, sink device 160 may be a single device, such as a tablet computer or smartphone. In still other cases, source device 120 and sink device 160 are similar devices, e.g., both being smartphones, tablet computers, or the like. In this case, one device may operate as the source and the other may operate as the sink. These roles may even be reversed in subsequent communication sessions.
Transmitter/receiver unit 126 and transmitter/receiver unit 166 may each include various mixers, filters, amplifiers and other components designed for signal modulation, as well as one or more antennas and other components designed for transmitting and receiving data. Communication channel 150 generally represents any suitable communication medium, or collection of different communication media, for transmitting video data from source device 120 to sink device 160. Communication channel 150 is usually a relatively short-range communication channel, similar to WI-FI, BLUETOOTH, or the like. However, communication channel 150 is not necessarily limited in this respect, and may comprise any wireless or wired communication medium, such as a radio frequency (RF) spectrum or one or more physical transmission lines, or any combination of wireless and wired media. In other examples, communication channel 150 may even form part of a packet-based network, such as a wired or wireless local area network, a wide-area network, or a global network such as the Internet. Additionally, communication channel 150 may be used by source device 120 and sink device 160 to create a peer-to-peer link. Source device 120 and sink device 160 may communicate over communication channel 150 using a communications protocol such as a standard from the IEEE 802.11 family of standards. The techniques of this disclosure may at times be described with respect to WI-FI, but it is contemplated that aspects of these techniques may also be compatible with other communication protocols.
In addition to decoding and rendering data received from source device 120, sink device 160 can also receive user inputs from user input device 167. User input device 167 may, for example, be a keyboard, mouse, trackball or track pad, touch screen, voice command recognition module, or any other such user input device. UIPM 168 formats user input commands received by user input device 167 into a data packet structure that source device 120 is capable of interpreting. Such data packets are transmitted by transmitter/receiver 166 to source device 120 over communication channel 150. Transmitter/receiver unit 126 receives the data packets, and A/V control module 125 parses the data packets to interpret the user input command that was received by user input device 167.
Additionally, users, e.g. passengers or a driver, of sink device 160 may be able to launch and control applications on source device 120. For example, a user of sink device 160 may be able to launch a photo editing or navigation application stored on source device 120 and use the application to edit a photo that is stored locally on source device 120. Sink device 160 may present a user with a user experience that looks and feels like the photo is being edited locally on sink device 160 while in fact the photo is being edited on source device 120. Using such a configuration, a device user may be able to leverage the capabilities of one device for use with several devices. For example, source device 120 may be a smartphone with a large amount of memory and high-end processing capabilities, and a user of source device 120 may use the smartphone in all the settings and situations smartphones are typically used. When watching a movie, the user may wish to watch the movie on a device with a bigger display screen, in which case sink device 160 may be a tablet computer. When wanting to send or respond to email, the user may wish to use a device with a keyboard, in which case sink device 160 may be a laptop. In both situations, the bulk of the processing may still be performed by source device 120 (a smartphone in this example) even though the user is interacting with a tablet computer or a laptop. Due to the bulk of the processing being performed by source device 120, sink device 160 may be a lower cost device with fewer resources than if sink device 160 were being asked to do the processing being done by source device 120.
In some configurations, A/V control module 125 may be an operating system process being executed by the operating system of source device 120. In other configurations, however, A/V control module 125 may be a software process of an application running on source device 120. In an example, A/V control module 125 may include a media server capable of WI-FI media streaming and a WFD module. In such a configuration, the user input command may be interpreted by the software process, such that a user of sink device 160 is interacting directly with the application running on source device 120, as opposed to the operating system running on source device 120. By interacting directly with an application as opposed to an operating system, a user of sink device 160 may have access to a library of commands that are not native to the operating system of source device 120. Additionally, interacting directly with an application may enable commands to be more easily transmitted and processed by devices running on different platforms.
Source device 120 can respond to user inputs applied at wireless sink device 160. In such an interactive application setting, the user inputs applied at wireless sink device 160 may be sent back to the wireless display source over communication channel 150. In one example, a reverse channel architecture, also referred to as a user interface back channel (UIBC), may be implemented to enable sink device 160 to transmit the user inputs applied at sink device 160 to source device 120. The reverse channel architecture may include upper layer messages for transporting user inputs and lower layer frames for negotiating user interface capabilities at sink device 160 and source device 120. The UIBC may reside over the Internet Protocol (IP) transport layer between sink device 160 and source device 120. In this manner, the UIBC may be above the transport layer in the Open System Interconnection (OSI) communication model. In one example, the OSI communication model includes seven layers (1—physical, 2—data link, 3—network, 4—transport, 5—session, 6—presentation, and 7—application). In this example, being above the transport layer refers to layers 5, 6, and 7. To promote reliable transmission and in-sequence delivery of data packets containing user input data, the UIBC may be configured to run on top of other packet-based communication protocols such as the transmission control protocol/internet protocol (TCP/IP) or the user datagram protocol (UDP).
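The following sketch illustrates the general idea of carrying a user input event over such a channel. The one-byte category and input-type codes and the header layout are hypothetical, not the field layout defined by the WFD specification; the point is simply that a sink-side event can be serialized into a small packet and written to the TCP connection carrying the UIBC.

```python
import struct

# Illustrative only: packs a touch-down event into a hypothetical
# UIBC-style packet (category byte, input-type byte, payload length,
# then 16-bit x/y coordinates in network byte order).
GENERIC_CATEGORY = 0x00  # assumed code for the generic input category
TOUCH_DOWN = 0x01        # assumed code for a touch-down event

def pack_touch_event(x: int, y: int) -> bytes:
    payload = struct.pack("!HH", x, y)  # 16-bit x and y coordinates
    header = struct.pack("!BBH", GENERIC_CATEGORY, TOUCH_DOWN, len(payload))
    return header + payload

packet = pack_touch_event(640, 360)
# The sink would write `packet` to the TCP connection carrying the UIBC.
```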
In some cases, there may be a mismatch between the user input interfaces located at source device 120 and sink device 160. To resolve the potential problems created by such a mismatch and to promote a good user experience under such circumstances, user input interface capability negotiation may occur between source device 120 and sink device 160 prior to establishing a communication session.
The UIBC may be designed to transport various types of user input data, including cross-platform user input data. For example, source device 120 may run the iOS® operating system, while sink device 160 runs another operating system such as Android® or Windows®. Regardless of platform, UIPM 168 can encapsulate received user input in a form understandable to A/V control module 125. A number of different types of user input formats may be supported by the UIBC so as to allow many different types of source and sink devices to exploit the protocol. Both generic input formats and platform-specific input formats may be supported, thus providing flexibility in the manner in which user input can be communicated between source device 120 and sink device 160 by the UIBC.
In an example, sink device 160 may establish a WFD connection with source device 120 and source device 120 may transmit information that describes one or more media items of a playlist to sink device 160. Playlists and media items are described in further detail below, e.g., with respect to
In the example of
For this disclosure, the term source device is generally used to refer to the device that is transmitting audio/video data, and the term sink device is generally used to refer to the device that is receiving the audio/video data from the source device. In many cases, source device 120 and sink device 160 may be similar or identical devices, with one device operating as the source and the other operating as the sink. Moreover, these roles may be reversed in different communication sessions. Thus, a sink device in one communication session may be a source device in a subsequent communication session, or vice versa.
Source device 120 may share one or more available playlists with one or more devices, such as client device 180. Source device 120 may further transmit information that describes at least one media item of a playlist to client device 180. The transmission of the information may cause client device 180 to determine if wireless client device 180 is capable of outputting the at least one media item. A user of client device 180 may request one or more media items of the one or more playlists from source device 120. Responsive to receiving the requests for the one or more media items, source device 120 may stream or transmit the requested media items to client device 180, and client device 180 may output the requested media items on an output device, such as display 182 and/or speaker 183.
Source device 120 of
A/V control module 125 may be configured to execute a media sharing application 128 with one or more processors of source device 120. Media sharing application 128 may be part of an operating system or a standalone application of source device 120, in some examples. Media sharing application 128 may determine one or more playlists to share with client computing devices, such as client computing device 180. The media items of the playlists may be stored on local storage, including hard drives, flash memory, and/or peripherals connected to source device 120. Additionally, the media items of the playlists may be accessed remotely by source device 120. Examples of such remotely accessible media items include media items stored in the cloud, streaming video, or media items stored on a file server.
Responsive to the user of source device 120 launching media sharing application 128, media sharing application 128 may broadcast the playlists to one or more client devices, such as client device 180. In some examples, media sharing application 128 may broadcast the playlists using one or more of the protocols of the UPnP set of protocols. Although described with respect to UPnP, media sharing application 128 may use any mechanism compatible with the wireless communication protocol to broadcast the playlists to client devices, such as client device 180. Source device 120 may announce its services (i.e. that source device 120 offers streaming services using RTSP and RTP) using Simple Service Discovery Protocol (SSDP), which is the protocol of UPnP that provides for service discovery of devices on a network. The use of SSDP as a discovery protocol is but one example, and should be considered non-limiting. Other protocols and sets of protocols, such as User Datagram Protocol (UDP), BONJOUR, Service Location Protocol (SLP), Web Services Dynamic Discovery (WS-Discovery), and Zero configuration networking (zeroconf), may also enable client devices to discover the streaming services provided by source device 120. In the example where source device 120 uses UDP to transmit the playlists, source device 120 may transmit the playlists using a particular port, broadcast address, or well-known multicast address such that one or more client devices, such as client device 180, may listen for the playlists transmitted by source device 120.
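For illustration, the sketch below multicasts an SSDP "alive" announcement of the kind media sharing application 128 might send. The service type, LOCATION URL, and UUID are hypothetical placeholders.

```python
import socket

# Sketch of an SSDP NOTIFY announcement sent to the standard SSDP
# multicast group. Service type, LOCATION, and USN values are made up.
SSDP_GROUP = ("239.255.255.250", 1900)
NOTIFY = (
    "NOTIFY * HTTP/1.1\r\n"
    "HOST: 239.255.255.250:1900\r\n"
    "CACHE-CONTROL: max-age=1800\r\n"
    "LOCATION: http://192.168.1.10:8080/playlists.xml\r\n"
    "NT: urn:example:service:MediaSharing:1\r\n"
    "NTS: ssdp:alive\r\n"
    "USN: uuid:0d8c3a2e-0000-0000-0000-000000000001\r\n"
    "\r\n"
)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 2)
sock.sendto(NOTIFY.encode("ascii"), SSDP_GROUP)
sock.close()
```

A client listening on the multicast group would then fetch the advertised LOCATION to retrieve the playlists, as described below.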
Client device 180 may be similar to sink device 160 of
The service announcements received by client application 185 may also include one or more playlists of media items or a link to a resource location that contains one or more playlists of media items, such as a URL, network path, etc. If a link to the playlists is included in the service announcement, client application 185 may retrieve the playlists shared by media sharing application 128 from the location.
Each of the one or more playlists may contain a list of one or more media items, which can be streamed from source device 120 to client device 180. Each playlist may also include a user identifier, which may be used to restrict access to a specific playlist to one or more users. Each playlist may include one or more properties for each of the one or more media items. The properties may generally include information, such as a name, length, resolution, frame rate, profile level, bitrate, and/or file format for each media item, as some examples. The properties of the playlist and the media items are described in further detail below with respect to
Upon receiving one or more playlists, client application 185 may output the playlists to a user of client device 180 with display 182. Display 182 may comprise any of a variety of video output devices such as a cathode ray tube (CRT), a liquid crystal display (LCD), a plasma display, an organic light emitting diode (OLED) display, or another type of display device. Client device 180 may receive user inputs selecting one of the playlists from user input devices 187. User input devices 187 may, for example, be a keyboard, mouse, trackball or track pad, touch screen, voice command recognition module, or any other such user input device.
Responsive to receiving the playlists, client application 185 may determine which media items of the playlists are capable of being output by client device 180, and present the media items capable of being output by client device 180 to the user of client device 180 using display 182. To determine which of the media items client device 180 is capable of playing, client application 185 may query the operating system for installed codecs, digital rights management (DRM) capabilities, and/or the hardware capabilities of client device 180 and compare the capabilities of client device 180 with attribute information included in the playlists and the media items of the playlists.
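A minimal sketch of such a capability check follows, assuming a simplified capability table and attribute names; a real client would populate the table by querying the operating system for installed codecs, DRM support, and display limits.

```python
# Sketch of the filter client application 185 might apply. The capability
# fields and media item attribute names here are assumptions.
DEVICE_CAPS = {
    "codecs": {"h264", "mp3"},  # assumed installed codecs
    "max_width": 1280,
    "max_height": 720,
}

def can_output(item: dict, caps: dict = DEVICE_CAPS) -> bool:
    if item["file_type"] not in caps["codecs"]:
        return False
    width, height = item.get("resolution", (0, 0))
    return width <= caps["max_width"] and height <= caps["max_height"]

playlist = [
    {"name": "movie", "file_type": "h264", "resolution": (1920, 1080)},
    {"name": "song", "file_type": "mp3"},
]
playable = [item for item in playlist if can_output(item)]
# Only "song" survives the filter on this hypothetical device.
```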
Client application 185 may display the media items from the selected playlist that client device 180 is capable of playing to the user of client device 180 using display 182. The user of client device 180 may select one or more media items from the playlist for playback with user input devices 187. Responsive to receiving the selections for the one or more media items, client application 185 may cause transmit/receive unit 186 to create a playback request for one of the selected media items. In some examples, the playback request may be a request that source device 120 play, pause, stop, record, etc. the selected media item. Transmit/receive unit 186 may transmit the requests for the selected media items to transmit/receive unit 126 over communication channel 152.
If the user of client device 180 selects multiple media items from the playlist, client application 185 may issue a playback request for a first one of the selected media items and may enqueue requests for the rest of the selected media items. Once playback of the first media item completes, client application 185 requests one of the enqueued media items for playback, streams the requested media item, and repeats the process of requesting and streaming the enqueued media items until all the enqueued media items have been requested and streamed, as in the sketch below.
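A minimal sketch of that queueing loop, with a placeholder standing in for the RTSP PLAY exchange:

```python
from collections import deque

# Placeholder for the RTSP PLAY exchange and subsequent streaming
# described above; a real client would block here until playback ends.
def request_playback(item: str) -> None:
    print(f"requesting stream for {item}")

def play_selections(selected: list[str]) -> None:
    queue = deque(selected)
    while queue:
        item = queue.popleft()
        request_playback(item)
        # ...stream and render the item, then loop to the next request

play_selections(["media item 220", "media item 222"])
```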
In an example, client device 180 may be connected with source device 120 using a WFD connection. Client device 180 may receive display information (e.g., graphical representations) that, when rendered by client device 180, illustrates the one or more playlists that contain the one or more media items. Client device 180 may output the display information to a user of client device 180 using display 182, and a user of client device 180 may select one or more media items for playback using UI 187. In an example, client device 180 may receive the playlists, which comprise information that describes one or more media items, using a WFD connection. UI 187 may receive user input commands that select one or more media items and transmit the commands over a UIBC to source device 120.
Responsive to receiving a request or media playback command, such as an RTSP PLAY request or a UIBC user input command from client device 180, for one or more media items, media sharing application 128 may cause transmit/receive unit 126 to construct a stream of the requested media item. In some examples, transmit/receive unit 126 may construct an RTP session for the stream of the media item. Transmit/receive unit 126 may establish the RTP session with client device 180 and transmit the stream for the media item to transmit/receive unit 186 over communication channel 152. In the example described above where client device 180 receives information about the playlists from a WFD connection, and transmits media item selections using a UIBC, source device 120 and client device 180 may terminate the WFD connection responsive to receiving the UIBC input command that selects media items and/or playlists. Once the WFD connection terminates, source device 120 and client device 180 may continue to communicate using RTSP and/or RTP.
In some examples, media sharing application 128 may need to transcode the selected media item into a different format before transmitting the media item to client device 180. In such cases, media sharing application 128 may utilize audio/video encoder 124 to re-encode the selected media item from one format to another, e.g. from an MPEG Layer 3 audio (MP3) format to a Windows Media Audio (WMA) format.
Transmit/receive unit 186 may receive the RTP stream of the requested media item from transmit/receive unit 126. If some of the packets of the stream of the requested media item are out of order, transmit/receive unit 186 may reassemble and/or reorder the stream into a correct order. Transmit/receive unit 186 may also determine whether there are any issues with the received RTP stream, such as dropped packets, and may re-request the dropped packets from source device 120.
Client application 185 may analyze the stream and output the audio and/or video portions of the stream using display 182 and speaker 183. The audio data and video data may be arranged in frames, and the audio frames may be time-synchronized with the video frames when rendered. If client application 185 determines that the stream of the media item needs to be decoded, client application 185 may utilize audio/video decoder 184 to decode the encoded stream before outputting the stream.
Each of the one or more attributes of playlists 200, 202, and 204 may generally have an identifier. The identifier may be associated with a list of one or more values. As an example, playlists 200, 202, 204 may include a “number” attribute. The number attribute may indicate the number of media items associated with each playlist that a client device, such as client device 180, may stream from source device 120. As another example, playlists 200, 202, 204 may also include a “users” attribute. The users attribute may be associated with a list of one or more users, groups of users, and/or devices that are permitted to stream the media items associated with a particular playlist.
In some examples, the values associated with the users attribute may comprise a username of a particular user that is allowed to stream the media items of the playlist. As an example, in
Media sharing application 128 may authenticate a client device identified with a hardware identifier by comparing the hardware identifier (e.g. MAC address) that the client device provides to media sharing application 128 with the hardware identifiers associated with the users attribute. In the example of
Media sharing application 128 may authenticate a user of a client device, e.g. client device 180, using a variety of different authentication mechanisms. In some examples, media sharing application 128 may request authentication information, e.g., a username and password, from client device 180. Responsive to receiving a username and password from client device 180, media sharing application 128 may compare the received username and password with the usernames associated with the users attribute. In an example, the users attribute of playlist 200 includes associated users “Jessie” and “Bob.” Media sharing application 128 may request a username and password from client application 185, and may receive a response that includes username “Bob,” and a password for user Bob. Media sharing application 128 may determine that username Bob is included in the users attribute. Media sharing application 128 may then compare a locally stored password to the password supplied by client device 180 to determine whether the supplied password matches the stored password. If the supplied password matches the stored password, media sharing application 128 may authenticate client device 180, and grant access to (i.e. allow streaming of) media items of playlist 200 to client device 180. Media sharing application 128 may store the passwords of the users in a database or local storage of source device 120. In some examples, media sharing application 128 may utilize an authentication technique, such as a certificate system, to authenticate users or devices. In other examples,
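A simplified sketch of the username/password check described above follows. The credential store is shown in plaintext only to mirror the example; a real implementation would store salted password hashes rather than raw passwords.

```python
import hmac

# Sketch of the playlist 200 access check. Usernames come from the
# example in the text; the passwords are hypothetical.
STORED_CREDENTIALS = {"Jessie": "secret1", "Bob": "secret2"}
PLAYLIST_USERS = {"Jessie", "Bob"}  # the users attribute of playlist 200

def authenticate(username: str, password: str) -> bool:
    if username not in PLAYLIST_USERS:
        return False
    stored = STORED_CREDENTIALS.get(username, "")
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(stored, password)

granted = authenticate("Bob", "secret2")
# True: allow streaming of the media items of playlist 200
```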
In the example of
Although the users attribute of playlists 200, 202, 204 is illustrated as a list of users or devices that are allowed access to the media items of playlists 200, 202, 204, the users attribute of a particular playlist may alternatively include lists of users and/or devices that are excluded from accessing the media items of the playlist. In some examples, the playlist may include a list of users that are permitted to access the media items of the playlist and a list of users that are denied access to the particular playlist.
Responsive to receiving the one or more playback commands, media sharing application 128 may perform actions in accordance with the requested playback command. As an example, if client application 185 transmits an RTSP PLAY command to media sharing application 128, requesting that media sharing application 128 play media item 220, media sharing application 128 may respond by sending a stream corresponding to the requested media item 220 to client application 185. As another example, if media sharing application 128 receives an RTSP STOP command, media sharing application 128 may stop streaming a media item, such as media item 222, that is currently playing. While RTSP is used for controlling the playback of media items, the streams of media items sent from source device 120 to client device 180 may generally use a different protocol, such as RTP, to carry the actual media data.
In general, media items may have file name, file type, resolution, bit rate, length, and/or profile attributes. The file name attribute may indicate the file name or title of the media item. The file type attribute may indicate whether the media item is of video, audio, or another file format. In some examples, the file type attribute may also indicate more specific information, such as a specific type of audio or video, etc. (e.g. H.264 or MP3) of the media item.
The resolution attribute may indicate the horizontal and vertical resolution of a media item. Source device 120 and client device 180 may negotiate a set of one or more resolutions client device 180 is capable of outputting based on the resolution attributes of one or more media items. As part of this negotiation process, source device 120 and client device 180 can agree on a negotiated screen resolution. When client device 180 streams data associated with a media item, such as streaming video, source device 120 can scale or transcode the video data of the media item to match the negotiated screen resolution. In this manner, client device 180 may receive video data with the agreed upon resolution. By providing client device 180 with video in the agreed upon resolution, client device 180 may not have to transcode the video data of the media item, which may consume additional power and/or processing cycles.
In
Media items 220, 222 may also have a length attribute, which indicates the playback time of the media item. Media item 220 has a length of one hour and 40 minutes (1 H 40 M). Media item 222 has a length of two minutes and 22 seconds (2 M 22 SEC). In addition to the length attribute, media items may have an associated profile attribute, which may indicate capabilities or encoding features of the media item. As an example, the value of the profile attribute of media item 220 is “main,” which may correspond to a specific profile of MPEG 4 video. Media item 222 does not have a value associated with the profile attribute because the MP3 format does not have a profile. Although described with respect to profiles of H.264 video in
Client application 185 may use the values of the attributes associated with media items 220, 222, 224 of playlist 200 to determine a subset of media items that client device 180 is capable of playing. Once client application 185 determines the media items that client device 180 is capable of playing, client application 185 may output only the media items of that subset to the user of client device 180. The user may select for playback only media items from the subset that client device 180 is capable of playing.
To determine which media items client device 180 is capable of playing, client application 185 may query client device 180 to determine the hardware capabilities of client device 180. Client application 185 may query the operating system as to the amount of RAM, storage space, output device resolution, sound output capabilities, processor speed, libraries installed, CODECs (coder-decoders), or any other information relevant to the playback of media items on client device 180. When client application 185 requests access to a playlist, e.g. playlist 200, client application 185 may compare the attributes of playlist 200 and the attributes of the media items 220, 222, 224 of playlist 200 with the capabilities of client device 180 to determine which media items client device 180 is capable of playing.
As one example, client device 180 may have an output device capable of displaying only 1280×720 pixel video resolution. Client application 185 may determine that media item 220 has 1920×1080 resolution, and may exclude media item 220 from the media items of playlist 200 available for playback by client device 180. Alternatively, client application 185 may determine that source device 120 can scale the video of media item 220 down to 1280×720 resolution, and may include media item 220 in the media items of playlist 200 available for playback.
In another example, client application 185 may determine a connection speed of communication link 152 between client device 180 and source device 120, e.g. 10 Mbit/s of bandwidth. Based on the connection speed, client application 185 may determine whether there is sufficient bandwidth to stream a particular media item without excessive buffering. Because media item 220 has a bit rate of 12 Mbit/s, which is greater than the available 10 Mbit/s bandwidth, client application 185 may exclude media item 220 from the list of media items of playlist 200 available for playback. Client application 185 may examine the bit rate attribute of media item 222, and because the value of 320 kbit/s is less than the 10 Mbit/s of bandwidth, may include media item 222 in the list of media items of playlist 200 available for playback.
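The bandwidth check reduces to a one-line filter, shown below with the example values from the text; how the link speed is actually measured is left abstract.

```python
# Exclude items whose bit rate exceeds the measured link speed. The
# values mirror the worked example above for communication link 152.
LINK_SPEED_BPS = 10_000_000  # 10 Mbit/s measured link speed

media_items = [
    {"name": "media item 220", "bitrate_bps": 12_000_000},  # 12 Mbit/s video
    {"name": "media item 222", "bitrate_bps": 320_000},     # 320 kbit/s MP3
]

streamable = [m for m in media_items if m["bitrate_bps"] <= LINK_SPEED_BPS]
print([m["name"] for m in streamable])  # ['media item 222']
```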
The ability to determine which media items client device 180 is capable of playing and only presenting those media items to the user may be useful in situations where client device 180 has limited functionality. As an example, in an automobile setting, client device 180 may be hardware that is built-in to the automobile, such as a seatback media player, which may not receive updates that include newer CODECs or media profiles. As such, client device 180 may not be able to display a significant variety of media items, and the media items that client device 180 is not capable of playing should be excluded from the playlist of media items that is ultimately presented to the user.
The attributes of playlists 200, 202, and media items 220, 222, and 224 may be stored in a variety of formats. In some examples, the attributes and their associated values may be stored in XML (eXtensible Markup Language), binary, CSV (comma-separated values), HTML (HyperText Markup Language), or any other format for storing records. In some examples, the attributes of the playlists and media items may be stored in the playlists themselves. In some instances, the attributes and their associated values may be stored in a separate database that media sharing application 128 may index based on a unique identifier associated with each playlist and/or media item.
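As an illustration of the XML option, the sketch below serializes the attributes of one playlist and one media item. The element and attribute names are illustrative, not a defined schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML serialization of playlist and media item attributes;
# element names are made up for illustration.
playlist = ET.Element("playlist", id="200", users="Jessie,Bob")
item = ET.SubElement(playlist, "mediaItem")
ET.SubElement(item, "fileName").text = "movie.mp4"
ET.SubElement(item, "fileType").text = "H.264"
ET.SubElement(item, "resolution").text = "1920x1080"
ET.SubElement(item, "bitRate").text = "12000000"   # bits per second
ET.SubElement(item, "length").text = "1H40M"
ET.SubElement(item, "profile").text = "main"

print(ET.tostring(playlist, encoding="unicode"))
```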
Physical layer 302 and MAC layer 304 may define physical signaling, addressing, and channel access control used for communications in a WD system. Physical layer 302 and MAC layer 304 may define the frequency band structure used for communication, e.g., Federal Communications Commission bands defined at 700 MHz, 2.4 GHz, 3.6 GHz, 5 GHz, 60 GHz, or Ultrawideband (UWB) frequency band structures. Physical layer 302 and MAC 304 may also define data modulation techniques, e.g. analog and digital amplitude modulation, frequency modulation, and phase modulation techniques, and combinations thereof. Physical layer 302 and MAC 304 may also define multiplexing techniques, e.g. time division multiple access (TDMA), frequency division multiple access (FDMA), code division multiple access (CDMA), or any combination of OFDM, FDMA, TDMA, and/or CDMA. In one example, physical layer 302 and media access control layer 304 may be defined by a Wi-Fi (e.g., IEEE 802.11-2007 and 802.11n-2009) standard, such as that provided by WFD. In other examples, physical layer 302 and media access control layer 304 may be defined by any of: WirelessHD, Wireless Home Digital Interface (WHDI), WiGig, and Wireless USB.
Internet protocol (IP) 306, user datagram protocol (UDP) 308, real time protocol (RTP) 310, transport control protocol (TCP) 322, and real time streaming protocol (RTSP) 324 define packet structures and encapsulations used in a WD system and may be defined according to the standards maintained by the Internet Engineering Task Force (IETF).
RTSP 324 may be used by source device 120 and sink device 160 to negotiate capabilities, establish a session, and perform session maintenance and management, as well as by source device 120 and sink device 160 to transmit media items in accordance with the techniques of this disclosure. For example, source device 120 may send a capability request message (e.g., RTSP GET_PARAMETER request message) to sink device 160 specifying a list of capabilities that are of interest to source device 120. Sink device 160 may respond with a capability response message (e.g., RTSP GET_PARAMETER response message) to source device 120 declaring whether it supports each capability. As an example, the capability response message may indicate a “yes” if sink device 160 supports the capability. Source device 120 may then send an acknowledgement request message (e.g., RTSP SET_PARAMETER request message) to sink device 160 indicating that the capability is supported. Sink device 160 may respond with an acknowledgment response message (e.g., RTSP SET_PARAMETER response message) to source device 120 acknowledging that the capability will be used during the media share session.
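The sketch below builds a capability request of this shape. The parameter name "uibc_capability" is an illustrative stand-in; the WFD specification defines its own parameter names and syntax.

```python
# Sketch of composing an RTSP GET_PARAMETER capability request like the
# exchange described above. The URL and parameter name are illustrative.
def build_get_parameter(cseq: int, params: list[str]) -> str:
    body = "".join(p + "\r\n" for p in params)
    return (
        f"GET_PARAMETER rtsp://localhost/wfd1.0 RTSP/1.0\r\n"
        f"CSeq: {cseq}\r\n"
        f"Content-Type: text/parameters\r\n"
        f"Content-Length: {len(body)}\r\n"
        f"\r\n{body}"
    )

request = build_get_parameter(2, ["uibc_capability"])
# The sink would reply with "RTSP/1.0 200 OK" and a body declaring
# support, after which the source confirms via SET_PARAMETER.
```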
Video codec 318 may define the video data coding techniques that may be used by a WD system. Video codec 318 may implement any number of video compression standards, such as ITU-T H.261, ISO/IEC MPEG-1 Visual, ITU-T H.262 or ISO/IEC MPEG-2 Visual, ITU-T H.263, ISO/IEC MPEG-4 Visual, ITU-T H.264 (also known as ISO/IEC MPEG-4 AVC), VP8, and High-Efficiency Video Coding (HEVC). It should be noted that in some instances a WD system may transmit either compressed or uncompressed video data.
Audio codec 320 may define the audio data coding techniques that may be used by a WD system. Audio data may be coded using multi-channel formats such as those developed by Dolby and Digital Theater Systems. Audio data may be coded using a compressed or uncompressed format. Examples of compressed audio formats include MPEG-1 and MPEG-2 Audio Layers II and III, AC-3, and AAC. An example of an uncompressed audio format is the pulse-code modulation (PCM) audio format.
Packetized elementary stream (PES) packetization 316 and MPEG2 transport stream (MPEG2-TS) 312 may define how coded audio and video data is packetized and transmitted. Packetized elementary stream (PES) packetization 316 and MPEG2-TS 312 may be defined according to MPEG-2 Part 1. In other examples, audio and video data may be packetized and transmitted according to other packetization and transport stream protocols. Content protection 314 may provide protection against unauthorized copying of audio or video data. In one example, content protection 314 may be defined according to the High-bandwidth Digital Content Protection (HDCP) 2.0 specification.
Memory 402 may store A/V data in the form of media data in compressed or uncompressed formats. Memory 402 may store an entire media data file, or may comprise a smaller buffer that simply stores a portion of a media data file, e.g., streamed from another device or source. Memory 402 may comprise any of a wide variety of volatile or non-volatile memory, including but not limited to random access memory (RAM) such as synchronous dynamic random access memory (SDRAM), read-only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory (EEPROM), FLASH memory, and the like. Memory 402 may comprise a computer-readable storage medium for storing media data, as well as other kinds of data. Memory 402 may additionally store instructions and program code that are executed by a processor as part of performing the various techniques described in this disclosure.
Display processor 404 may obtain captured video frames and may process video data for display on local display 406. Display 406 may comprise any of a variety of display devices such as a liquid crystal display (LCD), a plasma display, an organic light emitting diode (OLED) display, or another type of display device capable of presenting video data to a user of source device 400.
Audio processor 408 may obtain captured audio samples and may process audio data for output to speakers 410. Speakers 410 may comprise any of a variety of audio output devices such as headphones, a single-speaker system, a multi-speaker system, or a surround sound system.
Video encoder 412 may obtain video data from memory 402 and encode video data to a desired video format. Video encoder 412 may be a combination of hardware and software used to implement aspects of video codec 318 described above with respect to
Video packetizer 414 may packetize encoded video data. In one example, video packetizer 414 may packetize encoded video data as defined according to MPEG-2 Part 1. In other examples, video data may be packetized according to other packetization protocols. Video packetizer 414 may be a combination of hardware and software used to implement aspects of packetized elementary stream (PES) packetization 316 described above with respect to
Audio encoder 416 may obtain audio data from memory 402 and encode audio data to a desired audio format. Audio encoder 416 may be a combination of hardware and software used to implement aspects of audio codec 320 described above with respect to
Audio packetizer 418 may packetize encoded audio data. In one example, audio packetizer 418 may packetize encoded audio data as defined according to MPEG-2 Part 1. In other examples, audio data may be packetized according to other packetization protocols. Audio packetizer 418 may be a combination of hardware and software used to implement aspects of packetized elementary stream (PES) packetization 316 described above with respect to
A/V mux 420 may apply multiplexing techniques to combine video payload data and audio payload data as part of a common data stream. In one example, A/V mux 420 may encapsulate packetized elementary video and audio streams as an MPEG2 transport stream defined according to MPEG-2 Part 1. A/V mux 420 may provide synchronization for audio and video packets, as well as error correction.
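One simplified view of this multiplexing step is splitting each PES packet across fixed 188-byte transport stream packets under a single PID. The sketch below does only that; a real A/V mux interleaves audio and video PIDs, inserts program tables, and pads the final packet with an adaptation field rather than the 0xFF fill used here for brevity.

```python
def ts_packets(pid: int, pes: bytes, counter: int = 0) -> list[bytes]:
    """Split one PES packet across 188-byte MPEG2-TS packets (payload only)."""
    out, first = [], True
    for i in range(0, len(pes), 184):
        chunk = pes[i:i + 184]
        header = bytes([
            0x47,                                    # sync byte
            (0x40 if first else 0x00) | (pid >> 8),  # PUSI set on first packet
            pid & 0xFF,
            0x10 | (counter & 0x0F),                 # payload only + continuity
        ])
        # Simplification: spec-conformant stuffing belongs in an adaptation field.
        out.append(header + chunk.ljust(184, b"\xff"))
        counter = (counter + 1) & 0x0F
        first = False
    return out

packets = ts_packets(pid=0x100, pes=b"\x00\x00\x01\xe0" + b"\x00" * 400)
print(len(packets), all(len(p) == 188 for p in packets))  # 3 True
```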
Transport module 422 may process media data for transport to a sink device. Further, transport module 422 may process received packets from a sink device so that they may be further processed. For example, transport module 422 may be configured to communicate using IP, TCP, UDP, RTP, and RTSP. In addition, transport module 422 may further encapsulate an MPEG2-TS for communication to a sink device or across a network.
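When RTP carries the transport stream, a common arrangement (RFC 2250/3551) is to place an integral number of 188-byte TS packets, often seven to stay under a typical Ethernet MTU, in each RTP payload under payload type 33. The sketch below builds such a packet; the sequence number, timestamp, and SSRC values a real session would use come from the transport state.

```python
import struct

def rtp_mp2t_packet(seq: int, timestamp: int, ssrc: int,
                    ts_chunk: list[bytes]) -> bytes:
    """Bundle up to seven 188-byte TS packets into one RTP packet (PT 33)."""
    header = struct.pack(
        ">BBHII",
        0x80,                  # version 2, no padding/extension/CSRC
        33,                    # payload type 33 = MP2T, marker bit clear
        seq & 0xFFFF,
        timestamp & 0xFFFFFFFF,
        ssrc,
    )
    return header + b"".join(ts_chunk)

pkt = rtp_mp2t_packet(seq=1, timestamp=90000, ssrc=0x1234,
                      ts_chunk=[b"\x47" + b"\x00" * 187] * 7)
print(len(pkt))  # 12-byte RTP header + 7 * 188 = 1328 bytes
```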
Modem 424 may be configured to perform physical and MAC layer processing according to the physical and MAC layers utilized in a WD system. As described with reference to
Control module 426 may be configured to perform source device 400 communication control functions. Communication control functions may relate to negotiating capabilities with a sink device, establishing a session with a sink device, and session maintenance and management. Control module 426 may use RTSP to communicate with a sink device. Further, control module 426 may use an RTSP message transaction to negotiate a capability of source device 400 and a sink device to support the user input back channel (UIBC).
Feedback de-packetizer 428 may parse human interface device commands (HIDC), generic user inputs, OS specific user inputs, and performance information from a feedback packet. A feedback category field may identify a generic input category to indicate that feedback packet payload data is formatted using generic information elements. As another example, the feedback category field may identify a human interface device command (HIDC) input category. As another example, the feedback category field may identify an operating system (OS) specific input category to indicate that payload data is formatted based on the type of OS used by either the source device or the sink device.
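A minimal dispatch on the feedback category field might look like the following; the one-byte category followed by a two-byte payload length is a hypothetical layout for illustration, since the wire format is not fixed here.

```python
# Hypothetical category codes and framing, purely for illustration.
GENERIC, HIDC, OS_SPECIFIC, PERF = range(4)

def parse_feedback(packet: bytes) -> tuple[str, bytes]:
    """Route a feedback packet on its category field, as described above."""
    category = packet[0]
    length = int.from_bytes(packet[1:3], "big")
    payload = packet[3:3 + length]
    names = {GENERIC: "generic user input", HIDC: "HID command",
             OS_SPECIFIC: "OS-specific input", PERF: "performance information"}
    if category not in names:
        raise ValueError(f"unknown feedback category {category}")
    return names[category], payload

print(parse_feedback(bytes([PERF, 0, 2, 0x12, 0x34])))
```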
Feedback module 430 receives performance information from feedback de-packetizer 428 and processes the performance information such that source device 400 may adjust the transmission of media data based on a performance information message.
Source device 400 provides an example of a source device configured to transmit content to a second wireless computing device. Source device 400 may initiate a WI-FI display (WFD) connection, transmit data via the WFD connection to the second wireless computing device, execute a media sharing application that enables source device 400 to share a media item of a playlist with a wireless client computing device, and transmit information that describes the media item of the playlist to the wireless client computing device. Transmitting the information that describes the media item may cause the wireless client computing device to determine whether it is capable of outputting the media item, and source device 400 may then transmit the media item to the wireless client computing device.
Modem 502 may be configured to perform physical and MAC layer processing according to the physical and MAC layers utilized in a WD system. As described with reference to
Transport module 504 may process received media data from a source device. Further, transport module 504 may process feedback packets for transport to a source device. For example, transport module 504 may be configured to communicate using IP, TCP, UDP, RTP, and RTSP. In addition, transport module 504 may include a timestamp value in any combination of IP, TCP, UDP, RTP, and RTSP packets. The timestamp values may enable a source device to identify which media data packet experienced a reported performance degradation and to calculate the roundtrip delay in a WD system.
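As a sketch of how an echoed timestamp supports a roundtrip delay estimate, assuming the sink also reports how long it held the packet before echoing it:

```python
import time

def roundtrip_delay(sent_ts: float, sink_hold_s: float,
                    received_at: float | None = None) -> float:
    """Roundtrip delay = (echo arrival - send time) - time the sink held it."""
    received_at = time.monotonic() if received_at is None else received_at
    return (received_at - sent_ts) - sink_hold_s

t0 = time.monotonic()
# ... the packet travels to the sink, is held 5 ms, and its timestamp is echoed ...
print(f"RTT ~ {roundtrip_delay(t0, sink_hold_s=0.005):.6f} s")
```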
A/V demux 506 may apply de-multiplexing techniques to separate video payload data and audio payload data from a data stream. In one example, A/V demux 506 may separate packetized elementary video and audio streams of an MPEG2 transport stream defined according to MPEG-2 Part 1.
Video de-packetizer 508 and video decoder 510 may perform reciprocal processing of a video packetizer and a video encoder implementing packetization and coding techniques described herein, and may output video data to display processor 512.
Display processor 512 may obtain decoded video frames and may process video data for display on display 514. Display 514 may comprise one of a variety of display devices such as a liquid crystal display (LCD), a plasma display, an organic light emitting diode (OLED) display, or another type of display.
Audio de-packetizer 516 and audio decoder 518 may perform reciprocal processing of an audio packetizer and audio encoder implementing packetization and coding techniques described herein, and may output audio data to audio processor 520.
Audio processor 520 may obtain audio data from audio decoder 518 and may process audio data for output to speakers 522. Speakers 522 may comprise any of a variety of audio output devices such as headphones, a single-speaker system, a multi-speaker system, or a surround sound system.
User input module 524 may format user input commands received by a user input device such as, for example, a keyboard, mouse, trackball or track pad, touch screen, voice command recognition module, or any other such user input device. In one example, user input module 524 may format user input commands according to formats defined by human interface device commands (HIDC) 330, generic user inputs 332, and OS specific user inputs 336 described above with respect to
Performance analysis module 526 may determine performance information based on media data packets received from a source device. Performance information may include delay jitter, packet loss, error distribution in time, packet error ratio, and RSSI distribution in time, as well as other examples described herein. Performance analysis module 526 may calculate performance information according to any of the techniques described herein.
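One established way to compute delay jitter from media packet arrivals is the RFC 3550 interarrival jitter estimator, which smooths the difference between arrival spacing and timestamp spacing; a minimal version follows.

```python
def update_jitter(jitter: float, arr_prev: float, arr_now: float,
                  ts_prev: float, ts_now: float) -> float:
    """RFC 3550 interarrival jitter: J += (|D| - J) / 16, in timestamp units."""
    d = (arr_now - arr_prev) - (ts_now - ts_prev)
    return jitter + (abs(d) - jitter) / 16.0

# (arrival, RTP timestamp) pairs in 90 kHz ticks; perfect pacing gives D = 0.
samples = [(0, 0), (3003, 3000), (6006, 6100), (9009, 9000)]
j = 0.0
for (a0, t0), (a1, t1) in zip(samples, samples[1:]):
    j = update_jitter(j, a0, a1, t0, t1)
print(f"jitter ~ {j:.1f} ticks")
```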
Feedback packetizer 528 may process user input information from user input module 524 and performance information from performance analysis module 526 to create feedback packets. In one example, a feedback packet may use the message format described with respect to
Control module 530 may be configured to perform sink or client device 500 communication control functions. Communication control functions may relate to negotiating capabilities with a source device, establishing a session with a source device, and session maintenance and management. Control module 530 may use RTSP to communicate with a source device. Further, control module 530 may negotiate a capability of sink or client device 500 and a source device to support features of UIBC.
In an example, sink device 500 provides an example of a sink device configured to initiate a WFD connection with a wireless source device, such as source device 400 (
The coded data for each data stream may be multiplexed with pilot data using orthogonal frequency division multiplexing (OFDM) techniques. A wide variety of other wireless communication techniques may also be used, including but not limited to time division multiple access (TDMA), frequency division multiple access (FDMA), code division multiple access (CDMA), or any combination of OFDM, FDMA, TDMA, and/or CDMA.
Consistent with
The modulation symbols for the data streams are then provided to a TX MIMO processor 620, which may further process the modulation symbols (e.g., for OFDM). TX MIMO processor 620 can then provide NT modulation symbol streams to NT transmitters (TMTR) 622a through 622t. In certain aspects, TX MIMO processor 620 applies beamforming weights to the symbols of the data streams and to the antenna from which the symbol is being transmitted.
Each transmitter 622 may receive and process a respective symbol stream to provide one or more analog signals, and may further condition (e.g., amplify, filter, and upconvert) the analog signals to provide a modulated signal suitable for transmission over the MIMO channel. NT modulated signals from transmitters 622a through 622t are then transmitted from NT antennas 624a through 624t, respectively.
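The beamforming step above can be pictured as a matrix multiply that maps data-stream symbols onto transmit antennas. The NumPy sketch below uses a random, unit-power weight matrix purely for illustration; in operation the weights derive from the pre-coding matrix selected from receiver feedback, as described below.

```python
import numpy as np

rng = np.random.default_rng(0)
n_streams, n_tx, n_symbols = 2, 4, 8  # fewer streams than antennas, for example

# QPSK modulation symbols for each data stream (one row per stream).
bits = rng.integers(0, 2, (2, n_streams, n_symbols)) * 2 - 1
symbols = (bits[0] + 1j * bits[1]) / np.sqrt(2)

# Illustrative beamforming matrix: one unit-power column per data stream.
w = rng.normal(size=(n_tx, n_streams)) + 1j * rng.normal(size=(n_tx, n_streams))
w /= np.linalg.norm(w, axis=0)

antenna_signals = w @ symbols   # one weighted signal per transmit antenna
print(antenna_signals.shape)    # (4, 8): NT antennas by symbol times
```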
At receiver system 650, the transmitted modulated signals are received by NR antennas 652a through 652r, and the received signal from each antenna 652 is provided to a respective receiver (RCVR) 654a through 654r. Each receiver 654 conditions (e.g., filters, amplifies, and downconverts) a respective received signal, digitizes the conditioned signal to provide samples, and further processes the samples to provide a corresponding “received” symbol stream.
A receive (RX) data processor 660 then receives and processes the NR received symbol streams from NR receivers 654 based on a particular receiver processing technique to provide NT “detected” symbol streams. The RX data processor 660 then demodulates, deinterleaves and decodes each detected symbol stream to recover the traffic data for the data stream. The processing by RX data processor 660 is complementary to that performed by TX MIMO processor 620 and TX data processor 614 at transmitter system 610.
A processor 670, which may be coupled with a memory 672, periodically determines which pre-coding matrix to use. Processor 670 may formulate a reverse link message based on that determination. The reverse link message may comprise various types of information regarding the communication link and/or the received data stream. The reverse link message is then processed by a TX data processor 638, which also receives traffic data for a number of data streams from a data source 636, modulated by a modulator 680, conditioned by transmitters 654a through 654r, and transmitted back to transmitter system 610.
At transmitter system 610, the modulated signals from receiver system 650 are received by antennas 624, conditioned by receivers 622, demodulated by a demodulator 640, and processed by a RX data processor 642 to extract the reverse link message transmitted by the receiver system 650. Processor 630 then determines which pre-coding matrix to use for determining the beamforming weights, and then processes the extracted message.
Source device 120 may generally operate in the same manner described above for source device 120 of
Source device 120 and sink device 160 may negotiate capabilities through a sequence of messages. The messages may, for example, be real time streaming protocol (RTSP) messages. At any stage of the negotiations, the recipient of an RTSP request message may respond with an RTSP response that includes an RTSP status code other than RTSP OK, in which case, the message exchange might be retried with a different set of parameters or the capability negotiation session may be ended.
Source device 120 can send a first message (RTSP OPTIONS request message) to sink device 160 in order to determine the set of RTSP methods that sink device 160 supports. On receipt of the first message from source device 120, sink device 160 can respond with a second message (RTSP OPTIONS response message) that lists the RTSP methods supported by sink device 160. The second message may also include an RTSP OK status code.
After sending the second message to source device 120, sink device 160 can send a third message (RTSP OPTIONS request message) in order to determine the set of RTSP methods that source device 120 supports. On receipt of the third message from sink device 160, source device 120 can respond with a fourth message (RTSP OPTIONS response message) that lists the RTSP methods supported by source device 120. The fourth message can also include an RTSP OK status code.
After sending the fourth message, source device 120 can send a fifth message (RTSP GET_PARAMETER request message) to specify a list of capabilities that are of interest to source device 120. Sink device 160 can respond with a sixth message (an RTSP GET_PARAMETER response message). The sixth message may contain an RTSP status code. If the RTSP status code is OK, then the sixth message can also include response parameters to the parameters specified in the fifth message that are supported by sink device 160. Sink device 160 can ignore parameters in the fifth message that sink device 160 does not support.
Based on the sixth message, source device 120 can determine the optimal set of parameters to be used for the communication session and can send a seventh message (an RTSP SET_PARAMETER request message) to sink device 160. The seventh message can contain the parameter set to be used during the communication session between source device 120 and sink device 160. The seventh message can include the wfd-presentation-url that describes the Universal Resource Identifier (URI) to be used in the RTSP Setup request in order to set up the communication session. The wfd-presentation-url specifies the URI that sink device 160 can use for later messages during a session establishment exchange. The wfd-url0 and wfd-url1 values specified in this parameter can correspond to the values of rtp-port0 and rtp-port1 values in the wfd-client-rtp-ports in the seventh message. RTP in this instance generally refers to the real-time protocol, which can run on top of UDP.
Upon receipt of the seventh message, sink device 160 can respond with an eighth message with an RTSP status code indicating if setting the parameters as specified in the seventh message was successful. As mentioned above, the roles of source device and sink device may reverse or change in different sessions. The order of the messages that set up the communication session may, in some cases, define the device that operates as the source and define the device that operates as the sink.
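Laid end to end, the eight messages read as the following sequence; this is a summary of the exchange above, not a normative trace, and message bodies are omitted.

```python
# "->" is source-to-sink, "<-" is sink-to-source.
NEGOTIATION = [
    ("->", "M1 OPTIONS request",        "which RTSP methods does the sink support?"),
    ("<-", "M2 OPTIONS response",       "sink lists its methods, RTSP OK"),
    ("<-", "M3 OPTIONS request",        "which RTSP methods does the source support?"),
    ("->", "M4 OPTIONS response",       "source lists its methods, RTSP OK"),
    ("->", "M5 GET_PARAMETER request",  "capabilities of interest to the source"),
    ("<-", "M6 GET_PARAMETER response", "supported parameters; unsupported ones ignored"),
    ("->", "M7 SET_PARAMETER request",  "chosen parameter set, incl. wfd-presentation-url"),
    ("<-", "M8 SET_PARAMETER response", "status code: were the parameters accepted?"),
]
for direction, message, purpose in NEGOTIATION:
    print(f"{direction} {message:27s} {purpose}")
```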
Source device 120 may initiate a WI-FI display (WFD) connection to sink device 160 (800). Source device 120 may also transmit data via the WFD connection to sink device 160 (
Source device 120 may execute a media sharing application that enables source device 120 to share a media item of a playlist with client device 180 (
Source device 120 may also transmit WFD data related to at least one of the playlist and the media item of the playlist via a second WFD connection to wireless client device 180. Source device 120 may further receive a user input back channel (UIBC) input command from client device 180 via the second WFD connection. In response to receiving the UIBC input command, source device 120 may terminate the second WFD connection.
Transmitting the information that describes the media item of the playlist may cause client device 180 to determine whether client device 180 is capable of outputting the media item (808). Source device 120 may transmit the media item to client device 180 (810). Source device 120 may also receive a media playback command from client device 180, which may comprise an RTSP directive in some examples. In some instances, source device 120 may transmit the media item using RTP. In some instances, transmitting the media item with source device 120 may occur after receiving the media playback command from client device 180.
Source device 120 may also receive authentication information from client device 180. Source device 120 may authenticate the wireless client device based on the authentication information, and grant client device 180 access to the playlist in response to authenticating client device 180.
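The disclosure leaves the authentication mechanism open; purely as a sketch under that assumption, a shared-secret challenge-response gate in front of the playlist might look like this (all names and the HMAC scheme are hypothetical).

```python
import hashlib
import hmac

SHARED_SECRET = b"example-secret"          # hypothetical provisioning step
PLAYLIST = ["song1.mp4", "clip2.mp4"]

def authenticate(client_id: str, challenge: bytes, response: bytes) -> bool:
    """Check the client's HMAC over a challenge issued by the source device."""
    expected = hmac.new(SHARED_SECRET, client_id.encode() + challenge,
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

def playlist_for(client_id: str, challenge: bytes, response: bytes) -> list[str]:
    """Grant access to the playlist only after the client authenticates."""
    if not authenticate(client_id, challenge, response):
        raise PermissionError("client not authenticated")
    return list(PLAYLIST)

challenge = b"nonce-123"
resp = hmac.new(SHARED_SECRET, b"client-180" + challenge, hashlib.sha256).digest()
print(playlist_for("client-180", challenge, resp))
```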
Although illustrated in a particular order for the purposes of example, the method of
In one or more examples, the functions described in this disclosure may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over, as one or more instructions or code, a computer-readable medium and executed by a hardware-based processing unit. Computer-readable media may include computer-readable storage media, which corresponds to a tangible medium such as data storage media, or communication media including any medium that facilitates transfer of a computer program from one place to another, e.g., according to a communication protocol. In this manner, computer-readable media generally may correspond to (1) tangible computer-readable storage media which is non-transitory or (2) a communication medium such as a signal or carrier wave. Data storage media may be any available media that can be accessed by one or more computers or one or more processors to retrieve instructions, code and/or data structures for implementation of the techniques described in this disclosure. A computer program product may include a computer-readable medium.
By way of example, and not limitation, such computer-readable storage media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage, or other magnetic storage devices, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if instructions are transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. It should be understood, however, that computer-readable storage media and data storage media do not include connections, carrier waves, signals, or other transient media, but are instead directed to non-transient, tangible storage media. Disk and disc, as used herein, includes compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
Instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Accordingly, the term “processor,” as used herein may refer to any of the foregoing structure or any other structure suitable for implementation of the techniques described herein. In addition, in some aspects, the functionality described herein may be provided within dedicated hardware and/or software modules configured for encoding and decoding, or incorporated in a combined codec. Also, the techniques could be fully implemented in one or more circuits or logic elements.
The techniques of this disclosure may be implemented in a wide variety of devices or apparatuses, including a wireless handset, an integrated circuit (IC) or a set of ICs (e.g., a chip set). Various components, modules, or units are described in this disclosure to emphasize functional aspects of devices configured to perform the disclosed techniques, but do not necessarily require realization by different hardware units. Rather, as described above, various units may be combined in a codec hardware unit or provided by a collection of interoperative hardware units, including one or more processors as described above, in conjunction with suitable software and/or firmware.
Various examples have been described. These and other examples are within the scope of the following claims.
This application claims the benefit of U.S. Provisional Application No. 61/583,987 filed Jan. 6, 2012 and U.S. Provisional Application No. 61/599,564 filed Feb. 16, 2012, the entire contents of which are incorporated by reference.