This application relates to computer-implemented applications.
A computing device may execute an operating environment that may include elements, such as file system objects and executing applications. The computing device may render a representation of the operating environment as part of a graphical interface, which may be output for presentation on a display unit of the computing device. The representation of the operating environment may be rendered at a defined display resolution, which may define a display area included in the graphical interface.
Disclosed herein are aspects of systems, methods, and apparatuses for adjustable buffer remote access.
An aspect is a method for adjustable buffer remote access. Adjustable buffer remote access may include generating, at a client device, a client display buffer request indicating a portion of a display area of an operating environment of a host device; transmitting the client display buffer request to the host device; receiving a rendered buffer portion including a rendering of a representation of the portion of the display area of the operating environment of the host device, wherein the rendered buffer portion includes a client display portion and a margin portion; and presenting the client display portion as a window into the display area of the operating environment of the host device.
Another aspect is another method for adjustable buffer remote access. Adjustable buffer remote access may include initiating an adjustable buffer remote access session, wherein initiating the adjustable buffer remote access session includes receiving host display information indicating the display area, generating, at a client device, a client display buffer request indicating a portion of a display area of an operating environment of a host device, and transmitting the client display buffer request to the host device via an electronic communication network. The method of adjustable buffer remote access may include receiving, via the electronic communication network, a rendered buffer portion including a rendering of a representation of the portion of the display area of the operating environment of the host device, wherein the rendered buffer portion includes a client display portion and a margin portion; identifying the client display portion based on a difference between the rendered buffer portion and the margin portion; and presenting the client display portion as a window into the display area of the operating environment of the host device by outputting a portion of the rendered buffer portion corresponding to the client display portion to a graphical display device of the client device.
Another aspect is another method for adjustable buffer remote access. Adjustable buffer remote access may include initiating an adjustable buffer remote access session, wherein initiating the adjustable buffer remote access session includes receiving host display information indicating the display area, generating, at a client device, a client display buffer request indicating a portion of a display area of an operating environment of a host device, and transmitting the client display buffer request to the host device via an electronic communication network. The method of adjustable buffer remote access may include receiving, via the electronic communication network, a rendered buffer portion including a rendering of a representation of the portion of the display area of the operating environment of the host device, wherein the rendered buffer portion includes a client display portion and a margin portion; presenting the client display portion as a window into the display area of the operating environment of the host device by outputting a portion of the rendered buffer portion corresponding to the client display portion to a graphical display device of the client device; receiving an indication of a client display portion change from a user input device of the client device via an operating system of the client device, wherein the client display portion change indicates a change in position, size, or zoom of the client display portion relative to the representation of the display area of the operating environment of the host device; presenting an updated client display portion based on the rendered buffer portion and the client display portion change, wherein the updated client display portion includes a portion of the margin portion; transmitting an updated client display buffer request to the host device indicating an updated portion of the display area of the operating environment of the host device based on the client display portion change; receiving an 
updated rendered buffer portion including a rendering of a representation of the updated portion of the display area of the operating environment of the host device, wherein the updated rendered buffer portion includes the updated client display portion and an updated margin portion; and presenting the updated client display portion.
Variations in these and other aspects will be described in additional detail hereafter.
The description herein makes reference to the accompanying drawings wherein like reference numerals refer to like parts throughout the several views, and wherein:
Remote access technologies, such as remote desktop or screen sharing, may allow a computing device (client) to remotely access an operating environment of another computing device (host). For example, the host device may render a representation of a display area of the operating environment, which may be associated with a defined resolution, and may transmit the rendered output to the client device for presentation on a display unit of the client device. Rendering the representation of the display area may include, for example, capturing the content of the display area and encoding the content as a series of frames.
In moving window based remote access technologies, the host device may render a representation of a portion of the display area of the operating environment on the host device and may transmit the output, including the rendered portion of the display area, to the client device. The client device may present the output as a window into the display area of the operating environment. The window may be moved to include another portion of the display area, and the host device may render the other portion of the display area and transmit the output to the client device for presentation.
In some implementations, moving window based remote access technologies may perform poorly at the client device due to the latency between input of a change in window position at the client device and receipt, at the client device, of the output rendered by the host device. For example, changing the portion displayed may include transmitting information indicating the new portion of the display area, which may include a size of the portion, from the client device to the host device, rendering the new portion of the display area at the host device, transmitting the output from the host device to the client device, and presenting the rendering of the new portion at the display unit of the client device.
Adjustable buffer remote access may improve remote access performance, and may include transmitting information indicating a buffer portion of the display area from the client device to the host device, rendering the buffer portion of the display area at the host device, transmitting the output from the host device to the client device, and presenting a display portion of the rendered buffer portion at the display unit of the client device, wherein the display portion is smaller than and included within the buffer portion.
The computing device 100 may be a stationary computing device, such as a personal computer (PC), a server, a workstation, a minicomputer, or a mainframe computer; or a mobile computing device, such as a mobile telephone, a personal digital assistant (PDA), a laptop, or a tablet PC. Although shown as a single unit, any one or more elements of the computing device 100 can be integrated into any number of separate physical units. For example, the UI 130 and the processor 140 can be integrated in a first physical unit and the memory 150 can be integrated in a second physical unit.
The communication interface 110 can be a wireless antenna, as shown, a wired communication port, such as an Ethernet port, an infrared port, a serial port, or any other wired or wireless unit capable of interfacing with a wired or wireless electronic communication medium 180.
The communication unit 120 can be configured to transmit or receive signals via a wired or wireless medium 180. For example, as shown, the communication unit 120 is operatively connected to an antenna configured to communicate via wireless signals. Although not explicitly shown in
The UI 130 can include any unit capable of interfacing with a user, such as a virtual or physical keypad, a touchpad, a display, a touch display, a speaker, a microphone, a video camera, a sensor, or any combination thereof. The UI 130 can be operatively coupled with the processor, as shown, or with any other element of the communication device 100, such as the power source 170. Although shown as a single unit, the UI 130 may include one or more physical units. For example, the UI 130 may include an audio interface for performing audio communication with a user, and a touch display for performing visual and touch based communication with the user. Although shown as separate units, the communication interface 110, the communication unit 120, and the UI 130, or portions thereof, may be configured as a combined unit. For example, the communication interface 110, the communication unit 120, and the UI 130 may be implemented as a communications port capable of interfacing with an external touchscreen device.
The processor 140 can include any device or system capable of manipulating or processing a signal or other information now-existing or hereafter developed, including optical processors, quantum processors, molecular processors, or a combination thereof. For example, the processor 140 can include a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), a programmable logic array, a programmable logic controller, microcode, firmware, any type of integrated circuit (IC), a state machine, or any combination thereof. As used herein, the term “processor” includes a single processor or multiple processors. The processor can be operatively coupled with the communication interface 110, the communication unit 120, the UI 130, the memory 150, the instructions 160, the power source 170, or any combination thereof.
The memory 150 can include any non-transitory computer-usable or computer-readable medium, such as any tangible device that can, for example, contain, store, communicate, or transport the instructions 160, or any information associated therewith, for use by or in connection with the processor 140. The non-transitory computer-usable or computer-readable medium can be, for example, a solid state drive, a memory card, removable media, a read only memory (ROM), a random access memory (RAM), any type of disk including a hard disk, a floppy disk, an optical disk, a magnetic or optical card, an application specific integrated circuit (ASIC), or any type of non-transitory media suitable for storing electronic information, or any combination thereof. The memory 150 can be connected to, for example, the processor 140 through, for example, a memory bus (not explicitly shown).
The instructions 160 can include directions for performing any method, or any portion or portions thereof, disclosed herein. The instructions 160 can be realized in hardware, software, or any combination thereof. For example, the instructions 160 may be implemented as information stored in the memory 150, such as a computer program, that may be executed by the processor 140 to perform any of the respective methods, algorithms, aspects, or combinations thereof, as described herein. The instructions 160, or a portion thereof, may be implemented as a special purpose processor, or circuitry, that can include specialized hardware for carrying out any of the methods, algorithms, aspects, or combinations thereof, as described herein. Portions of the instructions 160 can be distributed across multiple processors on the same machine or different machines or across a network such as a local area network, a wide area network, the Internet, or a combination thereof.
The power source 170 can be any suitable device for powering the communication device 100. For example, the power source 170 can include a wired power source; one or more dry cell batteries, such as nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), or lithium-ion (Li-ion) batteries; solar cells; fuel cells; or any other device capable of powering the communication device 100. The communication interface 110, the communication unit 120, the UI 130, the processor 140, the instructions 160, the memory 150, or any combination thereof, can be operatively coupled with the power source 170.
Although shown as separate elements, the communication interface 110, the communication unit 120, the UI 130, the processor 140, the instructions 160, the power source 170, the memory 150, or any combination thereof can be integrated in one or more electronic units, circuits, or chips.
A computing and communication device 100A/100B/100C can be, for example, a computing device, such as the computing device 100 shown in
Each computing and communication device 100A/100B/100C can be configured to perform wired or wireless communication. For example, a computing and communication device 100A/100B/100C can be configured to transmit or receive wired or wireless communication signals and can include a user equipment (UE), a mobile station, a fixed or mobile subscriber unit, a cellular telephone, a personal computer, a tablet computer, a server, consumer electronics, or any similar device. Although each computing and communication device 100A/100B/100C is shown as a single unit, a computing and communication device can include any number of interconnected elements.
Each access point 210A/210B can be any type of device configured to communicate with a computing and communication device 100A/100B/100C, a network 220, or both via wired or wireless communication links 180A/180B/180C. For example, an access point 210A/210B can include a base station, a base transceiver station (BTS), a Node-B, an enhanced Node-B (eNode-B), a Home Node-B (HNode-B), a wireless router, a wired router, a hub, a relay, a switch, or any similar wired or wireless device. Although each access point 210A/210B is shown as a single unit, an access point can include any number of interconnected elements.
The network 220 can be any type of network configured to provide services, such as voice, data, applications, voice over internet protocol (VoIP), or any other communications protocol or combination of communications protocols, over a wired or wireless communication link. For example, the network 220 can be a local area network (LAN), wide area network (WAN), virtual private network (VPN), a mobile or cellular telephone network, the Internet, or any other means of electronic communication. The network can use a communication protocol, such as the transmission control protocol (TCP), the user datagram protocol (UDP), the internet protocol (IP), the real-time transport protocol (RTP), the Hypertext Transfer Protocol (HTTP), or a combination thereof.
The computing and communication devices 100A/100B/100C can communicate with each other via the network 220 using one or more wired or wireless communication links, or via a combination of wired and wireless communication links. For example, as shown, the computing and communication devices 100A/100B can communicate via wireless communication links 180A/180B, and the computing and communication device 100C can communicate via a wired communication link 180C. Any of the computing and communication devices 100A/100B/100C may communicate using any wired or wireless communication link, or links. For example, a first computing and communication device 100A can communicate via a first access point 210A using a first type of communication link, a second computing and communication device 100B can communicate via a second access point 210B using a second type of communication link, and a third computing and communication device 100C can communicate via a third access point (not shown) using a third type of communication link. Similarly, the access points 210A/210B can communicate with the network 220 via one or more types of wired or wireless communication links 230A/230B. Although
Other implementations of the computing and communications system 200 are possible. For example, in an implementation the network 220 can be an ad hoc network and can omit one or more of the access points 210A/210B. The computing and communications system 200 may include devices, units, or elements not shown in
The host device 310 may execute an operating environment, which may include an instance of an operating system and may be associated with an account, such as a logged in user account. As shown, a representation of the operating environment may include a display area 350. The display area 350 may indicate a height and a width of the representation of the operating environment. For example, the display area 350 may be associated with a defined display resolution, which may be expressed in physical units of measure, such as millimeters, or logical units of measure, such as pixels. For example, the display area 350 may have a display resolution of 1920 (width) by 1080 (height) pixels. The host device 310 may render the display area and may transmit the rendered output to the client device 320 via the network 330. In some implementations, the host device 310 may render the output as a series of frames, which may include an I-frame followed by one or more P-frames.
The client device 320 may execute an operating environment, which may include a remote access application 322. The client device 320 may receive the rendered output from the host device 310 via the network 330 and may present the representation of the display area 350A via a graphical display unit of the client device 320.
In some implementations, the client device 320 may be configured to present the representation of the display area 350A at a display resolution that differs from the display resolution rendered by the host device 310. For example, the client device 320 may scale the rendered output for presentation via the graphical display unit of the client device 320. In some implementations, the host device 310 may receive an indication of the display resolution of the client device 320 and may render the representation of the operating environment using the display resolution of the client device 320.
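For illustration, the client-side scaling described above may be sketched as follows; the function name and the uniform, aspect-ratio-preserving policy are assumptions for this example, not requirements of the disclosure:

```python
def fit_scale(host_width, host_height, client_width, client_height):
    """Uniform scale factor that fits a host-rendered frame within the
    client display area while preserving the aspect ratio."""
    return min(client_width / host_width, client_height / host_height)

# Example: a 1920x1080 host frame presented on a 1280x800 client display.
scale = fit_scale(1920, 1080, 1280, 800)
scaled_size = (round(1920 * scale), round(1080 * scale))
```

A uniform scale avoids distortion at the cost of unused display area; a host that instead renders directly at the client display resolution would skip this step.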
For example, the host device 310 may adjust the display resolution of the host device 310 to match the display resolution of the client device 320, and may render the representation of the display area at the adjusted display resolution. Adjusting the display resolution may cause unwanted interference with the operating environment of the host device 310.
In another example, rendering the representation of the display area at the host device 310 may include scaling or sampling the representation of the display area to generate output at the display resolution of the client device 320, which may consume significant resources, such as processing resources, and may produce graphical artifacts.
The host device 410 may execute an operating environment, which may include an instance of an operating system and may be associated with an account, such as a logged in user account. As shown, a representation of the operating environment may include a display area 450. The host device 410 may receive an indication of a display window 452, which may include a display resolution and position, from the client device 420. The host device 410 may render a representation of a portion of the display area indicated by the window 452, and may transmit the rendered output to the client device 420 via the network 430.
Although any size and position scheme may be used, display resolution and position are described herein using pixel count and Cartesian coordinates such that the display resolution of the display area 450 may be expressed as a number of horizontal pixels and a number of vertical pixels, such as 1920×1080, a pixel at the top left corner of the display area 450 may be indicated by the coordinates 0/0, and a pixel at the bottom right corner of the display area 450 may be indicated by the coordinates 1920/1080.
For example, the display area 450 of the host device 410 may be expressed as a width, which may be a number of horizontal pixels, such as host_width=1920, and a height, which may be a number of vertical pixels, such as host_height=1080, and the display window 452 may be expressed as a width, which may be a number of horizontal pixels, such as client_width=1024, a height, which may be a number of vertical pixels, such as client_height=800, and a position, which may indicate, for example, the position of the top left corner of the display window 452 relative to the display area 450. The position may include a horizontal offset, which may be a number of pixels from the left side of the display area 450, such as offset_x=10, and a vertical offset, which may be a number of pixels from the top of the display area 450, such as offset_y=280.
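Using the coordinate scheme above, the relationship between a display window and the display area may be sketched as follows; the function name is illustrative:

```python
def window_within_display(host_width, host_height,
                          client_width, client_height,
                          offset_x, offset_y):
    """Check that a display window lies entirely within the host display
    area, using the pixel-count and top-left-offset convention above."""
    return (offset_x >= 0 and offset_y >= 0
            and offset_x + client_width <= host_width
            and offset_y + client_height <= host_height)

# The example values from the text: a 1024x800 window at offset (10, 280)
# within a 1920x1080 display area.
ok = window_within_display(1920, 1080, 1024, 800, 10, 280)
```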
The client device 420 may execute an operating environment, which may include a remote access application 422. The client device 420 may receive the rendered output from the host device 410 via the network 430 and may present the representation of the portion of the display area 452A via a graphical display unit of the client device 420.
Although not shown in
In some implementations, changing the position, size, or zoom of the display window 452A may perform poorly at the client device 420. For example, the client device 420 may send updated display window position information, such as a new offset, to the host device 410. The host device 410 may process the updated display window information, may render a representation of the updated portion of the display area 450 corresponding to the updated display window information as a series of frames including an I-frame followed by one or more P-frames, and may send the series of frames to the client device 420. The client device 420 may receive one or more of the series of frames, such as the I-frame, and may begin presenting the representation of the updated portion of the display area. The I-frame may take longer to render and transmit than a P-frame. The delay between sending the updated information and receiving the rendered output corresponding to the new offset may be noticeable, such as hundreds of milliseconds.
The host device 510 may execute an operating environment, which may include an instance of an operating system and may be associated with an account, such as a logged in user account. As shown, a representation of the operating environment may include a display area 550. The host device 510 may receive an indication of a display window buffer 552, which may include a display resolution and position, from the client device 520. The display window buffer 552 may include a display window portion 554 and a margin portion 556. The host device 510 may render a representation of a portion of the display area 550 indicated by the display window buffer 552, and may transmit the rendered output to the client device 520 via the network 530. In some implementations, the host device 510 may not include a graphical display device. Although described as two dimensional herein, the display area 550 may be three dimensional.
The client device 520 may execute an operating environment, which may include a remote access application 522. The client device 520 may receive the rendered output from the host device 510, which may include the display window portion 554 and the margin portion 556, via the network 530 and may present a portion 554A of the representation of the portion of the display area indicated by the display window buffer 552 via a graphical display unit of the client device 520. The presented portion 554A may correspond to the display window portion 554, and the margin portion 556 may not be presented.
Although not shown in
The display area 610 may indicate a height and a width of a representation of the operating environment at the host device. For example, the display area 610 may be expressed as a width, which may be a number of horizontal pixels, such as host_width=1920, and a height, which may be a number of vertical pixels, such as host_height=1080.
The client display portion 624 may indicate a height and width of a display window on a client device, such as the client device 520 shown in
The display buffer 620 may indicate a height and width of a portion of the display area of the host device, such as a portion requested by a client device. For example, the display buffer 620 may be expressed as a width, which may be a number of horizontal pixels, such as client_buffer_width=1224, and a height, which may be a number of vertical pixels, such as client_buffer_height=1000. The position of the display buffer 620 relative to the display area 610 may be expressed as an offset, which may indicate a position of the top left pixel of the display buffer 620 relative to the top left pixel of the display area 610, and may include a horizontal offset and a vertical offset. The horizontal offset may indicate a logical or physical distance, such as a number of horizontal pixels, from the left side of the display area 610 to the left side of the display buffer 620, and may be expressed as buffer_offset_x. The vertical offset may indicate a logical or physical distance, such as a number of vertical pixels, from the top of the display area 610 to the top of the display buffer 620, and may be expressed as buffer_offset_y. The margin portion 622 may indicate a difference between the client display portion 624 and the display buffer 620.
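The relationship between the display buffer 620, the client display portion 624, and the margin portion 622 may be sketched as follows; the function name and the example coordinates are illustrative assumptions:

```python
def margin_sizes(buffer_offset_x, buffer_offset_y,
                 client_buffer_width, client_buffer_height,
                 offset_x, offset_y, client_width, client_height):
    """Derive the four margin sizes as the difference between the display
    buffer and the client display portion it contains (all in pixels)."""
    margin_left = offset_x - buffer_offset_x
    margin_top = offset_y - buffer_offset_y
    margin_right = client_buffer_width - client_width - margin_left
    margin_bottom = client_buffer_height - client_height - margin_top
    return margin_left, margin_right, margin_top, margin_bottom

# Hypothetical example: a 1224x1000 buffer at (10, 180) holding a 1024x800
# client display portion at (110, 280), in display-area coordinates.
margins = margin_sizes(10, 180, 1224, 1000, 110, 280, 1024, 800)
```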
As shown, the display area 610 may be larger than the display buffer 620 and the display buffer 620 may be larger than the client display portion 624, which may be expressed as follows: host_width≥client_buffer_width≥client_width and host_height≥client_buffer_height≥client_height.
In some implementations, the margin portion 622 may be dynamically adjustable. For example, a client device may determine the relative size of the margin portion to minimize presentation disruption. In an example, the margin portion 622 may use a constant margin size (margin_left, margin_right, margin_top, margin_bottom), which may be expressed as the following: client_buffer_width=client_width+margin_left+margin_right; client_buffer_height=client_height+margin_top+margin_bottom; buffer_offset_x=offset_x−margin_left; buffer_offset_y=offset_y−margin_top.
In another example, the margin portion 622 may use a constant margin size ratio (ratio_left, ratio_right, ratio_top, ratio_bottom), which may be expressed as the following: margin_left=client_width×ratio_left; margin_right=client_width×ratio_right; margin_top=client_height×ratio_top; margin_bottom=client_height×ratio_bottom.
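The two margin strategies above may be sketched as follows, using the sizes and offsets defined earlier; the function names, the tuple return convention, and the rounding of fractional margins are illustrative choices:

```python
def buffer_from_constant_margins(offset_x, offset_y,
                                 client_width, client_height,
                                 margin_left, margin_right,
                                 margin_top, margin_bottom):
    """Constant margin size: extend the client display portion by a fixed
    number of pixels on each side.  Returns (buffer_offset_x,
    buffer_offset_y, client_buffer_width, client_buffer_height)."""
    return (offset_x - margin_left, offset_y - margin_top,
            client_width + margin_left + margin_right,
            client_height + margin_top + margin_bottom)

def buffer_from_margin_ratios(offset_x, offset_y,
                              client_width, client_height,
                              ratio_left, ratio_right,
                              ratio_top, ratio_bottom):
    """Constant margin size ratio: each margin is a fixed fraction of the
    client display portion's width or height, rounded to whole pixels."""
    return buffer_from_constant_margins(
        offset_x, offset_y, client_width, client_height,
        round(client_width * ratio_left), round(client_width * ratio_right),
        round(client_height * ratio_top), round(client_height * ratio_bottom))

# A 1024x800 client portion at (110, 280) with 100-pixel margins on every
# side yields a 1224x1000 buffer at (10, 180).
buf_const = buffer_from_constant_margins(110, 280, 1024, 800,
                                         100, 100, 100, 100)

# The same portion with a margin ratio of 0.1 on every side.
buf_ratio = buffer_from_margin_ratios(110, 280, 1024, 800,
                                      0.1, 0.1, 0.1, 0.1)
```

A real implementation would presumably also clamp the resulting buffer to the host display area so that the requested region never extends past its edges.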
Adjustable buffer remote access, such as the adjustable buffer remote access shown in
Adjustable buffer remote access may be initiated at 710. Initiating adjustable buffer remote access may include establishing a connection between the client device 702 and the host device 704. The client device 702 and the host device 704 may exchange information, such as a client display buffer request, a request identifier (request_id), a requested display buffer size (client_buffer_width, client_buffer_height) and position (buffer_offset_x, buffer_offset_y), and a display resolution (host_width, host_height) of the host device 704. The exchanged information may indicate whether the client device 702 is using moving window based remote access or adjustable buffer remote access. In some implementations, the client device 702 may determine whether the requested buffer is within the display area of the host device 704.
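A minimal sketch of the exchanged request and the bounds check described above follows; only the field names come from the text, and the dictionary encoding is an assumption for illustration:

```python
def make_client_display_buffer_request(request_id,
                                       client_buffer_width,
                                       client_buffer_height,
                                       buffer_offset_x, buffer_offset_y):
    """Assemble a client display buffer request from the named fields."""
    return {"request_id": request_id,
            "client_buffer_width": client_buffer_width,
            "client_buffer_height": client_buffer_height,
            "buffer_offset_x": buffer_offset_x,
            "buffer_offset_y": buffer_offset_y}

def request_within_host(request, host_width, host_height):
    """Check that the requested buffer is within the host display area."""
    return (request["buffer_offset_x"] >= 0
            and request["buffer_offset_y"] >= 0
            and (request["buffer_offset_x"]
                 + request["client_buffer_width"]) <= host_width
            and (request["buffer_offset_y"]
                 + request["client_buffer_height"]) <= host_height)

# Hypothetical values: a 1224x1000 buffer at (10, 80) on a 1920x1080 host.
request = make_client_display_buffer_request(1, 1224, 1000, 10, 80)
ok = request_within_host(request, 1920, 1080)
```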
The host device 704 may render a portion of a representation of a display area of an operating environment of the host device 704 at 720. The portion rendered may correspond to the requested display buffer information, as shown in
The host device 704 may transmit and the client device 702 may receive the rendered buffer portion at 725. The rendered buffer portion may include a portion corresponding to the client display portion, as shown in
The client device 702 may receive an indication of a change of the client display portion at 740. For example, the client device 702 may receive input, such as user input, which may be received, via an operating system of the client device, from a user input device of the client device, such as a mouse, a keyboard, or a touch screen, indicating a change in the size, position, zoom, or a combination thereof, of the client display portion. For example, the client device 702 may include a touch screen and the indication of the change of the client display portion may be received in response to a gesture received by the touch screen.
The client device 702 may present the updated display portion, or a part of the updated display portion, based on the received rendered buffer portion at 750.
In some implementations, the updated client display portion may be included in the received rendered buffer portion. For example, the client display portion change may be smaller than the difference between the client display portion and the buffer portion. As shown in
In some implementations, the updated client display portion, or portion thereof, may not be included in the received rendered buffer portion. For example, the client display portion change may be larger than the difference between the client display portion and the buffer portion. The client device 702 may present the portion of the updated client display portion that is included in the received rendered buffer portion.
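The distinction drawn in the two preceding paragraphs, whether an updated client display portion is still covered by the received rendered buffer portion, may be sketched as follows; the function name and example values are illustrative:

```python
def portion_within_buffer(buffer_offset_x, buffer_offset_y,
                          client_buffer_width, client_buffer_height,
                          offset_x, offset_y,
                          client_width, client_height):
    """Check whether an updated client display portion is still fully
    covered by the previously received rendered buffer portion."""
    return (buffer_offset_x <= offset_x and buffer_offset_y <= offset_y
            and (offset_x + client_width
                 <= buffer_offset_x + client_buffer_width)
            and (offset_y + client_height
                 <= buffer_offset_y + client_buffer_height))

# A small pan of a 1024x800 portion stays inside a 1224x1000 buffer at
# (10, 180); a larger pan does not (hypothetical values).
small_pan = portion_within_buffer(10, 180, 1224, 1000, 130, 300, 1024, 800)
large_pan = portion_within_buffer(10, 180, 1224, 1000, 400, 300, 1024, 800)
```

When the check fails, the client can present the covered part immediately while requesting an updated buffer, which is the behavior described at 750 and 760.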
The client device 702 may transmit information indicating the display portion update to the host device 704 at 760. For example, the client device 702 may transmit a new request identifier (request_id), a new requested display buffer size (client_buffer_width, client_buffer_height), and a new requested buffer position (buffer_offset_x, buffer_offset_y). Although shown sequentially in
The host device 704 may render a portion of a representation of a display area of an operating environment corresponding to the updated buffer information at 770. The host device 704 may transmit and the client device 702 may receive the updated rendered buffer portion at 775. The client device 702 may present the updated display portion based on the received updated rendered buffer portion at 780.
Although described with reference to an operating environment of the host device 704, adjustable buffer remote access may be used for remote presentation at the client device 702 of any content rendered at the host device 704. For example, adjustable buffer remote access may be associated with an identified application running at the host device 704, and the client buffer may include a portion of the display area of the application.
Other implementations of the diagram of adjustable buffer remote access as shown in
The words “example” or “exemplary” are used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as “example” or “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects or designs. Rather, use of the words “example” or “exemplary” is intended to present concepts in a concrete fashion. As used in this application, the term “or” is intended to mean an inclusive “or” rather than an exclusive “or”. That is, unless specified otherwise, or clear from context, “X includes A or B” is intended to mean any of the natural inclusive permutations. That is, if X includes A; X includes B; or X includes both A and B, then “X includes A or B” is satisfied under any of the foregoing instances. In addition, the articles “a” and “an” as used in this application and the appended claims should generally be construed to mean “one or more” unless specified otherwise or clear from context to be directed to a singular form. Moreover, use of the term “an embodiment” or “one embodiment” or “an implementation” or “one implementation” throughout is not intended to mean the same embodiment or implementation unless described as such. As used herein, the terms “determine” and “identify”, or any variations thereof, include selecting, ascertaining, computing, looking up, receiving, determining, establishing, obtaining, or otherwise identifying or determining in any manner whatsoever using one or more of the devices shown in
Further, for simplicity of explanation, although the figures and descriptions herein may include sequences or series of steps or stages, elements of the methods disclosed herein can occur in various orders or concurrently. Additionally, elements of the methods disclosed herein may occur with other elements not explicitly presented and described herein. Furthermore, not all elements of the methods described herein may be required to implement a method in accordance with the disclosed subject matter.
The implementations of the computing and communication devices (and the algorithms, methods, or any part or parts thereof, stored thereon or executed thereby) can be realized in hardware, software, or any combination thereof. The hardware can include, for example, computers, intellectual property (IP) cores, application-specific integrated circuits (ASICs), programmable logic arrays, optical processors, programmable logic controllers, microcode, microcontrollers, servers, microprocessors, digital signal processors or any other suitable circuit. In the claims, the term “processor” should be understood as encompassing any of the foregoing hardware, either singly or in combination. The terms “signal” and “data” are used interchangeably. Further, portions of the computing and communication devices do not necessarily have to be implemented in the same manner.
Further, all or a portion of implementations can take the form of a computer program product accessible from, for example, a tangible computer-usable or computer-readable medium. A computer-usable or computer-readable medium can be any device that can, for example, tangibly contain, store, communicate, or transport the program for use by or in connection with any processor. The medium can be, for example, an electronic, magnetic, optical, electromagnetic, or a semiconductor device. Other suitable mediums are also available.
The above-described implementations have been described in order to allow easy understanding of the application and are not limiting. On the contrary, the application covers various modifications and equivalent arrangements included within the scope of the appended claims, which scope is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structure as is permitted under the law.
Number | Name | Date | Kind |
---|---|---|---|
5452435 | Malouf et al. | Sep 1995 | A |
5638114 | Hatanaka et al. | Jun 1997 | A |
5731840 | Kikuchi et al. | Mar 1998 | A |
5801756 | Iizawa | Sep 1998 | A |
6021213 | Helterbrand et al. | Feb 2000 | A |
6025870 | Hardy | Feb 2000 | A |
6038367 | Abecassis | Mar 2000 | A |
6091777 | Guetz et al. | Jul 2000 | A |
6195391 | Hancock et al. | Feb 2001 | B1 |
6204847 | Wright | Mar 2001 | B1 |
6243683 | Peters | Jun 2001 | B1 |
6266337 | Marco | Jul 2001 | B1 |
6271840 | Finseth et al. | Aug 2001 | B1 |
6346963 | Katsumi | Feb 2002 | B1 |
6363067 | Chung | Mar 2002 | B1 |
6421387 | Rhee | Jul 2002 | B1 |
6483454 | Torre et al. | Nov 2002 | B1 |
6556588 | Wan et al. | Apr 2003 | B2 |
6577333 | Tai et al. | Jun 2003 | B2 |
6587985 | Fukushima et al. | Jul 2003 | B1 |
6681362 | Abbott et al. | Jan 2004 | B1 |
6684354 | Fukushima et al. | Jan 2004 | B2 |
6704024 | Robotham et al. | Mar 2004 | B2 |
6707852 | Wang | Mar 2004 | B1 |
6711209 | Lainema et al. | Mar 2004 | B1 |
6728317 | Demos | Apr 2004 | B1 |
6732313 | Fukushima et al. | May 2004 | B2 |
6741569 | Clark | May 2004 | B1 |
6812956 | Ferren et al. | Nov 2004 | B2 |
6816836 | Basu et al. | Nov 2004 | B2 |
6918077 | Fukushima et al. | Jul 2005 | B2 |
6952450 | Cohen | Oct 2005 | B2 |
7007098 | Smyth et al. | Feb 2006 | B1 |
7007235 | Hussein et al. | Feb 2006 | B1 |
7015954 | Foote et al. | Mar 2006 | B1 |
7124333 | Fukushima et al. | Oct 2006 | B2 |
7178106 | Lamkin et al. | Feb 2007 | B2 |
7180896 | Okumura | Feb 2007 | B1 |
7197070 | Zhang et al. | Mar 2007 | B1 |
D541293 | Harvey | Apr 2007 | S |
7219062 | Colmenarez et al. | May 2007 | B2 |
7263644 | Park et al. | Aug 2007 | B2 |
7266782 | Hull et al. | Sep 2007 | B2 |
D553632 | Harvey | Oct 2007 | S |
7356750 | Fukushima et al. | Apr 2008 | B2 |
7372834 | Kim et al. | May 2008 | B2 |
7376880 | Ichiki et al. | May 2008 | B2 |
7379653 | Yap et al. | May 2008 | B2 |
7447235 | Luby et al. | Nov 2008 | B2 |
7447969 | Park et al. | Nov 2008 | B2 |
7484157 | Park et al. | Jan 2009 | B2 |
D594872 | Akimoto | Jun 2009 | S |
7567671 | Gupte | Jul 2009 | B2 |
7577898 | Costa et al. | Aug 2009 | B2 |
7607157 | Inoue et al. | Oct 2009 | B1 |
7636298 | Miura et al. | Dec 2009 | B2 |
7664185 | Zhang et al. | Feb 2010 | B2 |
7664246 | Krantz et al. | Feb 2010 | B2 |
7680076 | Michel et al. | Mar 2010 | B2 |
7684982 | Taneda | Mar 2010 | B2 |
D614646 | Chen et al. | Apr 2010 | S |
7707224 | Chastagnol et al. | Apr 2010 | B2 |
7710973 | Rumbaugh et al. | May 2010 | B2 |
7720686 | Volk et al. | May 2010 | B2 |
7735111 | Michener et al. | Jun 2010 | B2 |
7739714 | Guedalia | Jun 2010 | B2 |
7756127 | Nagai et al. | Jul 2010 | B2 |
7797274 | Strathearn et al. | Sep 2010 | B2 |
7822607 | Aoki et al. | Oct 2010 | B2 |
7823039 | Park et al. | Oct 2010 | B2 |
7860718 | Lee et al. | Dec 2010 | B2 |
7864210 | Kennedy | Jan 2011 | B2 |
7974243 | Nagata et al. | Jul 2011 | B2 |
8010185 | Ueda | Aug 2011 | B2 |
8019175 | Lee et al. | Sep 2011 | B2 |
8060651 | Deshpande et al. | Nov 2011 | B2 |
8078493 | Rosenberg et al. | Dec 2011 | B2 |
8085767 | Lussier et al. | Dec 2011 | B2 |
8087056 | Ryu | Dec 2011 | B2 |
8130823 | Gordon et al. | Mar 2012 | B2 |
8161159 | Shetty et al. | Apr 2012 | B1 |
8175041 | Shao et al. | May 2012 | B2 |
8176524 | Singh et al. | May 2012 | B2 |
8179983 | Gordon et al. | May 2012 | B2 |
8223268 | Fujiwara et al. | Jul 2012 | B2 |
8233539 | Kwon | Jul 2012 | B2 |
8265450 | Black et al. | Sep 2012 | B2 |
8307403 | Bradstreet et al. | Nov 2012 | B2 |
8316450 | Robinson et al. | Nov 2012 | B2 |
8494053 | He et al. | Jul 2013 | B2 |
8553776 | Shi et al. | Oct 2013 | B2 |
20020085637 | Henning | Jul 2002 | A1 |
20020140851 | Laksono | Oct 2002 | A1 |
20020152318 | Menon et al. | Oct 2002 | A1 |
20020157058 | Ariel et al. | Oct 2002 | A1 |
20020176604 | Shekhar et al. | Nov 2002 | A1 |
20020191072 | Henrikson | Dec 2002 | A1 |
20030012287 | Katsavounidis et al. | Jan 2003 | A1 |
20030014674 | Huffman et al. | Jan 2003 | A1 |
20030016630 | Vega-Garcia et al. | Jan 2003 | A1 |
20030061368 | Chaddha | Mar 2003 | A1 |
20030098992 | Park et al. | May 2003 | A1 |
20030226094 | Fukushima et al. | Dec 2003 | A1 |
20030229822 | Kim et al. | Dec 2003 | A1 |
20030229900 | Reisman | Dec 2003 | A1 |
20040071170 | Fukuda | Apr 2004 | A1 |
20040105004 | Rui et al. | Jun 2004 | A1 |
20040165585 | Imura et al. | Aug 2004 | A1 |
20040172252 | Aoki et al. | Sep 2004 | A1 |
20040172255 | Aoki et al. | Sep 2004 | A1 |
20040184444 | Aimoto et al. | Sep 2004 | A1 |
20040196902 | Faroudja | Oct 2004 | A1 |
20040233938 | Yamauchi | Nov 2004 | A1 |
20050033635 | Jeon | Feb 2005 | A1 |
20050041150 | Gewickey et al. | Feb 2005 | A1 |
20050060229 | Riedl et al. | Mar 2005 | A1 |
20050060742 | Riedl et al. | Mar 2005 | A1 |
20050060745 | Riedl et al. | Mar 2005 | A1 |
20050071781 | Atkins | Mar 2005 | A1 |
20050076272 | Delmas et al. | Apr 2005 | A1 |
20050091508 | Lee et al. | Apr 2005 | A1 |
20050117653 | Sankaran | Jun 2005 | A1 |
20050125734 | Mohammed et al. | Jun 2005 | A1 |
20050154965 | Ichiki et al. | Jul 2005 | A1 |
20050157793 | Ha et al. | Jul 2005 | A1 |
20050180415 | Cheung et al. | Aug 2005 | A1 |
20050185715 | Karczewicz et al. | Aug 2005 | A1 |
20050220188 | Wang | Oct 2005 | A1 |
20050251856 | Araujo et al. | Nov 2005 | A1 |
20050259729 | Sun | Nov 2005 | A1 |
20060013310 | Lee et al. | Jan 2006 | A1 |
20060039470 | Kim et al. | Feb 2006 | A1 |
20060066717 | Miceli | Mar 2006 | A1 |
20060140584 | Ellis et al. | Jun 2006 | A1 |
20060146940 | Gomila et al. | Jul 2006 | A1 |
20060150055 | Quinard et al. | Jul 2006 | A1 |
20060153217 | Chu et al. | Jul 2006 | A1 |
20060195864 | New et al. | Aug 2006 | A1 |
20060215014 | Cohen et al. | Sep 2006 | A1 |
20060215752 | Lee et al. | Sep 2006 | A1 |
20060247927 | Robbins et al. | Nov 2006 | A1 |
20060248563 | Lee et al. | Nov 2006 | A1 |
20060282774 | Covell et al. | Dec 2006 | A1 |
20060291475 | Cohen | Dec 2006 | A1 |
20070011702 | Vaysman | Jan 2007 | A1 |
20070036354 | Wee et al. | Feb 2007 | A1 |
20070064094 | Potekhin et al. | Mar 2007 | A1 |
20070080971 | Sung | Apr 2007 | A1 |
20070081522 | Apelbaum | Apr 2007 | A1 |
20070081587 | Raveendran et al. | Apr 2007 | A1 |
20070097257 | El-Maleh et al. | May 2007 | A1 |
20070121100 | Divo | May 2007 | A1 |
20070124762 | Chickering et al. | May 2007 | A1 |
20070168824 | Fukushima et al. | Jul 2007 | A1 |
20070195893 | Kim et al. | Aug 2007 | A1 |
20070223529 | Lee et al. | Sep 2007 | A1 |
20070237226 | Regunathan et al. | Oct 2007 | A1 |
20070237232 | Chang et al. | Oct 2007 | A1 |
20070250754 | Costa et al. | Oct 2007 | A1 |
20070268964 | Zhao | Nov 2007 | A1 |
20070285505 | Korneliussen | Dec 2007 | A1 |
20080004731 | Ozaki | Jan 2008 | A1 |
20080037624 | Walker et al. | Feb 2008 | A1 |
20080043832 | Barkley et al. | Feb 2008 | A1 |
20080052630 | Rosenbaum et al. | Feb 2008 | A1 |
20080072267 | Monta et al. | Mar 2008 | A1 |
20080077264 | Irvin et al. | Mar 2008 | A1 |
20080089414 | Wang et al. | Apr 2008 | A1 |
20080101403 | Michel et al. | May 2008 | A1 |
20080109369 | Su et al. | May 2008 | A1 |
20080109707 | Dell et al. | May 2008 | A1 |
20080126278 | Bronstein et al. | May 2008 | A1 |
20080134005 | Izzat et al. | Jun 2008 | A1 |
20080144553 | Shao et al. | Jun 2008 | A1 |
20080178211 | Lillo et al. | Jul 2008 | A1 |
20080207182 | Maharajh et al. | Aug 2008 | A1 |
20080209300 | Fukushima et al. | Aug 2008 | A1 |
20080250294 | Ngo et al. | Oct 2008 | A1 |
20080260042 | Shah et al. | Oct 2008 | A1 |
20080270528 | Girardeau et al. | Oct 2008 | A1 |
20080273591 | Brooks et al. | Nov 2008 | A1 |
20080320512 | Knight | Dec 2008 | A1 |
20090006927 | Sayadi et al. | Jan 2009 | A1 |
20090007159 | Rangarajan et al. | Jan 2009 | A1 |
20090010325 | Nie et al. | Jan 2009 | A1 |
20090013086 | Greenbaum | Jan 2009 | A1 |
20090022157 | Rumbaugh et al. | Jan 2009 | A1 |
20090031390 | Rajakarunanayake et al. | Jan 2009 | A1 |
20090059067 | Takanohashi et al. | Mar 2009 | A1 |
20090059917 | Lussier et al. | Mar 2009 | A1 |
20090080510 | Wiegand et al. | Mar 2009 | A1 |
20090103635 | Pahalawatta | Apr 2009 | A1 |
20090122867 | Mauchly et al. | May 2009 | A1 |
20090125812 | Blinnikka et al. | May 2009 | A1 |
20090138784 | Tamura et al. | May 2009 | A1 |
20090144417 | Kisel et al. | Jun 2009 | A1 |
20090161763 | Rossignol et al. | Jun 2009 | A1 |
20090180537 | Park et al. | Jul 2009 | A1 |
20090187862 | DaCosta | Jul 2009 | A1 |
20090238277 | Meehan | Sep 2009 | A1 |
20090241147 | Kim et al. | Sep 2009 | A1 |
20090245351 | Watanabe | Oct 2009 | A1 |
20090249158 | Noh et al. | Oct 2009 | A1 |
20090254657 | Melnyk et al. | Oct 2009 | A1 |
20090268819 | Nishida | Oct 2009 | A1 |
20090276686 | Liu et al. | Nov 2009 | A1 |
20090276817 | Colter et al. | Nov 2009 | A1 |
20090307227 | Prestenback et al. | Dec 2009 | A1 |
20090322854 | Ellner | Dec 2009 | A1 |
20100040349 | Landy | Feb 2010 | A1 |
20100054333 | Bing et al. | Mar 2010 | A1 |
20100077058 | Messer | Mar 2010 | A1 |
20100122127 | Oliva et al. | May 2010 | A1 |
20100149301 | Lee et al. | Jun 2010 | A1 |
20100153828 | De Lind Van Wijngaarden et al. | Jun 2010 | A1 |
20100171882 | Cho et al. | Jul 2010 | A1 |
20100186041 | Chu et al. | Jul 2010 | A1 |
20100192078 | Hwang et al. | Jul 2010 | A1 |
20100202414 | Malladi et al. | Aug 2010 | A1 |
20100220172 | Michaelis | Sep 2010 | A1 |
20100235820 | Khouzam et al. | Sep 2010 | A1 |
20100293470 | Zhao et al. | Nov 2010 | A1 |
20100306618 | Kim et al. | Dec 2010 | A1 |
20100309372 | Zhong | Dec 2010 | A1 |
20100309982 | Le Floch et al. | Dec 2010 | A1 |
20110033125 | Shiraishi | Feb 2011 | A1 |
20110047163 | Chechik et al. | Feb 2011 | A1 |
20110069890 | Besley | Mar 2011 | A1 |
20110093273 | Lee et al. | Apr 2011 | A1 |
20110103480 | Dane | May 2011 | A1 |
20110131144 | Ashour et al. | Jun 2011 | A1 |
20110158529 | Malik | Jun 2011 | A1 |
20110191374 | Bengio et al. | Aug 2011 | A1 |
20110194605 | Amon et al. | Aug 2011 | A1 |
20110218439 | Masui et al. | Sep 2011 | A1 |
20110225417 | Maharajh et al. | Sep 2011 | A1 |
20110265136 | Liwerant et al. | Oct 2011 | A1 |
20120013705 | Taylor et al. | Jan 2012 | A1 |
20120072960 | Rosenberg et al. | Mar 2012 | A1 |
20120084821 | Rogers | Apr 2012 | A1 |
20120110443 | Lemonik et al. | May 2012 | A1 |
20120206562 | Yang et al. | Aug 2012 | A1 |
20120232681 | Mundy et al. | Sep 2012 | A1 |
20120246343 | Story, Jr. et al. | Sep 2012 | A1 |
20120287999 | Li et al. | Nov 2012 | A1 |
20120315008 | Dixon et al. | Dec 2012 | A1 |
20120324324 | Hwang et al. | Dec 2012 | A1 |
20130031441 | Ngo et al. | Jan 2013 | A1 |
20130198617 | Maloney et al. | Aug 2013 | A1 |
Number | Date | Country |
---|---|---|
1777969 | Apr 2007 | EP |
0715711 | Jan 1995 | JP |
2008146057 | Jun 2008 | JP |
2008225379 | Sep 2008 | JP |
WO0249356 | Jun 2002 | WO |
WO2007057850 | May 2007 | WO |
WO2008006062 | Jan 2008 | WO |
Entry |
---|
Screen shot of website dated May 2011: www.cbs.com/primtime/60_minutes/video/?pid=Hwiua1litcOuuHiAYN. |
Chen, Yu, et al., “An Error Concealment Algorithm for Entire Frame Loss in Video Transmission,” Picture Coding Symposium, 2004. |
European Search Report for European Patent Application No. 08146463.1 dated Jun. 23, 2009. |
Feng, Wu-chi; Rexford, Jennifer; “A Comparison of Bandwidth Smoothing Techniques for the Transmission of Prerecorded Compressed Video”, Paper, 1992, 22 pages. |
Friedman, et al., “RTP: Control Protocol Extended Reports (RTPC XR),” Network Working Group RFC 3611 (The Internet Society 2003) (52 pp). |
Frossard, Pascal; “Joint Source/FEC Rate Selection for Quality-Optimal MPEG-2 Video Delivery”, IEEE Transactions on Image Processing, vol. 10, No. 12, (Dec. 2001) pp. 1815-1825. |
Hartikainen, E. and Ekelin, S. Tuning the Temporal Characteristics of a Kalman-Filter Method for End-to-End Bandwidth Estimation. IEEE E2EMON. Apr. 3, 2006. |
High efficiency video coding (HEVC) text specification draft 6, JCTVC-H1003, JCT-VC 7th meeting, Geneva, Switzerland, Nov. 21-30, 2011. |
International Search Report and Written Opinion Dated Aug. 13, 2012, in PCT/US2012/034426. |
International Search Report and Written Opinion for International Application No. PCT/US2011/051818 dated Nov. 21, 2011 (16 pages). |
International Search Report for International Application No. PCT/EP2009/057252 mailed on Aug. 27, 2009. |
JongWon Kim, Young-Gook Kim, HwangJun Song, Tien-Ying Kuo, Yon Jun Chung, and C.-C. Jay Kuo; “TCP-friendly Internet Video Streaming employing Variable Frame-rate Encoding and Interpolation”; IEEE Trans. Circuits Syst. Video Technology, Jan. 2000; vol. 10, pp. 1164-1177. |
Khronos Group Inc. OpenMAX Integration Layer Application Programming Interface Specification. Dec. 16, 2005, 326 pages, Version 1.0. |
Korhonen, Jari; Frossard, Pascal; “Flexible forward error correction codes with application to partial media data recovery”, Signal Processing: Image Communication vol. 24, No. 3 (Mar. 2009) pp. 229-242. |
Li, A., “RTP Payload Format for Generic Forward Error Correction”, Network Working Group, Standards Track, Dec. 2007, (45 pp). |
Liang, Y.J.; Apostolopoulos, J.G.; Girod, B., “Analysis of packet loss for compressed video: does burst-length matter?,” Acoustics, Speech and Signal Processing, 2003. Proceedings. (ICASSP '03). 2003 IEEE International conference on, vol. 5, No., pp. V, 684-7 vol. 5, Apr. 6-10, 2003. |
Murat A. Tekalp, “Block-Based Methods,” Digital Video Processing, Prentice Hall Signal Processing Series, Aug. 12, 1995, pp. 98-116, Prentice Hall PTR. |
Neogi, A., et al., Compression Techniques for Active Video Content; State University of New York at Stony Brook; Computer Science Department; pp. 1-11. |
ON2 Technologies Inc., White Paper TrueMotion VP7 Video Codec, Jan. 10, 2005, 13 pages, Document Version: 1.0, Clifton Park, New York. |
ON2 Technologies, Inc., White Paper On2's TrueMotion VP7 Video Codec, Jul. 11, 2008, 7 pages, Document Version: 1.0, Clifton Park, New York. |
Park, Jun Sung, et al., “Selective Intra Prediction Mode Decision for H.264/AVC Encoders”, World Academy of Science, Engineering and Technology 13, (2006). |
Peng, Qiang, et al., “Block-Based Temporal Error Concealment for Video Packet Using Motion Vector Extrapolation,” IEEE 2003 Conference of Communications, Circuits and Systems and West Sino Expositions, vol. 1, No. 29, pp. 10-14 (IEEE 2002). |
Roca, Vincent, et al., Design and Evaluation of a Low Density Generator Matrix (LDGM) Large Block FEC Codec, INRIA Rhone-Alpes, Planete project, France, Date Unknown, (12 pp). |
Rosenberg, J. D. RTCWEB I-D with thoughts on the framework. Feb. 8, 2011. Retrieved from http://www.ietf.org/mail-archive/web/dispatch/current/msg03383.html on Aug. 1, 2011. |
Rosenberg, J.D., et al. An Architectural Framework for Browser based Real-Time Communications (RTC) draft-rosenberg-rtcweb-framework-00. Feb. 8, 2011. Retrieved from http://www.ietf.org/id/draft-rosenberg-rtcweb-framework-00.txt on Aug. 1, 2011. |
Scalable Video Coding, SVC, Annex G extension of H.264. |
Steliaros, Michael K., et al.; “Locally-accurate motion estimation for object-based video coding”, SPIE vol. 3309, 1997, 11 pp. |
Stiller, Christoph; “Motion-Estimation for Coding of Moving Video at 8 kbit/s with Gibbs Modeled Vectorfield Smoothing”, SPIE vol. 1360 Visual Communications and Image Processing 1990, 9 pp. |
Strobach, Peter; “Tree-Structured Scene Adaptive Coder”, IEEE Transactions on Communications, vol. 38, No. 4, Apr. 1990, 10 pp. |
Wikipedia, the free encyclopedia, “Low-density parity-check code”, http://en.wikipedia.org/wiki/Low-density_parity-check_code, Jul. 30, 2012 (5 pp). |
Yan, Bo and Gharavi, Hamid, “A Hybrid Frame Concealment Algorithm for H.264/AVC,” IEEE Transactions on Image Processing, vol. 19, No. 1, pp. 98-107 (IEEE, Jan. 2010). |
Yoo, S. J.B., “Optical Packet and burst Switching Technologies for the Future Photonic Internet,” Lightwave Technology, Journal of, vol. 24, No. 12, pp. 4468, 4492, Dec. 2006. |
Yu, Xunqi, et al; “The Accuracy of Markov Chain Models in Predicting Packet-Loss Statistics for a Single Multiplexer”, IEEE Transactions on Information Theory, vol. 54, No. 1 (Jan. 2008) pp. 489-501. |
“Series H: Audiovisual and Multimedia Systems; Infrastructure of audiovisual services—Coding of moving video; Advanced video coding for generic audiovisual services”. H.264. Version 1. International Telecommunication Union. Dated May 2003. |
“Series H: Audiovisual and Multimedia Systems; Infrastructure of audiovisual services—Coding of moving video; Advanced video coding for generic audiovisual services”. H.264. Version 3. International Telecommunication Union. Dated Mar. 2005. |
“Overview; VP7 Data Format and Decoder”. Version 1.5. On2 Technologies, Inc. Dated Mar. 28, 2005. |
“Series H: Audiovisual and Multimedia Systems; Infrastructure of audiovisual services—Coding of moving video; Advanced video coding for generic audiovisual services”. H.264. Amendment 1: Support of additional colour spaces and removal of the High 4:4:4 Profile. International Telecommunication Union. Dated Jun. 2006. |
“VP6 Bitstream & Decoder Specification”. Version 1.02. On2 Technologies, Inc. Dated Aug. 17, 2006. |
“Series H: Audiovisual and Multimedia Systems; Infrastructure of audiovisual services—Coding of moving video”. H.264. Amendment 2: New profiles for professional applications. International Telecommunication Union. Dated Apr. 2007. |
“VP6 Bitstream & Decoder Specification”. Version 1.03. On2 Technologies, Inc. Dated Oct. 29, 2007. |
“Series H: Audiovisual and Multimedia Systems; Infrastructure of audiovisual services—Coding of moving video”. H.264. Advanced video coding for generic audiovisual services. Version 8. International Telecommunication Union. Dated Nov. 1, 2007. |
“Series H: Audiovisual and Multimedia Systems; Infrastructure of audiovisual services—Coding of moving video”. H.264. Advanced video coding for generic audiovisual services. International Telecommunication Union. Version 11. Dated Mar. 2009. |
“Series H: Audiovisual and Multimedia Systems; Infrastructure of audiovisual services—Coding of moving video”. H.264. Advanced video coding for generic audiovisual services. International Telecommunication Union. Version 12. Dated Mar. 2010. |
“Implementors' Guide; Series H: Audiovisual and Multimedia Systems; Coding of moving video: Implementors Guide for H.264: Advanced video coding for generic audiovisual services”. H.264. International Telecommunication Union. Version 12. Dated Jul. 30, 2010. |
“VP8 Data Format and Decoding Guide”. WebM Project. Google On2. Dated: Dec. 1, 2010. |
Bankoski et al. “VP8 Data Format and Decoding Guide; draft-bankoski-vp8-bitstream-02” Network Working Group. Dated May 18, 2011. |
Bankoski et al. “Technical Overview of VP8, an Open Source Video Codec for the Web”. Dated Jul. 11, 2011. |
Bankoski, J., Koleszar, J., Quillio, L., Salonen, J., Wilkins, P., and Y. Xu, “VP8 Data Format and Decoding Guide”, RFC 6386, Nov. 2011. |
Mozilla, “Introduction to Video Coding Part 1: Transform Coding”, Video Compression Overview, Mar. 2012, 171 pp. |
Chae-Eun Rhee et al. (A Real-Time H.264/AVC Encoder with Complexity-Aware Time Allocation, Circuits and Systems for Video Technology, IEEE Transactions on, vol. 20, No. 12, pp. 1848-1862, Dec. 2010). |
Giachetti (Matching techniques to compute image motion, Image and Vision Computing, vol. 18, No. 3, Feb. 2000, pp. 247-260). |
Screen shot of website dated Oct. 14, 2011: www.abc.go.com/watch/2020/SH559026/VD55148316/2020. |