For a more complete understanding of the present invention and its advantages, reference is made to the following description taken in conjunction with the accompanying drawings in which:
Table 12 and chairs 14 may be included in conference room 10 to provide a user with a more comfortable environment. Microphones 16 may be used to convert sound in conference room 10, e.g., a user's voice, into a digital audio signal for transmission to a remote site or sites. Also, loudspeakers 18 may be used to convert digital audio signals received from the remote site or sites into sound in conference room 10. While illustrated as having a particular configuration, conference room 10 may include any suitable number and arrangement of table 12, chairs 14, microphones 16, and loudspeakers 18 in any appropriate location.
Remote sites, not illustrated, may be any suitable elements communicating with conference room 10 through a telephone call, a video conference, or any other suitable communication. As used herein, “remote site” includes any equipment used to participate in a video communication session, including elements that encode, compress, and/or otherwise process signals to be transmitted to conference room 10. In particular embodiments, the remote site may be a room similar to conference room 10. In other embodiments, the remote site may employ a videoconferencing phone, a phone operable to transmit data signals with audio signals, an audio-only phone, or any other suitable device. In some embodiments, the remote site may be located a significant distance from conference room 10. However, while described as “remote,” the remote site may be in any appropriate location, including in the same building as conference room 10.
In the illustrated embodiment, conference room 10 also includes user interface 20. User interface 20 may receive user input and provide a user with information regarding the operation of elements in conference room 10. For example, user interface 20 may receive user input to initiate a communication with a particular remote site, to place the communication on hold, to conference third parties, and to terminate the communication. In some embodiments, user interface 20 receives information regarding the communication from conference coordination module 24. For example, conference coordination module 24 may identify remote sites, the status of communications, call history for conference room 10, etc. In the illustrated embodiment, user interface 20 relays audio signals received from microphones 16. In particular embodiments, user interface 20 allows a user to interact with the functionality of conference coordination module 24, described more fully below. In general, user interface 20 may operate in any appropriate manner to facilitate the initiation, execution, maintenance, and termination of communications in conference room 10.
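The call-control operations described above can be pictured as a small interface. The C sketch below is purely illustrative; the structure and function names (call_controls, initiate_call, and so on) are assumptions and are not part of the described system.

```c
/* Illustrative sketch only: a minimal call-control interface of the kind
 * user interface 20 might expose. All names here are hypothetical. */
#include <stdbool.h>

typedef struct {
    int  (*initiate_call)(const char *remote_site_id);    /* start a communication    */
    bool (*hold_call)(int call_id);                        /* place it on hold         */
    bool (*add_party)(int call_id, const char *party_id);  /* conference a third party */
    bool (*terminate_call)(int call_id);                   /* end the communication    */
} call_controls;

int main(void)
{
    call_controls controls = {0};  /* a real system would wire these to module 24 */
    (void)controls;
    return 0;
}
```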
As illustrated, conference room 10 also includes display device 22. In general, display device 22 displays signals received from one or more remote sites. In particular embodiments, display device 22 may display a digital video signal, for example, by displaying an image or a real-time video feed of remote users who are participating in the video conference. In some embodiments, conference coordination module 24 receives a digital video signal from the remote site and relays the signal to display device 22. While described as displaying a digital video signal, display device 22 may also display digital data or other similar signals. In certain embodiments, display device 22 may also present other types of signals, such as digital audio signals. Display device 22 may present digital audio signals through a device similar to loudspeakers 18 or any other appropriate device. In particular embodiments, conference coordination module 24 may perform signal processing on digital signals received from the remote sites; however, in other embodiments, some or all signal processing may be performed by display device 22. In some embodiments, display device 22 comprises a plasma screen television or similar plasma display. In other embodiments, display device 22 comprises a liquid crystal display (LCD), organic light emitting diode (OLED) display, field emission display (FED), or any other suitable display device.
Display device 22 may reduce the overall latency of digital signals by minimizing the latency that display device 22 itself adds to those signals. In particular embodiments, digital signals are sent from one or more remote sites to conference coordination module 24. These digital signals typically include multiple frames, each formed of a plurality of pixels, and may have an associated frame rate. After the digital signals are received, conference coordination module 24 may perform signal processing, such as decompression, decoding, error correction, etc., and then forward the digital signals to display device 22. Display device 22 may receive the digital signals and store pixels of each frame in a frame buffer prior to display. After a certain number of pixels of a frame are stored in the frame buffer, display device 22 may forward the pixels of the frame to a display module for display by display device 22. In some embodiments, the frame rate determines the number of pixels of a frame that should be stored in the frame buffer before the pixels are forwarded to the display module. In particular embodiments, the number of pixels stored in the frame buffer before beginning to transmit the pixels to the display module is at or near the minimum number of pixels allowable without causing errors in the display of the frame. The display module may thus begin to process a frame for display before the frame buffer has received all pixels corresponding to that frame.
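A rough, worked example helps quantify the benefit. The figures below (a 30 Hz input and a pixel delay of one quarter of a frame) are assumptions chosen only for illustration; the actual pixel delay would come from the frame-rate-dependent minimum described above.

```c
/* Back-of-the-envelope sketch: compare waiting for a full frame against
 * starting display once an assumed pixel delay (25% of a frame) is buffered. */
#include <stdio.h>

int main(void)
{
    const double fps        = 30.0;             /* assumed input frame rate     */
    const double frame_time = 1.0 / fps;        /* time for one frame to arrive */
    const double delay_frac = 0.25;             /* assumed pixel delay fraction */

    double wait_full  = frame_time;              /* buffer the entire frame     */
    double wait_early = frame_time * delay_frac; /* buffer only the pixel delay */

    printf("full-frame buffering adds %.1f ms of latency\n", wait_full * 1000.0);
    printf("early start adds only     %.1f ms of latency\n", wait_early * 1000.0);
    printf("latency avoided:          %.1f ms per frame\n",
           (wait_full - wait_early) * 1000.0);
    return 0;
}
```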
In the illustrated embodiment, display device 22 is located in conference room 10; however, it is understood that display device 22 may be located or implemented in any environment in which a low-latency display device may be suitable. While display device 22 is described as having a particular configuration and functionality, display device 22 may be any device operable to display a digital signal while reducing the latency introduced by that device. Display device 22 may reduce latency in telephone or videoconferencing communications, or any other appropriate environment.
Conference coordination module 24 coordinates the functions of the various devices and elements within conference room 10. In particular embodiments, conference coordination module 24 is responsible for initiating and terminating communications between conference room 10 and remote sites. In some embodiments, conference coordination module 24 receives audio, video, and/or data signals from remote sites and forwards them to loudspeakers 18, user interface 20, and/or display device 22. Conference coordination module 24 may also receive audio, video, and/or data signals from users of conference room 10 or other devices in conference room 10, such as microphones 16 and user interface 20. Conference coordination module 24 may then forward these received audio, video, and/or data signals to remote sites. In certain embodiments, conference coordination module 24 interacts with other devices or elements, either located proximately or remotely, to facilitate communications in conference room 10. Conference coordination module 24 may be responsible for processing audio, video, and/or data signals sent by or to remote sites.
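The forwarding role of conference coordination module 24 can be summarized as a fan-out of incoming signal types to local devices. The mapping below is only a sketch; which signal type goes to which device (in particular, data to user interface 20) is an assumption made for illustration.

```c
/* Sketch of the fan-out described above: incoming remote-site signals are
 * routed to local devices by type. The exact mapping is assumed.           */
#include <stdio.h>

typedef enum { SIG_AUDIO, SIG_VIDEO, SIG_DATA } signal_kind;

static void route_from_remote(signal_kind kind)
{
    switch (kind) {
    case SIG_AUDIO: puts("forward to loudspeakers 18");   break;
    case SIG_VIDEO: puts("forward to display device 22"); break;
    case SIG_DATA:  puts("forward to user interface 20"); break;
    }
}

int main(void)
{
    route_from_remote(SIG_VIDEO);   /* e.g., an incoming video frame */
    return 0;
}
```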
Particular embodiments of a system for reducing the latency of a display device have been described and are not intended to be all-inclusive. While conference room 10 is depicted as containing a certain configuration and arrangement of elements and devices, it should be noted that this is merely an example arrangement of elements and devices. User interface 20, display device 22, and conference coordination module 24 represent logical depictions of elements and devices employing particular functionality. In general, the components and functionality of conference room 10 may be provided by any suitable collection and arrangement of components and may be combined, separated, distributed, or replaced as appropriate both logically and physically. The functions performed by the various components of conference room 10 may be accomplished by any suitable devices to reduce the latency added to a digital signal by a display device. Additionally, while digital signals are described, other embodiments may provide reduced latency for a display device receiving analog signals and converting those signals into digital-based signals or data.
Input module 30 receives signals from an external source. In particular embodiments, input module 30 receives digital signals from conference coordination module 24. The digital signals may encode pixels corresponding to one or more frames. In particular embodiments, the digital signals contain digital video frames. In some embodiments, input module 30 formats the received digital signals. Formatting may include decompressing, decoding, reformatting, or otherwise processing the signals. In certain embodiments, input module 30 receives digital signals, performs any required signal processing, and forwards frames to controller 36 and then to frame buffer 32 for storage. Input module 30 may be composed of any appropriate logic, software, or hardware for receiving digital signals.
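As a sketch of the input-module stage, the stub below accepts a coded buffer and produces pixel data for downstream buffering. The "decoding" is a placeholder copy; a real input module would run whatever decompression, decoding, or other processing the received format requires, as noted above.

```c
/* Placeholder sketch of input module 30's role: take a coded buffer and
 * produce decoded pixel bytes to hand toward the controller/frame buffer. */
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Stand-in "decode": a real implementation would run the codec here. */
static size_t decode_frame(const uint8_t *coded, size_t coded_len,
                           uint8_t *pixels, size_t pixels_cap)
{
    size_t n = coded_len < pixels_cap ? coded_len : pixels_cap;
    memcpy(pixels, coded, n);
    return n;                      /* decoded pixel bytes produced */
}

int main(void)
{
    uint8_t coded[256] = {0};
    uint8_t pixels[256];
    size_t produced = decode_frame(coded, sizeof coded, pixels, sizeof pixels);
    (void)produced;                /* would be forwarded toward frame buffer 32 */
    return 0;
}
```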
Frame buffer 32 stores the pixels corresponding to frames to be displayed by display device 22. In the illustrated embodiment, frame buffer 32 receives the pixels from controller 36, which receives the signal from input module 30. In other embodiments, frame buffer 32 receives data directly from input module 30. After receiving data from controller 36, frame buffer 32 may store pixels until instructed to send the pixels to display module 34. Once instructed to transmit pixels, frame buffer 32 may begin to send pixels to display module 34. In the illustrated embodiment, frame buffer 32 sends the pixels to display module 34 through controller 36. In certain embodiments, frame buffer 32 receives the pixels to be stored after input module 30 or controller 36 extracts frames of pixels from the received digital signals. In other embodiments, the digital signals may simply comprise a plurality of frames. In certain embodiments, frame buffer 32 is operable to store multiple frames of pixels. In other embodiments, frame buffer 32 may store only pixels corresponding to a portion of a frame. Frame buffer 32 may receive pixels from input module 30 at a rate and in a format different from those at which it forwards pixels to display module 34. While described as a buffer, frame buffer 32 may employ any appropriate logic, software, hardware, etc. to store frame data received by display device 22 for display by display module 34.
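The hold-then-stream behavior of frame buffer 32 can be sketched as a buffer that accumulates pixels and releases them only when told to. The structure and field names below are hypothetical.

```c
/* Hypothetical sketch of frame buffer 32: store pixels, and drain them only
 * after the controller signals that the pixel delay has been reached.       */
#include <stdint.h>
#include <stdio.h>

#define BUF_CAP 4096

typedef struct {
    uint32_t pixels[BUF_CAP];
    unsigned count;      /* pixels currently stored                  */
    unsigned read_pos;   /* next pixel to hand toward the display    */
    int      released;   /* set by the controller at the pixel delay */
} frame_buffer;

static void fb_store(frame_buffer *fb, uint32_t px)
{
    if (fb->count < BUF_CAP)
        fb->pixels[fb->count++] = px;
}

static int fb_next(frame_buffer *fb, uint32_t *out)
{
    if (!fb->released || fb->read_pos >= fb->count)
        return 0;                           /* nothing may be sent yet */
    *out = fb->pixels[fb->read_pos++];
    return 1;
}

int main(void)
{
    frame_buffer fb = {0};
    for (uint32_t i = 0; i < 100; i++)
        fb_store(&fb, i);                   /* pixels arrive from the input path */
    fb.released = 1;                        /* controller: start draining        */

    uint32_t px;
    while (fb_next(&fb, &px))
        ;                                   /* pixels would go to display module 34 */
    printf("drained %u pixels\n", fb.read_pos);
    return 0;
}
```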
Display module 34 displays the digital signal received by display device 22. Display module 34 may display pixels received from controller 36 and frame buffer 32. In certain embodiments, controller 36 receives pixels from frame buffer 32 and forwards them to display module 34. In other embodiments, frame buffer 32 sends the pixels directly to display module 34. In particular embodiments, display module 34 is a plasma display panel module. In other embodiments, display module 34 includes a liquid crystal display (LCD), organic light emitting diode (OLED) display, field emission display (FED), or any other suitable display device. In certain embodiments, display module 34 must receive pixels in a particular format; for example, display module 34 may require pixels to be received at a 60 Hz frame rate with blanking between rows of pixels within a frame. In some embodiments, input module 30, controller 36, and/or frame buffer 32 format pixels in the manner appropriate for display by display module 34. In general, display module 34 may be any device that is operable to display digital signals received by display device 22.
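The fixed timing a display module imposes can be illustrated with a rough pixel-clock calculation. The resolution and per-row blanking figures below are assumptions, and vertical blanking is ignored; only the 60 Hz frame rate comes from the description above.

```c
/* Rough sketch: the pixel clock implied by a 60 Hz output with horizontal
 * blanking between rows. Resolution and blanking values are assumed, and
 * vertical blanking is ignored for simplicity.                             */
#include <stdio.h>

int main(void)
{
    const double refresh_hz = 60.0;    /* required output frame rate    */
    const int    rows       = 720;     /* assumed active rows per frame */
    const int    active_px  = 1280;    /* assumed pixels per row        */
    const int    blank_px   = 370;     /* assumed blanking per row      */

    double pixel_clock = refresh_hz * rows * (active_px + blank_px);
    printf("approximate pixel clock: %.1f MHz\n", pixel_clock / 1e6);
    return 0;
}
```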
As illustrated, display device 22 also includes controller 36, memory 38, and minimum pixel table 40. Generally, controller 36 controls the operation of the components within display device 22. In some embodiments, controller 36 accesses memory 38, specifically minimum pixel table 40, to reduce the latency introduced by display device 22. Controller 36 may also receive digital signals from input module 30 and determine the frame rate of the received digital signals. Controller 36 may also access frame buffer 32 to determine the number of pixels stored in frame buffer 32. In particular embodiments, controller 36 initiates transmittal of pixels of a frame from frame buffer 32 to display module 34 when frame buffer 32 stores a number of pixels equal to the pixel delay, which is partially determined by the input frame rate. In certain embodiments, controller 36 is operable to format the pixels and/or the digital signal in any appropriate manner. In general, controller 36 may be operable to transmit, receive, and/or format data in any appropriate manner. Memory 38 stores minimum pixel table 40, which may indicate the pixel delay corresponding to various possible frame rates of a received digital signal. Also, memory 38 may include any additional hardware, software, firmware, or any other programming or files necessary for the operation of display device 22. While memory 38 is depicted as an element separate from controller 36, it should be understood that memory 38 and controller 36 may have any appropriate configuration and arrangement.
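Minimum pixel table 40 can be pictured as a simple lookup from input frame rate to pixel delay. The entries below are invented for illustration; the description does not specify actual values or which frame rates are supported.

```c
/* Sketch of minimum pixel table 40: map an input frame rate to the pixel
 * delay the controller should apply. All values here are invented.        */
#include <stddef.h>
#include <stdio.h>

struct min_pixel_entry {
    unsigned frame_rate_hz;   /* frame rate of the received digital signal */
    unsigned pixel_delay;     /* pixels to buffer before display begins    */
};

static const struct min_pixel_entry min_pixel_table[] = {
    { 24, 460800 },
    { 30, 345600 },
    { 60, 115200 },
};

/* Return the pixel delay for a frame rate; fall back to the largest
 * (most conservative) entry when the rate is not listed.              */
static unsigned lookup_pixel_delay(unsigned frame_rate_hz)
{
    for (size_t i = 0; i < sizeof min_pixel_table / sizeof *min_pixel_table; i++)
        if (min_pixel_table[i].frame_rate_hz == frame_rate_hz)
            return min_pixel_table[i].pixel_delay;
    return min_pixel_table[0].pixel_delay;
}

int main(void)
{
    printf("pixel delay at 30 Hz: %u pixels\n", lookup_pixel_delay(30));
    return 0;
}
```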
In operation of the illustrated embodiment, input module 30 receives digital signals encoding frame information from a remote site through conference coordination module 24. Controller 36 receives the signals from input module 30 and determines the frame rate of the digital signals. After determining the frame rate, controller 36 accesses minimum pixel table 40 in memory 38 to determine the pixel delay. In some embodiments, the pixel delay is selected to allow display device 22 to introduce the least amount of latency feasible. Meanwhile, input module 30 may perform signal processing on the digital signals. This processing may include decoding, decompressing, gamma correction or other forms of pixel value adjustment, resolution scaling, refresh rate scaling, image layering (e.g., adding text or graphics), extracting pixel information from the signals to obtain frames for display, and/or other image processing functions. After any processing, controller 36 forwards the frames to frame buffer 32. Frame buffer 32 stores the received pixels corresponding to a frame until the number of pixels stored equals the pixel delay. Controller 36 may determine when the pixel delay is reached by polling frame buffer 32, setting an interrupt, informing frame buffer 32 of the pixel delay, or by any other suitable method. When the pixel delay is reached, controller 36 begins to move the frame from frame buffer 32 to display module 34 for display. Frame buffer 32 and/or controller 36 may format the frame for transmission to display module 34 in any appropriate manner. In particular embodiments, controller 36 formats the frame by transmitting the frame to display module 34 at an appropriate frequency or by introducing blanking.
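Putting the steps of this paragraph together, the loop below sketches the control flow: detect the frame rate, look up the pixel delay, accumulate pixels, and start moving the frame to the display once the delay is reached. The helper functions and numeric values are stand-ins, not part of the described system.

```c
/* End-to-end sketch of the operation described above. The helpers stand in
 * for input module 30, minimum pixel table 40, and frame buffer 32.        */
#include <stdio.h>

static unsigned detect_frame_rate(void)         { return 30; }     /* assumed  */
static unsigned lookup_pixel_delay(unsigned hz) { return hz == 30 ? 345600 : 115200; }
static unsigned receive_pixels(void)            { return 4096; }   /* per chunk */

int main(void)
{
    const unsigned frame_pixels = 1280 * 720;            /* assumed frame size */
    unsigned rate    = detect_frame_rate();              /* controller 36      */
    unsigned delay   = lookup_pixel_delay(rate);         /* table 40 lookup    */
    unsigned stored  = 0;
    int      sending = 0;

    while (stored < frame_pixels) {
        stored += receive_pixels();                      /* frame buffer fills */
        if (!sending && stored >= delay) {
            sending = 1;                                 /* pixel delay reached */
            printf("begin sending to the display module after %u pixels\n", stored);
        }
        /* once sending, buffered pixels would stream to display module 34 */
    }
    return 0;
}
```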
While display device 22 is depicted as one element containing a particular configuration and arrangement of components, it should be noted that this is a logical depiction and the components and functionality of display device 22 may be provided by any suitable collection and arrangement of components. For example, the positioning and functions of controller 36 may be modified as appropriate. The functions performed by the various components of display device 22 may be accomplished by any suitable devices to reduce the latency added to a digital signal by display device 22.
While minimum pixel table 40 is depicted as having a particular configuration and arrangement of data stored in a particular way, it is understood that this is merely a logical depiction. The functionality of minimum pixel table 40 may be provided by any suitable storage devices and may include any suitable factors for determining the pixel delay to be employed by display device 22.
The method described with respect to
Although the present invention has been described in several embodiments, a myriad of changes and modifications may be suggested to one skilled in the art, and it is intended that the present invention encompass such changes and modifications as fall within the scope of the appended claims.
This application claims priority to U.S. Patent Application Ser. No. 60/794,016, entitled “VIDEOCONFERENCING SYSTEM,” which was filed on Apr. 20, 2006.
Number | Date | Country
---|---|---
60/794,016 | Apr. 20, 2006 | US