This invention relates generally to digital video and, more particularly, to a multi-format digital video production system capable of maintaining full-bandwidth resolution while providing professional quality editing and manipulation of images for various applications, including digital HDTV and specialized video monitoring.
Traditional systems for video production rely on uncompressed video signals (for example, SMPTE 4:4:4 or 4:2:2), standard compressed MPEG-2 4:2:2P@ML signals, or other signals that have undergone only minimal compression, such as the (approximately) 5:1 compression utilized by Panasonic's DVCPRO and Sony's DVCAM equipment. However, the bandwidth required for these high-quality signals is still too great for many broadcast and industrial applications, particularly those that require the level of detail available in HDTV images.
Due to the high-bandwidth demands of high-quality signals, typical distribution systems utilize only the highest quality levels for the head-end equipment and the first part of the signal distribution chain. Furthermore, because of network traffic due to multiple users (as for example, in a cable television distribution system), the last leg of the signal path utilizes a more highly compressed signal, to maximize the usage of the available bandwidth. In most cases, this requires that the original signal be decompressed, and then re-compressed at a much higher compression ratio, so that less bandwidth is required for the final portion of the path.
Accordingly, the need remains for an approach to video production and monitoring which allows the levels of quality that users have come to expect at their receiving terminals, while utilizing existing broadband media and other conventional technologies to optimize the signal storage, processing, and transmission path performance.
This invention resides in a multi-format digital video production system capable of maintaining the full-bandwidth resolution of the subject material, while providing professional quality editing and manipulation of images intended for digital television and for other applications, including digital HDTV programs and specialized video monitoring applications.
Broadly, this invention allows emerging broadband video transmission media, including Internet broadcast schemes, to overcome existing technology limitations. In the preferred embodiment, for example, the approach facilitates high-quality/large-screen video production and monitoring through the use of conventional broadband channels, including those which currently only exhibit bandwidths on the order of 4 Mbps. In more specific examples, in formats utilizing a 24 fps progressive scan multi-format system, direct streaming is made possible from HDTV (16:9) high-quality data, thereby expanding market applications which require these higher levels of resolution, bits per pixel, and so forth.
This system, now known as the “Direct Stream Cinema System,” is based on optimizing the entire signal path, utilizing 4:2:2 color processing and bit rates typically in the range of 2-6 Mbps. It begins with digitizing and compressing the output of the optical pickup and graphics processor (including appropriate processing, such as noise reduction and resolution enhancement) and carries through the processing circuitry to the receiving terminal device at the user end of the transmission chain. Signal quality is preserved throughout the process, by eliminating the need to decompress a lower-compression signal from a camera, video recorder, or other source device for editing or other purposes, and then re-compressing the signal at a much higher rate for transmission purposes.
A high-quality, reduced-data-rate digital video system according to a preferred embodiment includes a source of a streaming video program having a progressive-scanned image with a frame rate no greater than substantially 24 fps; a video server in communication with the source for storing the program; and one or more computers in network communication with the video server for locally displaying the program or portions thereof.
In a “direct stream” implementation, the locally displayed program or portions thereof are in the same format as the streaming video program received from the source. The system and method may further include personal-computer-based control of the camera/input device, a monitor for the streaming video program received from the source, or other PC-based capabilities. The streaming video program may be received through a network connection, and the video server includes one or more of the following for storing the program: a micro-disk, portable HDD, memory stick, optical storage, or magneto-optical storage.
This invention overcomes the limitations of the existing art by providing a video production/monitoring capability capable of transmitting HDTV (16:9) quality video utilizing existing broadband bandwidths, such as 4 Mbps (for 1024×576 pixels) or greater, thereby addressing the traditional tension between conserving bandwidth and preserving quality.
The “Direct Stream Cinema System” preferably utilizes a 24 fps progressive camera format which, through the use of proprietary multi-format production techniques (110), facilitates Internet and broadband applications, including streaming services 122, Internet TV, video monitoring/security 124, and 35 mm/HDTV/DVD output capabilities 126. The approach does not require an HDTV-quality video camera or recorder, yet it nevertheless facilitates HDTV-quality direct video monitoring, off-line editing, and other capabilities at a great reduction in total system cost.
With respect to streaming applications, the video data may be transmitted directly to a central server through a network environment, resulting in both a comparatively small capacity storage requirement and also in other advantages over existing approaches. In one disclosed example, HDTV-quality video with an aspect ratio of 16:9 is achieved, having a resolution of 1024×576 pixels, with the potential for up-conversion to 1920×1080. This resolution, equivalent to that of a 42-inch plasma display, is accomplished with a data rate of 4 Mbps, more or less, enabling recording to occur at approximately 2 GBytes/hr, whereas current HDTV requires more than 100 GBytes/hr. Various video formats are possible through the use of proprietary multi-format progressive systems and frame rates, which may vary up to 24 fps (or greater) in the preferred embodiment.
Newer media players, such as Microsoft's new “Corona” technology, which is scheduled to be released with the latest version of the Windows Media Player (Series 9), are aimed at signal distribution systems utilizing a data rate of 6 Mbps, using MPEG-4 and other comparable compression techniques. However, such technology also provides for bit rates in the range of 2-4 Mbps, being directed towards applications such as archiving, streaming video, and off-line viewing. At these data rates, it is possible to store 100 hours of video in only 180 GB of storage [(100 hr)×(3600 sec/hr)×(4 Mbps)/(8 b/B)].
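By way of illustration only, the storage arithmetic above may be verified directly; the following short sketch (not part of any claimed apparatus) reproduces the approximately 2 GBytes/hr and 180 GB figures from the stated 4 Mbps data rate.

```python
# Storage requirements at the streaming data rates discussed above.
# Illustrative arithmetic only; the figures are taken from the text.

def gigabytes_per_hour(mbps: float) -> float:
    """Convert a data rate in megabits per second to gigabytes per hour."""
    return mbps * 3600 / 8 / 1000  # Mb/s -> MB/hr -> GB/hr (decimal units)

def storage_gb(hours: float, mbps: float) -> float:
    """Total storage in gigabytes for a given duration and data rate."""
    return hours * gigabytes_per_hour(mbps)

print(gigabytes_per_hour(4))   # 1.8 GB/hr, i.e. roughly 2 GBytes/hr
print(storage_gb(100, 4))      # 180.0 GB for 100 hours at 4 Mbps
```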
The resulting signal can be stored, in an AVI format, for example, on a hard disk drive. Currently, these PC cards are used only for SDTV, but in the future they will be capable of HDTV recording and of supporting specialized industrial applications; for HDTV applications, a new decoder board would be used.
The preferred storage and distribution format according to the invention is 1024×576@24 fps. Compression ratios of 100:1 are practical for SDTV, and 400:1 for HDTV. In addition, the system is scalable, for example, to the following:
200 Kbps@1 fps
500 Kbps@3 fps
1 Mbps@6 fps
2 Mbps@12 fps
4 Mbps@24 fps
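The listed operating points follow, approximately, a linear scaling of data rate with frame rate at a fixed 1024×576 frame size. The sketch below is an illustration only, assuming that simple linear scaling anchored at the preferred 4 Mbps@24 fps point.

```python
# Approximate linear scaling of data rate with frame rate at 1024x576,
# anchored at the preferred operating point of 4 Mbps @ 24 fps.
# Illustrative assumption: data rate scales in proportion to frame rate.

ANCHOR_MBPS = 4.0
ANCHOR_FPS = 24

def scaled_rate_mbps(fps: float) -> float:
    return ANCHOR_MBPS * fps / ANCHOR_FPS

for fps in (1, 3, 6, 12, 24):
    print(f"{scaled_rate_mbps(fps):5.2f} Mbps @ {fps} fps")
# ~0.17, 0.50, 1.00, 2.00 and 4.00 Mbps, matching the listed points
# (the 1 fps case is rounded up to about 200 Kbps in the list above).
```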
Comparisons of the output quality of a variety of PC-video display cards, utilizing both interlaced and progressive signals as well as frame-rate/standards conversion, indicate a need to optimize the signal processing. For conversions from interlaced PAL signals to NTSC, these cards produce outputs with noticeable frame skipping and jumping. However, from a progressive PAL signal (i.e., greater than 50 fps progressive), the severity of artifacts is greatly reduced. Newer PC graphics cards produce significantly better results, which suggests that they may have adopted the frame-rate conversion techniques disclosed in U.S. Pat. No. 5,999,220, entitled “Multi-Format Audio/Video Production System with Frame Rate Conversion,” and U.S. Pat. No. 6,370,198 B1, entitled “Wide-band Multi-Format Audio/Video Production System with Frame Rate Conversion,” the entire content of both being incorporated herein by reference.
In preferred embodiments, signals at the head-end of a signal distribution system are converted to progressively scanned signals. A frame rate of 24 fps preferably is employed, in order to optimize the utilization of the available bandwidth. In the next step, the signals are compressed to create a data stream at 2-4 Mbps (for 1024×576@24 fps) or 4-6 Mbps (for 1280×720@24 fps). These signals may be stored for subsequent transmission to receiving terminal equipment (such as PCs, cable boxes, personal video recorders, display monitors, or other terminal equipment), or immediately transmitted over a signal distribution system, which may be wired, wireless, satellite, or other medium, including physical media such as CD-ROMs, DVDs, etc. This receiving terminal equipment may be located at multiple remote sites, may be located at multiple sites within a single facility, or may be configured as a combination of local and remote sites.
In an alternative embodiment, signals may be received from multiple sources, including one or more remote sources, and are collected at a central location for viewing, storage, or both. The signals preferably are transmitted to the central site as compressed, progressively-scanned streaming video signals, employing data rates in the range of 2-4 Mbps. As in other embodiments, 24 fps is preferably used, although the frame rate may be greater or less, may be variable or fixed, and may be modified under control of a local operator, or may be modified automatically in response to a predetermined set of criteria, utilizing sensors at the physical location of the camera or signal source, or via remote control from a central site, either under control of an operator, or automatically in response to a predetermined set of criteria. The source signal frame rate and image size may be different for each source signal, and the frame rate and image size of a source signal in the format stored need not be identical to the frame rate and image size in the format displayed.
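As one purely hypothetical illustration of the automatic frame-rate adjustment described above, a controller at the camera site or at the central site might step among the scalable operating points according to a sensor-derived activity measure; the function, its activity metric, and its thresholds below are assumptions made for illustration only.

```python
# Hypothetical frame-rate selection for a monitoring source, stepping among
# the scalable operating points discussed earlier. The normalized activity
# metric and the mapping are illustrative assumptions, not a defined interface.

FRAME_RATE_STEPS = (1, 3, 6, 12, 24)  # fps operating points

def select_frame_rate(activity_level: float) -> int:
    """Map a normalized activity level (0.0 = idle, 1.0 = busy) to a frame rate."""
    index = min(int(activity_level * len(FRAME_RATE_STEPS)), len(FRAME_RATE_STEPS) - 1)
    return FRAME_RATE_STEPS[index]

# A quiet scene streams at 1 fps; a busy scene steps up toward 24 fps.
print(select_frame_rate(0.05))  # 1
print(select_frame_rate(0.55))  # 6
print(select_frame_rate(0.95))  # 24
```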
Currently, ½-inch 3-CCD cameras are available for less than $10,000, and ⅓-inch 3-CCD cameras are available for approximately $5,000. As such, it is already practical and economical to implement this type of system for a range of commercial/industrial applications, for example:
Airport security
Monitoring of remote natural areas, such as forests
Auto crash testing
Public building (Court, Government office, School) security
Hospital security
Educational/instructional
The advantages of this approach are many. In addition to the ability to use existing broadband infrastructures supporting data transfers in the range of 1-4 Mbps, such systems may be built at one-tenth the cost of conventional HDTV systems. High-quality monitoring is possible, as is direct network connectivity. A generic PC-based server can easily handle a large monitoring application. The resulting configuration improves security, at banks, for example, while reducing mistakes due to human error. Operating efficiency is improved for medical applications, for example, along with reliability and monitoring efficiency (speed). Overall, the system is physically compact.
This system application offers numerous features and advantages over a traditional system, which requires a more traditional recording and editing system 406, and which does not allow a direct connection via path 408. Using the approach described above results in a dramatic reduction in system cost (under $10,000 vs. $100,000 or more at current prices). Full digital component processing (4:2:2) is achieved without a loss in quality, and excessive hard disk drives are not required for editing; rather, a generic PC is capable of editing the program (10 gigabytes vs. terabytes for traditional HDTV). The advantages include reduced HDTV production cost and time, without a separate data capture step. The invention is not limited in terms of video format or streaming, as all existing and yet-to-be-developed formats may be accommodated.
While broadcast quality video 508 (standard definition at 4:3) costs much less, the image quality is reduced dramatically, to a frame size of 720×480 pixels (4:3, 30 fps). According to the invention, however, utilizing a 24 fps scan and proprietary multi-format system at 506, a 24P image at 1024×576 or 1280×720 can be generated having an aspect ratio of 16:9, exhibiting a quality comparable to conventional HDTV broadcast, but at a cost of under $10,000. A typical surveillance image, at 320×240 and <15 fps is shown at 510 for comparison purposes.
For any of these implementations (Professional, Camcorder, Surveillance, or Consumer), a key part of the system resides in the optimization of the entire processing scheme, with an eye towards the end-user quality level. For example, in the case of modern plasma-display units, the capability of the individual unit is largely determined by the physical dimensions of the screen: 32″ displays are supplied as capable of 848×477 pixels; 42″ displays are supplied as capable of 1024×576 pixels; 50″ displays are supplied as capable of 1280×720 pixels. Because multiple tests have demonstrated that “film quality” as measured at the theatrical projection screen provides only approximately 700 lines of resolution (see, for example, A. Kaiser, H. W. Mahler, and R. H. McMann, SMPTE Journal, June, 1985), 1024×576, or at most 1280×720, provides the optimum display quality; 1920×1080 or other higher-pixel-count systems are not required.
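A simple comparison (illustrative arithmetic only) shows why 1024×576, or at most 1280×720, is regarded as sufficient when measured against the approximately 700 lines of resolution cited for theatrical film projection.

```python
# Vertical resolution of common panel formats versus the ~700-line figure
# cited for "film quality" at the theatrical projection screen.

FILM_QUALITY_LINES = 700  # approximate figure from the SMPTE reference above

panels = {
    "32-inch plasma": (848, 477),
    "42-inch plasma": (1024, 576),
    "50-inch plasma": (1280, 720),
    "full HD raster": (1920, 1080),
}

for name, (width, height) in panels.items():
    verdict = "meets or exceeds" if height >= FILM_QUALITY_LINES else "falls below"
    print(f"{name}: {width}x{height} {verdict} the ~{FILM_QUALITY_LINES}-line film figure")
# 1280x720 already exceeds the film figure, so 1920x1080 or higher is not required.
```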
Another key feature of the system is the utilization of compression technology. Most origination-quality systems rely on intra-frame compression (such as Motion-JPEG), which is limited to 3:1 or 4:1 for this type of application. Further downstream in the processing and transmission chain, much higher inter-frame-based compression ratios (such as those of MPEG-2) are needed in order to make signal distribution practical and economical. The instant invention contemplates high compression ratios throughout the process, achieving in excess of 100:1 compression. In this way, the use of “intermediate” formats, such as DVC-PRO or DV-CAM, is no longer required. Furthermore, the reduced data rates required by the system eliminate the need for extremely large capacity hard-disk recording capability, enabling editing on most of today's conventional PCs.
However, in order to achieve these kinds of compression ratios without sacrificing quality, the preferred embodiment employs 24 fps signals (which saves 20% of the data rate required for a 30 fps signal), and also progressive scanning (which is over 50% more efficient than compression of interlaced signals). Many compression schemes are available, including, for example, industry standards such as MPEG-4, and proprietary systems such as Microsoft Windows Media 9, DivX, and wavelet-type compression. The resulting data rates are easily conveyed over conventional distribution paths, such as satellite, cable, and broadcast systems, requiring only 1-2 Mbps for SDTV-type signals, and 6 Mbps for HDTV-type signals.
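The compression ratios discussed above, including the 100:1 and 400:1 figures mentioned earlier, can be checked against the raw 4:2:2 data rates. The sketch below is illustrative arithmetic only, assuming 8-bit samples (16 bits per pixel for 4:2:2) and taking the HDTV ratio relative to a full 1920×1080 raster.

```python
# Uncompressed 4:2:2 data rates (8-bit samples; 16 bits/pixel on average,
# one luma sample plus shared chroma) versus the compressed rates in the text.

def uncompressed_mbps(width: int, height: int, fps: float) -> float:
    """Raw 4:2:2 data rate in Mbps (assumes 16 bits per pixel)."""
    return width * height * 16 * fps / 1e6

raw_576p = uncompressed_mbps(1024, 576, 24)    # ~226 Mbps
raw_1080p = uncompressed_mbps(1920, 1080, 24)  # ~796 Mbps

print(round(raw_576p / 2))    # ~113:1 at 2 Mbps -- on the order of 100:1
print(round(raw_1080p / 2))   # ~398:1 at 2 Mbps -- on the order of 400:1 for HDTV

# 24 fps versus 30 fps: a 20% reduction in data rate at the same frame size.
print(1 - 24 / 30)            # 0.2
```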
However, consumer cameras are producing increasingly high-quality recordings, despite their small size and low cost. By employing the techniques disclosed herein, DV-quality recordings of more than one hour are practical, and S-VHS-quality recordings of more than two hours can be achieved. In addition, video editing is simplified, as no step of capturing to the PC is required; editing can proceed directly from camera memory cards or other storage devices (including hard-disk, optical disc, DVD, etc.), and the quality is preserved throughout the process. In addition, the resulting recordings are compatible with various streaming conventions, such as those supported by Microsoft and Real Networks video. This same system of video processing without a step of capturing the signal to the PC applies equally well to Professional and Camcorder applications.
The reader will appreciate that the practical application of the instant invention has significant implications in many fields. For example, Digital Asset Management systems typically employ highly-compressed “proxies” to convey the content of much less-compressed primary program materials, thereby enabling Edit Decision Lists to be developed from the “proxies” and then used to edit the final program using the primary program material. With the much more efficient signal processing methods provided herein, it is not necessary to create the separate proxies, as the primary signals themselves are provided at much lower data rates than traditionally have been available for these materials, making them suitable for use in a single-step on-line editing application.
The “Direct Stream Cinema System” is based on optimizing the entire signal path, utilizing 4:2:2 color processing and bit rates typically in the range of 1-2 Mbps for SDTV-quality video and 4-6 Mbps for HDTV-quality video. It begins with digitizing and compressing the output of the optical pickup and graphics processor (including appropriate processing, such as noise reduction and resolution enhancement), so that the data rate is set at the outset and then maintained through the internal processing circuitry, recording steps, and distribution steps to the receiving terminal device at the user end of the transmission chain. Signal quality is preserved throughout the process, by eliminating the need to decompress a lower-compression signal from a camera, video recorder, or other source device for editing or other purposes, and then re-compress the signal at a much higher rate for transmission purposes. Thus, there is no distinct “intermediate” format of any kind, as the original video format obtained from the optical pickup or other source device is maintained through the entire path to the receiving terminal device.
Note that, to a certain extent, the resolutions and pixel counts, as well as the prices and other data, reflect current technology, and are anticipated to vary over time as technology improves and matures. Nevertheless, the inventive approach of applicant will at all times result in a substantial decrease in system cost while preserving the highest possible quality, even at limited bandwidths. Additionally, in all embodiments of the invention, techniques such as pixel interpolation may advantageously be used to further enhance image resolution/quality.
This application claims priority from U.S. patent application Ser. No. 10/664,244, filed 17 Sep. 2003, which claims priority from U.S. Provisional Patent Application Ser. No. 60/411,474, filed 17 Sep. 2002, the entire content of both which is incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
5325202 | Washino | Jun 1994 | A |
5450140 | Washino | Sep 1995 | A |
5450247 | Schwab | Sep 1995 | A |
5625410 | Washino et al. | Apr 1997 | A |
5627898 | Washino | May 1997 | A |
5923484 | Washino et al. | Jul 1999 | A |
5999220 | Washino | Dec 1999 | A |
6144375 | Jain et al. | Nov 2000 | A |
6201896 | Ishikawa | Mar 2001 | B1 |
6240217 | Ercan et al. | May 2001 | B1 |
RE37342 | Washino et al. | Aug 2001 | E |
6370198 | Washino | Apr 2002 | B1 |
6489986 | Allen | Dec 2002 | B1 |
RE38079 | Washino et al. | Apr 2003 | E |
6675386 | Hendricks et al. | Jan 2004 | B1 |
6698021 | Amini et al. | Feb 2004 | B1 |
6724433 | Lippman | Apr 2004 | B1 |
6920179 | Amand et al. | Jul 2005 | B1 |
6952804 | Kumagai et al. | Oct 2005 | B2 |
7124427 | Esbensen | Oct 2006 | B1 |
20010024233 | Urisaka | Sep 2001 | A1 |
20010045988 | Yamauchi et al. | Nov 2001 | A1 |
20020035732 | Zetts | Mar 2002 | A1 |
20020072955 | Brock | Jun 2002 | A1 |
20020116716 | Sideman | Aug 2002 | A1 |
20020124122 | Hosokawa | Sep 2002 | A1 |
20020194054 | Frengut | Dec 2002 | A1 |
20040003411 | Nakai | Jan 2004 | A1 |
20050086699 | Hahn et al. | Apr 2005 | A1 |
Entry |
---|
Dr. Gorry Fairhurst, “MPEG-2 Overview”, Jan. 2001, pp. 3-4. |
Dr. Gorry Fairhurst, “MPEG-2 Overview”, Jan. 2001, p. 1. |
Pulnix, TM-1300 Progressive Scan High Resolution Camera, Jun. 1998, pp. 1-7. |
Abbott, et al.: Transmission Line Drivers and Receivers for TIA/EIA Standards RS-422 and RS-423, National Semiconductor, Application Note 214, pp. 1-3, Aug. 1993. |
Number | Date | Country | |
---|---|---|---|
20170272791 A1 | Sep 2017 | US |
Number | Date | Country | |
---|---|---|---|
60411474 | Sep 2002 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 10664244 | Sep 2003 | US |
Child | 15614137 | US |