The inventions described below relate to the field of video projection and more specifically to automated calibration of video projection systems.
It is advantageous in large screen rear projection monitors to provide a camera that acts as a sensor in a feedback control loop. This camera can watch the images displayed by the image projection system in order to notice defects in the projected images. These defects may include but are not limited to distorted images, images that are not correctly centered (linearly or rotationally) on the display screen, chromatic aberrations, or the like.
Shipping and handling a large screen rear projection monitor may cause optical components such as projection lenses and fold mirrors to move out of alignment, resulting in the previously mentioned problems. Any of these problems is undesirable to the consumer watching the large screen rear projection monitor.
The geometry of light reflecting from the back of the projector screen does not provide the best conditions for gauging the quality of the projected image. The shallowness of the light rays' angles leads to image recognition problems. The electronics that monitor the images taken by the feedback control camera have difficulty recognizing portions of pictures at the outside edge of the camera's field of view. This is due to the way the camera's lens collects light and projects it onto a camera sensor, such as a charge coupled device (CCD) or complementary metal oxide semiconductor (CMOS) sensor. Light rays pointing from features at the center of the display screen to the camera have very steep angles (with respect to the display screen), and the angle between light rays from the boundaries of such a feature (for example, a one inch square) will be comparatively large. If, however, the same feature is located at the very edge of the display screen, then the light rays from the feature to the camera will be very shallow (again, with respect to the screen), and the angle between light rays emanating from the feature's boundaries will be extremely small. The number of camera sensor pixels onto which a feature is projected is proportional to the magnitude of the angle between its boundary light rays, and the more pixels devoted to a feature in an image, the easier it is for electronics or software to recognize that feature. Thus, due to the geometry of the camera, lens and display screen, features at the center of the screen occupy comparatively more pixels than features of the same linear size at the very edge of the screen. This makes it very difficult for the electronics to see the edge of the screen and, consequently, images projected near it.
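The pixel-footprint effect described above can be checked numerically. The sketch below assumes an illustrative camera depth of six inches and a one inch feature located either on the camera axis or twenty-five inches off-axis; these numbers are chosen for illustration and are not taken from the specification:

```python
import math

def subtended_angle_deg(depth, near_offset, far_offset):
    """Angle (degrees) between the two boundary rays of a feature,
    as seen by a camera 'depth' inches behind the screen plane.
    Offsets are lateral distances from the camera axis to the
    feature's near and far boundaries."""
    return abs(math.degrees(math.atan2(depth, near_offset)
                            - math.atan2(depth, far_offset)))

depth = 6.0  # illustrative camera-to-screen distance, inches

# One inch feature centered on the camera axis: boundaries at -0.5 and +0.5.
center = 2 * math.degrees(math.atan2(0.5, depth))

# The same one inch feature near the screen edge, 25 inches off-axis.
edge = subtended_angle_deg(depth, 24.5, 25.5)

print(f"center feature: {center:.2f} deg, edge feature: {edge:.2f} deg")
```

The centered feature subtends roughly eighteen times the angle of the edge feature, which is why the edge of the screen is so hard for the electronics to resolve.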
It is important during both an initial camera calibration stage and in subsequent use to locate the edge of the screen within the images taken by the feedback control camera.
What is needed is a technique for automatically detecting and correcting misalignments, aberrations or other imperfections in the projected image from a surface that provides better reflectivity.
A video projection monitoring and calibration technique discussed below includes a camera or other image sensor capable of watching images displayed on the internal side of the image screen of a rear projection video monitor. The image projection system can monitor how projected images look and can adjust the way the images are projected in order to correct detected problems.
For aesthetic and calibration related reasons the camera should be located within the large screen rear projection monitor, looking at the surface of the display screen where images are projected. This surface is opposite the surface that viewers watch. Unfortunately, locating the camera within the large screen rear projection monitor cabinet forces the camera to sit very close to the display screen, especially in the case of thin cabinet rear projection monitors. This means that the light rays reaching the camera from the furthest edges of the display screen are close to parallel to the screen. The camera therefore needs an extremely wide-angle lens, or a fish-eye lens, to see the entire area of the display screen. For example, suppose a feedback control camera is located six inches directly behind the center of a fifty inch (diagonal), 16:9 aspect ratio display screen. A light ray pointing from a corner of the screen to the camera will have a roughly 13.5 degree angle with respect to the screen.
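The 13.5 degree figure in the example above follows directly from the screen geometry; a short check, deriving the screen dimensions from the stated fifty inch diagonal and 16:9 aspect ratio:

```python
import math

# 50-inch diagonal, 16:9 aspect ratio screen.
diag = 50.0
width = diag * 16 / math.hypot(16, 9)   # horizontal extent, inches
height = diag * 9 / math.hypot(16, 9)   # vertical extent, inches

cam_depth = 6.0  # camera six inches directly behind the screen center

# Lateral distance from the screen center to a corner is half the diagonal.
corner_offset = math.hypot(width / 2, height / 2)  # = 25.0 inches

# Angle of the corner-to-camera ray with respect to the screen plane.
angle = math.degrees(math.atan2(cam_depth, corner_offset))
print(f"corner ray angle: {angle:.1f} degrees")
```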
During the initial camera calibration stage a mapping function is created that maps the location of pixels within the camera's images to physical locations on the display screen. Finding the edge of the screen quickly reveals the geometry of the camera relative to the display screen. This mapping function is then stored in the electronics within the large screen rear projection monitor. In use, the electronics can combine images collected by the feedback control camera with the mapping function to determine the location of a projected image on the display screen and thereby establish whether it has shifted or warped out of position. Again, being able to quickly locate the edge of the display screen facilitates this diagnostic process.
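One common way to realize such a mapping function is a projective (homography) transform estimated from point correspondences, for example the four corners of the screen edge once they have been located. The sketch below is not the specification's method, merely one plausible implementation; the camera-pixel corner coordinates are hypothetical, and numpy is assumed to be available:

```python
import numpy as np

def homography_from_corners(cam_pts, screen_pts):
    """Estimate the 3x3 projective mapping taking camera-image pixel
    coordinates to physical screen coordinates, from four point
    correspondences, using the standard direct linear transform."""
    A = []
    for (x, y), (u, v) in zip(cam_pts, screen_pts):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null vector of A (last row of V transpose).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)

def map_pixel(H, x, y):
    """Apply the stored mapping function to one camera pixel."""
    u, v, w = H @ (x, y, 1)
    return u / w, v / w

# Hypothetical calibration data: where the four screen corners of a
# 43.8 x 24.7 inch screen appear in a 640x480 camera image.
cam_corners = [(42, 31), (601, 28), (612, 455), (35, 448)]
screen_corners = [(0.0, 0.0), (43.8, 0.0), (43.8, 24.7), (0.0, 24.7)]

H = homography_from_corners(cam_corners, screen_corners)
```

With the mapping stored, any pixel in a captured image can be converted to a physical screen location, which is what lets the electronics decide whether a projected image has shifted or warped.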
A video projection system may include a transmissive display screen having a front side and a rear side, a projector having one or more optical elements forming an image path to the rear side of the display screen, a calibration bezel in the image path, means for collecting calibration data having a view of a portion of the display screen or a portion of the calibration bezel, or both, and an image processing means using collected calibration data to adjust image data for projection by the projector.
The calibration bezel forms a visual fiducial or reference at the edge of a display screen in a large screen rear projection monitor. The calibration bezel may include one or more elements to form a complete or partial border around the display screen and the bezel elements may be oriented at an angle relative to the display screen. Thus, a comparatively larger angle exists between light rays reflected from the bezel to the camera than in a conventional setup that has a surface planar with respect to the display screen. This makes it easier for electronics or software to locate the edge of the display in images captured by the camera, facilitating a more efficient camera setup process and an improved picture alignment process. The calibration bezel further allows electronics to quickly locate and see images at the perimeter that defines the edge of a large screen rear projection monitor screen.
Referring now to
The microdisplay imager in projector 12 may be an HD microdisplay, meaning that the display contains electronically controlled pixels arrayed in 1280 columns by 720 rows. The operation of an HD microdisplay projector is familiar to those of skill in the art. Projector 12 may be a multiple microdisplay projector with resolution greater or less than HD without departing from the spirit of the invention. Note that the distances and relative size of objects in
A projected image radiates from projector 12 through projection optics 14. Although projection optics 14 are shown schematically in
Control system 16 is responsible for receiving input video images 17 from any suitable video input device 15, re-sampling the images to convert them to pixel-based images, and turning the corresponding microdisplay pixels on and off in order to display the images. Control system 16 is also responsible for performing the picture alignment process aided by camera 22. Control system 16 may include non-volatile memory, a microprocessor, integrated circuits, and the like. Similarly, control system 16 may be implemented in hardware, software, firmware or any suitable combination thereof.
Camera 22 is configured to electronically share information with control system 16. Camera 22 may be a low-resolution digital camera, such as those manufactured by Micron. Those of skill in the art will recognize that it is possible to replace camera 22 with an image sensor or any other suitable device without departing from the spirit of the invention. Camera 22 is located inside cabinet 20 and is oriented to view surface 19 of screen 18. Although camera 22 is located directly behind the center of screen 18 as illustrated in
For example, if screen 18 is a fifty inch display screen (measured along the diagonal) that measures 43.8 inches along the screen's horizontal and 24.7 inches along the screen's vertical, camera 22 may be located a distance (distance 13) from the screen; this distance may be, for example, 6.4 inches in a cabinet of 14 inches total depth. These measurements are provided merely for illustrative purposes.
Referring now to
The calibration bezel 31 need not be rigidly attached to screen 18 but may instead be attached to a frame securing screen 18 in cabinet 20. In this particular configuration, calibration bezel 31 need not physically touch screen 18.
As illustrated schematically in
Referring now to
A calibration bezel such as bezel 31 may operate as a visual fiducial in images captured by camera 22. This aids the control system, whether it is system 16 or other suitable outside electronics connected to the camera, in locating viewing area 37. Calibration bezel 31 essentially “frames” viewing area 37 in images captured by camera 22 without being visible from front 21. Knowing the location of viewing area 37 within an image captured by camera 22 allows many tasks to be performed, including but not limited to (1) locating where a projected image falls on screen 18, and thereby determining whether a projected image is centered on screen 18; (2) calibrating camera 22 by mapping captured image pixels to specific locations on screen 18 or calibration bezel 31; and (3) establishing whether a portion of an image projected on screen 18 is distorted, discolored or otherwise in need of correction.
For a feature in an image captured by camera 22 to be identifiable, there must be a substantial angle between the light rays extending from the feature's borders to lens 23. A larger angle corresponds to the feature occupying more pixels on image sensor 22′ in camera 22. The light rays emanating from calibration bezel 31 have a large angle between them, and thus calibration bezel 31 provides a noticeable boundary around viewing area 37 in images captured by camera 22. This contrasts with the barely visible boundary around the viewing area in the case where only a planar surface extends beyond the screen.
Referring now to
αa − αb = tan⁻¹(distance 13 / distance 67) − tan⁻¹(distance 13 / (distance 67 + distance 65))
If, for example, in
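The difference αa − αb can be evaluated numerically. The distances below are illustrative assumptions only (the camera depth is borrowed from the example earlier in the description; the border width is hypothetical):

```python
import math

# Illustrative values, not from the specification: distance 13 is the
# camera-to-screen depth, distance 67 the lateral offset from the camera
# axis to the screen edge, and distance 65 the width of a flat border
# surface extending beyond the edge in the plane of the screen.
d13, d67, d65 = 6.4, 21.9, 1.0

alpha_a = math.atan2(d13, d67)        # ray to the screen edge
alpha_b = math.atan2(d13, d67 + d65)  # ray to the outer border of the flat surface

diff_deg = math.degrees(alpha_a - alpha_b)
print(f"angle subtended by the flat border: {diff_deg:.2f} degrees")
```

Under these assumptions the flat, in-plane border subtends well under one degree at the camera, which is why it is nearly invisible in captured images.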
Referring now to
αa − αc = tan⁻¹(distance 13 / distance 67) − tan⁻¹((distance 13 − distance 69) / distance 67)
If, for example, in
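The corresponding calculation for the angled bezel shows why it is so much more visible than a flat border; again the distances are illustrative assumptions, with the bezel element rising distance 69 out of the screen plane toward the camera:

```python
import math

# Illustrative values, not from the specification.
d13, d67, d69 = 6.4, 21.9, 1.0

alpha_a = math.atan2(d13, d67)        # ray to the screen edge
alpha_c = math.atan2(d13 - d69, d67)  # ray to the raised tip of the angled bezel

diff_deg = math.degrees(alpha_a - alpha_c)
print(f"angle subtended by the angled bezel: {diff_deg:.2f} degrees")
```

With these numbers the angled bezel subtends roughly 2.4 degrees, several times the angle subtended by a flat border of the same one inch extent, so it occupies correspondingly more pixels on image sensor 22′.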
The geometry of the calibration bezel also reflects light rays towards camera 22, greatly increasing how noticeable the calibration bezel is in images. With reference to
Referring now to
Thus, while the preferred embodiments of the devices and methods have been described in reference to the environment in which they were developed, they are merely illustrative of the principles of the inventions. Other embodiments and configurations may be devised without departing from the spirit of the inventions and the scope of the appended claims.