This invention relates to image projectors and methods for projecting images. The invention has application, for example, in cinema projection, projection television, advertising displays, general illumination such as spatially adaptive automotive headlights and the like.
Many light projectors have a light source that uniformly illuminates an image formation chip, such as a DMD, LCoS, LCD or reflective LCD (or film), that subtractively modulates the incoming light in order to create a target image. Such projectors typically 1) cannot exceed a peak luminance set by the optical power of the light source, the projected image size, and the reflectivity of the image screen, and 2) have a dynamic range or contrast that is limited by the image formation device, for example film, or digital devices such as LCD, LCoS or DMD imaging chips.
Light projectors vary in their capability to produce target images with specified luminance and chromaticity values. The range of capabilities stems from technological limitations related to maximum peak luminance (the optical output of the light source), lowest black level and hence contrast (the contrast of the included image formation technology), and chromatic purity and colour gamut (governed either by the filters applied to a broadband source or by the wavelength of, for example, a laser light source), as well as uniformity and noise specifications. Some projectors can produce light output with limited contrast, for example reaching a peak luminance of 100 cd/m2 and a black level of 1 cd/m2, and hence a contrast of 100:1. Other projectors can reach brighter highlights (by increasing the light source power) and/or deeper black levels (using higher contrast image formation technology). In some systems, very deep black levels can be achieved by modulating the image twice (“dual modulation”). The contrast or dynamic range of a projector can also be dynamically adjusted by inserting an iris or aperture in the light path, whose light blocking may be driven in response to image content.
The type of and requirements of image or video content to be reproduced on a projector can vary significantly in time over the course of a presentation of image or video content. The presentation could, for example, comprise presentation of a movie in a cinema, a live performance that uses projectors, or projection of light by adaptive (image-) projector headlights while driving in different conditions in a vehicle. For example a movie might begin with a dark, high contrast, black and white scene, and later contain bright and low contrast scenes with pure colors. While driving at night, an adaptive car headlight might be required to project a uniform, and low contrast light field on an empty road outside the city, but within the city be required to produce a very high contrast, bright image to highlight stop signs, avoid illuminating upcoming cars (casting a shadow in that region) or signaling information on the road.
High brightness, high dynamic range projectors are often more expensive than standard lower dynamic range projectors for similar average light (power) outputs. One reason for this is that achieving better black levels often requires more elements within the system (for example dual modulation designs that use cascaded, light attenuating elements). Another reason is that achieving higher peak luminance on the same screen requires more light-source power in the projector.
There remains a need for good ways to control a projection system to reproduce image content having characteristics that vary significantly over time (e.g. characteristics such as dynamic range, black level, maximum luminance, color saturation) as in the examples above. Such ways would beneficially provide advantages such as reducing power requirements, providing good black level, and/or providing bright highlights.
There remains a need for light projection systems that offer one or both of higher image quality and better cost efficiency.
There remains a need for practical and cost effective projection systems suitable for projecting patterns such as images, desired lamp illumination patterns, and the like. There is a particular need for such systems that are able to faithfully display content having characteristics that change significantly over time (e.g. systems called upon to display bright low-contrast images at some times and to display dark images with bright highlights at other times).
This invention has a number of aspects. One aspect provides a projector system that combines a plurality of projectors. The projectors may have performance characteristics different from one another. The projectors may be separate devices or share certain components, such as control electronics or certain optical elements. Another aspect provides control hardware devices useful for coordinating the operation of two or more projectors to display an image. Another aspect provides a method for splitting an incoming image signal into separate images.
Multiple image generating devices may be used to form a combined image. Each device has a set of operating specifications (which may include, for example, specifications such as peak luminance, resolution, black level, contrast, chromatic extent or gamut). Defined mathematical functions provide image quality and cost metrics in a mathematical framework that permits optimization to achieve goals such as improved image quality or lower cost. The results of the optimization yield separate image data for each image generating device.
This concept can be applied to projectors, where two or more systems with similar or different capabilities produce a combined image in accordance with image data.
In cases where a low dynamic range projector is already present in an installation, or a high dynamic range projector of suitable maximum output power cannot be found, it may be desirable to combine two or more projectors with similar or different capabilities in order to create a single image with high peak luminance and low black levels. An example of such an arrangement comprises a low dynamic range projector combined with a high dynamic range projector.
Further aspects and example embodiments are illustrated in the accompanying drawings and/or described in the following description.
The accompanying drawings illustrate non-limiting example embodiments of the invention.
Throughout the following description, specific details are set forth in order to provide a more thorough understanding of the invention. However, the invention may be practiced without these particulars. In other instances, well known elements have not been shown or described in detail to avoid unnecessarily obscuring the invention. Accordingly, the specification and drawings are to be regarded in an illustrative, rather than a restrictive sense.
One motivation for combining two or more low dynamic range projectors (projector tiling), or even two low peak luminance, high contrast (dynamic range) projectors, is to boost the overall luminance (brightness) on screen of the resulting image. Low dynamic range projectors are a common commodity technology and thus command a much lower purchase price than high dynamic range projectors of similar total output brightness.
In some embodiments, all of the plurality of projectors contribute light to the same viewing area (e.g. boundaries of the fields of view of the projectors may be the same). Each of the plurality of projectors may deliver light to any part of the viewing area. Viewers perceive the combined output of the projectors. In some embodiments, each of the projectors projects onto the full display area of the viewing screen.
In a system where low and high dynamic range projectors (LDR and HDR) are combined, the optimal ratio of light contributed by each of the projectors to the final image can vary greatly. This variation is a result of image and environmental properties such as:
Below are five example cases showing how images from an HDR projector and an LDR projector can be combined according to an example embodiment of the invention. “Bright” and “dim” refer to the luminance level of the image.
Case 1: Bright Low Dynamic Range Image, Elevated Black Levels
The image (
The LDR projector may be controlled to output as much light as it can (see
Case 2: Dim Low Dynamic Range Image, High Blacks
This image (
Case 3: Bright High Dynamic Range Image, High Blacks
This image (
Case 4: Bright High Dynamic Range Image, Low Blacks
This image (
Case 5: Dim Low Dynamic Range Image, Low Blacks
Here the peak brightness of the image is quite low (see
Iris/Global Lamp Power Control
Low dynamic range projectors often produce a dark grey image when attempting to show black due to limitations of light-modulator technology. As an example, consider images in which the brightest areas have luminances lower than the peak luminance of the projector. Here, better contrast can be achieved by dimming the light source. In another example, the amount of detail in dark areas of a target image can be determined to be of higher perceptual importance to the viewer. In such cases, bright content may be sacrificed by dimming the projector to regain deeper black levels. Most low dynamic range projectors are lamp based and cannot easily be dimmed or turned on and off (to create pure black) on a per scene basis due to warm-up issues.
In cases where a low dynamic range projector needs to be turned “off” or simply turned down, an iris can be placed in the optical path (e.g. over the lens). The iris may then be made smaller to improve the black level of the projected image. Also note that the iris is not binary; an iris may be opened to a size dictated by the desired image black level. It is assumed that the iris can change size with sufficient speed so as not to create a noticeable lag when changing scenes. The iris function may also be implemented by some other electrical or mechanical means such as an LCD plate (electrically dimmable) or a high speed shutter rapidly closing and opening.
If the LDR projector has a solid state light source whose light output can be controlled, an iris may not be needed. In such embodiments, the light source may be dimmed by an amount such that its light output is equivalent to the light that would have been available through a constricted iris.
A high dynamic range projector may optionally include a globally dimmable solid state light source and/or an iris.
Artifact Mitigation
It may be advantageous for image quality to never completely close the iris and to accept a slightly higher black level. If an HDR projector shows poorer image quality due to field non-uniformity or other artifacts, having at least a base amount of light from the LDR projector can help to perceptually mitigate the artifacts.
If an LDR projector displays image artifacts such as vignetting or other non-uniformity, the HDR projector may be used to correct for the non-uniformity of the light field.
Projector Balancing Algorithm
Display Representation:
In order to determine settings for each component projector one can take the capabilities of each projector into account.
Previous approaches commonly model image formation as a simple pipeline in which each component takes an input, operates upon it, and passes it to the next stage. This approach is effective for systems consisting of relatively few controllable elements (e.g. light sources, modulators or irises) coupled with relatively many passive optical components such as mirrors or lenses; however, it is less desirable in more complex systems. Such systems may combine multiple displays (projectors) or feed the output of one display into subsequent displays. In this case, parameters for later stages of the pipeline can be adjusted in order to compensate for artifacts or performance limitations of earlier stages.
It is advantageous to think of each display in an abstract sense as taking a set of display parameters, P (e.g. pixel values), and a source illumination, S, which are then operated upon by the display to produce an output image, O=F(P,S), where the function F models the operation of the specific display hardware. This abstract conception of a display is illustrated in
This modular approach allows displays to be nearly arbitrarily connected to form networks of abstract displays and passive optical components to model more complex imaging systems. Displays in a network can be connected either in series to form a single optical path, or in parallel to combine multiple optical paths, or in a combination of serial and parallel designs.
An example of a serial connection for two displays is shown in
An example of a parallel arrangement is found in projector super-resolution applications, in which the output images from multiple projectors are overlapped with a slight deregistration in order to generate higher spatial frequency features than are present in an image from a single projector. This arrangement is shown in
In the parallel arrangement, the optical paths of two amplitude modulating projectors are combined (by the projection screen) to produce an output image.
Depending on the arrangement, the output image can be determined mathematically by either addition or composition of the images generated by the component displays. For two displays with functions F1 and F2 that take parameters P1 and P2 respectively, a parallel configuration results in the following expression for the output image:
O=F1(P1,S1)+F2(P2,S2)
while a serial configuration results in the following expression:
O=F2(P2,F1(P1,S1))
It is also possible to arrange arbitrarily many displays in a network to form compound displays by taking the union of the component display parameters and source illuminations as the inputs to the compound display. An example for a parallel configuration is shown in
Compound displays can consequently be represented as specific types of abstract displays, which can in turn be arranged into networks and/or grouped to form higher level compound displays. Provided the component display image formation models, Fi, are known a mathematical image formation model of the overall display system can be expressed via combinations of the serial and parallel formulas. Such an image formation model may be applied to optimize the operation of a display system.
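The serial and parallel combination rules described above can be sketched in a few lines of code. The following is a minimal illustrative sketch (the simple amplitude-modulator model and all names are assumptions, not part of any particular embodiment) that treats each display as a function F(P, S):

```python
import numpy as np

def modulator(P, S):
    # A simple amplitude-modulating display: attenuates the source
    # illumination S by parameters P clipped to [0, 1].
    return np.clip(P, 0.0, 1.0) * S

def parallel(F1, F2, P1, P2, S1, S2):
    # Parallel configuration: the screen sums the two optical paths,
    # O = F1(P1, S1) + F2(P2, S2).
    return F1(P1, S1) + F2(P2, S2)

def serial(F1, F2, P1, P2, S1):
    # Serial configuration: display 1's output becomes display 2's
    # source illumination, O = F2(P2, F1(P1, S1)).
    return F2(P2, F1(P1, S1))
```

A compound display is then itself just another function of the union of the component parameters and sources, so these combinators can be nested to model arbitrary networks.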
Display Parameter Optimization:
One benefit of this representation is that once the overall image formation model for the display system is defined, optimal parameters for individual displays can be obtained via numerical optimization. Such optimizations can incorporate multiple, sometimes conflicting, goals in order to balance desirable properties such as artifact mitigation, maximization of component display lifespans, total system efficiency, power consumption, and output image fidelity among many other options.
Considering a display system as an abstract (possibly compound) display that takes parameters, P, and source illumination, S, to produce an output image can allow the parameters to be jointly optimized. Such a system is depicted in
Although not explicitly labeled for diagram clarity, the models used by the system implicitly have access to target image, source illumination and current parameter selection. A camera located to acquire images showing the output of the display may also be incorporated into the feedback loop. In some embodiments, optimization is performed using a cost function that includes differences between images acquired by the camera and the desired output of the display system (a target image).
Each of the models attempts to correct for deviations of the output image or parameter selection from desirable properties. One common model is image fidelity: it is desirable that the image produced by the system closely approximate the target image, T, or a modified version of the target image, perhaps one where perceptual factors are taken into account. Errors between the output image and target image are used by the model to compute parameter adjustments. Optimization may proceed until either convergence of the parameters is achieved or a time budget is exhausted.
The system constraints model ensures that the parameter selection results in physically realizable (and desirable) configurations. Such criteria can include requiring that source illumination profiles are within the available power, or that parameters for modulators vary only between opaque and transmissive, i.e. that the modulators do not produce light. Desirable configurations may include choosing parameters that have spatial or temporal coherence, that are within a certain range (see e.g. the LCoS linearity discussion earlier), or parameters that minimize power usage and/or maximize component lifetime.
Image quality heuristics may be used to compensate for behaviors that are not easily modeled or which are costly to model for the image formation models. Image quality heuristics may include moiré, diffraction, temporal behavior and color fringing, among other artifacts. The heuristics models are intended to help compensate for these using empirical image-quality criteria. Image quality heuristics can also be provided to adjust parameters to optimize for properties of human perception, such as veiling luminance, adaptation levels, mean picture levels, metamerism and variations in sensitivity to chroma/luma errors. Sensitivity to these properties can be exploited in content generation.
The LDR and HDR projectors may themselves be compound displays. An example embodiment having desirable properties for commercial applications has a relatively high power LDR projector that can achieve a full-screen white suitable for typical average picture levels, combined with a lower-power HDR projector that can achieve much higher peak brightness but does not have the power to do so over the entire screen. Such a system can be vastly more efficient and less costly than building a single projector capable of increased full-screen white values, due to the distributions of luminance in typical images. In such an embodiment, it is desirable to provide a control which permits global dimming of the LDR projector. Some example ways to provide such global dimming use an iris, a controllable shutter, and/or a variable output light source. The iris is a very simple display that modulates the intensity of the LDR projector; it could be replaced, in principle, by a source, S1, for the LDR projector that can be dynamically modulated.
The display parameter optimization searches for LDR parameters P1, Iris/drive level parameters P2 and HDR parameters P3 causing the output image O to best match the target image T. The system of
O=F2(P2,F1(P1,S1))+F3(P3,S3)=F(P,S)
Improved display parameters can be obtained via optimization. The optimization may comprise minimizing the sum of cost functions representing the image fidelity, image quality and system constraints, for example as follows:
P=argminP αC(T−F(P,S))+Σi∈Q βiQi(P,S) subject to Kj(P,S)=0 ∀j
Here the image fidelity model is the function, C, which weights errors between the image produced by the system, F(P,S), and the target image, T, to produce a scalar indicating how preferable the current set of parameters is. Common examples for C are the mean squared error (MSE) or the mean absolute error (MAE).
The functions Qi represent image quality heuristics/models which also produce scalar values indicating how preferable the current parameters are in terms of unmodeled artifacts, e.g. moiré, color fringing, or diffraction artifacts. The constants α and βi control the relative importance given to the various terms (which may be contradictory), providing a way for the content generation to favour one objective over another.
The constraints Kj impose conditions on the parameters, for instance that modulators in projectors must operate in the range between fully transmissive and fully opaque. They are expressed here as set-valued constraints that are either satisfied (Kj(P,S)=0) or unsatisfied; however, existing optimization techniques can relax these conditions to allow minor constraint violations.
Although not explicitly listed, the constraint functions, K, and image quality models, Q, may also have a dependence on the output image, O=F(P,S).
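As a concrete, deliberately simplified illustration of the optimization above, the sketch below fits the parameters of a single amplitude-modulating display to a target image by minimizing a mean-squared-error fidelity cost. The modulator constraint is expressed as box bounds rather than a constraint function Kj, and all names are assumptions for illustration only:

```python
import numpy as np
from scipy.optimize import minimize

def F(P, S):
    # Hypothetical single-modulator display model: O = clip(P, 0, 1) * S.
    return np.clip(P, 0.0, 1.0) * S

def solve_parameters(T, S, alpha=1.0):
    # Image fidelity cost C: mean squared error between output and target.
    def cost(p):
        return alpha * np.mean((T - F(p.reshape(T.shape), S)) ** 2)
    # Box bounds [0, 1] play the role of the modulator constraints.
    res = minimize(cost, x0=np.full(T.size, 0.5),
                   bounds=[(0.0, 1.0)] * T.size, method="L-BFGS-B")
    return res.x.reshape(T.shape)
```

In a full system the cost would also include the heuristic terms Qi, and the parameter vector would span all component displays of the compound display.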
It is now possible to express several different schemes for partitioning image content between the LDR and HDR projectors. Several different examples are presented here:
Smooth Blends Between HDR and LDR Projector
Although the HDR projector is necessary for high luminance regions, it may be desirable, from an image quality perspective, to also make use of the HDR projector in regions below the full-screen white level of the LDR projector. This requires partitioning content between the two projectors.
One straightforward way of approaching this is to blur or diffuse the mask used by the HDR projector, for example by blurring a dilated binary mask of pixels above the LDR projector full-screen white. A more sophisticated approach could compute approximations of the veiling luminance at each pixel in order to adjust blending parameters dynamically.
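The blurred, dilated mask described above can be sketched as follows (a minimal illustration; the dilation radius and blur width are assumed tuning parameters, not values from any embodiment):

```python
import numpy as np
from scipy.ndimage import binary_dilation, gaussian_filter

def hdr_blend_mask(target_luminance, ldr_full_white, dilate_px=4, blur_sigma=3.0):
    # Pixels brighter than the LDR projector's full-screen white must come
    # from the HDR projector; dilating and then blurring that binary mask
    # yields a smooth blend weight in [0, 1] for the HDR contribution.
    mask = target_luminance > ldr_full_white
    mask = binary_dilation(mask, iterations=dilate_px)
    return np.clip(gaussian_filter(mask.astype(float), sigma=blur_sigma), 0.0, 1.0)
```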
There are numerous other options for how to partition content between the component projectors. Examples of these options are discussed below:
With any of these approaches, the blending factors may be dynamically adjusted spatially within a scene to achieve desired local behaviour. For instance, low luminance content adjacent to high-luminance regions may be obscured by veiling luminance of highlights. In this case, neither of the LDR and HDR projectors need to display content for those regions. Alternatively, some scenes may have large bright regions and large dim regions. The adjustments discussed above can then be made, taking into account the scattering behavior of the projectors.
Extending Color Gamut
If the primary colours used in the HDR and LDR projectors differ, perhaps by design, it may be possible to extend the color gamut of the combined system. This can be achieved by mapping the target image to the appropriate color-space and determining what mixture of the two available sets of primaries best represents the target color, for instance choosing as broad a set of primaries as possible to improve metamerism. The process here is similar in principle to that used in extending the dynamic luminance range, as has been discussed throughout this document.
Super-Resolution
If the HDR and LDR projectors are deregistered, it may be possible to increase the apparent resolution of the combined system to decrease aliasing near edges. This can be achieved by optimizing for a high resolution target image, which will cause the projector contributions between HDR and LDR to automatically adjust in order to best approximate the high spatial frequency features.
Scatter Compensation & Feedback of Ambient Conditions
Scatter from the viewing environment can lead to dark image regions with elevated black levels. Incorporating a heuristic scattering model for either the target or output image allows the system to compensate for this effect. In this case the image formation model F could be represented as follows:
F(P,S)=F′(P,S)+R(P,S)
Here R is a function modeling scatter from the viewing environment and F′ is the image formation model for the system in a non-scattering viewing environment. Parameters for the displays optimized using this image formation model automatically attempt to compensate for the resulting scatter.
A similar approach can use actual measurements of scattered light in place of the function R in order to dynamically compensate for light scattering from the viewing environment.
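A minimal sketch of such compensation, assuming a simple heuristic scatter model R (a fixed fraction of the image spread by a wide Gaussian) and iterating the pre-scatter target until the displayed-plus-scattered result approximates the original target:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def compensate_scatter(T, scatter_fraction=0.1, sigma=8.0, iterations=5):
    def R(img):
        # Heuristic environment scatter: a fraction of the light, spread widely.
        return scatter_fraction * gaussian_filter(img, sigma=sigma)
    T_comp = T.copy()
    for _ in range(iterations):
        # A display cannot subtract light, so clamp at zero.
        T_comp = np.maximum(T - R(T_comp), 0.0)
    return T_comp
```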
The method illustrated in
The decision boxes depicted in
The “Tone Map Image” operation examines the luminance levels (if available) in the incoming image and maps them to the capabilities of the combined LDR and HDR projector. This operation also takes into account the ambient light level when mapping the darker areas of the image, and the maximum overall luminance the observer would be comfortable with.
The “Adjust Black Level” operation will increase the black level of the mapped image in cases where the observer will not be able to perceive the lower black level. An example of this would be black text in a white field where veiling luminance would not allow an observer to distinguish a very low black level from a slightly elevated one. To achieve this, a forward model of the projectors may be used (to predict halo from brightness).
If an image still has a low black level after the above operations, an iris size (the amount of light attenuated by the iris or by dimming a light source) may be calculated to compensate for the elevated native black level of the LDR projector. Shrinking the iris will also lower the peak brightness available from the LDR projector. The reduced peak brightness may be computed as well.
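Since the iris attenuates the black level and the peak brightness by the same factor, the required iris setting and the resulting reduced peak can be computed together. A minimal sketch with illustrative names:

```python
def iris_settings(target_black, native_black, native_peak):
    # The attenuation needed to bring the native black level down to the
    # target also scales down the peak brightness available from the
    # LDR projector.
    attenuation = min(target_black / native_black, 1.0)
    return attenuation, attenuation * native_peak
```

For example, an LDR projector with a native black of 1 cd/m2 and a peak of 100 cd/m2 needs 4x attenuation to reach a 0.25 cd/m2 black level, leaving a 25 cd/m2 peak.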
If the LDR projector with its diminished iris size will not supply sufficient light to the image, the HDR projector may be used to generate the entire image. Note that as explained in the iris section above, it may be desired to never completely block all light from the LDR projector.
In the case where black levels are not low and the image contains highlights that cannot be shown using just the LDR projector due to insufficient brightness capabilities, a separate image for the LDR and the HDR projector may be computed. Since two images are being combined on screen in this case, care should be taken to “blend” them such that edge artifacts are not created when adjacent pixels are delivered from different projectors. The following approaches may be taken, either individually or in combination:
An example of threshold banding would be in the small pixel areas surrounding a bright feature. Here both projectors would contribute light and sum together to create the pixels. The size of this area can be calculated from the veiling luminance effect or simply a fixed number of pixels when there is a fairly soft transition between the highlight and the adjacent features (bright spot on a gradient).
Using a Brightness Booster for Multiple Stage Projection
High dynamic range projectors use two or more imaging stages to lower black levels when generating images. Each of these imaging stages has a loss associated with it, so when creating very bright images there is far more light loss in a multi-stage projector than in a single-stage projector. Light can be added when required before the final imaging stage to boost the efficiency of the system when low black levels are not required.
Image forming elements used in the light path of projection systems are non-ideal in nature. When forming an image they allow light to leak through in dark areas, at the expense of overall contrast, and absorb some light in bright areas. To address this, projector manufacturers have made systems with multiple imaging elements to decrease the amount of light leaking through the system in dark areas. This in turn has required a much brighter light source to compensate for the transmission losses through two (or more) imaging elements in bright areas. These projectors show dramatically lower operational efficiency when showing bright images as compared with single-stage projectors.
A projection system according to the example embodiment in claim 14 examines the nature of the image being projected and in the case of a low contrast high brightness image will add a calculated amount of uniform light before the final imaging stage. The added light will then only have to travel through a single imaging stage and thus incur far lower transmission losses. Thus, the operational efficiency of the system when producing bright images will be substantially increased. When producing images that require far less light and higher contrast, little or no light will be added before the last imaging elements to preserve the low black levels expected of a multiple stage system.
It is not mandatory that boost light delivered to the second imaging stage be uniform or even. In some embodiments the booster light is non-uniform. An example application of this is in the case where a first imaging stage provides a light output that includes undesired light patches or other artifacts. For example where the first stage is a light steering stage the first stage may provide static artifacts that are not steerable (for example a global roll-off of intensity towards the edges, or visible patches and stripes from different laser diodes that for one reason or another are not corrected for). In such cases the booster light may be structured in such a way that the sum of the booster light and the artifacts is uniform or near uniform illumination. This may be done by providing a non-uniform pattern of booster light inverse to the pattern of artifacts from the first stage.
The purpose of the first imaging element is to block light or steer light away from darker parts of the image such that the last imaging element will not have to block much light from darker parts of the image being projected, leading to a high contrast image when desired. The first imaging element may, for example, modulate the phase and/or intensity of light from the main light source.
The “last imaging element” can be paired such that the boost light source has its own independent light path to the screen. This may be desirable in a very high power system when a single final stage imaging element may not be able to handle the thermal stress or intensity associated with both light paths being summed on its surface.
In a color projector the methods can be implemented separately for each color primary in the system or operated in a color field sequential manner in one or more example implementations.
An algorithm is executed to govern the relative intensity settings of the two light sources. The boost light will be active when displaying low contrast imagery or when veiling luminance in the observer's eye or other optical scatter in the system or environment masks surrounding dark areas such that elevating the intensity of those dark areas does not result in noticeable degradation of the displayed image.
Image statistics, for example a histogram of the luminance distribution within an image, or other methods may be employed to determine the overall contrast requirements of the image. The boost light source may be used whenever possible as it is a more efficient light path than from the main light source and may always be used to provide brightness up to the darkest level present in an image.
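One simple way to make this decision from image statistics is sketched below; the thresholds are illustrative assumptions, and a practical system could replace them with a model of veiling luminance:

```python
import numpy as np

def boost_allowed(luminance, dark_fraction_threshold=0.02, dark_level=0.05):
    # If only a tiny fraction of pixels sit near black, veiling luminance
    # will mask a slightly raised floor and the more efficient boost path
    # can be used; otherwise keep the boost off to preserve black level.
    dark_fraction = np.mean(luminance <= dark_level * luminance.max())
    return bool(dark_fraction <= dark_fraction_threshold)
```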
The main light source may be dimmed to compensate for light being added to the image by the boost light source.
Cases A and H show an image that is uniformly white at full intensity. In cases B,C,D,I, and J the boost light can drive higher than the lowest level due to veiling luminance effects. Cases P and Q are also affected by veiling luminance and allow some light to come from the boost light. In cases K, L, M, N, and O the boost light drives to the lowest brightness level present in the image. For example, the boost light may be provided at a level determined by multiplying the lowest luminance level in the image by a factor. The factor may be based on the contrast capability of the second modulator. For example if the lowest luminance level in a particular image is Lmin=1 cd/m2, and the contrast of the second modulator C2=2000:1, then the booster light may be provided with a luminance sufficient to achieve 2000 cd/m2 with a fully open modulator C2 while allowing the light level to be reduced to 1 cd/m2 by setting the second modulator to its least light-transmitting state.
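The worked example above reduces to a one-line calculation, sketched here with assumed names:

```python
def boost_luminance(l_min, modulator_contrast):
    # With the second modulator fully open the boost light may supply
    # l_min * contrast; the modulator can still pull dark regions back
    # down to l_min in its least-transmitting state.
    return l_min * modulator_contrast
```

With Lmin = 1 cd/m2 and a 2000:1 second modulator this gives the 2000 cd/m2 open-gate level from the example.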
In some embodiments, if a dark patch exceeds a threshold size such that it will not be masked by a veiling luminance effect, the boost light will be completely turned off and the non-black area of the screen will be illuminated through two image forming elements in series—drastically reducing the amount of light leaking through into the dark areas. In example cases E, F, G, R, S, T, and U there is enough dark content that the boost light is powered off to preserve the black levels.
It is not mandatory that the boost light and the main light source be distinct from one another. In some embodiments an optical system is provided that can direct some or all light from a main light source directly onto the last imaging element, bypassing the first imaging element. For example, a variable beam splitter may be applied to divert some light from a main light source onto the last imaging element. Some embodiments have both a separate boost light source and a provision for diverting light from the main light source onto the last imaging element.
In some embodiments an optical element or elements are provided to combine light from the boost light source with light that has been modulated by the first imaging element and to direct the combined light onto the last imaging element. The optical element or elements comprise a prism in some embodiments.
In some embodiments the boost light source comprises a plurality of light sources such as a plurality of light-emitting diodes (LEDs). In one example embodiment the boost light source is arranged around an outer perimeter of the first imaging element. For example, the boost light source may comprise a ring of LEDs. Suitable reflectors, diffusers, spacers and/or other optical elements may be provided to cause light from the boost light source to be evenly distributed on the last imaging element.
Case 1: Bright Low Dynamic Range Image, Elevated Black Levels
The boost stage is used to illuminate most of the image. The first stage (the high-contrast steering stage) is used to add minimal highlights to the image. Little steering is required.
Case 2: Dim Low Dynamic Range Image, High Blacks
The boost stage is used to illuminate the entire image. The steering stage is not used.
Case 3: Bright High Dynamic Range Image, High Blacks
The boost stage is full on. The steering stage is also full on, providing maximum steering.
Case 4: Bright High Dynamic Range Image, Low Blacks
The boost stage is off. The image is created using the steering stage only.
Case 5: Dim Low Dynamic Range Image, Low Blacks
The boost stage is on, but at reduced intensity to preserve some of the black level in the image. The steering stage is off as no highlights are needed.
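The five cases above amount to a lookup from coarse image statistics to stage settings. The sketch below is illustrative only: the boolean classification and the drive fractions (1.0 = full on, 0.0 = off) are assumed values, not prescribed by the invention.

```python
def drive_scheme(bright, high_dynamic_range, low_blacks):
    """Return (boost_stage, steering_stage) drive fractions for the
    five example cases, from three booleans derived from image
    statistics (illustrative values only)."""
    if not high_dynamic_range and not low_blacks:
        # Cases 1 and 2: low dynamic range, elevated/high blacks.
        # Boost illuminates the image; minimal steering only if bright.
        return (1.0, 0.1 if bright else 0.0)
    if high_dynamic_range and bright:
        # Cases 3 and 4: boost depends on whether deep blacks are needed.
        return (0.0, 1.0) if low_blacks else (1.0, 1.0)
    if not high_dynamic_range and low_blacks:
        # Case 5: dim low dynamic range image with low blacks.
        return (0.5, 0.0)
    # Combinations not covered by the enumerated cases: steer only.
    return (0.0, 1.0)
```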
Technology as described herein may be applied, without limitation, to displays of the types described in U.S. patent application No. 61/893,270 filed Oct. 20, 2013 which is hereby incorporated herein by reference for all purposes.
Using a Combination of Projectors to Show Stereoscopic Content:
Systems of combined projectors or light sources, as described herein, lend themselves to applications that require efficient, low-cost, or high-brightness reproduction of 3D (stereoscopic) content.
Stereoscopic image pairs comprise an image intended for viewing with the right eye and an image intended for viewing with the left eye. The disparity of the images creates a depth effect. Zero disparity renders images perceived to lie in the plane of the projection screen. A disparity between left and right eye images renders objects perceived away from the projection screen plane, either closer to the viewer (audience) or, if the disparity is inverted, farther away (perceived to be behind the screen plane).
One characteristic of cinematic and other stereoscopic image content is that a pleasant viewing experience is more likely to be achieved if the disparity between left and right eye views is not too great (for example, depicted objects are not perceived as being too close to the viewer). The differences between the left and right eye views in stereoscopic image pairs are therefore typically kept small. Even in image pairs with depicted content that is perceived as being very close to the viewer (or very far away), many image areas in the left and right eye views will typically be the same because in almost all cases only some objects will be rendered as being close or far relative to the viewer.
Many, if not all, practical stereoscopic projection systems require filtering of light that is reflected off the projection screen before the light enters each eye of an observer. This filtering results in different images being delivered to viewers' left and right eyes. Filtering is often provided using eyeglasses which provide different filters for the left and right eyes. Common techniques use color filters (notch filters for some or all of the color primaries for the left and the right eye), circular or linear polarization filters, temporal shutters or temporal polarization switches.
Projection systems are set up to produce left- and right-eye images with light properties corresponding to the respective eye filters: for example, narrow-band primaries that differ between the left and right eye views; clockwise and counter-clockwise circularly polarized light; light with orthogonal linear polarization states; or temporal light fields matching the temporal shutter or the polarization switch at the eye.
All of these filtering techniques have in common that a large amount of light is lost between the light source of the projector and the observers' eye compared to similar non-stereoscopic projection systems. Stereoscopic projection systems are also more complex and thus more costly than non-stereoscopic projection systems. Another problem is that it is not always possible or easy to upgrade an existing non-stereoscopic projector to operate as a stereoscopic projector.
In a system as described herein, it is possible to use one projector in a non-stereoscopic mode with a light source that is compatible with both the left and the right eye filters (for example, a broadband light source in the case of a system based on color notch filters, a randomly polarized light source in the case of either circular or linear polarization filter systems, or a permanently on light source in the case of any temporal shutter filtering system). The non-stereoscopic projector creates those parts of an image that are common to both the left and the right eye views.
A second projector (one or more projectors) may then be used to display the parts of the images that differ between the left and right eye views. The second projector projects light having the properties required for the left and the right eye filters (wavelength, or polarization, or temporal image fields).
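One simple way to compute the split between the two projectors, assuming their light adds on the screen, is to assign the pixel-wise minimum of the two views to the non-stereoscopic projector and the per-eye remainders to the filtered second projector. This decomposition is a sketch of one possible approach, not the specific method of the invention.

```python
def decompose_stereo(left, right):
    """Split a stereoscopic pair into a common image (shown by the
    non-stereoscopic projector, visible through both eye filters) and
    per-eye residuals (shown by the second, filtered projector).
    Images are nested lists of luminance values."""
    common, res_left, res_right = [], [], []
    for lrow, rrow in zip(left, right):
        # The common component is the light both eyes should receive.
        crow = [min(l, r) for l, r in zip(lrow, rrow)]
        common.append(crow)
        # Residuals are nonzero only where the views differ (disparity),
        # which is why the second projector can be lower powered.
        res_left.append([l - c for l, c in zip(lrow, crow)])
        res_right.append([r - c for r, c in zip(rrow, crow)])
    return common, res_left, res_right
```

Because stereoscopic disparity typically affects only a small fraction of pixels, the residual images are mostly zero, consistent with the lower power requirement noted below for the second projector.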
There are several benefits to such a system. Because most of the required components are already included in the architecture described herein, the additional cost to enable stereoscopic projection is minimal.
The power requirements for the second projector can be lower as the image regions with disparity between left and right are typically not large relative to all pixels of the image. Light steering may be used to steer light to the display areas corresponding to depicted objects perceived as being out of the plane of the display screen.
Creating good separation (i.e. contrast) between the left and right eye views is neither easy nor inexpensive. Less than perfect separation results in some light intended for the right eye entering the left eye. This effect, known as ghosting, reduces image quality and can cause headaches. Since the second projector's power requirements are lower than the main projector's, and the cost to make such a second projector is lower, more care can be taken to ensure that left and right eye views are truly separated.
A low power secondary projector can cost effectively be added to upgrade and enable an existing non-stereoscopic projection system to display stereoscopic images.
Power Output Relationship Between LDR/HDR Projectors:
With projector systems as described herein it is possible to combine an LDR projector having, for example, five times the optical power of the HDR projector. Since HDR projectors are far more expensive than LDR projectors, this allows for a more economical setup.
The following are non-limiting enumerated example embodiments.
Unless the context clearly requires otherwise, throughout the description and the claims, "comprise", "comprising", and the like are to be construed in an inclusive sense, as opposed to an exclusive or exhaustive sense; that is to say, in the sense of "including, but not limited to".
Words that indicate directions such as "vertical", "transverse", "horizontal", "upward", "downward", "forward", "backward", "inward", "outward", "left", "right", "front", "back", "top", "bottom", "below", "above", "under", and the like, used in this description and any accompanying claims (where present), depend on the specific orientation of the apparatus described and illustrated. The subject matter described herein may assume various alternative orientations. Accordingly, these directional terms are not strictly defined and should not be interpreted narrowly.
Embodiments of the invention may be implemented using specifically designed hardware, configurable hardware, programmable data processors configured by the provision of software (which may optionally comprise “firmware”) capable of executing on the data processors, special purpose computers or data processors that are specifically programmed, configured, or constructed to perform one or more steps in a method as explained in detail herein and/or combinations of two or more of these. Examples of specifically designed hardware are: logic circuits, application-specific integrated circuits (“ASICs”), large scale integrated circuits (“LSIs”), very large scale integrated circuits (“VLSIs”), and the like. Examples of configurable hardware are: one or more programmable logic devices such as programmable array logic (“PALs”), programmable logic arrays (“PLAs”), and field programmable gate arrays (“FPGAs”). Examples of programmable data processors are: microprocessors, digital signal processors (“DSPs”), embedded processors, graphics processors, math co-processors, general purpose computers, server computers, cloud computers, mainframe computers, computer workstations, and the like. For example, one or more data processors in a control circuit for a device may implement methods as described herein by executing software instructions in a program memory accessible to the processors.
While processes or blocks are presented in a given order, alternative examples may perform routines having steps, or employ systems having blocks, in a different order, and some processes or blocks may be deleted, moved, added, subdivided, combined, and/or modified to provide alternative or subcombinations. Each of these processes or blocks may be implemented in a variety of different ways. Also, while processes or blocks are at times shown as being performed in series, these processes or blocks may instead be performed in parallel, or may be performed at different times.
The invention may also be provided in the form of a program product. The program product may comprise any non-transitory medium which carries a set of computer-readable instructions which, when executed by a data processor, cause the data processor to execute a method of the invention. Program products according to the invention may be in any of a wide variety of forms. The program product may comprise, for example, non-transitory media such as magnetic data storage media including floppy diskettes, hard disk drives, optical data storage media including CD ROMs, DVDs, electronic data storage media including ROMs, flash RAM, EPROMs, hardwired or preprogrammed chips (e.g., EEPROM semiconductor chips), nanotechnology memory, or the like. The computer-readable signals on the program product may optionally be compressed or encrypted.
In some embodiments, the invention may be implemented in software. For greater clarity, “software” includes any instructions executed on a processor, and may include (but is not limited to) firmware, resident software, microcode, and the like. Both processing hardware and software may be centralized or distributed (or a combination thereof), in whole or in part, as known to those skilled in the art. For example, software and other modules may be accessible via local memory, via a network, via a browser or other application in a distributed computing context, or via other means suitable for the purposes described above.
Where a component (e.g. a software module, processor, assembly, display, iris, device, circuit, etc.) is referred to above, unless otherwise indicated, reference to that component (including a reference to a “means”) should be interpreted as including as equivalents of that component any component which performs the function of the described component (i.e., that is functionally equivalent), including components which are not structurally equivalent to the disclosed structure which performs the function in the illustrated exemplary embodiments of the invention.
Specific examples of systems, methods and apparatus have been described herein for purposes of illustration. These are only examples. The technology provided herein can be applied to systems other than the example systems described above. Many alterations, modifications, additions, omissions, and permutations are possible within the practice of this invention. This invention includes variations on described embodiments that would be apparent to the skilled addressee, including variations obtained by: replacing features, elements and/or acts with equivalent features, elements and/or acts; mixing and matching of features, elements and/or acts from different embodiments; combining features, elements and/or acts from embodiments as described herein with features, elements and/or acts of other technology; and/or omitting or combining features, elements and/or acts from described embodiments.
It is therefore intended that the following appended claims and claims hereafter introduced are interpreted to include all such modifications, permutations, additions, omissions, and sub-combinations as may reasonably be inferred. The scope of the claims should not be limited by the preferred embodiments set forth in the examples, but should be given the broadest interpretation consistent with the description as a whole.
This application is a continuation of U.S. application Ser. No. 16/228,669 filed Dec. 20, 2018, which is a continuation of U.S. application Ser. No. 15/359,427 filed Nov. 22, 2016, which is a continuation of U.S. application Ser. No. 15/312,165. U.S. application Ser. No. 15/312,165 is the US National Stage of PCT Application No. PCT/CA2015/000324 filed 15 May 2015, which claims priority from U.S. Application No. 61/994,002 filed 15 May 2014 and U.S. Patent Application No. 62/148,041 filed 15 Apr. 2015. For purposes of the United States, this application claims the benefit under 35 U.S.C. § 119 of U.S. Application No. 61/994,002 filed 15 May 2014 entitled BRIGHTNESS BOOSTER FOR MULTIPLE-STAGE PROJECTORS and U.S. Patent Application No. 62/148,041 filed 15 Apr. 2015 entitled OPTIMIZING DRIVE SCHEMES FOR MULTIPLE PROJECTOR SYSTEMS, all of which are hereby incorporated herein by reference for all purposes.
Number | Name | Date | Kind |
---|---|---|---|
5287096 | Thompson et al. | Feb 1994 | A |
5490009 | Venkateswar et al. | Feb 1996 | A |
5597223 | Watanabe | Jan 1997 | A |
5886675 | Aye | Mar 1999 | A |
5953469 | Zhou | Sep 1999 | A |
5956000 | Kreitman et al. | Sep 1999 | A |
5978142 | Blackham | Nov 1999 | A |
6057537 | Schubert et al. | May 2000 | A |
6115022 | Mayer, III | Sep 2000 | A |
6222593 | Higurashi et al. | Apr 2001 | B1 |
6406148 | Marshall et al. | Jun 2002 | B1 |
6417892 | Sharp | Jul 2002 | B1 |
6456339 | Surati et al. | Sep 2002 | B1 |
6490364 | Hanna | Dec 2002 | B2 |
6568816 | Mayer et al. | May 2003 | B2 |
6570623 | Li et al. | May 2003 | B1 |
6733138 | Raskar | May 2004 | B2 |
6760075 | Mayer et al. | Jul 2004 | B2 |
6771272 | Deering | Aug 2004 | B2 |
6804406 | Chen | Oct 2004 | B1 |
6814448 | Ioka | Nov 2004 | B2 |
7108379 | Tan | Sep 2006 | B2 |
7118226 | Davis et al. | Oct 2006 | B2 |
7440160 | Heckmeier et al. | Oct 2008 | B2 |
7712902 | Nakamura | May 2010 | B2 |
8330870 | Marcus et al. | Dec 2012 | B2 |
8339695 | Haussler et al. | Dec 2012 | B2 |
8534868 | Krijn et al. | Sep 2013 | B2 |
8547641 | Capolla | Oct 2013 | B2 |
8749463 | Matsumoto et al. | Jun 2014 | B2 |
9874319 | Minor et al. | Jan 2018 | B2 |
10171779 | Kozak et al. | Jan 2019 | B2 |
10324361 | Damberg et al. | Jun 2019 | B2 |
10404957 | Damberg et al. | Sep 2019 | B2 |
10408390 | Minor et al. | Sep 2019 | B2 |
10477170 | Damberg et al. | Nov 2019 | B2 |
10531055 | Richards et al. | Jan 2020 | B2 |
11363242 | Kozak et al. | Jun 2022 | B2 |
20030019854 | Gross et al. | Jan 2003 | A1 |
20030197669 | Marshall | Oct 2003 | A1 |
20030218590 | Kiser et al. | Nov 2003 | A1 |
20040104902 | Fujii et al. | Jun 2004 | A1 |
20040169823 | Bridgwater | Sep 2004 | A1 |
20040239885 | Jaynes et al. | Dec 2004 | A1 |
20050018309 | Mcguire et al. | Jan 2005 | A1 |
20050058175 | Gross et al. | Mar 2005 | A1 |
20050111072 | Miyagaki | May 2005 | A1 |
20050195223 | Nitta | Sep 2005 | A1 |
20060072075 | De Vaan | Apr 2006 | A1 |
20060158405 | Willis | Jul 2006 | A1 |
20060202930 | Uchiyama et al. | Sep 2006 | A1 |
20060215130 | Nakamura | Sep 2006 | A1 |
20070046898 | Conner | Mar 2007 | A1 |
20070091277 | Damera-Venkata | Apr 2007 | A1 |
20070103768 | Blackham | May 2007 | A1 |
20070268224 | Whitehead et al. | Nov 2007 | A1 |
20070273957 | Zalevsky | Nov 2007 | A1 |
20080036872 | Nobori | Feb 2008 | A1 |
20080049044 | Nitta | Feb 2008 | A1 |
20080204847 | Kamm et al. | Aug 2008 | A1 |
20080266321 | Aufranc et al. | Oct 2008 | A1 |
20080278689 | Read | Nov 2008 | A1 |
20090001272 | Hajjar | Jan 2009 | A1 |
20090002297 | Sakai et al. | Jan 2009 | A1 |
20090002787 | Cable et al. | Jan 2009 | A1 |
20090040133 | Clodfelter | Feb 2009 | A1 |
20090116520 | Oozeki | May 2009 | A1 |
20090128875 | Christmas et al. | May 2009 | A1 |
20090141242 | Silverstein et al. | Jun 2009 | A1 |
20090190103 | Takahashi et al. | Jul 2009 | A1 |
20090225234 | Ward et al. | Sep 2009 | A1 |
20090225395 | Ganti et al. | Sep 2009 | A1 |
20100007577 | Ninan | Jan 2010 | A1 |
20100141855 | Wynn | Jun 2010 | A1 |
20100149313 | Kroll et al. | Jun 2010 | A1 |
20100208327 | Sandstrom | Aug 2010 | A1 |
20110018911 | Kitaoka et al. | Jan 2011 | A1 |
20110019112 | Dolgoff | Jan 2011 | A1 |
20110037953 | Nizani et al. | Feb 2011 | A1 |
20110101253 | Lal et al. | May 2011 | A1 |
20110122467 | Futterer et al. | May 2011 | A1 |
20120001834 | Hudman et al. | Jan 2012 | A1 |
20120032999 | Seetzen | Feb 2012 | A1 |
20120200476 | Kanamori et al. | Aug 2012 | A1 |
20120229430 | Ward et al. | Sep 2012 | A1 |
20130015367 | Cui | Jan 2013 | A1 |
20130038838 | Ferri | Feb 2013 | A1 |
20130070320 | Holmes | Mar 2013 | A1 |
20130077308 | Svensen et al. | Mar 2013 | A1 |
20130162952 | Lippey | Jun 2013 | A1 |
20130170007 | Kurashige et al. | Jul 2013 | A1 |
20130182322 | Silverstein | Jul 2013 | A1 |
20130201403 | Iversen | Aug 2013 | A1 |
20130214688 | Chapman et al. | Aug 2013 | A1 |
20130215012 | Reddy et al. | Aug 2013 | A1 |
20130250049 | Schwerdtner | Sep 2013 | A1 |
20130265622 | Christmas et al. | Oct 2013 | A1 |
20140002514 | Richards | Jan 2014 | A1 |
20140029858 | Tung | Jan 2014 | A1 |
20140035919 | Majumder | Feb 2014 | A1 |
20140043352 | Damberg | Feb 2014 | A1 |
20140055692 | Kroll et al. | Feb 2014 | A1 |
20140268330 | Perkins | Sep 2014 | A1 |
20150042895 | Jannard et al. | Feb 2015 | A1 |
20150172610 | Candry | Jun 2015 | A1 |
20160284260 | Mizuno | Sep 2016 | A1 |
20160381329 | Damberg et al. | Dec 2016 | A1 |
20170078629 | Kozak et al. | Mar 2017 | A1 |
20170085846 | Damberg et al. | Mar 2017 | A1 |
20170127025 | Damberg et al. | May 2017 | A1 |
20170150107 | Kozak et al. | May 2017 | A1 |
20170192224 | Logiudice et al. | Jul 2017 | A1 |
20180373129 | Pertierra et al. | Dec 2018 | A1 |
20180376115 | Damberg et al. | Dec 2018 | A1 |
20190124304 | Kozak et al. | Apr 2019 | A1 |
20200004115 | Kyosuna et al. | Jan 2020 | A1 |
Number | Date | Country |
---|---|---|
2088497 | Feb 1992 | CA |
2443494 | Mar 2005 | CA |
2884903 | Sep 2015 | CA |
2956844 | Feb 2016 | CA |
101295123 | Oct 2008 | CN |
102053371 | May 2011 | CN |
103325129 | Sep 2013 | CN |
103477640 | Dec 2013 | CN |
106662753 | May 2017 | CN |
102005021155.0 | Nov 2006 | DE |
102009028626.8 | Jan 2011 | DE |
1098536 | May 2001 | EP |
1363460 | Nov 2003 | EP |
0927379 | Dec 2005 | EP |
3180652 | Apr 2018 | EP |
2398130 | Aug 2004 | GB |
2482066 | Jan 2012 | GB |
2485609 | May 2012 | GB |
2499579 | Aug 2013 | GB |
H06242509 | Sep 1994 | JP |
H095881 | Jan 1997 | JP |
H11337871 | Dec 1999 | JP |
2003125317 | Apr 2003 | JP |
2007033576 | Feb 2007 | JP |
2007033577 | Feb 2007 | JP |
2007532983 | Nov 2007 | JP |
2008015064 | Jan 2008 | JP |
2008089686 | Apr 2008 | JP |
2008197386 | Aug 2008 | JP |
5287121 | Oct 2008 | JP |
2009042372 | Feb 2009 | JP |
2009180821 | Aug 2009 | JP |
2010529484 | Aug 2010 | JP |
2010533889 | Oct 2010 | JP |
2011502274 | Jan 2011 | JP |
2011514546 | May 2011 | JP |
2011227324 | Nov 2011 | JP |
2012237814 | Dec 2012 | JP |
2013015599 | Jan 2013 | JP |
2014513316 | May 2014 | JP |
2014517337 | Jul 2014 | JP |
2014518400 | Jul 2014 | JP |
2015510150 | Apr 2015 | JP |
2017527111 | Sep 2017 | JP |
0125848 | Apr 2001 | WO |
2004046805 | Jun 2004 | WO |
2006116536 | Nov 2006 | WO |
2008013368 | Jan 2008 | WO |
2008049917 | May 2008 | WO |
2008075096 | Jun 2008 | WO |
2009089211 | Jul 2009 | WO |
2009126263 | Oct 2009 | WO |
2010125367 | Nov 2010 | WO |
2010149587 | Dec 2010 | WO |
2011061914 | May 2011 | WO |
2011071701 | Jun 2011 | WO |
2012021567 | Feb 2012 | WO |
2012125756 | Sep 2012 | WO |
2012145200 | Oct 2012 | WO |
2012151262 | Nov 2012 | WO |
2012166536 | Dec 2012 | WO |
2012166682 | Dec 2012 | WO |
2013029667 | Mar 2013 | WO |
2013117903 | Aug 2013 | WO |
2013117923 | Aug 2013 | WO |
2013130037 | Sep 2013 | WO |
2015054797 | Apr 2015 | WO |
2015172236 | Nov 2015 | WO |
2015184549 | Dec 2015 | WO |
2016015163 | Feb 2016 | WO |
2016023133 | Feb 2016 | WO |
Entry |
---|
European Patent Office, Communication pursuant to Article 94(3) EPC, EP Patent Application 14854627.8, dated Dec. 6, 2019, 5 pages. |
European Patent Office, Communication pursuant to Article 94(3) EPC, EP Patent Application 15792616.3, dated Feb. 2, 2021, 9 pages. |
European Patent Office, Communication pursuant to Article 94(3) EPC, EP Patent Application 15827729.3, dated Dec. 17, 2019, 8 pages. |
European Patent Office, Communication pursuant to Article 94(3), EP Patent Application 16852940.2, dated Mar. 26, 2021, 6 pages. |
European Patent Office, Examination Report for European Application No. 14854627.8; dated Dec. 13, 2018, 5 pages. |
European Patent Office, Extended European Search Report, EP Patent Application 15792616.3, dated Dec. 7, 2017, 11 pages. |
European Patent Office, Extended European Search Report, EP Patent Application 15827729.3, dated Feb. 20, 2018, 13 pages. |
European Patent Office, Extended European Search Report, EP Patent Application 16852940.2, dated May 9, 2019, 8 pages. |
European Patent Office, Extended European Search Report, EP Patent Application 21214505.6, dated May 9, 2022, 9 pages. |
Hoskinson et al. “Light Reallocation for High Contrast Projection Using an Analog Micromirror Array,” ACM Siggraph conference proceedings, Dec. 15, 2010, 10 pages. |
International Search Report and Written Opinion, PCT Patent Application PCT/CA2014/051013, dated Jan. 30, 2015, 11 pages. |
International Search Report and Written Opinion, PCT Patent Application PCT/CA2015/050778, dated Nov. 24, 2015, 10 pages. |
Murdoch, Michael J. et al. “Veiling Glare and Perceived Black in High Dynamic Range Displays,” Journal of the Optical Society of America, vol. 29, No. 4, p. 559, Apr. 1, 2012, 17 pages. |
Office Action for Japanese Application No. 2016-525963; dated Nov. 9, 2018; 10 pages. |
Schwartzburg, Y. et al. “High-Contrast Computational Caustic Design,” ACM Transactions on Graphics, ACM, US vol. 33, No. 4, Jul. 27, 2014, 11 pages. |
Number | Date | Country | |
---|---|---|---|
20230027499 A1 | Jan 2023 | US |
Number | Date | Country | |
---|---|---|---|
62148041 | Apr 2015 | US | |
61994002 | May 2014 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16228669 | Dec 2018 | US |
Child | 17839418 | US | |
Parent | 15359427 | Nov 2016 | US |
Child | 16228669 | US | |
Parent | 15312165 | US | |
Child | 15359427 | US |