In the following Detailed Description, reference is made to the accompanying drawings, which form a part hereof, and in which is shown by way of illustration specific embodiments in which the invention may be practiced. In this regard, directional terminology, such as “top,” “bottom,” “front,” “back,” etc., may be used with reference to the orientation of the Figure(s) being described. Because components of embodiments of the present invention can be positioned in a number of different orientations, the directional terminology is used for purposes of illustration and is in no way limiting. It is to be understood that other embodiments may be utilized and structural or logical changes may be made without departing from the scope of the present invention. The following Detailed Description, therefore, is not to be taken in a limiting sense, and the scope of the present invention is defined by the appended claims.
As described herein, a system for viewing different subsets of images from a set of simultaneously displayed and at least partially overlapping images by different viewers is provided. The system includes two or more subsets of projectors, where each subset of projectors simultaneously projects a different image onto a display surface in positions that at least partially overlap, and a channel selection device. The channel selection device allows different subsets of the projected images (also referred to herein as channels) to be viewed by different viewers. To do so, the channel selection device causes a subset of the images to be viewed by each viewer while preventing another subset of the images from being seen by each viewer. The channel selection device may also allow the full set of images to be viewed by one or more viewers as a channel while other viewers are viewing only a subset of the images. Accordingly, different viewers viewing the same display surface at the same time may see different content in the same location on the display surface.
Each subset of projectors includes one or more projectors. Where a subset of projectors includes two or more projectors, each projector projects a sub-frame formed according to a geometric relationship between the projectors in the subset. The images may each be still images that are displayed for a relatively long period of time, video images from video streams that are displayed for a relatively short period of time, or any combination of still and video images. In addition, the images may be fully or substantially fully overlapping (e.g., superimposed on one another), partially overlapping (e.g., tiled where the images have a small area of overlap), or any combination of fully and partially overlapping. Further, the area of overlap between any two images in the set of images may change spatially, temporally, or any combination of spatially and temporally.
Image display system 100 processes one or more sets of image data 102 and generates a set of displayed images 114 on a display surface 116 where at least two of the displayed images are displayed in at least partially overlapping positions on display surface 116.
Displayed images 114 are defined to include any combination of pictorial, graphical, or textual characters, symbols, illustrations, or other representations of information. Displayed images 114 may each be still images that are displayed for a relatively long period of time, video images from video streams that are displayed for a relatively short period of time, or any combination of still and video images. In addition, at least two of the set of displayed images 114 are fully overlapping (e.g., superimposed on one another; or one image fully contained within another image), substantially fully overlapping (e.g., superimposed with a small area that does not overlap), or partially overlapping (e.g., partially superimposed; or tiled where the images have a small area of overlap) either continuously or at various times. Other images in the set of displayed images 114 may also overlap by any degree with or be separated from the overlapping images in the set of displayed images 114. Any area of overlap or separation between any two images in the set of displayed images 114 may change spatially, temporally, or any combination of spatially and temporally.
A channel selection device 130 is configured to allow different subsets 132(1)-132(N) (collectively referred to as subsets 132) of the at least partially overlapping images 114 to be simultaneously viewed by viewers 140(1)-140(N) (collectively referred to as viewers 140) on display surface 116 where N is an integer greater than or equal to two. Subset 132 may also refer to the entire set of displayed images. Accordingly, different viewers 140 viewing display surface 116 at the same time may see different subsets 132 of images 114. Subsets 132 are also referred to herein as channels when describing what viewers 140 see.
Image frame buffer 104 receives and buffers sets of image data 102 to create sets of image frames 106. In one embodiment, each set of image data 102 corresponds to a different image in the set of displayed images 114 and each set of image frames 106 is formed from a different set of image data 102. In another embodiment, each set of image data 102 corresponds to one or more than one of the images in the set of displayed images 114 and each set of image frames 106 is formed from one or more than one set of image data 102. In a further embodiment, a single set of image data 102 may correspond to all of the images in the set of displayed images 114 and each set of image frames 106 is formed from the single set of image data 102.
Sub-frame generator 108 processes the sets of image frames 106 to define corresponding image sub-frames 110(1)-110(M) (collectively referred to as sub-frames 110) and provides sub-frames 110(1)-110(M) to projectors 112(1)-112(M), respectively. Sub-frames 110 are received by projectors 112, respectively, and stored in image frame buffers 113(1)-113(M) (collectively referred to as image frame buffers 113), respectively. Projectors 112(1)-112(M) project the sub-frames 110(1)-110(M), respectively, to produce video image streams 115(1)-115(M) (individually referred to as a video stream 115 or collectively referred to as video streams 115), respectively, that project through or onto channel selection device 130 and onto display surface 116 to produce the set of displayed images 114. Each image in the set of displayed images 114 is formed from a subset of sub-frames 110(1)-110(M) projected by a respective subset of projectors 112(1)-112(M). For example, sub-frames 110(1)-110(i) may be projected by projectors 112(1)-112(i) to form a first image in the set of displayed images 114, and sub-frames 110(i+1)-110(M) may be projected by projectors 112(i+1)-112(M) to form a second image in the set of displayed images 114 where i is an integer index from 1 to M that represents the ith sub-frame 110 in the set of sub-frames 110(1)-110(M) and the ith projector 112 in the set of projectors 112(1)-112(M).
Projectors 112 receive image sub-frames 110 from sub-frame generator 108 and simultaneously project the image sub-frames 110 onto display surface 116. As noted above, different subsets of projectors 112(1)-112(M) form different images in the set of displayed images 114 by projecting respective subsets of sub-frames 110(1)-110(M). The subsets of projectors 112 project the subsets of sub-frames 110 such that the set of displayed images 114 appears in any suitable superimposed, tiled, or separated arrangement, or combination thereof, on display surface 116 where at least two of the images in the set of displayed images 114 at least partially overlap.
Each image in displayed images 114 may be formed by a subset of projectors 112 that include one or more projectors 112. Where a subset of projectors 112 includes one projector 112, the projector 112 in the subset projects a sub-frame 110 onto display surface 116 to produce an image in the set of displayed images 114.
Where a subset of projectors 112 includes more than one projector 112, the subset of projectors 112 simultaneously project a corresponding subset of sub-frames 110 onto display surface 116 at overlapping and spatially offset positions to produce an image in the set of displayed images 114. An example of a subset of sub-frames 110 projected at overlapping and spatially offset positions to form an image in the set of displayed images 114 is described with reference to
Sub-frame generator 108 forms each subset of two or more sub-frames 110 according to a geometric relationship between each of the projectors 112 in a given subset as described in additional detail below with reference to the embodiments of
In one embodiment, image display system 100 attempts to determine appropriate values for the sub-frames 110 so that each image in the set of displayed images 114 produced by the projected sub-frames 110 is close in appearance to how a corresponding high-resolution image (e.g., a corresponding image frame 106) from which the sub-frame or sub-frames 110 were derived would appear if displayed directly.
Display system 100 includes at least one camera 122 and calibration unit 124, which are used to automatically determine a geometric relationship between each projector 112 in each subset of projectors 112 and the reference projector 118, as described in further detail below with reference to the embodiments of
Channel selection device 130 is configured to allow different subsets in the set of displayed images 114 to be viewed by different viewers 140. To do so, channel selection device 130 causes a subset of the set of displayed images 114 to be viewed by each viewer 140 while simultaneously preventing another subset of the set of displayed images 114 from being seen by each viewer 140. Channel selection device 130 may also be configured to allow selected viewers to view the entire set of displayed images 114 without preventing any of the images in the set of displayed images 114 from being seen by the selected viewers. Accordingly, different viewers 140 viewing the same portion of display surface 116 at the same time may see different subsets of the set of displayed images 114 or the entire set of displayed images 114.
When viewed without channel selection device 130, the set of displayed images 114 may appear distorted to viewers 140 where the content of two or more of the images that overlap are unrelated or independent of one another. For example, if one of the images is from a first television channel and another of the images is from a second, unrelated television channel, the overall appearance of the set of displayed images 114 may be distorted and unwatchable in the region of overlap.
If the contents of the overlapping images are related, dependent upon one another, or complementary, then the overall appearance of the set of displayed images 114 may be undistorted in the region of overlap. For example, if one of the images is from a movie without visual enhancements and another of the images is from the same movie with visual enhancements (e.g., sub-titles, notes of explanation, additional, alternative, or selected audience content, etc.), then the full set of displayed images 114 may be viewed by one or more viewers 140 without distortion.
If the overlapping images in the set of displayed images 114 are unrelated or independent, channel selection device 130 eliminates the distortion caused by the overlapping images by simultaneously allowing viewers 140(1) and 140(2) to view subsets 132(1) and 132(2), respectively, and preventing viewers 140(1) and 140(2) from seeing unrelated or independent subsets of overlapping images in the set of displayed images 114. As a result, subsets 132(1) and 132(2) appear undistorted and watchable by viewers 140(1) and 140(2), respectively. In the example set forth above for unrelated or independent overlapping images, channel selection device 130 may cause subset 132(1) to include the first television channel, but not the second, unrelated television channel so that viewer 140(1) sees only the first television channel. Similarly, channel selection device 130 may cause subset 132(2) to include the second television channel, but not the first, unrelated television channel so that viewer 140(2) sees only the second television channel.
If the overlapping images in the set of displayed images 114 are related, dependent, or complementary, channel selection device 130 prevents different subsets of the overlapping images from being seen by viewers 140(1) and 140(2), respectively. Each subset 132(1) and 132(2) appears undistorted and fully watchable by viewers 140(1) and 140(2), respectively. Each subset 132(1) and 132(2), however, includes a different subset of images from the set of displayed images 114. In the example set forth above for related, dependent, or complementary overlapping images, channel selection device 130 may cause each subset 132(1) and 132(2) to selectively include a different subset of visual enhancements in a movie that appear in the display of the full set of displayed images 114. For example, subset 132(1) may include images that form additional content for mature audiences but not images that form sub-titles. Similarly, subset 132(2) may include the images that form the sub-titles but not the images that form the content for mature audiences. A third subset 132(3) (not shown in
Projector comb filters 152 are each configured to filter selected light frequency ranges in the visible light spectrum from respective projectors 112. Accordingly, projector comb filters 152 pass selected frequency ranges from respective projectors 112 and block selected frequency ranges from respective projectors 112. Projector comb filters 152 receive video streams 115, respectively, filter selected frequency ranges in video streams 115, and transmit the filtered video streams onto display surface 116.
Along with projectors 112, projector comb filters 152 are divided into subsets where each projector comb filter 152 in a subset is configured to filter the same frequency ranges and different subsets are configured to filter different frequency ranges. The frequency ranges of different subsets may be mutually exclusive or may partially overlap with the frequency ranges in another subset. For example, a first subset of projector comb filters 152 may include projector comb filters 152(1)-152(i) that filter a first set of frequency ranges (where i is an integer index from 1 to M-1 that represents the ith projector comb filter 152 in the set of projector comb filters 152(1)-152(M-1)), and a second subset of projector comb filters 152 may include projector comb filters 152(i+1)-152(M) that filter a second set of frequency ranges that differs from the first set of frequency ranges. In addition, the frequency ranges of different subsets of projector comb filters 152 may vary over time such that the specific frequency range of each subset varies as a function of time.
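As a rough illustration of this subset structure (a sketch, not part of the patent; the wavelength intervals below are invented values), the passbands of two mutually exclusive subsets can be modeled as intervals:

```python
# Illustrative sketch: model each subset of projector comb filters 152 as a list
# of (lo, hi) passband intervals in nanometers and check mutual exclusivity.
# The interval values below are assumptions chosen for illustration only.
subset_1 = [(450, 465), (480, 495), (510, 530), (620, 650)]  # first subset passbands
subset_2 = [(465, 480), (495, 510), (540, 560), (660, 690)]  # second subset passbands

def overlaps(a, b):
    """Return True if two (lo, hi) wavelength intervals overlap."""
    return a[0] < b[1] and b[0] < a[1]

def mutually_exclusive(f1, f2):
    """True if no passband of f1 overlaps any passband of f2."""
    return not any(overlaps(a, b) for a in f1 for b in f2)

print(mutually_exclusive(subset_1, subset_2))  # True: the two channels are separable
```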
Graph 182(1) illustrates the wavelength ranges filtered by a first subset of projector comb filters 152. The shaded regions indicate wavelength ranges that are filtered by the first subset. As shown, the first subset passes portions of the wavelength range for each color (blue, green, and red). For example, the first subset passes a range of wavelengths 182 and a range of wavelengths 184 in the blue light wavelength range. Similar ranges of wavelengths are passed in the green and red light wavelength ranges.
Graph 182(2) illustrates the wavelength ranges filtered by a second subset of projector comb filters 152. The shaded regions indicate wavelength ranges that are filtered by the second subset. As shown, the second subset passes portions of the wavelength range for each color (blue, green, and red). For example, the second subset passes a range of wavelengths 188 and a range of wavelengths 190 in the blue light wavelength range. Similar ranges of wavelengths are passed in the green and red light wavelength ranges.
Graph 182(P) illustrates the wavelength ranges filtered by a Pth subset of projector comb filters 152. The shaded regions indicate wavelength ranges that are filtered by the Pth subset. As shown, the Pth subset passes portions of the wavelength range for each color (blue, green, and red). For example, the Pth subset passes a range of wavelengths 192 and a range of wavelengths 194 in the blue light wavelength range. Similar ranges of wavelengths are passed in the green and red light wavelength ranges.
Using subsets of projector comb filters 152, subsets of projectors 112 form different images in the set of displayed images 114 where each of the different images is formed using different ranges of light frequencies.
Viewer comb filters 154 are each configured to filter selected light frequency ranges in the visible light spectrum from display surface 116. Accordingly, viewer comb filters 154 pass selected frequency ranges from display surface 116 and block selected frequency ranges from display surface 116 to allow viewers 140 to see a selected subset of the set of displayed images 114. Viewer comb filters 154 receive the filtered video streams from display surface 116, filter selected frequency ranges in the filtered video streams to form subsets 132 of the set of displayed images 114, and transmit subsets 132 to viewers 140. A viewer comb filter 154 may also be configured to pass all frequency ranges to form a subset 132 and allow a viewer 140 to see the entire set of displayed images 114.
The frequency ranges filtered by each viewer comb filter 154 correspond to one or more subsets of projector comb filters 152. Accordingly, a viewer 140 using a given viewer comb filter 154 views the images in the set of displayed images 114 that correspond to one or more subsets of projectors 112 with projector comb filters 152 that pass the same frequency ranges as the given viewer comb filter 154. The frequency ranges filtered by each viewer comb filter 154 may vary over time and may be synchronized with one or more different subsets of projector comb filters 152 that also vary over time.
The shaded regions of graph 196(1) illustrate the wavelength ranges passed by viewer comb filter 154(1). As shown, viewer comb filter 154(1) passes portions of the wavelength range for each color (blue, green, and red). For example, viewer comb filter 154(1) passes a range of wavelengths 197 in the blue light wavelength range. At least one range of wavelengths is passed in each of the blue, green, and red color bands.
In one embodiment, each viewer comb filter 154 may be included in glasses or a visor that fits on the face of a viewer 140. In other embodiments, each viewer comb filter 154 may be included in any suitable substrate (e.g., a glass panel) positioned between a viewer 140 and display surface 116.
In other embodiments, one or more subsets of projectors 112 do not project video streams 115 through projector comb filters 152. In these embodiments, the images projected by these subsets of projectors 112 are included in each subset 132 and seen by all viewers 140.
Vertical polarizers 162(1)-162(i) are configured to transmit only vertically polarized light from video streams 115(1)-115(i), respectively, and horizontal polarizers 164(1)-164(M-i) are configured to transmit only horizontally polarized light from video streams 115(i+1)-115(M), respectively.
Vertical polarizers 162 are used with one or more subsets of projectors 112 to project one or more vertically polarized images on display surface 116. Horizontal polarizers 164 are used with one or more other subsets of projectors 112 to project one or more horizontally polarized images on display surface 116.
Vertically polarized filter 166 and horizontally polarized filter 168 each receive the polarized images from display surface 116. Vertically polarized filter 166 filters the images from display surface 116 that are not vertically polarized to form a subset 132(1) that includes only vertically polarized images. Likewise, horizontally polarized filter 168 filters the images from display surface 116 that are not horizontally polarized to form a subset 132(2) that includes only horizontally polarized images. Another subset 132(3) is not filtered by either vertically polarized filter 166 or horizontally polarized filter 168 and includes the entire set of displayed images 114 including both vertically and horizontally polarized images.
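As a minimal sketch of this selection logic (illustrative Python; the image names and polarization tags are invented), a viewer-side filter simply selects the images whose polarization it passes, while no filter passes the entire set:

```python
# Illustrative sketch: tag each displayed image with its polarization and select
# a viewer's channel by filtering on that tag. Image names are assumptions.
images = {"movie": "vertical", "subtitles": "horizontal"}

def channel(filter_polarization):
    """Images passed by a viewer-side polarized filter; None passes everything."""
    if filter_polarization is None:          # no filter: entire set of images 114
        return set(images)
    return {name for name, pol in images.items() if pol == filter_polarization}

print(channel("vertical"))    # subset 132(1): {'movie'}
print(channel("horizontal"))  # subset 132(2): {'subtitles'}
print(channel(None))          # subset 132(3): both images
```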
Vertical polarizers 162 and horizontal polarizers 164 may be integrated with projectors 112 (e.g., inserted into the projection paths of projectors 112 or formed as part of specialized color wheels that transmit only the desired polarized light) or may be adjacent or otherwise external to projectors 112 in the projection path between projectors 112 and display surface 116.
In one embodiment, both vertically polarized filter 166 and horizontally polarized filter 168 may be included in a separate apparatus (not shown) for each viewer 140 where the respective apparatuses are positioned between the respective viewers 140 and display surface 116. In this embodiment, a viewer 140 or other operator selects vertically polarized filter 166, horizontally polarized filter 168, or neither vertically polarized filter 166 nor horizontally polarized filter 168 for use at a given time to allow subset 132(1), 132(2), or 132(3), respectively, to be viewed by a viewer 140. In other embodiments, an apparatus with both vertically polarized filter 166 and horizontally polarized filter 168 may be formed for multiple viewers 140. In further embodiments, an apparatus with only one of vertically polarized filter 166 and horizontally polarized filter 168 may be formed for each viewer 140 or multiple viewers 140. In each of the above embodiments, the apparatus may be glasses or a visor that fits on the face of a viewer 140 or any suitable substrate (e.g., a glass panel) positioned between a viewer 140 and display surface 116.
In other embodiments, one or more subsets of projectors 112 do not project video streams 115 through vertical polarizers 162 or horizontal polarizers 164. In these embodiments, the images projected by these subsets of projectors 112 are included in each subset 132 and seen by all viewers 140.
In other embodiments of channel selection device 130B, diagonal polarizers (not shown) may be used in place of or in addition to vertical polarizers 162 and horizontal polarizers 164 for one or more subsets of projectors 112, and diagonal polarized filters may be used in place of or in addition to vertically polarized filter 166 and horizontally polarized filter 168. For example, diagonal polarizers with a 45 degree polarization may be configured to transmit only 45 degree polarized light from video streams 115 and diagonal polarizers with a 135 degree polarization may be configured to transmit only 135 degree polarized light from video streams 115.
In these embodiments, any vertically polarized filters 166 filter the images from display surface 116 that are horizontally polarized to form a subset 132 that includes the vertically and 45 and 135 degree diagonally polarized images. Likewise, any horizontally polarized filters 168 filter the images from display surface 116 that are vertically polarized to form a subset 132 that includes horizontally and 45 and 135 degree diagonally polarized images. Further, any 45 degree polarized filters filter the images from display surface 116 that are 135 degree polarized to form a subset 132 that includes vertically, horizontally, and 45 degree polarized images. Similarly, any 135 degree polarized filters filter the images from display surface 116 that are 45 degree polarized to form a subset 132 that includes vertically, horizontally, and 135 degree polarized images.
In further embodiments of channel selection device 130B, circular polarizers (not shown) may be used in place of or in addition to vertical polarizers 162 and horizontal polarizers 164 for one or more subsets of projectors 112, and circularly polarized filters may be used in place of or in addition to vertically polarized filter 166 and horizontally polarized filter 168. The circular polarizers may include clockwise circular polarizers and counterclockwise circular polarizers where clockwise circular polarizers polarize video streams 115 into clockwise polarizations and counterclockwise circular polarizers polarize video streams 115 into counterclockwise polarizations.
In these embodiments, clockwise circularly polarized filters filter the images from display surface 116 that are counterclockwise circularly polarized to form a subset 132 that includes the clockwise circularly polarized images. Similarly, counterclockwise circularly polarized filters filter the images from display surface 116 that are clockwise circularly polarized to form a subset 132 that includes the counterclockwise circularly polarized images.
In the above embodiments, vertical polarizers 162 and horizontal polarizers 164 form complementary polarizers that form complementary polarizations (i.e., vertical and horizontal polarizations). 45 degree diagonal polarizers and 135 degree diagonal polarizers also form complementary polarizers that form complementary polarizations (i.e., 45 degree diagonal and 135 degree diagonal polarizations). In addition, clockwise circular polarizers and counterclockwise circular polarizers form complementary polarizers that form complementary polarizations (i.e., clockwise circular polarizations and counterclockwise circular polarizations).
In the above embodiments, the polarizations of one or more subsets of projectors 112 may be time varying (e.g., by rotating or otherwise adjusting a polarizer). In addition, the polarizations filtered by a polarized filter may vary over time and may be synchronized with one or more subsets of projectors 112 with varying polarizations.
Display surface 116 may be configured to reflect or absorb selected polarizations of light in the above embodiments.
Shutter devices 172 are synchronized with the periodic time intervals of one or more subsets of projectors 112. Although each shutter device 172 receives all images projected on display surface 116, each shutter device 172 transmits any images on display surface 116 to a respective viewer 140 only during selected time intervals. During other time intervals, each shutter device 172 blocks the transmission of all images on display surface 116. A shutter device 172 may also be operated to transmit during all time intervals to allow a viewer 140 to see the entire set of displayed images 114.
For example, a first subset of projectors 112 may project images during a first set of time intervals, and a second subset of projectors 112 may project images during a second set of time intervals that is mutually exclusive with the first set of time intervals (e.g., alternating). A shutter device 172(1) transmits the images on display surface 116 to viewer 140(1) during the first set of time intervals and blocks the transmission of images on display surface 116 during the second set of time intervals. Likewise, a shutter device 172(2) transmits the images on display surface 116 to viewer 140(2) during the second set of time intervals and blocks the transmission of images on display surface 116 during the first set of time intervals.
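A minimal sketch of this alternating scheme, assuming a simple even/odd schedule (the schedule and labels are illustrative, not from the patent):

```python
def projecting_subset(t):
    """Assume subset 1 of projectors 112 projects on even intervals, subset 2 on odd."""
    return 1 if t % 2 == 0 else 2

def shutter_transmits(viewer_subset, t):
    """A shutter device 172 transmits only during its subset's intervals;
    a shutter left open in all intervals shows the entire set of displayed images."""
    return viewer_subset == "all" or projecting_subset(t) == viewer_subset

for t in range(4):
    open_shutters = [v for v in (1, 2, "all") if shutter_transmits(v, t)]
    print(f"interval {t}: subset {projecting_subset(t)} projects; shutters open for {open_shutters}")
```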
In one embodiment, shutter devices 172 include electronic shutters such as liquid crystal display (LCD) shutters. In other embodiments, shutter devices 172 include mechanical or other types of shutters.
In one embodiment, each shutter device 172 may be included in glasses or a visor that fits on the face of a viewer 140. In other embodiments, each shutter device 172 may be included in any suitable substrate (e.g., a glass panel) positioned between a viewer 140 and display surface 116.
In one embodiment, projectors 112 may be configured to operate with an increased frame rate (e.g., 60 frames per second) or the number of overlapping images on display surface 116 may be limited to minimize any flicker effects experienced by viewers 140.
Lenticular array 178 may be periodically configured to change or adjust the direction of display of one or more subsets 132. In addition, lenticular array 178 may be operated to transmit the entire set of displayed images 114 in a selected direction at various times.
Lenticular array 178 may be adjacent to display surface 116 (as shown in
Each of the embodiments 130A-130D of channel selection device 130 may be preconfigured to allow a viewer to see a predetermined subset 132 or may be switchable to allow subsets 132 to be selected any time before or during viewing of display surface 116. Channel selection devices 130 may be switchable for individual viewers 140 by operating switches on components of channel selection device 130 to select a subset 132. The switches may be operated directly on each component or may be operated remotely using any suitable wired or wireless connection. For example, viewer comb filters 154 (shown in
Although described above as providing different subsets 132 to different viewers 140, channel selection device 130 may also provide different subsets 132 to each eye of each viewer 140 in other embodiments to allow viewers 140 to see 3D or stereoscopic images.
In one embodiment, sub-frame generator 108 generates image sub-frames 110 with a resolution that matches the resolution of projectors 112, which is less than the resolution of image frames 106 in one embodiment. Sub-frames 110 each include a plurality of columns and a plurality of rows of individual pixels representing a subset of an image frame 106.
In one embodiment, display system 100 is configured to give the appearance to the human eye of high-resolution displayed images 114 by displaying overlapping and spatially shifted lower-resolution sub-frames 110 from at least one subset of projectors 112. The projection of overlapping and spatially shifted sub-frames 110 may give the appearance of enhanced resolution (i.e., higher resolution than the sub-frames 110 themselves).
Sub-frames 110 projected onto display surface 116 may have perspective distortions, and the pixels may not appear as perfect squares with no variation in the offsets and overlaps from pixel to pixel, such as that shown in
Image display system 100 includes hardware, software, firmware, or a combination of these. In one embodiment, one or more components of image display system 100 are included in a computer, computer server, or other microprocessor-based system capable of performing a sequence of logic operations. In addition, processing can be distributed throughout the system with individual portions being implemented in separate system components, such as in a networked or multiple computing unit environments.
Sub-frame generator 108 may be implemented in hardware, software, firmware, or any combination thereof. For example, sub-frame generator 108 may include a microprocessor, programmable logic device, or state machine. Sub-frame generator 108 may also include software stored on one or more computer-readable mediums and executable by a processing system (not shown). The term computer-readable medium as used herein is defined to include any kind of memory, volatile or non-volatile, such as floppy disks, hard disks, CD-ROMs, flash memory, read-only memory, and random access memory.
Image frame buffer 104 includes memory for storing image data 102 for the sets of image frames 106. Thus, image frame buffer 104 constitutes a database of image frames 106. Image frame buffers 113 also include memory for storing any number of sub-frames 110. Examples of image frame buffers 104 and 113 include non-volatile memory (e.g., a hard disk drive or other persistent storage device) and may include volatile memory (e.g., random access memory (RAM)).
Display surface 116 may be planar, non-planar, or curved, or may have any other suitable shape. In one embodiment, display surface 116 reflects the light projected by projectors 112 to form the set of displayed images 114. In another embodiment, display surface 116 is translucent, and display system 100 is configured as a rear projection system.
Sub-frame 110(1) is spatially offset from sub-frame 110(2) by a predetermined distance. Similarly, sub-frame 110(3) is spatially offset from sub-frame 110(4) by a predetermined distance. In one illustrative embodiment, vertical distance 204 and horizontal distance 206 are each approximately one-half of one pixel.
The display of sub-frames 110(2), 110(3), and 110(4) is spatially shifted relative to the display of sub-frame 110(1) by vertical distance 204, horizontal distance 206, or a combination of vertical distance 204 and horizontal distance 206. As such, pixels 202 of sub-frames 110(1), 110(2), 110(3), and 110(4) at least partially overlap, thereby producing the appearance of higher resolution pixels. Sub-frames 110(1), 110(2), 110(3), and 110(4) may be superimposed on one another (i.e., fully or substantially fully overlap), may be tiled (i.e., partially overlap at or near the edges), or may be a combination of superimposed and tiled. The overlapped sub-frames 110(1), 110(2), 110(3), and 110(4) also produce a brighter overall image than any of sub-frames 110(1), 110(2), 110(3), or 110(4) alone.
In other embodiments, other numbers of projectors 112 are used in system 100 and other numbers of sub-frames 110 are generated for each image frame 106.
In other embodiments, sub-frames 110(1), 110(2), 110(3), and 110(4) may be displayed at other spatial offsets relative to one another and the spatial offsets may vary over time.
In one embodiment, sub-frames 110 have a lower resolution than image frames 106. Thus, sub-frames 110 are also referred to herein as low-resolution images or sub-frames 110, and image frames 106 are also referred to herein as high-resolution images or frames 106. The terms low resolution and high resolution are used herein in a comparative fashion, and are not limited to any particular minimum or maximum number of pixels.
In one embodiment, sub-frame generator 108 determines appropriate values separately for each subset of sub-frames 110 where two or more sub-frames are used to form an image in the set of images 114 using the embodiments described with reference to
In other embodiments where two or more sub-frames are used to form an image in the set of images 114, sub-frame generator 108 determines appropriate values for one or more subsets of sub-frames 110 using images from camera 122 that include two or more subsets of sub-frames 110 with the embodiments described with reference to
In one embodiment, display system 100 produces at least a partially superimposed projected output that takes advantage of natural pixel mis-registration to provide a displayed image with a higher resolution than the individual sub-frames 110. In one embodiment, image formation due to a subset of multiple overlapped projectors 112 is modeled using a signal processing model. Optimal sub-frames 110 for each of the component projectors 112 in the subset are estimated by sub-frame generator 108 based on the model, such that the resulting image predicted by the signal processing model is as close as possible to the desired high-resolution image to be projected.
In one embodiment, sub-frame generator 108 is configured to generate a subset of sub-frames 110 based on the maximization of the probability that, given a desired high-resolution image, a simulated high-resolution image formed as a function of the sub-frame values is the same as the given, desired high-resolution image. If the generated subset of sub-frames 110 is optimal, the simulated high-resolution image will be as close as possible to the desired high-resolution image. The generation of optimal sub-frames 110 based on a simulated high-resolution image and a desired high-resolution image is described in further detail below with reference to the embodiment of
A. Multiple Color Sub-Frames
$Z_k = H_k D^T Y_k \qquad \text{Equation I}$
The low-resolution sub-frame pixel data (Yk) is expanded with the up-sampling matrix (DT) so that sub-frames 110 (Yk) can be represented on a high-resolution grid. The interpolating filter (Hk) fills in the missing pixel data produced by up-sampling.
In one embodiment, the geometric mapping (Fk) is a floating-point mapping, but the destinations in the mapping are on an integer grid in image 304. Thus, it is possible for multiple pixels in image 302 to be mapped to the same pixel location in image 304, resulting in missing pixels in image 304. To avoid this situation, in one embodiment, during the forward mapping (Fk), the inverse mapping (Fk−1) is also utilized as indicated at 305 in
In another embodiment, the forward geometric mapping or warp (Fk) is implemented directly, and the inverse mapping (Fk−1) is not used. In one form of this embodiment, a scatter operation is performed to eliminate missing pixels. That is, when a pixel in image 302 is mapped to a floating point location in image 304, some of the image data for the pixel is essentially scattered to multiple pixels neighboring the floating point location in image 304. Thus, each pixel in image 304 may receive contributions from multiple pixels in image 302, and each pixel in image 304 is normalized based on the number of contributions it receives.
A superposition/summation of such warped images 304 from all of the component projectors 112 forms a hypothetical or simulated high-resolution image 306 ($\hat{X}$, also referred to as X-hat herein) in reference projector frame buffer 120, as represented in the following Equation II:
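The body of Equation II is not reproduced in this text. A form consistent with the surrounding description, in which the warped images $F_k Z_k$ from the component projectors are summed, would be:

$\hat{X} = \sum_{k} F_k Z_k \qquad \text{Equation II (reconstructed)}$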
If the simulated high-resolution image 306 (X-hat) in reference projector frame buffer 120 is identical to a given (desired) high-resolution image 308 (X), the system of component low-resolution projectors 112 would be equivalent to a hypothetical high-resolution projector placed at the same location as hypothetical reference projector 118 and sharing its optical path. In one embodiment, the desired high-resolution images 308 are the high-resolution image frames 106 received by sub-frame generator 108.
In one embodiment, the deviation of the simulated high-resolution image 306 (X-hat) from the desired high-resolution image 308 (X) is modeled as shown in the following Equation III:
$X = \hat{X} + \eta \qquad \text{Equation III}$
As shown in Equation III, the desired high-resolution image 308 (X) is defined as the simulated high-resolution image 306 (X-hat) plus η, which in one embodiment represents zero mean white Gaussian noise.
The solution for the optimal sub-frame data (Yk*) for sub-frames 110 is formulated as the optimization given in the following Equation IV:
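The body of Equation IV is not reproduced in this text. Consistent with the goal stated in the next paragraph, a plausible form is:

$Y_k^{*} = \underset{Y_k}{\arg\max}\; P(\hat{X} \mid X) \qquad \text{Equation IV (reconstructed)}$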
Thus, as indicated by Equation IV, the goal of the optimization is to determine the sub-frame values (Yk) that maximize the probability of X-hat given X. Given a desired high-resolution image 308 (X) to be projected, sub-frame generator 108 determines the component sub-frames 110 that maximize the probability that the simulated high-resolution image 306 (X-hat) is the same as or matches the “true” high-resolution image 308 (X).
Using Bayes rule, the probability P(X-hat|X) in Equation IV can be written as shown in the following Equation V:
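The body of Equation V is not reproduced in this text; Bayes' rule gives:

$P(\hat{X} \mid X) = \dfrac{P(X \mid \hat{X})\, P(\hat{X})}{P(X)} \qquad \text{Equation V (reconstructed)}$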
The term P(X) in Equation V is a known constant. If X-hat is given, then, referring to Equation III, X depends only on the noise term, η, which is Gaussian. Thus, the term P(X|X-hat) in Equation V will have a Gaussian form as shown in the following Equation VI:
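The body of Equation VI is not reproduced in this text. For zero-mean white Gaussian noise with variance $\sigma^2$ (the normalizing constant $C$ is an assumption of this reconstruction), the Gaussian form would be:

$P(X \mid \hat{X}) = \dfrac{1}{C}\, e^{-\frac{\|X - \hat{X}\|^2}{2\sigma^2}} \qquad \text{Equation VI (reconstructed)}$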
To provide a solution that is robust to minor calibration errors and noise, a “smoothness” requirement is imposed on X-hat. In other words, it is assumed that good simulated images 306 have certain properties. The smoothness requirement according to one embodiment is expressed in terms of a desired Gaussian prior probability distribution for X-hat given by the following Equation VII:
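The body of Equation VII is not reproduced in this text. A Gaussian smoothness prior consistent with the gradient term that appears in Equations IX and X (the partition function $Z(\beta)$ is an assumption of this reconstruction) would be:

$P(\hat{X}) = \dfrac{1}{Z(\beta)}\, e^{-\beta^2 \|\nabla \hat{X}\|^2} \qquad \text{Equation VII (reconstructed)}$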
In another embodiment, the smoothness requirement is based on a prior Laplacian model, and is expressed in terms of a probability distribution for X-hat given by the following Equation VIII:
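The body of Equation VIII is not reproduced in this text; a Laplacian prior of the corresponding form would be:

$P(\hat{X}) = \dfrac{1}{Z(\beta)}\, e^{-\beta \|\nabla \hat{X}\|} \qquad \text{Equation VIII (reconstructed)}$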
The following discussion assumes that the probability distribution given in Equation VII, rather than Equation VIII, is being used. As will be understood by persons of ordinary skill in the art, a similar procedure would be followed if Equation VIII were used. Inserting the probability distributions from Equations VI and VII into Equation V, and inserting the result into Equation IV, results in a maximization problem involving the product of two probability distributions (note that the probability P(X) is a known constant and goes away in the calculation). By taking the negative logarithm, the exponents go away, the product of the two probability distributions becomes a sum of two probability distributions, and the maximization problem given in Equation IV is transformed into a function minimization problem, as shown in the following Equation IX:
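The body of Equation IX is not reproduced in this text. Taking the negative logarithm of the product of the distributions in Equations VI and VII yields a minimization of the form:

$Y_k^{*} = \underset{Y_k}{\arg\min}\; \left\{ \|X - \hat{X}\|^2 + \beta^2 \|\nabla \hat{X}\|^2 \right\} \qquad \text{Equation IX (reconstructed)}$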
The function minimization problem given in Equation IX is solved by substituting the definition of X-hat from Equation II into Equation IX and taking the derivative with respect to Yk, which results in an iterative algorithm given by the following Equation X:
$Y_k^{(n+1)} = Y_k^{(n)} - \Theta\left\{ D H_k^T F_k^T \left[ \left(\hat{X}^{(n)} - X\right) + \beta^2 \nabla^2 \hat{X}^{(n)} \right] \right\} \qquad \text{Equation X}$
Equation X may be intuitively understood as an iterative process of computing an error in the hypothetical reference projector coordinate system and projecting it back onto the sub-frame data. In one embodiment, sub-frame generator 108 is configured to generate sub-frames 110 in real-time using Equation X. The generated sub-frames 110 are optimal in one embodiment because they maximize the probability that the simulated high-resolution image 306 (X-hat) is the same as the desired high-resolution image 308 (X), and they minimize the error between the simulated high-resolution image 306 and the desired high-resolution image 308. Equation X can be implemented very efficiently with conventional image processing operations (e.g., transformations, down-sampling, and filtering). The iterative algorithm given by Equation X converges rapidly in a few iterations and is very efficient in terms of memory and computation (e.g., a single iteration uses two rows in memory; and multiple iterations may also be rolled into a single step). The iterative algorithm given by Equation X is suitable for real-time implementation, and may be used to generate optimal sub-frames 110 at video rates, for example.
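To make the structure of Equation X concrete, the following Python/NumPy sketch runs the update for a single projector under strong simplifying assumptions: the warp Fk is taken as the identity (so it is omitted), the interpolating filter is a nearest-neighbor fill-in, and the step size, smoothing weight, and image sizes are invented values. It is a toy illustration of the iteration, not the patent's calibrated implementation:

```python
import numpy as np

def upsample(y, s):
    # Stand-in for D^T followed by the interpolating filter H_k:
    # nearest-neighbor expansion of each sub-frame pixel into an s-by-s block.
    return np.kron(y, np.ones((s, s)))

def downsample(x, s):
    # Stand-in for D (and the transposed filtering): average s-by-s blocks.
    h, w = x.shape
    return x.reshape(h // s, s, w // s, s).mean(axis=(1, 3))

def laplacian(x):
    # Discrete approximation of the del^2 term in Equation X.
    return (-4 * x + np.roll(x, 1, 0) + np.roll(x, -1, 0)
            + np.roll(x, 1, 1) + np.roll(x, -1, 1))

def update_subframe(y, x_desired, s=2, theta=0.2, beta=0.1, iters=20):
    """Iterate Y^(n+1) = Y^(n) - theta * D[(X_hat - X) + beta^2 * del^2 X_hat]."""
    for _ in range(iters):
        x_hat = upsample(y, s)                      # simulated high-resolution image
        err = (x_hat - x_desired) + beta**2 * laplacian(x_hat)
        y = y - theta * downsample(err, s)          # project the error back onto the sub-frame
    return y

x = np.random.rand(8, 8)        # desired high-resolution image X
y = downsample(x, 2)            # initial guess, in the spirit of Equation XII
y_opt = update_subframe(y, x)
```

With the warp omitted, the update reduces to plain gradient descent on the function minimized in Equation IX, which is the intuition the surrounding text describes.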
To begin the iterative algorithm defined in Equation X, an initial guess, Yk(0), for sub-frames 110 is determined. In one embodiment, the initial guess for sub-frames 110 is determined by texture mapping the desired high-resolution frame 308 onto sub-frames 110. In one embodiment, the initial guess is determined from the following Equation XI:
$Y_k^{(0)} = D B_k F_k^T X \qquad \text{Equation XI}$
Thus, as indicated by Equation XI, the initial guess (Yk(0)) is determined by performing a geometric transformation (FkT) on the desired high-resolution frame 308 (X), and filtering (Bk) and down-sampling (D) the result. The particular combination of neighboring pixels from the desired high-resolution frame 308 that are used in generating the initial guess (Yk(0)) will depend on the selected filter kernel for the interpolation filter (Bk).
In another embodiment, the initial guess, Yk(0), for sub-frames 110 is determined from the following Equation XII:

$Y_k^{(0)} = D F_k^T X \qquad \text{Equation XII}$
Equation XII is the same as Equation XI, except that the interpolation filter (Bk) is not used.
Several techniques are available to determine the geometric mapping (Fk) between each projector 112 and hypothetical reference projector 118, including manually establishing the mappings, or using camera 122 and calibration unit 124 to automatically determine the mappings. In one embodiment, if camera 122 and calibration unit 124 are used, the geometric mappings between each projector 112 and camera 122 are determined by calibration unit 124. These projector-to-camera mappings may be denoted by Tk, where k is an index for identifying projectors 112. Based on the projector-to-camera mappings (Tk), the geometric mappings (Fk) between each projector 112 and hypothetical reference projector 118 are determined by calibration unit 124, and provided to sub-frame generator 108. For example, in a display system 100 with two projectors 112(1) and 112(2), assuming the first projector 112(1) is hypothetical reference projector 118, the geometric mapping of the second projector 112(2) to the first (reference) projector 112(1) can be determined as shown in the following Equation XIII:
$F_2 = T_2 T_1^{-1} \qquad \text{Equation XIII}$
In one embodiment, the geometric mappings (Fk) are determined once by calibration unit 124, and provided to sub-frame generator 108. In another embodiment, calibration unit 124 continually determines (e.g., once per frame 106) the geometric mappings (Fk), and continually provides updated values for the mappings to sub-frame generator 108.
B. Single Color Sub-Frames
In another embodiment, each projector 112 projects single-color sub-frames 110, and the image formation model is applied separately to each color plane i, as represented in the following Equation XIV:
$Z_{ik} = H_i D_i^T Y_{ik} \qquad \text{Equation XIV}$
The low-resolution sub-frame pixel data (Yik) is expanded with the up-sampling matrix (DiT) so that sub-frames 110 (Yik) can be represented on a high-resolution grid. The interpolating filter (Hi) fills in the missing pixel data produced by up-sampling.
In one embodiment, the geometric mapping (Fik) is a floating-point mapping, but the destinations in the mapping are on an integer grid in image 404. Thus, it is possible for multiple pixels in image 402 to be mapped to the same pixel location in image 404, resulting in missing pixels in image 404. To avoid this situation, in one embodiment, during the forward mapping (Fik), the inverse mapping (Fik−1) is also utilized as indicated at 405 in
In another embodiment, the forward geometric mapping or warp (Fik) is implemented directly, and the inverse mapping (Fik−1) is not used. In one form of this embodiment, a scatter operation is performed to eliminate missing pixels. That is, when a pixel in image 402 is mapped to a floating point location in image 404, some of the image data for the pixel is essentially scattered to multiple pixels neighboring the floating point location in image 404. Thus, each pixel in image 404 may receive contributions from multiple pixels in image 402, and each pixel in image 404 is normalized based on the number of contributions it receives.
A superposition/summation of such warped images 404 from all of the component projectors 112 in a given color plane forms a hypothetical or simulated high-resolution image (X-hati) for that color plane in reference projector frame buffer 120, as represented in the following Equation XV:
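The body of Equation XV is not reproduced in this text; a per-color-plane analog of Equation II would be:

$\hat{X}_i = \sum_{k} F_{ik} Z_{ik} \qquad \text{Equation XV (reconstructed)}$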
A hypothetical or simulated image 406 (X-hat) is represented by the following Equation XVI:
$\hat{X} = \left[ \hat{X}_1\ \hat{X}_2\ \cdots\ \hat{X}_N \right]^T \qquad \text{Equation XVI}$
If the simulated high-resolution image 406 (X-hat) in reference projector frame buffer 120 is identical to a given (desired) high-resolution image 408 (X), the system of component low-resolution projectors 112 would be equivalent to a hypothetical high-resolution projector placed at the same location as hypothetical reference projector 118 and sharing its optical path. In one embodiment, the desired high-resolution images 408 are the high-resolution image frames 106 received by sub-frame generator 108.
In one embodiment, the deviation of the simulated high-resolution image 406 (X-hat) from the desired high-resolution image 408 (X) is modeled as shown in the following Equation XVII:
$X = \hat{X} + \eta \qquad \text{Equation XVII}$
As shown in Equation XVII, the desired high-resolution image 408 (X) is defined as the simulated high-resolution image 406 (X-hat) plus η, which in one embodiment represents zero mean white Gaussian noise.
The solution for the optimal sub-frame data (Yik*) for sub-frames 110 is formulated as the optimization given in the following Equation XVIII:
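The body of Equation XVIII is not reproduced in this text; consistent with the stated goal, a plausible form is:

$Y_{ik}^{*} = \underset{Y_{ik}}{\arg\max}\; P(\hat{X} \mid X) \qquad \text{Equation XVIII (reconstructed)}$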
Thus, as indicated by Equation XVIII, the goal of the optimization is to determine the sub-frame values (Yik) that maximize the probability of X-hat given X. Given a desired high-resolution image 408 (X) to be projected, sub-frame generator 108 determines the component sub-frames 110 that maximize the probability that the simulated high-resolution image 406 (X-hat) is the same as or matches the “true” high-resolution image 408 (X).
Using Bayes rule, the probability P(X-hat|X) in Equation XVIII can be written as shown in the following Equation XIX:
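The body of Equation XIX is not reproduced in this text; Bayes' rule gives:

$P(\hat{X} \mid X) = \dfrac{P(X \mid \hat{X})\, P(\hat{X})}{P(X)} \qquad \text{Equation XIX (reconstructed)}$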
The term P(X) in Equation XIX is a known constant. If X-hat is given, then, referring to Equation XVII, X depends only on the noise term, η, which is Gaussian. Thus, the term P(X|X-hat) in Equation XIX will have a Gaussian form as shown in the following Equation XX:
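The body of Equation XX is not reproduced in this text; as with Equation VI, the Gaussian form would be:

$P(X \mid \hat{X}) = \dfrac{1}{C}\, e^{-\frac{\|X - \hat{X}\|^2}{2\sigma^2}} \qquad \text{Equation XX (reconstructed)}$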
To provide a solution that is robust to minor calibration errors and noise, a “smoothness” requirement is imposed on X-hat. In other words, it is assumed that good simulated images 406 have certain properties. For example, for most good color images, the luminance and chrominance derivatives are related by a certain value. In one embodiment, a smoothness requirement is imposed on the luminance and chrominance of the X-hat image based on a “Hel-Or” color prior model, which is a conventional color model known to those of ordinary skill in the art. The smoothness requirement according to one embodiment is expressed in terms of a desired probability distribution for X-hat given by the following Equation XXI:
In another embodiment, the smoothness requirement is based on a prior Laplacian model, and is expressed in terms of a probability distribution for X-hat given by the following Equation XXII:
The following discussion assumes that the probability distribution given in Equation XXI, rather than Equation XXII, is being used. As will be understood by persons of ordinary skill in the art, a similar procedure would be followed if Equation XXII were used. Inserting the probability distributions from Equations XX and XXI into Equation XIX, and inserting the result into Equation XVIII, results in a maximization problem involving the product of two probability distributions (note that the probability P(X) is a known constant and goes away in the calculation). By taking the negative logarithm, the exponents go away, the product of the two probability distributions becomes a sum of two probability distributions, and the maximization problem given in Equation XVIII is transformed into a function minimization problem, as shown in the following Equation XXIII:
where TLi is the ith element in the first row of a color transformation matrix, T, for transforming the luminance of X-hat.
The function minimization problem given in Equation XXIII is solved by substituting the definition of X-hati from Equation XV into Equation XXIII and taking the derivative with respect to Yik, which results in an iterative algorithm given by the following Equation XXIV:
Equation XXIV may be intuitively understood as an iterative process of computing an error in the hypothetical reference projector coordinate system and projecting it back onto the sub-frame data. In one embodiment, sub-frame generator 108 is configured to generate sub-frames 110 in real-time using Equation XXIV. The generated sub-frames 110 are optimal in one embodiment because they maximize the probability that the simulated high-resolution image 406 (X-hat) is the same as the desired high-resolution image 408 (X), and they minimize the error between the simulated high-resolution image 406 and the desired high-resolution image 408. Equation XXIV can be implemented very efficiently with conventional image processing operations (e.g., transformations, down-sampling, and filtering). The iterative algorithm given by Equation XXIV converges rapidly in a few iterations and is very efficient in terms of memory and computation (e.g., a single iteration uses two rows in memory; and multiple iterations may also be rolled into a single step). The iterative algorithm given by Equation XXIV is suitable for real-time implementation, and may be used to generate optimal sub-frames 110 at video rates, for example.
To begin the iterative algorithm defined in Equation XXIV, an initial guess, Yik(0), for sub-frames 110 is determined. In one embodiment, the initial guess for sub-frames 110 is determined by texture mapping the desired high-resolution frame 408 onto sub-frames 110. In one embodiment, the initial guess is determined from the following Equation XXV:
$Y_{ik}^{(0)} = D_i B_i F_{ik}^T X_i \qquad \text{Equation XXV}$
Thus, as indicated by Equation XXV, the initial guess (Yik(0)) is determined by performing a geometric transformation (FikT) on the ith color plane of the desired high-resolution frame 408 (Xi), and filtering (Bi) and down-sampling (Di) the result. The particular combination of neighboring pixels from the desired high-resolution frame 408 that are used in generating the initial guess (Yik(0)) will depend on the selected filter kernel for the interpolation filter (Bi).
In another embodiment, the initial guess, Yik(0), for sub-frames 110 is determined from the following Equation XXVI:
$Y_{ik}^{(0)} = D_i F_{ik}^T X_i \qquad \text{Equation XXVI}$
Equation XXVI is the same as Equation XXV, except that the interpolation filter (Bi) is not used.
Several techniques are available to determine the geometric mapping (Fik) between each projector 112 and hypothetical reference projector 118, including manually establishing the mappings, or using camera 122 and calibration unit 124 to automatically determine the mappings. In one embodiment, if camera 122 and calibration unit 124 are used, the geometric mappings between each projector 112 and camera 122 are determined by calibration unit 124. These projector-to-camera mappings may be denoted by Tk, where k is an index for identifying projectors 112. Based on the projector-to-camera mappings (Tk), the geometric mappings (Fik) between each projector 112 and hypothetical reference projector 118 are determined by calibration unit 124, and provided to sub-frame generator 108. For example, in a display system 100 with two projectors 112(1) and 112(2), assuming the first projector 112(1) is hypothetical reference projector 118, the geometric mapping of the second projector 112(2) to the first (reference) projector 112(1) can be determined as shown in the following Equation XXVII:
$F_2 = T_2 T_1^{-1} \qquad \text{Equation XXVII}$
In one embodiment, the geometric mappings (Fik) are determined once by calibration unit 124, and provided to sub-frame generator 108. In another embodiment, calibration unit 124 continually determines (e.g., once per frame 106) the geometric mappings (Fik), and continually provides updated values for the mappings to sub-frame generator 108.
One embodiment provides an image display system 100 with multiple overlapped low-resolution projectors 112 coupled with an efficient real-time (e.g., video rates) image processing algorithm for generating sub-frames 110. In one embodiment, multiple low-resolution, low-cost projectors 112 are used to produce high resolution images at high lumen levels, but at lower cost than existing high-resolution projection systems, such as a single, high-resolution, high-output projector. One embodiment provides a scalable image display system 100 that can provide virtually any desired resolution, brightness, and color, by adding any desired number of component projectors 112 to the system 100.
In some existing display systems, multiple low-resolution images are displayed with temporal and sub-pixel spatial offsets to enhance resolution. There are some important differences between these existing systems and embodiments described herein. For example, in one embodiment, there is no need for circuitry to offset the projected sub-frames 110 temporally. In one embodiment, sub-frames 110 from the component projectors 112 are projected “in-sync”. As another example, unlike some existing systems where all of the sub-frames go through the same optics and the shifts between sub-frames are all simple translational shifts, in one embodiment, sub-frames 110 are projected through the different optics of the multiple individual projectors 112. In one embodiment, the signal processing model that is used to generate optimal sub-frames 110 takes into account relative geometric distortion among the component sub-frames 110, and is robust to minor calibration errors and noise.
It can be difficult to accurately align projectors into a desired configuration. In one embodiment, regardless of what the particular projector configuration is, even if it is not an optimal alignment, sub-frame generator 108 determines and generates optimal sub-frames 110 for that particular configuration.
Algorithms that seek to enhance resolution by offsetting multiple projection elements have been previously proposed. These methods may assume simple shift offsets between projectors, use frequency domain analyses, and rely on heuristic methods to compute component sub-frames. In contrast, one form of the embodiments described herein utilizes an optimal real-time sub-frame generation algorithm that explicitly accounts for arbitrary relative geometric distortion (not limited to homographies) between the component projectors 112, including distortions that occur due to a display surface that is non-planar or has surface non-uniformities. One embodiment generates sub-frames 110 based on a geometric relationship between a hypothetical high-resolution reference projector at any arbitrary location and each of the actual low-resolution projectors 112, which may also be positioned at any arbitrary location.
In one embodiment, system 100 includes multiple overlapped low-resolution projectors 112, with each projector 112 projecting a different colorant to compose a full color high-resolution image on the display surface with minimal color artifacts due to the overlapped projection. By imposing a color-prior model via a Bayesian approach as is done in one embodiment, the generated solution for determining sub-frame values minimizes color aliasing artifacts and is robust to small modeling errors.
Using multiple off-the-shelf projectors 112 in system 100 allows for high resolution. However, if the projectors 112 include a color wheel, which is common in existing projectors, the system 100 may suffer from light loss, sequential color artifacts, poor color fidelity, reduced bit-depth, and a significant tradeoff in bit depth to add new colors. One embodiment described herein eliminates the need for a color wheel and uses, in its place, a different color filter for each projector 112. Thus, in one embodiment, projectors 112 each project different single-color images. By not using a color wheel, segment loss at the color wheel is eliminated, which could be up to a 30% loss in efficiency in single chip projectors. One embodiment increases perceived resolution, eliminates sequential color artifacts, improves color fidelity since no spatial or temporal dither is required, provides a high bit-depth per color, and allows for high-fidelity color.
Image display system 100 is also very efficient from a processing perspective since, in one embodiment, each projector 112 only processes one color plane. Thus, each projector 112 reads and renders only one-third (for RGB) of the full color data.
In one embodiment, image display system 100 is configured to project images that have a three-dimensional (3D) appearance. In 3D image display systems, two images, each with a different polarization, are simultaneously projected by two different projectors. One image corresponds to the left eye, and the other image corresponds to the right eye. Conventional 3D image display systems typically suffer from a lack of brightness. In contrast, with one embodiment, a first plurality of the projectors 112 may be used to produce any desired brightness for the first image (e.g., left eye image), and a second plurality of the projectors 112 may be used to produce any desired brightness for the second image (e.g., right eye image). In another embodiment, image display system 100 may be combined or used with other display systems or display techniques, such as tiled displays.
Although specific embodiments have been illustrated and described herein, it will be appreciated by those of ordinary skill in the art that a variety of alternate and/or equivalent implementations may be substituted for the specific embodiments shown and described without departing from the scope of the present invention. This application is intended to cover any adaptations or variations of the specific embodiments discussed herein. Therefore, it is intended that this invention be limited only by the claims and the equivalents thereof.
This application is related to U.S. patent application Ser. No. 11/080,583, filed Mar. 15, 2005, and entitled PROJECTION OF OVERLAPPING SUB-FRAMES ONTO A SURFACE; and U.S. patent application Ser. No. 11/080,223, filed Mar. 15, 2005, and entitled PROJECTION OF OVERLAPPING SINGLE-COLOR SUB-FRAMES ONTO A SURFACE. These applications are incorporated by reference herein.