This disclosure relates generally to multiple projector systems and more particularly to generating pixel intensity blending maps for multiple projector systems.
Multiple projector systems (i.e., projection systems using two or more projectors) are typical in applications with large projection surfaces. By utilizing multiple projectors, the spatial resolution of the projected image can be increased because each projector provides a portion of the total projected image on a portion of the total projection surface. That is, each projector projects only a fraction of the entire projected image and the composite of the projected images defines the whole projected image. Additionally, multiple projector systems allow images to be projected onto three-dimensional (3D) surfaces where light from a single projector may be blocked by the shape of the surface and may not reach all desired locations.
In multiple projector systems, such as the ones described above, the projected image from one projector often overlaps with the projected image of neighboring projectors in order to provide a continuous projected image on the projection surface. Overlapping regions thus receive light from more than one projector, increasing the intensity of light on the overlapping region and leading to noticeable and often distracting light intensity variations for a viewer, as well as changing the visual output from a desired light intensity and/or color. Additionally, traditional projectors are typically unable to completely eliminate light when projecting black images and light leakage can occur. In overlapping regions, this light leakage compounds with the inadvertent increased intensity to create perceptibly lighter regions. Compensation techniques for these issues are often complicated by the size and shape of the projection surface. If the projection surface is three-dimensional, for example, certain contours or objects can skew images, cast shadows, or otherwise distort the projected images.
In one embodiment, a method of blending pixel intensity is provided. The method includes generating a plurality of pixel maps for a plurality of projectors based on at least one of an orientation of the plurality of projectors and a projection surface and adjusting an intensity of at least one pixel of at least one of the plurality of projectors based, at least in part, on the plurality of pixel maps.
In another embodiment, a method of generating a substantially seamless image with a plurality of overlapping projection images is provided. The method includes projecting, with a first projector, a first structured light pattern on a projection surface. The method further includes receiving, by a processor, a first image corresponding to the first structured light pattern. The method further includes projecting, with a second projector, a second structured light pattern on the projection surface, wherein the second structured light pattern at least partially overlaps the first structured light pattern on the projection surface. The method further includes receiving, by the processor, a second image corresponding to the second structured light pattern. The method further includes generating a set of pixel correspondences between the first projector and the second projector based on the first image and the second image, and modifying a brightness of one or more pixels of at least one of the first and second projectors based, at least in part, on the set of pixel correspondences.
In yet another embodiment, a system for generating pixel intensity blending maps is provided. The system includes a plurality of projectors that project at least partially overlapping projection regions onto a projection surface, a processor, and a computer readable storage device having program instructions stored thereon for execution by the processor. The program instructions include program instructions to project, by the plurality of projectors, structured light patterns on the projection surface. The program instructions further include program instructions to receive data corresponding to images of the structured light patterns. The program instructions further include program instructions to generate a plurality of correspondence tables based on the images, the plurality of correspondence tables describing which projectors of the plurality of projectors can emit light on each point of the projection surface. The program instructions further include program instructions to generate a plurality of pixel maps for each of the plurality of the projectors, the pixel maps including overlapping projection region information. The program instructions further include program instructions to adjust an intensity of one or more pixels based on the pixel maps to generate a composite image having a perceptibly smooth intensity.
The present disclosure provides methods and systems for compensating and correcting projected images that create a composite image in a multiple projector system. In one embodiment, a virtual representation of a projection system is generated. The virtual representation of the projection system can include 3D aspects of a projection surface, such as surface geometry or 3D objects in the projection path. Using the virtual representation, pixel intensity blending maps indicating overlapping projection regions are calculated for each pair of projectors in the projection system. The term pixel as used herein encompasses both the points constituting an image on the projection surface (e.g., points on the 3D surface), as well as the picture elements of a respective imaging device, such as a camera or projector.
The pixel intensity blending maps for overlapping regions, also referred to herein as alpha blending maps, account for arbitrary diffuse surfaces and are used to modify the originally input images to compensate for the overlapping regions and projector characteristics. Once the blending maps are generated for the virtual projector system, additional processing accounts for various physical phenomena or physical characteristics (e.g., geometrical orientations of the projection system, characteristics of the projection surface or the projector, and so on). The intensity blending maps can then be applied to the input projection images to generate a resulting projected image that is visually smooth, such that overlapping regions are not visible to the viewer. Then, the light intensity of one or more pixels of the projectors is adjusted based on overlapping sections of images projected by one or more projectors, as well as light leakage and other intrinsic characteristics of each projector, to create one or more compensated images.
Additionally, black level compensation maps, also referred to herein as beta blending maps, can be generated that assist in compensating for projector characteristics, such as light leakage. A region of the projection surface having a maximum number of projectors contributing light to the region is determined. The maximum overlapping region is determined based on overlap maps designating which projectors contribute to each pixel of the projection surface. Using the overlap maps, black level compensation maps can be generated that indicate how much the pixel intensity of each region on the projection surface should be increased to match the intensity of the maximum overlapping region. The projection image can be modified, for example, by increasing the intensity to non-overlapping regions, to compensate for the intrinsic features of the projector, reducing visible artifacts due to light leakage from the projector. The light leakage results in an increased black level intensity in overlapping regions that can be adjusted for in non-overlapping regions by increasing the projected pixel intensity.
Both the intensity blending maps and the black level compensation maps can be used on their own or together, depending on the desired level of correction. Using both maps together creates compensated images having a substantially uniform intensity and light levels across the overlapping and non-overlapping regions of the projected composite image. As used herein, “substantially smooth” or “substantially seamless” refers to a projected image where any pixel intensity variations caused by overlapping projectors are imperceptible to a viewer.
Turning to the figures, a projection system will now be discussed in more detail.
The camera 106 includes a field of view (FOV) 116. The FOV 116 allows the camera 106 to capture one or more images of the complete projection surface 110, including the overlapping region 118. The camera 106 is used to capture images of the entire projection surface 110. In some embodiments, multiple cameras can be used to ensure that the entire projection surface 110 is captured. Capturing images of the entire projection surface 110 provides correspondence information between pixels projected by the projectors 102, 104, pixels as they appear on the projection surface 110, and pixels captured by the camera 106. Using the pixel correspondence information, blending maps can be generated to compensate for projection image overlap, light leakage from the projectors, and other undesirable effects in projected images. In various embodiments, the projectors 102, 104 and the camera 106 can be arranged in any configuration so long as the configuration is known or can be determined. For example, the camera 106 can be positioned between the projectors 102, 104, as shown in FIG. 1.
The projection surface 110 is substantially any type of surface. In many instances the projection surface 110 may be a textured, off-white, colored, or otherwise non-ideal reflective projection surface. The projection surface 110 can be a substantially planar or an arbitrarily shaped surface, as shown in FIG. 1.
The projectors 102, 104 are devices configured to project and spatially control light onto a surface and may emit light in a color space, such as red, green, blue, and alpha (RGBA). In some examples, the projectors 102, 104 can be digital light processing (DLP) projectors. The projectors 102, 104 are configured to project images onto the projection surface 110 such that the projected images overlap by a number of pixels in the overlapping region 118. The intensities of light emitted by the projectors 102, 104 can be adjusted, for example, on a pixel-by-pixel basis. For example, the projectors 102, 104 can selectively increase or decrease the light intensity for each pixel generated by the projectors 102, 104. Exemplary components of the projectors 102, 104 are described in further detail below with respect to FIG. 7.
The camera 106 is in optical communication with the projection surface 110 and is any device configured to capture images. The camera 106 typically includes a lens and an image sensor, such as a charge-coupled device, a complementary metal-oxide-semiconductor device, or an N-type or P-type metal-oxide-semiconductor. The type of image sensor and lens are typically varied based on the camera. In many examples the camera 106 is configured to capture color images; however, in other embodiments the camera may be monochromatic and one or more color filters can be used over the camera's lens to capture color images. It is desirable, though not always necessary, for the image sensor of the camera 106 to be able to capture the entire dynamic range of the projected light intensities to allow better correction of the projected images. The camera 106 captures one or more images, referred to herein as correction images, of structured light patterns projected onto the projection surface 110 by the projectors 102, 104. The correction images can be used to correct and modify images projected by the projection system 100. The camera 106 may be in electrical communication with the computer 108 directly (e.g., via cable, WiFi, or other wireless data transmission method) or indirectly (e.g., via a removable memory card or the like) and transfers the correction images of the structured light patterns to the computer 108.
The computer 108 may communicate with and control the projectors 102, 104 and the camera 106. In certain embodiments, the computer 108 calculates pixel intensity blending maps and communicates with the projectors 102, 104 to adjust the intensity of projected images based on the pixel intensity blending maps. The computer 108 can be a server computer, a laptop computer, a tablet computer, a netbook computer, a personal computer, a smartphone, or a desktop computer. In another embodiment, the computer 108 can represent a computing system utilizing clustered computers and components to act as a single pool of seamless resources, as is common in “cloud computing.” In general, the computer 108 can be any programmable electronic device capable of communicating with the other devices in the projection system 100. The computer 108 can include internal and external components, as depicted and described in further detail with respect to FIG. 8.
The projectors 102, 104 direct light toward the projection surface 110. In the embodiment of FIG. 1, the projected images of the projectors 102, 104 at least partially overlap on the projection surface 110 in the overlapping region 118, such that the overlapping region 118 receives light from both projectors and thus appears brighter than the surrounding non-overlapping regions.
To compensate for the increased intensity in the overlapping region 118, a pixel intensity blending map can be calculated and applied to the projectors 102, 104 to adjust the intensity to account for this effect.
The camera 106 may be calibrated using, for example, a planar marker (e.g., checkerboard) calibration technique or a full self-calibration technique. In embodiments using a planar marker calibration technique, a planar surface with known feature points, such as a checkerboard pattern, is placed within the viewing frustum of the camera 106. The camera 106 captures one or more images of the planar surface, which is moved to different locations, to determine pixel correspondences between the pixels of the camera 106 and the known feature locations on the pattern. Based on the known pattern on the planar marker, the orientation and lens properties of the camera 106 are calibrated. Images captured by the camera 106 of the projected structured light patterns are then used to create a dense set of sub-pixel accurate correspondences between the pixels of the camera 106 and the pixels of each of the projectors 102, 104. The set of pixel correspondences can be stored as lookup tables that provide pixel correspondences from the camera 106 to each projector 102, 104 (camera-to-projector lookup tables) and from each projector 102, 104 to the camera 106 (projector-to-camera lookup tables). That is, each pixel of the structured light pattern projected by the projectors 102, 104 can be located in a corresponding pixel of the camera 106. These pixel-to-pixel correspondences can be saved as lookup tables such that identifying a particular pixel of the projected image provides the corresponding pixel of the camera, and vice versa. Example structured light patterns are described in more detail below with respect to FIG. 3.
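By way of illustration only, the following is a minimal sketch of the planar marker technique using the OpenCV library; the function name, pattern dimensions, and square size are assumptions chosen for the example rather than requirements of the disclosure.

```python
import cv2
import numpy as np

def calibrate_camera(checkerboard_images, pattern_size=(9, 6), square_size=25.0):
    """Recover camera intrinsics and lens distortion from images of a planar
    checkerboard marker. pattern_size (interior corners) and square_size
    (world units) are illustrative assumptions."""
    # 3D corner coordinates in the checkerboard's own coordinate frame.
    board = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
    board[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2)
    board *= square_size

    obj_pts, img_pts = [], []
    for image in checkerboard_images:  # the board captured at different poses
        gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, pattern_size)
        if found:
            # Refine the detected corners to sub-pixel accuracy.
            corners = cv2.cornerSubPix(
                gray, corners, (11, 11), (-1, -1),
                (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
            obj_pts.append(board)
            img_pts.append(corners)

    # Returns the RMS reprojection error, the intrinsic matrix, the distortion
    # coefficients, and the pose of the board in each view.
    return cv2.calibrateCamera(obj_pts, img_pts, gray.shape[::-1], None, None)
```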
In embodiments using a self-calibration or auto-calibration technique, the correspondences between the camera and one or more of the projectors may be determined simultaneously. That is, un-calibrated cameras may be used, and the calibration process calibrates the cameras and the projectors in a single step. While this calibration technique reduces the number of calibrations and manual interventions needed, it may also result in a less accurate calibration because, depending on the setup, this method may not be as robust as a planar marker calibration.
To determine the surface geometry (i.e., the topography) of the projection surface 110, one or more calibration images, such as known projected structured light patterns, can be captured with the calibrated camera 106 and analyzed. For example, the surface geometry can be reconstructed in a virtual world by using structured light patterns projected by the projectors 102, 104. In some embodiments, the surface geometry is reconstructed using binary encoded 2D pixel coordinate indices as structured light patterns. In other embodiments, other types of structured light patterns can be used. Generally, using this method, a series of known structured light patterns, such as gray codes and binary blobs, are projected by the projectors 102, 104 onto the projection surface 110. Images of the projected structured light patterns can be captured by the camera 106. The captured images can be analyzed and compared to the known structured light patterns to detect any shadows, obstructions, or deformations caused by the surface geometry of the projection surface 110. Based on the deformations between the structured light patterns and the captured images, a virtual model of the surface geometry can be calculated that matches the physical surface geometry of the projection surface 110, using multi-view reconstruction, for example. To increase the accuracy of the reconstructed surface geometry, the number of structured light patterns and the number of cameras 106 can be increased to capture deformations in the projected structured light patterns more accurately from multiple angles. Once the surface geometry has been reconstructed, the orientation and internal parameters (such as lens properties) of the projectors can be calibrated.
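As a simplified illustration of decoding such patterns, the sketch below recovers a per-camera-pixel projector coordinate from a stack of captured gray code bit planes; the function name, the input layout, and the use of inverted patterns for robust thresholding are assumptions for the example, not a required implementation.

```python
import numpy as np

def decode_gray_code(captures, inverse_captures):
    """Decode captured gray code bit planes into a per-camera-pixel projector
    coordinate (one axis). captures[i] is the camera image of bit plane i and
    inverse_captures[i] is the image of its inverted pattern, so a pixel's bit
    is 1 wherever the direct pattern is brighter than the inverted one."""
    bits = [c.astype(np.int32) > ic.astype(np.int32)
            for c, ic in zip(captures, inverse_captures)]
    gray = np.stack(bits).astype(np.uint32)  # (num_bits, H, W), MSB first

    # Convert gray code to binary: b[0] = g[0], b[i] = b[i-1] XOR g[i].
    binary = np.zeros_like(gray)
    binary[0] = gray[0]
    for i in range(1, gray.shape[0]):
        binary[i] = np.bitwise_xor(binary[i - 1], gray[i])

    # Pack the bit planes into one integer coordinate per camera pixel.
    coord = np.zeros(gray.shape[1:], np.uint32)
    for plane in binary:
        coord = (coord << 1) | plane
    return coord
```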
Turning to FIG. 3, example structured light patterns that can be projected by the projectors 102, 104 are illustrated.
In addition to the above examples, other methods may be used to assess the 3D scene information of the projection surface 110. For example, one or more depth cameras can be used on their own to map the projection surface 110 and its various features. Another example includes using a laser scanning device to acquire the surface geometry, or using a computer-generated model that accurately matches the real surface.
The structured light patterns can also be used to determine correspondences between the two-dimensional (2D) pixels of the projected image and the corresponding locations of those pixels on the 3D projection surface 110. The value and location of each pixel in the 2D structured light pattern is known. The cameras 106 capture images of the deformed structured light patterns on the 3D projection surface 110. The pixels of the captured images can be compared with the pixels of the structured light pattern, in combination with the determined surface geometry of the projection surface 110, to determine the location of each projected pixel on the 3D projection surface. The projector-to-projection-surface pixel correspondences can be used to verify and improve the projector-to-projector lookup tables generated using the projector-to-camera lookup tables and the camera-to-projector lookup tables, as described above with respect to operation 202.
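Given calibrated devices, the mapping of projected pixels onto the 3D surface can be illustrated with a short triangulation sketch; the projection matrices and matched point arrays below are assumed inputs from the calibration and decoding steps described above.

```python
import cv2

def reconstruct_surface(P_cam, P_proj, cam_pts, proj_pts):
    """Triangulate matched camera/projector pixels into 3D surface points.
    P_cam and P_proj are assumed 3x4 projection matrices obtained from the
    calibration (the projector being modeled as an inverse camera), and
    cam_pts / proj_pts are 2xN float arrays of matching pixel coordinates
    recovered from the decoded structured light patterns."""
    points_h = cv2.triangulatePoints(P_cam, P_proj, cam_pts, proj_pts)  # 4xN
    return (points_h[:3] / points_h[3]).T  # Nx3 points on the projection surface
```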
Returning again to the method 200 of FIG. 2, in operation 204, the computer 108 generates projector-to-projector lookup tables based on the camera-to-projector and projector-to-camera lookup tables.
The projector-to-projector lookup tables correspond to initial pixel overlap maps for each combination of projectors. For example, in the embodiment of FIG. 1, a projector-to-projector lookup table can be generated for each pairing of the projectors 102, 104, including each projector paired with itself.
In operation 206, the computer 108 maps or warps the projection frustum of each projector onto the image plane of each projector for which pixel intensity blending maps are being generated. For example, in the embodiment of FIG. 1, the projection frustum of the projector 104 can be warped onto the image plane of the projector 102, and vice versa, to identify which pixels of each projector fall within the projection region of the other.
Specifically, each pixel of the first projector 102 is identified in the projector-to-projector lookup table. If a pixel of the first projector 102 corresponds to a pixel (i.e., both projectors can project to the pixel) of a second projector (e.g., projector 104), then the pixel is indicated as overlapping in the initial pixel intensity blending map. Alternatively, if the pixel of the first projector 102 does not correspond to a pixel of the second projector 104, then the pixel is indicated as not overlapping in the initial pixel intensity blending map. As described above, an initial pixel intensity blending map can be generated for each projector-to-projector combination (including a projector compared to itself). Optionally, one or more virtual pixel masks can be applied to the initial pixel intensity blending maps to exclude certain pixels from the maps. The masks may be used, for example, to exclude certain pixels that are blocked by a 3D feature from being projected onto by one or more of the projectors 102, 104. Additionally, the masks may eliminate pixels which are outside of an area of interest, such as pixels that are projected off of the projection surface 110.
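A minimal sketch of generating such an initial map is shown below; the lookup table representation (negative entries marking pixels with no correspondence) and the function name are assumptions for illustration.

```python
import numpy as np

def initial_overlap_map(p2p_lut, valid_mask=None):
    """Build the initial pixel intensity blending (overlap) map for one
    projector pair. p2p_lut is assumed to be an (H, W, 2) lookup table giving,
    for each pixel of the first projector, the corresponding pixel of the
    second projector, with negative entries meaning no correspondence.
    valid_mask optionally excludes occluded or off-surface pixels."""
    overlap = np.all(p2p_lut >= 0, axis=2)  # True where both projectors reach
    if valid_mask is not None:
        overlap &= valid_mask               # apply a virtual pixel mask
    return overlap
```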
Turning to FIG. 4, an example of overlapping projection regions is illustrated. A first projection region 402 is projected by the first projector 102, and a second projection region 404 is projected by the second projector 104.
The first projection region 402 includes a first non-overlapping region 406 and an overlapping region 410. Similarly, the second projection region 404 includes a second non-overlapping region 408 and the overlapping region 410. Notably, both of the first and second projection regions 402, 404 include the overlapping region 410, meaning that each of the projectors generating the projection regions 402 and 404 can project images to the overlapping region 410. For example, the pixel 412 located in the overlapping region 410 can be projected to by both of the projectors 102, 104. However, the pixel 414 can only be projected to by the projector 104.
Returning again to the method 200 of FIG. 2, in operation 208, the computer 108 calculates a distance estimation for each pixel in an overlapping region of the initial pixel intensity blending maps.
With reference again to FIG. 4, for example, a distance can be estimated from the pixel 412 in the overlapping region 410 to the boundary of each of the projection regions 402, 404.
Referring again to FIG. 2, in operation 210, the computer 108 calculates the intensity contribution of each projector to each pixel on the projection surface 110 based, at least in part, on the distance estimations.
In operation 212, the computer 108 adjusts the virtual projector output. The computer 108 applies the calculated projector intensity contributions determined in operation 210 to the virtual projectors in the idealized virtual world. The virtual projector outputs are adjusted according to the specific contributions of each projector to a particular location on the projection surface as calculated in operation 210. For example, for a given pixel, the computer may determine that a first projector (e.g., projector 102) contributes 40% of the light intensity for the pixel and a second projector (e.g., projector 104) contributes 60% of the light intensity for the pixel. The computer 108 adjusts the intensity outputs of each of the virtual projectors so that the intensities of the projectors match the calculated contributions (i.e., projector 102 is adjusted to 40% output and projector 104 is adjusted to 60% output).
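By way of illustration, the sketch below computes normalized per-pixel contributions from each projector's distance to the edge of its projection region, one common weighting choice that is consistent with, but not mandated by, the distance estimation described above; the function name and inputs are assumptions.

```python
import numpy as np

def blend_weights(edge_distances):
    """Given, for one surface pixel, each contributing projector's distance to
    the edge of its projection region, return normalized intensity
    contributions that sum to one."""
    d = np.asarray(edge_distances, dtype=np.float64)
    return d / d.sum()

# A pixel near the edge of one projector's region but well inside the other's
# might yield the 40%/60% split described above:
print(blend_weights([20.0, 30.0]))  # -> [0.4 0.6]
```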
In operation 214, the computer 108 calculates and applies physical compensation parameters to the pixel intensity blending maps generated in operation 210. The initial pixel intensity blending maps generated in operation 210 are determined for an idealized virtual world, and as such, physical phenomena are not directly accounted for in these maps. However, the physical phenomena and “real-world” operating parameters can be adjusted for once the pixel intensity blending maps have been generated. Taking into account these additional characteristics allows the correction applied to the projection system to be more accurate and provide a better output image. Physical parameters include, but are not limited to, surface reflectance of the projection surface 110 (including reflectance based on surface geometry and color (hue)), light falloff and attenuation from physical blend maps, light intensity, desired artistic intensity, and the projector's internal color response.
In one example, the geometry of the projection surface 110 can be at an angle with respect to the projector 102, and a portion of the light projected by the projector 102 may not be reflected toward the viewer as desired. To compensate for such a situation, the intensity of the pixels incident on the angled portion of the projection surface 110 can be increased to ensure that the perceived intensity of the projected light is consistent with surrounding pixels. This adjustment can be calculated based on the known surface geometry. That is, the further the angle of the projection surface 110 is from normal to the viewer, the more the pixel intensity on the angled region is increased. Such increases can be experimentally determined, or calculated based on the surface geometry and the measured or approximated reflectance of the projection surface 110. Alternatively, the camera 106 or other imaging devices can measure the pixel intensity of a projected image to determine reflectance variation. For example, an image of uniform pixel intensity can be projected onto the projection surface 110. Because of the geometry, certain regions of the projection surface do not reflect the same intensity as others. The camera 106 can detect the variations in reflectance, and the intensity of the projected image can be adjusted to compensate. Light attenuation is compensated for by calculating the distance the light travels from the 3D position of the projector to the surface and adjusting the intensities accordingly such that the same intensity falls onto the surface. Artistic intent can be accounted for by applying an additional attenuation intensity mask on top of the other corrections. A camera image can also be used for adjustments by warping the camera image to the projectors' image planes using the projector-to-camera lookup tables and changing the projector pixel intensities according to the corresponding intensities of the warped camera image.
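A simplified sketch of two such corrections follows, assuming a Lambertian (ideal diffuse) surface for the angular term and inverse-square falloff for the attenuation term; both assumptions, and the function name, are illustrative rather than required, and a real system may use measured reflectance instead.

```python
import numpy as np

def physical_gain(surface_normal, light_dir, distance, reference_distance):
    """Per-pixel intensity gain compensating for surface orientation and light
    falloff. Assumes a Lambertian surface (apparent brightness falls with the
    cosine of the incidence angle) and inverse-square attenuation relative to
    a reference distance."""
    n = surface_normal / np.linalg.norm(surface_normal)
    l = light_dir / np.linalg.norm(light_dir)    # points from projector to surface
    cos_theta = max(float(np.dot(n, -l)), 1e-3)  # clamped to bound the gain
    angle_gain = 1.0 / cos_theta                 # brighten obliquely lit pixels
    falloff_gain = (distance / reference_distance) ** 2
    return angle_gain * falloff_gain
```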
As another example, the reflectance of the projection surface 110 may depend on the wavelength of the incident light. Therefore, the intensity of certain colors is increased or decreased in order to produce a smooth and perceptibly consistent projected image. The surface reflectance may be a measurable property of the projection surface material and can be adjusted based on experimentally determined wavelength-based reflectance variations. The camera 106 or other imaging device can measure the reflected intensity of different color projections, and adjustments to pixel intensity can be made based on the measured variations.
Those skilled in the art will appreciate that additional physical adjustments may be required based on a number of physical phenomena and technology-specific imperfections. The physical or real-world parameters can be taken into account either manually or automatically. For example, an operator can manually tune an intensity mask, or the camera 106 can be used to measure the projected output and automatically compensate for the measured imperfections.
In operation 216, the physical compensation parameters are combined with the pixel intensity blending maps generated in operation 210 to generate final pixel intensity blending maps. A final pixel intensity blending map can be calculated for each of the projector-to-projector pixel maps. In operation, the final pixel intensity blending maps can be provided to the projectors 102, 104 and used to adjust, on a pixel-by-pixel basis, the intensity of the projected light so that the projected image appears seamless and consistent in pixel intensity. Optionally, the final pixel intensity blending maps may be dithered in order to avoid banding artifacts when only a limited projector bit depth is available (e.g., an 8-bit projector). Dithering includes approximating colors not in the color space of the projectors 102, 104 with a diffusion of colored pixels from within the available color space. For example, if red and blue are part of the color space, but purple is not, then diffusing red and blue pixels can create an image that appears purple. Dithering can be used to avoid perceptible color banding by simulating a wider variety of colors than are actually available in the color space of the projector.
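As one concrete, illustrative dithering choice (the disclosure does not prescribe a particular algorithm), Floyd-Steinberg error diffusion can quantize a floating point blending map to 8 bits while avoiding banding:

```python
import numpy as np

def dither_to_8bit(blend_map):
    """Quantize a floating point blending map (values in [0, 1]) to 8 bits
    using Floyd-Steinberg error diffusion to avoid banding artifacts."""
    img = blend_map.astype(np.float64) * 255.0
    h, w = img.shape
    out = np.zeros((h, w), np.uint8)
    for y in range(h):
        for x in range(w):
            old = img[y, x]
            new = min(255, max(0, int(round(old))))
            out[y, x] = new
            err = old - new
            # Diffuse the quantization error to neighboring pixels.
            if x + 1 < w:
                img[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    img[y + 1, x - 1] += err * 3 / 16
                img[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    img[y + 1, x + 1] += err * 1 / 16
    return out
```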
In embodiments where the projectors 102, 104 are projecting low intensity images, such as black projections, pixel intensity cannot be reduced below a certain value because projectors are often associated with a certain level of light leakage, due to hardware and/or software limitations. In such embodiments, rather than (or in addition to) reducing pixel intensity in overlapping regions according to a pixel intensity blending map, as described above with respect to the method 200 of FIG. 2, the pixel intensity in non-overlapping regions can be increased so that the aggregate light leakage in overlapping regions is not perceptible. A method 500 of generating black level intensity blending maps for this purpose is discussed below with respect to FIG. 5.
In operation 502, the computer 108 (i.e., a processor of the computer 108) determines the 3D scene information. In various embodiments, the 3D scene information can be determined as described above with respect to operation 202 of the method 200. Specifically, the 3D scene information can include the spatial orientation of the projectors 102, 104 and the camera 106 and the surface geometry of the projection surface 110. The orientation of the projectors in space can be manually measured or can be determined by calibrating the camera 106 with the projectors 102, 104. Structured light patterns can be used to determine the 3D geometry of the projection surface 110.
In operation 504, the computer 108 generates projector-to-projector lookup tables. The projector-to-projector lookup tables can be produced as described above with respect to operation 204 of the method 200. Specifically, the projector-to-projector lookup tables can be generated by creating and comparing the projector-to-camera lookup tables and/or the camera-to-projector lookup tables for two projectors (e.g., projectors 102, 104) to one another to determine which pixels of the camera 106 correspond to both of the projectors 102, 104. In operation 506, the computer 108 warps the projection frusta for each projector-to-projector lookup table onto the image plane of each projector to generate an initial black level pixel intensity blending map. Operation 506 can be performed as described above with respect to operation 206 of the method 200. Specifically, the projector-to-projector lookup tables can be analyzed for each projector combination in order to generate an initial black level intensity blending map which indicates each pixel of a first projector that is also projected to by a second projector. Additionally, the initial black level intensity blending maps show regions that are not overlapping. The pixel intensity in the non-overlapping regions can be increased to match the pixel intensity in the overlapping regions caused by aggregate light leakage.
In operation 508, the computer 108 determines per pixel projector contributions. The computer 108 analyzes each pixel on the projection surface, and, by referencing the projector-to-projector look up tables, the computer 108 determines the number of projectors projecting to a particular pixel on the projection surface 110. For example, a first overlapping region can have two contributing projectors projecting light in the region. In this example, each pixel in the overlapping region has two contributing projectors. Generally, any number of projectors may contribute pixel intensity to an overlapping region. However, each non-overlapping region only has one contributing projector. For each region on the projection surface the total number of contributing projectors can be determined. The computer 108 can also identify a maximum projection region that has the greatest number of contributing projectors. The maximum projection region can be used to set a target intensity for each of the other projection regions. The pixel intensities in the other projection regions can be adjusted to match the intensity of the maximum projection region. For example, if the maximum projection region has three contributing projectors, then a projection region with two contributing projectors can have the intensity of each of the contributing projectors increased by 50%. Similarly, a projection region with only one contributing projector can have its pixel intensity increased by 200%. The computer 108 can also determine a baseline light leakage level for each projector in non-overlapping regions. For example, the camera 106 can capture an image of the projection surface with a single projector (e.g., projector 102) projecting at its lowest intensity. Because of light leakage, the projected intensity will not be exactly zero, and the camera 106 can determine the minimum intensity for that projector. The process may be repeated for each projector in the multiple projector system to establish a baseline minimum intensity for each projector.
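The per-region gains can be illustrated with the short sketch below, which reproduces the 50% and 200% increases from the example above; the overlap count array and function name are assumptions for the example.

```python
import numpy as np

def black_level_gains(overlap_count):
    """Given a per-pixel count of contributing projectors, return the factor
    by which each contributing projector's black level output should be
    multiplied so that every region matches the maximum overlapping region.
    With a maximum of three projectors, a two-projector region gets a 1.5x
    gain (a 50% increase) and a single-projector region a 3x gain (a 200%
    increase), matching the example above."""
    counts = np.asarray(overlap_count, dtype=np.float64)
    gains = np.where(counts > 0, counts.max() / np.maximum(counts, 1), 0.0)
    return gains
```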
In operation 510, the computer 108 compensates the black level intensity for each projection region in the initial black level pixel intensity blending maps. In one example, the light intensity for each region of the initial black level pixel intensity blending maps can be adjusted (e.g., increased in intensity) such that each region on the projection surface 110 matches the intensity in the overlapping region with the greatest number of contributing projectors. With reference again to FIG. 4, for example, the black level intensity of the non-overlapping regions 406, 408 can be increased to match the aggregate black level intensity of the overlapping region 410.
In operation 512, the computer 108 calculates physical compensation parameters. The physical parameters can be calculated as described above with respect to operation 214 of the method 200. More specifically, the physical parameters that require additional adjustment may include surface reflectance, including reflectance based on the projection surface geometry and color, light falloff and attenuation from physical blend maps, light intensity, and the projector's internal color response. These physical phenomena can be adjusted for by measuring variations in, for example, reflectance, attenuation, and the projector's internal color response. Once measurements are made, adjustments to the black level pixel intensity blending maps are made to compensate for the measured physical phenomena. In operation 514, the computer 108 calculates final black level intensity blending maps. The final black level intensity maps account for light leakage in overlapping projection regions by increasing the light intensity in non-overlapping regions, as well as accounting for various physical parameters which can distort or otherwise affect black level intensities. The computer 108 can apply the final black level pixel intensity maps to each projector in order to compensate for apparent black level discrepancies between overlapping and non-overlapping regions.
From the overlap maps, a distance estimation can be calculated for each pixel in an overlapping region, as described in operation 208 of the method 200 of FIG. 2.
Generating both alpha and beta blending maps, and applying both to projected images can generate a perceptibly smooth and seamless image. By simultaneously reducing pixel intensity in overlapping regions to compensate for aggregate pixel intensity and increasing pixel intensity in non-overlapping regions (and overlapping regions that have fewer contributing projectors than the maximum overlapping region) to compensate for aggregate light leakage, hard seams and other visually distracting effects can be reduced or eliminated from a composite image.
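One plausible per-pixel composition of the two maps is sketched below; the disclosure does not fix an exact formula, so the affine combination and function name shown here are assumptions for illustration.

```python
import numpy as np

def compensate(image, alpha_map, beta_map):
    """Apply both maps to one projector's 8-bit input image: the alpha map
    (values in [0, 1]) scales intensity down in overlapping regions, while the
    beta map (values in [0, 1]) raises the black level elsewhere to hide
    aggregate light leakage."""
    img = image.astype(np.float64) / 255.0
    out = beta_map + (1.0 - beta_map) * alpha_map * img  # assumed composition
    return np.clip(out * 255.0, 0.0, 255.0).astype(np.uint8)
```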
The processing elements 702 can be substantially any electronic device capable of processing, receiving, and/or transmitting instructions. The memory 704 stores electronic data that are used by the projectors 102, 104, 120. The input/output interface 710 provides communication to and from the projectors 102, 104, 120, the camera 106, and/or the computer 108, as well as other devices. The input/output interface 710 can include one or more input buttons, a communication interface, such as Wi-Fi, Ethernet, or the like, as well as other communication components such as universal serial bus (USB) cables, or the like. The power source 712 may be a battery, power cord, or other element configured to transmit power to the components of the projectors.
The light source 708 is any type of light emitting element, such as, but not limited to, one or more light emitting diodes (LEDs), incandescent bulbs, halogen lights, lasers, or the like. The lens 706 is in optical communication with the light source 708 and directs, focuses, and transmits light from the light source 708 to a desired destination, in this case, the projection surface 110. The lens 706 varies one or more parameters to affect the light, such as focusing the light at a particular distance. The lens may also include or have placed over it one or more physical masks which may block or obscure some or all of the projected light. In some instances, such as when the projector is a laser projector, the lens may be omitted.
The projectors 102, 104, 120 may also include a white balance adjustment or other color balance feature that can be adjusted automatically and/or manually. This allows the white point of the projector to be set so as to avoid color shifts that are visible to a human observer.
With reference to FIG. 8, exemplary components of the computer 108 will now be discussed in more detail.
Computer 108 includes communications fabric 802, which provides communications between computer processor(s) 804, memory 806, persistent storage 808, communications unit 810, and input/output (I/O) interface(s) 812. Communications fabric 802 can be implemented with any architecture designed for passing data and/or control information between processors (such as microprocessors, communications and network processors, etc.), system memory, peripheral devices, and any other hardware components within a system. For example, communications fabric 802 can be implemented with one or more buses.
Memory 806 and persistent storage 808 are computer-readable storage media. In this embodiment, memory 806 includes random access memory (RAM) 814 and cache memory 816. In general, memory 806 can include any suitable volatile or non-volatile computer-readable storage media.
Various embodiments of the present invention can include computer program instructions for generating and applying pixel intensity blending maps in multiple projector systems. The computer program instructions can be stored in persistent storage 808 for execution by one or more of the respective computer processors 804 via one or more memories of memory 806. In this embodiment, persistent storage 808 includes a magnetic hard disk drive. Alternatively, or in addition to a magnetic hard disk drive, persistent storage 808 can include a solid state hard drive, a semiconductor storage device, read-only memory (ROM), erasable programmable read-only memory (EPROM), flash memory, or any other computer-readable storage media that is capable of storing program instructions or digital information.
The media used by persistent storage 808 may also be removable. For example, a removable hard drive may be used for persistent storage 808. Other examples include optical and magnetic disks, thumb drives, and smart cards that are inserted into a drive for transfer onto another computer-readable storage medium that is also part of persistent storage 808.
Communications unit 810, in these examples, provides for communications with other data processing systems or devices, for example, the projectors 102, 104 and the camera 106. In these examples, communications unit 810 includes one or more network interface cards and one or more near field communication devices. Communications unit 810 may provide communications through the use of either or both physical and wireless communications links. Computer programs and processes may be downloaded to persistent storage 808 through communications unit 810.
I/O interface(s) 812 allows for input and output of data with other devices that may be connected to computer 108. For example, I/O interface 812 may provide a connection to external devices 818 such as a keyboard, keypad, a touch screen, a camera, and/or some other suitable input device. External devices 818 can also include portable computer-readable storage media such as, for example, thumb drives, portable optical or magnetic disks, and memory cards. Software and data used to practice embodiments of the present invention can be stored on such portable computer-readable storage media and can be loaded onto persistent storage 808 via I/O interface(s) 812. I/O interface(s) 812 may also connect to a display 820.
Display 820 provides a mechanism to display data to a user and may be, for example, an embedded display screen or touch screen.
The programs described herein are identified based upon the application for which they are implemented in a specific embodiment of the invention. However, it should be appreciated that any particular program nomenclature herein is used merely for convenience, and thus the invention should not be limited to use solely in any specific application identified and/or implied by such nomenclature.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.