The invention relates to optical systems and, more particularly, to optical systems including a projection screen and a projector.
Projection display systems typically include an image source, such as a projector, and a projection screen. During operation of the projection display system, the projector typically projects an image onto the projection screen for presentation to viewers. The projection screen may provide a diffuse surface to improve the image quality seen by viewers. Projection systems may be used for advertising in malls, showrooms, and exhibitions. Rear projection systems are one such example. A rear projection system includes at least a projection device (e.g. a DLP (Digital Light Processing) projector) and a rear projection screen. The projector is configured to project content within a limited projection area, which is typically a basic shape, such as a square or rectangle.
At least some aspects of the present disclosure feature a projection system capable of automatic or semiautomatic projection alignment. The projection system includes a projector, a shaped projection screen, one or more alignment marks, an image sensor, and a processing unit. The projector is configured to project an image. The shaped projection screen is configured to receive the projected image and display the projected image, the projector having a projection area on the shaped projection screen. The one or more alignment marks are proximate to a border of the projection screen. The image sensor is configured to capture an image of the shaped projection screen and generate a sensor signal corresponding to the captured image. The processing unit is electronically coupled to the image sensor and configured to receive the sensor signal and determine the positions of the one or more alignment marks based on the sensor signal.
At least some aspects of the present disclosure are directed to a method of automatic or semiautomatic alignment of a projection system, including the steps of: projecting a piece of alignment content, by a projector, to a shaped projection screen; providing a fiducial mark proximate to a border of the shaped projection screen; displaying the piece of alignment content by the shaped projection screen; capturing an image of at least part of the shaped projection screen when the piece of alignment content is displayed, by an image sensor, wherein the captured image comprises a visual representation of the fiducial mark; and determining, by a processing system, the position of the fiducial mark based on the captured image, wherein the projector has a projection area on the shaped projection screen.
The accompanying drawings are incorporated in and constitute a part of this specification and, together with the description, explain the advantages and principles of the invention. In the drawings,
A projection system typically includes a projector and a projection screen. The position of a projection screen relative to a projector, as well as the angle of incidence from the projector onto the projection screen, affects how content looks to the viewer. For example, it can affect image alignment with the screen and cause keystone distortion of the image. The present disclosure provides methods and systems for determining relative locations, orientations, distortions, and/or other projection characteristics of a projector relative to a projection screen. In some cases, the projection system can apply a projective transformation and/or alignment factors to adjust content to match the projection screen location and shape. Alignment factors include, for example, perspective projection factors (e.g., scale factor, location compensation factor, rotation compensation factor, orientation compensation factor, etc.), supplemental projection factors, aspect ratio factors, and flipping factors (i.e., a factor used when a reflector/mirror is used in the projection system). Projective transformation refers to an image transformation (i.e., using a transformation matrix or formula) that adjusts presentation content to match the projection screen. At least some embodiments of the present disclosure are directed to methods and systems to aid the positioning and specify the configuration of a projection screen and/or a projector. In some embodiments, the projection screen is a shaped projection screen, which refers to a projection screen that is not in the shape of a rectangle.
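As an illustration of applying such a projective transformation to presentation content, the following is a minimal sketch assuming a Python environment with OpenCV (cv2) and NumPy; the corner coordinates, frame size, and file name are hypothetical placeholders rather than values from this disclosure.

```python
import cv2
import numpy as np

def warp_content_to_screen(content, content_corners, screen_corners, out_size):
    """Warp presentation content so that its corners land on the measured
    screen corners (all coordinates given in projector pixel space)."""
    src = np.float32(content_corners)
    dst = np.float32(screen_corners)
    homography = cv2.getPerspectiveTransform(src, dst)  # 3x3 projective matrix
    return cv2.warpPerspective(content, homography, out_size)

# Hypothetical usage: map a 1920x1080 frame onto a slightly rotated and
# shifted quadrilateral measured during alignment.
frame = cv2.imread("presentation_frame.png")
adjusted = warp_content_to_screen(
    frame,
    content_corners=[(0, 0), (1919, 0), (1919, 1079), (0, 1079)],
    screen_corners=[(40, 60), (1880, 30), (1900, 1050), (20, 1070)],
    out_size=(1920, 1080))
```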
Some embodiments of the present disclosure are directed to a projection system that determines the position and shape of the projection screen relative to the projector using one or more image sensors, so that the system can reshape or position the projected content within the projection area in such a way as to ensure that the projected content falls on the projection screen. In some implementations, such a system can set all pixels of the projected image that have been determined not to fall on the projection screen to black or to another color that is less visible depending on the material and/or color of the projection screen. This can minimize the visibility of such pixels.
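One way such masking could be implemented is sketched below, assuming OpenCV and NumPy; the screen outline would come from the alignment process, and the function name is an assumption.

```python
import cv2
import numpy as np

def mask_outside_screen(frame, screen_outline, fill_color=(0, 0, 0)):
    """Set every pixel of the projector frame that falls outside the screen
    outline (a polygon in projector pixel coordinates) to fill_color, by
    default black, so off-screen pixels are as invisible as possible."""
    mask = np.zeros(frame.shape[:2], dtype=np.uint8)
    cv2.fillPoly(mask, [np.int32(screen_outline)], 255)  # 255 inside the screen
    masked = frame.copy()
    masked[mask == 0] = fill_color
    return masked
```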
In some embodiments, a projection system may include alignment marks, also referred to as fiducials, fiducial marks, or alignment fiducials, to aid the determination of the relative position, orientation, and/or other projection characteristics of a projector to a projection screen. In some other embodiments, a projection system may use a series of alignment content to aid the determination of the relative position, orientation, and/or other projection characteristics of a projector to a projection screen. In yet other embodiments, a projection system may use both alignment marks and a series of alignment content to aid the determination of the relative position, orientation, and/or other projection characteristics of a projector to a projection screen.
The alignment marks 220 can be virtual marks (i.e., part of a projected image from the projector) or physical alignment marks. The physical alignment marks can be relatively small and inconspicuous or larger and more conspicuous. Additionally, the physical alignment marks can be visible or invisible. For example, the visible alignment marks can be made from retro-reflective materials, visible ink, or the like. As another example, the invisible alignment marks can be made using opaque ink, infrared ink, ultraviolet ink, or the like. In some implementations, the projection screen can have holes or other non-projection areas within its general shape. For example, a projection screen in the shape of a house can have windows that are non-projection areas. In such implementations, the system can include one or more alignment marks proximate to the non-projection area (i.e., close to its border or on its border, etc.) to aid the determination of the projection area. In some cases, the alignment marks 220 can be less conspicuous, as illustrated in
The projector 110 may be any suitable device configured to project an image onto projection screen 120, such as, but not limited to, a liquid crystal display (LCD) projector, a digital light processing (DLP) projector, a liquid crystal on silicon (LCOS) projector or a plasma projector. Other projectors may utilize surface-conduction electron-emitter display (SED) technology, organic light-emitting diode (OLED) technology, solid-state technology using lasers, and solid-state technology using light emitting diodes (LED). The projector 110 can be any suitable front-projection or rear-projection device. The projector 110 is configured to receive an input, such as a video signal from a video file, and project the corresponding image onto projection screen 120. The projector 110 may have any suitable display resolution, such as, but not limited to, display resolutions in accordance with the Super Video Graphics Array (SVGA) display standard (800×600 pixels), the eXtended Graphics Array (XGA) display standard (1024×768 pixels), the 720p display standard (1280×720 pixels) or the 1080p display standard (1920×1080 pixels).
The projectors for projecting the changeable electronic content can include mercury bulb based projectors (e.g., the X56 projector from 3M Company, Saint Paul, Minn., U.S.A.), LED based projectors (e.g., MP410 projectors from 3M Company, Saint Paul, Minn., U.S.A.), laser based projectors, and hybrid projectors (e.g., the XJ-M250 from Casio Computer Co., Ltd., Tokyo, Japan). In some systems, use of a laser based projector is advantageous because of its very long depth of focus, which results in little degradation of image quality from top to bottom or side to side of the projected image, and because of its high color gamut. In some systems, use of a hybrid projector is advantageous because of the long life of the light source and the high color gamut.
The projection screen 120 may be any suitable projection screen that may be cut to define a particular shape prior to installation on a display surface, such as a window, door, or wall. The relative dimensions of the different sides of the projection screen 120 (e.g., an overall height and an overall width) may be selected based on the aspect ratio of the projector 110, as well as the pixel count of the projector 110.
While it is useful to cut a projection screen during or before installation to customize the projection screen for use with a particular shape of a window, such as a square, rectangular or circular window, projection screens having more unique shapes may also be useful. For example, a projection screen cut into a shape resembling a trademarked shape, such as a beverage bottle, may be more eye-catching than a rectangular shaped screen. A unique shape may add to the appeal of the projection screen 120 as well as the ability to captivate viewers. Other non-limiting examples of shapes of the projection screen 120 include silhouettes of characters, alphabetic letters, geometric patterns, logos, marquees, geometric shapes, thought bubbles, human figures, animal outlines, and product outlines. A product outline may include interior holes, such as the aforementioned bubbles which in turn have interior features.
In some implementations, a vector outline defined by a vector-based graphics software program may be used to define the boundaries of the projection screen 120 in order to extract (e.g., cut) the projection screen 120 from a sheet of optical film or otherwise create the projection screen 120 defining a customized shape. Any suitable software program executing on a computing device may be used to create the vector outline for defining the desired shape. Examples of suitable software programs include Adobe Photoshop, Adobe Flash, Adobe FreeHand, and Adobe Illustrator, which are each available from Adobe Systems Incorporated, San Jose, Calif. Further examples of suitable software programs for creating a vector image include CorelDRAW available from Corel Corporation, Ottawa, Canada and ConceptDraw available from Computer Systems Odessa, Odessa, Ukraine.
Vector images typically define a shape in computer graphics by geometrical primitives, such as lines, curves, points, polygons, and so forth. Vector images may provide certain advantages over raster-based images, such as an ability to be scaled without a loss of clarity. That is, a vector image may be scaled to substantially any size, large or small, without losing the clarity of the curves or other geometrical primitives defining the image. Thus, a vector image defining a shape for the projection screen 120 may be scaled to any size without losing the clarity of the outer boundaries of the projection screen 120. In contrast, raster images, which define a shape via a plurality of pixels, degrade in clarity upon scaling. Vector images may also be referred to as vector graphics, geometric modeling or object-oriented graphics.
In one embodiment, the projection screen 120 is a substantially flexible projection screen. For example, the projection screen 120 may be a flexible screen including refractive elements, such as glass beads, and a light absorbing layer for rendering the projection screen 120 substantially opaque in ambient lit conditions when no image is projected on the projection screen 120 by the projector 110. In some embodiments, the projection screen 120 is a rear projection screen in which the projector 110 projects an image onto a rear of the projection screen 120 and the image is viewable from a front surface of the projection screen 120, which is substantially opposite the rear surface. In other embodiments, the projection screen 120 is a front projection screen, in which the projector 110 projects an image onto the same surface as the viewing surface of the projection screen 120. Rear projection screens, including shaped screens, are described in the following, all of which are incorporated herein by reference as if fully set forth: U.S. Pat. No. 7,923,675; U.S. Pat. No. 6,870,670; and U.S. patent application Ser. No. 13/407,053, entitled “Shaped Rear Projection Screen with Shaped Fresnel Lens Sheet,” and filed Feb. 28, 2012.
In general, a thinner projection screen 120 may be easier to cut into a customized shape than a thicker screen. Accordingly, in some embodiments, the projection screen 120 has a thickness less than or equal to about 1 millimeter (mm).
The image sensor 130 can include at least one of a camera, an infrared camera, a CCD (Charge-Coupled Device) array, or the like. In some cases, the processing unit 140, also referred to as a processing system, can include one or more processors, microprocessors, microcontrollers, computers, or other computing devices. In such cases, the processing unit 140 can execute software or firmware stored in a non-transitory computer-readable medium to implement automatic or semiautomatic alignment for the projection system 100. In some other cases, the processing unit 140 can include circuits to implement part of or all of the functionality of automatic or semiautomatic (i.e., including manual steps) alignment.
In some embodiments, the processing unit 140 is further configured to determine the general boundary of the shaped projection screen based on the sensor signal and a predetermined shape of the shaped projection screen. The predetermined shape can be stored in a data repository. In some embodiments, the processing unit 140 is further configured to determine the general boundary of the projection area on the shaped projection screen based on the sensor signal and a predetermined shape of the projection area. In some cases, the processing unit 140 is further configured to adjust presentation content based on the determined position of the alignment mark, the determined boundary of the shaped projection screen, and/or the determined boundary of the projection area. The adjusted content is provided to the projector 110 for projection. Content, also referred to as presentation content or projection content, is typically projected onto a projection screen and generally includes, for example, static images, video, interactive images or content, dynamic images or content, or the like.
In some embodiments, the projection system 100A can include more than one alignment mark 125, where the processing unit 140 is further configured to determine the positions of the one or more additional alignment marks and a rotation angle of the shaped projection screen. In some cases, the processing unit 140 is further configured to adjust the content based on the determined positions of the one or more alignment marks and the rotation angle of the shaped projection screen.
In some embodiments, the projector 110 has a projection area on the shaped projection screen, and the processing unit 140 is further configured to determine the positions of the one or more additional alignment marks and determine a scale (or zoom) factor, a metrics translation factor, and/or a rotation factor to be used to align the projection area with the shaped projection screen. In some implementations, the processing unit 140 is further configured to adjust presentation content based on the determined positions of the one or more alignment marks and the scale factor, the metrics translation factor, and/or the rotation factor. In some cases, the projector 110 can have more than one shaped projection area. For each of the projection areas, a set of alignment factors can be determined and used to adjust presentation content.
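One plausible way to derive such factors is sketched below with NumPy: a least-squares scale, rotation, and translation fitted between the designed alignment-mark positions and the positions measured from the sensor signal. The function name and the treatment of the translation factor are assumptions, not the disclosure's own formulation.

```python
import numpy as np

def similarity_factors(design_marks, measured_marks):
    """Least-squares scale, rotation (degrees), and translation mapping the
    designed alignment-mark positions onto the measured ones; two or more
    marks are required."""
    p = np.asarray(design_marks, dtype=float)
    q = np.asarray(measured_marks, dtype=float)
    p_c = p - p.mean(axis=0)
    q_c = q - q.mean(axis=0)
    scale = np.linalg.norm(q_c) / np.linalg.norm(p_c)
    # Rotation angle from the summed cross and dot products of the centered sets.
    cross = (p_c[:, 0] * q_c[:, 1] - p_c[:, 1] * q_c[:, 0]).sum()
    dot = (p_c * q_c).sum()
    angle = np.arctan2(cross, dot)
    c, s = np.cos(angle), np.sin(angle)
    rotation = np.array([[c, -s], [s, c]])
    translation = q.mean(axis=0) - scale * rotation @ p.mean(axis=0)
    return scale, np.degrees(angle), translation
```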
In some embodiments, the projector 110 has a projection area on the shaped projection screen, and the processing unit 140 is configured to determine the positions of the one or more additional alignment marks and determine a keystone factor to be used to align the projection area with the shaped projection screen. In some implementations, the processing unit 140 is further configured to adjust presentation content based on the determined positions of the one or more alignment marks and/or the keystone factor. In some cases, the processing unit 140 is further configured to determine perspective projection factors (e.g., scale factor, orientation factor, rotation factor, etc.), image transformation (e.g., projection matrix), and/or projector setting adjustments. In some cases, the processing unit 140 is further configured to adjust the projector settings, and/or apply the image transformation and perspective projection factors to presentation content.
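The disclosure does not spell out how the keystone factor is defined; one simple, hedged formulation is the ratio of the top and bottom widths of the measured projection-area quadrilateral, sketched below with NumPy. The corner values in the example are hypothetical.

```python
import numpy as np

def vertical_keystone_factor(corners):
    """corners: measured projection-area corners in projector or sensor
    coordinates, ordered [top-left, top-right, bottom-right, bottom-left].
    Returns the ratio of top width to bottom width; 1.0 indicates no
    vertical keystone distortion."""
    tl, tr, br, bl = [np.asarray(c, dtype=float) for c in corners]
    return np.linalg.norm(tr - tl) / np.linalg.norm(br - bl)

# Hypothetical example: a trapezoid that is wider at the bottom.
print(vertical_keystone_factor([(100, 80), (900, 80), (980, 700), (20, 700)]))
```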
In some cases, the series of alignment content includes a set of still images that changes along a first direction. In such cases, at least part of the border of the shaped projection screen intersecting with the first direction can be determined by the processing unit, for example, by analyzing the intensity changes among the series of captured images. Further, after the general border of the shaped projection screen is identified, the process can be repeated with smaller changes within the area where the border is located to acquire a more accurate position of the border.
In some embodiments, the series of alignment content includes several sets of alignment content to determine the relative location, orientation, rotation, and distortion of the shaped projection screen and compute alignment factors and/or a projective transformation, where each set of alignment content changes along an individual direction. For example, the series of alignment content may include two sets of alignment content, where the first set of alignment content changes in a first direction and the second set of alignment content changes in a second direction orthogonal to the first direction. In some other cases, the series of alignment content 125B can vary in one or more patterns to aid the discovery. For example, a set of alignment content can use sweeping patterns of lines, boxes, or points to determine the relative location, orientation, rotation, and distortion of the shaped projection screen. Both positive images and negative images (inverted images) can be used as alignment content.
As an example, the projection system can use a set of alignment content with a pixel square (e.g., 1×1, 2×2, 3×3, etc.) sweeping across rows (e.g., from left to right) to determine edges of the projection screen. In some cases, the edges can be determined using brightness differencing. In addition, the projection system can use another set of alignment content with a pixel square sweeping across columns (e.g., from top to bottom) to refine the shape determination. Once the shape is defined, alignment factors and/or projective transformation can be mathematically derived by comparing the determined shape based upon the captured alignment content to the known designed shape of the projection screen. These alignment factors and/or projective transformation can be applied to a piece of presentation content either as it is projected, or as a separate step.
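A hedged sketch of this sweep-and-differencing idea follows, assuming NumPy, one captured camera frame per sweep step, a baseline frame captured with nothing projected, and a simple intensity threshold; the names and the threshold value are assumptions.

```python
import numpy as np

def screen_extent_along_sweep(sweep_frames, baseline_frame, threshold=30):
    """sweep_frames: camera frames captured as a small bright patch is swept
    across the projector image, one frame per step. baseline_frame: a frame
    captured with nothing projected. Returns the first and last sweep-step
    indices at which the patch produced a response above the ambient
    baseline, i.e., where the screen border lies along the sweep direction."""
    baseline = baseline_frame.astype(float)
    hits = [i for i, frame in enumerate(sweep_frames)
            if (frame.astype(float) - baseline).max() > threshold]
    return (hits[0], hits[-1]) if hits else None
```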
In some embodiments, a projection system can determine the alignment factors and/or projective transformation by analyzing a set of captured alignment content and a predetermined shape of the shaped projection screen. In some cases, only a portion of the border of the shaped projection screen needs to be determined to generate the alignment factors and/or projective transformation.
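For example, once border points detected from the captured alignment content are paired with the corresponding points on the known designed outline, a projective transformation can be fit as sketched below (assuming OpenCV; at least four correspondences are needed, and the RANSAC option is an assumption made for robustness).

```python
import cv2
import numpy as np

def fit_projective_transformation(designed_points, detected_points):
    """Fit the 3x3 homography that maps points on the designed screen
    outline to the border points detected from the captured alignment
    content; RANSAC discards outlier correspondences."""
    src = np.float32(designed_points).reshape(-1, 1, 2)
    dst = np.float32(detected_points).reshape(-1, 1, 2)
    homography, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC)
    return homography
```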
In some implementations, the camera 330 and the projector 310 can be contained in the same housing and use the same optical lens, so the camera 330 can capture images from the same location as the projector 310. In some other implementations, the camera 330 and the projector 310 can be set up in different housings, or in the same housing but with separate optical lenses. In most implementations, the camera 330 may need to capture the location of the projected content so the relative alignment of the projector 310 and the projection screen 320 can be determined.
In some embodiments, the projection system 300A can include an optional light source 335 that can provide the necessary lighting to the alignment marks during the alignment process. The light source 335 can include an infrared light source, an ultraviolet light source, a rapidly pulsed light source, or another suitable light source adequate to illuminate the alignment marks 325.
The media player 340 provides content to the projector 310 for projection. The relative locations, rotations, orientations, and other presentation characteristics of the projector 310 with respect to the projection screen 320 are analyzed and computed by a processing unit. In some embodiments, based on the computed presentation characteristics, the presentation content is adjusted by the processing unit to project properly onto the shaped projection screen. The content adjustment can include, for example, changing the size of the content by applying a scale factor, or masking the portion of the content outside the projection screen with black or other less visible pixels. The processing unit (not shown in
In some implementations, the projection system 300A can modify the projection mount and configurations of the projector 310 to allow better alignment of the projector with the projection screen. For example, the projection system 300A can adjust the zoom and focus of the projector 310, and/or the roll, pitch, and yaw of the projector mount. In some cases, the projection system 300A can run the alignment process upon installation and/or periodically (e.g., when the projector shuts down or powers on) to accommodate any relative motion between the projection screen 320 and the projector 310 that was induced, for example, during the preceding day.
In some embodiments, the border of the projection screen can be outlined with a specific ink or material (e.g., retro-reflective tape, infrared ink, etc.) to aid the border determination. In some embodiments, the projection system 300B can include an optional light source 335 that can provide the necessary lighting to the projection screen during the alignment process. The light source 335 can include an infrared light source, an ultraviolet light source, a rapidly pulsed light source, or another suitable light source adequate to provide projection lighting.
In some embodiments, at least some of the series of alignment content changes along a first direction, and at least part of the general boundary of the shaped projection screen crosses the first direction. In some embodiments, at least some of the series of alignment content changes along a second direction different from the first direction, and at least part of the general boundary of the shaped projection screen crosses the second direction. In some cases, the first direction is orthogonal to the second direction. In some embodiments, the media player 340 is further configured to adjust presentation content based on the at least part of the general boundary of the shaped projection screen and provide the adjusted presentation content to the projector. In some cases, the media player 340 may be configured to report the boundary information to a content source so that the content may be computationally adjusted for the measured boundary information and the corrected content then supplied to the media player 340.
In some cases, the media player 340 is configured to determine one or more alignment factors based on the positions of at least part of the general boundary of the shaped projection screen. In such cases, the media player 340 is further configured to adjust presentation content based on the one or more alignment factors and provide the adjusted presentation content to the projector. Additionally, the media player 340 may adjust a projector setting based on the one or more alignment factors. In some cases, the media player 340 may be configured to report to a content source the alignment factors so that the content may be computationally adjusted for the measured alignment factors and corrected content then supplied to the media player 340.
In some cases, the media player 340 is configured to determine a projective transformation to be used to align the projection area with the shaped projection screen based on the at least part of the general boundary of the shaped projection screen. In such cases, the media player 340 is further configured to apply the projective transformation to presentation content and provide the adjusted presentation content to the projector. In some embodiments, the media player 340 may be configured to report to a content source the projective transformation information so that the content may be computationally adjusted for the measured projective transformation and corrected content then supplied to the media player 340.
In some cases, the media player 340 is configured to determine a keystone factor to be used to align the projection area with the shaped projection screen based on the at least part of the general boundary of the shaped projection screen. In such cases, the media player 340 is further configured to adjust presentation content based on the keystone factor and provide the adjusted presentation content to the projector. In some embodiments, the media player 340 may be configured to report to a content source the general boundary of the projection screen so that the content may be computationally adjusted for the measured general boundary of the projection screen and corrected content then supplied to the media player 340.
If the projection screen 420 is cut by a computer-controlled cutting machine, the cutting path for the machine may be based on a properly scaled virtual shape template. In one embodiment, the cutting machine is a computer numerically controlled (CNC) cutting machine employing a cutting tool to cut the projection screen film. The CNC cutting machine may be configured to move a cutting tool in two, three, or more dimensions. In some embodiments, the virtual shape template defines a cutting path for a computer-controlled cutting machine, such as by using coordinates to indicate the linear path of cutting. In one type of CNC cutting machine, a controller, which may be provided by an external computing device or may be integral with the CNC cutting machine, generates signals indicative of the cutting path based on the shape template. Based on the signals, the cutting tool of the cutting machine selectively cuts the projection screen material to produce the projection screen 420 defining a customized shape. The cutting machine may cut the projection screen 420 with a substantially continuous path in order to create a substantially clean edge.
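As a hedged illustration of producing such a coordinate path, the sketch below assumes the virtual shape template is available as a closed polygon in design units; the example outline, the scale factor, and the function name are hypothetical.

```python
def cutting_path(outline_points, scale, close_path=True):
    """Scale a vector shape template (a polygon given in design units) into
    physical cutting coordinates (e.g., millimeters) that a
    computer-controlled cutting machine could follow."""
    path = [(x * scale, y * scale) for x, y in outline_points]
    if close_path and path and path[0] != path[-1]:
        path.append(path[0])  # return to the start for a continuous cut
    return path

# Hypothetical example: a small house-shaped outline in design units,
# scaled so that one design unit corresponds to 10 mm.
house_outline = [(0, 0), (100, 0), (100, 60), (50, 100), (0, 60)]
print(cutting_path(house_outline, scale=10.0))
```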
After the projection screen 420 is cut to the desired shape, the projection screen 420 may be installed at the desired location. In some embodiments, the projection screen 420 is configured to be applied directly to an application surface. In the case of a rear projection screen, the application surface may be any suitable substantially transparent surface as long as the projection screen 420 is in a position capable of being viewed. The substantially transparent surface may comprise, for example, exterior or interior doors or windows. In some cases, the substantially transparent surface may be somewhat opaque. For example, the surface may comprise a tinted, dirty or colored window, or it may comprise a window that has a wire pattern embedded in the glass. Alternatively, the projection screen 420 may include a stand that allows the projection screen 420 to be free-standing.
After the projection screen 420 is installed, the projector 410 may be positioned relative to the projection screen 420. Alternatively, if the projector 410 is in a fixed location, the projection screen 420 may be positioned relative to the projector 410. Precise and accurate placement of the projector 410 relative to the projection screen 420 is an important aspect to correctly projecting an image onto the projection screen 420 defining a customized shape. Due to the nature of the unique border of the projection screen 420, there may be less of a margin for misalignment between the projector 410 and the projection screen 420.
In some embodiments, in order to align the projector 410 and the projection screen 420, the projector 410 can project one or more pieces of alignment content onto the projection screen 420. The image sensor 430 can capture the projected alignment content and provide the captured alignment content to the processor-based device 440 for further analysis. In some embodiments, the processor-based device 440 determines a projective transformation and/or alignment factors based on the captured alignment content that can be used to adjust the settings of the projector 410, adjust the relative position between the projector 410 and the projection screen 420, adjust presentation content, and/or mask presentation content. Virtual masking in projection systems is described in the following, each of which is incorporated herein by reference as if fully set forth: U.S. Pat. No. 7,923,675; U.S. Pat. No. 8,193,480.
In some implementations, the processor-based device 440 can be the same device as the image sensor 430 and/or the projector 410, or can be hosted in the same housing as the projector 410 and/or the image sensor 430. In some other implementations, the processor-based device 440 can be a different device from the image sensor 430 and the projector 410. In some cases, the processor-based device 440 can be located remotely and receive the captured alignment content via a communication interface. Various components of the projection system, such as the projector, image sensor, and processor-based device, can communicate via a communication interface. The communication interface includes, but is not limited to, any wired or wireless short-range and long-range communication interfaces. The short-range communication interfaces may be, for example, local area network (LAN) interfaces, interfaces conforming to a known communications standard, such as the Bluetooth standard, IEEE 802 standards (e.g., IEEE 802.11), a ZigBee or similar specification, such as those based on the IEEE 802.15.4 standard, or other public or proprietary wireless protocols. The long-range communication interfaces may be, for example, wide area network (WAN) interfaces, cellular network interfaces, satellite communication interfaces, etc. The communication interface may be either within a private computer network, such as an intranet, or on a public computer network, such as the Internet.
In some embodiments, as illustrated in
In some cases, the piece of alignment content is a blank page. In such cases, the image sensor may capture an image of at least part of the shaped projection screen when the piece of alignment content is not displayed as a second input to determine the position of the fiducial mark(s). Further, such approaches can reduce ambient light noise and/or the baseline light level for the projection environment. In some other cases, the projector can project a first piece of alignment content and a second piece of alignment content to the projection screen sequentially; the image sensor captures a first image and a second image when the first and second pieces of alignment content are being projected, respectively; and the processor or the processor-based device determines the position of the fiducial mark(s) based on the first image and the second image. For example, the first piece of alignment content and the second piece of alignment content can be designed to cover different areas of the projection screen (and its adjacent outer areas) and the fiducial mark(s) within those areas, respectively.
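A minimal sketch of that differencing step, assuming NumPy and 8-bit camera frames, is shown below; the function name is an assumption.

```python
import numpy as np

def fiducial_response(frame_with_content, frame_blank):
    """Subtract the capture taken while a blank page is projected from the
    capture taken with alignment content displayed, suppressing ambient
    light and the baseline light level so that the projected content and
    any illuminated fiducial marks stand out."""
    diff = frame_with_content.astype(np.int16) - frame_blank.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)
```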
In some embodiments, the processor-based device may determine alignment factors based on the determined position of the fiducial mark(s).
In some cases, applying the perspective transform or the alignment factors may create distortion. In such cases, in order to compensate for this distortion, at least three fiducial marks of known relative positions can be used. The three fiducial marks with known relative positions are identified in the captured images. The positions of the three fiducial marks can be computed. Optionally, the positions of the three fiducial marks can be converted from sensor coordinates to projector coordinates. Using the positions of the three fiducial marks, the processor-based device can calculate whether the projective distortion requires additional location compensation and by how much (i.e., distortion compensation factor(s)), in the projector coordinates.
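One hedged way to carry out the coordinate conversion and the residual check is sketched below, assuming OpenCV: an affine map is built from the three fiducials' detected sensor positions and their known projector positions, and the leftover offset of any additional feature serves as a location-compensation factor. The function names are assumptions.

```python
import cv2
import numpy as np

def sensor_to_projector_transform(sensor_fiducials, projector_fiducials):
    """Affine map from sensor (camera) coordinates to projector coordinates,
    built from exactly three fiducial marks of known relative positions."""
    src = np.float32(sensor_fiducials)      # 3 x 2, detected in the capture
    dst = np.float32(projector_fiducials)   # 3 x 2, known positions
    return cv2.getAffineTransform(src, dst)  # 2 x 3 matrix

def location_compensation(affine, sensor_point, expected_projector_point):
    """Residual offset, in projector coordinates, between where a feature
    actually maps under the affine transform and where it was expected to
    land; this offset can serve as a distortion compensation factor."""
    homogeneous = np.float32([sensor_point[0], sensor_point[1], 1.0])
    mapped = affine @ homogeneous
    return mapped - np.float32(expected_projector_point)
```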
In some cases, aspect-ratio compensation factors are needed to align projection content. In some embodiments, at least three alignment marks of known relative positions are needed. The three fiducial marks with known relative positions are identified in the captured images. The positions of the three fiducial marks can be computed. Optionally, the positions of the three fiducial marks can be converted from sensor coordinates to projector coordinates. Using the positions of three fiducial marks, the processor-based device can calculate the aspect ratio in the projection space. The processor-based device can then determine aspect-ratio compensation factors by comparing the aspect ratio of the projection space with the aspect ratio of the ideal projection space.
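A hedged sketch of one such computation follows, assuming NumPy; the ordering of the three marks (an origin mark, a horizontal reference, and a vertical reference) and the definition of the compensation factor are assumptions.

```python
import numpy as np

def aspect_ratio_compensation(marks, ideal_aspect_ratio):
    """marks: positions of three fiducial marks in projector coordinates,
    ordered [origin, horizontal reference, vertical reference]. The measured
    width/height ratio of the projection space is compared with the ideal
    aspect ratio, and the resulting factor can be applied to content."""
    origin, horiz, vert = [np.asarray(m, dtype=float) for m in marks]
    measured_ratio = np.linalg.norm(horiz - origin) / np.linalg.norm(vert - origin)
    return ideal_aspect_ratio / measured_ratio
```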
In some embodiments, the processor-based device determines a general boundary of the shaped projection screen based on a predetermined shape of the projection screen and the determined position of the fiducial mark(s) and further adjusts presentation content. In such embodiments, the predetermined shape of the projection screen can be stored in the format of, for example, file(s), data entry in database, or other suitable formats. The predetermined shape can be stored in a data repository. In some cases, the data repository may run on a single computer, a server, a storage device, a cloud server, or the like. In some other cases, the data repository may run on a series of networked computers, servers, or devices. In some implementations, the data repository includes tiers of data storage devices including local, regional, and central. In some embodiments, the alignment factors can be stored in the data repository.
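As a simple hedged illustration, if the predetermined shape is stored as a polygon whose fiducial reference point is known in the same design coordinates, the general boundary can be placed by translating that outline onto the determined fiducial position (scale and rotation are assumed to have been compensated already; all names are hypothetical).

```python
import numpy as np

def boundary_from_shape_and_fiducial(shape_outline, fiducial_in_shape,
                                     fiducial_detected):
    """Translate the stored screen outline so that its fiducial reference
    point coincides with the fiducial position determined from the captured
    image, giving the general boundary in projector coordinates."""
    offset = (np.asarray(fiducial_detected, dtype=float)
              - np.asarray(fiducial_in_shape, dtype=float))
    return np.asarray(shape_outline, dtype=float) + offset
```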
In some embodiments, the projection system can adjust display content based on the at least part of the first border of the shaped projection screen and project the adjusted display content onto the shaped projection screen by a projector. In some cases, at least part of the series of the alignment images changes along a second direction different from the first direction and the processor-based device determines at least part of a second border of the shaped projection screen based on the series of the captured alignment content. In some cases, the second border intersects with the second direction. In one embodiment, the second direction is orthogonal to the first direction. The processor-based device may determine alignment factors and/or projective transformation based on both the at least part of the first border and the at least part of the second border.
In some embodiments, the processor-based device determines at least part of the general boundary of the shaped projection screen based on the at least part of the first border and the at least part of the second border, and further determines alignment factors and/or a projective transformation based on the at least part of the general boundary of the shaped projection screen. The processor-based device can adjust presentation content by applying the alignment factors and/or projective transformation, and provide the adjusted content to the projector. The projector then projects the adjusted presentation content to the shaped projection screen.
The projection screen 900 includes a plurality of refractive elements 904 (e.g., glass beads), light absorbing layer 906, light transmitting substrate 908, removable adhesive 910, and liner 912. In one embodiment, refractive elements 904 are situated in substantially predetermined positions. However, manufacturing and cost limitations may limit the precision of the placement of refractive elements 904. For example, refractive elements 904 may be placed in an array, or in a closely or loosely packed arrangement.
Refractive elements 904 may be constructed from glass or polymeric materials. Suitable examples include glass or a transparent plastic material. Projection screens including refractive beads, and the construction of such screens, may incorporate the teachings disclosed in commonly assigned patent applications PCT WO 99/50710 and PCT WO 98/45753, and U.S. Pat. No. 6,466,368, issued Oct. 15, 2002, and entitled “REAR PROJECTION SYSTEM WITH REDUCED SPECKLE,” and U.S. Pat. No. 6,535,333, issued Mar. 18, 2003, entitled “OPTICAL SYSTEM WITH REDUCED COLOR SHIFT,” U.S. Pat. No. 6,631,030, issued Oct. 7, 2003, and entitled “PROJECTION SCREENS AND METHODS FOR MAKING SUCH PROJECTION SCREENS,” and U.S. Pat. No. 6,204,971, issued Mar. 20, 2001 and entitled “GLASS MICROSPHERES FOR USE IN FILMS AND PROJECTION SCREEN DISPLAYS AND METHODS” (the entire contents of each of which are herein incorporated by reference).
In one embodiment, refracting elements 904 are transparent, substantially spherical, refracting beads seated in an absorptive, high optical density transparent polymer matrix. The beads may be in intimate contact with a transparent binder material. The beads may have a refractive index between about 1.2 and 1.9. In some embodiments, the spherical beads have an average diameter of greater than about 20 micrometers (μm) and less than about 400 μm. For example, the average diameter may be between about 40 μm and about 90 μm. As another example, the average diameter of the refractive beads may be in a range of about 50 μm to about 80 μm. In one embodiment, the average diameter of each spherical refractive bead is about 65 μm.
The projection screen 900 including refractive beads (i.e., a “beaded screen”) affords relatively good contrast and a viewing angle that allows a bright, sharp picture to be viewed at wide angles while minimizing any losses in image quality due to washout from sunlight or room lighting. Beaded screens may be constructed to provide substantially symmetric horizontal and vertical viewing angle and gain characteristics. This may be particularly useful for large screens used in multilevel locations (such as shopping malls) where a person located on a level above or below the screen may wish to view the screen. Also, beaded screens may be constructed to be flexible so that they can be easily mounted to any rigid, transparent surface, minimizing surface reflection losses that might be present with a conventional rigid rear projection screen.
As used herein, the viewing angle means the angle at which gain is reduced by 50% of the peak value. To determine the viewing angle, screen gain is tested. Gain is a measure of screen brightness and a function of viewing angle. It is normalized with respect to a Lambertian diffuser. To measure gain, a white light source illuminates a white reflectance standard. Its luminance is measured with a luminance meter at near normal incidence (LR). A screen is placed in front of the light source and the luminance is measured (on the opposite side of the sample from the source) at near normal incidence (LS). The peak gain is defined as the ratio LS/LR. After the on-axis gain measurement, the screen is then stepped through a range of angles, with a luminance reading taken at each position. LS(Θ)/LR (gain) is then plotted as a function of angle. The viewing angle is defined as the angle at which the gain falls to one-half its peak value.
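A short numerical sketch of this procedure, assuming NumPy and a list of luminance readings taken at increasing angles, is given below; the sample data and the interpolation choice are illustrative assumptions.

```python
import numpy as np

def viewing_angle(angles_deg, screen_luminance, reference_luminance):
    """Gain at each angle is LS(theta)/LR; the viewing angle is the angle at
    which the gain falls to one-half of its peak value, found here by
    linear interpolation between measured angles."""
    gain = np.asarray(screen_luminance, dtype=float) / reference_luminance
    half_peak = gain.max() / 2.0
    for i in range(int(np.argmax(gain)), len(gain) - 1):
        if gain[i + 1] <= half_peak:
            # Interpolate the crossing angle between samples i and i+1.
            frac = (gain[i] - half_peak) / (gain[i] - gain[i + 1])
            return angles_deg[i] + frac * (angles_deg[i + 1] - angles_deg[i])
    return None  # gain never fell to half its peak over the measured range

# Illustrative readings: on-axis gain of 2.0, falling off with angle.
angles = [0, 10, 20, 30, 40, 50]
ls = [200.0, 190.0, 160.0, 120.0, 90.0, 60.0]
print(viewing_angle(angles, ls, reference_luminance=100.0))
```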
When beaded rear projection screens are used for displays, it has been found that in some situations, a wider viewing angle is desired, while in other situations, a narrower viewing angle may be preferred. Lower refractive indices for the beads tend to narrow the viewing angle, but provide a brighter image to viewers located within the area defined by the maximum viewing angle. For this reason, it is useful to be able to provide a variety of different screens for different situations. Use of different beads for different screens affords this flexibility in screen design.
In one embodiment, light absorbing layer 906 may be coated on or otherwise coupled to light transmitting substrate 908. In another embodiment, light transmitting substrate 908 may be applied onto light absorbing layer 906. Light absorbing layer 906 helps control ambient light rejection for an optical system. As a result of light absorbing layer 906, screen 900 provides excellent contrast characteristics, even in relatively high ambient lighting conditions, as compared to screens that do not include a light absorbing layer 906.
Light absorbing layer 906 may be opaque or substantially opaque. In embodiments, light absorbing layer 906 includes one or more of a powder coating of carbon black, a black dye, an opaque particle, an organic or inorganic pigment or particle, or such a particle dispersed in a binder material. The particles that define light absorbing layer 906 may be of a wide variety of types and shapes. For example, the material may be dispersed in a liquid or solid binder system. In one embodiment, light absorbing layer 906 comprises a clear binder having black particles dispersed throughout the clear binder. The binder may comprise, for example, an acrylate or other UV curable polymer. Light absorbing layer 906 may be applied by a conventional technique such as a coating process or powder coating.
Light transmitting substrate 908 is substantially flexible to help render screen 900 substantially flexible. Light transmitting substrate 908 is also substantially transparent or translucent. For example, a substantially flexible and substantially transparent substrate 908 may comprise suitable light transmitting materials such as polyvinyl chloride, acrylic, polycarbonate or combinations of such materials. Light transmitting substrate 908 may include an optional matte anti-glare finish, such as a finish achieved by embossing.
Removable adhesive 910 couples screen 900 to an application surface, such as transparent surface 830 of
Removable adhesive 910 may be an optical adhesive, such as the ones described in PCT WO 97/01610 (the entire contents of which are herein incorporated by reference). In some embodiments, removable adhesive 910 may be reusable or repositionable. Other examples of suitable adhesives 910 include strong, tacky adhesives such as acrylic adhesives available from 3M Company of St. Paul, Minn. and Ashland Chemical Company of Columbus, Ohio (such as Aroset branded acrylics), and those constructions disclosed in U.S. Pat. No. 5,196,266 and PCT Patent Publication WO94/21742. Nonlimiting examples of other pressure sensitive adhesives 910 can generally be found in Satas, Ed., Handbook of Pressure Sensitive Adhesives, 2nd Ed. (Van Nostrand Reinhold 1989). Of these adhesives, desirable adhesives include solvent-based acrylic and rubber adhesives, water-based acrylic adhesives, hot melt adhesives, microsphere-based adhesives, and silicone-based adhesives, regardless of their method of preparation.
Other nonlimiting examples of suitable adhesives 910 include acrylic adhesives from 3M Company and Ashland Chemical Co. and a nontacky adhesive, such as a terpolymer of acrylonitrile, butadiene, and isoprene, or similar copolymer of acrylonitrile and either butadiene or isoprene, commercially available under the brand Nipol adhesives from Zeon Chemical Co., Louisville, Ky. and those adhesives disclosed in EPO Patent Publication EP 0 736 585 (Kreckel et al.). Suitable acrylic adhesives having permanently low tack include microsphere-based adhesives disclosed in U.S. Pat. No. 5,141,790 (Calhoun et al.); U.S. Pat. No. 5,296,277 (Wilson et al.); U.S. Pat. No. 5,362,516 (Wilson et al.) and EPO Patent Publication EP 0 570 515 B1 (Steelman et al.), which are each incorporated herein by reference in their entireties.
Coating weights of adhesive 910 can range from about 10 micrometers (μm) to about 300 μm, such as about 20 μm to about 250 μm. Percent solids of such adhesives in the formulations to be applied on the layer range from about 5% to about 100%, such as about 20% to about 100%. Adhesive 910 may be applied using a variety of techniques known to those skilled in the art, such as casting, extruding, coating, spraying, screen-printing and laminating.
In some embodiments, the refractive index of adhesive 910 is between about 1.40 and 1.9, such as between 1.4 and 1.55. The index of refraction of adhesive 910 may be similar to the index of refraction of the substrate 908 so that a minimum amount of scattering occurs. Scattering may reduce the brightness or other optical properties of screen 900. In one embodiment, the difference in the indexes of refraction of substrate 908 and adhesive 910 is less than about 0.15, such as less than about 0.1. Alternatively, other factors may be varied to achieve the desired effect.
Other projection screens may be incorporated into a projection system of the present invention. For example, other projection screens described in commonly-assigned U.S. Pat. No. 6,870,670, entitled, “SCREENS AND METHODS FOR DISPLAYING INFORMATION,” which was previously incorporated by reference, may be used in other embodiments.
The projection systems described herein are useful for many different applications. Examples of methods of providing information to a potential customer are described in U.S. Pat. No. 6,870,670, entitled, “SCREENS AND METHODS FOR DISPLAYING INFORMATION.” Also described in U.S. Pat. No. 6,870,670 are various networks that may be utilized to display information via a projection screen. Those networks may also utilize a projection system, including a projection system described herein.
Embodiment One is a projection system capable of projection alignment, comprising: a projector configured to project an image, a shaped projection screen configured to receive the projected image and display the projected image, the projector having a projection area on the shaped projection screen, one or more alignment marks proximate to a border of the projection screen, an image sensor configured to capture an image of the shaped projection screen and generate a sensor signal corresponding to the captured image, and a processing unit electronically coupled to the image sensor and configured to receive the sensor signal and determine the positions of the one or more alignment marks based on the sensor signal.
Embodiment Two is the projection system of Embodiment One, wherein the processing unit is further configured to determine the general boundary of the projection screen based on the determined positions of the one or more alignment marks.
Embodiment Three is the projection system of Embodiment One or Embodiment Two, wherein the processing unit is further configured to determine the general boundary of the shaped projection screen based on the determined positions of the one or more alignment marks and a predetermined shape of the shaped projection screen.
Embodiment Four is the projection system of any of the Embodiment One through Embodiment Three, wherein at least one of the one or more alignment marks is a physical alignment mark.
Embodiment Five is the projection system of any of the Embodiment One through Embodiment Four, wherein at least one of the one or more alignment marks is made using at least one of a retro-reflective material, a visible ink, an infrared ink, an opaque ink, and an ultraviolet ink.
Embodiment Six is the projection system of any of the Embodiment One through Embodiment Five, further comprising: an area extended from the border of the shaped projection screen, wherein at least one of the one or more alignment marks is in the extended area.
Embodiment Seven is the projection system of any of the Embodiment One through Embodiment Six, wherein the processing unit is further configured to provide content to the projector and adjust the content based on the determined positions of the one or more alignment marks.
Embodiment Eight is the projection system of any of the Embodiment One through Embodiment Seven, wherein the processing unit is further configured to determine an alignment factor based on the positions of the one or more alignment marks.
Embodiment Nine is the projection system of Embodiment Eight, wherein the processing unit is further configured to adjust the content based on the alignment factor and provide the adjusted presentation content to the projector.
Embodiment Ten is the projection system of any of the Embodiment One through Embodiment Nine, wherein the processing unit is further configured to determine a projective transformation to be used to align the projection area with the shaped projection screen based on the positions of the one or more alignment marks.
Embodiment Eleven is the projection system of Embodiment Ten, wherein the processing unit is further configured to apply the projective transformation to presentation content and provide the adjusted presentation content to the projector.
Embodiment Twelve is the projection system of any of the Embodiment One through Embodiment Eleven, wherein the processing unit is further configured to determine a keystone factor to be used to align the projection area with the shaped projection screen based on the positions of the one or more alignment marks.
Embodiment Thirteen is the projection system of Embodiment Twelve, wherein the processing unit is further configured to adjust presentation content based on the keystone factor and provide the adjusted presentation content to the projector.
Embodiment Fourteen is the projection system of any of the Embodiment One through Embodiment Thirteen, wherein the image sensor comprises at least one of a camera, infrared camera, and CCD (Charge-Coupled Device) array.
Embodiment Fifteen is a method of alignment of a projection system, comprising: projecting a piece of alignment content, by a projector, to a shaped projection screen; providing a fiducial mark proximate to a border of the shaped projection screen; displaying the piece of alignment content by the shaped projection screen; capturing an image of at least part of the shaped projection screen when the piece of alignment content is displayed, by an image sensor, wherein the captured image comprises a visual representation of the fiducial mark; and determining, by a processing system, the position of the fiducial mark based on the captured image, wherein the projector has a projection area on the shaped projection screen.
Embodiment Sixteen is the method of Embodiment Fifteen, further comprising: determining, by the processing system, a general boundary of the shaped projection screen based on the determined position of the fiducial marks.
Embodiment Seventeen is the method of Embodiment Fifteen or Embodiment Sixteen, further comprising: determining, by the processing system, a general boundary of the shaped projection screen based on a predetermined shape of the shaped projection screen and the determined position of the fiducial marks.
Embodiment Eighteen is the method of any one of the Embodiment Fifteen through Embodiment Seventeen, further comprising: adjusting presentation content, by the processing system, based on the determined positions of the one or more fiducial marks.
Embodiment Nineteen is the method of any one of the Embodiment Fifteen through Embodiment Eighteen, wherein at least one of the one or more fiducial marks is a physical fiducial mark.
Embodiment Twenty is the method of any one of the Embodiment Fifteen through Embodiment Nineteen, wherein at least one of the one or more fiducial marks is made using at least one of a retro-reflective material, a visible ink, an infrared ink, an opaque ink, and an ultraviolet ink.
Embodiment Twenty-one is the method of any one of the Embodiment Fifteen through Embodiment Twenty, further comprising: determining, by the processing system, an alignment factor based on the positions of the one or more fiducial marks.
Embodiment Twenty-two is the method of Embodiment Twenty-one, further comprising: adjusting presentation content, by the processing system, based on the alignment factor.
Embodiment Twenty-three is the method of Embodiment Twenty-one, further comprising: adjusting a projector setting, by the processing system, based on the alignment factor.
Embodiment Twenty-four is the method of any one of the Embodiment Fifteen through Embodiment Twenty-three, further comprising: determining, by the processing system, a projective transformation based on the positions of the one or more fiducial marks.
Embodiment Twenty-five is the method of Embodiment Twenty-four, further comprising: adjusting presentation content, by the processing system, by applying the projective transformation.
Embodiment Twenty-six is the method of any one of the Embodiment Fifteen through Embodiment Twenty-five, further comprising: determining, by the processing system, a keystone factor to be used to align the projection area with the shaped projection screen based on the positions of the one or more fiducial marks.
Embodiment Twenty-seven is the method of Embodiment Twenty-six, further comprising: adjusting presentation content, by the processing system, based on the keystone factor.
Embodiment Twenty-eight is the method of any one of the Embodiment Fifteen through Embodiment Twenty-seven, further comprising: determining, by the processing system, the general boundary of the shaped projection screen based on the sensor signal and a predetermined shape of the shaped projection screen.
Embodiment Twenty-nine is the method of any one of the Embodiment Fifteen through Embodiment Twenty-eight, wherein the image sensor comprises at least one of a camera, infrared camera, and CCD (Charge-Coupled Device) array.
The present invention should not be considered limited to the particular examples and embodiments described above, as such embodiments are described in detail to facilitate explanation of various aspects of the invention. Rather the present invention should be understood to cover all aspects of the invention, including various modifications, equivalent processes, and alternative devices falling within the spirit and scope of the invention as defined by the appended claims and their equivalents.
PCT Information: Filing Document PCT/US14/24022, filed Mar. 12, 2014, Country WO.
Related U.S. Application Data: Provisional Application No. 61/782,958, filed March 2013, US.