The application relates generally to multichromic reflective layers to enhance screen gain.
Television designers have gone to great lengths to control the backlight level, even going so far as to provide per-pixel backlight control for liquid crystal display (LCD) televisions. At the same time, the amount of light emitted by the television has increased dramatically. Coupling this large increase in both contrast and brightness with the move to ten bits per color component (over the eight bits previously used) creates what is termed "High Dynamic Range" (HDR) content.
Because of the monochromatic nature of projection screens, it has been difficult to realize HDR content through a projector system. In allowed U.S. patent application Ser. No. 15/004,200, owned by the present assignee, an electronic eScreen capable of pixel level gray scale adjustment was disclosed that could be applied to a wall or other supporting substrate to provide a large surface for a video projector.
As understood herein, selective reflectivity of light from the eScreen can be enhanced using multichromic particles coated onto the eScreen.
Accordingly, an assembly includes at least one substrate against which a projector can project color video. The substrate includes pixels actuatable to establish grayscale values on the substrate. At least one multichromic reflective coating is disposed on the substrate.
The assembly can include the projector, and the substrate can include e-ink. The projector may be an ultra-short throw (UST) projector. The multichromic reflective coating may have plural reflection coefficients for light polarized in respective plural directions. In examples, the multichromic reflective coating can include molecules that are linearly disposed with respect to each other. In one example, the multichromic reflective coating includes one and only one (single) layer of multichromic reflective particles (MRP) mixed together, with each MRP reflecting red, green, or blue light such that the single layer reflects red, green, and blue light. In another example, the multichromic reflective coating includes at least first and second sublayers, with the first sublayer including at least first MRP reflecting at least a first frequency of visible light and with the second sublayer including MRP reflecting at least a second frequency of visible light different from the first frequency of visible light.
In another aspect, a method includes identifying multiple visible light frequencies characteristic of a color projector. At least a non-characteristic visible light frequency is not characteristic of the color projector. The method includes coating a projector substrate with multichromic material that reflects the multiple visible light frequencies and that does not reflect the non-characteristic visible light frequency.
In another aspect, an assembly includes at least one substrate against which a projector can project color video. The substrate includes pixels actuatable to establish grayscale values on the substrate. At least one multichromic substance (MS) is disposed on the substrate. The MS reflects red, green, and blue light.
In some examples, the MS reflects no light other than red, green, and blue. In other examples, the MS also reflects yellow light. In non-limiting example implementations, the MS reflects wavelengths between 440 nm and 450 nm, between 635 nm and 645 nm, and between 525 nm and 540 nm, and no other wavelengths. In other examples, the MS primarily reflects wavelengths of 445 nm, 638-639 nm, and 530 nm or 545 nm, and substantially no other wavelengths. In other examples, the MS reflects all wavelengths in the range of 445 nm to 639 nm.
The details of the present application, both as to its structure and operation, can best be understood in reference to the accompanying drawings, in which like reference numerals refer to like parts, and in which:
This disclosure relates generally to computer ecosystems including aspects of consumer electronics (CE) device networks such as projector systems. A system herein may include server and client components, connected over a network such that data may be exchanged between the client and server components. The client components may include one or more computing devices including video projectors and projector screens, portable televisions (e.g. smart TVs, Internet-enabled TVs), portable computers such as laptops and tablet computers, and other mobile devices including smart phones and additional examples discussed below. These client devices may operate with a variety of operating environments. For example, some of the client computers may employ, as examples, operating systems from Microsoft, or a Unix operating system, or operating systems produced by Apple Computer or Google. These operating environments may be used to execute one or more browsing programs, such as a browser made by Microsoft or Google or Mozilla or other browser program that can access web applications hosted by the Internet servers discussed below.
Servers and/or gateways may include one or more processors executing instructions that configure the servers to receive and transmit data over a network such as the Internet. Or, a client and server can be connected over a local intranet or a virtual private network. A server or controller may be instantiated by a game console such as a Sony Playstation (trademarked), a personal computer, etc.
Information may be exchanged over a network between the clients and servers. To this end and for security, servers and/or clients can include firewalls, load balancers, temporary storages, proxies, and other network infrastructure for reliability and security. One or more servers may form an apparatus that implements methods of providing a secure community such as an online social website to network members.
As used herein, instructions refer to computer-implemented steps for processing information in the system. Instructions can be implemented in software, firmware or hardware and include any type of programmed step undertaken by components of the system.
A processor may be any conventional general-purpose single- or multi-chip processor that can execute logic by means of various lines such as address lines, data lines, and control lines and registers and shift registers.
Software modules described by way of the flow charts and user interfaces herein can include various sub-routines, procedures, etc. Without limiting the disclosure, logic stated to be executed by a particular module can be redistributed to other software modules and/or combined together in a single module and/or made available in a shareable library.
Present principles described herein can be implemented as hardware, software, firmware, or combinations thereof; hence, illustrative components, blocks, modules, circuits, and steps are set forth in terms of their functionality.
Further to what has been alluded to above, logical blocks, modules, and circuits described below can be implemented or performed with one or more general purpose processors, a digital signal processor (DSP), a field programmable gate array (FPGA) or other programmable logic device such as an application specific integrated circuit (ASIC), discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A processor can be implemented by a controller or state machine or a combination of computing devices.
The functions and methods described below, when implemented in software, can be written in an appropriate language such as but not limited to C# or C++, and can be stored on or transmitted through a computer-readable storage medium such as a random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), compact disk read-only memory (CD-ROM) or other optical disk storage such as digital versatile disc (DVD), magnetic disk storage or other magnetic storage devices including removable thumb drives, etc. A connection may establish a computer-readable medium. Such connections can include, as examples, hard-wired cables including fiber optics and coaxial wires and digital subscriber line (DSL) and twisted pair wires. Such connections may include wireless communication connections including infrared and radio.
Components included in one embodiment can be used in other embodiments in any appropriate combination. For example, any of the various components described herein and/or depicted in the Figures may be combined, interchanged or excluded from other embodiments.
“A system having at least one of A, B, and C” (likewise “a system having at least one of A, B, or C” and “a system having at least one of A, B, C”) includes systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.
Now specifically referring to
For example, the projection screen assembly 12 can include one or more e-ink type screens or displays 14 that may be implemented by one or more e-ink arrays. An e-ink array may be made of small polyethylene spheres (for instance, between seventy-five and one hundred micrometers in diameter). Each sphere may be made of negatively charged black plastic on one side and positively charged white plastic on the other. The spheres can be embedded in a transparent silicone sheet, with each sphere suspended in a bubble of oil so that it can rotate freely. The polarity of the voltage applied to each pair of electrodes then determines whether the white or black side is face-up, thus giving the pixel a white or black appearance. Other e-ink technology may use polyvinylidene fluoride (PVDF) as the material for the spheres. Other e-ink technologies include electrophoretic displays with titanium dioxide particles approximately one micrometer in diameter dispersed in a hydrocarbon oil, microencapsulated electrophoretic displays, electrowetting displays, electrofluidic displays, interferometric modulator displays that can create various colors using interference of reflected light, bistable displays such as flexible plastic electrophoretic displays, cholesteric liquid crystal displays, nemoptic displays made of nematic materials, organic transistors embedded into flexible substrates, electrochromic displays, etc.
Other active screen technologies that may be used include "meta materials", chemical-based active screens, and screens with pixels established by carbon nanotubes.
The projection screen assembly 12 may include one or more speakers 16 for outputting audio in accordance with present principles, and at least one input device 18 such as e.g. an audio receiver/microphone or key pad or control keys for e.g. entering commands to at least one screen processor 20. The example screen assembly 12 may also include one or more network interfaces 22 for communication over at least one network 24 such as the Internet, a WAN, a LAN, etc. under control of the one or more processors 20. Thus, the interface 22 may be, without limitation, a Wi-Fi transceiver, which is an example of a wireless computer network interface, such as but not limited to a mesh network transceiver, or it may be a Bluetooth or wireless telephony transceiver. It is to be understood that the processor 20 controls the screen assembly 12 to undertake present principles, including the other elements of the screen assembly 12 described herein such as e.g. controlling the display 14 to present images thereon and receiving input therefrom. Furthermore, note the network interface 22 may be, e.g., a wired or wireless modem or router, or other appropriate interface such as, e.g., a wireless telephony transceiver, or Wi-Fi transceiver as mentioned above, etc.
In addition to the foregoing, the screen assembly 12 may also include one or more input ports 26 such as, e.g., a high definition multimedia interface (HDMI) port or a USB port to physically connect (e.g. using a wired connection) to another CE device and/or a headphone port to connect headphones to the screen assembly 12 for presentation of audio from the screen assembly 12 to a user through the headphones. For example, the input port 26 (and/or network interface 22) may be connected via wire or wirelessly via the network 24 to a cable or satellite or other audio video source 28 with associated source processor 28A and source computer memory 28B. Thus, the source may be, e.g., a separate or integrated set top box, or a satellite receiver. Or, the source 28 may be a game console or personal computer or laptop computer or disk player. Yet again, the source 28 and/or the color video source discussed below may be cloud servers on the Internet, and may include and perform “cloud” functions such that the devices of the system 10 may access a “cloud” environment via the server 28 in example embodiments. Or, the server 28 may be implemented by a game console or other computer in the same room as the other devices shown in
In any case, the video source 28 controls the reflectance of the video shown on the screen assembly 12 by the below-described projector by inputting grayscale values to the active pixels of the screen assembly 12. The video source 28 may be a separate video source as shown which receives full color video and derives a grayscale rendering thereof according to principles discussed below, in which case the source 28 is tailored to source a separate piece of grayscale content to maximize the usage of the reflectance properties of the screen assembly 12. Such a source 28 may be separate from the screen assembly 12 as shown or it may be incorporated into the screen assembly 12 in some implementations.
Or the source 28 may be the same as the color video source mentioned below, in which case the color video source may include a color video file for projection onto the screen assembly 12 and a corresponding grayscale video file that is sent to the screen assembly 12 to control the active elements in the screen assembly 12.
The screen assembly 12 may further include one or more computer memories 30 such as disk-based or solid-state storage that are not transitory signals, in some cases embodied in the chassis of the screen as standalone devices or as a personal video recording device (PVR) or video disk player either internal or external to the chassis of the screen assembly for playing back AV programs or as removable memory media.
Still referring to
In one example, a front projector 32 such as but not limited to a Sony ultra-short throw (UST) projector may be used to project demanded images onto the front of the display 14. The example projector 32 may include one or more network interfaces 34 for communication over the network 24 under control of one or more projector processors 36. Thus, the interface 34 may be, without limitation, a Wi-Fi transceiver, which is an example of a wireless computer network interface, including mesh network interfaces, or a Bluetooth transceiver, or a wireless telephony transceiver.
It is to be understood that the projector processor 36 controls the projector 32 to undertake present principles. In this regard, the projector processor 36 may receive signals representing demanded color images from a color video source 38 which may be the same as or different from the video source 28 described previously and which may be established by any one or more of the source types described previously. When separate grayscale and color sources are used, as opposed to separate grayscale and color video files on the same source, the sources 28, 38 may communicate with each other, e.g., via a wired communication path or via the network 24 as shown.
The projector processor 36 controls a lamp assembly 40 to project color light onto the screen assembly 12. The lamp assembly may be a laser lamp assembly or other type of color illuminator assembly. The projector may further include one or more computer memories 42 such as disk-based or solid-state storage.
As shown in
In the example shown, the projected pixels 200 are illustrated as rectilinear areas that border each other across the entirety of the screen 14. In implementation, the shape of each projected pixel 200 may not be precisely rectilinear owing to bleed over of light caused by reflection and other effects including lens structure on the projector 32, but present principles understand that such bleed over between adjacent projected pixels 200 is minimized owing to the grayscale control afforded by control of the screen pixels 202 described below. Also, in implementation the footprint of the combined projected pixels 200 that establish the color video image may not be exactly coterminous with, and may be smaller than, the entire active area of the screen 14, in which case
Based on the image from the calibration camera 302, the optics of the projector 32 and/or the direction in which the projector 32 is pointed and/or the distance at which the projector 32 is from the screen 14 can be modified to align the left-most column 300 with the left edge 306 of the active portion of the screen 14 as shown, with the left edge being made more visibly manifest by causing the left-most one, two, or three columns of screen pixels 202 to be all white. The projector 32 may be moved left or right by hand by a person observing the image of the column 300 and/or the column 300 itself as it appears on the screen. Or, the processor 304 may receive the image of the column 300 and control a motor 308 (such as a servo or stepper motor or other appropriate apparatus) to move the optics and/or housing of the projector 32 to align the column 300 with the left edge 306.
Note that in some implementations, the left most column 300 may not be aligned with the left edge 306 of the active portion of the screen but rather with a column of screen pixels 202 that is inboard of the left edge and thereafter regarded as a virtual left edge by the system.
It may also be desirable to align the projector 32 with the top edge 310 of the screen 14, with the top edge being made more visibly manifest if desired by causing the top-most one, two, or three rows of screen pixels 202 to be all white. In the example shown, a top-most row 312 of projected pixels 200 can be projected onto the screen 14. The calibration camera 302 may capture the image of the row 312.
Based on the image from the calibration camera 302, the optics of the projector 32 and/or the direction in which the projector 32 is pointed and/or the distance at which the projector 32 is from the screen 14 can be modified to align the top-most row 312 with the top edge 310 of the active portion of the screen 14 as shown. The projector 32 may be moved by hand by a person observing the image of the row 312 and/or looking at the row 312 itself as it appears on the screen. Or, the processor 304 may receive the image of the row 312 and control the motor 308 to move the optics and/or housing of the projector 32 to align the row 312 with the top edge 310.
Note that in some implementations, the top-most row 312 may not be aligned with the top edge 310 of the active portion of the screen but rather with a row of screen pixels 202 that is below the top edge and thereafter regarded as a virtual top edge by the system. Note further that the edges 306, 310 may alternatively be the physical edges of the screen if desired, when the physical edges are not coterminous with the edges of the active portion of the screen.
If desired, once the left-most column and top-most row of projected pixels are aligned with the left and top edges as described, the right and bottom projected pixel column/row may be aligned with the respective edges of the screen according to the algorithm above by, e.g., expanding or shrinking the footprint of the projected image using, e.g., the optics of the projector or by other means. Or, once the first two edges are aligned, the remaining two edges of the projected image may be projected onto the screen with the underlying screen pixels thus being designated as the virtual right and bottom edges of the screen for calibration purposes.
Present principles recognize that rows and columns of screen pixels 202 may not be precisely linear. For example, the screen 14 may be deliberately configured to be mildly concave, and/or local artifacts might exist to introduce non-linearity. Accordingly,
For illustration purposes,
For simplicity of disclosure, a single column 400 of projected pixels 200-1 through 200-7 is shown and screen assignment discussed for the pixels in that column.
In the example shown, the first, second, sixth, and seventh projected pixels 200-1, 200-2, 200-6, 200-7 would be associated with screen pixels in the respective row of the respective projected pixel from the first through third columns 202A, 202B, 202C of screen pixels based on, e.g., imaging the presence of those screen pixels within the respective projected pixels, with screen pixels in other candidate columns not being associated with these respective projected pixels. In contrast, the third, fourth, and fifth projected pixels 200-3, 200-4, 200-5 would be associated with screen pixels in the respective row of the respective projected pixel from the second through fourth columns 202B, 202C, 202D of screen pixels. The process may continue using successive columns and then rows (or using a grid as mentioned above) of projected pixels to associate respective groups of screen pixels 202 with each respective one of at least some and preferably all projected pixels 200 while accounting for possible non-linearities in the screen 14.
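The association step above can be sketched as building a lookup from each projected pixel to the group of screen pixels imaged within its footprint during calibration. This is a minimal illustrative sketch, not the claimed calibration method; the function name, pixel identifiers, and coordinate data are hypothetical.

```python
# Sketch: map each projected pixel to the screen pixels the calibration
# camera imaged inside that projected pixel's footprint. The ids and
# coordinates below are hypothetical examples.

def build_pixel_map(imaged_footprints):
    """imaged_footprints: projected-pixel id -> iterable of screen-pixel
    (row, col) coordinates seen inside that projected pixel. Returns the
    association used at playback time; screen pixels not imaged inside a
    footprint are not associated with that projected pixel."""
    return {pid: sorted(coords) for pid, coords in imaged_footprints.items()}

# Neighboring projected pixels may map to shifted screen-pixel columns
# when the screen is locally non-linear (e.g., columns A-C vs. B-D).
pixel_map = build_pixel_map({
    "200-2": [(1, 0), (1, 1), (1, 2)],  # screen columns 202A-202C
    "200-4": [(3, 1), (3, 2), (3, 3)],  # screen columns 202B-202D
})
```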
Now referring to
The grayscale value to be established by the screen pixels associated with a particular color pixel to be projected is then derived as follows. At block 502, for a color video file to be projected onto the screen 14, the logic moves to block 504 to derive a grayscale file from the color video file. The grayscale file may be derived on a pixel-by-pixel basis.
Any appropriate method may be used for deriving a grayscale file from a color file such that the grayscale values in the grayscale file are synchronized with the color values in the color file using, e.g., timing information carried over from the color file into the grayscale file.
As examples, a grayscale value can be derived as follows for each color pixel to be projected.
In systems in which luminance is directly indicated in the pixel data, that luminance may be used as the grayscale value.
When the pixel data indicates only color values for red, green, and blue (RGB), the corresponding grayscale value to be inserted into the grayscale file can use weighted sums calculated from the RGB values, if desired after first removing the gamma compression function via gamma expansion.
In some embodiments, gamma expansion may be defined as:
C_\mathrm{linear}=\begin{cases}\frac{C_\mathrm{srgb}}{12.92}, & C_\mathrm{srgb}\le 0.04045\\ \left(\frac{C_\mathrm{srgb}+0.055}{1.055}\right)^{2.4}, & C_\mathrm{srgb}>0.04045\end{cases}
where C_srgb represents any of the three gamma-compressed sRGB primaries (R_srgb, G_srgb, and B_srgb, each in the range [0,1]) and C_linear is the corresponding linear-intensity value (R, G, and B, also in the range [0,1]).
Then, luminance can be calculated as a weighted sum of the three linear-intensity values. The sRGB color space is defined in terms of the CIE 1931 linear luminance Y, which is given by
Y=0.2126 R+0.7152 G+0.0722 B.
The coefficients represent the measured intensity perception of typical trichromat humans, depending on the primaries being used; in particular, human vision is most sensitive to green and least sensitive to blue. To encode grayscale intensity in linear RGB, each of the three primaries can be set to equal the calculated linear luminance Y (replacing R,G,B by Y,Y,Y to get this linear grayscale). Linear luminance typically needs to be gamma compressed to get back to a conventional non-linear representation.
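The colorimetric derivation above, gamma expansion followed by the CIE 1931 weighted sum, can be sketched as follows. This is an illustrative sketch of the standard sRGB math, not the claimed implementation; the function names are hypothetical.

```python
# Sketch of the colorimetric grayscale derivation described above:
# gamma-expand each sRGB primary, then take the CIE 1931 luminance Y.

def srgb_to_linear(c):
    """Gamma expansion of one gamma-compressed sRGB primary c in [0, 1]."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def linear_luminance(r_srgb, g_srgb, b_srgb):
    """Linear luminance Y from gamma-compressed sRGB values, using the
    CIE 1931 coefficients (most weight on green, least on blue)."""
    r = srgb_to_linear(r_srgb)
    g = srgb_to_linear(g_srgb)
    b = srgb_to_linear(b_srgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# White (1, 1, 1) yields luminance 1.0; black (0, 0, 0) yields 0.0.
# Setting R = G = B = Y then gives the linear grayscale rendering.
```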
In contrast, for images in color spaces such as Y′UV and its relatives, which are used in standard color TV and video systems such as PAL, SECAM, and NTSC, a nonlinear luma component (Y′) can be calculated directly from gamma-compressed primary intensities as a weighted sum, which can be calculated quickly without the gamma expansion and compression used in colorimetric grayscale calculations. In the Y′UV and Y′IQ models used by PAL and NTSC, the grayscale component can be computed as
Y′=0.299 R′+0.587 G′+0.114 B′
where the prime distinguishes these gamma-compressed values from the linear R, G, B, and Y discussed above.
Yet again, for the ITU-R BT.709 standard used for HDTV, the grayscale value Y′ can be calculated as:
Y′=0.2126 R′+0.7152 G′+0.0722 B′.
Although these are numerically the same coefficients used in sRGB above, the effect is different because they are being applied directly to gamma-compressed values.
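The two luma computations above, applied directly to gamma-compressed (primed) values without gamma expansion, can be sketched as follows. This is an illustrative sketch of the standard weighted sums, not the claimed implementation; the function names are hypothetical.

```python
# Sketch of the non-colorimetric luma computations described above,
# applied directly to gamma-compressed (primed) primary intensities.

def luma_rec601(rp, gp, bp):
    """Y' per the Y'UV / Y'IQ models used by PAL and NTSC."""
    return 0.299 * rp + 0.587 * gp + 0.114 * bp

def luma_bt709(rp, gp, bp):
    """Y' per ITU-R BT.709, used for HDTV. Same coefficients as the
    sRGB luminance above, but applied to gamma-compressed values."""
    return 0.2126 * rp + 0.7152 * gp + 0.0722 * bp
```

Because no gamma expansion or compression is needed, these sums can be computed quickly per pixel, at the cost of not being colorimetrically exact.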
Recall that each color pixel to be projected is associated with plural screen pixels. Accordingly, once a single grayscale value is established for each color pixel to be projected, the process then uses that grayscale value to establish screen pixel control data defining the configuration of each of the plural screen pixels associated with the respective color pixel to be projected. Thus, each grayscale value may be expanded into “N” screen pixel control values to establish, for each screen pixel in the group of “N” screen pixels associated with the color pixel to be projected from whence the grayscale value was derived, whether that screen pixel is to be controlled to be white or black.
In one embodiment, this is done using stippling or stippling-like techniques, in which for lighter grayscale values, more of the screen pixels are caused to present a white appearance, and for darker grayscale values, more of the screen pixels are caused to present a black appearance, sometimes using randomly-selected pixels from among the group of screen pixels.
As additional illustrative examples of stippling-like techniques, halftoning or dithering may be used to configure the plural screen pixels associated with the respective color pixel to be projected to establish the derived grayscale value. Example non-limiting details of such techniques may be found in, e.g., Martin et al., “Scale-Dependent and Example-Based Stippling”, Computers & Graphics, 35(1):160-174 (2011) and Salomon, “The Computer Graphics Manual” (Springer-Verlag London, Ltd., 2011), both of which are incorporated herein by reference.
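The expansion of one grayscale value into "N" binary screen-pixel states using the randomized stippling approach described above can be sketched as follows. This is a minimal sketch, not the claimed technique or a halftoning algorithm from the cited references; the function name is hypothetical.

```python
import random

def expand_grayscale(gray, n, rng=None):
    """Map one grayscale value in [0, 1] to n black/white screen-pixel
    states: lighter values turn more of the group's pixels white, with
    the white pixels placed at randomly selected positions."""
    rng = rng or random.Random()
    n_white = round(gray * n)                    # lighter -> more white
    states = [True] * n_white + [False] * (n - n_white)  # True = white
    rng.shuffle(states)  # randomize which pixels in the group are white
    return states

# A mid-gray value over a 4-pixel group yields two white and two black
# screen pixels, in a random arrangement.
states = expand_grayscale(0.5, 4)
```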
Note that the grayscale file may contain either one or both of the grayscale values corresponding to a single color pixel to be projected, and the
In cases in which the refresh rate of the color video is faster than the refresh rate afforded by the active screen, each grayscale value may be an average of multiple color video values for the associated color pixel to be projected during a single cycle of screen refresh to which the grayscale value applies. For example, if the screen is refreshed 30 times per second and the color video is refreshed 60 times per second, each grayscale value may be the average of the two grayscale values derived from the two color pixels to be projected during the single screen refresh period. Or, each grayscale value may be a selected one of the multiple color video values for the associated color pixel to be projected during a single cycle of screen refresh to which the grayscale value applies.
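The averaging case above (e.g., 60 Hz color video on a 30 Hz screen) can be sketched as follows. This is an illustrative sketch, not the claimed method; the function name is hypothetical, and the input length is assumed to be a multiple of the refresh ratio.

```python
# Sketch: when the color video refreshes faster than the active screen,
# average the per-frame grayscale values falling within one screen
# refresh cycle to obtain one grayscale value per cycle.

def grayscale_per_screen_refresh(frame_grays, ratio):
    """frame_grays: grayscale values of successive color frames for one
    projected pixel; ratio: color refreshes per screen refresh (e.g. 2
    for 60 Hz video on a 30 Hz screen). Assumes len(frame_grays) is a
    multiple of ratio. Returns one averaged value per screen refresh."""
    return [
        sum(frame_grays[i:i + ratio]) / ratio
        for i in range(0, len(frame_grays), ratio)
    ]

# 60 Hz video on a 30 Hz screen: successive pairs of frames average out.
out = grayscale_per_screen_refresh([0.2, 0.4, 1.0, 0.0], 2)
```

The alternative described above, selecting one of the multiple values instead of averaging, would simply pick, e.g., the first value of each slice.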
While a 4K screen is mentioned above, it is to be understood that other screen resolutions are encompassed by present principles. For example, individual pixels can be increased on the screen for 8K or higher projection systems or combined into a visually equivalent e-ink contrast grid that allows for larger grayscale areas or blocks. This could happen, for instance, when a 4K projection is presented on a very large screen. The combination of the screen size and the projection resolution influences the size of the matching grayscale contrast areas or blocks of e-ink on the screen. Moreover, the e-ink areas can be adjusted for pixel aspect ratios of different sizes, such as square versus rectangular. The e-ink pixel area shape and size can be tailored to how the color video to be projected is shot, e.g., either as DV or intended for film.
Note that in a typical implementation the particle density is not great enough to completely block all projector light from the projector screen.
Note further that by “selectively reflect those wavelengths produced by the projector” is meant to selectively reflect the light exiting the projector apparatus after being generated by a source of a particular color such as a laser or LED and then possibly having passed through any color filters.
“Multichromic” includes material having different absorption coefficients for light polarized in different directions. Multichromic material can refer to a dye or stain whose molecules possess the property of becoming linearly disposed within the oriented sheet material. Further information pertaining to such technology may be found in U.S. Pat. Nos. 6,013,123 and 5,764,248 incorporated herein by reference.
Moving to block 702, multichromic material is selected for the coating 600 that best reflects the wavelengths identified at block 700. Block 704 indicates that the coating is applied to the surface of the eScreen onto which light is to be projected.
In
In
In
The above methods may be implemented as software instructions executed by a processor, including suitably configured application specific integrated circuits (ASIC) or field programmable gate array (FPGA) modules, or in any other convenient manner as would be appreciated by those skilled in the art. Where employed, the software instructions may be embodied in a device such as a CD-ROM or Flash drive or any of the above non-limiting examples of computer memories that are not transitory signals. The software code instructions may alternatively be embodied in a transitory arrangement such as a radio or optical signal, or via a download over the Internet.
It will be appreciated that whilst present principles have been described with reference to some example embodiments, these are not intended to be limiting, and that various alternative arrangements may be used to implement the subject matter claimed herein.
Number | Name | Date | Kind |
---|---|---|---|
2281101 | Land | Apr 1942 | A |
2740954 | Georges | Apr 1956 | A |
3510197 | Seki et al. | May 1970 | A |
3961839 | Brobst | Jun 1976 | A |
5111337 | Martinez | May 1992 | A |
5218472 | Jozefowicz et al. | Jun 1993 | A |
5361164 | Steliga | Nov 1994 | A |
5625489 | Glenn | Apr 1997 | A |
5764248 | Scarpetti | Jun 1998 | A |
5777720 | Shapiro et al. | Jul 1998 | A |
5903328 | Greene et al. | May 1999 | A |
6013123 | Scarpetti | Jan 2000 | A |
6208325 | Reddy et al. | Mar 2001 | B1 |
6301051 | Sankur | Oct 2001 | B1 |
6337769 | Lee | Jan 2002 | B1 |
6529322 | Jones et al. | Mar 2003 | B1 |
6530664 | Vanderwerf et al. | Mar 2003 | B2 |
6842282 | Kuroda et al. | Jan 2005 | B2 |
6892949 | Mondie | May 2005 | B2 |
7072108 | Cruz-Uribe et al. | Jul 2006 | B2 |
7130118 | Smythe et al. | Oct 2006 | B2 |
7248406 | May et al. | Jul 2007 | B2 |
7384158 | Ramachandran et al. | Jun 2008 | B2 |
7480096 | May et al. | Jan 2009 | B2 |
7535636 | Lippey et al. | May 2009 | B2 |
7538942 | Odagiri et al. | May 2009 | B2 |
7545397 | O'Dea et al. | Jun 2009 | B2 |
7614750 | May et al. | Nov 2009 | B2 |
7661828 | Allen et al. | Feb 2010 | B2 |
7733310 | Hajjar et al. | Jun 2010 | B2 |
7911693 | Smith et al. | Mar 2011 | B2 |
7936507 | Sano et al. | May 2011 | B2 |
7974005 | Huibers et al. | Jul 2011 | B2 |
8081368 | Lippey | Dec 2011 | B2 |
8218236 | Shiau et al. | Jul 2012 | B2 |
8284487 | Liu | Oct 2012 | B1 |
8411983 | Wei | Apr 2013 | B2 |
8469519 | Marcus et al. | Jun 2013 | B2 |
8649090 | Hosoi | Feb 2014 | B2 |
8913000 | Erol et al. | Dec 2014 | B2 |
9412318 | Chang | Aug 2016 | B2 |
9640143 | Dawson et al. | May 2017 | B1 |
9792847 | Dawson et al. | Oct 2017 | B2 |
20010035927 | Sasagawa et al. | Nov 2001 | A1 |
20030147053 | Matsuda et al. | Aug 2003 | A1 |
20040257649 | Heikkila et al. | Dec 2004 | A1 |
20050128581 | Samuels et al. | Jun 2005 | A1 |
20060061860 | Devos et al. | Mar 2006 | A1 |
20060209213 | Baker | Sep 2006 | A1 |
20060228523 | Cronin | Oct 2006 | A1 |
20060279839 | May | Dec 2006 | A1 |
20070014318 | Hajjar | Jan 2007 | A1 |
20070040989 | Weng et al. | Feb 2007 | A1 |
20070133088 | Lippey | Jun 2007 | A1 |
20070177063 | Hiramatsu | Aug 2007 | A1 |
20080100564 | Vincent | May 2008 | A1 |
20080144172 | Sano | Jun 2008 | A1 |
20080239497 | Lippey | Oct 2008 | A1 |
20080304014 | Vaan | Dec 2008 | A1 |
20090086296 | Renaud-Goud | Apr 2009 | A1 |
20100097699 | Destain et al. | Apr 2010 | A1 |
20100207956 | Keh et al. | Aug 2010 | A1 |
20100245995 | Graetz et al. | Sep 2010 | A1 |
20110019914 | Bimber et al. | Jan 2011 | A1 |
20110075114 | Tanis-Likkel et al. | Mar 2011 | A1 |
20110179734 | Shaffer | Jul 2011 | A1 |
20120223879 | Winter | Sep 2012 | A1 |
20130033530 | Gamache et al. | Feb 2013 | A1 |
20140028594 | Chen et al. | Jan 2014 | A1 |
20140104297 | Yang | Apr 2014 | A1 |
20140168288 | Tusch | Jun 2014 | A1 |
20140354698 | Lee et al. | Dec 2014 | A1 |
20150077849 | Sadahiro et al. | Mar 2015 | A1 |
20150138627 | Ehrensperger et al. | May 2015 | A1 |
20150309316 | Osterhout | Oct 2015 | A1 |
20160366379 | Hickl | Dec 2016 | A1 |
20170032728 | Shima et al. | Feb 2017 | A1 |
20170075207 | Tao et al. | Mar 2017 | A1 |
20170269360 | Yamaguchi | Sep 2017 | A1 |
Number | Date | Country |
---|---|---|
0421809 | Apr 1991 | EP |
624772 | Jun 1949 | GB |
H09274159 | Oct 1997 | JP |
2002097730 | Apr 2002 | JP |
2008032925 | Feb 2008 | JP |
1020160103460 | Sep 2016 | KR |
Entry |
---|
“Team Develops new, inexpensive transparent projection screen (w/Video)”, Phys.Org, Jan. 21, 2014. |
Thomas Dawson, Steven Richman, “Multichromic Filtering Layer to Enhance Screen Gain”, file history of related U.S. Appl. No. 15/656,691, filed Jul. 21, 2017. |
Thomas Dawson, Steven Richman, “Multichromic Reflective Layer to Enhance Screen Gain”, related U.S. Appl. No. 15/656,691, Non-Final Office Action dated Nov. 20, 2017. |
Thomas Dawson, Steven Richman, “Multichromic Reflective Layer to Enhance Screen Gain”, related U.S. Appl. No. 15/656,691, Applicant's response to Non-Final Office Action filed Nov. 22, 2017. |
Steven Martin Richman, Thomas Dawson, Frederick J. Zustak, “Dual Layer EScreen to Compensate for Ambient Lighting”, related U.S. Appl. No. 15/601,758, Non-Final Office Action dated Jan. 19, 2018. |
Steven Martin Richman, Thomas Dawson, Frederick J. Zustak, “Dual Layer EScreen to Compensate for Ambient Lighting”, related U.S. Appl. No. 15/601,758, Applicant's response to Non-Final Office Action filed Jan. 23, 2018. |
Thomas Dawson, Steven Richman, “Multichromic Filtering Layer to Enhance Screen Gain”, related U.S. Appl. No. 15/656,691, Final Office Action dated Jan. 30, 2018. |
Thomas Dawson, Steven Richman, “Multichromic Filtering Layer to Enhance Screen Gain”, related U.S. Appl. No. 15/656,691, Applicant's response to Final Office Action filed Feb. 1, 2018. |
“How to Setup a Projector Screen”, EBAY, Mar. 13, 2016. Retrieved from http://www.ebay.com/gds/How-to-Set-Up-a-Projector-Screen-/10000000205290613/g.html. |
“VIEWALL® Erasable Wall Material”, Visual Planning 2015 Corporation, Sep. 15, 2016. Retrieved from http://www.visualplanning.com/boardswallmaterial2.html. |
Steven Richman, Thomas Dawson, Frederick J. Zustak, “Dual Layer EScreen to Compensate for Ambient Lighting”, file history of related U.S. Appl. No. 15/601,758, filed May 22, 2017. |
Steven Richman, Thomas Dawson, Frederick J. Zustak, “Tunable Lenticular Screen to Control Luminosity and Pixel-Based Contrast”, file history of related U.S. Appl. No. 15/601,686, filed May 22, 2017. |
Thomas Dawson, Steven Richman, Frederick J. Zustak, “Transparent Glass of Polymer Window Pane as a Projector Screen”, file history of related U.S. Appl. No. 15/602,796, filed May 23, 2017. |
Steven Richman, Thomas Dawson, “Wallpaper-Based Lenticular Projection Screen”, file history of related U.S. Appl. No. 15/608,667, filed May 30, 2017. |
Steven Richman, Thomas Dawson, “Tile-Based Lenticular Projection Screen”, file history of related U.S. Appl. No. 15/666,247, filed Aug. 1, 2017. |
Thomas Dawson, Steven Richman, “Microfaceted Projection Screen”, file history of related U.S. Appl. No. 15/615,523, filed Jun. 6, 2017. |
Steven Richman, Thomas Dawson, Frederick J. Zustak, “Tunable Lenticular Screen to Control Luminosity and Pixel-Based Contrast”, related U.S. Appl. No. 15/601,686, Non-Final Office Action dated Jun. 18, 2018. |
Steven Richman, Thomas Dawson, Frederick J. Zustak, “Tunable Lenticular Screen to Control Luminosity and Pixel-Based Contrast”, related U.S. Appl. No. 15/601,686, Applicant's response to Non-Final Office Action filed Jun. 20, 2018. |
Thomas Dawson, Steven Richman, “Multichromic Filtering Layer to Enhance Screen Gain”, related U.S. Appl. No. 15/656,691, Non-Final Office Action dated Jun. 12, 2018. |
Thomas Dawson, Steven Richman, “Multichromic Filtering Layer to Enhance Screen Gain”, related U.S. Appl. No. 15/656,691, Applicant's response to Non-Final Office Action filed Jun. 14, 2018. |
Thomas Dawson, Steven Richman, “Microfaceted Projection Screen”, related U.S. Appl. No. 15/615,523, Non-Final Office Action dated Sep. 27, 2018. |
Thomas Dawson, Steven Richman, “Microfaceted Projection Screen”, related U.S. Appl. No. 15/615,523, Applicant's response to Non-Final Office Action filed Oct. 1, 2018. |
Thomas Dawson, Steven Richman, “Multichromic Filtering Layer to Enhance Screen Gain”, related U.S. Appl. No. 15/656,691, Final Office Action dated Jul. 26, 2018. |
Thomas Dawson, Steven Richman, “Multichromic Filtering Layer to Enhance Screen Gain”, related U.S. Appl. No. 15/656,691, Applicant's response to Final Office Action filed Aug. 8, 2018. |
Steven Richman, Thomas Dawson, “Tile-Based Lenticular Projection Screen”, related U.S. Appl. No. 15/666,247, Non-Final Office Action dated May 3, 2019. |
Steven Richman, Thomas Dawson, “Tile-Based Lenticular Projection Screen”, related U.S. Appl. No. 15/666,247, Applicant's response to Non-Final Office Action filed May 8, 2019. |
Steven Richman, Thomas Dawson, “Wallpaper-Based Lenticular Projection Screen”, related U.S. Appl. No. 15/608,667, Applicant's response to Final Office Action filed Apr. 23, 2019. |
Thomas Dawson, Steven Richman, “Multichromic Filtering Layer to Enhance Screen Gain”, related U.S. Appl. No. 15/656,691, Applicant's Reply Brief in response to the Examiner's Answer filed Apr. 23, 2019. |
Steven Richman, Thomas Dawson, “Wallpaper-Based Lenticular Projection Screen”, related U.S. Appl. No. 15/608,667, Non-Final Office Action dated Mar. 7, 2019. |
Steven Richman, Thomas Dawson, “Wallpaper-Based Lenticular Projection Screen”, related U.S. Appl. No. 15/608,667, Applicant's response to Non-Final Office Action filed Mar. 11, 2019. |
Steven Richman, Thomas Dawson, “Wallpaper-Based Lenticular Projection Screen”, related U.S. Appl. No. 15/608,667, Final Office Action dated Apr. 8, 2019. |
Thomas Dawson, Steven Richman, “Multichromic Filtering Layer to Enhance Screen Gain”, related U.S. Appl. No. 15/656,691, Examiner's Answer dated Apr. 8, 2019. |
Dawson et al., “Transparent Glass of Polymer Window Pane as a Projector Screen”, related U.S. Appl. No. 15/602,796, Final Office Action dated Jun. 26, 2019. |
Dawson et al., “Transparent Glass of Polymer Window Pane as a Projector Screen”, related U.S. Appl. No. 15/602,796, Applicant's response to Final Office Action filed Jul. 19, 2019. |
Steven Richman, Thomas Dawson, “Wallpaper-Based Lenticular Projection Screen”, related U.S. Appl. No. 15/608,667, Non-Final Office Action dated Jun. 14, 2019. |
Steven Richman, Thomas Dawson, “Wallpaper-Based Lenticular Projection Screen”, related U.S. Appl. No. 15/608,667, Applicant's response to Non-Final Office Action filed Jun. 18, 2019. |
Thomas Dawson, Steven Richman, “Microfaceted Projection Screen”, related U.S. Appl. No. 15/615,523, Non-Final Office Action dated Jun. 11, 2019. |
Thomas Dawson, Steven Richman, “Microfaceted Projection Screen”, related U.S. Appl. No. 15/615,523, Applicant's response to Non-Final Office Action filed Jun. 12, 2019. |
Thomas Dawson, Steven Richman, “Multichromic Filtering Layer to Enhance Screen Gain”, related U.S. Appl. No. 15/656,691, Non-Final Office Action dated Dec. 13, 2018. |
Thomas Dawson, Steven Richman, “Multichromic Filtering Layer to Enhance Screen Gain”, related U.S. Appl. No. 15/656,691, Applicant's response to Non-Final Office Action filed Dec. 17, 2018. |
Number | Date | Country | |
---|---|---|---|
20190028670 A1 | Jan 2019 | US |