The present disclosure relates to display devices. Specifically, the present disclosure relates to display devices that use fluid droplets as a display medium.
Fountains and other water features have long been used as attractive elements in gardens and corporate lobbies. Images have been projected onto fountains and ponds to create eye-catching attractions. Such projected images require a projector placed out of plane from the surface on which the images are projected. Positioning the projector out of plane from the display requires additional space and can be unattractive. Additionally, images projected onto fountains or other water features are viewable from only one direction.
It is within this context that embodiments of the present disclosure arise.
The aspects of the present disclosure can be readily understood by considering the following detailed description in conjunction with the accompanying drawings, in which:
Although the following detailed description contains many specific details for the purposes of illustration, any person of ordinary skill in the art will appreciate that many variations and alterations to the following details are within the scope of the disclosure. Accordingly, examples of embodiments of the disclosure described below are set forth without any loss of generality to, and without imposing limitations upon, the claimed disclosure.
While numerous specific details are set forth in order to provide a thorough understanding of embodiments of the disclosure, it will be understood by those skilled in the art that other embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components and circuits have not been described in detail so as not to obscure the present disclosure. Some portions of the description herein are presented in terms of algorithms and symbolic representations of operations on data bits or binary digital signals within a computer memory. These algorithmic descriptions and representations may be the techniques used by those skilled in the data processing arts to convey the substance of their work to others skilled in the art.
An algorithm, as used herein, is a self-consistent sequence of actions or operations leading to a desired result. These include physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
Unless specifically stated or otherwise apparent from the following discussion, it is to be appreciated that throughout the description, discussions utilizing terms such as “processing”, “computing”, “converting”, “reconciling”, “determining” or “identifying,” refer to the actions and processes of a computer platform, which is an electronic computing device that includes a processor that manipulates and transforms data represented as physical (e.g., electronic) quantities within the processor's registers and accessible platform memories into other data similarly represented as physical quantities within the computer platform memories, processor registers, or display screen.
A computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks (e.g., compact disc read-only memories (CD-ROMs), digital video discs (DVDs), Blu-Ray Discs™, etc.), and magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, flash memories, or any other type of non-transitory media suitable for storing electronic instructions.
The terms “coupled” and “connected,” along with their derivatives, may be used herein to describe structural relationships between components of the apparatus for performing the operations herein. It should be understood that these terms are not intended as synonyms for each other. Rather, in particular embodiments, “connected” may be used to indicate that two or more elements are in direct physical or electrical contact with each other. In some instances, “connected”, “connection”, and their derivatives are used to indicate a logical relationship, e.g., between node layers in a neural network (NN). “Coupled” may be used to indicate that two or more elements are in either direct or indirect (with other intervening elements between them) physical or electrical contact with each other, and/or that the two or more elements co-operate or communicate with each other (e.g., as in a cause-and-effect relationship).
In the following Description, reference is made to the accompanying drawings, which form a part hereof, and in which is shown by way of illustration specific embodiments in which the claimed invention may be practiced. In this regard, directional terminology, such as “top,” “bottom,” “above,” “below,” “front,” “back,” “leading,” “trailing,” etc., is used with reference to the orientation of the figure(s) being described. Because certain components described herein can be positioned in a number of different orientations, the directional terminology is used for purposes of illustration and is in no way limiting. It is to be understood that other embodiments may be utilized and structural or logical changes may be made without departing from the scope of the claimed invention. The following Description, therefore, is not to be taken in a limiting sense, and the scope of the present invention is defined by the appended claims.
While this
In the implementation depicted in
The droplets shown in
Air resistance, which increases as the speed of a droplet increases, tends to dampen the acceleration of the liquid droplets 103. The air resistance and gravitational acceleration are in equilibrium when the liquid droplet is traveling at its terminal velocity. The longer a droplet falls, the closer its speed will be to its terminal velocity. If droplets are emitted at terminal velocity, or achieve terminal velocity by the time they reach the first light emitter, they will maintain a consistent speed and spacing as they fall past the light emitters. In some implementations of the present disclosure, the outlet nozzles may emit the liquid droplets at terminal velocity. In other implementations, the droplets may be released at a velocity above their terminal velocity. The outlet nozzles may be configured to output liquid droplets at terminal velocity through, for example and without limitation, a piezoelectric micro-diaphragm or a small heating element. In other implementations, the pressure of the liquid at the outlet nozzle is high enough that droplets are emitted at terminal velocity. In some implementations, the droplets can be released at a higher initial velocity by using airflow in the vicinity of the nozzles to accelerate the droplets as they are released. In some implementations, this airflow is constant, while in other implementations the airflow is present only for a short period of time as each droplet is released. In some implementations, a physical mechanism can be used to push the droplet as it is being released, or just after it is released, to increase its initial velocity. For example and without limitation, a spinning water wheel or a paddle mechanism could push droplets to increase their initial velocity.
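The approach of a falling droplet toward terminal velocity can be illustrated with a simple quadratic-drag model. The sketch below is not part of the disclosed system; the droplet radius, drag coefficient, and fluid densities are hypothetical values chosen only to show how speed converges on the terminal velocity at which drag balances gravity.

```python
import math

# Hypothetical parameters for illustration only.
G = 9.81            # gravitational acceleration, m/s^2
RHO_LIQUID = 1000.0 # density of water, kg/m^3
RHO_AIR = 1.2       # density of air, kg/m^3
C_D = 0.47          # drag coefficient of a sphere (assumed)
RADIUS = 0.001      # 1 mm droplet radius, m

mass = RHO_LIQUID * (4.0 / 3.0) * math.pi * RADIUS ** 3
area = math.pi * RADIUS ** 2
k = 0.5 * RHO_AIR * C_D * area  # quadratic-drag constant: F_drag = k * v^2

# Terminal velocity: drag k*v^2 balances gravity m*g.
v_terminal = math.sqrt(mass * G / k)

def speed_after(t, dt=1e-4):
    """Euler-integrate dv/dt = g - (k/m) * v^2 for a droplet released at rest."""
    v = 0.0
    for _ in range(int(t / dt)):
        v += (G - (k / mass) * v * v) * dt
    return v

print(f"terminal velocity ~ {v_terminal:.2f} m/s")
print(f"speed after 2 s of fall: {speed_after(2.0):.2f} m/s")
```

With these assumed values the droplet is already close to terminal velocity after roughly two seconds of fall, which is why emitting droplets at terminal velocity (or letting them reach it before the first light emitter) keeps their spacing consistent.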
The dotted vertical lines from each droplet emitter 101 show the path that the droplets emitted by that emitter will follow. The dashed horizontal lines shown to the right of each light emitter 102 show the path that the light emitted by that light emitter will follow. The full horizontal lines shown to the right of some of the light emitters 102 show light beams emitted by those light emitters intersecting a droplet 103. For example, the light beam emitted by light emitter 102a illuminates the droplet 103g created by the seventh droplet emitter. The corresponding light beam will have the intensity and coloration of the seventh pixel in the top row of the image being displayed.
The image created by the display is updated rapidly to ensure that the image is coherent. In one implementation, each outlet nozzle in the horizontal row of liquid outlet nozzles emits droplets at least 24 times per second. For larger displays and smoother image reproduction, higher emission rates, hereinafter referred to as refresh rates, are desirable. A refresh rate of between 60 and 100 times per second may produce smoother, more comprehensible images.
Wind and environmental conditions may affect the quality of images displayed; as such, a transparent material 104 may be placed on either side of the falling liquid droplets to shield them from such environmental effects. In implementations that are intended to be viewed from only one side, the material 104 on the backside of the display can be made of a material that does not allow the display to be viewed through it. The transparent material 104 may be part of an enclosure 105 that surrounds the liquid display device. The transparent material 104 may be glass, plastic, ceramic, transparent metal, greased paper, or any transparent material sufficient to block wind while not blocking light. In some implementations, the enclosure 105 may be sealed from the surrounding environment. The seal may be an airtight seal, and in some implementations the air pressure within the enclosure may be less than the air pressure outside the enclosure. Reducing the air pressure in the enclosure may allow for faster refresh rates because the falling liquid droplets will have a higher terminal velocity. In implementations with reduced air pressure, boiling of the liquid may be a problem; therefore, a liquid with a lower vapor pressure and higher boiling point may be used. Sublimation may also be a problem, and in such cases a liquid with a higher vapor pressure may be used.
By way of example, and not by way of limitation, the liquid emitted from the outlet nozzles 101 may be any material that is a liquid at room temperature. The liquid may be, for example and without limitation, water, mineral oil (alkanes or cycloalkanes), alcohol, or carboxylic acid. In general, a suitable liquid will have a high degree of internal reflection/refraction so that there is a wide viewing angle. If there were zero internal reflections, the light would pass straight through and no light would be dispersed in the direction of the viewer. It should be noted that higher-viscosity liquids form droplets more slowly and as such affect the refresh rate. Additives may be combined with the liquid to improve display performance or provide additional effects. For example and without limitation, ethylene glycol or propylene glycol may be added to a water display to increase the boiling point of the mixture, and fluorescent or phosphorescent dye may be added to the liquid to enhance the effect of the light emitters. In implementations using fluorescent dyes, the light emitters may emit in the infrared or ultraviolet spectrum. Such implementations would show no visible light along the edges or top of the display. In some implementations, the nozzles may incorporate gas bubbles into the droplets as they are emitted, which can give the droplets a cloudy appearance that disperses light more evenly.
To create a three-dimensional effect several horizontal rows of outlet nozzles may be stacked next to each other in the shorter horizontal dimension as shown in
A controller 206 may be communicatively coupled 210 to the light emitters 201, 202, 203 and outlet nozzles 208 through, for example and without limitation, a serial cable, a bus bar, an Ethernet cable, or any other suitable connection type. The controller may be configured to receive image frames and convert the image frames to time-synchronized signals sent to the light emitters. This conversion can be done by determining the timing for when each droplet will fall past each light emitter and using the light levels for the corresponding pixel from the image frame to illuminate the light emitter at that time. The image frames may be part of a stream of image frames in a video stream. Alternatively, the image frame may be a still image. In some implementations, the controller is coupled only to the light emitters, and the outlet nozzles are controlled independently. In this implementation, the controller may be calibrated to generate an independent representation of the pattern and timing of fluid droplets created by the outlet nozzles. In other implementations, the timing of the drops may be calculated from the signals sent to the nozzles to control the release of the drops. In some implementations, the timing of the drops is determined from measurements of the drops. By placing two sensors between each nozzle and the first light emitter, the time at which each released droplet passes each sensor can be measured, giving both the time when the droplet passed the bottom sensor and the velocity at which the droplet was falling. Such measurements of the droplets can be used to compensate for changes in nozzle behavior over time, such as from a buildup of mineral deposits. Such measurements can be used in conjunction with the signals sent to the nozzles to automatically perform calibration that compensates for differences in behavior from one nozzle to the next.
Such sensors can be implemented as a beam and detector that can detect when the droplet crosses the path of the beam. Such sensors can be implemented using cameras, in which case a single sensor may be used to detect both timing and velocity. In some cases, a single camera can be used as the sensor for the droplets from multiple nozzles.
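The two-sensor timing measurement described above can be sketched in a few lines. This is an illustrative model only: the sensor spacing, the assumption of constant gravitational acceleration with drag neglected, and all function names are hypothetical, not part of the disclosed system.

```python
import math

# Hypothetical geometry: positions measured downward from the nozzle, in meters.
SENSOR_TOP = 0.05     # upper sensor
SENSOR_BOTTOM = 0.10  # lower sensor
G = 9.81              # gravitational acceleration, m/s^2

def droplet_state(t_top, t_bottom):
    """From the times a droplet crossed each sensor, return the time it
    passed the bottom sensor and its (finite-difference) velocity there."""
    velocity = (SENSOR_BOTTOM - SENSOR_TOP) / (t_bottom - t_top)
    return t_bottom, velocity

def arrival_time(t0, v0, emitter_depth):
    """Predict when the droplet reaches a light emitter emitter_depth meters
    below the bottom sensor, assuming free fall without drag:
    d = v0*t + g*t^2/2, solved for t with the quadratic formula."""
    t = (-v0 + math.sqrt(v0 * v0 + 2 * G * emitter_depth)) / G
    return t0 + t

# A droplet crosses the upper sensor at t = 0 s and the lower at t = 0.02 s.
t0, v0 = droplet_state(t_top=0.000, t_bottom=0.020)
print(f"velocity at bottom sensor: {v0:.2f} m/s")
print(f"predicted arrival 0.5 m below: {arrival_time(t0, v0, 0.5):.4f} s")
```

A camera-based sensor would produce the same two quantities (crossing time and velocity) from consecutive video frames rather than from two beam crossings.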
A more organized droplet 209 pattern is shown in
Additionally shown in
A benefit of the implementation shown in
In some implementations, the velocity at which droplets are emitted by a first nozzle can differ from the velocity at which droplets are emitted by a second nozzle. Such differences can be used, for example, in an implementation where the nozzles are in an angled line but the light emitters are in a vertical line projecting horizontal beams of light. Releasing the droplets from the lower nozzles at a higher velocity than the droplets released from higher nozzles can compensate for the fact that the droplets released by the higher nozzles were accelerated by gravity before they reached the height of the lower nozzles. This can be used to have the droplets pass the first light emitter with a constant velocity. Such an implementation can also work well when the nozzles are arranged in a stair-step pattern consisting of multiple horizontal lines at differing heights. Releasing droplets at differing velocities can also be used to create eye-catching patterns in the falling droplets.
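Neglecting drag, the compensating release speed follows from elementary kinematics: a droplet released at speed v from a nozzle Δh higher is moving at sqrt(v² + 2gΔh) by the time it falls to the lower nozzle's height, so the lower nozzle can release at that speed to match. The sketch below uses hypothetical numbers solely to illustrate the relation.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def matched_release_velocity(v_upper, height_difference):
    """Release speed for a lower nozzle so its droplets match the speed of
    droplets from a nozzle height_difference meters higher that were
    released at v_upper. Drag is neglected: v^2 = v_upper^2 + 2*g*dh."""
    return math.sqrt(v_upper ** 2 + 2 * G * height_difference)

# Droplets leave an upper nozzle at 1 m/s; a nozzle 0.2 m lower matches them.
print(f"{matched_release_velocity(1.0, 0.2):.2f} m/s")
```

In practice drag would shrink the required difference somewhat, which is one reason the measurement-based calibration described earlier is useful.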
In some implementations, the nozzles can be used to emit droplets in a pattern that is meant to draw the attention of the viewer. If such a pattern is emitted when no image is being displayed, such as during the blank interval between scenes in a video program, then multiple droplets can be emitted at the same time, as there is no concern with droplets obscuring light beams. Such a display can be interspersed with video display to better grab viewers' attention and break up the monotony of a continuous video display.
Conversion of Image Frames to Fluid Images
Initially, calibration data may be used to determine the speed and trajectory of liquid droplets emitted from the system. Ideally, each outlet nozzle would emit liquid with the same volume and vertical speed, but due to fluid dynamics such consistency is unlikely; therefore, some initial calibration data is required. Calibration may be performed using a camera and a light source to track the trajectory and speed of the falling droplets.
Once the speed and trajectory of the falling droplets is sufficiently tracked, the information may be used to generate an internal representation of the falling droplets. The internal representation of the falling droplets may then be mapped to the image frame through a series of time steps.
For example and without limitation, the image is present in an image buffer with data, e.g., chroma and luma data, for pixels arranged in an xy plane, as is common for most computer graphics systems. In another buffer there are arranged, by delta-time from the start of the frame, pointers to pieces of image data from the image frame and indices to the light emitters (RGB, etc.) set to illuminate. This data is pre-calculated, possibly during a calibration phase, such that every pixel in the image frame is presented. The image may be masked for displays that do not have a strictly horizontal rectangular display area.
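The two-buffer arrangement described above can be sketched as follows. This is an illustrative model only: the staggered droplet timing, buffer sizes, and all names are hypothetical, and a real system would derive the delta-times from calibration data rather than from a fixed formula.

```python
from collections import namedtuple

# One pre-calculated entry per pixel: when (delta-time from frame start) to
# illuminate which light emitter with which piece of image data.
ScheduleEntry = namedtuple("ScheduleEntry", "delta_time pixel emitter")

def build_schedule(width, height, row_interval):
    """Pre-calculate the schedule buffer so that every pixel in the image
    frame is presented. The delta-time formula here is a hypothetical
    staggered pattern; in practice it would come from calibration."""
    schedule = []
    for x in range(width):
        for y in range(height):
            dt = (x + y) * row_interval  # hypothetical droplet timing
            schedule.append(ScheduleEntry(dt, (x, y), emitter=y))
    schedule.sort(key=lambda e: e.delta_time)
    return schedule

def play_frame(schedule, image):
    """Walk the schedule in delta-time order, yielding one illumination
    command (delta_time, emitter index, pixel value) per entry."""
    for entry in schedule:
        yield entry.delta_time, entry.emitter, image[entry.pixel]

# A hypothetical 3x3 image buffer keyed by (x, y).
image = {(x, y): 10 * x + y for x in range(3) for y in range(3)}
commands = list(play_frame(build_schedule(3, 3, row_interval=0.01), image))
```

Because the schedule is sorted by delta-time, the driver can simply walk it once per frame, which matches the description of a buffer "arranged by delta-time from the start of the frame."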
In the example illustrated in
Thus, the command table of the falling droplet pattern in
DT1: Image 1,3 to I1; Image 2,2 to I2; Image 3,1 to I3; Nozzle D3
DT2: Image 3,3 to I1; Image 1,2 to I2; Image 2,1 to I3; Nozzle D2
DT3: Image 2,3 to I1; Image 3,2 to I2; Image 1,1 to I3; Nozzle D1
Where DT1 . . . n represent drop times, I1 . . . n represent light emitters, and D1 . . . n represent outlet nozzles. In the above example, a drop time or delta-time corresponds to the time it takes for a drop to travel the distance from one light emitter row to another light emitter row, and the drop pattern loops indefinitely. In some implementations, the drop times are not synchronous with illumination times and may be offset, e.g., as T0.5, T1.4, etc. In other implementations, the drop pattern is varied based on a predefined pattern map. The pattern map may simply be an array similar to the one above showing drop locations at delta times.
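The looping command table above maps directly onto a small data structure. The sketch below transcribes the three listed delta-time steps verbatim; the dictionary layout and function name are hypothetical conveniences, not part of the disclosure.

```python
# The command table from the example above. Each step lists which image
# pixel drives which light emitter, and which nozzle releases a droplet;
# the pattern loops indefinitely.
COMMAND_TABLE = [
    # DT1: Image 1,3 -> I1; Image 2,2 -> I2; Image 3,1 -> I3; Nozzle D3
    {"illuminate": [((1, 3), "I1"), ((2, 2), "I2"), ((3, 1), "I3")], "nozzle": "D3"},
    # DT2: Image 3,3 -> I1; Image 1,2 -> I2; Image 2,1 -> I3; Nozzle D2
    {"illuminate": [((3, 3), "I1"), ((1, 2), "I2"), ((2, 1), "I3")], "nozzle": "D2"},
    # DT3: Image 2,3 -> I1; Image 3,2 -> I2; Image 1,1 -> I3; Nozzle D1
    {"illuminate": [((2, 3), "I1"), ((3, 2), "I2"), ((1, 1), "I3")], "nozzle": "D1"},
]

def step(t):
    """Return the command for delta-time step t; the table loops indefinitely."""
    return COMMAND_TABLE[t % len(COMMAND_TABLE)]
```

A pattern map as described in the text would simply be a longer (or dynamically selected) table of the same shape.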
The frame buffer 608 contains the illumination value for each pixel of the current image being displayed. In some instances, the frame buffer may also contain the illumination for each pixel in more than one frame, allowing access to upcoming and/or previous illumination values. The light emitter driver 605 uses the droplet timing data 604 to determine when each droplet will pass through the path of the light beam from each light emitter. For each droplet passing through a light beam, the light emitter driver then gets the illumination level for the corresponding pixel in the display from the frame buffer 608 and sends that information to the appropriate light emitter 606 to have the beam illuminate the droplet with the proper value for the pixel.
In some implementations, there is not a one-to-one mapping between the pixels in the frame buffer and the pixels in the display. This can be the case if the image source provides an image that is a different size or aspect ratio than the display. This can also be the case if the display area is not rectangular, such as when the display area is a trapezoid. In such a case, the display area may be scaled to the size of the frame buffer, and the pixels in the frame buffer that do not correspond to the scaled display area can be ignored, such as displaying a trapezoidal area out of a rectangular TV signal and ignoring the rest of the area. Where there is not a one-to-one mapping, the light emitter driver can take care of any scaling or masking needed to translate the image data in the frame buffer into the pixel illumination for each pixel in the display.
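A masking step for a trapezoidal display area might look like the following sketch. The linear-inset geometry, the 5x3 frame buffer, and both function names are hypothetical illustrations of the idea of ignoring frame-buffer pixels outside the display area.

```python
def in_trapezoid(x, y, width, height, top_inset):
    """True if pixel (x, y) lies inside a trapezoidal display area whose top
    edge is inset by top_inset pixels on each side, widening linearly to the
    full width at the bottom row. y = 0 is the top row."""
    inset = top_inset * (1.0 - y / (height - 1))
    return inset <= x <= width - 1 - inset

def masked_pixels(frame, width, height, top_inset):
    """Yield (x, y, value) only for frame-buffer pixels inside the display
    area; pixels outside the trapezoid are ignored."""
    for y in range(height):
        for x in range(width):
            if in_trapezoid(x, y, width, height, top_inset):
                yield x, y, frame[(x, y)]

# Hypothetical 5x3 frame buffer with a 1-pixel inset at each top corner.
frame = {(x, y): 10 * x + y for x in range(5) for y in range(3)}
visible = list(masked_pixels(frame, width=5, height=3, top_inset=1))
```

Scaling to a different source resolution would be an analogous per-pixel transform applied before the mask test.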
In some implementations, a control unit (not shown) will determine the image source to use and the pattern that the nozzle driver should use to release the droplets. The controller may change these values over time, such as to vary the display between displaying images and displaying patterns of falling water to be of more interest to viewers. The control unit can also control external lighting so that when the display is being used to display a pattern of droplets and not display image data, lighting near the display can be increased to make the pattern of water droplets more noticeable.
Implementation
The computing device 500 may include one or more processor units 503, which may be configured according to well-known architectures, such as, e.g., single-core, dual-core, quad-core, multi-core, processor-coprocessor, cell processor, and the like. The computing device may also include one or more memory units 504 (e.g., random access memory (RAM), dynamic random access memory (DRAM), read-only memory (ROM), and the like).
The processor unit 503 may execute one or more programs, portions of which may be stored in the memory 504, and the processor 503 may be operatively coupled to the memory, e.g., by accessing the memory via a data bus 505. The programs may be configured to generate calibration data 521 and convert image frames 508 to a form that is displayable on the fluid display device 531. Additionally, the memory 504 may contain programs that implement encoding and decoding of image frames 510.
The calibration data and other synchronization information may also be stored as data 518 in the mass store 515. The processor unit 503 is further configured to execute one or more programs 517 stored in the mass store 515 or in memory, which cause the processor to carry out conversion of image frames to fluid display images.
The computing device 500 may also include well-known support circuits, such as input/output (I/O) circuits 507, power supplies (P/S) 511, a clock (CLK) 512, and a cache 513, which may communicate with other components of the system, e.g., via the bus 505. The computing device may include a network interface 514. The processor unit 503 and network interface 514 may be configured to implement a local area network (LAN) or personal area network (PAN), via a suitable network protocol, e.g., Bluetooth for a PAN. The computing device may optionally include a mass storage device 515 such as a disk drive, CD-ROM drive, tape drive, flash memory, or the like, and the mass storage device may store programs and/or data. The computing device may also include a user interface 516 to facilitate interaction between the system and a user. The user interface may include a monitor, television screen, speakers, headphones, or other devices that communicate information to the user.
The computing device 500 may include a network interface 514 to facilitate communication via an electronic communications network 520. The network interface 514 may be configured to implement wired or wireless communication over local area networks and wide area networks such as the Internet. The device 500 may send and receive data and/or requests for files via one or more message packets over the network 520. Message packets sent over the network 520 may temporarily be stored in a buffer 509 in memory 504.
While the above is a complete description of the preferred embodiment of the present disclosure, it is possible to use various alternatives, modifications and equivalents. It is to be understood that the above description is intended to be illustrative, and not restrictive. For example, while the flow diagrams in the figures show a particular order of operations performed by certain embodiments of the disclosure, it should be understood that such order is not required (e.g., alternative embodiments may perform the operations in a different order, combine certain operations, overlap certain operations, etc.). Furthermore, many other embodiments will be apparent to those of skill in the art upon reading and understanding the above description. Although the present disclosure has been described with reference to specific exemplary embodiments, it will be recognized that the disclosure is not limited to the embodiments described, but can be practiced with modification and alteration within the spirit and scope of the appended claims. The scope of the disclosure should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. Any feature described herein, whether preferred or not, may be combined with any other feature described herein, whether preferred or not. In the claims that follow, the indefinite article “A”, or “An” refers to a quantity of one or more of the item following the article, except where expressly stated otherwise. The appended claims are not to be interpreted as including means-plus-function limitations, unless such a limitation is explicitly recited in a given claim using the phrase “means for.”
This application claims the priority benefit of U.S. Provisional Patent Application No. 62/775,291, filed Dec. 4, 2018, the entire contents of which are incorporated herein by reference for all purposes.