This disclosure relates generally to holography and, in non-limiting embodiments, to holographic light curtains for detecting a disturbance or movement in a space.
A light curtain is an optical barrier that detects the presence or absence of objects within regions of 3D space. Light curtain systems are designed to detect the presence of objects within a user-defined 3D region of space, which has many applications across vision and robotics. However, the shapes of light curtains are limited to ruled surfaces, i.e., surfaces composed of straight lines.
For example, light curtains are used in elevators and garage doors in order to keep doors open when a person or object is in the doorway. Safety light curtains are also used in environments containing hazardous equipment (e.g., machine tools, robotic arms) to protect personnel from injury by automatically turning off dangerous machinery whenever a curtain is breached. These light curtains involve two components: emitters and receivers. The emitter is positioned to directly illuminate the receiver through a direct line of sight. These light curtains must be physically configured for their specific environments, which is a laborious process.
Some light curtains have been proposed that use triangulation. In these examples, a scene is illuminated with a laser line, and the response is measured with a camera. The intersection of the illumination and sensing planes produces a 3D line and, if an object touches this line, light from the source reflects off of the object and reaches the camera. Rapidly changing the position of the illumination and sensing planes (e.g., with mirror galvanometers) creates ruled surfaces, i.e., surfaces defined by unions of straight lines. Triangulation light curtains have been restricted to being ruled surfaces. Moreover, prior systems offer only one degree of freedom over the positions of the laser line and scan line, limiting triangulation light curtains to an even smaller subset of ruled surfaces.
According to non-limiting embodiments or aspects, provided is a system comprising: a holographic projector configured to project a holographic image; a rolling-shutter camera arranged to receive light from the holographic image; and at least one processor in communication with the rolling-shutter camera, the at least one processor programmed or configured to: determine an intensity of the light received from the holographic image; and detect a disturbance in a space of the holographic image based on a change in the intensity.
In non-limiting embodiments or aspects, the holographic projector comprises: a laser light source; a spatial light modulator arranged to receive light from the laser light source; a first lens arranged between the laser light source and the spatial light modulator; and a second lens arranged between the spatial light modulator and an image plane formed by a reflection of the spatial light modulator. In non-limiting embodiments or aspects, the system further comprises: an objective lens arranged between the image plane and a scene of the holographic image; and an optical aperture device arranged between the second lens and the objective lens, the optical aperture device configured to block a portion of the reflection of the spatial light modulator. In non-limiting embodiments or aspects, the at least one processor is further programmed or configured to generate the holographic image by: (a) generating a random pattern for display on the spatial light modulator; (b) propagating a wavefront based on the random pattern from a plane associated with the spatial light modulator to the image plane; (c) replacing an amplitude of the wavefront with a value derived from a target image to generate a new wavefront; (d) propagating the new wavefront from the image plane to the plane associated with the spatial light modulator; (e) binarizing the new wavefront; and (f) repeating steps (b)-(e) until substantially converging to the target image.
In non-limiting embodiments or aspects, the holographic image comprises a plurality of light curtains, and detecting the disturbance in the space of the holographic image comprises detecting separate disturbances in at least one light curtain of the plurality of light curtains. In non-limiting embodiments or aspects, the holographic image comprises a first light curtain and a second light curtain at least partially overlapping the first light curtain, and the first light curtain has a depth or thickness greater than a depth or thickness of the second light curtain. In non-limiting embodiments or aspects, detecting the disturbance in the space of the holographic image is based on an intensity of the first light curtain and an intensity of the second light curtain.
According to non-limiting embodiments or aspects, provided is a system comprising: a laser light source; a spatial light modulator arranged to receive light from the laser light source and reflect at least a portion of the light; an objective lens arranged to project a holographic image based on the at least a portion of the light reflected by the spatial light modulator; a rolling-shutter camera arranged to capture light from the holographic image; and at least one processor configured to detect movement in a space of the holographic image based on data received from the rolling-shutter camera.
In non-limiting embodiments or aspects, the system further comprises: a first lens arranged between the laser light source and the spatial light modulator; a second lens arranged between the spatial light modulator and the objective lens, wherein the objective lens is arranged between an image plane of the second lens and a scene of the holographic image; and an optical aperture device arranged between the second lens and the objective lens, the optical aperture device configured to block a portion of the reflection of the spatial light modulator. In non-limiting embodiments or aspects, the at least one processor is further programmed or configured to generate the holographic image by: (a) generating a random pattern for display on the spatial light modulator; (b) propagating a wavefront based on the random pattern from a plane associated with the spatial light modulator to the image plane; (c) replacing an amplitude of the wavefront with a square root of a target image to generate a new wavefront; (d) propagating the new wavefront from the image plane to the plane associated with the spatial light modulator; (e) binarizing the new wavefront; and (f) repeating steps (b)-(e) until substantially converging to the target image.
In non-limiting embodiments or aspects, the holographic image comprises a plurality of light curtains, and detecting the movement in the space of the holographic image comprises detecting separate disturbances in at least a subset of light curtains of the plurality of light curtains. In non-limiting embodiments or aspects, the holographic image comprises a first light curtain and a second light curtain, the first light curtain and the second light curtain overlapping at least partially, and the first light curtain has a depth or thickness greater than a depth or thickness of the second light curtain. In non-limiting embodiments or aspects, detecting the movement in the space of the holographic image is based on an intensity of the first light curtain and an intensity of the second light curtain.
According to non-limiting embodiments or aspects, provided is a method comprising: generating, with at least one processor, a holographic image; projecting the holographic image to a scene with a holographic projector; capturing at least a portion of the scene with a rolling-shutter camera; and detecting, with at least one processor, a disturbance in a space of the holographic image based on at least one frame received from the rolling-shutter camera.
In non-limiting embodiments or aspects, generating the holographic image comprises: (a) generating a random pattern for controlling a micromirror device; (b) propagating a wavefront based on the random pattern from a plane associated with the micromirror device to an image plane of the holographic image; (c) replacing an amplitude of the wavefront with a square root of a target image to generate a new wavefront; (d) propagating the new wavefront from the image plane to the plane associated with the micromirror device; (e) binarizing the new wavefront; and (f) repeating steps (b)-(e) until substantially converging to the target image.
In non-limiting embodiments or aspects, the holographic image comprises a plurality of light curtains, and detecting the disturbance in the space of the holographic image comprises detecting separate disturbances in at least a subset of light curtains of the plurality of light curtains. In non-limiting embodiments or aspects, the holographic image comprises a first light curtain and a second light curtain, the first light curtain and the second light curtain overlapping at least partially, and the first light curtain has a depth or thickness greater than a depth or thickness of the second light curtain. In non-limiting embodiments or aspects, detecting the disturbance in the space of the holographic image is based on an intensity of the first light curtain and an intensity of the second light curtain. In non-limiting embodiments or aspects, the method further comprises: blocking, with an optical aperture device, at least a portion of a reflection of a micromirror device of the holographic projector.
Other non-limiting embodiments or aspects will be set forth in the following numbered clauses:
Additional advantages and details are explained in greater detail below with reference to the exemplary embodiments that are illustrated in the accompanying schematic figures, in which:
For purposes of the description hereinafter, the terms “end,” “upper,” “lower,” “right,” “left,” “vertical,” “horizontal,” “top,” “bottom,” “lateral,” “longitudinal,” and derivatives thereof shall relate to the disclosure as it is oriented in the drawing figures. However, it is to be understood that the subject matter of the disclosure may assume various alternative variations and step sequences, except where expressly specified to the contrary. It is also to be understood that the specific devices and processes illustrated in the attached drawings, and described in the following specification, are simply exemplary embodiments or aspects of the disclosed subject matter. Hence, specific dimensions and other physical characteristics related to the embodiments or aspects disclosed herein are not to be considered as limiting.
No aspect, component, element, structure, act, step, function, instruction, and/or the like used herein should be construed as critical or essential unless explicitly described as such. Also, as used herein, the articles “a” and “an” are intended to include one or more items and may be used interchangeably with “one or more” and “at least one.” Where only one item is intended, the term “one” or similar language is used. Also, as used herein, the terms “has,” “have,” “having,” or the like are intended to be open-ended terms. Further, the phrase “based on” is intended to mean “based at least partially on” unless explicitly stated otherwise.
As used herein, the terms “communication” and “communicate” refer to the receipt or transfer of one or more signals, messages, commands, or other type of data. For one unit (e.g., any device, system, or component thereof) to be in communication with another unit means that the one unit is able to directly or indirectly receive data from and/or transmit data to the other unit. This may refer to a direct or indirect connection that is wired and/or wireless in nature. Additionally, two units may be in communication with each other even though the data transmitted may be modified, processed, relayed, and/or routed between the first and second unit. For example, a first unit may be in communication with a second unit even though the first unit passively receives data and does not actively transmit data to the second unit. As another example, a first unit may be in communication with a second unit if an intermediary unit processes data from one unit and transmits processed data to the second unit. It will be appreciated that numerous other arrangements are possible.
As used herein, the term “computing device” may refer to one or more electronic devices configured to process data. A computing device may, in some examples, include the necessary components to receive, process, and output data, such as a processor, a display, a memory, an input device, a network interface, and/or the like. A computing device may be a processor, such as a CPU or GPU, a mobile device, and/or other like devices. A computing device may also be a desktop computer or other form of non-mobile computer. Reference to “a processor,” as used herein, may refer to a previously-recited processor that is recited as performing a previous step or function, a different processor, and/or a combination of processors. For example, as used in the specification and the claims, a first processor that is recited as performing a first step or function may refer to the same or different processor recited as performing a second step or function.
As used herein, the term “light curtain” refers to a sensing arrangement in which light is transmitted and the interruption and/or disruption of the transmitted light is used to detect movement. A light curtain may be used for safety in industrial applications, such that the detection of movement automatically turns off dangerous machinery or takes other corrective action. A light curtain may also be used for any other application or scenario in which movement is detected.
Non-limiting embodiments described herein provide for a new and innovative system and method for implementing one or more light curtains to detect a disturbance in a space using one or more holographic images.
With continued reference to
Still referring to
Referring now to
With continued reference to
In the example shown in
The spatial light modulator 204 may be positioned at the Fourier plane (e.g., at a front focal plane of lens 208) to form an interference pattern at the image plane 209 (e.g., at a back focal plane of lens 208). The image formation model for the wavefront U(s, t) at the image plane 209 can be expressed as follows:

U(s, t)=F{u(x, y)·a(x, y)}  (1)
In the above Equation (1), u(x, y) is the pattern displayed on the spatial light modulator, a(x, y) represents optical aberrations associated with imperfections of the spatial light modulator, and F{·} is the Fourier transform operator that models wavefront propagation from the Fourier plane to the image plane. The intensity of a wavefront, which is the signal measured by a camera, is given by its squared magnitude (e.g., |U(s, t)|²).
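The image formation model above can be sketched numerically as follows. This is a minimal illustration, assuming the discrete Fourier transform as the propagation operator and a random binary pattern with no aberrations (a(x, y)=1 everywhere); the function name is illustrative.

```python
import numpy as np

def image_plane_intensity(u, a):
    """Simulate Equation (1): modulate the pattern u(x, y) by the aberration
    phase pattern a(x, y), propagate to the image plane via the Fourier
    transform, and return the intensity |U(s, t)|^2 measured by the camera."""
    U = np.fft.fft2(u * a)   # Fourier transform models propagation
    return np.abs(U) ** 2    # camera measures the squared magnitude

# Example: a random binary SLM pattern with no aberrations.
rng = np.random.default_rng(0)
u = rng.integers(0, 2, size=(64, 64)).astype(float)
a = np.ones((64, 64), dtype=complex)
I = image_plane_intensity(u, a)
```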
A holographic projector reallocates light from dark regions of the image toward bright regions. The intensity of a projected line is inversely proportional to its thickness. In some non-limiting embodiments, a fast phase-only spatial light modulator may be used instead of a digital micromirror device to avoid wasting the light lost to micromirror pixels that are turned off and to the blocked portion of the reflection.
In non-limiting embodiments, an improved, modified Gerchberg-Saxton (GS) algorithm is used to compute a pattern u(x, y) that produces a target image I(s, t)=|U(s, t)|². The modified algorithm alternates between enforcing a constraint on the hologram's intensity at the image plane 209 and enforcing the binary constraint on the pattern at the Fourier plane (e.g., at the spatial light modulator 204). After initializing the pattern u(x, y) with random binary values, the algorithm iteratively performs four operations to compute the hologram. It will be appreciated that fewer, different, and/or more operations may be performed to compute the hologram. First, Equation (1) is used to simulate the propagation of the wavefront from the Fourier plane to the image plane 209. This involves performing an element-wise multiplication with a pre-computed phase pattern a(x, y) and computing the Fourier transform of the result, producing a conjugate-symmetric wavefront U(s, t). Next, the phase of this wavefront U(s, t) is maintained, but its amplitude is replaced with the square root of the target intensity image. Next, the propagation operator is inverted by computing an inverse Fourier transform and performing an element-wise multiplication with the complex conjugate of the phase pattern a(x, y). As a final step, the result may be binarized by setting all values with positive real components to 1 and setting all other values to 0. These steps are repeated until the resulting image substantially converges to the target image.
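The iterative procedure described above can be sketched as follows. This is a simplified implementation, assuming a square pattern, a precomputed complex phase pattern a, and a fixed iteration count in place of an explicit convergence test; the function name is illustrative.

```python
import numpy as np

def modified_gs(target, a, iters=50, seed=0):
    """Compute a binary SLM pattern whose hologram approximates the target
    intensity image using a modified Gerchberg-Saxton iteration."""
    rng = np.random.default_rng(seed)
    u = rng.integers(0, 2, size=target.shape).astype(float)  # (a) random binary init
    amplitude = np.sqrt(target)                              # target amplitude
    for _ in range(iters):
        U = np.fft.fft2(u * a)                       # (b) propagate to image plane, Eq. (1)
        U = amplitude * np.exp(1j * np.angle(U))     # (c) keep phase, replace amplitude
        u = np.fft.ifft2(U) * np.conj(a)             # (d) propagate back to Fourier plane
        u = (u.real > 0).astype(float)               # (e) binarize the result
    return u
```

Because the Fourier transform of a real binary pattern is conjugate-symmetric, reconstruction quality in this simplified sketch depends on the target image.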
In non-limiting embodiments, the spatial light modulator 204 may be calibrated. The light reflecting off of the spatial light modulator 204 may be affected by aberrations. The most severe aberrations may be attributed to the non-planarity of the surface of the spatial light modulator 204 (e.g., digital micromirror device), which may be characterized by a spatially-varying phase pattern a(x, y). Ignoring these aberrations results in blurry holographic images. In non-limiting embodiments, the phase aberration image a(x, y) may be pre-computed and the modified GS algorithm is applied to produce sharper holographic images. To calibrate for this distortion, in non-limiting embodiments, a calibration is performed that does not rely on interfering pairs of digital micromirror device blocks like other methods. First, a block of pixels is displayed on the digital micromirror device, where pixels within this block are randomly turned on or off. This forms a random interference pattern, which is imaged with a sensor (e.g., rolling-shutter camera). Next, this block is slid to different digital micromirror device regions and the corresponding interference patterns are recorded (e.g., stored in memory). If the phase pattern varies linearly across a block, this shifts the observed interference pattern. The shifts may be measured by performing zero-normalized cross-correlation between every measurement and a reference interference pattern, and the corresponding gradients on the phase pattern may then be computed. After discretely sampling the gradients at a number of positions on the digital micromirror device, the result is interpolated to densely represent the gradient of the phase across the entire digital micromirror device. Finally, a large linear system is solved to compute the spatially-varying phase values from these phase gradients, which is based on solving a Poisson equation. 
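The final integration step described above — recovering the spatially-varying phase from its sampled gradients by solving a Poisson equation — can be sketched with a standard Fourier-domain least-squares solver. This is a simplified version assuming periodic boundary conditions; the gradient measurement and interpolation steps are omitted, and the function name is illustrative.

```python
import numpy as np

def integrate_gradients(gx, gy):
    """Recover a phase map (up to an additive constant) from its x- and
    y-gradients by solving the Poisson equation in the Fourier domain."""
    h, w = gx.shape
    fx = np.fft.fftfreq(w)  # spatial frequencies along x (axis 1)
    fy = np.fft.fftfreq(h)  # spatial frequencies along y (axis 0)
    wx, wy = np.meshgrid(2j * np.pi * fx, 2j * np.pi * fy)
    denom = wx * np.conj(wx) + wy * np.conj(wy)
    denom[0, 0] = 1.0       # avoid division by zero at the DC term
    num = np.conj(wx) * np.fft.fft2(gx) + np.conj(wy) * np.fft.fft2(gy)
    phi = np.fft.ifft2(num / denom).real
    return phi - phi.mean() # the constant offset is unobservable
```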
This calibration method is less computationally complex than prior methods that involve dividing the micromirror device into blocks of pixels and turning on pairs of blocks at the same time to produce interference patterns.
Referring now to
With continued reference to
The image data from the camera is analyzed and, at step 306, a computing device determines an intensity of the light captured with the camera. At step 308, the computing device determines if the intensity determined at step 306 changes (e.g., decreases), which indicates a disturbance in the space of the holographic image (e.g., an object or entity intersecting with the hologram). The change in intensity at step 308 may be based on satisfying one or more thresholds, such that a decrease in intensity that is equal to and/or greater than a threshold amount triggers a change in intensity at step 308. In response to a change in intensity at step 308, a disturbance event may be detected at step 310. A disturbance event may result in an automatic notification, alert, and/or performance of any other action (e.g., stopping a machine, locking or unlocking a mechanism, and/or the like).
In non-limiting embodiments, a disturbance map may be extracted from the captured images by first imaging a light curtain while the scene is undisturbed and then subtracting the light curtain image captured after the disturbance. This results in a difference image over a specific geometry of interest.
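The disturbance-map extraction and the thresholded intensity-change detection described above reduce to a background subtraction, which can be sketched as follows. The threshold and pixel-count values are illustrative, not taken from the disclosure.

```python
import numpy as np

def disturbance_map(reference, current, threshold=0.1):
    """Subtract the current light-curtain image from a reference image of the
    undisturbed scene; flag pixels whose intensity dropped by more than the
    threshold as disturbed."""
    diff = reference.astype(float) - current.astype(float)
    return diff > threshold

def disturbance_detected(reference, current, threshold=0.1, min_pixels=1):
    """Declare a disturbance event when enough pixels are flagged."""
    return disturbance_map(reference, current, threshold).sum() >= min_pixels
```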
Referring now to
Non-limiting embodiments allow for 3D light curtains that provide advantages over 2D horizontal or vertical curtains. Non-limiting embodiments may be used in various implementations and for various purposes. For example, the system can be mounted in an assembly line to inspect whether objects passing through have defects (e.g., by the defects causing a disturbance outside of the expected shape and/or size). In other examples, a 3D touch interface may be implemented using non-limiting embodiments of the holographic light curtains described herein. For example, a light curtain may be formed above (e.g., 2 cm or the like) a desired surface. When a person's finger interacts with this surface, the light curtain detects its location. Non-limiting embodiments may therefore turn any arbitrary geometry into a virtual touch interface. In augmented reality, detecting where a user interacts with a scene may be used as a new input for art or entertainment applications. For example, any real-life object may be turned into a virtual drawing surface.
Referring now to
In non-limiting embodiments, the holographic projector may be light redistributive, which can be verified by projecting lines of different thicknesses onto a flat diffused white surface, and measuring the average brightness of each line using an exposure stack to form a high dynamic range (HDR) image. The pattern brightens as the illuminated area decreases. The optimal light redistribution curve may be plotted by taking the sum of the brightness values of the thinnest line, and normalizing that value by the range of areas.
In non-limiting embodiments, multiple light curtains may be generated and monitored simultaneously. This may be done simultaneously within a single rolling-shutter frame, as an example. In some examples, a first curtain may have a first thickness, and a second curtain may have a larger thickness that overlaps with the first curtain. Thus, if the thick curtain experiences a disturbance while the thinner curtain does not, the disturbance can be determined to be a relatively smaller event. In some examples, multiple light curtains may form layers that can detect an extent of a disturbance based on how many layers experience a disturbance.
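The layered-curtain logic above can be sketched as follows. This assumes (as an illustration, not per the disclosure) that the layers are ordered from the thickest, outermost curtain inward, and that a per-layer disturbance flag has already been computed.

```python
def disturbance_extent(layer_disturbed):
    """Given per-layer disturbance flags ordered from the thickest (outermost)
    curtain to the thinnest (innermost), report how deeply a disturbance
    penetrates: the count of consecutive disturbed layers from the outside."""
    extent = 0
    for disturbed in layer_disturbed:
        if not disturbed:
            break  # inner layers beyond this point were not reached
        extent += 1
    return extent
```

For example, a disturbance that breaches only the thick outer curtain but not the thin inner one is classified as a relatively small event.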
In non-limiting embodiments, scalar diffraction theory, including Kirchhoff diffraction, can be used to propagate a wavefront from the image plane to any point x in the scene according to the following equation:
In non-limiting embodiments, a generalized propagation operator could be used in conjunction with the modified GS algorithm to generate a hologram at a different plane, without needing to physically adjust the objective lens. The hologram may also be optimized for a 2D manifold R in 3D space, e.g., the region of space imaged by a row of camera pixels. If R is planar, this can be done using techniques based on rotating the angular spectrum. However, these propagation operators can be computationally expensive, especially in the context of an iterative GS algorithm which requires evaluating the propagation operator multiple times. Therefore, in non-limiting embodiments a Fourier-based image formation model may be used.
In non-limiting embodiments, to compute the intrinsic and extrinsic parameters of the projector-camera system, a modified projector-camera calibration procedure may be performed in which, in place of a checkerboard pattern, an inverted circleboard pattern (e.g., white circles on a black background) printed onto a planar calibration target is used. Further, the holographic projector may be used to illuminate the calibration target with a sequence of Gray code patterns. Decoding the corresponding Gray code images produces a dense set of correspondences between camera and projector pixels. To minimize the effect of speckle artifacts in the projection patterns, four binary holograms of the same Gray code pattern may be computed, using different initializations of a GS algorithm. The corresponding images may then be captured and averaged to obtain despeckled measurements.
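Gray code structured-light patterns such as those used in the correspondence step above can be generated with the standard binary-reflected construction. This is a minimal sketch; the pattern geometry (vertical stripes encoding column indices) is an illustrative assumption.

```python
import numpy as np

def gray_code(n):
    """Binary-reflected Gray code of the integer n."""
    return n ^ (n >> 1)

def gray_code_patterns(width, n_bits):
    """Return n_bits binary stripe patterns of the given width; decoding a
    pixel's bit sequence across the patterns recovers its column index."""
    codes = gray_code(np.arange(width))
    # Emit one pattern per bit, most significant bit first.
    return [((codes >> bit) & 1).astype(np.uint8) for bit in reversed(range(n_bits))]
```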
In non-limiting embodiments, accurate time synchronization between the digital micromirror device and rolling-shutter camera is performed. For example, a hardware trigger may start the exposure of the camera when the digital micromirror device displays the first pattern of a sequence. An appropriate pixel clock is then determined for the camera, and an exposure time for the digital micromirror device patterns is also determined. A light emitting diode (LED) blinking at a known, fixed rate may be used to calibrate the inter-row delay of the rolling-shutter camera for different pixel clock values. The camera may be positioned in front of the digital micromirror device, and the digital micromirror device may be illuminated with a bright point light source. The light reflects toward the camera only if the digital micromirror device pixels are turned “on”. The digital micromirror device timings may be calibrated by displaying a sequence of patterns where all the pixels are turned “off”, except for two patterns at known indices where all the pixels are turned “on”. By examining where the bright rows occur in the rolling-shutter capture, the real pattern exposure time of the digital micromirror device and the delay between the start of the digital micromirror device pattern sequence and the start of the rolling-shutter frame for some desired digital micromirror device pattern exposure time can be determined.
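The timing-recovery step described above can be sketched as follows, under a deliberately simplified model (an assumption, not the disclosure's exact procedure): each "on" pattern brightens the camera row being read out at that moment, so the two bright rows give two linear equations in the two unknowns, the per-pattern exposure time and the sequence start delay.

```python
def calibrate_dmd_timing(bright_rows, bright_pattern_indices, inter_row_delay):
    """Solve for the effective per-pattern exposure time and the delay between
    the start of the DMD pattern sequence and the start of the rolling-shutter
    frame, given the rows where the two 'on' patterns appear, their known
    indices in the sequence, and the camera's calibrated inter-row delay.
    Model: row r is read at r * inter_row_delay after the frame start, and
    pattern k is displayed at k * exposure after the sequence start."""
    (r1, r2), (k1, k2) = bright_rows, bright_pattern_indices
    exposure = (r2 - r1) * inter_row_delay / (k2 - k1)
    start_delay = k1 * exposure - r1 * inter_row_delay
    return exposure, start_delay
```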
In non-limiting embodiments, phase-distortion calibration may be performed if the camera does not exactly image the Fourier transform of the wavefront at the digital micromirror device (e.g., if the sensor just images a subset of the resulting interference pattern or if the interference pattern may be rotated or warped). To account for these issues, a series of Gray codes may be projected to determine correspondences between pixels in the simulated interference pattern and sensor pixels in the real interference pattern. Using these correspondences, a homography may be calculated to warp the captured interference patterns to match the simulated ones. The Gray code measurements may be despeckled by averaging the results of multiple GS algorithm instantiations.
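Given the pixel correspondences recovered from the Gray codes, the homography can be estimated with the standard direct linear transform (DLT). This is a minimal sketch assuming noiseless point correspondences; the function names are illustrative.

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate the 3x3 homography mapping src points to dst points by
    solving the DLT system via SVD (minimizes algebraic error)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)     # null-space vector is the stacked homography
    return H / H[2, 2]           # fix the scale ambiguity

def apply_homography(H, pts):
    """Map 2D points through H using homogeneous coordinates."""
    pts = np.asarray(pts, dtype=float)
    homo = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = homo @ H.T
    return mapped[:, :2] / mapped[:, 2:]
```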
Referring now to
With continued reference to
Device 900 may perform one or more processes described herein. Device 900 may perform these processes based on processor 904 executing software instructions stored by a computer-readable medium, such as memory 906 and/or storage component 908. A computer-readable medium may include any non-transitory memory device. A memory device includes memory space located inside of a single physical storage device or memory space spread across multiple physical storage devices. Software instructions may be read into memory 906 and/or storage component 908 from another computer-readable medium or from another device via communication interface 914. When executed, software instructions stored in memory 906 and/or storage component 908 may cause processor 904 to perform one or more processes described herein. Additionally, or alternatively, hardwired circuitry may be used in place of or in combination with software instructions to perform one or more processes described herein. Thus, embodiments described herein are not limited to any specific combination of hardware circuitry and software. The term “programmed or configured,” as used herein, refers to an arrangement of software, hardware circuitry, or any combination thereof on one or more devices.
Although embodiments have been described in detail for the purpose of illustration, it is to be understood that such detail is solely for that purpose and that the disclosure is not limited to the disclosed embodiments, but, on the contrary, is intended to cover modifications and equivalent arrangements that are within the spirit and scope of the appended claims. For example, it is to be understood that the present disclosure contemplates that, to the extent possible, one or more features of any embodiment can be combined with one or more features of any other embodiment.
This application claims the benefit of U.S. Provisional Patent Application No. 63/324,163, filed on Mar. 28, 2022, the disclosure of which is hereby incorporated by reference in its entirety.
This invention was made with Government support under Grant No. 1900821 awarded by the National Science Foundation. The Government has certain rights in the invention.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US2023/023679 | 5/26/2023 | WO |

Number | Date | Country
---|---|---
63/324,163 | Mar. 28, 2022 | US