This patent application relates to optical object detection, such as proximity sensing.
Optical object detection involving a light emitter and a detector sensitive to reflected light from an object for sensing or detecting the presence of the object is well known in the art. The choice of optical wavelength can vary according to the needs, for example visible or invisible. The choice of beam shape, such as a collimated beam that detects passage across a line, or a fan-shaped or conical beam that projects light over a given area, also varies according to the detection needs. Such active source optical object detection is commonly used in a variety of proximity sensors and for applications such as object identification, distance estimation, and 3D mapping.
Applicant has discovered that a liquid crystal dynamic beam shaping device can be coupled to a light source for enhancing optical object detection.
In some embodiments, variable liquid crystal beam shaping is used to variably broaden a narrow beam. When narrow, the beam can detect both closer and farther objects that are within the scope of the narrow beam, and when broader, the beam can detect closer objects within a wider area. Dynamically varying the beam may be used to determine information about the object's relative size and position.
A first broad aspect is a detection device including a light source for projecting a beam of light within a target region having an initial beam divergence; a light detector for receiving light reflected from an object within the target region and providing a detection signal; a liquid crystal beam shaping element having an input control signal defining a modulation of the light source beam to provide a controllable greater divergence in the beam of light within the target region; and a controller responsive to the detection signal for adjusting the input control signal to detect and analyze objects and positions of the objects with an improved signal-to-noise ratio.
In some embodiments, the liquid crystal beam shaping element includes a liquid crystal layer disposed between two substrates and two or more independent electrodes disposed on one of the substrates.
In some embodiments, the electrodes are configured to provide a spatially variable electric field.
In some embodiments, the initial beam divergence is less than 10 degrees.
In some embodiments, the beam divergence is between 2 degrees and 50 degrees.
In some embodiments, the detection device is configured to detect a distance of the object from the detection device.
In some embodiments, the light source is configured to project the beam of light in a series of pulses.
In some embodiments, the light detector is configured to use a bandpass filter to distinguish the detection signal from a noise signal.
In some embodiments, the liquid crystal beam shaping element is configured to perform at least one of the following: symmetric broadening, asymmetric light stretching, diffraction and beam steering.
In some embodiments, an automatic controller for an appliance includes one or more detection devices.
Another broad aspect is a method of performing proximity detection using a proximity sensor, the method including projecting a beam of light within a target region, the beam having an initial beam divergence and angle of projection; receiving light reflected from an object within the target region; determining a first signal-to-noise ratio; dynamically manipulating the beam of light; determining a second signal-to-noise ratio; and determining a distance of the object from the proximity sensor.
In some embodiments, manipulating the beam of light includes broadening the beam divergence of the beam of light.
In some embodiments, manipulating the beam of light includes changing the angle of projection of the beam of light.
In some embodiments, manipulating the beam of light includes using a liquid crystal element.
In some embodiments, the method of performing proximity detection using a proximity sensor further includes eliminating noise from the detected signals.
Another broad aspect is a method of determining a location and/or a shape of an object, the method including projecting a light signal with narrow spectral band; diffracting the light signal using a liquid crystal device to produce a grid pattern; acquiring an image of the grid pattern reflected by the object; and determining the location and/or a shape of the object from the grid pattern reflected by the object.
In some embodiments, the grid pattern extends in a single direction.
In some embodiments, the grid pattern extends in two directions in a single plane.
In some embodiments, the method of determining a location and/or a shape of an object further includes configuring the liquid crystal device to control a pitch of the grid pattern.
In some embodiments, controlling a pitch of the grid pattern includes producing a first grid pattern with a first pitch and a second grid pattern with a second pitch.
In some embodiments, controlling a pitch of the grid pattern includes dynamically varying the pitch.
In some embodiments, the light signal includes pulses at a given frequency.
In some embodiments, the method of determining a location and/or a shape of an object further includes controlling the liquid crystal device based on the acquired image of the reflected light.
The invention will be better understood by way of the following detailed description of embodiments of the invention with reference to the appended drawings, in which:
The current patent application aims at improving object detection and recording efficiency by providing a dynamically variable lighting system.
As is known in the art, electrodes for a transmissive liquid crystal device can be transparent, for example a coating of indium tin oxide (ITO) material. The approximate voltage, as shown schematically in the inset at the top of the figure, ramps up from a minimum value at one side of zone 12a and begins the same ramp again on the other side of the zone boundary in zone 12b. The drive frequency can be the same for all of the electrodes 14a, and the liquid crystal molecules 4 orient themselves to be parallel to the electric field 3.
Further details of the devices illustrated in
According to the present disclosure, the devices and methods described above may be used to perform proximity detection, object location and optimized recording or monitoring. Proximity detection may comprise detecting whether an object is located proximate a sensor, and object location may comprise detecting the approximate position or the distance at which an object is disposed from a sensor. Proximity detection and object location according to the present disclosure may be used in applications such as automobile sensors for intelligent control systems, automatic lights, faucets, and other appliances. The present disclosure may allow for optimized or more precise control in these and other applications by improving proximity detection and allowing object location to be performed using a proximity sensor.
A variety of types of proximity detection may be incorporated as part of the systems and methods disclosed herein. First, two types of light sources are proposed: spectrally broadband light sources (e.g., an LED with phosphor) and narrow band light sources (e.g., an infrared LED or a diode laser). Second, three approaches to the optimization are proposed: simple symmetric broadening, asymmetric light stretching, and beam steering elements. Any light source may be used with any optimization approach. One skilled in the art will further recognize that the methods and systems disclosed herein are compatible with other types of proximity detection known in the art. Accordingly, methods and systems described herein may use any type of proximity detection without departing from the scope of the disclosure.
The present disclosure relates to methods and systems using a proximity sensor to detect and locate objects. Prior art may disclose detecting objects using proximity sensors. The present disclosure may present advantages over the prior art by providing for improved object detection by dynamic optimization of lighting (illumination) conditions and by further providing for object location using proximity sensors.
The present disclosure relates to two general categories of systems and methods. The first category is directed towards systems and methods which use a broadened or stretched beam to detect objects, locate objects, and/or determine information about objects. The second category is directed towards systems and methods which use a diffracted beam to detect objects, locate objects, and/or determine information about objects (e.g., 3D forms). In general,
Systems and Methods Using a Broadened Beam
The emitter/receiver 301 may capture a received signal 302. The received signal may comprise a background light signal from the environment in which the emitter/receiver 301 is located. Alternatively, a spectral filter can be used to detect only (or mainly) the signal emitted by the emitter/receiver 301. If an object is located in the path of the emitted signal, the received signal 302 may also comprise light emitted by the emitter/receiver 301 that is reflected from the object.
The received signal 302 may be analyzed through object discrimination 303 and/or signal-to-noise-ratio (SNR) measurement 304. Object discrimination 303 may entail determining whether an object is located in the path of the emitted signal. Performing object discrimination 303 may comprise determining whether the received signal 302 includes a reflected signal as described above. A reflected signal may have the same pulse pattern as the emitted signal and this pulse pattern may be used to recognize the reflected signal. If the received signal 302 includes a reflected signal, an object may be present; if the received signal 302 does not include a reflected signal, an object may not be present.
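The pulse-pattern discrimination just described might be sketched as follows. The sliding normalized cross-correlation approach, the function name, and the 0.6 threshold are illustrative assumptions, not taken from the application:

```python
import numpy as np

def contains_reflection(received, pattern, threshold=0.6):
    """Return True if the emitted pulse `pattern` appears in `received`.

    Slides a normalized cross-correlation window over the received samples;
    a correlation peak above `threshold` suggests the emitted pulse pattern
    was reflected back by an object. Threshold and names are assumptions.
    """
    r = np.asarray(received, dtype=float)
    p = np.asarray(pattern, dtype=float)
    r = r - r.mean()  # remove the DC background light level
    p = p - p.mean()
    n = len(p)
    best = 0.0
    for i in range(len(r) - n + 1):
        w = r[i:i + n]
        denom = np.linalg.norm(w) * np.linalg.norm(p)
        if denom > 0:
            best = max(best, float(np.dot(w, p) / denom))
    return best >= threshold
```

In practice the threshold would be tuned to the pulse shape and the noise floor of the receiver.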
SNR measurement 304 may entail determining the ratio of the reflected signal (signal) to the total received signal, i.e., the reflected signal plus the background light signal (noise). This ratio may be defined as

SNR = S / (S + N),

where S is the power of the reflected signal and N is the power of the background noise, and may be called the SNR. If the received signal 302 is entirely noise, the SNR may be zero. If the received signal 302 is entirely reflected signal, the SNR may be one, or close to one. A high SNR may indicate that the object is occupying a large percentage of the area of a cross-section of the cone of the emitted signal. A low SNR may indicate that the object is occupying a small percentage of the area of a cross-section of the cone of the emitted signal. The reflectivity of the object's surface at the emitter's wavelength must also be considered; otherwise, relative measurements (e.g., change over time) can be performed. Object discrimination 303 and/or SNR measurement 304 may be performed by the emitter/receiver 301, by a separate controller (not illustrated), or by another piece of hardware. In some embodiments, calculations other than SNR measurements 304 may be performed. One skilled in the art will recognize that such measurements may be readily used in place of SNR measurements 304.
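Under the behavior described above (zero for pure noise, approaching one for pure reflected signal), a minimal sketch of the SNR computation, with assumed function and parameter names:

```python
def snr(received_power, noise_power):
    """Fraction of the received power attributable to the reflected signal.

    received_power: total power captured by the receiver (signal + noise).
    noise_power: background power, e.g., measured while the emitter is off.
    Returns a value in [0, 1]: 0 -> all noise, ~1 -> all reflected signal.
    """
    if received_power <= 0.0:
        return 0.0
    reflected = max(received_power - noise_power, 0.0)
    return reflected / received_power
```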
The SNR may provide a first step in determining the object's approximate location 309 and object size. As discussed above, the SNR provides an estimation of the percentage of a cross-section of the emitted signal which an object occupies. One skilled in the art will recognize that the area of a cross-section of the signal increases with distance from the emitter/receiver 301 because of the divergence of the emitted signal. Therefore, an object which produces a large SNR may be a relatively small object located relatively close to the emitter/receiver 301 or a relatively large object located relatively far from the emitter/receiver 301. Accordingly, this first SNR merely provides a first data point for object location 309 and does not enable an object to be located precisely.
To complete object location 309, a beam broadening controller 305 and a beam intensity controller 306 may be operated based on the SNR measurement 304. The beam broadening controller 305 may produce a dynamic (or transient) beam broadening signal 307. The beam broadening signal 307 may be used to broaden the cone of the signal emitted by the emitter/receiver 301 through any beam broadening hardware and methods known in the art. This broadened emitted signal cone is represented by the broadened beam 311. In some embodiments, the beam broadening may be performed by hardware which is separate from the emitter/receiver 301. In some embodiments, the beam broadening controller 305 may broaden the emitted signal to a cone of about 50 degrees, or to any angle between 2 degrees and 50 degrees.
The beam intensity controller 306 may produce a beam intensity signal 308, which may increase or decrease (dimming) the intensity of the emitted signal through any hardware and methods known in the art. In some embodiments, the beam intensity may be changed by the emitter/receiver 301.
Dynamic beam broadening 312 may enable optimized object location 309 to be performed for different positions of objects: e.g., on axis close targets 313, off-axis far targets 314, and on-axis far targets 315. Methods for determining the location of these objects using the narrow beam 310 and the broadened beam 311 will be described below.
The distance of the object may vary from a few centimeters to many meters. In the majority of such cases, a stationary illumination system will not be optimal: for close objects, the detection (or recording) device will be back-illuminated with high intensity by the light reflected from the object and may saturate, while for far objects the back-scattered signal may be too weak. The same problem can arise in many applications (lidar detection, video or picture recording, security cameras, etc.).
As illustrated in
These steps may be performed iteratively as detailed in
In step 502, the beam may be broadened using the methods and hardware described above or using any other methods and hardware known in the art. The beam may be broadened by a small increment, for example by one degree. In this step, the object may reflect some, all, or none of the beam back to the receiver and a second SNR may be calculated based on the received signal.
In step 503, the first SNR may be compared to the second SNR to determine which is greater. As described above, a higher SNR may indicate that the reflected signal makes up a greater part of the received signal and the object occupies a larger portion of a cross-section of the emitted beam, while a lower SNR may indicate that the reflected signal makes up a smaller part of the received signal and the object occupies a smaller portion of a cross-section of the emitted beam. The measurement is mainly relative, since different objects may have different reflectivities at the wavelengths used.
If the second SNR is greater than the first SNR, the method may return to step 502. If the second SNR is smaller than the first SNR, the method may advance to step 504.
If the method returns to step 502, steps 502 and 503 may be repeated any number of times. In each increment, the currently calculated SNR will be compared to the immediately preceding SNR. For example, a sixth SNR may be compared to a fifth SNR. Whenever the current SNR is smaller than the immediately preceding SNR, the method may advance to step 504.
In step 504, the beam may be narrowed using the methods and hardware described above or using any other methods and hardware known in the art. Narrowing the beam may comprise broadening the original beam to a lesser degree than the degree to which the beam was broadened in the preceding step. The beam may be narrowed by a small increment, for example by one degree. In this step, the object may reflect some, all, or none of the beam back to the receiver and a new SNR may be calculated based on the received signal.
In step 505, the new SNR may be compared to the immediately preceding SNR to determine which is greater. If the new SNR is greater than the preceding SNR, the method may return to step 504. If the new SNR is smaller than the preceding SNR, the method may conclude.
If the method returns to step 504, steps 504 and 505 may be repeated any number of times. In each increment, the currently calculated SNR will be compared to the immediately preceding SNR. For example, a sixth SNR may be compared to a fifth SNR. Whenever the current SNR is smaller than the immediately preceding SNR, the method may conclude.
Conclusion of the method may indicate that the largest possible SNR for the given object and sensor has been identified. This may indicate the point at which the most accurate object detection or recording may be performed.
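The iterative broaden-then-narrow search of steps 501-505 can be sketched as a simple hill climb over the divergence angle. Here `measure_snr` stands in for one emit/receive/compute cycle at a given divergence, and all names, the 1-degree step, and the 2-50 degree range are illustrative:

```python
def optimize_divergence(measure_snr, start_deg=2.0, max_deg=50.0, step=1.0):
    """Hill-climb the beam divergence to the angle with the highest SNR.

    measure_snr(angle_deg) -> SNR measured with the beam broadened to
    angle_deg. Broaden in small increments while the SNR keeps rising
    (steps 502/503), then narrow while it keeps rising (steps 504/505).
    Returns (best_angle_deg, best_snr).
    """
    angle = start_deg
    best = measure_snr(angle)
    # Broadening phase: widen while the SNR improves.
    while angle + step <= max_deg:
        nxt = measure_snr(angle + step)
        if nxt <= best:
            break
        angle, best = angle + step, nxt
    # Narrowing phase: tighten while the SNR improves.
    while angle - step >= start_deg:
        nxt = measure_snr(angle - step)
        if nxt <= best:
            break
        angle, best = angle - step, nxt
    return angle, best
```

With a single object in the field, the SNR is roughly unimodal in the divergence angle, so this greedy search settles near the optimum; a real controller would also average repeated measurements to reject noise.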
Further detailed embodiments of the method and systems described above will now be outlined.
In a first embodiment, a liquid crystal beam shaper may be used to control and optimize (A) the width of the illuminated scene (analogous to a variable "field of view") as well as (B) the level of illumination. A simple feedback system (photodetector, camera image sensor, etc.) can be used to adjust the angular range of illumination, which at the same time also controls the level of illumination (an alternative to dimming for optimizing signal capture).
This can be done with either a white (broadband) or a narrow band illumination source.
In a second embodiment, a dynamic liquid crystal beam shaper may be used to perform a dynamic scan of the beam broadening angle, for example going from a large 55 degree beam (always measured at full width at half maximum, FWHM) to a narrow 2 degree beam, to help estimate the approximate size of an object as well as its distance from the optical axis of illumination. Some embodiments may use an accompanying dimming or intensity change.
In step 631, the original light source and any corresponding "primary optics" can provide a rather narrow beam to start with (e.g., 2 degrees). In step 632, one of the beam broadening components may be used to obtain a very large illumination angle at the beginning and to start detecting the back-scattered (and reflected) light. The beam broadening component may be calibrated in advance so that it is known how much light goes in which direction. Thus, the detected/recorded signal would represent objects in the entire illumination zone (field of view). As the beam is narrowed, the field of illumination exposes more and more objects positioned close to the optical axis and, to some extent, no longer detects objects which are off-axis. In step 633, the recorded signal may be analyzed, for example by determining an SNR. In step 634, it may be determined, based on the analysis, whether or not the signal is sufficient to perform the desired calculations. The desired calculations may comprise determining a size or location of the object. If the signal is sufficient, as indicated by step 635, the size of the object and its distance from the optical axis may be determined. If the signal is not sufficient, as indicated by step 636, the method may return to step 632 and cycle through again. This dynamic scanning gives information about the presence of objects in different solid angles around the same optical axis. The intensity of the reflected/back-scattered light will also depend on the distance of the object from the illumination/recording module. The dynamic scan of the illumination angle will thus provide information about the proximity of objects as well as their relative sizes and positions with respect to the optical axis (without the capability of identifying whether an object is to the left or to the right of the optical axis).
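The wide-to-narrow scan of steps 632-636 might be sketched as follows. The signal drop between two successive cone angles places an off-axis object between those solid angles; function names, the 5-degree step, and the drop threshold are assumptions:

```python
def scan_illumination(measure_signal, wide_deg=55.0, narrow_deg=2.0, step_deg=5.0):
    """Sweep the illumination cone from wide to narrow, recording the signal.

    measure_signal(angle_deg) -> back-scattered power at that divergence.
    Returns a list of (angle_deg, signal) pairs, widest first. The 55 and 2
    degree bounds follow the FWHM range given in the text; the step is assumed.
    """
    profile = []
    angle = wide_deg
    while angle >= narrow_deg:
        profile.append((angle, measure_signal(angle)))
        angle -= step_deg
    return profile

def object_solid_angles(profile, drop_threshold=0.1):
    """Angular bands in which narrowing the cone loses signal.

    A drop in signal between successive cone half-angles suggests an object
    between those two solid angles (left/right ambiguity remains).
    Returns (narrower_angle, wider_angle) pairs.
    """
    hits = []
    for (a_wide, s_wide), (a_narrow, s_narrow) in zip(profile, profile[1:]):
        if s_wide - s_narrow > drop_threshold:
            hits.append((a_narrow, a_wide))
    return hits
```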
Here also, adjustment of the illumination power (or intensity) may be used to optimize the detection or recording. For example, it may be necessary to increase the intensity of the light source if the object is too far (to have enough reflected light). However, it may also be necessary to reduce the light intensity when the object is too close (to avoid saturating the detector). A similar effect can be obtained by broadening the light divergence angle.
Systems and Methods Using a Diffracted Beam
As discussed above, the present disclosure also relates to object location systems and methods for locating and characterizing objects which use diffracted beams. Object location systems according to the present disclosure may include proximity sensors, image recording and liquid crystal devices. Methods of locating objects may make use of these systems and may have practical applications in a variety of fields.
In general, the use of beam shaping devices with periodic internal structures (providing periodic modulation of the optical refractive index) can generate diffraction patterns when a narrow band light source is used.
The proximity sensor 772 may be any type of proximity sensor known in the art. In general, the proximity sensor 772 may include a light source 773 and a receiver 774. The light source 773 may be configured to emit either broadband light (e.g., an LED with phosphor) or narrow band light (e.g., an infrared LED or diode laser). The appearance of the diffraction pattern can be improved by using a coherent light source. In some embodiments, the light source 773 may emit light in pulses at a certain frequency or pattern of frequencies. The receiver 774 may capture a received signal. The received signal may comprise background light from the environment in which the proximity sensor 772 is located. If an object is located in the path of the emitted signal, the received signal may also comprise light emitted by the light source 773 that is reflected from the object. The receiver 774 or a connected processor may differentiate the reflected signal from the background noise, for example, by identifying the frequency of pulses present in the reflected signal.
The receiver 774 or a connected processor may acquire an image of the reflected light. The light may reflect off the object in a grid pattern based on the pattern in which it has been diffracted. The grid will be deformed for objects having non-flat surfaces.
The object location system 770 may further include one or more liquid crystal devices 775.
In some embodiments, the object location system 770 may include more than one group of patterned electrodes within each liquid crystal layer (or cell), or a combination of several liquid crystal devices 775. Each of the devices 775 may produce a different diffraction pattern, or in other words, a grid with a different pitch. The liquid crystal devices 775 may be disposed in series, such that light from the light source 773 passes through all the liquid crystal devices 775. The devices 775 may be controlled such that no more than one device 775 has a voltage applied at a time. The control may be performed by various types of controller, processor, or other hardware and software.
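The one-device-at-a-time control of a stack of diffraction devices might be sketched as below. The class, the pitch parameterization, and the device interface are assumptions for illustration only:

```python
class DiffractionStack:
    """Several liquid crystal diffraction devices in series, each producing
    a grid with a different pitch; at most one device is driven at a time.

    A minimal control sketch: selecting a device applies voltage to it and
    removes voltage from all others, so only one grid pitch is active.
    """

    def __init__(self, pitches_mm):
        self.pitches_mm = list(pitches_mm)  # grid pitch of each device
        self.active = None                  # index of the driven device

    def select(self, index):
        """Drive device `index`, de-energizing all others; return its pitch."""
        if not 0 <= index < len(self.pitches_mm):
            raise IndexError("no such device")
        self.active = index
        return self.pitches_mm[index]

    def all_off(self):
        """Remove voltage from every device (no diffraction grid)."""
        self.active = None
```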
In some embodiments, the object location system 770 may include a single liquid crystal device 775 capable of producing multiple diffraction patterns.
In some embodiments, the liquid crystal device 775 may include a series of horizontal electrodes (not illustrated) and a series of vertical electrodes (not illustrated). The electrodes may be generally finger shaped, or may have any other shape known in the art. The horizontal electrodes may be activated to produce a vertical line of dots of light, as shown in
In addition to the elements shown in
In addition to the elements described above,
As discussed above,
The present disclosure is further related to methods of locating objects using proximity sensors and liquid crystal devices. In some embodiments, methods according to the present disclosure may use object location systems as illustrated in
Methods according to the present disclosure may be used to detect and analyze an object. A light signal may be projected from a light source. The signal may comprise a series of pulses of light at a particular frequency. The signal may be unchanged, i.e., not diffracted or polarized. A receiver may be used to detect whether or not the light signal is reflected. The receiver may receive both background light and reflected light, and the received light may be analyzed to determine a signal-to-noise ratio or other measure indicating the amount of reflected light present. If the amount of reflected light is above a threshold, this may indicate that an object is present.
If an object is present, a liquid crystal device may be used to diffract the light signal to produce a grid pattern of light. The object may reflect the points of light in the grid which are projected onto it, while not reflecting the other points of light. The location of the object within the grid may be determined based on the points which it reflects.
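Locating the object from the reflected grid points might be sketched as follows. The (row, column) grid indexing and the bounding-box output are assumptions; a real system would map grid indices back to angles or positions through the known diffraction geometry:

```python
def locate_object(grid_points, reflected):
    """Bounding box of the object in grid coordinates.

    grid_points: all (row, col) spots projected by the diffraction grid.
    reflected: the subset of spots the receiver sees reflected back.
    The object lies within the bounding box of the reflected spots.
    Returns ((min_row, min_col), (max_row, max_col)), or None if no
    spot was reflected (no object in the grid).
    """
    hit_set = set(reflected)
    hits = [p for p in grid_points if p in hit_set]
    if not hits:
        return None
    rows = [r for r, _ in hits]
    cols = [c for _, c in hits]
    return (min(rows), min(cols)), (max(rows), max(cols))
```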
In some embodiments, further steps may be taken to collect additional information about the object. For example, the liquid crystal device can produce a single vertical or horizontal line of light with strong bright points. These points may allow more specific information to be gathered about a small object or a particular region of a larger object. As another example, voltage may be applied to a second liquid crystal device and removed from the first liquid crystal device. This may allow the distance of the object from the proximity sensor to be determined more specifically and may allow the morphology of some objects to be determined.
Methods according to the present disclosure may be used in autonomous cars, security systems, automatic appliances and other smart home applications, recording applications, and numerous other applications. The steps described above may be modified as necessary to accommodate the particular situations.
In some methods, illumination grids may be used to recover 3D information (from the deformation of the grid lines) or to correct deformations in the recorded image. The characteristic size of the unit (the distance between spot lines, the diagonal of a rectangle, etc.) must be large enough to be captured by the camera but, at the same time, small enough to reveal the spatial depth variations of the object. Naturally, this size must be different for close and far positions of the same object, as shown in
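The scaling of grid pitch with object distance can be illustrated with a small-angle sketch. The 5 cm target spot spacing and the function name are assumptions; the point is only that closer objects need a coarser (wider-angle) grid and farther objects a finer one:

```python
import math

def grid_pitch_for_distance(distance_m, target_spacing_m=0.05):
    """Diffraction angle (degrees) between adjacent grid spots that keeps
    the spot spacing on the object roughly constant as the object moves.

    For small angles, spacing on the object ~= distance * theta (radians),
    so theta ~= target_spacing / distance. The 5 cm target is assumed.
    """
    if distance_m <= 0:
        raise ValueError("distance must be positive")
    return math.degrees(target_spacing_m / distance_m)
```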
In some embodiments, a liquid crystal beam shaper may be used along with a relatively narrow band illumination source to generate an illumination grid, to control the size of the unit element in the illumination grid. Such embodiments may be used for applications such as the corrections of distortions of cameras (e.g., panoramic or fish-eye cameras), identification of the distance of objects (by the distance of spots on the object, similar to Moiré interferometry, along with the known unit cell size of the grid that will be generated) and, the evaluation of the 3D profile of the object.
In some embodiments, a liquid crystal beam shaper may be used along with an illumination source to generate a continuously variable illumination angle (a grid or plain illumination), accompanied by continuous detection of the back-scattered signal. The detection of the back-scattered signal will be affected by the position and size of objects in the scene. For example, if, for a narrow-angle illumination, the back-scattered and detected signal is weak or absent, then there is no object within the illuminated solid angle. Then, as the divergence angle is increased, the detection of a signal would indicate the appearance of an object within a specific solid angle. The continuous process of broadening and detection may also identify the size of the object.
In some embodiments, a liquid crystal beam shaper and steerer, which may include components like those presented in
This application claims priority from U.S. Provisional Patent Application 62/883,770 filed Aug. 7, 2019, the contents of which are hereby incorporated by reference.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/CA2020/051085 | 8/7/2020 | WO |
Number | Date | Country
---|---|---
62883770 | Aug 2019 | US