Three-dimensional sensors can be applied in autonomous vehicles, drones, robotics, security applications, and the like. LiDAR sensors may achieve high angular resolutions appropriate for such applications. Existing techniques for scanning laser beams of a LiDAR sensor across a field of view (FOV) include rotating an entire LiDAR sensor assembly, or using a scanning mirror to deflect a laser beam to various directions. Improved scanning LiDAR systems are needed.
According to some embodiments, a scanning LiDAR system includes a base frame, a lens frame fixedly attached to the base frame, a lens attached to the lens frame, and an optoelectronic assembly fixedly attached to the base frame. The optoelectronic assembly includes one or more laser sources and one or more photodetectors. The scanning LiDAR system further includes a platform, and one or more optical fibers. Each respective optical fiber has a first end attached to the platform, and a second end optically coupled to a respective laser source and a respective photodetector. The platform is positioned with respect to the lens such that the first end of each respective optical fiber is positioned substantially at the focal plane of the lens. Each respective optical fiber is configured to: receive and propagate a light beam emitted by the respective laser source from the second end to the first end; and receive and propagate a return light beam from the first end to the second end, so as to be received by the respective photodetector. The scanning LiDAR system further includes a flexure assembly flexibly coupling the platform to the lens frame or the base frame, and a driving mechanism coupled to the flexure assembly and configured to cause the flexure assembly to be flexed so as to scan the platform laterally in a plane substantially perpendicular to an optical axis of the scanning LiDAR system, thereby scanning the first end of each optical fiber in the plane relative to the lens.
According to some embodiments, a method of three-dimensional imaging using a scanning LiDAR system is provided. The scanning LiDAR system includes an optoelectronic assembly and a lens. The optoelectronic assembly includes at least a first laser source and a first photodetector. The method includes emitting, using the first laser source, a plurality of laser pulses, and coupling each of the plurality of laser pulses into an optical fiber through a first end of the optical fiber. A second end of the optical fiber is attached to a platform that is positioned with respect to the lens such that the second end of the optical fiber is positioned substantially at a focal plane of the lens. The method further includes translating the second end of the optical fiber in the focal plane of the lens by translating the platform, so that the lens projects the plurality of laser pulses at a plurality of angles in a field of view (FOV) in front of the scanning LiDAR system, and receiving and focusing, using the lens, a plurality of return laser pulses reflected off one or more objects onto the second end of the optical fiber. A portion of each of the plurality of return laser pulses is coupled into the optical fiber through the second end and propagated therethrough to the first end. The method further includes detecting, using the first photodetector optically coupled to the first end of the optical fiber, the plurality of return laser pulses, determining, using a processor, a time of flight for each return laser pulse of the plurality of return laser pulses, and constructing a three-dimensional image of the one or more objects based on the times of flight of the plurality of return laser pulses.
The present invention relates generally to scanning LiDAR systems for three-dimensional imaging. Merely by way of examples, embodiments of the present invention provide apparatuses and methods for a scanning LiDAR system in which a lens assembly is moved while an optoelectronic assembly is fixed. In some other embodiments, both the lens assembly and the optoelectronic assembly are fixed, and the ends of an array of optical fibers coupled to the optoelectronic assembly are scanned relative to the lens assembly.
A portion 122 of the collimated laser pulse 120′ is reflected off of the object 150 toward the receiving lens 140. The receiving lens 140 is configured to focus the portion 122′ of the laser pulse reflected off of the object 150 onto a corresponding detection location in the focal plane of the receiving lens 140. The LiDAR sensor 100 further includes a photodetector 160a disposed substantially at the focal plane of the receiving lens 140. The photodetector 160a is configured to receive and detect the portion 122′ of the laser pulse 120 reflected off of the object at the corresponding detection location. The corresponding detection location of the photodetector 160a is optically conjugate with the respective emission location of the laser source 110a.
The laser pulse 120 may be of a short duration, for example, a 100 ns pulse width. The LiDAR sensor 100 further includes a processor 190 coupled to the laser source 110a and the photodetector 160a. The processor 190 is configured to determine a time of flight (TOF) of the laser pulse 120 from emission to detection. Since the laser pulse 120 travels at the speed of light, a distance between the LiDAR sensor 100 and the object 150 may be determined based on the determined time of flight.
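For example, the distance d may be computed from the round-trip relation d = c·t/2, where t is the measured time of flight and c is the speed of light; a time of flight of 1 μs corresponds to a distance of about 150 m.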
One way of scanning the laser beam 120′ across a FOV is to move the laser source 110a laterally relative to the emission lens 130 in the back focal plane of the emission lens 130. For example, the laser source 110a may be raster scanned to a plurality of emission locations in the back focal plane of the emitting lens 130 as illustrated in
By determining the time of flight for each laser pulse emitted at a respective emission location, the distance from the LiDAR sensor 100 to each corresponding point on the surface of the object 150 may be determined. In some embodiments, the processor 190 is coupled with a position encoder that detects the position of the laser source 110a at each emission location. Based on the emission location, the angle of the collimated laser pulse 120′ may be determined. The X-Y coordinate of the corresponding point on the surface of the object 150 may be determined based on the angle and the distance to the LiDAR sensor 100. Thus, a three-dimensional image of the object 150 may be constructed based on the measured distances from the LiDAR sensor 100 to various points on the surface of the object 150. In some embodiments, the three-dimensional image may be represented as a point cloud, i.e., a set of X, Y, and Z coordinates of the points on the surface of the object 150.
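Purely by way of illustration, the conversion from a measured time of flight and a known emission location to a point in such a point cloud may be sketched as follows. The function and variable names, the paraxial-angle treatment, and the example numbers are illustrative assumptions rather than part of the described embodiments.

    import math

    C = 299_792_458.0  # speed of light in m/s

    def point_from_measurement(x_emit, y_emit, focal_length, tof):
        """Convert an emission location (meters, in the back focal plane) and a
        measured time of flight (seconds) into an (X, Y, Z) point, assuming an
        ideal thin lens. A source displaced by x_emit from the optical axis
        produces a collimated beam at an angle of about atan(x_emit / f)."""
        theta_x = math.atan2(x_emit, focal_length)   # horizontal beam angle
        theta_y = math.atan2(y_emit, focal_length)   # vertical beam angle
        distance = C * tof / 2.0                     # round trip -> one-way range
        # Project the range onto Cartesian axes (paraxial sketch).
        X = distance * math.sin(theta_x)
        Y = distance * math.sin(theta_y)
        Z = distance * math.cos(theta_x) * math.cos(theta_y)
        return (X, Y, Z)

    # Example: emission point 1 mm off-axis, 50 mm focal length, 500 ns round trip.
    print(point_from_measurement(1e-3, 0.0, 50e-3, 500e-9))

Repeating this conversion for every emission location and detected return yields the set of X, Y, and Z coordinates forming the point cloud.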
In some embodiments, the intensity of the return laser pulse 122′ is measured and used to adjust the power of subsequent laser pulses from the same emission point, in order to prevent saturation of the detector, improve eye-safety, or reduce overall power consumption. The power of the laser pulse may be varied by varying the duration of the laser pulse, the voltage or current applied to the laser, or the charge stored in a capacitor used to power the laser. In the latter case, the charge stored in the capacitor may be varied by varying the charging time, charging voltage, or charging current to the capacitor. In some embodiments, the intensity may also be used to add another dimension to the image. For example, the image may contain X, Y, and Z coordinates, as well as reflectivity (or brightness).
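As a non-limiting sketch of one possible control heuristic consistent with the above description, the energy of the next pulse from the same emission point may be scaled based on the normalized intensity of the previous return; the threshold values, bounds, and names below are illustrative assumptions. The selected energy could then be realized by adjusting the pulse duration or the charge stored in the capacitor, as described above.

    def next_pulse_energy(current_energy, return_intensity,
                          saturation_level=0.9, floor_level=0.2,
                          min_energy=0.1e-6, max_energy=2.0e-6):
        """Scale the energy (in joules) of the next pulse from the same emission
        point based on the normalized intensity of the previous return (0.0-1.0).
        Reduce energy when the detector approaches saturation; increase it when
        the return is weak, within eye-safety-driven bounds."""
        if return_intensity >= saturation_level:
            current_energy *= 0.5
        elif return_intensity <= floor_level:
            current_energy *= 1.5
        return min(max(current_energy, min_energy), max_energy)

    # Example: a near-saturating return halves the next pulse energy.
    print(next_pulse_energy(1.0e-6, 0.95))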
The angular field of view (AFOV) of the LiDAR sensor 100 may be estimated based on the scanning range of the laser source 110a and the focal length of the emitting lens 130 as,
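AFOV ≈ 2 tan⁻¹(h/(2f)),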
where h is the scan range of the laser source 110a along a certain direction, and f is the focal length of the emitting lens 130. For a given scan range h, shorter focal lengths would produce wider AFOVs. For a given focal length f, larger scan ranges would produce wider AFOVs. In some embodiments, the LiDAR sensor 100 may include multiple laser sources disposed as an array at the back focal plane of the emitting lens 130, so that a larger total AFOV may be achieved while keeping the scan range of each individual laser source relatively small. Accordingly, the LiDAR sensor 100 may include multiple photodetectors disposed as an array at the focal plane of the receiving lens 140, each photodetector being conjugate with a respective laser source. For example, the LiDAR sensor 100 may include a second laser source 110b and a second photodetector 160b, as illustrated in
The laser source 110a may be configured to emit laser pulses in the ultraviolet, visible, or near infrared wavelength ranges. The energy of each laser pulse may be on the order of microjoules, which is normally considered to be eye-safe for repetition rates in the kHz range. For laser sources operating at wavelengths greater than about 1500 nm, the energy levels could be higher as the eye does not focus at those wavelengths. The photodetector 160a may comprise a silicon avalanche photodiode, a photomultiplier, a PIN diode, or other semiconductor sensors.
The angular resolution of the LiDAR sensor 100 can be effectively diffraction limited, which may be estimated as,
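θ ≈ 1.22 λ/D,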
where λ is the wavelength of the laser pulse, and D is the diameter of the lens aperture. The angular resolution may also depend on the size of the emission area of the laser source 110a and aberrations of the lenses 130 and 140. According to various embodiments, the angular resolution of the LiDAR sensor 100 may range from about 1 mrad to about 20 mrad (about 0.05-1.0 degrees), depending on the type of lenses.
I. Lidar Systems with Moving Lens Assembly
As discussed above, for the LiDAR system illustrated in
The laser source 110a and the photodetector 160a are usually connected to power sources and control electronics via electrical cables. Since the power sources and the control electronics are normally stationary, moving the laser source 110a and the photodetector 160a may cause strains on the electrical cables, and can potentially affect the robustness of the operation of the LiDAR system. According to some embodiments, the laser source 110a and the photodetector 160a remain fixed, and the scanning of the laser beam 120′ across the FOV is achieved by moving the emission lens 130 laterally in a plane substantially perpendicular to its optical axis (e.g., in the plane perpendicular to the page), either in one dimension or two dimensions. Accordingly, the receiving lens 140 is moved synchronously with the motion of the emission lens 130, so that a return laser beam 122′ is focused onto the photodetector 160a. This scanning method has the advantage that no electrical connection is required between moving parts and stationary parts. It may also make it easier to adjust the alignment of the laser source 110a and the photodetector 160a during operation, since they are not moving.
The LiDAR system 200 may further include an emission lens 230 and a receiving lens 240. Each of the emission lens 230 and the receiving lens 240 may be a compound lens that includes multiple lens elements. The emission lens 230 and the receiving lens 240 may be mounted in a lens mount 220. The lens mount 220 with the emission lens 230 and the receiving lens 240 attached thereto may be referred to herein as a lens assembly.
The lens assembly may be flexibly attached to the base frame 202 via a pair of flexures 270a and 270b as illustrated in
As illustrated in
Because the lens assembly 220 may not require any electrical connections for power, moving the lens assembly 220 may avoid the potential problems with electrical connections that can arise when the optoelectronic board 250 is moved. Therefore, the LiDAR system 200 may afford more robust operation. It may also be easier to adjust the alignment of the laser sources 210 and photodetectors 260 during operation, since they are not moving.
Although
The LiDAR system 300 may further include an emission lens 330 and a receiving lens 340. (Note that each of the emission lens 330 and the receiving lens 340 may be a compound lens that includes multiple lens elements.) The emission lens 330 and the receiving lens 340 may be mounted in a lens frame 320. The lens frame 320 with the emission lens 330 and the receiving lens 340 attached thereto may be referred to herein as a lens assembly.
The lens assembly 320 may be flexibly attached to the base frame 302 via four flexures 370a-370d. A first end of each flexure 370a, 370b, 370c, or 370d is attached to a respective corner of the lens frame 320. A second end of each flexure 370a, 370b, 370c, or 370d opposite to the first end is attached to the base frame 302, as illustrated in
In some embodiments, the flexures 370a-370d may be made of spring steel such as music wires, so that the flexures 370a-370d can be deflected in two dimensions. One or more actuators 304a-304d (e.g., voice coil motors or other types of actuators) may be coupled to the flexures 370a-370d, and can cause the first end of each flexure to be deflected, thus causing the lens assembly 320 to move in two dimensions in a plane substantially perpendicular to the optical axis (e.g., along the Z-direction) of the emission lens 330 or the receiving lens 340, as indicated by the two orthogonal double-sided arrows in
In some embodiments, the two-dimensional scanning of the lens assembly may be performed in a raster scan pattern. For example, the lens assembly may be scanned at a higher frequency (e.g., on the order of a hundred to a few hundred Hz) in the horizontal direction (e.g., the X-direction), and at a lower frequency (e.g., on the order of a few to a few tens of Hz) in the vertical direction (e.g., the Y-direction). The high-frequency scan in the horizontal direction may correspond to a line scan, and the low-frequency scan in the vertical direction may correspond to a frame rate. The high frequency may be at a resonant frequency of the flexure assembly. The low frequency scan may not be at the resonant frequency.
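A minimal sketch of such a raster drive, assuming a fast sinusoidal axis (e.g., at a flexure resonance) and a slow sawtooth axis, with illustrative frequencies, amplitudes, and names, is given below.

    import numpy as np

    def raster_trajectory(t, fx=200.0, fy=10.0, ax=1.0, ay=1.0):
        """Raster-like drive: fast sinusoidal line scan in X and a slow sawtooth
        frame scan in Y (fy frames per second, with flyback)."""
        x = ax * np.sin(2.0 * np.pi * fx * t)
        y = ay * (2.0 * ((fy * t) % 1.0) - 1.0)   # ramps from -ay to +ay each frame
        return x, y

    t = np.linspace(0.0, 0.1, 10_000)   # one 0.1 s frame at fy = 10 Hz
    x, y = raster_trajectory(t)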
In some other embodiments, the two-dimensional scanning of the lens assembly 320 may be performed in a Lissajous pattern. A Lissajous scan pattern may be achieved by scanning the lens assembly in the horizontal and vertical directions with similar but not identical frequencies. Mathematically, a Lissajous curve is a graph of parametric equations:
x = A sin(at + δ), y = B sin(bt),
where a and b are the frequencies in the x direction (e.g., the horizontal direction) and y direction (e.g., the vertical direction), respectively; t is time; and δ is a phase difference.
The frame rate may be related to the difference between the two frequencies a and b. In some embodiments, the scanning frequencies a and b may be chosen based on a desired frame rate. For instance, if a frame rate of 10 frames per second is desired, a frequency of 200 Hz in the horizontal direction and 210 Hz in the vertical direction may be chosen. In this example, the Lissajous pattern may repeat exactly from frame to frame. By choosing the two frequencies a and b to be significantly greater than the frame rate and properly selecting the phase difference δ, a relatively uniform and dense coverage of the field of view may be achieved.
In some other embodiments, if it is desired for the Lissajous pattern not to repeat, a different frequency ratio or an irrational frequency ratio may be chosen. For example, the scanning frequencies in the two directions a and b may be chosen to be 200 Hz and 210.1 Hz, respectively. In this example, if the frame rate is 10 frames per second, the Lissajous pattern may not repeat from frame to frame. As another example, the scanning frequencies a and b may be chosen to be 201 Hz and 211 Hz, respectively, so that the ratio a/b is not a simple fraction. In this example, the Lissajous pattern will also shift from frame to frame. In some cases, it may be desirable for the Lissajous pattern not to repeat from frame to frame, as a trajectory of the laser source or the photodetector from a subsequent frame may fill in gaps of a trajectory from an earlier frame, thereby effectively providing a denser coverage of the field of view.
In some embodiments, a frequency separation that is a multiple of a desired frame rate may also be used. For example, the scanning frequencies in the two directions a and b may be chosen to be 200 Hz and 220 Hz, respectively. In this case, for example, a frame rate of either 10 Hz or 20 Hz may be used. According to various embodiments, a ratio between the scanning frequencies a and b may range from about 0.5 to about 2.0.
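A minimal sketch of generating such Lissajous drive waveforms, interpreting a and b as frequencies in Hz and using illustrative amplitudes, phase, and names, is given below; the second call illustrates the 200 Hz / 210.1 Hz case in which the pattern drifts from frame to frame.

    import numpy as np

    def lissajous_trajectory(t, a=200.0, b=210.0, A=1.0, B=1.0, delta=np.pi / 2):
        """Lissajous drive x = A sin(2*pi*a*t + delta), y = B sin(2*pi*b*t).
        With a = 200 Hz and b = 210 Hz, both frequencies are multiples of 10 Hz,
        so the combined pattern repeats every 0.1 s, i.e., exactly once per frame
        at 10 frames per second."""
        x = A * np.sin(2.0 * np.pi * a * t + delta)
        y = B * np.sin(2.0 * np.pi * b * t)
        return x, y

    t = np.linspace(0.0, 0.1, 50_000)           # one 0.1 s frame
    x_rep, y_rep = lissajous_trajectory(t)      # pattern repeats frame to frame
    x_drift, y_drift = lissajous_trajectory(t, a=200.0, b=210.1)  # pattern drifts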
Referring to
Other types of two-dimensional flexures different from the rod springs may also be used.
Each of the pair of flexures 420a and 420b may be fabricated by cutting a plate of spring material. A convolution configuration, as illustrated in
In order to mitigate any vibrations that may be caused by the scanning of the lens assembly, a counter-balance may be used in a LiDAR system.
In some embodiments, the counter-balance structure 580 may be arranged to scan sympathetically to the lens assembly 220 without active drive, similar to the way one arm of a tuning fork will vibrate opposite to the other arm even if only the other arm is struck. In another embodiment, the counter-balance structure 580 may be driven and the lens assembly 220 may scan sympathetically. In yet another embodiment, a driving mechanism may be arranged to act between the lens assembly 220 and the counter-balance structure 580 without direct reference to the base frame 202.
In some embodiments, the counter-balance object 580 may advantageously be configured to have a center of mass that is close to the center of mass of the lens assembly 220. In some embodiments, the counter-balance object 580 may have substantially the same mass as the mass of the lens assembly 220. Thus, when the counter-balance object 580 is scanned with equal magnitude as the lens assembly 220 but in an opposite direction, the momentum of the counter-balance object 580 may substantially cancel the momentum of the lens assembly 220, thereby minimizing the vibration of the LiDAR system 500. In some other embodiments, the counter-balance object 580 may have a mass that is smaller (or larger) than the mass of the lens assembly 220, and may be scanned with a larger (or smaller) amplitude than the lens assembly 220, so that the momentum of the counter-balance object 580 substantially cancels the momentum of the lens assembly 220.
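For sinusoidal scanning at a common frequency, this cancellation condition may be expressed as m_cb·A_cb ≈ m_lens·A_lens, where m_cb and A_cb are the mass and scan amplitude of the counter-balance object 580, and m_lens and A_lens are the mass and scan amplitude of the lens assembly 220.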
The method 700 includes, at 702, scanning the lens assembly in a plane substantially perpendicular to an optical axis of the LiDAR system, while the optoelectronic assembly of the LiDAR system is fixed. The lens assembly may include an emission lens and a receiving lens. The optoelectronic assembly may include at least a first laser source and at least a first photodetector. The lens assembly is positioned relative to the optoelectronic assembly in a direction along the optical axis such that the first laser source is positioned substantially at a focal plane of the emission lens, and the first photodetector is positioned substantially at a focal plane of the receiving lens.
The method 700 further includes, at 704, emitting, using the first laser source, a plurality of laser pulses as the lens assembly is being scanned to a plurality of positions, respectively, such that the plurality of laser pulses are projected at a plurality of angles in a field of view (FOV) in front of the LiDAR system. The plurality of laser pulses may be reflected off of one or more objects in the FOV.
The method 700 further includes, at 706, detecting, using the first photodetector, the plurality of laser pulses reflected off of the one or more objects.
The method 700 further includes, at 708, determining, using a processor, a time of flight for each laser pulse of the plurality of laser pulses.
The method 700 further includes, at 710, constructing a three-dimensional image of the one or more objects based on the times of flight of the plurality of laser pulses.
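Purely by way of illustration, the overall flow of method 700 may be sketched as follows; the scanner, laser, and detector interfaces are hypothetical names introduced for the sketch and are not part of the described system.

    def acquire_frame(scanner, laser, detector):
        """One frame of method 700, using hypothetical hardware interfaces.
        Returns a list of (lens_position, time_of_flight) measurements."""
        measurements = []
        for position in scanner.scan_positions():   # step 702: scan the lens assembly
            laser.emit_pulse()                       # step 704: emit a laser pulse
            tof = detector.wait_for_return()         # step 706: detect the return
            if tof is not None:
                measurements.append((position, tof)) # step 708: record time of flight
        # Step 710: each (position, tof) pair can be converted to an (X, Y, Z)
        # point, as in the coordinate sketch above, to construct the image.
        return measurements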
It should be appreciated that the specific steps illustrated in
II. Lidar Systems with Optical Fiber Array
According to some embodiments, a scanning LiDAR system may use optical fibers to couple light beams emitted by the laser sources to the focal plane of an emission lens, and to couple return laser beams focused at the focal plane of a receiving lens to the photodetectors. Instead of moving the lens assembly or the laser sources, the ends of the optical fibers are moved relative to the lens assembly so as to scan the laser beams across a FOV.
The LiDAR system 800 also includes one or more emission optical fibers 810. A first end of each emission optical fiber 810 is coupled to a respective laser source 210 of the one or more laser sources 210. A second end 812 of each emission optical fiber 810 is positioned substantially at the focal plane of the emission lens 230. Thus, a light beam emitted by the respective laser source 210 is coupled into the respective emission optical fiber 810, and is emitted from the second end 812 of the emission optical fiber 810 to be collimated by the emission lens 230.
The LiDAR system 800 also includes one or more receiving optical fibers 860. A first end of each receiving optical fiber 860 is coupled to a respective photodetector 260 of the one or more photodetectors 260. A second end 862 of each receiving optical fiber 860 is positioned substantially at the focal plane of the receiving lens 240. The position of the second end 862 of the receiving optical fiber 860 is optically conjugate with the position of the second end 812 of the emission optical fiber 810, so that a return light beam focused by the receiving lens 240 may be coupled into the receiving optical fiber 860 and propagated onto the respective photodetector 260.
The second end 812 of each emission optical fiber 810 and the second end 862 of each receiving optical fiber 860 are attached to a platform 820. The platform 820 is flexibly attached to the lens frame 880 via a pair of flexures 890a and 890b. The platform 820 may be moved laterally left or right relative to the lens frame 880 by deflecting the pair of flexures 890a and 890b using an actuator (not shown), as indicated by the double-sided arrow in
In the LiDAR system 800, both the lens assembly 880 and the optoelectronic assembly 250 are fixed, and the scanning is achieved by moving the platform 820, thereby moving the second ends 812 of the emission optical fibers 810 and the second ends 862 of the receiving optical fibers 860 relative to the lens assembly 880. Since optical fibers with relatively small diameters can be quite flexible, moving the platform 820 may not cause significant strains on the emission optical fibers 810 and the receiving optical fibers 860. Thus, the LiDAR system 800 may be operationally robust.
In some embodiments in which the LiDAR system 800 includes multiple laser sources 210 and multiple photodetectors 260 (e.g., four laser sources 210 and four photodetectors 260 as illustrated in
The LiDAR system 900 further includes one or more emission optical fibers 910, and one or more receiving optical fibers 960. A first end of each emission optical fiber 910 is coupled to a respective laser source 310 of the one or more laser sources 310. A second end 912 of each emission optical fiber 910 is attached to a platform 920. A first end of each receiving optical fiber 960 is coupled to a respective photodetector 360 of the one or more photodetectors 360. A second end 962 of each receiving optical fiber 960 is attached to the platform 920.
The platform 920 is spaced apart from the emission lens 330 and the receiving lens 340 such that the second ends 912 of the emission optical fibers are positioned substantially in the focal plane of the emission lens 330, and the second ends 962 of the receiving optical fibers are positioned substantially in the focal plane of the receiving lens 340. Thus, a light beam emitted by a respective laser source 310 may be coupled into a respective emission optical fiber 910 and subsequently emitted from the second end 912 of the emission optical fiber 910 to be collimated by the emission lens 330. A return light beam focused by the receiving lens 340 may be coupled into a respective receiving optical fiber 960 through its second end 962, and propagated by the respective receiving optical fiber 960 onto a respective photodetector 360.
The platform 920 is flexibly attached to the lens frame 980 via four flexures 990a-990d, which may be coupled to one or more actuators (not shown). The platform 920 may be moved laterally in two dimensions (e.g., in the X-direction and Y-direction) in a plane substantially perpendicular to the optical axis (e.g., in the Z-direction) of the emission lens 330 and the optical axis of the receiving lens 340 by deflecting the flexures 990a-990d via the actuators, as indicated by the two double-sided arrows in
Similar to the LiDAR system 700 illustrated in
In some embodiments, the two-dimensional scanning of the platform 920 may be performed in a raster scan pattern. For example, the platform 920 may be scanned at a higher frequency (e.g., on the order of a hundred to a few hundred Hz) in the horizontal direction (e.g., the X-direction), and at a lower frequency (e.g., on the order of a few to a few tens of Hz) in the vertical direction (e.g., the Y-direction). The high-frequency scan in the horizontal direction may correspond to a line scan, and the low-frequency scan in the vertical direction may correspond to a frame rate. The high frequency may be at a resonant frequency of the flexure assembly. The low frequency scan may not be at the resonant frequency. In some other embodiments, the two-dimensional scanning of the platform 920 may be performed in a Lissajous pattern, by scanning in both directions at relatively high frequencies that are close but not identical, as discussed above with reference to
In some embodiments, a single optical fiber can be used for both conducting light emitted by a laser source to the focal plane of a lens, and conducting light reflected off an object to a photodetector.
The LiDAR system 1000 also includes an optical fiber 1040. A first end 1042 of the optical fiber 1040 is attached to a platform 1020. The platform 1020 is flexibly attached to the lens frame 1080 via a pair of flexures 1090a and 1090b. The platform 1020 can be moved laterally left or right relative to the lens frame 1080 by flexing the pair of flexures 1090a and 1090b using an actuator (not shown). In some embodiments, the flexures 1090a and 1090b can be flexed in two dimensions (e.g., both in the left and right direction and in the direction in and out of the page). Alternatively, the platform 1020 can be flexibly attached to the base frame 1002 via flexures. The platform 1020 is spaced apart from the lens 1030 so that the first end 1042 of the optical fiber 1040 is positioned substantially at the focal plane of the lens 1030.
Light emitted by the laser source 1010 can be coupled into a second end 1044 of the optical fiber 1040, propagated to the first end 1042, and be emitted from the first end 1042. Thus, an outgoing light beam 1012 can be collimated by the lens 1030 and be projected to a scene. An incoming light beam 1014 that is reflected off an object in the scene can be focused by the lens 1030, and be coupled into the first end 1042 of the optical fiber 1040.
According to some embodiments, the incoming light and the outgoing light can be separated at the second end 1044 of the optical fiber 1040 using an optical beam splitter or other optical components. Exemplary optical components can include free-space beam splitters (e.g., prism beam splitters or polarizing beam splitters), fiber-optic splitters (e.g., fused biconical taper (FBT) splitters or planar lightwave circuit (PLC) splitters), waveguide couplers, partially transmitting and partially reflecting mirrors, and the like.
According to some embodiments, other optical elements, such as a small lens, an optical filter, and/or an anti-reflective coating can be attached or applied to the first end 1042 of the optical fiber 1040 to improve light coupling between the optical fiber 1040 and the lens 1030. A similar optical component can also be attached or applied to the second end 1044 of the optical fiber 1040 to improve light coupling between the optical fiber 1040 and the laser source 1010 and the photodetector 1060.
According to some embodiments, an array of laser sources and an array of photodetectors can be used for covering a larger field of view. As an example,
According to some embodiments, a mirror can be used to couple light emitted by the laser source 1010 into the optical fiber 1040, or to couple incoming light from the fiber 1040 onto the photodetector 1060.
In
The method 1300 includes, at 1302, emitting, using the first laser source, a plurality of laser pulses; and at 1304, coupling each of the plurality of laser pulses into an optical fiber through a first end of the optical fiber. A second end of the optical fiber is attached to a platform that is positioned with respect to the lens such that the second end of the optical fiber is positioned substantially at a focal plane of the lens.
The method 1300 further includes, at 1306, translating the second end of the optical fiber in the focal plane of the lens by translating the platform, so that the lens projects the plurality of laser pulses at a plurality of angles in a field of view (FOV) in front of the scanning LiDAR system.
The method 1300 further includes, at 1308, receiving and focusing, using the lens, a plurality of return laser pulses reflected off one or more objects onto the second end of the optical fiber. A portion of each of the plurality of return laser pulses is coupled into the optical fiber through the second end and propagated therethrough to the first end.
The method 1300 further includes, at 1310, detecting, using the first photodetector optically coupled to the first end of the optical fiber, the plurality of return laser pulses; at 1312, determining, using a processor, a time of flight for each return laser pulse of the plurality of return laser pulses; and at 1314, constructing a three-dimensional image of the one or more objects based on the times of flight of the plurality of return laser pulses.
It should be appreciated that the specific steps illustrated in
It is also understood that the examples and embodiments described herein are for illustrative purposes only and that various modifications or changes in light thereof will be suggested to persons skilled in the art and are to be included within the spirit and purview of this application and scope of the appended claims.
This application is a continuation-in-part of U.S. patent application Ser. No. 16/504,989, filed on Jul. 8, 2019, which claims the benefit of U.S. Provisional Patent Application No. 62/696,247, filed on Jul. 10, 2018, the contents of which are hereby incorporated by reference in their entireties.
Provisional Applications:

| Number | Date | Country |
| --- | --- | --- |
| 62696247 | Jul 2018 | US |

Parent Case Data:

| | Number | Date | Country |
| --- | --- | --- | --- |
| Parent | 16504989 | Jul 2019 | US |
| Child | 17385669 | | US |