Light detection and ranging (LIDAR) is increasingly useful for providing range measurements in the vicinity of autonomous vehicles, robots and smart buildings. Traditionally, LIDAR systems have been placed on the exterior of host platforms (e.g. vehicles) with direct access to a field of view (FOV). While this is useful during research and development, external LIDAR placement with a single FOV poses challenges including aesthetics, long-term reliability and cost.
Flash LIDAR or time-of-flight (TOF) cameras are a class of scannerless LIDAR in which a laser or LED source illuminates a plurality of directions at once, and a photodetector array, such as a focal plane array (FPA) of avalanche photodiodes or an array of single photon avalanche detectors (SPADs), detects the timing of reflections from the plurality of directions. An ongoing challenge is that the photodetector array (e.g. the single photon avalanche detector array) can be the most expensive part of a flash LIDAR. Providing the LIDAR with an unobstructed view of the vicinity typically requires mounting the LIDAR on the exterior of the host platform, where it is subject to weather and damage.
In a related area, an ongoing challenge for autonomous vehicles (AVs) and advanced driver-assistance systems (ADAS) is to sense objects in close proximity to the vehicle. This capability is important because many driving scenarios require knowledge of what is going on close to the vehicle (e.g. entering a tight parking spot). Centrally mounting a LIDAR on the roof can provide 360-degree coverage beyond a certain distance (e.g. beyond 3 meters from a car). However, within this distance a centrally located LIDAR may not be able to detect objects due to obstruction by the vehicle's roof. To address this challenge, some AV platforms have added additional LIDARs to the perimeter of the vehicle (e.g. located on the perimeter of the roof or on the vehicle fenders) in order to detect objects close to the vehicle (e.g. curbs or pedestrians). In a related aspect, a single perimeter LIDAR may only address a single blindspot, and therefore several perimeter LIDARs may be required to address all blindspots. U.S. Patent Application 2015/0192677 to Yu addresses this challenge by disclosing multiple LIDAR sensors around a vehicle to provide adequate coverage. However, the operation of multiple LIDARs around a vehicle perimeter in a system remains a significant challenge.
Within examples, a distributed LIDAR system is disclosed, comprising a ranging subassembly, one or more LIDARs, and a plurality of shared light emitters. The LIDARs are located remotely from one another, for example, around the perimeter of a car. The shared light emitters can be located remote from the LIDARs and provide light pulses to several of the LIDARs. In the prior art, each LIDAR would directly control one or more dedicated light emitters (e.g. a laser, a laser array or an LED). This prior art configuration would enable the LIDAR to closely control the timing of the dedicated light emitter(s) for the purpose of determining the time of flight (TOF) of light pulses. While bistatic LIDARs have been disclosed, whereby the light emitter is located remotely from the light detector, this architecture still requires the light emitter to be directly controlled by a single LIDAR. This is in contrast to a light emitter that services several LIDARs at once (i.e. a shared light emitter).
In one aspect of several embodiments the light emitters are shared among multiple LIDARs by passing a common reference timing signal (or a set of distinct reference signals derived from a common clock signal) to both the emitter(s) and the LIDAR(s). A ranging subassembly or central controller in a vehicle can pass the common reference signal to the emitter(s) and the LIDAR(s). In this way a LIDAR can calculate the TOF of a light reflection based on the common (or related) reference timing signal.
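As a concrete illustration of this calculation, the following sketch (in Python, with illustrative helper names that are not part of the disclosure) assumes that each edge of the shared reference timing signal coincides with a shared-emitter light pulse, and computes the elapsed time from the most recent edge to a detected reflection:

```python
# Minimal sketch, assuming each reference edge coincides with a shared-emitter
# light pulse; helper names are illustrative and not part of the disclosure.

C_M_PER_S = 299_792_458.0  # speed of light

def time_of_flight(reflection_time_s, reference_edge_times_s):
    """Elapsed time between the most recent reference edge preceding the
    reflection and the reflection itself."""
    emission_time = max(t for t in reference_edge_times_s if t <= reflection_time_s)
    return reflection_time_s - emission_time

def monostatic_range_m(tof_s):
    """Range if emitter and detector are approximately co-located; a remote
    shared emitter would instead require the bistatic (emitter-object-detector)
    geometry."""
    return C_M_PER_S * tof_s / 2.0

# Example: reference edges every 100 microseconds; a reflection arrives 0.5
# microseconds after the fourth edge.
edges = [i * 100e-6 for i in range(10)]
tof = time_of_flight(300.5e-6, edges)   # 0.5 microseconds
print(monostatic_range_m(tof))          # roughly 75 m
```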
Aspects of this disclosure provide solutions for challenges of implementing this architecture. These challenges include crosstalk among multiple emitters (e.g. how to determine which of the shared emitters a light reflection arrived from), as well as how to distribute a common (or related) time reference signal to both the emitters and the LIDARs in a distributed architecture.
The above summary does not include an exhaustive list of all aspects of the present invention. It is contemplated that the invention includes all systems and methods that can be practiced from all suitable combinations of the various aspects summarized above, as well as those disclosed in the Detailed Description below and particularly pointed out in the claims filed with the application. Such combinations have particular advantages not specifically recited in the above summary.
Embodiments of the present disclosure are operable to provide the following exemplary advantages: In one advantage the disclosed perimeter detection system enables a single LIDAR to receive reflections from multiple light emitters located on a host vehicle. For example, a LIDAR mounted on the center of a vehicle grille could benefit from two light emitters, one in the left headlight and one in the right headlight. Each of the shared emitters can be positioned to illuminate distinct portions of the FOV of the LIDAR.
In a second advantage, the two light emitters of the previous example could be shared with two other LIDARs mounted on each of the fenders. In this way the present disclosure provides a distinct advantage over previous architectures, where the three LIDARs (grille and two fenders) would have to operate their respective dedicated light emitters in a non-interfering (e.g. time multiplexed) manner. This advantage is achieved by providing a common time reference signal to each of the LIDARs that see reflections from a shared emitter. For example, each of the three LIDARs and the shared emitter could receive a common timing pulse to indicate when a laser shot has been emitted by a light emitter.
In a third advantage, the light reflections from a shared light emitter represent usable reflections to any LIDAR that received the common time reference signal. This makes the disclosed architecture more scalable than architectures where multiple LIDARs independently operate dedicated light emitters.
Several embodiments enable the time reference signal to be transmitted to the LIDAR on a simple coax cable thereby simplifying deployment.
The disclosed architecture enables various shared emitters to be specialized (e.g. some with narrow beam angle and high intensity, some with wide field of view, and others aimed at known areas of interest).
Embodiments of the disclosed architecture enable each LIDAR to provide reflection signals when sequentially illuminated by each of the specialized emitters (e.g. one set of reflections from the left headlight emitter and one set of reflections from the right headlight emitter). The LIDAR or circuitry in the ranging subassembly can thereby process the various sets of reflections to produce a composite 3D pointcloud from multiple shared emitters.
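One plausible way to fuse the per-emitter reflection sets into a composite point cloud is sketched below; the array shapes, dictionary keys and single LIDAR-to-vehicle transform are assumptions made for illustration, not the disclosed processing chain.

```python
# Illustrative sketch: merge per-emitter reflection sets into one composite cloud.
# The dictionary keys and the transform convention are assumptions.
import numpy as np

def compose_point_cloud(reflection_sets, lidar_to_vehicle):
    """reflection_sets maps an emitter label (e.g. 'left_headlight') to an
    (N, 3) array of reflection points in the LIDAR frame; returns a single
    (M, 3) cloud expressed in the vehicle frame."""
    clouds = []
    for emitter_label, points in reflection_sets.items():
        homogeneous = np.hstack([points, np.ones((points.shape[0], 1))])
        clouds.append((lidar_to_vehicle @ homogeneous.T).T[:, :3])
    return np.vstack(clouds)

left = np.array([[1.0, 0.2, 0.0], [1.1, 0.3, 0.1]])
right = np.array([[1.0, -0.2, 0.0]])
composite = compose_point_cloud({"left_headlight": left, "right_headlight": right},
                                np.eye(4))
print(composite.shape)   # (3, 3)
```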
Another advantage of shared light emitters located remote from the LIDARs is that the emitters can be housed in locations best suited for such emitters (e.g. in headlights and accent lights) while the LIDARs can be located in locations best suited to maximize FOV, such as the front grille, side mirrors, behind windshields, or on the perimeter of the roof.
In another advantage, the set of shared emitters can be tailored for different environments or driving scenarios. For example, light emitters with a wide beam angle may be used for parking while light emitters with intense narrow beams may be used for highway driving.
In another advantage, a light emitter co-located with a LIDAR may experience shadowing of a large portion of the FOV by an object. A plurality of shared emitters offers multiple ways to illuminate a scene or an object and provide reflections in the presence of the shadowing object.
The embodiments of the invention are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings in which like references indicate similar elements. It should be noted that references to “an” or “one” embodiment of the invention in this disclosure are not necessarily to the same embodiment, and they mean at least one.
The remote ranging subassembly further functions to distribute emitter reference timing signals to four shared light emitters 120a, 120b, 120c and 120d, as well as detector reference timing signals to the four LIDARs 150a, 150b, 150c and 150d. The emitter reference signal functions to tell the shared emitter when to emit light. The detector time reference signals function to indicate to the LIDAR when the shared emitters emitted the light. Each light emitter is operable to emit light rays 125 in a range of directions. Each of the LIDARs is operable to detect light reflections in a FOV, e.g. FOV 140 for LIDAR 150a.
In the embodiment of
Ranging subassembly 400 comprises a reference clock 405 that functions to create a clock signal 407 that forms the basis for both the emitter and detector time reference signals. The clock signal 407 is fed to a light emitter time reference signal generator 410 and a LIDAR time reference signal generator 420. The time reference signal generators can comprise circuitry to amplify, frequency multiply or frequency divide the clock signal 407. For example, an exemplary clock signal may be 40 MHz. A LIDAR may have a time-of-flight chip that requires an 8 MHz clock signal. A design may require a light emitter to modulate at 20 MHz. In this example the clock signal 407 may be divided by 2 to generate emitter time reference signal 415. Emitter time reference signal 415 is then passed through a power-over-coax circuit and transmitted through cable 370c to the shared light emitter. Further in this example, LIDAR time reference signal generator 420 divides the clock signal 407 by a factor of 5 to generate the 8 MHz detector time reference signal 417. It would be obvious to one of skill in the art that the 20 MHz emitter reference signal can be recreated from the detector time reference signal at the LIDAR by knowing the ratio of the emitter and detector time reference signals and using a common clock signal to generate both reference signals. The LIDAR time reference signal generators can be part of an FPGA clock module such as the Clocking Wizard LogiCORE IP available for the Zynq FPGA family from Xilinx of San Jose, Calif.
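The frequency bookkeeping in this example can be summarized with the short sketch below, using the 40 MHz clock, divide-by-2 emitter reference and divide-by-5 detector reference from the text; the variable names are illustrative only.

```python
# Sketch of the example ratios above; names are illustrative, not part of the design.
CLOCK_HZ = 40_000_000        # reference clock 405
EMITTER_DIVIDER = 2          # 40 MHz / 2 -> 20 MHz emitter time reference 415
DETECTOR_DIVIDER = 5         # 40 MHz / 5 -> 8 MHz detector time reference 417

emitter_ref_hz = CLOCK_HZ / EMITTER_DIVIDER
detector_ref_hz = CLOCK_HZ / DETECTOR_DIVIDER

# Because both references derive from the same clock, the LIDAR can recreate the
# emitter reference frequency from the detector reference and the known ratio.
recreated_emitter_hz = detector_ref_hz * (DETECTOR_DIVIDER / EMITTER_DIVIDER)

print(emitter_ref_hz, detector_ref_hz, recreated_emitter_hz)  # 20 MHz, 8 MHz, 20 MHz
```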
Ranging subassembly 400 further comprises a deserializer hub 430 that functions to capture multiple serial streams of time of flight data from several LIDARs. An exemplary deserializer hub is the quad FPD-Link hub DS90UB960 from Texas Instruments Inc. of Dallas, Tex. The deserializer receives high speed data from the LIDARs and transmits commands (e.g. I2C commands) to the LIDARs over a backchannel. In one aspect hub 430 can be commanded from a timing controller 440 to turn off the backchannel signal when the ranging subassembly wants to transmit the detector timing reference signal without interference. This requires closely coordinated timing of when the LIDAR is transmitting data, but it enables a single coaxial cable 370a to carry power to the LIDAR, data from the LIDAR and the detector timing reference signal 417. Timing controller 440 can be a dedicated circuit comprising transistor logic or can be software running on a microprocessor that times when the deserializer is instructed to first tell the serializer in the LIDAR to depower and then turns off the backchannel, thereby freeing up the cable 370 for transmission of the detector time reference signal 417. The time signal combiner and switch 435 is circuitry that functions to combine the detector time reference signal (e.g. an 8 MHz signal) and the serializer link (e.g. a 4.16 GHz signal). Combiner 435 can comprise ferrite filters, inductors, capacitors, diodes and amplifiers for the purpose of combining the two signal paths. POC circuitry 425 functions to add power to the cable 370a to power the LIDAR. 3D location calculator 450 and point cloud generator 460 process the time of flight signals from deserializer hub 430 to represent the 3D locations of light reflections from the shared emitter.
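The control flow of the timing controller described above might be sequenced roughly as follows; the hub driver object and its method names are placeholders invented for this sketch, not a real deserializer API.

```python
# Hypothetical control-flow sketch of timing controller 440; the hub driver and
# its methods are placeholders, not a real device API.
import time

class StubDeserializerHub:
    """Stand-in for the deserializer hub driver, so the sketch can run."""
    def command(self, target, cmd): print(f"command to {target}: {cmd}")
    def disable_backchannel(self): print("backchannel off")
    def enable_backchannel(self): print("backchannel on")
    def enable_time_reference_output(self): print("detector time reference on cable")
    def disable_time_reference_output(self): print("detector time reference off")

def send_detector_time_reference(hub, lidar_serializer, window_s=0.001):
    """Quiet the cable, transmit the detector time reference, then restore the link."""
    hub.command(lidar_serializer, "DEPOWER")     # 1: tell the LIDAR serializer to depower
    hub.disable_backchannel()                    # 2: silence the backchannel
    hub.enable_time_reference_output()           # 3: drive the 8 MHz reference onto the coax
    time.sleep(window_s)                         # e.g. during the TOF integration window
    hub.disable_time_reference_output()
    hub.enable_backchannel()
    hub.command(lidar_serializer, "REPOWER")

send_detector_time_reference(StubDeserializerHub(), "LIDAR serializer")
```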
Operation
A detector time reference signal 720 is transmitted to one or more LIDARs. The function of the detector time reference signal is to enable the LIDAR to determine when received light reflections were generated. To this end, the detector time reference can have pulses (e.g. pulse 722) identical in timing and duration to pulses in the emitter time reference signal. However, in other embodiments the detector time reference signal need not be identical to the emitter time reference signal in order to still provide a time reference for when light pulses were produced. For example, a relationship or transfer function may exist between the emitter and detector time reference signals. The emitter time reference signal may be twice the frequency of the detector time reference signal, as illustrated in region 760. The emitter time reference may have a time delay due to signal propagation, as illustrated by time shift 770. Nevertheless, the detector time reference signal can be used to calculate the time of flight of light reflections from the shared emitters.
Serializer traffic is illustrated as packets 740 and frame start command 730. In one aspect many serializers keep the high speed data channel to the deserializer active even when not sending data. This serializer signal constitutes a source of noise and hence time uncertainty for the detector time reference signal. To improve the quality of the detector time reference signal, the ranging subassembly can instruct the LIDAR to shut off serializer power 750 for a period of time while the detector time reference signal is being transmitted (usually during the integration time of the TOF photodetector). One way to accomplish this is to transmit a first POC voltage level (e.g. VA=15V) from the ranging subassembly 215 during normal serializer operation. When the ranging subassembly wants the serializer at the LIDAR to power down during the transmission of the detector time reference signal, the ranging subassembly can transmit a second POC voltage level VB (e.g. VB=8V). The serializer link switch circuit 550 in
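A minimal sketch of that power-over-coax signalling, using the example levels VA=15V and VB=8V from the text (the comparator threshold is an assumption added for illustration):

```python
# Sketch of the two-level POC signalling; the threshold value is an assumption.
V_NORMAL = 15.0    # VA: normal serializer operation
V_QUIET = 8.0      # VB: serializer powered down while the time reference is sent
THRESHOLD = 11.5   # illustrative comparator threshold between the two levels

def serializer_should_run(poc_voltage_v):
    """Decision taken at the LIDAR end (e.g. by a serializer link switch circuit)."""
    return poc_voltage_v > THRESHOLD

for v in (V_NORMAL, V_QUIET):
    state = "running" if serializer_should_run(v) else "powered down (time-reference window)"
    print(f"POC = {v:.1f} V -> serializer {state}")
```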
3D location sensing system 800 can function to utilize a shared light emitter 810 between multiple LIDARs 850a and 850b. System 800 can solve two challenges to utilize the shared emitter: firstly, determining the location of the shared emitter 810 relative to the LIDARs 850a and 850b, and secondly, determining the timing of when the shared emitter emits light.
System 800 can be mounted on a host platform. Exemplary host platforms include a vehicle, a building, a person, a bridge, a road sign, or a robot. Exemplary host vehicles include a ship, a car, a truck, a train, an airplane, a drone or a bicycle. Shared emitter 810 can be attached to the host platform in a location known to the ranging subassembly 825. In this case the location of the shared emitter relative to LIDARs 850a and 850b can be stored by 3D location sensing system 800.
Alternatively, shared emitter 810 can be movable relative to the two LIDARs 850a and 850b. For example, the shared emitter could be located separate from the host platform (e.g. the host platform is a car and the shared emitter is located on a light pole at an intersection). In an embodiment where shared emitter 810 moves relative to the LIDARs 850a and 850b, a challenge for the ranging subassembly 825 is to calculate the instantaneous position of the shared emitter 810. This can be accomplished by identifying a common object in the range data from the first LIDAR 850a and in the range data from the second LIDAR 850b. This common object could be the shared emitter 810, an object in the local environment (e.g. object “A” 820 or object “B” 815), or a part of the host platform (e.g. a fiducial mark on a fender of a host car).
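One plausible way to perform this computation, sketched below under stated assumptions (matched points of the common object are available from both LIDARs, and the emitter is visible to the second LIDAR), is to register the two LIDAR frames with a standard Kabsch/SVD fit and then express the emitter position in the first LIDAR's frame; this is an illustration, not the disclosed algorithm.

```python
# Illustrative sketch: register two LIDAR frames using a common object, then
# express a shared-emitter position observed by LIDAR B in LIDAR A's frame.
import numpy as np

def rigid_transform(points_a, points_b):
    """Return R, t such that points_a ~= R @ points_b + t (Kabsch/SVD method)."""
    ca, cb = points_a.mean(axis=0), points_b.mean(axis=0)
    H = (points_b - cb).T @ (points_a - ca)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:       # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    return R, ca - R @ cb

# Synthetic check: three matched points of a common object seen by both LIDARs.
R_true = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])   # 90-degree yaw
t_true = np.array([2.0, 0.5, 0.0])
obj_in_b = np.array([[1., 0., 0.], [0., 1., 0.], [1., 1., 0.5]])
obj_in_a = obj_in_b @ R_true.T + t_true
R, t = rigid_transform(obj_in_a, obj_in_b)

emitter_in_b = np.array([3.0, 1.0, 0.2])     # shared emitter as seen by LIDAR B
print(R @ emitter_in_b + t)                  # its position in LIDAR A's frame
```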
Turning to
External light emitter 950 can comprise laser 955 to generate light pulses, scanning mirror 960 to scan the light pulses across a field of view and one or more lenses 965 to focus, collimate or spread the light pulses. In the embodiment of
External Light Emitter Operation
In a first embodiment a LIDAR in a host device can utilize an external LIDAR pulse generator or illuminator using the following steps. Both an external and an internal illuminator can generate light pulses that reflect to the light detector (i.e. light receiver) of the LIDAR, and the location of the external light emitter can thereby be characterized for computing the distance to subsequent objects illuminated by the external light emitter. The LIDAR 920 can generate one or more pulses of light (e.g. pulse 980). This light can reflect from a location (e.g. the center of table 975). The time of arrival of reflection 981 at receiver 930 can be used to determine the distance to the reflection location. Reflections from several directions can be used to generate a 3D image of table 975. External light emitter 950 can generate one or more pulses of light. These pulses can be short discrete pulses (e.g. 5-10 nanoseconds in duration) or a continuous wave of pulses with a pulse frequency (e.g. a pulse frequency of 20-100 MHz). Receiver 930 can receive reflections (e.g. reflection 982) from light pulse 970. These reflections can be from substantially similar locations on table 975 as the reflections from the internal illuminator 940. Reflections (e.g. 982) from the external light emitter 950 can be used to generate a 3D image of table 975 at the LIDAR host device 910. LIDAR host device 910 can be configured to identify and compare a subset of reflection data from the internally generated light pulses and externally generated light pulses. In other embodiments a LIDAR host device could be a cellular telephone, virtual reality goggles, a tablet, a video game console or a display device. The identification of the subset of light reflections corresponding to each illumination source may be based on objects that are well illuminated by each of the internal and external light emitters. The LIDAR host device can estimate a location and pointing direction of the external light emitter by processing the subset of reflections from the external light emitter. For example, both the external and internal illuminators may generate light reflections from the table 975. The LIDAR host device may use the internal reflections and associated timing as a source of truth for the location and distance of the table in a FOV of the LIDAR. The reflection data (e.g. reflection 982) from the external light emitter can be processed by the LIDAR host device to best map the reflections from the table onto the expected location (e.g. range of directions in the FOV) and the expected distance. This processing can generate an expected position and pointing direction of the external light emitter. The LIDAR host device can use the position and direction of the external light emitter to process other reflections from light generated by the external light emitter (e.g. reflections from person 991). These reflections may not have corresponding reflections from the internal illuminator. However, the location and direction of the external light emitter can enable these reflections to be located relative to the LIDAR 920. In this way, this method enables a calibration set of reflections from both an internal and an external light emitter to calibrate or model the location and direction of an external light emitter relative to a LIDAR, and provides a model for calculating 3D locations corresponding to light reflections from light generated by the external light emitter.
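The last step of this method can be illustrated with a small least-squares sketch: given calibration points whose 3D positions are known from the internal illuminator and the bistatic flight times measured for the external emitter's pulses, solve for the emitter position. The solver choice, frame convention and synthetic numbers below are assumptions for illustration only.

```python
# Hedged sketch: estimate the external emitter position from calibration points
# (known from the internal illuminator) and measured bistatic flight times.
import numpy as np
from scipy.optimize import least_squares

C = 299_792_458.0
receiver = np.zeros(3)              # LIDAR receiver at the origin of its own frame

def bistatic_path_m(emitter, points):
    """Path length emitter -> point -> receiver for each calibration point."""
    return (np.linalg.norm(points - emitter, axis=1)
            + np.linalg.norm(points - receiver, axis=1))

def estimate_emitter(points, measured_tof_s, guess):
    residual = lambda e: bistatic_path_m(e, points) - measured_tof_s * C
    return least_squares(residual, guess).x

# Synthetic check: four points on the table measured via the internal illuminator.
table_points = np.array([[2.0, 0.3, -0.5], [2.1, -0.2, -0.5],
                         [1.9, 0.0, -0.4], [2.2, 0.4, -0.6]])
true_emitter = np.array([0.5, 1.5, 0.2])
tofs = bistatic_path_m(true_emitter, table_points) / C
print(estimate_emitter(table_points, tofs, guess=np.array([0.0, 1.0, 0.0])))
```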
In a second embodiment, LIDAR 920 can generate a trigger signal that is operable to instruct an external light emitter to generate one or more light pulses in a plurality of directions (e.g. one or more flashes in a field of view). The trigger signal can be a radio wave (e.g. wireless radio signal) or an optical signal. The external light emitter can generate a corresponding response signal (e.g. radio wave 993 or optical signal 985). The LIDAR host device can use a measure of the phase relationship or time difference between the outgoing trigger signal and the response signal to estimate the distance of the external light emitter from the LIDAR host device. Optical signal 985 can be a light pulse or a periodic light signal (e.g. an 850 nm infrared light emitted with a 20 MHz modulation frequency). The optical signal 985 can be part of one or more light pulses generated by the external light emitter 950 for ranging objects. Optical signal 985 can travel directly from the external light emitter to the LIDAR and can function to indicate to LIDAR 920 a location of the external light emitter 950 in the FOV of the LIDAR 920. For example, LIDAR 920 may receive a strong infrared signal at one or more pixels in a light detector array in light receiver 930, indicating the presence of an external light emitter 950 in a corresponding portion of the FOV. Optical signal 985 can be phase locked to a timing or trigger signal from the LIDAR or LIDAR host device (e.g. a 20 MHz radio signal). This phase relationship between an optical signal 985 and a trigger signal from the LIDAR or LIDAR host device can be used by the LIDAR host device to estimate the distance of the external light emitter along the direction provided by the optical signal 985. In this way the optical signal can provide both a direction and a distance (which combined can form a location) of the external light emitter relative to the LIDAR.
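As an arithmetic illustration of this estimate (assuming negligible turnaround latency at the external light emitter and a phase unwrapped within one modulation period), the phase difference between the outgoing trigger and the returned signal maps to distance as follows:

```python
# Illustrative phase-to-distance conversion at a 20 MHz modulation frequency;
# assumes zero emitter turnaround latency and phase within one period.
import math

C = 299_792_458.0

def emitter_distance_m(phase_rad, mod_freq_hz=20e6):
    """Round-trip phase difference -> one-way distance; unambiguous only up to
    c / (2 * f), i.e. about 7.5 m at 20 MHz."""
    round_trip_s = phase_rad / (2.0 * math.pi * mod_freq_hz)
    return C * round_trip_s / 2.0

print(emitter_distance_m(math.pi / 2))   # ~1.87 m for a 90-degree phase shift
```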
In a related embodiment external light emitter 950 can be configured to transmit a radio signal 993 when it transmits a light pulse (e.g. pulse 970). This radio signal 993 can be used by the LIDAR or LIDAR host device to identify when the light emitter has generated a light pulse.
In one aspect the list of focus positions can be ordered to enable the focusable lens assembly 1020 to more easily step from one focus position to another. For example, in list 1025, the focus distances are ordered from nearest to furthest from the camera. In one embodiment a sensor data processor can identify person 1030 as an object of interest and iteratively update a focus position 1026 to best focus on person 1030 while stepping through the sequence 1025 of focus positions. Similarly, focus position 1028 can be selected based on identifying a vehicle 1040 with sensor data. In one embodiment a sensor data processor processes sensor data (e.g. camera, LIDAR or RADAR data) and thereby updates a sequence 1025 of focus positions. Camera 1010 modifies the focusable lens assembly 1020 to focus on each of the focus positions in sequence 1025 in order. For each focus position camera 1010 gathers one or more images. In one embodiment camera 1010 can gather data from a specific region of interest (i.e. a ROI containing the object being focused on). One of the focus distances 1028 can correspond to a remote mirror 1050 that is operable to provide a remote FOV to camera 1010 (e.g. a side view mirror operable to show camera 1010 objects outside the FOV of camera 1010). Remote mirror 1050 can have features 1060 operable to identify remote mirror 1050 as a remote mirror. Features 1060 can be reflectors, beacons, or features with a distinctive shape. Remote mirror 1050 may also have a remote mirror positioner operable to receive messages and to alter the position of remote mirror 1050 based on these messages. Remote mirror 1050 can have a relatively constant focus distance 1028 from camera 1010. In this way focus distance list 1025 can comprise some focus distances that change as objects move in the FOV (e.g. person 1030) and some focus distances corresponding to objects (e.g. remote mirror 1050) that are stationary or fixed in position. The sensor data processor can be configured to process data from movable objects to update their focus positions while not updating the focus position corresponding to the remote mirror 1050. The camera 1010 can step through the combined list of fixed and variable focus positions in order to provide the sharpest focus on each of a plurality of objects.
In one advantage, focusing the camera accurately on the remote mirror provides a much more usable remote FOV from the remote mirror. In another embodiment LIDAR data is processed to identify the list of focus distances 1025 and the camera 1010 is stepped through these focus distances in a defined order (e.g. from furthest to nearest or from nearest to furthest). Similarly, each of the focus distances can be assigned an importance (e.g. a score between 0 and 100). Camera 1010 can step through the focus distances in list 1025 in order of importance (e.g. from highest importance score to lowest). The sensor data processor can calculate a relative velocity and/or trajectory for one or more targets in the field of view and modify one, some or all of the focus distances to match the expected location of one or more of the objects during subsequent image capture by camera 1010. For example, camera 1010 may be mounted to an AV travelling at 10 meters per second towards a stationary person 1030 and a car travelling at 15 meters per second. The sensor data processor can calculate subsequent sequences of focus distances based on the respective relative speeds of the person 1030 and vehicle 1040 while not changing the focus distance of a remote mirror 1050 mounted to the AV. In another embodiment camera 1010 can be controllable to point in different directions. Each focus distance in the sequence of focus distances can also include a pointing direction (e.g. focus distance 1026 associated with person 1030 can have an associated pointing direction towards person 1030). The sequence of focus positions can further serve to steer the camera to the corresponding pointing direction.
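The bookkeeping described in the last two paragraphs might look roughly like the sketch below; the data fields, update rule and example numbers are assumptions added for illustration.

```python
# Hedged sketch of a focus-position sequence with fixed and variable entries.
from dataclasses import dataclass

@dataclass
class FocusEntry:
    label: str
    distance_m: float
    importance: int                 # e.g. 0-100; higher is captured first
    fixed: bool = False             # e.g. a remote mirror with constant distance
    closing_speed_mps: float = 0.0  # relative speed toward the camera (assumed)

def predict(entries, dt_s):
    """Advance variable focus distances to their expected values at the next
    capture; fixed entries (e.g. the remote mirror) are left unchanged."""
    for e in entries:
        if not e.fixed:
            e.distance_m = max(0.0, e.distance_m - e.closing_speed_mps * dt_s)
    return entries

def capture_order(entries):
    """Return entries sorted by importance, highest first."""
    return sorted(entries, key=lambda e: e.importance, reverse=True)

sequence = [FocusEntry("person 1030", 20.0, 90, closing_speed_mps=10.0),
            FocusEntry("vehicle 1040", 35.0, 70, closing_speed_mps=15.0),
            FocusEntry("remote mirror 1050", 1.2, 50, fixed=True)]

for entry in capture_order(predict(sequence, dt_s=0.1)):
    print(f"focus at {entry.distance_m:4.1f} m -> {entry.label}")
```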
In another invention, a flash LIDAR can illuminate a FOV in a periodic manner with an offset to the periodicity that is determined at least in part by the direction that the LIDAR is pointing. Scanning LIDARs scan a laser through a field of view, and the probability of shining the laser at another LIDAR is low due to the scanning motion. However, a flash LIDAR illuminates a plurality of directions at once. This increases the potential for a flash LIDAR to interfere with another LIDAR in the local environment. One way to address this challenge is illustrated in
In one aspect this time offset can be based on the direction that LIDAR 1120 is pointing. In the example of
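One simple mapping from pointing direction to flash offset, offered here only as an illustrative assumption, spreads headings of 0-360 degrees across one flash period so that, for example, two flash LIDARs facing each other are offset by half a period:

```python
# Minimal sketch: direction-dependent offset within a fixed flash period.
# The linear heading-to-offset mapping is an assumption for illustration.

def flash_offset_s(heading_deg, period_s=0.02):
    """Map a pointing direction (degrees) onto an offset within one flash period."""
    return (heading_deg % 360.0) / 360.0 * period_s

def next_flash_time_s(now_s, heading_deg, period_s=0.02):
    """Next flash instant on the offset periodic schedule."""
    offset = flash_offset_s(heading_deg, period_s)
    cycles = int((now_s - offset) // period_s) + 1
    return cycles * period_s + offset

# A north-facing and a south-facing flash LIDAR end up half a period apart.
print(flash_offset_s(0.0), flash_offset_s(180.0))   # 0.0 and 0.01 seconds
```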
In a related embodiment the serializer can be configured to generate the commands to the illuminator for either the camera operation, the LIDAR operation, or both the camera and LIDAR operation. In a related embodiment the serializer generates a trigger signal for the camera receiver and a trigger signal for the LIDAR receiver that are timed relative to one another such that a trigger signal to the illuminator 1350 can provide light photons while the LIDAR receiver is receptive and while the camera receiver is receptive to sensing photons from the illuminator. In one embodiment the trigger signals to the camera, LIDAR and illuminator can be simultaneous. One of the camera receiver chip or LIDAR receiver chip can be configured to delay transmitting sensor data for a period of time long enough to enable the other chip (i.e. the other camera or LIDAR chip) to first transmit sensor data to the serializer. For example, both the camera chip and the LIDAR receiver may be triggered simultaneously along with the illuminator. The camera can acquire photons for 10 ms and then delay 50 ms before transmitting the image data via a CSI-2 link to the serializer in order to allow time for the LIDAR receiver to first transmit depth data to the serializer via the CSI-2 bus. Hence a camera and a LIDAR can use a common illuminator and a common serializer, allowing a smaller package and lower cost.
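The staggered use of the shared serializer link might be sequenced as in the following trace-style sketch; the device objects and their methods are placeholders, not a real camera, LIDAR or serializer API.

```python
# Control-flow sketch of the shared-illuminator scheme; all interfaces are stubs.
import time

class StubDevice:
    """Placeholder standing in for a camera, LIDAR receiver or illuminator."""
    def __init__(self, name): self.name = name
    def log(self, msg): print(f"{self.name}: {msg}")

def shared_exposure(camera, lidar, illuminator, camera_tx_delay_s=0.050):
    for dev in (illuminator, lidar, camera):   # simultaneous trigger to all three
        dev.log("triggered")
    illuminator.log("flash")
    lidar.log("integrate, then transmit depth data on the shared CSI-2 link")
    time.sleep(camera_tx_delay_s)              # camera holds back its readout (~50 ms)
    camera.log("transmit image data after the LIDAR has used the link")

shared_exposure(StubDevice("camera"), StubDevice("LIDAR receiver"),
                StubDevice("illuminator 1350"))
```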
While the above description contains many specificities, these should not be construed as limitations on the scope of any embodiment, but as exemplifications of various embodiments thereof. Many other ramifications and variations are possible within the teachings of the various embodiments. Thus the scope should be determined by the appended claims and their legal equivalents, and not by the examples given.
Any of the methods (including user interfaces) described herein may be implemented as software, hardware or firmware, and may be described as a non-transitory computer-readable storage medium storing a set of instructions capable of being executed by a processor (e.g., computer, tablet, smartphone, etc.), that when executed by the processor cause the processor to control or perform any of the steps, including but not limited to: displaying, communicating with the user, analyzing, modifying parameters (including timing, frequency, intensity, etc.), determining, alerting, or the like.
When a feature or element is herein referred to as being “on” another feature or element, it can be directly on the other feature or element or intervening features and/or elements may also be present. In contrast, when a feature or element is referred to as being “directly on” another feature or element, there are no intervening features or elements present. It will also be understood that, when a feature or element is referred to as being “connected”, “attached” or “coupled” to another feature or element, it can be directly connected, attached or coupled to the other feature or element or intervening features or elements may be present. In contrast, when a feature or element is referred to as being “directly connected”, “directly attached” or “directly coupled” to another feature or element, there are no intervening features or elements present. Although described or shown with respect to one embodiment, the features and elements so described or shown can apply to other embodiments. It will also be appreciated by those of skill in the art that references to a structure or feature that is disposed “adjacent” another feature may have portions that overlap or underlie the adjacent feature.
Terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. For example, as used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items and may be abbreviated as “/”.
Spatially relative terms, such as “under”, “below”, “lower”, “over”, “upper” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if a device in the figures is inverted, elements described as “under” or “beneath” other elements or features would then be oriented “over” the other elements or features. Thus, the exemplary term “under” can encompass both an orientation of over and under. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly. Similarly, the terms “upwardly”, “downwardly”, “vertical”, “horizontal” and the like are used herein for the purpose of explanation only unless specifically indicated otherwise.
Although the terms “first” and “second” may be used herein to describe various features/elements (including steps), these features/elements should not be limited by these terms, unless the context indicates otherwise. These terms may be used to distinguish one feature/element from another feature/element. Thus, a first feature/element discussed below could be termed a second feature/element, and similarly, a second feature/element discussed below could be termed a first feature/element without departing from the teachings of the present invention.
Throughout this specification and the claims which follow, unless the context requires otherwise, the word “comprise”, and variations such as “comprises” and “comprising”, means that various components can be conjointly employed in the methods and articles (e.g., compositions and apparatuses including devices and methods). For example, the term “comprising” will be understood to imply the inclusion of any stated elements or steps but not the exclusion of any other elements or steps.
In general, any of the apparatuses and methods described herein should be understood to be inclusive, but all or a sub-set of the components and/or steps may alternatively be exclusive, and may be expressed as “consisting of” or alternatively “consisting essentially of” the various components, steps, sub-components or sub-steps.
As used herein in the specification and claims, including as used in the examples and unless otherwise expressly specified, all numbers may be read as if prefaced by the word “about” or “approximately,” even if the term does not expressly appear. The phrase “about” or “approximately” may be used when describing magnitude and/or position to indicate that the value and/or position described is within a reasonable expected range of values and/or positions. For example, a numeric value may have a value that is +/−0.1% of the stated value (or range of values), +/−1% of the stated value (or range of values), +/−2% of the stated value (or range of values), +/−5% of the stated value (or range of values), +/−10% of the stated value (or range of values), etc. Any numerical values given herein should also be understood to include about or approximately that value, unless the context indicates otherwise. For example, if the value “10” is disclosed, then “about 10” is also disclosed. Any numerical range recited herein is intended to include all sub-ranges subsumed therein. It is also understood that when a value is disclosed, “less than or equal to” the value, “greater than or equal to” the value, and possible ranges between values are also disclosed, as appropriately understood by the skilled artisan. For example, if the value “X” is disclosed, then “less than or equal to X” as well as “greater than or equal to X” (e.g., where X is a numerical value) is also disclosed. It is also understood that, throughout the application, data is provided in a number of different formats, and that this data represents endpoints and starting points, and ranges for any combination of the data points. For example, if a particular data point “10” and a particular data point “15” are disclosed, it is understood that greater than, greater than or equal to, less than, less than or equal to, and equal to 10 and 15 are considered disclosed, as well as between 10 and 15. It is also understood that each unit between two particular units is also disclosed. For example, if 10 and 15 are disclosed, then 11, 12, 13, and 14 are also disclosed.
Although various illustrative embodiments are described above, any of a number of changes may be made to various embodiments without departing from the scope of the invention as described by the claims. For example, the order in which various described method steps are performed may often be changed in alternative embodiments, and in other alternative embodiments one or more method steps may be skipped altogether. Optional features of various device and system embodiments may be included in some embodiments and not in others. Therefore, the foregoing description is provided primarily for exemplary purposes and should not be interpreted to limit the scope of the invention as it is set forth in the claims.
The examples and illustrations included herein show, by way of illustration and not of limitation, specific embodiments in which the subject matter may be practiced. As mentioned, other embodiments may be utilized and derived there from, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. Such embodiments of the inventive subject matter may be referred to herein individually or collectively by the term “invention” merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept, if more than one is, in fact, disclosed. Thus, although specific embodiments have been illustrated and described herein, any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.
| Number | Date | Country |
| --- | --- | --- |
| 63243186 | Sep 2021 | US |
| 63355650 | Jun 2022 | US |