The present disclosure relates to methods, mechanisms, and systems for altering apertures of optical systems.
An optical system, which may be used for an autonomous vehicle, is provided. The optical system includes a camera configured to capture one or more images. The camera includes an adaptive aperture plane, which is configured to provide an adjustable aperture for the camera. The adaptive aperture plane is configured to change both an aperture size and an aperture shape in response to an aperture signal.
The camera also includes a first polarized surface on a first side of the adaptive aperture plane, relative to light passage. A first lens and a second lens are positioned within the light flow path. A second polarized surface is located on a second side of the adaptive aperture plane, relative to light passage, such that light strikes the first polarized surface, then the adaptive aperture plane, then the second polarized surface.
An image sensor is located beyond the second polarized surface and is configured to output one or more image signals. A processor is operatively configured to execute one or more image perception algorithms based on the image signals from the image sensor. The image perception algorithms alter the aperture size and the aperture shape by sending the aperture signal from the processor to the camera for subsequent captured images. When used with an autonomous vehicle, the captured images from the camera may be used to control movement of the autonomous vehicle.
In some configurations of the optical system, the image perception algorithms interact with a stored library of shapes to determine relevant shapes in the captured images, such that the image perception algorithms recognize object geometries based on the library of shapes. The adaptive aperture plane may be formed by, for example, a liquid crystal element or a digital micromirror device. The optical system may have a transmissive alignment, such that the first lens, the first polarized surface, the adaptive aperture plane, the second polarized surface, the second lens, and the image sensor are substantially aligned.
Additionally, the optical system may have a reflective alignment that includes a mirror. In the reflective alignment, the first lens is substantially aligned with the mirror and the adaptive aperture plane, and the first lens is at an angle of approximately 90 degrees relative to the second lens and the image sensor. Furthermore, the first polarized surface and the second polarized surface are at an angle of between 40 and 50 degrees relative to the first lens, the second lens, and the image sensor.
A method of controlling an optical system for an autonomous vehicle is also provided, and includes capturing one or more images with the optical system, which includes an adaptive aperture plane configured with a changeable aperture size and aperture shape in response to an aperture signal. The method may execute an image perception algorithm on the captured images, such that the image perception algorithm recognizes at least one of pedestrians or other vehicles in the captured images.
The method may further execute an aperture control algorithm on the captured images, such that the aperture control algorithm analyzes a scene of the captured images. The method determines whether the aperture size or aperture shape should change with either the image perception algorithm or the aperture control algorithm. If the aperture size or aperture shape needs to be modified, the method sends the aperture signal from, for example, a voltage controller to adjust the adaptive aperture plane and capture subsequent images. If the aperture size or aperture shape does not need to be modified, the method captures subsequent images. Movement of the autonomous vehicle may be controlled based on the captured images.
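By way of a non-limiting illustration only, the control flow summarized above can be sketched in Python. Every name below (the capture and send_aperture_signal interfaces and the stub algorithms) is a hypothetical placeholder assumed for this sketch, not an implementation prescribed by this disclosure.

```python
from typing import Optional, Tuple

Aperture = Tuple[float, str]  # (size, shape) -- illustrative representation only

def perceive(image) -> list:
    """Placeholder image perception algorithm (e.g., recognizing pedestrians or other vehicles)."""
    return []

def analyze_scene(image) -> dict:
    """Placeholder aperture control algorithm that analyzes a scene of the captured image."""
    return {}

def decide_aperture(objects: list, scene: dict) -> Optional[Aperture]:
    """Return a new (aperture size, aperture shape) if a change is needed, else None."""
    return None

def control_loop(camera, controller) -> None:
    """Capture images, run both algorithms, and send the aperture signal when needed."""
    while True:
        image = camera.capture()                  # capture one or more images
        objects = perceive(image)                 # image perception algorithm
        scene = analyze_scene(image)              # aperture control algorithm
        change = decide_aperture(objects, scene)
        if change is not None:
            controller.send_aperture_signal(change)  # e.g., via a voltage controller
        # subsequent images are captured on the next iteration
```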
In some configurations, the method includes determining shapes in the captured images by comparing shapes in the captured images to a library of shapes, with the image perception algorithm or the aperture control algorithm. The aperture size or aperture shape may be modified based on the determined shapes.
The above features and advantages and other features and advantages of the present disclosure are readily apparent from the following detailed description of the best modes for carrying out the disclosure when taken in connection with the accompanying drawings.
Referring to the drawings, like reference numbers refer to similar components, wherever possible. The description of each figure may refer to features shown in any of the other figures.
The autonomous vehicle 12 includes one or more sensor pods 14, one of which may house the optical system 10. Note, however, that the optical system 10 may be located anywhere that would provide some benefit for the autonomous vehicle 12. Additionally, while the sensor pod 14 is shown near the dashboard of the autonomous vehicle 12, it may be located elsewhere. For example, and without limitation, the sensor pod 14 may be located on the exterior or interior of the roof of the autonomous vehicle 12. Furthermore, there may be additional sensor pods 14. Note that the optical system 10 may be used independently of the autonomous vehicle 12. In addition to the optical system 10, the autonomous vehicle 12 may have numerous other sensors, including, without limitation: light detection and ranging (LiDAR), infrared, sonar, or inertial measurement units.
A generalized control system or controller is operatively in communication with components of, at least, the optical system 10, the autonomous vehicle 12, or the sensor pod 14, and is configured to execute any of the methods, processes, and algorithms described herein. The controller includes, for example and without limitation, a non-generalized, electronic control device having a preprogrammed digital computer or processor, a memory, storage, or non-transitory computer-readable medium used to store data such as control logic, instructions, lookup tables, etc., and a plurality of input/output peripherals, ports, or communication protocols. The controller is configured to execute or implement all control logic or instructions described herein and may be in communication with any of the sensors described herein or recognizable by skilled artisans.
Furthermore, the controller may include, or be in communication with, a plurality of sensors, including, without limitation, those configured to inform the movement or actions of the autonomous vehicle 12. Numerous additional systems may be used in controlling and determining movement of the autonomous vehicle 12, as will be recognized by those having ordinary skill in the art. The controller may be dedicated to the specific aspects of the autonomous vehicle 12 described herein, or the controller may be part of a larger control system that manages numerous functions of the autonomous vehicle 12.
The drawings and figures presented herein are diagrams, are not to scale, and are provided purely for descriptive and supportive purposes. Thus, any specific or relative dimensions or alignments shown in the drawings are not to be construed as limiting. While the disclosure may be illustrated with respect to specific applications or industries, those skilled in the art will recognize the broader applicability of the disclosure. Those having ordinary skill in the art will recognize that terms such as “above,” “below,” “upward,” “downward,” et cetera, are used descriptively of the figures, and do not represent limitations on the scope of the disclosure, as defined by the appended claims. Any numerical designations, such as “first” or “second,” are illustrative only and are not intended to limit the scope of the disclosure in any way. Any use of the term, “or,” whether in the specification or claims, is inclusive of any specific element referenced and also includes any combination of the elements referenced, unless otherwise explicitly stated.
Features shown in one figure may be combined with, substituted for, or modified by, features shown in any of the figures. Unless stated otherwise, no features, elements, or limitations are mutually exclusive of any other features, elements, or limitations. Furthermore, no features, elements, or limitations are absolutely required for operation. Any specific configurations shown in the figures are illustrative only and the specific configurations shown are not limiting of the claims or the description.
All numerical values of parameters (e.g., of quantities or conditions) in this specification, including the appended claims, are to be understood as being modified in all instances by the term “about,” whether or not the term actually appears before the numerical value. “About” indicates that the stated numerical value allows some slight imprecision (with some approach to exactness in the value; about or reasonably close to the value; nearly). If the imprecision provided by “about” is not otherwise understood in the art with this ordinary meaning, then “about” as used herein indicates at least variations that may arise from ordinary methods of measuring and using such parameters. In addition, disclosure of ranges includes disclosure of all values and further divided ranges within the entire range. Each value within a range and the endpoints of a range are hereby all disclosed as separate embodiments.
When used, the term “substantially” refers to relationships that are ideally perfect or complete, but where manufacturing realities prevent absolute perfection. Therefore, “substantially” denotes typical variance from perfection. For example, if height A is substantially equal to height B, it may be preferred that the two heights are 100.0% equivalent, but manufacturing realities likely result in the heights varying from such perfection. Skilled artisans will recognize the amount of acceptable variance. For example, and without limitation, coverages, areas, or distances may generally be within 10% of perfection for substantial equivalence. Similarly, relative alignments, such as parallel or perpendicular, may generally be considered to be within 5%.
The autonomous vehicle 12 may have a communications system that is capable of sharing information determined by the controller, for example, or other parts of the autonomous vehicle 12 with locations outside of the autonomous vehicle 12. For example, and without limitation, the communications system may include cellular or Wi-Fi technology that allows signals to be sent to centralized locations, such as clouds or communications networks. It is envisioned that the methods and mechanisms described herein may occur on the autonomous vehicle 12, in a cloud system, a combination of both, or via other computational systems, such that the controller, or functions thereof, may be executed externally to the autonomous vehicle 12.
As schematically illustrated in the drawings, the optical system 10 includes a camera 16 configured to capture one or more images.
The camera 16 includes many components for operation, some of which are illustrated in the drawings. Among these is an adaptive aperture plane 20, which is configured to provide an adjustable aperture for the camera 16 and to change both an aperture size and an aperture shape in response to an aperture signal.
The adaptive aperture plane 20 may be controlled by, for example and without limitation, a voltage controller, which may be incorporated into several of the components shown and described. Other control mechanisms for the adaptive aperture plane 20 will be recognized by skilled artisans.
The camera 16 includes a first polarizer or first polarized surface 22 on a first side of the adaptive aperture plane 20. All references to alignment and/or direction are substantially relative to light flow or light passage through the camera 16. A first lens 24 is located before the first polarized surface 22.
The camera 16 includes a second polarizer or second polarized surface 26 on a second side of the adaptive aperture plane 20, opposite the first polarized surface 22. A second lens 28 is located after the second polarized surface 26. Note that all references to any lens include lens groups having one or more lenses cooperating to modify light passage within the camera 16.
All lenses are shown highly schematically, such that light flow may not be representative of actual changes through the illustrated lenses. Note that the layout of elements for the camera 16 in the drawings illustrates a transmissive alignment, as discussed further herein.
The camera 16 includes an image sensor 30 in communication with an image signal processor 40. The image sensor 30 is located beyond the second polarized surface 26 and is configured to output one or more image signals. The image sensor 30 and the image signal processor 40 may be combined into the same, or closely related, hardware. The image signal processor 40 may be referred to as an ISP, and is dedicated hardware used to process the sensor image to produce the final output, such as, for example and without limitation, JPEG images. Note that it is also possible to perform operations commonly executed on an ISP on a CPU or GPU. The image signal processor 40 and the image sensor 30 may be referred to interchangeably herein.
A camera processor 42 is operatively configured to execute one or more image perception algorithms based on the image signals from the image signal processor 40. In some configurations, and without limitation, the image signal processor 40 and the camera processor 42 may be integrated into the same hardware, different hardware, or combinations thereof. However, the description will refer to the processors separately, as they may, or may not, be executing different functions and/or algorithms.
The image perception algorithms of the camera processor 42 may be used to alter the aperture size and the aperture shape by sending the aperture signal from the camera processor 42, or through other components, to the camera 16. The autonomous vehicle 12 may use the images processed by the image perception algorithms to control the path, and general movement, of the autonomous vehicle 12. Any of the functions of the image signal processor 40, the camera processor 42, or both, may be conducted by the generalized control system or controller for the autonomous vehicle 12. Those having ordinary skill in the art will recognize different image perception algorithms usable for the optical system 10 and the autonomous vehicle 12, including, without limitation, machine vision algorithms, robotic navigation algorithms, machine learning, or computer vision algorithms.
In some configurations of the optical system 10, and as illustrated in the flow chart described herein, the image perception algorithms interact with a stored library of shapes to determine relevant shapes in the captured images, such that the image perception algorithms recognize object geometries based on the library of shapes.
In the optical system 10, the adaptive aperture plane 20 may be formed by, for example and without limitation, a liquid crystal (LC) element, a digital micromirror device, or combinations thereof. Skilled artisans will recognize additional structures capable of providing an adaptive aperture plane 20 configured to change both the aperture size and the aperture shape to form different aperture openings 21, as schematically illustrated in the drawings.
Where the adaptive aperture plane 20 is formed from a liquid crystal (LC) device, it can create a practically unlimited variety of aperture shapes. The LC device is made up of a pixelated array of LC cells, where each pixel controls the optical polarization phase of its LC cell via a drive voltage applied to that specific cell. Each cell corresponds to one pixel, and there may be hundreds to thousands of pixels across the adaptive aperture plane 20.
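As a hedged illustration of how such a pixelated array might be addressed, the following Python sketch builds a per-cell mask for a circular aperture opening of adjustable size and position. The array dimensions and the mapping of blocked cells to a half-wave drive voltage are assumptions for illustration, not a particular device's interface.

```python
import numpy as np

def circular_aperture_mask(rows: int, cols: int, radius: float,
                           center=None) -> np.ndarray:
    """Return a boolean mask over the LC pixel array: True where light
    should pass, False where the cell should block light."""
    if center is None:
        center = (rows / 2.0, cols / 2.0)
    r, c = np.ogrid[:rows, :cols]
    return (r - center[0]) ** 2 + (c - center[1]) ** 2 <= radius ** 2

# Example: a 480 x 640 cell array with a circular opening 100 cells in radius.
mask = circular_aperture_mask(480, 640, radius=100.0)

# Hypothetical voltage mapping for the parallel-polarizer configuration
# described below: open cells receive no drive voltage, blocked cells
# receive the half-wave voltage. The 5.0 V figure is illustrative only.
V_HALF_WAVE = 5.0
drive_voltages = np.where(mask, 0.0, V_HALF_WAVE)
```

An arbitrary aperture shape may be produced the same way by rasterizing any desired outline into the boolean mask.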
The light intensity passing through the adaptive aperture plane 20 is controlled by the voltage, which in turn changes the polarization phase of the light transmitted through the cell. For example, and without limitation, to block light, the first polarized surface 22 will pass linearly polarized light in one direction, such as horizontal, as recognized by skilled artisans. The second polarized surface 26 after the LC device is oriented in the same direction.
Therefore, if a voltage is applied that rotates the polarization from horizontal to vertical, such as a half-wave phase, the light transmitted through the adaptive aperture plane 20 will be vertically polarized and cannot pass through the horizontally oriented second polarized surface 26. If one wants light to pass through, no voltage phase should be applied to the LC device, such that no change to the transmitted light is induced; alternatively, the voltage applied may impose a full wave of phase, or multiples thereof. In a crossed-polarizer configuration, where the second polarized surface 26 is instead oriented in the vertical direction, the relationship inverts: to transmit light, the applied voltage rotates the light from horizontal to vertical.
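For context, both configurations follow from a standard retarder relation (a conventional optics result, not specific to this disclosure). Modeling each LC cell as a variable wave plate with its optic axis assumed at 45 degrees to the first polarized surface 22, the transmitted intensity fractions are

$$T_{\parallel} = \cos^{2}\!\left(\frac{\Gamma}{2}\right), \qquad T_{\perp} = \sin^{2}\!\left(\frac{\Gamma}{2}\right),$$

where $\Gamma$ is the voltage-dependent retardance, $T_{\parallel}$ applies between parallel polarizers, and $T_{\perp}$ between crossed polarizers. A half-wave drive ($\Gamma = \pi$) therefore blocks light between parallel polarizers and transmits it between crossed polarizers, while zero or full-wave drive ($\Gamma = 0, 2\pi, \ldots$) does the opposite, consistent with the two configurations described above.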
As schematically illustrated in the drawings, the optical system 10 may have a transmissive alignment, such that the first lens 24, the first polarized surface 22, the adaptive aperture plane 20, the second polarized surface 26, the second lens 28, and the image sensor 30 are substantially aligned.
Alternatively, as schematically illustrated in the drawings, an optical system may have a reflective alignment, shown with respect to a camera 66 having an adaptive aperture plane 70 disposed between a first polarized surface 72 and a second polarized surface 76.
A first lens 74 is substantially aligned with a mirror 82 and the adaptive aperture plane 70. The first lens 74 is at an angle of about 90 degrees relative to a second lens 78 and an image sensor 80. The first polarized surface 72 and the second polarized surface 76 are at an angle of between 40 and 50 degrees relative to the first lens 74, the second lens 78, and the image sensor 80. The mirror 82 reflects light passing through the first polarized surface 72 and the adaptive aperture plane 70 back toward the second polarized surface 76.
Note that the first polarized surface 72 may be configured such that light passes through to be selectively blocked by the adaptive aperture plane 70. However, the second polarized surface 76 may be configured to reflect the selectively polarized light downward toward the second lens 78 and the image sensor 80. Any of the polarized surfaces discussed herein may be, for example and without limitation, linear or circular polarizing filters, and the specific benefits of each will be recognized by those having ordinary skill in the art. Furthermore, any of the polarized surfaces discussed herein may be, for example and without limitation, dichroic, reflective, birefringent, thin film, or combinations thereof.
All lenses are shown highly schematically, such that light flow may not be representative of actual changes through the illustrated lenses. Note that the layout of elements for the camera 66 in the drawings illustrates a reflective alignment, as discussed herein.
Step 110: Start/Capture Next Image. At step 110 the method 100 initializes or starts by capturing one or more images with the optical system 10, such as with either the camera 16 or the camera 66, or another digital camera device. The method 100 may begin operation when called upon by the controller, may be constantly running, or may be looping iteratively.
Furthermore, the method 100 may be carried out by the image signal processor 40, the camera processor 42, both processors, or may be conducted by another generalized control system or controller. Several of the steps may move, depending on the configuration, between the image signal processor 40 and the camera processor 42, which is likely part of the camera 16 or the camera 66.
Step 112: Aperture Control Algorithm. The method 100 executes one or more aperture control algorithms on the captured images. The aperture control algorithms may provide several features, but at a minimum analyze a scene of the captured images.
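Purely as a hedged example of such scene analysis, an aperture control algorithm might compute exposure statistics over the captured image and suggest an aperture-size change; the thresholds below are illustrative assumptions, not values prescribed by this disclosure.

```python
import numpy as np

def scene_exposure_stats(gray: np.ndarray) -> dict:
    """Simple exposure statistics over a grayscale image (values 0-255)."""
    return {
        "mean_luminance": float(gray.mean()),
        "clipped_high": float((gray >= 250).mean()),  # fraction near saturation
        "clipped_low": float((gray <= 5).mean()),     # fraction near black
    }

def suggest_aperture_scale(stats: dict) -> float:
    """Multiplicative aperture-size suggestion (>1 opens, <1 closes).
    All thresholds are illustrative values only."""
    if stats["clipped_high"] > 0.05:    # too much of the scene saturates
        return 0.8                      # shrink the aperture opening
    if stats["mean_luminance"] < 60.0:  # scene is dark
        return 1.25                     # enlarge the aperture opening
    return 1.0                          # leave the aperture unchanged
```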
Step 114: Perception Algorithms. The method 100 executes one or more image perception algorithms on the captured images. The image perception algorithms analyze the captured images in order to identify relevant objects. For example, and without limitation, the image perception algorithms may recognize at least pedestrians or other vehicles in the captured images. Where the optical system 10 is operating on an autonomous vehicle 12, the image perception algorithms may also be used to determine control (i.e., direction, speed, movement) of the autonomous vehicle 12 in conjunction with its other sensors and systems.
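As a hedged sketch only, the perception step might wrap any object detector and retain the classes relevant to vehicle control; the Detection type and the detector interface below are hypothetical placeholders rather than a particular perception algorithm.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Detection:
    label: str    # e.g., "pedestrian" or "vehicle"
    score: float  # detector confidence in [0, 1]
    box: tuple    # (x, y, width, height) in pixels

RELEVANT_CLASSES = {"pedestrian", "vehicle"}

def perceive(image, detector: Callable[[object], List[Detection]],
             min_score: float = 0.5) -> List[Detection]:
    """Run a generic detector and keep confident, control-relevant objects."""
    return [d for d in detector(image)
            if d.label in RELEVANT_CLASSES and d.score >= min_score]
```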
Optional Step 120: Library of Shapes. In some configurations, the method 100 utilizes a library of shapes to assist in identifying relevant shapes in the captured images. Therefore, the method 100 determines shapes in the captured images by comparing shapes in the captured images to the library of shapes. This may occur via the image perception algorithm, the aperture control algorithm, both, or alternative algorithms. The library of shapes may be prepopulated with known shapes that are recognizable via machine image analysis. The library of shapes may communicate back and forth with both the perception algorithms and the aperture control algorithms. One conventional matching approach is sketched below.
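One conventional way to compare shapes in a captured image against a stored template library (offered as a hedged example; the disclosure does not prescribe a particular matcher) is Hu-moment contour matching via OpenCV's cv2.matchShapes:

```python
import cv2
import numpy as np

def best_library_match(binary: np.ndarray, library: dict):
    """Compare the largest contour in a binary (uint8) image against a
    library mapping shape names to template contours; a lower
    matchShapes distance indicates a closer geometric match."""
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None, float("inf")
    contour = max(contours, key=cv2.contourArea)
    best_name, best_dist = None, float("inf")
    for name, template in library.items():
        dist = cv2.matchShapes(contour, template,
                               cv2.CONTOURS_MATCH_I1, 0.0)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name, best_dist
```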
Step 122: Aperture Requires Modification? At step 122, the method 100 determines whether the aperture size or aperture shape of the aperture opening 21 created by the adaptive aperture plane 20 should change. This process may occur via the image perception algorithm, the aperture control algorithm, or alternative algorithms. If the aperture size or aperture shape does not need to be modified by the adaptive aperture plane 20, the method 100 captures subsequent images and/or reverts to the image perception algorithms.
Step 124: Aperture Control. Where step 122 determines that the aperture size or aperture shape needs to be modified, such that a new aperture opening 21 will be created by the adaptive aperture plane 20, the method 100 sends the aperture signal from, for example and without limitation, the voltage controller. The aperture signal adjusts the aperture opening 21 provided by the adaptive aperture plane 20, such that the method 100 and the optical system 10 may capture subsequent images with the improved aperture opening 21. This may include modifying the aperture size and/or aperture shape based on the determined or identified shapes from the library of shapes.
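Closing the loop, a hedged sketch of this step might translate the decided aperture opening into per-cell drive voltages and pass them to the voltage controller. The VoltageController class below is a hypothetical stand-in for real drive hardware, and the half-wave mapping mirrors the parallel-polarizer configuration discussed earlier.

```python
import numpy as np

class VoltageController:
    """Hypothetical stand-in for the voltage controller driving the
    adaptive aperture plane; a real device interface would replace this."""
    def apply(self, drive_voltages: np.ndarray) -> None:
        rows, cols = drive_voltages.shape
        print(f"driving {rows}x{cols} LC cells")

def send_aperture_signal(controller: VoltageController, mask: np.ndarray,
                         v_half_wave: float = 5.0) -> None:
    """Convert an open/blocked mask (True = pass light) into drive
    voltages: no drive where light passes, half-wave drive to block."""
    controller.apply(np.where(mask, 0.0, v_half_wave))
```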
The autonomous vehicle 12 may control its movement based on the captured images with the improved aperture opening 21. After step 124, the method 100 ends. In many configurations, the method 100 will loop constantly or at a regular interval, as will be recognized by skilled artisans.
Therefore, the optical system 10 has feedback between the image perception algorithms and the camera operation, which may further enhance the performance of the image perception algorithms and, therefore, the performance of the autonomous vehicle 12.
The detailed description and the drawings or figures are supportive and descriptive of the subject matter herein. While some of the best modes and other embodiments have been described in detail, various alternative designs, embodiments, and configurations exist.
Any embodiments shown in the drawings or the characteristics of various embodiments mentioned in the present description are not necessarily to be understood as embodiments independent of each other. Rather, it is possible that each of the characteristics described in one of the examples of an embodiment can be combined with one or a plurality of other desired characteristics from other embodiments, resulting in other embodiments not described in words or by reference to the drawings. Accordingly, such other embodiments fall within the framework of the scope of the appended claims.