LIGHT-BASED OBJECT LOCALIZATION

Information

  • Patent Application
  • Publication Number
    20230298198
  • Date Filed
    March 18, 2022
  • Date Published
    September 21, 2023
Abstract
Provided are methods for light-based object localization, which can include comparing unexpected light sources to expected light sources to determine an agent, such as a partially and/or fully occluded agent. Some of the described methods also include generating a trajectory for an autonomous vehicle based on the comparison. Systems and computer program products are also provided.
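As a rough illustration of the comparison the abstract describes (this is not code from the application; every name below is hypothetical), an expected-versus-observed light check might look like this in Python:

    from dataclasses import dataclass

    @dataclass
    class LightSource:
        position: tuple          # (x, y, z) world coordinates, meters
        intensity: float         # observed or expected intensity

    def find_unexpected_lights(observed, expected, match_radius=1.0):
        """Return observed lights with no expected light within match_radius."""
        def near(a, b):
            return sum((x - y) ** 2
                       for x, y in zip(a.position, b.position)) <= match_radius ** 2
        return [o for o in observed if not any(near(o, e) for e in expected)]

Any light that survives this filter, such as headlight glow spilling around a corner, becomes a cue that an occluded agent may be present.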
Claims
  • 1. A method comprising: obtaining, using at least one processor, sensor data associated with an environment in which an autonomous vehicle is operating; determining, using the at least one processor, based on the sensor data and environment data, scene data indicative of an expected scene representative of the environment; determining, using the at least one processor, based on the scene data and the sensor data, a light parameter indicative of an unexpected light source in the environment; determining, using the at least one processor, based on the light parameter, an agent in the environment; and generating, using the at least one processor, based on the agent, a trajectory for the autonomous vehicle.
  • 2. The method of claim 1, wherein determining the agent in the environment comprises determining, using the at least one processor, based on the sensor data and the light parameter, an agent location in the environment.
  • 3. The method of claim 1, wherein determining the light parameter comprises: determining, using the at least one processor, whether the sensor data meets a criterion; and in response to determining that the sensor data meets the criterion, determining, using the at least one processor, that the light parameter indicates a presence of unexpected light in the environment.
  • 4. The method of claim 1, wherein determining the agent comprises: determining, using the at least one processor, based on the light parameter, a candidate location associated with a candidate agent; generating, using the at least one processor, based on the scene data and the light parameter, a light propagation result for the candidate agent at the candidate location; and determining, using the at least one processor, based on the light propagation result, the agent.
  • 5. The method of claim 4, wherein determining the light propagation result for the candidate agent at the candidate location is based on a type of agent.
  • 6. The method of claim 5, wherein the type of agent includes one or more of: a vehicle, a car, a motorcycle, a pedestrian, and a bicycle.
  • 7. The method of claim 1, wherein determining the agent comprises: determining, using the at least one processor, based on the light parameter, a predictive candidate location associated with a candidate agent; generating, using the at least one processor, based on the scene data and the light parameter, a reverse light tracing result at the predictive candidate location; and determining, using the at least one processor, based on the reverse light tracing result, the agent.
  • 9. The method of claim 1, wherein determining the agent comprises determining, using the at least one processor, based on the sensor data and the light parameter, an agent trajectory parameter indicative of a trajectory of the agent.
  • 10. The method of claim 1, wherein determining the light parameter comprises: determining, using the at least one processor, based on the sensor data and the scene data, a differential scene indicative of differences in light intensity between the environment data and the sensor data; wherein the light parameter is based on the differential scene.
  • 11. The method of claim 1, wherein the sensor data is obtained from one or more of: a camera, a light-intensity sensor, and a LIDAR sensor.
  • 12. The method of claim 1, wherein the scene data includes one or more of: a location parameter indicative of a location of the autonomous vehicle, a time parameter indicative of a time of day, and a weather parameter indicative of a weather condition of the environment.
  • 13. The method of claim 1, wherein the environment data includes data indicative of one or more predetermined light sources.
  • 14. The method of claim 13, wherein the light parameter is not indicative of the one or more predetermined light sources.
  • 15. The method of claim 1, wherein the scene data comprises three-dimensional scene data.
  • 16. The method of claim 1, wherein the light parameter is indicative of an unexpected light intensity in the environment.
  • 17. The method of claim 1, wherein determining the scene data comprises determining, based on one or more of a stereoscopic scene builder, a LIDAR scene builder, and a sensor fusion scene builder, the scene data.
  • 18. The method of claim 1, further comprising: generating, using the at least one processor, based on the agent, an advance warning indication.
  • 19. A non-transitory computer readable medium comprising instructions stored thereon that, when executed by at least one processor, cause the at least one processor to carry out operations comprising: obtaining, using at least one processor, sensor data associated with an environment in which an autonomous vehicle is operating; determining, using the at least one processor, based on the sensor data and environment data, scene data indicative of an expected scene representative of the environment; determining, using the at least one processor, based on the scene data and the sensor data, a light parameter indicative of an unexpected light source in the environment; determining, using the at least one processor, based on the light parameter, an agent in the environment; and generating, using the at least one processor, based on the agent, a trajectory for the autonomous vehicle.
  • 20. A system, comprising: at least one processor; and at least one memory storing instructions thereon that, when executed by the at least one processor, cause the at least one processor to: obtain sensor data associated with an environment in which an autonomous vehicle is operating; determine, based on the sensor data and environment data, scene data indicative of an expected scene representative of the environment; determine, based on the scene data and the sensor data, a light parameter indicative of an unexpected light source in the environment; determine, based on the light parameter, an agent in the environment; and generate, based on the agent, a trajectory for the autonomous vehicle.
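To make claims 3 and 10 concrete, here is a minimal sketch under assumed conventions (luminance images normalized to [0, 1]; the threshold value is an invented placeholder, not taken from the application) of computing a differential scene and checking claim 3's criterion:

    import numpy as np

    def light_parameter(expected_scene, sensor_image, threshold=0.2):
        """Differential scene per claim 10 and criterion check per claim 3.

        Both inputs are H x W luminance arrays normalized to [0, 1];
        the 0.2 threshold is an illustrative placeholder.
        """
        differential = sensor_image - expected_scene
        # Only excess brightness hints at an unexpected light source;
        # darker-than-expected pixels indicate shadow or occlusion instead.
        excess = np.clip(differential, 0.0, None)
        meets_criterion = bool(excess.max() > threshold)
        return meets_criterion, differential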
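Claims 4 through 7 then localize the agent by testing candidate locations against the unexpected light. A hedged sketch, assuming a caller-supplied propagate(location, agent_type) function that renders the predicted glow for a candidate, whether by forward light propagation (claim 4) or by the reverse light tracing of claim 7:

    import numpy as np

    def score_candidate(differential, predicted_glow):
        """Normalized correlation between observed and predicted light."""
        d = differential.ravel()
        p = predicted_glow.ravel()
        denom = np.linalg.norm(d) * np.linalg.norm(p)
        return float(d @ p / denom) if denom else 0.0

    def localize_agent(differential, candidates, propagate):
        """candidates: iterable of (location, agent_type) pairs, where
        agent_type is one of the claim 6 categories (vehicle, car,
        motorcycle, pedestrian, bicycle)."""
        return max(candidates,
                   key=lambda c: score_candidate(differential, propagate(*c)))

The best-scoring (location, agent_type) pair is the inferred occluded agent, which can then feed the trajectory generation of claim 1 and the advance warning indication of claim 18.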