The present invention relates to the guidance of autonomous vehicles and, in particular, to guiding an autonomous vehicle along a roadway by means of active devices [d1] that operate during normal and inclement weather and under any luminous conditions.
The proposed sensor devices are strategically placed in or on the vehicle to detect active devices [d1]. Where power is not readily available, the devices [d1] can be self-powered by solar and/or batteries, with active electronics to process approaching vehicles and/or pedestrians. This self-contained and/or networked device [d1] will either receive a discovery signal from the approaching vehicle, self-detect the vehicle or pedestrian, or be notified by the network. Once the vehicle or pedestrian is known, the device [d1] either replies with the predetermined information to the vehicle, already displays its "status" (e.g., a stop sign), and/or displays the information.
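By way of illustration only, the device-side behavior described above could be modeled as in the following sketch. The class and method names (`RoadsideDevice`, `on_discovery`, `on_self_detection`), the message fields, and the example coordinates are hypothetical assumptions chosen for clarity, not a definitive implementation of the device [d1].

```python
# Minimal sketch of the device [d1] behavior described above (illustrative only).
# All names and the message format are assumptions, not a specification.

from dataclasses import dataclass

@dataclass
class SignInfo:
    sign_type: str        # e.g. "STOP", "SPEED_LIMIT_50", "PARKING_AVAILABLE"
    latitude: float
    longitude: float
    status: str           # the predetermined "status" the device displays or reports

class RoadsideDevice:
    """Self-contained or networked device [d1] mounted at a sign or traffic cone."""

    def __init__(self, info: SignInfo):
        self.info = info

    def on_discovery(self, vehicle_id: str) -> SignInfo:
        """Called when a discovery signal from an approaching vehicle is received:
        reply with the predetermined sign information."""
        return self.info

    def on_self_detection(self) -> SignInfo:
        """Called when the device's own electronics detect an approaching vehicle
        or pedestrian (for example via a motion sensor): report the status."""
        return self.info

# Example: a solar/battery powered stop-sign device answering a discovery signal.
stop_sign = RoadsideDevice(SignInfo("STOP", 43.6532, -79.3832, "STOP AHEAD"))
reply = stop_sign.on_discovery(vehicle_id="AV-0001")
print(reply)
```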
Currently, autonomous vehicles are helpless and cannot recognize signs or road details without the aid of a camera system or a detailed pre-mapped navigation system. In use today, the navigation system is either GPS based for obtaining details of the roads and/or uses LiDAR and/or optical (camera) systems for mapping the road network sign(s), in conjunction with extremely detailed pre-mapped road information. This pre-mapping of the road is inefficient and, in most cases, impractical when there are continual changes to the roadways (especially with the rate of change ranging from daily to hourly). These constant changes can create havoc for autonomous vehicles on the road if they do not have the latest updates or if the road networks have changed, potentially resulting in a catastrophe. Even if feasible, the volume of changes would require a multitude of road networks to be re-mapped, which would not give the autonomous vehicles the information they need to navigate the roadway in a timely manner.
Furthermore, current technologies such as camera (optical), LiDAR, and other roadway interpretation systems fail under severe weather conditions and under certain luminous conditions. As such, the claims of the present invention will help self-driving, autonomous vehicles evolve to navigate the road more effectively under normal or severe weather conditions and any luminous conditions.
Based on preliminary research of the current technologies that are deployed or proposed to solve navigation for the autonomous vehicles on our roads today, all fail to navigate under inclement weather conditions and are costly to implement. However, the present invention below represents a more economical and efficient way to implement a self-assisted navigation system. For example, one of the claims is to modify the existing passive signage(s) into an active device so that sensors on the autonomous vehicle can accurately pick up the information ahead before it traverses the road. These details of the road could be a simple stop sign or the parking spaces available ahead on the road, and the system can even be adapted to provide information from a bus stop to an approaching autonomous bus, especially under inclement weather and any luminous conditions.
The proposed sensor devices are strategically placed in or on the vehicle to detect "special" lane markers. The lane markers can consist of any of the following: reflective paints, metal paints, or small sensors that light up (when the vehicle is about to approach the markers) and are powered by solar, by battery, or by the vehicle itself sending microwave energy to activate these lane markers.
Currently, autonomous vehicles are helpless without a system of navigation. In use today, the navigation system is either GPS based for obtaining details of the roads or LiDAR based (Light Detection and Ranging) for mapping of the road networks. This pre-mapping of the road is inefficient and, in most cases, impractical when there are continual changes to the roadways, and it puts vehicles on the road at risk if they do not have the latest updates or the changes to the road networks. This could potentially result in a catastrophe. Even if feasible, the volume of changes would require a multitude of road networks to be re-mapped, which would not give the autonomous vehicles the information they need to navigate the roadway in a timely manner.
Furthermore, current technologies such as optical systems, LiDAR, and other roadway interpretation systems fail under severe weather conditions and under certain luminous conditions. As such, the claims of the present invention will help self-driving, autonomous vehicles evolve to navigate the road more effectively under normal or severe weather conditions and any luminous conditions.
Based on preliminary research of the current technologies that are deployed or proposed to solve navigation for the autonomous vehicles on our roads today, all fail to navigate under inclement weather conditions and are costly to implement. However, the present invention below represents a more economical and efficient way to implement a self-assisted navigation system. For example, one of the claims is to modify the existing lane marking painting technique using ferrous or non-ferrous materials so that sensors on the autonomous vehicle can accurately pick up or sense the location of the marker(s) on the road, especially under inclement weather conditions.
As an example, U.S. Pat. No. 9,080,866 B1 claims a laser detection system that picks up reflections from the reflective property of the lane markers. Under inclement weather this system will fail, as snow, ice, fog, or rain will wreak havoc with a laser-based detection system. The laser will not be able to penetrate heavy snow and may receive a false reflection signal from bouncing off the snow.
In another example, patent US 2015/0303581A1 claims a reflector that contains both a microwave retro-reflector and an embedded tuned circuit. This type of system tends to be expensive to implement, as it requires installing active circuitry devices along the many miles of road as lane markers. Moreover, these active devices could malfunction or fail altogether over time.
This disclosure relates to an autonomous vehicle and a system for controlling the vehicle. More specifically, the present disclosure is directed towards autonomous vehicles with a plurality of sensor devices that read road information and then use it to navigate along the roadway.
It is the objective of the present disclosure to provide an improved autonomous vehicle and a system of sensor devices for detecting road information from passive signage(s)/traffic cone(s), which can be read/received even in severe weather conditions and/or any luminous conditions.
According to an aspect of the disclosure, there is provided an autonomous vehicle, comprising:
In another aspect of the disclosure, there is provided an autonomous vehicle comprising:
In another aspect of the disclosure, there is provided an autonomous vehicle comprising:
This disclosure relates to an autonomous vehicle and a system for controlling the same. More specifically, the present disclosure is directed towards autonomous vehicles with a plurality of sensor devices that read road information to navigate along the roadway.
It is the objective of the present disclosure to provide an improved autonomous vehicle and a system of sensor devices for detecting road information from passive lane markers, which can be read even in severe weather conditions and/or any luminous conditions.
According to an aspect of the disclosure, there is provided an autonomous vehicle, comprising:
In another aspect of the disclosure, there is provided an autonomous vehicle comprising:
In another aspect of the disclosure, there is provided an autonomous vehicle comprising:
Additional objects, features and advantages of the present invention will become more readily apparent from the following embodiments when taken in conjunction with the drawings wherein the reference numerals refer to the corresponding parts in the several views.
The description herein makes reference to the accompanying drawings below wherein like reference numerals refer to like parts throughout the several views, and wherein:
Devices (may include all of the components below, some of them, and/or other future components not identified below, depending on the application):
Electronics
The embedded devices [d1] are powered by solar, by small long-lasting batteries, by a combination of the above, and/or by the car itself sending energy such as microwaves. The sign information is transmitted back to the approaching vehicles on a common frequency that all autonomous vehicles understand and can interpret and process. Both the sign and the autonomous vehicle sensors and devices can potentially operate over dedicated short-range communication (DSRC), but are not limited to DSRC. DSRC operates in the 5.9 GHz band with a bandwidth of 75 MHz and an approximate range of 1000 m.
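By way of illustration only, a sign-status message carried over such a DSRC-like link could be encoded as sketched below. The channel constants restate the figures given above (5.9 GHz, 75 MHz, roughly 1000 m); the byte layout, function names, and sign codes are hypothetical assumptions, not a defined message standard.

```python
# Illustrative encoding of a sign-status message for a DSRC-like link.
# Channel constants restate figures from the text; the payload layout is assumed.

import struct

DSRC_BAND_HZ = 5.9e9        # 5.9 GHz band (from the text)
DSRC_BANDWIDTH_HZ = 75e6    # 75 MHz bandwidth (from the text)
DSRC_RANGE_M = 1000         # approximate range in metres (from the text)

def encode_sign_message(sign_code: int, lat: float, lon: float) -> bytes:
    """Pack a sign code and location into a fixed-size payload.
    Hypothetical layout: unsigned 16-bit sign code, then two 64-bit floats, big-endian."""
    return struct.pack(">Hdd", sign_code, lat, lon)

def decode_sign_message(payload: bytes) -> tuple[int, float, float]:
    """Inverse of encode_sign_message, run on the vehicle side."""
    return struct.unpack(">Hdd", payload)

# Example round trip: a stop sign (code 1, assumed) at a given location.
payload = encode_sign_message(1, 43.6532, -79.3832)
print(decode_sign_message(payload))
```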
Vehicular sensors [s2] and road network devices [D1]
Based on the information read or received, details of the environment can be gathered. In order to have sufficient data points to formulate details of the devices or objects on the roadway, the transmitted signal must provide multitudes of samples per second.
Note that in order to cover the front, back, and sides of the vehicle, multiple sensors (and their associated antennas) may be required to be installed. The number of sensors required is dictated by the achievable data point resolution needed to accurately generate a 3-D map of the road. As such, the sensor antenna coverage beamwidth and gain (to resolve and coherently receive the reflected signal) performance will contribute to the number of sensors required.
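Purely as a back-of-the-envelope sketch, and under assumptions not stated in the text (uniform azimuth coverage, one sensor per beamwidth-wide sector, an optional margin for resolution and gain), the sensor count could be estimated as follows; the beamwidth values in the example are illustrative only.

```python
# Rough estimate of how many sensors/antennas are needed for 360-degree coverage.
# Assumes each sensor covers one azimuth sector equal to its antenna beamwidth;
# the example numbers are illustrative, not taken from the text.

import math

def sensors_for_full_coverage(beamwidth_deg: float, overlap_factor: float = 1.0) -> int:
    """Minimum sensors to cover 360 degrees of azimuth.
    overlap_factor > 1 adds margin for resolution and gain requirements."""
    return math.ceil(360.0 / beamwidth_deg * overlap_factor)

print(sensors_for_full_coverage(90))        # e.g. 4 sensors at a 90-degree beamwidth
print(sensors_for_full_coverage(60, 1.5))   # e.g. 9 sensors with a 50% overlap margin
```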
Once the information is processed, parsed, and identified, it is sent to the Autonomous Control System (ACS) and/or the 3D map navigation database system. The following briefly explains the function of each entity and how each interacts with the others.
The sensor sends out the discovery signal from the autonomous vehicle to discover all the devices on the road as it travels.
All the received data from the sensors are processed by the hub. The main goal of the system is to identify all the device(s) on the road that are within a certain distance of the vehicle. After the device(s) are identified and processed, this information is passed to the Autonomous Control System and/or the 3D map navigation system. This has to happen in real time and in advance of the path the vehicle is traveling on.
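A minimal sketch of the hub's role as described above is given below, assuming a hypothetical reading record, a simple 200 m range threshold, and illustrative device identifiers; none of these details are specified in this form by the disclosure.

```python
# Minimal sketch of the processing hub: collect sensor readings, keep the devices
# within a chosen range, and hand them on to the ACS and/or the 3D map system.
# The record fields and the 200 m threshold are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class SensorReading:
    device_id: str
    device_type: str      # e.g. "STOP_SIGN", "TRAFFIC_CONE"
    distance_m: float     # range estimated from the received or reflected signal

def identify_devices(readings: list[SensorReading], max_range_m: float = 200.0):
    """Keep only the devices within max_range_m of the vehicle (the hub's main goal)."""
    return [r for r in readings if r.distance_m <= max_range_m]

# Example: three devices detected ahead, one well beyond the chosen range.
readings = [
    SensorReading("d1-17", "STOP_SIGN", 85.0),
    SensorReading("d1-42", "TRAFFIC_CONE", 40.0),
    SensorReading("d1-99", "SPEED_SIGN", 950.0),
]
identified = identify_devices(readings)
print(identified)   # passed on to the Autonomous Control System / 3D map system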
The 3D map navigation database system is where the road networks are detailed and rendered into three-dimensional images that the autonomous vehicle can use to traverse to its destination or simply find a suitable parking space. In the scenario where the autonomous vehicle relies on the 3D detailed mapping database system to obtain the devices on the road as it is traveling, the proposed system will compare the device information on the 3D map against the newly acquired information to see whether the map is up to date. In the event there are discrepancies and the devices are permanent in nature, the system flags the changes for the 3D mapping system to incorporate. In the event the device(s) are temporary, like traffic cones, the system utilizes this real-time information to navigate.
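The comparison described above can be sketched as follows, keeping the distinction between permanent devices (flagged for a map update) and temporary devices such as traffic cones (used directly for navigation). The dictionary-based map representation, the category labels, and the permanence rule are assumptions for illustration only.

```python
# Sketch of reconciling newly detected devices with the 3D map database.
# Permanent discrepancies are flagged for a map update; temporary devices
# (e.g. traffic cones) are used directly for real-time navigation.
# The dict-based "map" and the permanence rule are illustrative assumptions.

TEMPORARY_TYPES = {"TRAFFIC_CONE", "PORTABLE_SIGN"}

def reconcile(detected: dict, map_db: dict):
    """detected and map_db each map device_id -> device_type."""
    map_updates = {}        # permanent changes to push to the 3D mapping system
    realtime_only = {}      # temporary devices used immediately for navigation
    for device_id, device_type in detected.items():
        if device_type in TEMPORARY_TYPES:
            realtime_only[device_id] = device_type
        elif map_db.get(device_id) != device_type:
            map_updates[device_id] = device_type
    return map_updates, realtime_only

# Example: one permanent sign the map has wrong, and one temporary cone.
detected = {"d1-17": "STOP_SIGN", "d1-42": "TRAFFIC_CONE"}
map_db = {"d1-17": "YIELD_SIGN"}          # the 3D map is out of date here
print(reconcile(detected, map_db))
```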
In a scenario where the autonomous vehicle is acquiring the road information in real time, the identified device information is passed directly to the Autonomous Control System (ACS) for use in navigating the roadway. Once the autonomous vehicle has successfully navigated the roadway, this information is passed to the 3D mapping system to compare and update the information for future use, whether stored locally or via other 3D mapping systems on the network.
As all the device(s) are learned from all sides of the vehicle, this information can be stored in a 3D map navigation database or the autonomous control system, depending on which database is being used. Further, with this mapping, the vehicle can update the map for other vehicles in real time if there have been changes to the road due to construction or other such adjustments.
Based on the above, in the preferred embodiment depicted, the system will work even under severe adverse weather conditions. The active sensor devices in the autonomous vehicle continue to read the device information on the roadway at certain frequency intervals in real time. The sensor in the autonomous vehicle can function independently as a stand-alone system or in conjunction with other existing navigation systems (such as GPS or LiDAR systems, for example) to give it finer details of the roadway on which it is travelling. The proposed system is superior to other existing systems because, unlike those systems, it will continue to work autonomously even under severe weather conditions such as a heavy snowstorm, ice, fog, or any other inclement weather.
It is important for the autonomous vehicle to have the latest road network details to navigate. These sensors can be strategically placed in, or mounted on, the vehicle to enable them to read the most accurate road information for either a straight or curved road.
Note that the proposed system does not require modification to the existing road networks, with the exception of changing or installing co-existing active electronics. Thus, to summarize, the following is a sequence of steps that must happen for the autonomous vehicle to navigate the roadway in the most effective manner:
In order to have the most effective and accurate road information, the sensors would read the information from ahead and from both sides of the vehicle to determine the roadway structures and objects. Each side of the road may provide different information as the vehicle travels ahead. Once the information is obtained by the autonomous control system and/or 3D map navigation database system, it processes the information and formulates a decision on how best to navigate. The proposed system will work under any weather condition.
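Assuming hypothetical side labels and a toy decision rule (neither of which is specified in the disclosure), the idea of combining per-side readings into a navigation decision could look like the sketch below.

```python
# Toy sketch: group device readings by which side of the vehicle reported them,
# then form a simple navigation decision. The side labels and the decision rule
# are illustrative assumptions only.

from collections import defaultdict

def group_by_side(readings):
    """readings: iterable of (side, device_type), with side in {'FRONT','LEFT','RIGHT'}."""
    grouped = defaultdict(list)
    for side, device_type in readings:
        grouped[side].append(device_type)
    return grouped

def decide(grouped) -> str:
    if "STOP_SIGN" in grouped.get("FRONT", []):
        return "PREPARE_TO_STOP"
    if "TRAFFIC_CONE" in grouped.get("RIGHT", []):
        return "SHIFT_LEFT_WITHIN_LANE"
    return "CONTINUE"

readings = [("FRONT", "STOP_SIGN"), ("RIGHT", "TRAFFIC_CONE")]
print(decide(group_by_side(readings)))   # -> "PREPARE_TO_STOP"
```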
Based on the time of arrival and incident angle of the reflected signal 40, the distance between the vehicle and the lane marker(s) 50 can be calculated. In order to have sufficient data points to formulate a mapping of the roadway, the transmitted signal 30 must be sent out frequently, at multiple samples per second.
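As a worked sketch of the geometry implied above, the round-trip time gives the range and the incident angle gives the lateral offset. The propagation speed (an electromagnetic signal in free space) and the angle convention (measured from the vehicle's forward axis) are assumptions here; for an acoustic, sonar-like ping the speed of sound would be used instead.

```python
# Sketch of ranging a lane marker 50 from the reflected signal 40.
# Assumes an electromagnetic signal in free space and an incident angle measured
# from the vehicle's forward axis; both conventions are assumptions.

import math

C = 299_792_458.0   # propagation speed in m/s (use ~343 m/s for a sonar ping)

def marker_position(round_trip_s: float, incident_angle_deg: float):
    """Return (range_m, forward_m, lateral_m) of the reflecting marker."""
    range_m = C * round_trip_s / 2.0                 # one-way distance
    theta = math.radians(incident_angle_deg)
    forward_m = range_m * math.cos(theta)
    lateral_m = range_m * math.sin(theta)            # offset toward the marker side
    return range_m, forward_m, lateral_m

# Example: an echo arriving 0.2 microseconds later at 15 degrees off-axis.
print(marker_position(0.2e-6, 15.0))   # ~30 m range, ~29 m ahead, ~7.8 m to the side
```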
Note that in order to cover the front, back, and sides of the vehicle, multiple sensors (and their associated antennas) will be required to be installed. The number of sensors required is dictated by the achievable data point resolution needed to accurately generate a 3-D map of the road. As such, the sensor antenna coverage beamwidth and gain (to resolve and coherently receive the reflected signal) performance will contribute to the number of sensors required.
The circle 120 in
The sensor sends out the discovery signal from the autonomous vehicle to discover the lane marker(s) 50 as explained in
All the received data from the sensors are processed by the hub. The main goal of the system is to identify the lane markers 50. After the lane markers 50 are identified and processed, this information is passed to the Autonomous Control System and/or the 3D map navigation system. This has to happen in real time and in advance of the path the vehicle is traveling on.
The 3D map navigation database system is where the road networks are detailed and rendered into three-dimensional images that the autonomous vehicle can use to traverse to its destination. In the scenario where the autonomous vehicle relies on the 3D detailed mapping database system to obtain the lane information as it is traveling, the proposed system in the circle 12 will compare the lane marking information on the 3D map against the newly acquired information to see whether the map is up to date. In the event the lane markers 50 are out of date, the system flags the changes for the 3D mapping system to make the updated changes.
In a scenario where the autonomous vehicle is acquiring the road information in real time, the identified lane marker 50 information is passed directly to the Autonomous Control System (ACS) for use in navigating the roadway. Once the autonomous vehicle has successfully navigated the roadway, this information is passed to the 3D mapping system to compare and update the lane marking information for future use, whether stored locally or via other 3D mapping systems on the network.
In the event the vehicle is required to go in the reverse direction it would have all the needed information to complete its task.
As all the lane markers 50 are learned from all sides of the vehicle, this information can be stored in a 3D map navigation database or the autonomous control system, depending on which database is being used. Further, with this mapping, the vehicle can update the map for other vehicles in real time if there have been changes to the road due to construction or other such adjustments.
Based on the above, in the preferred embodiment depicted, the system will work even under severe adverse weather conditions. The active sensor devices in the autonomous vehicle continue to read the lane and roadway information at certain frequency intervals in real time. The sensor devices in the autonomous vehicle can function independently as a stand-alone system or in conjunction with other existing navigation systems (such as GPS or LiDAR systems, for example) to give it finer details of the roadway on which it is travelling. The proposed system is superior to other existing systems because, unlike those systems, it will continue to work autonomously even under severe weather conditions such as a heavy snowstorm, ice, fog, or any other inclement weather.
It is important for the autonomous vehicle to have the latest road network details to navigate. These sensor devices can be strategically placed in, or mounted on, the vehicle to enable them to read the most accurate road information for either a straight or curved road.
Note that the proposed system does not require modification to the existing road networks, with the exception of changing the paint-based material used for painting the lane lines/markers 50. As well, the lane markers 50 could also be made of metallic material. Thus, to summarize, the following is a sequence of steps that must happen for the autonomous vehicle to navigate the roadway in the most effective manner (a brief sketch of this sequence follows the list below):
1. The sensor devices would send a discovery signal ahead using sonar technology for example.
2. The metallic paint or metallic lane markers 50 bounce the discovery signal back to the source.
3. The sensor device in the vehicle receives the reflected discovery signal (bounced off the lane markers 50) and passes it on to the processing hub.
4. The hub interprets/processes the information as it receives it in real time.
5. The processed information is translated to a format that is consumable by the autonomous control system or 3D map navigation database system.
6. The autonomous control system or 3D map navigation database system processes the information and makes decisions based on the received discovery signals from the proposed sensor devices.
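The following is a minimal end-to-end sketch of steps 1 through 6 above. The function names, the simulated echo, the single hard-coded marker distance, and the toy decision rule are all hypothetical, and an acoustic (sonar-like) propagation speed is assumed because step 1 mentions sonar as one example.

```python
# End-to-end sketch of steps 1-6: send a discovery ping, receive the echo bounced
# off a metallic lane marker 50, let the hub turn it into a range, and hand the
# result to the autonomous control system. Everything here is illustrative.

SPEED_OF_SOUND = 343.0   # m/s, assumed because step 1 cites sonar as an example

def send_discovery_and_receive_echo(true_range_m: float) -> float:
    """Steps 1-3: simulate the round-trip time of a ping reflected by a marker."""
    return 2.0 * true_range_m / SPEED_OF_SOUND

def hub_process(round_trip_s: float) -> dict:
    """Steps 4-5: interpret the echo and translate it into a consumable record."""
    return {"marker_range_m": round(SPEED_OF_SOUND * round_trip_s / 2.0, 2)}

def autonomous_control_decision(record: dict) -> str:
    """Step 6: a toy decision rule based on the processed record."""
    return "HOLD_LANE" if record["marker_range_m"] > 1.0 else "ADJUST_POSITION"

echo = send_discovery_and_receive_echo(true_range_m=1.8)   # marker ~1.8 m to the side
record = hub_process(echo)
print(record, autonomous_control_decision(record))
```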
In order to have the most effective and accurate road information, the sensors would read the information from ahead and from both sides of the vehicle to determine the lane structures. Each side of the road may provide different information as the vehicle travels ahead. Once the information is obtained by the autonomous control system and/or 3D map navigation database system, it processes the information and formulates a decision on how best to navigate. The proposed system will work under any weather condition.
Although described with reference to preferred embodiments of the invention, it should be readily understood that various changes and/or modifications can be made to the invention without departing from the spirit thereof. In general, the invention is only intended to be limited by the scope of the following claims.
Thus, the following outlines a set of claims that will help self-driving, autonomous vehicles evolve to navigate the road in a more effective manner under normal or severe weather and any luminous conditions:
Number | Date | Country | Kind
--- | --- | --- | ---
2945564 | Aug 2016 | CA | national
This application claims the benefit of priority as a continuation-in-part patent application of U.S. patent application Ser. No. 17/350,155 filed Jun. 17, 2021; which itself claims the benefit of priority from U.S. patent application Ser. No. 16/245,503 filed Jan. 11, 2019, which has issued as U.S. Pat. No. 11,043,124; which itself claims the benefit of priority from U.S. Provisional Patent Application 62/624,385 filed Jan. 31, 2018; the entire contents of each being incorporated herein by reference. This application claims the benefit of priority as a continuation-in-part patent application of U.S. patent application Ser. No. 17/553,660 filed Dec. 16, 2021; which itself claims the benefit of priority from U.S. patent application Ser. No. 15/784,168 filed Oct. 15, 2017, which has issued as U.S. Pat. No. 11,237,011; which itself claims the benefit of priority from Canadian Patent Application 2,945,564 filed Oct. 18, 2016.
Number | Date | Country
--- | --- | ---
62624385 | Jan 2018 | US
Relation | Number | Date | Country
--- | --- | --- | ---
Parent | 16245503 | Jan 2019 | US
Child | 17350155 | | US
Parent | 15784168 | Oct 2017 | US
Child | 17553660 | | US
Relation | Number | Date | Country
--- | --- | --- | ---
Parent | 17350155 | Jun 2021 | US
Child | 18101463 | | US
Parent | 17553660 | Dec 2021 | US
Child | 16245503 | | US