The present invention relates generally to a communication system for vehicles and, more particularly, to a vehicle communication system that provides or communicates information to or between vehicles traveling along a road.
Communication systems for vehicles may provide for communication between vehicles and/or between a vehicle and a remote server. Such car2car or V2V and car2X or V2X technology provides for communication between vehicles based on information provided by one or more vehicles and/or information provided by a remote server or the like. Examples of such systems are described in U.S. Pat. No. 7,580,795 and/or U.S. Publication Nos. US-2012-0218412, published Aug. 30, 2012, and/or US-2012-0062743, published Mar. 15, 2012, which are hereby incorporated herein by reference in their entireties.
The present invention provides a vehicle communication system that includes communication devices incorporated into or integrated in a powered road or strip along a roadway, whereby the communication system is operable to communicate information to vehicles traveling along the road and is operable to receive information from vehicles traveling along the road. The communication system and road is/are solar powered so that the devices disposed along the road may operate without a wired power supply connection to a power plant or the like.
These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.
Referring now to the drawings and the illustrative embodiments depicted therein, a communication system includes communication devices disposed along or integrated in a solar powered roadway (
The present invention provides solar powered roads that include a communication system that communicates (using a high speed secure network) with vehicles traveling at any speed on and along the road, which makes the solar powered roads of the present invention “smart roads.” The smart roads may detect or sense the exact location of the traveling vehicles using sensors or receivers along the smart roads (such as a receiver that receives a communication from vehicles, such as a communication that is substantially continuously transmitted by a vehicle communication system and that identifies the particular vehicle). Vehicles using the smart roads may communicate to the smart road system information pertaining to the vehicle's sensors, actuators and/or driver intent information and/or the like.
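As a non-limiting illustration, the substantially continuously transmitted identification message described above might be structured as in the following minimal sketch. All field names, values and the JSON encoding are assumptions for illustration only and are not part of the disclosure:

```python
import json

def make_beacon(vehicle_id, lat, lon, speed_mps, heading_deg, driver_intent):
    """Build one identification/status message that a vehicle could
    broadcast substantially continuously to roadside receivers.
    Field names and the wire format are illustrative assumptions."""
    return json.dumps({
        "vehicle_id": vehicle_id,        # identifies the particular vehicle
        "position": {"lat": lat, "lon": lon},
        "speed_mps": speed_mps,
        "heading_deg": heading_deg,
        "driver_intent": driver_intent,  # e.g. an intended lane change
    })

beacon = make_beacon("VIN123", 42.3314, -83.0458, 27.5, 90.0, "lane_change_left")
print(beacon)
```

A roadside receiver that decodes such a message would then know both which vehicle is present and what the driver intends, which is the information the smart road system fuses and forwards as described above.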
The smart roads of the present invention may be connected via any suitable means, such as a high speed secure fiber optics network or the like, to a remote or central system, such as a system having high speed super computers or remote servers or the like. The central system may be operable (such as responsive to a determination of an autonomous vehicle on one of the smart roads) to drive the automated vehicles on the smart road. For example, as an autonomous vehicle enters the smart road/lane, the communication system of the vehicle communicates information about the location or route that is entered for the vehicle to follow. The smart road system receives this communication and communicates the information to the high speed super computers of the central server, which have all the environment information and road information and transmit commands to the fully automated vehicle in real time to control the autonomous vehicle and drive it from its current location along the desired route and to its final destination. Because the smart road system has information on other roads along the path of travel towards the destination of the autonomous vehicle, the smart road system may adjust or alter the route, such as responsive to traffic or weather conditions or the like. The system may monitor the behavior of automated/semi-automated/manually driven vehicles and take necessary safety intervention in case of system malfunction or driver error.
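The route adjustment responsive to traffic or weather conditions described above can be realized by any standard routing technique. The following non-limiting sketch uses Dijkstra's shortest-path algorithm over road segments whose cost is base travel time plus any centrally reported delay; the road graph, costs and delay figures are invented for illustration:

```python
import heapq

def best_route(graph, start, goal, traffic_delay):
    """Dijkstra over road segments. Segment cost is base travel time
    plus any known delay, so the central system can alter the route
    when conditions change. graph: node -> [(neighbor, base_cost)];
    traffic_delay: (node, neighbor) -> extra cost."""
    frontier = [(0, start, [start])]
    seen = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, base in graph.get(node, []):
            extra = traffic_delay.get((node, nxt), 0)
            heapq.heappush(frontier, (cost + base + extra, nxt, path + [nxt]))
    return None

roads = {"A": [("B", 5), ("C", 7)], "B": [("D", 5)], "C": [("D", 2)], "D": []}
# With no reported traffic, A-C-D (cost 9) beats A-B-D (cost 10).
print(best_route(roads, "A", "D", {}))               # (9, ['A', 'C', 'D'])
# A reported slowdown on segment C-D makes the system reroute via B.
print(best_route(roads, "A", "D", {("C", "D"): 5}))  # (10, ['A', 'B', 'D'])
```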
For example, responsive to a determination of a hazardous condition of a semi-automated/manually driven vehicle traveling on and along the smart road, the system may communicate signals to a vehicle control system of that vehicle to control at least one of the throttle of the vehicle, the brakes of the vehicle and the steering of the vehicle as the vehicle travels on the road, whereby the driver may temporarily relinquish control of the manually driven or semi-autonomous vehicle to the control unit of the smart road system.
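A non-limiting sketch of such a safety intervention decision is shown below. The thresholds, state fields and command names are all hypothetical assumptions chosen only to illustrate how a road-side control unit could decide when to command throttle, brakes or steering:

```python
def intervention(vehicle_state, limits):
    """Decide whether the road-side control unit should intervene,
    returning a list of (actuator, setting) commands to send to the
    vehicle's control system. Empty list: leave the driver in control.
    All thresholds and command names are illustrative."""
    cmds = []
    if vehicle_state["speed_mps"] > limits["max_speed_mps"]:
        cmds.append(("throttle", 0.0))   # cut throttle
        cmds.append(("brake", 0.5))      # apply moderate braking
    if abs(vehicle_state["lane_offset_m"]) > limits["max_lane_offset_m"]:
        # steer back toward lane center, opposite the drift direction
        cmds.append(("steer", -0.1 if vehicle_state["lane_offset_m"] > 0 else 0.1))
    return cmds

# A speeding vehicle triggers throttle and brake commands.
print(intervention({"speed_mps": 40.0, "lane_offset_m": 0.2},
                   {"max_speed_mps": 30.0, "max_lane_offset_m": 0.9}))
```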
The smart road system may transmit the exact lane information with respect to a vehicle traveling along the road, which could be used for lane keep assist, automated lane change and/or the like of that vehicle and other vehicles traveling in the same lane or other lanes of the road. Because the smart road system may send the information on the vehicle to other vehicles in the vicinity of the vehicle, the driver intent may be communicated to other vehicles and this information may be fused with the existing ADAS sensors and used for ADAS feature enhancement (such as for adaptive cruise control (ACC), cross traffic alert, lane departure warning, lane keep assist and/or the like).
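The exact lane information transmitted to vehicles can be derived from the road-sensed lateral position of each vehicle. The following minimal sketch maps a lateral offset to a lane number; the 3.7 m lane width is a typical highway value assumed for illustration and is not specified in the disclosure:

```python
def lane_index(lateral_m, lane_width_m=3.7, num_lanes=3):
    """Map the sensed lateral position of a vehicle (metres from the
    road's left edge) to a lane number that the smart road can transmit
    for lane keep assist or automated lane change. The lane width and
    lane count are illustrative assumptions."""
    idx = int(lateral_m // lane_width_m)
    return max(0, min(num_lanes - 1, idx))  # clamp to valid lanes

print(lane_index(1.8))   # 0: leftmost lane
print(lane_index(5.0))   # 1: middle lane
```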
As shown in
Optionally, and such as shown in
Optionally, the smart road system of the present invention may provide high precision geographical location information of vehicles traveling along the road(s), because the precise geographical location of points along the roads may be entered and known when the smart roads are deployed. This information may be communicated to the vehicles and may act as redundant GPS information, which may be useful in situations where GPS satellite reception is lost or interrupted, such as in cities with dense buildings, tunnels and/or the like. Optionally, the smart road system may communicate location based advertisements to the vehicles traveling along the road (such as advertisements for fuel, restaurants, hotels or the like) that are local to the current known geographical location of the vehicle traveling along the smart road.
Because the geographic location information is well known during the installation of the smart road, the smart road may transmit or communicate high resolution geographic information to vehicles traveling on and along the smart road. Such geographic information may be redundant to the satellite GPS signal received by the vehicle-based GPS systems, and is also useful in situations where the vehicle's GPS satellite reception is lost or interrupted, such as in a tunnel or in cities and/or the like. The geographical information may be transmitted by the communication devices responsive to determination or detection of a vehicle traveling along the road at or near respective geographical locations, with each communication device communicating information pertaining to the sensed location of the vehicle on the road. For example, the system (without input from the remote server) may transmit geographical information to vehicles traveling on and along the road so that the vehicles receive accurate geographical location information (that is either redundant to the vehicle-based GPS systems or that supplements the vehicle-based GPS systems if the satellite signals are lost or interrupted). Optionally, for example, the communication devices of the road (responsive to the remote server that may receive vehicle information) may communicate information to a vehicle traveling on the road pertaining to a destination location and route for that vehicle (where such information may be communicated by the vehicle via a V2X communication link to the communication devices and/or remote server), such that the vehicle maintains the navigation information even if the satellite signals are lost or interrupted.
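Because each communication device's position is surveyed at installation, the redundant location fix described above reduces to a lookup keyed by the device that sensed the vehicle. A non-limiting sketch follows; the device identifiers, coordinates and message fields are invented for illustration:

```python
# Surveyed coordinates recorded when the smart road is installed.
# Device ids and coordinate values are illustrative assumptions.
DEVICE_LOCATIONS = {
    "dev-001": (42.33140, -83.04580),
    "dev-002": (42.33186, -83.04511),
}

def location_message(device_id):
    """When a roadside device senses a vehicle at or near it, it can
    transmit its own surveyed position as a high-resolution fix that is
    redundant to satellite GPS and remains available when satellite
    reception is lost (tunnels, cities with dense buildings)."""
    lat, lon = DEVICE_LOCATIONS[device_id]
    return {"source": "smart_road", "device": device_id, "lat": lat, "lon": lon}

print(location_message("dev-001"))
```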
Optionally, because the lane marking information is also known during the installation of the smart road, the smart road communication devices may transmit or communicate lane information (such as for a lane departure warning system of the vehicles), such as responsive to the sensors of the smart road (and processor or server) sensing or determining the vehicle location on the road surface. For example, if a vehicle starts to move out of a lane and into another lane occupied by another vehicle, the smart road (knowing the location of both vehicles and the lane boundaries) can generate a signal to an alert system of the lane-changing vehicle to alert the driver of the hazardous condition. The smart road may transmit or communicate lane location information to vehicles traveling on and along the road, and such communication may be redundant to lane marker detection by the vehicles, but will be very useful in poor visibility conditions (such as in snow or rain or mud conditions), where the vehicle sensor (such as a forward viewing camera) may have difficulty determining the lane markers on the road ahead of the vehicle. The lane information may be transmitted by the communication devices responsive to determination or detection of a vehicle traveling along the road at or near respective communication devices, with each communication device communicating information pertaining to the sensed location of the vehicle on the road relative to the lane or lanes along which the vehicle is traveling.
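The hazardous-lane-change alert described above follows from the smart road knowing the lane boundaries and the positions of both vehicles. The following non-limiting sketch checks whether a drifting vehicle is entering a lane occupied by another tracked vehicle nearby; the 3.7 m lane width, 30 m proximity threshold and all field names are illustrative assumptions:

```python
def lane_departure_alert(drifting, others, lane_width_m=3.7):
    """Given the road-sensed lateral position (m from the left road
    edge) of a vehicle leaving its lane, alert if the lane it is
    entering is occupied by another tracked vehicle within 30 m
    longitudinally. Thresholds are illustrative."""
    target_lane = int(drifting["lateral_m"] // lane_width_m)
    if target_lane == drifting["lane"]:
        return None  # still within its own lane; nothing to report
    for v in others:
        if v["lane"] == target_lane and abs(v["along_m"] - drifting["along_m"]) < 30.0:
            return {"alert": "hazard", "vehicle": drifting["id"], "conflict": v["id"]}
    return None

drifting = {"id": "V1", "lane": 0, "lateral_m": 3.9, "along_m": 120.0}
others = [{"id": "V2", "lane": 1, "along_m": 135.0}]
print(lane_departure_alert(drifting, others))
```

This check runs entirely on road-side data, so it still works when snow, rain or mud prevents the vehicle's own forward viewing camera from detecting the lane markers.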
Optionally, because the smart road system may be connected with a high speed backend network, the system may be operable to provide a high speed wireless connection for the vehicles traveling along the smart road, so that the vehicle systems may access the network for infotainment, web browsing and/or the like.
As shown in
Thus, the smart road system or communication system of the present invention provides enhanced communication with vehicles traveling on and along the smart road and may provide automated or semi-automated control of such vehicles.
The vehicles traveling along the road may include any suitable communication system that is capable of transmitting vehicle information to the smart road system's communication devices and receiving information from the smart road system's communication devices. For example, the vehicle communication systems may utilize aspects of the systems described in U.S. Pat. Nos. 6,690,268; 6,693,517; 7,156,796 and/or 7,580,795, and/or U.S. Publication Nos. US-2015-0158499; US-2015-0124096; US-2014-0218529; US-2014-0375476; US-2012-0218412; and/or US-2012-0062743, and/or U.S. patent application Ser. No. 14/730,544, filed Jun. 4, 2015 and published Dec. 10, 2015 as U.S. Publication No. US-2015-0352953, which are all hereby incorporated herein by reference in their entireties.
Optionally, the vehicles may include a vehicle vision system and/or driver assist system and/or object detection system and/or alert system that operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction. The vision system includes an image processor or image processing system that is operable to receive image data from one or more cameras and provide an output to a display device for displaying images representative of the captured image data. Optionally, the vision system may provide a top down or bird's eye or surround view display and may provide a displayed image that is representative of the subject vehicle, and optionally with the displayed image being customized to at least partially correspond to the actual subject vehicle.
For example, and as shown in
The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2013/081984 and/or WO 2013/081985, which are hereby incorporated herein by reference in their entireties.
The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an EyeQ2 or EyeQ3 image processing chip available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580; and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ladar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. Preferably, the imaging array has at least 300,000 photosensor elements or pixels, more preferably at least 500,000 photosensor elements or pixels and more preferably at least 1 million photosensor elements or pixels. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.
For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO/2010/144900; WO 2013/043661 and/or WO 2013/081985, which are hereby incorporated herein by reference in their entireties.
Optionally, the vision system (utilizing the forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or birds-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2010/099416; WO 2011/028686; WO 2012/075250; WO 2012/145822; WO 2013/019795; WO 2013/081985; WO 2013/086249 and/or WO 2013/109869, which are hereby incorporated herein by reference in their entireties.
Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.
The present application claims the filing benefits of U.S. provisional application Ser. No. 62/032,037, filed Aug. 1, 2014, which is hereby incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
4088937 | Uchida | May 1978 | A |
5595271 | Tseng | Jan 1997 | A |
5760962 | Schofield et al. | Jun 1998 | A |
5796094 | Schofield et al. | Aug 1998 | A |
5877897 | Schofield et al. | Mar 1999 | A |
6129411 | Neff | Oct 2000 | A |
6201642 | Bos | Mar 2001 | B1 |
6396397 | Bos et al. | May 2002 | B1 |
6405132 | Breed et al. | Jun 2002 | B1 |
6636258 | Strumolo | Oct 2003 | B2 |
6690268 | Schofield et al. | Feb 2004 | B2 |
6693517 | McCarthy et al. | Feb 2004 | B2 |
6975246 | Trudeau | Dec 2005 | B1 |
7005974 | McMahon et al. | Feb 2006 | B2 |
7038577 | Pawlicki et al. | May 2006 | B2 |
7145519 | Takahashi et al. | Dec 2006 | B2 |
7161616 | Okamoto et al. | Jan 2007 | B1 |
7230640 | Regensburger et al. | Jun 2007 | B2 |
7248283 | Takagi et al. | Jul 2007 | B2 |
7295229 | Kumata et al. | Nov 2007 | B2 |
7301466 | Asai | Nov 2007 | B2 |
7580795 | McCarthy et al. | Aug 2009 | B2 |
7592928 | Chinomi et al. | Sep 2009 | B2 |
7720580 | Higgins-Luthman | May 2010 | B2 |
7855755 | Weller et al. | Dec 2010 | B2 |
7881496 | Camilleri et al. | Feb 2011 | B2 |
8892345 | Arcot | Nov 2014 | B2 |
9218001 | Lee | Dec 2015 | B2 |
9478129 | Kothari | Oct 2016 | B1 |
20030095039 | Shimomura et al. | May 2003 | A1 |
20060254142 | Das et al. | Nov 2006 | A1 |
20070032245 | Alapuranen | Feb 2007 | A1 |
20090033474 | Chen | Feb 2009 | A1 |
20100085171 | Do | Apr 2010 | A1 |
20110032119 | Pfeiffer et al. | Feb 2011 | A1 |
20110112720 | Keep | May 2011 | A1 |
20120062743 | Lynam et al. | Mar 2012 | A1 |
20120065858 | Nickolaou | Mar 2012 | A1 |
20120218412 | Dellantoni et al. | Aug 2012 | A1 |
20130116859 | Ihlenburg et al. | May 2013 | A1 |
20130342333 | Hutchings | Dec 2013 | A1 |
20140032091 | Arcot | Jan 2014 | A1 |
20140088796 | Lee | Mar 2014 | A1 |
20140195068 | Boss | Jul 2014 | A1 |
20140195138 | Stelzig | Jul 2014 | A1 |
20140218529 | Mahmoud | Aug 2014 | A1 |
20140222323 | Purushothaman | Aug 2014 | A1 |
20140253345 | Breed | Sep 2014 | A1 |
20140309806 | Ricci | Oct 2014 | A1 |
20140309864 | Ricci | Oct 2014 | A1 |
20140375476 | Johnson et al. | Dec 2014 | A1 |
20150124096 | Koravadi | May 2015 | A1 |
20150158499 | Koravadi | Jun 2015 | A1 |
20150228188 | Macfarlane | Aug 2015 | A1 |
20150232065 | Ricci | Aug 2015 | A1 |
20150251599 | Koravadi | Sep 2015 | A1 |
20150352953 | Koravadi | Dec 2015 | A1 |
20160260328 | Mishra | Sep 2016 | A1 |
20160358477 | Ansari | Dec 2016 | A1 |
Number | Date | Country | |
---|---|---|---|
20160036917 A1 | Feb 2016 | US |
Number | Date | Country | |
---|---|---|---|
62032037 | Aug 2014 | US |