The present disclosure generally relates to a system and method for automatically activating autonomous parking (“autopark”) with a navigation program. More particularly, the navigation program may cause a vehicle to automatically execute autopark upon sensing certain conditions.
Vehicles are typically incapable of moving in a sideways or lateral direction. As a result, when parallel parking, a driver must enter the spot at an angle and then adjust so that the vehicle's length is parallel to the street and the two outer wheels are sufficiently close to the street's edge.
Difficulty arises when parked cars lie on both sides of a potential parking spot. In this case, the driver must parallel park while avoiding contact with the parked cars. This typically involves backing into the potential spot at a severe angle with respect to the street's edge, then gradually softening the angle as the rear of the vehicle slides into place.
To improve a driver's experience, car manufacturers have recently introduced automatic parking programs (“autopark”). When activated, autopark is capable of automatically parallel parking a car. The vehicle autoparks by measuring the potential parking spot's dimensions with a sensor, then applying a mathematical model to the dimensions to generate the proper speed, approach angle, etc. for parallel parking. Autopark programs can safely parallel park a car faster than an experienced driver.
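The disclosure does not set out the mathematical model itself, but the idea can be illustrated with a greatly simplified, hypothetical feasibility check: given the spot dimensions reported by a sensor, decide whether the spot fits the vehicle and derive a rough approach angle. The function name, clearance margin, and angle heuristic below are illustrative assumptions only, not the model used by any particular autopark program.

```python
import math

def assess_spot(spot_length_m, spot_depth_m, vehicle_length_m, vehicle_width_m,
                clearance_m=0.8):
    """Greatly simplified check; real autopark models also account for turning
    radius and multi-point maneuvers."""
    fits = (spot_length_m >= vehicle_length_m + clearance_m
            and spot_depth_m >= vehicle_width_m + 0.2)
    # Naive heuristic: the less spare length, the steeper the entry angle.
    approach_angle_deg = math.degrees(
        math.atan2(spot_depth_m, max(spot_length_m - vehicle_length_m, 0.1)))
    return fits, round(approach_angle_deg, 1)

print(assess_spot(6.2, 2.4, 4.8, 1.9))  # (True, 59.7)
```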
A problem with existing autopark programs is that they require user activation via a button or command. As a result, some drivers forget to initiate autopark and manually park their car in a parking spot. Traffic may then build up behind the driver, among other potential problems.
In various embodiments, the present disclosure resolves those problems by providing systems and methods for parking a vehicle.
The system includes a vehicle including a motor, sensors, a steering system, a processor, and a memory; an autopark program operatively connected to the vehicle's steering system; a navigation program operatively connected to the vehicle and configured to determine the vehicle's speed and lane with the sensors, to evaluate the data, and in response to the evaluation, to automatically execute the autopark program.
The method of parking a vehicle having a motor, sensors, a steering system, a processor, and a memory, includes: determining the vehicle's speed and lane with the sensors and a navigation program operatively coupled to the vehicle; evaluating the data; and automatically executing the vehicle's autopark in response to the evaluation.
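As a minimal sketch of the evaluation step in this method, the following hypothetical function decides whether to hand control to autopark based on the sensed speed and lane; the threshold values and the lane encoding are assumptions for illustration, not part of the disclosure.

```python
def should_engage_autopark(speed_kph: float, lane: str,
                           target_lane: str = "right",
                           max_speed_kph: float = 15.0) -> bool:
    """Evaluate the sensed speed and lane; True means 'execute autopark now'."""
    return speed_kph <= max_speed_kph and lane == target_lane

# A slow vehicle already in the curbside lane triggers autopark.
print(should_engage_autopark(speed_kph=12.0, lane="right"))  # True
```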
For a better understanding of the invention, reference may be made to embodiments shown in the following drawings. The components in the drawings are not necessarily to scale and related elements may be omitted, or in some instances proportions may have been exaggerated, so as to emphasize and clearly illustrate the novel features described herein. In addition, system components can be variously arranged, as known in the art. Further, in the drawings, like reference numerals designate corresponding parts throughout the several views.
While the invention may be embodied in various forms, there are shown in the drawings, and will hereinafter be described, some exemplary and non-limiting embodiments, with the understanding that the present disclosure is to be considered an exemplification of the invention and is not intended to limit the invention to the specific embodiments illustrated.
In this application, the use of the disjunctive is intended to include the conjunctive. The use of definite or indefinite articles is not intended to indicate cardinality. In particular, a reference to “the” object or “a” and “an” object is intended to denote also one of a possible plurality of such objects. Further, the conjunction “or” may be used to convey features that are simultaneously present instead of mutually exclusive alternatives. In other words, the conjunction “or” should be understood to include “and/or”.
Vehicle 100 may include sensors 102. The sensors 102 can be arranged in and around the car in a suitable fashion. The sensors can all be the same or different. There can be many sensors, as shown in the drawings.
As shown in the drawings, the vehicle 100 also includes electronic components such as a processor or controller 210, a memory 208, a user interface 212, communication devices 214, and a disk drive 216, which may be operatively connected to one another by a vehicle data bus 202.
The processor or controller 210 may be any suitable processing device or set of processing devices such as, but not limited to: a microprocessor, a microcontroller-based platform, a suitable integrated circuit, or one or more application-specific integrated circuits (ASICs).
The memory 208 may be volatile memory (e.g., RAM, which can include non-volatile RAM, magnetic RAM, ferroelectric RAM, and any other suitable forms); non-volatile memory (e.g., disk memory, FLASH memory, EPROMs, EEPROMs, memristor-based non-volatile solid-state memory, etc.); unalterable memory (e.g., EPROMs); read-only memory; a hard drive; a solid state hard drive; or a physical disk such as a DVD. In an embodiment, the memory includes multiple kinds of memory, particularly volatile memory and non-volatile memory.
The communication devices 214 may include a wired or wireless network interface to enable communication with an external network. The external network may be a collection of one or more networks, including standards-based networks (e.g., 2G, 3G, 4G, Universal Mobile Telecommunications System (UMTS), GSM® Association, Long Term Evolution (LTE)™, or more); WiMAX; Bluetooth; near field communication (NFC); WiFi (including 802.11 a/b/g/n/ac or others); WiGig; Global Positioning System (GPS) networks; and others available at the time of the filing of this application or that may be developed in the future. Further, the external network(s) may be a public network, such as the Internet; a private network, such as an intranet; or combinations thereof, and may utilize a variety of networking protocols now available or later developed including, but not limited to, TCP/IP-based networking protocols. The communication devices 214 may also include a wired or wireless interface to enable direct communication with an electronic device, such as a USB or Bluetooth interface.
The user interface 212 may include any suitable input and output devices. The input devices enable a driver or a passenger of the vehicle to input modifications or updates to information referenced by the navigation program 110 as described herein. The input devices may include, for instance, a control knob, an instrument panel, a keyboard, a scanner, a digital camera for image capture and/or visual command recognition, a touch screen, an audio input device (e.g., cabin microphone), buttons, a mouse, or a touchpad. The output devices may include instrument cluster outputs (e.g., dials, lighting devices), actuators, a display (e.g., a liquid crystal display (“LCD”), an organic light emitting diode (“OLED”), a flat panel display, a solid state display, a cathode ray tube (“CRT”), or a heads-up display), and speakers.
The disk drive 216 is configured to receive a computer readable medium. In certain embodiments, the disk drive 216 receives the computer-readable medium on which one or more sets of instructions, such as the software for operating the methods of the present disclosure including the navigation program 110, the autopark engager program 115, and the autopark program 120, can be embedded. The instructions may embody one or more of the methods or logic as described herein. In a particular embodiment, the instructions may reside completely, or at least partially, within any one or more of the main memory 208, the computer readable medium, and/or within the processor 210 during execution of the instructions.
The term “computer-readable medium” should be understood to include a single medium or multiple media, such as a centralized or distributed database, and/or associated caches and servers that store one or more sets of instructions. The term “computer-readable medium” also includes any tangible medium that is capable of storing, encoding or carrying a set of instructions for execution by a processor or that cause a system to perform any one or more of the methods or operations disclosed herein.
It should be appreciated that vehicle 100 may be fully autonomous, or partially autonomous. In one embodiment, the vehicle 100 is partially autonomous in that it enables a driver to manually steer, brake, and accelerate and also includes an autonomous autopark software or program 120. When executed by the processor, the autopark program 120 parks the vehicle in a parking space. In various embodiments, the autopark program 120 autonomously operates the steering and the acceleration, but requires the driver to manually operate the brake while parking. In various embodiments, the autopark program 120 autonomously operates the steering, but requires the driver to manually operate the brake and the acceleration while parking. In other embodiments, the autopark program 120 autonomously operates all aspects of parallel parking. The parking space may be any suitable parking space including a parallel parking space, a driveway, or a spot in a garage. The autopark program 120 can send and receive data to and from sensors 102, user interface 212, communication devices 214, drive 206, or any other component operatively connected to the vehicle data bus 202, to safely and effectively autonomously park a vehicle. Suitable autopark programs are known in the art.
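One way to model these partial-autonomy variants in software, purely as an illustrative sketch (the mode names below are not from the disclosure), is an enumeration that records which controls remain with the driver during parking:

```python
from enum import Enum, auto

class AutoparkMode(Enum):
    STEERING_ONLY = auto()           # driver works the brake and accelerator
    STEERING_AND_THROTTLE = auto()   # driver works only the brake
    FULL = auto()                    # program steers, accelerates, and brakes

DRIVER_TASKS = {
    AutoparkMode.STEERING_ONLY: {"brake", "accelerator"},
    AutoparkMode.STEERING_AND_THROTTLE: {"brake"},
    AutoparkMode.FULL: set(),
}

print(DRIVER_TASKS[AutoparkMode.STEERING_AND_THROTTLE])  # {'brake'}
```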
In one embodiment, shown in the drawings, the navigation program 110 and the autopark engager program 115 are installed on a mobile phone or device 105.
The mobile phone or device 105 is operatively connected to the vehicle 100 via any suitable data connection such as WiFi, Bluetooth, USB, or a cellular data connection. Although this disclosure generally refers to a mobile phone 105, it should be appreciated that mobile phone 105 may be any suitable device such as a laptop.
In another embodiment, shown in the drawings, the programs described herein are stored in the memory 208 of the vehicle 100 and executed by the processor 210. As shown in the drawings, the programs carry out a method of automatically activating autopark, which proceeds as a series of numbered steps described below.
One embodiment of the present invention includes the navigation program 110, the autopark engager program 115, and the autopark program 120. Although the programs are shown as distinct in the drawings, they may be combined or integrated in any suitable fashion.
The navigation program 110 receives a destination in step 502. The driver can manually enter the destination into an interface in the navigation program 110, or the navigation program 110 can receive an electronic command from an external source with the destination. In this example embodiment, shown in the drawings, the destination and the surrounding area are displayed on a virtual map 601.
In step 503, the navigation program 110 uses the mobile processor to generate a parking assessment area 604 by comparing first properties to first criteria.
When the vehicle 100 enters the parking assessment area 604, the method advances to step 505. The navigation program 110 now selects or generates one or more parking zones 904 by comparing second properties to second criteria.
The method advances to step 507 when the navigation system senses that the vehicle has crossed into one of the parking zones. In step 507, the mobile phone executes the autopark engager program 115. In step 508, the autopark engager program 115 evaluates parking spaces by comparing third properties to third criteria.
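Each of steps 503, 505, and 508 follows the same pattern of comparing measured properties to criteria. A generic, hypothetical helper illustrating that pattern might look like the following; the property names and the comparator encoding are illustrative assumptions.

```python
import operator

OPS = {"<=": operator.le, ">=": operator.ge, "==": operator.eq}

def satisfies(properties: dict, criteria: dict) -> bool:
    """True when every measured property meets its (comparator, limit) criterion."""
    return all(OPS[op](properties[name], limit)
               for name, (op, limit) in criteria.items())

# Step 503 analogue: first properties vs. first criteria.
first_properties = {"distance_to_destination_km": 1.4}
first_criteria = {"distance_to_destination_km": ("<=", 2.0)}
print(satisfies(first_properties, first_criteria))  # True -> inside the assessment area
```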
The parking assessment area 604 may be user adjustable. For example, the navigation program 110 may enable a user or driver to define the assessment area as a two-mile radius around the destination. Alternatively, or in addition, the driver may define the assessment area in terms of travel time from the destination. In this case, the navigation program 110 may query traffic information from an external source and then automatically define the assessment area 604 in terms of travel time. For instance, the driver may define the assessment area to include any location less than five minutes away from the destination. The vehicle may dynamically update the assessment area based on new information.
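A hypothetical sketch of these two ways of defining the assessment area, one by radius and one by travel-time budget, is shown below; converting a travel-time budget to an equivalent radius via an average speed drawn from traffic data is an illustrative simplification, not the disclosure's method.

```python
from dataclasses import dataclass

@dataclass
class AssessmentArea:
    center: tuple        # destination as (lat, lon)
    radius_km: float

def area_from_radius(destination, radius_km=2.0):
    return AssessmentArea(center=destination, radius_km=radius_km)

def area_from_travel_time(destination, minutes, avg_speed_kph):
    # avg_speed_kph would come from queried traffic information.
    return AssessmentArea(center=destination, radius_km=avg_speed_kph * minutes / 60.0)

print(area_from_travel_time((42.331, -83.046), minutes=5, avg_speed_kph=24))
# AssessmentArea(center=(42.331, -83.046), radius_km=2.0)
```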
The navigation program 110 may have an interface enabling a user or driver to select first properties 701 and first criteria 702. The navigation program 110 may compare the first properties 701 to the first criteria 702 in order to generate the parking assessment area 604. For example, as shown in the drawings, the first properties 701 may include the distance or travel time from the destination, and the first criteria 702 may include a maximum acceptable distance or travel time.
As shown in the drawings, parking data may be collected from a variety of sources.
One or more static sensors 804 may be installed on the side of the road. The static sensors 804 may be integrated into existing infrastructure such as buildings, light posts, and meters. The static sensors 804 detect when one or more parking spots become available. They may also measure other properties of the parking spot such as its length. The static sensors 804 may send information directly to the vehicle, the mobile phone, or an external server.
In addition, a third-party or external vehicle 805 may sense an open parking spot using vehicle-mounted sensors. For example, when vehicle 805 approaches spot 803, the sensors may capture data relating to the open spot 803. The vehicle 805 may assess the data using an internal processor coupled to assessment software to ensure that the spot is suitable for a vehicle, or may transmit the raw data to an external server. The external server may analyze the data immediately, simply store the data and transmit it in response to a query, or analyze the data in response to a predetermined command. It should be appreciated that a similar data capture function could be performed by a pedestrian's mobile phone or an aerial unit such as a drone.
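Parking data gathered from these different sources could be normalized into a common record before the navigation program 110 consumes it. The field names and source labels below are assumptions for illustration, and the staleness filter simply reflects the fact that spot availability changes quickly.

```python
import time
from dataclasses import dataclass

@dataclass
class ParkingObservation:
    lat: float
    lon: float
    spot_length_m: float
    source: str        # e.g. "static_sensor", "vehicle", "pedestrian_phone", "drone"
    timestamp: float   # seconds since the epoch

def fresh(observations, max_age_s=600.0, now=None):
    """Discard observations older than max_age_s seconds."""
    now = time.time() if now is None else now
    return [o for o in observations if now - o.timestamp <= max_age_s]

obs = [ParkingObservation(42.331, -83.046, 6.4, "static_sensor", time.time() - 120)]
print(len(fresh(obs)))  # 1
```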
At an appropriate time, such as when the vehicle enters the parking assessment area, the navigation program 110 queries or accesses the parking data to select or generate parking zones 904. As shown in the drawings, the parking zones 904 may be displayed on the virtual map 601.
The second properties may include distance or time to the destination from the parking zone 904, accessibility of public transportation in the parking zone 904, the number of spots in the parking zone, the size of the parking zone, the relative safety or security of the parking zone 904, and the probability of a spot being available in the parking zone 904 at the driver's projected arrival time.
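To choose among candidate zones, these second properties could be folded into a single score. The weights and the linear scoring formula below are purely illustrative assumptions, not the comparison actually performed by the navigation program 110.

```python
def zone_score(zone: dict, weights: dict) -> float:
    # Higher availability and safety are better; longer walks are penalized.
    return (weights["availability"] * zone["availability_prob"]
            + weights["safety"] * zone["safety_index"]
            - weights["walk_minutes"] * zone["walk_minutes"])

zones = [
    {"name": "Lot A", "availability_prob": 0.9, "safety_index": 0.8, "walk_minutes": 6},
    {"name": "Curb B", "availability_prob": 0.5, "safety_index": 0.9, "walk_minutes": 2},
]
weights = {"availability": 2.0, "safety": 1.0, "walk_minutes": 0.2}
best = max(zones, key=lambda z: zone_score(z, weights))
print(best["name"])  # Curb B
```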
As shown in the drawings, the navigation program 110 may automatically select the second properties and the second criteria based on external conditions and the driver's identity.
The navigation program 110 monitors the vehicle's location with respect to the zones 904. When the vehicle enters one of the zones 904, the navigation program 110 causes the vehicle 100 or mobile phone 105 to alert the driver via the user interfaces 212 or 213. In one embodiment, a series of light emitting diodes (not shown) light up on the vehicle's gauge cluster. In one embodiment, the diodes remain on at least until the autopark engager program 115 has finished running. The navigation program 110 simultaneously sends a command to execute the autopark engager program 115. It should be appreciated that if the vehicle is partially autonomous, the vehicle may automatically assume driving control when the autopark engager program 115 becomes active. In one embodiment, the driver retains full manual control until the autopark engager program 115 activates autopark.
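The zone-entry check itself can be as simple as a circular geofence test against the vehicle's GPS position. The distance approximation below (an equirectangular projection) is an illustrative choice for short distances, not the disclosure's method, and the zone radius is an assumed parameter.

```python
import math

def inside_zone(vehicle_pos, zone_center, zone_radius_m):
    # Equirectangular approximation; adequate over a few hundred meters.
    lat1, lon1 = map(math.radians, vehicle_pos)
    lat2, lon2 = map(math.radians, zone_center)
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    y = lat2 - lat1
    distance_m = 6371000 * math.hypot(x, y)
    return distance_m <= zone_radius_m

if inside_zone((42.3315, -83.0466), (42.3312, -83.0462), zone_radius_m=150):
    print("alert driver and start autopark engager")
```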
As shown in the drawings, the autopark engager program 115 evaluates nearby parking spaces with the sensors 102, measuring third properties such as the length of an open spot and comparing them to the third criteria.
When the third properties satisfy the third criteria, the autopark engager program 115 sends a command causing the autopark program 120 to execute. Similar to the navigation program 110, the autopark engager program 115 may automatically select the third properties and the third criteria 1105 based on external conditions 1103 and the driver's identity 1104.
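As a hedged sketch of this final gate, the example below treats the measured length of a sensed spot and its offset from the curb as third properties and compares them to threshold criteria before commanding autopark; both the property names and the thresholds are illustrative assumptions.

```python
def engage_if_suitable(spot_length_m, curb_offset_m,
                       min_length_m=5.8, max_curb_offset_m=0.5):
    """Return True (and 'send the command') when the third criteria are met."""
    criteria_met = spot_length_m >= min_length_m and curb_offset_m <= max_curb_offset_m
    if criteria_met:
        print("commanding the autopark program to execute")
    return criteria_met

engage_if_suitable(spot_length_m=6.3, curb_offset_m=0.3)
```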
In order to conserve resources and to avoid overloading the driver with information, the navigation program 110 of the present disclosure may only build or display parking zones 904 after the driver enters the parking assessment area 604. In one embodiment, the autopark engager program 115 is only executed after the driver enters a parking zone. This reduces the possibility of the vehicle assuming automatic control before the driver is ready.
It should be appreciated that the system of the present disclosure may be configured to run in locations with little or no parking data. The navigation program 110 may evaluate the quantity or quality of parking data before or after generating the assessment area. In one embodiment, the navigation program 110 will only build parking zones in response to a sufficient amount of parking data. The degree of sufficiency may be user adjustable via an interface. If the parking data is insufficient, the navigation system may be programmed to inform the driver that the autopark engager program 115 will be inactive and the driver must manually initiate autopark.
In one embodiment, the navigation program 110, in response to insufficient or unsuitable parking data, uses internal vehicle sensors to activate the autopark engager program 115. For example, once a driver enters the parking assessment zone, the navigation system may use internal vehicle sensors such as a camera, sonar, radar, or LiDAR to locate a suitable parking spot. After locating a suitable parking spot, the navigation program 110 may automatically enable the autopark engager program 115 and notify the driver. At this point, the autopark engager program 115 may execute as described above.
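A sketch of that fallback decision and of the sensor-driven spot search follows; the record-count threshold, the minimum spot length, and the abstraction of sensor output as a list of measured gap lengths are all illustrative assumptions.

```python
def choose_parking_strategy(num_parking_records: int, min_records: int = 20) -> str:
    """Fall back to on-board sensors when external parking data is sparse."""
    return "crowd_data" if num_parking_records >= min_records else "onboard_sensors"

def find_spot_with_sensors(gap_lengths_m, min_length_m=5.8):
    """Return the index of the first sensed gap long enough to park in, or None."""
    for i, gap in enumerate(gap_lengths_m):
        if gap >= min_length_m:
            return i
    return None

print(choose_parking_strategy(num_parking_records=3))   # onboard_sensors
print(find_spot_with_sensors([2.1, 4.9, 6.4, 3.0]))     # 2
```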
It should be appreciated that the navigation program 110 and the autopark engager program 115 may only collect or assess information on demand. For example, if the navigation program 110 has been configured to build the parking assessment area exclusively based on proximity to the destination, then the navigation system may delay collecting or assessing parking information until the vehicle 100 is a predetermined distance from the destination.
It should be appreciated that generating the parking assessment area 604 and the parking zones 904 may be a dynamic process, continuously being updated in response to new information. In one embodiment, the first, second, and third properties and criteria may be dynamically updated based on a change in external conditions. In one embodiment, the systems of the present disclosure require driver-authorization (or at least notify the driver) before applying a dynamic update.
In one embodiment, the driver may adjust when and how the parking assessment area 604 and parking zones 904 are displayed with user adjustable display criteria. In one embodiment, the assessment area 604 and the zones 904 are displayed sequentially, to reduce map clutter.
Although the virtual map 601 is shown from a top plan view, it may be displayed from a top or side perspective view. The driver may adjust the area shown on the map by zooming or moving the map's center. The driver may also adjust the level of detail on the map, the kinds of details shown on the map, and the presentation of those details. For example, the navigation program 110 may enable the driver to display the parking assessment area in one color and the one or more parking zones in another color. In one embodiment, the navigation program 110 automatically displays the parking assessment area and the parking zones sequentially, so that both are not simultaneously present on the map.
For the purposes of the claims, the term “autopark” is defined to mean a program or software configured at least to autonomously steer a vehicle from a traffic lane to a parking space.
The above-described embodiments, and particularly any “preferred” embodiments, are possible examples of implementations and merely set forth for a clear understanding of the principles of the invention. Many variations and modifications may be made to the above-described embodiment(s) without substantially departing from the spirit and principles of the techniques described herein. All modifications are intended to be included herein within the scope of this disclosure and protected by the following claims.
This application is a continuation of U.S. application Ser. No. 14/993,922 filed on Jan. 12, 2016, which is hereby incorporated by reference in its entirety.