GPS devices are becoming more prevalent in a variety of forms. Many vehicles come equipped with onboard navigation systems or onboard navigation capability. Typically, these systems either display a route to be traveled, along with a series of directions, or audibly recite directions as the driver travels to a destination.
Portable GPS devices, and even some cellular phones or other wireless devices, may also be used to provide directions, again in a visual or audible manner.
Typically, when interacting with these devices, a user will input a destination location. Using the present GPS coordinates of the vehicle, a routing engine will determine a route to the destination location, according to one or more predefined parameters (e.g., without limitation, fastest route, avoiding highways, avoiding toll roads, avoiding dirt roads, etc.).
In the most common manner of operation, the devices instruct the user when to turn (left, right, bear left, bear right, sharp left, sharp right, etc.). Warnings about an upcoming turn are usually given some distance ahead of the turn, and may be repeated a number of times until the turn is to be made.
In many of these systems, however, little attention is given to, for example, crossroads along a route that are not taken as part of the route. For example, as a user approaches a particular turn along an audibly (but not visually) provided route, the user will typically only be given the instruction for when the turn occurs. In the audible-only systems, the user will not typically receive any data regarding roads along the route on which the user is not presently traveling or onto which the user is being instructed to turn.
Even with visual-based systems, crossroad names may not be displayed, and the user is provided with little, if any, information regarding crossroads, unless the user inadvertently turns onto one of those roads (thus making the road part of the “new” route).
In a first illustrative embodiment, a computer-implemented method for road identification includes receiving a request at a vehicle computing system for a crossroad identification. The method also includes determining, via the vehicle computing system, a crossroad along a route-being-traveled corresponding to the request. Finally, the method includes outputting, from the vehicle computing system to an output, the determined crossroad responsive to the request.
In a second illustrative embodiment, a computer-implemented method includes determining, via a vehicle computing system (VCS), a classification of road on which a vehicle is traveling. The illustrative method also includes setting a crossroad identification level in the VCS based at least in part on the determined road classification. The method further includes receiving, at the VCS, a request for crossroad identification.
The illustrative method also includes determining, via the VCS, a crossroad corresponding to the request and the set crossroad identification level. Finally, the method includes outputting, from the VCS to an output, the determined crossroad.
In a third illustrative embodiment, an apparatus includes receiving programmed logic circuitry to receive a request at a vehicle computing system for a crossroad identification. The apparatus also includes determining programmed logic circuitry to determine, via the vehicle computing system, a crossroad along a route-being-traveled corresponding to the request. Finally, this apparatus includes outputting programmed logic circuitry to output, from the vehicle computing system to an output, the determined crossroad responsive to the request.
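The request/determine/output flow recited in these embodiments can be illustrated with a brief sketch. The following Python fragment is purely illustrative and not part of the described system; the Crossroad type, the distance-along-route model, and all function names are assumptions.

```python
# Illustrative sketch only: receive a "next"/"previous" crossroad request,
# determine the corresponding crossroad along the route being traveled,
# and output it. All names and the route model are assumptions.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Crossroad:
    name: str
    distance_along_route_m: float  # position along the route, in meters


def identify_crossroad(request: str,
                       route_crossroads: List[Crossroad],
                       position_along_route_m: float) -> Optional[Crossroad]:
    """Return the crossroad corresponding to a 'next' or 'previous' request.

    route_crossroads is assumed to be sorted by distance along the route.
    """
    if request == "previous":
        passed = [c for c in route_crossroads
                  if c.distance_along_route_m <= position_along_route_m]
        return passed[-1] if passed else None
    # Default: the next upcoming crossroad along the route.
    upcoming = [c for c in route_crossroads
                if c.distance_along_route_m > position_along_route_m]
    return upcoming[0] if upcoming else None


def output_crossroad(crossroad: Optional[Crossroad]) -> None:
    """Stand-in for the audible or visual output step."""
    if crossroad is not None:
        print(f"Crossroad: {crossroad.name}")
```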
In the illustrative embodiment 1 shown in
The processor is also provided with a number of different inputs allowing the user to interface with the processor. In this illustrative embodiment, a microphone 29, an auxiliary input 25 (for input 33), a USB input 23, a GPS input 24 and a BLUETOOTH input 15 are all provided. An input selector 51 is also provided, to allow a user to swap between various inputs. Input to both the microphone and the auxiliary connector is converted from analog to digital by a converter 27 before being passed to the processor.
Outputs from the system can include, but are not limited to, a visual display 4 and a speaker 13 or stereo system output. The speaker is connected to an amplifier 11 and receives its signal from the processor 3 through a digital-to-analog converter 9. Output can also be made to a remote BLUETOOTH device such as PND 54 or a USB device such as vehicle navigation device 60 along the bi-directional data streams shown at 19 and 21, respectively.
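As a rough illustration of the input-selection arrangement described above, the following minimal sketch enumerates the inputs and a selector. The enum, the class, and the default choice are assumptions; the reference numerals appear only in comments.

```python
# Hypothetical sketch of the input sources and input selector 51 described
# above; the structure and defaults are assumptions for illustration only.

from enum import Enum, auto


class InputSource(Enum):
    MICROPHONE = auto()   # microphone 29, analog, via A/D converter 27
    AUXILIARY = auto()    # auxiliary input 25, analog, via A/D converter 27
    USB = auto()          # USB input 23
    GPS = auto()          # GPS input 24
    BLUETOOTH = auto()    # BLUETOOTH input 15


class InputSelector:
    """Tracks which input the processor currently reads from (selector 51)."""

    def __init__(self) -> None:
        self.active = InputSource.MICROPHONE  # assumed default

    def select(self, source: InputSource) -> None:
        self.active = source
```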
In one illustrative embodiment, the system 1 uses the BLUETOOTH transceiver 15 to communicate 17 with a user's nomadic device 53 (e.g., cell phone, smart phone, PDA, or any other device having wireless remote network connectivity). The nomadic device can then be used to communicate 59 with a network 61 outside the vehicle 31 through, for example, communication 55 with a cellular tower 57. In some embodiments, tower 57 may be a WiFi access point.
Exemplary communication between the nomadic device and the BLUETOOTH transceiver is represented by signal 14.
Pairing of a nomadic device 53 with the BLUETOOTH transceiver 15 can be initiated through a button 52 or similar input. Accordingly, the CPU is instructed that the onboard BLUETOOTH transceiver will be paired with a BLUETOOTH transceiver in a nomadic device.
Data may be communicated between CPU 3 and network 61 utilizing, for example, a data-plan, data over voice, or DTMF tones associated with nomadic device 53. Alternatively, it may be desirable to include an onboard modem 63 having antenna 18 in order to communicate 16 data between CPU 3 and network 61 over the voice band. In some embodiments, the modem 63 may establish communication 20 with the tower 57 for communicating with network 61. As a non-limiting example, modem 63 may be a USB cellular modem, and communication 20 may be cellular communication.
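The choice among these data paths (data-plan, data-over-voice/DTMF, or onboard modem) can be sketched as a simple preference order. The function below is a hedged illustration, and the particular ordering is an assumption rather than anything prescribed by the description.

```python
# Hedged sketch of selecting a data path between CPU 3 and network 61.
# The preference order is an assumption for illustration.

from enum import Enum, auto


class DataChannel(Enum):
    DATA_PLAN = auto()        # broadband via nomadic device 53
    DATA_OVER_VOICE = auto()  # voice-band data or DTMF via nomadic device 53
    ONBOARD_MODEM = auto()    # onboard modem 63 communicating 20 with tower 57


def choose_channel(has_data_plan: bool, has_onboard_modem: bool) -> DataChannel:
    """Prefer the widest available path; fall back to data-over-voice."""
    if has_data_plan:
        return DataChannel.DATA_PLAN
    if has_onboard_modem:
        return DataChannel.ONBOARD_MODEM
    return DataChannel.DATA_OVER_VOICE
```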
In one illustrative embodiment, the processor is provided with an operating system including an API to communicate with modem application software. The modem application software may access an embedded module or firmware on the BLUETOOTH transceiver to complete wireless communication with a remote BLUETOOTH transceiver (such as that found in a nomadic device).
In another embodiment, nomadic device 53 includes a modem for voice band or broadband data communication. In the data-over-voice embodiment, a technique known as frequency division multiplexing may be implemented so that the owner of the nomadic device can talk over the device while data is being transferred. At other times, when the owner is not using the device, the data transfer can use the whole bandwidth (300 Hz to 3.4 kHz in one example).
If the user has a data-plan associated with the nomadic device, it is possible that the data-plan allows for broadband transmission, and the system could use a much wider bandwidth (speeding up data transfer). In still another embodiment, nomadic device 53 is replaced with a cellular communication device (not shown) that is installed in vehicle 31. In yet another embodiment, the ND 53 may be a wireless local area network (LAN) device capable of communication over, for example (and without limitation), an 802.11g network (i.e., WiFi) or a WiMax network.
In one embodiment, incoming data can be passed through the nomadic device via a data-over-voice or data-plan, through the onboard BLUETOOTH transceiver and into the vehicle's internal processor 3. In the case of certain temporary data, for example, the data can be stored on the HDD or other storage media 7 until such time as the data is no longer needed.
Additional sources that may interface with the vehicle include a personal navigation device 54, having, for example, a USB connection 56 and/or an antenna 58; or a vehicle navigation device 60, having a USB 62 or other connection, an onboard GPS device 24, or remote navigation system (not shown) having connectivity to network 61.
Further, the CPU could be in communication with a variety of other auxiliary devices 65. These devices can be connected through a wireless 67 or wired 69 connection. Also, or alternatively, the CPU could be connected to a vehicle-based wireless router 73, using, for example, a WiFi 71 transceiver. This could allow the CPU to connect to remote networks in range of the local router 73. Auxiliary devices 65 may include, but are not limited to, personal media players, wireless health devices, portable computers, and the like.
In a first illustrative embodiment, shown in
In this illustrative embodiment, if no identification of “next” or “previous” is provided, then the system will assume that the user is requesting information regarding the next upcoming crossroad along the route 207.
Once it is determined for which crossroad the user is requesting information, a vehicle navigation system (or computing system, or off-board remote routing system, etc.) may examine a pre-stored map to determine the requested crossroad information 209.
In this illustrative embodiment, the system provides the identification in two instances. First, in response to the query, regardless of the user's present location, the system identifies the requested crossroad 211 (in this embodiment the crossroad is identified upon request, although it is possible that identification to the user may be delayed until a later point). This identification can be done audibly or visually (assuming the system has display capability). Additionally, in this illustrative embodiment, if the request is for an upcoming crossroad 213, then if (or once) the user is within a predetermined distance from the crossroad 215, the identification will be repeated (or fixedly displayed) 217. Once the user has passed the crossroad 219, the identification ceases 221.
The repeated identification may take the form of one or more audible prompts provided at a series of intervals, a single additional audible prompt, or a persistent display on a visual map on a vehicle navigation display. Of course, such a secondary display/vocalization may also be omitted if desired.
In one illustrative embodiment, if a visual display is available for use by the vehicle computing system, a requested crossroad name may be displayed in a highlighted or otherwise enhanced manner. Alternatively, the crossroad name may be displayed in a pop-up window, or otherwise distinguished from the general map data being displayed. Of course, the road name may also be displayed in conjunction with the road on the displayed map, such as being displayed alongside the road (or having an identifier from the road name pointing or connecting to the road).
Finally, in this illustrative embodiment, once the road has been passed by the driver, the display is removed/ended 219 (or an enhanced display is rendered ordinary). This may aid the driver in keeping track of a present location on a map, especially if the view is zoomed out to an extent where the vehicle position may not be readily apparent (due, for example, to the icon representing the vehicle remaining large while the display of streets past which the vehicle is traveling has been reduced and become somewhat crowded).
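Building on the earlier sketch (identify_crossroad and output_crossroad), the two output instances described above can be illustrated as follows; the repeat distance and the update function are assumptions, not values taken from the description.

```python
# Sketch of the two output instances: identify upon request (211), repeat or
# fixedly display within a predetermined distance (215/217), and cease once
# the crossroad is passed (219/221). Reuses the hypothetical helpers above.

REPEAT_DISTANCE_M = 400.0  # "predetermined distance"; the value is illustrative


def clear_output() -> None:
    """Stand-in for removing/ending the audible or visual identification."""
    print("(identification cleared)")


def handle_request(request, route_crossroads, position_m):
    """First instance: identify the requested crossroad immediately (211)."""
    crossroad = identify_crossroad(request, route_crossroads, position_m)
    output_crossroad(crossroad)
    return crossroad


def update_output(crossroad, position_m, announced_nearby=False):
    """Call as the vehicle position updates; returns the updated repeat flag."""
    if crossroad is None:
        return announced_nearby
    remaining = crossroad.distance_along_route_m - position_m
    if remaining <= 0:                         # crossroad passed (219)
        clear_output()                         # identification ceases (221)
    elif remaining <= REPEAT_DISTANCE_M and not announced_nearby:
        output_crossroad(crossroad)            # repeated identification (217)
        announced_nearby = True
    return announced_nearby
```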
In yet another illustrative example, discussed in more detail with respect to
For example, in another illustrative embodiment, intersection classification may be available. Using either a ranking system as described above, classifications such as “traffic signal” and “traffic sign”, or some other suitable system, a determination engine may be able to distinguish intersection types.
Based on some determined system of classification, a user can set (or a system can dynamically select, as discussed with regard to
In this example, based on a user-request or a predefined setting (such as, but not limited to, “identify all crossroads of X type”), a user may be notified just prior to and/or just after crossing a crossroad of X type along a route. This may make it easier for a user to be aware that a turn is upcoming, without necessitating that the user focus solely on the navigation system. This may also be useful, for example, in hectic traffic, when weather conditions are somewhat obscuring the visibility along a route, when traveling at night, etc.
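One hedged way to express the “identify all crossroads of X type” setting is as a simple filter over classified crossroads. In the sketch below, the classification attribute and the set-based matching rule are assumptions.

```python
# Sketch of classification-based notification ("identify all crossroads of
# X type"). The 'classification' attribute and the matching rule are assumed.

from typing import Iterable, Iterator, Set


def crossroads_to_announce(route_crossroads: Iterable,
                           wanted_types: Set[str]) -> Iterator:
    """Yield only crossroads whose classification the user asked to hear."""
    for crossroad in route_crossroads:
        if getattr(crossroad, "classification", None) in wanted_types:
            yield crossroad


# Example: announce only signalized intersections along the route.
# for c in crossroads_to_announce(route, {"traffic signal"}):
#     output_crossroad(c)
```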
Sometimes, a user will also receive verbal instructions from a person at the destination, e.g., “we're the second right after Telegraph Road,” and it may be useful to know when Telegraph Road is reached so that the user can prepare to make a turn. The user may find this more useful than instructions from the GPS such as “turn right in three hundred meters,” especially if there is a densely packed network of small and/or poorly marked side streets following a major intersection.
This information could similarly be useful if the destination is on the road on which the user is traveling, and the user knows, for example, that the destination is “just past Telegraph Road on the right.” Thus, when Telegraph Road is approached/reached/passed, the user can begin scanning the right-hand side of the road for the destination. Such information may be especially useful if the destination is in a strip mall or the like, where the GPS may only have an approximate, and not a precise, location available.
In the illustrative embodiment shown in
Once the road type has been set, a determination process (located onboard the vehicle or remotely) examines each crossroad 303. If the crossroad qualifies for reporting 305, then the system will display and/or announce the crossroad 307 in accordance with the available features and/or system settings.
If a destination has been reached 309, then the process ends. Similarly, if the user disables the feature 311, then the process ends 313. Otherwise, a next crossroad is examined as it is approached 303.
In this manner, the system can automatically report some or all types of crossroads without requiring a user prompt. This is just one manner in which the process may be implemented.
Next crossroads may be examined by the system at any distance from the crossroad, although the system may not elect to report the next crossroad until the user is within a certain distance of the crossroad (e.g., without limitation, a projected visible distance). Similarly, display/output of a particular crossroad may be ceased after the crossroad has been passed.
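The automatic reporting loop (roughly steps 303 through 313) can be sketched as follows. The callback parameters stand in for system state and settings and are assumptions; the distance gating and cease-after-passing behavior described above are left to the output side, as in the earlier sketch.

```python
# Hedged sketch of the automatic reporting loop (steps 303-313). The callback
# parameters are assumptions standing in for system state and settings.

def auto_report(route_crossroads, qualifies, destination_reached,
                feature_disabled, announce):
    for crossroad in route_crossroads:        # examine the next crossroad (303)
        if destination_reached() or feature_disabled():
            return                            # process ends (309, 311, 313)
        if qualifies(crossroad):              # qualifies for reporting? (305)
            announce(crossroad)               # display and/or announce (307)
```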
In this illustrative embodiment, either automatic or per-request crossroad reporting may be enabled 401. In this embodiment, the system has been instructed to filter crossroads based on dynamic conditions 403. In one example, the filter is based at least in part on the road type on which the user is traveling.
In this illustrative example, if the user is traveling on, for example, a highway, then it may be the case that only exit roads are announced/displayed (ignoring overpasses without exits). If the user exits the highway and passes into a suburban area, then the announcement/display may pass down to a surface street level (or at least a slower class of road than was previously announced/displayed).
Whether the reporting is automatic or by request, the “next” or “previous” road that is reported is determined, in this illustrative example, by the classification filter.
Once the filter has been enabled 403, a decision process examines the type of road on which the user is presently traveling 405 and sets a level of output based on that road's classification 407. Then, the process examines a next crossroad (or previous, if so requested) and determines if the next crossroad qualifies for output 409. If the next crossroad does not qualify, the system checks to see if any crossroads remain between a present location and a final destination (assuming that further crossroads are to be output) 411. If no crossroads remain, the process exits 413. If crossroads remain, the process returns to examining a next crossroad 409.
If a crossroad qualifies for reporting, then the crossroad is output at an appropriate time 415 in a manner consistent with available features and/or vehicle settings. As with the previous cases, this can be a visual or audible output, and the system may cease/remove the output once the crossroad has been passed.
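A minimal way to express the dynamic, road-class-based filter (steps 405 through 409) is shown below; the class ranking, the exit-only rule on highways, and the numeric levels are illustrative assumptions.

```python
# Sketch of the dynamic filter: derive a reporting level from the road class
# currently traveled (405/407) and test whether a crossroad qualifies (409).
# The ranking and the exit-only rule on highways are assumptions.

ROAD_CLASS_RANK = {"highway": 3, "arterial": 2, "surface": 1}


def set_output_level(current_road_class: str) -> int:
    """Derive the reporting level from the class of road being traveled."""
    return ROAD_CLASS_RANK.get(current_road_class, 1)


def crossroad_qualifies(crossroad_class: str, output_level: int,
                        is_exit: bool = False) -> bool:
    """On a highway-level setting, only exit roads qualify; otherwise report
    crossroads whose class is at or above the current level."""
    if output_level >= ROAD_CLASS_RANK["highway"]:
        return is_exit
    return ROAD_CLASS_RANK.get(crossroad_class, 0) >= output_level
```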
In yet a further illustrative embodiment, a user may request notification when a specific crossroad is being approached. The requested crossroad can then be visually or audibly highlighted as the user approaches (or passes) the requested crossroad.
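A short sketch of this per-crossroad notification is given below; the approach distance, the name-matching rule, and the message strings are assumptions.

```python
# Sketch of notifying the user when a specifically requested crossroad is
# being approached or has been passed. Thresholds and wording are assumed.

APPROACH_DISTANCE_M = 500.0


def check_requested_crossroad(requested_name: str, next_crossroad,
                              distance_to_crossroad_m: float):
    """Return a notification string when the requested crossroad is near."""
    if next_crossroad is None or next_crossroad.name != requested_name:
        return None
    if distance_to_crossroad_m <= 0:
        return f"You have passed {requested_name}."
    if distance_to_crossroad_m <= APPROACH_DISTANCE_M:
        return f"Approaching {requested_name}."
    return None
```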
Although the invention has been described in terms of illustrative embodiments, these are intended to provide examples and not to limit the scope of the invention.