The illustrative embodiments generally relate to a method and apparatus for estimating the distance from a trailer axle to a trailer tongue.
Whether moving, transporting items, or hauling recreational vehicles, people often hitch trailers to a vehicle's trailer hitch. These trailers come in all shapes and sizes, and vary in weight and handling capability. Because vehicles now contain computerized systems and modules capable of modifying vehicle behavior, better control over an otherwise unwieldy trailer can be obtained if the vehicle knows certain characteristics of the trailer.
For example, it may be useful to the vehicle if the trailer's tongue-to-axle length is known. Of course, obtaining that length may require that an owner measure the distance using a tape measure, which may not be easily found or even owned. Further, it may not be clear to a trailer user where the measurements are to be taken, which can result in confusion or an improper measurement.
In a first illustrative embodiment, a system includes a processor configured to receive a trailer image. The processor is also configured to identify an axle in the trailer image and identify a tongue-end in the trailer image. Further, the processor is configured to receive a tire image, including a wheel diameter marked on a tire. The processor is additionally configured to retrieve the wheel diameter from the tire image. The processor is also configured to identify a wheel, having an identified diameter corresponding to the wheel diameter, in the trailer image. Additionally, the processor is configured to calculate a distance from the axle to the tongue-end using the identified diameter.
In a second illustrative embodiment, a computer-implemented method includes receiving a trailer image. The method also includes identifying an axle in the trailer image and identifying a tongue-end in the trailer image. Further, the method includes receiving a tire image, including a wheel diameter marked on a tire. The method additionally includes retrieving the wheel diameter from the tire image. The method also includes identifying a wheel, having an identified diameter corresponding to the wheel diameter, in the trailer image. Additionally, the method includes calculating a distance from the axle to the tongue-end using the identified diameter.
In a third illustrative embodiment, a non-transitory computer-readable storage medium stores instructions that, when executed by a processor, cause the processor to perform a method including receiving a trailer image. The method also includes identifying an axle in the trailer image and identifying a tongue-end in the trailer image. Further, the method includes receiving a tire image, including a wheel diameter marked on a tire. The method additionally includes retrieving the wheel diameter from the tire image. The method also includes identifying a wheel, having an identified diameter corresponding to the wheel diameter, in the trailer image. Additionally, the method includes calculating a distance from the axle to the tongue-end using the identified diameter.
As required, detailed embodiments of the present invention are disclosed herein; however, it is to be understood that the disclosed embodiments are merely exemplary of the invention that may be embodied in various and alternative forms. The figures are not necessarily to scale; some features may be exaggerated or minimized to show details of particular components. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for teaching one skilled in the art to variously employ the present invention.
In the illustrative embodiment 1 shown in
The processor is also provided with a number of different inputs allowing the user to interface with the processor. In this illustrative embodiment, a microphone 29, an auxiliary input 25 (for input 33), a USB input 23, a GPS input 24 and a BLUETOOTH input 15 are all provided. An input selector 51 is also provided, to allow a user to swap between various inputs. Input to both the microphone and the auxiliary connector is converted from analog to digital by a converter 27 before being passed to the processor. Although not shown, many of the vehicle components and auxiliary components in communication with the VCS may use a vehicle network (such as, but not limited to, a CAN bus) to pass data to and from the VCS (or components thereof).
Outputs to the system can include, but are not limited to, a visual display 4 and a speaker 13 or stereo system output. The speaker is connected to an amplifier 11 and receives its signal from the processor 3 through a digital-to-analog converter 9. Output can also be made to a remote BLUETOOTH device such as PND 54 or a USB device such as vehicle navigation device 60 along the bi-directional data streams shown at 19 and 21 respectively.
In one illustrative embodiment, the system 1 uses the BLUETOOTH transceiver 15 to communicate 17 with a user's nomadic device 53 (e.g., cell phone, smart phone, PDA, or any other device having wireless remote network connectivity). The nomadic device can then be used to communicate 59 with a network 61 outside the vehicle 31 through, for example, communication 55 with a cellular tower 57. In some embodiments, tower 57 may be a WiFi access point.
Exemplary communication between the nomadic device and the BLUETOOTH transceiver is represented by signal 14.
Pairing a nomadic device 53 and the BLUETOOTH transceiver 15 can be instructed through a button 52 or similar input. Accordingly, the CPU is instructed that the onboard BLUETOOTH transceiver will be paired with a BLUETOOTH transceiver in a nomadic device.
Data may be communicated between CPU 3 and network 61 utilizing, for example, a data-plan, data over voice, or DTMF tones associated with nomadic device 53. Alternatively, it may be desirable to include an onboard modem 63 having antenna 18 in order to communicate 16 data between CPU 3 and network 61 over the voice band. In some embodiments, the modem 63 may establish communication 20 with the tower 57 for communicating with network 61. As a non-limiting example, modem 63 may be a USB cellular modem, and communication 20 may be cellular communication.
In one illustrative embodiment, the processor is provided with an operating system including an API to communicate with modem application software. The modem application software may access an embedded module or firmware on the BLUETOOTH transceiver to complete wireless communication with a remote BLUETOOTH transceiver (such as that found in a nomadic device). Bluetooth is a subset of the IEEE 802 PAN (personal area network) protocols. IEEE 802 LAN (local area network) protocols include WiFi and have considerable cross-functionality with IEEE 802 PAN. Both are suitable for wireless communication within a vehicle. Other communication means that can be used in this realm include free-space optical communication (such as IrDA) and non-standardized consumer IR protocols.
In another embodiment, nomadic device 53 includes a modem for voice band or broadband data communication. In the data-over-voice embodiment, a technique known as frequency division multiplexing may be implemented so that the owner of the nomadic device can talk over the device while data is being transferred. At other times, when the owner is not using the device, the data transfer can use the whole bandwidth (300 Hz to 3.4 kHz in one example). While frequency division multiplexing may be common for analog cellular communication between the vehicle and the internet, and is still used, it has been largely replaced by hybrids of Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), and Space-Division Multiple Access (SDMA) for digital cellular communication. These are all ITU IMT-2000 (3G) compliant standards and offer data rates up to 2 Mb/s for stationary or walking users and 385 kb/s for users in a moving vehicle. 3G standards are now being replaced by IMT-Advanced (4G), which offers 100 Mb/s for users in a vehicle and 1 Gb/s for stationary users. If the user has a data-plan associated with the nomadic device, it is possible that the data-plan allows for broadband transmission and the system could use a much wider bandwidth (speeding up data transfer). In still another embodiment, nomadic device 53 is replaced with a cellular communication device (not shown) that is installed in vehicle 31. In yet another embodiment, the ND 53 may be a wireless local area network (LAN) device capable of communication over, for example (and without limitation), an 802.11g network (i.e., WiFi) or a WiMax network.
In one embodiment, incoming data can be passed through the nomadic device via a data-over-voice or data-plan, through the onboard BLUETOOTH transceiver and into the vehicle's internal processor 3. In the case of certain temporary data, for example, the data can be stored on the HDD or other storage media 7 until such time as the data is no longer needed.
Additional sources that may interface with the vehicle include a personal navigation device 54, having, for example, a USB connection 56 and/or an antenna 58, a vehicle navigation device 60 having a USB 62 or other connection, an onboard GPS device 24, or a remote navigation system (not shown) having connectivity to network 61. USB is one of a class of serial networking protocols. IEEE 1394 (FireWire), EIA (Electronic Industries Alliance) serial protocols, IEEE 1284 (Centronics Port), S/PDIF (Sony/Philips Digital Interconnect Format) and USB-IF (USB Implementers Forum) protocols form the backbone of the device-device serial standards. Most of the protocols can be implemented for either electrical or optical communication.
Further, the CPU could be in communication with a variety of other auxiliary devices 65. These devices can be connected through a wireless 67 or wired 69 connection. Auxiliary devices 65 may include, but are not limited to, personal media players, wireless health devices, portable computers, and the like.
Also, or alternatively, the CPU could be connected to a vehicle-based wireless router 73, using, for example, a WiFi 71 transceiver. This could allow the CPU to connect to remote networks in range of the local router 73.
In addition to having exemplary processes executed by a vehicle computing system located in a vehicle, in certain embodiments, the exemplary processes may be executed by a computing system in communication with a vehicle computing system. Such a system may include, but is not limited to, a wireless device (e.g., and without limitation, a mobile phone) or a remote computing system (e.g., and without limitation, a server) connected through the wireless device. Collectively, such systems may be referred to as vehicle associated computing systems (VACS). In certain embodiments, particular components of the VACS may perform particular portions of a process depending on the particular implementation of the system. By way of example and not limitation, if a process has a step of sending or receiving information with a paired wireless device, then it is likely that the wireless device is not performing the process, since the wireless device would not “send and receive” information with itself. One of ordinary skill in the art will understand when it is inappropriate to apply a particular VACS to a given solution. In all solutions, it is contemplated that at least the vehicle computing system (VCS) located within the vehicle itself is capable of performing the exemplary processes.
Trailer operators may not have access to a tape measure, may not know how to use one to measure a trailer correctly, or may not know how to input the proper measurements to a vehicle. Given the prevalence of phones equipped with cameras, however, it is reasonably likely that the operator has a camera phone available. It is also likely that the operator knows how to use the camera on the phone, either as a standalone application or in conjunction with an application provided in accordance with the illustrative embodiments.
By having the user take a picture of the trailer, an application designed to estimate distances on an image can be provided with a picture usable for the appropriate estimation. Additionally, since a user may not be sure of a tire size, a suitably lit picture of a tire can provide the application with information usable to determine distances within the first picture.
Once the application has been launched 201, the user may be asked to take a picture of the full trailer, or at least a picture including both the trailer axle and the tongue (i.e., the attachable end). The process receives the full image 203 once the picture has been taken by the user. In this illustrative example, the image processor will attempt to identify the distance between a tongue and an axle, as this will be useful in providing enhanced trailer control while the vehicle is en route. Accordingly, the process checks the image to determine if both a recognizable axle center and tongue end-point are present 205.
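The embodiments do not tie this check to a particular algorithm, so the following is only a minimal sketch of the wheel/axle portion of step 205, assuming OpenCV is available and that a Hough circle transform is an acceptable stand-in for whatever detector a production application would use (the detected wheel center serves as a proxy for the axle center; the tongue-end check is omitted):

```python
# Minimal sketch, not the embodiments' prescribed detector: look for a circular
# wheel in the trailer image and treat its center as a candidate axle center.
import cv2
import numpy as np

def find_axle_center(trailer_image_path):
    """Return (x, y) of a candidate axle center in pixels, or None if no wheel is found."""
    img = cv2.imread(trailer_image_path)
    if img is None:
        return None
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)
    # Detect circular shapes; the wheel center approximates the axle center.
    circles = cv2.HoughCircles(
        gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=100,
        param1=100, param2=60, minRadius=30, maxRadius=300)
    if circles is None:
        return None
    x, y, _r = np.round(circles[0, 0]).astype(int)
    return (int(x), int(y))
```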
Due to bad lighting, poor picture quality, rust and discoloration, off-center imaging, and other potential problems, the process may not be able to recognize an axle and an end-point. In this example, a new picture is provided 207 until the proper attributes are present. In other examples, it may not be possible to simply retake the picture with suitable recognizability, and user assistance may be required to identify the attributes. Such user assistance is discussed in greater detail with respect to
Additionally, the process uses the wheel diameter to determine other distances in the image. Since the user may not know the wheel diameter, an image of the tire exterior may be used to provide the needed information. In this illustrative example, the process attempts to read the markings on the tire exterior 209 in order to determine a wheel diameter.
If the image is unclear or illegible (which may be common, given the distance at which the first picture may be taken), the process requests a close-up of the markings on the tire 211. An image is taken and received by the process 213, at which point the process determines whether the markings are legible 215. If the markings are not legible, the process may ask if the user wishes to manually input the wheel diameter 217.
Since it is possible that the markings on the tire will have been worn off over time or otherwise degraded, it may be impossible to read the diameter regardless of the number of photographs taken. In such a case, it may be desirable to manually input a wheel diameter 221. Otherwise, if the markings are available, a new image may be taken and provided 219 that more clearly shows the markings for reading by the process 223.
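As one illustration of what reading the markings could involve (not the embodiments' prescribed parser): a standard tire size code such as ST205/75R15 ends with the rim diameter in inches, so once an OCR step (not shown) has recognized the sidewall text, the wheel diameter can be extracted with a simple pattern match:

```python
# Minimal sketch: extract the wheel (rim) diameter from OCR'd sidewall text,
# assuming the tire carries a standard size code such as "ST205/75R15",
# where the final number is the rim diameter in inches.
import re

def wheel_diameter_inches(sidewall_text):
    """Return the rim diameter in inches parsed from a tire size code, or None."""
    # Matches codes like 205/75R15, ST205/75R15, or P215/65R15.
    match = re.search(r'\b[A-Z]{0,2}\d{3}/\d{2}\s*[RDB]\s*(\d{2})\b', sidewall_text)
    return int(match.group(1)) if match else None

# Example: wheel_diameter_inches("ST205/75R15 LOAD RANGE C") returns 15.
```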
The process can read the markings (on a legible tire) and determine the wheel diameter 223. Once the diameter is known, the process can examine the original image and recognize the distance between two opposite points on the wheel rim. That distance (i.e., the diameter) as represented in the picture can be used as a scale to then measure the distance from the center of the axle to the tongue of the trailer 225, 227. This information can then be relayed to a vehicle computing system for later use in trailer control during travel.
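The scale arithmetic itself is straightforward. The sketch below assumes the rim points, axle center, and tongue end have already been located in pixel coordinates (automatically or via the manual selection described below), and that the trailer side lies roughly parallel to the camera plane so that a single scale factor applies:

```python
# Minimal sketch of the scale step at 225/227: convert pixel distances to inches
# using the known wheel (rim) diameter as the reference length.
import math

def axle_to_tongue_distance(rim_point_a, rim_point_b, axle_center, tongue_end,
                            wheel_diameter_inches):
    """Estimate the real-world axle-to-tongue distance (inches) from pixel points."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    diameter_px = dist(rim_point_a, rim_point_b)        # rim diameter in pixels
    inches_per_pixel = wheel_diameter_inches / diameter_px
    return dist(axle_center, tongue_end) * inches_per_pixel

# Example: a 15-inch rim spanning 150 px gives 0.1 in/px, so an axle-to-tongue
# span of 1,200 px corresponds to roughly 120 inches (10 feet).
```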
If an image does not have identifiable points, the process may ask a user if they wish to manually identify the points 301. If manual input is desired, the process may display the first (full) image for the user 303. In this image, the user can select a tire, which can then be used for axle identification. The tire selection 305 will then be displayed in a zoomed fashion 307. If the axle is identifiable, the axle can be selected on the picture 309. If the axle is not yet clearly identifiable from the selection 311, the process may zoom in further.
Once the axle has been selected, the process zooms out the picture 313 and the user selects a region containing the tongue 315. Again, the process may zoom in on the tongue 317, and then the user can select the “end” of the tongue, representing the point on the attachable end of the trailer furthest from the axle 319. If the end is clearly identifiable once selected 321, the process can then display the points on the image 323. The user can then confirm that the points accurately represent the proper portions of the image 325.
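A desktop-style sketch of this manual fallback appears below, assuming matplotlib's interactive point picker as a stand-in for the zoom-and-tap interface a phone application would actually provide; the intermediate zooming steps are omitted, but the data collected is the same, namely pixel coordinates for the axle center and the tongue end:

```python
# Minimal sketch of the manual fallback: the user clicks the axle center and then
# the tongue end on the displayed trailer image; both pixel points are returned
# so they can be fed into the scale computation shown earlier.
import matplotlib.pyplot as plt
import matplotlib.image as mpimg

def pick_axle_and_tongue(trailer_image_path):
    """Let the user click the axle center, then the tongue end; return both points."""
    img = mpimg.imread(trailer_image_path)
    plt.imshow(img)
    plt.title("Click the axle center, then the tongue end")
    axle, tongue = plt.ginput(2, timeout=0)   # blocks until two clicks are made
    plt.close()
    return axle, tongue
```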
In this example, the process identifies a wheel diameter 407 and can visually show the diameter so that the user can confirm the correct identification was applied (e.g., so that the tire's outer diameter or some other dimension is not accidentally used instead). The process may also show a point representing the axle 409 and a marking representing the end-point of the trailer 411.
Using the represented distance shown by the diameter of the wheel, any other distance between two points in the image can be estimated. The process can thus estimate the distance between the center of the axle and the end of the trailer tongue 413.
In
Also, a name/identifier may be designated for the saved trailer 505. The system can also track accumulated miles 507, as well as provide other vehicle-related data such as gain 509 and current power output 511. Other suitable features may also be provided.
While exemplary embodiments are described above, it is not intended that these embodiments describe all possible forms of the invention. Rather, the words used in the specification are words of description rather than limitation, and it is understood that various changes may be made without departing from the spirit and scope of the invention. Additionally, the features of various implementing embodiments may be combined to form further embodiments of the invention.