Method and system for transmitting vehicle data using an automated voice

Information

  • Patent Grant
  • Patent Number
    9,674,683
  • Date Filed
    Monday, April 27, 2015
  • Date Issued
    Tuesday, June 6, 2017
Abstract
A vehicle communication system includes one or more processors configured to output a computer generated speech signal for at least a portion of vehicle parameter data based on at least one condition detection signal. The one or more processors are further configured to output the speech signal to a call center over a voice channel via a wireless communication. The one or more processors are further configured to receive a geographic location of a vehicle so that the speech signal is in a selected language based on a language associated with the geographic location of the vehicle.
Description
TECHNICAL FIELD

Embodiments of the present invention generally relate to a method and system for notifying emergency responders in the event of an automobile accident or other emergency.


BACKGROUND

An in-vehicle system (IVS) captures data such as location data and, in an emergency, automatically places a call to an emergency call taker or Public Safety Answering Point (PSAP) via a wireless telecommunications network. After a voice call session is established, the IVS transmits a predetermined control signal through the voice channel. The control signal directs the call taker system to prepare to receive data. Preferably, the control signal comprises at least one audio frequency tone. This may be done without human intervention. After transmission of essential information, the IVS may switch on audio connections for live human voice conversation.


An apparatus for providing useful data in association with a high-priority call such as an emergency call includes data (e.g., an MSD or FSD) embedded within one or more real-time protocol packets, such as RTP Control Protocol (RTCP) packets, that are interspersed within the voice or user data stream (carried in, e.g., RTP packets) of an emergency call. The apparatus transmits the data portion reliably from the initiating terminal (e.g., an in-vehicle system) to a PSAP by using the same transport connection as the user data.


A system and method to be used with first and second devices capable of communicating using a subset of different modulation schemes include optimizing transmission of data from the first device to the second device when receiving a trigger signal from the second device. The transmitted trigger signal includes data transmitted using a sequence of at least two of the modulation schemes. The system and method analyze the received trigger signal to identify one of the modulation schemes as a function of the received trigger signal as an initial modulation scheme to be used to transmit data to the second device and transmit the data from the first device to the second device. See, for example, U.S. Pat. App. 2010/0227584; U.S. Pat. App. 2010/0202368; and U.S. Pat. App. 2010/0069018.


SUMMARY

In at least one embodiment, a vehicle communication system enables one or more processors to receive an indication from emergency detection sensors positioned throughout the vehicle that an emergency event has taken place. An emergency condition detection signal may be sent automatically by a detection sensor, or manually by a vehicle occupant pushing an emergency assistance button. The processor may receive vehicle parameter data indicating one or more vehicle parameters including, but not limited to, vehicle global position coordinates. The processor may transmit a communication to an emergency response center that an emergency has been indicated at the vehicle. The vehicle parameter data may be transmitted to the emergency response center using a data transmission signal. If the data transmission has failed after several attempts, the processor may convert at least a portion of the vehicle parameter data to speech signals and communicate the speech signals to the emergency response center.


In at least one embodiment, a vehicle emergency response communication method enables several attempts and techniques for data transmission to an emergency call center. The method receives input from an emergency condition sensor indicating that at least one emergency condition detection signal has been enabled. The method receives vehicle parameter data indicating one or more vehicle parameters, and transmits a wireless communication to an emergency response center. The communication to the emergency response center indicates that an emergency condition has been detected at the vehicle and begins to transmit at least a portion of the vehicle parameter data. The vehicle parameter data is transmitted to the emergency response center using a data transmission signal including, but not limited to, data over voice. If the data transmission has failed after a predetermined threshold of retry attempts, the method may convert at least a portion of the vehicle parameter data to speech signals and communicate the speech signals to the emergency response center.


In at least one embodiment, a system enables a processor to receive an emergency condition input indicating that an emergency event has taken place at the vehicle. The processor may receive a dataset including, but not limited to, GPS coordinates, the number of passengers in the vehicle, a time stamp, a vehicle identification, a service provider identifier, and an e-call qualifier notifying the emergency response center that the emergency event has been manually or automatically initiated. Once the emergency condition input is received, the processor may transmit a wireless communication to an emergency response center through a wirelessly connected nomadic device. The wireless communication to the emergency response center may include the emergency condition at the vehicle and the dataset. The processor may detect a data transmission failure of the wireless communication to the emergency response center. Once the data transmission failure is detected, the processor may convert at least a portion of the dataset to voice. After a portion of the dataset is converted to voice, the processor may transmit the voice to the emergency response center over a voice channel.
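As a rough illustration, the dataset described above could be collected into a simple structure. The field names and types here are assumptions for the sketch, not drawn from the patent:

```python
from dataclasses import dataclass
import time

@dataclass
class EmergencyDataSet:
    """Illustrative container for the vehicle dataset described above.

    The patent lists the kinds of data (GPS coordinates, passenger
    count, time stamp, vehicle and service provider identifiers, and
    an e-call qualifier) without naming fields; these are hypothetical.
    """
    latitude: float
    longitude: float
    passenger_count: int
    timestamp: float
    vehicle_id: str
    service_provider_id: str
    manually_initiated: bool  # e-call qualifier: manual vs. automatic

def build_data_set(gps, passengers, vin, provider, manual):
    """Assemble the dataset at the moment the emergency input is received."""
    lat, lon = gps
    return EmergencyDataSet(lat, lon, passengers, time.time(),
                            vin, provider, manual)
```

The e-call qualifier is modeled as a boolean distinguishing a manually initiated call from an automatically initiated one.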


In at least one embodiment, a vehicle communication system includes one or more processors configured to output a speech signal for at least a portion of vehicle parameter data based on at least one emergency condition detection signal. The one or more processors are further configured to output the speech signal to an emergency response center over a voice channel via a wireless communication. The one or more processors are further configured to receive a geographic location of a vehicle so that the speech signal is in a selected language based on a language spoken in the geographic location.


In at least one embodiment, a method for a vehicle computer to communicate speech signals to an emergency response center includes receiving, via the vehicle computer, at least one emergency condition detection signal. The method further includes receiving data indicating one or more vehicle parameters and converting at least a portion of the data to speech signals. The method may convert the speech signals based on a geographic location of a vehicle so that the speech signal is in a selected language based on a language spoken in the geographic location. The method further includes communicating the speech signals to an emergency response center over a voice channel.


In at least one embodiment, a system includes at least one processor configured to receive at least one emergency condition detection signal. The at least one processor is further configured to receive data indicating one or more vehicle parameters and to convert at least a portion of the data to speech signals. The at least one processor is further configured to convert the at least a portion of data based on a geographic location of a vehicle so that the speech signal is in a selected language based on a language spoken in the geographic location. The at least one processor is further configured to output the speech signals to an emergency response center over a voice channel.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is an exemplary block topology of a vehicle infotainment system implementing a user-interactive vehicle information display system;



FIG. 2 is an exemplary block topology of a vehicle computing system for notifying an emergency responder of an automobile accident or other emergency;



FIG. 3 is a flow diagram illustrating an example process for implementing embodiments of the present invention;



FIG. 4 is a flow chart illustrative of a vehicle computing system for notifying an emergency responder of an automobile accident or other emergency; and



FIG. 5 is a flow chart illustrative of a vehicle computing system for determining a language to send a message to an emergency responder.





DETAILED DESCRIPTION

As required, detailed embodiments of the present invention are disclosed herein; however, it is to be understood that the disclosed embodiments are merely exemplary of the invention that may be embodied in various and alternative forms. The figures are not necessarily to scale; some features may be exaggerated or minimized to show details of particular components. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for teaching one skilled in the art to variously employ the present invention.



FIG. 1 illustrates an example block topology for a vehicle based computing system 1 (VCS) for a vehicle 31. An example of such a vehicle-based computing system 1 is the SYNC system manufactured by THE FORD MOTOR COMPANY. A vehicle enabled with a vehicle-based computing system may contain a visual front end interface 4 located in the vehicle. The user may also be able to interact with the interface if it is provided, for example, with a touch sensitive screen. In another illustrative embodiment, the interaction occurs through button presses or a spoken dialog system with automatic speech recognition and speech synthesis.


In the illustrative embodiment 1 shown in FIG. 1, a processor 3 controls at least some portion of the operation of the vehicle-based computing system. Provided within the vehicle, the processor allows onboard processing of commands and routines. Further, the processor is connected to both non-persistent 5 and persistent storage 7. In this illustrative embodiment, the non-persistent storage is random access memory (RAM) and the persistent storage is a hard disk drive (HDD) or flash memory. In general, persistent (non-transitory) memory can include all forms of memory that maintain data when a computer or other device is powered down. These include, but are not limited to, HDDs, CDs, DVDs, magnetic tapes, solid state drives, portable USB drives and any other suitable form of persistent memory.


The processor is also provided with a number of different inputs allowing the user to interface with the processor. In this illustrative embodiment, a microphone 29, an auxiliary input 25 (for input 33), a USB input 23, a GPS input 24, screen 4, which may be a touchscreen display, and a BLUETOOTH input 15 are all provided. An input selector 51 is also provided, to allow a user to swap between various inputs. Input to both the microphone and the auxiliary connector is converted from analog to digital by a converter 27 before being passed to the processor. Although not shown, many of the vehicle components and auxiliary components in communication with the VCS may use a vehicle network (such as, but not limited to, a CAN bus) to pass data to and from the VCS (or components thereof).


Outputs to the system can include, but are not limited to, a visual display 4 and a speaker 13 or stereo system output. The speaker is connected to an amplifier 11 and receives its signal from the processor 3 through a digital-to-analog converter 9. Output can also be made to a remote BLUETOOTH device such as PND 54 or a USB device such as vehicle navigation device 60 along the bi-directional data streams shown at 19 and 21 respectively.


In one illustrative embodiment, the system 1 uses the BLUETOOTH transceiver 15 to communicate 17 with a user's nomadic device 53 (e.g., cell phone, smart phone, PDA, or any other device having wireless remote network connectivity). The nomadic device can then be used to communicate 59 with a network 61 outside the vehicle 31 through, for example, communication 55 with a cellular tower 57. In some embodiments, tower 57 may be a WiFi access point.


Exemplary communication between the nomadic device and the BLUETOOTH transceiver is represented by signal 14.


Pairing a nomadic device 53 and the BLUETOOTH transceiver 15 can be instructed through a button 52 or similar input. Accordingly, the CPU is instructed that the onboard BLUETOOTH transceiver will be paired with a BLUETOOTH transceiver in a nomadic device.


Data may be communicated between CPU 3 and network 61 utilizing, for example, a data-plan, data over voice, or DTMF tones associated with nomadic device 53. Alternatively, it may be desirable to include an onboard modem 63 having antenna 18 in order to communicate 16 data between CPU 3 and network 61 over the voice band. The nomadic device 53 can then be used to communicate 59 with a network 61 outside the vehicle 31 through, for example, communication 55 with a cellular tower 57. In some embodiments, the modem 63 may establish communication 20 with the tower 57 for communicating with network 61. As a non-limiting example, modem 63 may be a USB cellular modem and communication 20 may be cellular communication.
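The DTMF option mentioned above relies on standard touch-tone signaling, in which each keypad digit is transmitted as the sum of two sine tones at fixed frequencies. A minimal sketch of how one such tone burst could be generated (the sample rate and duration are illustrative choices, not values from the patent):

```python
import math

# Standard DTMF frequency pairs (Hz) for the 12-key telephone keypad:
# rows at 697/770/852/941 Hz, columns at 1209/1336/1477 Hz.
DTMF = {
    "1": (697, 1209), "2": (697, 1336), "3": (697, 1477),
    "4": (770, 1209), "5": (770, 1336), "6": (770, 1477),
    "7": (852, 1209), "8": (852, 1336), "9": (852, 1477),
    "*": (941, 1209), "0": (941, 1336), "#": (941, 1477),
}

def dtmf_samples(digit, duration=0.1, rate=8000):
    """Generate PCM samples for one DTMF digit: sum of its two sine tones,
    scaled by 0.5 to keep the combined amplitude within [-1, 1]."""
    low, high = DTMF[digit]
    n = int(duration * rate)
    return [0.5 * (math.sin(2 * math.pi * low * i / rate) +
                   math.sin(2 * math.pi * high * i / rate))
            for i in range(n)]
```

A digit string such as a phone number or encoded payload would be sent as a sequence of these bursts separated by short silences.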


In one illustrative embodiment, the processor is provided with an operating system including an API to communicate with modem application software. The modem application software may access an embedded module or firmware on the BLUETOOTH transceiver to complete wireless communication with a remote BLUETOOTH transceiver (such as that found in a nomadic device). Bluetooth is a subset of the IEEE 802 PAN (personal area network) protocols. IEEE 802 LAN (local area network) protocols include WiFi and have considerable cross-functionality with IEEE 802 PAN. Both are suitable for wireless communication within a vehicle. Other communication means that can be used in this realm are free-space optical communication (such as IrDA) and non-standardized consumer IR protocols.


In another embodiment, nomadic device 53 includes a modem for voice band or broadband data communication. In the data-over-voice embodiment, a technique known as frequency division multiplexing may be implemented so that the owner of the nomadic device can talk over the device while data is being transferred. At other times, when the owner is not using the device, the data transfer can use the whole bandwidth (300 Hz to 3.4 kHz in one example). While frequency division multiplexing may be common for analog cellular communication between the vehicle and the internet, and is still used, it has been largely replaced by hybrids of Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), and Space Division Multiple Access (SDMA) for digital cellular communication. These are all ITU IMT-2000 (3G) compliant standards and offer data rates up to 2 Mbps for stationary or walking users and 384 kbps for users in a moving vehicle. 3G standards are now being replaced by IMT-Advanced (4G), which offers 100 Mbps for users in a vehicle and 1 Gbps for stationary users. If the user has a data-plan associated with the nomadic device, it is possible that the data-plan allows for broad-band transmission and the system could use a much wider bandwidth (speeding up data transfer). In still another embodiment, nomadic device 53 is replaced with a cellular communication device (not shown) that is installed to vehicle 31. In yet another embodiment, the ND 53 may be a wireless local area network (LAN) device capable of communication over, for example (and without limitation), an 802.11g network (i.e., WiFi) or a WiMax network.


In one embodiment, incoming data can be passed through the nomadic device via a data-over-voice or data-plan, through the onboard BLUETOOTH transceiver and into the vehicle's internal processor 3. In the case of certain temporary data, for example, the data can be stored on the HDD or other storage media 7 until such time as the data is no longer needed.


Additional sources that may interface with the vehicle include a personal navigation device 54, having, for example, a USB connection 56 and/or an antenna 58, a vehicle navigation device 60 having a USB 62 or other connection, an onboard GPS device 24, or remote navigation system (not shown) having connectivity to network 61. USB is one of a class of serial networking protocols. IEEE 1394 (FireWire™ (Apple), i.LINK™ (Sony), and Lynx™ (Texas Instruments)), EIA (Electronics Industry Association) serial protocols, IEEE 1284 (Centronics Port), S/PDIF (Sony/Philips Digital Interconnect Format) and USB-IF (USB Implementers Forum) form the backbone of the device-device serial standards. Most of the protocols can be implemented for either electrical or optical communication.


Further, the CPU could be in communication with a variety of other auxiliary devices 65. These devices can be connected through a wireless 67 or wired 69 connection. Auxiliary devices 65 may include, but are not limited to, personal media players, wireless health devices, portable computers, and the like.


Also, or alternatively, the CPU could be connected to a vehicle based wireless router 73, using for example a WiFi (IEEE 802.11) 71 transceiver. This could allow the CPU to connect to remote networks in range of the local router 73.


In addition to having exemplary processes executed by a vehicle computing system located in a vehicle, in certain embodiments, the exemplary processes may be executed by a computing system in communication with a vehicle computing system. Such a system may include, but is not limited to, a wireless device (e.g., and without limitation, a mobile phone) or a remote computing system (e.g., and without limitation, a server) connected through the wireless device. Collectively, such systems may be referred to as vehicle associated computing systems (VACS). In certain embodiments particular components of the VACS may perform particular portions of a process depending on the particular implementation of the system. By way of example and not limitation, if a process has a step of sending or receiving information with a paired wireless device, then it is likely that the wireless device is not performing the process, since the wireless device would not “send and receive” information with itself. One of ordinary skill in the art will understand when it is inappropriate to apply a particular VACS to a given solution. In all solutions, it is contemplated that at least the vehicle computing system (VCS) located within the vehicle itself is capable of performing the exemplary processes.



FIG. 2 is an exemplary block topology of a vehicle computing system for notifying an emergency responder of an automobile accident or other emergency. Automobile accidents may be detected using one or more accelerometers and/or impact detecting devices mounted throughout the vehicle 201. The impact detecting devices 203 may include, but are not limited to, air bag deployment sensors, vehicle impact sensors, dash impact sensors, seat/occupant sensors, rollover sensors, flame/heat sensors, gasoline sensors and an occupant-activated panic button. The vehicle system architecture may comprise sub-systems, some of which may be interconnected by a vehicle network such as a Controller Area Network or other suitable communication network monitoring the one or more accelerometers and impact detecting sensors 203 within the vehicle 201.


The VCS may have one or more processors communicating with several subsystems receiving signals from detection sensors including a wide variety of different interconnections among subsystems and external communication networks. For example, a wired connection may be established between a cellular telephone and data processor, voice synthesizer, and/or DTMF interface. In another example, a processor may be connected directly or indirectly to emergency sensor modules, and may monitor the ports to which the emergency sensor modules are attached instead of the vehicle network.


In one embodiment, a nomadic device 207 communicating with the VCS 205 using BLUETOOTH technology may establish wireless communication with a terrestrial tower 211. The terrestrial tower 211 in turn establishes communication through telephone switching network 213 with an emergency call center 219. Emergency call center 219 may include police, ambulance, a 911 or a 112 (the European version of 911) public safety answering point, or a call center.


In another embodiment, an embedded cell phone within the VCS 205 may establish direct communication 209 with a terrestrial tower 211. Data may be uploaded and downloaded between the VCS 205 and the emergency call center 219.


In one illustrative embodiment, the VCS 205 may communicate with a wireless device, or a remote computing system connected through the wireless device, for communication to the emergency call center 219. The wireless device may include, but is not limited to, an embedded cellular modem, embedded WiFi device 217, Bluetooth transmitter, Near Field Communication device, brought-in cellular device like a USB modem, MiFi, smartphone that may be connected to the vehicle through SYNC or other Bluetooth pairing device, or a PC network 215 that may be connected to the vehicle through SYNC or other Bluetooth pairing device. The VCS may wirelessly communicate data with the emergency call center 219 with the use of a wireless device. Once the vehicle system has enabled communication with the emergency call center 219, emergency information can be exchanged with an operator 223 and/or an emergency response computer terminal 221.


The VCS may also communicate with a network having associated storage hosting a plurality of web pages for internet access by a plurality of browsers, including but not limited to emergency responder(s), cellular telephone owner(s), healthcare providers, etc. Some browsers, such as cellular telephone owners, may upload data over the Internet to storage, and other browsers, such as emergency responders, may download data. The data may be uploaded and downloaded using several types of transmission mediums including, but not limited to, narrowband, broadband, and/or voice over internet protocol.


The emergency call center 219 may receive a transmission of a set of data about the vehicle accident including, but not limited to, a geographic location of the vehicle. In one embodiment, a method for transmitting this information may include, but is not limited to, in-band modem or data-over-voice. Once the information is received by the emergency call center 219, the information may be presented at an emergency responder's computer terminal 221. Once the set of data has been transmitted to the emergency call center, the voice line may be opened allowing the passengers in the vehicle to communicate to the emergency call center operator 223.



FIG. 3 is a flow diagram illustrating a process for implementing various embodiments. The process describes communicating electronic vehicle parameter data, such as the number of occupants in a vehicle and the time and location of the collision, to authorities in the event a collision is detected. The process describes various methods to communicate a set of data from a vehicle to the emergency call center including, but not limited to, in-band modem, data over voice, computer-to-computer communication by transmitting the data converted to binary digits, and/or digitally generated voice data communicated to an operator. The process allows a data set to be communicated by computer-generated spoken word if the data over voice transmission has failed.


In some instances during transmission of the data set using data over voice, critical location data may not be transmitted, preventing the emergency call center from receiving the location of the crash. Similarly, after several failed attempts to send the critical location data using data over voice, the system may open the voice line; however, this is ineffective if the vehicle occupants are unable to communicate with the operator. To solve this problem, the system may generate spoken word of the data set, including the critical location data, after several attempts to send the data using data over voice have failed.
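The retry-then-fallback behavior described above can be sketched as a small control loop. The two callables standing in for the data-over-voice modem and the voice synthesizer are hypothetical placeholders:

```python
def transmit_with_fallback(data_set, send_data_over_voice, speak,
                           max_attempts=3):
    """Sketch of the fallback described above: retry data-over-voice up
    to max_attempts times, then fall back to synthesized speech.

    send_data_over_voice and speak are hypothetical callables standing
    in for the modem and voice-synthesis subsystems; the former returns
    True on a verified successful transfer.
    """
    for _ in range(max_attempts):
        if send_data_over_voice(data_set):
            return "data-over-voice"
    # All data attempts failed: speak the data set over the voice channel.
    speak(data_set)
    return "speech"
```

In either outcome, the system would then open the voice line so occupants can talk with the call-center operator.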


At step 301, the VCS or vehicle subsystem may monitor one or more accelerometers and/or impact detecting devices mounted within the vehicle or on a nomadic device. Based on the analysis of one or more of these sensors, the VCS or vehicle subsystem may detect an accident. Once an accident has been detected, the VCS may communicate to a nomadic device that an accident has been detected and begin initialization of data to be transmitted to an emergency call center. The VCS may begin to gather a set of data to transmit to the call center including, but not limited to, GPS coordinates, the number of passengers in the vehicle, a time stamp, a vehicle identification, a service provider identifier, and an e-call qualifier notifying the call center that the call has been manually or automatically initiated.


At step 303, upon receipt of an emergency notification signal, a device may initialize communication with the emergency call center. The device may have a local communication link established with the VCS. The link may be a BLUETOOTH piconet, or other suitable short-range wired or wireless network. The status of the link may include the connectivity of the paired cellular telephone, the signal strength, the identity of other available devices, etc. The link status may be reported by LCD display, LED, or audibly to the vehicle passengers. Preferably, a warning or other notification is provided to passengers within the vehicle compartment when a link is disrupted, or when no link is available. The device may include, but is not limited to, a cellular telephone, smart phone, tablet, laptop, or other device that may communicate between the VCS and an emergency call center.


At step 305, the system may notify occupants of the vehicle that a 911 or 112 emergency call to one or more emergency responders or other contacts is going to be made at the device. Occupant notification may be done audibly using a voice synthesizer and speaker, which may or may not be a component of the vehicle sound system. The system may automatically contact the emergency call center, for example, by dialing 911 (or 112 in Europe) if one or more emergency sensors detect an accident. A vehicle occupant may manually initiate the VCS to contact an emergency call center by pressing a button within the vehicle or saying “911”.


At step 307, once connected with the emergency call center, the VCS may begin to communicate the set of data to the emergency call center. In one illustrative embodiment, the set of data may be transmitted to the emergency call center using data-over-voice communication. The receiving modem at the emergency call center may accept the data set from the device at step 309. The system may verify whether the set of data has been transferred successfully at step 311. While not specifically illustrated, the VCS may try to reconnect a predefined number of times if the system fails to successfully transmit the data as determined at 311.


At step 313, once the data has been successfully transferred to the emergency call center modem, the data may be presented on a computer terminal. The system may verify if the data is complete at the emergency call center computer terminal at step 315. If the data is incomplete at the computer terminal, the system may again request the information from the receiving modem. In one illustrative embodiment, the emergency call center may notify the device that the data set is incomplete. Once the data set has been completely transferred to the emergency call center computer terminal, the system may open the device voice line allowing the vehicle occupants to communicate with the emergency call center operator at step 317.


At step 311, if the data set transmission has failed, the system may employ a voice message of the data set. A voice synthesis system may allow the data set to be translated into spoken word at step 319. The voice synthesis system allows the VCS to interact with the emergency call center operator. In an illustrative example, the GPS coordinates to be transferred are synthesized into spoken word, allowing the emergency call center operator to send help to the appropriate location at step 325. If the voice message is incomplete, the message may be played one or more times at step 327. The message replay may be based on the emergency call center operator's instructions or request. Once the data set has been completely transferred by spoken word to the emergency call center, the system may open the device voice line allowing the vehicle occupants to communicate with the emergency call center operator at step 317.
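One way a voice synthesis front end might render GPS coordinates as digit-by-digit spoken text is sketched below. The exact phrasing is an assumption for illustration; the patent does not prescribe a format:

```python
def coordinates_to_speech_text(lat, lon):
    """Render GPS coordinates as digit-by-digit text that a voice
    synthesizer could read aloud to a call-taker. Hypothetical format."""
    words = {"0": "zero", "1": "one", "2": "two", "3": "three",
             "4": "four", "5": "five", "6": "six", "7": "seven",
             "8": "eight", "9": "nine", ".": "point", "-": "minus"}

    def spell(value):
        # Fix the precision so every character has a spoken-word mapping.
        return " ".join(words[c] for c in f"{value:.5f}")

    return f"latitude {spell(lat)}, longitude {spell(lon)}"
```

Spelling each digit individually, rather than reading the number as a whole, reduces the chance of the call-taker mishearing a coordinate over a noisy voice channel.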


At step 321, if the system does not include a voice synthesizer, or an error has occurred during translation into spoken word, the system may turn the data set into digits. An example may be turning the GPS coordinates into binary code, allowing the emergency call center computer to receive the information and present the data set on the emergency call center computer terminal. The system may validate whether the data set has been completely received at step 323. Once the data set has been completely transferred by digits to the emergency call center, the system may open the device voice line allowing the vehicle occupants to communicate with the emergency call center operator at step 317.
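Turning the GPS coordinates into digits, as described above, could look like the following fixed-point encoding. The field widths, scale factor, and sign convention are illustrative assumptions, not a format defined by the patent:

```python
def coordinates_to_digits(lat, lon, scale=100000):
    """Encode coordinates as a fixed-point digit string suitable for
    digit-based transmission. Each value becomes one sign digit
    (0 = positive, 1 = negative) plus eight magnitude digits."""
    def enc(value):
        fixed = round(value * scale)
        sign = "1" if fixed < 0 else "0"
        return sign + str(abs(fixed)).zfill(8)
    return enc(lat) + enc(lon)

def digits_to_coordinates(payload, scale=100000):
    """Inverse decoding, as the call-center computer might perform."""
    def dec(chunk):
        sign = -1 if chunk[0] == "1" else 1
        return sign * int(chunk[1:]) / scale
    return dec(payload[:9]), dec(payload[9:])
```

A scale of 100000 preserves five decimal places, roughly one-meter resolution at the equator, which is comfortably enough to direct responders to a crash site.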



FIG. 4 is a flow chart illustrative of a vehicle computing system with one or more processors configured to notify an emergency responder of an automobile accident or other emergency. Once an accident is detected by the VCS, a set of data may be sent to an emergency call center. The set of data being transmitted to the emergency call center may include, but is not limited to, Global Positioning System (GPS) coordinates. The transmission of data may be accomplished using data over voice. The following illustrative flow chart describes a method for employing a voice message of data when transmission of the data set is not successful.


At step 401, the VCS may monitor one or more accelerometers and/or impact detecting devices mounted within the vehicle or on a nomadic device. Based on the analysis of one or more of these accident detection sensors, the VCS may detect an accident. After an accident has been detected, the VCS may begin initialization of data to be transmitted to an emergency call center and communicate to a nomadic device that an accident is detected.


At step 403, a nomadic device may receive a request from the VCS to contact an emergency call center. The nomadic device may include, but is not limited to, an embedded cellular modem, embedded Wi-Fi device, Bluetooth transmitter, Near Field Communication device, brought-in cellular device such as a USB modem, MiFi, smartphone that may be connected to the vehicle through SYNC or other Bluetooth pairing device, or a PC that may be connected to the vehicle through SYNC or other Bluetooth pairing device. The nomadic device may detect connection with the emergency call center at step 405. If the nomadic device fails to connect with the emergency call center, it may retry contacting the call center at step 403.


At step 407, once the nomadic device is connected to the emergency call center, the VCS may begin to wirelessly transmit the data set to the emergency call center through the nomadic device. If the data set has been transmitted successfully, the voice line is opened to allow the vehicle occupants to communicate with the emergency call center operator or other emergency responder at step 421.


At step 411, the VCS may attempt several retries of transmitting data to the emergency call center based on a predefined calibrated threshold if the transmission of the set of data using data over voice has failed. An example of a predefined calibrated threshold may be three attempts to transmit the set of data using data over voice. After several attempts have been made to transmit the data set using data over voice, the VCS may determine the location of the vehicle and begin to select a language for translating the data set into spoken word based on vehicle location at step 412. Once the location is determined, the VCS may convert the data set into words and/or phrases that may need to be translated. For example, converting the data set into spoken words based on location may be useful in the European Union, where several languages are commonly spoken. In another example, several dialects of the official language may be spoken depending on a particular region, such as in China or India.
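The retry-then-fallback logic above might look like the following sketch; the routine names and the three-attempt threshold are illustrative stand-ins for the VCS internals:

```python
# Hedged sketch of steps 411-412: try data-over-voice a calibrated number
# of times, then fall back to a spoken-word message of the data set.

MAX_DATA_ATTEMPTS = 3  # example of a predefined calibrated threshold

def send_data_set(data_set, transmit_data_over_voice, fall_back_to_speech):
    """Attempt data-over-voice up to MAX_DATA_ATTEMPTS times.

    transmit_data_over_voice(data_set) -> bool reports per-attempt success;
    fall_back_to_speech(data_set) hands the data to the voice-message path.
    Returns a label naming the path that was used.
    """
    for _ in range(MAX_DATA_ATTEMPTS):
        if transmit_data_over_voice(data_set):
            return "data-over-voice"
    # All attempts failed: switch to the spoken-word message (step 412).
    fall_back_to_speech(data_set)
    return "spoken-word"
```
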


At step 413, once the location has been determined and a language selected, the VCS may connect the voice line with the emergency call center. Once the voice line is connected, the VCS may transmit the voice message of the data set through the nomadic device to the emergency call center at step 417.


In another illustrative embodiment, the VCS may convert the data set into computer language or text message to communicate with the emergency call center computer system. The information may be converted at the emergency call center computer terminal for display to an operator or other emergency responder.


At step 419, the system may determine whether the complete set of data in spoken word has been transmitted to the emergency call center. If the system detects that the set of data in spoken word is not complete, the system may retry sending the voice message. Once the complete voice message has been transmitted, the system may open the voice line allowing the vehicle occupants to communicate with the emergency call center operator at step 421.



FIG. 5 is a flow chart illustrative of a vehicle computing system determining a language to send a message to an emergency responder. The VCS may automatically provision an emergency call language based on a determination that a new language is appropriate based on, for example, a vehicle location. Although a user could drive for a thousand miles in the United States and never cross a national border, such a trip in Europe, for example, is almost certain to cross one or more national borders. While it may be common for citizens of Europe to each speak several languages, it is not desirable to rely on the ability of an emergency operator to speak the language of a user, when the user is outside of a home country.


For example, if the user originated in France, then the user may have set the vehicle computing system to respond to and speak in French. This could be the user's own language, or a common international language, such as English, and may also be the language of the Emergency Operator for emergency phone calls placed while in France.


If the user then travels to Germany, while it may be possible that a particular German Emergency Operator could speak French, it is certainly not preferable to rely on such an occurrence in the event of a vehicle emergency, such as an accident. Further, if the operator does not speak French, not only will the vehicle communication system be unable to successfully communicate the vehicle parameter data, converted to spoken word in French, to the operator, but if the driver speaks only French, even an open line between the driver and the operator will be to no avail.


In this illustrative embodiment, the vehicle communication system can automatically switch to a local language, so that emergency communication is possible between the operator and the vehicle, even if no one in the vehicle speaks the appropriate language.


When a cellular phone, or other nomadic device, connects to a cellular tower, in Europe, for example, configuration information may be passed between the tower and the device. This information can be used to establish a connection between the tower and the device, and may also contain a code (such as a mobile country code (MCC)) establishing the country of origin of the tower (or some other designation based on a code).


In this illustrative embodiment, continuing from a vehicle computing system notifying an emergency responder of an automobile accident from step 411, a vehicle computing system polls a paired nomadic device periodically to obtain at least a designation code at step 501. The system may open a connection to the nomadic device. The system then checks to see if cellular communication information is stored within the device. For example, if the device stores a configuration packet received from a cellular tower or other communication point, then the system may be able to retrieve that packet from the device.


At step 503, if the location is found, the system may proceed to translate the data set into the appropriate language for a voice message transmission to an emergency call center. The data set voice message may be translated into the language of the vehicle's location at step 509.


At step 505, if there is no such data stored locally in the device, then the system may instruct the nomadic device to initiate communication with a cellular tower or other communication point, in order to receive a configuration packet.


The configuration packet is then transferred from the nomadic device to the VCS. Based on an origin code or other designation, the VCS can determine the local language of choice. In this illustrative example, a lookup table is used for this determination, although other suitable methods may also be used.
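The lookup-table determination might be sketched as follows. The table below is a small illustrative subset of the ITU-T E.212 mobile country code assignments, and the default-language fallback corresponds to step 511; the names are assumptions for illustration:

```python
# MCC -> language lookup, as one possible realization of the origin-code
# determination. The mapping shown is a small sample, not exhaustive.

MCC_TO_LANGUAGE = {
    208: "fr",  # France
    262: "de",  # Germany
    214: "es",  # Spain
    222: "it",  # Italy
    310: "en",  # United States
}

DEFAULT_LANGUAGE = "en"  # stand-in for the operator-preset default (step 511)

def select_language(mcc):
    """Map a mobile country code from the tower's configuration data to a
    local language, falling back to the preset default when unknown."""
    if mcc is None:
        return DEFAULT_LANGUAGE
    return MCC_TO_LANGUAGE.get(mcc, DEFAULT_LANGUAGE)
```
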


At step 507, once a local language is determined, the vehicle computing system can setup access to, for example, a preset database of words in that language. The VCS may implement a number of look-up tables in the software to determine word translations for the dataset variables in the selected language of the vehicle location. In the event an emergency call is placed, the system can draw on this database to communicate with an emergency operator and/or translate the data set to the appropriate voice message language at step 509.
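One possible shape for the per-language look-up tables is sketched below; the phrase keys and wording are assumptions for illustration, not the patent's actual database:

```python
# Hypothetical per-language phrase tables used to render the data set as a
# spoken-word message (step 509). Unknown languages fall back to English.

PHRASES = {
    "en": {"location": "Vehicle location", "lat": "latitude", "lon": "longitude"},
    "de": {"location": "Fahrzeugposition", "lat": "Breitengrad", "lon": "Laengengrad"},
}

def render_voice_message(language: str, latitude: float, longitude: float) -> str:
    """Build the text handed to the speech synthesizer in the selected language."""
    p = PHRASES.get(language, PHRASES["en"])
    return f"{p['location']}: {p['lat']} {latitude}, {p['lon']} {longitude}"
```
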


At step 511, if a local language is not determined, the vehicle computing system can establish a default language for the voice message conveying the set of data. The default language may be preset by the vehicle operator during VCS setup. Once the default language is set, the set of data may be translated into the default language and readied for transmission to the emergency call center at step 509.


In this illustrative embodiment, fully switching between language packs when the local language changes is not the preferred option. It can take up to a minute to switch the language of the vehicle computing system to another installed language pack. Further, the language option for the local language may not be presently installed in the VCS. Installing the language could require a charge, a lengthy download, or even possibly physical insertion of a persistent memory device containing a new language.


While exemplary embodiments are described above, it is not intended that these embodiments describe all possible forms of the invention. Rather, the words used in the specification are words of description rather than limitation, and it is understood that various changes may be made without departing from the spirit and scope of the invention. Additionally, the features of various implementing embodiments may be combined to form further embodiments of the invention.

Claims
  • 1. A vehicle communication system, comprising: a processor configured to: in response to an emergency condition detection signal, output a computer generated speech signal for vehicle parameter data to an emergency response center over a voice channel via a wireless communication, the speech signal being in a language selected based on a geographic location of a vehicle; and transmit the vehicle parameter data as digits if an error occurs during generation of the speech signal.
  • 2. The vehicle communication system of claim 1, wherein the processor is further configured to: transmit the vehicle parameter data that indicates one or more vehicle parameters via the wireless communication to the emergency response center using data transmission signaling.
  • 3. The vehicle communication system of claim 2, wherein the data transmission signaling includes data over voice.
  • 4. The vehicle communication system of claim 1, wherein the processor is additionally configured to: establish communication between the vehicle communication system and a nomadic device; and establish communication between the emergency response center and the vehicle communication system through the nomadic device in response to the emergency condition detection signal.
  • 5. The vehicle communication system of claim 1, wherein the language is selected based on a look-up table.
  • 6. The vehicle communication system of claim 1, wherein the wireless communication includes a cellular phone.
  • 7. The vehicle communication system of claim 1, wherein the at least a portion of vehicle parameter data includes global position system coordinates of the vehicle.
  • 8. A method comprising: receiving, via a vehicle computer, at least one emergency condition detection signal; receiving data indicating one or more vehicle parameters; converting at least a portion of the data to speech signals based on a geographic location of a vehicle so that the speech signals are in a language associated with the geographic location; communicating the speech signals to an emergency response center over a voice channel; and transmitting the data as digits to the emergency response center over data transmission signaling if an error occurs during conversion of the data to the speech signals.
  • 9. The method of claim 8, further comprising: establishing communication between the vehicle computer and a nomadic device; and establishing communication between the emergency response center and the vehicle computer through the nomadic device.
  • 10. The method of claim 9, wherein the nomadic device includes a cellular telephone.
  • 11. The method of claim 9, wherein the communication between the vehicle computer and the nomadic device includes Bluetooth technology.
  • 12. The method of claim 8, wherein the converting of at least a portion of the data to speech signals includes: determining a word or phrase to be communicated to an emergency operator from the vehicle computer; looking up the determined word or phrase in a lookup table to determine a corresponding sound bite to be played; and playing the determined corresponding sound bite over an outgoing communication with the emergency operator via the voice channel, such that the sound bite communicates the determined phrase in the language associated with the geographic location of the vehicle to the emergency operator.
  • 13. A system comprising: a processor configured to: receive data indicating one or more vehicle parameters; convert at least a portion of the data to speech signals based on a language selected based on a geographic location of a vehicle; output the speech signals to a call center over a voice channel; and output the data as digits to the call center if an error occurs during conversion of the data to the speech signals.
  • 14. The system of claim 13 wherein the data includes one or more of GPS coordinates, number of passengers in a vehicle, time stamp, vehicle identification, service provider identifier, and an indication notifying the call center that at least one emergency condition signal has been manually or automatically initiated.
  • 15. The system of claim 13, wherein the processor is additionally configured to convert the at least a portion of data received from the one or more vehicle parameters to a text message.
  • 16. The system of claim 13, wherein the at least a portion of the data to speech signals is converted by: determining a word or phrase to be communicated to an emergency operator from the processor from a lookup table; and transmitting a sound associated with the word or phrase over the voice channel, such that the sound communicates the determined word or phrase in the selected language to the call center.
  • 17. The system of claim 13, wherein the processor is additionally configured to open a line of communication allowing an emergency call center operator to communicate with occupants of a vehicle.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 13/748,636 filed Jan. 24, 2013, now issued as U.S. Pat. No. 9,049,584, the disclosure of which is hereby incorporated in its entirety by reference herein.

Related Publications (1)
Number Date Country
20150245190 A1 Aug 2015 US
Continuations (1)
Number Date Country
Parent 13748636 Jan 2013 US
Child 14697197 US