Vehicle immersive communication system

Information

  • Patent Grant
  • Patent Number
    9,930,158
  • Date Filed
    Monday, February 8, 2010
  • Date Issued
    Tuesday, March 27, 2018
Abstract
A vehicle communication system facilitates hands-free interaction with a mobile device in a vehicle or elsewhere. Users interact with the system by speaking to it. The system converts textual messages to speech and recognizes voice commands. The system supports Bluetooth wireless technology for hands-free use. The system handles telephone calls, email, and SMS text messages. The user can customize the device via a user profile stored on an Internet web server.
Description
BACKGROUND OF THE INVENTION

This invention relates to a system for managing and communicating information while in a vehicle. More specifically, this invention relates to a system that integrates with a cell phone, PDA, or other mobile device to provide hands-free use of phone calls, email, text messaging, and other functionality of a mobile device.


Exchanging critical information using email, instant messaging, and other online media is essential to succeed in today's connected lifestyles and business environments. We depend on constant connectivity for important emails, timely updates, and to make sound decisions. Unfortunately, managing this online information on a mobile device or visible screen can be extremely difficult and dangerous while driving.


In order to address these safety hazards, many states have enacted legislation to restrict the use of cell phones and other mobile devices while in the car. In light of this, hands-free devices for cell phones have become increasingly popular. However, many users are still distracted while trying to drive and operate their wireless devices.


An arrangement for safely managing and communicating a variety of information while in a vehicle is needed.


SUMMARY OF THE INVENTION

This invention addresses this need by providing a convenient and safe hands-free interface to manage important online information while enhancing the driving experience. Rather than awkwardly reaching for a handheld device and looking away from the road to read a new email, this invention integrates seamlessly in a vehicle to read important information out loud, directly to the driver. A voice-based interface provides unified access to all communication needs while allowing the driver to focus their attention on the road.


This invention provides a small device that wirelessly interacts directly with mobile devices and vehicle hands-free audio systems or headsets and allows the driver to listen to and manage email, quickly respond to email over the phone, compose SMS messages, and answer and return phone calls. The driver can talk and listen to it, and remain focused on driving and navigation of the vehicle. A built-in intelligent information manager automatically composes appropriate responses while parsing and prioritizing incoming information to ensure that only the most important messages are heard first.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 schematically illustrates a communication system according to one embodiment of the present invention; and



FIG. 2 illustrates some of the components of the control unit of the communication system of FIG. 1.





DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT

A communication system 10 is shown in FIG. 1 as implemented in a vehicle 8. The system 10 includes a device control unit 11 which is preferably mounted in a discreet location within the vehicle 8, such as under the dashboard, in the glove compartment, etc. The control unit 11 supports wireless communication via Bluetooth (IEEE 802.15.1) or any other wireless standard to communicate wirelessly with a cell phone, PDA, or other mobile device 12. All data 13 is encrypted prior to transmission. The audio output of the control unit 11 is transmitted either wirelessly 14 or through a direct, wired connection 15 to the vehicle's car stereo 16. The audio input for the control unit 11 is obtained either through a directly connected microphone 17, through an existing vehicle hands-free system, or wirelessly through a headset 18 connected to the mobile device 12.


The control unit 11 connects to the vehicle's battery 19 for power. An AC adapter is available for use at home or in the office. For portable use in other vehicles, an optional “Y” or pass-through cable is available to plug into a cigarette lighter accessory socket for power.


The control unit 11 contains a recessed button 20 which enables the driver to do the following: register new or replacement remotes; pair the device with a new mobile device 12; and clear all preferences and reset the device to its factory default settings. The control unit 11 also has a set of four status lights 21 which display the following information: power and system health, vehicle connection status and activity, mobile device connection status and activity, and information access and general status.


In one example, the control unit 11 and the mobile device 12 recognize when the user, and the user's associated mobile device 12, are near to, or have entered the vehicle. This may be accomplished, for example, by Bluetooth pairing of the device and the vehicle, or similar wireless communication initiation protocols. Within this range, the handheld device 12 changes from its normal, self-contained operating mode, to an immersive communication mode, where it is operated through the control unit 11. As will be described in more detail below, among other things, this mode enables the user to hear their emails played through the vehicle's sound system 16, or, alternatively, and if so equipped, played through the sound system of the mobile device 12, e.g., headphones 18. Microphones 17 in the vehicle 8 or on the mobile device 12 detect user-generated voice commands. Thus, the user is not required to change modes on the mobile device 12; instead, the control unit 11 and associated mobile device 12, recognize that the user is proximate the vehicle 8 and adjust the mode accordingly.
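The proximity-triggered mode change described above can be modeled as a simple state switch. The following is an illustrative sketch only: the `paired` and `in_range` inputs stand in for whatever pairing and radio-range signals a real Bluetooth stack provides, and the mode names are hypothetical.

```python
from dataclasses import dataclass

NORMAL = "normal"          # self-contained handset operation
IMMERSIVE = "immersive"    # operated through the in-vehicle control unit

@dataclass
class MobileDevice:
    mode: str = NORMAL

def update_mode(device: MobileDevice, paired: bool, in_range: bool) -> str:
    """Switch the handset mode based on proximity to the vehicle.

    `paired` and `in_range` are assumed inputs representing Bluetooth
    pairing state and radio range; the real signals are device-specific.
    """
    device.mode = IMMERSIVE if (paired and in_range) else NORMAL
    return device.mode
```

The point of the sketch is that no explicit user action changes the mode; only the pairing and range signals do.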


In addition to adjusting the mode based on vehicle proximity, the system 10 may adjust between a public and a private mode. For instance, as explained above, the system's immersive communication mode ordinarily occurs when the user is proximate the vehicle 8. The immersive communication mode may have a public setting and a private setting. The private setting plays the emails over headphones 18 associated with the mobile device 12. Such a setting prevents a user from disturbing other occupants of the vehicle 8. The public setting plays the emails over the vehicle sound system 16, and is ordinarily used when the user is the only occupant in the vehicle 8.


Of course, such system settings may be adjusted by the user and their particular preferences in their user profile. For example, the user may prefer to switch to the immersive communication mode when the mobile device 12 and user are within a certain distance from the vehicle 8, whereas another user may switch modes only when the mobile device 12 and user have entered the vehicle 8. Further, the user may want to operate the control unit 11 and associated device 12 in a public mode, even if other occupants are in the vehicle 8.
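As a sketch of how such preferences might drive audio routing, the following assumes hypothetical profile fields (`private`, `force_public`) that are not named in the description above:

```python
def choose_audio_route(profile: dict, occupants: int) -> str:
    """Pick the audio output for read-aloud email.

    The `private` and `force_public` profile fields are illustrative
    assumptions standing in for the user's stored preferences.
    """
    if profile.get("force_public"):
        return "vehicle_stereo"    # user accepts playing aloud with passengers
    if profile.get("private") or occupants > 1:
        return "headphones"        # avoid disturbing other occupants
    return "vehicle_stereo"
```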


Similarly, the system 10 recognizes when the user leaves the vehicle 8 and the mobile device 12 reverts to a self-contained (normal) mode. The mobile device 12 may also record the vehicle's location when the user leaves the vehicle 8 (based upon GPS or other information). Accordingly, the user can recall the vehicle position at a later time, either on the device or elsewhere on the system, which may aid the user in locating the vehicle 8.


The device has multiple USB ports 22. There are standard USB ports which serve the following functions: to enable the driver to store preferences, settings, and off-line memos and transcriptions on a standard USB flash drive; to permit future expansion, upgrades, and add-on features; and to connect an Ethernet dongle for high-speed internet access. In addition, the control unit 11 has a dual-purpose USB 2.0 port which in addition to the features mentioned above, provides USB 2.0 “on-the-go” functionality by directly connecting to the USB port of a notebook computer with a standard cable (i.e. just like connecting a portable camera or GPS unit directly to a computer).


Other ports on the control unit 11 include an ⅛″ audio jack 23 to connect to a car stereo without Bluetooth support, a ⅛″ microphone jack 24 to support external high-quality microphones for hands-free calling, and a ⅛″ stereo headset jack 25 for use away from the vehicle or in a vehicle without Bluetooth support.


The system 10 also includes an optional remote control 26 to interact with the control unit 11. The remote control contains lithium batteries, similar to those of a common vehicle's remote keyless entry fob.


In order to provide security and privacy, the device uses both authentication and encryption. Voice-based biometrics may also be used to further enhance security.


The driver stores his or her settings for the device in their settings profile 30. This profile 30 may be stored in a database on an Internet server 27. The control unit 11 utilizes the internet access provided by the driver's mobile device 12 to download the driver's profile 30 via the Internet. The control unit 11 also uses the pairing information from the mobile device 12 to retrieve the correct profile 30 from the server 27. If the profile 30 has already been downloaded to the control unit 11, the control unit 11 may just check for changes and updates on the server 27. Each profile 30 on the server 27 contains a set of rules that the control unit 11 uses to make decisions on content delivery to the driver. The driver can access and modify their profile 30 on the Internet server 27 through either the Internet using a web-based interface 28, or through a simple interface directly accessible from the associated mobile device 12. Alternatively, the profile 30 is always stored and modified on the control unit 11 only and can be accessed via the mobile device 12 and/or via a USB connection to a laptop or desktop computer.
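A minimal sketch of the check-for-changes behavior described above, assuming each profile carries a version number (an illustrative detail not specified in the description):

```python
def sync_profile(cache: dict, device_id: str, server: dict) -> dict:
    """Return the profile for a paired device, downloading from the
    server only when the cached copy is missing or out of date.

    `cache` models profiles already on the control unit; `server`
    models the Internet server's profile database.
    """
    remote = server[device_id]
    local = cache.get(device_id)
    if local is None or local["version"] < remote["version"]:
        cache[device_id] = dict(remote)   # download the full profile
    return cache[device_id]
```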


As shown in FIG. 2, the control unit 11 includes a text processing module 34, a vehicle communication module 36, a speech recognition module 38, Bluetooth (or other wireless communication) modules 40, a mobile device communication module 42, a text-to-speech module 44, a user interface module 46, and a remote device behavior controller 48. The control unit 11 has an email processing agent 50 that processes email messages and determines the identity of the sender, whether the message has an attachment, and if so what type of attachment, and then extracts the body-text of the message. The control unit 11 also determines if a message is a reminder, news, or just a regular email message. The control unit 11 uses a data mining algorithm to determine if any parts of the email should be excluded (e.g. a lengthy signature).
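The email processing agent's steps might look like the following sketch. The `--` signature delimiter and the keyword-based classification are simplifying assumptions; the patent does not specify the data mining algorithm.

```python
import re

def process_email(raw: dict) -> dict:
    """Determine sender and attachment info, classify the message, and
    strip a lengthy signature before the body is read out loud.

    The field names and the '--' signature heuristic are illustrative.
    """
    body = re.split(r"\n-- ?\n", raw["body"])[0].strip()
    subject = raw.get("subject", "").lower()
    if "reminder" in subject:
        kind = "reminder"
    elif raw.get("from", "").endswith("news.example.com"):
        kind = "news"
    else:
        kind = "email"
    return {
        "sender": raw.get("from", ""),
        "has_attachment": bool(raw.get("attachments")),
        "kind": kind,
        "body": body,
    }
```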


Hands-Free Email


One feature of the system is hands-free email. Using the text-to-speech module 44, the control unit 11 can read email to the driver. When new email arrives, the control unit 11 uses the profile 30 to guide an intelligent filtering and prioritization system which enables the driver to do the following: ensure that emails are filtered and read in order of priority, limit the frequency of new email interruptions, send automatic replies without driver intervention, and forward certain emails to a third party without interruption. In addition, prior to being read out loud, the control unit 11 processes emails to optimize clarity. Part of that process involves detecting acronyms, symbols, and other more complex structures and ensuring that they can be easily understood when read. The control unit 11 provides intelligent email summarization in order to reduce the time required to hear the important content of email when read out loud.
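As an illustration of profile-guided filtering and prioritization, the sketch below assumes a rule format (a sender-to-priority map, with 0 meaning "suppress") that is not specified in the description:

```python
def triage(messages: list, rules: dict) -> list:
    """Filter and order messages so the highest-priority email is read
    first; a rule priority of 0 suppresses a message entirely."""
    indexed = [(rules.get(m["from"], 1), i, m) for i, m in enumerate(messages)]
    kept = [(p, i, m) for p, i, m in indexed if p > 0]
    kept.sort(key=lambda t: (-t[0], t[1]))   # priority desc, then arrival order
    return [m for _, _, m in kept]
```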


The driver can interact with the control unit 11 using voice commands, including “go back” and “go forward,” to which the control unit 11 responds by going back to the previous phrase or sentence or the next phrase or sentence in the email respectively. In addition, speaking “go back, go back” would back up two phrases or sentences.
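The back/forward navigation above could be modeled as a cursor over the email's sentences, as in this sketch (the comma-separated command format mirrors the "go back, go back" example):

```python
class EmailReader:
    """Cursor over an email's sentences for 'go back' / 'go forward'."""

    def __init__(self, sentences: list):
        self.sentences = sentences
        self.index = 0   # currently spoken sentence

    def command(self, spoken: str) -> str:
        # "go back, go back" moves back two sentences, and so on.
        for part in spoken.split(","):
            part = part.strip().lower()
            if part == "go back":
                self.index = max(0, self.index - 1)
            elif part == "go forward":
                self.index = min(len(self.sentences) - 1, self.index + 1)
        return self.sentences[self.index]
```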


Additional hands-free email features include a time-saving filtering system which allows the driver to hear only the most important content or meaning of an email. Another email-related feature is the ability to download custom email parsers to add a new dimension to audible email, and to parse informal email styles (i.e. l8r, ttyl).


The hands-free email functionality includes content-rich notification. When providing notification of a new email, the control unit 11 provides a quick summary about the incoming email, enabling the driver to prioritize which messages are more important. Examples include “You have mail from Sally” (similar to a caller-ID for email), or “You have an important meeting request from Cathy.” The control unit 11 looks up the known contact names based upon the sender's email address in the user's address book on the mobile device 12. The control unit 11 uses known contact names to identify the parties of an email instead of just reading the cryptic email addresses out loud.
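A sketch of the content-rich notification, using an address-book lookup to replace raw addresses with known contact names (the dictionary field names are illustrative):

```python
def announce(email: dict, address_book: dict) -> str:
    """Compose the spoken new-mail summary, preferring a contact name
    from the mobile device's address book over the bare address."""
    name = address_book.get(email["from"], email["from"])
    if email.get("meeting_request"):
        return f"You have an important meeting request from {name}"
    return f"You have mail from {name}"
```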


In addition to reading email, the control unit 11 also enables the driver to compose responses. The driver can send a reply using existing text or voice templates (i.e. “I'm in the car call me at ‘number,’” or “I'm in the car, I will reply as soon as I can”). New emails can also be created and sent as a voice recording in the form of a .wav or .mp3 file. The driver is also provided the option of calling the sender of the email on the phone using existing contact information in the address book, or responding to meeting requests and calendar updates (i.e. Outlook). Emails can also be created as freeform text responses by dictating the contents of the email. The device then translates that into text form for email transmission. An intelligent assistant will be immediately available to suggest possible actions and to provide help as needed. Again, all of these options are prompted by verbal inquiries from the control unit 11 and can be selected by voice commands from the driver.


The control unit 11 supports multiple email accounts, and email can be composed from any existing account. Incoming email can also be intelligently handled and prioritized based upon account. Optional in-vehicle email addresses on a custom domain are available. Emails sent from this address would include a notification that the email was composed while in transit. When composing an email to an in-vehicle email address, the sender knows that the email will be read out loud in a vehicle. If the traditional email is “george@work.net,” then the in-vehicle address may be “george@driving.net.” Optional enhanced existing email addresses are also available on supported email systems. For example, if the traditional email is “george@work.com,” an enhanced in-vehicle address of “george+driving@work.com” may be selected.
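The two in-vehicle addressing schemes above reduce to a simple address transformation. This sketch uses the `driving.net` custom domain and `+driving` plus-addressing exactly as in the examples; whether a given email system supports plus-addressing is assumed known:

```python
def in_vehicle_address(email: str, plus_supported: bool) -> str:
    """Derive the in-vehicle address for a traditional email address."""
    local, domain = email.split("@")
    if plus_supported:                      # enhanced existing address
        return f"{local}+driving@{domain}"
    return f"{local}@driving.net"           # custom-domain address
```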


Enhanced Hands-Free Telephone Calls


Another feature of this invention is enhanced hands-free telephone calls. This includes transparent use of any existing hands-free system. All incoming telephone calls can use either the existing vehicle hands-free system or a user headset 18. If an expected important email arrives while the driver is on the phone, an “email-waiting” indicator (lights and/or subtle tones) will provide subtle notification without disrupting the conversation. A headset 18 can be activated at any time for privacy or to optimize clarity. The control unit 11 will seamlessly switch from the vehicle hands-free system to the private headset 18 for privacy.


The control unit 11 also features enhanced caller-ID. The device announces incoming calls by reading the caller name or number out loud (e.g. “This is a call from John Doe, do you want to answer it?”). This eliminates the need to look away from the road to find out who is calling. Vehicle-aware screening can also automatically forward specific calls to voicemail or to another number when driving, again based upon the driver's profile. Normal forwarding rules will resume when leaving the vehicle.


The control unit 11 also provides voice activated answering and calling. When the control unit 11 announces a telephone call, the driver can accept the call using a voice command. The driver can use voice commands associated with either contacts in an address book or with spoken phone numbers to place outgoing telephone calls (i.e. “Call Krista”).
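Voice-activated calling might dispatch as in this sketch, where the speech recognizer is assumed to have already produced the text of the command, and unrecognized targets fall back to being treated as spoken digits:

```python
def handle_call_command(spoken: str, address_book: dict) -> str:
    """Map 'Call <name>' to a dial action, falling back to treating
    the remainder of the command as a spoken phone number."""
    text = spoken.strip()
    if text.lower().startswith("call "):
        target = text[5:].strip()
        number = address_book.get(target, target)
        return f"dial:{number}"
    return "ignored"
```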


Unified Information Management


Another feature of the present invention is that it provides unified information management. The control unit 11 provides a consistent interface for seamless access to incoming and outgoing telephone calls, email, and other sources of information. The existing hands-free interface automatically switches between telephone calls, reading email, and providing important notifications. When entering the vehicle, the control unit 11 automatically provides an enhanced voice-based interface, and when leaving the vehicle, the mobile device 12 automatically resumes normal operation. Email reading can also be paused to accept an incoming phone call, and can be resumed when the call is complete.


In addition, the driver can communicate with any contact through email, a phone call, or an SMS text message simply by speaking. The control unit 11 provides enhanced information for incoming telephone calls. The name and number, if available, are read out loud to ensure that the driver knows the caller without looking away from the road. A nickname, or other information located in an address book, may also be used for notification.


The driver can also reply to an email with a phone call. While reading an email, the driver can contact the sender by placing a telephone call with address book information. When a phone call is made, but the line is busy or no voicemail exists, the user is given the option of sending an email to the same contact instead. This eliminates the need to wait and try calling the person again.


Within their profile 30, the driver can prioritize between email and phone calls, so that an important email will not be interrupted by a less important phone call. In addition, custom mp3 (or other format) ring tones can be associated with both incoming emails and telephone calls. Ring tones can be customized by email from certain contacts, phone calls from certain contacts, or email about certain subjects. Custom “call waiting” audible indicators can be used when an important email arrives while on the phone, or when an important phone call arrives while reading or composing an email.
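Custom ring tone selection could follow profile rules like these; the profile keys `sender_tones` and `subject_tones` are hypothetical names for the associations described above:

```python
def ring_tone(event: dict, profile: dict) -> str:
    """Pick a ring tone by sender first, then by subject keyword,
    else fall back to the default tone."""
    tone = profile.get("sender_tones", {}).get(event.get("from"))
    if tone:
        return tone
    for keyword, t in profile.get("subject_tones", {}).items():
        if keyword in event.get("subject", ""):
            return t
    return "default.mp3"
```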


Enhanced Hands-Free Calendar


Another feature of the present invention is the enhanced hands-free calendar wherein the control unit 11 utilizes the calendar functionality of the user's mobile device 12. The control unit 11 reads the subject and time of calendar reminders out loud, and the driver can access additional calendar information with voice commands if desired. The driver can also perform in-transit schedule management by reviewing scheduled appointments (including date, time, subject, location and notes); accepting, declining, or forwarding meeting requests from supported systems (e.g. Outlook); scheduling meetings; and automatically annotating meetings with location information. The driver can also store location-based reminders, which will provide reminders the next time the vehicle is present in a specified geographical area, and automatically receive information associated with nearby landmarks. In addition, the driver could plan and resolve meeting issues by communicating directly with other participants' location-aware devices.


Do Not Disturb


Another feature of the present invention is the “do not disturb” functionality. When passengers are present in the vehicle, the control unit 11 can be temporarily silenced. Even when silent, the control unit 11 will continue to intelligently handle incoming email: forwarding email, providing automatic replies, and processing email as desired. A mute feature is also available.


Integrated Voice Memo Pad


Another feature of the present invention is the integrated voice memo pad, which enables the driver to record thoughts and important ideas while driving so they will not be forgotten while parking or searching for a memo pad or device. Memos can be transferred via email to the driver's inbox, or to any of the driver's contacts. Memos can also be wirelessly transferred to a computer desktop via the Bluetooth interface as the user arrives in the office, or transferred to a removable USB flash memory drive. Memos can also be annotated automatically using advanced context information including location, weather, and trip information. For example, “this memo was recorded at night in a traffic jam on the highway, halfway between the office and the manufacturing facility.” Such augmented information can provide valuable cues when reviewing memos.


Access to Diverse Information


Another feature of the present invention is the ability to access diverse information. Information is available in audible form (text-to-speech) from a wide range of sources. First, the control unit 11 provides access to personal connectivity and time management information. This includes email (new and previously read), incoming caller name and number, SMS messages, MMS messages, telephone call logs, address book, calendar and schedule, and instant messages.


Second, the control unit 11 provides multi-format support. This includes email attachments that can be read out loud, including plain text, audio attachments (i.e. .wav, .mp3), HTML (i.e. encoded emails and web sites), plain text portions of Word and PowerPoint files, Adobe Portable Document format (PDF), OpenDocument formats, and compressed and/or encoded attachments of the above formats (i.e. .zip).


Third, the control unit 11 provides remote access to information. This includes existing news sources (i.e. existing RSS feeds) and supported websites. This also includes subscription to value-added services including: weather, custom alerts (i.e. stock price triggers), traffic conditions, personalized news, e-books (not limited to audio books, but any e-book), personalized audio feeds, and personalized image or video feeds for passengers.


Fourth, the device provides environment and location awareness. This includes current location and navigation information, local weather conditions, vehicle status, and relevant location-specific information (i.e. where is “work”? where is “home”?).


Personalization


Another feature in the present invention is extensive personalization and customization for email handling, email notification, time-sensitive rules, vehicle-aware actions, text-to-speech preferences, and multiple user support.


The email handling settings in the user's profile 30 allow the driver to use the control unit's 11 built-in intelligent email parsing and processing. This enables the driver to avoid receiving notification for every trivial incoming email. Some of the intelligent parsing features include automatic replies, forwarding and prioritization based on content and sender, and substitution of difficult phrases (i.e. email addresses and web site URLs) with simple names and words. The driver can also choose to hear only select information when a new email arrives (i.e. just the sender name, or the sender and subject, or a quick summary). Email “ring tones” are also available for incoming emails based on sender or specific keywords. Prepared text or voice replies can be used to send frequently used responses (i.e. “I'm in transit right now”). Some prepared quick-responses may be used to automatically forward an email to a pre-selected recipient such as an administrative assistant. The driver can also set up both email address configuration and multiple email address rules (i.e. use “me@work.com” when replying to emails sent to “me@work.com,” but use “me@mobile.com” when composing new emails).
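The multiple-address rule in the last example reduces to a reply-address selection like this sketch (the `addresses` and `compose_address` profile fields are hypothetical names):

```python
def from_address(profile: dict, reply_to=None) -> str:
    """Choose the From: address per the profile's address rules: reply
    from the address the original email was sent to if it is one of
    the driver's addresses, otherwise use the composing default."""
    if reply_to and reply_to in profile.get("addresses", []):
        return reply_to
    return profile["compose_address"]
```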


The driver can also customize notification. This includes prioritizing emails and phone calls based on caller or sender and subject (i.e. never read emails from Ben out loud, or if an email arrives from George, it should be read before others). The driver can also limit the amount of notifications received (i.e. set minimum time between notifications, or maximum number of emails read in a short period of time).
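The notification limits above could be enforced as in this sketch; times are in seconds and the specific parameter values in the test are illustrative:

```python
def allow_notification(now: float, history: list,
                       min_gap: float, max_recent: int, window: float) -> bool:
    """Apply the profile's minimum time between notifications and its
    cap on notifications within a recent window; record allowed ones
    by appending to `history`."""
    recent = [t for t in history if now - t < window]
    if recent and now - recent[-1] < min_gap:
        return False                     # too soon after the last one
    if len(recent) >= max_recent:
        return False                     # too many in the window
    history.append(now)
    return True
```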


Time-sensitive rules in the profile 30 may include options such as “don't bother me in the morning,” or “only notify me about incoming email between these hours.” The driver can also configure audible reminder types based on calendar and scheduling items from the mobile device. Vehicle-aware actions are configurable based on the presence of the user in the vehicle. These actions include the content of automatic replies and predefined destinations and rules to automatically forward specific emails to an administrative assistant or other individual. These also include actions to take when multiple Bluetooth enabled mobile devices are present (i.e. switch to silent “do not disturb” mode, or take no action).
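A time-sensitive rule such as "only notify me about incoming email between these hours" reduces to a window check, sketched here with support for windows that wrap past midnight:

```python
def notify_in_hours(hour: int, allowed: tuple) -> bool:
    """True if `hour` (0-23) falls in the allowed [start, end) window;
    a window like (22, 6) wraps past midnight."""
    start, end = allowed
    if start <= end:
        return start <= hour < end
    return hour >= start or hour < end
```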


The text-to-speech settings for the device are also configurable. This includes speech characteristics such as speed, voice, and volume. The voice may be set to male or female, and may be set to speak a number of languages, including but not limited to US English, UK English, French, Spanish, German, Italian, Dutch, and Portuguese. A base set of languages will be provided with the device, with alternate languages being available in the future. The driver can set personal preferences for pronunciation of specific words, such as difficult contact names, and specialized acronyms or symbols, such as “H2O.” By default, most acronyms are spelled out letter by letter (i.e. IMS, USB).
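The default spell-out behavior and the per-word pronunciation preferences might combine as in this sketch (whitespace tokenization is a simplification):

```python
def speakable(text: str, preferences: dict) -> str:
    """Apply the driver's pronunciation preferences; by default,
    multi-letter acronyms are spelled out letter by letter."""
    out = []
    for word in text.split():
        if word in preferences:
            out.append(preferences[word])      # e.g. "H2O" -> "water"
        elif word.isupper() and len(word) > 1:
            out.append(" ".join(word))         # e.g. "USB" -> "U S B"
        else:
            out.append(word)
    return " ".join(out)
```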


Information about specific words or phrases can be used to enhance both speech recognition performance and text-to-speech performance, and this includes context sensitive shortcuts. For example, nicknames should be expanded into an email address if the driver is dictating an email. In addition, email addresses should be expanded to a common name when found. The driver can also set custom voice prompts or greetings.


The device also features multiple user support, wherein multiple people can share the same device. The device automatically identifies each person by their mobile device 12, and maintains individual profiles 30 for each driver.


Connectivity


The connectivity functionality of the control unit 11 enables it to function as a hands-free audio system. It interacts with supported Bluetooth hands-free devices, including but not limited to Bluetooth enabled vehicles (HSP, HFP, and A2DP), after-market hands-free vehicle products, and supported headsets to provide privacy. For vehicles not containing Bluetooth or other wireless support, the control unit 11 can connect directly to the vehicle's audio system 16 through a wired connection. Retrofit solutions will also be available for existing vehicles lacking wireless connectivity in the form of an optional after-market Bluetooth kit.


The system 10 may include a remote control 26 for accessing the control unit 11. Emergency response support is available for direct assistance in emergencies, providing GPS location information if available. The driver could also use the control unit 11 through an advanced wireless audio/visual system, including such features as streaming music and providing image content (i.e. PowerPoint, images attached in emails, slideshows). Integrated steering-wheel column buttons are also an available option.


The control unit 11 can also connect to a computer and external devices. This includes personal computers with Bluetooth to conveniently exchange information over a personal area network (PAN). This also includes GPS devices (with Bluetooth or other wireless or wired connectivity) for location awareness. This also includes storage devices (Bluetooth or other wireless or wired) for personal e-book libraries, or to manage offline content with the unified hands-free interface. An optional cable will be available for controlling an iPod or other music player with voice commands. Through the device's USB ports, the driver can expand the functionality of the device by attaching such items as a USB GPRS/EDGE/3G device for direct mobile access without a separate mobile device, or a USB WiFi adapter for high-speed Internet access.


Upgradeability and Expansion


The driver may add future enhancements to the control unit 11 wirelessly using standard Bluetooth enabled devices. This includes support for wireless transfer with a desktop or notebook computer to transfer and synchronize information. Advanced Bluetooth profile support (i.e. A2DP) for stereo and high quality audio is also available.


As mentioned previously, the control unit 11 will contain two USB ports. The standard USB port or ports will provide convenient access to standard USB devices for storing preferences on a standard USB flash drive; storing and moving off-line memos and transcriptions recorded by the device; and future expansion, upgrades, and add-on features. The dual-purpose USB 2.0 “On-The-Go” port or ports will provide both the aforementioned features to access USB devices, and also direct connections to a computer with a standard cable (i.e. just like connecting a digital camera or GPS unit directly to a computer).


In accordance with the provisions of the patent statutes and jurisprudence, exemplary configurations described above are considered to represent a preferred embodiment of the invention. However, it should be noted that the invention can be practiced otherwise than as specifically illustrated and described without departing from its spirit or scope.

Claims
  • 1. A method for providing communication in a vehicle including the steps of: a) determining that a mobile device is in a vehicle via a wireless connection to the mobile device, wherein the mobile device is a cell phone; b) identifying the mobile device via the wireless connection to the mobile device; c) identifying a user profile associated with the mobile device based upon the identification of the mobile device in said step b); and d) processing a non-voice message received by the mobile device based upon said step c), including the step of redirecting the message intended for the user to a control unit in the vehicle over a wireless communication link based upon the identification of the user in said step c).
  • 2. The method of claim 1 wherein said step d) includes the step of forwarding the message based upon the user profile.
  • 3. The method of claim 2 wherein the step of forwarding includes the step of forwarding the message to a different user.
  • 4. The method of claim 2 further including the step of processing a telephone call to the mobile device based upon the user profile.
  • 5. The method of claim 1 wherein the message includes at least one of an email message, an SMS or text message or a MMS message.
  • 6. The method of claim 1 wherein the mobile device is the user's mobile device, the method further including the step of: e) detecting the presence of a mobile device other than the user's mobile device; wherein said step d) includes processing the message based upon the presence of the mobile device other than the user's mobile device.
  • 7. The method of claim 6 wherein the user's mobile device is a first user's mobile device and wherein the mobile device other than the user's mobile device is a second user's mobile device, the method further including the steps of: f) identifying the second user's mobile device; g) identifying the second user associated with the second user's mobile device; and h) maintaining a first profile associated with the first user and a second profile associated with the second user.
  • 8. The method of claim 1 wherein said step c) includes the step of selecting the user profile associated with the mobile device from among a plurality of user profiles.
  • 9. The method of claim 1 wherein the message is an email message.
  • 10. The method of claim 1 wherein the communication is an email message.
  • 11. A method for providing communication in a vehicle including the steps of: a) determining that a mobile device is in a vehicle, wherein the mobile device is a cell phone; b) identifying the mobile device via a wireless connection to the mobile device; c) identifying a user profile associated with the mobile device in response to the identification of the mobile device in said step b); d) redirecting a first email message received on the mobile device to a control unit in the vehicle over a wireless connection based upon the determination in said step a) that the mobile device is in the vehicle and based upon the user profile identified in said step c); e) converting the first email message to speech and playing the speech audibly in the vehicle; f) receiving a second email message on the cell phone; and g) automatically forwarding the second email message based upon the user profile.
  • 12. A method for providing communication in a vehicle including the steps of: a) determining that a mobile device is in a vehicle, wherein the mobile device is a cell phone; b) identifying the mobile device via a wireless connection to the mobile device; c) identifying a user profile associated with the mobile device in response to the identification of the mobile device in said step b); d) processing an email message received on the mobile device based upon the user profile identified in said step c), wherein the email message in said step d) is one of a plurality of email messages, said step d) including the step of processing the plurality of email messages including filtering the plurality of email messages based upon the user profile identified in said step c), wherein said step d) further includes the step of redirecting the message intended for the user to a control unit in the vehicle over a wireless communication link based upon the identification of the user in said step c); and e) based upon the user profile and based upon the filtering in said step d), converting at least one of the plurality of email messages to speech and playing the speech audibly in the vehicle.
  • 13. The method of claim 12 wherein the step of processing the plurality of email messages includes prioritizing the plurality of email messages based upon the user profile identified in said step c), wherein the step of playing the speech is based upon the prioritization.
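The profile-driven flow recited in the claims above (identify the device, select its user profile, then filter, prioritize, and read out messages) can be sketched as follows. This is only an illustrative sketch, not the patented implementation: every name here (UserProfile, identify_profile, the profile fields, the device ID) is hypothetical, and the speak function merely stands in for the text-to-speech playback described in the claims.

```python
# Hypothetical sketch of the claimed method steps; all identifiers are
# illustrative and do not appear in the patent itself.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class UserProfile:
    user: str
    blocked_senders: set = field(default_factory=set)     # filtering rule
    priority_senders: list = field(default_factory=list)  # prioritization rule
    forward_to: Optional[str] = None                      # auto-forward target

def identify_profile(device_id: str, profiles: dict) -> UserProfile:
    # Steps a)-c): the device has been detected and identified over a
    # wireless (e.g. Bluetooth) link; select its profile from among many.
    return profiles[device_id]

def process_messages(messages: list, profile: UserProfile) -> list:
    # Step d): filter the incoming messages per the user profile...
    kept = [m for m in messages if m["from"] not in profile.blocked_senders]
    # ...and prioritize them (messages from priority senders are read first).
    kept.sort(key=lambda m: m["from"] not in profile.priority_senders)
    return kept

def speak(message: dict) -> str:
    # Step e): stand-in for text-to-speech playback in the vehicle cabin.
    return f'Message from {message["from"]}: {message["body"]}'

profiles = {"AA:BB:CC": UserProfile("alice",
                                    blocked_senders={"spam@example.com"},
                                    priority_senders=["boss@example.com"])}
inbox = [
    {"from": "spam@example.com", "body": "win now"},
    {"from": "friend@example.com", "body": "lunch?"},
    {"from": "boss@example.com", "body": "call me"},
]
p = identify_profile("AA:BB:CC", profiles)
for m in process_messages(inbox, p):
    print(speak(m))
```

Run as written, the blocked sender is dropped and the priority sender's message is announced first; in the actual system the control unit would receive the redirected messages over the wireless link rather than from an in-memory list.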
Parent Case Info

This application is a continuation of U.S. patent application Ser. No. 11/452,117, filed Jun. 13, 2006, now U.S. Pat. No. 7,689,253, which claimed priority to U.S. Provisional Application Ser. Nos. 60/689,959, filed Jun. 13, 2005; 60/729,905, filed Oct. 25, 2005; 60/736,102, filed Nov. 10, 2005; 60/763,660, filed Jan. 31, 2006; 60/777,424, filed Feb. 28, 2006; and 60/803,329, filed May 26, 2006.

US Referenced Citations (232)
Number Name Date Kind
3928724 Andersen et al. Dec 1975 A
4083003 Haemmig et al. Apr 1978 A
4532052 Weaver et al. Jul 1985 A
4591823 Horvat May 1986 A
4989144 Barnett et al. Jan 1991 A
5177685 Davis et al. Jan 1993 A
5246073 Sandiford et al. Sep 1993 A
5488360 Ray Jan 1996 A
5638425 Meador et al. Jun 1997 A
5760742 Branch et al. Jun 1998 A
5836392 Urlwin-Smith Nov 1998 A
5912951 Checchio et al. Jun 1999 A
5931907 Davies et al. Aug 1999 A
5938706 Feldman et al. Aug 1999 A
5944783 Nieten Aug 1999 A
5963618 Porter Oct 1999 A
5983108 Kennedy et al. Nov 1999 A
6012030 French et al. Jan 2000 A
6041300 Ittycheriah et al. Mar 2000 A
6061718 Nelson May 2000 A
6088650 Schipper et al. Jul 2000 A
6147598 Murphy et al. Nov 2000 A
6176315 Reddy et al. Jan 2001 B1
6192364 Baclawski Feb 2001 B1
6192986 Urlwin-Smith Feb 2001 B1
6196317 Hardy Mar 2001 B1
6212474 Fowler et al. Apr 2001 B1
6253122 Razavi et al. Jun 2001 B1
6282495 Kirkhart et al. Aug 2001 B1
6295449 Westerlage et al. Sep 2001 B1
6356869 Chapados et al. Mar 2002 B1
6362748 Huang Mar 2002 B1
6367022 Gillespie et al. Apr 2002 B1
6377825 Kennedy et al. Apr 2002 B1
6385202 Katseff et al. May 2002 B1
6496107 Himmelstein Dec 2002 B1
6529863 Ball et al. Mar 2003 B1
6553102 Fogg et al. Apr 2003 B1
6574531 Tan et al. Jun 2003 B2
6580973 Leivian et al. Jun 2003 B2
6594557 Stefan et al. Jul 2003 B1
6607035 Reddy et al. Aug 2003 B1
6615130 Myr Sep 2003 B2
6622083 Knockeart et al. Sep 2003 B1
6650997 Funk Nov 2003 B2
6690268 Schofield et al. Feb 2004 B2
6697638 Larsson et al. Feb 2004 B1
6714223 Asami et al. Mar 2004 B2
6721633 Funk et al. Apr 2004 B2
6724863 Bedingfield Apr 2004 B1
6731239 Wall et al. May 2004 B2
6738742 Smith et al. May 2004 B2
6748211 Isaac et al. Jun 2004 B1
6764981 Eoff et al. Jul 2004 B1
6788935 McKenna et al. Sep 2004 B1
6788949 Bansal Sep 2004 B1
6812888 Drury et al. Nov 2004 B2
6812942 Ribak Nov 2004 B2
6839669 Gould et al. Jan 2005 B1
6842510 Sakamoto Jan 2005 B2
6895084 Saylor et al. May 2005 B1
6895257 Boman et al. May 2005 B2
6895310 Kolls May 2005 B1
6909947 Douros et al. Jun 2005 B2
6925154 Gao et al. Aug 2005 B2
6944679 Parupudi et al. Sep 2005 B2
6968272 Knockeart et al. Nov 2005 B2
6970703 Fuchs et al. Nov 2005 B2
6970783 Knockeart et al. Nov 2005 B2
6972669 Saito et al. Dec 2005 B2
6982635 Obradovich Jan 2006 B2
7039166 Peterson et al. May 2006 B1
7049982 Sleboda et al. May 2006 B2
7050834 Harwood et al. May 2006 B2
7062286 Grivas et al. Jun 2006 B2
7069118 Coletrane et al. Jun 2006 B2
7085629 Gotou et al. Aug 2006 B1
7091160 Dao et al. Aug 2006 B2
7113911 Hinde et al. Sep 2006 B2
7117075 Larschan et al. Oct 2006 B1
7127271 Fujisaki Oct 2006 B1
7151997 Uhlmann et al. Dec 2006 B2
7191040 Pajakowski et al. Mar 2007 B2
7191059 Asahara Mar 2007 B2
7212814 Wilson et al. May 2007 B2
7228224 Rosen et al. Jun 2007 B1
7257426 Witkowski et al. Aug 2007 B1
7280975 Donner Oct 2007 B1
7286825 Shishido et al. Oct 2007 B2
7286857 Walker et al. Oct 2007 B1
7289796 Kudoh Oct 2007 B2
7296066 Lehaff et al. Nov 2007 B2
7346374 Witkowski et al. Mar 2008 B2
7356474 Kumhyr Apr 2008 B2
7363229 Falcon et al. Apr 2008 B2
7366795 O'Neil et al. Apr 2008 B2
7386376 Basir et al. Jun 2008 B2
7400879 Lehaff et al. Jul 2008 B2
7412078 Kim Aug 2008 B2
7412328 Uhlmann et al. Aug 2008 B2
7426647 Fleck et al. Sep 2008 B2
7444286 Roth et al. Oct 2008 B2
7461344 Young et al. Dec 2008 B2
7496514 Ross et al. Feb 2009 B2
7505951 Thompson et al. Mar 2009 B2
7526431 Roth et al. Apr 2009 B2
7554435 Tengler et al. Jun 2009 B2
7567542 Rybak et al. Jul 2009 B2
7643619 Jung Jan 2010 B2
7646296 Ohki Jan 2010 B2
7653545 Starkie Jan 2010 B1
7689253 Basir Mar 2010 B2
7693720 Kennewick et al. Apr 2010 B2
7769364 Logan et al. Aug 2010 B2
7787907 Zeinstra et al. Aug 2010 B2
7801283 Harwood et al. Sep 2010 B2
7814353 Naitou et al. Oct 2010 B2
7859392 McClellan et al. Dec 2010 B2
7865309 Taylor Jan 2011 B2
7881864 Smith Feb 2011 B2
7912186 Howell et al. Mar 2011 B2
7948969 Boys May 2011 B2
7983811 Basir et al. Jul 2011 B2
8015010 Basir Sep 2011 B2
8060285 Chigusa Nov 2011 B2
8090848 Maes et al. Jan 2012 B2
8195467 Mozer et al. Jun 2012 B2
8218737 Odinak Jul 2012 B2
8289186 Osafune Oct 2012 B2
8350721 Carr Jan 2013 B2
20010005854 Murata et al. Jun 2001 A1
20010021640 Lappe Sep 2001 A1
20010056345 Guedalia Dec 2001 A1
20020032042 Poplawsky et al. Mar 2002 A1
20020041659 Beswick Apr 2002 A1
20020090930 Fujiwara et al. Jul 2002 A1
20020137505 Eiche et al. Sep 2002 A1
20030114202 Suh et al. Jun 2003 A1
20030181543 Reddy et al. Sep 2003 A1
20030212745 Caughey Nov 2003 A1
20030227390 Hung et al. Dec 2003 A1
20030231550 Macfarlane Dec 2003 A1
20040001575 Tang Jan 2004 A1
20040058647 Zhang et al. Mar 2004 A1
20040082340 Eisinger Apr 2004 A1
20040090308 Takahashi et al. May 2004 A1
20040090950 Lauber et al. May 2004 A1
20040102188 Boyer et al. May 2004 A1
20040104842 Drury et al. Jun 2004 A1
20040116106 Shishido et al. Jun 2004 A1
20040119628 Kumazaki et al. Jun 2004 A1
20040133345 Asahara Jul 2004 A1
20040145457 Schofield et al. Jul 2004 A1
20040158367 Basu et al. Aug 2004 A1
20040182576 Reddy et al. Sep 2004 A1
20040185915 Ihara et al. Sep 2004 A1
20040193420 Kennewick et al. Sep 2004 A1
20040204161 Yamato et al. Oct 2004 A1
20040243406 Rinscheid Dec 2004 A1
20040257210 Chen et al. Dec 2004 A1
20050046584 Breed Mar 2005 A1
20050049781 Oesterling Mar 2005 A1
20050054386 Chung Mar 2005 A1
20050066207 Fleck et al. Mar 2005 A1
20050088320 Kovach Apr 2005 A1
20050107132 Kamdar et al. May 2005 A1
20050107944 Hovestadt et al. May 2005 A1
20050130631 Maguire et al. Jun 2005 A1
20050131677 Assadollahi Jun 2005 A1
20050135573 Harwood et al. Jun 2005 A1
20050143134 Harwood et al. Jun 2005 A1
20050174217 Basir et al. Aug 2005 A1
20050230434 Campbell et al. Oct 2005 A1
20050285743 Weber Dec 2005 A1
20050288190 Dao et al. Dec 2005 A1
20060009885 Raines Jan 2006 A1
20060030298 Burton et al. Feb 2006 A1
20060052921 Bodin et al. Mar 2006 A1
20060055565 Kawamata et al. Mar 2006 A1
20060089754 Mortenson Apr 2006 A1
20060101311 Lipscomb et al. May 2006 A1
20060135175 Lundstrom et al. Jun 2006 A1
20060214783 Ratnakar Sep 2006 A1
20060217858 Peng Sep 2006 A1
20060271275 Verma Nov 2006 A1
20070016813 Naitou et al. Jan 2007 A1
20070038360 Sakhpara Feb 2007 A1
20070042812 Basir Feb 2007 A1
20070043574 Coffman et al. Feb 2007 A1
20070050108 Larschan et al. Mar 2007 A1
20070061401 Bodin et al. Mar 2007 A1
20070073812 Yamaguchi Mar 2007 A1
20070106739 Clark et al. May 2007 A1
20070118380 Konig May 2007 A1
20070162552 Shaffer et al. Jul 2007 A1
20080004875 Chengalvarayan et al. Jan 2008 A1
20080027643 Basir et al. Jan 2008 A1
20080031433 Sapp et al. Feb 2008 A1
20080037762 Shaffer et al. Feb 2008 A1
20080071465 Chapman et al. Mar 2008 A1
20080119134 Rao May 2008 A1
20080132270 Basir Jun 2008 A1
20080133230 Herforth Jun 2008 A1
20080140408 Basir Jun 2008 A1
20080201135 Yano Aug 2008 A1
20080252487 McClellan et al. Oct 2008 A1
20080263451 Portele et al. Oct 2008 A1
20080270015 Ishikawa et al. Oct 2008 A1
20080306740 Schuck et al. Dec 2008 A1
20090011799 Douthitt et al. Jan 2009 A1
20090099836 Jacobsen et al. Apr 2009 A1
20090106036 Tamura et al. Apr 2009 A1
20090124272 White et al. May 2009 A1
20090161841 Odinak Jun 2009 A1
20090176522 Kowalewski et al. Jul 2009 A1
20090204410 Mozer et al. Aug 2009 A1
20090259349 Golenski Oct 2009 A1
20090298474 George Dec 2009 A1
20090318119 Basir et al. Dec 2009 A1
20100023246 Zhao et al. Jan 2010 A1
20100036595 Coy et al. Feb 2010 A1
20100097239 Campbell et al. Apr 2010 A1
20100100307 Kim Apr 2010 A1
20100130180 Lim May 2010 A1
20100137037 Basir Jun 2010 A1
20100138140 Okuyama Jun 2010 A1
20100159968 Ng Jun 2010 A1
20100198428 Sultan et al. Aug 2010 A1
20100211301 McClellan Aug 2010 A1
20100222939 Namburu et al. Sep 2010 A1
20100280884 Levine et al. Nov 2010 A1
20110302253 Simpson-Anderson et al. Dec 2011 A1
Foreign Referenced Citations (17)
Number Date Country
2405813 Nov 2001 CA
19920227 Nov 2000 DE
102007062958 Jun 2009 DE
901000 Mar 1999 EP
1463345 Sep 2004 EP
1575225 Sep 2005 EP
1568970 Dec 2006 EP
1701247 Jan 2009 EP
1840523 Mar 2011 EP
1986170 Apr 2011 EP
2329970 Nov 2001 GB
2366157 Feb 2002 GB
2001343979 Dec 2001 JP
2002171337 Jun 2002 JP
2002176486 Jun 2002 JP
2002237869 Aug 2002 JP
2004252563 Sep 2004 JP
Non-Patent Literature Citations (3)
Entry
Translated Office Action for Japanese Application No. 2008-516087, dated Sep. 29, 2011.
European Search Report for EP Application No. 06752782.0, dated Mar. 1, 2010.
International Search Report for PCT Application No. PCT/CA2006/000946, dated Nov. 8, 2006.
Related Publications (1)
Number Date Country
20100137037 A1 Jun 2010 US
Provisional Applications (6)
Number Date Country
60689959 Jun 2005 US
60729905 Oct 2005 US
60736102 Nov 2005 US
60763660 Jan 2006 US
60777424 Feb 2006 US
60803329 May 2006 US
Continuations (1)
Number Date Country
Parent 11452117 Jun 2006 US
Child 12701817 US