Mobile devices, including wireless devices such as cellular telephones, smartphones, laptop computers, notebook computers, and tablet devices (e.g., the iPad® by Apple®), are ubiquitous in modern society. Use of such mobile devices while operating a vehicle, however, can be hazardous. The problem is exacerbated for inexperienced operators of the vehicle, such as young people just learning how to drive. Rates of vehicular accidents involving mobile devices are rising, especially among teenagers. Text messaging while operating a moving vehicle can be dangerous and has been linked with causing accidents. More generally, operating any keyboard or other interactive device while operating a vehicle can be dangerous.
Thus, the widespread adoption of mobile devices and common use of the devices while driving have raised concerns about driver distraction. A driver speaking, text messaging, or using a software application on a mobile telephone may become mentally distracted from driving and lose control of the vehicle that he or she is driving. Thus, it is not uncommon to see an individual involved in an accident who was speaking or text messaging on a mobile device rather than paying attention to the road. Studies now suggest that individuals speaking on mobile telephones while driving a car may be as impaired as a person who drives while intoxicated. Not only is the driver mentally distracted, but the driver's eyes are diverted from the road, for example, to dial a number or to see who an incoming call is from.
It would be highly desirable to detect the presence of a mobile device, such as a wireless device, within a vehicle and control or inhibit the operation of the mobile device.
With the advancement of mobile technology, we have the capability to stay connected at all times. For many people, the urge to stay connected does not stop when they are behind the wheel. Driving while distracted by mobile technology endangers both the driver and the general public. The present disclosure seeks to discourage distracted driving by partially inhibiting a function of a mobile device that might otherwise be used in a moving vehicle and in proximity to the driver seat. Disclosed herein are details regarding technology that detects whether the mobile device is in the driver seat area.
Most location detection technology relies on two physical phenomena: time of arrival and received power. Time of arrival (TOA) is a location detection technique. If a distant transmitter emits a wave, and the receiver detects the wave at a later time, the distance between the transmitter and receiver is determined by the formula d=V*t, where V is the propagation velocity of the wave, and t is the time that the wave takes to arrive at the receiver. TOA detection has been used extensively with sound waves (such as in sonar) because the relatively slow speed of sound lends itself to high location detection accuracy. At normal temperature, pressure, and humidity, sound waves travel at 340 meters per second, or approximately 1 foot per millisecond. Many animals and modern instruments are capable of measuring TOA with sufficient accuracy for good location detection. For example, some dolphins and bats are known to use ultrasonic echoes to locate their prey. Additionally, submarines use sonar to detect enemy vessels. Further, backup sensors installed on vehicles use ultrasonic sonar to detect obstructions.
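The TOA relation above can be sketched in a few lines; this is a minimal illustration of d=V*t, with made-up timestamps and the assumption that transmitter and receiver clocks are synchronized.

```python
# Minimal sketch of the TOA distance relation d = V * t (illustrative values).

SPEED_OF_SOUND_M_S = 340.0  # speed of sound at normal temperature and pressure

def toa_distance(emission_time_s, arrival_time_s, velocity_m_s=SPEED_OF_SOUND_M_S):
    """Distance d = V * t, where t is the wave's travel time."""
    return velocity_m_s * (arrival_time_s - emission_time_s)

# A sound pulse emitted at t=0 and detected 10 ms later traveled about 3.4 m.
print(round(toa_distance(0.0, 0.010), 3))  # 3.4
```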
The use of TOA with electromagnetic waves has been limited due to the high speed of the electromagnetic wave. All electromagnetic waves travel at the speed of light, that is, 3×10^8 m/s, or approximately 1 foot per nanosecond. If sub-meter location accuracy is desired, then the synchronization between transmitter and receiver, and the measurement of TOA, must have sub-nanosecond accuracy. Electronic systems capable of measuring nanoseconds, i.e., operating at GHz frequencies, are often expensive. An interesting implementation of TOA with electromagnetic waves is the Global Positioning System (GPS). The GPS partially circumvents the nanosecond timing challenge by having multiple GPS satellites synchronized using atomic clocks, which then continuously send GPS signal packets containing time stamps from the satellites. The GPS receivers on the ground are thus relieved of the burden of high-accuracy synchronization, but still have to measure the relative delays between multiple GPS signals accurately. It is only within the recent decade that the cost of GPS receivers has come down dramatically, making GPS affordable to more consumers.
The power or signal strength of a wave weakens as the receiver moves further away from the transmitter. If the distance between the transmitter and receiver is R, then the power density sensed by the receiver is given by the equation below:

Su = Ps/(4πR²)

where Su is the received power density and Ps is the power from the transmitter.
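The inverse-square behavior of this relation can be checked numerically; the sketch below assumes an ideal isotropic transmitter radiating uniformly over a sphere, per the equation above.

```python
import math

# Free-space inverse-square relation: Su = Ps / (4 * pi * R^2).

def received_power_density(ps_watts, r_meters):
    """Power density (W/m^2) over a sphere of radius R around the transmitter."""
    return ps_watts / (4.0 * math.pi * r_meters ** 2)

# Doubling the distance quarters the received power density.
s1 = received_power_density(1.0, 1.0)
s2 = received_power_density(1.0, 2.0)
print(round(s1 / s2))  # 4
```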
Many modern technologies make use of this phenomenon to perform distance detection. Radar is one of the best-known examples: a radar transmitter sends an electromagnetic wave and measures the received power of the electromagnetic wave that reflects off an object in the distance. In consumer electronics, various location detection techniques have been developed using Received Signal Strength (RSS) measurements of wireless signals such as cellular, Wifi, and Bluetooth. For example, the Wifi positioning technology promoted by Google, Skyhook, and Navizon uses measured RSS to known Wifi access points to determine the location of mobile devices (Skyhook).
The received power approach to location detection may have limiting factors, which can include:
1) Signal noise: noise from various sources such as electronic (thermal, shot, flicker) can degrade the accuracy of the measured RSS;
2) Interference: reflection and refraction of the wave can lead to less accurate measurement. In addition, if more than one transmitter shares the same frequency spectrum, then the crowding effect further degrades RSS measurement; and
3) Obstruction: if there is any obstruction between the transmitter and receiver, then the received power is no longer solely dependent on the distance, but also the extent of the obstruction.
In one embodiment, a system comprising hardware and software uses the TOA of high frequency sound waves (such as, for example, 19 KHz) for driver seat location detection. In one embodiment, the present disclosure comprises software that functions as an application that can be installed on mobile devices, such as a smartphone, tablet, etc. Hardware is installed in the vehicle and consists of microphones, speakers, and an embedded processor. The present disclosure provides two methods of mobile device detection. In one embodiment, an active detection method, multiple microphones placed inside the vehicle are utilized to detect a high frequency sound signal emitted by a mobile device. In another embodiment, a passive detection method, an audio signal emitted by multiple speakers installed in the car is detected by the mobile device.
The novel features of the various embodiments are set forth with particularity in the appended claims. The various embodiments, however, both as to organization and methods of operation, together with the advantages thereof, may be understood by reference to the following description taken in conjunction with the accompanying drawings as follows.
Various embodiments are described to provide an overall understanding of the structure, function, manufacture, and use of the devices and methods disclosed herein. One or more examples of these embodiments are illustrated in the accompanying drawings. Those of ordinary skill in the art will understand that the devices and methods specifically described herein and illustrated in the accompanying drawings are non-limiting embodiments and that the scope of the various embodiments is defined solely by the claims. The features illustrated or described in connection with one embodiment may be combined, in whole or in part, with the features of other embodiments. Such modifications and variations are intended to be included within the scope of the claims.
The present disclosure describes embodiments of an apparatus, system, and method for detecting the presence of a mobile device, such as a wireless device, in a predetermined detection zone and controlling or inhibiting operation of the mobile device when it is detected in the predetermined detection zone. In particular, the present disclosure is directed to embodiments of an apparatus, system, and method for detecting the presence of a mobile device such as a wireless device in a predetermined detection zone within a vehicle and disabling some or all of the functions of the mobile device when it is detected in the predetermined detection zone. More particularly, the present disclosure is directed to automatically preventing a person in the driver's seat of a vehicle from text messaging and doing other similar excessively dangerous activities using a mobile device.
It is to be understood that this disclosure is not limited to particular aspects or embodiments described, as such may vary. It is also to be understood that the terminology used herein is for the purpose of describing particular aspects or embodiments only, and is not intended to be limiting, since the scope of the apparatus, system, and method for detecting the presence of a mobile device within a predetermined zone within a vehicle and controlling the operation of the mobile device when it is detected is defined only by the appended claims.
In various embodiments, a mobile device may be implemented as a handheld portable device, computer, mobile telephone (sometimes referred to as a smartphone), tablet personal computer (PC), laptop computer, or any combination thereof. Non-limiting examples of smartphones include, for example, Palm® products such as Palm® Treo® smartphones (now Hewlett Packard or HP), Blackberry® smartphones, the Apple® iPhone®, the Motorola Droid®, and the like. Tablet devices include the iPad® tablet computer by Apple® and, more generally, a class of lightweight portable computers known as Netbooks. In some embodiments, the mobile device may comprise, or be implemented as, any type of wireless device, mobile station, or portable computing device with a self-contained power source (e.g., battery) such as a laptop computer, ultra-laptop computer, personal digital assistant (PDA) with communications capabilities, cellular telephone, combination cellular telephone/PDA, mobile unit, subscriber station, user terminal, portable computer, handheld computer, palmtop computer, wearable computer, media player, pager, messaging device, data communication device, and so forth.
Accordingly, systems and methods of detecting the presence of the mobile device may vary based on the wireless technology communication standards used by the mobile device. Examples of wireless technology communication standards that may be used in the United States include Code Division Multiple Access (CDMA) systems, Global System for Mobile Communications (GSM) systems, North American Digital Cellular (NADC) systems, Time Division Multiple Access (TDMA) systems, Extended-TDMA (E-TDMA) systems, Narrowband Advanced Mobile Phone Service (NAMPS) systems, 3G systems such as Wide-band CDMA (WCDMA), 4G systems, CDMA-2000, Universal Mobile Telephone System (UMTS) systems, Integrated Digital Enhanced Network (iDEN) (a TDMA/GSM variant), and so forth. A mobile device may also utilize different types of shorter range wireless systems, such as a Bluetooth system operating in accordance with the Bluetooth Special Interest Group (SIG) series of protocols, including Bluetooth Specification versions v1.0, v1.1, v1.2, and v2.0 with Enhanced Data Rate (EDR), as well as one or more Bluetooth Profiles, and so forth. Other examples may include systems using infrared techniques or near-field communication techniques and protocols, such as electromagnetic induction (EMI) techniques. An example of EMI techniques may include passive or active radio-frequency identification (RFID) protocols and devices. These wireless communications standards are understood by one of ordinary skill in the art.
Once an appropriate command or control signal is detected, operation of the mobile device may be controlled in one or more ways. For example, in one embodiment, the mobile device is associated with a control module that disables or inhibits the operation of at least one function of the mobile device, and the mobile device is rendered either inoperable or operable only in a state of limited capacity. Accordingly, the control module may be able to either completely block the ability to receive or send a call on a mobile device, or sufficiently interfere with a function of the mobile device so as to make usage of the mobile device undesirable. In embodiments, the control module may disable the operation of certain components or functions of the mobile device. For example, a keyboard portion of a mobile device may be disabled to prevent the user from using a text messaging function or an email function of the mobile device. In another embodiment, the control module may direct the operation of the mobile device to a hands-free operation. In another embodiment, outgoing communication functions may be inhibited, but incoming communication functions may be uninhibited. In another embodiment, automatic replies may be initiated during a period in which a function of the mobile device is inhibited.
In embodiments, the control module may be independent of the mobile device and may communicate with the mobile device on a primary communication channel of the mobile device only or in addition to one or more secondary channels. Further, in certain embodiments, the control module may be activated only if other logical conditions are met such as the state of the ignition system, a state of a gear box, or other sensors. Accordingly, a triggering condition may be the activation of a switch, such as the ignition switch of a vehicle, or deactivation of a “park” sensor of an automatic transmission of the vehicle, among other sensors. In embodiments, the control module may allow emergency functions, such as 911 calls, when active.
In embodiments, a command or control signal may be localized to other areas within the vehicle so that operation of a mobile device in that area is disabled, but leaving other mobile devices outside of that area operational. In various embodiments, the power level of a command or control signal may be configured such that the command or control signal is delivered precisely to the predetermined detection zone. In one embodiment, this may be implemented with a directional antenna located within the vehicle where the signal is delivered to precisely the predetermined detection zone.
In embodiments described herein, a predetermined detection zone may be defined as a three-dimensional zone within or in proximity of a driver seat in a vehicle. A predetermined detection zone may be a zone within a vehicle, such as a passenger car; however, the predetermined detection zone need not be within a vehicle and may be any predetermined zone as appropriate. For instance, the predetermined detection zone may be an area within a room in a building.
In one embodiment of the present disclosure, which may be referred to as active detection, a method for determining a presence of a mobile device located in a predetermined detection zone comprises transmitting, by the mobile device, an acoustic signal; receiving, at each of a plurality of acoustic receivers, the acoustic signal transmitted from the mobile device; determining, by a processor, a location of the mobile device based on the received acoustic signal; determining whether the location of the mobile device matches the predetermined detection zone; and inhibiting at least one function of the mobile device upon determining that the location of the mobile device matches the predetermined detection zone. The method may further comprise monitoring a communication channel for a control or a command signal and inhibiting the at least one function of the mobile device upon reception of the control or command signal. According to one embodiment, the communication channel may be a Bluetooth channel or any other connection that is secondary to the primary cellular communication channel.
An embodiment of an active detection system for determining a presence of a mobile device located in a predetermined detection zone is shown in
Furthermore, in embodiments, the circuit 301 may comprise a control module associated with the mobile device 303, where the control module 301 is coupled to a non-transitory memory that stores executable instructions, wherein the control module 301 is operable to execute the instructions stored in the memory. The control module may be operable to execute the instructions to cause an acoustic signal to be transmitted from the mobile device 303 to a plurality of acoustic receivers 305, receive a command signal from a processor 307 configured to determine a location of the mobile device 303 based on the time of reception of the acoustic signal by the plurality of acoustic receivers 305 and determine whether the location of the mobile device 303 matches the predetermined detection zone, and inhibit at least one function of the mobile device 303 upon reception of the command signal. In one embodiment, the control module 301 may be located within the mobile device. In another embodiment, the circuit may be in communication with the mobile device through a communication network, such as a wireless communication network.
The control module 301 may be configured to inhibit the at least one function of the mobile device 303 upon the processor 307 determining that the location of the mobile device matches the predetermined detection zone. The control module 301 may also be configured to redirect at least one function of the mobile device 303 to a hands-free alternate system upon the processor 307 determining that the location of the mobile device 303 matches the predetermined detection zone.
In embodiments, the system 300 may use the Time of Arrival (TOA) of the acoustic signal for detection of the mobile device 303 and to determine whether the mobile device is in a driver side location of a vehicle. The acoustic signal may comprise at least one sonic pulse, which may be an ultrasonic pulse. In one embodiment, the at least one ultrasonic pulse is transmitted at a range of about 15 KHz to about 60 KHz. In another embodiment, the at least one ultrasonic pulse is transmitted at a range of about 10 KHz to about 21 KHz. In a further embodiment, the at least one ultrasonic pulse is transmitted at about 19 KHz. Using a narrow-bandwidth 19 KHz acoustic pulse or beep may allow for aggressive digital filtering to attenuate background noise. Furthermore, a narrow-bandwidth 19 KHz acoustic pulse or beep may improve localization sensitivity over a range of frequencies since a wider bandwidth may contain more noise in a pass band directed to such a range of frequencies. Additionally, using a narrow-bandwidth 19 KHz acoustic pulse or beep may allow for transmission at a lower acoustic volume.
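The narrow-band filtering described above can be illustrated with a single-bin detector. The sketch below uses the Goertzel algorithm (a cheap one-frequency DFT) as a stand-in for the "aggressive digital filtering" of a 19 KHz pulse; the 48 KHz sample rate and window length are illustrative assumptions, not parameters from the disclosure.

```python
import math

# Goertzel algorithm: measure energy at one frequency bin, here 19 KHz.

def goertzel_power(samples, sample_rate_hz, target_hz):
    """Return the squared magnitude of the DFT bin nearest target_hz."""
    n = len(samples)
    k = round(n * target_hz / sample_rate_hz)  # nearest integer bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

FS = 48000  # assumed sample rate (Hz)
N = 480     # 10 ms analysis window

tone = [math.sin(2 * math.pi * 19000 * i / FS) for i in range(N)]  # 19 KHz pulse
hum = [math.sin(2 * math.pi * 440 * i / FS) for i in range(N)]     # cabin noise

# The 19 KHz bin carries far more energy for the pulse than for low-frequency sound.
print(goertzel_power(tone, FS, 19000) > 100 * goertzel_power(hum, FS, 19000))  # True
```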
Once a determination is made by the processor 307 as to whether the mobile device 303 is within the predetermined detection zone, the processor 307 may cause a signal to be sent to the mobile device 303 for inhibiting a function of the mobile device 303. The signal may be received via an antenna 311 of the mobile device 303. The antenna 311 may be a component of the primary communication scheme of the mobile device 303 or a component of a secondary communication scheme of the mobile device, such as Bluetooth. Once an appropriate signal is received, operation of the mobile device may be controlled in one or more ways. For example, in one embodiment, the mobile device 303 is associated with control module 301 that disables or inhibits the operation of at least one function of the mobile device 303. Thus the mobile device 303 is rendered either inoperable or operable only in a state of limited capacity. Accordingly, the control module 301 may be able to either completely block the ability to receive or send a call on a mobile device 303, or sufficiently interfere with a function of the mobile device 303 so as to make usage of the mobile device 303 undesirable. In embodiments, the control module 301 may disable the operation of certain components or functions of the mobile device. For example, a keyboard portion of a mobile device 303 may be disabled to prevent the user from using a text messaging function or an email function of the mobile device. In another embodiment, the control module 301 may alter the operation of one or more functions of the mobile device, for example directing the operation of the mobile device 303 to a hands-free operation. In another embodiment, outgoing communication functions may be inhibited, but incoming communication functions may be uninhibited. In another embodiment, automatic replies may be initiated during a period in which a function of the mobile device 303 is inhibited.
In embodiments, the processor 307 may be coupled to a non-transitory memory that stores executable instructions, and the processor 307 may be operable to execute the instructions. The processor 307 may be operable to execute the instructions to receive a plurality of electrical signals from the plurality of acoustic receivers 305, where each electrical signal is based on an acoustic signal received by each of the plurality of acoustic receivers 305, to determine a location of the mobile device 303 based on the time of reception of the acoustic signal by the plurality of acoustic receivers 305, and to determine whether the location of the mobile device 303 matches the predetermined detection zone. In one embodiment, the processor 307 is operable to determine the location of the mobile device 303 based on a distance from the mobile device 303 to each of the plurality of acoustic receivers 305. Further, the processor 307 may be operable to determine the distance of the mobile device 303 to each of the plurality of acoustic receivers 305 based on a time difference in reception at each of the plurality of acoustic receivers 305 of the acoustic signal, where the acoustic signal is transmitted from the mobile device 303. Further, in embodiments, components or functions of the processor 307 may be part of or performed by the mobile device 303. Accordingly, the mobile device may receive a communication signal from the processor 307 that provides information regarding a time of reception of an acoustic signal at each of the plurality of acoustic receivers 305.
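Locating a device from reception-time differences, as described above, can be sketched with a brute-force search. The microphone positions, cabin dimensions, and grid-search solver below are all illustrative assumptions for demonstration, not the disclosed implementation; comparing time *differences* between microphones cancels the unknown emission time.

```python
import math

# Sketch: TOA-difference localization over an assumed 1.5 m x 2.0 m cabin.

SPEED_OF_SOUND = 340.0  # m/s
MICS = [(0.0, 0.0), (1.5, 0.0), (0.0, 2.0), (1.5, 2.0)]  # assumed corner mics (m)

def arrival_times(src):
    """Arrival time at each microphone for a pulse emitted at t=0 from src."""
    return [math.dist(src, m) / SPEED_OF_SOUND for m in MICS]

def locate(measured, step=0.05):
    """Grid-search for the point whose inter-mic time differences best match."""
    best, best_err = None, float("inf")
    for ix in range(int(1.5 / step) + 1):
        for iy in range(int(2.0 / step) + 1):
            p = (ix * step, iy * step)
            cand = arrival_times(p)
            err = sum(((c - cand[0]) - (m - measured[0])) ** 2
                      for c, m in zip(cand, measured))
            if err < best_err:
                best, best_err = p, err
    return best

# Recover an assumed driver-side position to within the grid resolution.
est = locate(arrival_times((0.4, 0.7)))
print(round(est[0], 2), round(est[1], 2))  # 0.4 0.7
```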
In embodiments where the processor is independent of the mobile device, the battery drain on the mobile device may be lower if signal processing is performed on dedicated hardware powered by a separate power source, such as a vehicle power source. The processor may also be operable to receive a Bluetooth signal transmitted by the mobile device and to transmit a signal to the mobile device. In one embodiment, a Bluetooth Serial Port Profile (SPP) may be used to provide a communication signal to the mobile device.
In one embodiment, the plurality of acoustic receivers comprises an array of microphones. The array 401 may be installed in multiple locations inside a cabin of a vehicle 400 as shown in
In one embodiment, an acoustic receiver, such as a microphone, may implement a high pass filter before an amplifier of the microphone so that most of the sound energy below the frequency of the acoustic signal, such as conversation, music, and road noise below 19 KHz, will be filtered out. The high pass filter may ensure that the microphone amplifier does not enter a saturation state when the area where the microphone is located, such as a vehicle cabin, is very noisy, because if the microphone amplifier enters a saturation state, the location of the mobile device may not be able to be detected reliably. Furthermore, background noise removal may be accomplished by first estimating an amount of background noise and then removing the background noise from the audio signal to prevent erroneous detection.
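The pre-amplifier filtering described above can be sketched with a first-order high-pass filter. The cutoff frequency, sample rate, and test tones below are illustrative assumptions; a real front end would likely use a sharper analog or multi-pole digital filter.

```python
import math

# First-order RC-style high-pass filter: keeps 19 KHz, rejects low-band noise.

FS = 48000.0  # assumed sample rate (Hz)

def high_pass(samples, cutoff_hz=15000.0, fs=FS):
    """y[n] = alpha * (y[n-1] + x[n] - x[n-1]); attenuates content below cutoff."""
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    dt = 1.0 / fs
    alpha = rc / (rc + dt)
    out, prev_x, prev_y = [], 0.0, 0.0
    for x in samples:
        y = alpha * (prev_y + x - prev_x)
        out.append(y)
        prev_x, prev_y = x, y
    return out

def energy(samples):
    return sum(x * x for x in samples) / len(samples)

low = [math.sin(2 * math.pi * 300 * i / FS) for i in range(4800)]     # road noise
high = [math.sin(2 * math.pi * 19000 * i / FS) for i in range(4800)]  # 19 KHz pulse

# The filter passes the 19 KHz pulse while strongly attenuating the 300 Hz tone.
print(energy(high_pass(high)) > 10 * energy(high_pass(low)))  # True
```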
Additionally, in embodiments, fade in and fade out may be applied at the beginning and the end of a transmission of an acoustic signal to minimize popping sounds caused by the instantaneous charging and discharging of the speaker coil when a high-volume sound is suddenly played on the speaker. In another embodiment, the system may adjust for temperature and humidity effects in the calculation of a physical distance of a mobile device based on the speed of sound, which changes with the humidity and temperature of the environment.
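Both refinements above are easy to sketch; the linear ramp length and the dry-air speed-of-sound approximation below are illustrative assumptions (humidity is ignored in this simplified formula).

```python
import math

# Fade-in/fade-out envelope plus a temperature-adjusted speed of sound.

def apply_fade(samples, fade_len):
    """Ramp amplitude up over the first fade_len samples and down over the last."""
    out = list(samples)
    n = len(out)
    for i in range(min(fade_len, n)):
        g = i / fade_len
        out[i] *= g
        out[n - 1 - i] *= g
    return out

def speed_of_sound(temp_c):
    """Approximate speed of sound in dry air (m/s): 331.3 * sqrt(1 + T/273.15)."""
    return 331.3 * math.sqrt(1.0 + temp_c / 273.15)

pulse = [math.sin(2 * math.pi * 19000 * i / 48000) for i in range(4800)]
faded = apply_fade(pulse, 240)  # 5 ms ramps at an assumed 48 KHz sample rate

print(round(speed_of_sound(20.0), 1))  # 343.2
```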
In embodiments, the systems and methods of the present disclosure may comprise components that are hardware, software, or combinations thereof. In one embodiment, the software may be an application that is able to be installed on a mobile device, such as a smartphone, tablet, etc. In one embodiment, a mobile application may be configured to run on mobile devices such as Android devices, iPhones, and various wearable devices.
Advantages of the systems and methods of the present disclosure include:
1) Availability of Ultrasound Friendly Speakers on Smartphones—Because of consumers' expectations of high fidelity sound from the speaker of a mobile device, such as a smartphone, many mobile devices come equipped with a high performance speaker that can output a high volume of ultrasound.
2) Minimal software processing on a mobile device—In embodiments where the processor-intensive location detection algorithm is carried out independently of the mobile device, minimal resources may be required for a software application on the mobile device. This allows the system to run on devices that have constrained processor and battery resources, such as, for example, Google Glass, smart watches, and low-end smartphones.
3) Robustness—In embodiments where a system/method implements a time-of-first-arrival approach, the system/method is less prone to the distortion introduced by obstruction, reflection, and multi-path effects.
4) Low Interference—Most audio interference inside a car cabin has frequencies much lower than about 19 KHz. Road, engine, and wind noises are in the hundreds of Hz, human conversation centers around 5 KHz, and music rarely exceeds about 13 KHz. Because of the minimal interference in the high frequency audible range, the system/method may be able to achieve a better signal-to-noise ratio, and thus a better detection success rate.
5) Unobtrusiveness—Most adult human beings cannot hear frequencies above about 15 KHz. In one embodiment, a short sound pulse (1/10th of a second) emitted by the system should be imperceptible to most drivers and passengers.
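Points 4 and 5 above can be made concrete by synthesizing such a pulse. The 48 KHz sample rate below is an illustrative assumption (a common rate on mobile audio hardware); at 19 KHz the tone sits above most cabin noise and above the hearing range of most adults.

```python
import math

# Synthesize a 1/10-second, 19 KHz sine pulse at an assumed 48 KHz sample rate.

SAMPLE_RATE = 48000
FREQ_HZ = 19000
DURATION_S = 0.1

pulse = [math.sin(2 * math.pi * FREQ_HZ * i / SAMPLE_RATE)
         for i in range(int(SAMPLE_RATE * DURATION_S))]

print(len(pulse))  # 4800 samples -> 0.1 s of audio
```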
In embodiments of active detection, the acoustic signal received by the acoustic receivers is converted to an electrical signal, and the electrical signal comprises information regarding the acoustic parameters of the acoustic signal. In embodiments, signal processing is performed on the electrical signal to determine a location of the mobile device. In embodiments, the systems and methods of the present disclosure may comprise a sound player, a sound recorder, and/or a sound filter that perform particular functions of the necessary signal processing. In embodiments, the signal processing components and functions described for active detection may be implemented in the same or similar fashion in embodiments of passive detection described below with regard to
It may be recognized, however, that active detection methods may include features that may be difficult to implement.
For example, the active detection method may not be robust for localization of multiple phones. It may be necessary for each phone to encode specific identification information in the sound it emits. Alternatively, each phone may have to coordinate with hardware in the vehicle through another communication method (Bluetooth, Wifi, etc.) and take turns emitting the sound (in round-robin fashion) with the other phones located in the vehicle. Such methods may require significant engineering effort.
Additionally, in the active detection method, the hardware must constantly monitor the acoustic environment of the vehicle because the ultrasonic pulse emitted by the mobile device may occur at any time. The hardware in the vehicle therefore needs to be capable of fast and sensitive sound recording and processing. One or more high performance microphones, amplifiers, and/or processors may be required for installation in the vehicle. One exemplary candidate for the processor is an ARM Cortex M4F processor configured to operate at 100 MHz or faster. The cost of the processor alone is $8 to $12 at volume. Because a vehicle OEM may have to add at least two microphones and provision significant processing capability, this method may be difficult to implement in the vehicle.
As shown in
It may be understood that the acoustic environment may comprise all sound signals within the environment of the mobile device. The sound signals within the acoustic environment may include infrasonic sounds (in some embodiments, sounds having a frequency less than about 20 Hz), audible sounds (in some embodiments, sounds ranging from about 20 Hz to about 20 KHz), and ultrasonic sounds (in some embodiments, sounds having a frequency greater than about 20 KHz). In some embodiments, ultrasonic sounds may also refer to sounds having a frequency greater than about 10 KHz or a frequency greater than about 15 KHz, which may include sounds at the high frequency end of the audible sound spectrum.
In embodiments, the system 1800 may use the Time of Arrival (TOA) of the acoustic signal for detection of the mobile device 1803 and to determine whether the mobile device 1803 is in a driver side location of a vehicle. The acoustic signal may comprise at least one sonic pulse, which may be an ultrasonic pulse. In one embodiment, the at least one ultrasonic pulse is transmitted in a range of about 15 KHz to about 60 KHz. In another embodiment, the at least one ultrasonic pulse is transmitted at a range of about 10 KHz to about 21 KHz. In a further embodiment, the at least one ultrasonic pulse is transmitted at about 19 KHz. Using a narrow-bandwidth 19 KHz acoustic pulse or beep may allow for aggressive digital filtering to attenuate background noise. Furthermore, a narrow-bandwidth 19 KHz acoustic pulse or beep may improve localization sensitivity over a range of frequencies since a wider bandwidth may contain more noise in a pass band directed to such a range of frequencies. Additionally, using a narrow-bandwidth 19 KHz acoustic pulse or beep may allow for transmission at a lower acoustic volume. Although the center frequency of such a band pass filter may be set to about 19 KHz, it may be understood that frequencies within a neighborhood of about 19 KHz (such as between about 18 KHz and about 20 KHz) may also be allowed through the filter passband. For some applications, a passband may range from about 18 KHz to about 20 KHz. In other applications, the passband may range from about 18.9 KHz to about 19.1 KHz. It may be understood that the width of the passband may be set to a narrow range for improved noise immunity, or may be set to a wider range to allow the acoustic pulse to be transmitted using frequency modulation or frequency hopping techniques.
The system 1800 may also comprise a circuit 1801 that may be configured to inhibit at least one function of the mobile device 1803. The processor 1813 may be in communication with the circuit 1801 of the mobile device. As shown in the embodiment of
Furthermore, in embodiments, the circuit 1801 may comprise a control module associated with the mobile device 1803, wherein the control module 1801 is coupled to a non-transitory memory that stores executable instructions and wherein the control module 1801 is operable to execute the instructions stored in the memory. The control module 1801 may be operable to receive a command signal from a processor 1813 and inhibit at least one function of the mobile device 1803 upon reception of the command signal. As shown in
During embodiments of passive detection, each transmitter 1805 may be configured to emit an acoustic signal into the acoustic environment of the vehicle in which each acoustic signal comprises a short pulse of a high-frequency (ultrasonic) sound signal. The mobile device 1803 may be configured to capture the acoustic signal via an acoustic receiver 1809, such as a microphone of the mobile device 1803. The processor 1813 may be configured to calculate a time-of-flight of the acoustic signal and determine a location of the mobile device 1803 in reference to a predetermined detection zone based on the time-of-flight.
Once a determination is made by the processor 1813 as to whether the mobile device 1803 is within the predetermined detection zone, the processor 1813 may cause a signal to be sent to the mobile device 1803 to inhibit a function of the mobile device 1803. The signal may be received via an antenna 1811 of the mobile device 1803 if the processor 1813 is not a component of the mobile device 1803. Once an appropriate signal is received, operation of the mobile device 1803 may be controlled in one or more ways. For example, in one embodiment, the mobile device 1803 is associated with control module 1801 that disables or inhibits the operation of at least one function of the mobile device 1803. Thus, the mobile device 1803 is rendered either inoperable or operable only in a state of limited capacity. Accordingly, the control module 1801 may be able to either completely block the ability to receive or send a call on a mobile device 1803, or sufficiently interfere with a function of the mobile device 1803 so as to make usage of the mobile device 1803 undesirable. In embodiments, the control module 1801 may disable the operation of certain components or functions of the mobile device. For example, a keyboard portion of a mobile device 1803 may be disabled to prevent the user from using a text messaging function or an email function of the mobile device. In another embodiment, the control module 1801 may alter the activity of one or more functions of the mobile device 1803, for example directing the operation of the mobile device 1803 to a hands-free operation. In another embodiment, outgoing communication functions may be inhibited, but incoming communication functions may be uninhibited. In another embodiment, automatic replies may be initiated during a period in which a function of the mobile device 1803 is inhibited.
In embodiments, the processor 1813 may be coupled to a non-transitory memory that stores executable instructions, and the processor 1813 may be operable to execute the instructions. The processor 1813 may be operable to execute the instructions to receive the electrical signals from an acoustic receiver 1809 of the mobile device 1803, where each electrical signal is based on each acoustic signal received by the acoustic receiver 1809, to determine a location of the mobile device 1803 based on the time of reception of the acoustic signals by the acoustic receiver 1809, and to determine whether the location of the mobile device 1803 matches the predetermined detection zone. In one embodiment, the processor 1813 is operable to determine the location of the mobile device 1803 based on a distance from the mobile device 1803 to each of the plurality of acoustic transmitters 1805. Further, the processor 1813 may be operable to determine the distance of the mobile device 1803 to each of the plurality of acoustic transmitters 1805 based on a time difference in transmission of the acoustic signals from each of the plurality of acoustic transmitters 1805. In one embodiment, the processor 1813 is a mobile application processor. Further, in one embodiment, the processor 1813 may be located within the mobile device, and in another embodiment the processor 1813 may be independent of the mobile device 1803 and communicatively coupled to the mobile device 1803. Further, in embodiments, components or functions of the processor 1813 may be part of or performed by the mobile device 1803. Accordingly, the mobile device may receive a communication signal from the processor 1813 that provides information regarding a time of reception of each acoustic signal at the acoustic receiver 1809 of the mobile device 1803.
The plurality of transmitters 1805 may be a plurality of acoustic transmitters, such as speakers, located inside of a cabin of a vehicle. One embodiment of a location of the speakers 1805 is shown in
In addition, a method for determining a presence of a mobile device located in a predetermined detection zone comprises transmitting a sequence of acoustic pulses through multiple acoustic transmitters, for example a plurality of speakers 1805. Each pulse may be transmitted at about 19 KHz and may be separated from another pulse by a pre-defined time delay. The sound received by the acoustic receiver of the mobile device 1803 may be recorded. The acoustic signal from each speaker is identified and the time difference between each pulse is analyzed. Based on the time difference between the pulses, a relative distance to each speaker is calculated and a determination is made as to whether the mobile device is in the driver zone or not.
A sound player within the vehicle may periodically play a sound file comprising the acoustic signal that contains 19 KHz audio acoustic pulses through the speakers. In one embodiment, a sound file may be configured to cause the speakers to emit pulses, or beeps, that are about 10 milliseconds long and are about 19 KHz sinusoidal signals separated by about 190 ms of silence between the pulses. In some alternative examples, the pulse width can range from about 1 ms to about 500 ms. The pulse width may be kept as short as possible so that more pulses may be transmitted in each time period. The lower bound on pulse width may be set by the characteristics of the audio receiver in the mobile device: if the pulse width is too short, there may not be sufficient sound energy to be registered by the microphone. In some embodiments, it has been determined that a pulse width ranging from about 5 ms to about 10 ms may provide a strong enough signal to be registered by the microphone, while being short enough to permit multiple pulses per second. The period of silence between ultrasonic pulses may also be configurable. A lower boundary, for example of about tens of milliseconds, may be determined based on the reverberation of the pulse. The period of silence may be long enough so that all echoes from a prior pulse may have already died down. In some embodiments, the period of silence between ultrasonic pulses has been set to about 50 ms to about 200 ms. A long period of silence may not be ideal, because it may reduce the number of ultrasonic pulses transmitted in any time period. This sound file may be recorded using a sampling rate of about 44.1 KHz and a 32-bit floating-point number format.
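The pulse train described above can be sketched in C++ as follows. This is an illustrative sketch, not a listing from the original disclosure; the function name and default parameters simply restate the values given above (10 ms of 19 KHz sine followed by 190 ms of silence at a 44.1 KHz sampling rate).

```cpp
#include <cmath>
#include <vector>

// Illustrative sketch: generate one period of the described acoustic signal,
// a 10 ms, 19 KHz sinusoidal pulse followed by 190 ms of silence, sampled at
// 44.1 KHz as floating-point values.
std::vector<float> makePulsePeriod(double toneHz = 19000.0,
                                   double sampleHz = 44100.0,
                                   double pulseMs = 10.0,
                                   double silenceMs = 190.0) {
    const double kPi = 3.14159265358979323846;
    const std::size_t pulseSamples =
        static_cast<std::size_t>(sampleHz * pulseMs / 1000.0);
    const std::size_t silenceSamples =
        static_cast<std::size_t>(sampleHz * silenceMs / 1000.0);
    std::vector<float> out(pulseSamples + silenceSamples, 0.0f);
    for (std::size_t i = 0; i < pulseSamples; ++i) {
        // Sine tone for the pulse portion; the remainder stays at zero.
        out[i] = static_cast<float>(std::sin(2.0 * kPi * toneHz * i / sampleHz));
    }
    return out;
}
```

One such period is 8,820 samples (200 ms) at 44.1 KHz; the sound player would loop this buffer to produce the periodic beeps.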
There are several mechanisms by which the sound file may be introduced into the sound system of the vehicle to cause the vehicle sound system to emit the acoustic signal. In one embodiment, the in-vehicle audio system may use a software mixer routine to add the acoustic signal into the audio signal that will eventually be played through the speaker. In an exemplary embodiment, for better localization accuracy, the acoustic signal may be sourced by only the front two speakers, for example by one or more tweeters. In another embodiment, the acoustic signal may be added to a source of music, such as through mixing the acoustic signal into existing CD, digital audio/video, streaming audio and video. In another embodiment, the acoustic signal may be added to a radio, Satellite, TV or Internet audio and/or video broadcast. In yet another embodiment, the acoustic signal may be added to software (such as iPhone, Android or vehicle software app) that generates any audio or video output. In one example, an iPhone or other connected device may source the acoustic signal via a USB connection to play through in-vehicle audio system. In another example, an iPhone or other connected device source the acoustic signal via a Bluetooth Audio connection to play through in-vehicle audio system. In yet another embodiment, encryption or other security technique may be incorporated into the acoustic signal to prevent an unauthorized party from replicating or reverse engineering the acoustic signal.
The introduction of an audio file comprising the acoustic signal from an extra-vehicular source into a pre-existing vehicle audio system may have several advantages. Such advantages may include:
It may be recognized that the passive localization method may be affected by music, noise, conversation, or other external audio signals that may match the characteristics of the acoustic signal and lock the phone (audio interference). Audio interference may be addressed in several ways, including, but not limited to:
In embodiments, the acoustic signal received by the acoustic receiver of the mobile device may be converted to an electrical signal and the electrical signal comprises information regarding the acoustic parameters of the acoustic signal. In embodiments, processing is performed on the electrical signal to determine a location of mobile device. In embodiments, the systems and methods of the present disclosure may comprise a sound player, a sound recorder, and/or a sound filter as described with regard to
However, in the passive detection method, the mobile device must constantly monitor the acoustic environment of the vehicle because the ultrasonic pulse emitted by the transmitters may occur at any time. As a result, the processor may run continuously in order to evaluate the acoustic environment and detect the occurrence of one or more ultrasonic pings. Such continual higher processor activity may lead to battery drain. Several mechanisms may be incorporated into the passive localization method to address the issue of power consumption including, without limitation:
An embodiment of a passive method for mobile device localization that addresses the issue of continual evaluation of the acoustic environment of the vehicle by the mobile device is shown in
In the method 600 depicted in
In some embodiments, the mobile device may follow an acoustic sampling protocol to sample the acoustic environment 601 for a period of about 1 sec. and remain disabled for about 9 secs. Such a sampling protocol may be described as having a sampling protocol frequency of about 0.1 Hz with a sampling protocol duty cycle of about 10%. Alternative sampling protocols may have a sampling protocol frequency of about 0.5 Hz to about 0.01 Hz with a sampling protocol duty cycle of about 5% to about 30%. If acoustic sampling by the mobile device is disabled 602, the mobile device takes no further actions. If acoustic sampling by the mobile device is enabled 602, the mobile device may be configured to enable a sound recorder 603 to capture a short recording from an acoustic receiver at a predetermined sampling frequency. In one embodiment, the sampling frequency is about 44.1 KHz. In an alternative embodiment, the sampling frequency may be greater, for example at about 100 KHz. Further, in an embodiment, the recorded audio is converted to an array of double-precision floating-point numbers for further analysis. Example code of an embodiment for capturing a recording is shown below:
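The referenced listing is not reproduced in this text. The capture step itself is platform-specific (for example, Android's AudioRecord or iOS's AVAudioRecorder APIs); the platform-neutral portion, converting captured 16-bit PCM samples into the double-precision array used for analysis, may be sketched as follows (function name is illustrative):

```cpp
#include <cstdint>
#include <vector>

// Illustrative sketch: convert a buffer of signed 16-bit PCM samples, as a
// microphone capture API might deliver them, into an array of
// double-precision values normalized to the range [-1.0, 1.0].
std::vector<double> toDoubleArray(const std::vector<int16_t>& pcm) {
    std::vector<double> out;
    out.reserve(pcm.size());
    for (int16_t s : pcm) {
        out.push_back(static_cast<double>(s) / 32768.0);
    }
    return out;
}
```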
Further, at step 605, a sound filter may apply a narrow band-pass filter centered at about 19 KHz to emphasize the acoustic signal. In one embodiment, the sound filter comprises a Butterworth Infinite Impulse Response filter (Butterworth-type IIR filter). Example code for a Butterworth-type IIR filter is shown below:
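The referenced listing is absent from this text. As a stand-in, the sketch below uses a second-order (biquad) band-pass IIR section centered at 19 KHz, with coefficients in the widely used RBJ "Audio EQ Cookbook" form; the Q value and function name are illustrative assumptions, and a true Butterworth design would be derived differently but serves the same purpose.

```cpp
#include <cmath>
#include <vector>

// Illustrative sketch: second-order band-pass IIR filter (biquad) with
// 0 dB peak gain at the center frequency, applied sample by sample.
std::vector<double> bandPass19k(const std::vector<double>& x,
                                double sampleHz = 44100.0,
                                double centerHz = 19000.0,
                                double q = 8.0) {        // q is an assumption
    const double kPi = 3.14159265358979323846;
    const double w0 = 2.0 * kPi * centerHz / sampleHz;
    const double alpha = std::sin(w0) / (2.0 * q);
    const double a0 = 1.0 + alpha;
    const double b0 = alpha / a0, b2 = -alpha / a0;      // b1 is zero
    const double a1 = -2.0 * std::cos(w0) / a0;
    const double a2 = (1.0 - alpha) / a0;
    std::vector<double> y(x.size(), 0.0);
    double x1 = 0.0, x2 = 0.0, y1 = 0.0, y2 = 0.0;       // filter state
    for (std::size_t i = 0; i < x.size(); ++i) {
        y[i] = b0 * x[i] + b2 * x2 - a1 * y1 - a2 * y2;
        x2 = x1; x1 = x[i];
        y2 = y1; y1 = y[i];
    }
    return y;
}
```

A 19 KHz tone passes through such a filter essentially unchanged, while out-of-band content such as speech or music energy near 1 KHz is attenuated by orders of magnitude.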
Further, an IIR filter is only one of a plurality of possible filter implementations. Depending on a particular operating system of a mobile device, a software library, and/or a particular hardware resource, a type of IIR and/or Finite Impulse Response (FIR) filter may be chosen as appropriate.
In one embodiment, an acoustic receiver, such as a microphone, records the acoustic signal as oscillations around the 0-axis. A volume value, which is always greater or equal to 0, may be extracted from the sound recording at step 607 for the purpose of efficient analysis. Sound volume extraction may be done by calculating the 7-elements moving average of the absolute values of the sound volume. Example code of an embodiment for sound volume extraction is shown below:
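The referenced listing is not reproduced here. One straightforward implementation of the described 7-element moving average of absolute values is sketched below (function name is illustrative):

```cpp
#include <cmath>
#include <vector>

// Illustrative sketch: extract a non-negative volume envelope as the
// 7-element moving average of the absolute values of the filtered samples.
std::vector<double> extractVolume(const std::vector<double>& soundInput,
                                  std::size_t window = 7) {
    std::vector<double> volume(soundInput.size(), 0.0);
    double sum = 0.0;  // running sum of the last `window` absolute values
    for (std::size_t i = 0; i < soundInput.size(); ++i) {
        sum += std::fabs(soundInput[i]);
        if (i >= window) sum -= std::fabs(soundInput[i - window]);
        const std::size_t n = (i + 1 < window) ? (i + 1) : window;
        volume[i] = sum / static_cast<double>(n);
    }
    return volume;
}
```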
In an alternative embodiment, a less processor intensive algorithm may be used to calculate the sound volume based on a 2-element moving average. Such an algorithm may increase the speed of the calculation as only two stored values may be used instead of seven. An example code of such an embodiment for a 2-element moving average may include:
soundVolume[i] = max(abs(soundInput[i]), abs(soundInput[i-1]))
Due to possible interference, filtering artifacts, electronic noise and transducer distortions, it may be necessary to remove background noise from the volume data at step 609. To remove background noise, a fixed threshold may be applied to each element of the volume data. If the volume data is less than the threshold, it may be assigned a value of 0. Example code of applying a threshold to volume data is shown below:
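The referenced listing is absent from this text; the fixed-threshold step described above may be sketched as follows (function name and threshold value are illustrative):

```cpp
#include <vector>

// Illustrative sketch: zero out volume elements that fall below a fixed
// threshold, removing background noise from the volume data.
std::vector<double> removeBackgroundNoise(std::vector<double> volume,
                                          double threshold) {
    for (double& v : volume) {
        if (v < threshold) v = 0.0;
    }
    return volume;
}
```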
Sounds with an energy level that is significantly higher than the background noise, which may be referred to as pulses, beeps, or peaks, are potential candidates for pulse identification at step 611. The method for the pulse detection may be a fixed threshold technique according to the example code shown below:
C++ Pseudo Code
double noise_free_volume[ ]; //input
int initial_cross_over_points[ ]; //output, time index where volume first changes from zero to non-zero.
Below is example code that may be implemented for pulse detection:
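The referenced listing is not reproduced in this text. Consistent with the input and output declarations above, one possible implementation of the crossover search is sketched below, using vectors rather than raw arrays so the example is self-contained:

```cpp
#include <vector>

// Illustrative sketch: scan the noise-free volume data and record each time
// index where the volume first changes from zero to non-zero, i.e., the
// rising edge of a candidate pulse.
std::vector<int> detectPulses(const std::vector<double>& noiseFreeVolume) {
    std::vector<int> crossOverPoints;
    bool inPulse = false;
    for (std::size_t i = 0; i < noiseFreeVolume.size(); ++i) {
        if (!inPulse && noiseFreeVolume[i] > 0.0) {
            crossOverPoints.push_back(static_cast<int>(i));
            inPulse = true;
        } else if (noiseFreeVolume[i] == 0.0) {
            inPulse = false;  // pulse ended; arm for the next rising edge
        }
    }
    return crossOverPoints;
}
```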
A process of initial pulse detection performed at step 611 may produce a list of time stamps of sound pulses. As part of a subsequent step, the list may be filtered by eliminating sound pulses that are very close to or very far from earlier pulses according to a pulse down selection process performed at step 613. In one embodiment, if a time difference between a pulse and a preceding pulse or a succeeding pulse is not in a range specified by a minimum and maximum value, then the pulse may be eliminated from the list of time stamps. Accordingly, if a pulse is not within a predetermined range, it may be determined to be a reverberation of an earlier pulse instead of a new pulse. Example code for determining time differences of pulses in the list is shown below:
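The referenced listing is absent from this text. A sketch of the down-selection logic, with gap limits expressed in sample indices (function and parameter names are illustrative), might read:

```cpp
#include <vector>

// Illustrative sketch: keep a detected pulse only if its time difference
// from the previously accepted pulse falls within [minGap, maxGap]
// (in sample indices); otherwise drop it from the list.
std::vector<int> downSelectPulses(const std::vector<int>& pulseTimes,
                                  int minGap, int maxGap) {
    std::vector<int> kept;
    for (int t : pulseTimes) {
        if (kept.empty()) { kept.push_back(t); continue; }
        const int gap = t - kept.back();
        // Gaps shorter than minGap are likely reverberations of the previous
        // pulse; gaps longer than maxGap do not match the transmitted spacing.
        if (gap >= minGap && gap <= maxGap) kept.push_back(t);
    }
    return kept;
}
```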
According to the embodiments disclosed above for method steps 605, 607, 609, 611, and 613, the processor may determine if the sound recorded in step 603 of the acoustic environment of the mobile device comprises the acoustic signals transmitted by the transmitters. When the recording is determined to comprise the acoustic signals transmitted by the transmitters, the relative location of the mobile device may then be calculated in step 615 using the speed of sound, according to a formula of the form d = v × Δt/2, in which v is the speed of sound and Δt is the difference between the nominal silence period separating the transmitted pulses and the measured silence period separating the received pulses.
Example code of an embodiment for calculating a relative location of a mobile device is shown below:
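The referenced listing is not reproduced here. A minimal sketch consistent with the constants discussed below, 34 cm/ms for the speed of sound and 44.1 audio samples per millisecond at a 44.1 KHz sampling rate, might read (function name is illustrative, and the sign convention depends on which speaker transmits first):

```cpp
// Illustrative sketch: compute the relative placement of the mobile device
// from the midpoint between two speakers. Inputs are the measured silence
// gap between the two received pulses and the nominal gap between the
// transmitted pulses, both in sample counts at 44.1 KHz.
double relativePlacementCm(int measuredGapSamples, int nominalGapSamples) {
    // 44.1 samples per millisecond at a 44.1 KHz sampling rate.
    const double gapShiftMs = (nominalGapSamples - measuredGapSamples) / 44.1;
    // 34 cm/ms is the speed of sound; the device sits half the path-length
    // difference away from the midpoint between the speakers.
    return gapShiftMs * 34.0 / 2.0;
}
```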
The value “34” shown above is the speed of sound in cm/ms. The value “44.1” is the number of audio samples in 1 millisecond at the sampling frequency of 44.1 KHz. In alternative embodiments, the sampling frequency may be higher, for example at about 100 KHz. In such alternative embodiments, the code may be changed so that the value “44.1” is replaced by “100” or other value related to the sampling frequency.
In addition, there are many sources of error that might lead to incorrect calculated distance from time to time. To eliminate statistical outliers, distance filtering may be applied at step 617 based on a calculated distance that may be averaged over current values and a finite set of historical values. A moving average process may improve the accuracy at the expense of slower detection speed (˜10 seconds). Example code below illustrates one embodiment of a moving average filtering calculation:
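The referenced listing is absent from this text. One possible moving-average filter over a finite history of calculated distances is sketched below; the window length is an illustrative assumption:

```cpp
#include <vector>

// Illustrative sketch: average the current calculated distance with a finite
// window of recent values to suppress statistical outliers, at the cost of
// slower response to real movement of the device.
class DistanceFilter {
public:
    explicit DistanceFilter(std::size_t window = 10) : window_(window) {}

    // Add a new distance measurement and return the filtered value.
    double update(double distanceCm) {
        history_.push_back(distanceCm);
        if (history_.size() > window_) history_.erase(history_.begin());
        double sum = 0.0;
        for (double d : history_) sum += d;
        return sum / static_cast<double>(history_.size());
    }

private:
    std::size_t window_;
    std::vector<double> history_;
};
```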
Ultimately, a determination is made as to whether a mobile device is located in a predetermined detection zone in step 619; such as a driver's zone. For the implementation shown above, a mobile device may be considered to be in a predetermined detection zone when a relative position is greater than 0. In an embodiment, this means if a relative placement is to the left of a mid-point of a vehicle cabin, then a mobile device may be determined to be in a driver's seat location. Example code of an embodiment for determining a relative position is shown below:
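The referenced listing is not reproduced here; the zone test described above reduces to a sign check (function name is illustrative):

```cpp
// Illustrative sketch: with the sign convention used in this example, a
// relative position greater than zero, i.e., left of the cabin mid-point,
// places the mobile device in the driver's zone of a left-hand-drive vehicle.
bool inDriverZone(double relativePositionCm) {
    return relativePositionCm > 0.0;
}
```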
Alternative embodiments may use different criteria to determine that the mobile device is located in the predetermined detection zone. According to alternative calculations, if the calculated relative distance is less than zero, then the mobile device is determined to be in the predetermined detection zone (the driver's side).
Once the position of the mobile device is determined, the control circuit may cause an inhibition of one or more functions of the mobile device if the position is found to be in the predetermined detection zone. Functions that may be inhibited may include texting functions or functions related to internet communications. In one example, the function of the mobile device may be altered, for example configuring voice communications to employ a hands-free system incorporated in the vehicle.
In one embodiment, the mobile device may continue to periodically sense the acoustic environment and determine the position of the mobile device even after the control circuit has inhibited the one or more functions of the mobile device. In an alternative embodiment, a timer associated with the mobile device may be implemented so that the mobile device may discontinue sensing the acoustic environment and determining the position of the mobile device until the timer runs out. In either embodiment, the at least one function of the mobile device may be restored when the mobile device determines that it is no longer located within the predetermined detection zone.
In addition, various embodiments of the sound filter discussed above with regard to step 605 of
There are also many popular circuit implementations of various band pass filters, including:
Further, embodiments of sound filters may be implemented using a microprocessor, a Field Programmable Gate Array (FPGA), or a Digital Signal Processor (DSP).
Additionally, embodiments of sound volume extraction discussed above are described below. A demodulation process used by an Amplitude Modulation (AM) radio receiver may be used for extracting sound volume from an ultrasonic pulse. Accordingly, various analog implementations of an AM radio demodulator may be used to extract the volume information from a 19 KHz ultrasonic carrier frequency. The following is a list of AM demodulation techniques:
In addition, a Hilbert Transform may be used for volume extraction. Further, a dedicated Application Specific Integrated Circuit (ASIC) semiconductor chip may be used to detect the volume level from an audio signal. One example is the THAT 2252 RMS-Level Detector chip manufactured by THAT Corporation.
Moreover, embodiments of pulse detection as discussed above are described below. Pulse detection may be considered a problem studied across various academic fields. The operation is to separate out a true signal, which is referred to as a ping, from noise. In one embodiment of pulse detection, a ping is separated from noise when the volume information exceeds a fixed multiple of the background noise. Another embodiment of pulse detection according to the present disclosure involves using a Cumulative Sum (CUSUM) chart. The CUSUM may be used to discern significant deviation from natural variability in a continuously evolving process. In addition, an Otsu threshold can be applied to identify a ping (foreground) from noise (background). The algorithm assumes that an acoustic signal follows a bi-modal histogram consisting of ping (foreground) and noise (background). By dividing each time slice into two groups (ping and noise) while minimizing the variance within each group, a ping may be identified reliably even with a varying noise level.
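The CUSUM approach mentioned above can be illustrated with a short sketch; the baseline mean, slack value k, and decision threshold h are illustrative assumptions, not parameters from the original disclosure:

```cpp
#include <algorithm>
#include <vector>

// Illustrative sketch of CUSUM-style pulse detection: accumulate deviations
// of the volume above a baseline mean (minus a slack value k); when the
// cumulative sum exceeds the decision threshold h, flag a pulse onset.
std::vector<int> cusumDetect(const std::vector<double>& volume,
                             double baselineMean, double k, double h) {
    std::vector<int> onsets;
    double s = 0.0;      // one-sided cumulative sum
    bool armed = true;   // suppress repeated flags for the same pulse
    for (std::size_t i = 0; i < volume.size(); ++i) {
        s = std::max(0.0, s + (volume[i] - baselineMean - k));
        if (armed && s > h) {
            onsets.push_back(static_cast<int>(i));
            armed = false;
        }
        if (s == 0.0) armed = true;  // re-arm once the sum resets
    }
    return onsets;
}
```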
Additionally, one or more of the steps depicted in
The following steps illustrate the calculation of phase correlation between the acoustic data from two microphones, s1 and s2:
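The enumerated steps are not reproduced in this text. As a stand-in, the sketch below estimates the relative shift between two microphone signals with a simple time-domain cross-correlation search; a phase-correlation implementation would recover the same shift in the frequency domain (names are illustrative):

```cpp
#include <vector>

// Illustrative sketch: find the lag (in samples) that maximizes the
// cross-correlation of s1 and s2 within [-maxLag, maxLag]. A positive
// result means s2 is delayed relative to s1 by that many samples.
int estimateShift(const std::vector<double>& s1,
                  const std::vector<double>& s2, int maxLag) {
    int bestLag = 0;
    double bestScore = -1e300;
    for (int lag = -maxLag; lag <= maxLag; ++lag) {
        double score = 0.0;
        for (int i = 0; i < static_cast<int>(s1.size()); ++i) {
            const int j = i + lag;
            if (j >= 0 && j < static_cast<int>(s2.size()))
                score += s1[i] * s2[j];
        }
        if (score > bestScore) { bestScore = score; bestLag = lag; }
    }
    return bestLag;
}
```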
Once phase shift has been determined, the relative location can be calculated by multiplying the phase shift by the speed of sound.
In passive detection, a relative location of a mobile device can be calculated using the speed of sound. The following illustrates one embodiment of a calculation process. In the example of
The mid-point between the two speakers 2001, 2003 is a distance of m from each speaker. The mobile device is calculated to be a distance of d right of the center point between left and right speaker 2001, 2003. The speed of sound is v. The distance of the mobile device to the right speaker 2003 is (m−d). Distance of the mobile device to the left speaker 2001 is (m+d).
For the first pulse, transmitted from the left speaker, the travel time to the mobile device will be: t1 = (m + d)/v.
For the second pulse, transmitted from the right speaker, the travel time will be: t2 = (m - d)/v.
The silence between the two pulses, specifically, from the falling edge of the 1st pulse to the rising edge of the 2nd pulse, is measured. Because the path from the right speaker is shorter by 2d, the measured silence period differs from the transmitted silence period by 2d/v.
Therefore, the relative distance d from the center point can be calculated by finding the small shift in the silence period between the two pulses.
In the above example, the relative placement is −14 cm, or 14 cm to the right of the midpoint between the two speakers 2001, 2003. The calculations disclosed above are examples only as they relate to acoustic signals having the timing characteristics as depicted in
In the embodiments disclosed above, the calculations for the location of the mobile device are referenced to a predetermined detection zone corresponding to a driver's side of a vehicle. In many of the sample calculations disclosed above, the driver's side of the vehicle is taken to be the left side of the vehicle (corresponding to jurisdictions having right-hand traffic laws, such as in the U.S.). Thus, in the calculation of the relative distance disclosed above, a negative value may correspond to an area outside of the driver's side such as the forward passenger side. It may be understood that equivalent embodiments, methods, and calculations may apply to vehicles having a right side corresponding to the driver's side of the vehicle (for jurisdictions having left-hand traffic laws, for example in the U.K.). In such embodiments, for example, a negative value of the relative distance may correspond to the predetermined detection zone corresponding to the driver's side of the vehicle.
Additionally, a method for determining a presence of a mobile device located in a predetermined detection zone comprises transmitting, by each of a plurality of transmitters, acoustic signals to the mobile device, receiving, by the mobile device, each acoustic signal transmitted by the plurality of transmitters, determining, by a processor, a location of the mobile device based on the communication signals transmitted by the plurality of transmitters and received by the mobile device, determining whether the location of the mobile device matches the predetermined detection zone, and inhibiting at least one function of the mobile device upon determining that the location of the mobile device matches the predetermined detection zone. Each of the acoustic signals comprises at least one ultrasonic pulse at about 19 kHz.
Further, determining the location of the mobile device may comprise determining the location of the mobile device based on a distance from the mobile device to each of the plurality of receivers, and the distance of the mobile device to each of the plurality of receivers may be determined based on a time difference in reception at each of the plurality of receivers of the acoustic signal transmitted from the mobile device. Additionally, determining the location of the mobile device may comprise determining the location of the mobile device based on triangulation.
In addition, an acoustic signal may be transmitted by a plurality of acoustic transmitters with additional location or identification information that allows each of the acoustic transmitters to be identified based on information contained in the acoustic signal. In one embodiment, information is encoded using pulse compression by modulating the transmitted acoustic signal and then correlating the received signal with the transmitted acoustic signal. The modulated acoustic signal may be transmitted according to certain parameters such that signal processing is accomplished in the same manner as, or a manner similar to, the processes described above.
As disclosed above, a mobile device may be localized within a vehicle based on the receipt, by the device, of one or more audio signals emitted by one or more transmitters within the vehicle. In one embodiment of a method, a mobile device periodically records sounds from its acoustic environment and processes data derived from the recorded sounds. The mobile device may then determine from the data that the recorded sounds comprise the audio signals, and then use timing information from the audio signals to determine the position of the mobile device within the vehicle. It may be recognized that, in some embodiments, both the periodic sampling by the mobile device and the emission of the audio signals by the transmitters may be free-running and uncorrelated processes. Consequently, it is possible that the mobile device may begin recording the environment at a time between the transmission of the audio signal from a first transmitter and the transmission of the audio signal from the second transmitter. Unless the audio signal from the first transmitter (the first audio signal) can be distinguished from the audio signal from the second transmitter (the second audio signal), the software within the mobile device may invert the sense of the transmitter and therefore incorrectly calculate its location. Therefore, in one embodiment, the first audio signal and the second audio signal may be distinguished according to one or more audio characteristics.
The second audio signal 722 may be similarly characterized as audio signal 702. The second audio signal 722 may include an ultrasonic pulse 724 starting at a time t2 726 and ending at a time t3 730. The ultrasonic pulse 724 may therefore have a pulse width w3 defined as the difference between time t2 726 and time t3 730. Ultrasonic pulse 724 may be followed by a refractory or silent period 732 having a time width of w4, corresponding to the difference in time between the start of a subsequent ultrasonic pulse 724 and the ending time t3 of a previous ultrasonic pulse 724. The second ultrasonic signal 722 may therefore be characterized by a period T2 comprising a sum of pulse width w3 and refractory period width w4. Additionally, the second audio signal 722 may be characterized by a duty cycle D2 calculated as (w3/T2)*100 (percentage of the period T2 during which the ultrasonic pulse 724 is emitted). The second audio signal 722 may be emitted with a delay time td1 with respect to the first audio signal 702. A delay time td1 may be calculated as the time between the start 706 of an ultrasonic pulse 704 in the first audio signal 702 and the start 726 of a subsequent ultrasonic pulse 724 in the second audio signal 722 (or a difference between t2 and t0). An alternative delay time td2 may be calculated as the time between the start 726 of an ultrasonic pulse 724 in the second audio signal 722 and the start 706 of a subsequent ultrasonic pulse 704 in the first audio signal 702 (or a difference between t0+w1+w2 and t2 of a preceding ultrasonic pulse 724.) It may be recognized that the first audio signal 702 may be distinguished from the second audio signal 722 according to differences in the timing characteristics of the signals. For example, the first audio signal 702 may have a pulse width w1 longer or shorter than the pulse width w3 of the second audio signal 722. 
Alternatively, the first audio signal 702 may have a refractory period w2 longer or shorter than the refractory period w4 of the second audio signal 722. In another example, the first audio signal 702 may have a duty cycle D1 longer or shorter than the duty cycle D2 of the second audio signal 722. In yet another example, delay time td1 may be longer or shorter than delay time td2. In some embodiments, the period T1 of the first audio signal 702 and the period T2 of the second audio signal 722 may both be about 125 msec. However, delay time td1 may be about 50 msec. and the delay time td2 may be about 75 msec. In this manner, the first audio signal 702 and the second audio signal 722 may be distinguished regardless of when the mobile device begins sampling the acoustic environment.
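Distinguishing the two audio signals by a measured timing characteristic can be sketched as follows; the nominal widths passed in are illustrative assumptions, and a real implementation might also compare refractory periods, duty cycles, or delay times as described above:

```cpp
#include <cmath>

// Illustrative sketch: classify a received ultrasonic pulse as belonging to
// the first or second audio signal by comparing its measured pulse width
// against the nominal widths w1 (first signal) and w3 (second signal).
int classifyPulse(double measuredWidthMs, double w1Ms, double w3Ms) {
    const double d1 = std::fabs(measuredWidthMs - w1Ms);
    const double d3 = std::fabs(measuredWidthMs - w3Ms);
    return (d1 <= d3) ? 1 : 2;  // 1 = first audio signal, 2 = second
}
```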
In addition to the characteristics of a first and a second audio signal disclosed above, each audio signal may be characterized according to the central frequency of the ultrasonic pulse and/or a wave envelope of the ultrasonic pulses.
It may be recognized that the location of a mobile device within a vehicle may be localized only in terms of a left side of the vehicle versus a right side of the vehicle when only two speakers are used (for example, the speakers are installed in the front of the vehicle). Such one dimensional localization (across a width dimension of the vehicle cabin) may be sufficient for a vehicle having only a front driver seat and a front passenger seat. However, such a system may be insufficient to localize a mobile device in a driver's seat for a vehicle having front and rear seats (or more than one rear seat, as may be found in some vans). The mobile device may be located in two dimensions (along the width and length of the vehicle cabin) if additional positioning information is provided. In one example, additional positioning information may be determined based on the power of the acoustic signal received by the mobile device. As disclosed above, the power or signal strength of a wave weakens as the receiver moves further away from the transmitter. If the distance between the transmitter and receiver is R, then the power density sensed by the receiver is given by the equation below:
Su = Ps/(4πR²)
where Su is the received power density, Ps is the power from the transmitter, and R is the distance between them. Thus, the location of the mobile device within the vehicle cabin may be determined in a length dimension of the vehicle cabin based on measuring a value of the power density of the acoustic signals emitted by the speakers.
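The free-field inverse-square relationship referenced above can be expressed directly (function name is illustrative; real cabin acoustics with reflections and speaker directivity will deviate from this ideal point-source model):

```cpp
// Illustrative sketch: power density at distance R from a point source of
// power Ps falls off with the surface area of a sphere, Su = Ps/(4*pi*R^2).
double receivedPowerDensity(double sourcePowerPs, double distanceR) {
    const double kPi = 3.14159265358979323846;
    return sourcePowerPs / (4.0 * kPi * distanceR * distanceR);
}
```

Doubling the distance from a speaker therefore reduces the received power density by a factor of four, which provides the second (length-wise) positioning cue.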
It is recognized that acoustic noise may interfere with a localization system based solely on the receipt of an acoustic signal. For example, if a large number of vehicles on the road rely on ultrasound emitters in the vehicle to determine the location of the phone, it is possible that vehicle A with windows or doors open might receive ultrasound interference from a nearby vehicle B. To prevent interference from nearby acoustical transmitters, the following techniques can be utilized:
Because acoustic signals may be readily generated, it is possible that a user might attempt to circumvent the acoustic methods for localizing the mobile device. Such attempts may include, without limitation:
As disclosed above, power may be saved in a mobile device by only sampling the acoustic environment periodically. However, if transmitters within a vehicle transmit acoustic signals freely, it is possible that the mobile device may sample the acoustic environment at time periods between or within the transmissions of the acoustic signals. In this manner, the mobile device may not be able to distinguish an acoustic signal transmitted from one transmitter or speaker from another. In one embodiment, each speaker may emit an acoustic signal having acoustic characteristics that differ from the others. In this manner, the characteristics of an acoustic signal detected by the mobile device may be used to identify which speaker emitted a particular acoustic signal. In an alternative embodiment, the mobile device may be synchronized to the acoustic signals. In this manner, the mobile device may sample the acoustic environment at a predetermined time with respect to the transmission of all of the acoustic signals. In one embodiment, synchronization may be accomplished by the receipt by the mobile device of a synchronization signal produced by an apparatus or device that is also incorporated into the vehicle. The synchronization signal may have a predetermined delay time with respect to the first acoustic signal emitted by the first speaker. The mobile device may therefore begin recording sound from the acoustic environment upon receiving the synchronization signal. As disclosed above, the distance of the mobile device to the speakers may be determined by the delay in the receipt of the acoustic signal emitted by each of the speakers by the mobile device. It may be recognized that a synchronization signal should have characteristics such that no appreciable delay in the receipt of the signal by the mobile device may occur regardless of the position of the mobile device within the vehicle. 
Appropriate characteristics of the synchronization signal and the acoustic signals are illustrated in
As illustrated in
The advantages of this embodiment may include the following:
As disclosed above, the embodiment is advantageous in minimizing the complexity and processing requirements and therefore may reduce the associated hardware costs. Cost reduction may arise due to the following considerations:
Additional financial advantages of the system may include:
Under some circumstances, a person lacking a mobile device or having a mobile device in an off state or in airplane mode may enter a vehicle. It would be useful to include a method to determine if there is an active mobile device in proximity to the vehicle. Processing and power savings may be realized if the vehicle hardware and system can determine that methods to localize a mobile device are unnecessary. The above disclosure describes a sound-based localization technique in combination with a radio wave technology. In one embodiment, a radio wave technology such as Bluetooth, Bluetooth Smart/Low Energy, or NFC may be used to permit the vehicle-based electronics to determine whether the mobile device is in close proximity to a vehicle. Once the vehicle-based electronics determine that the mobile device is near the vehicle, they may then enable the sound-based localization techniques to determine the precise location of the mobile device and whether it is in the driver's area.
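The two-stage gating just described can be illustrated with a hypothetical sketch: a coarse radio proximity check (for example, a Bluetooth RSSI threshold) decides whether the more power-hungry acoustic localization stage should run at all. The threshold value below is an assumption for illustration, not a value from the disclosure.

```python
# Hypothetical two-stage gating: run acoustic localization only when a
# radio signal suggests a device is near the vehicle. Threshold assumed.

RSSI_NEAR_DBM = -70  # assumed cutoff for "near the vehicle"


def should_run_acoustic_localization(ble_rssi_dbm):
    """Return True only when a device appears close enough to the vehicle
    to justify enabling the sound-based localization stage.

    ble_rssi_dbm: measured received signal strength in dBm, or None when
        no device has been detected at all.
    """
    return ble_rssi_dbm is not None and ble_rssi_dbm >= RSSI_NEAR_DBM


# Usage: a strong signal (-55 dBm) enables the acoustic stage; a weak one
# (-90 dBm) or no detected device leaves it disabled, saving power.
run_stage_two = should_run_acoustic_localization(-55)
```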
The radio technology may include one or more of the following techniques alone or in combination, to determine whether the electronic device is near the vehicle:
In an alternative embodiment, the vehicle-based electronics may determine the presence of a mobile device in proximity to the vehicle via sound localization. Once the mobile device is determined to be in proximity to or within the vehicle, the location of the mobile device with respect to the predetermined detection zone may be determined.
The system and methods disclosed above have considered the problem of identifying a location of a single mobile device within a vehicle. It is recognized that there may be multiple occupants in a vehicle, each one possessing one or more mobile devices.
In another embodiment, the location and identification of multiple mobile devices within a vehicle may be determined based on wireless signals emitted by the mobile devices.
Returning to
In addition to the location of a mobile device within the vehicle, the information may include identifying information about the mobile device including, without limitation, a MAC address, a list of applications resident on the device, and information related to the use of the device. If the electrical device 1402 is additionally connected to the OBD-II (On-Board Diagnostics) interface, the electrical device 1402 may also be able to correlate driving performance with a driver possessing an identified mobile device. For example, the electrical device 1402 may receive vehicle information such as speed, brake, and sensor information, as well as diagnostic and other information available from the OBD-II port. An additional advantage is that the electrical device 1402 may also be powered through the vehicle power system, and not require an additional power supply.
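The correlation of an identified driver-zone device with OBD-II vehicle data might be structured as in the following sketch. The record fields and the hard-braking threshold are assumptions introduced for illustration.

```python
# Illustrative sketch: associate vehicle data read from the OBD-II port
# with the device identified in the driver zone. Field names assumed.

from dataclasses import dataclass, field


@dataclass
class DriveRecord:
    device_mac: str                   # identified device in the driver zone
    speed_samples: list = field(default_factory=list)  # km/h from OBD-II
    hard_brake_events: int = 0        # count of abrupt brake applications

    def log_obd_sample(self, speed_kmh, brake_pressure):
        """Record one OBD-II sample; brake_pressure is normalized 0..1."""
        self.speed_samples.append(speed_kmh)
        if brake_pressure > 0.8:      # assumed "hard braking" threshold
            self.hard_brake_events += 1


rec = DriveRecord("AA:BB:CC:DD:EE:FF")
rec.log_obd_sample(62.0, 0.1)   # gentle braking
rec.log_obd_sample(58.0, 0.9)   # hard braking event
```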
The information stored on the server 1406 may be accessed by a user over one or more communications interfaces. In some embodiments, the server 1406 may include operations to restrict the access of the mobile device information to an authorized user. An authorized user may include a law enforcement user, an insurance user, and a healthcare user. For example, insurance providers may use this information to set the premium for a personalized usage-based insurance rate. Information that may be of use to the insurance provider may include, without limitation, which driver is driving the vehicle as well as the driving performance of the driver (from OBD-II information).
This information can be collected and stored in a back-end database. Access may be restricted according to any standard mechanism including, without limitation, the use of an identifier name, a password, a biometric token (such as a scanned fingerprint), a one-time password token, and the like. The server 1406 may then determine that the received security token or identifier is valid, and permit access to the information.
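The token check above might be sketched as follows. The token store and hashing scheme are assumptions for illustration; a real deployment would use a vetted authentication framework rather than this minimal gate.

```python
# Minimal sketch of restricting access to stored device information with a
# security-token check. Token value and store are assumptions.

import hashlib

# Server-side store of SHA-256 digests of authorized tokens (assumed).
AUTHORIZED_TOKEN_HASHES = {
    hashlib.sha256(b"example-insurance-token").hexdigest(),
}


def access_device_records(token, records):
    """Return the records only for a caller presenting a valid token."""
    digest = hashlib.sha256(token.encode()).hexdigest()
    if digest not in AUTHORIZED_TOKEN_HASHES:
        raise PermissionError("invalid or missing security token")
    return records


# Usage: a valid token retrieves the stored data; an invalid one raises.
data = access_device_records("example-insurance-token", [{"mac": "AA:BB"}])
```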
In an additional embodiment, the electrical device 1402 may transmit one or more messages back to the mobile device 1803 that has been localized to the predetermined detection zone (for example, at the driver's side of the vehicle). Such a message may include information regarding the state of the vehicle based on the OBD-II information. As one example, if the vehicle is operating in an auto-pilot mode, a text message may be forwarded to the driver to indicate a potential hazard that the auto-pilot mode is unable to address. Such a warning message may include a request that the driver resume manual control of the vehicle.
In alternative embodiments, the location of a mobile device within a vehicle may be determined based on other sensors.
As disclosed above, localization of a mobile device may be determined based on acoustic transmitters, magnetic sensors, or detectors of wireless transmissions from the mobile device. Alternatively, a beacon-based system may be employed in which beacons can be placed within the vehicle, and the mobile device may determine a distance from each beacon. Such a system is analogous to indoor GPS systems.
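The beacon-based alternative reduces to a classic trilateration problem: given range estimates to beacons at known in-cabin coordinates, solve for the device position. The sketch below is an illustrative 2-D solution via linearized least squares; the beacon layout and coordinates are assumptions.

```python
# Sketch of 2-D trilateration from ranges to fixed in-cabin beacons.
# Subtracting the first range equation from the others linearizes the
# system, which is then solved by least squares.

import numpy as np


def trilaterate(beacons, distances):
    """beacons: list of (x, y) positions; distances: range to each beacon."""
    (x1, y1), d1 = beacons[0], distances[0]
    rows, rhs = [], []
    for (xi, yi), di in zip(beacons[1:], distances[1:]):
        # From (x-xi)^2 + (y-yi)^2 = di^2 minus the first equation:
        rows.append([2 * (xi - x1), 2 * (yi - y1)])
        rhs.append(d1**2 - di**2 + xi**2 - x1**2 + yi**2 - y1**2)
    sol, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return sol  # estimated (x, y)


# Usage with an assumed beacon layout (meters) and exact synthetic ranges.
beacons = [(0.0, 0.0), (1.5, 0.0), (0.0, 1.2)]
true_pos = np.array([0.4, 0.9])
dists = [float(np.linalg.norm(true_pos - np.array(b))) for b in beacons]
est = trilaterate(beacons, dists)
```

With noisy real-world ranges, more than three beacons would overdetermine the system, and the same least-squares solve averages out the measurement error.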
It may be recognized that a combination of acoustic, Wi-Fi, and beacon-based technologies may be used together for improved localization accuracy. For example, wireless technology may be used to establish the approximate location of a mobile device. An ultrasound sensor may be used to provide precise or fine location determinations. Additionally, magnetic technology as well as GPS and positioning techniques may provide more refined information.
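A coarse-to-fine combination of this kind can be sketched simply: the wireless estimate serves as a fallback, and an ultrasound fix, when available, overrides it before the position is mapped to a seat zone. The zone boundaries below are assumptions for illustration.

```python
# Hedged sketch of coarse-to-fine localization: prefer the fine
# (ultrasound) position estimate over the coarse (wireless) one, then map
# the position to an assumed seat zone along the cabin's lateral axis.

SEAT_ZONES = {                    # assumed x-range per zone, in meters
    "driver":    (0.0, 0.6),
    "passenger": (0.6, 1.2),
}


def locate(coarse_x, fine_x=None):
    """Resolve a seat zone from coarse and (optional) fine estimates."""
    x = fine_x if fine_x is not None else coarse_x
    for zone, (lo, hi) in SEAT_ZONES.items():
        if lo <= x < hi:
            return zone
    return "unknown"


# Usage: the ultrasound reading (0.3 m) refines an ambiguous wireless
# estimate (0.9 m) and places the device in the driver zone.
zone = locate(0.9, fine_x=0.3)
```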
The various illustrative functional elements, logical blocks, modules, circuits, and processors described in connection with the embodiments disclosed herein may be implemented or performed with an appropriate processor device, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein as appropriate. As described herein a processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine designed to perform the appropriate function. A processor may be part of a computer system that also has a user interface port that communicates with a user interface, and which receives commands entered by a user, has at least one memory (e.g., hard drive or other comparable storage, and random access memory) that stores electronic information including a program that operates under control of the processor and with communication via the user interface port, and a video output that produces its output via any kind of video output format.
The functions of the various functional elements, logical blocks, modules, and circuits elements described in connection with the embodiments disclosed herein may be performed through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared. Moreover, explicit use of the terms “processor” or “module” should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, DSP hardware, read-only memory (ROM) for storing software, random access memory (RAM), and non-volatile storage. Other hardware, conventional and/or custom, may also be included. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the implementer as more specifically understood from the context.
The various functional elements, logical blocks, modules, and circuits elements described in connection with the embodiments disclosed herein may comprise a processing unit for executing software program instructions to provide computing and processing operations for the systems and methods described herein. A processing unit may be responsible for performing various voice and data communications operations between the mobile device and other components of an appropriate system. Although the processing unit may include a single processor architecture, it may be appreciated that any suitable processor architecture and/or any suitable number of processors may be used in accordance with the described embodiments. In one embodiment, the processing unit may be implemented using a single integrated processor.
The functions of the various functional elements, logical blocks, modules, and circuits elements described in connection with the embodiments disclosed herein may also be implemented in the general context of computer executable instructions, such as software, control modules, logic, and/or logic modules executed by the processing unit. Generally, software, control modules, logic, and/or logic modules include any software element arranged to perform particular operations. Software, control modules, logic, and/or logic modules can include routines, programs, objects, components, data structures and the like that perform particular tasks or implement particular abstract data types. An implementation of the software, control modules, logic, and/or logic modules and techniques may be stored on and/or transmitted across some form of computer-readable media. In this regard, computer-readable media can be any available medium or media useable to store information and accessible by a computing device. Some embodiments also may be practiced in distributed computing environments where operations are performed by one or more remote processing devices that are linked through a communications network. In a distributed computing environment, software, control modules, logic, and/or logic modules may be located in both local and remote computer storage media including memory storage devices.
Additionally, it is to be appreciated that the embodiments described herein illustrate example implementations, and that the functional elements, logical blocks, modules, and circuits elements may be implemented in various other ways which are consistent with the described embodiments. Furthermore, the operations performed by such functional elements, logical blocks, modules, and circuits elements may be combined and/or separated for a given implementation and may be performed by a greater number or fewer number of components or modules. As will be apparent to those of skill in the art upon reading the present disclosure, each of the individual embodiments described and illustrated herein has discrete components and features which may be readily separated from or combined with the features of any of the other several aspects without departing from the scope of the present disclosure. Any recited method can be carried out in the order of events recited or in any other order which is logically possible.
It is worthy to note that any reference to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” or “in one aspect” in the specification are not necessarily all referring to the same embodiment.
Unless specifically stated otherwise, it may be appreciated that terms such as “processing,” “computing,” “calculating,” “determining,” or the like, refer to the action and/or processes of a computer or computing system, or similar electronic computing device, such as a general purpose processor, a DSP, ASIC, FPGA or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein that manipulates and/or transforms data represented as physical quantities (e.g., electronic) within registers and/or memories into other data similarly represented as physical quantities within the memories, registers or other such information storage, transmission or display devices.
It is worthy to note that some embodiments may be described using the expression “coupled” and “connected” along with their derivatives. These terms are not intended as synonyms for each other. For example, some embodiments may be described using the terms “connected” and/or “coupled” to indicate that two or more elements are in direct physical or electrical contact with each other. The term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other. With respect to software elements, for example, the term “coupled” may refer to interfaces, message interfaces, application program interface (API), exchanging messages, and so forth.
It will be appreciated that those skilled in the art will be able to devise various arrangements which, although not explicitly described or shown herein, embody the principles of the present disclosure and are included within the scope thereof. Furthermore, all examples and conditional language recited herein are principally intended to aid the reader in understanding the principles described in the present disclosure and the concepts contributed to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions. Moreover, all statements herein reciting principles, aspects, and embodiments as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents and equivalents developed in the future, i.e., any elements developed that perform the same function, regardless of structure. The scope of the present disclosure, therefore, is not intended to be limited to the example aspects shown and described herein. Rather, the scope of present disclosure is embodied by the appended claims.
The terms “a” and “an” and “the” and similar referents used in the context of the present disclosure (especially in the context of the following claims) are to be construed to cover both the singular and the plural, unless otherwise indicated herein or clearly contradicted by context. Recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range. Unless otherwise indicated herein, each individual value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or example language (e.g., “such as”, “in the case”, “by way of example”) provided herein is intended merely to better illuminate the present disclosure and does not pose a limitation on the scope of the present disclosure otherwise claimed. No language in the specification should be construed as indicating any non-claimed element essential to the practice of the present disclosure. It is further noted that the claims may be drafted to exclude any optional element. As such, this statement is intended to serve as antecedent basis for use of such exclusive terminology as solely, only and the like in connection with the recitation of claim elements, or use of a negative limitation.
Groupings of alternative elements or embodiments disclosed herein are not to be construed as limitations. Each group member may be referred to and claimed individually or in any combination with other members of the group or other elements found herein. It is anticipated that one or more members of a group may be included in, or deleted from, a group for reasons of convenience and/or patentability.
While certain features of the embodiments have been illustrated as described above, many modifications, substitutions, changes and equivalents will now occur to those skilled in the art. It is therefore to be understood that the appended claims are intended to cover all such modifications and changes as fall within the scope of the disclosed embodiments.
Various embodiments are described in the following numbered clauses:
1. A system for determining a presence of a mobile device located in a predetermined detection zone within a vehicle, the system comprising: a mobile device comprising a processor, wherein the mobile device is configured to periodically record sounds from an acoustic environment, and wherein the processor is configured to: determine that the periodically recorded sounds comprise a periodically recorded first acoustic signal comprising a first ultrasonic pulse and a second acoustic signal comprising a second ultrasonic pulse; calculate, from the periodically recorded sounds, a first time of arrival of the first acoustic signal and a second time of arrival of the second acoustic signal; determine a location of the mobile device within the vehicle based on the first time of arrival and the second time of arrival; and determine that the location of the mobile device matches the predetermined detection zone.
2. The system of clause 1, wherein upon determining that the location of the mobile device matches the predetermined detection zone, the processor is further configured to cause the mobile device to inhibit at least one function of the mobile device.
3. The system of clause 1, wherein upon determining that the location of the mobile device matches the predetermined detection zone, the processor is further configured to cause the mobile device to alter the activity of at least one function of the mobile device.
4. The system of clause 1, wherein upon determining that the location of the mobile device matches the predetermined detection zone, the processor is further configured to cause the mobile device to issue a notification to a user of the mobile device.
5. The system of clause 1, wherein the first acoustic signal has a first acoustic characteristic and the second acoustic signal has a second acoustic characteristic.
6. The system of clause 5, wherein the first acoustic characteristic differs from the second acoustic characteristic.
7. The system of clause 5, wherein the first acoustic characteristic and the second acoustic characteristic independently comprise an acoustic signal period.
8. The system of clause 5, wherein the first acoustic characteristic and the second acoustic characteristic independently comprise an ultrasonic pulse width.
9. The system of clause 5, wherein the first acoustic characteristic and the second acoustic characteristic independently comprise an acoustic signal duty cycle.
10. The system of clause 5, wherein the first acoustic characteristic and the second acoustic characteristic independently comprise an ultrasonic pulse central frequency.
11. The system of clause 5, wherein the first acoustic characteristic and the second acoustic characteristic independently comprise an ultrasonic pulse shape.
12. The system of clause 1, wherein the processor is further configured to: calculate, from the periodically recorded sounds, a power of the first acoustic signal and a power of the second acoustic signal; and determine a location of the mobile device within the vehicle based on the power of the first acoustic signal and the power of the second acoustic signal.
13. A method for determining a presence of a mobile device located in a predetermined detection zone within a vehicle, the method comprising: periodically recording, by the mobile device comprising a processor, a plurality of sounds comprising an acoustic environment; determining, by the processor, that the periodically recorded sounds comprise a periodically recorded first acoustic signal comprising a first ultrasonic pulse and a second acoustic signal comprising a second ultrasonic pulse; calculating, by the processor from the periodically recorded sounds, a first time of arrival of the first acoustic signal and a second time of arrival of the second acoustic signal; determining, by the processor, a location of the mobile device within the vehicle based on the first time of arrival and the second time of arrival; and determining, by the processor, that the location of the mobile device matches the predetermined detection zone.
14. The method of clause 13, further comprising, upon determining that the location of the mobile device matches the predetermined detection zone, causing, by the processor, the mobile device to inhibit at least one function of the mobile device.
15. The method of clause 13, further comprising, upon determining that the location of the mobile device matches the predetermined detection zone, causing, by the processor, the mobile device to alter the activity of at least one function of the mobile device.
16. The method of clause 13, further comprising, upon determining that the location of the mobile device matches the predetermined detection zone, causing, by the processor, the mobile device to issue a notification to a user of the mobile device.
17. The method of clause 13, further comprising: determining that the periodically recorded sounds comprise a periodically recorded first acoustic signal comprising a first ultrasonic pulse having a first acoustic characteristic; and determining that the periodically recorded sounds comprise a periodically recorded second acoustic signal comprising a second ultrasonic pulse having a second acoustic characteristic.
18. The method of clause 13, further comprising: determining that the periodically recorded sounds comprise a periodically recorded first acoustic signal comprising a first ultrasonic pulse having a frequency in the range of 15 kHz to 60 kHz; and determining that the periodically recorded sounds comprise a periodically recorded second acoustic signal comprising a second ultrasonic pulse having a frequency in the range of 15 kHz to 60 kHz.
19. The method of clause 13, further comprising: determining that the periodically recorded sounds comprise a periodically recorded first acoustic signal comprising a first ultrasonic pulse having a frequency in the range of 10 kHz to 21 kHz; and determining that the periodically recorded sounds comprise a periodically recorded second acoustic signal comprising a second ultrasonic pulse having a frequency in the range of 10 kHz to 21 kHz.
20. The method of clause 13, further comprising: calculating, from the periodically recorded sounds, a power of the first acoustic signal and a power of the second acoustic signal; and determining a location of the mobile device within the vehicle based on the power of the first acoustic signal and the power of the second acoustic signal.
21. A method for determining a presence of a mobile device located in a predetermined detection zone within a vehicle, the method comprising: receiving, by a mobile device, a wireless synchronization signal; recording, by the mobile device comprising a processor, a plurality of sounds comprising an acoustic environment upon receiving the wireless synchronization signal; determining, by the processor, that the recording of the plurality of sounds comprises a recorded first acoustic signal comprising a first ultrasonic pulse and a second acoustic signal comprising a second ultrasonic pulse; calculating, by the processor, from the recorded sounds, a first time of arrival of the first acoustic signal and a second time of arrival of the second acoustic signal; determining, by the processor, a location of the mobile device within the vehicle based on the first time of arrival and the second time of arrival; and determining, by the processor, that the location of the mobile device matches the predetermined detection zone.
22. The method of clause 21, further comprising, upon determining that the location of the mobile device matches the predetermined detection zone, causing, by the processor, the mobile device to inhibit at least one function of the mobile device.
23. The method of clause 21, further comprising, upon determining that the location of the mobile device matches the predetermined detection zone, causing, by the processor, the mobile device to alter the activity of at least one function of the mobile device.
24. The method of clause 21, further comprising, upon determining that the location of the mobile device matches the predetermined detection zone, causing, by the processor, the mobile device to issue a notification to a user of the mobile device.
25. The method of clause 21, wherein receiving, by a mobile device, a wireless synchronization signal comprises receiving, by the mobile device, a Bluetooth broadcast message comprising the synchronization signal.
26. A method of providing a location of at least one mobile device within a vehicle to a recipient, the method comprising: receiving, by a server comprising a processor and a memory, data from a mobile device, wherein the data from the mobile device comprises a location of the mobile device within a vehicle; storing, by the server processor, the data from the mobile device in the server memory; and providing, by the server processor, the data from the mobile device to the recipient via a communication interface.
27. The method of clause 26, wherein receiving, by a server comprising a processor and a memory, data from a mobile device further comprises receiving, by a server, identification data from the mobile device.
28. The method of clause 26, wherein storing the mobile device data by the server processor in the server memory comprises storing, by the server processor, the mobile device data in a database stored in the server memory.
29. The method of clause 26, wherein providing, by the server processor, the data from the mobile device to the recipient via a communication interface comprises: receiving, by the server processor, a security token from the recipient via the communication interface; determining, by the server processor, that the security token is a valid security token; and providing, by the server processor, the data from the mobile device to the recipient via a communication interface.
30. The method of clause 26, wherein receiving, by a server comprising a processor and a memory, data from a mobile device comprises receiving, by a server comprising a processor and a memory, data from the mobile device via a wireless communications protocol.
31. The method of clause 30, wherein receiving, by a server comprising a processor and a memory, data from a mobile device comprises receiving, by a server comprising a processor and a memory, data from the mobile device via a cellular phone communication protocol.
This application is a continuation application filed under 35 U.S.C. § 120 of U.S. patent application Ser. No. 15/210,649, filed Jul. 14, 2016, entitled “DETECTING THE LOCATION OF A PHONE USING RF WIRELESS AND ULTRASONIC SIGNALS,” now U.S. Pat. No. 10,205,819, and which further claims the benefit, under 35 USC § 119(e), of U.S. provisional patent application No. 62/192,354, filed Jul. 14, 2015, entitled “DETECTING THE LOCATION OF A PHONE USING RF WIRELESS AND ULTRASONIC SIGNALS”, the entire disclosures of which are hereby incorporated by reference in their entirety and for all purposes.
10034145 | Yang et al. | Jul 2018 | B2 |
10205819 | Hannon et al. | Feb 2019 | B2 |
20020132646 | Girod | Sep 2002 | A1 |
20020156602 | Kunii et al. | Oct 2002 | A1 |
20020167862 | Tomasi | Nov 2002 | A1 |
20030086515 | Trans et al. | May 2003 | A1 |
20030222144 | Meier et al. | Dec 2003 | A1 |
20040083031 | Okezie | Apr 2004 | A1 |
20040124697 | MacGregor et al. | Jul 2004 | A1 |
20040267607 | Maddux | Dec 2004 | A1 |
20050041529 | Schliep et al. | Feb 2005 | A1 |
20050050209 | Main, II | Mar 2005 | A1 |
20050064922 | Owens et al. | Mar 2005 | A1 |
20050186933 | Trans | Aug 2005 | A1 |
20050261824 | Furukawa | Nov 2005 | A1 |
20050261829 | Furukawa | Nov 2005 | A1 |
20060033628 | Duval | Feb 2006 | A1 |
20060058951 | Cooper et al. | Mar 2006 | A1 |
20060058952 | Cooper et al. | Mar 2006 | A1 |
20060058953 | Cooper et al. | Mar 2006 | A1 |
20060080031 | Cooper et al. | Apr 2006 | A1 |
20060080032 | Cooper et al. | Apr 2006 | A1 |
20060099940 | Pfleging et al. | May 2006 | A1 |
20060205394 | Vesterinen | Sep 2006 | A1 |
20060224945 | Khan et al. | Oct 2006 | A1 |
20060240860 | Benco et al. | Oct 2006 | A1 |
20060265508 | Angel et al. | Nov 2006 | A1 |
20070032225 | Konicek et al. | Feb 2007 | A1 |
20070088495 | Ibrahim | Apr 2007 | A1 |
20070130153 | Nachman et al. | Jun 2007 | A1 |
20070136068 | Horvitz | Jun 2007 | A1 |
20070182595 | Ghasabian | Aug 2007 | A1 |
20070188472 | Ghasabian | Aug 2007 | A1 |
20070196078 | Kunii et al. | Aug 2007 | A1 |
20070288164 | Gordon et al. | Dec 2007 | A1 |
20080009296 | Han | Jan 2008 | A1 |
20080123580 | Vathulya | May 2008 | A1 |
20080147314 | Cubillo | Jun 2008 | A1 |
20080168398 | Geelen et al. | Jul 2008 | A1 |
20080182598 | Bowman | Jul 2008 | A1 |
20080208447 | Geelen et al. | Aug 2008 | A1 |
20090012704 | Franco et al. | Jan 2009 | A1 |
20090024707 | Aase et al. | Jan 2009 | A1 |
20090028179 | Albal | Jan 2009 | A1 |
20090075139 | Kucernak et al. | Mar 2009 | A1 |
20090083035 | Huang et al. | Mar 2009 | A1 |
20090089293 | Garritano et al. | Apr 2009 | A1 |
20090112572 | Thorn | Apr 2009 | A1 |
20090146848 | Ghassabian | Jun 2009 | A1 |
20090177736 | Christensen et al. | Jul 2009 | A1 |
20090215387 | Brennan et al. | Aug 2009 | A1 |
20090215466 | Ahl et al. | Aug 2009 | A1 |
20090238386 | Usher et al. | Sep 2009 | A1 |
20090253423 | Kullberg | Oct 2009 | A1 |
20090255917 | Feichko et al. | Oct 2009 | A1 |
20090264161 | Usher et al. | Oct 2009 | A1 |
20090316529 | Huuskonen et al. | Dec 2009 | A1 |
20100004004 | Browne-Swinburne et al. | Jan 2010 | A1 |
20100009626 | Farley | Jan 2010 | A1 |
20100010740 | Nachman et al. | Jan 2010 | A1 |
20100035596 | Nachman et al. | Feb 2010 | A1 |
20100035632 | Catten | Feb 2010 | A1 |
20100039224 | Okude et al. | Feb 2010 | A1 |
20100062788 | Nagorniak | Mar 2010 | A1 |
20100082820 | Furukawa | Apr 2010 | A1 |
20100113073 | Schlesener et al. | May 2010 | A1 |
20100131304 | Collopy et al. | May 2010 | A1 |
20100164836 | Liberatore | Jul 2010 | A1 |
20100199176 | Chronqvist | Aug 2010 | A1 |
20100236924 | Chapples et al. | Sep 2010 | A1 |
20100251804 | Morley et al. | Oct 2010 | A1 |
20100269566 | Carroll et al. | Oct 2010 | A1 |
20100279626 | Bradley et al. | Nov 2010 | A1 |
20100297929 | Harris | Nov 2010 | A1 |
20100306309 | Santori et al. | Dec 2010 | A1 |
20100311345 | Santori et al. | Dec 2010 | A1 |
20100317420 | Hoffberg | Dec 2010 | A1 |
20100322293 | Rhodes et al. | Dec 2010 | A1 |
20100331051 | Kim et al. | Dec 2010 | A1 |
20100332226 | Lee et al. | Dec 2010 | A1 |
20110009107 | Guba et al. | Jan 2011 | A1 |
20110015934 | Rowe et al. | Jan 2011 | A1 |
20110029869 | McLennan | Feb 2011 | A1 |
20110032096 | Miller et al. | Feb 2011 | A1 |
20110045813 | Choi | Feb 2011 | A1 |
20110045839 | Chao | Feb 2011 | A1 |
20110063098 | Fischer et al. | Mar 2011 | A1 |
20110065375 | Bradley | Mar 2011 | A1 |
20110079073 | Keays | Apr 2011 | A1 |
20110084807 | Logan et al. | Apr 2011 | A1 |
20110086668 | Patel | Apr 2011 | A1 |
20110093474 | Etchegoyen | Apr 2011 | A1 |
20110102160 | Heubel et al. | May 2011 | A1 |
20110105084 | Chandrasekaran | May 2011 | A1 |
20110111724 | Baptiste | May 2011 | A1 |
20110133919 | Evarts et al. | Jun 2011 | A1 |
20110143786 | Fan et al. | Jun 2011 | A1 |
20110153120 | Katou | Jun 2011 | A1 |
20110153742 | Sloop et al. | Jun 2011 | A1 |
20110175930 | Hwang et al. | Jul 2011 | A1 |
20110187646 | Mahmoud | Aug 2011 | A1 |
20110207441 | Wood | Aug 2011 | A1 |
20110212737 | Isidore | Sep 2011 | A1 |
20110219080 | McWhithey et al. | Sep 2011 | A1 |
20110230165 | Kleve et al. | Sep 2011 | A1 |
20110263293 | Blake et al. | Oct 2011 | A1 |
20110288764 | Sathish et al. | Nov 2011 | A1 |
20110304446 | Basson et al. | Dec 2011 | A1 |
20110304465 | Boult et al. | Dec 2011 | A1 |
20110306304 | Forutanpour et al. | Dec 2011 | A1 |
20120004933 | Foladare et al. | Jan 2012 | A1 |
20120032876 | Tabe | Feb 2012 | A1 |
20120034954 | Tabe | Feb 2012 | A1 |
20120035923 | Krause | Feb 2012 | A1 |
20120052854 | DiMeo et al. | Mar 2012 | A1 |
20120064924 | Schapsis et al. | Mar 2012 | A1 |
20120066638 | Ohri | Mar 2012 | A1 |
20120109451 | Tan | May 2012 | A1 |
20120110126 | Sparks | May 2012 | A1 |
20120119936 | Miller et al. | May 2012 | A1 |
20120122525 | Miller et al. | May 2012 | A1 |
20120136503 | Schunder | May 2012 | A1 |
20120136529 | Curtis et al. | May 2012 | A1 |
20120140147 | Satoh et al. | Jun 2012 | A1 |
20120157069 | Elliott et al. | Jun 2012 | A1 |
20120176237 | Tabe et al. | Jul 2012 | A1 |
20120228047 | White et al. | Sep 2012 | A1 |
20120236136 | Boddy | Sep 2012 | A1 |
20120244883 | Tibbitts et al. | Sep 2012 | A1 |
20120265535 | Bryant-Rich et al. | Oct 2012 | A1 |
20120283894 | Naboulsi | Nov 2012 | A1 |
20120284659 | De Leon | Nov 2012 | A1 |
20130046562 | Taylor et al. | Feb 2013 | A1 |
20130084847 | Tibbitts et al. | Apr 2013 | A1 |
20130316737 | Guba et al. | Nov 2013 | A1 |
20130336094 | Gruteser et al. | Dec 2013 | A1 |
20140179356 | Hannon | Jun 2014 | A1 |
20140335902 | Guba et al. | Nov 2014 | A1 |
20140357192 | Azogiu et al. | Dec 2014 | A1 |
20150043309 | Calvarese | Feb 2015 | A1 |
20150062091 | Li | Mar 2015 | A1 |
20150113175 | Brezezinski et al. | Apr 2015 | A1 |
20150139058 | Xia | May 2015 | A1 |
20150149042 | Cooper | May 2015 | A1 |
20160066013 | Li et al. | Mar 2016 | A1 |
20160073324 | Guba et al. | Mar 2016 | A1 |
20160353251 | Yang | Dec 2016 | A1 |
20170075740 | Guba et al. | Mar 2017 | A1 |
20170078948 | Breaux et al. | Mar 2017 | A1 |
20170322287 | Benbouhout et al. | Nov 2017 | A1 |
20180069438 | Bit-Babik et al. | Mar 2018 | A1 |
20180164398 | Olsen et al. | Jun 2018 | A1 |
20180252796 | Qu et al. | Sep 2018 | A1 |
20180370360 | Hannon | Dec 2018 | A1 |
20190025402 | Qu et al. | Jan 2019 | A1 |
Number | Date | Country |
---|---|---|
201224324 | Apr 2009 | CN |
101554835 | Oct 2009 | CN |
201347000 | Nov 2009 | CN |
101808273 | Aug 2010 | CN |
201792751 | Apr 2011 | CN |
102256206 | Nov 2011 | CN |
2428028 | Mar 2012 | EP |
2708910 | Mar 2014 | EP |
2428028 | Jul 2014 | EP |
2995006 | Mar 2016 | EP |
2995006 | Jan 2017 | EP |
1401318 | Jul 1975 | GB |
H10200961 | Jul 1998 | JP |
2000230900 | Aug 2000 | JP |
2002335584 | Apr 2002 | JP |
2004249847 | Sep 2004 | JP |
2007106277 | Apr 2007 | JP |
4034813 | Jan 2008 | JP |
2008137624 | Jun 2008 | JP |
2008160715 | Jul 2008 | JP |
4351286 | Oct 2009 | JP |
2009284442 | Dec 2009 | JP |
2013219678 | Oct 2013 | JP |
10199800440012 | Sep 1998 | KR |
1019990043676 | Jun 1999 | KR |
20000001005 | Jan 2000 | KR |
201239384 | Oct 2012 | TW |
200108328 | Feb 2001 | WO |
2002012883 | Feb 2002 | WO |
2004018249 | Mar 2004 | WO |
2009014703 | Jan 2009 | WO |
2010129939 | Nov 2010 | WO |
2014182971 | Nov 2014 | WO |
2014182971 | Nov 2014 | WO |
2015070064 | May 2015 | WO |
2016210181 | Dec 2016 | WO |
Entry |
---|
AlcoMate Premium AL7000 Breathalyzer Product Specifications, http://alcomate.net/index.php/model-al7000.html, Jun. 16, 2011. |
Breathalyzer—Wikipedia, the free encyclopedia, http://en.wikipedia.org/wiki/Breathalyzer, Jun. 16, 2011. |
Bluetooth SIG, Bluetooth Specification Version 4.0 [vol. 0]. |
International Search Report & Written Opinion for corresponding PCT Application No. PCT/US2016/042305 dated Oct. 19, 2016. |
Supplemental Search Report for corresponding EP Application No. EP17739027 dated Jul. 31, 2019. |
Partial Supplemental European Search Report for corresponding EP Application No. 16825187.4 dated Jun. 12, 2019. |
How Stuff Works: How Breathalyzers Work, Jun. 16, 2011. |
Swerdlow, Alexej et al., "Speaker Position Estimation in Vehicles by Means of Acoustic Analysis," Fortschritte der Akustik: DAGA 2008, Mar. 2008, Dresden, Germany. |
Yang, et al., “Detecting Driver Phone Use Leveraging Car Speakers,” MobiCom'11, Sep. 19-23, 2011, Las Vegas, Nevada, USA, 12 pages. |
Number | Date | Country |
---|---|---|
20190199850 A1 | Jun 2019 | US |
Number | Date | Country |
---|---|---|
62192354 | Jul 2015 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15210649 | Jul 2016 | US |
Child | 16225376 | US |