Embodiments are generally related to the field of electronic mobile wireless devices (e.g., smartphones, smartwatches, tablet computing devices, laptop computers, etc.). Embodiments also relate to connected car technology and the deployment of electronic mobile wireless devices in the context of automotive vehicles. Embodiments are additionally related to the detection of external and internal conditions with respect to a vehicle and the notification of alerts to a user regarding such conditions.
The availability of on-board electronics and in-vehicle information systems has accelerated the development of more intelligent vehicles. One possibility is to automatically monitor a driver's driving performance to prevent potential risks. Although protocols to measure a driver's workload have been developed by both government agencies and the automobile industry, they have been criticized as being too costly and difficult to obtain. In addition, existing uniform heuristics for driving risk prevention do not account for changes in individual driving environments. Hence, technologies for understanding a driver's frustrations to prevent potential driving risks have been listed by many international automobile companies as one of the key research areas for realizing intelligent transportation systems.
Additionally, there is a need to monitor not only a driver's activity (e.g., driver inattention) but also the driver's activity with respect to certain conditions, which may be external to the vehicle or may occur within the vehicle. For example, a driver may be approaching a red light, or the driver may be stopped at a red light and looking at a cell phone or other distraction instead of being attentive to the traffic light.
The following summary is provided to facilitate an understanding of some of the innovative features unique to the disclosed embodiments and is not intended to be a full description. A full appreciation of the various aspects of the embodiments disclosed herein can be gained by taking the entire specification, claims, drawings, and abstract as a whole.
It is, therefore, one aspect of the disclosed embodiments to provide for an improved method and system for alerting a driver of a vehicle of a change in condition external to or within the vehicle.
It is another aspect of the disclosed embodiments to provide a method and system for issuing alerts to a vehicle driver and/or passenger via wireless communications.
The aforementioned aspects and other objectives and advantages can now be achieved as described herein. Methods and systems are disclosed for alerting a vehicle driver via wireless communications. Such an approach can include steps or operations for monitoring one or more conditions with respect to a vehicle, detecting a change in the condition (or conditions), automatically transmitting a signal wirelessly to a computing device, wherein the signal is indicative of the change in condition(s), and alerting the driver of the change in condition(s) in response to transmitting the signal to the computing device (e.g., a tablet computing device, smartphone, smartwatch, etc.).
In some embodiments, the step or operation of alerting the driver to the change in condition(s) can further include a step or operation of providing an audible alert via a speaker associated with the computing device, wherein the audible alert is indicative of the change in condition(s). In another embodiment, the step or operation of alerting the driver of the change in condition(s) can further include steps or operations for establishing a wireless connection between the computing device and a radio system of the vehicle, and providing an audible alert from or via the computing device indicative of the change in condition via the radio system. In another embodiment, the step or operation of alerting the driver of the change in condition(s) can further include or involve a step or operation for alerting the driver of the change in the condition by a text message displayable through the computing device.
In another embodiment, the step or operation of monitoring the condition(s) with respect to the vehicle can further involve monitoring of the condition with one or more cameras (e.g., a video camera, high-definition video camera, etc.). In some embodiments, one or more of such cameras can be a 360 degree video camera. In other embodiments, monitoring the condition(s) with respect to the vehicle may also involve monitoring the condition(s) with one or more sensors, either by themselves or in association with the aforementioned video camera(s). In another embodiment, the step or operation of monitoring the condition(s) with respect to the vehicle can involve a step or operation of analyzing video data from the video camera(s) (e.g., an HD video camera, a 360 degree video camera, etc.) utilizing anomaly detection (e.g., an anomaly detection module, anomaly detection mechanism, etc.). In yet another embodiment, the step or operation of monitoring the condition(s) can further involve a step or operation for analyzing video data captured by the video camera(s) utilizing machine learning. In still another embodiment, the step or operation of monitoring the condition(s) with respect to the vehicle can further involve or include a step or operation for analyzing video data obtained from the video camera(s) utilizing location data (e.g., GPS data, beacon data) from a location module (e.g., GPS module, beacon module, etc.) associated with the computing device. The location data can be employed to cross-reference location identification information, for example, with respect to items or objects identified from the video data obtained from the video camera(s).
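The video-analysis and cross-referencing steps described above can be sketched in simplified form. The following Python fragment is a minimal illustration only: all names and thresholds are hypothetical, and an actual embodiment would operate on camera frames rather than small lists of pixel values. It scores frame-to-frame change as a stand-in for anomaly detection and tags each detected change with location data from a GPS-style module:

```python
def frame_difference(prev_frame, frame):
    """Mean absolute pixel difference between two grayscale frames."""
    diffs = [abs(a - b) for a, b in zip(prev_frame, frame)]
    return sum(diffs) / len(diffs)

def monitor(frames, location, threshold=30.0):
    """Return (frame_index, location) pairs for frames whose change
    score exceeds the threshold -- a stand-in for anomaly detection,
    cross-referenced with location data."""
    alerts = []
    for i in range(1, len(frames)):
        if frame_difference(frames[i - 1], frames[i]) > threshold:
            alerts.append((i, location))
    return alerts

# Simulated frames: a large change (e.g., a light turning) at frame 2.
frames = [[10, 10, 10], [12, 11, 10], [200, 190, 180]]
alerts = monitor(frames, location=(29.76, -95.36))
```

In a full implementation the change score would be replaced by the anomaly detection or machine learning techniques discussed herein.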
In another embodiment, a method can be implemented for alerting a vehicle driver via wireless communications, wherein such a method includes steps or operations for monitoring condition(s) external to a vehicle while the vehicle is in operation and a driver of the vehicle is located in a driver seat of the vehicle; detecting a change in the condition(s); transmitting a signal wirelessly to a computing device, wherein the signal is indicative of a change in the condition(s); and alerting the driver regarding the change in condition(s), after transmission of the signal to the computing device.
In some embodiments, the step of alerting the driver of the change in the condition(s) after transmission of the signal to the computing device, can further involve a step or operation of providing or issuing an audible alert via a speaker associated with the computing device, wherein the audible alert is indicative of the change in condition(s).
In another embodiment, the step or operation of alerting the driver of the change in condition(s) after transmission of the signal to the computing device, can further include steps or operations for establishing a wireless connection between the computing device and a radio system of the vehicle; and issuing or providing an audible alert from the computing device which is issued (e.g., played or broadcast) through the vehicle's radio system, wherein the audible alert is indicative of the change in condition. In such a situation the radio system is in communication with the computing device. The computing device may be a separate wireless mobile electronic device such as a smartphone, smartwatch or tablet computing device associated with the driver and/or a passenger (or passengers) in the vehicle, or the computing device may be a computing system integrated with the vehicle and which communicates electronically with the radio system. In some embodiments, the “vehicle in operation” may involve the vehicle in motion or the vehicle temporarily stopped (e.g., at an intersection, dropping or picking up a passenger, etc.). In another embodiment, the step or operation of alerting the driver of the change in condition(s), after transmission of the signal to the computing device, can further involve a step or operation of providing an audible alert via a speaker associated with the computing device, the audible alert indicative of the change in condition(s).
In yet another embodiment, the step or operation of alerting the driver of the change in the condition(s) after transmission of the signal to the computing device, can further involve steps or logical operations for establishing a wireless connection between the computing device and a radio system of the vehicle; and providing an audible alert from the computing device indicative of the change in the condition(s) via the radio system. In still another embodiment, the step or operation of monitoring a condition external to the vehicle while the vehicle is in operation and the driver of the vehicle is located in the driver seat of the vehicle, can further include a step or operation of monitoring the condition(s) with a camera (e.g., video camera, HD camera, 360 degree video camera, etc.) that communicates with the computing device. In some embodiments, the video camera may be integrated with the computing device (e.g., a tablet computing device video camera, smartphone video camera, etc.), while in other embodiments, the camera may be a standalone camera positioned within the vehicle to monitor the condition(s) and which communicates via a wireless connection with the computing device (e.g., tablet computing device, smartphone, smartwatch, integrated in-vehicle computing system, etc.).
In still another embodiment, a step or operation can be provided for tracking and recording in a memory of a computing system (e.g., a server, an in-vehicle computing system, a tablet computing device, a smartphone, a smartwatch, laptop computer, desktop computer, etc.), data indicative of the number of times the driver is alerted to changes in conditions. In another embodiment, steps or operations can be implemented for periodically retrieving such tracked and recorded data from the memory, and transmitting the data wirelessly from the computing system to a central repository (e.g., a server) for further storage and analysis. Note that in some embodiments, the condition(s) external to the vehicle may be a stop light condition (e.g., traffic light) and the change in the condition(s) involves a change from one stop light color to another stop light color (e.g., red to green, green to red, green to yellow, etc.).
In another embodiment the step or operation of transmitting the signal wirelessly to the computing device, wherein the signal is indicative of the change in condition(s), can further involve a step or operation for transmitting the signal wirelessly through a PAN (Personal Area Network). In some embodiments, the PAN may be a network enabled for Bluetooth wireless communications, induction wireless communications, infrared wireless communications, ultra-wideband wireless communications and/or ZigBee wireless communications. In some embodiments, the wireless connection between the computing device and the radio system can be established via Secure Simple Pairing (SSP).
In another embodiment, the step or operation of detecting the change in the condition(s) can involve utilizing anomaly detection or machine learning approaches for detecting the change in the condition(s).
In yet another embodiment, steps or operations can be provided for determining if the vehicle is no longer in operation; monitoring the condition(s) within the vehicle, in response to determining that the vehicle is no longer in operation; determining if the condition(s) within the vehicle comprise an anomalous condition; and wirelessly transmitting an alert to a computing device associated with the user, the alert indicative of the anomalous condition, if it is determined that the condition(s) comprise the anomalous condition (i.e., if it is determined that the condition or conditions are anomalous). Note that in some embodiments, the alert can be wirelessly transmitted as a text message to the computing device via a wireless network, wherein the text message is displayed via a display screen associated with the computing device.
In another embodiment, a system for alerting a vehicle driver via wireless communications can be implemented. Such a system can include, for example, a video camera (i.e., one or more video cameras), one or more processors which communicate with and process video data captured by the video camera, and a computer-usable medium embodying computer program code, wherein the computer-usable medium is capable of communicating with the processor(s). The computer program code can include instructions executable by the processor(s) and configured, for example, for: monitoring condition(s) with respect to the vehicle with the video camera, detecting a change in the condition(s) monitored via the video camera, transmitting a signal wirelessly to a computing device, the signal indicative of the change in condition(s), and alerting the driver of the change in condition(s) in response to transmitting the signal to the computing device. As indicated previously, the computing device may be, for example, a mobile electronic device such as a smartphone, tablet computing device, laptop computer, and so on. In some embodiments, the computing device may be an in-vehicle computing system that communicates wirelessly with other computing devices such as the driver's and/or a passenger's smartphone, tablet computing device, etc.
In some embodiments, the instructions for alerting the driver of the change in condition(s) can be further configured for providing an audible alert via a speaker associated with the computing device, wherein the audible alert is indicative of the change in condition(s). In yet another embodiment, the instructions for alerting the driver of the change in the condition(s) can be further configured for establishing a wireless connection between the computing device and a radio system of the vehicle; and providing an audible alert from the computing device indicative of the change in the condition(s) via the radio system. In another embodiment, the instructions for alerting the driver of the change in the condition(s) can be further configured for alerting the driver of the change in the condition(s) by a text message displayable through the computing device and/or played or broadcast as a voice alert through the computing device. As indicated previously, the camera(s) may be, for example, a video camera, HD video camera, 360 degree video camera, and so on. In some embodiments, one or more sensors (e.g., temperature sensor, pressure sensor, velocity sensor, acceleration sensor, vehicle heading sensor, etc.) may be employed for use in monitoring the conditions external to or within the vehicle.
In some embodiments, the instructions for monitoring the condition(s) can be further configured for analyzing the video data captured from the video camera utilizing anomaly detection or machine learning techniques. In still other embodiments, the instructions for monitoring the condition(s) can be configured for analyzing the video data captured from the 360 degree video camera utilizing, for example, location data from a location or locating module (e.g., beacon module, GPS module, etc.) associated with the computing device and in association with the captured video data.
The accompanying figures, in which like reference numerals refer to identical or functionally-similar elements throughout the separate views and which are incorporated in and form a part of the specification, further illustrate the present invention and, together with the detailed description of the invention, serve to explain the principles of the present invention.
The particular values and configurations discussed in these non-limiting examples can be varied and are cited merely to illustrate one or more embodiments and are not intended to limit the scope thereof.
The alert 24 may be transmitted wirelessly to, for example, the user's mobile electronic wireless computing device 21 (e.g., a smartphone, tablet computing device, smartwatch, wearable device, etc.) or to a device or system integrated with the vehicle. Note that electronic devices such as smartphones, smartwatches, personal digital assistants (PDAs), mobile telephones, tablet devices, laptops, and other Internet connectivity devices (“mobile stations”) provide users with mobile access to various information resources. Such mobile stations generally operate via a cellular 3G or 4G broadband data communication standard and/or a WiFi network connection to a local area network.
In the example of
A non-limiting example of a wearable device such as a smartwatch, which can be utilized as computing device 21 is depicted in U.S. Pat. No. 8,854,925 entitled “Smart Watch and Control Method for the Same,” which issued on Oct. 7, 2014 and is incorporated herein by reference. Another non-limiting example of a smartwatch that can be adapted for use as computing device 21 is disclosed in U.S. Pat. No. 8,279,716 entitled “Smart-Watch Including Flip-Up Display,” which issued on Oct. 2, 2012 and is incorporated herein by reference. Note that the terms “smart watch,” “smartwatch,” and “smart-watch” can be utilized interchangeably to refer to the same type of device.
Another example of a wearable device that can be implemented as computing device 21 is an OHMD (Optical Head-Mounted Display), which can be equipped with a video camera. An OHMD is a wearable display that has the capability of reflecting projected images as well as allowing the user to see through it, that is, augmented reality.
The computing device 21 can incorporate a video camera 19. In some embodiments, the example computing device 21 with camera 19 may be implemented in the context of, for example, a smartphone or tablet computing device located or mounted on the vehicle dashboard or elsewhere within the vehicle (or located on the vehicle external to the passenger compartment). The alert 24 may be broadcast as an audio alert or text alert message through the computing device 21. In some embodiments, the alert can be transmitted in the context of a voice alert, which is discussed further herein.
In another embodiment, the camera 19 may be implemented as a standalone camera that communicates wirelessly with the computing device 21 via wireless communications as described in greater detail herein.
Camera 19 may also be implemented as, for example, a so-called dashcam or dash cam. A dashcam (dashboard camera) is an onboard camera that attaches to the vehicle's interior windscreen by either a supplied suction cup mount or an adhesive-tape mount. It can also be positioned on top of the dashboard or attached to the rear-view mirror with a special mount. It continuously records the road ahead while the vehicle is in motion. Various types of dashcam can be implemented as camera 19, ranging from basic video cameras to those capable of recording parameters such as date/time, speed, G-forces, and location. In some embodiments, camera 19 may be implemented as a wearable video camera that monitors conditions external to the vehicle or within the vehicle. Such a video camera may be, for example, a lapel camera worn by the vehicle driver and/or a passenger.
Note that such a monitoring step or logical operation may involve monitoring the condition with a camera that communicates with the computing device. The camera may be integrated with the computing device (e.g., a smartphone or tablet computer). In other embodiments, such a camera may be a standalone camera positioned within the vehicle to monitor the condition, and the camera may also communicate via a wireless connection (e.g., Bluetooth or other wireless communication as discussed in greater detail herein) with the computing device.
Monitoring can involve the use of object recognition or other video image recognition techniques and systems. For example, in one embodiment a traffic recognition approach can be utilized as part of the video monitoring operation. One example of a traffic object recognition approach that can be adapted for use in accordance with an embodiment is disclosed in U.S. Patent Application Publication No. 2011/0184895 entitled “Traffic Object Recognition System, Method for Recognizing a Traffic Object, and Method for Setting up a Traffic Object Recognition System,” which was published to Janssen on Jul. 28, 2011 and is incorporated herein by reference in its entirety. Another object recognition approach that can be adapted for use in accordance with an alternative embodiment is disclosed in U.S. Pat. No. 8,447,139 entitled “Object recognition using Haar features and histograms of oriented gradients,” which issued on May 21, 2013 and is incorporated herein by reference in its entirety.
Next, as illustrated at decision block 28, a test can be performed to determine if a change has been detected in the monitored condition (or conditions). If a change is detected, then as disclosed at block 30, a step or logical operation can be implemented for transmitting a signal wirelessly to a computing device, wherein such a signal is indicative of the change in the condition(s) monitored. Thereafter, as shown at block 32, a step or logical operation can be implemented to alert the driver of the change in the condition after transmission of the signal to the computing device.
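The decision flow of blocks 28, 30, and 32 can be illustrated with a short sketch. In the hypothetical Python fragment below, the `send_signal` and `alert_driver` callbacks stand in for the wireless transmission and the driver alert, respectively; the function returns True only when a change is detected:

```python
def handle_condition_update(previous, current, send_signal, alert_driver):
    """Sketch of decision block 28 (change detected?) followed by
    blocks 30 (transmit signal) and 32 (alert driver)."""
    if current == previous:
        return False  # no change detected; nothing to transmit
    send_signal({"from": previous, "to": current})
    alert_driver(f"Condition changed: {previous} -> {current}")
    return True

# Capture the transmitted signals and alerts with simple list stubs.
signals, alerts = [], []
handle_condition_update("red", "green", signals.append, alerts.append)
handle_condition_update("green", "green", signals.append, alerts.append)
```

Only the first call produces a signal and an alert, because the second call represents an unchanged condition.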
It can be appreciated that the vehicle in operation may be, for example, temporarily stopped (e.g., at an intersection/stop light, a parking lot, in traffic, etc.) or in motion. In some implementations, the computing device that receives and plays the alert (e.g., an audio signal or voice announcement) may be, for example, a smartphone or a tablet computing device. In other embodiments, the computing device may be integrated with the vehicle as part of an in-vehicle system that provides alerts and other information (e.g., GPS information) to the vehicle's occupants. Such a system typically includes a dashboard display. One example of a non-limiting in-vehicle system that can be adapted for use in accordance with an alternative embodiment is disclosed in U.S. Patent Application Publication No. 2011/0034128 entitled “Mobile Communication Device Linked to In-Vehicle System,” which published on Feb. 10, 2011 and is incorporated herein by reference in its entirety. Yet another example of a non-limiting in-vehicle system that can be adapted for use in accordance with an alternative embodiment is disclosed in U.S. Pat. No. 8,417,211 entitled “In-Vehicle System (IVS) Control of Emergency Data Communications,” which issued on Apr. 9, 2013 and is incorporated herein by reference in its entirety.
It can also be appreciated that, in the context of a tablet or smartphone implementation, the computing device may not necessarily belong to the vehicle driver but may, for example, be a computing device (e.g., hand held wireless electronic device, smartphone, tablet, etc.) belonging to passengers.
As depicted at block 68, a test is performed to determine if the condition(s) are anomalous. Anomalous conditions may include one of a variety of possible conditions. For example, an anomalous condition may be a change in temperature in the vehicle. Another anomalous condition may be, for example, the presence of someone in the vehicle who normally would not still be in the vehicle after the car is turned off or, for example, after the vehicle doors are closed and/or locked. If such an anomalous condition is detected, then as indicated at block 70, an alert may be wirelessly transmitted to a computing device associated with a user (e.g., the vehicle driver, a passenger, etc.) indicating such an anomalous condition. The process can then terminate, as depicted at block 72. Note that in some embodiments, such an alert may be wirelessly transmitted as a text message to the computing device via a wireless network. Such a wireless network can be, for example, a cellular telephone network and/or a WiFi network.
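The anomaly test of block 68 might be approximated by a simple rule set such as the following sketch. The rules, field names, and thresholds here are illustrative assumptions, not requirements of the embodiments; a deployed system could use the anomaly detection techniques discussed elsewhere herein:

```python
def anomalous_reasons(conditions):
    """Hypothetical rule set for block 68: return the reasons the
    in-vehicle state is anomalous after the vehicle is shut off."""
    reasons = []
    # An occupant present after the doors are locked is unexpected.
    if conditions.get("occupant_detected") and conditions.get("doors_locked"):
        reasons.append("occupant present after doors locked")
    # A large rise over the baseline cabin temperature is unexpected.
    temp = conditions.get("cabin_temp_c", 20.0)
    baseline = conditions.get("baseline_temp_c", 20.0)
    if temp - baseline > 15.0:
        reasons.append("unexpected cabin temperature change")
    return reasons

state = {"occupant_detected": True, "doors_locked": True}
reasons = anomalous_reasons(state)
```

If the returned list is non-empty, the alert of block 70 would be transmitted; otherwise the process proceeds to block 72.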
A text message alert can be implemented via for example, Short Message Service (SMS), SkyMail, Short Mail, Instant Messaging (IM), chat, Mobile Instant Messaging (MiM), Multimedia Messaging Service (MMS), and other messaging services. Text messaging is supported by computer devices such as laptop computers, desktop computers, handheld computers, and wireless devices such as cellular telephones, Wireless Local Area Network (WLAN) terminals, Wireless Wide Area Network (WWAN) terminals, and Wireless Personal Area Network (WPAN) terminals, for example.
Typically, a text message server serves as an intermediate device for receiving a text message from a source device, storing the text message, and forwarding the text message to a recipient device, e.g., a first cell phone as a source device and a second cell phone as a recipient device. While some text message service providers charge for text message support, e.g., cellular networks, other text message service providers support text messaging without charge. Various protocols such as SS7, GSM MAP, or TCP/IP, for example, may be employed to support text messaging.
In some embodiments, the alert regarding a change in condition can be implemented in the context of a notification service. In one example, the text message may be sent as a push notification across a cellular or wireless communication network to the computing device. Certain text messaging protocols may be used, such as mobile short message service (SMS), multimedia message service (MMS), and instant messaging (IM), or any other related text application. The communication medium may include transferring information over communication links, such as wireless networks (e.g., GSM, CDMA, 3G, 4G, etc.), wireline networks (e.g., landline telephony), Internet, satellite/cable networks, or any other data medium using standard communication protocols.
An example of a notification service that can be adapted for use with an alternative embodiment is disclosed in U.S. Pat. No. 8,751,602 entitled “Method and Apparatus of Providing Notification Services to Smartphone Devices,” which issued on Jun. 10, 2014 and is incorporated herein by reference in its entirety. Another non-limiting example of a system that can be adapted for use in accordance with an alternative embodiment for delivery of an alert regarding a change in condition is disclosed in U.S. Pat. No. 8,265,938 entitled “Voice Alert Methods, Systems and Processor-Readable Media,” which issued on Sep. 11, 2012 and is incorporated herein by reference in its entirety.
An example of a situation where the method 60 would be useful is the case of a child accidentally being left in a vehicle during hot weather. A camera in the vehicle operating via battery power or residual electricity from the vehicle electrical system may detect an anomaly such as the child in a rear car seat. The anomaly in this case would be the presence (e.g., detection of the child moving, turning his or her head, moving his or her arms, legs, etc.) of a child in a car seat, wherein a child would not normally be in the car seat after the car is no longer in operation and/or after the doors are closed/locked, and/or after a particular amount of time (e.g., 5 minutes, 10 minutes, etc.). Note that a cellular network or cellular link or service such as OnStar can be utilized for sending out an alert (e.g., a text message, an audio alert, etc.) to let a parent or guardian know that a child may have been left in a car seat.
Audio sensors may also be employed to detect, for example, the sound of a crying child. A temperature sensor could also be utilized to detect a rise in temperature to an unsafe level for humans and when that temperature threshold is met, the alert is transmitted wirelessly to the user's hand held device (e.g., smartphone, tablet, smartwatch or other wearable device, etc.). Such an approach could thus be utilized to prevent tragic and unnecessary deaths in automobiles due to heatstroke.
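A minimal sketch of such a sensor-driven alert follows. The 40 °C safety threshold and the function and message names are hypothetical assumptions for illustration; an actual embodiment would select a threshold appropriate to the application and transmit the alert over the wireless networks discussed herein:

```python
SAFE_MAX_CABIN_TEMP_C = 40.0  # assumed threshold, not from the source

def check_cabin_sensors(temp_c, crying_detected, transmit_alert):
    """Transmit an alert when the temperature sensor crosses the
    safety threshold or the audio sensor detects a crying child."""
    if temp_c >= SAFE_MAX_CABIN_TEMP_C:
        transmit_alert("cabin temperature unsafe: %.1f C" % temp_c)
    if crying_detected:
        transmit_alert("audio sensor detected a crying child")

# Capture transmitted alerts with a simple list stub.
sent = []
check_cabin_sensors(43.5, True, sent.append)
```

Both conditions fire in this example, so two alerts are transmitted to the user's device.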
Note that the step or logical operation of anomaly detection or outlier detection shown in block 68 can involve the identification of items, events or observations which may not conform to an expected pattern or other items in a dataset. Anomalies are also referred to as outliers, novelties, noise, deviations and exceptions. In the context of abuse and network intrusion detection, “interesting” objects are often not rare objects, but unexpected bursts in activity. This pattern does not adhere to the common statistical definition of an outlier as a rare object, and many outlier detection methods (in particular unsupervised methods) will fail on such data, unless it has been aggregated appropriately. Instead, a cluster analysis algorithm may be able to detect the micro clusters formed by these patterns.
The anomaly detection operation shown at block 68 can preferably be implemented by an anomaly detection mechanism based on a number of possible categories of anomaly detection including, but not limited to, unsupervised anomaly detection, supervised anomaly detection, semi-supervised anomaly detection, etc. An unsupervised anomaly detection technique can be employed to detect anomalies in an unlabeled test data set under the assumption that the majority of the instances in the data set are normal, by looking for instances that seem to fit least to the remainder of the data set. Alternatively, a supervised anomaly detection technique may be employed, which requires a data set that has been labeled as “normal” and “abnormal” and involves training a classifier (the key difference to many other statistical classification problems is the inherent unbalanced nature of outlier detection). Semi-supervised anomaly detection techniques may also be employed, which construct a model representing normal behavior from a given normal training data set, and then test the likelihood that a test instance was generated by the learned model.
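As one concrete illustration of unsupervised anomaly detection, the sketch below flags readings that lie far from the bulk of an unlabeled data set using a z-score. The z-score approach and the 2.0 threshold are illustrative assumptions; any of the unsupervised techniques described above could serve in their place:

```python
from statistics import mean, stdev

def unsupervised_outliers(values, z_threshold=2.0):
    """Flag values whose distance from the mean exceeds z_threshold
    standard deviations -- a simple unsupervised outlier test that
    assumes most instances in the data set are normal."""
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if abs(v - mu) / sigma > z_threshold]

# Six ordinary sensor readings and one anomalous spike.
readings = [10, 11, 10, 12, 11, 10, 50]
outliers = unsupervised_outliers(readings)
```

Here the spike of 50 is flagged while the ordinary readings are not; no labels were required, consistent with the unsupervised category.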
Examples of sensors that can be utilized to implement sensor(s) 83 are sensors such as temperature sensors, pressure sensors, velocity sensors, acceleration sensors, vehicle heading sensors, yaw-rate sensors, and so on. One example of a vehicle heading sensor approach that can be adapted for use as or with sensor(s) 83 in accordance with an alternative embodiment, is disclosed in U.S. Pat. No. 7,957,897 entitled “GPS-based in-vehicle sensor calibration algorithm,” which issued on Jun. 7, 2011 and is incorporated herein by reference in its entirety. The GPS module discussed herein can be utilized in association with such sensors to provide location or position data with respect to the vehicle and also provide vehicle heading sensor data.
Note that in some embodiments, the computing device 84 can communicate with the vehicle radio system via wireless communications established via Secure Simple Pairing (SSP). The sensor(s) 83 and the camera(s) 82 and the computing device 84 may also in some embodiments communicate with module 72 via SSP. SSP, which requires less user interaction, utilizes a one-time six-digit key displayed at the time of pairing on both the device and the car, replacing the PIN code. Once the user confirms that the keys match, the two devices can be paired.
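The user-facing portion of this pairing step can be illustrated schematically. The sketch below models only the generation and comparison of the one-time six-digit key that both the device and the car display; it is a hypothetical illustration and deliberately omits the underlying cryptographic exchange that the Bluetooth SSP protocol performs:

```python
import secrets

def generate_pairing_key():
    """Produce a one-time six-digit key, as displayed on both the
    device and the car during pairing (illustration only)."""
    return f"{secrets.randbelow(1_000_000):06d}"

def keys_match(device_key, car_key):
    """The user confirms pairing only if both displays agree."""
    return device_key == car_key

key = generate_pairing_key()
```

Once `keys_match` is confirmed by the user, the two devices would proceed to complete the pairing.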
The monitoring module 74 can implement the monitoring steps or operations discussed previously. For example, monitoring module 74 can monitor traffic lights or other conditions (i.e., conditions external to the vehicle or within the vehicle) facilitated by, for example, camera(s) 82 and/or sensor(s) 83. The monitoring module 74 can be, for example, an anomaly detection mechanism that detects changes in conditions as discussed previously.
The alerting module 76 serves to alert the driver of the detected change in a condition. The alert (e.g., an audio alert, a text message, etc.) can be broadcast through, for example, the computing device 84 or the vehicle radio system 86 (assuming the vehicle radio system 86 is paired with the computing device 84). The tracking module 78 and the recording module 80 function to respectively track and record in a memory of a computing system (e.g., the computing device 84, an onboard computing system, etc.) data indicative of, for example, the number of times the driver is alerted to changes in conditions. Such data can be retrieved from the computer memory and then transmitted to, for example, a central repository for further storage and analysis.
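The tracking and recording functions of modules 78 and 80 can be sketched as a simple per-condition counter with periodic retrieval for transmission to the central repository. The class and method names below are hypothetical:

```python
class AlertTracker:
    """Sketch of the tracking/recording modules: counts how many
    times the driver is alerted, keyed by condition type, and
    supports periodic retrieval for upload to a central repository."""

    def __init__(self):
        self._counts = {}

    def record(self, condition):
        """Record one alert for the given condition type."""
        self._counts[condition] = self._counts.get(condition, 0) + 1

    def retrieve_and_clear(self):
        """Return accumulated data and reset, as would happen when
        the data is transmitted for storage and analysis."""
        data, self._counts = dict(self._counts), {}
        return data

tracker = AlertTracker()
tracker.record("stop_light_change")
tracker.record("stop_light_change")
tracker.record("cabin_temperature")
```

A periodic task would call `retrieve_and_clear` and transmit the returned dictionary wirelessly to the repository.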
It can be appreciated that in some cases, the connections between the various components shown in
The wireless network 85 may be implemented as a PAN (Bluetooth or otherwise), with the signal transmitted through the PAN. It should be appreciated that wireless network 85 may be implemented not just via Bluetooth communications, but through one of a number of possible alternative PAN wireless technologies. For example, in one embodiment wireless network 85 may be implemented as a PAN based on induction wireless technology, which uses magnetic induction rather than radio for close-range communications. In radio, both electric and magnetic fields make up the signal, while in induction wireless, only the magnetic field is transmitted. The transmitter in this context is a radiating coil that is more like the primary winding of a transformer than an antenna. A PAN based on an induction wireless approach has about a 3-m range. A typical unit transmits up to 204.8-kbit/s data rates via GMSK modulation on 11.5 MHz. Key benefits of induction wireless technologies are extremely low power consumption, low cost, and the inherent security that accompanies short range.
Another implementation of wireless network 85 can involve the use of infrared wireless communications. Such a PAN technology can be employed for use over short distances. The IrDA infrared (IR) standard appeared during the early 1990s, and can be utilized to implement wireless network 85 as a PAN network. IrDA initially offered a 115.2-kbit/s data rate over a range of up to 1 m. A 4-Mbit/s version was soon developed and has been widely incorporated in laptops and PDAs for printer connections and short-range PANs. A 16-Mbit/s version is available as well.
The problem with IrDA is not just its very short range, but also its need for a line-of-sight (LOS) connection. Bluetooth, of course, does not need LOS, and it can penetrate walls. A more recent IR development is IrGate, which was produced by Infra-Com Technologies. This IR development uses arrays of high-powered IR LEDs to emit coded baseband IR in all directions. It then relies on an array of photodetectors and super-sensitive receivers to pick up the diffused IR within the networking space. Thus, the LOS problem is mitigated, and a data rate of up to 10 Mbits/s is possible.
Still another wireless technology for implementing wireless network 85 in the context of, for example, an in-vehicle PAN is UWB (Ultra Wideband), which transmits data by way of baseband pulses applied directly to the antenna. The narrow pulses (less than 1 ns) create an extremely broad bandwidth signal. The pulses are modulated by pulse position modulation (PPM) or binary phase-shift keying (BPSK). The FCC permits UWB in the 3.1- to 10.6-GHz band. Its primary application to date has been short-range, high-resolution radar and imaging systems that penetrate walls, the ground, and the body. In addition, this new technology is useful for short-range LANs or PANs that require very high data rates (over 100 Mbits/s).
Still another wireless technology for implementing wireless network 85 in the context of, for example, an in-vehicle PAN is ZigBee, which is a simpler, slower, lower-power, lower-cost cousin of Bluetooth. ZigBee is supported by a mix of companies that are targeting the consumer and industrial markets. It may be a better fit with games, consumer electronic equipment, and home-automation applications than Bluetooth. Short-range industrial telemetry and remote control are other target applications. It can be appreciated, however, that wireless network 85 can be implemented as a ZigBee PAN.
Previously referred to as RF-Lite, ZigBee is similar to Bluetooth because it uses the 2.4-GHz band with frequency-hopping spread-spectrum with 25 hops spaced every 4 MHz. The basic data rate is 250 kbits/s, but a slower 28-kbit/s rate is useful for extended range and greater reliability. With a 20-dBm power level, ZigBee can achieve a range of up to 134 meters at 28 kbits/s. It additionally allows for networking of up to 254 nodes.
Note that in some embodiments, whether that of
An example of a 360 degree camera that can be adapted for use with one or more embodiments is the 360cam by Giroptic. Such a device includes three 185-degree fish-eye cameras, allowing it to capture 360 degrees of HD video and photos (including time-lapse and HDR). The Giroptic 360cam captures audio as well as video, and can record 3D sound from three microphones. Media can be saved onto a microSD card, which is then loaded onto a computer via a micro USB port on the unit's base, or via Wi-Fi. It can be appreciated that such a device (or other 360 degree video cameras) can be modified to communicate via other types of wireless communications, such as Bluetooth communications, cellular, and so forth as discussed herein. Note that reference herein to the Giroptic video camera is for illustrative purposes only and is not considered a limiting feature of the disclosed embodiments.
ML techniques can be employed in the context of, for example, an algorithm that operates by building a model from example inputs and using that model to make predictions or decisions, rather than following strictly static program instructions. ML can be used to construct a model or rule set to predict a result based on the values of a number of features. A series of input patterns can be provided to an algorithm along with a desired output (e.g., the label), and the algorithm then learns how to classify the patterns by outputting the desired label. In supervised learning (e.g., a kernel-based support vector machine (SVM) algorithm), a human operator must provide the labels during a teaching phase. Alternatively, unsupervised clustering is a process of assigning labels to the input patterns without the use of a human operator. Such unsupervised methods generally function through a statistical analysis of the input data, for example by determining the eigenvectors of a covariance matrix.
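The unsupervised statistical approach mentioned above can be sketched as follows: compute the eigenvectors of the data's covariance matrix and assign each input pattern a label from the sign of its projection onto the principal axis. This is a minimal illustrative bi-partition, not the specific algorithm of any embodiment; the function names are hypothetical.

```python
import numpy as np

def principal_eigenvector(X):
    """Leading eigenvector of the covariance matrix of the rows of X."""
    cov = np.cov(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    return eigvecs[:, -1]                   # column for the largest eigenvalue

def cluster_by_projection(X):
    """Assign each input pattern a 0/1 label by the sign of its projection
    onto the principal axis -- a crude unsupervised two-way clustering."""
    centered = X - X.mean(axis=0)
    v = principal_eigenvector(X)
    return (centered @ v >= 0).astype(int)
```

For well-separated groups of patterns, the projection sign recovers the grouping without any human-provided labels, which is the essential contrast with the supervised SVM case described above.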
One non-limiting ML technique that can be adapted for use in accordance with an embodiment is AHaH (Anti-Hebbian and Hebbian) learning, which can be employed for feature extraction. One example of an AHaH ML approach is disclosed in U.S. Pat. No. 8,918,353 entitled “Methods and Systems for Feature Extraction,” which issued on Dec. 23, 2014 and is incorporated herein by reference in its entirety. Another non-limiting ML technique that can be adapted in accordance with another embodiment is disclosed in U.S. Pat. No. 8,429,103 entitled “Native Machine Learning Service for User Adaptation on a Mobile Platform,” which issued on Apr. 23, 2013 and is incorporated herein by reference in its entirety. It can be appreciated that such ML approaches are referred to for illustrative purposes only and are not considered limiting features of the disclosed embodiments.
In the context of the embodiment shown in
For example, in some embodiments the monitoring operation of monitoring module 74 can involve estimating the distance to a particular point or location near the vehicle and providing a notification/alert via the alerting module 76 in the form of an audio alert, text message, etc. ML and/or AD modules or mechanisms can be employed to detect changes in conditions with respect to particular geographic locations. For example, the GPS data may be utilized to determine that the vehicle is rapidly approaching a particular crosswalk or intersection, and an alert can be issued to let the driver know that he or she is approaching this particular crosswalk or intersection, while the ML and/or AD modules or techniques can be employed to determine if someone is in the middle of the crosswalk/intersection.
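The GPS-based distance estimation described above can be sketched with a standard haversine great-circle computation; the function names and the 50-meter threshold are illustrative assumptions, not values specified by any embodiment.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    R = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def proximity_alert(vehicle_fix, crosswalk_fix, threshold_m=50.0):
    """Return an alert string when the vehicle is within threshold_m
    of a known crosswalk location; None otherwise."""
    d = haversine_m(*vehicle_fix, *crosswalk_fix)
    if d <= threshold_m:
        return f"approaching crosswalk: {d:.0f} m"
    return None
```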
Possible alerts or conditions to be monitored and alerted can be, for example, “approaching a red light,” “changing lanes,” “approaching a median,” “15 feet to a median,” “10 feet to a median,” “at median,” etc. The camera(s) 19, 82 and so forth and the monitoring module 74 can look for conditions such as medians, red lights, yellow lights, etc., and determine how far away these things are from the vehicle.
Note that a non-limiting example of a camera that can be adapted for use in accordance with the operation shown as block 98, and in some implementations for use as the camera 19 discussed earlier herein, is a color recognition camera. A non-limiting example of a color recognition camera is disclosed in U.S. Pat. No. 6,803,956 entitled “Color Recognition Camera,” which issued on Oct. 12, 2004 and is incorporated herein by reference in its entirety. Such an example color recognition camera includes a red-green-blue CCD-imaging device that provides an analog RGB-video signal. A set of three analog-to-digital converters converts the analog RGB-video signal into a digital RGB-video signal. A digital comparator tests the digital RGB-video signal pixel-by-pixel for a match against a color setpoint. If a match occurs, a pixel with a particular color represented by the color setpoint has been recognized and a “hit” is output. A pixel address counter provides a pixel address output each time a “hit” is registered. The number of hits per video frame is accumulated, and a color-match area magnitude value is output for each frame. Alternatively, neural networks can be used to indicate hits when a pixel in the video image comes close enough to the color setpoint value. Just how close can be “learned” by the neural network.
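The pixel-by-pixel comparison against a color setpoint can be sketched in software as follows; this is a minimal analogue of the digital-comparator "hit" counting described above, with a hypothetical per-channel tolerance standing in for the comparator's match criterion.

```python
def count_color_hits(frame, setpoint, tolerance=10):
    """Count pixels whose RGB value lies within `tolerance` of the
    color setpoint on every channel; the per-frame total corresponds
    to the accumulated color-match area magnitude described above."""
    hits = 0
    for row in frame:
        for (r, g, b) in row:
            if (abs(r - setpoint[0]) <= tolerance and
                    abs(g - setpoint[1]) <= tolerance and
                    abs(b - setpoint[2]) <= tolerance):
                hits += 1
    return hits
```

A large hit count for a red setpoint in the region of a detected traffic light, for instance, would indicate the light is currently red.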
As indicated next at block 100, a step or logical operation can be implemented to determine if the light is red, green, or yellow. If it is determined, as shown at block 102, that the light is red, then the traffic light is monitored to determine if there is a change from red to green. Assuming that the light changes from red to green, an alert (e.g., audio) is then issued indicating the change from red to green, as depicted at block 104. The process can then terminate, as shown at block 106.
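The logic of blocks 100-106 can be sketched as a simple state machine over a stream of classified light colors; the function names are hypothetical and the sketch assumes the color classification itself is provided upstream (e.g., by the color recognition camera).

```python
def monitor_light(color_samples, issue_alert):
    """Watch a stream of classified light colors; once a red light is
    observed, issue an alert on the first transition to green and
    terminate (blocks 100-106). Returns True if an alert was issued."""
    waiting_on_red = False
    for color in color_samples:
        if color == "red":
            waiting_on_red = True
        elif color == "green" and waiting_on_red:
            issue_alert("light changed from red to green")
            return True
    return False
```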
Note that another color recognition approach that can be adapted for use in accordance with an alternative embodiment and for monitoring a change in color (e.g., traffic light change from yellow to green, red to green, etc.) is disclosed in U.S. Pat. No. 8,139,852 entitled “Color classification method, color recognition method, color classification apparatus, color recognition apparatus, color recognition system, computer program, and recording medium,” which issued on Mar. 12, 2012 and is incorporated herein by reference in its entirety.
Note that in some embodiments, computer program code for carrying out operations of the disclosed embodiments may be written in an object oriented programming language (e.g., Java, C#, C++, etc.). Such computer program code, however, for carrying out operations of particular embodiments can also be written in conventional procedural programming languages, such as the “C” programming language or in a visually oriented programming environment, such as, for example, Visual Basic.
The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer. In the latter scenario, the remote computer may be connected to the user's computer through a local area network (LAN), a wide area network (WAN), or a wireless data network (e.g., Wi-Fi, WiMAX, 802.xx, cellular), or the connection may be made to an external computer via most third party supported networks (e.g., through the Internet via an Internet Service Provider).
The embodiments are described at least in part herein with reference to flowchart illustrations and/or block diagrams of methods, systems, and computer program products and data structures according to embodiments of the invention. It will be understood that each block of the illustrations, and combinations of blocks, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function/act specified in the various block or blocks, flowcharts, and other architecture illustrated and described herein.
The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the block or blocks.
As illustrated in
As illustrated, the various components of data-processing system 200 can communicate electronically through a system bus 151 or similar architecture. The system bus 151 may be, for example, a subsystem that transfers data between, for example, computer components within data-processing system 200 or to and from other data-processing devices, components, computers, etc. Data-processing system 200 may be implemented as, for example, a server in a client-server based network (e.g., the Internet) or can be implemented in the context of a client and a server (i.e., where aspects are practiced on the client and the server). Data-processing system 200 may also be, for example, a standalone desktop computer, a laptop computer, a smartphone, a pad computing device, and so on. In the case of a smartphone, it can be assumed that devices such as keyboard 144, input unit 145, and so on would be implemented in the context of a touch screen display or other appropriate mobile input interface. The data-processing system 200 can also include or communicate with an image capturing unit 132 (e.g., a video camera such as discussed herein, etc.).
The following discussion is intended to provide a brief, general description of suitable computing environments in which the system and method may be implemented. Although not required, the disclosed embodiments will be described in the general context of computer-executable instructions, such as program modules, being executed by a single computer. In most instances, a “module” constitutes a software application.
Generally, program modules include, but are not limited to, routines, subroutines, software applications, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types and instructions. Moreover, those skilled in the art will appreciate that the disclosed method and system may be practiced with other computer system configurations, such as, for example, hand-held devices, multi-processor systems, data networks, microprocessor-based or programmable consumer electronics, networked PCs, minicomputers, mainframe computers, servers, and the like.
Note that the term module as utilized herein may refer to a collection of routines and data structures that perform a particular task or implement a particular abstract data type. Modules may be composed of two parts: an interface, which lists the constants, data types, variables, and routines that can be accessed by other modules or routines; and an implementation, which is typically private (accessible only to that module) and which includes source code that actually implements the routines in the module. The term module may also simply refer to an application, such as a computer program designed to assist in the performance of a specific task, such as word processing, accounting, inventory management, etc.
Thus beacon 13 can assist the mobile computing device 21 in determining its approximate location or context. With the assistance of beacon 13, software associated with mobile computing device 21 can approximately find its relative location with respect to beacon 13 and hence with respect to the traffic light 22 (assuming the beacon 13 is located at or proximate to the traffic light 22). The beacon 13 can communicate with device 21 using BLE (Bluetooth Low Energy) technology, also referred to as “Bluetooth Smart”. The beacon 13 uses low energy proximity sensing to transmit a universally unique identifier picked up by a compatible “app” or operating system. The identifier can then be looked up over the Internet to determine the physical location of device 21 or trigger an action on device 21 such as a push notification or tracking and recording operations as discussed previously herein. One non-limiting example of a beacon device and systems that can be adapted for use as or with device 21 and beacon 13 and the methods/systems disclosed herein is discussed in U.S. Pat. No. 8,718,620 entitled “Personal Media Devices with Wireless Communication,” which issued on May 6, 2014 and is incorporated herein by reference.
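The identifier-to-location lookup described above can be sketched as follows. The registry, UUID, and location string are all hypothetical placeholders; in practice the identifier would be resolved over the Internet rather than against a local table, and the notification would be a platform push notification.

```python
# Hypothetical registry mapping beacon UUIDs to known installations;
# a real deployment would resolve identifiers via an Internet service.
BEACON_REGISTRY = {
    "e2c56db5-dffb-48d2-b060-d0f5a71096e0": "traffic light at 5th & Main",
}

def handle_advertisement(uuid, notify):
    """On receiving a BLE advertisement, resolve the beacon's location
    and trigger a notification on the mobile device; unknown beacons
    are ignored."""
    location = BEACON_REGISTRY.get(uuid)
    if location is None:
        return None
    notify(f"near {location}")
    return location
```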
Based on the foregoing, it can be appreciated that a number of embodiments, preferred and alternative, are disclosed herein. For example, in one embodiment a method for alerting a vehicle driver via wireless communications can be implemented. Such a method can include the steps or logical operations of, for example, monitoring one or more conditions with respect to a vehicle; detecting a change in the condition (or conditions); transmitting a signal wirelessly to a computing device, the signal indicative of the change in the condition(s); and alerting the driver of the change in the condition(s) in response to transmitting the signal to the computing device. In some embodiments, the computing device may be, for example, a wireless hand held electronic device such as a smartphone or tablet computing device (e.g., iPad, Android tablet, etc.). In other embodiments, the tablet computing device may actually be integrated with the vehicle.
In some embodiments, the step or logical operation of alerting the driver of the change in the condition(s) further comprises providing an audible alert via a speaker associated with the computing device, the audible alert indicative of the change in condition(s). In another embodiment, the step or logical operation of alerting the driver of the change in the condition(s) can further include steps or logical operations for establishing a wireless connection between the computing device and a radio system of the vehicle; and providing an audible alert from the computing device indicative of the change in the condition(s) via the radio system.
In yet another embodiment, the step or logical operation of alerting the driver of the change in the condition(s) can further include the step or logical operation of alerting the driver of the change in the condition by a text message displayable through the computing device. In still another embodiment, monitoring the condition(s) with respect to a vehicle can further involve monitoring such condition(s) with a camera. In some embodiments, such a camera may be for example, an HD video camera, a 360 degree video camera, etc.
In other embodiments, the step or logical operation of monitoring the condition(s) with respect to the vehicle can further include a step or logical operation of monitoring the condition(s) with one or more sensors (e.g., temperature sensor, tire pressure sensor, etc.). In yet other embodiments, the step or logical operation of monitoring the condition(s) can further involve a step or logical operation of analyzing video data from the 360 degree video camera utilizing anomaly detection. In still other embodiments, the step or logical operation of monitoring the condition(s) can further involve the step or logical operation of analyzing video data from the 360 degree video camera utilizing machine learning. In yet another embodiment the step or logical operation of monitoring the condition(s) can further include a step or logical operation of monitoring the condition(s) by analyzing video data from the 360 degree video camera utilizing location data (e.g., GPS data, beacon data, etc.) from a location module (e.g., GPS module, iBeacon, etc.) associated with, for example, the computing device or the vehicle itself (e.g., an in-vehicle mounted GPS unit).
In another embodiment, a method for alerting a vehicle driver via wireless communications, can be implemented. Such a method can include the steps or logical operations of monitoring one or more conditions external to a vehicle while the vehicle is in operation and a driver of the vehicle is located in a driver seat of the vehicle; detecting a change in the condition(s); transmitting a signal wirelessly to a computing device, the signal indicative of the change in the condition(s); and alerting the driver of the change in the condition(s) after transmission of the signal to the computing device.
In some embodiments, the step or logical operation of alerting the driver of the change in the condition(s) after transmission of the signal to the computing device, can further include a step or logical operation of providing an audible alert via a speaker associated with the computing device, the audible alert indicative of the change in the at least one condition. In still another embodiment, the step or logical operation of alerting the driver of the change in the condition(s) after transmission of the signal to the computing device, can further include the steps or logical operation of establishing a wireless connection between the computing device and a radio system of the vehicle; and providing an audible alert from the computing device indicative of the change in the condition(s) via the radio system. In some embodiments, the “vehicle in operation” can include at least one of: the vehicle in motion, or the vehicle temporarily stopped (i.e., yet still in operation, such as the vehicle temporarily stopped at an intersection).
In another embodiment, the step or logical operation of alerting the driver of the change in the at least one condition after transmission of the signal to the computing device, can further include the step or logical operation of providing an audible alert via a speaker associated with the computing device, the audible alert indicative of the change in the at least one condition. In another embodiment, the step or logical operation of alerting the driver of the change in the at least one condition after transmission of the signal to the computing device, can further include the steps or logical operations of establishing a wireless connection between the computing device and a radio system of the vehicle; and providing an audible alert from the computing device indicative of the change in the at least one condition via the radio system.
In still another embodiment, the step or logical operation of monitoring a condition external to a vehicle while the vehicle is in operation and a driver of the vehicle is located in a driver seat of the vehicle, can further include the step or logical operation of monitoring the at least one condition with a camera that communicates with the computing device. In some embodiments, the aforementioned camera may be integrated with the computing device or can be a standalone camera positioned within the vehicle to monitor the at least one condition and wherein the standalone camera communicates via a wireless connection with the computing device. In some cases more than one camera may be employed (e.g., both the standalone camera and the camera integrated with the computing device).
In other embodiments, steps or logical operations can be provided for tracking and recording in a memory of a computing system data indicative of a number of times the driver is alerted to the change in the at least one condition. Additionally, steps or logical operations may be provided for periodically retrieving the data from the memory; and transmitting the data wirelessly from the computing system to a central repository for further storage and analysis.
In some embodiments, the at least one condition external to the vehicle may be, for example, a stop light condition and the change in the at least one condition comprises a change from one stop light color to another stop light color.
In another embodiment the step or logical operation of transmitting the signal wirelessly to the computing device, the signal indicative of the change in the at least one condition, can further include or involve a step or logical operation of transmitting the signal wirelessly through a PAN (Personal Area Network). In some embodiments, such a PAN can be a network enabled for example, for: Bluetooth wireless communications, induction wireless communications, infrared wireless communications, ultra-wideband wireless communications and ZigBee wireless communications. In some embodiments, the wireless connection between the computing device and the radio system can be established via Secure Simple Pairing (SSP).
In other embodiments, the step or logical operation of detecting the change in the at least one condition can further involve a step or logical operation for utilizing an anomaly detection mechanism to detect the change in the at least one condition.
In another embodiment, steps or logical operations can be provided for determining if the vehicle is no longer in operation; monitoring at least one condition within the vehicle, in response to determining that the vehicle is no longer in operation; determining if the at least one condition within the vehicle comprises an anomalous condition; and wirelessly transmitting an alert to a computing device associated with the user, the alert indicative of the anomalous condition, if it is determined that the at least one condition comprises the anomalous condition. In yet another embodiment the alert can be wirelessly transmitted as a text message to the computing device via a wireless network.
In another embodiment, a system for alerting a vehicle driver via wireless communications can be implemented. Such a system can include, for example, a video camera (one or more video cameras), at least one processor that communicates with and processes video data captured by the video camera; and a computer-usable medium embodying computer program code, the computer-usable medium capable of communicating with the at least one processor. The computer program code can include instructions executable by the at least one processor and configured, for example, for: monitoring at least one condition with respect to a vehicle with the video camera; detecting a change in the at least one condition monitored with the video camera; transmitting a signal wirelessly to a computing device, the signal indicative of the change in the at least one condition; and alerting the driver of the change in the at least one condition in response to transmitting the signal to the computing device. As indicated previously, the computing device may be, for example, a smartphone, a tablet computing device, or an in-vehicle computing system (or a combination thereof).
In some embodiments, the aforementioned instructions for alerting the driver of the change in the at least one condition can be further configured for providing an audible alert via a speaker associated with the computing device, the audible alert indicative of the change in the at least one condition. In some situations, the speaker may be, for example, a speaker integrated with a smartphone, tablet computing device, etc. The speaker may also be an audio speaker associated with an in-vehicle system such as discussed herein.
In other embodiments, the aforementioned instructions for alerting the driver of the change in the at least one condition can be further configured for establishing a wireless connection between the computing device and a radio system of the vehicle; and providing an audible alert from the computing device indicative of the change in the at least one condition via the radio system.
In yet another embodiment, the instructions for alerting the driver of the change in the at least one condition can further include instructions configured for alerting the driver of the change in the condition by a text message displayable through the computing device.
In some embodiments, such a camera may be a 360 degree video camera. In other embodiments, one or more sensors can also be employed for use in monitoring the condition(s). In another embodiment, the instructions for monitoring the at least one condition can include instructions for analyzing the video data captured from the 360 degree video camera utilizing anomaly detection. In yet another embodiment, the instructions for monitoring the at least one condition can further include instructions for analyzing the video data captured from the 360 degree video camera utilizing machine learning. In still another embodiment, the instructions for monitoring the at least one condition can further involve or include instructions for analyzing the video data captured from the 360 degree video camera utilizing location data from a location module (e.g., GPS module, beacon, etc.) associated with the computing device.
The flowcharts and block diagrams in the different depicted embodiments illustrate the architecture, functionality, and operation of some possible implementations of apparatus, methods and computer program products. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified function or functions. In some alternative implementations, the function or functions noted in the block may occur out of the order noted in the figures. For example, in some cases, two blocks shown in succession may be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
The invention can take the form of an entire hardware embodiment, an entire software embodiment, or an embodiment containing both hardware and software elements. In a preferred embodiment, the invention is implemented in software, which includes but is not limited to firmware, resident software, microcode, etc.
Furthermore, the invention can take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or any instruction execution system. For the purposes of this description, a computer-usable or computer readable medium can be any tangible apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device). Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Current examples of optical disks include compact disk-read only memory (CD-ROM), compact disk-read/write (CD-R/W) and DVD.
Some embodiments may be implemented in the context of a so-called “app” or software application. An “app” is a self-contained program or piece of software designed to fulfill a particular purpose; an application, especially as downloaded by a user to a mobile device (e.g., smartphone, tablet computing device, etc.).
It will be appreciated that variations of the above-disclosed and other features and functions, or alternatives thereof, may be desirably combined into many other different systems or applications. Also, that various presently unforeseen or unanticipated alternatives, modifications, variations or improvements therein may be subsequently made by those skilled in the art which are also intended to be encompassed by the following claims.
This application is a continuation of U.S. patent application Ser. No. 17/082,312, filed on Oct. 28, 2020; which is a continuation of U.S. patent application Ser. No. 16/700,832, filed on Dec. 2, 2019 (now U.S. Pat. No. 10,850,664); which is a continuation of U.S. patent application Ser. No. 16/115,807, filed on Aug. 29, 2018 (now U.S. Pat. No. 10,493,911); which is a continuation of U.S. patent application Ser. No. 15/688,237, filed on Aug. 28, 2017 (now U.S. Pat. No. 10,089,871); which is a continuation of U.S. patent application Ser. No. 15/396,728, filed on Jan. 2, 2017 (now U.S. Pat. No. 9,824,582); which is a continuation of U.S. patent application Ser. No. 14/661,065, filed on Mar. 18, 2015 (now U.S. Pat. No. 9,610,893); the aforementioned applications being hereby incorporated by reference in their respective entireties.
Number | Name | Date | Kind |
---|---|---|---|
3265938 | Daien | Aug 1966 | A |
3369838 | Nelson | Feb 1968 | A |
4941263 | Hirshberg | Jul 1990 | A |
5131043 | Fujii | Jul 1992 | A |
5287411 | Hill | Feb 1994 | A |
5436612 | Aduddell | Jul 1995 | A |
5495242 | Kick | Feb 1996 | A |
5684455 | Williams | Nov 1997 | A |
5710555 | McConnell | Jan 1998 | A |
5990801 | Kyouno | Nov 1999 | A |
6087960 | Kyouno | Jul 2000 | A |
6096969 | Fujita | Aug 2000 | A |
6223125 | Hall | Apr 2001 | B1 |
6292109 | Murano | Sep 2001 | B1 |
6323761 | Son | Nov 2001 | B1 |
6362749 | Brill | Mar 2002 | B1 |
6366219 | Hoummady | Apr 2002 | B1 |
6384742 | Harrison | May 2002 | B1 |
6438491 | Farmer | Aug 2002 | B1 |
6792339 | Basson | Sep 2004 | B2 |
6803956 | Hirono | Oct 2004 | B1 |
6940422 | Bachelder | Sep 2005 | B1 |
7019669 | Carr | Mar 2006 | B1 |
7089099 | Shostak | Aug 2006 | B2 |
7142089 | Yamagishi | Nov 2006 | B2 |
7210725 | Moore | May 2007 | B2 |
7442089 | Regnier | Oct 2008 | B2 |
7466227 | Chen et al. | Dec 2008 | B2 |
7821422 | Hutchison et al. | Oct 2010 | B2 |
7860813 | Wang et al. | Dec 2010 | B2 |
7957897 | Basnayake | Jun 2011 | B2 |
8014601 | Takahashi | Sep 2011 | B2 |
8031062 | Smith | Oct 2011 | B2 |
8043138 | Buckley | Oct 2011 | B2 |
8047649 | Chen et al. | Nov 2011 | B2 |
8092986 | Lee et al. | Jan 2012 | B2 |
8107060 | Hebrank | Jan 2012 | B2 |
8139852 | Shinjo et al. | Mar 2012 | B2 |
8228210 | Sitbon | Jul 2012 | B2 |
8265938 | Verna et al. | Sep 2012 | B1 |
8279716 | Gossweiler, III et al. | Oct 2012 | B1 |
8305459 | Kanemitsu et al. | Nov 2012 | B2 |
8369838 | Mallavarapu | Feb 2013 | B2 |
8386658 | Lee et al. | Feb 2013 | B2 |
8417211 | Hong | Apr 2013 | B2 |
8418115 | Tom et al. | Apr 2013 | B1 |
8429103 | Aradhye et al. | Apr 2013 | B1 |
8447139 | Guan et al. | May 2013 | B2 |
8471691 | Zhang | Jun 2013 | B2 |
8483941 | Fu et al. | Jul 2013 | B2 |
8487139 | Raja et al. | Jul 2013 | B2 |
8509824 | Bennett | Aug 2013 | B2 |
8532843 | Nagua | Sep 2013 | B2 |
8581699 | Kojima | Nov 2013 | B2 |
8718620 | Rosenblatt | May 2014 | B2 |
8751602 | Slonh | Jun 2014 | B1 |
8755998 | Braennstroem | Jun 2014 | B2 |
8805351 | Sigal et al. | Aug 2014 | B2 |
8811743 | Kapoor et al. | Aug 2014 | B2 |
8819172 | Davis et al. | Aug 2014 | B2 |
8841987 | Stanfield | Sep 2014 | B1 |
8854197 | Ikeda et al. | Oct 2014 | B2 |
8854925 | Lee et al. | Oct 2014 | B1 |
8890674 | Zeiger et al. | Nov 2014 | B2 |
8918353 | Nugent | Dec 2014 | B2 |
8952800 | Bantz | Feb 2015 | B2 |
8985073 | Kanai | Mar 2015 | B2 |
9002536 | Hatton | Apr 2015 | B2 |
9014920 | Torres | Apr 2015 | B1 |
9073430 | Boss | Jul 2015 | B1 |
9137498 | L'Heureux | Sep 2015 | B1 |
9153135 | Bantz | Oct 2015 | B2 |
9239380 | Hegemann | Jan 2016 | B2 |
9610893 | Lopez-Hinojosa | Apr 2017 | B2 |
9648107 | Penilla | May 2017 | B1 |
9679487 | Hayward | Jun 2017 | B1 |
9734699 | Wassef | Aug 2017 | B2 |
9771038 | Mori | Sep 2017 | B2 |
9824582 | Lopez-Hinojosa | Nov 2017 | B2 |
9836963 | Hayward | Dec 2017 | B1 |
9841286 | Hayward | Dec 2017 | B1 |
9841287 | Hayward | Dec 2017 | B1 |
9841767 | Hayward | Dec 2017 | B1 |
9842496 | Hayward | Dec 2017 | B1 |
9904289 | Hayward | Feb 2018 | B1 |
9934685 | Bernhardt | Apr 2018 | B1 |
9975483 | Ramaswamy | May 2018 | B1 |
10019904 | Chan | Jul 2018 | B1 |
10026309 | Nepomuceno | Jul 2018 | B1 |
10042364 | Hayward | Aug 2018 | B1 |
10054453 | Hayward | Aug 2018 | B1 |
10055982 | Hayward | Aug 2018 | B1 |
10055985 | Hayward | Aug 2018 | B1 |
10089871 | Lopez-Hinojosa | Oct 2018 | B2 |
10104203 | Hatton | Oct 2018 | B2 |
10215573 | Hayward | Feb 2019 | B1 |
10216194 | Hayward | Feb 2019 | B1 |
10222228 | Chan | Mar 2019 | B1 |
10262532 | Hyun | Apr 2019 | B2 |
10328855 | Lopez-Hinojosa | Jun 2019 | B2 |
10354461 | Hayward | Jul 2019 | B1 |
10359782 | Hayward | Jul 2019 | B1 |
10366605 | Hayward | Jul 2019 | B1 |
10380904 | Hayward | Aug 2019 | B1 |
10399495 | Osborne | Sep 2019 | B1 |
10451427 | Hayward | Oct 2019 | B1 |
10453352 | Hayward | Oct 2019 | B1 |
10493911 | Lopez-Hinojosa | Dec 2019 | B2 |
10506092 | Stephenson | Dec 2019 | B1 |
10509414 | Hayward | Dec 2019 | B1 |
10576993 | Yoon | Mar 2020 | B2 |
10611304 | Lopez-Hinojosa | Apr 2020 | B2 |
10850664 | Lopez-Hinojosa | Dec 2020 | B2 |
10904377 | Stephenson | Jan 2021 | B2 |
10948927 | Harris | Mar 2021 | B1 |
11004280 | Hayward | May 2021 | B1 |
11061408 | Hayward | Jul 2021 | B1 |
11358525 | Lopez-Hinojosa | Jun 2022 | B2 |
11364845 | Lopez-Hinojosa | Jun 2022 | B2 |
20020113875 | Mazzilli | Aug 2002 | A1 |
20020120374 | Douros | Aug 2002 | A1 |
20020126022 | Ellis | Sep 2002 | A1 |
20020149490 | Butler | Oct 2002 | A1 |
20030158644 | Basson | Aug 2003 | A1 |
20030231550 | Macfarlane | Dec 2003 | A1 |
20040193347 | Harumoto | Sep 2004 | A1 |
20040212488 | Gift | Oct 2004 | A1 |
20050038573 | Goudy | Feb 2005 | A1 |
20050104745 | Bachelder | May 2005 | A1 |
20050184860 | Taruki | Aug 2005 | A1 |
20050209776 | Ogino | Sep 2005 | A1 |
20050231323 | Underdahl | Oct 2005 | A1 |
20050273218 | Breed | Dec 2005 | A1 |
20060017564 | Phillips | Jan 2006 | A1 |
20060033615 | Nou | Feb 2006 | A1 |
20060049922 | Kolpasky | Mar 2006 | A1 |
20060109095 | Takata | May 2006 | A1 |
20060122748 | Nou | Jun 2006 | A1 |
20060197761 | Suzuki | Sep 2006 | A1 |
20060244828 | Ho | Nov 2006 | A1 |
20070096943 | Arnold et al. | May 2007 | A1 |
20070120644 | Seike | May 2007 | A1 |
20070222638 | Chen et al. | Sep 2007 | A1 |
20080004774 | Weiczorek | Jan 2008 | A1 |
20080042812 | Dunsmoir | Feb 2008 | A1 |
20080077358 | Marvasti | Mar 2008 | A1 |
20080088479 | Caminiti et al. | Apr 2008 | A1 |
20080238642 | Mauti | Oct 2008 | A1 |
20080252444 | Batot | Oct 2008 | A1 |
20080255722 | McClellan | Oct 2008 | A1 |
20080291052 | Burns | Nov 2008 | A1 |
20090051510 | Follmer | Feb 2009 | A1 |
20090070031 | Ginsberg | Mar 2009 | A1 |
20090079555 | Aguirre De Carcer | Mar 2009 | A1 |
20090248257 | Hoshino | Oct 2009 | A1 |
20090261969 | Kobayashi | Oct 2009 | A1 |
20100039291 | Harrison | Feb 2010 | A1 |
20100070128 | Johnson | Mar 2010 | A1 |
20100079874 | Kamei | Apr 2010 | A1 |
20100094502 | Ito | Apr 2010 | A1 |
20100157061 | Katsman | Jun 2010 | A1 |
20100191449 | Iwamoto | Jul 2010 | A1 |
20100220189 | Yanagi | Sep 2010 | A1 |
20110012755 | Mudalige | Jan 2011 | A1 |
20110034128 | Kirsch | Feb 2011 | A1 |
20110037618 | Ginsberg et al. | Feb 2011 | A1 |
20110037619 | Ginsberg et al. | Feb 2011 | A1 |
20110040621 | Ginsberg et al. | Feb 2011 | A1 |
20110169626 | Sun | Jul 2011 | A1 |
20110184895 | Janssen | Jul 2011 | A1 |
20110194002 | Hachisu | Aug 2011 | A1 |
20110199199 | Perkins | Aug 2011 | A1 |
20110205040 | Van Wiemeersch | Aug 2011 | A1 |
20110304444 | Zhang | Dec 2011 | A1 |
20110319910 | Roelle | Dec 2011 | A1 |
20120056734 | Ikeda | Mar 2012 | A1 |
20120088462 | Mader | Apr 2012 | A1 |
20120128210 | Zobel | May 2012 | A1 |
20120139754 | Ginsberg et al. | Jun 2012 | A1 |
20120176232 | Bantz | Jul 2012 | A1 |
20120179358 | Chang et al. | Jul 2012 | A1 |
20120182140 | Kumabe | Jul 2012 | A1 |
20120203436 | Braennstroem | Aug 2012 | A1 |
20120214463 | Smith | Aug 2012 | A1 |
20120268267 | Anderson | Oct 2012 | A1 |
20120274481 | Ginsberg et al. | Nov 2012 | A1 |
20120299749 | Xiao | Nov 2012 | A1 |
20120326855 | Bantz | Dec 2012 | A1 |
20130009766 | Shaw | Jan 2013 | A1 |
20130099892 | Tucker | Apr 2013 | A1 |
20130137415 | Takikawa | May 2013 | A1 |
20130154819 | Stefanovski | Jun 2013 | A1 |
20130158838 | Yorke | Jun 2013 | A1 |
20130265178 | Tengler | Oct 2013 | A1 |
20130271292 | McDermott | Oct 2013 | A1 |
20140003710 | Seow et al. | Jan 2014 | A1 |
20140043155 | Shaw | Feb 2014 | A1 |
20140047143 | Bateman et al. | Feb 2014 | A1 |
20140062685 | Tamatsu | Mar 2014 | A1 |
20140063196 | Daniel | Mar 2014 | A1 |
20140107864 | Cecchini | Apr 2014 | A1 |
20140118128 | Orzeck | May 2014 | A1 |
20140118168 | Lee | May 2014 | A1 |
20140139655 | Mimar | May 2014 | A1 |
20140172753 | Nowozin et al. | Jun 2014 | A1 |
20140191858 | Morgan | Jul 2014 | A1 |
20140222280 | Salomonsson | Aug 2014 | A1 |
20140225724 | Rankin | Aug 2014 | A1 |
20140267398 | Beckwith | Sep 2014 | A1 |
20140277937 | Scholz | Sep 2014 | A1 |
20140306838 | Beumler | Oct 2014 | A1 |
20140309864 | Ricci | Oct 2014 | A1 |
20140314275 | Edmonson | Oct 2014 | A1 |
20140320637 | Yi | Oct 2014 | A1 |
20140321737 | Movellan et al. | Oct 2014 | A1 |
20140350785 | Tsuchida | Nov 2014 | A1 |
20140362195 | Ng-Thow-Hing | Dec 2014 | A1 |
20150019043 | Creasey | Jan 2015 | A1 |
20150035665 | Plante | Feb 2015 | A1 |
20150046038 | Kawamata | Feb 2015 | A1 |
20150084791 | Jang | Mar 2015 | A1 |
20150091740 | Bai | Apr 2015 | A1 |
20150112543 | Binion | Apr 2015 | A1 |
20150112731 | Binion | Apr 2015 | A1 |
20150137998 | Marti | May 2015 | A1 |
20150142993 | Blanc | May 2015 | A1 |
20150154860 | Holzwanger et al. | Jun 2015 | A1 |
20150160019 | Biswal | Jun 2015 | A1 |
20150172878 | Luna | Jun 2015 | A1 |
20150187146 | Chen | Jul 2015 | A1 |
20150228191 | Steinmetz | Aug 2015 | A1 |
20150251599 | Koravadi | Sep 2015 | A1 |
20150274180 | Prakah-Asante | Oct 2015 | A1 |
20150309717 | Sinaguinan | Oct 2015 | A1 |
20160042627 | Foley | Feb 2016 | A1 |
20160042644 | Velusamy | Feb 2016 | A1 |
20160049601 | Scarborough | Feb 2016 | A1 |
20160061613 | Jung | Mar 2016 | A1 |
20160086285 | Jordan Peters | Mar 2016 | A1 |
20160093212 | Barfield, Jr. | Mar 2016 | A1 |
20160093213 | Rider | Mar 2016 | A1 |
20160116299 | Kim | Apr 2016 | A1 |
20160119897 | Kim | Apr 2016 | A1 |
20160127529 | Kim | May 2016 | A1 |
20160150070 | Goren | May 2016 | A1 |
20160200168 | Boyer | Jul 2016 | A1 |
20160203717 | Ginsberg et al. | Jul 2016 | A1 |
20160221500 | Sakai | Aug 2016 | A1 |
20160223813 | Feldman | Aug 2016 | A1 |
20160260326 | Ng-Thow-Hing | Sep 2016 | A1 |
20160267335 | Hampiholi | Sep 2016 | A1 |
20160272113 | Lopez-Hinojosa | Sep 2016 | A1 |
20160272150 | Doshi | Sep 2016 | A1 |
20160293006 | Bauer et al. | Oct 2016 | A1 |
20160034256 | Song | Nov 2016 | A1 |
20160358081 | Cama | Dec 2016 | A1 |
20160379065 | Hartmann | Dec 2016 | A1 |
20160379466 | Payant | Dec 2016 | A1 |
20170169707 | Lopez-Hinojosa | Jun 2017 | A1 |
20170197617 | Penilla | Jul 2017 | A1 |
20170200197 | Brubaker | Jul 2017 | A1 |
20170240110 | Lopez-Hinojosa | Aug 2017 | A1 |
20170328734 | Devkar | Nov 2017 | A1 |
20170358207 | Lopez-Hinojosa | Dec 2017 | A1 |
20180046869 | Cordell | Feb 2018 | A1 |
20180056935 | Doshi | Mar 2018 | A1 |
20180197029 | Ali | Jul 2018 | A1 |
20180205457 | Scheim | Jul 2018 | A1 |
20180247109 | Joseph | Aug 2018 | A1 |
20180284265 | Bilik | Oct 2018 | A1 |
20180316788 | Elliott | Nov 2018 | A1 |
20180350235 | Hyun | Dec 2018 | A1 |
20190001884 | Lopez-Hinojosa | Jan 2019 | A1 |
20190176845 | Yoon | Jun 2019 | A1 |
20190228539 | Alt | Jul 2019 | A1 |
20190232870 | Lopez-Hinojosa | Aug 2019 | A1 |
20200074492 | Scholl | Mar 2020 | A1 |
20200076943 | Stephenson | Mar 2020 | A1 |
20200156540 | Lopez-Hinojosa | May 2020 | A1 |
20200207267 | Lopez-Hinojosa | Jul 2020 | A1 |
20210039553 | Lopez-Hinojosa | Feb 2021 | A1 |
20210043082 | Grant | Feb 2021 | A1 |
20210191388 | Marquart | Jun 2021 | A1 |
20210365848 | Lord | Nov 2021 | A1 |
20220281383 | Lopez-Hinojosa | Sep 2022 | A1 |
20220305990 | Lopez-Hinojosa | Sep 2022 | A1 |
20230121366 | Sulaiman | Apr 2023 | A1 |
20230182759 | Wright | Jun 2023 | A1 |
20230415645 | Lopez-Hinojosa | Dec 2023 | A1 |
Entry |
---|
Miersma, S., Audi Traffic Light Assist helps you hit every green light, Autoblog (http://www.autoblog.com/ces), Jan. 3, 2014, 2 pages. |
Ta, V. T., Automated Road Traffic Congestion Detection and Alarm Systems: Incorporating V2I communications into VTCSs, CoRR abs/1606.01010 (2016), 31 pages. |
Kim, S.J. et al., Sensors Know When to Interrupt you in the Car: Detecting Driver Interruptibility Through Monitoring of Peripheral Interactions, CHI (2015) April 18-23, Seoul, Republic of Korea, 10 pages. |
Brogan, J., Red Light, Green Light, Slate (2015) Aug. 25, http://www.slate.com/articles/technology/future_tense/2015/08/connected_lights_enlighten_app_lets_drivers_know_if_a_red_light_is_coming.html, 6 pages. |
Badon, M., Red Light, Green Light: The Invention of the Traffic Signal, Design Observer (2010) Jun. 14, http://designobserver.com/feature/red-light-green-light-the-invention-of-the-traffic-signal/8627/, 9 pages. |
Han, Y. et al., Hardware/Software Co-Design of a Traffic Sign Recognition System Using Zynq FPGAs, Electronics (2015) 4:1062-1089. |
Hodgkins, K., EnLighten tells you when your traffic light is going to change, Engadget Jan. 10, 2014, https://www.engadget.com/2014/01/10/enlighten-tells-you-when-your-traffic-light-is-going-to-change/, 9 pages. |
Gonder, J. et al., Analyzing Vehicle Fuel Savings Opportunities through Intelligent Driver Feedback, SAE International Apr. 16, 2012, 13 pages. |
Hsieh, C.-H. et al., Predict scooter's stopping event using smartphone as the sensing device, 2014 IEEE International Conference on Internet of Things, Green Computing and Communications, and Cyber-Physical-Social Computing, 7 pages. |
Manasseh, C. G., Technologies for Mobile ITS Applications and Safer Driving, Dissertation for Doctor of Philosophy in Engineering—Civil and Environmental Engineering, University of California, Berkeley, Fall 2010, 70 pages. |
Huang, Q. et al., Reliable Wireless Traffic Signal Protocols for Smart Intersections, at the Crossroads:Integrating Mobility and Safety and Security. ITS America 2004, 14th Annual Meeting and Exposition, San Antonio, Texas, 17 pages. |
Siogkas, G. et al., Traffic Lights Detection in Adverse Conditions Using Color, Symmetry and Spatiotemporal Information, VISAPP (2012), pp. 620-627. |
Alertdriving, Global Driver Risk Management, Human Error Accounts for 90% of Road Accidents, Apr. 2011, http://www.alertdriving.com/home/fleet-alert-magazine/international/human-error-accounts-90-road-accidents, 5 pages. |
Newcomb, D., “Bluetooth in Your Car: Still Indispensable, Still Imperfect,” Edmunds.com, published Aug. 27, 2013, 4 pages. |
Villas-Boas, A., “Hands on With the Giroptic 360cam,” PCMag.com, http://www.pcmag.com/article2/0,2817,2458401,00.asp, printed Mar. 1, 2015, 2 pages. |
Optical head-mounted display—Wikipedia, the free encyclopedia, http://en.wikipedia.org/wiki/Optical_head-mounted_display, printed Mar. 17, 2015, 22 pages. |
Frenzel, L. E., “Wireless PAN Alternatives to Bluetooth,” Electronic Design Jun. 24, 2002, 5 pages. |
IBeacon—Wikipedia, the free encyclopedia, http://en.wikipedia.org/wiki/iBeacon, printed Mar. 10, 2015, 9 pages. |
Koscher, et al., Experimental Security Analysis of a Modern Automobile, 2010, IEEE, pp. 447-462 (2010). |
Fadilah, et al., A Time Gap Interval for Safe Following Distance (TGFD) in Avoiding Car Collision in Wireless Vehicular Networks (VANET) Environment, 2014, IEEE, pp. 683-689 (2014). |
Reddy, et al., On Board Assistant to GPS Navigation of Vehicles, 2009, IEEE, pp. 7-13 (2009). |
Studnia, et al., Survey on security threats and protection mechanisms in embedded automotive networks, 2013, IEEE, pp. 1-12 (2013). |
Sun, et al., Identifying Relative Vehicle Positions via Electronic and Visual Signals, 2016, IEEE, pp. 36-45 (2016). |
Fawcett, Supporting human interaction with the Sentient vehicle, 2002, IEEE, pp. 307-312 (2002). |
Dikaiakos, et al., Location-Aware Services over Vehicular Ad-Hoc Networks using Car-to-Car Communication, 2007, IEEE, pp. 1590-1602 (2007). |
Chin, et al., Solutions of Wireless Sensor Networks for Driver Associate, 2008, IEEE, pp. 1-6 (2008). |
King, et al., A Wireless Sensor Network Architecture for Highway Intersection Collision Prevention, 2007, IEEE, pp. 178-183 (2007). |
Yugapriya, et al., Adaptive Traffic Management with VANET in V to I communication using greedy forwarding algorithm, 2014, IEEE, pp. 1-6 (2014). |
Dhole, et al., Smart traffic signal using ultrasonic sensor, 2014, IEEE, pp. 1-4 (2014). |
Chavan, et al., Design of Intelligent Traffic Light Controller Using Embedded System, 2009, IEEE, pp. 1086-1091 (2009). |
Hewton, The Metropolitan Toronto traffic control signal system, 1968, IEEE, pp. 577-599 (1968). |
Number | Date | Country | |
---|---|---|---|
20220281383 A1 | Sep 2022 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 17082312 | Oct 2020 | US |
Child | 17824684 | US | |
Parent | 16700832 | Dec 2019 | US |
Child | 17082312 | US | |
Parent | 16115807 | Aug 2018 | US |
Child | 16700832 | US | |
Parent | 15688237 | Aug 2017 | US |
Child | 16115807 | US | |
Parent | 15396728 | Jan 2017 | US |
Child | 15688237 | US | |
Parent | 14661065 | Mar 2015 | US |
Child | 15396728 | US |