An alert system and method are disclosed for activating an alert when an object (e.g., approaching vehicle) is detected as traveling at a given velocity and within a given distance of a roadside alert beacon.
Each year service technicians or emergency responders are injured when assisting or approaching distressed, stopped, or parked vehicles. For instance, accidents may occur when an approaching vehicle is traveling at an undesirable velocity or within an undesirable distance from the service vehicle or distressed vehicle. To prevent accidents and to provide advance warning to approaching vehicles, roadside cones or barrels that include flashing LED lights may be employed to alert the approaching vehicles that assistance is being provided. However, conventional cones or barrels may not always effectively provide advance warning to approaching vehicles, and conventional cones and alerts do not provide warnings to the service technician or emergency responders.
An alert system and method are provided for deployment on or along a roadway. The alert system may comprise at least one alert beacon having one or more sensors (e.g., a LiDAR sensor). The alert beacon further includes a processor operable to poll the LiDAR sensor for a predefined number of beta readings in response to receiving an initial reading from the LiDAR sensor indicating a vehicle is within a predefined distance of the alert beacon. The processor is further operable to calculate an average distance and an average velocity for the vehicle in response to receiving the predefined number of beta readings while the vehicle is within the predefined distance of the alert beacon. In response to calculating the average distance and the average velocity, the processor is also operable to activate an audible alert and a visual alert when the average distance is below a distance threshold and the average velocity exceeds a velocity threshold.
Each alert beacon may also include one or more digital camera(s) operable to acquire one or more digital images in response to receiving the initial reading from the LiDAR sensor indicating the vehicle is within the predefined distance away from the alert beacon. The processor may also be operable to calculate a second average distance and a second average velocity for the vehicle using the one or more images. The processor may be further operable to activate the audible alert and the visual alert when the second average distance is below the distance threshold and the second average velocity exceeds the velocity threshold. The processor may further be operable to analyze the one or more digital images to determine whether a service repair protocol is being performed.
Each alert beacon may also include a global positioning system (GPS) operable to provide positioning data and a network interface operable to communicate with a remote server. Each processor may then be operable to transmit an identification and the positioning data of the at least one alert beacon in response to a request signal being received from the remote server. Each processor may also be operable to transmit the positioning data of the alert beacon to the remote server in response to receiving the initial reading from the LiDAR sensor indicating the vehicle is within the predefined distance away from the alert beacon. In response to a request to deploy the at least one alert beacon to a geographical coordinate, each processor may be operable to navigate the at least one alert beacon to the geographical coordinate based on the positioning data.
It is also contemplated that at least one of the alert beacons may be an aerial drone operable to hover about the geographical coordinate based on the positioning data. A mobile software application executing on a mobile device may also be operable to communicate with the at least one alert beacon. Each processor may then be operable to transmit a signal to the mobile software application to activate a visual notification and audible notification on the mobile device in response to receiving the initial reading from the LiDAR sensor indicating the vehicle is within the predefined distance away from the at least one alert beacon. Lastly, each processor may be operable to transmit a warning that is displayed upon an infotainment system within the vehicle in response to receiving the initial reading from the LiDAR sensor indicating the vehicle is within the predefined distance away from the alert beacon.
As required, detailed embodiments of the present invention are disclosed herein; however, it is to be understood that the disclosed embodiments are merely exemplary of the invention that may be embodied in various and alternative forms. The figures are not necessarily to scale; some features may be exaggerated or minimized to show details of particular components. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for teaching one skilled in the art to variously employ the present invention.
Each year people may be injured when trying to assist or approach distressed, stopped, or parked vehicles.
While the service assistant is connecting the two vehicles, changing a tire, or otherwise fixing the distressed vehicle 104, the service assistant might not be aware of the location or speed of the approaching vehicles 110. Alternatively, objects (e.g., concrete, stones, or items from approaching vehicles 110) may be projected dangerously close to the area near the service vehicle 102 and distressed vehicle 104 where the service technician is operating. Because the service assistant may be unaware of the approaching vehicles 110 or objects, a potentially hazardous condition may arise for the service assistant, occupants within the distressed vehicle 104, or occupants of the approaching vehicles 110. It is therefore desirable to provide a system and method for detecting and providing advance warning when such potentially hazardous conditions arise.
The alert system 200 may include at least one alert beacon 202. The alert beacon 202 may include at least one processor 204 that is operatively connected to a memory unit 208. The processor 204 may be one or more integrated circuits that implement the functionality of a CPU 206 (i.e., central processing unit). The processor 204 may be a microcontroller board (e.g., an Arduino microcontroller). Or, the processor 204 may be a commercially available CPU that implements an instruction set such as one of the x86, ARM, Power, or MIPS instruction set families.
During operation, the CPU 206 may execute stored program instructions that are retrieved from the memory unit 208. The stored program instructions may include software that controls operation of the CPU 206 to perform the operation described herein. In some examples, the processor 204 may be a system on a chip (SoC) that integrates functionality of the CPU 206, the memory unit 208, a network interface, and input/output interfaces into a single integrated device. The processor 204 may implement an operating system for managing various aspects of the operation.
The alert beacon may include an electrical energy power supply 226 that may comprise a DC battery or high-voltage capacitor. In operation, the power supply 226 may receive recharging energy from an external solar panel 228. Alternatively, a wind turbine may provide recharging energy to the power supply 226. It is also contemplated that the power supply 226 may be connected to an AC energy source (e.g., a 120-V AC outlet) that may be used to recharge the power supply 226.
The memory unit 208 may include volatile memory and non-volatile memory for storing instructions and data. The non-volatile memory may include solid-state memories, such as NAND flash memory, magnetic and optical storage media, or any other suitable data storage device that retains data when the alert system 200 is deactivated or loses electrical power. The volatile memory may include static and dynamic random-access memory (RAM) that stores program instructions and data.
The alert beacon 202 may include one or more sensors. For instance, the alert beacon 202 may include a light detection and ranging (LiDAR) sensor 210 that uses light in the form of a pulsed laser, which the alert beacon 202 may use to measure the distance, velocity (using a change in distance), or rate of acceleration of an approaching object. As discussed below, the processor 204 may be operable to algorithmically detect incoming objects and calculate their velocity in miles per hour using the data provided by the LiDAR sensor 210.
The alert beacon 202 may also include other radar sensors 212, such as ultrasonic radar sensors or short-, medium-, or long-range radar sensors, that are similarly operable to transmit pulsed signals that may be used by the alert beacon 202 for measuring ranges (distances) to objects. The alert beacon 202 may include a digital camera 214 operable to capture images or video that may then be processed by the alert beacon 202 for detecting stationary or incoming objects. The alert beacon 202 may also include a global positioning system (GPS) 215 for detecting the location of the alert beacon 202.
The alert beacon 202 may further include one or more audible alerts 216. The audible alerts 216 may comprise a speaker that provides a spoken warning or siren to people within a given radius of the alert beacon 202. Or the audible alerts 216 may include multiple, unique alarms that provide different notifications to the service technician. For instance, one unique alarm may be used to alert the service technician that an approaching vehicle 110 is approaching from behind the distressed vehicle 104 and a different alert may be used for approaching vehicles 110 that may be on a path in front of the distressed vehicle 104.
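The direction-specific alarms described above can be sketched as a simple lookup; the tone names and direction labels here are illustrative placeholders, not part of the specification:

```python
# Hypothetical mapping of approach direction to a unique alarm tone.
ALARM_TONES = {
    "rear": "two_tone_siren",   # vehicle approaching behind the distressed vehicle
    "front": "pulsed_chirp",    # vehicle approaching on a path in front
}

def alarm_for(direction):
    """Select the unique alarm for an approach direction, falling back
    to a generic siren for directions without a dedicated tone."""
    return ALARM_TONES.get(direction, "default_siren")
```

In a deployment, each detection direction reported by the sensors would be routed through such a table so the service technician can distinguish threats by sound alone.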
The alert beacon 202 may further include one or more visual alerts 218 to people within a given radius of the alert system 200. For instance, the visual alert 218 may include a light system (e.g., one or more light-emitting diodes (LED)) that can provide a constant, flashing, or blinking visual warning to people. Or, the visual alert 218 may be an electronic message board that is operable to provide readable and modifiable warnings to people.
It is contemplated that the audible alerts 216 and/or the visual alerts 218 may be used to warn the occupants of the approaching vehicle 110, the service technician, or the occupants of the distressed vehicle 104. It is also contemplated that one or more relays may be used by the alert beacon to activate and operate the audible alerts 216 and visual alerts 218 to warn the occupants of the approaching vehicle 110, the service technician, or the occupants of the distressed vehicle 104. It is also contemplated that the audible alerts 216 and/or the visual alerts 218 may operate to alert the occupants (i.e., driver) of the approaching vehicle 110 to deviate course away from the alert beacon 202, service vehicle 102, and/or distressed vehicle 104. Or, the audible alerts 216 and/or the visual alerts 218 may operate to alert the service technician or the occupants of the distressed vehicle 104 to move away from the approaching vehicle 110.
The alert beacon 202 may include a network interface device 220 that is configured to provide communication with external systems and devices. For example, the network interface device 220 may include a wired Ethernet interface as defined by the Institute of Electrical and Electronics Engineers (IEEE) 802.3 family of standards and/or a wireless interface as defined by the IEEE 802.11 family of standards. The network interface device 220 may include a cellular communication interface for communicating with a cellular network (e.g., 3G, 4G, 5G). The network interface device 220 may be further configured to provide a communication interface to an external network 222 or cloud.
The external network 222 may be interconnected to the world-wide web or the Internet. The external network 222 may establish a standard communication protocol between the network interface 220 and one or more external computing devices 224, allowing information and data to be easily exchanged between the computing devices 224 and the network interface 220. For instance, the external devices 224 may comprise one or more servers that are in communication with the alert beacon 202 via the external network 222. Or, the external devices 224 may include mobile devices (e.g., a smart phone or smart watch) that are in communication with the alert beacon 202 via the external network 222.
It is further contemplated that the alert system 200 may be implemented using one or more alert beacons 202.
When multiple alert beacons 202 are employed, the alert system 200 may use external network 222 to communicate between each individual alert beacon 202. For instance, the alert system 200 may be operable to use external network 222 to communicate between a first alert beacon 202 situated in front of the distressed vehicle 104 and a second alert beacon 202 situated behind the service vehicle 102. Placement of multiple alert beacons 202 provides the alert system 200 with the capability of using LiDAR 210, radar 212, or camera 214 to scan vehicles or objects approaching in multiple directions (e.g., vehicles approaching toward the front end of the distressed vehicle 104 or from the rear-side of the service vehicle 102). In addition, implementing multiple alert beacons 202 provides the alert system 200 with redundancy so that if one alert beacon 202 stops operating the remaining alert beacons 202 may continue operating to scan, detect, and alert about approaching vehicles 110 or objects.
The alert beacon 202 may be designed to operate in extreme weather conditions across differing geographic regions. For instance, the alert beacon 202 may be designed to operate in extreme cold or warm weather, or when exposed to rain, sleet, or snow. It is therefore contemplated that the alert beacon may be hermetically sealed or positioned within an Ingress Protection (IP) enclosure to protect the components (e.g., processor 204, LiDAR 210) from the various weather conditions and climate changes.
Again, the alert beacon 202 may include one or more audible alerts 216 and/or visual alerts 218 operable to indicate the presence of the service vehicle 102 or distressed vehicle 104 to an approaching vehicle 110. Or, the audible alerts 216 and/or visual alerts 218 may also be operable to indicate the presence of an approaching vehicle 110 to the service assistant.
It is also contemplated that the alert system 200 may operate by detecting whether an approaching vehicle 110 is within a predetermined range using data provided by the LiDAR sensor 210 or radar 212. The processor 204 may include instructions to perform error checking to remove any false-positive data received from the LiDAR sensor 210 or radar 212.
The processor 204 may also operate on beta measurements or samples for approaching objects (i.e., approaching vehicle 110) before determining an average distance. If the processor 204 determines a measurement is not within a predefined range, the processor 204 may not store the measurement within memory 208 and/or may discard the measurement. The processor 204 may continue polling the LiDAR sensor 210 or radar 212 until there exists a predetermined number of readings (i.e., beta readings) within a predetermined range (e.g., [Gamma, Delta] centimeters), as shown by Equation (1) below:

x̄ = (x_1 + x_2 + … + x_β) / β    (1)
In Equation (1), x_i is the distance of an approaching object in centimeters (cm). Once the processor 204 calculates the average distance, the processor 204 may further calculate a velocity for the approaching object. The velocity for the approaching object may be expressed as the change in position (centimeters) divided by the change in time (milliseconds), as shown by Equation (2) below:

v = (p_i − p_{i−1}) / (t_i − t_{i−1})    (2)
Where p_i is the position at iteration i and t_i is the time at iteration i. The processor 204 may also be operable to convert the calculated velocity into miles per hour (MPH). The processor 204 may convert the calculated velocity from centimeters/millisecond to miles/hour using Equations (3), (4), (5) below:

1 mile = 160,934.4 cm    (3)

1 hour = 3,600,000 ms    (4)

v (MPH) = v (cm/ms) × 3,600,000 / 160,934.4 ≈ v (cm/ms) × 22.37    (5)
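The velocity calculation and unit conversion described here can be sketched as below; the conversion constants follow from the exact definitions of the units (1 mile = 160,934.4 cm and 1 hour = 3,600,000 ms):

```python
CM_PER_MILE = 160_934.4   # 1 mile = 160,934.4 cm (exact)
MS_PER_HOUR = 3_600_000   # 1 hour = 3,600,000 ms (exact)


def velocity_cm_per_ms(p_prev, p_curr, t_prev, t_curr):
    """Change in position (cm) over change in time (ms).

    For an approaching object the measured distance shrinks, so the
    closing speed is the magnitude of this value.
    """
    return (p_curr - p_prev) / (t_curr - t_prev)


def cm_per_ms_to_mph(v_cm_per_ms):
    """Convert a velocity from centimeters/millisecond to miles/hour."""
    return v_cm_per_ms * MS_PER_HOUR / CM_PER_MILE
```

For example, a vehicle whose measured distance falls from 3000 cm to 2888 cm over 100 ms has a closing speed of 1.12 cm/ms, which converts to roughly 25 MPH.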
The processor 204 may also determine whether the object (i.e., approaching vehicle 110) is moving at a speed greater than or equal to a predetermined velocity (e.g., 25 MPH) and whether the object is at a distance less than or equal to a predetermined distance (e.g., 3000 cm), as shown by Equation (6) below:

z = 1 if (x ≥ 25 MPH) and (y ≤ 3000 cm); otherwise z = 0    (6)
Where z may be an output indicating whether an audible alert 216 or visual alert 218 should be activated, x is the speed in miles per hour (MPH), and y is the distance in centimeters (cm). If the processor 204 determines the object meets the predetermined velocity and distance conditions, then the processor may activate the visual alert 218 (e.g., an LED light) or the audible alert 216 (e.g., a loud siren).
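The polling, averaging, and alert-decision steps described above can be sketched end-to-end; `read_distance_cm` stands in for a hypothetical LiDAR driver call, and the beta count, [Gamma, Delta] acceptance window, and thresholds are the example values from the text:

```python
def collect_beta_readings(read_distance_cm, beta=10, gamma=100.0, delta=4000.0):
    """Poll the LiDAR until `beta` in-range distance readings are collected.

    Readings outside the [gamma, delta] cm acceptance window are treated
    as false positives and discarded rather than stored.
    """
    readings = []
    while len(readings) < beta:
        x = read_distance_cm()
        if gamma <= x <= delta:   # keep only plausible distances
            readings.append(x)
    return readings


def average_distance_cm(readings):
    """Arithmetic mean of the accepted beta readings."""
    return sum(readings) / len(readings)


def should_activate_alert(speed_mph, distance_cm,
                          velocity_threshold_mph=25.0,
                          distance_threshold_cm=3000.0):
    """z = 1 when the object is both fast (x >= 25 MPH) and close
    (y <= 3000 cm); otherwise z = 0."""
    return speed_mph >= velocity_threshold_mph and distance_cm <= distance_threshold_cm
```

A vehicle averaging 2500 cm away and closing at 30 MPH would trip the alert; the same vehicle at 20 MPH, or at 3500 cm, would not.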
Lastly, it is contemplated that the alert beacon 202 may also be designed as a clothing article or an IoT device that a service technician may wear when assisting a distressed vehicle 104. The alert system 200 may still provide wireless connectivity between the alert beacon 202 (i.e., the clothing article or IoT device) worn by the service technician and additional alert beacons 202 positioned around the service vehicle 102 and distressed vehicle 104. However, it is also contemplated that the clothing article or IoT device may be an alternative form of the alert system 200 that operates independently of the other alert beacons 202.
For instance, the clothing article may be a vest worn by the service technician. The vest may include one or more LiDAR sensors or radar sensors for detecting the location and speed of approaching vehicles 110 or objects. The vest may also include one or more camera sensors for detecting and recording video. The vest may be operable to determine if an oncoming vehicle is approaching within a predetermined distance or speed of the service vehicle 102 or distressed vehicle 104. The vest may include audible and visual alerts that may then be activated to notify the service technician about the approaching vehicle 110 or object. If employed as wearable glasses or contact lenses, the alert system 200 could display visual alerts to the service technician. Or, the wearable device may be a smart watch (e.g., an Android or Apple watch) on which a mobile software application could provide visual or audible alerts to the service technician.
It is contemplated, however, that the service technician may manually control placement of the alert beacons 202A-202D using network interface 220. For instance, the service technician may use a mobile device or remote control that is wirelessly connected to each alert beacon 202A-202D through the network interface 220. The service technician may use, for instance, a mobile app that allows selection of each alert beacon 202A-202D. Following selection of the alert beacon 202A-202D, the mobile app may provide the service technician with the capability of controlling placement of the alert beacon 202A-202D.
Again, each alert beacon 202A-202D may be an aerial drone.
It is also contemplated that each alert beacon 202A-202D may include a motorized assembly (not shown) that is controlled by the processor 204 to self-level the LiDAR 210, radar 212, and camera 214 regardless of the road grade. For instance, the processor 204 may be programmed to: (1) scan downward until the ground is detected; (2) scan upward to detect the horizon; and (3) auto-level the LiDAR 210 at a position that projects toward the approaching vehicle 110. Or, the processor may provide self-leveling using an accelerometer to determine the specific orientation of the LiDAR 210, radar 212, and camera 214 by measuring the different components of downward acceleration due to gravity.
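The accelerometer-based self-leveling variant can be sketched as recovering the mount's pitch from the measured gravity components; the servo command that would apply the correction is hardware-specific and is represented here only by the returned angle:

```python
import math

def pitch_correction_deg(ax, ay, az):
    """Estimate the pitch angle (degrees) of the sensor mount from a
    3-axis accelerometer reading at rest, where (ax, ay, az) is the
    measured acceleration in g-units and a level mount reads (0, 0, 1).

    The returned angle is the correction a motorized assembly would
    apply about the lateral axis to re-level the sensor stack.
    """
    # Pitch from gravity components: rotation about the lateral axis.
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    return -pitch  # apply the opposite rotation to level the mount
```

A mount tilted 10 degrees nose-down would report a +10 degree correction, which the motorized assembly would apply before the LiDAR resumes scanning toward approaching traffic.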
It is further contemplated that each alert beacon 202A-202D may be physically attached to the service vehicle 102. For instance, each alert beacon 202A-202D may be attached to a light bar atop the service vehicle 102 or to equipment attached inside or outside the service vehicle 102. The LiDAR 210, radar 212, and camera 214 may also be positioned around the service vehicle 102 and may be used by the processor 204 to detect vehicles 110 approaching from various directions. The LiDAR 210, radar 212, and camera 214 may also be controlled by the service technician or may be activated automatically based on traffic flow and road position.
It is also contemplated that the camera 214 may be operable to provide video recording of the area surrounding the distressed vehicle 104. The camera 214 may be operated whenever an alert beacon 202 is deployed. Or, the camera 214 may only be operable to record video when an approaching vehicle 110 is determined as moving above a predetermined velocity (i.e., speed) or within a predetermined direction of the distressed vehicle 104, service vehicle 102, or alert beacon 202. The predetermined velocity and direction values may be stored within memory 208. The predetermined direction and velocity values may be calibratable or may be adjusted by the service technician. The alert beacon 202 may also be operable to record and store the digital images, recorded video, or video segments acquired from camera 214 within memory 208 or stored in external network 222. Additionally, the camera 214 may also be used by the processor 204 in conjunction with a machine learning algorithm to determine if the service technicians are following a predetermined series of safety or operational protocols while assisting occupants of the distressed vehicle 104.
The alert system 200 may also be operable to transmit the video using external network 222 to a remote storage (e.g., device 224) that may be located within the service vehicle 102. Or, the alert beacon 202 may operably transmit the video using external network 222 to a remote server (e.g., a corporate server or cloud-based storage such as Amazon Web Services). The transmitted video may then be observed by remote workers either while service is being provided or at a later time. The remote workers may observe the video to provide supervision and oversight for the work being performed by the service technician. Or, the remote workers may observe the video as an extra level of safety for the service technician and the occupants of the distressed vehicle 104. Video and GPS positions could be live-streamed via network interface 220 and external network 222 to a central location, allowing supervisors and fleet operators to oversee operations in real time.
The alert system 200 may also be operable to generate real-time traffic analytics from the video collected by camera 214 and store the analytics within memory 208. Traffic analytics may again be transmitted using external network 222 to a central system or cloud-based storage (e.g., device 224) that may be monitoring multiple alert systems 200 (i.e., multiple emergency service vehicles) distributed across various locations. Traffic analytics data could be used both internally and externally to provide more accurate information to service technicians and to motorists.
Data from the GPS 215 may likewise be transmitted to the monitoring service or emergency service (via external network 222) when processor 204 determines the approaching vehicle 110 is approaching at a given speed, distance, or path toward the service vehicle 102, distressed vehicle 104, or alert beacon 202. The data provided by the GPS 215 may also be processed for internal analytics regarding prevalent distressed vehicle locations.
The alert system 200 may also be operable to transmit an alert using external network 222 to an infotainment system, heads-up display, video monitor, or mobile device located within an approaching vehicle 110. For instance, the alert system 200 may employ external network 222 to provide geo-fencing capabilities that can deliver the alert within the oncoming vehicles. The alert system 200 may transmit to the approaching vehicle 110 over the external network 222 data indicating the location of the service vehicle 102, distressed vehicle 104, or the alert beacon 202. The alert system 200 may also receive from the external network 222 data indicative of the location of the approaching vehicle 110. The alert system 200 may determine when to activate the audible alert 216 or the visual alert 218 based on the location and velocity of the approaching vehicle 110 in relation to the service vehicle 102, distressed vehicle 104, or the alert beacon 202. It is further contemplated that the alert system may be in communication with mobile software applications that may then provide route information to drivers and give real-time traffic information to advise occupants of the approaching vehicles 110.
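A minimal sketch of the geo-fencing decision, assuming both positions arrive as latitude/longitude pairs over the network; the haversine great-circle distance and the roughly quarter-mile default radius are illustrative choices, not mandated by the specification:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def inside_geofence(beacon_pos, vehicle_pos, radius_m=400.0):
    """True when the approaching vehicle is within the alert radius
    (~400 m, roughly a quarter mile, as an illustrative default)."""
    return haversine_m(*beacon_pos, *vehicle_pos) <= radius_m
```

In practice, the beacon would evaluate this test against each reported vehicle position and, when combined with the vehicle's velocity, decide whether to push an alert to that vehicle's infotainment system.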
The alert system 200 may also transmit instructions from network interface 220 over external network 222 to slow the speed of approaching vehicles 110. For instance, the alert system 200 may transmit data or instructions over external network 222 notifying local emergency services regarding the distressed vehicle 104. The local emergency services may be equipped to transmit a notification signal to approaching vehicles 110 nearing the proximity of the distressed vehicle 104 (e.g., within a ¼-mile radius). Upon receiving the notification signal, the approaching vehicles 110 may be programmatically controlled to reduce to a specified speed (e.g., 25 MPH) regardless of whether the driver attempts to depress the accelerator pedal. It is contemplated that the notification signal need not originate from an emergency service location but could instead be transmitted by the alert system 200 or by a monitoring service that is in communication with the alert system 200.
It is also contemplated that the alert system 200 may transmit notification signals operable to initiate automatic braking or collision avoidance within the approaching vehicles 110. For instance, the notification signals may be used to provide automatic braking within approaching vehicles 110 that are approaching within a predetermined velocity or distance to the alert beacon 202, service vehicle 102, or distressed vehicle 104. Or, the notification signal may be used to steer the approaching vehicle 110 away from the alert beacon 202, service vehicle 102, or distressed vehicle 104.
The alert system 200 may further be operable to use external network 222 to connect with a roadside billboard or municipal notification system to provide additional alerts to approaching vehicles 110. For instance, many roadside billboards are now equipped as video electronic displays. The alert system 200 may be operable to connect with such billboards (either directly or through a notification service) using the external network 222 so that information may be provided to approaching vehicles 110. Many cities are also equipped with electronic signage that may be used to alert the approaching vehicles 110 about current traffic conditions. These electronic signs may also be used by the alert system 200 to notify approaching vehicles 110 about the location of the service vehicle 102, distressed vehicle 104, or the alert beacon 202.
The alert system 200 may also be operable to connect, using external network 222, with a mobile device worn by the service technician. For instance, the alert system 200 may include a mobile software application that may be downloaded onto a mobile device (e.g., an app available for Apple or Android smart phones). The mobile software application may employ the audible or visual alert capabilities of the mobile device to alert the service technician when it is determined that the velocity of an approaching vehicle 110 is above a predetermined threshold or an approaching vehicle 110 is within a predetermined distance.
The alert system 200 may be integrated to operatively use sensors or alert systems located within a service vehicle 102. Or, the alert system 200 may integrate, or alternatively rely on, sensors located within a distressed vehicle 104. For instance, the distressed vehicle 104 may include functionality that allows the service technician to connect the alert system 200 to sensors (e.g., LiDAR, cameras) positioned within the distressed vehicle 104. The sensors located within the distressed vehicle 104 may then be used by the alert system 200 to further detect and provide alerts about approaching vehicles 110 or objects.
The alert system 200 may also transmit to external network 222 data indicative of traffic patterns surrounding the distressed vehicle 104. Or the alert system 200 may transmit instructions requesting re-routing of traffic away from the distressed vehicle 104. The data and instructions may be provided to mapping software providers (e.g., Google or Waze) so that approaching vehicles 110 may be informed and/or re-routed away from the distressed vehicle 104. For instance, the alert system 200 may request that approaching vehicles 110 be re-routed a given distance (e.g., ½-mile) away from distressed vehicle 104.
It is further contemplated that the area surrounding the distressed vehicle 104 may have moveable traffic flow devices. For instance, certain roadways include lane diversion systems that permit an additional or alternative traffic lane. The alert system 200 may activate and use this additional or alternative traffic lane to re-route approaching vehicles 110 away from the distressed vehicle 104 to provide a safe working environment for the service technician.
The alert system 200 may also be designed to receive information regarding the location where the distressed vehicle 104 is located. For instance, the distressed vehicle 104 may be located in a highly traversed area, an area that includes visual obstructions for approaching vehicles 110 (e.g., bridges, bushes), or a location that does not include suitable space to service the distressed vehicle 104 (e.g., an area with a small or no shoulder). The alert system 200 may be operable to evaluate and determine if the distressed vehicle 104 is located at an area that is unsafe for the service technician. The alert system 200 may be operable to alert the distressed vehicle 104 to proceed to a different location prior to being serviced.
It is also contemplated that the alert system 200 may operably receive from external network 222 data from local weather services about pending weather conditions surrounding the distressed vehicle 104. If the alert system 200 determines that the received weather data indicates an increased potential for accidents with approaching vehicles 110, additional safety measures may be employed. For instance, if the alert system 200 receives data about a severe snowstorm or icy road conditions around the distressed vehicle 104, the alert system 200 may require increased coverage by the alert beacons 202 surrounding the distressed vehicle 104. The radius and number of the alert beacons 202 may also be increased to ensure the alert system 200 can provide advance warnings to the service technician. The alert system 200 may also operably employ a machine learning algorithm so that the service vehicle 102 could access telematics data to detect any deterioration in the alert beacons 202 that would lead to a breakdown or equipment failure.
It is further contemplated that the alert system 200 may implement a facial recognition algorithm, blockchain algorithm, optical character recognition (OCR), or image recognition for tracking and detecting potential misplacement or theft of any one of the alert beacons 202. For instance, an alert beacon 202 may be taken from the roadside or from the back of a service vehicle 102. Using the network interface 220, the processor 204 may transmit digital images acquired by the camera 214. A facial recognition algorithm may be employed by the processor 204 to identify the individual responsible for taking the alert beacon 202. Also, the processor 204 may employ GPS data from GPS 215 to determine and transmit the location of the alert beacon 202 for retrieval by authorities.
The processor 204 may also employ the camera 214 to acquire images of the license plates of oncoming vehicles 110. The alert system 200 may use the external network 222 to communicate with an external server (e.g., a police database) or emergency services when it is determined that an acquired license plate belongs to a stolen or missing vehicle. The alert system 200 may detect a stolen or missing vehicle using the image acquired by the camera 214. The alert system 200 may send a notification (using the external network 222) to the local authorities (e.g., the police department) with the location where the stolen or missing vehicle was identified. Should the alert system 200 be unable to capture license plates, it may still capture images of vehicles and use object/color detection to determine the make, model, and color of the stolen or missing vehicle.
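The plate-matching step above can be sketched as a simple lookup against a set of reported plates. The `normalize_plate` and `check_plate` helpers and the notification payload shape are illustrative assumptions; the disclosure does not specify how the comparison or the notification is implemented:

```python
def normalize_plate(raw):
    """Collapse an OCR'd plate string to uppercase alphanumerics only."""
    return "".join(ch for ch in raw.upper() if ch.isalnum())

def check_plate(raw_plate, reported_plates, beacon_location):
    """Return a notification payload if the plate matches a reported vehicle, else None.

    reported_plates is assumed to be a set of already-normalized plate strings,
    e.g., synced from an external server such as a police database.
    """
    plate = normalize_plate(raw_plate)
    if plate in reported_plates:
        return {
            "plate": plate,
            "location": beacon_location,   # e.g., (lat, lon) from GPS 215
            "action": "notify_authorities",
        }
    return None
```

In this sketch, OCR noise such as dashes or lowercase letters ("abc-1234") still matches a reported plate "ABC1234".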
The LiDAR sensor 210, radar sensor 212, camera 214, and GPS 215 may also be used to create a surface or topographical map of the area where the distressed vehicle 104 is situated. The surface/topographical map may be used by the alert system 200 to detect hazardous road conditions or obstacles. The alert system 200 may then provide alerts to the service technician if a road condition or obstacle may present a dangerous work environment. For instance, the surface map may indicate that a large pothole exists near the distressed vehicle 104. The alert system 200 may provide an audible or visual warning to the service technician about the pothole. The service technician may then use the alert to add additional alert beacons 202 around the service vehicle 102 or distressed vehicle 104 to ensure that approaching vehicles 110 avoid the obstacle (e.g., the pothole).
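One simple way such a surface map could be screened for potholes is to flag grid cells that dip below the median surface height by more than a depth threshold. The grid representation, threshold value, and `find_hazards` helper below are assumptions for illustration; the disclosure does not specify the detection method:

```python
from statistics import median

def find_hazards(height_grid_cm, depth_threshold_cm=5.0):
    """Flag (row, col) cells that dip below the median surface by more than the threshold.

    height_grid_cm is assumed to be a 2-D list of surface heights (cm) built
    from fused LiDAR/radar/camera data, with 0 representing nominal road level.
    """
    baseline = median(h for row in height_grid_cm for h in row)
    return [
        (r, c)
        for r, row in enumerate(height_grid_cm)
        for c, h in enumerate(row)
        if baseline - h > depth_threshold_cm
    ]
```

A cell 8 cm below the surrounding surface would be flagged as a candidate pothole, while ordinary millimeter-scale texture would not.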
The alert system 200 may also be operable to store the locations, topographical data, and weather conditions within the memory 208 when servicing a distressed vehicle 104. The alert system 200 may use this information to generate analytical data about common locations where a distressed vehicle 104 requires service. If a given location routinely involves a distressed vehicle 104 requiring service, the alert system 200 may notify local authorities. The alert system 200 may also provide local authorities with data regarding potential reasons why there are increased numbers of distressed vehicles 104 in a given location. For instance, the alert system 200 may be operable to assess analytical data that includes topographical data, satellite images, or surface maps acquired from the LiDAR sensor 210, radar sensor 212, camera 214, or GPS 215 to determine that a given location includes several large potholes. The alert system 200 may be operable to transmit the analytical data using the network interface 220. The analytical data may be received by local authorities that can use the information to repair the potholes.
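The hotspot analysis described above can be sketched as bucketing stored service events by rounded GPS coordinates and reporting locations that recur. The `service_hotspots` helper, the coordinate precision, and the count threshold are illustrative assumptions, not part of the disclosure:

```python
from collections import Counter

def service_hotspots(events, min_count=3, precision=3):
    """Group service events by rounded (lat, lon) and return locations that
    recur at least min_count times, mapped to their event counts.

    events is assumed to be an iterable of (lat, lon) pairs recalled from
    memory 208; precision=3 buckets coordinates to roughly 100 m.
    """
    buckets = Counter(
        (round(lat, precision), round(lon, precision)) for lat, lon in events
    )
    return {loc: n for loc, n in buckets.items() if n >= min_count}
```

A location that produced three or more service calls would then appear in the result and could be forwarded to local authorities with the associated surface-map data.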
The alert system 200 may further employ a microphone (e.g., within the camera 214) to record audio while a service technician is servicing a distressed vehicle 104. Voice analytics may then be applied to the recording to determine the satisfaction of the customer while the distressed vehicle 104 is being serviced. If the alert system 200 determines that customer satisfaction is positive, the alert system 200 may be enabled to provide a post to a social networking website (e.g., LinkedIn or Facebook) about the service technician and the work performed. Also, the alert system 200 may further be enabled to track the response time and the time required to service a distressed vehicle 104. Again, the alert system 200 may then be operable to post updates to social networking websites about the response or service times. Or, the time data may be used to inform another potential customer about their expected wait time.
It is further contemplated that occupants of the distressed vehicle 104 may be able to complete an application process that is accessible by the alert system 200 using the external network 222. The application process may be part of an enrollment system with an insurance agent (e.g., AAA of Michigan). The application process may include emergency contact information. The alert system 200 may be operable to provide alerts to the emergency contacts when the alert system 200 is deployed for the occupants of the distressed vehicle 104.
While exemplary embodiments are described above, it is not intended that these embodiments describe all possible forms of the invention. Rather, the words used in the specification are words of description rather than limitation, and it is understood that various changes may be made without departing from the spirit and scope of the invention. Additionally, the features of various implementing embodiments may be combined to form further embodiments of the invention.
This application is a continuation of U.S. application Ser. No. 16/878,272, filed May 19, 2020, now U.S. Pat. No. 11,508,239, issued Nov. 22, 2022, the disclosure of which is hereby incorporated in its entirety by reference herein.
U.S. Patent Documents

Number | Name | Date | Kind |
---|---|---|---|
5696502 | Busch et al. | Dec 1997 | A |
5729215 | Jutras | Mar 1998 | A |
5760686 | Toman | Jun 1998 | A |
5767954 | Laakmann | Jun 1998 | A |
6288651 | Souza | Sep 2001 | B1 |
6559774 | Bergan et al. | May 2003 | B2 |
7030777 | Nelson | Apr 2006 | B1 |
8237555 | McCarthy | Aug 2012 | B2 |
9489841 | Huggins | Nov 2016 | B1 |
9792820 | Russell | Oct 2017 | B1 |
10304308 | Mujeeb | May 2019 | B2 |
10319227 | Roy | Jun 2019 | B2 |
10480731 | Liu et al. | Nov 2019 | B2 |
10783776 | Hassani et al. | Sep 2020 | B2 |
10843626 | Yu | Nov 2020 | B2 |
11145192 | Wright | Oct 2021 | B1 |
11214287 | Lin et al. | Jan 2022 | B2 |
11238726 | Isaacs | Feb 2022 | B2 |
20120126996 | McCarthy | May 2012 | A1 |
20140035737 | Rashid et al. | Feb 2014 | A1 |
20150054660 | Simmons | Feb 2015 | A1 |
20160232410 | Kelly | Aug 2016 | A1 |
20160236638 | Lavie et al. | Aug 2016 | A1 |
20170316691 | Miller et al. | Nov 2017 | A1 |
20180347752 | Costello et al. | Dec 2018 | A1 |
Foreign Patent Documents

Number | Date | Country |
---|---|---|
110941223 | Mar 2020 | CN |
2016192145 | Nov 2016 | JP |
20170084465 | Jul 2017 | KR |
2020190988 | Sep 2020 | WO |
Other Publications

Entry |
---|
International Search Report of International Application No. PCT/US2021/033219 dated Sep. 8, 2021, 3 pages. |
Related Publications

Number | Date | Country |
---|---|---|
20230037925 A1 | Feb 2023 | US |
Related U.S. Application Data

Relation | Number | Date | Country |
---|---|---|---|
Parent | 16878272 | May 2020 | US |
Child | 17966459 | | US |