This disclosure relates to communication systems and, more particularly, to communication systems for use in autonomous vehicles.
As transportation moves towards autonomous (i.e., driverless) vehicles, the manufacturers and designers of these autonomous vehicles must address issues that are not a concern in traditional vehicles.
For example and with respect to hired vehicles, such hired vehicles are driven by a driver to whom riders in the vehicle may speak. Accordingly, the rider in the vehicle may ask the driver questions, change their destination, ask the driver to make an unexpected stop midway through a trip, etc. However and with respect to autonomous vehicles, such discussions are difficult to have, as there is no human being with whom the rider may speak.
In one implementation, a computer-implemented method is executed on a computing device and includes: monitoring the interior of an autonomous vehicle for information being provided by a rider of the autonomous vehicle to the autonomous vehicle; and processing the information provided by the rider of the autonomous vehicle to the autonomous vehicle to generate a response based, at least in part, upon the information provided by the rider of the autonomous vehicle.
One or more of the following features may be included. Monitoring the interior of an autonomous vehicle for information being provided by a rider of the autonomous vehicle to the autonomous vehicle may include: monitoring the interior of the autonomous vehicle for verbal information being provided by the rider of the autonomous vehicle to the autonomous vehicle; wherein the interior of the autonomous vehicle includes a microphone assembly for obtaining the verbal information provided by the rider of the autonomous vehicle. Monitoring the interior of an autonomous vehicle for information being provided by a rider of the autonomous vehicle to the autonomous vehicle may include: monitoring the interior of the autonomous vehicle for non-verbal information being provided by the rider of the autonomous vehicle to the autonomous vehicle; wherein the interior of the autonomous vehicle includes an input device for obtaining the non-verbal information provided by the rider of the autonomous vehicle. The non-verbal information may include one or more of: text-based information; and encoded information. The response may be provided to the rider of the autonomous vehicle. Providing the response to the rider of the autonomous vehicle may include: providing a verbal response to the rider of the autonomous vehicle; wherein the interior of the autonomous vehicle includes a speaker assembly for providing the verbal response to the rider of the autonomous vehicle. Providing the response to the rider of the autonomous vehicle may include: providing a non-verbal response to the rider of the autonomous vehicle; wherein the interior of the autonomous vehicle includes an output assembly for providing the non-verbal response to the rider of the autonomous vehicle. The non-verbal response may include one or more of: a text-based response; and a light-based response. The information provided by the rider of the autonomous vehicle may be verbal information. Processing the information provided by the rider of the autonomous vehicle to the autonomous vehicle to generate a response based, at least in part, upon the information provided by the rider of the autonomous vehicle may include: processing the verbal information provided by the rider of the autonomous vehicle to the autonomous vehicle using Natural Language Understanding (NLU) to generate a response based, at least in part, upon the verbal information provided by the rider of the autonomous vehicle.
In another implementation, a computer program product resides on a computer readable medium and has a plurality of instructions stored on it. When executed by a processor, the instructions cause the processor to perform operations including: monitoring the interior of an autonomous vehicle for information being provided by a rider of the autonomous vehicle to the autonomous vehicle; and processing the information provided by the rider of the autonomous vehicle to the autonomous vehicle to generate a response based, at least in part, upon the information provided by the rider of the autonomous vehicle.
One or more of the following features may be included. Monitoring the interior of an autonomous vehicle for information being provided by a rider of the autonomous vehicle to the autonomous vehicle may include: monitoring the interior of the autonomous vehicle for verbal information being provided by the rider of the autonomous vehicle to the autonomous vehicle; wherein the interior of the autonomous vehicle includes a microphone assembly for obtaining the verbal information provided by the rider of the autonomous vehicle. Monitoring the interior of an autonomous vehicle for information being provided by a rider of the autonomous vehicle to the autonomous vehicle may include: monitoring the interior of the autonomous vehicle for non-verbal information being provided by the rider of the autonomous vehicle to the autonomous vehicle; wherein the interior of the autonomous vehicle includes an input device for obtaining the non-verbal information provided by the rider of the autonomous vehicle. The non-verbal information may include one or more of: text-based information; and encoded information. The response may be provided to the rider of the autonomous vehicle. Providing the response to the rider of the autonomous vehicle may include: providing a verbal response to the rider of the autonomous vehicle; wherein the interior of the autonomous vehicle includes a speaker assembly for providing the verbal response to the rider of the autonomous vehicle. Providing the response to the rider of the autonomous vehicle may include: providing a non-verbal response to the rider of the autonomous vehicle; wherein the interior of the autonomous vehicle includes an output assembly for providing the non-verbal response to the rider of the autonomous vehicle. The non-verbal response may include one or more of: a text-based response; and a light-based response. The information provided by the rider of the autonomous vehicle may be verbal information. Processing the information provided by the rider of the autonomous vehicle to the autonomous vehicle to generate a response based, at least in part, upon the information provided by the rider of the autonomous vehicle may include: processing the verbal information provided by the rider of the autonomous vehicle to the autonomous vehicle using Natural Language Understanding (NLU) to generate a response based, at least in part, upon the verbal information provided by the rider of the autonomous vehicle.
In another implementation, a computing system includes a processor and memory configured to perform operations including: monitoring the interior of an autonomous vehicle for information being provided by a rider of the autonomous vehicle to the autonomous vehicle; and processing the information provided by the rider of the autonomous vehicle to the autonomous vehicle to generate a response based, at least in part, upon the information provided by the rider of the autonomous vehicle.
One or more of the following features may be included. Monitoring the interior of an autonomous vehicle for information being provided by a rider of the autonomous vehicle to the autonomous vehicle may include: monitoring the interior of the autonomous vehicle for verbal information being provided by the rider of the autonomous vehicle to the autonomous vehicle; wherein the interior of the autonomous vehicle includes a microphone assembly for obtaining the verbal information provided by the rider of the autonomous vehicle. Monitoring the interior of an autonomous vehicle for information being provided by a rider of the autonomous vehicle to the autonomous vehicle may include: monitoring the interior of the autonomous vehicle for non-verbal information being provided by the rider of the autonomous vehicle to the autonomous vehicle; wherein the interior of the autonomous vehicle includes an input device for obtaining the non-verbal information provided by the rider of the autonomous vehicle. The non-verbal information may include one or more of: text-based information; and encoded information. The response may be provided to the rider of the autonomous vehicle. Providing the response to the rider of the autonomous vehicle may include: providing a verbal response to the rider of the autonomous vehicle; wherein the interior of the autonomous vehicle includes a speaker assembly for providing the verbal response to the rider of the autonomous vehicle. Providing the response to the rider of the autonomous vehicle may include: providing a non-verbal response to the rider of the autonomous vehicle; wherein the interior of the autonomous vehicle includes an output assembly for providing the non-verbal response to the rider of the autonomous vehicle. The non-verbal response may include one or more of: a text-based response; and a light-based response. The information provided by the rider of the autonomous vehicle may be verbal information. Processing the information provided by the rider of the autonomous vehicle to the autonomous vehicle to generate a response based, at least in part, upon the information provided by the rider of the autonomous vehicle may include: processing the verbal information provided by the rider of the autonomous vehicle to the autonomous vehicle using Natural Language Understanding (NLU) to generate a response based, at least in part, upon the verbal information provided by the rider of the autonomous vehicle.
The details of one or more implementations are set forth in the accompanying drawings and the description below. Other features and advantages will become apparent from the description, the drawings, and the claims.
Like reference symbols in the various drawings indicate like elements.
Autonomous Vehicle Overview
Referring to
Autonomous vehicle 10 may include a plurality of sensors (e.g. sensors 12), a plurality of electronic control units (e.g. ECUs 14) and a plurality of actuators (e.g. actuators 16). Accordingly, sensors 12 within autonomous vehicle 10 may monitor the environment in which autonomous vehicle 10 is operating, wherein sensors 12 may provide sensor data 18 to ECUs 14. ECUs 14 may process sensor data 18 to determine the manner in which autonomous vehicle 10 should move. ECUs 14 may then provide control data 20 to actuators 16 so that autonomous vehicle 10 may move in the manner decided by ECUs 14. For example, a machine vision sensor included within sensors 12 may “read” a speed limit sign stating that the speed limit on the road on which autonomous vehicle 10 is traveling is now 35 miles an hour. This machine vision sensor included within sensors 12 may provide sensor data 18 to ECUs 14 indicating that the speed limit on the road on which autonomous vehicle 10 is traveling is now 35 mph. Upon receiving sensor data 18, ECUs 14 may process sensor data 18 and may determine that autonomous vehicle 10 (which is currently traveling at 45 mph) is traveling too fast and needs to slow down. Accordingly, ECUs 14 may provide control data 20 to actuators 16, wherein control data 20 may e.g. apply the brakes of autonomous vehicle 10 or eliminate any actuation signal currently being applied to the accelerator (thus allowing autonomous vehicle 10 to coast until the speed of autonomous vehicle 10 is reduced to 35 mph).
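By way of a non-limiting illustration only, the following Python sketch models this sensor-to-ECU-to-actuator flow using the speed-limit example above; the names (SensorData, ControlData, ecu_decide) are hypothetical stand-ins and are not part of autonomous vehicle 10.

```python
from dataclasses import dataclass

@dataclass
class SensorData:
    """Hypothetical reading from a machine vision sensor (sensors 12)."""
    posted_speed_limit_mph: float

@dataclass
class ControlData:
    """Hypothetical actuator command (control data 20)."""
    brake: bool
    accelerator: float  # 0.0 releases the accelerator so the vehicle may coast

def ecu_decide(current_speed_mph: float, reading: SensorData) -> ControlData:
    """Sketch of ECUs 14 comparing vehicle speed to the posted speed limit."""
    if current_speed_mph > reading.posted_speed_limit_mph:
        # Too fast: release the accelerator (coast) rather than hold throttle.
        return ControlData(brake=False, accelerator=0.0)
    return ControlData(brake=False, accelerator=0.3)  # maintain modest throttle

# Example from the text: vehicle at 45 mph reads a 35 mph speed limit sign.
command = ecu_decide(45.0, SensorData(posted_speed_limit_mph=35.0))
print(command)  # ControlData(brake=False, accelerator=0.0)
```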
System Redundancy
As would be imagined, since autonomous vehicle 10 is being controlled by the various electronic systems included therein (e.g. sensors 12, ECUs 14 and actuators 16), the potential failure of one or more of these systems should be considered when designing autonomous vehicle 10 and appropriate contingency plans may be employed.
For example and referring also to
Autonomy control unit 50 may be configured to perform various functions. For example, autonomy control unit 50 may receive and process exteroceptive sensor data (e.g., sensor data 18), may estimate the position of autonomous vehicle 10 within its operating environment, may calculate a representation of the surroundings of autonomous vehicle 10, may compute safe trajectories for autonomous vehicle 10, and may command the other ECUs (in particular, a vehicle control unit) to cause autonomous vehicle 10 to execute a desired maneuver. Autonomy control unit 50 may include substantial compute power, persistent storage, and memory.
Accordingly, autonomy control unit 50 may process sensor data 18 to determine the manner in which autonomous vehicle 10 should be operating. Autonomy control unit 50 may then provide vehicle control data 52 to vehicle control unit 54, wherein vehicle control unit 54 may then process vehicle control data 52 to determine the manner in which the individual control systems (e.g. powertrain system 56, braking system 58 and steering system 60) should respond in order to achieve the trajectory defined by autonomy control unit 50 within vehicle control data 52.
Vehicle control unit 54 may be configured to control other ECUs included within autonomous vehicle 10. For example, vehicle control unit 54 may control the steering, powertrain, and brake controller units. For example, vehicle control unit 54 may provide: powertrain control signal 62 to powertrain control unit 64; braking control signal 66 to braking control unit 68; and steering control signal 70 to steering control unit 72.
Powertrain control unit 64 may process powertrain control signal 62 so that the appropriate control data (commonly represented by control data 20) may be provided to powertrain system 56. Additionally, braking control unit 68 may process braking control signal 66 so that the appropriate control data (commonly represented by control data 20) may be provided to braking system 58. Further, steering control unit 72 may process steering control signal 70 so that the appropriate control data (commonly represented by control data 20) may be provided to steering system 60.
Powertrain control unit 64 may be configured to control the transmission (not shown) and engine/traction motor (not shown) within autonomous vehicle 10; while braking control unit 68 may be configured to control the mechanical/regenerative braking system (not shown) within autonomous vehicle 10; and steering control unit 72 may be configured to control the steering column/steering rack (not shown) within autonomous vehicle 10.
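A minimal sketch of this command fan-out is shown below, assuming a simplified, hypothetical trajectory format (target_speed_mph, curvature); in practice the ECUs would exchange these signals over an in-vehicle network rather than function calls.

```python
class VehicleControlUnit:
    """Sketch of vehicle control unit 54 translating a trajectory into
    per-subsystem control signals (powertrain, braking, steering)."""

    def dispatch(self, vehicle_control_data: dict) -> dict:
        # Hypothetical trajectory fields: target_speed_mph, curvature.
        target_speed = vehicle_control_data["target_speed_mph"]
        curvature = vehicle_control_data["curvature"]
        return {
            "powertrain_control_signal": {"throttle": 0.2 if target_speed > 0 else 0.0},
            "braking_control_signal": {"brake_pressure": 0.0 if target_speed > 0 else 0.5},
            "steering_control_signal": {"steering_angle_deg": curvature * 10.0},
        }

vcu = VehicleControlUnit()
print(vcu.dispatch({"target_speed_mph": 35.0, "curvature": 0.1}))
```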
Autonomy control unit 50 may be a highly complex computing system that may provide extensive processing capabilities (e.g., a workstation-class computing system with multi-core processors, discrete co-processing units, gigabytes of memory, and persistent storage). In contrast, vehicle control unit 54 may be a much simpler device that may provide processing power equivalent to the other ECUs included within autonomous vehicle 10 (e.g., a computing system having a modest microprocessor (with a CPU frequency of less than 200 megahertz), less than 1 megabyte of system memory, and no persistent storage). Due to these simpler designs, vehicle control unit 54 may have greater reliability and durability than autonomy control unit 50.
To further enhance redundancy and reliability, one or more of the ECUs (ECUs 14) included within autonomous vehicle 10 may be configured in a redundant fashion. For example and referring also to
In this particular configuration, the two vehicle control units (e.g. vehicle control units 54, 74) may be configured in various ways. For example, the two vehicle control units (e.g. vehicle control units 54, 74) may be configured in an active-passive configuration, wherein e.g. vehicle control unit 54 performs the active role of processing vehicle control data 52 while vehicle control unit 74 assumes a passive role and is essentially in standby mode. In the event of a failure of vehicle control unit 54, vehicle control unit 74 may transition from a passive role to an active role and assume the role of processing vehicle control data 52. Alternatively, the two vehicle control units (e.g. vehicle control units 54, 74) may be configured in an active-active configuration, wherein e.g. both vehicle control unit 54 and vehicle control unit 74 perform the active role of processing vehicle control data 52 (e.g. divvying up the workload), wherein in the event of a failure of either vehicle control unit 54 or vehicle control unit 74, the surviving vehicle control unit may process all of vehicle control data 52.
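The following sketch illustrates the active-passive arrangement described above, with hypothetical stand-ins for vehicle control units 54 and 74; failover is modeled as a simple exception handler for illustration only.

```python
class RedundantVehicleControl:
    """Sketch of an active-passive pair of vehicle control units.
    The passive unit takes over only if the active unit fails."""

    def __init__(self, primary, secondary):
        self.primary = primary
        self.secondary = secondary

    def process(self, vehicle_control_data):
        try:
            return self.primary(vehicle_control_data)  # active unit
        except Exception:
            # Failover: the standby unit assumes the active role.
            return self.secondary(vehicle_control_data)

# Hypothetical stand-ins for vehicle control units 54 and 74.
unit_54 = lambda data: {"handled_by": "vehicle control unit 54", **data}
unit_74 = lambda data: {"handled_by": "vehicle control unit 74", **data}

pair = RedundantVehicleControl(unit_54, unit_74)
print(pair.process({"target_speed_mph": 35.0}))
```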
While
Autonomy Computational Subsystems
Referring also to
For example, one or more of ECUs 14 may be configured to effectuate/form perception subsystem 100, wherein perception subsystem 100 may be configured to process data from onboard sensors (e.g., sensor data 18) to calculate concise representations of objects of interest near autonomous vehicle 10 (examples of which may include but are not limited to other vehicles, pedestrians, traffic signals, traffic signs, road markers, hazards, etc.) and to identify environmental features that may assist in determining the location of autonomous vehicle 10. Further, one or more of ECUs 14 may be configured to effectuate/form state estimation subsystem 102, wherein state estimation subsystem 102 may be configured to process data from onboard sensors (e.g., sensor data 18) to estimate the position, orientation, and velocity of autonomous vehicle 10 within its operating environment. Additionally, one or more of ECUs 14 may be configured to effectuate/form planning subsystem 104, wherein planning subsystem 104 may be configured to calculate a desired vehicle trajectory (using perception output 106 and state estimation output 108). Further still, one or more of ECUs 14 may be configured to effectuate/form trajectory control subsystem 110, wherein trajectory control subsystem 110 may use planning output 112 and state estimation output 108 (in conjunction with feedback and/or feedforward control techniques) to calculate actuator commands (e.g., control data 20) that may cause autonomous vehicle 10 to execute its intended trajectory within its operating environment.
For redundancy purposes, the above-described subsystems may be distributed across various devices (e.g., autonomy control unit 50 and vehicle control units 54, 74). Additionally/alternatively and due to the increased computational requirements, perception subsystem 100 and planning subsystem 104 may be located almost entirely within autonomy control unit 50, which (as discussed above) has much more computational horsepower than vehicle control units 54, 74. Conversely and due to their lower computational requirements, state estimation subsystem 102 and trajectory control subsystem 110 may be: located entirely on vehicle control units 54, 74 if vehicle control units 54, 74 have the requisite computational capacity; and/or located partially on vehicle control units 54, 74 and partially on autonomy control unit 50. However, the location of state estimation subsystem 102 and trajectory control subsystem 110 may be of critical importance in the design of any contingency planning architecture, as the location of these subsystems may determine how contingency plans are calculated, transmitted, and/or executed.
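The sketch below strings the four subsystems together as plain functions, assuming toy inputs and outputs; the field names are hypothetical, and the real subsystems would run on autonomy control unit 50 and/or vehicle control units 54, 74 as described above.

```python
def perception(sensor_data):
    """Sketch of perception subsystem 100: objects of interest near the vehicle."""
    return {"objects": ["pedestrian", "traffic signal"]}

def state_estimation(sensor_data):
    """Sketch of state estimation subsystem 102: position, orientation, velocity."""
    return {"position": (10.0, 2.5), "heading_deg": 90.0, "speed_mph": 35.0}

def planning(perception_out, state_out):
    """Sketch of planning subsystem 104: desired vehicle trajectory."""
    return {"waypoints": [(12.0, 2.5), (14.0, 2.5)], "target_speed_mph": 30.0}

def trajectory_control(planning_out, state_out):
    """Sketch of trajectory control subsystem 110: actuator commands (control data 20)."""
    slow_down = planning_out["target_speed_mph"] < state_out["speed_mph"]
    return {"throttle": 0.0 if slow_down else 0.2, "brake": 0.1 if slow_down else 0.0}

sensor_data = {}
p = perception(sensor_data)
s = state_estimation(sensor_data)
plan = planning(p, s)
print(trajectory_control(plan, s))
```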
Referring also to
Specifically and as will be discussed below:
Referring also to
The instruction sets and subroutines of communication process 250, which may be stored on storage device 114 (see
With respect to the above-described systems/devices that enable a rider (e.g., rider 200) to communicate with autonomous vehicle 10, these various systems/devices may be configured to provide different types of functionality, as discussed below.
Ride Request
A rider (e.g., rider 200) may sign in to a mobile or web application (not shown) wherein the rider (e.g., rider 200) may have a stored user profile that contains their contact information. Through the use of such an application, the rider (e.g., rider 200) may request a ride from an autonomous vehicle (e.g., autonomous vehicle 10) at the current time (or may schedule a ride for some time in the future). At the time the ride is requested, details such as the desired pick-up date and time, the pick-up and drop-off locations, and the number of seats required may be specified. These ride details may be encoded into a digital ticket (e.g., digital ticket 214) that may be stored in a mobile application (e.g., a smartphone application) for future use. Once generated, the ride request may be sent to a dispatch system (not shown). This dispatch system may select an autonomous vehicle (e.g., autonomous vehicle 10) and (via a telecommunications link) may remotely command the autonomous vehicle (e.g., autonomous vehicle 10) to drive to the appropriate pickup location. A human operator may optionally be involved in the autonomous vehicle's dispatch, either by simply authorizing the pre-selected dispatch or by manually performing other computations/decisions/commands required to dispatch the autonomous vehicle (e.g., autonomous vehicle 10).
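As an illustrative sketch only, the ride-request flow described above could be modeled as follows; the ticket fields and the trivial vehicle-selection policy are assumptions made for the example, not details of the actual dispatch system.

```python
import json
import uuid
from datetime import datetime

def create_digital_ticket(rider_id: str, pickup: str, dropoff: str, seats: int) -> dict:
    """Sketch of encoding ride details into a digital ticket (digital ticket 214)."""
    return {
        "ticket_id": str(uuid.uuid4()),
        "rider_id": rider_id,
        "pickup_location": pickup,
        "dropoff_location": dropoff,
        "seats": seats,
        "requested_at": datetime.utcnow().isoformat(),
    }

def dispatch(ticket: dict, available_vehicles: list) -> dict:
    """Sketch of a dispatch system selecting a vehicle and commanding it to the pickup."""
    vehicle = available_vehicles[0]  # trivial selection policy for illustration
    return {"vehicle_id": vehicle, "drive_to": ticket["pickup_location"]}

ticket = create_digital_ticket("rider_200", "5 Main St", "20 Oak Ave", seats=2)
print(json.dumps(ticket, indent=2))
print(dispatch(ticket, available_vehicles=["autonomous_vehicle_10"]))
```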
Ticket Check
Once the autonomous vehicle (e.g., autonomous vehicle 10) arrives at the requested pickup location, the rider (e.g., rider 200) may be prompted (e.g., via verbal prompting or text-based prompting) to produce their digital ticket (e.g., digital ticket 214) in order to begin their ride. The digital ticket (e.g., digital ticket 214) may be required to enter autonomous vehicle 10, whereby the vehicle doors are unlocked only when a valid digital ticket (e.g., digital ticket 214) is presented. Alternatively, a valid digital ticket (e.g., digital ticket 214) may be required after the rider (e.g., rider 200) has entered the autonomous vehicle (e.g., autonomous vehicle 10) in order to begin the trip. Ride details may be encoded in the digital ticket (e.g., digital ticket 214) and may be sent to the autonomous vehicle (e.g., autonomous vehicle 10) at the time of the ride via a visual code (e.g., QR code or other 1D or 2D barcode) scannable via scanner assembly 210 (e.g., an optical scanner assembly, near-field communication (NFC), radio-frequency identification (RFID), wi-fi, Bluetooth, or other short-range communication technologies). The dispatch system may then utilize the information contained in the digital ticket (e.g., digital ticket 214) in order to fulfill the trip.
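Purely as a sketch of the idea, a digital ticket could be serialized into a scannable payload and validated before the trip begins, as shown below; the encoding scheme and the validation rule are assumptions made for illustration only.

```python
import base64
import json

def encode_ticket_for_qr(ticket: dict) -> str:
    """Sketch: serialize the ticket so it could be rendered as a QR code payload."""
    return base64.b64encode(json.dumps(ticket).encode()).decode()

def validate_scanned_ticket(payload: str, expected_pickup: str) -> bool:
    """Sketch of checking scanner assembly 210 output before the doors unlock."""
    try:
        ticket = json.loads(base64.b64decode(payload))
    except (ValueError, TypeError):
        return False
    return ticket.get("pickup_location") == expected_pickup

payload = encode_ticket_for_qr({"ticket_id": "abc", "pickup_location": "5 Main St"})
print(validate_scanned_ticket(payload, expected_pickup="5 Main St"))  # True
```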
Display Screens
One or more display screens (e.g., contextual display assembly 202) may be mounted inside the autonomous vehicle (e.g., autonomous vehicle 10), wherein these display screens (e.g., contextual display assembly 202) may be configured to provide simple, contextually relevant information/prompts to the rider (e.g., rider 200) throughout the course of the trip. Such information may include text, graphics, animations and/or sounds.
Message Triggers
Messages shown on the display screens (e.g., contextual display assembly 202) may be triggered by various systems including communication process 250, a remote monitoring system and/or a dispatch system. Additionally, such messages shown on the display screens (e.g., contextual display assembly 202) may be triggered by rider actions, remote monitor actions, trip events, system status and/or other events. For example:
Communication process 250 may determine which message is appropriate to display at any given time based on a number of factors, including trip status. For example, different messages may be displayed in response to a seatbelt being unbuckled depending on whether the trip has not yet started, is in progress, or has been completed. For example:
Additionally, messages may be assigned a priority, whereby if multiple messages are triggered, only the highest priority message may be displayed at a given time. Consider the following example in which two messages have been triggered. The vehicle is approaching the drop-off destination, so a first message has been triggered to inform the rider that the trip is nearing completion. Further, a second message has also been triggered because the vehicle is experiencing a fault and is about to pull over. The vehicle fault message may have a higher priority than the approaching destination message, so the display screen (e.g., contextual display assembly 202) may only show the vehicle fault message. The display screen (e.g., contextual display assembly 202) may also include a default (or main) screen (e.g., a map showing the location of autonomous vehicle 10) that is the lowest priority and is only displayed when no other messages have been triggered. Any suppressed messages may either be stored in a queue for later display or not shown at all if they are no longer relevant.
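A minimal sketch of this priority scheme is shown below, assuming lower numbers mean higher priority and that suppressed messages remain queued for later display; the priority values themselves are hypothetical.

```python
import heapq

class MessageQueue:
    """Sketch of priority-based message display: only the highest-priority
    triggered message is shown; lower-priority messages wait in a queue."""

    DEFAULT_SCREEN = "Map view"  # lowest priority, shown when nothing is queued

    def __init__(self):
        self._heap = []  # (priority, message); lower number = higher priority

    def trigger(self, priority: int, message: str):
        heapq.heappush(self._heap, (priority, message))

    def current_display(self) -> str:
        return self._heap[0][1] if self._heap else self.DEFAULT_SCREEN

screen = MessageQueue()
screen.trigger(priority=20, message="Approaching your destination")
screen.trigger(priority=5, message="Vehicle fault detected - pulling over")
print(screen.current_display())  # the higher-priority fault message wins
```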
Lighting
Communication process 250 may render unique light codes that vary by intensity, duration, frequency, color, animation pattern, and light display location. These unique light codes may be rendered on light display assembly 204 and may be assigned to each type of message as a secondary method of conveying information to the rider (e.g., rider 200). Messages shown on the display screen (e.g., contextual display assembly 202) may be accompanied by their uniquely assigned light code rendered on light display assembly 204 in order to subtly inform the rider or to generally enhance the vehicle appearance and setting. For example, light display assembly 204 may include animated strip lighting near the top of the windshield that may be used to indicate that autonomous vehicle 10 is turning (much like a turn signal indicator). Light display assembly 204 may also be configured to accompany messages shown on the display screen (e.g., contextual display assembly 202), such as rapidly blinking lights to remind passengers to buckle their seatbelts.
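For illustration, a light code could be represented as a small record keyed by message type, as in the following sketch; the specific colors, rates, and locations are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LightCode:
    """Sketch of a light code rendered on light display assembly 204."""
    color: str
    intensity: float       # 0.0 - 1.0
    frequency_hz: float    # blink rate
    duration_s: float
    pattern: str           # e.g. "sweep_right", "blink"
    location: str          # e.g. "windshield_strip"

# Hypothetical assignment of light codes to message types.
LIGHT_CODES = {
    "turn_right": LightCode("amber", 0.8, 2.0, 3.0, "sweep_right", "windshield_strip"),
    "buckle_seatbelt": LightCode("red", 1.0, 4.0, 5.0, "blink", "cabin_overhead"),
}
print(LIGHT_CODES["buckle_seatbelt"])
```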
Physical Buttons
Communication process 250 may be configured to interface with one or more physical buttons (e.g., physical button 216) inside autonomous vehicle 10, which may be used by the rider (e.g., rider 200) during the trip to trigger a response from autonomous vehicle 10 (or a remote monitoring system). Further, multiple sets of buttons may be positioned within autonomous vehicle 10 so that passengers throughout autonomous vehicle 10 may have access to a physical button (e.g., physical button 216). For example, if a rider (e.g., rider 200) wishes to stop autonomous vehicle 10 in the middle of a ride, rider 200 may press a physical button (e.g., physical button 216) within autonomous vehicle 10. In response to pressing physical button 216, communication process 250 may send a signal to either autonomous vehicle 10 or a remote monitoring system (not shown) requesting that autonomous vehicle 10 pull over.
Additionally, physical button 216 located in autonomous vehicle 10 may enable a rider to request support. Accordingly, when a rider (e.g., rider 200) presses physical button 216, communication process 250 may send a signal to the remote monitoring system (not shown) requesting support, wherein the remote monitor responsible for monitoring autonomous vehicle 10 may respond by opening an audio call with the rider (e.g., rider 200).
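A sketch of this button handling is shown below; the button identifiers and recipient names are hypothetical, and a real implementation would transmit the request over the vehicle's telecommunications link rather than return a Python dictionary.

```python
def on_button_press(button_id: str) -> dict:
    """Sketch of communication process 250 reacting to a physical button press
    (physical button 216): route the request to the vehicle or a remote monitor."""
    if button_id == "pull_over":
        return {"recipient": "autonomous_vehicle_10", "request": "pull_over"}
    if button_id == "support":
        # The remote monitor may respond by opening an audio call with the rider.
        return {"recipient": "remote_monitoring_system", "request": "open_audio_call"}
    return {"recipient": "remote_monitoring_system", "request": "unknown_button",
            "button_id": button_id}

print(on_button_press("pull_over"))
print(on_button_press("support"))
```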
System for Communicating with Remote Support Personnel
Various audio and visual communication equipment may be present in autonomous vehicle 10, examples of which may include cameras, speakers, and microphones. At any time before, during, or after a trip, such audio and visual communication equipment may be used for communication between the rider (e.g., rider 200) and the remote monitor (e.g., in the case of the support request described above).
Mobile Application
Riders (e.g., rider 200) may also use a mobile application (not shown) on their smartphone (not shown) to interact with autonomous vehicle 10 and/or the remote monitor before, during, and after their trip. For example, the current location of autonomous vehicle 10 may be displayed visually on a map within the mobile application (not shown) on their smartphone (not shown). As a further example, a rider (e.g., rider 200) may press a soft button within the mobile application (not shown) on their smartphone (not shown) to request support, request a pull-over, change their destination, or address other vehicle issues.
The Communication System
Accordingly, communication process 250 may monitor 252 the interior of the autonomous vehicle (e.g., autonomous vehicle 10) for information (e.g., information 218) being provided by the rider (e.g., rider 200) of the autonomous vehicle (e.g., autonomous vehicle 10) to the autonomous vehicle (e.g., autonomous vehicle 10).
Specifically and when monitoring 252 the interior of an autonomous vehicle (e.g., autonomous vehicle 10) for information (e.g., information 218) being provided by the rider (e.g., rider 200) of the autonomous vehicle (e.g., autonomous vehicle 10) to the autonomous vehicle (e.g., autonomous vehicle 10), communication process 250 may monitor 254 the interior of the autonomous vehicle (e.g., autonomous vehicle 10) for verbal information (e.g., information 218) being provided by the rider (e.g., rider 200) of the autonomous vehicle (e.g., autonomous vehicle 10) to the autonomous vehicle (e.g., autonomous vehicle 10).
As discussed above, the interior of the autonomous vehicle (e.g., autonomous vehicle 10) may include a microphone assembly (e.g., microphone assembly 208) for obtaining the verbal information (e.g., information 218) provided by the rider (e.g., rider 200) of the autonomous vehicle (e.g., autonomous vehicle 10). For example, if a rider (e.g., rider 200) of autonomous vehicle 10 is displeased with the condition of autonomous vehicle 10, microphone assembly 208 may be configured to enable rider 200 to provide verbal feedback to autonomous vehicle 10 (or to a person remotely-monitoring the operation of autonomous vehicle 10).
Further and when monitoring 252 the interior of an autonomous vehicle (e.g., autonomous vehicle 10) for information (e.g., information 218) being provided by the rider (e.g., rider 200) of the autonomous vehicle (e.g., autonomous vehicle 10) to the autonomous vehicle (e.g., autonomous vehicle 10), communication process 250 may monitor 256 the interior of the autonomous vehicle (e.g., autonomous vehicle 10) for non-verbal information (e.g., information 218) being provided by the rider (e.g., rider 200) of the autonomous vehicle (e.g., autonomous vehicle 10) to the autonomous vehicle (e.g., autonomous vehicle 10).
As discussed above, the interior of the autonomous vehicle (e.g., autonomous vehicle 10) may include an input device for obtaining the non-verbal information (e.g., information 218) provided by the rider (e.g., rider 200) of the autonomous vehicle (e.g., autonomous vehicle 10). Specifically and as discussed above, the interior of autonomous vehicle 10 may include one or more of scanner assembly 210 and keyboard assembly 212. Accordingly, examples of this non-verbal information (e.g., information 218) may include but are not limited to one or more of: text-based information that is provided to keyboard assembly 212 and encoded information (e.g., QR code information, credit card information, ApplePay™ information, GooglePay™ information or SamsungPay™ information) that is provided to scanner assembly 210.
Once obtained, communication process 250 may process 258 the information (e.g., information 218) provided by the rider (e.g., rider 200) of the autonomous vehicle (e.g., autonomous vehicle 10) to the autonomous vehicle (e.g., autonomous vehicle 10) to generate a response (e.g., response 220) based, at least in part, upon the information (e.g., information 218) provided by the rider (e.g., rider 200) of the autonomous vehicle (e.g., autonomous vehicle 10).
As discussed above, the information provided by the rider (e.g., rider 200) of the autonomous vehicle (e.g., autonomous vehicle 10) may be verbal information (e.g., information 218). Accordingly and when processing 258 the information (e.g., information 218) provided by the rider (e.g., rider 200) of the autonomous vehicle (e.g., autonomous vehicle 10) to the autonomous vehicle (e.g., autonomous vehicle 10) to generate the response (e.g., response 220) based, at least in part, upon the information (e.g., information 218) provided by the rider (e.g., rider 200) of the autonomous vehicle (e.g., autonomous vehicle 10), communication process 250 may process 260 the verbal information (e.g., information 218) provided by the rider (e.g., rider 200) of the autonomous vehicle (e.g., autonomous vehicle 10) to the autonomous vehicle (e.g., autonomous vehicle 10) using Natural Language Understanding (NLU) to generate the response (e.g., response 220) based, at least in part, upon the verbal information (e.g., information 218) provided by the rider (e.g., rider 200) of the autonomous vehicle (e.g., autonomous vehicle 10).
As is known in the art, Natural Language Understanding (NLU) is a branch of artificial intelligence (AI) that uses computer software to understand verbal inputs provided by a user (e.g., rider 200). NLU may directly enable human-computer interaction (HCI), wherein the understanding of natural human language enables computers to understand human-provided commands (without the formalized syntax of computer languages) while enabling these computers to respond to the human in their own language. The field of NLU is an important and challenging subset of natural language processing (NLP). While both understand human language, NLU is tasked with communicating with untrained individuals and understanding their intent, meaning that NLU goes beyond understanding words and actually interprets the meaning of such words. NLU may use algorithms to reduce human speech into a structured ontology, fleshing out such things as intent, timing, locations and sentiments.
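The following toy sketch stands in for such an NLU pipeline, reducing a rider utterance to an intent with slots and mapping that intent to response 220; it uses simple keyword matching purely for illustration, whereas a production system would use a trained NLU model, and the intent names and canned responses are hypothetical.

```python
import re

def understand(utterance: str) -> dict:
    """Toy keyword-based stand-in for NLU: reduce a rider's utterance to a
    structured intent with slots."""
    text = utterance.lower()
    if "stop" in text or "pull over" in text:
        return {"intent": "pull_over", "slots": {}}
    match = re.search(r"go to (.+)", text)
    if match:
        return {"intent": "change_destination", "slots": {"destination": match.group(1)}}
    if "how long" in text or "when" in text:
        return {"intent": "eta_query", "slots": {}}
    return {"intent": "unknown", "slots": {}}

def generate_response(understanding: dict) -> str:
    """Sketch of turning the structured intent into response 220."""
    if understanding["intent"] == "pull_over":
        return "Okay, I will pull over at the next safe location."
    if understanding["intent"] == "change_destination":
        return f"Updating your destination to {understanding['slots']['destination']}."
    if understanding["intent"] == "eta_query":
        return "We are about five minutes from your destination."
    return "I'm sorry, I didn't understand. Would you like to speak with support?"

print(generate_response(understand("Please pull over at the next corner.")))
```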
Once generated, communication process 250 may provide 262 the response (e.g., response 220) to the rider (e.g., rider 200) of the autonomous vehicle (e.g., autonomous vehicle 10).
When providing 262 the response (e.g., response 220) to the rider (e.g., rider 200) of the autonomous vehicle (e.g., autonomous vehicle 10), communication process 250 may provide 264 a verbal response (e.g., response 220) to the rider (e.g., rider 200) of the autonomous vehicle (e.g., autonomous vehicle 10).
As discussed above, the interior of the autonomous vehicle (e.g., autonomous vehicle 10) may include a speaker assembly (e.g., speaker assembly 206) for providing the verbal response to the rider (e.g., rider 200) of the autonomous vehicle (e.g., autonomous vehicle 10). For example, if autonomous vehicle 10 just picked up a rider and is getting ready to depart, speaker assembly 206 may be configured to render an audio-based message that says “Welcome aboard. Please buckle your seatbelt so we can depart.”
Further and when providing 262 the response (e.g., response 220) to the rider (e.g., rider 200) of the autonomous vehicle, communication process 250 may provide 266 a non-verbal response (e.g., response 220) to the rider (e.g., rider 200) of the autonomous vehicle (e.g., autonomous vehicle 10).
Accordingly, the interior of the autonomous vehicle (e.g., autonomous vehicle 10) may include an output assembly (e.g., contextual display assembly 202 and/or light display assembly 204) for providing the non-verbal response to the rider (e.g., rider 200) of the autonomous vehicle (e.g., autonomous vehicle 10). For example, if autonomous vehicle 10 is approaching its intended destination, contextual display assembly 202 may be configured to render a text-based message for rider 200 that says “Arriving at your destination. Please exit from the right side of the vehicle.” Additionally, if autonomous vehicle 10 is going to be turning right at an intersection, light display assembly 204 may be configured to render a right-sweeping image to indicate that the autonomous vehicle (e.g., autonomous vehicle 10) will be turning right.
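As a final illustrative sketch, response 220 could be routed to whichever output assemblies apply, as shown below; the response fields and assembly identifiers are hypothetical labels used only for the example.

```python
def deliver_response(response: dict) -> list:
    """Sketch of routing response 220 to the interior output assemblies:
    speaker assembly 206 (verbal), contextual display assembly 202 (text),
    and light display assembly 204 (light codes)."""
    outputs = []
    if "speech" in response:
        outputs.append(("speaker_assembly_206", response["speech"]))
    if "text" in response:
        outputs.append(("contextual_display_assembly_202", response["text"]))
    if "light_code" in response:
        outputs.append(("light_display_assembly_204", response["light_code"]))
    return outputs

print(deliver_response({
    "text": "Arriving at your destination. Please exit from the right side of the vehicle.",
    "speech": "Arriving at your destination.",
    "light_code": "sweep_right",
}))
```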
General
As will be appreciated by one skilled in the art, the present disclosure may be embodied as a method, a system, or a computer program product. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, the present disclosure may take the form of a computer program product on a computer-usable storage medium having computer-usable program code embodied in the medium.
Any suitable computer usable or computer readable medium may be utilized. The computer-usable or computer-readable medium may be, for example but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. More specific examples (a non-exhaustive list) of the computer-readable medium may include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a transmission media such as those supporting the Internet or an intranet, or a magnetic storage device. The computer-usable or computer-readable medium may also be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory. In the context of this document, a computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The computer-usable medium may include a propagated data signal with the computer-usable program code embodied therewith, either in baseband or as part of a carrier wave. The computer usable program code may be transmitted using any appropriate medium, including but not limited to the Internet, wireline, optical fiber cable, RF, etc.
Computer program code for carrying out operations of the present disclosure may be written in an object oriented programming language such as Java, Smalltalk, C++ or the like. However, the computer program code for carrying out operations of the present disclosure may also be written in conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through a local area network/a wide area network/the Internet (e.g., network 14).
The present disclosure is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, may be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer/special purpose computer/other programmable data processing apparatus, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that may direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures may illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, may be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present disclosure has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the disclosure in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the disclosure. The embodiment was chosen and described in order to best explain the principles of the disclosure and the practical application, and to enable others of ordinary skill in the art to understand the disclosure for various embodiments with various modifications as are suited to the particular use contemplated.
A number of implementations have been described. Having thus described the disclosure of the present application in detail and by reference to embodiments thereof, it will be apparent that modifications and variations are possible without departing from the scope of the disclosure defined in the appended claims.
This application claims the benefit of U.S. Provisional Application No. 62/959,400, filed on 10 Jan. 2020, the entire contents of which are incorporated herein by reference.