USER INTERFACES FOR AUTONOMY STATE CONTROL AND ALERTS

Information

  • Patent Application
  • Publication Number
    20230242161
  • Date Filed
    January 31, 2022
  • Date Published
    August 03, 2023
Abstract
In some scenarios, the approach may be used with two or more vehicles travelling in formation, where selected vehicles may be fully or partially autonomously controlled. Information is collected at each vehicle and from the drivers, and is shared with the other vehicles and drivers to make a collective decision to enter or leave autonomy.
Description
TECHNICAL FIELD

This application relates to systems and methods for conveying vehicle autonomy states to a driver via visual and/or audio messaging. It also relates to systems and methods for enabling a driver to control and manage transitions into, out of, and between autonomy modes. The system can provide advanced notice of impending autonomy mode changes, and warn of system failures.


BACKGROUND

Researchers and vehicle manufacturers have been developing self-driving technologies for many years. Commercial trucking continues to be one of the areas where autonomous vehicles are expected to become widespread. Such autonomous systems can use sensors and/or wireless connections (such as vehicle-to-vehicle radio communication) to follow a route and the road, and to locate and avoid other vehicles and obstacles. However, the user interfaces that safety considerations require for such vehicles remain complex and difficult for many drivers to use.


SUMMARY

In an example implementation, a vehicle such as a truck may be controlled by both a human driver and autonomy logic. The vehicle is therefore equipped with the usual throttle pedal, brake pedal and steering wheel for a human to operate. Electronic actuator devices are coupled to these human-controlled inputs. The electronic actuators in turn operate the vehicle's throttle, brake and steering sub-system mechanisms.


The vehicle also includes autonomy logic that may follow lanes or a pre-defined plan, or which generates a plan for controlling the vehicle, such as the path it should follow. The autonomy logic uses inputs from sensors such as cameras, lidars, radars, and data available from other inputs such as radios and position sensors to devise the plan. The autonomy logic also provides an autonomy ready signal when it is able to transition into an autonomy mode based on its sensors and other inputs.


Autonomy control inputs are then generated to operate the throttle, brake, and steering sub-systems based on the plan. A controller also assesses the readiness of the actuators to accurately operate the vehicle's throttle, brake and steering sub-systems.


In some arrangements, a system controller receives both the human control inputs and the autonomy control inputs. These control inputs may be fed to a set of relays. The controller also receives the autonomy ready and actuator ready signals. The controller operates the relays to select which outputs to provide to an Electronic Control Unit (ECU), which in turn generates signals to be applied to the throttle, brake and steering actuators. The controller determines whether to set the relays to choose the human control inputs or the autonomy control inputs based on several conditions, such as the ready signals and/or other inputs such as human-operated mode input(s) and a human-operated stop input. In addition to switching between human and machine control, these relays can also implement safety interlocks. For example, the control inputs may be fed to a set of electronic switches, relays, or equivalent devices to enable switching between manual and autonomous control, and may also be fed to a set of interlock devices to enable safe operation of the system.
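The input-selection logic described above can be sketched as follows. This is a minimal illustration in Python (the patent provides no code); the names `ControlInputs` and `select_control_source` are our assumptions, and a real controller would drive electromechanical relays rather than return values:

```python
from dataclasses import dataclass

@dataclass
class ControlInputs:
    throttle: float   # 0.0 (released) to 1.0 (full)
    brake: float      # 0.0 to 1.0
    steering: float   # -1.0 (full left) to 1.0 (full right)

def select_control_source(human: ControlInputs,
                          autonomy: ControlInputs,
                          autonomy_ready: bool,
                          actuators_ready: bool,
                          autonomy_engaged: bool,
                          stop_pressed: bool) -> ControlInputs:
    """Choose which control inputs the relays forward toward the ECU.

    Any failed interlock (stop pressed, or autonomy logic/actuators not
    ready) falls back to the human inputs, mimicking the powered-down
    relay state described in the text.
    """
    if stop_pressed or not (autonomy_ready and actuators_ready):
        return human
    return autonomy if autonomy_engaged else human
```

The interlock conditions are evaluated before the mode selection, so a single failed ready signal always restores human control regardless of the engaged mode.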


The controller may also provide autonomy related information to the human driver via lights, tones, audio outputs and/or display screens associated with the autonomy system status, including state transitions into and out of autonomy modes. This information may include details such as whether the autonomy system is armed or disarmed, or in a pre-arm state and whether the autonomy logic is ready or not. The controller may also accept inputs from the driver to engage or disengage autonomy modes or enable only selected autonomy modes.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a high level diagram of the components of a system that implements controlled transitions into and out of autonomy state and provides corresponding driver alerts.



FIG. 2 is an example user interface including two rocker switches, four lights and a stop button.



FIG. 3 is an example autonomy state transition diagram.



FIG. 4 is an example of collaborative transition into autonomy.



FIG. 5 is an example of collaborative notification of transition out of autonomy.



FIG. 6 shows an autonomous and a human-driven truck and some of the hardware and software systems that enable collaborative decision making via a shared world model.



FIG. 7 is an architecture block diagram that shows the equipment that connects the controllers on two vehicles to the driver(s), sensors, actuators, and to the other vehicle.



FIG. 8 is a flowchart and data flow diagram representing the process steps and information flows that enable collaboration via a shared world model.



FIG. 9 is a data flow diagram that shows some major components of the autonomy controller and their data interconnections.





DETAILED DESCRIPTION OF ILLUSTRATIVE EMBODIMENT(S)

As shown more particularly in FIG. 1, a vehicle can be driven by either autonomy logic 10 or a human driver 20.


The human driver 20 provides inputs to the system controller 70 via typical human input devices 40 such as a throttle pedal, brake pedal and steering wheel. It should be understood that the reference to a “throttle” herein does not mean the vehicle must be driven by an internal combustion engine. The vehicle may be propelled by electric motors or other alternatives to fossil fuels.


The human driver 20 can also view a display 50 and operate other inputs such as stop button 60 or mode input 75.


The autonomy logic 10 receives inputs from sensors 15 such as one or more camera(s), lidar(s), radar(s), and position sensor(s), and/or receives data from other sources via radio(s) and other inputs. The autonomy logic 10 produces autonomy control signals, including throttle, brake and steering signals, which are output to the controller 70.


The autonomy logic 10 may include an A-kit module 20 and a B-kit module 30.


The A-kit module 20 is responsible for generating instructions that describe a plan for the vehicle, such as a path that it should follow. The A-kit module also provides a ready signal when the autonomy logic 10 determines that it has sufficient information to devise and approve of a plan for the vehicle to be autonomously driven.


The B-kit module 30 receives inputs from the autonomy logic 10, such as instructions from the A-kit (including a path to follow) and produces autonomy control signals (for example including throttle, brake and steering control input signals) to the controller 70.


The controller 70 receives human control inputs 40 from the human driver 20 and autonomous control inputs from the B-kit module 30, and feeds at least the throttle to a Pulse Width Modulated (PWM) relay (not shown). The relay can select which of the inputs are fed to the vehicle's Electronic Control Unit (ECU) 80. Steering and brake inputs may be controlled over a Controller Area Network (CAN) bus.


The ECU 80 in turn produces electronic control signals used by one or more actuators 90, which may include the vehicle's throttle 91, brake 92 and steering 93 actuators that in turn operate the physical throttle, brake and steering sub-systems on the vehicle. In some implementations the ECU may not control one or more of the actuators, such as the steering 93.


The controller 70 may also receive inputs from actuators 90 that indicate their current status.


In some implementations the controller 70 may provide visual or audio alert outputs to the output device(s) 50. The output device(s) may include indicator lights and/or a speaker and electronics that can play back pre-recorded audio messages.


The controller 70 may also receive inputs such as from the stop button 60 and/or mode switch 75 as operated by the driver 20.



FIG. 2 is an example of an operator interface which may provide portions of the output device 50 and stop button 60. Here the display may consist of four lights: an arm light, an autonomy light, a disarm light and a ready for autonomy light. The logical operation of these lights is explained in more detail below. These four lights may be placed on or formed as part of mode input devices 75 such as the rocker switches shown.


The autonomy controller 70 may assume a number of states as follows:

    • Pre-disarm—The system enters this state on startup and remains in this state until the B-Kit signals it is active and ready.
    • Disarm—The system is disengaged but the B-Kit is ready to arm the system. The human operated inputs 40 are controlling the actuators 90 and the rest of the system should not interfere with truck operation.
    • Arm—The system's low level controls are engaged but only track the human inputs 40 and generate the corresponding control signals.
    • Ready for Autonomy—The A-Kit is ready to enter full autonomy but the system is otherwise functionally still in the Arm state.
    • Autonomy—The human inputs 40 are inactive and the vehicle is controlled by the B-Kit as directed by the A-Kit.
    • E-Stop—E-stop has been pressed and autonomy system is deactivated, with the human inputs exclusively controlling the vehicle.
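The six states above can be captured in a small sketch. Python is used here for illustration only; the enum and set names are ours, not part of the patent:

```python
from enum import Enum, auto

class AutonomyState(Enum):
    """The six states the autonomy controller 70 may assume."""
    PRE_DISARM = auto()         # startup, waiting for B-Kit
    DISARM = auto()             # disengaged, B-Kit ready to arm
    ARM = auto()                # low-level controls track human inputs
    READY_FOR_AUTONOMY = auto() # A-Kit ready, functionally still Arm
    AUTONOMY = auto()           # B-Kit drives vehicle per A-Kit plan
    E_STOP = auto()             # autonomy deactivated, human in control

# Per the state descriptions, the human inputs are inactive only in
# AUTONOMY; in every other state the driver retains control.
HUMAN_IN_CONTROL = {s for s in AutonomyState if s is not AutonomyState.AUTONOMY}
```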


Table 1 below lists the various controls and outputs for each of these states in more detail. The “Driver Alert System” referenced in Table 1 may include the output device 50, such as the electronics and speaker that play back pre-recorded audio messages. The “PWM controller” referenced in the table is a portion of controller 70 that manages the autonomous control of the throttle and the selection of the manual or autonomous throttle inputs.










TABLE 1

State: E-Stop
  PWM throttle relay is powered down to connect the throttle pedal directly to the ECU.
  Safety Controller, B-Kit, and A-Kit are powered down.
  Disarm, Arm, Ready for Autonomy, and Autonomy lights are turned off.

State: Pre-disarm
  PWM throttle relay is powered up and connects the throttle pedal to the ECU through the PWM controller. The PWM controller passes the input throttle signals directly to the outputs to the ECU to electronically mimic a wired connection.
  Disarm, Arm, Ready for Autonomy, and Autonomy lights are turned off.
  B-Kit software inhibits CAN communication to avoid affecting truck operation.
  Driver Alert System is silent.

State: Disarm
  PWM throttle relay is powered up and connects the throttle pedal to the ECU through the PWM controller. The PWM controller passes the input throttle signals directly to the outputs to the ECU to electronically mimic a wired connection.
  Amber Disarm light is turned on.
  Arm, Ready for Autonomy, and Autonomy lights are turned off.
  B-Kit software inhibits CAN communication to avoid affecting truck operation.
  Driver Alert System is silent.

State: Arm
  PWM throttle relay is powered up and connects the throttle pedal to the ECU through the PWM controller. The PWM controller reads the throttle pedal signals, converts them to a percentage throttle, sends that percentage throttle to the PWM signal generator, and generates the output signal that is sent to the ECU. This process permits throttle pedal control while demonstrating that the PWM signal generation system that will be used for autonomy is functional.
  Green Arm light is turned on.
  Disarm, Ready for Autonomy, and Autonomy lights are turned off.
  B-Kit software enables CAN communication.
  Driver Alert System is silent.

State: Ready for Autonomy
  PWM throttle relay is powered up and connects the throttle pedal to the ECU through the PWM controller. The PWM controller reads the throttle pedal signals, converts them to a percentage throttle, sends that percentage throttle to the PWM signal generator, and generates the output signal that is sent to the ECU. This process permits throttle pedal control while demonstrating that the PWM signal generation system that will be used for autonomy is functional.
  Green Arm light is turned on.
  Amber Ready for Autonomy light is flashing at a slow rate.
  Disarm and Autonomy lights are turned off.
  B-Kit software enables CAN communication.
  Driver Alert System repeatedly plays the Ready for Autonomy message with a pause between repeats.

State: Autonomy
  PWM throttle relay is powered up and connects the throttle pedal to the PWM controller. The PWM controller ignores the throttle pedal signal. The PWM controller receives a percentage throttle command from the B-Kit, sends that percentage throttle to the PWM signal generator, and generates the output signal that is sent to the ECU.
  Green Arm light is turned on.
  Green Autonomy light is turned on.
  Disarm and Ready for Autonomy lights are turned off.
  B-Kit software enables CAN communication.
  A-Kit generates a polynomial describing the vehicle path and sends that polynomial to the B-Kit.
  B-Kit controls steering, brake and throttle based on the polynomial received from the A-Kit.
  Driver Alert System plays the Autonomy message one time.









It should be understood that the table above is but one example. In other implementations, modes can be set so that the human inputs are inactive, the human inputs add to the autonomous inputs, or the human inputs cause an exit from autonomous control when activated.
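As one illustration of the per-state throttle handling in Table 1, the following sketch maps a pedal signal and a B-Kit command to the percentage throttle sent toward the ECU. The function name and the 0.0–1.0 pedal normalization are assumptions made for illustration; they are not specified in the patent:

```python
def pwm_throttle_output(state: str, pedal_raw: float, autonomy_pct: float) -> float:
    """Percentage throttle the PWM path delivers to the ECU in each state.

    pedal_raw is the sensed pedal signal normalized to 0.0-1.0 (an
    assumption); autonomy_pct is the percentage command from the B-Kit.
    """
    if state == "Autonomy":
        # Pedal is ignored; the B-Kit command drives the PWM generator.
        return autonomy_pct
    if state in ("Arm", "Ready for Autonomy"):
        # Pedal still controls the truck, but the signal is routed through
        # the PWM generator to prove the autonomy signal path works.
        return round(pedal_raw * 100.0, 1)
    # E-Stop / Pre-disarm / Disarm: pass-through mimics a wired pedal
    # connection (in E-Stop the relay itself is powered down).
    return round(pedal_raw * 100.0, 1)
```

Note that the driver-perceived behavior is identical in every non-Autonomy state; what differs is which electrical path carries the signal, which is why the Arm state doubles as a self-test of the PWM generation system.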



FIG. 3 is a detailed diagram showing transitions between the pre-disarm 301, disarm 302, arm 303, ready for autonomy 304, autonomy 305, and E-stop 306 states.


Briefly, pressing the E-stop button 60 in any state causes the system to enter the E-stop 306 state. When E-stop is released, the system enters pre-disarm 301. The system remains in this state until the B-kit ready signal (referred to as “B-kit enable” in FIG. 3) is active, at which point the disarm 302 state is entered. The system enters arm 303 when the B-kit enable is active and the arm rocker (FIG. 2) is pressed. The system enters ready for autonomy 304 when both B-kit and A-kit are ready. The system does not enter autonomy 305 until both A-kit and B-kit are ready and the autonomy rocker is pressed.


Conditions causing an exit from any of the states are also shown in FIG. 3. For example, if while in autonomy mode 305 the A-kit ready signal goes inactive but B-kit ready remains active, then arm 303 may be entered. Autonomy mode 305 may also exit to arm mode 303 if the arm rocker is switched to the off position or the brake pedal is pressed.


In another example, if at any time the B-kit ready signal goes inactive, disarm 302 is entered. Likewise, if the arm rocker is moved to the disarm position, then disarm 302 is entered.
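One reading of these transition rules can be sketched as a pure function over the current state and inputs. This is illustrative Python only (state names are spelled out as strings for readability, and the steering-wheel takeover condition is omitted because Tables 2-1 and 2-2 describe its target state differently):

```python
def next_state(state, e_stop, b_kit_ready, a_kit_ready,
               arm_pressed=False, disarm_pressed=False,
               autonomy_pressed=False, brake_pressed=False):
    """One evaluation of the transition rules of FIG. 3 / Tables 2-1, 2-2."""
    if e_stop:
        return "E-Stop"                        # E-Stop wins from any state
    if state == "E-Stop":
        return "Pre-disarm"                    # released and restarted
    if state == "Pre-disarm":
        return "Disarm" if b_kit_ready else "Pre-disarm"
    if not b_kit_ready or disarm_pressed:
        return "Disarm"                        # B-Kit loss or driver disarm
    if state == "Disarm":
        return "Arm" if arm_pressed else "Disarm"
    if state == "Arm":
        return "Ready for Autonomy" if a_kit_ready else "Arm"
    if state == "Ready for Autonomy":
        if not a_kit_ready or arm_pressed:
            return "Arm"
        return "Autonomy" if autonomy_pressed else "Ready for Autonomy"
    if state == "Autonomy":
        if not a_kit_ready or arm_pressed or brake_pressed:
            return "Arm"
        return "Autonomy"
    raise ValueError(f"unknown state: {state}")
```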


Tables 2-1 and 2-2 provide a more detailed list of the types of exceptions and other events that may initiate transitions from one state to another.









TABLE 2-1

Exception Events

Exception Event: E-Stop pressed
  System enters E-Stop state and powers down.

Exception Event: Brake pedal pressed
  System enters Arm state. Driver Alert System plays the Manual Takeover message.

Exception Event: Steering wheel manually turned
  System enters Disarm state. Driver Alert System is silent.

Exception Event: System cannot maintain autonomy (A-Kit enable becomes inactive)
  System enters Arm state. Driver Alert System plays the Take Control message repeatedly until the system state changes.

TABLE 2-2

System State Transition Table

  • Any State → E-Stop. Conditions: E-Stop is pressed. Comments: E-Stop must be released and the system must be restarted to return to normal operation.
  • E-Stop → Pre-disarm. Conditions: E-Stop is released; key switch is cycled. Comments: System powers up and enters Pre-disarm state.
  • Pre-disarm → Pre-disarm. Conditions: B-Kit enable not active. Comments: System remains in Pre-disarm state.
  • Pre-disarm → Disarm. Conditions: B-Kit enable active. Comments: System enters Disarm state.
  • Disarm → Disarm. Conditions: B-Kit enable active; Arm button is not pressed. Comments: System remains in Disarm state.
  • Disarm → Arm. Conditions: B-Kit enable active; Arm rocker switch is pressed momentarily. Comments: System enters Arm state.
  • Arm → Arm. Conditions: B-Kit enable active; no rocker switch pressed. Comments: System remains in Arm state.
  • Arm → Disarm. Conditions: B-Kit enable active; Disarm rocker switch is pressed momentarily. Comments: System returns to Disarm state.
  • Arm → Ready for Autonomy. Conditions: B-Kit enable active; A-Kit enable is active; no rocker switch pressed. Comments: System enters Ready for Autonomy state.
  • Ready for Autonomy → Ready for Autonomy. Conditions: B-Kit enable active; A-Kit enable is active; no rocker switch pressed. Comments: System remains in Ready for Autonomy state.
  • Ready for Autonomy → Autonomy. Conditions: B-Kit enable active; A-Kit enable is active; Autonomy rocker switch is pressed momentarily. Comments: System enters Autonomy state.
  • Ready for Autonomy → Arm. Conditions: A-Kit enable becomes inactive OR Arm rocker switch is pressed momentarily. Comments: System returns to Arm state.
  • Ready for Autonomy → Disarm. Conditions: B-Kit enable becomes inactive OR Disarm rocker switch is pressed momentarily. Comments: System returns to Disarm state.
  • Autonomy → Autonomy. Conditions: B-Kit enable active; A-Kit enable is active; no rocker switch is pressed. Comments: System remains in Autonomy state.
  • Autonomy → Arm. Conditions: A-Kit enable becomes inactive OR Arm rocker switch is pressed momentarily OR brake is pressed OR driver applies torque to steering wheel. Comments: System returns to Arm state.
  • Autonomy → Disarm. Conditions: B-Kit enable becomes inactive OR Disarm rocker switch is pressed momentarily. Comments: System returns to Disarm state.
  • Autonomy → Ready for Autonomy. Conditions: A-Kit enable becomes inactive then becomes active again. Comments: System returns to Ready for Autonomy state.

The controller 70 may also accept inputs from a mode switch 75. The mode switch may be a three-position switch corresponding to three system modes: normal, steering only, and speed only. The mode switch 75 outputs may be fed to the controller 70 to control autonomy system modes as shown in Table 3.









TABLE 3

System Modes

  • Normal—System control of brake, throttle, and steering is enabled, subject to system state.
  • Steering Only—System control of brake and throttle is disabled; only the driver can control brake and throttle. System control of steering is enabled, subject to system state.
  • Speed Only—System control of brake and throttle is enabled, subject to system state. System control of steering is disabled; only the driver can control steering.

For all three modes, the Mode Switch can be turned while the system is operating, but the change in mode will not take effect until the system is power cycled.
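A sketch of how the latched mode might gate the system's commands follows. This is illustrative Python; the dictionary layout, key names, and mode strings are assumptions, not part of the patent:

```python
def apply_mode(mode: str, system_cmd: dict, driver_cmd: dict) -> dict:
    """Combine system and driver commands per the Table 3 mode switch.

    Commands are dicts with 'throttle', 'brake', and 'steering' keys
    (an assumed layout). Because mode changes only take effect after a
    power cycle, `mode` is treated as latched at startup.
    """
    out = dict(system_cmd)
    if mode == "steering_only":
        # Driver keeps brake and throttle; the system may steer.
        out["throttle"] = driver_cmd["throttle"]
        out["brake"] = driver_cmd["brake"]
    elif mode == "speed_only":
        # System controls speed; only the driver may steer.
        out["steering"] = driver_cmd["steering"]
    # "normal": system controls all three axes, subject to system state.
    return out
```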









Coordinating Autonomy State Transitions with Other Vehicles


It can now be understood how the above system and method enable the driver to better control and remain informed of the state of the vehicle's autonomy logic. However, in some instances the autonomy state of other vehicles may also be of importance. One such scenario is where one or more drivers are responsible for two or more semi-trucks traveling together in a convoy. The multiple trucks, multiple human drivers and multiple autonomy logics should each be informed of what the others are doing.


Consider, for example, that the human interface should keep the driver informed of transitions into and out of autonomy mode. When one vehicle is about to transition into the autonomy state, the driver of another vehicle may want to be at least alerted to that fact. In other scenarios, where safety may be a concern, a lead driver may need to approve of any following vehicle's transition into autonomy. But cognitive or information overload should be avoided to the extent possible, especially where human drivers are involved. Therefore, any interface to inform and/or obtain consent from a driver should be minimally distracting and easy to operate.



FIG. 4 is an example process flow that leverages driver interfaces and system components described in FIGS. 1-3 above. In this scenario there are two trucks, a first vehicle (V1) and a second vehicle (V2). Vehicle V1 may be human driven and the second vehicle V2 is capable of being controlled by either a human driver or by autonomy logic. Vehicle V2 therefore has, in one embodiment, at least the components shown in FIGS. 1 and 3 and a way to communicate with V1. Vehicle V1, in one embodiment, is equipped with at least the interface shown in FIG. 2, a way to communicate with V2 and a controller.


Each of the vehicles' respective controllers may communicate with the other vehicle and share information, such as via a Vehicle-to-Vehicle (V2V) radio link. Information may also be exchanged via a shared world model (SWM), as explained in more detail below. Information received from the V2V link or the shared world model is then leveraged to provide alerts to, and obtain consent from, the human driver(s).


In a first state 401, vehicle V2 requests entry into the autonomy state. This request can be conveyed over the V2V link or via updates to the SWM. In state 402 vehicle V1 receives the request from V2 and presents it to the human driver of V1. The presentation may be by playing an audio clip (e.g., “Vehicle V2 requests to enter autonomy”) or by turning on a “Request for Autonomy” light.


In state 404 the human driver of V1 approves the autonomy request, such as via the rocker switch of FIG. 2 or a spoken message. The approval is then passed over the V2V radio link (or via the SWM) and received at V2 in state 405.


In state 407 the controller in V2 then proceeds with the transition from the disarm state to the arm state to the ready for autonomy state (FIG. 3). The fact that V2 is now ready for autonomy is then communicated to V1, and a “Ready for Autonomy” alert is presented to the driver of vehicle V1—again either via a played back audio message or by lights.


In state 412 the human driver of V1 consents to the transition by audio command or by pressing a switch, and this approval is communicated via the SWM or the V2V link to the controller in vehicle V2. In state 415, V2 completes the transition from the ready for autonomy state to the autonomy state. This state information again can be maintained in the SWM or updated through the V2V link. In state 416 vehicle V1 may present an autonomy mode alert to inform its driver that V2 is now operating in autonomy mode.
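The two-stage handshake of FIG. 4 can be sketched as a message sequence. The following Python sketch is illustrative only; the message strings, function name, and callback-based driver inputs are our assumptions, and a real system would carry these messages over the V2V radio link or the SWM:

```python
def autonomy_entry_handshake(v1_driver_approves, v1_driver_consents):
    """Simulate the FIG. 4 exchange for V2's entry into autonomy.

    The two callables stand in for the V1 driver's rocker-switch or
    spoken responses. Returns the ordered message/event log.
    """
    log = ["V2->V1: request autonomy entry"]               # state 401
    log.append("V1: alert driver of request")              # state 402
    if not v1_driver_approves():                           # state 404
        log.append("V1->V2: request denied")
        return log
    log.append("V1->V2: request approved")                 # states 404-405
    log.append("V2: disarm -> arm -> ready for autonomy")  # state 407
    log.append("V2->V1: ready for autonomy")
    if not v1_driver_consents():                           # state 412
        log.append("V1->V2: consent withheld")
        return log
    log.append("V1->V2: consent granted")
    log.append("V2: ready for autonomy -> autonomy")       # state 415
    log.append("V1: alert driver that V2 is autonomous")   # state 416
    return log
```

Note the design point this makes concrete: the V1 driver is consulted twice, once to start the arming sequence and once to authorize the final transition, so a single accidental input cannot put V2 into autonomy.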



FIG. 5 shows a scenario where vehicle V2 transitions out of autonomy mode. In an initial state 501 vehicle V2 is in an autonomy driving mode. However V2's sensors have detected an unsafe condition requiring immediate transition out of that mode. This fact is conveyed to the controller in vehicle V1 using the shared world model or, more likely, via a high priority message over the V2V link.


An emergency alert is presented to the driver of V1 in state 502.


In state 503 (which may occur before, after, or simultaneously with state 502) an alert is also presented to the driver of vehicle V2. E-stop mode is entered and confirmed in state 505. This e-stop status is also presented to vehicle V1 in state 506, which may then display or play an alert to its driver. In the subsequent states both vehicles are human driven.


Example Implementation that Supports Collaborative Behaviors Via Shared World Model


As mentioned above, a Shared World Model is one way for the drivers and autonomy logic in multiple vehicles to engage in collaborative decision-making. A detailed example implementation of such an SWM was described in the pending patent application referenced on page 1. That description is repeated here for convenience of the reader.


1.1 Information Flows



FIG. 6 shows a situation where two vehicles are cooperatively travelling as a unit. Both vehicles may incorporate, either when the vehicle is manufactured or in a subsequent retrofit, hardware and software that enables them to implement autonomy logic. Such autonomy logic may include algorithms that enable vehicles to drive themselves, to interact with human drivers, and to exchange information between themselves.


An example vehicle 110 may include sensors 112, actuators 114, V2V radio transceivers 116, other I/O devices 118, and one or more processors 120 (also referred to herein as controllers 120). As discussed further herein, the one or more processors 120 may execute various logic including autonomy logic 122, decision logic 124 and may also maintain a world model 126. Vehicles 110 include human-driven vehicles 110-H and autonomous vehicles 110-A.



FIG. 7 illustrates these components in more detail. Self-driving algorithms and other aspects of the autonomy logic 122 are implemented in a controller (such as one or more processors 120) that receives sensor 112 data from the respective vehicle (110-A or 110-H) and sends actuator 114 signals to the respective vehicle 110-A or 110-H. The controller 120 may further implement human interface algorithms that accept inputs (e.g., steering, throttle, touch screen etc.) via other I/O devices 118-D from human drivers while also sending data to other I/O devices 118 such as human-readable displays. The controller 120 may also send data to and accept data from vehicle-to-vehicle (V2V) radio transceiver(s) 116 to allow it to interact with humans and exchange information with the controllers 120 located in other vehicles 110.


1.2. Collaboration Via Shared Information


As shown in FIG. 8, functions that provide decision logic 124 for each vehicle 110-A, 110-H may include perception and state estimation 320, situation awareness and assessment 322, decision making 324, and behavior execution 328. The world model 126 may include a state model 330, situation model 332 and decision model 334.


The controller 120 may implement algorithms that enable driver(s) and vehicles to collaborate based on the shared information. This information typically includes sensor data originating outside the components of the computer/human system, as well as derived data (states, events, constraints, conclusions) originating inside the computer/human system, and data indicative of physical phenomena created by human drivers or intended to be sensed by human drivers. The shared information may therefore include data that (i) originates within or outside the convoy, (ii) represents physical phenomena (such as phenomena produced by or capable of being sensed by humans, for example forces on steering wheels), (iii) is received from sensors or other input devices in its raw/sensed form, or (iv) is derived data (examples include states, events, constraints, and conclusions originating inside the components of autonomy logic or human control).


Each vehicle 110 will have its own local copy 126 of such shared information referred to herein as a shared world model 240. At any given instant, each local copy 126 of the shared world model 240 may not be entirely consistent with the local copy 126 on other vehicles. Nevertheless, processes residing on all controllers 120 for all vehicles 110 attempt to keep the shared information in the shared world model 240 and local copies 126 sufficiently up-to-date and consistent to permit effective collaboration. Propagation of the local copy 126 of the shared world model 240 among vehicles 110 is discussed in more detail below.
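A minimal sketch of such eventually-consistent local copies follows, assuming a simple last-writer-wins merge keyed by timestamp. The patent does not specify a merge policy, and the class and method names here are our assumptions:

```python
import time

class SharedWorldModelCopy:
    """One vehicle's local copy 126 of the shared world model 240.

    Each entry carries a timestamp so that copies on different vehicles
    converge (last writer wins) when snapshots are exchanged over V2V.
    """

    def __init__(self):
        self._entries = {}  # key -> (timestamp, value)

    def write(self, key, value, timestamp=None):
        self._entries[key] = (timestamp if timestamp is not None else time.time(),
                              value)

    def read(self, key):
        return self._entries[key][1]

    def merge(self, remote_entries):
        """Fold in a snapshot received from another vehicle's copy."""
        for key, (ts, value) in remote_entries.items():
            if key not in self._entries or ts > self._entries[key][0]:
                self._entries[key] = (ts, value)

    def snapshot(self):
        """Entries to transmit to other vehicles (model propagation)."""
        return dict(self._entries)
```

As the text notes, copies are not guaranteed to be identical at any instant; the merge only needs to keep them consistent enough for collaboration.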


1.3. Information Processing Steps


In FIG. 8, the rectangular boxes on both left and right sides indicate an example sequence called a perceive-think-act sequence. The dotted lines represent transitions between those steps and the solid lines represent data flows between each process and components of the shared world model. As data that originates inside the computer/human system, the world model 126 (and hence also the shared world model 240) acts as another source of information to be used by the autonomy algorithms implemented in the controller 120.


The perception and state estimation step 320 may process all of the information incoming from all sources in order to derive substantially new information that describes arbitrary attributes of the vehicles, humans, and external objects and traffic, etc. Such processing may comprise operations such as, for example:

    • state estimation/data fusion—reducing redundant information to non-redundant form or combining relevant quantities to produce other relevant quantities (e.g. measuring space between the vehicles)
    • prediction—deriving later state from earlier state and a measurement (or prediction) of time (e.g. predicting time to collision based on last known speed)


The situation awareness and assessment step 322 may process all of the information incoming from all sources in order to derive substantially new information that is less directly related to sensed and communicated information, for example:

    • detecting events—watching and noticing when something important occurs (e.g. car in blind spot)
    • constraint propagation—deriving constraints from other information (e.g. do not change lanes now)


The decision making step 324 may process all of the information incoming from all sources in order to derive substantially new information that is associated with or comprises decision making, for example:

    • logic—deriving conclusions from other information, typically by the processes of logical inference, deduction, resolution etc. (e.g. proposed lane change is rejected due to incoming veto)
    • deliberation—weighing numerous alternatives in order to choose one of them (e.g. first vehicle cannot speed up so second vehicle should slow down)


The behavior execution step 328 may process all of the information incoming from all sources in order to derive substantially new information that is associated with or causes acting in the real world, for example:

    • notification—creating events (e.g. messages) for consumption in another part of the system (e.g. lane change maneuver commencing now)
    • action—maintaining or altering the motion of the vehicle (e.g. execution of an emergency braking maneuver).
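The four steps above can be sketched as one iteration of the perceive-think-act sequence that reads from and writes back to the world model. The dictionary keys, the 10-meter threshold, and the derived values below are invented purely for illustration:

```python
def perceive_think_act(sensor_data: dict, world_model: dict) -> dict:
    """One pass of the FIG. 8 perceive-think-act sequence.

    Each step writes its results into the world model, where they are
    available to later steps (and, once propagated, to other vehicles).
    """
    # Perception and state estimation (320): fuse raw inputs into state.
    world_model["state"] = {"gap_m": sensor_data["gap_m"],
                            "speed_mps": sensor_data["speed_mps"]}
    # Situation awareness and assessment (322): detect events/constraints.
    world_model["situation"] = {
        "too_close": world_model["state"]["gap_m"] < 10.0}
    # Decision making (324): derive a conclusion from the situation.
    world_model["decision"] = ("brake" if world_model["situation"]["too_close"]
                               else "hold_speed")
    # Behavior execution (328): act and notify the rest of the system.
    return {"action": world_model["decision"],
            "notification": f"executing {world_model['decision']}"}
```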


1.4. Shared World Model


In FIG. 8, the ellipses 330, 332, 334 indicate some components of the shared world model 240. The shared world model 240 is used as a diverse data repository that is both the source of relevant information for some algorithms, and the repository for results for other (or the same) algorithms. In general, any processing step may read information from or write information to any component. As mentioned above, each vehicle has its own local copy 126 of the shared world model 240 and processes attempt to keep them somewhat up-to-date and consistent. In this way, processes on any one vehicle are effectively reading and writing information to the shared world models of all vehicles.


The shared world model 240 comprises all information that is shared. In FIG. 8, it is depicted as being divided into three components 330, 332, 334 for convenience of explanation, but there can be more or fewer components (such as state prediction 321 described below), and the distinctions between them only serve to help elaborate the different ways in which vehicles may collaborate. In an example embodiment, the shared information contains:

    • State model—information that relates to the properties, attributes, etc. of all objects of interest, both internal and external to the vehicle formation. For example, this component comprises aspects of “where everything is and how it is moving”.
    • Situation model—information that relates to description of the situation. For example, this component comprises aspects of “what is happening”.
    • Decision model—information that relates to decisions to take or not take actions. For example, this component comprises aspects of “what to do”.
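Under the assumption that these three components are simple nested records, the shared world model could be sketched as follows; every class and field name here is invented for illustration and is not part of the specification:

```python
from dataclasses import dataclass, field

@dataclass
class StateModel:       # "where everything is and how it is moving"
    objects: dict = field(default_factory=dict)   # object id -> properties

@dataclass
class SituationModel:   # "what is happening"
    events: list = field(default_factory=list)

@dataclass
class DecisionModel:    # "what to do"
    decisions: list = field(default_factory=list)

@dataclass
class SharedWorldModel:
    state: StateModel = field(default_factory=StateModel)
    situation: SituationModel = field(default_factory=SituationModel)
    decision: DecisionModel = field(default_factory=DecisionModel)

# Each vehicle keeps its own local copy; writes are later propagated
# to the copies held by the other vehicles.
local_copy = SharedWorldModel()
local_copy.state.objects["truck_110_H"] = {"position_m": (0.0, 0.0), "speed_mps": 25.0}
local_copy.situation.events.append("lane_change_proposed")
```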


As depicted in the example data flow diagram of FIG. 4, each vehicle 110-H or 110-A has a controller 120, local model 126, perception and state estimation 320, state prediction 321, situation assessment 322, decision making 324, shared world model 240, model propagation 280, model rendering 290, and vehicle to vehicle communication 295 components. Sensors 112 feed the perception and state estimation 320 and state prediction 321 components that make up the local model 126. The content of the local model 126 is used to develop the local copy of the shared world model 240 which is in turn shared with other vehicles 110 via the model propagation function 280. Constraints, preconditions and possible actions 260 are input to decision making 324 along with outputs from situation assessment 322, which in turn drive the controller 120. Model rendering 290 feeds a user interface.
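A minimal sketch of how the model propagation function 280 might merge an update received over vehicle-to-vehicle communication 295 into the local copy 126; the timestamped key-value layout and the newest-write-wins merge policy are assumptions for illustration, not details from the specification:

```python
# Each entry maps a key to a (timestamp, value) pair; on merge,
# the entry with the newer timestamp wins (assumed policy).

def propagate(local_model: dict, incoming_update: dict) -> dict:
    for key, (t_new, value) in incoming_update.items():
        t_old = local_model.get(key, (float("-inf"), None))[0]
        if t_new > t_old:
            local_model[key] = (t_new, value)
    return local_model

local = {"110-H.speed": (1.0, 24.5)}
update = {"110-H.speed": (2.0, 25.0), "110-A.autonomy_state": (2.0, "ready")}
propagate(local, update)   # local now holds the newer speed and the new key
```

A stale update (older timestamp) is simply ignored, which keeps the local copies of all vehicles converging toward the most recent shared state.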


Collaborative Autonomy Transition Behavior Enabled by Shared Information


Given the context of shared information stored in a shared world model 240, it can now be understood how such shared information can be used to implement collaborative behaviors such as a convoy vehicle's transitions into or out of autonomy mode.
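One hedged sketch of such a collaborative transition, assuming each participant (a human driver or the autonomy logic of a vehicle) posts an approve or veto entry into the shared decision model; the vote layout and all names are invented here:

```python
# Illustrative only: the second vehicle enters autonomy mode only when
# the shared world model records agreement from every participant.

def collective_decision(shared_model: dict, participants: list) -> bool:
    votes = shared_model.get("enter_autonomy_votes", {})
    # A missing vote counts the same as a veto (assumed policy).
    return all(votes.get(p) == "approve" for p in participants)

shared = {"enter_autonomy_votes": {"driver_110_H": "approve",
                                   "autonomy_110_A": "approve"}}
collective_decision(shared, ["driver_110_H", "autonomy_110_A"])   # True
```

Treating a missing or stale vote as a veto is a conservative choice: the formation stays in (or falls back to) human control unless every participant has affirmatively agreed.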


FURTHER IMPLEMENTATION OPTIONS

It should be understood that the example embodiments described above may be implemented in many different ways. In some instances, the various “data processors” may each be implemented by a physical or virtual general purpose computer having a central processor, memory, disk or other mass storage, communication interface(s), input/output (I/O) device(s), and other peripherals. The general-purpose computer is transformed into such processors and executes the processes described above, for example, by loading software instructions into the processor and then causing execution of the instructions to carry out the functions described.


As is known in the art, such a computer may contain a system bus, where a bus is a set of hardware lines used for data transfer among the components of a computer or processing system. The bus or busses are essentially shared conduit(s) that connect different elements of the computer system (e.g., one or more central processing units, disks, various memories, input/output ports, network ports, etc.) and enable the transfer of information between the elements. One or more central processor units are attached to the system bus and provide for the execution of computer instructions. Also attached to the system bus are typically I/O device interfaces for connecting the disks, memories, and various input and output devices. Network interface(s) allow connections to various other devices attached to a network. One or more memories provide volatile and/or non-volatile storage for computer software instructions and data used to implement an embodiment. Disks or other mass storage provide non-volatile storage for computer software instructions and data used to implement, for example, the various procedures described herein.


Embodiments may therefore typically be implemented in hardware, custom designed semiconductor logic, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), firmware, software, or any combination thereof.


In certain embodiments, the procedures, devices, and processes described herein are a computer program product, including a computer readable medium (e.g., a removable storage medium such as one or more DVD-ROM's, CD-ROM's, diskettes, tapes, etc.) that provides at least a portion of the software instructions for the system. Such a computer program product can be installed by any suitable software installation procedure, as is well known in the art. In another embodiment, at least a portion of the software instructions may also be downloaded over a cable, communication and/or wireless connection.


Embodiments may also be implemented as instructions stored on a non-transient machine-readable medium, which may be read and executed by one or more processors. A non-transient machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computing device). For example, a non-transient machine-readable medium may include read only memory (ROM); random access memory (RAM); storage including magnetic disk storage media; optical storage media; flash memory devices; and others.


Furthermore, firmware, software, routines, or instructions may be described herein as performing certain actions and/or functions. However, it should be appreciated that such descriptions contained herein are merely for convenience and that such actions in fact result from computing devices, processors, controllers, or other devices executing the firmware, software, routines, instructions, etc.


It also should be understood that the block and system diagrams may include more or fewer elements, be arranged differently, or be represented differently. It further should be understood that certain implementations may dictate that the block and network diagrams, and the number of such diagrams, be implemented in a particular way.


Accordingly, further embodiments may also be implemented in a variety of computer architectures, physical, virtual, cloud computers, and/or some combination thereof, and thus the computer systems described herein are intended for purposes of illustration only and not as a limitation of the embodiments.


The above description has particularly shown and described example embodiments. However, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the legal scope of this patent as encompassed by the appended claims.

Claims
  • 1. A system for controlling an autonomy system of a vehicle comprising: a human-operable input device including one or more of a throttle, brake or steering input providing human input; autonomy logic providing autonomy control inputs, including one or more of a throttle, brake or steering input; the autonomy logic also generating an autonomy ready signal indicating readiness of the autonomy logic to generate autonomy control inputs; one or more actuators configured to operate one or more corresponding vehicle sub-systems and to provide an actuator ready signal indicating operating status of the one or more electronic actuators; a human-operable mode input device providing a requested mode signal; a controller for selecting either a human driving mode or an autonomy driving mode that respectively provide the human control inputs or the autonomy control inputs to operate the vehicle sub-systems, the human driving mode or autonomy mode depending on the state of the mode signal, the autonomy ready signal, and the actuator ready signal; and an output device providing one or more outputs indicating the human driving mode or autonomy driving mode.
  • 2. The apparatus of claim 1 wherein: the autonomy logic further comprises an A-kit module configured to generate a desired travel plan; a B-kit module that controls one or more steering, brake and/or throttle actuators of the vehicle based on the desired travel plan; and wherein the controller has logical states including: a pre-disarm state, wherein the B-kit is enabled but not yet ready to be active, and the A-kit is disabled and inactive, such that the human control inputs directly control the actuators; a disarm state where the B-Kit is enabled and ready to be active but the A-kit is disabled such that human control inputs directly control the actuators; an arm state where the B-kit is enabled and active, and the A-kit is enabled to track human control inputs to control the actuators in response; a ready for autonomy state where the B-kit is enabled and active, and the A-Kit is enabled to enter a full autonomy state, but still controls the actuators via human control inputs; a full autonomy state where the vehicle is autonomously controlled by the B-Kit as directed by the A-Kit and the human control inputs do not control the actuators; and an e-stop state where the A-Kit and B-kit are deactivated and disabled.
  • 3. The apparatus of claim 2 wherein in the arm state: a throttle relay is powered up and connects a throttle pedal to an Electronic Control Unit (ECU) through the controller; the controller reads the throttle pedal signals, and generates the output signal that is sent to the ECU, to permit throttle pedal control while demonstrating that the autonomy logic is functional; the B-Kit enables communication over a CAN bus; and a Driver Alert System is silent.
  • 4. The apparatus of claim 2 wherein in the arm state, the output device further comprises: an Arm light that is turned on; and Disarm, Ready for Autonomy, and Autonomy lights that are turned off.
  • 5. The apparatus of claim 2 wherein in the autonomy state: a throttle relay is powered up and connects a throttle pedal to the controller; the controller ignores the throttle pedal signal; the controller receives a percentage throttle command from the B-Kit, and generates the output signal that is sent to the ECU; the B-Kit further enables communication over a CAN bus; the A-Kit generates data representing a desired travel path and sends that data to the B-Kit; the B-Kit generates steering, brake and throttle control signals based on the data received from the A-Kit; and a Driver Alert System plays an audible Autonomy enabled message.
  • 6. The apparatus of claim 2 wherein in the autonomy state, the output device comprises: an Arm light that is turned on; an Autonomy light that is turned on; and Disarm and Ready for Autonomy lights that are turned off.
  • 7. The apparatus of claim 2 wherein in the disarm state: a throttle relay is powered up and connects a throttle pedal to an ECU through the controller; the controller passes the input throttle signals directly to the outputs to an ECU to electronically mimic a wired connection; the B-Kit inhibits communication over a CAN bus; and a Driver Alert System is silent.
  • 8. The apparatus of claim 2 wherein in the disarm state: a Disarm light is turned on; and Arm, Ready for Autonomy, and Autonomy lights are turned off.
  • 9. The apparatus of claim 2 wherein in the e-stop state: a throttle relay is powered off and set to connect a throttle pedal to an ECU; a Safety Controller, B-Kit, and A-Kit are powered down; and Disarm, Arm, Ready for Autonomy, and Autonomy lights are turned off.
  • 10. The apparatus of claim 2 wherein in the pre-disarm state: a throttle relay is powered up and connects a throttle pedal to an ECU through the controller; the controller passes input throttle signals directly to the ECU to electronically mimic a wired connection; a B-Kit module inhibits communication over a CAN bus; and a Driver Alert System is silent.
  • 11. The apparatus of claim 2 wherein in the pre-disarm state: Disarm, Arm, Ready for Autonomy, and Autonomy lights are turned off.
  • 12. The apparatus of claim 2 wherein in the Ready for Autonomy state: a throttle relay is powered up and connects a throttle pedal to an ECU through the controller; the controller reads the throttle pedal signals, converts them to a percentage throttle, and uses the percentage throttle to generate an output signal that is sent to an ECU, to thereby permit throttle pedal control while demonstrating that the signal generation system that will be used for autonomy is functional; the B-Kit software enables communication over a CAN bus; and a Driver Alert System repeatedly plays a Ready for Autonomy message with a pause between repeats.
  • 13. The apparatus of claim 2 wherein in the ready for autonomy state: an Arm light is turned on; an amber Ready for Autonomy light is flashing at a slow rate; and Disarm and Autonomy lights are turned off.
  • 14. The apparatus of claim 2 wherein a transition to the autonomy state is a collaborative decision between the human driver and either autonomy logic or a human driver associated with a second vehicle.
  • 15. An interface for enabling collaborative control of at least two vehicles wherein a first vehicle is at least partially controllable by a human driver and a second vehicle is at least partially controllable by autonomy logic, the interface comprising one or more processors configured to execute program code stored in one or more memories, the program code further for: collecting information from human driver inputs and outputs on the first vehicle; collecting information regarding an autonomy state of at least the second vehicle; sharing the resulting collected information between the first and second vehicles; each vehicle using the shared information to maintain a world model; using the world model to enable the vehicles to collaboratively engage in a decision as a unit to alter an autonomy state of the second vehicle; and generating an alert related to the autonomy state of the second vehicle via an apparatus associated with the first vehicle.
  • 16. The apparatus of claim 15 wherein the alert is an audio message.
  • 17. The apparatus of claim 15 wherein the alert is a colored light.
  • 18. The apparatus of claim 15 wherein the decision is for the second vehicle to enter autonomy mode.
  • 19. The apparatus of claim 15 wherein the decision is a result of a driver of the first vehicle requesting to leave autonomy mode.
  • 20. The apparatus of claim 15 wherein the decision is to leave autonomy mode as a result of detecting an unsafe condition of the second vehicle.
  • 21. The apparatus of claim 15 wherein the program code is further for: collecting information regarding an unsafe condition from sensors associated with the first and the second vehicle; and providing such information regarding the unsafe condition as shared information.
CROSS REFERENCE TO RELATED APPLICATIONS

This patent application relates to a co-pending U.S. patent application entitled “SHARED CONTROL FOR VEHICLES TRAVELLING IN FORMATION” Ser. No. 17/507,935 filed Oct. 22, 2021, which in turn claims the benefit of a provisional application entitled “SHARED CONTROL FOR VEHICLES TRAVELLING IN FORMATION” Ser. No. 63/128,961 filed Dec. 22, 2020, the entire contents of each of which are hereby incorporated by reference.