This invention relates generally to roadside traffic management vehicles and, more particularly, to an impact attenuating driverless traffic management vehicle configurable in at least one driverless mode of operation.
According to one aspect, there is provided a driverless traffic management vehicle comprising a control system having a controller interfacing with steering and drive interfaces for control of respective steering and drive subsystems of the vehicle, and an impact attenuator and attenuator actuator therefor for configuring the attenuator in deployed and stowed configurations, wherein the control system comprises at least one driverless mode controller operably controlling the steering and drive interfaces for controlling the vehicle in at least one driverless mode of operation, the control system comprises a mode control interface configured for setting the at least one driverless mode of operation and the at least one driverless mode controller comprises: a follow mode controller configured for controlling the vehicle in a driverless follow mode of operation, wherein the vehicle comprises a data interface and wherein, in use, the data interface is configured for receiving waypoint data from a lead vehicle and the follow mode controller is configured for controlling the steering and drive interfaces to control the vehicle to follow the lead vehicle using the waypoint data; a remote-control mode controller configured for controlling the vehicle in a driverless remote-control mode of operation wherein the remote-control mode controller receives control instructions from a remote-control unit, the remote-control unit comprising controls for controlling at least one of the steering and drive of the vehicle, and controls at least one of the steering and drive interfaces accordingly; and a driverless autonomous control mode controller configured for controlling the vehicle in a driverless autonomous control mode of operation wherein, in use, the driverless autonomous control mode controller is configured for controlling the steering and drive interfaces such that the vehicle follows a predefined waypoint route defined by a route waypoint data file.
In the driverless follow mode of operation, the controller may be configured for displaying driving indication signs comprising turn direction and stopping indication signs and the controller may be configured for receiving driving indication data signals from the lead vehicle and displaying the driving indication signs according to the driving indication data signals.
In the driverless follow mode of operation, the control system may be configured for detecting respective positions of the driverless traffic management vehicle and the lead vehicle and determining a distance along a route therebetween and displaying the distance using the electronic signage board.
The remote-control unit may comprise a signage control configured for selecting one of the plurality of signs remotely for display on the electronic signage board.
The attenuator may be pivotally coupled to a rear of the vehicle so as to pivot upwardly to the stowed configuration and pivot downwardly to the deployed configuration and wherein, in the stowed configuration, the electronic signage board may be revealed behind the attenuator.
The attenuator may comprise a rear surface and the traffic management vehicle may further comprise an attenuator electronic signage board on the rear surface and, wherein in the stowed configuration, the controller may be configured for displaying a plurality of signs on the attenuator electronic signage board and in the deployed configuration, the controller may be configured for displaying a plurality of signs on the electronic signage board.
The traffic management vehicle may further comprise a hoist having signage thereon which can be raised when the attenuator is in the stowed configuration such that the signage thereon is viewable above the attenuator.
The driverless traffic management vehicle may further comprise a further electronic signage board across a rearward impact face of the attenuator and the controller may be configured for displaying signs thereon when the attenuator is in the deployed configuration.
The remote-control unit may comprise a control for stowing or deploying the attenuator and the controller may be configured for controlling the attenuator actuator according to control signals received from the remote-control unit.
The remote-control unit may comprise a mode control configured for setting the at least one driverless mode of operation remotely.
The predefined waypoint route may comprise waypoint regions where the attenuator is to be deployed and waypoint regions where the attenuator is to be stowed and wherein, when in the driverless autonomous control mode of operation, the controller may be configured for controlling the attenuator actuator to deploy or stow the attenuator accordingly.
In the at least one driverless mode of operation, the controller may be configured for sensing the speed of the traffic management vehicle and automatically deploying the impact attenuator when the speed is less than a set speed deployment threshold.
In the at least one driverless mode of operation, the controller may be configured for sensing a verge offset of the traffic management vehicle with respect to a road verge and automatically deploying the impact attenuator according to the verge offset.
The driverless traffic management vehicle may comprise at least one proximity sensor and, in the at least one driverless mode of operation, the controller may be configured for detecting an obstacle using the proximity sensor and controlling at least one of the steering and drive interfaces to avoid a collision therewith.
The mode control interface may allow setting of a following distance and, in the driverless follow mode of operation, the follow mode controller may be configured for following the lead vehicle at the following distance.
The mode control interface may allow for configuration of a lateral steering offset wherein, in the driverless autonomous control mode of operation, the follow mode controller may be configured to adjust a lateral position of the vehicle according to the lateral steering offset.
The vehicle may comprise an electric drivetrain.
According to another aspect, there is provided a method comprising driving the vehicle to a roadside location and then setting the vehicle in a driverless mode of operation and leaving the vehicle to drive autonomously.
Other aspects of the invention are also disclosed.
Notwithstanding any other forms which may fall within the scope of the present invention, preferred embodiments of the disclosure will now be described, by way of example only, with reference to the accompanying drawings.
The vehicle 100 comprises a deployable crash attenuator 109, shown in the accompanying figures, which is configurable between deployed and stowed configurations.
The vehicle 100 may further comprise an electronic signage board 106 for the display of a plurality of signs thereon. In embodiments, the electronic signage board 106 may comprise a high intensity multicoloured dot-matrix display.
In embodiments, the vehicle 100 may comprise an attenuator electronic signage board 136 on a rear face of the attenuator 109 which is visible when the attenuator 109 is in the stowed configuration. In accordance with this embodiment, the attenuator electronic signage board 136 may be controlled to display signs when the attenuator 109 is in the stowed configuration.
The vehicle 100 may further comprise further signage 107. The further signage 107 may comprise directional indication lighting comprising a plurality of lamps arranged to display direction arrows and the like, including in a flashing manner. The lamps may be controlled in subsets to change the direction of the arrows indicated.
In the embodiment shown in the accompanying figures, the vehicle 100 comprises a hoist 135 having the further signage 107 thereon which may be raised when the attenuator 109 is in the stowed configuration such that the further signage 107 is viewable above the attenuator 109.
In embodiments, the hoist 135 may be automatically raised when the attenuator 109 is in the stowed configuration and the vehicle 100 is in a signage display mode of operation.
In embodiments, the vehicle 100 comprises visibility lighting 103.
In a preferred embodiment, the vehicle 100 is an electric vehicle comprising a battery supply for an electric drivetrain, thereby being especially suited to the fine-control, low-speed driverless modes of operation described hereunder in further detail.
The control system 134 comprises a controller 115 comprising a processor for processing digital data and a memory device operably coupled thereto via a system bus. The memory device is configured for storing digital data including computer program code instructions which are fetched, decoded and executed by the processor in use. These computer program code instructions may be logically divided into various controllers including those shown in the accompanying figures.
The control system 134 comprises a steering interface 126 for controlling the steering subsystem of the vehicle 100, typically operably controlling the front two wheels of the vehicle 100 so as to be able to turn the vehicle 100 left and right. Furthermore, the control system 134 comprises a drive interface 127 operably controlling the drive subsystem of the vehicle 100 so as to allow the vehicle to drive forwards, including at various speeds, and in reverse.
The control system 134 may comprise an attenuator actuator 111 configured for stowing and deploying the attenuator 109 as illustrated in the accompanying figures.
However, in embodiments, the control system 134 may be controlled remotely using a remote-control unit 129 communicating via a data interface 114, such as a short distance radiofrequency interface or alternatively a long-distance GSM data interface. In this regard, the remote-control unit 129 may comprise an attenuator controller 132 for raising and lowering the attenuator 109.
In embodiments, the control system 134 may automatically deploy the attenuator 109 during driverless mode operation.
The control system 134 may comprise an actuator 105 for raising and lowering the hoist 135 for the further signage 107. Furthermore, the control system 134 may draw power from a power supply 128. The power supply 128 may additionally power the electric drivetrain of the vehicle 100.
The controllers may comprise a display controller 124 controlling the signage of the electronic signage board 106 and other electronic signage boards. In embodiments, the memory device of the control system 134 comprises a plurality of roadsigns available for selection by an operator for display on the electronic signage board 106. As such, in use, either using an in-vehicle user interface or via a signage control 130 of the remote-control unit 129, the user may control the display of different roadsigns using the electronic signage board 106.
In embodiments, the control system 134 may automatically control the signage displayed by the electronic signage board 106 including in accordance with the current operation mode of the vehicle 100. For example, the control system 134 may display different roadsigns using the electronic signage board 106 depending on whether the vehicle is driver controlled or operating in a driverless mode of operation.
In embodiments, different signs may be displayed depending on the type of driverless mode. In further embodiments, the control system 134 may display differing signage when the vehicle is reversing. In further embodiments, the control system 134 may infer the speed of the vehicle 100, including by interfacing with a vehicle subsystem thereof or alternatively ascertaining the speed from a GPS location sensor 125 so as to be able to display the travel speed of the vehicle using the electronic signage board 106, or alternatively display different signs depending on the current speed of the vehicle 100. In further embodiments, the control system 134 may display different signs using the electronic signage board 106 depending on an impending manoeuvre of the vehicle 100, such as turning left, turning right, stopping, starting up and the like.
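By way of non-limiting illustration only, the following sketch shows one way such mode- and speed-dependent sign selection might be expressed. The sign identifiers, thresholds and the function name are assumptions for illustration and are not taken from the disclosure.

```python
# Illustrative sketch only: hypothetical sign identifiers and thresholds.

def select_sign(mode: str, speed_kmh: float, reversing: bool) -> str:
    """Choose a sign identifier for the electronic signage board 106
    based on the current operating mode and the inferred vehicle speed."""
    if reversing:
        return "VEHICLE_REVERSING"
    if mode == "driver":
        return "WORKS_VEHICLE_AHEAD"
    # Driverless modes: show the travel speed when moving, and a static
    # warning when effectively stationary.
    if speed_kmh < 2.0:
        return "STOPPED_AUTONOMOUS_VEHICLE"
    return f"AUTONOMOUS_VEHICLE_{int(round(speed_kmh))}_KMH"


if __name__ == "__main__":
    print(select_sign("autonomous", 8.4, reversing=False))
```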
The control system 134 may comprise proximity sensors 124 for sensing the proximity of various objects and obstacles in relation to the vehicle 100. For example, radar or ultrasonic proximity sensors may be deployed around the vehicle so as to, for example, detect obstacles. The control system 134 may be configured for stopping the vehicle when detecting an obstacle in front, for example, or alternatively manoeuvring around the obstacle.
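A minimal sketch of the stop-on-obstacle behaviour described above is given below, purely by way of illustration. The reading format (distances in metres keyed by sensor position) and the stopping threshold are assumptions and do not form part of the disclosure.

```python
# Illustrative sketch: hypothetical proximity readings and threshold.

STOP_DISTANCE_M = 3.0   # assumed minimum clear distance ahead of the vehicle

def check_obstacles(proximity_readings: dict[str, float]) -> str:
    """Return a drive command based on forward proximity readings."""
    front = [d for name, d in proximity_readings.items() if name.startswith("front")]
    if front and min(front) < STOP_DISTANCE_M:
        return "stop"        # obstacle detected ahead: halt via the drive interface
    return "continue"        # path clear: maintain the current drive command


if __name__ == "__main__":
    print(check_obstacles({"front_left": 2.1, "front_right": 5.0, "rear": 10.0}))
```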
The control system 134 may comprise a mode control interface 122 allowing for the configuration of the particular driving mode of operation. In embodiments, the remote-control unit 129 may comprise a mode control 133 for remotely controlling the mode of operation of the vehicle 100.
The control system 134 may comprise a drive control interface 123 for controlling the vehicle 100 from within the cabin 101. In the driver operation mode, the operator may operate the vehicle 100 in the conventional manner, including driving the vehicle 100 to a site.
However, on-site, the user may then configure the vehicle 100 in at least one driverless mode of operation wherein the controller 115 controls the steering interface 126 and the drive interface 127 to control the steering and drive of the vehicle 100 respectively. As alluded to above, the controller 115 may also control the signs displayed by the electronic signage board 106 accordingly.
With reference to the accompanying figures, the controller 115 comprises a follow mode controller 117 configured for controlling the vehicle 100 in a driverless follow mode of operation wherein, in use, the data interface 114 receives waypoint data from a lead vehicle and the follow mode controller 117 controls the steering interface 126 and the drive interface 127 to control the vehicle 100 to follow the lead vehicle using the waypoint data.
Steering offsets may be provided whereby, for example, the follow mode controller 117 adjusts the lateral position of the vehicle 100 so as to travel on the road, as opposed to on the verge, directly behind the grass cutting unit.
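Purely as a non-limiting illustration of the follow-mode behaviour described above, the sketch below consumes lead-vehicle waypoints, applies a configurable lateral offset to each, and steers towards the oldest unreached waypoint. The waypoint format, class name, arrival radius and offset handling are assumptions and are not taken from the disclosure.

```python
import math
from collections import deque
from typing import Optional

# Illustrative sketch only: waypoints are assumed to be (x, y) positions in
# metres received from the lead vehicle via the data interface 114, together
# with the lead vehicle's heading in radians.

class FollowModeSketch:
    def __init__(self, lateral_offset_m: float = 0.0):
        self.lateral_offset_m = lateral_offset_m
        self.pending = deque()          # waypoints still to be visited

    def on_waypoint(self, x: float, y: float, heading_rad: float) -> None:
        """Store a lead-vehicle waypoint, shifted sideways by the configured offset."""
        ox = x + self.lateral_offset_m * math.cos(heading_rad + math.pi / 2)
        oy = y + self.lateral_offset_m * math.sin(heading_rad + math.pi / 2)
        self.pending.append((ox, oy))

    def target_heading(self, own_x: float, own_y: float) -> Optional[float]:
        """Heading (radians) towards the oldest unreached waypoint, else None."""
        while self.pending:
            wx, wy = self.pending[0]
            if math.hypot(wx - own_x, wy - own_y) < 1.0:   # waypoint reached
                self.pending.popleft()
                continue
            return math.atan2(wy - own_y, wx - own_x)
        return None
```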
An operator may drive the vehicle 100 to a position behind a maintenance machine such as a grass cutting unit. The driver may then control the mode control interface 122 within the vehicle 100 to configure the vehicle 100 in the driverless follow mode of operation. The attenuator 109 may be deployed by the operator or alternatively automatically by the controller 115 when entering the driverless follow mode of operation. The operator may then step from the cabin 101 to allow the vehicle 100 to follow the maintenance machine autonomously in the aforedescribed manner. Alternatively, once having stepped from the cabin 101, the operator may switch the vehicle 100 to the driverless follow mode using the mode control 133 of the remote-control unit 129.
The controller 115 comprises a remote-control mode controller 118 configured to control the vehicle 100 in a driverless remote-control mode of operation wherein the steering and drive and other functionality of the vehicle 100 is controlled remotely from the remote-control unit 129.
In this regard, the remote-control unit 129 may comprise a position control 131 for controlling the steering and drive of the vehicle 100. For example, using the position control 131, the operator may control the steering of the vehicle 100 left and right and control the direction and speed of drive of the vehicle from a distance.
The controller 115 comprises an autonomous control mode controller 119 configured for controlling the vehicle 100 in an autonomous driverless mode of operation. In accordance with this mode of operation, a route waypoint file may be generated comprising a plurality of navigational waypoints. The route waypoint file may be generated using a data logging unit travelling along an intended route beforehand.
As such, the autonomous control mode controller 119 may read the route waypoint file from the memory of the controller 115 or receive the waypoint route via the data interface 114 and then control the steering interface 126 and the drive interface 127 to allow the vehicle 100 to follow the set out route by following each waypoint of the route waypoint file in turn.
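As a non-limiting illustration of reading a route waypoint file and stepping through its waypoints, a minimal sketch follows. The CSV layout and the field names ("lat", "lon", "timestamp") are assumptions for illustration only; the disclosure does not specify a file format.

```python
import csv
from typing import Optional

# Illustrative sketch: a route waypoint file is assumed to be a CSV logged
# along the intended route with "lat", "lon" and "timestamp" columns.

def load_route(path: str) -> list[dict]:
    """Read the route waypoint file into an ordered list of waypoints."""
    with open(path, newline="") as f:
        return [
            {"lat": float(r["lat"]), "lon": float(r["lon"]), "t": float(r["timestamp"])}
            for r in csv.DictReader(f)
        ]

def next_waypoint(route: list[dict], index: int) -> Optional[dict]:
    """Return the next waypoint to steer towards, or None at the route end."""
    return route[index] if index < len(route) else None
```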
As alluded to above, the controller 115 may be configured for displaying a plurality of signs on the electronic signage board 106 depending on the driverless mode of operation.
In the driverless autonomous control mode of operation, the controller 115 may be configured for displaying driving indication signs on the electronic signage board 106 or other electronic signage boards of the vehicle. The driving indication signs may comprise turn direction and stopping indication signs.
In embodiments, the controller 115 is configured for displaying the driving indication signs in advance of controlling the steering interface 126 and the drive interface 127 accordingly. In embodiments, the controller is configured for inspecting the route waypoint data file for inferring direction indications therefrom. For example, timestamps of waypoints within the route waypoint data file may be analysed to detect slowing down or stopping such that stopping indication signs may be appropriately displayed in advance. Furthermore, lateral offsets of waypoints indicative of changing of directional lanes may be analysed by the controller 115 to display turn direction indication signs appropriately in advance.
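By way of non-limiting illustration of the look-ahead analysis described above, the sketch below infers a stopping or turn indication from two consecutive waypoints. The waypoint tuple layout, the thresholds and the lateral-offset sign convention are assumptions and are not taken from the disclosure.

```python
import math
from typing import Optional

# Illustrative sketch: each waypoint is (x, y, t) with position in metres and
# timestamp in seconds.

SLOW_SPEED_MS = 1.0      # below this inferred speed, show a stopping sign
LANE_OFFSET_M = 2.5      # lateral shift suggesting a lane/direction change

def infer_indication(prev_wp, next_wp) -> Optional[str]:
    """Infer a driving indication sign from two consecutive route waypoints."""
    (x0, y0, t0), (x1, y1, t1) = prev_wp, next_wp
    dt = t1 - t0
    if dt <= 0:
        return None
    speed = math.hypot(x1 - x0, y1 - y0) / dt
    if speed < SLOW_SPEED_MS:
        return "STOPPING"
    if (y1 - y0) > LANE_OFFSET_M:
        return "TURN_RIGHT"   # lateral offset to the right (assumed convention)
    if (y0 - y1) > LANE_OFFSET_M:
        return "TURN_LEFT"
    return None
```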
In the driverless follow mode of operation, the control system 134 may be configured for displaying driving indication signs comprising turn direction and stopping indication signs of the lead vehicle. In accordance with this embodiment, the controller 115 may be configured for receiving driving indication data signals from the lead vehicle by the data interface 114 and displaying the driving indication signs according to the driving indication data signals.
Furthermore, in the driverless follow mode of operation, the control system 134 may be configured for detecting respective positions of the driverless traffic management vehicle 100 and the lead vehicle and determining a distance therebetween and displaying the distance using the electronic signage board 106 or other electronic signage boards of the vehicle 100. The control system 134 may be configured for calculating the distance between the traffic management vehicle 100 and the lead vehicle along the active route so as to account for road bends.
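A simple sketch of such an along-route (rather than straight-line) distance calculation is shown below, purely by way of illustration. The waypoint representation and the assumption that the follower's index precedes the lead vehicle's index are illustrative only.

```python
import math

# Illustrative sketch: summing segment lengths along the active route between
# the follower's and the lead vehicle's nearest waypoints accounts for bends
# that a straight-line distance would cut across.

def along_route_distance(route: list[tuple[float, float]],
                         follower_idx: int, lead_idx: int) -> float:
    """Distance in metres along the route between two waypoint indices
    (assumes follower_idx <= lead_idx)."""
    total = 0.0
    for i in range(follower_idx, lead_idx):
        (x0, y0), (x1, y1) = route[i], route[i + 1]
        total += math.hypot(x1 - x0, y1 - y0)
    return total
```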
The signage control interface 130 of the remote-control unit 129 may be used to select which signage is to be displayed by the electronic signage board 106 or other electronic signage boards of the vehicle 100. In embodiments, the signage control interface 130 may display an image gallery of available signs for selection. Alternatively, the signage control interface 130 may allow the user to specify the mode of operation, such as a grass cutting driverless follow mode of operation, wherein the controller 115 selects the appropriate signage to display indicative of the vehicle 100 following behind a grass cutter.
In embodiments, the predefined waypoint route comprises waypoint regions where the attenuator 109 is to be deployed and waypoint regions where the attenuator 109 is to be stowed. As such, when in the driverless autonomous control mode of operation, the controller 115 may be configured for controlling the actuator 111 to deploy or stow the attenuator accordingly. In accordance with this embodiment, the controller 115 may automatically deploy the attenuator 109 when the predefined waypoint route causes the vehicle 100 to come to a stop. Conversely, when the predefined waypoint route causes the vehicle 100 to commence travelling, the controller 115 may automatically stow the attenuator 109.
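Purely as a non-limiting illustration, region-based attenuator control of this kind might be sketched as below. The region encoding (start index, end index, action) is an assumption and does not form part of the disclosure.

```python
from typing import Optional

# Illustrative sketch: each region is (start_index, end_index, action), where
# the action "deploy" or "stow" applies while the current waypoint index falls
# inside that region of the predefined route.

def attenuator_action(regions, waypoint_index: int) -> Optional[str]:
    """Return the attenuator action requested at the given waypoint index."""
    for start, end, action in regions:
        if start <= waypoint_index <= end:
            return action          # e.g. "deploy" or "stow"
    return None                    # no change requested at this waypoint


if __name__ == "__main__":
    regions = [(0, 10, "stow"), (11, 40, "deploy")]
    print(attenuator_action(regions, 25))   # -> "deploy"
```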
In embodiments, the controller 115 is configured for sensing the speed of the traffic management vehicle 100, such as by using the location sensors 125 or a vehicle management interface, and automatically deploying the impact attenuator 109 when the speed is less than a set speed deployment threshold, such as less than 10 km/h, or when the vehicle 100 is stationary or has been stopped for a period of time.
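A minimal sketch of such speed-threshold deployment follows, by way of illustration only. The threshold values and the hysteresis band (to avoid rapid toggling near the threshold) are assumptions and are not taken from the disclosure.

```python
# Illustrative sketch only: assumed thresholds and hysteresis behaviour.

DEPLOY_BELOW_KMH = 10.0   # deploy the attenuator below this speed
STOW_ABOVE_KMH = 15.0     # stow again only once comfortably above it

def attenuator_deployed(speed_kmh: float, currently_deployed: bool) -> bool:
    """Return True if the attenuator should be in the deployed configuration."""
    if speed_kmh < DEPLOY_BELOW_KMH:
        return True
    if speed_kmh > STOW_ABOVE_KMH:
        return False
    return currently_deployed     # within the hysteresis band: keep current state
```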
In embodiments, in the at least one driverless mode of operation, the controller 115 may be configured for sensing a verge offset of the traffic management vehicle 100 with respect to a road verge and automatically deploying or stowing the attenuator 109 accordingly. The controller 115 may use the location sensors 125 to determine a lateral offset of the traffic management vehicle 100. In alternative embodiments, the control system 134 may comprise a vision subsystem (not shown) from which surrounding image data is obtained and analysed by the controller 115 to infer the lateral offset of the vehicle 100 from a road verge or centreline marking. In this way, for example, the attenuator 109 may be automatically deployed when the vehicle 100 is across the road verge or in the centre of a particular lane.
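By way of non-limiting illustration, a verge offset of the kind described above could be approximated as the signed cross-track distance of the vehicle from a locally straight verge line, as sketched below. The two-point verge representation is an assumption for illustration only.

```python
import math

# Illustrative sketch: the verge is approximated locally by a straight segment
# between two surveyed points; the signed perpendicular distance of the
# vehicle from that segment stands in for the "verge offset".

def verge_offset(vehicle: tuple[float, float],
                 verge_a: tuple[float, float],
                 verge_b: tuple[float, float]) -> float:
    """Signed lateral distance (metres) of the vehicle from the verge line."""
    ax, ay = verge_a
    bx, by = verge_b
    vx, vy = vehicle
    dx, dy = bx - ax, by - ay
    length = math.hypot(dx, dy)
    if length == 0.0:
        raise ValueError("verge points must be distinct")
    # 2D cross product divided by segment length gives the signed distance.
    return ((dx * (vy - ay)) - (dy * (vx - ax))) / length
```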
In embodiments, the mode control interface 122 allows the setting of a following distance such that, when the vehicle 100 operates in the driverless follow mode of operation, the follow mode controller 117 is configured for controlling the drive interface 127 to follow the lead vehicle at the configured following distance.
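As a final non-limiting illustration, maintaining a configured following distance can be sketched as a simple proportional speed command that closes or opens the gap to the lead vehicle. The gain and the speed cap below are assumptions and are not taken from the disclosure.

```python
# Illustrative sketch only: assumed proportional gain and speed cap suited to
# low-speed roadside operation.

GAIN = 0.5          # m/s of speed command per metre of gap error
MAX_SPEED_MS = 5.0  # upper limit on the commanded speed

def follow_speed(gap_m: float, target_gap_m: float) -> float:
    """Speed command (m/s) for the drive interface given the current gap."""
    error = gap_m - target_gap_m       # positive: too far behind, so speed up
    return max(0.0, min(MAX_SPEED_MS, GAIN * error))


if __name__ == "__main__":
    print(follow_speed(gap_m=18.0, target_gap_m=15.0))   # -> 1.5
```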
The foregoing description, for purposes of explanation, used specific nomenclature to provide a thorough understanding of the invention. However, it will be apparent to one skilled in the art that specific details are not required in order to practise the invention. Thus, the foregoing descriptions of specific embodiments of the invention are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the invention to the precise forms disclosed as obviously many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to best utilize the invention and various embodiments with various modifications as are suited to the particular use contemplated. It is intended that the following claims and their equivalents define the scope of the invention.