Vehicle productivity and operator comfort can be improved by recognizing repetitive patterns of the vehicle, of vehicle components and/or of the operator. When a pattern is recognized, active and/or passive actions can be taken to improve productivity and comfort.
Certain vehicles may be operated in a repetitive pattern in a typical duty cycle. For example, the vehicle operator may use the vehicle to perform the same actions or movements or to repeatedly follow the same route. Vehicles such as wheel loaders, forklifts and the like are typical examples of off-highway vehicles that may be operated in this fashion. It can be appreciated that not only does the vehicle perform the same actions in these cycles, but the components of the vehicle and the operator may do so as well. Efficiency and productivity of the vehicle and the operator would increase if these patterns could be recognized and incorporated into the vehicle control so that control actions could be anticipated.
A method of vehicle and operator guidance by pattern recognition begins with collecting data on vehicle and operator patterns through sensors. The collected data is combined and analyzed with a pattern recognition algorithm. The combined and analyzed data is compared with patterns in a pattern database. If data matches a pattern in the pattern database, the pattern can be selected. If the data does not match a pattern in the pattern database, a new pattern can be created and selected. Based on the selected pattern, a determination is made to passively assist the operator, actively assist the operator or passively and actively assist the operator. The determined action is then implemented.
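For illustration only, the following Python sketch walks through the method steps above starting from an already-analyzed task sequence (compare with the database, select or create a pattern, determine the assistance, implement it); the class names, the exact-match comparison and the placeholder assistance policy are hypothetical simplifications, not the disclosed implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Pattern:
    name: str
    tasks: list  # ordered task labels, e.g. ["lift", "reverse", "turn_left"]

@dataclass
class PatternDatabase:
    patterns: list = field(default_factory=list)

    def best_match(self, tasks):
        # Select the stored pattern whose task sequence matches exactly;
        # a real system would use a tolerant similarity measure instead.
        for p in self.patterns:
            if p.tasks == tasks:
                return p
        return None

    def create_pattern(self, tasks):
        # No match: create a new pattern and select it.
        p = Pattern(name=f"pattern_{len(self.patterns)}", tasks=tasks)
        self.patterns.append(p)
        return p

def decide_assistance(pattern):
    # Placeholder policy: the passive/active choice is left to a
    # per-application strategy in the description above.
    return {"pattern": pattern.name, "mode": "passive"}

def guidance_cycle(observed_tasks, database):
    pattern = database.best_match(observed_tasks)
    if pattern is None:
        pattern = database.create_pattern(observed_tasks)
    return decide_assistance(pattern)

db = PatternDatabase()
print(guidance_cycle(["lift", "reverse", "turn_left"], db))  # new pattern created
print(guidance_cycle(["lift", "reverse", "turn_left"], db))  # now recognized
```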
Exemplary embodiments of the present disclosure will now be described by way of example with reference to the accompanying drawings.
Operator comfort and vehicle productivity can be improved if the repetitive patterns or movements of the operator and/or vehicle are recognized. For example, operator comfort and/or vehicle productivity might be improved if the vehicle control parameters are adapted in response to the recognized repetitive patterns or movements. Some of the vehicle control parameters that may be advantageously adapted to the repetitive patterns or movements include, but are not limited to, pre-fill phases in clutches, pre-engagement of a gear before it is selected, control of the bucket of a wheel loader (such as bucket leveling) or the forks and mast of a forklift.
The patterns can represent repetitive movements of the vehicle, similar to the ones followed by a wheel loader when loading and unloading soil.
As suggested above, the repetitive patterns are not limited to just movements of the entire vehicle. The patterns might also be repetitive movements made by some machine components (i.e. working functions), such as the bucket of a wheel loader or the mast and forks of a forklift.
By way of example, such a repetitive pattern is depicted in the accompanying drawings, where the numbers, arrows, labels and different line types distinguish the individual movements of the pattern.
Once the pattern is recognized, many actions might be taken, ranging from passive actions (e.g., pre-engaging certain clutches) to active actions (e.g., guiding the driver's actions or even completely automating a pattern).
Examples of vehicle sensors comprise internal combustion engine speed sensors, wheel speed sensors, pressure sensors for the clutches and the hydraulic cylinders, and load sensors on the forklift mast, among others. Examples of sensors for driver actions comprise sensors for the position of the steering, pressure applied on the throttle, and position or angle of the levers and joysticks. Examples of fleet management sensors comprise sensors to collect and process GPS data for vehicle positioning, positions and actions of other vehicles, actions to be done, and the location and weight of loads, among others. These sensors are depicted on one exemplary vehicle in the accompanying drawings.
The information collected from the vehicle sensors, the fleet management sensors and the driver action sensors is combined and processed by a pattern recognition algorithm. Using typical supervised learning algorithms, which cover the case where a set of data (i.e., training data) for the vehicle movements has been pre-defined (prior hand-labeling), the information retrieved by the sensors is labeled with the related classification of tasks. An example operation of such a vehicle, for instance soil digging, consists of several tasks in a sequence. The controller will try to match the actions interpreted via the sensors with the training data to achieve a sequence labeling. As a result of this recognition phase, the algorithm will determine which pattern is being followed by interpreting actions as a dimensionless set of data (left turn, right turn, move forward, raising a load, . . . ) using parametric classification methods.
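As a rough illustration of this recognition phase, the following Python sketch reduces sensor snapshots to dimensionless action tokens and labels the resulting sequence against hand-labeled training sequences; the thresholds, token names and position-wise scoring are illustrative assumptions, not the disclosed algorithm.

```python
def classify_action(steering_angle, throttle, boom_velocity):
    """Reduce one sensor snapshot to a discrete, dimensionless action."""
    if boom_velocity > 0.1:
        return "raise_load"
    if steering_angle > 15:
        return "turn_left"
    if steering_angle < -15:
        return "turn_right"
    return "move_forward" if throttle > 0.05 else "idle"

TRAINING_DATA = {  # pre-defined (hand-labeled) task sequences
    "soil_digging": ["move_forward", "raise_load", "turn_left", "move_forward"],
    "truck_loading": ["raise_load", "turn_right", "move_forward", "idle"],
}

def label_sequence(actions):
    """Pick the training label whose sequence agrees at the most positions."""
    def score(ref):
        return sum(a == b for a, b in zip(actions, ref))
    return max(TRAINING_DATA, key=lambda name: score(TRAINING_DATA[name]))

snapshots = [(0, 0.4, 0.0), (0, 0.2, 0.3), (20, 0.1, 0.0), (0, 0.5, 0.0)]
actions = [classify_action(*s) for s in snapshots]
print(actions, "->", label_sequence(actions))  # -> soil_digging
```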
Any number of pattern categories (i.e., sequences of tasks) can be stored in the internal memory of the vehicle controller to be classified based on the sensor input values. The CPU (i.e., the master vehicle controller) will execute the algorithms mentioned above. If the pattern followed by the vehicle cannot be matched with the already-stored training data, a new pattern structure (sequence of tasks) can be formed and taught to the machine by interpreting the repetitive motions in the clustered (un-labeled) data (i.e., unsupervised machine learning).
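One minimal way this unsupervised path could look in code is sketched below: when no stored pattern matches, the controller scans the recent un-labeled action stream for a back-to-back repeating subsequence and registers it as a new, machine-taught pattern. The window sizes and repetition threshold are assumptions for illustration.

```python
def find_repeated_cycle(actions, min_len=2, max_len=6, min_repeats=3):
    """Return the longest subsequence that repeats back-to-back."""
    for length in range(max_len, min_len - 1, -1):
        for start in range(len(actions) - length * min_repeats + 1):
            candidate = actions[start:start + length]
            window = actions[start:start + length * min_repeats]
            if window == candidate * min_repeats:
                return candidate
    return None

stream = ["lift", "reverse", "dump"] * 3 + ["idle"]
new_pattern = find_repeated_cycle(stream)
print(new_pattern)  # ['lift', 'reverse', 'dump'] -> stored as a self-learned pattern
```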
In one embodiment, there may be four pattern categories: (1) the original patterns, (2) the updated patterns, (3) the patterns received from fleet management and (4) the self-learned patterns; a sketch of one possible pattern store follows the category descriptions below.
Original patterns may comprise the pre-defined patterns (i.e., labeled training data) implemented offline by the control designer (e.g., generic patterns), as shown in the accompanying drawings.
Updated patterns may comprise improvements on the original patterns, based on existing (or newly installed) sensor measurements on the vehicle (e.g., the weight of the load or distances traveled on the route), learned during operation of the vehicle by the operator.
Received patterns may comprise information received from a fleet management server and/or cloud (i.e., information from other vehicles or infrastructure) through a communication system installed on the vehicle. For example, an operation of a forklift, such as lifting a pallet from a truck at the warehouse entrance and storing it in a rack at a particular location, can form a new label for the sequence of tasks and can be shared with all other vehicles of the fleet. Once the other vehicles have learned the new pattern, they can properly perform the same sequence of tasks in place of the first vehicle by immediately recognizing the new pattern.
Self-learned patterns may comprise patterns not included in the originally installed patterns. The recognition algorithm can detect and record repetitive movements of the vehicle or the operator. A separate space in the vehicle memory is allocated to progressively store all the new patterns followed by the vehicle. The controller will interpret this sequence of actions by clustering methods, and these actions can form new labels after checking related parameters (e.g., how many times the action is repeated, or the variation among repetitions) assigned for each particular application. To limit the number of patterns in the database, the followed pattern might be compared first to the most frequently used patterns, and the less frequently used or unused patterns can be removed from the database to ensure the fast response and reliability of the pattern recognition algorithms.
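The following Python sketch illustrates one possible pattern store covering the four categories above, with frequency-based pruning of less-used patterns; the capacity limit, the usage counter and the choice to never prune original patterns are illustrative assumptions.

```python
from collections import Counter

class PatternStore:
    CATEGORIES = ("original", "updated", "received", "self_learned")

    def __init__(self, capacity=100):
        self.capacity = capacity
        self.patterns = {}          # name -> (category, task sequence)
        self.usage = Counter()      # how often each pattern was matched

    def add(self, name, category, tasks):
        assert category in self.CATEGORIES
        self.patterns[name] = (category, tasks)
        self._prune()

    def record_use(self, name):
        self.usage[name] += 1

    def _prune(self):
        # Drop the least-used patterns to keep lookups fast and reliable;
        # pre-defined (original) patterns are retained in this sketch.
        while len(self.patterns) > self.capacity:
            removable = [n for n, (cat, _) in self.patterns.items()
                         if cat != "original"]
            if not removable:
                break
            victim = min(removable, key=lambda n: self.usage[n])
            del self.patterns[victim]

store = PatternStore(capacity=2)
store.add("generic_dig", "original", ["dig", "reverse", "dump"])
store.add("pallet_run", "self_learned", ["lift", "move", "drop"])
store.record_use("pallet_run")
store.add("rare_route", "received", ["move", "turn", "move"])
print(sorted(store.patterns))  # the least-used non-original pattern was pruned
```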
The information from the sensors is compared with the pattern categories, and the most closely matching pattern category is selected. Once the pattern category is selected, the related actions are determined depending on the strategy (e.g., passive assistance, active assistance or complete automation), and finally the desired actions are sent to the vehicle components for implementation. The related actions can blend passive, active and automated actions, which may be taken in any order.
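The dispatch from a selected pattern to component commands might be sketched as follows; the strategy names track the three strategies above, while the component and command identifiers are hypothetical.

```python
def actions_for(pattern_name, strategy):
    """Map a selected pattern and strategy to (component, command) pairs."""
    if strategy == "passive":
        return [("transmission", "pre_engage_reverse")]
    if strategy == "active":
        return [("engine", "limit_speed"), ("display", "show_next_step")]
    if strategy == "automation":
        return [("controller", f"execute:{pattern_name}")]
    raise ValueError(f"unknown strategy: {strategy}")

# Send the determined actions to the vehicle components for implementation.
for component, command in actions_for("truck_unloading", "active"):
    print(f"send {command!r} to {component}")
```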
Each vehicle has at least one controller generally responsible for the control and operation of the machine and/or its various components. The vehicle controller will be responsible for recognizing the patterns to guide the driver and increase productivity. Many different ways of recognizing patterns can be implemented in the controller.
For example, the operator could enter new patterns manually in a "register mode" by executing the set of repetitive actions desired to be recorded. Once the patterns have been recorded into memory, the controller can recognize them in subsequent operations.
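A sketch of how such a "register mode" could be structured is shown below; the class and method names are assumptions for illustration, not the disclosed interface.

```python
class RegisterMode:
    """Record an operator-executed action sequence as a new pattern."""

    def __init__(self, store):
        self.store = store      # pattern memory: name -> task sequence
        self.recording = None

    def start(self):
        self.recording = []

    def log_action(self, action):
        if self.recording is not None:
            self.recording.append(action)

    def stop(self, name):
        tasks, self.recording = self.recording, None
        self.store[name] = tasks  # recognizable in following operations
        return tasks

store = {}
mode = RegisterMode(store)
mode.start()
for a in ["lower_forks", "move_forward", "lift_pallet"]:
    mode.log_action(a)
print(mode.stop("pallet_pickup"), store)
```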
As yet another example, patterns can also be learned from sources outside the vehicle, such as other vehicles of the fleet or the fleet management server. Managing patterns through such telematics means might lead to the selection of an optimal pattern and its standardization by comparing the patterns used by many operators.
Additionally, the telematics system can monitor the driver and give instructions through a telematics platform, such as the next actions to be taken (e.g., picking up a load at a certain location). Adding the information received from the telematics system can improve the accuracy and reliability of the pattern recognition algorithms.
All these pattern recognition techniques are based on the analysis of the driver's intentions (throttle, joystick and lever actuation, steering wheel, . . . ) and use information from existing sensors (engine speed, wheel speed, clutch pressure, . . . ) and additional sensors (GPS for vehicle positioning, a load sensor for determining the load in the bucket, . . . ).
Once a pattern has been recognized, the actions to be taken vary depending on the circumstances. A few examples of potential actions are given below. These can be combined, used separately, used sequentially or applied in any order.
The driver can be assisted in a passive manner by preparing the vehicle for the anticipated next driver action once the pattern is recognized. For example, reverse gear can be pre-engaged on a forklift after the load is lifted, and/or the speed of the vehicle's internal combustion engine can be changed in anticipation of a patterned action.
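One way such a passive-assistance rule could be expressed is sketched below; the task names, gear state and engine-speed target are illustrative assumptions.

```python
def passive_assist(pattern_tasks, current_index, vehicle):
    """Prepare the vehicle for the anticipated next task in the pattern."""
    has_next = current_index + 1 < len(pattern_tasks)
    next_task = pattern_tasks[current_index + 1] if has_next else None
    if next_task == "reverse":
        vehicle["gear"] = "reverse_pre_engaged"     # pre-engage reverse gear
        vehicle["engine_rpm_target"] = 1200          # anticipate the patterned action
    return vehicle

vehicle = {"gear": "neutral", "engine_rpm_target": 900}
print(passive_assist(["lift_pallet", "reverse", "turn_left"], 0, vehicle))
```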
The driver can also be actively assisted in controlling the vehicle. The actions under this strategy might comprise adjusting the vehicle speed to a level appropriate for the activity undertaken by the vehicle, the driver's actions and the vehicle pattern. They can also include actions on the bucket or the mast and forks, such as lowering the load smoothly as soon as the driver touches the load-lowering controls.
The invention may also assist the vehicle operator by standardizing the operator's actions on the vehicle and the equipment depending on the pattern engaged. If the current motion matches the "unloading of a truck" pattern but the driver is driving faster than the speed defined in the pattern, the controller could force the vehicle to reduce its speed toward the speed usually followed in this pattern. The driver still has priority over the control, meaning that, for example, if the driver presses the accelerator again, the controller will stop managing the speed and the driver will again be fully in charge of the vehicle.
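The driver-priority behavior described here might be sketched as follows; the speed values and the "throttle repressed" flag are illustrative assumptions.

```python
def managed_speed(pattern_speed, driver_speed, throttle_repressed):
    """Nudge speed toward the pattern's usual speed unless the driver overrides."""
    if throttle_repressed:
        return driver_speed, "driver in full control"
    if driver_speed > pattern_speed:
        return pattern_speed, "controller limiting to pattern speed"
    return driver_speed, "no intervention"

print(managed_speed(8.0, 11.0, throttle_repressed=False))  # controller limits speed
print(managed_speed(8.0, 11.0, throttle_repressed=True))   # driver regains control
```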
Additionally, the invention can comprise taking control of the vehicle by performing small parts of the pattern automatically (e.g., lowering the bucket, raising the forks, . . . ) or the complete pattern (e.g., loading soil, moving, unloading and moving back to the initial position).
In the case of automatic or semi-automatic vehicle control, safety measures have to be implemented to permit the pattern recognition to be terminated and to permit the driver to take full control of the vehicle and its components.
The advantages associated with the present invention over the prior art include, but are not limited to, increased productivity and increased driver comfort by anticipating the next actions and thus adapting the vehicle parameters accordingly.
This application claims priority from and the benefit of U.S. Patent Application Ser. No. 61/811,264 filed on Apr. 12, 2013, which is incorporated by reference in its entirety herein.