This application claims priority to European Patent Application No. 14193343.2, filed Nov. 14, 2014, which is incorporated by reference herein in its entirety for all purposes.
The disclosure relates to the field of machines of a kind comprising a body and an implement movable relative to the body.
A user of a machine of the kind having a machine body and an implement movable relative to the machine body can see directly from only one perspective at any one time. As such, an implement movable relative to the machine body may be visible to a user from only one perspective, such as a rear or side of the implement rather than, for example, from a front of the implement. Accordingly, when precise control of the position of the implement is necessary, a user may require additional information in order to position the implement accurately, particularly in respect of a part of the implement that is not directly visible to the user. Such assistance may be provided, for example, by a camera or by a colleague at a distance from the machine.
Even if a user has assistance of a colleague or from a camera, the user still needs to be able to make judgements about a future position of the implement in order to be able to adjust their control of the ground propulsion of the machine and/or their control of the position of the implement relative to the machine body in order to ensure that the implement arrives at the desired location relative, for example, to an article to be contacted by the implement.
A user may, over time, develop sufficient experience and familiarity to be able to infer a position of a part of an implement that is not directly visible to them. With yet further experience, a user may be able to make judgements regarding a future position of the implement on the basis of various control inputs and how to influence that future position by altering one or more control inputs.
Against this background, there is provided a system using radar apparatus for assisting a user of a machine of a kind comprising a body and an implement.
A system for providing assistance to a user of a machine of a kind comprising a body and an implement movable relative to the body, wherein the system comprises:
Specific embodiments of the disclosure will now be described, by way of example only, with reference to the accompanying drawings in which:
In a first embodiment, the system of the present disclosure may be employed in the context of a machine 100, a schematic illustration of which is shown in
The machine 100 may comprise a body 130 having a cab 140. The machine 100 may also comprise, as one of many options for implements, a loader bucket 120 at a front end of the machine that is movable relative to the body 130. While the illustrated embodiment shows a loader bucket 120, the implement may be interchangeable with a number of alternative implements.
Movement of the machine 100 and of the implement 120 is controllable by a user in the cab 140. A user may thereby control both ground propulsion of the machine and movement of the implement 120 relative to the body 130.
Ground propulsion of the machine 100 may be governed by devices and techniques that are well known in the art. In the case of the machine 100 of
Movement of the implement 120 relative to the body 130 may, for example, be actuated hydraulically and controlled by one or more levers that may be located in the cab 140 such that a user can operate them from the same position or a similar position as that for operating the controls that govern ground propulsion. Depending on the nature of the implement 120 and the mechanism of connection to the body 130 of the machine 100, the implement 120 may be controllable to move relative to the body 130 of the machine 100 with multiple degrees of freedom. The implement 120 of
In the
It will be appreciated that, for many combinations of arm angles and loader bucket angles, a front edge 122 of the loader bucket 120 is not visible to a user sitting in the cab 140 of the machine 100. For other combinations of arm angles and loader bucket angles where a front edge 122 of the loader bucket 120 is visible to a user sitting in the cab 140 of the machine 100, other features of the loader bucket 120, such as a top edge of the loader bucket 120, may not be visible to the user. Furthermore, since for many implements, including a loader bucket 120, there are further degrees of freedom, including the possibility of changing an angle between the bottom surface 126 of the bucket and a rear surface 128 of the bucket, there are further aspects of the implement and its position that may not be visible to the user when at certain angles.
The system may comprise one or more sensors, a processor and a display visible to a user in the cab of the machine 100.
In the embodiment of
The data library may be any source of data, including a source of data located within the processor or a source of data located remotely from the processor and perhaps accessed wirelessly over the internet or by other means.
The image processing software may be further configured to detect a reference point or points of the implement 120 and a reference point or points of the body 130 in order to determine a position of an implement 120 reference point relative to a body 130 reference point.
Having determined a position of at least one reference point on the implement 120 as well as details of the implement type and size from the data library, these two sets of data may be used by the image processing software to determine implement position data in respect of a wider range of reference points, perhaps including reference points that are not within the field of vision of the camera.
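The combination of one detected reference point with implement dimensions from the data library can be illustrated with a minimal sketch. All names here (the library, its keys, the function) are hypothetical, and a real system would also account for implement tilt and full 3-D pose.

```python
# Sketch: inferring unseen reference points from one detected point plus
# implement dimensions from a data library. A hypothetical library entry
# gives the implement width; the heading of the implement's front edge is
# assumed known from other sensors.
import math

# Hypothetical data library keyed by an implement identifier.
IMPLEMENT_LIBRARY = {
    "loader_bucket_2400": {"width_mm": 2400, "depth_mm": 900},
}

def infer_front_edge(implement_id, ref_point_mm, heading_rad):
    """Given one detected front-left corner (x, y) in machine coordinates
    and the front-edge heading, estimate the remaining front-edge points."""
    dims = IMPLEMENT_LIBRARY[implement_id]
    dx = math.cos(heading_rad) * dims["width_mm"]
    dy = math.sin(heading_rad) * dims["width_mm"]
    x, y = ref_point_mm
    return {"front_left": ref_point_mm,
            "front_centre": (x + dx / 2, y + dy / 2),
            "front_right": (x + dx, y + dy)}

corners = infer_front_edge("loader_bucket_2400", (0.0, 0.0), 0.0)
```

Points outside the camera's field of vision (here, the far corner and centre of the front edge) are thereby recovered from a single visible reference point.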
The display (not shown in
The display may be configured to display a live view of at least a part of the implement in the context of its surroundings. The live view may be that obtained by the first camera or another camera. Alternatively, the live view may be schematic in nature and may be generated from data derived from one or multiple cameras or other data derived from one or more other sensors.
In some embodiments, the display may not provide either a live view of an image or a schematic representation of a view. Rather, it may simply superimpose information (such as guidance or trajectory information) over a user's view. For example, in an embodiment involving a head-up display, guidance or trajectory information may be provided as an overlay of the actual view of the environment as seen by the user.
It is possible that the live view may be from an angle different to any and all of the cameras. For example, rather than having a camera located at a distance above the machine in order to obtain a direct bird's eye view, the live view may be a schematic bird's eye representation of the machine and implement in its surroundings assimilated from data obtained from a plurality of cameras or sensors whose field of vision or field of sensing may project outwardly from the machine 100. Such an assimilated schematic bird's eye view may be particularly useful in providing information to the user regarding how to position the machine (e.g., how to control the ground propulsion of the machine) relative to one or more articles in the surrounding environment, possibly before moving the implement 120 relative to the body 130.
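The assimilation of outward-facing sensor data into a single top-down frame can be sketched as a simple change of coordinates. The mounting poses and detections below are illustrative assumptions; a real system would calibrate each sensor's pose on the machine body.

```python
# Sketch of assembling a bird's eye map from several outward-facing
# sensors. Each sensor reports detections in its own 2-D frame; its
# mounting pose (offset and yaw on the machine body) maps them into one
# top-down machine frame for the schematic view.
import math

def to_machine_frame(point, mount_xy, mount_yaw_rad):
    """Rotate a sensor-frame point by the mounting yaw, then translate by
    the mounting offset, yielding machine-frame coordinates."""
    px, py = point
    c, s = math.cos(mount_yaw_rad), math.sin(mount_yaw_rad)
    mx, my = mount_xy
    return (mx + c * px - s * py, my + s * px + c * py)

# Assumed poses: front camera at (2.0, 0.0) facing forward; left camera
# at (0.0, 1.0) facing left (yaw +90 degrees).
front_det = to_machine_frame((3.0, 0.0), (2.0, 0.0), 0.0)
left_det = to_machine_frame((2.0, 0.0), (0.0, 1.0), math.pi / 2)
birds_eye = [front_det, left_det]
```

Each entry in `birds_eye` can then be drawn at its machine-frame position on the schematic bird's eye representation.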
The image processing software may be configured to superimpose onto the view a representation of one or more aspects of the implement that may not be visible to a user of the machine in the cab. For example, a front edge 122 of a loader bucket 120, as in the embodiment of
The display may instead or in addition provide raw data relating to a position of the implement rather than a view. For example, the display may show a height of the front edge of the loader bucket relative to the ground or relative to a reference position on the body 130. It may show a tilt angle of the loader bucket relative to a longitudinal horizontal direction and an angle of the loader bucket relative to a transverse horizontal direction. These might be displayed as one or more numerical values, perhaps also with icons and perhaps also with a colour-coded representation to signify appropriate (e.g., green) and inappropriate (e.g., red) values. One example of a display showing such information is shown in
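The colour-coded readout could be driven by logic along the following lines. The quantity names and acceptable ranges are placeholder assumptions, standing in for values a real system might draw from a data library.

```python
# Sketch of the colour-coded readout: classify each displayed value
# against per-quantity limits so the display can render it green
# (appropriate) or red (inappropriate).

# Hypothetical acceptable ranges per displayed quantity.
LIMITS = {
    "edge_height_m": (0.0, 3.5),
    "tilt_deg": (-10.0, 10.0),
}

def classify(name, value):
    """Return the display colour for one named quantity."""
    lo, hi = LIMITS[name]
    return "green" if lo <= value <= hi else "red"
```

For example, `classify("tilt_deg", 4.0)` would be rendered green, while `classify("tilt_deg", 15.0)` would be rendered red.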
In addition to representing present machine and implement position data, the system of the present disclosure may be used to provide the user with predictive information regarding a future position of the implement.
In one arrangement of such a predictive implementation, the processor may predict a future position of the machine and the implement on the basis of current sensor inputs. The processor may predict a future position initially assuming current sensor inputs are maintained at constant levels. Further functionality may react to a change in sensor inputs to update a predicted future position of the implement. The more input data that is provided, the more variables there may be when predicting a future position.
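The predict-then-update behaviour described above can be sketched as follows. The one-dimensional linear extrapolation is a deliberate simplification; the class and its names are illustrative only.

```python
# Sketch of the predict-then-update loop: hold the last-known control
# inputs, extrapolate a future position, and recompute whenever a new
# sensor reading arrives.

class ImplementPredictor:
    def __init__(self, position, velocity):
        self.position = position  # current implement position (m)
        self.velocity = velocity  # rate implied by current inputs (m/s)

    def on_sensor_update(self, position, velocity):
        """A changed input invalidates the old prediction."""
        self.position = position
        self.velocity = velocity

    def predict(self, horizon_s):
        """Future position assuming inputs are held constant."""
        return self.position + self.velocity * horizon_s

p = ImplementPredictor(position=0.0, velocity=0.5)
ahead = p.predict(2.0)                      # prediction under current inputs
p.on_sensor_update(position=0.2, velocity=0.0)
stopped = p.predict(2.0)                    # updated once the input changed
```

With more input variables, the same pattern applies: each sensor change triggers a fresh prediction from the new set of held-constant inputs.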
The machine 200 may, as illustrated in
The fork attachment 220 may itself include one or more sensors 229. One of the one or more sensors 229 may be a camera that may be configured to provide an image feed to the display and/or from which data may be calculated in order to provide a user with, for example, height information of the forks and/or width information regarding the current distance between the forks. In the case of an image feed, the user may therefore be able to view the forks in the context of, for example, a pallet that the user wishes to lift using the forks. Such functionality may be particularly appropriate where the pallet is at a height significantly above the user in the cab of the machine and where otherwise a user may (a) require the assistance of another person at a distance from the machine that allows that person to see the pallet and forks together or (b) need to leave the cab of the machine in order to view the position of the forks relative to the pallet from an alternative perspective.
Another feature applicable to all embodiments but explained in detail with respect to the
In some embodiments, one of the one or more sensors may be a camera that may be mounted on the body 230 of the machine 200 that may provide an image feed via a processor to a display. A schematic representation of such an image feed may be found in
In some embodiments, the trajectory may be two-dimensional while in other embodiments the trajectory may be three-dimensional.
The data relating to a steering angle of the machine may be provided by a position sensor on a steering rack that controls angles of the wheels to determine an angle of the steering relative to a longitudinal direction of the machine. Alternative techniques for sensing wheel angle may also be appropriate.
In a further variation, sensor readings indicative of changes in height in the forks as controlled by the user may also be provided to the processor such that trajectory of fork position may include also an indication of a future height. In this manner, future height may be calculated and superimposed on the image provided by the camera. Again, changes in the sensor reading indicative of implement height control may be fed into the processor and the trajectory may be updated in near real time to take account of such changes.
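The trajectory mapping combining steering angle, speed and fork-height rate might be sketched with a kinematic bicycle model, as below. The wheelbase, rates and time step are assumptions for illustration only.

```python
# Illustrative trajectory mapping: a kinematic bicycle model gives the
# ground path implied by the current steering angle and speed, while the
# current fork-height rate extrapolates height along that path. Each
# sample could be superimposed on the camera image.
import math

def fork_trajectory(speed, steer_rad, wheelbase, height, height_rate,
                    horizon_s=3.0, dt=0.1):
    """Return a list of (x, y, height) samples predicted from held inputs."""
    x = y = heading = 0.0
    samples = []
    for _ in range(int(round(horizon_s / dt))):
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
        heading += (speed / wheelbase) * math.tan(steer_rad) * dt
        height += height_rate * dt
        samples.append((x, y, height))
    return samples

# Straight ahead at 1 m/s with the forks rising at 0.2 m/s.
path = fork_trajectory(speed=1.0, steer_rad=0.0, wheelbase=2.5,
                       height=0.5, height_rate=0.2)
```

Rerunning the function whenever a steering or height input changes yields the near-real-time updates described above.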
In this way, an inexperienced user may be provided with sufficient information to be able to change the steering control and the implement height control simultaneously in order to arrive at a trajectory that meets with an article of interest. In the case of forks, a user may be in a position to change steering angle and fork height simultaneously so that the forks arrive at an appropriate position to pick up a desired pallet. The machine position and fork height may be controlled by a user on the basis of the feedback provided by the trajectory mapping element of the system such that both the machine and the forks arrive at the appropriate position in tandem. This may avoid an inexperienced user having to perform various manoeuvres in series, such as, in a first stage, positioning the machine in an appropriate position through altering the ground propulsion control, including steering, and, in a second stage started only after completion of the first stage, positioning the forks of the implement relative to the machine only after the machine is itself stationary. It also reduces the likelihood that errors in the first stage machine positioning are only identified by the user once the second stage fork lifting stage has been completed, resulting in the user having to return to the first stage of repositioning the machine altogether.
In a further variation, in the case of an implement having multiple degrees of freedom, these additional degrees of freedom may be accommodated by the trajectory mapping element of the system. Accordingly, for example, in the case of an implement capable of movement relative to the machine body in terms of height and angle, sensors in respect of the control of both of these aspects of implement position relative to the machine body may be provided to the processor for use in the trajectory mapping functionality in order to provide a user with detailed predictions of a future position of the implement based on current control inputs, and may update in near real time in the case of changes to any of those inputs.
As the skilled person would appreciate, in the case of the backhoe loader exemplified by
In the case of excavator 300, the degrees of freedom of the implement relative to the machine may be different from those associated with the loader bucket 120 shown in the context of the machine 100 of
The excavator may comprise a body 330 rotationally mounted on a drive portion 335 that may comprise tracks for ground propulsion. Rotational mounting may allow rotation about an axis that projects normal to ground on which the drive portion rests. The body 330 may comprise a cab 340 from which a user may control both ground propulsion of the excavator 300 and movement of the grapple 320 relative to the body 330.
The excavator 300 may further comprise a first arm 350 having a first end 351 and a second end 352. The first end 351 of the first arm 350 may be pivotally connected to the body 330 via a first pivot 355 (not shown). The excavator may further comprise a second arm 360 having a first end 361 and a second end 362. The first end 361 of the second arm 360 may be pivotally connected via a second pivot 365 to the second end 352 of the first arm 350. The second arm 360 may comprise an implement coupling portion 368 at the second end 362 of the second arm 360. An axis of the first pivot 355 may be parallel to an axis of the second pivot 365.
In the illustrated embodiment of
The grapple 320 may comprise a first jaw 321 and a second jaw 322. The first jaw 321 may be openable and closable relative to the second jaw 322 and vice versa via a hinging arrangement 323.
The system may comprise one or more sensors, a processor and a display visible to a user in the cab of the excavator 300. The one or more sensors may comprise a grapple camera 324 located in the grapple 320, perhaps between the two jaws 321, 322 adjacent the hinging arrangement 323 such that when the grapple jaws 321, 322 are open the camera may provide an image to include opening teeth 325, 326 of each of the two jaws 321, 322 and any article that may be in the vicinity of the teeth 325, 326. This may assist a user in aligning the jaws 321, 322 relative to the article in order to grab the article securely between the jaws 321, 322. For example, image processing software may be configured to represent the grapple (either schematically or as a live camera view) and to superimpose on the representation a projection of a future position of the grapple jaws based on current control inputs. This may be updated by the image processing software in the event that inputs change.
In addition, the embodiment of
Data obtained from the camera and sensors may, for example, be used to produce for display to the user a schematic representation of the grapple 320 relative to an object within view of the grapple camera 324 whose dimensions and position may be obtained from the view provided by the grapple camera 324. The schematic representation may show the grapple from an assimilated position adjacent the grapple, even though there may not be a camera at that location. Schematic representations of the implement relative to the machine body may show its position relative to other articles in the surrounding environment such as, but not limited to, obstacles that the user may have reason to want to avoid.
In addition or in the alternative, such data may also be provided to a user in a variety of formats, including raw distances and angles and with respect to relative scales.
A wide variety of grapple implements are known in the art.
Further grapples are contemplated within the scope of the disclosure. For many types of grapples, such as the sorting grapple of
An example of the kind of image that might be displayed is shown in
As can be seen from the first, second and third blade configurations, the blade 520 may comprise a hinge. A first portion 521 of the blade 520 may be situated on a first side of the hinge and a second portion 522 of the blade 520 may be situated on a second side of the hinge. The hinge may enable the blade 520 to be used in a single straight blade configuration or in a folded configuration whereby the first portion 521 of the blade 520 is at an angle other than 180° with respect to the second portion 522 of the blade 520.
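The hinged blade geometry can be expressed in a short top-down sketch. The portion lengths and the convention that the fold angle is measured between the two portions at the hinge are assumptions for illustration.

```python
# Geometry sketch for the hinged blade: given the hinge position, the two
# portion lengths and the fold angle between the portions, compute the
# blade end points for a top-down schematic. A fold angle of 180 degrees
# reproduces the single straight blade configuration.
import math

def blade_endpoints(hinge_xy, len1, len2, heading_rad, fold_rad):
    """First portion runs back along the heading from the hinge; the
    second leaves the hinge at the fold angle relative to the first."""
    hx, hy = hinge_xy
    end1 = (hx - len1 * math.cos(heading_rad),
            hy - len1 * math.sin(heading_rad))
    ang2 = heading_rad + (math.pi - fold_rad)
    end2 = (hx + len2 * math.cos(ang2), hy + len2 * math.sin(ang2))
    return end1, end2

# Straight configuration: the two portions are collinear.
straight = blade_endpoints((0.0, 0.0), 1.5, 1.5, 0.0, math.pi)
```

A fold angle of 90 degrees instead places the second portion perpendicular to the first, matching a folded blade configuration.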
In addition, the blade 520 may be movable up and down relative to the body 530 of the track-type tractor 500. Furthermore, one side of the blade 520 may be movable up and down independently of an opposite side of the blade 520 such that the blade 520 may be lower at a first end than at a second end. Furthermore, while in
In this embodiment, as in the previous embodiments, sensors may be used to detect the implement type (since the blade may be substituted for another implement), angle, tilt and hinge position. Furthermore, sensors may be configured to provide data regarding machine ground propulsion control. Such sensors may include those known in the art. For example, sensors relating to speed of a machine relative to the ground are known in machines for providing data to a speedometer for display to a user. Furthermore, the sensors may be configured to feed back changes to the sensed parameters at frequent intervals. The data obtained from these sensors may be processed in a processor and used to provide information to the user via a display.
As referred to above in respect of the examples illustrated, implements may be interchangeable. This may be the case for many machines known in the art. For example, the track-type tractor of
For implements such as these (and others), in one embodiment, the system of the present disclosure may provide a schematic bird's eye view of the machine in its environment on which are superimposed various representations of widths and areas relative to the implement. An example of this is shown in
First, the system may be configured to obtain information regarding the implement type and size. This may be obtained in any manner including that of alphanumeric recognition of a code on the implement and visible to a camera on the machine, as described above with reference to the
Based on information regarding machine type, there may be superimposed onto the bird's eye view schematic representation pairs of (potentially) parallel lines representing any or all of the following:
Other representations may also be shown, depending on the implement selected.
The information required for the
A schematic illustration of the various criteria that may be detected for use by the system of any of the embodiments illustrated herein is provided in
The following is a list showing some of the variables to be sensed and, in each case, a representative example of the kind of sensor that may be used:
Type of attachment: camera with image processing algorithm, or non-contacting inductive (RFID) sensor.
Steering angle: non-contacting magnetic position sensor.
Snow blower nozzle direction: non-contacting magnetic position sensor.
Attachment angle: non-contacting magnetic position sensor.
Machine speed: inductive or Hall effect shaft rotation speed sensor.
Blade angle: non-contacting linear position sensor.
Front linkage height, tilt and/or angle: non-contacting rotary magnetic position sensor.
Machine level: accelerometer or gyroscopic inclination sensor.
Fork width: camera with image processing algorithm.
Forward radar: radar-based sensor with object detection algorithms.
Forward (work zone) camera: optical camera.
Downward (below ground) radar: radar-based sensor with object detection algorithms.
'Bird's eye' camera view: multiple cameras with image processing/stitching functionality.
In some embodiments, the implement itself may comprise a camera configured to provide data to a user via the system of the disclosure. This may be particularly useful when aligning the implement with an article to be contacted by the implement.
In a further embodiment, shown in
An embodiment including a safety zone may be particularly applicable to attachments that are typically used while the machine is moving relative to its surroundings. For example, a snow blower (see
There may be potential hazards associated with this. In the case of the broom 950 example, there may be a risk of debris being propelled a significant distance from the broom 950. Similarly, in the case of a snow blower 960, there may be the intention that snow is diverted in a particular direction from the snow blower 960.
In order to reduce potential hazards, it may be advisable for implements not to be used in close proximity to people and/or particular features in the surroundings. A safe distance from an implement may depend on a number of factors including implement type, implement-specific parameters (e.g., rotational speed, in the case of a broom), position of the implement relative to the machine body, forward or backward propulsion speed of the machine, steering position of the machine, and potentially many other factors.
For example, in the case of the broom, when forward or rearward propulsion of the machine is fast, an appropriate distance from the machine may be greater than when it is slow. Similarly, when the broom is rotating fast, an appropriate distance from the machine may be greater than when it is slow. The distance may be different depending on the direction from the implement. For example, the distance may be longer in front of the implement than to the side of the implement. A safety zone may be said to be defined by a perimeter around the implement inside which it is inadvisable to enter. The size and shape of the safety zone may depend on a wide number of variables. The size and shape of the safety zone may be determined on the basis of a combination of the current input data and may be obtained, for example, from a data library or similar or calculated in the processor.
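A minimal sketch of such a safety-zone calculation follows. The coefficients and base distance are placeholder assumptions, standing in for values a real system might obtain from a data library.

```python
# Illustrative safety-zone sizing for a rotating implement such as a
# broom: the perimeter grows with ground speed and rotational speed, and
# extends further in front of the implement than to its sides.

def safety_zone(base_m, ground_speed, rotor_speed,
                k_speed=0.5, k_rotor=0.01):
    """Return (front, side) clearance distances in metres."""
    growth = k_speed * ground_speed + k_rotor * rotor_speed
    front = base_m + growth          # full growth ahead of the implement
    side = base_m + 0.5 * growth     # half the growth to the sides
    return front, side

slow = safety_zone(2.0, ground_speed=1.0, rotor_speed=100.0)
fast = safety_zone(2.0, ground_speed=4.0, rotor_speed=100.0)
```

As the prose above suggests, the faster configuration yields the larger zone in every direction.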
The safety zone may, in one embodiment, simply be represented schematically to a user on a display. An example of such an embodiment is shown in
Articles present within or outside the safety zone may be represented in the bird's eye view. For example, a person might be represented by an image or perhaps by an icon on the display at the position where the person is detected.
In a further embodiment, instead or in addition, the system may automatically override control of the implement (for example, rotation of a broom) in order to reduce risk in the event that an obstacle (for example, a person) is detected within the safety zone. The system may alternatively or in addition automatically override control of ground propulsion of the machine, perhaps by applying a brake to prevent ground propulsion of the machine.
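The override decision could take a form along these lines. Obstacle detection and actuation are stubbed out; only the decision logic is shown, and the circular zone is a simplifying assumption.

```python
# Sketch of the automatic override: if any detected obstacle falls inside
# the current safety-zone radius, stop the implement and request that the
# brake be applied to prevent ground propulsion.

def override_decision(obstacles, zone_radius_m):
    """obstacles: list of (x, y) positions relative to the implement.
    Returns the commands the system would issue."""
    intrusion = any(x * x + y * y <= zone_radius_m ** 2
                    for x, y in obstacles)
    return {"stop_implement": intrusion, "apply_brake": intrusion}

clear = override_decision([(6.0, 0.0)], zone_radius_m=4.0)
person_near = override_decision([(6.0, 0.0), (2.0, 1.0)], zone_radius_m=4.0)
```

The non-circular zones described above would substitute a direction-dependent perimeter test for the simple radius comparison.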
In some embodiments, implements may benefit from further sensing relative to the surrounding environment. For example, a saw may benefit from sensing in relation to unseen features that may be beneath or within a surface to be cut by the saw. For example, where a saw might be used to cut into a surface, it may be beneficial to detect pipes, cables, steel and other objects that may be hidden within the surface to be cut.
Saw implements, together with radar detection zone and optical camera focus zone are illustrated in
To this end, a saw implement may be equipped with radar functionality. The radar functionality may be configured to detect objects that might include pipes, cables, steel and other objects, including those that may be hidden behind or within other objects and surfaces. The radar functionality may be provided by a radar apparatus focusable on a position of interest ahead of the current saw position. In tandem with this there may be provided an optical camera focused on the same or a similar position of interest. In this way, the processor may receive data from the radar apparatus and the optical camera and process the information in order to superimpose information obtained via those techniques onto a schematic representation of the implement and its surroundings, so that a user may be provided with information to assist in controlling ground propulsion of the machine and in controlling the implement relative to the machine body in order to determine a preferred cutting path. The sensor data may be provided in near real time and the processor may operate in near real time in order to provide the information to the user in near real time, such that the user may continuously adapt their control of the machine and implement to plot a preferred path for the implement relative to the surroundings, such as a surface to be cut.
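The fusion of radar returns with a planned cut line might be sketched as follows. The 2-D geometry and the clearance distance are simplifying assumptions.

```python
# Sketch of fusing radar returns with a planned cut line: mark any
# detected hidden object lying within a clearance distance of the line so
# it can be highlighted on the schematic overlay ahead of the saw.
import math

def hazards_on_path(path_points, radar_hits, clearance_m):
    """Return radar hits closer than clearance_m to any path sample."""
    flagged = []
    for hx, hy in radar_hits:
        if any(math.hypot(hx - px, hy - py) <= clearance_m
               for px, py in path_points):
            flagged.append((hx, hy))
    return flagged

# Straight cut along x; one buried object near the line, one well away.
path = [(i * 0.5, 0.0) for i in range(10)]
flagged = hazards_on_path(path, [(2.0, 0.1), (2.0, 3.0)], clearance_m=0.3)
```

Only the object near the planned line is flagged, allowing the user to adapt the cutting path before reaching it.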
In some embodiments, there may be more than one radar apparatus. For example, there may be a first radar apparatus configured to obtain information forward of the saw and relevant to forward saw trajectory plotting and there may be a second radar apparatus configured to obtain information closer to a current area of impact of the saw.
In this way, there may be information provided to influence user control of forward movement of the saw as well as information provided to influence user control of the saw blade at the current position. This may be particularly effective in preventing unintentional cutting of an object just before said cutting may be due to occur.
In a further embodiment of the disclosure, sensor information may be used to assist a user in positioning a machine 100 having a loading implement (such as a loader bucket 120) adjacent a truck into which the user may wish to deposit contents of the loader bucket 120. Such information may be presented to a user on a display and/or may provide the user with information in other ways such as, for example, providing an audible indication when a part of the machine or implement comes within a particular distance of the truck. In some embodiments, there may be active braking that engages a brake of the machine in order to prevent a collision. It may, alternatively or in addition, prevent a user from further movement of an implement where such movement would follow a trajectory that would result in a collision of the implement with, for example, a side of the truck.
While various implements have been disclosed in relation to various machine types, it will of course be appreciated by the skilled person that the specific machine-implement combinations described and illustrated herein are merely indicative. To the extent that an implement is compatible for attachment to a particular machine, the system of the present disclosure may be equally applicable to that implement-machine combination.
For example, the disclosure may be equally applicable to machines having any kind of ground propulsion known in the art. The system may use input data regarding ground propulsion without knowledge of how that ground propulsion may be achieved. Accordingly, any particular embodiment of the disclosure is not limited to whether the machine with which it operates is propelled by wheels, tracks, a combination of the two or any other means. Other means of ground propulsion than those explicitly recited herein are known in the art.
The disclosure may be applicable to a machine having a wide range of different implement possibilities. The implement may be permanently attached to a particular machine or couplable to the machine, and therefore substitutable for one or more of a further range of implements.
While some particular combinations of sensors and inputs have been disclosed in relation to specific embodiments, the skilled person would appreciate that the different combinations of sensors and inputs may be applicable to different embodiments of machine and implement. The disclosure is not to be understood as limited to the specific combination of sensors and inputs disclosed in respect of the specific machines and implements. Any combination of sensors and inputs may be equally applicable to any combination of machine and implement.
The present disclosure finds application in improving the efficient use of a machine, particularly though not exclusively by an inexperienced user.
Number | Date | Country | Kind |
---|---|---|---
14193343.2 | Nov 2014 | EP | regional |