The present invention relates to an autonomous work machine, a control method of the autonomous work machine, and a storage medium.
There is conventionally known an autonomous work machine that defines a working area using a plurality of markers and performs a work in the defined working area.
PTL 1 discloses a weeding apparatus that extracts, from a captured image, images of markers set in advance at predetermined positions in a working area, obtains a current position based on the extracted marker images, and performs autonomous traveling according to a working line.
PTL 1: Japanese Patent Laid-Open No. 2017-158532
However, if the autonomous work machine travels toward a gap between markers, the autonomous work machine may deviate out of the working area defined by the markers. If the autonomous work machine passes between markers and deviates out of the working area, an accident may occur; for example, the autonomous work machine may submerge in a pond existing outside the working area.
The present invention provides a technique for making it possible to continue a work without causing an autonomous work machine to deviate out of a working area.
According to one aspect of the present invention, there is provided an autonomous work machine that performs a work in a working area defined by a marker, comprising:
a camera;
a detection unit configured to detect the marker from a captured image of the camera; and
a control unit configured to, if the marker deviates from a detection range of the detection unit, control at least one of a traveling speed and a traveling direction of the autonomous work machine.
Further features of the present invention will become apparent from the following description of exemplary embodiments (with reference to the attached drawings).
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and, together with the description, serve to explain principles of the invention.
Hereinafter, the embodiments of the present invention will be described with reference to the accompanying drawings. It should be noted that the same reference numerals denote the same constituent elements throughout the drawings.
In
The vehicle body 12 of the working vehicle 10 includes a chassis 12a and a frame 12b attached to the chassis 12a. The front wheels 14 are two left and right small-diameter wheels fixed to the front part of the chassis 12a via the stay 13. The rear wheels 16 are two left and right large-diameter wheels attached to the rear part of the chassis 12a.
The blade 20 is a lawn mowing rotary blade attached near the central position of the chassis 12a. The working motor 22 is an electric motor arranged above the blade 20. The blade 20 is connected to and rotated by the working motor 22. The motor holding member 23 holds the working motor 22. The rotation of the motor holding member 23 with respect to the chassis 12a is restricted. In addition, the motor holding member 23 is permitted to move vertically by a combination of a guide rail and a slider that moves vertically while being guided by the guide rail.
The blade height adjusting motor 100 is a motor for adjusting the height of the blade 20 in the vertical direction from a ground surface GR. The translation mechanism 101 is connected to the blade height adjusting motor 100, and converts the rotation of the blade height adjusting motor 100 into a vertical translational movement. The translation mechanism 101 is also connected to the motor holding member 23 for holding the working motor 22.
The rotation of the blade height adjusting motor 100 is converted into the translational movement (vertical movement) by the translation mechanism 101, and this translational movement is transmitted to the motor holding member 23. The translational movement (vertical movement) of the motor holding member 23 causes the working motor 22 held by the motor holding member 23 to translationally move (vertically move). The height of the blade 20 from the ground surface GR can be adjusted by the vertical movement of the working motor 22.
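As a purely illustrative sketch of this conversion (the embodiment does not specify the type of the translation mechanism 101; a lead screw of lead p is assumed here), the blade height change produced by a rotation angle θ of the blade height adjusting motor 100 would be

$\Delta h = \dfrac{p}{2\pi}\,\theta$

For example, under the assumed lead of p = 2 mm per revolution, ten motor revolutions (θ = 20π rad) would raise or lower the blade 20 by 20 mm.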
The traveling motors 26 are two electric motors attached to the chassis 12a of the working vehicle 10. The two electric motors are connected to the left and right rear wheels 16. With the front wheels 14 serving as driven wheels and the rear wheels 16 as driving wheels, the left and right rear wheels are independently rotated forward (in an advancing direction) or backward (in a retreating direction). This allows the working vehicle 10 to move in various directions.
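A minimal sketch of how independently commanded left and right rear wheels produce the various motions mentioned above is given below; the wheel radius and track width are assumed values that do not appear in the embodiment.

```python
# Differential-drive kinematics sketch (illustrative only; the embodiment does
# not give a wheel radius or track width, so the values below are assumptions).
WHEEL_RADIUS_M = 0.10  # assumed radius of the rear wheels 16
TRACK_WIDTH_M = 0.35   # assumed distance between the left and right rear wheels 16


def body_motion(omega_left: float, omega_right: float) -> tuple[float, float]:
    """Return (forward velocity [m/s], yaw rate [rad/s]) from wheel speeds [rad/s].

    Equal positive speeds advance the vehicle, equal negative speeds retreat it,
    and unequal speeds turn it, which is how the two traveling motors 26 move
    the working vehicle 10 in various directions.
    """
    v_left = WHEEL_RADIUS_M * omega_left
    v_right = WHEEL_RADIUS_M * omega_right
    forward_velocity = (v_left + v_right) / 2.0
    yaw_rate = (v_right - v_left) / TRACK_WIDTH_M
    return forward_velocity, yaw_rate
```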
The charging terminal 34 is a charging terminal installed in the front end position of the frame 12b in the front-and-rear direction. The charging terminal 34 can receive power from a charging station (not shown) when connected to a corresponding terminal of the charging station. The charging terminal 34 is connected to the charging unit 30 by a line, and the charging unit 30 is connected to the battery 32. The working motor 22, the traveling motors 26, and the blade height adjusting motor 100 are also connected to the battery 32, and receive power from the battery 32.
The ECU 44 is an electronic control unit including a microcomputer formed on a circuit board, and controls the operation of the working vehicle 10. Details of the ECU 44 will be described later. If an abnormality occurs in the working vehicle 10, the notification unit 35 notifies the user of the occurrence of the abnormality. For example, the notification is made by voice or display. Alternatively, the notification unit 35 outputs the abnormality occurrence to an external device connected to the working vehicle 10 by wire or wirelessly. The user can thus be informed of the occurrence of the abnormality via the external device.
The ECU 44 is connected to the various sensors S. The sensors S include an azimuth sensor 46, a GPS (Global Positioning System) sensor 48, a wheel speed sensor 50, an angular velocity sensor 52, an acceleration sensor 54, a current sensor 62, and a blade height sensor 64.
The azimuth sensor 46 and the GPS sensor 48 are sensors for obtaining information on the direction and the position of the working vehicle 10. The azimuth sensor 46 detects the azimuth based on terrestrial magnetism. The GPS sensor 48 receives radio waves from GPS satellites and detects information indicating the current position (latitude and longitude) of the working vehicle 10. Note that the method of estimating the current position is not limited to the use of GPS. The current position may be obtained by an arbitrary satellite positioning system such as a GNSS (Global Navigation Satellite System), or based on an image capturing result of a camera.
The wheel speed sensor 50, the angular velocity sensor 52, and the acceleration sensor 54 are sensors for obtaining information on the moving state of the working vehicle 10. The wheel speed sensor 50 detects the wheel speeds of the left and right rear wheels 16. The angular velocity sensor 52 detects the angular velocity around the vertical axis (the z-axis in the perpendicular direction) at the barycentric position of the working vehicle 10. The acceleration sensor 54 detects accelerations acting on the working vehicle 10 in the directions of three perpendicular axes, that is, the x-, y-, and z-axes.
The current sensor 62 detects the current consumption (power consumption) of the battery 32. The detection result of the current consumption (power consumption) is saved in the memory 44c of the ECU 44. When a predetermined power amount is consumed and the power amount stored in the battery 32 becomes equal to or lower than a threshold value, the ECU 44 performs control for returning the working vehicle 10 to the charging station (not shown) in order to charge the working vehicle 10.
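A minimal sketch of this return-to-charge decision is shown below; the battery capacity, the threshold, and the sampling period are assumed values, and accumulating current-sensor readings into an energy figure is only one possible implementation.

```python
# Charge-monitoring sketch: current-consumption samples from the current sensor
# 62 are integrated into consumed energy, and a return-to-station flag is
# raised once the remaining stored power falls to a threshold.
# Capacity, threshold, and sampling period are assumptions.
BATTERY_CAPACITY_WH = 200.0
RETURN_THRESHOLD_WH = 50.0
SAMPLE_PERIOD_S = 1.0


class ChargeMonitor:
    def __init__(self) -> None:
        self.consumed_wh = 0.0

    def update(self, current_a: float, voltage_v: float) -> bool:
        """Accumulate one sample of consumption and report whether the working
        vehicle 10 should be returned to the charging station."""
        self.consumed_wh += current_a * voltage_v * SAMPLE_PERIOD_S / 3600.0
        remaining_wh = BATTERY_CAPACITY_WH - self.consumed_wh
        return remaining_wh <= RETURN_THRESHOLD_WH
```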
The blade height sensor 64 detects the height of the blade 20 from the ground surface GR. The blade height sensor 64 outputs the detection result to the ECU 44. Under the control of the ECU 44, the blade height adjusting motor 100 is driven, and the blade 20 vertically moves, thereby adjusting the height from the ground surface GR.
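The height adjustment can be pictured as a simple closed loop; the sketch below assumes a hypothetical up/down command encoding and a tolerance that are not part of the embodiment.

```python
# Blade height adjustment sketch: the reading of the blade height sensor 64 is
# compared with a target height, and the blade height adjusting motor 100 is
# driven up or down until the target is reached. Tolerance and command
# encoding (+1 raise, -1 lower, 0 stop) are assumptions.
HEIGHT_TOLERANCE_MM = 1.0


def height_adjust_command(measured_height_mm: float, target_height_mm: float) -> int:
    """Return +1 to raise the blade 20, -1 to lower it, or 0 to stop the motor."""
    error = target_height_mm - measured_height_mm
    if abs(error) <= HEIGHT_TOLERANCE_MM:
        return 0
    return 1 if error > 0 else -1
```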
The outputs from the various sensors S are input to the ECU 44 via the I/O 44b. Based on the outputs from the various sensors S, the ECU 44 supplies power from the battery 32 to the traveling motors 26, the working motor 22, and the blade height adjusting motor 100. The ECU 44 controls the traveling motors 26 by outputting control values via the I/O 44b, thereby controlling traveling of the working vehicle 10. The ECU 44 also controls the blade height adjusting motor 100 by outputting a control value via the I/O 44b, thereby controlling the height of the blade 20. Furthermore, the ECU 44 controls the working motor 22 by outputting a control value via the I/O 44b, thereby controlling the rotation of the blade 20. The I/O 44b can function as a communication interface, and can be connected to an external device (for example, a communication device such as a smartphone or a personal computer) 350 via a network 302 by wire or wirelessly.
<Processing>
The procedure of processing executed by the working vehicle 10 according to this embodiment will be described next with reference to the flowchart of
In step S402, the ECU 44 detects a marker from the captured image. Here, the marker is a mark arranged in a working area in advance to define the working area where the working vehicle 10 performs a work. For example,
In step S403, the ECU 44 controls the operation of the working vehicle 10 based on the captured image and the markers. The ECU 44 can calculate and obtain distance information between the working vehicle 10 and a marker existing in front of the working vehicle 10 based on a plurality of captured images captured by the camera unit 11. The ECU 44 controls the operation of the working vehicle 10 based on the distance information to the marker. At this time, the ECU 44 may set a virtual wire between the detected markers. The ECU 44 may recognize the set virtual wire as the boundary of the working area and control the operation (for example, at least one of the traveling speed and the traveling direction) of the working vehicle 10 so that the working vehicle 10 does not deviate out of the working area across the virtual wire. This enables more precise control because, while the markers are being detected, an operation such as a turn can be performed without crossing the virtual wire.
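A minimal sketch of the virtual-wire check in step S403 is shown below. It assumes that the positions of two detected markers and of the vehicle (current and planned next position) are already expressed in a common ground-plane coordinate frame; the distance estimation from the plural captured images of the camera unit 11 is not reproduced here.

```python
# Virtual-wire sketch for step S403: the segment joining two detected markers
# is treated as the boundary of the working area, and a planned move is
# rejected if it would cross that segment. All positions are assumed to be
# (x, y) tuples in a common ground-plane frame.


def crosses_virtual_wire(p_now, p_next, marker_a, marker_b) -> bool:
    """True if moving from p_now to p_next crosses the virtual wire set
    between marker_a and marker_b, i.e. the working vehicle 10 would leave
    the working area across the boundary."""
    def cross(a, b, c):
        # z-component of (b - a) x (c - a); its sign tells on which side c lies
        return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

    d1 = cross(marker_a, marker_b, p_now)
    d2 = cross(marker_a, marker_b, p_next)
    d3 = cross(p_now, p_next, marker_a)
    d4 = cross(p_now, p_next, marker_b)
    return d1 * d2 < 0 and d3 * d4 < 0
```

When this check reports a crossing, the ECU 44 would, for example, reduce the traveling speed or turn before reaching the wire.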
In step S404, the ECU 44 obtains a captured image captured by the camera unit 11. In step S405, the ECU 44 detects markers from the captured image obtained in step S404. If markers are detected, the process returns to step S403. On the other hand, if no marker is detected (that is, if markers deviate from the detection range), the process advances to step S406.
In step S405, for example, if a plurality of markers are detected from the captured image of the preceding frame, the process may advance to step S406 in a case in which none of the plurality of markers is detected from the captured image of the current frame any more (in a case in which the markers deviate from the detection range).
Alternatively, in step S405, if a plurality of markers are detected from the captured image of the preceding frame, the process may advance to step S406 in a case in which at least one of the plurality of markers is not detected from the captured image of the current frame any more (in a case in which the marker deviates from the detection range). If two markers are detected from the captured image of the preceding frame, the process may advance to step S406 in a case in which one of the two markers is not detected from the captured image of the current frame any more (in a case in which the marker deviates from the detection range). In the example shown in
Alternatively, in step S405, if a marker detected in the captured image of the preceding frame is not recognized as a marker in the captured image of the current frame, the process may advance to step S406. This corresponds to a case in which, for example, the working vehicle 10 approaches a marker too closely, and the marker included in the captured image can no longer be recognized as a marker.
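The alternative criteria of step S405 described in the preceding paragraphs can be summarized by the following sketch; representing each frame by the set of detected marker identifiers is an assumption made purely for illustration.

```python
# Step S405 loss criteria (sketch). prev_ids / curr_ids are the sets of marker
# identifiers detected in the preceding and the current frame; which criterion
# is used is a design choice.


def markers_lost(prev_ids: set, curr_ids: set, criterion: str = "all") -> bool:
    """Decide whether the process should advance to step S406.

    "all":        every previously detected marker has left the detection range.
    "any":        at least one previously detected marker has left the range.
    "one_of_two": exactly two markers were detected and one of them was lost.
    """
    if not prev_ids:
        return False
    lost = prev_ids - curr_ids
    if criterion == "all":
        return lost == prev_ids
    if criterion == "any":
        return bool(lost)
    if criterion == "one_of_two":
        return len(prev_ids) == 2 and len(lost) == 1
    raise ValueError(f"unknown criterion: {criterion}")
```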
In step S406, the ECU 44 determines whether the working vehicle 10 is moving in the boundary direction of the working area. In the example shown in
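A minimal sketch of the determination in step S406 is given below, under the assumption that the bearing toward the boundary (for example, toward the midpoint of the most recently detected markers) is available; the angular margin is an assumed value.

```python
import math

# Step S406 sketch: the working vehicle 10 is regarded as moving in the
# boundary direction when its heading lies within an angular margin of the
# bearing toward the last known boundary. Margin and bearing source are
# assumptions.
BOUNDARY_MARGIN_RAD = math.radians(30.0)


def moving_toward_boundary(heading_rad: float, bearing_to_boundary_rad: float) -> bool:
    # wrap the difference into (-pi, pi] before comparing with the margin
    diff = (bearing_to_boundary_rad - heading_rad + math.pi) % (2.0 * math.pi) - math.pi
    return abs(diff) <= BOUNDARY_MARGIN_RAD
```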
In step S407, the ECU 44 controls at least one of the traveling speed and the traveling direction of the working vehicle 10. For example, the traveling speed of the working vehicle 10 is reduced from the current traveling speed, and/or the working vehicle 10 is turned to change the traveling direction. For example, the working vehicle 10 is turned to one of the left and right sides such that the new direction forms 90° with respect to the current traveling direction. Alternatively, the working vehicle 10 may be turned to one of the left and right sides such that the new direction forms 180° with respect to the current traveling direction. That is, the working vehicle 10 may be turned such that it travels in the direction opposite to the current traveling direction. Note that the angle is not limited to 90° or 180°, and may be an arbitrary angle. The angle may be selected and decided from the range of, for example, 45° to 180°. Alternatively, the working vehicle 10 may temporarily stop and then retreat in the direction opposite to the traveling direction.
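A minimal sketch of the control in step S407 follows; the speed reduction factor and the default turn angle are assumptions, while the admissible angle range is taken from the description above.

```python
# Step S407 sketch: reduce the traveling speed and turn by an angle selected
# from the admissible range (45 deg to 180 deg), or temporarily stop and then
# retreat. Reduction factor and default angle are assumptions.
SPEED_REDUCTION_FACTOR = 0.5
DEFAULT_TURN_DEG = 90.0  # 180 deg would reverse the traveling direction


def avoidance_command(current_speed: float, retreat: bool = False) -> dict:
    """Return a simple command dictionary for the traveling control."""
    if retreat:
        return {"speed": 0.0, "action": "stop_then_retreat"}
    return {
        "speed": current_speed * SPEED_REDUCTION_FACTOR,  # slow down from the current speed
        "action": f"turn_{DEFAULT_TURN_DEG:.0f}_deg",     # turning side chosen as discussed below
    }
```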
In the example shown in
However, the working vehicle 10 need not always be turned in the direction away from the boundary. The working vehicle 10 may turn in the direction approaching the boundary as long as it does not deviate across the boundary regardless of whether it turns to the left or to the right with respect to the current traveling direction.
An example of a method of deciding the turning direction in a case in which a turn can be made to either the left or the right will be described here.
Note that a method of determining which working area has a large working amount will be described with reference to
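Although the figure-based method referred to above is not reproduced here, one hedged sketch of such a decision is to maintain a record of which parts of the working area have already been worked and to turn toward the side with the larger remaining working amount; the coverage record itself is an assumption.

```python
# Turn-direction sketch (assumption: simple coverage records of the regions to
# the left and right of the current heading are available, where True marks a
# cell that has already been worked).


def choose_turn_side(coverage_left: list[bool], coverage_right: list[bool]) -> str:
    """Turn toward the side whose region still has the larger working amount."""
    unworked_left = sum(1 for done in coverage_left if not done)
    unworked_right = sum(1 for done in coverage_right if not done)
    return "left" if unworked_left >= unworked_right else "right"
```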
In step S408, the ECU 44 determines whether to continue the processing. For example, if the work in the working area is completed, or the user powers off the working vehicle 10, the processing is ended. Otherwise, it is determined to continue the processing. Upon determining, in this step, to continue the processing, the process returns to step S401. On the other hand, upon determining to end the processing, the series of processes shown in
In step S409, the ECU 44 controls the operation of the working vehicle 10 based on the captured image. For example, the ECU 44 can calculate and obtain information of the distance between the working vehicle 10 and an object (for example, a tree, a stone, or a rock in the working area) existing in front of the working vehicle 10 based on a plurality of captured images captured by the camera unit 11. The ECU 44 then controls the operation of the working vehicle 10 based on the distance information to the object. If no obstacle exists in front, control may be performed to maintain the traveling speed and the traveling direction. After the processing of this step, the process returns to step S401. The processing shown in
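A minimal sketch of the distance-based control in step S409 is shown below; the thresholds and the linear speed scaling are assumptions, and the distance estimation from the plural captured images is not reproduced.

```python
from typing import Optional

# Step S409 sketch: adjust the traveling speed from the estimated distance to
# an object (for example, a tree, a stone, or a rock) in front of the working
# vehicle 10. Thresholds and the scaling rule are assumptions.
STOP_DISTANCE_M = 0.5
SLOW_DISTANCE_M = 2.0


def speed_from_obstacle_distance(distance_m: Optional[float], cruise_speed: float) -> float:
    """Return the commanded traveling speed; None means no obstacle ahead, in
    which case the traveling speed and direction are maintained."""
    if distance_m is None or distance_m > SLOW_DISTANCE_M:
        return cruise_speed
    if distance_m <= STOP_DISTANCE_M:
        return 0.0
    # scale the speed linearly between the stop and slow-down thresholds
    return cruise_speed * (distance_m - STOP_DISTANCE_M) / (SLOW_DISTANCE_M - STOP_DISTANCE_M)
```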
Note that in the processing shown in
Also, in the processing shown in
As described above, in this embodiment, an example has been described in which, if the markers deviate from the detection range, at least one of the traveling speed and the traveling direction of the autonomous work machine is controlled. This makes it possible to continue a work without causing the autonomous work machine to deviate out of the working area.
In this embodiment, control to be performed in a case in which a marker that should exist at a position cannot be detected for some reason will be described. For example, in a case in which a marker falls or moves due to the influence of wind, an obstacle is installed in front of a marker, or a lawn grows and hides a marker, if a work is performed based on the marker, the work may be performed in an unintended working area. That is, there is a possibility that the working vehicle may deviate from the working area. In this embodiment, the arrangement information of each marker is stored in advance, and the consistency between the detection result of a marker and the arrangement information is determined. If these are not consistent, control is switched to an alternative position estimation method to control the operation of the working vehicle.
The configuration of a working vehicle according to this embodiment is the same as the configuration of the working vehicle 10 described in the first embodiment, and a detailed description thereof will be omitted. Note that a working vehicle 10 according to this embodiment holds, in a memory 44c in advance, arrangement information representing the arrangement positions of a plurality of markers that define a working area. It is determined whether the position of a marker detected from a captured image is consistent with the arrangement information. If these are not consistent, the position of the working vehicle 10 is estimated using a GPS sensor 48, and the operation of the working vehicle 10 is controlled based on the estimated position. Alternatively, the position of the working vehicle 10 is estimated by odometry using a wheel speed sensor 50, an angular velocity sensor 52, an acceleration sensor 54, and the like instead of the GPS sensor 48, and the operation of the working vehicle 10 is controlled based on the estimated position.
<Processing>
The procedure of processing executed by the working vehicle 10 according to this embodiment will be described with reference to the flowchart of
In step S503, an ECU 44 determines consistency with the arrangement information of a marker held in the memory 44c in advance. More specifically, it is determined whether the position of the marker detected in step S502 and the arrangement information of the marker held in the memory 44c in advance are consistent. Upon determining that these are consistent, the process advances to step S504. On the other hand, upon determining that these are not consistent, the process advances to step S510.
In step S509, the ECU 44 determines consistency with the arrangement information of a marker held in the memory 44c in advance. More specifically, if no marker exists at a position represented by the arrangement information of the marker held in the memory 44c in advance (if the marker does not exist at a position where it should exist), it is determined that the position is not consistent with the arrangement information. Upon determining that these are consistent, the process advances to step S512. On the other hand, upon determining that these are not consistent, the process advances to step S510.
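A minimal sketch of the consistency determinations in steps S503 and S509 is shown below; it assumes that the detected marker positions and the stored arrangement information are expressed in a common map frame and compared with a distance tolerance, which is an assumed value.

```python
import math

# Consistency-check sketch for steps S503 and S509. The tolerance and the
# planar (x, y) representation are assumptions.
CONSISTENCY_TOLERANCE_M = 0.3


def _near_any(point_xy, points_xy) -> bool:
    """True if point_xy lies within the tolerance of some point in points_xy."""
    return any(
        math.hypot(point_xy[0] - px, point_xy[1] - py) <= CONSISTENCY_TOLERANCE_M
        for px, py in points_xy
    )


def marker_consistent(detected_xy, arrangement_xy_list) -> bool:
    """Step S503: the detected marker matches some stored arrangement position."""
    return _near_any(detected_xy, arrangement_xy_list)


def expected_marker_missing(detected_xy_list, arrangement_xy_list) -> bool:
    """Step S509: some stored arrangement position has no detected marker near
    it (a marker is missing where it should exist)."""
    return any(not _near_any(a, detected_xy_list) for a in arrangement_xy_list)
```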
In step S510, the ECU 44 estimates the position of the working vehicle 10. The position may be estimated using the GPS sensor 48 or may be estimated by odometry using the wheel speed sensor 50, the angular velocity sensor 52, the acceleration sensor 54, and the like. If the marker does not exist at the position where it should exist, and the operation of the working vehicle 10 is controlled using the detected marker, the working vehicle may deviate from the originally intended working area. For this reason, the position of the working vehicle 10 is estimated using another position estimation method.
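A minimal dead-reckoning sketch of the alternative estimation in step S510 follows, using a forward velocity derived from the wheel speed sensor 50 and a yaw rate from the angular velocity sensor 52; the GPS-based alternative would instead read the GPS sensor 48 directly. The state layout and the update period are assumptions.

```python
import math

# Step S510 sketch: odometry (dead reckoning) used when the detected markers
# are not consistent with the stored arrangement information. The (x, y,
# heading) state layout and the update period are assumptions.
DT_S = 0.1  # assumed update period in seconds


def dead_reckon(state, forward_velocity_mps, yaw_rate_radps):
    """Propagate the (x, y, heading) estimate of the working vehicle 10 by one
    update period."""
    x, y, heading = state
    heading += yaw_rate_radps * DT_S
    x += forward_velocity_mps * math.cos(heading) * DT_S
    y += forward_velocity_mps * math.sin(heading) * DT_S
    return x, y, heading
```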
In step S511, the ECU 44 controls the operation of the working vehicle 10 based on the position of the working vehicle 10 estimated in step S510. The processing shown in
Note that in the processing shown in
As described above, in this embodiment, the consistency between the arrangement information of a marker held in advance and the detection result of the marker is determined. If these are not consistent, the position of the working vehicle is estimated using an alternative position estimation method, and the operation of the working vehicle is controlled based on the estimation result.
Hence, even if the position of a detected marker moves due to wind or the like, it is possible to prevent the working vehicle from deviating from the working area. Alternatively, if an obstacle exists in front of a marker, or a lawn grows and makes a marker invisible, the marker cannot be detected. However, even in this case, the working vehicle can be prevented from deviating from the working area.
The present invention is not limited to the above embodiments, and various changes and modifications can be made within the spirit and scope of the present invention. Therefore, to apprise the public of the scope of the present invention, the following claims are made.
1. An autonomous work machine (for example, 10) according to the above-described embodiment is
an autonomous work machine (for example, 10) that performs a work in a working area (for example, 600) defined by a marker (for example, 601-605), comprising:
a camera (for example, 11);
a detection unit (for example, 44) configured to detect the marker from a captured image of the camera; and
a control unit (for example, 44) configured to, if the marker deviates from a detection range of the detection unit, control at least one of a traveling speed and a traveling direction of the autonomous work machine.
According to this embodiment, it is possible to continue the work without causing the autonomous work machine to deviate out of the working area.
2. In the autonomous work machine (for example, 10) according to the above-described embodiment,
the control unit (for example, 44) controls at least one of the traveling speed and the traveling direction of the autonomous work machine after an elapse of a predetermined time from deviation of the marker from the detection range of the detection unit.
If a turning operation were performed immediately after the marker deviates from the detection range, the turn would come too early, and work omission (for example, an unmown portion of lawn) could occur near the boundary of the working area. According to this embodiment, such work omission can be reduced.
3. The autonomous work machine (for example, 10) according to the above-described embodiment further comprises a setting unit (for example, 44) configured to set a virtual wire between markers detected by the detection unit,
wherein the control unit recognizes the virtual wire set by the setting unit as a boundary of the working area, and controls at least one of the traveling speed and the traveling direction of the autonomous work machine such that the autonomous work machine does not deviate out of the working area across the virtual wire.
During detection of the marker, a turn can be made without crossing the virtual wire. However, if some error has occurred (for example, if the marker cannot be detected), deviation across the virtual wire may occur. According to this embodiment, if such an error has occurred, an operation such as a turn is performed. It is therefore possible to continue the work without causing the autonomous work machine to deviate out of the working area.
4. The autonomous work machine (for example, 10) according to the above-described embodiment further comprises:
an estimation unit (for example, 48, 50, 52, 54) configured to estimate a position of the autonomous work machine;
a storage unit (for example, 44c) configured to store arrangement information including an arrangement position of the marker; and
a determination unit (for example, 44) configured to determine whether the arrangement position and a detection result by the detection unit are consistent,
wherein if the determination unit determines that the arrangement position and the detection result are not consistent, the control unit (for example, 44) controls at least one of the traveling speed and the traveling direction of the autonomous work machine based on an estimation result by the estimation unit.
According to this embodiment, even if the detected marker has moved from the position where it should exist due to wind or the like, it is possible to prevent the work machine from deviating from the working area. Alternatively, if an obstacle exists in front of the marker, or a lawn grows and makes the marker invisible, the marker cannot be detected. However, even in this case, the work machine can be prevented from deviating from the working area.
5. In the autonomous work machine (for example, 10) according to the above-described embodiment,
the estimation unit performs the estimation using one of odometry (for example, 50, 52, 54), a satellite positioning system (for example, 48), and an image capturing result of the camera.
According to this embodiment, even if control based on analysis of a captured image cannot guarantee correctness, deviation from the working area can be prevented, and the work can be continued by using another position estimation method.
6. The autonomous work machine (for example, 10) according to the above-described embodiment further comprises:
a storage unit (for example, 44c) configured to store arrangement information including an arrangement position of the marker; and
a determination unit (for example, 44) configured to determine whether the arrangement position and a detection result by the detection unit are consistent,
wherein if the determination unit determines that the arrangement position and the detection result are consistent, and the marker deviates from the detection range of the detection unit, the control unit (for example, 44) controls at least one of the traveling speed and the traveling direction of the autonomous work machine.
According to this embodiment, an operation such as a turn is performed after determining the consistency between the arrangement information of the marker and the actually detected position of the marker, thereby implementing accurate control.
7. The autonomous work machine (for example, 10) according to the above-described embodiment further comprises a direction determination unit (for example, 44) configured to determine whether the autonomous work machine is moving in a boundary direction of the working area,
wherein if the direction determination unit determines that the autonomous work machine is moving in the boundary direction, and the marker deviates from the detection range of the detection unit, the control unit (for example, 44) controls at least one of the traveling speed and the traveling direction of the autonomous work machine.
According to this embodiment, if the autonomous work machine is not moving in the boundary direction of the working area, it is considered that the possibility of deviation from the boundary is low. In this case, an unnecessary operation such as a turn need not be performed.
8. In the autonomous work machine (for example, 10) according to the above-described embodiment,
if all of a plurality of markers detected by the detection unit deviate from the detection range, the control unit (for example, 44) controls at least one of the traveling speed and the traveling direction of the autonomous work machine.
According to this embodiment, it is considered that if all the markers cannot be detected, the autonomous work machine is close to the boundary of the working area. It is therefore possible to appropriately prevent deviation from the working area by performing an operation such as a turn.
9. In the autonomous work machine (for example, 10) according to the above-described embodiment,
if at least one of a plurality of markers detected by the detection unit deviates from the detection range, the control unit (for example, 44) controls at least one of the traveling speed and the traveling direction of the autonomous work machine.
According to this embodiment, it is considered that if the number of markers that can be detected so far decreases, the autonomous work machine is moving toward the boundary of the working area. It is therefore possible to appropriately prevent deviation from the working area by performing an operation such as a turn.
10. In the autonomous work machine (for example, 10) according to the above-described embodiment,
if one of two markers detected by the detection unit deviates from the detection range, the control unit (for example, 44) controls at least one of the traveling speed and the traveling direction of the autonomous work machine.
According to this embodiment, during detection of the two markers, a position between the markers can be recognized as the boundary of the working area. If one of these cannot be detected any more, the boundary cannot be recognized. It is therefore possible to appropriately prevent deviation from the working area by performing an operation such as a turn.
11. In the autonomous work machine (for example, 10) according to the above-described embodiment,
if the marker detected by the detection unit cannot be recognized as a marker any more, the control unit (for example, 44) controls at least one of the traveling speed and the traveling direction of the autonomous work machine.
According to this embodiment, for example, even if the autonomous work machine is too close to the marker, and a marker included in the captured image cannot be recognized as a marker any more, it is possible to appropriately prevent deviation from the working area.
12. A control method of an autonomous work machine (for example, 10) according to the above-described embodiment is
a control method of an autonomous work machine (for example, 10) that performs a work in a working area (for example, 600) defined by a marker (for example, 601-605), comprising:
detecting the marker from a captured image of a camera (for example, 11) provided in the autonomous work machine; and
if the marker deviates from a detection range in the detecting, controlling at least one of a traveling speed and a traveling direction of the autonomous work machine.
According to this embodiment, it is possible to continue the work without causing the autonomous work machine to deviate out of the working area.
13. A storage medium storing a program according to the above-described embodiment is
a storage medium storing a program configured to cause a computer to function as an autonomous work machine defined in the above-described embodiment.
According to this embodiment, processing according to the above-described embodiment can be implemented by the computer.
According to the present invention, it is possible to continue a work without causing an autonomous work machine to deviate out of a working area.
This application is a continuation of International Patent Application No. PCT/JP2018/042853 filed on Nov. 20, 2018, the entire disclosure of which is incorporated herein by reference.
|  | Number | Date | Country |
| --- | --- | --- | --- |
| Parent | PCT/JP2018/042853 | Nov 2018 | US |
| Child | 17320064 |  | US |