The present invention relates to a work time prediction device, a server device, a terminal device, a work time prediction method, and a storage medium.
In parks, gardens, and the like, work such as lawn mowing or grass mowing may be performed by a landscaper. The landscaper performs the work using a device such as a lawn mower (for example, Patent Literature 1), a trimmer, or a blower, or manually.
When the landscaper receives a request for work from a requester, the landscaper may observe an area of a work target to predict a work time, and may estimate a work cost or the like on the basis of the predicted work time. Therefore, it is desired to improve the prediction accuracy of the work time in order to perform appropriate estimation.
The present invention provides technology for appropriately predicting a work time.
According to one aspect of the present invention, there is provided a work time prediction device, comprising:
Further features of the present invention will become apparent from the following description of exemplary embodiments (with reference to the attached drawings).
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and, together with the description, serve to explain principles of the invention.
Hereinafter, embodiments will be described in detail with reference to the attached drawings. Note, the following embodiments are not intended to limit the scope of the claimed invention, and limitation is not made to an invention that requires a combination of all features described in the embodiments. Two or more of the multiple features described in the embodiments may be combined as appropriate. Furthermore, the same reference numerals are given to the same or similar configurations, and redundant description thereof is omitted.
<Outline of System>
The server device 1 functions as a work time prediction device for predicting a time required for work. The server device 1 includes a processing unit 101, a storage unit 102, and a communication unit 103. The processing unit 101, the storage unit 102, and the communication unit 103 are connected by a bus (not illustrated). The processing unit 101 is a processor represented by a CPU, and executes a program stored in the storage unit 102 to implement various functions related to the server device 1 as the work time prediction device. The storage unit 102 is, for example, a read only memory (ROM), a random access memory (RAM), a hard disk drive (HDD), or a solid state drive (SSD), and stores various types of data in addition to the program executed by the processing unit 101. The communication unit 103 is a communication interface with an external device.
The terminal device 2 is, for example, a terminal operated by a user who performs work. The terminal device 2 may be, for example, a tablet, a smartphone, a PC, or the like. The terminal device 2 includes a processing unit 201, a storage unit 202, a communication unit 203, a display unit 204, and an input unit 205. The processing unit 201, the storage unit 202, the communication unit 203, the display unit 204, and the input unit 205 are connected by a bus (not illustrated). The processing unit 201 is a processor represented by a CPU, and executes a program stored in the storage unit 202 to implement various functions related to the terminal device 2. The storage unit 202 is, for example, a read only memory (ROM), a random access memory (RAM), a hard disk drive (HDD), or a solid state drive (SSD), and stores various types of data in addition to the program executed by the processing unit 201. The communication unit 203 is a communication interface with an external device.
The display unit 204 is a user interface that displays various types of information. For example, the display unit 204 may be a liquid crystal display, an organic EL panel, or the like. Further, the input unit 205 is a user interface that receives input from the user. For example, the input unit 205 may be a touch panel, a pointing device such as a mouse, a physical key, or the like.
Although one terminal device 2 is illustrated in
<Work Area>
The lawn area 51 is an area where a lawn is planted. For example, the landscaper performs lawn mowing work using a ride-on lawn mower 61, a walking lawn mower (not illustrated), or the like. Alternatively, the lawn mowing work may be performed by an autonomous robot lawn mower or the like. In the present embodiment, a tree 58 is planted in the lawn area 51, and the landscaper performs the work while avoiding the tree 58 and its surroundings. That is, the tree 58 and its surrounding region are non-entry regions for a working machine.
The grass area 52 is an area where grass is growing. For example, the landscaper performs grass mowing work using a ride-on grass mower 62, a walking grass mower, or the like. Alternatively, the grass mowing work may be performed by an autonomous robot grass mower or the like. In the present embodiment, a pond 55 is arranged in the grass area 52, and the landscaper performs the work while avoiding the pond 55 and its surroundings.
The handheld area 53 is an area where the landscaper performs manual work. In the present embodiment, since the work by the ride-on lawn mower 61, the ride-on grass mower 62, or the like is difficult in a portion adjacent to a building 56 or a passage 57, the landscaper manually performs the work using a handheld working machine 63 such as an edger.
The hedge area 54 is an area where a hedge is planted. For example, the landscaper adjusts an outer shape of the hedge using a trimmer 64.
Note that the method of dividing the work area 5 is a matter of convenience according to the work content, and can be set as appropriate.
<Data Configuration>
The area information regarding the past work area includes a work division, a size of the work area, and a coefficient. The work division is obtained by dividing regions in the work area according to the work performed by the landscaper. In
The size of the work area is indicated for each work division. For the lawn, the grass, and the handheld, respective areas are indicated. In addition, a surface area of a portion to be a work target is indicated for the hedge. Note that the size of the hedge may be indicated by, for example, an area of a region where the hedge is planted. However, a work amount can be more appropriately grasped by representing the size of the hedge by the surface area.
The coefficient is a value multiplied by the size of the work area. In the present embodiment, each coefficient is set to a value between 1 and 2. The coefficient is used to consider not only the size of the work area but also the easiness of work in the work area in predicting the work time described later. For example, when there is an inclination in the work area or when it is necessary to perform work while avoiding a waterside of the work area, the work may take more time due to the need to reduce the movement speed of the working machine or the like. Therefore, when an element that causes an increase in the work time is present in the work area, a numerical value obtained by multiplying the size of the work area by a predetermined coefficient is used for prediction of the work time, so that the work time can be predicted in consideration of easiness of work in the work area. That is, a value of size×coefficient here can be said to be information used for prediction of the work time in consideration of easiness of work in the work area.
In the present embodiment, the area information regarding the past work area includes coefficients regarding an inclination, a waterside, an object, and a plant. That is, in a case where there is an inclination in the work area, in a case where there is a waterside, in a case where there is an object that needs to be avoided, and the like, it is considered that more time is required for work. Therefore, these factors are set as coefficients. In addition, depending on the type of plant to be cut in the work area, it is considered that more time is required for work due to reasons such as hardness and difficulty in cutting. Therefore, the type of plant is also set as a coefficient. Note that a target for which the coefficient is set is not limited to these, and can be set as appropriate.
In the present embodiment, workability information regarding the easiness of work in the work area includes coefficients related to an inclination, a waterside, an object, and a plant, but the workability information may be information other than the coefficients. For example, the workability information may be various types of information regarding workability, such as an average obliquity of the work area, an area of the waterside or the object, and an outer peripheral length of the waterside or the object. Further, for example, the workability information may be information regarding weather such as sunlight, sunlight hours, temperature, humidity, and precipitation in the work area. For example, even in the same plant, since a degree of growth varies depending on the sunlight, the precipitation, and the like, these may affect the work time. Further, for example, when it rains on the day of work or the day before work, the mud in the work area or the like may affect the work time. Therefore, the coefficients described above may be set for these pieces of information.
The time information regarding the past work time includes an actual work time, the number of workers, and the number of working machines. Further, the information regarding the estimation of the work cost includes a fuel consumption amount of the working machine and an actual cost. In the present embodiment, the time information regarding the past work time includes information regarding the work time, information regarding the number of workers, and information regarding the number of working machines for each work division.
<Control Example>
In S1, the processing unit 201 executes input reception processing of area information regarding a current work area. The processing unit 201 instructs the input unit 205 to receive the user's input. Here,
Further, in the present embodiment, the input unit 205 can receive input of division information regarding a division of work as the area information. Then, the input unit 205 receives input of the area as the size information and the coefficient as the workability information for each division of each work. The work division may include at least one of lawn work, grass work, manual work, and hedge work. In the present embodiment, the input unit 205 receives a surface area of a work target portion as the size information for the hedge.
Further, in the present embodiment, the input unit 205 receives, as information for preparing the estimation, the number of workers for each work division and the number of work patterns for the estimation.
Further, in the present embodiment, the workability information to be input includes information regarding the inclination of the work area, the non-entry region of the working machine in the work area, the object disposed in the work area, and the type of the plant in the work area. Specifically, the input unit 205 is configured to be able to receive coefficients of respective items. Note that the items of the coefficients to be input are not limited, and may include at least one of the exemplified items, or may include items other than the exemplified items.
In S2, the communication unit 203 transmits the area information acquired in S1 to the server device 1 on the basis of a command from the processing unit 201.
In S3, the processing unit 101 of the server device 1 executes reception processing of the area information transmitted from the terminal device 2. The processing unit 101 receives the area information transmitted from the terminal device 2 by the communication unit 103. That is, the processing unit 101 acquires area information regarding the current work area by receiving information from the terminal device 2. Moreover, the processing unit 101 acquires information input by the user to the terminal device 2 as the area information.
In S4, the processing unit 101 predicts a work time. In this step, the processing unit 101 predicts the work time in the current work area on the basis of the area information acquired in S3 and history information in which the area information regarding the past work area and the time information regarding the work time in the past work area are associated with each other. Details of this step will be described later.
In S5, the processing unit 101 generates an estimation of the work cost based on the work time predicted in S4. For example, the processing unit 101 calculates an estimation in consideration of the work time, the number of workers, the number of working machines, the fuel cost of the working machine, and the like for each work division.
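The cost calculation in S5 can be illustrated with a short sketch. The embodiment does not specify a concrete formula, so the hourly labor rate, machine rate, fuel consumption, and fuel price below are purely illustrative assumptions, as is the function name `estimate_cost`:

```python
# Hypothetical cost-estimation sketch for S5. All rates and the
# formula itself are illustrative assumptions; the embodiment only
# states that work time, workers, machines, and fuel are considered.
def estimate_cost(work_time_min, workers, machines,
                  labor_rate_per_h=30.0, machine_rate_per_h=20.0,
                  fuel_l_per_h=2.0, fuel_price_per_l=1.5):
    hours = work_time_min / 60
    labor = hours * workers * labor_rate_per_h          # worker cost
    machine = hours * machines * machine_rate_per_h     # machine cost
    fuel = hours * machines * fuel_l_per_h * fuel_price_per_l  # fuel cost
    return labor + machine + fuel

# One worker and one machine for a 36-minute lawn division.
print(round(estimate_cost(36, 1, 1), 2))  # 31.8
```

An estimation of a plurality of patterns (S6) would then repeat this per work division while varying the number of machines.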
In S6, the communication unit 103 transmits information regarding the predicted work time and the estimation on the basis of a command from the processing unit 101. That is, the processing unit 101 outputs the work time predicted in S4 and the estimation of the work cost based on the work time to the terminal device 2 via the communication unit 103. In the present embodiment, the processing unit 101 outputs an estimation of a plurality of patterns of work costs according to the number of working machines used in the current work (see
In S7, the processing unit 201 of the terminal device 2 executes reception processing of the information transmitted from the server device 1. The processing unit 201 receives information regarding the predicted work time (prediction result) and the estimation transmitted from the server device 1 by the communication unit 203.
In S8, the processing unit 201 executes processing of displaying the received information. The processing unit 201 instructs the display unit 204 to display the information received in S7.
Specifically, in
In S401, the processing unit 101 selects a prediction target division of the work time. For example, the processing unit 101 selects the lawn as the target division.
In S402, the processing unit 101 extracts data regarding a past target division similar to a current target division.
For example, the processing unit 101 extracts data having a division ID of A0001 as data similar to the current target division. Here, a similar data extraction method can be set as appropriate. For example, data in which items having a coefficient exceeding 1 are matched may be extracted as the similar data. In addition, data in which a difference between the items is within a predetermined value or within a predetermined ratio may be extracted as the similar data. Furthermore, data in which a difference between values of size×coefficient is within a predetermined value or within a predetermined ratio may be extracted as the similar data. In the present embodiment, the processing unit 101 extracts one piece of similar data, but the number of pieces of data to be extracted may be two or more. In this case, the processing unit 101 may previously define the number of pieces of data to be extracted, for example, 2 to 5 pieces of data, or may extract all data satisfying a predetermined condition regarding a similarity.
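The extraction criteria above (matching items whose coefficient exceeds 1, and a size×coefficient difference within a predetermined ratio) can be sketched as follows. The record layout and the 10% threshold are illustrative assumptions, not values taken from the embodiment:

```python
# Hypothetical sketch of the similar-data extraction in S402.
def effective_size(record):
    """size multiplied by all workability coefficients (size x coefficient)."""
    value = record["size"]
    for c in record["coefficients"].values():
        value *= c
    return value

def is_similar(current, past, max_ratio=0.10):
    # Criterion 1: the same items must have a coefficient exceeding 1.
    cur_items = {k for k, v in current["coefficients"].items() if v > 1}
    past_items = {k for k, v in past["coefficients"].items() if v > 1}
    if cur_items != past_items:
        return False
    # Criterion 2: size x coefficient values within a predetermined ratio.
    cur_val, past_val = effective_size(current), effective_size(past)
    return abs(cur_val - past_val) / max(cur_val, past_val) <= max_ratio

current = {"size": 450, "coefficients": {"inclination": 1.3, "object": 1.4}}
past = {"size": 500, "coefficients": {"inclination": 1.2, "object": 1.5},
        "work_time_min": 40, "workers": 1, "machines": 1}
print(is_similar(current, past))  # True: 819 vs. 900 is within 10%
```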
In S403, the processing unit 101 identifies the predicted work time. An example will be described. First, the processing unit 101 calculates a value of the size of the lawn of the current work area×the coefficient on the basis of the information received from the terminal device 2. In this example,
450 (size)×1.3 (inclination coefficient)×1.4 (object coefficient)=819
is obtained. Next, the processing unit 101 predicts the work time of the lawn on the basis of the calculated value of size×coefficient and the past similar data extracted in S402. In the present example, in the past data, the value of size×coefficient is 900, the number of workers is 1, the number of working machines is 1, and the actual work time is 40 minutes. On the other hand, in the current lawn, since the value of size×coefficient is 819, and the other conditions are the same, the work time is calculated as
40 (min)×(819/900)=36.4 (min)≈36 (min)
in consideration of a difference between values of size×coefficient.
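The calculation in S403 above can be reproduced in a few lines. The function name `predict_work_time` is a hypothetical label for illustration; the numbers are the ones used in the worked example:

```python
# Reproduces the worked example in S403: compute size x coefficient
# for the current lawn, then scale the past actual work time by the
# ratio of the two size x coefficient values.
def predict_work_time(cur_size, cur_coefficients, past_value, past_time_min):
    cur_value = cur_size
    for c in cur_coefficients:
        cur_value *= c  # multiply in each workability coefficient
    return past_time_min * cur_value / past_value

# Current lawn: 450 x 1.3 (inclination) x 1.4 (object) = 819
# Past similar data: size x coefficient = 900, actual time 40 min
minutes = predict_work_time(450, [1.3, 1.4], 900, 40)
print(round(minutes, 1))  # 36.4, rounded down to 36 min in the embodiment
```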
As described above, the processing unit 101 extracts the history information of the past work area similar to the current work area, and predicts the work time in the current work area on the basis of the extracted history information. When a plurality of pieces of similar data are extracted in S402, an average value of these pieces of data may be obtained, and the current work time may be calculated on the basis of the average value.
In S404, the processing unit 101 confirms whether or not there is a division for which the predicted time is not calculated. When there is a division for which the predicted time is not calculated, the processing unit returns to S401. When there is no division for which the predicted time is not calculated, the processing unit ends the present flowchart.
As described above, according to the present embodiment, since the processing unit 101 predicts the work time in the current work area on the basis of the history information of the past work, it is possible to appropriately predict the work time.
The image capturing device 3 is a device that captures an image used for prediction of a work time in the work area. The image capturing device 3 may be, for example, a flying body such as a drone, or may be configured to be able to capture an image of the work area from above. Further, for example, the image capturing device 3 may be a moving body capable of traveling on the work area. The moving body may be, for example, a working machine such as an autonomous lawn mower, and a camera for peripheral detection provided in the working machine may function as an image capturing unit 304 described later. Further, for example, the image capturing device 3 may be a monitoring camera or the like. Note that the image capturing device 3 may be possessed by a landscaper, or may be possessed by a manager or the like of the work area. Alternatively, the image capturing device 3 may be a portable terminal such as a digital camera or a smartphone possessed by a general user such as a user in the work area. The image capturing device 3 includes a processing unit 301, a storage unit 302, a communication unit 303, an image capturing unit 304, and a moving unit 305. The processing unit 301, the storage unit 302, the communication unit 303, the image capturing unit 304, and the moving unit 305 are connected by a bus (not illustrated).
The processing unit 301 is a processor represented by a CPU, and executes a program stored in the storage unit 302 to implement various functions related to the image capturing device 3. The storage unit 302 is, for example, a read only memory (ROM), a random access memory (RAM), a hard disk drive (HDD), or a solid state drive (SSD), and stores various types of data in addition to the program executed by the processing unit 301. The communication unit 303 is a communication interface with an external device. The image capturing unit 304 is, for example, a camera, and is configured to be capable of capturing a still image or a moving image. The moving unit 305 moves the image capturing device 3. For example, in a case where the image capturing device 3 is a drone, the moving unit 305 can include a propeller, a motor that drives the propeller, and the like. When the image capturing device 3 is a monitoring camera installed in the work area or a portable terminal such as a digital camera, the moving unit 305 is omitted.
In the present embodiment, the image capturing device 3 can be connected to a terminal device 2 in a wired or wireless manner by the communication unit 303. For example, an image of the work area captured by the image capturing unit 304 is transmitted to the terminal device 2 by the communication unit 303. As a communication method here, known technology can be appropriately adopted, but the terminal device 2 and the image capturing device 3 may be communicable by, for example, wireless fidelity (Wi-Fi), Bluetooth (registered trademark), or the like. Alternatively, the image capturing device 3 may be able to communicate with the server device 1 or the terminal device 2 via a network NW such as the Internet. Note that, although one image capturing device 3 is illustrated in
In S21, the processing unit 301 controls the image capturing unit 304 and the moving unit 305 to capture an image of the work area. Note that the captured image obtained in this step may be a still image or a moving image. For example, in a case where the image capturing device 3 is a drone, an autonomous lawn mower, or the like, the image capturing device 3 starts image capturing of the work area on the basis of reception of an area image capturing instruction or the like from the terminal device 2.
In S22, the communication unit 303 transmits the captured image acquired in S21 to the terminal device 2 on the basis of a command from the processing unit 301.
Note that, although the captured image is transmitted from the image capturing device 3 to the terminal device 2 in the present embodiment, the captured image may be transmitted from the image capturing device 3 to the server device 1. In this case, when the server device 1 receives a transmission request of the captured image from the terminal device 2, the server device 1 may transmit the captured image to the terminal device 2. That is, it is also possible to adopt a configuration in which the captured image of the image capturing device 3 is accumulated in the server device 1, and the terminal device 2 acquires the accumulated captured image from the server device 1 as necessary. Note that the server device 1 may transmit the captured image received from the image capturing device 3 to the terminal device 2 as it is, or may extract the captured image required for subsequent processing or perform predetermined image processing before transmitting the captured image to the terminal device 2.
In S23, the processing unit 201 of the terminal device 2 executes processing of receiving the captured image transmitted from the image capturing device 3. The processing unit 201 receives the captured image transmitted from the image capturing device 3 by the communication unit 203.
In S24, the processing unit 201 executes area determination processing using the received captured image. This processing is processing for dividing the image-captured work area into work divisions. Known image processing technology can be appropriately adopted for the division of the work area.
In S25, the processing unit 201 executes processing for confirming a determination result in S24.
In S26, the processing unit 201 executes area information input reception processing.
Since S2 and subsequent steps are similar to those in
In the present embodiment, the terminal device 2 executes the area determination processing in S24, but the server device 1 may execute the area determination processing. The server device 1 executes the area determination processing, so that it is possible to reduce the processing load on the side of the terminal device 2. In this case, in S22, the captured image of the image capturing device 3 may be transmitted to the server device 1. Then, after executing the area determination processing, the server device 1 may transmit, to the terminal device 2, information necessary for the terminal device 2 to execute the determination result confirmation processing in S25.
<First Modification of Work Time Prediction>
Since S401 and S404 are similar to those in the flowchart of
In S412, the processing unit 101 inputs the area information acquired from the terminal device 2 to the learned model. For example, the processing unit 101 inputs the size of the target division and each coefficient to the learned model.
In S413, the processing unit 101 acquires output of the learned model as a prediction result of the work time.
According to the present modification, the work time can be predicted more appropriately on the basis of the history information by using the learned model. Note that, in the present modification, the input data of the teacher data of the learned model is the size of the work area and the coefficient, but for example, a captured image captured by the image capturing device 3 may be used as the input data of the teacher data. That is, content of the teacher data can be set as appropriate.
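The learned-model prediction in S412 and S413 can be sketched minimally. The embodiment leaves the model architecture and training method open, so the linear model fitted by least squares below, and the teacher data values, are illustrative assumptions only:

```python
# Minimal sketch of S412-S413: a model trained on past
# (size x coefficient, actual work time) pairs predicts the current
# work time. Model form and data are illustrative assumptions.
def fit_slope(history):
    """Least-squares slope through the origin: time ~ k * (size x coeff)."""
    num = sum(v * t for v, t in history)
    den = sum(v * v for v, _ in history)
    return num / den

# Teacher data: (size x coefficient, actual work time in minutes).
history = [(900, 40), (600, 27), (1200, 53)]
k = fit_slope(history)

def predict(size, coefficients):
    value = size
    for c in coefficients:
        value *= c
    return k * value  # S413: model output as the prediction result

print(round(predict(450, [1.3, 1.4])))  # 36 minutes for the example lawn
```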
<Second Modification of Work Time Prediction>
Since S401 and S404 are similar to those in the flowchart of
In S422, the processing unit 101 performs calculation based on the area information acquired from the terminal device 2. For example, the processing unit 101 acquires the predicted work time as a calculation result by putting the value of size×coefficient into the arithmetic expression.
According to the present modification, the work time can be predicted using the arithmetic expression. Further, at the time of predicting the work time, since it is not necessary to refer to the past data accumulated in the work history database 1021, it is possible to reduce the processing load.
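The arithmetic-expression calculation in S422 can be sketched as follows. The embodiment does not disclose the expression itself, so the linear form and the constant below are illustrative assumptions:

```python
# Sketch of the fixed arithmetic expression in S422. The linear form
# and the 0.044 min-per-unit constant are illustrative assumptions;
# no reference to the work history database is needed at prediction time.
RATE_MIN_PER_UNIT = 0.044  # assumed constant determined in advance

def predict_time(size_times_coefficient):
    return RATE_MIN_PER_UNIT * size_times_coefficient

print(round(predict_time(819), 1))  # about 36 minutes for the example lawn
```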
In the above embodiment, the server device 1 functions as the work time prediction device, but the terminal device 2 may function as the work time prediction device. In this case, the terminal device 2 may acquire the information regarding the current work area by receiving the user's input by the input unit 205. Alternatively, the terminal device 2 may acquire information regarding the current work area by receiving the captured image from the image capturing device 3 and performing area determination processing or the like on the image.
Further, regarding the prediction of the work time, for example, when the terminal device 2 predicts the work time according to the flowchart illustrated in
In the above embodiment, the work time when the landscaper performs work such as lawn mowing or grass mowing in a park, a garden, or the like is predicted, but the configuration of the above embodiment can also be applied to prediction of other types of work. For example, the configuration of the above embodiment can also be applied to snow removal work by a snow removal machine, cultivation work by an agricultural machine, ground leveling work by a construction machine, and the like.
The above embodiment discloses at least the following work time prediction device, server device, terminal device, work time prediction method, and program.
1. A work time prediction device (1, for example) according to the above embodiment comprises
According to this embodiment, since the prediction unit predicts the work time in the current work area on the basis of the history information of the past work, the work time can be appropriately predicted.
2. According to the above embodiment,
According to this embodiment, since the prediction unit predicts the work time on the basis of easiness of work in the work area, the prediction accuracy of the work time can be improved.
3. According to the above embodiment,
According to this embodiment, since the prediction unit predicts the work time on the basis of the easiness of work for each work division, the prediction accuracy of the work time can be improved.
4. According to the above embodiment,
According to this embodiment, since the work time is predicted on the basis of the input information of the user, the work time can be predicted with a simple configuration.
5. According to the above embodiment,
According to this embodiment, since the acquisition unit acquires the information based on the captured image as the area information, it is possible to reduce the trouble for the user to input the area information.
6. According to the above embodiment,
According to this embodiment, since the history information of the past work area similar to the current work area is extracted and used for prediction of the work time, the prediction accuracy of the work time can be improved.
7. According to the above embodiment,
According to this embodiment, by using the learned model, the work time can be predicted more appropriately on the basis of the history information.
8. According to the above embodiment,
According to this embodiment, since the work time is predicted by the arithmetic expression, the processing related to the prediction of the work time can be simplified.
9. According to the above embodiment,
According to this embodiment, since the estimation is output according to the predicted time of the work predicted by the prediction unit, it is possible to provide a highly accurate estimation to a requester of the work.
10. According to the above embodiment,
According to this embodiment, an estimation of a plurality of patterns can be provided to a requester of the work.
11. According to the above embodiment,
According to this embodiment, the prediction unit can predict the work time on the basis of the history information stored in the storage unit.
12. According to the above embodiment,
According to this embodiment, since the prediction unit can more appropriately grasp the easiness of the work, the prediction accuracy of the work time can be improved.
13. According to the above embodiment,
According to this embodiment, since the work amount can be more appropriately grasped, the prediction accuracy of the work time can be improved.
14. According to the above embodiment,
According to this embodiment, the work time can be predicted more appropriately according to the work division.
15. A server device (1, for example) according to the above embodiment functions as each unit (101, for example) of the work time prediction device according to the above 1 to 14.
According to this embodiment, a server device capable of appropriately predicting a work time is provided.
16. A terminal device (2, for example) according to the above embodiment functions as each unit (201, for example) of the work time prediction device according to the above 1 to 14.
According to this embodiment, a terminal device capable of appropriately predicting a work time is provided.
17. A terminal device according to the above embodiment comprises
According to this embodiment, the prediction result of the work time in the server device can be confirmed by the terminal device.
18. A work time prediction method according to the above embodiment, comprising:
According to this embodiment, since the prediction unit predicts the work time on the basis of easiness of work in the work area, the prediction accuracy of the work time can be improved.
19. A non-transitory computer readable storage medium storing a program according to the above embodiment causes a computer to function as:
According to this embodiment, since the prediction unit predicts the work time on the basis of easiness of work in the work area, the prediction accuracy of the work time can be improved.
20. A non-transitory computer readable storage medium storing a program according to the above embodiment is a program for causing a computer of a terminal device (2, for example) capable of communicating with a server device (1, for example) to execute a work prediction time display method, wherein
According to this embodiment, since the prediction unit predicts the work time on the basis of easiness of work in the work area, the prediction accuracy of the work time can be improved.
The invention is not limited to the foregoing embodiments, and various variations/changes are possible within the spirit of the invention.
According to the present invention, a work time can be appropriately predicted.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application is a continuation of International Patent Application No. PCT/JP2021/022666 filed on Jun. 15, 2021, the entire disclosure of which is incorporated herein by reference.
Relation | Number | Date | Country
---|---|---|---
Parent | PCT/JP2021/022666 | Jun 2021 | US
Child | 18536886 | | US