The present disclosure relates to an image analysis device and method using an image recognition technology.
Patent Document 1 discloses an image monitoring system that identifies a person and tracks the movement of the person. The image monitoring system detects a person and an abandoned item shown in an image taken by any of a plurality of imaging devices, and identifies a target person who has left the abandoned item. The image monitoring system searches for an image showing the target person from among the images taken by the imaging devices, based on a facial feature value of the target person and a clothing feature value, such as the color and shape of the clothing. The image monitoring system outputs a display showing the movement of the target person on the screen, based on the imaging device that has taken each image showing the target person and the imaging time at which each image was taken.
The present disclosure provides an image analysis device capable of estimating operators performing respective tasks when a plurality of tasks is performed by a plurality of operators.
An image analysis device according to an aspect of the present disclosure includes an input interface, a controller, and a storage. The input interface acquires image data indicating a captured image of a site in which a plurality of operators performs a plurality of tasks. Based on the image data, the controller generates task history information indicating the tasks performed in the site by respective operators among the plurality of operators. The storage stores the task history information. The controller successively recognizes the tasks and positions of the plurality of operators based on the image data at each time in the site. The controller detects crossing caused between a plurality of trajectories, each trajectory including successive positions for each of the plurality of operators. When the crossing is not detected, the controller generates the task history information by associating tasks recognized at each time with the respective operators, based on the plurality of trajectories. When the crossing is detected, the controller associates the recognized tasks with the respective operators, based on the recognized tasks and past task history information.
These general and specific aspects may be implemented by a system, a method, a computer program, or any combination thereof.
By the image analysis device and method according to the present disclosure, it is possible to estimate operators performing respective tasks when a plurality of tasks are performed by a plurality of operators.
Some embodiments will now be explained in detail with reference to some drawings, as appropriate. However, descriptions more in detail than necessary may be omitted. For example, detailed descriptions of well-known matters and redundant descriptions of substantially the same configurations may also be omitted. This is to avoid unnecessary redundancy in the description below, and to facilitate understanding of those skilled in the art. Note that the inventor(s) provide the accompanying drawings and the following description to facilitate those skilled in the art to fully understand the present disclosure, and the accompanying drawings and the following description are not intended to limit the subject matter defined in the claims in any way.
1. Configuration
A task analysis system according to a first embodiment will now be explained with reference to
1-1. System Overview
As illustrated in
In the example illustrated in
The analysis chart 7 in the system 1 presents a ratio of items including “main task”, “sub-task”, and “non-task” performed by each of the operators W1 to W3 in the analysis period. The items classify tasks according to a degree of added value of each task, for example. In the example in
By presenting the analysis chart 7 to the user 3, the task analysis system 1 according to the present embodiment allows the user 3 to analyze the tasks performed by each of the operators W1 to W3, in order to consider improving operating efficiency in the work site 6, for example.
The camera 2 in the system 1 is disposed so as to capture an image of the entire area where the operators W1 to W3 move across the work site 6. The camera 2 repeats an image capturing operation in the work site 6 at a predetermined cycle to generate image data indicating a captured image, for example. The camera 2 is connected to the task analysis device 5 so that the image data is transmitted to the task analysis device 5, for example. Although one camera 2 is illustrated in
The task analysis device 5 includes an information processing device, such as a server. The task analysis device 5 is communicably connected to an external information processing device, such as a personal computer (PC) including a monitor 4. A configuration of the task analysis device 5 will now be explained with reference to
1-2. Configuration of Task Analysis Device
The controller 50 includes a CPU or an MPU that performs a predetermined function, in cooperation with software, and controls the overall operation of the task analysis device 5, for example. The controller 50 performs various functions by reading data and programs stored in the storage 52 and performing various arithmetic operations. For example, the controller 50 includes an image recognizer 51 as a functional configuration.
The image recognizer 51 applies various image recognition technologies to image data, to recognize a position of a preset processing target in an image indicated by image data, and outputs a recognition result. In the image recognizer 51 according to this embodiment, a person such as the operator W is set as a processing target. The recognition result may include information indicating a time at which the position of the processing target is recognized, for example. The image recognizer 51 performs image recognition processing using a trained model implemented with a neural network model, such as a convolutional neural network. Various types of image recognition algorithms may be used in performing the image recognition processing.
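As a non-limiting sketch of such image recognition processing, the following example detects person positions in a single frame and stamps them with the capture time. OpenCV's built-in HOG person detector is used here merely as a stand-in for the trained neural network model, and taking the bottom-center of each bounding box as the person's position is likewise an assumption made only for illustration.

    # Sketch of the image recognizer: detect person positions in one frame
    # and stamp them with the capture time. The HOG detector is a stand-in
    # for the trained CNN model (an assumption, not the disclosed model).
    import cv2

    def recognize_positions(frame, capture_time):
        hog = cv2.HOGDescriptor()
        hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
        rects, _weights = hog.detectMultiScale(frame, winStride=(8, 8))
        # Approximate each person's position by the bottom-center of the box.
        return [{"time": capture_time, "x": x + w / 2.0, "y": y + h}
                for (x, y, w, h) in rects]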
The controller 50 executes a program including a group of commands for performing the function of the task analysis device 5, for example. Such a program may be provided over a communication network such as the Internet, or stored in a portable recording medium. The controller 50 may also include an internal memory, as a temporary storage area, where various types of data and programs are retained.
The controller 50 may be a hardware circuit such as a dedicated electronic circuit designed to perform a predetermined function, or a reconfigurable electronic circuit. The controller 50 may include various semiconductor integrated circuits such as a CPU, an MPU, a GPU, a GPGPU, a TPU, a microcomputer, a DSP, an FPGA, and an ASIC.
The storage 52 is a storage medium that stores programs and data required for performing the function of the task analysis device 5. The storage 52 includes a hard disk drive (HDD) or a solid-state drive (SSD), for example. For example, the storage 52 stores the program described above, and various types of information such as trajectory data D0, map data D1, task order information D2, and identification information D3.
The trajectory data D0 indicates trajectories of the operators W each moving across the work site 6. The trajectory data D0 is generated based on the recognition result obtained by inputting image data acquired from the camera 2 to the image recognizer 51, for example. The map data D1 indicates a layout of various facilities such as the conveyor line 61 and the shelves 62 in the work site 6, with respect to a predetermined coordinate system. The task order information D2 is information indicating the temporal order in which tasks in a combination are performed. The identification information D3 is information identifying an individual, such as each of the operators W1 to W3. Each piece of these information will be described later in detail.
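As a non-limiting sketch, the various types of information described above may be organized as in the following data structures. The field names and types are assumptions made for illustration, since the disclosure describes the content of the data but not a concrete schema.

    # Illustrative (assumed) schema for the stored data D0 to D3.
    from dataclasses import dataclass, field

    @dataclass
    class TrajectoryPoint:
        time: float          # capture time of the frame
        trajectory_id: int   # identifies one operator's trajectory
        x: float             # position in the work-site coordinate system
        y: float

    @dataclass
    class SiteData:
        trajectory_data: list = field(default_factory=list)  # D0: TrajectoryPoint entries
        map_layout: dict = field(default_factory=dict)       # D1: facility name -> extent
        abnormal_orders: set = field(default_factory=set)    # D2: (earlier, later) task pairs
        entry_times: dict = field(default_factory=dict)      # D3: operator -> entry time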
The storage 52 may include a temporary storage element including a DRAM or an SRAM, and may function as a working area of the controller 50, for example. For example, the storage 52 may temporarily store data such as the image data received from the camera 2 and the recognition result output from the image recognizer 51.
The operation interface 53 is a general term indicating operation members that receive user operations. The operation interface 53 includes any one of a keyboard, a mouse, a trackpad, a touchpad, a button, and a switch, or a combination thereof, for example. The operation interface 53 acquires various types of information input via a user operation.
The device I/F 54 is a circuit via which an external device such as the camera 2 is connected to the task analysis device 5. The device I/F 54 communicates with the external device, in accordance with a predetermined communication standard. Examples of the predetermined standard include USB, HDMI (registered trademark), IEEE 1394, IEEE 802.11, and Bluetooth (registered trademark). The device I/F 54 is an example of an input interface that receives various types of information from the external device, in the task analysis device 5. In the task analysis system 1, the task analysis device 5 acquires image data indicating a moving image captured by the camera 2 via the device I/F 54, for example.
The output I/F 55 is a circuit for outputting information. The output I/F 55 outputs signals such as video signals to an external display device such as a monitor and a projector for displaying various types of information, in compliance with the HDMI standard, for example.
The configuration of the task analysis device 5 described above is merely exemplary, and the configuration of the task analysis device 5 is not limited thereto. The task analysis device 5 may be configured using various types of computers, including a personal computer (PC). In addition to or instead of the output I/F 55, the task analysis device 5 may include a display implemented with a liquid crystal display or an organic EL display as a built-in display device, for example. In addition, the task analysis method according to the embodiment may be executed by distributed computing.
In addition to or instead of the above configuration, the task analysis device 5 may have a configuration that communicates with an external information processing device over a communication network. For example, the operation interface 53 may be configured to receive an operation performed by an external information processing apparatus that is connected over a communication network. Furthermore, the output I/F 55 may transmit various types of information to an external information processing apparatus over a communication network.
In addition, the input interface in the task analysis device 5 may be implemented by cooperating with various kinds of software in the controller 50, for example. The input interface in the task analysis device 5 may acquire various types of information by reading the various types of information stored in various storage media (e.g., the storage 52) onto the working area of the controller 50.
1-3. Various Data Structures
As described above, the task analysis device 5 according to the present embodiment stores the trajectory data D0, the map data D1, the task order information D2, and the identification information D3 in the storage 52. An example of the structures of the various data D0 to D3 and the like will now be explained.
The trajectory data D0 manages time, a trajectory ID identifying a trajectory of an operator W, and the position of the operator W recognized by the image recognizer 51 in the work site 6 at the respective times, in association with each other, for example. The trajectory data D0 maps a trajectory based on the positions of the operator W at each time onto the map data D1, for example.
In the map data D1 illustrated in
Each of the sections Z1, Z2 includes a task area indicating an area in which the operator W operates in the work site 6. The section Z1 in
In addition, the task analysis device 5 according to the present embodiment stores, in the storage 52, task area information associating the positions in the work site 6 with the tasks. The task area information manages, for each of the sections included in the work site 6, associations between the task areas and the tasks performed in the respective task areas, for example. For example, packing and box preparation are performed in the task area A1 near the conveyor line 61 in the section Z1, and picking is performed in the task area A2 near the shelves 62 in the section Z1. The task area information may also include, for the area A1 to which a plurality of tasks are associated, information indicating a positional relationship among the plurality of tasks in the Y direction.
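As a non-limiting sketch, the task area information may be used to look up the task from a recognized position as follows. The area extents and the Y boundary separating packing from box preparation within the area A1 are hypothetical values used only for illustration.

    # Illustrative lookup from a recognized position to a task, using
    # rectangular task areas. All coordinates are hypothetical.
    A1 = (0.0, 0.0, 10.0, 2.0)   # (x0, y0, x1, y1) near the conveyor line 61
    A2 = (0.0, 4.0, 10.0, 6.0)   # near the shelves 62
    A1_SPLIT_Y = 1.0             # assumed Y boundary inside area A1

    def in_rect(x, y, rect):
        x0, y0, x1, y1 = rect
        return x0 <= x <= x1 and y0 <= y <= y1

    def task_at(x, y):
        if in_rect(x, y, A1):
            # Packing and box preparation share area A1, separated in Y.
            return "packing" if y < A1_SPLIT_Y else "box preparation"
        if in_rect(x, y, A2):
            return "picking"
        return "non-task"        # outside any task area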
The task order information D2 manages, as illustrated in
Although not illustrated, the task analysis device 5 according to the present embodiment stores, in the storage 52, task tendency information regarding the work site 6, in addition to the task order information D2, for example. The task tendency information includes information such as a standard task period set for each task performed by each of the operators W1 to W3, for example. The task tendency information may include information indicating various tendencies in which various tasks are performed by the operators W to be analyzed in the work site 6. In addition, the task tendency information may include information indicating the classifications, such as the main task and the sub-task used in the analysis chart 7.
Furthermore, the task analysis device 5 according to the present embodiment stores, in the storage 52, identification information D3 identifying individuals, such as the operators W1 to W3. The identification information D3 is acquired in advance using a card reader or the like installed in the work site 6, through an identification operation performed by each of the operators W1 to W3, upon entry to the work site 6, and is transmitted to the task analysis device 5, for example. The identification information D3 includes information indicating a time at which the identification operation is received from each of the operators W1 to W3, for example. An operation of the task analysis device 5 using these various types of information will be described later.
2. Operation
An operation of the task analysis system 1 and the task analysis device 5 having the configuration described above will now be explained.
The task analysis system 1 illustrated in
The task analysis device 5 according to the present embodiment associates a trajectory ID with each of the operators W1 to W3, based on the time at which a position of the trajectory ID in the work site 6 is first recognized in the trajectory data D0, and on the identification information D3 including the time at which each of the operators W1 to W3 enters the work site 6, for example. The task analysis device 5 then recognizes the positions of the operators W and the performed tasks, by applying image recognition using the image recognizer 51 to an image of the work site 6 captured by the camera 2, for example. The task analysis device 5 then updates the trajectory data D0 by associating the recognized positions with the past positions of the operators W1 to W3, respectively. In this manner, the task analysis device 5 discriminates the operators W1 to W3 who performed the respective tasks at the respective recognized positions.
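As a non-limiting sketch, the initial association between a trajectory ID and an operator may be computed by matching the time at which the trajectory is first recognized against the entry times in the identification information D3. The greedy nearest-time matching shown here is an assumption made only for illustration.

    # Illustrative assignment of a newly recognized trajectory to an
    # operator, based on the entry time closest to the first-seen time.
    def assign_operator(first_seen_time, entry_times, already_assigned):
        # entry_times: operator -> entry time from D3
        candidates = {op: t for op, t in entry_times.items()
                      if op not in already_assigned}
        return min(candidates,
                   key=lambda op: abs(candidates[op] - first_seen_time))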
In such processing, the tasks performed at each time can be recognized by applying image recognition processing to the captured image from the camera 2. However, identifying the trajectory or the operator W corresponding to each performed task may be challenging when crossing occurs, that is, when a plurality of trajectories cross each other. To overcome this challenge, the task analysis system 1 according to the present embodiment provides the task analysis device 5 capable of estimating the operator W performing each recognized task, even when such crossing occurs.
2-1. Problem
A scene that may hinder identification of the operators performing the respective tasks in the task analysis system 1 according to the present embodiment will now be explained with reference to
In the examples illustrated in
In the examples illustrated in
As described above, even though the positions of the operators W and the performed tasks in the work site 6 can be recognized at each time, when the trajectories of the operators W become the crossing state, it is sometimes difficult to distinguish the operators W performing the respective tasks. In particular, when the trajectories of the operators W1 to W3 become the crossing state in a situation where the operators W1 to W3 wear uniforms of the same color and shape in the work site 6, it is difficult to distinguish which of the operators W1 to W3 performs each task based on image recognition or the like, for example.
To overcome this problem, the task analysis device 5 according to the present embodiment performs processing to estimate the operator performing each task, using the task tendency information such as the task order information D2, in addition to the positions of the performed tasks determined based on the captured image. By such processing, even in a situation where the trajectories of a plurality of operators become the crossing state and it is difficult to determine the operators W by applying image recognition or the like to their positions, as illustrated in
For example, in the situation illustrated in
2-2. Overall Operation
The overall operation of the task analysis device 5 in the task analysis system 1 will now be explained with reference to
To begin with, the controller 50 acquires the image data corresponding to the analysis period from the camera 2 via the device I/F 54, for example (S1). While the operators W1 to W3 operate in the work site 6, the camera 2 captures a moving image to generate image data indicating the captured image at each time at a predetermined cycle, such as a frame cycle of the moving image, and records the image data in the internal memory, for example. The camera 2 transmits the image data recorded over the analysis period to the task analysis device 5. The controller 50 stores the acquired image data in the storage 52, for example.
The controller 50 then selects, in temporal order, the image data corresponding to one frame indicating an image captured at each time from the acquired image data in the analysis period, for example (S2). The controller 50 then records the time at which the selected frame is captured, as a time in the trajectory data D0, for example.
The controller 50 functioning as the image recognizer 51 recognizes the positions of the operators W and the tasks in the image indicated by the image data of the selected frame (S3). In step S3, the controller 50 transforms the positions recognized in the image into positions in a coordinate system indicating the positions in the work site 6, based on the map data D1, for example. The controller 50 also recognizes the tasks using the task area information, based on whether the recognized position is located in the task area A1, the task area A2, or another area, for example.
In the example illustrated in
Based on the recognition result in step S3, the controller 50 detects the crossing state of trajectories, for example (S4). For example, the controller 50 detects whether any occlusion occurs, resulting from the positions of a plurality of operators W overlapping with each other, in the captured image corresponding to the selected frame. For example, when it is determined that occlusion occurs and that the positions recognized in the work site 6 are within a predetermined range from the latest positions of the trajectories in the trajectory data D0, the controller 50 determines that the trajectories become the crossing state. The predetermined range is set in advance as a range small enough to be regarded as the distance an operator W can move in the work site 6 within a time interval corresponding to one frame cycle, for example.
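As a non-limiting sketch, the detection in step S4 may be implemented as follows, under two assumptions made for illustration: occlusion is approximated by overlapping bounding boxes in the captured image, and MAX_STEP is the preset range that an operator can move within one frame cycle.

    # Illustrative crossing-state check (S4): occlusion in the image
    # combined with recognized positions lying near the latest trajectory
    # positions in the trajectory data D0.
    MAX_STEP = 0.5  # hypothetical range per frame cycle, in site units

    def boxes_overlap(a, b):
        ax0, ay0, ax1, ay1 = a
        bx0, by0, bx1, by1 = b
        return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

    def crossing_detected(boxes, positions, latest_positions):
        occluded = any(boxes_overlap(a, b)
                       for i, a in enumerate(boxes) for b in boxes[i + 1:])
        near = all(any(abs(px - lx) <= MAX_STEP and abs(py - ly) <= MAX_STEP
                       for (lx, ly) in latest_positions)
                   for (px, py) in positions)
        return occluded and near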
When the crossing state is not detected (NO in S4), the controller 50 updates the trajectory data D0 by adding the positions recognized in step S3 for this cycle, as the positions of the corresponding trajectory IDs (S6). In this step, the controller 50 determines which one of the operators W1 to W3 corresponds to the performed task at each of the positions recognized in step S3, by associating each performed task with the corresponding position and trajectory ID in the trajectory data D0 (S6). The information associating a performed task and an operator W with each position in the trajectory data D0 is an example of the task history information according to the present embodiment.
By contrast, when the crossing state is detected (YES in S4), the controller 50 according to the present embodiment determines the operator W corresponding to each of the recognized tasks, based on the tasks recognized in the crossing state and the past tasks performed by each of the operators as associated in the trajectory data D0 (S5). By performing the operator discrimination processing for the crossing state (S5), it is possible to estimate the operators W even in the crossing state of trajectories, where the operators W performing the tasks cannot be identified by associating the positions recognized in step S3 with the past trajectories. The controller 50 according to the present embodiment refers to the task tendency information, such as the task order information D2, to perform the operator discrimination processing for the crossing state (S5). The operator discrimination processing for the crossing state (S5) will be described later in detail. Hereinafter, the operator discrimination processing for the crossing state will also be simply referred to as operator discrimination processing.
After determining the operator W corresponding to each of the performed tasks (S5, S6), the controller 50 proceeds to step S7. When not all of the frames in the image data in the analysis period have been selected yet (NO in S7), the controller 50 performs the processing from steps S2 to S6 again, on the image data of the next time. In this manner, the trajectory data D0 based on the image data at each time in the analysis period is obtained. The processing in steps S3 to S6 may be executed for each of the sections in the work site 6, examples of which are illustrated in
When all of the frames in the analysis period are selected (YES in S7), the controller 50 performs visualization processing to generate the analysis chart 7 (S8). For example, the controller 50 counts the number of times a task is determined per time interval, such as a period corresponding to one frame, for each of the operators W1 to W3 in the work site 6. Upon obtaining the total number of times each task is performed during the analysis period for each of the operators, the controller 50 calculates the ratio of each task performed by the specific operator to generate the analysis chart 7. The analysis chart 7 indicates the ratio of each task as a ratio of the time for which the task is performed with respect to the analysis period, for example.
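As a non-limiting sketch, the counting and ratio calculation in step S8 may be performed as follows; the per-frame (operator, task) history format is an assumption made for illustration.

    # Illustrative visualization step (S8): count per-frame task
    # determinations per operator and convert the counts into ratios.
    from collections import Counter

    def task_ratios(history):
        # history: iterable of (operator, task) pairs, one per operator
        # per frame over the analysis period
        per_operator = {}
        for operator, task in history:
            per_operator.setdefault(operator, Counter())[task] += 1
        return {op: {task: n / sum(counts.values())
                     for task, n in counts.items()}
                for op, counts in per_operator.items()}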
For example, the controller 50 stores the analysis chart 7 generated in the visualization processing (S8) in the storage 52, and ends the processing illustrated in this flowchart.
With the processing described above, the positions of the operators W and the tasks in the work site 6 are recognized based on the image data (S3), and the operators W performing the respective tasks at the respective positions are identified by associating the recognized positions with the past trajectories in the trajectory data D0 (S6). In the crossing state of trajectories (YES in S4), the operators are discriminated by the operator discrimination processing (S5). As a result, information in which each of the performed tasks is assigned an operator is obtained in the trajectory data D0, and the analysis chart 7 is generated based on the tasks performed by each of the operators across all of the time intervals in the analysis period (S8).
In step S1 described above, the image data generated by the camera 2 may be acquired sequentially. For example, instead of step S7, the controller 50 may repeat the processing in step S1 and subsequent steps until the trajectory data D0 is obtained based on the image data for the number of frames in the analysis period. Furthermore, in the detection of the crossing state of trajectories (S4), the controller 50 may detect the crossing state based on either occlusion in the captured image or the positions of the operators W in the work site 6, for example.
2-3. Operator Discrimination Processing
Details of the operator discrimination processing in step S5 illustrated in
In the operator discrimination processing (S5) according to the present embodiment, the controller 50 determines, referring to the task tendency information such as the task order information D2, one candidate from the plurality of candidates in the task combination tables T1, T2, as a task combination of the discrimination result. Based on the determined task combination, the controller 50 discriminates the operators W for each of the tasks.
In the flowchart illustrated in
In the example illustrated in
Furthermore, in the example illustrated in
The task combination table T1 in
In the example illustrated in
The task combination table T2 illustrated in
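As a non-limiting sketch, the generation of candidates in step S11 may enumerate every assignment of the operators involved in the crossing to the tasks recognized at the latest time, extending each operator's past task sequence. The data representation below is an assumption made for illustration.

    # Illustrative candidate generation (S11): each permutation of the
    # latest recognized tasks over the crossing operators yields one
    # candidate task combination.
    from itertools import permutations

    def generate_candidates(operators, latest_tasks, past_sequences):
        # past_sequences: operator -> tuple of recent tasks from D0
        return [{op: past_sequences[op] + (task,)
                 for op, task in zip(operators, perm)}
                for perm in permutations(latest_tasks, len(operators))]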
The controller 50 then removes candidates from the task combination tables T1, T2 based on the task order information D2 (S12). For example, the controller 50 determines whether each one of the candidates corresponds to any one of the abnormal orders in the task order information D2, and removes the corresponding candidate from the task combination tables T1, T2.
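As a non-limiting sketch, the exclusion in step S12 may be implemented as follows; the abnormal-order pair shown is a hypothetical entry, not the disclosed table.

    # Illustrative order-based exclusion (S12): drop a candidate if any
    # operator's task sequence contains a transition registered as abnormal.
    ABNORMAL_ORDERS = {("packing", "box preparation")}  # hypothetical pair

    def remove_abnormal_orders(candidates):
        return [c for c in candidates
                if not any((seq[i], seq[i + 1]) in ABNORMAL_ORDERS
                           for seq in c.values()
                           for i in range(len(seq) - 1))]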
For example, in the task combination table T1 illustrated in
The controller 50 then removes a candidate for which the task period of the performed task exceeds a standard task period, from the task combination tables T1, T2, based on the information of the standard task period stored in the storage 52, for example (S13). For example, the controller 50 calculates the task period of the latest performed task in the task sequence of each candidate, and removes any candidate including a task sequence in which the task period exceeds a predetermined period indicating a significant excess over the standard task period. The standard task period is calculated in advance as the period required for the respective operators to perform each of the tasks, by averaging periods measured a plurality of times. The predetermined period is set to exceed the standard task period by three times the standard deviation of the measured periods, for example.
For example, in the task combination table T2 illustrated in
After removing the candidates as described above (S12, S13), the controller 50 determines whether a plurality of candidates remain, that is, whether a plurality of candidates are left in the task combination tables T1, T2 (S14). In a case where all of the candidates are removed in steps S12 and S13, the controller 50 may relax the removal conditions based on the task tendency information and perform the processing in step S11 and subsequent steps again, in order to leave at least one candidate for step S14, for example. For example, the predetermined period used in removing candidates by the task period (S13) may be set to be longer than that used in the above example.
When a plurality of candidates are not left in the task combination tables T1, T2 (NO in S14), the controller 50 determines the one remaining candidate as the task combination corresponding to the discrimination result (S16). In the task combination tables T1, T2 illustrated in
By contrast, when a plurality of candidates are left (YES in S14), the controller 50 selects the candidate including a task sequence in which the period of the latest performed task has the smallest difference from the standard task period, among the plurality of candidates, for example (S15). The controller 50 determines the selected candidate as the task combination of the discrimination result (S16).
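As a non-limiting sketch, the period-based exclusion (S13) and the selection among remaining candidates (S14 to S16) may be combined as follows. The standard periods, margins, and the treatment of tasks without a registered standard period are assumptions made for illustration.

    # Illustrative period-based exclusion (S13) and final selection (S15, S16).
    STANDARD = {"packing": 30.0, "picking": 20.0, "box preparation": 10.0}  # s
    MARGIN = {"packing": 9.0, "picking": 6.0, "box preparation": 3.0}  # 3 sigma

    def run_length(seq):
        # length (in frames) of the latest uninterrupted run of the same task
        n = 1
        while n < len(seq) and seq[-1 - n] == seq[-1]:
            n += 1
        return n

    def within_period(seq, frame_period):
        task = seq[-1]
        if task not in STANDARD:
            return True  # assumed: no standard period registered for this task
        return run_length(seq) * frame_period <= STANDARD[task] + MARGIN[task]

    def deviation(candidate, frame_period):
        devs = [abs(run_length(seq) * frame_period - STANDARD[seq[-1]])
                for seq in candidate.values() if seq[-1] in STANDARD]
        return min(devs) if devs else float("inf")

    def discriminate(candidates, frame_period):
        kept = [c for c in candidates
                if all(within_period(seq, frame_period) for seq in c.values())]
        if not kept:
            kept = candidates  # relax the condition so one candidate remains (S14)
        return min(kept, key=lambda c: deviation(c, frame_period))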
From the task combination corresponding to the discrimination result, the controller 50 discriminates the operators W of the respective tasks recognized at the latest time, the tasks being unable to be associated with the operators W based on the trajectories (S17). The controller 50 then updates the trajectory data D0 by adding the determined positions of the respective operators W at the latest time, each as the position of the corresponding trajectory ID (S17).
In the example illustrated in
After discriminating the operators W and updating the trajectory data D0 (S17), the controller 50 ends the processing illustrated in this flowchart. The process then proceeds to step S7 in
By the operator discrimination processing for the crossing state described above (S5), the controller 50 generates, as candidates, task combinations including tasks that cannot be mapped to the operators W due to the crossing state of trajectories (S11), and determines the task combination of the discrimination result from the candidates, referring to the task tendency information (S12 to S16). Accordingly, even when the trajectories become the crossing state, it is possible to discriminate the operators W performing the respective tasks based on the determined task combination (S17), and therefore it is possible to estimate the operators W of the respective tasks.
In the example explained above, in step S11, tasks recognized in three frames are included in each task sequence in the task combination tables T1, T2. The task sequence is not limited to a sequence of tasks at each time, such as three frames, and may include three tasks of different kinds, for example. In this case, the controller 50 generates the task combination tables T1, T2 referring to the past tasks associated with the trajectory data D0, until three kinds of tasks are obtained, for example. Furthermore, without being limited to the three kinds, the controller 50 may generate a task sequence including a sequence of three tasks per predetermined period. Furthermore, the number is not limited to three, and a task sequence including a sequence of two tasks may be generated.
In the generation of the task combination tables T1, T2 (S11), the controller 50 may narrow down the candidates of the operator W using the coordinate information in the trajectory data D0 and the moving distance per unit time. For example, the controller 50 may determine the operator W relevant to the crossing state in a task combination using the moving speed of the operator W that is based on the past positions, for example, in addition to the past positions of the operator W in the trajectory data D0.
3. Effects
As described above, the task analysis device 5 is an example of the image analysis device according to the present embodiment. The task analysis device 5 includes the device I/F 54 as an example of the input interface, the controller 50, and the storage 52. The device I/F 54 acquires image data indicating a captured image of a work site 6 in which a plurality of operators W performs a plurality of tasks (S1). Based on the image data, the controller 50 generates information associating the tasks and the operators W to positions in the trajectory data D0, the information being an example of the task history information indicating the tasks performed in the work site 6 by the respective operators W1 to W3 among the plurality of operators W (S5, S6). The storage 52 stores the task history information. The controller 50 successively recognizes the tasks and positions of the plurality of operators W, based on the image data at each time in the work site 6 (S2, S3, S7). The controller 50 detects the crossing state as an example of crossing caused between a plurality of trajectories, each trajectory including successive positions for each of the plurality of operators W (S4). When the crossing state is not detected (NO in S4), the controller 50 generates the task history information by associating tasks recognized at each time with the respective operators W1 to W3, based on the plurality of trajectories (S6). When the crossing state is detected (YES in S4), the controller 50 associates the recognized tasks with the respective operators W1 to W3, based on the recognized tasks and past task history information (S5).
By the task analysis device 5 described above, upon detection of the crossing state, in which it is difficult to associate tasks with the operators W1 to W3 based on the trajectories (S6) (YES in S4), the tasks are associated with the operators W1 to W3 based on the recognized tasks and the past task history information (S5). In this manner, when a plurality of tasks are performed by a plurality of operators W in a work site 6, it is possible to estimate the operators W performing the respective tasks.
In the present embodiment, the storage 52 stores the task tendency information indicating a tendency for the tasks performed in the work site 6. When the crossing state is detected (YES in S4), the controller 50 associates the recognized tasks with the respective operators W1 to W3, by referring to the task tendency information (S5). Therefore, even when it is difficult to map the tasks to the operators W1 to W3 by image recognition or the like of the positions of the operators W due to the crossing state of trajectories, it is possible to estimate the operator W corresponding to each task based on the task tendency information.
In the present embodiment, when the crossing state is detected (YES in S4), the controller 50 generates a plurality of task combinations, as an example of calculating a plurality of combinations in which the recognized tasks correspond to the operators W of the plurality of trajectories causing the crossing state (S11). The controller 50 determines one task combination among the plurality of task combinations based on the task tendency information (S16), and associates the recognized tasks with the respective operators W1 to W3, based on the determined task combination (S17). In this manner, the task combination tables T1, T2 including a plurality of task combinations as the candidates C11 to C22 are generated (S11), and the task combination of the discrimination result is determined by narrowing down the candidates C11 to C22 based on the task tendency information (S16). Therefore, the operators W performing the respective tasks can be discriminated based on the determined task combination.
In the present embodiment, the task tendency information includes the task order information D2, as an example of information indicating an order of tasks in a combination of two or more tasks among a plurality of tasks. Based on the order, the controller 50 determines the one task combination (S16) by removing the task combination corresponding to the abnormal order, as an example of a predetermined order, from the plurality of task combinations (S12). Therefore, it is possible to determine the task combination not corresponding to the abnormal order in the task order information D2 as the discrimination result.
In the present embodiment, the task tendency information includes information indicating a standard task period set for a first task among the plurality of tasks. The controller 50 determines the one combination (S16) by removing, from the plurality of task combinations, a task combination in which the period for which a task performed by an operator W is recognized as the first task is longer than the standard task period (S13). In the example illustrated in
In the present embodiment, the controller 50 detects the crossing state based on occlusion, which is an example of overlapping of positions of the operators W1, W2 (an example of two or more operators) among the plurality of operators W, in the image indicated by the acquired image data (S4). Therefore, it is possible to implement detection of the crossing state based on the positions of the operators W in the image.
In the present embodiment, the storage 52 further stores the identification information D3 which identifies the respective operators W1 to W3. The controller 50 associates recognized positions with the respective operators W1 to W3 when the positions of the operators W in the work site 6 are recognized for the first time. In this manner, the controller 50 manages the operators W1 to W3 in association with the trajectory IDs assigned to the positions in the trajectory data D0. Accordingly, the trajectory data D0 is updated by associating the sequentially recognized positions of the operators W with the past positions of the operators W1 to W3 (S3), and it is therefore possible to discriminate the operators W1 to W3 performing the tasks at the recognized positions (S6).
The controller 50 generates the analysis chart 7 as an example of information indicating a ratio of each of the plurality of tasks over the analysis period (an example of the predetermined period), for each of the operators W1 to W3, based on the task history information in the analysis period. Therefore, the analysis chart 7 related to a plurality of operators W who perform a plurality of tasks in the work site 6 can be presented to the user 3 of the task analysis system 1, for example. The task analysis device 5 may further include an output I/F 55 and/or a monitor 4, as an example of a display that displays the generated information, such as the analysis chart 7.
The task analysis method is an example of the image analysis method in the present embodiment. The method includes, by the controller 50 of a computer, acquiring image data indicating a captured image of a work site 6 in which a plurality of operators W performs a plurality of tasks (S1), and generating, based on the image data, task history information indicating the tasks performed by respective operators W1 to W3 among the plurality of operators W in the work site 6 (S2 to S7). In generating the task history information (S2 to S7), the controller 50 of the computer successively recognizes the tasks and positions of the plurality of operators W based on the image data at each time in the work site 6 (S2, S3, S7), and detects the crossing state as an example of crossing caused between a plurality of trajectories, each trajectory including successive positions for each of the plurality of operators W (S4). When the crossing state is not detected (NO in S4), the controller 50 generates the task history information by associating tasks recognized at each time with the respective operators W1 to W3, based on the plurality of trajectories (S6). When the crossing state is detected (YES in S4), the controller 50 associates the recognized tasks with the respective operators W1 to W3, based on the recognized tasks and the past task history information (S5).
In the present embodiment, a program for causing a controller of a computer to execute the task analysis method as described above is provided. By the task analysis method according to the present embodiment, when a plurality of tasks are performed by a plurality of operators W, it is possible to estimate the operator W performing each one of the plurality of tasks.
Explained in the first embodiment is an example in which the task analysis device 5 performs the operator discrimination processing based on the information on the task period and the task order information D2. Explained in a second embodiment is an example in which a task analysis device 5 performs the operator discrimination processing also based on task plan information related to the work site 6.
The task analysis device 5 according to the present embodiment will be explained, by omitting the explanations of configurations and operations that are the same as those of the task analysis device 5 according to the first embodiment, as appropriate.
The task plan information D4 illustrated in
An operation of the task analysis device 5 according to the present embodiment using the task plan information D4, such as that described above, will now be explained with reference to
Each of
The flowchart illustrated in
The controller 50 recognizes the positions of the operators W and tasks in the frame of the next time corresponding to the scene illustrated in
Next, the controller 50 removes candidates from the task combination table T3, based on the task plan information D4 (S21). For example, referring to the assigned sections in the task plan information D4, the controller 50 removes from the task combination table T3 any candidate in which an operator performs box preparation, which is the auxiliary task, as the latest performed task within the operator's assigned section. This exclusion rule is set in advance based on the task plan information D4, considering that an operator W is most likely to perform packing, which is the main task, or picking, which is related to the main task, in the assigned section, for example. Because the operator W1 is assigned the section Z1 as the assigned section in the task plan information D4 illustrated in
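As a non-limiting sketch, the exclusion in step S21 may be implemented as follows; the assigned sections and the auxiliary task name are hypothetical entries standing in for the task plan information D4.

    # Illustrative plan-based exclusion (S21): while an operator is in the
    # assigned section, drop candidates whose latest task for that operator
    # is the auxiliary task.
    ASSIGNED_SECTION = {"W1": "Z1", "W2": "Z1", "W3": "Z2"}  # hypothetical D4
    AUXILIARY_TASK = "box preparation"

    def remove_against_plan(candidates, section_of):
        # section_of: operator -> section containing the latest position
        return [c for c in candidates
                if not any(section_of[op] == ASSIGNED_SECTION.get(op)
                           and seq[-1] == AUXILIARY_TASK
                           for op, seq in c.items())]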
Thereafter, the controller 50 removes candidates from the task combination table T3, based on the task order information D2 and the task period (S12 to S13). In the example illustrated in
As described above, in the present embodiment, the task tendency information includes the task plan information D4, as an example of information which associates the respective operators W1 to W3 with an assigned section (an example of an assigned range) indicating a range of positions where the respective operators W1 to W3 do not perform the auxiliary task, which is an example of the second task among the plurality of tasks in the work site. When a position of an operator W is included in the assigned section assigned to the operator W, the controller 50 removes the task combination in which the task performed by the operator W is the auxiliary task, from the plurality of task combinations (S21), to determine the one task combination (S16). Therefore, it is possible to determine, as the discrimination result, the task combination in which each of the operators W performs the main task in the assigned section.
The exclusion based on the task plan information D4 (S21) is not limited to exclusion based on the assigned section, and information regarding various task plans may be used. For example, the exclusion may be performed using information related to the handling quantity. For example, when an operator W has already performed packing the number of times corresponding to a predetermined handling quantity, combinations in which the operator W performs packing at subsequent times may be removed, considering that the operator W is unlikely to perform packing again.
The first and the second embodiments are described above, as some examples of the technology disclosed in the present application. However, the technology according to the present disclosure is not limited thereto, and may also be applied to embodiments including changes, replacements, additions, omissions, and the like made as appropriate. In addition, it is also possible to combine the elements described in the embodiments to form a new embodiment. Other embodiments will now be explained as some examples.
In each of the embodiments described above, the task order information D2 in which the abnormal order is mapped to the section in the work site 6 is explained. In this embodiment, the task order information D2 may include a standard order indicating an order in which an operator W performs different tasks, correspondingly to each of the sections in the work site 6, for example. In this case, in the exclusion processing (S12) performed based on the task order information in the operator discrimination processing (
Explained in each of the embodiments above is an example in which the image recognizer 51 outputs the recognition result of the position of a person, as a person, without distinguishing the operators W1 to W3 from one another. The image recognizer 51 according to this embodiment may recognize the positions of the operators W1 to W3 while distinguishing the operators W1 to W3 from one another, using face recognition technology, for example. Even in this case, the operation of the task analysis device 5 according to this embodiment can be applied when a task can be recognized, the task being performed by an operator W whose face is not shown in the image captured by the camera 2, for example.
Explained in each of the embodiments above is an example in which the operator W at every position recognized by the image recognizer 51 is subjected to the analysis. The task analysis device 5 according to this embodiment, however, may exclude the positions of some operators who are not to be subjected to the analysis, from the task-related processing (S4 to S5, S8). For example, in the work site 6, operators at positions outside the task areas A1, A2 and an area therebetween, such as the area for refilling, may be excluded. In addition, for example, an operator who performs process management or monitoring in the work site 6 may be excluded from the processing, based on a tendency that the trajectories of such operators pass through the area between the task areas A1, A2 for a long time period.
Explained in each of the embodiments above is the task analysis device 5 that removes candidates based on the task period, with reference to preset task period information (S13). In the task analysis device 5 according to this embodiment, for example, the period of the picking task may be set, for each of the operators, based on the period for which the respective operators stayed in the task area A2 on the side of the shelves 62 in the past.
Explained in each of the embodiments above is an example in which the operator discrimination processing is performed using information, such as a task period, corresponding to each task. In the operator discrimination processing, without limitation to the task period of each task, the task analysis device 5 according to the present embodiment may execute the operator discrimination processing using information such as a period in which the operator is presumably performing a task outside the angle of view of the camera 2, or a period in which the operator is taking a break.
Explained above in the embodiments is an example in which the task analysis system 1 is applied to the work site 6 such as a logistics warehouse. In the present embodiment, the work site where the task analysis system 1 and the task analysis device 5 are used, that is, the site, is not particularly limited to the work site 6 described above, and may be any of various types of work sites such as a factory or a store floor. In addition, the task determined by the task analysis system 1 is not limited to the examples of tasks such as packing described above, and may be various tasks performed in various sites. In addition, the operator to be analyzed by the task analysis system 1 is not limited to a person such as the operator W, and may be any moving object capable of performing various types of tasks. For example, the moving object may be a robot, or may be various manned or unmanned vehicles.
As described above, the embodiments are described as examples of the technology according to the present disclosure. The accompanying drawings and the detailed description are provided for this purpose.
Accordingly, the components described in the accompanying drawings and the detailed description may include not only the components essential for solving the problems, but also components that are not essential for solving the problems, for the purpose of explaining the examples of the above technology. Therefore, it should not be immediately recognized that these non-essential components are essential based on the fact that these non-essential components are described in the accompanying drawings and the detailed description.
The present disclosure is applicable to data analysis for analyzing tasks performed by operators in various environments such as a logistics site or a factory.
Foreign priority: Application No. 2021-098078, filed June 2021, Japan (national).
Related applications: Parent application PCT/JP2022/012833, filed March 2022 (US); child application No. 18530573 (US).