The present invention relates to an information processing technique for assisting in inspection work to be performed by a worker, and more particularly to an information processing technique for providing assistance in inspection work, depending on the proficiency of a worker.
In machinery and equipment such as a water treatment plant, a plant facility, and an electric power facility, inspection work for maintenance or quality maintenance is indispensable for the operations of the machinery and equipment. When performing such a type of inspection work, a worker needs to periodically inspect the maintenance state or operating state of the machinery and equipment on the basis of, for example, a work procedure manual or a screen image for a work procedure displayed on an information processing terminal, and record a result of the inspection correctly. Further, when the inspection result shows that there is a defect in the machinery and equipment, the worker must take a measure such as repair of the machinery and equipment or adjustment of the operating state, as needed.
However, in many cases, the instruction contents in the work procedure manual or screen image for a work procedure are common regardless of the skill levels of workers. Therefore, in a case in which the instruction contents are concise contents suitable for workers having a high skill level, a beginner possibly recognizes that the instruction contents are lacking in information or difficult to understand, and then performs inefficient and incorrect inspection work. In contrast, in a case in which the instruction contents are suitable for workers having a low skill level, an expert possibly recognizes that the instruction contents are redundant. In this case, the expert's operating efficiency possibly decreases.
Therefore, it is preferable that the instruction contents in the work procedure manual or the screen image for a work procedure be changed to contents corresponding to the skill level of the worker. For example, Patent Literature 1 (Japanese Patent Application Publication No. 2012-234406) discloses a work assistance apparatus that automatically estimates the skill level of a worker and displays instruction contents corresponding to the result of the estimation. The work assistance apparatus measures a distribution of velocity of line-of-sight movement of a worker, and estimates that the worker has a high skill level when a peak of the distribution remarkably appears in a specific velocity range.
Patent Literature 1: Japanese Patent Application Publication No. 2012-234406 (paragraphs [0038] to [0052], for example)
According to the conventional technique disclosed in Patent Literature 1, when the worker remains stationary, the skill level can be estimated correctly on the basis of measurement values of the distribution of velocity of line-of-sight movement of the worker. However, when the worker performs inspection work while moving, the worker performs not only line-of-sight movement necessary for the inspection work, but also line-of-sight movement that accompanies the movement of the worker's head or body. The velocity of the line-of-sight movement accompanying the movement of the worker becomes a noise component, causing a reduction in accuracy of the estimation of the skill level.
In view of the foregoing, it is an object of the present invention to provide a work assistance apparatus, a work learning apparatus, and a work assistance system which make it possible to estimate the skill level of a worker with a high degree of accuracy even when the worker performs inspection work while moving.
According to a first aspect of the present invention, there is provided a work assistance apparatus which includes: a line-of-sight measuring unit configured to measure line-of-sight information about a worker; a line-of-sight movement-direction measuring unit configured to measure a direction of line-of-sight movement on the basis of a measurement result acquired by the line-of-sight measuring unit, thereby to output a measurement quantity of the direction of line-of-sight movement; a skill-level estimator configured to compare the measurement quantity with a reference quantity prepared in advance, and to estimate a skill level indicating a proficiency level of the worker on the basis of a result of the comparison; and an output controller configured to cause an information output unit to output work assistance information having descriptions corresponding to the skill level estimated by the skill-level estimator.
According to a second aspect of the present invention, there is provided a work learning apparatus which includes: an output controller for work learning, configured to cause an information output unit to output display of guidance information that prompts a worker to move a line of sight toward a next work item to be inspected from an inspected work item; a line-of-sight measuring unit for work learning, configured to measure line-of-sight information about the worker in response to the output of display of the guidance information; a line-of-sight movement-direction measuring unit for work learning, configured to measure a direction of line-of-sight movement on the basis of a measurement result acquired by the line-of-sight measuring unit for work learning, thereby to output a measurement quantity of the direction of line-of-sight movement; and a reference data calculator configured to calculate a reference quantity corresponding to the guidance information on the basis of the measurement quantity.
According to a third aspect of the present invention, there is provided a work assistance system which includes: the work assistance apparatus according to the first aspect; and the work learning apparatus according to the second aspect.
According to the present invention, because the skill level of a worker is estimated using a measurement quantity of a direction of line-of-sight movement of the worker, the skill level can be estimated with a high degree of accuracy even when the worker performs inspection work while moving. Therefore, by using the result of the estimation, it is possible to provide efficient work assistance depending on the skill level of the worker.
Hereafter, embodiments according to the present invention will be explained in detail with reference to the drawings. It is assumed that components denoted by the same reference numerals in the whole of the drawings have the same configurations and the same functions.
The sound input/output unit 12 is comprised of a microphone MK disposed as a sound input unit that converts an acoustic wave into an electric signal, and a speaker SP disposed as a sound output unit that outputs an acoustic wave to space. In the present embodiment, the sensor group 11, the sound input/output unit 12, and the display device 13 construct a wearable device which can be attached to the head or body of a worker.
Further,
The front-image sensor 11D shown in
A skill level in the present embodiment is a value indicating the proficiency level in inspection work that is performed by a worker. Referring to
The work assistance apparatus 10 estimates the skill level of a worker by using the reference data file Fc. Further, the work assistance apparatus 10 supplies work assistance information having descriptions corresponding to the skill level to the display device 13, the speaker SP, or both the display device 13 and the speaker SP, by using the output data file set Fd. As a result, the worker can recognize the work assistance information visually, auditorily, or visually and auditorily. An information output unit that outputs work assistance information is comprised of the speaker SP and the display device 13.
Work performed using the work assistance system 1 of the present embodiment is grouped into on-line work (i.e., inspection work) performed at an inspection site, and off-line work performed prior to the on-line work. The work learning apparatus 20 and the contents-compilation apparatus 30 are used for the off-line work, and the work assistance apparatus 10 is used for the on-line work.
First, the work assistance apparatus 10 will be explained.
As shown in
Work procedure data Fa are stored in the storage medium 105 together with the above-mentioned output data file set Fd and the above-mentioned reference data file Fc. The main controller 101 controls the contents of the output of the output controller 102 in accordance with the procedure defined by the work procedure data Fa. The communication unit 106 can communicate with the work learning apparatus 20 to acquire the reference data file Fc from the work learning apparatus 20, and store the reference data file Fc in the storage medium 105. The communication unit 106 can also communicate with the contents-compilation apparatus 30 to acquire the output data file set Fd from the contents-compilation apparatus 30, and store the output data file set Fd in the storage medium 105.
For example, at a location where a communication network environment exists, the work learning apparatus 20 and the contents-compilation apparatus 30 can store the output data file set Fd and the reference data file Fc in an information distribution server. The communication unit 106 of the work assistance apparatus 10 can transmit a distribution request to the information distribution server to acquire the output data file set Fd and the reference data file Fc. As an alternative, in a case in which the storage medium 105 is configured as a storage medium which can be freely attached and detached, the reference data file Fc can be transferred from the work learning apparatus 20 to the work assistance apparatus 10 via the storage medium 105, and the output data file set Fd can be transferred from the contents-compilation apparatus 30 to the work assistance apparatus 10 via the storage medium 105.
On the other hand, the I/F unit 107 is configured in such a way as to carry out transmission and reception of data among the sensor group 11, the sound input/output unit 12, and the display device 13. Although the I/F unit 107 of the present embodiment is connected to the sensor group 11, the sound input/output unit 12, and the display device 13 via cables, as shown in
The sensor group 11 includes an image sensor 11A for line-of-sight detection, a sensor 11B for position detection, a direction sensor 11C, and the front-image sensor 11D. The image sensor 11A for line-of-sight detection takes an image of an eyeball of a worker to generate image data showing the eyeball, and supplies the image data CD to the line-of-sight measuring unit 111 via the I/F unit 107. The line-of-sight measuring unit 111 can analyze the image showing the eyeball to measure the line of sight of the worker in real time.
As the sensor 11B for position detection, for example, a GNSS (Global Navigation Satellite System) sensor such as a GPS (Global Positioning System) sensor, an electric wave sensor that detects an electric wave emitted by a wireless LAN base station, or an RFID (Radio Frequency IDentification) sensor is provided. However, the sensor 11B for position detection is not particularly limited to such a sensor as long as the sensor for position detection is used for the detection of the position of a worker and the position of a work target. Further, the direction sensor 11C is used for the detection of the face direction of a worker. For example, the direction sensor can be comprised of a gyro sensor and an acceleration sensor. The front-image sensor 11D takes an image of an object located in front of a worker to generate a digital image. The front-image sensor 11D is comprised of a solid state image sensor such as a CCD image sensor or a CMOS image sensor.
Sensor data SD that consist of the outputs of the sensor 11B for position detection, the direction sensor 11C, and the front-image sensor 11D are supplied to the worker-information acquisition unit 103 and the work-target information acquisition unit 104 via the I/F unit 107.
The microphone MK detects an input acoustic wave such as a voice, and supplies input voice data VD which is a result of the detection to the worker-information acquisition unit 103 via the I/F unit 107. The speaker SP converts outputted acoustic data AD input thereto, via the I/F unit 107, from the output controller 102 into an acoustic wave, and outputs the acoustic wave. On the other hand, the display device 13 has a function of converting display data DD input thereto, via the I/F unit 107, from the output controller 102 into a display image, and outputting the display image.
The worker-information acquisition unit 103 includes a position detector 103P that detects a current position of a worker, a motion detector 103M that detects a motion pattern of the worker, a voice recognizer 103A that recognizes a specific voice pattern, and a direction detector 103D that detects a direction in which the face of the worker is facing (simply referred to as “the face direction of the worker” hereafter). Worker information includes at least one of the current position, the face direction, the motion pattern, and the voice pattern of a worker. The worker information is supplied to the main controller 101. Further, results of the detection of the face direction and the current position of a worker are supplied also to the work-target information acquisition unit 104.
The position detector 103P detects the current position of a worker in real time on the basis of the detection output of the sensor 11B for position detection. For example, when a worker is outside, the current position of the worker can be detected using the above-mentioned GNSS sensor. In contrast, when a worker is inside, the current position of the worker (e.g., a current position defined on a per-building basis, on a per-floor basis, or on a per-room basis) can be detected using either the detection output of the above-mentioned electric wave sensor or the detection output of the above-mentioned RFID sensor. The position detector 103P can acquire information about the current position of a worker from a management system disposed separately such as an entering and leaving control system. The direction detector 103D can detect the face direction of a worker in real time on the basis of the detection output of the direction sensor 11C.
The motion detector 103M analyzes moving image data outputted from the front-image sensor 11D to detect a specific motion pattern of a part (e.g., a hand) of the body of a worker. When a worker moves a part of his or her body with a specific motion pattern (e.g., a motion pattern of moving an index finger up and down), the motion detector 103M can detect the motion pattern by performing a moving image analysis. The motion detector 103M can detect a motion pattern by using not only the moving image data, but also distance information (depth information) acquired by a distance sensor (not illustrated). The distance sensor has a function of detecting the distance to each part of the body surface of a worker by using a well-known projector camera method or a well-known TOF (Time Of Flight) method. The voice recognizer 103A also analyzes the input voice data VD to recognize a voice of a worker, and, when the recognized voice matches a registered voice (e.g., “Inspection has been completed” or “Next inspection”), outputs a result of the recognition. The voice recognition method is not limited particularly to this example, and a well-known voice recognition technique can be used.
On the other hand, the work-target information acquisition unit 104 acquires the results of the detection of the current position and the face direction of a worker from the worker-information acquisition unit 103, and also acquires the front image signal from the sensor data SD. The work-target information acquisition unit 104 can recognize a candidate for a work target existing ahead of the line of sight of the worker by analyzing the front image signal by using the detection results, and can also recognize candidates for a work item in the work target and output results of the recognition of the candidates to the main controller 101. The work-target information acquisition unit 104 can recognize both a candidate for a work target existing at a specific position in a direction which the face of the worker is facing, and candidates for a work item from the front image signal, by using, for example, a well-known pattern matching method.
The main controller 101 controls the contents of the output of the output controller 102 in accordance with the procedure defined by the work procedure data Fa stored in the storage medium 105.
Further, position information for specifying an arrangement range which is occupied by a work item in each work target is defined in “work item position (four point coordinates)” shown in
Further, “requirement for completion of current work” shown in
As the requirement to complete inspection work, a specific motion pattern of a part of a body can be defined. The motion detector 103M can recognize such a motion pattern. As an alternative, the completion of response input of an inspection result can be defined as the requirement to complete inspection work.
Display information on an augmented reality (AR) space which should be displayed after the completion of inspection work is defined in “AR display” shown in
For example, as to P1 to P3, it is defined that after the completion of work on each of the work items specified by the procedure IDs (a circuit breaker A, a circuit breaker B, and a power supply A in a transmission board), the direction of the next inspection position (the position of the next work item to be inspected) is indicated by an arrow. In this case, the “arrow” represents the display information.
The output controller 102 shown in
In contrast,
When the requirement for completion of current work mentioned above (
The main controller 101 notifies the timing acquisition unit 110 of a change timing in response to the display output of the above-mentioned guidance information. For example, the main controller 101 can notify the timing acquisition unit 110 of a change timing immediately after the display output of the above-mentioned guidance information is performed by the output controller 102. The timing acquisition unit 110 causes the line-of-sight measuring unit 111 to start a measurement of line-of-sight information, in response to the notification. The line-of-sight measuring unit 111 can analyze the image data CD acquired by the image sensor 11A for line-of-sight detection to measure line-of-sight information about a worker (including line-of-sight coordinates and time information) in real time. Data showing results of the measurement are supplied to the line-of-sight movement-direction measuring unit 112 and the line-of-sight movement-timing measuring unit 113. The line-of-sight movement-direction measuring unit 112 uses the line-of-sight coordinates included in the line-of-sight information, and the line-of-sight movement-timing measuring unit 113 uses the time information included in the line-of-sight information.
As the method of measuring a line of sight, a well-known image analysis method such as a corneal reflection method can be used. In the case of using the corneal reflection method, for example, the line-of-sight measuring unit 111 analyzes the motion of a pupil which appears in an eyeball image taken by the image sensor for line-of-sight detection on the basis of the eyeball image, to estimate the coordinates of the pupil center and the position coordinates of a corneal reflection image (the position coordinates of an optical image called a Purkinje image). The line-of-sight measuring unit 111 can calculate a sight line vector showing a line-of-sight direction on three-dimensional virtual space on the basis of both the coordinates of the pupil center and the position coordinates of the corneal reflection image. The line-of-sight measuring unit 111 can calculate line-of-sight coordinates on a two-dimensional image coordinates system, the line-of-sight coordinates showing the position at which a worker gazes, on the basis of the sight line vector. A line-of-sight measurement algorithm using such a corneal reflection method is disclosed in, for example, PCT International Application Publication No. 2012/137801.
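As an illustrative, non-limiting sketch of the final mapping step described above, the line-of-sight coordinates on the two-dimensional image coordinate system can be approximated from the offset between the pupil center and the corneal reflection image. The linear mapping and the calibration constants below are assumptions introduced for illustration only; a practical implementation would calibrate them per worker.

```python
# Illustrative sketch of the corneal-reflection mapping step.
# The linear gaze model and its calibration constants are
# assumptions for illustration, not part of the disclosed apparatus.

def gaze_coordinates(pupil_center, reflection, cal=(800.0, 640.0, 600.0, 360.0)):
    """Map the pupil-center/corneal-reflection offset to 2D gaze coordinates.

    pupil_center, reflection: (x, y) positions in eyeball-image pixels.
    cal: (gain_x, offset_x, gain_y, offset_y) calibration constants
         (hypothetical values; obtained by per-user calibration in practice).
    """
    dx = pupil_center[0] - reflection[0]
    dy = pupil_center[1] - reflection[1]
    gain_x, offset_x, gain_y, offset_y = cal
    # Zero offset (pupil center coincides with the Purkinje image)
    # maps to the calibration origin of the image coordinate system.
    return (gain_x * dx + offset_x, gain_y * dy + offset_y)
```

A three-dimensional sight line vector, as described above, would be computed analogously from the same two measured points before projection onto the image plane.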
The line-of-sight measuring unit 111 can measure line-of-sight information about a worker by using only the image data CD including an eyeball image of the worker. As an alternative, the line-of-sight measuring unit can measure line-of-sight information by using the face direction detected by the direction detector 103D in addition to the image data CD. As a result, a measurement of line-of-sight information with a higher degree of reliability can be carried out.
The configuration of the line-of-sight measuring unit 111 can be modified in such a way that the line-of-sight measuring unit measures line-of-sight information on the basis of only the face direction detected by the direction detector 103D. In this case, the line-of-sight direction is estimated from the face direction. Therefore, because it is not necessary to use both a sophisticated sensing technique for specifying the position of the line of sight on the basis of the image data CD, and the image sensor 11A for line-of-sight detection, the configuration of the sensor group 11 and the configuration of the line-of-sight measuring unit 111 can be implemented at a low cost.
The line-of-sight movement-direction measuring unit 112 measures the direction of line-of-sight movement of a worker immediately after the display of the guidance information on the basis of a measurement result (line-of-sight coordinates) acquired by the line-of-sight measuring unit 111, and outputs a measurement quantity Dm of the direction of line-of-sight movement to the skill-level estimator 114. The measurement quantity Dm can be calculated as, for example, an angle or a vector quantity. In parallel, the line-of-sight movement-timing measuring unit 113 measures a timing at which a line-of-sight movement of the worker immediately after the display of the guidance information starts on the basis of a measurement result acquired by the line-of-sight measuring unit 111, and outputs a measurement value (time information) Tm of the timing to the skill-level estimator 114.
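A minimal sketch of how the measurement quantity Dm might be expressed as an angle is given below. The function name and the representation of the guidance direction as a two-dimensional vector are assumptions for illustration; the apparatus may equally output Dm as a vector quantity.

```python
import math

def movement_direction_angle(p_start, p_end, guide_vec):
    """Angle (degrees) between the measured line-of-sight movement vector
    (p_end - p_start) and the guidance direction guide_vec.

    This is one illustrative way of expressing the measurement quantity Dm
    as an angle; p_start and p_end are line-of-sight coordinates measured
    immediately after the display of the guidance information.
    """
    mx, my = p_end[0] - p_start[0], p_end[1] - p_start[1]
    gx, gy = guide_vec
    dot = mx * gx + my * gy
    norm = math.hypot(mx, my) * math.hypot(gx, gy)
    # Clamp to guard against floating-point values slightly outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
```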
The skill-level estimator 114 accesses the storage medium 105 to acquire the reference data file Fc. Both data indicating a reference quantity of the direction (referred to as a “directional reference quantity” hereafter) of line-of-sight movement, and data indicating a reference value of the timing (referred to as a “timing reference value” hereafter) at which the line-of-sight movement of a worker starts are included in the reference data file Fc. A plurality of combinations of the directional reference quantity and the timing reference value is prepared for each point at which to change from a work item to another work item, the number of combinations being equal to the number of skill levels which can be estimated.
The skill-level estimator 114 can estimate the skill level on the basis of a combination of a comparison result which is acquired by comparing the measurement quantity Dm with the directional reference quantity, and a comparison result which is acquired by comparing the measurement value Tm with the timing reference value. Concretely, the skill-level estimator 114 selects a combination which is most similar to the combination of the actual measurement quantity Dm and the actual measurement value Tm from among the combinations of the directional reference quantity and the timing reference value, and outputs, as a result of the estimation, the skill level corresponding to the combination selected thereby. The skill-level estimator can alternatively estimate the skill level on the basis of only either the comparison result which is acquired by comparing the measurement quantity Dm with the directional reference quantity, or the comparison result which is acquired by comparing the measurement value Tm with the timing reference value. A concrete example of the method of estimating the skill level will be mentioned later.
Next, the operations of the work assistance apparatus 10 described above will be explained with reference to
Referring to
In contrast, when a workplace is recognized (YES in step ST12), the main controller 101 tries to recognize a work item to be inspected of a work target registered in the work procedure data Fa by using a recognition result acquired by the work-target information acquisition unit 104 (step ST13). As a result, when no work item to be inspected is recognized (NO in step ST14), the processing returns to step ST13. The flow chart shown in
When a work item to be inspected is recognized (YES in step ST14), the skill-level estimator 114 estimates the current skill level (step ST15). To be more specific, the output controller 102 makes a request of the skill-level estimator 114 to estimate the skill level, in accordance with control performed by the main controller 101, and the skill-level estimator 114 estimates the current skill level in response to the request.
Specifically, in a case in which skill levels 1 to Q on a Q-level scale can be estimated (Q is an integer equal to or greater than 2), for example, the skill-level estimator 114 can estimate that the most frequent skill level among N most recent consecutive results (N is an integer equal to or greater than 2) of the skill-level estimation is the current skill level. In this case, when the number of times that the skill level j is estimated is equal to the number of times that the skill level i is estimated (i≠j), it can be estimated that a lower-proficiency one of those two skill levels j and i is the current skill level.
As an alternative, in a case in which only two skill levels including the skill level 1 for beginners and the skill level 2 for experts are prepared, the skill-level estimator 114 can estimate that the skill level 2 is the current skill level, when the number of times that the skill level 2 is estimated is equal to or greater than M (M is a positive integer; M≤N) among N most recent consecutive results of the skill-level estimation. In contrast, when the number of times that the skill level 2 is estimated is less than M, the skill-level estimator 114 can estimate that the skill level 1 is the current skill level.
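The selection rules described above can be sketched as follows. The function names are hypothetical, and the sketch assumes that the N most recent estimation results are available as a list of integer skill levels.

```python
from collections import Counter

def current_skill_level(recent_estimates):
    """Most frequent skill level among the N most recent estimation results.

    On a tie, the lower-proficiency (smaller) level is chosen, following
    the rule described in the text for skill levels i and j (i != j).
    """
    counts = Counter(recent_estimates)
    best = max(counts.values())
    # Ties resolve to the smaller, i.e. lower-proficiency, level.
    return min(level for level, count in counts.items() if count == best)

def current_skill_level_two_scale(recent_estimates, m):
    """Two-level variant: level 2 (expert) only when it was estimated
    at least M times among the N most recent results; otherwise level 1."""
    return 2 if recent_estimates.count(2) >= m else 1
```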
Next, the output controller 102 accesses the storage medium 105 to select an output data file corresponding to the skill level from the output data file set Fd, and outputs the work assistance information shown by the output data file to the I/F unit 107 (step ST16). As a result, the worker can recognize the work assistance information visually, auditorily, or visually and auditorily via the speaker SP, the display device 13, or both these speaker and display device.
After that, the work assistance apparatus 10 waits until receiving a predetermined response input, such as a voice or a motion pattern, which is defined in the work procedure data Fa from the worker (NO in step ST17). For example, the worker is allowed to utter “Circuit breaker A has an abnormality”, “No abnormality”, or “Next inspection.” When no predetermined response input is received even after a prescribed time period has elapsed (NO in step ST17 and YES in step ST18), the processing shifts to the next step ST20.
In contrast, when a predetermined response input is received within the prescribed time period (NO in step ST18 and YES in step ST17), the main controller 101 stores the result of the response input as an inspection result (step ST19). For example, the main controller 101 can store data showing the inspection result in the communication unit 106, or store the inspection result in an external device by transmitting the data showing the inspection result to the external device via the communication unit 106.
After that, the main controller 101 determines the presence or absence of the next work item to be inspected (step ST20). More specifically, the main controller 101 determines whether the next work item to be inspected exists by referring to the work procedure data Fa. When it is determined that the next work item to be inspected exists (YES in step ST20), the output controller 102 causes the display device 13 to perform display output of guidance information for guiding the worker to the next work item (step ST21). For example, the output controller 102 can cause the display device 13 to display the guidance information GA represented by the arrow symbol shown in
The timing acquisition unit 110 waits until being notified of a change timing. When receiving a notification of a change timing from the main controller 101, the timing acquisition unit 110 instructs the line-of-sight measuring unit 111 to perform a line-of-sight measurement, in response to the notification (step ST30 of
After that, the skill-level estimator 114 accesses the storage medium 105 to acquire the reference data file Fc (step ST34). Then, for each skill level k set in the reference data file Fc, the skill-level estimator 114 compares the set (Dm, Tm) of the measurement quantity Dm and the measurement value Tm with the set (Dk, Tk) of the directional reference quantity Dk and the timing reference value Tk which corresponds to the skill level k (steps ST35 to ST37).
Concretely, the skill-level estimator 114 sets a number k indicating a skill level (k is an integer ranging from 1 to Q) to “1” first (step ST35). The skill-level estimator 114 then compares the measurement quantity Dm with the directional reference quantity Dk corresponding to the skill level k and calculates either a dissimilarity ΔD(k) or a similarity SD(k) which is a result of the comparison (step ST36). For example, the skill-level estimator can calculate either the dissimilarity ΔD(k) or the similarity SD(k) by using the following equation (1A) or (1B) (a is a positive coefficient).
ΔD(k)=|Dm−Dk| (1A)
SD(k)=a/ΔD(k) (1B)
The measurement quantity Dm in the above equation (1A) is the angle which the direction shown by the guidance information forms with the direction of line-of-sight movement of the worker. The dissimilarity ΔD(k) means the absolute value of the difference between the measurement quantity Dm and the directional reference quantity Dk. When the measurement quantity Dm and the directional reference quantity Dk are vector quantities, instead of the difference absolute value in the above equation (1A), for example, the norm of the difference vector between the measurement quantity Dm and the directional reference quantity Dk can be calculated as the dissimilarity ΔD(k). In general, the norm of a vector is the length of the vector.
The skill-level estimator 114 also compares the measurement value Tm with the timing reference value Tk corresponding to the skill level k and calculates either a dissimilarity ΔT(k) or a similarity ST(k) which is a result of the comparison (step ST37). For example, the skill-level estimator can calculate either the dissimilarity ΔT(k) or the similarity ST(k) by using the following equation (2A) or (2B) (b is a positive coefficient).
ΔT(k)=|Tm−Tk| (2A)
ST(k)=b/ΔT(k) (2B)
The dissimilarity ΔT(k) in the above equation (2A) means the absolute value of the difference between the measurement value Tm and the timing reference value Tk.
Next, when the number k indicating a skill level does not reach the maximum number Q (YES in step ST38), the skill-level estimator 114 increments the number k indicating a skill level by 1 (step ST39), and performs steps ST36 and ST37 again.
After that, when the number k indicating a skill level reaches the maximum number Q (NO in step ST38), the skill-level estimator 114 estimates the skill level of the worker on the basis of either the degrees of dissimilarity ΔD(k) and ΔT(k), or the degrees of similarity SD(k) and ST(k) (step ST40).
For example, the skill-level estimator 114 can estimate, as a result of the estimation, the skill level k which minimizes the norm of the degree-of-dissimilarity vector (ΔD(k), ΔT(k)), among the skill levels 1 to Q. The skill-level estimator 114 can alternatively estimate, as a result of the estimation, the skill level k which maximizes the norm of the degree-of-similarity vector (SD(k), ST(k)), among the skill levels 1 to Q.
As an alternative, the skill-level estimator 114 can estimate, as a result of the estimation, the skill level k which minimizes a combined dissimilarity Δ(k) (=ΔD(k)+ΔT(k)), among the skill levels 1 to Q. The skill-level estimator 114 can alternatively estimate, as a result of the estimation, the skill level k which maximizes a combined similarity S(k) (=SD(k)+ST(k)), among the skill levels 1 to Q.
As an alternative, in the case in which only two skill levels including the skill level 1 for beginners and the skill level 2 for experts are prepared, the skill-level estimator 114 can estimate that the skill level 2 is a result of the estimation when the requirement A as shown below is satisfied, whereas the skill-level estimator can estimate that the skill level 1 is a result of the estimation when the requirement A is not satisfied.
Requirement A: ΔD(1)>ΔD(2) and ΔT(1)>ΔT(2)
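As one illustrative sketch (names hypothetical), the loop of steps ST35 to ST40 under the combined-dissimilarity criterion Δ(k) = ΔD(k) + ΔT(k) described above can be written as:

```python
def estimate_skill_level(dm, tm, refs):
    """Sketch of steps ST35-ST40 for scalar quantities.
    refs: list of (Dk, Tk) reference pairs for skill levels 1..Q.
    Returns the skill level k minimizing Δ(k) = ΔD(k) + ΔT(k)."""
    best_k, best_delta = None, float("inf")
    for k, (dk, tk) in enumerate(refs, start=1):   # ST35 init, ST38/ST39 loop
        delta = abs(dm - dk) + abs(tm - tk)        # ST36 and ST37 combined
        if delta < best_delta:                     # ST40: keep the minimizer
            best_k, best_delta = k, delta
    return best_k
```

With two levels (beginner and expert references), the function reduces to a comparison equivalent in spirit to Requirement A.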
After the performance of the above-mentioned step ST40, the processing returns to the step ST13 shown in
Next, the contents-compilation apparatus 30 will be explained.
As shown in
The work procedure data Fa is stored in the storage medium 303. The contents-compilation processor 301 can cause a display device 310 to display a screen image for compilation which makes it possible to compile the descriptions of the work procedure data Fa and generate the output data file set Fd, via the I/F unit 302. A compiler can compile the descriptions of the work procedure data Fa by inputting information to the contents-compilation processor 301 while visually recognizing the screen image for compilation and handling a manual input device 311, thereby generating the output data file set Fd. The output data file set Fd consists of the plural output data files F1, . . . , FN which correspond to skill levels on a multi-level scale. The communication unit 304 can communicate with the work assistance apparatus 10 to supply the output data file set Fd to the work assistance apparatus 10.
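As a minimal sketch (all names hypothetical; the actual file format is not specified in this description), the output data file set Fd can be modeled as a mapping from skill level to per-level instruction contents derived from the work procedure data Fa:

```python
def build_output_file_set(work_procedure, levels):
    """Model of an output data file set Fd: one output data file per
    skill level, each produced by rewriting every step of the work
    procedure for that level.
    work_procedure: list of instruction steps (the data Fa, simplified).
    levels: dict mapping skill level k -> rewrite function for level k."""
    return {k: [rewrite(step) for step in work_procedure]
            for k, rewrite in levels.items()}
```

In the apparatus, the rewriting is done manually by the compiler through the screen image for compilation rather than by a function, but the resulting structure is the same: N files, one per skill level.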
Next, the operations of the contents-compilation apparatus 30 will be explained with reference to
The contents-compilation processor 301 waits until receiving an instruction to start compilation provided through a manual input done by the compiler (NO in step ST51). When receiving an instruction to start compilation (YES in step ST51), the contents-compilation processor 301 reads the work procedure data Fa (step ST52), and causes the display device 310 to display a screen image for compilation with respect to either a work target or one or more work items which are specified by the instruction to start compilation (step ST53). The contents-compilation processor 301 then performs a contents-compilation operation corresponding to a manual input done by the compiler (step ST54). Through the contents-compilation operation, the compiler can compile the information (e.g., the contents of a text to be displayed as shown in
After ending the contents-compilation operation, the contents-compilation processor 301 generates an output data file corresponding to the skill level (step ST55), and stores the output data file in the storage medium 303 (step ST56).
After that, when an instruction for compilation with respect to another screen image for compilation is received in a state in which no instruction to terminate compilation is received (NO in step ST57 and YES in step ST58), the contents-compilation processor 301 performs the step ST53 and the subsequent steps. In contrast, when no instruction for compilation with respect to another screen image for compilation is received in the state in which no instruction to terminate compilation is received (NO in step ST57 and NO in step ST58), the contents-compilation processor 301 waits until this state has lasted a prescribed time period (NO in step ST59). When the state has lasted the prescribed time period (YES in step ST59), or when an instruction to terminate compilation is received (YES in step ST57), the above-mentioned output data file generating processing is ended.
Next, the work learning apparatus 20 will be explained.
The main controller 101A has the same function as the main controller 101 of the work assistance apparatus 10, with the exception that the main controller 101A handles only a skill level that is set in advance. Further, the output controller 102A has the same function as the output controller 102 of the work assistance apparatus 10, with the exception that the output controller 102A operates on only the set skill level. More specifically, the output controller 102A acquires an output data file corresponding to the set skill level from an output data file set Fd in accordance with control performed by the main controller 101A. The output controller 102A then supplies work assistance information having descriptions shown by the output data file to a speaker SP, a display device 13, or both of them via an I/F unit 107.
In this case, the set skill level is the known skill level of a worker who uses the work learning apparatus 20. For example, the main controller 101A can set the skill level on the basis of skill-level setting information which is input by voice to a microphone MK, by using the voice recognition function of a voice recognizer 103A. As an alternative, the skill level can be set on the basis of skill-level setting information that is input via a communication unit 106.
The reference data calculator 201 can calculate a directional reference quantity on the basis of a measurement quantity calculated by a line-of-sight movement-direction measuring unit 112. For example, the average of quantities which have been measured multiple times for one or more workers having the same skill level can be calculated as the directional reference quantity. The reference data calculator 201 can also calculate a timing reference value on the basis of a measurement value calculated by a line-of-sight movement-timing measuring unit 113. For example, the average of values which have been measured multiple times for one or more workers having the same skill level can be calculated as the timing reference value. A reference data file Fc including these directional reference quantity and timing reference value is stored in a storage medium 105. The communication unit 106 can communicate with the work assistance apparatus 10 to supply the reference data file Fc to the work assistance apparatus 10.
Next, the operations of the work learning apparatus 20 will be explained with reference to
Referring to
Next, the work learning apparatus 20 waits until receiving a predetermined response input, such as a voice or a motion pattern, which is defined in work procedure data Fa from the worker (NO in step ST61). For example, the worker is allowed to utter “Circuit breaker A has an abnormality”, “No abnormality”, or “Next inspection.” When no predetermined response input is received even after a fixed time period has elapsed (NO in step ST61 and YES in step ST62), the processing shifts to the next step ST63.
In contrast, when a predetermined response input is received within the fixed time period (NO in step ST62 and YES in step ST61), the main controller 101A determines the presence or absence of the next work item to be inspected (step ST63). When it is determined that the next work item to be inspected exists (YES in step ST63), the output controller 102A causes the display device 13 to perform display output of guidance information for guiding the worker to the next work item, like in the case of the above-mentioned step ST21 (step ST64). The main controller 101A then notifies a timing acquisition unit 110 of a change timing in response to the display output of the guidance information (step ST65). After that, the reference data calculation operation is performed (step ST66).
When receiving a notification of a change timing from the main controller 101A, the timing acquisition unit 110 instructs a line-of-sight measuring unit 111 to perform a line-of-sight measurement, in response to the notification (step ST70 of
Next, the reference data calculator 201 accesses the storage medium 105 to read a previous reference data file Fc (step ST74). The reference data calculator 201 then calculates, as a new directional reference quantity, the average of plural measurement quantities including previously measured ones, on the basis of both the previous directional reference quantity in the reference data file Fc and the measurement quantity Dm (step ST75). The reference data calculator 201 also calculates, as a new timing reference value, the average of plural measurement values including previously measured ones, on the basis of both the previous timing reference value in the reference data file Fc and the measurement value Tm (step ST76). Then, the reference data calculator 201 newly generates a reference data file by using the newly calculated directional reference quantity and timing reference value (step ST77), and stores the newly-generated reference data file in the storage medium 105 (step ST78).
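The averaging in steps ST75 and ST76 can be sketched as an incremental running average. Note that maintaining the count of previous measurements (n_prev below) is an assumption: the description does not specify how the previous measurements are aggregated, and the reference data file would need to store this count alongside each reference value.

```python
def update_reference(prev_ref, n_prev, new_measurement):
    """Incremental running-average update (sketch of steps ST75/ST76).
    prev_ref: previous reference value, assumed to be the average of
    n_prev earlier measurements. Returns the new reference value (the
    average over n_prev + 1 measurements) and the updated count."""
    new_ref = (prev_ref * n_prev + new_measurement) / (n_prev + 1)
    return new_ref, n_prev + 1
```

The same update applies to both the directional reference quantity (with Dm) and the timing reference value (with Tm).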
After performing the above-mentioned step ST66, the processing returns to the step ST13 shown in
Each of the hardware configurations of the work assistance apparatus 10 and the work learning apparatus 20, which are explained above, can be implemented by, for example, an information processing device, such as a workstation or a mainframe, which has a computer configuration in which a CPU (Central Processing Unit) is mounted. As an alternative, each of the hardware configurations of the above-mentioned work assistance apparatus 10 and the above-mentioned work learning apparatus 20 can be implemented by an information processing device having an LSI (Large Scale Integrated circuit) such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), or an FPGA (Field-Programmable Gate Array).
In a case in which the work assistance apparatus 10 shown in
In contrast, in a case in which the work learning apparatus 20 shown in
In a case in which the work assistance apparatus 10 shown in
In contrast, in a case in which the work learning apparatus 20 shown in
As each of the mounted storage media 43 and 55 shown in
Further, each of the communication circuits 42 and 54 shown in
As previously explained, according to the present embodiment, because the skill level of a worker can be estimated automatically by using a measurement value of the timing of line-of-sight movement, a measurement quantity of the direction of line-of-sight movement, or both of these measured results at a time when the worker changes from an inspection item to another inspection item, the skill level can be estimated promptly even though the worker performs work accompanied by his or her motion. Therefore, by presenting, to the worker, work assistance information having descriptions corresponding to the estimated skill level, effective work assistance can be provided for the worker. Further, when the worker 4 performs work while wearing the wearable device 5 equipped with spectacles, like in the case of the present embodiment, a line-of-sight movement resulting from a motion of the worker 4 easily occurs. Even in such a state, because the estimation is tied to the times at which the worker changes from an inspection item to another inspection item, the present embodiment provides an advantage of improving the accuracy of the estimation of the skill level. Even when another type of wearable device is used instead of the wearable device 5 equipped with spectacles, the same advantage can be provided.
Further, as mentioned above, the output controller 102 of the work assistance apparatus 10 selects an output data file corresponding to the skill level estimated by the skill-level estimator 114, from among the output data files F1 to FN corresponding to plural skill levels. A compiler can compile each of the output data files F1 to FN by using the contents-compilation apparatus 30. With this configuration, the descriptions of the work assistance information can be customized in detail and with flexibility in accordance with the skill level, and the working efficiency of the worker 4 can be improved.
Although Embodiment 1 according to the present invention has been described with reference to the drawings as previously explained, the embodiment exemplifies the present invention, and various embodiments other than the embodiment can also be exemplified. Within the scope of the present invention, an arbitrary combination of two or more of the components of the above embodiment can be made, a change can be made in an arbitrary component of the above embodiment, and/or an arbitrary component of the above embodiment can be omitted.
The work assistance apparatus and the work assistance system according to the present invention can be applied to assistance in work, such as maintenance or inspection of machinery and equipment, or repair or assembly of machinery and equipment, which is performed in accordance with a certain procedure.
1: work assistance system; 2: work assistance system; 3A, 3B: information processing devices; 4: worker; 5: wearable device equipped with spectacles; 10: work assistance apparatus; 11: sensor group; 11A: image sensor for line-of-sight detection; 11B: sensor for position detection; 11C: direction sensor; 11D: front-image sensor; 12: sound input/output unit; 13: display device; 13V: eyeglass portions; 20: work learning apparatus; 30: contents-compilation apparatus; 40: signal processing circuit; 41: interface (I/F) circuit; 42: communication circuit; 43: storage medium; 44: memory interface; 45: storage medium; 46: signal path; 50: processor; 51: RAM; 52: ROM; 53: interface (I/F) circuit; 54: communication circuit; 55: storage medium; 56: memory interface; 57: storage medium; 58: signal path; 101, 101A: main controllers; 102, 102A: output controllers; 103: worker-information acquisition unit; 103P: position detector; 103M: motion detector; 103A: voice recognizer; 103D: direction detector; 104: work-target information acquisition unit; 105: storage medium; 106: communication unit; 107: interface unit (I/F unit); 110: timing acquisition unit; 111: line-of-sight measuring unit; 112: line-of-sight movement-direction measuring unit; 113: line-of-sight movement-timing measuring unit; 114: skill-level estimator; 201: reference data calculator; 301: contents-compilation processor; 302: interface unit (I/F unit); 303: storage medium; 304: communication unit; 310: display device; and 311: manual input device.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2016/050543 | 1/8/2016 | WO | 00 |