The present invention claims priority under 35 U.S.C. § 119 to Japanese Patent Application No. 2018-078438, filed on Apr. 16, 2018, the entire content of which is incorporated herein by reference.
The present invention relates to an image forming apparatus, such as a multi-functional peripheral (MFP), and a technology associated with the image forming apparatus.
For example, in a case where an abnormal event (e.g., a paper jam) occurs in an image forming apparatus, the image forming apparatus sequentially guides a worker through a plurality of work stages according to work to be conducted by the worker at the image forming apparatus (e.g., work of eliminating the paper jam). The worker then conducts each work stage in accordance with the guidance of the image forming apparatus and, after completing each work stage, additionally performs an operation such as depressing a "completion" button (a button for notifying the image forming apparatus that the work stage has been completed) displayed on an operation screen of the image forming apparatus. The image forming apparatus determines that each work stage has been completed on the basis of the worker's depression operation.
Note that having to depress the "completion" button on the operation screen of the image forming apparatus after completing each work stage is quite burdensome for the worker.
As a technique addressing this point, there is a technique of determining the completion of each work stage with a sensor (refer to JP 2007-230001 A). According to this technique, the completion of each work stage is automatically determined on the basis of a detected result of the sensor. The worker is therefore not required to perform a (manual) operation on an operation screen of an image forming apparatus after completing each work stage, so that the work efficiency of the worker can be improved considerably.
There is also a technique of determining the completion of each work stage on the basis of a voice input from a worker indicating that the work stage has been conducted (refer to JP 2003-215985 A). According to this technique, the worker is not required to perform an operation (e.g., depressing a "completion" button) on an operation screen of an image forming apparatus after completing each work stage. Thus, the work efficiency of the worker can be improved.
However, the technique described in JP 2007-230001 A typically adopts only the sensor for determining the completion of each work stage, and the technique described in JP 2003-215985 A typically adopts only the voice input for that purpose; each technique therefore leaves room for further improvement in the work efficiency of the worker.
For example, according to the technique described in JP 2007-230001 A, the completion of all the work stages is not necessarily detectable with the sensor. In a case where a work stage whose completion cannot be detected with the sensor is present, the completion of that work stage is not automatically determined even when the worker conducts it. As a result, the worker is still required to perform a (burdensome) operation (e.g., depressing a "completion" button) on the operation screen of the image forming apparatus after completing the work stage.
According to the technique described in JP 2003-215985 A, the worker is not required to depress, for example, a "completion" button on the operation screen of the image forming apparatus after completing each work stage. However, the worker needs to perform, every time, a voice input indicating that each work stage has been conducted (completed), although this is simpler than an operation input on the operation screen.
Thus, an object of the present invention is to provide a technique enabling further improvement of the work efficiency of a worker.
To achieve the abovementioned object, according to an aspect of the present invention, an image forming apparatus reflecting one aspect of the present invention comprises: a hardware processor that sequentially guides a worker to a plurality of work stages according to work to be conducted by the worker at the image forming apparatus and determines completion of each of the plurality of work stages with a completion-determination method corresponding to a type of the work stage; at least one sensor capable of detecting a final state of at least one work stage of the plurality of work stages; and a voice inputter capable of receiving a voice input from the worker, wherein the plurality of work stages is classified into a first type of stage and a second type of stage in advance, and the hardware processor: determines, in a case where a current work stage in the plurality of work stages belongs to the first type of stage, completion of the current work stage, based on a detected result of a specific sensor corresponding to the current work stage in the at least one sensor; and determines, in a case where the current work stage belongs to the second type of stage, the completion of the current work stage, based on the voice input of an effect that the current work stage has been conducted.
The advantages and features provided by one or more embodiments of the invention will become more fully understood from the detailed description given hereinbelow and the appended drawings which are given by way of illustration only, and thus are not intended as a definition of the limits of the present invention:
Hereinafter, one or more embodiments of the present invention will be described with reference to the drawings. However, the scope of the invention is not limited to the disclosed embodiments.
<1-1. Configuration Overview>
The MFP 10 is an apparatus (also referred to as a multi-functional printer) having, for example, a scan function, a copy function, a facsimile function, and a box-storage function. Specifically, as illustrated in the functional block diagram of
The image scanner 2 is a processor that optically reads (namely, scans) an original placed at a predetermined position in the MFP 10 to generate image data of the original (also referred to as an original image or a scanned image).
The printing outputter 3 is an outputter that performs printing output of an image on various media, such as paper, on the basis of data regarding a printing target (printing-target data).
The communicator 4 is a processor capable of performing facsimile communication through, for example, a public line. Furthermore, the communicator 4 is capable of performing network communication through a network. For example, various protocols, such as Transmission Control Protocol/Internet Protocol (TCP/IP), are used in the network communication. Use of the network communication enables the MFP 10 to transmit and receive various types of data to and from a desired destination. The communicator 4 includes a transmitter 4a that transmits various types of data and a receiver 4b that receives various types of data.
The storage 5 includes storage devices, such as a hard disk drive (HDD) and a semiconductor memory.
The operator 6 includes an operation inputter 6a that receives an operation input to the MFP 10 (an operation input to, for example, an operation panel 6c) and a display (display outputter) 6b that performs display output of various types of information (e.g., work guidance information 300 (to be described later)).
The MFP 10 is provided with the operation panel 6c that is substantially plate-shaped (refer to
The voice inputter and outputter 7 (refer to
The MFP 10 is provided with a plurality of state-detection sensors 8. The state-detection sensors 8 are each capable of directly detecting the final state of at least one work stage of a plurality of work stages according to work to be conducted by the worker at the MFP 10.
For example, a member openable and closable in the MFP 10 (openable and closable member) is provided with, as a state-detection sensor 8, a sensor (also referred to as an open/closed detection sensor) capable of directly detecting the open/closed state of the openable and closable member (the open state and the closed state). The open/closed detection sensor is used as a state-detection sensor 8 capable of directly detecting the final state of opening work of the openable and closable member that is a detection target (the open state of the member) and is additionally used as a state-detection sensor 8 capable of directly detecting the final state of closing work of the member (the closed state of the member). Note that, as the openable and closable member, a door provided at the right face of the MFP 10 (right door 40 (
The open/closed detection sensor is provided per detection target member (detection target). For example, the right door 40 (
Here, a photosensor is exemplified as each open/closed detection sensor.
For example, in a case where an openable and closable member (e.g., the right door 40) is in the closed state, the light-receiving element of the photosensor for the right door 40 (provided at the right door 40) receives light emitted from the light-emitting element (provided at the body of the MFP 10), so that the closed state of the right door 40 is detected. When the right door 40 thereafter transitions from the closed state to the open state (namely, the right door 40 opens), the light-receiving element no longer receives the light from the light-emitting element, so that the change of the right door 40 from the closed state to the open state is detected. When the right door 40 thereafter transitions again from the open state to the closed state (namely, the right door 40 closes), the light-receiving element again receives the light from the light-emitting element, so that the change of the right door 40 from the open state to the closed state is detected.
Note that, here, each open/closed detection sensor includes, but is not limited to, the photosensor. Each open/closed detection sensor may include a different type of sensor (e.g., a gyroscope sensor or an image sensor). For example, a gyroscope sensor may be provided at the right door 40 as the open/closed detection sensor for the right door 40, and the open/closed state of the right door 40 (open/closed angle) may be detected on the basis of an output value of the gyroscope sensor when the right door 40 opens or closes. Alternatively, an image sensor (e.g., a camera) may be provided, for example, on the ceiling above the MFP 10 as the open/closed detection sensor for the right door 40, and the open/closed state of the right door 40 may be detected on the basis of a captured image of the image sensor when the right door 40 opens or closes.
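The photosensor-based detection described above can be illustrated with a small sketch (hypothetical Python, not part of the embodiment; the function names and the boolean signal representation are assumptions): light received corresponds to the closed state, and a change in the light-received signal corresponds to an open/close transition.

```python
# Hypothetical sketch: deriving door open/close transitions from a
# photosensor's light-received signal. Light received -> door closed;
# light blocked (not received) -> door open.

def door_state(light_received: bool) -> str:
    return "closed" if light_received else "open"

def transitions(signal):
    """Return (from_state, to_state) pairs each time the signal changes."""
    events = []
    prev = signal[0]
    for cur in signal[1:]:
        if cur != prev:
            events.append((door_state(prev), door_state(cur)))
        prev = cur
    return events

# Closed -> opened -> closed again, as in the right door 40 example.
print(transitions([True, True, False, False, True]))
# -> [('closed', 'open'), ('open', 'closed')]
```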
The controller 9 is a control device that is built in the MFP 10 and controls the MFP 10 in an integrated manner. The controller 9 includes a computer system including, for example, a central processing unit (CPU) (also referred to as a microprocessor or a computer processor) and various semiconductor memories (a RAM and a ROM). The CPU of the controller 9 executes a predetermined software program (hereinafter, also simply referred to as a program) stored in the ROM (e.g., an EEPROM (registered trademark)) to achieve various processors. Note that the program (in detail, a program module group) may be recorded on a portable recording medium, such as a USB memory, and the program read from the recording medium may be installed on the MFP 10. Alternatively, the program downloaded through, for example, a network may be installed on the MFP 10.
Specifically, as illustrated in
The communication controller 11 is a processor that controls communication operations with other apparatuses in cooperation with, for example, the communicator 4.
The input controller 12 is a controller that controls input operation to the operation inputter 6a (e.g., the touch panel 25).
The display controller 13 is a processor that controls display operation at the display 6b (e.g., the touch panel 25).
The guidance controller (also referred to as a guide controller) 15 is a processor that performs control (guide control) of sequentially guiding the worker through the plurality of work stages (a plurality of work steps) according to the work (a series of work) to be conducted by the worker at the MFP 10. Specifically, the guidance controller 15 outputs the work guidance information 300 (to be described next) to sequentially guide the worker through the plurality of work stages. The work guidance information 300 is information for guiding the worker through the plurality of work stages and is also referred to as work guide information. The guidance controller 15 also performs, for example, determination operation of determining the completion of each work stage with a completion-determination method corresponding to the type of the work stage.
Note that, here, the various types of operation described above are mainly performed by the CPU of the controller 9 executing the software program, but the present embodiment is not limited to this. The various types of operation may be performed with, for example, dedicated hardware provided in the MFP 10 (in detail, inside or outside the controller 9). For example, all or part of the communication controller 11, the input controller 12, the display controller 13, and the guidance controller 15 (
<1-2. Operation>
Here, a series of work to be conducted by the worker in order to eliminate the paper jam occurring in the MFP 10 (elimination work of the paper jam) includes three pieces of elemental work (work stages): work of opening the right door 40 (
The MFP 10 has the three work stages classified in advance by, for example, a designer of the program into two types of stages: a detection target stage and a non-detection target stage. The detection target stage is a work stage whose completion is detected with a state-detection sensor 8 (in detail, the sensor corresponding to the current work stage among the plurality of state-detection sensors 8). The non-detection target stage is a work stage whose completion is not detected with a state-detection sensor 8.
Specifically, the three work stages are each classified into the detection target stage or the non-detection target stage on the basis of whether the completion of the work stage can be detected with a state-detection sensor 8. More specifically, a work stage whose completion can be detected with its corresponding sensor (also referred to as a detectable stage) is classified into the detection target stage, whereas a work stage whose completion cannot be detected with a state-detection sensor 8 (also referred to as an undetectable stage) is classified into the non-detection target stage. Note that the present embodiment is not limited to this; the three work stages may be classified into the detection target stage and the non-detection target stage on the basis of a different criterion.
Here, as described later, among the plurality of work stages according to the elimination work of the paper jam (three work stages), the opening work of the right door 40 of the MFP 10 (work of opening the right door 40) and the closing work of the right door 40 of the MFP 10 (work of closing the right door 40) are classified into the detection target stage, and the sheet removal work is classified into the non-detection target stage.
The MFP 10 determines the completion of the detection target stage (here, the opening work or the closing work of the right door 40) among the three work stages on the basis of a detected result of the corresponding sensor (refer to
In this manner, a completion-determination method for each work stage is selected on the basis of whether the work stage belongs to the detection target stage or the non-detection target stage. In other words, either the sensor or the voice input is used to determine the completion of each work stage (each piece of elemental work), depending on the type to which the work stage belongs.
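The selection logic described above can be sketched as follows (a minimal hypothetical illustration in Python; the stage names, the `sensor_detectable` flag, and the helper function are illustrative and not part of the claimed apparatus):

```python
# Hypothetical sketch of the completion-determination dispatch described
# above. Each work stage is classified in advance as a detection target
# stage (completion detectable by a sensor) or a non-detection target
# stage (completion confirmed by a voice input from the worker).

from dataclasses import dataclass

@dataclass
class WorkStage:
    name: str
    sensor_detectable: bool  # registered in advance by, e.g., the program designer

# The three stages of the paper-jam elimination work.
STAGES = [
    WorkStage("open right door", sensor_detectable=True),
    WorkStage("remove jammed sheet", sensor_detectable=False),
    WorkStage("close right door", sensor_detectable=True),
]

def completion_method(stage: WorkStage) -> str:
    """Select the completion-determination method for the given stage."""
    return "sensor" if stage.sensor_detectable else "voice"

for stage in STAGES:
    print(stage.name, "->", completion_method(stage))
```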
The details of this operation will be described below.
First, at step S11, the MFP 10 starts a guide of the plurality of work stages according to the work to be conducted by the worker at the MFP 10 (work guide).
Specifically, the MFP 10 starts output of the work guidance information 300 (information for guiding the worker through the plurality of work stages). Here, information for guiding the worker through elimination stages for the abnormal event (paper jam) occurring in the MFP 10 (also referred to as elimination-stage information) is output as the work guidance information 300. Here, the work guidance information 300 is subjected to display output (image output) through the touch panel 25 and to voice output through the voice outputter 7b.
More specifically, among a plurality of pieces of work guidance information 301 to 303 (300) regarding the plurality of work stages according to the elimination work of the paper jam (refer to
This arrangement allows the worker to verify the work contents of the first work stage (opening work of the right door 40).
Then, the processing proceeds from step S11 to step S12.
At step S12, the MFP 10 determines whether the current work stage (here, the first work stage) (currently being guided) belongs to the detection target stage or the non-detection target stage. Then, the MFP 10 selects (determines) a completion-determination method for the current work stage on the basis of the type to which the current work stage belongs.
Here, in the MFP 10, whether each work stage belongs to the detection target stage or the non-detection target stage is registered in advance.
For example, the right door 40 (peripheral portion of the right door 40) is provided with, as a state-detection sensor 8, the open/closed detection sensor capable of detecting the final state of the opening work of the right door 40 (the state where the right door 40 is open). Here, the opening work of the right door 40 is registered in advance as belonging to the detection target stage (detection target work). On the basis of the registered content, it is determined at step S12 that the first work stage (opening work of the right door 40) belongs to (is classified into) the detection target stage. Then, the processing proceeds from step S12 to step S13.
At step S13, the MFP 10 waits for a detected result of the state-detection sensor 8 (here, the open/closed detection sensor corresponding to the opening work of the right door 40, among the plurality of state-detection sensors 8). In other words, it is determined whether the detection target stage (opening work of the right door 40) has been completed.
Specifically, the MFP 10 determines whether the work target in the current work stage (detection target stage) has varied to the final state of the current work stage (state after completion of the current work stage). Here, it is determined whether the right door 40 (work target in the first work stage) has varied from the closed state to the open state (final state of the opening work).
For example, when the worker conducts the opening work of the right door 40 in accordance with the first-stage guidance information 301 (
Then, the processing proceeds from step S13 to step S16, and the MFP 10 determines that the current work stage (here, the first work stage) has been completed (step S16).
In this manner, in a case where the current work stage (here, the opening work of the right door 40) belongs to the detection target stage, the completion of the current work stage is determined on the basis of the detected result of the state-detection sensor 8 (in detail, the open/closed detection sensor for the right door 40, among the plurality of state-detection sensors 8).
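The sensor-based determination at steps S13 and S16 can be sketched as a wait on the sensor reaching the final state of the current stage (hypothetical Python; the polling interface and state names are illustrative assumptions):

```python
# Hypothetical sketch of steps S13/S16: poll the sensor until the work
# target reports the final state of the current (detection target)
# stage, then mark the stage complete. `read_sensor` stands in for the
# open/closed detection sensor and is supplied by the caller.

def wait_for_completion(read_sensor, final_state, max_polls=1000):
    """Poll the sensor until the work target reaches its final state."""
    for _ in range(max_polls):
        if read_sensor() == final_state:
            return True  # step S16: stage determined complete
    return False  # sensor never reported the final state

# Simulated sensor: the door is reported open on the third poll,
# as when the worker opens the right door 40.
readings = iter(["closed", "closed", "open"])
completed = wait_for_completion(lambda: next(readings), final_state="open")
print(completed)  # True
```

In an actual apparatus the polling would be event-driven rather than a bounded loop; the bounded loop here simply keeps the sketch self-contained.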
Then, the processing proceeds from step S16 to step S17, and it is determined whether all the work stages have been completed. Here, it is determined at step S17 that uncompleted work stages (the remaining second and third work stages, among the three work stages regarding the elimination work of the paper jam) are left. Then, the processing proceeds from step S17 to step S18.
At step S18, the MFP 10 finishes the display output of the current work guidance information 300 (first-stage guidance information 301) and additionally starts a guide of the next work stage (remaining work stage (here, the second work stage)).
Specifically, among the plurality of pieces of work guidance information 301 to 303 (300) regarding the elimination work of the paper jam, the MFP 10 starts output (display output and voice output) of the work guidance information 302 regarding the second work stage (also referred to as second-stage guidance information). Here, the second work stage includes the work of removing the sheet from the paper-jam portion 50 on the sheet conveyance path in the MFP 10 (sheet removal work) (refer to
Then, for the second work stage, it is similarly determined at step S12 whether the second work stage (sheet removal work) belongs to the detection target stage or the non-detection target stage.
Here, a sensor capable of detecting a paper-jam state is itself provided on the sheet conveyance path in the MFP 10. However, along with the opening work of the right door 40, the sensor for the paper-jam portion 50 on the sheet conveyance path (namely, a portion that the worker's fingers can touch) transitions to a non-conductive state. As a result, during the elimination work of the paper jam, the sensor cannot detect the final state of the sheet removal work (the state where the sheet has been removed from the paper-jam portion 50) (does not perform detection operation). In consideration of this point, the sheet removal work is registered in advance as belonging to the non-detection target stage (non-detection target work). On the basis of the registered content, it is determined at step S12 that the second work stage (sheet removal work) belongs to (is classified into) the non-detection target stage. Then, the processing proceeds from step S12 to step S14.
At step S14, the MFP 10 notifies the worker that a voice input indicating that the current work stage has been conducted is required. Specifically, a voice message, such as “Vocalize “OK” after completion of the work”, is output.
Then, the processing proceeds from step S14 to step S15. The MFP 10 waits for a voice input from the worker (a voice input indicating that the current work stage has been conducted). In other words, it is determined whether the non-detection target stage (sheet removal work) has been completed. Specifically, the MFP 10 determines whether the voice input from the worker indicating that the non-detection target stage (here, the second work stage) has been conducted has been received.
For example, when the worker utters a voice indicating that the sheet removal work has been conducted (“OK”) (refer to
After the voice input from the worker is received, the processing proceeds from step S15 to step S16, and the MFP 10 determines that the second work stage (sheet removal work) has been completed.
In this manner, in a case where the current work stage (here, the sheet removal work) belongs to the non-detection target stage, the completion of the current work stage is determined on the basis of the voice input from the worker indicating that the current work stage has been conducted.
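The voice-based path (steps S14 to S16) can be sketched similarly (hypothetical Python; the recognizer interface and the set of accepted phrases are assumptions, not part of the embodiment):

```python
# Hypothetical sketch of steps S14-S16 for a non-detection target stage:
# prompt the worker, then wait for a voice input indicating that the
# work has been conducted. `next_utterance` stands in for the voice
# inputter (e.g., a microphone plus speech recognition).

COMPLETION_PHRASES = {"ok", "done"}  # assumed accepted phrases

def prompt():
    # Step S14: notify the worker that a voice input is required.
    return 'Vocalize "OK" after completion of the work'

def wait_for_voice(next_utterance, max_utterances=100):
    """Step S15: wait until an utterance matches a completion phrase."""
    for _ in range(max_utterances):
        if next_utterance().strip().lower() in COMPLETION_PHRASES:
            return True  # step S16: stage determined complete
    return False

# Simulated worker: an unrelated remark, then "OK".
utterances = iter(["hmm", "OK"])
print(prompt())
print(wait_for_voice(lambda: next(utterances)))  # True
```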
Then, it is determined at step S17 that an uncompleted work stage (in detail, the third work stage) is left. The processing proceeds from step S17 to step S18.
After that, at step S18, the MFP 10 finishes the display output of the second-stage guidance information 302, and additionally starts a guide of the next work stage (third work stage).
Specifically, among the plurality of pieces of work guidance information 301 to 303 (300) regarding the elimination work of the paper jam, the MFP 10 starts output (display output and voice output) of the work guidance information 303 regarding the third (last) work stage (also referred to as third-stage guidance information). Here, the third work stage includes the work of closing the right door 40 (closing work of the right door 40) (refer to
Then, for the third work stage, it is similarly determined at step S12 whether the third work stage (closing work of the right door 40) belongs to the detection target stage or the non-detection target stage.
Here, similarly to the opening work of the right door 40, the closing work of the right door 40 is registered in advance as belonging to the detection target stage. It is thus determined at step S12 that the third work stage (closing work of the right door 40) belongs to the detection target stage. Then, the processing proceeds from step S12 to step S13.
After that, when the worker conducts the closing work of the right door 40, the state-detection sensor 8 of the MFP 10 (here, the open/closed detection sensor for the right door 40, corresponding to the closing work of the right door 40, among the plurality of state-detection sensors 8) detects that the right door 40 is in the closed state (final state of the closing work). Upon acquisition of the detected result indicating that the right door 40 is in the closed state, it is determined at step S13 that the right door 40 has varied from the open state to the closed state (final state of the closing work).
Then, the processing proceeds from step S13 to step S16, and it is determined that the third work stage has been completed. After that, it is determined at step S17 that all the work stages according to the elimination work of the paper jam have been completed. The processing of
As described above, according to the first embodiment, in a case where the current work stage (e.g., the opening work of the right door 40) belongs to the detection target stage among the plurality of work stages of the elimination work of the paper jam, the completion of the current work stage is automatically determined on the basis of the detected result of the sensor corresponding to the current work stage (here, the open/closed detection sensor for the right door 40) (refer to
Furthermore, according to the first embodiment, the work guidance information 300 (information for guiding the worker to the plurality of work stages according to the work to be conducted by the worker) is subjected to voice output (refer to
Note that, according to the first embodiment, both of the display output (refer to
A second embodiment is a modification of the first embodiment. The differences of the second embodiment from the first embodiment will be mainly described below.
According to the second embodiment, in addition to the operation according to the first embodiment, in a case where it is determined that a worker is confused about the current work stage (the work stage currently being guided), information for guiding the current work stage in more detail (also referred to as detailed guidance information) is subjected to display output on a touch panel 25.
At step S20A and step S20B, display processing of the detailed guidance information is performed.
The operation at step S20A will be described below. After it is determined at step S12 that the current work stage (e.g., the first work stage (opening work of a right door 40)) belongs to a detection target stage, the operation at step S20A is performed while waiting for the detected result of the state-detection sensor 8 (here, an open/closed detection sensor for the right door 40) (step S13).
First, at step S21 (
In a case where the predetermined time has not elapsed after the output of the first-stage guidance information 301 starts, the processing of
After that, when the predetermined time elapses after the output (the display output and the voice output) of the current work guidance information 301 starts without acquisition of the detected result of the open/closed detection sensor for the right door 40, the processing proceeds from step S21 to step S22.
At step S22, the MFP 10 determines that the worker is confused (hesitating) about the current work stage (here, the first work stage (opening work of the right door 40) among the plurality of work stages). In detail, in a case where the final state of the first work stage (opening work of the right door 40) is not detected by the sensor even though the predetermined time (here, 30 seconds) has elapsed after the output of the first-stage guidance information 301 starts, it is determined (estimated) that the worker is confused about the first work stage. Then, the processing proceeds from step S22 to step S23.
At step S23, the MFP 10 performs display output of the detailed guidance information (also referred to as detailed guide information), which is the information for guiding the current work stage (first work stage) in more detail, onto the touch panel 25 (not illustrated). As the detailed guidance information, for example, an animation guide is exemplified that guides the first work stage with an animation in which the work target (right door 40) in the first work stage varies gradually from the state before the work (closed state) to the final state (open state).
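The confusion estimation at steps S21 to S23 amounts to a timeout on the completion event (sensor detection or voice input). A sketch follows (hypothetical Python; the 30-second threshold follows the example above, while the function and parameter names are illustrative assumptions):

```python
# Hypothetical sketch of steps S21-S23: if the completion event
# (sensor detection or voice input) has not occurred within a
# predetermined time after the guidance output starts, estimate that
# the worker is confused and display the detailed guidance information.

CONFUSION_TIMEOUT_SEC = 30  # predetermined time from the example above

def check_confusion(elapsed_sec, completion_event_seen):
    """Return True when the worker is estimated to be confused (step S22)."""
    return (not completion_event_seen) and elapsed_sec >= CONFUSION_TIMEOUT_SEC

# 35 s elapsed with no sensor detection -> show detailed guidance (step S23).
if check_confusion(elapsed_sec=35, completion_event_seen=False):
    print("display detailed guidance (e.g., animation of opening the door)")
```

The same check serves both step S20A (waiting for the sensor) and step S20B (waiting for the voice input); only the source of `completion_event_seen` differs.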
After that, when the worker who has verified the detailed guidance information conducts the first work stage, the processing proceeds from step S13 (
Note that, similarly to step S20A, the subroutine processing of
After it is determined at step S12 that the current work stage (e.g., the second work stage (sheet removal work)) belongs to a non-detection target stage, the operation at step S20B is performed while waiting for the voice input from the worker (step S15).
Specifically, at step S21 (
When the predetermined time elapses after the output of the current work guidance information 302 starts without reception of the voice input from the worker, the processing proceeds from step S21 to step S22, and it is determined that the worker is confused about the second work stage (non-detection target stage) among the plurality of work stages.
Then, the processing proceeds from step S22 to step S23. The detailed guidance information is subjected to display output on the touch panel 25.
Then, when the voice input from the worker (“OK”) is received after the worker who has verified the detailed guidance information conducts the second work stage, the processing proceeds from step S15 (
As described above, according to the second embodiment, in a case where the effect that the worker is confused about the current work stage is determined, the detailed guidance information for a more detailed guide of the current work stage is automatically subjected to display output. Thus, the worker is not required to perform, on an operation screen of the MFP 10, a button operation (e.g., a depression operation to a “detail” button) for performing display output of the detailed guidance information. Therefore, the display output of the detailed guidance information can be performed efficiently.
Note that, here, when the detected result of the state-detection sensor 8 is not acquired (or no voice input is received) before the predetermined time elapses after output of the current work guidance information 300 starts, it is determined that the worker is confused about the current work stage (steps S21 and S22). However, the present embodiment is not limited to this.
For example, when a voice input indicating that the worker is confused about the current work stage is received, it may be determined that the worker is confused about the current work stage. Specifically, when a voice input such as "I am baffled" is received after the current work guidance information 300 is output, it may be determined that the worker is confused about the current work stage.
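The voice-based confusion determination described above could be sketched as a simple utterance classifier. This is only an illustrative sketch; the phrase list and the matching strategy are assumptions, not part of the embodiment.

```python
# Phrases that are treated as expressions of confusion (illustrative only).
CONFUSION_PHRASES = ("i am baffled", "i don't understand", "i do not understand")

def indicates_confusion(utterance: str) -> bool:
    """Return True when the worker's voice input suggests confusion
    about the current work stage."""
    text = utterance.strip().lower()
    return any(phrase in text for phrase in CONFUSION_PHRASES)
```

An actual apparatus would presumably use the speech recognizer's output here; keyword matching is used only to keep the sketch self-contained.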
According to the second embodiment, when it is determined that the worker is confused about the current work stage, the detailed guidance information is displayed (not illustrated). However, the present embodiment is not limited to this. For example, when it is determined that the worker is confused about the current work stage, the current work guidance information 300 may be output by voice again instead of displaying the detailed guidance information. For example, when it is determined that the worker is confused about the first work stage among the plurality of work stages, the first-stage guidance information 301 may be output by voice again (refer to
Furthermore, when it is determined that the worker is confused about the current work stage, the current work guidance information 300 may be output by voice again in addition to displaying the detailed guidance information.
A third embodiment is a modification of the first embodiment. The differences between the third embodiment and the first embodiment are mainly described below.
According to the third embodiment, in addition to the operation according to the first embodiment, when it is determined that work different from the current work stage has been performed at the current work stage, information for restoring the state before that work was performed (the original state) (also referred to as recovery guidance information 500) is output (refer to
First, at step S31 (
Specifically, when a state-detection sensor 8 (in detail, a sensor capable of detecting a variation in the state of an object other than the work target, among the plurality of state-detection sensors 8) detects, at the first work stage, a variation in the state of an object other than the work target in the first work stage, the MFP 10 determines that wrong work has been performed. For example, when the state-detection sensor 8 (the open/closed detection sensor for the front door 60) detects at the first work stage that a door other than the right door 40 (the work target in the first work stage), e.g., the front door 60, is open, it is determined that wrong work has been performed.
Alternatively, when a voice input indicating that wrong work has been performed (e.g., a voice input such as "I made a mistake") is received at the first work stage, the MFP 10 determines that wrong work has been performed.
After it is determined that wrong work has been performed, the processing proceeds from step S31 to step S32, and the MFP 10 outputs (by display output and voice output) the information for restoring the state before the wrong work was performed (recovery guidance information 500). Specifically, the MFP 10 performs display output (
Then, after it is determined that the state before the wrong work was performed has been restored, the MFP 10 outputs the work guidance information 300 regarding the current work stage (here, the first-stage guidance information 301) again. Specifically, when the sensor (the open/closed detection sensor for the front door 60) detects that the wrongly opened front door 60 (a door other than the right door 40) has been closed, or when a voice input indicating that the original state has been restored (e.g., "I have restored the original state") is received, it is determined that the state before the wrong work was performed (the original state) has been restored. Then, the work guidance information 300 regarding the current work stage is output again.
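The third embodiment's flow (steps S31 and S32, followed by re-output of the current guidance) can be outlined as follows. This is a minimal sketch assuming hypothetical sensor-event and output interfaces; the function names, event tuples, and keyword checks are illustrative assumptions, not part of the embodiment.

```python
def classify_event(stage_target, changed_object, utterance):
    """Step S31: decide whether wrong work was performed at the current stage.
    `changed_object` is the object whose state a sensor saw change (or None)."""
    if changed_object is not None and changed_object != stage_target:
        return "wrong_work"                 # a non-target object changed state
    if utterance and "mistake" in utterance.lower():
        return "wrong_work"                 # worker reported the mistake by voice
    return "ok"

def recover(stage_target, events, output):
    """Step S32: output recovery guidance, wait until the original state is
    restored (by sensor or by voice), then re-output the stage guidance."""
    output("recovery_guidance_500")
    for changed_object, state, utterance in events:
        restored_by_sensor = changed_object != stage_target and state == "closed"
        restored_by_voice = bool(utterance) and "restored" in utterance.lower()
        if restored_by_sensor or restored_by_voice:
            output("work_guidance_current_stage")   # re-output current guidance
            return True
    return False
```

For example, wrongly opening the front door during the right-door stage would be classified as wrong work, and closing it again would trigger re-output of the stage guidance.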
As described above, according to the third embodiment, when it is determined that wrong work has been performed at the current work stage, the recovery guidance information 500 is automatically output (refer to
In particular, when the state-detection sensor 8 (here, the open/closed detection sensor for the front door 60) detects a variation in the state of an object (e.g., the front door 60) other than the work target (e.g., the right door 40) in the current work stage, it is determined that wrong work has been performed, and the recovery guidance information 500 is automatically output. Thus, even when the worker wrongly opens the front door 60 at the first work stage (the opening work of the right door 40), in which the right door 40 should be opened, the worker can recognize early that wrong work has been performed. As a result, the worker restores the original state (the state just before the front door 60 was opened) early.
Therefore, when the worker performs work (wrong work) different from the current work stage, the state before the wrong work was performed (the original state) can be restored early. Furthermore, a reduction in the work efficiency of the entire paper-jam elimination work (the plurality of work stages) can be suppressed.
Note that, according to the third embodiment, both the display output (
The embodiments of the present invention have been described above, but the present invention is not limited to the above descriptions.
For example, according to each embodiment, the information for guiding the worker through the stages of eliminating the abnormal event occurring in the MFP 10 is output as the work guidance information 300 (
Similarly to the first embodiment, each piece of processing of
Specifically, first, the work guidance information regarding the first work stage (first-stage guidance information 311) is output (step S11).
Although embodiments of the present invention have been described and illustrated in detail, the disclosed embodiments are made for purposes of illustration and example only and not limitation. The scope of the present invention should be interpreted by terms of the appended claims.
Number | Date | Country | Kind |
---|---|---|---|
2018-078438 | Apr 2018 | JP | national |