This application is based on and claims priority to Japanese Patent Application No. 2023-081543, filed on May 17, 2023, the entire contents of which are incorporated herein by reference.
The present invention relates to information processing. More particularly, the present invention relates to an information processing device, an analysis system, an image forming device, an analysis method, and a program.
When a paper jam occurs in an existing image forming device, the location where the paper jam occurred is identified based on values collected from transport sensors and other sensors, and procedural tasks for removal of the paper jam that are applicable to the location where the paper jam occurred are displayed on an operating panel. However, the operator of the image forming device might not always follow the procedural tasks for removal of the paper jam as presented. Thus, there is a demand to understand why the operator does not follow the presented procedural tasks for removal of the paper jam, so that those tasks may be improved accordingly.
Assuming that an image forming apparatus is provided that performs a variety of operations, patent document 1 discloses a technique that is directed to finding, with ease, points among these operations where improvement is needed. According to the existing technique of patent document 1, an operational characteristics acquisition device includes: an acquirer that acquires an operation log of an image forming apparatus; and an analyzer that analyzes the acquired operation log so as to generate operational characteristics data concerning appearance of an inefficient operation. However, according to the existing technique of patent document 1, although inefficient screen operations can be analyzed from the operation log of the operating panel, operations related to paper jam clearance cannot be analyzed, and therefore inefficient operations in paper jam clearance cannot be identified.
The present disclosure has been prepared in view of the foregoing, and aims to provide an information processing device that can analyze the tasks that the operator of an image forming device performs to remove a paper jam in the image forming device.
In order to achieve the above aim, the present disclosure provides an information processing device having the following characteristics. The information processing device includes a receiving part configured to receive: paper jam occurrence information about a paper jam that has occurred in an image forming device; and sensor output log data associated with one or a plurality of sensors provided in the image forming device to detect paper jam removal tasks performed in response to occurrence of the paper jam. The information processing device includes a memory part configured to store procedural task data, in which one or a plurality of procedural tasks for removing the paper jam are described. The information processing device includes an output control part configured to output a task analysis result obtained based on: the procedural task data associated with the paper jam occurrence information; and the sensor output log data.
Structured thus, it is possible to analyze the tasks that the operator performs when removing a paper jam in the image forming device.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, but the present invention is by no means limited to the embodiments described below. Note that a variety of changes and improvements may be applied to the embodiments according to the present disclosure.
Hereinafter, embodiments of the present invention will be described in greater detail with reference to the accompanying drawings, using a device usage information analysis system including an image forming device and an analysis server as an example. However, the device usage information analysis system according to the following embodiments is by no means limiting.
The analysis server 20 is deployed in the form of a server or personal computer on a cloud infrastructure or on a network within an organization such as a company, or in the form of a virtual machine on such an infrastructure. The analysis server 20 collects, from the image forming devices subject to analysis (one or multiple multi-function devices 30 and one or multiple printers 32), device type identification information that identifies each device's model, and usage information about the use of each device. In particular, the analysis server 20 according to the embodiment described below collects information about the tasks that the user (or the operator) of an image forming device performs for removal of paper jam when a paper jam occurs in the image forming device. The analysis server 20 aggregates the data received from the image forming devices (one or multiple multi-function devices 30 and one or multiple printers 32) subject to analysis, analyzes the status of jobs when the paper jam occurred, and generates analysis results. Here, “removal of paper jam” refers to the act of removing jammed paper or sheets. The analysis server 20 constitutes the information processing device of the present embodiment.
The terminal 90 is a terminal that the user, who is also the analyst, uses, such as a personal computer, a tablet computer, a smartphone, and so forth. The analyst can access the analysis server 20 by operating the terminal 90 and perform a variety of operations, including configuring various settings for analysis, giving instructions to execute analysis, giving instructions to view the result of analysis, and so forth. The terminal 90 can receive the data of the result of analysis from the analysis server 20 and display it on a screen of the terminal 90. Note that, although the following embodiment will be described assuming that the analyst accesses the analysis server 20 through the terminal 90, the embodiment is not particularly limited to this. The analysis server 20 may be equipped with a display, and the analyst may give instructions to execute analysis and view the result of analysis, by directly using the analysis server 20.
The multi-function devices 30 and printers 32 are devices subject to analysis, and are examples of image forming devices according to the present embodiment. The multi-function devices 30 and printers 32 are all equipped with one or multiple sensors for detecting the tasks that the operator performs to remove a paper jam in the event a paper jam occurs. When a paper jam occurs in a multi-function device 30 or a printer 32, the multi-function device 30 or the printer 32 saves data based on outputs of one or multiple sensors (sensor output log data), collected with regard to the paper jam that occurred, from the time the paper jam occurs until it is cleared, and sends the data to the analysis server 20 in a timely manner.
Although
The network 12 may include a local area network (LAN), a wide area network (WAN), a public line network such as the Internet, a mobile communication network such as 4G or 5G, wireless communication for drones, or a combination of these. For example, in the event a manufacturer wants to analyze information about the use of devices that the manufacturer has made, the network 12 may include the manufacturer's network, the Internet, and the network in which the equipment is installed.
Hereinafter, before the device usage information analysis system 10 according to the present embodiment is described in detail, the hardware structure of each device constituting the system 10 will be explained.
Among these, the controller 110 includes: a CPU 101, a system memory (MEM-P) 102, a north bridge (NB) 103, a south bridge (SB) 104, and an application-specific integrated circuit (ASIC) 106, which are principal parts of a computer; and a local memory (MEM-C) 107, an HDD controller 108, and an HDD 109, which are memory parts, and is structured such that an accelerated graphics port (AGP) bus 121 connects the NB 103 and the ASIC 106.
Among these, the CPU 101 is a control part that has control over the entire multi-function device 30. The NB 103 is a bridge for connecting the CPU 101 with the MEM-P 102, the SB 104, and the AGP bus 121, and has a memory controller that controls reading and writing from and to the MEM-P 102, a peripheral component interconnect (PCI) master, and an AGP target.
The MEM-P 102 consists of a ROM 102a, which is a memory for storing programs and data that implement the functions of the controller 110, and a RAM 102b, which is used as memory for loading programs and data, a graphics memory for memory printing, and the like. Note that the programs stored in the RAM 102b may be provided as installable or executable files recorded on a computer-readable recording medium such as a CD-ROM, CD-R, or DVD.
The SB 104 is a bridge for connecting the NB 103 with PCI devices and peripheral devices. The ASIC 106 is an integrated circuit (IC) for image processing that includes hardware elements for image processing, and functions as a bridge that connects the AGP bus 121, a PCI bus 122, the HDD 109, and the MEM-C 107. This ASIC 106 consists of: a PCI target and an AGP master; an arbiter (ARB), which is the core of the ASIC 106; a memory controller that controls the MEM-C 107; multiple direct memory access controllers (DMACs) that rotate image data using hardware logic and the like; and a PCI unit that transfers data between a scanner part 131 and a printer part 132 via the PCI bus 122. A universal serial bus (USB) interface and/or an IEEE 1394 (Institute of Electrical and Electronics Engineers 1394) interface may be connected to the ASIC 106.
The MEM-C 107 is a local memory used as an image buffer for copying and a code buffer. The HDD 109 is a storage for storing image data, font data used for printing, and forms. The HDD controller 108 controls reading and writing of data from and to the HDD 109 under the control of the CPU 101. The AGP bus 121 is a bus interface for a graphics accelerator card proposed to speed up graphics processing. The AGP bus 121 is allowed direct access to the MEM-P 102 at high throughput, thereby increasing the speed of the graphics accelerator card.
Also, the near-field communication circuit 120 has a near-field communication antenna 120a. The near-field communication circuit 120 is, for example, a communication circuit for NFC or Bluetooth (registered trademark).
Furthermore, the engine control part 130 includes the scanner part 131 and the printer part 132. The operating panel 140 includes a panel display part 140a such as a touch panel that displays, for example, current setting values, a selection screen, and so forth, and receives inputs from the user, and a panel 140b including a numeric keypad for receiving setting values of conditions related to image formation such as density settings and a start key for receiving a copy start instruction. In the multi-function device 30 according to the present embodiment, the operating panel 140 constitutes an output device for outputting guidance including instructions for paper jam removal tasks in response to the occurrence of a paper jam. To be more specific, the operating panel 140 constitutes a display device for displaying guidance. The controller 110 controls the entire multi-function device 30, and controls, for example, graphics, communications, and inputs from the operating panel 140. The scanner part 131 and the printer part 132 each include an image processing part for error diffusion, gamma conversion, and so forth.
Note that an application switching key of the operating panel 140 enables the user to sequentially switch and select a document box function, a copy function, a printer function, and a facsimile function of the multi-function device 30. The multi-function device 30 transitions to a document box mode when the document box function is selected, to a copy mode when the copy function is selected, to a printer mode when the printer function is selected, and to a facsimile mode when the facsimile function is selected.
The network I/F 150 is an interface for data communications using the communication network. The near-field communication circuit 120 and the network I/F 150 are electrically connected to the ASIC 106 via the PCI bus 122.
Note that the hardware structure illustrated in
The scanner part 131 may include, for example, a document glass and an automatic document feeding device, and may have a sensor for detecting document size. The structure of the printer part 132 is not particularly limited either, and any type of printer such as an electrophotographic printer or an inkjet printer may be used.
Also, although the hardware structure of the multi-function device 30 has been described, a similar or the same hardware structure can be applied to the printer 32 by adding or removing components as appropriate (for example, the scanner part may be removed).
As illustrated in
Among these, the CPU 201 controls operations of the entire computer 200. The ROM 202 stores a program such as an initial program loader (IPL) for driving the CPU 201. The RAM 203 is used as a work area for the CPU 201. The HDD 204 stores various data such as programs. The HDD controller 205 controls reading and writing of various data from and to the HDD 204 under the control of the CPU 201. The display 206 displays various information such as a cursor, menus, windows, characters, and images. The external device connecting I/F 208 is an interface for connecting various external devices. Examples of external devices in this case include a universal serial bus (USB) memory, a printer, and so forth. The network I/F 209 is an interface for data communication using a communication network. The data bus 210 is, for example, an address bus or a data bus for electrically connecting components such as the CPU 201 illustrated in
The keyboard 211 is an example of an input means having multiple keys for entering characters, numerical values, and various instructions. The pointing device 212 is an example of an input unit for selecting and executing various instructions, selecting an object, and moving a cursor. The DVD-RW drive 214 controls reading and writing of various types of data from and to a DVD-RW 213, which is an example of a removable recording medium. Note that the DVD-RW drive 214 may support not only a DVD-RW but may also support other recording media such as a DVD-R, and no particular limitation applies. The media I/F 216 controls reading and writing (storing) of data from and to a recording medium 215 such as a flash memory.
The overall structure of the device usage information analysis system 10 of the present embodiment and the hardware structure of each device constituting the device usage information analysis system 10 have been described above. Following this, the data structure 300 used in the device usage information analysis system 10 will be described below with reference to the block diagram illustrated in
First, the data stored on the image forming device 30 for analysis will be explained with reference to
The sensor output log data 312 is data associated with one or multiple sensors provided in the image forming device 30 for detecting paper jam removal tasks that are suitable for a paper jam that occurred. More preferably, the sensor output log data 312 is not the one or multiple raw sensor data as collected from the one or multiple sensors. Instead, the sensor output log data 312 is, or includes, task log data that is obtained by converting the one or multiple raw sensor data collected from the one or multiple sensors, respectively, and shows each task that the user performed on the image forming device 30 (for example, opening the cover, opening the tray, etc.). Information that shows a task includes, for example, the type of a corresponding sensor, the output value of the sensor, and the time at which the output value changed. For example, an event having a predetermined timestamp indicating that the output value of the left/right cover open/close sensor changed from “OFF” to “ON” may be recorded.
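For illustration only, one way such a converted task-log entry could be represented is sketched below in Python; the class and field names are hypothetical and are not part of the disclosure, which only requires that the sensor type, the output value, and the time of the change be recorded.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class TaskLogEntry:
    """One converted sensor event, e.g. 'left/right cover opened'.

    Field names are illustrative only; the embodiment merely requires the
    sensor type, its output value, and the time at which the value changed.
    """
    sensor: str          # e.g. "left_right_cover_open_close_sensor"
    old_value: str       # e.g. "OFF"
    new_value: str       # e.g. "ON"
    timestamp: datetime  # time at which the output value changed

# Example: the left/right cover open/close sensor changing from OFF to ON.
entry = TaskLogEntry(
    sensor="left_right_cover_open_close_sensor",
    old_value="OFF",
    new_value="ON",
    timestamp=datetime(2023, 5, 17, 10, 15, 42),
)
print(entry)
```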
The paper jam occurrence log data 314 includes paper jam occurrence information about paper jams that have occurred in the image forming device 30. The paper jam occurrence information includes information about the time (for example, date and time) each paper jam occurred, and information (for example, paper jam code) that indicates one or both of the type of the paper jam, and the location where the paper jam occurred.
The screen operation log data 316 includes data that indicates each operation performed by the user via an input device (software keys on the touch screen of the operating panel 140 and hardware keys) included in the image forming device 30. Each operation corresponds to guidance that is output from the output device (for example, operating panel 140) included in the image forming device 30, and that includes instructions on tasks for removing the paper jam. The information to indicate each operation may include information identifying the screen, information about the button operated, information about the time the operation was performed, and so forth.
The job detail data 318 includes data that indicates, for example, the details of the job that was in progress when a paper jam occurred in the image forming device 30. The details of a job may include information such as setting items applied to the job (such as printing side, which specifies whether the job is executed in single-sided printing or double-sided printing, paper type, paper size, etc.), job settings details showing the values of these setting items, and information about the time the job was executed.
Also illustrated in
The collected data 352 includes sensor output log data 352a, paper jam occurrence log data 352b, screen operation log data 352c, and job detail data 352d, collected from one or multiple image forming devices 30 (30a to 30c) described above. These data are stored in each image forming device 30 when a paper jam occurs, transmitted from each image forming device 30 to the analysis server 20 at a predetermined timing (which may be scheduled, for example, as once a day), and stored therein.
The paper jam removal procedural task data 354 is defined per paper jam code, and describes one task or a series of procedural tasks that the operator should perform when a paper jam of a predetermined paper jam code occurs, in order to remove the paper jam. The procedural tasks that are determined from the sensor output log data 352a and screen operation log data 352c pertaining to the occurrence of a predetermined paper jam and that the operator actually performed (tasks performed), and one or more procedural tasks that are set forth in the paper jam removal procedural task data 354 and that the operator should perform (tasks required) are compared against each other, so that whether the tasks for removal of paper jam were performed appropriately can be determined.
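A minimal sketch of such a comparison is given below, under the assumption that both the tasks required and the tasks performed can be reduced to ordered lists of task identifiers; the function name, data shapes, and comparison rule are illustrative assumptions rather than the disclosed method.

```python
def compare_tasks(required: list[str], performed: list[str]) -> dict:
    """Compare required procedural tasks against tasks actually performed.

    Returns which required tasks were skipped and whether the required tasks
    that do appear were performed in the prescribed order. The exact
    comparison rule is an assumption made for illustration.
    """
    skipped = [task for task in required if task not in performed]
    # First occurrence index of each required task in the performed sequence.
    order = [performed.index(task) for task in required if task in performed]
    in_order = order == sorted(order)
    return {"skipped": skipped, "in_order": in_order}

# Hypothetical example for one paper jam occurrence.
required_tasks = ["open_right_cover", "remove_paper", "close_right_cover"]
performed_tasks = ["open_right_cover", "close_right_cover",
                   "open_right_cover", "remove_paper", "close_right_cover"]
print(compare_tasks(required_tasks, performed_tasks))
# -> {'skipped': [], 'in_order': False}: the cover was first closed before
#    the paper was removed, which counts as a procedural error.
```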
The unfinished analysis data 356 is incomplete data that is temporarily stored while the collected data 352 is being analyzed by the analysis program 360. The analysis result data 358 is generated as a result of analysis by the analysis program 360. The analysis result data 358 may be tabular data that holds analysis results as numerical values, or the analysis result data 358 may be converted into a graph format so that analysis results are presented in the form of graphs.
Following this, a functional structure 400 of the device usage information analysis system 10 will be described below with reference to the function block diagram illustrated in
First, the function block 410 on the image forming device 30 will be explained with reference to
The sensor/input device block 412 includes: one or multiple sensors 412a to 412e for detecting, upon the occurrence of a paper jam, tasks performed for removal of the paper jam; and a touch panel 412f and hardware keys 412g, which function as input devices that accept input operations according to instructions on paper jam removal tasks in response to the occurrence of the paper jam. One or multiple sensors 412a to 412e include at least one sensor that is selected from the group consisting of: one or multiple cover open/close sensors; one or multiple tray open/close sensors; and one or multiple transport path paper detection sensors. In the embodiment illustrated in
The conversion part 414 converts one or multiple raw sensor data output from one or multiple sensors 412a to 412e into task log data. The task log data corresponds to the items to be compared against the paper jam removal procedural task data 354, and shows each task that has been performed on the image forming device 30. The conversion part 414 further converts the data input from each input device (touch panel 412f and hardware keys 412g) into screen operation log data 316. The screen operation log data 316 corresponds to the items to be compared against the paper jam removal procedural task data 354, and shows each screen operation that has been performed on the image forming device 30. Instead of one or multiple raw sensor data, the task log data obtained through conversion by the conversion part 414 may be saved and transmitted as sensor output log data. By this means, the data capacity required for retaining log data for analyzing paper jam removal tasks, as well as the transmission traffic, can be reduced significantly.
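The following is a rough sketch of this kind of conversion, assuming the raw sensor data is a periodically sampled time series and that a task event is emitted only when a sensor value changes; all names and data shapes here are assumptions made for illustration.

```python
def to_task_log(raw_samples):
    """Convert raw, periodically sampled sensor readings into task events.

    raw_samples: iterable of (timestamp, sensor_name, value) tuples.
    Returns one event per value change, which is far smaller than the raw
    stream and is what would be saved and transmitted as sensor output log data.
    """
    last_value = {}   # last known value per sensor (None before first sample)
    events = []
    for timestamp, sensor, value in raw_samples:
        if last_value.get(sensor) != value:
            events.append({
                "time": timestamp,
                "sensor": sensor,
                "old": last_value.get(sensor),
                "new": value,
            })
            last_value[sensor] = value
    return events

# Hypothetical raw samples: the right cover is opened at t=2 and closed at t=9.
raw = [(0, "right_cover", "CLOSED"), (1, "right_cover", "CLOSED"),
       (2, "right_cover", "OPEN"),   (9, "right_cover", "CLOSED")]
for event in to_task_log(raw):
    print(event)
```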
When a paper jam occurs, the save part 416 saves information about the jam (for example, its type/location, the time of occurrence, etc.) as paper jam occurrence log data 314. When a paper jam occurs, the save part 416 also stores sensor output log data 312 (task log data), which spans from the time the paper jam occurs to the time the paper jam is removed, in the memory area, in association with paper jam occurrence information. Furthermore, when a paper jam occurs, the save part 416 saves screen operation log data 316. The screen operation log data 316, also spanning from the time the paper jam occurs to the time the paper jam is removed, shows each operation that is performed via the touch panel 412f and hardware keys 412g. The save part 416 further saves job detail data 318, which shows the details of the job that is being executed when the paper jam occurs.
The transmission part 418 transmits the paper jam occurrence log data 314, sensor output log data 312, screen operation log data 316, and job detail data 318, saved in the save part 416 in response to the occurrence of a paper jam, to the analysis server 20, at appropriate timings. The timing for transmitting these data may be scheduled as appropriate, such as every hour, once a day, once every two days, once a week, when the device is powered on, when the device is shut down, when the remaining capacity falls below a certain threshold, and so forth.
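As a hedged illustration of such a transmission trigger (the interval, the threshold, and the function name are assumptions, not part of the disclosure), the decision to transmit could combine a schedule with a remaining-capacity check:

```python
from datetime import datetime, timedelta

SEND_INTERVAL = timedelta(days=1)   # e.g. transmit once a day
CAPACITY_THRESHOLD = 0.10           # e.g. transmit when <10% of log space remains

def should_transmit(last_sent: datetime, now: datetime,
                    remaining_capacity_ratio: float) -> bool:
    """Return True if the saved log data should be sent to the analysis server."""
    scheduled = (now - last_sent) >= SEND_INTERVAL
    low_capacity = remaining_capacity_ratio < CAPACITY_THRESHOLD
    return scheduled or low_capacity

# Example: last upload was 30 hours ago and plenty of space remains -> transmit.
print(should_transmit(datetime(2023, 5, 16, 4, 0),
                      datetime(2023, 5, 17, 10, 0), 0.8))
```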
Continuing with reference to
The receiving part 452 receives the paper jam occurrence log data 314, sensor output log data 312, screen operation log data 316, and job detail data 318 transmitted from one or multiple image forming devices 30, and stores them in the collected data storage part 454. Note that the details of the data stored in the collected data storage part 454 are omitted in
In response to instructions from the user/analyst, the analysis part 456 analyzes the sensor output log data 352a (and, preferably, also the screen operation log data 352c) about one or multiple image forming devices 30, stored in the collected data storage part 454. To be more specific, the analysis part 456 compares the paper jam removal procedural task data 354, corresponding to respective paper jam occurrence information, against the received sensor output log data 352a (and, preferably, the screen operation log data 352c), thereby generating a task analysis result. The task analysis result here may show statistical data of one or both of procedural errors that occur, and the time that is required, per task or per multiple tasks for removal of a predetermined paper jam in multiple image forming devices 30. The analysis part 456 also analyzes the job detail data 352d and paper jam occurrence log data 352b for the multiple image forming devices 30, and generates an aggregated result. Here, the aggregated result may show statistics of occurrence of paper jams per item of job settings. The analysis part 456 may analyze the screen operation log data 352c together with the job detail data 352d and paper jam occurrence log data 352b. Note that the details of the analysis results and aggregated result will be described later.
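The statistics described above could, as one possible sketch, be computed from per-task analysis records with an off-the-shelf data analysis library; the record layout, column names, and use of pandas are illustrative assumptions rather than the disclosed implementation.

```python
import pandas as pd

# Hypothetical per-task analysis records collected from multiple devices,
# one row per (jam occurrence, task); the column names are assumptions.
records = pd.DataFrame([
    {"jam_code": "J101", "task": "open_right_cover",  "error": False, "seconds": 4.0},
    {"jam_code": "J101", "task": "remove_paper",      "error": True,  "seconds": 35.0},
    {"jam_code": "J101", "task": "close_right_cover", "error": False, "seconds": 3.0},
    {"jam_code": "J101", "task": "remove_paper",      "error": False, "seconds": 12.0},
])

# Per task: how often a procedural error occurred and how long the task took.
task_stats = records.groupby("task").agg(
    error_rate=("error", "mean"),
    mean_seconds=("seconds", "mean"),
    occurrences=("error", "size"),
)
print(task_stats)

# Per item of job settings: how many jams occurred (from job detail data).
jobs = pd.DataFrame([
    {"printing_side": "double", "paper_size": "A4", "jam_code": "J101"},
    {"printing_side": "double", "paper_size": "A3", "jam_code": "J101"},
    {"printing_side": "single", "paper_size": "A4", "jam_code": "J205"},
])
print(jobs.groupby(["printing_side", "paper_size"]).size())
```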
The memory part 458 stores the paper jam removal procedural task data 354a to 354z described above, for each predetermined paper jam (each paper jam being identified by a unique paper jam code). The memory part 458 also stores unfinished analysis data 356 and analysis result data 358, generated by the analysis part 456.
The output control part 460 exerts control such that a task analysis result is prepared and output (for example, sent out, displayed, etc.) based on paper jam removal procedural task data 354 associated with paper jam occurrence information, and sensor output log data 352a. The output control part 460 may be provided by a web server function in a particular embodiment. The output control part 460 outputs a response (for example, an HTTP response), which describes the display screen, in response to a request (for example, an HTTP request) from the browser 492 on the terminal 90. This allows the analyst to view the analysis results on the browser 492.
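Where the output control part 460 is provided as a web server function, it could, as a rough sketch, return stored analysis result data in response to a browser request; Flask is used here purely as an illustrative choice, and the URL layout and result format are assumptions not taken from the disclosure.

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for analysis result data held in the memory part; the layout of
# this dictionary is a hypothetical example, not taken from the disclosure.
ANALYSIS_RESULTS = {
    "J101": {"open_right_cover": {"error_rate": 0.0, "mean_seconds": 4.0},
             "remove_paper":     {"error_rate": 0.5, "mean_seconds": 23.5}},
}

@app.route("/analysis/<jam_code>")
def get_analysis(jam_code: str):
    """Return the task analysis result for one paper jam code."""
    result = ANALYSIS_RESULTS.get(jam_code)
    if result is None:
        return jsonify({"error": "unknown jam code"}), 404
    return jsonify(result)

if __name__ == "__main__":
    app.run()  # the analyst's browser 492 would then request /analysis/J101
```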
Note that, with the embodiment illustrated in
Note that each component illustrated in
In step S103, the processor saves the job detail data at the time the paper jam occurred. In step S104, the processor saves the paper jam occurrence log data, including the date and time at which the paper jam occurred, and the paper jam code. In step S105, the processor starts saving the sensor values and screen operation log. By this means, changes in sensor values after a paper jam occurs and the times at which these changes occur are saved. Step S105 and subsequent steps correspond to the procedural tasks of paper jam removal that correspond to the location where the paper jam occurred.
In a period after the sensor values and screen operation log start being saved (for example, in step S106), the processor records changes in a predetermined sensor value (1). Subsequently, in accordance with the procedural tasks for removal of the paper jam, changes in another sensor value are recorded. In step S107, the processor records changes in yet another sensor value (N).
To provide an additional explanation on the tasks for removal of paper jam described above, the user may perform these tasks in different ways, depending on the model of the device. In the first group of models, upon execution of paper jam removal tasks, whether or not the user has performed a given task in removal of the paper jam in the proper order is determined on the image forming device 30 side as well, based on outputs of sensors. Depending on the result of this determination, the procedural task to be displayed on the operating panel 140 proceeds to the next step.
On the other hand, when the device belongs to a different group of models, upon execution of paper jam removal tasks, the above flow linked to sensors is not executed. Instead, for example, a press of a button (a software key, a hardware key, etc.) for proceeding to the next step, displayed on the paper jam removal guidance screen 500, is detected, whereupon the guidance on procedural tasks on the paper jam removal guidance screen 500 of the operating panel 140 moves on to the next step. In this case, screen operation information, including information that identifies the screen being displayed, information about the button pressed on the screen, and the date and time, is recorded as screen operation log data 316.
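The two ways of advancing the guidance described above can be sketched as follows; the function, field, and event names are assumptions for illustration. In the first group of models the next step is shown once the expected sensor output is observed, while in the second group it is shown once the on-screen "next" button press is detected and recorded as screen operation log data.

```python
def advance_guidance(step, event, sensor_linked: bool, screen_log: list) -> bool:
    """Decide whether the jam removal guidance should move to the next step.

    step:  dict describing the current step, e.g. its expected sensor event
           and the id of its 'next' button (illustrative layout).
    event: dict describing what just happened (a sensor change or a key press).
    """
    if sensor_linked:
        # First group of models: advance when the expected sensor output occurs.
        return (event["type"] == "sensor"
                and event["sensor"] == step["expected_sensor"]
                and event["value"] == step["expected_value"])
    # Second group of models: advance when the on-screen 'next' button is
    # pressed, and record the press as screen operation log data.
    if event["type"] == "key" and event["button"] == step["next_button"]:
        screen_log.append({"screen": step["screen_id"],
                           "button": event["button"],
                           "time": event["time"]})
        return True
    return False

# Example: button-driven model, the 'next' button is pressed.
log = []
step = {"screen_id": "jam_guide_1", "expected_sensor": "right_cover",
        "expected_value": "OPEN", "next_button": "next"}
print(advance_guidance(step, {"type": "key", "button": "next", "time": 0},
                       sensor_linked=False, screen_log=log), log)
```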
Referring again to
Hereinafter, with reference to
In step S301, the processor of the analysis server 20 selects, from among the datasets collected in the collected data storage part 454, a set of data (a set of interest) pertaining to one occurrence of paper jam that is going to be analyzed. In step S302, through the analysis part 456, the processor obtains the sensor output log data 352a, paper jam occurrence log data 352b, screen operation log data 352c, and job detail data 352d pertaining to the set of interest. In step S303, through the analysis part 456, the processor reads and obtains, for the set of interest, the paper jam removal procedural task data 354, which describes one or multiple procedural tasks for removal of the paper jam, based on the paper jam code included in the paper jam occurrence log data.
In step S304, through the analysis part 456, the processor generates a paper jam removal task analysis table, for the set of interest, from the paper jam occurrence log data 352b, the sensor output log data 352a, and the screen operation log data 352c. To be more specific, in step S304, the analysis part 456 compares the paper jam removal procedural task data 354 according to the paper jam code against the sensor output log data 352a and the screen operation log data 352c. The contents of the paper jam removal task analysis table are thus filled in and held as unfinished analysis data 356.
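For illustration only (the actual table layout is shown in the drawing and is not reproduced here), a filled-in paper jam removal task analysis table for one set of interest might hold one row per required procedural task, recording when the corresponding sensor or screen event was observed, how long the step took, and whether it occurred out of order; the column names below are assumptions.

```python
# Hypothetical rows of a paper jam removal task analysis table for one
# occurrence of a given paper jam code; the column names are assumptions.
analysis_table = [
    {"step": 1, "required_task": "open_right_cover",
     "observed_at": "10:15:42", "seconds": 4.0, "out_of_order": False},
    {"step": 2, "required_task": "remove_paper",
     "observed_at": "10:16:20", "seconds": 38.0, "out_of_order": False},
    {"step": 3, "required_task": "close_right_cover",
     "observed_at": "10:16:05", "seconds": 3.0, "out_of_order": True},
]
for row in analysis_table:
    print(row)
```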
Note that
Referring again to
Referring back to
In step S307, the processor aggregates the paper jam removal task analysis tables of all sets (for example, per specific paper jam code) and generates task analysis results. The task analysis results here may be generated in a graph format, for example.
Note that the task analysis results shown in
Referring back to
Note that the aggregated results illustrated in
In step S309, this process ends.
Thus, according to the embodiment described above, it is possible to provide an information processing device, an analysis system, an image forming device, an analysis method, and a program, whereby, for example, the tasks that the operator of an image forming device performs to remove a paper jam in the image forming device can be analyzed.
In particular, by using, as reference, task analysis results determined based on procedural task data associated with paper jam occurrence information and based on sensor output log data, inefficient parts in jam removal operations can be estimated, so that the person designing the image forming devices can identify points where improvement is needed. More specifically, by calculating statistics on the number of occurrences of procedural errors, it is possible to understand in which tasks procedural errors occur. Also, by calculating the statistics of the time required for each step, it is possible to understand how much time is required for each step from the time a paper jam occurs to the time the paper jam is removed. Furthermore, by referencing aggregated results prepared based on paper jam occurrence log data and job detail data, it is possible to know in what job settings (paper type, paper size, printing side, etc.) paper jams tend to occur.
Each function of the embodiment described above can be implemented by one or more processing circuits. “Processing circuit” as used herein refers to a processor programmed to perform a variety of functions by software, such as a processor implemented by an electronic circuit, or refers to a device designed to perform the functions described herein, such as an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a field-programmable gate array (FPGA), and other existing circuit modules.
The group of devices described in the above embodiment represents only one of multiple computing environments for implementing the embodiment disclosed herein. In one embodiment, the analysis server may include multiple computing devices such as a computer cluster. The computing devices may be configured to communicate with each other via a communication link of choice, including a network, shared memory, and so forth, to perform the processes disclosed herein.
In addition, the terminal, analysis server, and image forming device herein can be configured to share the processing steps disclosed herein (for example, those of
Examples of the present invention include the following:
<1> An information processing device including: a receiving part configured to receive: paper jam occurrence information about a paper jam that has occurred in an image forming device; and sensor output log data associated with one or a plurality of sensors provided in the image forming device to detect paper jam removal tasks performed in response to occurrence of the paper jam; a memory part configured to store procedural task data, in which one or a plurality of procedural tasks for removing the paper jam are described; and an output control part configured to output a task analysis result obtained based on: the procedural task data associated with the paper jam occurrence information; and the sensor output log data.
<2> The information processing device according to <1>, in which the sensor output log data includes task log data that is obtained by converting one or a plurality of raw sensor data collected from the one or plurality of sensors such that the task log data indicates each task performed on the image forming device.
<3> The information processing device according to <1> or <2>, in which:
the receiving part is further configured to receive screen operation log data that indicates each operation performed on an input device of the image forming device according to guidance, including instructions on tasks for removing the paper jam, the guidance being output via an output device of the image forming device; and
the task analysis result is further obtained based on: the procedural task data associated with the paper jam occurrence information; and the screen operation log data.
<4> The information processing device according to any one of <1> to <3>, in which:
the information processing device makes a plurality of image forming devices subject to analysis;
the information processing device further includes an analysis part configured to generate the task analysis result for the plurality of image forming devices by comparing the procedural task data associated with the paper jam occurrence information, against the sensor output log data, for each of the plurality of image forming devices; and
the task analysis result shows statistics of at least one of occurrence of procedural errors, or time that is required, per task or per predetermined number of tasks for removal of a predetermined paper jam in the plurality of image forming devices.
<5> The information processing device according to any one of <1> to <4>, in which:
the information processing device makes a plurality of image forming devices subject to analysis;
the receiving part is further configured to receive job detail data that indicates details of a job being executed upon the occurrence of the paper jam in the image forming device;
the information processing device further includes an analysis part configured to analyze the job detail data and the paper jam occurrence information for each of the plurality of image forming devices and generate an aggregated result; and
the aggregated result shows statistics of occurrence of jams per item of job settings.
<6> The information processing device according to any one of <1> to <5>, in which:
the paper jam occurrence information includes at least one of type or location of the paper jam, and time of the occurrence of the paper jam; and
the sensor output log data includes information collected from at least one sensor selected from the group of sensors of the image forming device, consisting of: one or a plurality of cover open/close sensors; one or a plurality of tray open/close sensors; and one or a plurality of transport path paper detection sensors.
<7> The information processing device according to any one of <1> to <6>, in which the information processing device is one of the image forming device or a server connected to the image forming device via a network.
<8> An analysis system including:
an image forming device with one or a plurality of sensors that are configured to detect paper jam removal tasks performed in response to occurrence of a paper jam;
a receiving part configured to receive: paper jam occurrence information about the paper jam that has occurred; and sensor output log data associated with the one or plurality of sensors;
a memory part configured to store procedural task data, in which one or a plurality of procedural tasks for removing the paper jam are described; and
an output control part configured to output a task analysis result obtained based on: the procedural task data associated with the paper jam occurrence information; and sensor output log data.
<9> The analysis system according to <8>, further including a conversion part configured to convert one or a plurality of raw sensor data collected from the one or plurality of sensors into task log data that indicates each task performed on the image forming device, in which the sensor output log data includes the task log data.
<10> An image forming device including:
one or a plurality of sensors configured to detect paper jam removal tasks performed in response to occurrence of a paper jam;
a conversion part configured to convert one or a plurality of raw sensor data collected from the one or plurality of sensors, into task log data that indicates each task performed on the image forming device and that corresponds to items to be compared against procedural task data, in which one or a plurality of procedural tasks for removing the paper jam are described;
a save part configured to save the task log data from a time the paper jam occurs until the paper jam is removed; and
a transmission part configured to transmit, to an information processing device, sensor output log data including the saved task log data.
<11> The image forming device according to <10>, further including:
an output device configured to output guidance providing instructions on tasks for removal of the paper jam;
an input device configured to accept an input corresponding to the guidance, and
the save part is further configured to save screen operation log data that indicates each operation performed via the input device, and the transmission part is further configured to transmit the screen operation log data.
<12> An analysis method that causes a computer system to:
receive paper jam occurrence information about a paper jam that has occurred in an image forming device;
receive sensor output log data associated with one or a plurality of sensors provided in the image forming device to detect paper jam removal tasks performed in response to occurrence of the paper jam;
read procedural task data, in which one or a plurality of procedural tasks for removing the paper jam are described; and
output a task analysis result obtained based on: the procedural task data associated with the paper jam occurrence information; and the sensor output log data.
<13> A program for causing a computer to function as:
a receiving part configured to receive: paper jam occurrence information about a paper jam that has occurred in an image forming device; and sensor output log data associated with one or a plurality of sensors provided in the image forming device to detect paper jam removal tasks performed in response to occurrence of the paper jam;
a memory part configured to store procedural task data, in which one or a plurality of procedural tasks for removing the paper jam are described; and
an output control part configured to output a task analysis result obtained based on: the procedural task data associated with the paper jam occurrence information; and the sensor output log data.
<14> A computer-readable non-transitory recording medium storing therein a program that, when executed on a computer, causes the computer to perform the analysis method of <12>.
Although the present disclosure has been described above based on an embodiment, the present disclosure is by no means limited to the requirements shown in the above embodiment. These requirements can be changed in a variety of ways within the scope of the present disclosure, and can be determined as appropriate according to the mode of implementation.