This application is based on and claims priority under 35 USC 119 from Japanese Patent Application No. 2021-054121 filed Mar. 26, 2021.
The present disclosure relates to an information processing apparatus, a non-transitory computer readable medium storing an information processing program, and an information processing method.
JP2012-150707A discloses an approval support program. The approval support program causes a display device to display an electronic document associated with approval information including information for identifying an approver, and acquires position information on the electronic document specified by an operator. In a case where the position information is included in an input range of the approval information on the electronic document, the approval support program acquires organization information from a storage unit, and causes the display device to display the acquired organization information in a display form in which the approver and persons other than the approver are visually distinguishable. The storage unit stores, in association with each other, the information for identifying the approver and organization information including the approver and persons other than the approver who belong to the same organization as the approver.
As a system for approving a form, for example, there is a system that displays the contents of the entire form and enables approval by an approver based on the displayed contents. However, when the entire form is confirmed by looking at it uniformly, items that have to be confirmed may be overlooked.
Aspects of non-limiting embodiments of the present disclosure relate to an information processing apparatus and a non-transitory computer readable medium storing an information processing program that can recognize the presence of a predetermined item in a form.
Aspects of certain non-limiting embodiments of the present disclosure overcome the above disadvantages and/or other disadvantages not described above. However, aspects of the non-limiting embodiments are not required to overcome the disadvantages described above, and aspects of the non-limiting embodiments of the present disclosure may not overcome any of the disadvantages described above.
According to an aspect of the present disclosure, there is provided an information processing apparatus including a processor configured to acquire text string information corresponding to a predetermined item in a digitized form and position information of the text string information, put the acquired text string information into an unrecognizable state in a display of the form by using the acquired position information for the item, and change the unrecognizable state to a recognizable state by performing a predetermined operation on the item.
Exemplary embodiment(s) of the present invention will be described in detail based on the following figures, wherein:
Hereinafter, an example of an exemplary embodiment of the present disclosure will be described with reference to the drawings. The same reference signs are given to the same or equivalent components and parts in each drawing. In addition, the dimensional ratios in the drawings are exaggerated for convenience of description and may differ from the actual ratios.
First, the outline of the premise of this exemplary embodiment will be described. As shown in the above problem, there is a problem regarding the approval of forms. In a form process until the form is approved, the general flow is that a drafter creates the form, and an approver approves the form and stamps a seal. In the case of an electronic form, it is common to stamp an electronic seal. In such a form process, there are cases where an approver cannot confirm the form accurately, such as a case of approving a large number of types of forms or a case where the deadline of the approval process is short.
In response to such a problem, as a mechanism for accurately confirming the contents of the form, there is a mechanism that requests confirmation of the displayed contents, as employed in documents such as contracts.
Under such circumstances, the inventor of the technology of the present disclosure has found that, while it takes time to confirm all the items on a form, there is a need to have crucial items confirmed on the spot. Therefore, in this exemplary embodiment, a mechanism is proposed in which target items are listed according to criteria for the items and a text string corresponding to each target item is put into an unrecognizable state. The target item is controlled so that its content cannot be confirmed unless the user takes some action on the item.
In this exemplary embodiment, a method of superimposing a mask image is used as a method of putting the text string of the item into an unrecognizable state.
With the above mechanism, it is possible to ensure that the approver of the form always confirms the information of the items that have to be confirmed before approving the form.
Hereinafter, in the description of this exemplary embodiment, as a method of putting the text string of the item into an unrecognizable state, a method of superimposing the mask image 141 will be described as an example, but the method is not limited thereto, and a method capable of putting the text string into an unrecognizable state may be used as appropriate, such as mosaic, changing the font color or background color of the text string, and replacing the text string. Information about a text string such as a text string itself, a font color, a background color, and the like is an example of the “text string information” of the present disclosure. Further, removing the mask image 141 in a case where the mask image 141 is clicked by the user operation is an example of “changing the unrecognizable state to a recognizable state by performing a predetermined operation”.
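The alternative methods of putting a text string into an unrecognizable state mentioned above (replacing the text string, or changing its font color or background color) can be sketched as simple transforms of the text string information. The following functions and the dictionary representation of font and background colors are illustrative assumptions, not the apparatus's actual implementation.

```python
# Illustrative sketches of two "unrecognizable state" methods other than
# superimposing a mask image: replacing the text string, and matching the
# font color to the background color. Data shapes here are assumptions.

def replace_text(value):
    """Replace the text string with same-length placeholder characters."""
    return "\u25a0" * len(value)  # filled squares, one per character

def hide_by_color(value, background="white"):
    """Render the text string in the background color so it cannot be read."""
    return {"text": value, "font_color": background, "background": background}

masked = replace_text("54,000 yen")
# masked is ten filled squares; the original value is no longer recognizable
```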
The input unit 15 in the UI unit 14 includes a pointing device such as a mouse and a keyboard, and is used to receive various types of input information. The display unit 16 is, for example, a liquid crystal display and displays various types of information. The display unit 16 may employ a touch panel method and serve as the input unit 15. Hereinafter, the user's operation received by the input unit 15 is referred to as a user operation.
The communication I/F 18 is an interface for the information processing apparatus 10 to communicate with the UI unit 14. For the communication, for example, a wired communication standard such as Ethernet (registered trademark) or FDDI is used.
The storage unit 13 is realized by a storage device such as an HDD, an SSD, or a flash memory. An information processing program 13A is stored in the storage unit 13 serving as a storage medium. The CPU 11 reads the information processing program 13A from the storage unit 13, loads the information processing program 13A into the memory 12, and sequentially executes the processes included in the information processing program 13A. Further, the storage unit 13 stores various types of information that need to be stored, such as an information storage unit 13B.
The information storage unit 13B stores key value information and criteria for each item. The key value information is information indicating conditions for associating a key text string and a value text string, which are necessary for performing the key value extraction to be described later. The criteria for each item are criteria for crucial items that need to be confirmed. In this exemplary embodiment, an item satisfying the criteria is treated as a target item, and the mask image 141 is superimposed on the position of the text string of the target item. Alternatively, crucial items may simply be listed without setting criteria, and the mask image 141 may be superimposed on the text strings of the listed items as the target items.
In this exemplary embodiment, a key value extraction method based on the key value information is used to acquire the text string from the form. The key value extraction is to extract a value text string by specifying a key text string. The key text string is a text string representing an item. Items include, for example, “destination”, “amount of money”, and “approved delivery date”. The value text string is a text string representing a value corresponding to an item. The text string representing the value is, for example, “54,000 yen” in a case where the item is a value of “amount of money”. In the case of extracting the key value from the form, for example, in a case where “amount of money” is specified as the key text string, “54,000 yen” is extracted as the value text string in the form. Any method such as optical character recognition (OCR) can be used for key value extraction. The key text string, which is a text string representing an item, is an example of the “predetermined item” of the present disclosure, and the value text string, which is a text string representing a value, is an example of the “text string information” of the present disclosure. Further, as the key value information, the specification of the item which is a key text string may be received from the user.
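The key value extraction described above can be sketched as follows, assuming the OCR result of the form is available as a list of (text, x, y) tuples read in layout order. The pairing rule used here, that the value text string immediately follows its key text string, is a simplifying assumption; the actual key value information may encode richer positional conditions.

```python
# Minimal sketch of key value extraction over OCR tokens. Given a key text
# string such as "amount of money", return the corresponding value text
# string (e.g. "54,000 yen") together with its position information.

def extract_key_value(ocr_tokens, key):
    """Return (value text string, position) for the given key text string."""
    for i, (text, _x, _y) in enumerate(ocr_tokens):
        if text == key and i + 1 < len(ocr_tokens):
            value, vx, vy = ocr_tokens[i + 1]  # assumed pairing rule
            return value, (vx, vy)
    return None, None

tokens = [("destination", 50, 100), ("Tokyo Branch", 200, 100),
          ("amount of money", 50, 150), ("54,000 yen", 200, 150)]
value, pos = extract_key_value(tokens, "amount of money")
# value == "54,000 yen", pos == (200, 150)
```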
The above is the description of the hardware configuration of the information processing apparatus 10.
The acquisition unit 110 receives the digitized form 140A, and acquires a text string corresponding to a predetermined item in the form 140A and position information of the text string. The item is an item obtained by referring to the key value information stored in the information storage unit 13B. The text string is a text string corresponding to the item obtained by the key value extraction described above. The method of acquiring the position information will be described.
The determination unit 111 determines whether or not the item of the text string acquired by the acquisition unit 110 is a target item based on the criteria for each item stored in the information storage unit 13B.
The mask image generation unit 112 generates the mask image 141 for the target item based on the vertical and horizontal size (the horizontal width and the vertical width) of the position information. For example, in a case where the horizontal width is "60" and the vertical width is "20", a mask image 141 of that vertical and horizontal size is generated.
The form processing unit 113 superimposes the mask image 141 generated by the mask image generation unit 112 on the form 140A based on the X and Y coordinates of the position information for each target item, and outputs the processed form 140B. For example, in a case where the X coordinate is “200” and the Y coordinate is “300”, the mask image 141 is superimposed on the coordinates.
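The operation of the mask image generation unit 112 and the form processing unit 113 can be sketched as follows. The dictionary representation of the position information (X, Y, width, height) and of the form is an illustrative assumption; "superimposition" is modeled as recording the mask rectangle over a copy of the form.

```python
# Illustrative sketch: generate a mask matching the text string's vertical
# and horizontal size, then superimpose it on the form at the X and Y
# coordinates of the position information.

def generate_mask(position):
    """Generate a mask rectangle from the item's position information."""
    return {"x": position["x"], "y": position["y"],
            "w": position["w"], "h": position["h"]}

def superimpose(form, masks):
    """Return a processed form: a copy of the form with masks laid over it."""
    processed = dict(form)  # leave the original form untouched
    processed["masks"] = list(masks)
    return processed

pos = {"x": 200, "y": 300, "w": 60, "h": 20}
processed_form = superimpose({"name": "form 140A"}, [generate_mask(pos)])
# processed_form carries one 60x20 mask at coordinates (200, 300)
```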
Further, the form control unit 114 receives a user operation on the processed form 140B and controls the display of the processed form 140B displayed on the display unit 16. As shown in
Next, the operation of the information processing apparatus 10 will be described.
The process of the information processing apparatus 10 is substantially divided into a target item determination process, a form mask process, and a form control process. Each process is performed by the CPU 11 reading the information processing program 13A from the storage unit 13 and loading the program into the memory 12 for execution.
In Step S100, the CPU 11 acquires a text string corresponding to a predetermined item in the form 140A and position information of the text string. The acquired text string and position information are stored in the information storage unit 13B.
In Step S102, the CPU 11 selects an item to be determined from the items for which the text string has been acquired.
In Step S104, the CPU 11 determines whether or not the selected item is a target item based on the criteria for each item stored in the information storage unit 13B. In a case where it is determined that the item is a target item, the process proceeds to Step S106, and in a case where it is determined that the item is not a target item, the process returns to Step S102, selects the next item, and repeats the process.
In Step S106, the CPU 11 adds the selected item as a target item to a target item list.
In Step S108, the CPU 11 determines whether or not a determination process has been completed for all the acquired items. In a case where it is determined that the determination process has been completed for all the items, the process proceeds to Step S110, and in a case where it is determined that the determination process has not been completed for all the items, the process returns to Step S102, selects the next item, and repeats the process.
In Step S110, the CPU 11 outputs a target item list and ends the target item determination process. The output here is the storage of the target item list in the information storage unit 13B. Further, in the output, the target item list may be displayed on the display unit 16.
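The target item determination process above (Steps S100 to S110) can be sketched as a loop over the acquired items. Representing the criteria for each item as a simple item-to-boolean mapping is an illustrative assumption; the actual criteria stored in the information storage unit 13B may be richer conditions.

```python
# A sketch of the target item determination process: each item for which a
# text string was acquired is checked against its criteria, and items that
# satisfy the criteria are collected into a target item list.

def determine_target_items(items, criteria):
    """Return the target item list built from the per-item criteria."""
    target_item_list = []
    for item in items:                      # Step S102: select an item
        if criteria.get(item, False):       # Step S104: target item?
            target_item_list.append(item)   # Step S106: add to the list
    return target_item_list                 # Step S110: output the list

criteria = {"destination": True, "amount of money": True, "remarks": False}
targets = determine_target_items(
    ["destination", "remarks", "amount of money"], criteria)
# targets == ["destination", "amount of money"]
```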
In Step S120, the CPU 11 selects a target item from the target item list.
In Step S122, for the target item selected in Step S120, the CPU 11 acquires the position information acquired for the target item.
In Step S124, the CPU 11 generates a mask image 141 for the target item selected in Step S120 based on the vertical and horizontal size (the horizontal width and the vertical width) of the position information acquired in Step S122.
In Step S126, for the target item selected in Step S120, the CPU 11 superimposes the mask image 141 generated in Step S124 on the coordinates of the text string of the target item in the form 140A based on the X and Y coordinates of the position information acquired in Step S122. The superimposition process may be performed by first generating an image obtained by copying the form 140A and superimposing the mask image 141 on the generated image.
In Step S128, the CPU 11 determines whether or not the process has been completed for all the target items. In a case where it is determined that the process has been completed for all the target items, the process proceeds to Step S130, and in a case where it is determined that the process has not been completed for all the target items, the process returns to Step S122, selects the next target item, and repeats the process.
In Step S130, the CPU 11 outputs the final processed form 140B and ends the form mask process. In the output, the processed form 140B is stored in the information storage unit 13B. Further, in the output, the processed form 140B may be displayed on the display unit 16.
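The form mask process (Steps S120 to S130) can be sketched end to end as follows: for each target item, a mask is generated from its stored position information and superimposed on a copy of the form. The data shapes are illustrative assumptions, not the apparatus's actual structures.

```python
# A sketch of the form mask process: iterate over the target item list,
# generate a mask per item from its position information, and superimpose
# all masks on a copy of form 140A to produce processed form 140B.

def form_mask_process(form, target_items, positions):
    processed = dict(form)                      # copy of form 140A
    processed["masks"] = []
    for item in target_items:                   # Step S120: select target
        pos = positions[item]                   # Step S122: position info
        mask = {"item": item, "x": pos["x"], "y": pos["y"],
                "w": pos["w"], "h": pos["h"]}   # Step S124: generate mask
        processed["masks"].append(mask)         # Step S126: superimpose
    return processed                            # Step S130: output 140B

positions = {"amount of money": {"x": 200, "y": 300, "w": 60, "h": 20}}
form_140b = form_mask_process({"name": "form"}, ["amount of money"], positions)
# form_140b carries one mask, for "amount of money", at (200, 300)
```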
In Step S140, the CPU 11 causes the display unit 16 to display the processed form 140B.
In Step S142, the CPU 11 receives a user operation on the processed form 140B.
In Step S144, the CPU 11 determines whether or not the user operation is a click on the mask image 141.
In Step S146, the CPU 11 removes the mask image 141 clicked in Step S142 from the processed form 140B, and redraws the processed form 140B on the display unit 16. The redrawing may be performed by superimposing the remaining mask images 141 other than the clicked and removed mask image 141 on the copied image of the form 140A to generate the processed form 140B.
In Step S148, the CPU 11 displays change information on the display unit 16. The CPU 11 stores a log of the items corresponding to the clicked mask images 141 in the information storage unit 13B, and displays the change information based on the log.
In Step S150, the CPU 11 determines whether or not all the mask images 141 of the processed form 140B have been removed. In a case where it is determined that all the mask images 141 have been removed, the process proceeds to Step S152, and in a case where it is determined that all the mask images 141 have not been removed, the process returns to Step S142, receives a user operation, and repeats the process.
In Step S152, the CPU 11 displays completion information on the display unit 16.
In Step S154, the CPU 11 performs control such that approval of the processed form 140B is permitted and ends the process. Since the form may be approved by a general approval process, the description thereof will be omitted.
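The form control process (Steps S140 to S154) can be sketched as a click handler: a click inside a mask removes that mask and logs the corresponding item, and approval is permitted only once every mask image 141 has been removed. The hit-test and dispatch model below is an illustrative assumption.

```python
# A sketch of the form control process: determine whether a user operation
# is a click on a mask image (Step S144), remove the clicked mask and log
# the change (Steps S146 and S148), and report whether all mask images have
# been removed so that approval may be permitted (Steps S150 to S154).

def handle_click(processed_form, click_xy, log):
    """Remove the mask under the click, if any; return True when none remain."""
    x, y = click_xy
    remaining = []
    for mask in processed_form["masks"]:
        hit = (mask["x"] <= x < mask["x"] + mask["w"]
               and mask["y"] <= y < mask["y"] + mask["h"])
        if hit:
            log.append(mask["item"])    # Step S148: record change information
        else:
            remaining.append(mask)
    processed_form["masks"] = remaining  # Step S146: redraw without the mask
    return len(processed_form["masks"]) == 0  # Step S150: all removed?

form = {"masks": [{"item": "amount of money",
                   "x": 200, "y": 300, "w": 60, "h": 20}]}
log = []
approval_permitted = handle_click(form, (210, 305), log)
# the only mask was clicked, so approval may now be permitted
```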
The display mode of the change information in Step S148 will be described.
By displaying the above change information, the user can grasp the items to be confirmed.
The display mode of the completion information in Step S152 will be described.
As described above, according to the information processing apparatus 10 of this exemplary embodiment, it is possible to recognize the presence of predetermined items in the form.
The present disclosure is not limited to the above-described exemplary embodiment, and various modifications and applications are possible without departing from the gist of the disclosure. For example, in a case where there are a plurality of acquired target items, the number of target items on which the mask image 141 is superimposed may be suppressed. In this case, among the target items satisfying the criteria, the target items on which the mask image 141 is superimposed are determined in accordance with a predetermined rule. As an example, in a case where there are two or more destination items in the form, a rule may be defined such that the mask image 141 is generated and superimposed only on the first destination. By suppressing the number of target items on which the mask image 141 is superimposed in this way, the user's confirmation work can be simplified. Further, in a case where two items are displayed side by side at adjacent positions, a rule may be set so as to generate and superimpose a single combined mask image 141 from the position information of the two items. By combining the mask images 141 of two or more items at adjacent display positions in this way, the user can confirm the items that need to be confirmed with a single click of the mask image 141, and the confirmation work can be simplified.
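The two modification rules described above can be sketched as follows: keeping only the first mask of a repeated item kind, and merging the masks of two adjacent items into one combined mask. Realizing "combine" as a bounding box covering both rectangles is an illustrative assumption.

```python
# Sketches of the modification rules: (1) superimpose the mask image only on
# the first occurrence of a repeated item such as "destination"; (2) merge
# the masks of two adjacent items into a single combined mask.

def first_only(masks, kind):
    """Keep only the first mask whose item is of the given kind."""
    seen = False
    kept = []
    for m in masks:
        if m["item"] == kind:
            if seen:
                continue  # skip second and later occurrences
            seen = True
        kept.append(m)
    return kept

def combine(m1, m2):
    """Merge two adjacent masks into one rectangle covering both."""
    x = min(m1["x"], m2["x"])
    y = min(m1["y"], m2["y"])
    w = max(m1["x"] + m1["w"], m2["x"] + m2["w"]) - x
    h = max(m1["y"] + m1["h"], m2["y"] + m2["h"]) - y
    return {"item": m1["item"] + "+" + m2["item"],
            "x": x, "y": y, "w": w, "h": h}

a = {"item": "destination", "x": 50, "y": 100, "w": 60, "h": 20}
b = {"item": "amount of money", "x": 120, "y": 100, "w": 60, "h": 20}
merged = combine(a, b)
# merged spans x = 50..180, so its width is 130
```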
In addition, various processors other than the CPU may execute the information processing executed by the CPU reading the software (program) in the above exemplary embodiment. In the embodiments above, the term “processor” refers to hardware in a broad sense. Examples of the processor include general processors (e.g., CPU: Central Processing Unit) and dedicated processors (e.g., GPU: Graphics Processing Unit, ASIC: Application Specific Integrated Circuit, FPGA: Field Programmable Gate Array, and programmable logic device). In addition, information processing may be executed by one of these various processors, or may be executed by a configuration of two or more processors of the same type or different types (for example, a combination of a plurality of FPGAs, a combination of a CPU and an FPGA, and the like). Further, the hardware structure of these various processors is, more specifically, an electric circuit in which circuit elements such as semiconductor elements are combined.
Further, in the above exemplary embodiment, the mode in which the information processing program is stored (installed) in the ROM or the storage in advance has been described, but the present disclosure is not limited thereto. The program may be provided in a form recorded on a non-transitory recording medium such as a compact disk read only memory (CD-ROM), a digital versatile disk read only memory (DVD-ROM), and a universal serial bus (USB) memory. Further, the program may be downloaded from an external device via a network.
In the embodiments above, the term “processor” is broad enough to encompass one processor or plural processors in collaboration which are located physically apart from each other but may work cooperatively. The order of operations of the processor is not limited to one described in the embodiments above, and may be changed.
The foregoing description of the exemplary embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to understand the invention for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.
Number | Date | Country | Kind |
---|---|---|---|
2021-054121 | Mar 2021 | JP | national |