This application is based upon and claims the benefit of priority from the prior Japanese Patent Application No. 2014-044298, filed on Mar. 6, 2014, the entire contents of which are incorporated herein by reference.
An embodiment according to the present invention relates to a delivery sorting processing system and a delivery sorting processing method.
Conventionally, delivery sorting processing systems have been used that recognize characters from an image obtained by photographing a delivery, identify a delivery address from the recognized characters, and sort the delivery automatically. However, when the adjustment of such a system is insufficient, the character recognition processing becomes inaccurate, and in some cases the identified address does not correspond to the actual delivery address.
A delivery sorting processing system in an embodiment includes an image acquisition section, an input section, a recognition section, an image processing section, an association processing section, and a determination section. The image acquisition section acquires an image obtained by photographing a delivery. The input section receives an operation of an operator. The recognition section recognizes recognizable semantic information included in the image, based on the image the image acquisition section acquires. The image processing section extracts a candidate for a delivery destination area based on the semantic information recognized by the recognition section, and determines priority according to a possibility of being relevant to a delivery destination area. The association processing section processes a delivery based on information included in a candidate for a delivery destination area of the highest priority in a case where a predetermined condition is satisfied, performs processing of determining a delivery destination of a delivery based on information included in a candidate for a delivery destination area selected by an operator's operation the input section receives in a case where a predetermined condition is not satisfied, and stores a candidate for a delivery destination area determined to have high priority by the image processing section in a storage section in association with a candidate for a delivery destination area selected by an operator's operation the input section receives. The determination section refers to the storage section, and determines whether there is a match between a candidate for a delivery destination area stored in the storage section and determined to have high priority by the image processing section and a candidate for a delivery destination area selected by the operator's operation, and outputs a determination result.
In the following, a delivery sorting processing system in an embodiment will be described with reference to the drawings.
In the example illustrated in
The delivery sorting processing system 10 includes, for example, a plurality of scanners 50, a reading processing section 300, a determination processing section 400, an image display terminal 100, a warning display terminal 500, a storage section 200, and a conveying section 60.
The plurality of scanners 50 photographs the shipping bill 20 affixed to the delivery 21 arriving at a predetermined photographing position and outputs an image (image data). The scanners 50 are, for example, scanners of a line scan method capable of photographing the moving delivery 21 at high resolution. The plurality of scanners 50 is installed at positions where the delivery 21 can be photographed from mutually different angles. For example, the plurality of scanners 50 is each installed in a position where a top surface and four side surfaces of the delivery 21 can be photographed. It is to be noted that, for example, a scanner 50 may be a camera capable of photographing a predetermined plane area at a time.
The conveying section 60 includes the belt conveyor 70, and a conveyance control section 80 driving the belt conveyor 70. In the conveying section 60, the belt conveyor 70 is driven based on a signal output from the conveyance control section 80. Thereby, the belt conveyor 70 moves the delivery 21 placed thereon. Among the plurality of sorting boxes, the conveying section 60 conveys the delivery 21 to the sorting box corresponding to a processing result of the reading processing section 300 or the determination processing section 400. The conveyance control section 80 controls a conveyance state of the conveying section 60. For example, the conveyance control section 80 controls the speed of the belt conveyor 70, a route to the sorting box of the delivery 21, and the like.
The image display terminal 100 includes an image display section 110 and a first input section 120. The first input section 120 is an input section. The image display section 110 displays information which the reading processing section 300 reads from the scanner 50 based on a signal output from the reading processing section 300. In addition, the first input section 120 is a device for receiving an operation of selecting a portion of the information displayed by the image display section 110, an operation of giving an instruction to subject a portion of the displayed information to predetermined processing, or the like. The first input section 120 outputs information corresponding to a received operation to the reading processing section 300.
The reading processing section 300 and the determination processing section 400 of the delivery sorting processing system 10 include a processor such as a CPU (Central Processing Unit). In addition, the delivery sorting processing system 10 includes the storage section 200, such as a ROM (Read Only Memory), a RAM (Random Access Memory), an HDD, or a flash memory. In addition, the delivery sorting processing system 10 includes, as software function sections that function by the processor executing a program stored in the storage section 200, an image acquisition section 310, a recognition section 320, an image processing section 330, an association processing section 340, a determination section 410, and an area setting parameter calculation section 420. It is to be noted that some or all of these software function sections may be hardware function sections such as an LSI (Large Scale Integration) or an ASIC (Application Specific Integrated Circuit).
The storage section 200 includes an area extraction information storage section 210, and an area selection information storage section 220. In the area extraction information storage section 210, information corresponding to a candidate for a delivery destination area described below is stored. In addition, in the area selection information storage section 220, a candidate for a delivery destination area selected by an operator's operation described below and a candidate for a delivery destination area of the highest priority are stored in association with each other.
The reading processing section 300 includes the image acquisition section 310, the recognition section 320, the image processing section 330, and the association processing section 340.
The image acquisition section 310 acquires an image of the delivery 21 which the scanner 50 photographs.
The recognition section 320 recognizes the semantic information including the delivery destination information which is recognizable from the image that the image acquisition section 310 acquires. That is, the recognition section 320 recognizes the semantic information displayed on the shipping bill 20 affixed to the delivery 21, or on the delivery 21. For example, the recognition section 320 may recognize the character information by OCR (Optical Character Recognition) from the image obtained by the scanner 50 photographing, and may further recognize the delivery destination information based on the character information recognized by OCR. It is to be noted that the recognition section 320 may recognize the symbolic information (such as a bar code) from the image obtained by the scanner 50 photographing. In this case, the recognition section 320 recognizes the identification information by decoding the symbolic information. In addition, by disposing a plurality of laser devices (not shown) irradiating the delivery 21 with lasers from the respective directions in the delivery sorting processing system 10, the recognition section 320 may measure a shape of the delivery 21 based on reflected light beams, and may recognize the semantic information by using an image corrected in accordance with the measurement result. Thereby, the image corrected in accordance with the shape of the delivery 21 can be acquired, and therefore, the recognition section 320 can recognize the semantic information with higher accuracy.
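As an illustration only (not part of the embodiment), the following is a minimal sketch of this recognition step in Python, assuming the pytesseract OCR wrapper; the library choice, the postal-code pattern, and the file name are assumptions.

```python
# Minimal sketch of the recognition step: OCR the photographed image and pull
# out text that looks like delivery destination information. Assumes
# pytesseract and Pillow are installed; the postal-code pattern below is a
# hypothetical example, not part of the embodiment.
import re

from PIL import Image
import pytesseract

def recognize_semantic_info(image_path: str) -> dict:
    image = Image.open(image_path)
    text = pytesseract.image_to_string(image)            # character recognition (OCR)
    postal_codes = re.findall(r"\b\d{3}-\d{4}\b", text)  # e.g. Japanese-style postal codes
    return {"raw_text": text, "postal_codes": postal_codes}

# Example usage (hypothetical file name):
# info = recognize_semantic_info("shipping_bill.png")
```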
The image processing section 330 recognizes information described on the delivery 21 and information described in the shipping bill 20 from the semantic information the recognition section 320 recognizes. That is, the image processing section 330 recognizes the delivery destination information such as a postal code, an address, and a name. In addition, based on the semantic information which the recognition section 320 recognizes, the image processing section 330 extracts from the image an area which is to be a candidate for an area where the delivery destination information is described. Here, the candidate for the delivery destination area extracted by the image processing section 330 based on the semantic information is referred to as a “delivery destination candidate area”.
In addition, for each of the extracted delivery destination candidate areas, the image processing section 330 determines priority according to the possibility of being relevant to the delivery destination area of the delivery 21. Hereinafter, the delivery destination candidate area with the highest possibility of being relevant to the delivery destination area is referred to as the "delivery destination candidate area of the highest priority".
In addition, the image processing section 330 determines whether the sorting destination corresponding to the delivery destination can be confirmed in the delivery destination candidate area of the highest priority (that is, whether a predetermined condition is satisfied).
Here, the case where the sorting destination cannot be confirmed means the following cases. For example, it is a case where all or a portion of the information of the delivery destination candidate area of the highest priority is unclear, and therefore, the image processing section 330 cannot accurately recognize the delivery destination information such as a postal code, an address, and a name. In addition, for example, a case where the image processing section 330 cannot extract the delivery destination candidate area from the image, and a case where the semantic information obtained by the recognition section 320 lacks information to confirm the sorting destination are also included in the case where the sorting destination cannot be confirmed. In addition, for example, a case where the image processing section 330 can extract the delivery destination candidate area, but cannot determine the delivery destination candidate area of the highest priority is also included in the case where the sorting destination cannot be confirmed. It is to be noted that the conditions of the case where the sorting destination cannot be confirmed can be appropriately set.
When the predetermined condition is satisfied, that is, when the image processing section 330 can determine the sorting destination from the delivery destination information by referring to a table where the delivery destination information and the sorting destination are associated with each other, the image processing section 330 outputs the confirmed sorting destination to the association processing section 340.
When the predetermined condition is not satisfied, the image processing section 330 outputs an indication that the predetermined condition is not satisfied to the association processing section 340, and then, as described below, confirms the sorting destination corresponding to the delivery destination by acquiring sorting information from the first input section 120 of the image display terminal 100. Then, the image processing section 330 outputs the confirmed sorting destination to the association processing section 340.
When the predetermined condition is satisfied, the association processing section 340 processes the delivery 21 based on the confirmed sorting destination from the image processing section 330. That is, the association processing section 340 processes the delivery 21 based on the information described in the delivery destination candidate area of the highest priority. Processing the delivery 21 means, for example, instructing the conveyance control section 80 to convey the delivery 21 to the sorting box (that is, sorting destination) corresponding to the destination (that is, delivery destination). In addition, the association processing section 340 stores in the storage section information corresponding to a result of processing the delivery based on the information described in the delivery destination candidate area of the highest priority when the predetermined condition is satisfied.
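As an illustration, a minimal sketch of this processing step follows, assuming a simple lookup table that associates delivery destination information (here a postal code) with a sorting destination and a hypothetical conveyance control interface; the table entries and the route_to method are assumptions, not part of the embodiment.

```python
# Sketch of "processing the delivery": look up the sorting destination for the
# confirmed delivery destination in an association table and instruct the
# conveyance control section. Table contents and the control interface are
# illustrative assumptions.
SORTING_TABLE = {"100-0001": "box_01", "150-0002": "box_02"}  # hypothetical entries

def process_delivery(postal_code: str, conveyance_control) -> None:
    sorting_box = SORTING_TABLE.get(postal_code)
    if sorting_box is not None:
        conveyance_control.route_to(sorting_box)  # convey the delivery to its sorting box
    # the processing result would also be stored in the storage section
```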
In addition, when the predetermined condition is not satisfied, the association processing section 340 outputs the delivery destination candidate area which the image processing section 330 extracts to the image display section 110 of the image display terminal 100, and then, processes the delivery 21 based on the confirmed sorting destination from the image processing section 330. That is, the association processing section 340 processes the delivery 21 based on the sorting information input by the operator based on the information described in the delivery destination candidate area selected by the operator's operation.
Further, the association processing section 340 stores the delivery destination candidate area determined to have high priority, and the delivery destination candidate area selected as the delivery destination area by the operator's operation on the first input section 120 in association with each other in the area extraction information storage section 210. It is to be noted that the association processing section 340 may store in the storage section 200 the semantic information which the recognition section 320 recognizes. In addition, the association processing section 340 may store in the storage section 200 the semantic information recognized by the recognition section 320, in association with the candidate for each of the delivery destination areas extracted by the image processing section 330, and the priority determined according to the possibility of being relevant to the delivery destination area determined by the image processing section 330.
The determination processing section 400 includes the determination section 410, and the area setting parameter calculation section 420. The determination section 410 refers to the area extraction information storage section 210 and the area selection information storage section 220, and determines whether there is a match between the delivery destination area determined to have high priority by the image processing section 330 (delivery destination candidate area of the highest priority) and the delivery destination candidate area selected as the delivery destination area by the operator. In addition, the determination section 410 outputs the determination result of whether there is a match to the warning display terminal 500.
The warning display terminal 500 includes a warning display section 510 and a second input section 520. The warning display section 510 displays the determination result determined by the determination section 410, or the results processed and calculated by the area setting parameter calculation section 420. The second input section 520 is a device for receiving the operations on the determination section 410 and the area setting parameter calculation section 420.
The area setting parameter calculation section 420 calculates a parameter for extracting an area to be the delivery destination candidate area based on the information stored in the area extraction information storage section 210 and the area selection information storage section 220. Further, the area setting parameter calculation section 420 calculates a parameter for determining the delivery destination candidate area of the highest priority based on the information stored in the area extraction information storage section 210 and the area selection information storage section 220.
The area setting parameter calculation section 420 outputs the calculated parameters to the reading processing section 300. The reading processing section 300 outputs the parameters calculated by the area setting parameter calculation section 420 to the image processing section 330. The image processing section 330 performs processing using the input parameters.
The area setting parameter calculation section 420 calculates a parameter for extracting the delivery destination candidate area and a parameter for determining the priority of the delivery destination candidate area, using the information indicating the delivery destination candidate area stored in the area extraction information storage section 210 and the information indicating the priority of the delivery destination candidate area stored in the area extraction information storage section 210. By preparing a plurality of test images for verifying the performance difference in advance and applying the parameters that the area setting parameter calculating section 420 calculates to the test images, the delivery sorting processing system can determine the appropriateness of the parameters and the adverse effects of the parameters that the area setting parameter calculating section 420 calculates.
(Operation of Sorting Processing)
In the following, an operation of the sorting processing performed in the delivery sorting processing system 10 in the embodiment will be described with reference to
First, the scanner 50 photographs the delivery 21 that is being conveyed by being placed on the belt conveyor 70, and outputs an image. The image acquisition section 310 acquires the image output from the scanner 50, and outputs the image to the recognition section 320 (step S300).
The recognition section 320 receives the image from the image acquisition section 310, and performs the processing of recognizing the semantic information described in the shipping bill 20 affixed to the delivery 21 from the image (step S302).
The image processing section 330 extracts from the image the area (delivery destination candidate area) to be a candidate for the delivery destination area where the delivery destination information is described, based on the semantic information the recognition section 320 recognizes (step S304). It is to be noted that a method for extracting the delivery destination candidate area will be described below.
Next, with respect to the extracted delivery destination candidate area, the image processing section 330 determines the priority according to the possibility of being relevant to the delivery destination area (step S306). It is to be noted that a method for determining the priority according to the possibility of being relevant to the delivery destination area will be described below.
The association processing section 340 stores the delivery destination candidate area in the area extraction information storage section 210 in association with the priority according to the possibility of being relevant to the delivery destination area (step S308).
Here, the information stored in the area extraction information storage section 210 will be described. To the image from which the semantic information is recognized by the recognition section 320, a coordinate value indicating the position of each object on the image and predetermined identification information are attached.
The priority of the area 1 is determined to be the highest, the priority of the area 2 the second highest, and the priority of the area 3 the lowest. The area 1 includes character strings 11, 12, and 13; the area 2 includes character strings 21, 22, and 23; and the area 3 includes a character string 31.
A coordinate value on the image corresponds to the position of a pixel of the image, and indicates in which pixel in the vertical direction and in which pixel in the horizontal direction a specific pixel in the image exists. For example, as illustrated in the lower diagram of
In addition, in the area extraction information storage section 210, for example, a coordinate value corresponding to an area including a plurality of character rows, which is obtained based on the position relation between each of the character row coordinates extracted by the image processing section 330, information related to the priority determined by the image processing section 330 and the delivery destination candidate area corresponding to the priority, the semantic information recognized by the recognition section 320 for each delivery destination candidate area, and the like may be stored. In addition, the association processing section 340 may store an image acquired by the image acquisition section 310 in the area extraction information storage section 210.
The association processing section may store in the storage section any one of an image the image acquisition section acquires, a coordinate value corresponding to an image of the delivery, a coordinate value corresponding to a candidate for each of delivery destination areas of the delivery, a coordinate value corresponding to each of character rows in each of the delivery destination areas, a coordinate value corresponding to an area including a plurality of rows together based on a position relation of each of the character rows extracted by the image processing section, information on the priority determined by the image processing section and a candidate for the delivery destination area corresponding to the priority, and semantic information recognized by the recognition section with respect to a candidate for each of the delivery destination areas, in association with the candidate for the delivery destination area determined to have high priority by the image processing section and the candidate for the delivery destination area selected by an input of an operator's operation the input section receives.
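As an illustration of the kind of record such an association might take, the following sketch defines hypothetical data structures; all field names are assumptions and not part of the embodiment.

```python
# Illustrative sketch of one record the association processing section might
# store: the highest-priority candidate area and the operator-selected
# candidate area kept together, optionally with per-row coordinates and the
# recognized semantic information. All field names are assumptions.
from dataclasses import dataclass, field

@dataclass
class CandidateArea:
    area_id: int
    bbox: tuple            # (x, y, width, height) in image pixel coordinates
    priority: int
    text: str = ""         # semantic information recognized in this area

@dataclass
class AssociationRecord:
    image_id: str
    top_priority_area: CandidateArea       # determined by the image processing section
    operator_selected_area: CandidateArea  # selected via the first input section
    row_bboxes: list = field(default_factory=list)  # coordinates of individual character rows
```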
Returning to
When the image processing section 330 determines in the processing of step S310 that the sorting destination corresponding to the delivery destination can be confirmed, the image processing section 330 outputs the confirmed sorting destination to end the sorting processing (step S320). When the sorting destination corresponding to the delivery destination cannot be confirmed, the image processing section 330 proceeds to processing in step S312.
When the image processing section 330 determines in the processing of step S310 that the sorting destination cannot be confirmed, the association processing section 340 causes the image display section 110 to display the image (step S312).
Then, the operator of the delivery sorting processing system 10 selects, via the first input section 120, the delivery destination candidate area to be used as the delivery destination area on the screen displayed by the image display section 110 in step S312 (step S314). The first input section 120 receives the selection of the desired delivery destination candidate area by, for example, an input by a mouse, a keyboard, or a touch panel. In the image display terminal 100, the selected delivery destination candidate area may be displayed in a magnified view.
When the operator selects any of the areas, the area is magnified and displayed.
The image processing section 330 acquires the sorting information which is input by the operator via the first input section 120 of the image display terminal 100 (step S316). The image display section 110 receives the input of the sorting information by the interface display screen shown in
The association processing section 340 associates the information of the delivery destination candidate area selected by the operator with the sorting information input by the operator and the delivery destination information such as a name, and writes them into the area selection information storage section 220 (step S318).
Then, the reading processing section 300 outputs information obtained by associating the information of the delivery destination candidate area selected by the operator with the sorting information input by the operator, as the sorting destination information, to the image display terminal 100, or the warning display terminal 500 (step S320). It is to be noted that the reading processing section 300 may cause a display provided separately, a screen display section installed independently in remote areas, or the like to display the information obtained by associating the information of the delivery destination candidate area selected by the operator with the sorting information input by the operator, as the sorting destination information, on the display screen.
Next, the method for extracting the delivery destination candidate area, which corresponds to step S304 described above, will be described. For example, there is a method such as the following. With respect to the obtained image, the image processing section 330 creates a differential image in which a portion where the density difference between adjacent pixels is equal to or more than a certain level is set to "1" and the other portions are set to "0", and performs labeling on portions where "1"s are concatenated. Based on this labeling, the image processing section 330 merges labeled portions that are located close to each other in the vertical or horizontal direction into candidates for character rows. The image processing section 330 then collects the character-row candidates that are closely positioned and aligned in the same direction into a delivery destination candidate area, as sketched below.
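A minimal sketch of this differential-image approach follows, assuming Python with numpy and scipy; the threshold value is an assumption.

```python
# Minimal sketch of the differential-image approach: mark pixels where the
# horizontal density difference exceeds a threshold, label connected
# components, and keep each component's bounding box as a building block for
# character-row candidates. The threshold value is an assumption.
import numpy as np
from scipy import ndimage

def connected_components_of_edges(gray: np.ndarray, diff_threshold: int = 30):
    diff = np.abs(np.diff(gray.astype(np.int32), axis=1))  # adjacent-pixel density difference
    binary = np.zeros_like(gray, dtype=np.uint8)
    binary[:, 1:] = (diff >= diff_threshold).astype(np.uint8)
    labels, _ = ndimage.label(binary)                        # label connected "1" portions
    boxes = []
    for sl in ndimage.find_objects(labels):                  # bounding box of each label
        y, x = sl
        boxes.append((x.start, y.start, x.stop - x.start, y.stop - y.start))
    return boxes

# Boxes close to each other in the horizontal direction would then be merged
# into character-row candidates, and nearby rows into a candidate area.
```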
In addition, for example, there is a method such as the following. The image processing section 330 converts the image into a binary image of black and white pixels by comparing the pixel value (brightness, color) of each pixel of the image of the delivery with a threshold. The image processing section 330 detects areas where black pixels are concatenated in this conversion result, and obtains, for each concatenated area of black pixels, information on the rectangle that circumscribes it. This information is referred to as rectangle information. Then, by comparing the rectangle information with reference information prepared in advance, the image processing section 330 may detect the delivery destination candidate area according to the magnitude of the similarity. Further, the image processing section 330 compares the shape of the rectangular area, the position of the rectangular area on the delivery, and the like, included in the rectangle information, with the reference information prepared in advance, and based on this comparison result, extracts delivery destination candidate areas at one or more locations, as in the sketch below.
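A minimal sketch of this rectangle-based approach follows; the binarization threshold, the reference values, and the scoring weights are illustrative assumptions.

```python
# Sketch of the rectangle-based approach: binarize the image, take the
# bounding rectangle of each concatenated area of black pixels, and score it
# against reference information (shape and position) prepared in advance.
# Threshold, reference values, and weights are illustrative assumptions.
import numpy as np
from scipy import ndimage

def candidate_areas_by_rectangle(gray: np.ndarray, reference: dict, top_k: int = 3):
    black = (gray < 128).astype(np.uint8)          # simple threshold to black/white pixels
    labels, _ = ndimage.label(black)
    scored = []
    for sl in ndimage.find_objects(labels):
        y, x = sl
        w, h = x.stop - x.start, y.stop - y.start
        aspect = w / max(h, 1)
        # similarity: penalize deviation from the reference aspect ratio and position
        score = -abs(aspect - reference["aspect"]) - 0.001 * abs(y.start - reference["top"])
        scored.append(((x.start, y.start, w, h), score))
    scored.sort(key=lambda item: item[1], reverse=True)
    return [rect for rect, _ in scored[:top_k]]    # candidate areas at one or more locations

# reference = {"aspect": 3.0, "top": 50}  # hypothetical reference information
```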
Next, the method for determining the priority according to the possibility of being relevant to the delivery destination area, which corresponds to step S306 described above, will be described. For example, there is a method such as the following. The image processing section 330 evaluates how many rows are included in the delivery destination candidate area, whether the number of rows is more or less than a predetermined number n, whether the average width of the detected rows is larger or smaller than a predetermined dimension, the distribution of the widths of the detected rows, the distribution of the positions of the detected rows, and the like. The image processing section 330 determines the priority according to the possibility of being relevant to the delivery destination area by comparing the positions and distributions of the rows and columns in the evaluated delivery destination candidate area with the reference information prepared in advance, and evaluating the comparison results comprehensively.
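A minimal sketch of such a priority scoring follows; the reference values and weights are illustrative assumptions, not the embodiment's actual criteria.

```python
# Sketch of the priority determination: score each candidate area on the
# features named above (row count, average row width, spread of row widths)
# against reference values, then rank candidates. Weights and reference
# values are illustrative assumptions.
def priority_score(rows, expected_rows: int = 3, expected_width: float = 200.0) -> float:
    # rows: list of (x, y, width, height) for the character rows in one candidate area
    if not rows:
        return float("-inf")
    widths = [w for _, _, w, _ in rows]
    avg_width = sum(widths) / len(widths)
    score = 0.0
    score -= abs(len(rows) - expected_rows)           # row count close to the reference
    score -= abs(avg_width - expected_width) / 100.0  # average row width close to the reference
    score -= (max(widths) - min(widths)) / 200.0      # narrow spread of row widths
    return score

def rank_candidates(candidates):
    # candidates: {area_id: [row boxes]}; the highest score means the highest priority
    return sorted(candidates, key=lambda a: priority_score(candidates[a]), reverse=True)
```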
Thus, the delivery sorting processing system includes the image processing section 330 that acquires the delivery destination candidate area selected by the operator and the sorting information input by the operator when the delivery destination information of the delivery 21 cannot be recognized. Thereby, the delivery sorting processing system can confirm the sorting destination of the delivery 21 based on the selected delivery destination candidate area and the input sorting information, and sort the delivery 21 appropriately.
(Operation of Diagnosis Processing)
In the following, an operation of diagnosis processing according to the embodiment will be described with reference to
The determination section 410 refers to the area extraction information storage section 210 and the area selection information storage section 220, and compares the delivery destination candidate area of the highest priority determined by the image processing section 330 with the delivery destination candidate area selected as the delivery destination area by the operator, and determines whether there is a match between them (step S400). The determination section 410 outputs the result of the determination in step S400 to the warning display terminal 500 (step S402).
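As an illustration, the following sketch compares the two stored candidate areas by the overlap (IoU) of their bounding boxes; the overlap criterion and its threshold are assumptions, and a simple identifier comparison would serve equally well.

```python
# Sketch of the diagnosis step: compare the stored highest-priority candidate
# area with the operator-selected area by requiring sufficient overlap of
# their bounding boxes. The IoU criterion and the threshold are assumptions.
def iou(a, b):
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0

def areas_match(top_priority_bbox, operator_bbox, threshold: float = 0.5) -> bool:
    return iou(top_priority_bbox, operator_bbox) >= threshold

# A mismatch (False) would be reported to the warning display terminal 500.
```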
Here, an example of a display screen that is output in the warning display terminal 500 will be described with reference to
By performing such display, the operator can recognize whether the adjustment of the delivery sorting processing system 10, which automatically recognizes the delivery destination, is appropriate. By referring to the results displayed on the display screen by the warning display section 510, the operator can verify a defect of the setting of the delivery destination candidate area, and a defect of the determination of the delivery destination candidate area of the highest priority, and adjust the delivery sorting processing system 10.
In addition, the determination section 410 may cause the image display section 110, a display provided separately, a maintenance center installed independently in a remote area, or the like to display the information indicating the determination result on its display screen. Further, the warning display terminal 500 may inform the operator or maintenance personnel of a suspected defect by voice or the like. In addition, the determination processing section 400 may output the determination result, or the warning display terminal 500 may cause the warning display section 510 to display the determination result, each time the determination is made, each time the determination has been made a predetermined number of times, or when a mismatch has been determined a predetermined number of times or more. In addition, the determination processing section 400 may output the determination result, or the warning display terminal 500 may cause the image display section 110 or the like to display the determination result, when a certain number or more of determination results have been accumulated, when a certain number of determination results have been accumulated consecutively, or when mismatch determination results have been accumulated at a certain proportion or more.
Returning to
The area setting parameter calculation section 420 performs, for example, a simulation of extracting the delivery destination candidate area and a simulation of determining the priority of the delivery destination candidate area, using a plurality of parameters prepared in advance, and determines the respective parameters by calculating an optimal parameter for extracting the delivery destination candidate area and an optimal parameter for determining the priority of the delivery destination candidate area. For example, the area setting parameter calculation section 420 sets, as the delivery destination candidate area of the highest priority, an area whose height X is a predetermined value X1 or more and whose sum Y of the position coordinates of the area start point (sx+sy) is a predetermined value Y1 or less. In addition, for example, the area setting parameter calculation section 420 sets five candidate values for each of the predetermined value X1 and the predetermined value Y1, and performs, for all combinations of X1 and Y1, the simulation of determining the priority of the delivery destination candidate area on an abnormal group of images in which a delivery destination candidate area inappropriate as the delivery destination area was selected. Then, the area setting parameter calculation section 420 calculates the correct answer rate for each combination of X1 and Y1 by comparing the delivery destination area extracted from each image with the simulation result. The area setting parameter calculation section 420 determines the parameters X1 and Y1 that yield the highest correct answer rate as the optimal parameters. When there are n parameters related to the processing of determining the priority of the area setting and all of them are to be adjusted, the area setting parameter calculation section 420 may perform 2^n simulations in a round-robin (exhaustive) manner by preparing two candidate values for each parameter. When the number of simulations becomes too large, the area setting parameter calculation section 420 may calculate appropriate setting values by the design of experiments method or the like.
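A minimal sketch of this exhaustive (round-robin) search over candidate values of X1 and Y1 follows; the candidate value lists and the evaluation function are assumptions.

```python
# Sketch of the round-robin (exhaustive) parameter search: try every
# combination of candidate values for X1 (minimum area height) and Y1
# (maximum sx+sy of the area start point), score each combination by its
# correct answer rate on prepared test images, and keep the best. The
# candidate value lists and the evaluation function are assumptions.
from itertools import product

def grid_search(test_cases, x1_values, y1_values, evaluate):
    # evaluate(case, x1, y1) -> True if the area chosen with (x1, y1) matches
    # the known correct delivery destination area of that test case.
    best = (None, -1.0)
    for x1, y1 in product(x1_values, y1_values):    # all combinations of X1 and Y1
        correct = sum(evaluate(case, x1, y1) for case in test_cases)
        rate = correct / len(test_cases)
        if rate > best[1]:
            best = ((x1, y1), rate)
    return best  # ((optimal X1, optimal Y1), correct answer rate)

# Example (hypothetical values):
# grid_search(cases, [10, 20, 30, 40, 50], [100, 200, 300, 400, 500], evaluate)
```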
In addition, as for the method for calculating the optimal parameters, there is, for example, a method of calculating them by machine learning, such as setting the optimal parameters using the gradient method. When there is a plurality of setting value candidates for the parameters, the area setting parameter calculation section 420 identifies the direction of improvement, for example, by changing each of the parameters slightly and checking whether the number of correct answers increases or decreases before and after the change. Then, the area setting parameter calculation section 420 adjusts each of the parameters until an end condition is satisfied, such as the number of correct answers no longer changing. The parameters that satisfy the end condition become the optimal parameters. In addition, the setting of the optimal parameters is not limited to the methods described above, and the operator may perform the setting.
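A minimal sketch of such a coordinate-wise, gradient-style adjustment follows; the step size, iteration limit, and objective function are assumptions.

```python
# Sketch of the gradient-style adjustment: nudge each parameter slightly,
# keep the change if the number of correct answers increases, and stop when
# no parameter change improves the result. Step size and the objective
# function are illustrative assumptions.
def hill_climb(params: dict, count_correct, step: float = 1.0, max_iters: int = 100) -> dict:
    # count_correct(params) -> number of correctly sorted test images
    best = dict(params)
    best_score = count_correct(best)
    for _ in range(max_iters):
        improved = False
        for name in best:
            for delta in (+step, -step):
                trial = dict(best)
                trial[name] += delta
                score = count_correct(trial)
                if score > best_score:          # keep the improving direction
                    best, best_score, improved = trial, score, True
        if not improved:                        # end condition: no further gain
            break
    return best
```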
Next, the area setting parameter calculation section 420 outputs the parameter for extracting the delivery destination candidate area calculated as described above, and the parameter for determining the priority to the reading processing section 300. The reading processing section 300 applies the parameters that are input from the area setting parameter calculation section 420 in place of the parameters that are internally set (step S406). In addition, the reading processing section 300 outputs the applied parameters (step S408). It is to be noted that the parameters output from the area setting parameter calculation section 420 may be displayed on the display screen of the image display section 110, the warning display section 510, a display provided separately, a screen display section installed independently in remote areas, or the like.
As described above, in the diagnosis processing of the delivery sorting processing system 10 according to the embodiment, it can be detected whether the sorting into the sorting destination corresponding to the delivery destination of the delivery 21 is appropriately performed. The determination section 410 refers to the area extraction information storage section 210 and the area selection information storage section 220, and compares the delivery destination candidate area that is determined to have the highest possibility of being relevant to the delivery destination area with the delivery destination candidate area that is selected as the delivery destination area by the operator. Then, the determination section 410 determines whether there is a match between them, and outputs the determination result to the warning display terminal 500. Thereby, the operator can determine whether the parameters set to sort the delivery 21 into the sorting destination corresponding to the delivery destination are appropriate.
In addition, in the diagnosis processing of the delivery sorting processing system 10 according to the embodiment, the area setting parameter calculation section 420 can calculate the parameter for extracting the delivery destination candidate area from the semantic information, and the parameter for determining the priority, which are set in the image processing section 330. By applying the parameters calculated by the area setting parameter calculation section 420 to the delivery sorting processing system 10, the sorting into the sorting destination corresponding to the delivery destination of the delivery 21 can be performed more appropriately.
Thus, the determination section 410 determines whether there is a match between the delivery destination candidate area stored in the area extraction information storage section 210 and determined to have high priority as the delivery destination area, and the delivery destination candidate area stored in the area selection information storage section 220 and selected as the delivery destination area by the operator, and outputs the determination result. When there is no match between them, the determination section 410 outputs information indicating that the parameters set in the delivery sorting processing system 10 are defective. As a result, it is possible to detect defects of the parameters applied to the delivery sorting processing system 10.
Further, by referring to the information stored in the area extraction information storage section 210 and the information stored in the area selection information storage section 220, the area setting parameter calculation section 420 can calculate the parameter for extracting the delivery destination candidate area, and the parameter for determining the priority of the delivery destination candidate area. By applying the parameters that the area setting parameter calculating section 420 calculates in place of the defective parameters, the reading processing section 300 can improve the accuracy of the sorting processing of the delivery 21 without involvement of the operator's operation.
In the following, a modification relating to the above embodiment will be described.
Although the sorting processing of the delivery 21 is performed by processing the image acquired by the scanner 50 in the embodiment, the delivery sorting processing system according to the present invention is not limited thereto. For example, in addition to the configuration in the embodiment, the delivery sorting processing system may include a configuration for recognizing a bar code.
The bar code scanner 600 acquires an image including a bar code displayed on a delivery 21. The bar code reading section 350 recognizes the bar code from the image the bar code scanner 600 acquires, and further acquires information (numeric string, for example) corresponding to the bar code by decoding the bar code. The bar code recognition section 360 recognizes delivery destination information corresponding to the information acquired by the bar code reading section 350.
For example, when the bar code, which is symbolic information obtained by encoding identification information (a numeric string) indicating the delivery destination information, is printed on the delivery 21 or described in the shipping bill 20, the bar code scanner 600 acquires an image including the bar code, the bar code reading section 350 decodes it, and the bar code recognition section 360 recognizes the corresponding delivery destination information. Thereby, the delivery destination information of the delivery 21 can be recognized, and the sorting destination of the delivery can be confirmed. In addition, according to the modification, the recognition section 320 not only acquires the delivery destination information based on the bar code, but also recognizes character information included in the image by OCR from the image obtained by the scanner 50 photographing. The image processing section 330 acquires the delivery destination information based on the character information recognized by OCR.
Processing in the above modification will be described with reference to
First, the bar code scanner 600 acquires an image including a bar code that is printed on the delivery 21 or described in the shipping bill 20. The bar code reading section 350 recognizes the bar code from the acquired image, and further acquires information (numeric string) the bar code indicates by decoding the bar code. The bar code recognition section 360 recognizes the delivery destination information corresponding to the acquired information (numeric string) (step S200). Then, the bar code recognition section 360 determines whether the sorting destination corresponding to the delivery destination of the delivery 21 can be confirmed (step S202). When the sorting destination can be confirmed, the sorting destination is output (step S320).
When the bar code recognition section 360 cannot confirm the sorting destination corresponding to the delivery destination, the scanner 50 acquires an image of the delivery 21 being conveyed on the belt conveyor 70 (step S204). The recognition section 320 recognizes, by OCR, the character information described in the shipping bill 20 affixed to the delivery 21 from the image. Then, the image processing section 330 performs processing of selecting correct delivery destination information and acquires the delivery destination information based on the character string recognized by OCR, so that the delivery destination information can be recognized correctly even when a portion of the character string recognized by OCR is misrecognized (step S206). Thereby, when the sorting destination corresponding to the delivery destination can be confirmed, the sorting destination information is output. When the sorting destination cannot be confirmed, the processing proceeds to step S300. It is to be noted that the processing in step S300 may be omitted, and the image obtained by the photographing in step S204 may be used in step S302.
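A minimal sketch of this bar-code-first flow with an OCR fallback follows; the helper functions stand in for the bar code recognition section, the recognition section, and the operator-assisted flow, and are assumptions.

```python
# Sketch of the modified flow: try bar code decoding first (steps S200-S202),
# fall back to OCR of the shipping bill (steps S204-S206), and finally hand
# over to the operator-assisted flow (step S300 onward). The helpers are
# hypothetical stand-ins, not part of the embodiment.
from typing import Optional

def confirm_sorting_destination(image, decode_barcode, ocr_destination,
                                operator_flow) -> Optional[str]:
    destination = decode_barcode(image)    # step S200: bar code -> numeric string -> destination
    if destination is not None:            # step S202: sorting destination confirmed?
        return destination
    destination = ocr_destination(image)   # steps S204-S206: OCR-based recognition
    if destination is not None:
        return destination
    return operator_flow(image)            # fall back to the operator-assisted flow
```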
Thereby, the delivery sorting processing system 10 in the modification performs the processing of recognizing a bar code in addition to the recognition processing by OCR, and therefore, can improve the probability that the sorting destination corresponding to the delivery destination of the delivery can be confirmed.
According to at least one of the embodiments described above, the delivery sorting processing system stores the delivery destination candidate area determined to have high priority by the image processing section 330 and the delivery destination candidate area selected as the delivery destination area by the operator's operation on the first input section 120, in association with each other in the storage section 200. In addition, the delivery sorting processing system has a function of determining whether there is a match between the delivery destination candidate area determined to have high priority and the delivery destination candidate area that the operator selects as the delivery destination area, both stored in the storage section 200, and of outputting the determination result. Therefore, the delivery sorting processing system 10 can provide the image processing section with the information necessary for adjusting the system that automatically recognizes the delivery destination.
In addition, according to at least one of the embodiments, the delivery sorting processing system includes the area setting parameter calculating section 420 that calculates the parameter for extracting the delivery destination candidate area and the parameter for determining the priority of the delivery destination candidate area. Further, the delivery sorting processing system can provide these calculated parameters to the operator or the reading processing section 300. Therefore, the delivery sorting processing system can acquire the delivery destination information appropriately from the delivery. As a result, the delivery sorting processing system can sort the delivery appropriately into the sorting destination corresponding to the delivery destination.
While some embodiments according to the present invention have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the invention. These embodiments can be carried out in a variety of other modes, and various omissions, substitutions and changes can be made without departing from the spirit of the invention. These embodiments and their modifications fall within the scope of the invention described in the claims and its equivalents as well as within the scope and spirit of the invention.