The present disclosure relates to a projection instruction device, a parcel sorting system, and a projection instruction method useful for sorting parcels.
As economic activity has risen in recent years, the amount of parcels in circulation has tended to increase. In the parcel distribution process, the work of sorting parcels by destination is time-consuming and has long relied on manual labor, although technologies for automating at least a part of the sorting work have been proposed.
PTL 1 discloses a system in which a moving parcel is tracked, an image to be displayed is determined based on information related to the parcel read from the parcel and information of a position of the parcel, and the image is projected from a projector to display the image on the parcel.
PTL 1: U.S. Pat. No. 7,090,134
Meanwhile, in recent years, the amount of parcels in circulation has continued to increase and the types of parcels have become more diverse, so a technology for sorting parcels effectively and precisely is required.
The present disclosure relates to a technology for sorting parcels effectively and precisely.
According to the present disclosure, there is provided a projection instruction device that generates a projection image to be projected on a parcel based on parcel identification information specifying the parcel, the device including: a processor; and a memory, in which by cooperating with the memory, the processor specifies a person in charge of processing the parcel or a destination of the parcel based on the parcel identification information to generate a person in charge instruction projection image indicating at least one of the person in charge or the destination.
According to the present disclosure, there is provided a parcel sorting system including: the projection instruction device described above; a label reader that reads the parcel identification information from a label attached to a parcel; an image sensor that images at least the parcel; and an image projection device that projects at least the person in charge instruction projection image on the parcel.
According to the present disclosure, there is provided a projection instruction method of generating a projection image to be projected on a parcel based on parcel identification information specifying the parcel, the method including: by causing a processor to cooperate with a memory, specifying a person in charge of processing the parcel or a destination of the parcel based on the parcel identification information to generate a person in charge instruction projection image indicating at least one of the person in charge or the destination.
According to the present disclosure, it is possible to sort parcels more effectively and precisely and to better cope with an increase in the amount of parcel circulation. In particular, by generating a projection image indicating the worker in charge of processing a parcel and projecting the projection image on the parcel, the worker can pick up the parcel more smoothly, and work efficiency can be improved.
Hereinafter, embodiments (hereinafter, referred to as "present embodiment") which specifically disclose a projection instruction device, a parcel sorting system, and a projection instruction method according to the present disclosure will be described in detail with reference to appropriate drawings. Meanwhile, in some cases, an unnecessarily detailed explanation may be omitted. For example, detailed explanations of already well-known items and repeated explanations of substantially the same configuration may be omitted. This is to avoid unnecessary repetition in the following description and to facilitate understanding by those skilled in the art. The accompanying drawings and the following description are provided to enable those skilled in the art to fully understand the present disclosure and are not intended to limit the scope of the claims.
Hereinafter, the embodiments of the disclosure will be described with reference to the drawings.
Label reader 10 as a reading device is a device which includes various components such as a lens (not illustrated), an image sensor, and the like. By using label reader 10, it is possible to read, from a label attached to a parcel transported by the transport conveyor, label recording information in which various types of information related to the parcel are recorded. The read label recording information makes it possible to specify the parcel and constitutes the parcel identification information.
Image sensor 20 is an imaging device which includes various components such as a lens (not illustrated), an image sensor, and the like. Image sensor 20 is generally configured by an imaging camera. The imaging camera is a three-dimensional camera, a plurality of two-dimensional cameras, or the like. Image sensor 20 includes distance image sensor 22 and color image sensor 24.
Distance image sensor 22 images the parcel transported by the transport conveyor and generates a distance image. The generated distance image is used as information indicating the position of the parcel, the distance to the parcel, the size of the parcel, and the like. "Distance image" means an image including distance information indicating a distance from the imaging position to the position (including the surface of a parcel) indicated by each pixel (that is, "image" in the present disclosure includes a distance image). In addition, the term "distance image" also covers data which cannot be recognized as an image by human eyes, such as a table listing numerical values indicating distances. That is, the "distance image" may be any information indicating the relationship between coordinates and distances in the imaged region, and its data structure is not limited. In the present disclosure, distance image sensor 22 is used for specifying the position of the parcel. Therefore, distance image sensor 22 can also be replaced with another sensing device (an ultrasonic sensor, an infrared sensor, a stereo camera, or a monocular video camera).
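To make the two forms concrete, the following is a minimal sketch in Python; the array shape and distance values are illustrative assumptions rather than sensor specifications. It holds the same distance information once as a per-pixel image and once as a plain coordinate/distance table.

```python
# A minimal sketch of the two "distance image" representations described above.
# The array shape and the distance values are illustrative assumptions.
import numpy as np

# Per-pixel distance image: distance_mm[y, x] is the distance from the imaging
# position to the surface seen at pixel (x, y), in millimeters.
distance_mm = np.full((480, 640), 1500, dtype=np.uint16)  # background about 1.5 m away
distance_mm[200:280, 300:400] = 1200                       # a parcel surface about 1.2 m away

# The same information as a table of (x, y, distance) values, which need not be
# recognizable as an image by human eyes.
ys, xs = np.nonzero(distance_mm < 1500)
table = [(int(x), int(y), int(distance_mm[y, x])) for x, y in zip(xs, ys)]
print(len(table), "pixels belong to the nearer surface (the parcel)")
```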
Color image sensor 24 images the parcel for which the distance image has been generated, and generates a color image. "Color image" refers to an image in which the color of the surface of a parcel is expressed with a predetermined gradation, and the "gradation" includes not only the 256 gradations of RGB but also all kinds of grayscales and the like. In the present disclosure, color image sensor 24 is used for tracking each parcel specified by distance image sensor 22. Color image sensor 24 can also be replaced with another sensing device (an ultrasonic sensor, an infrared sensor, a stereo camera, or a monocular video camera).
That is, in the present disclosure, the term "image" includes both a distance image and a color image. In the present disclosure, information output from an image sensor serving as a sensing device, including a distance image sensor and a color image sensor, is referred to as sensing information. In the present embodiment, an example of the sensing device will be described by using image sensor 20 (including distance image sensor 22 and color image sensor 24). In addition, in the present embodiment, an example of the sensing information will be described by using the distance image output by distance image sensor 22 and the color image output by color image sensor 24.
Projection instruction device 30 functions as a calculation device in parcel sorting system 100. As illustrated in the drawings, projection instruction device 30 includes input unit 32, processor 34, memory 36, and output unit 38.
Projector 40 is configured by a general projection device, and projects projection light including the projection image received from projection instruction device 30 onto the parcel, thereby displaying the projection image on the parcel.
Parcel sorting system 100 can be configured to include label reader 10, image sensor 20 (distance image sensor 22 and color image sensor 24), projection instruction device 30, and projector 40 connected to one another by wired or wireless communication. In addition, parcel sorting system 100 can also be configured with two or more of label reader 10, image sensor 20, projection instruction device 30, and projector 40 combined as an integral device. For example, image sensor 20 and projector 40 can be combined to construct an integral imaging projection device (see the drawings).
In the present embodiment, as illustrated in the drawings, label reader 10 is disposed above transport conveyor 50 and reads the label attached to each parcel P transported by transport conveyor 50 to obtain the parcel identification information.
Further, image sensor 20 captures the image (the distance image and the color image) of parcel P transported by transport conveyor 50 and obtains information such as the position of parcel P, the distance to parcel P, the size of parcel P (the lengths of three sides when parcel P is a rectangular parallelepiped), the color of parcel P, the pattern of parcel P, and the like. Further, the positions of label reader 10 and image sensor 20, the type of the sensing device, and the order of processes are not particularly limited to the illustrated embodiments. As described above, in the present example, image sensor 20 and projector 40 are configured as integrated imaging projection device 60 and are disposed above transport conveyor 50.
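As a rough illustration of how such quantities could be derived from the distance image, the sketch below assumes a single parcel on a belt at a known distance; the threshold, geometry, and units are assumptions for illustration, not figures from the actual system.

```python
# A rough sketch: pixels closer to the camera than the conveyor surface are
# treated as the parcel; their bounding box gives the position and extent, and
# the distance difference gives the height. Thresholds and units are assumptions.
import numpy as np

def locate_parcel(distance_mm: np.ndarray, conveyor_mm: float = 1500.0,
                  margin_mm: float = 50.0):
    """Return (center_x, center_y, width_px, depth_px, height_mm), or None if no parcel."""
    mask = distance_mm < (conveyor_mm - margin_mm)        # pixels above the belt
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    center_x, center_y = int(xs.mean()), int(ys.mean())   # crude parcel position
    width_px = int(xs.max() - xs.min() + 1)
    depth_px = int(ys.max() - ys.min() + 1)
    height_mm = float(conveyor_mm - distance_mm[mask].min())
    return center_x, center_y, width_px, depth_px, height_mm

# With the illustrative distance image from the earlier sketch:
# locate_parcel(distance_mm) -> (349, 239, 100, 80, 300.0)
```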
Projection instruction device 30 (not illustrated in the drawings) generates a projection image to be displayed on parcel P based on the parcel identification information read by label reader 10 and on the sensing information obtained by image sensor 20, and outputs a projection instruction to projector 40.
Projector 40, which receives the projection instruction, serves as an image projection device: it projects projection light including the projection image generated by projection instruction device 30 onto parcel P and displays the projection image on parcel P. Here, the projection image displayed on parcel P is, for example, an image of an encircled number having a color indicating the sorting location corresponding to the delivery address of parcel P (see the drawings).
In the example illustrated in the drawings, a projection image corresponding to the sorting destination is displayed on each parcel transported by transport conveyor 50.
For example, parcel P1 has parcel identification information of "AAA111" on its label, and the parcel identification information of "AAA111" specifies that the parcel is a target to be sorted in region A. Here, when parcel P1 reaches the specific region, processor 34 transmits the generated projection image to projector 40, and projector 40 projects the projection image on parcel P1, as illustrated in the drawings.
In the present embodiment, the operation of parcel sorting system 100 proceeds as illustrated in the flowchart in the drawings.
Hereinafter, in parcel sorting system 100 according to the embodiment, an outline of an operation of sorting the parcel performed by projection instruction device 30 will be described.
On the other hand, in parallel with step S1 (the reading of the parcel identification information by label reader 10) and step S2 (the recording of the parcel identification information, the ID, and the corresponding time information in memory 36), after distance image sensor 22 of image sensor 20 captures the distance image of the parcel, input unit 32 of projection instruction device 30 obtains the distance image as the sensing information from distance image sensor 22 (step S20). Processor 34 then determines whether or not an ID corresponding to the parcel existing in the distance image exists in memory 36.
An example of a method of determining whether or not the ID corresponding to the parcel in the distance image exists in memory 36 is as follows. Processor 34 calculates the time required for the parcel to move between label reader 10 and distance image sensor 22 from the distance between them (assumed to be known) and the speed of transport conveyor 50. By subtracting this time from the time at which the distance image is obtained, the time at which the ID was assigned by label reader 10 (or processor 34) to the parcel in the distance image can be estimated, and the ID assigned closest to the estimated time can be regarded as the ID corresponding to that parcel. As another example, another distance image sensor may be installed in the vicinity of label reader 10. In this case, from the moment label reader 10 (or processor 34) assigns the ID, the parcel to which the ID is assigned is tracked by this additional distance image sensor, and the distance between the parcel (or the ID) and label reader 10 is measured at each time unit. Processor 34 can then estimate the ID of the parcel in the distance image obtained in step S20 from the measured distance between the parcel (or the ID) and label reader 10, the distance of the parcel in the distance image obtained in step S20, and the distance between the two distance image sensors (assumed to be known).
In this manner, processor 34 determines whether or not the ID corresponding to the parcel included in the distance image exists in memory 36 (step S30). That is, as described for step S2, the parcel identification information, the ID, and the time information corresponding to the time when the ID was assigned are recorded in memory 36 in advance. On the other hand, as described above, processor 34 can, for example, subtract the time required for the parcel to move between label reader 10 and distance image sensor 22 from the time when the distance image is obtained, thereby estimating the time at which the ID was assigned by label reader 10 (or processor 34) to the parcel in the distance image. Processor 34 compares the time information recorded in memory 36 in advance with the estimated time, and in a case where they are close (for example, where the time difference is equal to or smaller than a predetermined threshold time), processor 34 can determine that the ID corresponding to the parcel included in the distance image exists in memory 36. In a case where it is determined that the ID corresponding to the parcel exists in memory 36 (Yes in step S30), the process moves to step S60 and the subsequent steps.
In a case where it is determined that the ID corresponding to the parcel does not exist in memory 36 (No in step S30), on the premise that the ID is not assigned to the parcel, processor 34 specifies the position of the parcel again (step S40) and assigns the ID to the parcel (step S50).
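A minimal sketch of the time-based matching in steps S20 to S30 might look as follows; the device spacing, conveyor speed, and threshold below are assumptions made for illustration, not values from the disclosure. The travel time between label reader 10 and distance image sensor 22 is subtracted from the acquisition time of the distance image, and the ID whose recorded assignment time is closest, within the threshold, is taken as the ID of the parcel.

```python
# Illustrative values only; spacing, speed, and threshold are assumptions.
SENSOR_SPACING_M = 2.0     # known distance between label reader 10 and distance image sensor 22
CONVEYOR_SPEED_MPS = 0.5   # speed of transport conveyor 50
THRESHOLD_S = 0.5          # allowed gap between recorded and estimated assignment times

# Contents of memory 36: ID -> (parcel identification information, time the ID was assigned)
id_records = {1: ("AAA111", 10.0), 2: ("BBB222", 12.0)}

def find_id_for_distance_image(image_time_s: float):
    travel_s = SENSOR_SPACING_M / CONVEYOR_SPEED_MPS
    estimated_assignment_s = image_time_s - travel_s
    best_id, (_, assigned_s) = min(id_records.items(),
                                   key=lambda kv: abs(kv[1][1] - estimated_assignment_s))
    if abs(assigned_s - estimated_assignment_s) <= THRESHOLD_S:
        return best_id       # the ID exists in memory 36 (Yes in step S30)
    return None              # no matching ID (No in step S30)

print(find_id_for_distance_image(14.1))   # travel time 4.0 s, estimate 10.1 s -> ID 1
```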
In parallel with the above steps, color image sensor 24 generates a color image for each parcel for which a distance image has been obtained. Based on the color image from color image sensor 24 obtained by input unit 32, processor 34 tracks the parcel to which the ID is assigned while it is transported and moved by transport conveyor 50 (step S60). Likewise based on the color image, processor 34 determines whether or not the worker has picked up the tracked parcel (step S70). In a case where it is determined that the parcel has not been picked up by the worker (No in step S70), processor 34 determines whether or not the parcel exists in a specific region (a predetermined sorting area in which the parcel is to be picked up) described below. In a case where it is determined that the parcel exists in (has reached) the specific region (Yes in step S80), processor 34 generates the projection image and transmits it to projector 40 (step S90). In a case where it is determined that the parcel has not reached the specific region (No in step S80), the process returns to step S60 and processor 34 continues to track the parcel.
In addition, in a case where it is determined in step S70 that the parcel has been picked up by the worker (Yes in step S70), processor 34 reads the detail information of the parcel from memory 36 (step S100), generates a projection image including the detail information, and outputs the generated projection image from output unit 38 to projector 40 (step S90). Projector 40, which obtains the projection image from projection instruction device 30, projects the projection image on the corresponding parcel.
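The branch structure of steps S60 to S100 described above can be summarized in a simplified, self-contained sketch like the one below; the parcel data, the memory contents, and the way pickup and region arrival are detected are stand-ins, and only the control flow mirrors the text.

```python
# Only the branching of steps S60 to S100 is modeled; detection of pickup and of
# arrival in the specific region is passed in as plain flags for illustration.
def sorting_step(parcel, picked_up, in_specific_region, memory36, projected):
    """One iteration of the loop: track the parcel, then branch on steps S70 and S80."""
    # step S60: the parcel is tracked here (omitted in this sketch)
    if picked_up:                                                 # Yes in step S70
        detail = memory36.get(parcel["id"], "")                   # step S100: read detail information
        projected.append(("detail image", parcel["id"], detail))  # step S90
    elif in_specific_region:                                      # Yes in step S80
        projected.append(("sorting image", parcel["id"]))         # step S90
    # No in step S80: nothing is projected; the next iteration returns to step S60

memory36 = {7: "deliver to region A"}
images = []
sorting_step({"id": 7}, picked_up=False, in_specific_region=True, memory36=memory36, projected=images)
sorting_step({"id": 7}, picked_up=True, in_specific_region=True, memory36=memory36, projected=images)
print(images)   # a sorting image, then a detail image for the same parcel
```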
The above is the outline of the operation procedure performed by processor 34 and the like of projection instruction device 30. Certainly, the operation procedure is not limited to that described above. For example, the determination in step S70 can be omitted. In addition, for the determination in step S70, it is possible to use contact determination between a hand of the worker and the parcel, a color image, a distance image, or the like. Next, a specific case of the present disclosure will be described.
[Instruction about Worker in Charge for Parcel]
In general, a plurality of workers M are located beside or in the vicinity of transport conveyor 50, and each worker picks up a parcel from the transport conveyor. A worker can visually recognize a parcel to be picked up from a projection image indicating the encircled numbers as illustrated in the drawings.
Projection instruction device 30 according to the present disclosure is not limited to the projection image specifying the parcel itself as illustrated in the drawings; it can also generate a person in charge instruction projection image indicating the worker in charge of picking up the parcel.
In the same manner as described for steps S1 and S2 in the flowchart, label reader 10 reads the parcel identification information, including the delivery address, from the label of the parcel, and the parcel identification information is recorded in memory 36.
By comparing the delivery address in the parcel identification information obtained from label reader 10 with the delivery addresses assigned to the workers identified by markers A, B, and C, processor 34 can specify the worker in charge of picking up a specific parcel. With this specification, processor 34 can generate, for the specific parcel, a person in charge instruction projection image indicating the person in charge of picking up the parcel. Then, according to steps S20 to S80 in the flowchart described above, the parcel is tracked, and when it reaches the specific region, the person in charge instruction projection image is transmitted to projector 40 (step S90).
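A minimal sketch of this comparison, with made-up marker-to-delivery-address assignments (the actual association is whatever the operator configures), might look as follows.

```python
# Which delivery addresses each worker (identified by marker A, B, or C) handles.
# The assignments below are illustrative assumptions.
marker_to_addresses = {
    "A": {"Tokyo", "Chiba"},
    "B": {"Osaka", "Kyoto"},
    "C": {"Fukuoka"},
}

def worker_in_charge(delivery_address: str):
    """Return the marker of the worker in charge of this delivery address, or None."""
    for marker, addresses in marker_to_addresses.items():
        if delivery_address in addresses:
            return marker
    return None   # no worker in charge found (see the error pattern case described later)

print(worker_in_charge("Osaka"))   # -> "B": generate the image indicating worker B
```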
Projector 40 projects the person in charge instruction projection image generated by processor 34, indicating the worker in charge of the parcel, on the parcel. In the illustrated example, the person in charge instruction projection image includes an arrow image extending from the parcel toward the worker in charge.
In the example described above, information such as the sorting destination of the parcel is included in the person in charge instruction projection image, but the present example is not limited thereto. If work is assigned in advance so that the worker in charge can recognize the sorting destination of the parcel, the sorting destination may be replaced with a number or the like indicating the sorting destination of the parcel, or information specifying the worker in charge, such as a name or an identification number of the worker in charge, may be included instead.
In the example described above, image sensor 20 identifies markers A, B, and C of the respective workers, but the type of the worker identification information such as markers A, B, and C is not particularly limited. Further, the angle or the like of image sensor 20 may be arranged so that the worker can be identified directly, without relying on a marker or the like, by using a face authentication technology or the like. In addition, a dedicated image sensor may be provided separately from image sensor 20, and each worker may be identified by using marker identification, a face authentication technology, or the like. Further, the object to be recognized and identified need not be the worker; it may instead be a destination at which parcels are loaded (a destination of the parcel), such as a truck or a roll box pallet. In a case where both the worker and the destination of the parcel can be recognized and identified, an image indicating the worker and the destination of the parcel may be projected together, or an image indicating only one of them may be projected.
Further, in the example described above, the person in charge is specified by using the delivery address included in the parcel identification information, but the person in charge may be specified by using other information in the parcel identification information. In addition, the number of parcels routed to a busy worker who is processing many parcels may be reduced. The association between a worker and a delivery address (or other parcel identification information) may be stored in a separate database, and projection instruction device 30 may refer to that database. For example, at a specific timing (such as the start timing of parcel sorting system 100), the position at which each worker stands in the vicinity of transport conveyor 50 may be recognized by a camera or the like, and the association between the worker and the delivery address (or other parcel identification information) may be determined and registered in the database.
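As a hypothetical sketch of such start-up registration (the detection step and all data below are placeholders, not part of the disclosed system), the associations could be built like this.

```python
# Stand-in for camera-based recognition of which worker stands where along the conveyor.
def detect_workers_at_startup():
    return {"worker_A": 0.5, "worker_B": 1.5}   # worker -> position along transport conveyor 50 (m)

# Delivery addresses handled at each one-meter section of the conveyor (assumed).
section_addresses = {0: {"Tokyo"}, 1: {"Osaka"}}

# Build the worker-to-address association and register it in the database
# (represented here by a plain dict).
worker_to_addresses = {}
for worker, position_m in detect_workers_at_startup().items():
    worker_to_addresses[worker] = section_addresses[int(position_m)]
print(worker_to_addresses)   # {'worker_A': {'Tokyo'}, 'worker_B': {'Osaka'}}
```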
In addition, with the worker identification information such as markers A, B, and C described above, the person in charge itself is generally stored; instead of the person in charge, however, an association between a delivery address (or other parcel identification information) and a device having information indicating the position at which a parcel is processed (picked up), such as an RFID tag, a beacon, or a smartphone, may be registered in the database.
Further, a region in which each worker can work (for example, a region of every one meter or the like) may be determined in advance, and a beacon or the like may be fixedly installed in each region in which parcels are processed. Even with such a configuration, each worker can recognize a parcel to be processed, because an arrow image or the like indicating the region is projected from the parcel toward the beacon. With this configuration, by associating the beacon in advance with the identification information of the worker in charge of the region indicated by the beacon, identification information is in effect assigned to the region itself. Therefore, image sensor 20 does not need to recognize the worker, and the processing load can be reduced. Further, with this configuration, there is no need to give a marker or a beacon to a worker, so it is no longer necessary to force the worker to wear clothes with the marker or the like, and the risk of losing tags, beacons, and the like lent to the worker can be reduced. In the configuration in which a beacon or the like is provided for each region, even in a case where the worker in charge of the region is temporarily absent, an image such as an arrow may be generated as if a worker in charge were present. Therefore, in a case where it cannot be recognized by image sensor 20 that the worker is in the region, an arrow image or the like pointing to another region may be projected.
In addition, if parcel sorting system 100 recognizes the coordinates of transport conveyor 50, this information can be used to define a region for each worker, and it is then not necessary to provide a beacon or the like for each region. Parcel sorting system 100 recognizes at least the coordinates of transport conveyor 50 in advance for tracking parcel P and the like. Therefore, according to this modification example, a person in charge instruction projection image can be generated without adding new hardware, markers, or the like to parcel sorting system 100. Here, the region assigned to each worker is defined by coordinates on transport conveyor 50, so the person in charge instruction projection image can be generated even if coordinates outside transport conveyor 50 are not recognized. In an actual environment, the worker is in a region outside transport conveyor 50 and is highly likely to wait in the vicinity of transport conveyor 50 to sort parcels; for this reason, even if the region assigned to each worker is approximated by coordinates on transport conveyor 50, there is no large shift in the contents of the person in charge instruction projection image. If parcel sorting system 100 can also recognize coordinates in the vicinity of transport conveyor 50, the region of each worker in charge may be defined by using coordinates outside transport conveyor 50.
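The following sketch illustrates the conveyor-coordinate variant and the fallback for an absent worker; the coordinate ranges, the presence check, and the fallback rule are all assumptions made for illustration.

```python
# Region of each worker as a range of the lengthwise coordinate on transport conveyor 50 (m).
regions = {"worker1": (0.0, 1.0), "worker2": (1.0, 2.0), "worker3": (2.0, 3.0)}

def arrow_target(parcel_x: float, present_workers: set):
    """Return the worker whose region the arrow image should indicate."""
    for worker, (start, end) in regions.items():
        if start <= parcel_x < end and worker in present_workers:
            return worker
    # The worker of the matching region is absent (or no region matches):
    # indicate another region where a worker is present instead.
    return next(iter(sorted(present_workers)), None)

print(arrow_target(1.4, {"worker1", "worker3"}))   # worker2 is absent -> "worker1"
```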
In the example described above, the person in charge instruction projection image and the projection image indicating the sorting destination or the like of the parcel are generated as a single integrated image, but the present example is not limited thereto. Each image may be generated as a separate image; in this case, for example, an image of only the arrow portion of the person in charge instruction projection image illustrated in the drawings may be generated and projected separately.
Further, in a case where a worker in charge of a parcel is not found nearby, an error pattern indicating this can be projected on the parcel.
As a method of maintaining the tracking of, and the instructions for, a worker, for example, a method of following a person indoors by using an RFID tag, a beacon, a smartphone, or the like, or person recognition by a camera can be used; accuracy can be further improved by using the worker's clothes, hats, and the like (switching colors for each person, adding a marker, and the like).
As described above, according to the present disclosure, a projection image indicating the worker in charge of picking up a parcel (for instructing a person in charge) is projected on the parcel, so the worker in charge can easily notice the existence of the parcel in his or her charge and can pick the parcel up smoothly. Meanwhile, the process flow in the flowchart described above can also be applied to this example.
Although the embodiment of a projection instruction device, a parcel sorting system, and a projection instruction method according to the present disclosure is described with reference to the drawings, the present disclosure is not limited to such an example. Those skilled in the art can conceive various modification examples, change examples, substitution examples, addition examples, deletion examples, and equivalent examples within the scope described in the claims and these rightly belong to the technical scope of the present disclosure.
The present disclosure is useful to provide a projection instruction device, a parcel sorting system, and a projection instruction method capable of indicating a worker in charge for a parcel.
Priority application: 2017-187202, Sep 2017, JP (national).
International filing: PCT/JP2018/025738, filed 7/6/2018 (WO).