This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2022-001599, filed on Jan. 7, 2022, the entire contents of which are incorporated herein by reference.
Embodiments described herein relate generally to an image capture system, a control device for this system, and a method for causing a computer to function as the control device.
For example, a solution is employed in which a camera captures an image of a merchandise rack installed in a store, the captured image is analyzed, and whether information on a shelf label attached to the merchandise rack is correct or not is thereby checked. For such a solution, the characters on the shelf label need to be recognized from the captured image, and therefore a camera with a high resolution and a narrow image capture range, that is, a so-called narrow-range camera, is preferable. Meanwhile, in the merchandise rack, many kinds of merchandise items are displayed separately on a plurality of shelves, and the width, height and the like of the shelf required for display vary depending on the merchandise item. Therefore, the shelf labels showing the prices of the individual merchandise items are not arranged at equal intervals in the vertical and horizontal directions.
In such circumstances, in order to capture an image of a shelf label no matter what position in the merchandise rack the shelf label is attached, a plurality of narrow-range cameras are installed next to each other in the direction of the height of the merchandise rack on a wheeled platform automatically traveling in the direction of the width in front of the merchandise rack. Each narrow-range camera repeatedly captures an image of the merchandise rack in timing with the automatic traveling of the wheeled platform. Constructing such an image capture system enables the acquisition of an image of the merchandise rack in which all the shelf labels attached to the merchandise rack are captured with a high resolution, and the execution of image processing for recognizing the characters on the shelf labels.
However, in the related-art image capture system, each camera repeatedly captures an image of the merchandise rack regardless of whether a shelf label is present or not, and therefore even an image that does not include any shelf label is processed. Thus, there are problems to be solved, such as an increase in the capacity of a memory storing image data and a reduction in the image processing speed.
An object of an embodiment described herein is to provide an image capture system and a control device therefor that can achieve a reduction in the capacity of the memory storing image data and an improvement in the image processing speed.
In general, according to one embodiment, a control device includes a traveling control unit and an image capture control unit. The traveling control unit controls traveling of a wheeled platform traveling along an image capture plane where a plurality of image capture target objects are arranged, spaced apart from each other in vertical and horizontal directions. The image capture control unit controls an image capture operation of a plurality of cameras installed in a direction perpendicular to a traveling direction of the wheeled platform, according to an image capture timing list set on a per camera basis, of the plurality of cameras.
An embodiment of an image capture system and a control device therefor will now be described with reference to the drawings.
In this embodiment, an image capture system for checking whether information on a shelf label attached to the front of a merchandise rack is correct or not is described as an example.
The image capture device 10 is formed of a wheeled platform 11 with a camera installation unit 12 loaded thereon. The wheeled platform 11 is a vehicle freely moving on a floor surface spreading in front of a merchandise rack where a shelf label that is an image capture target object is attached. The wheeled platform 11 is self-propelled to travel in response to a traveling instruction from the control device 20. Such a wheeled platform 11 can be paraphrased as a self-propelled robot.
The camera installation unit 12 is formed of a first camera installation unit 121 and a second camera installation unit 122. Each of the first camera installation unit 121 and the second camera installation unit 122 is fixed to a top part of the wheeled platform 11.
The first camera installation unit 121 is a structure where a plurality of first cameras 13 are installed in a line, spaced apart from each other with a predetermined space in a direction (arrow Z) perpendicular to a traveling direction (arrow X) of the wheeled platform 11 and with the lenses thereof facing the same direction. The first camera 13 is a high-resolution narrow-range camera suitable for image capture for the shelf label. In this embodiment, the number of first cameras 13 is seven. The first cameras 13 are given reference numbers “131”, “132”, “133”, “134”, “135”, “136”, and “137” in order from the nearest first camera 13 to the wheeled platform 11 and thus distinguished from each other. In the description below, the reference number “13” is used to collectively refer to the first cameras and the reference numbers “131” to “137” are used in the case of describing the individual first cameras 13.
The second camera installation unit 122 is a structure which has substantially the same shape and size as the first camera installation unit 121 and where one second camera 14 is installed at a center part in the direction of the height, with the lens thereof facing the same direction as each of the first cameras 13. The second camera 14 is a wide-range camera having a wider angle of view than the first camera 13, specifically an angle of view that enables image capture covering the entirety of the front of at least one merchandise rack.
The control device 20 is a computer device remotely controlling the traveling of the wheeled platform 11 in the image capture device 10 and the image capture operation of the first camera 13 and the second camera 14. The control device 20 has a processor 21, a main memory 22, an auxiliary memory device 23, an image capture device interface 24, a timer 25, a touch panel 26, a communication interface 27, and a system transmission path 28. The system transmission path 28 includes an address bus, a data bus, a control signal line or the like. The system transmission path 28 connects the processor 21 to the other parts directly or via a signal input/output circuit and transmits a data signal sent and received between the processor 21 and the other parts. The control device 20 forms a computer by having the processor 21 connected to the main memory 22, the auxiliary memory device 23, the image capture device interface 24, the timer 25, the touch panel 26, and the communication interface 27 via the system transmission path 28.
The processor 21 is equivalent to a control center of the computer. The processor 21 controls each part so as to implement various functions as the control device 20 according to an operating system or an application program. The processor 21 is a CPU (central processing unit), for example.
The main memory 22 is equivalent to a main memory part of the computer. The main memory 22 includes a non-volatile memory area and a volatile memory area. The main memory 22 stores an operating system or an application program in the non-volatile memory area. The main memory 22 may store data that is necessary for the processor 21 to execute processing for controlling each part, in the non-volatile or volatile memory area. The main memory 22 uses the volatile memory area as a work area where data is suitably rewritten by the processor 21. The non-volatile memory area is a ROM (read-only memory), for example. The volatile memory area is a RAM (random-access memory), for example.
The auxiliary memory device 23 is equivalent to an auxiliary memory part of the computer. For example, an EEPROM (electrically erasable programmable read-only memory), an HDD (hard disk drive), or an SSD (solid-state drive) or the like can serve as the auxiliary memory device 23. In the auxiliary memory device 23, data used by the processor 21 to execute various kinds of processing and data generated through the processing by the processor 21, or the like, are saved. The auxiliary memory device 23 may store the application program.
The image capture device interface 24 is an interface for performing data communication with the image capture device 10, using wireless or wired communication. The image capture device interface 24 transmits, for example, a data signal relating to a traveling instruction such as the start, the stop, the traveling velocity, or the traveling direction of the wheeled platform 11, to the image capture device 10. The image capture device interface 24 receives image data captured by each of the first camera 13 and the second camera 14.
The timer 25 is a peripheral circuit having a function of tracking a set time in response to an instruction from the processor 21 and reporting a time-out every time the timer 25 finishes the tracking of time. The set time will be described later.
The touch panel 26 functions as an input device and a display device of the control device 20. An operator of the control device 20 can operate the touch panel 26 to input necessary data for the control of the image capture device 10, or the like, to the control device 20. The operator can also check the result of image processing by an image processing device 30, described later, from information displayed on the touch panel 26.
The communication interface 27 is an interface for performing data communication with the image processing device 30 connected via a communication line, for example, the internet or the like. The image processing device 30 is a computer device having a processing function of checking whether information on a shelf label is correct or not, based on an image of the shelf label captured by the image capture device 10. The image processing by the image processing device 30 may be implemented, for example, by cloud computing technology.
The merchandise racks Sa, Sb, Sc, Sd have substantially the same depth and height but have different lateral widths. In an example, a lateral width Wa of the merchandise rack Sa and a lateral width Wc of the merchandise rack Sc are equal, whereas a lateral width Wb of the merchandise rack Sb is narrower than the lateral width Wa or the lateral width Wc, and a lateral width Wd of the merchandise rack Sd is even narrower than the lateral width Wb. The merchandise racks Sa, Sb, Sc, Sd may have any number of shelves. Each shelf may be at any height and is suitably changed according to the number of merchandise items displayed there, the size of the merchandise items, and the like.
Meanwhile, the number of shelves in the merchandise rack Sb is four. Two shelf labels are attached to the first shelf at the bottom and the third shelf. Three shelf labels are attached to the second shelf and the fourth shelf. That is, at the front of the merchandise rack Sb, a total of 10 shelf labels 40, which are image capture target objects, are arranged, spaced apart from each other with a suitable space in the direction of the height of the merchandise rack Sb, that is, the vertical direction, and in the direction of the lateral width, that is, the horizontal direction.
In this way, the merchandise rack Sa and the merchandise rack Sb have different numbers of shelves from each other. Also, the height of each shelf is different between the merchandise rack Sa and the merchandise rack Sb. Therefore, the shelf labels 40 arranged in the merchandise rack Sa and the shelf labels 40 arranged in the merchandise rack Sb are arranged at positions that are different in the vertical direction.
In such a configuration, the control device 20 stores a rack data table 50 having a data structure shown in
The serial number is a successive number starting with “1” and the maximum value thereof coincides with the number of merchandise racks to which a shelf label as an image capture target object is attached. Therefore, in the example shown in
The rack ID is a unique code set on a per merchandise rack basis in order to identify each of the merchandise racks Sa, Sb, Sc, Sd. In the rack data table 50, the rack IDs are described in order from the merchandise rack near the standby point of the wheeled platform 11, in correlation with the serial numbers “1” to “4”. Therefore, in the example shown in
The X-coordinate and the Y-coordinate are coordinate values on the XY plane of a bottom end near the standby position of the wheeled platform 11, of the merchandise racks Sa, Sb, Sc, Sd identified by the corresponding rack ID. Therefore, in the example shown in
The width is the lateral width of the merchandise racks Sa, Sb, Sc, Sd identified by the corresponding rack ID. Therefore, in the example shown in
The angle is the angle formed by the direction of the lateral width of the merchandise rack Sa, Sb, Sc, Sd identified by the corresponding rack ID to the X-axis of the XY plane. Therefore, in the example shown in
The rack data table 50 having such a data structure is a data table describing data that is necessary for the processor 21 of the control device 20 to control the traveling of the wheeled platform 11 of the image capture device 10 and to control the image capture timing of the first camera 13 and the second camera 14.
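For illustration only, one row of the rack data table 50 can be sketched in Python as follows. The field values below are made-up examples and the class and variable names are assumptions; the table itself specifies only the serial number, rack ID, X-coordinate, Y-coordinate, width, and angle described above.

```python
from dataclasses import dataclass

@dataclass
class RackData:
    serial: int    # successive number starting with "1"
    rack_id: str   # unique code identifying the merchandise rack
    x: float       # X-coordinate of the bottom end near the standby position
    y: float       # Y-coordinate of the same point
    width: float   # lateral width of the rack
    angle: float   # angle of the width direction to the X-axis (degrees)

# Illustrative table for the four racks; coordinates and widths are invented.
rack_table = [
    RackData(1, "Sa", 0.0, 0.0, 3.6, 0.0),
    RackData(2, "Sb", 4.0, 0.0, 2.7, 0.0),
    RackData(3, "Sc", 7.0, 0.0, 3.6, 0.0),
    RackData(4, "Sd", 11.0, 0.0, 1.8, 0.0),
]
```

As in the table, the maximum serial number coincides with the number of merchandise racks, and the rows are ordered from the rack nearest the standby point.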
In
In
The processor 21 controls the traveling of the wheeled platform 11 in such a way that the image capture device 10 arrives at the image capture point of the first camera 13 or the image capture point of the second camera 14 for each merchandise rack Sa, Sb, Sc, Sd, based on the data in the rack data table 50. The processor 21 also controls the image capture timing of the first camera 13 and the second camera 14 in such a way that the first camera 13 performs image capture at the image capture point of the first camera 13 and that the second camera 14 performs image capture at the image capture point of the second camera 14.
The traveling control unit 211 is a function of controlling the constant-velocity traveling of the wheeled platform 11 traveling along an image capture plane where a plurality of image capture target objects, that is, the shelf labels 40, are arranged, spaced apart from each other in the vertical and horizontal directions, that is, along the front of the merchandise racks Sa, Sb, Sc, Sd.
The image capture control unit 212 is a function of controlling the image capture operation at the narrow-range image capture point, of a plurality of cameras, that is, the first cameras 13, installed in the direction perpendicular to the traveling direction of the wheeled platform 11, according to an image capture timing list set on a per first camera 13 basis, of the plurality of first cameras 13.
The list generation unit 213 is a function of specifying the positions of a plurality of image capture target objects, that is, the shelf labels 40, arranged on the image capture plane, from a captured image of the image capture plane, that is, the front of the merchandise racks Sa, Sb, Sc, Sd, and generating an image capture timing list, based on the positions of the plurality of image capture target objects and the image capture ranges of the plurality of first cameras 13.
The image capture point decision unit 214 is a function of deciding the image capture point of the second camera 14 with respect to the image capture plane, that is, the wide-range image capture point for each of the merchandise racks Sa, Sb, Sc, Sd. Incidentally, the first narrow-range image capture point is a point shifted from the point of the coordinates (X, Y) of each of the merchandise racks Sa, Sb, Sc, Sd described in the rack data table 50 by half the length T (T/2) of the side in the horizontal direction of the image capture range 60. Each of the other narrow-range image capture points is shifted from the preceding one by the length T of the side in the horizontal direction of the image capture range 60.
The image acquisition unit 215 is a function of moving the wheeled platform 11 to the wide-range image capture point decided by the image capture point decision unit 214 before the traveling control unit 211 controls the traveling of the wheeled platform 11, then causing the second camera 14 to perform image capture, and thus acquiring an image that is necessary for generating the image capture timing list.
The output unit 216 is a function of outputting the images captured by the plurality of first cameras 13 under the control of the image capture control unit 212, to the image processing device 30.
All of the functions as the traveling control unit 211, the image capture control unit 212, the list generation unit 213, the image capture point decision unit 214, the image acquisition unit 215, and the output unit 216 are implemented by information processing executed by the processor 21 according to a control program. The control program is a type of application program stored in the main memory 22 or the auxiliary memory device 23. The method of installing the control program in the main memory 22 or the auxiliary memory device 23 is not particularly limited. The control program may be recorded in a removable recording medium or distributed by communication via a communication network and can thus be installed in the main memory 22 or the auxiliary memory device 23. The recording medium may be any form of recording medium that can store a program and is readable by a device, such as a CD-ROM or a memory card.
For example, when the control start time comes and the control program starts, the processor 21 starts the procedures shown in the flowchart of
After finishing the processing of ACT 2 and ACT 3, the processor 21 increments the value of the first counter n by “1” in ACT 4. In ACT 5, the processor 21 checks whether the value of the first counter n exceeds the value of the register memory N or not.
If the value of the first counter n does not exceed the value of the register memory N (NO in ACT 5), the processor 21 proceeds to ACT 6. In ACT 6, the processor 21 acquires n-th rack data (rack ID, X-coordinate, Y-coordinate, width, angle) corresponding to a serial number equal to the value of the first counter n, from the rack data table 50. For example, if the value of the first counter n is “1”, the processor 21 acquires the rack data of the merchandise rack Sa.
In ACT 7, the processor 21 decides a wide-range image capture point for the merchandise rack identified by the rack ID of the rack data. For example, if the value of the first counter n is “1”, the processor 21 decides a point that is advanced from the point of the coordinates (Xa, Ya) by a distance half the width Wa in a direction tilting from the X-axis by an angle θa and that is spaced apart from the front of the merchandise rack Sa by the distance Lb, as the wide-range image capture point.
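The geometry of ACT 7 can be sketched, purely for illustration, as follows. The function name, the units, and the sign chosen for the perpendicular offset are assumptions; the embodiment only states that the point is advanced from the coordinates by half the width along the width direction and spaced apart from the rack front by the distance Lb.

```python
import math

def wide_range_point(x, y, width, angle_deg, standoff):
    """Advance from the rack's near bottom-end (x, y) by half the rack
    width along the width direction (tilted from the X-axis by angle_deg),
    then step away from the rack front by the standoff distance Lb.
    The perpendicular direction's sign is an assumption for illustration."""
    a = math.radians(angle_deg)
    # midpoint of the rack front
    mx = x + (width / 2.0) * math.cos(a)
    my = y + (width / 2.0) * math.sin(a)
    # offset perpendicular to the width direction, away from the rack
    px = mx + standoff * math.sin(a)
    py = my - standoff * math.cos(a)
    return px, py
```

For the merchandise rack Sa with an angle of 0 degrees, the point lies at the lateral center of the rack front, offset by the standoff distance.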
In ACT 8, the processor 21 moves the wheeled platform 11 of the image capture device 10 to the wide-range image capture point. For example, if the value of the first counter n is “1”, the wheeled platform 11 is standing still at the standby point. The processor 21 controls the traveling direction, the traveling velocity and the like of the wheeled platform 11 in such a way that the wheeled platform 11 moves from the standby point to the wide-range image capture point and stops there.
When the wheeled platform 11 has stopped at the wide-range image capture point, the processor 21 executes wide-range image capture processing in ACT 9. Details of the wide-range image capture processing will be described later. During the execution of the wide-range image capture processing, the wheeled platform 11 does not move from the wide-range image capture point. After finishing the wide-range image capture processing, the processor 21 in ACT 10 decides a narrow-range image capture start point for the merchandise rack identified by the rack ID of the rack data. For example, if the value of the first counter n is “1”, the processor 21 decides a point spaced apart from the point of the coordinates (Xa, Ya) by the distance La, as the narrow-range image capture start point.
In ACT 11, the processor 21 moves the wheeled platform 11 of the image capture device 10 to the narrow-range image capture start point. For example, if the value of the first counter n is “1”, the processor 21 controls the traveling direction, the traveling velocity and the like of the wheeled platform 11 in such a way that the wheeled platform 11 moves from the wide-range image capture point to the narrow-range image capture start point for the merchandise rack Sa and stops there.
When the wheeled platform 11 has stopped at the narrow-range image capture start point, the processor 21 in ACT 12 executes narrow-range image capture processing. Details of the narrow-range image capture processing will be described later. After finishing the narrow-range image capture processing, the processor 21 returns to ACT 4. The processor 21 increments the value of the first counter n further by “1”. If the value of the first counter n does not exceed the value of the register memory N, the processor 21 executes the processing of ACT 6 to ACT 12 similarly to the above. That is, the processor 21 decides a wide-range image capture point for the merchandise rack Sb, for example, based on the rack data of the merchandise rack Sb, then moves the wheeled platform 11 to the wide-range image capture point, and executes the wide-range image capture processing. The processor 21 also decides a narrow-range image capture start point for the merchandise rack Sb, based on the rack data of the merchandise rack Sb, then moves the wheeled platform 11 to the narrow-range image capture start point, and executes the narrow-range image capture processing.
From this point onward, the processor 21 alternately executes the wide-range image capture processing and the narrow-range image capture processing, for example, based on the rack data of the merchandise rack Sc and also the rack data of the merchandise rack Sd. If the value of the first counter n exceeds the value of the register memory N (YES in ACT 5), the processor 21 proceeds to ACT 13. In ACT 13, the processor 21 outputs the image data captured by the plurality of first cameras 13 to the image processing device 30 via the communication interface 27. In ACT 14, the processor 21 controls the traveling direction, the traveling velocity and the like of the wheeled platform 11 in such a way that the wheeled platform 11 moves to the standby position and stops there. Then, the processor 21 ends the information processing according to the control program.
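The overall flow of ACT 4 to ACT 14 can be sketched, for illustration, in Python as follows. The `device` object and its method names are assumptions introduced only to make the control flow readable; they are not part of the embodiment.

```python
def run_capture(rack_table, device):
    """Sketch of the main flow: iterate over the racks in the rack data
    table, alternating wide-range and narrow-range image capture, then
    output the images and return to standby."""
    n = 0                        # first counter n
    N = len(rack_table)          # register memory N
    while True:
        n += 1                   # ACT 4: increment the first counter
        if n > N:                # ACT 5: all racks processed?
            break
        rack = rack_table[n - 1]                         # ACT 6
        wp = device.decide_wide_point(rack)              # ACT 7
        device.move_to(wp)                               # ACT 8
        timing_list = device.wide_range_capture(rack)    # ACT 9
        sp = device.decide_narrow_start_point(rack)      # ACT 10
        device.move_to(sp)                               # ACT 11
        device.narrow_range_capture(rack, timing_list)   # ACT 12
    device.output_images()       # ACT 13: send images for processing
    device.return_to_standby()   # ACT 14: move back to the standby point
```

The loop structure mirrors the flowchart: the counter test in ACT 5 is the only exit into the output and return-to-standby steps.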
On entering the wide-range image capture processing, the processor 21 in ACT 21 outputs an image capture ON signal to the wide-range camera, that is, the second camera 14. On receiving the image capture ON signal, the second camera 14 executes an image capture operation. Thus, an image of the entire front of the merchandise rack Sa is captured by the second camera 14 and the image data thereof is sent to the control device 20 via the image capture device interface 24.
In ACT 22, the processor 21 estimates the position of the shelf label 40 arranged in the merchandise rack Sa, based on the image data. For the estimation of the shelf label position, for example, a learning technique of a hierarchical neural network called a deep neural network (DNN) is used. That is, a DNN model for the detection of the shelf label is designed and the DNN model is installed in the control device 20. The processor 21 inputs the captured image of the merchandise rack Sa to the DNN model. Then, due to the action of the DNN model, each shelf label 40 arranged in the merchandise rack Sa is detected and the position of each shelf label 40 is estimated. In ACT 23, the processor 21 generates a shelf label position image 70 in which the position of the shelf label 40 is filled in black, as shown in
If an image of the merchandise rack Sb having a narrower lateral width than the merchandise rack Sa is captured by the second camera 14, the shelf label 40 in the merchandise rack Sa or the merchandise rack Sc next to the merchandise rack Sb may appear in the captured image. In this case, the processor 21 ignores the shelf label 40 detected outside the frame of the merchandise rack Sb and specifies only the position of the shelf label 40 detected within the frame.
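The handling of labels detected outside the target rack's frame can be sketched, for illustration, as follows. Representing each detection as a center point and the frame as a bounding box are assumptions made only for the sketch; the embodiment states only that detections outside the frame are ignored.

```python
def labels_within_frame(detections, frame):
    """Keep only shelf labels detected inside the target rack's frame.
    `detections` holds estimated label center points (x, y); `frame` is
    (x_min, y_min, x_max, y_max) in the same image coordinates."""
    x0, y0, x1, y1 = frame
    return [(x, y) for (x, y) in detections if x0 <= x <= x1 and y0 <= y <= y1]
```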
In ACT 24, the processor 21 calculates a number of times of narrow-range image capture Tm. Specifically, the processor 21 divides the lateral width Wa of the merchandise rack Sa included in the rack data by the length T of the side in the horizontal direction of the image capture range 60 employed when the first camera 13 at the narrow-range image capture point captures an image of the merchandise rack Sa, and defines the quotient (where the numbers after the decimal point are rounded up) as the number of times of narrow-range image capture Tm.
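The calculation in ACT 24 is a rounded-up division, which can be written as follows. The integer units (for example, centimeters) are an assumption made to keep the division exact.

```python
import math

def narrow_capture_count(rack_width, capture_width):
    """ACT 24: number of times of narrow-range image capture Tm, i.e.
    the rack's lateral width divided by the length T of the horizontal
    side of the image capture range, rounded up."""
    return math.ceil(rack_width / capture_width)
```

For example, a rack 360 cm wide captured with a 60 cm image capture range needs six narrow-range image capture points, and a 350 cm rack still needs six because the remainder is rounded up.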
In ACT 25, the processor 21 generates a timing list 80 (see
In ACT 26, the processor 21 resets the value of a second counter Q to “0”. Next, in ACT 27, the processor 21 increments the value of the second counter Q by “1”. In ACT 28, the processor 21 checks whether the value of the second counter Q exceeds the number of times of narrow-range image capture Tm or not.
If the value of the second counter Q does not exceed the number of times of narrow-range image capture Tm (NO in ACT 28), the processor 21 proceeds to ACT 29. In ACT 29, the processor 21 resets the value of a third counter P to “0”. Next, in ACT 30, the processor 21 increments the value of the third counter P by “1”. In ACT 31, the processor 21 checks whether the value of the third counter P exceeds the number of first cameras 13 or not.
If the value of the third counter P does not exceed the number of first cameras 13 (NO in ACT 31), the processor 21 proceeds to ACT 32. In ACT 32, the processor 21 superimposes, on the shelf label position image 70, the image capture range 60 employed when the first camera 13 corresponding to the row number P performs image capture at the narrow-range image capture point corresponding to the column number Q.
In ACT 33, the processor 21 checks whether the shelf label 40 is included in the area where the image capture range 60 is superimposed, in the shelf label position image 70, or not. If the shelf label 40 is not included in the area where the image capture range 60 is superimposed, in the shelf label position image 70 (NO in ACT 33), the processor 21 proceeds to ACT 34. In ACT 34, the processor 21 sets the image capture flag PQF of the data area PQ specified by the row number P and the column number Q to “0”. If the shelf label 40 is included in the area where the image capture range 60 is superimposed, in the shelf label position image 70 (YES in ACT 33), the processor 21 proceeds to ACT 35. In ACT 35, the processor 21 sets the image capture flag PQF of the data area PQ to “1”.
After finishing the processing of ACT 34 or ACT 35, the processor 21 returns to ACT 30. The processor 21 then executes the processing from ACT 30 onward, similarly to the above. That is, the processor 21 repeatedly executes the processing of ACT 32 to ACT 35 until the value of the third counter P exceeds the number of first cameras 13.
If the third counter P exceeds the number of first cameras 13 (YES in ACT 31), the processor 21 returns to ACT 27. The processor 21 then executes the processing from ACT 27 onward, similarly to the above. That is, the processor 21 increments the value of the second counter Q further by “1”. If the value of the second counter Q does not exceed the number of times of narrow-range image capture Tm, the processor 21 resets the value of the third counter P to “0”. Subsequently, the processor 21 executes the processing of ACT 32 to ACT 35 every time the value of the third counter P is incremented. Then, if the value of the third counter P exceeds the number of first cameras 13, the processor 21 returns to ACT 27 again and increments the value of the second counter Q further by “1”.
If the value of the second counter Q exceeds the number of times of narrow-range image capture Tm (YES in ACT 28), the processor 21 proceeds to ACT 36. In ACT 36, the processor 21 saves, in the auxiliary memory device 23, the timing list 80 where the image capture flag PQF of “1” or “0” is described in the P×Q data areas PQ corresponding to the individual row numbers P and the individual column numbers Q. Then, the processor 21 exits the wide-range image capture processing for the merchandise rack Sa.
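The nested loops of ACT 26 to ACT 36 can be sketched as follows. The callback `has_label_in_area(p, q)` stands in for the superimposition check against the shelf label position image 70 (ACT 32 and ACT 33) and is an assumption for the sketch.

```python
def generate_timing_list(has_label_in_area, num_cameras, tm):
    """Build the timing list: for every narrow-range image capture point
    (column number Q) and every first camera (row number P), set the
    image capture flag PQF to 1 when a shelf label falls inside that
    camera's superimposed image capture range, else 0."""
    timing_list = {}
    for q in range(1, tm + 1):               # ACT 27/28: column number Q
        for p in range(1, num_cameras + 1):  # ACT 30/31: row number P
            # ACT 33-35: set the flag according to the superimposition check
            timing_list[(p, q)] = 1 if has_label_in_area(p, q) else 0
    return timing_list
```

The result holds one flag per (P, Q) pair, matching the P×Q data areas saved in ACT 36.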
Similarly, with respect to the image capture range 62 of the first camera 132 corresponding to the row number P=2, the shelf label is included in the areas of the column numbers Q=1, 2, 3, 4, 5. Consequently, image capture flags 21F, 22F, 23F, 24F, 25F are “1” and an image capture flag 26F is “0”, as shown in
With respect to the image capture range 63 of the first camera 133 corresponding to the row number P=3, the shelf label is not included in any one of the areas of all the column numbers Q. Consequently, all of image capture flags 31F, 32F, 33F, 34F, 35F, 36F are “0”, as shown in
With respect to the image capture range 64 of the first camera 134 corresponding to the row number P=4, the shelf label is included in the areas of the column numbers Q=2, 3, 4, 5. Consequently, image capture flags 42F, 43F, 44F, 45F are “1” and image capture flags 41F, 46F are “0”, as shown in
With respect to the image capture range 65 of the first camera 135 corresponding to the row number P=5, the shelf label is included in the areas of the column numbers Q=1, 2, 4, 5, 6. Consequently, image capture flags 51F, 52F, 54F, 55F, 56F are “1” and an image capture flag 53F is “0”, as shown in
With respect to the image capture range 66 of the first camera 136 corresponding to the row number P=6, the shelf label is not included in any one of the areas of all the column numbers Q. Consequently, all of image capture flags 61F, 62F, 63F, 64F, 65F, 66F are “0”.
With respect to the image capture range 67 of the first camera 137 corresponding to the row number P=7, the shelf label is included in the areas of all the column numbers Q. Consequently, all of image capture flags 71F, 72F, 73F, 74F, 75F, 76F are “1”.
Thus, in the wide-range image capture processing for the merchandise rack Sa, the timing list 80 shown in
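The flags enumerated above for the merchandise rack Sa can be collected, for illustration, into the following sketch. Row P = 1 is omitted because its flags are not enumerated in this passage; the list and function names are assumptions.

```python
# Image capture flags for rows P = 2 to 7, columns Q = 1 to 6,
# exactly as enumerated above for the merchandise rack Sa.
timing_list_80 = {
    2: [1, 1, 1, 1, 1, 0],
    3: [0, 0, 0, 0, 0, 0],
    4: [0, 1, 1, 1, 1, 0],
    5: [1, 1, 0, 1, 1, 1],
    6: [0, 0, 0, 0, 0, 0],
    7: [1, 1, 1, 1, 1, 1],
}

def cameras_to_fire(timing_list, q):
    """Row numbers P whose image capture flag is 1 in column Q."""
    return [p for p, flags in sorted(timing_list.items()) if flags[q - 1] == 1]
```

For example, at the first narrow-range image capture point (Q = 1), only the rows with a flag of “1” in that column fire, so the first cameras 133 and 136 (rows P = 3 and P = 6) never perform image capture for this rack at all.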
On entering the narrow-range image capture processing, the processor 21 in ACT 41 resets the value of a fourth counter R to “0”. Next, in ACT 42, the processor 21 increments the value of the fourth counter R by “1”. In ACT 43, the processor 21 checks whether the value of the fourth counter R is “1” or not. If the value of the fourth counter R is “1” (YES in ACT 43), the processor 21 proceeds to ACT 44. In ACT 44, the processor 21 sets a time-out time t/2 in the timer 25. The time-out time t/2 is half the time taken for the wheeled platform 11 to move the length T of the side in the horizontal direction of the image capture range 60.
If the value of the fourth counter R is not “1”, that is, if the value is “2” or greater (NO in ACT 43), the processor 21 proceeds to ACT 45. In ACT 45, the processor 21 checks whether the value of the fourth counter R exceeds the number of times of narrow-range image capture Tm or not.
If the value of the fourth counter R does not exceed the number of times of narrow-range image capture Tm (NO in ACT 45), the processor 21 proceeds to ACT 46. In ACT 46, the processor 21 sets a time-out time t in the timer 25. The time-out time t is the time taken for the wheeled platform 11 to move the length T of the side in the horizontal direction of the image capture range 60.
After finishing the processing of ACT 44 or ACT 46, the processor 21 proceeds to ACT 47. In ACT 47, the processor 21 controls the start of the traveling of the wheeled platform 11. In ACT 48, the processor 21 causes the timer 25 to start. In ACT 49, the processor 21 waits for the timer 25 to reach a time-out. If the timer 25 reaches a time-out (YES in ACT 49), the processor 21 proceeds to ACT 50. In ACT 50, the processor 21 causes the wheeled platform 11 to stop traveling.
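The time-out times set in ACT 44 and ACT 46 follow from the constant traveling velocity: moving the horizontal side length T of the image capture range takes t = T / v, and the first step takes half of that. A minimal sketch, with illustrative names:

```python
def step_time(side_length, velocity, first_step=False):
    """Time-out for one traveling step of the wheeled platform: t = T / v
    for a full step (ACT 46), t / 2 for the first step to the center of
    the first image capture range (ACT 44)."""
    t = side_length / velocity
    return t / 2 if first_step else t
```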
In the narrow-range image capture processing after the wheeled platform 11 moves to the narrow-range image capture start point in ACT 11 in
After finishing the processing of ACT 50, the processor 21 in ACT 51 acquires, from the timing list 80, the image capture flags PQF in the column of the column number Q corresponding to the value of the fourth counter R. In ACT 52, the processor 21 selects each first camera 13 corresponding to an image capture flag PQF of “1”.
In ACT 53, the processor 21 outputs an image capture ON signal to each selected first camera 13. The processor 21 does not output an image capture ON signal to any first camera 13 that is not selected. Each first camera 13 that receives the image capture ON signal executes an image capture operation. In ACT 54, the processor 21 takes in the images captured by the first cameras 13 that received the image capture ON signal and stores the images in a memory M for storing captured images. The memory M is a part of the volatile area in the main memory 22.
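The selection of ACT 51 to ACT 54 can be sketched as follows. The representation of the timing list 80 as a mapping from (P, Q) to a flag string, and the `capture()` method on the camera objects, are assumptions made for this sketch.

```python
def cameras_to_fire(timing_list, column_q, num_rows=7):
    """ACT 51-52: return the row numbers P whose flag PQF is "1"
    in the column of column number Q.

    timing_list -- assumed mapping (P, Q) -> "0" or "1"
    """
    return [p for p in range(1, num_rows + 1)
            if timing_list.get((p, column_q)) == "1"]

def capture_column(cameras, timing_list, column_q, memory_m):
    """ACT 53-54: trigger only the selected first cameras and store
    their images in memory M (modeled here as a list)."""
    for p in cameras_to_fire(timing_list, column_q):
        image = cameras[p].capture()   # hypothetical camera interface
        memory_m.append((p, column_q, image))
```

Cameras whose flag is “0” are never triggered, so no image lacking a shelf label enters the memory M.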
Therefore, when the value of the fourth counter R is “1”, the processor 21 acquires the image capture flags 11F, 21F, 31F, 41F, 51F, 61F, 71F. As shown in
After finishing storing all of the images captured by the first camera 132, the first camera 135, and the first camera 137 in the memory M, the processor 21 returns to ACT 42. That is, the processor 21 increments the value of the fourth counter R further by “1”. Thus, the value of the fourth counter R becomes “2”, which does not exceed the number of times of narrow-range image capture Tm. Therefore, the processor 21 sets the time-out time t in the timer 25. The processor 21 then controls the start of the traveling of the wheeled platform 11 and causes the timer 25 to start. If the timer 25 reaches a time-out, the processor 21 causes the wheeled platform 11 to stop traveling. The velocity of the wheeled platform 11 in this case, too, is such a velocity that the wheeled platform 11 moves half the length T of the side in the horizontal direction of the image capture range 60 while the timer 25 tracks the time t/2, that is, such a velocity that the wheeled platform 11 moves the length T of the side in the horizontal direction of the image capture range 60 while the timer 25 tracks the time t. Therefore, the wheeled platform 11 stops at a point away from the narrow-range image capture start point of the coordinates (Xa, Ya) by a distance 3T/2 in the direction of the width of the merchandise rack Sa (X-direction).
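The stop positions follow directly from the two time-out values: the first leg (time t/2) covers a distance T/2, and every later leg (time t) covers a distance T, so the R-th stop lies at (2R − 1)T/2 from the narrow-range image capture start point. A one-line helper expresses this relation:

```python
def stop_offset(r, t_len):
    """X-direction distance from the narrow-range image capture start
    point to the R-th stop: T/2 for R=1, then one added T per column."""
    return (2 * r - 1) * t_len / 2.0
```

For R = 1, 2, 3 this yields T/2, 3T/2, and 5T/2, matching the distances stated above.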
At this time, the processor 21 acquires the image capture flags 12F, 22F, 32F, 42F, 52F, 62F, 72F. As shown in
Subsequently, the value of the fourth counter R becomes “3”. In this case, too, the processor 21 sets the time-out time t in the timer 25 because the value of the fourth counter R does not exceed the number of times of narrow-range image capture Tm. The processor 21 then controls the start of the traveling of the wheeled platform 11 and causes the timer 25 to start. If the timer 25 reaches a time-out, the processor 21 causes the wheeled platform 11 to stop traveling. The velocity of the wheeled platform 11 in this case, too, is the same as when the value of the fourth counter R is “1” or “2”. Therefore, the wheeled platform 11 stops at a point away from the narrow-range image capture start point of the coordinates (Xa, Ya) by a distance 5T/2 in the direction of the width of the merchandise rack Sa (X-direction).
At this time, the processor 21 acquires the image capture flags 13F, 23F, 33F, 43F, 53F, 63F, 73F. As shown in
From this point onward, every time the value of the fourth counter R is incremented to “4”, “5”, and “6”, the processor 21 executes processing similar to when the value of the fourth counter R is “2” or “3”. Consequently, when the value of the fourth counter R is “4”, images captured by the first camera 131, the first camera 132, the first camera 134, the first camera 135, and the first camera 137 are stored in the memory M. When the value of the fourth counter R is “5”, again, images captured by the first camera 131, the first camera 132, the first camera 134, the first camera 135, and the first camera 137 are stored in the memory M. When the value of the fourth counter R is “6”, images captured by the first camera 131, the first camera 135, and the first camera 137 are stored in the memory M.
Subsequently, the value of the fourth counter R becomes “7”, which exceeds the number of times of narrow-range image capture Tm. If the value of the fourth counter R exceeds the number of times of narrow-range image capture Tm (YES in ACT 45), the processor 21 proceeds to ACT 55. In ACT 55, the processor 21 clears the timing list 80. Then, the processor 21 exits the narrow-range image capture processing for the merchandise rack Sa.
Subsequently, the processor 21 decides a wide-range image capture point for the merchandise rack Sb, moves the wheeled platform 11 to the wide-range image capture point, and executes the wide-range image capture processing for the merchandise rack Sb. Then, the processor 21 decides a narrow-range image capture start point for the merchandise rack Sb, moves the wheeled platform 11 to the narrow-range image capture start point, and executes the narrow-range image capture processing for the merchandise rack Sb.
The processor 21 executes the processing of ACT 47 to ACT 50 in
The processor 21 executes the processing of ACT 6 and ACT 7 in
In this way, with the control device 20 having the functions as the traveling control unit 211 and the image capture control unit 212, each first camera 13 does not capture an image that does not include the shelf label 40. Therefore, the capacity of the memory M storing image data captured by each first camera 13 can be saved. Also, since the control device 20 does not process image data that does not include the shelf label 40, the overall speed for data processing can be increased.
The control device 20 has the list generation unit 213. Thus, a separate device for generating the timing list 80 is not needed and therefore the time and effort taken for generating the timing list 80 can be reduced.
The control device 20 also has the image capture point decision unit 214 and the image acquisition unit 215. The control device 20 causes the list generation unit 213 to generate the timing list 80, based on an image acquired by the image acquisition unit 215. Therefore, for example, even if the position where the shelf label 40 is attached in a merchandise rack is changed, the timing list 80 corresponding to the change can be easily generated and this enhances versatility.
The control device 20 also has the output unit 216. Therefore, the image processing device 30 does not process unwanted image data of an image that does not include the shelf label 40, either. This can achieve a load reduction effect on the image processing device 30. Also, the amount of communication traffic between the control device 20 and the image processing device 30 can be reduced.
Thus, the image capture system 100, which can reduce the capacity of the memory required for storing image data and improve the speed of image processing, can be provided.
An embodiment of the image capture system 100 and the control device 20 therefor was described above. However, this embodiment is not limiting.
For example, the control device 20 can also be applied to cases other than recognizing characters on the shelf label 40 arranged in a merchandise rack. For example, the control device 20 can also be applied to an image capture system in which a first camera captures an image of a barcode printed on a cardboard box randomly placed in a warehouse or the like and in which the barcode is recognized from the captured image.
In the embodiment, the case where the individual merchandise racks have the same height is described as an example. However, the individual merchandise racks may have different heights from each other. In this case, the number of first cameras 13 installed in the first camera installation unit 121 may be decided, based on the tallest merchandise rack.
In the embodiment, the predetermined interval between the narrow-range image capture points next to each other is equal to the length T of the side in the horizontal direction of the image capture range 60. This interval may be slightly shorter than the length T and images sequentially captured by the same first camera 13 may partly overlap each other.
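The shortened interval in this modification can be expressed as a simple relation. The overlap fraction parameter below is an assumption introduced for illustration; the embodiment itself corresponds to an overlap fraction of zero.

```python
def capture_interval(t_len, overlap_fraction=0.0):
    """Spacing between adjacent narrow-range image capture points.

    overlap_fraction -- assumed fraction of the frame width T by which
    consecutive images from the same first camera 13 should overlap
    (0.0 reproduces the embodiment's spacing equal to T).
    """
    if not 0.0 <= overlap_fraction < 1.0:
        raise ValueError("overlap fraction must be in [0, 1)")
    return t_len * (1.0 - overlap_fraction)
```

A small nonzero overlap fraction gives the partly overlapping images described above, which can help when a shelf label straddles the boundary between two adjacent image capture ranges.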
The embodiment describes, as an example, the case where the wide-range image capture processing is executed for one merchandise rack to generate a timing list 80, the narrow-range image capture processing is then executed using that timing list 80, and the timing list 80 is cleared. Alternatively, the wide-range image capture processing may first be executed sequentially for two or more merchandise racks to generate a timing list 80 for each of the merchandise racks, and the narrow-range image capture processing may subsequently be executed on a per-merchandise-rack basis, using the timing list 80 corresponding to that merchandise rack.
The timing list 80 may not necessarily correspond one-to-one to the merchandise rack. If two merchandise racks Sa, Sb are next to each other as shown in
While some embodiments of the present disclosure have been described, these embodiments are presented simply as examples and are not intended to limit the scope of the present disclosure. These novel embodiments can be carried out in various other forms and can include various omissions, replacements, and modifications without departing from the spirit and scope of the present disclosure. These embodiments and the modifications thereof are included in the scope of the present disclosure and also included in the scope of the claims and equivalents thereof.
Number | Date | Country | Kind
---|---|---|---
2022-001599 | Jan 2022 | JP | national