This disclosure relates to a carrying system, in particular to an automated carrying system.
In order to save labor costs and improve management efficiency, warehouse systems have been developing towards automation, which has led to the rise of automated carrying systems. Take self-guided forklifts as an example: a control center of the warehouse system can specify a self-guided forklift as well as an initial position and a target position of the goods to be carried (hereinafter referred to as “target goods”), so that the specified self-guided forklift moves automatically to the initial position without manual operation and carries the target goods placed at the initial position to the target position, thereby completing the carrying task.
Due to the configuration of known automated carrying systems, however, the aforementioned initial position and target position can only be specified as precise positions. For example, the user may specify a fixed point on a warehouse map of the control center via a user interface, or manually enter the coordinates of the fixed point. Nevertheless, if a warehouse worker accidentally places the target goods askew, or accidentally knocks the target goods while putting other goods in place, leaving the target goods out of position, the self-guided forklift will be unable to locate the target goods and complete the carrying task. Likewise, if the target position can only be a fixed point, problems easily occur: goods may fail to be unloaded because other goods already occupy the target position, or a number of self-guided forklifts may have to wait in line to unload. In addition, since the initial position and the target position can only be fixed points, when the target goods are placed in different positions, the user has to repeatedly specify the self-guided forklift, the initial position and the target position via the user interface in order to carry all the target goods, which is quite inconvenient.
According to an embodiment of the disclosure, an automated carrying system is provided, which comprises a control center and self-guided transport equipment. The control center is used for providing command information, which includes a target area, target goods and a delivery destination. The self-guided transport equipment is electrically connected to the control center. The automated carrying system is configured to perform the following steps: controlling the self-guided transport equipment to enter the target area according to the command information; controlling the self-guided transport equipment to capture images in the target area; determining whether the image contains goods; if the image contains goods, determining whether the goods are the target goods; and if the goods are the target goods, controlling the self-guided transport equipment to pick up and carry the goods to the delivery destination.
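The step sequence above can be condensed into a simple control loop. The following is a minimal illustrative sketch, not the disclosed implementation: the `Command` and `SimTransport` classes, all method names, and the replacement of camera images by plain goods labels are assumptions made for demonstration only.

```python
from dataclasses import dataclass, field

@dataclass
class Command:
    target_area: list          # positions making up the target area
    target_goods: str          # identifier of the target goods
    delivery_destination: str  # where to unload

@dataclass
class SimTransport:
    """Stand-in for the self-guided transport equipment: an 'image'
    captured at a position is just the goods label stored there."""
    scene: dict                            # position -> goods label or None
    carried: list = field(default_factory=list)

    def capture_image(self, pos):
        return self.scene.get(pos)

def run_carrying_task(cmd: Command, eq: SimTransport) -> bool:
    for pos in cmd.target_area:        # enter and sweep the target area
        image = eq.capture_image(pos)  # capture an image at this position
        if image is None:              # image contains no goods
            continue
        if image == cmd.target_goods:  # goods match the target goods
            eq.carried.append((image, cmd.delivery_destination))
            return True                # picked up and carried to destination
    return False                       # no target goods found in the area

cmd = Command(["A1", "A2", "A3"], "AAA", "D1")
eq = SimTransport(scene={"A1": None, "A2": "BBB", "A3": "AAA"})
print(run_carrying_task(cmd, eq))  # prints True: "AAA" found at A3
```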
Compared with the prior art, the target area of the present disclosure is an area instead of a fixed point, which avoids failures of carrying tasks caused by deviation of the target goods from their expected position, and which allows the user to apply a single command to all the target goods within the target area, without giving commands one by one for target goods placed in different positions within the area. The delivery destination of this disclosure may also be an area instead of a fixed point, thereby avoiding situations where goods cannot be unloaded because a fixed point is already occupied by other goods, or where a number of pieces of self-guided transport equipment have to wait in line to unload. Hence, the automated carrying system of the disclosure improves the success rate of carrying tasks, the carrying efficiency, and the convenience of use.
The reference signs are listed as follows:
10, 10a, 10b, 10c, 20, 30: self-guided transport equipment
11: processing unit
12, 32, 215: imaging module
13, 33, 220: first distance sensor
14: drive module
15, 34, 120: goods holder module
16: second storage module
17: power supply module
18: second communication module
31, 281: carrier
34a: mechanical arm
34b: goods holder portion
35, 131, 132, 133: wheel
40: control center
41: management unit
42: user interface
43: first communication module
44: first storage module
100: forklift device
120a: prong
280: bearing structure
282: mounting part
400, 405, 410, 415, 420, 430, 440, 450: step
441, 442, 443, 444, 445, 446, 447, 448: step
600a, 600b, 600c, 600d, 600e, 600f, 600g, 600h: user interface
610a, 610b, 610c, 610d, 610e, 610f, 610g, 610h: map
611a, 611b, 611c, 611d, 611e, 611f, 611g, 611h: shelf pattern
612a, 612b, 612c, 612d, 612e, 612f, 612g, 612h: goods pattern
620a, 620b, 620c, 620d, 620e, 620f, 620g, 620h: input interface
630a, 630b, 630c, 630e, 630f, 630g, 630h: target area
631d, 632d, 633d: area
640: hand
700: goods stack
710: barcode
711, 712, 713: characteristic information
720: label
721, 722, 723: pattern
730: goods
750: image
751: block
752: gap image
770: pallet
771: hole
O: point
R: radius
H: height
L: length
W: width
L1: first path
L2: second path
L3: checking path
L4, L5, L6: shortest distance
The foregoing and further technical contents, features, and effects of the disclosure will be clearly presented in the following detailed description of the preferred embodiments in combination with exemplary drawings. It should be noted that the directional terms mentioned in the following embodiments, for example, up, down, left, right, front, back, etc., only refer to the directions of the exemplary drawings. Hence, the directional terms used herein are for the purpose of explaining, rather than limiting the disclosure. In addition, the same or similar elements will be represented by the same or similar reference signs throughout the following embodiments.
In this disclosure, electrical connection means that electrical energy or data, such as electrical signals, magnetic signals, and command signals, can be transmitted directly, indirectly, by wire or wirelessly between elements.
Please refer to
The control center 40 may be a server or a computer, the management unit 41 may be a warehouse management system (WMS), and the user interface 42 is used for the user to input information to be transmitted to the management unit 41, whereby the user can control the self-guided transport equipment 10 through the control center 40. Preferably, the control center 40 may include a display (not shown) for presenting the user interface 42, the display may include a touch screen, and the control center 40 may further include input devices (not shown), such as a mouse and a keyboard. In this way, the user can input information on the user interface 42 directly via the touch screen and/or the input devices. The first communication module 43 may be, but is not limited to, a Wi-Fi wireless transmission module. The first storage module 44 may be used for storing data, such as map information of a workplace (e.g., a warehouse) of the self-guided transport equipment 10, goods storage information, goods information, etc. The first storage module 44 may be, but is not limited to, a read-only memory, a random access memory, or a combination thereof. For the user interface 42, reference can be made to the descriptions of
The self-guided transport equipment 10 is electrically connected to the control center 40 to receive the command information provided by the control center 40. To be specific, the self-guided transport equipment 10 may comprise a processing unit 11, an imaging module 12, a drive module 14 and a goods holder module 15, wherein the processing unit 11 is electrically connected to the imaging module 12, the drive module 14 and the goods holder module 15. The processing unit 11 has computing capabilities, and the processing unit 11 may be, but is not limited to, a central processing unit (CPU) or a graphics processing unit (GPU). The imaging module 12 is used for capturing images, for example, for capturing images of the surrounding environment of the self-guided transport equipment 10 so as to obtain surrounding information of the workplace where the self-guided transport equipment 10 is located. The imaging module 12 may be a two-dimensional imaging module or a three-dimensional imaging module. The two-dimensional imaging module may be a camera, and the three-dimensional imaging module may be, but is not limited to, a combination of two cameras or a combination of a camera and a projector. In the case where the imaging module 12 is a two-dimensional imaging module, the self-guided transport equipment 10 may preferably include a first distance sensor 13, which is electrically connected to the processing unit 11 and is used for sensing a distance between the self-guided transport equipment 10 and a surrounding object. The first distance sensor 13 may be, but is not limited to, LiDAR. In the case where the imaging module 12 is a three-dimensional imaging module, the distance between the self-guided transport equipment 10 and the surrounding object may be directly calculated from the image obtained by the three-dimensional imaging module.
The drive module 14 is used for driving the self-guided transport equipment 10 to move. The goods holder module 15 is used for picking up goods, and based on the shape and characteristics of the goods, a goods holder module 15 suitable for picking up the goods may be selected. The self-guided transport equipment 10 may preferably include a second communication module 18, through which the processing unit 11 is electrically connected to the control center 40, and the second communication module 18 may be, but is not limited to, a Wi-Fi wireless transmission module. The self-guided transport equipment 10 may preferably include a second storage module 16, which is electrically connected to the processing unit 11, and the second storage module 16 may be used for storing data, such as the map information of the workplace (e.g., a warehouse) of the self-guided transport equipment 10, goods storage information, goods information, positioning information of the self-guided transport equipment 10, navigation information of the self-guided transport equipment 10, etc. The second storage module 16 may be, but is not limited to, a read-only memory, a random access memory, or a combination thereof. The self-guided transport equipment 10 may include a power supply module 17, which is used for providing the power required by the self-guided transport equipment 10. For example, the power supply module 17 may be electrically connected to the processing unit 11, the imaging module 12, the first distance sensor 13, the drive module 14, the goods holder module 15, the second storage module 16, and the second communication module 18 to supply the power required by the aforementioned elements. The power supply module 17 may be a plug or a battery. 
The self-guided transport equipment 10 may preferably include a second distance sensor (not shown), which may be electrically connected to the processing unit 11, thereby further providing an obstacle-avoidance function for the self-guided transport equipment 10. The second distance sensor may be, but is not limited to, a photoelectric sensor.
The control center 40 will be described below as a remote control center. However, the disclosure is not limited thereto. The control center 40 may also be arranged on the self-guided transport equipment 10 and be electrically connected to the processing unit 11, in which case the first communication module 43 and the second communication module 18 in
With reference to
Referring to
Referring to
In
Next, the management unit 41 transmits the command information including information about the target area 630a, the target goods and the delivery destination to the self-guided transport equipment 10. After receiving the command information, the processing unit 11 controls the drive module 14 to drive the self-guided transport equipment 10 to enter the target area 630a according to the command information (Step 410). The processing unit 11 controls the self-guided transport equipment 10 to move in the target area 630a while capturing images by using the imaging module 12 (Step 420), and continuously determines in real time whether the images contain goods (Step 430). If the image contains goods, the processing unit 11 may calculate a distance between the goods and the self-guided transport equipment 10 from the image alone, or from the image in conjunction with the data collected by the first distance sensor 13, and control the self-guided transport equipment 10 to move to the front of the goods and then determine whether the goods are the target goods (Step 440). If the goods are the target goods, the processing unit 11 controls the goods holder module 15 of the self-guided transport equipment 10 to pick up the goods, then controls the drive module 14 to drive the self-guided transport equipment 10 to move to the delivery destination, and also controls the goods holder module 15 to place the goods at the delivery destination (Step 450).
Determination of whether the image contains goods may be performed through image comparison. Take this embodiment as an example. As the target goods are “loaded pallet (specified goods)”, the command information may further include pallet image information, or the processing unit 11 may retrieve pallet image information from the first storage module 44 or the second storage module 16 according to the command information and then compare the image captured by the imaging module 12 with the pallet image information. If the image contains contents that match the pallet image information, it is determined that the image contains goods. In other embodiments, if the goods are not restricted to be placed on pallets, the command information may further include goods image information, or the processing unit 11 may retrieve goods image information from the first storage module 44 or the second storage module 16 according to the command information and then compare the image captured by the imaging module 12 with the goods image information. For example, in the case where the goods are all placed in cartons, the goods image information may be carton image information, or the goods image information may be the image information of all the goods in the warehouse or characteristic information of barcodes of the goods. With reference to
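The image comparison described above can be illustrated with a naive template match over tiny binary grids. This is a didactic sketch only: a real system would use a vision library, the grids stand in for the captured image and the pallet image information, and the 0.9 agreement threshold is an assumption.

```python
def contains_goods(image, template, threshold=0.9):
    """Slide the template over the image; report a match whose pixel
    agreement ratio reaches the threshold (naive template matching)."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            same = sum(
                image[y + dy][x + dx] == template[dy][dx]
                for dy in range(th) for dx in range(tw)
            )
            if same / (th * tw) >= threshold:
                return True   # image contains contents matching the template
    return False

pallet = [[1, 1], [1, 1]]      # toy stand-in for pallet image information
frame = [[0, 0, 0, 0],
         [0, 1, 1, 0],
         [0, 1, 1, 0],
         [0, 0, 0, 0]]
print(contains_goods(frame, pallet))  # prints True
```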
Image comparison may also be used for determining whether the goods are the target goods. For example, the command information may include barcode information of the goods AAA, and the pallets, shelves or cartons for packaging goods in the warehouse are provided with the barcode of the goods loaded therein, so the processing unit 11 may compare the image of the barcode captured by the imaging module 12 with the barcode information of the goods AAA, or the processing unit 11 may retrieve the characteristic information about the goods AAA from the first storage module 44 or the second storage module 16 according to the barcode information, and then compare the image captured by the imaging module 12 with the characteristic information of the goods AAA.
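The two comparison paths just described (direct barcode comparison, or retrieval of characteristic information keyed by the barcode) might be sketched as follows; the dictionary stands in for the first/second storage modules, and all keys and values are hypothetical.

```python
def is_target_goods(decoded_barcode, target_barcode, characteristics=None):
    """Match either the barcode itself, or the characteristic information
    retrieved for it (the dict stands in for storage modules 44/16)."""
    if decoded_barcode == target_barcode:   # direct barcode comparison
        return True
    db = characteristics or {}
    target_info = db.get(target_barcode)    # retrieve characteristic info
    return target_info is not None and db.get(decoded_barcode) == target_info

db = {"AAA-01": "boxed AAA", "AAA-02": "boxed AAA", "BBB-01": "bagged BBB"}
print(is_target_goods("AAA-02", "AAA-01", db))  # prints True
print(is_target_goods("BBB-01", "AAA-01", db))  # prints False
```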
In other embodiments, when it is determined that the image contains goods (Step 430), it may be further determined whether the goods are in the target area 630a. If the goods are in the target area 630a, proceed to Step 440; and if the goods are not in the target area 630a, return to Step 420. In this way, the accuracy of the automated carrying system for performing carrying tasks can be improved.
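A check of whether detected goods lie inside the target area could, for a circular area (cf. point O and radius R among the reference signs), reduce to a distance test; the circular shape and 2-D coordinates are assumptions for illustration.

```python
def in_target_area(goods_pos, center, radius):
    """True if the detected goods lie inside a circular target area."""
    dx, dy = goods_pos[0] - center[0], goods_pos[1] - center[1]
    return dx * dx + dy * dy <= radius * radius

print(in_target_area((1, 1), (0, 0), 2))  # prints True  (inside)
print(in_target_area((3, 0), (0, 0), 2))  # prints False (outside)
```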
In other embodiments, after Step 450 is completed, the processing unit 11 may transmit processing result information to the control center 40, wherein the processing result information may include the type and the quantity of the goods that have been picked up, and also precise positions of the goods before and after being picked up, whereby the data stored in the control center 40 can be updated.
In
In
In
Referring to
If it is determined that the goods are the target goods, proceed to Step 441, calculating a quantity of the goods in the goods stack. Step 442 is to determine whether the quantity of the goods in the goods stack is greater than or equal to the required quantity of the target goods. If it is determined as “YES”, proceed to Step 450, controlling the self-guided transport equipment 10 to pick up and carry the goods from the goods stack to the delivery destination, at which point the automated carrying system completes the carrying task; if it is determined as “NO”, proceed to Step 443, determining whether the self-guided transport equipment 10 has already moved around the target area for a full circle. If it is determined as “NO”, the self-guided transport equipment 10 is controlled not to pick up the goods from the goods stack, and the process returns to Step 420 in order to preferentially search for a further goods stack of the target goods in sufficient quantity. If it is determined as “YES”, the quantity in every goods stack of the target goods within the target area is less than the required quantity, in which case the required target goods have to be obtained from different goods stacks of the target goods. So proceed to Step 444, controlling the self-guided transport equipment 10 to pick up and carry the goods from the goods stack to the delivery destination. Then proceed to Step 445, controlling the self-guided transport equipment 10 to move to a further goods stack in the target area, wherein the further goods stack is formed by a plurality of target goods stacked together (the further goods stack may be found through Step 420 to Step 440). Proceed to Step 446, controlling the self-guided transport equipment 10 to pick up and carry the goods from the further goods stack to the delivery destination.
Proceed to Step 447, calculating the quantities of the goods that have been picked up by the self-guided transport equipment 10 to obtain a sum of the quantities of the picked goods, that is, adding up the quantities of the goods picked up by the self-guided transport equipment 10 since Step 444. Proceed to Step 448, determining whether the sum of the quantities of the picked goods is greater than or equal to the required quantity of the target goods. If it is determined as “NO”, the required quantity has not yet been reached, so the process returns to Step 445; if it is determined as “YES”, the automated carrying system has completed the carrying task, so proceed to Step 460, controlling the self-guided transport equipment 10 to execute an end command. Step 441 to Step 448 will be described in detail below in conjunction with
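The branching of Steps 441 to 448 can be condensed into the following sketch, in which each goods stack is reduced to its quantity; the function name and the list representation are assumptions, not part of the disclosure.

```python
def gather_target_goods(stack_quantities, required):
    """Return indices of the stacks to pick from.
    First circuit (Steps 441-443): look for one stack holding enough.
    Otherwise (Steps 444-448): accumulate stacks until the sum of the
    picked quantities reaches the required quantity."""
    for i, qty in enumerate(stack_quantities):
        if qty >= required:
            return [i]                 # Step 450: a single stack suffices
    picked, total = [], 0
    for i, qty in enumerate(stack_quantities):
        picked.append(i)               # Steps 444-446: pick from this stack
        total += qty                   # Step 447: sum of picked quantities
        if total >= required:          # Step 448: requirement met
            break
    return picked

print(gather_target_goods([3, 10, 2], 8))  # prints [1]
print(gather_target_goods([3, 4, 2], 8))   # prints [0, 1, 2]
```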
In
In other embodiments, the quantity of the goods 730 in the goods stack 700 may also be calculated according to gaps between the goods 730 in the goods stack 700. In detail, the command information may include gap image information.
In other embodiments, if the goods 730 include identification patterns, such as labels 720, the quantity of the goods 730 in the goods stack 700 may also be calculated based on the quantity of the identification patterns.
The foregoing methods for calculating the quantity of goods 730 in the goods stack 700 may be used separately, or two or three of these methods may be used in combination at the same time to improve the accuracy of calculation.
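The counting methods and their combination might be sketched as follows. The gap-based and label-based rules come from the text above; the height-based estimate is included purely as an assumption (the first method's description is truncated above), as is the majority-vote combination.

```python
from collections import Counter

def count_by_height(stack_height, unit_height):
    """Assumed method: estimate the count from the total stack height,
    presuming uniformly sized goods."""
    return round(stack_height / unit_height)

def count_by_gaps(gap_count):
    """N gaps between goods in a column imply N + 1 goods."""
    return gap_count + 1

def count_by_labels(label_count):
    """One identification pattern (e.g., label 720) per goods item."""
    return label_count

def combined_count(estimates):
    """Combine several methods; take the most common estimate."""
    return Counter(estimates).most_common(1)[0][0]

print(combined_count([count_by_height(90, 30),
                      count_by_gaps(2),
                      count_by_labels(3)]))  # prints 3
```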
In the above embodiment, the target goods are “loaded pallet (specified goods)”. In the case where the self-guided transport equipment 20 (i.e., the self-guided forklift) is used as the carrying equipment, the self-guided transport equipment 20 may extend with a prong 120a into the hole 771 of the pallet 770, thereby carrying all the goods 730 on the pallet 770 through one forking action, which, compared with picking up goods 730 by suction (such as the self-guided transport equipment 30), is advantageous for improving the carrying efficiency. In other embodiments, in the case where the automated carrying system takes the self-guided transport equipment 20 as the carrying equipment, and the command information does not limit the target goods to be placed on the pallet 770, if it is determined that the goods are the target goods, the automated carrying system may be further configured to perform the following steps: determining whether the goods stack 700 is placed on the pallet 770; if it is determined as “YES”, proceed to the subsequent step, such as Step 450 in
Please refer to
Step 400 is to obtain initial position information of the self-guided transport equipment 10. Step 405 is to obtain path information, which is obtained by calculating based on the initial position information and the target area. Step 415 is to control the self-guided transport equipment 10 to enter the target area according to the command information and the path information. For Step 420 to Step 450, please refer to the preceding texts. Now Step 400 to Step 415 will be described in detail with reference to
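Path information calculated from the initial position and the target area (Step 405) could, in the simplest case, reduce to choosing the nearest entry point of the area. This sketch assumes 2-D coordinates and a point-set representation of the area; it is not the disclosed path-planning algorithm.

```python
import math

def nearest_entry_point(initial_pos, area_points):
    """Pick the target-area point closest to the initial position;
    a stand-in for the path calculation of Step 405."""
    return min(area_points, key=lambda p: math.dist(initial_pos, p))

print(nearest_entry_point((0, 0), [(5, 5), (1, 1), (3, 0)]))  # prints (1, 1)
```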
In
Referring to
Referring to
In the above embodiment, if the subject involved in determination or calculation in the steps is the processing unit 11 (such as in Step 430 and Step 440), this is only for the purpose of giving examples. In practical application, the processing unit 11 may transmit images to the control center 40, and the determination is carried out by the control center 40.
Compared with the prior art, the target area of the present disclosure is an area instead of a fixed point, which avoids failures of carrying tasks caused by deviation of the target goods from their expected position, and which allows the user to apply a single command to all the target goods within the target area, without giving commands one by one for target goods placed in different positions within the area. The delivery destination of the disclosure may also be an area instead of a fixed point, thereby avoiding situations where goods cannot be unloaded because a fixed point is already occupied by other goods, or where a number of pieces of self-guided transport equipment have to wait in line to unload. Hence, the automated carrying system of the disclosure improves the success rate of carrying tasks, the carrying efficiency, and the convenience of use.
The above descriptions are only the preferred embodiments of this disclosure and are not intended to limit the disclosure. For those skilled in the art, the disclosure may have various modifications and changes. Any modification, equivalent substitution, improvement, etc. within the spirit and principles of this disclosure shall be included in the scope of protection of the disclosure.
Number | Date | Country | Kind |
---|---|---|---|
201910855116.6 | Sep 2019 | CN | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/CN2020/102781 | 7/17/2020 | WO |