The invention relates to a method, a container crane control system, a computer program and a computer program product for loading a container on a landing target.
Container cranes are used to handle freight containers and to transfer containers between transport modes at container terminals, freight harbours and the like. Standard shipping containers are used to transport a great and growing volume of freight around the world. Trans-shipment is a critical function in freight handling. Trans-shipment may occur at each point of transfer, and there is usually a tremendous number of containers that must be unloaded, transferred to a temporary stack, and later loaded onto another ship, back onto the same ship, or onto another form of transport such as a road vehicle or train.
Traditionally, container cranes have been controlled from an operator cabin mounted on the container crane. Recently, however, container cranes have become remote controlled and even fully automated. This reduces or eliminates the exposure of crane operators to inconvenience, danger and even injury.
WO 2015/022001 discloses a method for automatically landing a container on a landing target using a container crane. The container crane comprises a trolley and spreader for holding and lifting the container and a crane control system for controlling movements of said container crane. A distance from the container to the landing target is measured and the container is moved towards the landing target dependent on the measured distance. A plurality of images of the landing target are made using at least one camera mounted on the spreader. The images are processed to identify one or more landing features in the images of the landing target. Distances from the container to the landing target are calculated based on a measurement of distance between the container and the landing features in the images.
Any improvement in how the landing target is identified is of great value.
It is an object to improve the identification of a landing target for loading a container.
According to a first aspect, it is provided a method for loading a container on a landing target on a land vehicle using a container crane comprising a trolley and a spreader for holding and lifting the container. The method is performed in a container crane control system and comprises the steps of: obtaining two-dimensional images of the landing target from a first pair of cameras arranged on the spreader; performing feature extraction based on the two-dimensional images to identify key features of the landing target; generating a point cloud based on the feature extraction, wherein each point in the point cloud contains coordinates in three dimensions; and controlling movement of the container to the landing target based on the point cloud and the identified key features of the landing target.
The key features may comprise corners of the landing target. The key features may comprise twistlocks of the landing target. The key features may comprise any one or more of a gooseneck of a chassis, a guide structure, a girder, and a beam.
The landing target may be situated higher than surrounding surfaces.
The step of performing feature extraction may comprise identifying the landing target as a surface at a height higher than the surrounding surfaces.
The step of obtaining two-dimensional images may also comprise obtaining two-dimensional images of the landing target from a second pair of cameras arranged on the spreader. In such a case, the step of performing feature extraction is based also on a two-dimensional image from at least one camera of the second pair.
The first pair of cameras and the second pair of cameras may be arranged along the same side of the spreader.
The method may further comprise the step of: detecting orientation of the landing target based on lines in the two-dimensional images. In such a case, the step of controlling movement is also based on the orientation of the landing target.
The step of performing feature extraction may be based on scale invariant feature transform, SIFT.
The step of generating a point cloud may also be based on stereo image matching based on the two-dimensional images.
The method may further comprise the step of: obtaining additional depth data from a depth detection device. In such a case, the step of generating a point cloud is also based on the additional depth data.
According to a second aspect, it is provided a container crane control system for loading a container on a landing target on a land vehicle using a container crane comprising a trolley and a spreader for holding and lifting the container. The container crane control system comprises: a processor; and a memory storing instructions that, when executed by the processor, cause the container crane control system to: obtain two-dimensional images of the landing target from a first pair of cameras arranged on the spreader; perform feature extraction based on the two-dimensional images to identify key features of the landing target; generate a point cloud based on the feature extraction, wherein each point in the point cloud contains co-ordinates in three dimensions; and control movement of the container to the landing target based on the point cloud and the identified key features of the landing target.
The landing target may be situated higher than surrounding surfaces.
The instructions to obtain two-dimensional images may comprise instructions that, when executed by the processor, cause the container crane control system to obtain two-dimensional images of the landing target from a second pair of cameras arranged on the spreader, and wherein the instructions to perform feature extraction comprise instructions that, when executed by the processor, cause the container crane control system to perform the feature extraction also on a two-dimensional image from at least one camera of the second pair.
The first pair of cameras and the second pair of cameras may be arranged along the same side of the spreader.
According to a third aspect, it is provided a computer program for loading a container on a landing target on a land vehicle using a container crane comprising a trolley and a spreader for holding and lifting the container. The computer program comprises computer program code which, when run on a container crane control system causes the container crane control system to: obtain two-dimensional images of the landing target from a first pair of cameras arranged on the spreader; perform feature extraction based on the two-dimensional images to identify key features of the landing target; generate a point cloud based on the feature extraction, wherein each point in the point cloud contains co-ordinates in three dimensions; and control movement of the container to the landing target based on the point cloud and the identified key features of the landing target.
According to a fourth aspect, it is provided a computer program product comprising a computer program according to the third aspect and a computer readable means on which the computer program is stored.
Generally, all terms used in the claims are to be interpreted according to their ordinary meaning in the technical field, unless explicitly defined otherwise herein. All references to “a/an/the element, apparatus, component, means, step, etc.” are to be interpreted openly as referring to at least one instance of the element, apparatus, component, means, step, etc., unless explicitly stated otherwise. The steps of any method disclosed herein do not have to be performed in the exact order disclosed, unless explicitly stated.
The invention will now be described more fully hereinafter with reference to the accompanying drawings, in which certain embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like numbers refer to like elements throughout the description.
Embodiments presented herein are based on identifying key features of a landing target from several two-dimensional images using stereo matching and feature extraction. Alternatively, any other type of 3D mapping sensor could be used, such as Lidar, time-of-flight cameras, etc. Key features can e.g. be corners of the landing target, twistlocks, etc. A point cloud with points in three dimensions is also generated, based on the identified features, to describe the landing target and the environment around the landing target, improving the ability to identify the landing target. By basing the landing target identification on three dimensions, a more reliable identification is achieved, allowing operation to continue even in conditions with poor visibility.
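As an illustration only, the overall data flow can be sketched in Python as follows; the function names and placeholder bodies are assumptions made for the purpose of the example and do not form part of the disclosed method, whose steps are detailed further below.

```python
import numpy as np

def extract_key_features(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Placeholder: pixel positions of key features (e.g. corners, twistlocks)."""
    return np.zeros((4, 2), dtype=np.float32)

def generate_point_cloud(left: np.ndarray, right: np.ndarray,
                         features: np.ndarray) -> np.ndarray:
    """Placeholder: (N, 3) cloud of x, y, z points around the landing target."""
    return np.zeros((1000, 3), dtype=np.float32)

def landing_pipeline(left: np.ndarray, right: np.ndarray):
    """One processing cycle: two 2D images -> key features -> 3D point cloud."""
    features = extract_key_features(left, right)
    cloud = generate_point_cloud(left, right, features)
    return features, cloud  # both are then used to control the container movement
```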
A container crane 51 uses a number of powerful electric motors mounted on a spreader 55 and on a trolley 53 to power moving parts and to retract or extend cables that raise or lower the spreader 55. The spreader 55 can hold a load 21 in the form of a container. Electric motors are also used to power the movements of the trolley 53 holding the spreader 55, to lift the containers out of the ship and transport them onto a land vehicle 23, a stack, etc. The container crane 51 can be used for loading containers on a ship and/or for unloading containers from a ship onto land or onto a landing target 59 on a land vehicle 23, e.g. a truck chassis or a train carriage chassis. Moving containers from one position on land to another landing target is also possible.
The width of shipping containers is standardised at 8 ft. (2.438 m), but the height varies, typically between 8 ft. (2.438 m) and 9.5 ft. (2.896 m). The most common standard lengths are 20 ft. (6.096 m) and 40 ft. (12.192 m). The 40 ft. (12.192 m) container is very common today, and even longer containers of up to 53 ft. (16.154 m) are also in use. International standard dimensions are based on a number of ISO recommendations made between 1968 and 1970, in particular recommendation R1161 from January 1970, which concerns the dimensions of corner fittings for standard containers. The distances between corner fittings on standard shipping containers are standardised in accordance with the ISO recommendations. The corner fittings, also known as corner castings, include standard openings so that a container may be picked up by inserting a hook of the spreader 55 into each of the four corner fittings at the top of the container 21. The size and shape of the oval-shaped openings are defined in another standard, ISO 1161 from 1984. The same type of corner fittings, e.g. those on the bottom of a container, may be used to lock a container in place (e.g. in a hold or on deck) on board a ship, on a wagon or on a chassis.
The spreader 55 is thus used to grip the container 21 e.g. using twistlocks to engage with the standard sized opening in the corner fittings on the container, to lift it, lower it and release it. In this description, the term spreader 55 is used to denote a part of a lifting device that is in direct contact with a container 21. Spreaders 55 are normally designed to handle more than one size of container, typically 20-40 ft. (6.096-12.192 m) or 20-40-45 ft. (6.096-12.192-13.716 m) long containers. Some spreaders 55 may at any time lift and handle one single 40 ft. (12.192 m) or a 45 ft. (13.716 m) container or two 20 ft. (6.096 m) containers. Some spreaders 55 are adjustable in use so that the same spreader 55 can be used to pick up one 20 ft. (6.096 m), or two 20 ft. (6.096 m) containers at a time by adjusting the length of the spreader.
The container crane 51 can thus be used to lift a container 21 up from a ship and land it on a landing target 59, or vice versa. Alternatively, the container crane 51 can be used to transfer the container 21 between the ship and ground or a container stack or any other suitable container movement.
A container crane control system 1 is used to control the operation of the crane 51. In order to enable autonomous control of the crane 51, the container crane control system 1 comprises several cameras, described in more detail below.
The control device 15 is any suitable control device capable of performing logic operations and can comprise any combination of a central processing unit (CPU), graphics processing unit (GPU), a microcontroller unit (MCU), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), and discrete logic circuitry, optionally combined with persistent memory (e.g. read only memory, ROM).
The cameras are provided in pairs. Specifically, a first pair 35a comprises the first camera 32a and the second camera 32b. A second pair 35b comprises the third camera 32c and the fourth camera 32d. A third pair 35c comprises the fifth camera 32e and the sixth camera 32f. A fourth pair 35d comprises the seventh camera 32g and the eighth camera 32h.
The term end side is to be interpreted as one of the shorter sides of the spreader. Hence, the first pair 35a of cameras and the second pair 35b of cameras are provided on one end of the spreader 55, while the third pair 35c of cameras and the fourth pair 35d of cameras are provided on the other end of the spreader 55.
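Purely for illustration, the camera arrangement described above could be represented as configuration data along the following lines; the mapping of the identifiers 32a-h and 35a-d to data fields is a hypothetical sketch, not part of the actual control software.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CameraPair:
    name: str            # pair identifier, e.g. "35a"
    cameras: tuple       # the two cameras forming the pair
    end_side: str        # which shorter side of the spreader the pair sits on

CAMERA_PAIRS = [
    CameraPair("35a", ("32a", "32b"), "first end"),
    CameraPair("35b", ("32c", "32d"), "first end"),
    CameraPair("35c", ("32e", "32f"), "second end"),
    CameraPair("35d", ("32g", "32h"), "second end"),
]
```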
In an obtain 2D images step 40, the container crane control system obtains two-dimensional images of the landing target from a first pair of cameras (e.g. any of the pairs 35a-d described above) arranged on the spreader.
Optionally, this comprises obtaining two-dimensional images of the landing target from a second pair of cameras arranged on the spreader. The first pair of cameras and the second pair of cameras can be arranged along the same end side of the spreader. The term end side is to be interpreted as one of the shorter sides of the spreader.
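By way of a non-limiting sketch, capturing one image pair from two spreader-mounted cameras could look as follows using OpenCV; the device indices are placeholders, and a real installation would typically use industrial camera interfaces instead.

```python
import cv2

def grab_pair(left_id: int = 0, right_id: int = 1):
    """Grab one grayscale image from each camera of a pair (device ids are placeholders)."""
    frames = []
    for device_id in (left_id, right_id):
        cap = cv2.VideoCapture(device_id)
        ok, frame = cap.read()              # one 2D image of the landing target
        cap.release()
        if not ok:
            raise RuntimeError(f"camera {device_id} read failed")
        frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    return frames[0], frames[1]
```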
In a feature extraction step 42, the container crane control system performs feature extraction based on the two-dimensional images to identify key features of the landing target. The key features can be corners of the landing target. Alternatively or additionally, the key features can comprise twistlocks of the landing target. Optionally, other key features are identified, e.g. any one or more of a gooseneck of a chassis, guide structures, girders, beams, etc. When more key features are extracted, this increases the reliability for subsequent loading of the container on the landing target.
The feature extraction can comprise identifying the landing target as a surface at a height higher than the surrounding surfaces. The difference in height is significant, typically in the region of 1.2 metres, making this identification robust even in poor weather conditions. Of course, the height difference can vary greatly from the example mentioned here.
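A minimal sketch of this height-based cue is given below, assuming the point cloud is expressed with the height as the third coordinate; the threshold and the synthetic data are illustrative only.

```python
import numpy as np

def split_target_from_ground(cloud: np.ndarray, min_height_above_ground: float = 0.6):
    """cloud: (N, 3) points with the third coordinate as height in metres (upwards)."""
    ground_level = np.percentile(cloud[:, 2], 10)        # robust estimate of the ground height
    above = cloud[:, 2] > ground_level + min_height_above_ground
    return cloud[above], cloud[~above]                   # landing-target candidates, ground

# Synthetic example: ground at ~0 m, chassis surface at ~1.2 m above it.
rng = np.random.default_rng(0)
ground = np.column_stack([rng.uniform(-5, 5, 800), rng.uniform(-5, 5, 800),
                          rng.normal(0.0, 0.02, 800)])
chassis = np.column_stack([rng.uniform(-1, 1, 200), rng.uniform(-6, 6, 200),
                           rng.normal(1.2, 0.02, 200)])
target_pts, ground_pts = split_target_from_ground(np.vstack([ground, chassis]))
```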
When a two-dimensional image from at least one camera of the second pair is available, this is also used in the feature extraction. The use of additional cameras improves the ability to identify the depth dimension.
The feature extraction is based on any suitable algorithm, e.g. scale invariant feature transform (SIFT) or similar.
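As a sketch of one possible realisation, SIFT key points can be extracted from the left and right images of a pair and matched across the pair using OpenCV; the ratio-test threshold is an illustrative assumption.

```python
import cv2
import numpy as np

def sift_matches(left: np.ndarray, right: np.ndarray, ratio: float = 0.75):
    """Extract SIFT key points in both images and keep matches passing Lowe's ratio test."""
    sift = cv2.SIFT_create()
    kp_l, des_l = sift.detectAndCompute(left, None)
    kp_r, des_r = sift.detectAndCompute(right, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = []
    for pair in matcher.knnMatch(des_l, des_r, k=2):
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good.append(pair[0])
    pts_l = np.float32([kp_l[m.queryIdx].pt for m in good])   # pixel coordinates, left image
    pts_r = np.float32([kp_r[m.trainIdx].pt for m in good])   # corresponding pixels, right image
    return pts_l, pts_r
```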
In an optional obtain additional depth data step 43, additional depth data from a depth detection device is obtained. The depth detection device can e.g. be in the form of Lidar, time of flight cameras, etc.
In a generate point cloud step 44, the container crane control system generates a point cloud based on the feature extraction, wherein each point in the point cloud contains co-ordinates in three dimensions. Each point may also contain light values in one or more colours, e.g. RGB (Red, Green and Blue) light values. In one embodiment, more points are generated in the point cloud around at least one identified feature. The point cloud can be made more dense around the identified feature(s), since the features are of particular interest when landing a container. For instance, when identifying corners of a chassis on which to land a container, it is of great benefit to generate more density in the point cloud around the corners of the chassis, where the height difference to ground is significant. On the other hand, once the location of ground is identified around the chassis, more details of the ground do not help in landing a container. In this way, these regions of the landing target are better covered in the point cloud, making the landing more secure and robust. This is of great benefit, e.g. in poor weather conditions.
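The following sketch illustrates one way of turning matched feature points into such a point cloud: the matches are triangulated using the calibrated projection matrices of the camera pair, and an RGB value is attached to each point. The projection matrices and image inputs are assumed to come from the stereo calibration of the pair; this is an illustration, not the actual implementation.

```python
import cv2
import numpy as np

def triangulate_with_colour(pts_l: np.ndarray, pts_r: np.ndarray,
                            P_left: np.ndarray, P_right: np.ndarray,
                            left_bgr: np.ndarray) -> np.ndarray:
    """pts_l, pts_r: (N, 2) matched pixel coordinates; returns (N, 6) rows of x, y, z, r, g, b."""
    homog = cv2.triangulatePoints(P_left, P_right,
                                  pts_l.T.astype(np.float32),
                                  pts_r.T.astype(np.float32))   # 4 x N homogeneous points
    xyz = (homog[:3] / homog[3]).T                               # (N, 3) coordinates
    rgb = np.array([left_bgr[int(v), int(u), ::-1] for u, v in pts_l],
                   dtype=np.float32)                             # per-point colour from the left image
    return np.hstack([xyz.astype(np.float32), rgb])
```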
The point cloud can also be derived using stereo image matching based on the two-dimensional images. This results in more dense depth maps than if only the feature extraction is used. The stereo image matching can e.g. be based on block matching.
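A minimal sketch of such dense stereo matching with OpenCV block matching is shown below; the disparity range, block size and reprojection matrix Q are illustrative and depend on the actual camera calibration and baseline.

```python
import cv2
import numpy as np

def dense_cloud(rect_left: np.ndarray, rect_right: np.ndarray, Q: np.ndarray) -> np.ndarray:
    """rect_left/rect_right: rectified 8-bit grayscale images; Q: 4x4 reprojection matrix."""
    stereo = cv2.StereoBM_create(numDisparities=128, blockSize=15)
    disparity = stereo.compute(rect_left, rect_right).astype(np.float32) / 16.0  # fixed point -> pixels
    points = cv2.reprojectImageTo3D(disparity, Q)     # H x W x 3 coordinates
    valid = disparity > disparity.min()               # discard invalid disparities
    return points[valid]                              # (K, 3) dense point cloud
```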
Optionally, the additional depth data received in step 43 may be fused with the camera images to yield even more reliable point cloud data. Hence, the point cloud can also be based on the additional depth data to thereby obtain an even more extensive and accurate point cloud.
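A simple way to picture this fusion is to transform the additional depth points into the camera coordinate frame and merge them with the camera-derived cloud, as in the sketch below; the extrinsic transform is assumed to be known from sensor calibration, and a real system may use more elaborate fusion than plain concatenation.

```python
import numpy as np

def fuse_clouds(camera_cloud: np.ndarray, depth_cloud: np.ndarray,
                T_cam_depth: np.ndarray) -> np.ndarray:
    """camera_cloud: (N, 3); depth_cloud: (M, 3) from e.g. Lidar; T_cam_depth: 4x4 extrinsics."""
    homog = np.hstack([depth_cloud, np.ones((len(depth_cloud), 1))])  # to homogeneous coordinates
    in_camera_frame = (T_cam_depth @ homog.T).T[:, :3]
    return np.vstack([camera_cloud, in_camera_frame])                 # combined (N + M, 3) cloud
```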
In a detect orientation step 45, the container crane control system detects the orientation of the landing target based on lines in the two-dimensional images.
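As an illustrative sketch, line segments can be found with edge detection and a probabilistic Hough transform, and the dominant line direction taken as the orientation estimate; the thresholds below are assumptions for the example.

```python
import cv2
import numpy as np

def dominant_angle_deg(gray: np.ndarray) -> float:
    """Estimate the in-plane orientation of the landing target from line segments."""
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                            minLineLength=60, maxLineGap=10)
    if lines is None:
        raise ValueError("no lines found in the image")
    angles = [np.degrees(np.arctan2(y2 - y1, x2 - x1)) for x1, y1, x2, y2 in lines[:, 0]]
    return float(np.median(angles))        # robust estimate of the chassis direction
```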
In a control movement step 46, the container crane control system controls movement of the container to the landing target based on the point cloud and the identified key features of the landing target. When available, movement is controlled also based on the orientation of the landing target.
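Purely as an illustration of one control cycle, the sketch below reduces the measured offset between the spreader-held container and the identified landing target with a simple proportional law; the gain, speed limit and example coordinates are hypothetical and not part of the disclosed control system.

```python
import numpy as np

def control_command(corner_points_3d: np.ndarray, gain: float = 0.5,
                    max_speed: float = 0.3) -> np.ndarray:
    """corner_points_3d: (4, 3) landing-target corners in the spreader frame (metres).
    Returns a clipped velocity command (vx, vy, vz) in m/s towards the target centre."""
    offset = corner_points_3d.mean(axis=0)     # target centre as seen from the spreader
    command = gain * offset                    # proportional control towards the target
    return np.clip(command, -max_speed, max_speed)

# One cycle with hypothetical measurements of the four chassis corners:
corners = np.array([[0.2, -0.1, -3.0], [0.2, 12.0, -3.0],
                    [2.6, -0.1, -3.0], [2.6, 12.0, -3.0]])
print(control_command(corners))
```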
The method loops to provide continued feedback of position in relation to the landing target and appropriate movement control.
By using three-dimensional data in the feature extraction and the point cloud, a more reliable identification of the landing target is achieved. Moreover, due to the height difference between the landing target and the surrounding surfaces, the landing target can be identified reliably even when visibility is poor.
A corresponding method can be applied for picking up a container, where instead of a landing target, the key features of a container to be picked up are identified.
The memory 64 can be any combination of random-access memory (RAM) and read only memory (ROM). The memory 64 also comprises persistent storage, which, for example, can be any single one or combination of magnetic memory, optical memory, solid-state memory or even remotely mounted memory.
A data memory 66 is also provided for reading and/or storing data during execution of software instructions in the processor 60. The data memory 66 can be any combination of random-access memory (RAM) and read only memory (ROM).
The container crane control system 1 further comprises an I/O interface 62 for communicating with other external entities. Optionally, the I/O interface 62 also includes a user interface.
Other components of the container crane control system 1 are omitted in order not to obscure the concepts presented herein.
The invention has mainly been described above with reference to a few embodiments. However, as is readily appreciated by a person skilled in the art, other embodiments than the ones disclosed above are equally possible within the scope of the invention, as defined by the appended patent claims.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/EP2018/081188 | 11/14/2018 | WO |
Publishing Document | Publishing Date | Country | Kind
---|---|---|---
WO2020/098933 | 5/22/2020 | WO | A
Number | Name | Date | Kind |
---|---|---|---
11222299 | Baalke | Jan 2022 | B1 |
11348269 | Ebrahimi Afrouzi | May 2022 | B1 |
20160167932 | Holmberg | Jun 2016 | A1 |
20170055446 | Nykamp | Mar 2017 | A1 |
20190294914 | Fevold | Sep 2019 | A1 |
20200207588 | Zanarini | Jul 2020 | A1 |
20200302207 | Perkins | Sep 2020 | A1 |
Number | Date | Country |
---|---|---
113192199 | Jul 2021 | CN |
102020119866 | Feb 2021 | DE |
2574587 | Apr 2013 | EP |
2724972 | Apr 2014 | EP |
3033293 | Oct 2017 | EP |
3315005 | May 2018 | EP |
2013046941 | Mar 2013 | JP |
2012161584 | Nov 2012 | WO |
2013046941 | Apr 2013 | WO |
2014053703 | Apr 2014 | WO |
2015022001 | Feb 2015 | WO |
2016156667 | Oct 2016 | WO |
WO-2022006629 | Jan 2022 | WO |
Entry |
---
International Preliminary Report on Patentability; Application No. PCT/EP2018/081188; Issued: Feb. 23, 2021; 22 Pages. |
International Search Report and Written Opinion of the International Searching Authority; Application No. PCT/EP2018/081188; Completed: Jul. 24, 2019; Mailing Date: Aug. 7, 2019; 17 Pages. |
Number | Date | Country
---|---|---
20220009748 A1 | Jan 2022 | US |