The present invention provides a sensor unit, a method and system for cargo and personnel detection and tracking.
The present invention relates to the recognition, identification, maintenance and handling of cargo in loading and unloading operations, typically offshore container tracking, such as when loading and/or unloading containers between platforms or docks and floating vessels, and to executing warehousing related to these cargos. Additionally, the present invention may provide recognition and tracking of people close to cargo in movement.
Challenges in such operations are that cargo recognition and identification requires people and/or detector systems on the cargo deck/storage area to identify cargo visually or optionally by electronic signalling devices (for example RFID). The problem is that this is expensive, often high risk (persons in the loading area), unreliable and error prone. Visual tags are exposed to wear and tear and are thus often not visible due to damage, or the view is simply obstructed. Cargo may also not be found due to recent unregistered movement of the cargo.
Electronic detectors are often RFID based, which perform poorly if the range is larger than typically 8-10 meters. This technique requires advanced technical installations and may not be possible to accomplish at all.
It is further a problem to maintain updated cargo status, such as exact location of cargo and delivery status, on vessels and platforms. Communication between handling entities may suffer from unreliable detection and communication processes. The result may lead to high costs related to floating warehousing, poor logistic planning, decreased deck efficiency, and long delivery times. Other deficiencies may be lack of proper handling of dangerous goods, and misinterpretation of delivery/pickup locations.
A further complication in loading and unloading heavy cargo is the presence of people, and specifically presence of unauthorized persons in loading area. Many accidents and injuries have been experienced due to undetected people activity in a loading area of heavy goods, such as when moving a container by a crane on a ship deck.
The present invention seeks to eliminate some or all of the above mentioned problems. This may be accomplished by providing a new central server service connected to all cargo handling entities, be they offshore or onshore loading areas such as platforms and container harbour storages, cranes, or offshore vessels, together with a unique cargo identification feature, including sensor units and image processing utilities.
In one embodiment of present invention a device, sensor unit, is provided comprising camera and/or laser/lidar capabilities, wherein an image of for example a top view of a cargo carrying unit, CCU, such as a container, is captured. The sensor unit may provide 2D and/or 3D images. A lidar may be a 3D flash solid state lidar or the like.
In a further embodiment of present invention, the sensor unit may comprise a heat sensitive camera, and the image processing features comprise the ability to recognize personnel by their temperature footprint. Using a heat sensitive camera on CCUs may add extra distinguishing features to the task of identifying the correct CCU. Typical use may be to initiate corrective measures to avoid collision between people and cargo.
A processing device is provided for image processing, wherein the image processing includes creating a unique signature of the cargo, the CCU Signature, and recognizing it as a CCU already registered by its unique CCU ID in a client's/user's logistics system. The location parameters of the CCU are updated based on the processed image, and/or a transport mission of the CCU, for example by loading the CCU onto, for example, a Platform Supply Vessel, PSV. It is thus possible to maintain exact warehousing maps of, for example, the deck utilization of a PSV, or of a number of PSVs.
In a further embodiment of present invention, the image processing comprises further analysis tools for recognition of personnel working in the vicinity of a CCU to be loaded/offloaded; optionally, personnel are categorized as, for example, authorized or unauthorized, thereby providing the possibility for various alarm levels to be raised when personnel are, for example, detected close to a moving CCU.
In a further embodiment of the invention, the image capturing and CCU identifying features may be used for making a detectable/identifiable CCU signature of a CCU by capturing side view images of the CCU. CCU Signatures may be based on side view and top view images and thus provide a more robust identification system, where a match in one of the two image directions may identify a specific CCU.
The present invention may also provide a system for registering, maintaining and controlling the whereabouts of CCUs in a logistics system. The system may comprise inventory listings for a plurality of CCU storage facilities, comprising one or more of onshore storage facilities, offshore storage facilities, and vessels such as PSVs. Further, the system may comprise a plurality of tools for CCU lifting/moving, such as cranes, forklifts and trucks. Each facility may have installed sensor units for capturing top view images of each CCU being handled, the sensor units being connected to computing resources and optionally to remote logistics system(s). The computing resources are capable of recognizing and identifying existing CCUs, and are also able to recognize and introduce new CCUs into the logistics system.
The scope of the invention is defined by the accompanying claims.
The invention is described by the following detailed description, and is exemplified by the non-limiting details of the following figures:
The invention is discussed in more detail using specific embodiments as illustrated in the attached figures. It is the intention of the inventors that some or all features shall be able to be utilized in any order or combination, and the figures shall not be regarded as limiting the scope of the invention, but rather as simplified examples for increasing the readability of the description.
The following description is further using some expressions having the following non-limiting meaning:
CCU—Cargo carrying Unit
Back-load—Cargo being returned from an offshore location
NCS—The Norwegian Continental Shelf
PSV—Platform supply vessel
AIS—Automatic Identification System. A radio-wave based identification system for vessels.
CCU ID—The identifier of a CCU (for example alphanumeric), defined by the customer/third party logistics system. It is used as a key for retrieving CCU meta-data from e.g. customers' logistics systems. May correspond to the alphanumeric characters painted/visible on the physical CCUs.
CCU Signature—The identifier for CCUs according to present invention consisting of descriptors like feature points, contours, dimensions, weight, color, last known location etc.
Vessel Signature—The identifier for vessels according to present invention, consisting of descriptors like feature points, contours, dimensions, color, AIS identifier etc.
Deck Signature—The identifier according to present invention for loading decks, consisting of descriptors like feature points, contours, dimensions, which installation it belongs to, colors etc.
OptiLift Signature—A common definition of CCU-, Vessel-, and/or Deck Signature.
Onshore base—Supply base located onshore where supply vessels get and deliver load and back-load.
Pose—3D position and orientation with respect to a reference coordinate system
Offshore installation—Offshore platform, semisubmersible rig, jack up-rig, FPSO (Floating Production, Storage and Off-loading), FPU (Floating Production Unit).
Loading site—an offshore or onshore area handling cargo/CCUs; the phrase is used in a wide context to identify any place cargo can be fetched from or delivered to.
Vessel—marine and land vehicle, ship or other transport used for moving goods
On-shore crane—Crane used on onshore bases.
Offshore crane—Crane used on offshore installations
Site Server—A server according to present invention located on an offshore installation or an onshore base.
Central Server—A server according to present invention located on shore. Can also be a server application in a cloud-environment.
Master or Master PC—The PC in the crane, connected to the OptiLift sensor unit.
Sensor Unit—The physical sensor unit containing camera, laser, stepper motors etc.
Viewer—If not otherwise stated, Viewer refers to the Viewer application of the software, visible to the crane operator.
RANSAC—RANdom SAmple Consensus, a common robust estimation method often used in computer vision and well known by a person skilled in the art
Levenberg-Marquardt optimization—provides a numerical solution to the problem of minimizing a function, generally nonlinear, over a space of parameters of the function.
Sobel gradients—edge detection algorithms where an image emphasizing edges is created
Bundle adjustment—simultaneously refining 3D coordinates describing a scene geometry
In the following, scenarios related to cargo transport to/from an offshore platform are used to demonstrate the invention. It shall be understood that the invention may be adapted for use in any warehousing handling cargo movement between storages and transport vessels. Examples of such alternative environments can, without excluding any, be: transport of goods between warehouses by truck/car, moving containers between container sites by train/lorry, or using airplanes to transport cargo between airports. Transport may well encompass relay stations where the purpose is to transfer cargo using a lifting/moving tool between types of transport means, such as a crane moving a container between, for example, train carrier and ship, ship to ship, truck to train and the like.
When the phrases deck map and storage site map are used, it should be understood that they comprise any type of area usable for storing or transporting cargo/CCUs, including, but not limited to: on/offshore container sites, ship cargo decks, train cargo decks, truck cargo decks, plane cargo decks, warehouse storage including automatic bay warehouses and belts, drones or other airborne transportation resources, marine cargo transport resources, and others.
When the phrase crane or moving tool is used, it should be understood that it comprises any type of moving tool such as, but not limited to: cranes, forklifts and trucks, manually or automatically controlled, bay warehouse lifting means and others.
Present invention provides various combinations of one or more sensor units, on-site and/or remote computer resources and network communication services for:
A high level system outline is illustrated in
A user, such as a crane operator, may be provided with data regarding a CCU to be loaded/unloaded to/from a vessel. The user may further be provided with map data of the vessel or area comprising the CCU of interest.
It is also within the scope of the invention to provide a web based solution for providing a dialogue platform for a user hosted on mobile and/or stationary communication units such as for example a smart phone or a PC.
It is further within the scope of the invention to provide a feature for maintaining a correct deck map, identifying the correct location of a CCU even where storage areas do not comprise sensor units for monitoring and maintaining the storage location of the CCU. The vessel deck storage area is swept whenever the vessel passes under a crane comprising a sensor unit, as exemplified by the image analyzed in
The system of present invention may provide user login and authentication.
The system of present invention may provide user configuration and management of settings.
The system of present invention may provide Onshore CCU tracking.
The system of present invention may provide tracking of contents inside CCUs.
The system of present invention may provide Tracking using hand-held devices on/offshore.
Key features of the present invention may comprise all or some of the following:
Cargo Identification:
Cargo Tracking:
Cargo Position Maintenance:
People Recognition and Tracking:
Other Features:
In prior art cargo lifting processes, human resources are required to read printed codes or scan bar/QR codes on cargo containers, or advanced tracking devices have been deployed on sites and on each cargo for electronic tracking of the cargo. These types of solutions are expensive to maintain, and often put personnel at risk, for example when a crane hook is in motion and the crane either fetches or delivers a container.
In order to keep track of a container, there are further challenges in maintaining its correct position. For example, when cargo is moved on deck there may be no record of this movement, and finding the correct container may be a time-consuming task.
In a first embodiment of the invention a sensor device is mounted in a crane tip of a cargo lifting crane, such as exemplified in
The sensor unit 2 as shown in one example in
When the sensor unit of the present invention is arranged in for example the crane boom as illustrated in
Based on the assumption that each CCU image captured from above in the present invention is unique, and cannot be misinterpreted, the network service system may be updated each time a captured image identifies a previously registered CCU.
Assuming that the computer system connected to a sensor unit of present invention has access to maps of all loading areas, or at least the deck maps of all vessels able to participate in the load/unload operation of a CCU to/from the area in which the image capturing unit operates, it may identify and update the present position of all CCUs coming into image capturing range of the sensor unit.
In the following, the network services are described as the "Central server", the on-site CCU handling service is described as the "Site server", and the software applications connected to the sensor unit are described as the "Master".
Storage capacity may be provided in all stages of the operation to improve reliability of the information communication between the network based services, the CCU handling sites, and the sensor units.
An overview of the information exchange structure, message handling, is described in
The sensor unit may comprise a camera, optional with heat sensitive features, an IR sensor and/or laser camera/lidar, for image or video capturing. When an image is captured and analyzed, a user, such as a crane operator, may be presented with the ability to select new/unknown CCUs in the image to be identified, tracked, and registered in a database of the network service system.
Using the sensor unit for automatic recognition of labels and/or various "tag" technologies may increase the accuracy of the system, for example by recognizing characters on the CCU using OCR, Optical Character Recognition, or by recognizing bar-codes or QR-codes visible to any of the cameras connected to the system. Another embodiment example may be provided wherein the invention comprises detectors for communicating/receiving identification signals from other transmitting sources such as RFID or GPS devices arranged in combination with cargo or personnel.
The Central server may be facilitated in a cloud environment and providing data mining and maintenance. The Central server may also provide advanced and resource-demanding computing resources. The Central server communicates with Site servers, both onshore and offshore. Site servers may comprise Masters. Masters may comprise image capturing means, such as sensor units described above. Any number of Masters may be comprised in a Site server environment, and any number of Site servers may be connected and comprised in a system maintained by a Central server environment. The number of CCUs may be numerous, and vary over time. CCUs may be introduced to the system on a non-regular basis, and CCUs may be terminated from the system on a non-regular basis.
The provision of an efficient and safe communication environment provides the ability to design a flexible operation environment, for example by moving all crane operator tasks from the physical crane implementation to a control facility arranged remotely from the crane itself. Remote crane operation could be arranged at Site Server or Central Server location or even anywhere reachable by a secure network/cloud connection.
Each Master, such as a crane, may comprise computing means, for retrieval and analysis of images and video streams, for operation control and command handling, operator dialogue, and for communication with the Site servers.
An overall module block diagram of a system according to present invention is shown in
A system database is provided with one or more instances of one or more of the following modules as described in
The detection and tracking functionality of the system of the invention contains and provides controls for the:
Cargo detection and tracking comprises the major functionalities which are:
The
The object (e.g. cargo, vessel, platform etc.) may be tracked using a frame-to-frame tracking method 607, 614. In frame-to-frame tracking 607, 614 the initial target object signature (e.g. the CCU Signature) is replaced with the previous frame in which the object was detected. Frame-to-frame tracking is robust to small inter-frame variations and improves the estimation of the object pose.
Now, each step in the process where the invention is used in a 3D model-based tracking of objects will be discussed in more detail.
The Feature Pose Initialization (Model based 3D tracking) 601, 609 encompasses an input being provided including an image containing the target region (for example the cargo, vessel or platform), and the output of this step in the process shall provide 3D feature points (x, y, z) with 2D descriptors of the feature points. A step by step process for this is shown in
The Feature Point Matching (Model based 3D tracking) of a platform, ship or cargo 602, 610, 616 encompasses an input being provided including feature points and descriptors for the target or previous frame, and the current frame. The output of this step in the process shall provide matching feature points between the target template and the current frame, and updated feature points and descriptors for the target template. A step by step process for this is shown in
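The matching step can be illustrated with a minimal pure-Python sketch, here using integer-encoded binary descriptors compared by Hamming distance with a Lowe-style ratio test. The descriptor format, the 0.8 ratio threshold, and all data values are illustrative assumptions, not part of the claimed method:

```python
# Sketch of feature point matching between a target template and the
# current frame. Descriptors are modelled as integer bit patterns
# compared by Hamming distance; the ratio test rejects ambiguous matches.

def hamming(d1: int, d2: int) -> int:
    """Number of differing bits between two binary descriptors."""
    return bin(d1 ^ d2).count("1")

def match_features(target, frame, ratio=0.8):
    """Return (target_index, frame_index) pairs passing the ratio test.

    target, frame: lists of (point, descriptor) tuples.
    """
    matches = []
    for i, (_, td) in enumerate(target):
        # Distance from this target descriptor to every frame descriptor.
        dists = sorted((hamming(td, fd), j) for j, (_, fd) in enumerate(frame))
        if not dists:
            continue
        if len(dists) == 1 or dists[0][0] < ratio * dists[1][0]:
            matches.append((i, dists[0][1]))
    return matches

target = [((10, 20, 0), 0b10110010), ((30, 40, 0), 0b01101100)]
frame = [((12, 21), 0b10110011), ((31, 39), 0b01101100), ((90, 5), 0b11111111)]
print(match_features(target, frame))  # → [(0, 0), (1, 1)]
```

In a production pipeline this role would typically be filled by a library matcher over real descriptors (e.g. ORB/BRIEF), but the ratio-test structure is the same.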
The Frame-to-Frame Tracking of an object encompasses an input being provided including the previous frame, the target region in the previous frame and the current frame. The output of this step in the process shall provide the rotation and translation of the target object between the previous frame and the current frame (i.e. the pose change between the frames). A step by step process for this is shown in
As an alternative to step 906 and 907, the following approach may be used:
The Update Target Signature 604, 612, 618 encompasses an input being provided including the object pose, the matched feature points and descriptors and the detected feature points and descriptors. The output of this step in the process shall provide updated feature points and descriptors for the target signature. A step by step process for this is shown in
The Pose Estimation in 3D 603, 611, 617 encompasses an input being provided including matched feature points (i.e. matches between the 3D feature points in the target template and the feature points detected in the current frame), and intrinsic camera parameters (i.e. focal length and camera center). The output of this step in the process shall provide the current 3D position and orientation of the object (i.e. a rotation, for example in the form of a rotation vector or matrix R, and a translation vector T). A step by step process for this is shown in
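The 3D case would typically be solved with a perspective-n-point (PnP) solver, optionally inside a RANSAC loop. As a compact runnable illustration of recovering a rotation and translation from matched points, the following pure-Python sketch solves the simpler 2D rigid case (a least-squares, Kabsch-style alignment); it is a simplification, not the claimed 3D method:

```python
import math

def estimate_pose_2d(pts_a, pts_b):
    """Least-squares rotation angle and translation mapping pts_a onto pts_b.

    pts_a, pts_b: equal-length lists of matched (x, y) points.
    Returns (theta_radians, (tx, ty)) such that b ≈ R(theta)·a + t.
    """
    n = len(pts_a)
    cax = sum(p[0] for p in pts_a) / n
    cay = sum(p[1] for p in pts_a) / n
    cbx = sum(p[0] for p in pts_b) / n
    cby = sum(p[1] for p in pts_b) / n
    # Accumulate cross and dot products of the centred point sets;
    # their ratio gives the optimal rotation angle.
    s_cross = s_dot = 0.0
    for (ax, ay), (bx, by) in zip(pts_a, pts_b):
        ax, ay, bx, by = ax - cax, ay - cay, bx - cbx, by - cby
        s_cross += ax * by - ay * bx
        s_dot += ax * bx + ay * by
    theta = math.atan2(s_cross, s_dot)
    # Translation maps the rotated centroid of a onto the centroid of b.
    tx = cbx - (math.cos(theta) * cax - math.sin(theta) * cay)
    ty = cby - (math.sin(theta) * cax + math.cos(theta) * cay)
    return theta, (tx, ty)

# A unit square rotated 90 degrees and shifted by (5, 2):
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
moved = [(5, 2), (5, 3), (4, 3), (4, 2)]
theta, t = estimate_pose_2d(square, moved)
print(round(math.degrees(theta)), (round(t[0], 6), round(t[1], 6)))  # → 90 (5.0, 2.0)
```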
In the object Motion Model 605, 613, 619 the object motion model component estimates the current position and orientation of the object. The object motion model may also predict the position of the object based on old observations and the motion model. Examples of motion models are constant velocity and oscillation behavior. The main tasks of the object motion model are to give a more accurate estimation of the current position of the object and, in case of a missing observation, to give a prediction of the current position.
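The constant-velocity case mentioned above can be sketched in a few lines; the blend gain and the one-axis simplification are assumptions for illustration, and a real implementation would more likely use a Kalman filter over the full pose:

```python
class ConstantVelocityModel:
    """Constant-velocity motion model for one coordinate axis.

    update() blends a new observation into the estimate; predict()
    extrapolates the position when an observation is missing.
    """

    def __init__(self, pos, vel=0.0, gain=0.5):
        self.pos, self.vel, self.gain = pos, vel, gain

    def predict(self, dt=1.0):
        """Predicted position dt time units ahead of the last estimate."""
        return self.pos + self.vel * dt

    def update(self, observed, dt=1.0):
        """Refine position and velocity from a new observation."""
        predicted = self.predict(dt)
        new_pos = predicted + self.gain * (observed - predicted)
        self.vel = (new_pos - self.pos) / dt
        self.pos = new_pos
        return self.pos

model = ConstantVelocityModel(0.0)
for observed_x in (1.0, 2.0, 3.0):  # object drifting one unit per frame
    model.update(observed_x)
print(model.predict())  # → 4.0
```

When a frame yields no detection, `predict()` supplies the position estimate, which is exactly the missing-observation role described above.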
It is possible to track multiple objects in the 2D and/or 3D vision methods simultaneously. The depth/distance to each object can then be sampled using the range measurement device when it hits each object.
The system performs the initial detection of objects, such as cargo or people, for a given camera and thermal image using a multitude of methods. Certain objects of known size can be detected using image analysis with known features; for example, containers can be identified using rectangle detection. Furthermore, Deep Neural Network (DNN) algorithms may be used for initial cargo and people detection. The DNN used may for example be formulated as a region-based convolutional neural network as defined by Shaoqing, Kaiming, Ross, & Jian, 2015, but since DNNs and artificial intelligence are rapidly developing, it is within the scope of the invention to take advantage of potential improvements in this technical field.
It is further optionally provided a logistics tool that uses a DNN to perform initial detection of CCUs and people in a given image. The diagram in
The detection of previously registered CCUs may be performed by fusion of various techniques, which is further defined below. The initial detection may be performed as follows:
The CCU initialization process initializes the cargo for a given CCU Id, using one or more of:
Initialization of the cargo via signature comprises a process of loading a binary blob defining the states of various detectors for a given cargo, and the content of a blob is defined below.
Detectors are explained as based on image interpretations, but may also be based on other information sources such as, but not limited to: weight, last known location, movement patterns of cargo and/or deck and/or people, vessel transport route, neighbouring containers, communication tags, visible labels and others.
The initialization from image is performed as follows:
The tracking function of the cargo detection and tracking feature may have one or more of the parameters of:
The tracking function of the cargo detection and tracking feature may perform the following operations:
A data fusion process is provided which aims to combine detection results coming from all detectors into correct/refined results. The process works as:
estimationVar(M_n,i) = (1^T · C_n,i^-1 · 1)^-1
If multiple detectors estimate the location of a cargo with very similar center positions, then all these estimates would produce similar estimation variances. In this situation, the weight of detectors is compared to select the estimate coming from the most reliable detector.
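This fusion step can be sketched in pure Python, assuming scalar per-detector variances; the tie tolerance and all numeric values below are illustrative assumptions, not values from the description:

```python
def fuse_estimates(estimates, tie_tol=1e-6):
    """Combine detector estimates of a cargo position into one value.

    estimates: list of (position, variance, detector_weight) tuples.
    Low-variance estimates dominate the inverse-variance average; when
    variances are practically equal, the estimate coming from the most
    reliable (highest-weight) detector is selected, as described above.
    """
    variances = [v for _, v, _ in estimates]
    if max(variances) - min(variances) < tie_tol:
        # Estimation variances are indistinguishable: compare detector
        # weights and take the most reliable detector's estimate.
        return max(estimates, key=lambda e: e[2])[0]
    inv = [1.0 / v for v in variances]
    return sum(p * w for (p, _, _), w in zip(estimates, inv)) / sum(inv)

# A precise detector at x=10 outweighs a noisier one at x=14:
print(round(fuse_estimates([(10.0, 1.0, 0.5), (14.0, 3.0, 0.9)]), 6))  # → 11.0
# Equal variances: the higher-weight detector (0.9) wins:
print(fuse_estimates([(10.0, 1.0, 0.5), (14.0, 1.0, 0.9)]))  # → 14.0
```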
The cargo detection manager can save and load its state to and from external files and binary blobs. A state file may have a structure as shown in
One optional implementation of a database maintenance system is illustrated in a high level flow chart in
The cargo detection and tracking feature uses a prioritization feature to ensure that the system can perform tracking of high numbers of cargo, while minimizing the performance reduction due to high load. The prioritization feature may determine the cargo IDs that should be detected for the given image frame. It achieves this by first setting:
from a configuration file.
Each cargo holds a detection frequency value, which varies between the high and low detection frequencies. Whenever a cargo is initiated or detected in a frame, its detection frequency is set to the high detection frequency. For the frames where the cargo is not detected, the detection frequency is reduced by a linear function, shown in
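The linear reduction can be sketched as follows; the high/low frequencies and the decay per undetected frame are assumed example values, which in the actual system would come from the configuration file mentioned above:

```python
def detection_frequency(frames_undetected, high=30.0, low=2.0, decay=1.5):
    """Detection frequency after a number of undetected frames.

    Set to `high` when the cargo is initiated or detected; reduced
    linearly for every frame without a detection, floored at `low`.
    """
    return max(low, high - decay * frames_undetected)

print(detection_frequency(0))    # → 30.0  (just detected)
print(detection_frequency(10))   # → 15.0  (decaying linearly)
print(detection_frequency(100))  # → 2.0   (floored at the low frequency)
```

Cargos whose frequency has decayed to the floor are still revisited, just far less often, which bounds the per-frame detection workload.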
The computer system connected to the sensor unit may analyze the image or video stream, and detect and track CCUs as described in
The logistics tool currently offers three preferred types of Feature Based Detection, but is not limited to these, as the method is not limited to any specific type of Feature Based Detection and may use any existing or future methods:
The process method of the Feature Based Detection may be set up to perform the following for each initiated cargo:
For each detected cargo, a CCU signature may be updated according to the following:
In
Texture based detection method treats the cargo as a single feature where the descriptor is defined using the texture of it, and detects the texture during the tracking. The texture may be defined using, but not limited to, the following algorithm:
The tracking may then be performed for a given image and cargo as follows:
It should be noted that the texture based detection method provides rotation invariant detection. However, scaling is not handled during the above procedure. To handle scaling properly, various scales of the template should be matched to the image. For performance reasons, the texture based detection method tests a different scale for each frame. If a template match happens, it keeps using the found scale until it loses the tracking. When tracking is lost, the texture based detection method seeks to match different scales in the following frames. The scale adjustment percentage may be configurable.
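The scale handling described above can be sketched as a small state machine; the scale ladder values are illustrative assumptions (the actual adjustment percentage is configurable):

```python
class ScaleCycler:
    """While tracking is lost, try a different template scale each frame;
    once a match is found, keep that scale until tracking is lost again."""

    def __init__(self, scales=(0.8, 0.9, 1.0, 1.1, 1.25)):
        self.scales = scales
        self.index = 0
        self.locked = None  # scale in use while tracking succeeds

    def next_scale(self):
        """Scale to try for the current frame."""
        if self.locked is not None:
            return self.locked
        scale = self.scales[self.index]
        self.index = (self.index + 1) % len(self.scales)
        return scale

    def report(self, matched, scale):
        """Feed back whether the template matched at `scale`."""
        self.locked = scale if matched else None

cycler = ScaleCycler()
print(cycler.next_scale())  # → 0.8  (searching)
print(cycler.next_scale())  # → 0.9  (searching)
cycler.report(True, 0.9)    # match found: lock this scale
print(cycler.next_scale())  # → 0.9  (locked)
cycler.report(False, 0.9)   # tracking lost: resume cycling
print(cycler.next_scale())  # → 1.0
```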
The position based detector provides two main processes: detection and update.
The detection process computes the cargo position with respect to the camera, based on stored pose information. The update process refines the cargo position with respect to the vessel and platform. Accordingly, the detection process is called together with the detection methods of all other detectors by the cargo detection and tracking feature, and the update process is called right after the detection.
Update process takes place after the detection process, where the final detection results are used for refining each detected cargo's position with respect to the vessel and platform. For each detected cargo, the update process:
cargoWrtVessel = vesselWrtCamera^-1 · cargoWrtCamera
cargoWrtPlatform = platformWrtCamera^-1 · cargoWrtCamera
The detection process uses the information refined in the update process to estimate the location of cargo with respect to vessel or platform. For each cargo,
cargoWrtCamera = vesselWrtCamera · cargoWrtVessel
cargoWrtCamera = platformWrtCamera · cargoWrtPlatform
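These pose relations are compositions of rigid transforms. The sketch below illustrates the update step (cargoWrtVessel = vesselWrtCamera^-1 · cargoWrtCamera) and the reverse detection step using the simpler 2D homogeneous (3×3) case with assumed translation-only poses; the actual system would use full 3D poses:

```python
def mat_mul(a, b):
    """Multiply two 3x3 matrices given as row-major nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rigid_inverse(t):
    """Invert a 2D homogeneous rigid transform [[R, t], [0, 1]]."""
    # Inverse rotation is the transpose; inverse translation is -R^T · t.
    r00, r01, r10, r11 = t[0][0], t[1][0], t[0][1], t[1][1]
    tx = -(r00 * t[0][2] + r01 * t[1][2])
    ty = -(r10 * t[0][2] + r11 * t[1][2])
    return [[r00, r01, tx], [r10, r11, ty], [0.0, 0.0, 1.0]]

# Assumed example poses (identity rotation, translation only):
vesselWrtCamera = [[1.0, 0.0, 2.0], [0.0, 1.0, 1.0], [0.0, 0.0, 1.0]]
cargoWrtCamera = [[1.0, 0.0, 5.0], [0.0, 1.0, 4.0], [0.0, 0.0, 1.0]]

# Update step: express the cargo pose in the vessel frame.
cargoWrtVessel = mat_mul(rigid_inverse(vesselWrtCamera), cargoWrtCamera)
print(cargoWrtVessel[0][2], cargoWrtVessel[1][2])  # → 3.0 3.0

# Detection step recovers the camera-relative pose by composition.
back = mat_mul(vesselWrtCamera, cargoWrtVessel)
print(back[0][2], back[1][2])  # → 5.0 4.0
```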
The image collection logic determines whether an image is collectible based on a cargo's position with respect to the camera and on the iris opening. Accordingly, any of the following three conditions triggers an image collection:
Collision is avoided by CCU detection and identification, people detection and motion course computation via extrapolation of detected motion as outlined in diagram in
Duplicated cargo signatures will be found locally by checking the proximity of all tracked cargos in each image. If two or more cargos have almost the same image position (within for example 10 pixels), they are considered to be duplicates of the same cargo.
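The proximity check can be sketched as follows; the 10-pixel threshold is the example value from the description, while the data layout and single-pass grouping (no transitive chaining) are assumptions for illustration:

```python
def find_duplicates(tracked, threshold=10.0):
    """Group tracked cargos whose image positions nearly coincide.

    tracked: dict mapping cargo_id -> (x, y) image position.
    Returns a list of sets of cargo ids considered duplicates
    (positions within `threshold` pixels of each other).
    """
    ids = list(tracked)
    groups = []
    used = set()
    for i, a in enumerate(ids):
        if a in used:
            continue
        group = {a}
        ax, ay = tracked[a]
        for b in ids[i + 1:]:
            bx, by = tracked[b]
            # Euclidean pixel distance between the two tracked positions.
            if ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 <= threshold:
                group.add(b)
                used.add(b)
        if len(group) > 1:
            groups.append(group)
    return groups

tracked = {"A": (100, 100), "B": (104, 103), "C": (300, 50)}
print(find_duplicates(tracked))  # → [{'A', 'B'}]
```

Each returned group would then be reduced to a single cargo according to the removal priority described next.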
Duplicates may be removed locally according to for example the following priority:
Now six different use cases will be discussed, as described in
The system and/or the data may be used by the following group of users:
The figure defines five different entities: a load plan, which is set up in advance of the loading operation; an AIS, which might be an Automatic Identification System, for example a radio-wave based identification system for vessels; the crane operator, an on-site or remotely located person or function operating lifting/moving equipment, for example a crane, together with the instruments associated with crane operation and the sensors for collecting information about the loading area, for example the cameras and lasers in the sensor unit; a third party server, comprising a data management system which may provide or receive one or more of relevant data for a CCU, the location of a CCU, platform or vessel maps, and onshore base information; and deck personnel, being personnel or automated machinery handling manual work related to cargo handling on the vessel deck or other storage areas.
The process comprises several independent operations, wherein the identification of a vessel, such as a Platform Supply Vessel, PSV, triggers a predefined load plan to be identified for the identified PSV.
Identification of a vessel and/or cargo depends on having the correct scale or distance. Having the correct distance may solve the scale if one of the cargo or deck dimensions is known. If cargo or deck dimensions are known, the scale can be calculated by measuring pixel ratios of various cargo measures or ship boundaries. Present invention utilizes more than one mechanism for defining the correct scale. Alternatively, any type of object/area with known dimensions may be used for calculating all other dimensions.
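As a minimal numeric illustration of solving scale from one known dimension (the container length and pixel counts below are assumed example values, not from the description):

```python
def pixel_scale(known_length_m, measured_pixels):
    """Metres per pixel, from an object of known physical dimension."""
    return known_length_m / measured_pixels

def estimate_dimension(pixels, scale):
    """Physical size of another pixel span at the same viewing distance."""
    return pixels * scale

# A 20 ft container (6.06 m long) spans 303 px in the captured image:
scale = pixel_scale(6.06, 303)
print(round(scale, 4))  # → 0.02  (metres per pixel)

# Any other span in the same image can now be converted, e.g. a deck area:
print(round(estimate_dimension(1200, scale), 2))  # → 24.0  (metres)
```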
The PSV may be identified by signaling its signature over radio communication, or a sensor unit may capture an image which is analyzed and the PSV id recognized, or the correct PSV identity may be manually input to the system.
Once the PSV is identified and the load plan is established, the activities will be related to one of: loading a CCU onto the PSV, loading a CCU off the PSV, moving a CCU on the PSV, or monitoring the deck and updating the present deck map.
The latter, monitoring the deck and updating the present deck map, does not necessarily involve moving a CCU, but is a continuous operation ensuring that the deck map is updated and mirrors the actual location of the different CCUs on deck. Each time a CCU is captured in an image and the processing recognizes the CCU signature, its physical location is identified and compared with the deck map stored in the system. If there is a mismatch between the registered and the real location, the database is updated to mirror the actual physical location.
When the load plan identifies a lift of a CCU from the PSV, the location is fetched from the deck map, and when the CCU is in view of the sensor unit, an image of the CCU is processed to find the CCU Signature. If the CCU signature matches a related CCU ID in the database, the CCU is hooked, lifted and reported as in movement. If the CCU ID cannot be found, the CCU ID needs to be created in the system. This may be achieved by importing correct CCU data from labels or otherwise, manually or automatically. If the CCU is new to the system, the CCU signature is stored with the CCU ID and relevant data in the system.
When the CCU is in movement, the system may notify third party server that the CCU is moving.
When the CCU has been moved to a planned destination, and is un-hooked, the deck map may be updated with the new position of the CCU, either on the deck of the PSV or on shore. Either way the deck plan is updated.
The sensor unit arranged in lifting equipment may continuously monitor all CCUs coming into the aperture view of the camera(s) or sensors in the sensor unit. Each CCU is analyzed and CCU signature established and checked with deck map. If inconsistency with expected location of the CCU is detected, the deck map may be updated.
The scenario in
When a “new” CCU is hooked by the crane, the image of it is captured by the sensor unit, and a CCU Signature is created associated with a pre-stored CCU ID, automatically or with manual input. When the CCU is un-hooked the new position is registered, and deck map is updated. CCU Signature is also updated.
When a CCU is undergoing an internal lifting offshore as described in
When a “new” CCU is hooked by the crane, the image of it is captured by the sensor unit, and a CCU Signature is created, automatically or with manual input. When the CCU is un-hooked the new position is registered, and deck map is updated. CCU Signature is also updated.
In
The planning is about creating an acceptably efficient sailing route and utilizing the capacity of the PSVs. Logistics personnel may receive shipping orders from several offshore installations or vessels, and by comparing the orders with the available space on the offshore installations (deck maps) and on the PSV (vessel map) planned for executing the transport task, a number of CCUs may be identified for lift/transport.
When the offshore installations planned to be visited on the sailing route are established, the logistics personnel may accept back-load requests from the offshore installations and allocate free space for these on the PSV once CCUs have been loaded to the offshore installation in question.
It is possible to view the tasks of creating a sailing route and creating a load plan as two separate tasks. Creating a load plan is part of the present invention, while the sailing route may be governed by many external factors; although the sailing plan influences the load plan, the reverse is not necessarily the case. Creating a load plan for the individual offshore installation and PSV involves retrieving the deck maps and the vessel map, and further retrieving the latest registered location of the CCUs planned to be lifted on or off the PSV.
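Load plan creation as described above (retrieve the latest registered CCU locations from the deck maps, then allocate CCUs against available PSV capacity) can be sketched as follows. The data shapes, the function names and the simple one-slot-per-CCU capacity model are all assumptions made for this illustration.

```python
# Illustrative sketch of load plan creation; data layout and capacity
# model are assumed, not taken from the original disclosure.

def find_location(deck_maps, ccu_id):
    """Latest registered location of a CCU across all deck maps."""
    for site, ccus in deck_maps.items():
        if ccu_id in ccus:
            return site, ccus[ccu_id]
    return None

def build_load_plan(orders, deck_maps, vessel_capacity):
    """orders: CCU IDs requested for transport.
    deck_maps: {installation: {ccu_id: position}}.
    Returns (plan, rejected); plan maps each CCU ID to its pickup location."""
    plan, rejected = {}, []
    for ccu_id in orders:
        location = find_location(deck_maps, ccu_id)
        if location is None or len(plan) >= vessel_capacity:
            rejected.append(ccu_id)   # unknown location or PSV deck full
        else:
            plan[ccu_id] = location
    return plan, rejected
```

A rejected list would feed back into planning, e.g. for a later sailing or for resolving CCUs whose location is not registered.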
In a further use scenario as illustrated in
Logistics personnel may, by receiving the current deck map from the offshore installation, the reserved deck areas plan, the planned back load and the list of CCUs left on the offshore installation, estimate the realistic available deck space before and after the planned PSV loading/unloading at the offshore installation.
The invention is further defined by the following embodiment descriptions:
A first method embodiment for detection and tracking of a Cargo Carrying Unit, CCU, comprising the following steps:
detecting, or creating for new CCUs, a unique CCU signature, wherein the unique CCU signature is constructed by analyzing images of the CCU captured by a sensor device, and analyzing the images according to one or more predefined detection methods, and
the analysis providing a combination of one or more descriptors defined by CCU feature points, contours, dimensions, weight, colour, movement pattern, neighbour CCU feature point, planned travel route or last known location.
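One possible way to fuse the descriptor fields listed above into a single comparable signature is sketched below. This is an assumption-heavy simplification: a real system would use feature vectors and a similarity metric rather than an exact hash, and the function name and field labels are invented for the example.

```python
# Hypothetical descriptor-fusion sketch: combine whichever descriptors are
# available into one stable digest. Real matching would be approximate.
import hashlib

def ccu_signature(feature_points=None, dimensions=None, colour=None,
                  weight=None, last_location=None):
    """Combine available descriptors into a stable hex digest.
    Missing descriptors are skipped, so a partial observation
    still yields a signature."""
    parts = []
    for name, value in [("fp", feature_points), ("dim", dimensions),
                        ("col", colour), ("wt", weight),
                        ("loc", last_location)]:
        if value is not None:
            parts.append(f"{name}={value!r}")
    return hashlib.sha256("|".join(parts).encode()).hexdigest()
```

The design choice to skip missing fields reflects the claim language: the signature is "a combination of one or more descriptors", not a fixed set.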
A second method embodiment for detection and tracking of a Cargo Carrying Unit according to the first method embodiment, wherein the predefined detection methods comprise one or more of:
A third method embodiment for detection and tracking of a Cargo Carrying Unit according to the first or second method embodiment, wherein feature based detection comprises one of:
for each initiated CCU the method comprises:
A fourth method embodiment for detection and tracking of a Cargo Carrying Unit according to the third method embodiment, wherein the CCU signature is updated according to the following:
A fifth method embodiment for detection and tracking of a Cargo Carrying Unit according to the second method embodiment, wherein texture based detection treats the CCU as a single feature where the descriptor is defined by the texture of the CCU, and the texture is defined during the tracking using the following selection criteria:
A sixth method embodiment for detection and tracking of a Cargo Carrying Unit according to the fifth method embodiment, wherein tracking is performed for a given image and cargo according to the following steps:
A seventh method embodiment for detection and tracking of a Cargo Carrying Unit according to the second method embodiment, wherein position based detection comprises both a detection process and an updating process, wherein
computing the cargo position with respect to the camera in the detection process, based on stored pose information according to any of the third to sixth method embodiments, and
refining the cargo position with respect to the vessel and platform in update process.
An eighth method embodiment for detection and tracking of a Cargo Carrying Unit according to the seventh method embodiment, wherein the detection process is followed by the update process, and where the final detection results are used for refining each detected CCU's position with respect to the vessel and platform, wherein for each detected cargo, the updating process comprising the following steps:
cargoWrtVessel = vesselWrtCamera⁻¹ · cargoWrtCamera
cargoWrtPlatform = platformWrtCamera⁻¹ · cargoWrtCamera
A ninth method embodiment for detection and tracking of a Cargo Carrying Unit according to the eighth method embodiment, wherein the detection process further uses the information refined in the update process to estimate the location of CCU with respect to vessel or platform, wherein for each CCU performing one of the following method steps:
cargoWrtCamera = vesselWrtCamera · cargoWrtVessel
cargoWrtCamera = platformWrtCamera · cargoWrtPlatform.
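The update and prediction equations of the eighth and ninth method embodiments can be checked numerically. The sketch below uses 2-D homogeneous transforms in pure Python for brevity (the actual system would use full 3-D poses); all helper names are invented for the example.

```python
# Numeric check of the pose equations: updating computes
# cargoWrtVessel = vesselWrtCamera^-1 . cargoWrtCamera, and prediction
# (cargoWrtCamera = vesselWrtCamera . cargoWrtVessel) must recover the
# original observation. 2-D rigid transforms are used for brevity.
import math

def transform(theta, tx, ty):
    """3x3 homogeneous pose: rotation theta plus translation (tx, ty)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, tx], [s, c, ty], [0.0, 0.0, 1.0]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def inverse(t):
    """Inverse of a rigid transform: transpose the rotation,
    rotate and negate the translation."""
    r = [[t[0][0], t[1][0]], [t[0][1], t[1][1]]]   # R^T
    tx = -(r[0][0] * t[0][2] + r[0][1] * t[1][2])
    ty = -(r[1][0] * t[0][2] + r[1][1] * t[1][2])
    return [[r[0][0], r[0][1], tx], [r[1][0], r[1][1], ty], [0.0, 0.0, 1.0]]

# Update step: cargo pose expressed in the vessel frame
vessel_wrt_camera = transform(math.pi / 2, 10.0, 0.0)
cargo_wrt_camera = transform(0.0, 10.0, 5.0)
cargo_wrt_vessel = matmul(inverse(vessel_wrt_camera), cargo_wrt_camera)

# Prediction step: composing back must reproduce the observation
back = matmul(vessel_wrt_camera, cargo_wrt_vessel)
```

The platform-frame equations are identical in form, with platformWrtCamera substituted for vesselWrtCamera.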
A tenth method embodiment for detection and tracking of a Cargo Carrying Unit according to the first method embodiment, wherein the CCU signature is associated with a unique CCU ID stored in a logistic system, and if the CCU signature cannot be associated with a preregistered CCU ID: creating a new association between the CCU signature and a CCU ID found in the logistic system.
An eleventh method embodiment for detection and tracking of a Cargo Carrying Unit according to any of the first to tenth method embodiment, wherein the method further comprising updating position data associated with CCU ID in the logistic system according to last identified movement of CCU during tracking of the CCU.
A twelfth method embodiment for detection and tracking of a Cargo Carrying Unit according to any of the first to eleventh method embodiment, wherein the method further comprising using the sensor device for recognizing and tracking, by pattern recognition, personnel on one of vessel deck or loading sites, and issuing a predefined warning level upon detecting personnel and CCU being on a crossing path.
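The crossing-path warning of the twelfth method embodiment can be illustrated by projecting the tracked CCU and each detected person a few steps ahead and deriving a warning level from the minimum predicted separation. The look-ahead horizon, distance thresholds and level names below are assumptions chosen for the sketch, not values from the disclosure.

```python
# Hypothetical crossing-path check: thresholds, horizon and level names
# are illustrative assumptions only.

def warning_level(ccu_pos, ccu_vel, person_pos, person_vel,
                  steps=10, dt=0.5, danger=2.0, caution=5.0):
    """Return 'danger', 'caution' or 'clear' based on the closest
    predicted approach (in metres) over the look-ahead horizon."""
    min_sep = float("inf")
    for k in range(steps + 1):
        # Constant-velocity extrapolation of both tracks at time k*dt
        cx = ccu_pos[0] + ccu_vel[0] * k * dt
        cy = ccu_pos[1] + ccu_vel[1] * k * dt
        px = person_pos[0] + person_vel[0] * k * dt
        py = person_pos[1] + person_vel[1] * k * dt
        min_sep = min(min_sep, ((cx - px) ** 2 + (cy - py) ** 2) ** 0.5)
    if min_sep < danger:
        return "danger"
    return "caution" if min_sep < caution else "clear"
```

A graded result (rather than a binary alarm) matches the claim's "predefined warning level" wording.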
A thirteenth method embodiment for detection and tracking of a Cargo Carrying Unit according to any of the first to twelfth method embodiment, wherein the method further comprising analyzing the images and recognizing one or more of characters, bar-codes or QR codes on a tag associated with the CCU, and further using the tag information to increase accuracy in defining the CCU signature.
A fourteenth method embodiment for detection and tracking of a Cargo Carrying Unit according to any of the first to thirteenth method embodiment, wherein the method further comprising detecting an RFID or GPS signal associated with the CCU, and further using the RFID or GPS signal information to increase accuracy in defining the CCU-signature.
A first system embodiment for detecting and tracking movement of CCUs in a logistic system, the system comprising:
at least one loading site comprising at least one crane having a sensor unit installed for capturing images of loading area below the crane,
at least one vessel able to carry CCUs between loading site facilities,
at least one loading site facility having at least one tool for loading CCUs to and from vessel, the loading site facility further comprising utilities for inputting CCU IDs of CCUs into a logistic system,
a logistic system maintained in a network connected computer environment,
a data transmission network for connecting loading site facilities, wherein each sensor unit is connected to a local computer resource for analyzing images captured by the sensor unit.
A second system embodiment for detecting and tracking movement of CCUs in a logistic system according to the first system embodiment, wherein the local computer resource further comprises a display unit for communicating images and for output and input of operation specific commands and parameters concerned with lifting operations, and the system further comprising storage resources at the computer resources for storing communication data to provide later communication of data if the transmission network is disconnected.
A third system embodiment for detecting and tracking movement of CCUs in a logistic system according to the first system embodiment, wherein the network computer environment comprises, at a site remote from the local computer resource, a display unit for communicating images and for output and input of operation specific commands and parameters concerned with lifting operations, and the system further comprising storage resources at the computer resources for storing communication data to provide later communication of data if the transmission network is disconnected.
A fourth system embodiment for detecting and tracking movement of CCUs in a logistic system according to any of the first to third system embodiment, wherein the system further comprises a server computer resource arranged at the loading site for communicating with each of the sensor units arranged in loading tools on the loading site, and wherein the logistic system is a third party logistic system, and an interface module is provided for communication between the server computer resource and the third party logistic system.
A fifth system embodiment for detecting and tracking movement of CCUs in a logistic system according to any of the first to fourth system embodiment, wherein the system further comprises in the logistic system a loading site map for each loading site comprising the CCUs stored on the loading site, and a deck map of each vessel comprising the CCUs stored on the deck of the vessel.
A sixth system embodiment for detecting and tracking movement of CCUs in a logistic system according to any of the first to fifth system embodiment, wherein the system further comprises one or more load plans for planned CCU transport between two or more loading sites and storage facilities, each load plan comprising the deck map of the vessel used in the planned transport and the loading site maps of the loading sites planned to be visited during transport.
A seventh system embodiment for detecting and tracking movement of CCUs in a logistic system according to any of the first to sixth system embodiment, wherein the system further comprises one or more of an OCR module and a bar- or QR-code decoding module, for recognizing information from a tag associated with a CCU.
An eighth system embodiment for detecting and tracking movement of CCUs in a logistic system according to any of the first to seventh system embodiment, wherein the system further comprises one or more of an RFID or GPS communication module, for receiving information from an RFID or GPS communication module associated with a CCU.
A ninth system embodiment for detecting and tracking movement of CCUs in a logistic system according to any of the first to eighth system embodiment, wherein the server computer resource and/or the local computer resource comprise resources for detection and tracking of CCUs according to any of the first to fourteenth method embodiment for detection and tracking of a Cargo Carrying Unit.
A first Sensor unit assembly (1) embodiment for capturing images and video of sites and objects, the sensor unit assembly comprising:
a sensor unit (2) comprising one or more sensors (3, 4, 5),
attachment means (7),
a connector box (6) comprising power means and connectors for external wiring and/or communication means for wired or wireless communication with remote computing means,
connectors and cabling (8) for connecting the sensors (3, 4, 5) to the connector box (6), wherein at least one sensor (3, 4, 5) is a camera.
A second Sensor unit assembly (1) embodiment according to the first Sensor unit assembly (1) embodiment, wherein at least one sensor is a heat sensitive camera.
A third Sensor unit assembly (1) embodiment according to any of the first to second Sensor unit assembly (1) embodiment, wherein the further sensors comprise one or more of: an IR sensor, a laser camera, a lidar, or a radar.
A fourth Sensor unit assembly (1) embodiment according to any of the first to third Sensor unit assembly (1) embodiment, wherein the sensor unit comprises one or more of an OCR module and a bar- or QR-code decoding module.
A fifth Sensor unit assembly (1) embodiment according to any of the first to fourth Sensor unit assembly (1) embodiment, wherein the sensor unit comprises one or more of an RFID or GPS communication module.
A sixth Sensor unit assembly (1) embodiment according to any of the first to fifth Sensor unit assembly (1) embodiment, wherein the sensor unit is attached to the attachment means (7) via a rotation frame (9) comprising rotational movement means (20) and bearing means (21).
Number | Date | Country | Kind |
---|---|---|---|
20180178 | Feb 2018 | NO | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/NO2019/050029 | 2/1/2019 | WO | 00 |