This patent disclosure relates generally to attachment of work tools to a machine, and more particularly, to a method and a system for computer vision assisted work tool recognition and installation for a machine.
Machines have multiple types of work tools or attachments for different work purposes. For example, a wheel loader may use a bucket for moving earth, and may use a fork for picking up pallets. Changing work tools safely and quickly is one of the basic qualifications of a good machine operator. However, a new operator may require extensive training to master this skill.
A typical challenge encountered by the machine operator when changing the work tool is that additional manual operations may be required to change the machine's output mode settings. Usually, the machine operator has to select a proper machine output mode for a specific work tool by manually pressing a button on a control panel. Forgetting to press the button or inadvertently selecting a wrong output mode may cause the machine to malfunction.
Another challenge for the machine operator is a limited front field of view of the work tool. The view of the work tool to be installed or attached may be blocked by some mechanical parts on a machine. For example, the hydraulic cylinders and mechanical linkages in front of an operator cab of a wheel loader may block the machine operator's view of a fork to be installed or attached to the machine.
Yet another challenge faced by the machine operator is the difficulty of aligning an attachment coupler of the work tool with the machine when the work tool is at a distance from the machine. When the work tool to be installed is at a distance from the machine, it is difficult for the machine operator to manually align the attachment coupler of the work tool with the machine accurately.
WO 2014046313 discusses capturing an image of an attachment and recognizing the attachment using a database. However, conventional systems and methods do not address the challenges faced by the machine operator, for example, when the view of the work tool is blocked.
Accordingly, there is a need to resolve these and other problems related to conventional methods and systems used for attaching work tools to machines, in order to reduce training time for new operators and increase work productivity.
In one aspect of this disclosure, a method for installing a work tool for a machine is provided. The method includes detecting, at an electronic controller unit of a machine, a work tool based upon a first input signal from a sensor coupled to the electronic controller unit. The method includes determining, at the electronic controller unit, a first three-dimensional location of the work tool relative to the machine. The method includes detecting, at the electronic controller unit, an occlusion of the work tool. The method includes determining, at the electronic controller unit, a second three-dimensional location of the work tool upon the detecting of the occlusion based on the first three-dimensional location. The method includes controlling, at the electronic controller unit, a motion of the machine for installing the work tool based upon the second three-dimensional location.
In another aspect of this disclosure, a work tool installation system is provided. The work tool installation system includes a machine attachable to a work tool. The machine includes a sensor and an electronic controller unit coupled to the sensor. The electronic controller unit is configured to detect the work tool based upon a first input signal from the sensor, determine a first three-dimensional location of the work tool relative to the machine, detect an occlusion of the work tool, determine a second three-dimensional location of the work tool, when the occlusion is detected, based on the first three-dimensional location, and control a motion of the machine for installing the work tool based upon the second three-dimensional location.
In yet another aspect of this disclosure, an electronic controller unit of a machine is provided. The electronic controller unit includes a memory and a processor. The memory includes computer executable instructions for recognizing and installing a work tool to a machine. The processor is coupled to the memory and configured to execute the computer executable instructions, the computer executable instructions when executed by the processor cause the processor to detect the work tool based upon a first input signal from a sensor, determine a first three-dimensional location of the work tool relative to the machine, detect an occlusion of the work tool, determine a second three-dimensional location of the work tool, when the occlusion is detected, based on the first three-dimensional location, and control a motion of the machine for installing the work tool based upon the second three-dimensional location.
Various aspects of this disclosure are related to addressing the problems in the conventional machines and methods by using computer-vision assisted work tool recognition and installation.
Now referring to the drawings, where like reference numerals refer to like elements,
The machine 102 may be a movable machine or a stationary machine having movable parts. In this respect, the term “movable” may refer to a motion of the machine 102, or a part thereof, along linear Cartesian axes, and/or along angular, cylindrical, or helical coordinates, and/or combinations thereof. Such motion of the machine 102 may be continuous or discrete in time. For example, the machine 102, and/or a part of the machine 102, may undergo a linear motion, an angular motion or both. Such linear and angular motion may include acceleration, rotation about an axis, or both. By way of example only and not by way of limitation, the machine 102 may be an excavator, a paver, a dozer, a skid steer loader (SSL), a multi-terrain loader (MTL), a compact track loader (CTL), a compact wheel loader (CWL), a harvester, a mower, a driller, a hammer-head, a ship, a boat, a locomotive, an automobile, a tractor, or other machine to which the work tool 104 is attachable.
In the example shown in
Under the hood 118, the machine 102 includes an electronic controller unit 126, an inertial measurement unit (IMU) 128, and a machine control system 130. The machine 102 may include other components (e.g., as part of the chassis 114) such as transmission systems, engine(s), motors, power system(s), hydraulic system(s), suspension systems, cooling systems, fuel systems, exhaust systems, ground engaging tools, anchor systems, propelling systems, communication systems including antennas, Global Positioning Systems (GPS), and the like (not shown) that are coupled to the machine control system 130.
By way of example only and not by way of limitation, the machine component 108 may be an excavator arm including hydraulic cylinders and mechanical linkages, although other types of mechanical parts may be utilized to attach the work tool 104 to the machine 102. The mechanical linkages may include attachment components compatible to mate with the attachment coupler 106. The machine component 108 may be extendable, expandable, contractable, rotatable, translatable radially or axially, or otherwise movable by the machine 102 to couple to the work tool 104. For example, a height and a tilt of the machine component 108 may be variable to facilitate attachment at the attachment coupler 106. Once attached to the work tool 104, the machine component 108 may be configured to receive requisite power from the machine 102 to perform various operations (e.g., digging earth) in the exemplary worksite using the work tool 104.
In one aspect of this disclosure, the sensor 110 may be a camera positioned on, inside, or above the operator cab 112. Alternatively, the sensor 110 may be a camera positioned on the machine component 108, e.g., near or at a front end of the machine component 108 closest to the work tool 104, although the sensor 110 may be positioned at other locations on the machine 102. By way of example only and not by way of limitation, the sensor 110 may be a monocular camera, a stereo camera, an infrared camera, an array of one or more types of cameras, an opto-acoustic sensor, a radar, a laser based imaging sensor, or the like, or combinations thereof, configured to assist recognition, detection, tracking, and installation of the work tool 104.
The work tool 104 is attachable to the machine 102, for example, to a linkage at an end portion of the machine component 108 via the attachment coupler 106. By way of example only and not by way of limitation, the work tool 104 may be a bucket for moving earth, a fork for lifting pallets, a harvester attachment, a drill head, a hammer head, a compactor head, or any other type of implement attachable to the machine 102. In this respect, the machine 102 may be configured to be attachable not just to one type of the work tool 104, but also to different types of the work tool 104, as well as to a plurality of work tools at the same time. Depending on the type of the work tool 104, the machine 102 may be configured to operate in an output mode specific to the type of the work tool 104. An output mode of the machine 102 is specified by appropriate electrical and mechanical parameters for operation of the work tool 104 when attached to the machine component 108. For example, an output mode for a bucket differs from an output mode for a fork in terms of the output power delivered to the work tool 104. If an incorrect output mode is selected, or if no output mode is selected by a manual operator when the work tool 104 is attached to the machine component 108, the machine 102 may perform the job for which it was deployed improperly, or not at all. Further, depending on the type of the work tool 104, the attachment coupler 106 may be an attachment pin, a latch, a hook, a ball/socket joint, or another type of attachment component that makes the work tool 104 couplable to the machine component 108 of the machine 102. In one aspect, the work tool 104 may be stationary. In another aspect, the work tool 104 may be mobile or movable towards the machine 102. For example, another machine (not shown) may be used to push the work tool 104 to match a motion of the machine 102 and/or of the machine component 108.
As illustrated in
As the machine 102 moves closer to the work tool 104, as illustrated in
Referring back to
For example, referring to
It will be appreciated that the terms “first”, “second”, “third”, and “fourth” used herein with respect to the initial, intermediate, or final positions of the machine 102 and the machine component 108 relative to the work tool 104 are for differentiating purposes only and not for any particular priority in which such relative positions between the machine 102, the machine component 108, and the work tool 104 are effectuated. Although illustrated as linear distances (as indicated by respective straight lines), the first three-dimensional location 120, the second three-dimensional location 220, the third three-dimensional location 122, and the fourth three-dimensional location 222, as well as other intermediate three-dimensional locations for intermediate relative positions of the machine 102 with respect to the work tool 104, may be vectors expressible in one or more coordinate systems and stored in a memory 508 (shown in
Referring to
In one aspect of this disclosure, the machine control system 130 may include various hydraulic and electrical power systems controlled by the electronic controller unit 126, based upon output signals from the electronic controller unit 126 to the machine control system 130. The machine control system 130 may include or may be coupled to the steering system 124 configured to guide a motion of the machine 102 and/or the machine component 108. In another aspect, the machine control system 130, or a part thereof, may be located remote from the machine 102, e.g., in a base station physically separated from the machine 102. In this scenario, the machine control system 130 may have a direct or indirect communication link with the electronic controller unit 126 to control the machine 102 for installing the work tool 104.
Referring to
In one aspect of this disclosure, the sensor 110 has a field of view 502 within which the work tool 104 and/or the attachment coupler 106 fall. During an occlusion of the work tool 104 and/or the attachment coupler 106, as discussed, the machine component 108 may fall within the field of view 502 of the sensor 110 to partially or fully block a view of the work tool 104 and/or the attachment coupler 106. This may prevent the sensor 110 from obtaining a full image of the work tool 104 and/or the attachment coupler 106. In conventional systems, such an occlusion may slow down or complicate attachment of the work tool 104 to the machine component 108, and may require manual intervention that disrupts the changing and installation process of the work tool 104. To address this issue, the electronic controller unit 126 may continuously receive an input signal 518 from the sensor 110 at an input-output port 504 of the electronic controller unit 126. The input signal 518 may include information regarding a current or an updated three-dimensional location or position of the work tool 104 and/or the attachment coupler 106 relative to the machine 102 and/or the machine component 108. The electronic controller unit 126 may be configured to detect the work tool 104 and determine the occlusion of the work tool 104 and/or the attachment coupler 106 based, at least partially, upon the information in the input signal 518. In another aspect, the electronic controller unit 126 is coupled to the IMU 128 to receive an input signal 520 from the IMU 128. The input signal 520 may include data related to a dead-reckoning of the machine 102.
In one aspect of this disclosure, the electronic controller unit 126 includes the input-output port 504, a processor 506, and the memory 508 coupled to each other, for example, by an internal bus (not shown). The electronic controller unit 126 may include additional components known to one of ordinary skill in the art, which components are not explicitly illustrated in
The input-output port 504 may be a single port or a collection of ports. The input-output port 504 is configured to transmit and receive various inputs and data from other parts of the machine 102 and forward such inputs and data to the processor 506. In one aspect, the input-output port 504 may be two separate ports, one configured to receive various input signals from various parts of the machine 102 (e.g., the sensor 110, the IMU 128, etc.) and another configured to output signals for display (e.g., on the output device 140) or for control of the machine 102 (e.g., to the machine control system 130). Alternatively, the functionalities of inputting and outputting may be integrated into a single port illustrated as the input-output port 504 in
In one aspect, the processor 506 is a hardware device such as an integrated circuit (IC) chip fabricated to implement various features and functionalities of the aspects discussed herein. By way of example only and not by way of limitation, the processor 506 may be fabricated using a Complementary Metal Oxide Semiconductor (CMOS) fabrication technology. In one aspect, the processor 506 may be implemented as an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), a System-on-a-Chip (SOC), or the like. In another aspect, the processor 506 may include components such as packaging, input and output pins, heat sinks, signal conditioning circuitry, input devices, output devices, processor memory components, cooling systems, power systems and the like, which are not shown in
The memory 508 may be implemented as a non-transitory computer readable medium. By way of example only, the memory 508 may be a semiconductor based memory device including but not limited to random access memory (RAM), read only memory (ROM), Dynamic RAM, Programmable ROM, Electrically Erasable programmable ROM (EEPROM), Static RAM, Flash memory, combinations thereof, or other types of memory devices known to one of ordinary skill in the art. In one aspect, the memory 508 is coupled to the processor 506 directly via a communication and signal bus. In one aspect, the memory 508 may be made of or implemented using a non-transitory computer readable storage medium on which the computer executable instructions 510 reside. The computer executable instructions 510 when executed by the processor 506 cause the processor 506 to carry out the features and functionalities of the various aspects of this disclosure, such as those discussed with respect to
The computer executable instructions 510 may be executed by the processor 506 using high-level or low-level compilers and programming languages (e.g., C++). In one aspect, the computer executable instructions 510 may be executed remotely by a base station, and results of such execution provided to the processor 506 for controlling an output of the machine 102 to install the work tool 104 to the machine 102. In this respect, it will be appreciated that the specific location of the computer executable instructions 510 inside the memory 508 is by way of example only, and not by way of limitation.
In one aspect, the memory 508 includes or is coupled to a database 512. The database 512 includes images of a plurality of work tools, including the work tool 104. Such images are saved as a library of image files and computerized models in the database 512. Such models may include or may be used to generate three-dimensional and two-dimensional views of the plurality of work tools attachable to the machine 102, including the work tool 104. Each such image or model in the database 512 includes an image of the respective attachment coupler of the work tool. For example, an image of the work tool 104 in the database 512 includes an image of the attachment coupler 106. Such image files may be in a standard format (e.g., JPEG) known to those of ordinary skill in the art. The database 512 may further include various numerical parameters associated with one or more dimensions of the work tool 104 and the attachment coupler 106, as well as other identification information associated with the work tool 104 and the attachment coupler 106. In one aspect, the processor 506 may be able to generate an image of the work tool 104 and the attachment coupler 106 based upon the numerical parameters of the work tool 104 and the attachment coupler 106 stored in the database 512. Such images and information may be continuously accessible to the processor 506 before, during, and after an occlusion of the work tool 104 and/or the attachment coupler 106 in the field of view 502 of the sensor 110 occurs.
Referring to
Referring to
It will be appreciated that the three-dimensional scene 600 and the three-dimensional scene 700 are two visual examples of the machine 102 in operation as outputted on the output device 140, but the output device 140 may continuously display a plurality of three-dimensional scenes on a frame-by-frame basis as provided by the processor 506 to the output device 140 based upon the input signals (including the input signal 518) from the sensor 110. In one aspect, the three-dimensional scene 600 and the three-dimensional scene 700 may be provided on a display of a remote operator of the machine 102 in a remote base station (not shown) as a real-time video of the work scene in which the machine 102 and the work tool 104 are deployed. Such frame-by-frame representation of the work environment of the machine 102 when used for recognition and subsequent installation of the work tool 104 may be in a simulation format, or may be used as a simulation to train operators of the machine 102 to install the work tool 104.
The present disclosure is applicable to accurately recognizing work tools for installation to machines using computer vision.
Machines have multiple types of work tools (attachments) for different work purposes. For example, a wheel loader may use a bucket for moving earth, and may use a fork for picking up pallets. Changing work tools safely and quickly is one of the basic qualifications of a good machine operator. However, a new operator may require extensive training to master this skill.
A typical challenge encountered by the machine operator when changing the work tool is that additional manual operations may be required to change the machine's output mode settings. Usually, the machine operator has to select a proper machine output mode for a specific work tool by manually pressing a button on a control panel. Forgetting to press the button or inadvertently selecting a wrong output mode may cause the machine to malfunction.
Another challenge for the machine operator is a limited front field of view of the work tool. The view of the work tool to be installed or attached may be blocked by one or more mechanical parts on the machine. For example, the hydraulic cylinders and mechanical linkages in front of an operator cab of a wheel loader may block the machine operator's view of a fork to be installed or attached to the machine.
Yet another challenge faced by the machine operator is the difficulty of aligning an attachment coupler of a work tool at a distance from the machine to which the work tool attaches. When the work tool to be installed is at a distance from a machine (e.g., an excavator), it is difficult for the machine operator to accurately align the work tool with an attachment coupler due to the distance.
Various aspects of this disclosure address the complex problem of recognizing and installing or attaching the work tool 104 when there is an occlusion of the work tool 104 with respect to the field of view 502 of the sensor 110, for example, due to the machine component 108 being present between the sensor 110 and the attachment coupler 106 of the work tool 104.
Referring to
In another aspect, in the method 800, one or more processes or operations, or sub-processes thereof, may be skipped or combined as a single process or operation, and a flow of processes or operations in the method 800 may be in any order not limited by the specific order illustrated in
The method 800 may begin in an operation 802 in which the processor 506 is configured to capture a work scene in which the machine 102 and the work tool 104 are deployed. Such capturing of the work scene may be done as a three-dimensional capture, as indicated, for example, in the three-dimensional scene 600 and the three-dimensional scene 700. The three-dimensional capture of the work scene may be based upon continuous inputs from the sensor 110 and/or the IMU 128. For example, when the sensor 110 is a monocular or a stereo camera, the three-dimensional scene 600 may be captured as an initial work scene. The sensor 110 may continue capturing the work scene on a frame-by-frame basis by capturing the three-dimensional scene 700 and subsequent three-dimensional scenes. In one aspect, the three-dimensional scene 600 and the three-dimensional scene 700 may be captured by the sensor 110 and outputted by the processor 506 on the output device 140 (as illustrated in
In one aspect, the work scene captured by the sensor 110 may not have a desired level of detail. In this scenario, the processor 506 may add various additional features of the work scene based upon prior knowledge of the work scene, using the database 512 to create the three-dimensional scene 600. For example, when the sensor 110 is an infrared camera operating at night, the processor 506 may create a simulation of the work scene to add more details (e.g., surrounding structures, color, etc.) to the captured work scene. Such details may then be presented as part of a simulation of the three-dimensional scene 600 and/or the three-dimensional scene 700 on the output device 140. The operation 802 may be initiated manually (e.g., by an operator) or automatically (e.g., every time the machine 102 is started).
In an operation 804, the processor 506 detects the work tool 104 in the captured work scene. The work scene may include a plurality of objects around the machine 102. The processor 506 may detect the work tool 104 based upon the input signal 518 from the sensor 110. In one aspect, the input signal 518 may be provided every 30 s to the processor 506, although a frequency at which the input signal 518 is generated by the sensor 110 may be programmable and variable. For example, the input signal 518 from the sensor 110 may include information related to a plurality of shapes detected by the sensor 110 that are in the field of view 502 of the sensor 110. Upon receiving the input signal 518, the processor 506 may send a query to the database 512 to obtain one or more dimensions of the work tool 104 stored as a three-dimensional model of the work tool 104 in the database 512, as well as three-dimensional models of other work tools and objects that may correspond to the plurality of shapes detected by the sensor 110. The processor 506 may detect the work tool 104 by extracting visual and/or geometric features corresponding to computer vision feature descriptors. Such feature descriptors may include, but are not limited to, a histogram of oriented gradients, speeded-up robust features, scale-invariant feature transform, and the like, or combinations thereof.
In another aspect, the processor 506 may extract the visual/geometric features of the objects from the input signal 518 and compare these features with the three-dimensional model library in the database 512. If there is a match, the processor 506 determines that the work tool 104 has been detected.
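By way of example only and not by way of limitation, the following Python sketch illustrates one possible form of such feature-based detection and matching using the scale-invariant feature transform of the OpenCV library; the function name detect_work_tool and the model_images library layout are hypothetical and are not part of this disclosure.

    # Minimal sketch, assuming OpenCV >= 4.4 (cv2.SIFT_create) and a
    # hypothetical dictionary mapping tool types to stored model images.
    import cv2

    def detect_work_tool(frame, model_images, min_matches=25):
        """Return the best-matching tool type for a camera frame, or None."""
        sift = cv2.SIFT_create()
        matcher = cv2.BFMatcher(cv2.NORM_L2)
        _, des_frame = sift.detectAndCompute(frame, None)
        if des_frame is None:
            return None  # nothing detectable in the field of view
        best_type, best_count = None, 0
        for tool_type, model in model_images.items():
            _, des_model = sift.detectAndCompute(model, None)
            if des_model is None:
                continue
            # Lowe's ratio test keeps only distinctive feature matches.
            pairs = matcher.knnMatch(des_frame, des_model, k=2)
            good = [p[0] for p in pairs
                    if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
            if len(good) > best_count:
                best_type, best_count = tool_type, len(good)
        return best_type if best_count >= min_matches else None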
In yet another aspect, as part of the detection of the work tool 104 by the processor 506, the work tool 104 may reflect an electromagnetic, acoustic, and/or optical signal emitted by the sensor 110. Such a reflected signal, when captured back by the sensor 110, is provided to the processor 506 as the input signal 518. The input signal 518 may include optical intensity variations indicating a presence of the work tool 104. The processor 506 detects the work tool 104 based, for example, on the optical intensity variations in the input signal 518. In contrast, when the work tool 104 is not present or is not detected, the input signal 518 from the sensor 110 may be missing or may not have any information regarding the presence of the work tool 104. In one aspect, the processor 506 may detect the work tool 104 based upon an input provided to the processor 506 by an operator of the machine 102. The operator may view the work tool 104 directly. Alternatively, the operator may obtain information regarding a presence of the work tool 104 on the output device 140. The processor 506 may apply image processing to the images captured by the sensor 110, e.g., when the sensor 110 is an optical camera.
In an operation 806, the processor 506 carries out classifying the work tool 104 detected in the operation 804. The processor 506 obtains an image of the work tool 104 based upon the input signal 518 from the sensor 110. The processor 506 may then extract data associated with the physical features of the work tool 104. Such data may include, for example, dimensions and shape of the work tool 104, determined, for example, in a manner similar to the extraction of features of the work tool 104 for the detection performed in the operation 804. The processor 506 may communicate with the database 512 to obtain a match of the work tool 104 with a known work tool whose information is stored in the database 512. For example, the database 512 may store a library of computerized models of a plurality of types of work tools that may be attached to the machine 102, and one of such computerized models may match the work tool 104, based upon the information in the input signal 518.
In one aspect, a search algorithm may be executed by the processor 506 to search for data matching the data associated with the work tool 104, as detected. For example, once a height, length, and/or depth of the work tool 104 have been determined by the processor 506 based upon extracting such information from the input signal 518 of the sensor 110, the processor 506 may send a query to the database 512 to determine whether the database 512 has matching work tools satisfying the criteria corresponding to the dimensions of the work tool 104 in the query. The database 512 may communicate back one or more matches of the work tool 104. When there is more than one match, the processor 506 may apply additional criteria to identify and accurately recognize the work tool 104. For example, the database 512 may indicate to the processor 506 that there are two work tools matching the dimensions of the work tool 104.
However, the processor 506 may determine that only one of the two matching work tools is appropriate for the machine 102 and/or the work environment in which the machine 102 is deployed. Accordingly, the processor 506 classifies the work tool 104 by selecting information regarding the correctly matched known work tool provided by the database 512, and rejects the other matching results from the database 512. In this respect, the work tool 104 is identified or recognized using computer vision. The term “computer vision” may relate to processing information related to the work tool 104 to determine a type, dimensions, and a positioning of the work tool 104 relative to the machine 102, and further, applying the processed information to assist installation of the work tool using generated images of the work tool 104 (e.g., the work tool image 604 in
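By way of example only and not by way of limitation, the classifying described above might be sketched as follows; the record fields (height, length, depth, compatible_machines) and the five percent tolerance are illustrative assumptions, not features of the database 512.

    # Minimal sketch, assuming dimensions in meters and a hypothetical
    # record schema for entries retrieved from a work tool database.
    def classify_work_tool(measured, records, machine_model=None, tol=0.05):
        """Return the single record matching the measured dimensions, or None."""
        def close(a, b):
            return abs(a - b) <= tol * b  # within 5% of the stored value
        matches = [r for r in records
                   if all(close(measured[k], r[k])
                          for k in ('height', 'length', 'depth'))]
        if len(matches) > 1 and machine_model is not None:
            # Additional criterion: keep only tools appropriate for this machine.
            matches = [r for r in matches
                       if machine_model in r['compatible_machines']]
        return matches[0] if len(matches) == 1 else None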
In an operation 808, the processor 506 may select an output mode of the machine 102 based upon the classifying carried out in the operation 806. The output mode for the machine 102 may be set automatically based upon the classifying of the work tool 104. Alternatively, the processor 506 may output an indication to the output device 140 based upon which an operator can select the output mode of the machine 102. In one aspect, when the processor 506 automatically sets the output mode of the machine 102 based upon the classifying, manual errors due to inadvertent erroneous selection by the operator of the machine 102 may be avoided. The output mode selection determines an amount of power to be transmitted to the work tool 104, for example, by the machine component 108 to which the work tool 104 is attached. Such an output power is kept within a range that avoids improper functioning of, or damage to, the work tool 104 and/or the work surface upon which the work tool 104 operates. For example, based upon the output mode, the processor 506 may output signals to the machine control system 130 to output appropriate electrical current to one or more motors or hydraulic actuators of the machine 102, which motors or hydraulic actuators may then control a motion of the machine 102 and/or the machine component 108.
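By way of example only and not by way of limitation, a minimal sketch of such automatic output mode selection follows; the mode table, its values, and the apply_output_mode interface are hypothetical.

    # Minimal sketch: map the classified tool type to a predefined output
    # mode instead of relying on a manual button press (values hypothetical).
    OUTPUT_MODES = {
        'bucket': {'max_hydraulic_kw': 90.0, 'max_tilt_deg': 45.0},
        'fork':   {'max_hydraulic_kw': 40.0, 'max_tilt_deg': 12.0},
    }

    def select_output_mode(tool_type, control_system):
        mode = OUTPUT_MODES.get(tool_type)
        if mode is None:
            raise ValueError(f'no output mode defined for {tool_type!r}')
        # Keeps the delivered power within the range appropriate for the tool.
        control_system.apply_output_mode(mode)  # hypothetical interface
        return mode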
In an operation 810, once the work tool 104 has been detected by the processor 506, the processor 506 may determine the first three-dimensional location 120 of the work tool 104. The first three-dimensional location 120 may be an initial location or position of the work tool 104 relative to the machine 102, for example, when the work tool installation system 100 is initiated. Alternatively, the first three-dimensional location 120 may refer to any initial location or position of the work tool 104 with respect to a calculation cycle or time slot of the processor 506 for which the method 800 is being implemented or carried out. For example, the processor 506 may need to update a previous determination of a calculated three-dimensional position of the work tool 104. To do so, the processor 506 may restart calculating and may use the first three-dimensional location 120 as a starting point or starting value to determine subsequent three-dimensional locations of the work tool 104 (e.g., the second three-dimensional location 220).
In one aspect, the processor 506 determines the first three-dimensional location 120 based upon a knowledge of a position of the sensor 110 and determining a propagation time for a signal emitted by the sensor 110 towards the work tool 104 to be reflected back and received at the sensor 110. Upon reception of the reflected signal, the sensor 110 provides the input signal 518 to the processor 506. For example, the processor 506 may utilize such a determination of the first three-dimensional location 120 when the sensor 110 is an active perception sensor, such as a Lidar, a time-of-flight three-dimensional camera, radar, and the like, or combinations thereof.
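By way of example only and not by way of limitation, for an active sensor the propagation-time relation described above reduces to the following sketch.

    # Minimal sketch: one-way range from a round-trip time-of-flight
    # measurement (active sensors such as a Lidar or time-of-flight camera).
    SPEED_OF_LIGHT = 299_792_458.0  # m/s

    def range_from_time_of_flight(round_trip_s):
        # The emitted pulse travels to the work tool and back, so the
        # one-way range is half the round-trip distance.
        return SPEED_OF_LIGHT * round_trip_s / 2.0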
However, when the sensor 110 is a passive perception sensor, such as a monocular color camera, a monocular infrared camera, or a binocular stereo camera, the processor 506 may apply computer vision algorithms to determine the first three-dimensional location 120. For monocular cameras used as the sensor 110, the processor 506 may use computer vision technologies such as “structure-from-motion” to generate, for example, the three-dimensional scene 600 in the field of view 502 of the sensor 110. For binocular stereo cameras used as the sensor 110, the processor 506 may use stereo triangulation to generate, for example, the three-dimensional scene 600 in the field of view 502 of the sensor 110. Once the three-dimensional scene 600 has been reconstructed by the processor 506, the first three-dimensional location 120 of the work tool 104 is calculated with respect to the sensor 110. Since the location of the sensor 110 with respect to the machine 102 is known (e.g., (x, y, z) coordinates, roll, pitch, and yaw), the first three-dimensional location 120 of the work tool 104 with respect to the machine 102 may be obtained. It will be appreciated by one of ordinary skill reading this disclosure that the second three-dimensional location 220, the third three-dimensional location 122, and the fourth three-dimensional location 222, or any other three-dimensional locations in the work tool installation system 100, may be determined by the processor 506 in a manner similar to the determination of the first three-dimensional location 120 by the processor 506.
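By way of example only and not by way of limitation, the stereo triangulation mentioned above may be sketched for a rectified binocular pair; the focal length and baseline are assumed known from calibration.

    # Minimal sketch: depth of a matched point from stereo disparity,
    # using the standard rectified-pair relation Z = f * B / d.
    def depth_from_disparity(focal_px, baseline_m, disparity_px):
        if disparity_px <= 0:
            raise ValueError('point cannot be triangulated')
        return focal_px * baseline_m / disparity_px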
In one aspect, the processor 506 may use a linear relationship to determine the first three-dimensional location 120 of the work tool 104. By way of example only and not by way of limitation, the processor 506 may know a three-dimensional position of the sensor 110 on the machine 102 in Cartesian coordinates (e.g., X, Y, Z values) and may obtain this information from the memory 508. Further, the processor 506 may obtain physical dimensions of the machine 102 and the machine component 108 and determine the relative position of the work tool 104 with respect to the machine 102 and the machine component 108.
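By way of example only and not by way of limitation, converting a tool location measured in the sensor frame into the machine frame, given the known mounting pose of the sensor 110, may be sketched as follows.

    # Minimal sketch, assuming NumPy: rigid transform from the sensor
    # frame to the machine frame using the sensor's (x, y, z) offset and
    # its roll, pitch, yaw mounting angles (yaw-pitch-roll convention).
    import numpy as np

    def rotation_matrix(roll, pitch, yaw):
        cr, sr = np.cos(roll), np.sin(roll)
        cp, sp = np.cos(pitch), np.sin(pitch)
        cy, sy = np.cos(yaw), np.sin(yaw)
        Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
        Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
        Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
        return Rz @ Ry @ Rx

    def sensor_to_machine(p_sensor, sensor_offset, roll, pitch, yaw):
        # p_sensor: tool location in the sensor frame (3-vector).
        R = rotation_matrix(roll, pitch, yaw)
        return R @ np.asarray(p_sensor) + np.asarray(sensor_offset)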
In an operation 812, the processor 506 may determine the third three-dimensional location 122 of the attachment coupler 106 of the work tool 104. In one aspect, the third three-dimensional location 122 is determined based upon the classifying of the work tool 104 carried out in the operation 806. Upon classifying, the processor 506 may know where the attachment coupler 106 is located on the work tool 104 based upon the data for the work tool 104 retrieved from the database 512. In this respect, the processor 506 does not have to rely upon explicit input signals from the sensor 110 regarding the third three-dimensional location 122 of the attachment coupler 106. Rather, once the work tool 104 has been recognized, based upon the classifying, the processor 506 uses the information regarding the work tool 104 from the database 512 to determine the third three-dimensional location 122 of the attachment coupler 106.
In one aspect, such a determination of the relative position of the attachment coupler 106 is part of the computer vision based determination utilized by the processor 506. Further, such computer vision assisted determination of the relative position of the attachment coupler 106 may include determining a precise position of the attachment coupler 106 on the work tool 104 using the information obtained from the database 512 regarding the work tool 104. For example, when the work tool 104 is a bucket, an attachment pin of the bucket and a location of the attachment pin may be determined by the processor 506 upon detection and classification of the work tool 104 (e.g., as carried out in the operations 804 and 806). Accordingly, the processor 506 may form a complete three-dimensional positional view of the work tool 104 independent of an actual physical view of the work tool 104. The three-dimensional view of the work tool 104 may then be provided to the output device 140 as an output by the processor 506. Such an output may be represented as part of the three-dimensional scene 600 and the three-dimensional scene 700 illustrated in
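By way of example only and not by way of limitation, once the tool is classified, the coupler position may be derived from the stored model rather than observed directly; the coupler_offset field is a hypothetical database attribute used for illustration.

    # Minimal sketch, assuming NumPy: locate the attachment coupler from
    # the classified tool's pose and the coupler offset stored in the
    # tool's database record (no direct line of sight required).
    import numpy as np

    def coupler_location(tool_position, tool_rotation, record):
        # tool_rotation: 3x3 rotation matrix giving the tool's orientation.
        offset = np.asarray(record['coupler_offset'])  # hypothetical field
        return np.asarray(tool_position) + tool_rotation @ offset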
In an operation 814, the processor 506 may continuously track the work tool 104 and the attachment coupler 106 of the work tool 104 using the input signal 518 from the sensor 110. The tracking of the work tool 104 and/or the attachment coupler 106 may be based upon determining a relative distance between the machine 102 and the attachment coupler 106 starting, for example, with the third three-dimensional location 122 of the attachment coupler 106. As the machine 102 moves toward the work tool 104, the processor 506 applies computer vision, based on image information about the work tool 104 and the attachment coupler 106 from the database 512, to accurately track the work tool 104 and/or the attachment coupler 106 and to adjust and control the motion of the machine 102.
In addition to the input signal 518, the processor 506 is configured for the tracking based on the classification of the work tool 104 carried out in the operation 806. Since the processor 506 knows the exact type of the work tool 104, the processor 506 can determine the type and location of the attachment coupler 106 on the work tool 104. Then, based upon a velocity of the machine 102, and hence the sensor 110, the processor 506 can continuously track where the attachment coupler 106 on the work tool 104 is located. The tracking of the work tool 104 and/or the attachment coupler 106 may be viewable on the output device 140. For example, the work tool 104 and/or the attachment coupler 106 as tracked may be presented on the three-dimensional scene 600 for an operator of the machine 102. Such tracking of the work tool 104 and the attachment coupler 106 may be referred to as “visual odometry” in which a velocity of the machine 102 is used in conjunction with the images of the work tool 104 retrieved from the database 512 by the processor 506 to determine an accurate three-dimensional localization of the attachment coupler 106. Such three-dimensional localization of the attachment coupler 106 may be indicated on the output device 140, for example, as geographical coordinates and directions.
In one aspect, such tracking may be carried out on a frame-by-frame basis based on input signals (including the input signal 518) from the sensor 110 received at the processor 506. For example, the three-dimensional scene 600 may be considered as a first frame on the output device 140 in which the work tool 104 and/or the attachment coupler 106 are presented and tracked. Likewise, subsequent three-dimensional scenes (e.g., the three-dimensional scene 700) may be presented as a time based progression of the captured work scene. In each such frame, the work tool 104 and/or the attachment coupler 106 are tracked as indicated by the relative positions of the work tool 104 and/or the attachment coupler 106 with respect to the machine 102. It will be appreciated that the tracking of the work tool 104 and/or the attachment coupler 106 on a frame-by-frame basis may be for visual presentation of the simulated locations of the work tool 104 and/or the attachment coupler 106 (directly correlated with the actual physical locations of the work tool 104 and/or the attachment coupler 106). However, for purposes of calculating the relative positions of the machine 102 and the work tool 104 and/or the attachment coupler 106 (e.g., using the first three-dimensional location 120, etc.), such a visual presentation is an example only, and not a limitation. For example, the processor 506 may determine the relative positions based on numerical or tabular values that are updated as the work tool 104 and/or the attachment coupler 106 are tracked.
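By way of example only and not by way of limitation, the frame-by-frame tracking described above may be sketched as a predict-and-correct loop; the class and its interface are hypothetical.

    # Minimal sketch, assuming NumPy: between detections, the tool's
    # position relative to the machine is advanced using the machine's
    # own velocity ("visual odometry" style); a fresh sensor measurement,
    # when available, overrides the prediction.
    import numpy as np

    class ToolTracker:
        def __init__(self, initial_position):
            self.position = np.asarray(initial_position, dtype=float)

        def update(self, machine_velocity, dt, measured_position=None):
            # Predict: the relative position shifts opposite to machine motion.
            self.position = self.position - np.asarray(machine_velocity) * dt
            if measured_position is not None:
                # Correct with the latest sensor-derived position.
                self.position = np.asarray(measured_position, dtype=float)
            return self.position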
In an operation 816, the processor 506 may detect an occlusion of the work tool 104. Such occlusion may be partial (e.g., as shown in
In one aspect, the occlusion may be detected by the processor 506 based upon the continuous tracking performed in the operation 814. For example, the processor 506 may provide the three-dimensional scene 600 as a first frame on the output device 140. In the first frame on the three-dimensional scene 600, the work tool 104 and the attachment coupler 106 are clearly visible correspondingly as the work tool image 604 and the attachment coupler image 606. The processor 506 may continue to provide a plurality of such frames as the machine 102 moves towards the work tool 104 to install the work tool 104.
However, in one frame of such a plurality of frames, represented, for example, by the three-dimensional scene 700, the attachment coupler image 606 may not be available and may be partially or fully blocked. In this second frame, the processor 506 detects that the work tool 104 has been occluded since the attachment coupler image 606 is detected to be partially or fully missing. The processor 506 carries out the occlusion detection using the input signal 518 that informs the processor 506 about the objects in the work scene in which the machine 102 is deployed. As indicated by information in the input signal 518 to the processor 506, various machine components, including the machine component 108, are detected in three-dimensional space around the machine 102. Therefore, the processor 506 knows if one or more of the machine components (e.g., the machine component 108) come into the line of sight between the sensor 110 and the work tool 104, which causes full or partial occlusion. The processor 506 keeps tracking a previously detected object (e.g., the work tool 104) on a frame-by-frame basis. Meanwhile, the processor 506 keeps performing object detection (e.g., in the operation 804) and classification (e.g., in the operation 806) on each frame. If the processor 506 finds that the work tool 104, as tracked, shows different geometric features between two adjacent frames (e.g., between the three-dimensional scene 600 and the three-dimensional scene 700), the processor 506 determines that an occlusion of the work tool 104 has happened.
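By way of example only and not by way of limitation, the adjacent-frame comparison described above may be sketched as follows; the feature names and the thirty percent threshold are illustrative assumptions.

    # Minimal sketch: flag an occlusion when the tracked tool's geometric
    # features (e.g., visible area, aspect ratio) change abruptly between
    # two adjacent frames, or when the tool vanishes from the frame entirely.
    def occlusion_detected(prev_features, curr_features, max_change=0.3):
        if curr_features is None:
            return True  # tool fully absent from this frame
        for key, prev_value in prev_features.items():
            change = abs(curr_features[key] - prev_value) / max(prev_value, 1e-9)
            if change > max_change:
                return True  # e.g., visible area collapsed between frames
        return False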
In another aspect, the processor 506 may determine, based upon the visual odometry performed in the operation 814, that the machine component 108 is moving with a known velocity towards the work tool 104 and/or the attachment coupler 106. Based upon the first three-dimensional location 120, the second three-dimensional location 220, and/or the third three-dimensional location 122 determined in the operations 810 and 812, as well as the type of the work tool 104 classified in the operation 806, the processor 506 may calculate an exact location of the work tool 104 and the attachment coupler 106. Accordingly, even during the occlusion detected by the processor 506, the method 800 for installing the work tool 104 to the machine 102 does not stop and the machine 102 continues moving towards the work tool 104 and the attachment coupler 106 to install the work tool 104 to the machine 102. In yet another aspect, the processor 506 may seek confirmation regarding the occlusion from the operator of the machine 102, or may receive an input from a human operator of the machine 102 that an occlusion has been visually detected, in addition to or independently of the detection of the occlusion determined by the processor 506 itself in the operation 816.
In an operation 818, the processor 506 may update the three-dimensional scene 600 based on the occlusion detected in the operation 816. The three-dimensional scene 600, as updated to reflect the detected occlusion of the work tool 104, may be provided as the three-dimensional scene 700 outputted on the output device 140. The processor 506 may further carry out the updating of the three-dimensional scene 600 based upon information initially captured by the sensor 110 in the operation 802. For example, the information captured by the sensor 110 may include static structures (trees, buildings, etc.) in or around the work scene of the machine 102. The processor 506 may use such static structures, in addition to the dynamically changing positions of the various objects in the field of view 502, to update the three-dimensional scene 600.
In one aspect, the processor 506 may update the three-dimensional scene 600 using the tracking carried out in the operation 814. For example, the three-dimensional scene 700 may show an updated or current relative position of the work tool 104 and/or the attachment coupler 106 with respect to the machine 102 and/or the machine component 108 based upon the tracking by the processor 506. In another example, even after the occlusion has been detected, the processor 506 may continue updating the three-dimensional scene 600 using computer vision since the processor 506 has knowledge of the physical features of the work tool 104 and a last known location of the work tool 104 and/or the attachment coupler 106, as well as the velocity of the machine 102, among other parameters (e.g., additional inputs from the sensor 110, the IMU 128, GPS location of the machine 102, etc.).
In an operation 820, the processor 506 may determine a dead-reckoning or deduced reckoning of the machine 102. The processor 506 may receive a second input signal (e.g., the input signal 520) from the IMU 128 coupled to the electronic controller unit 126. The second input signal is used by the processor 506 to determine a motion of the machine 102 in three-dimensional space. As is known, the IMU 128 may receive inputs from various inertial sensors (not shown) attached to the machine 102 to generate the second input signal to the processor 506. In one aspect, the processor 506 may further utilize a satellite signal or a signal from an unmanned aerial vehicle to corroborate the information in the second input signal regarding the dead-reckoning of the machine 102. In another aspect, the IMU 128 may be optional for purposes of this disclosure. However, using the second input signal from the IMU 128, in addition to the visual odometry and computer vision based tracking performed by the processor 506, may result in a more robust estimate of the first three-dimensional location 120 and the second three-dimensional location 220 of the work tool 104, and of the third three-dimensional location 122 and the fourth three-dimensional location 222 of the attachment coupler 106, with respect to the sensor 110 on the machine 102.
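By way of example only and not by way of limitation, a planar dead-reckoning update from IMU data may be sketched as follows; a full implementation would integrate in three dimensions and correct for sensor bias.

    # Minimal sketch, assuming NumPy and a planar (2-D) motion model:
    # integrate body-frame acceleration and yaw rate into an updated pose.
    import numpy as np

    def dead_reckon(pose, accel_body, yaw_rate, dt):
        # pose: dict with 'position' (2-vector, m), 'velocity' (2-vector,
        # m/s), and 'heading' (rad); accel_body in m/s^2.
        heading = pose['heading'] + yaw_rate * dt
        c, s = np.cos(heading), np.sin(heading)
        R = np.array([[c, -s], [s, c]])  # body frame -> world frame
        accel_world = R @ np.asarray(accel_body)
        velocity = pose['velocity'] + accel_world * dt
        position = pose['position'] + velocity * dt
        return {'position': position, 'velocity': velocity, 'heading': heading}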
In an operation 822, the processor 506 may perform a fusion of the first input signal (e.g., the input signal 518) from the sensor 110 and the second input signal (e.g., the input signal 520) from the IMU 128 to generate a combined signal that provides a current or most recent relative positioning between the machine 102 and the work tool 104. The combined signal may be generated by the processor 506 by normalizing the first input signal and the second input signal to a common format, and then verifying whether the information regarding the motion of the machine 102 as deduced from the visual odometry and from the dead-reckoning is the same or within a similar range of values.
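By way of example only and not by way of limitation, a simple complementary fusion of the two normalized estimates may be sketched as follows; the blending weight and disagreement tolerance are illustrative, and a Kalman filter would be a natural refinement.

    # Minimal sketch, assuming NumPy: blend the vision-derived and the
    # dead-reckoned position estimates; fall back to dead-reckoning when
    # the two disagree beyond tolerance (e.g., during an occlusion).
    import numpy as np

    def fuse_estimates(visual_pos, imu_pos, visual_weight=0.7, tolerance=0.5):
        visual_pos = np.asarray(visual_pos, dtype=float)
        imu_pos = np.asarray(imu_pos, dtype=float)
        if np.linalg.norm(visual_pos - imu_pos) > tolerance:
            return imu_pos  # vision suspect; trust dead-reckoning
        return visual_weight * visual_pos + (1 - visual_weight) * imu_pos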
In an operation 824, based upon the fusion of the first and the second input signals from the sensor 110 and the IMU 128, respectively, the processor 506 determines an updated position of the work tool 104 relative to the machine 102. For example, the updated position of the work tool 104 may be obtained by calculating the second three-dimensional location 220, after the occlusion has been detected in the operation 816.
Likewise, in an operation 826, the processor 506 determines the fourth three-dimensional location 222 of the attachment coupler 106 based upon the combined signal obtained in the operation 822. For example, once an updated position of the work tool 104 is determined by the processor 506, based upon one or more of the classification of the work tool 104, the dead-reckoning of the machine 102, and the third three-dimensional location 122 of the attachment coupler 106, the processor 506 may determine the fourth three-dimensional location 222 even after the occlusion occurs.
In an operation 828, the processor 506 may output an updated position of the work tool 104 and the attachment coupler 106, based upon the operations 824 and 826, to the machine control system 130. For example, the processor 506 may output the second three-dimensional location 220 of the work tool 104 and the fourth three-dimensional location 222 of the attachment coupler 106 as part of the updated position of the work tool 104 and the attachment coupler 106. In one aspect, the processor 506 may output the updated position of the work tool 104 and/or the attachment coupler 106 in the three-dimensional scene 700 to the operator of the machine 102. Such outputting of the updated positions of the work tool 104 and/or the attachment coupler 106 may be provided on a continuous time basis as the machine 102 moves and the occlusion of the work tool 104 and/or the attachment coupler 106 changes, or even vanishes after a certain time has passed, without stopping the machine 102 and without disrupting the process of installing the work tool 104 to the machine 102.
In an operation 830, the processor 506 may control the machine 102 and/or the machine component 108, based upon the updated positions of the work tool 104 and the attachment coupler 106, using the machine control system 130. Based upon the updated positions, the machine control system 130 may provide a response signal to the processor 506 to control a motion of the machine 102 and/or the machine component 108. For example, the updated position may require the machine 102 to move in a particular spatial direction at a speed different from a current speed of the machine 102 upon detecting the occlusion in the operation 816. Similarly, the updated position of the work tool 104 and the attachment coupler 106 may require the machine component 108 to move in a direction and at a speed different from the current direction and speed in which the machine component 108 was moving. For example, the machine component 108 might have been moving at a particular angular acceleration initially, prior to the occlusion. The machine control system 130 may receive an output signal from the processor 506 to reduce the angular acceleration with which the machine component 108 may be rotating upon detection of the occlusion. As a result, the machine component 108 may be controlled to move in a more accurate manner even though the attachment coupler 106 is partially or fully occluded in the field of view 502 of the sensor 110. It will be appreciated that the processor 506 may control the motion of the machine 102 and/or the machine component 108 in other ways, for example, along linear and/or circular directions, depending upon the type of the work tool 104 and/or the attachment coupler 106.
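By way of example only and not by way of limitation, the motion adjustment described above may be sketched as a proportional controller that decelerates the machine component 108 as it nears the coupler; the gain and speed limit are illustrative.

    # Minimal sketch, assuming NumPy: velocity command proportional to the
    # remaining distance to the (possibly occluded) coupler position,
    # capped at a maximum approach speed.
    import numpy as np

    def approach_command(current_pos, target_pos, gain=0.8, max_speed=0.5):
        error = (np.asarray(target_pos, dtype=float) -
                 np.asarray(current_pos, dtype=float))
        distance = np.linalg.norm(error)
        if distance < 1e-6:
            return np.zeros_like(error)
        speed = min(gain * distance, max_speed)  # slows near the target
        return speed * error / distance  # commanded velocity vector (m/s)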
In an operation 832, the processor 506 determines that the machine component 108 and the attachment coupler 106 are at an appropriate distance and relative orientation for the attachment coupler 106 to couple to or attach to the machine component 108. The processor 506 then outputs signals to the machine control system 130 to attach the work tool 104 to the machine component 108 of the machine 102 at the attachment coupler 106. Based upon the classifying of the work tool 104 and the attachment coupler 106 (in the operation 806), and the updated positions of the work tool 104 and the attachment coupler 106 (in the operations 824 and 826, respectively), the processor 506 knows the dimensions and the location of the attachment coupler 106 and installs the work tool 104 at the attachment coupler 106, even when the attachment coupler 106 is occluded in the field of view 502 of the sensor 110. Such installation may be performed by the processor 506 outputting signals to the machine control system 130 to open mechanical linkages on the machine component 108 to couple to the attachment coupler 106.
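By way of example only and not by way of limitation, the readiness check of the operation 832 may be sketched as follows; the distance and orientation tolerances are illustrative, and headings are assumed to be expressed in a common wrapped range.

    # Minimal sketch, assuming NumPy: attach only when the component and
    # the coupler are within both a distance and an orientation tolerance.
    import numpy as np

    def ready_to_attach(component_pos, coupler_pos,
                        component_yaw, coupler_yaw,
                        max_dist_m=0.02, max_yaw_rad=0.02):
        close = np.linalg.norm(np.asarray(component_pos, dtype=float) -
                               np.asarray(coupler_pos, dtype=float)) <= max_dist_m
        aligned = abs(component_yaw - coupler_yaw) <= max_yaw_rad
        return close and aligned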
In an operation 834, once the work tool 104 has been installed to the machine component 108, the processor 506 may output an indication of the installation of the work tool 104 to the operator of the machine 102. Such an indication may be on the output device 140, for example, as a visual, an audio, or an audio-visual indication inside the operator cab 112 or to the remote base station, or both.
In an operation 836, upon receiving the indication from the processor 506 that the work tool 104 has been installed to the machine 102, the operator may operate the machine 102, as appropriate for the work site in which the machine 102 is deployed. Such operating of the machine 102 may further include outputting power from the machine 102 to the work tool 104 based upon the output mode selected in the operation 808 specific to the type of the work tool 104 as determined in the operation 806.
Various aspects of this disclosure aid the operator of the machine 102 in the process of changing and attaching the work tool 104 when the work tool 104 and/or the attachment coupler 106 may temporarily or otherwise be out of view (or, occluded) from the field of view 502 of the sensor 110. For example, when the sensor 110 is a camera, the work tool 104 may be out of the camera's view due to the motion of the machine 102 and/or the machine component 108. This may occur, for example, when the camera is mounted on a top of the operator cab 112 and the work tool 104 gets close to the machine 102 at an end of a work tool changing and installation process. In one example, the hydraulic cylinders in the machine component 108 in front of the operator cab 112 may partially block the field of view 502, because of which an operator of the machine 102 may not be able to determine the first three-dimensional location 120 or the second three-dimensional location 220 of the work tool 104.
To solve this complex problem, the processor 506 uses motion tracking to continuously estimate the location of the machine 102 in three-dimensional space. Upon detection of an occlusion of the work tool 104, the processor 506 can still calculate an updated position of the work tool 104 using a last seen location of the work tool 104 and the estimated motion of the machine 102 (received, for example, from the IMU 128). Similar to a determination of the relative position of the work tool 104, the processor 506 determines the relative position of the attachment coupler 106 (e.g., an attachment pin) with respect to the machine 102 and/or the machine component 108 even after the occlusion happens. As a result, various aspects of this disclosure ensure that the occlusion does not disrupt the process of installing the work tool 104 to the machine 102 at the attachment coupler 106.
It will be appreciated that the foregoing description provides examples of the disclosed system and technique. However, it is contemplated that other implementations of the disclosure may differ in detail from the foregoing examples. All references to the disclosure or examples thereof are intended to reference the particular example being discussed at that point and are not intended to imply any limitation as to the scope of the disclosure more generally. All language of distinction and disparagement with respect to certain features is intended to indicate a lack of preference for those features, but not to exclude such from the scope of the disclosure entirely unless otherwise indicated.
Recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context.