The present disclosure relates generally to determining machine state, and, more particularly, to methods and systems for determining orientation and/or position of a machine by fusing data generated by a plurality of on-machine sensors.
Facilitating some earth working activities (e.g., mining, construction, dredging, excavating, or the like), including remotely-controlled and/or autonomous activities, may require information about machine position and/or orientation. For example, some computer-aided excavation operations require that a machine be localized, e.g., in an environment, and/or that an orientation of the machine be determined, prior to and/or during excavation. Conventionally, such localization and/or orientation determination may be based on global positioning sensors, including GNSS (global navigation satellite system) sensors. For instance, some conventional applications use two GNSS sensors coupled to a portion of a body at known, spaced locations on that portion. The portion to which the sensors are coupled moves relative to one or more other portions of the body. Global positions from the two GNSS sensors (and their relationship to the body of the machine) will provide a position and orientation, or heading, of the machine. However, outfitting machines, especially fleets of machines, with two GNSS sensors is expensive. Accordingly, it may be desirable to provide a system that uses a single GNSS sensor, and that leverages data from other sensors, to determine an accurate position and/or orientation of a vehicle.
Systems have been designed with a view toward attempting to determine relative positions of machine implements and/or components. For example, Patent Publication GB2571004A to Neyer et al. (“the '004 Publication”) describes systems and methods for controlling a mobile machine using various sensor modalities. For instance, the '004 Publication describes outfitting a mobile working machine with displacement sensors, angle sensors, inertial sensors, rotation rate sensors, and acceleration sensors, as well as one or more cameras, to determine a position of a manipulator arm.
While the system described in the '004 Publication may include sensors for determining aspects of a work machine, the '004 Publication does not describe many of the techniques detailed herein for determining a machine state, such as an orientation of the machine.
The present disclosure is directed to one or more improvements in the existing technology.
An example machine includes a lower frame configured to move along a surface; an upper frame rotatable relative to the lower frame; a first sensor configured to measure an angular orientation of the upper frame relative to the lower frame; a global navigation satellite system (GNSS) sensor coupled to the upper frame and configured to sense a global position; one or more processors; and memory storing executable instructions. When executed by the one or more processors, the instructions cause the machine to perform actions including: receiving, from the first sensor, a first input indicating a first angular orientation of the upper frame relative to the lower frame at a first time; receiving, from the first sensor, a second input indicating a second angular orientation of the upper frame relative to the lower frame at a second time; receiving, from the GNSS sensor, a first global position associated with the first time and a second global position associated with the second time; and determining, based at least in part on the first input, the second input, the first global position, and the second global position, an orientation of the machine.
An example system includes: a machine having a lower frame and an upper frame movable relative to the lower frame; a three-dimensional position sensor disposed on the upper frame at a position spaced from an axis of rotation of the upper frame relative to the lower frame; an additional sensor coupled to the machine; one or more processors; and memory storing executable instructions. When executed by the one or more processors, the instructions cause the machine to perform actions comprising: receiving, from the additional sensor, first sensor data associated with a first machine position and second sensor data associated with a second machine position different from the first machine position; determining, based at least in part on the first sensor data and the second sensor data, an estimated motion of the machine between the first machine position and the second machine position; receiving, from the three-dimensional position sensor, a plurality of global positions associated with the machine; and determining, based at least in part on the plurality of global positions associated with the machine and the estimated motion of the machine, an orientation of the machine in the second machine position.
An example method of determining an orientation of a machine, includes: receiving, from a rotation sensor, sensor data indicating a rotation of an upper frame of the machine relative to a lower frame of the machine; determining, based at least in part on the sensor data and a location of a three-dimensional position sensor, an estimated arc of the three-dimensional position sensor; receiving, from the three-dimensional position sensor, a plurality of global positions; and determining, based at least in part on the plurality of global positions and the estimated arc, an orientation of the machine.
This disclosure generally relates to methods, systems, and techniques for determining machine state. While specific parts described herein may be parts on machines, e.g., ground-engaging machines, earth-moving machines, or the like, the techniques described herein may be applicable to any number of other machines. Where possible, the same reference numerals are used through the drawings to refer to the same or like features.
As illustrated in
The upper frame 102 is coupled to the lower frame 104 and is rotatable relative to the lower frame 104, as noted above. As shown in
As noted above, the machine 100 can be embodied as an excavator, and may include a boom 112, a stick 114, and a bucket 116. As in conventional machines, the boom 112 is coupled to and configured to move relative to the upper frame 102, the stick 114 is coupled to and configured to move relative to the boom 112, and the bucket 116 is coupled to and configured to move relative to the stick 114. In the illustrated example, a pair of first hydraulic actuators 118 are provided to move the boom 112 relative to the upper frame 102. A second hydraulic actuator 120 is provided to move the stick 114 relative to the boom 112, and a third hydraulic actuator 122 is provided to move the bucket 116 relative to the stick 114. For example, and without limitation, the hydraulic actuators 118, 120, 122 may be controlled to facilitate completion of tasks, e.g., digging tasks, with the machine 100. In other examples, the machine 100 may include other and/or additional implements. For instance, the bucket 116 may be replaced with a different implement coupled to the stick 114. In still other instances, the machine 100 can include implements other than the boom 112 and the stick 114. Moreover, other and/or additional actuators may be used to control the implements, e.g., instead of the hydraulic actuators 118, 120, 122.
The machine 100 also includes a number of sensors, which are shown schematically in
The machine 100 also includes a position sensor 126. The position sensor 126 is configured to determine a position, e.g., a two- or three-dimensional global or local position, of the position sensor 126. In some examples, the position sensor 126 may be a Global Navigation Satellite System (GNSS) device, although other position sensors are contemplated, including but not limited to a Machine Target position sensor, a robotic total station, a robotic tracking station, and other types of sensor systems. In the example of
In addition to the rotational sensor 124 and the position sensor 126, the machine 100 can also include additional sensors. For example,
Other techniques and systems for determining movement of the tracks also are contemplated. For example, track movement may be inferred from control signals used to move the machine 100. For instance, such control signals may be generated in response to an operator input, e.g., from an operator in the cab 110, from a remote operator, and/or from a computing device acting as an operator (e.g., in an autonomous system). In these examples, the track sensor 128 may be configured to generate information associated with these signals, and an actual movement of the tracks 108, and thus of the machine 100, may be inferred from the signals. For example, and without limitation, the track sensor 128 may detect a physical displacement of a user control, such as a joystick, steering wheel, or the like, or may identify control signals, e.g., electrical signals, generated in response to such movements.
The machine 100 can also include one or more imaging and/or ranging sensors, schematically depicted in
The machine 100 may also include implement sensors, e.g., for generating information associated with implements of the machine. In the example of
As noted above, the machine 100 is used to perform one or more tasks. Such tasks may include moving the machine 100, e.g., via the tracks 108, to various locations and/or operating the bucket 116 (via the boom 112 and/or the stick 114) to dig, grade, or the like. Tasks may also include or require rotating of the upper frame 102 (and therefore the cab 110, the boom 112, the stick 114, and the bucket 116) relative to the lower frame 104. In some examples, the tasks can be performed in response to instructions from a human operator, e.g., sitting in the cab 110. Without limitation, the human operator can operate the machine 100 using a lever, joystick, touchscreen or other type of control. The operator can also use input commands that correspond to different types of functions, operations, or events, such as “stick-in” events, “stick-out” events, “boom-up” events, “boom-down” events, “bucket-curl” events, “bucket-dump” events, or any other event associated with movement of one or more components of the machine 100.
The sensor data 140 can include the data generated by any of the sensors associated with the machine 100, as detailed above. Without limitation, the sensor data 140 may include rotational data generated by the rotational sensor 124, e.g., as angular displacements of the upper frame 102 relative to the lower frame 104. The sensor data 140 may also include positions, such as global positions, generated by the position sensor 126, e.g., as positions of a GNSS sensor in a global coordinate system. The sensor data 140 may also include track data generated by the track sensor 128 (and/or any additional track sensors as detailed above). Furthermore, the sensor data 140 may include environmental data generated by the sensor 130 and/or the implement sensors 132, 134, 136. The sensor data 140 may also include information from other than the sensors illustrated in
As illustrated in
In some examples, one of the fusion model(s) 146 may be configured to estimate a center of rotation of the machine 100 using position data from the position sensor 126 and rotational data from the rotational sensor 124. For instance, and as detailed further below in connection with
Although
The state data 142 may include any information about the machine 100 state, as discussed above. In some examples, the state data 142 may include a two- or three-dimensional vector indicative of the relative position and orientation of the machine 100. An example of a machine orientation vector 144 is shown in
As just described, data associated with the machine 100 can be used to determine the state data 142, e.g., via the state determination system 138 implementing the fusion model(s) 146. Examples of determining the state data 142 according to techniques described herein are detailed below with reference to
More specifically,
In the example of
In the second machine position, the lower frame 204 of the machine 200 has not moved relative to the first machine position, e.g., the tracks have not caused the machine 200 to move relative to the ground in a global coordinate system. However, the GNSS sensor 208 will detect a change in location, e.g., because the GNSS sensor is coupled to the upper frame 202 and the upper frame 202 has moved, e.g., as shown by the upper frame 202′, as a result of the rotation of the upper frame 202 relative to the lower frame 204. The GNSS sensor 208 is coupled to the upper frame 202 at a position spaced, e.g., in the X-Y plane, from the center of rotation 206 about which the upper frame 202 rotates relative to the lower frame 204. Because of this spacing, the GNSS sensor 208 traces an arc about the center of rotation 206 as the upper frame 202 rotates relative to the lower frame 204; that is, the GNSS sensor moves from the first position shown by the GNSS sensor 208 to a second position shown by the GNSS sensor 208′. As will be appreciated, as the upper frame 202 rotates from the first position to the second position, the GNSS sensor 208 registers a plurality of global positions, including at least global positions associated with the first machine position and the second machine position. Depending on the configuration of the GNSS sensor 208, e.g., the frequency at which the GNSS sensor 208 determines and/or generates a global position, the GNSS sensor 208 may also determine a plurality of global positions between the illustrated machine positions.
In the example of
In an idealized case, e.g., where there is no sensor noise or errors due to machine vibration, the global positions measured by the GNSS sensor 208 could be sufficient to determine the orientation vector 210′. Specifically, because the position of the GNSS sensor 208 on the machine 200 is known, e.g., relative to the center of rotation 206 of the machine 200 and/or relative to positions of implements on the machine 200, the GNSS sensor 208 will experience the same rotation, e.g., the angle of rotation, θ, as the upper frame 202, and the positions generated by the GNSS sensor 208 will lie on an arc at a known distance from the center of rotation 206. With two global positions generated by the GNSS sensor, two candidate centers of rotation can be determined, e.g., at the intersections of circles about the global positions, each circle having a radius equal to the distance between the center of rotation 206 and the GNSS sensor 208. A third global position measured by the GNSS sensor can disambiguate between the candidates, thereby locating the center of rotation 206 and determining the direction of the machine orientation vector 210′.
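By way of a non-limiting illustration of the idealized geometry just described, the following sketch (in Python, assuming planar coordinates and illustrative helper names that are not part of this disclosure) computes the two candidate centers from two global positions and the known radius, and uses a third global position to select between them. It is a simplified example rather than a definitive implementation of the disclosed techniques.

```python
import numpy as np

def candidate_centers(p1, p2, r):
    """Return the two candidate centers of rotation lying at distance r from
    both GNSS fixes p1 and p2 (idealized, noise-free geometry)."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    d = np.linalg.norm(p2 - p1)
    if d == 0.0 or d > 2.0 * r:
        raise ValueError("fixes are coincident or farther apart than the arc allows")
    mid = (p1 + p2) / 2.0
    h = np.sqrt(r**2 - (d / 2.0) ** 2)            # distance from chord midpoint to center
    perp = np.array([-(p2 - p1)[1], (p2 - p1)[0]]) / d  # unit normal to the chord
    return mid + h * perp, mid - h * perp

def disambiguate(p3, candidates, r):
    """Pick the candidate whose distance to a third fix p3 best matches r."""
    return min(candidates, key=lambda c: abs(np.linalg.norm(np.asarray(p3, float) - c) - r))

# Illustrative use: three fixes on an arc of radius 3.0 m about the origin
# (approximately 0, 10, and 20 degrees of swing).
r = 3.0
fixes = [(3.0, 0.0), (2.954, 0.521), (2.819, 1.026)]
center = disambiguate(fixes[2], candidate_centers(fixes[0], fixes[1], r), r)
```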
While data from the GNSS sensor 208 may be sufficient to determine state information in an idealized case, real-world applications are different. Specifically, even with only rotational motion, as in the example of
Without limitation, aspects of this disclosure use the points 302 and the angle determined by the rotational sensor, e.g., the rotation angle, θ, to determine the position and orientation of the machine. For example, characteristics of an expected arc 304 approximating the movement of the GNSS sensor 208 can be determined from the arrangement of the GNSS sensor 208 on the upper frame 202 and based on the measured angle of rotation, θ. In some aspects of this disclosure, errors of the measured points 302 in the form of offsets relative to the arc 304, e.g., lateral distances, can be determined, and an estimated center of rotation 306, corresponding to the center of rotation 206, can be determined based on these errors. In other examples, the arc 304 may be fit to the points 302 to yield an estimated center of rotation 306. For instance, techniques for determining the center of rotation can include least-square-error minimization, averaging, optimization based on model fitting, or the like. In addition, these estimation methods may provide an indication of the uncertainty around the final estimate, e.g., the variance of the error, the median error, or the like. A confidence value associated with the state estimation of the vehicle, e.g., associated with the machine orientation vector, may be based, at least in part, on this uncertainty.
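As one non-limiting sketch of the fitting just described (assuming planar coordinates, SciPy's least-squares solver, and illustrative parameter names and weights that are not taken from this disclosure), the expected arc can be fit to the noisy points by minimizing their lateral offsets from an arc of the known radius, with the measured angle of rotation, θ, included as an additional weighted residual; the spread of the remaining residuals can feed the confidence value mentioned above.

```python
import numpy as np
from scipy.optimize import least_squares

def estimate_center(points, radius, theta, x0, weights=None, angle_weight=10.0):
    """Fit an arc of known radius to noisy GNSS fixes and estimate its center.
    Residuals are the lateral offsets of the fixes from the arc, plus one term
    tying the angle subtended by the first and last fixes to the measured swing
    angle theta. A sketch only; names and weights are illustrative."""
    points = np.asarray(points, float)
    w = np.sqrt(np.ones(len(points)) if weights is None else np.asarray(weights, float))

    def residuals(center):
        radial = w * (np.linalg.norm(points - center, axis=1) - radius)
        v0, v1 = points[0] - center, points[-1] - center
        swing = np.arctan2(v0[0] * v1[1] - v0[1] * v1[0], np.dot(v0, v1))
        return np.append(radial, angle_weight * (swing - theta))

    result = least_squares(residuals, x0)
    # The residual spread gives a rough uncertainty for a downstream confidence value.
    return result.x, float(np.std(result.fun))

# Illustrative use: noisy fixes along a 5-degree, 3-meter arc about the origin,
# seeded with a coarse initial guess (e.g., from the two-point construction above).
rng = np.random.default_rng(0)
angles = np.deg2rad(np.linspace(0.0, 5.0, 10))
fixes = 3.0 * np.c_[np.cos(angles), np.sin(angles)] + rng.normal(0.0, 0.03, (10, 2))
center, sigma = estimate_center(fixes, 3.0, np.deg2rad(5.0), x0=np.array([0.3, -0.3]))
```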
Other examples for determining the estimated center of rotation 306 also are contemplated. For example, perpendicular bisectors of lines connecting adjacent pairs of the points 302 would all meet at the center of rotation 206 in the absence of noise. However, because the global position points are noisy, techniques described herein can estimate a center of rotation for pairs of the points 302. A cluster of estimated centers, e.g., one corresponding to each pair of points, is thus generated. Techniques described herein can then determine the estimated center of rotation 306 from the cluster. This clustering can be used in connection with, or instead of, the fitting described above.
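A non-limiting sketch of the perpendicular-bisector approach (again in Python, with planar coordinates and illustrative names that are assumptions rather than part of this disclosure) intersects the bisectors of successive chords pairwise and takes a robust statistic of the resulting cluster as the estimated center of rotation 306.

```python
import numpy as np

def chord_bisector(p, q):
    """Perpendicular bisector of the chord p->q, as a (point, direction) pair."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    chord = q - p
    return (p + q) / 2.0, np.array([-chord[1], chord[0]])  # chord rotated 90 degrees

def intersect(bis_a, bis_b, eps=1e-9):
    """Intersection of two bisectors, or None if they are (nearly) parallel."""
    (m1, n1), (m2, n2) = bis_a, bis_b
    a = np.column_stack([n1, -n2])
    if abs(np.linalg.det(a)) < eps:
        return None
    t = np.linalg.solve(a, m2 - m1)
    return m1 + t[0] * n1

def cluster_center(points):
    """One candidate center per pair of chord bisectors; the median of the
    cluster is a noise-robust estimate of the center of rotation."""
    bisectors = [chord_bisector(points[i], points[i + 1]) for i in range(len(points) - 1)]
    candidates = []
    for i in range(len(bisectors)):
        for j in range(i + 1, len(bisectors)):
            c = intersect(bisectors[i], bisectors[j])
            if c is not None:
                candidates.append(c)
    return np.median(np.array(candidates), axis=0)
```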
In some examples, the first point 302(1) and the second point 302(2) may be more accurate than other instances of the points 302, e.g., because the machine may be stationary during the sensing of these positions. The methods described above can incorporate this as a prior belief by appropriately weighting the errors, e.g., by weighting the least-square errors from the center-of-rotation estimates obtained from the points 302(1), 302(2) and the angle of rotation, θ, from the rotational sensor 124 more heavily than other error terms.
The GNSS global position data has a certain amount of error. Accordingly, successive measurements by the GNSS sensor may spuriously indicate movement of the machine, e.g., in the absence of actual motion. For example, conventional GNSS sensors can determine a global position to within about 0.1 feet (about three centimeters). In some examples, the techniques described herein can determine the orientation and position of the vehicle upon receiving data about two points outside the margin of error associated with the GNSS sensor. Stated differently, although
In the example just described, other sensor data, such as portions of the sensor data 140, may be generated at the vehicle to confirm that no transverse movement of the machine has occurred. For example, and as discussed above, information about control inputs to the vehicle may be received by the state determination component. Specifically, when an operator of the machine inputs a control to move the machine, e.g., via a joystick, user interface, steering mechanism, or other control device, the input may be sent to a controller to cause the tracks 108 to move in some manner. In the example of
The foregoing techniques are improvements over conventional systems that require multiple GNSS sensors, e.g., at different, known locations on the upper frame 202. Other conventional systems may have a single GNSS sensor (or other type of position sensor), but require the machine to perform an extensive calibration procedure to determine a position and orientation of the machine, e.g., prior to use and/or at intervals during use. For example, a portion of this conventional calibration procedure can include causing the machine to rotate the upper frame relative to the lower frame through a relatively large angle, e.g., 30 degrees or more. Through this calibration procedure, a sufficient number of spaced global positions, over the sufficiently large angle, are used to approximate the center-of-rotation of the machine within an acceptable tolerance, e.g., by estimating an arc made by the points. However, because of the large errors associated with the global positions generated by the GNSS sensors, as noted above, the large angle and a relatively large number of positions are required to provide a machine orientation with acceptable accuracy. In contrast, aspects of this disclosure fuse information from the rotational sensor and the (single) GNSS device (and/or data from other sensors) to determine precise state information, including position and/or orientation, of the machine 200, without the need for an extensive calibration procedure. In some examples, the techniques described herein may calculate position and orientation of the machine 200 with greater accuracy and without requiring a large calibration angle. Techniques described herein may determine machine state information with an angular rotation, e.g., the angle of rotation, θ, of five degrees or less. In some examples, the techniques described herein can be performed during normal operation of the machine, e.g., during performance of a task, thereby completely obviating the need for a separate calibration procedure. In other implementations, a greatly simplified calibration procedure, e.g., with a much smaller angular rotation, can substantially reduce the time needed for calibration.
Multiple measurements from the GNSS sensor may be sufficient to determine a position and orientation of the vehicle if it is known that there is no translational machine movement. Translational motion, however, may alter the expected arc 304 of the GNSS sensor 208. Aspects of this disclosure can also use additional data, e.g., from the sensor data 140, to generate a machine orientation vector in response to more complex machine movements. Without limitation, IMU data, track data, and/or the like, e.g., sensor data associated with the lower frame 204, may be used by the fusion model(s) 146 to determine an estimated path of the GNSS sensor, with measurements generated by the GNSS sensor 208 being fit to the estimated path. As noted, relatively small motions are sufficient for performing the techniques described herein, so the estimated path may not deviate far from the arc and/or the estimated arc may be isolated from other, e.g., translational, movements during operation of the machine.
The examples of
Specifically,
In the example of
As noted above, in the second machine position 414, the lower frame 404 of the machine 400 has moved relative to the ground. More specifically, a first track 418 and/or a second track 420, which may correspond to the first track 108a and the second track 108b, have been actuated to cause the machine 400 to move relative to the ground, in a global coordinate system. As with previous examples, the GNSS sensor 408 will generate global positions associated with the first machine position 410 and the second machine position 414 (as well as positions between the first machine position 410 and the second machine position 414), e.g., because the GNSS sensor is coupled to the upper frame 402 and the upper frame 402 is carried by the lower frame 404. However, the upper frame 402 has not moved relative to the lower frame 404, so there is no expected arc to which the measured points may be compared, as in the example of
In the example of
The information associated with the movement of the tracks may be used to approximate a translational and/or rotational movement of the machine 400. For example, the state determination system 138, e.g., using one or more of the fusion model(s) 146, can approximate a first track path 424 and/or a second track path 426. Because the location of the GNSS sensor 408 relative to the upper frame 402 is known, the estimated path 422 can be determined based on one or more of the first track path 424 and/or the second track path 426. The global position data from the GNSS sensor 408 is also available at times associated with locations along the estimated path 422. In some instances, the state determination system 138 computes a translation and three-dimensional rotation (yaw, pitch, and roll) of the machine with respect to its center-of-rotation based on machine geometry, known locations of the sensors, and a mapping from left/right track inputs to translation amounts for each of the sensed data above. These estimates based on individual sensors can be merged using a sensor fusion algorithm, e.g., the fusion model(s) 146, to determine an accurate estimate of the motion of the machine 400, and the machine position and orientation can then be computed, e.g., by applying the estimated motion vector to the prior position and orientation of the machine. For example, in an embodiment, a Kalman filter or an extended Kalman filter can be used to track the center-of-rotation and the orientation of the machine using the sensor measurements as inputs.
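As one non-limiting sketch of such a fusion (simplified to a planar model, written in Python, with parameter names and noise values that are illustrative assumptions rather than values from this disclosure), an extended Kalman filter can propagate the center of rotation and heading from left/right track displacements and correct them with each GNSS fix, accounting for the known offset of the GNSS sensor from the center of rotation.

```python
import numpy as np

class TrackGnssEkf:
    """Planar EKF sketch: predict from left/right track displacements, update
    from a single GNSS fix mounted at a known offset from the center of rotation."""

    def __init__(self, x0, gauge, gnss_offset, p0=1.0, q=0.02, r=0.03):
        self.x = np.asarray(x0, float)              # state: [x, y, yaw] of the center of rotation
        self.P = np.eye(3) * p0                     # state covariance
        self.gauge = float(gauge)                   # track-to-track spacing (m)
        self.offset = np.asarray(gnss_offset, float)  # GNSS lever arm in upper-frame coordinates (m)
        self.Q = np.eye(3) * q**2                   # process noise (track slip, vibration)
        self.R = np.eye(2) * r**2                   # GNSS measurement noise (~3 cm)

    def predict(self, d_left, d_right):
        ds = 0.5 * (d_left + d_right)               # forward displacement of the lower frame
        dyaw = (d_right - d_left) / self.gauge      # heading change from differential track motion
        yaw = self.x[2] + 0.5 * dyaw                # integrate about the mid-step heading
        self.x = self.x + np.array([ds * np.cos(yaw), ds * np.sin(yaw), dyaw])
        F = np.array([[1.0, 0.0, -ds * np.sin(yaw)],
                      [0.0, 1.0,  ds * np.cos(yaw)],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update(self, gnss_xy):
        c, s = np.cos(self.x[2]), np.sin(self.x[2])
        rot = np.array([[c, -s], [s, c]])
        predicted = self.x[:2] + rot @ self.offset  # where the GNSS antenna should be
        H = np.array([[1.0, 0.0, -s * self.offset[0] - c * self.offset[1]],
                      [0.0, 1.0,  c * self.offset[0] - s * self.offset[1]]])
        innovation = np.asarray(gnss_xy, float) - predicted
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ innovation
        self.P = (np.eye(3) - K @ H) @ self.P
```

In use, predict would be called with incremental left/right track displacements (inferred from the track sensors or control inputs) and update with each new global position; the three-dimensional (yaw, pitch, and roll) case described above is omitted from this planar sketch.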
The estimated track paths 424, 426, and thus the estimated path 422 of the GNSS sensor 408, may be prone to larger errors than the estimated arc 304 determined from the rotational sensor 124. For instance, the tracks 418, 420 may be prone to slippage, e.g., based on the terrain, soil conditions, and/or the like. As a result, the machine orientation vectors 412, 416, computed without the benefit of rotation of the upper frame 402 relative to the lower frame 404, may have a lower confidence than machine orientation vectors based at least in part on the relative rotational movement. Accordingly, in some implementations of this disclosure, when a machine, like the machine 400, is controlled to traverse from a first position to a second position, as in
In the example of
The machine 500 also includes an imaging sensor 512, which may correspond to the sensor 130 shown in
In the example of
In some examples, the image sensor 512 may capture a video sequence or still images at fixed time intervals as the image data, including the first image data 516 and the second image data 518. The state determination system 138 can implement one or more methods for inferring an angle of rotation of the image sensor 512 from the image data. In one example, points on distinctive structures, such as a building 524, in the overlapping region 520 may be determined and described using SIFT, SURF, or HoG features, for example, and corresponding points in a subsequent image may be determined by matching these features. The images being matched must have an overlap containing at least some of the distinctive structures, e.g., as in the overlapping region 520. The camera motion can be computed from the set of known point correspondences and the positional relationship of the camera to a point or axis about which the machine 500 pivots in the illustrated example. Moreover, because the GNSS sensor 508 has a known positional relationship relative to the image sensor 512, the state determination system 138, e.g., using the fusion model(s) 146, can determine an expected arc, as in the example of
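A non-limiting sketch of the matching step is shown below in Python, using OpenCV's ORB features in place of the SIFT, SURF, or HoG descriptors mentioned above and, for simplicity, treating the swing as an in-plane image rotation; a fuller treatment would recover the camera pose from the point correspondences and the camera's positional relationship to the pivot axis. File names and thresholds are illustrative assumptions.

```python
import cv2
import numpy as np

def estimate_swing_from_images(img_a, img_b, min_matches=20):
    """Match distinctive structures in two overlapping frames and recover an
    approximate in-plane rotation between them. A simplified sketch only."""
    orb = cv2.ORB_create(nfeatures=2000)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)
    if des_a is None or des_b is None:
        return None  # no distinctive structures detected
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)
    if len(matches) < min_matches:
        return None  # insufficient overlap between the frames
    src = np.float32([kp_a[m.queryIdx].pt for m in matches])
    dst = np.float32([kp_b[m.trainIdx].pt for m in matches])
    # A similarity transform (rotation + translation + scale) fit with RANSAC.
    M, _inliers = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    if M is None:
        return None
    return float(np.degrees(np.arctan2(M[1, 0], M[0, 0])))  # rotation between frames, degrees

# Illustrative use with two grayscale frames captured before and after the swing:
# angle = estimate_swing_from_images(cv2.imread("frame_a.png", 0), cv2.imread("frame_b.png", 0))
```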
Insofar as
The system 600 can include one or more sensors 608, a display 610, one or more user interfaces 612, one or more controllers 614, processor(s) 616, memory 618 communicatively coupled with the processor(s) 616, and one or more communication connections 620. In the illustrated example, the memory 618 of the machine 602 stores a state determination component 622 and a graphical user interface (GUI) generation system 624. Although these systems are illustrated as, and will be described below as, separate components, functionality of the various systems may be attributed differently than discussed. Moreover, fewer or more systems and components may be utilized to perform the various functionalities described herein. Though depicted in
The sensor(s) 608 can include any sensors described herein. For example, and without limitation, the sensor(s) 608 can include the rotational sensor 124, position sensor 126, the track sensors 128, the sensors 130 and/or other sensors discussed herein. Without limitation, the sensor(s) 608 can be configured to generate data about aspects of the machine 602 and/or about an environment of the machine 602.
The user interface(s) 612 may be provided to an operator of the machine 602, e.g., to allow a user to interact with the machine 602. In some examples, the user interface(s) 612 are accessible by an operator of the machine 602 when the operator is in the cab 110. The user interface(s) 612 can include display screens, touch screens, joysticks, steering wheels, switches, pedals, and/or any other mechanism or component with which the operator can interface.
The controller(s) 614 can include components that implement control signals. Without limitation, the controller(s) 614 may receive signals based on operator inputs received via the user interface(s) 612 and determine actions to implement those controls. The controller(s) 614 may include hydraulic controllers, actuator controllers, electronic controllers, or the like. In some instances, data from the user interface(s) 612 and/or the controller(s) 614 may be used to infer machine motion, as detailed further herein.
In at least one example, the state determination component 622 can include functionality to determine state data for the machine 602 based on the sensor inputs. For example, the state determination component 622 may be substantially the same as the state determination system 138 discussed above. In examples, the state determination component 622 can receive sensor data from a rotational sensor and from a single GNSS sensor and determine an orientation of the machine 602 based on that information. For example, the state determination component 622 may utilize one or more models, e.g., the fusion model(s) 146 discussed above.
In some examples, the GUI generation system 624 can include functionality to generate one or more interactive interfaces, such as for presentation via the display 610. In some examples, the GUI generation system 624 may receive information from the state determination component 622 to generate the GUIs. In some examples, the GUIs may illustrate the machine 602 and information about the state of the machine 602. The state information may include a machine orientation vector. The GUIs may also present to an operator a confidence value associated with the state information. As discussed herein, state information determined using rotational data from the rotational sensor may be more accurate than state data determined using other techniques. This accuracy may be illustrated to an operator via a GUI.
The communication connection(s) 620 enable communication between the machine 602 and the remote computing device(s) 606 and/or other local or remote device(s). For instance, the communication connection(s) 620 can facilitate communication with the remote computing device(s) 606, such as via the network(s) 604. The communication connection(s) 620 can enable Wi-Fi-based communication such as via frequencies defined by the IEEE 802.11 standards, short range wireless frequencies such as BLUETOOTH, other radio transmission, or any suitable wired or wireless communications protocol that enables the respective computing device to interface with the other computing device(s).
In some implementations, the machine 602 can send information, such as instructions to generate GUIs, sensor data, or the like, to the remote computing device(s) 606, via the network(s) 604. The remote computing device(s) 606 can receive such information from the machine 602 via the communication connections 620, 640. In some implementations, the remote computing device(s) 606 can perform some of the functions attributed to the machine 602, including determining the state of the machine 602 or generating the GUIs, for example. In at least one example, the remote computing device(s) 606 can include one or more processors 626 and memory 628 communicatively coupled with the processor(s) 626. In the illustrated example, the memory 628 of the remote computing device(s) 606 may store a state determination component 630, a GUI generation component 632, a state model generation component 634, and/or include data stores 636. In examples, the state determination component 630 can be substantially the same as the state determination component 622 and the GUI generation component 632 can be substantially the same as the GUI generation system 624.
The state model generation component 634 can include functionality to generate one or more models, e.g., including the fusion model(s) 146 discussed above. Without limitation, the state model generation component 634 can implement one or more training processes, e.g., to train a machine learning model, to generate state information from various sensor inputs. Moreover, the state model generation component 634 can also include functionality to model aspects described herein, such as an estimated path for the GNSS sensor, the center of gravity, and/or other machine components.
The data stores 636 can include models 638, which can include the fusion model(s) 146 in some instances. The data stores 636 may also store machine specific information, machine-specific models, and/or other data used to determine aspects of state determination, as described herein.
The remote computing device(s) 606 may also include communication connection(s) 640 that enable communication between the remote computing device(s) 606 and other local or remote device(s). For instance, the communication connection(s) 640 can facilitate communication with the machine 602 and/or other machines, such as via the network(s) 604. The communication connection(s) 640 can enable Wi-Fi-based communication such as via frequencies defined by the IEEE 802.11 standards, short range wireless frequencies such as BLUETOOTH®, other radio transmission, or any suitable wired or wireless communications protocol that enables the respective computing device to interface with the other computing device(s).
The processor(s) 616 of the machine 602 and the processor(s) 626 of the remote computing device(s) 606 can be any suitable processors capable of executing instructions to process data and perform operations as described herein. By way of example and not limitation, the processor(s) 616, 626 can comprise one or more Central Processing Units (CPUs), Graphics Processing Units (GPUs), or any other device or portion of a device that processes electronic data to transform that electronic data into other electronic data that can be stored in registers and/or memory. In some examples, integrated circuits (e.g., ASICs, etc.), gate arrays (e.g., FPGAs, etc.), and other hardware devices can also be considered processors in so far as they are configured to implement encoded instructions.
The memory 618 and the memory 628 are examples of non-transitory computer-readable media. The memory 618, 628 can store an operating system and one or more software applications, instructions, programs, and/or data to implement the methods described herein and the functions attributed to the various systems. In various implementations, the memory can be implemented using any suitable memory technology, such as static random-access memory (SRAM), synchronous dynamic RAM (SDRAM), nonvolatile/Flash-type memory, or any other type of memory capable of storing information. The architectures, systems, and individual elements described herein can include many other logical, programmatic, and physical components, of which those shown in the accompanying figures are merely examples that are related to the discussion herein.
Although various systems and components are illustrated as being discrete systems, the illustrations are examples only, and more or fewer discrete systems may perform the various functions described herein. Moreover, functionality ascribed to the machine 602 may be performed at the remote computing device(s) 606 and/or functionality ascribed to the remote computing device(s) 606 may be performed at the machine 602.
In more detail,
At operation 702, the process 700 can include receiving first data from a GNSS sensor. For example, the state determination system 138 may receive sensor data generated by the position sensor 126 as the machine 100 is used to perform one or more tasks. In examples, the first data includes a plurality of global positions generated by a GNSS sensor as the machine 100 moves. The GNSS sensor may be coupled to an upper frame 102 of the machine 100 at a position spaced from an axis about which the upper frame 102 rotates. Disposed in this way, the GNSS sensor will move, and thus generate new global positions, in response to most machine movements.
At operation 704, the process 700 includes receiving second data from a rotational sensor. For example, the state determination system 138 may receive sensor data generated by the rotational sensor 124 as the machine 100 is used to perform one or more tasks. In examples, the second data includes a rotational displacement of the upper frame 102 of the machine relative to the lower frame 104 of the machine 100. The rotational sensor 124 may be configured to determine an angular displacement of the upper frame 102 relative to the lower frame 104 about the axis 106. The rotational sensor 124 may determine the rotational displacement with a high degree of accuracy, e.g., within about 0.1 degrees.
At operation 706, the process 700 includes determining a swing angle based on the second data. As noted above, the rotational sensor 124 determines an angular displacement about the rotational axis 106. The swing angle may be an absolute angle, a relative angle, or other angular measurement. In the example of
At operation 708, the process 700 includes estimating a center-of-rotation based on the swing angle and the first data. As discussed above in association with the example of
At operation 710, the process 700 can optionally include receiving additional data. The additional data can be any of the sensor data 140 discussed above and/or other information associated with the machine and/or movement of the machine. In some instances, the additional data can include, or can be used to generate, the track data and/or the image data discussed above in connection with
At operation 712, the process 700 includes determining machine state information based on the estimated center-of-rotation, machine parameters, and, optionally, the additional data. For example, the machine state information can include position, location, orientation, velocity, acceleration, and/or other information associated with aspects of the machine 100. Without limitation, the machine state information may be a machine orientation vector indicating a position and orientation of the machine. In examples, once the estimated center of rotation is determined, a known spatial relationship between components of the machine, such as the GNSS sensor, the center of gravity, the tracks, and/or other sensors, may be used to determine the direction of the machine. The additional data may be fused with the sensor data to determine the estimated center of rotation, to confirm assumptions about movement of the vehicle (e.g., an absence of lateral movement), and/or for other purposes detailed herein.
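A minimal, non-limiting sketch of this step (planar, in Python, with an assumed sign convention and illustrative names that are not part of this disclosure) converts the estimated center of rotation, the latest global position, the known mounting bearing of the GNSS sensor on the upper frame, and the measured swing angle into a machine orientation vector.

```python
import numpy as np

def machine_orientation(center, gnss_xy, mount_bearing, swing_angle):
    """Given the estimated center of rotation and the latest GNSS fix, use the
    known mounting bearing of the GNSS sensor on the upper frame to recover the
    upper frame's heading, then relate it to the lower frame via the measured
    swing angle. Angles in radians; sign convention is an assumption."""
    center, gnss_xy = np.asarray(center, float), np.asarray(gnss_xy, float)
    radial = gnss_xy - center
    upper_heading = np.arctan2(radial[1], radial[0]) - mount_bearing
    lower_heading = upper_heading - swing_angle
    # Machine orientation vector: position (center of rotation) plus heading direction.
    return center, np.array([np.cos(lower_heading), np.sin(lower_heading)])
```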
At operation 714, the process 700 includes controlling a machine based at least in part on the machine state information. For example, the machine 100 may be used to perform excavation tasks, including but not limited to digging, grading, or the like. In some examples, these operations may be aided using a three-dimensional model of the terrain to be excavated. The machine state, including the orientation of the machine, may be required to perform operations according to the model. In other instances, the operation 714 can include generating a GUI for display to an operator, a site manager, or other user associated with the machine.
The disclosed systems and methods find application in any environment in which state data of a machine may be necessary, e.g., for control of the machine, localization of the machine, or the like. In implementations, the state determination can be characterized at least in part as a machine orientation vector. For instance, the state may be based on rotational sensor data associated with a relative rotation of portions of the machine and global positions generated by a GNSS sensor. The disclosed systems and methods allow for more accurate state determination, and in some instances without the need for involved and cumbersome calibration and/or re-calibration routines. For example, the techniques described herein may reduce the time for performing calibration and re-calibration, including during the performance of tasks using the machine.
For example, and with reference to
Techniques described herein may improve efficiency at work sites and/or improve efficiency of machines, like the machine 100. By way of example and not limitation, techniques described herein can determine machine state information, including orientation of a machine, in the absence of frequent, involved, and/or disruptive calibration routines, which can lead to more efficient use of the machine 100, including but not limited to reduced fuel consumption and/or wear of parts. For instance, when an operator has to perform a calibration routine to determine machine state, the machine is not available to be used for productive operational tasks. Aspects of this disclosure may determine state information about the machine during performance of tasks, thereby increasing throughput of work and reducing wear on machine components caused by non-work-related use.
One having ordinary skill in the art will appreciate that computer programs for implementing the disclosed techniques may be stored on and/or read from computer-readable storage media. The computer-readable storage media may have stored thereon computer-executable instructions which, when executed by a processor, cause the computer to perform, among other things, the processes disclosed herein. Exemplary computer-readable storage media may include magnetic storage devices, such as a hard disk, a floppy disk, magnetic tape, or other magnetic storage device known in the art; optical storage devices, such as CD-ROM, DVD-ROM, or other optical storage devices known in the art; and/or electronic storage devices, such as EPROM, a flash drive, or another integrated circuit storage device known in the art. The computer-readable storage media may be embodied by one or more components of the machine 100.
It will be apparent to those skilled in the art that various modifications and variations can be made to the disclosed system for determining orientation and/or position of a machine without departing from the scope of the disclosure. Other embodiments will be apparent to those skilled in the art from consideration of the specification and practice of the embodiments disclosed herein. It is intended that the specification and examples be considered as exemplary only, with a true scope of the disclosure being indicated by the following claims and equivalents thereof.