The present description is related to excavators used in heavy construction. More particularly, the present description is related to improved sensing and control in such excavators.
Hydraulic excavators are heavy construction equipment generally weighing between 3500 and 200,000 pounds. These excavators have a boom, an arm, a bucket (or attachment), and a cab on a rotating platform that is sometimes called a house. A set of tracks is located under the house and provides movement for the hydraulic excavator.
Hydraulic excavators are used for a wide array of operations ranging from digging holes or trenches, demolition, placing or lifting large objects, and landscaping. Precise excavator operation is very important in order to provide efficient operation and safety. Providing a system and method that increases excavator operational precision without significantly adding to cost would benefit the art of hydraulic excavators.
The discussion above is merely provided for general background information and is not intended to be used as an aid in determining the scope of the claimed subject matter.
A mobile machine includes a rotatable house and a sensor operably coupled to the rotatable house and configured to provide at least one sensor signal indicative of acceleration. The mobile machine includes one or more controllers coupled to the sensor, the one or more controllers being configured to implement: sensor position determination logic that determines a sensor position of the sensor on the rotatable house based on the sensor signal during a rotation of the rotatable house; and control signal generator logic that generates a control signal to control the mobile machine, based on the sensor position.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter. The claimed subject matter is not limited to implementations that solve any or all disadvantages noted in the background.
Precision control or automatic control of an excavator or similar machines, such as cranes or backhoes, relies on a system of sensors. Often, these sensors include inertial measurement units (IMUs) that can detect acceleration, gravity, orientation, angular rotation, et cetera. When the IMU is coupled to the machine at manufacture, the sensors' physical locations on the components of the machine are typically known. However, when the sensors are added at a later time (e.g., as aftermarket components or manufacturer upgrade components), the sensors' precise locations and/or orientations on the machine are unknown. While the additional sensors may be used without knowing their precise locations, being able to determine their locations on the machine allows for higher precision control.
When an object is rotated about an axis, the acceleration it experiences is a function of its displacement from the rotational axis. Therefore, the location of a sensor can be determined based on the sensor data collected (e.g., acceleration) during a rotation of the sensor about one or more axes in one or more directions. Additionally, sensors may be mounted on components that are movable in relation to the rotational axis (e.g., a boom is movable in relation to the swing axis of the house). Accordingly, the component may be moved from one pose to another between rotations. With the known geometry of the component and the sensed accelerations in the different poses, the sensor location ambiguity can be reduced or eliminated.
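The relationship above can be sketched numerically. The following is a minimal illustration, not taken from the equations referenced later in this description, of recovering a sensor's distance from the rotational axis from its centripetal acceleration during a steady rotation:

```python
def radial_distance(a_centripetal: float, omega: float) -> float:
    """Distance from the rotation axis implied by a measured
    centripetal acceleration at a known angular velocity.

    a = omega^2 * r  =>  r = a / omega^2
    """
    if omega == 0:
        raise ValueError("angular velocity must be nonzero")
    return a_centripetal / omega ** 2

# During a steady swing at 0.5 rad/s, an IMU reporting 0.5 m/s^2 of
# centripetal acceleration toward the swing axis sits 2.0 m from it:
r = radial_distance(0.5, 0.5)  # -> 2.0
```

Because the measured acceleration scales with the square of the angular velocity, repeating the rotation at different speeds gives redundant estimates of the same distance, which is one way the ambiguity mentioned above can be reduced.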
Controller 202 is configured to receive one or more inputs and perform a sequence of programmatic steps to generate one or more suitable machine outputs for controlling the operation of machine 100 (e.g., implementing the various logic components). Controller 202 may include one or more microprocessors, or even one or more suitable general computing environments as described below in greater detail. Controller 202 is coupled to user interface devices 210 in order to receive machine control inputs from an operator within the cab. Examples of operator inputs include joystick movements, pedal movements, machine control settings, touch screen inputs, etc. User interface devices 210 also include one or more operator displays in order to provide information regarding excavator operation to the operator.
Data store 212 stores various information for the operation of machine 100. Illustratively, geometry 214, which corresponds to the geometry of various components of machine 100 (e.g., controllable subsystems 240), is stored in data store 212. For example, the dimensions and shape of boom 110 are stored in geometry 214. Such information may include the length, width, height, bends, radii of corners, size and location of the linkage pins, mass, center of mass, etc. Geometry 214 can also include three-dimensional models of the components, including subcomponents and mass calculations. Of course, data store 212 can include many other items as well, as indicated by block 216.
Sensors 220 include inertial measurement units (IMU) 222 and linkage sensors 224, and can include a variety of other sensors as well, as indicated by block 226. IMU sensors 222 can be disposed on machine 100 at a variety of different places. For instance, IMU sensors 222 can be placed on the rotatable house 102, boom 110, arm 116 and attachment 124. IMU sensors 222 are able to sense acceleration, orientation, rotation, etc. They are disposed on these and other components of machine 100 for precise control of machine 100.
Sensors 220 also include linkage sensors 224, which can include strain gauges, linear displacement transducers, potentiometers, et cetera. Linkage sensors 224 can sense the force applied on the controllable subsystems 240 and/or the orientation of the controllable subsystems via the displacement of their actuators. For instance, boom 110 is often actuated by a hydraulic cylinder, and the displacement of the piston in the cylinder will correlate to a location of boom 110 relative to rotatable house 102. In another example, a potentiometer can be located proximate a linkage pin between boom 110 and arm 116; this potentiometer will output a signal indicative of the angle between boom 110 and arm 116.
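The correlation between cylinder displacement and joint angle can be illustrated with a simple triangle model: the cylinder's two pins and the joint pivot form a triangle, so the law of cosines maps pin-to-pin length to an included angle. The link lengths below are hypothetical illustration values, not machine geometry from this description:

```python
import math

def boom_angle_from_cylinder(cyl_len: float, d_base: float, d_boom: float) -> float:
    """Included angle (radians) at the boom pivot, from the hydraulic
    cylinder's pin-to-pin length via the law of cosines.

    d_base: distance from the pivot to the cylinder's house-side pin
    d_boom: distance from the pivot to the cylinder's boom-side pin
    (All distances are hypothetical illustration values.)
    """
    cos_theta = (d_base ** 2 + d_boom ** 2 - cyl_len ** 2) / (2 * d_base * d_boom)
    # Clamp to guard against rounding just outside [-1, 1]:
    return math.acos(max(-1.0, min(1.0, cos_theta)))

# A 2.5 m pin-to-pin cylinder with 1.5 m and 2.0 m pivot offsets
# happens to give a right angle at the pivot:
theta = boom_angle_from_cylinder(2.5, 1.5, 2.0)
```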
Sensor position determination logic 230 determines the position of the various IMU sensors 222 (or other sensors) on machine 100. Sensor position determination logic 230 includes pose sequence logic 231, motion sequence logic 232, house sensor position determination logic 233, boom sensor position determination logic 234, arm sensor position determination logic 235 and attachment sensor position determination logic 236, and can include other components as well, as indicated by block 237. Pose sequence logic 231 generates or selects a sequence of poses for machine 100 to actuate to during a sensor position determination process. For example, to determine the position of a sensor on machine 100 it may be beneficial to change the pose of machine 100 and accelerate (e.g., rotate house 102) in various poses. This is because as the pose changes, the sensor will be displaced (predictably) at a different relative location to the rotational axis of rotatable house 102.
Motion sequence logic 232 generates or selects a sequence of motions for machine 100 to actuate through during a sensor position determination process. For example, to determine a position of a sensor on machine 100, creating motion allows for the detection of acceleration, especially angular acceleration and velocity. Since angular acceleration and velocity share a relationship with the physical displacement from the rotation axis, a known rotational acceleration or velocity can be used to determine the physical displacement from the rotation axis. This, together with the known geometry from geometry 214 and the locations of the linkages relative to one another, can provide the locations of the sensors on their respective controllable subsystems 240. Motions generated or selected by motion sequence logic 232 can also include periods of rest, such that the orientation of IMU sensors 222 can be determined. The periods of rest also provide a control value, or reference angle, for an IMU sensor 222.
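During a period of rest, gravity is the only acceleration a properly calibrated IMU senses, so the sensor's mounting angle can be recovered from the ratio of the gravity components on its axes. A minimal sketch of that idea (axis naming here is an assumption, not the convention used elsewhere in this description):

```python
import math

def rest_tilt(ax: float, az: float) -> float:
    """Mounting angle of an IMU from its accelerometer reading at
    rest, when the only sensed acceleration is gravity. Returns the
    rotation of the sensor's x-axis from horizontal, in radians."""
    return math.atan2(ax, az)

# A level sensor at rest reads (0, 9.81); one tilted 30 degrees
# splits gravity across its axes:
angle = rest_tilt(4.905, 8.496)  # roughly pi/6 (30 degrees)
```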
House sensor position determination logic 233 receives sensor signals from IMU sensor 222 that is located on rotatable house 102. As the rotatable house 102 rotates through a given series of motions and rests, the attached IMU sensor 222 will generate various readings. House sensor position determination logic 233 receives these readings and determines a position of the IMU sensor 222 on rotatable house 102 based on those readings. Of course, house sensor position determination logic 233 can determine the position of an IMU sensor 222 located on rotatable house 102 in other ways as well. For example, house sensor position determination logic 233 can generate an interface that allows a user to enter user input and house sensor position determination logic 233 determines the sensor location based on the user input.
Boom sensor position determination logic 234 receives sensor signals from IMU sensor 222 that is located on boom 110. As the rotatable house 102 rotates through a given series of motions and rests, boom 110 also rotates and pauses and the attached IMU sensor 222 will generate various readings. Boom sensor position determination logic 234 receives these readings and determines a position of IMU sensor 222 on boom 110 based on the sensor readings. Of course, boom sensor position determination logic 234 can determine the position of an IMU sensor 222 located on boom 110 in other ways as well. For instance, an actuator of boom 110 may actuate and the readings received from IMU 222 during this actuation may be used to calculate the position of sensor 222. In another example, boom sensor position determination logic 234 can generate an interface that allows a user to enter user input and boom sensor position determination logic 234 determines the sensor location based on the user input.
Arm sensor position determination logic 235 receives the sensor signals from one or more IMU sensors 222 that are located on arm 116. As the rotatable house 102 rotates through a given series of motions and rests, arm 116 also rotates and pauses and the attached IMU sensors 222 will generate various readings. Arm sensor position determination logic 235 receives these readings and determines a position of IMU sensor 222 on arm 116. Of course, arm sensor position determination logic 235 can determine the position of an IMU sensor 222 located on arm 116 in other ways as well. For instance, an actuator of arm 116 may actuate and the readings received from IMU 222 during this actuation may be used to calculate the position of sensor 222. In another example, arm sensor position determination logic 235 can generate an interface that allows a user to enter user input and arm sensor position determination logic 235 determines the sensor location based on the user input.
Attachment sensor position determination logic 236 receives the sensor signals from one or more IMU sensors 222 that are located on attachment 124. As the rotatable house 102 rotates through a given series of motions and pauses, attachment 124 also rotates and pauses and the attached IMU sensors 222 will generate various readings. Attachment sensor position determination logic 236 receives these readings and determines a position of IMU sensor 222 on attachment 124. Of course, attachment sensor position determination logic 236 can determine the position of an IMU sensor 222 located on attachment 124 in other ways as well. For instance, an actuator of attachment 124 may actuate and the readings received from IMU 222 during this actuation may be used to calculate the position of sensor 222. In another example, attachment sensor position determination logic 236 can generate an interface that allows a user to enter user input and attachment sensor position determination logic 236 determines the sensor location based on the user input.
Control system 250 controls the operations of machine 100. Control system 250 includes (semi)automatic control logic 252 and control signal generator logic 254, and can include other items as well, as indicated by block 256. (Semi)automatic control logic 252 allows for fully automatic or partially automatic control of machine 100 by an operator. For instance, semi-automatic control would include smart grade operations that allow an attachment 124 that is a bucket to grade or dig a flat-bottom trench, despite the standard displacement of linkages 109 during actuation being circular (e.g., due to rotation about linkage pins). Fully automatic control can include full control by the system, such as digging a trench without user intervention.
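The flat-bottom trench example can be illustrated with a simple planar two-link model: each joint alone moves the linkage pin on an arc, so a grade function must coordinate the joints to hold the pin's height constant. The link lengths, angles, and base height below are illustrative assumptions, not values from geometry 214:

```python
import math

def bucket_pivot_height(l_boom: float, l_arm: float,
                        th_boom: float, th_arm: float,
                        base_h: float) -> float:
    """Height of the arm/attachment linkage pin above ground from a
    simple planar two-link model. th_boom: boom angle above
    horizontal; th_arm: arm angle relative to the boom. All values
    are hypothetical illustration values."""
    return (base_h
            + l_boom * math.sin(th_boom)
            + l_arm * math.sin(th_boom + th_arm))

# With a 5.7 m boom raised 30 degrees and a 2.9 m arm folded
# 90 degrees down relative to the boom:
h = bucket_pivot_height(5.7, 2.9, math.radians(30), math.radians(-90), 2.0)
```

A smart grade operation would, in effect, regulate a quantity like this height: as the arm sweeps through the cut, the boom is commanded so the computed height tracks the target grade even though each individual joint motion is circular.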
Boom 110 has a boom X axis (XB) defined by a line connecting the boom/house linkage pin to the boom/arm linkage pin. A boom Z axis (ZB) is perpendicular to XB and extends upward from the boom/house linkage pin. As shown, there is a sensor 222-1 on boom 110. Sensor 222-1 is located P1B away from the XB, ZB origin at an angle of θ1. Sensor 222-1 is also located P1A away from the boom 110/arm 116 linkage pin.
Arm 116 has an arm X-axis (XA) defined by a line connecting the arm/boom linkage pin to the arm/attachment linkage pin. An arm Z-axis (ZA) is perpendicular to XA and extends upward from the boom/arm linkage pin. As shown, there is a sensor 222-2 on arm 116. Sensor 222-2 is located P2A away from the XA, ZA origin at an angle of θ2.
The positions of sensors 222-0, 222-1, 222-2 can be defined globally (e.g., on XM and ZM), locally (e.g., on XB, ZB or XA, ZA), or relative to some other point on machine 100. Of course, any position defined in one of these reference frames can be converted to another. For instance, as shown the local X-axis passes through the pin joint; however, in other examples the X-axis could be defined elsewhere.
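Such a frame conversion is a planar rotation plus a translation by the pivot location. A minimal sketch, assuming simple planar frames (the pivot coordinates and values below are hypothetical, not taken from the machine geometry):

```python
import math

def local_to_global(px: float, pz: float,
                    pivot_x: float, pivot_z: float,
                    link_angle: float) -> tuple:
    """Convert a sensor position (px, pz) expressed in a component
    frame (e.g., XB, ZB) into machine coordinates (XM, ZM), given the
    component's pivot location and rotation angle. Illustrative
    frames only; actual definitions follow the machine geometry."""
    c, s = math.cos(link_angle), math.sin(link_angle)
    gx = pivot_x + c * px - s * pz
    gz = pivot_z + s * px + c * pz
    return gx, gz

# A sensor 2 m along the boom axis, with the boom raised 30 degrees
# about a pivot at (0.5, 1.2) in machine coordinates:
x, z = local_to_global(2.0, 0.0, 0.5, 1.2, math.radians(30))
```

The inverse conversion is the same rotation applied with the opposite angle after subtracting the pivot, which is why positions defined in any one of these frames can be moved freely to another.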
Operation 400 proceeds at block 420 where the position of the first sensor (e.g., sensor 222 on rotatable house 102) is determined. As indicated by block 422, the position may be determined based on, amongst other things, the sensor signal output of sensor 222 as machine 100 is rotated through a series of motions. As indicated by block 424, the position may be determined based on manually measuring the location of sensor 222 on rotatable house 102. As indicated by block 426, the position may be determined in other ways as well.
Operation 400 proceeds at block 430 where it is determined if there are more sensors to be located. If not, then operation 400 proceeds at block 470 which will be described in greater detail below. If so then operation 400 proceeds at block 440.
At block 440, the position of the second sensor (e.g., sensor 222 on boom 110) is determined. As indicated by block 442, the position may be determined based on the sensor signal output of sensor 222 on boom 110 as machine 100 is rotated through a series of motions. As indicated by block 444, the position may be determined based on manually measuring the location of sensor 222 on boom 110. As indicated by block 446, the position may be determined in other ways as well. For instance, an image taken of the sensor on machine 100 can be analyzed to identify machine parts and the sensor, and the distances between these parts in the image can be used to determine the sensor's physical location.
Operation 400 proceeds at block 450 where it is determined if there are more sensors to be located. If not, then operation 400 proceeds at block 470 where the positions of the sensors are stored in, for example, datastore 212. If so, then operation 400 proceeds at block 460 where the position of the next sensor is determined. As indicated by block 462, the position may be determined based on the sensor signal output of sensor 222 as machine 100 is rotated through a series of motions (e.g., rotating house 102, raising boom 110, lowering boom 110, extending arm 116, retracting arm 116, etc.). As indicated by block 464, the position may be determined based on manually measuring the location of sensor 222. As indicated by block 466, the position may be determined in other ways as well.
Operation 500 begins at block 510 where sensor location determination operation 500 initializes. As indicated by block 512, initialization can include moving machine 100 to a flat stable surface. As indicated by block 514, initialization can include calibrating one or more sensors 220 on machine 100. Of course, initialization can include a variety of other things, as indicated by block 516. For instance, initialization can include loading machine geometry or locations of other sensors or components of machine 100.
Operation 500 proceeds at block 520 where the angle of sensor 222 is determined at rest. For example, angle θ0 in
Operation 500 proceeds at block 530 where rotatable house 102 is swung about a Z-axis in one direction (e.g., counterclockwise) and, during this rotation, sensor data is gathered. For example, sensors 220 (e.g., IMU 222) sense characteristics of the motion (e.g., acceleration, force, etc.) and the sensed data is stored. As indicated by block 532, rotatable house 102 is swung at full speed. As indicated by block 534, rotatable house 102 is swung at a steady state that could be less than full speed. As indicated by block 536, rotatable house 102 is swung at a different speed or state.
Operation 500 proceeds at block 540 where rotatable house 102 is swung about the Z-axis in a second direction that is opposite the first direction (e.g., clockwise) and, during this rotation, sensor data is gathered. For example, sensors 220 (e.g., IMU 222) sense characteristics of the motion and the sensed data is stored. As indicated by block 542, rotatable house 102 is swung at full speed. As indicated by block 544, rotatable house 102 is swung at a steady state that could be less than full speed. As indicated by block 546, rotatable house 102 is swung at a different speed or state.
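One benefit of swinging in both directions, as in blocks 530 and 540, is that it separates swing-direction-dependent terms from direction-independent ones. A minimal sketch assuming a simple additive model (the actual separation used with the referenced equations may differ): the tangential acceleration reverses sign when the swing direction reverses, while a fixed sensor bias or gravity leakage does not, so half the difference and half the sum isolate the two parts.

```python
def split_components(a_ccw: float, a_cw: float) -> tuple:
    """Split a tangential-axis accelerometer reading into the
    swing-direction-dependent part and the direction-independent
    part (e.g., bias plus gravity leakage from sensor tilt).

        tangential = (a_ccw - a_cw) / 2
        offset     = (a_ccw + a_cw) / 2

    Assumes matched swing profiles in the two directions."""
    return (a_ccw - a_cw) / 2.0, (a_ccw + a_cw) / 2.0

# Readings of 2.1 and -1.9 m/s^2 from opposite swings imply
# 2.0 m/s^2 of true tangential acceleration and a 0.1 m/s^2 offset:
tangential, offset = split_components(2.1, -1.9)
```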
Operation 500 proceeds at block 550 where the distance PX is calculated. Global PX can be calculated in a few different ways. For example, global PX can be calculated using equations 8 and 9 above, with respect to
Operation 500 proceeds at block 560 where PX and PZ are calculated. PX and PZ can be calculated using equations 10 and 11 shown below. The global PX calculated in block 550 is used to solve for PX and PZ. As indicated by block 562, a measured PZ can be used to solve for PX and PZ. As indicated by block 564, a nominal PZ can be used to solve for PX and PZ. Of course, PX and PZ can be determined in other ways as well as indicated by block 566.
Operation 500 proceeds at block 570 where PY is determined. PY can be determined using equation 6 above and the data collected in blocks 530 and 540. Of course, PY can be determined in other ways as well, as indicated by block 574.
Operation 500 proceeds at block 580 where the position is stored for later use. As indicated by block 582, the relative positions of the sensors can be stored. For example, the position of the sensor relative to a component of machine 100 (e.g., a linkage pin, the boom, house, arm, etc.). As indicated by block 584, the global position of the sensors can be stored. For example, the position of the sensor relative to the swing axis of machine 100 or a position of the sensor relative to the ground. As indicated by block 586, the position of the sensor may be stored in data store 212 on machine 100. Of course, the position of the sensor may be stored in some other format at a different location as well, as indicated by block 588.
Operation 500 proceeds at block 590 where machine 100 is controlled based on the position of one or more sensor(s) 222.
Operation 600 begins at block 610 where operation 600 is initialized. As indicated by block 612, initialization can include moving machine 100 to a flat stable surface. As indicated by block 614, initialization can include calibrating sensors 220 on machine 100. Of course, initialization can include a variety of other things as indicated by block 616. For instance, initialization can include loading machine geometry or locations of other sensors or components of machine 100.
Operation 600 proceeds at block 620 where θ1 is determined at rest. θ1 can be determined using the above-mentioned equation 1, as indicated by block 622. θ1 can be determined in other ways as well, as indicated by block 624.
Operation 600 proceeds at block 630 where rotatable house 102 is swung about a Z axis in one direction (e.g., counterclockwise) and, during this rotation, sensor data is gathered. For example, sensors 220 (e.g., IMU 222) sense characteristics of the motion and the sensed data is stored. As indicated by block 632, rotatable house 102 is swung at full speed. As indicated by block 634, rotatable house 102 is swung at a steady state that could be less than full speed. As indicated by block 636, rotatable house 102 is swung at a different speed or state.
Operation 600 proceeds at block 640 where rotatable house 102 is swung about the Z axis in a second direction that is opposite the first direction (e.g., clockwise) and, during this rotation, sensor data is gathered. For example, sensors 220 (e.g., IMU 222) sense characteristics of the motion and the sensed data is stored. As indicated by block 642, rotatable house 102 could be swung at full speed. As indicated by block 644, in addition or in the alternative, rotatable house 102 is swung at a steady state that could be less than full speed. As indicated by block 646, in addition or in the alternative, rotatable house 102 is swung at a different speed or state.
Operation 600 proceeds at block 650 where boom 110 is repositioned. After boom 110 is repositioned, operation 600 repeats blocks 620-640 with boom 110 in the new position. As indicated by block 652, the new position can be a 90-degree rotation of boom 110. The new position can also include a different rotation or pose, as indicated by block 656.
Operation 600 proceeds at block 660 where PX and PY are determined. As indicated by block 662, PX and PY can be determined using equations 15-18 above. For example, a best fit of the sensor data for the first position could be calculated with equation 15 and equation 16 for the second position where equations 17 and 18 are assumed. Note that θ1 in equation 15 represents the angle of boom 110 in the first position and θ2 in equation 16 represents the angle of boom 110 in the second position.
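A sketch of how two poses can resolve the in-plane offsets follows. Since equations 15-18 are not reproduced here, the measurement model below is an assumption for illustration: in each pose the sensor's horizontal distance from the swing axis (recoverable from centripetal acceleration at a known swing speed) is a linear function of the unknown offsets, and two poses give two equations in two unknowns.

```python
import math

def solve_offsets(theta1: float, theta2: float,
                  r1: float, r2: float, pivot_r: float) -> tuple:
    """Solve for a sensor's in-plane offsets (px, pz) in the boom
    frame from two swing tests with the boom at angles theta1 and
    theta2. Assumed model (a stand-in, not equations 15-18):

        r_i = pivot_r + px * cos(theta_i) - pz * sin(theta_i)

    where r_i is the sensor's horizontal distance from the swing
    axis in pose i and pivot_r is the pivot's horizontal offset."""
    a11, a12 = math.cos(theta1), -math.sin(theta1)
    a21, a22 = math.cos(theta2), -math.sin(theta2)
    det = a11 * a22 - a12 * a21  # nonzero when the poses differ
    b1, b2 = r1 - pivot_r, r2 - pivot_r
    px = (b1 * a22 - a12 * b2) / det
    pz = (a11 * b2 - b1 * a21) / det
    return px, pz

# Boom level, then raised 90 degrees (as in the 90-degree offset of
# Example 8 below); measured radii 2.5 m and 0.2 m, pivot at 0.5 m:
px, pz = solve_offsets(0.0, math.pi / 2, 2.5, 0.2, 0.5)
```

Note that the determinant vanishes when the two boom angles coincide, which is why a distinct second pose (e.g., the 90-degree rotation of block 652) is needed.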
Operation 600 proceeds at block 670 where PZ is calculated. As indicated by block 672, boom 110 can be actuated and based on the sensor signal during actuation, PZ can be calculated. As indicated by block 674, PZ can be determined by measuring the position. Of course, PZ can be calculated in other ways as well as indicated by block 676.
Operation 600 proceeds at block 680 where machine 100 is controlled based on the position of one or more sensor(s) 222.
Operation 700 proceeds at block 720 where θ is determined at rest. θ can be determined using equation 1 above, as indicated by block 722. Of course, θ can be determined in other ways as well, as indicated by block 724.
Operation 700 proceeds at block 730 where rotatable house 102 is swung about a Z axis in one direction (e.g., counterclockwise) and, during this rotation, sensor data is gathered. For example, sensors 220 (e.g., IMU 222) sense characteristics of the motion and the sensed data is stored. As indicated by block 732, rotatable house 102 is swung at full speed. As indicated by block 734, in addition or in the alternative, rotatable house 102 is swung at a steady state that could be less than full speed. As indicated by block 736, in addition or in the alternative, rotatable house 102 is swung at a different speed or state.
Operation 700 proceeds at block 740 where rotatable house 102 is swung about the Z axis in a second direction that is opposite the first direction (e.g., clockwise) and, during this rotation, sensor data is gathered. For example, sensors 220 (e.g., IMU 222) sense characteristics of the motion and the sensed data is stored. As indicated by block 742, rotatable house 102 is swung at full speed. As indicated by block 744, in addition or in the alternative, rotatable house 102 is swung at a steady state that could be less than full speed. As indicated by block 746, in addition or in the alternative, rotatable house 102 is swung at a different speed or state.
Operation 700 proceeds at block 750 where machine 100 is repositioned. Pose sequence logic 231 can determine the pose to which machine 100 should be repositioned. For example, machine 100 can reposition into four different poses over four iterations: a first pose with a tucked arm 116 and a lowered boom 110; a second pose with a tucked arm 116 and a raised boom 110; a third pose with an extended arm 116 and a raised boom 110; and a fourth pose with an extended arm 116 and a lowered boom 110.
Operation 700 proceeds at block 760 where PX, PY, and PZ are determined. As indicated by block 762, PX, PY, and PZ can be determined using a linear regression of the values gathered in blocks 730 and 740. As indicated by block 764, PX, PY, and PZ can be determined by measuring the location of the sensor. PX, PY, and PZ can be determined in other ways as well, as indicated by block 766.
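The regression of block 762 can be sketched as ordinary least squares over the poses: each pose/swing contributes one linear equation in the three unknown offsets, and four poses over-determine the system. The coefficient rows and measurements below are synthetic illustrations; the real design matrix would come from the machine's kinematics, which are not reproduced here.

```python
def solve_3x3(A, b):
    """Cramer's-rule solve of a 3x3 linear system (fine at this size)."""
    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det3(A)
    out = []
    for j in range(3):
        m = [row[:] for row in A]
        for i in range(3):
            m[i][j] = b[i]  # replace column j with the right-hand side
        out.append(det3(m) / d)
    return out

def regress_position(C, y):
    """Least-squares estimate of (PX, PY, PZ) from pose coefficient
    rows C and measurements y, via the normal equations
    C^T C p = C^T y. The linear model is a stand-in for the
    regression of block 762."""
    CtC = [[sum(C[k][i] * C[k][j] for k in range(len(C)))
            for j in range(3)] for i in range(3)]
    Cty = [sum(C[k][i] * y[k] for k in range(len(C))) for i in range(3)]
    return solve_3x3(CtC, Cty)

# Four poses over-determine the three offsets; synthetic data
# generated from a known position (1.2, -0.4, 0.8):
C = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0], [1.0, 1.0, 1.0]]
y = [1.2, -0.4, 0.8, 1.6]
px, py, pz = regress_position(C, y)
```

With noisy measurements the same normal equations give the best-fit offsets rather than an exact solution, which is the point of gathering more poses than unknowns.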
Operation 700 proceeds at block 770 where machine 100 is controlled based on the position of one or more sensor(s) 222.
Computer 810 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 810 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media is different from, and does not include, a modulated data signal or carrier wave. It includes hardware storage media including both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 810. Communication media may embody computer readable instructions, data structures, program modules or other data in a transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
The system memory 830 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 831 and random-access memory (RAM) 832. A basic input/output system 833 (BIOS), containing the basic routines that help to transfer information between elements within computer 810, such as during start-up, is typically stored in ROM 831. RAM 832 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 820. By way of example, and not limitation,
The computer 810 may also include other removable/non-removable volatile/nonvolatile computer storage media. By way of example only,
Alternatively, or in addition, the functionality described herein can be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), etc.
The drives and their associated computer storage media discussed above and illustrated in
A user may enter commands and information into the computer 810 through input devices such as a keyboard 862, a microphone 863, and a pointing device 861, such as a mouse, trackball or touch pad. Other input devices (not shown) may include a joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 820 through a user input interface 860 that is coupled to the system bus, but may be connected by other interface and bus structures. A visual display 891 or other type of display device is also connected to the system bus 821 via an interface, such as a video interface 890. In addition to the monitor, computers may also include other peripheral output devices such as speakers 897 and printer 896, which may be connected through an output peripheral interface 895.
The computer 810 is operated in a networked environment using logical connections (such as a local area network (LAN) or a wide area network (WAN)) to one or more remote computers, such as a remote computer 880.
When used in a LAN networking environment, the computer 810 is connected to the LAN 871 through a network interface or adapter 870. When used in a WAN networking environment, the computer 810 typically includes a modem 872 or other means for establishing communications over the WAN 873, such as the Internet. In a networked environment, program modules may be stored in a remote memory storage device.
It should also be noted that the different embodiments described herein can be combined in different ways. That is, parts of one or more embodiments can be combined with parts of one or more other embodiments. All of this is contemplated herein. Although the flow diagrams are shown in a given order, it is contemplated that the steps may be performed in a different order than shown.
Example 1 is a mobile machine comprising:
a rotatable house;
a sensor operably coupled to the rotatable house and configured to provide at least one sensor signal indicative of acceleration of the sensor; and
one or more controllers coupled to the sensor, the one or more controllers being configured to implement:
sensor position determination logic that determines a sensor position of the sensor on the rotatable house based on the sensor signal during a rotation of the rotatable house; and
control signal generator logic that generates a control signal to control the mobile machine, based on the sensor position.
Example 2 is the mobile machine of example 1, wherein the one or more controllers are configured to implement:
motion sequence logic that causes the rotation of the rotatable house to comprise a sequence of rotations and rest states.
Example 3 is the mobile machine of any or all previous examples, wherein the sensor position determination logic determines the sensor position based on a best fit algorithm applied on:
the at least one sensor signal during a rest state; and
the at least one sensor signal during one of the rotations.
Example 4 is the mobile machine of any or all previous examples, further comprising a boom coupled to the rotatable house and a boom sensor coupled to the boom, the boom sensor generates a boom sensor signal indicative of the acceleration of the boom sensor; and
wherein the sensor position determination logic comprises boom sensor position determination logic that determines a boom sensor position based on the boom sensor signal during the rotation of the rotatable house.
Example 5 is the mobile machine of any or all previous examples, wherein the boom sensor position determination logic receives machine geometry data from a datastore and wherein the boom sensor position determination logic determines the sensor position based on the machine geometry data.
Example 6 is the mobile machine of any or all previous examples, wherein the one or more controllers are configured to implement:
pose sequence logic that causes the boom to actuate to one or more poses during the sequence of rotations and rest states.
Example 7 is the mobile machine of any or all previous examples, wherein the one or more poses comprise:
a first pose where the boom is at a first angle; and
a second pose where the boom is at a second angle.
Example 8 is the mobile machine of any or all previous examples, wherein the second angle is approximately 90 degrees offset from the first angle.
Example 9 is the mobile machine of any or all previous examples, further comprising an arm coupled to the boom and an arm sensor coupled to the arm, the arm sensor generating an arm sensor signal indicative of the acceleration of the arm sensor; and
wherein the sensor position determination logic comprises arm sensor position determination logic that determines an arm sensor position based on the arm sensor signal during the rotation of the rotatable house.
Example 10 is the mobile machine of any or all previous examples, wherein the sensor position determination logic generates an interface that allows a user to enter a user input and the sensor position determination logic determines the sensor position based on the user input.
Example 11 is the mobile machine of any or all previous examples, wherein the sensor comprises an IMU.
Example 12 is a method of controlling an excavator, the method comprising:
periodically obtaining sensor signals from a sensor operably coupled to the excavator;
actuating one or more controllable subsystems of the excavator through a series of motions;
determining a sensor location of the sensor based on the sensor signals obtained during the series of motions; and
controlling the excavator based on the sensor location.
Example 13 is the method of any or all previous examples, wherein actuating the one or more controllable subsystems of the excavator through the series of motions comprises:
actuating the one or more controllable subsystems to a first pose;
holding the one or more controllable subsystems at rest in the first pose; and
rotating the excavator while maintaining the first pose.
Example 14 is the method of any or all previous examples, wherein actuating the one or more controllable subsystems of the excavator through the series of motions comprises:
rotating the excavator in a second direction while maintaining the first pose.
Example 15 is the method of any or all previous examples, wherein actuating the one or more controllable subsystems of the excavator through the series of motions comprises:
actuating the one or more controllable subsystems to a second pose;
holding the one or more controllable subsystems at rest in the second pose; and
rotating the excavator while maintaining the second pose.
Example 16 is the method of any or all previous examples, wherein determining the sensor location comprises:
determining a best fit of the sensor signals obtained during the series of motions.
Example 17 is the method of any or all previous examples, wherein actuating the one or more controllable subsystems of the excavator through the series of motions comprises:
actuating the one or more controllable subsystems to a third pose;
holding the one or more controllable subsystems at rest in the third pose; and
rotating the excavator while maintaining the third pose.
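The series of motions recited in Examples 13 through 17 can be sketched as a simple data-driven calibration routine: hold a pose at rest, swing in one direction, swing in the other, then repeat at further poses. The step type, the particular angles, rates, and durations, and the callback interface below are hypothetical illustrations, not values taken from the description.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class CalibrationStep:
    boom_angle_deg: float   # pose to hold during this step
    swing_rate: float       # rad/s; 0.0 indicates a rest state
    duration_s: float       # how long to hold or swing

# Hypothetical sequence mirroring Examples 13-17: rest in a first pose,
# rotate, rotate in a second direction, then repeat in a second pose
# offset approximately 90 degrees from the first.
SEQUENCE: List[CalibrationStep] = [
    CalibrationStep(0.0,  0.0, 5.0),    # first pose, at rest
    CalibrationStep(0.0, +0.5, 10.0),   # rotate while maintaining pose
    CalibrationStep(0.0, -0.5, 10.0),   # rotate in the second direction
    CalibrationStep(90.0, 0.0, 5.0),    # second pose, at rest
    CalibrationStep(90.0, +0.5, 10.0),  # rotate while maintaining pose
]

def run_sequence(steps: List[CalibrationStep],
                 command: Callable[[CalibrationStep], None]) -> None:
    """Issue each step to the machine's controllable subsystems in order;
    the caller's `command` would actuate the boom and swing, then log
    IMU samples for the later best-fit step."""
    for step in steps:
        command(step)
```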
Example 18 is a mobile machine comprising:
a rotatable house;
a boom;
a first IMU sensor coupled to the rotatable house;
a second IMU sensor coupled to the boom;
house sensor position determination logic that determines a position of the first IMU sensor;
boom sensor position determination logic that determines a position of the second IMU sensor; and
a control system that controls the mobile machine based on the position of the first IMU sensor and the position of the second IMU sensor.
Example 19 is the mobile machine of any or all previous examples, wherein the house sensor position determination logic determines the position of the first IMU sensor based on a first sensor signal generated by the first IMU sensor; and
wherein the boom sensor position determination logic determines the position of the second IMU sensor based on a second sensor signal generated by the second IMU sensor.
Example 20 is the mobile machine of any or all previous examples, further comprising:
an arm;
a third IMU sensor coupled to the arm;
arm sensor position determination logic that determines a position of the third IMU sensor based on a third sensor signal generated by the third IMU sensor.
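As one non-limiting sketch of how a control system such as that of Examples 18 through 20 might use the calibrated link sensors: once each IMU's mounting position is known, a pitch angle for the boom and arm can be derived from each IMU's gravity vector, and simple planar forward kinematics gives the arm-tip position. The function, angle convention, and link lengths below are hypothetical.

```python
import math

def tip_position(boom_pitch: float, arm_pitch: float,
                 boom_len: float, arm_len: float) -> tuple:
    """Planar forward kinematics: boom and arm pitch angles (radians,
    measured from horizontal) to the arm-tip position (x forward,
    z up) relative to the boom pivot on the rotatable house."""
    x = boom_len * math.cos(boom_pitch) + arm_len * math.cos(arm_pitch)
    z = boom_len * math.sin(boom_pitch) + arm_len * math.sin(arm_pitch)
    return x, z
```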
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.