The present disclosure generally relates to a system for controlling operation of a machine. More particularly, the present disclosure relates to a system for determining sensor blockage and blind zones around the machine to control the operation of the machine.
Autonomous machines are designed to perform some or all required functions autonomously, i.e., without contemporaneous user input, by relying on sensors. Several systems and methods have been developed to allow such machines to perform these functions. However, the systems and methods used to control autonomous machines depend heavily on how much of a work site can be perceived by the one or more sensors of the autonomous machine.
The machine's ability to perceive its environment is constrained by the sensors' field of view and sensing range, by terrain complexity arising from static and dynamic objects, and by atmospheric conditions such as dust and rain. Accordingly, sections of the terrain might not be visible to the machine. For example, the field of view of a sensor may have missing data, i.e., a portion of the work site not perceived by the sensor. This missing data may arise because an object proximate to the sensor occludes its field of view, because harsh environmental conditions prevent the sensor from detecting an object in its vicinity, or because dusty conditions on the work site prevent the sensor from perceiving a portion of the work site.
Since some sections of the work site may not be perceived, it is challenging for a control system of the autonomous machine to determine an appropriate path to follow to perform the desired operations on the work site.
WO 2011085435A1 discloses gathering data about a work site. The data about the work site pertains to the terrain features and objects present on the ground surface of the work site. Based on the data related to the work site, the height of objects or terrain features is determined.
In one aspect of the present disclosure, a control system for a machine operating at a work site is disclosed. The control system includes a perception system comprising at least one sensor disposed on the machine and configured to generate data signals pertaining to at least one of the machine and an environment associated with the machine. The control system also includes a controller communicably coupled to the perception system for receiving the data signals from the at least one sensor. The controller is configured to determine, from the data signals, terrain features associated with the work site and a presence of one or more objects on the work site or the machine. The controller is further configured to determine a geometry of the at least one sensor and a location of the at least one sensor on the machine, and to generate a field of view for the at least one sensor. The controller is also configured to estimate a cast shadow for at least one object of the one or more objects based on the geometry and the location of the at least one sensor, and to compare the field of view with the cast shadow to determine a sensor blockage for the at least one sensor.
In another aspect of the present disclosure, a machine is disclosed. The machine includes a frame; at least one operational system disposed on the frame and configured to actuate at least one type of operation in the machine; and a perception system comprising at least one sensor disposed on the machine and configured to generate data signals pertaining to at least one of the machine and an environment associated with the machine. The machine further includes a controller communicably coupled to the perception system for receiving the data signals from the at least one sensor. The controller is configured to determine, from the data signals, terrain features associated with the work site and a presence of one or more objects on the work site or the machine. The controller is further configured to determine a geometry of the at least one sensor and a location of the at least one sensor on the machine, and to generate a field of view for the at least one sensor. The controller is also configured to estimate a cast shadow for at least one object of the one or more objects based on the geometry and the location of the at least one sensor, and to compare the field of view with the cast shadow to determine a sensor blockage for the at least one sensor.
In yet another aspect of the present disclosure, a method of controlling operation of a machine at a work site is disclosed. The method includes generating, by at least one sensor disposed on the autonomous machine, data signals pertaining to at least one of the autonomous machine and an environment associated with the autonomous machine; receiving the data signals by a controller communicably coupled to the at least one sensor; determining, from the data signals, terrain features associated with the work site and a presence of one or more objects on the work site or the autonomous machine; determining a geometry of the at least one sensor and a location of the at least one sensor on the autonomous machine; generating a field of view for the at least one sensor; estimating a cast shadow for at least one object of the one or more objects based on the geometry and the location of the at least one sensor; comparing the field of view with the cast shadow to determine a sensor blockage for the at least one sensor; and operating the autonomous machine on the work site based on the sensor blockage.
Reference will now be made in detail to the embodiments of the disclosure, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference number will be used throughout the drawings to refer to the same or like parts.
With reference to
Although the exemplary machine 100 is embodied as a drill in the illustrated embodiment of
Referring to
As shown in
The articulation system 112 may include linkages (not shown) that are coupled to the frame 106 and the work implement 114, as shown in
As shown in
The machine 100 further includes a control system shown and generally indicated by numeral 200 in
Referring to
Further, for the purpose of illustration, the perception system 201, including the at least one sensor 202, is disposed on the cab 128 of the machine 100. However, in various other embodiments, the perception system 201, inclusive of the at least one sensor 202, may be disposed at any other location on the machine 100.
In an embodiment, the at least one sensor 202 may be one or more perception sensors. As an example, a perception sensor may be one or more devices such as, but not limited to, a hall-effect sensor, a light detection and ranging (LIDAR) system, a radio detection and ranging (RADAR) system, a sound navigation and ranging (SONAR) system, one or more visual cameras, and the like. One skilled in the art will acknowledge that any type of sensor known in the art may be implemented in lieu of the perception sensor for performing functions that are consistent with the present disclosure.
The at least one sensor 202 disclosed herein can obtain data from the environment 134 in which the machine 100 is currently located. In an embodiment, the at least one sensor 202 may generate data signals pertaining to terrain features associated with the work site 102 (refer to
The at least one sensor 202 could also generate data signals relating to, for example, a current location of the machine 100 and/or a distance of the machine 100 from a respective one of the one or more objects 138, i.e., stationary or moving objects, respectively. To that end, the at least one sensor 202 could additionally include other types of devices, for example, an altimeter for determining other characteristics of the environment 134, such as an altitude of the work surface 122 or the work site 102 on which the machine 100 is located, or an altitude of the work surface 122 or the work site 102 on which the one or more objects 138 are located. Such additional characteristics associated with the environment 134 may be used in appropriate computations to determine subsequent parameters of interest relating to a positioning of the machine 100, an orientation of the machine 100, and/or a location of the one or more objects 138 on the work site 102 with respect to the machine 100.
As shown in
The controller 204 is also configured to receive the data signals from the at least one sensor 202. In an embodiment, the controller 204 may be a processor. Based on appropriate computation by the controller 204, the characteristics of the environment 134 can be determined, i.e., the terrain features associated with the work site 102 and the presence of one or more objects 138 in the vicinity of the machine 100.
In an embodiment, the controller 204 processes the data signals pertaining to the characteristics of the environment 134 and determines the terrain features on the work site 102. For example, the controller 204 may be able to determine a depression or a ridge on the work surface 122 of the work site 102. Further, the controller 204 processes the data signals pertaining to the one or more objects 138 and determines the shape and contour of the one or more objects 138. Based on the shape and contours of the one or more objects 138, the controller 204 may further classify the one or more objects 138 as a mountain, a large rock, a secondary machine, personnel, etc. For example, the controller 204 may determine the objects 138X and 138Y to be a human operator and a dump truck, respectively. Further, the controller 204 may determine the object 138 in front of the machine 100 to be a rock present proximate to the machine 100, as shown in
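By way of illustration only, the following is a minimal sketch of how a coarse classification of this kind could be expressed in software. The DetectedObject type, the class labels, and the size and motion thresholds are hypothetical assumptions made for this sketch, not values taken from the present disclosure.

```python
# Hypothetical sketch: coarse object classification from shape and motion cues.
from dataclasses import dataclass

@dataclass
class DetectedObject:
    width_m: float   # estimated footprint width (assumed available from the sensor data)
    height_m: float  # estimated height above the work surface
    is_moving: bool  # flagged by tracking the object over successive scans

def classify(obj: DetectedObject) -> str:
    """Very coarse classification; all thresholds are illustrative assumptions."""
    if obj.is_moving and obj.height_m < 2.5:
        return "personnel"
    if obj.is_moving:
        return "secondary machine"   # e.g., a dump truck
    if obj.height_m > 20.0:
        return "mountain"
    if obj.height_m > 0.5:
        return "large rock"
    return "terrain feature"         # e.g., a depression or a ridge

print(classify(DetectedObject(width_m=0.6, height_m=1.8, is_moving=True)))   # personnel
print(classify(DetectedObject(width_m=3.0, height_m=1.2, is_moving=False)))  # large rock
```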
The data signals could be provided, in real-time or near real-time, from the at least one sensor 202 to the controller 204 to control the movement of the machine 100 on the work site 102, as will be described later herein.
Further, the controller 204 is configured to determine the geometry of the at least one sensor 202. For example, the controller 204 may determine the shape and contours of the at least one sensor 202. The controller 204 may also be configured to determine the location of the at least one sensor 202 on the machine 100. The controller 204 then generates a field of view for the at least one sensor 202, for example, the field of view 160 for the first sensor 202a, as shown in
The process of determining the field of view 160 will now be discussed in detail with reference to
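The following is a minimal, hypothetical sketch of one way such a field of view could be computed, assuming a 2-D occupancy-grid representation of the work site and a sensor pose described by a position, heading, angular span, and maximum range. The function name, grid resolution, and all parameter values are illustrative assumptions.

```python
# Hypothetical sketch: field of view as the set of grid cells inside the
# sensor's angular span and maximum range.
import math

def field_of_view(sensor_xy, heading_rad, half_angle_rad, max_range,
                  grid_size, cell_m=1.0):
    """Return the set of grid cells the sensor could see on an open work site."""
    sx, sy = sensor_xy
    visible = set()
    for gx in range(grid_size):
        for gy in range(grid_size):
            dx, dy = gx * cell_m - sx, gy * cell_m - sy
            dist = math.hypot(dx, dy)
            if dist == 0.0 or dist > max_range:
                continue
            bearing = math.atan2(dy, dx)
            # Smallest signed difference between the bearing and the sensor heading.
            off = (bearing - heading_rad + math.pi) % (2 * math.pi) - math.pi
            if abs(off) <= half_angle_rad:
                visible.add((gx, gy))
    return visible

fov = field_of_view(sensor_xy=(0.0, 0.0), heading_rad=0.0,
                    half_angle_rad=math.radians(60), max_range=30.0,
                    grid_size=40)
print(len(fov), "cells visible")
```

A grid is only one possible representation; a polygonal or voxel model of the field of view would follow the same principle.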
The controller 204 then receives data signals pertaining to the one or more objects 138 and/or the machine 100, and determines the shape of the one or more objects 138 and/or the machine 100. Based on the location and geometry of the at least one sensor 202, the controller 204 is configured to estimate the cast shadow (the area that would not be visible to the at least one sensor 202 due to the one or more objects 138 and/or the machine 100) that would be generated due to the contours of the one or more objects 138. For example, as shown in
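Under the same hypothetical grid representation, the cast shadow can be sketched as the set of cells whose line of sight back to the sensor passes through a cell occupied by the object. The ray-marching step count and the example object footprint below are illustrative assumptions.

```python
# Hypothetical sketch: cast shadow as the set of cells occluded by an object.
import math

def cast_shadow(sensor_xy, object_cells, grid_size, cell_m=1.0, steps=100):
    """Return grid cells hidden from the sensor by the given object cells."""
    sx, sy = sensor_xy
    shadow = set()
    for gx in range(grid_size):
        for gy in range(grid_size):
            if (gx, gy) in object_cells:
                continue
            # March along the ray from the sensor toward this cell.
            for i in range(1, steps):
                t = i / steps
                rx = int(round((sx + t * (gx * cell_m - sx)) / cell_m))
                ry = int(round((sy + t * (gy * cell_m - sy)) / cell_m))
                if (rx, ry) == (gx, gy):
                    break                      # reached the cell unobstructed
                if (rx, ry) in object_cells:
                    shadow.add((gx, gy))       # line of sight is blocked
                    break
    return shadow

rock = {(10, 10), (10, 11), (11, 10), (11, 11)}   # hypothetical rock footprint
shadow = cast_shadow(sensor_xy=(0.0, 0.0), object_cells=rock, grid_size=40)
print(len(shadow), "cells in the cast shadow")
```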
It may be contemplated that, for different locations on the work site 102 or different orientations of the machine 100 with respect to the one or more objects 138, the cast shadow generated by the one or more objects 138 may be different. For example, referring to
After estimating the cast shadow generated by the one or more objects 138 for the at least one sensor 202, the controller 204 is configured to compare the field of view 160 of the at least one sensor 202 with the cast shadow 170 of the one or more objects 138 (as shown in
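Under the grid representation assumed above, this comparison reduces to simple set operations: the sensor blockage is the portion of the field of view that falls inside the cast shadow. The example cell sets below are illustrative stand-ins for grids produced by the earlier sketches.

```python
# Hypothetical sketch: sensor blockage as the intersection of the field of
# view with the cast shadow. The cell sets are illustrative placeholders.
fov = {(x, y) for x in range(20) for y in range(20)}             # visible cells
shadow = {(x, y) for x in range(12, 16) for y in range(12, 16)}  # occluded cells

blockage = fov & shadow    # cells in view, yet hidden behind the object
missing = shadow - fov     # occluded cells that were outside the view anyway
print(len(blockage), "blocked cells;", len(missing), "outside the field of view")
```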
In an embodiment, the controller 204 may estimate the field of view of the first sensor 202a for an alternate location 142 on the work site 102 based on the sensor blockage 156 for the first sensor 202a (at the location 140 on the work site 102).
In an embodiment, the controller 204 modifies the field of view of the at least one sensor 202 based on the sensor blockage and the cast shadow of the one or more objects 138. For example, as shown in
In an embodiment, after determining the sensor blockage 156 for the first sensor 202a (as shown in
In the embodiment illustrated, the field of view 160 for the first sensor 202a and the field of view 190 for the second sensor 202b encapsulate the same portion of the work site 102. However, in alternate embodiments, the fields of view 160 and 190 may encapsulate different portions of the work site 102. The controller 204 then compares the field of view 160 of the first sensor 202a and the field of view 190 of the second sensor 202b and determines a net sensor blockage 192 and net missing data 194 for the first sensor 202a and the second sensor 202b by forming a total cumulative field of view 220 for the perception system 201, as shown in
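Continuing the hypothetical grid representation, fusing two sensors can be sketched with set operations: the cumulative field of view is the union of each sensor's unblocked view, and the net sensor blockage is the region hidden from both sensors. The cell sets below are illustrative assumptions.

```python
# Hypothetical sketch: cumulative field of view and net blockage for two sensors.
fov_a = {(x, y) for x in range(30) for y in range(30)}        # e.g., sensor 202a
fov_b = {(x, y) for x in range(5, 35) for y in range(30)}     # e.g., sensor 202b
blocked_a = {(x, y) for x in range(10, 14) for y in range(10, 14)}
blocked_b = {(x, y) for x in range(12, 16) for y in range(10, 14)}

visible_a = fov_a - blocked_a
visible_b = fov_b - blocked_b
cumulative_fov = visible_a | visible_b   # total cumulative field of view
net_blockage = blocked_a & blocked_b     # hidden from both sensors
print(len(cumulative_fov), "cells visible;",
      len(net_blockage), "cells hidden from both sensors")
```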
Based on the calculated net sensor blockage 192 and the net missing data 194 for the first sensor 202a and the second sensor 202b, the controller 204 determines the blind zone for the machine 100. The blind zone may be defined as at least a portion of the work site 102 visible to neither the first sensor 202a nor the second sensor 202b. The net sensor blockage 192 and the net missing data 194 include the portion of the work site 102 that is visible to neither sensor. Accordingly, the net sensor blockage 192 and the net missing data 194 together constitute the blind zone for the machine 100.
In an embodiment, the controller 204 may determine only the net sensor blockage 192. For example, the controller 204 may be able to determine the one or more objects 138, the characteristics of the environment 134, and the terrain features within the net missing data 194, but not within the net sensor blockage 192. In that case, only the net sensor blockage 192 is the portion of the work site 102 visible to neither the first sensor 202a nor the second sensor 202b. Accordingly, the blind zone will be only the net sensor blockage 192.
Further, the controller 204 may be configured to determine whether the one or more objects 138 have moved into, or are in, the blind zone for the machine 100 based on the location of the machine 100. For example, after determining the blind zone, the machine 100 may have to move to another location (from the location 140 to the location 142). Based on the location of the machine 100, the controller 204 detects the one or more objects 138 that would be lying in the blind zone for the machine 100 due to the change in location of the machine 100. Thus, it can be determined whether the one or more objects 138 and a portion of the terrain of the work site 102 are lying in the blind zone of the machine 100, i.e., are not visible to the at least one sensor 202 due to the blind zone for the machine 100.
In an alternate embodiment, the perception system 201 may be configured to continuously collect data signals pertaining to the one or more objects 138 on the work site 102. The controller 204 may be configured to receive these data signals and detect whether the one or more objects 138 have moved from one location to another location on the work site 102. For example, the perception system 201 uses the at least one sensor 202 to gather data signals pertaining to the objects 138X and 138Y on the work site 102 at different time intervals. The controller 204 then receives the data signals pertaining to the objects 138X and 138Y at different time intervals and applies certain algorithms to the data signals to determine the location of the objects 138X and 138Y at each time interval. If the location of the objects 138X and 138Y changes across the time intervals, then the controller 204 concludes that the objects 138X and 138Y are moving objects.
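A minimal sketch of such a motion check follows, assuming the controller has reduced each object's data signals to a sequence of estimated (x, y) locations at successive time intervals; the displacement tolerance and the sample tracks are illustrative assumptions.

```python
# Hypothetical sketch: flag an object as moving if its estimated location
# shifts by more than a small tolerance between successive scans.
import math

def is_moving(track, tolerance_m=0.5):
    """track: list of (x, y) locations at successive time intervals."""
    return any(
        math.dist(track[i], track[i + 1]) > tolerance_m
        for i in range(len(track) - 1)
    )

truck_track = [(50.0, 20.0), (48.5, 20.2), (46.9, 20.5)]   # e.g., object 138Y
rock_track = [(12.0, 8.0), (12.1, 8.0), (12.0, 8.1)]       # a static object
print(is_moving(truck_track), is_moving(rock_track))        # True False
```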
In an alternate embodiment, the machine 100 may have to move to a plurality of planned positions on the work site 102. However, as discussed above, the one or more objects 138 may have been detected as being moving objects. After determining that the one or more objects 138 are moving objects, the controller 204 determines the end location of the moving objects to determine whether the one or more objects 138 have moved into the blind zone for the machine 100 at the plurality of positions on the work site 102.
Further, the controller 204 may be configured to determine one or more locations of the machine 100 on the work site 102 at which the one or more objects 138 may be occluded from the field of view of the at least one sensor 202. For example, during operation, the machine 100 may have to move from one location to another location (from the location 140 to the location 142 on the work site 102). The controller 204 may have determined the sensor blockage for the at least one sensor 202. Further, during operation, the controller 204 may have determined the location of the one or more objects 138 on the work site 102. The controller 204 then determines whether the sensor blockage for the at least one sensor 202 is obstructing a portion of the field of view for the at least one sensor 202 wherefrom the one or more objects 138 would have been visible.
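A minimal sketch of this occlusion check follows, assuming a blockage region has already been computed for a candidate machine location (for example, by the field-of-view and cast-shadow sketches above); the blockage cells and object locations below are hypothetical.

```python
# Hypothetical sketch: test whether a known object falls inside the sensor
# blockage computed for a candidate machine location.
def object_occluded(object_cell, blockage_cells):
    """True if the object's grid cell lies inside the sensor blockage."""
    return object_cell in blockage_cells

# Hypothetical blockage for the machine at an alternate location such as 142.
blockage_at_142 = {(x, y) for x in range(20, 24) for y in range(6, 10)}
print(object_occluded((21, 7), blockage_at_142))   # True: object would be hidden
print(object_occluded((5, 5), blockage_at_142))    # False: object stays in view
```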
The controller 204 may also be configured to actuate movement of the machine 100 based on the sensor blockage and the blind zones for the at least one sensor 202. The controller 204 may also be configured with an appropriate set of capabilities for interpreting the presence of the one or more objects 138 as ‘obstacles’ when determining a path of travel for the machine 100 on the basis of the determined characteristics of the environment 134. In such embodiments, the controller 204 is further configured to determine a path of travel for the machine 100 on the basis of the detected obstacles, as represented by the one or more objects 138 present in the vicinity of the machine 100. For example, the at least one sensor 202 may generate data signals indicative of a detection of a tree on the work site 102. Based on inputs from the perception system 201, the controller 204 may correspondingly navigate the machine 100 by appropriately commanding the drive system 108, the steering system 118, the braking system, and the articulation system 112 so that the machine 100 avoids coming into contact or colliding with the tree on the work site 102.
Autonomous machines are designed to perform some or all required functions autonomously (using sensors), i.e., without contemporaneous user input. The systems and methods used to control the autonomous machines are highly dependent on the area of a work site that can be perceived by the sensors of the autonomous machine. The machine's ability to perceive its environment is constrained by the sensors' field of view and sensing range, by terrain complexity arising from static and dynamic objects, and by atmospheric conditions (dust, rain). Accordingly, large sections of the terrain might not be visible to the machine.
In an aspect of the present disclosure, a control system 200 for a machine is disclosed. The control system 200 comprises a perception system 201 comprising at least one sensor 202 and a controller 204. The perception system 201 uses the at least one sensor 202 to gather data signals pertaining to the characteristics of the environment 134. The controller 204 receives the data signals pertaining to the environment 134.
In an aspect of the present disclosure, a method 1100 for controlling an operation of a machine, for example, the machine 100, is disclosed, as shown in
Further, with implementation of the embodiments disclosed herein, several machines known to persons skilled in the art can be beneficially rendered autonomous with regard to the functions required on a given work site. With regard to the drilling industry, use of the embodiments disclosed herein can help vendors reduce costs, at least in part, through the autonomous operation of drilling machines on a given work site.
While aspects of the present disclosure have been particularly shown and described with reference to the embodiments above, it will be understood by those skilled in the art that various additional embodiments may be contemplated by the modification of the disclosed machines, systems and methods without departing from the spirit and scope of what is disclosed. Such embodiments should be understood to fall within the scope of the present disclosure as determined based upon the claims and any equivalents thereof.