The present disclosure relates generally to warning zone systems for work vehicles, and, more particularly, to a system and method for configuring worksite warning zones.
Improving operator safety at industrial worksites, such as construction worksites, is important. To improve operator safety, worksite owners have implemented a variety of safety systems to reduce worksite hazards and to increase the safety of operators within the vehicle, as well as outside the vehicle and around the worksite.
For example, some conventional approaches and techniques employ radar sensors to mitigate safety hazards. Drawbacks of such systems include inaccurate and limited sensing capabilities and false detection warnings, which can lead an operator to disengage or deactivate the system based on the false warnings. One way to improve upon these systems is to enable the operator to define warning zones using the machine itself. Therefore, to overcome these drawbacks, there is a need in the art for a robust and improved warning zone system that provides increased sensing accuracy and substantially real-time monitoring and warning zone configuration.
According to an aspect of the present disclosure, a warning zone system for a work vehicle is disclosed. The warning zone system comprises an object detection system arranged on a work vehicle, wherein the object detection system is configured to detect and classify object obstructions located at a worksite; a zone configuration system, wherein the zone configuration system is configured to associate position data with the object obstructions and generate object models of the object obstructions based on the associated position data; and an electronic data processor communicatively coupled to each of the object detection system and the zone configuration system, wherein the electronic data processor is configured to generate and associate warning zones with the object models for display on a user display in substantially real-time.
According to another aspect of the present disclosure, a method is disclosed. The method comprises capturing at least one image of an object obstruction arranged in a worksite; classifying the object obstruction based on a plurality of object characteristics; associating position data with the object obstruction; generating a model of the object obstruction; generating and associating one or more warning zones with the object obstruction; and displaying the warning zones on a user display in substantially real-time.
Other features and aspects will become apparent by consideration of the detailed description and accompanying drawings.
The detailed description of the drawings refers to the accompanying figures in which:
Like reference numerals are used to indicate like elements throughout the several figures.
Referring to
The work vehicle 100 can comprise a frame assembly comprising a first frame 102 (e.g., a front frame) and a second frame 104 (e.g., a rear frame) structurally supported by wheels 106, 108. An operator cab 110, which includes a variety of control mechanisms accessible by a vehicle operator, can be mounted to the first frame 102. An engine 112 can be mounted to the second frame 104 and arranged to drive the wheels 108 at various speeds via coupling through a drive transmission (not shown). As shown in
With reference to
As will be appreciated by those skilled in the art,
Referring to
As shown in
For example, with reference to
The electronic data processor 202 can be arranged locally as part of a vehicle electronics unit 200 of the work vehicle 100 or remotely at a remote processing center (not shown). In various embodiments, the electronic data processor 202 can comprise a microprocessor, a microcontroller, a central processing unit, a programmable logic array, a programmable logic controller, or other suitable programmable circuitry that is adapted to perform data processing and/or system control operations. For example, the electronic data processor 202 can be configured to associate a plurality of warning zones 501 (
With continued reference to
The data storage device 204 stores information and data (e.g., geocoordinates or mapping data) for access by the electronic data processor 202 or the vehicle data bus 220. The data storage device 204 can similarly comprise electronic memory, nonvolatile random-access memory, an optical storage device, a magnetic storage device, or another device for storing and accessing electronic data on any recordable, rewritable, or readable electronic, optical, or magnetic storage medium.
The location-determining receiver 218 may comprise a receiver that uses satellite signals, terrestrial signals, or both to determine the location or position of an object or the vehicle. In one embodiment, the location-determining receiver 218 comprises a Global Positioning System (GPS) receiver with a differential correction receiver for providing precise measurements of the geographic coordinates or position of the work vehicle 100. The differential correction receiver may receive satellite or terrestrial signal transmissions of correction information from one or more reference stations with generally known geographic coordinates to facilitate improved accuracy in the determination of a location for the GPS receiver. In other embodiments, localization and mapping techniques such as simultaneous localization and mapping (SLAM) can be employed. For example, in low receptivity areas and/or indoor environments such as caves, mines, or urban worksites, SLAM techniques can be used to improve positioning accuracy within those areas. Additionally, in other alternative embodiments, sensors such as gyroscopes and accelerometers can be used collectively with or independently of the location-determining receiver 218 to map distances and angles to the images captured by the object detection system 152.
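As a rough illustration of mapping sensed distances and angles to geographic positions, the sketch below projects an object's coordinates from the vehicle's position fix plus a sensor-measured bearing and range. The helper name and the flat-earth approximation are assumptions for illustration only, valid over short worksite distances, and are not part of the disclosure.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius, metres

def project_object_position(veh_lat, veh_lon, bearing_deg, range_m):
    """Estimate an object's latitude/longitude from the vehicle's
    position fix plus a sensor-measured bearing (degrees clockwise
    from north) and range (metres), using a small-distance
    flat-earth approximation."""
    d_north = range_m * math.cos(math.radians(bearing_deg))
    d_east = range_m * math.sin(math.radians(bearing_deg))
    d_lat = math.degrees(d_north / EARTH_RADIUS_M)
    d_lon = math.degrees(d_east / (EARTH_RADIUS_M * math.cos(math.radians(veh_lat))))
    return veh_lat + d_lat, veh_lon + d_lon

# An object sensed 100 m due east of the vehicle.
obj_lat, obj_lon = project_object_position(41.587, -93.620, 90.0, 100.0)
```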
The electronic data processor 202 manages the data transfer between the various vehicle systems and components, which, in some embodiments, can include data transfer to and from a remote processing system (not shown). For example, the electronic data processor 202 collects and processes data (e.g., object characteristic data and mapping data) from the data bus 208 for transmission either in a forward or rearward direction.
The electronic device 206 can comprise electronic memory, nonvolatile random-access memory, flip-flops, a computer-writable or computer-readable storage medium, or another electronic device for storing, retrieving, reading or writing data. The electronic device 206 can include one or more software modules that record and store data collected by the object detection system 152, the zone configuration system 154, or other network devices coupled to or capable of communicating with the vehicle data bus 220, or another sensor or measurement device for sending or measuring parameters, conditions or status of the vehicle electronics unit 200, vehicle systems, or vehicle components. Each of the modules can comprise executable software instructions or data structures for processing by the electronic data processor 202. As shown in
The term module as used herein may include a hardware and/or software system that operates to perform one or more functions. Each module can be realized in a variety of suitable configurations and should not be limited to any particular implementation exemplified herein, unless such limitations are expressly called out. Moreover, in the various embodiments described herein, each module corresponds to a defined functionality; however, in other embodiments, each functionality may be distributed to more than one module. Likewise, in other embodiments, multiple defined functionalities may be implemented by a single module that performs those multiple functions, possibly alongside other functions, or distributed differently among a set of modules than specifically illustrated in the examples herein.
The object detection module 230 records and stores near real-time imaging data collected by the object detection system 152. For example, the object detection module 230 can identify and associate one or more object characteristics 126 such as dimensions, colors, or geometric configurations with the captured images. In some embodiments, the object detection module 230 can identify the object by comparing and associating the captured image to stored data such as metadata 135, image data, or video data.
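The characteristic-based matching described above can be sketched as a comparison of measured characteristics against stored reference entries. The entries and thresholds below are illustrative stand-ins for the stored metadata 135, not values taken from the disclosure.

```python
# Hypothetical reference entries standing in for stored metadata.
REFERENCE_OBJECTS = [
    {"type": "person", "mobile": True, "height_range_m": (0.5, 2.2)},
    {"type": "pile", "mobile": False, "height_range_m": (0.5, 5.0)},
    {"type": "utility pole", "mobile": False, "height_range_m": (5.0, 15.0)},
]

def classify_obstruction(height_m, moving):
    """Match measured characteristics (height, mobility) against the
    stored references; fall back to 'unknown' when nothing fits."""
    for ref in REFERENCE_OBJECTS:
        lo, hi = ref["height_range_m"]
        if moving == ref["mobile"] and lo <= height_m <= hi:
            return ref["type"]
    return "unknown"
```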
A mapping module 232 can access the object detection module 230 and associate the identified object obstructions 114 with one or more coordinates or geographic locations. For example, in some embodiments, the mapping module 232 can generate two-dimensional (2D) or three-dimensional (3D) object models 124 of detected object obstructions 114 by utilizing imagery data such as mesh data, location data, coordinate data, or others. In other embodiments, the mapping module 232 can map the entire worksite 10 in 2D or 3D format including the generated 2D or 3D object models 124 of the identified object obstructions 114.
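A minimal stand-in for the 2D object models 124 might reduce an obstruction's ground coordinates to a rectangular footprint; the helper below is a simplified sketch, not the mesh-based modeling the disclosure contemplates.

```python
def bounding_footprint(points):
    """Reduce a set of (x, y) ground coordinates belonging to one
    obstruction to an axis-aligned rectangular footprint, a minimal
    stand-in for a 2D object model."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return {"min_x": min(xs), "min_y": min(ys),
            "max_x": max(xs), "max_y": max(ys)}
```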
The zone configuration module 234 can associate the generated 2D and 3D object models 124 with warning zones 501. For example, in one embodiment, the zone configuration module 234 can characterize detected object obstructions 114 as active warning zones 501 or operator zones that include one or more site operators or pedestrians located within the zones. This, in turn, can alert a vehicle operator to change course or halt operations of the work vehicle 100. In other embodiments, the zone configuration module 234 can define object obstructions 114 as hazardous or impassable and generate warning alerts 503 notifying a vehicle operator that such zone should not be traveled through during operation of the work vehicle 100.
In additional embodiments, the grade control module 236 can control the orientation of the blade assembly 116. For example, the grade control module 236 can utilize GPS data to adjust a position and orientation of the blades 118 of the blade assembly 116 and output corresponding coordinate data to the mapping module 232.
The vehicle data bus 220 supports communications between one or more of the following components: a vehicle controller 222, the object detection system 152, the zone configuration system 154, a grade control system 226, and the electronic data processor 202 via a wireless communication interface 216.
The vehicle controller 222 can comprise a device for steering or navigating the work vehicle 100 consistent with the grade control system 226 or other instructions provided by the vehicle operator based on feedback received from the object detection system 152 or the zone configuration system 154. For example, the grade control system 226 can receive one or more position signals from the location-determining receiver 218 arranged on the work vehicle 100 (e.g., on the operator cab 110). Additionally, the grade control system 226 can determine a location of the blades 118 and generate command signals communicated to the vehicle controller 222 to change a position of the blades 118 based on signals received from the location-determining receiver 218. Once the data is received, the electronic data processor 202 can execute software stored in the grade control module 236 to allow the position data 122 to be mapped to the captured images or cross-referenced with stored maps or models. It should be noted that, in some embodiments, the grade control system 226 can comprise a collection of stored maps and models.
Referring now to
At 304, the object detection module 230 can classify the images into various categories based on a plurality of object characteristics 126 such as object type 128 (e.g., person, pile, etc.), object size 130, object location 132, combinations thereof, or other suitable object identifying characteristics. In other embodiments, various artificial intelligence and machine learning techniques can be employed to generate the classified data based, for example, on one or more neural networks. Additionally, in other alternative embodiments, an operator may classify the images via a user interface arranged on a portable device such as a mobile phone or tablet.
Next at 306, the electronic data processor 202 can access the mapping module 232 and generate 2D or 3D models of the captured images by associating the identified object obstructions 114 with one or more coordinates or geographic locations as discussed above with reference to
At 308, 2D or 3D models of the detected object obstructions 114 are generated by utilizing imagery data such as mesh data, location data, coordinate data, or others. The mapping module 232 can also input positioning data received directly from the location-determining receiver 218 or from the grade control system 226.
In some embodiments, the electronic data processor 202 can receive or transfer information to and from other processors or computing devices. For example, the mapped information stored by the electronic data processor 202 can be received from or transferred to other computers, and data collected from the imaging devices 153 arranged on the work vehicle 100 may be transferred to a processor on another work vehicle 100. In some embodiments, the information may be transmitted via a network to a central processing computer for further processing. For example, a first vehicle may store a computerized model of a worksite (i.e., a map of the worksite) and the work to be performed at the worksite by the implements.
Once a desired number of object obstructions 114 have been detected and mapped, the electronic data processor 202 can use such information to define one or more worksite warning zones 501 via the zone configuration module 234. The zone configuration module 234 can communicate with the mapping module 232 to classify and associate warning signals with the 2D and/or 3D models (i.e., generate worksite warning zones). In some embodiments, the worksite warning zones 501 can be classified as active (mobile) or inactive (stationary) depending upon the characteristics or features of the object obstructions 114 detected in the worksite 10. For example, object obstructions 114 such as site operators or pedestrians detected within the worksite 10 can be characterized as active, whereas object obstructions 114 such as ponds, buildings, or utility poles can be characterized as inactive. Additionally, each of the object obstructions 114 can be further characterized as hazardous or non-hazardous based on the associated data.
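The active/inactive and hazardous/non-hazardous characterization described above might be sketched as a simple rule table. The type sets below are illustrative assumptions drawn from the examples given, not an exhaustive mapping from the disclosure.

```python
# Illustrative rule sets; real systems would derive these from the
# classified object characteristics.
MOBILE_TYPES = {"person", "site operator", "pedestrian"}
HAZARDOUS_TYPES = {"pond", "person", "site operator", "pedestrian"}

def configure_zone(object_type):
    """Tag an obstruction's warning zone as active (mobile) or
    inactive (stationary), and as hazardous or non-hazardous."""
    return {
        "state": "active" if object_type in MOBILE_TYPES else "inactive",
        "hazard": object_type in HAZARDOUS_TYPES,
    }
```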
In some embodiments, the electronic data processor 202 can query the detailed map information stored on the data storage device 204 to determine whether there is a warning zone 501 associated with the location of the identified first object. As previously discussed with reference to
Once the 2D and/or 3D models and corresponding warning zones 501 are generated, at 310, the electronic data processor 202 can again execute the zone configuration module 234 to generate one or more warning alerts 503 associated with the warning zones. At 312, in some embodiments, the warning alerts can be displayed on the user display 210 when the work vehicle 100 is proximate or within a predetermined range of the warning zones. For example, as shown in
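The proximity condition that gates display of a warning alert 503 can be sketched as a distance check against a zone boundary plus a predetermined margin. The circular-zone geometry and the margin value below are assumptions for illustration only.

```python
import math

def within_warning_range(vehicle_xy, zone_center_xy, zone_radius_m,
                         margin_m=10.0):
    """Return True when the vehicle is inside the zone or within a
    predetermined margin of its boundary, i.e. when an alert should
    be pushed to the user display."""
    dx = vehicle_xy[0] - zone_center_xy[0]
    dy = vehicle_xy[1] - zone_center_xy[1]
    return math.hypot(dx, dy) <= zone_radius_m + margin_m
```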
Without in any way limiting the scope, interpretation, or application of the claims appearing below, a technical effect of one or more of the example embodiments disclosed herein is a system for configuring worksite warning zones. The zone configuration system is particularly advantageous in that it allows for near real-time configuration of worksite warning zones based on a detection of one or more object obstructions.
While the above describes example embodiments of the present disclosure, these descriptions should not be viewed in a limiting sense. Rather, other variations and modifications may be made without departing from the scope and spirit of the present disclosure as defined in the appended claims.
Number | Date | Country
---|---|---
62850846 | May 2019 | US