The present disclosure relates generally to object detection systems, and, more particularly, to an object detection system and method for off-road or industrial vehicles.
In industrial applications, worksite procedures are important for operators, workmen, and other personnel located within the worksite. Generally, industrial standards require that specialized protection equipment be worn. Additionally, effective detection mechanisms are desired to identify the entry of an operator or workman within an area near heavy machinery or equipment.
To address the desire for detection mechanisms, some conventional approaches employ RFID sensors or retroreflective sensors to detect objects, including people, based upon reflective technology. Drawbacks of such approaches include limited scalability through software, as well as ineffective and limited object differentiation.
As such, there is a need in the art for an improved object detection system that provides increased detection accuracy.
According to an aspect of the present disclosure, an object detection system and method are disclosed. The object detection system includes at least one image capturing device operably coupled to a work vehicle, wherein the at least one image capturing device is configured to capture images of one or more worksite objects associated with a workman. An electronic data processor is communicatively coupled to the at least one image capturing device, the electronic data processor including an object recognition device that is configured to process images received by the image capturing device. A computer readable storage medium comprises machine readable instructions that, when executed by the electronic data processor, cause the object recognition device to: associate a plurality of identifying indicia with the worksite objects; determine an object type of the worksite objects based on the plurality of identifying indicia; and characterize a workman located within a vicinity of the work vehicle based on the determined object type.
According to another aspect of the present disclosure, a work vehicle having an object detection system is disclosed. The work vehicle includes a frame; a plurality of ground engaging elements coupled to the frame; and at least one image capturing device operably coupled to the frame. The at least one image capturing device captures images of one or more worksite objects. An electronic data processor is communicatively coupled to the at least one image capturing device, the electronic data processor comprising an object recognition device configured to process images received by the image capturing device. A computer readable storage medium comprises machine readable instructions that, when executed by the electronic data processor, cause the object recognition device to: associate a plurality of identifying indicia with the worksite objects; determine an object type of the worksite objects based on the plurality of identifying indicia; and characterize a workman located within a vicinity of the work vehicle based on the determined object type.
According to another aspect of the present disclosure, a method is disclosed. The method includes capturing, with an image capturing device, one or more images of at least one worksite object; associating a plurality of identifying indicia with the worksite object; determining an object type of the worksite object based on the plurality of identifying indicia; classifying the object type into one or more categories based on a work task associated with the object type; and characterizing a workman located within a vicinity of a work vehicle based on the determined object type.
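By way of a non-limiting illustration, the method steps can be sketched as the short routine below; the "recognizer" and "classifier" objects and their methods are hypothetical placeholders for the recognition and classification modules described later in the detailed description, not part of the disclosure itself.

```python
# Non-limiting sketch of the method steps; "recognizer" and "classifier" are
# hypothetical placeholders for the recognition and classification modules
# described in the detailed description.
def detect_and_characterize(image, recognizer, classifier):
    """One pass of the method for a single captured image frame."""
    results = []
    for obj in recognizer.find_worksite_objects(image):   # capture / segment worksite objects
        indicia = recognizer.extract_indicia(obj)          # associate identifying indicia
        object_type = recognizer.determine_type(indicia)   # determine the object type
        category = classifier.categorize(object_type)      # classify by associated work task
        results.append((object_type, category))
    return classifier.characterize_workman(results)        # characterize any nearby workman
```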
Other features and aspects will become apparent by consideration of the detailed description and accompanying drawings.
The detailed description of the drawings refers to the accompanying figures, in which like reference numerals are used to indicate like elements throughout the several figures.
Referring to
The work vehicle 100 can comprise a frame 102 and an operator cab 104 supported by wheels 108 or tracks (not shown). A boom assembly 114 can be coupled to the frame 102 and can extend in length between a proximal end 113 and a distal end 115. A bucket structure 116 can be coupled to the boom assembly 114 at its distal end 115 and can comprise a conventional loader bucket as shown. It should be noted, however, that
As illustrated in
The image capturing devices 155 can be mounted to a front of the work vehicle 100 to capture images of one or more worksite objects 125 associated with or worn by a workman located within the worksite 170. In some embodiments, the image capturing devices 155 can have a field of view forward of or to the side of the work vehicle 100. For example, the image capturing device 155 can have a wide field of view that spans approximately 180 to 360 degrees along a center axis of the image capturing device 155 or a structural support attached thereto within a defined range. In other embodiments, one or more of the image capturing devices 155 can be optionally mounted to a rear of the work vehicle 100 in a direction opposite a direction of travel 160 (refer, e.g., to
The electronic data processor 152 can be arranged locally as part of a vehicle electronics unit 200 of the work vehicle 100 (
As will be appreciated by those skilled in the art,
Referring now to
As depicted, the various devices (i.e., vehicle data storage device 206, vehicle wireless communications device 212, user interface 156, and vehicle data bus 204) may communicate information, e.g., signals such as image data over the main data bus 202 to the electronic data processor 152. In other embodiments, the electronic data processor 152 can manage the transfer of data to and from a remote processing system 222 via a network 225 and wireless infrastructure 220. For example, the electronic data processor 152 can collect and process the image data from the main data bus 202 for transmission either in a forward or rearward direction (i.e., to or from processing system 222).
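As a purely hypothetical illustration of such a transfer, collected image data could be forwarded from the electronic data processor 152 to the remote processing system 222 over a network connection; the transport, address, and length-prefix framing below are assumptions for demonstration only and are not details of the disclosure.

```python
# Hypothetical illustration only: forwarding one collected image frame to a
# remote processing system over a plain TCP connection. The address, port, and
# length-prefix framing are assumptions for demonstration, not disclosed details.
import socket
import struct

def forward_frame(frame_bytes, remote_addr=("192.0.2.10", 5000)):
    """Send a single length-prefixed image frame to the remote processing system."""
    with socket.create_connection(remote_addr, timeout=2.0) as sock:
        sock.sendall(struct.pack(">I", len(frame_bytes)) + frame_bytes)
```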
The vehicle data storage device 206 stores information and data for access by the electronic data processor 152 or the vehicle data bus 204. The vehicle data storage device 206 can comprise electronic memory, nonvolatile random-access memory, an optical storage device, a magnetic storage device, or another device for storing and accessing electronic data on any recordable, rewritable, or readable electronic, optical, or magnetic storage medium. Additionally, the vehicle data storage device 206 may include one or more software modules or data structures that record and store data collected by the image capturing device 155 or other network devices coupled to or capable of communicating with the vehicle data bus 204. For example, in some embodiments, the one or more software modules and/or data structures can comprise a color recognition module 207, a shape recognition module 209, a visual characteristic module 211, and a classification module 213 as will be discussed with reference to
Referring to
In some embodiments, the color recognition module 207 can receive and process color data, such as data associated with reflective materials of the worksite objects 125, detected in each of the images captured by the image capturing device 155. For example, the color recognition module 207 can be configured to compare color pixels of images of the worksite objects 125 with color data stored in the vehicle data storage device 206 to determine a single color or an array of colors of the worksite objects 125. In other embodiments, the color recognition module 207 can comprise an extraction or sampling circuit (not shown) that extracts color data associated with the captured images.
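A minimal sketch of such a color comparison is given below, assuming OpenCV (cv2) and NumPy; the HSV ranges for high-visibility colors and the 5% coverage threshold are illustrative assumptions standing in for color data stored in the vehicle data storage device 206, not values taken from the disclosure.

```python
# Minimal sketch of the color comparison described above, assuming OpenCV (cv2)
# and NumPy. The HSV ranges and the 5% coverage threshold are illustrative
# assumptions, not values taken from the disclosure.
import cv2
import numpy as np

STORED_COLOR_RANGES = {  # stand-in for color data stored in device 206
    "high_vis_yellow": ((25, 120, 120), (35, 255, 255)),
    "high_vis_orange": ((5, 120, 120), (15, 255, 255)),
}

def recognize_colors(image_bgr):
    """Return the stored colors whose pixels cover enough of the captured image."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    matches = []
    for name, (lower, upper) in STORED_COLOR_RANGES.items():
        mask = cv2.inRange(hsv, np.array(lower), np.array(upper))
        if cv2.countNonZero(mask) / mask.size > 0.05:  # object fills at least ~5% of frame
            matches.append(name)
    return matches
```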
The shape recognition module 209 communicates with the color recognition module 207 to determine a shape of the worksite objects 125 based on the color data. For example, the shape recognition module 209 can associate various patterns, dimensions, and image coordinates with the color data to determine a geometrical configuration and orientation of the worksite objects 125. Additionally, in some embodiments, the shape recognition module 209 can determine the geometrical configuration of the worksite objects 125 by comparing and/or collating one or more shapes identified within the received images with models stored in the vehicle data storage device 206. The visual characteristic module 211 correlates visual data such as visual cues, environmental data, position data, and other visual data with the color and shape data received from each of the color and shape recognition modules 207, 209. The visual cues can include, without limitation, object size, signage, hand gestures and/or signaling, standing postures of workmen, hand objects (e.g., tools), or other visual indicators associated with the workmen. Workmen located within a vicinity of the work vehicle 100 can perform a variety of tasks that require the use of visual cues. For example, some workmen may be required to hold signs or provide hand gestures to an operator of the work vehicle 100 to provide directional guidance. In some embodiments, the visual characteristic module 211 compares the visual data with stored data to determine one or more characteristics of the worksite object 125.
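The shape-matching portion of this behavior could be sketched as follows, assuming OpenCV 4.x; the reference contour models and the 0.2 similarity threshold are hypothetical placeholders for shape models stored in the vehicle data storage device 206.

```python
# Illustrative sketch of shape matching against stored reference contours,
# assuming OpenCV 4.x. The reference models and the 0.2 similarity threshold
# are hypothetical placeholders for models stored in device 206.
import cv2

def recognize_shape(color_mask, reference_contours):
    """Match the largest contour in a color mask to a stored model and return
    (model_name, bounding_box) in image coordinates, or None if nothing matches."""
    contours, _ = cv2.findContours(color_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    best_name, best_score = None, float("inf")
    for name, model in reference_contours.items():
        score = cv2.matchShapes(largest, model, cv2.CONTOURS_MATCH_I1, 0.0)
        if score < best_score:
            best_name, best_score = name, score
    if best_score > 0.2:  # too dissimilar from every stored model
        return None
    return best_name, cv2.boundingRect(largest)  # geometry plus location in the image
```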
In some embodiments, the classification module 213 may comprise a classifier unit or other device that is configured to classify and associate data received from each of the color recognition module 207, shape recognition module 209, and visual characteristic module 211 to generate 2D or 3D models of the worksite objects 125. For example, the classification module 213 can classify each of the worksite objects 125 into one or more categories or subgroups based on an identified object type (e.g., hat, vest, sign, etc.) and a workman associated with the object type.
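One hedged sketch of this classification step is shown below; the category table and the rule that detected safety garments or signaling imply a nearby workman are illustrative assumptions layered on the behavior described above, not a definitive implementation.

```python
# Hedged sketch of the classification step: the category table and the rule
# that safety garments or signaling imply a nearby workman are illustrative
# assumptions, not disclosed details.
OBJECT_CATEGORIES = {  # object type -> work-task category (hypothetical values)
    "reflective hat": "personal protective equipment",
    "reflective vest": "personal protective equipment",
    "sign": "traffic control",
    "hand signal": "traffic control",
}

def classify_object(object_type):
    return OBJECT_CATEGORIES.get(object_type, "unknown")

def characterize_workman(detected_types):
    """Summarize the workman implied by the detected worksite object types."""
    categories = {classify_object(t) for t in detected_types}
    return {
        "workman_present": bool(categories & {"personal protective equipment", "traffic control"}),
        "work_tasks": sorted(categories - {"unknown"}),
    }
```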
In operation, referring now to
Once the one or more images are captured at 302, the image data is transmitted to the electronic data processor 152 for processing at 304. During processing, the color, shape, and visual characteristic data is associated to determine the object type (e.g., reflective hat or vest) and proximate location of the worksite objects 125 at 306. For example, as discussed with reference to
Next at 308, once the various characteristics of the worksite objects 125 are identified at 306, further classification is performed by the classification module 213, and a comparative analysis of the captured images and stored reference images is completed by the electronic data processor 152 to determine the type of the worksite objects 125. Although in
Following determination of the type of the worksite object 125, a second peripheral scan can be completed to determine a position of one or more workmen relative to the work vehicle 100. For example, in some embodiments, the work vehicle 100 can further comprise one or more sensors, such as proximity sensors mounted to the vehicle, that detect a position or presence of one or more workmen relative to the work vehicle 100. Optionally at 312, an alert is generated by the electronic data processor 152 and displayed on the user interface 156 if, based on sensor feedback, it is determined that a workman is within a predetermined danger zone (e.g., too close) relative to the work vehicle 100. In other embodiments, a work function of the work vehicle 100 can be inhibited based upon the alert generated at 312. In addition to being visual, the alert may be audible, haptic, or of another type, or any combination thereof.
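A minimal, hypothetical sketch of this optional step is given below: proximity-sensor feedback is compared with a danger-zone radius, an alert is raised, and a work function is optionally inhibited. The 3 m radius and the user_interface and vehicle_controls calls are assumptions for illustration only.

```python
# Minimal, hypothetical sketch of optional step 312: compare proximity-sensor
# feedback with a danger-zone radius, raise an alert, and optionally inhibit a
# work function. The 3 m radius and the interface/control calls are assumptions.
DANGER_ZONE_M = 3.0

def check_workman_proximity(distances_m, user_interface, vehicle_controls):
    """distances_m: iterable of measured workman distances (meters) from the vehicle."""
    too_close = [d for d in distances_m if d < DANGER_ZONE_M]
    if too_close:
        user_interface.show_alert(f"Workman within {min(too_close):.1f} m of vehicle")
        vehicle_controls.inhibit_work_function()  # e.g., hold boom or bucket motion
        return True
    return False
```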
Without in any way limiting the scope, interpretation, or application of the claims appearing below, a technical effect of one or more of the example embodiments disclosed herein is an object detection system. The object detection system is particularly advantageous in that it provides real-time monitoring of an industrial worksite by detecting one or more worksite objects to identify workmen located around work vehicles on construction, forestry, industrial and other worksites.
While the above describes example embodiments of the present disclosure, these descriptions should not be viewed in a limiting sense. Rather, other variations and modifications may be made without departing from the scope and spirit of the present disclosure as defined in the appended claims.
This Application claims priority to U.S. Provisional Application No. 62/751,597, titled “Safety Detection System and Method,” filed Oct. 27, 2018, which is hereby incorporated by reference in its entirety.