Work vehicles operate in work areas to accomplish tasks such as earth moving, agricultural operations, hauling, and other tasks, with occasional or frequent movement of the work vehicle to new locations in or outside of the work area. The work vehicle may have one or more work tools that are operated in the work area to accomplish tasks. Multiple work vehicles, other equipment, personnel, and/or mobile and/or fixed objects may operate and/or be positioned together in the work area. Personnel and operators of work vehicles and other equipment maintain awareness of the work vehicles, equipment, and other personnel and objects within the work area. In a non-limiting example, an operator of a construction work vehicle may be positioned in the operator station of the work vehicle and visually check the surroundings of the work vehicle before or while controlling movement or another operation of the work vehicle and/or operating a work tool of the work vehicle to accomplish one or more tasks in the work area.
Various aspects of examples of the present disclosure are set out in the claims.
In an embodiment of the present disclosure, a method of operating a work vehicle at a work area is provided. The method includes monitoring a target area in the work area and activating a response on the work vehicle directed toward the target area upon satisfaction of a work area condition in the work area.
In an embodiment of the present disclosure, a work area monitoring system for a work vehicle is provided. The system includes an input module mounted to the work vehicle and configured to monitor a target area in a work area, and an output module in communication with the input module and configured to activate a response on the work vehicle directed toward the target area upon satisfaction of a work area condition in the work area.
The above and other features will become apparent from the following description and accompanying drawings.
The detailed description of the drawings refers to the accompanying figures in which:
Like reference numerals are used to indicate like elements throughout the several figures.
At least one embodiment of the subject matter of this disclosure is understood by referring to
Referring now to
Referring now to
The sensors 36 of the illustrated embodiments include ultrasonic sensors, but in additional embodiments of the present disclosure the sensors 36 include one or more infrared, laser-based, radar-based, or other object-sensing devices or systems. The sensor positions 38 for the sensors 36 of the illustrated embodiment are located on or at the operator station 18 and/or the rear end 28 as illustrated in
The work vehicle 10 and/or the system 30 of one or more embodiments includes a controller 46. The controller 46 receives or is configured to receive one or more sensor signals from the sensors 36 in an embodiment. In an embodiment, the sensor signal includes a signal distance value representing the distance 34 between the sensor 36 and the object 32. Accordingly, in at least one embodiment, the controller 46 receives or otherwise determines a signal location value, which is based on the sensor position 38 of the signal-sending sensor 36, and the signal distance value, which is based on the distance 34 between the sensor 36 and the object 32 sensed by the signal-sending sensor 36.
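For illustration only, the following Python sketch shows one hypothetical way a controller such as the controller 46 might combine a signal location value (the sensor's mounting position) with a signal distance value to estimate where a sensed object lies relative to the vehicle; the data structure, the heading field, the function names, and the numeric values are assumptions made for this example and are not drawn from the present disclosure.

```python
from dataclasses import dataclass
import math

@dataclass
class SensorSignal:
    """Hypothetical sensor signal: the mounting position and heading of the
    signal-sending sensor on the vehicle frame plus the measured distance."""
    sensor_x: float      # sensor position on the vehicle frame (m)
    sensor_y: float
    heading_rad: float   # direction the sensor faces, relative to the frame
    distance_m: float    # measured distance to the sensed object

def estimate_object_location(signal: SensorSignal) -> tuple[float, float]:
    """Project the measured distance along the sensor's heading to estimate
    where the sensed object lies relative to the vehicle frame."""
    return (
        signal.sensor_x + signal.distance_m * math.cos(signal.heading_rad),
        signal.sensor_y + signal.distance_m * math.sin(signal.heading_rad),
    )

# Example: a rear-facing sensor reports an object 3.5 m away.
print(estimate_object_location(SensorSignal(0.0, -2.0, -math.pi / 2, 3.5)))
```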
The system 30 illustrated in
Referring to
The response on or of the work vehicle 10 described herein for various embodiments of the vehicle 10, the system 30, and/or the method 100 of the present disclosure includes activation or illumination of lighting, including without limitation lighting sufficient to illuminate the target area 56 and/or the work area 58; capturing one or more photographic, thermal, or other images of the target area 56 and/or the work area 58; initiating a video and/or audio recording of the target area 56 and/or the work area 58; and/or scanning or otherwise receiving input of the target area 56 and/or the work area 58.
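For illustration only, the following sketch enumerates response options of the kind listed above and shows one hypothetical way an output module might dispatch a selected response toward a named target area; the enum members, command strings, and device names are assumptions for this example, not the disclosed implementation.

```python
from enum import Enum, auto

class Response(Enum):
    """Hypothetical response types drawn from the options described above."""
    ILLUMINATE = auto()
    CAPTURE_IMAGE = auto()
    START_RECORDING = auto()
    SCAN = auto()

def activate_response(response: Response, target_area: str) -> str:
    """Return a placeholder command an output module might direct toward the
    device that covers the named target area."""
    commands = {
        Response.ILLUMINATE: f"lamps[{target_area}].on()",
        Response.CAPTURE_IMAGE: f"camera[{target_area}].capture()",
        Response.START_RECORDING: f"recorder[{target_area}].start()",
        Response.SCAN: f"scanner[{target_area}].sweep()",
    }
    return commands[response]

print(activate_response(Response.ILLUMINATE, "rear"))
```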
Satisfaction of the work area condition in the work area 58 described herein for various embodiments of the vehicle 10, the system 30, and/or the method 100 may include, without limitation, one or more of the sensor(s) 36 sensing the object(s) 32 in or at the target area 56 and/or the work area 58, the work vehicle 10 being relocated in, moving across, or moving at the work area 58, and/or a lapsing of a predetermined period of time.
In one or more embodiments, the work vehicle 10 is in an unoccupied state at the work area 58. An unoccupied state includes the work vehicle 10 not having the operator 20 located at or in the operator station 18 in an embodiment. In another embodiment, an unoccupied state of the work vehicle 10 includes the work vehicle 10 being in a non-operational state. In an embodiment, such as during an unoccupied state of the work vehicle 10 at the work area 58, the input module 54 monitors or is further configured to monitor the target area 56 in the work area 58 with one or more of the sensor(s) 36. The output module 60 activates or is configured to activate the response on the work vehicle 10 directed toward the target area 56 upon sensing the presence or location of the object(s) 32 at the target area 56 with the one or more sensor(s) 36.
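As a non-authoritative sketch of the unoccupied-state monitoring described above, the following example polls a set of hypothetical sensors and directs a response toward the target area whose sensor reports a nearby object; the function names, the 5 m threshold, and the stubbed readings are assumptions for illustration.

```python
def monitor_unoccupied(sensors, activate_response, threshold_m=5.0):
    """Poll each sensor while the vehicle is unoccupied; when a sensor
    reports an object closer than the threshold, direct a response toward
    the target area that sensor covers. `sensors` maps a target-area name
    to a callable returning the latest measured distance (or None)."""
    for target_area, read_distance in sensors.items():
        distance = read_distance()
        if distance is not None and distance < threshold_m:
            activate_response(target_area)

# Example with stubbed sensor readings: only the rear sensor sees an object.
readings = {"front": lambda: None, "rear": lambda: 3.2}
monitor_unoccupied(readings, lambda area: print(f"illuminate {area}"))
```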
As stated previously, the output module 60 activates the response on the work vehicle 10 directed toward the target area 56 upon satisfaction of the work area condition in the work area 58. The output module 60 of an embodiment for an unoccupied state of the work vehicle 10 activates the response upon one or more of the sensor(s) 36 sensing the object(s) 32 in or at the target area 56 and/or the work area 58 as satisfaction(s) of the work area condition. As shown in
The response on or of the work vehicle 10 in the system 30 and/or the method 100 includes activation or illumination of the lamps 40 or other lighting, including without limitation lighting sufficient to illuminate the target area 56 and/or the work area 58. The target area 56 may be one portion of the area surrounding the work vehicle 10, as shown in
In one or more embodiments, the work vehicle 10 is in an operational state at the work area 58. An operational state includes, in particular non-limiting embodiments, the work vehicle 10 being occupied by the operator 20 at or in the operator station 18, being in the process of performing an operation of the work vehicle 10, and/or traveling in, at, or across the work area 58.
In one or more embodiments when the work vehicle 10 is in an operational state at the work area 58, the input module 54 determines or is configured to determine a location of the work vehicle 10 in the work area 58 relative to the target area 56. The input module 54 determines the location of the work vehicle 10 by receiving or determining input through a global positioning system (GPS), inertial measurement, and/or any other location-determining system or process. In such embodiments, satisfaction of the work area condition in the work area 58 includes the work vehicle 10 being relocated or moving in the work area 58.
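For illustration, the following minimal sketch treats the vehicle as relocated when two position fixes differ by more than a threshold distance, one hypothetical way the relocation condition might be evaluated; the local x/y coordinates and the 10 m threshold are assumptions for the example.

```python
import math

def vehicle_relocated(prev_fix, current_fix, threshold_m=10.0):
    """Return True when the straight-line distance between two position
    fixes (local x/y coordinates in metres) exceeds the threshold, treating
    the move as satisfaction of the work area condition."""
    dx = current_fix[0] - prev_fix[0]
    dy = current_fix[1] - prev_fix[1]
    return math.hypot(dx, dy) > threshold_m

print(vehicle_relocated((0.0, 0.0), (8.0, 9.0)))   # True: moved about 12 m
```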
In one or more embodiments, such as when the work vehicle 10 is in an operational state at the work area 58, satisfaction of the work area condition in the work area 58 includes the input module 54 determining an existence of an anomaly in the work area. The anomaly includes, without limitation, a barrier, object, terrain deviation, and/or other element or elements that is/are inconsistent with the remainder of the work area 58, with a historical set of input values to the system 30 and/or the vehicle 10, and/or with a stored algorithm or reference data used to determine or define the existence of an anomaly. In one non-limiting example, a construction work vehicle traveling across and scanning or otherwise sensing the work area 58 may approach a large rock obstacle as an anomaly that satisfies the work area condition.
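As one hypothetical illustration of anomaly determination, the following sketch compares measured terrain heights against stored reference data and flags deviations beyond a tolerance; the height profiles, the 0.5 m tolerance, and the function names are assumptions and do not represent the disclosed algorithm.

```python
def find_anomalies(measured_heights, reference_heights, tolerance_m=0.5):
    """Compare measured terrain heights against stored reference data and
    return the indices where the deviation exceeds the tolerance, flagging
    each as a candidate anomaly (e.g., a rock obstacle)."""
    return [
        i
        for i, (measured, expected) in enumerate(zip(measured_heights, reference_heights))
        if abs(measured - expected) > tolerance_m
    ]

# A 1.2 m bump at index 2 against an otherwise flat reference profile.
print(find_anomalies([0.0, 0.1, 1.2, 0.0], [0.0, 0.0, 0.0, 0.0]))  # [2]
```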
In an embodiment, when the work vehicle 10 is in an operational state or an unoccupied state, the output module 60 is further configured, directly or indirectly, to capture one or more image(s) of the target area 56 upon the input module 54 determining the existence of an anomaly in the work area 58. The output module 60 may include or be in communication with a still image or video camera, thermal imaging camera, and/or other imaging device to capture the one or more images of the work area 58. The output module 60 and/or another device of the system 30 and/or the vehicle 10 may store, onboard or remotely, and/or transmit the image(s) for processing, analysis, and/or future reference.
In an embodiment, the output module 60 is further configured to scan the target area 56 upon the input module 54 determining the existence of an anomaly in the work area 58. The output module 60 may include or be in communication with a three-dimensional scanner, a radar- or laser-based range finder, and/or another scanning device to scan the target area 56. In the above non-limiting example, upon determining the existence of the anomaly, the system 30 and/or the vehicle 10 may capture the image of the obstacle or scan the area of or around the obstacle in order to, in some examples, communicate or record the location and/or physical details of the obstacle for planning and operation in the work area 58.
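For illustration only, the following sketch combines image capture and scanning upon an anomaly and appends the results, with a timestamp, to a log for later planning or analysis; the device stand-ins, the file name, and the record format are assumptions for this example.

```python
import json
import time

def record_anomaly(target_area, capture_image, scan_area, log_path="anomaly_log.json"):
    """On an anomaly, capture an image of the target area, scan it, and
    append the results with a timestamp for later planning or analysis.
    `capture_image` and `scan_area` are stand-ins for real device calls."""
    entry = {
        "timestamp": time.time(),
        "target_area": target_area,
        "image_ref": capture_image(target_area),
        "scan_points": scan_area(target_area),
    }
    with open(log_path, "a") as log:
        log.write(json.dumps(entry) + "\n")
    return entry

# Example with stubbed devices standing in for a camera and a scanner.
record_anomaly("front-left",
               capture_image=lambda area: f"{area}_0001.jpg",
               scan_area=lambda area: [[1.0, 2.0, 0.3]])
```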
In an embodiment of the present disclosure, the satisfaction of the work area condition in the work area 58 includes a lapsing of a predetermined period of time. In a non-limiting example, the system 30 and/or the vehicle 10 predetermines a five-minute period of time such that, in the non-limiting example, the work area condition is satisfied repeatedly every five minutes. In the non-limiting example, the input module 54 communicates with the output module 60 to activate a response on the work vehicle 10 directed toward the target area 56, such as actuation or initiation of image capturing, as described herein, upon satisfaction of the work area condition of the lapsing of the predetermined time period. Accordingly, the system 30 and/or the vehicle 10 may capture images of the work area 58 as the work vehicle 10 operates in the work area 58 in order to store, onboard or remotely, and/or transmit the images for processing, analysis, and/or future reference.
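As a minimal sketch of the time-based condition, assuming a monotonic clock and a period that restarts each time it lapses, the following example reports satisfaction whenever the predetermined period has elapsed; the class name and the short demonstration period are assumptions for illustration.

```python
import time

class PeriodicCondition:
    """Treat the lapse of a predetermined period (e.g., five minutes) as
    satisfaction of the work area condition, repeating each time it lapses."""

    def __init__(self, period_s: float = 300.0):
        self.period_s = period_s
        self._last = time.monotonic()

    def satisfied(self) -> bool:
        now = time.monotonic()
        if now - self._last >= self.period_s:
            self._last = now   # restart the period so the condition repeats
            return True
        return False

# Example: a short period for demonstration; poll in the vehicle's main loop.
condition = PeriodicCondition(period_s=0.1)
time.sleep(0.2)
print(condition.satisfied())   # True: the period has lapsed
```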
The steps, functions, and methods of each embodiment of the work vehicle 10 and/or the system 30 described herein form part of one or more embodiments of the method 100 of operating the work vehicle 10 at the work area 58 illustrated in
Without in any way limiting the scope, interpretation, or application of the claims appearing below, it will be appreciated that the work vehicle 10, the system 30, and the method 100 of the embodiments of the present disclosure improve the security of the work vehicle 10 and the work area 58. In a non-limiting example, the system 30, the vehicle 10, and/or the method 100 illuminates the work area 58 and/or captures an image or video recording upon the determination that an individual is approaching the work vehicle 10, such as when the work vehicle 10 is unoccupied. Further, the work vehicle 10, the system 30, and/or the method 100 improve the safety and comfort of an operator of the work vehicle 10 by illuminating the work area 58 near the work vehicle 10 when the operator is approaching the work area 58. Furthermore, the work vehicle 10, the system 30, and/or the method 100 improve operation of the work vehicle 10 in the work area 58 via, in particular non-limiting examples, scanning, capturing image(s), or video recording the work area 58 to monitor the work area 58 and/or the status of operations in the work area 58.
As used herein, “e.g.” is utilized to non-exhaustively list examples and carries the same meaning as alternative illustrative phrases such as “including,” “including, but not limited to,” and “including without limitation.” As used herein, unless otherwise limited or modified, lists with elements that are separated by conjunctive terms (e.g., “and”) and that are also preceded by the phrase “one or more of,” “at least one of,” “at least,” or a like phrase, indicate configurations or arrangements that potentially include individual elements of the list, or any combination thereof. For example, “at least one of A, B, and C” and “one or more of A, B, and C” each indicate the possibility of only A, only B, only C, or any combination of two or more of A, B, and C (A and B; A and C; B and C; or A, B, and C). As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Further, “comprises,” “includes,” and like phrases are intended to specify the presence of stated features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof.
While the present disclosure has been illustrated and described in detail in the drawings and foregoing description, such illustration and description is not restrictive in character, it being understood that illustrative embodiment(s) have been shown and described and that all changes and modifications that come within the spirit of the present disclosure are desired to be protected. Alternative embodiments of the present disclosure may not include all of the features described yet still benefit from at least some of the advantages of such features. Those of ordinary skill in the art may devise their own implementations that incorporate one or more of the features of the present disclosure and fall within the spirit and scope of the appended claims.