This invention relates generally to occupant detection systems, and more specifically to operator awareness systems at a vehicle or machine.
Automobile and machine manufacturers are continually striving to enhance passenger safety by both active and passive means. For example, operator presence detection systems are often deployed as safety features in consumer as well as commercial vehicles and machines. One of the most common operator presence devices is a pressure-sensitive sensor positioned underneath a passenger seat. For example, a seat sensor can be used in conjunction with an airbag deployment system to detect the presence of an occupant and as a check that a passenger's weight satisfies a minimum weight requirement for safe airbag deployment.
Seat sensors can also be deployed in agricultural machines as a means to ensure that an operator is present and in control of a machine while a power take-off (PTO) is turned on. Agricultural fields can cover large areas, with a single pass taking up to an hour at a machine's relatively slow speed. Operator shifts can be up to 12 hours, a long time to remain seated, and some operators cannot resist the urge to get up and stretch or take a break while a machine is running. A seat sensor can be used to detect a vacant seat and shut down a PTO in response. Unfortunately, a seat sensor is easily exploitable by an operator who wants to avoid PTO down time: he may simply place a weight on the seat or disconnect the seat switch, defeating its purpose and placing the operator, his machine, and his environment at risk.
Even when working properly with a seated operator, a seat sensor is limited to confirming an operator's presence, not his attention to machine operations. An operator may be drowsy and nodding off, be distracted by an entertainment system, or simply be occupied with looking behind the machine at a towed implement rather than ahead in the direction that the machine is heading, a dangerous situation when a machine is not equipped with an automatic guidance system. An accident can occur when an operator fails to notice an upcoming obstacle for any of these reasons.
Some types of systems for checking or encouraging operator attention have been proposed, such as requiring an operator to push a button at periodic intervals, or flashing a message on a control display screen that invites an operator to respond. However, these types of systems can cause frustration and irritation on the part of an operator and are most often considered a nuisance that an operator wants to avoid. Thus, a nuanced approach is required, one that gauges an operator's awareness, is not easily circumvented, and is not invasively annoying to an operator.
A system is presented for determining an operator's situational awareness and, dependent on his awareness, warning him when his vehicle is approaching an obstacle or predetermined location. In an example embodiment, a system of the invention can be configured to determine the direction that an operator is looking. If the operator is not looking in the direction of machine travel, the system can alert him to an upcoming obstacle or marked position in his travel path. However, if the operator is looking in the direction of machine travel, and thus is able to see the obstacles in the machine/implement path, no alert is provided. Thus the system can enhance operator safety without providing unnecessary alarms that either annoy an operator or are dismissed as part of an "alarm overload" condition, in which alarms occur at such frequent intervals that an operator ends up ignoring them as routine.
In an example embodiment, a system can include an optical and/or infrared camera or other sensor coupled to a situational awareness detector (SAD). A camera can be mounted in an upper region of a cab in an unobtrusive location. In an example embodiment, camera input can be monitored so that an effort by an operator to cover, disconnect, or otherwise circumvent the system can be thwarted. In an example system, predetermined locations, such as ponds, ends of rows, fences, turns, angles, and the like can be provided by a user and stored at a SAD as marked locations or objects that require an operator's attention. In addition, a dynamic obstacle detector, such as a ladar or radar detector, can be used to detect objects such as large rock piles in the path of a machine or machine implement. A SAD can be configured to compare the locations of predetermined marked locations and dynamically detected objects to a machine's travel path to determine whether an object lies therein. In addition, a SAD can be configured to determine an operator's awareness of an upcoming obstacle in his machine path by determining whether the machine path or object is within an operator's gaze.
In an example embodiment, a SAD can comprise an operator orientation module (OOM) configured to determine an operator sight direction; a machine path module (MPM) configured to determine the path along which a machine and/or its implement is headed; an object presence module (OPM) configured to determine whether an object is present within the machine travel path; and an operator awareness module (OAM) configured to determine operator awareness of an object, for example by determining whether a machine travel path is within a line of sight or field of view of an operator. In an exemplary embodiment, the OAM can be configured to determine whether a particular object in the machine path is within the view of an operator. When a determination is made that an operator is not aware of an object in his machine's path, an example SAD can be configured to provide an alert.
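By way of illustration only, the modular arrangement described above can be sketched in Python, with each module reduced to a callable. The module names follow the description; the signatures, return types, and angle conventions are assumptions made for the sketch, not features of the invention.

```python
from dataclasses import dataclass
from typing import Callable, Optional, Tuple

@dataclass
class SituationalAwarenessDetector:
    """Illustrative sketch of a SAD composed of its four modules.
    Signatures and return types are assumptions for the sketch."""
    oom: Callable[[], Optional[float]]  # operator sight bearing (deg), None if undetected
    mpm: Callable[[], float]            # machine path heading (deg)
    opm: Callable[[float], Optional[Tuple[float, float]]]  # object (lat, lon) in path, or None
    oam: Callable[[float, float], bool]  # True when operator is aware

    def check(self) -> bool:
        """Return True when an alert should be provided."""
        heading = self.mpm()
        obstacle = self.opm(heading)
        if obstacle is None:
            return False  # nothing in the travel path, no alert needed
        sight = self.oom()
        # Alert when no operator gaze is detected, or the gaze does not
        # cover the machine path
        return sight is None or not self.oam(sight, heading)
```

In this sketch, an alert is raised only when an object is present and the OAM judges the operator unaware of it, mirroring the decision logic described above.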
The above mentioned and other features of this invention will become more apparent and the invention itself will be better understood by reference to the following description of embodiments of the invention taken in conjunction with the accompanying drawings, wherein:
Corresponding reference characters indicate corresponding parts throughout the views of the drawings.
As required, example embodiments of the present invention are disclosed. The various embodiments are meant to be non-limiting examples of various ways of implementing the invention and it will be understood that the invention may be embodied in alternative forms. The present invention will be described more fully hereinafter with reference to the accompanying drawings in which like numerals represent like elements throughout the several figures, and in which example embodiments are shown. The figures are not necessarily to scale and some features may be exaggerated or minimized to show details of particular elements, while related elements may have been eliminated to prevent obscuring novel aspects. The specific structural and functional details disclosed herein should not be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present invention. For example, while the exemplary embodiments are discussed in the context of an agricultural vehicle, it will be understood that the present invention is not limited to that particular arrangement. Likewise, functions discussed in the context of being performed by a particular module or device may be performed by a different module or device without departing from the scope of the claims.
In an example embodiment, the OOM 24 can be configured to determine the orientation of an operator positioned in the cab 16. For example, the OOM 24 can use sensor input provided by one or more optical and/or infrared cameras or other sensors positioned in the cab 16 to determine operator orientation. In an example embodiment, one or more sensors/cameras can be positioned in various locations throughout the cab, such as at the front of the cab facing rearward, the rear of the cab facing forward, and at the sides of the cab 16 facing inward. The OOM 24 can comprise one or more image processing, pattern recognition, face detection or other algorithms configured to detect an operator and determine the orientation of an operator's head to determine the direction that an operator is looking. For example, algorithms for multi-perspective multi-modal systems such as those discussed in "Multiperspective Thermal IR and Video Arrays for 3D Body Tracking and Driver Activity Analysis", by Shinko Cheng, Sangho Park, and Mohan Trivedi, published at the 2nd Joint IEEE International Workshop on Object Tracking and Classification in and Beyond the Visible Spectrum in conjunction with IEEE CVPR2005, San Diego, Calif., USA, June 2005, which is incorporated in its entirety by reference, can be practiced and/or modified to analyze and track the head and face gaze direction of an operator. In an example embodiment, methods such as those disclosed in the article entitled "Image-Based Passenger Detection and Localization Inside Vehicles", authored by Petko Faber and published in the International Archives of Photogrammetry and Remote Sensing, Vol. XXXIII, Part B5, pages 231-232, Amsterdam 2000, which is incorporated in its entirety by reference, can be practiced and/or modified. In addition, algorithms similar to those used for video gesture control, including 3D depth-camera gesture control, in the gaming industry can also be employed.
For example, the GestureTek™ Maestro 3D software, which tracks a user's movements within a volume of interest to allow device-free control, such as that employed in Kinect™ video games, can be adapted to track a user's head. For example, referring to
The MPM 26 can be configured to determine the travel path for the machine 10 and implement 12. In an example embodiment, the controller 22 can receive input from a satellite receiver configured to determine current geographical location using satellite signals as known in the art for Global Positioning System (GPS) and Global Navigation Satellite System (GNSS) receivers. In addition, user input, such as the width of a towed implement, can be received. The MPM 26 can use the geographical location and implement data to determine the machine travel path, which can be characterized by a heading and a width defined by latitude and longitude parameters.
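By way of non-limiting illustration, the kind of computation an MPM might perform can be sketched as follows: a travel heading is derived from two successive GPS fixes and paired with a corridor half-width taken from the implement width. The function name, the equirectangular (flat-earth) approximation, and the dictionary representation of the path are assumptions of the sketch.

```python
import math

def machine_path(prev_fix, curr_fix, implement_width_m):
    """Illustrative MPM-style computation: derive a travel heading from
    two successive GPS fixes (lat, lon in degrees) and pair it with a
    corridor half-width taken from the implement width."""
    lat1, lon1 = prev_fix
    lat2, lon2 = curr_fix
    # Equirectangular approximation, adequate over the short distance
    # between consecutive fixes
    dlat = lat2 - lat1
    dlon = (lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    heading_deg = math.degrees(math.atan2(dlon, dlat)) % 360.0
    return {"origin": curr_fix,
            "heading_deg": heading_deg,
            "half_width_m": implement_width_m / 2.0}
```

For example, two fixes separated due north yield a heading of 0 degrees, and a 6 m implement yields a 3 m corridor half-width on either side of the path centerline.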
The OPM 28 can be configured to determine whether an object is present in the machine travel path. In an example embodiment, a user can input predetermined marked locations of known obstacles or objects of interest, such as fencing, a pond, an end of row, an angled turn, and the like. For example, latitude and longitude parameters can be provided and stored in a memory or database at the OPM 28. The OPM 28 can be configured to receive machine travel path information from the MPM 26 and determine whether the predetermined marked location parameters lie in the path defined at the MPM 26. In addition, a dynamic obstacle detection means can provide input regarding detected obstacles. For example, a radar, ladar, or other type of detector can be configured to detect objects within its line of sight to provide input regarding obstacles that may not have been previously known, such as rock piles that have been formed in the course of working a field. In an example embodiment, the radar detector can be configured to provide the heading and distance to the detected object. In an example embodiment, the detector acquires measurements through an arc that covers the width of the implement 12. In an exemplary embodiment, the machine path determined at the MPM 26 can be provided to the detector via a CAN bus (not shown) at the machine 10. Radar messages of suitable signal strength and within a predetermined proximity can be checked to determine whether they occur within the machine path. If sufficient measurements occur within the machine path, then an obstacle message providing obstacle position can be sent to the SAD 20 via the CAN bus. In a further embodiment, cameras can be used for obstacle detection, with video data transferred to the OPM 28 via the CAN bus. The OPM 28 can include one or more image processing algorithms known in the art for obstacle detection.
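The test of whether a marked location lies in the machine path can be illustrated with a short sketch. The sketch assumes the path is represented as a dictionary with an origin (lat, lon), a heading in degrees, and a corridor half-width in meters, and it assumes a 200 m look-ahead distance; none of these representations or figures are specified by the description.

```python
import math

EARTH_M_PER_DEG = 111_320.0  # approximate meters per degree of latitude

def object_in_path(path, obj_fix, look_ahead_m=200.0):
    """Illustrative OPM-style test: is a marked (lat, lon) location
    inside the corridor ahead of the machine?"""
    lat0, lon0 = path["origin"]
    lat, lon = obj_fix
    # Convert the offset to local meters (equirectangular approximation)
    north = (lat - lat0) * EARTH_M_PER_DEG
    east = (lon - lon0) * EARTH_M_PER_DEG * math.cos(math.radians(lat0))
    h = math.radians(path["heading_deg"])
    along = north * math.cos(h) + east * math.sin(h)   # distance ahead
    cross = -north * math.sin(h) + east * math.cos(h)  # lateral offset
    return 0.0 <= along <= look_ahead_m and abs(cross) <= path["half_width_m"]
```

An object behind the machine or offset laterally beyond the implement half-width is excluded, so only locations that actually lie ahead within the corridor are reported.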
The OAM 30 can be configured to determine operator awareness of an upcoming object in his machine's path. In an example embodiment, the OAM 30 can be configured to compare the operator sight direction with the machine travel direction to determine whether an operator is aware of the machine travel path and any obstacles therein. In an example embodiment, a line-of-sight bearing or a field of view can be provided by the OOM 24 in terms of parameters such as a range of bearings or a range of latitude and longitude coordinates. If the machine path as defined by the MPM 26 falls within the operator FOV 18, then a determination can be made that the operator is aware of the objects in the machine path. If not, then a determination can be made that the operator is not aware of the machine path, and thus not aware of any obstacles therein.
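The comparison of sight direction with machine heading can be sketched as a bearing test; the 60-degree field of view is an assumed figure for the sketch, not one specified in this description.

```python
def heading_in_fov(sight_bearing_deg, machine_heading_deg, fov_deg=60.0):
    """Illustrative OAM-style comparison: the operator is treated as
    aware of the machine path when its heading falls within a field of
    view centered on the operator's line-of-sight bearing."""
    # Signed angular difference wrapped into [-180, 180) degrees
    diff = (machine_heading_deg - sight_bearing_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2.0
```

The wraparound handling matters: a sight bearing of 350 degrees and a machine heading of 0 degrees differ by only 10 degrees, and the sketch treats that case as within view.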
In an example embodiment, the OPM 28 can be configured to provide locations of objects that are within a machine's travel path to the OAM 30 and the OAM 30 can be configured to determine whether the object locations are within the operator line of sight or FOV provided by the OOM 24. If a determination is made at the OAM 30 that an operator is not aware of an object in the machine travel path, the OAM 30 can be configured to trigger the alert module 32 to provide an alert, preferably an audible alert. In an example embodiment, the OAM 30 can be configured to trigger the alert module 32 when the object is within the travel path and within a predetermined proximity of current machine location. For example, the OAM 30 can receive current geographic position from a GPS receiver at the machine 10, or from the MPM 26, and compare current location to the location coordinates for an object in the machine path as received from the OPM 28. In an example embodiment, an alert message can be sent to a speaker of an onboard audio system within the cab 16, for example via a CAN bus at the machine 10. In an example embodiment, as the distance between a machine and the object in its path narrows, an alert can be provided with increased urgency.
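The proximity-triggered, escalating alert can be illustrated as follows. The 150 m trigger distance and the linear urgency ramp are assumptions for the sketch; an implementation might instead step alarm volume or repetition frequency as the machine closes on the object.

```python
def alert_urgency(distance_m, trigger_m=150.0):
    """Illustrative escalation scheme: no alert beyond the trigger
    distance, then urgency grows linearly from 0.0 toward 1.0 as the
    machine closes on the object in its path."""
    if distance_m >= trigger_m:
        return None  # object not yet within alerting proximity
    return 1.0 - max(distance_m, 0.0) / trigger_m
```

The returned urgency value could drive, for example, the volume or repetition rate of the audible alert sent to the cab speaker.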
If it is determined that an object is present within the machine path, the method 40 can continue to block 46 where a determination can be made regarding operator sight direction. In an example embodiment, the OOM 24 can use camera input to determine the orientation of an operator, particularly an operator's head, and an operator's line of sight direction. In an example embodiment, an operator FOV can be determined based on the line of sight.
At decision block 48 a determination can be made as to whether an operator is aware of the obstacle in the machine path. In an example embodiment, the OAM 30 can use the operator sight direction or FOV provided by the OOM 24 and the machine path provided by the MPM 26 to make this determination. If the machine path is within the operator FOV, or its heading is within a predetermined angle of an operator line of sight, a determination can be made that an operator is aware of the object. It is also contemplated that in an example embodiment the OPM 28 can provide locations of objects or obstacles that are in the machine path, and the OAM 30 can determine whether those locations are within an operator FOV. If it is determined that an operator is aware of an object, the method can continue at block 42. If the determination is made that an operator is not aware of the obstacle, then an alert can be provided at block 50. For example, the OAM 30 or controller 22 can trigger an alarm at the cab 16. In an example embodiment, the alarm can increase in urgency, such as increased volume or frequency, as the machine 10 approaches the object. In an example embodiment, the alarm can be triggered when the machine is within a predetermined distance of the object.
It is noted that a method of the invention can include various sequences of the blocks depicted in
Thus the invention provides systems, apparatus, and methods for warning an operator of an obstacle in a machine or implement path. The invention improves the safety of operator and machine during field operations, is not easily exploitable, and does not overload an operator with annoying or unnecessary alarms or distractions. In an example embodiment, one or more sensors such as cameras can be mounted in unobtrusive locations to detect an operator's presence and orientation. By way of example, but not limitation, a method can include checking sensor input, for example camera input, to ensure that the camera has not been disabled or covered. The invention can detect the presence of obstacles in a machine's path and make a determination regarding operator object awareness by comparing an operator gaze direction or field of view with a machine path direction. In an example embodiment, an operator is only warned when he is unaware of the object, for example in the case that he is looking in a direction other than that of the machine travel path.
This application claims priority to U.S. Provisional Application No. 61/581,992 filed Dec. 30, 2011, entitled “METHOD OF DETECTING AND IMPROVING OPERATOR SITUATIONAL AWARENESS ON AGRICULTURAL MACHINES”.