Field of Invention
Embodiments of the present invention relate to user interfaces for mobile machines. More particularly, embodiments of the present invention relate to advanced user interfaces and user interface systems for mobile machines that automatically adapt to the machine's operating environment and provide natural, intuitive interaction between the machine and the machine's operator.
Description of Related Art
Mobile machines, such as mobile machines used in the agriculture and construction industries, are increasingly large, complex and automated. Many such machines have multiple auxiliary functions. Tractors used in the agriculture and construction industries, for example, may include front and rear linkage systems, multiple power take offs, and multiple hydraulic couplers for interfacing an onboard hydraulic system with an external attachment. Such auxiliary functions typically have operator controls inside the cabin and, sometimes, additional controls outside the operator cabin. These machines typically include multiple embedded computing devices to help manage machine operation, and may collect information from an array of sensors located throughout the machine and use the collected information to optimize machine performance and provide information about the machine to the operator through a user interface.
It is common for multiple machines to work cooperatively within a relatively small geographic area, such as a group of construction machines doing groundwork at a worksite or a fleet of combine harvesters and grain carts harvesting a field and transporting harvested grain to a grain storage facility. Using multiple machines at a single site can increase productivity but also presents challenges. The work of the machines must be coordinated, for example, and care must be taken to avoid machine-to-machine and machine-to-person collisions and other accidents.
The above section provides background information related to the present disclosure which is not necessarily prior art.
A mobile machine in accordance with a first embodiment of the invention comprises an operator cabin, one or more optical sensors in the operator cabin, and one or more computing devices for detecting a first movement of an operator in the operator cabin using data from the one or more optical sensors, and for performing a first action on an attachment associated with the machine, the first action being in response to, and associated with, the first movement.
A mobile machine in accordance with a second embodiment of the invention comprises an operator cabin, one or more optical sensors in the operator cabin, and one or more computing devices for detecting a first movement of an operator in the operator cabin using data from the one or more optical sensors, and for performing a first action on a component of the machine, the first action being in response to, and associated with, the first movement.
These and other important aspects of the present invention are described more fully in the detailed description below. The invention is not limited to the particular methods and systems described herein. Other embodiments may be used and/or changes to the described embodiments may be made without departing from the scope of the claims that follow the detailed description.
Embodiments of the present invention are described in detail below with reference to the attached drawing figures, wherein:
The drawing figures do not limit the present invention to the specific embodiments disclosed and described herein. The drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the invention.
The following detailed description of embodiments of the invention references the accompanying drawings. The embodiments are intended to describe aspects of the invention in sufficient detail to enable those skilled in the art to practice the invention. Other embodiments can be utilized and changes can be made without departing from the scope of the claims. The following description is, therefore, not to be taken in a limiting sense.
In this description, references to “one embodiment”, “an embodiment”, or “embodiments” mean that the feature or features being referred to are included in at least one embodiment of the technology. Separate references to “one embodiment”, “an embodiment”, or “embodiments” in this description do not necessarily refer to the same embodiment and are also not mutually exclusive unless so stated and/or except as will be readily apparent to those skilled in the art from the description. For example, a feature, structure, act, etcetera described in one embodiment may also be included in other embodiments, but is not necessarily included. Thus, the present technology can include a variety of combinations and/or integrations of the embodiments described herein.
Embodiments of the present invention relate to improved user interfaces and user interface systems for mobile machines that make a greater amount of information available to machine operators in a manner that is natural, intuitive and easy to use. More specifically, embodiments of the invention relate to user interface systems capable of automatically detecting aspects of the machine's operating environment and automatically optimizing the user interface for that operating environment. The operating environment may include the state of external and independent objects within the same region as the mobile machine as well as the state of an operator inside the mobile machine. Such optimization may occur in real time to reflect changes in the machine's operating environment as they occur.
Embodiments of the present invention also relate to user interface systems for mobile machines that facilitate user input and interaction by enabling various methods of user input including traditional buttons, switches or touchscreen inputs, as well as more natural forms of user input such as gestures and sound recognition. Gesture and sound recognition enable hands-free control of the mobile machine, enabling operators to interact with the machine without any mechanical input devices.
Turning now to the drawing figures, and initially
With particular reference now to
Reference will be made herein to items or objects external to the operator cabin 14. Such items or objects may include independent objects, attachments and machine components. An independent object is an object that is not physically attached to or part of the mobile machine. Independent objects include other mobile machines, attachments coupled with other mobile machines, and fixed structures such as barns, grain storage bins, grain dryers, grain silos, road markings, signs, bridges, railroad crossings, fence lines, power lines, creeks, rivers and geographic markers. An attachment is a machine or device that is mechanically coupled with the mobile machine and is intended to be coupled with, and decoupled from, the mobile machine as part of the ordinary and normal operation of the machine. Examples of attachments include compacting or tilling implements pulled or pushed by a tractor or bulldozer, combine headers, windrower headers, loaders attached to tractors, mowers attached to tractors, and balers attached to tractors. Machine components include parts of the mobile machine that are not decoupled from the machine during ordinary and normal operation of the machine. Machine components include wheels, tires and tracks, engines, and components of hydraulic systems such as hydraulic motors.
Aspects of the present invention may be enabled by a communications and control system associated with the mobile machine. An exemplary communications and control system 52 is illustrated in
The position determining system 56 may include a global navigation satellite system (GNSS) receiver, such as a device configured to receive signals from one or more positioning systems such as the United States' global positioning system (GPS), the European GALILEO system and/or the Russian GLONASS system, and to determine a location of the mobile machine using the received signals. The position determining system 56 may incorporate GNSS enhancements such as Differential Global Positioning System (DGPS) or Real Time Kinematic (RTK) that increase the accuracy of GNSS systems.
The sensors 60 may be associated with any of various components or functions of the mobile machine including, for example, various elements of the engine, transmission(s), hydraulic system, electrical system, power take off(s) and linkage systems. The sensors 60 may collect information about the operation of the machine such as engine speed, engine temperature, wheel position, hydraulic fluid pressure and hydraulic fluid temperature. The sensors 60 may also collect information about attachments coupled with the machine as well as the machine's environment such as ambient temperature. As explained below in greater detail, one or more of the sensors 60 may be configured and/or placed to determine when an attachment is coupled with the machine.
The actuators 62 are configured and placed to drive certain functions of the machine including, for example, steering when an automated guidance function is engaged and manipulating a hydraulic system or a linkage system such as a three-point hitch linkage system. The actuators 62 may take virtually any form but are generally configured to receive control signals or other inputs from the controller 54 (or other component of the system 52) and to generate a mechanical movement or action in response to the control signals or instructions. By way of example, the sensors 60 and actuators 62 may be used in automated steering of a machine wherein the sensors 60 detect a current position or state of steered wheels or tracks and the actuators 62 drive steering action or operation of the wheels or tracks. In another example, the sensors 60 collect data relating to the operation of the machine and store the data in the storage component 64, communicate the data to a remote computing device via the communications gateway 68, or both.
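By way of a non-limiting illustration of the sensor-and-actuator steering example above, the following sketch shows how a controller might repeatedly read a steered-wheel angle and command an actuator toward a target angle. The WheelAngleSensor and SteeringActuator interfaces, the gain and the timing values are hypothetical placeholders rather than any actual machine API.

```python
# Illustrative sketch only: a simple proportional correction loop in which a
# controller reads a steered-wheel angle from a sensor and commands an actuator
# toward a target angle supplied by an automated guidance function. The
# WheelAngleSensor and SteeringActuator interfaces are hypothetical placeholders.
import time


class WheelAngleSensor:
    """Hypothetical sensor returning the current steered-wheel angle in degrees."""
    def read_angle(self) -> float:
        return 0.0  # placeholder value


class SteeringActuator:
    """Hypothetical actuator accepting a steering rate command in degrees/second."""
    def command_rate(self, rate_deg_per_s: float) -> None:
        print(f"steering rate command: {rate_deg_per_s:+.1f} deg/s")


def steer_toward(target_deg: float, sensor: WheelAngleSensor,
                 actuator: SteeringActuator, gain: float = 0.5,
                 cycles: int = 10, period_s: float = 0.1) -> None:
    """Drive the steered wheels toward target_deg with a proportional correction."""
    for _ in range(cycles):
        error = target_deg - sensor.read_angle()   # difference between target and measured angle
        actuator.command_rate(gain * error)        # command a rate proportional to the error
        time.sleep(period_s)


if __name__ == "__main__":
    steer_toward(5.0, WheelAngleSensor(), SteeringActuator())
```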
The controller 54 includes one or more integrated circuits programmed or configured to implement the functions described herein. By way of example, the controller 54 may be a digital controller and may include one or more general purpose microprocessors or microcontrollers, programmable logic devices, or application specific integrated circuits. The controller 54 may include multiple computing components placed in various different locations on the machine. The controller 54 may also include one or more discrete and/or analog circuit components operating in conjunction with the one or more integrated circuits or computing components. Furthermore, the controller 54 may include or have access to one or more memory elements operable to store executable instructions, data, or both. The storage component 64 stores data and preferably includes a non-volatile storage medium such as optical, magnetic or solid-state semiconductor technology.
It will be appreciated that, for simplicity, certain elements and components of the system 52 have been omitted from the present discussion and from the drawing of
In some embodiments, all of the components of the system 52 are contained on or in a host machine. The present invention is not so limited, however, and in other embodiments one or more of the components of the system 52 may be external to the machine. In another embodiment, for example, some of the components of the system 52 are contained on or in the machine while other components of the system 52 are contained on or in an implement associated with the machine. In that embodiment, the components associated with the machine and the components associated with the implement may communicate via wired or wireless communications according to a local area network such as, for example, a controller area network. The system 52 may be part of a communications and control system conforming to the ISO 11783 (also referred to as “ISOBUS”) standard. In yet another exemplary embodiment, one or more components of the system 52 may be located remotely from the machine and any implements associated with the machine. In that embodiment, the system 52 may include wireless communications components (e.g., the communications gateway 68) for enabling the machine to communicate with a remote computer, computer network or system.
An exemplary implementation of the user interface system 58 is illustrated in
The sensors 72 are adapted to detect operator presence, state or behavior such as operator movement, contact, body position and sounds including spoken and non-spoken sounds (e.g., hand claps or finger snaps). Optical or infrared sensors, such as image capture devices, may be used to detect movement and body position. Microphones, piezoelectric sensors and/or other sensors may be used to detect sounds.
The sensors 72 are preferably placed in or near the operator cabin 14 to best detect operator presence, state or behavior. Optical sensors for detecting operator position and movement may be placed at positions in or near the ceiling 44 of the operator cabin 14 to maximize unobstructed view of the operator from the sensors 72. One exemplary configuration is illustrated in
One or more of the sensors 72 may also be placed to detect movement of the operator's feet and/or legs. A sensor 72C is illustrated in
The inputs 74 may include physical input components for receiving instructions or other input from a user. Such physical input components may correspond to the operator controls 34 illustrated in the drawings and discussed above, and may include buttons, switches, levers, dials, and microphones. The inputs 74 may further include one or more touchscreen displays capable of presenting visual representations of information or data and receiving instructions or input from the user via a single display surface.
The one or more display components 76 may include one or more display consoles, one or more heads-up display projectors/surfaces, or a combination thereof. In some embodiments of the invention the system 58 is configured to automatically place user interface elements at optimal or preferred locations, as explained below in greater detail. To enable that functionality display components 76 are placed at two or more locations within the operator cabin 14 such that the user interface system 58 may select one of a plurality of display locations for placing a user interface element. By way of example, multiple display consoles may be used, one or more placed on each side of the operator cabin 14 or in each of multiple corners of the operator cabin 14. According to another implementation, a heads-up display system may be used with the cabin windows serving as display surfaces.
An exemplary projector 80 is illustrated in
The projector 80 illustrated in
The IR/MC module 78 is configured to handle data processing tasks associated with image recognition and motion control, including gesture recognition. The IR/MC module 78 may also include hardware and/or software for voice and sound recognition. Generally, the IR/MC module 78 will work in conjunction with the sensors 72 and/or user interface system controller 70 to process, for example, image data collected by the sensors 72. Because image recognition and motion control can include complex and resource-intensive data processing it may be advantageous to include in the user interface system 58 dedicated and specialized hardware, software or both to offload that data processing from the user interface system controller 70 or other general purpose controllers.
Recognized gestures may include hand location, hand orientation, hand posture, hand movement, arm location, arm orientation, arm posture, arm movement, finger location, finger orientation, finger posture, finger movement, leg location, leg orientation, leg posture, leg movement, foot location, foot orientation, foot posture, foot movement, head location, head orientation, head movement and facial expressions. The gesture recognition functions may be relatively simple, recognizing only a few simple gestures, or more complex, recognizing many gestures involving various body parts.
Gesture recognition technology is known in the art and may be implemented using any of various techniques. A first method of implementing gesture recognition involves the use of depth-aware cameras, such as structured light or time-of-flight cameras, to generate a depth map and create or estimate a three-dimensional representation of what is captured by the cameras. Another method involves using two two-dimensional cameras with a known spatial relationship and approximating a three-dimensional representation of images captured by the cameras. Yet another technique involves the use of controllers attached to parts of the human body (e.g., gloves, bracelets, rings) that detect position, movement or both. The IR/MC module 78 may include specialized mathematical algorithms for identifying human gestures using data captured by the sensors 72. The present invention may use known image and gesture recognition technology and techniques, including those discussed herein or others.
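As a rough, non-limiting sketch of the two-camera approach mentioned above, the following assumes a rectified stereo pair with a known focal length and baseline and estimates the three-dimensional position of a tracked point (for example, a fingertip) from the horizontal disparity between the two images; all numeric values are illustrative assumptions.

```python
# Minimal stereo-triangulation sketch: with two rectified cameras whose spatial
# relationship (baseline) and focal length are known, depth is estimated from
# the horizontal disparity between matching pixel locations in the two images.

def triangulate(u_left: float, u_right: float, v: float,
                focal_px: float, baseline_m: float,
                cx: float, cy: float) -> tuple[float, float, float]:
    """Return an (X, Y, Z) estimate in metres for a rectified stereo pair."""
    disparity = u_left - u_right            # pixel shift between the two views
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    z = focal_px * baseline_m / disparity   # depth from similar triangles
    x = (u_left - cx) * z / focal_px        # back-project to camera coordinates
    y = (v - cy) * z / focal_px
    return x, y, z


if __name__ == "__main__":
    # Hypothetical fingertip seen at column 660 in the left image and 640 in the right image.
    print(triangulate(u_left=660, u_right=640, v=360,
                      focal_px=800.0, baseline_m=0.12, cx=640.0, cy=360.0))
```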
The mobile machine may be operated as part of a group of objects including machines, fixed or stationary structures, or other objects that are interconnected via a communications network. This networked group of objects may share data to enable machine coordination and object awareness, among other things. Data may be shared, for example, to coordinate work and avoid collisions.
In the construction industry the networked group of objects may include tractors, bulldozers, scrapers, articulated trucks, compactors, excavators, graders, cranes, surveying equipment or combinations thereof. In one exemplary scenario a group of networked bulldozers and scrapers are moving and working soil to prepare a construction site. In the agriculture industry the networked group of objects may include tractors, combine harvesters, windrowers, sprayers, particulate spreaders, grain storage bins, grain driers and barns. In one exemplary scenario illustrated in
Each of the objects may be equipped with a communications gateway similar to the gateway 68 described above to enable wired or wireless communications. Various types of networks may be used to enable communications and data transfer between the objects including direct machine-to-machine communication, a mesh network, or a wide area network such as where each of the objects is connected to the Internet and communicates with each of the other objects via the Internet. Multiple networking schemes may be employed for a single group of objects, such as where a first object communicates with a second object via the Internet but communicates with a third object via direct wireless communications.
In some embodiments of the invention, the user interface system 58 is operable to automatically select and present computer-generated or computer-enabled user interface elements that are associated with objects within the machine's operating environment. This feature may optimize display space and machine operation by presenting only those user interface elements that are needed by the operator at the time they are needed.
The wireless communication received via the communications gateway 68 may include information about an independent object. The wireless communication may originate from the independent object, such as where a first mobile machine sends a wireless communication directly to a second mobile machine. Alternatively, the wireless communication may not originate from the independent object, such as where the wireless communication originates from another mobile machine but includes information about an attachment coupled with the other mobile machine, or where the wireless communication originates from another mobile machine but includes information about a fixed structure that is in communication with the other mobile machine. If multiple objects are communicatively interconnected via a mesh network a communication may originate from a first object and be received and retransmitted by one or more other intermediate objects before being finally received by the mobile machine.
Once the machine has received the communication the communications and control system 52 identifies the object using information received in the wireless communication that was received via the communications gateway 68. The communication may include, for example, an identifier with information about the object. If the object is a mobile machine, the communication may include such information as the make and model of the machine, a specific machine identifier such as a machine name assigned by the operator, the name of a person operating the machine, as well as information about the machine's operating state.
Once the communications and control system has identified the object, the user interface system 58 selects a user interface element associated with the object. Data relating to a plurality of user interface elements may be stored, for example, in the storage component 64 associated with the system 52, and may define an appearance, behavior, or both of each of the user interface elements. A user interface element may be associated with the object if it presents information about the object, allows an operator to interact with the object, or both. By way of example, a user interface element may present information about an object if the object is a grain storage bin and the user interface element includes an indication of an amount or type of grain in the storage bin, a temperature of the bin, a level of moisture in the bin, or a combination thereof. In that scenario, the user interface element may allow the machine operator to interact with the grain storage bin if the machine operator can activate or deactivate a ventilation system associated with the grain storage bin via the user interface element in the mobile machine. This functionality may be enabled by the communications and control system 52 communicating a command to the grain storage bin in a wireless communication sent via the communications gateway 68.
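The identification and selection steps described above might be sketched, in a simplified and hypothetical form, as follows; the message fields, object types and stored element definitions are assumptions for illustration only, not a prescribed message format.

```python
# Illustrative sketch: an incoming communication carries an identifier and
# status fields, the object is identified from that identifier, and a stored
# user interface element definition associated with the object type is selected.
import json

# Hypothetical stored definitions keyed by object type (appearance/behaviour data).
STORED_UI_ELEMENTS = {
    "grain_storage_bin": {"widget": "bin_status_panel",
                          "fields": ["fill_level", "grain_temp", "moisture"],
                          "actions": ["toggle_ventilation"]},
    "grain_cart":        {"widget": "cart_status_panel",
                          "fields": ["fill_level"],
                          "actions": []},
}


def select_ui_element(raw_message: str) -> dict:
    """Identify the object in a received communication and return its UI element."""
    message = json.loads(raw_message)
    object_type = message["object_type"]             # e.g. "grain_storage_bin"
    element = dict(STORED_UI_ELEMENTS[object_type])  # copy the stored definition
    element["object_id"] = message["object_id"]      # operator-assigned name, etc.
    element["status"] = message.get("status", {})    # current operating state
    return element


if __name__ == "__main__":
    incoming = json.dumps({"object_type": "grain_storage_bin",
                           "object_id": "Bin 3",
                           "status": {"fill_level": 0.62, "grain_temp": 18.5}})
    print(select_ui_element(incoming))
```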
Rather than select a user interface element from the storage component 64, the user interface system controller 70 may receive data defining the user interface element in the communication received via the communications gateway 68. Each object in the network may store data defining its own user interface element and communicate that data to each other machine in the network. The user interface element may also be stored in a remote location not associated with the machine or the object, wherein the machine retrieves the user interface element from the remote location via the communications gateway 68.
Once data associated with the user interface element is selected or received, the user interface system 58 presents the user interface element via one or more of the display components 76. A user interface element presented via a display component may be a “soft” element meaning that it is generated by software and may be presented, for example, on a display console or on a heads-up display. These user interface elements may also be dynamic in that the system 58 automatically adds, removes, revises and/or updates or otherwise modifies user interface elements during operation of the mobile machine, as explained below in greater detail.
Each user interface element may be presented so that its location is associated with or corresponds to the location of the independent object, such as a side of the machine that is generally between the operator's seat 32 and the location of the independent object. Placement of user interface elements is discussed in greater detail below. Alternatively, the system 58 may place the user interface element in a location indicated by the machine operator, and the operator may change the location of any user interface element at any time. If the user interface element is presented on a touchscreen or a heads-up display surface, for example, the operator may “drag” the user interface element from one location to another by placing a finger in contact with the display surface at or near the location of the user interface element and dragging the finger along the display surface to the desired location of the user interface element. Other methods of selecting a user interface location, such as selecting a user interface element layout template, are within the ambit of the present invention.
The user interface system 58 may determine a system or controllable component of the mobile machine associated with the independent object, and the user interface element may enable an operator to manipulate the system or controllable component of the mobile machine associated with the independent object. The system or controllable component may be an auxiliary system such as a power take off, a linkage such as a three-point hitch, or a hydraulic coupler. By way of example, if the mobile machine is a combine harvester and the independent object is a grain truck or a grain cart, the system associated with the independent object may be a grain unload auger of the combine harvester and the user interface element may enable an operator to deploy, activate, deactivate and stow the unload auger by interacting with the user interface element.
The user interface element may be dynamic and continuously updated to reflect newly-received information from or about the independent object. By way of example, the user interface system controller 70 may receive additional information about the independent object in wireless communications received via the communications gateway 68. Such communications may occur periodically such as every five minutes, every minute, every thirty seconds or every fifteen seconds, or may occur continuously or substantially continuously. Alternatively, the communications may occur only as needed, such as only when new or additional information about the independent object is available. Thus, if the independent object is initiating the communications, it may detect when a status relating to the information has changed and initiate a communication only when the status has changed.
With reference to
The user interface element 110 illustrated in
The machine operator may interact with the user interface element to request additional information from or about the independent object. With continued reference to the exemplary scenario set forth in the preceding paragraphs, the tractor operator may expand a drop-down menu in the user interface element 110 by pressing or indicating the down arrow 112, as illustrated in
Other exemplary scenarios may include user interface elements that indicate the reservoir fill level of a sprayer, the fill level of a fuel tank, or engine operating parameters such as temperature. Virtually any information relating to the object, including a machine operating state, may be included in the user interface element.
The machine may also be configured to communicate information about itself or an attachment associated with itself to the independent object, such as in a wireless communication via the communications gateway 68. That information may include a machine identifier, data defining a user interface element, machine state or operating parameters. The information may be communicated automatically or upon request by the independent object. The machine may collect information relating to an attachment associated with the machine and communicate the information about the attachment to the independent object.
Referring again to the exemplary scenario set forth in the preceding paragraphs and illustrated in
In some embodiments of the invention, the machine only presents user interface elements relating to objects that are communicating directly with the mobile machine. As additional objects begin communicating with the machine, the user interface system 58 adds a user interface element for each of the additional objects. As objects in communication with the mobile machine stop communicating with the machine (for example, because they are shut down or are out of communications range), the user interface system 58 removes any user interface elements associated with those objects. In this way, the user interface system 58 automatically manages the user interface by presenting user interface elements only for objects that are presently in communication with the machine, including only desirable or relevant user interface elements.
With reference to
The user interface system 58 may continuously and automatically determine the distance to each of the objects 116 and revise the user interface to include user interface elements relating only to those objects that are within the desired range of the machine. Thus, as some of the objects 116 move into the range 118, the system 58 adds user interface elements associated with those objects, and as some of the objects 116 move out of the range 118 the system 58 removes user interface elements associated with those objects. This function may be particularly desirable, for example, where the objects are interconnected via a mesh network such that some objects in the network are located a considerable distance from other objects, or where the objects are networked via the Internet (or other wide area network) that is not limited by geographic distance. The predetermined range may be fifty meters, one hundred meters, one hundred and fifty meters, two hundred meters, three hundred meters, four hundred meters or five hundred meters, and the machine operator may adjust the predetermined range at any time during operation of the machine.
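A minimal sketch of the range-based filtering described above follows, assuming the machine and each networked object report latitude/longitude positions; the coordinates, object names and the two-hundred-meter range are illustrative assumptions only.

```python
# Illustrative sketch: compute the great-circle distance from the machine to
# each networked object and keep user interface elements only for objects
# inside the operator-selected range.
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000.0


def distance_m(lat1, lon1, lat2, lon2) -> float:
    """Great-circle (haversine) distance in metres between two lat/lon points."""
    p1, p2 = radians(lat1), radians(lat2)
    dphi, dlmb = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))


def objects_in_range(machine_pos, objects, range_m=200.0):
    """Return the names of objects whose UI elements should currently be shown."""
    lat, lon = machine_pos
    return [name for name, (olat, olon) in objects.items()
            if distance_m(lat, lon, olat, olon) <= range_m]


if __name__ == "__main__":
    machine = (39.0421, -94.7208)
    nearby = objects_in_range(machine, {
        "grain cart 1": (39.0427, -94.7201),   # roughly 90 m away
        "combine 2":    (39.0500, -94.7400),   # well outside 200 m
    })
    print(nearby)   # UI elements are added for these objects, removed for the rest
```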
Another method of managing user interface elements involves presenting user interface elements only for objects that are within a geographic boundary. The operator may define the geographic boundary by, for example, drawing the boundary on a map that is presented via the user interface system 58.
In some embodiments of the invention, the system determines when an implement or other attachment is being used with the machine and automatically presents a user interface element associated with the attachment. When the attachment is no longer being used with the machine the system may automatically remove the user interface element associated with the attachment. This aspect of the invention helps to minimize the number of user interface elements presented via the user interface by only presenting user interface elements for those machine systems that are being used. These user interface elements have traditionally existed as physical input mechanisms such as dials, buttons, switches and levers that cumulatively occupied a large portion of the operator's area in the operator cabin. By dynamically adding and removing user interface elements on an as-needed basis, the user interface may be much less cluttered.
By way of example, if the machine has a hydraulic system with multiple couplers for interfacing external systems or devices with the on-board hydraulic system, no user interface elements associated with the hydraulic couplers need be present if nothing is connected to any of the couplers. Similarly, if four of eight hydraulic couplers are in use, only four user interface elements, corresponding to the four couplers in use, may be needed rather than elements for all eight couplers. Similarly, if the machine includes a front or rear linkage system, such as a three-point hitch system, and nothing is attached to the linkage system, there is no need for a user interface element for controlling operation of the three-point hitch. Similarly, if the machine includes a front or rear power take off system and nothing is attached to the power take off drive, there is no need for a user interface element for controlling operation of the power take off system.
In one embodiment of the invention, this functionality is enabled when the system automatically determines when an attachment is coupled with the machine, identifies a controllable component of the machine associated with the attachment and, after identifying the controllable component, presents a user interface element via the user interface for enabling the operator to manipulate the controllable component.
The system may automatically determine when an attachment is coupled with the machine or may receive an input from a user indicating that the attachment is coupled with the machine. The system may use internal or external sensors to determine when an attachment is coupled with the machine. A controllable component of the machine is a device or system built into the machine which can be controlled by user input or manipulation. Examples of controllable components include an onboard hydraulic system coupled with an external hydraulic system, a front linkage, a rear linkage and a power take off. To identify a controllable component associated with the attachment, the system may use the sensor data to identify the controllable component or may reference stored data, such as a look-up table.
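One simplified, hypothetical way to implement the look-up-table approach mentioned above is sketched below; the attachment names and component lists are assumptions for illustration rather than an actual mapping used by any machine.

```python
# Illustrative look-up-table sketch: once a coupled attachment has been
# identified (by sensor or operator input), the table maps it to the machine's
# controllable components so that only the corresponding user interface
# elements are presented.
ATTACHMENT_COMPONENTS = {
    "rotary_mower": ["rear_linkage", "rear_pto"],
    "front_loader": ["front_linkage", "hydraulic_coupler_1", "hydraulic_coupler_2"],
    "round_baler":  ["drawbar", "rear_pto", "hydraulic_coupler_1"],
}


def ui_elements_for(attached: list[str]) -> list[str]:
    """Return the controllable components that need a user interface element."""
    components: list[str] = []
    for attachment in attached:
        for component in ATTACHMENT_COMPONENTS.get(attachment, []):
            if component not in components:      # avoid duplicate elements
                components.append(component)
    return components


if __name__ == "__main__":
    # Only the components used by the coupled attachments get UI elements;
    # unused couplers, linkages and power take offs are left off the display entirely.
    print(ui_elements_for(["round_baler"]))
```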
For simplicity embodiments of the invention have been described and illustrated with reference to a single user interface element corresponding to a single independent object. The invention, however, is not so limited. The machine network may include multiple machines and each machine may include user interface elements for each of the other machines, such that if a work area includes a total of eight machines each machine may have up to seven user interface elements associated with other machines.
In some embodiments of the invention, the user interface system 58 is operable to automatically and intelligently place the computer-generated user interface elements in strategic locations for easy and convenient use by the machine operator.
With reference to
The item of interest may be an independent object or an attachment coupled with the mobile machine, both of which are described above. If the item of interest is an independent object, the machine may determine the location of the item of interest by receiving a communication via the communications gateway 68 that includes location information for the item of interest. As explained above, the communication may be received directly from the item of interest, from another object in the region and/or through a wide area network. If the item of interest is another mobile machine, that mobile machine may collect location information from an onboard position determining component and communicate that location information to the machine. If the item of interest is a stationary structure, such as a grain storage bin, the item's location may be stored locally in the machine, such as in the storage component 64.
The item of interest may also be an attachment, such as an implement pulled by or mounted on the mobile machine. One exemplary attachment is the implement 20 illustrated in
If the attachment is pivotally coupled with the mobile machine via a drawbar, determining the position of the attachment relative to the machine may present challenges because the position of the attachment may not be related to the state of any on-board machine system such as a hydraulic system or linkage system. One exemplary method of determining a position of such an attachment is illustrated in
Exemplary scenarios are illustrated in
The scenario depicted in
The item of interest may be a component of the mobile machine that is external to the operator cabin, such as an engine, hydraulic motor, fuel tank or tire.
The display location may be on an LCD console or a heads-up display surface. The selected display location may be between a seat in the operator cabin of the machine and the location of the item of interest. One advantage of placing the user interface element between the seat and the location of the item of interest is that when an operator is sitting in the seat and turns his or her head to face the item of interest, the user interface element is within the operator's field of view, such that the operator need not look away from the item of interest to view the user interface element. A user interface element is between the seat and the location of the item of interest if it is placed on a side of the operator cabin that is between the seat and the location of the item of interest, and need not lie directly on a line connecting the seat and the location of the item of interest.
Determining the location of the item of interest relative to the operator cabin may involve determining a geographic location of the mobile machine, determining an orientation of the mobile machine, and determining the geographic location of the item of interest. The geographic location of the mobile machine may be determined by the position determining system 56. The orientation of the machine may be determined using an onboard device such as a digital compass, may be determined using successive geographic locations of the mobile machine in a manner similar to that explained above, or may be determined using a combination of the two. Determining the geographic location of the item of interest may involve, for example, receiving location information from the item of interest or retrieving location information from an on-board or remote storage device.
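A minimal sketch of this relative-location determination follows, assuming the machine's position and heading and the item's position are known; the mapping of relative bearings to cabin sides and the example coordinates are illustrative assumptions.

```python
# Illustrative sketch: from the machine's position and heading and the item's
# position, compute the bearing to the item relative to the machine so the user
# interface system can pick the matching side of the operator cabin.
from math import radians, degrees, sin, cos, atan2


def bearing_deg(lat1, lon1, lat2, lon2) -> float:
    """Initial great-circle bearing from point 1 to point 2, in degrees from north."""
    p1, p2, dlmb = radians(lat1), radians(lat2), radians(lon2 - lon1)
    y = sin(dlmb) * cos(p2)
    x = cos(p1) * sin(p2) - sin(p1) * cos(p2) * cos(dlmb)
    return degrees(atan2(y, x)) % 360.0


def cabin_side(machine_pos, machine_heading_deg, item_pos) -> str:
    """Map the item's relative bearing to a display side of the operator cabin."""
    rel = (bearing_deg(*machine_pos, *item_pos) - machine_heading_deg) % 360.0
    if rel < 45 or rel >= 315:
        return "front window"
    if rel < 135:
        return "right window"
    if rel < 225:
        return "rear window"
    return "left window"


if __name__ == "__main__":
    # An object roughly due east of a machine that is heading north falls on the right window.
    print(cabin_side((39.0421, -94.7208), 0.0, (39.0422, -94.7195)))
```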
In some embodiments, the user interface element may be placed directly in, or proximate to, the operator's line of sight as the operator looks toward the item of interest. The line of sight is between the operator's head and the item of interest such that as the operator looks at the item of interest he or she sees the user interface element superimposed over the item of interest. To place a user interface element within the operator's line of sight with respect to an item of interest the user interface system may determine a location of the operator's head, create a virtual line from the operator's head to the location of the item of interest (similar to the line 148 but connecting the operator's head with the item of interest), determine where the virtual line intersects a display surface, and then place the user interface element at the location where the virtual line intersects the display surface. If the user interface element is to be placed proximate the line of sight but not on it, the same method may be used but the user interface element may be placed near the intersection of the virtual line and the display surface rather than at the intersection. Placing the user interface element near the intersection may involve placing it so that an edge of the element is spaced from the point of intersection by five centimeters, ten centimeters or fifteen centimeters.
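The line-of-sight placement described above can be sketched, under simplifying assumptions, as a line-plane intersection in a cabin-fixed coordinate frame; the frame, the planar window model and the example coordinates are assumptions for illustration only.

```python
# Illustrative geometry sketch: cast a virtual line from the operator's head
# position to the item of interest and find where it crosses a planar heads-up
# display surface; the user interface element is then drawn at (or offset
# slightly from) that point.
import numpy as np


def line_plane_intersection(head, item, plane_point, plane_normal):
    """Return the point where the head-to-item line crosses the display plane, or None."""
    head, item = np.asarray(head, float), np.asarray(item, float)
    p0, n = np.asarray(plane_point, float), np.asarray(plane_normal, float)
    direction = item - head
    denom = direction.dot(n)
    if abs(denom) < 1e-9:                     # line is parallel to the display surface
        return None
    t = (p0 - head).dot(n) / denom
    if t < 0 or t > 1:                        # surface is not between head and item
        return None
    return head + t * direction


if __name__ == "__main__":
    # Head at the seat, item of interest well ahead of the machine, and a front
    # window modelled as the vertical plane x = 1.0 in the cabin frame.
    print(line_plane_intersection(head=(0.0, 0.0, 1.4),
                                  item=(10.0, 2.0, 1.0),
                                  plane_point=(1.0, 0.0, 0.0),
                                  plane_normal=(1.0, 0.0, 0.0)))
```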
The location of the operator's head may be determined by the sensors 72 and the IR/MC component 78 described above. The IR/MC component 78 may be configured, for example, to recognize the operator's head, and if the operator's head is detected from two or more angles (e.g., using two or more cameras), the location of the operator's head in three-dimensional space may be calculated or estimated. Object detection and location technology is known in the art and any of various methods may be used to determine the location of the operator's head within the cabin 14. The user interface system 58 may track the location of the operator's head as it moves, and may revise the location of the user interface element in real time to reflect changes in the location of the operator's head. Thus, if the user interface element is placed directly in the operator's line of sight with respect to the item of interest and the operator leans forward in the seat, the user interface element would also move so that it remains within the operator's line of sight with respect to the item of interest. If the operator then leans back in the seat, the user interface system would again move the user interface element to follow the location of the operator's head.
In some embodiments of the invention the user interface system 58 estimates the location of the operator's head rather than calculating it. Rather than use sensors and image processing to detect the operator's head, for example, the system may simply estimate the position of the operator's head based on an average operator height. Additionally, the user interface system may prompt the operator to submit certain inputs to help estimate a location of the operator's head. Such inputs may include a calibration input wherein the operator is prompted to identify a location on one or more display surfaces that is used to determine the location of the operator's head and use that location as an estimate moving forward. An example of this would be where the system 58 prompts the operator to move a graphic on a heads-up display surface until the graphic covers the engine compartment from the operator's perspective. Using the known location of the engine compartment and the placement of the graphic, the system 58 may estimate or determine the location of the operator's head. Repeating this process multiple times may increase the accuracy of the estimated location of the operator's head.
The user interface system 58 may generate the user interface element to follow the operator's field of view. The operator's field of view corresponds to the direction the operator is facing. If the operator is facing forward the operator's field of view is the front window 36 of the machine. If the operator is turned looking at an implement behind the machine the field of view is the rear window 42. The user interface system 58 may determine which direction the operator is facing by using face recognition technology and images captured from multiple image capture devices. If an image capture device placed at or near the front of the operator cabin 14 and facing the back of the cabin detects the operator's face, the system 58 determines that the operator is facing forward. If an image capture device placed at or near a left side of the operator cabin and facing the right of the cabin detects the operator's face, the system 58 determines that the operator is facing left. This functionality may be useful where the operator desires to keep a user interface element visible regardless of which way the operator is facing.
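A simplified, hypothetical sketch of this face-direction logic follows; the camera names and the face-detection callable stand in for whatever detection functions the IR/MC module 78 actually provides.

```python
# Illustrative sketch: whichever in-cabin camera currently reports a frontal
# face detection determines which window is treated as the operator's field of
# view, and the followed user interface element is moved to that surface.
from typing import Callable, Dict

# Each camera looks inward from one side of the cabin; a frontal face detected
# by a camera means the operator is facing toward that camera's side of the cabin.
CAMERA_TO_WINDOW = {
    "front_camera": "front window",   # camera near the front glass, facing rearward
    "rear_camera":  "rear window",
    "left_camera":  "left window",
    "right_camera": "right window",
}


def current_field_of_view(face_detected: Callable[[str], bool],
                          fallback: str = "front window") -> str:
    """Return the window the operator is facing, based on which camera sees the face."""
    for camera, window in CAMERA_TO_WINDOW.items():
        if face_detected(camera):
            return window
    return fallback                    # keep the element where it was if no face is seen


if __name__ == "__main__":
    # Simulated detections: only the camera on the right side sees a frontal face,
    # so a followed element (e.g., engine temperature) would move to the right window.
    detections: Dict[str, bool] = {"front_camera": False, "rear_camera": False,
                                   "left_camera": False, "right_camera": True}
    print(current_field_of_view(lambda cam: detections[cam]))
```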
In one exemplary scenario, the operator desires to monitor the engine temperature and therefore provides an input to the user interface system 58 indicating that an engine temperature user interface element is to remain in his or her field of view. As the operator begins operating the tractor the system detects the user's face in a forward-facing position and determines that the first field of view is the front window of the tractor. The user interface system places the user interface element corresponding to the engine temperature on the front window. During operation of the tractor the user interface system continuously monitors the position of the operator's face and, while the operator is facing forward, leaves the user interface element on the front window. The user interface system detects that the operator has turned his or her head to the right side and determines that the current field of view is the right side window of the tractor and places the user interface element on that window. Later the user interface system detects that the operator has turned his or her head so that it is again facing the front of the operator cabin, wherein the user interface system determines that the current field of view is the front window of the cabin and again places the user interface element on the front window.
The user interface system 58 may enable the operator to select how user interface elements are presented on the display surfaces. The user may indicate that user interface elements corresponding to independent objects and attachments are to be placed in the line of sight with respect to each associated object or attachment, or may indicate that all user interface elements are to be placed at a top or a bottom of the heads-up display surface to avoid interfering with the operator's view. Furthermore, the operator may indicate that all user interface elements be placed in fixed locations and not move at all, regardless of the operator's position or field of view.
The user interface system 58 may present multiple user interface elements, some associated with independent objects, some associated with attachments and/or some associated with machine components. Some of the user interface elements may be placed in the operator's line of sight, as explained above, some user interface elements may follow the operator's field of view outside his or her line of sight, and some user interface elements may be in fixed locations. The operator may determine how each user interface element is treated, as explained above.
Embodiments of the invention leverage components of the user interface system 58 to enable advanced user inputs, including gestures and sounds, that may be defined by the operator. The operator may configure the user interface system 58 to detect virtually any gesture or sound and to perform virtually any action or function associated with the mobile machine. Actions or functions associated with operator-defined inputs may include mechanical movement or operation of the machine, such as controlling the machine's speed and direction, controlling the machine's engine speed, or controlling auxiliary functions including power take off and linkage. Actions or functions associated with operator-defined inputs may also include non-mechanical functions such as adjusting user interface settings, communicating with other machines or remote computing systems, retrieving information from the communications and control system, and operating internal or external lights, to name a few. Furthermore, actions or functions associated with operator-defined inputs may be performed on or by attachments coupled with the machine.
The user interface system 58 may be configured to detect and identify gestures made with the operator's fingers, hands, arms, legs and feet, as explained above. The operator may program the actions or functionality associated with particular gestures or sounds by providing an input to the user interface system 58 to put the system 58 in an input recording mode wherein the system 58 detects and records operator movement or sound. The recorded movement or sound is then assigned to a function or action performed by the machine, such as one of the functions or actions described above. The operator may submit an input indicating the function or action assigned to the recorded movement or sound. The operator may submit the input indicating the function either by selecting a predefined function, such as turning on the external lights or slowing the forward speed by ten percent, or by defining an action or sequence of actions. Defining an action or sequence of actions may include placing the user interface system in an action recording mode and performing an action or sequence of actions that are then recorded by the user interface system 58. It may be desirable to define an action where a series of steps is repeated, such as where a machine is working in a field and performs a series of steps each time the machine enters and exits a headland of the field or where a machine with a bucket performs a series of steps to dump the contents of the bucket into a truck. In both of those examples the same steps may be repeated such that the operator may record the series of steps involved, assign the steps to a simple user input, and then subsequently perform the steps using the simple user input.
The operator has the freedom to define virtually any movement or sound as an input and to associate the movement or sound with virtually any machine function or action. This allows the operator to select input methods that he or she is most comfortable with and to assign those inputs to actions that are most frequently performed. This may greatly increase the convenience of operating the machine and reduce operator fatigue.
Examples of movement or sound that may be recorded as input, and examples of associated actions or functions, include the following: one hand clap stops the machine from moving; two consecutive hand claps stop the machine from moving and deactivate all auxiliary functions; three consecutive hand claps immediately shut down all engines and motors on the machine; extending both hands forward with palms facing upward and moving both hands in an upward motion causes a combine harvester to raise a header attached to the combine, and the same motion with palms facing downward and both hands moved in a downward motion causes the header to move downward; extending both hands forward with palms facing upward and moving both hands in an upward motion causes a loader bucket to be raised, and the same motion with palms facing downward and both hands moved in a downward motion causes the loader bucket to be lowered; the spoken words "entering headland" may cause the header of a combine harvester to raise and functions associated with the header to be slowed or stopped; the spoken words "entering crop" may cause the header of a combine harvester to be returned to a harvesting position and functions associated with the header to be activated; pointing a finger to the right causes a wayline nudge to the right and pointing to the left causes a wayline nudge to the left; the spoken words "nudge right" cause a wayline nudge to the right and the spoken words "nudge left" cause a wayline nudge to the left; and extending both hands forward, closing the hands to form fists and moving both hands forward simultaneously causes the machine to travel faster, while moving both hands back simultaneously causes the machine to travel slower. These are but a few examples.
An exemplary method of implementing the advanced, programmable user inputs discussed herein is illustrated in
The system 58 detects a second movement of the operator, as depicted in block 154, and records the second movement, as depicted in block 156. The system 58 may record the second movement in response to a user input indicating an input recording mode, as explained above. The system 58 assigns the second operator movement to a second action, as depicted in block 158. The second action is different than the first action but, like the first action, may be associated with a component or system of the machine or with an attachment. The system then detects a third operator movement, as depicted in block 160, and compares the third movement to stored movements. If the third movement is the same as the first movement, the machine performs the first action, as depicted in block 162. If the third movement is the same as the second movement, the machine performs the second action, as depicted in block 164.
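The record-assign-match flow described above might be sketched, in a deliberately simplified form, as follows; the fixed-length feature sequences, the distance measure and the threshold are illustrative assumptions rather than a prescribed recognition method.

```python
# Illustrative sketch: an operator movement is captured as a short sequence of
# feature vectors, stored against an action, and later detections are matched
# to the closest stored movement before the associated action is performed.
import numpy as np


class GestureLibrary:
    def __init__(self, match_threshold: float = 1.0):
        self.entries: list[tuple[np.ndarray, str]] = []   # (recorded movement, action name)
        self.match_threshold = match_threshold

    def record(self, movement, action: str) -> None:
        """Store a recorded operator movement and the action assigned to it."""
        self.entries.append((np.asarray(movement, float), action))

    def match(self, movement):
        """Return the action of the closest stored movement, if it is close enough."""
        movement = np.asarray(movement, float)
        best_action, best_dist = None, float("inf")
        for stored, action in self.entries:
            if stored.shape != movement.shape:             # crude guard for this sketch
                continue
            dist = float(np.linalg.norm(stored - movement)) / stored.size
            if dist < best_dist:
                best_action, best_dist = action, dist
        return best_action if best_dist <= self.match_threshold else None


if __name__ == "__main__":
    lib = GestureLibrary()
    raise_gesture = np.array([[0.0, 0.0], [0.0, 0.2], [0.0, 0.4]])   # hands moving upward
    lower_gesture = np.array([[0.0, 0.4], [0.0, 0.2], [0.0, 0.0]])   # hands moving downward
    lib.record(raise_gesture, "raise_header")       # first movement -> first action
    lib.record(lower_gesture, "lower_header")       # second movement -> second action
    observed = np.array([[0.0, 0.05], [0.0, 0.25], [0.0, 0.45]])     # third movement
    print(lib.match(observed))                      # matches the first -> "raise_header"
```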
In some embodiments of the invention, the machine operator may save user interface preferences and settings in a user profile in the communications and control system 52 and retrieve the preferences and settings at a later time, thus saving the operator the time and effort involved in setting up the user interface system 58 to his or her preferences each time the operator uses the machine. The user interface preferences and settings in the user profile are unique to the operator, such that each operator may have his or her own profile and multiple user profiles may be stored on (or are retrievable by) the machine.
User interface preferences may include how and where user interface elements are presented, which gestures and sounds are used as inputs, and which actions those inputs correspond to. Taller machine operators may desire to place user interface elements near the bottom of the windows to avoid obstructing their view, while shorter machine operators may desire to place user interface elements near the top of the windows for the same reason. In some instances machine operators may desire for some user interface elements to be within his or her line of sight with respect to items of interest outside the operator cabin, and may desire for some user interface elements to be visible but in peripheral locations.
To associate user interface preferences and settings with a user profile, the system 58 may identify the machine operator, associate the machine operator with an operator profile and record the preferences and settings submitted by the machine operator. The system 58 may identify the operator automatically, such as where the system uses facial recognition or other biometric recognition techniques, or may employ manual means to identify the operator such as where the system prompts the operator to submit a user name or other identifying information via an element of the user interface.
The user profile may roam from machine to machine and may include information specific to the machine the operator is presently operating. If multiple machines are interconnected via a communications network, as illustrated in
Some user profile settings and preferences may be applicable to more than one machine, while others may be applicable to only a single machine. User profile preferences and settings relating to the operation of a grain unload mechanism for a combine harvester may be identical across multiple types of combine harvesters. User profile preferences and settings relating to the operation of the header of the combine harvester may be different for different headers, for different types of crops being harvested, or both.
In an exemplary scenario, a user operating a tractor adjusts the user interface settings and preferences, including selecting a number of user interface elements relating to machine components to be presented, placing a first group of those user interface elements along the top edges of the cabin windows (as part of a heads-up display), and placing another group of those user interface elements in locations proximate the corresponding machine components. The operator may move one or more of the user interface elements to desired locations on the display surfaces using a method described above. During operation of the machine the user interface system automatically presents user interface elements relating to independent objects in the vicinity of the tractor, including other mobile machines and/or fixed structures. As the user interface system 58 presents each user interface element the operator determines how the element will be presented, such as a size, appearance and location of the element. The user interface system 58 may present each user interface element according to a default size and location (for example, at the top edges of the windows), and the operator may make changes to the size and placement as desired. The operator may place some of the user interface elements into a line of sight with respect to the corresponding object, and may move others of the interface elements into positions proximate the line of sight.
The operator may also configure sound inputs, if the user interface system is configured to receive sound inputs. For example, the operator may program the user interface system to stop movement of the machine upon detecting a single clap, stop movement of the machine and any auxiliary systems upon detecting two consecutive claps, or immediately shut down all machine functions upon detecting three consecutive claps. Alternatively, the user interface system may be configured to detect spoken words, such as “stop” or “shut down.” As the operator submits these preferences the user interface system records them as part of the operator's profile, which may be stored locally on the machine, at a location remote from the machine, or both.
When the operator returns to the tractor at a later date and begins operating the machine, the user interface system identifies the operator, retrieves the operator's profile, and sets up the user interface according to the settings and preferences in the user profile. If the operator makes further adjustments to the settings and preferences the user interface system records the adjustments as part of the operator's profile.
This functionality may be implemented in multiple machines such that when the operator leaves a first machine and begins using a second machine, the second machine retrieves the operator's profile and implements the user interface according to the operator's preferences and settings. If the two machines are identical the user interface system 58 of the second machine may implement the user interface exactly as it was implemented in the first machine. If the second machine is different than the first machine, however, the second machine's user interface system 58 may implement a different set of preferences and settings than the first machine's. If the first machine is a combine harvester and the second machine is a tractor, for example, any user interface settings associated with operation of the harvester's header would not be applicable to the tractor and, thus, would not be implemented. However, user interface gestures or sounds associated with stopping or shutting down the tractor may be implemented on the combine harvester as well.
When a machine's user interface system 58 implements an operator's profile, it identifies and implements those portions of the profile that apply to that particular machine. Each user profile may indicate which portions of the profile relate to each machine. In this manner the profile roams from machine to machine, following the operator, and changes made to the profile in a first machine may carry over to one or more other machines.
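A minimal sketch of such a roaming profile follows, assuming each setting is tagged with the machines to which it applies; the setting names, machine types and values are hypothetical examples, not actual profile contents.

```python
# Illustrative sketch: a profile stores settings tagged with the machines
# (or "any") they apply to, and each machine's user interface system
# implements only the applicable subset.
OPERATOR_PROFILE = {
    "stop_on_single_clap":       {"machines": "any",                 "value": True},
    "ui_elements_at_window_top": {"machines": "any",                 "value": True},
    "header_height_gesture":     {"machines": ["combine_harvester"], "value": "palms_up_raise"},
    "rear_linkage_quick_raise":  {"machines": ["tractor"],           "value": "double_tap"},
}


def applicable_settings(profile: dict, machine_type: str) -> dict:
    """Return the subset of profile settings that apply to the given machine."""
    selected = {}
    for name, entry in profile.items():
        scope = entry["machines"]
        if scope == "any" or machine_type in scope:
            selected[name] = entry["value"]
    return selected


if __name__ == "__main__":
    # The same profile follows the operator: the tractor ignores header settings,
    # the combine ignores tractor-only settings, and shared settings apply to both.
    print(applicable_settings(OPERATOR_PROFILE, "tractor"))
    print(applicable_settings(OPERATOR_PROFILE, "combine_harvester"))
```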
The user interface system 58 may determine when particular attachments are coupled with a machine and implement user profile settings specifically associated with the attachment or attachments. The communications and control system 52 may determine when an attachment is coupled with the machine when the operator submits an input indicating that the attachment is coupled with the machine, or automatically using sensors to detect the presence of the attachment, as explained above. When the user interface system determines that an attachment is coupled with the machine it may present a user interface element associated with the attachment, as explained above. The operator may determine preferences and settings, such as size, appearance and placement, relating to that user interface element, as explained above. The user interface system 58 records the preferences and settings submitted by the operator as part of the operator's profile. When the attachment is decoupled from the machine the user interface system 58 removes the user interface element associated with the attachment, and when the attachment is coupled with the machine again the user interface system 58 presents the user interface element associated with the attachment according to the settings and preferences previously indicated by the operator. Furthermore, if the operator is subsequently operating a second machine and the same attachment is coupled with the second machine, the user interface system 58 of the second machine may present the user interface element according to the preferences and settings submitted by the operator when he or she was operating the first machine. In this manner user profile settings relating to attachments also follow operators from machine to machine.
The user interface system 58 may determine when a machine is performing a particular task and implement user profile settings specifically associated with the task. An operator may submit a first set of preferences and settings when performing a first task and a second set of preferences and settings when performing a second task, even if the machine and any attachments associated with the machine have not changed. The communications and control system 52 may be configured to detect the particular task and implement the user interface according to the operator's preferences and settings according to that task.
The exemplary embodiments of the invention described herein and illustrated in the drawings provide advantages over existing user interface systems for mobile machines. Embodiments of the invention provide user interface systems, for example, that make a greater amount of information available to machine operators in a manner that is natural, intuitive and easy to use. Furthermore, embodiments of the present invention may eliminate the need for some or all of the physical user interface elements of mobile machines including display consoles and physical control components such as buttons, knobs, switches and levers. An exemplary machine operator cabin is illustrated in
Although the invention has been described with reference to the preferred embodiment illustrated in the attached drawing figures, it is noted that equivalents may be employed and substitutions made herein without departing from the scope of the invention as recited in the claims.
Having thus described the preferred embodiment of the invention, what is claimed as new and desired to be protected by Letters Patent includes the following:
Under provisions of 35 U.S.C. §119(e), Applicant claims the benefit of U.S. Provisional Application No. 62/235,298, entitled USER INTERFACE FOR MOBILE MACHINES and filed Sep. 30, 2015.