This disclosure relates generally to an image display system and, more particularly, to a system utilizing augmented reality to inform personnel of movement of autonomously and remotely controlled machines.
Movable machines such as haul trucks, dozers, motor graders, excavators, wheel loaders, and other types of equipment are used to perform a variety of tasks. For example, these machines may be used to move material and/or alter work surfaces at a work site. The machines may perform operations such as digging, loosening, and carrying different materials at the work site.
In order to increase the efficiency of operation at a work site, movable machines are sometimes operated autonomously or semi-autonomously. In other instances, the machines are operated by remote control. In instances in which the machines are operated autonomously or by remote control, an operator will not be present on the machine. In instances of semi-autonomous operation, an operator may be present at the machine but the machine may take certain actions or make certain movements without the actions or movements being directly controlled by the operator.
Risks to personnel at work sites in which machines are operated autonomously, semi-autonomously, or by remote control may be increased as compared to work sites in which all machines are directly controlled by an operator on the machine. With respect to machines that operate autonomously or by remote control, personnel near such machines may be unaware of impending movement of the machines. For example, personnel may assume that a machine is not going to move since an operator is not present at or on the machine. With respect to machines that operate autonomously or semi-autonomously, such machines may rely on sensors to determine whether personnel are in proximity to such machines before beginning automated movement. However, unexpected movement by personnel cannot be anticipated by such sensors.
Augmented reality or augmented vision exists in which a person's perception or view of the real world is augmented with additional informational input. That input may include additional information about the scene currently viewed by the person. U.S. Patent Publication No. 2003/0014212 discloses an augmented reality system that introduces additional input to augment the perception of a surveyor who is performing surveying tasks. The augmented reality system utilizes a head-mounted apparatus and may determine the current position and direction of view of the surveyor. Based on the determined position and direction of view, the system accesses additional surveying information stored in a database and transmits the data to the head-mounted apparatus to augment the view through the apparatus. The foregoing background discussion is intended solely to aid the reader. It is not intended to limit the innovations described herein, nor to limit or expand the prior art discussed. Thus, the foregoing discussion should not be taken to indicate that any particular element of a prior system is unsuitable for use with the innovations described herein, nor is it intended to indicate that any element is essential in implementing the innovations described herein. The implementations and application of the innovations described herein are defined by the appended claims.
In an aspect, a display system for use at a work site may include a display device having a display screen and a pose sensor system associated with the display device for generating display device pose signals indicative of a position and an orientation of the display device relative to the work site. A controller may be configured to determine the position and orientation of the display device relative to the work site, generate machine control signals to control movement of a machine without an operator at the machine controlling the movement of the machine, generate an augmented reality image based upon the machine control signals and the position and orientation of the display device, and render the augmented reality image on the display screen.
In another aspect, a controller-implemented method of operating a display system at a work site may include determining the position and orientation of a display device relative to the work site based upon display device pose signals generated by a pose sensor system associated with the display device and generating machine control signals to control movement of a machine without an operator at the machine controlling the movement of the machine. The method may further include generating an augmented reality image based upon the machine control signals and the position and orientation of the display device and rendering the augmented reality image on a display screen of the display device.
In still another aspect, a system for use at a work site may include a machine having a propulsion system with the machine being movable without an operator at the machine controlling movement of the machine. The system may further include a display device including a display screen and a pose sensor system associated with the display device for generating display device pose signals indicative of a position and an orientation of the display device relative to the work site. A controller may be configured to determine the position and orientation of the display device relative to the work site, generate machine control signals to control movement of the machine without an operator at the machine controlling the movement of the machine, generate an augmented reality image based upon the machine control signals and the position and orientation of the display device, and render the augmented reality image on the display screen.
To coordinate and potentially control the activities and movement of the machines 10 and the personnel about the work site 100, a computerized or electronically implemented management system indicated generally at 110 may be based out of a fixed or mobile location such as command center 111. The management system 110 may be implemented by a control system 112 as shown generally by an arrow in
The term “controller” is meant to be used in its broadest sense to include one or more controllers and/or microprocessors that may cooperate in controlling various functions and operations. The functionality of the controller 113 may be implemented in hardware and/or software without regard to the functionality. The controller 113 may be operatively associated with one or more databases and/or data maps relating to the operating conditions and the operating environment of the work site 100 as well as the various machines 10 and personnel at the work site.
The control system 112 may be located at command center 111 and may also include components located remotely from the command center 111, such as on machines 10 or carried by personnel. As such, the functionality of control system 112 may be distributed so that certain functions are performed at the command center 111 and other functions are performed remotely.
To facilitate communication between and among the command center 111, the machines 10 and the personnel, the control system 112 may include a communications system such as wireless network system 114 for transmitting signals to and from each of the command center, the machines, and the personnel. Any suitable form of communications system may be used including, for example, radio frequency (RF) signals. The communications network may be based around a central hub whereby a plurality of transceivers communicate signals to a central router that routes the signals to the intended recipient or it may be a distributed network (i.e., peer-to-peer) whereby each transceiver may communicate directly with every other transceiver.
As used herein, a machine 10 operating in an autonomous manner operates automatically based upon information received from various sensors without the need for human operator input. As an example, a haul or load truck that automatically follows a path from one location to another and dumps a load at an end point may be operating autonomously. A machine 10 operating semi-autonomously includes an operator, either within the machine or remotely, who performs some tasks or provides some input, while other tasks are performed automatically based upon information received from various sensors. As an example, a load truck 12 that automatically follows a path from one location to another but relies upon an operator command to dump a load may be operating semi-autonomously. In another example of a semi-autonomous operation, an operator may dump a bucket from an excavator 11 in a load truck 12 and a controller may automatically return the bucket to a position to perform another digging operation. A machine 10 being operated manually is one in which an operator is controlling all or essentially all of the functions of the machine. A machine 10 may be operated remotely by an operator (i.e., remote control) in either a manual or semi-autonomous manner.
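For purposes of illustration only, the operating modes described above may be represented in software as a simple enumeration. The following Python sketch is a minimal, hypothetical representation; the mode names and the helper function are illustrative and do not appear in the disclosure.

```python
from enum import Enum, auto


class OperatingMode(Enum):
    """Illustrative labels for the operating modes described above."""
    AUTONOMOUS = auto()       # operates automatically without human operator input
    SEMI_AUTONOMOUS = auto()  # operator provides some input; other tasks are automatic
    MANUAL = auto()           # operator controls all or essentially all functions
    REMOTE_CONTROL = auto()   # operator controls the machine from a remote console


def movement_without_onsite_operator(mode: OperatingMode) -> bool:
    """Return True for modes in which the machine may move without an operator
    at the machine directly controlling that movement."""
    return mode in {OperatingMode.AUTONOMOUS,
                    OperatingMode.SEMI_AUTONOMOUS,
                    OperatingMode.REMOTE_CONTROL}
```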
Machine 10 may be any type of machine that performs some operation associated with an industry such as mining, construction, farming, transportation, or any other industry known in the art. For example, the machine may be an earth-moving machine, such as an excavator, wheel loader, load truck, dozer, backhoe, material handler, or any other type of working machine.
Machine 10 may include a cab 20 that an operator may physically occupy and provide input to control the machine. Cab 20 may include one or more input devices through which the operator issues commands to control the propulsion and steering of the machine as well as operate various implements associated with the machine.
Machine 10 may be equipped with a plurality of machine sensors 32, as shown generally by an arrow in
A position sensing system 33, as shown generally by an arrow in
Machine 10 may be controlled by a machine control system 30 as shown generally by an arrow in
The machine controller 31 may receive input command signals from control system 112, remote control input command signals from an operator using a remote control console 130 (
To increase safety and efficiency, personnel at work site 100 may be equipped with a display system including a display device that displays virtual text and/or images to augment the view seen by the personnel. Such an augmented image system is sometimes referred to as an augmented reality system or an augmented vision system.
Referring to
Head mountable display device 201 may include an adjustable strap or harness 203 that allows the head mountable display system to be worn about the head of the wearer. The head mountable display system 200 may include a visor or goggles 204 with transparent lenses that function as the display screen 202 through which the wearer views the physical environment. One or more image projectors 205 may direct images onto the display screen 202 within the wearer's line of sight.
The image projector 205 may be an optical projection system, light emitting diode package, optical fibers, or other suitable projector for transmitting an image. The display screen 202 may be configured to reflect the image from the image projector 205, for example, by a thin film coating, tinting, polarization or the like. The display screen 202 also may be a beam splitter, as will be familiar to those of skill in the art. Thus, while the display screen 202 may be transparent to most wavelengths of light, it reflects selected wavelengths such as monochromatic light back to the eyes of the wearer. Such a device is sometimes referred to as an “optical combiner” because it combines two images, the real world physical environment and the image from the image projector 205. In still other embodiments, it may be possible to configure the image projector (such as a laser or light emitting diode) to draw a raster display directly onto the retina of one or more of the user's eyes rather than projecting an image onto the display screen 202. Other configurations are contemplated. Regardless of the type of image projector 205, the projected images appear as an overlay superimposed on the view of the physical environment thereby augmenting the perceived environment.
A headset controller 206 may be provided on head mountable display device 201. The headset controller 206 may have wireless communications capabilities such as a transceiver to communicate with management system 110 or other aspects of control system 112 remote from the headset controller 206 such as machine controller 31. Headset controller 206 may operate independently or with the other controllers to control the projection of the images onto the display screen 202 and determine the images to be projected by the image projector 205.
The head mountable display system 200 may also include a headset pose system 207 used to determine the orientation and position or pose of the head of the wearer. For example, the headset pose system 207 may include a plurality of headset pose sensors 208 that generate signals that may be used to determine the pose of the wearer's head. In one example, the headset pose sensors 208 may be Hall effect sensors that utilize the variable relative positions of a transducer and a magnetic field to deduce the direction, pitch, yaw and roll of the wearer's head. In another example, the headset pose sensors 208 may interact with a positioning system such as a global navigation satellite system or a global positioning system to determine the pose of the wearer's head. The data obtained by the headset pose sensors 208 may be used to determine the specific orientation of the wearer's field of view relative to the work site 100.
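For purposes of illustration only, the following Python sketch shows one way sensor readings of this kind might be combined into a single pose expressed relative to the work site. The data structure, function name, and coordinate conventions are hypothetical and are not taken from the disclosure.

```python
import math
from dataclasses import dataclass


@dataclass
class HeadsetPose:
    """Position (site-local metres) and orientation (radians) of the wearer's head."""
    x: float
    y: float
    z: float
    yaw: float    # heading of the wearer's field of view
    pitch: float
    roll: float


def headset_pose_from_sensors(gnss_east: float, gnss_north: float, gnss_up: float,
                              yaw_deg: float, pitch_deg: float, roll_deg: float,
                              site_origin=(0.0, 0.0, 0.0)) -> HeadsetPose:
    """Combine a positioning-system fix with orientation-sensor readings into a
    single pose expressed relative to a chosen work site origin."""
    ox, oy, oz = site_origin
    return HeadsetPose(
        x=gnss_east - ox,
        y=gnss_north - oy,
        z=gnss_up - oz,
        yaw=math.radians(yaw_deg),
        pitch=math.radians(pitch_deg),
        roll=math.radians(roll_deg),
    )
```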
In another example of an augmented reality system, a heads-up display system 240 (
In still another example of an augmented reality system, personnel at work site 100 may be equipped with a portable computing device 250 depicted schematically in
The portable computing device 250 may also include a display screen 255, a wireless communications interface 256, a camera 257, a microphone 258, a global positioning sensor 259, and one or more input devices 260. In some instances, the display screen 255 may be configured as a touch screen to also operate as a portable device input. The wireless communications interface 256 may act as a communications channel between the control system 112 and the portable computing device 250 as well as between the portable computing device and any other system.
Central processing unit 251 may utilize data from the global positioning sensor 259 to determine the position of the portable computing device and communicate the position to the management system 110, and such position may be stored within a virtual work site map in the controller 113. In addition, the portable computing device 250 may display camera images from the camera 257 on display screen 255 and overlay augmented reality images on the camera images. In such case, a user of the portable computing device 250 may view the physical environment at which the portable computing device is directed while also viewing the virtual text and/or images that are displayed over the images on the display screen 255.
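For purposes of illustration only, the following Python sketch shows one simplified way an overlay position might be computed for a machine appearing in the camera image, given the device position and heading. The function, its parameters, and the assumed field of view are hypothetical.

```python
import math


def screen_position_of_machine(device_x, device_y, device_heading_rad,
                               machine_x, machine_y,
                               horizontal_fov_rad=math.radians(60)):
    """Return a normalized horizontal screen coordinate (0.0 = left edge,
    1.0 = right edge) at which an overlay for the machine could be drawn,
    or None if the machine lies outside the assumed camera field of view.
    Headings and bearings are measured counterclockwise, so a machine to the
    left of the heading maps toward the left edge of the screen."""
    bearing = math.atan2(machine_y - device_y, machine_x - device_x)
    # signed angular offset of the machine relative to the device heading
    offset = math.atan2(math.sin(bearing - device_heading_rad),
                        math.cos(bearing - device_heading_rad))
    if abs(offset) > horizontal_fov_rad / 2:
        return None
    return 0.5 - offset / horizontal_fov_rad
```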
Management system 110 may maintain a virtual map of the work site 100 and each machine 10 and the personnel in controller 113 based upon position information communicated wirelessly from each machine and person back to the management system. Using the positioning and possibly other work site information, the management system 110 may generate and relay various instructions to the various machines 10 and/or personnel at the work site 100. The instructions may include any suitable information helpful to the development of the work site 100, such as coordination instructions that may direct the machines 10 and personnel where to move or travel or what activity to perform. Further, because management system 110 may concurrently track the positions and movement of all the machines 10 and personnel about the work site 100, the coordination instructions may direct or suggest interaction between the machines and/or personnel.
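For purposes of illustration only, the virtual map described above might be represented as a simple keyed collection of the most recently reported positions. The following Python sketch is hypothetical and omits details such as the wireless transport, terrain data, and stale-entry handling.

```python
import time
from dataclasses import dataclass, field


@dataclass
class TrackedEntity:
    """Most recently reported site-local position of a machine or person."""
    entity_id: str
    kind: str            # e.g. "machine" or "person"
    x: float = 0.0
    y: float = 0.0
    last_update: float = 0.0


@dataclass
class VirtualWorkSiteMap:
    """Minimal stand-in for the virtual map the management system maintains."""
    entities: dict = field(default_factory=dict)

    def report_position(self, entity_id: str, kind: str, x: float, y: float) -> None:
        """Record a position update received wirelessly from a machine or person."""
        self.entities[entity_id] = TrackedEntity(entity_id, kind, x, y, time.time())

    def positions_of(self, kind: str):
        """Return all tracked entities of the given kind."""
        return [e for e in self.entities.values() if e.kind == kind]
```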
The management system 110 may also include information about operations being performed by the machines 10. This operational information may be particularly useful when operations are occurring that have limited or no real-time direct human control. For example, machines 10 that are being operated autonomously or semi-autonomously may start moving with little or no warning to nearby personnel. Still further, machines 10 that are being operated by remote control may undergo similar unexpected movement with little or no warning to such nearby personnel. Accordingly, management system 110 may include an augmented reality display system, generally indicated at 115, configured to warn nearby personnel of impending or ongoing movement of various machines 10. For movements that are autonomous or semi-autonomous, a certain number of movements may be planned ahead of time. Accordingly, the augmented reality display system may generate and display an augmented reality image that includes a visual representation of such future scheduled or planned machine actions. These actions may include moving the machine 10 in a specific direction at the work site 100 or moving a machine implement (e.g., a bucket of an excavator) in a specific manner.
Referring to
At stage 43, data may be generated by the sensors of the display device such as the headset pose sensors 208 of head mountable display device 201 and received by headset controller 206. The data may be used at stage 44 to determine the pose of the display screen, which corresponds to the pose of the wearer's head, relative to the work site 100. The headset controller 206 may communicate at stage 45 the pose of the head mountable display device 201 to the management system 110.
In an example in which the display device is a part of a heads-up display system 240 mounted on a machine 10, the pose of the display screen may be determined by data from the position or pose sensors of the machine 10 and data maps of the physical dimensions or configuration of the machine that correlate the pose of the machine to the position of windshield 243. In an example in which the display device is part of a portable computing device, the pose of the display screen may be determined from the global positioning sensor 259.
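For purposes of illustration only, the following Python sketch shows how a windshield (display screen) pose might be derived by composing the machine pose with a fixed offset taken from a dimensional data map. The offsets, function name, and planar simplification are hypothetical.

```python
import math


def windshield_pose_from_machine_pose(machine_x, machine_y, machine_heading_rad,
                                      windshield_offset_forward=2.0,
                                      windshield_offset_left=0.0):
    """Derive the site-frame position and facing direction of a heads-up display
    windshield from the machine pose and a hypothetical dimensional data map
    giving the windshield's offset in the machine's own frame (metres)."""
    cos_h, sin_h = math.cos(machine_heading_rad), math.sin(machine_heading_rad)
    # rotate the machine-frame offset into the site frame and add it to the machine position
    screen_x = machine_x + windshield_offset_forward * cos_h - windshield_offset_left * sin_h
    screen_y = machine_y + windshield_offset_forward * sin_h + windshield_offset_left * cos_h
    return screen_x, screen_y, machine_heading_rad
```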
Controller 113 may determine at stage 46 whether any machines 10 at the work site 100 are being operated in an autonomous or semi-autonomous manner or are being operated by remote control. For example, if machines 10 are operated autonomously or semi-autonomously, the operations being performed may be controlled by management system 110 through controller 113. In case the on-board machine controller 31 is controlling the autonomous or semi-autonomous operation, the machine controller 31 may communicate the commands utilized for the autonomous or semi-autonomous operation back to the management system 110 through controller 113. Still further, if the machine 10 is being operated by remote control, the remote control signals may be sent from the remote control console 130 through management system 110 and controller 113 to machine controller 31 or, if the remote control signals are sent from the remote control console 130 directly to the machine controller 31, the remote control signals may also or simultaneously be sent to the management system 110 through controller 113. Through these processes, the management system 110 may monitor the control and operation of the various machines 10 operating in an autonomous or semi-autonomous manner or by remote control.
At decision stage 47, the controller 113 may determine whether the display device, and thus a person using the display device, is within a predetermined distance from any of the machines 10 that are being operated at the work site 100 in an autonomous or semi-autonomous manner or by remote control. The predetermined distance may be dependent upon any of a number of factors. For example, the predetermined distance may be changed based upon the type of machine 10 being operated, the type of operation being performed, or the type of display device. In other words, the predetermined distance may vary depending on each of these factors. Controller 113 may contain a data map of predetermined distances that takes into consideration each of a plurality of factors. In one example, the predetermined distance may be relatively large when associated with a machine 10 that is stationary and is about to begin movement since a user of the display device may not have any other notice of the impending movement of the machine. In instances in which a machine 10 is already in motion, continued movement of an autonomously or semi-autonomously operated or a remotely controlled machine may require a smaller predetermined distance.
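For purposes of illustration only, such a data map of predetermined distances might be implemented as a simple lookup keyed by the relevant factors. The machine types and distance values in the following Python sketch are hypothetical placeholders, not values from the disclosure.

```python
# Hypothetical data map of predetermined distances in metres; real values would be
# established per site, machine type, operation, and display device.
PREDETERMINED_DISTANCE_M = {
    # (machine_type, about_to_start_moving): distance
    ("haul_truck", True): 75.0,
    ("haul_truck", False): 40.0,
    ("excavator", True): 50.0,
    ("excavator", False): 25.0,
}


def predetermined_distance(machine_type: str, about_to_start_moving: bool,
                           default: float = 50.0) -> float:
    """Look up the warning radius for a machine; a stationary machine that is
    about to begin moving gets a larger radius, as described above."""
    return PREDETERMINED_DISTANCE_M.get((machine_type, about_to_start_moving), default)
```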
If the display device is not within the predetermined distance at decision stage 47, the controller 113 may generate at stage 48 any desired augmented reality image, or no image at all. For example, the augmented reality image may identify certain objects at the work site 100 such as with highlighting and may further specify the distance to such objects. At stage 49, the controller 113 (whether at the command center 111, at a machine 10, or at a display device) may render or display the image on the display screen.
If the display device is within the predetermined distance at decision stage 47, the controller 113 may generate at stage 50 an image to highlight the movement of the autonomously or semi-autonomously operated or remotely controlled machine. In one example, the augmented reality image may include highlighting the machine in a flashing manner. In another example, the augmented reality image may depict the future or scheduled movement of the machine 10 or an implement of the machine. If the machine 10 is not within the field of view of the display device, the display device may generate another type of image to warn the user of the display device.
At decision stage 51, the controller 113 may determine whether any other images are to be displayed on the display screen of the display device. If additional images are desired as part of the augmented reality image or overlay, the additional images may be generated at stage 52 by controller 113 and then rendered or displayed at stage 49 on the display screen together with the image generated at stage 50. If no additional images are desired, the controller 113 may render or display on the display screen at stage 49 the augmented reality image generated at stage 50.
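For purposes of illustration only, the decision flow of stages 46 through 50 might be sketched in software as follows. The data structures and field names in this Python sketch are hypothetical; it is intended only to show how the distance check selects between a movement warning (stage 50) and other overlay content (stage 48) before rendering (stage 49).

```python
import math


def build_overlay(display_pose, machines, warning_radius_fn):
    """Sketch of decision stages 46-50: choose augmented reality content for one
    display device based on nearby machines that move without direct on-board
    operator control. All field names here are illustrative."""
    overlay = []
    px, py = display_pose["x"], display_pose["y"]
    for m in machines:
        if not m["moves_without_onsite_operator"]:        # stage 46
            continue
        distance = math.hypot(m["x"] - px, m["y"] - py)
        radius = warning_radius_fn(m["type"], m["about_to_start_moving"])
        if distance <= radius:                            # decision stage 47
            overlay.append({"kind": "movement_warning",   # stage 50
                            "machine_id": m["id"],
                            "flashing": True,
                            "planned_action": m.get("planned_action")})
        else:
            overlay.append({"kind": "object_highlight",   # stage 48
                            "machine_id": m["id"],
                            "distance_m": round(distance, 1)})
    return overlay                                        # rendered at stage 49
```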
The industrial applicability of the system described herein will be readily appreciated from the foregoing discussion. The foregoing discussion is applicable to systems operating at work sites 100 in which personnel are present and machines 10 are being operated in an autonomous or semi-autonomous manner or by remote control. The display system 200 may be used at a mining site, a landfill, a quarry, a construction site, a roadwork site, a forest, a farm, or any other area in which it is desired to improve the efficiency and visibility of a machine operator.
In one embodiment, the display system includes a display device such as head mountable display device 201 having a display screen 202 and a headset pose system 207 associated with the display screen. The headset pose system 207 generates pose signals indicative of the position and orientation of the display screen relative to the work site 100. A controller 113 may be configured to generate machine control signals to control movement of a machine 10 without an operator at the machine directly controlling the movement of the machine. In some instances, the machine 10 may be an autonomously operated machine. In other instances, the machine 10 may be operated by remote control. In both of those cases, the machine will not have an operator on the machine 10. A machine 10 being operated autonomously has machine control signals generated directly by a controller such as controller 113 at command center 111 or a machine controller 31 located on the machine. A machine 10 being operated by remote control has machine control signals generated by an operator but those signals are then processed and transmitted to the remote machine. In still other instances, the machine 10 may be operated semi-autonomously. In such case, an operator is at the machine but certain actions or movements of the machine occur without the operator controlling those movements.
The controller 113 may subsequently generate an augmented reality image based upon the machine control signals and render the augmented reality image on the display screen 202. The augmented reality image may be a warning or other notification of impending or current movement of a machine 10. The augmented reality image may include a visual representation of the next scheduled or planned machine action (such as moving the machine in a specific direction or moving the machine implement in a specific manner) that the autonomous or semi-autonomous machine is being commanded to execute.
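For purposes of illustration only, the following Python sketch shows one way machine control signals might be translated into overlay annotations representing the planned action. The command names and annotation fields are hypothetical and do not correspond to any particular control protocol.

```python
def annotation_for_control_signal(signal: dict) -> dict:
    """Translate a hypothetical machine control signal into an overlay
    annotation describing the commanded action, e.g. an arrow showing the
    planned direction of travel or a label describing implement motion."""
    if signal["command"] == "drive":
        return {"shape": "arrow",
                "from": signal["start"],    # (x, y) site coordinates
                "to": signal["target"],
                "label": "planned travel path"}
    if signal["command"] == "move_implement":
        return {"shape": "arc",
                "anchor": signal["machine_position"],
                "label": f"implement: {signal['motion']}"}
    return {"shape": "icon", "label": "machine active"}
```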
The display system provides a system to increase the safety of personnel at a work site. This may be accomplished by enhancing the awareness of the personnel of machines that are operating in an autonomous or semi-autonomous manner or are being operated by remote control. An augmented reality image or overlay may be displayed on a display screen associated with people at the work site to inform or alert them of impending or ongoing movements of such machines 10.
It will be appreciated that the foregoing description provides examples of the disclosed system and technique. All references to the disclosure or examples thereof are intended to reference the particular example being discussed at that point and are not intended to imply any limitation as to the scope of the disclosure more generally. All language of distinction and disparagement with respect to certain features is intended to indicate a lack of preference for those features, but not to exclude such from the scope of the disclosure entirely unless otherwise indicated.
Recitation of ranges of values herein are merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context.
Accordingly, this disclosure includes all modifications and equivalents of the subject matter recited in the claims appended hereto as permitted by applicable law. Moreover, any combination of the above-described elements in all possible variations thereof is encompassed by the disclosure unless otherwise indicated herein or otherwise clearly contradicted by context.