The present disclosure relates to a method for generating an overlay for a scene viewable in an augmented-reality device based at least on a job role of a user operating the augmented-reality device. More specifically, the present disclosure relates to a system including a work machine, an augmented-reality device, and an electronic controller configured to generate an augmented-reality overlay specific to a job role of a user and to the work machine associated with the user.
Work machines can help move, shape, and reconfigure terrain within a worksite. For instance, at a paving worksite, one or more pieces of paving equipment, such as a cold planer, can be used to remove a portion of a roadway, parking lot, or other such work surface in order to expose a paving surface. Once the portion of the work surface has been removed, a paving machine, such as an asphalt paver, may distribute, profile, and partially compact heated paving material (e.g., asphalt) onto the paving surface. One or more compaction machines may then be used to further compact the paving material until a desired paving material density has been reached.
Augmented-reality devices may be used to assist a user in operating work machines at a worksite. Augmented reality refers to technology that begins with a real-world view of a physical environment through an electronic device and augments that view with digital content. Often, an augmented-reality device is a head-mounted display, commonly in the form of computerized smart glasses, although other implementations are available. With appropriate programming, an augmented-reality device used at a worksite may alert a user to hazards in a project, such as the location of power lines, pipes, manhole covers, or other items within a paving worksite.
One approach for using augmented-reality devices within a worksite is described in U.S. Pat. No. 10,829,911 (“the '911 patent”). The '911 patent describes a virtual assistance system including an augmented-reality display for assisting a work machine in grading a worksite. Various modules associated with the virtual assistance system indicate the presence of hazards within the worksite, which are then emphasized within the augmented-reality display. The emphasis may occur by augmenting, overlaying, or superimposing additional visual objects within a machine operator's view of the physical worksite. The '911 patent, however, is directed only to use of the augmented-reality display by the machine operator. A large worksite can have many personnel with varying roles or responsibilities who may benefit from an augmented-reality display, which the '911 patent does not contemplate. As a result, the system of the '911 patent is not desirable for augmented-reality devices that must be adapted for different modes of operation according to the role of the user, such as may exist with various personnel within a large worksite.
Examples of the present disclosure are directed to overcoming deficiencies of such systems.
In an aspect of the present disclosure, a computer-implemented method includes receiving, by an electronic controller, an indication of activation of an augmented-reality device associated with a user at a worksite and obtaining context data relating to usage of the augmented-reality device at the worksite, where the context data includes a user identity for the user. The method further includes identifying, by the electronic controller, a first job role associated with the user identity within the worksite for the augmented-reality device and generating an augmented-reality overlay for the augmented-reality device specific to the user based at least in part on the first job role. The electronic controller causes a first modification of a mixed-reality display of real-world images for a scene within a window of the augmented-reality device viewable by the user. The first modification includes the augmented-reality overlay visually coordinated with the real-world images and differs between the first job role and a second job role.
In another aspect of the present disclosure, a computer-implemented method includes receiving, by an electronic controller, user data identifying a user of an augmented-reality device at a worksite, identifying a job role for the user at the worksite, and receiving machine data identifying a work machine associated with the user at the worksite. The electronic controller selects a visual overlay among a plurality of visual overlays available for a scene viewable within the augmented-reality device based at least in part on a combination of the job role and the work machine. Further, the method includes receiving, by the electronic controller, worksite data relating to operation of the work machine by the user at the worksite and filtering the worksite data into status data based at least in part on a combination of the job role and the work machine. Additionally, the electronic controller causes a modification of a mixed-reality display of real-world images for the scene within a window of the augmented-reality device viewable by the user, where the modification for the scene includes the visual overlay coordinated with the real-world images and the status data, and where the modification is specific to the job role and the work machine.
In yet another aspect of the present disclosure, a system includes a work machine operable on a worksite by a user, an augmented-reality device associated with the user, and an electronic controller coupled to at least the augmented-reality device. The electronic controller is configured to receive a user identity for the user of the augmented-reality device at the worksite, identify a first job role associated with the user identity within the worksite for the augmented-reality device, and generate an augmented-reality overlay for the augmented-reality device specific to the user based at least in part on the first job role. Moreover, the electronic controller of the system is configured to cause a modification of a mixed-reality display of real-world images for a scene within a window of the augmented-reality device viewable by the user. The modification includes the augmented-reality overlay visually coordinated with the real-world images and differs between the first job role and a second job role.
Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. The present disclosure begins with a discussion of an example system 100 (e.g., a paving system 100) depicted in FIG. 1.
Turning first to FIG. 1, an example paving system 100 is depicted at a worksite.
The example paving system 100 in FIG. 1 includes one or more work machines, such as a paving machine 102 configured to distribute paving material onto a paving surface 118.
Further referring to FIG. 1, the paving machine 102 may include an operator station 128 having a console 130 and/or other controls for operating the paving machine 102.
As shown, the paving machine 102 may also include a communication device 132. Such communication devices 132 may be configured to permit wireless transmission of a plurality of signals, instructions, and/or information between the paving machine 102 and various other machines of the paving system 100. The communication device 132 may also be configured to permit wireless transmission of a plurality of signals, instructions, and/or information between the paving machine 102 and one or more servers, processors, computers, and/or other controllers 134, one or more tablets, computers, cellular/wireless telephones, personal digital assistants, mobile devices, or other electronic devices 136, and/or other components of the paving system 100.
The controller 134 illustrated in FIG. 1 may comprise, for example, one or more servers, processors, computers, and/or other controllers in communication with the machines and devices of the paving system 100 via a network 138.
The controller 134 may be a single processor or other device, or may include more than one controller or processor configured to control various functions and/or features of the paving system 100. As used herein, the term “controller” is meant in its broadest sense to include one or more controllers, processors, and/or microprocessors that may be associated with the paving system 100, and that may cooperate in controlling various functions and operations of the components (e.g., machines) of the paving system 100. The functionality of the controller 134 may be implemented in hardware and/or software.
The one or more electronic devices 136 may also comprise components of the paving system 100. Such electronic devices 136 may comprise, for example, mobile phones, laptop computers, desktop computers, and/or tablets of project managers (e.g., foremen) overseeing daily paving operations at the worksite and/or at the paving material plant. Such electronic devices 136 may include and/or may be configured to access one or more processors, microprocessors, memory, or other components. In such examples, the electronic devices 136 may have components and/or functionality that is similar to and/or the same as the controller 134.
The network 138 may be a local area network (“LAN”), a larger network such as a wide area network (“WAN”), or a collection of networks, such as the Internet. Protocols for network communication, such as TCP/IP, may be used to implement the network 138. Although embodiments are described herein as using a network 138 such as the Internet, other distribution techniques may be implemented that transmit information via memory cards, flash memory, or other portable memory devices. The network 138 may implement or utilize any desired system or protocol including any of a plurality of communications standards. The desired protocols will permit communication between the controller 134, the electronic devices 136, the various communication devices 132 described herein, and/or any other desired machines or components of the paving system 100. Examples of wireless communications systems or protocols that may be used by the paving system 100 described herein include a wireless personal area network such as Bluetooth® (e.g., IEEE 802.15), a local area network such as IEEE 802.11b or 802.11g, a cellular network, or any other system or protocol for data transfer. Other wireless communication systems and configurations are contemplated.
In example embodiments, one or more machines of the paving system 100 (e.g., the paving machine 102) may include a location sensor 140 configured to determine a location and/or orientation of the respective machine. In such embodiments, the communication device 132 of the respective machine may be configured to generate and/or transmit signals indicative of such determined locations and/or orientations to, for example, the controller 134, one or more of the electronic devices 136, and/or to the other respective machines of the paving system 100. In some examples, the location sensors 140 of the respective machines may include and/or comprise a component of a global navigation satellite system (GNSS) or a global positioning system (GPS). Alternatively, universal total stations (UTS) may be utilized to locate respective positions of the machines. One or more additional machines of the paving system 100 may also be in communication with one or more GPS satellites 142 and/or UTS, and such GPS satellites 142 and/or UTS may also be configured to determine respective locations of such additional machines. In any of the examples described herein, machine locations determined by the respective location sensors 140 may be used by the controller 134, one or more of the electronic devices 136, and/or other components of the paving system 100 to coordinate activities of the paving machine 102, one or more cold planers, and/or other components of the paving system 100.
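For illustration purposes only, the following Python sketch shows one hypothetical way a location fix from a location sensor 140 might be packaged for transmission to the controller 134. The record fields, machine identifier, and message format are assumptions of this example rather than requirements of the paving system 100.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class MachineLocation:
    machine_id: str
    latitude: float    # degrees, from the GNSS/GPS receiver
    longitude: float   # degrees
    heading: float     # degrees clockwise from north (orientation)
    timestamp: str     # ISO-8601 time of the fix

def package_location_fix(machine_id: str, lat: float, lon: float,
                         heading: float) -> MachineLocation:
    """Bundle a GNSS fix so a communication device can forward it
    to the system controller for coordinating machine activities."""
    return MachineLocation(
        machine_id=machine_id,
        latitude=lat,
        longitude=lon,
        heading=heading,
        timestamp=datetime.now(timezone.utc).isoformat(),
    )

# Example: a paving machine reporting its position and heading.
fix = package_location_fix("paving-machine-102", 47.6062, -122.3321, 90.0)
print(fix)
```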
The paving machine 102 may also include a controller 144 operably connected to and/or otherwise in communication with the console 130, the communication device 132, and/or other components of the paving machine 102. The controller 144 may be a single controller or multiple controllers working together to perform a variety of tasks. The controller 144 may embody a single or multiple processors, microprocessors, field programmable gate arrays (FPGAs), digital signal processors (DSPs), and/or other components configured to calculate and/or otherwise determine one or more travel paths of the paving machine 102, screed settings, and/or other operational constraints of the paving machine 102 based at least in part on information received from the one or more other machines of the paving system 100, paving machine operating information received from an operator of the paving machine 102, one or more signals received from the GPS satellites 142, and/or other information. Numerous commercially available processors or microprocessors can be configured to perform the functions of the controller 144.
As shown in FIG. 1, the paving system 100 may also include a cold planer 146 and a haul truck 148. The cold planer 146 may include a frame 159 supporting the components described below.
The cold planer 146 may further include one or more rotors 156 having ground-engaging teeth, bits, or other components configured to remove at least a portion of the roadway, pavement, asphalt, concrete, gravel, dirt, sand, or other materials of a work surface 158 on which the cold planer 146 is disposed. The cold planer 146 may also include a conveyor system 160 connected to the frame 159, and configured to transport removed portions of the work surface 158 from proximate the rotor 156 (or from proximate the first and second rotors) to a bed 162 of the haul truck 148. Additionally, the cold planer 146 may include an actuator assembly 163 connected to the frame 159 and configured to move the rotor 156 (or to move the first and second rotors) relative to the frame 159 as the rotor 156 removes portions of the work surface 158.
In addition to and/or in place of the actuator assembly 163 associated with the rotor 156, the cold planer 146 may include a front actuator assembly 167 and a rear actuator assembly 169. In such examples, the front actuator assembly 167 may be connected to the frame 159, and configured to raise and/or lower one or more wheels, continuous tracks, or other ground engaging elements (disposed at the front of the cold planer 146) relative to the frame 159. Similarly, the rear actuator assembly 169 may be connected to the frame 159, and configured to raise and lower one or more wheels, continuous tracks, or other ground engaging elements (disposed at the rear of the cold planer 146) relative to the frame 159.
As shown in FIG. 1, the cold planer 146 may also include a communication device 154 and a location sensor 164, which may be substantially similar to and/or the same as the communication device 132 and the location sensor 140 described above with respect to the paving machine 102.
The cold planer 146 may also include an operator station 166, and the operator station 166 may include a console 168 and/or other levers or controls for operating the cold planer 146. In some examples, the operator station 166 and/or the console 168 may be substantially similar to the operator station 128 and console 130 described above with respect to the paving machine 102. For example, the console 168 may include a control interface for controlling various functions of the cold planer 146 including, for example, sharing various operating data with one or more other machines of the paving system 100.
With continued reference to FIG. 1, the haul truck 148 may include a bed 162 configured to receive removed portions of the work surface 158 transported by the conveyor system 160 of the cold planer 146.
In addition, the haul truck 148 may include a communication device 170 and a location sensor 172. The communication device 170 may be substantially similar to and/or the same as the communication devices 132, 154 described above, and the location sensor 172 may be substantially similar to and/or the same as the location sensors 140, 164 described above.
The worksite, in the form of paving system 100, may additionally include one or more devices providing “augmented reality” or “augmented vision” for a user 150, shown in FIG. 1 as an augmented-reality device 174 having a display screen 176 viewable by user 150.
One current commercial option for augmented-reality device 174 is a set of HoloLens smart glasses available from Microsoft Corporation of Redmond, Washington. HoloLens devices are head-mounted, mixed-reality smart glasses. Among other features, HoloLens is an untethered holographic device that includes an accelerometer to determine linear acceleration along the X, Y, and Z axes, a gyroscope to determine rotations, a magnetometer to determine absolute orientation, two infrared cameras for eye tracking, and four visible light cameras for head tracking. As such, the HoloLens includes advanced sensors to capture information about what the user is doing and the environment the user is in. HoloLens includes network connectivity via Wi-Fi and may be paired with other compatible devices using Bluetooth. A custom processor, or controller, enables the HoloLens to process the significant volume of data from the sensors and handle affiliated tasks such as spatial mapping.
As with other devices within paving system 100, augmented-reality device 174 may be in communication with controller 134 via the network 138, such as through its ability to establish a Wi-Fi connection. With this communication, augmented-reality device 174 or controller 134 may provide or generate spatial mapping information relating to a geographic region, such as the worksite of paving system 100. Spatial mapping provides a detailed representation of real-world surfaces in the environment around augmented-reality device 174. The spatial mapping helps anchor objects in the physical world so that digital information can be accurately coordinated with them when augmented within a display. In some examples, a map of the terrain of a worksite associated with paving system 100 may be retrieved from an external source for use by augmented-reality device 174. In other examples, augmented-reality device 174 collects data through its cameras and builds up a spatial map of the environment that it has seen over time. As the physical environment changes, augmented-reality device 174 can update the map as its cameras collect information that the wearer sees.
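As a simplified illustration of this anchoring concept, the following sketch maintains a hypothetical spatial map as a dictionary of world-space positions keyed by object identifier. An actual device such as the HoloLens stores far richer surface meshes, so the coordinate convention here (east/north/up offsets in meters from a worksite origin) is an assumption made only for clarity.

```python
# Minimal spatial-map sketch: world-space anchor positions keyed by object id.
spatial_map: dict[str, tuple[float, float, float]] = {}

def anchor_object(object_id: str,
                  position: tuple[float, float, float]) -> None:
    """Record (or update) where a physical object sits in the shared map,
    so digital content can later be drawn at the same location."""
    spatial_map[object_id] = position

def update_from_camera(
        observations: dict[str, tuple[float, float, float]]) -> None:
    """Fold newly observed positions into the map as the physical
    environment changes over time."""
    for object_id, position in observations.items():
        spatial_map[object_id] = position

anchor_object("manhole-cover-1", (12.0, 4.5, 0.0))
update_from_camera({"manhole-cover-1": (12.1, 4.5, 0.0)})  # slight drift
print(spatial_map)
```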
Either controller 134 or augmented-reality device 174 can retain a map of the worksite usable by augmented-reality device 174. In operation, augmented-reality device 174, through its many sensors and cameras, can identify a physical scene within a field of view of user 150, as the wearer of the glasses, that corresponds with the map. As the field of view of user 150 changes, the relevant data from the spatial map associated with what is seen by user 150 through display screen 176 also changes.
Augmented-reality device 174 enables the programming of digital information to be superimposed or augmented over the view of the physical world within display screen 176. In particular, selected physical objects seen through display screen 176 in the physical domain may be highlighted or emphasized with graphics in the digital domain. Knowing the coordinates of the selected physical objects from the spatial mapping data, augmented-reality device 174 can coordinate the positioning of the graphics within display screen 176 so the two align. In some examples, the graphics are superimposed with highlighting. In other examples, the graphics include holograms or other graphics sufficient to communicate desired information to user 150.
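The following sketch illustrates, under simplifying assumptions, how a mapped world coordinate might be projected to a screen position so a highlight aligns with the physical object. It uses a pinhole camera model with yaw-only head orientation and hypothetical focal and image-center parameters; a production device performs this with full six-degree-of-freedom tracking.

```python
import math

def project_to_screen(point_world, cam_pos, cam_yaw_deg,
                      focal_px=800.0, cx=640.0, cy=360.0):
    """Project a mapped 3-D world point (east, north, up) into 2-D screen
    coordinates so a highlight can be drawn over the physical object.
    Yaw is measured clockwise from north; pitch and roll are ignored."""
    yaw = math.radians(cam_yaw_deg)
    dx = point_world[0] - cam_pos[0]   # east offset
    dy = point_world[1] - cam_pos[1]   # north offset
    dz = point_world[2] - cam_pos[2]   # up offset
    right = dx * math.cos(yaw) - dy * math.sin(yaw)
    depth = dx * math.sin(yaw) + dy * math.cos(yaw)
    if depth <= 0:
        return None  # object is behind the viewer; draw nothing
    u = cx + focal_px * right / depth
    v = cy - focal_px * dz / depth
    return (u, v)

# A mapped object on the ground 10 m north of and 1 m east of a viewer
# facing north with an eye height of 1.6 m.
print(project_to_screen((1.0, 10.0, 0.0), (0.0, 0.0, 1.6), 0.0))  # (720.0, 488.0)
```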
Although not depicted in FIG. 1, the worksite may include hazards, such as power lines, pipes, or manhole covers, that augmented-reality device 174 may emphasize with digital information within display screen 176 as user 150 moves about the worksite.
Besides potential hazards, augmented-reality device 174 in some examples highlights, with digital information, objects within the field of view that are significant to a work function of user 150. For example, when user 150 is an operator of cold planer 146, based on the current position and field of view of user 150, augmented-reality device 174 may help identify areas of work surface 158 yet to be treated. Sensors other than those within augmented-reality device 174, such as those discussed above within paving system 100, may be used to collect information about the location, perspective, and terrain relative to a field of view of user 150.
While FIG. 1 depicts an example physical environment for paving system 100, FIG. 2 illustrates example data used to configure augmented-reality device 174 within paving system 100, including context data 202.
As embodied as 204 in FIG. 2, context data 202 includes a user identity 204 for user 150 of augmented-reality device 174. User identity 204 may be affiliated with a login and authentication process through which user 150 gains access to augmented-reality device 174.
Machine identity 206 is another example of context data 202. Machine identity 206 specifies a particular machine or machine type associated with user 150 of augmented-reality device 174. Machine identity 206 in some situations is an alphanumeric code or other informational symbol communicating a make, type, and/or model of a work machine, such as a Caterpillar AP555F track asphalt paver or a Caterpillar PM620 cold planer. Machine identity 206 may be provided in various ways, such as through entry directly into augmented-reality device 174, through communication from a computerized device to controller 134 or augmented-reality device 174, or through communication from controller 144 or controller 152 on one of the work machines to controller 134 or augmented-reality device 174.
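Purely as an illustration, the sketch below parses a hypothetical “MAKE-MODEL” code into its parts. The actual encoding of machine identity 206, and the lookup table of machine types, are implementation choices not fixed by this disclosure.

```python
# Hypothetical model-to-type lookup for illustration only.
MACHINE_TYPES = {"AP555F": "asphalt paver", "PM620": "cold planer"}

def parse_machine_identity(code: str) -> dict:
    """Split an alphanumeric machine-identity code of the assumed form
    "MAKE-MODEL" into make, model, and machine type."""
    make, model = code.split("-", 1)
    return {
        "make": make,
        "model": model,
        "type": MACHINE_TYPES.get(model, "unknown"),
    }

print(parse_machine_identity("Caterpillar-PM620"))
# {'make': 'Caterpillar', 'model': 'PM620', 'type': 'cold planer'}
```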
Context data 202 additionally includes location 208 and time 209. As discussed above for FIG. 1, location 208 may be determined using one or more location sensors (e.g., location sensors 140, 164, 172) or other positioning components of paving system 100, and time 209 may reflect a time of use of augmented-reality device 174 at the worksite.
In addition to context data 202, electronic components within paving system 100 collect and communicate worksite data 210. In general, worksite data 210 includes operational data 212 relating to execution of work functions within paving system 100 collected from one or more operational sensors associated with the work machines and the worksite. In one example, system controller 134, electronic devices 136, and/or any other desired machines or components of paving system 100 may continuously or periodically send requests to communication device 132, communication device 154, or communication device 170 requesting data obtained from operational sensors (not shown). The operational sensors may detect any parameter such as, for example, light, motion, temperature, magnetic fields, electrical fields, gravity, velocity, acceleration in any number of directions, humidity, moisture, vibration, pressure, and sound, among other parameters. Thus, the operational sensors may include accelerometers, thermometers, proximity sensors, electric field proximity sensors, magnetometers, barometers, seismometers, pressure sensors, and acoustic sensors, among other types of sensors. Corresponding operational data 212 associated with the type of sensor may be gathered. Thus, operational data 212 obtained via the operational sensors may be transmitted to controller 144 or controller 152, for example, for further transmission and/or processing. Examples of operational data 212 gathered from sensors include operator manipulation of the work machine, machine velocity, machine location, fluid pressure, fluid flow rate, fluid temperature, fluid contamination level, fluid viscosity, electric current level, electric voltage level, fluid (e.g., fuel, water, oil) consumption rates, payload level, and similar characteristics. In the example of FIG. 2, such operational data 212 forms part of worksite data 210.
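The following sketch suggests one minimal, hypothetical record format for such operational data 212 prior to transmission. The parameter names, units, and machine identifier are examples only, not a format prescribed by the disclosure.

```python
from dataclasses import dataclass

@dataclass
class OperationalReading:
    machine_id: str
    parameter: str   # e.g. "fluid_temperature", "machine_velocity"
    value: float
    unit: str

def collect_readings(raw_samples: list[tuple[str, float, str]],
                     machine_id: str) -> list[OperationalReading]:
    """Wrap raw sensor samples in typed records for transmission to a
    machine controller and onward to the system controller."""
    return [OperationalReading(machine_id, name, value, unit)
            for name, value, unit in raw_samples]

samples = [("machine_velocity", 1.8, "m/s"),
           ("fluid_temperature", 92.5, "degC"),
           ("fuel_consumption_rate", 14.2, "L/h")]
for reading in collect_readings(samples, "cold-planer-146"):
    print(reading)
```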
In the implementation of FIG. 2, worksite data 210 may also include categories of data beyond operational data 212.
In some examples, drone data 216 is part of worksite data 210. One or more drones in the air may collect unique information as drone data 216 about the worksite in the direction of the Y axis and about the worksite from a wide perspective. Drone data 216 can include information about the condition of work surface 158 and paving surface 118, a state of progress for the worksite, movement and status of equipment and personnel within the worksite, and other conditions within the knowledge and experimentation of those of ordinary skill in the field.
In the implementation of FIG. 2, context data 202 and worksite data 210 are processed into a job role 222, a user machine 224, an augmented overlay 226, and status data 228, as discussed below with respect to method 300.
While FIGS. 1 and 2 illustrate an example environment and example data for paving system 100, FIG. 3 illustrates an example method 300 for generating an overlay for a scene viewable in an augmented-reality device based at least on a job role of a user.
In particular, method 300 begins with a step 302 of receiving, by an electronic controller, an indication of activation of an augmented-reality device associated with a user at a worksite. In an example, a user turns on augmented-reality device 174, and the electronic controller within augmented-reality device 174 registers the activation of the device to begin operation. Alternatively, the electronic controller is controller 134, which receives the indication via network 138.
In a next step 304, the electronic controller obtains context data that includes user data and machine data. For instance, after activation, a controller, whether controller 134 or a controller within augmented-reality device 174, obtains context data 202 relevant to augmented-reality device 174, which includes at least user identity 204 and machine identity 206. As discussed above, user identity 204 may be affiliated with a login and authentication process for a user to use augmented-reality device 174, and machine identity 206 can be an identification of a particular work machine at the worksite associated with user 150, such as a work machine that user 150 will be operating. In some contexts, as explained below, context data 202 does not include machine identity 206, as user 150 is not associated with a specific machine. Other features of context data 202 may also be obtained by the controller, such as location 208 and time 209, although they are not elaborated on within method 300.
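A minimal sketch of how context data 202 might be represented follows, assuming a hypothetical record type. Note the optional machine identity, reflecting the case discussed above in which user 150 is not associated with a specific machine.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ContextData:
    user_identity: str                # from the login/authentication process
    machine_identity: Optional[str]   # absent for roles not tied to a machine
    location: Optional[tuple[float, float]] = None   # (lat, lon)
    time: Optional[str] = None                       # ISO-8601 timestamp

# An operator tied to a specific machine ...
operator_ctx = ContextData("user-150", "Caterpillar-PM620")
# ... and a visitor with no associated machine.
visitor_ctx = ContextData("visitor-07", None)
print(operator_ctx, visitor_ctx, sep="\n")
```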
Following step 304, a job role 222 is identified for user 150 at the worksite from the user identity (step 306). A job role is a defined responsibility or function that a user has within the worksite. Typical job roles within the context of the present disclosure are operator, supervisor, inspector, and visitor. Fewer or more job roles may exist without departing from the disclosed and claimed processes. In this example, an operator is a job role in which user 150 controls or pilots operation of user machine 224, such as one of paving machine 102, cold planer 146, and haul truck 148. In this situation, the operator is able to affect steering, acceleration, stopping, starting, and numerous other functions associated with user machine 224. In some examples, job role is identified for user 150 by accessing a database that includes eligible users of augmented-reality device 174 and job roles associated with those users. A person within an enterprise whose occupation is to operate paving equipment such as cold planer 146 may be listed in the database as an operator. Another person may work in management and be listed in the enterprise database as a supervisor. Alternatively, paving system 100 may provide the option for a user of augmented-reality device 174 to enter a particular job role, such as directly into the augmented-reality device 174, through an electronic device 136, or by some other means as part of the login process. The level of access and control provided for associating a job role with a user is subject to the particular implementation and within the knowledge of those of ordinary skill in the art.
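For illustration, the sketch below models the role lookup described above with a hypothetical in-memory database, a fallback to a role entered at login, and a restrictive default. A real deployment would query an authenticated enterprise directory rather than a dictionary, and the user identifiers shown are placeholders.

```python
# Hypothetical enterprise role database; identifiers are placeholders.
ROLE_DB = {
    "user-150": "operator",
    "user-201": "supervisor",
    "user-305": "inspector",
}

def identify_job_role(user_identity: str,
                      entered_role: str | None = None) -> str:
    """Look up the user's job role, falling back to a role entered at
    login and finally to the most restrictive default."""
    if user_identity in ROLE_DB:
        return ROLE_DB[user_identity]
    if entered_role is not None:
        return entered_role
    return "visitor"

print(identify_job_role("user-150"))              # operator
print(identify_job_role("unknown", "inspector"))  # inspector (entered)
print(identify_job_role("unknown"))               # visitor (default)
```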
Step 306 also entails identifying a user machine from the machine identity 206 within the context data 202. A user machine 224 identified from machine identity 206, as explained above, specifies in some examples a make, model, or type of equipment associated with user 150. Thus, if user 150 has a job role as an operator, that operator may further be currently associated with a Caterpillar PM620 cold planer in one situation. For other job roles, machine identity 206 and identification of a user machine 224 under step 306 may not occur. Specifically, if job role 222 is an inspector or a visitor, the activity associated with that user is not necessarily tied to a particular machine. The variation in associating users with work machines depends on the implementation.
As reflected in FIG. 3, in a step 308 the controller selects an augmented overlay 226 from among a plurality of overlays available for a scene viewable within augmented-reality device 174 based at least in part on a combination of job role 222 and user machine 224. Each overlay in the plurality is tailored to emphasize the objects and information most relevant to a given combination, such as untreated areas of work surface 158 for an operator of cold planer 146.
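One hypothetical way to realize this selection is a catalog keyed by the (job role, machine type) combination with role-only fallbacks, as the following sketch assumes. The overlay names and catalog entries are placeholders, not overlays defined by the disclosure.

```python
# Hypothetical overlay catalog keyed by (job_role, machine_type).
OVERLAYS = {
    ("operator", "cold planer"): "overlay-untreated-surface-and-hazards",
    ("operator", "asphalt paver"): "overlay-mat-temperature-and-hazards",
    ("supervisor", None): "overlay-site-progress-and-personnel",
    ("inspector", None): "overlay-quality-checkpoints",
}

def select_overlay(job_role: str, machine_type: str | None) -> str:
    """Pick the overlay for the (role, machine) combination, falling
    back to a role-only overlay and then to a generic safety overlay."""
    if (job_role, machine_type) in OVERLAYS:
        return OVERLAYS[(job_role, machine_type)]
    if (job_role, None) in OVERLAYS:
        return OVERLAYS[(job_role, None)]
    return "overlay-generic-safety"

print(select_overlay("operator", "cold planer"))
print(select_overlay("supervisor", None))
print(select_overlay("visitor", None))  # falls back to the generic overlay
```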
Continuing through FIG. 3, the controller receives, in a step 310, worksite data 210 relating to operation of user machine 224 by user 150 at the worksite and, in a step 312, filters worksite data 210 into status data 228 based at least in part on a combination of job role 222 and user machine 224. In this manner, status data 228 reflects only the portions of worksite data 210 relevant to the workflow of user 150.
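The filtering of step 312 might, for example, reduce to a per-role field allowlist, as in the following sketch. The field names and role entries are hypothetical, chosen only to show an operator receiving machine-level metrics while a supervisor receives site-level progress.

```python
# Hypothetical per-role allowlists of status fields.
STATUS_FIELDS = {
    ("operator", "cold planer"): {"machine_velocity", "rotor_load",
                                  "surface_remaining"},
    ("supervisor", None): {"site_progress", "machines_active",
                           "trucks_en_route"},
}

def filter_status(worksite_data: dict, job_role: str,
                  machine_type: str | None) -> dict:
    """Reduce raw worksite data to the status items relevant to this
    role/machine combination."""
    allowed = (STATUS_FIELDS.get((job_role, machine_type))
               or STATUS_FIELDS.get((job_role, None), set()))
    return {k: v for k, v in worksite_data.items() if k in allowed}

raw = {"machine_velocity": 1.8, "rotor_load": 0.72,
       "surface_remaining": 340.0, "site_progress": 0.41,
       "trucks_en_route": 2}
print(filter_status(raw, "operator", "cold planer"))
print(filter_status(raw, "supervisor", None))
```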
In step 314, a controller causes a modification of a mixed-reality display of real-world images for the scene within a window of the augmented-reality device viewable by user 150. The modification includes the augmented overlay 226 coordinated with the real-world images and status data 228 and is specific to job role 222 and user machine 224. In some implementations, a controller within augmented-reality device 174 (or controller 134) will cause display screen 176 to change the content within a field of view of a user for a scene by superimposing the augmented overlay 226 that is specific to job role 222 and user machine 224. Thus, for the example of an operator of cold planer 146, the controller will cause display screen 176 to show the highlighted objects determined for the augmented overlay 226 relevant to operation of that machine and to show the filtered worksite data 210 specific to the workflow happening for that machine.
In addition to augmentation coordinated with objects within display screen 176, the modification of the mixed-reality display also includes content relating to filtered worksite data 210 not necessarily coordinated with viewed objects. For instance, display screen 176 may present status data 228, such as metrics for the workflow being performed with user machine 224, within a portion of the field of view not anchored to any particular physical object.
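As a final illustrative sketch, the function below composes one display update combining object-anchored highlights with a screen-fixed status panel, mirroring the two kinds of modification described above. The frame structure is an assumption of this example, not a display format defined by the disclosure.

```python
def compose_frame(anchored_highlights: dict, status_data: dict) -> dict:
    """Assemble one display update: highlights pinned to the screen
    positions of mapped objects, plus a screen-fixed status panel that
    is not tied to any viewed object."""
    return {
        "highlights": anchored_highlights,     # object id -> (u, v) pixels
        "status_panel": [f"{name}: {value}"    # fixed region of the view
                         for name, value in status_data.items()],
    }

frame = compose_frame(
    {"untreated-surface-zone": (720, 488)},
    {"machine_velocity": "1.8 m/s", "surface_remaining": "340 m2"},
)
print(frame)
```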
In contrast to the operator-specific display described above, the same scene viewable within augmented-reality device 174 may be modified differently for a different job role 222, such as a supervisor. In that case, the augmented overlay 226 may highlight different objects within the scene and present different status data 228 tied to the tasks of that job role.
Returning to FIG. 3, the steps of method 300 may be repeated as job role 222 or user machine 224 changes, such that the same augmented-reality device 174 may be configured for other users or reconfigured for the same user having a different job role or associated machine.
Accordingly, as illustrated in the examples above, the modification of the mixed-reality display caused by the controller is specific to the combination of job role 222 and user machine 224 and differs between one job role and another for the same scene.
The present disclosure provides systems and methods for generating an overlay for a scene viewable in an augmented-reality device based at least on a job role of a user operating the augmented-reality device. The augmented-reality device obtains context data and worksite data relating to the user and a machine associated with the user. From the context data, a job role is identified for the user. Based on the job role and a machine type, an augmented overlay for a mixed-reality display is selected from a plurality of augmented overlays. The selected augmented overlay provides a superimposed emphasis on selected objects within the user's field of view and provides status data relating to a workflow being performed by the user. As a result, the user can obtain customized information tailored to the user's job role and to the machine associated with the user. Moreover, the same augmented-reality device may be configured for other users or reconfigured for the same user having a different job role or associated machine, providing efficient functionality.
As noted above with respect to the '911 patent, existing systems contemplate use of an augmented-reality display only by a machine operator and do not adapt the display to the varying roles and responsibilities of personnel within a large worksite.
In the examples of the present disclosure, augmented-reality device 174 is configurable to match at least the job role 222 for a user of the device. Additionally, a user machine 224 associated with user 150 can enable additional configuration of the device. At a worksite, if a user has a job role as an operator of a machine, an augmented overlay 226 specific to operation of that machine can be selected, showing hazards, work guidance, performance metrics, and other information tied to the user's job role and machine. If the user changes job role 222, or a new user has a different job role, such as a supervisor, the augmented overlay 226 for the same scene viewable by the operator may highlight different objects and present different information tied to the tasks of the supervisor. Accordingly, following the methods of the present disclosure, augmented-reality device 174 is configurable to provide the most useful information to the user based on a job role 222 and a user machine 224, and information displayed within the device can be changed to match the defined job role for different users. The augmented-reality device 174, therefore, provides more flexible use among a variety of users and provides augmentation tailored to the job functions of the user.
Unless explicitly excluded, the use of the singular to describe a component, structure, or operation does not exclude the use of plural such components, structures, or operations or their equivalents. As used herein, the word “or” refers to any possible permutation of a set of items. For example, the phrase “A, B, or C” refers to at least one of A, B, C, or any combination thereof, such as any of: A; B; C; A and B; A and C; B and C; A, B, and C; or multiple of any item such as A and A; B, B, and C; A, A, B, C, and C; etc.
While aspects of the present disclosure have been particularly shown and described with reference to the embodiments above, it will be understood by those skilled in the art that various additional embodiments may be contemplated by the modification of the disclosed machines, systems and methods without departing from the spirit and scope of what is disclosed. Such embodiments should be understood to fall within the scope of the present disclosure as determined based upon the claims and any equivalents thereof.