The present disclosure is related to the operation and monitoring of a fleet of material handling vehicles.
The use of robotic or autonomous vehicles in manufacturing and material handling applications is expanding rapidly as technology advances. The ability to adapt software to autonomous vehicles to improve decision making and to perform increasingly complex tasks is a driver of productivity. For example, having autonomous vehicles in a fleet cooperate and avoid dangerous interactions allows more autonomous vehicles to be used in a single workspace.
The ability for fleet members to interact and complete more complex tasks also presents additional opportunities for autonomous vehicles to encounter error conditions that they are programmed neither to recognize nor to resolve. Unexpected conditions, such as an unexpected impediment in a workspace or a malfunction of a particular subsystem of the autonomous vehicle, can present problems that magnify within a fleet environment, for example, when a malfunctioning autonomous vehicle blocks the workflow of other autonomous vehicles in the fleet.
Failures of autonomous vehicle subsystems or error conditions can cripple a fleet operation in short order if not resolved. In many cases, resolution requires a human to enter the workspace and resolve the issue. The introduction of the human into the autonomous vehicle workspace presents new obstacles for the fleet members as well as potentially exposing the human to injury by one of the autonomous vehicles. This is compounded by the fact that many autonomous vehicles do not have readily available user controls and, even if such controls are present, the autonomous vehicle is not likely to be adapted for manual operation by an operator.
Still further, the need to have humans on “stand-by” reduces the productivity gains provided by a fleet of autonomous vehicles. When operating properly, autonomous vehicle fleets need little to no intervention in their normal activities. For example, autonomous vehicles may be programmed to return to charging/fueling stations when needed and may take themselves out of service when preventative maintenance is required. Thus, there is no need for humans to be present when the fleet, or any particular autonomous vehicle, is operating correctly. There remains a need, however, to have humans available in the vicinity of the autonomous vehicles in case error conditions arise.
Improved performance of autonomous vehicles and a reduced need for nearby humans to resolve errors and system malfunctions would provide additional productivity and cost reductions, as well as reduce the potential for injury to humans. Still further, given the limited number of issues that occur in a particular workspace, reducing or eliminating the need for human physical interaction presents the potential for additional productivity gains.
The present disclosure includes one or more of the features recited in the appended claims and/or the following features which, alone or in any combination, may comprise patentable subject matter.
A material handling vehicle tracking and monitoring system is provided. The system includes a remote monitoring station and a material handling vehicle in communication with the remote monitoring station. The remote monitoring station includes a display interface and a server. The material handling vehicle includes one or more sensors with input parameters, a drive system that includes a drive and a drive controller, and a vehicle controller that includes a processor and a memory device. The vehicle controller includes programmable instructions executable by the processor that, when executed, cause the vehicle controller to perform various functions. The processor receives data from the one or more sensors of the material handling vehicle and processes the data. The data is stored in the memory device of the vehicle controller. The processor generates time-sequenced data based on the one or more sensors of the material handling vehicle and combines the time-sequenced data to provide a historical composite of the data. A sequence of one or more composite images is created based on the historical composite data. The composite image is transmitted to a display interface of the remote monitoring station.
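By way of illustration only, the sketch below shows one way such a time-sequenced historical composite might be assembled from per-sensor readings; the class and field names are hypothetical and are not drawn from the disclosure.

```python
# Illustrative sketch: merging per-sensor readings into a time-ordered
# historical composite keyed by timestamp. All names are hypothetical.
import time
from bisect import insort
from dataclasses import dataclass, field
from typing import Any, Dict, List, Optional


@dataclass(order=True)
class SensorSample:
    timestamp: float                        # seconds since epoch
    sensor_id: str = field(compare=False)   # e.g. "imu", "camera_front"
    value: Any = field(compare=False)       # raw reading or image reference


class HistoricalComposite:
    """Time-ordered store of samples from every on-board sensor."""

    def __init__(self) -> None:
        self._samples: List[SensorSample] = []

    def record(self, sensor_id: str, value: Any,
               timestamp: Optional[float] = None) -> None:
        ts = time.time() if timestamp is None else timestamp
        insort(self._samples, SensorSample(ts, sensor_id, value))  # keep sorted by time

    def window(self, start: float, end: float) -> Dict[str, List[SensorSample]]:
        """Return samples between start and end, grouped by sensor."""
        grouped: Dict[str, List[SensorSample]] = {}
        for s in self._samples:
            if start <= s.timestamp <= end:
                grouped.setdefault(s.sensor_id, []).append(s)
        return grouped


composite = HistoricalComposite()
composite.record("imu", {"ax": 0.02, "ay": -0.01, "az": 9.81})
composite.record("contact_front", False)
```

A per-sensor query over a time window, as in window() above, is sufficient to rebuild a composite view for any earlier instant.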
In some embodiments, the system also includes a camera system designed to generate a signal representative of images of an environment surrounding the material handling vehicle. In some aspects, the processor receives signals from the one or more sensors and from the camera system. The signals are aggregated and used to create a data array that is provided in the form of a synchronized composite of sensor signals and camera signals. The synchronized composite represents a time-sequenced composite image file that superimposes data derived from the sensor signals onto the images from the camera system. The time-sequenced composite image file is transmitted in real-time to the display interface of the remote monitoring station. In some forms, the remote monitoring station transmits a command signal to the material handling vehicle, and a camera associated with the material handling vehicle simultaneously changes the point of time at which the time-sequenced data is presented. In some embodiments, the remote monitoring station includes a headset, and a position of the headset is calibrated from data in an environment surrounding the material handling vehicle. In some aspects, the position of the headset is calibrated from a neutral position relative to a particular camera, a second material handling vehicle, or a combination thereof. In some forms, a user input device of the remote monitoring station includes an input for varying a point in time which corresponds to a portion of an image file associated with the composite image being transmitted, such that a user may choose to view the portion of the image file as it existed at a time different from the current real-time. In one or more embodiments, the portion of the image file transmitted is responsive to the position of the headset, such that the user may vary the field of view at different times to view the surroundings of the vehicle at that point in time. In one or more forms, the time sequence may be paused such that a user may look around the vehicle at a single point in time by moving the headset to change the field of view. In some embodiments, the material handling vehicle is designed to transmit an alert condition to the remote monitoring station associated with a time of the alert condition, such that a user can select the time of the alert condition to view images from a camera system or the data from the one or more sensors of the material handling vehicle.
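A minimal sketch of how sensor-derived values might be superimposed onto a camera frame to form one frame of such a time-sequenced composite image file follows; it assumes the opencv-python package, and the field names and layout are illustrative only.

```python
# Illustrative sketch: superimposing sensor-derived values onto a camera
# frame to form one frame of a time-sequenced composite image file.
# Assumes the opencv-python package; field names are hypothetical.
import cv2
import numpy as np


def compose_frame(image: np.ndarray, sensor_values: dict, timestamp: float) -> np.ndarray:
    """Draw sensor readings and a timestamp onto a copy of the camera image."""
    frame = image.copy()
    lines = [f"t = {timestamp:.3f} s"]
    lines += [f"{name}: {value}" for name, value in sensor_values.items()]
    for i, text in enumerate(lines):
        cv2.putText(frame, text, (10, 25 + 22 * i),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 1, cv2.LINE_AA)
    return frame


# Example: a blank 480x640 frame annotated with hypothetical IMU and drive data.
blank = np.zeros((480, 640, 3), dtype=np.uint8)
annotated = compose_frame(blank, {"speed_mps": 1.2, "heading_deg": 87.5}, timestamp=12.345)
```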
A method for monitoring a material handling vehicle is provided. The method includes providing a remote monitoring station and a material handling vehicle. The remote monitoring station includes a display interface and a server. The material handling vehicle is in communication with the remote monitoring station. A processor of a vehicle controller of the material handling vehicle receives data from one or more sensors of the material handling vehicle. The processor of the vehicle controller processes and stores the data in a memory device of the material handling vehicle. The processor generates time-sequenced data to provide a historical composite of the data and creates a composite image based on the historical composite data. The composite image is transmitted to the display interface of the remote monitoring station.
In some embodiments, the remote monitoring station monitors updated real-time location and status indicators of the material handling vehicle. In some forms, the remote monitoring station transmits a color to the display, wherein the color is based on an error condition and its urgency for the material handling vehicle. In some aspects, the remote monitoring station detects a critical event from the one or more sensors on the material handling vehicle. In one or more embodiments, the remote monitoring station generates a point cloud from three-dimensional data obtained from the one or more sensors. In some aspects, the remote monitoring station presents the point cloud in an image plane.
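For the point-cloud embodiments, the following sketch shows one conventional way a point cloud might be generated from depth data and presented in an image plane using a pinhole camera model; the intrinsic parameters are assumed values, not taken from the disclosure.

```python
# Illustrative sketch: generating a point cloud from a depth image and
# projecting it back into an image plane with a pinhole camera model.
# The intrinsic parameters below are hypothetical.
import numpy as np

FX, FY = 525.0, 525.0          # focal lengths in pixels (assumed)
CX, CY = 319.5, 239.5          # principal point (assumed)


def depth_to_point_cloud(depth: np.ndarray) -> np.ndarray:
    """Back-project an HxW depth image (metres) to an Nx3 point cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]          # drop invalid (zero-depth) pixels


def project_to_image_plane(points: np.ndarray) -> np.ndarray:
    """Project Nx3 points back to Nx2 pixel coordinates."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    u = FX * x / z + CX
    v = FY * y / z + CY
    return np.stack([u, v], axis=-1)


cloud = depth_to_point_cloud(np.full((480, 640), 2.0))   # flat wall 2 m away
pixels = project_to_image_plane(cloud)
```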
A material handling vehicle tracking and monitoring system is provided. The system includes a remote monitoring station and a material handling vehicle in communication with the remote monitoring station. The remote monitoring station includes a display interface and a server. The material handling vehicle includes one or more sensors with input parameters and on-board inertial measurement units (IMUs). The material handling vehicle includes a drive system, a camera system, and a vehicle controller. The drive system includes a drive and a drive controller. The camera system includes one or more cameras. The vehicle controller includes a processor and a memory device. The vehicle controller includes programmable instructions executable by the processor that, when executed, cause the vehicle controller to perform various functions. The processor receives first signals from the one or more sensors of the material handling vehicle. The processor receives second signals from the camera system and aggregates the first signals and the second signals received from the one or more sensors and the camera system. The processor creates visual data using the aggregated signals to generate a time-sequenced composite image file. The processor transmits the time-sequenced composite image file in real-time to the display interface of the remote monitoring station.
In some aspects, the material handling vehicle includes communication circuitry to communicate with the remote monitoring station. In some forms, the communication circuitry includes a wireless communication protocol to share data in real-time between the material handling vehicle and the remote monitoring station. In one or more embodiments, the one or more sensors include an array of cameras, wherein an operator may toggle between views of one or more material handling vehicles and each camera of the array of cameras at a coordinated point in time using a headset worn by the operator.
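As a sketch of the real-time data sharing described here, the snippet below streams time-stamped, camera-indexed frames over a datagram socket so the receiver can keep every view aligned to a coordinated point in time; the address, port, and message layout are assumptions for illustration only.

```python
# Illustrative sketch: streaming time-stamped composite frames toward the
# remote monitoring station over a datagram socket. The address, port, and
# message layout are assumptions, not part of the disclosure.
import socket
import struct
import time

RMS_ADDRESS = ("127.0.0.1", 9000)   # placeholder for the monitoring station address


def send_frame(sock: socket.socket, camera_id: int, jpeg_bytes: bytes) -> None:
    """Prefix each payload with a timestamp and camera index so the receiver
    can keep every stream aligned to a coordinated point in time."""
    header = struct.pack("!dI", time.time(), camera_id)
    sock.sendto(header + jpeg_bytes, RMS_ADDRESS)


sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_frame(sock, camera_id=2, jpeg_bytes=b"\xff\xd8...placeholder...\xff\xd9")
```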
Additional features, which alone or in combination with any other feature(s), such as those listed above and/or those listed in the claims, can comprise patentable subject matter and will become apparent to those skilled in the art upon consideration of the following detailed description of various embodiments exemplifying the best mode of carrying out the embodiments as presently perceived.
The detailed description particularly refers to the accompanying figures in which:
According to the present disclosure, a material storage facility 10 includes a number of storage racks 12, 14, 16, 18 positioned in a workspace 20 of the material storage facility 10. The material storage facility 10 also includes a number of autonomous vehicles 22 that operate within the material storage facility 10 to process materials in the material storage facility 10, including, in some embodiments, moving materials 24 from one storage location 26 to another storage location 28. It should be understood that the autonomous vehicles 22 may also move materials from the particular material storage facility 10 to another location outside of the material storage facility 10, such as to a manufacturing or distribution location outside of the material storage facility 10. In the embodiment of
The material storage facility 10 further includes a remote monitoring station (RMS) 30 positioned in the material storage facility 10. An operator 8 located in the remote monitoring station 30 is responsible for monitoring the fleet of autonomous vehicles 22 and resolving issues or error conditions. The positioning of the remote monitoring station 30 in the material storage facility 10 is for illustrative purposes only. The remote monitoring station 30 may be positioned outside of the material storage facility 10 at the same location, or may be positioned in a geographically distant location, such as a single remote monitoring station 30 facility that provides monitoring for various material storage facilities 10 around the world. The material storage facility 10 also includes a number of fixed cameras 32 positioned throughout the workspace 20. The fixed cameras 32 are positioned to provide viewing coverage of the workspace 20 in which the autonomous vehicles 22 operate.
Referring now to
Autonomous vehicle 22 includes a number of sensors 150 providing data to the controller 34. The sensors 150 include an array of cameras 42 positioned in various locations on the autonomous vehicle 22 to provide a three-hundred-sixty degree view with overlapping fields of view, giving a remote operator 8 full access to view the surroundings of the autonomous vehicle 22. The cameras 42 are embodied as RGBD cameras, providing enhanced information to the controller 34 that will be used as discussed below. In some embodiments, the autonomous vehicle may also include an array of RGB cameras 142 that provide visual data without the enhanced depth of the RGBD cameras 42. The autonomous vehicle 22 also includes one or more on-board inertial measurement units (IMUs) 44 to provide input to the controller 34 regarding changes in acceleration and orientation of the autonomous vehicle 22 in three-space during use. Still further, the autonomous vehicle 22 also includes an array of contact sensors 46 positioned on the autonomous vehicle 22 so that any physical contact between the autonomous vehicle 22 and some other item in the environment can be detected. Using the array of cameras 42, accelerometer(s) 44, and contact sensor array 46, the controller 34 is operable to sense the environment around the autonomous vehicle 22 to control the autonomous vehicle 22 as it completes its mission. In addition, the array of cameras 42, accelerometer(s) 44, and contact sensor array 46 allow the controller 34 to identify unexpected conditions that may require a response by the autonomous vehicle 22 or an intervention by an operator 8 positioned at the remote monitoring station 30. In some embodiments, the autonomous vehicle 22 may also include an array of LiDAR sensors 54 positioned about the autonomous vehicle 22 and operable to provide additional information to the controller 34 regarding the environment about the autonomous vehicle 22. Still further, in some embodiments, the autonomous vehicle 22 may also include an array of sonar sensors 56 positioned about the autonomous vehicle 22 and operable to provide additional information to the controller 34 regarding the environment about the autonomous vehicle 22. Additional sensors or sensor systems 150 of the autonomous vehicle 22 may include encoders 144 for measuring movement of components of the autonomous vehicle 22 or radar sensors 140. Other sensors 152 may be used as required by a particular application.
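One simple way such an unexpected condition might be flagged from the contact sensor array 46 and the IMU 44 is sketched below; the threshold and signal names are hypothetical and chosen only for illustration.

```python
# Illustrative sketch: flagging an unexpected physical contact by combining a
# contact-sensor array with an acceleration spike from the on-board IMU.
# The threshold and field names are assumptions for illustration only.
from typing import Sequence

ACCEL_SPIKE_THRESHOLD = 3.0   # m/s^2 change treated as a possible impact (assumed)


def unexpected_contact(contact_sensors: Sequence[bool], accel_delta: float) -> bool:
    """Return True when any contact sensor is triggered or the IMU reports a
    sudden change in acceleration consistent with striking an object."""
    return any(contact_sensors) or abs(accel_delta) > ACCEL_SPIKE_THRESHOLD


# Example: front bumper sensor tripped while decelerating sharply.
if unexpected_contact([True, False, False, False], accel_delta=-4.2):
    print("unexpected condition detected: raise alert to remote monitoring station")
```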
Also shown in the block diagram of
The autonomous vehicle 22 also includes communication circuitry 52 that allows the autonomous vehicle 22 to interact with the remote monitoring station 30 and to share data therebetween. The communications circuitry 52 can take many forms and may be similar to the input/output controller 40 or modified for a particular application. It is contemplated that the communications circuitry 52 would include a high-speed wireless communications protocol to allow data to be shared between the controller 34 and the remote monitoring station 30 in real-time, allowing the autonomous vehicle 22 to operate at relatively high speeds in the workspace 20.
As will be explained in further detail below, the array of cameras 42, accelerometer(s) 44, and contact sensor array 46, and optionally the LiDAR array 54 and/or sonar array 56, provide signals to the controller 34 that are processed and stored in memory 38 as time-sequenced data. The time-sequenced data from the various sensors 42, 44, 46, 54, and 56 can be combined to provide a historical composite of the data, from which a composite image and status can be created that allow a user to reconstruct the environment of the autonomous vehicle 22 at an earlier point in time. This is useful for an operator 8 to diagnose a particular problem experienced by an autonomous vehicle 22 by viewing the surroundings of the autonomous vehicle 22 at an earlier point in time. The data can also be combined to create a composite image available to the operator 8 at the remote monitoring station 30 in real-time so that the operator 8 may manually operate the autonomous vehicle 22 using the combined data.
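A brief sketch of how the stored time-sequenced data might be queried to reconstruct the vehicle's surroundings at an earlier point in time follows; the history structure and frame encoding are assumptions, not part of the disclosure.

```python
# Illustrative sketch: retrieving the stored composite frame closest to an
# earlier point in time so an operator can review the vehicle's surroundings
# before an error occurred. Assumes frames were saved with their timestamps.
from bisect import bisect_left
from typing import List, Tuple


def frame_at(history: List[Tuple[float, bytes]], requested_time: float) -> bytes:
    """history is a time-sorted list of (timestamp, composite_frame) pairs."""
    times = [t for t, _ in history]
    i = bisect_left(times, requested_time)
    if i == 0:
        return history[0][1]
    if i == len(history):
        return history[-1][1]
    # choose whichever neighbouring frame is closer to the requested time
    before, after = history[i - 1], history[i]
    return before[1] if requested_time - before[0] <= after[0] - requested_time else after[1]


history = [(10.0, b"frame-a"), (10.5, b"frame-b"), (11.0, b"frame-c")]
assert frame_at(history, 10.6) == b"frame-b"
```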
The autonomous vehicle 22 also includes an independent safety system 138 with safety rated sensors that are operable to limit operation of the autonomous vehicle 22. The autonomous vehicle 22 includes a battery 88 which powers the various components of the autonomous vehicle 22. The battery 88 interfaces with a battery relay 134, which is operable to interrupt power to both the manipulation system 136 and the drive system 49 to prevent operation of those systems in critical safety situations. If the safety system 138 detects an unsafe condition, the power to the manipulation system 136 and drive system 49 is cut until the unsafe condition is resolved.
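A minimal sketch of such a safety interlock, in which an unsafe condition opens the battery relay to the drive and manipulation systems and power is restored only after the condition clears, is shown below; the class and method names are hypothetical.

```python
# Illustrative sketch: a safety interlock that opens the battery relay to the
# drive and manipulation systems when a safety-rated sensor reports an unsafe
# condition. Class and method names are hypothetical.
class BatteryRelay:
    def __init__(self) -> None:
        self.closed = True            # power flowing to drive/manipulation systems

    def open(self) -> None:
        self.closed = False           # interrupt power

    def close(self) -> None:
        self.closed = True            # restore power


class SafetySystem:
    def __init__(self, relay: BatteryRelay) -> None:
        self.relay = relay

    def update(self, unsafe_condition: bool) -> None:
        if unsafe_condition:
            self.relay.open()         # cut power until the condition is resolved
        elif not self.relay.closed:
            self.relay.close()


relay = BatteryRelay()
SafetySystem(relay).update(unsafe_condition=True)
assert relay.closed is False
```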
The remote monitoring station 30 is shown diagrammatically in
The remote monitoring station 30 also includes an augmented reality interface 64 that is worn by an operator 8 when the operator 8 is monitoring the one or more material storage facilities 10 as shown in
Referring now to
Once the alerts configuration is loaded at step 166, the process 160 advances to a normal operating process 170. The process 160 continuously progresses through a decision tree in which the data from the sensors 150 is evaluated to determine whether a critical event has occurred at decision step 172. If no critical event is detected, the process 160 progresses to decision step 174 to determine whether any important events have occurred. If a critical event is detected at decision step 172, then the process 160 advances to step 176, where the critical event data is saved to the server 78. The process 160 then progresses to step 178, where the event data is displayed to the operator 8 at the remote monitoring station 30.
If no critical event is detected at step 172 and no important event is detected at step 174, then the process advances to step 180, where sensor data is temporarily saved as short-term data. This short-term data is not maintained beyond a predetermined period because the data set would otherwise become overwhelming for the controller 34. However, the short-term event data is available to the operator 8 in certain situations, as discussed below. If an important event is detected at step 174, then the process 160 advances to step 178 discussed above and then advances to step 180.
From step 180, the process advances to step 182, where the autonomous vehicle 22 operates based on the detected data. The operation is evaluated at decision step 184 to determine whether operation of the autonomous vehicle 22 should continue. If the data indicates that the autonomous vehicle 22 is safe to operate, then the process 160 returns to step 170 and progresses through the decision tree described above. If the autonomous vehicle 22 is not safe to operate, then decision step 184 initiates a shut-down and the process progresses to an end step 186, where the autonomous vehicle 22 is inoperable until the shut-down condition is resolved by the operator 8.
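A simplified rendering of this operating loop is sketched below; the helper functions stand in for steps 170 through 186, are hypothetical placeholders, and approximate the decision tree described above rather than reproducing it exactly.

```python
# Illustrative sketch of the operating loop described above: critical events
# are saved to the server and displayed, important events are displayed,
# everything else is buffered short-term, and the vehicle shuts down when it
# is no longer safe to operate. Helper functions are hypothetical placeholders.
def operating_loop(read_sensors, is_critical, is_important,
                   save_to_server, display_to_operator,
                   save_short_term, operate, safe_to_continue):
    while True:
        data = read_sensors()
        if is_critical(data):
            save_to_server(data)          # step 176: save critical event data
            display_to_operator(data)     # step 178: alert the operator
        elif is_important(data):
            display_to_operator(data)     # step 178
            save_short_term(data)         # step 180: short-term buffer
        else:
            save_short_term(data)         # step 180
        operate(data)                     # step 182: continue the mission
        if not safe_to_continue(data):    # step 184
            break                         # step 186: shut down until resolved
```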
Referring to
Thus, when an event occurs requiring the operator 8 to assess the environment of the autonomous vehicle 22, the operator 8 may also be able to move back in time, virtually, to view the environment of the autonomous vehicle 22 at the earlier time based on the data saved to the autonomous vehicle at step 180 or to the server at step 176. The operator 8 may move their head to view different fields of view around the autonomous vehicle 22 at that point in time, with the information on the display being presented as an augmented reality image. This is useful for the operator 8 to reconstruct the environment of the autonomous vehicle 22 prior to the error condition. The data from the array of cameras 42, accelerometer(s) 44, and contact sensor array 46, and optionally the LiDAR array 54 and/or sonar array 56, is stored in memory 38 and is used to reconstruct the image on the augmented reality interface 64 as if it were in real-time. As such, the operator 8 is able to detect any abnormalities, improving the ability to resolve the alert condition.
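One way the headset's orientation might be mapped to a camera of the surrounding array 42 when the time sequence is paused is sketched below; the camera count, spacing, and angle convention are assumed for illustration.

```python
# Illustrative sketch: mapping the headset's yaw angle to one camera of a
# 360-degree array so the operator can look around the vehicle at a paused or
# time-shifted moment. The camera count and layout are assumptions.
NUM_CAMERAS = 8                      # assumed evenly spaced around the vehicle
SECTOR = 360.0 / NUM_CAMERAS


def camera_for_yaw(yaw_degrees: float) -> int:
    """Return the index of the camera whose field of view is centred nearest
    to the headset's current yaw, measured from the calibrated neutral pose."""
    yaw = yaw_degrees % 360.0
    return int((yaw + SECTOR / 2) // SECTOR) % NUM_CAMERAS


assert camera_for_yaw(0.0) == 0       # looking straight ahead
assert camera_for_yaw(44.9) == 1      # just past the first sector boundary
assert camera_for_yaw(-30.0) == 7     # looking slightly to the left
```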
The workstation 56, through communications circuitry 76, is also in communication with the cameras 32 positioned throughout the workspace 20, and the cameras 32 are selectively accessible by the operator 8 to choose a particular camera 32 and view the workspace 20 from the perspective of that camera 32. A server 78 facilitates the communication between the workstation 56, cameras 32, and autonomous vehicles 22 so that all of the information is available to the operator 8. This functionality is available in real-time such that when an error condition is triggered, such as with the autonomous vehicle 22′ in
The data from the cameras 32 is maintained in memory 74 on the workstation 56 or at the server 78. When the operator 8 has time-shifted the information presented on the augmented reality interface 64 as discussed above, the workstation 56 receives the time-shift information and resets the images from the various cameras 32 to the same point in time as the time-shift, such that the operator 8 may toggle between the virtual operator condition and the time-shifted camera views to review the view from any of the cameras 32 and further diagnose the error condition. This further assists the operator 8 in diagnosing the error condition of the autonomous vehicle 22′.
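A short sketch of how every recorded stream (fixed cameras 32 and, as described next, adjacent autonomous vehicles 22) might be reset to the operator's time-shifted instant is given below; the stream container and nearest-frame helper are assumptions for illustration only.

```python
# Illustrative sketch: when the operator shifts to an earlier point in time,
# every recorded stream is reset to the frame nearest that moment so all
# views refer to the same instant. The stream container and the frame-lookup
# helper are assumptions, not part of the disclosure.
from typing import Callable, Dict, List, Tuple

Stream = List[Tuple[float, bytes]]    # time-sorted (timestamp, frame) pairs


def synchronize_streams(streams: Dict[str, Stream],
                        shifted_time: float,
                        frame_lookup: Callable[[Stream, float], bytes]) -> Dict[str, bytes]:
    """Return, for every stream, the frame closest to the time-shifted instant."""
    return {name: frame_lookup(stream, shifted_time) for name, stream in streams.items()}


nearest = lambda stream, t: min(stream, key=lambda f: abs(f[0] - t))[1]
views = synchronize_streams({"camera_32a": [(9.0, b"x"), (10.0, b"y")]}, 9.8, nearest)
assert views["camera_32a"] == b"y"
```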
Importantly, the workstation 56, through the server 78, is also operable to reset available information in all of the adjacent autonomous vehicles 22 to the time-shifted point in time. Thus, while the operator 8 is engaged with a virtual time-shift, any of the actions the operator 8 takes will be in reference to the particular point in time that the operator 8 has shifted to, until the time-shift is released. For example, the operator 8 can move from being the virtual operator 8 of the autonomous vehicle 22′ if
As shown in
Referring to
Although this disclosure refers to specific embodiments, it will be understood by those skilled in the art that various changes in form and detail may be made without departing from the subject matter set forth in the accompanying claims.
This application is a continuation of U.S. patent application Ser. No. 18/170,023, filed Feb. 16, 2023, which is a continuation of U.S. patent application Ser. No. 17/034,365, filed Sep. 28, 2020, now U.S. Pat. No. 11,599,125, which claims priority to U.S. Provisional Patent Application No. 62/907,933, filed Sep. 30, 2019, all of which are expressly incorporated by reference herein.
Provisional Application:

Number | Date | Country
62907933 | Sep 2019 | US

Continuation Data:

Relation | Number | Date | Country
Parent | 18170023 | Feb 2023 | US
Child | 18822719 | | US
Parent | 17034365 | Sep 2020 | US
Child | 18170023 | | US