The present invention relates to robots for inspecting surfaces and, more particularly, to a robot that employs multiple inspection modes.
Currently, inspections of surfaces rely mainly on manual visual inspection. This inspection process usually includes sprinkling the inspected surface or court with water and waiting a couple of hours (2-3 hours) until most of the water has evaporated. If some areas of the court are not flat, residual water remains in depressions in the surface. Such an inspection process is tedious and, in some sense, ad hoc. It often leads to inaccurate inspection results and low efficiency.
A prior automated system for detecting the flatness of a surface is disclosed in US Application Publication No. 2019-22376563. This publication relates to a system in which the spatial resolution is fixed and low (about 1 cm) because the test needles are arranged in a fixed matrix structure that makes contact with the ground. Depressions are detected only by discrete LED on/off signals, without measuring the exact depth. In order to detect the flatness of a large field like a tennis court, technical personnel need to place the needle-bearing device multiple times. This can be dangerous because the personnel can be scratched by the test needles.
US Patent Application Publication No. 2019-22376563 discloses a system with a movable wheel linked to a detection device. It provides a detection width equal to the wheel width (about 10 cm per pass). Also, the detection reference is built on the local surface where the front and rear support wheels are located. In order to detect the flatness of a large field like a tennis court, technical personnel need to hold the handle and walk over the surface many times to fully cover the whole area.
A mobile inspection robot is disclosed in US Patent Application Publication No. 2013/0231779. This robot includes a robot body and a drive system supporting the robot body and configured to maneuver the robot over a work surface. A controller communicates with the drive system and a sensor system that includes a camera or ultrasound sensor. The controller executes a control system that includes a control arbitration system and a behavior system that are in communication with each other. The behavior system executes an inspection routine based on the execution of commands from the control arbitration system, which in turn are based on sensor signals received from the sensor system to identify and inspect electrical equipment. In particular, it inspects switch status, temperature and humidity. However, this robot does not measure the flatness or other dimensional characteristics of ground surfaces.
Thus, there is a need in the art for a fast and accurate way to determine the flatness of large courts, and preferably to do so without the need for a number of personnel.
The present invention is directed to a method for inspecting surfaces, e.g., the surfaces of courts used for sports such as tennis, basketball, badminton, etc. The invention employs hardware and software. The hardware includes a mobile base, sensors for base navigation, sensors for surface inspection, a communication system and a host computer. The software includes modules for base motion planning and navigation, point cloud acquisition and processing, surface modelling and analysis, multi-module coordination and user interfaces.
The inspection procedure is as follows: The robot is moved in a zigzag trajectory over the court surface. At every fixed travel distance, a 3D point cloud of the surface is generated and the location of the point cloud with respect to a world coordinate system is recorded. The location of the point cloud is determined by SLAM (simultaneous localization and mapping) for spatial mapping. At the same time, a high-resolution photo of the corresponding area of the surface is recorded by a camera. Both the point cloud and the photo are transmitted to the host computer for processing and analysis. This information is used in a new 3D detection and image processing algorithm to find flaws in the surface, such as bumps or depressions. If irregular flaws are detected, the robot marks each such problematic location.
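As an illustrative sketch only (the robot and host interfaces below, such as get_pose and capture_point_cloud, are hypothetical stand-ins, not APIs from the actual system), the acquisition loop described above might be written as:

    import math

    STEP = 0.5  # assumed fixed travel distance between captures, in metres

    def inspect(robot, host):
        # Follow the pre-planned zigzag trajectory; at every fixed travel
        # distance, capture a 3D point cloud and a photo and send them,
        # tagged with the SLAM pose, to the host computer.
        last_pose = robot.get_pose()                 # SLAM pose in the world frame
        while not robot.trajectory_finished():
            pose = robot.get_pose()
            travelled = math.hypot(pose.x - last_pose.x, pose.y - last_pose.y)
            if travelled >= STEP:
                cloud = robot.capture_point_cloud()  # 3D scan of the local surface
                photo = robot.capture_photo()        # high-resolution image
                host.send(cloud, photo, pose)        # world-frame location attached
                last_pose = pose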
The present invention significantly improves the accuracy of the inspection as well as its efficiency, in order to provide high quality court facilities to residents in a timely fashion. The invention has three advantages: (1) The traditional inspection process usually includes covering the inspected court with water and waiting for a couple of hours until most of the water has evaporated. This takes too much time and requires a sunny day. The invention can complete more kinds of inspection (bump areas, step-like irregular surfaces) at any time. (2) In the prior process there is no record left once the water has evaporated, but with the invention all of the digital information about the flaws is recorded permanently, and the flaws can be marked with spray paint. (3) There are so many recreational courts in large urban areas that the government department responsible for their maintenance must spend too much time inspecting all of them. Using the present invention, the inspections can be achieved by remote operation of the robot instead of requiring human inspectors on site every time.
This patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
The foregoing and other objects and advantages of the present invention will become more apparent when considered in connection with the following detailed description and appended drawings in which like designations denote like elements in the various views, and wherein:
The invention includes a mobile robot that travels over the surface to be detected as shown in
The structure of the hardware of the inspection robot system 10 is shown in
The locating lidar 14 is the basic sensor for the mobile robot's navigation and location functions. The 3D laser scanning camera 15 is used for image acquisition, while the safety protection sensor generates ultrasonic waves to ensure the safety of the mobile robot during driving, i.e. to prevent collisions between the robot and objects that may be located on the surface. The IMU 16 is used to obtain the robot attitude data during its movement over the surface. This information is used to correct the acquired image. The network device 22 (
A block diagram of the control system for the inspection robot is shown in
The 3D camera 15 is connected via Ethernet to the controller 13. The IMU is connected to the controller via RS485 serial communications. The locating lidar 14 is connected to the controller via Ethernet, and the ultrasonic source of the protection sensor is connected to the controller through RS485 serial communications. The controller 13 connects to the host computer 12 through 4G/5G/Wi-Fi or other communications protocols, including hardwired connections.
The security module 23 uses ultrasonic radar, which is installed on all sides of the vehicle body to detect obstacles and prevent collisions.
As can be seen in
When the robot operates, its modules need different voltages, such as 5 VDC, 12 VDC, and 24 VDC. The output voltage of the battery is adjusted by the voltage regulating power supply 26 to meet the power consumption requirements of the different pieces of equipment. The BMS in the Li-battery system assures that the battery system can work well under all kinds of circumstances.
The mobile robot is mainly driven by four wheel servo hubs (electric motors), and the front and rear wheels 20 are suspended by torsion beams, respectively. The robot can fully meet the needs for travel over various sports grounds.
The inspection robot's host computer 12 may be, for example, an Advantech EPC-C301, which is a compact, fan-less embedded system with diverse I/O ports for easy integration and diverse peripheral expansion. The EPC-C301 is designed for automated machines, smart parking, and self-service human machine interface (HMI) applications. The communication method between the host computer 12 and the controller 13 is typically WiFi. The functions of the host computer are (a) to compile and monitor the running program of the controller, (b) to obtain real-time running information from the inspection robot for monitoring, (c) to accept automatic control, remote control and other function settings from the user or operator and (d) to accept various programs and manual operation instructions for maintenance.
In one embodiment of the present invention, the lidar 14 is the key equipment for navigation and for determining the location of the robot. The working environment of this kind of robot is outdoors, so a lidar with a higher protection level is required. The lidar adopts the mature time-of-flight (ToF) principle with non-contact detection, and adds the latest multiple echo detection technology (two echoes), so that it can measure accurately in harsh environments. The main features are: IP67 protection level, large monitoring range (360° scanning angle) and flexible regional configuration. At the same time, it has the advantages of a self-checking function, stable detection and insensitivity to objects with low reflectivity.
The method of determining the location of the inspection robot is shown in
After the construction of a stadium is completed, the whole field will be flat and generally there will be no obstacles during a detection process. Therefore, the lidar locating method with reflectors is used. A certain number of lidar reflectors are arranged on the edge of the field, and the robot can obtain higher positioning accuracy.
The IMU 16 is a gyroscope, which is used to detect the position and attitude of the robot during motion. In the detection process, tilted or uneven ground will cause a tilted deformation of the image measured by the 3D camera 15. By using the attitude data of the IMU, the tilted image can be corrected to obtain the correct image.
The 3D camera 15 measures range by using triangulation. This means that the object is illuminated with a line of light from one direction, and the camera measures the object from another direction. The most common lighting used when measuring range is a line projecting laser 18 that projects a laser line 18A. The camera analyzes the sensor images to locate the laser line in them. The higher up the laser line is found for a point along the x-axis (the width of the object), the higher up is that point on the object. When measuring range, there are two angles that are important: (a) the angle at which the camera is mounted and (b) the angle of the incoming light (incidence). Both angles are measured from the normal of the transport direction. The angle of the camera is measured to the optical axis of the camera—that is, the axis through the center of the lens as shown in
The 3D camera and the laser generator are installed on a bracket in a fixed relative position. The bracket is installed on the inspection robot. When the inspection robot moves forward at a certain speed V, the laser generator 18 projects the line 18A to the ground.
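As a simplified sketch of this triangulation geometry (the symbols below are illustrative, not from the disclosure): if the laser sheet is projected along the surface normal and the camera optical axis is mounted at angle α from that normal, then a height change Δz on the surface shifts the imaged laser line by approximately Δu = k·Δz·sin(α) pixels, where k is the pixel scale (pixels per millimetre in the object plane). The height is therefore recovered as Δz ≈ Δu / (k·sin α), so a larger camera angle gives higher range resolution at the cost of more occlusion.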
Defect marking can be implemented by using spray chalk to mark defect areas. Such chalk is environmentally friendly and can be washed away with water.
The architecture of the inspection robot software system is shown in
A client interface, which is part of the client module 30, performs the function of motion planning, automatic/manual remote operating mode selection, real-time monitoring and data analysis & evaluation. The motion planning module of the client interface includes the necessary input information such as the length and width of the court or other surface and the inspection velocity of the robot. Then the desired path is generated automatically and shown on a display screen of the client interface. The corresponding files are transferred (LAN mode or WAN mode) directly into the server module 32 that is installed on the inspection robot.
When in the automatic mode the interface shows the real-time status of the inspection procedures, including a live image, the pose of the robot and the progress rate. When in the manual mode the interface also shows the real-time status, but with a tele-operation function controlled by joysticks, i.e., the user interacts with and operates the robot from a remote location.
At the end of the inspection operation by the robot, i.e., once it has traversed or covered the entire field or court, all of the measured data is transferred to the client interface, analyzed, and displayed. The whole map of the court is shown on the interface display along with the bump cloud, depression cloud and step-like cloud. In an exemplary embodiment, lighter colored places on the map indicate areas of greater height on the inspected court.
The server interface of server module 32 can perform the basic functions of parameter initialization, local/wide area network monitoring, data transmission and application program setup. This server interface panel is used to guarantee the normal running of the robot. Under the local data transmission mode, an attached router is used as a relay station for transmission. Under the wide area data transmission mode, the cloud server is used for relaying.
The high precision 3D scanner 15 (
In
Note that the "Point cloud" not only includes the 3D coordinates (X, Y, Z) of each point, but also includes an intensity (I) that indicates the IR light reflectivity of the different materials and textures on the ground.
The mapping system is synchronized by a global clock to match the scanning (50 Hz) with its pose (8 Hz) in the field World Reference Frame (W.R.F.), by means of a high precision simultaneous localization and mapping (SLAM) algorithm and a general purpose input/output (GPIO) synchronization hardware design. As the robot moves around the field, a timer triggers the 3D Scanner. The same timer also triggers the localization module 44. Therefore, the "Point cloud(t)" can be transformed from the robot local reference frame to the field W.R.F. using the measured "Pose(t)" of "Point cloud(t)." Then the 3D point cloud map (saved as "whole_map.ply") of the field can be obtained by continuously doing so.
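A minimal numpy sketch of the per-scan transformation described above, assuming "Pose(t)" is available as a rotation matrix R and translation vector t of the robot frame in the W.R.F. (the function name is illustrative):

    import numpy as np

    def to_world_frame(cloud_xyzi, R, t):
        # Transform an N x 4 point cloud (X, Y, Z, I) from the robot
        # local reference frame into the field W.R.F. using pose (R, t).
        # The intensity channel I is carried through unchanged.
        xyz = cloud_xyzi[:, :3] @ R.T + t          # rigid-body transform
        return np.hstack([xyz, cloud_xyzi[:, 3:4]])

    # The whole map is then the accumulation of the transformed scans:
    # whole_map = np.vstack([to_world_frame(c, R_k, t_k) for c, R_k, t_k in scans])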
This mapping module can focus on three types of goals: (a) tilting angle of the ground, (b) several types of 3D uneven ground flaws, and (c) 2D painted line position flaw. After the detection, some selected type of the flaw position is sent to the “marking module” so that the robot can leave a mark on the ground for later use, e.g., as an indication of a location that needs repairing.
The ground surface is intended to have a slight inclination for drainage; however, the slope should be less than 1:100 at any location. The tilt can be expressed by the pitch and roll angles of the W.R.F. in an earth reference frame (E.R.F.) whose xy-plane is horizontal, as measured by the onboard inertial measurement unit (IMU) 16 on the robot as shown in
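For reference, the 1:100 limit corresponds to a maximum tilt angle of arctan(1/100) ≈ 0.57°. Under the small-angle approximation, with pitch φ and roll ψ measured in radians by the IMU, the steepest local slope is approximately sqrt(φ² + ψ²), so the acceptance test may be written as sqrt(φ² + ψ²) ≤ 0.01; this combined form is an illustrative simplification, not a formula taken from the disclosure.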
A "step-like" 3D unevenness of more than 1 mm should not occur at any position or in any direction. Detection of such a defect is carried out by an online local height filter which outputs step-like height changes above a certain value (1 mm). The abnormal step-like uneven ground is saved as a point cloud in "steplike_cloud.ply."
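One way to realize such an online local height filter is sketched below, under the assumption that each scan line arrives as a numpy array of surface heights on a uniform grid (the function name and data layout are illustrative):

    import numpy as np

    STEP_THRESHOLD = 0.001  # 1 mm, per the specification above

    def steplike_indices(heights):
        # Flag positions where the height jump between adjacent samples
        # on a scan line exceeds the 1 mm step threshold; the flagged
        # points are later accumulated into steplike_cloud.ply.
        jumps = np.abs(np.diff(heights))
        return np.nonzero(jumps > STEP_THRESHOLD)[0]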
Local bumps and depressions exceeding 6 mm beneath a 3 m straight edge are also detected. By maintaining a maximum and minimum height matrix within the 3 m edge, the abnormal uneven parts are detected and stored as "depression_cloud.ply" and "bump_cloud.ply," respectively. See
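A sketch of the 3 m straightedge test is given below, assuming heights sampled on a uniform grid with spacing dx metres along a line (an illustrative one-dimensional simplification of the max./min. height matrix):

    import numpy as np

    EDGE_LEN = 3.0     # straightedge length, metres
    TOLERANCE = 0.006  # 6 mm, per the specification above

    def bumps_and_depressions(heights, dx):
        # Slide a 3 m window over the height profile. Within each window,
        # points rising more than 6 mm above the window minimum are bumps;
        # points falling more than 6 mm below the window maximum are
        # depressions. Returns two sorted index lists.
        w = max(2, int(EDGE_LEN / dx))
        bumps, deps = set(), set()
        for i in range(len(heights) - w + 1):
            win = heights[i:i + w]
            lo, hi = win.min(), win.max()
            if hi - lo > TOLERANCE:
                bumps.update(i + np.flatnonzero(win > lo + TOLERANCE))
                deps.update(i + np.flatnonzero(win < hi - TOLERANCE))
        return sorted(bumps), sorted(deps)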
A Finite-State Machine (FSM) is designed as shown in
In an exemplary embodiment the inspection robot uses the SICK NAV350 Lidar for localization during the inspection process. Based on the time of flight (TOF) method the Lidar captures the reflectors in the environment and automatically calculates the absolute position and orientation of the robot. At least three reflectors are required to complete the localization.
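As an illustrative sketch (not the NAV350's internal algorithm), an absolute pose can be recovered from at least three matched reflectors by a least-squares rigid alignment between the reflector positions stored in the field map and the same reflectors as currently observed in the robot frame:

    import numpy as np

    def pose_from_reflectors(map_pts, obs_pts):
        # Estimate the robot pose from N >= 3 matched reflectors using the
        # SVD-based (Kabsch) rigid alignment. map_pts: N x 2 reflector
        # positions in the field frame; obs_pts: the same reflectors as
        # observed in the robot frame. Returns rotation R (robot -> field)
        # and the robot position t in the field frame.
        mc, oc = map_pts.mean(axis=0), obs_pts.mean(axis=0)
        H = (obs_pts - oc).T @ (map_pts - mc)
        U, _, Vt = np.linalg.svd(H)
        if np.linalg.det(Vt.T @ U.T) < 0:   # guard against a reflection
            Vt[-1] *= -1
        R = Vt.T @ U.T
        t = mc - R @ oc
        return R, t

    # The robot heading follows from the rotation: theta = atan2(R[1, 0], R[0, 0]).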
Before the inspection process, the lidar node can automatically perform an initialization process. The lidar will automatically detect the positions and the number of reflectors within the detection range set by the user. The reflectors detected during initialization are used as a reference for real-time localization of the robot in subsequent processes.
During the navigation process, the lidar detects the positions of the reflectors in the environment in real time at a frequency of 8 Hz and automatically calculates the absolute position of the robot. The industrial personal computer (IPC) 13 (
The inspection robot can not only complete the entire inspection process through automatic planning and control, but the user can also control the robot when needed through the joystick. When the user operates the joystick, its input is automatically converted into linear and angular velocity commands that the robot can execute.
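A minimal sketch of this joystick conversion (the axis convention and speed limits are assumptions, not values from the disclosure):

    V_MAX = 1.0  # assumed maximum linear velocity, m/s
    W_MAX = 1.5  # assumed maximum angular velocity, rad/s

    def joystick_to_twist(axis_forward, axis_turn):
        # Map normalized joystick deflections in [-1, 1] to the linear
        # and angular velocity commands that the robot can execute.
        v = max(-1.0, min(1.0, axis_forward)) * V_MAX
        w = max(-1.0, min(1.0, axis_turn)) * W_MAX
        return v, w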
The motion control function of automatic inspection mainly consists of three parts shown in
The microcontroller with robot kinematics model and trajectory tracking controller controls the 4 independent wheel drivers directly via RS485 communication.
The velocity of each wheel satisfies the following constraints:
The forward kinematics model calculates the velocity of the geometric center of mass (COM) based on the velocity of the left and right drive wheels, which can be expressed as:
The inverse kinematics model decomposes the velocity of the left and right driving wheels based on the velocity of the geometric center of mass COM, which can be expressed as:
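The three relations referenced above are presented as figures in the original disclosure. As a sketch only, for a base driven as a differential (skid-steer) pair with left and right wheel speeds v_l and v_r and an assumed track width W (a symbol introduced here, not from the disclosure), the standard forms consistent with the description would be:

Wheel constraint (front and rear wheels on the same side turn at the same speed): v_fl = v_rl = v_l and v_fr = v_rr = v_r

Forward kinematics (COM velocity from wheel velocities): v = (v_r + v_l)/2 and ω = (v_r − v_l)/W

Inverse kinematics (wheel velocities from COM velocity): v_r = v + ωW/2 and v_l = v − ωW/2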
The trajectory tracking controller follows the event-based planning and control framework:
The controller runs at a frequency of 8 Hz, limited by the frequency at which the onboard IPC queries the robot's pose from the lidar. Event-based planning and control uses a suitable motion reference variable other than time, usually the travel distance of the robot. Each time the microcontroller obtains a new robot pose, it performs an orthogonal projection onto the current trajectory to find the current motion reference s. A lookahead distance Δs is added to the calculated travel distance. The desired state, with desired robot pose and velocity qd=[xr, yr, θr, vr, ωr], is obtained at the new travel distance s+Δs. Therefore, the error between the current state and the desired state can be defined in the robot reference frame:
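The error definition itself appears as a figure in the original disclosure; the standard form, with current robot pose q = [x, y, θ] and the desired pose taken from qd, would be:

e_x = cos(θ)(x_r − x) + sin(θ)(y_r − y)

e_y = −sin(θ)(x_r − x) + cos(θ)(y_r − y)

e_θ = θ_r − θ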
The designed trajectory tracking controller is as follows:
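The control law itself is likewise presented as a figure. A standard Kanayama-type tracking law consistent with the gains [k1, k2, k3] and the outputs v*, ω* described below would be:

v* = v_r·cos(e_θ) + k_1·e_x

ω* = ω_r + k_2·v_r·e_y + k_3·v_r·sin(e_θ)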
where v*, ω* are the calculated control inputs of the robot and [k1, k2, k3] are positive control gains. The calculated control input is converted into the rotational speeds of the four wheels based on the robot kinematics, and the wheel rotational speeds are sent directly to the 4 independent wheel drivers via RS485-Modbus communication. A proportional integral derivative (PID) controller inside each wheel driver automatically adjusts the input current to bring the rotational speed to the desired speed.
The robot system can not only use landmark-based lidar positioning, but in a second embodiment it can also use the Global Navigation Satellite System (GNSS) to determine its position. A photo of a high performance GNSS antenna that provides superior tracking of satellite signals including Beidou, GPS, GLONASS and GALILEO is shown in
A GNSS positioning module is shown in
The GNSS positioning provides longitude, latitude, and altitude data, which cannot be directly applied to the inspection process. Through the Universal Transverse Mercator (UTM) projection, these are converted into a 2D coordinate system and combined with the altitude data to form the 3D coordinates of the robot. In order to achieve higher positioning accuracy and cope with GNSS-denied environments, a multi-sensor fusion framework based on a federal Kalman filter is employed, as shown in
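A minimal sketch of the UTM conversion using the pyproj library (the zone number below is an arbitrary example; in practice it is chosen from the site longitude):

    from pyproj import Proj

    # UTM zone 50N is only an example; select the zone covering the site.
    utm = Proj(proj="utm", zone=50, ellps="WGS84")

    def gnss_to_xyz(lon, lat, alt):
        # Convert GNSS longitude/latitude/altitude into 3D robot
        # coordinates: UTM easting/northing in metres, plus altitude as z.
        x, y = utm(lon, lat)
        return x, y, alt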
The Federal Kalman Filter (FKF) method achieves more precise positioning results by fusing the outputs of two sub-filters based on the Extended Kalman Filter (EKF). The formulas of the extended Kalman filter are as follows.
For the nonlinear discrete-time predict and update equations:

x_k = f(x_{k-1}, u_k) + \omega_k, with process noise \omega_k \sim N(0, Q_k)

z_k = h(x_k) + v_k, with observation noise v_k \sim N(0, R_k)

Predicted state estimate: \hat{x}_{k|k-1} = f(\hat{x}_{k-1|k-1}, u_k)

Predicted covariance estimate: P_{k|k-1} = F_k P_{k-1|k-1} F_k^T + Q_k

Measurement residual: \tilde{y}_k = z_k - h(\hat{x}_{k|k-1})

Kalman gain: K_k = P_{k|k-1} H_k^T (H_k P_{k|k-1} H_k^T + R_k)^{-1}

Updated state estimate: \hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k \tilde{y}_k

Updated covariance estimate: P_{k|k} = (I - K_k H_k) P_{k|k-1}
Sub-filter 1 integrates GNSS positioning results with a robot odometry model based on wheel encoders. Sub-filter 2 combines robot odometry with measurements from the Inertial Measurement Unit (IMU). In the main filter, the results of the two sub-filters are fused according to manually set weight coefficients β_i. When the GNSS signal is unavailable, the weight of sub-filter 1 is set to zero and the weight of sub-filter 2 is set to one, so that the positioning system relies solely on the results from sub-filter 2, based on the IMU and robot odometry, until a viable GNSS signal is acquired again.
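A sketch of the master-filter fusion step as described above (a simple convex combination using the manually set coefficients; the variable names are illustrative):

    import numpy as np

    def fuse(x1, x2, beta1, beta2):
        # Fuse the state estimates of sub-filter 1 (GNSS + odometry) and
        # sub-filter 2 (IMU + odometry) with manually set weights that sum
        # to one. With GNSS unavailable, beta1 = 0 and beta2 = 1.
        assert abs(beta1 + beta2 - 1.0) < 1e-9, "weights must sum to 1"
        return beta1 * np.asarray(x1) + beta2 * np.asarray(x2)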
The output of the GNSS system is shown in
While the invention is explained in relation to certain embodiments, it is to be understood that various modifications thereof will become apparent to those skilled in the art upon reading the specification. Therefore, it is to be understood that the invention disclosed herein is intended to cover such modifications.
This application claims the benefit of priority under 35 U.S.C. Section 119(e) of U.S. Application No. 63/423,226 filed Nov. 7, 2022, which is incorporated herein by reference in its entirety.