The present disclosure relates to floor grinders and other floor surfacing machines for processing hard material surfaces such as stone and concrete. There are disclosed methods and devices for calibration and control of floor surfacing systems.
Floor grinding relates to the process of smoothing and polishing, e.g., concrete floors by means of a grinding machine. By grinding and polishing hard materials such as concrete and stone, it is possible to achieve a finish resembling that of a polished marble floor. A polished concrete floor is easy to clean and often visually appealing.
Floor grinding may also be used to level a floor surface, i.e., to remove bumps and other imperfections. This may be desired in production facilities where complicated machinery may require a levelled supporting surface.
Floor grinding is, in general, a tediously slow process. The grinding process must often be repeated many times in order to achieve the required surface finish, and each grinding iteration often takes a considerable amount of time. This applies, in particular, to floor grinding at larger venues such as assembly halls and shopping malls.
To increase productivity, automated floor grinders may be used. Automated floor grinders navigate autonomously on the surface to be processed. However, such systems are often associated with calibration accuracy issues, which negatively affect the autonomous control system. The calibration procedure is also often a tedious process requiring many steps. Consequently, there is a need for efficient and accurate methods for calibrating a floor grinding system.
It is an object of the present disclosure to provide efficient and accurate methods for calibrating a floor grinding system. This object is obtained by a calibration device for calibrating a floor surfacing system. The calibration device comprises at least four infrared sources arranged separated from each other on a structural member according to a known geometrical configuration, where three of the infrared sources are located in a common plane and where a fourth infrared source is located distanced from the common plane along a normal vector to the common plane. The calibration device is arranged to be positioned at one or more locations around a perimeter of the surface to be processed in view from an infrared vision sensor, whereby the infrared vision sensor may obtain images of the calibration device at the one or more locations.
This way a boundary of a surface area to be processed by the floor surfacing system can be determined at the same time as the vision sensor set-up is calibrated. Thus, a more efficient floor grinding process is obtained since the calibration process is made more efficient and also more accurate.
According to some aspects, the calibration device comprises a trigger mechanism arranged to activate the at least four infrared sources. The trigger mechanism may, e.g., be a button on the calibration device or some other trigger mechanism. The infrared vision sensor can then be active continuously and configured to detect the calibration device as soon as the infrared sources are activated by the trigger mechanism. When the calibration device is activated, its position is stored and later processed to complete the calibration routine. The infrared sources may be modulated to transmit an identification code to the vision sensor, thereby allowing the vision sensor to distinguish between a plurality of different calibration devices. This allows several calibration systems to be used in parallel while in view of each other, which is an advantage.
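By way of a non-limiting illustration only, the identification code mentioned above could be realized as a simple on/off keying of the infrared sources over consecutive camera frames. The minimal Python sketch below shows one way such a code could be matched against known device patterns; the patterns, the names and the eight-bit code length are assumptions made for the sketch, not features of the disclosure.

```python
# Minimal sketch (assumption): each calibration device blinks its infrared
# sources according to a fixed binary pattern, one bit per captured frame.
# A device is identified by matching the observed on/off sequence of a pixel
# cluster against the known patterns.

KNOWN_PATTERNS = {
    "device_A": (1, 0, 1, 1, 0, 1, 0, 0),  # hypothetical 8-bit identifiers
    "device_B": (1, 1, 0, 0, 1, 0, 1, 0),
}

def decode_device_id(observed_bits):
    """Return the device whose blink pattern matches the observed bits."""
    for device, pattern in KNOWN_PATTERNS.items():
        if tuple(observed_bits) == pattern:
            return device
    return None

# Example: bits extracted from 8 consecutive frames for one pixel cluster
print(decode_device_id([1, 0, 1, 1, 0, 1, 0, 0]))  # -> "device_A"
```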
According to other aspects, the calibration device comprises a trigger mechanism arranged to transmit a trigger signal to the infrared vision sensor, which trigger signal is configured to trigger an image capture action by the infrared vision sensor. The trigger mechanism may, e.g., be a button on the calibration device or some other trigger mechanism. The trigger mechanism allows for convenient operation of the calibration device and a more efficient calibration process.
According to aspects, the normal vector intersects one of the infrared sources located in the common plane. Thus, a shape resembling the axes of a Cartesian coordinate system is obtained, which simplifies computation.
According to aspects, the structural member comprises three arms extending from a common intersection point, where each arm comprises a respective infrared source, and where a fourth infrared source is arranged at the intersection point. This particular shape allows for a low-complexity calibration routine based on finding a surface plane.
According to aspects, an angle between a first arm and a second arm is configurable. The calibration device comprises an angle sensor configured to measure the angle between the first arm and the second arm. This allows the shape of the calibration device to be matched to corners having angles different from 90 degrees, which is an advantage.
Generally, all terms used in the claims are to be interpreted according to their ordinary meaning in the technical field, unless explicitly defined otherwise herein. All references to “a/an/the element, apparatus, component, means, step, etc.” are to be interpreted openly as referring to at least one instance of the element, apparatus, component, means, step, etc., unless explicitly stated otherwise. The steps of any method disclosed herein do not have to be performed in the exact order disclosed, unless explicitly stated. Further features of, and advantages with, the present invention will become apparent when studying the appended claims and the following description. The skilled person realizes that different features of the present invention may be combined to create embodiments other than those described in the following, without departing from the scope of the present invention.
The present disclosure will now be described in more detail with reference to the appended drawings.
The invention will now be described more fully hereinafter with reference to the accompanying drawings, in which certain aspects of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments and aspects set forth herein; rather, these embodiments are provided by way of example so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like numbers refer to like elements throughout the description.
It is to be understood that the present invention is not limited to the embodiments described herein and illustrated in the drawings; rather, the skilled person will recognize that many changes and modifications may be made within the scope of the appended claims.
A control unit 110 may be used to autonomously control the machine 100. The control unit may be located external to the machine (as shown in
An issue with autonomous operation of floor grinders is that the setup of the indoor positioning system is time-consuming. For instance, if an IR system is used, then the exact location and orientation of the IR vision sensor in relation to the surface area to be processed must be determined with high accuracy. After this information has been obtained, it is necessary to measure and mark out the working area and make corrections in the vision sensor tilt angles to calibrate the whole setup.
The present disclosure relates to a calibration device which is portable and convenient to carry around. The calibration device comprises at least four IR sources arranged separated from each other on a structural member according to a known geometrical configuration. The calibration device can be used both to mark the boundary 240 of the surface area 220 to be processed by the machine, and at the same time to calibrate the vision sensor set-up, i.e., to determine the location of the sensor, its height h and viewing angle (aa, ab). The device is arranged to be placed at locations around the perimeter of the surface 220, and an image is captured by the vision sensor 210 for each location. Since the system knows exactly what the calibration device looks like in three dimensions (due to the known geometrical configuration of the infrared sources on the calibration device), it can determine from which angle the calibration device is viewed by the vision sensor at each location, and also the distance from the vision sensor 210 to the calibration device based on the scaling of the calibration device in the image (a far-away calibration device will be smaller than a device closer to the vision sensor). This way, the calibration device facilitates both defining the surface area 220 to be treated and at the same time allows for the vision sensor set-up to be calibrated. The calibration device can be used to obtain more information than absolutely necessary to calibrate the system. This is an advantage since each additional measurement or snap-shot of the calibration device improves the calibration accuracy by averaging out measurement errors and the like. Further examples of the calibration process will be given below.
When a vision sensor, such as a camera, is used to capture an image of a three-dimensional object, the shape of that object is projected onto a plane in dependence of the viewing angle and location of the camera.
If the viewing angle is changed to a2, the relative locations of the two object projections in the image changes to 332 and 342. For instance, the two-dimensional coordinates of the projection of the first object 330 changes from (x1, y1) on plane P1 to (x1′, y1′) on plane P2. From this it is appreciated that, as long as the objects 330, 340 are arranged separated from each other according to a known geometrical configuration, then the viewing angle of the vision sensor 210 can be determined or at least estimated based on the projections of the two objects in the image captured by the vision sensor 210. If the vision sensor 210 is an IR camera, then the projections correspond to pixels in a digital image.
In general, the parameters of the vision sensor set-up are the rotation of the sensor (the viewing angle) and the position of the vision sensor (or translation of the projection center). Suppose that the rotation of the sensor around an X-axis is ϕ, the rotation around a Y-axis is θ, and the rotation around a Z-axis is φ, then the corresponding rotation matrix is
R = R_X R_Y R_Z

where

R_X = \begin{pmatrix} 1 & 0 & 0 \\ 0 & \cos\phi & -\sin\phi \\ 0 & \sin\phi & \cos\phi \end{pmatrix}, \quad R_Y = \begin{pmatrix} \cos\theta & 0 & \sin\theta \\ 0 & 1 & 0 \\ -\sin\theta & 0 & \cos\theta \end{pmatrix}, \quad R_Z = \begin{pmatrix} \cos\varphi & -\sin\varphi & 0 \\ \sin\varphi & \cos\varphi & 0 \\ 0 & 0 & 1 \end{pmatrix}
In this application, rotation about one axis can be defined as corresponding to vision sensor roll, which can be disregarded. Thus, only two rotation angles need to be considered in this context. A projection of a point in three dimensions to a point in two dimensions can be written (using homogenous coordinates) as

\lambda \begin{pmatrix} x' \\ y' \\ 1 \end{pmatrix} = \begin{pmatrix} R & t \end{pmatrix} \begin{pmatrix} x \\ y \\ z \\ 1 \end{pmatrix}

where (x′,y′) is the projection in two dimensions of the point (x,y,z), e.g., (x′,y′) may be illuminated pixels in a captured image of an infrared source, and t denotes the translation of the projection center. A distance-dependent scaling factor λ is also introduced for more complex objects. The further away from the vision sensor the object is, the smaller it of course appears in the image, which effect is captured through λ. The scaling factor therefore carries information about the distance from the object to the vision sensor.
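By way of a non-limiting illustration, the sketch below evaluates this projection numerically for a single point, assuming a simple pinhole model with unit focal length and no lens distortion; the function names and the example coordinates are assumptions made for the sketch only.

```python
import numpy as np

def rotation_matrix(phi, theta):
    """Rotation about the X axis (phi) and the Y axis (theta); the roll angle
    (assumed here to be about the Z axis) is disregarded, as discussed above."""
    rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(phi), -np.sin(phi)],
                   [0.0, np.sin(phi),  np.cos(phi)]])
    ry = np.array([[ np.cos(theta), 0.0, np.sin(theta)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(theta), 0.0, np.cos(theta)]])
    return rx @ ry

def project(point_xyz, phi, theta, sensor_position):
    """Project a 3D point to image coordinates (x', y') and return the
    distance-dependent scaling factor lambda as well."""
    p = rotation_matrix(phi, theta) @ (np.asarray(point_xyz, float)
                                       - np.asarray(sensor_position, float))
    lam = p[2]                      # scaling grows with distance to the sensor
    return p[0] / lam, p[1] / lam, lam

# Example: an infrared source 4 m in front of and 2 m below the sensor
print(project([0.0, -2.0, 4.0], phi=0.1, theta=0.05, sensor_position=[0, 0, 0]))
```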
The calibration devices disclosed herein comprise at least four infrared sources arranged separated from each other on a structural member according to a known geometrical configuration. These infrared sources will result in illuminated pixels in an image captured by the vision sensor 210. The spatial relationship between the infrared sources can be described using a matrix

\begin{pmatrix} x_1 & x_2 & x_3 & x_4 \\ y_1 & y_2 & y_3 & y_4 \\ z_1 & z_2 & z_3 & z_4 \end{pmatrix}
where each column represents the location in three dimensions of an infrared source. Changing the viewing angle of the vision sensor is equivalent to applying a rotation to the vectors in the above matrix. Changing the position of the calibration devices with respect to the vision sensor 210 will also show up as a scaling and a rotation of the location vectors. Therefore, the viewing angle and distance to the vision sensor can be determined from the projections of the infrared sources onto the image captured by the vision sensor. To see this, imagine comparing a captured image of a calibration device to a simulated two-dimensional image obtained by rotation, scaling and projection of the known three-dimensional shape onto a two-dimensional surface. By testing a range of rotation angles and scaling, a match can be found between the captured image and the simulated projection—this parameterization corresponds to the viewing angle and vision sensor distance to the calibration device. Of course, the viewing angle and distance can also be determined using known mathematical methods. The mathematics related to projection of three-dimensional objects onto two-dimensional planes is known in general and will therefore not be discussed in more detail herein.
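A minimal sketch of the brute-force matching described above is given below. The assumed source geometry (a centre source plus three sources on orthogonal 0.25 m arms), the candidate angle and scale grids, and the sum-of-squared-pixel-error metric are all illustrative assumptions; any standard pose-estimation routine could be used instead.

```python
import numpy as np
from itertools import product
from scipy.spatial.transform import Rotation

# Assumed known geometry (metres): columns are the infrared source locations,
# here a centre source plus three sources at the ends of orthogonal 0.25 m arms.
SOURCES = np.array([[0.0, 0.25, 0.0,  0.0],
                    [0.0, 0.0,  0.25, 0.0],
                    [0.0, 0.0,  0.0,  0.25]])

def simulate_pixels(phi, theta, scale):
    """Rotate the known sources by the candidate viewing angles, apply the
    candidate scaling and project onto the image plane."""
    rotated = Rotation.from_euler("xy", [phi, theta]).as_matrix() @ SOURCES
    return (scale * rotated)[:2].T            # (4, 2) simulated pixel locations

def match(captured_pixels,
          angles=np.linspace(-0.5, 0.5, 41),   # candidate viewing angles (rad)
          scales=np.linspace(0.5, 2.0, 31)):   # candidate distance scalings
    """Exhaustively compare simulated projections against the captured image
    and return the best-matching (phi, theta, scale)."""
    best, best_err = None, np.inf
    for phi, theta, s in product(angles, angles, scales):
        err = np.sum((simulate_pixels(phi, theta, s) - captured_pixels) ** 2)
        if err < best_err:
            best, best_err = (phi, theta, s), err
    return best
```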
The effect of projecting a set of point sources 410 arranged separated from each other according to a known geometrical configuration onto a plane 420 is schematically illustrated in
Given a number of snapshots of the calibration device from different angles and at different locations, the viewing angle and the position of the vision sensor can be estimated with increased accuracy. Each snapshot gives two additional equations for each infrared source, one equation for the pixel coordinate x′ and one equation for the pixel coordinate y′. Since the at least four infrared sources on the calibration device are arranged separated from each other on a structural member according to a known geometrical configuration, the relationship between the different infrared sources is known. If many snapshots are available, then the system of equations will be overdetermined. In this case the estimation of vision sensor set-up and surface area boundary can be performed by, e.g., least squares minimization, constrained optimization, or by any other known optimization technique. Such techniques for determining vision sensor viewing angles from projections are known in general and will therefore not be discussed in more detail herein.
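As a non-limiting illustration of the overdetermined case, the sketch below jointly refines the sensor tilt angles, the sensor height and the per-snapshot device positions with a standard least-squares solver, assuming a flat floor and a world frame with its origin directly below the sensor; the parameterization and the initial guess are assumptions made for the sketch.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def residuals(params, pixel_sets, sources):
    """Stacked pixel residuals over all snapshots.

    params = [phi, theta, h, x_1, y_1, ..., x_N, y_N]: the sensor tilt angles
    and height above the floor, plus the floor position of the calibration
    device at each of the N snapshot locations (flat floor assumed, z = 0).
    The world frame is chosen with its origin directly below the sensor.
    """
    phi, theta, h = params[:3]
    device_xy = np.reshape(params[3:], (-1, 2))
    r = Rotation.from_euler("xy", [phi, theta]).as_matrix()
    res = []
    for (x, y), pixels in zip(device_xy, pixel_sets):
        world = sources + np.array([[x], [y], [0.0]])       # device on the floor
        cam = r @ (world - np.array([[0.0], [0.0], [h]]))   # sensor at height h
        proj = cam[:2] / cam[2]              # two equations per infrared source
        res.append((proj.T - np.asarray(pixels)).ravel())
    return np.concatenate(res)

# pixel_sets: one measured (4, 2) pixel array per snapshot location;
# sources: the known 3x4 geometry of the infrared sources;
# x0: a rough initial guess, e.g. a level sensor 3 m up and devices 5 m away:
# x0 = [0.0, 0.0, 3.0] + [5.0, 0.0] * len(pixel_sets)
# sol = least_squares(residuals, x0, args=(pixel_sets, sources))
```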
An especially low-complexity approach for calibrating a floor surfacing system will be discussed below in connection to
The calibration device is arranged to be positioned at one or more locations around a perimeter of the surface 220 to be processed in view from an infrared vision sensor 210, whereby the infrared vision sensor may obtain images of the calibration device at the one or more locations.
When in use, the common plane may be arranged parallel to the surface which is to be processed, i.e., the calibration device can be deployed with the three infrared sources in the common plane facing downwards towards the surface 220.
In the example shown in the appended drawings, the locations of the infrared sources can be described by the matrix

\begin{pmatrix} 0 & d & 0 & 0 \\ 0 & 0 & d & 0 \\ 0 & 0 & 0 & d \end{pmatrix}

where d is the arm length, the first column represents the center diode 510, and the other columns represent the other three diodes (one axis per infrared source). In other words, according to some aspects, the normal vector intersects one of the infrared sources located in the common plane. Preferably, the normal vector intersects the center diode 510. This way the four diodes are arranged in a pyramid shape with the center diode forming the peak of the pyramid.
However, it is appreciated that other geometrical configurations than the one shown in
The calibration device 500 is, according to some aspects, arranged to mark the location and spatial configuration of obstacles on the surface 220. For instance, there may be a well or other structure which will interfere with the floor grinding. The calibration device can be deployed in connection to the obstacle and a signal can be generated and transmitted to the control unit 110 indicating the presence of an obstacle. The vision sensor 210 may capture an image showing the location of the obstacle. The spatial configuration of the obstacle can be marked, e.g., by a circle having a pre-determined or configurable radius. The spatial configuration of the obstacle can also be marked by deploying the calibration device at locations along a perimeter of the obstacle and triggering an image capture by the vision sensor at each such location. The control unit 110 can then determine the spatial extension of the obstacle and maneuver the floor grinding machine accordingly. Thus, the control unit may be arranged to receive a signal from the calibration device indicating the presence of an obstacle, and to determine the spatial configuration of the obstacle based on the signal from the calibration device. This signal may be a radio signal, or a modulation applied to the infrared sources (similar to a remote control for a television apparatus).
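Purely as an illustration of how a control unit might represent such an obstacle internally, the sketch below builds either a circular keep-out zone with a configurable radius or a polygon from several marked perimeter positions; the use of the shapely library and all names are assumptions made for the sketch, not part of the disclosure.

```python
from shapely.geometry import Point, Polygon

def obstacle_region(marked_points, radius=0.5):
    """Build a keep-out region from calibration-device positions.

    A single marked point becomes a circle with the configured radius;
    several points are treated as a perimeter and become a polygon.
    """
    if len(marked_points) == 1:
        return Point(marked_points[0]).buffer(radius)
    return Polygon(marked_points)

# Example: an obstacle marked at three positions along its perimeter
region = obstacle_region([(2.0, 3.0), (2.5, 3.4), (2.1, 3.9)])
print(region.contains(Point(2.2, 3.4)))   # True if inside the keep-out zone
```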
The calibration device 500 is particularly suitable for calibration of floor surfacing systems to process rectangular surfaces, since the calibration device can be positioned at the corners of the rectangular surface, whereupon the geometry can be easily established by aligning the axes of the coordinate systems defined by the calibration device when located in the different corners.
A scenario like this is schematically illustrated in
With reference again to
According to some other aspects, the calibration device 500 comprises a trigger mechanism 521 arranged to transmit a trigger signal to the infrared vision sensor 210. The trigger signal is configured to trigger an image capture action by the infrared vision sensor 210. The trigger mechanism may, e.g., be a push-button 521 as shown in
In the example of
According to aspects, a first arm 560 and a second arm 570 extend at right angles from a third arm 550. The infrared sources are arranged at the end points of the arms 550, 560, 570. The distance between the fourth infrared source 510 and the other three infrared sources 520, 530, 540 may be between 5 cm and 50 cm, and preferably between 20 cm and 30 cm.
The first arm 560 and the second arm 570 optionally extend at right angles from each other. This type of configuration is illustrated in
Thus, according to aspects, the angle A between the first arm 560 and the second arm 570 is configurable. The calibration device may also comprise an angle sensor 910 configured to measure the angle A between the first arm 560 and the second arm 570. The output from the angle sensor can be communicated to the vision sensor or to the control unit 110, which then can adjust the determining of, e.g., viewing angle and distance from the vision sensor to the calibration device in dependence of the configurable angle A.
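A non-limiting sketch of how the measured angle A could be folded into the known geometry is given below: the infrared source on the second arm is simply placed at angle A from the first arm, instead of at the nominal 90 degrees, before the source locations are used in the calibration computations. The arm length and the naming are assumptions made for the sketch.

```python
import numpy as np

def source_locations(angle_a_deg, arm_length=0.25):
    """Known geometry as columns, with the second in-plane arm at angle A
    (measured from the first arm) instead of the nominal 90 degrees."""
    a = np.radians(angle_a_deg)
    centre     = [0.0, 0.0, 0.0]
    first_arm  = [arm_length, 0.0, 0.0]
    second_arm = [arm_length * np.cos(a), arm_length * np.sin(a), 0.0]
    normal_arm = [0.0, 0.0, arm_length]
    return np.column_stack([centre, first_arm, second_arm, normal_arm])

# Example: a corner of 105 degrees reported by the angle sensor 910
print(source_locations(105.0))
```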
First, the lower three pixels 740 in each group or cluster of illuminated pixels are selected. These three pixels correspond to the three infrared sources 510, 530, 540 located in the common plane.
Assuming the calibration device has been positioned on a plane surface, straight lines are then drawn from the center diode pixel 750 through the other two common plane pixels 760. These lines, due to the position of the calibration device in a corner, represent boundary lines of the rectangular surface. These ‘imaginary’ lines 711, 712, 721, 722, 731, 732 are shown in
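A minimal sketch of these two steps (selecting the lower three pixels and forming the lines through the centre-diode pixel) is given below; the heuristic used to identify the centre-diode pixel and the image-coordinate convention are assumptions made for the sketch.

```python
import numpy as np

def boundary_directions(cluster_pixels):
    """From one cluster of four illuminated pixels (rows of (x, y)), keep the
    three pixels lowest in the image (the common-plane sources, assuming the
    image y coordinate grows downwards) and return the two image-plane lines
    from the centre-diode pixel through the other two pixels."""
    lowest = cluster_pixels[np.argsort(cluster_pixels[:, 1])[-3:]]
    # Assumption for this sketch: the centre-diode pixel is the one closest to
    # the centroid of the three lowest pixels.
    centre_idx = np.argmin(np.linalg.norm(lowest - lowest.mean(axis=0), axis=1))
    centre = lowest[centre_idx]
    others = np.delete(lowest, centre_idx, axis=0)
    return [(centre, other - centre) for other in others]  # (point, direction)

# Example cluster: centre diode, two common-plane diodes and the raised diode
cluster = np.array([[100, 200], [150, 210], [55, 212], [102, 150]])
for point, direction in boundary_directions(cluster):
    print(point, direction)
```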
The left-most or the right-most group or cluster of illuminated pixels is then selected 770. These pixels will correspond to a calibration device positioned along the same wall in connection to which the vision sensor is deployed. This calibration device will be viewed directly from the side by the vision sensor 210, as illustrated in
It is appreciated that more advanced methods can be applied to calibrate the floor surfacing system based on captured images of the calibration device at different locations.
It is appreciated that other shapes than that shown in
Particularly, the processing circuitry 1110 is configured to cause the control unit 110 to perform a set of operations, or steps, such as the methods discussed in connection to
The storage medium 1130 may also comprise persistent storage, which, for example, can be any single one or combination of magnetic memory, optical memory, solid state memory or even remotely mounted memory.
The control unit 110 may further comprise an interface 1120 for communications with at least one external control unit. As such the interface 1120 may comprise one or more transmitters and receivers, comprising analogue and digital components and a suitable number of ports for wireline or wireless communication.
The processing circuitry 1110 controls the general operation of the control unit 110, e.g., by sending data and control signals to the interface 1120 and the storage medium 1130, by receiving data and reports from the interface 1120, and by retrieving data and instructions from the storage medium 1130. Other components, as well as the related functionality, of the control node are omitted in order not to obscure the concepts presented herein.
Consequently, there is disclosed herein a control unit 110 for calibrating a floor surfacing system 200. The control unit comprises an interface 1120 for receiving infrared image data from an infrared vision sensor 210 and processing circuitry 1110, wherein the infrared image data comprises pixel locations in two dimensions indicating locations of a calibration device around a perimeter of a surface area 220 to be treated by the floor surfacing system 200. The calibration device comprises at least four infrared sources 510, 520, 530, 540 arranged separated from each other on respective structural members 501 according to a pre-determined geometrical configuration, wherein the processing circuitry 1110 is configured to determine a spatial configuration h, aa, ab of the infrared vision sensor 210 based on the pixel locations and on the pre-determined geometrical configuration.
According to some aspects, the processing circuitry 1110 is further arranged to determine a boundary of the surface area 220 to be treated by the floor surfacing system 200 based on the pixel locations.
The control unit 110 is optionally arranged to receive data from a calibration device 1000 indicating an angle A between a first arm 560 and a second arm 570 of the calibration device, and to determine the boundary of the surface area 220 to be treated by the floor surfacing system 200 based also on the angle A.
There is furthermore disclosed herein a system for calibrating a floor surfacing system 200. The system comprises one or more calibration devices 500, 1000 according to the discussion above, a control unit 110 as shown in
According to some aspects, the method also comprises determining S4 a boundary of the surface area 220 to be treated by the floor surfacing system 200 based on the pixel locations and on the pre-determined geometrical configuration.
According to some such aspects, the boundary of the surface area 220 to be treated by the floor surfacing system 200 is determined S41 under the assumption of a flat surface supporting the calibration device 500, 1000 at each location.
Foreign application priority data: 1951505-5, December 2019, SE (national).
International filing data: PCT/SE2020/051032, filed 10/26/2020 (WO).