This application is a National Phase application of International application PCT/EP2016/074371, filed Oct. 11, 2016, which claims priority of German application DE 102015220289.5, filed Oct. 19, 2015; both applications are incorporated herein by reference.
The invention relates to a method and to a measuring system for measuring a movable object, for example, a lateral guide for cast strands in a metallurgical casting or rolling installation. In addition, the invention relates to such a casting or rolling installation.
Laser-supported methods and systems for measuring objects, for example rolls or lateral guides in metallurgical installations, are known in principle in the prior art, for example from the US Patent Application US 2005/0057743 A1 or the Korean Patent Application KR 10-2004-0045566.
The published unexamined German patent application DE 10 2011 078 623 A1 discloses a method and a device for determining the position of a roll in a rolling mill. The measuring device disclosed there uses a light source for emitting a light beam in the form of a collimated light ray. Such a light ray is necessary so that, when the light ray strikes a mirror on the roll to be measured, a defined area of the mirror is irradiated, in such a manner that a defined reflected light beam can reach a receiving device. The receiving device is a two-dimensional receiver which is designed to receive the received light beam resolved two dimensionally. Moreover, an evaluation device is provided which evaluates the image of the light beam received by the receiver unit.
The object of the invention is to provide an alternative method and an alternative measuring system for measuring a movable object, which are characterized by particularly simple and time-saving handling.
This aim is achieved in terms of procedure by the method claimed in claim 1. This method provides the following steps:
The light source according to the invention is preferably a laser light source, since such a laser light source automatically already brings along with it the property—essential for the invention—of emitting parallel light rays. Alternatively, the required parallelism of the light rays can also be brought about with the aid of a suitable optical system, in particular a converging lens.
The term “ . . . light rays influenced by the object” is understood to mean that the light rays which have been emitted by the light source are interrupted by the object and absorbed, deflected away or deflected onto the receiving device by the object. The “ . . . light rays not influenced by the object” go from the light source to the receiving device without being interrupted by the object, optionally after reflection by a device other than the object.
The sensors of the sensor field which are associated with the light rays which are emitted but are influenced by the object are sensors which either receive none of the emitted light rays, since the light rays are absorbed by the object or deflected away from the receiving device, or which receive those emitted light rays which are reflected by the object.
The resolution of the sensor field is represented by the known spacings of the sensors of the sensor field.
The device according to the invention offers the advantage that all the desired information for measuring the object can be determined in a simple manner by evaluating the image. The evaluation of the image can preferably be performed fully automatically or semi-automatically, which considerably simplifies the use of the method for an operator and considerably reduces the time required for determining the desired information.
According to a first embodiment example, the spacings of the positions of the sensors on the sensor field do not have to be identical. It is only important that the spacings are known, since knowledge of these spacings is required for the computation of the various pieces of information described below.
The image with the positions of the sensors of the sensor field can be displayed on a display device for an operator.
For determining the actual depth of penetration of the object into the spatial area spanned by the light beam of the light rays, the image is evaluated in such a manner that the known spacings of all the sensors which are associated with light rays which are emitted but are influenced by the object are added up, in the direction of the movement of the object in the image.
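The summation described above can be sketched, purely for illustration, in the following Python fragment; the function name, the example spacings and the shadow mask are assumptions and not part of the disclosure.

```python
# Illustrative sketch only: sums the known sensor spacings that are
# associated with light rays influenced (blocked or deflected) by the
# object, in the direction of the object's movement.
def actual_penetration_depth(spacings, influenced):
    return sum(d for d, hit in zip(spacings, influenced) if hit)

# Example: non-uniform spacings (e.g. in mm); True marks a sensor whose
# ray is influenced by the object.
depth = actual_penetration_depth([1.0, 1.0, 1.2, 1.2, 1.5],
                                 [True, True, True, False, False])
# depth is approximately 3.2 mm
```

Because the spacings need not be identical, the sum over the individual spacings, rather than a simple sensor count, yields the penetration depth.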
The actual penetration depth thus determined can then be compared with a predetermined target penetration depth. If the actual penetration depth deviates from the target penetration depth, a preferably automatic correction of the final position of the object can occur until the actual position is in agreement with the target position. Preferably, an error message can also be generated and displayed on the display device if the actual penetration depth differs from the target penetration depth.
According to another advantageous design of the method, the actual penetration depth can be determined individually for different areas of the object by evaluating the image of the sensor field. These individual actual penetration depths can be compared with associated individual target penetration depths for the different areas of the object. If the individual actual penetration depth is in agreement with the individual target penetration depth for some areas of the object, while this is not the case for other areas, it can be deduced that there is partial wear of those other areas. The amount of wear is then represented by the magnitude of the difference between the individual actual penetration depth and the individual target penetration depth of the affected areas.
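The area-by-area comparison of actual and target penetration depths can be sketched as follows; the area names and numerical values are hypothetical.

```python
# Illustrative sketch only: a positive difference between the individual
# actual penetration depth and the individual target penetration depth
# of an area is read as partial wear of that area.
def wear_per_area(actual_depths, target_depths):
    return {area: actual_depths[area] - target_depths[area]
            for area in actual_depths}

wear = wear_per_area({"upper": 5.0, "lower": 6.5},
                     {"upper": 5.0, "lower": 5.0})
# wear == {"upper": 0.0, "lower": 1.5}: only the lower area shows wear
```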
The determined difference between the target penetration depth and the actual penetration depth relative to the entire object, or the amount of wear of the other areas of the object, can be stored as an offset value. In future positionings of the object, the offset value can then be taken into consideration automatically, and the object can thus immediately be positioned accurately.
In addition to the possibility of computing the penetration depth of the object in the spatial area spanned by the light rays, the evaluation of the image according to the invention also offers the possibility of determining the speed with which the object penetrates into the spatial area spanned by the light beam. This occurs according to the invention by the following steps: measurement of the path length which the object travels during its entry into the spatial area, by adding up the known spacings of all the positions of the sensors of the sensor field in the image which are associated with the light rays which are emitted but are influenced by the introduced object, in the direction of the movement of the object during a certain time interval, and determination of the speed by division of the path length by the time interval.
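The two steps above can be sketched, purely for illustration, as follows; the function name, the example spacings, the shadow masks and the time interval are assumptions.

```python
# Illustrative sketch only: the path length is the summed spacing of the
# sensors that became shadowed during the time interval dt; the speed is
# the path length divided by dt.
def penetration_speed(spacings, shadowed_before, shadowed_after, dt):
    path = sum(d for d, b, a in zip(spacings, shadowed_before, shadowed_after)
               if a and not b)
    return path / dt

v = penetration_speed([1.0, 1.0, 1.0, 1.0],
                      [True, False, False, False],
                      [True, True, True, False],
                      dt=0.5)
# v == 4.0: the object traveled 2.0 length units in 0.5 s
```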
Moreover, the evaluation of the image also provides the possibility of determining the contour of the object.
The method according to the invention can provide that the light rays emitted by the light source, to the extent that they are not interrupted by the object, are deflected with the aid of a reflector before they strike the receiving device.
The object can be, for example, a lateral guide on a transport path, for example, on a roller table for a slab. The light source then has to be installed in such a manner that the light rays extend perpendicularly to the direction of movement of the lateral guide, and that the lateral guide, in its movement, enters the spatial area spanned by the light beam of the light rays.
The aim of the invention is achieved moreover by the measuring system and by a casting or rolling installation with the measuring system according to the invention. The advantages of this measuring system and of the claimed casting or rolling installation correspond to the advantages mentioned above in reference to the claimed method. Additional advantageous designs of the method and of the measuring system are the subject matter of the dependent claims.
Six figures are attached to the description in which:
The invention is described in detail below in reference to the mentioned
According to
The receiving device 120 is arranged in
The image 122 can be visualized on a display device 160 for an operator.
According to the invention, an evaluation device 140 is associated with the receiving device, in order to investigate the image 122, for example with regard to the penetration depth (s) of an object in the spatial area spanned by the light beam of the light rays, the speed and/or the contour of the object.
The transmission of the data of this image to the display device 160 can occur by wire or wirelessly. All the electronic devices of the measuring system 100, in particular the light source 110, the receiving device 120 and the evaluation device 140, can be supplied with electrical energy with the aid of an electrical energy source belonging to the measuring system, for example, a battery or an accumulator.
The images shown in
In
The described measuring system 100, before its use, is built into the casting or rolling installation and calibrated there. In this context, calibration is understood to mean first of all the fine adjustment or fine positioning of the light source, the receiving device and optionally the reflector device with the aid of associated setting elements, so that they are oriented optimally with respect to one another and can interact.
After the measuring system 100 has been built into the installation 300 and after the calibration of the measuring system, the latter is ready for carrying out the method according to the invention for measuring a movable object, according to
The light source 110 is activated for emitting parallel light rays 130. The lateral guide 200 is then introduced transversely to the propagation direction of the light rays 130 into the spatial area spanned by the light rays (arrow direction in
In both cases, the receiving device 120 generates the image 122, on the one hand, with the positions of the sensors of the sensor field which receive the light rays which are emitted and not influenced by the lateral guide. These positions are indicated in
The image 122 thus generated is evaluated subsequently by the evaluation device 140 according to the invention with regard to different aspects.
On the one hand, the evaluation device 140 is designed to evaluate the image with regard to the actual penetration depth (s) of the object 200 or of the lateral guide into the spatial area spanned by the light beam of the light rays. In particular, the determination occurs by adding up the known spacings di, dj of all the emitted but not received light rays in the direction of the movement of the lateral guide in the image. This movement direction is indicated in
The penetration depth (s) determined by evaluation of the image 122 is the so-called actual penetration depth (s). The method according to the invention can provide that this actual penetration depth is compared with a predetermined target penetration depth, wherein this target penetration depth represents a target position for the object 200 or the lateral guide, for example, in a casting or rolling installation. A determined deviation of the actual penetration depth from the target penetration depth as a rule means that the target position has not been reached correctly, and accordingly, an actuator 210, which is used for positioning the lateral guide 200, has to be re-positioned or re-calibrated. In the context of the calibration, which can preferably also occur automatically, the actuator 210 is set in such a manner that the object again reaches its predetermined target position, i.e., the adjustment of the actuator occurs until the actual position is in agreement with the target position. The initially detected deviation of the actual penetration depth from the target penetration depth can also be stored as an offset value in a control for the actuator, so that it can be taken into consideration regularly in future activations of the actuator. The offset value can also serve for generating an error message, which can be displayed on the display device 160, for example.
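The correction loop described above can be sketched as follows; `measure_depth`, `move_actuator`, the `Guide` stand-in and the tolerance are hypothetical and merely illustrate the interaction of the measuring system with the actuator 210.

```python
# Illustrative sketch only: re-position the actuator until the actual
# penetration depth agrees with the target, and keep the initially
# detected deviation as an offset value.
def correct_position(measure_depth, move_actuator, target, tol=0.05):
    offset = measure_depth() - target      # initially detected deviation
    while abs(measure_depth() - target) > tol:
        move_actuator(target - measure_depth())
    return offset

class Guide:                               # stand-in for the lateral guide
    def __init__(self, depth): self.depth = depth
    def measure(self): return self.depth
    def move(self, delta): self.depth += delta

guide = Guide(6.5)
offset = correct_position(guide.measure, guide.move, target=5.0)
# offset == 1.5 and guide.depth is now at the target position
```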
The determination of the actual penetration depth can occur individually or separately for different areas 202, 204 of the object, see
The difference between the target penetration depth and the actual penetration depth relative to the entire object, or the amount of wear of a certain section of the object, can, as described, be determined by evaluating the image. The difference or the amount of wear is then preferably stored as an offset value in the control associated with the actuators, so that it can be taken into consideration automatically in future, preferably automatic, re-positioning procedures.
Irrespective of the possibility of determining the actual penetration depth, the evaluation of the image 122 by the evaluation device 140 also makes it possible to determine the speed with which the object or the lateral guide 200 penetrates into the spatial area spanned by the light rays. For this purpose, a path length traveled by the object 200 on entering the spatial area is measured by adding up the known spacings of all the sensors of the sensor field which are associated with the light rays which are emitted but are influenced by the object, in the direction of movement of the object during a certain time interval. For determining the speed, the measured path length is then divided by the measured time interval. The path length can be the entire penetration depth or a partial length thereof.
Moreover, the evaluation of the image also makes it possible to determine the contour of the object which penetrates into the spatial area spanned by the light rays. The contour 230 corresponds to the boundary line between the positions 130 of the sensors which receive the light rays not influenced by the object and the positions 132 of the sensors which are associated with the light rays influenced by the object, as can be seen in
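The tracing of this boundary line can be sketched, purely for illustration, on a small hypothetical sensor field of shadow flags; the function name and the example field are assumptions.

```python
# Illustrative sketch only: on a two-dimensional field of shadow flags
# (True = ray influenced by the object), the contour is traced as, per
# column, the index of the first shadowed sensor row.
def contour(shadow_field):
    edge = []
    for column in zip(*shadow_field):       # walk the field column-wise
        rows = [i for i, hit in enumerate(column) if hit]
        edge.append(min(rows) if rows else None)
    return edge

field = [[False, False, False],
         [False, True,  False],
         [True,  True,  False]]
# contour(field) == [2, 1, None]: the middle column is shadowed from row 1
# on, the left column from row 2, and the right column not at all
```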
Number | Date | Country | Kind |
---|---|---|---|
102015220289.5 | Oct 2015 | DE | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/EP2016/074371 | 10/11/2016 | WO | 00 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2017/067823 | 4/27/2017 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
4548503 | Liesch | Oct 1985 | A |
4821544 | Tamler | Apr 1989 | A |
5932888 | Schwitzky | Aug 1999 | A |
6175419 | Haque | Jan 2001 | B1 |
6292262 | Ciani | Sep 2001 | B1 |
6889441 | Seiffert | May 2005 | B2 |
6968625 | Segerstrom | Nov 2005 | B2 |
7126144 | De Coi | Oct 2006 | B2 |
8755055 | Khajornrungruang | Jun 2014 | B2 |
9163934 | Hirabayashi | Oct 2015 | B2 |
20030051354 | Segerstrom | Mar 2003 | A1 |
20050057743 | Seiffert | Mar 2005 | A1 |
20060145101 | De Coi | Jul 2006 | A1 |
20090113968 | Pawelski | May 2009 | A1 |
20130128285 | Khajornrungruang | May 2013 | A1 |
20140347679 | Hirabayashi | Nov 2014 | A1 |
Number | Date | Country |
---|---|---|
202162217 | Mar 2012 | CN |
203648997 | Jun 2014 | CN |
103990654 | Aug 2014 | CN |
2140939 | Mar 1973 | DE |
19704337 | Aug 1998 | DE |
102011078623 | Jan 2013 | DE |
298588 | Jan 1989 | EP |
2100475 | Dec 1982 | GB |
07323357 | Dec 1995 | JP |
2002035834 | Feb 2002 | JP |
2006055861 | Mar 2006 | JP |
2010155274 | Jul 2010 | JP |
20040045566 | Jun 2004 | KR |
9001141 | Feb 1990 | WO |
Entry |
---|
Micro-Epsilon optoCONTROL // Optical precision micrometers Oct. 22, 2014, pp. 1-20. |
Spurstabil, Nov. 1, 2011, 3 pages. |
Spaltsensoren (Gapsensors): Columns and Edges on Tracks, Aug. 15, 2011, 2 pages. |
Aligning Continuous Casters and Steel Mill Rolls, Hamar Laser, pp. 2-5. |
Number | Date | Country | |
---|---|---|---|
20180306834 A1 | Oct 2018 | US |