The present application is a U.S. national stage application under 35 U.S.C. §371 of PCT Application No. PCT/IB2011/000955, filed May 5, 2011, which claims priority to Italian Application No TO2010A000377, filed May 5, 2010, the entireties of which are incorporated herein by reference.
The present invention relates to a system and related method for determining vehicle wheel alignment.
Systems are known for determining vehicle wheel alignment, in particular for a motor vehicle, which enable the automatic measuring of one or more characteristic angles of the wheels, for example, the convergence and camber angles, in order to check the correct reciprocal alignment of the wheels. In fact, as is known, incorrect alignment can cause excessive or uneven tyre wear and also cause driving and vehicle stability problems.
In general, systems for determining vehicle wheel alignment are configured to detect the orientation of the plane of each wheel with respect to a single set of three orthonormal axes taken as reference (it should be noted that the “plane of the wheel” is intended as the plane on which the outer side surface of the wheel lies), so as to enable suitable corrective actions to be taken to restore the reciprocal alignment of the wheels.
In particular, some systems envisage the use of detection elements for the characteristic angles, or in any case suitable sensitive elements, directly connected to the wheels of the vehicle via special mounting devices (so-called “clamps”), in order to identify the set-up geometry; in this case, great care is needed in mounting them on the wheels in order to avoid damaging delicate parts.
Other systems move the observation point outside of the vehicle, so as to define a fixed reference system with respect to that of the set-up, by means of observing angular variations of the wheels through one or more image acquisition devices unconstrained by the orientation of the vehicle. In particular, some systems contemplate positioning the image acquisition devices directly on the car lift (able to raise the vehicle under observation in a known manner); other systems contemplate positioning the same image acquisition devices on structures that are fixed or independently movable, located at a distance from and free with respect to both the vehicle and the car lift. In the first case, the image acquisition devices follow the movements of the car lift, but, because of this, they must dynamically compensate for distortion; in the second case, the image acquisition devices must lock onto the car lift via controlled movements so as to remain aimed at the wheels, but do not need to compensate for distortion.
Usually, such systems use suitable targets mounted on the wheels of the vehicle so as to highlight their rotation and position in space.
In particular, the targets have a flat surface depicting two-dimensional images of various shapes that can be recognised by the image acquisition devices. A processing device coupled to the image acquisition devices generally performs a so-called “best fit” operation on the geometries of the two-dimensional images identified on a generally flat surface forming part of the real target and the two-dimensional images that the image acquisition devices provide in their reference system. This operation enables the spatial orientation of the target to be determined dynamically and therefore to define elementary rotations and translations regarding the linear and angular movement of each wheel within a single reference system (for example, the vehicle's reference system). Afterwards, these elementary rotations and translations, opportunely linked together, are used for the definition of further, more complex rotations and translations that more specifically concern the vehicle's set-up and alignment characteristics.
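As a hedged sketch of the "best fit" operation just described, the following example recovers a planar rotation and translation by least squares from known model points and their observed image positions. The SVD-based (Kabsch) alignment shown here is one common way to carry out such a fit; all names and values are invented for illustration and are not the method of any specific system.

```python
import numpy as np

def best_fit_2d(model, observed):
    """Least-squares (Kabsch) fit: the angle and translation minimizing
    ||R @ m_i + t - o_i||^2 over all model/observed point pairs."""
    mc, oc = model.mean(axis=0), observed.mean(axis=0)
    M, O = model - mc, observed - oc
    U, _, Vt = np.linalg.svd(O.T @ M)     # SVD of the cross-covariance
    R = U @ Vt
    if np.linalg.det(R) < 0:              # guard against a reflection
        U[:, -1] *= -1
        R = U @ Vt
    t = oc - R @ mc
    return np.arctan2(R[1, 0], R[0, 0]), t

# Illustrative data: a square flat target rotated by 30 degrees and translated.
model = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
a = np.deg2rad(30.0)
R_true = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
observed = model @ R_true.T + np.array([2.0, 3.0])

angle, t = best_fit_2d(model, observed)
```

Real systems fit many more points in the camera's pixel reference system, but the least-squares principle is the same.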
For example, WO 2008/143614 A1 discloses an alignment determining system that contemplates the use of targets connected to the wheels of a vehicle. Each target is formed by a set of two-dimensional target elements (in particular, having the form of circles), arranged on multiple planes, mutually parallel to each other or forming a preset angle. The system provides for the identification of the target elements on their associated planes from the acquired images and the implementation of “best fit” algorithms, for example, the mean square mathematical algorithm, to determine the orientation of the targets with respect to a reference system.
However, this solution does not depart from the traditional ones either, as it maintains a typically two-dimensional analytical approach (by means of the best-fit operation, i.e. a mathematical rather than a geometrical solution), considering the displacement of each individual point identified in the acquired images on the associated plane with respect to the configuration of the real target. Accordingly, even this solution does not allow an increase in measurement resolution to be achieved for a given target size.
Although advantageous in certain aspects, known systems have the drawback of requiring a stereo system for image acquisition, which entails the presence of a plurality of image acquisition devices and acquired images for each target observed. Alternatively, in the case of using a sole image acquisition device, it becomes necessary to perform a suitable recognition procedure for the orientation of the target with respect to the sole image acquisition device, by means of observing the target during suitable movements of the vehicle (for example, forwards and backwards, the so-called “run-out” operation), or during suitable movements of the target itself.
In addition, acquisition of the geometrical characteristics of the two-dimensional target becomes difficult as the inclination of the target changes, resulting in inconsistency in the accuracy of the measurements taken.
Furthermore, in known systems that contemplate the identification of target elements as geometric points on a surface, measurement precision can be compromised in the case where one or more of these target elements are hidden or, in any case, cannot be seen by the image acquisition devices.
The need is therefore felt in this field for developing a system for determining the orientation of vehicle wheels that provides greater resolution and precision in angle measurements, does not contemplate the need to perform specific vehicle displacement manoeuvres in order to identify the orientation of the targets and that is also of simple and economic implementation.
The object of the present invention is to provide a system for determining the orientation of vehicle wheels that totally or partially solves the above-indicated drawbacks and satisfies the above-mentioned need.
According to the present invention, a system and a method for determining the orientation of vehicle wheels are thus provided, substantially as described in claims 1 and 17, respectively.
For a better understanding of the present invention, some preferred embodiments shall now be described, purely by way of non-limitative example and with reference to the attached drawings, where:
a-7b are perspective views of further embodiments of a target used in the system in
The system 1 comprises a plurality of targets 5, shown schematically, equal in number to the number of wheels 2, each target 5, the structure and function of which shall be described in detail further on, being mechanically coupled to a respective wheel 2 by a mounting element, or “clamp” (here not shown); this mounting element can be made, for example, as described in the Italian utility models IT-0000254272 and IT-0000254273, filed by the same Applicant.
The system 1 also comprises a first and a second image-capturing device 6a and 6b, for example, consisting of cameras arranged respectively on the right-hand and left-hand sides of the vehicle 3 with respect to the longitudinal axis A. In particular, the first image-capturing device 6a is placed so that a respective viewing area includes the wheels 2 on the left-hand side of the vehicle 3; similarly, the second image-capturing device 6b is placed so that a respective viewing area includes the wheels 2 on the right-hand side of the same vehicle 3. In particular, the image-capturing devices 6a and 6b are arranged with respect to the vehicle 3 and the associated wheels 2 such that each target 5 is only viewed by one of these image-capturing devices 6a and 6b.
Each image-capturing device 6a and 6b has a respective image reference system SdRtel, defined by a set of three orthogonal axes xtel, ytel and ztel, where the transverse axes xtel and ytel define the image plane associated with the two-dimensional images captured by the respective image-capturing device 6a and 6b (i.e. the plane on which the dimensions of the objects are evaluated by the number of pixels), and the orthogonal axis ztel coincides with the optical axis of the same image-capturing device 6a and 6b.
In the embodiment shown, the first and the second image-capturing devices 6a and 6b are carried on the same support structure 7, including a horizontal cross-beam that carries the same image-capturing devices 6a and 6b at its end portions; the support structure 7 is configured to enable automatic or manual positioning of the image-capturing devices 6a and 6b with respect to the vehicle 3 (or, in a similar manner, with respect to the car lift 4). Alternatively, in a manner not shown herein, the image-capturing devices 6a and 6b can be constrained to respective mutually independent vertical structures, with the possibility, for example, of sliding vertically to be able to lock onto the adjustment position of the car lift 4, or being constrained to the same car lift 4 so as to follow its movements.
The system 1 also comprises a processing device 8, for example, in the form of a personal computer or any other computer device equipped with a processor or similar calculation means, operatively connected to the first and second image-capturing devices 6a and 6b; in particular, the processing device 8 is connected to the first image-capturing device 6a by means of a first communications interface 9a, configured to implement wireless or wired data transfer (using any known technique), and is connected to the second image-capturing device 6b by means of a second communications interface 9b, this also configured to implement wireless or wired data transfer (again, using any known technique). As shall be described in detail further on, the processing device 8 is configured to process the two-dimensional images provided by the image-capturing devices 6a and 6b with reference to the respective image reference systems, in order to determine the spatial orientation characteristics of the wheels 2 of the vehicle 3.
In the embodiment shown, the system 1 further comprises a coupling structure 10, which shall also be described in detail in the following, configured to ensure that a desired reciprocal positioning and orientation relationship between the image reference systems SdRtel associated with the image-capturing devices 6a and 6b is maintained, so that it is possible to establish a relation between the associated angle measurements and so determine the alignment characteristics of the wheels 2 in a single common reference system (for example, the reference system of the vehicle 3).
According to one aspect of the present invention, also with reference to
In particular, each target 5 is composed of a plurality of target elements 12, these also having a three-dimensional shape, arranged as a whole to form the three-dimensional structure of the same target 5 and having a geometric shape such as to enable easy identification in the two-dimensional images taken by the image-capturing devices 6a and 6b. The target elements 12 are reciprocally arranged according to a three-dimensional geometric configuration definable by means of a given analytical expression (and associated with a “canonical” three-dimensional geometric shape), this analytical expression describing the reciprocal arrangement of these same target elements.
A (non-limitative) example of a target 5 is schematically shown in the above-mentioned
In particular, the target elements 12 are angularly equispaced from one another along the circumference of the respective outer or inner circular ring. In the embodiment shown, the outer ring is composed, for example, of twelve target elements (which in the above-mentioned
Each target element 12 has, as previously pointed out, a three-dimensional geometric shape and, in particular, a spherical shape. Advantageously, this spherical shape ensures that the same target elements 12 maintain an unaltered shape in two-dimensional images from whatever angle they are taken (within a given angular range), in this way being easily identifiable; in particular, the associated geometric centre, henceforth defined as the “sphere centre”, is easily identifiable in these two-dimensional images. In fact, the spheres exhibit isotropic characteristics both with respect to shape and with respect to reflection. Given that their shape remains circular, it is therefore possible to find the position of the individual target element 12 even in the case where it remains partially covered by other target elements due to the viewing angle. Furthermore, given their spherical shape, any reflections on the surface caused by light sources in the measurement environment are present on all the target elements 12 in the same position (typically central, if the main lighting is coaxial to the optical axis of the image-capturing devices 6a and 6b); the effects due to these reflections are therefore easily eliminated through post-processing.
In particular, it is possible to associate a set of three orthogonal axes Xtrg, Ytrg and Ztrg with the target 5 defining a target reference system SdRtrg, the spatial orientation of which corresponds to the orientation of the wheel 2 to which the same target 5 is integrally coupled.
In detail, a set of three mutually orthogonal vectors are identified inside the target 5, each one aligned along a respective orthogonal axis Xtrg, Ytrg and Ztrg. In particular, an orthogonal vector vztrg is identified, corresponding to the vector joining the two centres O1 and O2 of the outer and inner circular rings formed by the target elements 12. In this regard, it should be noted that in the described embodiment, the arrangement of the target elements 12 on two concentric rings positioned on two parallel planes is advantageous; in fact, even though the two rings may appear as two ellipses on the image plane, due to the inclination of the target 5, the related centres O1 and O2 are always identifiable and the vector joining these centres O1 and O2 always appears as the orthogonal vector vztrg associated with axis Ztrg of the real target 5. It follows that determining the displacement of the centres O1 and O2 allows the inclination of this axis Ztrg to be determined.
In addition, the fact that the images of the spheres must always be superimposable on the two ellipses enables possible errors committed during image processing to be detected and corrected, for example, those due to noise that may inevitably be added to the scene. In this regard, the positions returned by the image-processing algorithm are corrected so that they are brought as close as possible to the ellipse that interpolates the positions of the spheres for which image processing has returned a shape error below a preset threshold. This correction operation gives more stability to the position of the spheres in the image plane and therefore to the measurement. In particular, the spherical shape of the target elements 12 is advantageous in this respect, permitting the application of shape-factor evaluation algorithms (the shape of the target elements 12 must in fact be circular in the two-dimensional image).
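A minimal sketch of the ellipse-based reasoning above, assuming detected sphere centres lying on a ring seen obliquely: a least-squares conic fit recovers the interpolating ellipse, whose centre can then serve both for locating the ring centres O1 and O2 and as the curve onto which noisy detections are pulled back. The function name and all values are illustrative assumptions.

```python
import numpy as np

def fit_ellipse_center(pts):
    """Fit the conic a*x^2 + b*x*y + c*y^2 + d*x + e*y = 1 to the points
    by least squares and return the centre of the resulting ellipse."""
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x * x, x * y, y * y, x, y])
    a, b, c, d, e = np.linalg.lstsq(A, np.ones(len(pts)), rcond=None)[0]
    # The centre is the stationary point of the conic: setting the gradient
    # to zero gives [2a b; b 2c] @ (xc, yc) = (-d, -e).
    return np.linalg.solve(np.array([[2 * a, b], [b, 2 * c]]), [-d, -e])

# Twelve "sphere centres" of an outer ring, appearing as an ellipse
# centred at (5, 2) when the target plane is inclined.
t = np.linspace(0.0, 2.0 * np.pi, 12, endpoint=False)
ring = np.column_stack([5.0 + 3.0 * np.cos(t), 2.0 + 1.5 * np.sin(t)])
center = fit_ellipse_center(ring)
```

With the centres of the outer and inner rings recovered this way, their joining vector gives the projection of the orthogonal vector vztrg; detections whose shape error is below the threshold can likewise be projected onto the fitted curve for stability.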
A first and a second transverse vector vxtrg and vytrg based on the position of specific target elements 12 are also identified within the same target 5. For example, the first transverse vector vxtrg corresponds to the vector joining the sphere centres of the target elements 12 of the outer circular ring, indicated by references T4 and T10 in the above-mentioned
In order to facilitate identifying the target elements 12 defining the transverse vectors vxtrg and vytrg in the two-dimensional images that are acquired from the image-capturing devices 6a and 6b, the target 5 can advantageously comprise one or more reference elements 14 that indicate the orientation, these also having a three-dimensional geometry and in particular a spherical shape, for example, with a smaller diameter than the target elements 12 (so as to be easily identifiable). In the example shown in above-mentioned
As an alternative, or in addition, to using reference elements 14, and again for the purpose of facilitating identification of the set of three orthogonal axes Xtrg, Ytrg and Ztrg associated with the target 5, a specially provided colour code associated with the target elements 12 (or other means of unambiguous identification of each of the target elements 12) could be used in the system 1. For example, the target elements 12 belonging to the outer circular ring could have mutually different colours (or different shades, tones or contrasts of colour) according to a predetermined code that enables identification of reciprocal positioning. By using the colour code shown by way of example in
In use, with particular reference to
In particular,
The operations carried out by the processing device 8 of the system 1 to determine the alignment of the wheels 2 of the vehicle 3 shall now be described, with reference to the flowchart in
In a first step, indicated by reference 20, the first and second image-capturing devices 6a and 6b take shots of their respective targets 5 and send the two-dimensional images acquired in the respective image reference systems SdRtel (containing in a known manner a set of pixels representing the captured images) to the processing device 8 via the respective interfaces 9a and 9b.
Then, in a successive step 21, the processing device 8 digitally processes the two-dimensional images of each target to identify the position of the target elements 12 considered significant, i.e. those that identify in a predetermined manner the set of three orthonormal axes associated with the target 5; in particular, the processing device 8 identifies the projections of the target vectors vxtrg, vytrg and vztrg on the image plane, henceforth respectively indicated as vxtrg_prj, vytrg_prj and vztrg_prj (and referred to as “projection vectors”).
In greater detail, after identifying the positions of the sphere centres of the target elements 12, the processing device 8 determines the position of the projection vectors vxtrg_prj, vytrg_prj and vztrg_prj in the acquired two-dimensional image (using the previously described criteria) and then determines the dimensions thereof (in terms of the number of pixels). In particular, for each of the above-mentioned projection vectors, the processing device 8 calculates the dimensions (Δxpix, Δypix)i in the image plane, expressed as the number of pixels along the transverse axes Xtel and Ytel of the same image plane (here, the i index indicates the relative projection vector chosen from vxtrg_prj, vytrg_prj or vztrg_prj). The dimensions of these projection vectors expressed in the chosen length measurement unit, in mm for example, will be subsequently indicated as (Δxmm, Δymm)i.
In particular, the real dimensions of the target vectors vxtrg, vytrg and vztrg are known in the same length measurement unit (as the geometrical dimensions of the target 5 are known by design); these real dimensions, expressed in mm for example, are henceforth indicated as Δxtrg, Δytrg and Δztrg.
In a successive step 22, the processing device 8 then determines the orientation of the targets 5 in the respective image reference system SdRtel, using the previously acquired information, and also determines the distance D between the centre of the target 5 and the image plane of the associated image-capturing device 6a and 6b, calculated along the optical axis ztel.
In detail, for each target 5, a rotation matrix MatRotTrg is defined that transforms the set of three vectors of known length, expressed in mm for example, in the target reference system SdRtrg of the target 5 into another set of three vectors of inferable length, this also expressed in mm for example, in the image reference system SdRtel of the image-capturing device 6a and 6b; in other words, the rotation matrix MatRotTrg imposes a rotation through which a vector identified on the real target, and expressed in mm for example, is projected onto a plane parallel to the image plane and passing through the centre of the target, its dimensions being determined in the same measurement unit.
By applying geometrical considerations, which shall be better understood by also referring to the diagram in
where α, β and γ, as shown in the above-mentioned
In greater detail, assuming that the set of three orthonormal axes X′″Y′″Z′″ coincides with the image reference system SdRtel, and that the set of three orthonormal axes XYZ coincides with the target reference system SdRtrg, the above-indicated rotation matrix MatRotTrg can be thought of as the combination of three successive rotations:
The above-mentioned first, second and third intermediate rotation matrices βRotMat, αRotMat and γRotMat are defined as follows:
The overall rotation that describes the rotation between the target reference system SdRtrg and the image reference system SdRtel is represented, as previously pointed out, by the rotation matrix MatRotTrg, which is obtained as the product of the above-mentioned intermediate rotation matrices βRotMat, αRotMat and γRotMat, multiplied together in the order indicated.
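The three intermediate matrices are not reproduced in the text above, so the following sketch assumes the conventional elementary rotations (β about y, α about x, γ about z) purely for illustration, and composes them in the stated order to obtain MatRotTrg.

```python
import numpy as np

# Elementary rotations; the axis assignment here is an assumption made for
# illustration, not a definition taken from the text.
def beta_rot_mat(b):   # rotation about the y axis
    c, s = np.cos(b), np.sin(b)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def alpha_rot_mat(a):  # rotation about the x axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def gamma_rot_mat(g):  # rotation about the z axis
    c, s = np.cos(g), np.sin(g)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def mat_rot_trg(alpha, beta, gamma):
    # Product of the intermediate matrices in the order indicated:
    # βRotMat · αRotMat · γRotMat
    return beta_rot_mat(beta) @ alpha_rot_mat(alpha) @ gamma_rot_mat(gamma)

M = mat_rot_trg(np.deg2rad(10.0), np.deg2rad(20.0), np.deg2rad(30.0))
```

Whatever the exact convention, the composed matrix is a proper rotation (orthogonal, determinant one), which is what the relations derived below rely on.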
By using further geometrical considerations, it is also possible to obtain a relation between the dimensions in pixels of the projection vectors (Δxpix, Δypix)i and the corresponding dimensions in the length measurement unit (Δxmm, Δymm)i, based on the characteristics of the image-capturing devices 6a and 6b.
In particular: pixIMGdx and pixIMGdy are defined as the total dimensions in pixels of the two-dimensional image captured by the image-capturing devices 6a and 6b respectively along the transverse axes Xtel and Ytel of the image plane; dfX and dfY are defined as the focal distances along the same transverse axes Xtel and Ytel, which establish a relation between the observation distance, expressed in mm, and the maximum observable dimension at that distance, again expressed in mm; lCCD and hCCD are defined as the dimensions in the length unit, expressed in mm in the example, of the sensor used by the image-capturing device 6a and 6b (a CCD—Charge Coupled Device in this embodiment) along the transverse axes Xtel and Ytel; L and H are defined as the maximum dimension visible from the image-capturing device 6a and 6b at distance D along the same transverse axes Xtel and Ytel.
It is then possible to demonstrate that the following relations are valid:
dfX=D·lCCD/(L+lCCD)
dfY=D·hCCD/(H+hCCD)
Furthermore, using the following relations:
L=pixIMGdx·Δxmm/Δxpix
H=pixIMGdy·Δymm/Δypix
gives:
dfX=D·lCCD/(pixIMGdx·Δxmm/Δxpix+lCCD)
dfY=D·hCCD/(pixIMGdy·Δymm/Δypix+hCCD)
The basic relations between the dimensions, in pixels and in the length measurement unit, of the projection vectors on the image plane are thus obtained:
Δxpix=Δxmm·pixIMGdx·dfX/(D·lCCD−dfX·lCCD)
Δypix=Δymm·pixIMGdy·dfY/(D·hCCD−dfY·hCCD)
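As a hedged numeric check of the two relations above, the sketch below evaluates Δxpix for placeholder camera constants (the values are invented, not those of any real device) and verifies consistency with the intermediate relations for L and dfX.

```python
def delta_x_pix(delta_x_mm, D, dfX, lCCD, pixIMGdx):
    """Δxpix = Δxmm · pixIMGdx · dfX / (D·lCCD − dfX·lCCD)"""
    return delta_x_mm * pixIMGdx * dfX / (D * lCCD - dfX * lCCD)

# Placeholder constants: sensor width 4.8 mm, 1280 px, focal distance 6 mm,
# target centre at D = 2000 mm from the image plane.
lCCD, pixIMGdx, dfX, D = 4.8, 1280.0, 6.0, 2000.0
delta_x_mm = 100.0                      # real size of a projection vector, mm
dx_pix = delta_x_pix(delta_x_mm, D, dfX, lCCD, pixIMGdx)

# Consistency with L = pixIMGdx·Δxmm/Δxpix and dfX = D·lCCD/(L + lCCD):
L = pixIMGdx * delta_x_mm / dx_pix
dfX_back = D * lCCD / (L + lCCD)
```

Substituting the recovered L back into the focal-distance relation returns the original dfX, confirming that the pixel/length relation is just the combination of the two intermediate ones.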
Therefore, by using the rotation matrix MatRotTrg, it is possible to identify, for each rotation of the target 5, the relations that link together: the dimensions in the length unit, mm in the example, of any known vector in the real model (Δxtrg, Δytrg, Δztrg); the dimensions in pixels of the related projection vectors on the image plane (Δxpix, Δypix); and the unknown quantities (angles of rotation α, β and γ and distance D) that characterize this rotation:
Δxpix=(Δxtrg·ax+Δytrg·bx+Δztrg·cx)·pixIMGdx·dfX/(D·lCCD−dfX·lCCD)
Δypix=(Δxtrg·ay+Δytrg·by+Δztrg·cy)·pixIMGdy·dfY/(D·hCCD−dfY·hCCD)
where ax,bx,cx and ay,by,cy, are the elements of the rotation matrix MatRotTrg, as previously defined.
To find the values of the four unknown quantities (α, β, γ and D), it is therefore sufficient to observe the behaviour of at least two vectors considered significant in order to obtain four relations (in particular, the two above-mentioned relations for each of the two significant vectors), giving a resolvable system of four equations in four unknown variables; for example, the vectors vxtrg and vytrg or, alternatively, any other pair of target vectors chosen from vxtrg, vytrg and vztrg whose dimensions are known in the real world (expressed in mm for example) can be considered for this purpose.
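A hedged numeric sketch of solving this four-equation system: synthetic "measurements" are generated from a known pose and the four unknowns are recovered by a Newton iteration with a numerical Jacobian. The camera constants, the choice of the pair vxtrg and vztrg, and the rotation convention (β about y, α about x, γ about z) are all illustrative assumptions.

```python
import numpy as np

# Placeholder camera constants and target vectors (invented for illustration).
lCCD = hCCD = 4.8
pixIMGdx = pixIMGdy = 1280.0
dfX = dfY = 6.0
vx_trg = np.array([100.0, 0.0, 0.0])   # vxtrg in the target reference system, mm
vz_trg = np.array([0.0, 0.0, 100.0])   # vztrg in the target reference system, mm

def mat_rot_trg(alpha, beta, gamma):
    # Assumed convention: β about y, α about x, γ about z, in the stated order.
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    Rx = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])
    Rz = np.array([[cg, -sg, 0], [sg, cg, 0], [0, 0, 1]])
    return Ry @ Rx @ Rz

def model(p):
    """The four pixel relations: (Δxpix, Δypix) of the two projection vectors."""
    alpha, beta, gamma, D = p
    R = mat_rot_trg(alpha, beta, gamma)
    kx = pixIMGdx * dfX / (D * lCCD - dfX * lCCD)
    ky = pixIMGdy * dfY / (D * hCCD - dfY * hCCD)
    wx, wz = R @ vx_trg, R @ vz_trg
    return np.array([wx[0] * kx, wx[1] * ky, wz[0] * kx, wz[1] * ky])

# Synthesize measurements from a known pose, then recover it.
p_true = np.array([np.deg2rad(5.0), np.deg2rad(-8.0), np.deg2rad(12.0), 2000.0])
meas = model(p_true)

p = np.array([0.0, 0.0, 0.0, 1500.0])    # initial guess
for _ in range(30):
    r = model(p) - meas
    J = np.empty((4, 4))
    for j in range(4):                   # forward-difference Jacobian
        step = 1e-6 * max(1.0, abs(p[j]))
        dp = np.zeros(4)
        dp[j] = step
        J[:, j] = (model(p + dp) - model(p)) / step
    p = p - np.linalg.solve(J, r)
```

In practice the measured pixel dimensions come from the image processing of step 21 rather than being synthesized, and a damped least-squares solver would typically replace the bare Newton step.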
The values of the four unknown variables completely define the rotation and translation function between the target reference system SdRtrg and the image reference system SdRtel; starting from these values, identified for each target 5 (and referring to the orientation of the associated wheel 2), it is possible to find (in a known manner) the values of the characteristic angles that define the set-up of the vehicle 3.
In particular, the processing device 8 is thus able to detect the orientation (in terms of angles of rotation α, β and γ) of each target 5, within the image reference system of the related image-capturing device 6a and 6b.
To detect the alignment of the wheels 2 of the vehicle 3 in a single reference system (for example the reference system identified on the same vehicle 3), in a step 23 successive to step 22 (referring again to
In order to dynamically determine the relation of reciprocal positioning and orientation between the image reference systems associated with the image-capturing devices 6a and 6b, the system 1 comprises, as previously pointed out, the coupling structure 10, which is, for example, composed of two optical devices 10a and 10b that are similar and each associated with one of the image-capturing devices 6a and 6b. Both optical devices 10a and 10b consist of an optical transmission stage, for example, equipped with one or more LEDs, and an optical receiver stage, for example, equipped with one or more linear CCDs, receiving the light radiation emitted by the optical transmission stage associated with the other device. Based on the position of the light sources in the image captured by each optical device, the processing device 8 continuously determines, in a manner which is known and is consequently not described in detail, the reciprocal position and orientation between the image-capturing devices 6a and 6b (in terms of reciprocal rotation and translation).
As an alternative, the system 1 may comprise a further image-capturing device, again indicated by reference numeral 10a, arranged in an integral manner with the first image-capturing device 6a, and a further target, again indicated by reference numeral 10b, arranged in an integral manner with the second image-capturing device 6b. The further image-capturing device 10a and target 10b can be made to operate, for example, in a manner similar to that previously described in relation to determining the orientation angles of the targets 5 connected to the wheels 2 of the vehicle 3.
Alternatively, to resolve the problem of determining the reciprocal position of the image-capturing devices 6a and 6b in a static manner, the processing device 8 could establish a relation between the related image reference systems during a specific adjustment step, where a gauge (or reference element) is simultaneously identified by both image-capturing devices 6a and 6b.
In any case, at the end of the above-mentioned step 23, the processing device 8 determines, in step 24, the results in terms of the characteristic set-up angles of the wheels 2 of the vehicle 3, for example convergence and camber angles, expressed in a single reference system, for example the reference system associated with the vehicle 3; the processing device 8 also controls the display of these results on a suitable display device, for example for presenting them to an operator.
In a manner not shown, the system 1 can be completed by including an associated illuminator device for each image-capturing device 6a and 6b that guarantees sufficient illumination on both sides of the vehicle 3, with respect to the longitudinal axis A, for the processing of acquired images; this illumination is opportunely reflected by the target elements 12, enabling their identification. In particular, the wavelength of the radiation used for this illumination operation can be defined according to the target chosen, as can its flashing frequency; for example, a visible or infrared light source could be used.
In one embodiment, as shown in
In the embodiment in
The use of a target 5 with a configuration comprising a concave spherical cap-shaped support structure 28 containing the target elements 12 therein allows identification of the set of three orthogonal vectors associated with the target over a wide angular range of observation (for example, between −30° and +30°).
In addition, as shown in
In any case, the presence of a further target element 12 arranged centrally with respect to the support structure 28 that identifies the geometric centre and the point of intersection of the various meridians is advantageous.
In particular, as schematically shown, the three-dimensional arrangement of the target elements 12 enables the identification of at least a first and a second meridian m1 and m2 (constituted by semi-elliptic curves intersecting at the centre of the support structure 28) substantially orthogonal to each other, along which the sphere centres of some of the target elements 12 are aligned (in this case, the target elements 12 identified in a suitable manner by an associated pair of reference elements 14). In addition, the meridians identified in the image can advantageously be more than two, for example, being six in number, angularly equispaced from one another at an angle of 30°; in this case, identification of the six meridians allows the identification of six corresponding angularly equispaced directions, thereby achieving an increase in measurement stability.
For example, as shown in
The processing operations on the two-dimensional images to identify the rotation of the target 5, and of the associated wheel 2, may envisage determining the inclination of the meridians (for example, measured in correspondence to the position of the central target element 12) to determine the direction of the vectors associated with the target 5, the rotation of which with respect to the reference system can be obtained with known techniques; the deformation of the meridians (and corresponding directions) as the angle of rotation of the wheel changes, and therefore that of the associated target 5 with respect to the reference system of the image-capturing device 6a and 6b, can also be analyzed.
The advantages of the system and the method for determining vehicle wheel alignment according to the invention are clear from the previous description.
In particular, it should again be underlined that the use of three-dimensional targets (formed by a three-dimensional arrangement of target elements that are themselves three-dimensional) enables the absolute position and orientation of each target (and of the wheel to which the target is coupled) to be determined with respect to a fixed reference system, in a precise and reliable manner, using a single image-capturing device. It is not necessary to move the vehicle or its wheels to vary the spatial position of the targets, to move the targets themselves, or to resort to a stereo acquisition system. In fact, it is easy to determine in space a set of three orthogonal axes associated with the target (by identifying reference target elements) and in this way to determine the spatial orientation of the target within a given reference system.
The described solution also allows increasing measurement resolution with respect to standard solutions, without, for example, requiring an increase in the size of the targets used.
In other words, three-dimensional information is advantageously and intrinsically associated with the target, through which it is possible to determine the spatial orientation starting from the processing of even just one two-dimensional image (transforming the two-dimensional information provided by the image-capturing device into three-dimensional information, thanks to the target's particular geometric structure).
Furthermore, thanks to the fact that the reciprocal arrangement of the target elements 12 is defined by a known three-dimensional geometric shape (expressed by means of an analytical expression), it is possible to identify the set of three orthogonal vectors even in the case where one or more of these same target elements are not visible, for example, due to the superimposition of multiple target elements on the image plane. This advantage derives in fact from the combined processing of the target elements 12, which are considered as belonging to the same known three-dimensional geometric figure.
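The combined processing of the target elements as points of a single known rigid figure can be sketched, as an assumption of this rewrite, with a least-squares rigid fit (the Kabsch algorithm) over only the visible subset of sphere centres; since the fit uses whichever correspondences are available, occluded or superimposed elements simply drop out of the sum. The function name is hypothetical.

```python
import numpy as np

def fit_pose_subset(model_pts, detected_pts):
    """Best-fit rotation R and translation t aligning the visible subset
    of model sphere centres (N x 3) to their detected positions (N x 3),
    so that detected ~= R @ model + t (Kabsch algorithm)."""
    mc = model_pts.mean(axis=0)
    dc = detected_pts.mean(axis=0)
    # Cross-covariance of the centred point sets.
    H = (model_pts - mc).T @ (detected_pts - dc)
    U, S, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dc - R @ mc
    return R, t
```

Because the target elements lie on a known analytical surface, the model coordinates of every sphere centre are available in advance, and any non-degenerate subset of three or more suffices.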
In particular, thanks to the use of spherically shaped three-dimensional target elements with isotropic characteristics, measurement accuracy remains unchanged as the inclination of the targets varies with respect to the image reference systems of the image-capturing devices 6a and 6b.
Finally, it is understood that changes and modifications may be made to what is described and illustrated herein without departing from the scope of the present invention, as defined in the attached claims.
In general, the target 5 may have a different three-dimensional shape. In any case, the target is shaped so as to allow the definition of vector quantities according to a known three-dimensional arrangement, in particular so as to allow the identification of a set of three orthogonal axes associated with it (for example, by the identification of significant points or planes on the same target), preferably under different visual angles (for example in an angle range between −30° and +30°). For example, the orthogonal vector vztrg may be determined through the identification of a significant point and plane of the target 5, such as the vector originating from this point and orthogonal to this plane. In particular, the configuration described for the targets 5 allows measurement resolution to be kept constant and maximised throughout the whole angle range considered.
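The determination of the orthogonal vector vztrg from a significant point and plane can be sketched as follows (the helper name and the choice of three points to define the plane are assumptions of this rewrite): the plane through three non-collinear significant points of the target yields, via a cross product, the unit vector orthogonal to it, originating at the chosen point.

```python
import numpy as np

def target_normal(p0, p1, p2):
    """Unit vector orthogonal to the plane through three non-collinear
    points of the target, anchored at p0 (a sketch of vztrg)."""
    n = np.cross(p1 - p0, p2 - p0)   # normal from two in-plane edges
    return n / np.linalg.norm(n)
```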
One or more of the targets 5 may also be replaced by targets of the active type, i.e. constituted by target elements that, instead of reflecting incident radiation, are capable of generating and emitting electromagnetic radiation, in the visible frequency range or, alternatively, in the infrared range.
In particular, as shown in
The target elements, indicated here with reference numeral 12′, of the outer ring are mechanically coupled to the circular rim 31, while the target elements 12′ of the inner ring are mechanically coupled to the base disc 30. Each target element 12′ comprises an emitter unit 32, constituted by a pair of LEDs for example, and associated control electronics 33 connected to a printed circuit board 34.
As is clear from examining
The advantage of an active solution with respect to the use of passive target elements 12 is that no illuminator device is needed. Such a device can be bothersome for the operator, even when it emits infrared radiation, and, since it must in any case illuminate the target from a certain distance, it entails higher electric power consumption.
The described system could also include a larger number of image-capturing devices, in particular more than two, arranged in equal numbers on the right-hand and left-hand sides with respect to the longitudinal axis A of the vehicle 3. Alternatively, it could be possible to use a single image-capturing device, capable of framing all the targets associated with the wheels 2 of the vehicle 3 whose orientation is to be determined.
Furthermore, as shown schematically in
It should be pointed out that the rest of the system, and the method used for determining the orientation of the three-dimensional targets in space, do not substantially differ from those previously illustrated, again providing the reconstruction of the three-dimensional characteristics of the target 5 starting from the two-dimensional images acquired by the image-capturing devices 6a and 6b.
In a substantially similar manner, not shown, the alignment determining system can also envisage the image-capturing devices 6a and 6b being mounted directly on the car lift 4, again without substantial differences regarding the method of measuring and using the information obtained from observation of the three-dimensional targets associated with the wheels 2 of the vehicle 3.
In general, it will be evident that the difference from the metrological standpoint linked to the different arrangement of the image-capturing devices 6a and 6b consists in the identification of the reference system with respect to which the measurements are returned; in particular, in the embodiment shown in
Finally, the described system and the method obviously also allow determining the spatial orientation of just a single wheel 2 of the vehicle 3, the image of which is taken by a single image-capturing device 6a or 6b.
Number | Date | Country | Kind
---|---|---|---
TO2010A0377 | May 2010 | IT | national

Filing Document | Filing Date | Country | Kind | 371c Date
---|---|---|---|---
PCT/IB2011/000955 | May 5, 2011 | WO | 00 | Dec 31, 2012

Publishing Document | Publishing Date | Country | Kind
---|---|---|---
WO2011/138662 | Nov 10, 2011 | WO | A
Number | Date | Country
---|---|---
20130194446 A1 | Aug 2013 | US