Determination procedure of the luminance of traffic signs and device for its embodiment

Information

  • Patent Grant
  • 10769458
  • Patent Number
    10,769,458
  • Date Filed
    Wednesday, April 4, 2018
  • Date Issued
    Tuesday, September 8, 2020
Abstract
The method of the invention comprises: obtaining a sequence of at least two images, with different levels of illumination; extracting the region containing the sign in the image; calculating the luminance values of the signs; and obtaining the difference in luminance of the sign corresponding to the two levels of illumination. The value obtained is the luminance of the sign (11) corresponding to an illumination equal to the difference between the illuminations, or additional illumination. This result is based on the additive property of luminance, according to which the luminance of a sign is the sum of the luminance produced by each source of illumination. A basic illumination device (5), an additional illumination device (7), at least one camera for taking images, and image recording, positioning and synchronism systems are required to implement the method.
Description

The present invention relates to a procedure and device permitting the luminance provided by a sign installed on the road to be calculated, isolating the existing ambient illumination conditions. The calculation of the luminance is carried out on the basis of analysing the images gathered by an image recording system, and is applied to all signs appearing in those images.


The present invention encompasses the calculation of the luminance both of signs consisting of back-reflecting material and of those that are internally lit, and of any traffic sign in general.


The invention comprises a set of lights and a procedure permitting the elimination of any effects which external illumination could have on the measurement, whether these come from other road users or from ambient illumination, along with a control system for the illumination provided by the vehicle. The illumination system used by the invention complies with existing standards for illumination systems fitted to vehicles: its intensity is not high, it does not disturb road users, and the tests can be conducted without interrupting the traffic.


The invention comprises a mobile system for the synchronized recording of images and data, referenced by means of a positioning system, together with an automatic system for sign identification and processing.


BACKGROUND OF THE INVENTION

The evaluation of the state of traffic signs installed on the road has normally been done by means of analysing two parameters:

    • The back-reflection coefficient: a property of back-reflecting materials that permits light to be returned in the same direction as the incident light.
    • The luminance: the quantity of light returned by the sign, as perceived by the eye.


The back-reflection coefficient is a parameter characteristic of the material, whose value falls off as the sign deteriorates. In order to evaluate it, systems have been developed such as those described in documents U.S. Pat. Nos. 7,173,707, 7,043,057 and 6,674,878, which describe procedures for measuring the multiple back-reflection values, or three-dimensional map of back-reflection coefficients, provided by the sign. This measurement is made via a system that is capable of providing illumination by means of a high intensity stroboscopic light and which measures the different values of light intensity. It then generates surfaces of back-reflection coefficients and simulates the back-reflection values along the road, recreating a virtual drive.


However, although the back-reflection coefficient or the three-dimensional map of back-reflection coefficients is a characteristic parameter of the sign, obtaining the luminance value (the parameter that really defines the visibility conditions) requires indirect methods in which theoretical models of the illumination are used. The stated documents therefore leave open the problem of how to make a direct measurement of the luminance without depending on intermediate operations based on unrealistic theoretical models.


Another problem that is raised by using the back-reflection coefficient lies in the fact that it is only applicable to back-reflecting signs, and signs that are internally lit have to be discarded from the analysis.


To make a direct calculation of the luminance (in units of cd/m2) there exist various instruments known as luminance meters. In order to measure this magnitude, these devices require certain specific conditions of stability and illumination, and the measurement they provide is of a single point. This equipment therefore displays serious drawbacks for making a simultaneous measurement, while moving, of all the signs lying within the analysis zone. The luminance measurements would have to be made one at a time, and in different zones of the sign, in order to then obtain an average of them.


Moreover, as the measurement requires a strictly controlled type of illumination, we would have to discard data that is influenced by the effects of moving traffic, external lighting, etc.


As a consequence, it is an objective of the present invention to provide a procedure and a device permitting the luminance of the sign to be determined:

    • Directly, making a direct measurement of the physical parameter.
    • Automatically, for all signs appearing in a scene.
    • While circulating with a vehicle.
    • On the basis of a known or standardized light source, independently of the existence of other external light sources at the moment of the measurement.
    • Independently of whether the sign is back-reflecting, or internally lit, or any other kind of traffic sign.
    • Independently of the level of back-reflection which this type of sign provides once installed on the road.


DESCRIPTION OF THE INVENTION

The present invention comprises a procedure and device for calculating the luminance provided by traffic signs arranged along the road, the measurement being based on certain light sources whose illumination conditions are known and controlled in terms of form and intensity. To achieve this, the images picked up by a mobile data gathering system are used. The procedure permits the interference produced by uncontrolled external illumination to be cancelled out, obtaining the luminance in the zones that are sought. The internal oscillations of the vehicle affecting its own illumination are also cancelled out. The invention permits simultaneous analysis of all the signs appearing in the images gathered by the moving vehicle, and is applied both to the back-reflecting type and to internally lit signs, or to any other kind of traffic sign.


The present invention provides a series of improvements over existing systems by permitting a direct calculation to be made of the luminance from a moving vehicle, without any need to use theoretical models of illumination. It isolates the measurement from the interferences produced by light sources other than those of the vehicle. It also allows measurements to be carried out safely, without any need to cut off the traffic, since it does not use high intensity lamps and the measurements can be made at the usual road speed.


The visibility of a vertical road sign is directly related to the luminance provided by that sign. Luminance is defined as the luminous intensity per unit of apparent surface area seen by the eye in a defined direction. Its unit is cd/m2.


With the procedure described in the present invention, a measurement is made of the luminance using at least one digital camera, irrespective of whether it is colour or black and white, duly calibrated, in such a way that the luminous energy in the form of photons received by each pixel of the camera sensor is reflected as the luminance of a section of material. The camera or cameras to be used can be digital, analogue or any other kind, with their corresponding accessories for carrying out those operations.


In order to obtain the luminance level of each pixel in the image starting from a colour or black and white camera, it is necessary to perform two transformations. First of all, in the case of colour cameras, each pixel's red, green and blue levels are combined with appropriate weightings, specified in standard CIE-121, in order to convert it to a grey level. Secondly, the grey level is transformed directly to luminance by means of a conversion equation obtained by comparison with a laboratory luminance meter.
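
As a minimal illustration of these two transformations, the sketch below uses the common ITU-R BT.601 luma weightings for the colour-to-grey step and assumes a linear calibration equation against the laboratory luminance meter; the coefficient values are hypothetical placeholders, not values fixed by the invention.

    import numpy as np

    # Hypothetical calibration coefficients, obtained offline by comparing
    # camera grey levels of reference targets with a laboratory luminance
    # meter (a linear fit L = a * grey + b is assumed here).
    A_CAL, B_CAL = 0.85, 2.0

    def pixel_luminance(rgb_image):
        """Convert an RGB image (H x W x 3 array) to per-pixel luminance in cd/m2."""
        # Transformation 1: colour -> grey level (weighted sum of R, G and B).
        weights = np.array([0.299, 0.587, 0.114])  # BT.601 weightings, by way of example
        grey = rgb_image @ weights
        # Transformation 2: grey level -> luminance via the calibration equation.
        return A_CAL * grey + B_CAL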


With the method described, by using at least one digital camera it is possible to cover several signs simultaneously if they appear in the image. The images are processed in such a way that each sign is automatically identified and the luminance values are obtained for each of them.


The light intensity reaching the sign, whether from the vehicle itself or from other sources in the environs, has a direct influence on the measurement of the luminance of back-reflecting materials. For a correct measurement, an illumination has been devised that eliminates the variability owing to the internal oscillations of the vehicle (of voltage, for example).


Moreover, in order to cancel out the perturbations produced by external light sources, a technique for calculating the luminance is proposed, based on differencing the values obtained at two or more illumination levels.


In some types of vertical signs, the luminance is provided by an internally generated illumination. On the other hand, in others it depends on the light provided by an external illumination and which is returned by the vertical sign, as occurs in the case of back-reflecting materials. The procedure and device proposed here are applicable to any traffic sign.


The procedure of the present invention for calculating the luminance comprises the following stages:

    • Obtaining a sequence of images composed of at least two images with different illumination levels.
    • Extracting the region where the sign is on the image.
    • Calculating the luminance values of the signs.
    • Obtaining the luminance difference of the sign corresponding to the two illumination levels.


The value obtained is the luminance of the sign corresponding to an illumination equal to the difference in the illuminations, or additional illumination. This result is based on the additive property of luminance, according to which the luminance calculated for each sign is the sum of the luminances produced by each light source.


In one of the images, the luminance is the sum of that due to the base illumination La, that due to the additional illumination Lb and the uncontrolled ambient illumination Lc (other vehicles, public lighting, etc). So the measured luminance is

L1=La+Lb+Lc.


Analogously, for the other image, the luminance will be due to the base illumination La and the ambient illumination Ld. The measured luminance will be:

L2=La+Ld.


As the synchronism system ensures that the difference in ambient illumination between the two images to be processed is minimal, we can assume that:

Lc=Ld.


If we subtract one from the other, we obtain the value of the differential luminance corresponding to the additional illumination:

ΔL=L1−L2=Lb.


In this way a luminance value is obtained that is based on lamps whose illumination conditions are known and controlled in terms of form and intensity, and which is independent of other uncontrolled external light sources.
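
A minimal sketch of this differential calculation, assuming per-pixel luminance images (such as those produced by the camera transformations described above) and a binary mask marking the pixels of the sign; the names are illustrative:

    import numpy as np

    def differential_luminance(lum_img_1, lum_img_2, sign_mask):
        """Mean differential luminance of a sign between two synchronized frames.

        lum_img_1: luminance image taken with base + additional illumination (L1).
        lum_img_2: luminance image taken with base illumination only (L2).
        sign_mask: boolean array marking the pixels belonging to the sign.
        """
        # The per-pixel difference cancels the base and ambient contributions:
        # L1 - L2 = (La + Lb + Lc) - (La + Ld) = Lb, given Lc = Ld.
        delta = lum_img_1[sign_mask] - lum_img_2[sign_mask]
        return float(delta.mean())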


In order to carry out the procedure of the invention, a device is used which, fitted to a vehicle, comprises:

    • At least one camera for capturing the images.
    • A base illumination device.
    • An additional illumination device.
    • A positioning system.
    • A system for image and data recording and treatment.
    • A synchronism system.


Among the advantages presented by the invention we can point out:

    • It is not based on point source measurements, like those that can be provided by back-reflectometers or luminance meters.
    • The analysis is carried out automatically for all the signs appearing in a scene.
    • The measurements are taken by means of a moving vehicle.
    • The measurement is made for a light source whose photometry is known and its level of light intensity is controlled.
    • The measurement is not affected by the existence of other external light sources, ambient or similar.
    • It permits an analysis of the sign's state of visibility, independently of the level of back-reflection that the sign provides once installed on the road.
    • It functions independently of whether the sign is back-reflecting or internally lit, or any other kind.





BRIEF DESCRIPTION OF THE DRAWINGS

To complement the foregoing description, and with the aim of aiding a better understanding of the characteristics of the invention, a detailed description of a preferred embodiment is given below, on the basis of a set of drawings accompanying this descriptive specification in which, on an illustrative rather than limiting basis, the following has been represented.



FIG. 1 shows a vehicle fitted with the device of the invention, approaching a traffic sign.



FIG. 2 shows the front of the above vehicle with the additional and base illumination devices.



FIG. 3 shows the vehicle with the base illumination and additional illumination switched on.



FIG. 4 shows the vehicle with just the base illumination switched on.



FIG. 5 shows, schematically in time, the determination of the luminance of two traffic signs.





In the above figures, the numerical references correspond to the following parts and elements.


1. Driver.


2. Sign panel.


3. Distance between the driver and the sign.


4. Normal to the sign panel 2.


5. Base illumination.


6. Stabilizer for the base illumination.


7. Additional Illumination.


8. Adjustment and control of the additional illumination.


9. Light intensity sensor.


10. Public lighting.


11. Vertical traffic sign.


DETAILED DESCRIPTION OF A PREFERRED EMBODIMENT

As can be seen in FIG. 1, a driver (1), as he goes along the road, perceives the information shown in the different signing elements. For a sign panel (2) lying within the angle of vision of the driver (1), at a distance (3) and at a defined instant, an angle of entrance β is defined, formed between the light beam incident on the luminous surface at a point and the normal (4) to that surface at that point, and also an angle of observation α, formed by the light beam incident on the surface and the reflected light reaching the eyes of the driver (1). As the driver approaches the sign these angles change and the luminance perceived by the observer varies.


The luminance provided by a sign, especially if it consists of a back-reflecting material, depends on various factors:

    • Contributed light. In the case of internally lit signs, the light returned is independent of the contributed light, but for back-reflecting materials, the greater the amount of light contributed to the sign, the more light is reflected.
    • Angle of entrance β and of observation α.
    • Special properties of the material, in particular the variation in back-reflection with the viewing angle with respect to the normal.
    • The rotation of the sign about its axes (angles of twist, roll and dip).


In order to calculate the distance of the vehicle from the sign, a technique based on the analysis of the images will be used with a single camera, though it can be extended to an array of cameras, or specific techniques of triangulation or stereovision can be employed. The procedure used in the present embodiment starts from the analysis of the image by known techniques of artificial vision, which permit the position of an object of interest to be determined on the basis of the size with which that object appears in the image. Applied to different images of the same object, whose cadence is known thanks to the synchronism system, it is possible to establish a direct relation between the real size of the sign, the size of the sign in the image and the distance to the sign. Taking samples of the sizes of the object in the image, as well as the relative distances of the different images to one chosen as reference, it is possible to generate a set of geometric transformation equations from which the distance that is sought and the size of the object can be determined.
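
By way of illustration, the following is a minimal sketch of this relation for two images of the same sign, assuming a simple pinhole camera model (apparent size = focal length × real size / distance) and taking the distance travelled between the two captures from the odometer; the names and the model itself are assumptions for illustration, not details fixed by the invention.

    def distance_and_size(h_far, h_near, travelled, focal_px):
        """Estimate the distance to the sign at the nearer position, and its real size.

        h_far:     apparent height of the sign, in pixels, in the earlier (farther) image.
        h_near:    apparent height, in pixels, in the later image, taken 'travelled' metres closer.
        focal_px:  camera focal length expressed in pixels (assumed known from calibration).
        """
        # Pinhole model: d_far * h_far = d_near * h_near = focal_px * real_size,
        # with d_far = d_near + travelled. Solving for d_near:
        d_near = travelled * h_far / (h_near - h_far)
        real_size = d_near * h_near / focal_px
        return d_near, real_size

For example, if the sign grows from 40 to 50 pixels while the vehicle advances 10 m, the sign is 10 × 40/(50 − 40) = 40 m away at the second capture.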


In order to carry out the procedure of the invention for determination of the luminance, a device is used which, fitted to a vehicle, comprises at least:

    • One camera for capturing the images.
    • A base illumination device.
    • An additional illumination device.
    • A positioning system.
    • A system for image and data recording and treatment.
    • A synchronism system.


      Cameras


The aim of the present invention is to study the luminance perceived by the driver. For this reason, at least one of the cameras is located in a suitable position on the vehicle, close to the eye of the driver.


Digital cameras permit the colour coordinates of objects to be represented in an image defined by a set of pixels.


The representation of the colour is normally done by means of the coordinates R, G, B (Red, Green, Blue), though there exist other systems; for example, in 1931 the International Commission on Illumination established the XYZ system, or CIE 1931. In it, the Y component represents the energy or luminance of the pixel, and X and Z the colour. For standard camera systems (PAL, NTSC) many different combinations have been defined for transforming the RGB system to XYZ, as in the case of standard ITU-R BT.601. In this way, in order to obtain the luminance of an object, it suffices to carry out the transformation between the systems and determine the value of Y.
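
As a minimal illustration, assuming linear RGB values with Rec. 709/sRGB primaries and a D65 white point (the exact matrix depends on the standard adopted), the transformation and the extraction of Y look as follows:

    import numpy as np

    # RGB -> XYZ matrix for linear sRGB / Rec. 709 primaries, D65 white point
    # (one of the standard combinations; PAL/NTSC systems use other matrices).
    RGB_TO_XYZ = np.array([
        [0.4124, 0.3576, 0.1805],
        [0.2126, 0.7152, 0.0722],
        [0.0193, 0.1192, 0.9505],
    ])

    def luminance_Y(rgb):
        """Return the Y (luminance) component of a linear RGB triplet."""
        x, y, z = RGB_TO_XYZ @ np.asarray(rgb, dtype=float)
        return y

    print(luminance_Y([1.0, 1.0, 1.0]))  # white gives Y close to 1.0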


In the system used in this embodiment, and depending on the camera, a conversion will be applied like that described above (standard) or a non-standard transformation that comprises the following steps (a sketch of the final step is given after the list):

    • A standard illuminant defined by the CIE is selected.
    • A white balance is carried out (see document “Digital Color Imaging Handbook”, Gaurav Sharma, ed. CRC PRESS, Boca Raton, New York, Washington, D.C.).
    • Using a colour card according to the CIE 1931 standard, obtain the colour coordinates of pure red, green and blue with the camera.
    • Using a colour card, obtain the XYZ colour coordinates of pure red, green and blue with a colorimeter as per standard CIE 1931.
    • Obtain the conversion coefficients for colour to grey level starting from both measurements.
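
The final step can be illustrated with the short sketch below, which assumes that the colour-to-grey relation is linear (Y = wr·R + wg·G + wb·B) and uses made-up measurements of the pure red, green and blue patches; the numbers are placeholders for the values actually measured with the camera and the colorimeter.

    import numpy as np

    # Placeholder measurements for the pure red, green and blue patches.
    camera_rgb = np.array([        # camera coordinates of each patch
        [230.0, 12.0, 10.0],
        [15.0, 225.0, 18.0],
        [11.0, 14.0, 228.0],
    ])
    colorimeter_Y = np.array([48.0, 142.0, 16.0])  # Y measured with the colorimeter

    # Solve the linear system for the conversion coefficients (wr, wg, wb).
    weights, *_ = np.linalg.lstsq(camera_rgb, colorimeter_Y, rcond=None)
    print("conversion coefficients (R, G, B):", weights)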


The vehicle has two illumination systems, one which corresponds to the base illumination and the other which corresponds to the additional illumination. These two systems have to be adapted in order to permit a reliable measurement. See FIG. 2.


Base Illumination Device


In a preferred embodiment the base illumination (5) consists of the dipped lights of the vehicle, powered by the battery, whose voltage oscillates depending on its state of charge, its degree of aging, etc. Given that the illumination provided by these lights is directly related to the battery voltage, a stabilizer (6) needs to be inserted to compensate for the oscillations in that voltage.


Additional Illumination Device


In a preferred embodiment, the additional illumination (7) consists of the lamps external to the vehicle, which have to meet the following requisites:

    • They must provide a constant level of illumination during the data gathering, independently of oscillations in the supply voltage, of the temperature of the bulbs of the external lamps, and of the wear undergone by those lamps with time.
    • They have to provide a known level of illumination at each moment.
    • The level of illumination has to be adjustable for each specific period of data gathering.


On account of all this, an adjustment and control system (8) is needed for the additional illumination, operating in a closed loop that can take on these responsibilities. Basically, the functioning of this system can be summarized as follows (a sketch is given after the list):

    • It measures the illumination by means of a sensor (9).
    • It compares it with a desired level C, modifiable by the operator.
    • It acts accordingly on the various control elements.
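
A minimal sketch of one iteration of such a loop, assuming a simple proportional controller; the gain and the interfaces to the sensor (9) and the control elements are hypothetical stand-ins:

    KP = 0.05  # proportional gain (assumed value)

    def control_step(measured_level, desired_level_C, drive_signal):
        """One iteration of the closed loop: compare the sensor (9) reading
        with the desired level C and correct the lamp drive signal."""
        error = desired_level_C - measured_level
        # Act on the control element in proportion to the error, keeping the
        # drive signal within its physical range (0 = off, 1 = full power).
        return min(max(drive_signal + KP * error, 0.0), 1.0)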


The map of intensity of the light provided by the additional lamps is known and controlled. Their light intensity is adjusted in such a way as to avoid disturbing other circulating vehicles, preventing the dangerous situations that can arise with the high intensity systems of the patents discussed above.


The additional lamps have to have a short response time. This can be achieved by means of lamps that are sufficiently fast (for example, based on LEDs) or with mechanical shutter systems that permit a sudden and controlled variation in the illumination that is provided.


The level of intensity of the illumination provided by the lamps has to be controlled and its geometric distribution has to be known.


The system described in the present embodiment permits data gathering to be done in movement along the road. In order to avoid disturbing other users travelling on the same road, it is necessary to follow the instructions regarding the design of lamps and maximum illumination levels for each emitter-observer angle that can be admissible, such as for example those defined in Regulation No 8 of the United Nations Economic Commission for Europe.


Image and Data Recording and Treatment System


The system comprises:

    • A digital storage device.
    • A device that applies transformation equations enabling the luminance of each pixel to be obtained from its characteristic values (grey scale).


      Positioning System


The system comprises:

    • A geo-referenced positioning device (GPS) and its corresponding antenna.
    • An information recording device.


      Synchronism System


The system comprises a local positioning device (via the vehicle's milometer) which provides the distance traveled, and a device that generates certain events as a function of that distance. The synchronism system allows adjustment of the distance traveled before the images are captured and of the delay time between them.


The synchronism system has to be such that it permits adjustment of the on and off times of the lamps with sufficient precision to eliminate the subcyclic effects of public lighting. If f is the frequency of the grid (normally 50 Hz in Europe), the subcyclic effects have a duration of 1/(2*n*f), n=1 . . . N. On the other hand, given that the described device is capable of capturing images while moving, the duration of the time for capturing each image will be sufficiently small so as not to alter the size of the objects appearing in them, and it will also be a multiple of the subcyclic periods. The interval of time between capturing two successive images of the same sign with variation in the illumination will be:

Δt=p/(2*n*f) with n=1 . . . N and p=1 . . . P.


In this way, as “n” and “p” are whole numbers, the moment of capturing each image will occur at the same point of the wave of the supply voltage from the electric grid, thus minimizing the differences in the uncontrolled ambient illumination.
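
As a quick numeric check of this rule, the sketch below lists a few admissible capture intervals for a 50 Hz grid; the chosen values of n and p are illustrative.

    # Any interval dt = p / (2 * n * f) places both captures at the same point
    # of the supply-voltage wave, so the ambient contribution is comparable.
    F_GRID = 50.0  # grid frequency in Hz (Europe)

    for n in (1, 2):
        for p in (1, 2, 3):
            dt = p / (2 * n * F_GRID)
            print("n=%d, p=%d: interval = %.1f ms" % (n, p, dt * 1000))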


The general operating procedure is as follows. A vehicle fitted with the devices and systems described above can circulate on the road and record different sequences of images of various signs situated along that road. With a view to a better understanding vertical signs will, for example, be used, though the procedure of the present invention is valid for any kind of sign that is the object of study.


As the vehicle approaches the sign (11), the sign will be seen to shift in the image and its size will increase until it disappears from the field visible to the camera.


The method that allows the effects of external illumination, such as public lighting (10) for example, to be cancelled out consists of capturing images with different illumination levels, by a combination of base illumination (5) and the additional illumination (7) of the vehicle, obtaining the value of the differential luminance, which cancels out the effect of the ambient illumination as well as the base illumination of the vehicle.


As can be seen in FIG. 3, first of all the images are captured with the base illumination (5) and the additional illumination (7) switched on. Then, with an interval of time determined by the synchronism device, the additional illumination (7) is switched off or hidden in order to obtain the second image as can be seen in FIG. 4.


The synchronism system establishes the instants of time in which the two images are captured corresponding to two illumination levels L1 and L2. See FIG. 5.


Finally, the location data is stored, along with the time of the synchronism system.


Once the data has been downloaded, an automatic computer programme processes the images in which the vertical signs (11) appear and extracts the region corresponding to each sign. Another automatic procedure calculates the luminance level on the basis of a grey scale. Finally, the differential luminance is obtained from L1 and L2.


The relation with the positioning system allows the relative position of the image and the sign on the road to be calculated.


It will be evident to an expert in the subject that, without departing from the essence of the invention, there is a series of modifications and variants that can be made allowing it to be adapted to the desired operating conditions. So, the additional illumination system has been described as an external light incorporated into the vehicle, but it could be included among the vehicle's own lamps with suitable control. No details have been given of those processes and devices which, being conventional, will be more than familiar to the expert in the subject.

Claims
  • 1. A luminance detection system connectable to a vehicle and operable while the vehicle is moving along a roadway, the luminance detection system comprising: a light source configured to illuminate a road marking positioned along the roadway, the light source being separate and distinct from a pair of headlights of the vehicle; a first camera configured to obtain first image data, the first image data including the road marking, the first image data having first color characteristics; a second camera configured to obtain second image data, the second image data including the road marking also included in the first image data, the second image data having second color characteristics, wherein the first color characteristics are separate and distinct from the second color characteristics; a processor configured to: calculate a distance between the vehicle and the road marking based on the first image data and the second image data; apply one or more luminance transforms to convert one or more pixel values of the first image data and the second image data to obtain corresponding luminance levels representative of at least a portion of the road marking; after applying the one or more luminance transforms, compare the first image data and the second image data to minimize ambient light information; and calculate a luminance value representative of a luminance of the road marking based on (i) the distance between the vehicle and the road marking, (ii) the luminance levels representative of the road marking, and (iii) the comparison of the first image data, after applying the luminance transform, and the second image data to minimize ambient light information.
  • 2. The luminance detection system of claim 1, wherein at least one of the first camera and the second camera is a color camera.
  • 3. The luminance detection system of claim 1, wherein the processor is configured to: extract pixel values representative of the road marking from the first image data and the second image data.
  • 4. The luminance detection system of claim 1, further comprising a geo-referenced positioning device configured to generate vehicle position data representative of a position of the vehicle along the roadway, wherein the processor is configured to calculate the distance between the vehicle and the road marking based on the vehicle position data.
  • 5. The luminance detection system of claim 1, wherein the light source is positioned at the front of the vehicle.
  • 6. The luminance detection system of claim 1, wherein the first camera and the second camera are positioned at the front of the vehicle.
  • 7. The luminance detection system of claim 1, further comprising: a light illumination adjustment sub-system including: a light source external to the vehicle and separate and distinct from dipped headlights of the vehicle, a sensor configured to measure an amount of light from the light source, and a controller configured to adjust the amount of light emitted by the light source as a function of the amount of light measured by the sensor.
Priority Claims (1)
Number Date Country Kind
200800371 Feb 2008 ES national
CROSS REFERENCE TO APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 15/394,939, filed Dec. 30, 2016, which is a continuation of U.S. patent application Ser. No. 14/865,882, filed Sep. 25, 2015, which is a continuation of U.S. patent application Ser. No. 14/265,747, filed Apr. 30, 2014, which is a continuation of U.S. patent application Ser. No. 12/866,888, filed Aug. 9, 2010, which is a U.S. National Stage Patent Application of International Application No. PCT/ES2008/000214, filed Apr. 8, 2008, which claims priority to Spanish Application No. P200800371, filed Feb. 12, 2008, all of which are incorporated herein by reference in their entireties.

US Referenced Citations (169)
Number Name Date Kind
3636250 Haeff Jan 1972 A
4348652 Barnes et al. Sep 1982 A
4373819 Pallotta Feb 1983 A
4491923 Look Jan 1985 A
4553848 Bernd et al. Nov 1985 A
4721389 Dejaiffe Jan 1988 A
4726134 Woltman Feb 1988 A
4920385 Clarke et al. Apr 1990 A
5050327 Woltman Sep 1991 A
5051906 Evans, Jr. et al. Sep 1991 A
5164785 Hopkins et al. Nov 1992 A
5373357 Hopkins et al. Dec 1994 A
5392365 Steinkirchner Feb 1995 A
5448484 Bullock et al. Sep 1995 A
5465115 Conrad et al. Nov 1995 A
5465308 Hutcheson et al. Nov 1995 A
5530549 Brown Jun 1996 A
5533388 Yamamoto et al. Jul 1996 A
5579471 Barber et al. Nov 1996 A
5627915 Rosser et al. May 1997 A
5633944 Guibert et al. May 1997 A
5633946 Lachinski et al. May 1997 A
5643476 Garmire et al. Jul 1997 A
5647058 Agrawal et al. Jul 1997 A
5696503 Nasburg Dec 1997 A
5699444 Palm Dec 1997 A
5740274 Ono et al. Apr 1998 A
5757878 Dobbs et al. May 1998 A
5764411 Shanks Jun 1998 A
5784196 Sola Jul 1998 A
5790691 Narayanswamy et al. Aug 1998 A
5793034 Wesolowicz et al. Aug 1998 A
5802361 Wang et al. Sep 1998 A
5812267 Everett, Jr. et al. Sep 1998 A
5818640 Watanabe et al. Oct 1998 A
5844699 Usami et al. Dec 1998 A
5852823 De Bonet Dec 1998 A
5864630 Cosatto et al. Jan 1999 A
5892847 Johnson Apr 1999 A
5893095 Jain et al. Apr 1999 A
5911139 Jain et al. Jun 1999 A
5915032 Look Jun 1999 A
5938319 Hege Aug 1999 A
5941944 Messerly Aug 1999 A
5949914 Yuen Sep 1999 A
5950190 Yeager et al. Sep 1999 A
5974521 Akerib Oct 1999 A
5983237 Jain et al. Nov 1999 A
5991085 Rallison et al. Nov 1999 A
6011515 Radcliffe et al. Jan 2000 A
6018697 Morimoto et al. Jan 2000 A
6023967 Chung et al. Feb 2000 A
6036322 Nilsen et al. Mar 2000 A
6048069 Nagaoka et al. Apr 2000 A
6064768 Hajj et al. May 2000 A
6084595 Bach et al. Jul 2000 A
6120879 Szczech et al. Sep 2000 A
6123263 Feng Sep 2000 A
6134819 McClain et al. Oct 2000 A
6141433 Moed et al. Oct 2000 A
6141434 Christian et al. Oct 2000 A
6142871 Inoue Nov 2000 A
6166813 Roberts Dec 2000 A
6173231 Chojnacki Jan 2001 B1
6208386 Wilf et al. Mar 2001 B1
6212480 Dunne Apr 2001 B1
6226636 Abdel-Mottaleb et al. May 2001 B1
6240424 Hirata May 2001 B1
6240664 Hjaltason May 2001 B1
6253477 Balint Jul 2001 B1
6266442 Laumeyer et al. Jul 2001 B1
6271840 Finseth et al. Aug 2001 B1
6292227 Wilf et al. Sep 2001 B1
6363161 Laumeyer et al. Mar 2002 B2
6358751 Benichou et al. May 2002 B1
6382126 Findley May 2002 B1
6389417 Shin et al. May 2002 B1
6389424 Kim et al. May 2002 B1
6405132 Breed et al. Jun 2002 B1
6407674 Gallagher Jun 2002 B1
6411215 Shnier Jun 2002 B1
6411953 Ganapathy et al. Jun 2002 B1
6424914 Lin Jul 2002 B1
6438130 Kagan et al. Aug 2002 B1
6449384 Laumeyer et al. Sep 2002 B2
6453056 Laumeyer et al. Sep 2002 B2
6463432 Murakawa Oct 2002 B1
6476910 Hermes Nov 2002 B1
6502105 Yan et al. Dec 2002 B1
6507441 Eisenberg et al. Jan 2003 B1
6514597 Strobel et al. Feb 2003 B1
6526352 Breed et al. Feb 2003 B1
6538751 Ono Mar 2003 B2
6558021 Wu et al. May 2003 B2
6563959 Troyanker May 2003 B1
6566710 Strachan et al. May 2003 B1
6567103 Chaudhry May 2003 B1
6567551 Shiiyama May 2003 B2
6574378 Lim Jun 2003 B1
6574616 Saghir Jun 2003 B1
6575378 Aoki et al. Jun 2003 B2
6584221 Moghaddam et al. Jun 2003 B1
6594931 Barton et al. Jul 2003 B1
6611628 Sekiguchi et al. Aug 2003 B1
6625315 Laumeyer et al. Sep 2003 B2
6653990 Lestruhaut Nov 2003 B1
6674878 Retterath et al. Jan 2004 B2
6678590 Burchfiel Jan 2004 B1
6711280 Stafsudd et al. Mar 2004 B2
6772062 Lasky et al. Aug 2004 B2
6778697 Shin et al. Aug 2004 B1
6810135 Berenz et al. Oct 2004 B1
6885767 Howell Apr 2005 B1
6888622 Shimomura May 2005 B2
6891960 Retterath et al. May 2005 B2
7043057 Retterath et al. May 2006 B2
7082426 Musgrove et al. Jul 2006 B2
7092548 Laumeyer et al. Aug 2006 B2
7173707 Retterath et al. Feb 2007 B2
7298487 Hansen et al. Nov 2007 B2
7409110 Kayahara Aug 2008 B2
7411681 Retterath et al. Aug 2008 B2
7444003 Laumeyer et al. Oct 2008 B2
7515736 Retterath et al. Apr 2009 B2
7590310 Retterath et al. Sep 2009 B2
7995796 Retterath et al. Aug 2011 B2
8660311 Retterath et al. Feb 2014 B2
8860944 Retterath et al. Oct 2014 B2
9335255 Retterath et al. May 2016 B2
20010021011 Ono Sep 2001 A1
20010036293 Laumeyer et al. Nov 2001 A1
20010043717 Laumeyer et al. Nov 2001 A1
20010043718 Laumeyer et al. Nov 2001 A1
20010045034 Mueller et al. Nov 2001 A1
20020044278 Le Apr 2002 A1
20020045455 Spratt Apr 2002 A1
20020063638 Gallagher May 2002 A1
20020090492 Haunschild et al. Jul 2002 A1
20020106109 Retterath et al. Aug 2002 A1
20020163942 Baillargeon et al. Nov 2002 A1
20020176605 Stafsudd Nov 2002 A1
20020186865 Retterath et al. Dec 2002 A1
20030016869 Laumeyer et al. Jan 2003 A1
20030090415 Miyasaka et al. May 2003 A1
20030174054 Shimomura Sep 2003 A1
20040062442 Laumeyer et al. Apr 2004 A1
20040127614 Jiang et al. Jul 2004 A1
20040156531 Retterath et al. Aug 2004 A1
20040218910 Chang et al. Nov 2004 A1
20050021472 Gettman et al. Jan 2005 A1
20050146725 Hansen et al. Jul 2005 A1
20050249378 Retterath et al. Nov 2005 A1
20050271304 Retterath et al. Dec 2005 A1
20060238714 Fox et al. Oct 2006 A1
20060262312 Retterath et al. Nov 2006 A1
20070043707 Kulkarni Feb 2007 A1
20070081744 Gokturk et al. Apr 2007 A1
20070154067 Laumeyer et al. Jul 2007 A1
20070216904 Retterath et al. Sep 2007 A1
20070262765 Joos et al. Nov 2007 A1
20090078870 Haruna Mar 2009 A1
20090252376 Retterath et al. Oct 2009 A1
20100082597 Retterath et al. Apr 2010 A1
20100316252 Roman et al. Dec 2010 A1
20120065940 Retterath et al. Mar 2012 A1
20130271613 Retterath et al. Oct 2013 A1
20140036269 Retterath et al. Feb 2014 A1
20150153271 Retterath et al. Jun 2015 A1
20160377529 Retterath et al. Dec 2016 A1
Foreign Referenced Citations (10)
Number Date Country
1486799 Dec 2004 EP
1580074 Sep 2005 EP
1976296 Jan 2008 EP
1486799 Dec 2004 FI
2661248 Oct 1991 FR
2000149195 May 2000 JP
2000353292 Dec 2000 JP
2008015970 Jan 2008 JP
242637 Nov 2005 TW
2007083741 Jul 2007 WO
Non-Patent Literature Citations (61)
Entry
International Patent Application No. PCT/ES2008/000214, International Search Report, dated Oct. 28, 2008, 8 pages.
International Patent Application No. PCT/ES2008/000214, Written Opinion of the International Searching Authority, dated Oct. 28, 2008, 17 pages.
European Patent Application No. 08761461, Response to Communication Pursuant to 161 (2) and 162 EP, dated Oct. 19, 2010, 6 pages.
European Patent Application No. 08761461, Supplementary European Search Report and Opinion, dated May 23, 2011, 7 pages.
European Patent Application No. 08761461, Response to European Search Report and Opinion, dated Oct. 20, 2011, 13 pages.
European Patent Application No. 08761461, Communication Pursuant to Article 94(3) EPC, dated Dec. 18, 2012, 6 pages.
European Patent Application No. 08761461, Response to Communication Pursuant to Article 94(3) EPC, dated Mar. 14, 2013, 14 pages.
European Patent Application No. 08761461, Corrected Form 1703, dated Oct. 11, 2011, 6 pages.
European Patent Application No. 08761461, Supplementary European Search Report and Written Opinion dated Jun. 1, 2011, 7 pages.
Hak et al., “Dominant Color Transform and Circular Pattern Vector for Traffic Sign Detection and Recognition”, IEICE Transaction Fundamentals, Jun. 1998, vol. E81-A, No. 6, pp. 1128-1135.
Papageorgiou, et al., A Trainable Pedestrian Detection System, Center for Biological and Computational Learning and Artificial Intelligence Laboratory, MIT, IEEE International Conference on Intelligent Vehicles, 1998, pp. 241-246.
Kalinke et al., “A Texture-based Object Detection and an Adaptive Model-based Classification”, Institut fur Neuroinformatik, Bochum, Germany, IEEE International Conference on Intelligent Vehicles, 1998, pp. 143-148.
Risack, et al., “Robust Lane Recognition Embedded in a Real-Time Driver Assistance System”, Fraunhofer-Institut fur Informations, Karlsruhe, Germany, IEEE International Conference on Intelligent Vehicles, 1998, pp. 35-40.
“The Road Sign Recognition System—RS2”, Faculty of Transportation Sciences, Prague, Czech Republic, 1999, 7 pgs.
“The Chamfer System”, Internet Printout, 4 pgs., c. approximately 1999.
You et al., Real-Time Object Recognition: Hierarchical Image Matching in a Parallel Virtual Machine Environment, School of Computing and Information Technology, Griffith University, Brisbane.
Yanai, et al., “An Architecture of Object Recognition System for Various Images Based on Multi-Agent”, Dept. of Computer Science, University of Electro-Communications, Tokyo, Japan, and Dept. of Mathematical Engineering and Information Physics, University of Tokyo, Tokyo, Japan, 4 pgs., (1998).
Schutz et al., Multi-Feature Matching Algorithm for Free-Form 3D Surface Registration, Institute for Microtechnology, Neuchatel, Switzerland, 3 pgs., Aug. 1998.
Tim Baker, “Representation of Uncertainty in Spatial Target Tracking”, Malcolm Strens, DERA Farnborough, United Kingdom, 4 pgs., (1998).
Liu et al., “Using Centroid Covariance in Target Recognition”, Dept. of Electrical Engineering, University of Washington, Seattle, Washington, 4 pgs., (1998).
Hjaltason, et al. “Using Spatial Sorting and Ranking in Model Based Object Recognition”, Computer Science Dept. University of Maryland, College Park, Maryland, 3 pgs., (Aug. 1998).
Nwagboso et al., Surveillance Systems for Terrestrial Transport Safety and Improved User Information Capability, Bolton Institute, Bolton, United Kingdom, Dept. of Biophysical & Electronic Engineering, Genova, Italy, Vigitec, Brussels, Belgium, pp. 1-7, (1998).
Luo et al., “Landmark Recognition using Projection Learning for Mobile Robot Navigation”,Center for Robotics and Intelligent Machines, IEEE World Congress on Computational Intelligence, vol. IV, pp. 2703-2708, Jun. 1994.
Estable et al., “A Real-Time Traffic Sign Recognition System”, Daimler-Benz Research Center, Proceedings of the Intelligent Vehicles '94 Symposium, Paris, France, pp. 213-218, Oct. 1994.
Ghica et al., “Recognition of Traffic Signs by Artificial Neural Network”, Dept. of Computer Science Memorial University of Newfoundland, IEEE, pp. 1444-1449, Mar. 1995.
“Realtime Traffic Sign Recognition (TSR)”, Jens Logemann, Ed., Universität Koblenz-Landau, 3 pgs., Nov. 1997.
Moss et al., “Registering Multiple Cartographic Models with the Hierarchical Mixture of Experts Algorithm”, Dept. of Computer Science, University of New York, IEEE, pp. 909-914, 1997.
Crowley et al., “Multi-Modal Tracking of Faces for Video Communications”, GRAVIR-IMAG, I.N.P. Grenoble, Grenoble, France, IEEE, pp. 640-645, 1997.
Escalera et al., “Road Traffic Sign Detection and Classification”, IEEE Transactions on Industrial Electronics, vol. 44, No. 6, pp. 848-859, Dec. 1997.
Mandal, “Illumination Invariant Image Indexing Using Moments and Wavelets”, Journal of Electronic Imaging, Apr. 1998 pp. 282-293, vol. 7 (2), USA.
Celentano, “Feature Integration and Relevance Feedback Analysis in Image Similarity Evaluation” Journal of Electronic Imaging, Apr. 1998, pp. 308-317, vol. 7(2), USA.
Estevez, “Auto-Associative Segmentation for Real-Time Object Recognition in Realistic Outdoor Images”, Journal of Electronic Imaging, Apr. 1998 pp. 378-385, vol. 7(2), USA.
Bixler, “Extracting text from real-world scenes”, Artificial Intelligence Group, Jet Propulsion Department of Computer Science, Virginia Tech, Blacksburg,Virginia; Laboratory, California Institute of Technology, Pasadena, California, Article, 8 pp., 1988.
Carson et al., “Region Base Image Querying,” Proc. OfIEEE CUPR Workshop on Content-Based Access ofImages and Video Libraries, 1997.
Lui et al., “Scalable Object-Based Image Retrieval,” a pdf paper, Sep. 2003.
Ozer et al., “A Graph Based Object Description for Information Retrieval in Digital Image and Video Libraries,” a pdf paper, 1998.
Fan et al., “Automatic Model-Based Semantic Object Extraction Algorithm,” IEEE Trans on Circuits and Systems for Video Technology, vol. 11, No. 10, Oct. 2001, pp. 1073.
Ardizzoni et al., “Windsurf: Region Based Image Retrieval Using Wavelets,” Proc. of the 1st Int'l Workshop on Similarity Search, Sep. 1999, pp. 167-173.
Application and File History for U.S. Appl. No. 11/122,969, filed May 5, 2005, now U.S. Pat. No. 7,590,310, Inventors Retterath et al.
Application and File History for U.S. Appl. No. 09/177,836, filed Oct. 23, 1998, now U.S. Pat. No. 6,266,442, Inventors Laumeyer et al.
Application and File History for U.S. Appl. No. 10/634,630, filed Aug. 5, 2003, now U.S. Pat. No. 7,092,548, Inventors Laumeyer et al.
Application and File History for U.S. Appl. No. 11/457,255, filed Jul. 13, 2006, now U.S. Pat. No. 7,444,003, Inventors Laumeyer et al.
Application and File History for U.S. Appl. No. 09/928,218, filed Aug. 10, 2001, now U.S. Pat. No. 6,891,960, Inventors Retterath et al.
Application and File History for U.S. Appl. No. 11/056,926, filed Feb. 11, 2005, now U.S. Pat. No. 7,515,736, Inventors Retterath et al.
Application and File History for U.S. Appl. No. 12/419,843, filed Apr. 7, 2009, now U.S. Pat. No. 7,995,796, Inventors Retterath et al.
Application and File History for U.S. Appl. No. 09/918,375, filed Jul. 30, 2001, now U.S. Pat. No. 6,674,878, Inventors Retterath et al.
Application and File History for U.S. Appl. No. 10/736,454, filed Dec. 15, 2003, now U.S. Pat. No. 7,043,057, Inventors Retterath et al.
Application and File History for U.S. Appl. No. 11/381,503, filed May 3, 2006, now U.S. Pat. No. 7,173,707, Inventors Retterath et al.
Application and File History for U.S. Appl. No. 11/702,421, filed Feb. 5, 2007, now U.S. Pat. No. 7,411,681, Inventors Retterath et al.
Application and File History for U.S. Appl. No. 12/584,894, filed Sep. 14, 2009, now U.S. Pat. No. 8,150,216, Inventors Retterath et al.
Hyaltason et al., “Using Spatial Sorting and Ranking in Model-Based Object Recognition [on-line]”, Aug. 16-20, 1998 [retrieved on Oct. 1, 2013], Fourteenth International Conference on Pattern Recognition, 1998, vol. 2, pp. 1347-1349. Retrieved from the Internet:http://ieeexplore.ieee.org/xpls/abs_all.jsp?anumber=711951.
Janssen et al., “Hybrid Approach for Traffic Sign Recognition [on-line]”, Jul. 14-16, 1993 [retrieved on Oct. 1, 2013], Intelligent Vehicles'93 Symposium, pp. 390-395. Retrieved from the Internet: http://ieeexplore.ieee.org/xpls/abs_all.isp?arnumber=697358.
McGee et al., “An Implementation Guide for Minimum Retroreflectivity Requirements for Traffic Signs [on-line]”, Apr. 1998 [retrieved on Apr. 26, 2013], U.S. Department of Commerce National Technical Information Service, Publication No. FHWA-RD-97-052, 60 pages. Retrieved from the Internet: http://trid.trb.org/view.aspx?id=483855.
Maerz et al., “Surveyor: Mobile Highway Inventory and Measurement System [on-line]”, [retrieved on Dec. 12, 2012]. Retrieved from the Internet: http://web.mst.edu/-norbert/ref.htm. Cited in 892 dated Dec. 27, 2012.
Application and File History for U.S. Appl. No. 13/205,337, filed Aug. 8, 2011, now U.S. Pat. No. 8,660,311. Inventors Retterath et al.
Application and File History for U.S. Appl. No. 14/025,614, filed Sep. 12, 2013, now U.S. Pat. No. 8,860,944. Inventors Retterath et al.
Application and File History for U.S. Appl. No. 14/512,735, filed Oct. 13, 2014, now U.S. Pat. No. 9,335,255. Inventors Retterath et al.
Application and File History for U.S. Appl. No. 15/148,722, filed Oct. 13, 2014. Inventors Retterath et al.
Long, Michigan DOT Reflects on Signs, Oct. 1997 (accessed Jan. 11, 2017) transportation research board, 192 pp. 24-25.
Lumia, “A mobile system for measuring retroreflectance of traffic signs”, Mar. 1, 1991 (accessed Jan. 11, 2017) proc SPIE optics, illumination and image sensing for machine vision V 1385 pp. 15-26.
Klausmann et al., “Robust Lane Recognition Embedded in a Real-Time Driver Assistance System”, IEEE International Conference on Intelligent Vehicles, 1998, 6 pages.
Related Publications (1)
Number Date Country
20180225531 A1 Aug 2018 US
Continuations (4)
Number Date Country
Parent 15394939 Dec 2016 US
Child 15945199 US
Parent 14865882 Sep 2015 US
Child 15394939 US
Parent 14265747 Apr 2014 US
Child 14865882 US
Parent 12866888 US
Child 14265747 US