DEVICE FOR BIOMETRICALLY CONTROLLING A FACE SURFACE

Abstract
The invention relates to devices for measuring surface contours and can be used in a security system for identifying a person. The inventive device for biometrically controlling a face surface comprises a TV camera, a unit for displaying a face position, a computer and an illumination unit provided with a transparency and an objective lens for projecting the transparency image onto the face. The illumination unit is located in such a way that the optical axes of the objective lenses of the illumination unit and of the TV camera lie in the same plane at an angle with respect to each other. The unit for displaying the face position is embodied and disposed in such a way that it makes it possible to display the symmetrical face position with respect to the plane formed by the optical axes of the objective lenses of the illumination unit and the TV camera.
Description
FIELD OF THE INVENTION

The invention relates to devices for measuring surface contours and can be used for person identification in security systems.


STATE OF THE ART

A device for contactless control of the surface profile of objects is known, WO 00/70303 of 23 Nov. 2000, comprising a pulse illumination unit provided with a pulse light source and a transparency, which forms a transparency image on an object surface, and an image recording unit.


The disadvantage of this device is that it is impractical for biometric control of a face profile, since it allows an arbitrary face orientation, which requires associating the face contour points with the face image and complicates biometric control of the face surface.


A device for biometrical control of a face surface is known, WO 02/09038 of 31 Jan. 2002, comprising a TV camera (image recording unit), a unit for displaying face position and a computer.


The disadvantage of this device is the low accuracy of measuring the positions of points on the face surface, since the coordinates of these points are determined only in a plane and not in space, as well as the low speed of biometric control, caused by the need to perform manual operations.


DISCLOSURE OF THE INVENTION

The invention aims at providing efficient biometric control of a face surface.


The technical result of using this invention is an increase in control efficiency and in the accuracy of determining biometric face characteristics.


The described technical result is achieved by means of a device for biometrically controlling a face surface, which comprises a TV camera, a unit for displaying the face position and a computer. The device additionally includes an illumination unit provided with a transparency and an objective lens for projecting the transparency image onto the face surface. The illumination unit is arranged in such a way that the optical axes of the objective lenses of the illumination unit and the TV camera lie in one plane at an angle with respect to each other, while the unit for displaying the face position is embodied and disposed in such a way that it makes it possible to display the symmetric face position with respect to the plane formed by the optical axes of the objective lenses of the illumination unit and the TV camera.


The computer associates the obtained contours of the face surface with a coordinate system tied to the human face. For this purpose, the computer is capable of determining the actual, asymmetrical face position with respect to the plane formed by the optical axes of the objective lenses of the illumination unit and the TV camera.
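By way of illustration only (the patent does not specify an algorithm for this step), the following Python sketch shows one way the computer could estimate the residual rotation of the face relative to the plane of the optical axes from the 3D coordinates of two eye landmarks, and then rotate the measured surface points into a face-fixed coordinate system. The landmark names, the NumPy-based processing and the choice of the nose tip as origin are assumptions.

```python
import numpy as np

def face_yaw(left_eye, right_eye):
    """Estimate the face rotation (yaw, radians) about the vertical axis.

    left_eye, right_eye: (X, Y, Z) landmark coordinates measured by the device.
    For a face oriented symmetrically to the plane of the optical axes the two
    eyes have equal depth Z, so the yaw angle is zero.
    """
    dx = right_eye[0] - left_eye[0]   # horizontal separation of the eyes
    dz = right_eye[2] - left_eye[2]   # depth difference caused by asymmetry
    return np.arctan2(dz, dx)

def to_face_frame(points, yaw, origin):
    """Rotate measured surface points by -yaw about the vertical (Y) axis
    around 'origin' (e.g. the nose tip), yielding face-fixed coordinates."""
    c, s = np.cos(-yaw), np.sin(-yaw)
    rot = np.array([[c,   0.0, s],
                    [0.0, 1.0, 0.0],
                    [-s,  0.0, c]])
    return (points - origin) @ rot.T

# Example: a slightly turned face is brought back to the symmetric position.
left_eye  = np.array([-32.0, 40.0, 18.0])
right_eye = np.array([ 32.0, 40.0, 22.0])
nose_tip  = np.array([  0.0,  0.0, 45.0])
yaw = face_yaw(left_eye, right_eye)
surface = np.vstack([left_eye, right_eye, nose_tip])
aligned = to_face_frame(surface, yaw, nose_tip)
```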


The objective lenses of the illumination unit and the TV camera can be positioned one above the other in such a way that their optical axes lie in a vertical plane, with the unit for displaying the face position positioned between them.


The transparency of the illumination unit can be realized in the form of a screen composed of parallel band segments and one band that is transverse to them and runs along the axis of symmetry of the screen, arranged in such a way that the transverse band is located in the vertical plane formed by the optical axes of the objective lenses of the illumination unit and the TV camera.
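As a minimal sketch of such a screen pattern (the actual dimensions, band pitch and resolution are assumptions, not taken from the text), the following Python snippet builds a binary mask with horizontal parallel band segments and a single vertical band along the axis of symmetry.

```python
import numpy as np

def make_transparency(height=480, width=640, band_period=16, band_width=4,
                      transverse_width=4):
    """Binary mask of the transparency pattern described above:
    horizontal parallel band segments plus one transverse band along the
    screen's vertical axis of symmetry. All dimensions are illustrative."""
    mask = np.zeros((height, width), dtype=np.uint8)
    # parallel band segments across the screen
    for y in range(0, height, band_period):
        mask[y:y + band_width, :] = 1
    # transverse band along the vertical axis of symmetry
    mid = width // 2
    mask[:, mid - transverse_width // 2: mid + transverse_width // 2] = 1
    return mask
```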


The unit for displaying the face position can be realized in the form of a planar mirror with a band and arranged in such a way that the band is located in the plane formed by the optical axes of the objective lenses of the illumination unit and TV camera.


The unit for displaying the face position can be realized in the form of a two-face mirror or in the form of several two-face mirrors, edges of which are located in the plane formed by the optical axes of the objective lenses of the illumination unit and TV camera.


The unit for displaying the face position can be realized in the form of a TV screen with a vertical marking defining the location of the plane formed by the optical axes of the objective lenses of the illumination unit and TV camera.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 shows the schematic of the device for biometrical control of a face surface.



FIG. 2 schematically illustrates positioning of the device units as seen by a person at the time of biometrical control of his/her face.



FIGS. 3, 4 and 5 schematically show a combined mirror unit for displaying a face position.





BEST EMBODIMENT

According to the schematic shown in FIG. 1, the device includes an illumination unit 1 provided with a transparency and an objective lens for projecting the transparency image onto the face surface, a TV camera 2, a computer 3 and a unit 4 for displaying the face position.


The optical axes of the projecting objective lenses of the illumination unit 1 and the TV camera 2 are disposed at an angle α with respect to each other.


The image of the transparency, distorted by the surface profile of the human face, is recorded by the TV camera 2 and transmitted to the computer 3, which computes the height Z of the surface profile at a point with coordinates X, Y using the formula:

Z=ΔY/tan(α),

where ΔY is a measure of the transparency image band distortion and α is the angle between the optical axes of the objective lenses of the illumination unit and the TV camera.
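As a minimal illustration of this triangulation step (not part of the patent text), the sketch below computes the profile height Z from the displacement ΔY of a projected band between a reference position and its observed position on the face, for a given angle α between the optical axes. The function and variable names are assumptions.

```python
import math

def profile_height(y_observed, y_reference, alpha_deg):
    """Height Z of a surface point above the reference plane, computed from
    the band displacement dY = y_observed - y_reference and the angle alpha
    between the two optical axes, i.e. Z = dY / tan(alpha)."""
    delta_y = y_observed - y_reference
    return delta_y / math.tan(math.radians(alpha_deg))

# Example: a band shifted by 3 mm with a 30-degree axis angle
# corresponds to a profile height of about 5.2 mm.
print(profile_height(12.0, 9.0, 30.0))
```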


The computer determines characteristic points and fields of the face surface based on the three coordinates X, Y and Z.
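The text does not specify which characteristic points and fields are extracted; purely as an assumed example, the sketch below derives two simple characteristics from the measured (X, Y, Z) point set: the most protruding point (taken here as the nose tip) and the overall extent of the surface.

```python
import numpy as np

def characteristic_points(points):
    """points: (N, 3) array of measured (X, Y, Z) face-surface coordinates.

    Returns a few illustrative characteristics; the actual set of points and
    fields used by the device is not specified in the text."""
    nose_tip = points[np.argmax(points[:, 2])]        # most protruding point
    extent = points.max(axis=0) - points.min(axis=0)  # bounding-box extent
    return {"nose_tip": nose_tip, "extent": extent}

# Usage: characteristic_points(surface) with the aligned point set from above.
```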


The person whose face is being analyzed orients his/her face as shown in FIG. 2 using the unit 4 for displaying the face position, which can be realized in the form of a mirror or a TV screen with a vertical marking, allowing the nose to be oriented along the marking or the eyes to be placed symmetrically with respect to the marking. The display unit can be realized in the form of a two-face mirror or a series of two-face mirrors (see FIGS. 3, 4 and 5), whose edges are located in the plane formed by the optical axes of the objective lenses of the illumination unit and the TV camera.


Realization of the transparency of the illumination unit 1 in the form of a line screen with a transverse band located in the plane formed by the optical axes of the objective lenses of the illumination unit and the TV camera allows a person to orient his/her face symmetrically with respect to this transverse band by observing it in the unit for displaying the face position. In this case, the nose is positioned along the band.


The illumination unit and TV camera can operate outside of the visible part of the optical range.


The computer is realized with a capability to determine the actual asymmetrical position of the face with respect to the plane formed by the optical axes of the objective lenses of the illumination unit and TV camera.

Claims
  • 1-5. (canceled)
  • 6. A method for capturing face-surface profiles with a contactless biometric control system, the method comprising: directing a pulse illumination unit of the contactless biometric control system to project a transparent image on a face of a user such that the face of the user distorts the transparent image, the pulse illumination unit having an illumination objective lens with an illumination optical axis; receiving from a camera of the contactless biometric control system image data indicative of a distorted image of the transparent image projected on and distorted by the face of the user, the camera having a camera objective lens with a camera optical axis, the illumination and camera optical axes of the illumination and camera objective lenses being disposed in a common plane at an oblique angle with respect to each other; processing with a computer processor of the contactless biometric control system the distorted image; and generating with the computer processor of the contactless biometric control system characteristic points and fields of the face of the user based on three-dimensional coordinates for a surface profile of the face derived from the distorted image.
  • 7. The method of claim 6, wherein the pulse illumination unit includes a screen with plural parallel band segments and a transverse band segment extending transversely to the parallel band segments along an axis of symmetry of the screen.
  • 8. The method of claim 7, wherein the transverse band segment is located in a vertical plane formed by the illumination and camera optical axes of the objective lenses of the pulse illumination unit and the camera.
  • 9. The method of claim 6, further comprising directing a face-orientation display unit of the contactless biometric control system to display to the user a face position of the face relative to the pulse illumination unit.
  • 10. The method of claim 9, wherein the face-orientation display unit includes a video display screen with a marking identifying a position of the common plane formed by the illumination and camera optical axes of the objective lenses of the pulse illumination unit and the camera.
  • 11. The method of claim 9, wherein the face-orientation display unit includes a video display screen adapted to display a symmetrical face position with respect to the common plane formed by the optical axes of the objective lenses of the illumination unit and the camera.
  • 12. The method of claim 6, wherein the contactless biometric control system further comprises a face-orientation unit with a two-face mirror having lateral edges parallel to the common plane formed by the illumination and camera optical axes of the objective lenses of the pulse illumination unit and the camera.
  • 13. The method of claim 12, wherein the two-face mirror is located between the pulse illumination unit and the camera in the plane formed by the illumination and camera optical axes of the objective lenses of the pulse illumination unit and the camera.
  • 14. The method of claim 6, wherein at least one of the three-dimensional coordinates is determined based on Z=ΔY/tan(α), where Z is the height of the surface profile, ΔY is a measure of transparency image band distortion, and α is the angle.
  • 15. The method of claim 6, wherein the processing the distorted image includes measuring a transparency image band distortion.
  • 16. A computer-implemented method for biometrically controlling a face surface, the method comprising: directing an illumination unit to project a transparency image on a face of a user such that the face of the user distorts the transparency image, the illumination unit having an illumination objective lens with an illumination optical axis; directing a face-orientation unit to display to the user a face position of the face; receiving from a camera image data indicative of a distorted image of the transparency image projected on and distorted by the face of the user, the camera having a camera objective lens with a camera optical axis, the illumination and camera optical axes of the objective lenses being disposed in a common plane at an angle with respect to each other; processing the distorted image; and generating characteristic points and fields of the face of the user based on three-dimensional coordinates for a surface profile of the face derived from the distorted image.
  • 17. A method for controlling a face-surface profile with a contactless control system, the method comprising: projecting, via an illumination unit of the contactless control system, a transparency image on a face of a user such that the face of the user distorts the transparency image, the illumination unit having an illumination objective lens with an illumination optical axis; displaying, via a face-orientation unit of the contactless control system, a face position of the face of the user relative to the illumination unit; capturing, via a camera unit of the contactless control system, image data indicative of a distorted image of the transparency image projected on and distorted by the face of the user, the camera unit having a camera objective lens with a camera optical axis, the illumination and camera optical axes of the objective lenses being disposed in a common plane at an angle with respect to each other; and processing, via a computing unit, the distorted image to generate characteristic points and fields of the face of the user based on three-dimensional coordinates for a surface profile of the face derived from the distorted image.
  • 18. The method of claim 17, wherein the illumination unit includes a screen with a plurality of band segments and at least one band segment extending transverse to the plurality of band segments, the at least one transverse band segment being located in a plane formed by the optical axes of the objective lenses of the illumination unit and the camera unit.
  • 19. The method of claim 17, wherein the face-orientation unit includes one or more two-face mirrors having edges located in the common plane formed by the optical axes of the objective lenses of the illumination unit and the camera unit.
  • 20. The method of claim 17, wherein the face-orientation unit includes plural two-face mirrors each having lateral edges parallel to the common plane formed by the optical axes of the objective lenses of the illumination unit and the camera unit.
  • 21. The method of claim 17, wherein the face-orientation unit is located between the illumination unit and the camera unit in the plane formed by the optical axes of the objective lenses of the pulse illumination unit and the camera unit.
  • 22. The method of claim 17, wherein the face-orientation unit includes a screen with a vertical marking defining the position of the common plane formed by the optical axes of the objective lenses of the illumination unit and the camera unit.
  • 23. The method of claim 17, wherein at least one of the three-dimensional coordinates is determined based on Z=ΔY/tan(α), where Z is the height of the surface profile, ΔY is a measure of transparency image band distortion, and α is the angle.
  • 24. The method of claim 17, wherein the processing the distorted image includes measuring a transparency image band distortion.
  • 25. The method of claim 17, further comprising transmitting the distorted image to the computing unit.
Priority Claims (1)
Number Date Country Kind
2004-000312 Aug 2004 RU national
Continuations (1)
Number Date Country
Parent 11573548 Apr 2008 US
Child 14800972 US