The present disclosure relates to the field of virtual reality technologies and, in particular, to a virtual reality device with iris acquisition function.
Virtual reality (VR) technology is a computer technology for creating and experiencing a virtual world. It uses a computer to create a virtual environment and belongs to system simulation of three-dimensional dynamic vision and entity behavior based on the interactive integration of multi-source information. By means of professional equipment such as a sensing helmet, data gloves and the like, a user can enter a virtual space and perceive and operate various objects in the virtual world in real time, so as to obtain a highly immersive experience through vision, touch, hearing and so on. Virtual reality technology is an important research direction of simulation technology and integrates a variety of technologies such as computer graphics, man-machine interface technology, multimedia technology, sensing technology, network technology and the like.
In recent years, virtual reality technology has been developing rapidly, and virtual reality devices such as VR glasses have become widely popularized. Research on virtual reality devices in the prior art mainly concerns graphics and man-machine interaction, so that the user can obtain a better immersive experience. However, as virtual reality devices are widely applied, users have increasingly diversified and personalized demands, for example, demands for convenient payment and secure recognition when purchasing through the virtual reality device, or for private unlocking of encrypted content when viewing videos, and the like.
Iris recognition technology is one of the human biometric recognition technologies. Features of the iris such as its high uniqueness, high stability and unchangeability form the physical foundation for using the iris in identity authentication. Iris recognition is therefore among the most convenient and accurate of all biometric recognition technologies, including fingerprint identification. It is regarded as the most promising biometric recognition technology of the 21st century and is expected to become a key technology in a variety of application fields such as security, national defense, e-commerce and the like. This trend is already apparent in applications throughout the world, and the technology has a wide application prospect.
In view of the above, the present disclosure combines iris recognition technology with virtual reality technology and provides a virtual reality device with iris acquisition function, so as to meet the user's demands for convenience, safety and privacy when using the virtual reality device.
Many aspects of the exemplary embodiments can be better understood with reference to the following drawings. The components in the drawings are not necessarily drawn to scale, the emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
Technical solutions in embodiments of the present disclosure will be described clearly and completely below in combination with the accompanying drawings of the embodiments of the present disclosure. Obviously, the described embodiments are only a part of the embodiments, rather than all of them. Any other embodiment obtained by those of ordinary skill in the art, based on the embodiments of the present disclosure and without creative efforts, shall fall within the protection scope of the present disclosure.
As shown in the accompanying drawings, a virtual reality device with iris acquisition function according to a first exemplary embodiment includes a display screen 100, an iris camera 101, a first observing lens 102, a second observing lens 103 and an infrared source 104, and has an optical central axis A′.
The first observing lens 102 and the second observing lens 103 are respectively arranged at two sides of the optical central axis A′. The positions of the iris camera 101 and the infrared source 104 are not particularly limited, provided that neither the display screen 100 nor the line of sight of the human eyes is interfered with; the iris camera 101 and the infrared source 104 are likewise respectively arranged at two sides of the optical central axis A′. Moreover, the infrared source 104 shall be arranged on the focal plane of the first observing lens 102 and the second observing lens 103. In the present embodiment, the iris camera 101 is configured to acquire two iris images of the human eyes from the first observing lens 102 or the second observing lens 103. For ease of describing the principle, the case in which the iris camera 101 acquires the two iris images of the human eyes from the first observing lens 102 is taken as an example.
As shown in the accompanying drawings, assuming that the straight-line distance between the iris camera 101 and the first observing lens 102 is X, the vertical distance between the iris camera 101 and the optical central axis A′ is Y, the straight-line distance between the first observing lens 102 and the human eye is l, and the straight-line distance between the first observing lens 102 and the human eye image or iris image generated through the first observing lens 102 is l′, then X and Y shall meet the following formula:
In which, d is the center-to-center distance between the first observing lens 102 and the second observing lens 103, which matches the pupil distance of the human eyes; D is the diameter of the first observing lens 102; and Ø is the height of the display screen 100 in the direction perpendicular to the optical central axis A′.
The view field of the iris camera 101 shall meet the following relation:
ω>2×max[(α−β),(γ−α)]
in which, ω is the view field angle, α is an angle between an optical axis B′ of the iris camera 101 and a direction perpendicular to the optical central axis A′, β is an angle between the direction perpendicular to the optical central axis A′ and a line from one of the two iris images of the human eyes to an optical center 101′ of the iris camera 101, and γ is an angle between the direction perpendicular to the optical central axis A′ and a line from the other one of the two iris images of the human eyes to the optical center 101′ of the iris camera 101. The optical center 101′ is located at the center of the iris camera 101 and on the optical axis B′ of the iris camera 101.
Moreover, when α−β=γ−α, the view field angle ω of the iris camera 101 shall meet the following formula:
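The formula itself is not reproduced here; the following is only a sketch derived from the general relation ω>2×max[(α−β),(γ−α)] above, and not necessarily the exact form shown in the original drawing. When α−β=γ−α, the two terms inside max[·] are equal, so the bound reduces to:

\[
\omega > 2(\alpha-\beta) = \gamma-\beta .
\]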
and the field depth of the iris camera 101 shall further meet the following formula:
near field depth: Δ1≤(X+l′)sin α+(Y−d/2)cos α
far field depth: Δ2≥(X+l′)sin α+(Y+d/2)cos α
field depth: Δ≥d·cos α.
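Purely as an illustrative aid (not part of the original disclosure), the sketch below evaluates the view field and field depth conditions above for an assumed geometry. The expressions used for β and γ are an assumption chosen to be consistent with the (X+l′) and (Y±d/2) terms appearing in the field depth formulas, and the numerical values are hypothetical.

import math

# Hypothetical geometry (illustrative values only, not taken from the disclosure).
X = 0.030      # straight-line distance from iris camera 101 to first observing lens 102 (m)
Y = 0.045      # vertical distance from iris camera 101 to optical central axis A' (m)
l_img = 0.040  # l': distance from lens 102 to the iris image it generates (m)
d = 0.063      # center-to-center lens distance, matched to the pupil distance (m)

# Assumption: relative to the optical center 101', the two iris images lie at
# (X + l') along the optical central axis A' and (Y -/+ d/2) perpendicular to it,
# so beta and gamma are the angles of the lines to these images measured from the
# direction perpendicular to A' (consistent with the field depth formulas above).
beta = math.atan2(X + l_img, Y + d / 2)   # line to the farther iris image
gamma = math.atan2(X + l_img, Y - d / 2)  # line to the nearer iris image
alpha = (beta + gamma) / 2                # camera axis bisecting the two lines

# View field condition: omega > 2 * max[(alpha - beta), (gamma - alpha)]
omega_min = 2 * max(alpha - beta, gamma - alpha)
print("minimum view field angle:", round(math.degrees(omega_min), 1), "deg")

# Field depth conditions:
#   near field depth Delta1 <= (X + l')*sin(alpha) + (Y - d/2)*cos(alpha)
#   far  field depth Delta2 >= (X + l')*sin(alpha) + (Y + d/2)*cos(alpha)
#   field depth      Delta  >= d*cos(alpha)
near_bound = (X + l_img) * math.sin(alpha) + (Y - d / 2) * math.cos(alpha)
far_bound = (X + l_img) * math.sin(alpha) + (Y + d / 2) * math.cos(alpha)
print("near field depth must not exceed:", round(near_bound, 4), "m")
print("far field depth must reach at least:", round(far_bound, 4), "m")
print("required field depth span:", round(d * math.cos(alpha), 4), "m")

# The span requirement is simply the difference of the far and near bounds.
assert math.isclose(far_bound - near_bound, d * math.cos(alpha))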
The specific position of the iris camera 101 can be determined with reference to the principle diagram shown in the accompanying drawings.
As shown in the accompanying drawings, the light-emitting angle ψ of the infrared source 104 shall meet the following formula:
in which, L is the vertical distance between the infrared source 104 and the optical central axis A′, and f′ is a focal length of the first observing lens 102 and the second observing lens 103. The light-emitting angle ψ shall be as large as possible, whereas the tilting angle θ shall be as small as possible.
As shown in the accompanying drawings, in another exemplary embodiment, the positions of the iris camera and the infrared source shall meet the following formula:
in which, L is the vertical distance between the infrared source and the optical central axis; X is the straight-line distance between the iris camera and the first observing lens; Y is the vertical distance between the iris camera and the optical central axis; l is the straight-line distance between the first observing lens and the human eye; l′ is the straight-line distance between the first observing lens and the human eye image or iris image generated through the first observing lens; d is the center-to-center distance between the first observing lens and the second observing lens, which matches the pupil distance of the human eyes; D is the diameter of the first observing lens; and Ø is the height of the display screen in the direction perpendicular to the optical central axis.
Similar to the first exemplary embodiment, the view field angle ω of the iris camera shall meet the following formula:
ω>2×max[(α−β),(γ−α)]
in which α, β and γ are as defined in the first exemplary embodiment.
When α−β=γ−α, the view field angle ω of the iris camera shall meet the following formula:
and the field depth of the iris camera shall further meet the following formula:
near field depth: Δ1≤(X+l′)sin α+(Y−d/2)cos α
far field depth: Δ2≥(X+l′)sin α+(Y+d/2)cos α
field depth: Δ≥d·cos α
The specific position of the iris camera can be determined with reference to the principle diagram shown in the accompanying drawings.
Meanwhile, the light-emitting angle ψ of the infrared source shall meet the following formula:
in which, f′ is the focal length of the first observing lens and the second observing lens. The light-emitting angle ψ shall be as large as possible, whereas the tilting angle θ shall be as small as possible.
As shown in the accompanying drawings, a virtual reality device with iris acquisition function according to another exemplary embodiment includes a display screen 200, an iris camera 201, a first observing lens 202, a second observing lens 203 and an infrared source 204.
Assuming the straight-line distance between the iris camera 201 and the first observing lens 202 or the second observing lens 203 is X, and the vertical distance between the iris camera 201 and the optical central axis is Y, the straight-line distance between the first observing lens 202 or the second observing lens 203 and the human eye is l, and the straight-line distance between the first observing lens 202 and the human eye image or iris image generated through the first observing lens 202 is l′, then X and Y shall meet the following formula:
In which, H is the straight-line distance between the display screen 200 and the first observing lens 202 or the second observing lens 203; d is the center-to-center distance between the first observing lens 202 and the second observing lens 203, which matches the pupil distance of the human eyes; D is the diameter of the first observing lens 202 and the second observing lens 203; and Ø is the height of the display screen 200 in the direction perpendicular to the optical central axis.
The view field of the iris camera 201 shall meet the following relation:
ω>2×max[(α−β),(γ−α)]
in which α, β and γ are as defined in the first exemplary embodiment.
Moreover, when α−β=γ−α, the view field angle ω of the iris camera 201 shall meet the following formula:
and the field depth of the iris camera 201 shall further meet the following formula:
near field depth: Δ1≤(X+l′)sin α+(Y−d/2)cos α
far field depth: Δ2≥(X+l′)sin α+(Y+d/2)cos α
field depth: Δ≥d·cos α.
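As a brief note (an observation on the three conditions above, not additional disclosure), subtracting the near field depth bound from the far field depth bound shows why the overall field depth requirement takes the value d·cos α:

\[
\Delta_2 - \Delta_1 \;\ge\; \Big[(X+l')\sin\alpha + \big(Y+\tfrac{d}{2}\big)\cos\alpha\Big] - \Big[(X+l')\sin\alpha + \big(Y-\tfrac{d}{2}\big)\cos\alpha\Big] \;=\; d\cos\alpha .
\]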
The specific position of the iris camera 201 can be determined with reference to the principle diagram shown in the accompanying drawings.
Referring to the accompanying drawings, the light-emitting angle ψ of the infrared source 204 shall meet the following formula:
in which, L is the vertical distance between the infrared source 204 and the optical central axis, and f′ is a focal length of the first observing lens 202 and the second observing lens 203. The light-emitting angle ψ shall be as large as possible, whereas the tilting angle θ shall be as small as possible.
As shown in the accompanying drawings, in a further exemplary embodiment, the positions of the iris camera and the infrared source shall meet the following formula:
in which, L is the vertical distance between the infrared source and the optical central axis; X is the straight-line distance between the iris camera and the first observing lens; Y is the vertical distance between the iris camera and the optical central axis; l is the straight-line distance between the first observing lens and the human eye; l′ is the straight-line distance between the first observing lens and the human eye image or iris image generated through the first observing lens; d is the center-to-center distance between the first observing lens and the second observing lens, which matches the pupil distance of the human eyes; D is the diameter of the first observing lens; and Ø is the height of the display screen in the direction perpendicular to the optical central axis.
Similar to the first exemplary embodiment, the view field angle ω of the iris camera shall meet the following formula:
ω>2×max[(α−β),(γ−α)]
in which α, β and γ are as defined in the first exemplary embodiment.
When α−β=γ−α, the view field angle ω of the iris camera shall meet the following formula:
and the field depth of the iris camera shall further meet the following formula:
near field depth: Δ1≤(X+l′)sin α+(Y−d/2)cos α
far field depth: Δ2≥(X+l′)sin α+(Y+d/2)cos α
field depth: Δ≥d·cos α.
The specific position of the iris camera can be determined with reference to the principle diagram shown in the accompanying drawings.
The light-emitting angle ψ of the infrared source shall meet the following formula:
in which, L is the vertical distance between the infrared source and the optical central axis, and f′ is a focal length of the first observing lens and the second observing lens. The light-emitting angle ψ shall be as large as possible, whereas the tilting angle θ shall be as small as possible.
In the above embodiments, the virtual reality device with iris acquisition function can further include two infrared sources, as shown in the accompanying drawings.
In other possible embodiments, the virtual reality device with iris acquisition function can further include two iris cameras, which are arranged adjacent to the display screen, are respectively located at two sides of the optical central axis, and are configured to acquire a single iris image of the human eyes from the two observing lenses, respectively. The positional relations that the iris cameras shall meet can be referred to in the above embodiments. Similarly, the number of infrared sources can be one or two, and the condition that the light-emitting angle shall meet can also be referred to in the above embodiments.
The above are preferred embodiments of the present disclosure and are not intended to limit the present disclosure; for persons skilled in the art, the present disclosure may have various alterations and modifications. Any modification, equivalent replacement or improvement made within the spirit and principle of the present disclosure shall fall within the protection scope of the present disclosure.