This application is a U.S. National Stage filing under 35 U.S.C. § 371 of Patent Cooperation Treaty (PCT) Application No. PCT/CN2013/088547, filed Dec. 4, 2013, and entitled “IMAGING APPARATUS AND IMAGING METHOD,” which claims priority to Chinese Patent Application No. 201310328739.0, filed with the Chinese Patent Office on Jul. 31, 2013 and entitled “IMAGING APPARATUS AND METHOD”, which applications are hereby incorporated herein by reference in their respective entireties.
The present application relates to the field of imaging technologies, and more particularly to an imaging apparatus and method.
Wearable devices such as Google Glass and smartwatches are gradually being accepted by people, and these smart electronic devices will make people's lives more and more convenient.
Conventional myopia or hyperopia glasses place a concave or convex lens with a fixed focal length before the eye of a person who suffers from a refractive error, so as to correct refractive error problems of various causes. However, conventional glasses require optometry and lens fitting, and can provide correction only within a certain distance range. For an object beyond that range, the user may be unable to obtain a clear image and has indistinct vision, or sees the object with such difficulty that the eyes tire easily.
In view of the foregoing, multi-focus glasses, in which each lens has a plurality of different focal lengths, have appeared. Using glasses worn for eyes with both presbyopia and myopia as an example, the upper portion of the lens is a myopia lens and is used to help the user see an object at a far place clearly; the lower portion of the lens is a hyperopia lens and is used to help the user see an object at a near place clearly. However, the user must look through the upper portion to see a far place and through the lower portion to see a near place; for example, the user needs to lower the head to see an object that is low and far away and raise the head to see an object that is high and near, or the user needs to manually adjust the position of the glasses, which makes the glasses troublesome to use.
Similarly, even healthy human eyes, and imaging recording apparatuses such as cameras or video recorders, cannot obtain clear images of objects at all distances within a visual field. For example, when healthy human eyes look at an object very close to the eyes, they may see the object unclearly or get tired.
Lenses and lens arrays with an electronically adjustable focal length are conventionally known, in which the focal length of a lens can be adjusted. However, adaptively adjusting the focal length of a lens after automatically detecting the focus of a user's eyes has not been addressed.
A technical problem to be solved by one or more embodiments of the present application is to provide an imaging apparatus and method that automatically adjust an imaging parameter of the imaging apparatus according to a focus of an imaging receiver, thereby enabling the imaging receiver (for example, a user's eyes) to conveniently obtain clear imaging of objects at different distances and improving user experience.
To achieve the foregoing objective, in a first aspect, the present application provides an imaging apparatus, which includes:
an imaging lens module, having at least one adjustable imaging parameter, and used to image an observed object of an imaging receiver;
an information processing module, used to detect a focus position of the imaging receiver, and determine the imaging parameter of the imaging lens module according to the focus position; and
a lens adjustment module, used to adjust the imaging lens module according to the determined imaging parameter.
In a second aspect, the present application further provides an imaging method, which includes:
detecting a focus position of an imaging receiver, and determining at least one imaging parameter of an imaging lens module according to the focus position, where the imaging lens module is located between the imaging receiver and an observed object, and the imaging parameter is adjustable; and
adjusting the imaging lens module according to the determined imaging parameter.
In the technical solutions of the embodiments of the present application, a focus position of an imaging receiver is automatically detected and an imaging parameter of an imaging lens module located between the imaging receiver and an object is automatically adjusted according to the focus position, thereby enabling an imaging receiver (for example, a user's eyes) to conveniently obtain clear imaging of objects at different distances.
Particularly, for a user's eyes that suffer from a problem such as a refractive error, the apparatus and method of the embodiments of the present application may solve several problems that occur when watching objects at different distances: indistinct vision and eye exhaustion caused by insufficient refractive correction (when myopic eyes see an object at a far place, or hyperopic eyes see an object at a near place) or excessive refractive correction (when myopic eyes see an object at a near place, or hyperopic eyes see an object at a far place); indistinct vision and eye exhaustion caused by a decreased lens adjustment range when presbyopic eyes need to see objects at a near place and at a far place at the same time; and indistinct vision and eye exhaustion caused by optical axis offsets from astigmatism and strabismus.
The present disclosure will become more fully understood from the detailed description given herein below, which is provided for illustration only and thus is not limitative of the present disclosure, and wherein:
The method and apparatus of the present application are illustrated below in detail with reference to the accompanying drawings and embodiments.
An imaging receiver has a limited adjustment range of an imaging parameter such as a focal length. Using the example in which the imaging receiver is a user's eyes (of course, the imaging receiver may also be an imaging recording apparatus such as a video recorder or a camera), a user with normal eyesight may be unable to see, or may see only with great difficulty, an object very close to the eyes. The adjustment range is even more limited for eyes that have refractive errors such as myopia, hyperopia, presbyopia, astigmatism, or strabismus. Using presbyopic eyes, which are abnormal for both near vision and far vision, as an example, to see an object clearly the eyes often stay in an adjustment state, which easily causes eye exhaustion. Although common glasses can be worn for correction, glasses in the prior art can hardly perform imaging correction for objects at both a far place and a near place.
Therefore, as shown in the figure, an embodiment of the present application provides an imaging apparatus 100, which includes an imaging lens module 110, an information processing module 120, and a lens adjustment module 130.
The imaging lens module 110 has an adjustable imaging parameter and is used to image an observed object of an imaging receiver.
The information processing module 120 is used to detect a focus position of the imaging receiver, and determine the imaging parameter of the imaging lens module according to the focus position.
The lens adjustment module 130 is used to adjust the imaging lens module according to the determined imaging parameter.
The imaging receiver can obtain an expected image of an object with the imaging lens module 110.
The imaging apparatus according to the embodiment of the present application can correspondingly adjust imaging parameters for objects at different distances in a visual field according to a demand of an imaging receiver, enabling a user to comfortably watch objects at different distances in the visual field and thereby improving user experience.
In a possible implementation manner of the embodiment of the present application, the imaging lens module 110 includes at least one lens. Here, an example in which each imaging lens module 110 only includes one lens is used for illustration.
In a preferred implementation manner of the embodiment of the present application, the imaging parameter of the imaging lens module 110 includes: a shape and/or a refractive index of the imaging lens module.
In this implementation manner, the adjustment of the imaging parameter of the imaging lens module 110 may be, for example: the curvature of the lens of the imaging lens module 110 is adjusted to change the focal length of the imaging lens module 110; or the refractive index of the lens of the imaging lens module 110 is adjusted to change the focal length of the imaging lens module 110. In addition, for an astigmatic user, the surface of the lens of the imaging lens module 110 can be adjusted to a cylindrical surface to correct astigmatism; for a strabismal user, the surface of the lens of the imaging lens module 110 can be adjusted to a prismatic surface to correct strabismus. Of course, in other possible implementation manners of the embodiment of the present application, the imaging lens module may include two or more lenses, and in this case, for an astigmatic or strabismal user, the surface of one of the lenses of the imaging lens module 110 is adjusted to a cylindrical surface or a prismatic surface.
As shown in the figure, in a possible implementation manner of the embodiment of the present application, the information processing module 120 includes a refractive examination unit 121, a focus position determination unit 122, and an imaging parameter calculation unit 123.
The refractive examination unit 121 is used to learn imaging parameters corresponding to the imaging receiver 200 when the imaging receiver 200 acquires expected images of objects at a plurality of distances, and obtain refractive examination information corresponding to the imaging receiver. Here, the expected image may be, for example, a clear image or a relatively clear image of an object. When the imaging receiver 200 is eyes, the expected image here may be a clear or relatively clear image of an object that a user's eyes watch comparatively comfortably, that is, when the user watches the clear image of the object, the eyes do not require excessive adjustment and do not get exhausted easily.
Table 1 shows a representative example of the refractive examination information obtained in this implementation manner for a myopic user corresponding to the imaging receiver in this embodiment. Here, the target distance is the distance between an object and the imaging apparatus (in other implementation manners, the distance between an object and the imaging receiver may instead be selected as the target distance). The optimal refractivity is the refractivity that a corresponding region of the imaging lens module is required to reach so that the user's eyes comfortably watch a clear image of an object at the target distance. In other embodiments of the present application, the refractive examination information may further include, for example, optical parameter information for other refractive errors such as astigmatism or strabismus.
For example, if the distance between an object and the imaging apparatus is 1 m, the imaging parameter of the imaging lens module 110 corresponding to the object preferably corresponds to the refractivity −5.75. If the distance between an object and the imaging apparatus is 0.8 m, the optimal refractivity corresponding to 0.8 m may be obtained through interpolation from the optimal refractivities corresponding to 0.5 m and 1 m, and the imaging parameter of the corresponding imaging lens module 110 is further obtained from it. A person skilled in the art should appreciate that the finer the granularity of the object distances used for learning the imaging parameters corresponding to the imaging receiver, the more refractive examination information is obtained, and the more accurate the imaging parameter of the imaging lens module 110 derived from that information.
The focus position determination unit 122 is used to determine a focus position of the imaging receiver according to an optical parameter of the imaging receiver. When the focus position of the imaging receiver is obtained, the distance between the object and the imaging apparatus is obtained.
The imaging parameter calculation unit 123 is used to calculate the imaging parameter of the imaging lens module 110 according to the focus position of the imaging receiver and the refractive examination information corresponding to the imaging receiver. In this implementation manner, the imaging parameter of the corresponding imaging lens module 110 of each object is calculated through the lookup method based on Table 1.
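The following is a minimal sketch of the table lookup combined with the interpolation described above. It assumes a hypothetical refractive examination table of (target distance, optimal refractivity) pairs; apart from the −5.75 diopter value at 1 m quoted in the text, the table entries, function names, and units are illustrative assumptions rather than the actual contents of Table 1.

```python
from bisect import bisect_left

# Hypothetical refractive examination information: (target distance in meters,
# optimal refractivity in diopters), sorted by distance. Only the 1 m / -5.75
# pair is taken from the text; the other rows are assumed for illustration.
REFRACTIVE_EXAMINATION = [(0.5, -4.75), (1.0, -5.75), (5.0, -6.50)]

def optimal_refractivity(distance_m, table=REFRACTIVE_EXAMINATION):
    """Return the optimal refractivity for an object distance, interpolating
    linearly between the two nearest calibrated target distances."""
    distances = [d for d, _ in table]
    i = bisect_left(distances, distance_m)
    if i == 0:
        return table[0][1]        # closer than the nearest calibrated distance
    if i == len(table):
        return table[-1][1]       # farther than the farthest calibrated distance
    (d0, r0), (d1, r1) = table[i - 1], table[i]
    t = (distance_m - d0) / (d1 - d0)
    return r0 + t * (r1 - r0)     # linear interpolation between the two rows

# Example from the text: 0.8 m falls between the 0.5 m and 1 m rows.
print(optimal_refractivity(0.8))
```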
When a user moves at a high speed, an observed object in the visual field keeps changing; timely adjustment may then fail because the adjustment speed of the imaging apparatus cannot keep up with the speed at which the object changes, or the user may feel dizzy even though timely adjustment succeeds. To avoid this, in a possible implementation manner of the embodiment of the present application, the information processing module 120 preferably further includes a movement gesture analysis processing unit 124.
The movement gesture analysis processing unit 124 is used to determine the imaging parameter of the imaging lens module 110 according to movement gesture information of the imaging apparatus 100 (or the imaging lens module 110).
Here, the movement gesture information of the imaging apparatus includes: the relative movement gesture information of the imaging apparatus and the object and/or the movement speed information of the imaging apparatus.
Preferably, the movement gesture analysis processing unit 124 includes:
a movement trajectory prediction and adjustment subunit 1241, used to predict the imaging parameter corresponding to the imaging lens module 110 at a next moment according to the relative movement gesture information of the imaging apparatus and the object and the imaging parameter of the imaging lens module at a current moment. For example, if the object moves in the direction toward the imaging receiver at a speed of 0.5 m per second, the distance between the object and the imaging apparatus at the next second can be predicted from the current distance between the object and the imaging apparatus and the foregoing information, so as to further adjust the imaging parameter of the imaging lens module 110.
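As a hedged illustration of the trajectory prediction performed by subunit 1241, the sketch below assumes the relative speed stays constant over the prediction interval; the current-distance value and all names are illustrative, and optimal_refractivity refers to the lookup sketch given earlier.

```python
def predict_next_distance(current_distance_m, approach_speed_m_per_s, dt_s=1.0):
    """Predict the object distance one interval ahead, assuming the relative
    speed toward the imaging receiver stays constant over the interval."""
    return max(current_distance_m - approach_speed_m_per_s * dt_s, 0.0)

# Example matching the text: object approaching at 0.5 m/s; the 2.0 m current
# distance is an assumed value for illustration.
next_distance = predict_next_distance(2.0, 0.5)          # 1.5 m one second later
next_refractivity = optimal_refractivity(next_distance)  # parameter for the next moment
```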
In a possible implementation manner of the embodiment of the present application, preferably, the movement gesture analysis processing unit 124 includes:
a first movement adjustment subunit 1242 or a second movement adjustment subunit 1243, used, when a movement speed of the imaging apparatus exceeds a set threshold value, to adjust the imaging parameter of the imaging lens module to a set common imaging parameter (the first movement adjustment subunit 1242) or to the value of the imaging parameter of the imaging lens module at a previous moment (the second movement adjustment subunit 1243).
Preferably, in a possible implementation manner of the embodiment of the present application, to prevent the imaging parameter of the imaging lens module 110 from hopping over time, which would cause dizziness for the user, the information processing module 120 further includes:
a history information smoothing unit 125, used to perform smoothing processing of time on the current imaging parameter of the imaging lens module 110 according to history information of the imaging parameter of the imaging lens module 110.
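A minimal sketch combining the history-based temporal smoothing of unit 125 with the speed-threshold fallback of subunits 1242 and 1243 described above; the moving-average window, the threshold, and the fallback value are assumptions, not values taken from the embodiments.

```python
from collections import deque

COMMON_REFRACTIVITY = -6.0       # assumed "set common imaging parameter"
SPEED_THRESHOLD_M_PER_S = 2.0    # assumed movement-speed threshold

class ParameterSmoother:
    """Smooths the imaging parameter over time from its history, and falls
    back to the previous (or a preset common) value when moving too fast."""
    def __init__(self, window=5):
        self.history = deque(maxlen=window)

    def update(self, new_value, movement_speed_m_per_s):
        if movement_speed_m_per_s > SPEED_THRESHOLD_M_PER_S:
            # Adjustment cannot keep up: hold the previous value if any,
            # otherwise use the preset common imaging parameter.
            return self.history[-1] if self.history else COMMON_REFRACTIVITY
        self.history.append(new_value)
        return sum(self.history) / len(self.history)   # simple moving average
```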
In a possible implementation manner of the embodiment of the present application, the focus position determination unit can obtain the focus position with three forms of systems:
A first focus position determination unit can obtain the focus position of the imaging receiver according to an imaging parameter of an optical path between an image collection device and an imaging receiver when a clearest image presented on an imaging plane of the imaging receiver is collected (when the imaging receiver is an eye, the imaging plane is the eyeground (for example, a retina) of the eye). The first focus position determination unit is described in further detail below.
A second focus position determination unit calculates the focus position of the imaging receiver by tracking an optical axis direction of the imaging receiver with an optical axis tracking system and then obtaining a scenario depth of the position of an observed object with a depth sensing device.
A third focus position determination unit is applicable to a scenario where the imaging apparatus corresponds to at least two correlated imaging receivers, tracks optical axis directions of the at least two imaging receivers using an optical axis tracking system, and then obtains focus positions of the imaging receivers through an intersection of the optical axis directions of the at least two imaging receivers. For example, the optical axis directions of the two eyes are tracked respectively, and after the optical axis directions of the two eyes are obtained, the position of the intersection of the two optical axes is calculated, so as to obtain the focus position of the eyes. This implementation manner requires at least two imaging receivers (for example, a human's two eyes) and is not applicable to a scenario where only one imaging receiver exists.
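For the third focus position determination unit, a minimal sketch of obtaining the focus position from the intersection of the two eyes' optical axes is given below. Since two measured 3-D rays rarely intersect exactly, the sketch returns the midpoint of their closest approach; it assumes each optical axis is already available as an origin and a direction, and all names are illustrative.

```python
import numpy as np

def focus_from_two_optical_axes(origin_l, dir_l, origin_r, dir_r):
    """Approximate the focus position as the midpoint of the shortest segment
    between the two (generally skew) optical-axis rays of the two eyes."""
    o1, o2 = np.asarray(origin_l, float), np.asarray(origin_r, float)
    d1 = np.asarray(dir_l, float) / np.linalg.norm(dir_l)
    d2 = np.asarray(dir_r, float) / np.linalg.norm(dir_r)
    w = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    denom = a * c - b * b
    if abs(denom) < 1e-9:
        return None                       # axes are (nearly) parallel, no crossing
    t1 = (b * (d2 @ w) - c * (d1 @ w)) / denom
    t2 = (a * (d2 @ w) - b * (d1 @ w)) / denom
    p1, p2 = o1 + t1 * d1, o2 + t2 * d2   # closest points on the two axes
    return (p1 + p2) / 2.0                # estimated focus position
```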
In a possible implementation manner of the embodiment of the present application, the function of the first focus position determination unit may be implemented by a focus detection system, and an example in which the imaging receiver is an eye is used for illustration below:
As shown in the figure, the focus detection system 500 includes an image collection apparatus 510, an imaging apparatus 520, and an image processing apparatus 530.
The image collection apparatus 510 is used to collect an image presented on an eyeground.
The imaging apparatus 520 is used to adjust an imaging parameter of the optical path between the eye and the image collection apparatus 510 to enable the image collection apparatus 510 to obtain a clearest image.
The image processing apparatus 530 is used to process the image obtained by the image collection apparatus 510, so as to obtain an optical parameter of the eye when the image collection apparatus obtains the clearest image.
The system 500 performs analysis processing on the image on the eyeground of the eye to obtain an optical parameter of the eye when the image collection apparatus obtains the clearest image, so as to calculate the current focus position of the eye, which provides a basis for further implementing an adaptive operation of the eye.
Here, the image presented on the “eyeground” is mainly an image presented on a retina, which may be an image of the eyeground itself, or may be an image of another object projected on the eyeground. Here, the eyes may be a human's eyes, or may also be another animal's eyes.
In a possible implementation manner of the embodiment of the present application, the imaging apparatus 520 includes: an adjustable lens unit 521, located on an optical path between the eye and the image collection apparatus 510, having an adjustable focal length and/or an adjustable position in the optical path. By means of the adjustable lens unit 521, the system equivalent focal length between the eye and the image collection apparatus 510 becomes adjustable, and with the adjustment of the adjustable lens unit 521, the image collection apparatus 510 can obtain a clearest image on the eyeground at a position or state of the adjustable lens unit 521. In this implementation manner, the adjustable lens unit 521 is adjusted continuously in real time in the detection process.
Preferably, in a possible implementation manner of the embodiment of the present application, the adjustable lens unit 521 is a focal-length adjustable lens, used to adjust a refractive index and/or a shape thereof to accomplish the adjustment of the focal length thereof. Specifically: 1) The focal length is adjusted through adjusting the curvature of at least one surface of the focal-length adjustable lens; for example, the curvature of the focal-length adjustable lens is adjusted by increasing or reducing a liquid medium in a cavity formed by double transparent layers. 2) The focal length is adjusted through changing the refractive index of the focal-length adjustable lens; for example, a specific liquid crystal medium is filled in the focal-length adjustable lens, and the arrangement manner of the liquid crystal medium is adjusted through adjusting the voltage of a corresponding electrode of the liquid crystal medium, so as to change the refractive index of the focal-length adjustable lens.
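As a hedged aside (a standard thin-lens relation, not quoted from the text above), the lensmaker's equation shows why both adjustment routes change the focal length f, where n is the refractive index of the lens material and R1, R2 are the radii of curvature of its two surfaces:

$$\frac{1}{f}=(n-1)\left(\frac{1}{R_1}-\frac{1}{R_2}\right)$$

Reshaping the surfaces (changing R1 or R2, for example by pumping liquid into or out of the double-layer cavity) or changing n (for example by reorienting a liquid crystal medium with an electrode voltage) therefore both adjust f.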
In another possible implementation manner of the embodiment of the present application, the adjustable lens unit 521 includes: a lens group, used to adjust relative positions of lenses in the lens group to accomplish the adjustment of the focal length of the lens group.
In addition to the foregoing two manners of changing an optical path parameter of a system through adjusting the characteristic of the adjustable lens unit 521, the optical path parameter of the system can further be changed through adjusting the position of the adjustable lens unit 521 on the optical path.
Preferably, in a possible implementation manner of the embodiment of the present application, so as not to affect the user's viewing of the observed object and to allow the system to be applied portably on a wearable device, the imaging apparatus 520 further includes: a splitter apparatus 522, used to form optical transfer paths between the eye and the observed object and between the eye and the image collection apparatus 510. In this way, the optical path can be folded, which decreases the volume of the system while minimizing the impact on the user's other visual experience.
Preferably, in this implementation manner, the splitter apparatus includes: a first splitter unit, located between the eye and the observed object, and used to transmit light from the observed object to the eye, and transfer light from the eye to the image collection apparatus.
The first splitter unit may be a beamsplitter, a splitter light waveguide (including an optical fiber) or other suitable splitter devices.
In a possible implementation manner of the embodiment of the present application, the image processing apparatus 530 of the system includes an optical path calibration module, used to calibrate the optical path of the system, for example, calibrate the alignment of optical axes of optical paths, so as to ensure the precision of measurement.
In a possible implementation manner of the embodiment of the present application, the image processing apparatus 530 includes:
an image analysis module 531, used to analyze the image obtained by the image collection apparatus to find the clearest image; and
a parameter calculation module 532, used to calculate the optical parameter of the eye according to the clearest image and the imaging parameter of the system known when the clearest image is obtained.
In this implementation manner, by means of the imaging apparatus 520, the image collection apparatus 510 can obtain the clearest image. However, the image analysis module 531 needs to find the clearest image. At this time, the optical parameter of the eye can be calculated according to the clearest image and the known optical path parameter of the system. Here, the optical parameter of the eye may include the optical axis direction of the eye.
In a possible implementation manner of the embodiment of the present application, preferably, the system further includes: a projection apparatus 540, used to project a light spot to the eyeground. In a possible implementation manner, the function of the projection apparatus may be implemented by a miniature projector.
Here, the projected light spot may have no specific pattern and is only used to illuminate the eyeground.
In a preferable implementation manner of the embodiment of the present application, the projected light spot includes a pattern rich in features. The rich features of the pattern can facilitate detection and increase the precision of detection.
To prevent normal watching of eyes from being affected, preferably, the light spot is an infrared light spot invisible to eyes.
At this time, to reduce interference from other spectra:
A transmission filter for light invisible to eyes may be disposed on an exit surface of the projection apparatus.
A transmission filter for light invisible to eyes may be disposed on an incident surface of the image collection apparatus.
Preferably, in a possible implementation manner of the embodiment of the present application, the image processing apparatus 530 further includes:
a projection control module 534, used to control the brightness of a projected light spot of the projection apparatus according to the result obtained by the image analysis module.
For example, the projection control module 534 can adaptively adjust the brightness according to the characteristics of the image obtained by the image collection apparatus 510. Here, the characteristics of the image include the contrast of image features, texture features, and the like.
Here, a special case of controlling the brightness of the projected light spot of the projection apparatus is to turn on or off the projection apparatus. For example, when a user continuously stares at a point, the projection apparatus may be turned off periodically. When the user's eyeground is bright enough, the light emitting source may be turned off and only the eyeground information is used to detect the distance between the current sightline focus of the eye and the eye.
In addition, the projection control module 534 can further control the brightness of the projected light spot of the projection apparatus according to ambient light.
Preferably, in a possible implementation manner of the embodiment of the present application, the image processing apparatus 530 further includes: an image calibration module 533, used to calibrate an image on an eyeground, so as to obtain at least one reference image corresponding to an image presented on an eyeground.
The image analysis module 531 can compare the image obtained by the image collection apparatus 510 with the reference image and perform calculation, so as to obtain the clearest image. Here, the clearest image may be the obtained image having the minimum difference from the reference image. In this implementation manner, the difference between the currently obtained image and the reference image can be calculated through an existing image processing algorithm, for example, by using a classic automatic focusing algorithm based on a phase difference value.
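A minimal sketch of the “minimum difference from the reference image” criterion applied by the image analysis module 531 is shown below; the sum-of-squared-differences metric and the names are assumptions standing in for whichever classic focusing algorithm is actually used.

```python
import numpy as np

def clearest_image(captured_images, reference_image):
    """Return the index and the captured image whose pixel-wise difference
    from the calibrated reference image is smallest."""
    ref = reference_image.astype(float)
    scores = [np.mean((img.astype(float) - ref) ** 2) for img in captured_images]
    best = int(np.argmin(scores))         # smallest difference = clearest image
    return best, captured_images[best]
```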
Preferably, in a possible implementation manner of the embodiment of the present application, the parameter calculation module 532 includes:
an eye optical axis direction determination unit 5321, used to obtain the optical axis direction of the eye according to the feature of the eye when the clearest image is obtained.
Here, the feature of the eye may be acquired from the clearest image, or may also be acquired in other manners. The optical axis direction of the eye represents the stared direction of the eye's sightline.
Preferably, in a possible implementation manner of the embodiment of the present application, the eye optical axis direction determination unit 5321 includes: a first determination subunit, used to obtain the optical axis direction of the eye according to the feature of the eyeground when the clearest image is obtained. Compared with obtaining the optical axis direction of an eye through features of a pupil and an eyeball surface, determining the optical axis direction of an eye through the feature of the eyeground has higher precision.
When a light spot pattern is projected to the eyeground, the size of the light spot pattern may be greater than an eyeground visible region or smaller than an eyeground visible region, in which:
when the area of the light spot pattern is smaller than or equal to that of the eyeground visible region, the optical axis direction of the eye can be determined by detecting the position of the light spot pattern on the image relative to the eyeground by using a classic feature point matching algorithm (for example, a Scale Invariant Feature Transform (SIFT) algorithm), as sketched after this list; and
when the area of the light spot pattern is greater than or equal to that of the eyeground visible region, the direction of the user's sightline can be determined by determining the optical axis direction of the eye through the obtained position of the light spot pattern on the image relative to the original light spot pattern (obtained by the image calibration module).
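The sketch below illustrates, under stated assumptions, the feature-point-matching step mentioned in the first case above: it uses OpenCV's SIFT implementation to estimate the offset of the projected light spot pattern within the captured eyeground image. The grayscale-input requirement, the ratio-test threshold, and the names are illustrative, and averaging the matched-keypoint displacements is only one simple way to turn matches into a position.

```python
import cv2
import numpy as np

def spot_pattern_offset(eyeground_img, spot_pattern):
    """Estimate where the spot pattern lies in the eyeground image (both given
    as grayscale uint8 arrays) by matching SIFT features and averaging the
    displacement of the matched keypoints."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(spot_pattern, None)
    kp2, des2 = sift.detectAndCompute(eyeground_img, None)
    if des1 is None or des2 is None:
        return None                                   # not enough features to match
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
    good = [p[0] for p in matches
            if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]  # ratio test
    if not good:
        return None
    offsets = [np.subtract(kp2[m.trainIdx].pt, kp1[m.queryIdx].pt) for m in good]
    return np.mean(offsets, axis=0)                   # average (dx, dy) offset
```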
In another possible implementation manner of the embodiment of the present application, the eye optical axis direction determination unit 5321 includes: a second determination subunit, used to obtain the optical axis direction of the eye according to the feature of the pupil of the eye when the clearest image is obtained. Here, the feature of the pupil of the eye may be acquired from the clearest image, or may be acquired in other manners. Obtaining the optical axis direction of the eye through the feature of the pupil of the eye belongs to the prior art and is not described here again.
Preferably, in a possible implementation manner of the embodiment of the present application, the image processing apparatus 530 further includes: an eye optical axis direction calibration module 535, used to calibrate the optical axis direction of the eye, so as to determine the optical axis direction of the eye more precisely.
In this implementation manner, the known imaging parameter of the system includes a fixed imaging parameter and a real-time imaging parameter, where the real-time imaging parameter is the parameter information of the adjustable lens unit when the clearest image is acquired, and the parameter information may be recorded in real time when the clearest image is acquired.
After the current optical parameter of the eye is obtained, the distance from the focus of the eye to the eye can be calculated from formulas (1) and (2) below, where do and de are the distances from the current observed object 5010 of the eye and from the real image 5020 on the retina to the eye equivalent lens 5030, respectively, fe is the equivalent focal length of the eye equivalent lens 5030, and X is the sightline direction of the eye (which can be obtained from the optical axis direction of the eye); dp is the optical equivalent distance from the light spot 5040 to the adjustable lens unit 521, di is the optical equivalent distance from the adjustable lens unit 521 to the eye equivalent lens 5030, and fp is the focal length value of the adjustable lens unit 521. The distance do from the current observed object 5010 (the focus of the eye) to the eye equivalent lens 5030 can be obtained from formulas (1) and (2), as shown by formula (3).
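The formulas referenced above as (1), (2), and (3) are not reproduced in the text. The following is a hedged reconstruction based only on the variable definitions given above, assuming thin-lens (Gaussian) imaging and assuming that the clearest image is captured when the light spot 5040, imaged through the adjustable lens unit 521, is conjugate with the retina; it is a sketch of the relations, not the exact published expressions.

$$\frac{1}{d_o}+\frac{1}{d_e}=\frac{1}{f_e}\qquad(1)$$

$$\frac{1}{d_p}+\frac{1}{d_i-d_o}=\frac{1}{f_p}\qquad(2)$$

Solving (2) for the object distance (with (1) available to recover $f_e$ or $d_e$ if they are needed) gives

$$d_o=d_i+\frac{d_p\,f_p}{f_p-d_p}\qquad(3)$$

Under these assumptions, the focus of the eye lies at the distance $d_o$ from the eye equivalent lens 5030 along the sightline direction X.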
According to the calculated distance from the observed object 5010 to the eye, and also the optical axis direction of the eye that can be obtained from the record above, the focus position of the eye can be easily obtained, which provides a basis for subsequent further interactions related to the eye.
The subminiature camera 610 has the same effect as the image collection apparatus recorded in the foregoing implementation manner.
The first beamsplitter 620 has the same effect as the first splitter unit recorded in the foregoing implementation manner.
The focal-length adjustable lens 630 has the same effect as the focal-length adjustable lens recorded in the foregoing implementation manner.
In this implementation manner, the image processing apparatus is not shown in the figure.
The brightness of an eyeground is generally insufficient, and therefore illumination of the eyeground is recommended. In this implementation manner, a light emitting source 640 illuminates the eyeground. In order not to affect the user experience, the light emitting source 640 preferably emits light invisible to the eye, preferably near-infrared light, which does not affect the eye 200 much and to which the camera 610 is relatively sensitive.
In this implementation manner, the light emitting source 640 is located at the outer side of the glasses frame on the right side, and therefore a second beamsplitter 650 together with the first beamsplitter 620 is needed to accomplish the transfer of the light emitted by the light emitting source 640 to the eyeground. In this implementation manner, the second beamsplitter 650 is also located in front of the incident surface of the camera 610, and therefore further needs to transmit the light from the eyeground to the camera 610.
As can be seen, in this implementation manner, to improve user experience and increase the collection clarity of the camera 610, the first beamsplitter 620 preferably has the characteristics of a high reflectivity for infrared light and a high transmittance for visible light. For example, an infrared reflective film may be disposed on the side of the first beamsplitter 620 facing the eye 200 to achieve these characteristics.
In other implementation manners of the embodiment of the present application, the eye focus detection system 600 may be located on the side of the lens of the glasses 400 near the eye 200; in this case, the optical characteristic parameters of the lens need to be obtained in advance, and the influence of the lens needs to be considered when the focus distance is calculated.
The light emitted by the light emitting source is reflected by the second beamsplitter 650, projected by the focal-length adjustable lens 630, and reflected by the first beamsplitter 620, then enters the user's eye through the lens of the glasses 400, and eventually reaches the retina of the eyeground. The camera 610 photographs the image of the eyeground through the pupil of the eye 200 via the optical path formed by the first beamsplitter 620, the focal-length adjustable lens 630, and the second beamsplitter 650.
Here, the curved-surface beamsplitter 750 transfers an image presented on the eyeground to the image collection apparatus, corresponding to the respective positions of the pupil when the optical axis direction of the eye differs. In this manner, the camera may capture images mixed and superimposed from different angles of the eyeball; however, because only the part of the eyeground seen through the pupil can be clearly imaged on the camera, the other parts are defocused and cannot be imaged clearly, and therefore do not severely interfere with the imaging of the eyeground part, so the features of the eyeground part can still be detected. Therefore, compared with the implementation manner described above, this implementation manner can obtain a good image of the eyeground even when the eye gazes in different directions.
As shown in the figure, in a possible implementation manner of the embodiment of the present application, an information processing module 800 includes:
a processor 810, a communications interface 820, a memory 830, and a communications bus 840.
Communication among the processor 810, the communications interface 820, and the memory 830 is accomplished through the communications bus 840.
The communications interface 820 is used to perform network element communications.
The processor 810 is used to execute a program 832, and specifically can execute the functions corresponding to the information processing module.
Specifically, the program 832 may include a program code, and the program code includes a computer operation instruction.
The processor 810 may be a central processing unit (CPU), or an application specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiment of the present application.
The memory 830 is used to store the program 832. The memory 830 may contain a high-speed random access memory (RAM), and may further include a non-volatile memory, for example, at least one disk memory. The program 832 can specifically enable the information processing module 800 to execute the following steps:
learning corresponding imaging parameters when the imaging receiver acquires expected images of objects at a plurality of distances, and obtaining refractive examination information corresponding to the imaging receiver;
determining a focus position of the imaging receiver according to an optical parameter of the imaging receiver; and
calculating an imaging parameter of an imaging lens module according to the focus position of the imaging receiver and the refractive examination information corresponding to the imaging receiver.
For the specific implementation of the steps in the program 832, reference may be made to the corresponding description of the corresponding steps and units in the embodiments of the present application, which is not elaborated here again. A person skilled in the art will clearly understand that, for convenience and simplicity of description, for the specific working process of the devices and modules described above, reference may be made to the description of the corresponding process in the foregoing method embodiments, which is not elaborated here again.
As shown in the figure, a possible implementation manner of the embodiment of the present application provides an imaging method, which includes:
S110: Detect a focus position of an imaging receiver, and determine an imaging parameter of an imaging lens module according to the focus position, where the imaging lens module is located between the imaging receiver and an observed object and has an adjustable imaging parameter.
S120: Adjust the imaging lens module according to the determined imaging parameter.
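A minimal end-to-end sketch of steps S110 and S120 is given below; it assumes the helper functions sketched in the apparatus embodiments above (the refractivity lookup and the parameter smoother), together with a hypothetical focus-detection callable and a hypothetical lens-driver callable, all of whose names are illustrative.

```python
def imaging_step(detect_focus_distance, measure_movement_speed,
                 set_lens_refractivity, smoother):
    """One control iteration: S110 determines the imaging parameter from the
    detected focus position, S120 applies it to the imaging lens module."""
    distance_m = detect_focus_distance()              # S110: focus position of the receiver
    target = optimal_refractivity(distance_m)         # S110: parameter from refraction info
    target = smoother.update(target, measure_movement_speed())  # optional smoothing/fallback
    set_lens_refractivity(target)                     # S120: adjust the imaging lens module
```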
Preferably, in a possible implementation manner of the embodiment of the present application, the imaging parameter of the imaging lens module includes: a shape and/or a refractive index of the imaging lens module.
Preferably, in a possible implementation manner of the embodiment of the present application, before the step of determining an imaging parameter of an imaging lens module according to the focus position, the method includes:
learning the corresponding imaging parameters when the imaging receiver acquires expected images of objects at a plurality of distances, respectively, and obtaining refractive examination information corresponding to the imaging receiver.
Preferably, in a possible implementation manner of the embodiment of the present application, Step S110 includes:
determining the focus position of the imaging receiver according to an optical parameter of the imaging receiver; and
calculating the imaging parameter of the imaging lens module according to the focus position of the imaging receiver and the refractive examination information corresponding to the imaging receiver.
Preferably, in a possible implementation manner of the embodiment of the present application, the method further includes:
determining the imaging parameter of the imaging lens module according to movement gesture information of the imaging lens module.
Preferably, in a possible implementation manner of the embodiment of the present application, the step of determining the imaging parameter of the imaging lens module according to movement gesture information of the imaging lens module includes:
predicting the imaging parameter of the imaging lens module corresponding to a next moment according to relative movement gesture information of the imaging lens module and an object and the imaging parameter of the imaging lens module at a current moment.
Preferably, in a possible implementation manner of the embodiment of the present application, the step of determining the imaging parameter of the imaging lens module according to movement gesture information of the imaging lens module includes:
when a movement speed of the imaging lens module exceeds a set threshold value, adjusting the imaging parameter of the imaging lens module to a set common imaging parameter.
Preferably, in a possible implementation manner of the embodiment of the present application, the step of determining the imaging parameter of the imaging lens module according to movement gesture information of the imaging lens module includes:
when a movement speed of the imaging lens module exceeds a set threshold value, adjusting the imaging parameter of the imaging lens module to an imaging parameter value of the imaging lens module at a previous moment.
Preferably, in a possible implementation manner of the embodiment of the present application, before the step of determining the imaging parameter of the imaging lens module according to movement gesture information of the imaging lens module, the method further includes:
acquiring the movement gesture information of the imaging lens module.
Preferably, in a possible implementation manner of the embodiment of the present application, the movement gesture information of an imaging apparatus includes: the relative movement gesture information of the imaging lens module and the object and/or movement speed information of the imaging apparatus.
Preferably, in a possible implementation manner of the embodiment of the present application, the method further includes:
performing smoothing processing of time on a current imaging parameter of the imaging lens module according to history information of the imaging parameter of the imaging lens module.
Preferably, in a possible implementation manner of the embodiment of the present application, the imaging receiver is a user's eyes.
Preferably, in a possible implementation manner of the embodiment of the present application, the imaging apparatus is glasses.
Preferably, in a possible implementation manner of the embodiment of the present application, the step of determining the focus position of the imaging receiver according to an optical parameter of the imaging receiver includes:
obtaining the focus position of the imaging receiver according to an imaging parameter of an optical path between an image collection device and an imaging receiver when a clearest image presented on an imaging plane of the imaging receiver is collected.
Preferably, in a possible implementation manner of the embodiment of the present application, the step of determining the focus position of the imaging receiver according to an optical parameter of the imaging receiver includes:
collecting the image presented on the imaging plane of the imaging receiver;
adjusting the imaging parameter of the optical path between the imaging receiver and the image collection device to collect the clearest image; and
processing the collected image, and calculating the focus position of the imaging receiver according to the imaging parameter of the optical path between the image collection device and the imaging receiver and the optical parameter of the imaging receiver when the clearest image is collected.
Preferably, in a possible implementation manner of the embodiment of the present application, the optical parameter of the imaging receiver includes an optical axis direction of the imaging receiver.
Preferably, in a possible implementation manner of the embodiment of the present application, the method includes: when the imaging receiver is an eye, transferring an image presented on the eyeground to the image collection device corresponding to the respective positions of the pupil when the optical axis direction of the eye differs.
Preferably, in a possible implementation manner of the embodiment of the present application, before the step of collecting the image presented on the imaging plane of the imaging receiver, the method further includes: projecting a light spot to the imaging plane of the imaging receiver.
Preferably, in another possible implementation manner of the embodiment of the present application, the step of determining the focus position of the imaging receiver according to an optical parameter of the imaging receiver includes:
tracking an optical axis direction of the imaging receiver, then obtaining a scenario depth of the position of the observed object, and calculating the focus position of the imaging receiver.
Preferably, in yet another possible implementation manner of the embodiment of the present application, the method corresponds to at least two correlated imaging receivers, and the step of determining the focus position of the imaging receiver according to an optical parameter of the imaging receiver includes:
tracking optical axis directions of the at least two imaging receivers, and obtaining the focus position of the imaging receiver through an intersection of the optical axis directions of the at least two imaging receivers.
The specific implementation manner of the foregoing steps may be implemented according to the corresponding description of the apparatus embodiment, which is no longer described here.
A person skilled in the art may understand that in the method of the specific implementation manner of the present application, the sequence numbers of the steps do not imply a specific execution sequence; the execution sequence of the steps should be determined by their functions and internal logic, and the sequence numbers should not constitute any limitation on the implementation process of the specific implementation manner of the present application.
By means of the method of the present application, a user can see a real scene with the clearest possible effect.
Persons of ordinary skill in the art may further appreciate that, in combination with the examples described in the embodiments herein, units and algorithm steps may be implemented by electronic hardware or a combination of computer software and electronic hardware. Whether these functions are performed using hardware or software depends on particular applications and design constraints of the technical solutions. A person skilled in the art may use different methods to implement the described functions for each specific application. However, such implementation should not be considered as beyond the scope of the present application.
If implemented in the form of software functional units and sold or used as an independent product, the functions may also be stored in a computer readable storage medium. Based on this, the above technical solution or the part that makes contributions to the prior art can be substantially embodied in the form of a software product. The computer software product may be stored in a storage medium and contain several instructions to instruct computer equipment (for example, a personal computer, a server, or network equipment) to perform all or a part of the steps of the method described in the embodiments of the present application. The storage medium may be any medium that is capable of storing program codes, such as a universal serial bus (USB) flash drive, a removable hard disk, a read-only memory (ROM), a RAM, a magnetic disk or an optical disk.
The above implementation manners are merely provided for describing the present invention, and are not intended to limit the present invention. It should be understood by persons of ordinary skill in the art that various changes and variations can be made without departing from the spirit and scope of the present invention as defined by the claims.