The present invention relates, in general, to a system for parallax correction. More specifically, the present invention relates to a system and method for dynamically correcting parallax in a head mounted display (HMD), which is placed directly in front of a user's eye.
Vision aid devices which are worn on the head are typically located directly in front of the aided eye or eyes. As these systems migrate from direct view optical paths to digital camera aids, the system configuration requires that a head mounted display (HMD) be placed directly in front of the user's aided eye, with one inch of eye relief. This placement of the HMD prevents the co-location of the camera aperture directly in front of the aided eye. The camera aperture must be moved either in front of the HMD or to one side of the HMD.
If, for example, the digital camera is placed 100 mm to the side of the optical axis of the aided eye, then a displacement is created between the aperture of the digital camera and the image display of the digital camera, which is typically centered about the optical axis of the aided eye. This displacement creates a disparity between the apparent positions of objects viewed through the camera, and the actual positions of the objects seen in object space (or real space). This offset between perceived space and object space is referred to as parallax.
In the case of a user viewing an object through a head mounted video device, parallax reduces the usefulness of the video system. The human psycho-visual system is unconsciously attuned to perceiving the world through its natural entrance aperture, which is the pupil of the human eye. The hand-to-eye coordination inherent in manual tasks is based on this innate property. Normal human movement tasks, such as walking and running, depend on this subconscious process. A fixed system, which is aligned to remove parallax at some fixed distance, is misaligned at all other distances. This is especially true when the video system is aligned to remove parallax of an object at far range and the user attempts to locate another object at close range, such as tool 12 on
As will be explained, the present invention addresses the parallax problem by providing a system for dynamically realigning the video image so that the image coincides with the real world at all distances.
To meet this and other needs, and in view of its purposes, the present invention provides a dynamically corrected parallax system including a head borne video source for imaging an object and providing video data. A controller is included for electronically offsetting the video data provided from the head borne video source to form offset video data. A display device receives the offset video data and displays the offset video data to a user's eye. The display device is configured for placement directly in front of the user's eye as a vision aid, and the head borne video source is configured for displacement to a side of the user's eye. The offset video data corrects parallax due to displacement between the display device and the head borne video source.
The display device includes an X,Y array of respective columns and rows of pixels, and the offset video data includes an offset of a number of columns of pixels in the X direction of the X,Y array. The offset video data, alternatively, may include an offset of a number of rows of pixels in the Y direction of the X,Y array. The offset video data may also include an offset of a number of columns of pixels in the X direction of the X,Y array and another offset of a number of rows of pixels in the Y direction of the X,Y array.
Geometrically, the optical axis of the user's eye extends a distance of D to an object imaged by the video source, and an optical axis of the aperture of the video source extends in a direction parallel to the optical axis of the user's eye. The displacement to a side is a horizontal displacement distance of d in a Frankfort plane between the optical axis of the user's eye and the optical axis of the aperture of the video source. The offset video data is based on the horizontal displacement distance d and the distance D to the object.
Furthermore, a horizontal offset angle θD is formed, as follows:
θD = tan⁻¹(d/D),
where d is a horizontal displacement distance between the optical axis of the user's eye and the optical axis of the aperture of the video source.
The display device includes an X,Y array of respective columns and rows of pixels, and the offset video data includes the following horizontal offset:
offsetcolumns = (#Columns/FOVhorz) × θD
where offsetcolumns is the amount of horizontal offset in columns, FOVhorz is the horizontal field-of-view of the video source, and #Columns is the total number of columns of the display device.
Further yet, a vertical offset angle φD may also be formed, where
φD = tan⁻¹(d′/D),
where d′ is a vertical displacement distance between the optical axis of the user's eye and the optical axis of the aperture of the video source. The offset video data includes the following vertical offset:
offsetrows = (#Rows/FOVvert) × φD
where offsetrows is the amount of vertical offset in rows, FOVvert is the vertical field-of-view of the video source, and #Rows is the total number of rows in the display device.
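The horizontal and vertical offset computations above can be sketched together in a few lines. This is a minimal illustration, not the patent's implementation; the function name and argument order are ours (angles are handled in degrees because the fields-of-view are given in degrees):

```python
import math

def parallax_offsets(d, d_prime, D, fov_h_deg, fov_v_deg, n_cols, n_rows):
    """Compute horizontal and vertical pixel offsets for parallax correction.

    d, d_prime -- horizontal and vertical displacement between the eye's
                  optical axis and the camera aperture's optical axis
    D          -- distance along the optical axis to the imaged object
    (all three in the same length units)
    """
    theta_d = math.degrees(math.atan2(d, D))        # horizontal offset angle
    phi_d = math.degrees(math.atan2(d_prime, D))    # vertical offset angle
    offset_cols = round(n_cols / fov_h_deg * theta_d)
    offset_rows = round(n_rows / fov_v_deg * phi_d)
    return offset_cols, offset_rows
```

For a camera displaced 100 mm horizontally, an object at 1 m, a 40 degree by 30 degree field-of-view, and a 1280 by 1024 display, this yields an offset of 183 columns and 0 rows.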
The dynamically corrected parallax system includes a display electronics module disposed between the video source and the display device for converting the video data from the video source into digital video data. The display electronics module is configured to receive an offset command from the controller and modify the digital video data into the offset video data. The display electronics module and the controller may be integrated in a single unit. A focus position encoder may be coupled to the controller for determining a distance D to an object imaged by the video source, where the distance D is used to correct the parallax.
The display device may be a head mounted display (HMD), or part of a head mounted night vision goggle.
Another embodiment of the present invention includes a dynamically correcting parallax method for a head borne camera system having a video source and a display device, where the display device is configured for placement directly in front of a user's eye as a vision aid, and the video source is configured for displacement to a side of the user's eye. The method includes the steps of: (a) imaging an object, by the video source, to provide video data; (b) determining a focus distance to an object; (c) offsetting the video data to form offset video data based on the focus distance determined in step (b) and a displacement distance between the user's eye and an aperture of the video source; and (d) displaying the offset video data by the display device.
It is understood that the foregoing general description and the following detailed description are exemplary, but are not restrictive, of the invention.
The invention is best understood from the following detailed description when read in connection with the accompanying drawings. Included in the drawing are the following figures:
As will be explained, the present invention dynamically realigns the video image so that the image coincides with the real world at all distances. To do this, the present invention determines the range to the object of interest, so that dynamic alignment may be accomplished based on the determined range. In one embodiment, the invention uses an absolute position of the camera's focus mechanism (or angular orientation of a manual focus knob) to determine the distance to the user's object-of-interest and then applies an appropriate amount of parallax correction to the image shown on the user's display. In this manner, the apparent location of an object-of-interest is correctly perceived at its true position in object space.
In one embodiment of the invention, the video is provided to the user on a digital display device, such as an LCD or LED display. These displays consist of an array of rows and columns of pixels. By controlling the timing of the video data sent to the display, the present invention induces an offset in the image as the image is displayed to the user. By shifting the image in display space, the present invention removes the disparity between the apparent position of an object and its actual position in object space.
A consequence of shifting the image on the display is lost rows and/or columns of pixels in the direction of the image shift. Rows and/or columns of pixels on the opposite edges of the display show arbitrary intensity values, because (assuming a one-to-one relationship in pixel resolution between the camera and the display) these pixels are no longer within the field-of-view of the camera and, therefore, do not provide image data. Thus, shifting the image introduces a reduction in the effective user's field-of-view, because of the reduced usable image size. This negative effect may be minimized, however, by setting the camera pointing angle for convergence at a distance much closer than the far field.
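The shift-and-fill behavior described above — image data moved by the offset, with the vacated rows and columns carrying no image data — can be sketched in plain Python, treating a frame as a list of pixel rows (the function name and the fill convention are illustrative, not from the patent):

```python
def shift_image(frame, off_cols, off_rows, fill=0):
    """Shift a frame right by off_cols and down by off_rows pixels.

    Rows/columns vacated by the shift fall outside the camera's
    field-of-view and are set to `fill`, since they carry no image data.
    Negative offsets shift left/up.
    """
    h, w = len(frame), len(frame[0])

    def shift_row(row, n):
        if n >= 0:
            return [fill] * n + row[: w - n]
        return row[-n:] + [fill] * (-n)

    # Horizontal shift within each row.
    rows = [shift_row(list(r), off_cols) for r in frame]

    # Vertical shift across rows.
    def blank():
        return [fill] * w
    if off_rows >= 0:
        return [blank() for _ in range(off_rows)] + rows[: h - off_rows]
    return rows[-off_rows:] + [blank() for _ in range(-off_rows)]
```

Shifting a 3×3 frame one column to the right, for instance, drops the rightmost column and leaves the leftmost column blank — the field-of-view reduction the paragraph above describes.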
Referring next to
It will be appreciated that video source 23 may be any camera device configured to be placed to one side of the optical axis of a user's eye. In the embodiment shown in
As another embodiment, focus knob 26 may be controlled by a motor (not shown) to allow for a zoom lens operation of video source 23. In this embodiment, focus position encoder 21 may determine the focal length to an object-of-interest by including a zoom lens barrel. A focal length detecting circuit may be included to detect and output the focal length of the zoom lens barrel. As a further embodiment, video source 23 may include a range finder, such as an infrared range finder, which may focus an infrared beam onto a target and receive a reflected infrared beam from the target. A position sensitive device included in focus position encoder 21 may detect the displacement of the reflected beam and provide an encoded signal of the range, or position of the target.
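One way a focus position encoder might map an encoder reading to the distance D is a calibration table with interpolation between measured points. The sketch below assumes such a table; the calibration values are purely hypothetical placeholders, not taken from this document:

```python
import bisect

# Hypothetical calibration pairs: (encoder counts, focus distance D in meters).
# A real system would measure these against the camera's focus mechanism.
CALIBRATION = [(0, 0.5), (200, 1.0), (400, 2.0), (600, 5.0), (800, 100.0)]

def distance_from_encoder(counts):
    """Linearly interpolate the focus distance D from an encoder reading,
    clamping to the ends of the calibration table."""
    positions = [p for p, _ in CALIBRATION]
    i = bisect.bisect_right(positions, counts)
    if i == 0:
        return CALIBRATION[0][1]
    if i == len(CALIBRATION):
        return CALIBRATION[-1][1]
    (p0, d0), (p1, d1) = CALIBRATION[i - 1], CALIBRATION[i]
    return d0 + (d1 - d0) * (counts - p0) / (p1 - p0)
```

The resulting D feeds directly into the offset equations, so each focus adjustment by the user updates the parallax correction.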
The microcontroller may be any type of controller having a processor execution capability provided by a software program stored in a medium, or a hardwired program provided by an integrated circuit. The manner in which microcontroller 22 computes the X,Y offset control signals is described next.
Referring to
The user is aided in the viewing of object 31 by way of display device 25. As shown in
Using
The horizontal offset angle θD is given by equation (1) as follows
θD = tan⁻¹(d/D) (Eq. 1)
The correction factor Chorz (for a 40 degree FOV and a 1280 pixel horizontal display resolution) is given by equation 2, in units of columns per degree, as follows
Chorz = #columns/FOVhorz = 1280 columns/40 degrees = 32 columns per degree (Eq. 2)
Here, #columns is the total number of columns in the digital display, or 1280 columns (in this example). The image shift on the display device, or the amount of offset-in-columns, is given by equation 3 below, where θD is the horizontal offset angle between the camera's line of sight 36 and the camera's optical axis 37.
offsetcolumns = Chorz × θD (Eq. 3)
In a similar manner, using
The vertical offset angle φD is given by equation (4) as follows
φD = tan⁻¹(d′/D) (Eq. 4)
The correction factor Cvert (for a 30 degree vertical FOV and a 1024 pixel vertical display resolution) is given by equation 5, in units of rows per degree, as follows
Cvert = #rows/FOVvert = 1024 rows/30 degrees ≈ 34.1 rows per degree (Eq. 5)
Here, #rows is the total number of rows in the digital display, or 1024 rows (in this example). The image shift on the display device, or the amount of offset-in-rows, is given by equation 6 below, where φD is the vertical offset angle between the camera's line of sight 36 and the camera's optical axis 37.
offsetrows = Cvert × φD (Eq. 6)
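Using the document's example parameters (40 degree horizontal FOV across 1280 columns, 30 degree vertical FOV across 1024 rows) and an assumed geometry of d = 100 mm, d′ = 0, and D = 1 m (the distance is ours, chosen for illustration), equations 1 through 6 work out as follows:

```python
import math

# Example parameters from the text: 40 deg horizontal FOV on a 1280-column
# display, 30 deg vertical FOV on a 1024-row display.
C_horz = 1280 / 40            # Eq. 2: 32 columns per degree
C_vert = 1024 / 30            # Eq. 5: ~34.1 rows per degree

# Assumed geometry (illustrative only): camera displaced d = 100 mm to the
# side, no vertical displacement, object of interest at D = 1 m.
d, d_prime, D = 0.100, 0.0, 1.0

theta_D = math.degrees(math.atan(d / D))      # Eq. 1: ~5.71 degrees
phi_D = math.degrees(math.atan(d_prime / D))  # Eq. 4: 0 degrees
offset_columns = C_horz * theta_D             # Eq. 3: ~183 columns
offset_rows = C_vert * phi_D                  # Eq. 6: 0 rows
```

At this close range the required shift is roughly 183 columns, about 14% of the display width, which illustrates why a fixed far-field alignment fails badly for near objects.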
Referring next to
The plot shown in
A similar plot to the plot shown in
Lastly,
The embodiments described above may be used by any head borne camera system, including a head mounted night vision goggle and a head mounted reality mediator device.
Although the invention is illustrated and described herein with reference to specific embodiments, the invention is not intended to be limited to the details shown. Rather, various modifications may be made in the details within the scope and range of equivalents of the claims and without departing from the invention.
Publication: US 2008/0084472 A1, Apr. 2008, US.