This invention relates to “head tracking,” “camera pose,” or “line of sight” determination in head mounted or hand held display systems used for virtual reality or augmented reality applications.
Many systems exist today in which a head mounted display contains means to calculate the position or “pose” of the display as it moves through 3D space and renders images based on what would be seen from that position or pose. One such means takes the form of a camera mounted on the frame of a head mounted display, the camera able to look out along the user's line of sight and return images of objects along that path. This technique often uses a predetermined object or “marker” to act as a fiducial indicator; images received at the camera may be analyzed against reference shape data to calculate the camera position necessary to match the received image, such as taught by Neely in U.S. Pat. No. 7,127,082 and Ellsworth in US application 2014/0340424.
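The geometric principle behind such marker analysis can be illustrated with the pinhole-camera model: the apparent size of a marker of known physical size determines its distance from the camera. The marker width, focal length, and pixel measurements below are hypothetical values chosen for illustration, not parameters from the referenced patents.

```python
# Minimal sketch of marker-based range estimation under the pinhole model.
# All numeric values are illustrative assumptions.

def distance_to_marker(marker_width_m, focal_length_px, apparent_width_px):
    """Pinhole model: apparent_width_px = focal_length_px * width / distance,
    so distance = focal_length_px * width / apparent_width_px."""
    return focal_length_px * marker_width_m / apparent_width_px

# A 0.10 m wide marker imaged at 200 px by a camera whose focal length
# corresponds to 800 px lies about 0.40 m from the camera.
d = distance_to_marker(0.10, 800.0, 200.0)
```

A full pose solution would recover orientation as well, by fitting several marker feature points against the reference shape data; this sketch shows only the range component.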
The camera and marker system is limited by the resolution of the camera and the time it takes to process the images it returns. This limitation sets up a trade-off between how fast motion can be tracked and how accurately position can be measured.
In a head mounted virtual reality or augmented reality system, a fast but lower resolution second camera is added and used to quickly find an area of a visual field returned by a slower but higher resolution first camera, where that area is likely to contain the image of a marker for a head tracking system or a hand held device.
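The sub-window idea can be sketched as follows: the fast camera locates the approximate marker position, and those coordinates are scaled up to define a small search window in the high-resolution frame, so only that window need be processed at full resolution. The resolutions, window size, and the assumption that both cameras share a field of view are illustrative, not taken from the embodiments.

```python
import numpy as np

# Sketch: find the brightest region in a fast low-resolution frame and map
# it to a search window in the high-resolution frame. Resolutions below are
# hypothetical; the two cameras are assumed to share the same field of view.

FAST_RES = (120, 160)     # fast camera: rows, cols
HIRES_RES = (1080, 1440)  # hi-res camera: rows, cols

def marker_window(fast_frame, window=96):
    """Return (top, left, bottom, right) of a hi-res search window."""
    r, c = np.unravel_index(np.argmax(fast_frame), fast_frame.shape)
    # Scale low-res coordinates up to hi-res coordinates.
    rr = int(r * HIRES_RES[0] / FAST_RES[0])
    cc = int(c * HIRES_RES[1] / FAST_RES[1])
    half = window // 2
    return (max(rr - half, 0), max(cc - half, 0),
            min(rr + half, HIRES_RES[0]), min(cc + half, HIRES_RES[1]))

frame = np.zeros(FAST_RES)
frame[40, 80] = 1.0  # simulated marker blob in the fast frame
win = marker_window(frame)
```

Restricting the expensive marker analysis to this window is what lets the slow, high-resolution camera keep up with rapid head motion.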
FIG. 1.—Prior Art—A head mounted display with marker tracking camera.
FIG. 2.—A head mounted display with multiple marker tracking cameras.
FIG. 3.—A “marker” pattern used as a fiducial indicator.
FIG. 4.—Image returned by “fast” sensor.
FIG. 5.—Image returned by “hi-res” sensor.
The prior art is shown in FIG. 1.
As a further advantage of the two camera system, the frames produced quickly by camera 203 can be used to infer motion between the times of arrival of high resolution frames from camera 103. The most common head motions are panning from side to side and tilting between upper and lower views. In these motions the indistinct image received quickly by camera 203 shifts laterally for panning and vertically for tilting. A close approximation of the intermediate frames that camera 103 would have produced during these times can be inferred from the overall movement seen on camera 203, and from that inference new display frames can be generated in the head mounted display, or hand held device, to give the user the impression of faster tracking ability.
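One simple way to infer a panning shift from two consecutive fast-camera frames is to cross-correlate their column-intensity profiles; the lag of the correlation peak gives the lateral displacement in pixels, which could then be applied to the last high-resolution frame to synthesize intermediate display frames. The frame contents and method below are an illustrative assumption, not a technique specified by the embodiments.

```python
import numpy as np

# Sketch: estimate the lateral (panning) shift between two fast-camera
# frames by cross-correlating their column-intensity profiles.

def estimate_pan_shift(prev, curr):
    """Return the horizontal shift, in pixels, of curr relative to prev."""
    a = prev.sum(axis=0)
    a = a - a.mean()          # remove the DC level before correlating
    b = curr.sum(axis=0)
    b = b - b.mean()
    corr = np.correlate(b, a, mode="full")
    return int(np.argmax(corr)) - (len(a) - 1)

prev = np.zeros((60, 80))
prev[:, 30] = 1.0                     # a bright vertical feature
curr = np.roll(prev, 5, axis=1)       # simulate a 5-pixel pan to the right
shift = estimate_pan_shift(prev, curr)
```

The same profile correlation applied along rows would recover the vertical shift for tilting motions.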
A two camera embodiment has been presented, but those skilled in the art will understand that image sensors in cameras can be made to have characteristics that can be modified programmatically during operation. In such an embodiment, a single physical camera would be switched from fast-scan/low-resolution mode to slower high resolution mode as it gathers frames. This embodiment achieves much of the operation of the simultaneous action of two independent physical cameras.
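The single-sensor embodiment amounts to an interleaving schedule: several fast low-resolution scans followed by one slow high-resolution scan, repeated. The mode names and the 4:1 ratio below are illustrative assumptions, not values from the specification.

```python
# Sketch: frame schedule for one reconfigurable sensor that alternates
# between fast low-resolution scans and occasional high-resolution scans,
# approximating the two-camera arrangement. The 4:1 ratio is hypothetical.

FAST_PER_HIRES = 4  # fast frames captured for each hi-res frame

def frame_schedule(n_frames):
    """Return the capture mode for each of n_frames successive frames."""
    modes = []
    for i in range(n_frames):
        if i % (FAST_PER_HIRES + 1) == FAST_PER_HIRES:
            modes.append("hires")
        else:
            modes.append("fast")
    return modes

sched = frame_schedule(10)
```

The fast frames in such a schedule serve the same sub-window and motion-inference roles as camera 203 in the two-camera embodiment.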
In contrast to a single added camera 203 embodiment, it is also possible for multiple fast cameras to be tasked to cover tiled or overlapping fields of view. In such an embodiment, the plurality of cameras simulates a higher resolution camera at the same fast sampling rate. For some applications, an array of fast low-resolution cameras, each returning images of a small part of a larger image field, may together perform the entire image processing task.
The invention should not be construed as limited to application in head mounted displays; it has general applicability in any device that requires information specifying position and orientation, or pose. An example of such an embodiment would be game controllers that are held in the hands of users and moved in gesture arcs to communicate control information or manipulate virtual objects.
A further benefit of the camera 203 addition is that, by synchronizing to active LED fiducial light emitters, the fast camera 203 can record an image while the LEDs are in the off phase of their duty cycle, thereby recording a background image of any false targets that may be present. The false target image can then be subtracted from an image taken in the active part of the fiducial duty cycle, reducing the contrast of interfering light sources relative to the desired fiducial images. Although the embodiments shown rely on markers comprising active emitters, those of ordinary skill in the art will understand that the invention may be practiced with passive reflecting or fluorescing markers, as taught in applications 62/012,911 and 62/165,089, and that the contrast of images of said markers may also be enhanced by differencing returned frames having differing marker illumination.
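The frame-differencing step can be sketched directly: subtracting the LED-off background frame from the LED-on frame suppresses light sources that appear in both, leaving the fiducial. The frame sizes and pixel values below are simulated, not taken from any embodiment.

```python
import numpy as np

# Sketch: suppress interfering light sources by differencing a frame taken
# with the fiducial LEDs on against one taken during their off phase.
# Frames here are small simulated 8-bit images.

def fiducial_contrast(frame_led_on, frame_led_off):
    """Return the on-minus-off difference, clipped to valid pixel range."""
    diff = frame_led_on.astype(np.int32) - frame_led_off.astype(np.int32)
    return np.clip(diff, 0, 255).astype(np.uint8)

background = np.full((4, 4), 50, dtype=np.uint8)
background[1, 1] = 200            # a false target, e.g. a room lamp
led_on = background.copy()
led_on[2, 2] = 250                # the fiducial LED, lit only in this frame
result = fiducial_contrast(led_on, background)
```

In the difference image the lamp cancels to zero while the fiducial stands out, which is the contrast enhancement the differencing is intended to provide.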
An illustrative embodiment has been described by way of example herein. Those skilled in the art will understand, however, that changes and modifications may be made to this embodiment without departing from the true scope and spirit of the elements, products, and methods to which the embodiment is directed, which is defined by our claims.
The present application claims the benefit of provisional patent application No. 62/009,797, filed on Jun. 9, 2014, entitled “MULTIPLE SENSOR TRACKING SYSTEM WITH SUB WINDOW CONTROL” by Jeri J. Ellsworth and Ken Clements; provisional patent application No. 62/012,911, filed on Jun. 16, 2014, entitled “FIDUCIAL ACTIVATION BY STIMULATED FLUORESCENCE” by Ken Clements; and provisional patent application No. 62/165,089, filed on May 21, 2015, entitled “RETROREFLECTIVE FIDUCIAL SURFACE” by Jeri J. Ellsworth et al., the entire contents of which are fully incorporated by reference herein.

U.S. Pat. No. 7,120,875 B2, 10/2006, Daily et al.
U.S. Pat. No. 7,127,082 B2, 10/2006, Neely
U.S. Pat. No. 7,996,097 B2, DiBernardo et al.
U.S. Pat. No. 8,031,227 B2, 10/2011, Neal et al.
U.S. Pat. No. 8,077,914 B1, 12/2011, Kaplan
U.S. Pat. No. 8,224,024 B2, 7/2012, Foxlin et al.
U.S. Pat. No. 8,696,458 B2, 4/2014, Foxlin et al.
U.S. Pat. No. 8,724,848 B1, 5/2014, Heath et al.
US 2012/0320216 A1, 12/2012, Mkrtchyan et al.
US 2014/0340424 A1, 11/2014, Ellsworth
K. Dorfmüller, “Robust tracking for augmented reality using retroreflective markers.” Computers & Graphics 23.6 (1999): 795-800.
J. P. Rolland, L. Davis and Y. Baillot, “A survey of tracking technology for virtual environments.” Fundamentals of Wearable Computers and Augmented Reality 1 (2001): 67-112.
E. Foxlin and G. Welch, “Motion Tracking: No Silver Bullet, but a Respectable Arsenal.” IEEE Computer Graphics and Applications (2002).
F. Ababsa and M. Mallem, “A robust circular fiducial detection technique and real-time 3D camera tracking.” Journal of Multimedia 3.4 (2008): 34-41.
Number | Date | Country
---|---|---
62/009,797 | Jun. 9, 2014 | US
62/012,911 | Jun. 16, 2014 | US
62/165,089 | May 21, 2015 | US