The subject matter of this disclosure is illustrated, by way of example and not by limitation, in the accompanying drawings, in which like reference numerals indicate similar elements and in which:
FIGS. 1A′-E′ show representative, expected performance curves associated with respective lens designs of
FIGS. 5A′-B′ show representative, expected performance curves associated with respective adjustment of interpupillary distance in the example embodiment of
In this detailed description, certain terminology will be employed so as to disclose the subject matter of this disclosure. It is intended that any and all terminology includes all equivalents. Moreover, unless the disclosure states explicitly otherwise: (i) “includes” means “includes, without limitation”; (ii) terms used in singular form also disclose the plural form, and vice versa; (iii) lists of items (e.g., preceded by “an example”, “examples” and like constructs) are intended to disclose open lists, where other examples/items may be added to, implicated by, or otherwise considered to be within the list; and (iv) the term “and/or” may be used in connection with a list of items, in which case it is to be understood that the list discloses “any one of the listed items alone, all of the listed items together, or any combination of two or more of the listed items”.
As used herein, “example embodiment”, “one or more embodiments”, “an embodiment” and/or similar phrases or formulations mean that the components, parts, structures, arrangements, relationships, steps, operations, actions, characteristics, features, functions, numbers, ranges, systems, configurations and/or other details described in connection with that so-referenced embodiment are included in at least one embodiment of the subject matter of the disclosure. Furthermore, one or more, or various combinations, of components, parts, structures, arrangements, relationships, steps, operations, actions, characteristics, features, functions, numbers, ranges, systems, configurations and/or other details may be present in one or more embodiments other than that so-referenced embodiment, and/or may be combined in any suitable manner with one or more embodiments other than that so-referenced embodiment. Moreover, various appearances of any one such phrase or formulation are not necessarily referring to the same embodiment.
As used herein, “image” refers to a representation, in any form, of any object, in whole or in part, which representation is being reproduced, or is capable of being reproduced. As an example, a reference to “image” may include an optical image that has been optically acquired from the world (e.g., via telescopes, microscopes, binoculars, monocular devices, spectacles or the like). As other examples, any reference to “image” may include an image electronically acquired from the world, in various form(s), including: a single still image; a group of still images; a video clip; a group of video clips; any part of any of these and/or any combination of these. As other examples, any reference to “image” may include an image electronically generated, whether or not acquired, in whole or in part, from the world, in various form(s), including: a single still image; a group of still images; a video clip; a group of video clips; any part of any of these and/or any combination of these. As yet another example, any reference to “image” may include a combination of any of the above examples. Generally, “image” may refer to a representation, in any form, of any object, in whole or in part, that is or may be reproduced so as to enable a user's ultimate perception thereof, e.g., via use of radiation (e.g., visible light) and/or optical components. Any reference to “image” includes both the singular and plural forms, unless otherwise provided by the context.
In the context of an image viewer (as further described hereinafter), an “image” generally is available via the image viewer for viewing by a user. When an “image” is acquired optically, the “image” may be made available to the image viewer via an optical component (e.g., a lens, a mirror, and/or a combination of optical components). When an “image” is acquired or generated electronically, the “image” typically may be made available via a “display device”.
As used herein, “computing device” refers generally to hardware, firmware and/or software resources that execute instructions and/or perform fixed operations (e.g., responsive to events, interrupts or other stimuli), such as to provide an output, task or other result. A computing device may be variously implemented; example implementations may include a microprocessor, a microcontroller, a digital signal processor, a combinational/sequential logic device, a gate array, a computing core or cell, a programmable logic device, and/or a system-on-chip device. A computing device typically includes and/or is coupled to additional components, including, as examples, a memory system, a communication capability, a networking capability and/or input/output resources. Embodiments herein that implicate a computing device may be implemented by embedding the computing device within a greater system. Examples of such systems contemplate that a computing device may be integrated in a chip, module and/or other functional block, possibly along with other functional blocks. In any such system, particular configurations, partitions, groupings and/or other arrangements among blocks are (i) generally neither required nor contemplated and (ii) all generally within the spirit and scope of the subject matter of this disclosure.
When reference may be made herein to “computing device” as an “image source” (as described below), the computing device may be understood as storing, processing, recovering, generating, reproducing, providing and/or otherwise contributing as to an image, in whole or in part, including an image that can be, or is being, reproduced for viewing by the user. Examples of “computing devices” providing such functionality include: cell phones; portable media players (e.g., the IPOD® of Apple Computer, Inc., Cupertino, Calif.); personal computers (e.g., desktop computers, laptop computers, other computers of any form factor); personal digital assistants (e.g., Palm® devices of Palm, Inc., Sunnyvale, Calif.); storage devices (e.g., LifeDrive™ devices of Palm, Inc., Sunnyvale, Calif.); and/or game stations (e.g., PSP™ and PlayStation® of Sony Computer Entertainment America, San Mateo, Calif.; Game Boy® of Nintendo of America, Inc., Redmond, Wash.; Xbox®, Microsoft Corporation, Redmond, Wash.).
As used herein, “camera” refers to any of various technologies that acquire, or contribute to the acquisition of, all or part of an image. References to “camera” include, as an example, a component, apparatus and/or system that employs any of various detectors and/or image acquisition devices, e.g., so as to enable an image in the form of an electronic signal (whether the signal is digital or analog). References to “camera” include, as examples, devices that acquire an image at a selected wavelength, at selected wavelengths, in a selected range of wavelengths, and/or in selected ranges of wavelengths (e.g., acquiring images in visible wavelengths and/or infrared wavelengths). References to “camera” include, as examples: endoscopes and other similar, relatively non-invasive, imaging instruments (with or without related computing device) used in surgical procedures; professional and consumer still and video cameras employing charge-coupled devices (CCDs), CMOS imaging chips or other electronic imaging technology (e.g., tube, integrated circuit or otherwise); and/or military, police and emergency cameras employed for pilots, ground troops, police, firemen, rescuers and the like, generally for specialized image viewing (e.g., imaging a target, navigating a plane, spotting hidden combatants or suspects, and/or finding, identifying or assessing victims).
As used herein, “optical source” refers to any of various technologies that source an image, in whole or in part, by optically acquiring the image from the world. Examples of an optical source include: telescopes, microscopes, binoculars, monocular devices, and/or spectacles.
As used herein, “image source” refers to any of various technologies that, in whole or in part, source an image. Examples of an image source include: an optical source, a camera and/or a computing device. Examples of an image source also include devices that, directly or indirectly, receive an image, in whole or in part, so as to make the image available to an image viewer (as “image viewer” is described in the disclosure that follows). Image sources may receive images variously, including from, e.g., an optical source, a camera and/or a computing device. Image sources may receive images using various couplings, including, e.g., analog and/or digital technologies; optical, electronic and/or opto-electronic technologies; wired and/or wireless technologies; serial and/or parallel technologies; and/or standards-based and/or proprietary technologies.
As used herein, “display device” refers to any of various technologies that reproduce an image. As an example, a display device reproduces an image for a user's ultimate perception thereof. As examples, the display device may be implemented using emissive, transmissive, and/or reflective displays. As further examples, the display device may be implemented using: a liquid crystal display (LCD), whether reflective, transflective, or transmissive; a liquid crystal on silicon (LCOS) component; a digital micro-mirror device (DMD); and/or other imaging device(s) based on Micro-Electro-Mechanical Systems (MEMS) technology. As yet other examples, the display device may be implemented using light emitting diodes, such as, but not limited to, organic light emitting diodes (OLEDs) (in which examples, the diodes may be provided in a two- and/or three-dimensional array). As still another example, the display device may be implemented using an electrophoretic display.
In an example embodiment, a display device may have integrated therein, be specified for combination with, and/or otherwise use driver/interface electronics to drive, control and/or otherwise display the image via the display device. Among other implementations, a display device may be implemented separately from its driver/interface electronics, being coupled thereto using various technologies, including, e.g.: analog and/or digital technologies; optical, electronic and/or opto-electronic technologies; wired and/or wireless technologies; serial and/or parallel technologies; and/or standards-based and/or proprietary technologies.
In an example embodiment, a display device may have integrated therein, be specified for combination with, and/or otherwise use an illumination source. As examples, an illumination source may be used with any of an LCD, an LCOS component, a DMD, MEMS technology and/or an electrophoretic display.
An illumination source may be variously implemented. As examples, an illumination source may be implemented using various light source technologies, including: an incandescent filament; an arc source; one or more laser diodes; and/or one or more LEDs. Generally, as is understood in the art, the selection and/or implementation of the illumination source should be based on, account for, resolve among, and/or otherwise be consistent with relevant factors, including, as examples, the display device (e.g., one or more of technology, dimensions, disposition and/or features of the display device) and/or the application in which the display device is being implemented (e.g., characteristics of the image space, such as f-number relating to the display device and an image plane and/or the color saturation being sought). To illustrate, in an example embodiment wherein a display device supports color imaging via integral color filtering, the illumination source may comprise (i) one or more LEDs, which LEDs may provide “white” light for filtering by the display device or (ii) one or more RGB-LED triads (i.e., wherein each such triad has three LEDs: one emitting red light, one emitting green light and one emitting blue light) so as to further enhance the color saturation of the integral color filtering. To illustrate further, in an example embodiment wherein a display device omits integral color filtering (i.e., but where color imaging is being supported), the illumination source may comprise one or more RGB-LED triads. It is to be understood that, although the description of the illumination source has emphasized various light source technologies, an illumination source may comprise components in addition to any particular light source technology/technologies (e.g., optical components such as one or more lenses, condensers, filters, and/or diffusers).
As to an example embodiment having a display device that uses an illumination source, it should be understood that the illumination source is provided, even if not explicitly described. It should also be understood that the illumination source may be provided either integral with, or separately from, the display device, including being provided in or by a component, module and/or system with which the display device is intended to be used. It should also be understood that the illumination source may be provided as (i) a separate, detachable or replaceable component of an HMD in which the display device is integrated or otherwise used and/or (ii) an integrated component of an HMD with which a display device is or can be associated.
It is also to be understood that a display device and/or an illumination source (if any) may be provided either integrated with or separately from an image viewer, and/or other components, modules, and/or systems associated with the image viewer (as “image viewer” is described in the disclosure that follows). In an illustrative example, a display device and/or an illumination source (if any) may be provided for removable coupling (e.g., on an optical, electronic, electrical and/or mechanical basis) with an image viewer. In another illustrative example, a display device and/or an illumination source (if any) may be provided via permanent coupling (e.g., on an optical, electronic, electrical and/or mechanical basis) with the image viewer. It is understood that a person of ordinary skill in the art may determine any such arrangement based on various relevant factors, including, as an example, the tolerances for optical coupling applicable in any particular application.
It is also to be understood that a display device and/or an illumination source (if any) may be provided either integrated with or separately from an image source. In an illustrative example, a display device and/or an illumination source (if any) may be provided for removable coupling (e.g., on an optical, electronic, electrical and/or mechanical basis) with an image source. In another illustrative example, a display device and/or an illumination source (if any) may be provided via permanent coupling (e.g., on an optical, electronic, electrical and/or mechanical basis) with the image source. It is understood that a person of ordinary skill in the art may determine any such arrangement based on various relevant factors, including, as an example, the tolerances for optical coupling applicable in any particular application.
In an example embodiment, the display device is a microdisplay device. An example of a light emissive microdisplay device is the SVGA+ of eMagin Corporation of Hopewell Junction, N.Y. An example of an electroluminescent microdisplay device is the MicroBrite AMEL640.480, of Planar America, Inc. of Beaverton, Oreg. An example of a diffusely backlit transmissive microdisplay device is the KCD-KDCF-AA of Kopin Corporation of Taunton, Mass. An example of an SVGA reflective LCOS display is Z86D-3 of Brillian Corporation, Tempe, Ariz.
Each of
FIGS. 1A′-E′ show representative, expected performance curves 4 associated with respective lens designs of
The balloon of
The balloon of
Similarly, in the diffractive/Fresnel structure 8, the Fresnel structure has Fresnel steps 6′. In this embodiment, the Fresnel steps 6′ have variable pitch and variable depth. It is understood, however, that the Fresnel structure can have either or both of pitch and depth variable (e.g., either pitch or depth may be constant, while the other is variable). It is also expected that constant depth tends to keep a design relatively closely constrained to the base profile.
The Fresnel structure—via Fresnel steps—should be modeled so as to be optically equivalent to or substantially equivalent to a second profile (represented by curve 7). That is, the Fresnel structure should represent a collapsed second (sag) profile. It is understood that the Fresnel pitch should be substantially greater than the diffractive pitch so as not to degrade the diffractive efficiency (e.g., about 20 or more diffractive steps in each Fresnel facet).
In the balloon of
While a lens design consistent with
Turning to
The balloon of
Similarly, in the diffractive/Fresnel structure 10, the Fresnel structure has Fresnel steps 6′. In this embodiment, the Fresnel steps 6′ have variable pitch and/or variable depth. The Fresnel structure should be modeled to be optically equivalent to or substantially equivalent to a second profile (represented by curve 7). That is, the Fresnel structure should represent a collapsed second (sag) profile. In an example embodiment, the Fresnel structure generally should be implemented to support approximately 20 or more diffractive steps per Fresnel zone. In an example embodiment, the Fresnel steps can vary. In an example embodiment having varying Fresnel steps, the steps should vary while adhering to the base profile sufficiently to preclude localized defocus.
In the balloon of
In an example embodiment, the Fresnel structure may be modeled/implemented to include 20 or more diffractive zones within each Fresnel zone (e.g., so as not to diminish the diffraction efficiency and while maintaining a high-quality Fresnel structure), wherein a diffractive zone is the optical surface intermediate to the diffractive steps and a Fresnel zone is the optical surface intermediate to the Fresnel steps. In an example embodiment, the radial width of a Fresnel zone may be determined by the localized sag as set forth by the spheric/aspheric equations and the depth of the step (e.g., the sag over the width of the zone equals the depth of the step).
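The zone-width relationship described above (i.e., the sag accumulated over the width of a Fresnel zone equals the depth of the step) can be sketched numerically. In the sketch below, the sag coefficients, step depth, and diffractive pitch are purely illustrative assumptions, not values from the disclosed designs; the sketch marches outward along a base aspheric profile, starting a new Fresnel zone each time one step depth of sag accumulates, and then checks the "20 or more diffractive zones per Fresnel zone" guideline.

```python
import math

# Hypothetical even-asphere sag profile (all coefficients are illustrative,
# not taken from the disclosed designs): sag in mm as a function of radius r.
C = 1.0 / 25.0   # base curvature (1/mm), illustrative
K = -1.0         # conic constant, illustrative
A4 = 1.0e-5      # 4th-order aspheric coefficient, illustrative

def sag(r):
    """Standard conic + 4th-order aspheric sag equation."""
    return C * r**2 / (1.0 + math.sqrt(1.0 - (1.0 + K) * C**2 * r**2)) + A4 * r**4

def fresnel_zone_edges(step_depth, r_max, dr=1e-4):
    """March outward; start a new Fresnel zone each time the local sag
    (measured from the last step) accumulates to one step depth, i.e.
    the sag over the width of the zone equals the depth of the step."""
    edges = [0.0]
    base = 0.0
    r = 0.0
    while r < r_max:
        r += dr
        if sag(r) - base >= step_depth:
            edges.append(r)
            base = sag(r)
    return edges

edges = fresnel_zone_edges(step_depth=0.05, r_max=10.0)
widths = [b - a for a, b in zip(edges, edges[1:])]

# Zones narrow with radius because sag grows faster than linearly.
assert all(w2 < w1 for w1, w2 in zip(widths, widths[1:]))

# Check the '20 or more diffractive zones per Fresnel zone' guideline for a
# hypothetical (constant) diffractive pitch:
diffractive_pitch = 0.002  # mm, illustrative
ok = all(w / diffractive_pitch >= 20 for w in widths)
print(f"{len(widths)} Fresnel zones; narrowest = {min(widths):.4f} mm; "
      f"20-steps-per-zone guideline satisfied: {ok}")
```

Because the base profile's sag grows faster than linearly with radius, the Fresnel zones narrow toward the lens margin, so the 20-steps-per-zone check is most binding at the outermost zone.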
The lens design of
As well, this lens design may be useful in field applications, including those applications wherein diffractive and/or Fresnel surfaces may be incompatible with a working environment. For example, in an example embodiment wherein its smooth aspheric surface is exposed to the work environment (i.e., disposed toward the exit pupil), this lens design will have no exposed diffraction or Fresnel structures, which structures are understood to be relatively delicate and/or readily contaminated, e.g., by particulates, dust, moisture, oils and/or other environmental contaminants.
Referring to FIGS. 1A′-E′, representative, expected MTF curves 4 are shown, respectively, for lens designs consistent with
Turning to
It is to be understood that a lens design may have certain performance characteristics that may be less than ideal. In such case, the performance shortfall may be variously addressed. To illustrate, the performance characteristics may be adjusted using various optical techniques, including, as examples, by using various films, dyes, and/or other treatments in association with one or more elements, and/or surface(s) thereof. It is noted that use of dyes to address variation in relative illumination across an aspheric lens is disclosed in Hebert, U.S. Pat. No. 6,972,735, Patent Date Dec. 6, 2005, which disclosures, as well as all other disclosures thereof, are hereby incorporated by reference, as if set forth herein in their entirety, for all purposes. Also to illustrate, the performance characteristics may be addressed by combining lens structures. It is noted that combining lens structures is disclosed in Hebert, U.S. Pat. No. 6,008,939, Patent Date Dec. 28, 1999, which disclosures, as well as all other disclosures thereof, are hereby incorporated by reference, as if set forth herein in their entirety, for all purposes.
As to
Turning to
In
In
In both
The display unit 14 may be variously implemented, including (i) as shown in
In example embodiments, the illumination source 12 may also be variously implemented. Generally, the illumination source 12 should be implemented consistent with the display device 102 and the application. As such, with some display devices, no illumination source 12 may be employed.
In example embodiments, the illumination lens 13 and/or the illumination mirror 22 may be variously implemented. Generally, these components are implemented, if provided at all, based on, responsive to and/or toward resolving among, various factors, including, as examples: (a) the application; (b) the type of display unit 14; (c) the relative dimensions of the illumination source 12 and/or the display unit 14; and/or (d) the available dimensions for proper illumination of the display unit 14.
As shown, illumination lens 13 may be modeled and/or implemented consistent with the lens design of
It is also understood that the image made available to the image viewer generally will relate to the image source associated with the image viewer, which, in turn, may depend on the image viewer's application. As an example, if an image viewer is comprised in an HMD, the image source may be a camera, in which case the image may be made available by a display device. However, as another example, if an image viewer is comprised in a modular HMD consistent with the subject matter of this disclosure, the image source may be user-selectable (e.g., a telescope, a camera, and/or a computing device) and, depending on the selected image source at any given time, the image may be made available by an optical component and/or a display device.
In
In propagating through the display unit 14, the light from the illumination source 12 generally is modulated, pixel-by-pixel, according to image data (e.g., as to a still image or a video clip) provided to the display (e.g., by, from and/or in an image source, such as a camera or computing device).
The light is propagated from the display unit 14 to and through the first relay lens 15. In an example embodiment where the first relay lens 15 is employed in an image viewer comprised in an HMD having telecentric characteristics consistent with the subject matter of this disclosure, the first relay lens 15 may be designed such that each pixel cone of light is precisely collimated (i.e., collimated sufficiently precisely so as to enable adjustment of interpupillary distance (IPD) across a selected range). In such example embodiment (as will be described further hereinafter), precise collimation may enable variation in the distance between first relay lens 15 and second relay lens 17, while incurring minimal impact on image quality for the user (i.e., while maintaining optical performance within an acceptable range, which is illustrated in the MTF curves 4 of
In an example embodiment of such an HMD wherein the light from the illumination source 12 is not collimated, or not well collimated, by the illumination lens 13, first relay lens 15 may be designed and implemented so that sufficient light collimation is established in the space between the first relay lens 15 and the second relay lens 17 and, as such, telecentricity is established (e.g., so that IPD adjustment may be supported, as described below). In such case, it is understood that the light cones are collimated for plural portions of the image (e.g., individual pixels of the image provided via the display device 102), even though the cones generally may not be actually parallel to the optical axis of the image viewer 100 (i.e., cones associated with pixels not on and/or wholly aligned with the optical axis). That is, it is understood that a ray bundle from any single point on the plane of the display is substantially collimated, within itself, between the relay lenses, even though those rays are only parallel to the optical axis for the central point on the display.
As shown in
Conversely, in an HMD supporting telecentric characteristics, it is recognized that modeling any such optical component 15, 17, 20 using a lens design providing compromised performance (e.g., characterized by MTF curves 4 inferior to those of a lens design modeled consistent with
To illustrate, in an HMD having telecentric characteristics (e.g., supporting an adjustment range of interpupillary distance (IPD)), telecentric performance of the HMD tends to be degraded if the first relay lens 15, as modeled/implemented, introduces compromises relating to its optical coupling with the illumination lens 13, including, e.g., compromises that degrade light collimation.
Also to illustrate, in an HMD having telecentric characteristics (e.g., supporting an adjustment range of interpupillary distance (IPD)), telecentric performance of the HMD tends to be degraded if the second relay lens 17, as modeled/implemented, introduces compromises relating to its optical coupling with the first relay lens 15. Second relay lens 17 generally de-collimates light rays in forming an intermediate image (as that operation is further described herein). The light rays of the intermediate image nevertheless maintain the HMD's telecentric characteristics provided the spacing of lens 17 from the HMD's telecentric stop matches the focal length of lens 17. However, even where the image viewer of the HMD is configured other than with such telecentric spacing (e.g., with substantially, but not strictly, matching spacing or with otherwise unmatched spacing), lens 17 should be modeled/implemented so as to enable imaging within acceptable performance (e.g., supporting an adjustment range of IPD).
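The spacing condition just described (lens 17 disposed one focal length from the telecentric stop) can be illustrated with a simple paraxial (ABCD matrix) sketch; the focal length and field angles below are illustrative assumptions, not values from the disclosed design.

```python
# Paraxial (ABCD) sketch of the telecentric-spacing condition: if the stop
# sits one focal length in front of a thin lens, chief rays leave the lens
# parallel to the optical axis.

def thin_lens(f):
    """2x2 ray-transfer matrix of an ideal thin lens of focal length f."""
    return ((1.0, 0.0), (-1.0 / f, 1.0))

def propagate(d):
    """2x2 ray-transfer matrix of free-space propagation over distance d."""
    return ((1.0, d), (0.0, 1.0))

def apply(m, ray):
    (a, b), (c, d) = m
    y, u = ray  # height (mm), angle (radians, paraxial)
    return (a * y + b * u, c * y + d * u)

f_relay = 30.0  # mm, illustrative focal length for "lens 17"

# A chief ray crosses the stop on-axis (y = 0) at some field angle u.
for u in (0.02, 0.05, -0.03):
    ray = (0.0, u)
    ray = apply(propagate(f_relay), ray)   # stop-to-lens spacing = f
    ray = apply(thin_lens(f_relay), ray)
    y_out, u_out = ray
    # Exit angle is (numerically) zero: chief ray parallel to the axis.
    assert abs(u_out) < 1e-12, u_out
print("chief rays exit parallel to the axis when stop-to-lens spacing = f")
```

The same matrices also show why unmatched spacing degrades telecentricity: replacing the propagation distance f_relay with some d gives an exit angle u_out = u*(1 - d/f_relay), proportional to the spacing error.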
Having passed through first relay lens 15, the light is propagated to a nominal stop plane 16′. If there were no scattered light in the image viewer, the physical limits of the illumination source 12 would correspond to a functional telecentric stop. However, if scattered light is present, a true, physical stop may be provided in the vicinity of the true telecentric stop plane (e.g., toward improving contrast). This stop can be an aperture disposed “nominally” close to the telecentric stop position, or may be provided by the limited size of splitting mirror element 16.
At nominal stop plane 16′, an image is provided which is an approximate image of the illumination source 12. In an example embodiment, the nominal stop plane 16′ is provided by a splitting mirror 16. In an example embodiment wherein the image viewer is comprised in a hybrid HMD as disclosed herein, the splitting mirror may comprise two mirror elements, left splitting mirror element 16L and right splitting mirror element 16R (further described below in relation to
It is understood that various true or functional stops may be employed and/or exploited in an image viewer. Generally, a stop may be employed and/or exploited to constrain propagated light (e.g., stray light and/or light undesirably scattered by an optical element prior to the stop) and, in so doing, address performance degradation that might be associated with such light (e.g., contrast degradation that might arise if the scattered light were to pass). In an HMD having telecentric characteristics (e.g., supporting an adjustment range of IPD), any such stop may comprise a telecentric stop, i.e., supporting the telecentric characteristics of the HMD relating to an adjustment range of IPD. As an example, the display device 102—via the display unit 14, the illumination source 12 and/or the illumination mirror 22—may provide a functional stop (e.g., via the size and/or location of the display unit 14 and/or the irradiance (area and/or spatial intensity) of the illumination source 12). As another example, a true, physical stop may be disposed at or near splitting mirror 16 (which stop, as stated above, may be implemented using the size of the mirror alone or using the mirror together with selected, additional structure). As yet another example, a true, physical stop may be disposed between second relay lens 17 and intermediate image plane 18, which stop may be variously implemented (including, e.g., using shutter film or a filter). As another example, provided a stop advances its purpose (e.g., to at least some selected and/or sufficient degree based on applicable engineering criteria), one or more true, physical stops may be employed, with each (i) implemented as is practical for the image viewer (e.g., consistent with one or more design criteria, as in the examples above or otherwise) and/or (ii) disposed wherever it is mechanically practical to place it.
After nominal stop plane 16′, the light propagates on through second relay lens 17 to intermediate image plane 18. In an example embodiment, intermediate image plane 18 may be implemented so as to substantially replicate the display plane of display unit 14. In this example embodiment, the second relay lens is modeled and disposed relative to the intermediate image plane 18 so as to precisely focus an intermediate image on the intermediate image plane 18. In this example embodiment, if the optical components prior to plane 18 provide, e.g., 1:1 magnification, relatively low distortion and relatively high resolution, the intermediate image may be a substantially close replication of the image provided by the display device 102 (i.e., in
In an example embodiment, illumination lens 13, the first relay lens 15 and/or the second relay lens 17 may be identical or substantially identical. In such embodiment, it is expected that use of one lens for various optical components in the image viewer may have various advantages, including, as examples: reducing cost of the lenses and/or the image viewer; simplifying logistics in ordering and/or handling lenses; and/or enhancing manufacturing (e.g., efficiencies in assembling image viewers).
From intermediate image plane 18, light is propagated through field lens 19 and, then, on through eyepiece 20 to the exit pupil 21. Generally, field lens 19 may be modeled and/or implemented toward having selected eye relief between eyepiece 20 and the exit pupil 21. In an example embodiment, field lens 19 may be modeled and/or implemented so as to provide clearance sufficient for a user's spectacles.
As is known to persons of ordinary skill in the art, “eye relief” refers to the distance from the vertex of the eyepiece's last optical surface to the location along the eyepiece's optical axis at which the exit pupil is ideally located, e.g., where light rays from all points in the defined object field (i.e., display 14) occupy a common area. In example embodiments, the exit pupil has the same area as, or a larger area than, a user's pupil, e.g., so as to allow for non-critical positioning and comfortable viewing. As is also known to persons of ordinary skill in the art, “eyebox” refers to a three-dimensional image space (i.e., an image volume) formed by extending the exit pupil along the optical axis in either direction from the exit pupil's ideal position, and within which space the user can view the entire image. That is, an eyebox, generally, is an image volume (a) within which the user's pupil may move around while intercepting a complete image, e.g., intercepting all pixels of the display plane of display unit 14 and (b) outside of which the user's pupil will intercept an incomplete or partially dimmed image, e.g., intercepting less than all pixels of the display plane of display unit 14. Relative to the eyebox and to the exit pupil, several characteristics of image quality of the image viewer 100, 100′ may depend on the placement of the user's pupil, which quality characteristics include, as examples: resolution, distortion, and vignetting.
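A rough geometric sketch of the eyebox just described can be made by noting that, at an axial distance z from the ideal exit-pupil plane, the ray bundles from the extreme field points have walked apart laterally by roughly z·tan(θ), shrinking the region common to all bundles. The exit-pupil diameter, eye-pupil diameter, and half field angle below are illustrative assumptions, not values from the disclosed design.

```python
import math

# Simple unvignetted thin-pencil model of eyebox depth: the eye pupil must
# fit inside the area common to all field bundles, which shrinks by about
# 2*z*tan(theta) at axial distance z from the ideal exit-pupil plane.

exit_pupil_diam = 10.0   # mm, illustrative
eye_pupil_diam = 4.0     # mm, typical photopic eye pupil (assumed)
half_field_deg = 15.0    # illustrative half field angle at the exit pupil

def eyebox_half_depth(D_exit, d_eye, half_field_deg):
    """Axial distance from the ideal exit-pupil plane over which an eye
    pupil of diameter d_eye still intercepts the complete image."""
    t = math.tan(math.radians(half_field_deg))
    return (D_exit - d_eye) / (2.0 * t)

dz = eyebox_half_depth(exit_pupil_diam, eye_pupil_diam, half_field_deg)
print(f"eyebox extends roughly ±{dz:.1f} mm along the axis "
      f"(total depth ≈ {2*dz:.1f} mm)")
```

Under this model, a larger exit pupil or a smaller field angle lengthens the eyebox, consistent with the non-critical positioning sought above.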
In an example embodiment, an image viewer 100, 100′ may be modeled/implemented including, e.g., toward providing selected characteristics of eyeboxes 27, 27′ and/or exit pupils 21, 21′. To illustrate, in
In an example embodiment, diffuser 26 may be variously implemented, including, as an example, using a microlens array or a diffusing surface coating. In the various implementations, the diffuser 26 generally is modeled and/or implemented consistent with one or more factors, including, as examples: (a) the diffuser should be properly disposed (e.g., on, in, at or otherwise associated with intermediate image plane 18 on the rear surface of field lens 19); (b) the diffuser should expand a higher f-number cone of light into a lower f-number cone that fills out eyebox 27; and/or (c) the diffuser's structure should be invisible or substantially invisible to the user (i.e., relatively invisible, substantially invisible or, at least, not noticeably visible to a typical user, with respect to the image, such as the image from the display unit 14 shown in
Turning to
In the example embodiment of
As shown in
As shown in
Turning to
Turning to
In
In an example embodiment, adjustment as disclosed above may be provided for various purposes. Examples of such purposes include: (a) to enable the user to selectively align the optical axis of the subassembly 54 with the user's pupil (e.g., to aim or otherwise position the delivered image in the user's line of sight); (b) to enable the subassembly 54 to be removed from, or substantially removed from, the user's line of sight, or otherwise to be rendered unobtrusive or substantially unobtrusive to the user; and/or (c) to enhance appearance (e.g., to store the subassembly so that the image viewer is unobtrusive or substantially unobtrusive as to others). If the adjustment is implemented, the adjustment generally should be implemented so as to deliver its purpose, while responding to and/or toward resolving among applicable engineering criteria (e.g., if to enable user alignment, the implementation may account for physical parameters, such as the space implicated/available for adjustment, the implicated mechanicals, and/or either or both subassemblies' dimensions, exit pupil, eye relief, eyebox, and/or position relative to the user's pupil). (Hereinafter, adjustment of this form is sometimes referred to as “general adjustment”.)
To illustrate, in the example embodiments of
Although, in these Figures, general adjustment is illustrated by using a first position and a second position, it is to be understood that general adjustment may be implemented so as to include other positions. Examples of these other implementations include: (a) an adjustment range that sets either or both of these positions as the range's boundaries; (b) an adjustment range that sets neither of these positions as the range's boundaries; (c) an adjustment range that has either or both boundaries outside the range set by these positions; and/or (d) an adjustment range that has either or both boundaries inside the range set by these positions.
In an example embodiment, general adjustment may be implemented through rotation of the subassembly 54 relative to the central core 52. In such example embodiment, rotation of the subassembly 54 may be enabled so that, with such rotation, the relative dispositions, alignments, arrangement and/or interactions among selected optical components thereof are maintained or substantially maintained. As shown in the example embodiments of
In an example embodiment, rotation of the subassembly 54 relative to the central core 52 may be enabled, so that, with such rotation, the splitting mirror 16 of the central core 52 moves so as to maintain or substantially maintain its relative disposition, alignment, arrangement and/or interaction with the optical path of the subassembly 54. In the example embodiments shown in
In an example embodiment, rotation of the subassembly 54 relative to the central core 52 may be enabled so that, with such rotation, the central core's splitting mirror 16 and rotational mirror 24 move relative to one another so as to maintain, or substantially maintain, their optical interaction. Generally, by doing so, the image will be maintained, or substantially maintained, in the optical path of the subassembly 54. It is expected that failure to do so during general adjustment, without other corrections/adjustments, may tend to result in undesirable rotation of the image on the intermediate image plane 18 and, thus, a potentially undesirable perception of the image by the user.
In an example embodiment, general adjustment may be enabled by providing for rotation, as one, of selected optical components (e.g., from the splitting mirror 16 through eyepiece 20), about a rotational axis 58. In such an embodiment, the rotational mirror 24 may also be enabled to rotate about the rotational axis 58. In such an embodiment, for any selected amount of general adjustment, such selected optical components may be implemented so as to rotate about the rotational axis 58 through an angle that is greater than (e.g., a multiple of, such as twice) the angle that the rotational mirror 24 rotates about the rotational axis 58. As
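The "twice the angle" relationship noted above follows from the classical mirror-rotation result: tilting a flat mirror by an angle θ rotates the reflected beam by 2θ. The short sketch below (illustrative only; the ray direction, tilt angles, and function names are assumptions, not elements of the disclosure) verifies this with plain 2-D vector reflection.

```python
import math

def reflect(ray, normal):
    """Reflect 2-D direction `ray` off a surface with unit normal `normal`."""
    dot = ray[0] * normal[0] + ray[1] * normal[1]
    return (ray[0] - 2 * dot * normal[0], ray[1] - 2 * dot * normal[1])

def mirror_normal(tilt_deg):
    """Unit normal of a mirror whose normal is tilted `tilt_deg` degrees
    from the +x axis."""
    t = math.radians(tilt_deg)
    return (math.cos(t), math.sin(t))

incoming = (-1.0, 0.0)  # ray traveling in the -x direction

for tilt in (45.0, 50.0):  # rotate the mirror by 5 degrees
    rx, ry = reflect(incoming, mirror_normal(tilt))
    print(f"mirror normal at {tilt:4.1f} deg -> reflected ray heading "
          f"{math.degrees(math.atan2(ry, rx)):6.1f} deg")
# The reflected ray's heading changes by 10 degrees for a 5-degree mirror
# rotation: the beam turns through twice the mirror's angle, which is why
# downstream optics may need to rotate through twice the mirror's rotation.
```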
In an example embodiment, general adjustment may be enabled: (a) by providing for rotation, as one, of selected optical components (e.g., from the splitting mirror 16 through eyepiece 20), about a rotational axis 58 and/or (b) by providing for rotation of the rotational mirror 24 about a rotational axis 58′. In such an embodiment, for any selected amount of adjustment, selected optical components may be implemented so as to rotate about the rotational axis 58 through an angle that is the same as or different from (e.g., greater than, less than, or a multiple of, such as twice) the angle that the rotational mirror 24 rotates about the rotational axis 58′. Generally, in such embodiment, the rotational axis 58 should be relatively similar (e.g., in proximity, alignment and/or coincidence) to the rotational axis 58′. Sufficient similarity in such axes 58, 58′ may be variously implemented but, generally, should be implemented so that, e.g., the image is transmitted without rotation or other degradation. To illustrate, where such embodiment is implemented in a hybrid HMD application having telecentric characteristics, the axes 58, 58′ should be sufficiently similar so that, with rotation, the movement of optical components relative to the rotational mirror 24 minimizes or avoids the image (a) missing an aperture (in whole or in part) of any optical components (e.g., from the splitting mirror 16 through the eyepiece 20) and/or (b) being blocked (in whole or in part) by any stop (e.g., a telecentric stop).
In an example embodiment, general adjustment may be enabled by providing for: (a) rotation, as one, of selected optical components (e.g., from the second relay lens 17 through eyepiece 20), about a rotational axis 58; (b) rotation of the rotational mirror 24 about a rotational axis 58′; and/or (c) rotation or other movement of the splitting mirror 16. Such rotation or other movement of the splitting mirror 16 may be about a rotational axis 58″ (not shown), which axis 58″ generally is implemented as described above. Such rotation or other movement of the splitting mirror 16 may also be otherwise implemented (including as a combination of rotation about any one or more axes and/or with other movement). In any case, the implementation generally should be accomplished so as to minimize or avoid image degradation associated with general and/or IPD adjustment (as IPD adjustment is described further below).
As to the example embodiments illustrated in
Turning to
In
As to IPD adjustment, it is understood that the IPD adjustment range may be variously implemented, including, as an example, responsive to a range of IPDs expected or known in a selected population (e.g., populations reflecting heritage, gender, age, and/or residency in the United States and/or other jurisdictions). To illustrate, IPDs ranging from 49 mm to 65 mm are thought to cover a significant percentage of the adult population in the United States.
As to a selected IPD adjustment range, the adjustment is optically enabled, at least in part, through selection of the second relay lenses 17 and/or the field mirrors 25. Selection of lenses 17 and/or mirrors 25 includes, for example, proper size and/or proper arrangement, e.g., so as to adequately accommodate the convergence/divergence of light, respectively incident thereon, over the IPD adjustment range (which convergence/divergence is illustrated in
Further as to IPD adjustment, it is understood that IPD adjustment is physically enabled and/or supported using selected mechanical devices, articles and/or associated technologies (“IPD adjustment technology”). It is also understood that IPD adjustment technology is indicated by and within the reference to housing/mechanicals 60, 62, as shown in various Figures herein. It is understood that the design and implementation of IPD adjustment technology is well known to persons of ordinary skill in the art.
Turning to FIGS. 5A′, B′, MTF curves 4 are shown based on two adjustment positions for IPD as may be encountered in a hybrid HMD supporting telecentric characteristics consistent with the subject matter of this disclosure. The MTF curves 4 are representative curves presented so as to illustrate the expected performance of the HMD for such IPD positions. As is illustrated by such curves 4, IPD adjustment should be implemented so as to result in only minor (or negligible) differences in performance (e.g., resolution) across the adjustment range. Generally, such performance differences are expected to be supported by HMDs, image viewers and other systems consistent with the disclosures hereof.
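Although the curves of FIGS. 5A′-B′ are drawing content, the comparison they represent can be sketched numerically. The sketch below is illustrative only and is not derived from the disclosure: it models the line spread function (LSF) at each end of an IPD adjustment range as a Gaussian of slightly different assumed width, and uses the closed-form MTF of a Gaussian LSF, MTF(f) = exp(-2·pi²·sigma²·f²), to compare resolution at a few spatial frequencies.

```python
import math

def gaussian_mtf(sigma_mm: float, freq_cyc_per_mm: float) -> float:
    """MTF of a Gaussian LSF of std-dev sigma: exp(-2 * (pi*sigma*f)^2)."""
    return math.exp(-2 * (math.pi * sigma_mm * freq_cyc_per_mm) ** 2)

# Hypothetical LSF widths at the two ends of the IPD adjustment range;
# the small difference stands in for the two adjustment positions.
sigma_near, sigma_far = 0.010, 0.011  # mm, assumed values

for f in (10, 20, 30, 40):  # spatial frequency, cycles/mm
    m1 = gaussian_mtf(sigma_near, f)
    m2 = gaussian_mtf(sigma_far, f)
    print(f"{f:3d} cyc/mm  MTF: {m1:.3f} vs {m2:.3f}  "
          f"(delta {abs(m1 - m2):.3f})")
```

A small spread between the two curves at each frequency corresponds to the "only minor (or negligible) differences in performance" across the adjustment range noted above.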
Turning to
In an example embodiment, an HMD system 40 comprises a modular assembly 46. A modular assembly 46 may be variously implemented. As an example, a modular assembly 46 may be implemented so as to be comprised in an HMD assembly 44, together with an image viewer 100, 100′ and/or a display device 102 (or selected one or more components of such device 102). As an example, a modular assembly 46 may be implemented integrated in headgear 42 (e.g., for permanent attachment). As another example, a modular assembly 46 may be implemented so as to be detachable from, and re-attachable to, headgear 42 (e.g., at the user's selection). In any such example, a modular assembly 46 may be implemented either/both (i) within or substantially within any one or more of the various components of the headgear 42 and/or (ii) as an attachment, protrusion, or other addition to any one or more of the various components of the headgear 42. If the modular assembly 46 is comprised in an HMD assembly 44, the modular assembly 46 may be implemented so that the HMD assembly 44 is attached via the modular assembly 46 to the headgear 42 (e.g., in permanent attachment or otherwise).
As an example, a modular assembly 46 may be implemented to comprise: (a) one or more peripherals connectors 28; (b) driver/interface electronics 30; and/or (c) one or more stow units 32. It is understood that, in an example embodiment, any one or more of a peripherals connector 28, driver/interface electronics 30, and/or a stow unit 32 may be implemented as a component of an HMD system 40, but separately from the modular assembly 46.
A peripherals connector 28 may be variously implemented. In an example embodiment, a peripherals connector 28 is implemented to provide one or more selected connections between a selected peripheral (see
It is understood that more than one peripherals connectors 28 may be implemented in HMD system 40 in order to support a selected group or number of connections and/or peripherals. It is further understood that the numbers and/or types of peripherals connectors 28 generally should be implemented responsive to, and/or toward resolving among, applicable engineering criteria (e.g.: the field application for which the HMD system is being designed; the size, weight and form factor of the peripherals; the mounting area/space available and/or usable in relation to the headgear 42; and/or the support/stability provided via the headgear 42).
Driver/interface electronics 30 may be variously implemented. Driver/interface electronics 30 may be implemented to drive, enable, support or otherwise contribute to operation of, as examples: (a) one or more peripherals connectors 28; (b) an image viewer 100 (e.g., as to an integrated display unit 14, a user detection technology feature 90, and/or other features); (c) a display device 102; (d) one or more peripherals connected to respective peripherals connectors 28; and/or (e) any combination of these.
One or more stow units 32 may be variously implemented. In an example embodiment, stow units 32 are implemented responsive to the units' respective purposes. In an example embodiment, a stow unit 32 may be implemented so as to enable the user to stow various peripherals, one or more of which peripherals (i) may or may not be used via a peripherals connector 28 and/or (ii) may be stowed in a stow unit 32 either/both in use or not in use. In such example embodiment, a stow unit 32 may be implemented to stow, from time to time, at various times, or all the time, peripherals that include, as examples: a camera (e.g., a digital camera, a video camera, a night vision camera, and/or a video binocular); an optical source (e.g., binoculars); headphones (e.g., earbud headphones); a Bluetooth device (e.g., an interface to a television source, to other video source, and/or to an audio source); a cell phone (e.g., a smart phone); a music player (e.g., an IPOD®); a personal digital assistant; any of various computing devices (e.g., of any of various types, but of a form factor proportional to that of the respective stow unit 32); a gaming station; a storage device (e.g., a LifeDrive™, any of various so-called thumb drives, or other storage devices, whether having built-in interface technology or employing the driver/interface electronics 30); a connector (e.g., a USB cable to connect between a remote or attached peripheral and the HMD system, such as connection to a personal computer via peripherals connector 28); and/or a battery (e.g., a back-up and/or extended life battery). It is noted that, as to peripherals comprising optical components (e.g., a camera and/or an optical source), one or more of the optical components therein may be modeled and/or implemented consistent with a lens design of this disclosure, e.g., responsive to, and/or toward resolving among, applicable engineering criteria associated with any such peripheral.
In an example embodiment, depending on the peripheral stowed therein, the stow unit 32 may provide various connections so as to couple a stowed peripheral with one or more other components of the HMD system 40 (e.g., the same or substantially similar to any one or more of the electrical, electronic and/or other connections described above with reference to the peripherals connectors 28). In an example embodiment, a stow unit 32 may comprise a pocket (e.g., sewn into or onto a cap or formed in other headgear, with or without a pocket closure or peripheral retaining device).
Headgear 42 may be variously implemented. As examples, headgear 42 may be implemented via: a cap (e.g., a baseball cap, a fishing cap, a sailing cap, a golf cap, a driving cap, a bicycling cap, and/or an officiating cap); a hood (e.g., of a sweatshirt, a snowboarding jacket and/or a ski jacket); a visor (e.g., a sun visor); a sporting helmet (e.g., a bicycle helmet, a skiing helmet, a snowboarding helmet, a kayaking helmet, a motorcycling helmet, a driving helmet, a jockey's helmet, and/or an officiating helmet); service headgear (e.g., a helmet for army, navy, air force, NASA and/or special forces, and/or a fireman's hat); protective headgear (e.g., a hard hat); professional headgear (e.g., for surgeons, television cameramen, and/or journalists); gaming headgear; fashion hats (e.g., cowboy hats); and/or customized devices (e.g., designed and/or implemented to serve as headgear 42).
In an example embodiment, headgear 42 comprises a mount 48 and a base 49. The mount 48 refers to connection, within the HMD system 40, between the headgear 42 and the HMD assembly 44. The base 49 refers to connection between the HMD system 40 and the user's head. Generally, the mount 48 and base 49 may be variously implemented, which implementations should respond to, and/or resolve among, various engineering criteria, including, as examples: (a) the type of headgear 42; (b) the applications for which the HMD system 40 is designed and used; (c) desired stability of the HMD system 40 on the user's head, both during use and not during use of the HMD assembly 44 (e.g., responsive to torque arising from the weight/disposition of the HMD assembly 44 and/or of any peripherals associated with the modular assembly 46, such as via peripherals connectors 28 and/or the stow unit 32); (d) desired distribution of weight, pressure and/or other support criteria relating to the HMD system 40, around and/or about the user's head; (e) desired comfort of the HMD system 40 on the user's head; (f) fashion or other appearance criteria; and/or (g) safety. It is noted that, as to certain headgear, responding to and/or resolving among engineering criteria is disclosed in Kaufmann et al., U.S. Pat. No. 6,480,174, Patent Date Nov. 12, 2002, which disclosures, as well as all other disclosures thereof, are hereby incorporated by reference, as if set forth herein in their entirety, for all purposes.
In
In
In
As described above, the HMD system 40 may employ various peripherals. In an example embodiment, the HMD system 40 may employ various peripherals, at any time, alone and/or in combination. In such example embodiment, the user may be enabled to select which peripheral(s) to employ at any given time and for what duration. In an example embodiment, if peripherals are employed in combination, the number of peripherals may depend on various factors, including: the ability of the headgear's mount 48 and base 49 to accommodate such peripherals (e.g., to enable attachment of the peripherals and/or to perform sufficiently when peripherals are attached); the number of available, compatible peripherals connectors 28; the peripherals' demand for power relative to the available supply of power from an HMD-resident power source; and/or the peripherals' demand for drive/interface resources relative to the available supply of drive/interface resources from the HMD's driver/interface electronics 30.
In an example embodiment, one or more peripherals may be employed such that they are integrated in the HMD system 40. As an example, various storage devices and/or various image sources may be so integrated. In this example, the HMD system 40 may be implemented so that, other than the one or more integrated peripherals, either no additional peripherals may be employed or one or more additional peripherals may be employed (e.g., user-selected peripherals, times and/or durations).
In an example embodiment illustrated in
In an example embodiment illustrated in
As shown in
Although
The I/O assembly 74, generally, is implemented to communicate with the HMD system 40, so as to enable the user to interact with the system 40. To illustrate, an example embodiment of an I/O instrument 72 may be implemented to communicate with the HMD system 40 via the I/O device 37 of video camera 34, as illustrated in
Although device 37 may be implicated in the communication, the I/O assembly 74 may be variously implemented toward realizing communication with the HMD system 40. In an example embodiment as shown in
In an example embodiment, the video camera 34 sources selected light for reflection by the I/O assembly 74. The video camera 34 may be implemented to do so by implementing the I/O device 37 to comprise one or more LEDs and/or other source of light, including light of selected intensity (e.g., typically low-power light) and/or light of one or more selected wavelengths. So implemented, the I/O device 37 may be enabled to flood the camera's field of view (i.e., from a location adjacent to the camera's lens), whereby the reflector 78, when reflecting in the camera's field of view, is enabled to provide a reflected image (e.g., a relatively bright image) back to the video camera 34, for acquisition in the acquired image (i.e., so as to be included in the video signal representing the acquired image). The video camera 34 and/or other components of the HMD system 40 may be enabled to detect this reflected image in the video signal by differentiating it from other parts of the video signal by any of various methods (including, e.g., if the LEDs of the camera's I/O device 37 are pulsed during alternate video frames, the reflected image may be differentially detected in the video signal).
In addition to detecting the reflected image, the video camera and/or other components of the HMD system 40 may be implemented to analyze and exploit that detected signal. As an example, the detected, reflected signal may be analyzed for its relative position in the acquired image (e.g., among horizontal and vertical pixels, among selected frames, and/or from frame to frame in selected order or sequence), so as to be exploited as an input/output (I/O) device associated with the HMD system 40. In an example embodiment, when the HMD system is used together with a computing device, the I/O instrument 72 may be used, through this reflection operation, as a form of selector or pointing device (e.g., like a mouse or a touch screen).
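The alternate-frame pulsing and position analysis described above amount to frame differencing followed by centroid extraction. The sketch below is illustrative only and is not the disclosure's implementation: the frames are tiny grayscale grids, and the threshold, frame contents, and function name are all assumed for illustration.

```python
def detect_reflector(frame_on, frame_off, threshold=50):
    """Return the (row, col) centroid of pixels that are much brighter in
    the LED-on frame than in the LED-off frame, or None if no reflector
    is visible."""
    hits = [(r, c)
            for r, row in enumerate(frame_on)
            for c, v in enumerate(row)
            if v - frame_off[r][c] > threshold]
    if not hits:
        return None
    n = len(hits)
    return (sum(r for r, _ in hits) / n, sum(c for _, c in hits) / n)

# LED-off frame: ambient scene only (uniform background, assumed values).
frame_off = [[10] * 5 for _ in range(5)]
# LED-on frame: same scene plus a bright retroreflection near (1, 3).
frame_on = [row[:] for row in frame_off]
frame_on[1][3] = 200
frame_on[2][3] = 180

print(detect_reflector(frame_on, frame_off))  # -> (1.5, 3.0)
```

Mapping successive centroids to cursor coordinates is then an ordinary pointer-tracking problem, consistent with using the instrument as a selector or pointing device.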
To illustrate, in
In another example embodiment, the HMD system 40 may provide a virtual keyboard so as to enable typing (e.g., touch typing). In such embodiment, the I/O instrument 72 may be implemented as one or two gloves, or multiple finger cots or multiple thimbles, and/or otherwise so that one or more of the user's fingertips used for typing have an associated I/O assembly 74.
It is thought that I/O instruments 72, as disclosed herein, in conjunction with the HMD assembly 44, may enable use of personal computers, laptop computers and/or other computing devices without use of a standard computer monitor and/or standard keyboard (e.g., two of the largest components of desktop and laptop computers). Moreover, it is thought that such use in portable computing devices may significantly reduce battery requirements and/or extend the time that a computing device may be operated from a given amount of battery power. It is also thought that this combination may also provide advantages and enhancements to cell phone and PDA interfacing.
Turning to
In
In this disclosure (including the accompanying drawings and appended claims), various aspects of the subject matter are described. Therein, specific components, parts, structures, arrangements, relationships, steps, operations, actions, characteristics, features, functions, numbers, ranges, systems, configurations and other details (“Specific Details”) are set forth, including to provide a thorough understanding of the subject matter. However, it should be apparent to persons having ordinary skill in the art, including those having the benefit of this disclosure, that the subject matter may be practiced without one or more of the Specific Details.
In this disclosure (including the accompanying drawings and appended claims), one or more Specific Details may be well known to a person of ordinary skill in the art. One or more of such well-known Specific Details may have been merely referenced, partly or wholly omitted and/or simplified herein, including so as to not obscure the subject matter.
As to this disclosure (including the accompanying drawings and appended claims), variations, modifications, substitutions, equivalents and/or other changes (“Changes”) may be appreciated, recognized or otherwise understood by a person of ordinary skill in the art, having the benefit of this disclosure, including, without limitation, as to one or more of the Specific Details. Such Changes shall be considered to fall within the scope and spirit of the subject matter of this disclosure. Accordingly, the appended claims are intended to cover, and shall cover, all such Changes.
The example embodiments are, and shall be considered, illustrative; they are not, and shall not be considered, either restrictive or exhaustive. Moreover, the subject matter of this application is not, and shall not be, limited to the example embodiments, or to any one or more of the Specific Details. This disclosure (including the accompanying drawings and the appended claims) may be modified within the scope and equivalents of the original filing.