Head-worn display devices are known in the art. Typically, the display is a small color monitor arranged to present images to a user's left eye, right eye, or both. These devices often surround the user's face or head and thus are not only heavy but also occlude substantially all of the user's vision. In other words, while wearing the display, the user generally cannot easily view other objects in the user's normal peripheral vision, or loses substantial portions of normal peripheral vision during use. Other head-worn displays may include two separate displays, one for each eye, that are likewise supported on a heavy frame.
While these devices can provide a high-resolution display of images and sound, occlusion of all or a majority of the user's normal viewing space can be problematic. The user will typically use the display only in a few select locations that the user perceives to be safe, for example, in a living room, elsewhere in the home, or in a work space while seated, standing, or otherwise in a substantially fixed location. Users cannot efficiently perform many other day-to-day tasks when wearing an occlusive display device. These tasks include participating in activities requiring moderate to high personal mobility, requiring frequent depth perception adjustments, moving through areas with irregular and uneven surfaces, or requiring active collision avoidance (i.e., personally moving through areas or events with constantly changing obstacles or crowds, avoiding fast-moving objects, operating vehicles, or negotiating the use of public transportation), or any circumstance where personal safety may be sacrificed by loss of normal peripheral vision.
Secondly, such prior art head-worn displays are limited to certain narrow tasks. Such tasks can include viewing images, graphics, or movies with audio, whether for gaming or for recreational viewing of images from a television broadcast or video. Such prior art head-worn displays are severely limited in connection with other day-to-day desired functional computing tasks. For example, the user may desire to use the display in connection with communication tasks, running business applications, active navigation tasks, mobile instruction with real time updates, or wirelessly controlling other devices that the user regularly uses or comes in contact with on a day-to-day basis. Such devices can include, for example, a Personal Digital Assistant, a notebook computer, a desktop computer, a mobile phone, a vehicle, a wireless network, a wireless service hot spot, a thin client, another electronic device, or an appliance. Such prior art head-worn displays often cannot interface with or slave such devices to initiate and control running programs, initiate real time device functional changes, alter real time device operational parameters, enable local or remote wireless communication with mobile devices, and/or engage with wireless networks and services.
Thirdly, such prior art devices are not readily upgradeable to provide other functions that the user may desire. A user may desire, in some locations, to have the functional attributes of one or more particular software applications or hardware configurations, while in other locations the user may not desire those software applications or hardware configurations. In fact, the user may not want to use such a heavy display device loaded with multiple software applications or hardware configurations, and instead may wish to remove unnecessary software and hardware from the device so the device remains ultra-lightweight.
Accordingly, there is a need in the art for a monocular device that does not occlude large portions of the user's normal viewing space, so that the user is not prevented or discouraged from wearing the device during the user's day-to-day normal activities. There is also a need in the art for a device that provides functions beyond viewing images or graphics and that is user upgradeable, so the user can select and choose which hardware or software components to interface with the device. There is also a need in the art for a monocular device that occludes no more than about ten to about twenty percent of the user's normal vision, while leaving about eighty to about ninety percent or more of the user's vision free from obstruction. It is appreciated that the wearer's field of view has both vertical and horizontal extents, and that in one embodiment about eighty to about ninety percent of the wearer's vision in the horizontal is free from obstruction. There is also a need in the art for a device that can be easily moved from a displayed position to a stowed position without removing the device from the wearer's head. There is also a need in the art for a device that does not completely immerse the user in video and audio, so the user can still perform other day-to-day tasks.
In a first aspect of the present disclosure, there is provided a head mounted monocular display that includes a display arranged relative to a wearer's dominant eye, a housing connected to the display, and a support member. The support member is connected to the housing, which supports the display relative to the wearer's dominant eye. The display is generally located in a position relative to the wearer's dominant eye so the display is in the peripheral view of the wearer and does not occlude the wearer's normal vision by blocking the front of the wearer's dominant eye.
In another aspect, there is provided a method of supporting a head mounted display on a wearer. The method includes providing a resilient housing and connecting the head mounted display to the resilient housing. The housing is supported on the wearer so that the display is in the peripheral view of the wearer, and the display is supported relative to the wearer's head to occlude no more than about ten to about twenty percent of the normal field of view of the wearer. The normal field of view of the wearer is defined as about 180 degrees in the horizontal direction and about 120 degrees in the vertical direction.
In yet a further aspect of the present disclosure, there is provided a head mounted monocular display device that includes a display arranged relative to a wearer's eye, a housing connected to the display, a power supply, and a circuit operatively connected to the display and the power supply. A support member is connected to the housing, which supports the display relative to the wearer's eye. A port is associated with the display, the housing, or the support member, and operatively connects to the circuit for removably connecting at least one additional functional component to the circuit.
The foregoing will be apparent from the following more particular description of example embodiments of the invention, as illustrated in the accompanying drawings in which like reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating embodiments of the present invention.
A description of example embodiments of the invention follows.
Turning now to
In the preferred embodiment, a housing 210 (
A number of buttons and an LED may be associated with the device 100 and protrude from housing 210 or other locations (e.g., switch 1/switch 2/switch 3 and reset inputs). A VGA quality display 140 is shown in
Turning again to
Moreover, the monocular display device 100 can be advantageously viewed simply by looking out of the corner of the user's dominant eye momentarily to view images, and then immediately returning to the field of vision in front of the user. This enables the user to wear the monocular display device 100 in day-to-day activities. Advantageously, the user can quickly look at the display of the device 100 and then quickly, safely, and easily regain focus on objects that are in front of the user. This is advantageous since the user can use the monocular display device 100 in the user's day-to-day tasks and is not confined to using the device 100 only in certain designated "safe" locations. The user's dominant eye is defined as the right or left eye that is the strongest or dominant in the user's day-to-day vision.
In this embodiment, the support structure 110 may be any device for quickly and easily permitting the monocular display device 100 to be moved from a viewing position, located adjacent the user's dominant eye D, to a second, or stowed, position. In this aspect, the monocular display device 100 includes a first arm 115. The first arm 115 is a tubular resilient member that is connected to the optical display housing 120 by a hinge 125. The support structure 110 also includes a second arm 130 that is connected to the first arm 115 by a second hinge 135. In this manner, the second arm 130 may be connected to another structure associated with, or worn around, the user's head or ear, or connected to a garment for support. The user may quickly and easily move the display housing 120 to the stowed position using the support structure 110. Structure 110 is made from a lightweight material such as aluminum or a thermoplastic.
The monocular display device 100 also includes a display component 140 that will be discussed in detail herein. The display component 140 is preferably a lightweight display that projects an image that is magnified. Turning to
It should be appreciated that the displayed image need not be projected to, or displayed on, the entire eyeglass optical display 105. Instead, the image can be displayed on only a portion of the display 105. In this way, the image itself does not occlude the user's vision, and the user can see about ninety to ninety-five percent of the user's normal vision through the eyeglass optical element 105.
Turning now to
In one embodiment, the monocular display device 100 includes a display that is a micro-display component 140 such as, for example, a liquid crystal display, a light emitting diode display, an organic light emitting diode based display, a cholesteric display, an electro-luminescent display, an electrophoretic display, or an active matrix liquid crystal display. Various lightweight and high-resolution display configurations are possible and within the scope of the present disclosure.
In one preferred embodiment, the display component 140 may be a WVGA display sold under the trade name “CYBERDISPLAY WVGA LV”® manufactured by the instant Assignee. The display component 140 can be a color filter, wide format, active matrix liquid crystal display having a resolution of 854×480. The display component 140 in this embodiment can be 0.54 inches in the diagonal dimension. In another embodiment, the display component 140 may alternatively include a VGA display sold under the trade name “CYBERDISPLAY VGA”® which is also manufactured by the instant Assignee. The display component 140 can be a color filter, active matrix liquid crystal display having a resolution of 640×480. The display component 140 in this embodiment can be about 0.44 inches in the diagonal dimension and lightweight.
In a further embodiment, the display component 140 can be a 0.44 inch diagonal SVGA display with about 800×600 resolution, a wide SVGA display with about 852×600 resolution, an XVGA display with about 1,024×768 resolution, an SXGA display with 1,280×1,024 resolution or High Definition Television display with either 1,400×720 resolution or full 1,920×1,080 resolution.
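As a rough illustration of how the physical size of such a panel relates to its resolution and diagonal, the following sketch assumes square pixels and that the quoted diagonal spans the active display area; it is an aid to the reader rather than a statement of the actual panel geometry.

```python
import math

def panel_dimensions(h_pixels, v_pixels, diagonal_in):
    """Estimate active-area width and height (inches) from pixel counts and
    the diagonal, assuming square pixels."""
    pixel_diagonal = math.hypot(h_pixels, v_pixels)
    pitch = diagonal_in / pixel_diagonal  # inches per pixel
    return h_pixels * pitch, v_pixels * pitch

if __name__ == "__main__":
    # Resolutions and diagonals listed above for the display component 140.
    for name, (h, v), diag in [("WVGA", (854, 480), 0.54),
                               ("VGA", (640, 480), 0.44),
                               ("SVGA", (800, 600), 0.44)]:
        width, height = panel_dimensions(h, v, diag)
        print(f"{name}: {width:.2f} in x {height:.2f} in "
              f"(area {width * height:.3f} sq in)")
```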
In an embodiment shown in
Like the embodiment of
Turning now to
The present monocular display device 200 preferably has program instructions stored in a memory to form a computer networking master/slave relationship with other devices using a communication protocol in which the monocular display device 200 controls one or more other devices or processes; once the master/slave relationship is established, control is directed from the monocular display device 200 to the desired components. In this manner, the user need not carry heavy secondary components and may simply control those secondary components using the primary lightweight monocular display device 200 over a wireless interface.
In that aspect, the monocular display device 200 may include a processor (not shown), a memory, and a bus including a wireless interface. The wireless interface may include a transmitter/receiver or transceiver and be compatible with personal area networks and devices that communicate using short-range radio frequency signals. In one preferred embodiment, the wireless interface may communicate using the BLUETOOTH® radio standard or flexible Ultra Wideband (UWB), or using other radio frequency communication standards for low or flexible power consumption and compatibility. In another embodiment, the monocular display device 200 may communicate using Wi-Fi.
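A minimal sketch of the kind of master/slave command exchange described above, using a plain TCP socket as a stand-in for the BLUETOOTH®, UWB, or Wi-Fi transport; the command set, message format, and address are hypothetical and are not taken from the text.

```python
import json
import socket

# Hypothetical command vocabulary for slaved devices (phone, PDA, vehicle, etc.).
COMMANDS = {"start_app", "stop_app", "set_parameter", "query_status"}

def send_command(slave_addr, command, **params):
    """Master side: send one JSON-encoded command to a slaved device over the
    wireless link (modeled here as a TCP connection) and return its reply."""
    if command not in COMMANDS:
        raise ValueError(f"unknown command: {command}")
    message = json.dumps({"cmd": command, "params": params}).encode("utf-8")
    with socket.create_connection(slave_addr, timeout=5.0) as sock:
        sock.sendall(message + b"\n")
        return json.loads(sock.makefile().readline())

# Example (hypothetical address): instruct a slaved phone to start navigation.
# reply = send_command(("192.168.0.42", 5000), "start_app", name="navigation")
```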
Turning now to
Referring to
Turning again now to
Turning now to
Turning now to
Turning now to
In this alternative embodiment, positioned on the first body portion 305 is a power supply 320. The power supply 320 may include various compact power devices such as, for example, a battery or a wired connection. However, in this non-limiting embodiment, the power supply 320 may be configured to include a different rechargeable power source. In this embodiment, the power supply 320 can be configured as a solar photovoltaic rechargeable cell. The power supply 320 configured as the solar cell may further be configured as the primary power source for the monocular display device 300 or may alternatively be configured as a secondary or auxiliary power source. Various configurations are possible and within the scope of the present disclosure.
Preferably, the power supply 320 is positioned in a location suited to receiving sunlight or artificial light, or may be rotated into such a recharging position using the support structure 315. Turning now to
Preferably, in this embodiment, the electromagnetic field coil rechargeable antenna component 321 includes a battery component (not shown) that is operatively connected to the electromagnetic field coil rechargeable antenna component 321. The electromagnetic field coil rechargeable antenna component 321 preferably captures energy from transmitted or received magnetic fields and stores the captured energy in the battery component. These fields may be from a cell phone or wireless mobile device that the wearer carries.
The electromagnetic field coil rechargeable antenna component 321 may be configured for use in a sealed casing for primary power or configured for auxiliary power. The electromagnetic field coil rechargeable antenna component 321 preferably includes a transformer with a coil that captures the electromagnetic field for use by the device 300 and/or can be used with the embodiment of
Turning now to
As shown in
It should be appreciated that it may not be desirable to locate other peripheral or secondary components on frame F. Additional peripheral components may cause the device 500 to become heavier and uncomfortable, or cause the frame F to fall from the user's face. In this aspect, the monocular display device 400 further includes a lanyard strap interface 520 (
In this embodiment, the lanyard interface 520 may provide additional features for the monocular device 500 by permitting the lanyard interface 520 to carry slightly heavier items that would not be appropriate for housing in the display housing 405 or the body 410. In this aspect, the lanyard interface 520 may provide additional features such as increased battery life, increased memory functions, increased sensing features, or other previously described components or new, different components. The lanyard interface 520 preferably connects to either side of the eyeglass frame F and also has an interior that provides space for the additional components. The lanyard interface 520 may include wiring to a secondary auxiliary battery, additional sensors, additional rear view cameras, a lightweight solid-state memory, a bus, or a processor.
In another aspect, the lanyard interface 520 may act as a pass-through for wiring components to the opposite side of the eyeglass frame F. In this aspect, the lanyard 520 may communicate with an auxiliary secondary housing that is removably connected to the opposite side of the eyeglass frame F by a different clip or fastener.
Turning now to
Alternatively, the secondary auxiliary housing 505 may be configured to connect to other locations. In one embodiment, the housing 505 can be located to hang from a wearer's hat or eyeglasses, or may even hook or wrap around the wearer's ear. In another embodiment, the auxiliary housing 505 may wrap around the user's wrist, ankle, arm, leg, or bicep/tricep muscle. The secondary auxiliary housing 505 preferably increases the functionality of the monocular display device 500 by storing one or more additional or secondary components.
In the embodiment of
In the embodiment of
In the embodiment shown in
In one embodiment, the wearer may further use a wireless input/output device in order to control the monocular display device 600. In one aspect, the wireless input/output device 605 may include a wireless mouse, a wireless trackball, a wired mouse, a wired trackball, a microphone, a wireless or wired touchpad device, or a combination of these input/output features. As can be seen from
In an alternative embodiment shown in a rear view of
In another embodiment, the medallion 805 may include discrete peripheral components. These can include devices such as an input/output device, a secondary hard drive, a secondary memory, radio modules or components, television or video broadcast components, sensors, optical drives, disk drives, removable media, or other intermediary components with which to communicate with other primary computing components that are located in the monocular display device 800.
Such components may also include antennas, cameras, compasses, positional status components, head position sensor components, Global Positioning System components, targeting components, audio components, video components such as graphics cards, bar code readers, radio frequency identification components, user condition monitoring components, temperature sensing components, accelerometers, gas or biological sensing components or other components that can improve user functionality of the device 800. In another embodiment, the medallion 805 may include primary components that communicate with, and control the display 810.
As shown in
Turning now to
The medallion 905 is a lightweight device that provides additional functionality to the monocular display device 800. As previously stated above, the normal field of view of the wearer is defined as about 180 degrees in the horizontal direction and about 120 degrees in the vertical direction. Also as previously stated, the housing is supported on the wearer so that the display is in the peripheral view of the wearer, and the display is supported relative to the wearer's head to occlude no more than about ten to about twenty percent of the normal field of view of the wearer. Based on these numbers, the display can be of certain dimensions at particular distances from the eye. For example, based on trigonometry and geometry, the field of view at a distance D from the user's eye can be calculated as the surface area of a cross-section (a spherical cap) of a half-sphere of radius D centered on the user's eye, where the cross-section cuts off an area of the half-sphere proportional to the field of view, represented by θ. The area of the display can be represented by:

A = x · y

where x is the length of the display and y is its height. Further, the area of the cross-section can be represented by:

A = 2πDh

where h is equal to

h = D − √(D² − a²)

where a represents the radius of the cross-section of the half-sphere. Substitution gives

A = 2πD(D − √(D² − a²)).

In the y direction, where the 120-degree vertical field gives a_y = D·sin(60°) and a_y² = 0.75D², this yields:

A = 2πD(D − √(D² − 0.75D²)) = 2πD(D − √(D² − a_y²))

and in the x direction, where the 180-degree horizontal field gives a_x = D:

A = 2πD² = 2πD(D − √(D² − a_x²)).

Solving for a_y and a_x yields the dimensions of the display. Based on these defined and well-known relationships, and the fact that the display blocks no more than ten to twenty percent of the normal field of view of the wearer (the normal field of view also being defined above in degrees), the following table governs the range of sizes of the display, in centimeters, for various distances from the user's eye, in centimeters. In other words, based on the total field of view calculated for each distance from the user's eye, the range of sizes of the display can be determined by limiting the dimensions to 10-20% of that area. Other distances from the eye are possible under these relationships, but the following table provides various examples of such relationships.
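As an illustrative sketch of these relationships, the following code evaluates the spherical-cap areas above and the corresponding ten-to-twenty-percent display-area band at a few assumed eye-to-display distances; the specific distances, and the use of the vertical cap as the bounding field area, are illustrative assumptions rather than values taken from the text.

```python
import math

def cap_area(D, half_angle_deg):
    """Spherical-cap area A = 2*pi*D*h, with h = D - sqrt(D^2 - a^2) and
    a = D*sin(half_angle), following the equations above."""
    a = D * math.sin(math.radians(half_angle_deg))
    h = D - math.sqrt(max(D * D - a * a, 0.0))
    return 2.0 * math.pi * D * h

if __name__ == "__main__":
    for D in (1.0, 2.5, 5.0):  # assumed eye-to-display distances in centimeters
        horizontal = cap_area(D, 90.0)  # 180-degree horizontal field -> 2*pi*D^2
        vertical = cap_area(D, 60.0)    # 120-degree vertical field
        low, high = 0.10 * vertical, 0.20 * vertical
        print(f"D = {D:3.1f} cm: horizontal cap {horizontal:6.2f} cm^2, "
              f"vertical cap {vertical:6.2f} cm^2, "
              f"display area ~{low:.2f}-{high:.2f} cm^2")
```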
Further, as previously stated above, the display component 140 can be a color filter, wide format, active matrix liquid crystal display having a resolution of 854×480 and can be 0.54 inches in the diagonal dimension. As also previously stated, the display component 140 can be a color filter, active matrix liquid crystal display having a resolution of 640×480 and can be about 0.44 inches in the diagonal dimension and lightweight. If the display component 140 is 854×480 with a 0.54 inch diagonal, mathematical relationships reveal that the size of the display is approximately 0.36 inches by 0.64 inches, for a total area of 0.23 square inches. If the display component 140 is 640×480 with a 0.44 inch diagonal, mathematical relationships reveal that the size of the display is approximately 0.39 inches by 0.53 inches, for a total area of 0.20 square inches. The display component 140 can further be a 0.44 inch diagonal SVGA display with about 800×600 resolution, a wide SVGA display with about 852×600 resolution, an XVGA display with about 1,024×768 resolution, an SXGA display with 1,280×1,024 resolution, or a High Definition Television display with either 1,400×720 resolution or full 1,920×1,080 resolution. Additional dimensions and areas can be calculated based on these ratios and screen diagonals by a person of ordinary skill in the art. The medallion 905 provides this additional functionality by permitting operation of one or more additional electronic modules, which may plug into the medallion 905 and then communicate with the monocular device 800 using one or more wireless or wired interfaces such as BLUETOOTH®, Wi-Fi, cellular signals, infrared signals, USB, RS-232, RS-485, Ethernet, or another previously described interface established between the medallion 905 and the monocular device 800. It is envisioned that the medallion 905 may be operatively coupled to the display 810 to provide power to the display 810.
In one embodiment, the monocular device 800 may communicate wirelessly with the medallion 905 using a wireless protocol. In another embodiment, the monocular display device 800 may communicate with the medallion 905 using a wired connection or interface. In yet another embodiment, the medallion 905 may communicate with the lanyard interface 1410 in a wired or wireless manner and the lanyard interface 1410 may then communicate with the monocular display device 1400 in a wired or wireless manner (
As can be seen, the medallion 905 may further include a first USB interface slot 920 and a second USB interface slot 925 in different locations of the medallion 905. Other components may be inserted into the slots 920, 925 in order to expand the capabilities of the medallion 905 such as expanding the memory capabilities, video, audio, or sensory functions, or graphical capabilities of the medallion 905, or monocular display device 800.
Turning now to
Turning now to
Turning now to
The wearer, using the monocular display device 1200 and an input/output device, may control switching from the primary battery to the auxiliary battery 1205 using a control signal output from the monocular display device 1200. This is accomplished without removing the monocular device 1200 from the wearer's head. In this embodiment, the wired auxiliary battery 1205 disposed on the lanyard interface 1210 may include a cushioned housing 1205′ and an engagement structure having a clip or fastener. The battery 1205 may be hooked around, to, or through the lanyard interface 1210. The wearer may carry several different lanyard interface components 1210 with fresh batteries that may be swapped in once the auxiliary battery 1205 on the current lanyard interface 1210 is exhausted. Battery 1205 may also be configured as a primary battery to power the monocular display device 1200.
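A minimal sketch of the primary-to-auxiliary switchover logic described above; the class, method names, and low-charge threshold are hypothetical placeholders.

```python
class PowerManager:
    """Selects between the primary battery and the auxiliary lanyard battery,
    either on a wearer-issued control signal or on a low-charge reading."""

    def __init__(self, primary, auxiliary, low_threshold=0.05):
        self.primary = primary        # objects assumed to expose charge_fraction()
        self.auxiliary = auxiliary
        self.low_threshold = low_threshold
        self.active = primary

    def on_control_signal(self, use_auxiliary):
        """Switch sources on an explicit command from the display device."""
        self.active = self.auxiliary if use_auxiliary else self.primary

    def poll(self):
        """Fall back to the auxiliary battery if the primary runs low."""
        if self.active is self.primary and \
                self.primary.charge_fraction() < self.low_threshold:
            self.active = self.auxiliary
        return self.active
```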
Turning now to
The components may be wired through the lanyard interface 1310 to be coupled to a circuit or board associated with and coupled to the monocular display device 1300. Components 1305′ can be connected only when needed, or alternatively may remain connected throughout the operation of the device 1300. The wearer, using the monocular display device 1300 and an input/output device, may control operation of the components 1305′ without having to toggle any buttons associated with the components 1305′ themselves or use any other separate controllers or control signals associated with the components 1305′. The wearer may control these components 1305′ with ease using solely the monocular display device 1300, and without removing the monocular display device 1300, for convenient operation in a networked arrangement.
Similarly, the components 1305′ connected to the lanyard interface 1310 may include a cushioned housing and an engagement structure having a clip or fastener to connect to the lanyard interface 1310. It should be appreciated that the wearer may carry several different lanyard interfaces 1310, in sets or groups, each with different components 1305′ that may be replaced and interchanged. For example, the user may have a first lanyard interface 1310 with a rear view camera that may be controlled by the monocular display device 1300 for taking images.
In another example, the user may have a second lanyard interface 1310 (not shown) with a different component, such as a Global Positioning System receiver, that can also be controlled by the monocular display device 1300 using a common communication protocol, or networked relationship.
In another example, the user may have a third lanyard interface (not shown) with another two or more different components such as a music player and a mobile communication device. Both can be controlled by the monocular display device 1300 using a common communication protocol, or networked relationship. In this manner, the user may select which components the user is going to use over the course of a period of usage and then select the appropriate lanyard interface 1310 with components 1305′. The user may also include lanyard interfaces 1310 with no components, but instead these lanyard interfaces 1310 may act as a housing and be selectively loaded with other components 1305′ as needed. Various lanyard configurations 1310 are possible and within the scope of the present disclosure.
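The sketch below illustrates one way such interchangeable lanyard components could be presented to the display device behind a single common interface; the component classes, command names, and return values are hypothetical.

```python
class LanyardComponent:
    """Common interface each interchangeable lanyard component exposes."""
    name = "component"

    def handle(self, command, **params):
        raise NotImplementedError

class RearViewCamera(LanyardComponent):
    name = "rear_camera"

    def handle(self, command, **params):
        if command == "capture":
            return {"status": "ok", "frames": params.get("frames", 1)}
        return {"status": "unsupported"}

class GpsReceiver(LanyardComponent):
    name = "gps"

    def handle(self, command, **params):
        if command == "fix":
            return {"status": "ok", "lat": None, "lon": None}  # no fix yet
        return {"status": "unsupported"}

class LanyardInterface:
    """Registry the display device addresses; swapping lanyards simply swaps
    the set of registered components."""

    def __init__(self, *components):
        self.components = {c.name: c for c in components}

    def control(self, name, command, **params):
        return self.components[name].handle(command, **params)

# Example: lanyard = LanyardInterface(RearViewCamera(), GpsReceiver())
#          lanyard.control("rear_camera", "capture", frames=3)
```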
Turning now to
Additionally, other component modules (not shown) may also be connected to the medallion (not shown) along connection 1415. In this manner, at least one of (or both) the monocular device 1400 and medallion (not shown) may control the component 1405′ disposed in the lanyard interface 1410.
In one embodiment shown in
A first connection 1515 between the lanyard interface and the monocular device 1500 is removable, and another connection, along line 1530 from the lanyard interface to the secondary housing 1535 connected to the frames F, is also removable.
In this aspect, both (i) sections of the lanyard interface 1510 and (ii) the secondary housing 1535 may each be detached from one another or from the monocular display device 1500 for replacement with another fresh or different component. Likewise, sections 1510 operatively coupled to the medallion (not shown) are also removable.
Turning now to
Turning now to
Turning now to
In this embodiment, the lanyard interface 1915 may be operable to connect to the user's eyeglass frame F, but also be operable to communicate with the medallion 1905 and provide a wired or wireless connection between the medallion 1905 and the monocular display device 1900.
Turning now to
In another embodiment, the engagement structure may connect with other portions of the baseball cap instead of the brim B; however, the display housing 2015 preferably permits the display 2005 to be positioned in a location where the display does not substantially occlude the viewer's vision, and the viewer may view ninety to ninety-five percent of the viewer's normal viewing area (relative to when the display 2005 is not present in the viewer's field of vision). The display 2005 is shown disposed in the stowed position, or more particularly is positioned in alignment with the brim B. The monocular display device 2500 further includes a speaker system 2020 for audio. An ear bud 2020 or speakers are disposed in or on the monocular display device 2000. The ear bud 2020 is connected along wire or lead 2025. Wire 2025 is connected to a lanyard interface 2035, which is connected to a body portion 2030 of the monocular display device 2000, so the monocular display device 2000 can output an audio signal to the ear bud 2020 through the wired lanyard interface 2035.
Further, the monocular display device 2000 of the
Turning now to FIG. 20BA, the monocular device 2000 is shown in a viewing position. Here, the display 2005 is supported in a display housing 2010 and is located extended from a body portion 2015. In the viewing position, the display 2005 is located in the peripheral vision of the viewer with first and second arms 2020, 2025 extended from the body portion 2015. The display housing 2010 may be connected to the body portion 2015 by an articulating and telescoping arrangement as discussed above with the previously described embodiments.
Turning now to
Turning now to
It should be appreciated that the display component 2105 should have sufficient brightness and clarity, but at the same time operate within predefined low power limits and also be lightweight.
The display 2100 further includes a prismatic optical configuration including several optical surfaces arranged to direct the enhanced virtual image to the user in a magnified manner. The prismatic optical configuration includes a first aspherical optical surface or element 2115 and first and second reflective surfaces 2120, 2125. The first aspherical optical surface 2115 initially receives the image from the display component 2105.
The image is then reflected from the first and the second reflective surfaces 2120, 2125 to properly orient the image that is emitted from the display component 2105 to the viewer. In one embodiment, the first and the second reflective surfaces 2120, 2125 are plain reflective surfaces. In another embodiment, the first and second reflective surfaces 2120, 2125 may be diffractive and/or micro-lens reflective surfaces, or mixed, with the first surface 2120 being a diffractive and/or micro-lens reflective optical surface while the surface 2125 is a plain reflective surface. In this embodiment, the display 2100 may further include a second aspherical optical and/or micro-lens surface 2130, with the second aspherical optical surface 2130 being positioned relative to an outlet 2135. Alternatively, another different optical element may be positioned at the outlet 2135.
The first and the second aspherical optical surfaces 2115, 2130 are adapted to properly orient the image at the outlet 2135. In this manner, the image will be emitted from the display component 2105 to the first aspherical optic surface 2115 and to the first reflective surface 2120. The image will then be properly oriented to the second reflective surface 2125 and displayed in a virtual optically magnified manner to the viewer through outlet 2135 as shown in
In one embodiment, the monocular display device 2100 can have a display 2105 with optical elements having at least four optical surfaces. These surfaces include an aspherical entrance surface 2115 for receiving the image from the display 2105, an aspherical exit surface 2130 through which the user views the image directly, and at least two reflective surfaces 2120, 2125. Each reflective surface 2120, 2125 can be positioned to reflect the displayed image from the entrance surface 2115 to the exit surface 2130. The four or more optical surfaces of the optical element 2115, 2120, 2125, 2135 can be shaped or molded to generate a magnified virtual image of the displayed image. This permits the user to view crisp and clear images close to the user's eye E.
The virtual image appears to be located at a distance from the user, and is substantially greater in size than the optical path would suggest, the optical path being measured from the display 2105 through the optical element 2115 to the user's dominant eye E (
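As an illustrative aside not taken from the text, the standard thin-lens relations indicate how a short optical path can still yield a large, distant virtual image when the display sits just inside the effective focal length f of the optical element:

\[
\frac{1}{f} = \frac{1}{d_o} + \frac{1}{d_i}, \qquad m = -\frac{d_i}{d_o},
\]

where d_o is the display-to-element distance and d_i the image distance. For d_o slightly less than f, d_i becomes large and negative, corresponding to an enlarged, upright virtual image that appears well beyond the physical path length, with |m| = |d_i|/d_o much greater than one.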
Turning now to
The optical element 2115 is substantially free from distortion, astigmatism, and chromatic aberrations, and is designed for displaying low to high resolution text, charts, graphs, photographs, maps, graphical user interfaces, Internet web pages, and video content while maintaining overall image quality.
The display 2100 can be configured to include at least one monocular optical element surface 2115 including an entrance surface, multiple reflecting surfaces 2120, 2125 and an exit surface 2135. These surfaces can be curved to contribute to display image magnification producing the virtual image. In another alternative embodiment, the display 2100 can be configured with at least one optical element reflective surface being flat and the exit surface 2135 being aspherical. A distance between the optical element surface and the display 2105 can also be user adjustable. In one embodiment, the distance can be manually adjustable with a knob 2145 (
The display 2100 may be formed from a single block of optical material with at least four surfaces, at least two side surfaces of which are reflective surfaces. Each surface may include a plurality of apertures aligned in a row extending generally parallel to the optical element exit surface. Each aperture on one side surface has a complementary aperture on the other side surface, forming a pair.
Alternatively, the optical element 2100 can be a solid element formed of at least two different materials to form an achromat. The optical element 2100 may include at least one entrance surface 2115 and one exit surface 2130 that are formed of a first material that is different than a second material from which the reflective surfaces 2120, 2125 are formed. The optical element 2100 can be formed by bonding together at least two different optical materials to form a solid optical element, or panel. The display 2100 may incorporate a clear, flat, transparent, protective, scratch-resistant film or other element to protect the optical element exit surface 2135 (
Turning now again to
Turning now to
Turning now to
The touch screen 2305 may be located in a position adjacent to slots 2310, 2315 and can receive an input signal from the user dragging the user's finger across or over an overlay on the touch screen 2305. In this manner, as shown in
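A sketch of one way a drag gesture on the touch screen overlay could be mapped to menu navigation; the gesture threshold and method names are hypothetical.

```python
class TouchOverlay:
    """Maps finger-drag events on the touch screen to menu scrolling."""

    def __init__(self, menu_items, pixels_per_step=40):
        self.menu_items = menu_items
        self.pixels_per_step = pixels_per_step  # drag distance per menu step
        self.index = 0
        self._start_x = None

    def touch_down(self, x, _y):
        self._start_x = x

    def touch_up(self, x, _y):
        if self._start_x is not None:
            steps = int((x - self._start_x) / self.pixels_per_step)
            self.index = max(0, min(len(self.menu_items) - 1, self.index + steps))
            self._start_x = None
        return self.menu_items[self.index]

# Example: overlay = TouchOverlay(["Navigation", "Camera", "Settings"])
#          overlay.touch_down(10, 0); overlay.touch_up(55, 0)  # -> "Camera"
```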
Turning now to
Preferably, the display element 2400 is located adjacent to a first field lens 2415′ and a second objective lens 2415. The field lens 2415′ is connected to the objective lens 2415 with an air gap 2420 disposed therebetween. The field lens 2415′ preferably collimates the illumination of the display element 2400 and matches the illumination to the objective lens 2415 across the air gap 2420.
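As an illustrative note, assuming the field lens behaves as an ideal thin lens of focal length f_field (an assumption, not a value from the text), placing the display element at its focal plane collimates the light so that the air gap 2420 can be crossed without changing ray angles before the objective lens reconverges them:

\[
\frac{1}{d_i} = \frac{1}{f_{\text{field}}} - \frac{1}{d_o} \;\xrightarrow{\;d_o = f_{\text{field}}\;}\; \frac{1}{d_i} = 0, \quad d_i \to \infty .
\]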
In this aspect, the image is magnified in a prismatic manner across at least four optical surfaces to magnify the image displayed to the viewer.
The image then passes from the second mirrored surface 2410, where the image is reflected about ninety degrees to the fourth optical surface 2535′. The image is then magnified and properly displayed to the viewer's eye VE. For ease of assembly, the field lens 2415′ is assembled with, or otherwise connected to, the objective lens 2415 with a predetermined air gap 2420 using first and second registration pins 2440, 2445. Pins 2440, 2445 are dimensioned to properly fix the distance between the lenses 2415′, 2415 during assembly. Registration pins 2440, 2445 preferably have a predetermined length and are dimensioned so the optical distance is preserved between the field lens 2415′ and the objective lens 2415, to properly magnify and display the image to the user. The registration pins 2440, 2445 are preferably molded for ease of assembly into the lenses 2415′ and 2415. The lenses 2415′ and 2415 are preferably enclosed in a suitable housing 2445 that is thin and low cost. Alternatively, the display 2400 may be connected to a lens as described in U.S. patent application Ser. No. 11/420,624 to Ray Hebert entitled "Devices, and Methods for Image Viewing," which is herein incorporated by reference in its entirety.
While this invention has been particularly shown and described with references to example embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the invention encompassed by the appended claims.
This application claims priority to U.S. Provisional Patent Application No. 60/880,270 to Jacobsen et al., filed on Jan. 12, 2007, which is herein incorporated by reference in its entirety. This application also claims priority to U.S. Provisional Patent Application No. 60/930,242 to Jacobsen et al., filed on May 15, 2007, which is herein incorporated by reference in its entirety. This application also claims priority to U.S. Provisional Patent Application No. 60/962,686 to Jacobsen et al., filed on Jul. 31, 2007, which is herein incorporated by reference in its entirety. Further, this application also claims priority to U.S. Provisional Patent Application No. 60/999,801 to Jacobsen filed on Oct. 19, 2007, which is herein incorporated by reference in its entirety.
Number | Date | Country
---|---|---
60/880,270 | Jan 2007 | US
60/930,242 | May 2007 | US
60/962,686 | Jul 2007 | US
60/999,801 | Oct 2007 | US