Head up display for integrating views of conformally mapped symbols and a fixed image source

Information

  • Patent Grant
  • Patent Number
    10,598,932
  • Date Filed
    Wednesday, January 6, 2016
  • Date Issued
    Tuesday, March 24, 2020
Abstract
A method or system can be used with an aircraft or other vehicle. The system can include, or the method can use, a head up display for integrating views of conformally mapped symbols and a first image from at least one image source in an environment. The head up display includes a computer and a combiner configured to provide a second image in response to the computer. The second image includes the conformally mapped symbols and a window for viewing the first image on the image source.
Description
CROSS REFERENCE TO RELATED APPLICATIONS

The present application is related to U.S. application Ser. Nos. 14/225,062 and 14/754,368, both assigned to the Assignees of the present application and incorporated herein by reference in their entireties.


BACKGROUND

Displays are used in various applications. For example, avionic and other vehicular systems use head down display (HDD) systems and head up display (HUD) systems including but not limited to wearable displays, such as head worn displays (HWD) and helmet mounted displays (HMD). In aircraft applications, HUD and HDD systems advantageously display information from aircraft systems and sensors in a graphical and alphanumeric format. The information can also or alternatively include computer generated graphics based upon a terrain and structure database and/or real time sensor captured images.


HUDs generally include combiners that provide information conformally with the view of the environment through the windshield. For example, the F-35 Joint Strike Fighter (JSF) includes a visor-type combiner and precision head tracking HUD that allows the pilot to effectively view a full sphere, limited only by the flexibility of the seated pilot. Certain images provided by HUDs can suffer from degraded image quality in certain lighting situations. The degraded image quality can make it difficult to view detailed information. For example, a white symbol superimposed over a cloud and a green symbol superimposed over green landscape are difficult to perceive and/or ascertain. Further, information, such as lists or other text, is often more easily viewed on HDDs, such as large area displays (LADs) or large area HDDs (LAHDDs), than HUDs.


SUMMARY

In one aspect, the inventive concepts disclosed herein are directed to an apparatus in an environment. The environment includes a head down image source disposed at an image source position. The apparatus includes a processor, a projector, and a combiner configured to provide an image from the projector. The processor is configured to cause the projector to provide the image having a window conformally mapped to the image source position.


In another aspect, the inventive concepts disclosed herein are directed to a head up display for integrating views of conformally mapped symbols and a first image on at least one image source in an environment. The head up display includes a computer and a combiner configured to provide a second image in response to the computer. The second image includes the conformally mapped symbols and a window for viewing the first image on the image source.


In still another aspect, the inventive concepts disclosed herein are directed to a method of providing a virtual display in an environment. The method includes providing a first image to a combiner. The first image has a window for viewing an in-dash display. The method also includes providing a second image on the in-dash display.





BRIEF DESCRIPTION OF THE DRAWINGS

The disclosure will become more fully understood from the following detailed description, taken in conjunction with the accompanying drawings, wherein like reference numerals denote like components, and:



FIG. 1 is a perspective view schematic illustration of an aircraft control center or cockpit including a display system having a combiner according to some embodiments;



FIG. 2 is a schematic general block diagram of the display system illustrated in FIG. 1 according to some embodiments;



FIG. 3 is a schematic general block diagram of the display system illustrated in FIG. 1 showing images on one or more HDDs viewed through the combiner of the display system according to some embodiments;



FIG. 4 is a more detailed, schematic general block diagram of the display system illustrated in FIG. 1 according to some embodiments;



FIG. 5 is a simplified side view drawing of the combiner and a projector for the display system illustrated in FIG. 1 according to some embodiments;



FIG. 6 is a flow diagram showing operations of the display system illustrated in FIG. 1 according to some embodiments;



FIG. 7 is a schematic illustration showing an image on the combiner of the display system illustrated in FIG. 1 according to some embodiments; and



FIGS. 8A-B are schematic illustrations of a grab and hold operation as viewed from the combiner of the display system illustrated in FIG. 1 according to some embodiments.





DETAILED DESCRIPTION OF THE EXEMPLARY EMBODIMENTS

Before describing in detail the particular improved system and method, it should be observed that the inventive concepts include, but are not limited to, a novel structural combination of conventional data/signal processing, displays, optical components and/or communications circuits, and not in the particular detailed configurations thereof. Accordingly, the structure, methods, functions, control and arrangement of various components, optics, software, and circuits have, for the most part, been illustrated in the drawings by readily understandable block representations and schematic diagrams, in order not to obscure the disclosure with structural details which will be readily apparent to those skilled in the art, having the benefit of the description herein. Further, the inventive concepts disclosed herein are not limited to the particular embodiments depicted in the exemplary diagrams, but should be construed in accordance with the language in the claims.


According to some exemplary embodiments, a display system provides a window for viewing a head down display (HDD), other display, gauge or sensor. In some embodiments, the image on a combiner of a head up display (HUD) includes one or more transparent windows at one or more virtual locations associated with the actual location of the HDD, other display, gauge or sensor. In some embodiments, the transparent windows (e.g., regions free of overlays provided by the HUD) allow information on the HDDs, displays, gauges or sensors to be viewed without interference from the symbols or images displayed on the combiner. The display system allows sensed (from an enhanced vision system (EVS)) and generated (from a synthetic vision system (SVS)) real-world features and/or representative icons to be displayed to the flight crew in conjunction with HUD operations and yet does not interfere with views of the cockpit instrumentation and HDDs. Advantageously, the system and method of some embodiments allows information more easily viewed on the HDDs to be viewed without clutter from information provided on the combiner of the HUD, thereby providing an integrated user interface to unify the information presented on a HUD overlay and HDDs.
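The transparent windows described above can be thought of as regions cut out of the HUD overlay so the head-down content shows through. The following is a minimal illustrative sketch of that masking step; the function name, the pixel representation, and the rectangle format are hypothetical and not taken from the patent.

```python
# Hypothetical sketch: cutting transparent "windows" out of a HUD overlay
# so head-down displays remain visible through the combiner. The data
# model (pixel dict, rectangles) is illustrative only.

def cut_windows(overlay, windows):
    """Zero the alpha of every overlay pixel inside a window rectangle,
    leaving a see-through region free of HUD symbology.

    overlay: dict mapping (x, y) -> (symbol, alpha)
    windows: list of (x0, y0, x1, y1) rectangles in combiner coordinates
    """
    def inside(x, y):
        return any(x0 <= x < x1 and y0 <= y < y1
                   for x0, y0, x1, y1 in windows)

    return {
        (x, y): (sym, 0 if inside(x, y) else alpha)
        for (x, y), (sym, alpha) in overlay.items()
    }
```

In a real system this would operate on a frame buffer rather than a dictionary, but the principle is the same: the window is defined by suppressing overlay content rather than by drawing anything.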


In some embodiments, the system and method utilizes a processor in communication with a worn display (e.g., a head mounted display (HMD)) and HDDs. Various information or symbols can be provided in the HUD image and moved to the HDD by using gestures or signals from user interfaces. For example, a grab and hold gesture can be used to move airspeed information, altitude tape information, an airspace boundary overlay, airport data, and/or communication data from the HUD view to a view on the HDD. Other gestures can be used to select information from menus or page through information screens on the HDD in some embodiments.


With reference to FIG. 1, a display system 10 is provided for an aircraft 11 including a cockpit or an aircraft control center 12. Although discussed with respect to the aircraft 11, the display system 10 can be utilized in a variety of applications including but not limited to other transportation applications (e.g. ground vehicle, marine, space, etc.), robotic or drone applications, medical applications, or targeting applications according to some embodiments. In some embodiments, the display system 10 is configured for use in smaller cockpit embodiments, for use in remote vehicle or aircraft applications, for use in ships or boats, or for use in simulators or other training devices. The display system 10 can provide two dimensional or three dimensional virtual images in some embodiments.


The display system 10 includes one or more of a HUD 18 and one or more of a HDD 20, a HDD 28, and a HDD 30 provided below a glare shield 31. The HDDs 20, 28 and 30 and the HUD 18 can be used to provide information to the flight crew, thereby increasing visual range and enhancing decision-making abilities. The HUD 18 includes a combiner 32 and a projector 34. The HDDs 28 and 30 are large area format HDDs in some embodiments.


In some embodiments, the HDDs 20, 28 and 30 and the combiner 32 provide images associated with weather displays, weather radar displays, communication displays, flight data displays, engine instrument information displays, chart displays, mapping displays, flight plan displays, terrain displays, or other flight instrumentation. Further, the HDDs 20, 28 and 30 and the combiner 32 provide a synthetic vision system (SVS) image, an enhanced vision system (EVS) image (e.g., an EFVS image), a radar image, a sensor image or a merged or combined image derived from any two or more of the SVS image, the radar image, the sensor image, and the EVS image in some embodiments. The HDDs 20, 28 and 30 and the combiner 32 are configured to display a three dimensional or perspective image of terrain and/or weather information in some embodiments. Other views of terrain and/or weather information can also be provided (e.g., plan view, horizontal view, vertical view, or combinations thereof).


The HDDs 20, 28 and 30 and the combiner 32 can be implemented using any of a variety of display technologies, including cathode ray tube (CRT), liquid crystal display (LCD), organic LED display, laser-based, and other display technology. The combiner 32 can be any type of device for providing conformal images, including but not limited to, waveguide combiners, reflective combiners, or holographic combiners, in some embodiments. The combiner 32 is embodied as a head worn combiner or a fixed HUD combiner in some embodiments. In some embodiments, the combiner 32 utilizes waveguide optics and diffraction gratings to receive collimated light provided by the projector 34 and provide collimated light to a user. In some embodiments, the combiner 32 is a goggle, glasses, helmet or visor-type combiner.


In some embodiments, the HUD 18 is a head worn display system (e.g., an HMD) with head and/or eye tracking. The HUD 18 utilizes the projector 34 to provide the image to the combiner 32 including at least one virtual region corresponding to the locations of the HDDs 20, 28, and 30 and/or gauges, instrumentation, or other equipment in the flight control center 12.


With reference to FIG. 2, the HUD 18 includes the combiner 32, the projector 34, a tracker 36, and a processor or computer 56. The projector 34 includes an image source 58 and optics 60. The display system 10 provides a window 40 on the combiner 32 at a virtual location associated with the HDD 28 in some embodiments. In some embodiments, the display system 10 provides a window 41 on the combiner 32 at a virtual location associated with the HDD 30. Other windows associated with the HDD 20 (FIG. 1), instrumentation, controls or gauges can be provided on the combiner 32 in some embodiments.


The tracker 36 is a head or eye tracker. In some embodiments, the tracker 36 provides gaze information associated with the user (e.g., pilot) to the computer 56. The tracker 36 can be any type of sensor or set of sensors for determining head position and/or eye position, including but not limited to camera based sensors, magnetic sensors, mechanical sensors, infrared sensors, etc. In some embodiments, the tracker 36 can be or include one or more cameras or sensors to provide gaze information. The cameras can be fixed within the aircraft control center 12 (FIG. 1) or worn by the user for determining the content of the user's visual field (e.g., gaze information). In some embodiments, the camera associated with the tracker 36 can utilize marks within the aircraft control center 12 to determine where the user is looking. Spatial registry software can be utilized with data from the camera to locate the view of a user in some embodiments.


In operation, the HUD 18 provides images from the image source 58 via the optics 60 to a pilot or other operator so that he or she can simultaneously view the images and the real world scene on the combiner 32 in some embodiments. The images can include graphic and/or text information (e.g., flight path vector, target icons, symbols, fuel indicators, course deviation indicator, or pitch indicator). The image can also include information from other sensors or equipment (e.g., a vertical traffic collision avoidance display, terrain avoidance and awareness display, a weather radar display, flight control sensors, an electronic flight bag, a navigation system, and environmental sensors) in some embodiments. In addition, the images can include synthetic or enhanced vision images. In some embodiments, collimated light representing the image from image source 58 is provided on the combiner 32 so that the pilot can view the image conformally on the real world scene through the combiner 32 with the virtual windows 40 and 41 for viewing the HDDs 28 and 30. The virtual windows 40 and 41 do not include information on the combiner 32 and appear as a transparent region in some embodiments.


The computer 56 can use gaze information, eye position and/or head position from the tracker 36 to determine the user's field of view and appropriately place the windows 40 and 41 as well as conformal symbols in some embodiments. In some embodiments, the user can select information on the image provided on the combiner 32 to be viewed on the HDDs 28 and 30 through windows 40 and 41. Advantageously, HUD 18 allows seamless integration of information displayed on the HDDs 28 and 30 and the combiner 32 using the windows 40 and 41 in some embodiments.
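The window placement described above amounts to projecting the fixed cockpit location of an HDD into combiner image coordinates for the current head orientation. Below is a simplified, hypothetical sketch of that projection using only yaw and pitch; a real system such as the one described would use full six-degree-of-freedom head pose from the tracker. All names and the coordinate convention (x right, y up, z forward, metres) are assumptions for illustration.

```python
import math

def window_position(hdd_pos, head_yaw, head_pitch, focal=1.0):
    """Project a fixed head-down display location (cockpit frame) into
    combiner image coordinates for the current head orientation.
    A pinhole-style sketch, not the patented implementation."""
    x, y, z = hdd_pos
    # Rotate the point into the head frame: yaw about the vertical axis...
    cy, sy = math.cos(-head_yaw), math.sin(-head_yaw)
    xr = cy * x - sy * z
    zr = sy * x + cy * z
    # ...then pitch about the lateral axis.
    cp, sp = math.cos(-head_pitch), math.sin(-head_pitch)
    yr = cp * y - sp * zr
    zr = sp * y + cp * zr
    if zr <= 0:
        return None  # display is behind the viewer; draw no window
    return (focal * xr / zr, focal * yr / zr)
```

When the head turns, re-running this projection each frame keeps the window registered over the physical display, which is the behavior the tracker 36 and computer 56 cooperate to produce.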


In some embodiments, monochromatic symbols and information are provided on the combiner 32 while colored symbols and colored information are provided on the HDDs 28 and 30 and viewed through the windows 40 and 41. Weather radar information, terrain avoidance system information, and traffic collision avoidance system information, including colored symbology, is provided on the HDDs 28 and 30 in some embodiments. Textual listings are provided on the HDDs 28 and 30 in some embodiments.


The image source 58 can be or include any type of device for providing an image including but not limited to a CRT display, an LED display, an active matrix liquid crystal display (LCD), a light emitting diode, laser illuminator, etc. In one embodiment, image source 58 can be a micro LCD assembly or liquid crystal on silicon (LCOS) display and can provide linearly polarized light. Image source 58 can include a laser or LED backlight in one embodiment.


The computer 56 can be a HUD computer or HWD computer and controls the provision of images by the image source 58. The computer 56 can be a processing circuit or part of a processing circuit associated with other electronic components in the aircraft control center 12 (FIG. 1). The computer 56 can receive data from various sensors and equipment of the aircraft 11 (FIG. 1). The computer 56 includes software or instructions stored on a non-transitory medium such as a memory in some embodiments. The software includes gesture recognition software, HUD function software, spatial registry software and video processing software in some embodiments.


The optics 60 are collimating optics which can be a single optical component, such as a lens, or include multiple optical components, in some embodiments. The optics 60 are integrated with the image source 58 in some embodiments. The optics 60 are separate or partially separate from the image source 58 in some embodiments.


With reference to FIG. 3, the display system 10 allows a control panel 202, a navigation display panel 204, a primary flight display panel 206, and an electronic flight bag display panel 212 to be viewed through respective windows 40, 41, 42, and 43. The control panel 202, navigation display panel 204, primary flight display panel 206, and electronic flight bag display panel 212 are each provided on a single HDD or on a part of an HDD (e.g., HDDs 28 and 30 (FIG. 2)) in some embodiments. In some embodiments, the control panel 202, navigation display panel 204, primary flight display panel 206, and electronic flight bag display panel 212 are provided on a single large panel HDD or a pair of HDDs. When the pilot turns or rotates his or her head or shifts gaze to view information on the combiner 32, the computer 56 (FIG. 2) adjusts the positions of the windows 40, 41, 42, and 43 to match the positions of the control panel 202 and the display panels 204, 206 and 212 in some embodiments.


The user can view information through the window 42 on the combiner 32 associated with the primary flight display panel 206 in some embodiments. The user can view information on the combiner 32 associated with the electronic flight bag display panel 212 through the window 43 in some embodiments. The user can view information on the combiner 32 associated with the navigation display panel 204 through the window 41 in some embodiments. Although only three display panels 204, 206, and 212 are shown in FIG. 3, more image sources and windows can be provided. Display panels 212, 204, and 206 are images in a fixed place below the glare shield 31 in some embodiments.


With reference to FIG. 4, the display system 10 is configured to coordinate the viewing of information on at least one of HDDs 28 and 30 and the combiner 32 in some embodiments. The HDD 28 includes an image 38 which can be viewed through the window 40 in an image 39 provided on the combiner 32 in some embodiments. Information associated with the image 38 can be selected and placed in the window 40 via a user interface 452. Once placed in the window 40, the HDD 28 can augment the information with additional information and symbols provided in the image 38 on the HDD 28. The user interface 452 includes sensors (e.g., optical) for sensing hand motions and moving data in response to the hand motions. The computer 56 can execute gesture recognition algorithms to determine the gestures sensed via a camera associated with the user interface 452. In some embodiments, the user interface 452 includes a mouse, track ball, joy stick, touch pad or other interface device for selecting and moving data via a cursor, pointer or other graphic. The image 38 can include flight instrumentation information, compasses, and navigation and hazard information.
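The grab and hold gesture sensed through the user interface 452 can be reduced to a simple temporal test over hand-tracking frames: a pinch that persists at a stable position for some number of frames. The sketch below is one hypothetical way to express that; the threshold values, sample format, and function name are invented for illustration and are not from the patent.

```python
def detect_grab_and_hold(samples, pinch_thresh=0.03, hold_frames=10):
    """Illustrative grab-and-hold detector over hand-tracking samples.

    samples: list of (thumb_index_distance_m, (x, y)) tuples, one per frame.
    Returns the held position once the pinch persists for hold_frames
    consecutive frames, else None. Thresholds are placeholder values.
    """
    run = 0
    for dist, pos in samples:
        if dist < pinch_thresh:
            run += 1
            if run >= hold_frames:
                return pos
        else:
            run = 0  # pinch released; restart the count
    return None
```

A production gesture recognizer would of course handle jitter, multiple hands, and release events, but the core "pinch held long enough at a position" test is the essence of a grab.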


The image 39 is an image including flight control symbols and/or other HUD symbology with or without a vision system image or SVS image provided conformally on the combiner 32 in some embodiments. In some embodiments, the image 39 does not include flight control symbols and/or other HUD symbology and includes a vision system image and/or a SVS image. The window 40 has a clear background for viewing information on the HDD 28 in some embodiments.


The computer 56 includes a processor 425, an HDD frame module 426, an image renderer 428, a HUD frame module 436, and an image renderer 438 in some embodiments. The processor 425 is coupled to the projector 34 and is coupled to the HDDs 28 and 30 in some embodiments. In some embodiments, the display system 10 receives a synthetic vision frame from a synthetic vision system (SVS) and/or a vision frame from a vision system (VS). The processor 425 serves to provide a conformal image on the combiner 32 and select the information to be on display on the HDDs 28 and 30 in some embodiments.
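The split between frame modules (buffers) and renderers described above can be sketched as two small cooperating types: a module that accumulates layers for a display surface, and a renderer that composites them. This is a hypothetical structural sketch only; the class names and the layer representation are assumptions, not the patent's implementation.

```python
from dataclasses import dataclass, field

@dataclass
class FrameModule:
    """Frame buffer handed to a renderer (in the spirit of modules 426/436)."""
    layers: list = field(default_factory=list)

    def push(self, layer):
        self.layers.append(layer)

@dataclass
class Renderer:
    """Composites buffered layers into one frame for its display surface
    (in the spirit of renderers 428/438)."""
    surface: str

    def render(self, module: FrameModule):
        # A real renderer would rasterize; here we just bundle the layers.
        return {"surface": self.surface, "frame": list(module.layers)}
```

Keeping the HUD and HDD paths as separate module/renderer pairs, as the text describes, lets each surface be updated independently while a shared processor coordinates what goes where.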


The image renderer 428 utilizes display information from the HDD frame module 426 to provide an image on the HDDs 28 and 30. The image renderer 428 can be utilized to provide any type of flight information. The HUD frame module 436 provides information (e.g., HUD symbology) to the image renderer 438 for providing the image 39 on the combiner 32. The image renderer 438 uses data from the tracker 36 to provide the window 40.


The image renderers 428 and 438 can be hardware components or hardware components executing software configured to provide the images 38 and 39 in some embodiments. The frame modules 426 and 436 include memory such as a frame buffer.


The processor 425 can be part of or integrated with a radar system, the SVS, the VS, a HDD display computer for the HDDs 20, 28, and 30, or a HUD computer for the projector 34 in some embodiments. In some embodiments, the processor 425 is an independent platform.


The display system 10 also includes a data link receiver or data bus for receiving information from one or more of flight management computers and other avionic equipment for receiving phase of flight indications in some embodiments. Phase of flight indications are used to automatically choose information for displaying on HDDs 28 and 30 and the combiner 32 at landing, approach, cruise or take off in some embodiments. For example, the window 40 can automatically be removed or provided in response to a phase of flight such as landing. In some embodiments, certain information (e.g., airport information, enhanced vision information, traffic collision avoidance information or terrain avoidance information) is automatically provided in image 38 during the landing phase of flight. In another example, an airport moving map can be viewable through the window 40 (e.g., virtual window) on the HDD 28 when taxiing. In some embodiments, flight plan information automatically appears in the window 40 during cruise, and flight parameters (e.g., altitude, roll, pitch, yaw, air speed, vertical speed indications) and position parameters are provided in the window 40 during landing.
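The phase-of-flight behavior above is essentially a policy table mapping a phase indication to window content. A minimal hypothetical sketch, with entries paraphrasing the examples in the text (the dictionary keys and content labels are invented placeholders):

```python
# Hypothetical policy table: phase-of-flight indication -> window content.
# The entries paraphrase the examples given in the description.
PHASE_RULES = {
    "taxi":    {"window_40": "airport_moving_map"},
    "cruise":  {"window_40": "flight_plan"},
    "landing": {"window_40": "flight_and_position_parameters"},
}

def window_content(phase):
    """Choose window content automatically from a phase-of-flight
    indication received over the data bus; None means no window rule."""
    return PHASE_RULES.get(phase, {}).get("window_40")
```

Driving the window contents from such a table keeps the selection logic declarative, so adding a phase (e.g., approach) is a data change rather than a code change.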


With reference to FIG. 5, the HUD 18 includes the combiner 32 and the projector 34. The projector 34 includes the image source 58 and the optics 60. The image source 58 includes a backlit liquid crystal display module including a transmissive AMLCD display. The optics 60 provide collimated light to the combiner 32 (e.g., collimation via a catadioptric folded collimator in one embodiment). Generally, the collimated light input to the combiner 32 has a small dimension in a vertical direction allowing a compact design. The combiner 32 includes a diffractive grating 476 for coupling light into the waveguide associated with combiner 32 and a diffractive grating 478 that ejects light out of combiner 32 to the user. The gratings 476 and 478 are configured for pupil expansion. The image source 58 and combiner 32 can be attached to a ceiling of the cockpit or a bulkhead wall or can be worn in some embodiments.


With reference to FIG. 6, the display system 10 operates according to a flow 600. The flow 600 can be performed by the computer 56 (FIG. 4) executing the software or instructions according to various embodiments. In some embodiments, the computer 56 receives the position of the HDDs 28 and 30 which can be stored in memory at an operation 602. In some embodiments, the computer 56 receives a head position or eye position at an operation 604. At an operation 606, the virtual location of the HDDs 28 and 30 is determined using the gaze information. At an operation 608, information is displayed on the combiner 32 including windows 40 and 41 for viewing the HDDs 28 and 30 in some embodiments.
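Operations 602 through 608 of the flow 600 can be sketched as a single per-frame pass: read the stored display positions, sample head pose, project each display into a virtual location, and emit the window list. The function below is an illustrative skeleton under assumed interfaces (the `get_head_pose` and `project` callbacks are hypothetical stand-ins for the tracker 36 and the projection step).

```python
def flow_600(stored_hdd_positions, get_head_pose, project):
    """One pass of the flow: fetch stored HDD positions (operation 602),
    read head/eye pose (operation 604), compute each display's virtual
    location (operation 606), and return the window list to draw on the
    combiner (operation 608)."""
    pose = get_head_pose()                           # operation 604
    windows = []
    for name, pos in stored_hdd_positions.items():   # operation 602 data
        virtual = project(pos, pose)                 # operation 606
        if virtual is not None:                      # display out of view?
            windows.append((name, virtual))          # operation 608
    return windows
```

Running this once per rendered frame keeps the windows registered as the head moves, which matches the repeated-update behavior the flow implies.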


At an operation 610, the computer 56 determines if information has been selected for movement from the combiner 32 to the HDDs 28 and 30, from the HDDs 28 and 30 to the combiner 32, or between the HDDs 28 and 30. Grab and hold gestures can be used to virtually move the information from locations on the combiner 32 and the HDDs 28 and 30 in some embodiments. In some embodiments, cursors, pointers or other symbols are manipulated using user interface devices, such as track balls, mouse devices, buttons, joy sticks, or touch panels, associated with the user interface 452 to select and move the information.


At an operation 612, the information is moved according to the gesture or selection in the operation 610. The flow 600 can advantageously use windows 40 and 41 as drop zones for information that is displayed on the combiner 32 but can be displayed with a higher image quality on the HDDs 28 and 30. In addition, the information can be augmented with data more appropriately displayed on the HDDs 28 and 30 when dropped into the windows 40 and 41 in some embodiments.
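The drop-zone move in operations 610 and 612 can be modeled as removing a symbol from the HUD overlay and handing it to the head-down display, which then augments it. The sketch below is hypothetical; the `augment` callback stands in for whatever richer presentation the HDD provides, and none of the names come from the patent.

```python
def drop_into_window(hud_symbols, window_name, symbol, augment):
    """Move a symbol off the HUD overlay into a window's drop zone.

    hud_symbols: symbols currently drawn on the combiner
    augment: hypothetical callback returning the richer HDD presentation
    Returns (remaining HUD symbols, HDD view keyed by window).
    """
    remaining = [s for s in hud_symbols if s != symbol]
    hdd_view = {window_name: augment(symbol)}
    return remaining, hdd_view
```

This mirrors the described behavior where, for example, an airspace boundary symbol dropped into a window causes the HDD to show detailed airport pages while the symbol leaves the combiner.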


With reference to FIG. 7, the display system 10 provides an image 702 on the combiner 32. The image 702 includes windows 40, 41, and 42 associated with the location of display panels 704, 706 and 708. The display panel 704 is a weather radar display image in some embodiments. The display panel 706 is an EVS or VS display image in some embodiments. The display panel 708 is a primary flight display image in some embodiments. The combiner 32 also provides image 712 including conformal flight data.


Any type of symbol can be displayed in the windows 40, 41, and 42 and as part of the image 712. The symbols include airspeed and altitude tapes. The symbols can be abstract and represent that more information is available when moved to the windows 40, 41, and 42 in some embodiments. For example, pages of information related to airports associated with an airspace boundary symbol are provided in one or more of windows 40, 41, and 42 when selected on the image 712 or at any location on the combiner 32. The information includes radio frequencies, instrument approaches, runway length, runway width, elevation, and available services in some embodiments.


With reference to FIGS. 8A-B, the display system 10 responds to a grab and hold gesture associated with an operator's hand 802. The hand 802 virtually grabs an airspace boundary symbol 806 in some embodiments. When grabbed, the airspace boundary symbol 806 is copied and moved as airspace boundary symbol 804 to the window 41 (FIG. 8B). Once moved and released in the window 41, an HDD (e.g. HDD 28) provides information associated with the airspace boundary symbol 806 as a display panel 812. Once moved, the airspace boundary symbol 806 is removed from the combiner 32 in some embodiments. In some embodiments, the airspace boundary symbol 806 remains until specifically selected for removal by a delete gesture.


Although discussed above with respect to the air space boundary symbol 806, other symbols can be moved to the window 41 and additional information associated with the symbol can be provided in the window 41 when moved. In some embodiments, information can be removed from display panel 812 and placed on the combiner 32 outside of window 41 in response to the grab and hold gesture. Information in the window 41 can be placed in another window by the grab and hold gesture.


In some embodiments, the display system 10 can recognize gestures for paging through menus associated with information in the windows 40, 41, and 42. The user can quickly change pages with a swipe gesture in some embodiments. In some embodiments, the grab and hold gesture could be used to drop information in an HDD associated with a co-pilot for the co-pilot's review. Moving a navigation overlay from the combiner 32 to a window 40 can trigger a moving map page to be displayed on the HDDs 28 and 30 or on the combiner 32 in some embodiments. In some embodiments, virtual controls are provided on the combiner 32 and gestures are used to manipulate the virtual controls. In some embodiments, virtual handles adjacent radio equipment or throttle controls could pull up a tuning page or engine monitoring page on the HDDs 28 and 30 (FIG. 2) or the combiner 32. In some embodiments, the symbol representing an item that can be dragged and dropped can glow, be highlighted, flash slowly, etc. For example, a throttle symbol can glow and be dragged and dropped to provide bar graphs representing engine characteristics. According to other examples, a glowing radio symbol or a glowing engine symbol can be dragged and dropped to obtain radio or engine data. In another example, a passenger or cabin symbol is dragged and dropped to display and control cabin information from the HDD 28 or 30. The cabin information includes no smoking indications, seatbelt indications, and electronics usage indications in some embodiments.
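The swipe-based paging described above reduces to advancing or rewinding a page index, clamped to the available pages. A tiny hypothetical sketch (direction labels and function name are invented for illustration):

```python
def apply_swipe(page_index, direction, page_count):
    """Advance or rewind a window's menu page on a swipe gesture, clamped
    to the available pages. A sketch of the paging behavior described."""
    step = {"left": 1, "right": -1}.get(direction, 0)
    return max(0, min(page_count - 1, page_index + step))
```

Clamping rather than wrapping avoids a swipe past the last page silently jumping back to the first, which could be disorienting in a head-up context.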


Although exemplary embodiments are described with respect to cockpit environments, the display technology described herein can be utilized in other environments. While the detailed drawings, specific examples, detailed algorithms, and particular configurations given describe preferred and exemplary embodiments, they serve the purpose of illustration only. The inventive concepts disclosed herein are not limited to the specific forms shown. For example, the methods may be performed in any of a variety of sequences of steps or according to any of a variety of computer sequences. The hardware and software configurations shown and described may differ depending on the chosen performance characteristics and physical characteristics of the image and processing devices. For example, the type of system components and their interconnections may differ. The systems and methods depicted and described are not limited to the precise details and conditions disclosed. The flow charts show exemplary operations only. The specific data types and operations are shown in a non-limiting fashion. Furthermore, other substitutions, modifications, changes, and omissions may be made in the design, operating conditions, and arrangement of the exemplary embodiments without departing from the scope of the invention as expressed in the appended claims.

Claims
  • 1. A head up display for integrating views of conformally mapped symbols, the head up display comprising: a display configured to provide a first image in an environment, the display having a screen at a location in the environment; a computer; and a combiner configured to provide a second image in response to the computer, the second image comprising the conformally mapped symbols, the second image having a window for viewing the first image, wherein the window has a virtual location matching the location of the screen.
  • 2. The head up display of claim 1, wherein the first image is comprised of primary flight data and the combiner receives light from a real world scene.
  • 3. The head up display of claim 1, wherein the first image further comprises at least one additional window associated with a location of another image source.
  • 4. The head up display of claim 1, wherein the combiner is a near eye fixed waveguide combiner or a helmet mounted combiner.
  • 5. The head up display of claim 1, further comprising: a head tracking device, wherein the first image is provided in accordance with a head position and wherein the virtual location of the window continues to match the location of the screen when the head position is changed.
  • 6. The head up display of claim 1, further comprising: a user interface configured to sense a gesture and where the computer is configured to provide information in the window in response to the gesture.
  • 7. The head up display of claim 1, wherein the display is a large panel display image source.
  • 8. A method for use with a head up display for integrating views of conformally mapped symbols in an environment, the head up display comprising a display for providing a first image, a computer, and a combiner for providing a second image in response to the computer, the display having a screen at a location in the environment, the method comprising: providing the conformally mapped symbols in the second image, the conformally mapped symbols being mapped to features in the environment; and providing a window in the second image for viewing the first image, wherein the window has a virtual location matching the location of the screen.
  • 9. The method of claim 8, wherein the first image is comprised of primary flight data and the combiner receives light from a real world scene.
  • 10. The method of claim 8, wherein the first image further comprises at least one additional window associated with a location of another image source.
  • 11. The method of claim 8, wherein the combiner is a near eye fixed waveguide combiner, a head worn display or a helmet mounted combiner, and the window is free of the conformally mapped symbols.
  • 12. The method of claim 8, further comprising: a head tracking device, wherein the first image is provided in accordance with a head position and wherein the virtual location of the window continues to match the location of the screen when the head position is changed.
  • 13. The method of claim 8, further comprising: a user interface configured to sense a gesture and where the computer is configured to provide information in the window in response to the gesture.
  • 14. The method of claim 8, wherein the display is a large panel display image source.
  • 15. A head up display for integrating views of conformally mapped symbols in an environment and allowing a first image to be seen, the first image being provided by a head down display having a screen at a location in the environment, the head up display comprising: a computer; and a combiner for providing a second image in response to the computer, wherein the conformally mapped symbols are provided in the second image, the conformally mapped symbols being mapped to features in the environment, wherein a window is provided in the second image for viewing the first image, wherein the window has a virtual location matching the location of the screen.
  • 16. The head up display of claim 15, wherein the first image is comprised of primary flight data and the combiner receives light from a real world scene, wherein the conformally mapped symbols are virtually provided through a windshield in the environment.
  • 17. The head up display of claim 15, wherein the first image further comprises at least one additional window associated with a location of another image source.
  • 18. The head up display of claim 15, wherein the combiner is a near eye fixed waveguide combiner, a head worn display or a helmet mounted combiner.
  • 19. The head up display of claim 15, further comprising: a head tracking device, wherein the first image is provided in accordance with a head position and wherein the virtual location of the window continues to match the location of the screen when the head position is changed.
  • 20. The head up display of claim 15, further comprising: a user interface configured to sense a gesture and where the computer is configured to provide information in the window in response to the gesture.
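The claims above describe a window in the combiner's second image whose virtual location continues to match the fixed location of the head down display's screen as head position changes (claims 5, 12, and 19). As an illustration only, and not the patent's implementation, one way to realize that mapping is to re-project the screen's known cockpit-frame corners into display coordinates each time the head tracker reports a new pose; the function name, frame conventions, and virtual-display intrinsics below are assumptions for the sketch.

```python
import numpy as np

def project_window_corners(screen_corners_world, head_pos, head_rot,
                           focal_px, principal_pt):
    """Project the fixed HDD screen corners into combiner display coordinates.

    screen_corners_world: (4, 3) screen corners in the cockpit frame.
    head_pos: (3,) tracked head position, cockpit frame.
    head_rot: (3, 3) rotation matrix, cockpit frame -> head/view frame.
    focal_px, principal_pt: intrinsics of the virtual display projection
    (illustrative values; a real HUD calibration would supply these).
    """
    # Express the screen corners in the viewer's frame for the current pose.
    pts_view = (head_rot @ (screen_corners_world - head_pos).T).T
    # Pinhole projection; assumes +z is the viewing direction.
    uv = pts_view[:, :2] / pts_view[:, 2:3] * focal_px + principal_pt
    return uv
```

Calling this once per tracker update and drawing the window at the returned corners keeps the window's virtual location registered to the screen as the head moves, which is the behavior the head-tracking claims recite.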
US Referenced Citations (568)
Number Name Date Kind
2141884 Sonnefeld Dec 1938 A
3620601 Waghorn Nov 1971 A
3851303 Muller Nov 1974 A
3885095 Wolfson et al. May 1975 A
3940204 Withrington Feb 1976 A
4082432 Kirschner Apr 1978 A
4099841 Ellis Jul 1978 A
4178074 Heller Dec 1979 A
4218111 Withrington et al. Aug 1980 A
4232943 Rogers Nov 1980 A
4309070 St. Leger Searle Jan 1982 A
4647967 Kirschner et al. Mar 1987 A
4711512 Upatnieks Dec 1987 A
4714320 Banbury Dec 1987 A
4743083 Schimpe May 1988 A
4749256 Bell et al. Jun 1988 A
4775218 Wood et al. Oct 1988 A
4799765 Ferrer Jan 1989 A
4854688 Hayford et al. Aug 1989 A
4928301 Smoot May 1990 A
4946245 Chamberlin et al. Aug 1990 A
5007711 Wood et al. Apr 1991 A
5035734 Honkanen et al. Jul 1991 A
5076664 Migozzi Dec 1991 A
5079416 Filipovich Jan 1992 A
5117285 Nelson et al. May 1992 A
5124821 Antier et al. Jun 1992 A
5148302 Nagano et al. Sep 1992 A
5151958 Honkanen Sep 1992 A
5153751 Ishikawa et al. Oct 1992 A
5159445 Gitlin et al. Oct 1992 A
5160523 Honkanen et al. Nov 1992 A
5183545 Branca et al. Feb 1993 A
5187597 Kato et al. Feb 1993 A
5210624 Matsumoto et al. May 1993 A
5218360 Goetz et al. Jun 1993 A
5243413 Gitlin et al. Sep 1993 A
5289315 Makita et al. Feb 1994 A
5295208 Caulfield et al. Mar 1994 A
5302964 Lewins Apr 1994 A
5303085 Rallison Apr 1994 A
5317405 Kuriki et al. May 1994 A
5341230 Smith Aug 1994 A
5351151 Levy Sep 1994 A
5359362 Lewis et al. Oct 1994 A
5363220 Kuwayama et al. Nov 1994 A
5369511 Amos Nov 1994 A
5400069 Braun et al. Mar 1995 A
5408346 Trissel et al. Apr 1995 A
5418584 Larson May 1995 A
5438357 McNelley Aug 1995 A
5455693 Wreede et al. Oct 1995 A
5471326 Hall et al. Nov 1995 A
5473222 Thoeny et al. Dec 1995 A
5496621 Makita et al. Mar 1996 A
5500671 Andersson et al. Mar 1996 A
5510913 Hashimoto et al. Apr 1996 A
5515184 Caulfield et al. May 1996 A
5524272 Podowski et al. Jun 1996 A
5532736 Kuriki et al. Jul 1996 A
5537232 Biles Jul 1996 A
5572248 Allen et al. Nov 1996 A
5579026 Tabata Nov 1996 A
5583795 Smyth Dec 1996 A
5604611 Saburi et al. Feb 1997 A
5606433 Yin et al. Feb 1997 A
5612733 Flohr Mar 1997 A
5612734 Nelson et al. Mar 1997 A
5619254 McNelley Apr 1997 A
5629259 Akada et al. May 1997 A
5631107 Tarumi et al. May 1997 A
5633100 Mickish et al. May 1997 A
5646785 Gilboa et al. Jul 1997 A
5648857 Ando et al. Jul 1997 A
5661577 Jenkins et al. Aug 1997 A
5661603 Hanano et al. Aug 1997 A
5665494 Kawabata et al. Sep 1997 A
5668907 Veligdan Sep 1997 A
5682255 Friesem et al. Oct 1997 A
5694230 Welch Dec 1997 A
5701132 Kollin et al. Dec 1997 A
5706108 Ando et al. Jan 1998 A
5707925 Akada et al. Jan 1998 A
5724189 Ferrante Mar 1998 A
5726782 Kato et al. Mar 1998 A
5727098 Jacobson Mar 1998 A
5729242 Margerum et al. Mar 1998 A
5731060 Hirukawa et al. Mar 1998 A
5731853 Taketomi et al. Mar 1998 A
5742262 Tabata et al. Apr 1998 A
5751452 Tanaka et al. May 1998 A
5760931 Saburi et al. Jun 1998 A
5764414 King et al. Jun 1998 A
5790288 Jager et al. Aug 1998 A
5812608 Valimaki et al. Sep 1998 A
5822127 Chen et al. Oct 1998 A
5841507 Barnes Nov 1998 A
5856842 Tedesco Jan 1999 A
5868951 Schuck et al. Feb 1999 A
5892598 Asakawa et al. Apr 1999 A
5898511 Mizutani et al. Apr 1999 A
5903395 Rallison et al. May 1999 A
5907416 Hegg et al. May 1999 A
5907436 Perry et al. May 1999 A
5917459 Son et al. Jun 1999 A
5926147 Sehm et al. Jul 1999 A
5929946 Sharp et al. Jul 1999 A
5937115 Domash Aug 1999 A
5942157 Sutherland et al. Aug 1999 A
5945893 Plessky et al. Aug 1999 A
5949302 Sarkka Sep 1999 A
5966223 Friesem et al. Oct 1999 A
5985422 Krauter Nov 1999 A
5991085 Rallison Nov 1999 A
5991087 Rallison Nov 1999 A
5999314 Asakura et al. Dec 1999 A
6042947 Asakura et al. Mar 2000 A
6043585 Plessky et al. Mar 2000 A
6075626 Mizutani et al. Jun 2000 A
6078427 Fontaine et al. Jun 2000 A
6115152 Popovich et al. Sep 2000 A
6127066 Ueda et al. Oct 2000 A
6137630 Tsou et al. Oct 2000 A
6169613 Amitai et al. Jan 2001 B1
6176837 Foxlin Jan 2001 B1
6195206 Yona et al. Feb 2001 B1
6222675 Mall et al. Apr 2001 B1
6222971 Veligdan et al. Apr 2001 B1
6249386 Yona et al. Jun 2001 B1
6259423 Tokito et al. Jul 2001 B1
6259559 Kobayashi et al. Jul 2001 B1
6285813 Schultz et al. Sep 2001 B1
6317083 Johnson et al. Nov 2001 B1
6317227 Mizutani et al. Nov 2001 B1
6321069 Piirainen Nov 2001 B1
6327089 Hosaki et al. Dec 2001 B1
6333819 Svedenkrans Dec 2001 B1
6340540 Ueda et al. Jan 2002 B1
6351333 Araki et al. Feb 2002 B2
6356172 Koivisto et al. Mar 2002 B1
6359730 Tervonen Mar 2002 B2
6359737 Stringfellow Mar 2002 B1
6366378 Tervonen et al. Apr 2002 B1
6392812 Howard May 2002 B1
6409687 Foxlin Jun 2002 B1
6470132 Nousiainen et al. Oct 2002 B1
6486997 Bruzzone et al. Nov 2002 B1
6504518 Kuwayama et al. Jan 2003 B1
6524771 Maeda et al. Feb 2003 B2
6545778 Ono et al. Apr 2003 B2
6550949 Bauer et al. Apr 2003 B1
6557413 Nieminen et al. May 2003 B2
6563648 Gleckman et al. May 2003 B2
6580529 Amitai et al. Jun 2003 B1
6583873 Goncharov et al. Jun 2003 B1
6587619 Kinoshita Jul 2003 B1
6598987 Parikka Jul 2003 B1
6608720 Freeman Aug 2003 B1
6611253 Cohen Aug 2003 B1
6646810 Harter et al. Nov 2003 B2
6661578 Hedrick Dec 2003 B2
6674578 Sugiyama et al. Jan 2004 B2
6686815 Mirshekarl-Syahkal et al. Feb 2004 B1
6690516 Aritake et al. Feb 2004 B2
6721096 Bruzzone et al. Apr 2004 B2
6741189 Gibbons, II et al. May 2004 B1
6744478 Asakura et al. Jun 2004 B1
6748342 Dickhaus Jun 2004 B1
6750941 Satoh et al. Jun 2004 B2
6750995 Dickson Jun 2004 B2
6757105 Niv et al. Jun 2004 B2
6771403 Endo et al. Aug 2004 B1
6776339 Piikivi Aug 2004 B2
6781701 Sweetser et al. Aug 2004 B1
6805490 Levola Oct 2004 B2
6825987 Repetto et al. Nov 2004 B2
6829095 Amitai Dec 2004 B2
6833955 Niv Dec 2004 B2
6836369 Fujikawa et al. Dec 2004 B2
6844212 Bond et al. Jan 2005 B2
6844980 He et al. Jan 2005 B2
6847274 Salmela et al. Jan 2005 B2
6847488 Travis Jan 2005 B2
6853491 Ruhle et al. Feb 2005 B1
6864861 Schehrer et al. Mar 2005 B2
6864927 Cathey Mar 2005 B1
6885483 Takada Apr 2005 B2
6903872 Schrader Jun 2005 B2
6909345 Salmela et al. Jun 2005 B1
6917375 Akada et al. Jul 2005 B2
6922267 Endo et al. Jul 2005 B2
6926429 Barlow et al. Aug 2005 B2
6940361 Jokio et al. Sep 2005 B1
6950173 Sutherland et al. Sep 2005 B1
6950227 Schrader Sep 2005 B2
6951393 Koide Oct 2005 B2
6952312 Weber et al. Oct 2005 B2
6958662 Salmela et al. Oct 2005 B1
6987908 Bond et al. Jan 2006 B2
7003187 Frick et al. Feb 2006 B2
7018744 Otaki et al. Mar 2006 B2
7021777 Amitai Apr 2006 B2
7026892 Kajiya Apr 2006 B2
7027671 Huck et al. Apr 2006 B2
7034748 Kajiya Apr 2006 B2
7053735 Salmela et al. May 2006 B2
7058434 Wang et al. Jun 2006 B2
7095562 Peng et al. Aug 2006 B1
7101048 Travis Sep 2006 B2
7110184 Yona et al. Sep 2006 B1
7123418 Weber et al. Oct 2006 B2
7126418 Hunton et al. Oct 2006 B2
7126583 Breed Oct 2006 B1
7132200 Ueda et al. Nov 2006 B1
7149385 Parikka et al. Dec 2006 B2
7151246 Fein et al. Dec 2006 B2
7158095 Jenson et al. Jan 2007 B2
7181105 Teramura et al. Feb 2007 B2
7181108 Levola Feb 2007 B2
7184615 Levola Feb 2007 B2
7190849 Katase Mar 2007 B2
7199934 Yamasaki Apr 2007 B2
7205960 David Apr 2007 B2
7205964 Yokoyama et al. Apr 2007 B1
7206107 Levola Apr 2007 B2
7230767 Walck et al. Jun 2007 B2
7242527 Spitzer et al. Jul 2007 B2
7248128 Mattila et al. Jul 2007 B2
7259906 Islam Aug 2007 B1
7268946 Wang Sep 2007 B2
7285903 Cull et al. Oct 2007 B2
7286272 Mukawa Oct 2007 B2
7289069 Ranta Oct 2007 B2
7299983 Piikivi Nov 2007 B2
7313291 Okhotnikov et al. Dec 2007 B2
7319573 Nishiyama Jan 2008 B2
7320534 Sugikawa et al. Jan 2008 B2
7323275 Otaki et al. Jan 2008 B2
7336271 Ozeki et al. Feb 2008 B2
7339737 Urey et al. Mar 2008 B2
7339742 Amitai et al. Mar 2008 B2
7375870 Schorpp May 2008 B2
7391573 Amitai Jun 2008 B2
7394865 Borran et al. Jul 2008 B2
7395181 Foxlin Jul 2008 B2
7397606 Peng et al. Jul 2008 B1
7401920 Kranz et al. Jul 2008 B1
7404644 Evans et al. Jul 2008 B2
7410286 Travis Aug 2008 B2
7411637 Weiss Aug 2008 B2
7415173 Kassamakov et al. Aug 2008 B2
7418170 Mukawa et al. Aug 2008 B2
7433116 Islam Oct 2008 B1
7436568 Kuykendall, Jr. Oct 2008 B1
7454103 Parriaux Nov 2008 B2
7457040 Amitai Nov 2008 B2
7466994 Pihlaja et al. Dec 2008 B2
7479354 Ueda et al. Jan 2009 B2
7480215 Makela et al. Jan 2009 B2
7482996 Larson et al. Jan 2009 B2
7483604 Levola Jan 2009 B2
7492512 Niv et al. Feb 2009 B2
7496293 Shamir et al. Feb 2009 B2
7500104 Goland Mar 2009 B2
7528385 Volodin et al. May 2009 B2
7545429 Travis Jun 2009 B2
7550234 Otaki et al. Jun 2009 B2
7567372 Schorpp Jul 2009 B2
7570429 Maliah et al. Aug 2009 B2
7572555 Takizawa et al. Aug 2009 B2
7573640 Nivon et al. Aug 2009 B2
7576916 Amitai Aug 2009 B2
7577326 Amitai Aug 2009 B2
7579119 Ueda et al. Aug 2009 B2
7588863 Takizawa et al. Sep 2009 B2
7589900 Powell Sep 2009 B1
7589901 Dejong et al. Sep 2009 B2
7592988 Katase Sep 2009 B2
7593575 Houle et al. Sep 2009 B2
7597447 Larson et al. Oct 2009 B2
7599012 Nakamura et al. Oct 2009 B2
7600893 Laino et al. Oct 2009 B2
7602552 Blumenfeld Oct 2009 B1
7616270 Hirabayashi et al. Nov 2009 B2
7618750 Ueda et al. Nov 2009 B2
7629086 Otaki et al. Dec 2009 B2
7639911 Lee et al. Dec 2009 B2
7643214 Amitai Jan 2010 B2
7656585 Powell et al. Feb 2010 B1
7660047 Travis et al. Feb 2010 B1
7672055 Amitai Mar 2010 B2
7710654 Ashkenazi et al. May 2010 B2
7724441 Amitai May 2010 B2
7724442 Amitai May 2010 B2
7724443 Amitai May 2010 B2
7733572 Brown et al. Jun 2010 B1
7747113 Mukawa et al. Jun 2010 B2
7751122 Amitai Jul 2010 B2
7764413 Levola Jul 2010 B2
7777819 Simmonds Aug 2010 B2
7778305 Parriaux et al. Aug 2010 B2
7778508 Hirayama Aug 2010 B2
7847235 Krupkin et al. Dec 2010 B2
7864427 Korenaga et al. Jan 2011 B2
7865080 Hecker et al. Jan 2011 B2
7872804 Moon et al. Jan 2011 B2
7884985 Amitai et al. Feb 2011 B2
7887186 Watanabe Feb 2011 B2
7903921 Ostergard Mar 2011 B2
7907342 Simmonds et al. Mar 2011 B2
7920787 Gentner et al. Apr 2011 B2
7944428 Travis May 2011 B2
7969644 Tilleman et al. Jun 2011 B2
7970246 Travis et al. Jun 2011 B2
7976208 Travis Jul 2011 B2
7999982 Endo et al. Aug 2011 B2
8000491 Brodkin et al. Aug 2011 B2
8004765 Amitai Aug 2011 B2
8016475 Travis Sep 2011 B2
8022942 Bathiche et al. Sep 2011 B2
RE42992 David Dec 2011 E
8079713 Ashkenazi Dec 2011 B2
8082222 Rangarajan et al. Dec 2011 B2
8086030 Gordon et al. Dec 2011 B2
8089568 Brown et al. Jan 2012 B1
8107023 Simmonds et al. Jan 2012 B2
8107780 Simmonds Jan 2012 B2
8132948 Owen et al. Mar 2012 B2
8132976 Odell et al. Mar 2012 B2
8136690 Fang et al. Mar 2012 B2
8137981 Andrew et al. Mar 2012 B2
8149086 Klein et al. Apr 2012 B2
8152315 Travis et al. Apr 2012 B2
8155489 Saarikko et al. Apr 2012 B2
8159752 Wertheim et al. Apr 2012 B2
8160409 Large Apr 2012 B2
8160411 Levola et al. Apr 2012 B2
8186874 Sinbar et al. May 2012 B2
8188925 Dejean May 2012 B2
8189263 Wang et al. May 2012 B1
8189973 Travis et al. May 2012 B2
8199803 Hauske et al. Jun 2012 B2
8213065 Mukawa Jul 2012 B2
8233204 Robbins et al. Jul 2012 B1
8253914 Kajiya et al. Aug 2012 B2
8254031 Levola Aug 2012 B2
8295710 Marcus Oct 2012 B2
8301031 Gentner et al. Oct 2012 B2
8305577 Kivioja et al. Nov 2012 B2
8306423 Gottwald et al. Nov 2012 B2
8314819 Kimmel et al. Nov 2012 B2
8321810 Heintze Nov 2012 B2
8335040 Mukawa et al. Dec 2012 B2
8351744 Travis et al. Jan 2013 B2
8354806 Travis et al. Jan 2013 B2
8355610 Simmonds Jan 2013 B2
8369019 Baker et al. Feb 2013 B2
8384694 Powell et al. Feb 2013 B2
8398242 Yamamoto et al. Mar 2013 B2
8403490 Sugiyama et al. Mar 2013 B2
8422840 Large Apr 2013 B2
8427439 Larsen et al. Apr 2013 B2
8432363 Saarikko et al. Apr 2013 B2
8432372 Butler et al. Apr 2013 B2
8447365 Imanuel May 2013 B1
8472119 Kelly Jun 2013 B1
8477261 Travis et al. Jul 2013 B2
8491121 Tilleman et al. Jul 2013 B2
8491136 Travis et al. Jul 2013 B2
8493366 Bathiche et al. Jul 2013 B2
8493662 Noui Jul 2013 B2
8508848 Saarikko Aug 2013 B2
8547638 Levola Oct 2013 B2
8578038 Kaikuranta et al. Nov 2013 B2
8581831 Travis Nov 2013 B2
8582206 Travis Nov 2013 B2
8593734 Laakkonen Nov 2013 B2
8611014 Valera et al. Dec 2013 B2
8619062 Powell et al. Dec 2013 B2
8633786 Ermolov et al. Jan 2014 B2
8639072 Popovich et al. Jan 2014 B2
8643691 Rosenfeld et al. Feb 2014 B2
8649099 Schultz et al. Feb 2014 B2
8654420 Simmonds Feb 2014 B2
8659826 Brown et al. Feb 2014 B1
8670029 McEldowney Mar 2014 B2
8693087 Nowatzyk et al. Apr 2014 B2
8736802 Kajiya et al. May 2014 B2
8736963 Robbins et al. May 2014 B2
8749886 Gupta Jun 2014 B2
8767294 Chen et al. Jul 2014 B2
8810600 Bohn et al. Aug 2014 B2
8814691 Haddick et al. Aug 2014 B2
8830584 Saarikko et al. Sep 2014 B2
8830588 Brown et al. Sep 2014 B1
8913324 Schrader Dec 2014 B2
8938141 Magnusson Jan 2015 B2
8964298 Haddick et al. Feb 2015 B2
9097890 Miller et al. Aug 2015 B2
9244280 Tiana et al. Jan 2016 B1
9456744 Popovich et al. Oct 2016 B2
9523852 Brown et al. Dec 2016 B1
20020012064 Yamaguchi Jan 2002 A1
20020021461 Ono et al. Feb 2002 A1
20020131175 Yagi et al. Sep 2002 A1
20020171940 He Nov 2002 A1
20030030912 Gleckman et al. Feb 2003 A1
20030039442 Bond et al. Feb 2003 A1
20030063042 Friesem et al. Apr 2003 A1
20030149346 Arnone et al. Aug 2003 A1
20030214460 Kovacs Nov 2003 A1
20030228019 Eichler et al. Dec 2003 A1
20040013314 Peli Jan 2004 A1
20040089842 Sutherland et al. May 2004 A1
20040130797 Leigh Travis Jul 2004 A1
20040188617 Devitt et al. Sep 2004 A1
20040208446 Bond et al. Oct 2004 A1
20040208466 Mossberg et al. Oct 2004 A1
20050135747 Greiner et al. Jun 2005 A1
20050136260 Garcia Jun 2005 A1
20050259302 Metz et al. Nov 2005 A9
20050269481 David et al. Dec 2005 A1
20060093793 Miyakawa et al. May 2006 A1
20060114564 Sutherland et al. Jun 2006 A1
20060119916 Sutherland et al. Jun 2006 A1
20060132914 Weiss et al. Jun 2006 A1
20060215244 Yosha et al. Sep 2006 A1
20060221448 Nivon et al. Oct 2006 A1
20060228073 Mukawa et al. Oct 2006 A1
20060279662 Kapellner et al. Dec 2006 A1
20060291021 Mukawa Dec 2006 A1
20070019152 Caputo et al. Jan 2007 A1
20070019297 Stewart et al. Jan 2007 A1
20070041684 Popovich et al. Feb 2007 A1
20070045596 King et al. Mar 2007 A1
20070089625 Grinberg et al. Apr 2007 A1
20070133920 Lee et al. Jun 2007 A1
20070133983 Traff Jun 2007 A1
20070188837 Shimizu et al. Aug 2007 A1
20070211164 Olsen et al. Sep 2007 A1
20080043334 Itzkovitch et al. Feb 2008 A1
20080106775 Amitai et al. May 2008 A1
20080136923 Inbar et al. Jun 2008 A1
20080151379 Amitai Jun 2008 A1
20080158096 Breed Jul 2008 A1
20080186604 Amitai Aug 2008 A1
20080198471 Amitai Aug 2008 A1
20080278812 Amitai Nov 2008 A1
20080285140 Amitai Nov 2008 A1
20080309586 Vitale Dec 2008 A1
20090017424 Yoeli et al. Jan 2009 A1
20090019222 Verma et al. Jan 2009 A1
20090052046 Amitai Feb 2009 A1
20090052047 Amitai Feb 2009 A1
20090067774 Magnusson Mar 2009 A1
20090097122 Niv Apr 2009 A1
20090097127 Amitai Apr 2009 A1
20090121301 Chang May 2009 A1
20090122413 Hoffman et al. May 2009 A1
20090122414 Amitai May 2009 A1
20090128902 Niv et al. May 2009 A1
20090128911 Itzkovitch et al. May 2009 A1
20090153437 Aharoni Jun 2009 A1
20090190222 Simmonds et al. Jul 2009 A1
20090213208 Glatt Aug 2009 A1
20090237804 Amitai et al. Sep 2009 A1
20090284552 Larson Nov 2009 A1
20090303599 Levola Dec 2009 A1
20090316246 Asai et al. Dec 2009 A1
20100039796 Mukawa Feb 2010 A1
20100060551 Sugiyama et al. Mar 2010 A1
20100060990 Wertheim et al. Mar 2010 A1
20100079865 Saarikko et al. Apr 2010 A1
20100092124 Magnusson et al. Apr 2010 A1
20100096562 Klunder et al. Apr 2010 A1
20100103078 Mukawa et al. Apr 2010 A1
20100136319 Imai et al. Jun 2010 A1
20100141555 Rorberg et al. Jun 2010 A1
20100165465 Levola Jul 2010 A1
20100171680 Lapidot et al. Jul 2010 A1
20100177388 Cohen et al. Jul 2010 A1
20100214659 Levola Aug 2010 A1
20100231693 Levola Sep 2010 A1
20100231705 Yahav Sep 2010 A1
20100232003 Baldy et al. Sep 2010 A1
20100246004 Simmonds Sep 2010 A1
20100246993 Rieger et al. Sep 2010 A1
20100265117 Weiss Oct 2010 A1
20100277803 Pockett et al. Nov 2010 A1
20100284085 Laakkonen Nov 2010 A1
20100296163 Saarikko Nov 2010 A1
20100315719 Saarikko et al. Dec 2010 A1
20100321781 Levola et al. Dec 2010 A1
20110013423 Selbrede et al. Jan 2011 A1
20110019250 Aiki et al. Jan 2011 A1
20110019874 Jarvenpaa et al. Jan 2011 A1
20110026128 Baker et al. Feb 2011 A1
20110026774 Flohr et al. Feb 2011 A1
20110038024 Wang et al. Feb 2011 A1
20110050548 Blumenfeld et al. Mar 2011 A1
20110096401 Levola Apr 2011 A1
20110157707 Tilleman et al. Jun 2011 A1
20110164221 Tilleman et al. Jul 2011 A1
20110211239 Mukawa et al. Sep 2011 A1
20110235179 Simmonds Sep 2011 A1
20110235365 McCollum et al. Sep 2011 A1
20110238399 Ophir et al. Sep 2011 A1
20110242349 Izuha et al. Oct 2011 A1
20110242661 Simmonds Oct 2011 A1
20110242670 Simmonds Oct 2011 A1
20110310356 Vallius Dec 2011 A1
20120007979 Schneider et al. Jan 2012 A1
20120033306 Valera et al. Feb 2012 A1
20120044572 Simmonds et al. Feb 2012 A1
20120044573 Simmonds et al. Feb 2012 A1
20120062850 Travis Mar 2012 A1
20120099203 Boubis et al. Apr 2012 A1
20120105634 Meidan et al. May 2012 A1
20120120493 Simmonds et al. May 2012 A1
20120127577 Desserouer May 2012 A1
20120212398 Border Aug 2012 A1
20120224062 Lacoste et al. Sep 2012 A1
20120235884 Miller et al. Sep 2012 A1
20120235900 Border et al. Sep 2012 A1
20120242661 Takagi et al. Sep 2012 A1
20120280956 Yamamoto et al. Nov 2012 A1
20120300311 Simmonds et al. Nov 2012 A1
20120320460 Levola Dec 2012 A1
20130069850 Mukawa et al. Mar 2013 A1
20130088412 Helot et al. Apr 2013 A1
20130093893 Schofield et al. Apr 2013 A1
20130101253 Popovich et al. Apr 2013 A1
20130138275 Nauman et al. May 2013 A1
20130141937 Katsuta et al. Jun 2013 A1
20130170031 Bohn et al. Jul 2013 A1
20130184904 Gadzinski Jul 2013 A1
20130200710 Robbins Aug 2013 A1
20130249895 Westerinen et al. Sep 2013 A1
20130250207 Bohn Sep 2013 A1
20130257848 Westerinen et al. Oct 2013 A1
20130258701 Westerinen et al. Oct 2013 A1
20130314793 Robbins et al. Nov 2013 A1
20130322810 Robbins Dec 2013 A1
20130328948 Kunkel et al. Dec 2013 A1
20140043689 Mason Feb 2014 A1
20140104665 Popovich et al. Apr 2014 A1
20140104685 Bohn et al. Apr 2014 A1
20140140653 Brown et al. May 2014 A1
20140140654 Brown et al. May 2014 A1
20140146394 Tout et al. May 2014 A1
20140152778 Ihlenburg et al. Jun 2014 A1
20140168055 Smith Jun 2014 A1
20140168260 O'Brien et al. Jun 2014 A1
20140168735 Yuan et al. Jun 2014 A1
20140172296 Shtukater Jun 2014 A1
20140176528 Robbins Jun 2014 A1
20140204455 Popovich et al. Jul 2014 A1
20140211322 Bohn et al. Jul 2014 A1
20140218801 Simmonds et al. Aug 2014 A1
20140300966 Travers et al. Oct 2014 A1
20150010265 Popovich et al. Jan 2015 A1
20150167868 Boncha Jun 2015 A1
20150177688 Popovich et al. Jun 2015 A1
20150277375 Large et al. Oct 2015 A1
20150289762 Popovich et al. Oct 2015 A1
20150316768 Simmonds Nov 2015 A1
20160209657 Popovich et al. Jul 2016 A1
20180295350 Liu Oct 2018 A1
Foreign Referenced Citations (37)
Number Date Country
200944140 Sep 2007 CN
101881936 Nov 2010 CN
102057228 May 2011 CN
102822723 Dec 2012 CN
102866710 Jan 2013 CN
103620478 Mar 2014 CN
10 2006 003 785 Jul 2007 DE
0 822 441 Feb 1998 EP
2 110 701 Oct 2009 EP
2 225 592 Sep 2010 EP
2 381 290 Oct 2011 EP
2 733 517 May 2014 EP
2259278 Aug 1975 FR
2677463 Dec 1992 FR
2 115 178 Sep 1983 GB
2004-157245 Jun 2004 JP
2006-350129 Dec 2006 JP
2007-219106 Aug 2007 JP
WO-9952002 Oct 1999 WO
WO-03081320 Oct 2003 WO
WO-2006002870 Jan 2006 WO
WO-2007130130 Nov 2007 WO
WO-2007130130 Nov 2007 WO
WO-2009013597 Jan 2009 WO
WO-2009077802 Jun 2009 WO
WO-2010067114 Jun 2010 WO
WO-2010067117 Jun 2010 WO
WO-2010125337 Nov 2010 WO
WO-2010125337 Nov 2010 WO
WO-2011012825 Feb 2011 WO
WO-2011051660 May 2011 WO
WO-2011055109 May 2011 WO
WO-2011107831 Sep 2011 WO
WO-2013027006 Feb 2013 WO
WO-2013033274 Mar 2013 WO
WO-2013163347 Oct 2013 WO
WO-2014091200 Jun 2014 WO
Non-Patent Literature Citations (110)
Entry
U.S. Appl. No. 14/225,062, filed Mar. 25, 2014, Carlo L. Tiana et al.
U.S. Appl. No. 14/754,368, filed Jun. 29, 2015, Kenneth A. Zimmerman et al.
Amendment and Reply for U.S. Appl. No. 12/571,262, dated Dec. 16, 2011, 7 pages.
Amitai, Y., et al. “Visor-display design based on planar holographic optics,” Applied Optics, vol. 34, No. 8, Mar. 10, 1995, pp. 1352-6.
Ayras, et al., “Exit pupil expander with a large field of view based on diffractive optics”, Journal of the Society for Information Display, 17/8, 2009, pp. 659-664.
Cameron, A., The Application of Holographic Optical Waveguide Technology to Q-Sight Family of Helmet Mounted Displays, Proc. of SPIE, vol. 7326, 7326OH-1, 2009, 11 pages.
Caputo, R. et al., POLICRYPS Switchable Holographic Grating: A Promising Grating Electro-Optical Pixel for High Resolution Display Application; Journal of Display Technology, vol. 2, No. 1, Mar. 2006, pp. 38-51, 14 pages.
Crawford, “Switchable Bragg Gratings”, Optics & Photonics News, Apr. 2003, pp. 54-59.
Extended European Search Report for EP Application No. 13192383, dated Apr. 2, 2014, 7 pages.
Final Office Action in U.S. Appl. No. 13/864,991, dated Apr. 2, 2015, 16 pages.
Final Office Action on U.S. Appl. No. 13/869,866 dated Oct. 3, 2014, 17 pages.
Final Office Action on U.S. Appl. No. 13/250,858 dated Feb. 4, 2015, 18 pages.
Final Office Action on U.S. Appl. No. 13/250,940 dated Oct. 17, 2014, 15 pages.
Final Office Action on U.S. Appl. No. 13/892,026 dated Apr. 3, 2015, 17 pages.
Final Office Action on U.S. Appl. No. 13/892,057 dated Mar. 5, 2015, 21 pages.
Final Office Action on U.S. Appl. No. 14/038,400 dated Aug. 10, 2015, 32 pages.
First office action received in Chinese patent application No. 201380001530.1, dated Jun. 30, 2015, 9 pages with English translation.
International Preliminary Report on Patentability for PCT Application No. PCT/US2013/038070, dated Oct. 28, 2014, 6 pages.
International Search Report and Written Opinion regarding PCT/US2013/038070, dated Aug. 14, 2013, 14 pages.
Irie, Masahiro, Photochromic diarylethenes for photonic devices, Pure and Applied Chemistry, 1996, pp. 1367-1371, vol. 68, No. 7, IUPAC.
Levola, et al., “Replicated slanted gratings with a high refractive index material for in and outcoupling of light” Optics Express, vol. 15, Issue 5, pp. 2067-2074 (2007).
Moffitt, “Head-Mounted Display Image Configurations”, retrieved from the internet at http://www.kirkmoffitt.com/hmd_image_configurations.pdf on Dec. 19, 2014, dated May 2008, 25 pages.
Non-Final Office Action on U.S. Appl. No. 13/869,866 dated Jul. 22, 2015, 28 pages.
Non-Final Office Action on U.S. Appl. No. 13/892,026 dated Aug. 6, 2015, 22 pages.
Non-Final Office Action on U.S. Appl. No. 13/892,057 dated Jul. 30, 2015, 29 pages.
Non-Final Office Action on U.S. Appl. No. 13/250,858 dated Jun. 12, 2015, 20 pages.
Non-Final Office Action on U.S. Appl. No. 13/250,858 dated Sep. 15, 2014, 16 pages.
Non-Final Office Action on U.S. Appl. No. 13/250,940 dated Mar. 18, 2015, 17 pages.
Non-Final Office Action on U.S. Appl. No. 13/432,662 dated May 27, 2015, 15 pages.
Non-Final Office Action on U.S. Appl. No. 13/844,456 dated Apr. 1, 2015, 16 Pages.
Non-Final Office Action on U.S. Appl. No. 13/864,991 dated Oct. 22, 2014, 16 pages.
Non-Final Office Action on U.S. Appl. No. 13/869,866 dated May 28, 2014, 16 pages.
Non-Final Office Action on U.S. Appl. No. 14/038,400 dated Feb. 5, 2015, 18 pages.
Non-Final Office Action on U.S. Appl. No. 14/044,676 dated Apr. 9, 2015, 13 pages.
Non-Final Office Action on U.S. Appl. No. 14/109,551 dated Jul. 14, 2015, 32 pages.
Non-Final Office Action on U.S. Appl. No. 14/152,756, dated Aug. 25, 2015, 39 pages.
Non-Final Office Action on U.S. Appl. No. 14/168,173 dated Jun. 22, 2015, 14 pages.
Non-Final Office Action on U.S. Appl. No. 14/225,062 dated May 21, 2015, 11 pages.
Nordin, G., et al., “Diffraction properties of stratified volume holographic optical elements,” Journal of the Optical Society of America A., vol. 9, No. 12, Dec. 1992, pp. 2206-2217, 12 pages.
Notice of Allowance for U.S. Appl. No. 12/700,557, dated Oct. 22, 2013, 9 pages.
Notice of Allowance on U.S. Appl. No. 13/250,970 dated Sep. 16, 2014, 7 pages.
Notice of Allowance on U.S. Appl. No. 13/251,087 dated Jul. 17, 2014, 8 pages.
Notice of Allowance on U.S. Appl. No. 13/355,360 dated Apr. 10, 2014, 7 pages.
Notice of Allowance on U.S. Appl. No. 14/038,400, dated Oct. 30, 2015, 9 pages.
Notice of Allowance on U.S. Appl. No. 14/225,062, dated Dec. 2, 2015, 10 pages.
Office Action for U.S. Appl. No. 12/571,262, dated Sep. 28, 2011, 5 pages.
Office Action for U.S. Appl. No. 12/700,557, dated Aug. 9, 2013, 12 pages.
Office Action for U.S. Appl. No. 12/700,557, dated Feb. 4, 2013, 11 pages.
Office Action for U.S. Appl. No. 13/250,621, dated May 21, 2013, 10 pages.
Office Action for U.S. Appl. No. 13/250,858 dated Feb. 19, 2014, 13 pages.
Office Action for U.S. Appl. No. 13/250,858, dated Oct. 28, 2013, 9 pages.
Office Action for U.S. Appl. No. 13/250,940, dated Aug. 28, 2013, 15 pages.
Office Action for U.S. Appl. No. 13/250,940, dated Mar. 12, 2013, 11 pages.
Office Action for U.S. Appl. No. 13/250,970, dated Jul. 30, 2013, 4 pages.
Office Action for U.S. Appl. No. 13/250,994, dated Sep. 16, 2013, 11 pages.
Office Action for U.S. Appl. No. 13/355,360, dated Sep. 12, 2013, 7 pages.
Office Action on U.S. Appl. No. 13/250,940 dated Mar. 25, 2014, 12 pages.
Office Action on U.S. Appl. No. 13/251,087 dated Mar. 28, 2014, 12 pages.
Office Action on U.S. Appl. No. 13/892,026 dated Dec. 8, 2014, 19 pages.
Office Action on U.S. Appl. No. 13/892,057 dated Nov. 28, 2014, 17 pages.
Office Action, USPTO, U.S. Appl. No. 10/696,507, dated Nov. 13, 2008 (CTX-290US), 15 pages.
Plastic has replaced glass in photochromic lens, www.plastemart.com, 2003, 1 page.
Press Release, “USAF Awards SBG Labs an SBIR Contract for Wide Field of View HUD”, SBG Labs—DigiLens, Apr. 2013, 1 page.
Press Release: “Navy awards SBG Labs a contract for HMDs for simulation and training”, Press releases, DigiLens, Oct. 2012, pp. 1-2, retrieved from the internet at http://www.digilens.com/pr10-2012.2.php. 2 pages.
Requirement for Restriction/Election on U.S. Appl. No. 13/844,456 dated Sep. 12, 2014, 23 pages.
Restriction Requirement for U.S. Appl. No. 12/700,557, dated Oct. 17, 2012, 5 pages.
Schechter, et al., “Compact beam expander with linear gratings”, Applied Optics, vol. 41, No. 7, Mar. 1, 2002, pp. 1236-1240.
Urey, “Diffractive exit pupil expander for display applications” Applied Optics, vol. 40, Issue 32, pp. 5840-5851 (2001).
Webster's Third New International Dictionary 433 (1986), 3 pages.
Wisely, P.L., Head up and head mounted display performance improvements through advanced techniques in the manipulation of light, Proc. of SPIE vol. 7327, 732706-1, 2009, 10 pages.
U.S. Appl. No. 14/814,020, filed Jul. 30, 2015, Brown et al.
Chinese Office Action issued in corresponding application No. 201310557623, dated Jan. 17, 2017, 10 pages.
Extended European Search Report for European Application No. 13765610.4 dated Feb. 16, 2016, 6 pages.
Final Office Action on U.S. Appl. No. 13/250,858, dated Jul. 11, 2016, 21 pages.
Final Office Action on U.S. Appl. No. 13/864,991, dated Jun. 27, 2016, 16 pages.
Final Office Action on U.S. Appl. No. 14/044,676, dated Aug. 12, 2016, 23 pages.
Final Office Action on U.S. Appl. No. 14/152,756, dated Oct. 12, 2016, 18 pages.
Final Office Action on U.S. Appl. No. 14/168,173, dated Nov. 4, 2015, 10 pages.
Final Office Action on U.S. Appl. No. 14/260,943, dated Jul. 19, 2016, 23 pages.
Non-final Office Action on U.S. Appl. No. 13/250,858, dated Nov. 14, 2016, 18 pages.
Non-Final Office Action on U.S. Appl. No. 13/844,456, dated Aug. 16, 2016, 18 pages.
Non-Final Office Action on U.S. Appl. No. 13/844,456, dated Dec. 29, 2016, 24 pages.
Non-Final Office Action on U.S. Appl. No. 13/844,456, dated Jan. 15, 2016, 16 Pages.
Non-Final Office Action on U.S. Appl. No. 13/864,991 dated Nov. 30, 2015, 18 pages.
Non-Final Office Action on U.S. Appl. No. 13/892,026 dated Mar. 22, 2016, 16 pages.
Non-Final Office Action on U.S. Appl. No. 13/892,057, dated May 16, 2016, 23 pages.
Non-Final Office Action on U.S. Appl. No. 14/044,676, dated Dec. 29, 2016, 26 pages.
Non-Final Office Action on U.S. Appl. No. 14/044,676, dated Jan. 20, 2016, 21 pages.
Non-Final Office Action on U.S. Appl. No. 14/152,756, dated Apr. 26, 2016, 17 pages.
Non-Final Office Action on U.S. Appl. No. 14/168,173 dated Mar. 10, 2016, 9 pages.
Non-Final Office Action on U.S. Appl. No. 14/260,943 dated Feb. 3, 2016, 19 pages.
Non-Final Office Action on U.S. Appl. No. 14/465,763, dated Sep. 29, 2016, 4 pages.
Non-Final Office Action on U.S. Appl. No. 14/497,280, dated Sep. 22, 2016, 15 pages.
Non-Final Office Action on U.S. Appl. No. 14/820,237, dated Aug. 5, 2016, 14 pages.
Non-Final Office Action on U.S. Appl. No. 15/005,507, dated Nov. 22, 2016, 7 pages.
Non-Final Office Action on U.S. Appl. No. 13/250,858, dated Mar. 18, 2016, 20 pages.
Notice of Allowance on U.S. Appl. No. 13/432,662, dated Feb. 18, 2016, 10 pages.
Notice of Allowance on U.S. Appl. No. 13/892,026, dated Jul. 18, 2016, 10 pages.
Notice of Allowance on U.S. Appl. No. 13/892,057, dated Nov. 8, 2016, 10 pages.
Notice of Allowance on U.S. Appl. No. 14/814,020, dated Aug. 12, 2016, 15 pages.
Notice of Allowance on U.S. Appl. No. 14/820,237, dated Jan. 23, 2017, 10 pages.
Notice of Reasons for Rejection for Japanese Application No. 2015-509120, dated Nov. 1, 2016, 4 pages.
Final Office Action for U.S. Appl. No. 14/754,368 dated Sep. 22, 2017. 15 pages.
Final Office Action on U.S. Appl. No. 14/152,756, dated Jun. 7, 2017, 16 pages.
Non-Final Office Action for U.S. Appl. No. 14/754,368 dated Jan. 24, 2018. 15 pages.
Non-Final Office Action for U.S. Appl. No. 15/136,841 dated Jul. 13, 2017. 36 pages.
Non-Final Office Action on U.S. Appl. No. 14/754,368, dated May 8, 2017, 12 pages.
Notice of Allowance for U.S. Appl. No. 15/005,507 dated May 23, 2017. 8 pages.
First Office Action for CN Patent Application No. 201610500335.9 dated Sep. 19, 2019. 7 pages.
Notice of Allowance for U.S. Appl. No. 14/754,368 dated Jun. 21, 2018. 13 pages.