Heads-up displays are intended to enhance pilot situational awareness by presenting information without forcing the pilot to divert attention to screens disposed around the cabin. Using a touchscreen input with a heads-up display and no haptic feedback is challenging. Typing commands on a keyboard is also challenging where the pilot cannot conveniently look at the inputs. It would be advantageous for pilots to have a convenient mechanism for verifying what input is being selected or entered without looking away from the heads-up display.
In one aspect, embodiments of the inventive concepts disclosed herein are directed to a touch sensitive input device with symbology replicated at a predetermined location of a heads-up display.
In a further aspect, the touch input symbology is generally organized for input selection via directional movement. The input symbology orients the input origin point based on the first point of contact for each new input.
In a further aspect, the input system utilizing the touch sensitive input device is adapted for written text recognition with the written text and recognized letters replicated on the heads-up display at a predetermined location.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and should not restrict the scope of the claims. The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate exemplary embodiments of the inventive concepts disclosed herein and together with the general description, serve to explain the principles.
The numerous advantages of the embodiments of the inventive concepts disclosed herein may be better understood by those skilled in the art by reference to the accompanying figures in which:
Before explaining at least one embodiment of the inventive concepts disclosed herein in detail, it is to be understood that the inventive concepts are not limited in their application to the details of construction and the arrangement of the components or steps or methodologies set forth in the following description or illustrated in the drawings. In the following detailed description of embodiments of the instant inventive concepts, numerous specific details are set forth in order to provide a more thorough understanding of the inventive concepts. However, it will be apparent to one of ordinary skill in the art having the benefit of the instant disclosure that the inventive concepts disclosed herein may be practiced without these specific details. In other instances, well-known features may not be described in detail to avoid unnecessarily complicating the instant disclosure. The inventive concepts disclosed herein are capable of other embodiments or of being practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein is for the purpose of description and should not be regarded as limiting.
As used herein a letter following a reference numeral is intended to reference an embodiment of the feature or element that may be similar, but not necessarily identical, to a previously described element or feature bearing the same reference numeral (e.g., 1, 1a, 1b). Such shorthand notations are used for purposes of convenience only, and should not be construed to limit the inventive concepts disclosed herein in any way unless expressly stated to the contrary.
Further, unless expressly stated to the contrary, “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
In addition, the terms “a” or “an” are employed to describe elements and components of embodiments of the instant inventive concepts. This is done merely for convenience and to give a general sense of the inventive concepts, and “a” and “an” are intended to include one or at least one, and the singular also includes the plural unless it is obvious that it is meant otherwise.
Finally, as used herein any reference to “one embodiment,” or “some embodiments” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the inventive concepts disclosed herein. The appearances of the phrase “in some embodiments” in various places in the specification are not necessarily all referring to the same embodiment, and embodiments of the inventive concepts disclosed may include one or more of the features expressly described or inherently present herein, or any combination or sub-combination of two or more such features, along with any other features which may not necessarily be expressly described or inherently present in the instant disclosure.
Broadly, embodiments of the inventive concepts disclosed herein are directed to a touch sensitive input device with symbology replicated at a predetermined location of a heads-up display.
Referring to
In at least one embodiment, the processor 100 is configured to render input options on the touch sensitive input device 104 (a touch sensitive screen) according to a specific symbology as further described herein. The processor 100 replicates the image rendered on the touch sensitive input device 104 at a specific location on the HUD 102; that is to say, renders a copy of the image in real-time at the specific location on the HUD 102. As elements of the symbology are manipulated via the touch sensitive input device 104, those manipulations are replicated at the specific location on the HUD 102 to allow the user to observe those manipulations without looking away from the HUD 102.
In at least one embodiment, the replicated image on the HUD 102 is rendered with reduced opacity so that underlying information is not completely obscured. Furthermore, the processor 100 may remove the replicated image from the HUD 102 when the touch sensitive input device 104 is idle for a predetermined period of time.
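The reduced-opacity rendering and idle-timeout removal described above might be sketched as follows. The disclosure does not fix particular values or function names; the constants and helper below are illustrative assumptions.

```python
# Sketch of the reduced-opacity / idle-timeout behavior described above.
# REPLICA_ALPHA and IDLE_TIMEOUT_S are assumed values, not specified by
# the disclosure.

REPLICA_ALPHA = 0.4    # reduced opacity so underlying HUD data stays visible
IDLE_TIMEOUT_S = 5.0   # remove the replica after this much inactivity

def replica_opacity(now_s: float, last_touch_s: float) -> float:
    """Return the opacity to use for the replicated image on the HUD.

    Returns 0.0 (replica removed) when the touch sensitive input device
    has been idle for at least the predetermined period; otherwise the
    replica is drawn at a fixed reduced opacity.
    """
    if now_s - last_touch_s >= IDLE_TIMEOUT_S:
        return 0.0
    return REPLICA_ALPHA
```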
In at least one embodiment, the processor 100 interprets inputs to the touch sensitive input device 104 relative to an initial contact. For example, when a user touches the touch sensitive input device 104, the rendered symbology centers on that point of contact.
In at least one embodiment, inputs are hierarchically organized such that available input options generally comprise a limited set of menus, easily differentiable by relative direction from an initial contact point. Navigation is generally accomplished by drilling down through a series of menus, each re-centered after a higher-level selection.
In at least one embodiment, certain inputs instantiate a text recognition process. In such embodiments, the replicated image includes an input box for written text. Written inputs may also be centered based on an initial contact location such that as individual letters are input, subsequent letters may begin at any location on the touch sensitive input device 104. The processor 100 may normalize the locations of the written inputs in the replicated image such that the written inputs are rendered with consistent sizes and in relative proximity according to the order in which they are written.
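The normalization step described above, in which letters written at arbitrary locations are rendered at consistent sizes and in writing order, could be sketched as rescaling each letter's strokes to a common height and laying them out left to right. All names and constants here are hypothetical; the disclosure does not specify a normalization algorithm.

```python
def normalize_strokes(letters, target_height=40.0, gap=8.0):
    """Normalize handwritten letters for the replicated HUD image.

    `letters` is a list of letters in the order written; each letter is a
    list of (x, y) stroke points captured wherever the user wrote on the
    touch sensitive input device. Each letter is rescaled to a common
    height and placed left-to-right with a fixed gap, so the replica shows
    consistent sizes and relative proximity regardless of where the
    letters were actually drawn.
    """
    placed, cursor_x = [], 0.0
    for pts in letters:
        xs = [p[0] for p in pts]
        ys = [p[1] for p in pts]
        w, h = max(xs) - min(xs), max(ys) - min(ys)
        scale = target_height / h if h else 1.0
        placed.append([((x - min(xs)) * scale + cursor_x,
                        (y - min(ys)) * scale) for x, y in pts])
        cursor_x += w * scale + gap
    return placed
```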
In at least one embodiment, the processor 100 queries one or more connected avionics systems to identify system functionality and selection options. The processor 100 may then assign menu options corresponding to that system and system functionality for rendering on the touch sensitive input device 104.
Referring to
In at least one embodiment, one menu option in one or more of the radial menus 204 corresponds to a selection for text recognition. Upon selection, the radial menu 206 is moved and rendered to accommodate a written text input element 208.
Referring to
In at least one embodiment, where the text being input is associated with a function having several components or parameters, the text input user interface 300 may comprise a parameter selection element 310. For example, as an interface to a communication system, the text input user interface 300 may allow the user to enter a call sign via the written text input element 302 and to switch to a frequency selection function via the parameter selection element 310.
Referring to
In at least one embodiment, actual selection of a directional selection indicator 406 is based on a directional motion with respect to an initial point of contact, then releasing contact with the touch sensitive input device 400. After a user makes a directional motion, but before releasing contact, a processor rendering the replicated input symbology on the HUD 402 may highlight the proposed selection. Such highlighting may include a textual indication of the action that will be performed upon release.
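The touch lifecycle described above, in which a proposed selection is highlighted during the drag and committed only on release, amounts to a small state machine. The sketch below uses hypothetical names and a caller-supplied resolver; it illustrates the commit-on-release behavior rather than any specified implementation.

```python
class DirectionalSelector:
    """Sketch of the drag-then-release selection lifecycle:
    record the initial contact, highlight a proposed option while the
    finger moves, and commit the selection only when contact is released.
    """

    def __init__(self, resolve):
        self.resolve = resolve   # callable: (origin, point) -> option
        self.origin = None
        self.proposed = None     # shown highlighted on the HUD replica

    def touch_down(self, point):
        self.origin = point      # selection is relative to this contact
        self.proposed = None

    def touch_move(self, point):
        # The HUD highlights this option, optionally with a textual
        # indication of the action that would be performed on release.
        self.proposed = self.resolve(self.origin, point)
        return self.proposed

    def touch_up(self):
        committed = self.proposed
        self.origin = self.proposed = None
        return committed         # the selection actually performed
```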
In at least one embodiment, the replicated symbology on the HUD 402 may be rendered in a further simplified or abbreviated form. For example, referring to
Referring to
In at least one embodiment, the radial menus are organized hierarchically such that selecting an initial menu element 504 from a radial menu (as in
In at least one embodiment, menu layers may end with a specific element for altering one or more parameters in a connected avionics system. When setting a value in a bounded range, a slider 510, 520 may be rendered with current values 512, 522 highlighted. For example, a user may select, via directional selection of increasingly granular radial menus, a representation of a primary flight display. The primary flight display may include a direction slider 510 disposed at one edge of the touch sensitive input device 500 while an altitude slider 520 is disposed on another edge of the touch sensitive input device 500.
Where two sliders 510, 520 are present, a connected processor may identify which slider 510, 520 is proximal to an initial contact location, expand that slider 510, 520 for further interaction, and reduce the other slider 510, 520 to avoid confusion or interference.
In at least one embodiment, when a slider 510, 520 is selected by proximal contact to expand for manipulation, the corresponding current value 512, 522 may be changed by directional movement up or down. At any time, directional movement away from the slider 510, 520 may cause the touch sensitive input device 500 to render additional directional options to select 514, 518, 524, 528 or hold 516, 526 the new current value 512, 522.
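The two-slider behavior above, expanding whichever slider is nearest the initial contact and adjusting its value by vertical drag within a bounded range, might be sketched as follows. The slider representation, distance metric, and gain are illustrative assumptions.

```python
def select_slider(contact, sliders):
    """Pick the slider whose track is nearest the initial contact point.

    That slider would be expanded for further interaction while the
    other is reduced. Each slider is a dict (assumed shape) with an
    'axis' and a fixed track coordinate; for a vertical slider the
    relevant distance is horizontal, and vice versa.
    """
    def dist(s):
        x, y = contact
        return abs(x - s["x"]) if s["axis"] == "vertical" else abs(y - s["y"])
    return min(sliders, key=dist)

def adjust_value(slider, drag, lo, hi, gain=0.5):
    """Change the slider's current value by directional drag distance,
    clamped to the slider's bounded range."""
    return max(lo, min(hi, slider["value"] + drag * gain))
```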
Referring to
In at least one embodiment, the radial menus 604 are centered on an initial contact location such that menu selection is always based on relative movement from that initial contact location. The replication 606 is stationary in the HUD 602 no matter where the initial contact location is on the touch sensitive input device 600.
In at least one embodiment, selections may be made or parameters adjusted via one or more sliders 616. The sliders 616 may also be replicated in a simplified form 618 to allow the user to observe the selection being made without cluttering the HUD 602.
Referring to
While navigating the hierarchical menu structure or proceeding through the successive process steps, it may be necessary to regress to a previous step. The touch sensitive input device 700 includes a “back” element 712 for regressing through previous steps or menus. The back element 712 is also replicated 714 on the HUD 702.
In at least one embodiment, the symbology replicated on the HUD 702 is generally more condensed than on the touch sensitive input device 700. While all of the relative positional relationships are preserved, the absolute distances between elements are not as critical on the HUD 702: there is a risk of a user selecting the wrong input on the touch sensitive input device 700 if elements are too close together, but the elements on the HUD 702 are purely informational, so close spacing on the HUD 702 is not injurious. However, because replication on the HUD 702 is intended to allow the user to know what is being selected on the touch sensitive input device 700, and to make corrections, the relative placement of elements should be maintained.
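Condensing the replica while preserving relative placement can be sketched as scaling each element's offset from the layout's centroid, so absolute distances shrink but every element keeps its position relative to the others. The element representation and scale factor below are illustrative assumptions.

```python
def condensed_layout(elements, scale=0.5):
    """Compress replicated symbology for the HUD.

    Shrinks absolute distances between elements while preserving each
    element's position relative to the layout's centroid, so the user can
    still tell which element is which. Each element is a dict (assumed
    shape) with 'x' and 'y' coordinates.
    """
    cx = sum(e["x"] for e in elements) / len(elements)
    cy = sum(e["y"] for e in elements) / len(elements)
    return [{**e,
             "x": cx + (e["x"] - cx) * scale,
             "y": cy + (e["y"] - cy) * scale}
            for e in elements]
```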
It is believed that the inventive concepts disclosed herein and many of their attendant advantages will be understood by the foregoing description of embodiments of the inventive concepts disclosed, and it will be apparent that various changes may be made in the form, construction, and arrangement of the components thereof without departing from the broad scope of the inventive concepts disclosed herein or without sacrificing all of their material advantages; and individual features from various embodiments may be combined to arrive at other embodiments. The form herein before described being merely an explanatory embodiment thereof, it is the intention of the following claims to encompass and include such changes. Furthermore, any of the features disclosed in relation to any of the individual embodiments may be incorporated into any other embodiment.