The present disclosure relates to human-computer interface systems for use in vehicles. More specifically, the present disclosure relates to a system and method for providing an interface between a driver or passenger of a vehicle and a personal computing device.
As the availability of mobile computing and communication devices has grown in recent years, individuals increasingly desire to use these devices while performing other tasks, such as while driving a vehicle. In current vehicles, such use can be extremely dangerous, as it distracts the user from the task of driving. Even in “self-driving” vehicles, which may become available in the near future, the vehicle's steering wheel presents a physical obstacle that prevents the comfortable use of a separate keyboard or other computer input device while sitting in the driver's seat. Passengers may also desire to use such devices, yet the interiors of most vehicles limit the availability of convenient and comfortable placement options. Improved systems and methods are therefore needed which allow a person to safely and comfortably interact with a personal computing device while driving or riding as a passenger in a vehicle.
According to one aspect, a system for providing an interface between a person in a vehicle and a personal computing device is disclosed, comprising: at least one projector located in the vehicle for projecting an image onto an interior surface of the vehicle, said image comprising a simulated computer keyboard; at least one gesture sensor for sensing a finger gesture of the person with respect to locations of individual characters within said image; and a computer processor operatively connected to said at least one projector and said at least one gesture sensor, wherein the computer processor receives and processes input from the at least one gesture sensor to determine a first character being selected by the person within said keyboard. The image may further comprise a simulated computer touchpad, wherein the computer processor receives and processes input from the at least one gesture sensor to determine a first input command being entered by the person on the simulated computer touchpad.
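By way of illustration only, the following sketch shows one way such a processor might map a sensed fingertip coordinate to a character of the simulated keyboard. The layout values and the names KeyboardLayout and locate_key are hypothetical and are not part of the disclosure:

```python
# Illustrative sketch only: maps a sensed fingertip coordinate to a key of the
# simulated keyboard. All names and layout values here are hypothetical.

from dataclasses import dataclass

@dataclass
class KeyboardLayout:
    origin_x: float    # left edge of projected keyboard, in sensor units
    origin_y: float    # top edge of projected keyboard
    key_width: float
    key_height: float
    rows: tuple        # characters per row, top to bottom

def locate_key(layout: KeyboardLayout, x: float, y: float):
    """Return the character under a fingertip at (x, y), or None if outside."""
    col = int((x - layout.origin_x) // layout.key_width)
    row = int((y - layout.origin_y) // layout.key_height)
    if 0 <= row < len(layout.rows) and 0 <= col < len(layout.rows[row]):
        return layout.rows[row][col]
    return None

layout = KeyboardLayout(0.0, 0.0, 10.0, 10.0,
                        ("QWERTYUIOP", "ASDFGHJKL", "ZXCVBNM"))
print(locate_key(layout, 34.0, 12.0))  # -> "F" (row 1, column 3)
```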
According to another aspect, a system for providing an interface between a person in a vehicle and a personal computing device is disclosed, comprising: at least one projector for projecting an image onto an interior surface of the vehicle, said image comprising a simulated computer keyboard; a physical sensing device located within said interior surface; and a computer processor operatively connected to said at least one projector and said physical sensing device, wherein the computer processor receives and processes input from the physical sensing device to determine a first character being entered by the person via said physical sensing device. The physical sensing device may be camouflaged within the interior surface when the image is not being projected.
For the purposes of promoting an understanding of the principles of the invention, reference will now be made to the embodiment illustrated in the drawings and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended, and alterations and modifications in the illustrated device, and further applications of the principles of the invention as illustrated therein are herein contemplated as would normally occur to one skilled in the art to which the invention relates.
The system 100 may also optionally comprise a communication module 106 for transmitting information to and from a personal computing device 107. The personal computing device 107 may comprise a smart phone, a laptop computer, a tablet computer, or any other personal computing device known in the art. In addition to personal computing devices, the communication module 106 may operatively communicate with other dedicated electronic devices within the vehicle, such as GPS navigation devices, and audio and video entertainment devices, such as MP3 players, DVD players, and the like. The communication module 106 may communicate with the personal computing device 107 using any wired or wireless protocol known in the art, including Bluetooth, Universal Serial Bus (USB), and the like. In addition, the communication module 106 may be connected to a network external to the vehicle, such as the Internet.
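As one non-limiting illustration, the communication module 106 might forward decoded keystrokes to the personal computing device 107 over a Bluetooth RFCOMM link. The sketch below assumes a Linux host with Python's AF_BLUETOOTH socket support; the device address and channel are placeholders:

```python
# Hedged sketch of the communication module forwarding decoded characters to a
# paired device over Bluetooth RFCOMM. Requires Linux with Python's
# AF_BLUETOOTH support; the address and channel below are placeholders.

import socket

DEVICE_ADDR = "00:11:22:33:44:55"  # hypothetical address of the paired device
RFCOMM_CHANNEL = 1

def send_characters(chars: str) -> None:
    """Open an RFCOMM link and transmit the decoded keystrokes."""
    with socket.socket(socket.AF_BLUETOOTH, socket.SOCK_STREAM,
                       socket.BTPROTO_RFCOMM) as sock:
        sock.connect((DEVICE_ADDR, RFCOMM_CHANNEL))
        sock.sendall(chars.encode("utf-8"))

if __name__ == "__main__":
    send_characters("hello")
```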
Output display 150 may communicate with the computer processing unit 102 either directly or through communication module 106. Output display 150 preferably comprises a digital display, such as an LCD screen, which displays the results of the user's input. In a preferred embodiment, the output display 150 is incorporated as part of the vehicle dash instrument cluster. In other embodiments, output display 150 may comprise a heads-up display or other in-vehicle display.
In certain vehicles, the distance between the mounted projector 105 and the steering wheel may change during use. For example, the steering wheel 120 may be adjusted in a telescoping fashion to accommodate different drivers. To allow for this, the computer processing unit 102 may automatically adjust the focus of the image 110 to account for changes in steering wheel position. Manual focus and adjustment capability may also be provided depending on the needs of the particular application. In certain embodiments, multiple projectors 105 may be placed at separate locations and focused on a single image area to enhance the quality of the projected image 110. This further allows continuous projection in case one of the projectors 105 is blocked by the user's body or another obstacle. In other embodiments, each projector 105 may be used to project a separate portion of the overall image 110.
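Purely as a sketch of such automatic refocusing, the following assumes a calibration table mapping the measured projector-to-wheel distance to a focus setting, with linear interpolation between calibrated points. The table values and the focus_for_distance name are hypothetical:

```python
# Hedged sketch of automatic refocusing as the steering wheel telescopes.
# A hypothetical calibration table maps measured projector-to-wheel distance
# to a focus setting; intermediate distances are linearly interpolated.

CALIBRATION = [            # (distance in cm, focus setting)
    (50.0, 0.20),
    (60.0, 0.35),
    (70.0, 0.55),
    (80.0, 0.80),
]

def focus_for_distance(distance_cm: float) -> float:
    """Interpolate a focus setting for the measured distance, clamped to the table."""
    if distance_cm <= CALIBRATION[0][0]:
        return CALIBRATION[0][1]
    if distance_cm >= CALIBRATION[-1][0]:
        return CALIBRATION[-1][1]
    for (d0, f0), (d1, f1) in zip(CALIBRATION, CALIBRATION[1:]):
        if d0 <= distance_cm <= d1:
            t = (distance_cm - d0) / (d1 - d0)
            return f0 + t * (f1 - f0)

print(focus_for_distance(65.0))  # -> approximately 0.45
```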
Field source 140, which may optionally be included within the housing of the projector 105, provides a sensing field in the area of the steering wheel surface 115. The gesture sensor 130, which may also be optionally included within the housing of the projector 105, is able to sense the location of the driver's fingers relative to the image 110 within the sensing field produced by field source 140. The computer processing unit 102 receives the location information and determines which one of the keys 135 within the simulated keyboard 125 the driver is attempting to select. The gesture sensor 130, along with the computer processing unit 102, may also detect and determine touchpad commands performed by the user, such as “click,” “drag,” etc. The computer processing unit 102 may use any gesture detection algorithm or format known in the art. In one embodiment, the computer processor may use OpenCV to perform the gesture detection.
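As one plausible OpenCV-based detection step of the kind mentioned above, an infrared frame may be thresholded, the largest bright contour taken as the hand, and its topmost point taken as the fingertip. The frame source and tuning constants below are assumptions, and the cv2.findContours call uses the OpenCV 4.x signature:

```python
# One plausible OpenCV-based fingertip detection step; constants are
# assumptions, and a production system would need calibration and filtering.

import cv2
import numpy as np

def fingertip_from_ir_frame(frame_gray: np.ndarray):
    """Return (x, y) of the detected fingertip, or None if no hand is visible."""
    _, mask = cv2.threshold(frame_gray, 200, 255, cv2.THRESH_BINARY)
    # OpenCV 4.x returns (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    if cv2.contourArea(hand) < 500:        # ignore specks of noise
        return None
    # The topmost contour point approximates the extended fingertip.
    tip = min(hand.reshape(-1, 2), key=lambda p: p[1])
    return int(tip[0]), int(tip[1])

# Example with a synthetic frame containing one bright blob:
frame = np.zeros((240, 320), dtype=np.uint8)
cv2.circle(frame, (160, 120), 30, 255, -1)
print(fingertip_from_ir_frame(frame))  # -> approximately (160, 90)
```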
In further embodiments, the displayed image 110 can be toggled between a keyboard and a touchpad based on a predetermined input command from the user. For example, if the user wishes to switch to a touchpad-only input, she may simply perform a “drag” motion along the keyboard area. Likewise, if the user wishes to switch to a keyboard-only input, she may simply begin to type, at which point the processor will recognize the typing action and switch to a keyboard-only mode.
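The toggling logic just described might be realized, for example, by a small state machine that switches modes on classified gestures. The event names and threshold below are hypothetical:

```python
# Sketch of the mode-toggling logic: a drag gesture switches the interface to
# touchpad-only mode, while a tap (typing) gesture switches it back to
# keyboard-only mode. Event names and the threshold are hypothetical.

DRAG_DISTANCE_THRESHOLD = 30.0   # sensor units; longer motions count as drags

class InputModeController:
    def __init__(self):
        self.mode = "keyboard"

    def on_gesture(self, kind: str, distance: float = 0.0) -> str:
        """Update the mode from a classified gesture and return the new mode."""
        if kind == "drag" and distance >= DRAG_DISTANCE_THRESHOLD:
            self.mode = "touchpad"
        elif kind == "tap":
            self.mode = "keyboard"
        return self.mode

ctrl = InputModeController()
print(ctrl.on_gesture("drag", distance=45.0))  # -> "touchpad"
print(ctrl.on_gesture("tap"))                  # -> "keyboard"
```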
In certain embodiments, the gesture sensor 130 may comprise a charge-coupled device (CCD) camera. In other embodiments, the gesture sensor 130 may comprise an infrared sensor, with field source 140 providing an infrared field which overlays the image 110 and allows the gesture sensor 130 to determine the location of the user's fingers within the sensing field. One example of a device which functions as a virtual laser projector and gesture sensor for keyboard input is the Magic Cube, supplied by Celluon, Inc. of Ace High-End Tower 918, 235-2 Guro-dong, Guro-Gu, Seoul, KOREA. Another example of a virtual laser projector is described in U.S. Pat. No. 6,611,252, issued Aug. 26, 2003, which is herein incorporated by reference.
When installed in the roof portion 151 of the vehicle 100, the projector 105 will be fixed with respect to the driver. Therefore, the projected image 110 and the sensing field will automatically remain in the same orientation regardless of the rotation of the steering wheel 120.
The rotation sensor 160 may comprise any type of sensor known in the art for detecting rotation of a steering wheel relative to a vehicle, including accelerometers, gyroscopes, proximity switches, and the like. In other embodiments, the computer processing unit 102 may receive the steering wheel rotational position from the vehicle engine computer through a wired or wireless communication link.
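Where the projector rotates with the steering wheel, the measured wheel angle may be used to counter-rotate the keyboard image before projection so that it stays upright for the driver. The following is a minimal sketch using OpenCV, assuming the angle is supplied by rotation sensor 160; the function name is hypothetical:

```python
# Hedged sketch of orientation compensation for a wheel-mounted projector: the
# keyboard image is counter-rotated by the measured steering angle before
# projection. The angle source (rotation sensor 160) is assumed.

import cv2
import numpy as np

def counter_rotate(image: np.ndarray, wheel_angle_deg: float) -> np.ndarray:
    """Rotate the image by -wheel_angle about its center to cancel wheel rotation."""
    h, w = image.shape[:2]
    m = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), -wheel_angle_deg, 1.0)
    return cv2.warpAffine(image, m, (w, h))

keyboard_img = np.zeros((240, 320, 3), dtype=np.uint8)
compensated = counter_rotate(keyboard_img, wheel_angle_deg=15.0)
print(compensated.shape)  # -> (240, 320, 3)
```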
In addition to a single gesture sensor 130, multiple gesture sensors 130 may be placed at separate locations with respect to the image 110.
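One way the computer processing unit 102 might combine readings from multiple gesture sensors 130 is sketched below: sensors whose view is blocked report nothing and are ignored, and the remaining estimates are fused by majority vote. This fusion rule is an assumption, not a disclosed requirement:

```python
# Illustrative fusion of key estimates from multiple gesture sensors: occluded
# sensors report None and are ignored; remaining readings are combined by
# majority vote. A real system might instead weight sensors by confidence.

from collections import Counter

def fuse_key_estimates(estimates):
    """Pick the most common non-None key estimate, or None if all are blocked."""
    visible = [e for e in estimates if e is not None]
    if not visible:
        return None
    return Counter(visible).most_common(1)[0][0]

print(fuse_key_estimates(["F", "F", None, "G"]))  # -> "F"
```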
In certain embodiments, where the color of the steering wheel surface 115 or other interior projection surface is very dark or does not otherwise allow for a quality image 110 to be viewed by the driver, an appropriately colored overlay may be attached to the surface 115 or other projection surface. The overlay is preferably white in color to improve the visibility of the projected image 110. The overlay may be formed from any suitable material and attached using any appropriate method including, but not limited to, adhesive, magnets, or elastic straps. The overlay may be further configured to split or break away upon deployment of the vehicle airbag, which is typically contained within the central portion 165 of the steering wheel 120.
In certain embodiments, the physical touchpad 170 may be configured to be camouflaged within the surface 115 of the steering wheel 120. In other embodiments, the physical touchpad 170 may be placed beneath the surface 115, with the surface 115 being thin enough, or made of an appropriate material, to transfer the physical touch of the user's fingers to the physical touchpad 170 (via capacitance, resistance, mechanical compaction, etc.). The physical touchpad 170 may also be pre-weakened or otherwise configured to break away when the vehicle airbag is deployed.
In further embodiments, a combination physical keyboard and touchpad, as opposed to a projected image, may be incorporated into the driver-facing surface 115 of the steering wheel 120. One example of such a combination keyboard and touchpad is described in U.S. Pat. No. 7,659,887 issued Feb. 9, 2010 and U.S. Patent Application Publication No. 2010/0148995 dated Jun. 17, 2010, both of which are hereby incorporated by reference.
While the invention has been illustrated and described in detail in the drawings and foregoing description, the same is to be considered as illustrative and not restrictive in character, it being understood that only the preferred embodiment has been shown and described and that all changes and modifications that come within the spirit of the invention are desired to be protected.
The present application claims the benefit of U.S. Provisional Patent Application Ser. No. 61/485,420, filed May 12, 2011, which is hereby incorporated by reference in its entirety to the extent not inconsistent.