INPUT DEVICE FOR A MOTOR VEHICLE

Information

  • Patent Application
  • Publication Number
    20150367859
  • Date Filed
    December 19, 2013
  • Date Published
    December 24, 2015
Abstract
Embodiments are disclosed for an input device for a motor vehicle. An example input device for a motor vehicle having a dashboard and a steering wheel includes a sensor and a circuit connected to the sensor, wherein the sensor is positioned in the dashboard sensitive toward an area of the steering wheel, wherein the sensor is configured to output a sensor signal based on a contactless hand gesture, wherein the circuit is configured to evaluate the sensor signal, wherein the circuit and/or the sensor is configured to distinguish between a steering movement and the hand gesture, and wherein the circuit is configured to output a detection signal (S3) based on the detection of the hand gesture.
Description

The present invention relates to an input device for a motor vehicle.


ROHM Co., Ltd. offers a "1 Chip Optical Proximity + Ambient Light Sensor IC," the BH1772GLC, which measures the distance between the sensor and a human hand up to 150 mm using IR-LED light.


The object of the invention is to improve an input device for a motor vehicle.


This object is attained by an input device with the features of independent claim 1. Advantageous refinements are the subject of dependent claims and included in the description.


An input device is provided for a motor vehicle with a dashboard and a steering wheel. The input device has a sensor and a circuit connected to the sensor.


The sensor is positioned in the dashboard sensitive toward the steering wheel.


The sensor is designed to output a sensor signal based on a contactless hand gesture.


The circuit is configured to evaluate the sensor signal.


The circuit and/or the sensor are configured to distinguish between a steering movement and the hand gesture and to output a detection signal based on the detection of the hand gesture.


Tests by the applicant have shown that the arrangement of the sensor in the dashboard and the direction of its sensitivity toward the steering wheel may enable the driver to make inputs without having to avert his gaze from the road traffic, so that safety in road traffic is increased. At the same time, an erroneous, unwanted input during a steering movement can be effectively eliminated.


The object of the invention furthermore is to provide an improved system for a vehicle.


Said object is attained by the system with the features of independent claim 9. Advantageous refinements are the subject of dependent claims and included in the description.


Therefore, a system is provided for a vehicle having a dashboard, a steering wheel, and a windshield.


The system has a head-up display and an input device.


The head-up display is configured to project an image onto the windshield.


The input device has a sensor and a circuit connected to the sensor and the head-up display.


The sensor is positioned in the dashboard sensitive toward the steering wheel.


The sensor is designed to output a sensor signal based on a contactless hand gesture.


The circuit is configured to evaluate the sensor signal.


The circuit is configured to distinguish between a steering movement and the hand gesture and to output a detection signal based on the detection of the hand gesture.


The circuit is configured to change the image based on the detection of the hand gesture.


The embodiments described hereinafter refer to both the input device and the system.


According to one embodiment, the evaluation circuit may be configured to determine and/or to control an amplitude of the sensor signal. The amplitude may be controlled by adjusting the transmit power of an IR transmitter. The amplitude may be determined by means of an analog-to-digital converter and determination of a maximum value from the digitized sensor signal.
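
By way of illustration, the following is a minimal sketch of this amplitude handling: the amplitude is determined as the maximum value of the digitized sensor signal, and controlled by stepping the IR transmit power. The function names, sample values, target amplitude, and power range are assumptions for illustration only and do not reflect the interface of any particular sensor IC.

```python
# Minimal sketch of the amplitude determination and control described
# above. All names and values are illustrative assumptions.

def determine_amplitude(adc_samples):
    """Determine the sensor-signal amplitude as the maximum value of
    the digitized (ADC) samples."""
    return max(adc_samples)

def adjust_ir_transmit_power(amplitude, target, power, p_min=0, p_max=15):
    """Control the amplitude by raising or lowering the transmit power
    of the IR transmitter one step per iteration."""
    if amplitude < target and power < p_max:
        return power + 1
    if amplitude > target and power > p_min:
        return power - 1
    return power

# One control iteration over a hypothetical digitized sensor signal.
samples = [12, 40, 87, 140, 95, 33]
amp = determine_amplitude(samples)                          # -> 140
power = adjust_ir_transmit_power(amp, target=200, power=8)  # -> 9
print(amp, power)
```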


According to one embodiment, the evaluation circuit may be configured to compare the amplitude of the sensor signal with a threshold. The threshold may be associated with a position of the hand within a space between the steering wheel and the sensor.


According to one embodiment, the evaluation circuit may be configured to compare a course of the sensor signal with a pattern course associated with the hand gesture.
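
A minimal sketch of such a comparison follows, modeling the pattern course as a per-sample tolerance band around stored values; the comparison method, signal values, and tolerance are assumptions for illustration, since the description does not fix a specific algorithm beyond FIG. 6.

```python
# Sketch of comparing a sensor-signal course with a pattern course
# associated with a hand gesture. The tolerance-band model and all
# values are illustrative assumptions.

def matches_pattern(course, pattern, tolerance):
    """Return True if every sample of the measured course lies within
    the tolerance band around the pattern course."""
    return all(abs(c - p) <= tolerance for c, p in zip(course, pattern))

course  = [0, 10, 60, 120, 60, 10, 0]   # hypothetical measured course
pattern = [0, 15, 65, 110, 65, 15, 0]   # hypothetical stored pattern course
print(matches_pattern(course, pattern, tolerance=20))  # True
```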


According to one embodiment, the circuit may be configured, based on the sensor signal, to determine at least two distance values associated with at least two different distances between the hand and the sensor. A function of application software may be controlled by means of different distance values.
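
As an illustration of controlling an application function by means of different distance values, the following sketch maps distance bands to zoom levels; the band boundaries and the zoom function are assumptions.

```python
# Sketch of controlling a function of application software by means of
# different distance values. Band boundaries are illustrative.

def distance_to_zoom(distance_mm):
    """Map the measured hand-to-sensor distance to a zoom level."""
    if distance_mm < 50:
        return 3   # hand close to the sensor: strongest zoom
    if distance_mm < 100:
        return 2
    return 1       # hand far from the sensor: overview

print(distance_to_zoom(40), distance_to_zoom(80), distance_to_zoom(140))  # 3 2 1
```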


According to one embodiment, the sensor may be configured to generate the sensor signal based on a direction of a hand movement. Preferably, at least two directions may be determined. According to one embodiment, six directions of hand movement can be distinguished by the sensor.
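
The following sketch illustrates one way such a direction determination could look, classifying a movement vector derived from successive hand positions into six directions (left, right, up, down, toward, away); the vector representation and the direction names are assumptions for illustration.

```python
# Sketch of distinguishing six directions of hand movement from a
# movement vector (dx, dy, dz) between successive hand positions.
# The representation and direction names are illustrative assumptions.

def classify_direction(dx, dy, dz):
    """Classify a movement vector into one of six directions by its
    dominant component."""
    ax, ay, az = abs(dx), abs(dy), abs(dz)
    if ax >= ay and ax >= az:
        return "right" if dx > 0 else "left"
    if ay >= az:
        return "up" if dy > 0 else "down"
    return "away" if dz > 0 else "toward"

print(classify_direction(0, 5, 1))    # up
print(classify_direction(-4, 1, 0))   # left
```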


According to one embodiment, a sensitivity of the sensor and/or the circuit can be designed to be adjustable.


According to one embodiment, the circuit may be configured to program a command in an application program by means of an input. The command may be assigned to the detection of the hand gesture. By this means, the user can assign a current input or a frequently required input to the hand gesture.


According to one embodiment, the circuit may be configured to add or remove an object of an application program in the image in order to change the image.


According to one embodiment, the circuit may be configured to determine a selection input, the selection input being associated with a selection of an application program. According to one embodiment the circuit may be configured to output information of the application program in another display. According to one embodiment the object may contain information associated with the selected application program.


The previously described refinement variants are especially advantageous both individually and in combination. In this regard, all embodiments may be combined with one another. Some possible combinations are explained in the description of the exemplary embodiments shown in the figures. These possible combinations of the refinement variants, depicted therein, are not definitive, however.


The invention will be described in greater detail hereinafter by exemplary embodiments using graphic illustrations.





Shown are:



FIG. 1 a schematic representation of a vehicle interior;



FIG. 2 a schematic block diagram of a system for a motor vehicle;



FIG. 3 another schematic representation of a vehicle interior;



FIG. 4 another schematic representation of a vehicle interior;



FIG. 5 a schematic diagram;



FIG. 6 another schematic diagram;



FIG. 7 one embodiment of views of displays;



FIG. 8 one embodiment of a view of a display;



FIG. 9 one embodiment with views for a telephone call; and



FIG. 10 one embodiment with views showing the reduction of an object of an application program.






FIG. 1 shows a schematic representation of a vehicle interior. According to one embodiment of FIG. 1, the vehicle may include a driver seat 140 and a passenger seat 150. The vehicle may further include a steering wheel 110 on the driver's side, a gear shift 170, and a front windshield 130. In one embodiment of FIG. 1, a system may be provided, the system also referred to as an infotainment system providing information and entertainment functionality. The infotainment system may have a central information display 430 in the form of a user interface. Central information display 430 may be centrally arranged in dashboard 120 or the center console of the vehicle. Central information display 430 may be a touch screen, comprising a touch-sensitive surface for user input. The infotainment system may have, or be in communication with, an instrument cluster display 420. According to one embodiment, instrument cluster display 420 may be arranged in line with the position of steering wheel 110, so that the user may see the displayed information content through the openings in steering wheel 110. Instrument cluster display 420 may be a color screen.


The infotainment system may have a head-up display 410. Head-up display 410 may be configured to project an image 411 onto front windshield 130. A surface of front windshield 130 may reflect the projected image towards the user, in the case of the embodiment of FIG. 1, towards the driver of the vehicle. According to one embodiment shown in FIG. 1, the projected image can be of the size of a reflection area 419. The form of front windshield 130 may deviate from a flat reflection surface, and an electronic rectification and/or optical rectification may be used. Likewise, front windshield 130 can have a flat additional pane for reflection. Using a mirroring technique, the focus may be adjusted to an area outside the vehicle, so that the image 411 may appear virtually in front of the car or above the front lid.


According to one embodiment of FIG. 1, an input device for the motor vehicle is realized by a sensor 302 and a circuit 200, which is shown schematically in one embodiment of FIG. 2. Sensor 302 may be positioned in the dashboard 120 sensitive toward the steering wheel 110. The sensor 302 may be arranged and designed to determine a position and/or a movement of a hand between steering wheel 110 and dashboard 120.


The infotainment system may have a first sensor 301 and a second sensor 302. First sensor 301 and second sensor 302 may be infrared sensors. First sensor 301 and/or second sensor 302 may be positioned in predetermined locations in order to sense a movement of a hand of a user of the vehicle. In this respect, the sensitivity of the second sensor 302 may be directed toward an area 111 of steering wheel 110. Said area 111 may be a grip position preferred by the driver. In contrast, the first sensor 301 may be arranged in the central console to enable input in the area of central display 430. The first sensor 301 may be designed in such a way that an input motion, for example a hand gesture, is distinguished from an operation of the gear shift.


The infotainment system may have an input device 304 in the central console. The input device 304 may be part of the user interface, and may have one or more push-buttons, input wheels, and so forth. The system may have an input device 303 integrated in steering wheel 110, having one or more push-buttons, switches, and so forth. According to one embodiment, a separation of functions may be provided: by means of the sensors 301, 302, the user can switch between applications and control basic program functions, while a plurality of submenu items of the application programs may be selectable by an input by means of input device 304.


The infotainment system may have a first near field communication device 361 (NFC) in a predetermined position, such as proximate to a left retainer 161 in one embodiment of FIG. 1. The infotainment system may also have a second near field communication device 362 (NFC) in a predetermined position, such as proximate to a right retainer 162 in an embodiment of FIG. 1. First near field communication device 361 and second near field communication device 362 can be configured to connect to a mobile device 461, 462, such as a mobile phone or other mobile communication device in close proximity. Therefore, a mobile device 461 positioned in or near the left retainer 161 according to one embodiment shown in FIG. 1 can have a connection to first near field communication device 361, and a mobile device 462 positioned in or near right retainer 162 shown in FIG. 1 may have a connection to second near field communication device 362.



FIG. 2 shows a block diagram of an example of the system for a motor vehicle. The system may have a circuit 200. The circuit 200 may have a processor to run a program. The circuit 200 may have a plurality of interfaces to connect other devices.


A sensor 302 can be connected to circuit 200. The sensor 302 is designed to output a sensor signal S302 based on a contactless hand gesture 912, 913. Hand gesture 912, 913 is shown schematically, for example, in FIG. 4. According to one embodiment in FIG. 2, the circuit 200 may be configured to evaluate the sensor signal S302.


According to one embodiment, the circuit 200 and/or the sensor 302 may be configured to distinguish between a steering movement 910, 911 and hand gesture 912, 913. Steering movement 910, 911 is shown schematically in FIG. 3 by movement arrows. Hand gesture 912, 913 is shown schematically in FIG. 4 by movement arrows. The sensor 302 can be designed structurally to detect a movement only in an area up to distance dth. The distance dth is shown schematically in FIG. 3. The distance dth according to one embodiment of FIG. 3 does not extend to the operating elements of steering wheel 110, such as steering column switch 112 for the turn signal, windshield wipers, etc. According to one embodiment in FIG. 2, the circuit 200 may have a function block 201 for evaluating the sensor signal S302. The function block 201 can be realized in hardware or as a program sequence in software. The function block 201 may enable the differentiation between steering movement 910, 911 and hand gesture 912, 913. The circuit 200 may be configured to output a detection signal S3 based on the detection of hand gesture 912, 913. If hand gesture 912, 913 is detected by function block 201, detection signal S3 may be output to the function block 202 for control. Based on detection signal S3, the function block 202 of the circuit 200 may generate image data 412, 422, 432 for displays 410, 420, 430.


A head-up display 410 and/or an instrument cluster display 420 and/or a central information display 430 and/or a first sensor 301 and/or a second sensor 302 and/or a first near field communication device 361 and/or a second near field communication device 362 and/or an input device 303, 304 may be connected to or in communication with circuit 200.


According to one embodiment, an infotainment system of a vehicle may include an imaging system. The infotainment system may have a head-up display 410 and a central information display 430 and a sensor 302 for detecting gestures 912, 913 of a user. The infotainment system may have a circuit 200 connectable to head-up display 410 and to central information display 430 and to sensor 302.


The circuit 200 may be configured to send first image data 412 to the head-up display 410 and second image data 432 to the central information display 430 to be displayed. The head-up display 410 may be configured to project an image 411 onto the front windshield 130, as shown in FIG. 1. The image 411 may be based on first image data 412.


The central information display 430 may have a screen configured to display an image based on image data 432. The circuit 200 may be configured to add content information to first image data 412 for head-up display 410 and to reduce content information from image data 412 for head-up display 410, when a corresponding gesture 912, 913 of the user is detectable by means of the sensor 302.


Image data 412 for the head-up display 410 and image data 432 for the central information display 430 may be different. Reducing the information contained in image data 412 for the head-up display 410 may be initiated easily by hand gesture 912, 913 of the driver, relieving strain on the driver. The workload of the driver is therefore reduced further to a minimum by reducing the content in the image 411 and increasing transparency.


Image 411 may be projected within an area 419. The projected image 411 may be predefined, and may be adjustable by the user. The area 419 may be positioned in the driver's view. The position of the area 419 may be adjusted relative to steering wheel 110, so that the image 411 is viewable by the driver, who is also able to observe the traffic in front of the vehicle. The image 411 may be at least partially transparent, such as semitransparent. At least parts of the area 419 may be transparent during driving, so that the view of the driver is not disturbed significantly.


According to one embodiment, an infotainment system of a vehicle that includes the imaging system is provided. The infotainment system may have a display 410, 420, 430. The infotainment system may have a sensor 302 for detecting gestures of a user. The infotainment system may have a circuit 200 connectable to display 410, 420, 430 and to the sensor 302. The sensor 302 may be of a contactless type. The sensor 302 may be an infrared sensor.


The sensor 302 may be positioned in a predetermined position, such as in a dashboard 120 of the vehicle, sensitive towards a steering wheel 110 of the vehicle. In FIG. 1, the sensor 302 may be positioned in dashboard 120 facing an area 111 of steering wheel 110, where a human hand 900 of the driver is in a normal position. The driver can therefore easily make gestures 912, 913 while driving the vehicle, as shown in an embodiment of FIG. 4. The driver therefore does not need to search for a sensitive area of the sensor 302. An input by means of the sensor 302 may occur in a quite general area in front of sensor 302. In contrast to a keypad, the probability of an incorrect input is significantly reduced. According to one embodiment, the sensor 302 may be positionable by the driver to detect in a predetermined area. For example, the sensor 302 may be adjustably placed in the dashboard 120.


The circuit 200 and the sensor 302 may be configured to distinguish between a gesture 912, 913 to change the information content in display 410, 420, 430 and a steering movement 910, 911 of the hand 900 of the user. To distinguish between gesture 912, 913 and steering movement 910, 911, corresponding differences in amplitude C1, C2 of the sensor signal S302 of the sensor 302 may be analyzed. Amplitude C1, C2 is shown schematically in the diagram in FIG. 5. The diagram of FIG. 5 shows by way of example the dependence of the numerical value of an infrared proximity sensor 302 on a distance d of an object, for example, a human hand, on a light intensity IR of the emitted infrared light, and on an angle α of the hand to a central position in front of sensor 302. If the numerical values C do not exceed a threshold th, no hand gesture 912, 913 is detected by circuit 200. This is true, for example, in the case of FIG. 3, whereby hand 900 during steering movement 910, 911 does not fall below the distance threshold dth associated with threshold th.
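
The threshold test of FIG. 5 can be sketched as follows: a hand gesture is only detected when the signal values C exceed the threshold th, that is, when the hand is closer to the sensor than the distance threshold dth. The numerical values below are assumptions for illustration.

```python
# Sketch of the threshold test described above: a steering movement
# keeps the hand beyond dth, so the signal values C never exceed the
# threshold th; a hand gesture does exceed it. Values are illustrative.

TH = 100  # threshold th on the signal values C, associated with dth

def hand_gesture_detected(values):
    """Return True if the signal values indicate a hand closer than
    the distance threshold dth."""
    return max(values) > TH

steering = [5, 12, 20, 15, 8]        # hand gripping the steering wheel
gesture  = [10, 60, 150, 170, 90]    # hand between steering wheel and sensor
print(hand_gesture_detected(steering))  # False: no gesture detected
print(hand_gesture_detected(gesture))   # True: gesture detected
```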


In contrast, with hand gesture 912, 913 according to the embodiment in FIG. 4, both first distance d1 and second distance d2 fall below the distance threshold dth. This is reflected in the diagram of FIG. 5, whereby in the case of first distance d1 first amplitude C1 is generated and in the case of second distance d2 second amplitude C2. In an embodiment, the circuit 200 according to FIG. 2 may be configured to determine, based on the sensor signal S302, at least two distance values C1, C2 associated with at least two different distances d1, d2 between the hand 900 and the sensor 302. Inputs may occur by means of distance values C1, C2. Such an input may be a selection from a number of options or the adjustment of a value, by counting up or down, in the image in one of displays 410, 420, 430.


The circuit 200 may be configured to recognize a predefined steering movement 910, 911 and/or a predefined gesture 912, 913.


According to one embodiment, the circuit 200 in FIG. 2 is configured to compare a course of sensor signal Cu(t), Cl(t) with a pattern course Cp(t), associated with hand gesture 912, 913, over the time t. Two courses of sensor signals Cu(t), Cl(t) and a pattern course Cp(t) are shown schematically in FIG. 6. The two sensor signal courses Cu(t), Cl(t) are thereby offset in time, so that a movement direction 912 or 913 of the hand gesture may be determined by the succession of sensor signal courses Cu(t), Cl(t). The time offset can be generated by two spatially separated IR diodes or by two spatially separated IR receivers. If both sensor signal courses Cu(t), Cl(t) are within the pattern course Cp(t), a hand gesture 912, 913 is detected.
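
As an illustration, the direction determination from the succession of the two courses could be sketched as follows, using the peak times of Cu(t) and Cl(t) as a simple stand-in for their time offset; the peak-based comparison is an assumption, not the only possible evaluation.

```python
# Sketch of determining the movement direction from the time offset of
# two sensor-signal courses Cu(t) and Cl(t), e.g., from two spatially
# separated IR receivers. Peak timing is an illustrative simplification.

def movement_direction(cu, cl):
    """Return which course peaks first; the succession indicates the
    direction of the hand gesture (e.g., 912 vs. 913)."""
    t_u = cu.index(max(cu))
    t_l = cl.index(max(cl))
    if t_u < t_l:
        return "upper receiver first"
    if t_l < t_u:
        return "lower receiver first"
    return "ambiguous"

cu = [0, 80, 150, 90, 20, 0]    # Cu(t): peaks at t=2
cl = [0, 10, 70, 140, 80, 10]   # Cl(t): peaks at t=3
print(movement_direction(cu, cl))  # upper receiver first
```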


According to one embodiment in FIG. 2, there may be two sensors 301, 302 sensing movements of a hand 900 of a user. The two sensors 301, 302 may be both connected to a circuit 200. An interior camera 305 may be connected to circuit 200. Interior camera 305 may be aligned to record a face of the user, especially the face of the driver of the vehicle to determine eye movements.


The infotainment system may have a microphone 306 to record the voice of the user. The infotainment system may be configured to run a program for voice recognition. The infotainment system may have an interface 340 to a bus of the vehicle, e.g., a CAN bus, to retrieve data of the vehicle, e.g., the current speed, vehicle rain sensor data, and so forth.


The infotainment system may have a satellite receiver 330 to receive position data of the current position of the vehicle, such as GPS data or GLONASS data. The system may have a transceiver 350 for communicating with a wireless network such as, for example, a UMTS network or a WLAN network.


The infotainment system may have one or more cameras 311, 312, 313, 314 positioned to record an image of the surroundings of the vehicle. According to one embodiment, the circuit 200 may be connected to a front camera 311 capturing image data of the road and traffic in front of the vehicle. The circuit 200 may be connected to a back camera 312 capturing image data of the road and traffic behind the vehicle. The circuit 200 may be connected to a left camera 313 and/or to a right camera 314 recording an image correspondingly. The one or more cameras 311, 312, 313, 314 may be used to record the complete surroundings of the vehicle concurrently. Circuit 200 may be configured to run a program of object recognition to recognize objects in the recorded image data. The recognized object may be a road user like a vehicle.


The infotainment system may have one or more distance sensors 321, 322, 323, 329. The distance sensors 321, 322, 323, 329 may be ultrasonic sensors or radar sensors, or any other device or system for measuring a distance to an object in the surroundings of the vehicle. The one or more distance sensors 321, 322, 323, 329 may be connectable to circuit 200.



FIGS. 3 and 4 show a schematic view of a driver's cockpit. In FIG. 3, the driver makes a steering movement 910, 911, whereby hand 900 grips steering wheel 110. One finger may operate steering column switch 112 to start the turn signal. During steering movement 910, 911 the hand 900 does not fall below the distance dth. An incorrect input caused by the detection of steering movement 910, 911 as a hand gesture can thereby be ruled out. In contrast, the driver in FIG. 4 moves hand 900 in a space between steering column switch 112 and sensor 302. The hand 900 may fall below the distance threshold dth. In this space, the hand 900 may make a contactless hand gesture 912, 913, whereby sensor 302 may not be touched by the hand 900. Also shown in FIG. 4 are two distances d1 and d2 between the hand 900 and the sensor 302, causing a difference in the sensor signal S302 for control purposes.



FIG. 7 shows the functional relationships for sensors 301, 302. In the upper left area, surroundings 1 of the vehicle are shown as a view through the windshield. An image 411 of head-up display 410 is projected onto the windshield and thereby into the driver's view area overlaying the surroundings 1. In the lower left area, the display of the instrument cluster display 420 is shown with a speed display and a three-dimensional street view. In the right lower area, the image of central information display 430 is shown with a number of applications, such as email, telephone, and weather report.


According to one embodiment of FIG. 7, a first image 411 may be output via a head-up display 410, a second image may be output via an instrument cluster display 420, and a third image may be output via a central information display 430. A first sensor 301 may be used to select applications in central information display 430. A second sensor 302 may be used to add or reduce information content.


If a hand gesture 912 with a sensed upward movement of the hand is detected by second sensor 302, a content item, for example a navigation view, may be added to image 411 of the head-up display 410 from the instrument cluster display 420. If a hand gesture 913 with a sensed downward movement of the hand is detected by second sensor 302, information in image 411 of the head-up display 410 may be reduced; for example, the navigation may no longer be shown in image 411 of the head-up display 410.


According to one embodiment of FIG. 7, second sensor 302 may, moreover, detect a horizontal movement. With a hand gesture 914, a movement of hand 900 to the left, the application selected in the central information display 430 may be added to the instrument cluster display 420. A hand gesture 915, a movement of the hand 900 to the right, removes it again. The application in the central information display 430 may be selected by means of the first sensor 301 through detection of a hand gesture 916, 917, a movement of the hand to the left or right.


According to one embodiment a system of a vehicle is provided. The embodiment may be explained using FIGS. 1, 2, and 7. The system may have a display 410, 420, 430. The system may have a sensor 302 for detecting gestures of a user. The system may have a circuit 200 connectable to display 410, 420, 430 and connectable to sensor 302. The circuit 200 may also be referred to as a central unit 200.


The sensor 302 may be positioned in a dashboard 120 of the vehicle sensitive towards a steering wheel 110 of the vehicle. The circuit 200 and the sensor 302 may be configured to distinguish between a gesture 912, 913, 914, 915 to change the information content in display 410, 420, 430 and a steering movement 910, 911 of the hand 900 of the user.


A gesture 912, 913, 914, 915 recognized by the sensor 302 may be assigned to one of a predetermined number of keys, such as four keys W, S, Q, and E. According to one embodiment based on key W, the information content from the instrument cluster display 420 may be added to the head-up display 410. According to one embodiment based on key S, the information content in the head-up display 410 can be reduced. According to one embodiment based on key Q, the information content from the central information display 430 may be added to the instrument cluster display 420. According to one embodiment based on key E, the information content in the instrument cluster display 420 can be reduced.
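
A minimal sketch of this gesture-to-key assignment follows; the mapping of gesture reference numbers to the keys W, S, Q, and E follows the text, while the returned action strings are illustrative placeholders.

```python
# Sketch of assigning recognized gestures to the four keys W, S, Q, E
# described above. Action strings are illustrative placeholders.

GESTURE_TO_KEY = {
    912: "W",  # upward gesture
    913: "S",  # downward gesture
    914: "Q",  # gesture to the left
    915: "E",  # gesture to the right
}

KEY_TO_ACTION = {
    "W": "add content from instrument cluster display to head-up display",
    "S": "reduce content in head-up display",
    "Q": "add content from central information display to cluster display",
    "E": "reduce content in instrument cluster display",
}

def handle_gesture(gesture_id):
    key = GESTURE_TO_KEY.get(gesture_id)
    return KEY_TO_ACTION.get(key, "no action")

print(handle_gesture(912))  # add content from instrument cluster display ...
```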


First sensor 301 and/or second sensor 302 may be configured to zoom in and out according to a change of a distance d1, d2 of hand 900 of the user to corresponding sensor 301, 302.


A view of surroundings 1 of the motor vehicle is shown schematically in FIG. 8. The view of the user is through a windshield 130, as shown schematically in FIG. 1. Image 411 of a head-up display 410 is projected onto windshield 130, which in the embodiment of FIG. 8 partially overlaps the view of surroundings 1.


According to one embodiment, a system of a vehicle is provided that includes an imaging system. The system may have a head-up display 410. The system may have a circuit 200 connectable to head-up display 410.


The circuit 200 of one embodiment in FIG. 2 may be configured to send image data 412 to the head-up display 410 to be displayed. The head-up display 410 may be configured to project an image 411 onto front windshield 130. The image 411 may be based on image data 412.


The circuit 200 may be configured to output a first information item and a second information item. The first information item may have a higher priority than the second information item.


The circuit may be configured to replace the second information item with the first information item when an alert signal is estimated based on measured traffic-related data and/or received traffic-related data.


According to one embodiment of FIG. 8, a traffic jam may be detected in front of the vehicle. The circuit 200 may be configured to output a warning signal on approach to the traffic jam. A symbol with high priority may be output as a warning signal as a component of image 411. For example, a previously displayed navigation guidance may be faded out or decreased in size significantly. The driver may change the information shown in image 411 by hand gestures detected by means of gesture sensor 302, swiping between three different information levels about the traffic jam while zooming in or out: a traffic warning, an alternative route, a big route guidance arrow, or an overview map may be displayed based on the hand gesture.


The average speed of road users in front of the vehicle may be compared with a threshold. If the average speed of the road users is below the threshold, a warning symbol may be displayed as the first information item. Traffic congestion data may be received wirelessly, such as from a radio station. As the vehicle approaches the traffic congestion, a warning symbol may be shown as the first information item. The second information item of lower priority is, for example, a route guidance symbol.
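
By way of illustration, this priority-based replacement could be sketched as follows; the speed threshold and item names are assumptions.

```python
# Sketch of replacing the lower-priority information item with the
# higher-priority warning symbol when the average speed of road users
# ahead falls below a threshold. Threshold and items are illustrative.

SPEED_THRESHOLD_KMH = 30

def select_hud_item(avg_speed_ahead_kmh):
    first_item = "traffic-jam warning symbol"   # higher priority
    second_item = "route guidance symbol"       # lower priority
    if avg_speed_ahead_kmh < SPEED_THRESHOLD_KMH:
        return first_item                       # replace lower-priority item
    return second_item

print(select_hud_item(20))   # traffic-jam warning symbol
print(select_hud_item(80))   # route guidance symbol
```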


In an embodiment of FIG. 9, a sequence is shown schematically by means of different views on different displays 410, 430 of FIG. 1. The topmost view shows central information display 430 with a number of applications. By hand gesture 917 of hand 900 detected by a first sensor 301, the applications in central information display 430 are moved and the application in the center may be selected. According to one embodiment of FIG. 9, swipe hand gesture 917 brings the application “phone book” into the middle of the foreground.


The middle view in FIG. 9 also shows the central information display 430 with a second sensor 302. By a hand gesture 912 of hand 900 detected by the second sensor 302, the application "phone book" in the middle of the central information display 430 may additionally be displayed in a head-up display 410 as image 411. In this respect, the display of the same application in the central information display 430 and in the head-up display 410 may differ.


The lower view in FIG. 9 shows a view through the windshield with an image 411 of the head-up display 410. According to one embodiment, only pictures of people from the phone book are shown in the head-up display 410. A simple, intuitive selection by the driver is possible by displaying the pictures of people. The position of a number of pictures of people may be changed by changing the distance d between hand 900 and second sensor 302. To make a selection, one of the pictures of people can be scrolled into the foreground. If the distance of hand 900 to second sensor 302 is kept constant, a selection gesture may be detected and a telephone call may be initiated. A telephone connection can be ended by another hand gesture.
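
The distance-controlled scrolling and the hold-to-select behavior described above might look as follows in a minimal sketch; the band width, number of samples, and jitter tolerance are assumptions.

```python
# Sketch of scrolling phone-book pictures by hand distance and detecting
# the selection gesture (distance held constant). Values are illustrative.

def picture_index(distance_mm, n_pictures, band_mm=30):
    """Each distance band of band_mm scrolls to the next picture."""
    return min(int(distance_mm // band_mm), n_pictures - 1)

def selection_held(distances, hold_samples=5, jitter_mm=5):
    """Detect the selection gesture: the distance stays nearly constant
    over a number of consecutive samples."""
    recent = distances[-hold_samples:]
    return len(recent) == hold_samples and max(recent) - min(recent) <= jitter_mm

print(picture_index(75, n_pictures=8))        # 2
print(selection_held([72, 74, 73, 73, 74]))   # True -> initiate the call
```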


According to one embodiment of FIG. 9, an infotainment system of a vehicle is provided. The system may include an imaging system with a function to initiate a telephone call. In a first step the phone book may be brought into the central information display 430 via a swipe gesture. In a second step the phone book application may be brought to head-up display 410 using a gesture. In the third step the head-up display 410 may show pictures from the phone book. In a fourth step a gesture changing the proximity of hand 900 to sensor 302 makes the pictures flip. If the gesture is held, a telephone call may be initiated in a fifth step. The corresponding picture may be enlarged in a sixth step. The telephone call may be ended in a seventh step.


According to one embodiment of FIG. 10, the possibility of reducing the information in a head-up display 410 is provided. An infrared sensor 302 between instrument cluster display 420 and head-up display 410 may be used to reduce or add information by gesture.


Two views through a windshield with an image 411 projected onto the windshield and a sensor 302 with a hand gesture 913 of a hand 900 are shown schematically in one embodiment of FIG. 10. An object 415 (text) of an application within image 411 is moved downward and/or made smaller by hand gesture 913 detected by means of sensor 302. According to one embodiment in FIG. 10, the object is a text and/or a graphic and/or a video and/or a widget. Then, the object 415 in image 411 disappears automatically. The application associated with object 415 can continue to be shown on another display 420, 430.


While various embodiments of the invention have been described, it will be apparent to those of ordinary skill in the art that many more embodiments and implementations are possible within the scope of the invention. Accordingly, the invention is not limited to the aforementioned embodiments.


LIST OF REFERENCE CHARACTERS




  • 110 steering wheel


  • 111 grip area


  • 112 steering column switch


  • 120 dashboard


  • 130 windshield


  • 140 driver seat


  • 150 passenger seat


  • 161, 162 retainer


  • 170 gear shift


  • 200 circuit


  • 201, 202 function block


  • 301, 302 sensor, IR distance sensor


  • 303, 304 input keypad


  • 305 interior camera


  • 306 microphone


  • 311, 312, 313, 314 exterior camera


  • 321, 322, 323, 329 outside distance sensors


  • 330 satellite receiver


  • 340 bus interface


  • 350 transceiver


  • 361, 362 NFC


  • 410 head-up display


  • 411 image


  • 412, 422, 432 image data


  • 419 projection area


  • 420, 430 display


  • 461, 462 cell phone, smartphone


  • 900 hand


  • 910, 911 steering movement


  • 912, 913 hand gesture

  • d, d1, d2, dth distance

  • t time

  • IR infrared light

  • S301, S302 sensor signal

  • S3 detection signal, command

  • C, Cu(t), Cl(t) signal values, signal course

  • C1, C2 amplitude

  • Cp(t) pattern course

  • α angle


Claims
  • 1. An input device for a motor vehicle having a dashboard and a steering wheel, the input device comprising: a sensor; and a circuit connected to the sensor, wherein the sensor is positioned in the dashboard sensitive toward an area of the steering wheel, wherein the sensor is configured to output a sensor signal based on a contactless hand gesture, wherein the circuit is configured to evaluate the sensor signal, wherein the circuit and/or the sensor is configured to distinguish between a steering movement and the hand gesture, and wherein the circuit is configured to output a detection signal (S3) based on the detection of the hand gesture.
  • 2. The input device according to claim 1, wherein the evaluation circuit is configured to determine an amplitude of the sensor signal and/or to control the amplitude of the sensor signal.
  • 3. The input device according to claim 2, wherein the evaluation circuit is configured to compare the amplitude (C) of the sensor signal with a threshold (th), whereby the threshold (th) is associated with a position (dth) of the hand in a space between the steering wheel and the sensor.
  • 4. The input device according to claim 1, wherein the evaluation circuit is configured to compare a course of the sensor signal (Cu(t), Cl(t)) with a pattern course (Cp(t)) associated with the hand gesture.
  • 5. The input device according to claim 1, wherein the circuit is configured, based on the sensor signal, to determine at least two distance values (C1, C2) associated with at least two different distances (d1, d2) between the hand and the sensor.
  • 6. The input device according to claim 1, wherein the sensor is configured to generate the sensor signal based on a direction of a movement of the hand.
  • 7. The input device according to claim 1, wherein the sensor and/or the circuit is configured to adjust a sensitivity of the sensor and/or the circuit.
  • 8. The input device according to claim 1, wherein the circuit is configured to initiate a command (S3) in an application program, which is assigned to the detection of the hand gesture based on an input by a user.
  • 9. The input device according to claim 8, wherein a type of the command (S3) depends on the currently displayed application program.
  • 10. A system for a motor vehicle having a dashboard and a steering wheel and a windshield, the system comprising: a head-up display configured to project an image onto the windshield or onto a separate combiner; and an input device, wherein the circuit is connected to the head-up display and configured to generate the image, and wherein the circuit is configured, based on the recognition of a hand gesture, to change the image.
  • 11. The system according to claim 10, wherein the circuit is configured to change the image based on an addition or removal of an object in the image, the object being associated with an application program running on the circuit.
  • 12. The system according to claim 11, wherein the circuit is configured to determine a selection input, the selection input being associated with a selection of an application program, wherein the circuit is configured to output information associated with the application program on another display, and wherein the object in the image of the head-up display includes information associated with the selected application program.
  • 13. The system according to claim 10, wherein the input device has a sensor and a circuit connected to the sensor and to the head-up display, wherein the sensor is positioned in the dashboard sensitive toward the steering wheel, wherein the hand gesture comprises a contactless hand gesture and the sensor is configured to output a sensor signal based on the contactless hand gesture, wherein the circuit is configured to evaluate the sensor signal, and wherein the circuit is configured to distinguish between a steering movement and the hand gesture and to output a detection signal based on the detection of the hand gesture.
PCT Information
  Filing Document: PCT/EP2013/003863
  Filing Date: 12/19/2013
  Country: WO
  Kind: 00
Provisional Applications (1)
  Number: 61745229
  Date: Dec 2012
  Country: US