The present invention relates to an operation input device.
There is known an operation input device that includes a camera configured to capture an image of a predetermined detection region including at least a part of spokes connected to a steering wheel for steering a vehicle, an extraction means for extracting a shape and/or a movement of a hand of a driver based on a captured image of the camera, a determination means for determining a hand command (an operation command by fingers) corresponding to the shape and/or the movement of the hand extracted by the extraction means, and an execution means for executing the hand command determined by the determination means (for example, see Patent Document 1).
In this operation input device, the input position of the hand command is set at the spokes, which are not held during operation of the steering wheel; that is, by setting the detection region in this way, a command input and a steering operation can be reliably differentiated. Thus, it is considered that the operation input device can accurately recognize an operation input without incorrectly judging the shape of the hand during a steering operation as a hand command.
Patent Document 1: JP 2006-298003A
The operation input device disclosed in Patent Document 1 limits the detection region to the spokes, which are not held during operation of the steering wheel, in order to accurately recognize an operation input by differentiating a command input from a steering operation. However, from the viewpoint of operability, it is preferable that an operator be able to perform a gesture while holding the steering wheel.
It is an object of the invention to provide an operation input device that enables input by finger gestures and notification to fingers while holding the steering wheel.
[1] An operation input device according to a first embodiment of the invention includes a touch detector configured to detect an operation state of an operation unit of a steering wheel of a vehicle, a controller configured to determine an operation command by fingers of an operator to the touch detector based on a touch state of the fingers of the operator on the touch detector and to operate an operation target device, and a notification unit for notifying the operator based on the operation command by the fingers determined by the controller.
[2] The operation input device described in the above-mentioned [1] in which the operation command by the fingers is a gesture input by a hand of the operator may be provided.
[3] Further, the operation input device described in the above-mentioned [1] or [2] in which the touch detector is an electrostatic capacitance sensor may be provided.
[4] Moreover, the operation input device described in any one of the above-described [1] to [3] in which the notification based on the operation command by the fingers is a display of an operation menu of the operation target device may be provided.
[5] The operation input device described in any one of the above-described [1] to [3] in which the notification based on the operation command by the fingers is a display of an image projected on the fingers of the operator may be provided.
[6] The operation input device described in any one of the above-described [1] to [3] in which the notification based on the operation command by the fingers is a vibration presentation to the fingers of the operator may be provided.
[7] The operation input device described in the above-described [1] or [3] in which the operation unit of the steering wheel includes an electrostatic sensor built-in grip, and the touch detector is mounted on a front surface of the electrostatic sensor built-in grip may be provided.
[8] The operation input device described in [1], [3] or [7], in which the touch detector includes an operation input portion, may be provided. The operation input portion has a plurality of driving electrodes arranged at equal intervals in a predetermined direction, a plurality of detection electrodes arranged at equal intervals in a direction orthogonal to the predetermined direction, a driving unit configured to provide driving signals to the plurality of driving electrodes, and a reading unit configured to read out the electrostatic capacitance generated by combinations of the plurality of driving electrodes and the plurality of detection electrodes.
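The drive-and-read cycle of [8] can be sketched as follows; this is a minimal illustrative sketch, not the claimed implementation, and the function names and simulated count values are assumptions for illustration.

```python
# Sketch of the matrix scan described above: each driving electrode is
# energized in turn while every detection electrode is read, yielding one
# capacitance count per (drive, sense) intersection. Hardware access is
# simulated; read_count() is a hypothetical stand-in for the reading unit.

from typing import Callable, List

def scan_matrix(n_drive: int, n_sense: int,
                read_count: Callable[[int, int], int]) -> List[List[int]]:
    """Return an n_drive x n_sense grid of capacitance count values."""
    grid = []
    for d in range(n_drive):          # driving unit selects one drive line
        row = [read_count(d, s)       # reading unit samples each sense line
               for s in range(n_sense)]
        grid.append(row)
    return grid

# Simulated sensor: a finger near intersection (2, 3) raises the count.
def fake_read(d: int, s: int) -> int:
    return 50 if (d, s) == (2, 3) else 3

grid = scan_matrix(4, 6, fake_read)
```

In a real device the inner read would be replaced by hardware access to the driving unit 113 and reading unit 114.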
[9] The operation input device described in [1] or [5] in which the notification based on the operation command by the fingers is an image projected on a hand of the operator while the operation unit of the steering wheel is being held may be provided.
According to an embodiment of the invention, it is possible to provide an operation input device that enables input by finger gestures and notification to the fingers while the steering wheel is being held.
An operation input device 1 includes a touch sensor 111 that is a touch detector configured to detect an operation state of an operation unit 101 of a steering wheel 100 of a vehicle 9, a controller 18 configured to determine an operation command by fingers of an operator to the touch sensor 111 based on a touch state of the fingers 200 of the operator on the touch sensor 111 and to operate an operation target device, and a notification unit for notifying the operator based on the operation command by the fingers determined by the controller 18.
As the notification means, as illustrated in
As illustrated in
The operation input device 1 includes a controller 18 configured to determine an operation command by fingers to the touch sensor 111 based on a touch state of the fingers 200 of the operator on the touch sensor 111, and the controller 18 is configured to operate an operation target device 300 based on the operation command by the fingers and to perform notification to the fingers 200 based on the operation command (display of a menu and the like by the display portion 130, image projection onto the fingers 200 by the projection portion 140, and tactile sensation feedback to the fingers 200 by the vibration actuator 120).
The operation input device 1 is configured to perform notification as feedback for the operation input by the gesture operation while the touch sensor 111, provided at the top portion of the steering wheel as the operation unit 101 of the steering wheel 100 of the vehicle 9, is being held. The notification includes a display on a head-up display (HUD), the display portion 130 or the like, a projection display on the back of the hand, tactile sensation feedback to the fingers 200 by vibration, and the like. Thus, it is possible to include the traffic status in the forward direction, the operation unit 101 and the fingers 200 in the same view, and at the same time to make various types of notification to the fingers 200 performing the input operations. This enables safe operation.
The controller 18 is, for example, a microcomputer including a Central Processing Unit (CPU) that carries out computations, processes, and the like on acquired data in accordance with stored programs, Random Access Memory (RAM) and Read Only Memory (ROM) that are semiconductor memories, and the like. A program for operations of the controller 18, for example, is stored in the ROM. The RAM is used as a storage region that temporarily stores computation results and the like, for example, and detection value distribution information 150 and the like are generated. The controller 18 also includes an internal means for generating a clock signal, and operates on the basis of this clock signal.
The operation input portion 112 includes, as shown in
The driving electrodes 115 and the detection electrodes 116 are, for example, configured as electrodes using indium tin oxide (ITO), copper or the like. The driving electrodes 115 and the detection electrodes 116 are arranged at the lower part of the operation input portion 112, insulated from each other and intersecting each other.
The driving electrodes 115 are, for example, arranged in parallel with the x-axis at equal intervals on the paper surface of the
The detection electrodes 116 are, for example, arranged in parallel with the y-axis at equal intervals on the paper surface of the
The detection signal S1b is generated in accordance with a set resolution. Specifically, the reading unit 114 performs processing to obtain the detection signal S1b by combining coordinates x1 to coordinates xn, coordinates y1 to coordinates ym, and the electrostatic capacitance count value, as shown in
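The combination of coordinates and capacitance count values described above can be sketched in Python as follows; the grid values and the threshold are assumptions for illustration, not values from the embodiment.

```python
# Sketch of forming detection value distribution information from the raw
# counts: counts at coordinates (x1..xn, y1..ym) whose value exceeds a
# threshold are marked as the touch ("hatching") region.

def touch_region(grid, threshold=20):
    """Return the set of (x, y) coordinates whose count exceeds threshold."""
    return {(x, y)
            for y, row in enumerate(grid)
            for x, count in enumerate(row)
            if count > threshold}

# Toy 3x3 count grid: a finger raises the counts at (1, 1) and (2, 1).
grid = [[3, 3, 3],
        [3, 40, 35],
        [3, 3, 3]]
region = touch_region(grid)   # {(1, 1), (2, 1)}
```

The resulting coordinate set plays the role of the hatching region referred to in the later determination steps.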
A vibration actuator 120 as a notification means may utilize various actuators as long as the actuator is of a configuration in which vibration is generated by applying a voltage or an electric current. As shown in
The vibration actuator 120 may, for example, be an eccentric rotation motor including an eccentric rotor. The eccentric rotor is formed of a metal such as brass and, because its center of gravity is set at a position eccentric from the rotational axis, functions as an eccentric weight when mounted on the rotational axis of the rotation motor. Therefore, when the rotation motor rotates with the eccentric rotor mounted, the eccentricity of the eccentric rotor causes a whirling movement about the rotational axis, generating vibrations in the rotation motor, which thus functions as a vibration actuator.
Moreover, the vibration actuator 120, for example, may be a monomorph piezoelectric actuator provided with a metal plate and a piezoelectric element. The monomorph piezoelectric actuator is a vibration actuator having a structure which bends with only one piezoelectric element layer. Examples of the material of the piezoelectric element include lithium niobate, barium titanate, lead titanate, lead zirconate titanate (PZT), lead metaniobate, polyvinylidene fluoride (PVDF), and the like. Note that, as a modification of the vibration actuator, a bimorph piezoelectric actuator in which two sheets of piezoelectric elements are provided on both sides of a metal plate may be used.
A display portion 130 as a notification means, for example, is configured to function as the display portion of an operation target device and the display portion of a vehicle-mounted device. The display portion 130 is, for example, a liquid crystal monitor arranged on a center console 90. The display portion 130, for example, displays a menu screen, images and the like related to a display image 141. The related menu screen and images are, for example, icons and the like on the menu screen of the functions that can be operated by the touch sensor 111. The icons enable selection, decision and the like, for example, by a touch state (presence or absence of a touch by the holding fingers, the number of fingers touching, and determination of whether the left or right hand is used by detecting the thumb) on the operation input portion 112 of the touch sensor 111 and by an operation such as a tracing operation made by touching consecutively.
A projection portion 140 as a notification means, for example, as shown in
The projection portion 140, for example, is configured to generate a display image 141 based on the image information S2 acquired from the controller 18 and to project the generated display image 141 onto the back of the fingers 200 of an operator. The display image 141 may be a symbol, pattern, diagram or the like corresponding to an operation command by the fingers determined by a gesture operation.
The projection portion 140 is, as an example, a projector having a light-emitting diode (LED) as a light source. The projection portion 140 projects, as shown in
As illustrated in
A controller 18 may obtain the position of the center of gravity of the touch region from the above-described hatching region. The X-coordinate of the position of the center of gravity G is an average value taken from the values in
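The center-of-gravity computation can be sketched as a count-weighted average over the touched coordinates; this is an illustrative sketch, and the sample points are assumptions.

```python
# Sketch of the center-of-gravity computation described above: G is the
# average of the touched coordinates, weighted by the capacitance count
# at each point of the hatching region.

def center_of_gravity(points):
    """points: iterable of (x, y, count); returns (Gx, Gy)."""
    pts = list(points)
    total = sum(c for _, _, c in pts)
    gx = sum(x * c for x, _, c in pts) / total
    gy = sum(y * c for _, y, c in pts) / total
    return gx, gy

# Two equally weighted touched cells at y = 1: G lies midway between them.
g = center_of_gravity([(1, 1, 40), (2, 1, 40)])  # -> (1.5, 1.0)
```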
The controller 18 may determine a touch state of fingers from the above-described hatching region. As illustrated in
To achieve higher accuracy in the above-described determination, templates for various types of pattern matching are stored in a memory. Templates for various types of holding, such as holding with the right hand, holding with the left hand, holding with the forefinger extended and holding with four fingers extended, are prepared. Further, it is possible to accurately detect the positional relationship of the hand and the movement of a finger by calibrating the width of the hand and the like for each operator.
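The template matching described above can be sketched as follows; the toy 3x3 patterns and template names are assumptions for illustration, not the templates of the embodiment.

```python
# Sketch of the template matching described above: the binary touch grid
# is compared against stored hold templates ("right hand", "forefinger
# extended", ...) and the closest match is chosen by cell-wise distance.

def match_template(grid, templates):
    """Return the template name with the fewest mismatching cells."""
    def distance(a, b):
        return sum(ca != cb
                   for ra, rb in zip(a, b)
                   for ca, cb in zip(ra, rb))
    return min(templates, key=lambda name: distance(grid, templates[name]))

templates = {
    "forefinger_extended": [[0, 1, 0], [0, 1, 0], [1, 1, 1]],
    "four_fingers":        [[1, 1, 1], [1, 1, 1], [0, 0, 0]],
}
observed = [[0, 1, 0], [0, 1, 0], [1, 1, 0]]
hold = match_template(observed, templates)  # "forefinger_extended"
```

Per-operator calibration would correspond to adjusting or re-recording these templates for each operator's hand width.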
In an input operation shown in
By such input operation, a hatching region as a touch region of a touch sensor 111 (development view) shown in
A controller 18 can determine that the operator is making a trace (slide) operation in the direction of the arrow A in the diagram with the forefinger of the left hand being extended. This may be determined from a change in the position of the center of gravity G, pattern matching based on the detection value distribution information 150, or the like.
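The slide determination from a change in the center of gravity G can be sketched as follows; the coordinate values and the displacement threshold are assumptions for illustration.

```python
# Sketch of the trace (slide) determination described above: successive
# center-of-gravity positions are compared, and a horizontal displacement
# larger than a threshold (and larger than the vertical displacement) is
# classified as a slide to the left or right.

def classify_slide(g_start, g_end, threshold=1.0):
    """Classify the movement of G between two samples, or return None."""
    dx = g_end[0] - g_start[0]
    dy = g_end[1] - g_start[1]
    if abs(dx) >= threshold and abs(dx) > abs(dy):
        return "slide_right" if dx > 0 else "slide_left"
    return None

gesture = classify_slide((2.0, 1.0), (5.5, 1.2))  # "slide_right"
```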
The controller 18 can determine, based on the gesture operation of the above-described input operation, an operation command made by the fingers. Thus, the controller 18 selects a main menu of a display portion 130 shown in
The controller 18, as shown in
In an input operation shown in
By such input operation, a hatching region as a touch region of a touch sensor 111 (development view) can be detected as in
A controller 18 can determine that the operator is making a vertical movement operation of the finger with the forefinger of the left hand being extended. This may be determined from a change in the detection value distribution information 150, the above-described pattern matching, or the like.
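The vertical-movement determination can be sketched as follows; the frame data and minimum area are assumptions for illustration.

```python
# Sketch of the vertical-movement determination described above: lifting
# and lowering the forefinger makes its touch region appear and disappear,
# so a transition from "no touch" to "touch" across successive frames of
# the detection value distribution is counted as one up-and-down movement.

def count_taps(area_per_frame, min_area=1):
    """Count transitions from 'no touch' to 'touch' in a frame sequence."""
    taps, prev = 0, False
    for area in area_per_frame:
        touching = area >= min_area
        if touching and not prev:
            taps += 1
        prev = touching
    return taps

taps = count_taps([0, 3, 3, 0, 0, 4, 0])  # 2 up-and-down movements
```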
The controller 18 can determine, based on the gesture operation of the above-described input operation, the operation command made by the fingers. Thus, the controller 18 selects a selection menu of A of the display portion 130 shown in
The controller 18, as shown in
In an input operation shown in
By such input operation, a hatching region as a touch region of a touch sensor 111 (development view) shown in
The controller 18 can determine that the operator holds the electrostatic sensor built-in grip 110 with the four fingers being extended. This may be determined by the position of a center of gravity G, the pattern matching based on the detection value distribution information 150 and the like.
The controller 18 can determine, based on the gesture operation of the above-described input operation, the operation command made by the fingers. Thus, the controller 18 selects a selection menu of A′ of a display portion 130 shown in
The controller 18, as shown in
According to an embodiment, effects such as those described below are achieved.
(1) An operation input device 1 according to the embodiment includes a touch sensor 111 that is a touch detector configured to detect an operation state of an operation unit 101 of a steering wheel 100 of a vehicle 9, a controller 18 configured to determine an operation command by fingers of an operator to the touch sensor 111 based on a touch state of the fingers 200 of the operator on the touch sensor 111 and to operate an operation target device, and a notification unit for notifying the operator based on the operation command by the fingers determined by the controller 18. Thus, safe operation is made possible by including the traffic status in the forward direction, a display and the operating hand in the same view.
(2) A gesture input while holding a steering wheel 100 (electrostatic sensor built-in grip 110) enables the operation to be performed in a stable manner. Further, at the same time, by providing a tactile sensation feedback linked with operations, operational feeling is improved.
(3) By projecting an operation content linked with a movement of hands, not only the operator but also a passenger can comprehend the operation content.
(4) Further, it is possible to accurately detect the positional relationship of the hand and finger movements by calibrating the width of the hand for each operator.
(5) The detection system does not use camera images; therefore, neither the cost of a camera nor a place for mounting a camera is necessary.
Although several embodiments of the invention have been described above, these embodiments are merely examples and the invention according to the claims is not to be limited thereto. These novel embodiments may be implemented in various other forms, and various omissions, substitutions, changes and the like can be made without departing from the spirit and scope of the invention. In addition, all the combinations of the features described in these embodiments are not necessarily needed to solve the technical problem. Further, these embodiments are included within the spirit and scope of the invention and also within the invention described in the claims and the scope of equivalents thereof.
1 Operation input device
9 Vehicle
18 Controller
100 Steering wheel
101 Operation unit
110 Electrostatic sensor built-in grip
111 Touch sensor
112 Operation input portion
113 Driving unit
114 Reading unit
115 Driving electrode
116 Detection electrode
120 Vibration actuator
130 Display portion
140 Projection portion
Number | Date | Country | Kind |
---|---|---|---|
2016-152708 | Aug 2016 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2017/021828 | 6/13/2017 | WO | 00 |