The invention relates to a tactile presentation device and a tactile presentation method.
As conventional art, there is known a tactile presentation device in which an annular piezoelectric element is fixed to a main surface on one side of a flexible plate-like member whose main surface on the other side serves as an operation surface touched by an operator.
Such a tactile presentation device notifies the operator with a tactile presentation through deflection of the plate-like member generated along with the deformation of the annular piezoelectric element. The tactile presentation device can generate a sufficiently large displacement of the operation surface, and can clearly inform the operator that the operation has been received.
[Patent Literature 1] Japanese Laid-open No. 2019-169009
In the conventional tactile presentation device, when the operator's awareness conforms to the fingertip making contact with the operation surface, the significant displacement of the operation surface, that is, a strong tactile sensation, may bring discomfort to the operator.
Accordingly, an objective of the invention is to provide a tactile presentation device capable of suppressing the discomfort by presenting a suitable tactile sensation.
An aspect of the invention provides a tactile presentation device. The tactile presentation device includes: an operation unit, having an operation member operated by a user; a tactile presentation unit, presenting a tactile sensation to the user via the operation member; an awareness determination unit, determining whether the user is aware of at least one of the operation member and a display screen for displaying an operation target operated by the operation member; and a control unit, controlling the tactile presentation unit so as to adjust an intensity of the tactile sensation presented to the user in accordance with a determination result of the awareness determination unit.
Another aspect of the invention provides a tactile presentation method for controlling a tactile presentation device. The tactile presentation device includes: an operation unit, having an operation member operated by a user; and a tactile presentation unit, presenting a tactile sensation to the user via the operation member. The tactile presentation method includes: an awareness determination step of determining whether the user is aware of at least one of the operation member and a display screen for displaying an operation target operated by the operation member; and a control step of controlling the tactile presentation unit so as to adjust an intensity of the tactile sensation presented to the user in accordance with a determination result of the awareness determination step.
Yet another aspect of the invention provides a tactile presentation method. The tactile presentation method includes: a step of detecting a sight line of a user upon an operation in which the user operates an operation member; a step of monitoring whether the sight line intersects with an awareness region of the operation member; and a step of exerting control to: if the sight line of the user intersects with the awareness region, present a tactile sensation weaker than the tactile sensation in a case where the awareness of the user does not conform; and if the sight line of the user does not intersect with the awareness region, present a tactile sensation stronger than the tactile sensation in a case where the awareness of the user conforms.
According to the invention, by presenting a suitable tactile sensation, the discomfort can be suppressed.
A tactile presentation device according to the embodiments is schematically configured as including: an operation unit, having an operation member operated by a user; a tactile presentation unit, presenting a tactile sensation to the user via the operation member; an awareness determination unit, determining whether the user is aware of at least one of the operation member and a display screen for displaying an operation target operated by the operation member; and a control unit, controlling the tactile presentation unit so as to adjust the intensity of the tactile sensation presented to the user in accordance with the determination result of the awareness determination unit.
The tactile presentation device adjusts the intensity of the tactile sensation presented to the user in accordance with whether the user is aware of the operation member or the display screen. Therefore, compared with the case where the tactile presentation is constant regardless of awareness, the discomfort brought by a constantly strong tactile sensation can be suppressed.
(Outline of Tactile Presentation Device 1)
In the following, an example of a tactile presentation device according to the embodiment will be described with reference to the drawings. In the respective drawings according to the embodiment described in the following, the ratios between the figures may be different from the actual ratios. In addition, in
The tactile presentation device 1 is configured to present tactile feedback, that is, a tactile sensation, making the user recognize that an operation performed by the user has been received. As an example, the tactile presentation device 1 according to the embodiment is electrically connected to an electronic apparatus mounted in a vehicle 8, and is configured to function as an operation device receiving an input of the information used for controlling the electronic apparatus. As an example, the electronic apparatus is a control device, etc., for comprehensively controlling an air conditioning device, a music and video playback device, a navigation device, etc., of the vehicle 8.
When the tactile presentation device 1 is mounted in the vehicle 8, it may be difficult for the user to recognize the tactile presentation due to vibrations and sounds, etc., resulting from traveling, etc. However, when the tactile presentation device constantly presents a strong tactile sensation, although the user can notice the presented tactile sensation, the strong tactile sensation may bring discomfort to the user. The tactile presentation device 1 according to the embodiment includes, as an example, a configuration as follows to present a suitable tactile sensation in accordance with the user's awareness.
As shown in
As shown in
Although the tactile presentation device 1 presents a tactile sensation through vibration of the operation member, the tactile presentation device 1 may further add at least one of a sound and light to the tactile presentation. The tactile presentation unit 3 of the embodiment performs tactile presentation by making a sound in addition to applying vibration to the operation member.
As shown in
As shown in
In the following, as an example, the tactile presentation in a case where the user operates the touch pad 2 as the operation unit to operate an operation target, such as a cursor or an icon, displayed on a display screen 840 of the main display 84 as the display device will be described.
However, the operation unit is not limited to the touch pad 2. For example, the operation unit may also be at least one of multiple steering switches 86 arranged on a steering 83 and receiving a push operation, etc., the main display 84 including a touch panel receiving a touch operation, and other mounted operation devices receiving a rotary operation, etc.
In addition, the display screen displaying the operation target is not limited to the display screen 840 of the main display 84. The display screen may also be at least one of the display screen 840 of the main display 84, a display screen 850 of the sub-display 85, a display screen of a multi-functional mobile phone or a tablet terminal connected to the vehicle 8 in a wired or wireless manner, and a display screen of a display device mounted elsewhere, such as a head-up display displaying the operation target on a windshield, etc.
(Configuration of Touch Pad 2)
The touch pad 2 is a touch pad that detects a touched location on the operation surface 200 when the operation surface is touched by a part (e.g., a finger) of the user's body or a dedicated pen. Through performing an operation on the operation surface 200, it is possible for the user to operate the electronic apparatus that is connected. As the touch pad 2, a touch pad of a resistance film type, an infrared type, a surface acoustic wave (SAW) type, a capacitance type, etc., can be used.
In the embodiment, the touch pad 2 is a touch pad of the capacitance type, and is able to detect a tracing operation, a touch operation, a multi-touch operation, and a gesture operation, etc.
The touch pad 2, as shown in
As shown in
As shown in
As shown in
As a modified example, the touch pad 2 may also be configured as including an electrostatic control unit calculating the coordinates under operation. In such case, the touch pad 2 generates the detection information S1 including the coordinates where the operation is detected and outputs the detection information S1 to the control unit 6.
(Configuration of Tactile Presentation Unit 3)
The tactile presentation unit 3 presents a tactile sensation by applying vibration to the panel 20 of the touch pad 2 and outputting a sound in association with the vibration. The tactile presentation unit 3 presents the vibration by using an actuator, such as a voice coil motor or a piezoelectric element. The tactile presentation unit 3 of the embodiment, as an example, includes a voice coil motor 30 applying vibration to the panel 20 and a sound output unit 31 outputting a sound.
As a modified example, the tactile presentation unit 3 may also output a sound from the vibration of the panel 20 itself, together with the vibration. In this case, the driving signal output from the control unit 6 is a signal in which a signal for generating a sound is superimposed on a signal for generating vibration.
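As an illustration of the superimposition described in this modified example, the following minimal Python sketch sums a low-frequency vibration component and a higher-frequency sound component into a single waveform. The frequencies, amplitudes, and sample rate are illustrative assumptions, not values from the embodiment.

```python
import math

def superimposed_signal(n=400, rate=8000, vib_hz=150, snd_hz=1000):
    """Return one driving waveform in which a sound-generating component is
    superimposed on a vibration-generating component (all values illustrative)."""
    vib = [math.sin(2 * math.pi * vib_hz * i / rate) for i in range(n)]
    snd = [0.3 * math.sin(2 * math.pi * snd_hz * i / rate) for i in range(n)]
    return [v + s for v, s in zip(vib, snd)]  # summed waveform sent as one signal
```

Driving the panel with the summed waveform lets a single actuator produce both the vibration and the audible component.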
The voice coil motor 30, as shown in
The sound output unit 31, as shown in
In addition,
The driving signal S2 indicated in a solid line in
The driving signal S2 indicated in a dotted line in
The driving signal S3 indicated in a solid line in
The driving signal S3 indicated in a dotted line in
As an example, the driving signal S2 and the driving signal S3 for when the user's awareness conforms and for when the user's awareness does not conform are stored as the driving signal information 51 in the storage unit 5. The driving signal information 51 may be information of the waveforms of the driving signals, or may be information of functions for generating the driving signals. The invention is not particularly limited in this regard.
In the case where the driving signal S2 and the driving signal S3 are generated from functions, when the awareness is determined as not conforming, the control unit 6 multiplies the signals for when the awareness conforms by α and β, respectively, based on the driving signal information 51 to generate the driving signal S2 and the driving signal S3.
As an example, the driving signal S2 and the driving signal S3 are signals whose signal amplitudes differ when the user's awareness conforms and when the user's awareness does not conform. However, the invention is not limited thereto. The driving signal S2 and the driving signal S3 may also have different vibration patterns and sound output patterns, such as timings or numbers of times, when the user's awareness conforms and when the user's awareness does not conform, and may also have different amplitudes in addition to different vibration patterns and sound output patterns.
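The α/β scaling described above can be sketched as follows in Python. The gains `ALPHA` and `BETA`, the carrier frequency, and the sample rate are hypothetical placeholders; the patent does not specify numeric values.

```python
import math

# Illustrative values only: the patent specifies the alpha/beta multiplication,
# not these numbers.
ALPHA = 2.0   # gain for the vibration signal S2 when awareness does not conform
BETA = 1.5    # gain for the sound signal S3 when awareness does not conform
SAMPLE_RATE = 8000
FREQ_HZ = 150
DURATION_S = 0.05

def sine_burst(amplitude):
    """Generate one short sinusoidal burst used as a driving waveform."""
    n = int(SAMPLE_RATE * DURATION_S)
    return [amplitude * math.sin(2 * math.pi * FREQ_HZ * i / SAMPLE_RATE)
            for i in range(n)]

def make_drive_signals(awareness_conforms):
    """Return (S2, S3): the baseline waveforms when the user's awareness
    conforms, and the same waveforms scaled by ALPHA / BETA when it does not."""
    base_s2 = sine_burst(1.0)   # vibration signal at the "aware" amplitude
    base_s3 = sine_burst(1.0)   # sound signal at the "aware" amplitude
    if awareness_conforms:
        return base_s2, base_s3
    return ([ALPHA * v for v in base_s2], [BETA * v for v in base_s3])
```

Storing only the baseline waveform plus the two gains, rather than two full waveform sets, is one way the function-based driving signal information 51 could save memory.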
(Configuration of Awareness Determination Unit 4)
In the case where either of a first region determined in advance to include the operation member and a second region determined in advance to include the display screen intersects with the user's sight line, the awareness determination unit 4 determines that the user is aware.
As shown in a dotted line in
As a modified example, the first region may also be an awareness region 74 including the steering switch 86 arranged in the steering 83 on which an operation is detected. In addition, the first region may also be another awareness region including an operation member under operation, such as a switch or a touch pad presenting a tactile sensation other than the above.
When multiple first regions are set as regions including operation members operable by the user, even if a first region including an operation member not under operation intersects with the sight line, the awareness determination unit 4 still determines that the user's awareness does not conform.
As a modified example, the second region may also be an awareness region 73 including the display screen 850 of the sub-display 85 as the display screen. In addition, the second region may also be another awareness region including a display screen of a head-up display, etc., displaying an operation target other than the above.
As shown in
The determination unit 41 is a micro-computer formed by a central processing unit (CPU) that calculates and processes the obtained data according to a stored program, and semiconductor memories such as a random access memory (RAM) and a read-only memory (ROM).
As an example, the determination unit according to the embodiment detects the sight line by adopting a process using a Purkinje image 92. In the following, an awareness region 7 representing the first region or the second region is adopted as an example, and the detection of the intersection between the user's sight line 93 and the awareness region 7 is described.
The determination unit 41 is configured to irradiate an eyeball 90 with a near-infrared beam from the illumination unit 42, calculate the sight line 93 from the light reflected on the surface of the cornea (Purkinje image 92) and the location of a pupil 91, and calculate an intersection point 94 between the sight line 93 and the awareness region 7 including the operation member. The area of the awareness region 7 may be set to an extent that the processing speed of the awareness determination unit 4 does not drop significantly.
Specifically, the determination unit 41 segments an image 43 captured by the camera 40 into regions of similar brightness, and determines, from region shapes, a pupil region from the respective segmented regions by using a pattern matching process, etc. An example of a process for determining the pupil region by pattern matching includes a process of assuming that the pupil is elliptical and specifying the elliptical region from the image.
Also, the determination unit 41 performs ellipse approximation, based on a least-squares error criterion, on the contour point set of the pupil 91 to obtain the ellipse center.
Also, the determination unit 41 searches for the Purkinje image 92 within a predetermined range from the obtained ellipse center. The center coordinates of the Purkinje image 92 are set to the gravity center of the detected region.
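The gravity-center search for the Purkinje image described above can be sketched as follows. The image representation (a mapping of pixel coordinates to brightness), the brightness threshold, and the search radius are illustrative assumptions.

```python
def purkinje_center(pixels, ellipse_center, radius=10, bright_th=200):
    """Find the gravity center of the bright pixels (candidate Purkinje image)
    within `radius` of the pupil's ellipse center.

    pixels: {(x, y): brightness}; threshold and radius are illustrative."""
    cx, cy = ellipse_center
    region = [(x, y) for (x, y), b in pixels.items()
              if b >= bright_th and (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2]
    if not region:
        return None  # no corneal reflection found near the pupil
    n = len(region)
    return (sum(x for x, _ in region) / n, sum(y for _, y in region) / n)
```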
Also, the determination unit 41 calculates the sight line 93 from the pupil 91 and the Purkinje image 92 that are obtained. In the following, the calculation is described.
The coordinates in the image coordinate system (XbYb coordinate system) shown in
Also, the determination unit 41 calculates the coordinates of the pupil center as the cornea curvature center in the camera coordinate system, and converts the respective coordinates into coordinates of the world coordinate system to obtain a sight line vector in the world coordinate system. When the sight line vector intersects with the awareness region 7 at the intersection point 94, the determination unit 41 determines that the user's awareness conforms to the awareness region 7. However, the process for detecting the sight line is not limited to the above example, and it is possible to apply a process such as calculating the intersection point 94 from the eye position or the rotation angle of the face, etc.
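The intersection test between the sight line vector and the awareness region 7 in the world coordinate system can be sketched as a ray-rectangle intersection. Representing the awareness region as a planar rectangle with an origin, two orthogonal edge vectors, and a normal is an assumption for illustration; the patent only states that the region's area is set in advance.

```python
def subtract(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))

def sight_line_hits_region(eye, direction, region_origin, u_edge, v_edge, normal):
    """Return True if the ray eye + t*direction (t > 0) crosses the rectangle
    spanned by orthogonal u_edge and v_edge from region_origin — i.e., whether
    an intersection point 94 exists inside the awareness region."""
    denom = dot(direction, normal)
    if abs(denom) < 1e-12:          # sight line parallel to the region's plane
        return False
    t = dot(subtract(region_origin, eye), normal) / denom
    if t <= 0:                      # region is behind the eye
        return False
    hit = tuple(e + t * d for e, d in zip(eye, direction))
    local = subtract(hit, region_origin)
    u = dot(local, u_edge) / dot(u_edge, u_edge)
    v = dot(local, v_edge) / dot(v_edge, v_edge)
    return 0.0 <= u <= 1.0 and 0.0 <= v <= 1.0
```

Running this test per detection cycle for each configured region (71, 72, and any modified-example regions) would yield the per-region awareness result.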
Although the determination unit 41 determines that the user's awareness conforms when the sight line 93 intersects with the awareness region 7, the invention is not limited thereto. The determination unit 41 may also determine that the user's awareness conforms when the user's face conforms to the direction of the awareness region 7.
The determination unit 41 generates awareness information S5 indicating whether the user's awareness conforms to the awareness region 71 or the awareness region 72, and outputs the awareness information S5 to the control unit 6.
(Configuration of Control Unit 6)
The control unit 6 is a micro-computer formed by a CPU calculating and processing the obtained data according to a stored program, and a RAM and a ROM, etc., which are semiconductor memories. The ROM, for example, stores a program for operating the control unit 6. The RAM, for example, is used as a storage region temporarily storing a computation result. In addition, the control unit 6 is provided inside with a means for generating a clock signal, and operates based on the clock signal.
The control unit 6 calculates the coordinates at which the operation is detected based on the detection information S1 periodically obtained from the touch pad 2 and the electrostatic threshold value 50 stored in the storage unit 5. As an example, the coordinates are calculated by using a weighted average. However, the invention is not limited thereto. The location of the gravity center may also be set as the coordinates at which the operation is detected.
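The weighted-average coordinate calculation can be sketched as follows. The grid representation of the detection information S1 and the numeric threshold are illustrative assumptions; only the rule that cells at or above the electrostatic threshold contribute, weighted by detection value, comes from the description above.

```python
TH = 100  # illustrative numeric stand-in for the stored electrostatic threshold value 50

def operation_coordinates(detection):
    """detection: {(x, y): capacitance detection value} per electrode cell.
    Return the detection-value-weighted average coordinates over cells whose
    value is >= TH, or None when no contact is detected."""
    touched = {pos: val for pos, val in detection.items() if val >= TH}
    if not touched:
        return None
    total = sum(touched.values())
    x = sum(px * val for (px, _), val in touched.items()) / total
    y = sum(py * val for (_, py), val in touched.items()) / total
    return (x, y)
```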
The control unit 6, for example, can detect whether a contact is made to the operation unit, specify the operation coordinates, and specify the operation mode. Regarding the detection of whether a contact is made to the operation unit, when the detection value of the touch pad 2 as the operation unit is equal to or greater than a predetermined threshold Th (electrostatic threshold value 50), the control unit 6 determines that there is a contact. In addition, the control unit 6 specifies the gravity center of the region in which the detection value is equal to or greater than the threshold Th as the operation coordinates.
When the trajectory of the operation coordinates continues over time, the control unit 6 determines the operation as a tracing operation, and when the presence and then absence of a contact to the operation unit is detected, the control unit 6 determines the operation as a touch operation. In addition, when contacts are detected at two or more places separated from each other, the control unit 6 determines that the operation is a multi-touch operation. In addition, when the trajectory of the operation coordinates is associated with and specified from the user's behavior, the control unit 6 can determine the operation as a gesture operation. For example, from the trajectory over time of a region in which the detection value is equal to or greater than the predetermined threshold, the control unit 6 can specify whether the user performs a swipe operation. Accordingly, the control unit 6 can specify various operation modes with respect to the operation unit.
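The operation-mode decisions above can be sketched coarsely as follows, with each periodic detection cycle reduced to a list of contact coordinates. The frame representation and the string labels are hypothetical simplifications; a real implementation would track trajectories and timing in detail.

```python
def classify(frames):
    """frames: list of per-cycle contact lists, e.g. [[(x, y)], [(x, y)], ...].
    Return a coarse operation-mode label."""
    if not any(frames):
        return "none"                       # no contact in any cycle
    max_contacts = max(len(f) for f in frames)
    if max_contacts >= 2:
        return "multi-touch"                # contacts at two or more places
    points = [f[0] for f in frames if f]    # single-contact trajectory
    moved = any(p != points[0] for p in points)
    return "tracing" if moved else "touch"  # moving trajectory vs. stationary tap
```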
When the user is aware of at least one of the panel 20 and the display screen 840, the control unit 6 sets the intensity of the tactile sensation to be relatively weaker than the intensity when the user is not aware. That is, when the user is aware of at least one of the awareness region 71 (first region) including the panel 20 and the awareness region 72 (second region) including the display screen 840, the control unit 6 sets the intensity of the tactile sensation to be relatively weaker than the intensity when the user is not aware.
Setting the intensity of the tactile sensation to be relatively weaker may be a process of making the intensity of the tactile sensation at the time of being aware weaker than the intensity when not aware, or a process of making the intensity of the tactile sensation at the time of not being aware stronger than the intensity when aware. In the embodiment, as shown in
The control unit 6 generates operation information S6 including the information of the coordinates at which the operation is detected based on the detection information S1 output from the touch pad 2, and outputs the operation information S6 to the electronic apparatus that is electrically connected. The electronic apparatus obtains the operation information S6 and, when it is necessary to present the tactile sensation, outputs an instruction signal S7 to the control unit 6. The control unit 6 confirms the user's awareness based on the input of the instruction signal S7, and presents a suitable tactile sensation in accordance with whether the user's awareness conforms or not.
The control unit 6 is not limited to presenting the tactile sensation according to the input of the instruction signal S7, but may also be configured such that the control unit 6 itself determines whether to present the tactile sensation.
In the following, an example of the operation of the tactile presentation device 1 according to the embodiment is described in accordance with the flowchart of
(Operation)
When the power of the vehicle 8 is turned on, the control unit 6 of the tactile presentation device 1 monitors whether an operation is performed based on the detection information S1 periodically output from the touch pad 2. When an operation is detected (Step1: Yes), the control unit 6 generates the operation information S6 based on the detected operation, and outputs the operation information S6 to the electronic apparatus. Also, when obtaining the instruction signal S7 from the electronic apparatus, the control unit 6 controls the awareness determination unit 4 to detect the sight line (Step2).
When the user's sight line 93 intersects with the awareness region 71 or the awareness region 72 based on the awareness information S5 output from the awareness determination unit 4, that is, when the user's awareness conforms to the panel 20 or the display screen 840 (Step3: Yes), the control unit 6 generates the driving signal S2 and the driving signal S3 presenting a tactile sensation weaker than the tactile sensation when the user's awareness does not conform (Step4).
The control unit 6 outputs the driving signal S2 and the driving signal S3 that are generated to the tactile presentation unit 3 to present the tactile sensation (Step5), and the operation relating to tactile presentation is ended.
In Step3, when the user's sight line 93 does not intersect with any of the awareness region 71 and the awareness region 72, that is, when the user's awareness does not conform to any of the panel 20 and the display screen 840 (Step3: No), the control unit 6 generates the driving signal S2 and the driving signal S3 presenting a tactile sensation stronger than the tactile sensation when the user's awareness conforms (Step6), and presents the tactile sensation (Step5).
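The flow of Step1 through Step6 above can be condensed into a single decision function. The two Boolean arguments are hypothetical stand-ins for the touch pad 2 detection result and the awareness determination unit 4 result.

```python
def tactile_cycle(operation_detected, awareness_conforms):
    """Return the tactile intensity level to drive S2/S3 with for one cycle,
    or None when no operation was detected (Step1: No)."""
    if not operation_detected:      # Step1: monitor the detection information S1
        return None
    # Step2: the sight line is detected; awareness_conforms carries that result
    if awareness_conforms:          # Step3: Yes -> weaker signals (Step4)
        return "weak"
    return "strong"                 # Step3: No -> stronger signals (Step6)
    # Either result is then output to the tactile presentation unit 3 (Step5)
```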
The tactile presentation device 1 according to the embodiment can suppress the discomfort by presenting a suitable tactile sensation. Specifically, the tactile presentation device 1 adjusts the intensity of the tactile sensation presented to the user in accordance with whether the user is aware of the panel 20 on which an operation is performed or the display screen 840 displaying the operation target. Therefore, compared with the case where a constant tactile sensation is presented regardless of awareness, a suitable tactile sensation can be presented to suppress the discomfort resulting from the strong tactile sensation.
When the user's sight line 93 does not conform to the awareness region 71 or the awareness region 72, the tactile presentation device 1 increases the intensity of the tactile sensation presented together with a sound, and can thereby clearly indicate that the operation performed without the user watching the awareness region 71 or the awareness region 72 has been received. In addition, in the case where the user's sight line 93 conforms to the awareness region 71 or the awareness region 72, the tactile presentation device 1 can lower the intensity of the tactile sensation together with the sound to suppress the discomfort.
When the user's sight line 93 intersects with the awareness region 71 including the panel 20 or the awareness region 72 including the display screen 840, the tactile presentation device 1 determines that the user's awareness conforms to the panel 20 or the display screen 840. Therefore, compared with the case where such configuration is not adopted, the configuration can present a more suitable tactile sensation in accordance with the user's awareness to suppress the discomfort.
The tactile presentation device 1 weakens the tactile sensation not only when the user watches the panel 20 contacted by the operation finger 9, but also when the user watches the display screen 840 displaying the operation target. Therefore, compared with the case where such configuration is not adopted, the configuration can present a more suitable tactile sensation to suppress the discomfort.
When presenting the tactile sensation by adding a sound together with the vibration of the panel 20, the tactile presentation device 1 adjusts the intensity of the sound in addition to the vibration according to the user's awareness. Therefore, compared with the case where such configuration is not adopted, the configuration can suppress the discomfort when a loud sound is output even though the user is aware.
Here, as another embodiment, as shown in
In the embodiment, the main display 84 includes the touch panel 84a. Accordingly, the first region including the operation member and the second region including the display screen form one region. That is, the region determined in advance and including the operation surface 841 becomes the awareness region 72 of
According to the tactile presentation device 1 of at least one of the embodiments described above, it is possible to suppress the discomfort by presenting a suitable tactile sensation.
Depending on the purpose, a portion of the tactile presentation device 1 according to the embodiments and the modified examples may be realized by an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), etc., for example.
Although some embodiments and modifications of the invention have been described above, the embodiments and modifications are merely examples and do not limit the invention according to the claims. The novel embodiments and modifications can be implemented in various other embodiments, and various omissions, replacements, changes, etc., can be made without departing from the gist of the invention. Moreover, not all of the combinations of features described in the embodiments and modifications are essential for the means for solving the problems of the invention. Further, the embodiments and modifications are included in the scope and gist of the invention, and are included in the scope of the invention described in the claims and the equivalent scope thereof.
1: Tactile presentation device; 2: Touch pad; 3: Tactile presentation unit; 4: Awareness determination unit; 5: Storage unit; 6: Control unit; 7: Awareness region; 8: Vehicle; 9: Operation finger; 10: Base; 12: Reference position; 20: Panel; 21: Detection unit; 30: Voice coil motor; 31: Sound output unit; 40: Camera; 41: Determination unit; 42: Illumination unit; 43: Image; 50: Electrostatic threshold value; 51: Driving signal information; 71 to 74: Awareness region; 80: Floor console; 81: Center console; 82: Instrument panel; 83: Steering; 84: Main display; 84a: Touch panel; 84b: Display unit; 85: Sub-display; 86: Steering switch; 87: Steering column; 90: Eyeball; 91: Pupil; 92: Purkinje image; 93: Sight line; 94: Intersection point; 200: Operation surface; 201: Back surface; 840: Display screen; 841: Operation surface; 842: Display screen; 850: Display screen.
Number | Date | Country | Kind
---|---|---|---
2019-234088 | Dec 2019 | JP | national
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2020/048165 | 12/23/2020 | WO |