The present disclosure relates to an electronic device, and a method of driving the electronic device.
Conventionally, there has been an information processing apparatus that includes a touch panel to generate positional information within an operation area depending on a position touched with a touch operation in the operation area; a sound generation means to generate sound; and a control means configured to cause the sound generation means to generate a predetermined sound depending on the positional information input from the touch panel.
The control means includes a division processing means configured, when first positional information is input from the touch panel, to divide the operation area into multiple areas by multiple lines connecting a position in the operation area obtained depending on the first positional information and edges of the operation area; and a correspondence processing means configured to associate each of the areas with an executable process and a sound corresponding to the process.
The control means further includes a sound generation processing means configured, when second positional information different from the first positional information is input from the touch panel, to cause the sound generating means to generate the sound corresponding to an area among the areas that includes the position obtained depending on the second positional information (see, for example, Patent Document 1).
[Patent Document 1] Japanese Laid-Open Patent Publication No. 2012-123689
Meanwhile, such a conventional information processing apparatus vibrates the entire touch panel, and as a vibration element, uses an eccentric motor or a voice coil motor.
Because of this, it cannot vibrate only one of the multiple areas into which the operation area is partitioned, and when the user touches the touch panel with both hands or with multiple fingers, the user cannot distinguish the areas by vibration. For example, it is difficult for a visually impaired user to distinguish an area based on vibration, and it is difficult in general to distinguish an area based on vibration alone, without visual observation.
According to an embodiment of the present invention, an electronic device includes a top panel including an operation surface; a position detector configured to detect a position of an input operation performed on the operation surface; a first vibration element placed in a first area among a plurality of areas partitioning the operation surface, and configured to generate a natural vibration in an ultrasonic range selectively within the first area, in response to being driven by a first drive signal to generate the natural vibration on the operation surface; a second vibration element placed in a second area among the plurality of areas, and configured to generate the natural vibration in the ultrasonic range selectively within the second area, in response to being driven by a second drive signal to generate the natural vibration on the operation surface; a sound outputter; a memory configured to store first data in which coordinates of the first area, the first vibration element, the first drive signal, and a first voice guidance assigned to the first area are associated with each other, and coordinates of the second area, the second vibration element, the second drive signal, and a second voice guidance assigned to the second area are associated with each other; and a controller configured, in response to a plurality of input operations being detected by the position detector, to drive the first vibration element by the first drive signal based on the first data such that a strength of the natural vibration varies depending on a change rate in time of a position of the input operation, and to cause the sound outputter to output the first voice guidance.
The object and advantages in the embodiment will be realized and attained by means of the elements and combinations particularly pointed out in the claims. It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention as claimed.
In the following, embodiments of an electronic device and a method of driving the electronic device will be described.
It is possible to provide an electronic device with which multiple areas can be distinguished by sound and vibration, and a method of driving the electronic device.
The electronic device 100 is, for example, a smartphone or a tablet computer that includes a touch panel as an input operation part. Since the electronic device 100 simply needs to be a device that includes a touch panel as an input operation part, it may be, for example, a mobile information terminal, or a device that is installed and used in a specific place, such as an ATM (Automatic Teller Machine).
An input operation part 101 of the electronic device 100 has a display panel placed under a touch panel, on which various buttons 102A, sliders 102B, and the like of a GUI (Graphical User Interface) (referred to as a GUI operation part 102, below) are displayed.
Normally, the user of the electronic device 100 touches the input operation part 101 with a fingertip in order to operate the GUI operation part 102. Also, the electronic device 100 includes the speaker 103. The speaker 103 is an example of a sound outputter.
Next, a configuration of the electronic device 100 will be described.
The electronic device 100 includes a housing 110, a top panel 120, a double-sided tape 130, vibration elements 140A1, 140A2, 140A3, a touch panel 150, a display panel 160, and a substrate 170.
The housing 110 is made of, for example, resin, and as illustrated in
The top panel 120 is a thin, plate-shaped member that is rectangular in plan view, and is made of transparent glass or a reinforced plastic such as polycarbonate. The surface of the top panel 120 (the surface on the side in the positive direction of the Z-axis) is an example of an operation surface on which the user of the electronic device 100 performs an input operation.
The top panel 120 has the vibration elements 140A1, 140A2, and 140A3 (referred to as 140A1-140A3, below) adhered on a surface on the side in the negative direction of the Z-axis, and has four sides in plan view adhered to the housing 110 with the double-sided tape 130. Note that the double-sided tape 130 simply needs to be capable of having the four sides of the top panel 120 adhered to the housing 110, and does not need to be rectangular and ring-shaped as illustrated in
The touch panel 150 is placed on the side in the negative direction of the Z-axis of the top panel 120. The top panel 120 is provided in order to protect the surface of the touch panel 150. Note that another panel, a protective film, or the like may be provided on the surface of the top panel 120.
In a state where the vibration elements 140A1-140A3 are adhered to the surface on the side in the negative direction of the Z-axis, the top panel 120 vibrates when the vibration elements 140A1-140A3 are driven. In the embodiment, the top panel 120 is vibrated at the natural vibration frequency of the top panel 120, to generate a standing wave on the top panel 120. However, in practice, since the vibration elements 140A1-140A3 are adhered to the top panel 120, it is desirable to determine the natural vibration frequency taking the weight of the vibration elements 140A1-140A3 and the like into consideration.
On the surface of the top panel 120 on the side in the negative direction of the Z-axis, the vibration elements 140A1-140A3 are adhered along the short side that extends in the Y-axis direction on the side in the negative direction of the X-axis. The vibration elements 140A1-140A3 simply need to be elements that can generate vibration in the ultrasonic range; for example, devices including piezoelectric elements may be used.
Here, any one of the vibration elements 140A1-140A3 is an example of a first vibration element, and any one of the other vibration elements 140A1-140A3 is an example of a second vibration element.
The vibration elements 140A1-140A3 are driven by drive signals output from a drive controller, which will be described later. The amplitude (strength) and the frequency of a vibration generated by the vibration elements 140A1-140A3 are set by the drive signals. Also, the drive signals control turning the vibration elements 140A1-140A3 on and off. The vibration elements 140A1-140A3 are turned on and off independently of each other.
Note that the ultrasonic range here means a frequency band of, for example, approximately 20 kHz or higher. In the electronic device 100 in the embodiment, since the frequency at which the vibration elements 140A1-140A3 vibrate is equal to the vibration frequency of the top panel 120, the vibration elements 140A1-140A3 are driven by the drive signals so as to vibrate at the natural frequency of the top panel 120. This is the same in the case of driving all of the vibration elements 140A1-140A3, in the case of driving any two of them, and in the case of driving any one of them.
The touch panel 150 is placed above the display panel 160 (on the side in the positive direction of the Z-axis) and under the top panel 120 (on the side in the negative direction of the Z-axis). The touch panel 150 is an example of a position detector to detect a position at which the user of the electronic device 100 touches the top panel 120 (referred to as the position of an input operation, below).
Various buttons and the like of the GUI (referred to as GUI operation parts, below) are displayed on the display panel 160 under the touch panel 150. Therefore, the user of the electronic device 100 normally touches the top panel 120 with a fingertip, in order to operate a GUI operation part.
The touch panel 150 simply needs to be a position detector that can detect the position of an input operation performed by the user on the top panel 120, and may be a position detector of, for example, an electrostatic capacitance type or a resistance film type. Here, the embodiment will be described with the touch panel 150 being a position detector of an electrostatic capacitance type. Even if a space lies between the touch panel 150 and the top panel 120, the electrostatic-capacitance-type touch panel 150 can detect an input operation on the top panel 120.
Also, although the embodiment will be described here in which the top panel 120 is placed on the input surface side of the touch panel 150, the top panel 120 may be formed integrally with the touch panel 150. In this case, the surface of the touch panel 150 corresponds to the surface of the top panel 120 illustrated in
Also, in the case of the touch panel 150 being an electrostatic capacitance type, the touch panel 150 may be placed above the top panel 120. Also in this case, the surface of the touch panel 150 constitutes the operation surface. Also, in the case of the touch panel 150 being an electrostatic capacitance type, a configuration is possible in which the top panel 120 illustrated in
The display panel 160 simply needs to be a display part that can display an image, such as a liquid crystal display panel or an organic EL (Electroluminescence) panel. The display panel 160 is installed inside the depressed portion 110A of the housing 110, on the substrate 170 (on the side in the positive direction of the Z-axis), with a holder or the like (not illustrated).
The display panel 160 is driven and controlled by a driver IC (Integrated Circuit), which will be described later, to display GUI operation parts, images, characters, marks, figures, and the like depending on an operational state of the electronic device 100.
The substrate 170 is placed inside of the depressed portion 110A of the housing 110. On the substrate 170, the display panel 160 and the touch panel 150 are placed. The display panel 160 and the touch panel 150 are fixed to the substrate 170 and the housing 110 with a holder and the like (not illustrated).
On the substrate 170, a drive controller, which will be described later, and in addition various circuits and the like that are necessary to drive the electronic device 100 are mounted.
When the user touches the top panel 120 with a finger, and a movement of the fingertip is detected, the electronic device 100 configured as above causes the drive controller mounted on the substrate 170 to drive at least one of the vibration elements 140A1-140A3, so as to vibrate the top panel 120 at a frequency in the ultrasonic range. This frequency in the ultrasonic range is a resonance frequency of a resonance system including the top panel 120 and the vibration elements 140A1-140A3, and generates a standing wave on the top panel 120.
The electronic device 100 provides the user with a tactile sensation through the top panel 120 by generating the standing wave in the ultrasonic range.
Next, a standing wave generated on the top panel 120 will be described.
By using the Young's modulus E, the density ρ, the Poisson ratio δ, the long-side dimension, and the thickness t of the top panel 120, and the number of cycles k of the standing wave that exist in the direction of the long side, the natural frequency (resonance frequency) f of the top panel 120 is represented by the following Expressions (1) and (2). Since the same waveform appears in a standing wave in units of one-half cycle, the number of cycles k takes a value in units of 0.5, which may be 0.5, 1, 1.5, 2, and so on.
Note that the coefficient α in Expression (2) collectively represents the coefficients other than k² in Expression (1).
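The relationship of Expression (2), f = αk², can be sketched numerically as follows. This is an illustrative sketch, not code from the embodiment: the value of alpha below is a hypothetical calibration constant chosen so that k = 10 yields roughly 33.5 kHz, the example natural frequency used later in this description.

```python
def resonance_frequency(alpha, k):
    """Expression (2): f = alpha * k**2, where alpha folds together the
    Young's modulus, density, Poisson ratio, long-side dimension, and
    thickness of the top panel 120."""
    if (2 * k) != int(2 * k) or k <= 0:
        raise ValueError("k must be a positive multiple of 0.5")
    return alpha * k * k

# Hypothetical alpha (Hz per squared cycle count); assumed value only.
alpha = 335.0
for k in (0.5, 1.0, 1.5, 10.0):
    print(k, resonance_frequency(alpha, k))
```

Because the standing wave repeats in half-cycle units, k is validated in steps of 0.5 before evaluating the expression.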
Although the top panel 120 is a plate-shaped member, when the vibration element 140 is driven to generate the natural vibration in the ultrasonic range, the top panel 120 is bent to generate a standing wave on the surface as illustrated in
Note that although in the embodiment described here, a vibration element 140 is adhered along the short side that extends in the X-axis direction on the side in the positive direction of the Y-axis on the surface of the top panel 120 on the side in the negative direction of the Z-axis, two vibration elements 140 may be used. In the case of using two vibration elements 140, the other vibration element 140 may be adhered along the short side that extends in the X-axis direction on the side in the negative direction of the Y-axis on the surface of the top panel 120 on the side in the negative direction of the Z-axis. In this case, the two vibration elements 140 may be placed to be axially symmetric with respect to the central line parallel to the two short sides of the top panel 120 as the axis of symmetry.
Also, in the case of driving the two vibration elements 140, the elements are driven in phase if the number of cycles k is an integer, or driven in reverse phase if the number of cycles k is a half-integer (a number including an integer part and a fractional part of 0.5).
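The phase rule above can be sketched as a small helper; this is an illustrative function, not code from the embodiment:

```python
import math

def drive_phase_offset(k):
    """Phase offset between the two vibration elements 140 for a
    standing wave with k cycles: in phase (0) when k is an integer,
    reverse phase (pi) when k includes a 0.5 fractional part."""
    frac = k - int(k)
    if frac == 0.0:
        return 0.0          # drive the two elements in phase
    if frac == 0.5:
        return math.pi      # drive the two elements in reverse phase
    raise ValueError("k must be a multiple of 0.5")

print(drive_phase_offset(10))    # 0.0
print(drive_phase_offset(9.5))   # 3.141592653589793
```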
Next, operational patterns of the electronic device 100 and the resulting change of the dynamic frictional force will be described.
The natural vibration in the ultrasonic range is generated on the entire top panel 120.
Here, when the top panel 120 is caused to generate the natural vibration in the ultrasonic range, the squeeze effect generates an air layer between the surface of the top panel 120 and a finger, and the dynamic friction coefficient decreases when the surface of the top panel 120 is traced with the finger.
Therefore, while the natural vibration is generated, the dynamic frictional force acting on the fingertip moving along the surface of the top panel 120 decreases; while the natural vibration is stopped, the dynamic frictional force acting on the fingertip increases.
Note that although the change of the dynamic frictional force in the case of switching a vibration on and off has been described here, the same applies to the case where the amplitude (strength) of the vibration generated by the vibration elements 140A1-140A3 is changed.
Next, a hardware configuration of the electronic device 100 will be described.
The electronic device 100 includes the vibration elements 140A1-140A3, amplifiers 141A1, 141A2, and 141A3, the touch panel 150, a driver IC (Integrated Circuit) 151, the display panel 160, a driver IC 161, an amplifier 181, the speaker 103, a control unit 200, a sinusoidal wave generator 310, and an amplitude modulator 320.
The control unit 200 includes an application processor 220, a communication processor 230, a controller 240, and a memory 250. The control unit 200 is implemented with, for example, an IC chip. The controller 240 has a drive controller 240A built in.
Also, the drive controller 240A, the sinusoidal wave generator 310, and the amplitude modulator 320 constitute a drive control unit 300.
The amplifiers 141A1, 141A2, and 141A3 (referred to as 141A1-141A3, below) are placed between the drive control unit 300 and the vibration elements 140A1-140A3, respectively, to amplify drive signals output from the drive control unit 300 so as to drive the vibration elements 140A1-140A3, respectively.
The driver IC 151 is connected to the touch panel 150, detects positional data representing a position at which an input operation is performed on the touch panel 150, and outputs the positional data to the control unit 200. Consequently, the positional data is input into the application processor 220 and the drive controller 240A. Note that inputting positional data into the drive controller 240A is equivalent to inputting the positional data into the drive control unit 300.
The driver IC 161 is connected to the display panel 160, to input graphical data output from the drive control unit 300 into the display panel 160 so as to display images based on the graphical data on the display panel 160. Thus, GUI operation parts, images, and the like based on the graphical data are displayed on the display panel 160.
The amplifier 181 is connected to the application processor 220, amplifies an audio signal input from the application processor 220, and outputs the signal to the speaker 103. The speaker 103 outputs an audio signal input from the amplifier 181 as a voice.
Depending on an input operation performed on the top panel 120, the application processor 220 reads voice data stored in the memory 250, and outputs the data to the amplifier 181. Consequently, a voice depending on the input operation performed on the top panel 120 is output from the speaker 103.
The application processor 220 executes various applications of the electronic device 100.
The controller 240 of the control unit 200 executes drive control of the vibration elements 140A1-140A3 and voice guidance control. The controller 240 includes the drive controller 240A. Among the control processes executed by the controller 240, the drive control of the vibration elements 140A1-140A3 is executed by the drive controller 240A, and the voice guidance control is executed by the parts of the controller 240 other than the drive controller 240A. The controller 240 is an example of a first controller and a second controller.
The voice guidance control by the controller 240, and the drive control of the vibration elements 140A1-140A3 executed along with the voice guidance control, will be described later.
The communication processor 230 executes processing necessary for the electronic device 100 to perform communication compliant with 3G (3rd Generation), 4G (4th Generation), LTE (Long Term Evolution), Wi-Fi, or the like.
The drive controller 240A outputs amplitude data to the amplitude modulator 320 in the case where two predetermined conditions are satisfied. The amplitude data is data representing an amplitude value for adjusting the strength of a drive signal used for driving the vibration elements 140A1-140A3. The amplitude value is set depending on a change rate in time of the positional data. Here, as the change rate in time of the positional data, the speed of a fingertip of the user moving along the surface of the top panel 120 is used. The moving speed of a fingertip of the user is calculated by the drive controller 240A based on a change rate in time of the positional data input from the driver IC 151.
In order to generate a constant tactile sensation perceived by the user's fingertip irrespective of the moving speed of the fingertip, the drive control unit 300 in the embodiment, for example, sets the amplitude value smaller when the moving speed is higher, and sets the amplitude value greater when the moving speed is lower.
Data representing such a relationship between the amplitude data representing the amplitude value and the moving speed is stored in the memory 250.
Note that although a form will be described here in which the amplitude value is set depending on the moving speed by using the data stored in the memory 250, the amplitude value A may be calculated by using the following Expression (3). The amplitude value A calculated by Expression (3) is smaller when the moving speed is higher, and is greater when the moving speed is lower.
A = A0/√(|V|/a)  (3)
where A0 is a reference value of the amplitude; V is the moving speed of a fingertip; and a is a predetermined constant. In the case of calculating the amplitude value A by using Expression (3), the memory 250 may store data representing Expression (3), and data representing the reference value A0 of the amplitude and the predetermined constant a.
Also, when a fingertip of the user moves along the surface of the top panel 120, the drive control unit 300 in the embodiment vibrates the top panel 120 in order to change the dynamic frictional force acting on the fingertip. Since the dynamic frictional force is generated while a fingertip is moving, the drive controller 240A causes the vibration elements 140A1-140A3 to vibrate when the moving speed becomes greater than or equal to a predetermined threshold speed. The moving speed becoming greater than or equal to the predetermined threshold speed is the first predetermined condition.
Therefore, the amplitude value represented by the amplitude data output by the drive controller 240A is zero when the moving speed is less than the predetermined threshold speed, and once the moving speed has become greater than or equal to the predetermined threshold speed, is set to the predetermined amplitude value depending on the moving speed. When the moving speed is greater than or equal to the predetermined threshold speed, the amplitude value is set smaller when the moving speed is higher, and is set greater when the moving speed is lower.
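Combining Expression (3) with the threshold condition, the amplitude value output by the drive controller 240A can be sketched as below. The numeric values of A0, a, and the threshold speed are assumed for illustration only; they are not taken from the embodiment.

```python
import math

def amplitude_value(v, a0=1.0, a=100.0, v_threshold=10.0):
    """Amplitude value for the drive signal.

    Zero while the moving speed |v| is below the threshold; once the
    threshold is reached, Expression (3), A = A0 / sqrt(|V| / a),
    makes the amplitude smaller at higher speeds and greater at
    lower speeds."""
    speed = abs(v)
    if speed < v_threshold:
        return 0.0
    return a0 / math.sqrt(speed / a)

print(amplitude_value(5.0))     # 0.0 (below threshold)
print(amplitude_value(100.0))   # 1.0
print(amplitude_value(400.0))   # 0.5
```

In practice the values could equally be read from a table in the memory 250, as the description notes; the formula and the table are interchangeable here.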
Also, the drive control unit 300 in the embodiment outputs the amplitude data to the amplitude modulator 320 in the case where the position of a fingertip performing an input operation is located in a predetermined area where a vibration is to be generated. The position of a fingertip performing an input operation being located in a predetermined area where a vibration is to be generated is the second predetermined condition.
Whether the position of a fingertip performing an input operation is located in the predetermined area where a vibration is to be generated is determined by using area data that represents the predetermined area.
Here, positions on the display panel 160 of GUI operation parts, areas to display images, and an area to display a whole page, which are displayed on the display panel 160, are identified by area data representing these areas. The area data exists for all the GUI operation parts, the areas to display images, and the area to display a whole page, which are displayed on the display panel 160, for all applications.
Because of this, determining as the second predetermined condition whether the position of a fingertip performing an input operation is located in a predetermined area where a vibration is to be generated, relates to the type of an application being activated in the electronic device 100. This is because a display on the display panel 160 varies depending on the type of the application.
This is also because depending on the type of the application, types of input operations that involve a fingertip touching and moving on the surface of the top panel 120 vary. Types of input operations that involve a fingertip touching and moving on the surface of the top panel 120 include, for example, what is called a “flick operation” in the case of operating a GUI operation part. A flick operation is an operation to move a fingertip by a comparatively short distance along the surface of the top panel 120 as if to bounce the fingertip on the surface.
Also, in the case of turning over a page, for example, a swipe operation is performed. A swipe operation is an operation to move a fingertip by a comparatively long distance along the surface of the top panel 120 as if to sweep the surface by the fingertip. In addition to the case of turning a page, a swipe operation is also performed, for example, in the case of turning over a photograph. Also, in the case of sliding a slider (see the slider 102B in
Input operations that involve a fingertip touching and moving on the surface of the top panel 120, such as a flick operation, a swipe operation, and a drag operation cited here as examples, are selectively used by the type of a display provided by an application. Because of this, determining whether the position of a fingertip performing an input operation is located in a predetermined area where a vibration is to be generated, relates to the type of an application being activated in the electronic device 100.
The drive controller 240A determines whether a position represented by positional data input from the driver IC 151 is located inside of a predetermined area where a vibration is to be generated by using the area data.
Second data, in which data representing a type of an application, area data representing a GUI operation part or the like on which an input operation is performed, and pattern data representing a vibration pattern are associated with each other, is stored in the memory 250.
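The determination can be sketched with the second data held as a simple lookup keyed by application type. The application name, rectangles, and vibration-pattern labels below are hypothetical placeholders, not values from the embodiment:

```python
# Second data: application type -> list of (area rectangle, vibration pattern).
# Rectangles are (x_min, y_min, x_max, y_max) in touch-panel coordinates.
SECOND_DATA = {
    "app_viewer": [((0, 0, 100, 50), "pattern_flick"),
                   ((0, 60, 100, 120), "pattern_swipe")],
}

def vibration_pattern_at(app, x, y):
    """Return the vibration pattern for the area containing (x, y),
    or None when the position is outside every registered area."""
    for (x0, y0, x1, y1), pattern in SECOND_DATA.get(app, []):
        if x0 <= x <= x1 and y0 <= y <= y1:
            return pattern
    return None

print(vibration_pattern_at("app_viewer", 50, 25))   # pattern_flick
print(vibration_pattern_at("app_viewer", 50, 55))   # None
```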
Also, in order to interpolate a change of the position of a fingertip during a time period required for calculating a drive signal based on positional data after the positional data has been input into the drive control unit 300 from the driver IC 151, the drive controller 240A executes the next process.
The drive control unit 300 executes calculation in each predetermined control cycle. This is the same for the drive controller 240A. Because of this, when Δt represents the time period required for calculating a drive signal based on positional data after the positional data has been input into the drive control unit 300 from the driver IC 151, the required time Δt is equivalent to the control cycle.
Here, the moving speed of a fingertip can be obtained as the speed of a vector that has its starting point at the point (x1, y1) represented by positional data input into the drive control unit 300 from the driver IC 151, and its ending point at the position (x2, y2) of the fingertip after the required time Δt elapses.
The drive controller 240A obtains a vector that has its starting point at the point (x2, y2) represented by the positional data input into the drive control unit 300 from the driver IC 151, and is equal to the vector from the point (x1, y1) to the point (x2, y2), to estimate the coordinates (x3, y3) of the fingertip after the required time Δt elapses.
The electronic device 100 in the embodiment estimates coordinates after the required time Δt elapses as described above, to interpolate the change of the position of a fingertip during the required time Δt.
Such calculation to estimate coordinates after the required time Δt elapses is performed by the drive controller 240A. The drive controller 240A determines whether estimated coordinates are located inside of a predetermined area where a vibration is to be generated, to generate a vibration in the case where the coordinates are located inside of the predetermined area where the vibration is to be generated. Therefore, the second predetermined condition is that estimated coordinates are located inside of a predetermined area where a vibration is to be generated.
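The estimation described above amounts to a linear extrapolation over one control cycle Δt; a minimal sketch, with illustrative variable names, is:

```python
def estimate_next_position(p1, p2, dt):
    """Given positions p1 = (x1, y1) and p2 = (x2, y2) sampled one
    control cycle dt apart, estimate (x3, y3) after another dt elapses
    by extending the vector from p1 to p2, and also return the moving
    speed used when setting the amplitude value."""
    (x1, y1), (x2, y2) = p1, p2
    dx, dy = x2 - x1, y2 - y1
    speed = (dx * dx + dy * dy) ** 0.5 / dt
    return (x2 + dx, y2 + dy), speed

est, speed = estimate_next_position((0.0, 0.0), (3.0, 4.0), 0.01)
print(est)    # (6.0, 8.0)
print(speed)  # ~500.0
```

The estimated coordinates, rather than the last reported ones, are then tested against the predetermined area where a vibration is to be generated.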
As described above, the two predetermined conditions necessary for the drive controller 240A to output amplitude data to the amplitude modulator 320 are that the moving speed of the fingertip is greater than or equal to the predetermined threshold speed, and that the estimated coordinates are located inside of a predetermined area where a vibration is to be generated.
In the case where the moving speed of the fingertip is greater than or equal to the predetermined threshold speed, and the estimated coordinates are located inside of a predetermined area where a vibration is to be generated, the drive controller 240A reads amplitude data representing an amplitude value in accordance with the moving speed from the memory 250, and outputs the data to the amplitude modulator 320.
The memory 250 stores data that represents the relationship between the amplitude data representing the amplitude value and the moving speed, and the second data in which data representing a type of an application, area data representing a GUI operation part or the like on which an input operation is performed, and pattern data representing a vibration pattern are associated with each other.
The memory 250 also stores data that is necessary to perform the drive control of the vibration elements 140A1-140A3 and the voice guidance. This data will be described later.
The memory 250 also stores data and programs necessary for the application processor 220 to execute applications, and data and programs necessary for the communication processor 230 to execute communication-processing.
The sinusoidal wave generator 310 generates a sinusoidal wave necessary to generate a drive signal for vibrating the top panel 120 at the natural frequency. For example, in the case of vibrating the top panel 120 at the natural frequency f of 33.5 kHz, the frequency of the sinusoidal wave is set to 33.5 kHz. The sinusoidal wave generator 310 inputs the sinusoidal wave signal in the ultrasonic range into the amplitude modulator 320.
Note that although a form of using the sinusoidal wave generator 310 will be described here, instead of the sinusoidal wave generator 310, a clock generator to generate a clock may be used here. For example, the slew rate of a clock generated by the clock generator may be set small to make the waveform of rising and falling edges of the clock less sharp. A clock having the slew rate set to a small value in this way may be used instead of a sinusoidal wave generated by the sinusoidal wave generator 310. In other words, instead of a sinusoidal wave, a waveform signal whose amplitude changes periodically may be used.
The amplitude modulator 320 modulates the amplitude of a sinusoidal wave signal input from the sinusoidal wave generator 310 by using amplitude data input from the drive controller 240A, to generate a drive signal. The amplitude modulator 320 modulates only the amplitude of the sinusoidal wave signal in the ultrasonic range input from the sinusoidal wave generator 310, without modulating the frequency and the phase, to generate the drive signal.
Therefore, the drive signal output by the amplitude modulator 320 is a sinusoidal wave signal in the ultrasonic range, in which only the amplitude of the sinusoidal wave signal in the ultrasonic range input from the sinusoidal wave generator 310 is modulated. Note that in the case of the amplitude data being zero, the amplitude of the drive signal becomes zero. This is equivalent to not outputting a drive signal from the amplitude modulator 320.
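The chain from the sinusoidal wave generator 310 to the amplitude modulator 320 can be sketched as below, sampling a 33.5 kHz sinusoid and scaling it by the amplitude data. The sampling rate is an assumed value for illustration only:

```python
import math

F_NATURAL = 33_500.0       # natural frequency of the top panel (Hz)
SAMPLE_RATE = 1_000_000.0  # assumed output sampling rate (Hz)

def drive_signal(amplitude_data, n_samples):
    """Amplitude-modulate the ultrasonic sinusoid: only the amplitude
    is changed; the frequency and the phase are left untouched.
    amplitude_data gives one amplitude value per output sample; an
    amplitude of zero is equivalent to outputting no drive signal."""
    return [a * math.sin(2.0 * math.pi * F_NATURAL * i / SAMPLE_RATE)
            for i, a in zip(range(n_samples), amplitude_data)]

samples = drive_signal([1.0] * 4, 4)
print(samples[0])  # 0.0 (the sinusoid starts at zero phase)
```

Passing all-zero amplitude data yields an all-zero signal, matching the note above that zero amplitude is equivalent to outputting no drive signal.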
Next, standing waves that are generated when vibration elements 140B1, 140B2, and 140B3 are individually driven will be described.
When the vibration element 140B1 is driven, the amplitude of the standing wave becomes greater in an area on the side in the negative direction of the X-axis with respect to the short side of the top panel 120, over the span in the long-side direction. At this time, the amplitude of the standing wave is extremely small in areas over the span in the long-side direction on the central part and on the side in the positive direction of the X-axis with respect to the short side of the top panel 120.
Because of this, when a fingertip is moved while touching the top panel 120 in an area over the span in the long-side direction on the side in the negative direction of the X-axis with respect to the short side of the top panel 120, a reduction effect of the dynamic frictional force is obtained sufficiently by the squeeze effect; whereas when a fingertip is moved while touching the top panel 120 in an area over the span in the long-side direction on the central part and on the side in the positive direction of the X-axis with respect to the short side of the top panel 120, virtually no reduction effect of the dynamic frictional force is obtained by the squeeze effect.
Also, when the vibration element 140B2 is driven, an area is obtained on the central part with respect to the short side of the top panel 120 over the span in the long-side direction where the amplitude of the standing wave is greater. At this time, the amplitude of the standing wave is extremely small in an area over the span in the long-side direction on the side in the negative direction of the X-axis and on the side in the positive direction of the X-axis with respect to the short side of the top panel 120.
Because of this, when a fingertip is moved while touching the top panel 120 in an area over the span in the long-side direction on the central part with respect to the short side of the top panel 120, a reduction effect of the dynamic frictional force is obtained sufficiently by the squeeze effect; whereas when a fingertip is moved while touching the top panel 120 in an area over the span in the long-side direction on the side in the negative direction and on the side in the positive direction of the X-axis with respect to the short side of the top panel 120, virtually no reduction effect of the dynamic frictional force is obtained by the squeeze effect.
When the vibration element 140B3 is driven, an area is obtained on the side in the positive direction of the X-axis with respect to the short side of the top panel 120 over the span in the long-side direction where the amplitude of the standing wave is greater. At this time, the amplitude of the standing wave is extremely small in an area over the span in the long-side direction on the central part and on the side in the negative direction of the X-axis with respect to the short side of the top panel 120.
Because of this, when a fingertip is moved while touching the top panel 120 in an area over the span in the long-side direction on the side in the positive direction of the X-axis with respect to the short side of the top panel 120, a reduction effect of the dynamic frictional force is obtained sufficiently by the squeeze effect; whereas when a fingertip is moved while touching the top panel 120 in an area over the span in the long-side direction on the central part and on the side in the negative direction of the X-axis with respect to the short side of the top panel 120, virtually no reduction effect of the dynamic frictional force is obtained by the squeeze effect.
As such, by selecting one of the vibration elements 140B1, 140B2, and 140B3 to be driven, it is possible to select an area in the short-side direction (X-axis direction) of the top panel 120 where a standing wave having a great amplitude is generated. In other words, it is possible to select an area in the short-side direction (X-axis direction) of the top panel 120 where a reduction effect of the dynamic frictional force is obtained by the squeeze effect.
Therefore, as illustrated in
In other words, by driving the vibration elements 140B1, 140B2, and 140B3 one by one, it is possible to obtain three partitioned areas on the top panel 120, on each of which a vibration is generated independently.
As illustrated in
Also, as illustrated in
Also, as illustrated in
As described above, by selecting and driving one of the vibration elements 140A1, 140A2, and 140A3, it is possible to select the area 120A1, 120A2, or 120A3 in the long-side direction (Y-axis direction) of the top panel 120 where a standing wave having a great amplitude is generated.
Therefore, when driving the vibration element 140A1, it is possible to obtain a reduction effect of the dynamic frictional force by the squeeze effect in the area 120A1, and not to sufficiently obtain a reduction effect of the dynamic frictional force by the squeeze effect in the areas 120A2 and 120A3. Because of this, for example, the user touching the entire top panel 120 with both hands can distinguish the area 120A1 from the areas 120A2 and 120A3 by tactile sensations.
Similarly, when driving the vibration element 140A2, it is possible to distinguish the area 120A2 from the areas 120A1 and 120A3 by tactile sensations; and when driving the vibration element 140A3, it is possible to distinguish the area 120A3 from the areas 120A1 and 120A2 by tactile sensations.
Also, when driving all of the vibration elements 140A1, 140A2, and 140A3 simultaneously, it is possible to generate a standing wave having a great amplitude on the areas 120A1, 120A2, and 120A3.
Here, for example, assume that the electronic device 100 drives each of the vibration elements 140A1-140A3 for two seconds in turn. Also, for example, the frequency of the drive signal is set to 35 kHz. Also, in (A) to (C) of
First, at time t=0 s, the electronic device 100 drives the vibration element 140A1. A drive pattern to drive the vibration element 140A1 is a pattern to periodically increase and decrease the amplitude of a sinusoidal wave at 35 kHz (in the ultrasonic range) between A2 and A1 while time elapses. Compared with a smaller amplitude, a greater amplitude reduces the dynamic frictional force further by the squeeze effect.
Because of this, increasing and decreasing the amplitude between A2 and A1 while time elapses enables the user to perceive, with a fingertip or a palm, a tactile sensation of smoothness changing in time.
Also, at time t=2 s, the electronic device 100 stops the vibration element 140A1, and drives the vibration element 140A2. A drive pattern to drive the vibration element 140A2 is a pattern to periodically increase and decrease the amplitude of a sinusoidal wave at 35 kHz (in the ultrasonic range) between A12 and A11 while time elapses. The amplitudes A12 and A11 are greater than the amplitudes A2 and A1.
Also, at time t=4 s, the electronic device 100 stops the vibration element 140A2, and drives the vibration element 140A3. A drive pattern to drive the vibration element 140A3 is a pattern to periodically increase and decrease the amplitude of a sinusoidal wave at 35 kHz (in the ultrasonic range) between A22 and A21 while time elapses. The amplitudes A22 and A21 are smaller than the amplitudes A2 and A1.
At time t=6 s, the electronic device 100 stops the vibration element 140A3.
In this way, when the electronic device 100 drives each of the vibration elements 140A1-140A3 for two seconds in turn, if the user moves a fingertip or a palm touching the top panel 120, the user can perceive a tactile sensation of smoothness on each of the areas 120A1, 120A2, and 120A3 in turn. Also, since vibrations having different amplitudes are generated in the respective areas 120A1, 120A2, and 120A3, the user can perceive transitions among the areas 120A1, 120A2, and 120A3 by tactile sensations.
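The two-second rotation through the three elements, with a periodically swinging amplitude envelope, can be sketched as below. The numeric amplitude bounds are placeholders: the document's amplitudes A1/A2, A11/A12, and A21/A22 are not given numerically, so only their relative ordering (140A2 greater, 140A3 smaller than 140A1) is preserved, and the envelope frequency is an assumption.

```python
import math

# Hypothetical numeric bounds standing in for (A2, A1), (A12, A11),
# and (A22, A21); only their relative ordering follows the document.
SCHEDULE = [
    ("140A1", 0.0, 2.0, 0.3, 0.6),   # element, start s, end s, min amp, max amp
    ("140A2", 2.0, 4.0, 0.6, 0.9),   # bounds greater than those of 140A1
    ("140A3", 4.0, 6.0, 0.1, 0.3),   # bounds smaller than those of 140A1
]

def drive_state(t, envelope_hz=2.0):
    """Return (element_id, envelope amplitude) at time t, or (None, 0.0)."""
    for element, start, end, lo, hi in SCHEDULE:
        if start <= t < end:
            # The envelope swings periodically between lo and hi over time,
            # varying the strength of the squeeze effect the user feels.
            phase = 0.5 * (1.0 + math.sin(2.0 * math.pi * envelope_hz * t))
            return element, lo + (hi - lo) * phase
    return None, 0.0
```

The returned envelope amplitude would then be fed, as amplitude data, to the amplitude modulator that scales the 35 kHz carrier.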
Next, operations of the electronic device 100 will be described by using
Images 160A1, 160A2, and 160A3 are displayed on the display panel 160. Areas where the images 160A1, 160A2, and 160A3 are displayed are substantially the same as the areas 120A1, 120A2, and 120A3 on the top panel 120 (see
Also, an area obtained by converting one of the three areas where the images 160A1, 160A2, and 160A3 are displayed into coordinates of the touch panel 150 is an example of a first area, and an area obtained by converting another of the three areas where the images 160A1, 160A2, and 160A3 are displayed into coordinates of the touch panel 150 is an example of a second area.
On the upper parts of the images 160A1, 160A2, and 160A3, characters representing classification of “noodles”, “bowls”, and “drinks” are displayed, respectively. The electronic device 100 is an input device that enables the user to order a food or a drink as desired, by performing an input operation on the top panel 120.
The electronic device 100 is also an input device that enables the user to input without visual observation, with vibration of the top panel 120 caused by driving the vibration elements 140A1-140A3 (see
The images 160A1, 160A2, and 160A3 displayed on the display panel 160 illustrated in
The state illustrated in
Here, a multi-touch means that the user touches the top panel 120 with both hands, or touches the top panel 120 with multiple fingers. A multi-touch is detected when multiple input operations are performed on the top panel 120, and multiple pairs of coordinates are detected by the touch panel 150.
In
At this time, since the dynamic frictional force is not reduced in the areas 120A2 and 120A3, the user does not perceive a tactile sensation of smoothness with the right hand.
In such a state, the electronic device 100 outputs a voice guidance of “Smooth part corresponds to noodles” from the speaker 103.
In this way, in a state where the area 120A1 of noodles has been recognized, if the user performs a confirmation operation in the area 120A1, a detailed input mode for inputting a noodle can be activated.
The confirmation operation means, for example, pressing the top panel 120 strongly; performing a confirmation operation enables the electronic device 100 to receive input. Such a function to receive a confirmation operation may be implemented by an OS (Operating System) of the application processor 220 (see
In
At this time, since the dynamic frictional force is not reduced in the areas 120A1 and 120A3, the user does not perceive a tactile sensation of smoothness with the middle finger, the third finger, and the little finger of the left hand, and the index finger, the middle finger, the third finger, and the little finger of the right hand.
In such a state, the electronic device 100 outputs a voice guidance of “Smooth part corresponds to bowls” from the speaker 103.
In this way, in a state where the area 120A2 of bowls has been recognized, if the user performs a confirmation operation in the area 120A2, a detailed input mode for inputting a bowl can be activated.
In
At this time, since the dynamic frictional force is not reduced in the areas 120A1 and 120A2, the user does not perceive a tactile sensation of smoothness with the left hand and the thumb of the right hand.
In such a state, the electronic device 100 outputs a voice guidance of “Smooth part corresponds to drinks” from the speaker 103.
In this way, in a state where the area 120A3 of drinks has been recognized, if the user performs a confirmation operation in the area 120A3, a detailed input mode for inputting a drink can be activated.
In
Assist areas 161B-166B are placed around the buttons 161A-166A, respectively. When an input operation is performed, different drive patterns are used for the buttons 161A-166A and for the assist areas 161B-166B when driving the vibration elements 140A1-140A3. The assist areas 161B-166B are provided in order to help (assist) guidance of a fingertip to the buttons 161A-166A.
Also, above the buttons 161A, 162A, and 163A, areas 167A1, 167A2, and 167A3 are provided, and below the buttons 164A, 165A, and 166A, areas 168A1, 168A2, and 168A3 are provided.
An area obtained by converting the area where the buttons 161A-166A are displayed into coordinates of the touch panel 150 is an example of a third area. An area obtained by converting the area where the assist areas 161B-166B are displayed into coordinates of the touch panel 150 is an example of a fourth area. An area obtained by converting the area where the areas 167A1, 167A2, 167A3, 168A1, 168A2, and 168A3 are displayed into coordinates of the touch panel 150 is an example of a fifth area.
As illustrated in
In such a case, if the user moves the index finger of the right hand upward, the electronic device 100 drives the vibration elements 140A1-140A3, and outputs a voice guidance of “Plain soba is located above” from the speaker 103.
The drive pattern of the vibration elements 140A1-140A3 at this time is, for example, as illustrated in
Above the index finger of the right hand illustrated in
Also, if the user moves the index finger of the right hand leftward (in the negative direction of the Y-axis) from the state illustrated in
The drive pattern of the vibration elements 140A1-140A3 at this time is, for example, as illustrated in
In this way, since the electronic device 100 does not drive the vibration elements 140A1-140A3 when the user moves a fingertip in a direction where the buttons 161A-166A do not exist, the dynamic frictional force acting on the fingertip remains greater (higher friction). This prevents the fingertip of the user from being guided in a direction where the buttons 161A-166A do not exist. Note that the same applies to the case where the user moves the index finger of the right hand rightward (in the positive direction of the Y-axis), or downward from the state illustrated in
Also, although the operation of the electronic device 100 described here assumes that the fingertip of the user is touching the top panel 120 in the area 168A2, an operation of the electronic device 100 is substantially the same when the fingertip of the user is touching the top panel 120 in the area 167A1, 167A2, 167A3, 168A1, or 168A3.
As illustrated in
In such a case, the electronic device 100 drives the vibration elements 140A1-140A3, and outputs a voice guidance of “Plain soba is located right” from the speaker 103.
The drive pattern of the vibration elements 140A1-140A3 at this time is, for example, as illustrated in
At time t1, when the index finger starts moving rightward in the assist area 165B, as illustrated in
At time t2, when the index finger enters the button 165A as illustrated in
Since the amplitude of this drive signal periodically varies in a sinusoidal waveform between A4 and A5, a tactile sensation of roughness is brought to the fingertip of the user. Also, the amplitudes A4 and A5 are smaller than the amplitude A3. Because of this, when the index finger enters the button 165A, the user perceives a change of the tactile sensation being sensed at the fingertip. Also, the user is informed that the current position of the fingertip is located in the button 165A of “plain soba” by the voice guidance.
Because of this, the user can recognize that the current position of the fingertip is located in the button 165A of “plain soba” with the tactile sensation and the voice guidance without visual observation.
When the user moves the index finger further rightward, and enters the assist area 165B at time t3 as illustrated in
This enables the user to recognize that the current position of the fingertip is located in the assist area 165B, and has gone out of the button 165A of “plain soba” without visual observation.
At time t4, when the user stops moving the index finger in the assist area 165B, the drive of the vibration elements 140A1-140A3 is stopped. Note that it is substantially the same in the case where the user separates the index finger from the top panel 120 in the assist area 165B at time t4.
In this way, the electronic device 100 guides a fingertip of the user in a direction where one of the buttons 161A-166A is located by using the areas 167A1, 167A2, 167A3, 168A1, 168A2, and 168A3, a tactile sensation using the squeeze effect, and a voice guidance.
Also, the electronic device 100 guides the fingertip of the user on the inside of the buttons 161A-166A by using the assist areas 161B-166B placed around the buttons 161A-166A, a tactile sensation using the squeeze effect, and a voice guidance.
Because of this, the user can precisely recognize the respective positions of the buttons 161A-166A without visual observation, and can precisely order the menu associated with one of the buttons 161A-166A by performing a confirmation operation.
Data used in a summary guidance mode has a configuration in which coordinates, a vibration element ID (Identifier), a drive pattern, an image ID, and voice data are associated with each other.
Here, f1(X, Y), f2(X, Y), and f3(X, Y) represent the areas 120A1, 120A2, and 120A3, respectively.
The vibration element ID represents the identifier of a vibration element to be driven when generating a vibration in one of the areas 120A1, 120A2, and 120A3. Here, the codes of the vibration elements are used as the identifiers. The areas 120A1, 120A2, and 120A3 are assigned to the vibration elements 140A1, 140A2, and 140A3, respectively.
The drive pattern is data in which amplitudes of a drive signal to drive the vibration element 140A1, 140A2, or 140A3 are arranged in a time series, and drive patterns P1, P2, and P3 are assigned to the vibration elements 140A1, 140A2, and 140A3, respectively.
The image ID represents an identifier of an image displayed on the position that overlaps the area 120A1, 120A2, or 120A3 of the top panel 120. Here, the codes of the images are used as the identifiers. The images 160A1, 160A2, and 160A3 are assigned to the areas 120A1, 120A2, and 120A3, respectively.
The voice data is voice data to be output from the speaker 103 when generating a vibration in the area 120A1, 120A2, or 120A3, and “These are noodles”, “These are bowls”, and “These are drinks” are assigned to the areas 120A1, 120A2, and 120A3, respectively.
By using the data as illustrated in
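The summary-guidance data described above can be rendered in code roughly as follows. This is a hypothetical encoding: the region predicates stand in for f1(X, Y), f2(X, Y), and f3(X, Y), whose actual boundary values are not given in the document, and the coordinate thresholds are invented for illustration.

```python
# Hypothetical in-code rendering of the summary-guidance table; the
# region predicates partition the touch panel along the long side
# (Y-axis), matching the areas 120A1, 120A2, and 120A3.
SUMMARY_GUIDANCE = [
    # (region predicate, vibration element ID, drive pattern, image ID, voice)
    (lambda x, y: y < 100,        "140A1", "P1", "160A1", "These are noodles"),
    (lambda x, y: 100 <= y < 200, "140A2", "P2", "160A2", "These are bowls"),
    (lambda x, y: y >= 200,       "140A3", "P3", "160A3", "These are drinks"),
]

def lookup_summary(x, y):
    """Resolve touch coordinates to the record of the containing area."""
    for region, element, pattern, image, voice in SUMMARY_GUIDANCE:
        if region(x, y):
            return {"element": element, "pattern": pattern,
                    "image": image, "voice": voice}
    return None
```

A single lookup then supplies everything needed for one area: which element to drive, with which pattern, which image overlaps it, and which voice data to output.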
Data used in a detailed guidance mode includes coordinates, a vibration element ID, a moving direction, a drive pattern, an image ID, and voice data.
The coordinates f11(X, Y) and f12(X, Y), . . . , f21(X, Y), f22(X, Y), and f23(X, Y) are data that represent areas where the buttons 161A-166A, the assist areas 161B-166B, the areas 167A1-167A3, and the areas 168A1-168A3 are displayed by coordinate values of the coordinate system of the touch panel 150.
The vibration element ID represents an identifier of a vibration element to be driven in the case where the position of a fingertip has been moved in response to an input operation performed in the button 161A, the assist area 161B, the area 167A1, or the like. For example, the button 161A has the vibration elements 140A1-140A3 assigned.
Also, the area 167A1 has the vibration elements 140A1-140A3 assigned only in the case where the moving direction approaches the button 161A from the top, and has no vibration element assigned in the case where the moving direction is rightward or leftward.
This is because, as in the case of the operation illustrated in
Note that a vibration element ID is not assigned to an operation that generates no vibration. For example, the area 167A1 has no vibration element assigned in the case where the moving direction is rightward or leftward.
The moving direction represents a direction in which a fingertip or a hand performing an input operation moves. A moving direction is detected by the application processor 220 based on detected values on the touch panel 150. Here, the moving direction is designated with upward, downward, leftward, or rightward for the sake of description.
Note that if a moving direction turns out to be diagonal when determining whether it is upward, downward, leftward, or rightward, it may be determined as follows. For example, in the case where the moving direction is a lower-right direction, a boundary between the downward direction (in the positive direction of the X-axis) and the rightward direction (in the positive direction of the Y-axis) may be defined with a straight line tilted 45 degrees in the positive direction of the X-axis and tilted 45 degrees in the positive direction of the Y-axis, to determine whether the lower-right direction is downward or rightward. This is the same for a lower-left direction, an upper-right direction, and an upper-left direction. Also, the boundary is not limited to be defined as above; the direction of the boundary may be biased upward, downward, leftward or rightward.
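The 45-degree-boundary rule above amounts to comparing the magnitudes of the two movement components; the larger component decides the direction. The following sketch assumes, per the description, that the positive X direction is downward and the positive Y direction is rightward; the function name is introduced for illustration.

```python
def classify_direction(dx, dy):
    """Split a movement vector into one of four directions.

    The diagonal boundaries are the 45-degree lines, so the component
    with the larger magnitude decides the direction. Positive X is
    downward and positive Y is rightward, per the document's axes.
    A (0, 0) vector is excluded beforehand (the position is not moving).
    """
    if abs(dx) >= abs(dy):
        return "downward" if dx > 0 else "upward"
    return "rightward" if dy > 0 else "leftward"
```

Biasing the boundary, as the last sentence allows, would correspond to weighting one component before the comparison.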
The drive pattern is data in which amplitudes of a drive signal to drive the vibration element 140A1, 140A2, or 140A3 are arranged in a time series, and drive patterns P11, P12, . . . , P21, P22, P23, and so on are assigned to the buttons 161A-166A, the assist areas 161B-166B, and the areas 167A1-167A3 and 168A1-168A3.
The image ID represents an identifier of an image of one of the buttons 161A-166A, the assist areas 161B-166B, and the areas 167A1-167A3 and 168A1-168A3. Here, the codes such as the button 161A are used as the identifiers.
The voice data is voice data to be output from the speaker 103 when the position of an input operation moves within the display area of one of the buttons 161A-166A, the assist areas 161B-166B, and the areas 167A1-167A3 and 168A1-168A3.
For example, the button 161A has voice data of “This is plain udon” assigned. Also, the assist area 161B has voice data of “Plain udon is located right” assigned in the case where the position of an input operation approaches the button 161A from the left, and has voice data of “Plain udon is located left” assigned in the case where the position of an input operation approaches the button 161A from the right. Also, the assist area 161B has voice data of “Plain udon is located above” assigned in the case where the position of an input operation approaches the button 161A from the bottom, and has voice data of “Plain udon is located below” assigned in the case where the position of an input operation approaches the button 161A from the top.
Also, the area 167A1 has voice data of "No button is located right" assigned in the case where the position of an input operation moves rightward, and does not have the vibration elements 140A1-140A3 assigned.
Also, the area 167A1 has voice data of "No button is located left" assigned in the case where the position of an input operation moves leftward, and does not have the vibration elements 140A1-140A3 assigned.
Also, the area 167A1 has voice data of “Plain udon is located below” assigned in the case where the position of an input operation approaches the button 161A from the top, and has the vibration elements 140A1-140A3 assigned.
By using the data as illustrated in
The control unit 200 determines whether an input operation is performed (Step S1). The control unit 200 determines whether an input operation is performed by determining whether the touch panel 150 detects coordinates.
If having determined that an input operation is performed (YES at Step S1), the control unit 200 determines whether the input operation is a multi-touch (Step S2). The control unit 200 determines whether two or more pairs of coordinates have been detected by the touch panel 150, to determine whether it is a multi-touch.
Note that if having determined that the input operation is not a multi-touch (NO at Step S2), the control unit 200 returns the flow to Step S1.
If having determined that the input operation is a multi-touch (YES at Step S2), the control unit 200 displays images of a summary guidance mode (Step S3). The control unit 200 displays the images 160A1, 160A2, and 160A3 illustrated in
The control unit 200 drives the vibration element 140A1 (Step S4). The control unit 200 drives the vibration element 140A1 based on data used in the summary guidance mode (see
The control unit 200 executes a voice guidance (Step S5). The control unit 200 executes the voice guidance based on data used in the summary guidance mode (see
The control unit 200 determines whether a confirmation operation has been performed (Step S6). Since whether a confirmation operation has been performed is detected by the application processor 220, the control unit 200 determines whether a confirmation operation has been detected by the application processor 220, to determine whether a confirmation operation has been performed.
If having determined that no confirmation operation is performed (NO at Step S6), the control unit 200 determines whether two seconds has elapsed since having started driving the vibration element 140A1 (Step S7). This is because the time period to drive the vibration element 140A1 is two seconds.
If having determined that two seconds has not elapsed since having started driving the vibration element 140A1 (NO at Step S7), the control unit 200 returns the flow to Step S1. Thus, Steps S1 to S7 are repeated, and in the case where a confirmation operation is not performed, the vibration element 140A1 is driven for two seconds after having started driving the vibration element 140A1.
If having determined that two seconds has elapsed since having started driving the vibration element 140A1 (YES at Step S7), the control unit 200 determines whether an input operation is performed (Step S8). The control unit 200 determines whether an input operation is performed by determining whether the touch panel 150 detects coordinates.
If having determined that an input operation is performed (YES at Step S8), the control unit 200 determines whether the input operation is a multi-touch (Step S9). The control unit 200 determines whether two or more pairs of coordinates have been detected by the touch panel 150, to determine whether it is a multi-touch.
Note that if having determined that the input operation is not a multi-touch (NO at Step S9), the control unit 200 returns the flow to Step S8.
If having determined that the input operation is a multi-touch (YES at Step S9), the control unit 200 displays images of a summary guidance mode (Step S10). The control unit 200 displays the images 160A1, 160A2, and 160A3 illustrated in
The control unit 200 drives the vibration element 140A2 (Step S11). The control unit 200 drives the vibration element 140A2 based on data used in the summary guidance mode (see
The control unit 200 executes a voice guidance (Step S12). The control unit 200 executes the voice guidance based on data used in the summary guidance mode (see
The control unit 200 determines whether a confirmation operation has been performed (Step S13). Since whether a confirmation operation has been performed is detected by the application processor 220, the control unit 200 determines whether a confirmation operation has been detected by the application processor 220, to determine whether a confirmation operation has been performed.
If having determined that no confirmation operation is performed (NO at Step S13), the control unit 200 determines whether two seconds has elapsed since having started driving the vibration element 140A2 (Step S14). This is because the time period to drive the vibration element 140A2 is two seconds.
If having determined that two seconds has not elapsed since having started driving the vibration element 140A2 (NO at Step S14), the control unit 200 returns the flow to Step S8. Thus, Steps S8 to S14 are repeated, and in the case where a confirmation operation is not performed, the vibration element 140A2 is driven for two seconds after having started driving the vibration element 140A2.
If having determined that two seconds has elapsed since having started driving the vibration element 140A2 (YES at Step S14), the control unit 200 determines whether an input operation is performed (Step S15). The control unit 200 determines whether an input operation is performed by determining whether the touch panel 150 detects coordinates.
If having determined that an input operation is performed (YES at Step S15), the control unit 200 determines whether the input operation is a multi-touch (Step S16). The control unit 200 determines whether two or more pairs of coordinates have been detected by the touch panel 150, to determine whether it is a multi-touch.
Note that if having determined that the input operation is not a multi-touch (NO at Step S16), the control unit 200 returns the flow to Step S15.
If having determined that the input operation is a multi-touch (YES at Step S16), the control unit 200 displays images of the summary guidance mode (Step S17). The control unit 200 displays the images 160A1, 160A2, and 160A3 illustrated in
The control unit 200 drives the vibration element 140A3 (Step S18). The control unit 200 drives the vibration element 140A3 based on data used in the summary guidance mode (see
The control unit 200 executes a voice guidance (Step S19). The control unit 200 executes the voice guidance based on data used in the summary guidance mode (see
The control unit 200 determines whether a confirmation operation has been performed (Step S20). Since whether a confirmation operation has been performed is detected by the application processor 220, the control unit 200 determines whether a confirmation operation has been detected by the application processor 220, to determine whether a confirmation operation has been performed.
If having determined that no confirmation operation is performed (NO at Step S20), the control unit 200 determines whether two seconds has elapsed since having started driving the vibration element 140A3 (Step S21). This is because the time period to drive the vibration element 140A3 is two seconds.
If having determined that two seconds has not elapsed since having started driving the vibration element 140A3 (NO at Step S21), the control unit 200 returns the flow to Step S15. Thus, Steps S15 to S21 are repeated, and in the case where a confirmation operation is not performed, the vibration element 140A3 is driven for two seconds after having started driving the vibration element 140A3.
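Steps S1 to S21 above form one rotation through the three vibration elements. The following sketch replays that flow against a list of sampled (multi-touch, confirmed) states; the sampling interval, function names, and log format are assumptions introduced for illustration, not the device's actual control code.

```python
ELEMENTS = ["140A1", "140A2", "140A3"]

def summary_guidance(samples, period_s=2.0, sample_s=0.5):
    """Replay (multi_touch, confirmed) samples through the Steps S1-S22
    flow and return the log of drive actions taken."""
    log = []
    idx, elapsed = 0, 0.0
    for multi_touch, confirmed in samples:
        if not multi_touch:
            continue                     # Steps S1/S2: wait for a multi-touch
        if elapsed == 0.0:
            log.append("drive " + ELEMENTS[idx])   # Steps S4/S11/S18
        if confirmed:                    # Steps S6/S13/S20
            log.append("stop")           # Step S22: to detailed guidance mode
            return log
        elapsed += sample_s
        if elapsed >= period_s:          # Steps S7/S14/S21: two seconds per element
            idx = (idx + 1) % len(ELEMENTS)
            elapsed = 0.0
    return log
```

As long as the multi-touch persists and no confirmation operation is performed, each element is driven for two seconds before the next one starts.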
If having determined that a confirmation operation has been performed (YES at Step S6, S13, or S20), the control unit 200 stops the vibration elements (Step S22), to transition into a detailed guidance mode.
The control unit 200 displays an image of a detailed guidance mode (Step S23). The control unit 200 displays the buttons 161A-166A, the assist areas 161B-166B, and the areas 167A1-167A3 and 168A1-168A3 illustrated in
The control unit 200 determines whether an input operation is performed (Step S24). The control unit 200 determines whether an input operation is performed by determining whether the touch panel 150 detects coordinates. The control unit 200 detects coordinates of the position of an input operation at Step S24.
Note that if having determined that no input operation is performed (NO at Step S24), the control unit 200 returns the flow to Step S23.
If having determined that an input operation is performed (YES at Step S24), the control unit 200 determines whether the position of the input operation is moving (Step S25). This is because an effect that the dynamic frictional force is reduced by the squeeze effect is obtained when a fingertip or a hand touching the top panel 120 is moving.
The control unit 200 may determine whether the position of the input operation is moving by determining whether the coordinates detected by the touch panel 150 are changing.
If having determined that the position of the input operation is moving (YES at Step S25), the control unit 200 determines the moving direction of the position of the input operation (Step S26). This is because, for example, there is a case where the voice guidance differs depending on a direction approaching the button 161A as in the case of the assist area 161B (see
The control unit 200 may determine the moving direction of the position of the input operation by determining a direction in which coordinates detected by the touch panel 150 change.
The control unit 200 drives the vibration element(s) based on the coordinates of the input operation detected at Step S24, the moving direction detected at Step S26, and the data of the detailed guidance mode illustrated in
Note that at Step S27, for example, in the case where the position of the input operation is moving rightward or leftward in the area 167A1, since no vibration element is assigned according to the data illustrated in
The control unit 200 executes a voice guidance based on the coordinates of the input operation detected at Step S24, the moving direction detected at Step S26, and the data of the detailed guidance mode illustrated in
The control unit 200 determines whether a confirmation operation has been performed (Step S29).
If having determined that a confirmation operation has been performed (YES at Step S29), the control unit 200 determines whether the confirmed content is “return” (Step S30). The determination at Step S30 may be made based on, for example, whether the coordinates of the input operation used in the process at Step S27 or S28 are contained in the coordinates f21(X, Y) of the button 166A.
If having determined that the confirmed content is not “return” (NO at Step S30), the control unit 200 confirms an ordered content on which the confirmation operation has been performed (Step S31). For example, in the case where “plain udon” has been ordered, the control unit 200 outputs data representing “plain udon”.
The electronic device 100 completes the series of steps (END).
Note that if having determined at Step S25 that the position of the input operation is not moving (NO at Step S25), the control unit 200 returns the flow to Step S23.
Also, at Step S29, in the case where a confirmation operation is performed when the position of the input operation is not contained in any of the buttons 161A-166A, a voice guidance such as “This is an area outside of the buttons” may be executed to request the user to redo the operation. In this case, the flow may be returned from Step S29 to Step S23.
Also, at Step S24, in the case where it has been repeatedly determined for more than a predetermined number of times that an input operation is not performed, the series of steps may be terminated.
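As a rough illustration only, the flow of Steps S23 through S31 described above can be sketched as follows. The event representation, helper names, and parameters here are hypothetical (the embodiment does not specify them), and the vibration drive and voice guidance of Steps S27 and S28 are indicated only by comments:

```python
# Sketch of the detailed-guidance-mode flow (Steps S23-S31), under assumed
# data structures; this is NOT the actual firmware of the electronic device 100.

def run_detailed_guidance(events, buttons, return_button="return", max_idle=3):
    """Simulate the flow on a list of input events.

    events: list where each entry is either None (no input detected at Step
            S24) or a dict like {"pos": (x, y), "confirm": bool}.
    buttons: dict mapping a button name to its rectangle (x0, y0, x1, y1).
    Returns the confirmed order, or None if the flow ends without one.
    """
    idle = 0
    prev_pos = None
    while events:
        ev = events.pop(0)
        if ev is None:                        # NO at Step S24
            idle += 1
            if idle > max_idle:               # repeated misses end the flow
                return None
            continue
        idle = 0
        pos = ev["pos"]
        moving = prev_pos is not None and pos != prev_pos      # Step S25
        if moving:
            # Step S26: moving direction from the change in coordinates
            dx, dy = pos[0] - prev_pos[0], pos[1] - prev_pos[1]
            direction = ("right" if dx > 0 else "left" if dx < 0
                         else "down" if dy > 0 else "up")
            # Steps S27/S28: drive vibration element(s) and execute the
            # voice guidance based on pos, direction, and the mode data.
        prev_pos = pos
        if ev.get("confirm"):                 # Step S29
            hit = next((name for name, (x0, y0, x1, y1) in buttons.items()
                        if x0 <= pos[0] <= x1 and y0 <= pos[1] <= y1), None)
            if hit == return_button:          # YES at Step S30
                return None
            if hit is not None:               # Step S31: confirm the order
                return hit
            # outside all buttons: request the user to redo (back to S23)
    return None
```

The early-exit on repeated empty detections corresponds to the note that the series of steps may be terminated after a predetermined number of missed input operations.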
As described above, according to the embodiment, when the user performs a multi-touch on the top panel 120, the control unit 200 displays the images 160A1, 160A2, and 160A3 illustrated in
Because of this, without visual observation, the user can grasp that the areas 120A1, 120A2, and 120A3 correspond to “noodles”, “bowls”, and “drinks”, respectively, with a tactile sensation perceived with a fingertip, a palm, or the like, and by the voice guidance. Here, “noodles”, “bowls”, and “drinks” constitute a broad classification.
Also, if one of “noodles”, “bowls”, and “drinks” is selected by the user, the control unit 200 displays a detailed menu as illustrated in
Because of this, without visual observation, the user can be guided to one of the buttons 161A-166A to readily order a favorite menu with a tactile sensation perceived with a fingertip, a palm, or the like, and by the voice guidance.
Therefore, according to the embodiment, it is possible to provide an electronic device 100 with which multiple areas can be distinguished by sound and vibration, and a method of driving the electronic device.
Note that in the embodiment described above, three vibration elements 140A1-140A3 are used to selectively generate a vibration on the three areas 120A1-120A3 of the top panel 120. However, it is sufficient to have two or more vibration elements that can independently generate vibration on two or more areas of the top panel 120. This is because if vibration can be generated independently on at least two areas, it is possible to provide an area where vibration is generated and an area where vibration is not generated in a summary guidance mode.
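The per-area selection just described can be sketched minimally as follows; the area names are taken from the embodiment, but the mapping function itself is hypothetical:

```python
def element_pattern(selected_area, areas=("120A1", "120A2", "120A3")):
    """Summary-guidance drive pattern: only the vibration element under the
    selected area is driven, so the touched area is distinguishable by the
    presence or absence of vibration alone, without visual observation."""
    return {area: (area == selected_area) for area in areas}
```

For example, selecting area 120A1 yields a pattern in which only that area's element is driven while 120A2 and 120A3 stay still.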
Next, modified examples will be described by using
When displaying the images 160A1, 160A2, and 160A3 on the display panel 160, the buttons 160A11, 160A12, 160A13, 160A21, 160A22, 160A23, 160A31, 160A32, and 160A33 may be displayed.
The buttons 160A11, 160A12, and 160A13 are buttons to select “plain udon”, “meat udon”, and “tempura udon”, respectively. These are noodles.
Also, the buttons 160A21, 160A22, and 160A23 are buttons to select “tempura bowl A”, “tempura bowl B”, and “sukiyaki bowl”, respectively. These are bowls.
Also, the buttons 160A31, 160A32, and 160A33 are buttons to select drinks, such as “orange juice”.
Such buttons 160A11-160A13, 160A21-160A23, and 160A31-160A33 may be displayed in the images 160A1, 160A2, and 160A3 so that the buttons 160A11-160A13, 160A21-160A23, and 160A31-160A33 become selectable when switching occurs from the summary guidance mode to the detailed guidance mode.
In
More specifically, assume that the index finger of the user's right hand starts moving in the assist area 161B at time t11, enters the display area of the button 161A at time t12, and enters the assist area 161B at time t13. Furthermore, assume that it enters the assist area 162B from the assist area 161B at time t14, enters the button 162A at time t15, and enters the assist area 162B at time t16.
In such a case, the electronic device 100 drives the vibration elements 140A1-140A3 at time t11, and a voice guidance of “Plain udon is located right” is output from the speaker 103. The drive pattern of the vibration elements 140A1-140A3 at this time is, for example, as illustrated in
The predetermined short time is set to a time that is sufficiently shorter than the time required for the fingertip passing through the assist area 161B at an average moving speed of the fingertip. Note that instead of setting such a predetermined short time, the coordinates of the fingertip may be detected to stop driving the vibration elements 140A1-140A3 before the fingertip enters the display area of the button 162A from the assist area 161B.
The vibration elements 140A1-140A3 are not driven while the index finger of the user's right hand stays in the assist area 161B, as the amplitude of the drive signal is forced to zero. At this time, since the dynamic frictional force acting on the fingertip of the user increases, a tactile sensation as if to touch a protrusion is brought.
When the index finger of the user's right hand enters the display area of the button 161A at time t12, the vibration elements 140A1-140A3 are driven by the drive signal whose amplitude changes periodically between A4 and A5, and a voice guidance of “This is plain udon” is output from the speaker 103. At this time, a tactile sensation of roughness is brought to the fingertip of the user.
When entering the assist area 161B at time t13, the drive of the vibration elements 140A1-140A3 is stopped only for the predetermined short time. Since the dynamic frictional force acting on the fingertip of the user increases at this time, a tactile sensation as if to touch a protrusion is brought. Also, since the fingertip of the user is moving in the direction away from the button 161A, no voice guidance is executed at this time.
Also, once the predetermined short time has elapsed since time t13, the vibration elements 140A1-140A3 are driven by a drive signal of the amplitude A3. At this time, a tactile sensation of smoothness is brought to the fingertip of the user.
When entering the assist area 162B from the assist area 161B at time t14, the vibration elements 140A1-140A3 are kept driven by the drive signal of the amplitude A3, and a voice guidance of “Meat udon is located right” is output from the speaker 103. At this time, a tactile sensation of smoothness is brought to the fingertip of the user.
When the predetermined short time has elapsed since time t14, the vibration elements 140A1-140A3 stop being driven as the amplitude is forced to zero. Since the dynamic frictional force acting on the fingertip of the user increases at this time, a tactile sensation as if to touch a protrusion is brought.
When entering the button 162A at time t15, the vibration elements 140A1-140A3 are driven by the drive signal whose amplitude changes periodically between A5 and A6, and a voice guidance of “This is meat udon” is output from the speaker 103. At this time, a tactile sensation of roughness is brought to the fingertip of the user. Note that the amplitude A6 is greater than the amplitude A5, and less than the amplitude A4. Therefore, on the button 162A and on the button 161A, the vibration elements 140A1-140A3 are driven by drive signals of different amplitudes.
When entering the assist area 162B at time t16, the drive of the vibration elements 140A1-140A3 is stopped only for the predetermined short time. Since the dynamic frictional force acting on the fingertip of the user increases at this time, a tactile sensation as if to touch a protrusion is brought. Also, since the fingertip of the user is moving in the direction away from the button 162A at this time, no voice guidance is executed.
Also, once the predetermined short time has elapsed since time t16, the vibration elements 140A1-140A3 are driven by the drive signal of the amplitude A3. At this time, a tactile sensation of smoothness is brought to the fingertip of the user.
As described above, sections may be provided between the button 161A and the assist area 161B, and between the button 162A and the assist area 162B, where the amplitude of the drive signal is forced to zero. Since a tactile sensation as if to touch a protrusion is brought, it becomes easier to perceive the boundaries between the button 161A and the assist area 161B, and between the button 162A and the assist area 162B with the tactile sensation.
Also, by driving the vibration elements 140A1-140A3 with drive signals of different amplitudes on the button 162A and on the button 161A, the difference between the button 162A and the button 161A becomes sensible with the tactile sensation.
This is also the same in the case where a fingertip is moved in any direction in the display areas of the buttons 161A-166A and the assist areas 161B-166B.
In this way, it is possible to guide a fingertip of the user in a direction where one of the buttons 161A-166A is located by using a tactile sensation using the squeeze effect, and a voice guidance. Because of this, the user can precisely recognize the positions of the buttons 161A-166A without visual observation, and can precisely order a menu associated with one of the buttons 161A-166A by performing a confirmation operation.
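The region-dependent drive pattern described above (a steady amplitude A3 in the assist areas, zero-amplitude sections at the button boundaries, and periodically varying amplitudes A4-A5 and A5-A6 on the buttons 161A and 162A) can be sketched as follows. The numeric amplitude values and the modulation frequency are hypothetical, since the embodiment specifies only the ordering A5 < A6 < A4:

```python
import math

# Hypothetical amplitude values chosen to satisfy A5 < A6 < A4 as stated
# in the embodiment; the actual values of A3-A6 are not disclosed.
A3, A4, A5, A6 = 0.3, 1.0, 0.5, 0.7

def drive_amplitude(region, t, boundary_timer=0.0, short_time=0.05):
    """Amplitude of the ultrasonic drive signal producing the squeeze effect.

    region: "assist", "boundary", "button_161A", or "button_162A".
    t: time in seconds (used for the periodic amplitude on the buttons).
    boundary_timer: time elapsed since entering a boundary section.
    """
    if region == "boundary" and boundary_timer < short_time:
        return 0.0                  # zero amplitude: protrusion-like sensation
    if region == "assist":
        return A3                   # steady drive: smooth sensation
    if region == "button_161A":     # rough sensation, swing between A4 and A5
        return (A4 + A5) / 2 + (A4 - A5) / 2 * math.sin(2 * math.pi * 5 * t)
    if region == "button_162A":     # rough sensation, swing between A5 and A6
        return (A5 + A6) / 2 + (A6 - A5) / 2 * math.sin(2 * math.pi * 5 * t)
    return 0.0
```

Because the two buttons swing over different amplitude ranges, their tactile roughness differs, which is what lets the user tell the button 161A and the button 162A apart by feel.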
As illustrated between time t1 and time t2 in
By changing the amplitude over time in this way, as the fingertip approaches the button 165A within the display area of the assist area 165B, a gradually smoother tactile sensation is brought to the fingertip of the user. Because of this, it is possible to inform the user, through the tactile sensation, that the fingertip is approaching the button 165A from the assist area 165B.
Also, in contrast to this, as illustrated between time t3 and time t4 in
By changing the amplitude over time in this way, as the fingertip moves away from the button 165A within the display area of the assist area 165B, a gradually less smooth tactile sensation is brought to the fingertip of the user. Because of this, it is possible to inform the user, through the tactile sensation, that the fingertip is moving away from the button 165A.
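This proximity-dependent behavior can be sketched as a simple amplitude ramp; the endpoint values `a_min` and `a_max` are hypothetical, as the embodiment only states that the amplitude grows on approach and shrinks on retreat:

```python
def approach_amplitude(distance, max_distance, a_min=0.2, a_max=1.0):
    """Drive amplitude as a function of fingertip-to-button distance.

    A larger amplitude strengthens the squeeze effect and lowers the dynamic
    frictional force, so the surface feels gradually smoother as the fingertip
    nears the button 165A, and gradually less smooth as it moves away.
    a_min and a_max are assumed values, not taken from the embodiment.
    """
    d = min(max(distance, 0.0), max_distance)   # clamp to [0, max_distance]
    return a_max - (a_max - a_min) * d / max_distance
```

A linear ramp is the simplest choice; any monotonically decreasing function of distance would convey the same approach/retreat information through the tactile sensation.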
Images 160A41, 160A42, and 160A43 are displayed on the display panel 160. The size of the display areas of the images 160A41, 160A42, and 160A43 is the same as that of the display areas of the images 160A1, 160A2, and 160A3 illustrated in
Characters of “plain udon”, “meat udon”, and “tempura udon” are displayed on the images 160A41, 160A42, and 160A43, respectively.
In a state where the images 160A41, 160A42, and 160A43 are displayed as illustrated in
The user can select one of “plain udon”, “meat udon”, and “tempura udon” by performing a confirmation operation.
Note that such a guidance mode may be executed as a summary guidance mode. In this case, a detailed mode does not need to be executed. Alternatively, in a detailed mode, the buttons to select “plain udon”, “meat udon”, and “tempura udon” may be enlarged as the images 160A41, 160A42, and 160A43, respectively.
Also, in
The electronic device illustrated in
The display on the display panel 160 illustrated in
In the electronic device illustrated in
Having such an electronic device with a summary guidance mode to select air-conditioner, audio, or navigation installed in the vehicle enables the user, while driving the vehicle, to easily activate the summary guidance mode only by performing a multi-touch with a single hand without visually observing the display panel 160.
In
The buttons 360A61, 360A62, and 360A63 are buttons to select “turn up temperature”, “turn down temperature”, and “return”, respectively.
Using such a detailed guidance mode to operate air-conditioner, audio, and navigation enables the user, while driving the vehicle, to select the button 360A61, 360A62, or 360A63 with a tactile sensation and a voice guidance, without visually observing the display panel 160.
In
The buttons 261A, 264A, and 267A are buttons for operating the air-conditioner, the buttons 262A, 265A, and 268A are buttons for operating the audio, and the buttons 263A, 266A, and 269A are buttons for operating the navigation.
Also, above the buttons 261A, 262A, and 263A, areas 167A1, 167A2, and 167A3 are provided, and below the buttons 267A, 268A, and 269A, areas 168A1, 168A2, and 168A3 are provided.
Using such a detailed guidance mode to operate air-conditioner, audio, and navigation enables the user, while driving the vehicle, to select one of the buttons 261A-269A with a tactile sensation and a voice guidance, without visually observing the display panel 160.
In
Also, above the buttons 261A1, 262A1, and 263A1, areas 167A1, 167A2, and 167A3 are provided, and below the buttons 267A1, 268A1, and 269A1, the areas 168A1, 168A2, and 168A3 are provided.
Using such a detailed guidance mode to operate an air-conditioner enables the user, while driving the vehicle, to select one of the buttons 261A1-269A1 with a tactile sensation and a voice guidance, without visually observing the display panel 160.
When displaying images 160A51, 160A52, and 160A53 on the display panel 160, buttons 261A-269A may be displayed. The buttons 261A-269A are substantially the same as those illustrated in
In a state where a map is displayed on the display panel 160, buttons 263A2, 266A2, and 269A2 may be displayed. The buttons 263A2, 266A2, and 269A2 are size-enlarged buttons of 263A, 266A, and 269A illustrated in
In a state where a map is displayed on the display panel 160, buttons 263A3, 266A3, and 269A3 may be displayed on lower parts in the display panel 160. Assist areas 263B3, 266B3, and 269B3 are provided around the buttons 263A3, 266A3, and 269A3.
The buttons 263A3, 266A3, and 269A3 are substantially the same as the buttons 263A, 266A, and 269A illustrated in
When displaying images 160A61, 160A62, and 160A63 on the display panel 160, buttons 261A3, 262A3, 263A3, 264A3, 265A3, and 266A3 may be displayed. The images 160A61, 160A62, and 160A63 are to classify “cash transactions”, “bankbook/balance”, and “wire transfer, etc.”, and are displayed in a summary guidance mode.
The buttons 261A3, 262A3, 263A3, 264A3, 265A3, and 266A3 are buttons to select “withdrawal”, “balance inquiry”, “wire transfer”, “deposit”, “bankbook entry”, and “return”, respectively. Note that any one of the buttons 261A3, 262A3, 263A3, 264A3, 265A3, and 266A3 may serve as the “return” button.
The buttons 261A3 and 264A3 are placed in the image 160A61, the buttons 262A3 and 265A3 are placed in the image 160A62, and the buttons 263A3 and 266A3 are placed in the image 160A63.
Such buttons 261A3, 262A3, 263A3, 264A3, 265A3, and 266A3 may be displayed in the images 160A61, 160A62, and 160A63, so that upon a transition from a summary guidance mode to a detailed guidance mode, the button 261A3, 262A3, 263A3, 264A3, 265A3, or 266A3 can be selected.
The images 160A61, 160A62, and 160A63 are displayed on the display panel 160. Areas where the images 160A61, 160A62, and 160A63 are displayed are substantially the same as the areas 120A1, 120A2, and 120A3 of the top panel 120 (see
Characters representing classification of “cash transactions”, “bankbook/balance”, and “wire transfer, etc.” are displayed on upper parts of the images 160A61, 160A62, and 160A63, respectively. The electronic device 100 is an input device with which the user performs an input operation on the top panel 120 to execute a wire transfer and the like.
The images 160A61, 160A62, and 160A63 displayed on the display panel 160 illustrated in
In
At this time, since the dynamic frictional force is not reduced in the areas 120A2 and 120A3, the user does not perceive a tactile sensation of smoothness with the right hand.
In such a state, the electronic device 100 outputs a voice guidance of “Smooth part corresponds to cash transactions” from the speaker 103.
In this way, in a state where the area 120A1 of cash transactions has been recognized, if the user performs a confirmation operation in the area 120A1, a detailed input mode of cash transactions can be activated.
In
The buttons 261A4 and 262A4 are buttons to select “withdrawal” and “deposit”, respectively.
Using such a detailed guidance mode enables the user, without visually observing the display panel 160, to select the button 261A4 or 262A4 with a tactile sensation and a voice guidance.
In
Using such a detailed guidance mode enables the user, without visually observing the display panel 160, to select the button 261A5 or the like with a tactile sensation and a voice guidance, and to easily execute a transaction on the ATM.
In
Using such a detailed guidance mode enables the user, without visually observing the display panel 160, to select the button 261A6 or the like with a tactile sensation and a voice guidance, and to easily execute a transaction on the ATM.
When displaying images 160A71, 160A72, and 160A73 on the display panel 160, buttons 261A7 and the like may be displayed. The images 160A71, 160A72, and 160A73 classify “tickets”, “transfer tickets”, and “reservation/book tickets/season tickets”, and are displayed in a summary guidance mode.
Nine buttons including the button 261A7 are buttons to select “below JPY 500”, “below JPY 500”, “reserved seat tickets”, “JPY 500 to 1000”, “JPY 500 to 1000”, “book tickets”, “over JPY 1000”, “over JPY 1000”, and “return”, respectively.
Such buttons 261A7 and the like may be displayed in the images 160A71, 160A72, and 160A73, so that the button 261A7 or the like can be selected when a summary guidance mode transitions to a detailed guidance mode.
Images 160A71, 160A72, and 160A73 are displayed on the display panel 160. Areas where the images 160A71, 160A72, and 160A73 are displayed are substantially the same as the areas 120A1, 120A2, and 120A3 of the top panel 120 (see
Characters representing classification of “tickets”, “transfer tickets”, and “reservation/book tickets/season tickets” are displayed on upper parts of the images 160A71, 160A72, and 160A73, respectively. The electronic device 100 is an input device with which the user can perform an input operation on the top panel 120 to purchase a reserved seat ticket and the like.
The images 160A71, 160A72, and 160A73 displayed on the display panel 160 illustrated in
In
At this time, since the dynamic frictional force is not reduced in the areas 120A2 and 120A3, the user does not perceive a tactile sensation of smoothness with the thumb and the index finger of the left hand, and the right hand.
In such a state, the electronic device 100 outputs a voice guidance of “Smooth part corresponds to tickets” from the speaker 103.
In this way, in a state where the area 120A1 of tickets has been recognized, if the user performs a confirmation operation in the area 120A1, a detailed input mode of tickets can be activated.
In
The buttons 261A9, 262A9, and 263A9 are buttons to select “below JPY 500”, “JPY 500 to 1000”, and “over JPY 1000”, respectively.
Using such a detailed guidance mode enables the user, without visually observing the display panel 160, to select the button 261A9, 262A9, or 263A9 with a tactile sensation and a voice guidance.
In
Using such a detailed guidance mode enables the user, without visually observing the display panel 160, to select the button 261A7 or the like with a tactile sensation and a voice guidance, and to easily purchase a ticket.
In
Using such a detailed guidance mode enables the user, without visually observing the display panel 160, to select the button 261A8 or the like with a tactile sensation and a voice guidance, and to easily purchase a ticket at a desired price.
Note that although the embodiment has been described in which the electronic device 100 includes the display panel 160, the electronic device 100 may include no display panel 160.
The electronic device 100A includes a housing 110, a top panel 120, a double-sided tape 130, vibration elements 140A1-140A3, a touch panel 150, and a substrate 170. Note that
Since the electronic device 100A includes no display panel 160, it does not need to include image data in data used in a summary guidance mode and a detailed guidance mode (see
In response to an input operation performed on the top panel 120, similar to the electronic device 100, the electronic device 100A drives one of the vibration elements 140A1-140A3 in a summary guidance mode, and drives the vibration elements 140A1-140A3 in a detailed guidance mode.
The electronic device 100B includes a housing 110B, a top panel 120B, a double-sided tape 130B, vibration elements 140A1-140A3, a touch panel 150B, a display panel 160B, and a substrate 170B.
The electronic device 100B illustrated in
The top panel 120B curves such that the central part in plan view projects toward the side in the positive direction of the Z-axis. Although a cross-sectional shape of the top panel 120B in the YZ plane is illustrated in
Using the top panel 120B made of such curved-surface glass makes it possible to provide a satisfactory tactile sensation.
The electronic device 100C includes a housing 110, a top panel 120, a double-sided tape 130, vibration elements 140A1, 140A2, 140A3, 140B1, 140B2, and 140B3, a touch panel 150, a display panel 160, and a substrate 170.
The electronic device 100C has a configuration in which the vibration elements 140B1, 140B2, and 140B3 are added to the electronic device 100 illustrated in
When the vibration element 140B1 is driven, as illustrated in
When the vibration element 140A1 is driven, as illustrated in
When the vibration element 140B3 is driven, as illustrated in
Also, when the vibration elements 140A2 and 140A3 are driven, as illustrated in
Also, images of products such as PCs, cellular phones, and smartphones are displayed on the area 120C, and GUI buttons for individual customers, institutional customers, support information, announcements, and the like are displayed on the area 120B3.
Therefore, generating a vibration in the area 120A1, 120B1, 120C, or 120B3 in a summary guidance mode and executing a voice guidance enables the user to obtain various items of company information without visual observation.
The top panel 120 has partitioned areas 120D1, 120D2, and 120D3. Images of the residual capacity of the battery, the reception state of a radio wave, and the like are displayed on the area 120D1. The area 120D2 is an area to display various contents. The area 120D3 is an area where a home button and the like are placed.
The electronic device 100D includes a housing 110, a top panel 120, a double-sided tape 130, vibration elements 140D1, 140D2, 140D3, a touch panel 150, a display panel 160, and a substrate 170.
The electronic device 100D includes the vibration elements 140D1, 140D2, and 140D3 instead of the vibration elements 140A1, 140A2, and 140A3 of the electronic device 100 illustrated in
When the vibration element 140D1 is driven, as illustrated in
When the vibration element 140D2 is driven, as illustrated in
Also, when the vibration element 140D3 is driven, as illustrated in
Therefore, generating a vibration in the area 120D1, 120D2, or 120D3 in a summary guidance mode and executing a voice guidance enables the user to operate the electronic device 100D as a smart phone terminal without visual observation.
As above, an electronic device and a method of driving the electronic device have been described according to exemplary embodiments. Note that the present invention is not limited to the embodiments specifically disclosed above, and various modifications and changes can be made without deviating from the subject matters described in the claims.
All examples and conditional language recited herein are intended for pedagogical purposes to aid the reader in understanding the invention and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority and inferiority of the invention. Although the embodiments of the present invention have been described in detail, it should be understood that the various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.
This application is a continuation application of International Application PCT/JP2016/050558 filed on Jan. 8, 2016 and designated the U.S., the entire contents of which are incorporated herein by reference.
Parent application: PCT/JP2016/050558, Jan. 2016 (US)
Child application: 16028142 (US)